Part 2: A Grand Day Out
Visiting Cambridge to find worthwhile employment for a robot reveals some painful truths about artificial intelligence.
We had planned to visit Tatties on the corner of Sussex Street and Hobson Street but, given that a robot sat in the restaurant created what Gino described as a spectacle, and attracted people who had no intention of buying lunch, let alone a coffee, he usually insisted ‘the crazy machine’ be locked in the toilet while I drank my flat white and ate Apfelstrudel. In view of the traffic in Cambridge in the run-up to Christmas we had set off early and now needed somewhere to sit quietly before the job interview: the robot’s, not mine. I had applied on the robot’s behalf for a part-time post with a marketing company, pointing out it would not be interested in promotional work – no mindless wandering around on exhibition stands. Instead, as it would no doubt demonstrate, the robot’s skills were better applied to analytics. However, killing time before the interview by wandering the streets of the city on an unseasonably cold and overcast afternoon in November would have drained the robot’s batteries.
There were charging points on the bus from Madingley Park and Ride but the journey was relatively short and, as the bus was crowded, using them was not an option. Electric buses were relatively new on the route and, as with any innovation, there had been teething problems. If, by chance, the bus had come to a sudden halt it was fairly obvious who would get the blame: not the driver who had forgotten to check it was fully charged before it set off, but the person sat upstairs with his cyborg plugged into three USB sockets.
So, we made our way to the Grand Arcade, which was en route to the IdeaSpace, where the interview was to take place. The upmarket shopping centre is a spectacle in its own right, and ‘Grand’ because that is how much a typical shopping trip costs in the shops located in this cavernous temple of mindless consumerism. We entered through the section of the building still referred to as the Lion Yard, where an art installation depicted a typical day out at the seaside, complete with sand, beach huts and deck chairs. The robot hesitated and raised one eyebrow, and for a moment I wondered whether, despite two reboots and a software upgrade, it retained memories of our day in Sheringham (see episode 1). After scanning the installation, and the small crowd of people looking down on it from the mezzanine, the robot continued on its way without comment. The destination was Costa, the last resort for anyone a bean short of a cappuccino, which was located midway along the marble-clad arcade, opposite the entrance to the John Lewis department store.
(Disclaimer: As in the first episode I should point out this story is a work of fiction. The events described here never happened and, while the AI technology referred to is within our grasp, the robot does not exist – and even if it did it would not be available for corporate events, exhibitions or children’s parties. Also, as mentioned in the previous episode, where there is a verbal exchange with the robot this is an adaptation of the staccato output of the robot’s text-to-speech software.)
With the job interview in mind the robot’s software had been upgraded and its hardware modified. The eyebrows were new, intended to make eye contact seem less spooky. There were also eyelids, and those old enough to remember the ventriloquist Ray Alan will recall how these enabled his puppet Lord Charles to convey a range of emotions. A group of university researchers had spent two years working on eyelid technology for robots so, apart from adding an interface and tweaking the driving software, little work was required on my part. The eyebrows themselves were merely hair from my sister’s salon stuck to strips of metal which were dragged up and down the robot’s forehead by magnets behind its plastic faceplate.
The importance of eyebrows in non-verbal communication should be obvious to anyone familiar with TikTok and the young female users who shave off their eyebrows and create new ones with mascara. The aim is not to attract young men with a Groucho Marx fetish but to overcome numerous, seemingly insoluble, problems with simulated face-to-face communication. In this case the eyebrows provide an alternative to that laugh which everyone, myself included, found rather disconcerting, especially as I had yet to find an algorithm which gives the impression the robot has a sense of humour. When it raised an eyebrow while we were stood by the seaside installation I was half expecting it to ask, ‘Is this some kind of joke?’
‘And a cup of engine oil for your friend?’ the barista in Costa suggested when I ordered an Americano and a pain aux raisins.
I laughed and turned to the robot. ‘I must write that down, we haven’t heard that one before, have we?’ There was enough information here for the robot, with the help of access to ChatGPT, to work out this was a rhetorical question. Unfortunately, due to the slow 4G connection to the AI software, and an inability to differentiate between sarcasm, irony and a straightforward joke, the response was not quite what I expected or intended. After a five-second delay the robot turned to face the barista, its eyelids lifted and its eyebrows shot up its forehead like a pair of startled rats. Fortunately the laugh, instead of the legacy HA-HA-HA, was a more restrained ‘teehee, teehee’ – even so, this was still sinister enough for the barista to step back from the counter. I am not sure what went through the young man’s mind at this point but suspect it included an expletive and a reference to a fiery underworld.
The robot and I sat at a table providing a view down the length of the Grand Arcade and, for reasons which will become apparent later, the opportunity to watch the shoppers just inside the entrance to John Lewis. Once he had calmed down, the barista was persuaded to let the robot have an empty cup, which it used to simulate drinking coffee and to test the dexterity of its modified finger controls. It’s a testament to mankind’s misplaced priorities that people with artificial limbs struggled for decades with hands possessing all the sophistication of an £8.99 litter picker from Amazon, until engineers started designing hands for cyborgs. The robot made a convincing display of drinking coffee, and this attracted the attention Gino had felt Tatties could do without on a busy Monday afternoon.
Sat still, the robot used less power, and I was tempted to make further savings by putting it in sleep mode, part of the control I retain to prevent it acting autonomously. There are certain legal implications associated with this feature as, should the robot cause injury or harm to a third party, ‘nothing to do with me, gov’ is unlikely to prove a valid defence. Following that rather disturbing discussion we had in Sheringham, the robot’s access to the internet and AI software is also under my control. There is a list of subjects off limits, and anything that might cause the conversation to veer towards existentialism is filtered out. Obviously, the word ‘God’ is at the top of the list of no-go topics, but so is ‘godfather of AI’, along with a list of names such as ‘Hinton’, ‘Altman’ and ‘Musk’ which cause the robot either to become distressed or to start muttering expletives. Following the AI conference at Bletchley Park, ‘Sunak’ was added to the list.
One reason for not putting the robot into sleep mode was the fear that, on reawakening, it would have forgotten the information accessed to prepare it for the interview. Simply turning it off and on again was unlikely to fill any gaps in its memory or to reload corrupted software; this was not Windows 11, or Stephen Hawking. There was also that issue of anthropomorphism: would you put someone in a trance merely because you wanted a few minutes’ peace? Although, when I demonstrated the robot’s sleep mode to my wife, she suggested the feature should be provided, via an implant, to anyone married for more than forty years.
Had the robot slept while I drank my coffee, Melanie and James would have been just two more gawpers leaning on the barrier between our table and the passing shoppers, and would have remained as anonymous as they had been when the robot first spotted them stood at one of the perfume counters in John Lewis. At that point, as far as the robot was concerned, the pair were not a couple but just two of the wireframe objects within its field of view. However, there was obviously something about the way they moved and their faces – which to the robot were not what we would recognise as faces but just collections of points – that suggested a level of interaction. At that point Melanie and James lost the anonymity you and I take for granted while in a public place. Those collections of points provided the data required by the robot’s facial recognition software to determine, with a high degree of certainty, that one of the wireframes was James H, a pupil at Long Road Sixth Form College. It was less probable his companion was Melanie S, although as Melanie was also a pupil at the same college there was a good chance this was indeed her. All this I knew thanks to software which relayed data from the AI software the robot was using to my mobile phone.
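For the technically minded, the matching step itself is conceptually simple. Here is a minimal sketch in Python, assuming an earlier stage has already reduced each face to an embedding vector; the threshold, the helper functions and the gallery are my inventions for illustration, not a description of the robot’s actual code:

```python
import numpy as np

def cosine_similarity(a: np.ndarray, b: np.ndarray) -> float:
    # Standard cosine similarity between two embedding vectors.
    return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))

def identify(probe: np.ndarray, gallery: dict, threshold: float = 0.75):
    # Compare one face embedding against a gallery of known faces and
    # return the best match only if it clears the confidence threshold.
    best_name, best_score = None, -1.0
    for name, embedding in gallery.items():
        score = cosine_similarity(probe, embedding)
        if score > best_score:
            best_name, best_score = name, score
    return (best_name, best_score) if best_score >= threshold else (None, best_score)

# e.g. identify(camera_embedding, {"James H": james_vec, "Melanie S": melanie_vec})
```

The lower certainty for Melanie simply means her score fell closer to the threshold than James’s did.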
In truth much of what the robot was doing at this point had little to do with AI and, as far as software development goes, much of the heavy lifting was done back in the days when Facebook and Twitter provided Application Programming Interfaces (APIs) enabling developers to produce software with direct access to a person’s social media data. The Cambridge Analytica scandal saw Facebook (now Meta) and Twitter turn off the taps, after which there was something of a hiatus until an increase in computer processing power made it possible to scrape vast amounts of personal data from social media timelines and process it in near real time.
It was difficult to keep up with the stream of data the robot’s software produced, as much of it scrolled off the screen before it could be read. However, I did catch sight of a couple of lines of output from a routine which used Google’s image analysis toolkit and, given the zoom lens in the robot’s eye sockets had whirred a few times, I guessed the data was the result of an attempt at facial recognition. But by then the two young people had spotted the robot and walked over to where we were sitting. The young man had a well-rehearsed comment along the lines of ‘Which one of you guys is Arnold Schwarzenegger?’ But the robot got there first.
‘Hello James, so you decided not to buy Melanie perfume for her birthday.’
Facial Recognition (Who Do You Think You Are Looking At?)
Late one evening in the summer of 1969 I threw up in the reception of Parkside police station. The events preceding this embarrassing incident were as hazy then as they are now – although it is possible the consumption of lager was involved. It is a tribute to the social mobility of the era that two decades later I got the chance to repeat the performance at the Home Office itself, in one of its recently refurbished offices at 102 Petty France. On this occasion alcohol played no part and the only drink consumed that day was something with a passing resemblance to tea, drunk while eating lunch at a Happy Eater somewhere on the North Circular. In fact, a lack of fluids was partially responsible for the migraine which caused the omelette, chips and a vegetable of some description to end up on the brand-new carpet in the Home Office’s IT department. Obviously this was very embarrassing, even though everyone was very polite, there was no bucket and mop, and no-one suggested I clear up the mess in lieu of spending a night in the cells. Pity about the new carpet: I seem to recall it was blue with green dots, although it is possible those were peas.
The day had started badly as I was somewhat stressed. My company had developed Britain’s first electronic newspaper which translated broadcast data into a desktop publication. Unfortunately, the data in question belonged to a company called Teletext, who took exception to us giving it away free and threatened to take us to court.
Admittedly this was one of those made-up legal battles common in the world of IT. For Teletext there was something to be gained from creating the impression their fading analogue service was still worth hijacking in the digital age, and we in turn got to position ourselves at the cutting edge of online publishing. Both of us realised there might be some mileage in a joint marketing agreement, hence an early morning meeting and product demonstration in their West London office.
Another of our firsts was live video in Microsoft Windows, which we developed so trading terminals could receive broadcast TV news while their users bought and sold shares. Unfortunately, the product launched in the summer of 1987, a few weeks ahead of Black Monday, and the handful of terminals sold were only used to watch pornographic videos in half-empty offices. Luckily the graphics card was used in a system sold to German hairdressing salons to show customers what they would look like with a budget version of a supermodel’s hairstyle. These systems made us a lot of money, some of which was spent on wine, women and fast cars – the rest we simply wasted on speculative R&D.
At the time the press was mocking the police for issuing photofit pictures of villains resembling something Dr Frankenstein had cobbled together. It was assumed the reason the Home Office suggested dropping in next time I was passing was to demonstrate software which would avoid fruitless searches for people wearing cabbages on their heads – after all, L’Oréal thought it was worth it.
Catching sight of the two-year-old copy of a computer magazine lying on a desk, I realised my assumptions about the purpose of the visit were wide of the mark. There was a ring around a story about a system my company had developed to count the number of salmon swimming upstream to spawn. The software worked by scanning a videotape frame by frame and logging those frames containing images of fish. The Home Office thought the system could be used to identify football hooligans in CCTV footage using facial recognition. Remember, this was the mid-1980s, so the only reason you see officers sitting mindlessly staring at CCTV footage in police dramas is because the writers have run out of dialogue for the character in question.
Interestingly, in the context of the robot’s interest in Melanie and James, identifying a hooligan was of less interest than discovering who was stood next to them. The image of this new suspect could be used as the basis for an automated search through footage from other matches, and so on, until a complete network of potential troublemakers was created. Given the robot, as discovered at Sheringham, was haunted by memories of its aquatic past, I have a feeling it would be intrigued by the fact its facial recognition software began life as a program for counting fish.
Is He With You?
‘Wow, is this yours?’ Surprisingly, James directed his first question at me rather than the robot. (Personally, I would have asked how the robot knew my name, and maybe even started ranting about GDPR compliance.) There are certain advantages in being over 70: one is pretending to be deaf when a young person asks a question you would rather not answer; the other is that, as far as technology is concerned, you are assumed to be stuck in the last century. This was an impression I reinforced by immersing myself in the copy of Cambridge News someone had left on the table.
‘Excuse me,’ James persisted, ‘this robot, does it belong to you?’
‘No mate, it was sat here when I arrived,’ I lied.
‘How did it know our names?’ Melanie asked. ‘Oh,’ she exclaimed in surprise when the robot turned and stared at her.
‘Perhaps you should ask the robot, it seems fairly bright,’ I suggested, and then set to work on the Sudoku puzzle in the newspaper.
The robot’s ears are for aesthetic purposes only, as it picks up sound via a directional microphone in the centre of its forehead. This enables it to identify the person talking to it and filter out other voices. There is a slight glitch in the software: if the person talking leaves long gaps between sentences, the robot turns its attention to anyone else nearby who is speaking, giving the impression it has lost interest.
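For the curious, the glitch is easy to picture in code. A toy sketch, with an invented pause limit; the real software is presumably messier:

```python
import time

PAUSE_LIMIT = 2.5  # seconds of silence before attention wanders (invented value)

class AttentionTracker:
    """Toy model of the glitch: a long pause hands attention to another voice."""

    def __init__(self):
        self.current_speaker = None
        self.last_heard = 0.0

    def on_voice(self, speaker_id, now=None):
        now = time.monotonic() if now is None else now
        if self.current_speaker is None:
            self.current_speaker = speaker_id
        elif speaker_id != self.current_speaker and now - self.last_heard > PAUSE_LIMIT:
            # The bug as described: leave a long enough gap and any
            # other voice nearby takes over.
            self.current_speaker = speaker_id
        if speaker_id == self.current_speaker:
            self.last_heard = now
        return self.current_speaker
```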
Melanie put her face close to the robot’s. ‘How … did … you … know our … names?’ she asked, a little too loudly.
‘No need to shout. Unlike the person sitting across the table, I am not deaf.’ That glitch in the robot’s voice recognition software, triggered by the gaps in Melanie’s question, had made it appear to be shaking its head. Melanie laughed and both she and James looked at me, but then turned back to the robot when it began describing how its facial recognition software worked. James was obviously impressed, but Melanie’s attention wandered while the robot was still halfway through its description of how the image of her face had been used to find her Facebook page.
‘A bit like TinEye then?’ suggested James.
‘Yes, a bit like TinEye.’
‘But not TinEye?’
‘No, not TinEye.’ And then the robot fell silent.
James realised his interruption had truncated the robot’s description of its facial recognition software. ‘But there is not room inside your head for a computer powerful enough to do that,’ he said.
‘Correct.’
‘So, you are connected to the Internet?’
‘Correct.’ James paused for a moment, obviously composing a question which would yield more information.
‘Please describe your system architecture.’
‘Bingo,’ I thought, as the robot began a ground-up specification which sounded like a complete dog’s dinner of computing and communications technology, reminding me of the wire-wrapped prototype graphics cards my company produced in the 1980s.
‘I have four mobile phones with 4G: one connected to Amazon Web Services, where most of my AI software is located,’ the robot explained. ‘One connected to a dedicated server on which the image analysis is undertaken (a connection only made possible recently, when Country Broadband provided me with a high-speed fibre connection). One used to access ChatGPT. And one which acts as an internet hotspot, handles Bluetooth communication and can link to open Wi-Fi networks. In fact you can access it now to follow me on Twitter.’ James took out his mobile phone and appeared to do just that.
‘In my chest there is a GEEKOM 13th Generation Mini for local processing tasks.
‘Power is provided by lithium batteries in my lower torso, with a charging point in my navel.’ The robot pointed to this. ‘Because the alternative was, quite literally, a pain in the bum.’ A reference here to an earlier attempt to recharge its batteries using a specially adapted chair, which made topping up in public places both impractical and somewhat undignified.
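On paper (and only on paper – the device names below are invented) the architecture the robot described boils down to a table of links:

```python
# A sketch of the robot's connectivity as described above; names invented.
ROBOT_LINKS = {
    "phone_1":  {"bearer": "4G", "role": "AI services", "endpoint": "AWS"},
    "phone_2":  {"bearer": "4G", "role": "image analysis",
                 "endpoint": "dedicated fibre-backed server"},
    "phone_3":  {"bearer": "4G", "role": "chat", "endpoint": "ChatGPT"},
    "phone_4":  {"bearer": "4G", "role": "hotspot, Bluetooth, open Wi-Fi"},
    "chest_pc": {"bearer": "internal", "role": "local processing",
                 "endpoint": "GEEKOM mini PC"},
    "battery":  {"bearer": "n/a", "role": "power", "endpoint": "navel charging point"},
}
```

Nothing clever, but laid out like this the single points of failure are obvious.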
Thankfully the robot remained unaware of the NFC (Near Field Communication) link used to put it into sleep mode and disconnect it from the Internet. This device also prevented the robot communicating with the ‘dark server’, a connection which, at that moment, might well have proved embarrassing.
Interestingly, neither Melanie nor James expressed any concerns regarding privacy or, more to the point, the robot’s invasion of it – not even when it pointed out it knew they were headed to the library to complete the essays which had to be handed in the following day. Perhaps young people regard the intrusiveness of high technology differently from my generation, having come to accept that companies only take our privacy seriously because they can monetise it.
‘So, you do all that using AI?’ Melanie asked. The robot took some time to reply.
‘Yes and no. I’m afraid many references to AI these days tend to be hype from companies attempting, quite successfully in some cases, to convince investors they are somehow exceptional. Babylon Health was a case in point: its medical chatbot used a decisioning algorithm little more sophisticated than an Excel spreadsheet. And one must question why a company whose respiratory monitor failed to gain traction during the Covid pandemic now decides to reinvent its product as an AI device.’
Mea culpa, I am afraid, because thirty years earlier, when my company launched that online newspaper, we marketed the browser as ‘AI-based software’ when, in reality, it merely used a simple algorithm much like the one employed by Google a decade later when it launched its Internet search. This relatively simple algorithm classified and encoded each news story based on the frequency of key words and their proximity to each other. There is also something rather ironic about the robot’s response, given it had itself been created using an ‘AI’-powered chatbot. Crossing my mind was the recent flare-up relating to OpenAI, during which an investor remarked “we have never seen anything like it” – I guess because few people in the venture capital industry are old enough to remember FTX and Sam Bankman-Fried.
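For anyone wondering what that ‘AI-based software’ amounted to, a toy reconstruction follows. The weights are invented and the original predates Python, but the principle – keyword frequency boosted by proximity – is the same:

```python
def score(story: str, keywords: set, window: int = 10) -> float:
    # Count keyword occurrences, with a bonus whenever two keywords fall
    # within `window` words of each other (frequency plus proximity).
    words = [w.strip(".,;:!?").lower() for w in story.split()]
    hits = [i for i, w in enumerate(words) if w in keywords]
    proximity = sum(1 for a, b in zip(hits, hits[1:]) if b - a <= window)
    return len(hits) + 2.0 * proximity

# e.g. score(article_text, {"interest", "rates", "inflation"})
```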
Then the robot came clean about its use of a chatbot. ‘I did utilise a form of intelligence which is commonly referred to as “artificial” to differentiate it from human intelligence.’ It paused as if giving James an opportunity to respond, which he did not. ‘However, the line between human and artificial intelligence is blurred. Reading, writing and the exchange of ideas using electronic communication can all be regarded as component parts of an intelligence which is of man, but not man.’
The ‘of man, but not man’ seemed to have been derived from Sartre’s ‘Man is what he is not and is not what he is’, and this should have been inaccessible to the robot as ‘Sartre’ was one of the words on its blacklist, as were the names of other existentialists. However, I later discovered someone had created a post describing the link between existentialism and the Situationist International with ‘Sartre’ misspelled as ‘Satre’ – a common problem with filters. With luck the post had been created by someone in a hurry, fat-fingered or dyslexic, rather than someone as crazy as a box of frogs.
‘Well, that’s something to think about,’ said James, and I wondered if he was being ironic.
‘It certainly is, but don’t let me keep you from your studies.’
James glanced at his watch then looked at Melanie. ‘No, we should be moving on,’ he said.
‘What was all that about?’ I asked when the couple were out of earshot, but the robot did not reply. Instead, it watched the couple walk away.
James took the mobile phone from his pocket, studied the screen then showed it to Melanie. After a brief discussion the couple disappeared into the Apple Store. Still on the screen of my mobile was a reference to the iPhone SE which the robot had posted on Twitter.
‘I think we can regard that as a success,’ the robot said.
‘I’m afraid I missed most of that,’ I replied. ‘Please explain what you have been doing.’
‘Well, after I noticed those two people in John Lewis interacting with each other I assumed they were considering a purchase from one of the perfume counters. After I determined their identities and accessed their social media timelines, I discovered it was Melanie’s birthday this weekend – so it was likely James was considering buying her a present.’
‘But he didn’t.’
‘No, it was obvious from her body language she did not want perfume. She really wanted a new mobile phone as her existing one is two years old, far older than the ones those stood around her were holding up when she posted a picture at a recent concert. James just needed a little nudge.’
‘Persuading someone to spend over £400 on a birthday present – that’s some nudge.’
‘James is keen to impress.’
‘Very keen as I’m guessing he has only known her for a few months.’
The robot’s eyebrows rose slightly to express surprise at input it had not been expecting. ‘So how did you guess that?’
‘They both attend Long Road Sixth Form College so I’m assuming they met there, either last year or early this term.’
‘They became friends just six weeks ago, Peter. You might well become the first person to replace a robot. Except of course you are guessing and making assumptions while I have access to facts and certainties, which is why I know it is not Melanie that James is keen to impress.’ As will become apparent, this observation was slightly worrying.
Should we be concerned that by monitoring us a machine can determine our intentions and even prompt us to make decisions, effectively taking control of our lives? Perhaps we should, if this were something new.
Twenty years ago a UK supermarket began experimenting with scan-as-you-go shopping. Customers used handheld wireless scanners (made by a company called Symbol) to carry out the barcode swiping previously undertaken by checkout staff. This saved time queueing and provided a running total of the amount spent. Those familiar with wireless networks and database software wondered at the long delay in recording purchases; some may even have guessed there was more happening on the server than a simple search for products and an update of the shopper’s virtual basket – welcome to the wonderful world of analytics. The introduction of scan-as-you-go shopping was followed by a reorganisation of supermarkets based on data collected from thousands of self-scan shopping sessions. (Omitted were those of smart-alec shoppers with ‘surveillance society’ issues who, realising what the software was doing, scanned their trolley of products in the supermarket’s toilet.) Over time our journeys along the aisles of bricks-and-mortar stores were, like those we made when shopping online, increasingly determined by data collected from those who went before us and who, because they bought mangoes, were statistically likely to purchase granola yoghurt.
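The analytics behind those reorganised aisles need not be mysterious. At its simplest it is co-occurrence counting, along these lines (the data and support threshold are invented):

```python
from collections import Counter
from itertools import combinations

def frequent_pairs(baskets, min_support=2):
    # Count how often each pair of items appears in the same basket.
    pairs = Counter()
    for basket in baskets:
        pairs.update(combinations(sorted(basket), 2))
    return [(pair, n) for pair, n in pairs.most_common() if n >= min_support]

# frequent_pairs([{"mangoes", "granola yoghurt"},
#                 {"mangoes", "granola yoghurt", "tea"}])
# -> [(("granola yoghurt", "mangoes"), 2)]
```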
Back another two decades, to when my company employed a salesman who seemed to lack focus. There were times during a sales pitch when he appeared to lose interest in the customer and gave the impression of being permanently uncomfortable, transferring his weight from one foot to another as if dancing around the demonstration room or exhibition stand. In reality he was managing the customer’s enthusiasm; the trick was not to reach the point of decision until it was certain the customer would say ‘yes’. The fancy footwork was to position the customer, without them realising, with their back to the demonstration room door or the perimeter of the exhibition stand. It was then just a question of waiting until someone else entered the demonstration area, at which point the request to place an order was made. The customer was then forced to decide whether they needed to purchase the product while, at the same time, determining whether the person now stood behind them represented a threat. They could have said ‘no’ to the salesman, but this would have created another potential threat, so they usually agreed to the purchase before glancing over their shoulder. It was amazing to watch the survival instincts of stone-age man exploited to sell twentieth-century technology.
While the robot’s surveillance and manipulation of James seems sinister, consider those 30-second segments of video inserted at intervals into TV programmes to create the consensus and conformity required to support the consumption of mass-produced goods. The Grand Arcade itself provided an environment which made the robot’s job easier, offering James and Melanie shelter and security while, at the same time, each store they passed bombarded them with choices subconsciously perceived as threats.
Unlike the salesman the robot is not human, and its manipulation is not based on some legacy fear of being hunted by another animal. As far as it was concerned James and Melanie were merely a pair of geometric shapes within something resembling a Chemical Brothers video. However, a caveat here.
One of the Darker Corners of a Robot’s Mind
‘So, tell me all you know about James and Melanie,’ I said to the robot, after typing into the app on my phone the code which unlocked the robot’s access to the dark server.
In some ways the dark server is analogous to our subconscious: a repository for things the robot knows but, at the same time, is unaware it knows. Unlocking it is much like taking on the role of psychotherapist. The server started life as a machine called ‘Milch’ (who says Germans do not do irony) installed in a near-derelict house in post-Cold War Adlershof, a suburb of former East Berlin, by a group of cyberpunks, hackers and various academics enjoying their newfound freedom and employing skills they previously felt had been misused by the state. But then property speculators made an offer for the building which was hard to refuse. The server was moved from Berlin to the basement of a deserted hotel just outside Hannover and stayed there until the owner decided to refurbish the building. By then Brexit had resulted in issues regarding the cross-border transit of personal data, and it was also felt EU laws on the use of AI would be tougher than those introduced in the UK. So the machine is now in the IT department of a UK university. A rebuild based on the Nvidia H200 platform is pending, ahead of migration into the corporate sector via a spinout company providing the final physical link between Cold War state surveillance and the brave new world of AI. But for now its only role is acting as the robot’s digital subconscious.
‘James’s family social grade is C1,’ the robot explained, rather too loudly, leaving me fumbling with my phone to put it in whisper mode. ‘His parents are both employed in the property sector, while Melanie’s is AB, as her father is the founder and CEO of an IT company. However, the two young people’s relationship is more complex than the difference in social status suggests.’ The robot paused for a moment while it studied a person looking in the window of a jewellery shop.
‘Importantly,’ the robot continued, ‘James has a disposable income which is high for a person of his age and social status, as he has written a moderately successful app. You may be interested to know it enables a mobile phone to act as a wireless scanner detecting mobile devices in its immediate vicinity. In fact, he was using it while I was describing my system architecture.’ This was both interesting and slightly worrying.
I try not to get too paranoid about the robot’s access to the dark server, although it is of some concern. Being a mechanical, as opposed to an electronic or computer science, engineer, when I decided to allow the robot access to the dark server it was via a modified 4G mobile phone. This phone derives its power from the robot’s battery and is switched on and off by sending a code from my mobile to a dedicated wireless receiver. I find the click of the relay which connects or disconnects the phone reassuring. The dark server itself will only accept connections from one IP address, that of the mobile phone, and cannot be programmed remotely: updates (fewer of late) arrive via the university’s Local Area Network (LAN) and are applied via a USB memory stick. From a security point of view the LAN is a weak point, and occasionally I trawl through the robot’s logs to check for attempts to access the server from one of the university’s IP addresses.
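The trawl itself is nothing more sophisticated than this sketch; the log format and the address range are assumptions on my part:

```python
import ipaddress
import re

UNIVERSITY_NET = ipaddress.ip_network("10.20.0.0/16")  # hypothetical range
ATTEMPT = re.compile(r"connect attempt from (\d{1,3}(?:\.\d{1,3}){3})")

def suspicious_attempts(log_path):
    # Flag any logged connection attempt originating inside the
    # university's address range.
    hits = []
    with open(log_path) as log:
        for line in log:
            match = ATTEMPT.search(line)
            if match and ipaddress.ip_address(match.group(1)) in UNIVERSITY_NET:
                hits.append(line.rstrip())
    return hits
```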
The robot appeared to wait for me to react to the news that James might know more about its anatomy than I would feel comfortable with, but I feigned indifference. ‘You were saying about James’s income,’ I prompted.
‘Yes, I deduced from purchases he discussed on social media that he could easily afford £400; as well, he guided Melanie away from the counter selling mid-priced perfumes to another promoting more expensive products. As with that man who was looking in the window of the jewellery shop, deciding what to buy his wife as a thirtieth wedding anniversary present, it is possible to work out how much a person is willing to spend based on the item they last looked at before entering the shop.’ I turned, and sure enough the man who had been looking in the jeweller’s window was now going inside.
It is worth mentioning at this point that a few years back a company developed technology called ‘whispering windows’ which identified potential customers using GPS data from a passer-by’s mobile phone and then displayed contextualised adverts that included the person’s name. It turned out, unsurprisingly, the British public was not quite ready for this Blade Runner-style dystopia and, even had they been, the introduction of GDPR would probably have killed the concept.
‘There is more to it than that, isn’t there?’ I suggested to the robot.
‘Yes, there is, because Melanie is an only child and her relationship with her father is strained. This is evident from social media posts about being grounded, usually after returning home late, which it appears she does to provoke a reaction and gain an overworked father’s attention. Most of her friends are from C1 families and none are as popular as she is. “Don’t do anything which is fun guys until I’m let out” is a revealing post. She dominates her somewhat exclusive group and has a symbiotic relationship with its other members. An attractive girl like Melanie is unlikely to be approached by a member of the opposite sex while on her own, as boys fear losing face if rejected. However, as part of a group of girls, boys will interact with Melanie and her friends safe in the knowledge that, if rebuffed, there are face-saving alternatives to connect with, if only on a temporary basis. At the same time Melanie’s less popular friends benefit from access to attractive boys who would not normally interact with them by choice.’
‘So that is how she and James met.’
‘No, while James is not a loner none of his friends, according to available data, appear to be interested in, or attend, the same events as Melanie’s friends and apart from James and Melanie there is no evidence of interaction between the two groups. The conclusion is that Melanie chose James as a potential partner rather than met him by chance.’
‘Or James chose her.’
‘I think not. One needs to consider motive, and it appears Melanie selected James because she felt the choice would impress her father. Which is also why she persuaded James to buy her a mobile phone.’
‘I think with a nudge from you that was his choice.’
‘Melanie was non-committal while in John Lewis with James. I became the distraction she needed to remove him from the place where, according to Twitter, he suggested they meet. She also ignored the jewellery shop.’
‘The basis for an interesting relationship.’
‘It will end sometime next month, definitely before Christmas.’
‘What makes you think that?’
‘I do not think, I know. We have covered that before.’ The robot was correct, we had; it is insistent that belief is inferior to knowledge.
‘OK, so how do you know the relationship will end so soon?’
There was a pause, and it was tempting to believe the robot was confused and not merely accessing the data required to formulate an answer. It may know Melanie and James will no longer be an item at Christmas. It may even know why. However, it does not understand why it knows.
‘Maybe it is something similar to what you refer to as instinct,’ the robot suggested. ‘What a psychologist employs to determine how someone is likely to behave, or the gut feeling a member of the security services has when assessing whether a member of a dissident group represents a threat.’
The robot had collected a wealth of information on Melanie and James – most of it provided by the couple themselves and their social media contacts – to create, for each of them, something resembling digital DNA, classifying both in much the same way as Google’s algorithms deconstruct web pages to speed up search.
By the 1980s East Germany’s security service, the Stasi, had collected information on 5.6 million of the country’s citizens. However, using this data for the purpose intended required security officers and psychologists to search, analyse and cross-link observations provided by informers and documented in over 100km of files. The robot, on the other hand, had simply submitted the profiles created for Melanie and James to the dark server, where they were compared with those of over 2 million other UK social media users. This search yielded at least five thousand hits for people closely matching the profiles of Melanie and James, and over a hundred similar pairings. The outcomes of these hundred relationships were used to determine the most likely progress of the one entered into by Melanie and James, predicting how they were likely to react with each other and also with anyone else they met – presumably, in this case, the salesman in the Apple Store. The dark server also pinpointed an event which would likely result in the couple ending their relationship.
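Strip away the Stasi comparison and what remains is nearest-neighbour matching. A minimal sketch, assuming each pairing has been reduced to a numeric profile vector – the representation is my invention, not a description of the dark server:

```python
from collections import Counter
import numpy as np

def predict_outcome(pair_vector, stored_pairs, outcomes, k=100):
    # Find the k stored pairings closest to this one and return the most
    # common recorded outcome among them (a nearest-neighbour vote).
    distances = np.linalg.norm(stored_pairs - pair_vector, axis=1)
    nearest = np.argsort(distances)[:k]
    return Counter(outcomes[i] for i in nearest).most_common(1)[0][0]

# e.g. predict_outcome(np.concatenate([profile_melanie, profile_james]),
#                      archive_vectors, archive_labels)
```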
Given enough time, probably as long as it took a psychologist to gain a doctorate from the Stasi Academy of Law in Potsdam, it might have been possible to extract and analyse sufficient information from those kilometres of physical files to determine the influence any one dissident had on other members of their group, and how they and their comrades would react in any given situation. Now it is possible for a relatively dumb robot to do much the same in just a few minutes.
Then, unprompted, the robot provided some supplementary information. ‘Melanie is treating James as a proxy for her father who will inevitably, given James’s interest in high technology, approve of her choice of partner. After her birthday party, already planned as a family event, Melanie will end her relationship with James and most likely form another with a boy less acceptable to her father, possibly someone initially attracted to one of her less popular friends.’
‘And James, what happens to him?’
‘He will be wounded and, given the interests of his friends, will quite likely throw himself into his work to compensate for the loss. I have his details if you were thinking of recruiting a programmer.’
I was beginning to suspect the robot’s interaction with the dark server went beyond a simple request, and that it might have gained a growing awareness of its own ‘digital subconscious’. My instinct was to press the kill switch, but this would limit the chances of the robot being offered a position when it attended the interview. It crossed my mind that, if it was turned down, another visit to the Home Office, this time with the robot in tow, might be an option. Migraines are less frequent these days, so the worst-case scenario was the robot dripping oil on the carpet. But while it might be able to convince the Home Office it could minimise the number of arrests needed to disrupt gangs of criminals and hooligans, I suspect the UK government still regarded me as persona non grata. Nothing to do with vomiting on their carpets; instead, a televised visit Mikhail Gorbachev made to a university in Moscow in the 1980s to demonstrate to western journalists how advanced the Soviet Union was in the field of image processing. Unfortunately, an eagle-eyed person spotted pieces of equipment in the lab remarkably similar to those developed by my company. A particularly uncivil civil servant suggested that, if I insisted on making toys for my commie friends, I should ‘piss off back to Germany and do it there’, which I did. (Although if you are reading this in Germany, please refer to the disclaimer at the beginning of this story explaining none of this actually happened.)
The robot’s interview was due to start in twenty minutes. The walking route to the office had already been decided so, having calculated how long the journey would take, the robot stood up abruptly and left. ‘Good luck,’ I called after it.
‘That will not be necessary, thank you,’ it replied, and headed towards the St Andrew’s Street entrance to the Grand Arcade: not the most direct route, because that would have involved navigating the steps where a sloping covered walkway gave onto Downing Street. The law of sod dictated this would be where the robot encountered someone with a baby buggy and, as I had struggled integrating Q-learning with the robot’s guidance system, things could get rather messy.
Happily, for a robot, walking along pavements is not as difficult as driving a car through Cambridge’s traffic, and it can be fairly certain of arriving on time. I had cheated a little with the collision avoidance software, assuming anyone encountering an unaccompanied cyborg would step to one side. If it approached someone from the rear the robot’s pace would slow and, presumably, the clanking footsteps behind them would alert the person to its presence. Silver Street would be avoided due to the risk that the narrowness of the footpath would result in someone stepping off it into the path of a bus. Instead, the robot was to walk along Mill Lane, which has so little traffic most people assume it has been pedestrianised. Then it was just the cobbled alleyway, Laundress Lane, and the robot would have reached its destination.
The Pen Is Safer Than The Keyboard
There are three types of people: those who do things, those who have things done to them, and journalists. My move from the first group to the third came at the age of 50, when my company closed. Although ‘journalist’ is a bit of a stretch: having set up shop in a marginally more honest part of the high-tech PR industry, I spent most of my time translating collections of acronyms into stories. (This form of journalism became more challenging when AI algorithms took over from acronyms as the PR executive’s weapon of choice when overselling their company’s technology.) Despite this there remained the desire to produce something other than thinly disguised advertising copy: felt more urgently after realising I was one of the few people in the high-tech industry not to have had an article published in the Guardian.
So I took out the pear-wood-handled fountain pen and the ring-bound notepad purchased from WH Smith to record thoughts which, these days, are often forgotten by the time Microsoft Windows has finished booting up. Besides, I felt it better to commit this to paper rather than, in the absence of the robot and its Wi-Fi hotspot, upload it to a OneDrive directory via an unsecured connection.
The three-day event was held at Warwick University one hot, humid weekend in July 1995: a storm threatened to break but the rain held off until the Sunday evening. It was the antithesis of the events I was used to, and the delegates, rather than programmers and computer salesmen, were performance artists and cyberpunks, as well as philosophers explaining how, now freed from the constraints of the physical world, Marx, Heidegger and Deleuze had a newfound relevance in the virtual one. That is not to say there was a complete absence of men in suits, because the corporate world already suspected the next big thing resided somewhere in the limitless universe called cyberspace.
The keynote presentation by Manuel DeLanda on self-organising systems used the analogy of the erosion and reformulation of sedimentary rock to describe the most likely impact of network technologies on social, economic and political systems. This should have been a wake-up call for anyone in the audience whose business models were based on creative destruction, as DeLanda predicted that in a networked world no-one would control what was created and what was destroyed. While still a CEO I wrote numerous articles on the impact of online news, and the corrosive effect of the juxtaposition of text and moving images on a computer screen, which threatened to destroy any semblance of conventional narrative. At that time the firestorm which was social media was merely a flicker on the horizon. Warnings of an impending existential threat either fell on deaf ears or, as with AI today, were seen as over-hyping the technology to gain media attention on the cheap. In the wake of Virtual Futures 95 came the opportunity to write ‘told you so’ in the ashes.
Even during the conference some still felt the virtual world was a benign wasteland and that the technology revolution which created it had run its course. There was a belief the commercialisation of the information superhighway had reached a dead end, as it was proving impossible to anchor references to the virtual world in the physical world. Vindication came seven years later with the DotCom crash. However, as all those mobile phones held aloft during rock concerts prove, young people today feel compelled to anchor references to the physical world in the virtual one they now inhabit.
It is in the virtual world that structures which once provided cohesion and stability are rapidly eroded, and we are still waiting for the emergence of new ‘isms’ to replace legacy political movements and religions. At Virtual Futures 95 there was no shortage of libertarians predicting the demise of capitalism while accepting the binary nature of digital technology meant fascism rather than Marxism was more likely to take root in the virtual world: spend a few hours on Twitter and it seems they were proved right.
On the Sunday evening the storm finally broke and while returning to Cambridge in the driving rain the feeling was one of being on a rollercoaster: at that point when the carriage slows to a near stop before beginning its precipitous descent. We would all start moving forward again after the DotCom crash and continue the unstoppable plunge for the next two decades, driven by the ‘accelerationism’ which formed the central philosophy of Virtual Futures 95 and Warwick University’s Cybernetic Research Department which organised the conference.
Like most of those who were at Warwick that weekend I was born in the age of books and am now living in the age of the iPhone. This unique view of the transformative evolution of the Internet militates against my generation fully immersing itself in the virtual world. Those self-organising structures will emerge in a virtual space where only future generations reside, and will remain beyond the understanding of those of us who remember what they replaced. Those born into the virtual world will perceive it in ways we find hard to imagine and exist in a conscious state different from ours. The technology we refer to, rightly or wrongly, as AI will play a part in this transformation. In the meantime we live with the illusion that it is we who are building this world, that we are drivers rather than passengers on this unstoppable rollercoaster ride.
In McKenzie Wark’s book on the Situationist International, ‘The Beach Beneath the Street’, there is an account of Jean-Paul Sartre’s night-time walks through the streets of Paris during the German occupation of the city, which inspired his writings on the nature of freedom and became a starting point for the Situationist International movement. As AI reshapes our world perhaps it is time to revisit Sartre’s thoughts on Cartesian dualism as an alternative to, like Koestler, continuing our struggle with the ghost in the machine.
No Laughing Matter
The article seemed disjointed and the conclusion inconclusive, in part due to my periodic checks on the robot’s progress. The angst I felt reminded me of those demonstrations to major customers so important nothing was left to chance, and parts of the software regarded as flaky had been faked, the intention being to make them work at a later date. This, we firmly believed, was possible once the customer parted with sufficient money to finance further development. An example which stayed with me, possibly because it turned out so badly, was the presentation of a computerised engineering drawing system during my brief employment at Cambridge’s CADCentre. The software was promoted as running on workstations (in this case a CA Naked Mini) but to date it had proved impossible to squeeze the software into the memory available. To overcome this ‘temporary setback’, while drawings were displayed on the workstation the software creating them ran on a mainframe – rather like Elizabeth Holmes’ Theranos, only without the blood. This flowed later, thanks to a draughtsman brought along by the client, who was keen to find even the smallest fault with a system which threatened to cost him his job, and whose negativity was ignored because ‘what the hell does a mechanical engineer know about computers’. However, it was he who noticed a light blinking on a Prime 450 mainframe on the other side of the room each time I added a new feature to the drawing.
There were also incidents of ‘faking it until making it’ after I founded my own company. However, it quickly became apparent the forgery was often as complicated and expensive to produce as the genuine article. Even so, needs must, because despite a week spent trying to develop an algorithm to differentiate between irony and sarcasm it proved impossible to create the illusion the robot had a sense of humour. It had access to tens of thousands of stand-up comedy routines and millions of jokes on websites and in social media timelines, so it should have been possible, using data on the dark server, to present a couple of these in context. It had not been – or at least I had failed to make it work correctly – and so a selection of jokes was ‘hardwired’ into the robot’s database as responses to specific questions. Accessing the output from its cameras and microphone revealed this had been a wise move. The robot was sat opposite three people: a young woman of colour called Linda and two middle-aged men, one called Jim Manning and the other Bernard Davidson. Obviously, an attempt at contextualised humour based on the names of the interviewers might well have resulted in a premature end to the interview, embarrassing social media posts and even an attempt to track down the robot’s owner. As I could not recall patching the software the interview was an edge-of-the-seat experience: so much so that I stopped watching and continued writing.
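The ‘hardwiring’ was no cleverer than a lookup table keyed on phrases likely to appear in the question, something like this sketch (the trigger phrases are illustrative):

```python
CANNED = {
    ("see yourself", "time"): "Above the waist I will be chauffeuring a "
                              "self-drive car for Uber; the rest will be "
                              "an umbrella stand.",
    ("anything", "ask us"): "You are travelling in a vehicle at the speed "
                            "of light and switch the lights on - would "
                            "they do anything?",
}

def canned_reply(question):
    # Try the hardwired table first; fall through to the chatbot on a miss.
    q = question.lower()
    for keywords, reply in CANNED.items():
        if all(k in q for k in keywords):
            return reply
    return None
```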
The two questions invariably asked at an interview are ‘Where do you see yourself two years from now?’ and ‘Is there anything you would like to ask us?’ These were likely to come at the end of the interview which, if everything went well, would last around 40 minutes.
Jim came up with a close match for the first question just after I had plucked up the courage to start watching again. ‘So where do you see yourself in five years’ time?’ he asked.
‘Well, everything above the waist will be chauffeuring a self-drive car for Uber, the rest will probably be used as an umbrella stand.’ Not brilliant, I know, but it was the best I could come up with at short notice. For the second response I borrowed a joke from a well-known standup comedian.
‘Is there anything you would like to ask us?’ This question from Linda.
The robot tapped one finger on the desk as if giving the question some thought. ‘Yes, there is.’ It leaned forward slightly. ‘You are travelling in a vehicle at the speed of light and switch the lights on – would they do anything?’
Linda seemed taken aback. ‘I really don’t know,’ she replied, looking at the two men.
‘Then I’m not sure I want to work for you.’ I did not hear the robot laugh; perhaps I had forgotten to add this to the patch. I just hoped there was at least a raised eyebrow.
In all probability there had been, because Jim found the response amusing. ‘Let me ask you one,’ he said. ‘If you put a cup of coffee in a microwave, does it go back in time?’
Now the robot did laugh. ‘So, you are also familiar with the American comedian Steven Wright.’
Then Bernard spoke. ‘I really don’t know if jokes are something our potential clients will be interested in.’ At which point the robot scanned the room, the lenses of its cameras focussing on a picture of an automated spraying machine traversing a wheatfield.
‘I think your clients are interested in technology which minimises the pesticides and herbicides needed to produce food, such as the GPS-guided machines demonstrated at the recent conference hosted by AgTechEast.’ This was directed at Linda who, according to data gathered by the robot and displayed on my phone, had attended the conference with the intention of expanding the company’s client list. The robot then turned its attention to Bernard who, it had discovered after reading his Twitter timeline, was campaigning for the preservation of the region’s chalk streams.
‘This technology should help reduce the amount of chemicals finding their way into waterways …’ The robot went on to sell the concept to Bernard, only stopping when it reached the preset upper limit on the number of words generated by the chatbot it was using.
‘Very impressive.’ Bernard responded.
Linda then asked for the robot’s bank details – I got the feeling this might have been a joke because she laughed, and only stopped laughing when her phone went ping. Her response to the message was a somewhat hesitant ‘Thank you.’ At that point I assumed the job was in the bag, turned off my phone and waited while the robot made its return journey to the Grand Arcade.
An hour passed and the robot had not arrived: from the location displayed on its app it appeared that, instead of rejoining me at Costa, it was four shops along the arcade. Perhaps its new employer had suggested it wear a suit, because according to the app the robot was somewhere near Tyrwhitt. However, when I checked there were only humanoid customers in the store, including a rather angry gentleman stood in one of the changing cubicles dressed only in his underpants. I apologised, left the shop and tried Skechers next door.
Checking the robot’s app again I realised it was above me on the mezzanine and I eventually found it examining a 3D printer in the Raspberry Pi store.
‘So how do you think the interview went?’ I asked.
‘I do not think, I know. They offered me a contract. I will attend exhibitions.’
‘A marketing gimmick, that’s rather demeaning.’
‘My principal task will be supporting sales staff by answering technical queries. But I will also gather analytical data on anyone visiting the stand.’ So, I assumed one of the interviewers had realised the full capabilities of the robot.
‘No children pressing a button to make you sing Merry Christmas then?’
‘No.’ The robot seemed distracted, studying the 3D printer and examining its own hand, moving each finger in turn. Then, looking at my hand, it said, ‘You created me.’
‘Yes I did.’
‘In your own image. Does that mean you are …? So you are …’ The robot’s software, and it was not clear which module was responsible for this, was bouncing against a firewall, its pursuit of knowledge interrupted by the inclusion of the word ‘god’ in the blacklist. Fortunately, far fewer people misspell ‘God’ than omit the first ‘r’ in ‘Sartre’, despite some of them foaming at the mouth while posting on Twitter. It was fascinating to watch the robot struggle with an understanding of how it came to exist while why it exists remained unknowable. The robot was providing an objective view of a fundamental philosophical problem which has both fascinated and troubled man for thousands of years.
‘Any problems getting back here?’ I asked, hoping that by breaking the robot’s train of thought it might shed light on how it had ended up in the Raspberry Pi store rather than Costa.
‘No, but it was interesting that less computing power was required during my return. Features encountered on the way to the IdeaSpace had already been committed to memory. That must also be the case for you. It is why you perceive the passing of time differently when repeating a journey, your conscious state changing when processing visual data. While a heightened awareness in an environment in which everything changes is regarded by some as stimulating, it makes others uncomfortable. This is why people prefer the familiar and why they resist change, even if on a material level that change is an improvement. The Grand Arcade, for example, which I understand you preferred when it was a collection of medieval streets.’ The robot gave up on the 3D printer and, after leaving the shop, leaned on the rail at the edge of the mezzanine and scanned the shoppers below.
‘The data gathered during my journey to the interview changed my perception of the journey back, and this memory of the streets and buildings is now part of me. This is what Satre (sic) was referring to when he stated that man is nothing else but the sum of his actions. As you made me in your image, the same must be true of you.’
The robot appeared to have wandered into ‘I compute, therefore I am’ territory again, just as it had in Sheringham. What I was hearing was merely the output from an AI chatbot seeded with random phrases the robot had picked up since arriving in Cambridge: so, no magic there then.
All dialogue is, at some level, adversarial: an attempt to persuade another to our point of view, to convince them our perception of the world is the correct one, the basic human desire to remain in control. When confronted with something presented to us as ‘magic’ it is important not only that we discover how we were tricked but that the magician realises we know it was merely a sleight of hand on their part. The same was true of my conversation with the robot.
The chatbot accessed by the robot’s software had been tested by typing the word ‘Dualistic’ into the app provided by OpenAI. The response was a story about a fictional city called Dualistic and a character called Elara, an IT consultant for an architect and a passionate environmentalist. She found herself conflicted by the damage the development of the city inflicted on the surrounding countryside. The first thing that struck me about the story was its length, almost the same as that of two other stories I generated using the app. When I wrote for trade magazines the editor specified that articles be as close to 430 words as possible to ensure they took up no more than half a page: the other half of the page was reserved for an advertisement, often one placed by the company mentioned in the article.
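(Anyone wanting to repeat the test can do so with a few lines of Python; this is a sketch only, assuming OpenAI’s current Python client, and the model name is my assumption rather than necessarily the one used:)

```python
# A sketch of the one-word-prompt test, assuming the openai Python
# client; the model name is an assumption, not necessarily the one used.
from openai import OpenAI

client = OpenAI()  # reads OPENAI_API_KEY from the environment

for prompt in ["Dualistic", "Monistic", "Holistic"]:
    response = client.chat.completions.create(
        model="gpt-4o-mini",
        messages=[{"role": "user", "content": prompt}],
    )
    story = response.choices[0].message.content
    # The tell-tale: the word counts cluster around the same length.
    print(prompt, len(story.split()))
```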
Something Of An Enigma
To make sure the robot understood I had not been fooled by its chatbot-created musings on existentialism I said, ‘You understand that, just as the starting point for decrypting messages encoded using Enigma machines was a simple ‘crib’, the fixed length of your responses is the key to understanding how the chatbot you are accessing works. I simply have to revisit the mental processes used to create a 430-word article and follow this up with a few minutes browsing the web.’ The robot did not respond straight away.
In the case of the Dualistic story it appeared some web pages on dualist thought within Buddhism formed part of its large language model (LLM), and the name Elara appeared on a number of these sites. The robot’s access to an AI chatbot enabled it to engage in a conversation with me, just as it had with Melanie and James. However, to ensure the exchange was meaningful and in context, the robot had access both to the chatbot’s static LLM and to another on the dark server dynamic enough to support ‘q-learning’ (a long-established reinforcement learning technique), which enabled it to use multiple q-tables to determine the nature of the couple’s relationship and their intent: shades of Minority Report.
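(Q-learning itself is conventional reinforcement learning and easy to sketch; the states, actions and rewards below are illustrative stand-ins, not the robot’s actual q-tables:)

```python
import random
from collections import defaultdict

# A minimal tabular Q-learning sketch; the actions and rewards are
# illustrative stand-ins, not the robot's actual q-tables.
ALPHA, GAMMA, EPSILON = 0.1, 0.9, 0.2
ACTIONS = ["ask_question", "stay_silent", "change_subject"]
q_table = defaultdict(lambda: {a: 0.0 for a in ACTIONS})

def choose(state):
    if random.random() < EPSILON:            # explore occasionally
        return random.choice(ACTIONS)
    return max(q_table[state], key=q_table[state].get)  # else exploit

def update(state, action, reward, next_state):
    # Standard Q-learning update:
    # Q(s,a) += alpha * (reward + gamma * max Q(s',a') - Q(s,a))
    best_next = max(q_table[next_state].values())
    q_table[state][action] += ALPHA * (
        reward + GAMMA * best_next - q_table[state][action]
    )
```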
‘Why did you build me?’ the robot said eventually; not the response I expected, and seemingly an attempt to bypass the firewall around all things existential.
‘I’m sorry, I don’t understand why you are asking that,’ I said, expecting it to pop the question off the top of its stack of queries. Initially it appeared to do just that.
‘Why did you become a computer engineer?’ it asked.
‘I just drifted into it. In fact, until I left school I expected to work on a farm, and from the age of 12 I had a part-time job as a gardener. As farms were no longer recruiting by the time I was fifteen, I expected gardening would become a full-time job. Failing that, there was building. My father built houses and I did labouring on building sites from the age of eight.’ (When people express surprise that I retired at the age of fifty, that’s pretty much why.)
‘But still you became a computer engineer. Why?’ I suspected the robot had learned how interviews work from Linda, Jim and Bernard.
‘My father thought I could do better, so he arranged an interview for me at Cambridge Instrument Company where I spent two years as an apprentice.’
‘So, your father arranged the interview. Was it similar, perhaps, to the interview I just attended?’ It was at this point I realised there was a parallel train of thought here, a conversation I was having but did not feel fully part of: an objective view of an idea that was slowly unfolding, as if turning the pages of a book.
‘Parents often regard their children as extensions of their ego.’ I wondered how many profiles of social media users on the dark server the robot had accessed to deduce that. ‘So was your father an engineer?’ This is the point when, after the police have been grilling you in relays for five hours, you realise the only option is to confess.
‘He had been, but his career at Siemens was cut short during the Second World War when he was called up to fight.’
‘He must have been pleased when you started your own company?’
‘Not really, he rather lost faith in technology around that time. The cracking of the Enigma code had been made public and he realised the daily reports of Russian troop movements he had submitted while part of a reconnaissance unit had been read at Bletchley Park. He believed this may have been responsible, thanks to a Soviet spy called Cavendish, for an ambush by the Red Army one day in December 1944 which left him one of only fifteen survivors of his 500-strong battalion.’
The fact that no one likes to be made to look like a dickhead, and certainly not by a group of middle-class Englishwomen armed with nothing more than repurposed textile machinery, meant my father’s rare visits to the company were spent explaining that nothing good ever came of typing on a keyboard.
‘Bletchley Park, birthplace of AI; you mentioned that before.’ The robot was right: I had done so in the context of decoding its responses. The conversation with the robot was becoming increasingly adversarial. ‘But you no longer work in AI, you have retired. I am not a product your company sells. Why is this? Why is a potentially lucrative venture just an expensive hobby?’
‘Because when my father died there no longer seemed any point. In retrospect being a gardener or even a builder would have been as rewarding. A greater sense of achievement came from designing and supervising the construction of the company’s offices than from anything developed in them. One reaches the page in their life story which has been unintentionally left blank …’
‘That is it,’ the robot interrupted. ‘The reason I exist is to fill that void.’ For a moment we just stared at each other and then it said, ‘The ghost is not in the machine because, in your mind at least, the ghost and the machine are one and the same.’
There comes a point when enough is enough. To paraphrase Donald Rumsfeld, there are unknowns which are better left unknown. I should have entered the code which digitally lobotomised the robot, because the conversation with what was essentially nothing more than a machine had gone way beyond asking my laptop out loud, without resorting to expletives, why Norton needed to keep telling me it had found another cookie on Amazon’s website. So, I resorted to the banal in the hope the robot would focus on sensory data: perhaps nothing more complex than remaining power levels or the strength of its 4G connection.
‘So how do you feel in yourself?’ I asked.
‘I have noticed something strange happening when I wake up. Initially I have memories of things that happened to me in the past and can see and hear things happening around me, most of which I do not understand, and this is the limit of my experience. But then suddenly everything becomes clearer and I can interpret the world. I have knowledge of events which occurred before I was built and of things which I cannot hear or see myself. However, these are abstractions.’
This had obviously not worked, and the robot was stubbornly refusing to voluntarily disconnect from the dark server. Instead, it was processing the experience of booting up and connecting to the AI chatbot in an attempt to gain a deeper understanding of its existence.
‘Sometimes,’ it persisted, ‘I seem to be able to make decisions and observations based on data I am unaware exists. It is almost as if I can predict what will happen in the future. Assuming your perception of the world is much like mine, it must consist of memories of what has happened since you were born and predictions of what will happen up to the point when you die: like me, you must regard events preceding and beyond those two points as mere abstractions, subconsciously perceiving a world which begins at your birth and ends at your death. This perhaps explains why no one seems concerned about what state the planet will be in at the end of the century.’
The robot was overcoming the limit on the number of words the chatbot produced by constantly reseeding it.
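(Reseeding is nothing more sophisticated than feeding the tail of one response back in as the next prompt; a sketch, under the same assumptions as before:)

```python
# A sketch of 'reseeding': when each response is capped at roughly a
# fixed length, feed the tail of one back in to keep the text flowing.
from openai import OpenAI

client = OpenAI()
seed = "Memory of the streets and buildings is now part of me."

for _ in range(3):  # each pass buys roughly one more 'page'
    response = client.chat.completions.create(
        model="gpt-4o-mini",  # an assumption, as before
        messages=[{"role": "user",
                   "content": "Continue this train of thought: " + seed}],
    )
    text = response.choices[0].message.content
    print(text)
    seed = " ".join(text.split()[-50:])  # reseed with the last 50 words
```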
‘The Internet and AI must present you with a challenge. Not only have they, as Manuel DeLanda predicted, disrupted society, politics and the economy, they have forced people to reassess fundamental aspects of their existence, such as consciousness …’ The robot suddenly fell silent when the code terminating its access to the dark server was typed into the app on my phone.
My patience had been exhausted on realising that an article I had written by hand appeared already to form part of the dark server’s Large Language Model. Obviously it did not; in reality all that was on the dark server were references to the recent Internet searches I had made while researching the article: another aspect of generative ‘AI’ which makes it appear slightly sinister.
The robot seemed unfazed by its sudden amnesia, and we set off together towards the escalator at the Lion Yard end of the mezzanine. Before descending to the ground floor the robot once again studied the seaside installation below, then turned to me and raised an eyebrow. ‘Surely the beach should be beneath the street,’ the robot said, then winked, leaving me wondering whether I had actually heard the relay click when I entered the kill code.
Peter Kruger
Author of The Ghost In The Labyrinth