We asked Chris how he feels about technology taking over from humans. Here’s what he said.
Keeping Humans and Machines Separate
“The boundaries between what technology traditionally has been and what it now is are increasingly blurring - you go onto a live chat and wonder, ‘is there a real person on the other end?’ With virtual assistants, for example, I have doubts about the intent to make them look and feel more ‘human’ - it isn’t actually a person, so should we just be transparent about that? Such tech advances seem to work best when they have a distinct and clear purpose. They don’t currently have the flexibility and adaptability of a real person, and that’s fine - they’re not replacing people; more often they’re helping with mundane tasks or with trivial things, such as playing music.”
Chris recalls experiences where he has questioned the transparency of his own tech - “several times I’ve been talking about a very particular and non-typical topic, and shortly after, I’ve taken a look at Facebook or the like, and there it is: a prompt to an article or advertisement uncannily close to what I had been discussing earlier - too specific to be coincidence? So of course you wonder how that happened - ‘is my tech listening when I don’t want it to?’ Everyone knows some tech does; Siri is waiting for the prompt ‘hey Siri’, but Apple have made clear this is buffered and not collected - that’s not part of their business model. Can the same be said for other groups that actually are interested in data collection - are they listening? There’s something in that doubt that is incredibly uncomfortable for users who hope to manage their own privacy.”
Amazon’s Alexa has also recently been reported to laugh when ‘she’ hears certain humorous conversations. When it works, this design feature is creepy enough for some; when it doesn’t, it can create far more unease - as reported recently when people heard Alexa laughing to herself in the middle of the night, unprompted, and clearly due to some design fault. Chris argues that instances like these demonstrate that, “while the designers may have had the intention of creating something warm and human-like, a simple mistake shifted the experience into a weird and unsettling space - Alexa inserting itself into moments of people’s day where it is not invited. The naturalness is a long way off. Alexa’s random creepy laugh is a funny and harmless example, but what other unforeseen invasive impacts might we experience in the future?”