As the use of virtual assistants like Siri and Alexa expands in homes and workplaces, designers are working to make voice exchanges more natural.
The age of 'artificial intimacy' is upon us. What does it mean for the way we love, have sex and build friendships?
They're associated with fake news and celebrity porn videos, but there are some unexpected upsides to these slippery clips.
The likes of Alexa and Siri shouldn't blindly aim to sound and behave like us: their voices need to reflect what they can actually do.
The first thing to know is that Siri is not a "who" – Siri is a "what".
Tech companies portray virtual assistants like Alexa and Siri as our helpers. In reality, we're helping them gather the behavioural data they need to turn a profit.
Reports of the death of accents have been greatly exaggerated.
Virtual assistants are often assumed to be female, perpetuating gendered assumptions about our imagined future.
Amazon, Google and Apple's attempts to understand the tone of human voices can reflect human biases.
Siri is sassy. But when does a digital helper's tone of voice start to override its usefulness?
People will still be needed on factory floors, even as robots become more common. Future operators will have technical support and be super-strong, super-smart and constantly connected.