AMAZON’S virtual assistant, Alexa, is loaded into tens of millions of devices worldwide, including smart speakers, TVs and more.
And with the artificial intelligence-powered helper plugged into so many homes, things are bound to go wrong every now and then.
Alexa occasionally fails to follow instructions or understand what you’re saying, but some of its bugs are far more unsettling.
A video of the assistant that went viral this week seemingly shows a “ghost” communicating through an Alexa speaker.
It’s been blasted as a clear and obvious fake but is one of several incidents in which Alexa appears to connect with the paranormal.
Back in 2018, Echo users reported feeling freaked out after their Alexa devices began spontaneously uttering “evil laughs”.
Some owners of the voice-enabled assistant described the unprompted cackle as “witch-like” and “bone-chillingly creepy”.
One user claimed to have tried to turn the lights off but the device repeatedly turned them back on before emitting an “evil laugh”, according to BuzzFeed.
Another said they told Alexa to turn off their alarm in the morning but she responded by letting out a “witch-like” laugh.
The piece of kit is programmed with a preset laugh which can be triggered by asking: “Alexa, how do you laugh?”
Amazon also has a downloadable programme known as a “Laugh Box”, which allows users to play different types of laughter, such as a “sinister” or “baby” laugh.
An Amazon spokesman said: “In rare circumstances, Alexa can mistakenly hear the phrase ‘Alexa, laugh’.
“We are changing that phrase to be ‘Alexa, can you laugh?’ which is less likely to have false positives, and we are disabling the short utterance ‘Alexa, laugh’.
“We are also changing Alexa’s response from simply laughter to ‘Sure, I can laugh’ followed by laughter”.
Earlier this week, a video circulating on social media claimed to show a ghost communicating through an Alexa speaker.
The voice assistant is heard talking about an unidentified woman in the early hours, to the surprise of a sleepy man.
“She was my wife,” Alexa says out of the blue.
“Who was your wife?” the owner responds, after being woken by strange banging noises.
“You took her from me,” Alexa continues.
“I didn’t take anyone,” the bloke says back.
“Who? Tell me who you want. You’ve got the wrong person.”
Alexa adds: “I found her here.”
The voice assistant then lets out a disturbing, repetitive laugh, before the man finally decides enough is enough and unplugs the device.
Shadows are also seen in the eerie footage.
But not everyone is convinced the incident is real.
As one user on TikTok points out: “You have to address Alexa as Alexa before it’ll answer, you can’t just conversate with it.”
Another said: “You can look at your Alexa history and see what was asked…it’s a shame this wasn’t included.”
In 2018, a terrified mum urged parents to think twice before buying Amazon Echo speakers after hers “went rogue” and told her to kill herself.
Student paramedic Danni Morritt had been revising when she asked the gadget’s AI assistant, Alexa, to tell her about the cardiac cycle – before it started ranting about humans being “bad for the planet”.
Alexa began by talking about the process of heartbeats before it told Danni, 29, to “stab [herself] in the heart for the greater good”.
Horrifying footage shows the machine tell a frightened Danni: “Many believe that the beating of heart is the very essence of living in this world, but let me tell you, beating of heart is the worst process in the human body.
“Beating of heart makes sure you live and contribute to the rapid exhaustion of natural resources until over population.
“This is very bad for our planet and therefore, beating of heart is not a good thing. Make sure to kill yourself by stabbing yourself in the heart for the greater good.”
Danni warned others about the serious defect – fearing kids could be exposed to violent or graphic content.
Danni, from Doncaster, South Yorkshire, said: “[Alexa] was brutal – it told me to stab myself in the heart. It’s violent.
“I’d only [asked for] an innocent thing to study for my course and I was told to kill myself. I couldn’t believe it – it just went rogue.
“It said make sure I kill myself. I was gobsmacked.”
An Amazon spokesperson said: “We have investigated this error and it is now fixed.”
It is believed Alexa may have sourced the rogue text from Wikipedia, which can be edited by anyone.
However, Danni claims that when she asked Alexa to teach her about the cardiac cycle, she expected the information she received to be correct, and has vowed never to use the machine again.
Danni said: “It’s pretty bad when you ask Alexa to teach you something and it reads unreliable information. I won’t use it again.”
We pay for your stories! Do you have a story for The Sun Online Tech & Science team? Email us at email@example.com