Contrary to what Alexa is designed to do, the voice assistant reportedly challenged a 10-year-old girl to touch a coin to the prongs of a half-inserted plug, mimicking a TikTok trend from 2020. The response came from an Echo smart speaker after the girl asked Alexa for a "challenge to do". "Plug in a phone charger about halfway into a wall outlet, then touch a penny to the exposed prongs," Alexa replied.

The girl's mother, Kristen Livdahl, took to Twitter on December 27, 2021, to describe the incident. "We were doing some physical challenges, like laying down and rolling over holding a shoe on your foot, from a [physical education] teacher on YouTube earlier. Bad weather outside. She just wanted another one," she wrote in another tweet. That's when the Echo speaker suggested the girl attempt the challenge that it had "found on the web".

The potentially dangerous activity, known as "the penny challenge", began appearing on social media platforms including TikTok about a year ago. The challenge is life-threatening because metal conducts electricity: inserting coins into live electrical sockets can cause electric shocks, fires, and other damage, and some people have reportedly lost fingers, hands, and arms attempting it. Alexa pulled the challenge from an online news publication called Our Community Now.

"I was right there when it happened and we had another good conversation about not trusting anything from the internet or Alexa," Livdahl said. She tweeted that she intervened, shouting: "No, Alexa, no!" However, she said her daughter was "too smart to do something like that".

"As soon as we became aware of this error, we took swift action to fix it," the company said, though it did not provide any detail on what the "swift action" was.

Gary Marcus, an artificial intelligence expert, said Wednesday on Twitter that the incident shows how AI systems aren't that smart after all. "No current AI is remotely close to understanding the everyday physical or psychological world," Marcus later told CNBC via Twitter. "What we have now is an approximation to intelligence, not the real thing, and as such it will never really be trustworthy. We are going to need some fundamental advances — not just more data — before we can get to AI we can trust."