Accenture Song had two things to show me, each tucked into the corner of a building where the nonprofit think tank SeedAI was hosting an AI showcase at SXSW in Austin, TX.
The first was in a small, silver room with a single laptop.
“When our lawyers saw this, they told us not to release it,” said Justin Durazzo, immersive design director at Accenture Song.
Inside the room, the monitor took a snapshot of my face. I was prompted to speak three different sentences, each with an increasing level of emotion. I watched a three-minute video hosted by Kyle Vorbach, who directed the 2024 documentary How I Faked My Life With AI, talking briefly about deepfakes.
And then, I was shown surveillance footage of me acting sketchy in a hotel, and a social media video of me throwing a tantrum in a restaurant.
What’s increasingly alarming about deepfakes is how quickly and easily they can be created, and not just of celebrities with libraries of film, photos, and audio to sift through, but of everyday citizens, using minimal input.
The exhibit was called “Meet Your Digital Maker,” built by Accenture Song and its agency Droga5 to showcase the First AI-ID Kit, an initiative that debuted at CES 2025 to help people protect themselves against deepfake attacks.
The videos didn’t replicate me perfectly. Accenture Song’s head of innovation and executive creative director Maria Devereux told me the tech could make something with far more fidelity with a little more time and a few more assets.
But the point of the exhibit was to show how quickly and easily the tech could create a deepfake. Three minutes was the minimum amount of time needed to create something passable, and the tech is only going to improve in the coming years.
The second part of the exhibition, which debuted at SXSW 2025, took place in a phone booth (“It was really hard for us to get a hold of one of these,” Devereux told me), where I had a conversation with an AI chatbot about how I felt about deepfake technology.
It listened to me answer a few questions, and then the AI chatbot’s voice became my voice.
Or at least, an approximation of it. The cadence was a little off and the speech seemed to carry the hint of an accent that I definitely don’t have, but the point was the speed at which my voice could be stolen and repurposed.
I listened to myself claim to be in trouble and beg for money.