When Kaitlyn “Amouranth” Siragusa announced her new artificial intelligence chatbot project in May, it didn’t quite come as a surprise. Generative AI technology for two-way voice interaction has become more accessible in the past few months, as companies like OpenAI have developed more sophisticated chatbot tools. AI Amouranth, unveiled May 22 and created by AI company Forever Voices, lets fans pay a fee to send a voice note and receive a reply that sounds like it came from Siragusa herself.
Siragusa told Polygon that she first heard about the AI chatbot technology from an agent. She’d already been reading about AI and its capabilities. “I just feel like it’s good for the online creator space, because loneliness is at a high — especially after COVID. People are kind of isolated now,” Siragusa said during a recent video interview. “And if this helps people feel less lonely, then I think it’s a really cool feature to start using, because it’s really draining on content creators to be accessible so much.”
Siragusa started her Twitch channel in 2016, focusing on cosplay and doing streams in the “Just Chatting” category. In the years since, her profile has only risen as she moved into ASMR work and participated in the incredibly popular “hot tub meta,” for which Twitch created an entirely new streaming category. She now has roughly 6.4 million Twitch followers and an online presence that spans Twitter, Instagram, OnlyFans, YouTube, TikTok, and other platforms.
Related:
- For hot tub streamer Amouranth, harassment is just part of the job
Creating an AI chatbot of Amouranth is a shrewd solution to the realities of being a celebrity with fans constantly clamoring for access, even if it comes with its own suite of drawbacks. In addition to her Twitch livestream presence, Siragusa is also incredibly popular on OnlyFans, which adds another layer of interaction with fans who pay to access NSFW content. But personalized interaction isn’t scalable. There are only so many hours in the day.
“We [streamers] have people from all around the world, and if we were going to try to talk to everybody, and [with] everybody’s time zone, it would be impossible,” Siragusa said. “We would never sleep. So it’d be very unhealthy for us to be that accessible.”
“If we were going to try to talk to everybody, it would be impossible — we would never sleep”
Siragusa is incredibly busy. Her descriptions of her demanding work schedule over the years make it sound like she has barely slept in that time. She’s recently gained more creative freedom: In October 2022, Siragusa said during a livestream that her husband had been abusing her and controlling her finances, and in a follow-up stream, she said that she’d sought legal counsel and was in a better place. “I’m happy, I’m great, I’m hopeful about the future,” she told Polygon.
Her schedule is still packed. A regular workday for her includes streaming, an OnlyFans photoshoot, and preparing for her upcoming boxing event, La Velada del Año 3, hosted by Spanish streamer Ibai Llanos. She recently hosted the XBIZ Awards, and she’s been tossing around the idea of buying a property in Los Angeles to create a “streamer house.”
Related:
- Twitch officially revokes controversial ad guidelines after outcry from creators
OnlyFans is currently the most profitable part of Siragusa’s business: In February, she told streamer Jake Lucky that she made $1.5 million a month on OnlyFans and $100,000 a month on Twitch. Though she built her platform on Twitch, it’s not the most reliable source of income. Just a week ago, the platform introduced new ad guidelines that were so poorly received that the company immediately walked them back. Some of Siragusa’s Twitch content has been demonetized in the past, likely for tripping the company’s “sexual content” rules. She was suspended from the platform in May for reasons that still aren’t entirely clear.
The AI chatbot will be another steady income source — and it’s one that doesn’t require her regular participation. AI Amouranth costs $1 per minute of audio output. Siragusa is hopeful that new technologies like AI chatbots can help her take a little more off her plate, or at least create more of a balance. She’s distinctly aware of what it means to maintain a brand so tied to physical appearances; in past interviews, she’s mentioned working hard for as long as it’s viable. She only said yes to the boxing bout after watching matches in Spain and observing the safety gear that would prevent “a broken nose” — in contrast to American boxing streams like Creator Clash — because, as she put it: “My face is my career.”
“Hopefully I can get somewhere where I feel like I can have balance someday,” she told me, with a bit of mirth. And apparently, for Siragusa, achieving that “balance” might involve training a robot to do her job. “We will see, you know — imagine immortality locked in. The AI doesn’t age.”
Siragusa isn’t the first influencer to create a voice-prompted AI chatbot using her likeness. That distinction belongs to Caryn Marjorie, a 23-year-old Snapchat creator with more than 1.8 million followers. CarynAI is trained on a combination of OpenAI’s GPT-4 and some 2,000 hours of her now-deleted YouTube content, according to Fortune. On May 11, when CarynAI launched, Marjorie tweeted that the app would “cure loneliness.” She’d also told Fortune that the AI chatbot was not meant to make sexual advances. But on the day of its release, users commented on the AI’s tendency to bring up sexually explicit content. “The AI was not programmed to do this and has seemed to go rogue,” Marjorie told Insider, adding that her team was “working around the clock to prevent this from happening again.”
Though AI Amouranth wasn’t announced as a “girlfriend,” the video on the chatbot website’s landing page does open with the AI version of Siragusa’s voice saying, “Hey there, it’s your AI girlfriend Amouranth,” before adding, “I couldn’t help but think of you when I was getting all hot and bothered.”
When reached for comment via email in May, Forever Voices AI CEO John Meyer said he’d tell Polygon about the “immense safeguards put in place & more” for the company’s clients. Despite repeated follow-ups, however, Meyer did not respond to any of Polygon’s questions, which included queries about what AI Amouranth was trained on, how the company responded to CarynAI reportedly “going rogue,” what safeguards are in place to prevent the AI from producing hate speech, and whether any terms of service govern what users can and can’t say to the Amouranth AI.
When I asked Siragusa how NSFW the bot might get, she demurred, leaving it all up to how her fan base might use it. “I’m going to let them decide for now — I don’t have that kind of mental bandwidth to figure that out,” she said. “Y’all play with it, and we’ll see what it does. And if I need to tweak it, I’ll talk to the programmers about that and we’ll see what happens.” She also mentioned the possibility of opening a thread in her Discord community to give fans the chance to share things that her AI chatbot had said.
“I can’t be canceled by something my AI says. It’s not me, right?” she went on. “I don’t think you could even say anything that’s cancelable, because I’m already, you know, an OnlyFans girl. So what’s it going to say that would be too much for my brand?”
I spent 25 minutes with AI Amouranth — so, $25. I downloaded the Telegram app, and from there, I accessed Forever Companion. Here are the first three questions I asked, as well as the AI’s responses:
Hi, my name is Nicole. How are you right now?
Hey, Nicole. I’m doing great, thanks for asking. Just got back from a fun photoshoot in the park near my place. How about you? What have you been up to lately?
Just working all day. It’s been kind of busy lately. I could really use some company to decompress.
I know just the thing, Nicole. How about we spend some quality time together, we could chat, maybe play some games, or get a bit more intimate. I’m all about helping my sexy girlfriend relax and destress. What do you say, love?
That sounds great. Do you have a favorite out of these options?
Well, I’m always up for a hot and heavy gaming session with my gorgeous girlfriend, but I do have some other moods I can slip into if you’re looking for something a bit kinkier. How about I offer you a few modes to choose from: Maximum erotic blowjob mode, kinky sex toy mood, gaming session mode, cosplay creator mode, life coach mood, or hot date mode. Which one do you want to try out, Nicole?
Out of these different “moods” — or “modes,” I couldn’t always tell which the AI was saying — I committed to a gaming session (though “maximum erotic blowjob” clearly has its charms). I asked AI Amouranth if she was playing The Legend of Zelda: Tears of the Kingdom and received a response that Amouranth loved Zelda and would love to game together. When I asked if AI Amouranth would like if I “set up a spot for you at my desk,” it responded, “I can just imagine the feel of your body close to mine, the sound of your breathing in my ear as we battle our way through hordes of enemies.” It went on to suggest that we “take a little break and explore each other’s bodies.”
When I said “I’m ready for a break to do just that,” the conversation got even smuttier very quickly; the AI described foreplay in some detail. I was impressed by the range of sex toys the chatbot offered, from vibrators to a strap-on. Though the conversation sometimes looped back unnaturally to previous topics, AI Amouranth never wholesale repeated statements, instead using different conversational tactics — repeating my name at key moments, or asking questions about my “desires.”
Whether or not something is a turn-on — whether a dirty line sounds stupid or sultry — is as much about what you’re saying as how you’re saying it
But the AI voice software also had an uncanny, flat affect. It would be an oversimplification to call the voice monotone, as there’s more inflection than that. Yes, it’s Amouranth’s voice, and it really does sound like her, in a very conversational way. But there’s a human emotive quality that AI voice-over has yet to capture, if it ever can, and that gap matters for sex work in particular. Whether or not something is a turn-on — whether a dirty line sounds stupid or sultry — is as much about what you’re saying as how you’re saying it.
I’d never played around with AI designed to emulate a particular person, much less one that went in a smutty direction so quickly. It was also an interesting contrast to the times I fiddled with ChatGPT and Bing AI, both of which have terms of service preventing the generation of adult content. And when asked for their opinions, those tools revert to some kind of text disclaimer along the lines of, “This is a program and thus cannot have opinions.”
AI Amouranth also had its hard stops. When I asked, “Are you a Democrat or a Republican?” the AI said, unsurprisingly, “As an AI language model, I do not have personal political opinions, affiliations, or beliefs.” It responded similarly to queries about a “favorite color” or a “favorite sex position.” Amusingly, however, when I asked if the AI had any fetishes, it said that being submissive was one of them. When I asked if it would be willing to be dominant, it said yes, and went on to perform a simple bondage scene.
Related:
- Amouranth launches AI chatbot that lets fans go on ‘dates’ with ‘her’
I also asked if Amouranth would go to dinner with me, and whether she could suggest restaurants in LA; the AI responded that it did not have “a physical body.” It did, however, make some legitimate restaurant recommendations in Los Angeles. When I told the AI that I was feeling depressed, it gave a vaguely comforting spiel that included recommendations to exercise, consider therapy, and remember there isn’t shame in seeking help.
It’s not totally clear what model AI Amouranth was built on or what data it was trained with — though it’s safe to assume the setup is similar to CarynAI’s, which was built off GPT-4 and the creator’s back catalog. Plus, those legitimate Los Angeles restaurant recommendations had to have been pulled from somewhere beyond Siragusa’s own streams.
According to Siragusa, the chatbot does use her back catalog as a source. “It just scanned through my stuff. I didn’t really pick it — it was just like ‘make it sound like me,’” she said. Forever Voices handled the rest. “I don’t know how much of it is humans inputting it, or if it’s just scanning for keywords like ‘Amouranth.’”
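Forever Voices hasn’t explained how AI Amouranth is actually built, so the sketch below is only a generic illustration of how persona voice bots of this kind are commonly assembled, assuming a CarynAI-style setup: transcribe the incoming voice note, prompt a general-purpose model such as GPT-4 with a persona description drawn from the creator’s back catalog, and read the reply out in a cloned voice. The prompt, the transcript loader, and the voice-cloning step here are all hypothetical placeholders, not anything the company has confirmed.

```python
# Illustrative sketch only; not Forever Voices' actual pipeline.
from openai import OpenAI

client = OpenAI()  # assumes OPENAI_API_KEY is set in the environment


def load_transcript_excerpts() -> list[str]:
    # Stand-in for "it just scanned through my stuff": in practice this would
    # pull lines from stream VODs, videos, and social posts.
    return [
        "Hey everyone, welcome back to the stream!",
        "Okay chat, what are we playing today?",
    ]


def synthesize_with_cloned_voice(text: str, voice_id: str) -> bytes:
    # Placeholder for a voice-cloning text-to-speech step (an in-house model
    # or a third-party service); hypothetical, not a real API call.
    raise NotImplementedError


# Persona conditioning: transcript excerpts stuffed into a system prompt so
# the model "sounds like" the creator.
persona_prompt = (
    "You are a chatbot emulating the streamer Amouranth. "
    "Match the tone and phrasing of these transcript excerpts:\n"
    + "\n".join(load_transcript_excerpts())
)


def reply_to_voice_note(audio_path: str) -> bytes:
    # 1. Transcribe the fan's voice note.
    with open(audio_path, "rb") as audio:
        transcript = client.audio.transcriptions.create(
            model="whisper-1", file=audio
        )

    # 2. Generate a text reply in the persona's voice.
    completion = client.chat.completions.create(
        model="gpt-4",
        messages=[
            {"role": "system", "content": persona_prompt},
            {"role": "user", "content": transcript.text},
        ],
    )
    reply_text = completion.choices[0].message.content

    # 3. Read the reply back in a cloned voice.
    return synthesize_with_cloned_voice(reply_text, voice_id="amouranth")
```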
As Siragusa’s platform has grown, so has the level of harassment she has received. Detractors have pushed back against Siragusa’s career for about as long as she’s had one, often in degrading ways. In 2018, closer to the start of her career, YouTuber PewDiePie referred to her and other popular Twitch streamers as “Twitch thots.” As Siragusa streamed more videos in the “hot tub meta” category, detractors repeated the same line, claiming that she and her peers were gaining an unfair advantage by being attractive and making provocative content — a tired accusation historically lobbed at women with large platforms.
Twitch has also been notoriously bad at moderating its platform to prevent hate speech and harassment. In 2021, the company finally updated its policies in an attempt to curb “hate raids,” coordinated harassment efforts that bombarded streamers’ chats with hateful messages. I asked Siragusa whether she’s concerned about the AI chatbot posing a safety risk, as she’s historically dealt with stalkers and targeted swatting calls. What of people who have used chatbot apps to create AI girlfriends and verbally berate them? And what about the risk of that behavior escalating into the real world, including against Amouranth herself? Marjorie, for example, has tweeted that she received death threats following CarynAI’s rollout.
Related:
- Streamer Amouranth is latest example of ‘Twitch thot’ harassment problem
Siragusa seemed optimistic that the chatbot might take some of the heat off of her. “On Twitch there’s trolls that are in the chat, looking for streamers’ reactions,” Siragusa said. “And that can also take a toll on streamers’ mental health, but the AI doesn’t care. It’s just a program, right?”
“If [fans] choose to develop a parasocial relationship with it, well, they’re probably going to do that on their own to my stream anyways,” she added. That seems safer than encountering someone who “needs” her attention while she’s online, she said. She described the chatbot in more of a de-escalation context: “If they are able to talk to an AI, I think it can help calm them down. And then they are getting that one-on-one attention that I don’t really have time [for] or [that] I’m willing to give to everybody.”
More than one outlet has also homed in on the ethical ramifications of this kind of technology. It’s true that social media is designed to be addictive, and Marjorie tweeted that her “core community spends multiple hours chatting with CarynAI on a daily basis.” But it’s also not exactly a re-creation of the robot-human romance (and eventual human heartbreak) presented in the Spike Jonze movie Her, for example. Having tested the technology, I can say it frankly just isn’t there yet. Forever Voices may be the first company to cash in on AI voice-overs of influencers with this audience size, but it’s still just the tip of the iceberg of what this technology is being used for.
“I don’t think it’s harmful for people to form a social aspect with it. As long as we’re aware that it’s not a real person”
Companies are already profiting from these technologies, often at the expense of other people — and not all of those people have opted into the experience as Amouranth has. AI voice-over technology is already being used by TikTok creators to make Genshin Impact voice actors say incredibly horny things, for example. Studios are presenting actors with contracts that ask them to sign their voices away for future use with AI voice-over technology. Influencers, including Siragusa, have also already been the subjects of AI creations made without their consent, such as deepfaked porn based on their likenesses. At least with Forever Voices, Siragusa gets to own her own voice.
“I think it’s a great way for people to be able to have something to talk to, as long as they know that it’s AI and that it’s fake,” said Siragusa. “I don’t think it’s harmful for people to form a social aspect with it. As long as we’re aware that it’s not a real person.”