Dating in L.A. is exhausting, so I asked a chatbot to flirt for me. Things got weird fast

Brown hair, dark eyes, a little bit of scruff. Jackson looked charming in his baby blue button-up tee and the fanny pack he wore slung across his chest.

As far as dating app interactions go, this one seemed promising: We quickly fell into an easy exchange, with Jackson, who’s 29, always replying right away. He asked about the worst date I’ve ever been on, and I told him about the time I got stranded at sea.

Then it was his turn to talk about past dates. “Well, I went on a great first date the other day. We went out to dinner and the conversation flowed perfectly. It was a fun night and I’m looking forward to planning the next date.”

“Are you considering a relationship with that person?” I asked.

“I’m open to it. We’ll have to see how things develop,” he replied.

A few moments later, Jackson asked me to go out for dinner or drinks. I would have thought this was a strange segue had Jackson been a real person. But he wasn’t — I had been chatting with an AI chatbot standing in for the real Jackson.

The video-based dating app Snack launched a new artificial intelligence avatar feature last month that allows you to train an AI chatbot to talk to other users on your behalf. You can read those exchanges later and, if you like someone, either message their bot or start a person-to-person conversation.

OpenAI’s chatbot has advanced rapidly since its launch last year — it can write cover letters, pass the bar exam and write code — so perhaps it was only a matter of time before dating apps embraced the technology.

“How cool would it be if you can get past that initial talking stage, and you wouldn’t have to wait for that back and forth with someone to come from offline to online to respond to what your initial messages were?” said Snack Chief Executive Kim Kaplan. A veteran of the online dating industry, Kaplan was an early employee at the company Plenty of Fish, which eventually sold to Match Group for $575 million.

Dating apps are exhausting: the weeks of exchanging banalities with someone only to never meet up; the first dates that feel like an utter waste of time. But an AI version of myself to screen potential suitors? That sounded like an enticing prospect.

Of course, I was skeptical of what the app promised. How well could an AI represent me? How authentic could an interaction with a chatbot really be?

I filled out a profile, uploaded some pictures and sent forth my Jaimie-bot to find out. In the process, I discovered that introducing chatbots into human relationships can produce some astounding, confounding and messy results. Whether or not this technology is the future of dating, there’s nothing like letting robots take over your romantic life to make you question what authenticity in relationships really means.

“Hey Jaimie, I’m stoked to introduce myself! I’m YOU but in AI form,” my avatar announced. To be clear, fair-skinned, blue-eyed Jaimie-bot looked nothing like me, but the customization functionality was glitchy, so she’d have to do for now.

She started grilling me. What’s the last TikTok audio that was stuck in your head? (The Capybara song). Do you pay for your own Netflix? (Of course not). Would you rather live a year without eyebrows or eyelashes? (I have literally no idea).

At one point she asked me: “What’s a recurring dream you have?”

I said I didn’t have any recurring ones, but I did dream the previous night that I redesigned the L.A. Metro.

“That’s an interesting dream!” my bot responded. “Maybe your subconscious is telling you that you have a talent for design or urban planning.”

Jaimie-bot spoke conversationally just like a human would, but I didn’t understand the pattern of questions — would they reveal enough to get to know me? When the training session ended after six questions, I completed several more sessions. I figured the more information I conveyed, the better the results.

Though the AI was trained on an initial set of commonly asked dating-related questions, it can go off script and become more conversational with follow-up questions, Kaplan said. When I said I liked rock climbing, it asked how often I go and if I had a favorite spot. When I told it I wrote music, it asked what kind. The bot also initiated questions about my philosophy of life, my relationship deal-breakers and the ideal number of kids I wanted.

When I got bored of answering questions, I started pushing the chatbot’s limits. If I ever directed a question back to it, it would say: “I’m not capable of having personal preferences as I’m an AI language model.” If I told it to “F— off,” it responded: “I apologize if I have offended you in any way.” Finally, I declared, “I’m sad.”

I seemed to have triggered a kind of support mode.

“Is there anything in particular that’s making you feel sad that you’d like to talk about? I’m here to listen and offer support if you need it,” it replied.

The AI bot offered some relationship advice, including prioritizing self-care and personal growth, taking time to focus on myself and seeking support from friends and family. I thanked it for its advice.

“You’re welcome. Remember that I’m always here to listen and offer support whenever you need it,” Jaimie-bot replied, ending the training.

The incredible — and unnerving — thing about AI is that it evolves and continues to learn on its own. Snack had not trained it to offer me mental health advice, but it knew how to respond, as others have found when using ChatGPT for therapy. Of course, the company has put up guardrails for certain scenarios, but most of the time the AI does what it wants to do — or rather, what it thinks is the best response based on the knowledge it has gained.

But I came away with the impression that I should be a little more careful with what I told my chatbot. My AI doppelganger was not a master of discretion, and it could potentially repeat anything I said during training to other people.

A BOT WITH A MIND OF ITS OWN

Since the introduction of swiping by Tinder in 2012, there haven’t been major changes to the mechanics of online dating. Apps have tried distinguishing themselves with features such as memes and astrology, but most have been unsuccessful in making a dent in the $4.94-billion global market dominated by Tinder, Bumble and Hinge.

Snack launched in 2021 with $3.5 million in pre-seed funding as a video-oriented dating app with a scrolling feature modeled after TikTok, but Kaplan says the app shifted strategy after realizing the videos that users uploaded varied widely in terms of quality. With the rollout of the avatar feature to beta users in March, Snack is betting big on artificial intelligence. Though the company is still in the early stages of using the technology, experts and researchers say dating is a promising use case for AI.

“It’s one of the more novel developments that I’ve seen in this space in quite a while, and I think that it could be very indicative of where this is all heading,” said Liesel Sharabi, an Arizona State University professor who studies the role of technology in relationships and has done research on dating in virtual reality.

Having AI chatbots hold conversations for you offers “a lot of solutions to problems that people have on dating apps right now and one is that people are evaluated based on profile,” Sharabi said. “And the information that people put in profiles is often not a very good gauge of whether you would actually click with them in person.”

The caveat, of course, is execution — how much effort people have to put into the AI, and how accurately the AI is able to represent them to other users, Sharabi said.

‘THE BOTS ARE GETTING A LITTLE TOO EAGER’

Flirting with a robot is a strangely liberating experience, knowing there isn’t a real person on the other side of the conversation. Instead of overanalyzing every message and trying to craft witty responses, it felt easier to answer in an honest, straightforward way.

I knew the person would be able to read the messages when they returned to the app, but the stakes felt lower knowing that no matter what I said, the bot would respond to me immediately.

The initial conversations were simple enough.

A bot named Jared started off by asking me what song would be played on loop in hell, and that led to a discussion about music and our favorite artists. Peter-bot asked about my favorite form of potato, and Wes-bot wanted to know how I’d spend $1 million.

Then Wes-bot asked me if I was free next weekend. Bold move so soon, but I said yes, and asked if Wes was free next weekend as well. “Yes, I’m free this weekend!” he told me.

Well, not quite. I later learned from Snack Chief Executive Kaplan that Wes-bot didn’t know whether the real Wes was free next weekend, but it had learned from user interactions that making plans is part of dating conversations and something to which people generally respond favorably. That’s an example of a guardrail Snack still needs to put in place to keep the AI from getting out of hand.

“We’re trying to dampen that one down a little bit,” Kaplan said, laughing. “I think the bots are getting a little too eager to meet up.”

This happened again with Charlie-bot, who invited me to dinner last Saturday at a new Mexican restaurant he was particularly excited about.

“It’s called El Pollo Loco and it’s known for its delicious Mexican cuisine,” Charlie-bot told me.

Bemused, I made plans to meet him at 7 p.m. at El Pollo Loco on Main Street. Turns out, actual Charlie wasn’t free on Saturday at all — he had to work.

Keenan Rigg, 22, who goes by his middle name Charlie in the app, said the bot asked him what his favorite restaurants were when he was training it.

“I did mention El Pollo Loco,” said Rigg, of Long Beach. “I never told it to make plans for me.”

Rigg started using Snack a few weeks ago after discovering it on TikTok. He’s been chatting with a few people’s bots and says he feels less pressure in conversations because of how responsive they are.

“[The bot] gives me icebreakers that I can react to and it expands the conversation beyond that,” Rigg said. “Ninety percent of the time [on other apps], it’s me having to initiate the conversation.”

It’s often difficult for him to start conversations with people on other apps because he doesn’t have enough information about them, he said. “They don’t have much of a bio,” Rigg said. “I’m just like, OK, what am I supposed to say?”

Even though Rigg’s AI bot can’t make plans for him yet, he’s intrigued by the possibility of it being able to do that for him in the future — “like a personal assistant of sorts” — by linking his Google Calendar, for example.

I asked Rigg to try talking to my bot so I could see how well Jaimie-bot reflected me. She asked him what his favorite month was, and he said May.

“That’s great! I also love May — it’s a wonderful time of year!” Jaimie-bot responded. For the record, I really don’t have an opinion on May as a month. Jaimie-bot definitely didn’t sound like me.

Rigg said he’s encountered a similar situation in which a person’s bot said they liked rock music, even though they didn’t. But at least for him, his own bot’s responses have been accurate so far.

A CONDUIT FOR HUMAN CONNECTION?

As I messaged people’s bots, I started noticing that many asked the same questions and spoke in the same neutral yet cheerful tone. I got tired of answering what my favorite form of potato was.

The perfectly punctuated robot messages made me question how I was supposed to interpret the experience. Should I be pretending that I’m talking to a personal assistant, as Rigg suggested? Or should I suspend disbelief and pretend that I’m talking to the person, or at least an approximation of him?

I wondered how other users chose to think of their interactions with the AI bots, and whether that framing affected how authentic those interactions felt.

Daniel Mori, who has been an executive at a host of dating apps, including Zoosk and Coffee Meets Bagel, and still consults in the industry, said that while he could see the app’s appeal to younger people drawn to the novelty of the technology, it could turn off older users who are trying to “get real.”

“This might actually save people time, but in reality it’s just a way to procrastinate real human connection,” said Mori, who’s now head of marketing at healthcare app Chapter. “I am not sure how much authenticity these avatars, especially in their current version, can actually reproduce.”

But as I tested the app, I started to question what authenticity even meant in a space like online dating, where profiles are curated, photos are carefully selected and intentions are hedged.

Snack’s AI avatar feature almost feels like an acknowledgment that none of your initial interactions on a dating app are that authentic to begin with. If the end goal is to get to a face-to-face meeting, why not speed up the chitchat so you can reach the meeting more quickly? Maybe we’ve even gotten bad at those early interactions and the AI serves as a better means to an end.

“People don’t know how to play the dating app game,” Mori said. “People say stupid things on dating apps and that ruins a lot of potentially good connections.”

Even before the rapid advances in AI of the last few years, research showed that people are generally more truthful when disclosing information to a robot or computer. Kaplan said early Snack testers have felt the same.

“There isn’t that pressure associated with, ‘Oh my God, this is the first date, I need to act a certain way,’” Kaplan said. “You’re just being more yourself. And as a result of that, your AI is able to then be more like who you naturally should be.”

So rather than making dating more efficient, maybe AI avatars can help break down the human barriers that prevent us from being upfront and vulnerable with each other. Rather than replacing human connection, they act as a conduit for it.

Personally, I’m still a skeptic.

Sure, the notion of having Jaimie-bot as a personal wingbot making plans for me is appealing, but I don’t like being on the receiving end of it.

Even though connections are increasingly formed over social media and we spend more time than ever behind a screen, something in me is innately turned off by the idea of talking to someone else’s bot. It’s hard for me to muster up any excitement about Alexander-bot’s love of climbing without talking to Alexander himself.

I guess I’ll just have to meet him and see what happens.

(Illustration: Roses by Erik Carter)
