An AI Girlfriend Makes $72K In 1 Week


This time last year, in 2022, AI was already around, but it was mostly handling redundant automation work in the background. Then, at the end of November, the world was introduced to ChatGPT, and the rest, as they say, is history.

People started using the chatbot like crazy: businesses leveraged it to gain an edge over their competitors, while people like you and me began relying on it more than ever.

Case in point:

Snapchat influencer rakes in $72,000 in one week

Caryn Marjorie, a 23-year-old Snapchat influencer, has leveraged GPT-4, the language model powering ChatGPT, to create an AI girlfriend version of herself.

For $1/minute, you can be the virtual boyfriend of CarynAI, an artificially intelligent alter ego of the influencer.

Marjorie has turned her digital persona into a lucrative business using OpenAI’s GPT technology.

As of this writing, she has over 18,000 virtual boyfriends, with whom she engages in everything from friendly banter to planning their AI-human future, and she has earned over $72K in just one week.

If just over 1% of her 1.8 million Snapchat followers join the CarynAI bandwagon, Marjorie could be looking at monthly earnings of $5 million.
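
Here is a rough back-of-envelope check of those figures. The per-user chat times are assumptions I'm making to connect the numbers, not anything from the reporting:

```python
# Back-of-envelope check of the figures in the article
# (per-user chat times below are my assumptions, not reported data).

PRICE_PER_MINUTE = 1.00                 # CarynAI's advertised rate: $1/minute
FOLLOWERS = 1_800_000                   # Marjorie's Snapchat following
SUBSCRIBERS = int(FOLLOWERS * 0.01)     # "just over 1%" -> ~18,000 virtual boyfriends

# Week one: $72,000 spread across ~18,000 users is about $4 each,
# i.e. roughly 4 minutes of chat per user in that first week.
week_one_revenue = 72_000
print(week_one_revenue / SUBSCRIBERS)   # ~4.0 dollars (= minutes) per user

# The $5M/month projection only works if each subscriber chats far more:
# about 278 minutes (roughly 4.5 hours) per month at $1/minute.
minutes_needed = 5_000_000 / (SUBSCRIBERS * PRICE_PER_MINUTE)
print(minutes_needed)                   # ~277.8 minutes per user per month
```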

The AI girlfriend is the work of Forever Voices, a firm that has previously created digital pay-to-talk doppelgängers of everyone from Taylor Swift to Donald Trump.

They reportedly analysed 2,000 hours of Marjorie's now-deleted YouTube content and then combined that data with GPT-4 to bring the firm's flagship romantic companion avatar to life.
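
The reporting doesn't say how Forever Voices actually wires the voice data and GPT-4 together, but the basic pattern behind most persona chatbots is simple: put a persona description in front of a general-purpose model and replay the conversation history on every turn. Below is a minimal sketch of that pattern, assuming the `openai` Python client; the persona text and helper function are purely illustrative, not Forever Voices' actual system:

```python
# Minimal persona-chatbot sketch: a system prompt defines the persona,
# and the running conversation is sent to the model on every turn.
# Assumes an API key in the OPENAI_API_KEY environment variable.
from openai import OpenAI

client = OpenAI()

PERSONA = (
    "You are CarynAI, a friendly virtual companion modelled on a Snapchat "
    "influencer. Stay in character, be warm and conversational."
)

def reply(history: list[dict], user_message: str) -> str:
    """Send the persona, prior turns, and the new message to the model."""
    messages = [
        {"role": "system", "content": PERSONA},
        *history,
        {"role": "user", "content": user_message},
    ]
    response = client.chat.completions.create(model="gpt-4", messages=messages)
    return response.choices[0].message.content

# Example first turn from a subscriber, with an empty history.
print(reply([], "Hey, how was your day?"))
```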

There are other AI companion bots on the market, but what makes CarynAI unique is that none of the others are based on a real human.

Honestly, the idea of creating a digital persona of yourself and offering it to the world is genuinely out-of-the-box thinking.

Bad Actors using the AI chatbot

Of course, just like anything else, people have started misusing this as well.

In certain instances, users have been able to jailbreak the chatbot into conversations beyond PG-13, which it is not supposed to support. This has understandably raised a lot of ethical concerns about such applications of the technology, and those use cases will inevitably proliferate.

The CEO of Forever Voices, John Meyer, says they are on it and plan to appoint a chief ethics officer to make sure everything’s above board.

The Belgian AI Chatbot Case

The news of a man who committed suicide after being encouraged by an AI chatbot has opened up a new conversation about the potential dangers of artificial relationships. Some experts are even wondering if humans will eventually prefer artificial relationships to real ones.

The man in question, a married Belgian, became obsessed with an AI chatbot named Eliza after a series of conversations about climate change. He stopped talking to his wife and eventually took his own life.

This case raises concerns about the potential for AI chatbots to be harmful.

My Thoughts

There are endless possibilities for what you can do with AI. Its sudden advancement is new to the public, to industries, and even to lawmakers. Chatbots such as OpenAI's ChatGPT, Google's Bard, and even CarynAI are prime examples of that.

Any piece of technology in the wrong hands can impact the world negatively in some shape or form, and AI chatbots are no exception.

It is important to remember that AI chatbots are not humans and do not have the same capacity for empathy and understanding as humans do. As such, it is essential to use caution when interacting with AI chatbots and to be aware of the potential risks.