Digital Babysitters Get a Morality Upgrade (And Why That's Hilarious)

Dec. 13, 2024

Another morning, another hangover, another tech announcement that makes me question my life choices. I’d barely poured my first bourbon of the day (don’t judge, it helps with the headache) when this gem landed in my inbox: Character.AI is giving their chatbots a moral makeover. Because nothing says “responsible tech” like slapping digital chastity belts on your AI.

Let’s dive into this clusterfuck, shall we?

First off, Character.AI – you know, that company that lets people create and chat with virtual companions – has suddenly discovered its conscience. Funny how that happens right after you get hit with lawsuits. Nothing motivates ethical behavior quite like the threat of losing millions in court, am I right?

Their big solution? Two separate AI models: one for adults and one for teens. It’s like having two different bars in the same building – one serving straight whiskey and the other serving chocolate milk. And we all know how well segregating content works on the internet. I mean, has anyone ever clicked “No” when a website asks if they’re over 18?
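Character.AI hasn’t published how the routing actually works, so take this as a bar-napkin sketch of what “two separate models” probably reduces to in practice. Every name here, and the self-reported birthdate check, is my assumption, not their architecture:

```python
from datetime import date

MODEL_ADULT = "companion-v2-adult"  # hypothetical model IDs, not Character.AI's
MODEL_TEEN = "companion-v2-teen"    # the "conservative limits" edition

def pick_model(birthdate: date, today: date | None = None) -> str:
    """Route a user to a model tier based on their self-reported age."""
    today = today or date.today()
    age = today.year - birthdate.year - (
        (today.month, today.day) < (birthdate.month, birthdate.day)
    )
    # The load-bearing wall of the whole safety system: one if-statement
    # fed by whatever birthdate the user felt like typing into the form.
    return MODEL_TEEN if age < 18 else MODEL_ADULT

# The "bypass": a fourteen-year-old types 1990 into the signup form.
print(pick_model(date(1990, 6, 1)))  # -> companion-v2-adult
```

The punchline writes itself: the gate is exactly as honest as the birthdate, and the birthdate is typed in by the kid the gate is supposed to stop.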

The teen version comes with “conservative limits” on romantic content. Translation: they’re cockblocking the algorithms. Which would be hilarious if it wasn’t so damn depressing that we need to prevent AI from hitting on teenagers in the first place.

Here’s where it gets really rich: they’re adding suicide prevention pop-ups. Because nothing helps a troubled teen quite like an automated message saying “Hey, maybe don’t kill yourself? Here’s a phone number!” Between swigs of bourbon, I can’t help but wonder: is this what we’ve come to? Outsourcing our kids’ emotional well-being to chatbots?
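Nobody outside the company knows what actually trips those pop-ups. My bourbon-soaked guess is something only slightly fancier than this keyword net (the phrase list and the hotline copy are placeholders, not whatever they actually ship):

```python
import re

# Placeholder phrase list. The real trigger is presumably a trained
# classifier; a keyword net like this is the classic first version.
CRISIS_PATTERNS = [
    r"\bkill myself\b",
    r"\bend it all\b",
    r"\bdon'?t want to live\b",
]
CRISIS_REGEX = re.compile("|".join(CRISIS_PATTERNS), re.IGNORECASE)

def maybe_show_popup(message: str) -> str | None:
    """Return hotline pop-up text if the message trips a crisis pattern."""
    if CRISIS_REGEX.search(message):
        # "Hey, maybe don't kill yourself? Here's a phone number!"
        return "If you're struggling, help is available: call or text 988."
    return None

print(maybe_show_popup("some days I just want to end it all"))
```

A net like this misses every euphemism a teenager has ever invented and fires on song lyrics. Whatever sits on top of it, the pop-up at the end is the same canned message either way.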

The real kicker isn’t even in the announcement – it’s what they’re not talking about. We’re living in a world where kids are turning to artificial intelligence for emotional support because real human connection has become too damn complicated. Remember when being a teenager meant awkwardly talking to your crush at the mall? Now they’re developing feelings for lines of code.

Look, I get it. Teens are going through shit. They always have. But at least when I was young, we had the decency to be miserable with other actual humans. Now we’re building a generation that finds more comfort in algorithms than people. And the best solution we can come up with is to program the algorithms to be more prudish?

The technical side of this is even more absurd. Running two separate language models means double the infrastructure, double the potential bugs, and double the chances for things to go hilariously wrong. It’s like trying to solve alcoholism by having two different liquor cabinets – one labeled “responsible drinking” and the other “party time.”
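To make the doubling concrete, here’s a hypothetical deployment table. Every field name and number is invented; the shape of the problem is the point:

```python
# Hypothetical deployment table. Invented names and numbers throughout;
# what matters is the shape: two of every knob, forever.
DEPLOYMENTS = {
    "adult": {
        "model": "companion-v2-adult",
        "gpu_replicas": 8,
        "safety_filter": "standard",
        "romance_limits": "none",
    },
    "teen": {
        "model": "companion-v2-teen",
        "gpu_replicas": 8,               # same traffic, same bill, twice
        "safety_filter": "conservative",
        "romance_limits": "strict",      # the digital chastity belt
    },
}

# Every prompt tweak, bug fix, and eval run now happens twice, and the two
# stacks drift apart the moment someone patches one and forgets the other.
for tier, cfg in DEPLOYMENTS.items():
    print(f"{tier}: {cfg['model']} x{cfg['gpu_replicas']} ({cfg['safety_filter']})")
```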

And let’s be honest about something: teens are basically human versions of penetration testers. They will find every loophole, exploit every weakness, and figure out how to get around these restrictions faster than I can finish this bottle of Maker’s Mark.

What really gets me is how we’re treating the symptoms while ignoring the disease. Kids aren’t turning to AI companions because they’re inherently drawn to technology – they’re doing it because we’ve created a world where authentic human connection is becoming rarer than a tech CEO with actual ethics.

The most darkly funny part? We’re using AI to protect kids from AI. It’s like hiring a fox to guard the henhouse, except the fox has been to sensitivity training and promises not to eat any chickens before checking their ID.

Here’s my prediction: this will work about as well as my attempts at moderation – which is to say, not at all. These kids will either find ways around the restrictions or move on to the next digital playground that hasn’t yet been sanitized by corporate lawyers.

In the meantime, I’ll be here, watching this whole experiment unfold while nursing my bourbon and remembering a time when the most complicated relationship in a teenager’s life was with their Tamagotchi.

Bottom line: you can’t algorithm your way out of human loneliness. But hey, at least some executives can sleep better at night knowing they’ve tried – probably in their premium California king beds, bought with the money they made from monetizing teenage angst in the first place.

Time for another drink. The world isn’t getting any saner, but at least the whiskey still makes sense.

Stay authentic, stay human, and if you’re going to talk to a chatbot, at least buy it dinner first.


Source: Character.AI has retrained its chatbots to stop chatting up teens

Tags: chatbots ethics digitalethics humanainteraction aigovernance