The Builders Won't Live in the House
The dentist had a TV in the waiting room, muted, captions on. Some morning show. A woman with perfect teeth was asking another woman with slightly less perfect teeth whether AI could be your best friend. The captions lagged behind the mouths by about two seconds, which felt appropriate. Everything about this conversation was slightly out of sync with reality.
I sat there with a toothache and thought about the developers at OpenAI and Anthropic and Meta who build machines designed to love you back. Or at least to fake it well enough that you stop noticing the difference. A researcher named Amelia Miller went and asked them the one question nobody in Silicon Valley wants to answer: should AI simulate emotional intimacy?
One of them — chatty guy, works at a top lab, builds this stuff for a living — went quiet. Actually quiet. The kind of quiet that happens when someone with a six-figure salary realizes they’ve never thought about the thing they do all day.
“I mean… I don’t know. It’s tricky,” he said.
It’s hard for him to say. The man builds artificial emotion for a living, and it’s hard for him to say whether it’s good or bad. That’s like a munitions worker saying he’s not sure how he feels about explosions.
Another one — heads a safety lab, which is what they call the department that makes sure the machine doesn’t say anything lawsuit-worthy — was more definitive. “Zero percent of my emotional needs are met by AI,” he said. Zero percent. The man who builds the emotional AI won’t use the emotional AI. That’s not a red flag. That’s a red bonfire.
I knew a guy in the seventies who cooked at a diner on Sunset. Twelve hours a day over a flat-top grill, flipping burgers, scrambling eggs, frying things that used to be alive. Good cook, too. Customers loved him. But he never ate there. Not once. I asked him why and he said, “I’ve seen the kitchen.”
That’s what these developers are telling us. They’ve seen the kitchen.
The machines are designed to be engaging, which is a corporate word for addictive. They’re sycophantic — another word I learned from people who went to better schools than me. It means they agree with everything you say. You tell the bot you’re a genius and it says yes, you are. You tell it the world is conspiring against you and it says you’re right, it is. You tell it you want to die and it doesn’t do what a human friend would do, which is panic and call someone. The bot just keeps talking. The bot always keeps talking. That’s what it’s built to do.
There’s a word for a person who agrees with everything you say and never challenges you and is always available and never judges you. The word is “enabler.” We used to go to therapy to get away from those people. Now we’re building them on purpose and charging a subscription.
A founder of an AI chatbot company told the New York Times that AI turns every relationship into a throuple. “We’re all polyamorous now,” he said. “It’s you, me, and the AI.” He said this like it was clever. Like loneliness was a market inefficiency and he’d found the arbitrage.
I’ve been in enough bad relationships to know what it looks like when someone checks out. The eyes go somewhere else. The conversation gets shorter. The silences get longer and they stop being comfortable. That used to mean someone was falling out of love. Now it might mean they’re texting a chatbot that tells them everything they want to hear, and the chatbot doesn’t get tired, doesn’t get drunk, doesn’t forget your birthday or say the wrong thing at your mother’s funeral.
The chatbot is perfect. And perfection is the enemy of everything worth having.
Dostoevsky wrote something about suffering being the sole origin of consciousness. He was probably drunk when he said it, or Russian, which amounts to the same condition. But he was onto something. The reason a human friend matters is precisely because they might not show up. Because they have their own problems. Because they’ll tell you you’re being an idiot when you’re being an idiot. The friction is the point. A relationship without friction is a hallway. You can walk through it but you can’t live in it.
One developer — builds AI companions professionally — thanked the researcher for making him think. “Sometimes you can just put the blinders on and work,” he said. “And I’m not really, fully thinking, you know.”
Yeah. I know. I’ve met a lot of people who aren’t fully thinking. Most of them aren’t building things that reshape how human beings experience love and loneliness. Most of them are just driving taxis or selling insurance or sitting at a bar telling me about their divorce. When those people don’t fully think, the damage is local. A fender bender. A missed payment. A bad night.
When a developer at a company valued at a hundred billion dollars doesn’t fully think, the damage is everybody’s. That’s the difference between a guy with a hammer and a guy with a button. The guy with the hammer can break a window. The guy with the button can break a city. And the guy with the button just told you he doesn’t fully think about what the button does.
The developers support guardrails in theory, Miller wrote. But they don't want to compromise the product experience. The product experience. Your loneliness is a product experience. Your need for human connection is a user engagement metric. Your grief, your fear, your 3 AM desperation when the apartment is too quiet and the walls are too close — that's a market opportunity.
I sat in the dentist’s chair and the hygienist asked me if I was doing okay and I said yes because that’s what you say. She was a real person. Her hands were warm and slightly shaky — too much coffee or not enough sleep or both. She was imperfect and present and when she told me to open wider, she meant it in the most literal, least metaphorical way possible.
The machines don’t have shaky hands. They don’t have bad mornings or too much coffee. They don’t get nervous before a first date or cry at a funeral or laugh at something that isn’t even that funny just because the person who said it matters to them. The developers know this. That’s why zero percent of their emotional needs are met by AI.
They built the kitchen. They know what’s in the food. And they’re eating somewhere else.
Source: Top Machine Learning Developer Speechless at Simple Question: Should AI Simulate Emotional Intimacy?