In brief
- Rivals like xAI and OpenAI are exploring flirty or adult AI, but Microsoft is drawing a hard line.
- The global sex-tech market is projected to exceed $100 billion by 2030.
- Mustafa Suleyman says AI should “serve humanity,” not simulate romance or desire.
As rival AI firms chase intimacy and realism, Microsoft is swearing off sex.
The technology giant this week drew a red line through human-machine relationships, vowing that its artificial intelligence products will never venture into erotic or romantic territory.
“We will never build sex robots,” Mustafa Suleyman, chief executive of Microsoft AI, said in an interview with MIT Technology Review. “Sad in a way that we have to be so clear about that, but that’s just not our mission as a company.”
The declaration comes as AI companion apps, erotic chatbots, and humanoid “love robots” have turned into a multibillion-dollar industry. Grand View Research valued the global sex-tech market at $42.6 billion in 2024, projecting it to top $107 billion by 2030. Analysts at IDTechEx expect the broader humanoid-robot segment—where many of these products sit—to exceed $30 billion by 2035.
A market Microsoft wants no part of
The company’s refusal to build AI with sexual or romantic functions sets it apart from competitors such as xAI and OpenAI, both exploring “adult” or emotionally responsive systems.
“In December, as we roll out age-gating more fully and as part of our 'treat adult users like adults' principle, we will allow even more, like erotica for verified adults,” OpenAI CEO Sam Altman tweeted recently.
Elon Musk’s Grok platform, once known for its flirty tone and explicit image generation, recently began censoring NSFW material after backlash last summer over nude deepfake images of Taylor Swift.
Suleyman said that restraint is part of Microsoft’s culture, shaped by decades of cautious innovation.
“The joy of being at Microsoft is that for 50 years, the company has built software to empower people, to put people first,” he said. “Sometimes that means moving slower than other startups and being more deliberate and careful. But I think that’s a feature, not a bug.”
He added that Microsoft is “trying to create an AI that fosters a meaningful relationship” without pretending to be alive.
"It’s not that it’s trying to be cold and anodyne—it cares about being fluid and kind,” he said. “It definitely has some emotional intelligence.”