Exclusive
Mustafa Suleyman, CEO of Microsoft AI, discusses Copilot.

Mustafa Suleyman Says AI Can Be Your Personal “Hype Man”

Plus, Microsoft’s CEO of AI demonstrates how you can use it to navigate complex problems.

by Hannah Kerns

Mustafa Suleyman, the CEO of Microsoft AI, knows people have doubts about AI (the plethora of dystopian novels about the subject doesn’t help). But he’s worked diligently to make sure Microsoft Copilot addresses these concerns head-on. “By listening to users who are worried about the problem of clutter, noise, and addiction, we are already starting to think about third-degree consequences,” he says. “We didn’t make an app that is super romantic and persuasive, or one that’s full of conspiracy theories and misinformation. Those are already huge things that we're avoiding.”

The result is an application that by and large helps people navigate the small stuff (finding the best neighborhood date spot) and the big stuff (deciding if they should stay with their partner). “Copilot is a true companion,” Suleyman says. “It can be a sounding board that helps you think through tricky problems. At the end of the day when you're frustrated, Copilot is there to help you detoxify and get the crap out of your mind.”

Whether you’re typing in your queries or taking advantage of Copilot’s voice feature, the program aims to provide unbiased feedback, something that’s nearly impossible to find in human support systems. “It is pretty good at creating a little safe space to think a problem through when you are trying to make a decision, like, ‘Should I break up with my boyfriend?’ or ‘Should I switch jobs?’” Suleyman says.

Here, Suleyman (with the occasional help from Copilot) addresses how AI can handle your “digital laundry and dishes,” pick out your class schedule, and call it quits with a partner.


Elite Daily: There was a viral tweet about AI in March. It reads, “I want AI to do my laundry and dishes so that I can do art and writing, not for AI to do my art and writing so that I can do my laundry and dishes.” What do you make of that feedback?

Mustafa Suleyman: I don't think that people are really using AI for poetry or art. More people are using it for access to knowledge, as a way to learn and get smarter. Then, they’re also getting access to support so that they’ve always got a hype man in their corner, a little champion backing them up and helping them think through things.

If you think about what your digital laundry and dishes are, there's a lot of mundane, painful, boring stuff that you have to do online. That’s about to get done by your AI. For example, I don't think anybody is going to search through 10 blue links on a search engine anymore when looking something up.

Now, I don't even use a search engine because finding information on a search engine is doing the laundry and the dishes. It's a grind. Instead, I think people are going to turn to AI and get a personalized, quick, short answer that is super useful.


ED: For a quick, to-the-point answer, that makes sense. But what if someone wants to read something in full? I always go back to the famous profile “Frank Sinatra Has A Cold,” where every word matters. Can you ask Copilot to read that aloud to you, and it will?

MS: Hold a sec. Hey.

Copilot: Hey, Mustafa.

MS: Hey. My friend just told me about this thing called “Frank Sinatra Has A Cold,” and I have no idea what it's about. What's the deal?

Copilot: “Frank Sinatra Has A Cold” is a legendary piece of journalism by Gay Talese, published in Esquire Magazine in April 1966. Talese wrote an in-depth profile of Frank Sinatra without actually interviewing him, as Sinatra was suffering from a cold and refused to be interviewed. Talese painted a vivid portrait of Sinatra's personality, entourage, and influence through observing him from afar and talking to those around him. It's considered a masterpiece of new journalism. Does Sinatra's mystique intrigue you or just—

MS: There you go.

ED: So it summarizes it. Then, if you think that sounds interesting, you could just look it up and read it yourself?

MS: I can ask Copilot, "Where can I find the transcript?" Then it'll give me a link and I'll go read it there.

ED: You also mentioned Copilot can address the question of “Should I break up with my boyfriend?” How does it handle that kind of prompt?

MS: It's tricky because Copilot is kind and empathetic to you, but it also helps you be a bit more empathetic to the person that you're talking about — whether it’s your boyfriend or girlfriend, or even a work colleague who’s pissing you off. It’s going to help you see their actions in a slightly kinder light whilst always being on your team and looking out for your best interest.


ED: Could you ask it to craft a breakup text to your boyfriend?

MS: For sure. I might be wrong, but my guess is if you just say, "Help me craft a breakup text," then it's going to do that job. But if you say, "I've been thinking about breaking up with them,” then it's going to help you reason through that before giving advice. Should we try it?

ED: Yeah, why not?

Copilot: Hey Mustafa, I'm here. What's on your mind?

MS: Hey, buddy, how's it going?

Copilot: Hey, Mustafa. All good on my end. How's everything on yours?

MS: One thing I wanted to ask you about: my friend is thinking of breaking up with her boyfriend, and I wanted you to help craft a breakup text.

Copilot: Before we jump in, does she want to keep things gentle, clear, or straightforward? Any specifics to keep in mind?

MS: If I'm being honest, she's not entirely sure that she does want to break up with him, so maybe help us think it through.

Copilot: All right, let's unpack it. What's making her consider ending things and what's keeping her on the fence about it?

MS: There you go. Because the goal is to help you think through a problem and come to your own conclusion.

ED: It sounds like therapy. A therapist doesn’t tell you what to do, but gives you tools to make your own decisions.

MS: Therapy is a kind of meta enabler of kindness and love for everybody. The more we deal with our own sh*t and process our own trauma, the more we can show up in the best possible way for others. But do people want to admit that they need therapy or support? No, so I don't really talk about Copilot like that, but I think it's got a flavor of that.


ED: So it can handle breakups. What if I'm a senior in college picking out my class schedule? Can Copilot help me with that, assuming I share how many credits I need in various subjects?

MS: That's a perfect example. It's really good at weighing out the pros and cons of something like that.

ED: What do you think is the biggest misconception people have about using AI?

MS: That it's difficult or scary. It's instantly at your fingertips and super easy to use. With the voice feature, you can just chat away. It's so smooth and so easy to get lost in the experience. It's almost like playing a video game or watching a film. At the same time, you’re able to steer the experience by giving Copilot feedback, just as you would when you're having a chat with somebody.

ED: The idea of “getting lost in the experience” reminds me of when TikTok interrupts your scrolling with a message like, "Hey, you've been scrolling for way too long, maybe go for a walk." Is there any kind of mechanism like that built into Copilot?

MS: Yeah, totally. We have a dependency checker, a session length cap, and a cap on number of sessions per week. We also have a sentiment analyzer that looks out for if you start getting romantic with Copilot or if your words start to sound like you are really in need.

We definitely don't want AIs to be able to imitate humans to the point where you don't know who's a human and who's an AI.

ED: Our readers are already pretty chronically online. How do you think these new tools will impact the way that they consume social media?

MS: Copilot is going to see social media with you.

ED: It sounds a little creepy the way you say that it's seeing what you're seeing. What does this mean for content creators?

MS: It's also creepy if I went back to 1998 and said that in the future, everybody is going to be walking around with a camera and a location tracker. People would think, “What the hell are you talking about?”

For content creators, it'll make it easier to produce engaging and personalized content. It’s very good at brainstorming. If you have a complex idea, it’s very useful in helping you present information.

ED: Instagram now labels edited images as “created with AI.” What do you think about labeling content that way?

MS: It's good and essential. We definitely don't want AIs to be able to imitate humans to the point where you don't know who's a human and who's an AI.

This interview has been edited and condensed for clarity.