AR gets its big moment in the spotlight | Magic Leap CEO





Apple CEO Tim Cook is likely talking about his mixed reality headset today. But Peggy Johnson, CEO of Magic Leap, upstaged him last week during a talk at Augmented World Expo, a trade show attended by thousands of XR industry followers.

Johnson did something that Cook is not likely to do. Before a crowd of hundreds, she was very transparent. She showed off an MRI of her torso, as viewed via a Magic Leap 2 augmented reality headset. It showed with precision what her liver looked like — her CTO Daniel Diez said it was looking good — as a way of showing the kind of advances that XR headsets can bring to healthcare.

Johnson believes that the precision you can get from enterprise technology that overlays AR animated images on the real world is something that consumer devices are very unlikely to do. And she thinks that will be an edge that her company will have for a while.

After the keynote last week, I spoke with Johnson and Diez. Johnson said she welcomed the attention that Apple would bring to the XR market. In a conversation with Nvidia healthcare vice president Kimberly Powell, she said Magic Leap will focus on applications that can make healthcare much more efficient.

“Magic Leap has been doing this for over a decade,” she said, regarding Apple. “It feels good to have more entrants. It will help grow the ecosystem. It gets the developers in the audience excited about this medium and the programming in it. And we’re excited about the coming announcement. Apple doesn’t typically jump into a market too early. So that’s very validating if they’re coming into the market.”

Here’s an edited transcript of our interview.

Peggy Johnson is CEO of Magic Leap. She spoke at AWE USA last week.

VentureBeat: I guess the most striking thing about the keynote was that you put your insides up on the screen.

Peggy Johnson: I knew people were going to write about that. We’re so transparent at Magic Leap now! A completely different company.

The problem is, any time you show that, you need someone’s permission. I’m like, “Just take mine! Show whatever you want!” But I’ve only ever seen it in my doctor’s office, black and white on a TV screen. I hadn’t seen it until yesterday when you showed me, all lit up in color. All my organs. “Did I say you could…?” But that’s out there now in the wild. It’s in our booth, too, in our meeting room. You can go in and slice away to the bone, or go back and add all the organs.

VentureBeat: Was there something about that you didn’t want to know? Or did you really want to see it all?

Johnson: That’s a good question. But first of all, I can’t really read it very well. Daniel actually went to med school. He says, “Yeah, there’s your liver.” But it’s going to be useful. Nvidia’s Holoscan stuff, Clara, is amazing. It really is. The power that it offers our developers, because we’ve integrated with it, is over the top. It’s great.

VentureBeat: It seems to bring it home to more people, too. You can look at this and understand it. Doctors are going to understand even more because they know what to look for, but regular people can look at these things now.

Johnson: One of the companies we worked with on Magic Leap One, BrainLab, they said that not only did doctors use it for pre-surgical planning, but they used it to show the patient what they were going to do. It comes to life when you can see it in 3D, compared to a 2D screen.

Magic Leap 2 is a sophisticated AR headset.

VentureBeat: For that, the detail, did you take a CT scan and that gets converted into something Magic Leap can display?

Johnson: Correct. I gave them my CT scan file and, using the Nvidia software, Clara, they volumized it, turned it into color, and put it in front of a thousand people’s eyes.

VentureBeat: Is that the stage you’re at? Is it being used this way in hospitals?

Johnson: Yes. Now that we’ve made this integration and we’re working closely with Nvidia on this, that whole package can be shared with developers. That’s why I wanted to talk about it, because we had an audience of developers there. Some lead tech companies. They can see the capability and turn it into solutions.

VentureBeat: You’re going to be immortalized now.

Johnson: I am! So far that’s the thing people have talked about. The most transparent tech CEO in the world, literally!

VentureBeat: That level of visualization, is it just self-evident to a lot of people that this kind of thing is valuable?

Daniel Diez: People are using that now. 3D visualization in health care is massive. It’s all about surgical guidance and pre-surgical planning. SentiAR is a great example of that. I don’t know if you’ve seen that, the cardiac catheterization. Typically when you catheterize a patient, it goes through the femoral vein in the leg. You snake the catheter up from there. The doctor is looking at a 2D screen, so they have to navigate a 3D environment with only 2D guidance.

With SentiAR, you slip on Magic Leap and put the catheter in. It has a radio frequency tip. It re-creates the vessel in real time right in front of you. Now you have a 3D model of what you’re doing in real time. The accuracy and safety of that goes way up. Then, as you go up to the heart, you get the 3D model of the heart, and you can do ablation, or you can do a stent into the heart. You get all the readouts and stuff. It’s game-changing when it comes to the level of accuracy and precision.

The other one is knee surgery. That’s going to be really interesting. You can overlay the surgical plan and the device can tell you the ideal location to cut for a knee replacement. Right now, for robotic knee replacement surgery, the robot is a million dollars. Who gets those? Just big university hospitals, urban medical centers. Ironically, the doctors who get the robot are in a way the ones who need it the least, because they’re so practiced at the surgery. The ones who don’t get it are the ones in rural hospitals or community hospitals. With this application, at a fraction of the cost, those doctors can have the same level of accuracy. The distribution of advanced medical methods goes way up.

This rotating image shows Peggy Johnson’s innards.

VentureBeat: Do you feel like this kind of space is defensible against whatever other competition is here? There’s Meta and mixed reality in their VR headset, and then whatever Apple is going to do.

Johnson: As I said in my talk, passthrough is good for things like screen replacement and entertainment, that sort of thing. But when you have this highly precise incision line on a human patient, it needs to be accurate. Augmented reality, with its ability to see the physical patient in front of you and the digital overlay on top, offers a different level of precision, I would say. The color uniformity is good. Text legibility. Any patient vitals can also be blown up to whatever size is comfortable for the surgeon.

I was talking to a surgeon downstairs, and he said that one of the biggest problems is that there’s a screen you look at. It could be anywhere in the operating room. But you have to crane your head to look at it and that takes your eyes away from the patient. Now you can put those vitals right in front of the doctor’s eyes. I hadn’t thought about that. I’m not a surgeon. But that’s helpful to them. And the distortion that passthrough creates, the latency, it’s just not good. I don’t know if we did this trick with you, but we put a passthrough device on you and then throw a ball for you to catch. You’ll drop it. There’s just enough latency that you can’t catch the ball.

VentureBeat: Do you see this space as pushing deeper into the higher end of the market?

Johnson: I think it starts there. It opens up a lot of use cases, to have that high precision. And then as we see how the demand goes, we can certainly do different things with products going forward. Right now we’re just taking in all the signal and trying to understand where the highest engagement is. So far it’s been in health care, industrial settings, and public sector training, things like that. We’re focused there right now. But with the precision we have, we believe it will open up a lot of new use cases. And then we might refine the product more for a specific market.

VentureBeat: As far as other milestones in the bigger picture, have you hit any that you’d like to mention? Unit sales, things like that.

Johnson: We haven’t talked about unit sales publicly. I like the momentum I see. The numbers keep growing. We don’t just see people trying it a few times and dropping off, so that feels good. The biggest thing is continuing to build the ecosystem. There has to be good content. There has to be software that delivers viable return on investment for the headset. We’re seeing that in these areas. I don’t want to broaden the areas until we’ve really nailed the ones we’re focused on. That’s why we’re trying to stay very focused and keep our attention there for now. We’re listening very carefully to the signal and making real-time changes to the software.

VentureBeat: As far as different parts of the space that you might want to push into more–are you signaling where the road map is going to be?

Johnson: I don’t want to get ahead of things, which is why we’ve put the telemetry in. We’re really trying to understand how people are using it. I think an interesting new area will be location-based experiences. As an example, one that surprised us is a company called Tin Drum. They have a facility in Brooklyn, a big warehouse, where they’ve done a whole architectural display inside the space. People come in, put the headset on, and go through this architectural experience. It surprised us. When you put an SDK out there, people just start developing on it. They developed a really cool application and experience for people in that discipline of architecture. We’ll probably start to see more of that. That may be our first foray into the consumer side of things, understanding what the demand is there, what kinds of experiences can be built in spaces like that.

Magic Leap CEO Peggy Johnson (left) speaks with Kimberly Powell of Nvidia.

VentureBeat: Have you said anything regarding the story that ran about working with Meta?

Johnson: We haven’t, although we have talked about the fact that we have several partnerships in that space. We’ve been in the business for a long time. We manufacture in south Florida. We know how to make these eyepieces. The assembly around it, the projectors, the cameras. That’s attracted attention. Our ability to do that at a very high yield rate has attracted the attention of the industry. We have signed some partnerships.

VentureBeat: Manufacturing agreements, basically?

Johnson: Yes, IP licensing and component manufacturing.

VentureBeat: I assume that’s very helpful for the bottom line.

Johnson: Yeah, volume is good. We can handle the volume at the factory. I love that it’s right there in south Florida, because things get fixed just like that. The engineers sit upstairs and the factory is downstairs. They just run downstairs if there’s any issue. It’s taken care of. We don’t have to put people on planes. That’s always helpful.

VentureBeat: How many people are you at now?

Johnson: We’re just over 1,000. We haven’t grown that much. What we’ve done is focus. When we had areas of need, we made sure that we were very focused in those areas. We’ve been able to keep up with the demands of the business.

VentureBeat: Were there other big points you wanted to get across to this crowd through the keynote?

Johnson: A lot of it was about–we just wanted to say that AR is here. We keep fighting headlines that talk about how AR is coming in several years, some of them even from the big tech companies. No, it’s here today. We’re a small company, but we need to make our voice louder. That’s why we’re on stage talking about it.

Diez: The technology in its current state is capable of delivering value for enterprise right now. It’s not a future proposition. It’s ironic that we keep hearing these messages about, “Someday, in the future, in the metaverse…” No, you can do this right now with Magic Leap. That’s the message we want people to understand.

Johnson: We have lots of partners downstairs using Magic Leap for construction, health care applications, all sorts of things. That was great to see for me, to see it all in one place like that. It shows what we have here now, as well as the potential coming in.

VentureBeat: Some of the other headsets have veered into the enterprise space after first trying consumer. They may be doing all right in consumer, but they’ve gravitated more toward enterprise. Is that another class of competition? Or do you think they don’t come close to duplicating what you do?

Johnson: Well, first of all, it’s validating that they are veering over that way. I believe, we believe, that the entry point is the enterprise. With the current state of the technology, the entry point is enterprise. But because we don’t fight the latency or the distortion of passthrough, the fact that we have very accurate placement of digital content, it opens up a certain set of applications that you can only do with that level of precision. It’s highly precise. We call it behaviorally aware digital content. It’s able to produce the solutions these companies are looking for. As I said on stage, there are going to be things like screen replacement and entertainment that the other devices do just fine. But that’s where our focus is.

Magic Leap is focusing on professional training in the enterprise.

VentureBeat: Do you have, say, a top five list of applications?

Johnson: We do. It’s interesting, because however wide the funnel of companies we talk to gets, it all comes back to the same handful of things. They want remote assistance, the ability to call in someone from across the world to help digitally annotate the physical world. It’s almost any kind of training. Even me seeing my CT scan in 2D versus 3D. That’s a huge difference. There’s something cognitively much more comfortable about seeing things in 3D. And then any kind of 3D visualization. Data sets that, in the past, have been presented in a 3D form on a 2D screen, anything like that, we can volumize it and put it in front of your eyes. Some of that includes the construction applications that Maret Thatcher was talking about on stage today. With her company Argyle, you can actually see it in 3D. That’s very helpful.

VentureBeat: Do you think the generative AI craze is going to help you at all?

Johnson: Oh, totally. We already had a lot of AI in the device. Computer vision, machine learning. But we’ve just started to pull in these new AI systems. The engineers are just playing around with it right now, but it’s amazing what you can do. The fact that you can give a speech command–we did this silly one the other day. Make a chicken and a pig fight on this table. A chicken and a pig appeared and started fighting. It’s crazy. With volumetric models.

Diez: There’s already a lot of AI built into the platform. As you say, the computer vision, all the ability to create digital twins, that’s all based on AI. The interesting thing is that devices like Magic Leap, with the sensor array and the camera array it has, are poised to become probably the strongest input engines for AI. Chat is very interesting, but it’s text-based. It’s limited in that way. This is about contextual ingestion of 3D data. You combine that with Nvidia’s remote rendering capability and you can spit that back right out, put the output into the world. It’s going to be incredible. It becomes something you talk about more like an autonomous vehicle than a mobile device.

Johnson: It really is. You can think of AR as a really interesting AI input. It has the ability to be contextually aware of the surroundings. One example we’ve started discussing: you’re in a factory, and instead of having a remote person, you have a remote AI, if you will, that’s connected and can see what you’re seeing. It’s taking in everything as you move around. As a human you might be focusing only on the things your eyes gaze at, but this AI system can take in everything. It can say to the factory owner, “Hey, that tool over there shouldn’t be there. It should be back in this case, because it’s dangerous to leave out there.” It can prompt you in a way that provides yet another level of usefulness in a situation like that.

VentureBeat: Where do you think we are by the end of the year? It depends on what happens on Monday, but–

Johnson: We welcome more attention on this industry, more entrants into this industry. Magic Leap has been doing this for more than a decade. It feels good to have more entrants to help grow the ecosystem. It gets the developers excited about this medium of programming. We’re excited about the coming announcements. Also, Apple doesn’t typically jump into a market too early. That’s also very validating. If they’re coming into the market, that will help everyone. Everybody who’s at this show, everyone who’s in AR and VR, for all of us it’ll be a good thing.

Who is that masked man with the Magic Leap 2?

VentureBeat: Lots of people who forgot about it or wrote it off will be back.

Johnson: I’ve been around so many technologies that had a quick bump, died away, and then came back in a more solid way. You have that hype cycle, and then you get the actual reality. I guess I should use another word than “reality.” But I remember years ago at Qualcomm when we were making mobile phones. The biggest thing our minds could come up with was that they’d be a replacement for pay phones, or you could make a phone call from your car. That was as big as we thought mobile phones were going to be. We thought, “We might sell a million of these things one day!” Because really, how many pay phones are there?

Like any technology, you have to have the usefulness. As that starts to spread, there are other innovations that will come in on top of that and really start to drive volume.

VentureBeat: Do you think XR becomes the preferred way of describing things? “Metaverse” seems to have lost its brand value.

Johnson: I would like that, yeah. We’ve always felt like–the metaverse sounds like another world. Really, this is about being in the physical world and just augmenting that world, rather than jumping into another world where you’re fully occluded. I like that version of it, the XR world.


