Apple unveils the future of AR/VR wearable tech - Apple Vision Pro

6 June 2023
Paul McGillivray

Yesterday evening, Apple gave us a clear vision of the future. Unveiling their new AR/VR headset, they've shown us the possibility of a world where our work, our movies, our documents, and our photographs are not confined to the restrictions of the screen we're working on. Where a spatial operating system can detect the shapes and contours of the space we're in and project what we need to see, when we need to see it, within the room. Audio isn't confined to speakers in front of us but projected into the 3D space around us. Apps aren't opened by clicking with a mouse, but by talking to them, or by looking at them and touching our fingertips together.

Truly immersive environments can give us focus, but when someone comes into our space, we see them, and they see our eyes even though we're wearing a headset. We can talk to people in 3D FaceTime or Zoom calls, but they don't see us with the headset on - AI watches our facial expressions and recreates them photorealistically, so that as far as others are concerned, we're sitting or standing right there in front of them.

We'll no longer need to see banks of computer monitors in offices, and we won't need a 70" black mirror on our walls to get a home cinema experience. We'll record and watch home videos in 3D with real volume and depth. And all of this without being tied by cables to our laptops or phones.

What we glimpsed last night is the world perfectly articulated in the recent sci-fi movie Swan Song, and clumsily but imaginatively hinted at in Minority Report. Gestures, voice, and 3D holographic projections are no longer the future. They're ours now.

There's a chance that you're looking at the images in this post and thinking, 'clunky headset'. I get it, but think back to the first iPod's click wheel, the first iPad's weighty boxiness, and the first iPhone's uselessness. Year by year, technology is getting smaller and smarter. The headset is a first iteration, and Tim Cook has already admitted that he had to make multiple compromises on his vision to meet this first release's deadline and budget constraints.

Project these concepts a few years out, and I'm sure that these devices will be no more intrusive than the glasses I wear to drive or watch TV now. Perhaps, if the innovations of companies like Magic Leap come to fruition, they'll just be contact lenses like in Swan Song.

The images, documents, and videos will only be visible when we need to see them - half of the time, we'll simply be able to talk to our computer and it'll give us the information we need or carry out a task for us without us needing to see anything at all. When we do need to see a document, it'll appear in the space we're already in, and we'll be able to use gestures to change its position and size. Imagine the collaborative possibilities of working together in 3D space on science or engineering projects. Think of the engagement potential of giving presentations and keynotes that bring people together, surrounded by your data or ideas.

We've had hints of this before - with Google Glass, Microsoft HoloLens, and Oculus Quest. They've been necessary milestones on this path. But yesterday, as it so often does with its product releases, Apple arrived late to the AR/VR party and made all of these previous innovations suddenly feel dated and unintuitive.

None of us wants screens in our lives. But we need them. To suggest that a big, heavy headset is a huge step towards invisible, wearable tech might seem like a stretch. But a huge block towards that goal until now has been the inability of technology to let us interact with small gestures, to see through the screens we wear, and to engage seamlessly with both the tech and the 3D environment we're in - well enough that we don't trip over the furniture, feel cut off from the world, or simply feel car sick. That's all changing now.

The headset will get progressively lighter and smaller. The price will (must) come down. The lens resolution will increase - the displays are at just over 4K now, and we only need to reach 32K for them to be indistinguishable from reality. The software we use daily will become more spatially aware and voice-controlled. I do believe that our tech will be both wearable and invisible very soon.

And it will happen in a much shorter time window than any of us imagine.

Paul McGillivray

Get in touch

If you'd like to talk to me about a project or an idea, get in touch - I'd love to hear from you.

Let's make a difference