Over the last few months, it's been difficult not to notice the elevated buzz around Apple's VR/AR efforts. There's clearly a lot of smoke, and from that, the fire appears enormous. Despite the metaphor, this is exciting!
After 14+ years developing tappr.tv, I have a deeply vested interest in Apple VR/AR. I'm constantly thinking about the foundational technologies Apple releases in iOS, and pondering when and how arOS will arrive. Specifically:
- How will third parties create arOS apps?
- What first-party apps will Apple create?
With that, I'm interested in exploring three aspects:
- UI in 3D
- First-Party apps in VR
- First-Party apps in AR
Let’s begin.
UI in 3D
A few weeks ago, Dan Moren published an article about Apple's AR efforts, citing three technologies that are clearly part of the VR/AR foundation:
- Memoji
- SharePlay
- Spatial Audio
I think he’s right. And I too have been thinking about the foundations that Apple has been laying down, in plain sight, over the last several years:
- SceneKit
- ARKit
- RealityKit
- Vision
- Depth
- Hand Pose Recognition
- Body Pose Recognition
If you’ve invested any time into these technologies, you are well positioned to create apps in Apple’s VR/AR environments.
But there's one piece still missing from the foundation… how will UIKit function in VR/AR?
Apple has the UIFocus system, first introduced in tvOS, to handle pointing and gazing gestures. SceneKit gained UIFocus support just a few years ago. Now pair that with Hand Pose Recognition, and we can see how Apple is going to transform hand gestures into (the equivalent of) IBActions.
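To make that idea concrete, here's a minimal sketch of what gesture-as-IBAction dispatch might look like. Every name below is my own invention for illustration, not a real Apple API: a recognized hand pose gets routed to a registered closure, the way a tap is routed to an IBAction today.

```swift
// Hypothetical sketch: routing recognized hand poses to actions,
// IBAction-style. None of these types are real Apple APIs.
enum HandPose: Hashable {
    case pinch, point, openPalm
}

final class GestureDispatcher {
    private var actions: [HandPose: () -> Void] = [:]

    // Register a closure for a pose, analogous to wiring an IBAction.
    func on(_ pose: HandPose, perform action: @escaping () -> Void) {
        actions[pose] = action
    }

    // Called by the pose recognizer whenever it reports a pose.
    func recognized(_ pose: HandPose) {
        actions[pose]?()
    }
}

let dispatcher = GestureDispatcher()
var pinchCount = 0
dispatcher.on(.pinch) { pinchCount += 1 }
dispatcher.recognized(.pinch)
dispatcher.recognized(.openPalm) // no action registered, so nothing happens
print(pinchCount) // prints 1
```

In a real system, the recognizer feeding `recognized(_:)` would presumably be something like Vision's hand-pose detection, with UIFocus deciding which node in the scene receives the gesture.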
But beyond that, we don’t know how user interfaces will be created. Obviously, Apple will provide labels, controls, text entry, lists and collections, alerts, and so on. Will these be SCNNode subclasses? Will we declare these with SwiftUI?
There are no windows in 3D, per se, only space and scenes. Interesting to note that when multi-window support was added to iOS, they called it UIScene. Even in the vastness of 3D space, we need to define regions of UI. So maybe Apple will model this like the real world?
let palette = SCNPalette(with: someSwiftUIView) // hypothetical API
palette.position = SCNVector3(x, y, z)
But this raises a further question… could you build an AR app entirely in SwiftUI, which is what Apple is striving for on all its other platforms? Will there be new UI elements catered specifically to 3D space? And this doesn't even consider UI navigation and organization.
I’m excited to see how Apple addresses all these issues.
First-Party Apps in VR
Beyond providing the frameworks, Apple will need to build their own apps to define the core value propositions of these new devices.
Think back to the introduction of the iPhone and the Apple Watch. In both cases, Apple offered three use-cases for each product. That is, it focused on three first-party apps that solved specific problems.
iPhone focused on:
- iTunes
- Phone
- Safari
Apple Watch focused on:
- Watch Faces
- Messages
- Activities
So, what will the three pillars of Apple AR look like?
Well, first, I think Apple could start in VR. VR is an activity done in a single room, whereas AR wants to be out in the real world. And so, the first set of apps for VR should work in a confined space:
- Apple Arcade
- AppleTV
- FaceTime
Games are naturally 3D, and we'll probably use our iPhones as soft controllers. And just like Apple Music recently gained Spatial Audio for my entire library, I bet all my AppleTV movies gain 3D for free as well.
FaceTime is the gateway to SharePlay, so it obviously needs to be there, front and center. Bloomberg’s Mark Gurman thinks communication will be integral to the first effort, which might suggest that FaceTime in iOS 16 will handle business meetings better than FaceTime does today. We can hope.
To round out, here’s what might also appear on the Apple VR Home Screen:
- Fitness*
- App Store
- Settings
- TestFlight
- Widgets
*I've gone back and forth on whether to include Fitness. There is clearly interest in fitness on other VR platforms, but the bulk of a first-generation device might be limiting.
First-Party Apps in AR
AR is much further off, primarily because the devices will have to weigh about the same as sunglasses and display the real world, with augmentation, in crystal clarity. We're just not there yet, but when Apple AR arrives, what first-party apps will it include?
- Maps
- Fitness
- Camera
- ?
And that's part of the challenge for Apple; it's difficult to name three problems that AR could solve elegantly. Maybe, in the end, Apple AR will be to Apple VR what iPad is to iPhone: specialized and work-oriented, versus general purpose and mobile.
Or maybe Apple AR replaces the iPhone as *the* device we carry all day, every day, into every environment we venture. But then we’re back to the glasshole paradox, with cameras aimed at everyone and everything. Not sure how society changes around that.
Conclusion
Maybe even trying to guess the pillar apps of Apple VR/AR is futile. Can you remember back before Apple Watch or iPhone… what problems did we really have? In both of those cases, Apple made three bets, hoping one would take hold. And in both cases, one unexpected bet paid off. The iPhone transformed our connection to the internet, and the Apple Watch is transforming healthcare.
Which three bets will Apple make with VR/AR, and which one unexpectedly pays off? I’m so eager to see what happens!