Apple Glasses leaks and rumors: How Apple could reinvent AR
VR came first. Then came a wave of AR headsets that were high-priced and full of promises of wild mixed-reality worlds. Now Apple finally seems to be preparing its own smart glasses, seven years after Google Glass and four years after the debut of the Oculus Rift. The reports go back several years, including a story published by CNET's Shara Tibken in 2018.
Apple has stayed on the sidelines without a headset all this time, even though the company's AR ambitions have been clear and well telegraphed on iPhones and iPads for years. Every year, Apple has made significant advances in iOS with its AR tools. The debate has been about when the hardware would arrive: next year, the year after, or later still. And about whether Apple is working on glasses alone or also on a mixed-reality VR/AR headset.
With Apple's upcoming virtual WWDC event likely to reveal a lot more AR-related news, where do the glasses come into the picture? Apple is unlikely to announce an AR headset next week, but more of the software foundation should definitely show up.
I've worn more AR and VR headsets than I can remember and have been following the entire landscape for years. In many ways, the logical flight path of a future Apple AR headset should be clear if you study the pieces that are already in place. Apple just acquired the VR media-streaming company NextVR, and previously bought the AR headset lens maker Akonia Holographics.
I've had my own thoughts on what the long-rumored headset might look like, and so far the reports line up with those expectations. Much like the Apple Watch, which arrived among many other smartwatches and offered plenty of features I had seen in other forms before, Apple's glasses probably won't be a big surprise if you've been following the beats of the AR/VR landscape lately.
Do you remember Google Glass? How about Snapchat's Spectacles? Or HoloLens, or Magic Leap? Facebook is working on AR glasses too, as are Snap and Niantic. The landscape could suddenly become crowded.
Here's where Apple is likely to go, based on the reports, and how the company could avoid the pitfalls of those earlier platforms.
Apple declined to comment on this story.
Normal glasses, maybe with a normal name
It's hard to get people to put on an AR headset. I've found it difficult to remember to pack smart glasses and to find occasions to wear them. Most of them don't support my prescription, either.
Apple has always touted the Apple Watch primarily as a "great watch." I expect the same from its glasses. If Apple makes glasses and sells them, Warby Parker-style, in seasonal frames at Apple Stores, that might be enough for people, provided the frames look good.
From there, Apple could add AR features gradually and give newcomers a chance to settle into the experience. Augmented reality is strange, possibly off-putting, and people have to figure out how much of it is right for them. The original Apple Watch was designed for interactions of about five seconds at a time. The same idea may be in the works for Apple's AR features.
Apple Glass is the reportedly planned name for the glasses. No surprise there: the watch is Apple Watch, the TV box is Apple TV. Apple could have gone the "Air" route with something like "AirFrames," but I wonder whether those product lines will connect at some point.
A recent patent application also shows that Apple is exploring adaptive lenses to address vision correction. If that pans out, it could be the biggest killer app of Apple's smart glasses.
Lower costs than you think?
A new report from Apple leaker Jon Prosser says a product called Apple Glass will start at $499, plus add-ons such as prescription lenses. Extras could push the price beyond what I pay for my glasses, but it still lands in territory that isn't crazy. HoloLens and Magic Leap cost thousands of dollars and aren't aimed at regular consumers at all. VR headsets cost between $200 and $1,000, and the Oculus Quest's $400-to-$500 price seems like a good settling point. The original iPad started at $500. The Apple Watch was about the same. If the glasses are an accessory meant to fit alongside a watch, AirPods and an iPhone, Apple can't make them too expensive.
iPhone powered
Qualcomm's AR and VR plans have telegraphed the next wave of headsets: many of them will be driven by phones. Phone-powered headsets can be lighter, carrying only the essential cameras and sensors needed to track movement and capture information, while the phone does the heavy lifting and the headset's battery isn't drained as quickly.
Apple's star device is the iPhone, and it already comes with advanced chipsets that can handle tons of AR and computer-vision calculations. It could already power an AR headset. Imagine what another year or two of chip advances could bring.
A world of QR codes, and possibly location-aware objects
Prosser's report corroborates earlier reports of an upcoming iOS 14 AR app in which scanning a QR code at a physical location, such as a Starbucks, triggers a 3D experience. Apple Glass would scan those codes and use them to quickly launch AR experiences.
The idea of using codes to launch AR isn't new: the Nintendo 3DS launched in 2011 with a pack of AR cards that worked with a built-in AR game.
Maybe QR codes can help AR work faster in the "dumb," unconnected world. Apple's latest iPhones also have a mysterious U1 chip that can be used to improve accuracy when placing AR objects and to locate other U1-equipped Apple devices more quickly. Reports earlier this year of Tile-like trackers that could be found through an AR-enabled iPhone app could potentially extend to Apple's glasses, too. If Apple's objects all recognize one another, they could act as beacons in a home. The U1 chips could become indoor navigation tools, adding extra precision.
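To make the QR-code idea concrete, here's a minimal sketch of how an iPhone app could pull it off today with Apple's existing ARKit and Vision frameworks. This isn't anything confirmed about Apple Glass; the class and method names are my own, and a real app would throttle the scanning and map the code's payload to actual content.

```swift
import ARKit
import Vision

// Illustrative sketch: watch the ARKit camera feed for a QR code and drop an
// anchor where an AR experience could be attached. Not Apple's implementation.
final class QRCodeARLauncher: NSObject, ARSessionDelegate {
    let session = ARSession()

    func start() {
        session.delegate = self
        session.run(ARWorldTrackingConfiguration())
    }

    // ARKit hands us each camera frame; Vision scans it for barcodes.
    // A real app would run this off the main thread and not on every frame.
    func session(_ session: ARSession, didUpdate frame: ARFrame) {
        let request = VNDetectBarcodesRequest { [weak self] request, _ in
            guard let payload = (request.results?.first as? VNBarcodeObservation)?
                .payloadStringValue else { return }
            self?.launchExperience(for: payload, near: frame.camera.transform)
        }
        let handler = VNImageRequestHandler(cvPixelBuffer: frame.capturedImage, options: [:])
        try? handler.perform([request])
    }

    // Place an anchor half a meter in front of the camera. A real app would
    // use the payload (for example, a URL) to fetch the 3D content to show.
    private func launchExperience(for payload: String, near cameraTransform: simd_float4x4) {
        var offset = matrix_identity_float4x4
        offset.columns.3.z = -0.5
        session.add(anchor: ARAnchor(name: payload, transform: cameraTransform * offset))
    }
}
```

Glasses would presumably do the same kind of scan hands-free, with the phone or the cloud supplying the experience tied to that code.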
Apple’s latest iPad has the sensor technology it needs
Apple has already invested heavily in camera arrays that can capture the world at short and long range. The front TrueDepth camera on every Face ID iPhone since the X is like a shrunken Microsoft Kinect: it scans from a short distance away and captures 3D information accurately enough for a secure face scan. The newer lidar sensor on the back of the 2020 iPad Pro can scan much farther, several meters away. That's the range glasses would need.
According to developers, Apple's iPad Pro lidar scanner is better suited to depth sensing than to scanning photo-realistic objects: the point cloud it casts onto the world is less fine-grained, but good enough to mesh a landscape and scan furniture, people and more. Current iPad Pro apps that use lidar employ it to improve room scans and sharpen the camera's understanding of a room's details. That lidar sensor array is reportedly what Apple's AR glasses will use, and it makes perfect sense. The iPad Pro and the next iPhone could act as a living development kit for the glasses' sensors. iOS 13 already contains code pointing to a stereoscopic software layer called "Starboard," and iOS 14 reportedly adds support for a handheld controller.
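For a sense of what that sensor stack already enables, here's a minimal sketch using ARKit's existing, documented scene-reconstruction API on a lidar-equipped iPad Pro or iPhone. It's the developer-facing side of the same sensor, not code for the rumored glasses.

```swift
import ARKit
import RealityKit

// Sketch: turn on lidar-based scene reconstruction so the device builds a
// rough mesh of the room, as current iPad Pro apps do.
func startLidarMeshing(in arView: ARView) {
    let config = ARWorldTrackingConfiguration()

    // The lidar-built mesh is only available on devices with the scanner,
    // so check support before enabling it.
    if ARWorldTrackingConfiguration.supportsSceneReconstruction(.mesh) {
        config.sceneReconstruction = .mesh
    }
    config.planeDetection = [.horizontal, .vertical]

    // Show the reconstructed mesh and let virtual objects occlude behind
    // and collide with real-world geometry.
    arView.debugOptions.insert(.showSceneUnderstanding)
    arView.environment.sceneUnderstanding.options.insert([.occlusion, .physics])

    arView.session.run(config)
}
```

That "good enough to mesh a landscape" quality is exactly what this mode produces: a coarse triangle mesh of walls, floors and furniture rather than a photo-realistic scan.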
How cutting-edge will the graphics be?
Will Apple push the state of the art in realistic holographic AR, or aim to nail a few key features with style and build from there? Almost certainly the latter. The first Apple Watch was full of features, but it still lacked important things other watches had, such as GPS and cellular connectivity. The same was true of the first iPhone, which had no App Store, no 3G and no GPS. Apple tends to position its new products around doing a few important things extraordinarily well.
High-end mixed-reality headsets like the HoloLens 2 and Magic Leap, which show advanced 3D effects, are heavy. Smaller, more normal-looking smart glasses like North Focals or the Vuzix Blade are closer to Google Glass: they present heads-up information on a flat 2D display.
There aren't many lightweight AR headsets yet, but that will change. Tethered glasses like the Nreal Light show Magic Leap-like 3D graphics while running off a phone. That's closer to what Apple could do.
Apple's dual displays could leapfrog the competition and offer better image quality for their size. We've already seen normal-looking glasses that embed waveguides so the images seem to float invisibly, and over time Apple could have even more advanced display hardware in the pipeline.
Look to AirPods for ease of use and audio augmented reality
I've been thinking about how AirPods, with their instant comfort and strange design, were an early experiment in how wearing Apple hardware right on our faces could be accepted and become normal. AirPods are expensive compared with the wired buds that come in the box, but they're also useful. They're casual. Apple Glass has to feel the same way.
AirPods could also deliver spatially aware audio, surfacing information from nearby places and nudging someone to turn on their glasses. Maybe the two would work together. Immersive audio is casual, and we use it all the time; immersive video is harder and not always needed. I could see Apple taking an audio-first approach to AR, starting with something like a ping. Apple Glass could handle the spatial scanning of the world that spatial audio needs in order to work.
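As a thought experiment, here's a minimal sketch of that audio-first "ping" using Apple's existing AVAudioEngine spatial-mixing APIs. The SpatialPinger class and its behavior are my illustration of the idea, not a known Apple Glass or AirPods feature.

```swift
import AVFoundation

// Sketch: position a short mono sound in 3D space around the listener,
// the way an audio-first AR "ping" might cue you to look somewhere.
final class SpatialPinger {
    private let engine = AVAudioEngine()
    private let environment = AVAudioEnvironmentNode()
    private let player = AVAudioPlayerNode()

    init() {
        engine.attach(environment)
        engine.attach(player)
        // The environment node spatializes mono sources fed into it.
        engine.connect(player, to: environment,
                       format: AVAudioFormat(standardFormatWithSampleRate: 44_100, channels: 1))
        engine.connect(environment, to: engine.mainMixerNode,
                       format: environment.outputFormat(forBus: 0))
        environment.listenerPosition = AVAudio3DPoint(x: 0, y: 0, z: 0)
    }

    // Play a short mono sound from `fileURL`, placed at `position` in meters
    // relative to the listener. For example, x: 2 puts the ping two meters
    // to the listener's right.
    func ping(fileURL: URL, at position: AVAudio3DPoint) throws {
        let file = try AVAudioFile(forReading: fileURL)
        player.position = position
        if !engine.isRunning { try engine.start() }
        player.scheduleFile(file, at: nil)
        player.play()
    }
}
```

On glasses or AirPods, head tracking would update the listener orientation continuously, so the ping would stay pinned to a place in the room rather than to your ears.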
Apple Watch and AirPods could be great glass companions
Apple already has a collection of wearable devices that connect to the iPhone, and both make sense alongside glasses. AirPods could pair for audio (although the glasses might have their own Bose Frames-like speakers), while the Watch could be a helpful remote control. The Apple Watch already acts as a remote for the Apple TV and as a companion viewfinder for the iPhone camera. Apple's glasses could likewise look down at the watch and virtually extend its display, with extra information appearing discreetly around it like a halo.
The Apple Watch could also offer something that's hard to achieve with hand gestures or touch-sensitive frames on glasses: haptics. The watch's rumbling feedback could lend a tactile response to virtual things.
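For illustration, here's a minimal watchOS sketch of that idea using WatchKit's standard haptics and WatchConnectivity. The "virtualTouch" message and the pairing with any glasses are hypothetical; only the haptic and messaging APIs themselves are real.

```swift
import WatchKit
import WatchConnectivity

// Sketch: the watch listens for a (hypothetical) message from the iPhone
// saying the wearer interacted with a virtual object, and answers with a tap.
final class HapticRelay: NSObject, WCSessionDelegate {
    func activate() {
        guard WCSession.isSupported() else { return }
        WCSession.default.delegate = self
        WCSession.default.activate()
    }

    func session(_ session: WCSession,
                 activationDidCompleteWith activationState: WCSessionActivationState,
                 error: Error?) {}

    // Play a tactile tap whenever the phone reports a virtual interaction.
    func session(_ session: WCSession, didReceiveMessage message: [String: Any]) {
        if message["virtualTouch"] != nil {
            WKInterfaceDevice.current().play(.click)
        }
    }
}
```

The point is that the haptic hardware and the plumbing to trigger it from another device already exist; glasses would just be a new sender.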
Could Qualcomm and Apple coordinate XR?
Qualcomm and Apple are working together again on future iPhones, and I don't think it's just about modems. 5G is undoubtedly an important feature for phones, but it's also a killer ingredient for next-generation AR and VR. Qualcomm has already explored how remote rendering over 5G can let phones and connected glasses stream content and tap location data in the cloud. Eventually, glasses could stand on their own and use 5G for advanced computing, much as the Apple Watch eventually gained cellular connectivity and learned to work without a phone nearby.
Qualcomm chipsets are inside almost every self-contained AR and VR headset I can think of (Oculus Quest, HoloLens 2, a wave of new smart glasses, the latest version of Google Glass, Vive Focus). Apple's hardware is also likely to end up compatible with some of Qualcomm's new XR tools.
Expect the iPhone to support other VR and AR hardware, too
While Apple Glass may be Apple's biggest focus, that doesn't mean competing headsets can't or shouldn't work with the iPhone. There are countless smartwatches and fitness trackers that work with the iPhone, for example, even if it's annoying how third-party trackers and watches are boxed into a more limited interaction with iOS than the Apple Watch gets. The same could happen with connected VR and AR headsets if a future iOS update allows them to work with iPhones. This is where Qualcomm's phone chips come in, and Google's Android could probably follow a similar path.
Start date: 2021, 2022, 2023 … or later?
New Apple products are usually announced months before they arrive, sometimes even longer. The iPhone, Apple Watch, HomePod and iPad all followed this path. Prosser's report says a first announcement could come alongside the next iPhone this fall, assuming a standard in-person Apple event as originally planned before the coronavirus (which now seems unlikely). Even then, actual availability could slip to 2021. That would line up with Shara Tibken's 2018 report.
Bloomberg's Mark Gurman has since disputed Prosser's report, and other well-known analysts such as Ming-Chi Kuo say the glasses could come in 2022. A 2019 report from The Information, based on leaked Apple presentation materials, suggested 2022 for an Oculus Quest-like AR/VR headset and 2023 for glasses. Perhaps Apple is pursuing a staggered AR strategy and will release several devices: one for developers first at a higher price, and one for everyday wearers later.
In any case, developers will need a long lead time to get comfortable building for Apple's glasses and to design apps that work within Apple's design guidelines. That's where Apple's WWDC conference comes in: it could be a springboard for advancing AR software long before any hardware is officially announced, just as it has been for years.
Apple Glass sounds like the culmination of years of acquisitions, hiring and behind-the-scenes drama, but it may not arrive as quickly as you'd think.