Apple’s RealityKit 2 allows developers to create 3D models for AR using iPhone photos – ProWellTech
At its global developer conference, Apple announced a major update to RealityKit, its suite of technologies that enable developers to start creating augmented reality (AR) experiences. Apple says that with the introduction of RealityKit 2, developers will have more visual, audio, and animation control as they work on their AR experiences. But the most notable part of the update is how Apple’s new object capture technology enables developers to create 3D models in minutes using just an iPhone.
Apple noted in its developer talk that one of the most difficult parts of creating great AR apps is producing the 3D models — a process that could take hours and cost thousands of dollars.
Apple’s new tools allow developers to use just an iPhone (or an iPad or DSLR, if they prefer) to capture a series of 2D images of an object from all angles, including the bottom.
With the Object Capture API on macOS Monterey, only a few lines of code are then required to generate the 3D model, explained Apple.
To begin, developers start a new photogrammetry session in RealityKit that points to the folder containing the captured images. They then call the process function to generate the 3D model at the desired level of detail. Object Capture lets developers generate USDZ files optimized for AR Quick Look — the system that allows 3D virtual objects to be added to apps or websites on iPhone and iPad. The 3D models can also be added to AR scenes in Reality Composer in Xcode.
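The flow described above can be sketched in a few lines of Swift using RealityKit's PhotogrammetrySession API on macOS Monterey. This is a minimal sketch, not a complete app: the input and output paths are hypothetical placeholders, and error handling is kept to a single catch.

```swift
import Foundation
import RealityKit

// Hypothetical paths — replace with a real folder of captured photos
// and a destination for the generated model.
let inputFolder = URL(fileURLWithPath: "/path/to/captured-images", isDirectory: true)
let outputURL = URL(fileURLWithPath: "/path/to/model.usdz")

do {
    // Start a photogrammetry session pointing at the image folder.
    let session = try PhotogrammetrySession(input: inputFolder)

    // Observe progress and completion on the session's async output stream.
    Task {
        for try await output in session.outputs {
            switch output {
            case .requestProgress(_, let fractionComplete):
                print("Progress: \(fractionComplete)")
            case .requestComplete(_, .modelFile(let url)):
                print("Model written to \(url)")
            case .processingComplete:
                print("Done")
            default:
                break
            }
        }
    }

    // Request a USDZ model at a reduced level of detail,
    // suitable for AR Quick Look.
    try session.process(requests: [
        .modelFile(url: outputURL, detail: .reduced)
    ])
} catch {
    print("Object Capture failed: \(error)")
}
```

Other detail levels (such as `.medium`, `.full`, and `.raw`) trade file size for fidelity; `.reduced` keeps the output small enough for web and mobile AR viewing.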
Apple said developers like Wayfair, Etsy, and others are using Object Capture to create 3D models of real-world objects – an indication that online shopping is on the verge of a major AR upgrade.
Wayfair, for example, uses Object Capture to develop tools that let its manufacturers create virtual representations of their goods. This will allow Wayfair customers to preview more products in AR than they can today.
In addition, developers like Maxon and Unity are using Object Capture to bring 3D model generation into content creation apps like Cinema 4D and Unity MARS.
Other updates in RealityKit 2 include custom shaders that give developers more control over the rendering pipeline to fine-tune the appearance of AR objects; dynamic loading for assets; the ability to build your own Entity Component System to organize the assets in your AR scene; and the ability to create player-controlled characters so users can jump, climb, and explore AR worlds in RealityKit-based games.
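As an illustration of the new Entity Component System support, a custom component and system pair in RealityKit 2 might look like the following sketch. The names here (`SpinComponent`, `SpinSystem`) are illustrative, not from Apple; the `Component` and `System` protocols are RealityKit's.

```swift
import RealityKit

// A custom component holding per-entity data: how fast this entity spins.
struct SpinComponent: Component {
    var radiansPerSecond: Float = .pi
}

// A custom system that updates every entity carrying a SpinComponent.
class SpinSystem: System {
    required init(scene: Scene) {}

    func update(context: SceneUpdateContext) {
        let query = EntityQuery(where: .has(SpinComponent.self))
        for entity in context.scene.performQuery(query) {
            guard let spin = entity.components[SpinComponent.self] else { continue }
            // Rotate around the vertical axis, scaled by frame time.
            let angle = spin.radiansPerSecond * Float(context.deltaTime)
            entity.transform.rotation *= simd_quatf(angle: angle, axis: [0, 1, 0])
        }
    }
}

// Registration, typically at app launch:
// SpinComponent.registerComponent()
// SpinSystem.registerSystem()
```

Separating data (components) from behavior (systems) this way lets one system drive any number of entities in the scene without per-entity subclassing.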
Shopify developer Mikko Haapoja tried the new technology and shared some real-world tests on Twitter, in which he shot objects with an iPhone 12 Max.
Developers who want to test it out for themselves can download Apple’s sample app and install Monterey on their Mac to try it out.