Another round of WWDC announcements has arrived, and this time RealityKit has received a noteworthy upgrade! By comparison, the ARKit updates are mostly performance-focused.
All the topics mentioned in this article will be added to my ongoing RealityKit series over the coming months:
What Happened Last Year? (WWDC20)
As a reminder, last year’s ARKit 4.0 updates were Hand Pose, the Depth API, Location Anchors, and Face Tracking for devices without FaceID. RealityKit only gained Video Textures and a debug component for 3D models, as well as access to those ARKit updates. Here’s my summary from last year:
These were welcome updates, but much lighter than hoped for, and certainly much lighter than this year’s.
What’s New at WWDC21?
Custom Shaders
A much anticipated feature has come to RealityKit: we can now use custom shaders! I haven’t yet explored this new feature enough to say what we can and cannot do compared with the shaders available in other frameworks such as SceneKit. Check back here for more content once I have fully explored it!
Otherwise, this session will cover it in depth when it airs on 9th June 2021.
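In the meantime, here’s a minimal sketch of what the new CustomMaterial type appears to offer, assuming a Metal surface shader function named "simpleSurface" exists in the app’s default Metal library (that function name is purely illustrative):

import Metal
import RealityKit

// A hedged sketch: build a CustomMaterial from a Metal surface shader.
// "simpleSurface" is a placeholder function name, not something from the SDK.
func makeCustomMaterial() throws -> CustomMaterial {
    guard let device = MTLCreateSystemDefaultDevice(),
          let library = device.makeDefaultLibrary() else {
        fatalError("Metal is not available on this device")
    }

    // Reference the shader function by name from the default library.
    let surfaceShader = CustomMaterial.SurfaceShader(named: "simpleSurface",
                                                     in: library)

    // The lighting model controls how RealityKit lights the surface.
    return try CustomMaterial(surfaceShader: surfaceShader, lightingModel: .lit)
}

The returned material can then be assigned to a ModelEntity’s materials array like any other RealityKit material.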
Procedural Mesh Generation
Hurrah! We now have procedural meshes in RealityKit. This is huge: previously our only options were generating a box, sphere, plane or text; anything else required importing a USDZ file into our RealityKit scene.
The sketch below is the simplest example of how procedural mesh generation works; for slightly more depth, check out my post on generating procedural geometries with RealityKit:
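Here is that sketch; the vertex values, entity and material are purely illustrative:

import RealityKit

// A hedged sketch: generate a one-triangle mesh procedurally.
func makeTriangleEntity() throws -> ModelEntity {
    var descriptor = MeshDescriptor(name: "triangle")

    // Three vertices (in metres), wound counter-clockwise so the face points towards +Z.
    descriptor.positions = MeshBuffers.Positions([
        SIMD3<Float>(-0.1, -0.1, 0),
        SIMD3<Float>( 0.1, -0.1, 0),
        SIMD3<Float>( 0.0,  0.1, 0)
    ])

    // One triangle, referencing the vertices above by index.
    descriptor.primitives = .triangles([0, 1, 2])

    let mesh = try MeshResource.generate(from: [descriptor])
    return ModelEntity(mesh: mesh,
                       materials: [SimpleMaterial(color: .white, isMetallic: false)])
}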
I will be posting a lot more on this topic, so keep an eye on Twitter or here on Medium for updates, as well as on Apple’s WWDC sessions throughout the week, including this one, also mentioned in the custom shaders section:
Object Capture
This SDK lets you turn photos into high-quality 3D models, similar to Autodesk’s 123D (no longer available). The demo looks impressive, but I’ll wait to see the performance for myself throughout the week.
With Object Capture, your captured object can be instantly shared in USDZ format or brought into your Xcode project.
In my experience, these object capture solutions often produce messy and unnecessarily large 3D files (too many vertices, overly complex geometry, etc.), but I expect Apple has done a good job here.
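For the curious, here’s a rough sketch of how I expect the new photogrammetry session to be driven on the Mac; imagesURL and outputURL are placeholders for a folder of photos and a USDZ destination:

import RealityKit

// A hedged sketch: turn a folder of photos into a medium-detail USDZ model.
func createModel(from imagesURL: URL, to outputURL: URL) throws {
    let session = try PhotogrammetrySession(input: imagesURL)

    // Watch the session's output stream for completion or errors.
    Task {
        for try await output in session.outputs {
            switch output {
            case .requestComplete(_, let result):
                print("Finished: \(result)")
            case .requestError(_, let error):
                print("Failed: \(error)")
            default:
                break
            }
        }
    }

    // Request a medium-detail model file at the given URL.
    try session.process(requests: [
        .modelFile(url: outputURL, detail: .medium)
    ])
}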
This session should have much more content:
Character Controller
This new feature sounds very intriguing: you can use this API to control characters running around AR worlds. There’s not much content available just yet, but keep an eye on this session tomorrow (8th June 2021):
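Going purely by the new types visible in the SDK, my guess at basic usage looks like the sketch below; treat the component’s parameters and the moveCharacter call as assumptions until the session confirms the details:

import RealityKit

// A speculative sketch: give an entity a capsule-shaped character controller
// and nudge it forward. The values here are placeholders.
func enableCharacterControl(on character: ModelEntity) {
    character.components.set(
        CharacterControllerComponent(radius: 0.2, height: 1.0)
    )

    // Per frame, drive the character with a movement vector:
    let step = SIMD3<Float>(0, 0, -0.01)   // a small step forward
    character.moveCharacter(by: step, deltaTime: 1.0 / 60.0, relativeTo: nil)
}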
Custom Entity Component Systems
I’m very excited to see what this is all about. The ECS in RealityKit is something that I’ve really come to like. I think this feature addition will really help direct the RealityKit developer ecosystem. There’s almost no information about what this really is, but I believe it will help us create better structured AR apps with RealityKit, and that can’t be a bad thing at all!
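If it follows the protocol-based API that appears in the new SDK, a custom component and system might look roughly like this sketch (SpinComponent and SpinSystem are names I’ve made up for illustration):

import RealityKit
import simd

// A hedged sketch: a component holding data, and a system that acts on
// every entity carrying that component, once per frame.
struct SpinComponent: Component {
    var speed: Float = 1.0   // radians per second
}

struct SpinSystem: System {
    // Query for every entity that has a SpinComponent.
    static let query = EntityQuery(where: .has(SpinComponent.self))

    init(scene: Scene) { }

    func update(context: SceneUpdateContext) {
        for entity in context.scene.performQuery(Self.query) {
            guard let spin = entity.components[SpinComponent.self] else { continue }
            let angle = spin.speed * Float(context.deltaTime)
            entity.transform.rotation *= simd_quatf(angle: angle, axis: [0, 1, 0])
        }
    }
}

// Register both once, e.g. at app launch:
// SpinComponent.registerComponent()
// SpinSystem.registerSystem()

The idea is that the component carries the data while the system holds the per-frame behaviour, which is exactly the kind of structure I’m hoping this brings to larger RealityKit apps.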
It’s unclear which session will speak more to this, so just be sure to check out all ARKit/RealityKit sessions or check back here for more details in the days to come.
What about ARKit?
At first glance, ARKit has much more modest updates this year.
ARKit is already leading the industry with Augmented Reality features on mobile devices, especially when compared to ARCore.
The features named on Apple’s website include Location Anchors in additional cities, and an update to ARCoachingOverlayView that “coaches” the user through anchoring themselves with Location Anchors.
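If it works the way the existing coaching goals do, hooking it up should be a matter of pointing the overlay at the new geo-tracking goal; a small sketch, assuming an ARView already running a geo-tracking session:

import ARKit
import RealityKit
import UIKit

// A small sketch: show coaching UI that guides the user while
// geo tracking (Location Anchors) localises.
func addGeoCoaching(to arView: ARView) {
    let coachingOverlay = ARCoachingOverlayView(frame: arView.bounds)
    coachingOverlay.autoresizingMask = [.flexibleWidth, .flexibleHeight]
    coachingOverlay.session = arView.session
    coachingOverlay.goal = .geoTracking   // the new goal for Location Anchors
    arView.addSubview(coachingOverlay)
}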
Summary
It’s only the first day of WWDC; the full potential of all the above features will become clear through the upcoming sessions, as well as developers like you and me digging around in the SDK. If there’s anything I’ve missed, send me a tweet!
Otherwise check back here or on my Twitter as I add more content throughout this week.