Getting started with RealityKit: Touch Gestures
This article is part of a series; the main article, which links to every post in the series, is this one:
Entity Gestures (transformations)
RealityKit has some great new built-in 3D gestures for manipulating entities in your scene, called EntityGestures. Just make sure that your Entity has a CollisionComponent, which is required for both hit tests and collisions. If you’re not sure how to add a CollisionComponent to an entity, check out my previous post, where I show how to add a couple of components, including one for collision.
All you have to do is take your object, which already conforms to HasCollision and has a CollisionComponent, and call installGestures(_:for:) on your ARView.
// CustomBox is an Entity subclass
// which conforms to HasCollision and HasAnchoring
let greenBox = CustomBox(color: .green, position: [0.6, -1, -2])

// *self* is of type ARView
self.installGestures(.all, for: greenBox)

// add the box to the scene
self.scene.anchors.append(greenBox)
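For reference, here’s a minimal sketch of what a CustomBox like the one above might look like. The initializer body, the mesh, the material, and the HasModel conformance are my assumptions for illustration, not necessarily how the original class is written:

import RealityKit
import UIKit

// A sketch of an Entity subclass that can receive gestures.
// HasCollision (and a CollisionComponent) is what makes gestures work.
class CustomBox: Entity, HasModel, HasAnchoring, HasCollision {
    convenience init(color: UIColor, position: SIMD3<Float>) {
        self.init()
        // Give the entity something visible (sizes here are placeholders).
        self.model = ModelComponent(
            mesh: .generateBox(size: 0.3),
            materials: [SimpleMaterial(color: color, isMetallic: false)]
        )
        // The collision shape used for gestures and hit tests.
        self.collision = CollisionComponent(
            shapes: [.generateBox(size: [0.3, 0.3, 0.3])]
        )
        self.position = position
    }
}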
Where I have put .all, you can swap that out for any of .scale, .rotation and .translation, or a combination of those. These can be seen in more detail at ARView.EntityGestures.
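For example, since ARView.EntityGestures is an OptionSet, a combination might look something like this (a quick sketch using the greenBox from above):

// Install only the rotation and translation gestures; scaling is left out.
self.installGestures([.rotation, .translation], for: greenBox)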
That’s all it takes to get some nice, fluid gestures working in AR like these:
There are some things I’d like to see changed, such as being able to rotate an object around more than just the Y axis, and enabling the two-finger gesture for levitating an object, as we’ve seen in a WWDC presentation about AR Quick Look. But really, you get a lot for very little effort. I remember implementing similar gestures using SceneKit, which was a fun challenge, but I’m glad this is now more native. This way we can count on the performance we expect from iOS and concentrate on other challenges within AR apps.
Hit-Testing with GestureRecognizers
Gestures in RealityKit can work in a very similar way to SceneKit if that’s what you want.
Using a tap gesture as the example, I’ll assume you have a tap gesture recognizer set up on your view with a callback to a function called handleTap. If you’re not sure how to do this, I’ve found an answer on Stack Overflow that will help you get started. Alternatively, check out this repository containing a small app I’m using to test all the information I put in these posts.
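If you just want a starting point, a minimal setup might look something like the following; I’m assuming here that the code lives in an ARView subclass (hence self as the target), which matches how the snippets below use self:

// A sketch of wiring up the recognizer; call this once, for example
// right after the view is created. The function name is a placeholder.
func setupTapGesture() {
    let tapRecognizer = UITapGestureRecognizer(
        target: self, action: #selector(handleTap(_:))
    )
    self.addGestureRecognizer(tapRecognizer)
}

With that in place, the handler itself can use ARView’s entity(at:) method to find what was tapped: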
@objc func handleTap(_ sender: UITapGestureRecognizer? = nil) {
    guard let touchInView = sender?.location(in: self) else {
        return
    }
    guard let hitEntity = self.entity(at: touchInView) else {
        // no entity was hit
        return
    }
    hitEntity.randomScale()
}
// randomScale is an example that gives feedback
extension Entity {
    func randomScale() {
        var newTransform = self.transform
        newTransform.scale = .init(repeating: Float.random(in: 0.5...1.5))
        self.transform = newTransform
    }
}
The boxes above both have a CollisionComponent which, as mentioned before, is required for both hit tests and collisions; see my previous post if you need a reminder of how to add one.
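As a quick sketch of two possible ways to do that (boxEntity and modelEntity are placeholder names, and the shape size is an assumption, not necessarily what the previous post uses):

// One explicit way: build the collision shape yourself.
boxEntity.components[CollisionComponent.self] = CollisionComponent(
    shapes: [.generateBox(size: [0.3, 0.3, 0.3])]
)

// Or, for a ModelEntity, let RealityKit derive shapes from its mesh.
modelEntity.generateCollisionShapes(recursive: true)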
If you want to target specific entities for hit tests, you can still access the hitTest methods you might be used to from SceneKit, which give you a list of all the entities hit, among other details. These are listed on ARView’s main documentation page, under ‘Finding Entities at a Point in the View’.
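As a rough sketch of what that can look like, reusing the touchInView point from the tap handler above (double-check the query and mask options against the documentation for your use case):

// Returns the entities whose collision shapes lie under the touch point.
let hits = self.hitTest(touchInView, query: .all, mask: .all)
for hit in hits {
    print("Hit \(hit.entity.name) at distance \(hit.distance)")
}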
For more examples, and to see these things in action, I’ve created a repository where I’m building a small game, adding parts of RealityKit as I reach them; click the link below to see it. Also, if you have any suggestions for better ways of implementing Apple’s new framework, I encourage you to open a Pull Request on the repository below.
If you’ve enjoyed this article and want to see more in this RealityKit series, tap the 👏 button (you can do this 50 times 😱).