Advanced gestures
After creating multiple 3D objects in a visionOS scene, I wanted to go further than the simple drag gesture covered there. In my research, I found an Apple article on transforming RealityKit entities with gestures.
Installing Components and Extensions #
In the downloaded sample code, the RealityKitContent package includes Components and Extensions folders with code for handling gestures. I copied these across into my project's RealityKitContent folder. These have been set up to allow customisation of how we apply gestures. The first step, though, is to install the extensions:
RealityView { content in
    let floor = viewModel.generateFloor()
    content.add(floor)
    addEntities(content)
} update: { content in
    addEntities(content)
}
.installGestures()
The installGestures method sets up the gesture handling provided by those extensions on the view.
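To get a feel for what installGestures is doing, here is a minimal sketch of the idea: a View extension that attaches SwiftUI gestures targeted at entities. The exact implementation in Apple's sample is more involved (it forwards each gesture phase to the entity's GestureComponent), so treat the body below as an illustration, not the sample's actual code:

```swift
import SwiftUI
import RealityKit

extension RealityView {
    // Sketch only: attach an entity-targeted drag gesture to the view.
    // Apple's sample dispatches the gesture values to the targeted
    // entity's GestureComponent, which checks flags such as canDrag
    // before transforming the entity.
    func installGestures() -> some View {
        simultaneousGesture(
            DragGesture()
                .targetedToAnyEntity()
                .onChanged { value in
                    // Forward `value` to value.entity's GestureComponent here.
                }
        )
        // The sample adds MagnifyGesture and RotateGesture3D the same way,
        // each as a simultaneousGesture, so all three can be active at once.
    }
}
```

Using simultaneousGesture (rather than chaining plain gesture modifiers) is what lets drag, scale and rotate coexist without one gesture blocking the others.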
Using with Reality Composer Pro #
As the above files are added to the Reality Composer Pro scene, they become available as settings to enable interaction on objects and scenes defined there. However, I'd like to apply the code to objects created dynamically within my app.
Applying to entities #
In the shared view model (SharedViewModel), I added the components to handle gestures like so:
let jsonData = """
{
    "canDrag": true,
    "pivotOnDrag": true,
    "preserveOrientationOnPivotDrag": true,
    "canScale": true,
    "canRotate": true
}
""".data(using: .utf8)!

do {
    let decoder = JSONDecoder()
    let gestureComponent = try decoder.decode(GestureComponent.self, from: jsonData)
    shape.components.set(gestureComponent)
} catch {
    print("Failed to decode JSON: \(error)")
}
This defines the configuration as JSON, with each option set to true or false. A JSONDecoder, given the GestureComponent.self type, then decodes this into a gestureComponent configured with our interaction settings, which we apply to the shape with shape.components.set(gestureComponent).
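Going through JSON works because GestureComponent is Codable, but the same configuration can also be set directly, with no decoder or error handling needed. This assumes the component exposes the same properties used as keys in the JSON above (canDrag, canScale and so on), which the successful decode implies:

```swift
// Assumption: GestureComponent's stored properties match the JSON
// keys above and are mutable on a fresh instance.
var gestureComponent = GestureComponent()
gestureComponent.canDrag = true
gestureComponent.pivotOnDrag = true
gestureComponent.preserveOrientationOnPivotDrag = true
gestureComponent.canScale = true
gestureComponent.canRotate = true
shape.components.set(gestureComponent)
```

The JSON route is handy if the configuration comes from a file or network response; for values hard-coded in the view model, direct assignment is simpler and fails at compile time rather than at runtime.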
- Previous: Multiple objects
- Next: Simple hover effect