I tried out my recent experience (Time Jump | Stijn Spanhove | 8th Wall) in the Apple Vision Pro.
I enabled the feature flags in Safari and I could open the experience in VR.
But it was not capturing my "tap to place" event. I tried pinching, but nothing was registered.
Am I missing something? Are there any examples available?
The WebXR standard was recently updated to support natural input on visionOS. This new interaction method has not yet been implemented in our initial Metaversal Deployment updates for the Apple Vision Pro, but it is on our radar!
The initial updates include support for hand tracking, but you can try implementing this new interaction method yourself as well.
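If you want to experiment in the meantime, here is a minimal sketch of how a visionOS pinch could be handled in plain WebXR. On visionOS Safari, a pinch surfaces as a transient input source whose `targetRayMode` is `"transient-pointer"`, and it fires the usual `selectstart`/`select`/`selectend` events. The `placeObject` callback below is hypothetical; this is a sketch against the standard WebXR session API, not 8th Wall's engine API.

```javascript
// Pure helper: decide whether an input source should count as "tap to place".
// "transient-pointer" is the visionOS pinch; "screen" is a touch-screen tap.
function isPlacementInput(inputSource) {
  return (
    inputSource.targetRayMode === 'transient-pointer' ||
    inputSource.targetRayMode === 'screen'
  );
}

// Wiring inside an active XRSession (browser-only).
// `placeObject` is a hypothetical callback that does the actual placement.
function attachPlacementHandler(session, placeObject) {
  session.addEventListener('select', (event) => {
    if (isPlacementInput(event.inputSource)) {
      // event.frame gives you the XRFrame to resolve the target ray pose.
      placeObject(event.inputSource, event.frame);
    }
  });
}
```

The key difference from controller input is that a transient-pointer source only exists for the duration of the pinch, so you should react inside the `select` event rather than polling `session.inputSources`.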