Sound Gesture Intelligence

Dr. Greg Beller

Hand Tracking

LIDAR sensor and XR headset

The most advanced hand-tracking systems are now directly integrated into VR and XR headsets. In Unity or Unreal, you can use an XR SDK such as the WebXR framework, the XR Interaction Toolkit, or the Meta XR Interaction SDK to obtain the joints of the hand skeleton. You can then send this data via OSC to Max to build a new instrument. Alternatively, using Graham Wakefield's Max VR package [2], you can easily integrate static position and dynamic acceleration data from the controllers of a Meta Quest or HTC Vive headset.
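To give a concrete idea of how such joint data can reach Max, here is a minimal TypeScript sketch using the WebXR Hand Input API. Browsers cannot send raw UDP, so the sketch forwards joint positions over a WebSocket to a hypothetical local bridge (ws://localhost:8080) that would relay them to Max as OSC messages; the bridge, the port, and the /hand/... address scheme are assumptions for illustration, not part of the original setup.

```typescript
// Minimal WebXR hand-tracking sketch (assumes @types/webxr for typings).
// Each frame, it reads every joint pose of each tracked hand and forwards
// the positions over a WebSocket to a hypothetical OSC bridge for Max.

const socket = new WebSocket("ws://localhost:8080"); // illustrative bridge

async function startHandTracking(): Promise<void> {
  const session = await navigator.xr!.requestSession("immersive-vr", {
    optionalFeatures: ["hand-tracking"],
  });
  const refSpace = await session.requestReferenceSpace("local");

  session.requestAnimationFrame(function onFrame(_t: number, frame: XRFrame) {
    for (const source of session.inputSources) {
      if (!source.hand) continue; // a controller, not a tracked hand
      for (const [name, jointSpace] of source.hand.entries()) {
        const pose = frame.getJointPose?.(jointSpace, refSpace);
        if (!pose) continue;
        const { x, y, z } = pose.transform.position;
        // One message per joint, e.g. "/hand/left/index-finger-tip x y z";
        // the bridge would decode this JSON and emit a real OSC packet.
        socket.send(JSON.stringify({
          address: `/hand/${source.handedness}/${name}`,
          args: [x, y, z],
        }));
      }
    }
    session.requestAnimationFrame(onFrame);
  });
}
```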

In the Spatial Sampler XR, sounds are arranged in the surrounding space using a Meta Quest 2, thanks to a Max patch based on the VR package. In the same way that a sampler is an empty keyboard that is filled with sounds, Spatial Sampler XR uses hand tracking to transform the surrounding physical space into a key zone for indexing, placing, and replaying samples. With Spatial Sampler XR, the musician spreads sound around themselves through gesture, creating a spatialized and interactive sound scene. The 3D immersion greatly facilitates the organization of the sounds and increases the precision of the interaction. Several playing modes are possible: the Sound Space, the Spatial Trigger, and the Spatial Looper. The interaction modalities also vary according to the type of performance: solo, duo, or with several people. Movement links time (sound) and space, which makes Spatial Sampler XR suitable for movement artists as well, and for various applications.
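The spatial-trigger idea behind such an instrument can be pictured as a simple zone test: each recorded sample is anchored at a 3D position, and the sample replays when the tracked hand enters its zone. The following TypeScript sketch illustrates that logic under assumed names, radii, and data structures; it is not Beller's actual Max implementation.

```typescript
// Illustrative spatial-trigger sketch: samples are anchored at 3D
// positions; when the hand enters a zone's radius, the associated
// sample is retriggered. All names and thresholds are assumptions.

interface SoundZone {
  id: string;                       // sample identifier, e.g. a buffer name
  position: [number, number, number];
  radius: number;                   // trigger distance in meters
  inside: boolean;                  // debounce: fire once per entry
}

const zones: SoundZone[] = [];

// Place a sample at the current hand position (assumed default radius).
function placeSample(id: string, position: [number, number, number]): void {
  zones.push({ id, position, radius: 0.15, inside: false });
}

function distance(a: [number, number, number], b: [number, number, number]): number {
  return Math.hypot(a[0] - b[0], a[1] - b[1], a[2] - b[2]);
}

// Called every frame with the tracked hand position (e.g. index fingertip).
function updateTriggers(hand: [number, number, number], play: (id: string) => void): void {
  for (const zone of zones) {
    const nowInside = distance(hand, zone.position) < zone.radius;
    if (nowInside && !zone.inside) play(zone.id); // entered the zone: trigger
    zone.inside = nowInside;
  }
}
```

In practice, the play callback would send an OSC message to Max to start the corresponding sample; a looper mode could be built on the same test by toggling playback on each entry instead of retriggering.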

Spatial Sampler XR adds to the Sound Space the ability to visualize sounds in mixed reality.