Bodily presence in the digital realm expands without clear limits, and awareness of its implications is key. What data is generated when body tracking is used? Where is it stored? Who owns it? Hand Sounds is an audio-visual AR experience that raises awareness of the user's data profile through playful, immersive generative interactions. Bodily gestures are performed and turned into unique audio and visual assets. Machine learning assists users in calibrating the experience, beginning with custom gestures. This enables a personalised journey with localised data storage: the end result is a user-owned data profile generated by the user's unique audio-visual session.
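To illustrate the idea of gesture-driven sound with locally owned data, here is a minimal sketch, assuming a web deployment with the Web Audio API and some hand-tracking source supplying normalised landmarks per frame. The tracking layer, the pinch feature, and names such as `onHandFrame` and `handSoundsProfile` are illustrative assumptions, not the project's actual implementation.

```typescript
// Sketch: map a hand-gesture feature to sound and keep the resulting
// session data in local browser storage, so the profile never leaves
// the device. Landmark input is assumed to come from any hand-tracking
// source (e.g. 21 normalised {x, y, z} points per frame).

interface Landmark { x: number; y: number; z: number; }

interface SessionEvent {
  timestamp: number;   // ms since session start
  pinch: number;       // gesture feature in [0, 1]
  frequencyHz: number; // tone the gesture produced
}

const ctx = new AudioContext();
const osc = ctx.createOscillator();
const gain = ctx.createGain();
osc.connect(gain).connect(ctx.destination);
gain.gain.value = 0;
osc.start();

const sessionStart = performance.now();
const session: SessionEvent[] = [];

// Distance between thumb tip (index 4) and index fingertip (index 8)
// as a simple "pinch" feature, clamped to [0, 1].
function pinchAmount(hand: Landmark[]): number {
  const a = hand[4], b = hand[8];
  const d = Math.hypot(a.x - b.x, a.y - b.y, a.z - b.z);
  return Math.min(1, Math.max(0, 1 - d * 5));
}

// Called once per tracked frame: turn the gesture into sound and log it.
export function onHandFrame(hand: Landmark[]): void {
  const pinch = pinchAmount(hand);
  const frequencyHz = 220 + pinch * 660; // map gesture to 220-880 Hz
  osc.frequency.setTargetAtTime(frequencyHz, ctx.currentTime, 0.05);
  gain.gain.setTargetAtTime(pinch * 0.5, ctx.currentTime, 0.05);
  session.push({
    timestamp: performance.now() - sessionStart,
    pinch,
    frequencyHz,
  });
}

// Persist the profile in browser storage only: the user owns the data
// and can export or delete it; nothing is sent to a server.
export function saveLocalProfile(): void {
  localStorage.setItem("handSoundsProfile", JSON.stringify(session));
}
```

The design point of the sketch is the storage boundary: everything derived from the body stays in `localStorage` on the user's device, which is one plausible reading of the "localised data storage" and user-owned data profile described above.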