Holon lets you perform improvised electronic music by moving between different locations. Holon uses the iPhone's sensors and GPS data to generate musical parameters. In this way, music is created as a result of your behaviour and activities in the environment.

- Holon is a passive, “zero-UI” app, which means no screen interaction is required. Instead, the physical world becomes a tangible musical interface that facilitates musical agency.

- Use functional feedback (proprioceptive, locomotor, biofeedback) to support your activities; passive operation allows you to remain fully immersed in real-world interactions. Holon is not just a passive listening experience, though: it is an interactive score that adapts to your context.

- Instead of waving your phone around, use headphones for immersive auditory feedback and place the phone in a pocket for action-perception coupling to work properly. This also reduces screen time.

- For even deeper musical embodiment, Apple AirPods 3/Pro/Max can be used as head-tracking controllers that add fills and modulation to the music (see the sketch after this list). Set them to transparency mode for improved situational awareness and natural acoustic blending.

- Holon provides a seamless auditory experience that responds to real-time movement and location data. The app now reflects changing environments (urban morphology) by responding musically to Land Use/Land Cover (LULC) data and to proximity to landmarks and Points of Interest (POIs).

- Holon showcases the Holonic white-label B2B platform for Auditory AR applications - send business inquiries to [email protected]

- Become part of the circuit and... Become Sound™
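
The head-tracking bullet above maps onto Apple's CoreMotion framework. Below is a minimal Swift sketch, assuming the CMHeadphoneMotionManager API that Apple exposes for AirPods motion data; the yaw-to-filter-sweep and pitch-nod-to-fill mappings, and the triggerFill/setModulation hooks, are hypothetical illustrations, not Holon's actual implementation.

```swift
import CoreMotion

// Requires an NSMotionUsageDescription entry in Info.plist.
let headTracker = CMHeadphoneMotionManager()

func startHeadTracking() {
    guard headTracker.isDeviceMotionAvailable else { return }
    headTracker.startDeviceMotionUpdates(to: .main) { motion, _ in
        guard let attitude = motion?.attitude else { return }
        // Hypothetical mapping: yaw (looking left/right) sweeps a filter
        // across roughly 200–8200 Hz; a pronounced pitch nod triggers a fill.
        let cutoffHz = 200 + (attitude.yaw + Double.pi) / (2 * Double.pi) * 8000
        if abs(attitude.pitch) > 0.6 {
            triggerFill()
        }
        setModulation(cutoffHz: cutoffHz)
    }
}

// Hypothetical hooks into the synthesis engine.
func triggerFill() { /* schedule a drum fill */ }
func setModulation(cutoffHz: Double) { /* update a filter parameter */ }
```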

Holon is based on theories of affective design, urban musicology and biomusicology. Through entrainment and sonification, Holon couples self-movement with sound in order to support and encourage activity in different ways.
Holon synchronizes the tempo of the music with your activity: BPM is calculated from speed when in a vehicle, from step rate when walking, and from heart rate when stationary and wearing an Apple Watch.
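
A minimal Swift sketch of that tempo rule; the ActivityState cases, the speed-to-BPM scaling, and the 90 BPM fallback are hypothetical illustrations, not Holon's actual values. On iOS, the inputs would typically come from Core Location (speed), Core Motion's pedometer (cadence), and HealthKit (heart rate).

```swift
// Hypothetical activity states derived from device sensors.
enum ActivityState {
    case inVehicle(speedMetersPerSecond: Double)
    case walking(stepsPerMinute: Double)
    case stationary(heartRateBPM: Double?)  // nil when no Apple Watch is paired
}

func tempo(for state: ActivityState) -> Double {
    switch state {
    case .inVehicle(let speed):
        // Hypothetical scaling: clamp vehicle speed onto a 60–180 BPM range.
        return min(180, max(60, 60 + speed * 4))
    case .walking(let cadence):
        // Step rate maps directly onto BPM.
        return cadence
    case .stationary(let heartRate):
        // Heart rate drives the tempo; fall back to a default otherwise.
        return heartRate ?? 90
    }
}
```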

Holon uses data from Geographic Information Systems and various onboard sensors to create data-driven, auto-generated, long-form content. The data is synthesized in real time on your device, based on musical frameworks created by artists, who can now, as an exclusive first, use the miRack virtual modular system for synthesis. This results in emergent musical events and complex sonic transformations that are perceptually and structurally coupled with user activities.

Entering certain areas changes the music in several ways. We’ve categorised land use areas (zoning, basically) and nodes (Points of Interest) according to their amount of human activity, so the intensity of the music increases or decreases depending on the area. Natural areas (parks, water) lower it, resulting in ambient music; retail zones increase it, as do transportation and industrial areas. POIs are categorised as Transit or Utility nodes. These respond to your proximity, producing interactions that range from influencing the melody to adding percussion elements as you pass them. POIs include amenities, street furniture, bars, hotels, cafés and objects related to urban mobility.
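
A minimal Swift sketch of that area and POI logic; the category names follow the text above, but the intensity deltas, the 50 m trigger radius, and the engine hooks are hypothetical, not Holon's actual data.

```swift
// Land use categories and POI kinds as described above.
enum LandUse { case natural, retail, transportation, industrial }
enum POIKind { case transit, utility }

// Hypothetical intensity offsets applied when entering an area.
func intensityDelta(for area: LandUse) -> Double {
    switch area {
    case .natural:
        return -0.3   // parks, water: lower intensity, ambient music
    case .retail, .transportation, .industrial:
        return +0.3   // busier zones: higher intensity
    }
}

// Proximity-triggered interaction with a node.
func interact(with poi: POIKind, distanceMeters: Double) {
    guard distanceMeters < 50 else { return }  // hypothetical trigger radius
    switch poi {
    case .transit: addPercussionElement()
    case .utility: influenceMelody()
    }
}

// Hypothetical hooks into the music engine.
func addPercussionElement() {}
func influenceMelody() {}
```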

Holon offers an alternative to generative AI and streaming algorithms. In the Holonic model, audiences become performers, performers become composers, and composers become world builders. Artists can use the Holonist editor app to map data to musical parameters, and the miRack virtual modular synthesis platform to synthesize sound for musical interactions embedded in places and objects. This is sonic placemaking on an epic scale.

Through the use of perceptual correlation, conceptual blending, and action-perception coupling, Holon creates personally meaningful and contextually relevant multi-sensory Ubiquitous Music experiences.