Meditation profoundly affected me, transforming debilitating depression into vibrant awareness, acceptance, and a fuller reality.
Explore, reflect, and discover your own insights through direct participatory experience, co-creating this artwork.
created by epok.tech
Immerse and engage in your deepest reality
Locus is an artwork in progress: an immersive digital meditative experience that you create and explore through embodied interaction.
Support our team to bring the project to life and own part of the artwork: an installation, a free online exhibit, and collectible editions.

Locus is an artwork in progress
Designed to engage audiences in embodied experiences of meditation, it’s being developed for exhibition as a physical installation, a free online experience, and an NFT series of editions.
Concept and aesthetic
Millions of fluid particles pass through a translucent human bust – their colours match the anatomy they flow within (skin, bone, tissue), tracing ephemeral hints of forms and colours layered within the volume’s depths.
Reflecting on this fluid motion elicits a meditation on our own subjective sensations (bodily, cognitive, emotional) – each an organic, dynamic, interconnected, emergent phenomenon.
Meditation can’t truly be known conceptually or discursively, only by direct participatory experience – instilled here by interactive real-time visuals, physics, and audio.
The bust and internal anatomy – clear glass sculptures where particles trace colours – are 3D volumes of Signed-Distance Fields.
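The idea of colouring particles by the anatomical volume they flow within can be sketched with signed-distance functions, which return a negative distance inside a shape and a positive one outside. This is an illustrative sketch only – the spheres and colours below are hypothetical placeholders, not the actual Locus anatomy data:

```javascript
// Minimal sketch: colour a particle by the innermost signed-distance
// volume containing it. Shapes and colours are illustrative placeholders.

// Signed distance to a sphere: negative inside, positive outside.
const sdSphere = ([x, y, z], [cx, cy, cz], r) =>
  Math.hypot(x - cx, y - cy, z - cz) - r;

// Nested "anatomy" volumes, listed outermost-first.
const volumes = [
  { name: 'skin', colour: [0.9, 0.7, 0.6], sd: (p) => sdSphere(p, [0, 0, 0], 1.0) },
  { name: 'tissue', colour: [0.8, 0.3, 0.3], sd: (p) => sdSphere(p, [0, 0, 0], 0.7) },
  { name: 'bone', colour: [0.95, 0.95, 0.9], sd: (p) => sdSphere(p, [0, 0, 0], 0.4) }
];

// A particle takes the colour of the innermost volume it lies within.
function colourAt(p) {
  let hit = null;
  for (const v of volumes) if (v.sd(p) < 0) hit = v;
  return hit; // null when outside every volume
}
```

A particle at the centre falls within all three nested volumes and takes the innermost (bone); one near the surface takes skin; one outside the bust takes no colour at all.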
The bust forms a hyper-realistic 3D scan of your face using Gaussian Splatting technology.
Interact by your natural motion
Your natural motions affect the flow – movement agitates turbulence, clouding outer surfaces; stillness settles smooth filaments, revealing intricacies of inner depths.
In this way the artwork reveals itself by your still mindful reflection, in contrast to busy activity – as still water’s clearer than an agitated surface, you see more deeply within.
Your body’s intuitive motion affects the artwork via its camera:
- Optical-flow tracks any motions to influence particle flows.
- AI body-tracking mirrors your viewpoint by the turn of your head and lets you slice the visuals along your palm’s surface.
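The per-pixel motion estimation that optical flow performs can be sketched on the CPU as a classic Lucas-Kanade solve – a simplified illustration of the technique, not the actual glsl-optical-flow shader code, which does the equivalent on the GPU:

```javascript
// Sketch of optical-flow estimation (Lucas-Kanade style) at one point.
// Frames are grayscale Float32Arrays of size w*h; the flow (u, v) is the
// motion that best explains the brightness change over a small window.

function flowAt(prev, next, w, h, px, py, win = 4) {
  // Accumulate the 2x2 structure tensor A and vector b over the window.
  let a11 = 0, a12 = 0, a22 = 0, b1 = 0, b2 = 0;
  const at = (f, x, y) => f[y * w + x];
  for (let y = py - win; y <= py + win; y++) {
    for (let x = px - win; x <= px + win; x++) {
      const ix = (at(prev, x + 1, y) - at(prev, x - 1, y)) / 2; // dI/dx
      const iy = (at(prev, x, y + 1) - at(prev, x, y - 1)) / 2; // dI/dy
      const it = at(next, x, y) - at(prev, x, y);               // dI/dt
      a11 += ix * ix; a12 += ix * iy; a22 += iy * iy;
      b1 -= ix * it; b2 -= iy * it;
    }
  }
  // Solve A.[u, v] = b for the motion at (px, py).
  const det = a11 * a22 - a12 * a12;
  if (Math.abs(det) < 1e-9) return [0, 0]; // untextured: no motion info
  return [(a22 * b1 - a12 * b2) / det, (a11 * b2 - a12 * b1) / det];
}
```

Feeding it two frames of a soft blob shifted one pixel to the right recovers a flow of roughly (1, 0) – and it is this recovered motion field that pushes the fluid particles.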
Peer into the Flow explores AI face-tracking interactions to intuitively mirror your viewpoint – an NFT series of interactive sketches supporting and prototyping parts of Locus. Our crowdfund backers are rewarded with a random edition.
Emergent immersive audio
Flowing particles generate audio in real-time, spatially-mapped to speakers enclosing the space – individual particle sounds culminate into an immersive collective sound; as the sound of wind moving a leaf is to that of a forest.
Dynamically driving sound by the emergent motions of millions of interacting particles is an experimental process – we explore the potential of procedural audio to create aleatoric music.
Hear how aleatoric music can be composed in an emergent process.
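How individual particle sounds might accumulate into a collective one can be sketched as follows – a deliberately toy model where speed maps to loudness and position to stereo pan; these mappings are hypothetical illustrations, not the actual Locus audio design, which spatialises across many channels:

```javascript
// Toy sketch: accumulate each particle's motion into a collective
// stereo amplitude, as a stand-in for the many-channel spatial mix.
// The speed-to-gain and position-to-pan mappings are hypothetical.

function mixParticles(particles) {
  // particles: [{ x, vx, vy }], with x in [0, 1] across the space.
  let left = 0, right = 0;
  for (const { x, vx, vy } of particles) {
    const gain = Math.hypot(vx, vy); // faster particles sound louder
    left += gain * (1 - x);          // simple linear pan law
    right += gain * x;
  }
  return [left, right];
}
```

No single particle's contribution is composed in advance – the collective sound emerges from however the fluid happens to move, which is what makes the result aleatoric.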
Motion and fluid dynamics
The hyper-realistic fluid dynamics are driven by Material Point Method simulation – a cutting-edge physics method handling multiple forms of matter interacting in one model.
Locus will adapt MPM technology for real-time fluid-dynamics on web platforms – from its origins in high-end film and game fields – and release it as Open-Source Software.
See how MPM simulates hyper-realistic physics here, here, and here.
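As an illustration of what an MPM step involves, here is a minimal particle-to-grid transfer in 1D with quadratic B-spline weights – a simplified sketch of the technique only, not gl-mpm’s actual implementation, which adds stress, affine velocity (MLS/ASFLIP) terms, and the grid-to-particle return pass:

```javascript
// Sketch of MPM's particle-to-grid (P2G) transfer in 1D: each particle
// scatters mass and momentum to its 3 nearest grid nodes using quadratic
// B-spline weights, then grid velocity is momentum over mass.

function p2g(particles, nodes, h) {
  // nodes: array of { mass, momentum, v }, with spacing h.
  for (const p of particles) {
    const base = Math.floor(p.x / h - 0.5);
    const fx = p.x / h - base; // fractional offset in [0.5, 1.5)
    // Quadratic B-spline weights for nodes base, base+1, base+2.
    const w = [
      0.5 * (1.5 - fx) ** 2,
      0.75 - (fx - 1) ** 2,
      0.5 * (fx - 0.5) ** 2
    ];
    for (let i = 0; i < 3; i++) {
      const n = nodes[base + i];
      n.mass += w[i] * p.mass;
      n.momentum += w[i] * p.mass * p.v;
    }
  }
  // Grid velocity = momentum / mass (where mass is non-zero).
  for (const n of nodes) if (n.mass > 0) n.v = n.momentum / n.mass;
}
```

The three weights sum to one, so total mass and momentum on the grid match the particles exactly – the conservation property that lets MPM couple different materials in one model.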
Exhibited in complementary formats
Physical installation: The main exhibit; a projection on a large circular screen, parting the space between observers and a participant interacting via camera, enclosed by an array of immersive-audio speakers.
Online digital: The free and open exhibit; explored freely anywhere by anyone on their own web-devices, supporting equity of access to the artwork.
Collectible editions: An NFT series spanning the artwork’s range of aesthetics and themes; driven by on-chain data and collectors’ input. Supporters and installation audiences earn rewards that interact with these NFT editions.
Embodying meditative experiences and exploring by interaction, artwork and audience affect each other in a truly reciprocal dialogue, to create the experience together.
Progress demo ~ Material Point Method
Work-in-progress developing gl-mpm, the Material Point Method fluid-dynamics technology for Locus – a cutting-edge physics simulation handling multiple forms of matter interacting in one model.
Implementing the MLS-MPM physics model with ASFLIP integration, supporting WebGL 1.0, particle-based physics and rendering, and real-time natural-motion interaction – try it with the camera enabled.
So far: 2D, not yet 3D; fluid dynamics, not yet other materials; up to AFLIP, not yet ASFLIP; basic rendering; a uniform grid, not yet hierarchical SPGrid.
Integrates gl-gpgpu to simulate all the physics and several other aspects, while supporting WebGL 1.0 for equity of access on many devices.
Integrates glsl-optical-flow for natural-motion interaction via camera to affect fluid particle flow.
Generously advised by leading academic researcher Raymond Yun Fei PhD.
Progress demo ~ aleatoric immersive audio
Exploring and prototyping the aleatoric audio process – it will be driven by the emergent fluid motions of millions of particles, each individual influence accumulated to generate a collective sound, played across many-channel audio enclosing the space and immersing the audience.
Morgan Carparelli – the sound-engineer for Locus, coming from the world of high-end film, television, and game audio – iterated this concept into this early prototype, where many spatialised audio sources are driven along parametric curves and synths are driven by overlapping inputs.
This prototype is built with Iannix, Reaper, and PureData – to be ported into JavaScript, WebAudio, and WebGL as we progress the audio process and musical qualities.
Progress demo ~ Peer into the Flow
Developed Peer into the Flow, an artwork series supporting Locus – integrating GPU particle simulation and rendering, and AI face-tracking technology.
Your touch or cursor influences the particles, while the view mirrors the turn of your head via AI face-tracking – try it with the camera enabled.
Released as a series of NFT editions on Olta and OpenSea, to raise collector funds for Locus development.
Integrates gl-gpgpu to simulate all the physics and several other aspects, while supporting WebGL 1.0 for equity of access on many devices.
Integrates AI face-tracking for natural-motion interaction via camera to control the viewpoint.
Progress demo ~ GPGPU
Developed gl-gpgpu, a General-Purpose GPU (GPGPU) computation tool – useful for computationally intensive tasks like real-time physics and particle systems, of which Locus has many.
Provides equity of access to Locus, with high-performance parallel computation of complex systems not reliant on the availability of compute shaders or WebGPU – just WebGL 1.0, available on most devices audiences may own personally, not only high-end devices in custom installations.
Developed for optimal computation on the GPU via WebGL 1.0 and GLSL, and published Open-Source.
Progress demo ~ Optical-Flow
Developed glsl-optical-flow, an Optical-Flow Computer-Vision method for natural-motion interaction via video – try it with the camera enabled.
This natural-motion interaction embraces myriad kinds of people, forms, movements, abilities, appearances, and environments – intuitively and in real-time.
Developed to run optimally on the GPU via GLSL, and published Open-Source.
An artwork engaging your conscious exploration of awareness and sensation, to release attachments, face aversions, and perceive and attune to clearest reality.
Inspiration and references
Examples of key aesthetic elements of Locus:
Fluid particles, fluid dynamics physics simulation, translucent 3D volumes.
The Locus concept art sketch blends these elements together.

Material Point Method simulation of fluid particles by Grant Kot.

Other progress
Team and partners
We’ve assembled talented people into a multi-disciplinary creative team to bring this project to life:
- Creative Consultant, Technical Producer, Designers, Photographers, Public Engagement Consultant.
- Sound-Designer and Musician collaborating on experimental generative sound.
- Steel & Form metalwork studio for set-design and fabrication.
- Innovation Creative Director support from Amplify creative agency.
- Meditation Consultant, interviewing other meditative practitioners and audiences to keep this artistic interpretation authentic.
Milestones reached
We’ve deeply researched key elements:
- Material Point Method for physics realism, in real-time web tech on GPU, with the advice of leading academic researcher Raymond Yun Fei PhD.
- Signed-Distance Fields for the anatomical bust’s 3D forms, to influence fluid particles’ colour and matter properties; and related methods.
Developed Optical-Flow, AI face-tracking, and GPU-optimised real-time simulation elements, splitting parts into smaller projects supporting this main artwork.
Completed the detailed production timeline and budget.
Exhibitions confirmed and expected
- BETA Festival in Dublin, Ireland.
- Peckham Digital in London, United Kingdom.
- Among others awaiting confirmation…