Meditation profoundly affected me; it carried me through debilitating depression into vibrant awareness, acceptance, and a fuller reality.
Explore, reflect, and discover your own insights through direct participatory experience, co-creating this artwork.
created by epok.tech
Immerse and engage in your deepest reality
Locus is an artwork in progress: an immersive digital meditative experience that you create and explore through embodied interaction.
Support our team to bring the project to life and own part of the artwork: an installation, a free online exhibit, and collectible editions.
Locus is an artwork in progress
Designed to engage audiences in embodied experiences of meditation, it’s being developed for exhibition as a physical installation, a free online experience, and an NFT series of editions.
Concept and aesthetic
Millions of fluid particles pass through a translucent human bust – their colours match the anatomy they flow within (skin, bone, tissue), tracing ephemeral hints of forms and colours layered within the volume’s depths.
Reflecting on this fluid motion elicits a meditation on our own subjective sensations (bodily, cognitive, emotional) – each an organic, dynamic, interconnected, emergent phenomenon.
Meditation can’t truly be known conceptually or discursively, only by direct participatory experience – instilled here by interactive real-time visuals, physics, and audio.
The bust and internal anatomy – clear glass sculptures where particles trace colours – are 3D volumes of Signed-Distance Fields.
The bust forms a hyper-realistic 3D scan of your face using Gaussian Splatting technology.
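As a rough illustration of how an SDF volume can colour the particles passing through it, here’s a minimal sketch with nested sphere SDFs standing in for the anatomical layers – the layer radii, colours, and sphere shapes are placeholders, not the artwork’s actual volumes:

```js
// Minimal sketch (not the project's code): sample nested signed-distance
// fields to pick a particle's colour by the deepest anatomical layer it's in.
// Layer radii and colours here are illustrative placeholders.

// Signed distance to a sphere centred at `c` with radius `r`;
// negative inside, positive outside.
const sphereSDF = (p, c, r) =>
  Math.hypot(p[0] - c[0], p[1] - c[1], p[2] - c[2]) - r;

// Nested "anatomy" volumes, outermost first: skin, tissue, bone.
const layers = [
  { name: 'skin', radius: 1.0, colour: [0.9, 0.7, 0.6] },
  { name: 'tissue', radius: 0.7, colour: [0.8, 0.3, 0.3] },
  { name: 'bone', radius: 0.4, colour: [0.95, 0.95, 0.9] }
];

// Colour a particle by the innermost layer whose SDF is negative at its position.
function colourAt(position) {
  let colour = [0, 0, 0]; // Outside all layers: no colour traced.
  for (const layer of layers) {
    if (sphereSDF(position, [0, 0, 0], layer.radius) < 0) colour = layer.colour;
  }
  return colour;
}

console.log(colourAt([0.0, 0.0, 0.8])); // Within skin only.
console.log(colourAt([0.0, 0.0, 0.1])); // Deep within bone.
```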
Interact by your natural motion
Your natural motions affect the flow – movement agitates turbulence, clouding outer surfaces; stillness settles smooth filaments, revealing intricacies of inner depths.
In this way the artwork reveals itself through your still, mindful reflection, in contrast to busy activity – as still water is clearer than an agitated surface, you see more deeply within.
Your body’s intuitive motion affects the artwork via its camera:
- Optical-flow tracks any motion to influence the particle flows.
- AI body-tracking mirrors your viewpoint by the turn of your head, and lets you slice the visuals along your palm’s surface.
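A minimal sketch of how such motion input might steer the fluid’s agitation – the flow field, easing rate, and turbulence parameter here are illustrative assumptions, not the project’s implementation:

```js
// Minimal sketch (illustrative, not Locus code): map camera motion to the
// fluid's agitation. A flow field (e.g. from optical-flow) gives per-cell
// motion vectors; their average magnitude drives turbulence up with movement
// and lets it settle smoothly during stillness.

// `flowField` is an array of [u, v] motion vectors, one per tracked cell.
function averageMotion(flowField) {
  const total = flowField.reduce((sum, [u, v]) => sum + Math.hypot(u, v), 0);
  return flowField.length ? total / flowField.length : 0;
}

// Ease turbulence toward the observed motion: it rises with movement and
// settles toward stillness at `settleRate` when the viewer is still.
function updateTurbulence(turbulence, flowField, settleRate = 0.05) {
  const target = averageMotion(flowField);
  return turbulence + (target - turbulence) * settleRate;
}

// Example: a burst of movement, then stillness.
let turbulence = 0;
turbulence = updateTurbulence(turbulence, [[2, 1], [1.5, 0.5]]); // Agitated.
turbulence = updateTurbulence(turbulence, [[0, 0], [0, 0]]);     // Settling.
console.log(turbulence);
```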
Peer into the Flow explores AI face-tracking interactions to intuitively mirror your viewpoint – an NFT series of interactive sketches supporting and prototyping parts of Locus; our crowdfund backers are rewarded with a random edition.
Emergent immersive audio
Flowing particles generate audio in real-time, spatially-mapped to speakers enclosing the space – individual particle sounds accumulate into an immersive collective sound, as the sound of wind moving a leaf is to that of a forest.
Dynamically driving sound from the emergent motions of millions of interacting particles is an experimental process – we’re exploring the potential of procedural audio to create aleatoric music.
Hear how aleatoric music can be composed in an emergent process.
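To illustrate the idea of many faint per-particle sounds summing into one collective texture, here’s a minimal Web Audio sketch – the voice count, pitch mapping, and gain scaling are assumptions, not the Locus audio engine:

```js
// Minimal sketch (an assumption of the approach, not the Locus audio engine):
// each simulated particle drives one quiet voice; many voices sum into a
// collective sound, panned by particle position and scaled by particle speed.

const context = new AudioContext();
const master = context.createGain();
master.gain.value = 0.5;
master.connect(context.destination);

// One voice per particle: a quiet oscillator with its own gain and pan.
function createVoice() {
  const oscillator = context.createOscillator();
  const gain = context.createGain();
  const pan = context.createStereoPanner();
  gain.gain.value = 0;
  oscillator.connect(gain).connect(pan).connect(master);
  oscillator.start();
  return { oscillator, gain, pan };
}

const voices = Array.from({ length: 64 }, createVoice);

// Each frame, map particle state to its voice: position to pitch and pan,
// speed to loudness - individually faint, collectively an emergent texture.
// Assumes at most 64 particles here.
function updateVoices(particles) {
  particles.forEach((particle, i) => {
    const { oscillator, gain, pan } = voices[i];
    const speed = Math.hypot(particle.vx, particle.vy);
    oscillator.frequency.value = 200 + particle.y * 400;
    pan.pan.value = Math.max(-1, Math.min(1, particle.x));
    gain.gain.value = Math.min(0.02, speed * 0.01);
  });
}
```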
Motion and fluid dynamics
The hyper-realistic fluid dynamics are driven by Material Point Method simulation – a cutting-edge physics method handling multiple forms of matter interacting in one model.
Locus will adapt MPM technology for real-time fluid-dynamics on web-platforms – from its origins in high-end film and game fields – and release it as Open-Source Software.
See how MPM simulates hyper-realistic physics here, here, and here.
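For a sense of the core MPM step, here’s a deliberately simplified 1D sketch of the particle-to-grid transfer, using linear weights rather than the B-spline weights and stress terms of a full simulation:

```js
// Minimal sketch (a simplification, not the gl-mpm implementation): the core
// MPM step scatters each particle's mass and momentum onto nearby grid nodes,
// shown here in 1D with linear (tent) interpolation weights.

const gridSize = 8;
const grid = Array.from({ length: gridSize }, () => ({ mass: 0, momentum: 0 }));

const particles = [
  { x: 2.3, velocity: 1.0, mass: 1 },
  { x: 2.8, velocity: -0.5, mass: 1 }
];

// Particle-to-grid transfer: each particle spreads mass and momentum to its
// two neighbouring nodes, weighted by how close it lies to each.
for (const particle of particles) {
  const left = Math.floor(particle.x);
  const t = particle.x - left;
  const weights = [[left, 1 - t], [left + 1, t]];
  for (const [node, weight] of weights) {
    grid[node].mass += weight * particle.mass;
    grid[node].momentum += weight * particle.mass * particle.velocity;
  }
}

// Grid velocities follow from momentum / mass; forces and the grid-to-particle
// transfer would come next in a full MPM step.
const velocities = grid.map((node) => (node.mass ? node.momentum / node.mass : 0));
console.log(velocities);
```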
Exhibited in complementary formats
Physical installation: The main exhibit; a projection on a large circular screen, parting the space between observers and a participant interacting via camera, enclosed by an array of immersive-audio speakers.
Online digital: The free and open exhibit; explored freely anywhere by anyone on their own web-devices, supporting equity of access to the artwork.
Collectible editions: An NFT series spanning the artwork’s range of aesthetics and themes; driven by on-chain data and collectors’ input. Supporters and installation audiences earn rewards that interact with these NFT editions.
Embodying meditative experiences and exploring by interaction, artwork and audience affect each other in a truly reciprocal dialogue, to create the experience together.
Progress demo ~ Material Point Method
Work-in-progress developing gl-mpm, the Material Point Method fluid-dynamics technology for Locus – a cutting-edge physics simulation handling multiple forms of matter interacting in one model.
Implementing the MLS-MPM physics model with ASFLIP integration, supporting WebGL 1.0, particle-based physics and rendering, and real-time natural-motion interaction – try it with the camera enabled.
So far: 2D, not yet 3D; fluid-dynamics, not yet other materials; up to AFLIP, not yet ASFLIP; basic rendering; a uniform grid, not yet hierarchical SPGrid.
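As background on the FLIP-family integration, here’s a sketch of the basic PIC/FLIP velocity blend that methods like AFLIP and ASFLIP extend – the ratio and values are illustrative:

```js
// Minimal sketch (illustrative): the PIC/FLIP velocity blend that FLIP-family
// integrators such as AFLIP/ASFLIP build on. PIC takes the new grid velocity
// directly (stable, dissipative); FLIP adds the grid's velocity change to the
// particle's old velocity (lively, noisier); blending trades between the two.

function blendVelocity(particleVelocity, gridOld, gridNew, flipRatio = 0.95) {
  const pic = gridNew;                                   // Pure PIC update.
  const flip = particleVelocity + (gridNew - gridOld);   // Pure FLIP update.
  return flipRatio * flip + (1 - flipRatio) * pic;       // Weighted blend.
}

// Example: the grid slowed from 2.0 to 1.5 around a particle moving at 1.8.
console.log(blendVelocity(1.8, 2.0, 1.5));
```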
Integrates gl-gpgpu to simulate all the physics and several other aspects, while supporting WebGL 1.0 for equity of access on many devices.
Integrates glsl-optical-flow for natural-motion interaction via camera to affect fluid particle flow.
Generously advised by leading academic researcher Raymond Yun Fei PhD.
Progress demo ~ aleatoric immersive audio
Exploring and prototyping the aleatoric audio process – it will be driven by the emergent fluid motions of millions of particles, each individual influence accumulated to generate a collective sound, played across many-channel audio enclosing the space and immersing the audience.
Morgan Carparelli – the sound-engineer for Locus, coming from the world of high-end film, television, and game audio – iterated this concept into this early prototype, where many spatialised audio sources are driven along parametric curves and synths are driven by overlapping inputs.
This prototype is built with Iannix, Reaper, and PureData – to be ported into JavaScript, WebAudio, and WebGL as we progress the audio process and musical qualities.
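As a rough Web Audio analogue of one part of that prototype – a spatialised source driven along a parametric curve – here’s a minimal sketch; the circular curve, speed, and panning model are assumptions, not the prototype’s patch:

```js
// Minimal sketch (an assumption of the prototype's idea, not its Iannix/Reaper
// patch): one spatialised source moved along a parametric curve - here a
// circle around the listener - using the Web Audio PannerNode.

const context = new AudioContext();
const source = context.createOscillator();
const panner = context.createPanner();
panner.panningModel = 'HRTF';
source.connect(panner).connect(context.destination);
source.start();

// Drive the source's position along a circular parametric curve over time.
function moveAlongCurve(time) {
  const angle = time * 0.5;       // Curve parameter: radians per second.
  const radius = 2;               // Metres from the listener.
  panner.positionX.value = radius * Math.cos(angle);
  panner.positionZ.value = radius * Math.sin(angle);
  requestAnimationFrame(() => moveAlongCurve(context.currentTime));
}

moveAlongCurve(0);
```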
Progress demo ~ Peer into the Flow
Developed Peer into the Flow, an artwork series supporting Locus – integrating GPU particle simulation and rendering, and AI face-tracking technology.
Your touch or cursor influences the particles, while the view mirrors the turn of your head via AI face-tracking – try it with the camera enabled.
Released as a series of NFT editions on Olta and OpenSea, to raise collector funds for Locus development.
Integrates gl-gpgpu to simulate all the physics and several other aspects, while supporting WebGL 1.0 for equity of access on many devices.
Integrates AI face-tracking for natural-motion interaction via camera to control the viewpoint.
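A minimal sketch of mapping a tracked head pose to the viewpoint – the `tracker.getHeadPose()` call is a hypothetical stand-in for whichever face-tracking library is used, and the smoothing and mirroring are illustrative choices:

```js
// Minimal sketch (illustrative, with a hypothetical face-tracker): map a
// tracked head rotation to the rendered viewpoint, smoothing it so the view
// mirrors the turn of your head without jitter.

// Assume `tracker.getHeadPose()` returns { yaw, pitch } in radians - a
// stand-in for whichever AI face-tracking library is in use.
function mirrorViewpoint(camera, tracker, smoothing = 0.1) {
  const { yaw, pitch } = tracker.getHeadPose();
  // Mirror: turning your head right swings the view left, as in a mirror.
  camera.yaw += (-yaw - camera.yaw) * smoothing;
  camera.pitch += (pitch - camera.pitch) * smoothing;
}

// Example usage each animation frame, with a fake tracker for demonstration.
const camera = { yaw: 0, pitch: 0 };
const fakeTracker = { getHeadPose: () => ({ yaw: 0.2, pitch: -0.05 }) };
mirrorViewpoint(camera, fakeTracker);
console.log(camera);
```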
Progress demo ~ GPGPU
Developed gl-gpgpu, a General-Purpose GPU (GPGPU) computation tool – useful for computationally intensive tasks like real-time physics and particle systems, of which Locus has many.
Provides equity of access to Locus, with high-performance parallel-computation of complex systems not reliant on the availability of compute-shaders or WebGPU – just WebGL 1.0, available on most devices audiences may own personally, not only high-end devices in custom installations.
Developed for optimal computation on the GPU via WebGL 1.0 and GLSL, and published Open-Source.
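For context, here’s a minimal sketch of the general WebGL 1.0 “ping-pong” pattern that GPGPU work like this relies on – not gl-gpgpu’s own API; shader compilation and the full-screen draw are left as a `drawUpdateShader` placeholder, and float render-target support still varies by device:

```js
// Minimal sketch (not the gl-gpgpu API): the classic WebGL 1.0 "ping-pong"
// GPGPU pattern - state lives in a float texture, a fragment shader writes
// the next state into a second texture, then the two swap roles each step.

const gl = document.createElement('canvas').getContext('webgl');
gl.getExtension('OES_texture_float'); // Float textures, for numeric precision.

const size = 64; // State stored as a size x size RGBA float texture.

function createStateTexture() {
  const texture = gl.createTexture();
  gl.bindTexture(gl.TEXTURE_2D, texture);
  gl.texImage2D(gl.TEXTURE_2D, 0, gl.RGBA, size, size, 0, gl.RGBA, gl.FLOAT, null);
  gl.texParameteri(gl.TEXTURE_2D, gl.TEXTURE_MIN_FILTER, gl.NEAREST);
  gl.texParameteri(gl.TEXTURE_2D, gl.TEXTURE_MAG_FILTER, gl.NEAREST);
  gl.texParameteri(gl.TEXTURE_2D, gl.TEXTURE_WRAP_S, gl.CLAMP_TO_EDGE);
  gl.texParameteri(gl.TEXTURE_2D, gl.TEXTURE_WRAP_T, gl.CLAMP_TO_EDGE);
  // Rendering into float textures also depends on device support.
  const framebuffer = gl.createFramebuffer();
  gl.bindFramebuffer(gl.FRAMEBUFFER, framebuffer);
  gl.framebufferTexture2D(gl.FRAMEBUFFER, gl.COLOR_ATTACHMENT0, gl.TEXTURE_2D, texture, 0);
  return { texture, framebuffer };
}

let read = createStateTexture();  // Holds the current state.
let write = createStateTexture(); // Receives the next state.

function step(drawUpdateShader) {
  gl.bindFramebuffer(gl.FRAMEBUFFER, write.framebuffer); // Render into "write".
  gl.viewport(0, 0, size, size);
  gl.bindTexture(gl.TEXTURE_2D, read.texture);           // Read from "read".
  drawUpdateShader();              // Placeholder: full-screen quad + update shader.
  [read, write] = [write, read];   // Swap roles for the next step.
}
```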
Progress demo ~ Optical-Flow
Developed glsl-optical-flow, an Optical-Flow Computer-Vision method for natural-motion interaction via video – try it with the camera enabled.
This natural-motion interaction embraces myriad kinds of people, forms, movements, abilities, appearances, and environments – intuitively and in real-time.
Developed to run optimally on the GPU via GLSL, and published Open-Source.
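As a plain-JavaScript analogue of the idea (the real method runs per-pixel in a GLSL fragment shader), here’s a minimal sketch of gradient-based optical flow at a single pixel – the example frames and one-point solution are simplifications:

```js
// Minimal sketch (a CPU analogue, not the glsl-optical-flow shader): classic
// gradient-based optical flow estimates motion from how brightness changes
// between two frames - spatial gradients against the temporal difference.

// `previous` and `current` are greyscale frames as 2D arrays of brightness.
function flowAt(previous, current, x, y) {
  // Spatial gradients from the previous frame (central differences).
  const dx = (previous[y][x + 1] - previous[y][x - 1]) * 0.5;
  const dy = (previous[y + 1][x] - previous[y - 1][x]) * 0.5;
  // Temporal gradient: how this pixel's brightness changed between frames.
  const dt = current[y][x] - previous[y][x];
  // One-point solution of the optical-flow constraint dx*u + dy*v + dt = 0,
  // along the gradient direction (the GPU shader aggregates a neighbourhood).
  const magnitude = dx * dx + dy * dy || 1e-6;
  return [(-dt * dx) / magnitude, (-dt * dy) / magnitude];
}

// Example: a brightness ramp shifted one pixel to the right between frames.
const previous = [
  [1, 2, 3, 4],
  [1, 2, 3, 4],
  [1, 2, 3, 4]
];
const current = [
  [0, 1, 2, 3],
  [0, 1, 2, 3],
  [0, 1, 2, 3]
];
console.log(flowAt(previous, current, 1, 1)); // ~[1, 0]: one pixel rightward.
```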
An artwork engaging your conscious exploration of awareness and sensation, to release attachments, face aversions, and perceive and attune to clearest reality.
Support Locus on Artizen
Locus is crowdfunding on Artizen
Locus has been curated for the Season 4 Official Selection of the Artizen fund for human creativity – after raising Ξ1.89 / US$4,368 in Season 3 from supporters, matched by sponsors!
On this crowdfunding platform, creators engage their communities to support projects with contributions multiplied by match-funds, and compete for the Artizen Prize.
If Locus resonates with you, supporting it on Artizen is an ideal way to help the team creating it.
You can contribute on Artizen using crypto or card payment until 24 January 2025.
Need some help?
We can help you make a contribution, just contact us.
What Locus gets
Artizen and their sponsors match your contributions, amplifying your impact on the project.
You can support Locus by buying Artifacts, one token per Ξ0.01 / US$25 you contribute.
Artifact sales directly fund the project, unlock more match-funding, and move it up the leaderboard for the Artizen Prize.
What you get
Locus rewards your support, both right now and in the future, when rewards are integrated into the final artwork.
Each Artifact is an NFT asset that benefits both you and the project.
Your Artifacts invest in Locus, growing in value with the project and its community; and grant you access to Locus and Artizen private communities and votes to curate future Artizen seasons.
Your Rewards
You can earn rewards for supporting the Locus project – each unlocked in tiers for every 1× or 2× or 3× Artifacts that you buy.
Some rewards can only be sent to you if you have a crypto wallet.
Some rewards require further information like your email or photos.
1×Artifact ~ A Locus Artifact edition
Pay by crypto ~ must be sent to your wallet.
Own a part of Locus in its earliest phase with every crypto contribution, as an Artifact NFT edition of the Locus concept art, sent to your wallet.
Each Artifact is an NFT crypto asset – its value may grow with the project and its community, and it can be traded on the blockchain.
Your Artifacts give you a voice to curate projects for Artizen funding seasons – each grants you a vote you can give to projects you believe in and would like to see funded on Artizen.
Ownership also grants you access to private communities of:
- Locus: backers, team members; shaping the project, insights, progress, exhibitions, events, connecting collectors with art.
- Artizen: curators, creators, supporters; meeting online and at events around the world, driving positive-impact projects.
Artifact collectors will be connected to the final artwork, as sponsors credited by name or brand or alias, and by unlocking extra features of Locus.

2×Artifact ~ A Peer into the Flow edition
Pay by crypto ~ must be sent to your wallet.
For every 2 Artifacts you buy via crypto, your wallet also receives a random unique NFT edition of Peer into the Flow, while this limited series of 66 lasts.
Own an edition of these digital interactive kinetic sketches, exploring flowing forms and natural interactions with your look and touch, evolving with their on-chain context.
Here your touch or cursor influences the particles, while the view mirrors the turn of your head with the camera AI face-tracking enabled.
This series was created by epok.tech to support Locus with sales to collectors, and to prototype some of its elements – AI body-tracking for natural interactions, complex motions of millions of particles.
A free random edition from 2 Artifacts at Ξ0.02 is discounted from Ξ0.04 / 92 MATIC. See it on Olta or OpenSea to learn about the series, interactions, controls, and dynamic on-chain effects, and to specify an edition to mint (at full price).
Peer into the Flow collectors will see their editions dynamically update and connect with the final Locus artwork.
3×Artifact ~ A 3D face scan shown in Locus
Coming Soon ~ this reward will apply retroactively when released.
No wallet needed ~ pay by crypto or card and add your email.
Further information required ~ contact us to set up this reward.
For every 3 Artifacts you buy from Locus, a hyper-realistic 3D scan of your face will feature in the artwork.
Using Gaussian Splatting, a new hyper-realistic 3D scanning technology, we’ll process a video or images of your face at different angles to generate a physically-realistic 3D reconstruction.
Your 3D face-scan will then be integrated into the art as:
- A unique digital artwork of your face, given to you as digital assets and optionally an NFT – soon after we receive your imagery.
- One of many faces that fade through the Locus artwork while no one’s interacting with it – when the final artwork is exhibited.
This uses the same technique as in the final Locus artwork – to reflect your likeness faithfully in the visuals, mirror you and deepen your personal connection with your experience, and gather an archive of all the varied participants and supporters of this project.
Inspiration and references
Examples of key aesthetic elements of Locus:
Fluid particles, fluid dynamics physics simulation, translucent 3D volumes.
The Locus concept art sketch blends these elements together.
Material Point Method simulation of fluid particles by Grant Kot.
Other progress
Team and partners
We’ve assembled talented people into a multi-disciplinary creative team to bring this project to life:
- Creative Consultant, Technical Producer, Designers, Photographers, Public Engagement Consultant.
- Sound-Designer and Musician collaborating on experimental generative sound.
- Steel & Form metalwork studio for set-design and fabrication.
- Innovation Creative Director support by Amplify creative agency.
- Meditation Consultant, interviewing other meditative practitioners and audiences to keep this artistic interpretation authentic.
Milestones reached
We’ve deeply researched key elements:
- Material Point Method for physics realism, in real-time web-tech on GPU, with the advice of leading academic researcher Raymond Yun Fei PhD.
- Signed-Distance Fields for the anatomical bust 3D forms, to influence fluid particles’ colour and matter properties; and related methods.
Developed Optical-Flow, AI face-tracking, and GPU-optimised real-time simulation elements, splitting parts into smaller projects supporting this main artwork.
Completed the detailed production timeline and budget.
Exhibitions confirmed and expected
- BETA Festival in Dublin, Ireland.
- Peckham Digital in London, United Kingdom.
- Among others expecting confirmation…