This visual experiment uses T3D to achieve high realism while maintaining real-time performance, blending detailed rendering with responsive, generative control.
This visual study uses smoothstep functions for seamless fading, and soft shadows to enhance depth, creating a clean, atmospheric real-time aesthetic.
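Smoothstep is the standard Hermite easing curve; below is a minimal sketch of how it can drive a fade, assuming an illustrative 60 fps loop with one-second fade windows (the timings are not from the piece itself):

```python
import numpy as np

def smoothstep(edge0, edge1, x):
    """Hermite interpolation: 0 below edge0, 1 above edge1, smooth in between."""
    t = np.clip((x - edge0) / (edge1 - edge0), 0.0, 1.0)
    return t * t * (3.0 - 2.0 * t)

# Fade a layer in over the first second and out over the last second
# of a hypothetical 10-second loop, at 60 fps.
duration = 10.0
for frame in range(600):
    t = frame / 60.0
    fade_in = smoothstep(0.0, 1.0, t)
    fade_out = 1.0 - smoothstep(duration - 1.0, duration, t)
    alpha = fade_in * fade_out  # drive the layer's opacity with this value
```

Because both edges of smoothstep have zero slope, the fade starts and ends without a visible "pop", which is what gives the blurb's seamless quality.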
A real-time visual using feedback loops and displacement to create evolving, abstract motion and depth.
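A minimal numpy sketch of the feedback-plus-displacement idea: each frame resamples the previous frame through a slowly drifting sine displacement field, decays it slightly, and reinjects a seed shape so the image keeps evolving. Resolution, decay, and field parameters are illustrative assumptions:

```python
import numpy as np

H, W = 256, 256
buffer = np.zeros((H, W), dtype=np.float32)      # persistent feedback buffer
yy, xx = np.mgrid[0:H, 0:W].astype(np.float32)   # pixel coordinate grids

for frame in range(300):
    t = frame / 60.0
    # Displacement field: drifting sine waves warp the sampling coordinates.
    dx = 4.0 * np.sin(yy * 0.05 + t)
    dy = 4.0 * np.cos(xx * 0.05 + t * 0.7)
    sx = np.clip(xx + dx, 0, W - 1).astype(int)
    sy = np.clip(yy + dy, 0, H - 1).astype(int)
    # Feedback: resample the last frame through the displacement, then decay.
    buffer = buffer[sy, sx] * 0.98
    # Reinject a small seed shape at the centre so motion never dies out.
    buffer[H // 2 - 2:H // 2 + 2, W // 2 - 2:W // 2 + 2] = 1.0
```

In TouchDesigner the same loop is typically built from a Feedback TOP feeding a Displace TOP; the numpy version just makes the data flow explicit.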
This project uses StreamDiffusion for real-time rendering, blending AI-generated visuals with live generative processes to create fluid, evolving imagery.
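A rough sketch of the real-time loop, based on the usage shown in the StreamDiffusion README (github.com/cumulo-autumn/StreamDiffusion); the model id, t_index_list values, prompt, and the grab_frame/show helpers are assumptions, not this project's actual pipeline:

```python
import torch
from diffusers import StableDiffusionPipeline
from streamdiffusion import StreamDiffusion
from streamdiffusion.image_utils import postprocess_image

# Wrap a standard diffusers pipeline for few-step streaming inference.
pipe = StableDiffusionPipeline.from_pretrained(
    "KBlueLeaf/kohaku-v2.1"
).to(device="cuda", dtype=torch.float16)

stream = StreamDiffusion(pipe, t_index_list=[32, 45], torch_dtype=torch.float16)
stream.prepare(prompt="fluid abstract ink, evolving forms")

while True:
    frame = grab_frame()            # hypothetical live-input helper
    x_out = stream(frame)           # one img2img step on the latest frame
    show(postprocess_image(x_out))  # hypothetical display helper
```

The key design point is that each output frame is diffused from the previous live frame rather than from noise, which is what lets the AI imagery track the generative source in real time.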
A real-time performance using a MIDI controller to trigger displacements and generative effects in TouchDesigner, creating dynamic, reactive visuals.
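A minimal sketch of the MIDI mapping, using the mido library rather than TouchDesigner's own MIDI In CHOP; the CC number, normalization, and the trigger_burst helper are assumptions about the setup:

```python
import mido

DISPLACE_CC = 1  # assumed: mod wheel mapped to displacement strength

with mido.open_input() as port:  # opens the default MIDI input
    displace = 0.0
    for msg in port:
        if msg.type == "control_change" and msg.control == DISPLACE_CC:
            displace = msg.value / 127.0  # normalize CC range to 0..1
            # In TouchDesigner this value would drive e.g. a Displace TOP's
            # weight parameter via a CHOP export or Script DAT.
        elif msg.type == "note_on" and msg.velocity > 0:
            trigger_burst(msg.note, msg.velocity)  # hypothetical effect trigger
```

Continuous controls (knobs, faders) map naturally to effect amounts, while note-on events suit one-shot triggers, which is the split the blurb's "trigger displacements and generative effects" implies.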
A live generative piece combining real-time TouchDesigner visuals with reactive sound, exploring particle velocity and force, algorithmic design, and immersive audio.
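A compact numpy sketch of the particle dynamics: the audio level scales the force field, velocity integrates force with damping, and positions wrap at the edges. In practice audio_level would come from an envelope follower on the live input (for example an Audio Analysis CHOP in TouchDesigner); all constants here are illustrative:

```python
import numpy as np

N = 2000
pos = np.random.rand(N, 2).astype(np.float32)  # particles in the unit square
vel = np.zeros((N, 2), dtype=np.float32)

def step(audio_level, dt=1.0 / 60.0):
    """Advance the simulation one frame; audio_level in 0..1 scales the force."""
    global pos, vel
    to_centre = 0.5 - pos
    # Spring force toward the centre that flips outward as the audio swells,
    # so loud passages scatter the particles and quiet ones gather them.
    force = to_centre * (0.8 - 3.0 * audio_level)
    vel = vel * 0.96 + force * dt  # damping plus force integration
    pos = (pos + vel * dt) % 1.0   # wrap positions at the edges
```

Keeping force and velocity as explicit state is what lets the sound shape the motion's energy over time, rather than just its instantaneous position.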