at Resonanzraum (snippet 4 of 4)
at Resonanzraum (snippet 3 of 4)
at Resonanzraum (snippet 2 of 4)
at Resonanzraum (snippet 1 of 4)
This video shows a live AI-driven creative process with two screens: one displaying the generated visuals, the other showing a real-time AI interpretation. It highlights the interplay between raw output and live AI-enhanced transformation.
Created with generative AI in real time using StreamDiffusion, this piece explores machine-driven aesthetics and evolving visual forms.
This visual experiment uses T3D to achieve high realism while maintaining real-time performance, blending detailed rendering with responsive, generative control.
This visual study uses smoothstep functions for seamless fading and soft shadows to enhance depth, creating a clean and atmospheric real-time aesthetic.
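The study's actual shader network isn't shown here, but the fading it describes rests on the standard smoothstep curve. A minimal sketch in Python/numpy, with the time edges chosen purely for illustration:

```python
import numpy as np

def smoothstep(edge0, edge1, x):
    """Hermite interpolation, identical to GLSL smoothstep():
    0 below edge0, 1 above edge1, a smooth S-curve in between."""
    t = np.clip((x - edge0) / (edge1 - edge0), 0.0, 1.0)
    return t * t * (3.0 - 2.0 * t)

# Fade a layer in between t=2s and t=3s, out between t=8s and t=9s.
def fade_envelope(t):
    return smoothstep(2.0, 3.0, t) * (1.0 - smoothstep(8.0, 9.0, t))
```

Because both ends of the S-curve have zero slope, crossfades built this way start and stop without visible pops, which is what gives the piece its seamless feel.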
A real-time visual using feedback loops and displacement to create evolving, abstract motion and depth.
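A feedback loop of this kind re-reads the previous frame, displaces it, and decays it before adding new input. A minimal sketch assuming numpy and scipy (the piece itself runs in a TOP network, not in this code; the sine displacement field and decay value are illustrative):

```python
import numpy as np
from scipy.ndimage import map_coordinates

H, W = 256, 256
frame = np.zeros((H, W), dtype=np.float32)
yy, xx = np.mgrid[0:H, 0:W].astype(np.float32)

def step(frame, t, decay=0.97):
    # Displace the previous frame along a slowly evolving sine field,
    # then decay it so trails fade instead of accumulating forever.
    dx = 2.0 * np.sin(yy * 0.05 + t)
    dy = 2.0 * np.cos(xx * 0.05 + t)
    displaced = map_coordinates(frame, [yy + dy, xx + dx], order=1, mode="wrap")
    # Inject a small moving seed so the loop has fresh input each frame.
    cx, cy = int(W / 2 + 80 * np.cos(t)), int(H / 2 + 80 * np.sin(t))
    displaced[cy - 2:cy + 2, cx - 2:cx + 2] = 1.0
    return displaced * decay

for i in range(300):
    frame = step(frame, i * 0.05)
```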
This project uses StreamDiffusion for real-time rendering, blending AI-generated visuals with live generative processes to create fluid, evolving imagery.
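For reference, a sketch of the live loop based on the usage pattern in the StreamDiffusion README; the model names, t_index_list, and exact arguments are illustrative and may differ between versions and from this project's setup:

```python
import torch
from diffusers import StableDiffusionPipeline, AutoencoderTiny
from streamdiffusion import StreamDiffusion
from streamdiffusion.image_utils import postprocess_image

# Wrap a diffusers pipeline; t_index_list picks the few denoising
# steps that run per frame, which is what makes it real-time.
pipe = StableDiffusionPipeline.from_pretrained("KBlueLeaf/kohaku-v2.1").to(
    device=torch.device("cuda"), dtype=torch.float16
)
stream = StreamDiffusion(pipe, t_index_list=[0, 16, 32, 45],
                         torch_dtype=torch.float16)
stream.load_lcm_lora()   # LCM-LoRA enables few-step sampling
stream.fuse_lora()
stream.vae = AutoencoderTiny.from_pretrained("madebyollin/taesd").to(
    device=pipe.device, dtype=pipe.dtype
)

stream.prepare(prompt="abstract fluid forms, evolving machine aesthetics")
for _ in range(4):       # warm-up passes before the live loop
    stream()
while True:
    image = postprocess_image(stream.txt2img(), output_type="pil")[0]
    # hand `image` to the output window / projection here
```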
Real-time performance using a MIDI controller to trigger displacements and generative effects in TouchDesigner, creating dynamic, reactive visuals.
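The TouchDesigner network itself isn't reproducible here, but the MIDI mapping it relies on looks roughly like this sketch using the mido library; the port, CC number, and printed actions are assumptions standing in for the actual parameter bindings:

```python
import mido

# Map a MIDI CC knob (0-127) to a displacement amount (0.0-1.0).
# List available ports with mido.get_input_names() and adjust
# for your controller.
DISPLACE_CC = 1

def cc_to_amount(value):
    return value / 127.0

with mido.open_input() as port:        # default MIDI input port
    for msg in port:                   # blocks, yielding messages live
        if msg.type == "control_change" and msg.control == DISPLACE_CC:
            amount = cc_to_amount(msg.value)
            print(f"displacement -> {amount:.2f}")
        elif msg.type == "note_on" and msg.velocity > 0:
            print(f"trigger generative effect for note {msg.note}")
```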
This project explores spatial transformation through real-time projection and sound-reactive design.
A live generative piece combining real-time TouchDesigner visuals with reactive sound, exploring particle velocity and force, algorithmic design, and immersive audio.
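At its core, a velocity-and-force particle system is just numerical integration. A generic numpy sketch, not the piece's actual network, with the audio level, forces, and damping chosen for illustration:

```python
import numpy as np

N = 1000
pos = np.random.rand(N, 2).astype(np.float32)
vel = np.zeros((N, 2), dtype=np.float32)

def update(pos, vel, audio_level, dt=1 / 60):
    # Force: pull toward the center, scaled by a live audio level,
    # plus a little noise; semi-implicit Euler integration.
    to_center = 0.5 - pos
    force = 3.0 * audio_level * to_center + 0.2 * (np.random.rand(N, 2) - 0.5)
    vel = 0.98 * (vel + force * dt)    # damping keeps motion bounded
    pos = pos + vel * dt
    return pos, vel

for frame in range(600):
    level = 0.5 + 0.5 * np.sin(frame * 0.1)   # stand-in for a sound signal
    pos, vel = update(pos, vel, level)
```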
Created in real time with TouchDesigner, featuring animations with dynamic, lifelike motion.
Created with Unreal Engine’s Metahuman and custom PBR textures made in Blender, enabling detailed, cross-platform character design.
A real-time setup combining Unreal Engine and TouchDesigner. 3D scenes from Blender and live visuals stream across multiple screens via Switchboard.
A real-time visualization built with TouchDesigner and Python that uses live weather data from an API to generate visuals reacting to wind and temperature changes.
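The core of such a setup is a small fetch-and-map script. A sketch assuming the free Open-Meteo API (the project's actual provider isn't stated); the coordinates and parameter ranges are illustrative:

```python
import requests

# Open-Meteo needs no API key; coordinates here are Hamburg.
URL = ("https://api.open-meteo.com/v1/forecast"
       "?latitude=53.55&longitude=9.99&current_weather=true")

def fetch_weather():
    data = requests.get(URL, timeout=5).json()["current_weather"]
    return data["windspeed"], data["temperature"]

def map_to_visuals(windspeed, temperature):
    # Wind drives turbulence strength, temperature shifts the palette.
    turbulence = min(windspeed / 30.0, 1.0)       # ~30 km/h -> full strength
    warmth = (temperature + 10.0) / 45.0          # -10..35 C -> 0..1
    return turbulence, max(0.0, min(warmth, 1.0))

turbulence, warmth = map_to_visuals(*fetch_weather())
```

In TouchDesigner the fetch would typically run on a timer and write the two values to parameters driving the visual network.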
An audio-reactive smoke visualization created in TouchDesigner, with sound generated in Ableton Live.
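Audio reactivity of this kind usually means analyzing incoming sound per block and mapping band energies to visual parameters. A sketch assuming the sounddevice library with Ableton's output routed to an input device (e.g. via loopback); the band ranges and mappings are assumptions:

```python
import numpy as np
import sounddevice as sd

RATE, BLOCK = 44100, 1024

def band_energy(samples, rate, lo, hi):
    # Magnitude spectrum of one block, summed over a frequency band.
    spectrum = np.abs(np.fft.rfft(samples * np.hanning(len(samples))))
    freqs = np.fft.rfftfreq(len(samples), 1.0 / rate)
    return spectrum[(freqs >= lo) & (freqs < hi)].sum()

def callback(indata, frames, time, status):
    mono = indata[:, 0]
    bass = band_energy(mono, RATE, 20, 200)       # drives smoke density
    highs = band_energy(mono, RATE, 2000, 8000)   # drives turbulence
    print(f"bass={bass:.1f} highs={highs:.1f}")

with sd.InputStream(channels=1, samplerate=RATE, blocksize=BLOCK,
                    callback=callback):
    sd.sleep(10_000)   # listen for 10 seconds
```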
A real-time fluid simulation built in TouchDesigner, exploring dynamic motion and generative visual design.
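The simulation itself lives in TouchDesigner, but the underlying idea, semi-Lagrangian advection, fits in a few lines. A toy sketch with a fixed swirl field (a full solver would also advect velocity and project it to be divergence-free):

```python
import numpy as np
from scipy.ndimage import map_coordinates

H, W = 128, 128
yy, xx = np.mgrid[0:H, 0:W].astype(np.float32)

# A fixed swirling velocity field around the center.
cx, cy = W / 2, H / 2
u = -(yy - cy) * 0.05   # x-velocity
v = (xx - cx) * 0.05    # y-velocity

dye = np.zeros((H, W), dtype=np.float32)
dye[H // 2 - 5:H // 2 + 5, W // 4 - 5:W // 4 + 5] = 1.0   # initial blob

def advect(field, u, v, dt=1.0):
    # Semi-Lagrangian step: sample the field where the flow came from.
    return map_coordinates(field, [yy - v * dt, xx - u * dt],
                           order=1, mode="wrap")

for _ in range(200):
    dye = advect(dye, u, v) * 0.999   # slight dissipation
```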
A real-time installation using Kinect and OpenCV to create a dynamic, interactive point cloud of the body. Built in TouchDesigner with Python for control and integration.
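Turning a depth image into a body point cloud is a back-projection through the camera intrinsics. A sketch with rough Kinect v2 intrinsics (assumptions; calibrate for real use) and a synthetic frame standing in for the live feed, which in TouchDesigner would come from a Kinect TOP:

```python
import numpy as np
import cv2

# Approximate Kinect v2 depth intrinsics (focal lengths, principal point).
FX, FY, CX, CY = 365.0, 365.0, 256.0, 212.0

def depth_to_points(depth_mm, near=500, far=4000):
    """Back-project a depth image (millimetres) into an Nx3 point cloud,
    keeping only pixels inside the near/far body range."""
    mask = (depth_mm > near) & (depth_mm < far)
    vs, us = np.nonzero(mask)
    z = depth_mm[vs, us] / 1000.0            # metres
    x = (us - CX) * z / FX
    y = (vs - CY) * z / FY
    return np.column_stack([x, y, z])

# Synthetic 512x424 frame so the sketch runs without hardware.
depth_frame = np.full((424, 512), 1500, dtype=np.uint16)
depth_frame = cv2.GaussianBlur(depth_frame, (5, 5), 0)   # OpenCV cleanup pass
points = depth_to_points(depth_frame)
```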
Interactive fluid animation created with TouchDesigner that adapts in real time to a person's movement, tracked with a Kinect.
Live at SHIFTED Festival, this TouchDesigner project features real-time audio-reactive visuals projected onto cubes, creating an immersive interactive experience.