Real-Time Rendering with StreamDiffusion

This project explores the creative potential of AI-assisted real-time rendering using StreamDiffusion. Live control signals steer the generative process, so the visuals evolve continuously, blending neural imagery with dynamic input in a fluid, responsive way. The result is a distinctive visual language that merges algorithmic aesthetics with performative spontaneity.
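As a rough illustration of how this kind of setup can be wired together, the sketch below drives the open-source StreamDiffusion Python library with a live webcam feed in img2img mode. The checkpoint, prompt, resolution, and t_index_list values are illustrative placeholders, not the exact settings used in this project, and the webcam/OpenCV input stands in for whatever live source (camera, NDI, Spout from TouchDesigner) actually feeds the pipeline.

```python
# Minimal sketch: live img2img loop with StreamDiffusion.
# Checkpoint, prompt, and t_index_list are illustrative, not this project's settings.
import cv2
import numpy as np
import torch
from diffusers import AutoencoderTiny, StableDiffusionPipeline
from PIL import Image
from streamdiffusion import StreamDiffusion
from streamdiffusion.image_utils import postprocess_image

pipe = StableDiffusionPipeline.from_pretrained("KBlueLeaf/kohaku-v2.1").to(
    device=torch.device("cuda"), dtype=torch.float16
)

# Wrap the pipeline for streaming; t_index_list picks the few denoising
# steps actually run per incoming frame.
stream = StreamDiffusion(pipe, t_index_list=[32, 45], torch_dtype=torch.float16)

# LCM-LoRA and the tiny VAE are the usual accelerations for real-time rates.
stream.load_lcm_lora()
stream.fuse_lora()
stream.vae = AutoencoderTiny.from_pretrained("madebyollin/taesd").to(
    device=pipe.device, dtype=pipe.dtype
)

stream.prepare(prompt="abstract neon landscape, fluid organic forms")

cap = cv2.VideoCapture(0)  # live input; could be any camera or capture source

# Warm up the rolling denoising buffer before showing output.
ok, frame = cap.read()
warmup = Image.fromarray(cv2.cvtColor(frame, cv2.COLOR_BGR2RGB)).resize((512, 512))
for _ in range(2):
    stream(warmup)

while True:
    ok, frame = cap.read()
    if not ok:
        break
    src = Image.fromarray(cv2.cvtColor(frame, cv2.COLOR_BGR2RGB)).resize((512, 512))
    x_out = stream(src)  # one low-step diffusion pass per incoming frame
    out = postprocess_image(x_out, output_type="pil")[0]
    cv2.imshow("stream", cv2.cvtColor(np.array(out), cv2.COLOR_RGB2BGR))
    if cv2.waitKey(1) & 0xFF == ord("q"):
        break

cap.release()
cv2.destroyAllWindows()
```

The key design point is that each output frame is produced by only a handful of denoising steps over the latest input frame, which is what keeps the feedback between live control and generated imagery tight enough for performance use.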
