TomBKN
Services
Portfolio
Contact
  • All
  • Audio Visualizations
  • Immersive Experiences
  • AI-Visuals
  • Generative Arts
  • Projection Mapping
  • Realtime Data Visualizations
  • TouchDesigner Development
  • Virtual Production
    • Audio Visualizations,

    Audio-Visuals for a composition by Julian Gutjahr (4 of 4)

    at Resonanzraum (snippet 4 of 4)

    • Audio Visualizations,

    Audio-Visuals for a composition by Julian Gutjahr (3 of 4)

    at Resonanzraum (snippet 3 of 4)

    • Audio Visualizations,

    Audio-Visuals for a composition by Julian Gutjahr (2 of 4)

    at Resonanzraum (snippet 2 of 4)

    • Audio Visualizations,

    Audio-Visuals for a composition by Julian Gutjahr (1 of 4)

    at Resonanzraum (snippet 1 of 4)

    • TouchDesigner Development,

    Check out my GitHub

    My TouchDesigner development work is available on GitHub.

    • AI-Visuals,

    AI Visual Workflow in Real Time

    This video shows a live AI-driven creative process with two screens: one displaying the generated visuals, the other showing a real-time AI interpretation. It highlights the interplay between raw output and live AI-enhanced transformation.

    • AI-Visuals,

    AI-Generated Visuals with StreamDiffusion

    Created with generative AI in real time using StreamDiffusion, this piece explores machine-driven aesthetics and evolving visual forms.

    • Generative Arts,

    Realistic Real-Time – Built with T3D

    This visual experiment uses T3D to achieve high realism while maintaining real-time performance, blending detailed rendering with responsive, generative control.

    • Generative Arts,

    Smoothstep Fade & Soft Shadows

    This visual study uses smoothstep functions for seamless fading and soft shadows to enhance depth, creating a clean and atmospheric real-time aesthetic.
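
    The smoothstep fade mentioned above follows the standard GLSL Hermite formulation; a minimal Python sketch of the idea (function and variable names are mine, not from the project):

```python
def smoothstep(edge0, edge1, x):
    """GLSL-style smoothstep: clamp, then apply the Hermite curve 3t^2 - 2t^3.

    Useful for seamless fades: the curve has zero slope at both edges,
    so transitions ease in and out instead of snapping.
    """
    t = (x - edge0) / (edge1 - edge0)
    t = max(0.0, min(1.0, t))  # clamp to [0, 1]
    return t * t * (3.0 - 2.0 * t)

# Fade a brightness value in over the interval [0.2, 0.8]:
fade = smoothstep(0.2, 0.8, 0.5)  # midpoint of the fade
```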

    • Generative Arts,

    Feedback Displacement Effect in TouchDesigner

    A real-time visual using feedback loops and displacement to create evolving, abstract motion and depth.
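
    As a rough sketch of the idea (not the actual TouchDesigner network), one step of a feedback-displacement loop can be written in plain Python: the previous output frame is offset, faded, and blended back with the new input, so bright pixels leave drifting trails.

```python
def feedback_step(frame, prev, dx=1, dy=1, decay=0.9):
    """One iteration of a feedback-displacement loop.

    The previous output `prev` is offset by (dx, dy) with wraparound,
    faded by `decay`, and blended with the new `frame` by taking the
    brighter value. Frames are 2-D lists of floats in [0, 1].
    """
    h, w = len(frame), len(frame[0])
    out = [[0.0] * w for _ in range(h)]
    for y in range(h):
        for x in range(w):
            trail = prev[(y - dy) % h][(x - dx) % w] * decay
            out[y][x] = max(frame[y][x], trail)
    return out
```

Run repeatedly with each new input frame (feeding the output back in as `prev`) and a single bright pixel smears into a decaying diagonal trail.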

    • Generative Arts,

    Real-Time Rendering with StreamDiffusion

    This project uses StreamDiffusion for real-time rendering, blending AI-generated visuals with live generative processes to create fluid, evolving imagery.

    • Generative Arts,

    Live Visual Performance with MIDI Control

    Real-time performance using a MIDI controller to trigger displacements and generative effects in TouchDesigner, creating dynamic, reactive visuals.

    • Projection Mapping,

    Projection Mapping on Cubes

    This project explores spatial transformation through real-time projection and sound-reactive design.

    • Generative Arts,

    Snippet from Audiovisual Performance at Resonanzraum

    A live generative piece combining real-time TouchDesigner visuals with reactive sound, exploring the velocity and force of particles, algorithmic design and immersive audio.

    • Audio Visualizations,

    Smoke and Particles Simulation

    A smoke-and-particle simulation created in real time with TouchDesigner, featuring dynamic, lifelike motion.

    • Virtual Production,

    Live Stage Simulation

    Created with Unreal Engine’s MetaHuman and custom PBR textures made in Blender, enabling detailed, cross-platform character design.

    • Virtual Production,

    Hybrid Real-Time Scenario

    A real-time setup combining Unreal Engine and TouchDesigner. 3D scenes from Blender and live visuals stream across multiple screens via Switchboard.

    • Realtime Data Visualizations,

    Real-Time Weather Visualization

    A real-time visualization built with TouchDesigner and Python that uses live weather data from an API to generate visuals reacting to wind and temperature changes.
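
    A hedged sketch of the data-to-parameter mapping such a piece might use; the ranges, parameter names, and units below are illustrative assumptions, not the project's actual values:

```python
def weather_to_params(wind_speed_ms, temperature_c):
    """Map live weather readings to normalized visual parameters.

    wind_speed_ms: wind speed in m/s (0-20 m/s mapped to 0-1 motion speed)
    temperature_c: temperature in degrees C (-10 to 40 mapped to 0-1 hue shift)
    All ranges here are illustrative, not the project's real mapping.
    """
    motion = max(0.0, min(1.0, wind_speed_ms / 20.0))
    hue = max(0.0, min(1.0, (temperature_c + 10.0) / 50.0))
    return {"motion_speed": motion, "hue_shift": hue}

# A 10 m/s wind at 15 degrees C lands in the middle of both ranges:
params = weather_to_params(10.0, 15.0)
```

In TouchDesigner, a function like this would typically run inside a Script CHOP or Execute DAT, with the returned values driving channels in the visual network.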

    • Audio Visualizations,

    Audio-visualized Smoke Simulation

    An audio-reactive smoke visualization created in TouchDesigner, with sound generated in Ableton Live.

    • Audio Visualizations,

    Fluid Simulation

    A real-time fluid simulation built in TouchDesigner, exploring dynamic motion and generative visual design.

    • Immersive Experiences,

    Interactive Point Cloud Installation

    A real-time installation using Kinect and OpenCV to create a dynamic, interactive point cloud of the body. Built in TouchDesigner with Python for control and integration.
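
    The depth-to-point-cloud step behind such an installation can be sketched with the standard pinhole camera model; the intrinsics below are rough placeholder values for a Kinect v2 depth camera, not calibrated values from the installation:

```python
def backproject(u, v, depth_m, fx=365.0, fy=365.0, cx=256.0, cy=212.0):
    """Back-project one depth pixel (u, v) to a 3-D point in meters.

    Pinhole model: x = (u - cx) * z / fx, y = (v - cy) * z / fy.
    Defaults approximate a Kinect v2 depth camera (512x424 image) and
    are placeholders, not values from the actual installation.
    """
    x = (u - cx) * depth_m / fx
    y = (v - cy) * depth_m / fy
    return (x, y, depth_m)

# The optical center maps straight down the camera axis:
point = backproject(256, 212, 2.0)  # -> (0.0, 0.0, 2.0)
```

Applying this to every pixel of each depth frame yields the point cloud that the installation then animates.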

    • Immersive Experiences,

    Kinect Fade Installation on Screen

    An interactive fluid animation created in TouchDesigner that adapts in real time to a person's movement, tracked with Kinect.

    • Audio Visualizations,

    Live at SHIFTED Festival

    Performed live at SHIFTED Festival, this TouchDesigner project features real-time audio-reactive visuals projected onto cubes, creating an immersive, interactive experience.

visual art & development by TomBKN

Copyright. All rights reserved.
AGB & Datenschutz (Terms & Privacy Policy)