2025 Submerge Submission

04-10-2025

Artist Statement

I’m a Houston-based digital artist and software engineer with a deep interest in the expressive potential of shaders and real-time graphics. My work explores the fundamental building blocks of computer graphics—shaders—and how these simple, low-level programs can be orchestrated into complex, emotionally resonant visual systems. What draws me to this medium is its boundless flexibility; shaders are often seen as technical tools, but I see them as digital paintbrushes that respond, evolve, and react to the world around them.

Recently, I’ve been inspired by the nostalgia of early digital interfaces—particularly the music visualizers from Windows XP. Despite their simplicity, those reactive visuals captured something deeply human: a computer responding to sound in a way that felt alive. My practice builds on that idea, creating immersive shader-based experiences that reflect the computer’s point of view—how it might see, hear, and feel music or other stimuli.

Beyond simple audio reactivity, I’m currently exploring how shaders can interact with one another, forming networks of visual logic that feel more like digital ecosystems than isolated effects. These interdependent systems allow for emergent behaviors, where one shader’s movement or rhythm influences another, opening the door to deeper forms of generative storytelling and visual dialogue.

Through my work, I aim to bridge the gap between technical precision and emotional expression, using the raw tools of software and computation to create visual environments that pulse with life and respond in real time—much like we do. SUBMERGE represents the perfect platform to bring these explorations off the screen and into a physical, immersive space.

Requirements

I created this piece using the commercial version of TouchDesigner, build 2023.11340. As I mentioned in my original submission, TouchDesigner (ideally the current version) is required to run the piece. However, if the software isn’t readily available to the reviewers or board, I’ve uploaded several videos to a Google Drive folder for easy viewing and download. You can access the drive here.

Examples

Below are a few moving and still images of the shader:

Mouse Version: Mouse input changing the color during runtime
High Resolution Screenshot: Still of the shader during a particularly nice frame

Features

The project is currently set up to use an audio file as input, analyzing various frequency ranges to drive visual changes such as movement, warping, and color shifts. It also supports interactive input. Mouse movement (or touch input on compatible screens) is already enabled and adds an extra layer of dynamic interaction to the visual experience.
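To make the audio-reactive idea concrete, here is a minimal, standalone Python sketch of the general technique: splitting a signal into frequency bands and normalizing each band's energy into a visual parameter. This is not the actual TouchDesigner network (which handles the analysis with CHOPs); the function and parameter names (`band_energies`, `drive_params`, `warp_amount`, etc.) are illustrative only.

```python
# Illustrative sketch of audio-band analysis driving visual parameters.
# Not the project's real code: band edges and parameter names are invented.
import math

def band_energies(samples, sample_rate, bands):
    """Naive DFT: total spectral magnitude within each (low_hz, high_hz) band."""
    n = len(samples)
    energies = []
    for low, high in bands:
        k_lo = max(1, int(low * n / sample_rate))          # skip DC at k=0
        k_hi = min(n // 2, int(high * n / sample_rate))    # cap at Nyquist
        total = 0.0
        for k in range(k_lo, k_hi):
            re = sum(samples[t] * math.cos(2 * math.pi * k * t / n) for t in range(n))
            im = -sum(samples[t] * math.sin(2 * math.pi * k * t / n) for t in range(n))
            total += math.hypot(re, im)
        energies.append(total)
    return energies

def drive_params(energies):
    """Map bass/mid/treble energies to hypothetical visual parameters in [0, 1]."""
    bass, mid, treble = energies
    peak = max(energies) or 1.0
    return {
        "warp_amount": bass / peak,    # low frequencies drive warping
        "movement":    mid / peak,     # mids drive motion speed
        "color_shift": treble / peak,  # highs drive color shifts
    }

# A 100 Hz test tone sampled at 8 kHz should land in the bass band.
rate = 8000
tone = [math.sin(2 * math.pi * 100 * t / rate) for t in range(512)]
bands = [(20, 250), (250, 2000), (2000, 4000)]  # bass / mid / treble in Hz
params = drive_params(band_energies(tone, rate, bands))
```

In the actual piece the per-band values are smoothed over time before driving the shader, so transients read as pulses rather than flicker.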

The ZIP archive for this project includes two audio tracks I used for testing.

You can change which file is playing by referring to the picture below:

Audio file instructions: Screenshot of the TouchDesigner interface

I have several ideas for how this piece could be presented at ARTECHOUSE. During a past visit to the Houston, TX location, I was immediately drawn to the Samsung transparent display installation showcasing layered data visualizations. I created this shader with those types of displays in mind — though, as shown in the header of this blog, it would work just as effectively with a projector setup.

Sound and user input aren’t the only data sources this program can respond to. I’m currently developing a version that visualizes data from IoT (Internet of Things) devices connected to a local network, opening up even more possibilities for real-time, responsive visual environments.

Misc

I want to give a sincere thank you to the reviewers and board for taking the time to check out my submission for SUBMERGE 2025. This looks like an awesome opportunity, and no matter the outcome, it’s been a really fun and rewarding project to work on.

Below are links to my social media where I post most of my work — they give a fuller picture of what I’ve been up to creatively.

Created by Arturo Villalobos. © 2023