4 March 2024
This entry collects my recent experiments connecting TouchDesigner and Stable Diffusion. Currently, I am using a component built by Lyell Hintz that connects a locally running instance of SD to TD via automatic1111's webui and its API, making it possible to send data back and forth between the two tools.
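I won't reproduce Lyell's component here, but the webui side of the exchange is roughly this when the webui is launched with the --api flag. This is only a minimal Python sketch of my own; the host, port, and parameter values are assumptions you would adjust to your setup:

```python
import base64
import json
import urllib.request

# Default local address of the automatic1111 webui (an assumption;
# adjust host/port to your own launch settings).
API_URL = "http://127.0.0.1:7860/sdapi/v1/img2img"

def build_img2img_payload(png_bytes, prompt, denoising_strength=0.55,
                          width=448, height=640, steps=20):
    """Wrap one PNG frame in the JSON body the img2img endpoint expects."""
    return {
        "init_images": [base64.b64encode(png_bytes).decode("ascii")],
        "prompt": prompt,
        "denoising_strength": denoising_strength,
        "width": width,
        "height": height,
        "steps": steps,
    }

def generate_frame(png_bytes, prompt):
    """POST one frame to SD and return the base64-encoded result image."""
    body = json.dumps(build_img2img_payload(png_bytes, prompt)).encode()
    req = urllib.request.Request(
        API_URL, data=body, headers={"Content-Type": "application/json"})
    with urllib.request.urlopen(req) as resp:
        return json.loads(resp.read())["images"][0]
```

The response carries the generated image back as base64 as well, which is what makes the round trip into TD straightforward.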
The possibilities of this connection are simply incredible. I personally intend to use SD more as an enhancement of TD, instead of letting SD do all the work alone. That means I can create complex (audiovisual) graphics in TD and make them look more interesting and more real with SD. Of course, I am also testing different prompt-only processes to acclimate myself to the new environment and to try different styles and options.
The amazing thing about setting it up the way I (and Lyell) have is that it essentially creates a feedback loop: the image SD spits out is sent back into SD to generate the following frame. While doing so, you can manipulate that input in any way, just like you would manipulate a texture in a TD-only feedback loop. Changing the scale slightly, for example, will zoom in or out of the texture.
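As a toy sketch of that loop (not the actual component): the last generated frame gets manipulated slightly, then handed back as the init image for the next frame. Here `zoom` stands in for a Transform TOP and `generate` for the img2img call:

```python
def zoom(pixels, factor):
    """Center-crop a square pixel grid; factor > 1 zooms in.
    Stands in for the Transform TOP you'd use inside TouchDesigner."""
    n = len(pixels)
    crop = max(1, round(n / factor))
    start = (n - crop) // 2
    return [row[start:start + crop] for row in pixels[start:start + crop]]

def feedback_step(frame, generate, factor=1.02):
    """One loop iteration: manipulate the previous output, diffuse it again."""
    return generate(zoom(frame, factor))
```

Repeating `feedback_step` frame after frame is what produces the infinite-zoom effect in the first tests below.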
Creating a base that runs on an independent timeline makes it possible to input video and use audio, as the whole process is not truly realtime. On my 3070, a frame takes around 2-5 seconds to generate at the 448x640 resolution I am currently working with, but because the process is asynchronous, TD keeps running smoothly at any specified FPS, like 30 or 60.
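The asynchronous part can be pictured as a worker that owns the slow SD calls so the render loop never blocks on them. Inside TD the component takes care of this; the plain-Python sketch below (the names are mine) only illustrates the pattern:

```python
import queue
import threading

def start_async_generator(generate, inputs, results):
    """Run slow generation on a worker thread: the render loop keeps
    pushing frames into `inputs` at full FPS and picks finished images
    out of `results` whenever they arrive, seconds later."""
    def worker():
        while True:
            frame = inputs.get()
            if frame is None:          # sentinel to shut the worker down
                break
            results.put(generate(frame))
    t = threading.Thread(target=worker, daemon=True)
    t.start()
    return t
```

The render loop never waits on `generate`; it just checks `results` each frame and keeps showing the last finished image until a new one lands.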
----
"The All Seeing Eye" For this first test, I was creating some surrealist art simply with prompts, zooming into the artwork infinitely.
"Forest Spirits" The second test is a very similar approach, but I composited some circles into the loop
Text Experiments Here, I was simply messing with adding text and other graphical elements and seeing how they connect to different prompts.
"Sonic Forest" My first proper audiovisual test. The idea here is that I used the audio spectrum and converted it to TOPs (a texture). The processed outcome of that already looked remotely like a forest, so SD served more of a "leveling up" of the already exitisting base artwork.
Track: Machinedrum - Inner Ear
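For the curious, "audio spectrum to texture" boils down to turning the magnitude of each frequency bin into a pixel value. In TD, the Audio Spectrum CHOP plus a CHOP-to-TOP do this for you; this naive Python sketch of my own only illustrates the mapping (a real patch would never compute the DFT by hand):

```python
import cmath

def spectrum_to_row(samples):
    """Map one window of audio samples to a row of greyscale pixels:
    naive DFT, keep the magnitude of each bin, normalize to 0-255."""
    n = len(samples)
    mags = []
    for k in range(n // 2):
        s = sum(samples[t] * cmath.exp(-2j * cmath.pi * k * t / n)
                for t in range(n))
        mags.append(abs(s))
    peak = max(mags) or 1.0
    return [int(255 * m / peak) for m in mags]
```

Stack one such row per frame and you get a scrolling spectrum texture, which is the kind of base image SD then "levels up".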
Waves This approach was very similar to the one above, but instead of using the audio spectrum, I drove the movement of noise with the audio level and turned it into an ocean with prompts. The base texture was already blue and, again, already looked remotely like water.
Track: Coton - Fasme
"Night Ride" Getting more sophisticated, but essentially I'm just using the audio spectrum again. I'm loving how SD can be used in a more abstract way.
Track: Lomboy - Loverboy
"City Levels" For this one, I actually built a rather complex texture in TD before using it in SD. So basically, I used TOPs to create a schematic city and audio analysis to drive to illumination of the windows. Adding some noise on top, quite a bit of of post processing into the loop and voilá!
Track: Lone - Oedo 808