These are some animated tileable procedural textures I made with the procedural texturing tool/engine that I also made (based on MiniOrk, of course…). The engine includes modules such as Image, Gradient, 2OpBlend, 3OpBlend, 2D Displacement Map, Heightmap to Normalmap, Spherical Reflection, Kaleidoscope, Cellular Noise, Octaves, etc. All modules run on the GPU with varying levels of support from the CPU, and they all animate in real-time.

The underlying tech is a generic multi-threaded heterogeneous (CPU/GPU) dataflow graph system which is also used in my particle engine, modular audio synthesizer, and modular terrain generator. I have been interested in dataflow graph based systems since I was a teenager. My interest was sparked by modular analog audio synthesizers, followed by TurboSynth, and later CSound and Generator/Reaktor. It made sense to me to try to apply the paradigm to the visual realm (as others have also done).

The dataflow graph system shown here has many parallels to a CSound-like system. For example, each plug type has a sampling frequency associated with it, such as “float-uniform”, “vect3-uniform”, or “image-perpixel”, which are analogous to CSound’s i-rate, k-rate and a-rate, or even GLSL/HLSL’s uniform, vertex and fragment rates.

These days I am considering layering in an LLVM based system: translating the dataflow graph into LLVM IR, letting LLVM optimize it, and then translating the resultant IR back to execute on the “dataflow machine”. I may even consider generating a parser to translate expressions directly into dataflow graphs, although I do have to say the tool makes it easier to animate and ‘explore’ procedural textures, so I think a textual/visual hybrid system would work best.
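
To give a rough idea of the rate-typed plug system described above, here is a minimal C++ sketch. The names (EvalRate, PlugType, Plug, Module) are hypothetical and not the actual MiniOrk API; it just shows how each plug can carry both a value type and a sampling rate, mirroring the CSound/GLSL analogy.

```cpp
// Hypothetical sketch (not the actual MiniOrk API): rate-typed plugs on a
// dataflow node, loosely analogous to CSound's i/k/a rates.
#include <string>
#include <vector>

// How often a plug's value is recomputed.
enum class EvalRate {
    Uniform,   // once per graph evaluation (cf. CSound i-rate, GLSL uniform)
    PerFrame,  // once per animation frame   (cf. CSound k-rate)
    PerPixel   // once per output pixel      (cf. CSound a-rate, fragment stage)
};

// What kind of value flows through the plug.
enum class PlugType { Float, Vec3, Image };

struct Plug {
    std::string name;
    PlugType    type;
    EvalRate    rate;              // e.g. {Float, Uniform} == "float-uniform"
    const Plug* source = nullptr;  // upstream connection, if any
};

// A module in the texture graph: named inputs/outputs plus a compute step.
struct Module {
    std::string       name;
    std::vector<Plug> inputs;
    std::vector<Plug> outputs;

    // In a real engine this would dispatch to CPU or GPU code depending on
    // the rates involved; per-pixel work would become a shader pass.
    virtual void compute() = 0;
    virtual ~Module() = default;
};
```

A 2OpBlend-style module, for instance, might expose two image-perpixel inputs and a float-uniform blend amount; the per-pixel plugs would end up as GPU work while the uniform plug is evaluated on the CPU and uploaded once.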
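
And to make the LLVM idea a little more concrete, here is a minimal sketch under stated assumptions: the two-node graph, the function name, and the emission of textual IR are all illustrative, not taken from the actual tool. A tiny graph computing (a * b) + c over float-uniform plugs lowers to a small IR function that LLVM's passes could fold and optimize before the result is mapped back onto the dataflow machine; a real system would build this with the IRBuilder API rather than printing text.

```cpp
// Hypothetical sketch: lowering a tiny dataflow expression into textual
// LLVM IR so LLVM could optimize it before it is translated back to the
// dataflow machine. Names here are illustrative only.
#include <cstdio>

int main() {
    // Imagine a two-node graph: out = (a * b) + c, all float-uniform plugs.
    std::puts(
        "define float @tex_eval(float %a, float %b, float %c) {\n"
        "entry:\n"
        "  %mul = fmul float %a, %b\n"
        "  %add = fadd float %mul, %c\n"
        "  ret float %add\n"
        "}\n");
    return 0;
}
```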