I’ve been spending a lot of time writing code, which hasn’t left much time for writing blog posts, so it’s time for a sweeping update covering the work I’ve done over the past few months.

Photo copyright Miguel Legualt

I continue to develop my visualization engine, Oculon, using Cinder. The focus has shifted from generating video content to creating a live performance system. In collaboration with Diagraf, we’ve created a hybrid setup that combines the output of Oculon with Resolume, allowing us to mix real-time generative content with video. We’ve used this setup to perform at several shows, most recently at Metropolis for MUTEK. I also used it solo for the first time, providing visuals for an RTS.FM podcast featuring Orbital Mechanics. This is a continual work in progress, with efforts divided between creating new visuals, improving the tech under the hood, and streamlining the user interface for live performance. What follows is a summary of some of the work in these areas.


It all started with Orbiter, but Oculon now has several visualization modules in various states of maturity (nothing is ever considered complete). The modules are mostly based on some combination of physics simulation, data analysis, and procedural rendering. My first experiments after the orbital simulation were with particle systems. I started with some basic implementations, then built a module based on Kyle McDonald’s binned particle system. Later I learned about GPU programming using GLSL and OpenCL and wrote a few of my own GPU-based particle systems. Using the GPU gives far better performance but also makes the code more complicated.
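To give a sense of what these systems do, here is a minimal CPU sketch of the kind of per-particle update that would run as a GLSL or OpenCL kernel in a GPU particle system. All names and constants are illustrative, not Oculon’s actual code.

```cpp
#include <algorithm>
#include <cmath>
#include <vector>

// Illustrative particle state: position and velocity in 2D.
struct Particle {
    float x, y;   // position
    float vx, vy; // velocity
};

// Pull one particle toward an attractor and integrate with an Euler step.
// On the GPU, this same logic runs once per particle, in parallel.
void updateParticle(Particle &p, float ax, float ay, float strength, float dt) {
    float dx = ax - p.x, dy = ay - p.y;
    float d2 = std::max(dx * dx + dy * dy, 0.01f); // clamp to avoid blow-ups
    float f = strength / d2;                       // inverse-square falloff
    p.vx += f * dx * dt;
    p.vy += f * dy * dt;
    p.x += p.vx * dt;
    p.y += p.vy * dt;
}

// Run a particle for a number of steps against an attractor at the origin.
Particle simulate(Particle p, int steps) {
    for (int i = 0; i < steps; ++i)
        updateParticle(p, 0.0f, 0.0f, 1.0f, 0.1f);
    return p;
}
```

The GPU versions store positions and velocities in textures or buffers and update every particle in parallel each frame, which is where the speedup comes from – and also where the extra complexity lives.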


After my particle system obsession I started looking into data visualization. I obtained data for earthquakes, solar activity, satellites, and so on. I haven’t found ways to use all of this data yet, but I continue to experiment with it, and every now and then something viable emerges. This also led to simpler visualizers that use only the audio signal and FFT analysis as inputs. It is very much an organic process – it starts with a simple algorithm or idea, and I keep tinkering with the code and variables until something I like emerges.
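The audio-driven visualizers boil down to mapping energy in a band of the spectrum to a visual parameter. Here is a toy sketch of that mapping, with a naive DFT standing in for the real FFT; the function names and parameter ranges are made up for illustration.

```cpp
#include <cmath>
#include <vector>

const float kPi = 3.14159265f;

// Naive DFT magnitude spectrum (a real implementation would use an FFT).
std::vector<float> magnitudeSpectrum(const std::vector<float> &samples) {
    size_t n = samples.size();
    std::vector<float> mags(n / 2);
    for (size_t k = 0; k < n / 2; ++k) {
        float re = 0.0f, im = 0.0f;
        for (size_t t = 0; t < n; ++t) {
            float angle = -2.0f * kPi * float(k) * float(t) / float(n);
            re += samples[t] * std::cos(angle);
            im += samples[t] * std::sin(angle);
        }
        mags[k] = std::sqrt(re * re + im * im) / float(n);
    }
    return mags;
}

// Average a band of bins and map the energy into a visual parameter range.
float bandToParam(const std::vector<float> &mags, size_t lo, size_t hi,
                  float minVal, float maxVal) {
    float sum = 0.0f;
    for (size_t k = lo; k < hi; ++k) sum += mags[k];
    float avg = sum / float(hi - lo);
    return minVal + std::min(avg, 1.0f) * (maxVal - minVal);
}

// Test helper: a pure sine landing exactly on one DFT bin.
std::vector<float> makeSine(size_t n, size_t bin) {
    std::vector<float> s(n);
    for (size_t t = 0; t < n; ++t)
        s[t] = std::sin(2.0f * kPi * float(bin) * float(t) / float(n));
    return s;
}
```

Feed the bass bins into a particle emission rate, or the high bins into a color shift, and the tinkering loop described above is mostly a matter of choosing which band drives which parameter.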


Oculon is a long way from working as a stand-alone VJ system, but we wanted to incorporate its real-time output into live shows instead of just using captured videos. We considered digital video mixers or using a capture card to route the output from one computer to another, but those solutions require expensive hardware. Syphon is an open-source library that routes the graphics output of one program to another on the same machine with little overhead. Luckily it works with both Resolume and Cinder, which made it possible to use Oculon as an input source inside Resolume. Each module renders directly to an FBO and gets syphoned to Resolume as a separate clip, which Resolume can then manipulate like any other video source.

The downside to this approach is that one computer has to run both Resolume and Oculon, and both programs are very CPU/GPU-intensive. However, I still prefer this setup because offloading all the mixing/effects work to Resolume saves me from having to implement all those features from scratch.


Another issue with running both Resolume and Oculon on one machine is interacting with both programs at the same time. I started with keyboard shortcuts until I ran out of keys, then added an on-screen UI using SimpleGUI. I tried using MIDI controllers, but they weren’t flexible enough to handle all the various modules and parameters and also didn’t provide enough feedback.

I decided to use OSC for communication, since the protocol is easy to implement and there are many OSC control surface apps available for iOS and Android. After trying several (I even considered rolling my own using Cinder for iOS), I chose TouchOSC for iPad. Lemur would be my top choice – the editor seems much more powerful – but TouchOSC wins on price ($5 vs $50). Everything I tried on Android was too unstable/unpolished (even TouchOSC for Android has no editor at all). So far this setup has proven very flexible since I can make a custom layout page for each module, send data back to display on the iPad, and relay OSC messages from Oculon to Resolume to control both programs with one interface.
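Part of what makes OSC so easy to work with is how simple the wire format is. As an illustration, here is what a single-float OSC 1.0 message looks like on the wire; the address "/oculon/gain" is made up for this example, and in practice an OSC library handles the encoding.

```cpp
#include <cstdint>
#include <cstring>
#include <string>
#include <vector>

// OSC pads every string to a multiple of 4 bytes with NUL terminators.
static void appendPaddedString(std::vector<uint8_t> &buf, const std::string &s) {
    buf.insert(buf.end(), s.begin(), s.end());
    size_t pad = 4 - (s.size() % 4); // always at least one NUL
    buf.insert(buf.end(), pad, 0);
}

// Encode an OSC message carrying one float argument:
// address pattern, then the type tag string ",f", then a big-endian float32.
std::vector<uint8_t> encodeOscFloat(const std::string &address, float value) {
    std::vector<uint8_t> buf;
    appendPaddedString(buf, address); // e.g. "/oculon/gain" (hypothetical)
    appendPaddedString(buf, ",f");    // type tags: one float follows
    uint32_t bits;
    std::memcpy(&bits, &value, sizeof bits);
    buf.push_back(uint8_t(bits >> 24)); // OSC numbers are big-endian
    buf.push_back(uint8_t(bits >> 16));
    buf.push_back(uint8_t(bits >> 8));
    buf.push_back(uint8_t(bits));
    return buf;
}
```

Relaying a message from Oculon to Resolume is then just a matter of receiving a packet like this on one UDP port and re-sending it (possibly with a rewritten address) to another.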


This project has grown a lot from the first iteration 8 months ago. It is now over 10,000 lines of code. One challenge continues to be balancing stability and performance while keeping the system flexible enough for experimentation. Using git has been very helpful, allowing me to branch the code for prototypes while always keeping a stable version I can fall back on in case of a show. I also use Trello to track and prioritize all my to-dos, ideas, and bugs.

Before every show I am usually frantically writing a lot of new code, sometimes minutes before showtime. After each show I go back, clean up all the hacks, and do some refactoring to make life easier in the future. I also take notes about what went poorly at each show and try to fix all the major issues before the next one. If I didn’t make an effort to reduce the technical debt, the project would eventually become unmaintainable, and I want to continue building on this platform for the foreseeable future.

Special thanks to Diagraf for all the opportunities to showcase this work with him. More to come!