
Orbital Mechanics in the Satosphere

On February 27, 2016, Orbital Mechanics performed a live immersive AV show in the Satosphere at the SAT for Nuit Blanche.

This was our first show in a dome and my first VJ performance in one. Much of the content I used was created by Diagraf, who has a lot of experience building visuals for domes, but I also managed to port some Oculon material over after overcoming a few challenges. In this post I’ll describe the process I used for building visual content for the dome.


Domemaster

First, it’s important to understand how the dome works and what it expects as input.

Satosphere setup

To create an immersive surround image, the dome uses multiple projectors to project onto a spherical surface. The business of managing the multiple projectors, splicing and overlapping images, and spherical warping is all handled internally by the Satosphere. You output to the dome as if it were a regular external monitor, and the software responsible for outputting your content (in my case, Resolume) still pushes out a single rectangular image.

The difference is that this image needs to be in domemaster format. Domemaster is a fisheye-distorted spherical projection: a 2D image that becomes an immersive environment when projected onto the inside of a sphere. For example:

Fisheye/domemaster example

When passed through the dome system, images like this are “unwarped” and spliced up among the projectors, creating an image on the inner surface of the dome that surrounds you.
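To make the format concrete, here’s a minimal sketch of the mapping from a domemaster pixel to a viewing direction. The conventions are assumptions for illustration (image centre at the zenith, equidistant fisheye, 180-degree coverage); real dome specs vary, and as described below, the Satosphere’s coverage extends somewhat past a simple hemisphere.

```python
import numpy as np

def dome_pixel_to_direction(px, py, size, fov_deg=180.0):
    """Map a pixel of a square domemaster image to a 3D view direction.

    Assumed conventions: image centre = zenith (straight up), edge of the
    fisheye circle = horizon, equidistant fisheye projection, +z is up.
    """
    # Normalized coordinates in [-1, 1], with (0, 0) at the image centre.
    x = 2.0 * (px + 0.5) / size - 1.0
    y = 2.0 * (py + 0.5) / size - 1.0
    r = np.hypot(x, y)
    if r > 1.0:
        return None  # outside the fisheye circle: unused corner pixel
    azimuth = np.arctan2(y, x)              # angle around the dome
    zenith = r * np.radians(fov_deg) / 2.0  # 0 = straight up, pi/2 = horizon
    return np.array([np.sin(zenith) * np.cos(azimuth),
                     np.sin(zenith) * np.sin(azimuth),
                     np.cos(zenith)])       # unit vector
```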

I had previously made some content for the Satosphere, but I “cheated” by making 2D radial designs that didn’t reveal the unwarping applied to them. Making real 3D immersive content was a new challenge.

Rendering actual domemaster output requires a lot of processing power, and I couldn’t do it in real time with Oculon. Normally I perform with Oculon running live, generating and improvising the material as I go, but for this show I needed to pre-render video clips of all the content I wanted to use. I lost the ability to react to audio input or change simulation parameters on the fly. The only option was to build a pipeline for making domemaster video clips with Oculon.

3D world → hemicube frames

Normally, a virtual 3D environment is represented on your flat 2D screen by placing a virtual camera in the 3D world and rendering what it sees (in one particular direction) to a flat image.

To create an immersive view of a virtual 3D world, you need to be able to see in every direction. In the dome, you can look 360 degrees around you horizontally, and 180-220 degrees vertically from front to back over the top. You can’t look down, since the floor is not part of the projection surface.

Fisheye projection

One way to build this image is to stitch together a view of the scene from every direction (top, front, back, left, right), creating a hemicube of images, then apply the fisheye distortion to it. This is not the only method, but it was the most straightforward way to do it with my existing render pipeline. This method is described in more detail here.

Hemicube to domemaster

Now I had to render 5 different camera angles for each frame of video. I built a 5-camera rig, using code from ofxDomemaster as a starting point. There is a “master” camera, which moves and looks around the scene, and 4 other cameras attached to it, each looking along one of the remaining perpendicular directions. Each frame is rendered 5 times, once per camera, and saved to disk as a PNG. This process is extremely time-consuming, with render times over 1 second per frame: at 30 fps, a 5-minute clip takes nearly 3 hours to render. And that’s only the first step…
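The real rig lives in Oculon’s C++ code (adapted from ofxDomemaster), but the geometry is simple enough to sketch. Each face camera shares the master’s position, renders a 90-degree field of view, and differs from the master only by a fixed rotation. A rough sketch of the five view directions, with axis conventions that are my own assumption:

```python
import numpy as np

# Forward vector of each face camera in the master camera's local frame.
# Assumed axes for this sketch: +y forward, +x right, +z up.
LOCAL_FORWARDS = {
    "front": np.array([0.0, 1.0, 0.0]),   # same direction as the master
    "back":  np.array([0.0, -1.0, 0.0]),
    "left":  np.array([-1.0, 0.0, 0.0]),
    "right": np.array([1.0, 0.0, 0.0]),
    "top":   np.array([0.0, 0.0, 1.0]),   # through the zenith; no "down" camera
}

def face_directions(master_to_world):
    """World-space view direction of each 90-degree face camera, given the
    master camera's 3x3 rotation matrix (local -> world)."""
    return {name: master_to_world @ f for name, f in LOCAL_FORWARDS.items()}

# Example: master camera yawed 45 degrees around the vertical axis.
yaw = np.radians(45)
Rz = np.array([[np.cos(yaw), -np.sin(yaw), 0.0],
               [np.sin(yaw),  np.cos(yaw), 0.0],
               [0.0,          0.0,         1.0]])
for name, d in face_directions(Rz).items():
    print(f"{name:>5}: {np.round(d, 3)}")
```

Each of the five views is rendered to its own offscreen buffer and written out as a PNG per frame, which is where the 5x render cost comes from.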

Hemicube frames → domemaster frames

Now I had a folder full of PNG files that looked like this:

hemicube frames

Next I needed to stitch these images together and warp them into a single domemaster fisheye image (5 hemicube frames → 1 domemaster frame). This can be done with After Effects plugins like Fulldome, but that software is very expensive. Luckily I found an open-source command-line tool that does the job: domemaster.
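For intuition about what this stitching step computes (whichever tool performs it): every pixel of the output fisheye image maps to a view direction, and the dominant axis of that direction picks which hemicube face to sample. A slow, unoptimized sketch, reusing `dome_pixel_to_direction` from the earlier sketch and glossing over the per-face mirroring fixes a seamless result needs:

```python
import numpy as np

def sample_face(direction):
    """Pick which hemicube face a view direction lands on, plus (u, v)
    coordinates in [0, 1] within that face's 90-degree render.

    Same assumed axes as the earlier sketches: +y forward, +x right,
    +z up. Directions below the horizon (z < 0) never occur for a
    180-degree dome, so they aren't handled.
    """
    x, y, z = direction
    ax, ay, az = abs(x), abs(y), abs(z)
    if az >= ax and az >= ay and z > 0:   # dominant axis picks the face
        face, u, v = "top", x / z, y / z
    elif ay >= ax:
        face = "front" if y > 0 else "back"
        u, v = x / ay, z / ay
    else:
        face = "right" if x > 0 else "left"
        u, v = y / ax, z / ax
    return face, (u + 1.0) / 2.0, (v + 1.0) / 2.0

def stitch_domemaster(faces, size):
    """faces: dict of face name -> HxWx3 uint8 array (the 5 renders).
    Returns a size x size fisheye frame, nearest-neighbour sampled."""
    out = np.zeros((size, size, 3), dtype=np.uint8)
    for py in range(size):
        for px in range(size):
            d = dome_pixel_to_direction(px, py, size)  # from the first sketch
            if d is None:
                continue  # corners outside the fisheye circle stay black
            face, u, v = sample_face(d)
            h, w = faces[face].shape[:2]
            out[py, px] = faces[face][int(v * (h - 1)), int(u * (w - 1))]
    return out
```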

I rendered frames at 2K resolution and was constantly battling disk space. I learned the hard way that the domemaster tool doesn’t warn you about insufficient space: it just keeps processing images for hours while failing to save any! After processing the frames I ended up with a folder of domemaster frames:

domemaster frames

Processing frames is faster than rendering them, but still slow (over an hour for a 5-minute clip). Since domemaster didn’t take advantage of multiple cores I could run 4 instances at once, but then disk I/O became the bottleneck. Lost in a sea of technical difficulties, and still not done…
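The sharding itself was simple: split the frame sequence into contiguous ranges and run one instance per range, along the lines of the sketch below. The `domemaster_cmd` invocation is a hypothetical placeholder, not the real tool’s CLI:

```python
import subprocess
from concurrent.futures import ThreadPoolExecutor

FRAMES = 9000   # e.g. 5 minutes at 30 fps
WORKERS = 4     # roughly one per core; beyond that, disk I/O is the limit

def process_shard(bounds):
    """Run one stitching instance over a contiguous range of frames.
    NOTE: 'domemaster_cmd' and its flags are hypothetical placeholders."""
    start, end = bounds
    subprocess.run(["domemaster_cmd", "--from", str(start), "--to", str(end)],
                   check=True)

shard = (FRAMES + WORKERS - 1) // WORKERS
ranges = [(i * shard, min((i + 1) * shard, FRAMES)) for i in range(WORKERS)]
with ThreadPoolExecutor(max_workers=WORKERS) as pool:
    list(pool.map(process_shard, ranges))
```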

Domemaster frames → video

oculon-lines domemaster

With a folder full of domemaster frames, I still needed to produce a video file that I could load into Resolume and VJ with. This was done with the ffmpeg command-line tool. The videos were encoded to Resolume’s hardware-accelerated DXV format for best playback performance. I wrote a crude script to automate all these post-processing commands, and there was a lot of trial and error before the entire pipeline was nailed down.
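The frame-assembly step looks roughly like the sketch below. Paths, pattern, and frame rate are illustrative placeholders, and the codec flag is hedged: whether a given ffmpeg build can encode DXV directly varies, so this sketch writes a ProRes intermediate and leaves the final DXV conversion to a tool like Resolume’s Alley:

```python
import subprocess

# Illustrative placeholders; the real script's paths and rate differed.
FRAME_PATTERN = "frames/dome_%05d.png"
FPS = 30

# Assemble the numbered PNG frames into a movie with ffmpeg, encoding to
# ProRes as a stand-in intermediate for the DXV step described above.
subprocess.run([
    "ffmpeg", "-y",
    "-framerate", str(FPS),
    "-i", FRAME_PATTERN,
    "-c:v", "prores_ks",
    "clip.mov",
], check=True)
```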

Preview

Since I had no access to the dome before the show, I had no idea how my material would look! All I could see was this:

domemaster preview

The SAT has a preview tool called DomePort, which I used to preview my domemaster video clips in a virtual dome. It shows how the video looks once it’s unwarped back onto a sphere, and lets you look around the virtual room. It also accepts Syphon input, so I could preview how multiple dome videos mixed in Resolume would look. Due to the lengthy render process I still had to make a lot of blind guesses, since I couldn’t afford many trial-and-error passes.

Final Thoughts

Overall the Nuit Blanche show was a great success and I had a lot of fun performing at the SAT. However, I think there’s a lot of room for improvement in the content-creation process, to make it less tedious and time-consuming and to allow greater artistic flexibility. If I were to do this more often I would at least invest in a powerful rendering machine, since my MacBook Pro could barely manage it. I would love to one day plug Oculon directly into a dome and run everything in real time, but that is probably a long way (and a lot of expensive hardware) off. This work has also gotten me excited about virtual reality content, since dome immersion and VR immersion have a lot of parallels. I hope to invest more time in improving my capabilities in this area in the near future.