The Satosphere is a 360-degree dome display at the top of the Société des arts technologiques (SAT) building in Montréal.

This March, for Montréal’s annual Nuit Blanche festival, Diagraf performed inside the dome, and I had the opportunity to generate some pre-rendered clips for him using the visualization engine I am building.

Making videos for the dome presented several challenges, starting with the very high input resolution it requires (2600x2600 pixels in my case, though it can vary). My MacBook Pro can’t output at that resolution, and even if it could, I have no display that could show it.

To get around this, the scene is rendered to an FBO (framebuffer object) instead of the window. The FBO matches the output resolution, while a scaled-down version is drawn to the screen as a texture so I can preview what’s happening.
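The idea can be illustrated with a minimal numpy sketch, standing in for the actual OpenGL setup: the full-resolution array plays the role of the FBO’s texture attachment, and average pooling plays the role of drawing the scaled texture to the window. The radial-gradient “scene” is just a placeholder for the particle render pass.

```python
import numpy as np

FULL = 2600     # offscreen (FBO) resolution used for the dome
PREVIEW = 650   # window-sized preview; FULL must divide evenly by it

def render_scene(size):
    """Placeholder for the particle render pass: a radial gradient."""
    ys, xs = np.mgrid[0:size, 0:size]
    cx = cy = (size - 1) / 2.0
    r = np.hypot(xs - cx, ys - cy)
    return np.clip(1.0 - r / (size / 2.0), 0.0, 1.0)

def downsample(img, out_size):
    """Average-pool the full-res buffer down to the preview size."""
    k = img.shape[0] // out_size
    return img.reshape(out_size, k, out_size, k).mean(axis=(1, 3))

full = render_scene(FULL)            # "render to the FBO"
preview = downsample(full, PREVIEW)  # "draw scaled texture to window"
```

The key point is that the capture path reads from the full-resolution buffer, so the preview quality never limits what ends up on disk.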

The higher resolution also meant increasing the particle count in the simulation to maintain visual density, which reduced the framerate.

It got much worse when I started capturing every frame to disk, to the point where I couldn’t reliably manipulate the particles in real time. This led me to automate the whole process: applying patterns of forces to the particles with framerate-independent timing, capturing the frames, and running a shell script that generates the videos with ffmpeg.
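Framerate-independent timing here boils down to advancing the simulation by a fixed timestep per captured frame, rather than by wall-clock time, so a slow capture run produces exactly the same motion as a fast one. A minimal sketch, where the force pattern and single-particle state are hypothetical stand-ins for the real simulation:

```python
import math

FPS = 30
DT = 1.0 / FPS   # fixed timestep: exactly one frame of output video

def force_pattern(t):
    """Hypothetical scripted force: a slowly rotating pull."""
    return (math.cos(t), math.sin(t))

def step(pos, vel, t, dt=DT):
    """Advance one particle by dt, independent of render speed."""
    fx, fy = force_pattern(t)
    vel = (vel[0] + fx * dt, vel[1] + fy * dt)
    pos = (pos[0] + vel[0] * dt, pos[1] + vel[1] * dt)
    return pos, vel

pos, vel, t = (0.0, 0.0), (0.0, 0.0), 0.0
for frame in range(90):   # 3 seconds of output at 30 fps
    pos, vel = step(pos, vel, t)
    t += DT
    # ...render and write this frame to disk here...
```

Because `t` only ever advances by `DT`, the sequence of frames is fully deterministic no matter how long each one takes to render and save.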

Particles are especially sensitive to lossy video compression. Based on a few tests, H.264 gave the best combination of file size and image quality. Installing ffmpeg with H.264 support required some effort.
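For reference, an invocation along these lines turns a numbered frame sequence into an H.264 video; the exact flags and paths below are illustrative rather than the ones from my script. `-crf` controls the quality/size trade-off (lower is better quality), and `yuv420p` keeps the file playable in common players.

```python
# One way to assemble numbered frames into an H.264 video with ffmpeg.
frame_rate = 30
cmd = [
    "ffmpeg",
    "-framerate", str(frame_rate),   # input frame rate
    "-i", "frames/frame_%05d.png",   # numbered frame sequence (hypothetical path)
    "-c:v", "libx264",               # H.264 via the libx264 encoder
    "-pix_fmt", "yuv420p",           # broad player compatibility
    "-crf", "18",                    # near-lossless quality setting
    "output.mp4",
]
# subprocess.run(cmd, check=True) would execute it
```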

Another challenge of generating content for a dome is the shape of the projection surface. I cheated in this case and simply made every aspect of the particle simulation circular. Even though the videos I generated were not warped for a spherical projection, everything was centered and symmetrical, so the distortion of the image was not noticeable.
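Keeping everything circular starts with spawning particles uniformly over a disk. The standard trick is to take the square root of a uniform variate for the radius; without it, particles cluster toward the centre. A small sketch, with a hypothetical particle count and a radius matching half the 2600-pixel frame:

```python
import math
import random

def spawn_in_disk(n, radius, seed=42):
    """Spawn n points uniformly distributed over a disk of given radius."""
    rng = random.Random(seed)
    pts = []
    for _ in range(n):
        r = radius * math.sqrt(rng.random())  # sqrt => uniform area density
        a = rng.uniform(0.0, 2.0 * math.pi)
        pts.append((r * math.cos(a), r * math.sin(a)))
    return pts

particles = spawn_in_disk(1000, 1300.0)  # 1300 px = half of 2600
```

With spawn positions, forces, and boundaries all defined in polar terms around the frame centre, the output stays radially symmetric by construction.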

Here is a scaled down screenshot from one of the video clips:

sample screenshot

You can imagine how projecting this flat image onto the inside of a sphere wouldn’t affect the overall geometry by much. Displaying a 3D scene properly would require a fish-eye camera to capture the full 360-degree view, as explained in this paper.
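The fish-eye mapping itself is compact: in an equidistant (angular) fisheye, the distance of a pixel from the image centre is proportional to the angle between the view direction and the dome’s zenith. A sketch of that mapping, assuming a hemispherical (180-degree) field of view with +Z pointing at the zenith; the function name and defaults are illustrative:

```python
import math

def dome_master_uv(direction, size=2600, fov_deg=180.0):
    """Map a 3D view direction to pixel coordinates in an equidistant
    fisheye image. +Z is the dome zenith, which lands at the centre."""
    x, y, z = direction
    norm = math.sqrt(x * x + y * y + z * z)
    theta = math.acos(z / norm)             # angle away from the zenith
    phi = math.atan2(y, x)                  # azimuth around the zenith
    r = theta / math.radians(fov_deg / 2)   # 0 at centre, 1 at the rim
    cx = cy = size / 2.0
    return (cx + r * cx * math.cos(phi),
            cy + r * cy * math.sin(phi))

center = dome_master_uv((0.0, 0.0, 1.0))  # zenith maps to image centre
```

A renderer would evaluate this per pixel (or per cube-map sample) to build the warped “dome master” frame instead of a flat image.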

I see a lot of potential in the Satosphere and look forward to another chance to make something for it. Thanks to Diagraf for using my clips in his VJ set, and to the SAT for creating this amazing venue.