Since Google released their #deepdream algorithm I’ve been totally fascinated with the images you can generate.
The principle is futuristic but simple:
You take a computer version of a neural network (like our brain) and train it to recognize objects in photos.
You give it a new photo and “ask” it what it recognizes in there. And then you tell it to enhance the photo, so that it looks more like what it thinks it sees – and repeat that over and over. This makes visible for us what the computer recognizes in the photos. We see its thoughts – its dreams.
Obviously the network was trained to recognize a lot of different dog breeds… 😉
Of course the next logical step is to do this with videos.
But the results are either jumpy or just step through the different “presets” (layers) of the algorithm.
Which looks kind of nice, but I wanted to decide myself where the zoom journey goes and only zoom to whatever part of the dream looks promising.
To achieve this I set up my PC to do high-resolution deepdreams – 14 GB of RAM are needed to generate a resolution of 2560×1440.
I then took one of my Mandelbrot fractals as the source and started dreaming. I then looked at the result and used Paint to zoom 50% into a nice location. (Pic 2)
And let the algorithm dream about that picture.
This process was repeated many times until I had a zoom sequence of 43 images. So the final zoom depth is 2^42 – what a nice coincidence that the universe looks back at you at 42!
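The arithmetic behind that zoom depth can be sketched in a few lines (the variable names are mine, just for illustration): each manual Paint step crops to 50%, i.e. zooms in by a factor of 2, so 43 frames give 42 doubling steps between the first and the last image.

```python
# Each guided zoom step crops the picture to 50%, i.e. zooms in by a factor of 2.
# With 43 still images there are 42 such steps between the first and last frame.
frames = 43
zoom_per_step = 2

total_zoom = zoom_per_step ** (frames - 1)
print(total_zoom)  # 4398046511104, roughly 4.4 * 10^12
```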
The problem now was to get these static images into a smooth video zoom. This wasn’t an easy task, because the zoom center isn’t the middle of the picture like in the usual deepdream videos, but the guided location that I chose deliberately.
This was beyond my abilities so I asked my good pal and VFX-artist Matthias Rohrbach for help.
Animation & Post Production
And he found a way.
MR: “At first I started straightforwardly, adding the dreamed pictures one by one and zooming manually within the compositing workspace. But I soon hit the resolution limit that any video editing program can handle. I realized I had to find a different approach to handle a zoom with a destination width of 2^42 × 1280 = 5,629,499,534,213,120 pixels.
The solution was to do this procedurally, with a logarithmic formula.
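The quote doesn’t spell out the exact formula, but the idea can be sketched like this (all names and parameter values below are assumptions, not Matthias’ actual setup): let the zoom scale grow exponentially with time, so the perceived zoom speed stays constant, and use the logarithm of the current scale to pick which of the 43 source images is on screen.

```python
import math

# Sketch of a procedural zoom: scale grows exponentially over time,
# so the viewer sees a constant zoom speed. FPS and SECONDS_PER_STEP
# are illustrative assumptions, not values from the actual project.
TOTAL_STEPS = 42       # 42 doublings across the whole sequence
FPS = 25
SECONDS_PER_STEP = 2   # how long one 2x zoom takes on screen

def scale_at_frame(frame):
    """Current zoom scale; exponential growth = constant perceived speed."""
    doublings = frame / (FPS * SECONDS_PER_STEP)
    return 2.0 ** doublings

def source_image_index(frame):
    """Which of the 43 still images covers the current scale."""
    return min(int(math.log2(scale_at_frame(frame))), TOTAL_STEPS)
```

This way no frame ever needs the full 2^42 × 1280-pixel canvas: at any moment only the one or two stills around the current scale are composited.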
After that the project began to develop its own dynamic, and in the discussions with chillheimer more and more ideas came up on how to bring that relatively static zoom ride more alive. Finally we ended up with some playful special effects and a philosophic message that framed the whole journey.
I really love such projects where you don’t have to aim towards a strict goal – just stay in a continuous flow of communication with your creation while it evolves, and surprise yourself with a result you never would have expected in the beginning.”
The Music – Remix of Jefferson Airplane “White Rabbit”
When I saw the first results from Matthias it became clear that this would need a very special soundtrack.
And after several failed composition attempts, I remembered this remix I did of Jefferson Airplane’s “White Rabbit” back in 2000.
I had cut the original song into small pieces, filtered out unwanted parts (as far as that was technically possible 15 years ago), and then added new beats and improvised on it with current sounds.
To make it fit the video I rearranged that 2000 version and added new elements and SFX to enhance the experience.
You can download the track from my SoundCloud page:
If you enjoyed the trip, let me know in the comments and perhaps give a thumbs up or share on Facebook.