For the third and final project in the series, I linked multiple videos sharing the same audio track, played back on the projector and on all of the other lab computers. The video files were a realization of a journey described by Air Liquide's 'Liquid Air', rendered in Blender. All 26 camera renders were unique views of the same world.
The easiest way to explain this in writing is that I turned the physical lab into a vessel where the monitors became windows into this outside world and the projector became the main windshield.
This was made possible by reusing project 2's code. Project 2's code had been built with additional, easily changeable parameters precisely so that this time I could spend all of my time on the visual component.
I spent a month planning and sculpting this world and an entire weekend rendering out 26 × 2,000 frames (52,000 in total) in an 18-computer lab.
Below is the in-class documentation for the project.
For this project I decided to scrap all audio-based ideas and go for the video-based system that I had dreamed of being my final project. I wanted to create a video loosely based on 1st Ave Machine's Sixes Last, set to my own music so that I could pass the MIDI information to Blender for easy animation. My cousin and I were set to co-produce from a 4-hour distance by sending each other loops created in an array of programs (Ableton, FL Studio, and an assortment of plugins). But the time I had allocated for the song slowly diminished, and I was left with a week burned trying to create it. So I scrapped the idea and turned to my Zune for a hopefully perfect song. Playing in alphabetical order, the first song that came up was "a1. Air Liquide - Liquid Air," the first song I really heard at my first field party, and the same song that introduced me to the fact that DJs are incredibly friendly people who are immersed in and one with the crowd.
Very few segments of code were changed, if any. But here is the link to the zipped archive. Note: do run the "ntpd -q" command in a terminal on every machine before a presentation. The computers, especially in the ACTLab, tend to lose sync after about 10 hours. The drift is small, but enough to notice.
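A minimal sketch of that pre-show step, assuming the lab machines are reachable over SSH; the hostnames and machine count below are placeholders, not the actual ACTLab setup:

```python
# Hedged sketch: step every lab machine's clock once before the show.
# Assumes SSH access to each machine; hostnames are placeholders.
import subprocess

LAB_MACHINES = [f"lab-{n:02d}" for n in range(1, 19)]  # hypothetical names

for host in LAB_MACHINES:
    # "ntpd -q" syncs the clock against the configured NTP servers once,
    # then exits (it typically needs root privileges).
    subprocess.run(["ssh", host, "ntpd -q"], check=True)
```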
This code now runs on a 20-second interval. Send the play signal only after a 4-reset code has been sent and established (test this by running a synchronous test first to see when the screens return to black). Then, when the system time hits roughly 5 seconds after any 20-second interval (0, 20, 40), send the 9-play code. This gives the computers 4 seconds of leeway to check the server data. At the next 20-second interval the video should play in sync, with at most about a 10-millisecond delay.
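Below is a minimal sketch of that timing, assuming a plain TCP connection to the sync server; the address, port, and wire format are placeholders, while the 4-reset code, 9-play code, and 20-second interval come from the description above:

```python
# Sketch of the reset/play timing described above. Only the 4 (reset) and
# 9 (play) codes and the 20-second interval are from the project; the
# server address and payload format are assumptions.
import socket
import time

SERVER = ("192.168.1.10", 5000)  # hypothetical sync server
INTERVAL = 20                    # seconds between sync boundaries
SEND_OFFSET = 5                  # send the play code ~5 s after a boundary

def wait_until(offset):
    """Block until the system clock is `offset` seconds past a 20-second boundary."""
    while True:
        if abs((time.time() % INTERVAL) - offset) < 0.05:
            return
        time.sleep(0.01)

def send_code(code):
    with socket.create_connection(SERVER) as s:
        s.sendall(str(code).encode())

send_code(4)             # 1. establish the reset state; screens should go black
wait_until(SEND_OFFSET)
send_code(9)             # 2. play code goes out ~5 s past a boundary, leaving the
                         #    clients time to poll the server data before the next
                         #    20-second mark (0, 20, 40), when playback starts
```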
Again I used Blender for this project. I no longer messed with the game engine or did much coding within Blender, but I did write a few scripts, again located under the text panel, and used the video sequence editor that I had previously been hesitant to use. I went straight into the project by analyzing all the lyrics of Liquid Air and finding pictures to think about. I progressed line by line and created many of the environments, save for the ending Stonehenge models.
Most of this was familiar Blender work for me, since I have consulted on lawyer mediation settlement cases by creating to-scale recreations of actual figures and numerical data. This project, though, exercised my right-brain creativity, since I was creating things that I could only partially visualize and never touch. That is perhaps why it was my most heartfelt project, given all the inner turmoil I overcame in figuring out how to relate internally to an external medium.
In the end, I loved how it turned out and only wished I had had more time to create the full video, with each of the lyrics mapped and my own ideas laid out during the middle 6 minutes.
Here are a few of my favorite frames from the 6-minute middle segment; each lasted only a fraction of a second, but I really enjoyed them and wanted to showcase them:
The rendering for these videos took place in the Division of Instructional Innovation and Assessment's New Media Lab. It took all of the computers, including two office computers, rendering to a central location as a series of PNGs. Much of the time each computer worked on a single image sequence, but by using Blender's "Touch" option I was able to have all of the computers work on the same image sequence on multiple occasions.
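A hedged sketch of how each machine could be pointed at the shared job from the command line; the .blend path, output pattern, and frame range are placeholders, and the "Touch"/"No Overwrite" options themselves are checkboxes saved inside the .blend's render settings, which is what lets one machine skip frames another machine has already claimed:

```python
# Sketch of the per-machine launch, assuming a shared .blend and a shared
# output directory on the Z: drive (paths and frame range are placeholders).
import subprocess

BLEND_FILE = r"Z:\Joaquin\LiquidAir.blend"  # hypothetical shared scene file
OUTPUT = r"Z:\Joaquin\Cam.000\####"         # shared PNG sequence, #### = frame number

subprocess.run([
    "blender", "-b", BLEND_FILE,  # run headless
    "-o", OUTPUT,                 # render to the central location
    "-F", "PNG",                  # image format
    "-s", "1", "-e", "2000",      # frame range for this camera
    "-a",                         # render the animation
], check=True)
```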
Then, after all the images were in place, I ran a batch script to convert the image sequence to its video:
ffmpeg-0.5\ffmpeg -y -i LiquidAirEarlyOut.wav -i Z:\Joaquin\Cam.000\%%4d.png -s 768x480 -f mp4 -vtag xvid -ab 192k -vcodec libx264 -b 9000k -vpre "C:\tmp\imgs\ffmpeg-0.5\ffpresets\libx264-blender.ffpreset" -threads 0 -r 25 Early_000.mp4
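Since there were 26 camera directories, the same call can be wrapped in a small loop. This is only a sketch that reuses the exact flags above; the per-camera directory and output names are assumed from the Cam.000 / Early_000.mp4 example:

```python
# Sketch: run the ffmpeg command from the batch script once per camera.
# Encoding flags are copied verbatim; the Cam.NNN / Early_NNN.mp4 naming
# is assumed from the single example above.
import subprocess

FFMPEG = r"ffmpeg-0.5\ffmpeg"
PRESET = r"C:\tmp\imgs\ffmpeg-0.5\ffpresets\libx264-blender.ffpreset"
AUDIO = "LiquidAirEarlyOut.wav"

for cam in range(26):
    subprocess.run([
        FFMPEG, "-y",
        "-i", AUDIO,                                 # shared audio track
        "-i", rf"Z:\Joaquin\Cam.{cam:03d}\%4d.png",  # per-camera PNG sequence
        "-s", "768x480", "-f", "mp4", "-vtag", "xvid",
        "-ab", "192k", "-vcodec", "libx264", "-b", "9000k",
        "-vpre", PRESET, "-threads", "0", "-r", "25",
        f"Early_{cam:03d}.mp4",
    ], check=True)
```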
The presentation went a lot more smoothly than project 2's, primarily because of the code change that based synchronization on the clients rather than on the servers.
I started by setting a wallpaper loop of my favorite images with a theta wave playing in the background to try, in some sense, to numb the audience and cleanse their mental palate. Even if the binaural beat was lost, the bass hum definitely left a soothing numbness. There was a slight echo that I adjusted to 10 milliseconds to help give the ambiance of the music bouncing off the trees surrounding the same field that I visited when I first encountered this song.
Unique IDs (*See appended note.):
NOTE: I included the ffmpeg script to show the exact settings that I used. This was NOT a fast-streaming encode, because I wanted to make sure the entire video was loaded up front to prevent server drop-offs during the actual presentation. The bitrate is also incredible overkill, but I wanted to allow the audience, at the actual time of the presentation, to wander around and enjoy the animation from a variety of distances at near-perfect resolution.
*The documentation is the website, the video was just part of the installation.
One video was recorded during the actual presentation, but a single camera could not capture the full effect, so it was not included. Update: a 3D reconstruction video in Blender is planned.