VR Music Visualizer (Work in Progress)
I have been working on a music visualizer for VR headsets, inspired by the visualizers that used to come with desktop music players such as iTunes. While those tended to work best with a strong beat, this one is designed for music with slower or less regular rhythms. Here is a 2D demo of my most recent version.
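The VR build itself isn't posted here, but the general idea behind visuals that suit slow or irregular music, driving the imagery from a smoothed loudness envelope rather than beat detection, can be sketched in a few lines of Processing with the bundled Minim audio library. This is only an illustrative sketch, not the actual visualizer; the smoothing constant and size mapping are arbitrary placeholder values.

```
import ddf.minim.*;

Minim minim;
AudioInput in;
float level = 0;                            // smoothed loudness envelope

void setup() {
  size(600, 600);
  minim = new Minim(this);
  in = minim.getLineIn(Minim.MONO, 1024);   // listen to the default audio input
  noStroke();
}

void draw() {
  fill(0, 20);                              // translucent overlay so shapes leave fading trails
  rect(0, 0, width, height);
  // Ease toward the current RMS level instead of reacting to individual beats
  level = lerp(level, in.mix.level(), 0.05);
  float d = map(level, 0, 0.3, 20, width);  // quiet -> small circle, loud -> fills the frame
  fill(120, 180, 255, 80);
  ellipse(width / 2, height / 2, d, d);
}
```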
Volumetric Cinema
In 2019 I was part of a team based at the MIT Media Lab experimenting with the Voxon VX1, a volumetric display that creates a small 3D image viewable from any angle without special glasses. The team is led by a Harvard professor who wants to make an art film using this display, and my role has been building some of the tools he will need to do that, using the Unity game engine and coding in C#. His script calls for "ambient beautiful patterns," so I have created many of those. See the video below for some highlights.
EyeOn 
I wrote a program called "EyeOn" that moves video loops in time with the ambient music in a venue, using Processing (a Java-based programming language for artists created at the MIT Media Lab). I had the opportunity to try it out in a real venue as the VJ for a series of arts events at a nightclub in Boston. I started by making loops from archival footage (second video below) and later created my own loops in collaboration with visual artist Colette Aimee (first video below).
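EyeOn itself isn't reproduced here, but the core idea, letting the room's audio level drive how a video loop plays back, can be illustrated with a minimal Processing sketch using the Video library (installed via the library manager) and the bundled Minim library. Mapping loudness to playback speed is just one possible mapping; the file name and the constants are placeholders, not values from EyeOn.

```
import ddf.minim.*;
import processing.video.*;

Minim minim;
AudioInput in;
Movie clip;

void setup() {
  size(640, 360);
  minim = new Minim(this);
  in = minim.getLineIn(Minim.MONO, 512);
  clip = new Movie(this, "loop.mov");   // placeholder clip in the sketch's data folder
  clip.loop();
}

void movieEvent(Movie m) {
  m.read();                             // grab new frames as they arrive
}

void draw() {
  // Louder room -> faster loop; quiet room -> near standstill
  float speed = map(in.mix.level(), 0, 0.3, 0.1, 2.0);
  clip.speed(constrain(speed, 0.1, 2.0));
  image(clip, 0, 0, width, height);
}
```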
Generative Video
I also use Processing to create generative visual effects for some of my video projects, such as the spinning backgrounds in the first video on this page (based on the artist's paintings), the kaleidoscopic images in the music video below (based on Hubble Space Telescope images), and the moving star fields in my "Children of the Sun" video (the second video below).
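The original sketches and source footage aren't included here, but as a rough illustration of the kaleidoscope technique, the Processing sketch below repeats a mirrored, textured wedge of a single source image around the center while slowly drifting through it. The image name, slice count, and drift speed are placeholders rather than values from the actual videos.

```
PImage src;
int slices = 10;

void setup() {
  size(800, 800, P2D);
  src = loadImage("hubble.jpg");     // placeholder: any source image in the data folder
  textureMode(NORMAL);               // texture coordinates run 0..1
  noStroke();
}

void draw() {
  background(0);
  translate(width / 2, height / 2);
  float wedge = TWO_PI / slices;
  float drift = frameCount * 0.002;  // slowly wander through the source image
  float u = 0.5 + 0.3 * cos(drift);
  float v = 0.5 + 0.3 * sin(drift);
  for (int i = 0; i < slices; i++) {
    pushMatrix();
    if (i % 2 == 0) {
      rotate(i * wedge);
    } else {
      rotate((i + 1) * wedge);       // rotate one step further, then flip back over it
      scale(1, -1);                  // so neighbouring wedges mirror each other
    }
    beginShape();
    texture(src);
    vertex(0, 0, u, v);
    vertex(width / 2, 0, u + 0.2, v);
    vertex((width / 2) * cos(wedge), (width / 2) * sin(wedge), u, v + 0.2);
    endShape(CLOSE);
    popMatrix();
  }
}
```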