Our analysis of the music was performed in MATLAB, while our visualizations were written in Java using Processing, a development environment that provides a user-friendly graphics library.
A major part of our design was choosing how to map the results of our MATLAB analysis onto our visualizations in Java. Once the mapping was chosen, we had to keep the visualizations synchronized with the music while maintaining a constant framerate. An example of our visualization without any input is shown below:
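One common way to keep visuals synchronized with audio despite a variable draw rate is to index the precomputed analysis data by elapsed playback time rather than by counting drawn frames, so a dropped frame never causes drift. The sketch below illustrates this idea with assumed names and frame sizes; it is not the authors' actual code.

```java
// Minimal sketch: pick which analysis frame to display from the audio
// playback position, independent of the rendering framerate.
// The 23.2 ms frame duration (1024 samples at 44.1 kHz) is an assumption.
public class SyncExample {
    // Index of the analysis frame to draw, given the playback position.
    static int frameForTime(double playbackMillis, double analysisFrameMillis) {
        return (int) (playbackMillis / analysisFrameMillis);
    }

    public static void main(String[] args) {
        System.out.println(frameForTime(0.0, 23.2));    // 0
        System.out.println(frameForTime(1000.0, 23.2)); // 43
    }
}
```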
We chose to take the two most dominant components from each source, giving us a total of six components per frame, and chained consecutive components together to determine our "notes". We then displayed the notes as falling rain: lines dropping from the top of the screen, with frequency determining horizontal position (low frequencies on the left, high frequencies on the right) and duration determining the length of each line.
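The note-to-rain mapping above can be sketched as two small functions. The class name, frequency bounds, and logarithmic scaling here are illustrative assumptions, not the authors' implementation.

```java
// Hypothetical sketch of the "rain" mapping: a note's frequency sets
// its horizontal position (low = left, high = right) and its duration
// sets the length of the falling line.
public class RainMapping {
    static final int SCREEN_WIDTH = 800;    // assumed canvas width
    static final double MIN_FREQ = 27.5;    // assumed lowest note (A0)
    static final double MAX_FREQ = 4186.0;  // assumed highest note (C8)

    // Map a frequency to an x coordinate on a log scale, so octaves
    // are evenly spaced across the screen.
    static double freqToX(double freqHz) {
        double t = (Math.log(freqHz) - Math.log(MIN_FREQ))
                 / (Math.log(MAX_FREQ) - Math.log(MIN_FREQ));
        return t * SCREEN_WIDTH;
    }

    // Map a note duration to line length, assuming a fixed fall speed.
    static double durationToLength(double durationSec, double fallSpeedPxPerSec) {
        return durationSec * fallSpeedPxPerSec;
    }

    public static void main(String[] args) {
        System.out.println(freqToX(27.5));    // 0.0 (left edge)
        System.out.println(freqToX(4186.0));  // 800.0 (right edge)
        System.out.println(durationToLength(0.5, 200.0)); // 100.0
    }
}
```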
We used the FFT of the song to control the color wheel at the center of the screen. The low frequencies of the song were mapped toward blue, while the higher frequencies were mapped toward red. The amplitude of each frequency determined the radial extent of the corresponding line in the color wheel.
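A minimal sketch of this FFT-to-color-wheel mapping follows: each FFT bin picks a hue between blue (low frequencies) and red (high frequencies), and its amplitude sets how far the corresponding line extends from the wheel's center. All names and constants here are illustrative assumptions.

```java
// Hypothetical mapping from FFT bins to the color wheel described above.
public class ColorWheelMapping {
    // Linearly interpolate hue from blue (240 degrees) down to red
    // (0 degrees) as the bin index runs from 0 to numBins - 1.
    static double binToHue(int bin, int numBins) {
        double t = (double) bin / (numBins - 1);
        return 240.0 * (1.0 - t);
    }

    // Radial extent of each line: a base radius plus a term that grows
    // with the bin's amplitude, clipped so it stays on screen.
    static double amplitudeToRadius(double amplitude, double baseRadius,
                                    double scale, double maxRadius) {
        return Math.min(baseRadius + scale * amplitude, maxRadius);
    }

    public static void main(String[] args) {
        System.out.println(binToHue(0, 512));    // 240.0 (blue)
        System.out.println(binToHue(511, 512));  // 0.0 (red)
        System.out.println(amplitudeToRadius(2.0, 50.0, 30.0, 200.0)); // 110.0
    }
}
```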
Finally, we used a class called BeatDetect, from an audio library available for Processing, to create our last animation. BeatDetect runs an FFT on two low-frequency bands of the song (rather than the usual one) and flags frames of unusually high amplitude as beats; at each detected beat, the visualization releases around 200 particles radially from the center of the color wheel.
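The text does not show BeatDetect's internals, so the following is a rough, self-contained sketch of the general technique: flag a frame as a beat when its low-band energy clearly exceeds the recent average energy for that band. The window size and sensitivity threshold are assumptions, not values from the library.

```java
import java.util.ArrayDeque;

// Simplified energy-threshold beat detector: a "beat" fires when the
// current low-band energy exceeds sensitivity times the average energy
// over the last windowSize frames.
public class SimpleBeatDetector {
    private final ArrayDeque<Double> history = new ArrayDeque<>();
    private final int windowSize;     // how many past frames to average
    private final double sensitivity; // how far above average counts as a beat

    SimpleBeatDetector(int windowSize, double sensitivity) {
        this.windowSize = windowSize;
        this.sensitivity = sensitivity;
    }

    // Feed the current frame's low-band energy; returns true on a beat.
    boolean isBeat(double energy) {
        double avg = history.stream().mapToDouble(Double::doubleValue)
                            .average().orElse(energy);
        boolean beat = history.size() == windowSize && energy > sensitivity * avg;
        history.addLast(energy);
        if (history.size() > windowSize) history.removeFirst();
        return beat;
    }

    public static void main(String[] args) {
        SimpleBeatDetector det = new SimpleBeatDetector(4, 1.5);
        double[] energies = {1.0, 1.0, 1.0, 1.0, 5.0, 1.0};
        for (double e : energies) {
            System.out.println(det.isBeat(e)); // true only for the 5.0 spike
        }
    }
}
```

In the full visualization, a `true` return from such a detector would be the trigger for spawning the radial burst of particles.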