Music Animation Machine

jersmi

If I wanted to try something like this in QC with a MIDI file, how would I? Is it possible to get a whole MIDI file into QC as a structure? Or am I limited to feeding in the MIDI data from other software?

http://www.musanim.com/


cybero
Re: Music Animation Machine

Well the site you point me to has a whole load of different programs.

Here's what I do to get at information within a MIDI file.

Since I'm often in environments equipped with a MIDI-in-only application [GarageBand], I have to resort to techniques like this to exploit MIDI in situ :-). I don't pretend, BTW, that it's as slick as having full MIDI in and out, say QC to Ableton or any other MIDI in/out application like Logic.

Whatever .....

Open the MIDI file in MuseScore (a freeware application, BTW), then save in MusicXML format; that should give you access to all sorts of useful information.

Although it will save all sorts of page-formatting XML as well, you can use the QC structure tools & XML Importer to pick and choose the information by whatever key term you're after.

You could also examine the file, figure out what you want to edit out, and make a stripped-down version of that XML.
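To illustrate the MusicXML route, here's a minimal sketch (in Python; function and variable names are my own, not from any QC tool) of pulling note data out of a MuseScore-style MusicXML file while skipping the page-layout noise:

```python
import xml.etree.ElementTree as ET

def extract_notes(source):
    """Collect (step, alter, octave, duration) tuples from MusicXML.
    Accepts a file path or a raw XML string; rests are skipped."""
    if source.lstrip().startswith('<'):
        root = ET.fromstring(source)
    else:
        root = ET.parse(source).getroot()
    notes = []
    for note in root.iter('note'):
        pitch = note.find('pitch')
        if pitch is None:               # a <rest/> has no <pitch> child
            continue
        step = pitch.findtext('step')
        alter = int(pitch.findtext('alter', default='0'))   # sharps/flats
        octave = int(pitch.findtext('octave'))
        duration = int(note.findtext('duration', default='0'))
        notes.append((step, alter, octave, duration))
    return notes
```

From there the tuples could be flattened into whatever shape QC's structure tools digest most easily.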

Find attached the default XML output from MuseScore when asked to save a MIDI transcription of a John McLaughlin track, Lotus Feet, so you can see just what it outputs.

Hope that helps.

Attachment: Lotus_Feet.xml_.zip (76.79 KB)

jersmi
Re: Music Animation Machine

cybero, thanks so much. XML looks like a very viable solution. I did find another piece of software that turns MIDI into CSV, but I haven't had time to pursue it yet... not surprised that your solution sounds better. I thought you might have some valuable insight, considering how deeply involved you are with music visualization in QC.

And the site -- yes, a good amount of nuts-and-bolts info on a couple of decades of research in music + computers, referencing some heavy hitters (at least in the SF Bay Area where I live) like Max Mathews, etc. But when you say "loads of different software", I am referring to one piece of software, the Music Animation Machine -- see the Vimeo clips posted on the site. I was pointed to this by a musician who wants to try to do something like it with a symphony, in combination with live reactive audio input. Luckily for me this project is nine months out, and I will have on my team a programmer familiar with QC/JS, et al...

Personally, I think the MAM stuff is very cool because -- even though it looks like so many piano roll midi sequencers we are all used to seeing these days -- it is a clear, articulate graphic delivery of complex information. Deceptively simple.

gtoledo3
Re: Music Animation Machine

Sounds like a cool project! Wish someone had called me ;-)

This look reminds me a lot of setups I already have, I think I'll take a go at doing some stuff like this myself.

gtoledo3
Re: Music Animation Machine

Oh, so you're actually trying to eventually get this going with microphones and live input, and not MIDI? Hmmm.... There isn't really such a thing as chord detection in QC for live audio, let alone pitch detection, amongst other possible issues (though you can "sorta" do pitch detection). The MIDI thing is pretty doable (and can still achieve that look).
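For a sense of what "sorta" pitch detection looks like, here's a rough autocorrelation sketch in Python (my own illustration, not anything from QC or the thread); as the post implies, it only behaves on clean monophonic input:

```python
import math

def detect_pitch(samples, rate):
    """Very rough single-pitch estimate via autocorrelation.
    Finds the lag at which the signal best matches a shifted copy of
    itself, then converts that lag back to a frequency in Hz."""
    n = len(samples)
    best_lag, best_corr = 0, 0.0
    # search lags corresponding to roughly 80 Hz .. 1000 Hz
    for lag in range(max(1, rate // 1000), rate // 80):
        corr = sum(samples[i] * samples[i + lag] for i in range(n - lag))
        if corr > best_corr:
            best_corr, best_lag = corr, lag
    return rate / best_lag if best_lag else 0.0
```

Chords, noise, or vibrato will confuse it immediately, which is exactly why live chord detection is the hard part.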

Your trick with live audio will be to keep everything from looking like a barrage of info (which is, IMO, where most people fail: the raw frequency info doesn't look tied to the music unless you shape it), and to do something that lets someone control sensitivity on the incoming channels during the show if things are volume-level dependent. Just some thoughts...

jersmi
Re: Music Animation Machine

You never know, George -- your expertise is definitely on my radar. :)

jersmi
Re: Music Animation Machine

Correction here, my mistake:

Probably not "live audio reactive input"

  1. The topic of MIDI input here is a point of departure for graphic display of an orchestral score, or one part, or...? It's the beginning of a set of questions: how do we visualize / present these musical pieces in interesting ways? How do we take pieces from the canon and make them interesting for contemporary audiences? Etc.

  2. For the performance, the soloist will be on stage most likely with IR motion tracking something or other, generating live graphics projected over the top of the other graphics based on movement. Though the question of generating imagery based on audio amplitude is still in the air...

My job in this is more artistic direction and content development. The technical apparatus will be developed by (at least) one other person. So I can imagine what I want to see and say to another person, "Let's try to make this."

Anyway, I have the first meeting in about three hours. I'll know more after that.

dust
Re: Music Animation Machine

There is a program called Melodyne DNA that will let you take audio and convert it to MIDI. It's pretty cool; it actually splits up chords into their integral notes, which is amazing to me. How that will help you in QC, I'm not sure. If you look in the repository, I have a MIDI note/frequency converter; the demo file shows you how to do rudimentary pitch detection from a file (although live input works too).

Once you have found the frequency, you could then create some graphics at runtime that represent a specific frequency. If you know the input is a chord, you could also make graphics representing that chord (supposing you know music theory). So if you're matching a particular frequency, you could display that info somehow; for instance, if you match an Am, then you know the integral notes of that chord are A, C, and E.

Like I said, the demo is rudimentary since I'm using single notes as a source, but you could certainly save out a bunch of chords to be matched.
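The matching idea above -- map a detected frequency to its nearest note name, then check chord membership (Am -> A, C, E) -- can be sketched like this in Python (names are my own, not from the repository demo):

```python
import math

NOTE_NAMES = ['C', 'C#', 'D', 'D#', 'E', 'F',
              'F#', 'G', 'G#', 'A', 'A#', 'B']

def freq_to_note(freq):
    """Nearest equal-tempered note name for a frequency (A4 = 440 Hz)."""
    midi = round(69 + 12 * math.log2(freq / 440.0))
    return NOTE_NAMES[midi % 12]

def matches_chord(freqs, chord_notes):
    """True if the detected frequencies cover the chord's integral notes."""
    return set(chord_notes) <= {freq_to_note(f) for f in freqs}
```

Saving out a dictionary of chord-name -> note-set would then let you label whatever the pitch detector spits out.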

gtoledo3
Re: Music Animation Machine

Well, it's an interesting idea, and it will be cool to see where it goes.

SteveElbows
Re: Music Animation Machine

This sort of thing has long interested me, and there is a lot more potential for visualisation of the music if you have the full midi data rather than just an audio feed.

If I wanted to copy the sort of thing MAM is doing, I would probably cheat a little: rather than try to load the whole MIDI file in, I would just feed QC MIDI from an app in realtime, but start it going some seconds ahead of what was actually being played, so the info about upcoming notes is available enough seconds in advance to visualise as it scrolls onto the screen. Alternatively, I might try to use the Kineme Value Historian patch to record the data into the composition. This could quickly become unworkable, especially if trying to record every possible bit of MIDI info, but it may work better if I record only the variables that actually end up changing the various visual elements, which would hopefully be far fewer than the raw MIDI data.
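The "start the feed a few seconds early" trick amounts to giving the renderer a look-ahead window. A small sketch (Python; the event format and function name are hypothetical) of mapping upcoming notes to scroll positions for a MAM-style piano roll:

```python
def visible_notes(events, now, lead, width=2.0):
    """Map notes into view for a scrolling piano roll.
    x = 0 is the playhead ('now'); a note enters at x = width when it is
    `lead` seconds away. `events` is a list of (start, duration, pitch)."""
    out = []
    for start, dur, pitch in events:
        if start + dur < now or start > now + lead:
            continue                        # already gone / not yet in view
        x = (start - now) / lead * width    # 0 at playhead, width at horizon
        bar_w = dur / lead * width          # note length in screen units
        out.append((x, pitch, bar_w))
    return out
```

Called once per frame with the current transport time, this gives the positions a sprite iterator (or similar) would need; notes cross x = 0 exactly when they sound, which is the MAM effect.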

There are boatloads of small and large variations on how MAM presents the note data that could be achieved in QC. If you have a lot of different tracks/instruments, lots of simultaneous notes from one instrument, and lots of notes across wide octave ranges, then it can be a challenge to present all of the info visually, or it may take a long time to set up in QC.

If this is being done in conjunction with a live performance of the music, then I imagine there could easily be timing issues with the music not being in sync with the visuals. You may need to explore whether the visuals are in charge of the timing of what is played, a kind of digital conductor, though I've no idea how well that would go down with the musicians.

I'll probably have a go at knocking up something in QC this weekend that does something vaguely similar; I'll post it here if I get anywhere.

gtoledo3
Re: Music Animation Machine

I'll have to remember that one! "Ok, you guys play to the visuals!" In a perfect world ;-)

I think the reality of the thought process in these scenarios is much more like "hmmm, let's not spend as much money or time on content, and have some way of having musical notes generate awesome_content_through_artificial_intelligence".

SteveElbows
Re: Music Animation Machine

Some years ago I saw Cornelius play at a festival. The visuals & music were very tightly synced, and it made for a great spectacle. However, the visuals were the same as on their DVD, and apparently they were playing live in time to the visuals, with some kind of click track or other timing info coming from the DVD to help them stay in time.

Personally I've always been more interested in getting realtime data from the musicians & their instruments and turning that into visuals, but it takes a lot of setting up, so many things can go wrong, and I always get sidetracked by wanting to do stuff in 3D that's not so easy in realtime with the tools available. I'll probably have another stab at it, but maybe I'll give up on proper realtime 3D for a while, pre-render it, and then either trigger video clips or do some more simplistic 2D animation based off the 3D renders.

Which reminds me that Apple Motion supports MIDI and is pretty good for 2.5D stuff, although its realtime performance can be underwhelming. Still, for the sort of thing that MAM does, I question whether generating the visuals in realtime is actually necessary; it might be easier to pre-render, as the data is available ahead of time anyway in the form of the score. It's only if you want to augment the visuals with extra stuff from a realtime feed that realtime generation seems worth the extra 'risk', and even then, depending on exactly what visuals you are creating, you could pre-render some of it and layer realtime stuff over the top.

SteveElbows
Re: Music Animation Machine

Well, I got a very sloppy 'lazy man's' version working this evening. I use the Accumulator to build up the image over time, and a Global Input MIDI Value Follower so that I don't have to wire up notes separately. It works, but it's not very good; the accumulator way of doing things shows up the timing instabilities quite badly. I'll post it tomorrow, but it's nothing clever, and I haven't bothered setting it up to respond nicely to a wide range of notes, haven't got the colours fine-tuned, etc.

On a related note, I am also reminded that things like QC aren't all that smooth when it comes to scrolling; you get annoying glitches here and there which can be quite noticeable when doing stuff like this. My use of the accumulator is only partly to blame; when I've used other techniques to scroll, the problem still shows up. From memory it can be less noticeable when doing stuff in 3D, although I'm not sure if this is technically true; it may just be a matter of visual perception and how much you can get away with before the mind notices.

gtoledo3
Re: Music Animation Machine

I can totally imagine Cornelius having the wherewithal to take that approach. If the music is sequenced, though, it can be pretty easy to have a guy flipping video clips that look like they are in sequence with the music, even if there's no real audio interactivity going on and it was all rendered on the front end.

Do you mean 3D, as in 3D screen stuff, or 3D graphics?

Interesting thoughts Steve...good taste too ;-)

jersmi
Re: Music Animation Machine

Love to see your layout with the accumulator, SteveElbows.

Some combo of live input + prerecorded / triggered -- that's how it usually goes, right? On my small budget(s), the interactive part is fun to develop, but making it stable for live performance usually takes a good chunk of time, not to mention hardware issues as you go, etc. Even with live sensors (visual or audio), someone will be triggering clips or whatever. I'm no purist -- I want the best product and want it to run well. I'll use whatever tools I have at my disposal. Right now I'm envisioning two laptops and two projectors -- one laptop / projector dedicated to interactive, one dedicated to pre-rendered backdrops.

Those Cornelius music vids on their MySpace page are fun and strange -- I like them a lot...

The person building the sensor setup wants to use the reacTIVision fiducial stuff, maybe in combo with IR/OpenCV. I'm skeptical because fiducials need light, and video projection looks best in the dark. And I still have audio / amplitude-driven animation in mind to try. The trombonist wants to point his instrument at the screen and "paint" when he plays. I can't help but make a direct relationship to sound, but so far he has used IR tracking (in Jitter) with a small light attached to his trombone, picked up by a camera with a piece of exposed film over the lens as an IR filter. He has expressed frustration at the limited range / angle of the sensor. Anyway, we'll be trying some things over the next couple of months. I'm excited to learn what works best in a live scenario, with an eye towards something that'll work in a range of possible scenarios.