Music JSON: very rough outline

I’ve been having a look through the MIDI specs and roughing out something similar in JSON style. It won’t literally be MIDI-in-JSON because along the way I’ve changed a few things to make it easier to read and write by hand (or with simple script code).

One thing you find in MIDI files is that, because the format originated as a stream of data rather than a file format, you get features like separate ‘note on’ and ‘note off’ events. I thought it’d better match the way music is notated to have a single ‘note’ event with a ‘duration’ property (measured in ‘ticks’, the smallest subdivision of a beat).
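
As a rough sketch of the conversion involved (the event shapes here are hypothetical, not a final format), pairing ‘note on’/‘note off’ events into single notes might look something like this:

	// Hypothetical sketch: merge MIDI-style noteOn/noteOff pairs into
	// single note events with an absoluteTime and a duration (in ticks).
	function pairNotes(midiEvents) {
		var open = {};    // "channel:note" -> pending note-on
		var notes = [];
		var time = 0;
		midiEvents.forEach(function (ev) {
			time += ev.deltaTime;    // MIDI times are deltas from the previous event
			var key = ev.channel + ':' + ev.note;
			// in MIDI, a 'note on' with velocity 0 also means 'note off'
			if (ev.type === 'noteOn' && ev.velocity > 0) {
				open[key] = { note: ev.note, absoluteTime: time, velocity: ev.velocity };
			} else if (ev.type === 'noteOff' || ev.type === 'noteOn') {
				var n = open[key];
				if (n) {
					n.duration = time - n.absoluteTime;
					notes.push(n);
					delete open[key];
				}
			}
		});
		return notes;
	}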

Also, MIDI events are listed in the order in which they occur, and their timing is given as a ‘delta’ (i.e. elapsed) time since the previous event. MIDI-XML adds the ability to specify absolute times (in ticks). In this outline I have toyed with the idea of expressing the absolute time as [bars,beats,ticks], and of allowing negative values for the tick component as an intuitive way to specify that a note should ‘push’ ahead of its natural beat. Possibly this is unnecessary, since most files will be composed with the aid of software rather than by hand. Similarly, I thought it’d be nice to be able to specify notes by name rather than by ordinal number, but maybe who cares?
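
To make both ideas concrete, here’s a minimal sketch of the two conversions (assuming the bar and beat components are zero-based, and the common ‘C4 = 60’ octave convention; neither is settled):

	// [bars, beats, ticks] -> absolute ticks; a negative tick component
	// pushes the note ahead of (i.e. before) its natural beat
	function toTicks(bbt, beatsPerBar, ticksPerBeat) {
		return (bbt[0] * beatsPerBar + bbt[1]) * ticksPerBeat + bbt[2];
	}
	toTicks([0, 1, -16], 4, 192);    // 176: 16 ticks before beat 2 of bar 1

	// note name -> MIDI note number, assuming 'C4 = 60'
	// (octave numbering varies between vendors)
	function noteNumber(name) {
		var semis = { C: 0, D: 2, E: 4, F: 5, G: 7, A: 9, B: 11 };
		var m = /^([A-G])([#b]?)(-?\d+)$/.exec(name);
		var offset = m[2] === '#' ? 1 : m[2] === 'b' ? -1 : 0;
		return semis[m[1]] + offset + (Number(m[3]) + 1) * 12;
	}
	noteNumber('C#3');    // 49, the same pitch as 'Db3'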

At the moment I’m not sure if JSON parsers are required to respect the source order when dealing with arrays. I believe they are, since the spec defines an array as an ordered sequence (it’s object members whose order isn’t guaranteed), but if not then the whole delta-time thing would be impossible without adding some kind of incremental id to track events.

One final thing to explain is the MIDI concept of ‘tracks’ and ‘channels’. As I understand it there are three formats of MIDI file: ‘format 0’ contains a single track with all the events; ‘format 1’ contains multiple tracks to be played simultaneously (I believe the intention is that each track is directed to a different device, e.g. a sound module); ‘format 2’ contains multiple tracks to be played separately/consecutively, e.g. movements within a larger piece.

I have focussed on ‘format 1’ since the matching of tracks to devices fits well with the idea of having Flash ‘instruments’ that can be used to play sounds. Each track can have up to 16 channels (or in our case, I don’t see any need for a limit…), each of which corresponds to a particular sound bank, e.g. ‘piano’.

MIDI provides ways to identify devices and sound banks quite specifically. I have kept the track and channel part simple for now because either way you will need some kind of routing code to say, “ok, device X is my flash instrument Y”, and to set up a piano sound on channel 1 of that instrument. By using only descriptive names the code stays portable. But maybe that’ll have to change?
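
As a sketch of what that routing code might look like (FlashSynth and its setBank/schedule methods are hypothetical stand-ins, not a real API):

	// Descriptive names in the file, resolved to concrete players here;
	// this table is the only host-specific part.
	var routing = {
		'Flash synth 1': new FlashSynth()    // "device X is my flash instrument Y"
	};

	function play(song) {
		song.tracks.forEach(function (track) {
			var instrument = routing[track.device];
			// channels[] maps a channel index to a sound bank name, e.g. 1 -> 'piano'
			track.channels.forEach(function (bank, i) {
				instrument.setBank(i, bank);
			});
			track.events.forEach(function (ev) {
				if (ev.type === 'note') instrument.schedule(ev.channel, ev.data);
			});
		});
	}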

I guess my next step is to make a program to convert MIDI <-> M-JSON…

{
	"name": "Test",
	"beatsPerMinute": 160,
	"timeSignature": [4,4],
	"ticksPerBeat": 192,
	"tracks": [
		{
			"device": "Flash synth 1",
			"channels": [
				"electric bass",
				"piano",
				"drums"
			],
			"events": [
				{
					"type": "note",
					"channel": 2,
					"data": {
						"note": "C#3",
						"absoluteTime": 2496,
						"duration": 64,
						"velocity": 96
					}
				},
				{
					"type": "note",
					"channel": 0,
					"data": {
						"note": "Db2",
						"absoluteTime": [0,1,-16],
						"duration": 64,
						"velocity": 96
					}
				},
				{
					"type": "note",
					"channel": 1,
					"data": {
						"note": "Db2",
						"deltaTime": 96,
						"duration": 64,
						"velocity": 96
					}
				}
			]
		}
	]
}


19 Responses to Music JSON: very rough outline

  1. JD says:

This is a very cool idea. I was looking for a MIDI to JSON parser. My end goal is to make a particle effect visualizer using HTML5 + javascript + midi tracks (converted to JSON objects) to use on my band’s Web site. This is the page that inspired me, but I don’t believe their particles are actually responding to the music: http://9elements.com/io/projects/html5/canvas/ (requires an HTML5 browser)

    Since I have access to all the midi tracks in our songs, the visualizer could be very accurate. Unless you happen to know of a way to do frequency analysis on an mp3 or ogg or whatever the html5 spec uses. Anyways, if you have any updates on this idea of yours, I’d love to see them or help in any way I can.

  2. anentropic says:

Damn, this is awesome – a great notation syntax with some useful working code:
    http://vexflow.com

    Rock!

  3. JD says:

    That is really well done. Not exactly MIDI parsing, but probably even easier to create from scratch than a MIDI file.

    • anentropic says:

      Sadly I still haven’t written any code…

      I just found VexFlow today though and it’s way ahead. Via discussion on the author’s blog I also came across this:
      http://abcnotation.com/

      It’s another concise notation syntax, with interpreters available for many languages. It covers different ground to MIDI or my proposal (it just captures the essence of a tune rather than all the data needed to reconstruct a performance).

  4. anentropic says:

    Just found this:
    http://www.sergimansilla.com/blog/dinamically-generating-midi-in-javascript/

    generates actual MIDI data and attempts to feed it into the appropriate playback plugin via a Data URI

    has a simple OOP type of API for specifying the notes in Javascript code

  5. anentropic says:

    DSP in Javascript (js reverb!) …obviously requires support for Audio Data API (Mozilla-only at present)
    https://github.com/corbanbrook/dsp.js

…and a ‘guitar tab player’ that makes use of Vexflow and the Audio Data API. Sadly they were stuck with MusicXML as a data format:
    http://www.gregjopa.com/2010/12/html5-guitar-tab-player-with-firefox-audio-data-api/

  6. JD says:

    Wow this is all really impressive stuff. I’m starting to feel a bit overwhelmed with all the possibilities. JS MIDI + DSP – Imagine code-generated, high fidelity, cross-platform interactive music…

    • anentropic says:

      yeah, what’s surely coming before too long are HTML5 ‘instruments’ … tag based sample players, with DSP effects, controlled via JS sequencers…

  7. anentropic says:

    just saw this… very clever and something like I had in mind:
    http://wiki.musichackday.org/index.php?title=JSONloops

    AFAICT the audio for the loop is generated in realtime as a stream on a server (running node.js) while multiple browsers can ‘connect’, as it were, to that loop and control it

    • JD says:

      Very cool. I would guess the audio is played via loaded samples, mp3/aiff/wav. I really need to look into node.js, seems like everyone is using it for all kinds of unrelated projects.

      • anentropic says:

        actually I think the audio only plays on the ‘server’, the connected browsers are just control interfaces. in their demo video the ‘server’ is one of several laptops.

        if your server is in a data centre somewhere then I think hearing the loop may be an unsolved problem for this project… must be doable though!

  8. anentropic says:

    starting to get interesting…
    https://github.com/justinlatimer/node-midi

    • JD says:

We are on the same wavelength, apparently. Just last week I finally got around to setting up a node.js server to play around with for a simple 3D engine I’m trying to write in js + canvas. That github project is perfect for an idea I’ve long wanted to do: programmatically generate midi, but have playback through a software sampler. I use Propellerhead’s Reason when I feel like making a tune (www.headchemists.com) but as far as I know, programmatically generating Reason files isn’t an option. This Node.js module is perfect: javascript logic would create note events, even split across multiple MIDI channels, all wired to the various instruments in the Reason song.
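
      A minimal sketch of that idea with node-midi, sending a note to whatever is listening on the first MIDI output port (the port index and note number are just placeholders):

      // open the first available MIDI output and send middle C
      var midi = require('midi');
      var output = new midi.Output();
      output.openPort(0);                     // port 0 is an assumption
      output.sendMessage([0x90, 60, 96]);     // note on, channel 1, middle C, velocity 96
      setTimeout(function () {
          output.sendMessage([0x80, 60, 0]);  // note off
          output.closePort();
      }, 500);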

      Somewhat related, I found this blog post about using javascript to actually synthesize audio waveforms in the browser using Frequency Modulation. http://js.do/sound-waves-with-javascript/ The visualization of the waveforms is an especially nice touch, as if generating the audio alone wasn’t cool enough!

  9. anentropic says:

    While I haven’t seen any JSON in there yet, this is very much in line with what I was thinking towards:
    http://badassjs.com/post/40190128792/midi-js-a-soundfont-based-midi-sequencer-in-javascript

  10. stephband says:

    Oh hello. I just came across this post while searching for my own repo. You guys might be interested in the Music JSON spec I’ve written here:

    https://github.com/soundio/music-json

    I did play with a similar format to the one you’ve outlined above, but decided arrays as event objects (an idea that comes from reading the OSC spec) are better for their brevity. Anyway, if you had any input I’d love to hear it (github.com/soundio/music-json/issues).

    Music JSON is being used at sound.io/sequencer.
