This document describes methods used to convert ORG format music (used in the game Cave Story) to MIDI format as implemented in ORGMID, my ORG to MIDI converter.
All ORG melody instruments have no decay; that is, they don't fade away. As long as OrgMaker's volume level graph isn't changed, a note keeps the same volume for its entire length. Because of this, when you convert ORG instruments to MIDI instruments, you usually don't want a MIDI instrument like piano, whose sound fades away on long notes, but something like organ or flute, whose sound stays at the same volume for the entire length of the note. To attempt to match ORG melody instruments to MIDI programs, I first made a list of all MIDI programs that have no decay.
Using an audio player with a spectrogram plug-in, I took screenshots of the spectrograms of all of the ORG instruments and all of the no-decay MIDI instruments, each playing the same pitch. My idea was to match instruments with similar-looking spectrograms, but I found only a few matches (those with just a strong fundamental) before giving up. With the kind of spectrograms I made, it was difficult to compare the higher harmonics. If I restarted a project like this, I'd look for a different kind of spectrum analyzer and take notes on the harmonic strengths instead of relying on just a visual match.
So for now, I've given up trying to make a one-size-fits-all mapping. All melody tracks are assigned an organ sound. You should use a MIDI editor to further choose appropriate sounding MIDI instruments in the resulting MIDI file.
MIDI has two kinds of percussive sounds. Notes sent to channel 10 always make drum sounds according to a note-to-drum mapping. On other channels, several MIDI programs can make percussion or sound effect sounds. Right now, I've only used channel 10 drum sounds. To match the ORG drums to MIDI drums, I first made lists of every ORG drum and MIDI drum name. I grouped each list by similar names, then compared the sounds to refine the matches.
Here's something I thought of but haven't implemented yet. Some ORG drums could be matched to non-channel 10 MIDI programs. For example, the OrgMaker drum RevSym01 could be converted to the MIDI program Reverse Cymbal. Also, ORG drums with significant pitch changes could be matched to non-channel 10 MIDI programs. For example, an OrgMaker drum track that uses one of OrgMaker's tom sounds with significant pitch changes might be converted to a MIDI track using the Melodic Tom program.
I recorded samples of ORG and MIDI sounds. I examined how volume and pan values affect the amplitude of the recorded sound wave, then used a method to convert ORG volume and pan to MIDI volume and pan so that the sound amplitude matches well.
Go to OrgMaker Notes: Volume, Pan, Pi duration and read the Volume and Pan sections.
For MIDI, I tested three different MIDI devices I have:
I measured MIDI volume and pan amplitudes the same way as I measured ORG volume and pan. I prepared test MIDI files with equally spaced volume or pan values, recorded the sounds, and examined the amplitudes of the recordings.
I tested three different ways MIDI can change the volume of a sound: the Note On velocity byte, the Channel Volume controller (CC 7), and the Expression controller (CC 11). More specifically, I tested changing only one of velocity, volume, or expression while leaving the other two at maximum.
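As a concrete illustration, these are the raw MIDI messages behind the three methods. The helper names below are my own, not ORGMID's; the byte layouts are standard MIDI.

```python
# Sketch of the three MIDI messages that can change a note's loudness.
# Helper names are hypothetical; the byte layouts are standard MIDI.

def note_on(channel, note, velocity):
    # Note On status 0x9n: loudness set by the velocity byte
    return bytes([0x90 | channel, note, velocity])

def channel_volume(channel, value):
    # Control Change 7: Channel Volume
    return bytes([0xB0 | channel, 0x07, value])

def expression(channel, value):
    # Control Change 11: Expression
    return bytes([0xB0 | channel, 0x0B, value])

# To test one method at a time, vary its value and hold the other two at 127.
print(note_on(0, 60, 100).hex())     # 903c64
print(channel_volume(0, 127).hex())  # b0077f
print(expression(0, 127).hex())      # b00b7f
```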
On each MIDI device, all three ways of changing the volume had the same effect on the amplitude of the output sound. The three MIDI devices each had a slightly different conversion from input values to output amplitudes, but they were roughly the same, so I came up with the following formula as a compromise fit.
V is relative MIDI velocity, volume, or expression (0 to 1)
A is relative output amplitude (0 to 1)
A = V^1.75
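The compromise fit is easy to apply in code. Here's a small sketch (the helper names are my own) that evaluates the fit for a 0-127 value and inverts it to find the value that best produces a desired relative amplitude:

```python
def midi_amplitude(value):
    # Relative output amplitude A = V**1.75, where V = value / 127
    return (value / 127.0) ** 1.75

def midi_value_for_amplitude(a):
    # Invert the fit: the 0-127 value that best produces relative amplitude a
    return round(127.0 * a ** (1.0 / 1.75))

print(midi_value_for_amplitude(0.5))  # 85: half amplitude needs value 85, not 64
```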
MIDI pan behaves like this:
Here are the formulas I found that fit.
P is relative MIDI pan (0 = left, 0.5 = center, 1 = right)
L and R are the relative amplitudes of the left and right channels (0 to 1)
L = √(1 − P)
R = √(P)
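Assuming a square-root (constant-power) pan law of this kind, L = √(1 − P) and R = √(P), a small sketch (the helper name is my own) gives the left and right amplitudes for a relative pan value:

```python
import math

def midi_pan_gains(p):
    # Relative left/right amplitudes for relative pan p (0=left, 1=right),
    # assuming a square-root (constant-power) pan law
    return math.sqrt(1.0 - p), math.sqrt(p)

left, right = midi_pan_gains(0.5)
print(round(left, 3), round(right, 3))  # 0.707 0.707: center is below full volume
```

Note that panning hard to one side raises that channel to full amplitude, which is the "true pan" behavior described below, not a balance control.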
The ORG pan actually functions like a balance control. It keeps one channel (left or right) at a constant amplitude and decreases the other channel.
MIDI pan functions as a true pan. It simulates moving an instrument closer to the left microphone or right microphone. This increases the amplitude of one channel and decreases the amplitude of the other.
In my first conversions, I thought MIDI pan used the same behavior as ORG pan, so I only measured the amplitude of the decreasing channel. I mapped ORG volume to MIDI volume and independently mapped ORG pan to MIDI pan, but the results didn't sound right.
When I realized ORG pan actually functions like a balance control, I noticed there's a MIDI controller for balance and started testing if I could use that to get better matching results. Unfortunately, none of my MIDI devices responded to the MIDI balance controller, so I guess MIDI devices don't typically implement the balance controller. The only reliable way to change the volume of left and right channels was to use the MIDI pan controller.
Because the MIDI pan controller increases the amplitude of one channel, it messes up any ORG volume to MIDI volume conversion that was previously applied. Because of this, I could no longer map volume and pan independently. I needed to map both ORG volume and pan together into MIDI volume and pan.
I ended up looping through every possible value of ORG volume and pan, calculating the relative amplitude of the left and right channels, then found the MIDI volume and pan that was the closest match for the desired amplitudes. I hard-coded the results into arrays. In the ORGMID source code zip file, see the folder volpanconv.
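The exhaustive search can be sketched like this. This is a simplified illustration with my own names, not ORGMID's actual code: the ORG-side amplitude model lives in the volpanconv folder and isn't reproduced here, so the sketch only shows the MIDI-side model (the A = V^1.75 volume fit combined with a square-root pan law) and the nearest-match search over all MIDI volume/pan pairs.

```python
import math

def midi_stereo_amplitudes(vol, pan):
    # MIDI-side model: volume fit A = V**1.75 combined with a
    # square-root pan law, for 0-127 volume and pan values
    a = (vol / 127.0) ** 1.75
    p = pan / 127.0
    return a * math.sqrt(1.0 - p), a * math.sqrt(p)

def best_midi_vol_pan(target_left, target_right):
    # Exhaustively try all 128 * 128 (volume, pan) pairs and keep the
    # one whose left/right amplitudes are closest to the targets
    best, best_err = (0, 0), float("inf")
    for vol in range(128):
        for pan in range(128):
            left, right = midi_stereo_amplitudes(vol, pan)
            err = (left - target_left) ** 2 + (right - target_right) ** 2
            if err < best_err:
                best, best_err = (vol, pan), err
    return best
```

In the real converter, the target amplitudes would come from the ORG volume/pan model, and the results would be precomputed into lookup arrays as described above.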
OrgMaker never specifies note values (quarter notes, eighth notes, and so on) for its notes; note durations are only displayed in terms of grid divisions. The largest grid division in OrgMaker is the measure. The medium and smallest grid divisions are given confusing names in OrgMaker. I've decided to call the medium grid division a beat, and the smallest grid division a step.
Here are two different methods I've used to assign note values and a time signature in the MIDI file: the quarter note beat method and the avoid triplets method.
* In a MIDI file, the Time Signature meta-event also specifies the period between metronome clicks. In both methods, I set this to be the same as the OrgMaker beat. This is intended to be helpful for the avoid triplets method. For example, if the method sets a 12/8 time signature, it will also set the metronome to click 4 times each measure (every dotted quarter note). Unfortunately, MIDI players and editors usually ignore this setting and use the time signature denominator as the period of the metronome click. So in a 12/8 piece, most MIDI players will click the metronome 12 times each measure (every eighth note) no matter what the "period between metronome clicks" setting is.
The two methods are only different when the OrgMaker beat is divided into a multiple of 3 steps.
The quarter note beat method is useful for MIDI players that don't obey the metronome click period specified in the Time Signature meta-event. Using this method, the beat is always a quarter note, and these players will always click the metronome correctly. However, if the beat is divided into steps that are a multiple of 3, then viewing the MIDI file as music notation could end up showing many triplets.
The avoid triplets method was designed to make music notation without triplets. If the OrgMaker beat is divided into a multiple of 3 steps, this method converts the OrgMaker beat into a dotted quarter note. When viewed as music notation, the beat is divided into steps of 3 eighth notes, or 6 sixteenth notes, or so on. However, as explained above, most MIDI players will then click metronome every eighth note.
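The time signature chosen by the two methods can be sketched as follows (a simplified illustration with my own names, not ORGMID's actual code):

```python
def time_signature(beats_per_measure, steps_per_beat, avoid_triplets=False):
    # Quarter note beat method: the OrgMaker beat is always a quarter note.
    # Avoid triplets method: a beat divided into a multiple of 3 steps
    # becomes a dotted quarter note, giving a compound /8 signature.
    if avoid_triplets and steps_per_beat % 3 == 0:
        return beats_per_measure * 3, 8
    return beats_per_measure, 4

# 4 beats per measure, 6 steps per beat:
print(time_signature(4, 6))                       # (4, 4)
print(time_signature(4, 6, avoid_triplets=True))  # (12, 8)
```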
For now, I've decided to use the quarter note beat method. Future versions might let you choose between these two methods or let you manually specify the note values and time signature.
In an ORG file, the tempo is specified as the duration of the OrgMaker step (the smallest grid division) in milliseconds. In a MIDI file, the tempo is specified as the duration of a quarter note in microseconds.
Once you've decided what note values to assign to the OrgMaker time divisions, you can convert the ORG tempo value to an appropriate MIDI tempo value.
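For example, with the quarter note beat method, the conversion only needs to know how many steps were assigned to a quarter note (the parameter names here are my own):

```python
def midi_tempo(step_ms, steps_per_quarter):
    # ORG tempo: milliseconds per step -> MIDI tempo: microseconds per quarter
    return round(step_ms * steps_per_quarter * 1000)

# A 125 ms step with 4 steps per quarter note:
print(midi_tempo(125, 4))  # 500000 microseconds per quarter note, i.e. 120 BPM
```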
An ORG file doesn't store a key signature. Choosing an appropriate key in the MIDI file helps make music notation easier to read: it consolidates accidentals into the key signature.
A simple algorithm could be to try every possible key signature and find one that needs the fewest accidentals. But that method can't tell the difference between major and minor keys.
I found a key-finding algorithm that uses the total durations of each pitch class, compares them to a "profile" for each possible major and minor key, and picks the best key. This algorithm can tell major and minor keys apart. See Key-finding algorithm.
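A well-known algorithm of this kind is the Krumhansl-Schmuckler correlation method; here is a sketch of the idea using the published Krumhansl-Kessler profiles. This is my own illustration, and the linked algorithm's exact profiles and scoring may differ.

```python
# Krumhansl-Schmuckler style key finding: correlate the per-pitch-class
# durations against a rotated major or minor profile for each candidate key.
# The profile numbers are the published Krumhansl-Kessler values.

MAJOR = [6.35, 2.23, 3.48, 2.33, 4.38, 4.09, 2.52, 5.19, 2.39, 3.66, 2.29, 2.88]
MINOR = [6.33, 2.68, 3.52, 5.38, 2.60, 3.53, 2.54, 4.75, 3.98, 2.69, 3.34, 3.17]
NAMES = ["C", "C#", "D", "D#", "E", "F", "F#", "G", "G#", "A", "A#", "B"]

def correlation(xs, ys):
    # Pearson correlation coefficient
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    num = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    den = (sum((x - mx) ** 2 for x in xs) * sum((y - my) ** 2 for y in ys)) ** 0.5
    return num / den

def find_key(durations):
    # durations[i] = total duration of pitch class i (C=0 ... B=11)
    best, best_r = None, -2.0
    for tonic in range(12):
        rotated = durations[tonic:] + durations[:tonic]
        for profile, mode in ((MAJOR, "major"), (MINOR, "minor")):
            r = correlation(rotated, profile)
            if r > best_r:
                best, best_r = "%s %s" % (NAMES[tonic], mode), r
    return best
```

Because it scores major and minor profiles separately, this approach can distinguish, say, C major from A minor even though they share a key signature.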
Be aware the key-finding algorithm may not always choose the correct key, but the result will at least make music notation easier to read.
I have examined how the OrgMaker Freq number changes the pitch and written about it here: OrgMaker Notes: Pitch.
Because the Freq number changes the frequency in hertz, each pitch class ends up with a different pitch bend offset in cents. To convert this accurately to MIDI, there would need to be a pitch bend event before every note. Currently, I've decided this isn't worth it and just ignore the Freq number when converting to MIDI.
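If a future version does emit those pitch bends, converting a cent offset to a 14-bit MIDI pitch bend value is straightforward, assuming the common default bend range of ±2 semitones (a sketch with my own names):

```python
def pitch_bend_value(cents, range_cents=200):
    # 14-bit pitch bend: 8192 is center (no bend); the value is clamped
    # to 0..16383. Assumes a +/-2 semitone (200 cent) bend range.
    value = 8192 + round(cents / range_cents * 8192)
    return max(0, min(16383, value))

print(pitch_bend_value(0))   # 8192 (no bend)
print(pitch_bend_value(50))  # 10240 (a quarter tone sharp)
```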
I have examined how the OrgMaker Pi setting affects the note duration and written about it here: OrgMaker Notes: Volume, Pan, Pi duration, section "Pi duration".
Right now, my ORG to MIDI converter ignores the Pi setting.