Sunday 29 December 2013

Killing Off Spring

Glad to see the back of Spring.

Sonic Field Is Now Stand Alone Core Java

About a year ago I moved Sonic Field over to being a Spring based application. I had all sorts of ideas about making a UI using HTML and so on. Whilst the exercise of converting to Spring was good for the clean structure of Sonic Field, the weight of the Spring framework is huge.

I am much happier now that Sonic Field just requires the Core JDK to work. It can be compiled down to a single jar file and run.

The next step is to make a standard release script which creates an executable jar file, then Sonic Field will be a very simple program to execute indeed.


Friday 27 December 2013

What Is ADSR

I made a short video tutorial.

Downloading Sonic Field

Sonic Field Is Open Source

It always has been open source under the AGPL 3.0 license. For a while I had a dedicated website for Sonic Field. To be honest, the combination of Google Drive and blogger makes more sense now.

Sonic Field can be downloaded from here:

https://drive.google.com/folderview?id=0BwiW6ZSSu_DEVmZfbW1Qa1o5Q1U&usp=sharing


I hope to add a good examples section in the near future.

Thursday 26 December 2013

Phase Modulation Synthesis Explanation And Example

We hear a lot about FM for synthesis - but the truth is that most 'FM' is not Frequency Modulation at all - it is Phase Modulation.

...The full example patch is at the bottom of this post...

The reason is simply that Phase Modulation can produce much the same effects as Frequency Modulation but it is much easier to get correct. Let us look into why this is by first having a quick look at what Frequency and Phase modulation actually are. I am going to start at the well-read amateur level here.
  • As we know - a note has a pitch.
  • Another word for pitch is (ish) frequency. Frequency is more precise in its usage than pitch so we shall use it instead.
  • So, A4 is 440Hz which is a frequency.
  • A pure note is a sine wave. This means the pressure waves in air for that note follow a sinusoidal shape. Equally, the voltage in an amplifier or synthesiser circuit will follow a sinusoidal shape (again - ish).
  • A pure note has just one frequency in it.
  • However, we want to create non pure tones (as pure ones are boring in general). If you don't believe me that a pure note is boring, or think that something like a flute makes a pure note - think again: http://www.soundonsound.com/sos/oct03/articles/synthsecrets.htm. Even a flute has a very complex tone.
  • We can make rich tones in many ways, but Frequency Modulation of a pure tone is one of the more interesting.
Frequency Modulation Step By Step

So we have a pure tone and we slowly move its frequency down a bit, then up a bit, and back to the middle. What we get is vibrato (tremolo being the amplitude equivalent). Vibrato is technically frequency modulation, but not in the sense that synthesis nerds talk about FM. The interesting stuff happens when we modulate the frequency of a tone up and down at a rate similar to the frequency of the tone itself. In other words, the tone (carrier) and the modulation are both in the audible range.

Frequency modulating (let's drop the caps for now) a tone of 1000Hz with a tone of 1100Hz will produce a tone with tones inside it rather than vibrato. In this case we will get 1000, 2100, 3200 and so on. There are also things which are called (sorry - I know it sounds confusing) negative frequencies. Now, negative frequencies cannot actually exist in sound, though they are important in the mathematical modelling of sound. In real sound they simply appear as positive frequencies.
A bit confused? Here is a simple way of looking at it:
  • We take a fundamental of frequency X.
  • We frequency modulate it with frequency Y.
  • We get new frequencies alongside X; these are X+Y, X+2Y, X+3Y etc.
  • We also get X-Y, X-2Y, X-3Y etc.
  • However, the ones where we take some multiple of Y away from X might end up negative.
  • So, if X=1000 and Y=1100 then X-Y=-100Hz.
  • Negative frequencies don't exist in sound so we actually get the positive equivalent - e.g. 100Hz.
This means that 1000Hz frequency modulated with 1100Hz will give:
  1. 100,
  2. 1000,
  3. 1200,
  4. 2100,
  5. 2300,
  6. 3200
  7. ...
Consequently, what we are left with is the original (fundamental) and a mixture of inharmonic overtones (and under-tones which we often remove via filtering).  We will also notice that the contribution of each inharmonic overtone reduces as the frequency goes up.
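The sideband arithmetic above is easy to check for yourself. Here is a quick sketch in plain Python (this is my own illustration, not Sonic Field code) which lists the frequencies X ± nY and reflects the negative ones back to positive, just as real sound does:

```python
def pm_sidebands(carrier, modulator, n_max):
    """List the sideband frequencies carrier +/- n*modulator for
    n up to n_max. Negative frequencies are reflected back to
    positive, as they are in real sound."""
    freqs = set()
    for n in range(n_max + 1):
        freqs.add(abs(carrier + n * modulator))
        freqs.add(abs(carrier - n * modulator))
    return sorted(freqs)

print(pm_sidebands(1000, 1100, 3))
```

Running it for 1000Hz modulated with 1100Hz reproduces the list above: 100, 1000, 1200, 2100, 2300, 3200 and so on.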

What does all this sound like?
Well, the 1000Hz modulated with 1100Hz example will sound quite bell like. But the truth is that frequency modulation will make pretty much any tone you want (especially with a bit of filtering). This is why it is so powerful. Other techniques like wave shaping and distortion will produce just the harmonic overtones of the fundamental. Amplitude modulation will produce inharmonic over and under-tones, but only one pair. Ring modulation is similarly more limited than FM. Nothing (other than additive synthesis, which can do anything in theory) is quite as powerful at creating rich sounds.

But why still use it when we have additive synthesis? I mentioned additive synthesis. As I say, this can produce any sound; so why bother with anything but additive synthesis? Why go to the effort of creating FM? The reasons are two fold. Firstly, creating a really complex rich sound with additive synthesis is computationally expensive and therefore still (even with modern technology) poses a major challenge to synthesis engines. Just creating a single waveform with additive synthesis is not so bad, but a complex evolving tone is quite a challenge. The second reason is very human: creativity is iterative. We make a sound, we like it, we tweak it, we listen, we get inspired, we change it and we keep going. Frequency modulation gives a jaw-dropping capacity to shape the resulting sound with only a few parameters to tweak, which makes it ideal for creative exploration.

From Bells To Clarinets
By controlling the difference in frequency between the modulation and the fundamental, and the amount of modulation (sometimes called the modulation index), we get all sorts of different tones. Here are some example spectrograms. I have not filtered these, so two have the low under-tones in them:
Modulation of a fundamental of 440Hz with 880Hz producing a spectrum similar to that of a clarinet. Because the negative frequencies coincide with the positive ones we do not see them as separate lines on the spectrogram.
Modulation of 440Hz with 440.88Hz producing a spread harmonic spectrum similar to a low pitched piano note or a guitar string.
A classic bell tone modulating 440Hz with 528Hz.
The above notes all had an ADSR envelope applied, which gives the characteristic intensity shape seen left to right.

From Frequency To Phase
My examples above are a cheat because they were not produced with frequency modulation; I used phase modulation. Here is the Sonic Field patch which created these tones:

{
    (
        (0,0),
        (?a,1),
        (?d,0.5),
        (?s,0.1)
        (?r,0.0)
    )NumericShape !env


    (
        ?frequency,
        (
            ((1,(?r,(?frequency,?frequency-spacing)*)ExactSinWave)DirectMix,?modulation-amount)NumericVolume
        )Mix
    )PhaseModulatedSinWave !signal
    
    (
        >signal,
        >env
    )   Multiply Normalise,
}!play-bell-inner

 64  !a
128  !d
2024 !s
4096 !r
440  !frequency

0.15 !modulation-amount
1.2  !frequency-spacing
((?play-bell-inner Do),"temp/bell-tone-example.wav")WriteFile32

0.35   !modulation-amount
1.002  !frequency-spacing
((?play-bell-inner Do),"temp/string-tone-example.wav")WriteFile32

0.5  !modulation-amount
2.0  !frequency-spacing
((?play-bell-inner Do),"temp/clarinet-tone-example.wav")WriteFile32

It really does not matter if that is meaningless to you! However, we can see some key points in there:
(
        ?frequency,
        (
            ((1,(?r,(?frequency,?frequency-spacing)*)ExactSinWave)DirectMix,?modulation-amount)NumericVolume
        )Mix
    )PhaseModulatedSinWave !signal

The above is the piece of the patch which actually does the modulation. It modulates the fundamental (given by the value ?frequency) by a sine wave at the modulation frequency which is given by ?frequency-spacing * ?frequency.

Also we have:
0.15 !modulation-amount
1.2  !frequency-spacing
((?play-bell-inner Do),"temp/bell-tone-example.wav")WriteFile32

0.35   !modulation-amount
1.002  !frequency-spacing
((?play-bell-inner Do),"temp/string-tone-example.wav")WriteFile32

0.5  !modulation-amount
2.0  !frequency-spacing
((?play-bell-inner Do),"temp/clarinet-tone-example.wav")WriteFile32

This is where the three tones are produced. Different spacings between the fundamental and the modulation, and different amounts of modulation, are all that is required to produce completely different spectra and amazingly different sounding notes.
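The patch derives the modulation frequency from the fundamental as ?frequency-spacing * ?frequency, which ties these settings back to the spectrograms earlier. A trivial Python sketch (illustrative only; the function name is my own) shows the arithmetic:

```python
def modulator_hz(frequency, frequency_spacing):
    """The modulation frequency as the patch computes it:
    ?frequency-spacing * ?frequency."""
    return frequency * frequency_spacing

# The three tones rendered above, all from a 440Hz fundamental:
bell     = modulator_hz(440, 1.2)    # about 528Hz  - the classic bell
string   = modulator_hz(440, 1.002)  # about 440.88Hz - the string-like spread
clarinet = modulator_hz(440, 2.0)    # 880Hz - the clarinet-like spectrum
```

So one number, the spacing, moves us between the bell, string and clarinet spectra shown earlier.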

Frequency modulation is the moving of the fundamental frequency up and down. But that is quite hard to get right. The slightest numerical error and what we hear is the note going off pitch. To get frequency modulation correct we need to exactly balance the up and down parts, which can prove quite hard in practice. Fortunately, there is another property of a pure sine wave we can mess with which has the same effect for synthesis but is easier to work with.

Phase:
Again, this gets a little tricky to explain without a little mathematics. I will try really hard to make it as simple as possible! Let us look at the sine wave:
This image is Wiki Commons - see here: http://en.wikipedia.org/wiki/File:Sine_curve_drawing_animation.gif
If we imagine a spot of paint on a wheel at 3 o'clock and we roll that wheel along the ground, then just look at the height of the spot, we get a sine wave.

As the wheel rotates, we can see there is an angle between where the spot started, the centre of the wheel and where the spot is currently. For every value of this angle there is a value of the sine wave height.

This is effectively how the sine wave generator in a digital synthesiser works. It takes an angle which continually increases from 0 to 360 degrees (usually this is measured in radians but that is a detail) and for each tiny incremental step in the angle it looks up the value of a sine wave. Repeating this continuously produces a continuous sine wave which is what we call a 'Digital Oscillator'.
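The loop described above can be sketched in a few lines of Python (an illustration of the idea, not Sonic Field's actual implementation):

```python
import math

def digital_oscillator(frequency, sample_rate, n_samples):
    """Phase-accumulator sine oscillator: the angle advances by a
    fixed increment each sample and we look up the sine of it."""
    increment = 2 * math.pi * frequency / sample_rate
    phase = 0.0
    samples = []
    for _ in range(n_samples):
        samples.append(math.sin(phase))
        phase += increment
        if phase >= 2 * math.pi:  # wrap to keep precision over long runs
            phase -= 2 * math.pi
    return samples
```

Repeating that loop for ever gives the continuous sine wave of a digital oscillator.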

To phase modulate the digital oscillator we simply add a varying amount to the phase. This can change the instantaneous pitch but not the fundamental frequency. OK, the mathematics here is a little more complex - feel free to stop reading this paragraph now.... Still here? OK, modulating the phase is the same as altering the first differential of the wave form. But the first differential of a sine wave is a cosine wave, which has the same shape. So, from the human perception point of view, except for instantaneous effects (transients - which I am not discussing here) phase and frequency modulation are exactly the same. The advantage is that a slight asymmetry in the modulation does not affect the fundamental: the asymmetry in the rate of change required to make the modulation asymmetrical cancels out the net effect.
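To make the difference concrete, consider the worst kind of modulator error: a constant (DC) offset. In frequency modulation that offset is integrated into the phase sample after sample and permanently detunes the note; in phase modulation it only slides the waveform sideways. A small Python sketch (my own illustration, not Sonic Field code) shows this by counting upward zero crossings, which approximate the pitch:

```python
import math

def zero_crossings(samples):
    """Count upward zero crossings - roughly the frequency in Hz
    if the samples cover exactly one second."""
    return sum(1 for a, b in zip(samples, samples[1:]) if a < 0 <= b)

def fm_with_dc(carrier, dc, rate, n):
    # FM: the DC error is added to the frequency, so it accumulates
    # into the phase on every sample.
    phase, out = 0.0, []
    for _ in range(n):
        phase += 2 * math.pi * (carrier + dc) / rate
        out.append(math.sin(phase))
    return out

def pm_with_dc(carrier, dc, rate, n):
    # PM: the same DC error only adds a constant to the phase.
    return [math.sin(2 * math.pi * carrier * i / rate + dc)
            for i in range(n)]

# One second of a 100Hz tone with a 5Hz DC error in the modulator:
fm = zero_crossings(fm_with_dc(100, 5, 8000, 8000))  # roughly 105 - detuned
pm = zero_crossings(pm_with_dc(100, 5, 8000, 8000))  # roughly 100 - in tune
```

The FM version plays sharp by the full size of the error; the PM version stays at 100Hz.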

To prove phase modulation does not affect the fundamental even when the modulator is asymmetrical, I will modulate 440Hz with a rectified sine wave.

The same 440 modulated by 528 tone as above but with the 528Hz sine wave rectified to make an asymmetric wave form.
We can clearly see that whilst the modulated signal now has many more, stronger inharmonics in it, the fundamental frequency has not changed. 

Here is an example of piece of music created using phase modulation:

[Here is the patch for the above]

{
    (
        (0,0),
        (?a,1),
        (?d,0.5),
        (?s,0.1)
        (?r,0.0)
    )NumericShape !env

    (
        (0,0),
        ((?a,2)/,1),
        (?d,0),
        (?r,0.0)
    )NumericShape !hitEnv
    
    (
        (?d WhiteNoise,(?d,?pitch)ExactSinWave MakeSquare)Mix pcnt+50,
        ?hitEnv
    )Multiply !hit
    (>hit,(?pitch,2)*,2)ButterworthLowPass !hit

    (
        ?pitch,
        (
            ((1,(?r,(?pitch,?pitch-spacing)*)ExactSinWave)DirectMix,?ring-amount)NumericVolume
        )Mix
    )PhaseModulatedSinWave !signal
    
    (
        (
            >signal,
            >env
        )   Multiply Normalise,
        ?hit
    )Mix Normalise 
}!play-bell-inner

{
    ?play-bell-inner Do        !s1
    (>pitch,1.005)*            !pitch
    ?play-bell-inner Do dbs-16 !s2
    (>s1,>s2)Mix Normalise !signal
    (
        (?dullness,0)gt,
        {
            (>signal,?pitch,?dullness)BesselLowPass  Normalise !signal
            (>signal,?pitch,2)ButterworthHighPass    Normalise !signal
        },{}
    )Choose Invoke
    >signal
}!play-bell

{
    (?velocity,1.5)**         !volume

    ?play-bell Do !signal
    (60000,?length)+ !length  
    {?length WhiteNoise}Do                                    !reverb
    {((>reverb,3.0)Power Normalise,10,1)ButterworthLowPass}Do !reverb
    {((0,-99),(50,0),(?length,-60))SimpleShape}Do             !renv
    {(>reverb Normalise,>renv)Multiply}Do                     !reverb
    
    {(>reverb,?r Silence)Concatenate}Do    !reverb
    {(>signal,?length Silence)Concatenate}Do !signal !signal-dry

    ?signal Magnitude !mag     
    {>signal FrequencyDomain}Do    !signal
    {>Reverb FrequencyDomain}Do    !reverb
    (?signal,?reverb)CrossMultiply !signal
    >signal TimeDomain !signal
    ?signal Magnitude !newMag 
    (>signal,(>mag,>newMag)/)NumericVolume !signal-wet 
    
    (>signal-wet,>signal-dry)Mix Normalise              !signal
    ((>signal,?pitch,1)ButterworthHighPass Normalise,(?volume,(0.25,Random)*)+)NumericVolume
     
}!reverb-bell

{
    Bunch !notes
    0     !count
    0     !prev-high
    (
        ?track,
        {
            ^tickOn ^tickOff ^note ^key ^velocity
            ("Note ",?count)Println
            (
                (?count,?notesToPlay)lt,
                {
                    [ Set up the note ]
                    (?tickOn,?beat)*                  !at
                    ((?tickOff,?tickOn)-,?beat)*      !length
                    (Semitone,?key)**                 !multi
                    (?baseSound,>multi)*              !pitch
                    (>pitch,2)/                       !pitch
                    (>velocity,100)/                  !velocity
                    
                    [ Play the note ]
                    ((?voice Do,?at),>notes)AddEnd    !notes
                },{}
             )Choose Invoke
             (>count,1)+ !count 
         }
     )InvokeAll
     >notes MixAt Normalise
}!play

128   !pitch
1.2   !pitch-spacing
  1   !dullness
0.15  !ring-amount
32    !a
128   !d
2048  !s
8192  !r

?reverb-bell !voice
"C0" Note    !baseSound
 48.00       !beat
 
"temp/chpn-p6.mid" ReadMidiFile 
^t1
^t2
^t3

?t3  !track
999  !notesToPlay

?play Do !left
?play Do !right


((?left,?right),"temp/bell-preb.wav")WriteFile32 

Saturday 21 December 2013

Chopin Prelude No 4

Below is a render of Chopin Prelude No 4 by Sonic Field.

It was done using much the same patch as that for Requiem For Peace; however, in this case the left hand notes were more gentle and this led to the contamination of the attack with a slight high frequency click sound. To round this out I added some Bessel filtering just in the attack of the left hand notes (patch for this after the video). I used a Bessel filter to make sure there were no significant phase issues remixing the filtered and non filtered signals.
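The idea in the patch below is a crossfade: during the attack the low passed (click free) version of the signal dominates, then the unfiltered signal takes over. A plain Python sketch of the same crossfade (illustrative only; the real work is done by the Sonic Field patch that follows):

```python
def soft_attack(dry, filtered, fade_len):
    """Crossfade from the filtered signal to the dry one over the
    first fade_len samples, leaving the rest of the note untouched."""
    out = []
    for i, (d, f) in enumerate(zip(dry, filtered)):
        w = min(i / fade_len, 1.0)  # 0 at the start, 1 after the fade
        out.append(f * (1.0 - w) + d * w)
    return out
```

Because the two versions are mixed sample by sample, any phase shift between them would cause cancellation during the fade - hence the choice of a Bessel filter, which is close to linear phase in the pass band.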


[ this patch removes clicks from the 
  attack phase of notes when soft-attack is true
]
    (
        ?soft-attack,
        {
            (
                (
                    (
                        (0,1),
                        (125,0)
                    )NumericShape,
                (?signal,(?pitch,7)*,6)BesselLowPass
                )Multiply,
                (
                    (
                        (0,0),
                        (125,1),
                        (?signal Length,1)
                    )NumericShape,
                   ?signal
                )Multiply
            )Mix !signal
        },{
        }
    )Choose Invoke

Friday 20 December 2013

Requiem For Peace

Here is another patch which I used to render the Moonlight Sonata.

This piece uses a similar percussive string sound to some of my other pieces. It has some very vague similarity to a piano (hence the silly name solar-piano) but in reality it sounds nothing like a piano. What it does have is a complex release. The signal is frequency and amplitude modulated by slow frequency noise. Whilst this produces a potentially pleasing and somewhat wistful sound for single notes, its effect on harmonies is to paint a complex soundscape just from the overlap of signals.

Because this sound generator is therefore good at filling space between the attacks of notes, it allowed me to slow the music down around 5:1. A hand-made reverberation impulse response then helped further by smoothing everything into a near realistic but overly wet space.

The patch is below this video:


[
  Reverberator
  ============
]
{
    ?signal Magnitude !mag
    (>signal,?grain-length Silence)Concatenate !signal   
    (
        (?mag,0)Eq,
        {
        },
        {
            (>signal,?grain-length Silence)Concatenate !signal   
            >signal FrequencyDomain  !signal
            (>convol,>signal)CrossMultiply  !signal
            >signal TimeDomain !signal
            ?signal Magnitude !newMag
            (>signal,(>mag,>newMag)/)NumericVolume !signal
            [ tail out clicks due to amplitude at end of signal ]
            (
                (
                    (0,1),
                    (100,1),
                    ((?signal Length,100)-,1),
                    (?signal Length,0)
                )NumericShape,
                >signal
            )multiply !signal
        }
    )Choose Invoke
    >signal
}!reverb-inner

{
    ?convol Length !grain-length 

    (>convol,?grain-length Silence)Concatenate FrequencyDomain  !convol
    (>signal,?grain-length Silence)Concatenate !signal
    Bunch !out
    (
        (>signal,?grain-length)Granulate,
        {
                ^signal ^time
                ((?reverb-inner Do,>time),>out)AddEnd !out
        }
    )InvokeAll
    >out MixAt Normalise
}!reverb

{
    (100,?frequency)SinWave MakeSawTooth !filter
    (>filter,?length,1)ButterworthLowPass !filter
    ?length WhiteNoise !signal
    ((0,0),(?attack-a,0.25),(?attack-b,  1),((?length,0.8)*,  1),(?length,0))NumericShape !sinEnv
    (>signal,?sinEnv)Multiply                  !signal
    (>signal,>filter)Convolve Normalise        !signal
    (>signal,2000,5)ButterworthLowPass         !signal
    (>signal,?frequency,2)ButterworthHighPass  !signal
    (>signal,?frequency,1)ButterworthLowPass   !signal
    (((0,0),((?length,0.75)*,1),(?length,0.5))NumericShape,(?length,2.5)SinWave)Multiply !vvib
    (1,>vvib pcnt+50)DirectMix                 !vvib
    (>signal,?vvib)Multiply Saturate Normalise !signal
    (>signal,0.6,0.5,?frequency Period)ResonantFilter Normalise !wave
    (
        -0.03,0.2,0,-1,0.2,2,
        ?wave
    )WaveShaper Normalise      !signal
    (>signal,?sinEnv)Multiply  !signal
    ((0,64),((?length,0.25)*,?frequency),(?length,64))NumericShape                      !lower
    ((0,?frequency),((?length,0.25)*,(?frequency,4)*),(?length,?frequency))NumericShape !upper
    (>signal,>lower,>upper,2)ShapedButterworthBandPass Normalise !signal
    (>signal,?volume)Volume
} !flute

{
    ((?attack,2)*,?pitch)ExactSinWave MakeSquare !hit
    (?hit,(?pitch,2)*,4)ButterworthLowPass       !hit
    (
        (0,1),
        (45,0.5),
        (75,0)
    )NumericShape !hit-env
    (
        >hit,
        >hit-env
    )Multiply Normalise !hit 
    
    (>hit,?hit-strength)Volume !hit
    
    (?attack,?decay,?sustain,?release)+ !length
    (
       (?length,?pitch)ExactSinWave      dbs+3,
       (?length,(?pitch,2)*)ExactSinWave dbs-12, 
       (?length,(?pitch,3)*)ExactSinWave dbs-18
    )Mix                                !note
    
    (
        (0,0),
        (?attack,1),
        ((?attack,?decay)+,0.5),
        ((?attack,?decay,?sustain)+,0.1),
        (?length,0)
    )NumericShape !note-env
    (
        >note,
        ?note-env
    )Multiply Normalise !note 
    (
        1,
        0,
        1,
        1,
        4,
        10,
        >note
    ) WaveShaper Normalise !note
    (
        0,
        0,
        1,
        1,
        4,
        10,
        >note
    ) WaveShaper Normalise !note
    
    (>note,?pitch,1)ButterworthHighPass Normalise !note
    (>note,1500,2)ButterworthLowPass    Normalise !note
    
    (
        ((0,5),((?attack,?decay,?sustain)+,1))Slide,
        ?note-env
    )Multiply !twang
    (?twang,?release silence)Concatenate !twang
    
    ( 1,>twang  pcnt+20)DirectMix !twang
    (
        >twang,
        >note
    )Multiply !note
    
    (
        >hit dbs+6,
        >note
    )Mix !note
    
    (?note,((0,1.001),(?note length,0.999))NumericShape)Resample !note-1
    (?note,((0,0.999),(?note length,1.001))NumericShape)Resample !note+1
    (>note,>note-1 pcnt-50,>note+1 pcnt+50)Mix Normalise !note
    
    64 !base
    (
        (?note,110,2)ButterworthLowPass Normalise,
        (
            (-7,?base         Period),
            (-7,(?base,1.25)* Period),
            (-7,(?base,1.50)* Period),
            (-7,(?base,1.75)* Period)
        ),
        -20
    )MultipleResonantFilter Normalise !sounding-board
    (>sounding-board,110,2)ButterworthLowPass Normalise !sounding-board 
    ((0,0),(125,0),(500,1),((?note length,250)-,1),(?note length,0))NumericShape !damper
    (>sounding-board,25,4)ButterworthHighPass !sounding-board
    (
        >sounding-board,
        ?damper
    )Multiply Normalise !sounding-board 
    (>sounding-board,25,4)ButterworthHighPass !sounding-board
    (
        >sounding-board,
        ?damper
    )Multiply Normalise !sounding-board 
    (>sounding-board,?pitch,1)ButterworthHighPass !sounding-board
    (250,?sounding-board length,>sounding-board)Cut !sounding-board
   
    (
        >sounding-board,
        >note
    )Mix Normalise !note
    
    (
        (
            250 Silence,
            (?damper,?note)Multiply Saturate Normalise,
        )Concatenate,
        (
            (-3,(?pitch,0.05)* Period),
            (-6,(?pitch,0.2)* Period),
            (-8,(?pitch,0.25)* Period),
            (-8,(?pitch,0.333)* Period),
            (-9,(?pitch,0.5)* Period),
            (-6,(?pitch,1.5)* Period),
            (-6,(?pitch,2.0)* Period) 
        ),
        -80
    ) MultipleResonantFilter Normalise        !res
    (0,?length,>res)Cut               !res
    (
        0,
        0,
        1,
        1,
        4,
        10,
        >res
    ) WaveShaper Normalise !res    
    (
        >note,
        >res Normalise pcnt+15
    )Mix !note
    (?note,(?pitch,2)*,1)ButterworthLowPass Normalise !note    
    
    (>note,500 Silence)Concatenate !note
    
    [ Mix in the characteristic white noise ]
    (
        (
            1,
            (
                (
                    ?note length WhiteNoise,
                    ?pitch,
                    4
                )ButterworthHighPass
                ,(?pitch,1.25)*
                ,4
            )ButterworthLowPass Normalise pcnt+5
        )DirectMix,
        >note
    )Multiply !note
    (>note,15,2)ButterworthHighPass     !note

    (
        (0,0),
        (500,0),
        (1000,1),
        (?note length,1)
    )NumericShape !note-env
    
    (
        >note-env,
        ?note
    )Multiply !sample

    (>sample,25,4)ButterworthHighPass !sample
    ((?sample Length WhiteNoise,500,6)ButterworthLowPass,0.01)DirectResample Normalise !multi
    (>sample,(1,>multi)DirectMix)Multiply !sample

    >sample !signal
    
    [
    (
        (
            ((2000 WhiteNoise,2)Power,2000,1)ButterworthLowPass Normalise,
            ((0,0),(80,0),(90,1),(2000,0))NumericShape
        )Multiply,
        (
            ((5000 WhiteNoise,3)Power,1000,2)ButterworthLowPass Normalise,
            ((0,0),(125,0),(135,1),(5000,0))NumericShape
        )Multiply
    )Mix Normalise !convol
    ]
    
    (
        ?reverb Do pcnt+50,
        !sample
    )Mix

    ((?sample Length WhiteNoise,500,6)ButterworthLowPass,0.01)DirectResample Normalise !multi
    (>sample,(1,>multi dbs-6)DirectMix)Multiply !sample
    (
        0,
        0,
        1,
        1,
        4,
        10,
        >sample
    ) WaveShaper Normalise !sample
    (
        (((0,0.125),(4000,0.25),(?sample Length,1))NumericShape,>sample)Multiply,
        >note
    )Mix Normalise !note

    
    (
        ((?note,1000,4)ButterworthHighPass,0.87)Power dbs-6,
        >note
    )Mix Normalise !note

   ( >note,?volume)NumericVolume 
        
}!solar-piano

{
    (?velocity,1.5)**         !volume
    (((?velocity,1)+,2)**,1)- !velocity
    (
        (?velocity,0.5)lt,
        {
            120 !attack
            400 !decay
        },{
             50 !attack
            400 !decay
        }
    )Choose Invoke 
        
    (?decay,(?length,4.5)/)+   !sustain
        
    ?length        !release
    ?velocity      !hit-strength

    [bass boost]
    (
        (?pitch,220)lt,
        {
            (>hit-strength,0.5)+ !hit-strength
            (?volume,0.33)+      !volume
        },{
        }
    )Choose Invoke
    
    (
        "C",?count,
        "P",?pitch,
        "V",?volume,
        "H",?hit-strength,
        "A",?attack,
        "D",?decay,
        "S",?sustain,
        "R",?release
    )Println
    ?solar-piano Invoke !signal

    >signal
}!play-solar

{
    Bunch !notes
    0     !count
    0     !prev-high
    (
        ?track,
        {
            ^tickOn ^tickOff ^note ^key ^velocity
            (
                (?count,?notesToPlay)lt,
                {
                    [ Set up the note ]
                    (?tickOn,?beat)*                  !at
                    ((?tickOff,?tickOn)-,?beat)*      !length
                    (Semitone,?key)**                 !multi
                    (?baseSound,>multi)*              !pitch
                    [(>pitch,2)/                       !pitch]
                    [(>pitch,6)/                       !pitch]
                    (>velocity,100)/                  !velocity
                    (>length,1.5)*                    !length
                    [ Play the note ]
                    ((?play-solar Do,?at),>notes)AddEnd    !notes
                },{
                }
             )Choose Invoke
             (>count,1)+ !count 
         }
     )InvokeAll
     >notes MixAt Normalise
}!play

"C0" Note !baseSound
  8.00    !beat
2000      !shortCut
9999      !notesToPlay

"temp/ml1.mid" ReadMidiFile 
^t1
^t2

?t2 !track
"temp/Solar-Piano-Internal.wav" Readfile ^cLeft ^cRight

>cLeft !convol
?play Do !left

>cRight !convol
?play Do !right

[
   Post
   ===
]
"Post Processing" PrintLn
((?left,?right),"temp/moon-mix.wav")WriteFile32 

"temp/moon-impulse-b.wav" ReadFile ^revl ^revr
"temp/moon-mix.wav" ReadFile ^left ^right 

?left  !signal >revl !convol ?reverb Do Normalise !wleft
?right !signal >revr !convol ?reverb Do Normalise !wright
(
    >wleft,
    >left
)Mix Normalise !left
(
    >wright,
    >right
)Mix Normalise !right

((>left,>right),"temp/moon-post-b.wav")WriteFile32 

Wednesday 18 December 2013

Example: Liquid Sound

Here is the patch which produced Liquid Sound:



"temp/in.mid" ReadMidiFile 
^t1
^t2
^t3
^t4
^t5

?t1 Println
?t2 Println
?t3 Println
?t4 Println
?t5 Println


[
  Reverberator
  ============
]
{
    ?signal Magnitude !mag
    (>signal,?grain-length Silence)Concatenate !signal   
    (
        (?mag,0)Eq,
        {
        },
        {
            (>signal,?grain-length Silence)Concatenate !signal   
            >signal FrequencyDomain  !signal
            (>convol,>signal)CrossMultiply  !signal
            >signal TimeDomain !signal
            ?signal Magnitude !newMag
            (>signal,(>mag,>newMag)/)NumericVolume !signal
            [ tail out clicks due to amplitude at end of signal ]
            (
                (
                    (0,1),
                    (100,1),
                    ((?signal Length,100)-,1),
                    (?signal Length,0)
                )NumericShape,
                >signal
            )multiply !signal
        }
    )Choose Invoke
    >signal
}!reverb-inner

{
    ?convol Length !grain-length 

[    ("Convolving SignalLength:",?signal Length," ConvolutionLength:",?convol Length)Println ]

    (>convol,?grain-length Silence)Concatenate FrequencyDomain  !convol
    Bunch !out
    (
        (>signal,?grain-length)Granulate,
        {
                ^signal ^time
                ((?reverb-inner Do,>time),>out)AddEnd !out
        }
    )InvokeAll
    >out MixAt Normalise
}!reverb

{
    ?length         !outLen
    (?length,1.01)* !length
    {
        ?length Silence !signal
        1 !volume
        ?pitch !harmonic
        (
            (
                (1,1),
                (2.001, 0.5),
                (3.002, 0.1),
                (4.003, 0.25),
                (5.004, 0.05),
                (6.005, 0.125),
                (8.006, 0.0625),
                (9.007, 0.0125),
                (11.009,0.01),
                (13.01, 0.005),
                (15.011,0.0025),
                (19.013,0.0015)
            ),
            {
                ^harmonic ^volume
                ("Pitch",?pitch,"Harmonic",?harmonic,"Volume",?volume,"Length",?length)Println
                (>harmonic,?pitch)* !harmonic
                {
                    (?length,50)/ WhiteNoise          !wave
                    (>wave,100,4)ButterworthLowPass   !wave
                    (>wave, 10,1)ButterworthHighPass  !wave
                
                    (>wave,0.01)DirectRelength        !wave
                    (0,?length,>wave)Cut Normalise    !wave 
                    (1,>wave)DirectMix                !wave
                }!makeWaves

                ?makeWaves Do pcnt+25 !phase
                ?makeWaves Do pcnt+25 !volumeWave
                (>volumeWave,2)Power Normalise        !volumeWave
                (>volumeWave,2)Power Normalise        !volumeWave
                (
                    (
                        (
                            (?harmonic,>phase)PhaseModulatedSinWave,
                            ?volumeWave
                        )Multiply,
                        ?volume
                    )NumericVolume,
                    >signal
                )Mix !signal
                ?length WhiteNoise !noise
                
                (?noise,?pitch,2)ButterworthHighPass !noise
                (?noise,?pitch,2)ButterworthLowPass  !noise
                >noise Normalise !noise
                (?noise,?pitch,2)ButterworthHighPass !noise
                (?noise,?pitch,2)ButterworthLowPass  !noise
                (
                    >noise Normalise   pcnt+5,
                    >signal Normalise  pcnt+95
                )Mix !signal
                                
            }
        )InvokeAll
        >signal Normalise
    
    } !makeTriangle 
    
    ?makeTriangle Do !signal       !root
    (
        (?outlen,2100)gt,
        {
            1000 !point
            ((0,0),(?point,1),((?outLen,?point)-,1),(?outLen,0))NumericShape !venv
        },{
            (?outlen,0.5)* !point
            ((0,0),(100,1),((?outLen,?point)-,1),(?outLen,0))NumericShape   !venv
            
        }
    )Choose Invoke
    
    ((0,0),(?point,1),((?outLen,?point)-,1),(?outLen,0))NumericShape !venv
    (
        ?venv,
        >signal
    )Multiply !signal
    (
        >signal,
        >velocity
    )NumericVolume 
} !chord

{
    Bunch !notes
    0     !count
    (
        ?track,
        {
            ^tickOn ^tickOff ^note ^key ^velocity
            
            (
                (?count,?notesToPlay)lt,
                {
                    [ Set up the note ]
                    (?tickOn,?beat)*                  !at
                    ((?tickOff,?tickOn)-,?beat)*      !length
                    (Semitone,?key)**                 !multi
                    (?baseSound,>multi)*              !pitch
                    (>pitch,2)/                       !pitch
                    (>pitch,6)/                       !pitch
                    (>velocity,100)/                  !velocity
                    (>length,1000)+                   !length
                    ("Note at",?at) Println
                    [ Play the note ]
                    ((?chord Do,?at),>notes)AddEnd    !notes
                },{
                }
             )Choose Invoke
             (>count,1)+ !count 
         }
     )InvokeAll
     >notes MixAt Normalise
}!play

"C0" Note !baseSound
 75.00    !beat
2000      !shortCut
1000      !notesToPlay

?t2 !track

?play Do !left
?play Do !right
[(?left TrimSilence, ?right TrimSilence )StereoMonitor]
"temp/reverba.wav" ReadFile ^revl ^revr

?left  !signal >revl !convol ?reverb Do Normalise !wleft
?right !signal >revr !convol ?reverb Do Normalise !wright

"temp/reverbb.wav" ReadFile ^revl ^revr

?left  !signal >revl !convol ?reverb Do Normalise !vwleft
?right !signal >revr !convol ?reverb Do Normalise !vwright

{
    (
        >wleft  pcnt+50,
        >vwleft pcnt+20,
        >left   pcnt+30
    )Mix Normalise
} Do !left

{
    (
        >wright  pcnt+50,
        >vwright pcnt+20,
        >right   pcnt+30
    )Mix Normalise
}Do !right

"Done" Println
((>left,>right),"temp/temp1.wav")WriteFile32

Saturday 7 December 2013

Where Is Sonic Field

I have been remiss in not posting at all for months.

Sonic Field is live and kicking. I did not bother renewing the website for it as the traffic was so woefully light. However, the project lives and the source is backed up into Google's Drive system.

Progress over the last few months:

1) The memory manager is working very much better now and I found a near perfect set of GC settings so that Sonic Field no longer swaps out data which is about to be garbage collected anyway. I guess it still does this a bit - but nothing like as much as it did.

2) Better memory management has caused me to be able to go back to double precision mathematics throughout.

3) Point 2 was very important for the next big improvement, which is the addition of an FFT engine. Now, I promised never to put FFT in Sonic Field because so many of the effects produced in the frequency domain sound so artificial; the whole business of FFT processing breaks the analogue synth' model of Sonic Field. However, there are some effects which are just impractical in the time domain; the most important of these is convolution. I have a granular convolution reverb system running now and it brings Sonic Field's reverbs up to the leaders in the field for the first time.

4) More music: Now I am working only one contract (I had a time working 2 and that was exhausting) I am putting more time into composing. I also indulged in rendering someone else's music. That is not something I normally like to do any more. In this case, I did it as a challenge to learn more about how to handle classical music in Sonic Field and how to render string sounds better. The result is on youtube:

Saturday 23 February 2013

How Will Oracle Monetise Java?

Jane Seymour - maybe she can tell Java's future?
This image is Creative Commons.
Sun made money from Java by accident - Oracle is much more systematic than that.

Sun made money from Java because people used Java, and if you are running Java then Solaris is the way to go - yes? That made sense then; it does not now. This is for many reasons, not least of which is Linux; however, the key reason is that Oracle expects each business unit to stand on its own feet (or at least that is what an ex senior Oracle manager once told me).

 Oracle has End Of Life'd Java 6 - all support and patches must now be paid for. 

How can Java make money for Oracle? 
"They give it away for free - surely that is no way to run a business." Some people feared that Oracle would stop giving it away for free - but that is not their game at all (though I will come back to this later). They are acting much more like Dr. Kananga: give the product away, destroy the competition and then charge for it. 

Does Oracle charge for Java? Yes - rather a lot by all accounts. Now that they have managed to get Java security center stage, with thousands of free column inches plus countless Reddit rants discussing security FOR FREE, they have created a license to print money.

Oracle is running a protection racket - but no-one minds!

Don't get me wrong, I am not dissing Oracle here. If you believe doctors should work for free and the common good, then you can go die somewhere else. Equally, if you think Oracle should support your software for ever, for free, just go look at what happened to Sun. Nope - now that support is center stage as a Java must-have, why not use it to fund the whole project? 

Customers who do not upgrade need to pay maintenance for security patches, simples.

It gets better though. There is a positive feedback loop which makes Java better.

Oracle's plan is to use support revenue to fund the whole Java project and so make Java better - quite clever really.

The plan is simple:

  1. Make sure customers feel worried about not having support, especially security patches.
  2. Release a new version of Java every year.
  3. End Of Life older versions after the new version has been out for a year. This means - no free security or performance patches.
  4. Then, if you do not upgrade your version of Java every year, you will need to pay for support - and having support is the only sensible way to go, security-wise.
  5. If you are a large organisation, upgrading Java is actually very expensive. Much better to pay Oracle and not have to do it. If it works - don't fix it.
Hey, Oracle have realised that fear is the ultimate method of control. From control comes compliance and obedience. Much as the 'terrorist threat' has been used by governments the world over to strip agency from the populace and erode democratic freedoms, Oracle is using fear to drive commercial Java users to their checkbooks like so many willing sheep to the slaughter. 

Does it matter?

No - it is brilliant! Oracle could have started to charge for Java itself. That would have caused the whole ecosystem to collapse like a house of cards in a stiff breeze. This approach is so much better.
  1. It funds Java development.
  2. It gives Oracle a great reason to release new versions every year - the yearly cycle is what keeps the funding, and so Java, moving forward.
  3. It only targets large organisations which are reluctant (for good reasons) to keep upgrading. Hey - they should be getting proper support anyhow. Would you want your bank account to run on an unsupported platform?
  4. It makes Java stand on its own as a capitalist project rather than some waffly 'community project for the greater good' nonsense.

Where does this leave Oracle's Java tooling?
Selling seat licenses for development tools has not been a really good money spinner for a while now. The real place to make money is runtime licensing and maintenance. 

This might explain why Oracle seem to be struggling to port JRockit Mission Control to HotSpot. It is still languishing in the Java 6 world of JRockit. They cannot make money out of it and it does not form part of a new Java version. No reason to spend money on it.

Similarly, NetBeans is not likely to move very far very fast either. Yes, Oracle need to keep it alive because otherwise Eclipse would have too much power over the Java ecosystem; but when embedding Chrome into an IDE is news - one can tell the IDE has little new to offer.

Conclusions:
Oracle have learned from MySQL how to monetise a product they give away. MySQL was used as a low end product. Java is a high end product, equivalent to COBOL in many ways, sitting in the beating heart of the world's largest banks and companies. Applying the same techniques to Java will be a much better way to make money than selling addictive drugs to poor Americans in the 1970s.



Saturday 16 February 2013

Automated Large Object Swapping Part II


 

Very Large Breasts
Very large objects can be hard to contain.
This image is public domain.
Here I carry on from Part I to discuss fast serialisation.

Note - this is a cross post from Nerds-Central

Writing large objects to disk takes a lot of time, so we might think that efficient serialisation hardly matters. In practice, it matters a great deal. Java does not allow us to treat a block of memory which stores an array of floats (or any other non-byte type) as though it were a block of bytes. Java's read and write routines use byte arrays; this results in copying to and from byte arrays. The approach I took was to move the contents of Sonic Field's float arrays into a byte array as quickly as possible and then set the references to the float array to null, so making it eligible for garbage collection.

Now - it is possible (I did try it) to serialise the data from the large objects to the swap file 'in place' and avoid the byte array intermediate. However, methods like RandomAccessFile.writeFloat and DataOutputStream.writeFloat are so slow (even ObjectOutputStream was) that on a really fast drive the CPU becomes the limiting factor. This is really a consequence of hard drives becoming so very fast these days. The SSD in my MacBook Pro Retina can easily write 300 megabytes per second, and that is not even especially fast by modern standards. Calling DataOutputStream.writeFloat (with a buffering system between it and the drive) takes around a third of a CPU core to write out at 80 megabytes per second to my external USB3 drive. So, if I were to use the SSD in my machine for swap (which I don't, as that is just SSD abuse), the CPU would be the limiting factor.

We need much faster serialisation than Java provides by default!

What is the fastest way to move several megabytes of float data into a byte array? The fastest way I have found is to use the (slightly naughty) sun.misc.Unsafe class. Here is an example piece of code:

setUpUnsafe();
payload = allocateBytes(getReference().getLength() * 4);
unsafe.copyMemory(
    getReference().getDataInternalOnly(), 
    floatArrayOffset, 
    payload, 
    byteArrayOffset,
    payload.length
);

What copyMemory is doing is copying bytes - raw memory with no type information - from one place to another. The first argument is the float array; the second is the position within the in-memory layout of a float[] class where the float data sits. The third is a byte array and the fourth the data offset within a byte array class. The final argument is the number of bytes to copy. The Unsafe class code itself works out all the tricky stuff like memory pinning; so from the Java point of view, the raw data in the float array just turns up in the byte array very, very quickly indeed.

It is worth noting that this is nothing like using byte buffers to move the information over. There is no attempt to change endianness or any other bit twiddling; this is just raw memory copying. Do not expect this sort of trick to work if the resulting byte array is going to be serialised and read into a different architecture (x86 to Itanium for example).
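For comparison, here is a minimal sketch (not Sonic Field's code) of the portable route using java.nio.ByteBuffer. Unlike the raw copy, this pins down an explicit byte order, so the bytes can be read back on a different architecture - at the price of the extra buffer machinery. The class and method names here are illustrative only:

```java
import java.nio.ByteBuffer;
import java.nio.ByteOrder;

// Portable (endianness-aware) packing of float[] into byte[].
// Slower than Unsafe.copyMemory, but safe across architectures.
public class FloatPacker
{
    public static byte[] pack(float[] data)
    {
        ByteBuffer buf = ByteBuffer.allocate(data.length * 4);
        buf.order(ByteOrder.LITTLE_ENDIAN); // fix the byte order explicitly
        buf.asFloatBuffer().put(data);      // bulk copy, no per-element loop
        return buf.array();
    }

    public static float[] unpack(byte[] bytes)
    {
        ByteBuffer buf = ByteBuffer.wrap(bytes);
        buf.order(ByteOrder.LITTLE_ENDIAN);
        float[] out = new float[bytes.length / 4];
        buf.asFloatBuffer().get(out);
        return out;
    }
}
```

The bulk put/get on the FloatBuffer view avoids the per-element cost of writeFloat, though it still cannot match a raw memory copy.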

In Sonic Field, the byte array thus loaded with float data is stored in a Random access file:

ra.seek(position);
ra.writeInt(payload.length);
ra.writeLong(uniqueId);
ra.write(payload);
ra.writeLong(uniqueId);

Performing this call on my external drive at around 80 megabytes per second uses about 4 percent of one core. Scaling that up (80 / 0.04), one core would not saturate until around 2 gigabytes (16 gigabits) per second - which is more like it!

Reading the data back in is just the reverse.


ra.seek(position);
int payloadLen = ra.readInt();
long unid = ra.readLong();
if (unid != this.uniqueId) throw new IOException(/* message goes here*/);
payload = allocateBytes(payloadLen);
if (ra.read(payload) != payloadLen)
{
    throw new IOException(/* Message Goes Here */);
}
unid = ra.readLong();
if (unid != this.uniqueId) throw new IOException(/* message goes here*/);
ret = SFData.build(payload.length / 4);
unsafe.copyMemory(
    payload,
    byteArrayOffset,
    ret.getDataInternalOnly(),
    floatArrayOffset,
    payload.length
);

Note that the length of the data is recorded as an integer before the actual data block. I record the unique ID of the object the data came from both before and after the serialised data. This is a safeguard against corruption or algorithm failure elsewhere in the memory manager.

Setting Up Unsafe
Unsafe is not that easy to set up, especially if you do not also set up a security manager. Here is the code I use:
java.lang.reflect.Field theUnsafeInstance = Unsafe.class.getDeclaredField("theUnsafe"); //$NON-NLS-1$
theUnsafeInstance.setAccessible(true);
Unsafe unsafe = (Unsafe) theUnsafeInstance.get(Unsafe.class);

Also we need to get those offsets within classes:
// Lazy - eventually consistent initialization
private static void setUpUnsafe()
{
    if (byteArrayOffset == 0)
    {
        byteArrayOffset = unsafe.arrayBaseOffset(byte[].class);
    }
    if (longArrayOffset == 0)
    {
        longArrayOffset = unsafe.arrayBaseOffset(long[].class);
    }
    if (floatArrayOffset == 0)
    {
        floatArrayOffset = unsafe.arrayBaseOffset(float[].class);
    }
}

Note that all code from Sonic Field (including all the code on this page) is AGPL3.0 licensed.

Automated Large Object Swapping Part I


 

I don't like the title - says a bird
I don't like that title!
Creative Commons - see here
I hate that title! But it does describe what has been keeping me up nights for a week now. The challenge: make swapping large objects to disk completely automatic, not use too much disk, and be efficient.


I have a system which hits these goals now; it is far too complex for a single Nerds Central post so I will discuss it over a few posts. This is an introduction and a description of the algorithms used.

The challenge comes when Sonic Field is performing very large renders. However, I believe the solution is general purpose. 

The particular piece which caused all the trouble was 'Further Into The Caverns'. You see, I had made a new memory management system to deal with vocoding. The idea was to keep track of all the SFData objects (audio data, in effect) via weak references. When the total amount of live SFData objects went above a particular level, some were written to disk. The simple approach worked on the idea that 'disks are big and cheap, so we will just keep adding to the end of the file'. This worked fine for small renders with big memory requirements, but for long running renders it was not so good. I would have needed over a terabyte of storage for Further Into The Caverns.

Whilst all this is about audio in my case, I suspect the problem is a more general one. I suspect that any large scale computation could hit similar issues and may well benefit from my work. Here is a general description of the challenge.

  1. A program is performing very large computations on very large (multi-megabyte) in memory objects.
  2. Reading and writing objects from disk is very slow compared to storing in memory so should be avoided.
  3. Memory is insufficient to store the peak level of live large objects the program uses.
  4. Disk is large, but the amount of data used by the system will consume all the disk if disk space is not re-used.
  5. The program/system should handle the memory management automatically.
  6. Using normal operating system supplied swap technology does not work well.
The last point is something I tend to see a lot with Java. Because the JVM is one process with a huge heap in which objects are constantly being moved around, regular OS style swapping just cannot cope. A JVM has to fit in RAM or it will thrash the machine.

Trial And Error
I wish I could say I designed this system from scratch and it worked first time. The reality is that many different ideas and designs went by the wayside before I got one which worked well. I will not bore you with all the failures; I believe it is sufficient to say that the approach I took is born of hard knocks and a lot of stress testing.

Object Life Cycle
Sonic Field's model has a particular object life cycle which makes it possible to perform the disk swap technique without a significant performance impact where swapping is not occurring. This is achieved by wrapping large objects in memory manager objects when they are not actively involved in an algorithm's tight loop. Thus the overhead of the extra indirection (of the memory manager) is not incurred inside tight loops:
  1. Creation
  2. Filling with data
  3. Wrapping in memory manager object
  4. In RAM Storage
  5. Retrieval
  6. Unwrapping - retain wrapper
  7. Algorithmic manipulation (read only)
  8. Return to 3 or
  9. Garbage collected
  10. Wrapper garbage collected
The key is that large objects are unwrapped when in use. However, when not actively being used in an algorithm, they are referenced only from the wrapper. The memory manager can store the large object on disk and replace the reference in the wrapper with information on where the object is stored on disk. When the object is requested for an algorithm again, it can be retrieved from disk.
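As a sketch, the wrapper idea might look something like this. The names (`Swappable`, `swapOut` and so on) are mine for illustration, not Sonic Field's actual classes, and the swap-file I/O itself is elided:

```java
// Illustrative wrapper: outside tight loops the large object is reachable
// only through this wrapper, so the memory manager may drop the in-memory
// reference and keep just a swap-file position instead.
public class Swappable
{
    private float[] data;          // null once swapped out
    private long filePosition = -1;

    public Swappable(float[] data) { this.data = data; }

    // Called by the memory manager when heap space is needed.
    public void swapOut(long position)
    {
        // ... write 'data' to the swap file at 'position' here ...
        this.filePosition = position;
        this.data = null;          // large object is now collectable
    }

    // Called just before an algorithm unwraps the object again.
    public float[] get()
    {
        if (data == null)
        {
            // ... read back from the swap file at 'filePosition' here ...
            data = new float[0];   // placeholder for the reloaded data
        }
        return data;
    }

    public boolean isSwappedOut() { return data == null; }
}
```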

When To Store
This was a tricky one! It turned out that the difference between the maximum heap and the heap in use (as reported by the Java Runtime object) is the best trigger for swapping objects out. Just before a new large object is allocated, the memory manager assesses whether the maximum heap, less the current heap use, less the (approximate) size of the new object is below a threshold. If it is, all large objects currently in memory are scheduled for swapping out. The details of this scheduling are complex and I will cover them in another post. There are other points at which the threshold is checked as well, though the pre-allocation check is the most important.
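A minimal sketch of that pre-allocation check, using the Runtime figures just described. The class name, method name and threshold value here are illustrative, not Sonic Field's actual code:

```java
// Sketch of the pre-allocation headroom check described above.
public class SwapPolicy
{
    private static final long THRESHOLD = 64L * 1024 * 1024; // illustrative: 64 MB headroom

    // True if allocating approxBytes would leave less than THRESHOLD of heap free.
    public static boolean shouldSwapOut(long approxBytes)
    {
        Runtime rt = Runtime.getRuntime();
        long used = rt.totalMemory() - rt.freeMemory(); // heap actually in use
        long headroom = rt.maxMemory() - used - approxBytes;
        return headroom < THRESHOLD;
    }
}
```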

Cute Woman Posing In Torn Top
I'm all sort of torn up and fragmentary.
Creative Commons - See Here
Non Fragmenting Swap File Algorithm
All this sounds fine, but it really is not enough. A simple application of this idea was more than capable of consuming an entire 1 terabyte disk! The problem is fragmentation (if you are interested in the subject - here is a good place to start looking).

Let me explain in a little bit more detail:
Consider that I write out ten 1 gigabyte objects. Now my swap file contains 10 slots, each of which is approximately 1 gigabyte long (there are a few header bytes as well). Now - the JVM garbage collector may eventually collect the wrapper object which holds the information about one of these slots. The memory manager detects this when a weak reference to that wrapper returns null from its .get() method. OK! Now we have a slot that is free to be used again.

Unfortunately, this 1 gigabyte slot might well become filled with a 10 megabyte object. We could then create a new slot out of the remaining 990 megabytes, but if we were then to need another full gigabyte, it would have to go on the end of the swap file, thus increasing swap usage.

It turns out that this process of fragmentation just keeps happening. Rather than (as I had originally thought) levelling off, so that the swap file grows asymptotically to a maximum, the reality is that the file just grows and grows until whatever medium it is on fills up. Clearly a better algorithm is required.

The next obvious step is slot merging, whereby if two adjacent slots are found to be empty (their weak references return null) they can be merged to create a bigger slot. There is no doubt this helped, but not enough to run Further Into The Caverns in less than 250G of file space. I became ambitious and wanted to run that render on my MacBook without an external drive, which meant getting the swap down below 100G or so (it has a 250G SSD, and it is not good for SSDs to be filled completely).

So, an even better algorithm was still needed. I could have used one like the buddy algorithm (linked above) or something based on the idea that Sonic Field objects come in similar sizes. However, thanks to the point of indirection between the wrapper objects and the swap file, a near perfect option exists.

Compaction - Simple and Effective
A rubbish compacting lorry
Squash it, move it, dump it.
Creative Commons see here.
Yep - garbage is better off squashed - makes it easier to deal with and move around.

Let us think of the swap file as a set of slots; those with stuff in I will represent as [*] and empty ones (i.e. space which can be reused) as [ ].

[*][ ][*][ ][ ][*]
 1  2  3  4  5  6

If we swap 3 and 2 we get:

[*][*][ ][ ][ ][*]
 1  3  2  4  5  6


Now we can merge 2, 4 and 5:

[*][*][       ][*]
 1  3  2  4  5  6


Finally, we swap the new large empty slot with 6:

[*][*][*][       ]
 1  3  6  2  4  5 


With a single pass of the swap file we have moved all the empty space to one large slot at the end. Next time something is written to the file the big slot at the end can be split into two.
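The pass above can be modelled with just the slot bookkeeping. This is a toy sketch of the idea - used slots slide down in order, free space accumulates into one slot at the end - with the expensive on-disk byte copying elided; the names are mine, not Sonic Field's:

```java
import java.util.ArrayList;
import java.util.List;

// Toy model of the compaction pass: a slot is just a size and a used flag.
// The real memory manager also moves the slot contents within the swap file.
public class Compactor
{
    public static class Slot
    {
        public final long size;
        public final boolean used;
        public Slot(long size, boolean used) { this.size = size; this.used = used; }
    }

    // One pass: used slots keep their order at the front of the file;
    // all reclaimed space is merged into a single free slot at the end.
    public static List<Slot> compact(List<Slot> slots)
    {
        List<Slot> out = new ArrayList<>();
        long free = 0;
        for (Slot s : slots)
        {
            if (s.used) out.add(s);      // used slot slides down over free space
            else        free += s.size;  // free slot is absorbed into the tail
        }
        if (free > 0) out.add(new Slot(free, false));
        return out;
    }
}
```

With the diagram's layout - used, free, used, free - a single call leaves the two used slots at the front and one merged free slot of the combined size at the end, ready to be split by the next write.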

The final result
Moving slots is an expensive operation because it means copying data from one part of the disk to another. As a result, running the full compaction algorithm every time we need to find space in the swap file is not such a good idea. It turns out that merging adjacent free slots, and splitting slots when not all of one is used, can keep the swap file from growing for a while. Eventually, though, it gets badly fragmented and starts growing quickly. The compromise I have found works OK (though I am sure it could be tweaked further) is to use the fast merge/split approach most of the time but, at random intervals averaging one in every 100 searches for space, perform a full compaction. I went for the random interval approach to ensure that no cycles develop between loops in a patch and the memory algorithms.

Did it work? Yes - Further Into The Caverns renders with just 5.8 gigabytes of swap file!