Posted: Nov 19, 2015 5:31 pm
by John Platko
Adding rest notes to licks is interesting - to a point. I mean, it's not so interesting if too many rests are added, which brings up the question of how to quantify when too much is too much. One obvious way is simply to not allow too much silence in a lick - and that might end up being one rule of thumb in my ultimate heuristic.

Another way to think about quantifying licks is to use the idea of Shannon entropy from information theory. You can read about it here. I'm told we have some members who are experts on the subject of entropy, so maybe they'll help.


People have been trying to use entropy to understand music since the 1950s. There are lots of ways to apply the idea. I think I'll eventually end up using many of them to create a multi-dimensional entropy vector for comparing licks.

One fairly simple way to quantify a lick's entropy is to use the probabilities of notes in a lick. That is:

[Image: the Shannon entropy formula, H = -∑ p(i) · log₂ p(i), where p(i) is the probability of note i in the lick]
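To make that concrete, here's a minimal sketch of the calculation in Python. It assumes a lick is just a sequence of note names (strings), with `"R"` standing in for a rest - those representations are my assumption, not anything fixed.

```python
from collections import Counter
from math import log2

def lick_entropy(lick):
    """Shannon entropy (in bits) of the note distribution in a lick.

    A rest is treated as just another symbol (e.g. "R"), so it
    contributes to the distribution like any note would.
    """
    counts = Counter(lick)
    total = len(lick)
    # H = -sum over symbols of p * log2(p)
    return -sum((c / total) * log2(c / total) for c in counts.values())

# An A minor pentatonic run (note names only, octaves ignored):
run = ["A", "C", "D", "E", "G", "A", "C", "D"]
print(lick_entropy(run))  # → 2.25
```

Here A, C, and D each appear with probability 1/4 and E and G with probability 1/8, giving 3·(1/4·2) + 2·(1/8·3) = 2.25 bits.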

I did just that in the following example, which started with a minor pentatonic run and mutated notes to rests through the generations. The following are some examples of the licks that came out of that. (The generational information is shown along with the entropy (E) of each lick.) Notice that adding a rest can increase this measure of entropy.

[Image: example licks from the run, one per generation, annotated with the entropy E of each lick]
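The point that a rest can increase entropy is easy to see with a small worked example (again a sketch, with notes as name strings and `"R"` as a rest - my representation, not necessarily the one used above). Mutating a repeated note into a rest spreads the probability mass more evenly over the symbols, which raises the entropy:

```python
from collections import Counter
from math import log2

def entropy(lick):
    # Shannon entropy (bits) of the symbol distribution in the lick.
    counts = Counter(lick)
    n = len(lick)
    return -sum((c / n) * log2(c / n) for c in counts.values())

before = ["A", "A", "C", "D"]  # P(A)=1/2, P(C)=P(D)=1/4
after  = ["A", "R", "C", "D"]  # all four symbols at 1/4 - maximally even

print(entropy(before))  # → 1.5
print(entropy(after))   # → 2.0
```

Of course the opposite can happen too: turning a note that appeared once into a second copy of `"R"` makes the distribution less even and lowers this measure - which is part of why entropy alone won't tell me when a lick has too much silence.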