Hi,

I'm trying to compare the timings of two sets of data, each consisting only of 1s and 0s.

Basically I have an input audio file with a '1' representing each hit of a drum pattern. There will be different variations of this file, as I hope to automate parameter changes in its creation. I therefore want to test each input file against a set pattern that I know is correct, to see how accurate the algorithm that creates the file is.

I was just wondering if any of you clever guys might have any ideas on how to achieve this.

At the moment I'm using an array that stores the correct pattern from a step-sequencer, in the same format as the audio input file. Then, using [klashnikov], I count every '1' in both tables. Subtracting the pattern array's count from the input array's count gives a number that indicates the accuracy: if the result is 0, there are the same number of hits in both files; if it's negative, there are too few hits in the input file; and if it's greater than 0, there are too many.
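
For clarity, here's the logic I've got so far written out in Python (just a sketch of what the patch does, not actual Pd; the function and variable names are made up):

    def count_difference(pattern, input_hits):
        # Both arguments are lists of 0s and 1s, one entry per sample:
        # `pattern` is the known-correct step-sequencer table,
        # `input_hits` is the table generated from the audio file.
        diff = sum(input_hits) - sum(pattern)
        # 0  -> same number of hits in both tables
        # <0 -> too few hits in the input file
        # >0 -> too many hits in the input file
        return diff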

However, a straight count comparison says nothing about whether the hits actually line up in time, so I want to introduce a sort of tolerance window: an input-file hit would still count as correct if it falls within a certain number of samples of the corresponding pattern hit.
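
In other words, something along these lines (again Python as pseudocode; the function name and the window size of 64 samples are just placeholders):

    def hits_within_window(pattern, input_hits, window=64):
        # Sample positions of the hits in each table.
        expected = [i for i, v in enumerate(pattern) if v == 1]
        detected = [i for i, v in enumerate(input_hits) if v == 1]

        matched = 0
        for pos in expected:
            # A hit counts as correct if some detected hit lies
            # within `window` samples of the expected position.
            for j, d in enumerate(detected):
                if abs(d - pos) <= window:
                    matched += 1
                    detected.pop(j)  # don't reuse the same detected hit
                    break
        return matched, len(expected)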

My knowledge of dealing with arrays is limited, but I feel there must be some object or function I'm overlooking that would help here.

Any ideas would be extremely helpful. Hope I've explained it properly.

Cheers,
J