I'm trying to plot amplitude and phase outputs from an fft~. So I convert rectangular coordinates to polar, and tabwrite~ them, resetting the phase of the incoming oscillator with every bang sent to the tabwrites.
I've done it using cartopol~ from cyclone, and also by making the explicit calculation: squaring and sqrt~ for amplitude, atan~ for phase. No problem so far; both work and I get what I expected...
...EXCEPT when the frequency of the sinewave I send into the fft~ is exactly a multiple of <samplerate/fftSize>. When I do that, I have exactly one bin with non-zero amplitude (as expected), but the phase is all over the place, and dramatically so for low frequencies. It's only well-behaved at 0, Nyquist and Nyquist/2.
Extra unexpected behavior: moving the frequency just a bit from one of these "fft-harmonics" brings the phase-table back to a nice curve.
Is this expected behavior? If so, can anyone explain; if not, can anyone tell me what I'm doing wrong? I really expected all phases to be zero (or in a single-discontinuity curve) when analysing a multiple of the "fft-fundamental".
I've enclosed a patch with all that. Blocksize here is 128, but I've tried with different sizes getting similar results.
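For what it's worth, the same behaviour shows up outside Pd too. Here's a quick NumPy sketch of the two cases (bin 4 is an arbitrary choice, the 128 matches my blocksize):

```python
import numpy as np

N = 128                  # same as the blocksize in the patch
n = np.arange(N)

# sine at exactly 4 * (samplerate / N): a multiple of the "fft-fundamental"
exact = np.fft.rfft(np.sin(2 * np.pi * 4 * n / N))
# the same sine nudged off the "fft-harmonic"
offbin = np.fft.rfft(np.sin(2 * np.pi * 4.3 * n / N))

mag_exact = np.abs(exact)      # one big bin, the rest ~1e-13 (rounding noise)
phase_exact = np.angle(exact)  # atan2 of that noise where mag ~ 0
mag_offbin = np.abs(offbin)    # leakage puts real energy in every bin,
                               # so every phase is well defined again
```

The wild phases all sit in bins whose amplitude is numerically zero, which at least shows the effect isn't specific to Pd.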
(forgive my lengthiness, it's my first post)
This may seem like a ridiculously n00b question, but is there a particular way I should connect my output to my input to use your latency-tester?
I mean, would it give wildly off results if I used my laptop speakers and mic just listening to each other "through air"?
Also (question to everyone), just to be sure, changing the vector size on the Audio Config window in 0.43.4 doesn't change the default block size inside Pd, right?
Cheers in advance,
I'm trying to install PdDroidParty on my Samsung Galaxy 3 with Android 2.1, but get a parse error when I launch the .apk. Any clues / hints?
I had downloaded "What Comes Around" and it seemed to work (although very very slowly, probably due to my device's age and speed).
In any case, this looks great!
I didn't see any mention of reading the accelerometers into PdDroidParty objects... any plans for that?
sebfumaster is right: it's the order in which you're sending your values that's the issue. Without a [trigger] object or an [unpack], messages are sent in the order in which you created the connections (here, from left to right: the opposite of what we want). That's obviously very hard to keep track of, hence the objects that enforce this order for us.
If you put a [t b b b b] between your [bng] and the [random]s, with each of those connected to an outlet of the trigger, it should do the trick.
The [delay~] object in Max is, if I'm not mistaken, a delay in samples. The [z~] object in zexy does that sort of delay, but you can use [delwrite~] / [delread~] for that too, provided you place them in connected subpatches (like you did) in order to allow for delay times smaller than one block.
Not sure it would work exactly the same, though, especially with the [line]s connected to it.
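For reference, a delay of that kind is just a ring buffer read a fixed number of samples behind the write point, and it works the same across block boundaries whatever the block size. A minimal sketch (the class name is mine, not zexy's):

```python
class SampleDelay:
    """A d-sample delay line: the sort of thing [z~ d] provides."""

    def __init__(self, d):
        # buffer pre-filled with d zeros = d samples of delay
        self.buf = [0.0] * d

    def process(self, block):
        """Process one block of samples; delay is independent of block size."""
        out = []
        for s in block:
            self.buf.append(s)        # write the new sample
            out.append(self.buf.pop(0))  # read d samples behind it
        return out
```

A 3-sample delay keeps delaying by exactly 3 samples even when fed blocks of different sizes, which is the point of getting around the one-block minimum.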
You're getting transposition because of the Doppler effect (raising the delay time is like moving the sound away from you). The way to avoid that is to keep a constant delay time within a window, and move the delay time for the start of the window once you've read it and are about to read another one.
Basically, that's what I did in my patch.
The second effect on that vid is quite similar to what I get when stretching in RT, except that there the delay time goes back to 0 when they stop the stretching (hence the jump we hear). That can be achieved with the modification I added some posts back.
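The transposition is easy to verify numerically: reading a delay line whose delay time grows at a constant rate plays the signal back slower, hence lower in pitch. A NumPy sketch with made-up numbers (a 440 Hz sine and a delay growing by 0.1 samples per sample):

```python
import numpy as np

sr = 8000
n = np.arange(sr)                       # one second of samples
x = np.sin(2 * np.pi * 440 * n / sr)    # 440 Hz input

delay = 0.1 * n                         # delay time ramping up: "moving away"
pos = n - delay                         # fractional read position in the line
i = np.floor(pos).astype(int)
frac = pos - i
# linear interpolation between neighbouring samples, as a delread-style tap does
y = (1 - frac) * x[i] + frac * x[np.minimum(i + 1, sr - 1)]

# the output's strongest FFT bin should sit at 440 * (1 - 0.1) = 396 Hz
peak_hz = int(np.argmax(np.abs(np.fft.rfft(y))))
```

So a steadily growing delay time transposes down by exactly the growth rate, which is the Doppler effect in question.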
I'm not sure I follow what you mean, or that what you want is actually a time-stretch.
In your example of the snare drum being hit once every 4 seconds, a stretched version would be snare hits every 5 seconds (if you stretch it a little) or every 8 seconds (if you slow it down by half). So you wouldn't get the effect you're describing with a "nice gap" at all (even if the attack itself would be stretched too, of course - but not by that much).
You do seem to be describing something closer to reverberation; you should look into that too.
You can try and use more than one instance of the abstraction, windowing the outputs, with the delay time resetting to 0 at each cycle and starting the stretch again. That's closer to "spawning runners" continuously, but it still wouldn't do what you described, I'm afraid.
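On windowing the outputs: if two instances overlap by half a window and you use a periodic Hann window, the crossfades sum to a constant gain, so the layering itself stays inaudible. A quick check with an arbitrary window size:

```python
import numpy as np

N = 512  # window size (arbitrary)
# periodic Hann window: 0.5 * (1 - cos(2*pi*n/N))
w = 0.5 * (1.0 - np.cos(2.0 * np.pi * np.arange(N) / N))

# two copies overlapped by N/2: the second half of one window
# plus the first half of the next sums to exactly 1 everywhere
overlap_sum = w[:N // 2] + w[N // 2:]
```

That constant-overlap property is why 50% overlap with Hann windows is the usual choice for this sort of granular layering.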
It's not really latency (in the usual sense). The time-stretching in this patch is based on a delay line: this means there's a maximum length for the segment you're stretching (or compressing), but it also means that once you've stretched your input a bit, the output will be some amount of time "late" relative to the input, even if you bring the speed back to 1.
You can think of it this way: if you have two runners at the same speed, and one of them slows down for a while then gets back to the original speed, the first one will get (and remain) ahead.
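In numbers, the lag is just the accumulation of (1 - speed) over time: slowing to half speed for four seconds leaves the output two seconds behind for good. A toy calculation:

```python
# one entry per second of playback: full speed, then half speed for 4 s, then full again
speeds = [1.0, 1.0, 0.5, 0.5, 0.5, 0.5, 1.0, 1.0]

# each second spent at speed s adds (1 - s) seconds of lag,
# and returning to speed 1 only stops the lag from growing
lag = sum(1.0 - s for s in speeds)
```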
More technically: bringing the speed back to 1 only stops the [phasor~] responsible for varying the delay time, it doesn't bring its output back to 0. You could force it to do that, I suppose, by sending a [0( message to its phase (right input) when the speed goes to 1. Haven't really tried it; might work, but might also give you some not-so-nice jumps.
Here's a way to do it (adding this to the existing stuff in the patch):
Or just connect the [0( to the [phasor~] phase and click it whenever you want to get your delay time back to 0.
Incidentally, the maximum delay time I put in that patch is 8000 ms, which is probably what you're experiencing after having slowed down your audio for a while. You can change that behaviour in the multipliers before and after the [phasor~] (for its frequency and range, respectively). If you want more than 8000 ms, you have to change it in the [delwrite~] as well. Note that this will affect how much of your audio stream you can stretch before a jump.
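To make the arithmetic explicit (the 8000 ms ceiling is from the patch; the phasor frequency here is a made-up example, not the patch's actual value):

```python
# delay-time sweep arithmetic for a phasor~-driven delay line
range_ms = 8000.0   # multiplier after the phasor~: maximum delay time (from the patch)
freq_hz = 0.125     # multiplier before the phasor~: sweep rate (hypothetical value)

sweep_s = 1.0 / freq_hz        # one full 0 -> 8000 ms sweep takes this long
stretch_s = range_ms / 1000.0  # how far behind real time you can fall before a jump
```

If you raise range_ms past the [delwrite~] buffer length, the read point falls off the end of the buffer, which is why both have to change together.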
Hope this helps. Glad that it worked well - at least for a while.