• jameslo

    @porres Thanks for your interest! This is an envelope generator that is trying to copy and improve on an effect I used in a piece way back in the '90s. It was originally done with a Valley People Gatex, which I'm about to give away to the friend who turned me on to them. Anyway, if I set the Gatex to 0.2s release time, a 60dB range, and then ride the threshold to constantly trigger on an external key (for which I usually used speech so it would be fast and unpredictable), I'd get a pleasing chattering sound with a consistent soft click on the attack. Even though the release is so short, the triggers are coming awfully fast so the gate is retriggering before it has had a chance to fully close.

    I made some recordings of the Gatex and tried to measure what it was doing, but as these things often go, I found that the Pd simulation sounded truer to the original with parameters that didn't quite match what I had measured. The release curve I'm using is much steeper at the beginning, the release time is much longer, and the attack curve is longer and logarithmic. I kept the 20ms sustain time as is. I found that if I retriggered the same [vline~] before it had had a chance to fully release, it would either pop too much (if I made it jump to 0 first) or have inconsistent attacks (if the new attack didn't start from 0), so that's why there are two of them computing the ramps and hold. I'm outputting whichever one is higher at any given moment. When a new trigger comes in, I look to see which EG is lower (and thus not seen on the output) and route the new trigger to it. It first jumps to 0, but you don't hear it click because I'm outputting the other one at that instant.
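    In case the routing logic is easier to see as pseudocode, here's a rough Python sketch of the idea (the class and the plain linear ramps are just my illustration, not what the patch literally does--the real thing uses [vline~] ramps plus the 20ms hold):

        # Two envelope generators; output whichever is higher, and send each new
        # trigger to the one that is currently lower, so its jump to 0 is masked.
        class EG:
            def __init__(self):
                self.value = 0.0
                self.target = 0.0
                self.step = 0.0

            def trigger(self, attack_samples, peak=1.0):
                self.value = 0.0            # safe: this EG isn't audible right now
                self.target = peak
                self.step = peak / attack_samples

            def release(self, release_samples):
                self.target = 0.0
                self.step = -self.value / release_samples

            def tick(self):
                if self.value != self.target:
                    self.value += self.step
                    if (self.step > 0) == (self.value > self.target):
                        self.value = self.target    # clamp at the target
                return self.value

        a, b = EG(), EG()

        def on_trigger(attack_samples):
            # route the new attack to whichever generator is currently quieter
            (a if a.value <= b.value else b).trigger(attack_samples)

        def output_sample():
            return max(a.tick(), b.tick())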

    To control the curvature of the attack and release, I use [vline~] again to synchronize the values used to shape the attack and release via [pow~]. Finally, I noticed that the Gatex had an unpredictable amplitude variation, probably based on the key signal at the moment the trigger was detected. That's what the randomized volume stuff is for, although the Gatex's actual variation is only 3 dB, if that. I'm not trying to copy their detection algorithm because for my music there's usually been no correlation between the main channel and the side chain, so I just use randomized [delay]s to trigger it, which is how my troubles began. Without the intervening [vline~], the unsynchronized random volume part added unpredictable discontinuities, especially when the trigger density was high.
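    For anyone following along, the curvature trick is just raising a 0-to-1 ramp to a power, which a couple of lines of Python can illustrate (the exponents are made up, not measured from the Gatex):

        # A 0..1 linear ramp raised to a power bends into a curve: exponents > 1
        # give a slow start and a fast finish, exponents < 1 the opposite. This
        # is the kind of shaping [pow~] applies to the [vline~] ramps.
        N = 64
        linear  = [i / (N - 1) for i in range(N)]
        convex  = [x ** 4.0  for x in linear]    # slow start, fast finish
        concave = [x ** 0.25 for x in linear]    # fast start, slow finish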

    Here's some sound: jittery gate demo.mp3

  • jameslo

    @porres Yes, my curiosity is entirely academic. I solved my problem by leaving the existing [vline~] as is and converting the other block-aligned processing to use [vline~] (somewhat trivially) in order to synchronize things. Rereading the [vline~] documentation, I found a discussion of [sig~] which suggests that my solution is not entirely unreasonable. Here's my patch--not that it's relevant to my questions, but you seem curious:
    Screenshot 2025-01-22 071130.png
    The adaptation is in the random volume for each envelope trigger.

  • jameslo

    @porres Yeah, this thread is definitely hard to follow, but please don't feel any pressure to follow it. The subject I introduced was already way down in the weeds and each respondent (you included) brought their own focus, interests, and experience to the table. The result is more like cool party banter than a research paper. Plus, my interest in it turned purely academic after I found a solution to my practical problem (i.e. delay everything else using [vline~] so its timing is aligned), and so if any speculation that followed seems impractical, well, yeah, that's how academic things are sometimes. Personally, I think it's fun, but I get that many people don't, which is why I issued my warning at the end of post 12.

    Speaking only for myself, I'm just wondering if there's an alternative to the way I aligned the start of my [vline~] audio processing with my other audio processing that can't do inter-block timing. @lacuna, @seb-harmonik.ar and I found one way, and I'm still looking for a way to do that same thing one block earlier, which I suspect doesn't exist without resorting to externals. I think your and @seb-harmonik.ar's [line~] suggestion requires that every segment of the piecewise linear function align with block boundaries, am I wrong? If that's good enough for most purposes, then that was worth noting and I'm glad it was suggested, but it's more restrictive than I required.

  • jameslo

    @seb-harmonik.ar Like this?
    Screenshot 2025-01-20 130049.png
    It looks like the second ramp (jump in this case) is still timed relative to halfway through the block though.

  • jameslo

    @whale-av I think I answered some of your questions in my previous reply to @lacuna, so let me ask you to clarify what you meant by tabwrite~ vs print~. I think the following test shows they're exactly the same:
    tabwrite~ vs print~.pd

    tabwrite~ vs print~.png
    Even if I run it via a delay with a fractional block time it's the same.

    Screenshot 2025-01-19 195550.png

  • jameslo

    @lacuna said:

    click bang [o] (GUI always on block-boundary) > [spigot] opens > [bang~] bangs > [spigot] closes > [vline~] starts (still on same block-boundary).

    Yes, this is how I was hoping it would work, but it's not always the case that [bang~] bangs after the spigot is opened within the same control block. It appears to bang afterward if the bang came directly from a GUI element, but before if the bang came from a delay. (Edit: I'm going to start a new topic on this; I don't see why this should be true.)
    bang~ runs last.pd
    bang~ runs last.png
    bang~ runs first.pd
    bang~ runs first.png
    If it runs before, then we are really waiting until it bangs at the end of the next audio block. (BTW, you can modify the number of blocks of delay and see the numbers change as you'd expect.)

    You use [rpole~ 1] as sample-counter, right? But [snapshot~] only snapshots on block-boundaries.

    Yes, but all I'm concluding from that is that the message is available in the control block before the audio block that contains its actual time. The fact that so many Pd objects ignore the fractional block of time remaining suggests to me that it could be possible to truncate that fractional block of time in the message in order to make objects like [vline~] start at the beginning of the audio block, like [sig~].

    So is it correct that you want to move the start of vline~ 'backward' ? To the start of the block?

    Your diagram annotation is correct, but "want" is a strong word :). Let's just say I'm curious whether it's possible. Right now, I'm delaying things like [sig~] to match [vline~] by inserting [vline~] just before them. I could have used any of our 3 quantizers to align the first [vline~] ramp with the block following the block that contains the fractional-block-timed message, but then I would have had to also delay my [sig~] processing a block, as @seb-harmonik.ar confirmed.

    The last two slides in that PowerPoint deck you linked to show what I mean. Objects like [vline~] implement the ideal, but I'm wondering how to make them behave more like the last slide. See how the message's timing is labeled "t0+16"? Wouldn't it be possible just to zero out the "16" part?
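    Just to pin down the arithmetic I'm asking about, here's a tiny Python sketch, assuming the event time is expressed in samples and a 64-sample block (not how Pd represents it internally, just the idea):

        BLOCK = 64

        def quantize_down(event_time_samples):
            # drop the fractional-block part: "t0+16" becomes "t0", i.e. the
            # event lands at the start of the block it falls in
            return (event_time_samples // BLOCK) * BLOCK

        def quantize_up(event_time_samples):
            # what our quantizers above effectively do: wait for the next
            # block boundary instead
            return -(-event_time_samples // BLOCK) * BLOCK

        print(quantize_down(1040), quantize_up(1040))    # 1024 1088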

  • jameslo

    @lacuna First of all, I honestly don't understand why your left bang works, but let me ignore that temporarily. Using the right bang in your example, here's what I'm asserting:

    Drawing1.png

    Similarly, in my first response to you, you can see in the snapshots that the bang from [delay 1.5 64 samp] is being processed during Ctrl 1, 32 samples before its real timing. If I used that same message to, say, set a value into [sig~], that value would take effect immediately in Audio 1. So I'm wondering if there's a way to truncate that fractional block timing without resorting to an external.

    So then, WRT your left bang, I think you proved that my theory about when bang~s happen is wrong. But I'm pretty confused about that at this point.

  • jameslo

    @lacuna The left bang is not the issue because UI clicks are always block-aligned (I think). It's the right bang I'm trying to anticipate. The bang out of [del 1] is processed at the beginning of the first block, but it has a timing attribute that says it's 1 ms into the first block. block-quantizer still delays it to the beginning of the 2nd block.

    Edit: oh wait, did I miss your point?

  • jameslo

    @whale-av said:

    @jameslo Pretty sure it is documented here...
    Messages are received and run out during an audio block?...... but the resulting instructions are not passed to an object until the next boundary?..... at which time they will be scheduled to be acted upon within the next block if the object can do so?.

    Yep, I think that's it, so let me list some of the ways I don't understand things :)
    [1] Not reading the documentation
    [2] Not understanding the documentation I've read
    [3] Forgetting what I used to understand
    :) :) :)

    In the middle of a message cascade you may schedule another one at a delay of zero. This delayed cascade happens after the present cascade has finished, but at the same logical time.

    Man, this got me excited, because the only reason my bang quantizer outputs where it does, and not one block earlier, is that [bang~] for the end of the first block has already executed by the time my spigot opens, so we have to wait for the next audio block to complete. I was hoping [delay 0] would cause the bang from [bang~] to be rescheduled after my spigot opens, but it doesn't appear to.

  • jameslo

    @lacuna Nice! I think this has the same outcome as @seb-harmonik.ar's external, which is to delay the timing to the next block after the intra-block time. I was hoping to anticipate, i.e. move the timing forward to when the message is processed. Look at my version of your thing:
    Screenshot 2025-01-19 065713.png
    The snapshots at the bottom show that the bang is processed at the beginning of the 2nd block, even though it is actually timed to the middle of the 2nd block. It's at the beginning of the 2nd block where my other non-vline~ audio processing happens, and that's the stuff I was hoping to synchronize with.

    I should warn folks that I managed to use [vline~] to delay my other audio processing, and so my practical problem has been solved. I'm just continuing this discussion because it's interesting.

  • jameslo

    @seb-harmonik.ar said:

    @jameslo midi messages don't have timing info themselves afaik (except for realtime/clock?)

    With REAPER I think they are trying to preserve the time that each MIDI message arrived WRT the audio tracks that could be playing back during MIDI record. That's where the fractional block timing comes from. I think.... I gave up on JSFX when I finally understood its memory model. Reminded me of coding in assembler.

  • jameslo

    @seb-harmonik.ar Thanks, just checking my understanding.

  • jameslo

    @porres I understood that documentation to apply to ramps subsequent to the first ramp, so I was commenting on the part I didn't understand, especially the fact that the timing of all messages can be fractional WRT blocks. I'm sure that must be documented somewhere. I think I may have been predisposed to discovering it because I had recently studied REAPER's bare-metal scripting language JSFX--that's how they do control-rate timing.

    FWIW, I tested MIDI messages and they looked to be block-aligned. I think REAPER keeps the fractional timing.

  • jameslo

    @seb-harmonik.ar By "current block" do you mean the one that will be processed after all the control-rate processing completes? If so I think that means I'd have to delay my other audio processing (like changing the value of [sig~] in my example above) one block to be aligned, is that correct?

  • jameslo

    This sentence in timer-help helped me find a bug in one of my envelope generators made with [vline~]:

    Screenshot 2025-01-18 082251.png
    I was wondering why [timer] sometimes returned fractions when set to [tempo 64 samp(--now I know. But hold on--bangs are only processed between audio blocks, so that implies that the bang must carry with it some attribute that indicates its actual timing, one which might be within the next audio block.

    Awright, then maybe [vline~], being all sophisticated and stuff, takes a peek at that attribute? Holy smokes, yes it does:
    Screenshot 2025-01-18 084413.png
    So my mistake was assuming that [vline~] always started at block boundaries, and that made it misaligned with other audio processing that was initiated at block boundaries. How have folks dealt with this? I suppose I could look for a way to use [vline~] to potentially delay the other audio processing. Is there a way to zero out that bang timing attribute, so that the bangs are quantized to block boundaries? Or some other way to force [vline~] to start at the beginning of the next audio block?

    Edit: it's not just bangs that have that intra-block timing attribute--you can see in my example that it's preserved through the message that's sent to [vline~]
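    For anyone else surprised by this, here's a toy Python sketch of what a sample-accurate line generator might do with that intra-block offset when it renders one block (purely illustrative, not Pd's actual code):

        BLOCK = 64

        def render_block(start_offset, target, ramp_samples):
            # One block of a ramp from 0 to `target` scheduled to begin
            # `start_offset` samples into this block--the fractional timing a
            # [vline~]-style object honours; a block-aligned object would in
            # effect treat start_offset as 0.
            out = []
            for n in range(BLOCK):
                if n < start_offset:
                    out.append(0.0)              # the ramp hasn't started yet
                else:
                    t = min(n - start_offset, ramp_samples)
                    out.append(target * t / ramp_samples)
            return out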

  • jameslo

    @erhandemirci OK, I see another issue: your WAV file is 22.05 kHz, 32-bit. I think that makes things more complicated, e.g. your sound card has to support it too, or you have to resample it in Pd. Try converting your sound to 44.1 kHz, 16-bit.
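    One way to do that conversion outside of Pd (assuming you have ffmpeg installed) is a command along the lines of "ffmpeg -i input.wav -ar 44100 -c:a pcm_s16le output.wav", which resamples to 44.1 kHz and writes 16-bit PCM; a sound editor like Audacity can do the same thing.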

  • jameslo

    @erhandemirci said:

    I created this one but not success
    mypdFile.pd

    I agree with @whale-av, but if your latest issue is that there is no sound at all, then it's probably because you are muting it:
    Screenshot 2025-01-16 104610.png
    That audio multiplication is equivalent to multiplying by 0 because you have not provided a non-zero value to the right inlet of [*~]. You can also just give [*~] a non-zero creation argument.

  • jameslo

    @ChicoVaca I don't know about other DAWs, but REAPER can record and play back sysex in time just like any other MIDI message. It's not so pretty to look at and edit, but you can move sysex events along the timeline, copy and paste sysex phrases, etc. Is that what you meant?

  • jameslo

    @blindingSlow And I'm sure your example was only meant to show the curved connection, but as an actual patch it overflows the stack because it's an endless feedback loop. Just sayin' :)

  • jameslo

    @willblackhurst Good catch. Alternatively, you can add it to your start up preferences where it may stay for so long that you forget it's not part of Vanilla :)
