• ddw_music

    @porres said:

    nah, I'll just leave as it is, the object is already too much complicated and I don't know how to deal with it (if anyone has a suggestion, please let me know).

    Maybe like this? Instead of velocity --> envelope, derive a gate by way of [change]. Then multiply the envelope by the velocity value. The volume will change if the velocity changes on a slurred note. If you don't want that, it should be possible to close the spigot when slurring to a note, and open it only when a brand-new note is being played.

    pd-mono-midi-else-2.png
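
    In rough Python terms (just my own sketch of the control flow shown above, not the patch itself -- the class and names are made up):

        # Sketch of the idea: derive a gate from velocity transitions (the [change] part),
        # and scale the envelope by a velocity that is only accepted while the "spigot" is open.

        class VelocityToGate:
            def __init__(self, latch_on_slur=True):
                self.prev_gate = 0
                self.latch_on_slur = latch_on_slur   # True = ignore velocity changes on slurred notes
                self.amp = 0.0

            def process(self, velocity):
                gate = 1 if velocity > 0 else 0
                is_new_note = gate and not self.prev_gate        # 0 -> nonzero transition
                if velocity > 0 and (is_new_note or not self.latch_on_slur):
                    self.amp = velocity / 127.0                  # "spigot" open: take the new velocity
                self.prev_gate = gate
                return gate, self.amp    # envelope follows gate; audio output = envelope * self.amp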

    hjh

    posted in technical issues
  • ddw_music

    @porres said:

    As for the discussion, if my objects are not working as expected, or if you have suggestions, you can open reports and requests.

    OK, sure.

    FWIW, I had developed my own methodology for this some years ago. At that time, I didn't know that [else/mono] existed.

    I was just sharing a way to handle this problem that worked for me. My intention wasn't to raise shortcomings with your library -- it was only, "this is how I do it."

    hjh

    posted in technical issues
  • ddw_music

    @porres said:

    well, [else/mono] has a built-in portamento... and I don't know what is wrong with it or how it works with [else/adsr~] for you...

    I'm also sensing an unnecessarily punchy tone here. I won't engage with this.

    hjh

    posted in technical issues
  • ddw_music

    @leonardopiuu said:

    I began using plugdata a couple days ago and I'm making a simple mono synth

    Mono synths are not simple! lol

    MIDI's way of representing keyboard activity is not exactly convenient for mono use. So, I'm going to guess that the velocity and/or ADSR behavior is the result of a logic problem before reaching the ADSR.

    Especially based on this comment: "BUT the velocity isn't sent out from the pack unless I press at least TWO keys at the same time..."

    It's tricky. This was my solution, in the midimonoglide.pd abstraction:

    pd-midimonoglide.png

    Confused? Yeah. MIDI-to-mono logic is complicated enough that it will probably take you more than a couple of days of plugdata usage to really grasp it. There's no such thing as a "simple MIDI mono synth."

    So I'd suggest using either [else/mono] -- you already have the ELSE lib, so nothing else to install -- or my abstraction (which helps with mono note glide, though you might not need that at this point).

    Then, once you have properly cleaned up and filtered MIDI data, the ADSR "should" be the easy part.

    The ELSE way -- note here that every note-on sends out a non-zero velocity, so the envelope will retrigger on every key press, even when slurring. I'm not crazy about that behavior, but it might be OK for your use case (and the patch is simpler).

    midi-mono-by-else.pd

    pd-mono-midi-else.png
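
    If it helps to see that behavior outside of Pd, here's a toy Python model of what I mean (my own sketch, not how [else/mono] is actually implemented): last-note priority, where every note-on re-emits a nonzero velocity, hence the retriggering.

        # Toy last-note-priority model (illustrative only).
        class LastNotePriority:
            def __init__(self):
                self.held = []                      # [(pitch, velocity)] in press order

            def note_on(self, pitch, vel):
                self.held.append((pitch, vel))
                return pitch, vel                   # always a fresh nonzero velocity -> retrigger

            def note_off(self, pitch):
                self.held = [(p, v) for (p, v) in self.held if p != pitch]
                if self.held:
                    return self.held[-1]            # fall back to the most recently pressed held key
                return pitch, 0                     # last key up: velocity 0 ends the note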

    The hjh way -- where it becomes possible to distinguish between sliding, non-retriggering notes and genuinely new notes. (The [noteglide] isn't strictly necessary -- but, IMO fingered portamento is the whole point of a MIDI monosynth :wink: so I'm using it.)

    midi-mono-by-hjh.pd

    pd-mono-midi-hjh.png
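
    And the distinction the second patch makes, again as a rough Python sketch (names are mine; the actual abstraction works on the raw MIDI stream and adds the glide): a note-on only counts as "new" when nothing was held, so slurred notes change pitch without retriggering the envelope.

        # Toy "new note vs. slur" model (illustrative only).
        class MonoWithSlur:
            def __init__(self):
                self.held = []                      # [(pitch, velocity)] in press order

            def note_on(self, pitch, vel):
                is_new = not self.held              # nothing held -> genuinely new note
                self.held.append((pitch, vel))
                # pitch always updates (this is where glide would apply);
                # only retrigger the envelope when is_new is True
                return pitch, vel, is_new

            def note_off(self, pitch):
                self.held = [(p, v) for (p, v) in self.held if p != pitch]
                if self.held:
                    p, v = self.held[-1]
                    return p, v, False              # slide back to a still-held key, no retrigger
                return pitch, 0, False              # last key up: velocity 0 releases the envelope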

    Hope this helps -- and no worries. If you made this much progress in Pd in a couple days, to the point where you run into the not-at-all obvious subtleties of MIDI mono handling, that's pretty good.

    hjh

    posted in technical issues
  • ddw_music

    Finally posted video from my electronic ensemble class's concert last June. Pd didn't figure into the audio side, but 2 out of 3 pieces used Gem for the graphical backdrop.

    It was a fun night, hope you enjoy!

    Link goes directly to the third piece; the playlist includes the other two.

    hjh

    posted in output~
  • ddw_music

    Here's a short clip of an interactive installation piece (I guess more of a prototype?) that was shown last weekend.

    • Audio: SuperCollider (+ VSTPlugin for the piano and guzheng)
    • Graphics: Pure Data + Gem
    • UI: Open Stage Control (with heavy CSS gradient abuse :laughing: -- iPad batteries really don't like rendering 13-14 new gradients, 9 times per second)

    Musical decisions are made by flipping bits in a 40-bit word, and mapping various segments of the bits onto sequencing parameters.
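
    (If the "segments of the bits" idea isn't clear, here's the shape of it in Python -- the field names and widths are made up for illustration, not the actual mapping:)

        STATE_BITS = 40

        FIELDS = {                  # name: (bit offset, width) -- hypothetical
            "root":     (0, 4),
            "density":  (4, 6),
            "register": (10, 3),
            # ... the remaining bits feed other sequencing parameters
        }

        def extract(word, offset, width):
            return (word >> offset) & ((1 << width) - 1)

        def flip_bit(word, i):                      # one "musical decision"
            return (word ^ (1 << i)) & ((1 << STATE_BITS) - 1)

        state = flip_bit(0, 11)
        params = {name: extract(state, off, w) for name, (off, w) in FIELDS.items()}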

    At some point, I'd like to do an explainer video, but not today.

    hjh

    posted in output~
  • ddw_music

    Ahem.

    pd-mod-again.png

    lol

    hjh

    posted in technical issues
  • ddw_music

    @jameslo said:

    @whale-av My issue is (was?) that there are things that affect the DSP sort order that you can't see from the graphics alone. As far as I understand, that's different from @ddw_music's question, which is "how much of the DSP graph does Pd have to resort when there's a change"? That said, I don't understand the reason that @spacechild1 gave on the mailing list--I'd love to see an example.

    Well, here's what he said: "The issue is that DSP objects can be 'connected' by other means. Take the delay objects, for example. Some of these objects need to be aware of the order of execution within the DSP graph. Others will affect other objects, e.g. automatically changing the channel count. Pd itself doesn't know what a particular ugen is doing, so the only thing it can do is rebuild the whole graph."
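
    To make "connected by other means" concrete, here's a toy model (mine, not Pd's actual data structures): the sort has to respect implicit edges -- [delwrite~]/[delread~], [send~]/[receive~] -- that don't show up as patch cords and can cross canvas boundaries, so a purely local re-sort can't be trusted.

        # Toy model: DSP order must respect edges the patch cords don't show.
        from graphlib import TopologicalSorter

        visible_cords = {                     # downstream: {upstream, ...} in one canvas
            "osc~": set(),
            "delwrite~ d1": {"osc~"},
            "dac~": {"delread~ d1"},
        }
        implicit_edges = {                    # matched by name, possibly across canvases
            "delread~ d1": {"delwrite~ d1"},  # writer sorted first allows the shortest delays
        }

        graph = {node: set(ups) for node, ups in visible_cords.items()}
        for node, ups in implicit_edges.items():
            graph.setdefault(node, set()).update(ups)

        print(list(TopologicalSorter(graph).static_order()))
        # Any edit anywhere can add or remove hidden edges like these, which is
        # (as I read the quote) why Pd re-sorts the whole graph, not just one canvas.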

    @whale-av Yes, I'm writing a paper, but the paper isn't about Pd graph sorting -- this is a side point -- the latency thread is interesting but would be way too much detail for basically a footnote.

    hjh

    posted in technical issues
  • ddw_music

    Per spacechild1 on the mailing list: Yes, it does sort all dsp objects in all canvases, every time (because they may have invisible connections, like delay objects or send~ / receive~ / throw~ / catch~).

    hjh

    posted in technical issues
  • ddw_music

    @jameslo said:

    @ddw_music If Pd's topological sort algorithm were smart enough to know when a change inside an abstraction did not change the sort outside the abstraction, then it would be an easy lift for it to detect that feedback to some abstraction (e.g. one with an [inlet~], an [outlet~], and no connections) does not produce a cycle. But the last I checked, that was not the case, so I would bet that when anything changes, the whole directed graph is resorted.

    I guess the issue, then, is that if all the tilde objects get lifted into one flat list, a change inside a subcanvas could end up inserted in the midst of objects outside that canvas. In that case, it probably is necessary to walk the entire tree.

    In the video, he starts off the sorting section by saying that canvases tell the DSP layer what has changed locally within the canvas, but then discusses the sorting flow when DSP gets turned on (which obviously has to start at the topmost canvas).

    And then in g_canvas.c there are comments like

            /* create a new "DSP graph" object to use in sorting this canvas.
            If we aren't toplevel, there are already other dspcontexts around. */
    

    so the data structure does seem to be split up by canvases.

    It's not a crucial point -- just that I'm writing a paper and wanted to contrast SC's per-SynthDef graph sorting with Pd's seemingly global sorting. I'd rather not make a false claim.

    Mailing list, I guess.

    hjh

    posted in technical issues
