• shpeck

    hello all!
    i would like to create a delay effect that takes an input and delays it, with the delay time gradually decreasing (like a bouncing ball). the volume should stay constant if possible, and the number of repeats should be controllable... how would one go about doing this? (see the sketch after this post)

    thank you in advance
    shpeck

    posted in technical issues
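
    A minimal sketch of the delay-time pattern the post above describes, assuming each repeat arrives after a gap that shrinks by a fixed ratio while the gain stays constant; the starting delay, ratio, and repeat count are made-up values:

        # hypothetical bouncing-ball tap schedule: each repeat comes sooner than the last
        start_delay_ms = 500.0   # gap before the first repeat
        ratio = 0.7              # each gap is 70% of the previous one
        repeats = 8              # number of repeats (controllable)

        gap = start_delay_ms
        t = 0.0
        tap_times_ms = []
        for _ in range(repeats):
            t += gap
            tap_times_ms.append(round(t, 1))
            gap *= ratio         # the "bounces" get closer together

        print(tap_times_ms)      # [500.0, 850.0, 1095.0, 1266.5, ...]
        # constant volume: every tap plays back at the same gain instead of decaying
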
  • shpeck

    hello all!

    i would like to create a delay effect on my incoming signal (from my microphone) whose delay time changes with the signal's amplitude (that is, if the volume is louder, the delay time gets longer, and if the volume is softer, the delay time gets shorter).

    i scan my incoming signal (adc~) using [fiddle~], a tool i found using the search option in this forum. my problem is that i don't know how to make the numeric amplitude information i get from [fiddle~] useful for my plan. i use a [delwrite~] and a [vd~] to try to create this effect, but sometimes i get crackling sounds, and sometimes it just doesn't do what i want.

    also, i would like the signal to be delayed only once, not repeated many times at decreasing volume... (see the sketch after this post)

    thanks so much for your help
    ofer from israel

    posted in technical issues
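
    A sketch of the control path the post above is after, assuming the amplitude reading is smoothed before it drives the delay time; stepping the delay time abruptly is what tends to produce the crackling, so the control is made to glide (the range and smoothing values are made up):

        # hypothetical control chain: amplitude reading (0..1) -> smoothed delay time in ms
        min_ms, max_ms = 50.0, 1000.0    # delay-time range to map onto
        smooth = 0.05                    # small value = slow, click-free changes (like a [line~] ramp)
        delay_ms = min_ms                # current, smoothed delay time

        def update(amplitude):
            """Map a 0..1 amplitude to a target delay time and glide toward it gradually."""
            global delay_ms
            target = min_ms + amplitude * (max_ms - min_ms)
            delay_ms += smooth * (target - delay_ms)   # one-pole smoothing of the control
            return delay_ms

        # louder input -> the target jumps up, but delay_ms only glides toward it
        for amp in [0.1, 0.1, 0.9, 0.9, 0.9, 0.2]:
            print(round(update(amp), 1))

    As for the single repeat: one delayed read with no feedback path back into the delay line already plays the input back exactly once.
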
  • shpeck

    hello,

    i would like to ask if someone could explain something to me. i have a computer connected to a soundcard, and i would like to use pd to process the audio i receive from the microphone connected to the soundcard. how do i do this? am i missing something very basic?

    thank you for your help!
    shpeck

    posted in technical issues
  • shpeck

    thanks a lot guys, i really appreciate your help

    i understand much more now about all this

    i tried what you suggested, toxonic, and it worked very well (i also played with the transition time and tweaked it until i found what i was looking for)

    here is another issue:

    when i run my signal through [env~], i get a very rapid stream of dB readings. one of my mic inputs sits between 20-25 dB when it's "idle", that is, when a mic is connected and everything is on but no sound is going into the mic; another mic input sits around 54 dB in the same situation.
    my question is: if i want to use this information to control the delay time in milliseconds, so that the delay time depends on the amplitude (dB) of the signal, what mathematical mapping would you recommend? (see the sketch after this post)

    right now i am doing something that is not quite what i'm looking for... the delay time more or less stays the same...

    thanks for your help!
    ofer

    posted in technical issues
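
    One possible mapping from the [env~] readings described above to a delay time, assuming the idle level is treated as a floor that gets subtracted out before scaling (the 25 dB floor, the millisecond range, and the test values are made up; 100 dB is roughly [env~]'s reading for a full-scale signal):

        # hypothetical dB -> delay-time mapping with the idle level subtracted out
        idle_db = 25.0      # reading with no sound going into the mic (~20-25 in the post)
        full_db = 100.0     # [env~] reports roughly 100 dB for a full-scale signal
        min_ms, max_ms = 50.0, 2000.0

        def db_to_delay_ms(db):
            """Clip the dB value to the useful range, then scale it linearly to milliseconds."""
            db = max(idle_db, min(full_db, db))
            frac = (db - idle_db) / (full_db - idle_db)   # 0.0 at idle, 1.0 at full scale
            return min_ms + frac * (max_ms - min_ms)

        for db in (20, 25, 54, 80, 100):
            print(db, "->", round(db_to_delay_ms(db)), "ms")

    Subtracting the idle floor stretches the part of the range the mic actually uses over the whole millisecond range, instead of spending most of it on levels that never occur.
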
  • shpeck

    my bad, there IS pitch shifting in SuperCollider when running that line...

    but anyhow, that line gives a MUCH smoother result than pd... why? (see the note after this post)

    posted in technical issues
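
    For reference, the pitch shift heard while a delay time is moving comes from how fast the read point sweeps through the buffer; a tiny sketch of that relation (the sweep rate is a made-up value):

        # while the delay time tau(t) is changing, the read point moves at (1 - dtau/dt)
        # times real time, which is heard as a pitch shift until the sweep settles
        delay_change_per_second = -0.5       # hypothetical: delay shrinking by 0.5 s every second
        playback_ratio = 1.0 - delay_change_per_second
        print(playback_ratio)                # 1.5 -> pitched up while the delay time falls

    The slower the delay time is allowed to change, the closer this ratio stays to 1.0, which is why a heavily smoothed control sounds less like a pitch bend.
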
  • shpeck

    great! thanks toxonic, the one-time repeat works great

    i am still frustrated that pd seems kind of limited here, with the crackling and pitch shifting when changing the delay time... here is a line in SuperCollider that does exactly what i'm looking for, without any crackling or pitch shifting:

    { DelayN.ar(SoundIn.ar(0), 3, (Amplitude.ar(SoundIn.ar(0), 0.01,5)*8).poll )+SoundIn.ar(0).dup }.play;

    why is it so simple in SuperCollider and so difficult to achieve in pd? (see the sketch after this post)

    thanks
    ofer

    posted in technical issues
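
    A rough Python restatement of what that SuperCollider line computes, on the assumption that its smoothness comes mainly from the envelope follower's very slow release; the 0.01 s attack, 5 s release, and the *8 scaling are taken from the line above, the rest is illustrative:

        import numpy as np

        sr = 44100
        attack, release = 0.01, 5.0          # Amplitude.ar's attack and release times
        up = 1.0 - np.exp(-1.0 / (attack * sr))
        down = 1.0 - np.exp(-1.0 / (release * sr))

        def delay_time_seconds(block, env=0.0):
            """Follow the input's amplitude, then scale it by 8 to get a delay time in seconds."""
            out = np.empty_like(block)
            for i, x in enumerate(np.abs(block)):
                coeff = up if x > env else down   # fast rise, very slow fall -> smooth control
                env += coeff * (x - env)
                out[i] = env * 8.0                # the "* 8" from the SC line (DelayN is given a 3 s maximum there)
            return out, env

        # e.g. one second of silence followed by a loud burst:
        sig = np.concatenate([np.zeros(sr), 0.8 * np.ones(sr)])
        dt, _ = delay_time_seconds(sig)
        print(round(dt[-1], 2), "seconds of delay after the burst")

    Because the release is 5 seconds, the delay time drifts rather than jumps, so there are no clicks even though DelayN itself does not interpolate; the equivalent move in pd is to put a long ramp (e.g. [line~]) between the envelope and the delay-time input.
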
  • shpeck

    many thanks toxonic
    but my main problem is not gathering the dB info of the signal; my main problem concerns the delay effect i'm trying to achieve. even if i convert the dB to rms using [env~] & [dbtorms], i still have issues with the crackling of the delay, not to mention that i don't know how to delay the signal only once.

    any help? or am i wrong, and using [env~] would solve everything i mentioned above? (see the sketch after this post)

    thanks a lot
    ofer

    posted in technical issues
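
    On the "delay only once" part, a toy numpy sketch of the difference between a single delayed read (one repeat) and a feedback delay (many repeats); nothing here is pd-specific, it is just the signal-flow idea with made-up numbers:

        import numpy as np

        x = np.zeros(1000); x[0] = 1.0   # a single click as the input signal
        d = 250                          # delay in samples

        # one repeat: read the delayed input once, nothing is fed back
        once = x.copy()
        once[d:] += x[:-d]
        print(np.nonzero(once)[0])       # [0 250] -> the click plus exactly one echo

        # many repeats: the delayed signal is written back into the line at lower gain
        buf = x.copy()
        many = x.copy()
        for n in range(d, len(x)):
            buf[n] = x[n] + 0.5 * buf[n - d]   # feedback path
            many[n] = x[n] + buf[n - d]
        print(np.nonzero(many)[0])       # [0 250 500 750] -> echoes keep coming back
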