Framebuffering for effects and glsl pixelshaders: a "help"-patch
the freeframe effects (i assume you use pete warden's freeframe collection) are notorious for cpu load (especially in pd). using them, it doesn't matter whether i use this method or apply the effects to a video directly, the load is always extremely high.
i am planning not to use them at all in the future and instead build up my own collection: first glsl ones ported from example files, and later of course my own. ... won't be happening anytime soon though.
keep in mind that i used 800x600 at 25fps for both window and buffer size. maybe using something smaller and a tad lower framerate will help you.
GEM limitations?
this doesn't quite work like that, since the technical workings are different.
what the freeframe effects and the glsl pixel shaders always do is modify the pixels of a texture.
when you create a normal object like a cube, circle or a model, it is in a state that is different from a pixel image. it sits in a real 3d environment.
... i sort of think of it like the difference between vector- and pixel-graphics that we find in illustrator vs. photoshop.
what you can do is render your whole 3d-space to a rectangle, so it all becomes a texture that gets updated in realtime. then you can apply all the effects that work on videos to that.
you need to use framebuffering for that (see 07.texture->07.framebuffering and 10.glsl->06.framebufferandshader in the gem examples).
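to make the "modifying the pixels of a texture" idea concrete, here is a rough cpu-side analogue in python/numpy of what a pixel shader does (an invert effect standing in for any per-pixel operation; in gem the real thing runs on the gpu, numpy here is just for illustration):

```python
import numpy as np

def invert_effect(texture):
    # runs the same operation on every pixel, which is exactly what a
    # glsl fragment shader does (except the shader runs on the gpu)
    return 1.0 - texture   # channel values assumed in the 0..1 range

# a tiny 2x2 rgb "texture"
tex = np.array([[[0.0, 0.5, 1.0], [0.2, 0.2, 0.2]],
                [[1.0, 1.0, 1.0], [0.0, 0.0, 0.0]]])
out = invert_effect(tex)
```

applying it twice gives back the original texture, which is a handy sanity check. the point is that the effect only ever sees a grid of pixels, never the 3d geometry behind it.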
there are ways to influence the actual object through glsl shaders, like "primitive distortion" in the glsl gem examples.
but this distorts "only" the mesh of the object by moving its faces around.
sort of like the taper, twist or explode modifiers you find in a 3d modelling program.
i imagine, if one wanted to write a shader for a native glow, we'd have to think of creating millions of tiny new rectangles with transparency in realtime, floating like particles around the native object.
unfortunately, the geometry shading patch in the examples doesn't work for me ("error: gl: invalid variable"...whatever that means), so i can't work on that further. ...yet.
i am a total noob to glsl, so for now i am more than occupied figuring out glsl pixel shaders and not entirely sure if what i say here is correct.
please correct me, anybody, if i am wrong - got a lot to learn.
Guitar multi-effects rig
This is my live guitar effects rig as of Feb 14, 2009. Please let me know if you find it useful or have any ideas for effects or other improvements. If you make some music with it, I'd love to hear it!
Once I have some more time to program it, my next effect will probably be a Vocoder.
Run effectsrig.pd to load it up. A midi expression pedal is recommended for the best experience - but it's not required.
It contains the following effects:
whammy~
Digitech whammy style pitch shifter. Allows for smooth changes to the pitch shift amount.
Based on the one posted by "kenn" on the puredata.info forums (which in turn is based on the pd example code).
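Just to illustrate what a pitch shifter does, here is a sketch in Python (this is plain variable-rate resampling, not the crossfaded-delay method the Pd example code uses, and unlike a real whammy it changes the duration too):

```python
import numpy as np

def pitch_shift_resample(x, ratio):
    """Crude pitch shift by variable-rate resampling with linear
    interpolation. ratio=2.0 -> one octave up, 0.5 -> one octave down.
    Note: unlike a real-time whammy, this also changes the duration."""
    n_out = int(len(x) / ratio)
    pos = np.arange(n_out) * ratio          # fractional read positions
    i = np.clip(pos.astype(int), 0, len(x) - 2)
    frac = pos - pos.astype(int)
    return x[i] * (1.0 - frac) + x[i + 1] * frac

sr = 44100
t = np.arange(sr) / sr
tone = np.sin(2 * np.pi * 220.0 * t)        # 220 Hz test tone, 1 second
up = pitch_shift_resample(tone, 2.0)        # comes out near 440 Hz, 0.5 s
```

A real-time shifter has to keep the duration constant, which is why the Pd example instead reads a delay line with a moving, crossfaded read head.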
shimmer~
A "shimmer" synth-like effect. This is done by putting a pitch shifter in the feedback loop of a very short delay.
octfuzz~
Octave-up distortion like you can obtain with the classic transformer and two-diode rectifier circuit. Basically it just full-wave rectifies the audio signal. This one really brings out the high frequencies (sometimes a little too much!).
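The rectification itself is one line; a sketch in Python (NumPy stands in for the signal chain here):

```python
import numpy as np

def octfuzz(x):
    # full-wave rectify: fold the negative half of the waveform up.
    # a rectified wave repeats every half cycle, so the perceived
    # pitch jumps up an octave, with harsh extra harmonics on top.
    # (in practice you'd high-pass afterwards to remove the dc offset.)
    return np.abs(x)

t = np.arange(44100) / 44100.0
tone = np.sin(2 * np.pi * 110.0 * t)   # 110 Hz in
fuzzed = octfuzz(tone)                 # repeats every half cycle: ~220 Hz out
```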
leslie~
A stereo Leslie (rotating speaker) simulator. This is one of my favorites. If modulation is turned all the way down it becomes a tremolo. Take one of the outlets for mono use. Try it in stereo for super-swirly bliss! When using an expression pedal to control the rate, heel down will bypass the effect.
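The tremolo core of this is easy to sketch in Python (amplitude modulation only; the actual leslie~ patch presumably also adds the pitch and filter movement of a rotating speaker, which this leaves out):

```python
import numpy as np

def tremolo_stereo(x, sr, rate_hz=5.0, depth=0.5):
    """Stereo tremolo: amplitude-modulate with a sine LFO, with the
    right channel's LFO inverted for a swirling ping-pong feel.
    depth=0 passes the signal through unchanged (effect off)."""
    t = np.arange(len(x)) / sr
    lfo = np.sin(2 * np.pi * rate_hz * t)
    left = x * (1.0 - depth * 0.5 * (1.0 + lfo))
    right = x * (1.0 - depth * 0.5 * (1.0 - lfo))
    return left, right

sr = 44100
t = np.arange(sr // 10) / sr
guitar = np.sin(2 * np.pi * 196.0 * t)   # open G string stand-in
left, right = tremolo_stereo(guitar, sr, rate_hz=6.0, depth=0.8)
```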
Expression pedal control is done by expression.pd. It simply reads in MIDI and scales it to a 0->1 range. You can change the midi channel used by editing this file.
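The scaling expression.pd performs amounts to this (sketched in Python):

```python
def midi_to_unit(cc_value):
    """Scale a 7-bit MIDI controller value (0-127) to the 0->1 range
    the effects expect, clamping out-of-range input."""
    return min(max(cc_value, 0), 127) / 127.0
```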
The preset system is a little hack-ish, but it works for me. If anyone has any better ideas on how to do this, I'd love to hear them. When you load up the main effectsrig.pd file, you will see a bunch of message boxes. These are quick-settings buttons - just click one to apply that effect. They are designed so you can click a couple in a row to quickly apply a few different settings. To start over, click the big "default" one on the left.
It can also load presets based on midi messages. I use this with my Eventide TimeFactor pedal. When I change presets on the TimeFactor, PD follows along. This is handled by the box in the top right. The symbol box is for song titles, and the number boxes show the current TimeFactor preset. Open this box to see how I've done a couple of example midi controlled presets. "pd your_love_never_fails" is a more complicated example that changes the expression pedal behavior slightly.
If you want to use a different midi channel for listening to program changes, just edit preset.pd and presetnum.pd. preset.pd outputs a bang when the preset number supplied as a parameter is chosen. presetnum.pd just outputs the number of the selected preset.
Synthesizing metal bar sounds
Hi,
i'm working on an installation based on this application made in java.
i communicate with pd via OSC.
for each collision pd receives a bang with two parameters:
tube height
tube position
i'm looking for synthesized metal bar sounds to turn this "thing" into a musical instrument.
there are samples here:
http://obiwannabe.co.uk/html/sound-design/sound-design-audio.html
http://obiwannabe.co.uk/sounds/effect-clonk-002-bar.mp3
http://obiwannabe.co.uk/sounds/effect-clonk-004-iron.mp3
http://obiwannabe.co.uk/sounds/effect-clonk-006-bar.mp3
What kind of simple patch should i make for this goal?
au revoir
Denis
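One common recipe for this kind of sound is modal synthesis: a struck bar rings at a handful of inharmonic partials, so summing a few exponentially decaying sines gets you most of the way. A sketch in Python (the mode ratios are the standard values for a free bar; the decay rates and the idea of mapping tube height to base frequency are illustrative assumptions, not from the original post):

```python
import numpy as np

# classic transverse-mode frequency ratios for a free metal bar
BAR_MODES = [1.0, 2.756, 5.404, 8.933]

def metal_bar(base_hz, sr=44100, dur=1.5):
    """Struck-bar sound: sum of exponentially decaying sines at
    inharmonic bar-mode frequencies. Higher modes decay faster and
    are mixed quieter, giving the metallic 'clonk'."""
    t = np.arange(int(sr * dur)) / sr
    out = np.zeros_like(t)
    for k, ratio in enumerate(BAR_MODES):
        decay = np.exp(-t * (2.0 + 3.0 * k))   # faster decay for higher modes
        out += decay * np.sin(2 * np.pi * base_hz * ratio * t) / (k + 1)
    return out / np.max(np.abs(out))

clonk = metal_bar(440.0)   # e.g. tube height could map to base_hz
```

In Pd this would translate to a small bank of [osc~]/[*~] pairs with decaying envelopes, triggered by each collision bang.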
3d extrusion from pix_texture?
you probably already know/use this patch but [pix_coordinate] looks like it'd come in handy here. i never quite understood the whole deal with the s/t params, but it makes for a fun kaleidoscope effect (http://puredata.hurleur.com/sujet-661-gem-kaleidescope-patch)
i'm pretty sure i get what you're trying to do, but figuring out how to do it is gonna be tricky, i bet. i'm not sure what patch achieves the "transparent" or "opaque" effect you're looking for, and then on top of that you need to figure out a way to stretch it out to a 3d model/figure and retain the transparent qualities. if you can figure out how to implement that openGL external you linked to, then maybe it would be possible to just have two copies of the same pix_texture (be it a still, a video or a webcam) wrapped onto two 2d squares, have one on top of the other with a decent gap in between them, and then figure out a way to "fill" the areas between the 'active' colors on the Y axis. wrapping the texture to a cuboid wraps it to each side, and besides that i don't think an object like cuboid would be flexible enough for this.
basically what i'm thinking is: take the image you posted and, instead of trying to deal with one 3d block (cuboid), use two 2d squares with the same texture, and place them like the top and the bottom of the cuboid (like the bottom and top of a box just floating in space). since these two squares contain the same image, figure out what object you need to fill the 3d space between the 'active' colors (according to how you set your color threshold; that external looks like it might be handy for this). have it fill along the Y axis between the active pixels in the image, and then you should have basically the effect you're looking for. a benefit to this method (if it is even a feasible one, i'm just thinking out loud at this point) is that you wouldn't really even need a 'transparent' effect - the transparent bits would just be spots not filled because their color threshold didn't trigger the fill. then you could take the 2d squares that act as the limits of the fill and just make them invisible one way or another - that way you could spin it all around in 3d and you should be able to see 'through' it from every angle!
all in all i would say it's definitely possible, it's just a matter of what method would work best... i can see there being multiple methods of achieving the desired effect, but you also have to make sure to not be redundant and waste processing overhead (like, say you used a transparency effect on a 2d square and then [repeat]ed it a few hundred times). i just threw that idea out there because i'm not sure that you can treat the default 3d objects to transparency like that...
hope that made sense, and i also hope i understood what you're trying to do correctly (otherwise that's one long useless post i just made). if i'm on the right track let me know and i'd be willing to help you out with it. either way i just had a lot of fun thinking that out!
Some new bird sounds
http://www.obiwannabe.co.uk/sounds/effect-rainforestbirds.mp3
http://www.obiwannabe.co.uk/sounds/effect-riverbirds.mp3
http://www.obiwannabe.co.uk/sounds/effect-seabirds.mp3
Background reading and inspiration
http://ccrma.stanford.edu/~tamara/publications/
http://www.acoustics.hut.fi/research/avesound/pubs/akusem04.pdf
http://www.csounds.com/ezine/winter2000/realtime/
http://www.obiwannabe.co.uk/tutorials/html/tutorial_birds.html
http://www.indiana.edu/~songbird/pubs/publications_index.html
http://web.mit.edu/fee/Public/Publications/Fee_etal1998.pdf
Modular signal chain?
> You could also put _all_ effects in a sub patch three times in a row and then from the main patch control what should be on/off inside these patches, which makes it kinda modular... but it would be a cheap hack and perhaps it will consume cpu even if some effects are OFF?
that's what i do. as long as you thoroughly [switch~] all audio and [spigot] all inputs to control calculations, the cpu won't rise TOO badly.
in my modular synth, i currently have:
8 sampler voices, 4 synth voices, 8 drum voices, 9 effects channels..
each with 5 stages of effect functions, with 25 effects in each stage.
so that's:
(8+4+8+9) * 5 * 25 = 3625 effects units!!!
all loaded at once, but only switched on when i need them.
this uses about 50% of my cpu when all effects are off, so just loading them does raise the cpu usage... but not intolerably so.
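The count above works out like this:

```python
voices = 8 + 4 + 8 + 9   # sampler + synth + drum voices + effects channels
stages = 5               # effect stages per voice/channel
per_stage = 25           # effects available in each stage
total = voices * stages * per_stage
print(total)             # 3625 effect units, all loaded, mostly switched off
```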
Inside on a rainy day
Something to share that combines a few different models in a linked way.
Start with a wind model based on turbulence; objects in its path vary their signals according to wind speed and their size and texture.
http://www.obiwannabe.co.uk/sounds/effect-wind3.mp3
And a rain model with carefully distributed droplets that make little clicks according to a range of textures they hit...
http://www.obiwannabe.co.uk/sounds/effect-plainrain.mp3
Next is a window pane built around a square lamina with glass-like character. Here are a few knocks on the virtual window with a virtual stick.
http://www.obiwannabe.co.uk/sounds/effect-knockonwindow.mp3
and finally I combine them all in the same auditory scene with causal linkage, so the rain lashes against the window...
http://www.obiwannabe.co.uk/sounds/effect-rainywindow.mp3
(Total object count 80 operators)
Andy
Crack
Yep. Things are going that way. Making it a standard is just my wishful vision! Not everyone is going to settle on Pd. There are other dataflow-type interfaces to different unit generator sets like CPS, but there's a distinct movement in the direction of dataflow as a method for building procedural audio code... as it should be. I started to advocate this years ago, as many here know, but found out only recently that EA have indeed ported Pd into some games, mainly for generative music scoring. Sony have something in R&D for the PS3, and certain game audio engine manufacturers have certainly considered it. I continue to knock on their doors, thump my bible and try to convince them to accept the good news into their hearts. It would be wonderful to establish Pd as the main audio component in games for runtime production, because it's the right tool to break down the barrier between sound designer and audio programmer; that's the way to push things forward.
If you want to support this direction, the title to run out and buy is Spore. Brian Eno and others wrote procedural music scores using a cut down version called EAPd, which Mark Danks (GEM author, now at Sony) led the charge to embed as the audio engine.
More than one chapter of the book I'm working on is devoted to designing patches for game applications, how to do dynamic level of detail and interface to event streams from world controllers and physics engines.
I'd say dataflow programmers, whether audio or visual, have a good future ahead for commercial employment (but then I'm (very) biased).
Here are some of the things in development: components for planes, the sort of thing you'd use in air combat games or whatever.
One of them is developed as a practical example in the book. (I'm trying to get an accurate Supermarine Spitfire working at the moment...)
http://obiwannabe.co.uk/sounds/effect-jetengine.mp3
http://obiwannabe.co.uk/sounds/effect-three-synthetic-jets-flypast.mp3
http://obiwannabe.co.uk/sounds/effect-singleprop-cockpit.mp3