Interaction Design Student Patches Available
Greetings all,
I have just posted a collection of student patches from an interaction design course I was teaching at Emily Carr University of Art and Design. I hope the patches will be useful to people playing around with Pure Data in learning environments, installation artwork and other contexts.
The link is: http://bit.ly/8OtDAq
or: http://www.sfu.ca/~leonardp/VideoGameAudio/main.htm#patches
The patches include multi-area motion detection, colour tracking, live audio looping, live video looping, collision detection, real-time video effects, real-time audio effects, 3D object manipulation and more...
Cheers,
Leonard
Pure Data Interaction Design Patches
These are projects from the Emily Carr University of Art and Design DIVA 202 Interaction Design course for the Spring 2010 term. All projects use Pure Data Extended and run on Mac OS X. They could likely be modified with small changes to run on other platforms as well. The focus was on education, so the patches are sometimes "works in progress" technically, but they should still be quite useful for others learning about PD and interaction design.
NOTE: This page may move; please link from http://www.VideoGameAudio.com for the correct location.
Instructor: Leonard J. Paul
Students: Ben, Christine, Collin, Euginia, Gabriel K, Gabriel P, Gokce, Huan, Jing, Katy, Nasrin, Quinton, Tony and Sandy
GabrielK-AsteroidTracker - An entire game based on motion tracking. This is a simple arcade-style game in which the user must navigate the spaceship through a field of oncoming asteroids. The user controls the spaceship by moving a specifically coloured object in front of the camera.
Features: Motion tracking, collision detection, texture mapping, real-time music synthesis, game logic
GabrielP-DogHead - Maps your face from the webcam onto different dogs' bodies in real-time with an interactive audio loop jammer. Fun!
Features: Colour tracking, audio loop jammer, real-time webcam texture mapping
Euginia-DanceMix - Live audio loop playback of four separate channels. Loop selection is random for first two channels and sequenced for last two channels. Slow volume muting of channels allows for crossfading. Tempo-based video crossfading.
Features: Four channel live loop jammer (extended from Hardoff's ma4u patch), beat-based video cross-cutting
Huan-CarDance - Rotates 3D object based on the audio output level so that it looks like it's dancing to the music.
Features: 3D object display, 3D line synthesis, live audio looper
Ben-VideoGameWiiMix - Randomly remixes classic video game footage and music together. Uses the Wiimote to trigger new video via DarwiinRemote and OSC messages.
Features: Wiimote control, OSC, tempo-based video crossmixing, music loop remixing and effects
Christine-eMotionAudio - Mixes together video with recorded sounds and music depending on the amount of motion in the webcam. The intensity of the music and the speed of video playback increase with more motion.
Features: Adaptive music branching, motion blur, blob size motion detection, video mixing
Collin-LouderCars - Videos of cars respond to audio input level.
Features: Video switching, audio input level detection.
Gokce-AVmixer - Live remixing of video and audio loops.
Features: video remixing, live audio looper
Jing-LadyGaga-ing - Remixes video from Lady Gaga's videos with video effects and music effects.
Features: Video warping, video stuttering, live audio looper, audio effects
KatyC_Bunnies - Triggers video and audio using multi-area motion detection. There are three areas on each side to control the video and audio loop selections. Video and audio loops are loaded from directories.
Features: Multi-area motion detection, audio loop directory loader, video loop directory loader
Nasrin-AnimationMixer - Hand animation videos are superimposed over the webcam image and chosen by multi-area motion sensing. Audio loop playback is randomly chosen with each new video.
Features: Multi-area motion sensing, audio loop directory loader
Quintons-AmericaRedux - Videos are remixed in response to live audio loop playback. Some audio effects are mirrored with corresponding video effects.
Features: Real-time video effects, live audio looper
Tony-MusicGame - A music game where the player needs to find how to piece together the music segments triggered by multi-area motion detection on a webcam.
Features: Multi-area motion detection, audio loop directory loader
Sandy-Exerciser - An exercise game where you move to the motions of the video above the webcam video. Stutter effects on video and live audio looper.
Features: Video stutter effect, real-time webcam video effects
Puredata and ableton
hi,
i use puredata and IAC drivers to communicate to ableton live 8.
most of the time it works, but sometimes it seems that live is unable to get the right midi channel. i can see that live receives midi data, but it cannot react to it.
i know this seems to be more a live problem than a puredata one, but maybe someone has had the same issue and knows how to handle it.
thanx,
slx
How do I go about....
oh and if you did manage to find an OSC output from the liveOSC scripts, you would then need to name your LED the same as the OSC command from live, or you would need to reformat the command from live using puredata to match that of the LED.
e.g. if LiveOSC has an OSC send called /live/tempo/beat (I doubt it does) then you would have the LiveOSC script connect to your TouchOSC, and the LED would also be called /live/tempo/beat.
It may be that even if there's no way to send OSC tempo beat messages from Live using the liveosc script, you could maybe use live to map the beat to move a controller, and send this information to puredata where you could reformat it.
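Just to illustrate the reformatting idea outside of Pd, here's a rough python-osc sketch; the addresses, ports and the /live/tempo/beat message are only assumptions for the example. It listens for a LiveOSC message and re-sends it under the name the TouchOSC LED expects.

    from pythonosc.dispatcher import Dispatcher
    from pythonosc.osc_server import BlockingOSCUDPServer
    from pythonosc.udp_client import SimpleUDPClient

    # TouchOSC device address and port - illustrative values only
    touchosc = SimpleUDPClient("192.168.1.20", 9000)

    def relay_beat(address, *args):
        # rename the incoming Live message to whatever the LED control is called
        touchosc.send_message("/1/led_beat", 1.0)

    dispatcher = Dispatcher()
    dispatcher.map("/live/tempo/beat", relay_beat)  # hypothetical LiveOSC address

    # listen on whatever port the LiveOSC script sends to
    BlockingOSCUDPServer(("0.0.0.0", 9001), dispatcher).serve_forever()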
I just checked http://github.com/willrjmarshall/AbletonDJTemplate/blob/44609dbd1be136d517c420f14fc987e9aa96fcc6/TouchOSCTemplate/LiveOSC/OSCAPI.txt
There isn't anything that will solely broadcast beats over OSC, but there are a couple of things that, with some puredata manipulation, might be able to do this:
/live/tempo = Request current tempo, replies with /live/tempo (float tempo)
/live/tempo (float tempo) = Set the tempo, replies with /live/tempo (float tempo)
/live/time = Request current song time, replies with /live/time (float time)
/live/time (float time) = Set the time, replies with /live/time (float time)
It might (just) be possible to write code to examine the song tempo, regularly examine the song time and from there calculate when the beat will drop. I'm not about to start on this any time soon but there is the potential to do this. The amount of work might be offset by actually monitoring the metronome, or just listening to the music to get an idea where the bars are ;o)
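For what it's worth, the arithmetic is simple once you have those two values. A minimal sketch, assuming /live/time reports the song position in seconds; if it is already in beats, skip the tempo conversion and just take the fractional part:

    def beat_phase(song_time, tempo_bpm):
        beats_elapsed = song_time * tempo_bpm / 60.0   # seconds -> beats
        return beats_elapsed % 1.0                     # 0.0 means exactly on the beat

    def seconds_to_next_beat(song_time, tempo_bpm):
        beat_len = 60.0 / tempo_bpm
        return (1.0 - beat_phase(song_time, tempo_bpm)) * beat_len

    # e.g. 30.1 s into the song at 128 BPM:
    print(beat_phase(30.1, 128))            # ~0.21 of the way into the current beat
    print(seconds_to_next_beat(30.1, 128))  # ~0.37 s until the next beat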
Just guesses but good luck with it. I want to do something similar as I'm working on a full remote DJ tool for TouchOSC (clip browser and selector, tempo, monitoring, mixer, xfader, effects and filters). I'm using puredata for this, TouchOSC and the LiveOSC remote scripts. I've tested all the theoretical bits for this and have it working; I just need to put the behemoth PD patch together to handle it all! Oh, and another feature will be a toggle on the accelerometer to control any effect, fader etc, so "put your hands in the air" and "drop the bass" will be actually doable! Of course, the main reason I want this is that I like the music I spin and I'm sick of being stuck behind sensitive equipment where I can't dance properly.
Looking for very simple looper patch...
I'm building a live setup with pd, and I want to have a tempo synced looper that's a constant number of bars behind the live playing, so that with one button I can switch from "live" to "looped", and it will loop the last x bars until I switch it back to "live". Has anyone seen something like this I can easily modify, or is this something I'm going to have to do from scratch?
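To be concrete, the behaviour I'm after is roughly this (a Python/numpy sketch of the idea rather than Pd; in Pd I imagine it would be a table written continuously by a phasor-driven writer and read back with [tabread4~] or [tabplay~]):

    import numpy as np

    class CatchUpLooper:
        # Holds the last `bars` bars of audio in a ring buffer.
        # live=True: input passes through while being recorded.
        # live=False: the buffer loops from the oldest sample onward.
        def __init__(self, sr=44100, bpm=120, bars=4, beats_per_bar=4):
            self.loop_len = int(sr * 60.0 / bpm * beats_per_bar * bars)
            self.buf = np.zeros(self.loop_len, dtype=np.float32)
            self.write_pos = 0
            self.read_pos = 0
            self.live = True

        def toggle(self):
            self.live = not self.live
            if not self.live:
                self.read_pos = self.write_pos  # oldest sample = x bars ago

        def process(self, block):
            out = np.empty_like(block)
            for i, x in enumerate(block):
                if self.live:
                    self.buf[self.write_pos] = x
                    self.write_pos = (self.write_pos + 1) % self.loop_len
                    out[i] = x
                else:
                    out[i] = self.buf[self.read_pos]
                    self.read_pos = (self.read_pos + 1) % self.loop_len
            return out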
Max4Live... How about Pd4Live?
@nestor said:
AFAIK, the best way to do what you want to do is with Jack, so keep it up.
True - once you have it set up right, it's totally cool. But when you bounce down stuff from Pd though, I find it better to record on the Pd side. I've noticed that when recording in Live, if there's a heavy processor part the latency increases - Pd's logical time and Live's logical time are NOT synced. Therefore parts of the recording get delayed by a few ms, which is a bit annoying. So I have a handy multi-track recorder for that (I can upload it if anyone finds it useful) and I just drag the resulting wavs into Live.
I do wish there was Jack transport/midi support in Live tho.
At the end of the day, it's all about convenience. I don't see what you can do with Max4Live that you can't with Pd, Jack and Live. It's just not as integrated. Before someone says sample accuracy, yeah yeah yeah... but did we even worry about it when we all had hardware samplers and synths? Did that stop some of the best electronic tunes being produced?
When you really think about it (and start to really pick it apart) what are you going to be able to do extra specially in this new arrangement. Like slvmchn says you'll have to remake every patch hah hah ha!
(i'm just fed up with shelling out for proprietary software - can you tell?!)
S
Max4Live... How about Pd4Live?
Is it possible to integrate Pd with Ableton Live in a way similar to Max4Live? As a Live/Pd user (newbie really), this would be a very important issue in my work. As far as I know, Max4Live includes many instruments that can be opened up in a Max canvas (similar to Pd) environment for editing, as well as allowing additional access to the inner workings of Live: http://www.ableton.com/maxforlive. I just wanted to verify whether there is (or is going to be) any movement to allow for such integration. I am already aware of Jack for sending audio directly from Pd to Live and vice versa (not sure about MIDI capabilities), but I'm referring to the truly open possibilities that this type of integration would allow.
Thanks,
Jeff
PD to Ableton Live MIDI weirdness - ctlout works, noteout does not???
Hi, this question goes out to those of you who use PD and Ableton live.
I am trying to control an instrument in Live (Live's Sampler) from PD, and I have ctlout working just fine: I can control parameters such as filter knobs on the Sampler from a PureData ctlout with no problem.
I am using the IAC MIDI driver to route MIDI from PD to Live as I am on a Mac OSX machine.
My problem is that when sending noteout signals, Live is receiving the MIDI - as evidenced by Live's MIDI receive indicator lighting up - but it is not making notes play in the sampler, even though I have it set to receive MIDI from that IAC bus. Like I said, ctlout has no problem controlling a filter knob, etc...
Anyone have any suggestions? I am kind of a noob to MIDI stuff and to interapplication MIDI routing.
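One way to narrow it down might be to send a CC and a note to the same IAC bus from a tiny script and see whether Live reacts differently to each. A rough sketch with the mido library (it needs a backend such as python-rtmidi; the port name here is an assumption - check mido.get_output_names() for the real one, and note that mido's channel=0 is MIDI channel 1, the same as Pd's default):

    import time
    import mido

    print(mido.get_output_names())               # find the exact IAC port name
    out = mido.open_output("IAC Driver Bus 1")   # assumed port name

    # a CC (like ctlout) and a note (like noteout), both on MIDI channel 1
    out.send(mido.Message("control_change", channel=0, control=74, value=100))
    out.send(mido.Message("note_on", channel=0, note=60, velocity=100))
    time.sleep(0.5)
    out.send(mido.Message("note_off", channel=0, note=60, velocity=0))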
Knobs and Sliders PD MIDI Setup?
what i don't understand is, why do you need pd if you want to control traktor? isn't it possible to control traktor directly with the nintendo?
however, have a look at the objects [noteout] and [ctlout]; this way you can send midi from pd via the midi out you selected in pd's properties. so, if you choose for example midi yoke 2 as the midi out in pd, then you have to set midi yoke 2 as the midi in in traktor. then sliders connected to [ctlout] should be sent to traktor.
an example: create a ctlout object with the arguments 60 and 1; it should look like this:
[ctlout 60 1]
connect a slider to the first inlet of the ctlout object.
now the slider values should be sent to pd's midi out via controller number 60 on channel 1.
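in case it helps to see what that corresponds to on the wire, a tiny sketch (just for illustration): [ctlout 60 1] produces a control-change message on channel 1, i.e. status byte 0xB0 followed by the controller number and the slider value.

    slider_value = 100
    cc_message = bytes([0xB0, 60, slider_value])  # status (CC, channel 1), controller 60, value
    print(cc_message.hex(" "))                    # -> "b0 3c 64"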
Pdvst help
i just tried the whole thing with a simple reverb during my lunchbreak and it worked (for me) as follows, using ableton live 7:
- put the "pdvst" folder (the whole unpacked thing from the download, including the custom pd program) in C:\Program Files\Ableton\Live 7.0.3\Program
- copied the PD_Gain.pd and PD_Gain.pdv to my pd installation to edit them. (at least i can't really edit with the pd version in the pdvst folder; it's obviously just meant to run in the background when the DAW starts)
- edited the .pd and .pdv files to make them what you see in the attachment. i threw out the gripd stuff to eliminate a possible error source. maybe by comparing your pdv file to mine you can find out if there's something wrong with it.
- copied the finished, now named PD_freeverb.pd and PD_freeverb.pdv back into the pdvst folder within the ableton live folder. since my patch uses the external "freeverb~.dll" (which comes with pd-extended), i copied that file into the pdvst folder as well - the corresponding line in the pdv file needs to point to this lib.
- made a copy of the template dll and named it PD_freeverb.dll
- put the PD_freeverb.dll into my vst-plugins directory (the one monitored by live)
- started live - the plugin shows up as an (audio) vst plugin. all fine.
since i set the line
# Display Pd GUI or not
DEBUG = TRUE
in the pdv file, i get the full patch and surprisingly can even edit it while having it running as a plugin. i doubt that this is a safe way of working though. ^_^
i hope that all helps somehow.
attached are the PD_freeverb.pd, PD_freeverb.pdv, PD_freeverb.dll and freeverb~.dll (from the newest pd-extended installer)
edit: the only thing "suspicious" is that when i start live, a message box comes up telling me that pd has crashed/exited. (pd was not run by me at that point)
however, observing the task manager, whenever i load the plugin into a track in live, pd.exe is started, and it fully quits if i remove that plugin again.
Dumb(?) LFO and Phase question
Hi,
I think this might be a dumb question - but (possibly for want of the correct vocabulary) Google hasn't turned up an answer, so if anybody could help I'd be really grateful.
I'm putting together a modulation matrix to route a few LFOs to different outputs, and it occurred to me that it'd be cool to be able to control the phase shift of the LFO at some outputs - so the same LFO would drive (for example) the amplitudes of two oscillators, but with one cycle 90/180/arbitrary degrees out of phase with the other - so osc1's volume rises as osc2's volume falls.
Is there a reasonably straightforward way of doing this?
My first thought was to use phasors to drive the LFOs, then shape the LFO using lookup tables - running a few tables with (eg) sine waves shifted by different increments - pick the table and you pick your phase shift.
Seems like there must be a simpler and more flexible way though.
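For example, I wondered whether you could just add a per-output offset (in cycles) to the phasor and wrap it before the waveshaping, so every output reads the same underlying ramp instead of needing separate pre-shifted tables. A rough Python sketch of the maths I have in mind:

    import math

    def lfo(freq_hz, t, phase_offset=0.0, shape=lambda p: math.sin(2 * math.pi * p)):
        # phase_offset is in cycles: 0.25 = 90 degrees, 0.5 = 180 degrees.
        # The offset is added to the ramp and wrapped, like [phasor~] -> [+~] -> [wrap~].
        phase = (freq_hz * t + phase_offset) % 1.0
        return shape(phase)

    # same 2 Hz LFO driving two amplitudes a quarter-cycle apart:
    t = 0.125
    amp1 = lfo(2.0, t)         # no offset
    amp2 = lfo(2.0, t, 0.25)   # 90 degrees later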
Any ideas?
Ta
Dan