BECAUSE you guys are MIDI experts, you could well help on this...
Dear Anyone who understands virtual MIDI circuitry
I'm a disabled wannabe composer who has to use a notation package and mouse, because I can't physically play a keyboard. I use Quick Score Elite Level 2 - it doesn't have its own forum - and I'm having one HUGE problem with it that's stopping me from mixing - literally! I can see it IS possible to do what I want with it, I just can't get my outputs and virtual circuitry right.
I've got 2 main multi-sound plug-ins I use with QSE. Sampletank 2.5 with Miroslav Orchestra and Proteus VX. Now if I choose a bunch of sounds from one of them, each sound comes up on its own little stave and slider, complete with places to insert plug-in effects (like EQ and stuff.) So far, so pretty.
So you've got - say - 5 sounds. Each one is on its own stave, so any notes you put on that stave get played by that sound. The staves have controllers so you can control the individual sound's velocity/volume/pan/aftertouch etc. They all work fine. There are also a bunch of spare controller numbers. The documentation with QSE doesn't really go into how you use those. It's a great program but its customer relations need sorting - no forum, Canadian guys who wrote it very rarely answer E-mails in a meaningful way, hence me having to ask this here.
Except the sliders don't DO anything! The only one that does anything is the one the main synth is on. That's the only one that takes any notice of the effects you use. Which means you're putting the SAME effect on the WHOLE SYNTH, not just on the one instrument sound you've chosen from it. Yet the slider the main synth is on looks exactly the same as all the other sliders. The other sliders just slide up and down without changing the output sounds in any way. Nor do any effects plugins you put on the individual sliders change any of the sounds. They only work if you put them on the main slider that the whole synth is sitting on - and then, of course, the effect's applied to ALL the sounds coming out of that synth, not just the single sound you want to alter.
I DO understand that MIDI isn't sounds, it's instructions to make sounds, but if the slider the whole synth is on works, how do you route the instructions to the other sliders so they accept them, too?
Anyone got any idea WHY the sounds aren't obeying the sliders they're sitting on? Oddly enough, single-shot plug-ins DO obey the sliders perfectly. It's just the multi-sound VSTs whose sounds don't individually want to play ball.
Now when you select a VSTi, you get 2 choices - assign it to a track or use All Channels. If you assign it to a track, of course only instructions routed to that track will be picked up by the VSTi. BUT - they only go to the one instrument on that VST channel. So you can then apply effects happily to the sound on Channel 1. I can't work out how to route the effects for the instrument on Channel 2 to Channel 2 in the VSTi, and so on. Someone told me on another forum that because I've got everything on All Channels, the effects signals are cancelling each other out, but I can't find out anything more about this at the moment.
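For context, the per-channel routing being described here comes down to the channel number embedded in each MIDI message's status byte: a multitimbral synth applies a control change only to the part listening on that channel. Here is a byte-level sketch in plain Python - purely illustrative, not specific to QSE or any particular VSTi (channels are 0-15 internally, shown as 1-16 in most software):

```python
def control_change(channel, controller, value):
    """Build a raw 3-byte MIDI Control Change message (status byte 0xBn).

    The low nibble of the status byte IS the channel - that is the only
    thing separating 'CC7 for instrument 1' from 'CC7 for instrument 2'.
    """
    return bytes([0xB0 | (channel & 0x0F), controller & 0x7F, value & 0x7F])

# The same CC7 (volume) value sent to channel 1 vs channel 2 differs
# only in the status byte: 0xB0 0x07 0x64 vs 0xB1 0x07 0x64.
ch1 = control_change(0, 7, 100)
ch2 = control_change(1, 7, 100)
```

If the host sends everything on one channel (or the synth listens on "All Channels"), the synth has no way to tell which part a controller message was meant for, which matches the symptom described above.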
I know, theoretically, if I had one instance of the whole synth and just used one instrument from each instance, that would work. It does. Thing is, with Sampletank I got Miroslav Orchestra, and you can't load PART of Miroslav. It's all or nothing. So if I wanted 12 instruments that way, I'd have to have 12 copies of Miroslav in memory, and you just don't get enough memory in a 32-bit PC for that.
To round up: what I'm trying to do is set things up so I can send separate effects - EQ etc. - to separate virtual instruments from ONE instance of a multi-sound sampler (Proteus VX or Sampletank). I know it must be possible because the main synth takes the effects OK; it's just routing them to the individual sounds that's thrown me. I know you get one-shot sound VSTis, but - no offence to any creators here - the sounds usually ain't that good from them. Besides, all my best sounds are in Miroslav/Proteus VX and I just want to be able to create/mix pieces using those.
I'm a REAL NOOOB with all this so if anyone answers - keep it simple. Please! If anyone needs more info to answer this, just ask me what info you need and I'll look it up on the program.
Yours respectfully
ulrichburke
Interaction Design Student Patches Available
Greetings all,
I have just posted a collection of student patches for an interaction design course I was teaching at Emily Carr University of Art and Design. I hope that the patches will be useful to people playing around with Pure Data in a learning environment, installation artwork and other uses.
The link is: http://bit.ly/8OtDAq
or: http://www.sfu.ca/~leonardp/VideoGameAudio/main.htm#patches
The patches include multi-area motion detection, colour tracking, live audio looping, live video looping, collision detection, real-time video effects, real-time audio effects, 3D object manipulation and more...
Cheers,
Leonard
Pure Data Interaction Design Patches
These are projects from the Emily Carr University of Art and Design DIVA 202 Interaction Design course for Spring 2010 term. All projects use Pure Data Extended and run on Mac OS X. They could likely be modified with small changes to run on other platforms as well. The focus was on education so the patches are sometimes "works in progress" technically but should be quite useful for others learning about PD and interaction design.
NOTE: This page may move, please link from: http://www.VideoGameAudio.com for correct location.
Instructor: Leonard J. Paul
Students: Ben, Christine, Collin, Euginia, Gabriel K, Gabriel P, Gokce, Huan, Jing, Katy, Nasrin, Quinton, Tony and Sandy
GabrielK-AsteroidTracker - An entire game based on motion tracking. This is a simple arcade-style game in which the user must navigate the spaceship through a field of oncoming asteroids. The user controls the spaceship by moving a specifically coloured object in front of the camera.
Features: Motion tracking, collision detection, texture mapping, real-time music synthesis, game logic
GabrielP-DogHead - Maps your face from the webcam onto different dogs' bodies in real-time, with an interactive audio loop jammer. Fun!
Features: Colour tracking, audio loop jammer, real-time webcam texture mapping
Euginia-DanceMix - Live audio loop playback of four separate channels. Loop selection is random for first two channels and sequenced for last two channels. Slow volume muting of channels allows for crossfading. Tempo-based video crossfading.
Features: Four channel live loop jammer (extended from Hardoff's ma4u patch), beat-based video cross-cutting
Huan-CarDance - Rotates 3D object based on the audio output level so that it looks like it's dancing to the music.
Features: 3D object display, 3D line synthesis, live audio looper
Ben-VideoGameWiiMix - Randomly remixes classic video game footage and music together. Uses the wiimote to trigger new video via DarwiinRemote and OSC messages.
Features: Wiimote control, OSC, tempo-based video crossmixing, music loop remixing and effects
Christine-eMotionAudio - Mixes together video with recorded sounds and music depending on the amount of motion in the webcam. Intensity level of music increases and speed of video playback increases with more motion.
Features: Adaptive music branching, motion blur, blob size motion detection, video mixing
Collin-LouderCars - Videos of cars respond to audio input level.
Features: Video switching, audio input level detection.
Gokce-AVmixer - Live remixing of video and audio loops.
Features: video remixing, live audio looper
Jing-LadyGaga-ing - Remixes footage from Lady Gaga's videos with video effects and music effects.
Features: Video warping, video stuttering, live audio looper, audio effects
KatyC_Bunnies - Triggers video and audio using multi-area motion detection. There are three areas on each side to control the video and audio loop selections. Video and audio loops are loaded from directories.
Features: Multi-area motion detection, audio loop directory loader, video loop directory loader
Nasrin-AnimationMixer - Hand animation videos are superimposed over the webcam image and chosen by multi-area motion sensing. Audio loop playback is randomly chosen with each new video.
Features: Multi-area motion sensing, audio loop directory loader
Quintons-AmericaRedux - Videos are remixed in response to live audio loop playback. Some audio effects are mirrored with corresponding video effects.
Features: Real-time video effects, live audio looper
Tony-MusicGame - A music game where the player needs to find how to piece together the music segments triggered by multi-area motion detection on a webcam.
Features: Multi-area motion detection, audio loop directory loader
Sandy-Exerciser - An exercise game where you move to the motions of the video above the webcam video. Stutter effects on video and live audio looper.
Features: Video stutter effect, real-time webcam video effects
How do I go about....
oh and if you did manage to find an OSC output from the liveOSC scripts, you would then need to call your LED the same as the OSC command from live, or you would need to reformat the command from live using puredata to that of the LED.
e.g. if LiveOSC has an OSC send called /live/tempo/beat (I doubt it does) then you would have LiveOSC script connect to your TouchOSC, and the LED would be also /live/tempo/beat.
It may be that even if there's no way to send OSC tempo beat messages from Live using the liveosc script, you could maybe use live to map the beat to move a controller, and send this information to puredata where you could reformat it.
I just checked http://github.com/willrjmarshall/AbletonDJTemplate/blob/44609dbd1be136d517c420f14fc987e9aa96fcc6/TouchOSCTemplate/LiveOSC/OSCAPI.txt
There isn't anything that will solely broadcast beats over OSC, but there are a couple of things that with some puredata manipulation might be able to do this:
/live/tempo = Request current tempo, replies with /live/tempo (float tempo)
/live/tempo (float tempo) = Set the tempo, replies with /live/tempo (float tempo)
/live/time = Request current song time, replies with /live/time (float time)
/live/time (float time) = Set the time, replies with /live/time (float time)
It might (just) be possible to write code that queries the song tempo, regularly queries the song time, and from there calculates when the beat will drop. I'm not about to start on this any time soon, but the potential is there. The amount of work might be avoided by simply monitoring the metronome, or just listening to the music to get an idea of where the bars are ;o)
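The tempo/time arithmetic involved is simple enough to sketch in plain Python (assuming /live/time replies in seconds and /live/tempo in BPM, which is a guess - check the actual units before relying on this):

```python
import math

def beat_position(song_time_s, tempo_bpm):
    """Beats elapsed since song start, from /live/time and /live/tempo replies."""
    return song_time_s * tempo_bpm / 60.0

def seconds_to_next_beat(song_time_s, tempo_bpm):
    """Seconds until the next whole beat 'drops'."""
    beats = beat_position(song_time_s, tempo_bpm)
    frac = beats - math.floor(beats)
    if frac == 0.0:
        return 0.0
    return (1.0 - frac) * 60.0 / tempo_bpm

# 10.25 s into a 120 BPM song is beat 20.5, so 0.25 s until beat 21
```

In practice you'd poll /live/time, feed the replies through something like this, and schedule the LED flash yourself; tempo changes mid-song would throw the count off unless you re-query.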
Just guesses, but good luck with it. I want to do something similar as I'm working on a full remote DJ tool for TouchOSC (clip browser and selector, tempo, monitoring, mixer, xfader, effects and filters). I'm using puredata for this, TouchOSC and the LiveOSC remote scripts. I've tested all the theoretical bits and have them working; I just need to build the behemoth PD patch to handle it all! Oh, and another feature will be a toggle on the accelerometer to control any effect, fader etc, so "put your hands in the air" and "drop the bass" will be actually doable! Of course, the main reason I want this is that I like the music I spin and I'm sick of being stuck behind sensitive equipment where I can't dance properly.
Max4Live... How about Pd4Live?
@nestor said:
AFAIK, the best way to do what you want to do is with Jack, so keep it up.
True - once you have it set up right, it's totally cool. But when you bounce down stuff from Pd, i find it better to record on the Pd side. I've noticed that when recording in Live, if there's a heavy processor part the latency increases - Pd's logical time and Live's logical time are NOT synced. Therefore parts of the recording get delayed by a few ms, which is a bit annoying. So I have a handy multitrack recorder for that (i can upload it if anyone finds it useful) and I just drag the resulting wavs into Live.
I do wish there was Jack transport/midi support in Live tho.
At the end of the day, it's all about convenience. I don't see what you can do with Max4Live that you can't with Pd, Jack and Live. It's just not as integrated. Before someone says sample accuracy - yeah, yeah, yeah... but did we even worry about that when we all had hardware samplers and synths? Did that stop some of the best electronic tunes being produced?
When you really think about it (and start to really pick it apart), what are you going to be able to do extra specially in this new arrangement? Like slvmchn says, you'll have to remake every patch hah hah ha!
(i'm just fed up with shelling out for proprietary software - can you tell?!)
S
PD to Ableton Live MIDI weirdness - ctlout works, noteout does not???
Hi, this question goes out to those of you who use PD and Ableton live.
I am trying to control an instrument in Live, Live's Sampler, from PD, and I have ctlout working just fine, in that I can control parameters of filter knobs etc. on a Live Sampler from a PureData ctlout with no problem.
I am using the IAC MIDI driver to route MIDI from PD to Live as I am on a Mac OSX machine.
My problem is that when sending noteout signals, Live is receiving the MIDI as evidenced by Live's MIDI receive indicator lighting up, but it is not making notes play in the sampler, even though I have it set to receive MIDI from that IAC bus. Like I say the ctlout has no problem controlling a filter knob, etc...
Anyone have any suggestions? I am kind of a noob to MIDI stuff and to interapplication MIDI routing.
Generat v0.1 liveperformance 3d synth
not sure whether to put this into patches or pixels, but since everybody is always so nice about sharing patches, i feel like pointing to a project i made back in 2004.
it is sort of a live tool for 3d mixing called generat. i am working on it again for the first time since then, since i got a few gigs doing visuals again.
anyhow, the old pack i put together back then, is here:
http://0io.org/projects.htm - there's a link to an extensive online manual as well
http://0io.org/media/hybrid/generat/generat_essentials_v01.zip
interface screenshot:
http://0io.org/media/generat/gen00.jpg
maybe someone can make use of it.
have fun
edit: updated broken links.
Pdvst help
i just tried the whole thing with a simple reverb during my lunchbreak and it worked (for me) as follows, using ableton live 7:
- put the "pdvst" folder (the whole unpacked thing from the download, including the custom pd program) in C:\Program Files\Ableton\Live 7.0.3\Program
- copied the PD_Gain.pd and PD_Gain.pdv to my pd installation to edit them. (at least i can't really edit with the pd version in the pdvst folder; it's obviously just meant to run in the background when the DAW starts)
- edited the .pd and .pdv files to make them what you see in the attachment. i threw out the gripd stuff to erase a possible error source. maybe by comparing your pdv file to mine, you'll find if there's something wrong with it.
- copied the finished files, now named PD_freeverb.pd and PD_freeverb.pdv, back into the pdvst folder within the ableton live folder. since my patch uses the external "freeverb~.dll" (which comes with pd-extended), i copied that file into the pdvst folder as well - the according line in the pdv file needs to point to this lib.
- made a copy of the template dll and named it PD_freeverb.dll
- put the PD_freeverb.dll into my vst-plugins directory (the one monitored by live)
- started live - the plugin shows up as an (audio) vst plugin. all fine.
since i set the line
# Display Pd GUI or not
DEBUG = TRUE
in the pdv file, i get the full patch and surprisingly can even edit it while it's running as a plugin. i doubt that this is a safe way of working though. ^_^
i hope that all helps somehow.
attached are the PD_freeverb.pd, PD_freeverb.pdv, PD_freeverb.dll and freeverb~.dll (from the newest pd-extended installer)
edit: the only thing "suspicious" is that when i start live, a messagebox comes up, telling me that pd has crashed/exited. (pd was not run by me at that point)
however, observing the task manager, whenever i load the plugin into a track in live, pd.exe is started, and it fully quits if i remove the plugin again.
I've changed to ubuntu ... but met new problem ...
I wanted to remove them all and reinstall, but in a root terminal it reported that 0 files had been removed,
so I removed all the pd folders in the terminal.
after that I reinstalled the extended-intrepid version, but
when I run it in the terminal, I get this report:
priority 98 scheduling enabled.
sh: /usr/local/bin/pd-watchdog: not found
priority 96 scheduling enabled.
sh: /usr/local/bin/pd-gui: not found
and I think that's because I removed the pd-0.40.2 folder which I compiled before.
then I searched on Google and found that files you compile yourself should be installed and removed with checkinstall,
so I downloaded the package and installed it in the terminal,
but checkinstall failed.
the report is here:
checkinstall 1.6.1, Copyright 2002 Felipe Eduardo Sanchez Diaz Duran
This software is released under the GNU GPL.
The package documentation directory ./doc-pak does not exist.
Should I create a default set of package docs? [y]: y
Preparing package documentation...OK
*** No known documentation files were found. The new package
*** won't include a documentation directory.
*****************************************
**** Debian package creation selected ***
*****************************************
This package will be built according to these values:
0 - Maintainer: [ root@mal ]
1 - Summary: [ pd description ]
2 - Name: [ src ]
3 - Version: [ ]
4 - Release: [ 1 ]
5 - License: [ GPL ]
6 - Group: [ checkinstall ]
7 - Architecture: [ i386 ]
8 - Source location: [ src ]
9 - Alternate source location: [ ]
10 - Requires: [ ]
11 - Provides: [ src ]
Enter a number to change any of them or press ENTER to continue:
Installing with make install...
========================= Installation results ===========================
cd ../obj; cc -Wl,-export-dynamic -o ../bin/pd g_canvas.o g_graph.o g_text.o g_rtext.o g_array.o g_template.o g_io.o g_scalar.o g_traversal.o g_guiconnect.o g_readwrite.o g_editor.o g_all_guis.o g_bang.o g_hdial.o g_hslider.o g_mycanvas.o g_numbox.o g_toggle.o g_vdial.o g_vslider.o g_vumeter.o m_pd.o m_class.o m_obj.o m_atom.o m_memory.o m_binbuf.o m_conf.o m_glob.o m_sched.o s_main.o s_inter.o s_file.o s_print.o s_loader.o s_path.o s_entry.o s_audio.o s_midi.o d_ugen.o d_ctl.o d_arithmetic.o d_osc.o d_filter.o d_dac.o d_misc.o d_math.o d_fft.o d_array.o d_global.o d_delay.o d_resample.o x_arithmetic.o x_connective.o x_interface.o x_midi.o x_misc.o x_time.o x_acoustics.o x_net.o x_qlist.o x_gui.o x_list.o d_soundfile.o s_midi_oss.o s_audio_oss.o d_fft_mayer.o d_fftroutine.o \
-ldl -lm -lpthread
cc -g -O2 -DPD -Wall -W -Wstrict-prototypes -Wno-unused -Wno-parentheses -Wno-switch -DDL_OPEN -DPA_USE_OSS -DUNIX -DUNISTD -DUSEAPI_OSS -I../portaudio/pa_common -I../portaudio/pablio -I../portmidi/pm_common -I../portmidi/pm_linux -fno-strict-aliasing -O6 -funroll-loops -fomit-frame-pointer -D_LARGEFILE64_SOURCE -o ../bin/pd-watchdog s_watchdog.c
cc -g -O2 -DPD -Wall -W -Wstrict-prototypes -Wno-unused -Wno-parentheses -Wno-switch -DDL_OPEN -DPA_USE_OSS -DUNIX -DUNISTD -DUSEAPI_OSS -I../portaudio/pa_common -I../portaudio/pablio -I../portmidi/pm_common -I../portmidi/pm_linux -fno-strict-aliasing -O6 -funroll-loops -fomit-frame-pointer -D_LARGEFILE64_SOURCE -o ../bin/pdsend u_pdsend.c
cd ../extra/bonk~;make
make[1]: Entering directory `/home/mal/Documents/pd-0.40-2/extra/bonk~'
make[1]: Nothing to be done for `current'.
make[1]: Leaving directory `/home/mal/Documents/pd-0.40-2/extra/bonk~'
cd ../extra/choice;make
make[1]: Entering directory `/home/mal/Documents/pd-0.40-2/extra/choice'
make[1]: Nothing to be done for `current'.
make[1]: Leaving directory `/home/mal/Documents/pd-0.40-2/extra/choice'
cd ../extra/expr~;make
make[1]: Entering directory `/home/mal/Documents/pd-0.40-2/extra/expr~'
make[1]: Nothing to be done for `current'.
make[1]: Leaving directory `/home/mal/Documents/pd-0.40-2/extra/expr~'
cd ../extra/fiddle~;make
make[1]: Entering directory `/home/mal/Documents/pd-0.40-2/extra/fiddle~'
make[1]: Nothing to be done for `current'.
make[1]: Leaving directory `/home/mal/Documents/pd-0.40-2/extra/fiddle~'
cd ../extra/loop~;make
make[1]: Entering directory `/home/mal/Documents/pd-0.40-2/extra/loop~'
make[1]: Nothing to be done for `current'.
make[1]: Leaving directory `/home/mal/Documents/pd-0.40-2/extra/loop~'
cd ../extra/lrshift~;make
make[1]: Entering directory `/home/mal/Documents/pd-0.40-2/extra/lrshift~'
make[1]: Nothing to be done for `current'.
make[1]: Leaving directory `/home/mal/Documents/pd-0.40-2/extra/lrshift~'
cd ../extra/pique;make
make[1]: Entering directory `/home/mal/Documents/pd-0.40-2/extra/pique'
make[1]: Nothing to be done for `current'.
make[1]: Leaving directory `/home/mal/Documents/pd-0.40-2/extra/pique'
cd ../extra/sigmund~;make
make[1]: Entering directory `/home/mal/Documents/pd-0.40-2/extra/sigmund~'
make[1]: Nothing to be done for `current'.
make[1]: Leaving directory `/home/mal/Documents/pd-0.40-2/extra/sigmund~'
install -d /usr/local/lib/pd/bin
install: cannot create directory `/usr/local/lib/pd': No such file or directory
make: *** [install] Error 1
**** Installation failed. Aborting package creation.
Cleaning up...OK
Bye.
root@mal:/home/mal/Documents/pd-0.40-2/src#
I don't know what to do now...
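For what it's worth, the failing step in the log above is the Makefile's `install -d /usr/local/lib/pd/bin`, and the usual causes are running the install without root privileges or a read-only /usr/local. A sketch of the step against a scratch prefix (the scratch directory is a stand-in; on the real system the fix is most likely running checkinstall/`make install` with sudo, or pre-creating the directory as root first):

```shell
# Reproduce the failing install step against a scratch prefix.
# On the real system you'd run `sudo checkinstall` (or `sudo make install`),
# or first do: sudo mkdir -p /usr/local/lib/pd
PREFIX=$(mktemp -d)               # stand-in for /usr/local
install -d "$PREFIX/lib/pd/bin"   # same command that failed in the log
ls -d "$PREFIX/lib/pd/bin"        # directory now exists
```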
C::NTR::L 1.0 - live AV improv+physical computing
Hi everybody, i've never written on this forum, but i've always been following the threads, and i must say thanks to all the contributors here for their help; you're all doing amazing work with this forum.
Now i'm here because i'd like to introduce you to C::NTR::L 1.0 (beta).
C::NTR::L is free software for real-time human-computer interaction exploiting the possibilities of physical computing, developed in PureData by Marco Donnarumma. It seeks to be a tool for audiovisual live improvisation. The project started in 2007 and remains a constant work in progress; i'm always interested in new ideas and collaborations (recently, for example, i worked a bit with Servando Barreiro and we included a module to use sensors, exploiting his DIY hardware Minia). This is version 1.0BETA.
I'm planning to publish the patch, but first i want to work more on the interface and enhance some features, to make the tool usable also for people who don't work every day with graphical programming.
C::NTR::L transforms your standard chord instrument - electric bass, guitar, violin, etc. - into an audiovideo controller without requiring specific external hardware or MIDI technology.
Once you've connected the instrument to your computer's sound card, C::NTR::L starts to recognize which notes you play. This is possible through a structure of band-pass/low-pass/high-pass filters which automatically separates out the core frequency of the incoming audio signal.
Then C::NTR::L analyzes the duration and RMS of each single note, and finally translates this data to control and trigger a set of audiovideo efx and modules, which at this moment features:
**
VIDEO
* playlist
* scratch and loop points
* white/black fade
* color matrix
* blur
* delay
* strobo
* 3D efx
* presets save
AUDIO
* real-time sound in processing
* supports multiple sound inputs (as many as you want and your machine can stand)
* granulator (original module by Matt Davey - THANKSSSSSS for the great inspiration!! i put your reference in posts and in the patch itself, but please tell me if you want more specific references.)
* bit-crusher (original module by Matt Davey)
* reverb (original module by Matt Davey)
* oscillators
* presets
**
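Not C::NTR::L's actual code, but the per-note RMS measurement described above is simple to sketch in a few lines of Python:

```python
import math

def rms(block):
    """Root-mean-square level of one block of audio samples.

    This is the 'loudness' figure a patch like the one described
    would compute per note, then map onto efx parameters.
    """
    return math.sqrt(sum(x * x for x in block) / len(block))

# a full-scale square wave has RMS 1.0; silence has RMS 0.0
print(rms([1.0, -1.0, 1.0, -1.0]))
```

In Pd itself this role is typically played by an envelope follower such as `env~`, which reports the level in dB rather than as a linear RMS value.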
I'm looking for beta-testers, so if you're interested, please write to info [at] thesaddj.com. Keep checking www.thesaddj.com for future news.
And www.thesaddj.com/icontrolnature for the live show i perform with C::NTR::L.
An extract of the live project performed with C::NTR::L can be found here (live @ Cinesthesy Festival, France):
Soulful thanks for sharing, supporting and inspiring goes to: Rep (PD'r and Multimedia Artist), AssoBeam (PD'r and Multimedia Artists), Husk (PD'r and Multimedia Artist), Sero (Sound Artist), Brendan Byrne (PD'r and teacher), Jorg Koch (MAX'r and sound artist), Servando Barreiro (PD'r and Multimedia Artist), Hardoff (PD'r) G-noma (Multimedia Artist) and the incredible community on the PD Forum.
Marco / The S.A.D
Noteout into Live (OSX)
Hi guys
I'm trying to route MIDI between Ableton Live and PD using Mac OSX 10.4. I've set up a couple of IAC buses and can receive MIDI in PD from Live OK, but when I send MIDI back to Live using a noteout object, it is not received by Live's armed MIDI track. The MIDI is reaching Live, because the MIDI Track In indicator in the top right-hand corner is flashing; it's just not getting to the track.
Looking at the MIDI stream in MIDI monitor Live gives this:
TIME SOURCE MESSAGE CHANNEL DATA
16:24:30.721 From IAC Bus 1 Note On 1 C3 100
16:24:30.846 From IAC Bus 1 Note Off 1 C3 64
whereas PD gives this:
*** ZERO *** From IAC Bus 2 Note On 1 C3 100
*** ZERO *** From IAC Bus 2 Note Off 1 C3 0
So PD does not seem to be outputting any kind of timestamp; could this be the problem? The only other difference seems to be the velocity of the note-off message.
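For reference, here is what those two messages look like at the byte level (plain Python sketch, channels 0-based; whether C3 is note 48 or 60 depends on the monitor's octave convention, so the note number below is an assumption). The velocity difference is unlikely to matter: note-off velocity is ignored by most instruments, and the MIDI spec even allows a note-on with velocity 0 to stand in for a note-off.

```python
def note_on(channel, note, velocity):
    """Raw 3-byte MIDI note-on (status 0x9n).
    Velocity 0 is defined to mean note-off."""
    return bytes([0x90 | (channel & 0x0F), note & 0x7F, velocity & 0x7F])

def note_off(channel, note, velocity=64):
    """Raw 3-byte MIDI note-off (status 0x8n).
    The release velocity (64 is the conventional default) is usually ignored."""
    return bytes([0x80 | (channel & 0x0F), note & 0x7F, velocity & 0x7F])

# The two monitored streams, assuming C3 = note 48:
live_on  = note_on(0, 48, 100)   # 0x90 0x30 0x64
pd_off   = note_off(0, 48, 0)    # 0x80 0x30 0x00
live_off = note_off(0, 48, 64)   # 0x80 0x30 0x40
```

Since both streams carry well-formed note messages, the problem is more likely on the Live side (track input routed to the wrong IAC bus or MIDI channel, or monitoring not set) than in the bytes themselves.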
Any help would be much appreciated.
Cheers
Ummo