16 parameters for 1 voice, continued...
@H.H.-Alejandro Probably the wrong word... it adds an adjustable, very small change to every sine wave oscillator so that their frequencies are very slightly different.
The oscillators in bank 0 are:
curve freq
curve freq x 2
curve freq x 3... etc.
and in bank 1 they are:
curve freq + (small amount from blue fader x 1)
curve freq x 2 + (small amount from blue fader x 1)... etc.
and in bank 2 they are:
curve freq + (small amount from blue fader x 2)... etc.
up to bank 15:
curve freq + (small amount from blue fader x 15)... etc.
If you stop the Iannix output and drag the blue fader, you will see all the frequencies change except in bank 0.
Useful? I don't know how to use 128 oscillators and keep the basic note otherwise.
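A minimal sketch of the scheme (my own names and example numbers, not taken from the patch), assuming the 128 oscillators are split into 16 banks of 8 partials, with the blue fader supplying the detune amount:

```python
# Sketch of the per-bank detune: every bank holds harmonics of the curve
# frequency, and each bank above 0 adds (bank index x detune) to every
# partial. Bank 0 stays undetuned, so the basic note survives.

def bank_frequencies(curve_freq, detune, banks=16, partials=8):
    """Return one frequency list per bank."""
    return [[curve_freq * n + detune * b for n in range(1, partials + 1)]
            for b in range(banks)]

freqs = bank_frequencies(220.0, 0.5)
print(freqs[0][:3])  # bank 0: [220.0, 440.0, 660.0] - pure harmonics
print(freqs[2][:3])  # bank 2: [221.0, 441.0, 661.0] - all shifted by 2 x 0.5
```

With the fader at zero every bank collapses back onto the pure harmonic series, which is why only bank 0 is unaffected when you drag it.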
Also, because I am unsure about what you will need I am trying to keep to abstractions and keep the changes minimal.
David.
udpsend and receive
@whale-av
No problem for all the questions, I have been working in solitude on this project with a lot of unknowns, so a lot of trial and error (previously have only used max/msp in laptop sound installations and performance scenarios).
Actually, no backup cards, but I did use the one that I hadn't completely set up yet, so getting it back to where it was isn't that hard. And to be honest I'm used to re-setting these things; it's been quite a steep learning curve with lots not working.
The manual I found for puredata (http://en.flossmanuals.net/pure-data/network-data/send-and-receive/) suggested the netsend / netreceive objects, so I just took that for granted. At the moment, in its simplest form, I would just like the same bang to be sent to all RPis simultaneously, so that playback start could be synchronised. I assume for this to happen they could all just listen on the same port and receive the same bang?
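One way to reach every Pi at once is a single UDP broadcast datagram, which every Pi listening on the same port receives. A minimal Python sketch of the sender side, where the port number (3000) and the FUDI-style "bang;" payload are my assumptions; on each Pi a UDP [netreceive] on the same port would get the message:

```python
import socket

# Broadcast one Pd-style "bang;" message to every machine on the LAN
# that is listening on the given UDP port.

def send_bang(host="255.255.255.255", port=3000):
    """Send a single semicolon-terminated Pd message via UDP."""
    s = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
    s.setsockopt(socket.SOL_SOCKET, socket.SO_BROADCAST, 1)
    s.sendto(b"bang;\n", (host, port))
    s.close()
```

Because all the Pis bind the same port, one broadcast reaches them at nearly the same instant; for tight sync you would still want to schedule the actual playback start slightly in the future rather than starting on receipt.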
I was planning on one of the four RPis being the router as well as an interface. Is this a bad idea? To be honest, getting one to be a router was a huge battle, so if I could just use a router that would be easier. Conveniently, those routers you suggested run off 12v, and my system runs off that, so it would be easy to put in the mix. Would the signal be stronger/reach further using a router as opposed to an RPi?
No, not planning on streaming audio. My thought process was that this might be a headache, and take up a bit of CPU. But in all honesty it's not needed for the project. The main forms of playback would literally be a mono file loaded into a PD patch with synchronised playback, or a PD patch as instrument (i.e. a sampler/synth type of interface) loaded onto each Pi. In either case, they would be controlled via OSC. The instrument version might be controlled by different iPads/OSC devices.
Too late, I have already bought the screens. Part of this decision process was that there is so much that could go wrong in this scenario that, with a touch screen added, I could easily troubleshoot in the field. Also, for the instrument patch, the instrument could just be on the screen, which would be great.
The original touchscreens, small 2"-ish ones, wouldn't work with Jessie; that's the short answer. However, what I wanted was to be able to access the HDMI out in order to mirror the screen, so that I could actually see what I was programming. I didn't know when purchasing them that the HDMI out is disabled once they are installed, and it's a headache to get them to output to both the screen and the HDMI out. So I figured the 7" screens would easily allow me to program the device in my studio without the need for extra monitors, and could easily be used in the field.
So, I'm pretty sure I can now change to Jessie, as the 7" screens seem easier to work with. The only thing that wouldn't work is the tutorial I have for turning an RPi into a router. But if that will make things complicated (which, if I'm reading between the lines, it might?), then I think powering an external router might be an easier option.
I think armel will cause headaches in the long run; it makes sense to stay with the newer version, so switching to Jessie at this point in time would make sense.
Epic reply. Thoughts welcome!!
Call for Participation: ANTHEMITY
Call for Participation
sound artists, musicians, Europeans by choice or fate
ANTHEMITY – A participative installation about the Anthem of Europe
Europe, who or what are you? A place of multicultural exchange, of cross-fertilisation? Cradle of human rights and democracy? Or just a bureaucratic monster, threatening national sovereignty, and quite pointless in view of globalisation? Is Europe as a political amalgamation paying attention to social aspects? Where will this lead?
Utopian Europe, as a region of cohabitation, is deep-rooted in symbols. Symbols that manifest the supra-national merging: a common currency, a common flag and a common Anthem of Europe, the prelude of the "Ode to Joy" from Beethoven's 9th Symphony. Referring to the unspoken motto: "All humans will become brothers".
Our daily life is already using the possibilities of the European Union: we can travel around, sell and buy items, settle and work wherever we want. With a minimum of bureaucratic effort. The European Union as a step forward to a pacifistic and democratic World Union? Or is Brussels a symbol of the post-democratic society, where economics are the new sovereigns?
Somewhere in between, the inhabitants of Europe have to position themselves. The installation ANTHEMITY focuses on the anthem and asks to what extent a European identity might be discovered. For that reason, it is necessary to involve the inhabitants of Europe:
All of you – those who have found a home in the European Union, and those of you still looking for a home in the EU, take the European Anthem and inscribe yourself into it. Deconstruct it. Take parts of it, recomplete it, inject missing items or moods... make your own interpretation!
ANTHEMITY is a stage and soundboard for your version. In this project, several speakers will be arranged in a room. You will find several versions announced here later on. Some as a sketch, others as a detailed documentation.
As ANTHEMITY is the master's thesis of Philipp Weber in European Media Studies at the University of Potsdam and the University of Applied Science Potsdam, it is not possible at the moment to come up with too many details.
But pd is used for all installations and the patches will be released in this forum and on the website as soon as I'm allowed to.
Be part of it from now on and send your track to: http://www.dropitto.me/anthemity (password: imeu)
Details:
Advice: Due to legal circumstances, I'm not able to hand out the official score of the Anthem.
Listen to: mms://webstream.ec.europa.eu/scic/press/euanthem/WindOrchestra_320kbs.mp3
Read about: http://europa.eu/about-eu/basic-information/symbols/anthem/index_en.htm
Length: 2'07"
name and format: YourName-YourCountry.wav (all other formats will be recoded automatically.)
Licence: Creative Commons or member of a collecting society? Please send an email to me: pweber@uni-potsdam.de.
All entries submitted by August 10th, 2012 will take part in the premiere in Berlin.
Further details coming soon. Visit: http://philmstelle.de/projects/anthemity.html
Philipp Weber is always happy to hear from you: pweber@uni-potsdam.de
Running your patches on Android using PdDroidParty in 10 Steps
Just a report back. I've finally succeeded in getting my app to function stand-alone and now have it available in the Android Market. The free version is available on the Android Market here: What Comes Around Free. Here are the steps I followed to get it working for me.
After installing Ubuntu on a spare computer (along with the rpl command as detailed before), I managed to get the scripts to run, but still couldn't get the "ant install" to function as expected.
I went back to the Windows 7 PC and tried the Eclipse route again. This time I laid the proper groundwork and resolved a lot of errors by importing everything as per Peter Brinkmann's instructions on the Pd Everywhere forum.
"[...] make sure that you’ve cloned pd-for-android and its dependencies like this:
git clone git://gitorious.org/pdlib/pd-for-android.git
cd pd-for-android
git submodule init
git submodule update
Now you need to import btmidi/BluetoothMidi and pd-for-android/PdCore/jni/libpd into Eclipse (make sure to use
File -> Import… -> General -> Existing project…). Now you can import PdCore. If you see error messages, those are probably due to bugs in Eclipse, and you should be able to sort them out by refreshing and cleaning everything once or twice. (This is the annoying part…)"
I had to fix additional Eclipse errors on my system by setting the Java Compiler Compliance Level to 1.6 for each of the imported projects.
Next I tried to import the project I created by running the scripts in Linux into Eclipse on the Windows machine, but it seems as if one of the scripts deletes the Eclipse project. I went back and imported the originally downloaded PdDroidParty project from the MySynthesizer folder. The project imports under the title PatchSelector. Again, I set Java Compiler Compliance Level to 1.6 (right-click PatchSelector in the Package Explorer - it's under Java Compiler). I also had complaints until I set the Project Build Target to Android 2.2 (right-click PatchSelector in the Package Explorer - it's under Android).
This left me with 9 final errors related to the SVGParser. I resolved this by right-clicking PatchSelector in the Package Explorer and going to Build Path/Configure Build Path/Add JARs and selecting svg-android.jar under PatchSelector/libs in my workspace directory. I remember having to restart Eclipse a couple of times to get rid of spurious errors (!)
Next I renamed PatchSelector to my own title (in my case WhatComesAround) by right-clicking in the Package Explorer and choosing Refactor/Rename.
I copied across files that were created when I ran the Linux scripts, notably:
* patch.zip (res/raw folder) - this zip contains a folder called "patch", itself containing your pd patch as a droidparty_main.pd file, as well as a file titled VERSION-169 (in my case) - it doesn't seem to have an extension and simply contains the text 169
* I placed my splash.svg in the same folder (res/raw)
* my icon.png in the folder res/drawable
* the same icon.png in bin/res/drawable
* in the res/values folder, a patch.xml and strings.xml
Next I set about renaming files and occurrences within files with my own names - one of the Linux scripts does this, but I did it by hand on the Windows machine using Windows Grep (http://www.wingrep.com/) to search for all instances, and changing names inside Eclipse. I replaced all occurrences of PdDroidParty with my own name (WhatComesAround), a single instance of PdDroid Party with What Comes Around, as well as any mention of MySynthesizer (again with WhatComesAround).
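For anyone who would rather script that rename pass than do it by hand, here is a rough Python equivalent of the Grep-and-replace step. The file extensions and names below are examples only, not the exact set the PdDroidParty scripts touch, so back the project tree up first:

```python
import os

# Walk a project tree and replace every occurrence of the old project
# name with the new one inside files with matching extensions.

def replace_in_tree(root, old, new, exts=(".java", ".xml")):
    """Rewrite matching files under root, swapping old for new."""
    for dirpath, _, files in os.walk(root):
        for name in files:
            if name.endswith(exts):
                path = os.path.join(dirpath, name)
                with open(path, encoding="utf-8") as f:
                    text = f.read()
                if old in text:
                    with open(path, "w", encoding="utf-8") as f:
                        f.write(text.replace(old, new))

# e.g. replace_in_tree("PatchSelector", "PdDroidParty", "WhatComesAround")
```

Note that this only rewrites file contents; renaming the files and Java packages themselves is still best done via Eclipse's Refactor/Rename so the project metadata stays consistent.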
After testing in the emulator I exported signed .apk's from Eclipse - one for a free and one for a paid version.
Connecting Canon 7D
What OS are you on?
Does the 7D register itself as a webcam on the system outside of PD?
(can you use it with skype, etc)
AFAIK, you can't get the live video from the 7d (or any DSLR) over usb.
You have to use:
- a converter cable that will give video out (composite/component), and then plug that into a USB video device that accepts composite/component
- a cable that takes the camera output and converts it to HDMI, and then plug that into an (expensive) HDMI capture card. Though I have no idea if the HDMI capture device would be detected/usable by all video apps...
Problems or missing understanding with send and receive
you are very close with your understanding of how pack works.
it will only output when a value is input to its left inlet. this is a basic pd standard, followed by most objects. (there are exceptions, such as [timer])
so, actually you can throw away most of those [f ] objects you created, and just leave 1 for the y-abs value, that can be triggered by doCalcAbs bang.
the other receives (ymax, ymin, xabs, etc) can be plugged directly into [pack f f f f f f ] where they will be stored until the left inlet gets a value.
actually, you can make it even simpler by just sending directly from the number boxes into the [pack f f f f f] object.
be careful though. there is one big 'mistake' in your patch. you have two bangs: getValues and doCalcSlope. It is not 'wrong' to send many patch cables from one bang (or any other object), but the order in which the messages are sent is only decided by the order in which the patch cables were created. If you do any sort of cut and paste or something like that, the order may change, and you will have no way of knowing what order the bang messages go in. In general, it is a good idea NEVER to send more than one patch cable from one outlet of any object. Even if the order doesn't matter, it is still good practice and safer to use [t b b b b b b] or something like that to force the order. have a look at the help file for [trigger] ( [t ] ) and you will hopefully understand.
In your patch, the order DOES matter, because you need to send the value to the left inlet of [pack f f f f f f] LAST, so that it triggers properly.
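The hot/cold inlet rule that [pack] follows can be sketched in Python like this (the class and method names are mine, just to illustrate the behaviour): right inlets only store their value, while the left (hot) inlet stores and then triggers the output, which is exactly why it must be hit last.

```python
# Sketch of [pack]'s inlet behaviour: cold (right) inlets store silently,
# the hot (left) inlet stores and then outputs the whole list.

class Pack:
    def __init__(self, n):
        self.slots = [0.0] * n

    def cold(self, i, value):
        """Right inlet i (i >= 1): store silently, no output."""
        self.slots[i] = value

    def hot(self, value):
        """Left inlet: store, then output everything."""
        self.slots[0] = value
        return list(self.slots)

p = Pack(3)
p.cold(1, 2.0)     # stored, nothing comes out
p.cold(2, 3.0)     # stored, nothing comes out
print(p.hot(1.0))  # -> [1.0, 2.0, 3.0]
```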
Compiling on osx \#1
hello
i'm having a hard time on osx here...
system: osx (10.6) on a macbook pro
i downloaded pd and checked it out from git and svn, but so far no version i have compiled is working.
when i compile the version checked out from the git repository, pd compiles, but when i start it i get this error:
2011-02-18 17:21:56.182 pd[72376:903] Error loading /Library/Audio/Plug-Ins/HAL/DVCPROHDAudio.plugin/Contents/MacOS/DVCPROHDAudio: dlopen(/Library/Audio/Plug-Ins/HAL/DVCPROHDAudio.plugin/Contents/MacOS/DVCPROHDAudio, 262): no suitable image found. Did find:
/Library/Audio/Plug-Ins/HAL/DVCPROHDAudio.plugin/Contents/MacOS/DVCPROHDAudio: no matching architecture in universal wrapper
2011-02-18 17:21:56.184 pd[72376:903] Cannot find function pointer NewPlugIn for factory C5A4CE5B-0BB8-11D8-9D75-0003939615B6 in CFBundle/CFPlugIn 0x100600a70 </Library/Audio/Plug-Ins/HAL/DVCPROHDAudio.plugin> (bundle, not loaded)
2011-02-18 17:21:56.189 pd[72376:903] Error loading /Library/Audio/Plug-Ins/HAL/JackRouter.plugin/Contents/MacOS/JackRouter: dlopen(/Library/Audio/Plug-Ins/HAL/JackRouter.plugin/Contents/MacOS/JackRouter, 262): no suitable image found. Did find:
/Library/Audio/Plug-Ins/HAL/JackRouter.plugin/Contents/MacOS/JackRouter: no matching architecture in universal wrapper
2011-02-18 17:21:56.190 pd[72376:903] Cannot find function pointer New_JackRouterPlugIn for factory 7CB18864-927D-48B5-904C-CCFBCFBC7ADD in CFBundle/CFPlugIn 0x100602d20 </Library/Audio/Plug-Ins/HAL/JackRouter.plugin> (bundle, not loaded)
Error in startup script: couldn't read file "5400": no such file or directory
i don't know what to do.
thanks for any help.
ingo
Using Jack with pd
Thanks ShankarBaba. I believe LoopBe creates virtual midi cables between applications. I use MidiYoke to do the same thing. What I need is to create virtual cables for transporting audio between applications. That's what virtual Audio Cable and Jack are supposed to do. But I cannot get either to work - Jack not at all with pd, and Virtual Audio Cable not for more than two stereo cables, carrying a total of four channels of audio. I am trying to get something working with 8 channels of audio in total.
Again, thank you ShankarBaba for taking the time to respond - greatly appreciated.
Help with Pd controlling effects in Logic using ctlout
Hi, I don't know if this is still relevant, but anyway: there is a way without Logic's automation menu.
Here is what I do:
(I'm not sure if it is the slickest thing to do, but it works)
Logic uses its own "control language" for its own and external VST/AU plugIns.
You will have to translate your midi cc into Logic's language.
In Logic's "environment", on the "mixer" layer, simply connect a "monitor" behind the channelstrip in which your filter plug is located (the cable icon in the upper right corner of the channel object), so that you can see the messages going out of your channel.
If you now turn a knob (let's say, the cut-off frequency) in your plugIn, you should see a message in the monitor that looks just like a cc-message only that it is marked with an "F" instead of the "cc-icon".
Now, on the "click&ports"-layer grab the port on which your cc-data is coming in from the "physical input" and connect it to a "transformer". Configure the "transformer" so that it takes the controller you want to use to control the cut-off frequency and changes it into the "fader"-control data you need for the cut-off-parameter in your plugIn. You will have to change the "Channel" and "Data Byte 1".
The mapping function in the transformer is the way to go, if you want to control more than one parameter in the plugIn.
Connect the "transformer" output to the "Channel" (alt-click on the output cable icon to connect between layers) and voilà! When you send the cc to Logic, you should see the parameter change in the plugIn GUI.
In this patch, the incoming data goes only to the cable and not to the "SUM" output of the "physical input" object, so it doesn't reach the "Sequencer" by default anymore and you won't see any incoming data in the transport bar. (This is due to the hierarchy of Logic, in which the "Environment" is like a shell around the "Sequencer".) To Y-split the incoming data, simply patch a "Monitor" as the first object, which will give you multiple outputs, in case you want to control several channelstrips via the same port.
This is of course a "fixed" patch, but Logic lets you patch in a way that you can change the destination of the cc-data within Logic (in the environment menu, New->Fader->Specials is the awesome "cable switcher").
Hope this helps.
If anyone has a better solution, please let me know.
Regards,
j
(If this is nothing new, please excuse me. I added some explanations in case someone who doesn't know that much about Logic's environment finds this.)
Make art - call for projects "in-between design" - deadline 31st July
[ASCII-art banner: "make art", from 4 to 7 NOVEMBER 2010]
*CALL FOR PROJECTS*
The sixth edition of make art – in-between design: rediscovering
collaboration in digital art – will take place in Poitiers (FR), from
the 4th to the 7th of November 2010.
make art is an international festival dedicated to the integration
of Free/Libre/Open Source Software (FLOSS) in digital art.
make art offers performances, presentations, workshops and an
exhibition, focusing on the encounter between digital art and free
software.
*in-between design: rediscovering collaboration in digital art*
Today's market production accelerates the spread of non-critical and
standardized aesthetics, by means of locked top-down distribution
mechanisms and a series of tools that enforce it. At the same time
new forms of methodologies inspired or powered by free software,
participatory practices and peer-to-peer networks are fueling many
Internet subcultures. Some of these emerging practices will lead to
competitive social productions, while others will remain as pure
artistic experiments.
By adopting production and distribution methods based on free software
and open standards and by sharing the sources of one's work with
others, the collective knowledge base and aesthetic sensibilities can
freely interact to explore uncharted, hybrid directions which no
longer reflect the supremacy of a single idea.
- Does the sharing of artworks' "recipes" and tools help debunk the myth of the isolated design genius?
- By leaving the possibility of ongoing development and refinement, is it possible to ever produce a "final" design?
- Can these methods and technologies inspire new forms of creation or tools, beyond self-referential productions?
- Is it wishful thinking to approach collaborative graphical design in the same way as an open source software project?
- Is Free and Open Source licensing a catalyst for creation or does it add an extra level of complexity?
- Can the limitation of one license trigger new forms of constrained creativity?
We're currently seeking new, innovative media art and design works and
projects focusing on the above theme and questions:
- graphical artworks and installations
- lectures
- project presentations
- software and hardware demos
We're also seeking audiovisual performances that will take place
during the festival evenings.
The submitted projects must fit this focus and be made in a free/libre and open source environment; this includes its optional dependencies, production tools and the operating system. We ask you to publish the sources of your project under a free culture license of your choice, or to release it into the public domain.
Projects that do not meet these criteria will not be considered.
*How to apply*
Submission form and a list of additional requirements are available at:
http://makeart.goto10.org/call/
Deadline: *Saturday 31 July 2010*
Incomplete or late applications will not be processed.
*Timeline*
31st of July 2010 – Deadline call for proposals
Beginning of September – Selected projects notifications
4th-7th of November 2010 – make art 2010 - Poitiers (FR)
For examples of previous editions, please visit the archives:
http://makeart.goto10.org/
make art is powered by GOTO10