Starting a Pure Data Wiki (Database/Examples Collection)
@beem This site is a "sort of" wiki already, especially since @EEight last updated the site and the search now works correctly.
There are already the "patch" and "tutorial" pages.
There is no up-level link on the forum to http://pdpatchrepo.info/patches though, which is a shame as there are many useful patches lurking there.
I think a true wiki would need a lot of oversight, and no one is going to put in the hours.
I know that is not supposed to be the case for a wiki, but there is very little "absolute truth" in Pd patching.
Someone, somewhere, should write a tutorial that takes a complete beginner through the first principles of dataflow: variables, message order, atom types, etc. So many technical questions stem from a lack of understanding of such things, although they are explained here: http://puredata.info/docs/manuals/pd/x2.htm and those pages are included in Pd/doc/manual, which everyone has on their computer but never reads.....?
And a dedicated page for the RPi and Linux would be good.
So many users are bogged down in audio problems with the Pi at the moment.
But at the end of the day the RPi has its own forum, the problems are with the OS, and there is some real Linux expertise over there: https://www.raspberrypi.org/forums/
And thoughtful googling gives us all a web-wide wiki for Pd.
David.
A little Pd mod...
@60hz hmm.. on my system it blinks between black and white. Is this not the case for you? I thought that would be good enough, since you'd be able to see the white blink if you couldn't see the black, and vice versa.
But thinking about it now, I can see wanting to be able to see the cursor most of the time rather than only when it blinks :->
I can probably do this today and upload the app to the release page later.
edit: ok, the app on the release page is updated. I used canvas_text_cursor instead of box_cursor since one cursor color is used for all canvas items. (so you still have to find one that works for objects, messages, and comments)
the source isn't changed on the release page, but is in the "next" branch
I also learned that the cursor only blinks white if the color is set to black, otherwise it blinks out of existence.
Web Audio Conference 2019 - 2nd Call for Submissions & Keynotes
Apologies for cross-postings
Fifth Annual Web Audio Conference - 2nd Call for Submissions
The fifth Web Audio Conference (WAC) will be held 4-6 December, 2019 at the Norwegian University of Science and Technology (NTNU) in Trondheim, Norway. WAC is an international conference dedicated to web audio technologies and applications. The conference addresses academic research, artistic research, development, design, evaluation and standards concerned with emerging audio-related web technologies such as Web Audio API, WebRTC, WebSockets and JavaScript. The conference welcomes web developers, music technologists, computer musicians, application designers, industry engineers, R&D scientists, academic researchers, artists, students and people interested in the fields of web development, music technology, computer music, audio applications and web standards. The previous Web Audio Conferences were held in 2015 at IRCAM and Mozilla in Paris, in 2016 at Georgia Tech in Atlanta, in 2017 at the Centre for Digital Music, Queen Mary University of London in London, and in 2018 at TU Berlin in Berlin.
The internet has become much more than a simple storage and delivery network for audio files, as modern web browsers on desktop and mobile devices bring new user experiences and interaction opportunities. New and emerging web technologies and standards now allow applications to create and manipulate sound in real-time at near-native speeds, enabling the creation of a new generation of web-based applications that mimic the capabilities of desktop software while leveraging unique opportunities afforded by the web in areas such as social collaboration, user experience, cloud computing, and portability. The Web Audio Conference focuses on innovative work by artists, researchers, students, and engineers in industry and academia, highlighting new standards, tools, APIs, and practices as well as innovative web audio applications for musical performance, education, research, collaboration, and production, with an emphasis on bringing more diversity into audio.
Keynote Speakers
We are pleased to announce our two keynote speakers: Rebekah Wilson (independent researcher, technologist, composer, co-founder and technology director for Chicago’s Source Elements) and Norbert Schnell (professor of Music Design at the Digital Media Faculty at the Furtwangen University).
More info available at: https://www.ntnu.edu/wac2019/keynotes
Theme and Topics
The theme for the fifth edition of the Web Audio Conference is Diversity in Web Audio. We particularly encourage submissions focusing on inclusive computing, cultural computing, postcolonial computing, and collaborative and participatory interfaces across the web in the context of generation, production, distribution, consumption and delivery of audio material that especially promote diversity and inclusion.
Further areas of interest include:
- Web Audio API, Web MIDI, Web RTC and other existing or emerging web standards for audio and music.
- Development tools, practices, and strategies of web audio applications.
- Innovative audio-based web applications.
- Web-based music composition, production, delivery, and experience.
- Client-side audio engines and audio processing/rendering (real-time or non real-time).
- Cloud/HPC for music production and live performances.
- Audio data and metadata formats and network delivery.
- Server-side audio processing and client access.
- Frameworks for audio synthesis, processing, and transformation.
- Web-based audio visualization and/or sonification.
- Multimedia integration.
- Web-based live coding and collaborative environments for audio and music generation.
- Web standards and use of standards within audio-based web projects.
- Hardware and tangible interfaces and human-computer interaction in web applications.
- Codecs and standards for remote audio transmission.
- Any other innovative work related to web audio that does not fall into the above categories.
Submission Tracks
We welcome submissions in the following tracks: papers, talks, posters, demos, performances, and artworks. All submissions will be single-blind peer reviewed. The conference proceedings, which will include both papers (for papers and posters) and extended abstracts (for talks, demos, performances, and artworks), will be published open-access online with Creative Commons attribution, and with an ISSN number. A selection of the best papers, as determined by a specialized jury, will be offered the opportunity to publish an extended version in the Journal of the Audio Engineering Society.
Papers: Submit a 4-6 page paper to be given as an oral presentation.
Talks: Submit a 1-2 page extended abstract to be given as an oral presentation.
Posters: Submit a 2-4 page paper to be presented at a poster session.
Demos: Submit a work to be presented at a hands-on demo session. Demo submissions should consist of a 1-2 page extended abstract including diagrams or images, and a complete list of technical requirements (including anything expected to be provided by the conference organizers).
Performances: Submit a performance making creative use of web-based audio applications. Performances can include elements such as audience device participation and collaboration, web-based interfaces, Web MIDI, WebSockets, and/or other imaginative approaches to web technology. Submissions must include a title, a 1-2 page description of the performance, links to audio/video/image documentation of the work, a complete list of technical requirements (including anything expected to be provided by conference organizers), and names and one-paragraph biographies of all performers.
Artworks: Submit a sonic web artwork or interactive application which makes significant use of web audio standards such as Web Audio API or Web MIDI in conjunction with other technologies such as HTML5 graphics, WebGL, and Virtual Reality frameworks. Works must be suitable for presentation on a computer kiosk with headphones. They will be featured at the conference venue throughout the conference and on the conference web site. Submissions must include a title, 1-2 page description of the work, a link to access the work, and names and one-paragraph biographies of the authors.
Tutorials: If you are interested in running a tutorial session at the conference, please contact the organizers directly.
Important Dates
March 26, 2019: Open call for submissions starts.
June 16, 2019: Submissions deadline.
September 2, 2019: Notification of acceptances and rejections.
September 15, 2019: Early-bird registration deadline.
October 6, 2019: Camera ready submission and presenter registration deadline.
December 4-6, 2019: The conference.
At least one author of each accepted submission must register for and attend the conference in order to present their work. A limited number of diversity tickets will be available.
Templates and Submission System
Templates and information about the submission system are available on the official conference website: https://www.ntnu.edu/wac2019
Best wishes,
The WAC 2019 Committee
commands for forum search engine
hello all,
are there symbols to use for specific searching in the pure data forum search engine?
For example, Google supports quotation marks: "whatever you want to search", which returns only results where exactly 'whatever you want to search' appears.
Here the engine gives you posts that include any of the words: whatever, search, you, want, to (etc.), which doesn't help when looking for a specific term...
Any tips on more effective forum use are welcome.. I want this so I don't have to ask a new question every time I want something, and so I can browse the archives better.. thank you
split list
I have a list
1 2 3,page,1
[1] [2] [3,page,1]
[list split]
Is there any way that, when [1] is banged, I get a 1, but when [3,page,1] is banged, I can split at each comma
and print 3 atoms, like: float 3, symbol page, float 1?
MobMuPlat OSC-to-MIDI Remote Control, spec. for PD, Guitarix, and Rakarrack and other guitar effects apps
It's always funny to me how working on one PD project results in fruit that might also prove useful.
I'm currently working on a broader project, but this part (at least, I think) is ready for distribution.
Background: I really hate leaving the side of my guitar (for example to click a mouse or tweak a setting) when I am playing. So I designed this mmp app so that I can attach my handheld to my guitar (with funtack, for example) and control my settings from there, or perhaps even as I play. Like so:
[Aside: the broader project also includes (on the front side of the app) controls driven by the handheld's tilt, as well as the adc~ pitch, env~, and the distance of the pitch from a "center". But that part has to wait (though the code is all inside the mmp_osc2midi_rc.pd patch, just disconnected, so it does not use the CPU).]
SETUP (Linux is what I know, so other OSes may work; I just can't/haven't tested them):
JACK
Pure Data
a2jmidid - JACK MIDI daemon for ALSA MIDI (in the Ubuntu repos, at least)
guitarix and/or rakarrack, or other MIDI-driven effects apps (such as the included set of 30 Pd effects; see below)
PIECES:
The mmp_osc2midi_rc.zip (extract (Android) / install (iOS) to the MobMuPlat directory)
The osc2midi_bridge.zip (to be run on the "Pd receiver"), which includes the osc2midi_bridge.pd, its help file, and a set of 30 mmp-ready (mono) effects (in the ./effs directory) that can be used if Pd is to be the receiver. They are standardized to 3 inlets (left = inlet~; center = [0 $1(, [1 $1(, [2 $1( messages sent to 3 parameters on the effect; right = [switch~]) and 1 outlet~; a minimal skeleton of this convention is sketched below. The zip also includes a demo guitarix "bank" file called "MIDITEST.gx" (with 2 presets/programs and MIDI values (0-8) already set for testing; just add the file to the guitarix config "banks" folder).
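To give a concrete picture of that effect convention, here is a minimal skeleton of what one of these effects might look like (my own illustration, not one of the 30 included files; the audio simply passes straight through and the parameter messages only print):
#N canvas 60 60 560 340 12;
#X text 20 10 hypothetical effect skeleton - the pass-through stands in for real DSP;
#X obj 20 50 inlet~;
#X obj 200 50 inlet;
#X obj 380 50 inlet;
#X obj 200 90 route 0 1 2;
#X obj 380 90 switch~;
#X obj 160 130 print param1;
#X obj 250 160 print param2;
#X obj 340 190 print param3;
#X text 160 220 replace the prints with the three effect parameters;
#X obj 20 280 outlet~;
#X connect 1 0 10 0;
#X connect 2 0 4 0;
#X connect 3 0 5 0;
#X connect 4 0 6 0;
#X connect 4 1 7 0;
#X connect 4 2 8 0;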
Instructions:
- Start JACK;
- In Pure Data, open the osc2midi_bridge-help.pd file; toggle "listen" on; and set MEDIA to "ALSA-MIDI" (the additional pieces are just examples of receiving the MIDI values; a rough sketch of the OSC-to-MIDI idea follows these steps);
- In Jack>ALSA: Connect Read:MIDI to Write:PD & Read:PD to Write:Midi;
- From the command line execute "a2j_control start" (no quotes);
- Start (ex.) guitarix;
- In Jack>Midi connect a2j:Pure Data(capture) to (ex) gx_head_amp:midi_in_1.
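If you just want to see the shape of the idea before downloading, here is a rough, hypothetical sketch of an OSC-to-MIDI core (not the actual patch from the zip; the port, the /knob1 and /pgm addresses, and the CC/program numbers are placeholders, so match them to whatever your MobMuPlat layout really sends):
#N canvas 60 60 560 320 12;
#X text 20 10 rough OSC to MIDI sketch - all numbers and addresses are placeholders;
#X obj 20 50 netreceive -u -b 54321;
#X obj 20 90 oscparse;
#X obj 20 120 list trim;
#X obj 20 150 route knob1 pgm;
#X obj 20 190 * 127;
#X text 110 190 scale a 0..1 knob to 0..127 if needed;
#X obj 20 230 ctlout 21 1;
#X obj 190 230 pgmout 1;
#X text 20 270 CC 21 on channel 1 goes out to ALSA MIDI via a2jmidid;
#X connect 1 0 2 0;
#X connect 2 0 3 0;
#X connect 3 0 4 0;
#X connect 4 0 5 0;
#X connect 4 1 8 0;
#X connect 5 0 7 0;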
Note/Alert/etc. For a machine to receive OSC messages the recipient-computer's firewall must (I believe) be turned off.
GUI:
The GUI has 3 pages that look (basically) as follows (with the subsequent 2 pages having only the 3 knobs and 3 buttons).
The buttons trigger text entry boxes which allow you to enter numbers (0-127) representing:
PGM: the number sent to (midi) [pgm]
and
(the buttons beside the knobs) the MIDI controller number, i.e. [ctrl] (0-127), that each knob is to be sent to (all on channel 0). (The resulting numbers are all written to the label on the left.)
For example:
Results in:
You can now do 1 of 2 things:
Set the mmp-knobs to whatever midi values you have set in guitarix
or
Set the guitarix midi values (mousewheel click on a control) to one of the mmp preset 0-8 midi values.
WOW!!!
That took a helluva lot of writing (and reading), but I hope you can both see the value and make use of this bit of technology.
As I said before,
I look forward to being able to fine-tune my sound ALL while my hands and I are BOTH still at my guitar, without having to bend over, move, etc. to get the sound I want.
Peace, Love, and Ever-Lasting Good Cheer.
If you made it this far, Thanks for Listening.
Sincerely and Optimistically,
Scott
p.s. ask whatever you want regarding setup, how to use, points-of-clarification, etc. I am more than happy to help.
"Out of Love comes Joy"
digital artifacts when using the shell object
The [shell] object starts a new process through the system scheduler, I think, so I believe it already has its own thread (process).
I would think that most of the glitches happen because you close and open Preview every time you need a new page.
Perhaps you could have all of the pages in a single PDF file with Preview open, and when you need a new page there might be a way to script a "page down" event?
Novation Launch controller abstraction, with LED feedback for the buttons.
Heeeeelllloooo PD users
Here is my first contribution to the community library. This is a MIDI controller abstraction for the
Novation Launch Controller. The first 4 pages on the Launch Controller are assigned to MIDI CC and give you full feedback over the LEDs. See further description in the patch.
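In case anyone wonders how LED feedback like this generally works: the usual trick is simply to echo the control values back out to the device. The sketch below is only my own generic illustration of that idea (the CC number and channel are made up, and whether a given LED responds depends on the controller's template), not an excerpt from the abstraction:
#N canvas 60 60 520 220 12;
#X text 20 10 generic CC echo idea - CC number and channel are placeholders;
#X obj 20 50 ctlin 21 1;
#X text 110 90 send the received value straight back to the device to update its LED;
#X obj 20 130 ctlout 21 1;
#X connect 1 0 3 0;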
Patch with abstractions for each of the first 4 pages:
Launch Controller .pd
I have also included the 4 Launch Controller set up for the 4 pages to get you started.
PD user page 1.syx
PD user page 2.syx
PD user page 3.syx
PD user page 4.syx
These have been updated; the first version I put up was not really working. I was a bit too quick posting
it out of Pure excitement
Have fun!
Jaffa
Andúril (MobMuPlat app): fwd/bwd looper + 14 effects + elven cloak (control parameters via env~ and pitch as you play)
UPDATED VERSION (corrected MobMuPlat system crash problem):
anduril.zip
This has been long in coming and I am very glad to finally release it (even tho my handheld hardware is not up to the job of running the elven cloak feature).
First a demo video and some screenshots, and then the instructions.
DEMO VIDEO
SCREENSHOTS
Intention(s):
The app is designed to give a guitarist (though really any input works, even prerecorded, as in the demo, which uses "Laura DeNardis Performing Pachabels Canon" from https://archive.org/details/LauraDenardisPerformingPachabelsCanon, specifically the wave file at https://archive.org/download/LauraDenardisPerformingPachabelsCanon/PachabelsCanon.wav, Attribution-Noncommercial-Share Alike 3.0) FULL control over the "voice" of their output sound.
It includes:
a 5-band EQ (on page 2 of the app), applied up front to all incoming sound;
a looper, with record, forward, backward, speed, and bypass controls (it runs via a [throw~] alongside the effects channel);
14 effects, each with 3 controllable parameters (via the xy-slider + centered knob), including: chorus, distortion, delay, reverb, flanger, tremolo, vibrato, vcf, pitchshifter, pitchdelay, 12string, stepvibrato, pushdelay (delay feedback driven by the magnitude of the env~), and stagdelay (2 out-of-sync delay lines which can be driven in and out of phase by the sum of their delwrite~ + vd~ times, so what goes in first may come out last);
elven_cloak, which drives the 3 parameter controls via the peak band's amplitude and its proximity to a set pitch (MIDI note); the watched window can be broadened or shrunk and shifted, i.e. the three effect parameters change automatically according to what and how you play (a rough sketch of the idea follows the controls list below);
and
a tester synth that randomly sends MIDI pitches between 20-108, velocities between 20-127, and durations between 250-500 ms (see the sketch just below).
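For the curious, the tester synth boils down to something like this (a simplified sketch, not the code from the app; the 400 ms trigger rate is arbitrary, and the notes just go to [print] here rather than into the synth voice):
#N canvas 60 60 620 460 12;
#X text 20 10 simplified tester synth sketch - prints random test notes;
#X obj 20 50 loadbang;
#X msg 20 80 1;
#X obj 20 110 metro 400;
#X obj 20 140 t b b b;
#X obj 20 260 random 89;
#X obj 20 290 + 20;
#X text 100 290 pitch 20 to 108;
#X obj 180 200 random 108;
#X obj 180 230 + 20;
#X text 260 230 velocity 20 to 127;
#X obj 340 140 random 251;
#X obj 340 170 + 250;
#X text 420 170 duration 250 to 500 ms;
#X obj 20 330 makenote;
#X obj 20 370 pack 0 0;
#X obj 20 400 print test-note;
#X connect 1 0 2 0;
#X connect 2 0 3 0;
#X connect 3 0 4 0;
#X connect 4 0 5 0;
#X connect 4 1 8 0;
#X connect 4 2 11 0;
#X connect 5 0 6 0;
#X connect 6 0 14 0;
#X connect 8 0 9 0;
#X connect 9 0 14 1;
#X connect 11 0 12 0;
#X connect 12 0 14 2;
#X connect 14 0 15 0;
#X connect 14 1 15 1;
#X connect 15 0 16 0;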
CONTROLS (from top-left to bottom-right):
PAGE 1:
Effect: the effects menu, where you choose an effect;
>>>,<<<: page navigation buttons;
IN,OUT: gains (IN is the preamp on the EQ5, and OUT is applied to total output);
REC,FWD,BWD,speed,normspd: the looper toggles; on speed, higher is faster, mid is normal, and normspd resets to mid;
xy-slider + centered knob: the 3 parameter controls and their labels (the bottom label is x, the top is y, and the one above the knob is the third); the name of the selected effect and its parameters load each time you choose from the Effect menu; bottom-left is lowest, top-right highest;
ByLp,ByEff: bypasses for the looper and effects "channel" (the outputs are summed);
EC-on: elven cloak toggle (default=off);
PAGE 2:
the EQ5 controls;
synthtest: off|on, default is off;
PAGE 3: elven cloak controls
reset: sets shift, metro, mid, and radius to 0, 500 (ms), 64, and 100% respectively (i.e. the entire MIDI spectrum, 0-127);
mini-xyz, test: if test is on, you see a miniature representation of the xyz controls on the first page, so you can calibrate the cloak to your desired values;
shift: throws the center of the range to either the left or right(+/-1);
metro: how frequently in milliseconds to take env~ readings;
mid: the center, in MIDI pitch (0-127), of the "watched" bands;
radius (%): the width of the total bands to watch, as a percentage of whichever is lower (1-mid or mid)
END CONTROLS
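To give a rough idea of the pitch-proximity half of what the cloak does (this is only my simplified illustration, not the code from the patch; the center of 64 and radius of 12 semitones are placeholder values, and the weight just prints instead of moving an effect parameter):
#N canvas 60 60 560 380 12;
#X text 20 10 simplified cloak idea - pitch proximity to a center becomes a 0..1 weight;
#X obj 20 50 adc~;
#X obj 20 90 sigmund~ pitch;
#X obj 20 130 - 64;
#X text 100 130 distance from a placeholder center of MIDI 64;
#X obj 20 160 abs;
#X obj 20 190 / 12;
#X text 100 190 placeholder radius of 12 semitones;
#X obj 20 220 clip 0 1;
#X obj 20 250 * -1;
#X obj 20 280 + 1;
#X text 100 280 1 on the center and 0 at or beyond the radius;
#X obj 20 320 print cloak-weight;
#X connect 1 0 2 0;
#X connect 2 0 3 0;
#X connect 3 0 5 0;
#X connect 5 0 6 0;
#X connect 6 0 8 0;
#X connect 8 0 9 0;
#X connect 9 0 10 0;
#X connect 10 0 12 0;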
Basic Logic:
There are 4 modes according to the bypass state of the looper and effects.
A [throw~]/[catch~] pair and a gain/sum/divide stage are applied accordingly (see the sketch below).
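For anyone unfamiliar with that pattern, the summing bus looks roughly like this (a generic sketch, not the actual patch; two [osc~] test tones stand in for the looper and effects chains, and halving the sum is just one way to handle the case where both are active):
#N canvas 60 60 520 340 12;
#X text 20 10 generic throw~ and catch~ summing sketch;
#X obj 20 50 osc~ 220;
#X obj 20 90 *~ 1;
#X text 100 90 chain gain - send 0 here to bypass this chain;
#X obj 20 130 throw~ mix;
#X obj 260 50 osc~ 330;
#X obj 260 90 *~ 1;
#X obj 260 130 throw~ mix;
#X obj 20 190 catch~ mix;
#X obj 20 230 *~ 0.5;
#X text 110 230 scale down when both chains are active;
#X obj 20 280 dac~;
#X connect 1 0 2 0;
#X connect 2 0 4 0;
#X connect 5 0 6 0;
#X connect 6 0 7 0;
#X connect 8 0 9 0;
#X connect 9 0 11 0;
#X connect 9 0 11 1;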
End:
As I mentioned at the start, my handheld(s) are not good enough to let me use this, but it runs great on my laptop.
So...
I would love to hear if this Does or Does Not work for others and even better any output others might make using it. I am enormously curious to hear what is "possible" with it.
Presets have not (yet) been included, as I see this, especially with the cloak, as a tool to be used for improv and less for set work. Though I think it will work nicely for that too if you just turn the cloak off.
hmmm, hmmm,...
I think that's about it.
Let me know if you need any help, suggestions, ideas, explanations, etc. etc. etc. regarding the tool. I would be more than happy to share what I learned.
Peace, Love, and Ever-Lasting Music.
Sincerely,
Scott
p.s. please let me know if I did not handle the "attribution" part of "Laura DeNardis Performing Pachabels Canon" License correctly and I will correct it immediately.
Ciao, for now. Happy PD-ing!
The Harmonizer: Communal Synthesizer via Wifi-LAN and Mobmuplat
The Harmonizer
The Harmonizer is a single or multi-player mini-moog synthesizer played over a shared LAN.
(credits: The original "minimoog" patch is used by permission from Jaime E. Oliver La Rosa at the New York University Music Department and NYU Waverly Labs (Spring 2014) and can be found at: http://nyu-waverlylabs.org/wp-content/uploads/2014/01/minimoog.zip)
One or more players can play the instrument with each player contributing to one or more copies of the synthesizer (via the app installed on each handheld) depending on whether they opt to play "player 1" or "player 2".
By default, all users are "player 1" so any changes to their app, ex. changing a parameter, playing a note, etc., goes to all other players playing "player 1".
If a user is "player 2", then their notes, controls, mod-wheel etc. are all still routed to the network, i.e. to all "player 1"'s, but they hear no sound on their own machine.
There are 2 pages in The Harmonizer. (See screenshots below.)
PAGE 1:
PAGE 2:
The first page of the app contains all controls operating on a (more or less) "meta" level for the player, in the following order (reading top-left to bottom-right):
pl2: if selected (toggled) the user is choosing to play "player 2"
mot(ion): triggers system motion control of the osc1, 2 & 3 levels (volume) based on the accelerometer inside the smartphone (i.e. as you twist and turn the handheld in your hand, the 3 oscs' volumes change; a rough sketch of this idea follows this list)
rate: how frequently the handheld should update its accelerometer data
slimit: by how much the app should slow down sending the (continuous) accelerometer data over the network
presets: from 1 to 5 preset "save-slots" to record and reload the Grid 1 and Grid 2 settings that are currently active
S: save the current Grid1 and Grid2 selections to the current "save slot"
L: load the currently selected preset into both Grids
>": go the the next">
">>": go to the next page (page 2 has the reverse, a "<<" button)
Grid 1: the settings, in 4 banks of 3 parameters each, which are labeled top-down, equating to left-right
Grid 2: the same as Grid 1, but with a different set of parameters
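As an aside, for anyone curious how tilt can drive a level in Pd: MobMuPlat hands the accelerometer data to the patch on a reserved receive name (I believe it is #accelerate, but check the MobMuPlat docs). A minimal sketch of the idea, with arbitrary scaling and a plain test oscillator standing in for the minimoog voice:
#N canvas 60 60 520 380 12;
#X text 20 10 tilt to level sketch - the receive name and scaling are assumptions;
#X obj 20 50 r #accelerate;
#X obj 20 90 unpack f f f;
#X obj 20 130 + 1;
#X obj 20 160 / 2;
#X text 100 145 map the x tilt from -1..1 to a 0..1 gain;
#X obj 20 200 pack 0 50;
#X obj 20 230 line~;
#X obj 180 200 osc~ 220;
#X obj 20 270 *~;
#X obj 20 310 dac~;
#X connect 1 0 2 0;
#X connect 2 0 3 0;
#X connect 3 0 4 0;
#X connect 4 0 6 0;
#X connect 6 0 7 0;
#X connect 7 0 9 1;
#X connect 8 0 9 0;
#X connect 9 0 10 0;
#X connect 9 0 10 1;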
The second page comprises:
the 2-octave keyboard (lower notes on top),
a 9-button octave grid (which can go either up or down 4 octaves),
a quick-preset grid which loads one of the currently saved 5 presets
the "<<" button mentioned above, and
both a mod and pitch-bend wheel (as labeled).
SETUP:
All players install Mobmuplat;
Receive The Harmonizer (in the form of a .zip file either via download or thru email, etc.)
When on your smartphone, click on the zip file, for example, as an attachment in an email.
Both Android and iPhone will recognize the zip file (unless you have previously set a default behavior for .zip files) and ask if you would like to open it in MobMuPlat. Do so.
When you open MobMuPlat, you will be presented with a list of names. On Android, click the 3 dots in the top right of the window and, in the settings window, click "Network"; on an iPhone, click "Network" just below the name list;
On the Network tab, click "LANDINI".
Switch "LANDINI" from "off" to "on".
(this will allow you to send your control data over your local area network with anyone else who is on that same LAN).
From that window, click "Documents".
You will be presented again, with the previous list of names.
Scroll down to "TheHarmonizer" and click on it.
The app will open to Page 1 as described and shown in the image above.
Enjoy with or without friends, loved ones, or just folks who want to know what you mean by what "is possible" with Pure Data!
Theories of Thought on the Matter
My opinion is:
While competition could begin over "who controls" the song, before too long players will see first-hand that it is better (at least in this case) to work together than against one another.
If any form of competition emerges in the game, for instance loading a preset while another player was working on a tune or musical idea, the overall playability and gratitude level will wane.
On the other hand, if players see the many, many ways one can constructively collaborate (for instance, one player plays notes while the other plays the controls), I think the rewards will be far greater than the costs.
p.s. my thinking is:
Since you can play solo, it will be fun to create cool presets when alone, then throw them into the mix once you start to play together. (It has sort of a card-collecting feel.)
Afterword:
This was just too easy Not to do.
It conjoins many aspects of Pure Data I have been working on lately (aside: I actually made this app a long time ago but for some reason am only now thinking to share it), both logistical and procedural, into a single whole.
I think it does both quite well, as well as offering the user an opportunity to consider, or perhaps even wonder: "What is 'possible'?"
Always share. Life is just too damn short not to.
Love only.
-svanya