Ofelia: How to get pixel data?
Does the getColor() integrated into Ofelia have the index 'bug' mentioned here: https://github.com/openframeworks/openFrameworks/issues/5321? I'm able to retrieve values at indices well beyond the number of pixels in the image, like so:
ofelia f;
m = require("abc");
pix = m.img:getPixels();
-- both of these agree on the number of pixels
print("number of pixels:", pix:getWidth() * pix:getHeight());
print("number of pixels:", pix:getTotalBytes() / pix:getNumChannels());
-- but this still returns a value????
numOfPixels = pix:getWidth() * pix:getHeight();
print(pix:getColor(numOfPixels + 1));
return(anything);
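In the meantime, a minimal defensive sketch (assuming the Ofelia bindings mirror the openFrameworks ofPixels API, where flat indices are 0-based and the last valid one is width * height - 1) is to bounds-check the index myself, or to address pixels by (x, y) instead of a flat index:
numOfPixels = pix:getWidth() * pix:getHeight();
i = numOfPixels + 1; -- deliberately out of range
if i >= 0 and i < numOfPixels then
  print(pix:getColor(i));
else
  print("index", i, "is out of range");
end
print(pix:getColor(0, 0)); -- (x, y) overload, assuming it is bound the same way as in openFrameworks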
Failsafe Against Audio Volume Slips
I have heard that it is very easy to produce dangerous audio volumes (both to speakers and hearing) with Pure Data when even a slight slip or oversight occurs.
The general advice for combating this is simply to be very careful, and I have developed a number of habits when working in Pure Data, such as always using external speakers so that (presumably) my machine's internal sound system would not take any damage, keeping my Windows volume control as low as feasible, and turning my speaker volume knob up from minimum only when actually playing audio. However, these measures still rely on accurate human execution. Having recently read The Design of Everyday Things by Don Norman, I have come to believe that this is a suboptimal design which makes human error very likely to occur.
I am running Pure Data on Windows 7. Is there a way I can have either Windows or Pure Data put a cap on the maximum volume it will play? To be clear, I do not want a way to turn the Windows volume down more, as I would then be unable to hear the desired output of Pure Data. The danger is that the difference between desired volume and potential erroneous volume is many orders of magnitude, so I would like to be able to either put a hard ceiling on volume, or simply have the audio automatically muted while it is above a certain volume threshold. In the former case, it would be nice to also have some signifier to indicate that the ceiling is being applied. Is it possible to do this in Windows or in Pure Data?
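To make the two behaviours I have in mind concrete, here is a rough sketch (plain Lua pseudocode for illustration only, not an actual Pd or Windows mechanism; the ceiling and threshold values are arbitrary):
-- hard ceiling: every sample is clipped to +/- ceiling, so a slip can never exceed it
function hardCeiling(sample, ceiling)
  return math.max(-ceiling, math.min(ceiling, sample))
end
-- mute-above-threshold: output is silenced entirely once the level exceeds the threshold
function muteAbove(sample, threshold)
  if math.abs(sample) > threshold then
    return 0
  end
  return sample
end
print(hardCeiling(10.0, 0.5)) --> 0.5
print(muteAbove(10.0, 0.5))   --> 0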
[pix] + [pix-ds] Draw images directly onto the patch (vanilla)
The abstraction [pix] lets you draw images in plain PPM format directly onto the patch. Any image can be converted to PPM with graphics software such as GIMP. The images are displayed in full RGB color.
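For reference, plain PPM is the ASCII (P3) variant of the format: a text header with the magic number, the image dimensions and the maximum channel value, followed by one R G B triple per pixel (in GIMP this should be the ASCII data-formatting option when exporting, though check the help file for what [pix] actually expects). A tiny 2x2 image (red, green, blue, white) looks like this:
P3
2 2
255
255 0 0     0 255 0
0 0 255     255 255 255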
There is also [pix-ds], which is much faster because it uses data structures, but it is limited to the reduced color space of data structures and still has a border on the right and bottom. Sorry!
[pix] uses canvases to create the image. Depending on the image size, the number of canvases can get very large and it may take a long time to display the image and also to close the patch! [pix-ds] will run into the same issues. Images will also add up! Use with care!
Due to the limitations of the Pd GUI, this is just a nerdy fun project and may not really be usable for serious tasks.
Anyhow, enjoy!
Download: pix.zip
Sysex program dump with random zeros
Okay, that's what I suspected you were doing. It is possible to use the MTP's serial connection directly with Linux and the mtpav driver, though this requires a machine with true hardware parallel-port access (USB adapters won't cut it) and is limited to MIDI output only, as the driver is a hack that was never finished. I used to run mine that way, with a second USB MIDI interface connected to one or more of the MTP's unused routed MIDI outputs. (There is, by the way, still no way to connect the MTP USB version directly to a Linux system, as it doesn't use a standard USB MIDI driver.)
As far as the kernel is concerned, I've only ever used lowlatency or RT kernels for MIDI stuff. I don't know how a generic kernel would affect this problem, but it's probably best avoided, as generic kernels are not specifically built for multimedia.
Now read closely as this is where it becomes stupidly complicated.
First, make sure that the problem is internal to Pd. Your interfaces are probably not the issue (since Pd only sees the UM-1 driver, I don't think the MTP even figures in here). However, the chosen MIDI API can get in the way, specifically in the case of the unfinished JACK MIDI, which presently can only pass SYSEX as short realtime messages: SYSEX data dumps will disappear if sent into the current JACK MIDI system. ALSA works OK; OSS probably does too, but I have not tried it.
If that stuff is ruled out and you are still getting problems, it probably has to do with a set of longstanding bugs/oversights in the Pd MIDI stack that can affect the input, output, or both.
On the output side there is a SYSEX transmission bug which affects all versions of Pd Vanilla before 0.48. The patch that fixed Vanilla had already been applied to Pd-L2Ork/Purr Data for some time (years, I think). The output bug did not completely disable SYSEX output; what it did was misformat SYSEX in a way that can't be understood by most modern MIDI applications, including the standard USB MIDI driver software (the SYSEX output from Pd is ignored and the driver/interface will not transmit anything). You would not notice this bug with a computer connected directly to an MTP, because the Mac and Linux MTP drivers are programmed to pass raw unformatted MIDI.
If this is the problem and you have to use a pre-0.48 version of Vanilla it can be patched. See:
https://sourceforge.net/p/pure-data/bugs/1272/
On the MIDI input side of Pd we have two common problems. One is the (annoyingly unfinished but nevertheless implemented) input-to-output timestamping buffer that is supposed to provide sample-accurate MIDI at the output (if it were ever finished). This can be minimized with the proper startup flags (for a MIDI-only instance of Pd, -rt -noaudio -audiobuf 1 -sleepgrain 0.5 works for me) or completely defeated with a source-code tweak to the s_midi.c file. This may not be the source of your particular problem, as it usually only affects the time it takes to pass messages from input to output, but it is worth mentioning as it can be very annoying regardless.
With SYSEX dumps it is more likely that the problem lies with the MIDI queue size. This is very common, and I experienced it myself when trying to dump memory from a Korg Electribe EA-1. The current version of Pd limits the queue to a size of 512 (even worse, it used to be 20), and any input larger than that gets truncated with no error or warning. There is not yet any way to change this with startup flags or user settings; it can only be "fixed" by a different tweak to s_midi.c and recompiling the app.
The attached text files are derived from the Pd-list and will show the specific mods that need to be made to s_midi.c.
Sound problems with patches moved from Windows 10 to Raspbian
Hello,
I've been making guitar effects with Pure Data and a Raspberry Pi.
I did not pay much attention to this problem before, but it has now become a serious problem, so I would like to ask for your advice.
I create Pure Data patches on Windows 10, then copy them to Raspbian and run them there.
In most cases there is no problem, but with some patches the sound is not reproduced the same way.
Please help if there is a solution. Thank you.
Windows_And_Raspbian.mp4
test.pd
*I uploaded the same files to Google Drive and Dropbox.
1) The sound is not reproduced faithfully; the difference is especially noticeable in decaying sounds.
2) The problem occurs even when Windows 10 and Raspbian use the same USB audio interface (Behringer UCG102).
3) The problem also occurs when I use a different USB audio interface (Zoom U-22) on Raspbian.
audio video playback using [gemwin] and [readsf] system needs help
Hi All, looking for some help to close out this patch and progress to user testing...
I have constructed a live cam mixer where Quads 1 and 2 in the gemwin are live cameras (using [pix_video], which works fine), and I want to load a pre-recorded clip into Quad 3 of the gemwin using [pix_film]. I am treating audio and video as two separate streams: [pix_film] to play back the frames and [readsf] to play the corresponding audio. But playback out of [pix_film] is too fast, and soundfiler is throwing errors. I have tried using a metro system to count and match the frame rate out of [pix_film], so I can set a frame rate if known, but I have now tied myself up in knots.
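For what it's worth, this is the kind of arithmetic I have been attempting with the metro (plain Lua just to show the sums; the 25 fps figure is only a placeholder, since I don't actually know the clip's frame rate):
fps = 25                        -- placeholder frame rate
metroPeriodMs = 1000 / fps      -- metro interval needed to advance one frame per bang
elapsedMs = 2000                -- e.g. two seconds into playback
frameIndex = math.floor(elapsedMs / 1000 * fps)  -- frame that should be showing now
print(metroPeriodMs, frameIndex) --> 40  50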
Anyone with a little more experience than me want to help get this over the line??
Any help would be very much appreciated. Thanks!
LESSA FINAL.pd
link to pre rec clips audio and video:
https://drive.google.com/a/mycit.ie/folderview?id=0B1ivfjGLTsmRal85SGwxeXpxNUk&usp=sharing
Video tracking
Hi guys,
I'm trying to implement a system to track robots with a webcam. My problem is that it is difficult to track more than one robot with the basic pix objects.
My system comprises three robots, identified by plates in different vivid colors (red, green and blue), and an amoeba-shaped platform that will probably be white or black in the final version.
I've already tried implementing the tracking with pix_movement, and it worked very well with just one robot. To work with two robots I thought it would be better to separate the RGBA matrix, focus on each color, and track each robot separately. But it did not work very well, because the separated matrix did not give good contrast for the predominant color (a nice big red mark over the red robot, for example). Instead, I got a very messy image caused by interference from the background colors.
To filter and track the image I used:
pix_separator - to use 3 different images
pix_threshold - to keep just the wanted color, if it is high enough
pix_gain - to intensify the required color
pix_movement - to track the movement of the robot
pix_blob - to track the center of mass
One solution I thought of is to use a filter that can read the image in another format, not RGBA - a format that separates color (hue), saturation and brightness into different values and scales. Does that exist?
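What I have in mind is something like a hue/saturation/value representation; roughly this kind of conversion (a sketch in plain Lua, with RGB channels normalized to 0..1, just to show the idea):
function rgbToHsv(r, g, b)
  local maxc = math.max(r, g, b)
  local minc = math.min(r, g, b)
  local delta = maxc - minc
  local v = maxc                 -- value: the brightest channel
  local s = 0                    -- saturation: how far from grey
  if maxc > 0 then s = delta / maxc end
  local h = 0                    -- hue: which color family, in degrees
  if delta > 0 then
    if maxc == r then
      h = ((g - b) / delta) % 6
    elseif maxc == g then
      h = (b - r) / delta + 2
    else
      h = (r - g) / delta + 4
    end
    h = h * 60
  end
  return h, s, v
end
print(rgbToHsv(1, 0, 0)) --> 0  1  1 (pure red)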
Thanks very much for reading this; I would be very glad if you could help me! =)
Set up the path for abstractions
hey,
I don't know, but I wonder if maybe there isn't a more general problem going on right now (though it seems simple and weird, so I don't get it).
I am also having trouble loading abstractions, even though I have done it quite a lot before. (I posted this problem a few days ago, but am posting in response to your problem in the hope that it might be a more widespread issue - though my abstractions, other than the ones mentioned below, do load, so I don't understand it.)
Specifically, was your problem that the path was not being recognised?
anyway -
Hello everyone,
This is a strange problem because I have loaded libraries and things with the 'Paths' dialog under the file menu before and had no problems.
I am trying to get Chris McCormick's s-abstractions to load. The folder is in the same folder as my other libraries, and it is listed in both the "paths" and "startup" areas. It wasn't listed before, but I added it.
I am running Windows 7 64-bit. Another oddity I noticed is that in my Program Files (x86) folder (which is where the 32-bit programs live) I have Pd installed, and the folder is simply called pd. However, in the plain Program Files folder (without the x86, where the 64-bit programs live) I have a folder called "Pd-0.42.5-extended".
I wonder if that couldn't be the problem. The s-abstractions folder is included in both though...
Hopefully somebody has some idea about this...
((Anyway, sorry to post twice, but I hoped there might be a common problem here.))
Unable to lower resolution of unibrain webcams.
yup!
It's super weird.
[pix_videoDarwin]: using RGB
[pix_videoDarwin]: height 480 width 640
[pix_videoDarwin]: starting reset
[pix_videoDarwin]: SG channnel Device List count 6 index 4
[pix_videoDarwin]: SG channnel Device List DV Video
[pix_videoDarwin]: SG channnel Device List DVCPRO HD (1080i50)
[pix_videoDarwin]: SG channnel Device List DVCPRO HD (1080i60)
[pix_videoDarwin]: SG channnel Device List DVCPRO HD (720p60)
[pix_videoDarwin]: SG channnel Device List IIDC FireWire Videofstg
[pix_videoDarwin]: SG channnel Device List USB Video Class Video
[pix_videoDarwin]: SG channnel Input Device List 0 unibrain Fire-i
error: [pix_videoDarwin]: SGSetChannelDevice returned error -9408
error: [pix_videoDarwin]: SGSetChannelDeviceInput returned error 709
[pix_videoDarwin]: vdigName is unibrain Fire-i
[pix_videoDarwin]: digitizer rect is top 0 bottom 1200 left 0 right 1600
[pix_videoDarwin]: active src rect is top 0 bottom 1200 left 0 right 1600
that's kind of the extent of the messages and I don't really know what to make of it. What's with all that HD jazz? I certainly don't want that.
But yeah, so far I can only get 1600x1200 and higher. Maybe if I worked in YUV?
Edit: I forgot to mention that I'm running a MacBook Pro with Snow Leopard.
Pix_texture
Hi everyone!
I'm using GEM for VJing, but I have some problems displaying .png or .tif pictures with an alpha layer.
I want to superimpose several outlined images without alpha, but PNG doesn't work on Windows, so I tried this on Ubuntu:
[gemhead] [alpha] [pix_image] [translateXYZ] [pix_draw]
but I can't resize with pix_draw, and it's really slow when I load a picture...
http://mcnaze.free.fr/pix_draw.png
[gemhead] [alpha] [pix_image] [translateXYZ] [pix_texture] [square 4]
but my images (which are solid colors) are displayed with an ugly outline.
http://mcnaze.free.fr/pix_texture.png
Is there a way to display several clean pictures without alpha?
Thanks a lot.
G.