Invert spectra
toxonic:
did you ever connect that FFT shift subpatch that is hidden in your patch? i just tried that and it works really well.
i can see a really neat school of fx coming out of this:
as far as i know, the following can all be pretty easily applied:
inverse
shift
all sorts of EQ effects
feedback
then, some that i just brainstormed, but haven't tried yet:
FFT "bitcrush"
reduce the resolution of each value output by the FFT
LFO'd FFT
use LFOs (or envelopes, or whatever) to modulate FFT outputs. i already tried this a little bit with one of my DIY objects, and it works well.
Randomization:
rather than just an inverse, maybe try some random scrambling on a selected range of the FFT to see what happens.
again, might be even more interesting if the range gets modulated with an LFO or something.
however, the big 'AHA' moment i just had was that it will be a million times more computationally efficient to include ALL of these things within one FFT analysis, and have switches to turn them on and off as desired.
you know, it's going in and out of the FFT domain that uses most of the cpu resources, so while you're in there, you may as well get as much done as possible.
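here's a rough C sketch of what i mean: several spectral effects in a single pass over one FFT frame. mag[] is a hypothetical array of bin magnitudes (phases would get the same treatment), so this is just the per-bin math, not pd code:

/* one analysis, one resynthesis, any number of cheap per-bin
   operations in between -- each with its own on/off switch */
void process_frame(float *mag, int n, int crush_on, float step,
                   int invert_on, int shift_on, int shift_bins)
{
    int i;
    if (crush_on)                 /* FFT "bitcrush": quantize each bin */
        for (i = 0; i < n; i++)
            mag[i] = step * (int)(mag[i] / step);
    if (invert_on)                /* one reading of "inverse": mirror the bins */
        for (i = 0; i < n / 2; i++) {
            float t = mag[i];
            mag[i] = mag[n - 1 - i];
            mag[n - 1 - i] = t;
        }
    if (shift_on)                 /* spectral shift: slide bins upward */
        for (i = n - 1; i >= 0; i--)
            mag[i] = (i >= shift_bins) ? mag[i - shift_bins] : 0.0f;
    /* LFO modulation or random scrambling would just be more per-bin
       tweaks here, before the single resynthesis at the end */
}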
i'll have a go with this in the coming week and see where i can get. any help would be excellent!
NVidia puredyne vs vista
Hey
Once again I'm back on Vista. My brief journey to puredyne went something like this.
from puredyne
Partitioned the new hard drive with 4 partitions: 1 swap, 2 ext4 (200 GiB each), 1 NTFS (200 GB). Copied Vista to the NTFS partition with GParted (I started to do this with dd, but with the new GiB/GB stuff I was scared; hey, a platter is a platter and a cluster is a cluster, right?). Actually there was a 5th partition in there, the Vista restoration partition, but I eventually deleted it for obvious reasons.
Removed the old drive,
took the machine to an ethernet connection with internet,
booted to Vista,
installed the Norton virus scanner: only 1 virus, 2 tracking cookies, and one Gain trickler after over a year with no virus scanner. That's pretty good, and it shows you can be selective about what you download and where you visit on the web and mostly avoid viruses.
Was trying to get Windows Movie Maker to work with the MOV codec, which it doesn't.
Installed auto updates for the first and only time. This took about 3 hours and still needed Service Packs 1 and 2. Aborted trying to update Windows Movie Maker.
*****Problem 1***** MS Movie Maker does not work with GEM's MOV video output codec.
Installed the latest NVIDIA drivers for Vista.
Woohoo, OpenGL hardware acceleration: GEM works so much better in Vista now.
*****Problem 2***** ASIO4ALL does not allow 2 applications to use audio and MIDI at the same time. If I want to run Widi and play my guitar so Pure Data can receive MIDI input from the guitar, ASIO will not allow it, but ASIO4ALL is the only way I can get latency low enough for this to be feasible.
Got frustrated.
Installed a fresh copy of puredyne onto the 2nd ext4 partition.
Ran Pure Data: audio latency good with the fresh install, GEM video bad.
Used Synaptic to install the NVIDIA drivers, v185 I think.
Ran GEM: error, no OpenGL config.
Downloaded gdebi, Flash Player, Google Chrome, and the NVIDIA drivers from the NVIDIA site.
Installed gdebi from the terminal, then Flash and Chrome via gdebi.
Rebooted to log in as root and installed the NVIDIA drivers.
Woohoo, hardware acceleration for OpenGL!
*****Problem 3*****
GEM works great, but audio glitches whenever the Pure Data window is visible.
When only the GEM window is visible, I can move the mouse around without audio glitches to control the same parameters that would normally be controlled with sliders in the Pd window.
It seems like before I upgraded the NVIDIA video drivers, the hw:0,0 (or whatever it is) device was listed in JACK as Realtek HD Audio; now it is listed as NVIDIA HD Audio.
So GEM works great, but not when the Pd window is open, and I think my audio drivers were replaced with NVIDIA's.
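I guess the first thing to check is which ALSA card is which (standard ALSA tools, assuming they are installed):

cat /proc/asound/cards    # lists each sound card's index and name
aplay -l                  # lists playback devices, e.g. card 0 device 0 = hw:0,0

and then point JACK at the Realtek hw:X instead of the NVIDIA (HDMI) audio device.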
I would really like to get back to my patch and stop wasting time on these operating systems and hardware configurations. I thought that in the last 15 years all this hardware acceleration and autodetection would have been figured out, but no, all we got is rc.0, rc.01, rc.02, rc, etc. instead of just rc.d.
OK, I'm getting an error when I exit puredyne that says "pid file not found, is jackd running". Anyone know what that means?
Also, does anyone have a keymap for GEM's keyboard object for Linux, Windows and Mac, so I don't have to record them myself?
Any suggestions about what to do about my ASIO problems in Windows?
Any suggestions about my audio driver in Linux?
Any suggestions about my GEM MOV video / Windows incompatibilities?
I can import the MOV video and audio into SF Acid 3.0 and output a 2 GB AVI, then import that into Windows Movie Maker, but that is crazy to do for a 7-minute, 12 fps, 320x240 video.
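(I suppose something like ffmpeg could shortcut the Acid step, assuming a build with the right codecs is available; the file names here are just examples:

ffmpeg -i gemout.mov gemout.avi

but I haven't tried it.)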
Ahhhhhhhhh!
Any suggestions are welcome!
PS: I believe someone could rewrite Pure Data's audio and video interface in assembly language to drive their hardware directly before figuring out how to get these operating systems to do it.
Pix_record
Thanks sonofsol, that was just what I was looking for today.
Great job on your GEM engine. I think it would be awesome to have 3D controllers for the Pd stuff. My main synth patch is about 4 screens wide by 2 screens high. Not that there are that many controls, but there are that many things between the controls. It would be nice to just make a slick 3D control interface. I can imagine a block sequencer, haha!
Now I have my video and audio recorded. I tried to open the video in Microsoft Movie Maker: codecs not installed. The video opens in VLC. The video also opens in Sonic Foundry Acid 3.0, but it is shorter in duration than the audio. The video barely records at 12 frames per second while controlling audio; the computer almost hangs if I select mpg instead of video. VLC reports the fps as 24.788104. I used the default video codec at 12 fps.
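My guess: if the frames were rendered at 12 fps but the container stamps them at ~24.8 fps, the video plays roughly twice as fast and ends early, which would explain the duration mismatch. Something like ffmpeg might be able to re-stamp it (file names are hypothetical, and whether -r before -i overrides the input rate may depend on the build):

ffmpeg -r 12 -i recorded.mov restamped.avi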
How do I get the video synced with the audio?
Where are the qff codecs for Windows so I can use Microsoft Movie Maker?
What is a better program for editing videos to send to YouTube?
I would like to keep my audio in a lossless format at 48k.
Which fft-code is built into Pd binaries?
Does anyone happen to know which fft-code is built into Pd / Pd-extended binary releases?
The source file d_fft.c states that it interfaces 'to one of the Mayer, Ooura or fftw FFT packages to implement the "fft~", etc, Pd objects. ... The configure script can be used to select which one.'
The configure.in script in SVN says about fftw:
AC_SUBST(fftw, no)
AC_ARG_ENABLE(fftw, [  --enable-fftw    use FFTW package],
    fftw=$enableval)
dnl Check for fftw package
if test x$fftw = "xyes";
then
    AC_CHECK_LIB(fftw, fftw_one, PDLIB="$PDLIB -lfftw",
        echo "fftw package not found - using built-in FFT"; fftw=no)
So it depends on the build machine. On the other hand, [partconv~] from Ben Saylor depends on fftw, and this object works fine, at least in the 0.42.5-extended build for OSX.
From these bits I conclude that fftw is built into the Pd binaries and is also used for [fft~] & co. Is that correct? fftw claims to be the fastest. Not that it is very important to me, I am just curious.
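One way to check a given binary, at least where it is dynamically linked and not stripped (ldd and nm are standard Unix tools; otool -L is the OSX counterpart of ldd):

ldd `which pd` | grep -i fft    # is fftw linked as a shared library?
nm pd | grep -i mayer           # or are the built-in Mayer FFT symbols present?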
Katja
Pdj lib problems
I successfully compiled the pdj lib,
placed the whole dir in the /usr/lib/pd/extra dir,
set a path to it in Pd-extended,
tried to open a help file for the 'help class' and got
a red outline around the [pdj help_class] external
and this error in the console:
pdj help_class
... couldn't create
pdj: unable to use the JVM specified at pdj.JAVA_HOME
pdj: using JVM from the LD_LIBRARY_PATH
error: pdj: libjava.so: cannot open shared object file: No such file or directory
pdj help_class @attr1 10
... couldn't create
tried text-help.pd and failed
tried /usr/lib/pd/doc/5.reference/text-help.pd and succeeded
I'm reading through the 'pdj.properties' file but am not sure what to modify.
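My current guess is something along these lines (the paths are only examples for a stock Ubuntu Sun Java 6 install; adjust them to wherever libjava.so actually lives on your machine):

# in pdj.properties: point pdj at the JRE home
pdj.JAVA_HOME=/usr/lib/jvm/java-6-sun/jre

# or alternatively, before starting pd:
export LD_LIBRARY_PATH=/usr/lib/jvm/java-6-sun/jre/lib/i386:$LD_LIBRARY_PATH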
any help is appreciated!
===================
my Linux audio config:
Dell Studio 15 Core2 Duo 2.0GHz 3G-RAM
Ubuntu 9.04
kernel 2.6.28-18-generic
Processing to communicate with Pd
hey Arif, thanks for the reply. I have input what I think is right, but still no joy: now it is complaining about CoordsCalc. I have included the code. I already had some of the things from the example. I think I just need to add integers, but I don't know how or what to put.
import processing.video.*;
import oscP5.*;
import netP5.*;
OscP5 oscP5;
NetAddress myRemoteLocation;
Capture video;
int numPixels; // number of pixels in the video
int rectDivide = 4; // the stage width/height divided by this number is the video width/height
int vidW; // video width
int vidH; // video height
int[][] colouredPixels; // the different colour references for each pixel
int[][] colourCompareData; // captured r, g and b colours
int currR; // current red component of the sampled pixel
int currG; // current green component
int currB; // current blue component
int[][] squareCoords; // x, y, w + h of the coloured areas
color[] colours; // captured colours
int colourRange = 25; // colour threshold
int[][] centrePoints; // centres of the coloured squares
color[] pixelColours;
boolean isShowPixels = false; // determines whether the square and coloured pixels are displayed
int colourMax = 2; // max amount of colours - also adjust the amount of colours added to pixelColours in setup()
int coloursAssigned = 0; // amount of colours currently assigned
CoordsCalc coordsCalc;
void setup()
{
size(640, 480);
vidW = width / rectDivide;
vidH = height / rectDivide;
video = new Capture(this, vidW, vidH, 30);
noStroke();
numPixels = vidW * vidH;
colouredPixels = new int[vidH][vidW];
colourCompareData = new int[colourMax][3];
squareCoords = new int[colourMax][4];
colours = new color[colourMax];
centrePoints = new int[colourMax][2];
color c1 = color(0, 255, 0);
color c2 = color(255, 0, 0);
pixelColours = new color[colourMax];
pixelColours[0] = color(0, 255, 0);
pixelColours[1] = color(255, 0, 0);
coordsCalc = new CoordsCalc();
oscP5 = new OscP5(this, 12000);
myRemoteLocation = new NetAddress("127.0.0.1", 12000);
}
void captureEvent(Capture video)
{
video.read();
}
void draw()
{
noStroke();
fill(255, 255, 255);
rect(0, 0, width, height);
drawVideo();
coordsCalc.update();
for (int i = 0; i < coloursAssigned; i++)
{
if (isShowPixels) drawSquare(i);
}
background(0); // note: calling background() at the END of draw() blacks out everything just drawn; you probably want this at the top of draw()
}
void mousePressed() {
/* in the following different ways of creating osc messages are shown by example */
OscMessage myMessage = new OscMessage("/test");
// 'colourLocation' was undefined; sending the first tracked centre point
// is an assumption -- add whatever ints you actually want to transmit
myMessage.add(centrePoints[0][0]); /* add an int to the osc message */
myMessage.add(centrePoints[0][1]);
/* send the message */
oscP5.send(myMessage, myRemoteLocation);
}
/* incoming osc messages are forwarded to the oscEvent method. */
void oscEvent(OscMessage theOscMessage) {
/* print the address pattern and the typetag of the received OscMessage */
print("### received an osc message.");
print(" addrpattern: "+theOscMessage.addrPattern());
println(" typetag: "+theOscMessage.typetag());
}
void drawVideo()
{
for (int i = 0; i < coloursAssigned; i++)
{
fill(colours[i]);
rect(i * 10, vidH, 10, 10);
}
image(video, 0, 0);
noFill();
stroke(255, 0, 0);
strokeWeight(2);
rect(vidW - 4, vidH - 4, 4, 4);
}
void drawSquare(int i)
{
int sqX = squareCoords[i][0];
int sqY = squareCoords[i][1];
int sqW = squareCoords[i][2];
int sqH = squareCoords[i][3];
noFill();
stroke(0, 0, 255);
strokeWeight(3);
rect(sqX, sqY, sqW, sqH);
//stroke(0, 0, 255);
//strokeWeight(4);
rect(sqX * rectDivide, sqY * rectDivide, sqW * rectDivide, sqH * rectDivide);
line(sqX * rectDivide, sqY * rectDivide, ((sqX * rectDivide) + (sqW * rectDivide)), ((sqY * rectDivide) + (sqH * rectDivide)));
line(((sqX * rectDivide) + (sqW * rectDivide)), sqY * rectDivide, sqX * rectDivide, (sqY * rectDivide + sqH * rectDivide));
}
void keyPressed()
{
println("key pressed = " + key);
color currPixColor = video.pixels[numPixels - (vidW * 2) - 3];
int pixR = (currPixColor >> 16) & 0xFF;
int pixG = (currPixColor >> 8) & 0xFF;
int pixB = currPixColor & 0xFF;
if (key == 'p')
{
isShowPixels = !isShowPixels;
}
if (key == '1')
{
coloursAssigned = 1;
colourCompareData[0][0] = pixR;
colourCompareData[0][1] = pixG;
colourCompareData[0][2] = pixB;
colours[0] = color(pixR, pixG, pixB);
}
if (colourMax < 2 || coloursAssigned < 1) return;
if (key == '2')
{
coloursAssigned = 2;
colourCompareData[1][0] = pixR;
colourCompareData[1][1] = pixG;
colourCompareData[1][2] = pixB;
colours[1] = color(pixR, pixG, pixB);
}
if (key == '0')
{
coloursAssigned = 0;
}
}
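Note: the sketch also references the CoordsCalc class from the colour-tracking example I'm adapting, which I haven't included here (it lives in a separate tab there). A minimal stub, just enough to make the sketch compile, would look like this:

class CoordsCalc {
  // the real version from the example scans video.pixels, compares each
  // pixel against colourCompareData within colourRange, and fills in
  // squareCoords[i] and centrePoints[i] for each assigned colour
  void update() {
  }
}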
Interaction Design Student Patches Available
Greetings all,
I have just posted a collection of student patches for an interaction design course I was teaching at Emily Carr University of Art and Design. I hope that the patches will be useful to people playing around with Pure Data in a learning environment, installation artwork and other uses.
The link is: http://bit.ly/8OtDAq
or: http://www.sfu.ca/~leonardp/VideoGameAudio/main.htm#patches
The patches include multi-area motion detection, colour tracking, live audio looping, live video looping, collision detection, real-time video effects, real-time audio effects, 3D object manipulation and more...
Cheers,
Leonard
Pure Data Interaction Design Patches
These are projects from the Emily Carr University of Art and Design DIVA 202 Interaction Design course for the Spring 2010 term. All projects use Pure Data Extended and run on Mac OS X; they could likely be modified with small changes to run on other platforms as well. The focus was on education, so the patches are sometimes "works in progress" technically, but they should be quite useful for others learning about Pd and interaction design.
NOTE: This page may move, please link from: http://www.VideoGameAudio.com for correct location.
Instructor: Leonard J. Paul
Students: Ben, Christine, Collin, Euginia, Gabriel K, Gabriel P, Gokce, Huan, Jing, Katy, Nasrin, Quinton, Tony and Sandy
GabrielK-AsteroidTracker - An entire game based on motion tracking. This is a simple arcade-style game in which the user must navigate the spaceship through a field of oncoming asteroids. The user controls the spaceship by moving a specifically coloured object in front of the camera.
Features: Motion tracking, collision detection, texture mapping, real-time music synthesis, game logic
GabrielP-DogHead - Maps your face from the webcam onto different dogs' bodies in real-time with an interactive audio loop jammer. Fun!
Features: Colour tracking, audio loop jammer, real-time webcam texture mapping
Euginia-DanceMix - Live audio loop playback of four separate channels. Loop selection is random for first two channels and sequenced for last two channels. Slow volume muting of channels allows for crossfading. Tempo-based video crossfading.
Features: Four channel live loop jammer (extended from Hardoff's ma4u patch), beat-based video cross-cutting
Huan-CarDance - Rotates 3D object based on the audio output level so that it looks like it's dancing to the music.
Features: 3D object display, 3d line synthesis, live audio looper
Ben-VideoGameWiiMix - Randomly remixes classic video game footage and music together. Uses the wiimote to trigger new video by DarwiinRemote and OSC messages.
Features: Wiimote control, OSC, tempo-based video crossmixing, music loop remixing and effects
Christine-eMotionAudio - Mixes together video with recorded sounds and music depending on the amount of motion in the webcam. Intensity level of music increases and speed of video playback increases with more motion.
Features: Adaptive music branching, motion blur, blob size motion detection, video mixing
Collin-LouderCars - Videos of cars respond to audio input level.
Features: Video switching, audio input level detection.
Gokce-AVmixer - Live remixing of video and audio loops.
Features: video remixing, live audio looper
Jing-LadyGaga-ing - Remixes video from Lady Gaga's videos with video effects and music effects.
Features: Video warping, video stuttering, live audio looper, audio effects
KatyC_Bunnies - Triggers video and audio using multi-area motion detection. There are three areas on each side to control the video and audio loop selections. Video and audio loops are loaded from directories.
Features: Multi-area motion detection, audio loop directory loader, video loop directory loader
Nasrin-AnimationMixer - Hand animation videos are superimposed over the webcam image and chosen by multi-area motion sensing. Audio loop playback is randomly chosen with each new video.
Features: Multi-area motion sensing, audio loop directory loader
Quintons-AmericaRedux - Videos are remixed in response to live audio loop playback. Some audio effects are mirrored with corresponding video effects.
Features: Real-time video effects, live audio looper
Tony-MusicGame - A music game where the player needs to find how to piece together the music segments triggered by multi-area motion detection on a webcam.
Features: Multi-area motion detection, audio loop directory loader
Sandy-Exerciser - An exercise game where you move to the motions of the video above the webcam video. Stutter effects on video and live audio looper.
Features: Video stutter effect, real-time webcam video effects
How to Enjoy Your Favorite Videos on Portable Devices at Will For Mac
Are you a Mac user?
Do you still feel frustrated that you can't enjoy your favorite videos on portable devices at will?
Now a professional piece of software, Aiseesoft Video Converter for Mac (http://www.aiseesoft.com/video-converter-for-mac.html),
can help you solve all these problems. With it, you can convert between all popular video and audio formats, such as AVI, MP4, MOV, MKV, WMV, DivX, XviD, MPEG-1/2, 3GP, 3G2 and VOB video, and MP3, AAC and AC3 audio, with fast conversion speed and high output quality. In addition, the best video converter for Mac can also extract the audio from a video file and convert video to MP3, AC3 or AAC as you want.
OK, let's move on to how to use the software.
Step 1. Download and install Aiseesoft Video Converter for Mac.
After installation, you will see the following interface:
http://www.aiseesoft.com/images/guide/dvd-converter-suite-mac/video.jpg
Step 2. Load Video
You can load your video by clicking the "Add File" button, or click the "File" menu and choose "Add File" from the drop-down list.
Step 3. Output Format and Settings
From the "Profile" drop-down list, choose the format that meets your requirements.
After doing the 3 steps above, click the "Start" button to start the conversion.
Wait a moment and the conversion will soon be finished.
Tips:
1. Trim
The "Trim" function lets you select the clips you want to convert.
There are 3 ways to trim your video:
a. You can drag the buttons (1) to set the start and end time.
b. You can preview the video and click the left of the pair of buttons (2) where you want the trim to start, and the right one where you want it to end.
c. You can set the exact start and end time on the right side of the pop-up window.
http://www.aiseesoft.com/images/guide/dvd-ripper-for-mac/trim.jpg
2. Crop
Cut off the black edges of the original movie and watch it in full screen using the "Crop" function.
There are 3 ways to crop your video:
a. We provide 7 preset modes in the "Crop Mode" list (1).
b. You can set your own mode on the right side of the pop-up window (2).
c. You can drag the frame to set your own crop area (3).
http://www.aiseesoft.com/images/guide/dvd-ripper-for-mac/crop.jpg
3. Snapshot and merge into one file
If you like the current image of the video, you can use the "Snapshot" option: just click the "Snapshot" button and the image will be saved; you can then click the "Open" button next to the "Snapshot" button to view your picture.
If you want to output several files as one, choose "Merge into one file".
If you are a Windows user, you can go to Aiseesoft Total Video Converter (http://www.aiseesoft.com/total-video-converter.html) for more information.
HELP: "shot" & translate pixels
this sounds interesting, although i don't understand quite a lot of it.
first the easy part:
"Let's take a gemwin 9*9 pixels. So, 9 columns of pixels. In this gemwin we see the webcam or a video. Every column of pixels changes frame by frame (because video or webcam go on)."
you want to display a video or cam, with a resolution displayed in the gemwin of only 9x9 pixels?
so you get 9x9 coloured squares, that somewhat change colour if something happens in the video?
for that, you could use a gemframebuffer. i accidentally wrote a pixelizer-thingy recently while trying to figure out framebuffering.
in the attached patch i more or less just modified the framebuffer.readback help patch.
- then you could use a pix_snap2tex with a sufficient size and offset to snap the row of pixels you want and texture them back onto a rectangle with the size and position of the left-hand row of pixels. with each frame of the video, those rectangles need to move further to the left to make space for a new rectangle.
i don't quite understand where the video is shown when one axis (the "central" one; horizontal or vertical?) is being snapped.
would the video not be 9x9 pixels large?
is the video only shown on the right part of the screen?
the left part would be filled up by the shots frame by frame?
Problem opening video files with libavcodec
Hi everybody,
I'm trying to write a pdp external using the libavcodec library, to expand the video decoding capabilities of the system. I'm following Stephen Dranger's tutorial (available at http://www.dranger.com/ffmpeg/) because it's very well organized and clear. Still, I'm not able to open video files inside Pd, while the same piece of code works perfectly in the standalone sample program.
This is the code of my external:
#include <stdlib.h>
#include <stdio.h>
#include <libavcodec/avcodec.h>
#include <libavformat/avformat.h>
#include <libswscale/swscale.h>
#include "pdp.h"
typedef struct pdp_in_struct {
t_object x_obj;
t_outlet *x_outlet0;
int x_packet0, x_queue_id;
u32 x_width, x_height;
} t_pdp_in;
static void pdp_in_sendpacket(t_pdp_in *x)
{
/* unregister and propagate if valid dest packet */
pdp_packet_pass_if_valid(x->x_outlet0, &x->x_packet0);
}
static void pdp_in_process(t_pdp_in *x)
{
}
static void pdp_in_open(t_pdp_in *x, t_symbol *s)
{
const char *filename = s->s_name;
AVFormatContext *pFormatCtx = NULL;
av_register_all(); // safe to call more than once; could also live in setup
// Open the video file. Note: Pd's working directory is usually not the
// patch's directory, so a relative path can fail here even though the
// same code works in a standalone program -- try an absolute path.
if(av_open_input_file(&pFormatCtx, filename, NULL, 0, NULL)!=0)
{
post("pdp_in: can't open file %s", filename);
return; // Couldn't open file
}
// Retrieve stream information
if(av_find_stream_info(pFormatCtx)<0)
{
post("pdp_in: couldn't find stream information");
return;
}
// Dump information about the file onto standard error
dump_format(pFormatCtx, 0, filename, 0);
}
t_class *pdp_in_class;
void pdp_in_free(t_pdp_in *x)
{
t_pdp_procqueue *q = pdp_queue_get_queue();
pdp_procqueue_finish(q, x->x_queue_id);
pdp_packet_mark_unused(x->x_packet0);
}
void *pdp_in_new(void)
{
t_pdp_in *x = (t_pdp_in *)pd_new(pdp_in_class);
x->x_outlet0 = outlet_new(&x->x_obj, &s_anything);
x->x_packet0 = -1;
x->x_queue_id = -1;
x->x_width = -1;
x->x_height = -1;
return (void *)x;
}
#ifdef __cplusplus
extern "C"
{
#endif
void pdp_in_setup(void)
{
pdp_in_class = class_new(gensym("pdp_in"), (t_newmethod)pdp_in_new,
(t_method)pdp_in_free, sizeof(t_pdp_in), 0, A_NULL); /* pdp_in_new takes no creation args, so the two A_DEFFLOAT specifiers were dropped */
class_addmethod(pdp_in_class, (t_method)pdp_in_open, gensym("open"), A_SYMBOL, A_NULL);
}
#ifdef __cplusplus
}
#endif
Am I missing something?
Thanks
Stefano