• cfry

    @whale-av said:

    @cfry Also....... 50cents for 6 buttons....... because...... how long can you bounce my dear switch?
    https://hackaday.com/2015/12/09/embed-with-elliot-debounce-your-noisy-buttons-part-i/

    Hi,
    I am necromancing this thread a bit. :)

    I have chosen to debounce in hardware (also because it seems I will need to do it in Pd too).

    I asked about this at the Arduino forum as well, but they (naturally) suggest that I solve it in the Arduino code. Since I use Firmata combined with Pduino, I cannot do that.

    I would like some help, please, with choosing the right hardware debouncing configuration. It works pretty OK as of now, but I want to make sure I am not making any mistakes in the component selection. I am debouncing the digital inputs of an Arduino Nano clone. The bouncing seems to differ quite a bit between the inputs.

    I am following this guide:

    A Guide to Debouncing - Part 2, or, How to Debounce a Contact in Two Easy Pages, by Jack Ganssle

    I am choosing between these two configurations:

    [attached schematics: debouncerrc.jpg, debouncerrc2.jpg]

    The math equations are beyond my understanding (I need to get a math textbook sooner or later, to brush up on things).

    The latter, with the diode, seems to work fine. Suggested values are C: 1 µF, R1: 82K, R2: 18K. I use slightly larger values.

    The one without a diode has example values C: 0.1 µF, R1: ? (I used 10K), R2: 185K.

    Question

    Since I ran out of 1 µF capacitors, I used the values from the first circuit (the one without the diode), but combined them with a diode as in the second circuit. Does this make sense?
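
    For what it is worth, this is how I read the timing in the guide, using the suggested values from the diode circuit (C = 1 µF, R1 = 82K, R2 = 18K) and assuming a 5 V supply, a logic threshold around half of that, and the usual topology where C charges through the resistors when the switch opens and discharges through R2 when it closes:

    \[
    t_{\mathrm{fall}} = R_2\,C\,\ln\frac{V_{cc}}{V_{th}} \approx 18\,\mathrm{k\Omega}\cdot 1\,\mu\mathrm{F}\cdot\ln 2 \approx 12\ \mathrm{ms}
    \]
    \[
    t_{\mathrm{rise}} = (R_1 + R_2)\,C\,\ln\frac{V_{cc}}{V_{cc}-V_{th}} \approx 100\,\mathrm{k\Omega}\cdot 1\,\mu\mathrm{F}\cdot\ln 2 \approx 69\ \mathrm{ms}
    \]

    With the diode, the charge path is R1 only, so the rise time drops to about 82 kΩ · 1 µF · ln 2 ≈ 57 ms. Scaling R or C up or down scales these times in proportion.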

    This is the context, the switches:

    [attached photo of the switches]

    So there will be a lot of jitter; my guess is that the debounce time needs to be at least 30 ms.

    [I use the Nano as an interface for sound synthesis software (Pure Data).]

    Thank you!

    posted in technical issues
  • cfry

    Thank you for that neat explanation. It totally makes sense.

    When I did a comparison test with just one of each (set to a fast rate), the CPU load did not differ much; it hovered around 7%. But that is actually just the base load on my old computer.

    posted in technical issues
  • cfry

    Hi,

    I am making a sample-and-hold that should modify the frequency control signal going into a [phasor~].

    This works:
    [screenshot of the working patch]

    What I am wondering is: would it be better to do this with [random]? CPU load is my main concern.

    Thanks!

    N.

    posted in technical issues
  • cfry

    @whale-av Great, I have seen that book and wanted to read it, thanks. That example seems quite spot on, a good start.

    I got the impression that six detection areas were the maximum with VPT/VideoTrigger, but maybe it is just a matter of settings.
    ...
    I am not sure how to read this, but both 32-bit and 64-bit are in there. Also, the Pd and Gem paths are specified as file paths on the author's computer. Could that be the problem?

    The makefile:

    PD_PATH = /Applications/Pd-0.42-5.app/Contents/Resources
    GEM_PATH = /Users/williambrent/Pd/gem-0.92-3
    
    FAT_FLAGS = -arch i386 -arch x86_64
    
    EXTENSION = pd_darwin
    
    all: pix_motion_sector.$(EXTENSION)
    
    pix_motion_sector.pd_darwin:
        g++ $(FAT_FLAGS) -c -g -O2 -fPIC -freg-struct-return -Os -falign-loops=32 -falign-functions=32 -falign-jumps=32 -funroll-loops -ffast-math -mssse3 -fpascal-strings -I$(PD_PATH)/src  -I$(GEM_PATH)/src/ pix_motion_sector.cpp -o pix_motion_sector.o
        g++ $(FAT_FLAGS) -bundle -undefined dynamic_lookup -o pix_motion_sector.pd_darwin pix_motion_sector.o
    
    
    clean:
        -rm -r -- pix_motion_sector.$(EXTENSION) pix_motion_sector.o
    

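    If those hard-coded paths turn out to be the problem, I suppose I could override them from the command line without editing the makefile itself. A rough sketch (the paths below are only placeholders for wherever my own Pd app and Gem sources live, and I am not sure the 32-bit i386 arch even builds with current Xcode tools, hence 64-bit only):

    cd /Users/Nicklas/Documents/Pd/externals/pix_motion_sector
    make clean
    # the makefile looks for headers in $(PD_PATH)/src and $(GEM_PATH)/src
    make PD_PATH=/Applications/Pd.app/Contents/Resources \
         GEM_PATH=/path/to/gem-0.92-source \
         FAT_FLAGS="-arch x86_64"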

    posted in technical issues
  • cfry

    Hi,

    I cannot get this to work; I am still trying to get comfortable with externals. Thankful for any help, and sorry for my ignorance. :/

    pix_motion_sector (scroll down)

    README:
    "Throw the pix_motion_sector folder in Pd's Contents/Resources/extra directory. The provided macintosh binary should work as-is. To build, cd to the pix_motion_sector directory and type "make". "

    I first put it in Contents/Resources/extra and did the build.
    Something came up about installing Xcode; I thought I had it, so I just went along.
    Could not create [pix_motion_sector] in Pd.
    Tried moving the folder to my other externals folder, Users/…/Documents/Pd/externals. No luck.
    Tried removing all the built files and rebuilding.
    Terminal:

    make: Nothing to be done for `all'
    
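    (I am guessing the "Nothing to be done" just means the previously built pix_motion_sector.pd_darwin is still there, so make considers the target up to date. Forcing a rebuild would be something like:)

    cd /Users/Nicklas/Documents/Pd/externals/pix_motion_sector
    make clean   # removes the old .pd_darwin and .o, as defined in the makefile
    make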

    Downloaded Command_Line_Tools_for_Xcode_11.3.1 (I am on Mojave).

    Pd says:

    /Users/Nicklas/Documents/Pd/externals/pix_motion_sector/pix_motion_sector.pd_darwin: dlopen(/Users/Nicklas/Documents/Pd/externals/pix_motion_sector/pix_motion_sector.pd_darwin, 10): Symbol not found: __ZN7GemBase10isRunnableEv
      Referenced from: /Users/Nicklas/Documents/Pd/externals/pix_motion_sector/pix_motion_sector.pd_darwin
      Expected in: dynamic lookup
    
     pix_motion_sector
    ... couldn't create
    
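    (One thing I might try, in case it helps: listing the undefined symbols of the binary with nm, just to see what it expects to find at load time.)

    nm -u pix_motion_sector.pd_darwin | grep GemBase
    # __ZN7GemBase10isRunnableEv should appear here; with -undefined dynamic_lookup
    # it has to be provided by the Gem binary that Pd has already loaded, so I
    # suspect either Gem is not loaded before this external, or my Gem version
    # does not match the 0.92 headers it was built against.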

    posted in technical issues
  • cfry

    Looking some more into it, I think I should use camera(s): Gem's [pix_movement], williambrent's pix_motion_sector external, or maybe his DILib external. Or VPT8, or the Video Trigger from Zach Poff that it is based on. But it would be great if I can stay in Pd only.

    I figure it would be possible to create "swipe areas" that let the user push the coils in different directions. One would need to be able to set an area for detection and detect which direction the swipe comes from.

    posted in I/O hardware diy
  • cfry

    I am making an installation in a small gallery in May: an interactive tactile sound sculpture. Corona flips the intimate setting and the tactile elements from features into serious problems. I am trying to find a way to still do the exhibition.

    My idea is to place the sculpture so it is visible from the gallery windows and have the audience interact with it from outside. I would have to mount speakers on the outside and/or use the window as a vibration speaker.

    @whale-av said:

    @cfry Why 3 areas?
    To push in different directions.
    PIR will not work well through glass...... could it be put outside?
    On second thought, I think it could.
    PIR only has one sensor..... and a mask to change the IR levels as you move..... so 3 required and some modification of the masks.
    I was thinking that since PIRs are not expensive I could have several.
    Will Pd be involved?
    Yes.
    A camera could do this.... in Pd with Gem.....or VPT7 or VPT8 https://hcgilje.wordpress.com/vpt/ with its optional video trigger......or some other software.
    Seems interesting. I need to look into it.
    Here are some impressive gestural instruments using a camera (The Gesturally Extended Piano, The Open Shaper). From an interactive perspective, these are much more sophisticated functions than I would need.

    posted in I/O hardware diy
  • cfry

    Hi,

    (if this is off topic for this subforum let me know and I will post in "off topic")

    Can I get some advice on what tech solution would be good for this:

    I would like to make a motion detection interface, working similarly to a touch interface:
    three adjacent areas that you can hover your hand over to send an on/off message. It should work outdoors, in daylight, and ideally through a glass window.

    Maybe one could hack PIR sensors to detect just a small area (10x10 cm)? Or would this be a job for ultrasonic sensors (though not through glass, then)?

    Thanks!

    posted in I/O hardware diy
  • cfry

    @Il-pleut TimbreID http://williambrent.conflations.com/pagez/research.html#timbreID
    works along those lines. It is not machine learning, but it compares features of sounds and does real-time concatenative synthesis and timbre-based orderings of sound sets, among other things. And it works really well, I would say, from my initial fooling around with it. Thanks @Johnny-Mauser for letting me know it existed. Whether TimbreID can be used to both analyze and process in real time, I have yet to find out. Maybe that is where a trained neural network is needed.

    posted in technical issues
  • cfry

    @oid Thanks a lot, it is really good to know that [sigmund~] works that well with decent audio equipment.

    But I am not after birdsong; I actually want to track the wind, cars passing by, all the sounds in a sound field. Of course there are no clear pitches to analyze and synthesize in those types of sounds, but I was hoping for a reaction from the patch, for a start. I think TimbreID would let me do what I hope for.

    Still it is great to know that there is such a difference with the right gear.

    posted in technical issues
