MIDI to Hz, and Hz to MIDI formulas
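For reference, the standard 12-tone equal temperament conversion, with A4 = MIDI note 69 = 440 Hz (the same math Pd's [mtof] and [ftom] objects implement), sketched in Python:

import math

def mtof(m):
    # MIDI note number (may be fractional) -> frequency in Hz
    return 440.0 * 2.0 ** ((m - 69) / 12.0)

def ftom(f):
    # frequency in Hz -> (possibly fractional) MIDI note number
    return 69 + 12 * math.log2(f / 440.0)

print(mtof(60))   # middle C: ~261.63 Hz
print(ftom(440))  # 69.0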
The circles we calculated the orbits to be each proved slightly short of a true, pure circle, thus returning the satellites to the ground. Pi can more precisely be evaluated as 3.1446055, and not the perpetuation of imperfection and the maintenance of ignorance still taught today.
Sorry to be blunt, but what you are stating is awfully incorrect, and the way you proclaim to know this "hidden truth" is really awfully ignorant. So you are stating that there is a mistake in the 3rd decimal digit of pi... It so happens that this decimal place was precisely calculated centuries ago, and the result still (obviously) stands today. There are tons of different approaches for getting correct digits of pi, such as using certain convergent series, as mentioned by @seb-harmonik.ar. One can manually calculate the 3rd digit of pi using these types of series, and this result has been known for more than a millennium (in the year 480, the value known was 3.1415926, which is way more precise than what you state; by the early 18th century, we knew 100 digits of pi, none of which have been "corrected" later on; see: https://en.wikipedia.org/wiki/Chronology_of_computation_of_π). There is no way that in the 1960s pi had to be redefined as 3.1446... instead of 3.1415...; that is just pure ignorance and witchery.
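To make the series point concrete: Machin's formula from 1706, pi = 16*atan(1/5) - 4*atan(1/239), evaluated with exact rational arithmetic, pins down far more than the 3rd decimal with just a handful of terms. A minimal sketch in Python:

from fractions import Fraction

def atan_series(x, terms):
    # Taylor series: atan(x) = x - x^3/3 + x^5/5 - ...
    return sum((-1) ** k * x ** (2 * k + 1) / (2 * k + 1) for k in range(terms))

pi = 16 * atan_series(Fraction(1, 5), 10) - 4 * atan_series(Fraction(1, 239), 5)
print(float(pi))  # 3.141592653589793 -- the 3rd decimal is 1, not 4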
But reading about your number on the internet, I found that this 3.1446055 appears in several articles about satellites, pyramids and all that "dark hidden world that nobody tells you about, open your eyes, sheep people controlled by the Illuminati" kind of talk, but not a single time on a serious mathematical or physics website, journal or wiki. Sorry, but the world is not a dark place controlled by people with magic powers trying to keep you in the shadows... simply do not use random blogs as your source of information.
The assumption of equal temperament proves a repetitious vexation; it denotes ignorance, apathy, laziness, and/or unfamiliarity with the math of music. Worse, this assumption proves far too common among engineers, mathematicians, scientists and software programmers, which then deprives us musicians of usable tools; the need for the MIDI Tuning Standard, Scala, etc. is the vacuum such stupidity engenders.
As for the "stupidity" of using a tempered system: that is just as stupid as using a non-tempered one, i.e. it is simply a convention, upon which we have built centuries of music. Western music has been based on it for a long time (although contemporary composers, myself included, often choose to use micro-intervals), and the decision to build MIDI around it is as logical as it gets when you consider what its designers were aiming at. Or should they have gone through all sorts of trouble to incorporate every special case into the MIDI protocol (which is a Western creation), such as the possibility of having Indian micro-tonal scales? But wait, then what about gamelan scales? No wait, what about <insert ethnomusic genre here> scales?
From a practical point of view, you can still use MIDI cents, and you can also directly use frequencies if you want to precisely define the pitch of a sound. You can compose music in 12 tones, 27 tones or 193 tones if you wish. The tools are here, and Pd allows you to do whatever you want with them (I myself have composed works using Pd and MIDI that deal with microtones and microtonal glissandi in real time).
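For anyone who wants to try this, the arithmetic behind an arbitrary equal division of the octave (the 27- or 193-tone cases above) is one line; a sketch in Python, where the reference pitch is an arbitrary choice of mine:

def edo_freq(step, divisions=27, ref_hz=261.6256):
    # frequency of `step` steps above ref_hz in a `divisions`-tone equal division of the octave
    return ref_hz * 2.0 ** (step / divisions)

print(edo_freq(1))   # one 27-EDO step above middle C: ~268.4 Hz
print(edo_freq(27))  # one octave up: ~523.25 Hz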
I hope you won't be personally offended by my message, but I can't read this type of statement, which tries to propagate pseudo-scientific stuff, without writing a strong reply.
Composing Indian Raga using Pd
A raga is a melodic recipe for a mood. In Hindustani (North Indian) classical music, each raga has certain moods associated with it, and usually a specific time of day and/or season in which it is meant to be played. A raga could be described as a "super scale": a set of notes in ascending (arohi) and descending (avarohi) order, sometimes including prescribed alternate or zigzag routes (vakra chal); a hierarchy of note importance, including king notes (vadi) and prime minister notes (samvadi) a fourth or fifth apart; and a key phrase that shows the heart of the movement of the raga (pakar).
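For programmers, the "recipe" could be modeled as a small data structure. A sketch in Python; the swara values below are purely illustrative placeholders, not an actual raga:

raga = {
    "arohi":   ["S", "R", "G", "P", "N", "S'"],      # ascending route
    "avarohi": ["S'", "N", "D", "P", "G", "R", "S"], # descending route
    "vadi":    "G",                    # "king" note
    "samvadi": "N",                    # "prime minister" note, a 4th/5th away
    "pakar":   ["G", "R", "S", "N"],   # key identifying phrase
    "time_of_day": "evening",          # prescribed performance time
}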
This ancient system is both an art and a science of how musical notes create certain moods. Western music also recognizes that note order and hierarchy create a mood, with some theory texts noting that songs in major keys tend to communicate happiness while songs in minor keys convey sadness. But the ragas show precisely how to create a specific mood. The recipe for each raga holds the key to an unlimited number of potential melodies, each perpetuating the mood or rasa contained in the raga, but each a unique work of art.
Study of this system can be of great benefit to any composer or improvising musician.
Ravikiran has composed and presented several concerts featuring melharmony with top-calibre artistes from many parts of the world. He has arranged several melharmonic pieces for performance by full-sized, medium-sized or chamber orchestras, some of them in collaboration with composers such as Prof. Robert Morris and Charles Demuynck (Toronto, Canada).
Ravikiran's melharmonic compositions endeavour to enrich both melodic and harmonic systems. Flavoured with exciting and often highly original rhythmic patterns, his compositions have started blazing a new trail in world music. One of his creations, Ujjwal, a full-fledged, first-of-its-kind 45-minute Melharmonic Concerto, was presented at the Swar Utsav Festival at India Gate, New Delhi, to an audience of close to 20,000 people.
Melharmonic concerts in cities such as Boston, London, New Delhi, Toronto, Mumbai, San Jose, Austin (TX), Chennai, Bangalore and Tulsa (OK), at times in collaboration with top-calibre artistes and groups from diverse parts of the world, have also won plaudits.
Strange MIDI issue (delayed MIDI input data?)
Explanation of my problem:
Config: Ableton + TouchOSC + loopMIDI (also tried MIDI Yoke) (+ Pd, of course).
I use MIDI clips sending CC data (1 CC per clip, only 1 channel used for MIDI output, to avoid wrong data), so I can use the CC value to drive play-position cursors in TouchOSC.
In Pd I set up something like this:
[ctlin cc# ch#]
|
[number box]   <-- for monitoring purposes; easier to read than [print]
|
[send /fader $1(
My play-position faders sometimes work perfectly, and sometimes they are glitchy and jittery.
At first I thought it was caused by the large number of OSC messages sent by Pure Data (I planned a trick to filter values after [ctlin], so that redundant values are not routed and only changed values are sent as OSC messages, but I got lazy and investigated elsewhere first; see the sketch below).
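For the record, Pd's [change] object does exactly the filtering described above: it only passes a value on when it differs from the previous one. The logic, sketched in Python:

last_value = None

def send_osc(address, value):
    print(address, value)  # stand-in for the actual OSC send

def on_cc(value):
    # forward the CC only when it changes, like Pd's [change]
    global last_value
    if value != last_value:
        last_value = value
        send_osc("/fader", value / 127)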
I put a number box after [ctlin] and found that the values are delayed. Sometimes cleanly delayed: everything arriving late. Sometimes warped: everything OK, but with some intermittent lower values. And sometimes, for no reason, it works perfectly. The delays are several seconds long.
In Ableton I can see the cursor moving correctly all along the MIDI clip. If I turn off MIDI out from Ableton during a delay, I still see MIDI activity for a few seconds in Pd.
Other data is sent and received on time (OSC, MIDI out).
I only use one MIDI in and one MIDI out in Pd and Ableton, via two loopMIDI virtual cables. Audio is off, and the delay setting is minimal (1 ms).
What could cause this? It's either between Ableton and Pd, or inside Pd.
Thanks in advance.
TouchOSC > Pure Data > Cubase SX3, and vice versa (iPad)
So, tonight I sussed out two-way comms from TouchOSC on my iPad to Cubase SX3, and thought I'd share this for those who were struggling to get their DAW to talk back to the TouchOSC app. These patches translate OSC to MIDI, and MIDI back to OSC.
Basically, when talking back to TouchOSC, your patch needs to use:
[ctlin midictrllr# midichan] > [/ 127] > [send /osc/controller $1( > [sendOSC] > [connect the.ipa.ddr.ess port(
So, in Pd:
[ctlin 30 1] listens to MIDI controller 30 on MIDI channel 1, and gets a value between 0 and 127. This value is divided by 127, as TouchOSC expects a value between 0 and 1. We then specify the OSC controller to send it to, together with the result of the maths ($1), and send the complete OSC packet to the specified IP address and port.
This can be seen in LogicPad.vst-2-osc.pd
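The scaling in both directions is lossless, by the way; a quick sketch in Python:

def midi_to_osc(cc_value):
    # MIDI CC 0..127 -> TouchOSC 0.0..1.0
    return cc_value / 127.0

def osc_to_midi(fader):
    # TouchOSC 0.0..1.0 -> MIDI CC 0..127
    return round(fader * 127)

# rounding makes the round trip exact for every CC value
assert all(osc_to_midi(midi_to_osc(v)) == v for v in range(128))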
All files can be found in http://www.minimotoscene.co.uk/touchosc/TouchOSC.zip (24Mb)
This archive contains:
MidiYoke (configure via Windows > Start > Control Panel)
touchosc-default-layouts (just in case you don't have LogicPad)
Cubase SX3 GenericRemote XML file (to import).
Cubase SX3 project file.
2x pd files... osc-2-vst, and vst-2-osc. Open both together.
In PD > Midi settings, set its MIDI input and MIDI output to different MIDI Yoke ports (e.g. output to 3, input to 2). In Cubase > Device Settings > Generic Remote, set input to 3 and output to 2.
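In other words, the two MIDI Yoke cables are cross-wired, one per direction:

Pd MIDI output --> MIDI Yoke 3 --> Cubase Generic Remote input
Pd MIDI input  <-- MIDI Yoke 2 <-- Cubase Generic Remote output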
Only PAGE ONE of the LogicPad TouchOSC layout has been done in the vst-2-osc file.
Am working on the rest and will update once complete.
As the layout was designed for Logic, some functions don't work as expected, but most do, or have been remapped to do something else. Will have a look at those once I've gotten the rest of the PD patch completed.
The patches were possibly not done in the most efficient way... sorry. This is a case of function over form, but if anyone wants to tweak and share a more efficient way of doing it, that would be appreciated!
Hope this helps some of you...
I conceptualized this the other day. The main reason I wanted to make this is that I'm a little tired of how complicated Ableton Live is. I wanted to just be able to right-click parameters and tell them to follow MIDI tracks.
The big feature in this abstract is a "Midi CC Module Window" that contains an unlimited (or potentially very large) number of Midi CC Envelope Modules. Each Midi CC Envelope Module contains Midi CC Envelope Clips. These clips hold a waveform that is plotted on a tempo-divided graph. The waveform is played in a loop and synced to the tempo according to how long the loop is. Only one clip can be playing per module. If a parameter is right-clicked, you can choose "Follow Midi CC Envelope Module 1", and the parameter will then follow the envelope that is looping in "Midi CC Envelope Module 1".
Midi note clips function in the same way. Every instrument will be able to select one Midi Notes Module. If you right-clicked "Instrument Module 2" in the "Instrument Module Window" and selected "Midi input from Midi Notes Module 1", the notes coming out of "Midi Notes Module 1" would play through the single virtual instrument you placed in "Instrument Module 2".
If you want the sound to come out of your speakers, navigate to the "Bus" window. Select "Instrument Module 2" from a drop-down check-off menu by right-clicking "Inputs". While still in the "Bus" window, look at the "Output" window and check the box that says "Audio Output". Now the sound is coming through your speakers. Check off more Instrument Modules or Audio Track Modules to get more sound coming through the same bus.
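For what it's worth, the bus routing described above could be modeled with a structure like this; a sketch in Python, where the module names come from the post but the code itself is hypothetical:

class Bus:
    def __init__(self):
        self.inputs = set()        # modules feeding this bus
        self.audio_output = False  # the "Audio Output" checkbox

    def toggle_input(self, module):
        # right-click "Inputs" and check/uncheck a module
        self.inputs ^= {module}

bus = Bus()
bus.toggle_input("Instrument Module 2")
bus.audio_output = True  # sound now reaches the speakers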
Turn the "Aux" on to put all audio through effects.
Work in "Bounce" by right-clicking and checking off Input Modules such as "Input Module 3". Then press record and stop. Copy and paste your clip to an Audio Track Module, the "Sampler", or a Side Chain Audio Track Module.
Work in "Master Bounce" to produce audio clips by recording whatever is coming through the system for everyone to hear.
Chop and screw your audio in the sampler with highlight and right click processing effects. Glue your sample together and put it in an Audio Track Module or a Side Chain Audio Track Module.
Use the "Threshold Setter" to perform long linear modulation. Right click any parameter and select "Adjust to Threshold". The parameter will then adjust its minimum and maximum values over the length of time described in the "Threshold Setter".
The "Execution Engine" is used to make sure all changes happen in sync with the music.
E.g.: if you selected a subdivision of 2 and a length of 2, it would take four quarter beats (starting from the next quarter beat) for the change to take place. So if you're somewhere in the 'a' of beat 1 (1-e-and-a), you will have to wait for 2, 3, 4 and 5 to pass, and your change will happen on 6.
E.g.: if you selected a subdivision of 1 and a length of 3, you would have to wait 12 quarter beats, starting on the next quarter beat.
E.g.: if you selected a subdivision of 8 and a length of 3, you would have to wait one and a half quarter beats, starting on the next 8th note.
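Putting the three examples together, the rule appears to be: the change is armed on the next quarter beat (or the next subdivision tick, if that is finer), then waits `length` notes of 1/`subdivision` duration. A sketch of that inferred rule in Python:

from math import ceil

def delay_in_quarters(subdivision, length, pos):
    # pos = current position in quarter beats;
    # subdivision 1 = whole notes, 2 = half notes, 8 = eighth notes
    note = 4.0 / subdivision             # one 1/subdivision note, in quarter beats
    anchor = min(1.0, note)              # arming grid: quarter beat, or finer
    start = ceil(pos / anchor) * anchor  # next arming point
    return (start - pos) + length * note

print(delay_in_quarters(2, 2, 1.75))  # 4.25 -> armed on beat 2, change lands on 6
print(delay_in_quarters(1, 3, 1.0))   # 12.0 quarter beats
print(delay_in_quarters(8, 3, 1.0))   # 1.5 quarter beats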
Can PdVst send OUT MIDI data?
I installed PD because I wanted to figure out how to convert CC into Notes. It was amazingly simple after reading a few help files.
So... I then discovered PdVst and set it up. It works fine with the two demos that come with it, and after making a VST out of my little CC-to-note patch I found that it receives MIDI just fine, but it doesn't seem to send MIDI data OUT.
If I put my plugin on one Ableton Live track and send CCs to it, I can see the number boxes connected to [ctlin] going up and down as I turn a knob; but if I then put an instrument on another track and tell it to receive MIDI from the first track, I get nothing.
This patch works fine if I run it in Pure Data and send its MIDI output to Ableton Live using MIDI Yoke, but it doesn't seem to output MIDI when I use it in PdVst.
Is PdVst capable of sending out MIDI?
PD, TouchOSC, Traktor, lots of problems =[
I don't have a clear enough picture of how your setup is configured to determine where the problem might be, so I will ask a bunch of questions...
1) Midi mapping problem
TouchOSC on the iPad is configured and communicating with Pd on Windows, yes?
Are you using the mrpeach OSC objects in Pd?
Did you install MIDI Yoke as well as MIDI-OX?
Is there another MIDI controller that could be overriding the MIDI sent from Pd to Traktor?
Are you using MIDI-OX to watch the MIDI traffic to Traktor in real time?
I don't use TouchOSC (I'm using MRMR), but you mentioned that the Pd patch is being generated? Can you post the patch it's generating?
2) Midi Feedback
It sounds like you want the communication to be bidirectional; this is not as simple as MIDI feedback. Remember, TouchOSC communicates using OSC, while Traktor accepts and sends MIDI. So what you have to do is have Traktor output the MIDI back to Pd, and have Pd send the OSC to TouchOSC to update the sliders.
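Outside of Pd, the same return path can be sketched in Python with the mido and python-osc libraries; the port name and IP address here are made-up placeholders:

import mido
from pythonosc.udp_client import SimpleUDPClient

client = SimpleUDPClient("192.168.1.10", 9000)  # iPad's IP and TouchOSC's listening port

with mido.open_input("MIDI Yoke 2") as port:    # the virtual cable carrying Traktor's MIDI out
    for msg in port:
        if msg.type == "control_change":
            # scale 0..127 back to TouchOSC's 0.0..1.0 and update the matching slider
            client.send_message(f"/1/fader{msg.control}", msg.value / 127)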
I would seriously consider building your PD patch by hand and not using the generator. You'll learn more that way and will be able to customize it better to your needs.
How do I get MIDI out from Pd??
I am new to Pd, having had some encouraging successes, and some totally frustrating FAILS with no understanding of why.
1. I am trying to save plain .txt as .pd, just like all the forums say, only to get "error". Pd does not like the X, it seems. How does one save these .txt files as working .pd files?
2. I am trying to use the [fiddle~] object to do pitch-to-MIDI. I downloaded the only .pd file I could find, http://www.imartron.com/pd/rep2/rep2.shtml (again, how does one save .pd from .txt on a Mac??).
I have the audio from my soundcard ([adc~], channel 1) as the inlet, and it works.
The MIDI out test works.
I see MIDI activity on my MIDI hub when I set the outlet to [noteout], but my synth does not respond. Any idea how one sets things like MIDI channels, etc.? This synth responds to just about any MIDI data you throw at it, and the hub confirms it's coming out, soooo.......
MIDI YOKE beginner
Hello, I stumbled on Pure Data today and loved the idea of controlling Ableton via an iPod touch. So now I've been sitting here for two hours trying to get this to work.
I've downloaded Pure Data and the app for the iPod. So far so good.
When I press buttons on the iPod, Pure Data recognises it. Going great.
Now, and I should point out I'm on a Windows computer, I have to connect it to Ableton. No IAC bus MIDI stuff, since Windows doesn't have that, so MIDI Yoke it is.
I have installed MIDI-OX (I couldn't find MIDI Yoke, so... is this the same program?),
and I think... I understand it. I set MIDI out in Pd to MIDI Yoke 1, and set MIDI Yoke 1 in to MIDI Yoke 1 out in MIDI-OX. In Ableton I then highlight everything (track, sync and remote) on both MIDI Yoke 1 out and in (because I don't know what to highlight, so...).
Now... nothing. I think the problem is that MIDI-OX doesn't see the MIDI out from Pd, but I'm not sure. I don't see any changes at all in MIDI-OX when I press the buttons on the iPod.
I'm sorry for all this rambling; I have googled as much as I can, but I haven't found any solution. I did find a tutorial describing how to set this up, http://www.gadgetcracker.com/2009/06/how-to-iphone-touchosc-pure-data-ableton-live/, but the link just 404s.
Please help me
PdVst (effect plugin) notein problem
Hi! I'm new to this forum =)
I created a Pd patch that takes MIDI notes to place several narrow band-pass filters, in order to extract tone-defining sounds from noisy sounds. This effect requires both audio and MIDI input. I want to implement this Pd patch as a VST plugin in my DAW (in this case FL Studio). I have therefore downloaded PdVst, applied it to my patch, and successfully run it as an effect plugin in my DAW. But one crucial thing is not working: the sending of MIDI notes from the DAW (or directly from a MIDI keyboard, for that matter) to the [notein] object in my Pd patch!* Maybe PdVst effect plugins just aren't designed to receive MIDI notes via the [notein] object? If so, what would you suggest as an alternative (method)? BTW, I've tried to create a simple PdVst instrument (as opposed to an effect), and it could successfully receive MIDI notes from the DAW, but it didn't have any audio input...
Any help is highly appreciated =)
Here's a (slightly edited) screenshot of my DAW and PdVst with the patch:
* I know the VST effect standard generally allows effects to receive MIDI notes this way. For example, the VST effect plugin GSnap (à la AutoTune) can tune any (monophonic) sound input to the key played on a MIDI keyboard.
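For context (not an answer to the PdVst question): the filter math a patch like the one described typically uses is a band-pass biquad centred on each incoming note. A sketch in Python using the RBJ cookbook band-pass, with assumed Q and sample rate:

import math

def note_to_bandpass(midi_note, q=30.0, fs=44100.0):
    # biquad band-pass coefficients (constant 0 dB peak gain), centred on a MIDI note
    f0 = 440.0 * 2.0 ** ((midi_note - 69) / 12.0)  # note -> Hz, like Pd's [mtof]
    w0 = 2.0 * math.pi * f0 / fs
    alpha = math.sin(w0) / (2.0 * q)               # higher Q -> narrower band
    a0 = 1.0 + alpha
    b = (alpha / a0, 0.0, -alpha / a0)                        # feed-forward
    a = (1.0, -2.0 * math.cos(w0) / a0, (1.0 - alpha) / a0)   # feedback (normalized)
    return b, a

print(note_to_bandpass(69))  # coefficients for a band-pass centred on A440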