
"Rate of change" output

It would be great to have an extra output, derived from the rate of change of sliding speed and pressure.

Hello!

What do you mean by extra output?


There are already 2 USB outputs and also 2 MIDI TRS outputs.

You can also route any element to the CV-Gate output.

I don’t mean a physical output! A MIDI CC output option that can be enabled via Erae Lab.

You can enable CC on the X or Y axis of any element.

And use CC 74 for polyphonic behavior as well.

It does not reproduce the Glissando element, but it can work in parallel to replicate a very similar gesture (though limited to the size of the element).


What I mean is not the actual MIDI CC value but its first derivative - the rate of change. When my finger is resting on one position it outputs zero, and when I move it, the value increases depending on how fast I move. It would be necessary to define a maximum speed that maps to the full output (the upper limit being set by the scanning rate of the sensors), and maybe an interpolation parameter to get smooth readings. This would enable a lot of creative possibilities!
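Roughly, something like this is what I have in mind (a minimal Python sketch; the 0.0-1.0 input range and the max_speed / smoothing parameters are illustrative assumptions, not anything the Erae actually exposes):

```python
import time

class RateOfChange:
    """Sketch of a 'rate of change' output for one touch dimension.

    value:     the raw reading (e.g. X position or pressure), assumed 0.0-1.0
    max_speed: change per second that maps to the maximum output
    smoothing: 0.0 = raw derivative, closer to 1.0 = heavier interpolation
    """

    def __init__(self, max_speed=4.0, smoothing=0.7):
        self.max_speed = max_speed
        self.smoothing = smoothing
        self._last_value = None
        self._last_time = None
        self._output = 0.0

    def update(self, value, now=None):
        now = time.monotonic() if now is None else now
        if self._last_value is None:
            self._last_value, self._last_time = value, now
            return 0.0
        dt = max(now - self._last_time, 1e-6)        # guard against zero dt
        speed = abs(value - self._last_value) / dt   # first derivative
        raw = min(speed / self.max_speed, 1.0)       # clamp at the defined max speed
        # one-pole low-pass = the "interpolation parameter" for smooth readings
        self._output = self.smoothing * self._output + (1.0 - self.smoothing) * raw
        self._last_value, self._last_time = value, now
        return self._output   # 0.0 while the finger rests, rises the faster it moves
```

Each sensor scan would call update() with the new reading and send int(127 * output) as the extra CC.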

Ok. 

We will discuss this with the dev team

But this kind of advanced feature is much easier to do in MaxMSP or an external tool, as the settings are always very specific to each user.

Do you have an example of a device that outputs similar values?

I don't think there is another device that does this. But please hear me out: I think it is a missed opportunity if you don't do it.


Maybe look at it like this:

MPE is an attempt to bring the ways in which a player interacts with a digital instrument closer to how a player interacts with an acoustic instrument, by adding more dimensions to the old concept of "key down, velocity".


Let's take a second to imagine: what would an "ideal" MPE controller be?

It would allow the player to access the infinite possibilities of digital sound design AND the infinite possibilities of how a player manipulates an acoustic instrument, and let the player recombine the two in novel ways that are impossible in the physical world, while maintaining the feeling of an acoustic instrument. This, to me, is why MPE is so interesting and exciting.


Now let's take that old equation: F = m*a. That's what happens when a sound is created physically, by transferring force from one object to another. Think about that.

It means force = mass * acceleration, not force = mass * velocity. In other words: when hitting a drum with a stick, it's not the speed that matters but the acceleration applied to the stick. Or a better example: when a violin bow moves over a string, the variation in speed is fundamental to the resulting sound.


While velocity is a good standard for lots of applications, it is just a readout of speed at a single point in time. Real physical objects also react to acceleration. On the Erae you now have all the sensors you would ever need to mimic this real-world behaviour of acoustic instruments, but that would require acceleration data, which could be derived from pressure and gliding gestures.


I am pretty sure that the way you determine velocity on the Erae right now is already by comparing the time difference between two samples, because that's the industry standard.
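If that is the case, acceleration would just be one more finite difference on top of the same sampled data. A small sketch (the 1 kHz scan rate in the example is only an assumed figure, not the Erae's actual rate):

```python
from collections import deque

def estimate_motion(samples, dt):
    """Estimate speed and acceleration from the three most recent
    position (or pressure) samples, taken dt seconds apart."""
    if len(samples) < 3:
        return 0.0, 0.0
    x0, x1, x2 = samples[-3], samples[-2], samples[-1]
    velocity = (x2 - x1) / dt                      # first-order difference
    acceleration = (x2 - 2.0 * x1 + x0) / (dt**2)  # second-order difference
    return velocity, acceleration

# With an assumed 1 kHz scan rate: steady motion -> ~100 units/s, acceleration ~0
history = deque([0.10, 0.20, 0.30], maxlen=3)
print(estimate_motion(history, dt=0.001))
```

A bowed-string or brushed-drum model would respond to the second value, not the first.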


Nowadays physical modeling synthesis is becoming better and better, and the rise of MPE interfaces is a symbiotic development. BUT there is no MPE controller (that I am aware of) which does this, so you could be the first company to do it! And it would open many new doors, because it is perfectly in line with the spirit of MPE, which is combining the digital and the physical in new ways.


Cheers! And thank me later :P


I can offer a direct use case: brush drumming. When brushing, the acceleration of the brush across the head produces changes in amplitude and tone. In fact it's very much like using an expressive controller with a SWAM instrument, in which changing breath or motion is what actually actuates and modulates the blow or bow.

I can emulate this with an iPad controller called Pen2bow, which lets you sweep the surface of the pad and converts that motion to MPE data for an appropriate instrument. I've also demonstrated for myself that I can use Pen2bow to emulate brush sweeping by pushing its data to a simple modular setup, using noise as a snare sound, with the velocity of the sweep - the rate of change of the pen moving across the surface - controlling the amplitude of the noise. No envelope generator... just that motion.

When considering the Erae II, this is one of the first things I've been wondering about. Seeing this thread, it's evident that the device does not process the speed with which you sweep the surface, so I would indeed have to interject a processing app between the Erae and my drum-brush sound generator... perhaps even Superior Drummer. The actual process of sensing and providing this continuous differential should be fairly easy, not a great added burden on the current Erae II operating system, and it would be an excellent addition to its MPE capability.
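In the meantime, a tiny bridge script can sit between the Erae II and the sound generator and derive that differential externally. Below is a rough sketch using the Python mido library; the port names, the source CC (sweep position) and the target CC (noise amplitude) are all assumptions you would adapt to your own routing:

```python
import time
import mido

IN_PORT   = "Erae II"       # assumed input port name - adjust to your setup
OUT_PORT  = "Brush Synth"   # assumed port feeding the sound generator
SOURCE_CC = 74              # assumed CC carrying the sweep position
TARGET_CC = 1               # assumed CC controlling noise amplitude
MAX_SPEED = 400.0           # CC steps per second that map to full amplitude

last_value, last_time = None, None

with mido.open_input(IN_PORT) as inp, mido.open_output(OUT_PORT) as out:
    for msg in inp:
        if msg.type != "control_change" or msg.control != SOURCE_CC:
            out.send(msg)                 # pass everything else straight through
            continue
        now = time.monotonic()
        if last_value is not None:
            dt = max(now - last_time, 1e-6)
            speed = abs(msg.value - last_value) / dt      # rate of change of the sweep
            level = min(int(127 * speed / MAX_SPEED), 127)
            out.send(mido.Message("control_change", channel=msg.channel,
                                  control=TARGET_CC, value=level))
        last_value, last_time = msg.value, now
```

A real version would also decay the output back to zero when messages stop arriving (a resting finger sends no new CCs), but it shows how little processing the feature actually needs.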

Yes! Please make it happen! It adds another dimension to the controller, which is actually quite huge given that there are currently only 3 dimensions available (pressure, position and velocity), and especially given how little effort it would take to implement. Adding the area of contact, as I proposed in another thread, would complete the list. Or at least I can’t think of another way of interacting with a real instrument that could be mimicked with the controller.