Hi there!
Thanks for sharing your innovative ideas for tactile overlays! It’s exciting to see how you’re pushing the boundaries of what’s possible with the controller.
To address your questions:
The Erae 2 features approximately 16,000 force sensors, but these sensors are grouped in clusters. Due to this clustering, the absolute minimum distance between two distinct touches that the hardware can differentiate is about 10mm. In the current software implementation, we’ve set this limit to 15mm to ensure reliable performance for finger-based interaction, which covers the needs of the vast majority of our users.
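To make the clustering point concrete, here is a minimal sketch (my own illustration, not the actual Erae firmware) of why two presses closer than the merge threshold register as a single touch. Coordinates are in mm; the 15 mm value matches the software limit described above, and `merge_touches` is an invented name:

```python
# Hypothetical illustration of the merge threshold: touch centroids
# closer than MERGE_DISTANCE_MM collapse into a single reported touch.
import math

MERGE_DISTANCE_MM = 15.0  # current software limit described above

def merge_touches(points, threshold=MERGE_DISTANCE_MM):
    """Greedily merge touch centroids closer than `threshold`,
    replacing each merged pair with its midpoint."""
    pts = [tuple(p) for p in points]
    merged = True
    while merged:
        merged = False
        for i in range(len(pts)):
            for j in range(i + 1, len(pts)):
                if math.dist(pts[i], pts[j]) < threshold:
                    mid = ((pts[i][0] + pts[j][0]) / 2,
                           (pts[i][1] + pts[j][1]) / 2)
                    pts[j] = mid
                    del pts[i]
                    merged = True
                    break
            if merged:
                break
    return pts

# Two small presses 10 mm apart collapse into one touch:
print(len(merge_touches([(0, 0), (10, 0)])))   # 1
# At 20 mm apart they stay distinct:
print(len(merge_touches([(0, 0), (20, 0)])))   # 2
```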
While we understand the appeal of reducing this distance for specialized use cases like yours, we don’t currently have plans to adjust this threshold. Doing so could introduce unexpected side effects (such as ghost notes) that might negatively impact the experience for 99% of our user base.
As for multi-touch polyphony, the Erae 2 is currently designed to detect up to 10 simultaneous touches in software.
We appreciate your enthusiasm for exploring alternative interfaces! If you’re prototyping with the current settings, we’d love to hear how your experiments progress.
Keep us posted on your developments, and don’t hesitate to reach out if you have more questions or insights to share!
Thanks! I can see where I can go with this info. I need to do a few experiments! It might be that as long as I don't need to pick up two adjacent small keys, there can be a fairly dense array of them.
If you've seen my API keys demo, those are pretty dense! They're based on a 2x2 LED array size.
Henry Lowengard
Hi Folks!
This is really for the developers! I'm thinking of making some tactile overlays to interact with the squishy or drum-ready Erae surfaces. The idea is to make some keyboard-like flexible hard keys with bumps on the bottom that simulate finger presses. Clever construction and/or 3D printing would let these keys "recover", either via the springiness of the Erae's surfaces or via the framework holding the keys.
That way, there could be something more naturally scaled for keyboard playing, with a little "give", as opposed to what I get from Keith McMillen MPE keyboards. Also, more complicated isometric keyboards could be constructed, with keys in hexagonal or other layouts. This would all interact with an API zone to decode the touches into events.
The questions are, then: how close can two touches be and still be differentiated? I've already seen some signals where two close touches are treated as one. I know from other touch-interface work that the actual signals you have to deal with are messy! A fingertip can put out a signal across dozens of spots, so deciding how to decode that into a set of multi-touch points is non-trivial. But with a different, non-finger-based interface, where the pressed surface is probably a circle of about 3-5 mm diameter, and you expect that, the radius of what counts as a single touch could be shrunk. I'm just saying that could be a settable parameter somehow in the API zone.
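The "cloud of spots per finger" decoding I'm describing can be sketched with a connected-component pass over the pressure grid. This is my own assumption of the kind of processing involved, not the actual Erae firmware; `find_blobs` and the threshold are invented for illustration:

```python
# Sketch: flood-fill adjacent sensels above a pressure threshold into
# blobs; each blob is one candidate touch.  A messy multi-spot finger
# signal becomes a small number of discrete touch regions.
def find_blobs(grid, threshold=10):
    """Return a list of blobs, each a list of (row, col) cells whose
    pressure exceeds `threshold` and that are 4-connected."""
    rows, cols = len(grid), len(grid[0])
    seen = [[False] * cols for _ in range(rows)]
    blobs = []
    for r in range(rows):
        for c in range(cols):
            if grid[r][c] > threshold and not seen[r][c]:
                stack, cells = [(r, c)], []
                seen[r][c] = True
                while stack:
                    y, x = stack.pop()
                    cells.append((y, x))
                    for dy, dx in ((1, 0), (-1, 0), (0, 1), (0, -1)):
                        ny, nx = y + dy, x + dx
                        if (0 <= ny < rows and 0 <= nx < cols
                                and not seen[ny][nx]
                                and grid[ny][nx] > threshold):
                            seen[ny][nx] = True
                            stack.append((ny, nx))
                blobs.append(cells)
    return blobs

# Two separated presses -> two blobs; a smeared press -> one blob.
two = [[0, 0, 0, 0, 0, 0],
       [0, 50, 0, 0, 40, 0],
       [0, 0, 0, 0, 0, 0]]
print(len(find_blobs(two)))            # 2
print(len(find_blobs([[0, 30, 30, 30, 0]])))  # 1
```

The interesting tuning question is exactly the one above: how far apart two blobs must be before the decoder stops merging them into one touch.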
My tests with pencil erasers as touching devices show that touches need to be at least about 1.5 cm apart to be differentiated as two separate touches. If that could optionally be set down to 0.5 or 0.75 cm, that would be useful. I can prototype with what exists, though.
You don't explicitly mention the actual size of the FSR array, which I take to be about 92 by 163, but if it's like other interfaces I know, you can figure out a touch's location more precisely from the little cloud of pressure values a touch produces, and get the touch's pressure value as a side effect. Fancier processing could also pick up the angle of the touch, but I bet that would slow things way down. That said, maybe the FSR really can't easily or speedily detect touches closer than around 1.5 cm.
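The sub-sensel localization idea can be shown in one dimension (again, my assumption of the standard technique, not confirmed Erae behavior): a pressure-weighted centroid of the cloud places the touch between sensels, and summing the same readings yields the pressure for free:

```python
# Weighted-centroid localization over a 1-D "cloud" of FSR readings.
# Sensel indices and pressure values here are made up for illustration.
readings = {3: 12, 4: 40, 5: 22}   # sensel index -> raw pressure

pressure = sum(readings.values())                       # side effect: total pressure
position = sum(i * p for i, p in readings.items()) / pressure

print(round(position, 3), pressure)   # 4.135 74  (between sensels 4 and 5)
```

The same weighted sum over a 2-D cloud gives a fractional (x, y) position finer than the sensel pitch, which is why the array size alone doesn't cap positional resolution.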
I'd also like to know the multi-touch polyphony limit. I'm thinking several people might be touching the Erae at once. (I wish)
The UTE [ https://hpi.zentral.zone/ute ] supports a lot of these kinds of keyboards if you want to see what I'm talking about. It might be fun to take polyphonic API messages and convert them either to direct synthesis, as I've been doing, or to MPE MIDI to play microtonal pieces in a natural way, or to fake being one of these other devices and let UTE (or other microtonal software) do the tuning.
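The MPE conversion step could look roughly like this. Everything here is a hypothetical sketch: the touch-to-note mapping is invented, `touch_to_mpe` is not a real Erae or UTE function, and I'm assuming a 48-semitone bend range (a common MPE default). Each touch gets its own member channel with a 14-bit pitch bend carrying the microtonal offset:

```python
# Hypothetical MPE conversion: one touch -> (pitch bend, note on) as
# raw MIDI bytes on the touch's own member channel (MPE lower zone).
PB_RANGE_SEMITONES = 48.0  # assumed per-note pitch-bend range

def touch_to_mpe(channel, midi_note_float, velocity=100):
    """Map a fractional MIDI note number to MPE messages.

    Returns (pitch_bend_msg, note_on_msg) as bytes; the fractional
    part of the note number is encoded as a 14-bit pitch bend
    centered on 8192."""
    base = round(midi_note_float)
    offset = midi_note_float - base           # fractional semitones
    bend = round(8192 + 8192 * offset / PB_RANGE_SEMITONES)
    bend = max(0, min(16383, bend))
    pitch_bend = bytes([0xE0 | channel, bend & 0x7F, bend >> 7])
    note_on = bytes([0x90 | channel, base, velocity])
    return pitch_bend, note_on

# A quarter tone above middle C (MIDI 60.5) on member channel 1:
pb, on = touch_to_mpe(1, 60.5)
print(on.hex(), pb.hex())
```

Sending the bend before the note-on, as above, is the usual MPE ordering so the note starts at the microtonal pitch instead of gliding to it.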
Getting real numbers for these metrics will help, and of course, a way to customize them would also be helpful.