Hi Folks!

This is really for the developers! I'm thinking of making some tactile overlays to interact with the squishy or drum-ready Erae surfaces. The idea is to make some keyboard-like, flexibly mounted hard keys with bumps on the bottom that simulate finger presses. Clever construction and/or 3D printing would let these keys "recover" either via the springiness of the Erae's surfaces or via the framework holding the keys.
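To make the overlay idea concrete, here's the kind of throwaway Python I'd use to lay out bump centers for a hexagonal version before printing. The surface dimensions and key spacing are placeholders I made up, not anything from Embodme's specs, so measure your own unit first:

```python
# Rough sketch: generate center positions for a hexagonal overlay of "bump" keys,
# e.g. to feed into OpenSCAD or a slicer. All dimensions are my own placeholders,
# NOT official Erae measurements.

import math

SURFACE_W_MM = 260.0   # assumed active width (placeholder)
SURFACE_H_MM = 150.0   # assumed active height (placeholder)
KEY_PITCH_MM = 18.0    # center-to-center distance between neighboring keys

def hex_key_centers(width=SURFACE_W_MM, height=SURFACE_H_MM, pitch=KEY_PITCH_MM):
    """Return (x, y) centers for a hexagonally packed lattice that fits the surface."""
    centers = []
    row_height = pitch * math.sqrt(3) / 2      # vertical distance between rows
    row = 0
    y = pitch / 2
    while y < height - pitch / 2 + 1e-9:
        x_offset = (pitch / 2) if (row % 2) else 0.0   # stagger every other row
        x = pitch / 2 + x_offset
        while x < width - pitch / 2 + 1e-9:
            centers.append((round(x, 2), round(y, 2)))
            x += pitch
        row += 1
        y += row_height
    return centers

if __name__ == "__main__":
    pts = hex_key_centers()
    print(len(pts), "keys, first few:", pts[:4])
```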
With an overlay like that, there could be something more naturally scaled for keyboard playing, with a little "give", as opposed to what I get from Keith McMillen MPE keyboards. More complicated isometric keyboards could also be constructed, with keys in hexagonal or other layouts. All of this would interact with an API zone that decodes the touches into events.

The question, then, is: how close can two touches be and still be differentiated? I've already seen some signals where two close touches are treated as one. I know from other touch-interface work that the actual signals you have to deal with are messy! A fingertip can put out a signal across dozens of spots, so deciding how to decode that into a set of multi-touch points is non-trivial. But with a different, non-finger-based interface, where the pressed area is probably a circle about 3–5 mm in diameter, and you expect that, the radius of what counts as a single touch could be shrunk. I'm just saying that could be a settable parameter somehow in the API zone.

My tests with pencil erasers as touching devices show that touches need to be at least about 1.5 cm apart to be differentiated as two separate touches. If that could optionally be set down to 0.5 or 0.75 cm, that would be useful. I can prototype with what exists, though.

You don't explicitly mention the actual size of the FSR array, which I take to be about 92 by 163, but if it's like other interfaces I know, you can figure out a touch's location more precisely from the little cloud of pressure values that a touch produces, and get a touch's pressure value as a side effect (there's a rough sketch of what I mean below). Fancier processing could pick up the angle of the touch as well, but I bet that would slow things way down. That said, maybe the FSR really can't easily or quickly detect touches closer than about 1.5 cm.

I'd also like to know the multi-touch polyphony limit. I'm thinking several people might be touching the Erae at once. (I wish)
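For what it's worth, here's a toy Python sketch of what I mean by decoding the cloud of pressure values: threshold a frame, group neighboring active cells into blobs, and take a pressure-weighted centroid (plus summed pressure) per blob, with a settable minimum separation. This is only my guess at the kind of processing involved, not a claim about how the Erae firmware actually works, and the frame size and units are just my estimates:

```python
# Toy decoder: pressure frame -> list of touch points.
# threshold and min_sep_cells are the tunable knobs I'm asking about.

from collections import deque

def decode_touches(frame, threshold=8, min_sep_cells=4):
    """frame: 2D list of pressure values (rows x cols). Returns touch dicts."""
    rows, cols = len(frame), len(frame[0])
    seen = [[False] * cols for _ in range(rows)]
    touches = []
    for r in range(rows):
        for c in range(cols):
            if frame[r][c] < threshold or seen[r][c]:
                continue
            # flood-fill one blob of connected above-threshold cells
            blob, queue = [], deque([(r, c)])
            seen[r][c] = True
            while queue:
                br, bc = queue.popleft()
                blob.append((br, bc, frame[br][bc]))
                for dr, dc in ((1, 0), (-1, 0), (0, 1), (0, -1)):
                    nr, nc = br + dr, bc + dc
                    if 0 <= nr < rows and 0 <= nc < cols and not seen[nr][nc] \
                            and frame[nr][nc] >= threshold:
                        seen[nr][nc] = True
                        queue.append((nr, nc))
            total = sum(p for _, _, p in blob)
            cy = sum(br * p for br, _, p in blob) / total   # pressure-weighted row
            cx = sum(bc * p for _, bc, p in blob) / total   # pressure-weighted col
            touches.append({"x": cx, "y": cy, "pressure": total})
    # merge centroids closer than min_sep_cells -- a crude stand-in for the
    # "radius of what counts as a touch" parameter
    merged = []
    for t in touches:
        for m in merged:
            if (t["x"] - m["x"]) ** 2 + (t["y"] - m["y"]) ** 2 < min_sep_cells ** 2:
                m["pressure"] += t["pressure"]
                break
        else:
            merged.append(t)
    return merged
```

With real frames from an API zone, the same routine plus the min_sep_cells knob is how I'd measure the "how close can two touches be" question empirically.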
The UTE [ https://hpi.zentral.zone/ute ] supports a lot of these kinds of keyboards, if you want to see what I'm talking about. It might be fun to take polyphonic API messages and convert them either to direct synthesis, as I've been doing, or to MPE MIDI to play microtonal pieces in a natural way, or even to fake being one of these other devices and let UTE (or other microtonal software) do the tuning. Getting real numbers about these metrics will help, and of course, getting a way to customize them would also be helpful.
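To show what I mean by the MPE route, here's a rough Python sketch that turns a decoded touch into per-channel MIDI bytes: one member channel per note, a pitch-bend message to hit the exact cents, then a note-on. The key-to-cents mapping (31-EDO here) and the per-note bend range of ±48 semitones are just my assumptions for the example, not anything the Erae documentation promises:

```python
# Sketch: decoded touch -> MPE-style MIDI bytes for microtonal playing.

BEND_RANGE_SEMITONES = 48.0   # assumed per-channel pitch-bend range (common MPE default)

def touch_to_mpe_bytes(midi_note, cents_offset, channel, velocity=100):
    """Return raw MIDI byte tuples: (pitch bend, note on) for one touch.
    channel: 1..15 (MPE member channels, zero-based, with channel 0 as master)."""
    # 14-bit pitch bend: 8192 is center, full scale is +/- BEND_RANGE semitones
    bend = int(round(8192 + (cents_offset / 100.0) / BEND_RANGE_SEMITONES * 8192))
    bend = max(0, min(16383, bend))
    pitch_bend_msg = (0xE0 | channel, bend & 0x7F, (bend >> 7) & 0x7F)
    note_on_msg = (0x90 | channel, midi_note & 0x7F, velocity & 0x7F)
    return pitch_bend_msg, note_on_msg

def key_to_note_and_cents(step, edo=31, base_note=60):
    """Map a key index on a hypothetical isometric layout to the nearest MIDI
    note plus a cents offset. One step of 31-EDO is about 38.7 cents."""
    cents = step * 1200.0 / edo
    semis = int(round(cents / 100.0))
    return base_note + semis, cents - semis * 100.0

if __name__ == "__main__":
    note, cents = key_to_note_and_cents(5)    # 5 steps of 31-EDO above middle C
    print(touch_to_mpe_bytes(note, cents, channel=2))
```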
Henry Lowengard