NextMind’s brain-computer interface is developer ready


Beyond size and aesthetics, NextMind's technology also seems quite mature. I tried a demo (via the SDK, which goes on sale today for $399) and was surprised at how refined the experience was. The setup consisted of only one basic "training" exercise before I was up and running, controlling things with my mind. The variety of demos made it clear that NextMind is thinking far beyond mere mental button presses.

There is still a slight learning curve, and it won't replace your mouse or keyboard just yet. That's partly because we'll have to wait for a library of apps to be built for it first, but also because it's still new technology, and it takes a bit of practice to get "fluent" with it, as my terrible performance in a mind-controlled game of Breakout can attest. But the diverse and creative demo apps I experimented with show great promise.

James Trew / Engadget

Right now the apps are pretty straightforward, mainly controlling media and games, but NextMind founder and CEO Sid Kouider is confident that the technology will evolve to the point where you could just think of an image to search for it, for example. There are also complementary technologies, like AR, where this type of control not only seems appropriate but almost essential. Imagine wearing augmented reality glasses and being able to choose from menu items or move virtual furniture around your room with just a glance.

The technology that drives it is quite familiar: the sensor is an EEG that rests gently against the back of your head. This position is essential, according to Kouider, because it's where signals from your visual cortex can be reached most easily (and comfortably). It's these signals that NextMind uses, interpreting what you're looking at as the element or signal to act on. In its simplest form that would be a button or trigger, but the demos also show how it can be used to DJ, copy and paste, and even augment (rather than just replace) other inputs, such as a mouse or game controller you're already using.

Perhaps more intriguing is its potential application in the home. As everyday household items get "smart," the way we interact with them can change. Kouider said that after this year's CES, his company spoke with around 25 companies, mainly in the entertainment and gaming industries, that want to integrate the technology into their own products. "This is something very, very exciting that we are currently working on, to integrate it into real and physical objects," Kouider added. Double-clapping to turn off the lights will feel more outdated than ever once you can just stare at the switch instead.

If you're the suspicious type, worried that your innermost thoughts will become the next data up for sale, Kouider wants to remind you that NextMind doesn't decode your thoughts (that would be far more astonishing technology). Instead, it simply interprets your visual focus.

As with the iPhone, which popularized the touchscreen as the primary interface, the way this technology fits into our lives will evolve as users (and therefore developers) have new expectations and demands. Time isn’t quite up for the trusty keyboard and mouse combo, but they could perhaps start to look a bit quaint.
