Microsoft demos muscle-computer interface, air Guitar Hero now a reality (pcgerms.com)
45 points by jhony_d on Aug 2, 2012 | 29 comments


While I love the fact that the gaming division of Microsoft seems to be genuinely innovative and doing some really great stuff with interfaces, I'm still really unconvinced by the actual games that are produced.

Wii, Kinect and Move are all fun for a while, but there really aren't many great titles taking advantage of the new controllers, and this feels like more of the same: innovative and novel, but ultimately not something that's going to produce really great games, which, for all the hype, is what these things should lead to.

Obviously there might be possibilities beyond gaming but in that area I'm not convinced anything that requires physical sensors on the body is likely to become part of a mainstream UI.


> Wii, Kinect and Move are all fun for a while but there really aren't many great titles taking advantage of the new controllers

My feeling is that innovative software solutions come a good amount of time after the release of the hardware that enables them; gaming consoles just don't keep the hype alive long enough to pique sufficient interest.

> I'm not convinced anything that requires physical sensors on the body is likely to become part of a mainstream UI

Maybe not, but the physical sensors could be attached to a structure resembling a bracelet. Make it wireless and aesthetically pleasing, and I could see myself wearing one without feeling silly.


> My feeling is that innovative software solutions come a good amount of time after the release of hardware that enables them; gaming consoles just don't keep the hype alive long enough to pique sufficient interest.

I'd buy that argument more if there were more great titles for the Wii (which has been around for ages now) that really used its remote.

(And I don't believe it's because the Wii is underpowered in any way. Yes, it's not a powerful console by today's standards, but people have managed to create great, compelling games with less power than the Wii has.)

> Maybe not, but the physical sensors could be attached to a structure resembling a bracelet. Make it wireless and aesthetically pleasing, and I could see myself wearing one without feeling silly.

I wasn't really thinking about making it look great. Just looking at the pictures, there are a fair number of sensors in inconvenient places (higher up the arm than the wrist, and the diagram shows them all over the body). Maybe they'll improve it, but I'm not convinced you'd get a really good range of movements monitored without considerable coverage.


Similarly, the people who buy games again and again are more likely to be playing first-person shooters or heavily involved RPGs that don't lend themselves well to motion-based control. Moving your whole body is simply too inefficient in comparison to using a controller. As a result, motion controls are limited to "party games" that rely on simple gameplay rather than a deep experience.


> the gaming division of Microsoft seems to be genuinely innovative and doing some really great stuff with interfaces

Just to provide some context: the EMG research presented in the article was conducted by T. Scott Saponas (et al.) at Microsoft Research, not the gaming division. And it was academic research, not product development.

http://research.microsoft.com/en-us/um/people/ssaponas/


I get that, though it's interesting that they chose video games as their demo medium.


I am working in a lab that is developing a similar technology, the main difference being that we use a single muscle site to control two degrees of freedom (primarily moving a cursor on a screen). We are also working on applications for severely paralyzed individuals, who can use the "ear wiggling" muscles (auricularis superior, on the side of your head) to control anything from a television remote to a motorized wheelchair. The control isn't perfect, but with a few hours of training most participants achieve cursor-to-target accuracies of around 80%. One participant was able to reach 100% in one session.

It is fun to play with a muscle-computer interface for a while, but I don't think it will catch on as a heavy-use gaming controller unless they significantly improve upon current technology. It is simply too imprecise, though on/off (button) control, like what's used for Guitar Hero, is better suited to it than x,y movement of a cursor. Trying to control the equivalent of everything you can do with a PS3 controller is out of the question.
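(To illustrate why button-style control is the easier problem: a muscle contraction only has to cross a threshold, not land on a target. A toy sketch, entirely my own and not from the article or this lab's work; the function name, thresholds, and data are made up. Hysteresis, i.e. separate on/off thresholds, keeps noise near a single threshold from causing chatter.)

```python
def button_events(envelope, on_thresh=0.6, off_thresh=0.4):
    """Turn a rectified, smoothed EMG envelope (arbitrary units)
    into press/release events. Uses hysteresis: the signal must
    rise above on_thresh to press, then fall below off_thresh
    to release, so jitter between the two thresholds is ignored."""
    pressed = False
    events = []
    for i, v in enumerate(envelope):
        if not pressed and v >= on_thresh:
            pressed = True
            events.append((i, "press"))
        elif pressed and v <= off_thresh:
            pressed = False
            events.append((i, "release"))
    return events

# One noisy burst of muscle activity yields one clean press/release pair.
env = [0.1, 0.2, 0.7, 0.65, 0.55, 0.5, 0.3, 0.1]
print(button_events(env))  # [(2, 'press'), (6, 'release')]
```

Continuous x,y cursor control has no such forgiving threshold: every sample of the noisy signal maps to a position, which is exactly why it's so much harder.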


Would most or all participants reach cursor-to-target accuracies in the 98-100% range if you allowed them to continue training indefinitely?


Probably yes. Our focus so far has been to demonstrate that it is physiologically possible to control multiple degrees of freedom with a single muscle site, so we haven't done extensive testing to address that. It's also worth noting that hitting a target and having very precise control of movement, as you would with a mouse, are not the same. I doubt the precision would ever reach that of a mouse, but it could conceivably get close.


This looks great, but I'm a bit worried that this will only bring more people RSI or similar problems.

I think the future of input devices is our brains, paired with a device that can read what's going on in them and translate that to the computer. We really need to move away from devices that use hands, because hands simply aren't made for such tasks. The only way I see humans still using hands for computer interaction in the future is if we get prosthetic robot hands, like in the Deus Ex games, for example.


> The only way I see humans still using hands for computer interaction in the future is if we get prosthetic robot hands like in the Deus Ex game for example.

If we could access the brain outputs that control the hands, why would we make robot hands and special input devices to be controlled by those hands to control the computers, rather than interface to the computers directly?


Perhaps you won't want third party hardware or software interfacing with your brain.

Another issue could be that repeating simplistic patterns over and over with your brain could generate a kind of "brain RSI" and you'd prefer the old school way to do things.

I find it infinitely entertaining to think about all the practical things you'll care about when a dream technology actually becomes available.


Well both use cases could be made in parallel. You could have prosthetic hands to replace your lost ones (or to "upgrade" your existing ones) and have a direct link to the computer.


Definitely, but they wouldn't use those hands for interaction, as the GP stated.


Could be. But it could also be that we'll be able to tap into nerves and make artificial hands sooner than a usable brain-computer interface (there is a whole set of other problems to solve there). That would mean the interface to the computer stays the same (keyboard, mouse, touch screen, ...) but our hands become mechanical (and you couldn't get Carpal Tunnel Syndrome since, well, you wouldn't have a carpal tunnel :))


What are the other problems? Making a prosthetic arm is a superset of getting inputs from nerves, since you need to do that to make the actual arm. Besides, Microsoft already did that, as this very post indicates.


Well, for starters you would have to read the signals from the brain; otherwise you are limited to the nerves in your hands. Maybe I wasn't clear enough about what I meant: I was thinking along the lines of thought control, not controlling a virtual hand, if you will, on the computer. For that you would, like I said, have to capture brain activity and develop software that could make sense of your thoughts and translate them into actions on the computer ... this, to me, is a whole different game.


Ah, that might never happen, sadly.


Care to elaborate why you think so :) ?


Well, it's much, much more complicated to decode the brain's internals than just read its outputs to hands and other peripherals. I'm not sure it won't happen, but it's that much more complicated that it wouldn't surprise me.


Well I truly really hope you are wrong :)


> I think that the future of input devices are our brains and a device that can read what's going on in them

A major obstacle to developing a useful brain-computer interface using EEG is signal strength. The event-related potentials measured at the scalp are on the order of microvolts and do not play well with other electronics nearby. Direct measurement of the motor cortex solves this problem, but most people aren't eager for brain surgery. The other option is to move downstream to the muscle site, where a clean signal is quite accessible. I agree, however, that this is a less than ideal solution.
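(For anyone curious what "moving downstream to the muscle site" buys you in processing terms: surface EMG is typically full-wave rectified and smoothed into an envelope before being mapped to control. A crude sketch, my own illustration rather than anyone's actual pipeline; the function name, window size, and sample data are invented.)

```python
def emg_envelope(samples, window=5):
    """Crude surface-EMG envelope: full-wave rectify (abs) the raw
    signal, then smooth with a trailing moving average of `window`
    samples. Real pipelines typically bandpass-filter first
    (roughly 20-450 Hz) to remove motion artifact and mains hum."""
    rect = [abs(s) for s in samples]
    out = []
    for i in range(len(rect)):
        lo = max(0, i - window + 1)
        out.append(sum(rect[lo:i + 1]) / (i + 1 - lo))
    return out

# The oscillating raw burst becomes a smooth positive envelope.
raw = [0.0, 0.5, -0.6, 0.4, -0.5, 0.0]
print(emg_envelope(raw, window=3))
```

Because the EMG signal at the skin is on the order of millivolts rather than the microvolts of scalp EEG, even this naive pipeline yields something usable; EEG needs far more aggressive filtering and averaging before any control signal emerges.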


I think this is just something that future advancements in technology will make possible and practical, so I'm not too worried about these problems that you listed :)


Agree with you! But I must say it is still functional.


This seems like an interface perfectly suited for Google Glass. I'd rather use this than voice commands, or, even worse, the button on the glasses.


For all the other shit MS gets (I'm thinking of Kinect), at least they are actively trying out new interfaces/inputs. Let's hope they go places.


What shit did Microsoft get for Kinect? I was under the impression that it was well received all around.


Recent article: Has Kinect failed gamers?

http://360.mmgn.com/Articles/has-kinect-failed-gamers


While the Kinect was sold to the public as a video gaming peripheral, it was an extremely inexpensive depth sensor with impressive capabilities for the time (2 years ago).

From an economic standpoint, it was very successful, selling a boatload of units. From a game enthusiast's view, though, I see it as a stepping stone to something bigger, like the LEAP.

At this point, the Kinect is dated. The LEAP comes out Q4 of this year and is vastly superior (from what I've seen).



