Exciting change is on the way! Please join us at nsf.gov for the latest news on NSF-funded research. While the NSF Science360 page and daily newsletter have now been retired, there’s much happening at nsf.gov. You’ll find current research news on the homepage and much more to explore throughout the site. Best of all, we’ve begun to build a brand-new website that will bring together news, social media, multimedia and more in a way that offers visitors a rich, rewarding, user-friendly experience.

Want to continue to receive email updates on the latest NSF research news and multimedia content? On September 23rd we’ll begin sending those updates via GovDelivery. If you’d prefer not to receive them, please unsubscribe now from Science360 News and your email address will not be moved into the new system.

Thanks so much for being part of the NSF Science360 News Service community. We hope you’ll stay with us during this transition so that we can continue to share the many ways NSF-funded research is advancing knowledge that transforms our future.

For additional information, please contact us at NewsTravels@nsf.gov.

Picture of the Day

New tech may make prosthetic hands easier for patients to use

Researchers have developed new technology for decoding neuromuscular signals to control powered prosthetic wrists and hands. The work relies on computer models that closely mimic the behavior of the natural structures in the forearm, wrist and hand. The technology could also be used to develop new computer interface devices for applications such as gaming and computer-aided design. Current state-of-the-art prosthetics rely on machine learning to implement a "pattern recognition" approach to prosthesis control. This approach requires users to "teach" the device to recognize specific patterns of muscle activity and translate them into commands, such as opening or closing a prosthetic hand. Instead, the researchers developed a user-generic musculoskeletal model.

To build it, they placed electromyography sensors on the forearms of six able-bodied volunteers, tracking exactly which neuromuscular signals were sent when the volunteers performed various actions with their wrists and hands. That data was then used to create the generic model, which translates those neuromuscular signals into commands that manipulate a powered prosthetic. In preliminary testing, both able-bodied and amputee volunteers were able to use the model-controlled interface to perform all of the required hand and wrist motions, despite having very little training.
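To make the contrast concrete, here is a minimal sketch of the "teach the device" pattern-recognition approach the article describes as the current state of the art. It is written in Python against synthetic data; the electrode count, RMS features, gesture labels and scikit-learn classifier are illustrative assumptions, not the researchers' actual method or their user-generic musculoskeletal model.

# A toy illustration of the "pattern recognition" control scheme described
# above: the user "teaches" a classifier by performing each gesture, and new
# EMG windows are then mapped to discrete prosthesis commands. All details
# (channel count, RMS features, LDA classifier, synthetic signals) are
# illustrative assumptions, not the researchers' actual pipeline.
import numpy as np
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis

rng = np.random.default_rng(0)
N_CHANNELS = 8      # assumed number of forearm EMG electrodes
WINDOW = 200        # samples per analysis window
GESTURES = ["hand_open", "hand_close", "wrist_flex", "wrist_extend"]

def rms_features(window):
    """Root-mean-square amplitude per channel, a common EMG feature."""
    return np.sqrt(np.mean(window ** 2, axis=0))

def synthetic_emg(gesture_id, n_windows):
    """Fake EMG whose per-channel amplitude pattern depends on the gesture."""
    amp = 0.5 + 0.5 * np.roll(np.linspace(1.0, 2.0, N_CHANNELS), gesture_id)
    return rng.normal(0.0, amp, size=(n_windows, WINDOW, N_CHANNELS))

# "Teaching" phase: collect labeled windows while the user performs each gesture.
X = np.vstack([[rms_features(w) for w in synthetic_emg(g, 50)]
               for g in range(len(GESTURES))])
y = np.repeat(np.arange(len(GESTURES)), 50)
clf = LinearDiscriminantAnalysis().fit(X, y)

# Control phase: each incoming window is classified into a prosthesis command.
new_window = synthetic_emg(1, 1)[0]
command = GESTURES[clf.predict([rms_features(new_window)])[0]]
print("decoded command:", command)   # with this toy data: hand_close

The musculoskeletal-model approach reported here differs precisely in skipping that per-user teaching step: the mapping from neuromuscular signals to wrist and hand motion is built into a generic model rather than learned from each user's examples.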

Image credit: Lizhi Pan/North Carolina State University