Exciting change is on the way! Please join us at nsf.gov for the latest news on NSF-funded research. While the NSF Science360 page and daily newsletter have now been retired, there’s much happening at nsf.gov. You’ll find current research news on the homepage and much more to explore throughout the site. Best of all, we’ve begun to build a brand-new website that will bring together news, social media, multimedia and more in a way that offers visitors a rich, rewarding, user-friendly experience.

Want to continue receiving email updates on the latest NSF research news and multimedia content? On September 23rd we’ll begin sending those updates via GovDelivery. If you’d prefer not to receive them, please unsubscribe now from Science360 News, and your email address will not be moved into the new system.

Thanks so much for being part of the NSF Science360 News Service community. We hope you’ll stay with us during this transition so that we can continue to share the many ways NSF-funded research is advancing knowledge that transforms our future.

For additional information, please contact us at NewsTravels@nsf.gov.

Top Story

Reach in and touch objects in videos with ‘Interactive Dynamic Video’

We learn a lot about objects by manipulating them: poking, pushing, prodding, and then seeing how they react. We obviously can’t do that with videos — just try touching that cat video on your phone and see what happens. But is it crazy to think that we could take that video and simulate how the cat moves, without ever interacting with the real one? Researchers from MIT’s Computer Science and Artificial Intelligence Laboratory have recently done just that, developing an imaging technique called Interactive Dynamic Video (IDV) that lets you reach in and “touch” objects in videos. Using traditional cameras and algorithms, IDV looks at the tiny, almost invisible vibrations of an object to create video simulations that users can virtually interact with.
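To make the idea concrete, here is a minimal, hedged sketch of the general approach the story describes: estimate the tiny per-pixel motions in a short video, pick out a dominant vibration component, and warp a still frame along it to mimic a virtual "poke." This is not the MIT CSAIL implementation, just a rough approximation built from standard OpenCV optical flow and a temporal FFT; the file name "cat.mp4" and the chosen parameters are placeholders.

```python
# Rough illustration of the IDV idea (not the actual MIT CSAIL code):
# 1) measure small motions between frames, 2) find a dominant temporal
# vibration component, 3) displace a still frame along it to fake a "poke."
import cv2
import numpy as np

def load_gray_frames(path, max_frames=120):
    """Read up to max_frames grayscale frames from a video file."""
    cap = cv2.VideoCapture(path)
    frames = []
    while len(frames) < max_frames:
        ok, frame = cap.read()
        if not ok:
            break
        frames.append(cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY))
    cap.release()
    return frames

def dominant_mode(frames):
    """Estimate per-pixel displacement over time and keep the strongest
    temporal frequency component as a crude 'vibration mode'."""
    ref = frames[0]
    flows = []
    for f in frames[1:]:
        # Dense optical flow captures the small motions of the object.
        flow = cv2.calcOpticalFlowFarneback(ref, f, None,
                                            0.5, 3, 15, 3, 5, 1.2, 0)
        flows.append(flow)
    flows = np.stack(flows)                  # shape (T, H, W, 2)
    spectrum = np.fft.rfft(flows, axis=0)    # temporal FFT per pixel
    power = np.abs(spectrum).sum(axis=(1, 2, 3))
    k = int(np.argmax(power[1:]) + 1)        # skip the DC component
    return spectrum[k].real                  # (H, W, 2) mode shape

def poke(frame, mode, strength=5.0):
    """Warp the reference frame along the vibration mode to simulate a push."""
    h, w = frame.shape
    grid_x, grid_y = np.meshgrid(np.arange(w), np.arange(h))
    map_x = (grid_x + strength * mode[..., 0]).astype(np.float32)
    map_y = (grid_y + strength * mode[..., 1]).astype(np.float32)
    return cv2.remap(frame, map_x, map_y, cv2.INTER_LINEAR)

if __name__ == "__main__":
    frames = load_gray_frames("cat.mp4")     # placeholder input video
    mode = dominant_mode(frames)
    cv2.imwrite("poked.png", poke(frames[0], mode))
```

The actual IDV work recovers physically meaningful modal bases from the video and simulates responses to arbitrary forces; the sketch above only pushes a frame along one crude frequency component, which is enough to convey why invisible vibrations carry information about how an object would move if touched.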

Visit Website | Image credit: Abe Davis/MIT CSAIL