AI to tap your egocentric perception

Facebook AI has announced Ego4D, a project that aims to give AI the ability to understand and interact with the world from a first-person perspective (the way humans do). AI typically learns from photos and videos captured in the third person, but next-generation AI will need to learn from video that shows the world from the center of the action.

Giving AI the ability to understand first-person context would enhance upcoming augmented reality (AR) devices and would greatly improve the performance of extended reality (XR) interfaces generally, including mixed reality (MR) and virtual reality (VR). The name “Ego4D” might be a bit tone-deaf, and that’s to say nothing of the fact that this is Mark Zuckerberg getting even deeper into our personal experiential data. But if you can get past that, this is super cool!

Author’s note: This is not a sponsored post. I am the author of this article and it expresses my own opinions. I am not, nor is my company, receiving compensation for it.

About Shelly Palmer

Shelly Palmer is the Professor of Advanced Media in Residence at Syracuse University’s S.I. Newhouse School of Public Communications and CEO of The Palmer Group, a consulting practice that helps Fortune 500 companies with technology, media and marketing. Named LinkedIn’s “Top Voice in Technology,” he covers tech and business for Good Day New York, is a regular commentator on CNN and writes a popular daily business blog. He's a bestselling author, and the creator of the popular, free online course, Generative AI for Execs. Follow @shellypalmer or visit shellypalmer.com.
