NSF
Understanding how humans interact with the physical world is essential for teaching intelligent systems to perform complex tasks effectively. Everyday activities, such as grasping a coffee mug and taking a sip, often involve the seamless integration of visual perception and motor control, a process that current intelligent systems struggle to replicate. This project aims to bridge this gap by developing a new family of visual sensing and learning approaches to track hand motion and interpret daily interactions, all using wearable cameras. Further, this project will demonstrate a key application in occupational safety and health by adapting the technology to assess injury risks in the workplace. By advancing the sensing and analysis of human interactions, this research will deepen the understanding of human intelligent behavior, drive the development of more capable intelligent systems, and expand practical applications of such systems. The project's outputs, including open-source algorithms and hardware platforms, will be disseminated through public competitions, online courses, and diverse outreach activities.

This project will advocate a new paradigm of proprioceptive vision, in which an intelligent system integrates physical awareness with visual perception to understand how its actions relate to its observations. The project's goal will be achieved through two interconnected research thrusts. Thrust 1 will develop computer vision approaches that harness wrist-mounted, spatially distributed, miniature single-photon cameras to reconstruct hands and in-hand objects. This will create a new solution for wearable hand tracking, supporting long-term recording under varied lighting conditions. Thrust 2 will create machine learning techniques that integrate egocentric videos and hand motion to understand hand-object interactions. This will advance the foundational representation of objects and actions and enable compositional reasoning about interactions.
Finally, the outcomes from both thrusts will be integrated to address a significant health challenge: assessing the injury risk of mobile workers engaged in repetitive manual picking tasks in workplaces. This award reflects NSF's statutory mission and has been deemed worthy of support through evaluation using the Foundation's intellectual merit and broader impacts review criteria.
Up to $351K
2030-06-30
Research Infrastructure: National Geophysical Facility (NGF): Advancing Earth Science Capabilities through Innovation - EAR Scope
NSF — up to $26.6M
AmLight: The Next Frontier Towards Discovery in the Americas and Africa
NSF — up to $9M
CREST Phase II Center for Complex Materials Design
NSF — up to $7.5M
EPSCoR CREST Phase I: Center for Energy Technologies
NSF — up to $7.5M
EPSCoR CREST Phase I: Center for Post-Transcriptional Regulation
NSF — up to $7.5M
EPSCoR CREST Phase I: Center for Semiconductors Research
NSF — up to $7.5M