NSF
The overarching theme of the project is to systematically expand understanding of how deep neural networks (DNNs) work and why or when they are better than classical methods through the lens of "adaptivity." Adaptivity refers to the properties of an algorithm that take advantage of favorable structures in the input data without knowing in advance that these structures exist. That is, adaptive algorithms are free of tuning parameters and automatically configure themselves to each input dataset. The anticipated outcomes of the project include a new theory that explains and quantifies the adaptivity of popular DNN models such as multi-layer perceptrons, self-attention mechanisms (namely, transformer models), and meta-learning. The theory could yield substantial savings in the statistical and computational complexity of these models, allowing them to be applied in resource-constrained settings and to have a smaller, more environmentally friendly energy footprint. This project will also provide opportunities for students and postdocs to explore interdisciplinary research topics related to deep learning. Specifically, this project investigates (1) the "local adaptivity" of DNNs in estimating functions from noisy data; (2) the "relational adaptivity" of the self-attention mechanism, which parses a structured data point (such as an image or a chunk of text); and (3) the "task adaptivity" of multi-task and meta-learning algorithms that learn to share information across multiple tasks. The research covers some of the most popular DNN models. Technically, the project leverages multiple branches of mathematics (such as function classes, nonparametric statistics, statistical learning theory, optimization, and compressed sensing) and involves innovations in the approximation-theoretic understanding, algorithmic insights, and statistical theory of DNNs. The new analytical tools to be developed are also of independent interest to the broader machine learning theory community.
This award reflects NSF's statutory mission and has been deemed worthy of support through evaluation using the Foundation's intellectual merit and broader impacts review criteria.
Up to $147K
2026-09-30
Research Infrastructure: National Geophysical Facility (NGF): Advancing Earth Science Capabilities through Innovation - EAR Scope
NSF — up to $26.6M
AmLight: The Next Frontier Towards Discovery in the Americas and Africa
NSF — up to $9M
CREST Phase II Center for Complex Materials Design
NSF — up to $7.5M
EPSCoR CREST Phase I: Center for Energy Technologies
NSF — up to $7.5M
EPSCoR CREST Phase I: Center for Post-Transcriptional Regulation
NSF — up to $7.5M
EPSCoR CREST Phase I: Center for Semiconductors Research
NSF — up to $7.5M