From the National Science Foundation (NSF):
The National Science Foundation (NSF) is investing over $25 million in 26 projects to advance the cognitive and physical capabilities of workers in the context of human-technology interactions. These new awards will address critical social, technical, educational and economic needs in the workplace.
The awards were issued under the Future of Work at the Human-Technology Frontier (FW-HTF), one of 10 Big Ideas for Future NSF Investments announced by NSF in 2016.
The new projects will advance human-technology collaboration in the workplace and focus on enhancing productivity, innovation and learning. Research will provide foundations for augmenting human cognition, including:
- Models for social understanding and interaction.
- Teaching and learning.
- Biases in judgment.
- Attention, memory and more.
Research will also advance the field of embodied intelligent cognitive assistants, systems that harness machine intelligence to enhance human cognitive and physical capabilities. These interactive cyber-physical systems involve robots, exoskeletons, virtual reality and augmented reality, including in autonomous vehicles and the built environment.
The award amounts range from $750,000 to $3 million each for three to five years, depending on the scope, duration and team size for the project.
A condition of the awards is that the projects must study human-technology interaction within the broader socioeconomic framework of jobs and work, and must also be attentive to social and economic impacts that can benefit workers, such as training and workforce development.
Here are a few projects awarded funding that might be of special interest to infoDOCKET readers:
- Augmented cognition for teaching: Transforming teacher work with intelligent cognitive assistants
James Lester and Bradford Mott, North Carolina State University; Krista Glazewski, Thomas Brush and Cindy Hmelo-Silver, Indiana University
- Enhancing human capabilities through virtual personal embodied assistants in self-contained eyeglasses-based augmented reality systems
Gordon Wetzstein and Jeremy Bailenson, Stanford University; Henry Fuchs, Jan-Michael Frahm, Mohit Bansal, Prudence Plummer and Felicia Williams, University of North Carolina at Chapel Hill
- The future of classroom work: Automated teaching assistants
Kurt VanLehn, Arizona State University
- Human-machine teaming for medical decision making
Suchi Saria, David Newman-Toker, Chien-Ming Huang, Martin Makary and William Padula, Johns Hopkins University
- First person view and augmented reality for airborne embodied intelligent cognitive assistants
Craig Woolsey, Joseph Gabbard, Pratap Tokekar and Matthew Hebdon, Virginia Polytechnic Institute and State University
Direct to List/Links of All Funded Projects