Researchers on the LEARNER project have created a mixed reality learning environment that uses exoskeletons and augmented reality (AR) to train and assist emergency service workers, improving their safety and performance by enhancing their physical and cognitive abilities.
The cloud-based mixed reality learning platform, named Learning Environments with Augmentation and Robotics for Next-gen Emergency Responders (LEARNER), is specifically designed to train emergency service responders to use human augmentation technologies (HATs).
The LEARNER platform, originally funded in 2019 by the National Science Foundation’s Convergence Accelerator, brings together a multidisciplinary, team-based science effort to address national-scale societal challenges.
Last month, the National Science Foundation awarded follow-on funding to the project partners: Virginia Tech, the University of Florida, Texas A&M University, Knowledge Based Systems Inc. and Sarcos Robotics. The UI/UX group of the Public Safety Research Division at the National Institute of Standards and Technology has now joined the multidisciplinary team.
HATs, including AR and exoskeletons, can dramatically improve the safety and performance of emergency service responders. Using shared AR programs, command staff can guide responders wearing AR headsets, giving them real-time scene annotations that mark the locations of victims, exits and potential hazards.
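To illustrate how shared scene annotations of this kind might be represented, here is a minimal sketch in Python. The schema, class names and coordinate convention are all hypothetical; the LEARNER project's actual data model is not described in the source.

```python
from dataclasses import dataclass, field
from enum import Enum

class AnnotationType(Enum):
    """Categories of annotation mentioned in the article."""
    VICTIM = "victim"
    EXIT = "exit"
    HAZARD = "hazard"

@dataclass
class SceneAnnotation:
    """One command-staff annotation pushed to a responder's AR headset."""
    label: str
    kind: AnnotationType
    position: tuple  # (x, y, z) in a shared scene coordinate frame (illustrative)

@dataclass
class SharedScene:
    """Annotations shared between command staff and responders in the field."""
    annotations: list = field(default_factory=list)

    def annotate(self, label, kind, position):
        self.annotations.append(SceneAnnotation(label, kind, position))

    def of_kind(self, kind):
        """Filter annotations, e.g. to highlight only hazards on a headset."""
        return [a for a in self.annotations if a.kind is kind]

# Command staff annotate the scene; responders query it in real time.
scene = SharedScene()
scene.annotate("trapped occupant", AnnotationType.VICTIM, (4.0, 1.2, 0.0))
scene.annotate("stairwell B", AnnotationType.EXIT, (0.0, 0.0, 0.0))
scene.annotate("gas leak", AnnotationType.HAZARD, (2.5, 1.0, 3.0))
hazards = scene.of_kind(AnnotationType.HAZARD)
```

In a real system the shared scene would be synchronised over the network and anchored to a spatial map; this sketch only shows the shape of the annotation data.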
Powered exoskeletons can give emergency service workers the extra strength to lift heavy objects, minimising fatigue and reducing injuries while preserving their autonomy and decision-making abilities.
The LEARNER project assesses and develops physical, augmented and virtual reality technology for emergency service responders. It aims to build a mixed reality learning platform with AR- and exoskeleton-specific learning modules that adapt to the environment and to the individual responder: artificial intelligence analyses biometric and behavioural data to personalise training to each responder’s learning needs.
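One simple way such adaptation could work is a rule that adjusts module difficulty from biometric and performance signals. The sketch below is purely illustrative: the function name, the use of heart rate and error rate as signals, and every threshold are assumptions, not details from the LEARNER project.

```python
def next_module_difficulty(current, heart_rate, error_rate,
                           hr_ceiling=150, err_ceiling=0.3):
    """Naive adaptive rule (illustrative thresholds, not LEARNER's):
    back off when stress or errors are high, advance when the trainee
    is comfortable and accurate, otherwise hold steady."""
    if heart_rate > hr_ceiling or error_rate > err_ceiling:
        return max(1, current - 1)   # ease off: trainee is overloaded
    if heart_rate < 0.7 * hr_ceiling and error_rate < 0.5 * err_ceiling:
        return current + 1           # step up: trainee has headroom
    return current                   # hold at the current level

# An overloaded trainee at level 3 is stepped back down to level 2.
eased = next_module_difficulty(3, heart_rate=160, error_rate=0.1)
# A comfortable, accurate trainee at level 3 is advanced to level 4.
advanced = next_module_difficulty(3, heart_rate=100, error_rate=0.1)
```

A production system would replace this hand-tuned rule with a learned model over richer physiological, neurological and behavioural markers, as the project describes.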
Ultimately, the LEARNER researchers plan to create an open source shared knowledge platform that will speed up and enhance HAT training not only for emergency service responders but also for workers in the broader healthcare, manufacturing, construction and energy sectors.
Texas A&M’s Principal Investigator on the LEARNER project and Director of the NeuroErgonomics Lab, Ranjana Mehta, stated:
“LEARNER is a personalised learning platform that will incorporate physiological, neurological and behavioural markers of learning into real-time emergency response scenario evolution.”
According to Mehta, the training will be available through laptops, AR headsets and haptic suits at field houses and in-situ emergency response training centres.
Mehta concluded by commenting:
“Imagine if health care workers are quickly able to learn how to use powered exoskeletons using LEARNER -- fewer workers would be needed for safer patient handling, thereby potentially reducing the spread of COVID-19-related infections… The award will accelerate our efforts to make immediate impacts to address challenges of national importance such as this.”