Projects using NuGlove

INTERACT

Interactive Next-generation Testbed Environment for Retention and Assessment of Computer-based Training (INTERACT) provides multi-modal feedback for immersive training environments to promote psychomotor learning and multi-modal skill acquisition. It consists of NuGlove, which provides gesture recognition and haptic feedback to a trainee, and a scent collar, which provides olfactory feedback. In combination with a virtual training environment, INTERACT will create a fully immersive experience with perceptual cues, optimal for skill acquisition and long-term retention. Under an SBIR Phase II contract from the Office of Naval Research, AnthroTronix completed a proof-of-concept integration with the Immersive Virtual Ship Environment (IVSE) Engineering Plant Technician (EPT) Training Course for the Littoral Combat Ship. INTERACT has also been implemented with virtual reality (VR) goggles.

  • Proof-of-Concept integration with the Surface Warfare Officers School Virtual Reality training course for Littoral Combat Ship Engineering Plant Technicians, developed by Cubic Global Defense
  • Video of the IVSE EPT Training Course: https://www.youtube.com/watch?v=k-aT90ehH8Y
  • Demonstrated with a virtual reality headset (HTC Vive)
  • Demonstrated at the Surface Warfare Officers School, Newport, RI
  • Office of Naval Research SBIR Phase II contract

Intuitive Robot Operator Control (IROC)

  • Integration with Endeavor Robotics PackBot 510, Samsung phone, and heads-up display
  • Selected through the IROC Challenge at Muscatatuck, IN
  • Marine Corps Warfighting Lab SBIR Phase III contract

DROID AGENT

Dynamic Robot Operator Interface Design Assessment, Guidance, and Engineering Tool (DROID AGENT)

DROID AGENT’s goal is to facilitate machine learning of medical procedures to support an automated training/coaching system for combat medics and Navy corpsmen. We will accomplish this by collecting data from a wearable system with two input modalities: machine vision from body-worn cameras and sensor data from IMUs located on instrumented gloves. As an initial use case, we will train the system on the application of a tourniquet to an arm, using these data to classify the actions that comprise the procedure. Classification can occur using either modality alone, so if one input is unavailable, classification can still take place. The approach also lays the groundwork for the system to predict the next steps of a procedure before they occur, increasing the quality of the human-machine collaboration.
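The either-modality classification idea above can be sketched as a simple late-fusion rule: each modality produces its own per-action scores, and whichever modalities are available are averaged. This is a hypothetical illustration only; the action names, score format, and averaging rule are assumptions, not the project's actual implementation.

```python
# Illustrative late-fusion sketch (not the DROID AGENT implementation).
# Each modality's classifier outputs a dict of {action: probability};
# either modality may be absent, and classification still proceeds.

ACTIONS = ["place_tourniquet", "tighten_windlass", "secure_strap"]  # hypothetical labels

def classify(vision_scores=None, imu_scores=None):
    """Return the most likely action, fusing whichever modalities are present."""
    available = [s for s in (vision_scores, imu_scores) if s is not None]
    if not available:
        raise ValueError("at least one modality is required")
    # Average scores across the available modalities (soft voting).
    fused = {a: sum(s[a] for s in available) / len(available) for a in ACTIONS}
    return max(fused, key=fused.get)

# Example: vision alone, IMU alone, or both can drive the decision.
vision = {"place_tourniquet": 0.7, "tighten_windlass": 0.2, "secure_strap": 0.1}
imu = {"place_tourniquet": 0.2, "tighten_windlass": 0.6, "secure_strap": 0.2}
print(classify(vision_scores=vision, imu_scores=imu))
```

Averaging independent per-modality scores is one common way to get the graceful degradation the paragraph describes: dropping a modality only shrinks the list being averaged rather than breaking the classifier.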

  • NuGlove provides cues for autonomous behaviors by a robotic arm in different scenarios, e.g., Combat Casualty Care
  • Research Institute: University of Maryland, Computer Vision Laboratory & Autonomy, Robotics and Cognition (ARC) Laboratory
  • DARPA STTR Phase II Option contract

COMMAND

  • Multimodal system with NuGlove and haptic belt for silent non-line-of-sight communications and ground robot teleoperation
  • Used in Soldier experiments, Fort Benning, GA, 2014 and 2016
  • Army Research Office SBIR Phase III contract