Human-Machine Interfaces
AnthroTronix is dedicated to optimizing the interaction between people and technology. We believe that, by improving human-machine interfaces, we can increase the benefits that technology can provide. Our intuitive interfaces help individuals directly connect to computers, robotic systems, and other programmed devices.
The AcceleGlove™ is an instrumented gesture-recognition glove based on patented accelerometer technology for detecting the individual motions of the fingers, hand, wrist, and arm. The AcceleGlove™ can be used as a controller, using the natural movements of the operator's hand and arm as input to control both the movement of a robotic device and the movement of ancillary devices, such as grasping and lifting arms. The glove was developed under SBIR grants from the U.S. Army and the Department of Education. The AcceleGlove™ was further refined under a grant from the National Institutes of Health (NIH) to assist with physical therapy and under a grant from the Army Research Office (ARO) for military hand-signal recognition.
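As a hypothetical illustration of accelerometer-based posture detection (the AcceleGlove's actual sensor layout, thresholds, and API are not specified here), per-finger tilt relative to gravity can be estimated from a 3-axis accelerometer at rest and mapped to a coarse posture label:

```python
import math

def finger_pitch(ax, ay, az):
    """Tilt of a finger-mounted accelerometer relative to gravity (degrees).

    With the hand at rest, the accelerometer measures only gravity, so the
    pitch angle can be recovered from the ratio of the axis components.
    """
    return math.degrees(math.atan2(-ax, math.sqrt(ay * ay + az * az)))

def classify_posture(readings):
    """Map five (ax, ay, az) tuples, thumb to pinky, to a coarse posture label.

    The -45 degree 'curled' threshold is illustrative, not the glove's
    actual calibration.
    """
    pitches = [finger_pitch(*r) for r in readings]
    curled = [p < -45.0 for p in pitches]
    if not any(curled):
        return "open hand"
    if all(curled):
        return "fist"
    return "partial"
```

In practice such a classifier would be calibrated per user and extended with dynamic (motion-over-time) features, but the static case shows why one accelerometer per finger is enough to separate basic hand shapes.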
The NuGlove is an instrumented gesture-recognition glove that can be used either as a controller for robotic devices or to track individual hand movements. Its capabilities include both static and dynamic hand- and arm-signal recognition for covert dismount warfighter communication. It can also be used for both direct and supervisory robotic control via discrete gestures, as well as for proportional control of one or more degrees of freedom. Like the AcceleGlove, the NuGlove uses the natural movements of the operator's hand and arm as input, but it adds gyroscopes and magnetometers; combined with the accelerometers, these sensors provide 9-axis detection and control for each finger, enabling finer movement detection and differentiation. Integrated Bluetooth® connectivity is currently under development.
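One common way to combine gyroscope and accelerometer data into a drift-corrected orientation estimate is a complementary filter; the sketch below is illustrative only and is not the NuGlove's actual fusion algorithm:

```python
import math

def complementary_filter(pitch, gyro_rate, ax, ay, az, dt, alpha=0.98):
    """Blend the integrated gyro rate (smooth, but drifts over time) with
    the accelerometer's gravity-derived pitch (noisy, but absolute).

    pitch      -- previous pitch estimate in degrees
    gyro_rate  -- angular rate in degrees/second
    ax, ay, az -- accelerometer axes in g
    dt         -- sample interval in seconds
    alpha      -- trust placed in the gyro path (illustrative default)
    """
    accel_pitch = math.degrees(math.atan2(-ax, math.sqrt(ay * ay + az * az)))
    return alpha * (pitch + gyro_rate * dt) + (1 - alpha) * accel_pitch
```

Called once per sample, the filter keeps the responsiveness of the gyroscope while the accelerometer term slowly pulls any accumulated drift back toward the true angle; a magnetometer plays the analogous absolute-reference role for heading.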
HACS is a lightweight, wearable communication system that enables wireless transmission of hand signals and gathered strategic and tactical information to the dismount warfighter. Previously, operational units communicated using short-range radios that allowed only one team member to transmit information at a time. HACS improves and integrates a team's status reporting and communications with an ergonomic, low-power, bi-directional system that provides real-time automatic status gathering and hand-signal detection. The system includes an instrumented glove and a haptic communication vest, allowing automated detection and recognition of standard hand-and-arm signals as well as body position and movements. It also includes a haptic feedback system that can be integrated into a warfighter's uniform. Hand- and arm-signal commands captured electronically can thus be sent to all team members simultaneously through the wearable haptic feedback system, enhancing situation awareness, lethality, and the likelihood of mission success.
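The broadcast step can be pictured as a lookup from a recognized signal to a vibration pattern replicated for every team member. The signal names, motor indices, and pulse timings below are invented for illustration and are not the actual HACS vocabulary or protocol:

```python
# Each pattern is a list of (motor_index, duration_ms) pulses for the vest.
# All names and values here are hypothetical placeholders.
HAPTIC_PATTERNS = {
    "halt":       [(0, 400)],
    "rally":      [(1, 150), (2, 150), (1, 150)],
    "enemy_seen": [(3, 100), (3, 100), (3, 100)],
}

def encode_signal(signal):
    """Return the pulse sequence for a recognized signal, or None if unknown."""
    return HAPTIC_PATTERNS.get(signal)

def broadcast(signal, team):
    """Queue the same pattern for every team member simultaneously."""
    pattern = encode_signal(signal)
    if pattern is None:
        return {}
    return {member: pattern for member in team}
```

The point of the sketch is the one-to-many fan-out: a single recognized gesture reaches the whole team at once, rather than being relayed one radio transmission at a time.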
The VIS Unit is a self-contained robotic controller with a multimodal display and sensor-alert device for the dismount warfighter. Its intuitive display integrates video and sensor data visually, providing an augmented-reality view that vastly improves the warfighter's local and remote situational awareness. The unit's Inertial Measurement Unit allows it to serve as both a low-level control system and a high-level tasking device. Syncing with external cameras gives warfighters the ability to remotely control any number of unmanned robotic devices and can provide an external line of sight into unknown or unsafe areas. The VIS Unit also includes a GPS receiver, zoom camera, IR camera, and laser range finder, allowing warfighters to receive and transmit information based on their surroundings even in low-visibility conditions. The unit supports interoperability with all classes of unmanned systems and allows for reliable control of any JAUS-compliant payload.
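To illustrate how a GPS fix, compass bearing, and laser-range distance combine into surroundings-based information, the sketch below geolocates a sighted point using a small-distance flat-earth approximation. This is a hypothetical worked example, not the VIS Unit's actual geolocation method:

```python
import math

EARTH_RADIUS_M = 6371000.0  # mean Earth radius

def target_position(lat_deg, lon_deg, bearing_deg, range_m):
    """Estimate the lat/lon of a point sighted from (lat_deg, lon_deg)
    at a compass bearing (degrees from true north) and lasered range (m).

    Valid for ranges small relative to the Earth's radius; longitude is
    scaled by cos(latitude) to account for converging meridians.
    """
    lat = math.radians(lat_deg)
    brg = math.radians(bearing_deg)
    dlat = (range_m * math.cos(brg)) / EARTH_RADIUS_M
    dlon = (range_m * math.sin(brg)) / (EARTH_RADIUS_M * math.cos(lat))
    return math.degrees(lat + dlat), lon_deg + math.degrees(dlon)
```

For example, a target lasered at roughly 111 km due north of the equator lands close to latitude 1.0°, since one degree of latitude spans about 111 km.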
Dynamic Robot Operator Interface Design Assessment, Guidance, and Engineering Tool (DROID AGENT)
The purpose of the Dynamic Robot Operator Interface Design (DROID) Assessment, Guidance, and Engineering Tool (AGENT) is to advance the current state of human-robot interaction (HRI) through the development of a theoretically and empirically based design science grounded in best practices and validated principles of human performance and human-systems integration (HSI). The DROID AGENT supports a multi-disciplinary HRI design process that involves key stakeholders from across domains, including roboticists, HRI specialists, and end users, throughout the design, development, and validation processes within the HRI lifecycle. This effort focuses on empirical research using physiological metrics to define objective assessments of the impact of various interaction design characteristics on human cognitive workload and HRI performance outcomes.
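As one example of the kind of physiological metric such assessments can draw on, RMSSD (the root mean square of successive differences between inter-beat intervals) is a standard heart-rate-variability statistic often used as a proxy for cognitive workload, with suppressed RMSSD suggesting elevated load. The comparison logic and the 20% threshold below are illustrative assumptions, not the DROID AGENT's actual metric suite:

```python
import math

def rmssd(ibi_ms):
    """RMSSD over a list of inter-beat intervals in milliseconds."""
    diffs = [b - a for a, b in zip(ibi_ms, ibi_ms[1:])]
    return math.sqrt(sum(d * d for d in diffs) / len(diffs))

def compare_designs(baseline_ibi, candidate_ibi):
    """Flag the candidate interface if its RMSSD drops markedly vs. baseline.

    The 20% drop threshold is a placeholder for illustration only.
    """
    drop = 1 - rmssd(candidate_ibi) / rmssd(baseline_ibi)
    return "higher workload suspected" if drop > 0.2 else "comparable workload"
```

A study built on such metrics would compare the same operators across candidate interface designs, letting the physiological signal serve as an objective complement to task-performance and self-report measures.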