ImprovAIze
Supported by AAU AI for the People, this bridging project combines machine learning and wearable sensing devices to develop intuitive audiovisual displays that accurately reflect both physical activity and the felt experience of human movement. Tracking human motion is important for a range of activities and applications, from dance and music performance to rehabilitation and human-robot interaction. Wearable physiological sensors capture precise information about muscle activity but provide very little information about how a person feels during that activity. This project combines a real-time interactive audiovisual system with machine learning techniques to develop algorithms that provide additional high-level information about the mover’s emotional and affective states. The goal is to improve algorithms for movement and effort tracking by incorporating people’s felt experience of movement.
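As an illustration only (the feature names, affect labels, and parameter mappings below are assumptions, not the project's actual pipeline), a system of this kind might look roughly like the following sketch: windowed features from a wearable sensor feed a small classifier, and the predicted affect label is mapped to parameters of an audiovisual display.

```python
# Hypothetical sketch: map windowed wearable-sensor features to an affect label,
# then to audiovisual display parameters. All names, labels, and values here are
# illustrative assumptions, not the ImprovAIze pipeline.
import numpy as np
from sklearn.ensemble import RandomForestClassifier

rng = np.random.default_rng(0)

# Stand-in training data: per-window features (e.g. mean EMG amplitude,
# movement speed, jerk) paired with affect labels from self-reports.
X_train = rng.normal(size=(200, 3))
y_train = rng.choice(["calm", "energetic", "strained"], size=200)

clf = RandomForestClassifier(n_estimators=50, random_state=0).fit(X_train, y_train)

# Illustrative mapping from predicted label to audiovisual parameters.
AV_PARAMS = {
    "calm":      {"tempo_bpm": 70,  "brightness": 0.3},
    "energetic": {"tempo_bpm": 130, "brightness": 0.9},
    "strained":  {"tempo_bpm": 90,  "brightness": 0.6},
}

def window_to_av(feature_window: np.ndarray) -> dict:
    """Classify one feature window and return display parameters."""
    label = clf.predict(feature_window.reshape(1, -1))[0]
    return AV_PARAMS[label]

print(window_to_av(rng.normal(size=3)))
```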
PARTICIPANTS: SSH, TECH, ENG
- Elizabeth Jochum, RELATE
- Cumhur Erkut, Dan Overholt, Sofia Dahl, George Palamas, Augmented Performance Lab
- Shaoping Bai, Department of Materials and Production
Watch an earlier soma-based, non-AI exploration of the sensors, presented by Robin Otterbein at MOCO’22.