CONnected through roBOTS: physically coupling humans to boost handwriting and music learning

Funded under: H2020-EU.2.1.1.

Grant agreement ID: 871803

Project Coordinator: UNIVERSITA CAMPUS BIO MEDICO DI ROMA

 

Project Summary

Robots to promote physical communication during handwriting and music learning

The EU-funded CONBOTS project will investigate a paradigm shift that promotes physical communication mediated by robots to enhance handwriting learning in children and music learning in beginner musicians. The project will apply innovative robotic technology, wearable sensors and machine learning algorithms to build a physically interactive robotic platform that connects humans to support the learning of complex sensorimotor tasks. The results of the project will advance the use of robotics in the education field.

 

Project Objective

From a parent coordinating movements to help a child learn to walk, to a violinist rehearsing a concerto, humans rely on physical interaction to learn from each other and from the environment. Building on a strongly multidisciplinary foundation with an integrated approach, CONBOTS proposes a paradigm shift that aims to augment handwriting and music learning through robotics, by creating a physically interactive robotic platform that connects humans in order to facilitate the learning of complex sensorimotor tasks.
The newly designed platform will combine four enabling technologies: i) compact robotic haptic devices that gently interact with the upper limbs; ii) an interactive controller enabling physical communication, integrating differential Game Theory (GT) with an algorithm that identifies the partner's control; iii) a bi-directional user interface encompassing AR-based, application-driven serious games together with a set of wearable sensors and instrumented objects; iv) machine learning algorithms for tailoring learning exercises to the user's physical, emotional, and mental state.
CONBOTS builds on recent neuroscientific findings showing the benefits of performing motor tasks together through physical interaction: the human central nervous system can infer a partner's motor control and use it to improve task performance and motor learning. These findings will be implemented using innovative robotic technology, wearable sensors and machine learning techniques, giving rise to novel human-human and human-robot interaction paradigms applied in two different learning contexts: i) training graphomotor skills in children learning handwriting; ii) augmenting learning performance in beginner musicians.
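To make the idea of identifying a partner's control concrete, here is a minimal toy sketch, not the CONBOTS controller: the 1-D dynamics, the proportional human controller, the least-squares estimator, and all gains are illustrative assumptions. A robot and a human jointly push a shared point mass toward a target; the robot estimates the human's feedback gain from observed behaviour and contributes only the assistance needed to reach a desired combined gain.

```python
# Toy sketch (assumed dynamics and gains, not the CONBOTS controller):
# a robot and a human jointly drive a 1-D point mass toward a target.
# The robot estimates the human's proportional feedback gain by least
# squares and fills only the remaining gap up to a target combined gain.

def simulate(human_gain=4.0, target_gain=6.0, steps=200, dt=0.01):
    x, v = 1.0, 0.0          # position error and velocity of the shared mass
    num = den = 1e-9         # accumulators for the least-squares gain estimate
    est_gain = 0.0
    for _ in range(steps):
        u_h = -human_gain * x            # human: proportional control (assumed)
        num += -u_h * x                  # regress u_h on -x to estimate the gain
        den += x * x
        est_gain = num / den
        u_r = -max(0.0, target_gain - est_gain) * x  # robot adds only what is missing
        a = u_h + u_r - 1.0 * v          # unit mass with viscous damping
        v += a * dt                      # explicit Euler integration
        x += v * dt
    return x, est_gain
```

In this sketch a stronger human partner automatically receives less robotic assistance, which mirrors the complementary human-robot interaction the project describes, though the actual GT-based controller is far richer than a single proportional gain.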
Using its neuroscience-driven, unifying approach to motor learning and physical communication, CONBOTS will expand the impact and application of robotics in the education industry.

 

ARVRtech’s Objective

To design a bi-directional user interface with Augmented Reality serious games, wearable sensors and instrumented objects for maximizing the impact of physical interaction in learning contexts (WP5).

Description: This objective focuses on the design of a novel bi-directional user interface that conveys information to the user through AR-based serious games and gathers information from the user via wearable sensors and instrumented objects. The application-driven serious games are specifically designed to exploit the capability of the CONBOTS platform to establish physical communication between two users. Information from the users is captured by an easy-to-use, non-invasive wearable garment that embeds different kinds of physiological and motion sensors to: i) collect physiological parameters and estimate the user's mental and emotional state, in order to adapt the engagement and difficulty of the performed task; ii) track movements of the upper limbs, in order to measure the user's performance and control physical communication during human-human interaction. In addition, a set of instrumented objects is designed to record how users interact with the environment while manipulating the tools needed to execute tasks in the different application scenarios (e.g. an instrumented pen or bow).
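The adaptation loop described above can be sketched as follows. This is a hedged illustration only: the sensor features, baselines, thresholds, and function names are all assumptions for the sake of the example, not the CONBOTS design.

```python
# Illustrative sketch (assumed features and thresholds, not the CONBOTS
# implementation): derive a crude arousal estimate from wearable-sensor
# readings and use it, together with task performance, to step the
# serious-game difficulty up or down.

def arousal_score(heart_rate_bpm, skin_conductance_uS,
                  hr_baseline=70.0, sc_baseline=2.0):
    """Blend normalized deviations from resting baselines into [0, 1]."""
    hr = max(0.0, (heart_rate_bpm - hr_baseline) / 50.0)
    sc = max(0.0, (skin_conductance_uS - sc_baseline) / 8.0)
    return min(1.0, 0.5 * hr + 0.5 * sc)

def next_difficulty(level, score, performance, low=0.25, high=0.75):
    """Lower difficulty when arousal is high or performance drops;
    raise it when the learner is calm and succeeding."""
    if score > high or performance < 0.4:
        return max(1, level - 1)
    if score < low and performance > 0.8:
        return level + 1
    return level
```

For example, a calm learner (low arousal) completing 90% of strokes correctly would move from level 3 to level 4, while a visibly stressed learner would drop back a level regardless of performance.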

 

Source: CONBOTS Project Information