Beyond Humans: Perception & Understanding of Actions of Others

Ramiro Martin Joly-Mascheroni


This is an open-source project developed with blind individuals from The Royal National Institute for the Blind (UK), the ONCE Foundation (ES), and the Universidad de Buenos Aires (AR). It is a collaborative effort spanning art, neuroaesthetics, artificial intelligence (AI), and robotics. Using an android, we explore the neural mechanisms and motor processes involved in the perception, interpretation, and portrayal of facial gestures and motor actions, which are crucial to social communication and interaction. When the android yawns, it may trigger the urge to yawn in the observer, evoking the same motor action. Robots and artificial agents raise important moral and ethical concerns. Decision-making and moral judgement can be difficult challenges for us humans. How will AI deal with moral dilemmas? Can artificial agents such as androids solve moral dilemmas we cannot solve ourselves? We may even be influenced by, and follow the example of, the ‘morally and ethically unaware’ android we have created.

Photo credit: Beyond Human, Ramiro Joly-Mascheroni & Aline Sardin-Dalmasso.


Ramiro Martin Joly-Mascheroni (IT) is a Psychology PhD student exploring how blind children and adults perceive actions. From a comparative and evolutionary perspective, he investigates how human action perception differs from that of animals. He developed an android as one of a series of tools used by blind individuals and stroke survivors to train, rehabilitate, and regain control of their own facial expressions, and to aid the interpretation of others’ expressions through biofeedback systems.