Description of work
Since the seminal 1998 paper by Botvinick and Cohen [1], it has been well known that synchronous sensory stimuli can induce a human subject to believe that a rubber hand is her own hand. This effect is called embodiment, and it can be measured both objectively and subjectively, using questionnaires.
In the ABI group we have already developed several methods to detect a human subject's intent to move from the signals she produces at her forearm (myocontrol). Here we concentrate on tactile sensing: a wearable tactile bracelet with 320 sensors can be used to reconstruct the user's intention to close her hand, move her fingers, and rotate and flex her wrist.
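To give an idea of how such an intent-detection stage might look, below is a minimal sketch (not the ABI group's actual pipeline) that maps the 320 tactile channels to a few continuous motion intents with ridge regression; the data, motion labels, and parameters are placeholders chosen purely for illustration.

import numpy as np
from sklearn.linear_model import Ridge

rng = np.random.default_rng(0)

n_samples, n_sensors = 500, 320              # one bracelet frame per sample
motions = ["hand close", "finger flexion", "wrist rotation", "wrist flexion"]

# Placeholder training data: tactile frames X and intended activations y in [0, 1];
# in practice these would come from recorded bracelet data and prompted movements.
X = rng.random((n_samples, n_sensors))
y = rng.random((n_samples, len(motions)))

model = Ridge(alpha=1.0).fit(X, y)           # linear map: tactile frame -> intents

new_frame = rng.random((1, n_sensors))       # a new reading from the bracelet
intent = model.predict(new_frame).clip(0.0, 1.0)
for name, value in zip(motions, intent[0]):
    print(f"{name}: {value:.2f}")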
Question: does better myocontrol foster deeper embodiment? If we achieve really good myocontrol, how far can we probe the brain's plasticity? Would a user come to believe she has a third hand? Would she embody a right hand in place of her left one? Your task: implement this idea in our virtual reality setup, then conceive and carry out a comparative experiment, checking whether different sensor settings (more or fewer sensors, different arrangements) improve embodiment or not.
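As a rough illustration of what the comparative analysis could look like, the sketch below compares embodiment questionnaire scores between two hypothetical sensor configurations using a paired t-test; the participant numbers and scores are invented placeholders, and the actual design (conditions, measures, statistics) is for you to work out together with the supervisors.

import numpy as np
from scipy import stats

# Placeholder data: one mean embodiment rating (e.g., 7-point Likert scale)
# per participant and condition, with the same participants in both conditions.
scores_full_bracelet = np.array([5.5, 6.0, 4.5, 6.5, 5.0, 6.0, 5.5, 4.0])  # all 320 sensors
scores_reduced       = np.array([4.0, 5.0, 4.0, 5.5, 4.5, 5.0, 4.5, 3.5])  # fewer sensors

t_stat, p_value = stats.ttest_rel(scores_full_bracelet, scores_reduced)
print(f"paired t-test: t = {t_stat:.2f}, p = {p_value:.3f}")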
Prof. Philip Beckerle of TU Dortmund and Dr. Robin Bekrater-Bodmann of the Central Institute for Pain in Mannheim (see [2,3] for an in-depth discussion of this topic) will assist you in designing the experiment, providing their expertise in psychology, robotics, and the senses of self, agency and ownership.
[1] Botvinick, M. and Cohen, J., Rubber hands "feel" touch that eyes see, Nature 391, 756, 1998.
[2] Beckerle, P., Castellini, C. and Lenggenhager, B., Robotic interfaces for cognitive psychology and embodiment research: a research roadmap, Wiley Interdisciplinary Reviews - Cognitive Science, e1486, 2018.
[3] Beckerle, P., Kõiva, R., Kirchner, E. A., Bekrater-Bodmann, R., Došen, S., Christ, O., Abbink, D. A., Castellini, C. and Lenggenhager, B., Feel-good robotics: requirements on touch for embodiment in assistive robotics, Frontiers in Neurorobotics 12, 2018.