CEEDs

THE COLLECTIVE EXPERIENCE OF EMPATHIC DATA SYSTEMS

Funded by: European Union – FP7-ICT

Start date: 1 Sep. 2010          End date: 28 Feb. 2015

Grant: €8,627,550

Partners: Goldsmiths, University of London, University of Tübingen, Max-Planck-Gesellschaft, University of Augsburg, Pompeu Fabra University, Polytechnic University of Catalonia, University of Helsinki, École Normale Supérieure, The Centre for Research and Technology-Hellas (CERTH), Budapest University of Technology and Economics, Human Inspired Technology (HIT) Research Centre (University of Padova), Electrolux Italia S.p.A., University of Pisa, Leiden University, University of Sussex, University of Teesside, Centre National de la Recherche Scientifique (CNRS)

HTLab Involved People: Luciano Gamberini, Anna Spagnolli, Patrik Pluchino

The CEEDs project aimed at designing and developing a mixed-reality-based system that enhances human abilities in processing information and comprehending complex data sets.
The project introduced pioneering tools for human–computer interaction and confluence. The technological solution consisted of two primary components. The first was a new system that enabled individuals to interact with large visual data sets by means of three projectors that displayed virtual stimuli on large surrounding panels. In this highly immersive environment, users could engage with virtual objects, such as a 3D Connectome representing the brain’s neural connections, a complete 3D magnetic-resonance model, and a neuroscience atlas, using their right arm or whole body, tracked by a motion-capture sensor (Kinect One, © Microsoft Corporation).
The second component of CEEDs involved leveraging implicit information from the early stages of our perceptual and cognitive processing. By monitoring implicit signals like eye behavior and cardiac activity, the system guided users to potential areas of interest and facilitated deep information processing while interacting with neuroscientific visualizations within the mixed-reality environment.

Various non-invasive wearable devices, including eye trackers, smart T-shirts, and gloves, were used to detect individuals’ internal states (e.g., heart rate, pupil dilation, electrodermal activity) as they explored visualizations of large data sets in the mixed-reality virtual environment, referred to as the eXperience Induction Machine (XIM). These measures allowed CEEDs to identify users’ implicit responses to the characteristics of the visual stimuli projected on the large screens. The implicit data were then used to support users in discovering and analyzing meaningful information within the neuroscientific database: the system reduced the amount and complexity of the visual data when it detected mental overload, and it repositioned the Connectome when users appeared disoriented while zoomed into the 3D reconstruction of the brain.
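The adaptation logic described above can be illustrated with a minimal sketch. This is a hypothetical reconstruction, not the CEEDs implementation: the signal names, thresholds, and action labels are illustrative assumptions about how implicit measures might be mapped to visualization adjustments.

```python
# Hypothetical sketch of an implicit-signal adaptation loop.
# All thresholds and field names are illustrative assumptions,
# not values from the CEEDs system.
from dataclasses import dataclass

@dataclass
class ImplicitState:
    heart_rate: float        # beats per minute
    pupil_dilation: float    # normalized 0..1
    gaze_dispersion: float   # normalized 0..1; high = scattered gaze

def choose_adaptation(state: ImplicitState, zoomed_in: bool) -> str:
    """Map implicit signals to a visualization adjustment.

    Returns "reduce_detail" when mental overload is suspected,
    "reposition_view" when the user seems disoriented while zoomed
    into the 3D model, and "none" otherwise.
    """
    # Placeholder overload heuristic: elevated heart rate plus dilated pupils.
    overloaded = state.heart_rate > 95 and state.pupil_dilation > 0.7
    # Placeholder disorientation heuristic: scattered gaze while zoomed in.
    disoriented = zoomed_in and state.gaze_dispersion > 0.6
    if overloaded:
        return "reduce_detail"
    if disoriented:
        return "reposition_view"
    return "none"
```

The key design idea carried over from the text is that the user never issues an explicit command: the system observes physiological and gaze signals and adjusts the visualization on its own.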

References:

Negri, P., Omedas, P., Chech, L., Pluchino, P., Minelle, F., Verschure, P. F., … & Gamberini, L. (2015). Comparing input sensors in an immersive mixed-reality environment for human-computer symbiosis. In Symbiotic Interaction: 4th International Workshop, Symbiotic 2015, Berlin, Germany, October 7-8, 2015, Proceedings 4 (pp. 111-125). Springer International Publishing.

Negri, P., Gamberini, L., & Cutini, S. (2014). A review of the research on subliminal techniques for implicit interaction in symbiotic systems. In Symbiotic Interaction: Third International Workshop, Symbiotic 2014, Helsinki, Finland, October 30-31, 2014, Proceedings 3 (pp. 47-58). Springer International Publishing.

Spagnolli, A., Guardigli, E., Orso, V., Varotto, A., & Gamberini, L. (2014). Measuring user acceptance of wearable symbiotic devices: validation study across application scenarios. In Symbiotic Interaction: Third International Workshop, Symbiotic 2014, Helsinki, Finland, October 30-31, 2014, Proceedings 3 (pp. 87-98). Springer International Publishing.

Lessiter, J., Freeman, J., Miotto, A., & Ferrari, E. (2014). Ghosts in the machines: towards a taxonomy of human computer interaction. In Symbiotic Interaction: Third International Workshop, Symbiotic 2014, Helsinki, Finland, October 30-31, 2014, Proceedings 3 (pp. 21-31). Springer International Publishing.

Gamberini, L., Carlesso, C., Seraglia, B., & Craighero, L. (2013). A behavioural experiment in virtual reality to verify the role of action function in space coding. Visual Cognition, 21(8), 961-969.

Seraglia, B., Priftis, K., Cutini, S., & Gamberini, L. (2012). How tool use and arm position affect peripersonal space representation. Cognitive Processing, 13, 325-328.

Seraglia, B., Gamberini, L., Priftis, K., Scatturin, P., Martinelli, M., & Cutini, S. (2011). An exploratory fNIRS study with immersive virtual reality: a new method for technical implementation. Frontiers in Human Neuroscience, 5, 176.

© Copyright 2023 Human Technology Lab - All Rights Reserved.