The MULTISIMO project has developed a multimodal corpus targeting the investigation and modeling of collaborative aspects of multimodal behavior in groups performing simple tasks.
The corpus consists of a set of human-human interactions recorded in multiple modalities. In each interactive session, two participants collaborate to solve a quiz with the assistance of a human facilitator. The corpus has been transcribed and annotated with information related to verbal and non-verbal signals, and it includes survey materials, i.e. personality tests and experience assessment questionnaires filled in by all participants. This dataset addresses multiparty collaborative interactions and aims to provide unique, hard-to-find data for measuring collaboration and task success based on the integration of the related multimodal information and the personality traits of the participants, as well as for modeling the multimodal strategies that members of a group employ to discuss and collaborate with each other.
To download the dataset, please fill out the form below accurately. Once it is submitted, you will receive the download link at the e-mail address you have provided.