
The DREAM Dataset: Supporting a data-driven study of autism spectrum disorder and robot enhanced therapy

  • Erik Billing,

    Roles Conceptualization, Data curation, Project administration, Software, Supervision, Validation, Visualization, Writing – original draft

    erik.billing@his.se

    Affiliation University of Skövde, Skövde, Sweden

  • Tony Belpaeme,

    Roles Methodology, Project administration, Supervision, Writing – review & editing

    Affiliations University of Plymouth, Plymouth, United Kingdom, IDLab - imec, Ghent University, Ghent, Belgium

  • Haibin Cai,

    Roles Investigation, Methodology, Software

    Affiliation University of Portsmouth, Portsmouth, United Kingdom

  • Hoang-Long Cao,

    Roles Methodology, Software

    Affiliations Vrije Universiteit Brussel, Brussel, Belgium, Flanders Make, Lommel, Belgium

  • Anamaria Ciocan,

    Roles Investigation, Validation

    Affiliation Universitatea Babeş-Bolyai, Cluj-Napoca, Romania

  • Cristina Costescu,

    Roles Investigation, Methodology

    Affiliation Universitatea Babeş-Bolyai, Cluj-Napoca, Romania

  • Daniel David,

    Roles Funding acquisition, Investigation, Methodology, Supervision, Validation

    Affiliation Universitatea Babeş-Bolyai, Cluj-Napoca, Romania

  • Robert Homewood,

    Roles Data curation, Visualization

    Affiliation University of Skövde, Skövde, Sweden

  • Daniel Hernandez Garcia,

    Roles Software, Writing – review & editing

    Affiliation University of Plymouth, Plymouth, United Kingdom

  • Pablo Gómez Esteban,

    Roles Software

    Affiliations Vrije Universiteit Brussel, Brussel, Belgium, Flanders Make, Lommel, Belgium

  • Honghai Liu,

    Roles Conceptualization, Data curation, Funding acquisition, Software, Supervision, Writing – review & editing

    Affiliation University of Portsmouth, Portsmouth, United Kingdom

  • Vipul Nair,

    Roles Data curation, Validation

    Affiliation University of Skövde, Skövde, Sweden

  • Silviu Matu,

    Roles Investigation, Methodology, Validation

    Affiliation Universitatea Babeş-Bolyai, Cluj-Napoca, Romania

  • Alexandre Mazel,

    Roles Conceptualization, Resources

    Affiliation SoftBank Robotics, Paris, France

  • Mihaela Selescu,

    Roles Investigation

    Affiliation Universitatea Babeş-Bolyai, Cluj-Napoca, Romania

  • Emmanuel Senft,

    Roles Data curation, Software, Validation, Writing – review & editing

    Affiliation University of Plymouth, Plymouth, United Kingdom

  • Serge Thill,

    Roles Conceptualization, Funding acquisition, Methodology, Writing – review & editing

    Affiliations University of Skövde, Skövde, Sweden, Donders Institute for Brain, Cognition, and Behavior, Radboud University, Nijmegen, The Netherlands

  • Bram Vanderborght,

    Roles Conceptualization, Funding acquisition, Methodology, Writing – review & editing

    Affiliations Vrije Universiteit Brussel, Brussel, Belgium, Flanders Make, Lommel, Belgium

  • David Vernon,

    Roles Funding acquisition, Methodology, Project administration, Software

    Affiliation University of Skövde, Skövde, Sweden

  • Tom Ziemke

    Roles Conceptualization, Funding acquisition, Project administration, Supervision, Writing – review & editing

    Affiliations University of Skövde, Skövde, Sweden, Linköping University, Linköping, Sweden


Abstract

We present a dataset of behavioral data recorded from 61 children diagnosed with Autism Spectrum Disorder (ASD). The data was collected during a large-scale evaluation of Robot Enhanced Therapy (RET). The dataset covers over 3000 therapy sessions and more than 300 hours of therapy. Half of the children interacted with the social robot NAO, supervised by a therapist. The other half, constituting a control group, interacted directly with a therapist. Both groups followed the Applied Behavior Analysis (ABA) protocol. Each session was recorded with three RGB cameras and two RGBD (Kinect) cameras, providing detailed information about the children’s behavior during therapy. This public release of the dataset comprises body motion, head position and orientation, and eye gaze variables, all specified as 3D data in a joint frame of reference. In addition, metadata including participant age, gender, and autism diagnosis (ADOS) variables are included. We release this data with the hope of supporting further data-driven studies towards improved therapy methods as well as a better understanding of ASD in general.

1 Introduction

Children diagnosed with Autism Spectrum Disorder (ASD) typically suffer from widespread difficulties in social interaction and communication, and they exhibit restricted interests and repetitive behavior [1]. ASD is referred to as a spectrum disorder because the type and severity of the symptoms vary significantly between individuals. At one pole, one finds mild difficulties in social interaction and communication, such as problems initiating and maintaining a conversation, integrating verbal and nonverbal communication, and adapting behavior to varying contexts, together with some behavioral rigidity. The opposite pole is characterized by severe deficits in verbal and nonverbal communication, a low level of social initiation, absence of peer interest, strong behavioral inflexibility, and restricted/repetitive behaviors [1].

Behavioral and psychosocial interventions are the main approach to the treatment of ASD, while medication is sometimes prescribed to control associated symptoms or other comorbid problems [2]. Behavioral and psychosocial interventions aim to facilitate the development and adaptation of children by teaching them appropriate social and communication skills, according to their developmental age. These interventions vary in terms of how structured the therapeutic activities are (e.g., naturalistic play or a pre-established activity), who delivers the intervention (e.g., a trained therapist or a parent), and the degree to which the child is required to follow a set curriculum or the curriculum is developed around the child’s preferences and interests [3, 4].

The therapeutic intervention with the most consistent empirical support to date is Applied Behavior Analysis (ABA) [5]. ABA is a structured intervention following behavioral learning principles, in which reinforcements are manipulated to increase the frequency of desired behaviors and decrease the frequency of maladaptive ones. Discrete trial training (DTT) is a common method employed in ABA treatments: the child is presented with a discriminative stimulus for a specific behavior (e.g., an instruction from the therapist) and receives a reward if he or she performs the expected behavior. If he or she does not, the therapist might correct the behavior by offering a demonstration or a prompt [6]. To be effective, ABA therapies need to be both intensive and extensive, and are thus associated with significant effort from both the patients and the therapists providing the treatment.

An alternative form of therapy that has received much attention over the last decade is Robot Assisted Therapy (RAT) [7–11], sometimes referred to as Robot Enhanced Therapy (RET) [12–16]. While RAT refers to a wide spectrum of approaches to autism therapy involving robots in one way or another, the notion of RET is used in a narrower sense and refers to therapies following an ABA protocol in which a humanoid robot constitutes an interaction partner. Both RAT and RET typically involve triadic interactions, comprising the child, a robot, and an adult, e.g., a therapist.

In RET interventions built on ABA principles, the robot guides the child through a game-like activity intended to develop a behavior that is relevant for social communication, while the therapist supervises the interaction. The robot acts as a model by performing the desired behavior, or as a discriminative stimulus, by giving verbal or non-verbal instructions. The robot also acts as a source of social reinforcement by providing positive or negative feedback on the child’s performance. The justification for using a robot in this form of treatment rests on empirical findings indicating that children with ASD learn social behaviors from these interactions and might be more motivated to participate in the intervention because of the robot’s presence [17, 18].

Robots have also been proposed as a means for screening, diagnosis, and improved understanding of ASD [19, 20], a potential that is still not fully exploited because the majority of research on RAT and RET takes the form of small-scale or single-case studies, lacking the methodological rigor required to make the data applicable in clinical domains [12, 21].

Within the European research project DREAM—Development of Robot-Enhanced therapy for children with Autism spectrum disorders [22], we have conducted a large-scale clinical evaluation of RET, involving 61 children (9 female) between 3 and 6 years of age. Thirty of the children interacted with the humanoid robot NAO [23] (RET group), and the remaining 31 participants received standard human treatment (SHT group). The clinical efficacy of RET was tested in a randomized clinical trial design, with a study protocol consisting of an initial assessment, eight bi-weekly personalized behavioral interventions, and a final assessment. Each intervention targeted three social skills: imitation, joint attention, and turn-taking.

All therapies were recorded using a sensorized intervention table able to record and interpret the child’s behavior during the intervention (analyzing, for example, eye gaze, body movement, and facial expression) [24]. The table was developed to inform the control of a supervised-autonomous robot used in RET [13], but was also used to support assessment and analysis of SHT. A total of 3121 therapy sessions were recorded, covering 306 hours of therapy.

While the clinical results from the evaluation are in the process of being published elsewhere, we here present a public release of the DREAM dataset, made available for download by the Swedish National Data Service, https://doi.org/10.5878/17p8-6k13. In accordance with the ethical approval and agreements with caregivers, this public release does not comprise any primary data from the study. Primary data refers to direct measurements, e.g., video and audio recordings, of children in therapy. Instead, this public release comprises secondary data that does not reveal the identity of the children. Secondary data refers to measurements processed from primary data, including 3D skeleton reconstructions and eye-gaze vectors.

Further background on data-driven studies of autism and relevant datasets is presented in Section 2, followed by a presentation of the clinical evaluation from which this dataset was gathered (Section 3). Details of the DREAM dataset are provided in Section 4. Finally, the paper concludes with a discussion in Section 5.

2 Background

Diagnosis of ASD involves assessing children’s behavior with respect to their social initiations and responses, their joint attention episodes, their social play, and their repetitive and stereotypic movements [1]. This involves, for example, attention to the patient’s eye gaze, facial expressions, and hand movements at specific points in time. While this is very difficult for a novice, therapists who know the protocol well are trained to observe and identify these expressions.

Considering the large effort involved in the diagnosis and treatment of ASD, there is an urgent need to better understand the autism spectrum and to develop new methods and tools to support patients, caregivers, and therapists [25]. One initiative was made by Thabtah [26], who developed a mobile application for screening of ASD based on DSM-5 [1] and two questionnaire-based screening methods, AQ and Q-CHAT [27, 28]. While this is far from the only mobile application for screening of ASD, we believe this initiative stands out because, in contrast to several other applications, it is supported by published research and parts of the underlying databases are shared publicly [29]. Such datasets, covering for example traits, characteristics, diagnoses, and prognoses of individuals diagnosed with ASD, could be important assets, and are still very rare.

Mobile applications can be excellent tools, for example during screening, not least because they are very accessible to the broader population. However, complete diagnosis and treatment require more information, and other forms of interaction, than can be achieved with a mobile application. For example, coverage of the patient’s behavior and social interactions is a critical component both for improved understanding of ASD and for the development of new tools.

One example that clearly demonstrates the value of data-driven analysis of ASD is the work by Anzulewicz et al. [30]. The authors report a computational analysis of movement kinematics and gesture forces recorded from 82 children between 3 and 6 years old, 37 of whom were diagnosed with autism. The analysis revealed systematic differences in force dynamics within the ASD group, compared to the typically developing children included in the study. Unfortunately, this dataset has not been released publicly.

Moving outside the autism spectrum, there are a couple of relevant datasets focusing on social interaction. One such example is the Tower Game Dataset [31], comprising multi-modal recordings from 39 adults engaged in a tower building game. A total of 112 annotated sessions were collected, with an average length of three minutes, focusing specifically on rich dyadic social interactions. Similar to the dataset presented here, the Tower Game Dataset contains body skeleton and eye-gaze estimates. Additionally, the dataset is manually annotated with so-called Essential Social Interaction Predicates (ESIPs). The authors promise that “a dataset visualization software […] is available and will be released with the dataset” [31], but unfortunately the dataset does not appear to be publicly available online.

Another dataset covering social interaction is the Multimodal Dyadic Behavior Dataset (MMDB) [32, 33]. This dataset comprises audio and video recordings from semi-structured play between one adult and one child aged 1 to 2 years. To date, 160 sessions of 5-minute interactions from 121 children have been released. Videos are annotated automatically for gaze shifts, smiling, play gestures, and engagement. An attractive aspect of this dataset is that the raw data streams are provided, recorded with a rich setup of 13 RGB cameras, one Kinect (RGBD) camera, 3 microphones, and 4 Affectiva Q-sensors for electrodermal activity and accelerometry, worn by both the adult and the child.

Focusing instead on human-robot interaction, the UE-HRI dataset [34] is a recent example. It includes audio and video recordings of 54 adult participants engaged in spontaneous dialogue with the social robot Pepper. The interactions took place in a public space, and include both one-to-one and multi-party interactions.

To our knowledge, the only public dataset covering children’s interaction with robots is the PInSoRo dataset [35]. This dataset concerns typically developing children not associated with autism, but shares a similar ambition to support data-driven studies of interaction. PInSoRo covers 45 hours of RGB video recordings, 3D recordings of the faces, skeletal information, audio recordings, as well as game interactions. In addition, the dataset comprises manual annotations of, for example, task engagement, social engagement, and attitude.

In sum, several datasets related to the study of social interaction, human-robot interaction, and autism can be found in the literature. Some are also released publicly, but none of them reach the same size as the dataset we present here. While there are other datasets with a similar, or even richer, set of features, none of these cover children diagnosed with ASD. Under the label Behavior Imaging, Rehg et al. [36, p. 87] explicitly argue for the need for such a dataset:

We believe this approach can lead to novel, technology-based methods for screening children for developmental conditions such as autism, techniques for monitoring changes in behavior and assessing the effectiveness of intervention, and the real-time measurement of health-related behaviors from on-body sensors that can enable just-in-time interventions.

Since children diagnosed with ASD are often sensitive to new clothing and wearable equipment, we consciously avoided on-body sensors. In other respects, however, we hope that the present work constitutes an important step towards the data-driven study of autism outlined by Rehg et al. [36].

3 Clinical evaluation and protocol

The clinical evaluation of RET, from which the present dataset is gathered, was conducted between March 2017 and August 2018 at three different locations in Romania. Seventy-six children, aged 3 to 6 years, were recruited to the study, of which 70 met the inclusion criteria and were randomly assigned to one of two conditions, RET or SHT. Participants in both groups went through a protocol of initial assessment, eight interventions, and a final assessment. The effect of the treatment was assessed using the Autism Diagnostic Observation Schedule (ADOS), in terms of the difference between the initial and final assessments [37]. Nine children did not continue the treatment beyond the initial diagnosis, e.g., as a result of high skill performance, leaving 61 children with an initial ADOS score between 7 and 20 in the study (RET n = 30, SHT n = 31). A letter of consent was signed by at least one parent before initiating the study, expressing their consent to record the assessment and intervention sessions and to use the data and recordings for scientific purposes in an anonymous fashion. The clinical study in which this data was collected received prior ethical approval from the Scientific Council of Babes-Bolyai University in Cluj-Napoca, Romania, where the trial was conducted (record no. 30664/February 10th, 2017). The clinical trial was pre-registered in the U.S. National Library of Medicine database (ClinicalTrials.gov) under the number NCT03323931.

The therapy environment followed the two configurations illustrated in Fig 1. The two configurations (RET and SHT) were designed to be as similar as possible, with the interaction partner constituting the primary difference. A therapist was present during both conditions, seated at the side of the table. A picture from the RET condition is shown in Fig 2. The exact setup varied slightly between tasks. Some tasks made use of a touch screen placed between the child and the interaction partner, referred to as a sandtray [38]. Other tasks used the plain table, as illustrated in Fig 1.

Fig 1. Configuration of the therapy environment during the two conditions used.

The child interacts with either a humanoid robot (RET, left) or a therapist (SHT, right).

https://doi.org/10.1371/journal.pone.0236939.g001

Fig 2. Example of the therapy environment.

Red axes show the orientation of the joint coordinate system used for all data in the DREAM dataset.

https://doi.org/10.1371/journal.pone.0236939.g002

Each intervention targeted three basic social skills that have been previously shown to be affected in individuals on the spectrum, namely imitation [39], joint-attention [40] and turn-taking in collaborative play [41]. The intervention had the same structure across all skills and was employed during both RET and SHT:

  1. the interaction partner (robot or human) provided a discriminative stimulus (i.e., an instruction to perform a behavior that is relevant for a particular skill);
  2. the interaction partner waited for the response of the child;
  3. the interaction partner offered feedback that was contingent on the behavior of the child, namely positive feedback if the behavior matched the one that was expected, or an indication to try again if the performance was below expectation.

For each discriminative stimulus, the child had three attempts to perform the behavior, each trial following the sequence above. If the child failed to perform the behavior on the last attempt, the therapist offered a behavioral prompt.
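
To make the trial structure concrete, the following is a minimal sketch of this discrete-trial loop in Python. All names used here (run_trial, give_stimulus, wait_for_response, give_feedback, offer_prompt) are hypothetical placeholders introduced for illustration; they are not part of the released DREAM software.

```python
MAX_ATTEMPTS = 3  # each discriminative stimulus allows up to three attempts

def run_trial(partner, therapist, target_behavior):
    """Sketch of one discrete trial as described above.

    `partner` is the interaction partner (robot or human therapist);
    all method names are hypothetical placeholders for illustration only.
    """
    for attempt in range(MAX_ATTEMPTS):
        partner.give_stimulus(target_behavior)     # 1. discriminative stimulus
        response = partner.wait_for_response()     # 2. wait for the child's response
        if response == target_behavior:            # 3. contingent feedback
            partner.give_feedback(positive=True)
            return True                            # expected behavior performed
        partner.give_feedback(positive=False)      # indication to try again
    therapist.offer_prompt(target_behavior)        # behavioral prompt after the last failed attempt
    return False
```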

Each intervention was divided into three to six parts, following a task script. This script specifies the task used during the intervention, the instructions given to the child by the interaction partner (human or robot), as well as the actions made by the interaction partner. In the RET condition, the robot follows the script automatically, supervised by a second therapist (the supervisor) sitting behind the child (cf. Fig 1). The supervisor’s role is to monitor the automatic interpretation of the child’s behavior and to adjust the robot’s responses if necessary. In the SHT condition, the supervisor is not present and the human interaction partner follows the script manually. Twelve unique intervention scripts were used, specifying different exercises and three difficulty levels. As the child reached maximum performance on one level, he/she moved to the next one.

For imitation there were three scripts: 1) imitation of objects (e.g., the child had to imitate a common way of playing with a toy car); 2) imitation of common gestures (e.g., waving and saying goodbye); and 3) imitation of gestures without a particular meaning (e.g., moving the hand into a position that does not have any common reference). In each imitation script, the interaction partner performed the move first and asked the child to do the same. The child received positive feedback if he/she accurately imitated the behavior of the interaction partner.

For joint-attention there were also three levels of difficulty, varying in the number of cues offered by the interaction partner: 1) pointing and looking at an object placed in front of the child while also giving a verbal cue (i.e., “look”); 2) pointing and looking at an object without verbal cues; and 3) just looking at an object. The objects for this task were displayed as pictures on the sandtray placed in front of the child. The child received positive feedback if he/she followed the cues and looked at the object indicated by the interaction partner.

For turn-taking there were three different types of tasks, each with two levels of difficulty. One task focused on sharing information about what one likes most, by choosing from five pictograms displayed at once. The two levels differed in the complexity of the pictograms (e.g., a simple color vs. an activity). Another task focused on categorizing objects. In the first level of difficulty only one object to be categorized was displayed at a time, while in the second level eight such objects were displayed, and the child had to choose one and move it to the correct category. The third task consisted of completing a series of pictures arranged in a pattern. In level one the child had to choose, from two pictures, the one that continued the pattern, while in the second level the child had to continue the pattern by choosing from four pictures. All turn-taking tasks were performed using the sandtray. The interaction partner and the child took turns performing moves on the sandtray (e.g., choosing a favorite color). The child received a good performance rating and positive feedback if he/she waited without touching the screen while the interaction partner performed a move.

As mentioned above, the clinical protocol included an initial assessment, eight interventions, and a final assessment. The first and last assessments combined an ADOS evaluation with an evaluation of pre- and post-test performance in imitation, joint-attention, and turn-taking. In the pre- and post-tests, the interaction partner did not provide any feedback and the therapist did not offer any prompt; only behavioral performance was measured.

3.1 Sensors and setting

All therapy sessions were recorded using the same sensorized therapy table [24]. The table was equipped with three high-resolution RGB cameras and two RGBD (Kinect) cameras that, in combination with state-of-the-art sensor interpretation methods, provide information about the child’s position, motion, eye gaze, facial expressions, and verbal utterances. In addition, the table captured the presence and location of objects used in the therapy. A range of different algorithms was employed to compute these perceptions. A complete list of sensor primitives and associated methods is provided in Table 1. Note that only a subset of these features is included in the public dataset; see Section 4 for details.

Table 1. Sensor primitives extracted by the sensorized intervention table.

https://doi.org/10.1371/journal.pone.0236939.t001

The data gathered by the sensorized table was collected in real time with a temporal resolution of 25 Hz and was used to guide the behavior of the robot in the RET condition, following the session script. In addition, recorded data was used off-line to support analysis and assessment of the child’s progress through therapy, in both RET and SHT conditions. The table works without sensors placed on the child and without individual calibration, both of which are problematic to employ in therapy with autistic children.

While sensor data was used to guide the robot’s behavior on-line, robot responses were kept consistent throughout the study, i.e., the robot did not learn from previous interactions with the child. Instead, suitable task difficulty was achieved through the session scripts as described above, combined with supervised autonomy ensuring reliable robot behavior even in cases where the system failed to correctly assess the child’s actions [13]. While this architecture could effectively be combined with robot learning [49], we here chose a static system in order to increase the validity of the clinical study.

4 Open dataset

The dataset resulting from the clinical evaluation presented above comprises a total of 306 hours of therapy. Forty-one of the 61 children finished all eight interventions and the final assessment. The remaining participants finished an average of 4.5 interventions, i.e., just above half of the complete protocol. The total length of each intervention varied from 3 to 87 minutes as a result of script length and child behavior, with a median duration of 32 minutes. Average intervention durations for each of the two conditions (RET, SHT) are presented in Fig 3. To our knowledge, this is the largest dataset of autism therapy involving robots, and probably also the largest recorded dataset of children interacting with robots in general.

Fig 3. Average duration of each intervention in the two conditions over the complete protocol, comprising initial assessment, 8 interventions, and a final assessment.

Envelopes represent the 95% confidence interval of the mean.

https://doi.org/10.1371/journal.pone.0236939.g003

As mentioned in the introduction, this dataset does not comprise any direct measurements, i.e., raw data, from the conducted therapies. However, given that the RET framework involves processing of sensory data into higher-level perceptions (see Section 3.1), we have access to comprehensive secondary data from these recordings. Variables related to the clinical evaluation, concerning for example the child’s performance during therapy, cannot however be released yet; this type of information may be included in a later release of the dataset. Other variables, such as facial expressions, have relatively low reliability and were excluded from the dataset for this reason. Finally, any variables that may reveal the child’s identity have been excluded from the dataset.

The following data has been selected for inclusion in the public release of the DREAM dataset:

  1. Child ID (numerical index),
  2. Child’s gender,
  3. Child’s age in months,
  4. 3D skeleton comprising joint positions for upper body,
  5. 3D head position and orientation,
  6. 3D eye gaze vectors,
  7. Therapy condition (RET or SHT),
  8. Therapy task (Joint attention, Imitation, or Turn-taking),
  9. Date and time of recording,
  10. Initial ADOS scores.

With the ambition of releasing the dataset in an easily accessible, well-specified, and commonly used file format, JavaScript Object Notation (JSON) was selected. JSON is a lightweight data format derived from the JavaScript programming language and intended for data representation, https://www.json.org. JSON has many of the attractive attributes found in XML, including standard libraries for most programming languages, validation patterns, and human readability. However, JSON is less verbose than XML and includes standard notation for arrays, making it much more suitable for storing numeric data.
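
As an illustration, a session file can be read with a few lines of Python. This is a minimal sketch; the key names used below ("condition", "task", "skeleton") are assumptions made for illustration and should be checked against the JSON Schema in Appendix A.

```python
import json

# Minimal sketch: load one therapy session from the DREAM dataset.
# NOTE: the key names below are illustrative assumptions; consult the
# JSON Schema in Appendix A for the actual field names.
with open("session.json") as f:       # path to one downloaded session file
    session = json.load(f)

print(session.get("condition"))        # assumed key: "RET" or "SHT"
print(session.get("task"))             # assumed key: therapy task
frames = session.get("skeleton", [])   # assumed key: per-frame upper-body joint positions
print(f"{len(frames)} skeleton frames in this session")
```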

An example structure of the DREAM dataset JSON format is included in Table 2. The complete format is specified by the JSON Schema found in Appendix A. An important property of this dataset is that all spatial attributes are defined in a common frame of reference using a Cartesian coordinate system. The orientation of the Cartesian space in relation to the therapy environment is visualized in Fig 2.

Table 2. Example structure of the open DREAM dataset, in JSON format.

“[…]” corresponds to numeric arrays that are too long to include here.

https://doi.org/10.1371/journal.pone.0236939.t002
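
Because all spatial variables share this common frame, relations between them can be computed directly, for example the angle between the child’s gaze direction and the direction from the head to the interaction partner. The sketch below illustrates such a computation; the numeric values are made-up examples and not taken from the dataset.

```python
import numpy as np

def gaze_angle_to_target(head_pos, gaze_vec, target_pos):
    """Angle (degrees) between a gaze vector and the direction towards a target.

    All arguments are 3D coordinates/vectors expressed in the joint
    coordinate frame shown in Fig 2.
    """
    to_target = np.asarray(target_pos, dtype=float) - np.asarray(head_pos, dtype=float)
    gaze = np.asarray(gaze_vec, dtype=float)
    cos_angle = np.dot(gaze, to_target) / (np.linalg.norm(gaze) * np.linalg.norm(to_target))
    return float(np.degrees(np.arccos(np.clip(cos_angle, -1.0, 1.0))))

# Hypothetical example values (not taken from the dataset):
angle = gaze_angle_to_target(head_pos=[0.0, 0.3, 0.6],
                             gaze_vec=[0.1, 0.8, -0.2],
                             target_pos=[0.0, 1.2, 0.5])
print(f"Gaze deviates {angle:.1f} degrees from the interaction partner")
```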

4.1 Licence

The Open DREAM dataset is licensed under the Creative Commons Attribution-NonCommercial-ShareAlike 4.0 International Licence [50].

This licence permits copying and redistribution of the material in any medium or format and states that the licensor cannot revoke these freedoms as long as you follow the licence terms. The material here refers to all secondary data and metadata included in this public release of the dataset, excluding its underlying recordings or direct measurements. These freedoms are given under the following terms:

  1. Attribution—You must give appropriate credit, provide a link to the licence, and indicate if changes were made. You may do so in any reasonable manner, but not in any way that suggests the licensor endorses you or your use.
  2. NonCommercial—You may not use the material for commercial purposes.
  3. ShareAlike—If you remix, transform, or build upon the material, you must distribute your contributions under the same licence as the original.
  4. No additional restrictions—You may not apply legal terms or technological measures that legally restrict others from doing anything the licence permits.

Note:

  • You do not have to comply with the licence for elements of the material in the public domain or where your use is permitted by an applicable exception or limitation.
  • No warranties are given. The licence may not give you all of the permissions necessary for your intended use. For example, other rights such as publicity, privacy, or moral rights may limit how you use the material.

4.2 Data visualization

An important aspect of any public dataset is to make it accessible, and understandable, to a larger audience. In addition to the technical specification provided in Appendix A, we release a visualization tool named DREAM Data Visualizer. This tool runs directly in the web browser and comprises a 3D environment where the user can play back each intervention and view the child’s movements in relation to the interaction partner.

The DREAM Data Visualizer is released as open source under the GNU GPL and is available for download at github.com/dream2020/DREAM-data-visualizer. An example screenshot from the DREAM Data Visualizer is presented in Fig 4.

Fig 4. Visualization of the DREAM dataset, including 3D skeleton of upper body and head rotations.

https://doi.org/10.1371/journal.pone.0236939.g004

5 Discussion

Although the global prevalence of autism is difficult to assess [51] and its exact origin has not been determined [5, 52], autism is becoming an increasingly common diagnosis worldwide. This affects not only the people receiving a diagnosis, but also constitutes a significant cost for society [30]. RAT/RET has been put forward as a potentially cost-effective treatment, but still lacks the large-scale clinical trials and longitudinal studies needed to assess its effects [12, 15, 17].

In the present work, we present a dataset covering behavioral data recorded during therapeutic interventions with 61 children. The dataset comprises a rich set of features which we believe are essential for understanding and assessing children’s behavior during therapy. By providing a large set of data, comprising 61 children with varying degrees of ASD taking part in more than 3000 therapy sessions, we hope that the DREAM dataset can constitute an important asset for future studies in the field.

The dataset may, for example, be used in studies employing machine learning or artificial intelligence to find patterns in behavioral data. Such patterns may guide further clinical studies by providing new insights into how to appropriately select between RET and traditional ABA therapies, or constitute input to new therapeutic methods.

Data Availability

The complete dataset is available for download through the Swedish National Data Service, https://doi.org/10.5878/17p8-6k13. Sample data and usage instructions can be accessed at https://github.com/dream2020/data. In addition, source code for the DREAM RET System [24], with which the present dataset was gathered, is made available at https://www.doi.org/10.5281/zenodo.3571992.

Appendix A: Dataset specification

This is a JSON schema for the open DREAM dataset presented in Section 4. The schema presented here specifies the format for a single therapy session with one child under therapy, including definitions of all mandatory attributes of the data. A JSON database file may however comprise additional attributes not defined here. The complete dataset comprises a large set of these sessions.
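
As the schema is machine-readable, downloaded session files can also be validated programmatically. The sketch below uses the third-party Python package jsonschema together with a heavily reduced, hypothetical stand-in for the full schema; for real validation, the actual schema from the dataset release should be used.

```python
import json
from jsonschema import validate  # third-party package: pip install jsonschema

# Heavily reduced, hypothetical stand-in for the full DREAM schema.
# The required fields listed here are assumptions for illustration only.
SESSION_SCHEMA = {
    "type": "object",
    "required": ["condition", "task"],
    "properties": {
        "condition": {"type": "string"},
        "task": {"type": "string"},
    },
}

with open("session.json") as f:
    session = json.load(f)

validate(instance=session, schema=SESSION_SCHEMA)  # raises ValidationError on mismatch
print("Session file matches the (reduced) schema")
```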

References

  1. American Psychiatric Association. Diagnostic and Statistical Manual of Mental Disorders (DSM-5®). Washington, D.C.: American Psychiatric Publishing; 2013.
  2. De Filippis M, Wagner KD. Treatment of autism spectrum disorder in children and adolescents. Psychopharmacology Bulletin. 2016;46(2):18–41.
  3. Lord C, Elsabbagh M, Baird G, Veenstra-Vanderweele J. Autism spectrum disorder. The Lancet. 2018;392(10146):508–520.
  4. Narzisi A, Costanza C, Umberto B, Filippo M. Non-Pharmacological Treatments in Autism Spectrum Disorders: An Overview on Early Interventions for Pre-Schoolers. Current Clinical Pharmacology. 2014;9(1). pmid:24050743
  5. Smith T, Iadarola S. Evidence Base Update for Autism Spectrum Disorder. Journal of Clinical Child and Adolescent Psychology. 2015;44(6):897–922. pmid:26430947
  6. Roane HS, Fisher WW, Carr JE. Applied Behavior Analysis as Treatment for Autism Spectrum Disorder. Journal of Pediatrics. 2016;175:27–32. pmid:27179552
  7. Dautenhahn K, Werry I. Towards interactive robots in autism therapy: Background, motivation and challenges. Pragmatics & Cognition. 2004;12(1):1–35.
  8. Diehl JJ, Schmitt LM, Villano M, Crowell CR. The clinical use of robots for individuals with Autism Spectrum Disorders: A critical review. Research in Autism Spectrum Disorders. 2012;6(1):249–262. pmid:22125579
  9. Thill S, Pop CA, Belpaeme T, Ziemke T, Vanderborght B. Robot-Assisted Therapy for Autism Spectrum Disorders with (Partially) Autonomous Control: Challenges and Outlook. Paladyn, Journal of Behavioral Robotics. 2012;3(4):209–217.
  10. Mengoni SE, Irvine K, Thakur D, Barton G, Dautenhahn K, Guldberg K, et al. Feasibility study of a randomised controlled trial to investigate the effectiveness of using a humanoid robot to improve the social skills of children with autism spectrum disorder (Kaspar RCT): a study protocol. BMJ Open. 2017;7(6):e017376. pmid:28645986
  11. Scassellati B, Boccanfuso L, Huang CM, Mademtzi M, Qin M, Salomons N, et al. Improving social skills in children with ASD using a long-term, in-home social robot. Science Robotics. 2018;3(21).
  12. Costescu CA, Vanderborght B, David DO. The effects of robot-enhanced psychotherapy: A meta-analysis. Review of General Psychology. 2014;18(2):127–136.
  13. Esteban PG, Baxter P, Belpaeme T, Billing EA, Cai H, Cao H, et al. How to Build a Supervised Autonomous System for Robot-Enhanced Therapy for Children with Autism Spectrum Disorder. Paladyn, Journal of Behavioral Robotics. 2017;8(1):18–38.
  14. Cao HL, Esteban P, Bartlett M, Baxter PE, Belpaeme T, Billing E, et al. Robot-Enhanced Therapy: Development and Validation of a Supervised Autonomous Robotic System for Autism Spectrum Disorders Therapy. IEEE Robotics and Automation Magazine. 2019.
  15. David D, Costescu CA, Matu S, Szentagotai A, Dobrean A. Effects of a Robot-Enhanced Intervention for Children With ASD on Teaching Turn-Taking Skills. Journal of Educational Computing Research. 2019; p. 073563311983034.
  16. Hernández García D, Esteban PG, Lee HR, Romeo M, Senft E, Billing EA. Social Robots in Therapy and Care. In: Workshop at the ACM/IEEE International Conference on Human Robot Interaction (HRI). Daegu, South Korea; 2019.
  17. Pennisi P, Tonacci A, Tartarisco G, Billeci L, Ruta L, Gangemi S, et al. Autism and social robotics: A systematic review; 2016.
  18. Rabbitt SM, Kazdin AE, Scassellati B. Integrating socially assistive robotics into mental healthcare interventions: Applications and recommendations for expanded use; 2015.
  19. Scassellati B. How Social Robots Will Help Us to Diagnose, Treat, and Understand Autism. In: Thrun S, Brooks R, Durrant-Whyte H, editors. Robotics Research. Berlin, Heidelberg: Springer Berlin Heidelberg; 2007. p. 552–563.
  20. Ramírez-Duque AA, Frizera-Neto A, Bastes TF. Robot-Assisted Diagnosis for Children with Autism Spectrum Disorder Based on Automated Analysis of Nonverbal Cues. Proceedings of the IEEE RAS and EMBS International Conference on Biomedical Robotics and Biomechatronics. 2018; p. 456–461.
  21. Begum M, Serna RW, Yanco HA. Are Robots Ready to Deliver Autism Interventions? A Comprehensive Review. International Journal of Social Robotics. 2016;8(2):157–181.
  22. DREAM. Development of Robot-Enhanced Therapy for Children with Autism Spectrum Disorders; 2020. Available from: https://www.dream2020.eu/.
  23. Gouaillier D, Hugel V, Blazevic P, Kilner C, Monceaux J, Lafourcade P, et al. Mechatronic design of NAO humanoid. In: Proceedings of the IEEE International Conference on Robotics and Automation (ICRA). Institute of Electrical and Electronics Engineers (IEEE); 2009. p. 769–774.
  24. Cai H, Fang Y, Ju Z, Costescu CA, David D, Billing EA, et al. Sensing-enhanced Therapy System for Assessing Children with Autism Spectrum Disorders: A Feasibility Study. IEEE Sensors Journal. 2019;9(4):1508–1518.
  25. Thabtah F. An accessible and efficient autism screening method for behavioural data and predictive analyses. Health Informatics Journal. 2018; p. 146045821879663.
  26. Thabtah F. Autism Spectrum Disorder screening: Machine learning adaptation and DSM-5 fulfillment. Proceedings of the 1st International Conference on Medical and Health Informatics—ICMHI’17. 2017; p. 1–6.
  27. Kleinman JM, Robins DL, Ventola PE, Pandey J, Boorstein HC, Esser EL, et al. The modified checklist for autism in toddlers: a follow-up study investigating the early detection of autism spectrum disorders. Journal of Autism and Developmental Disorders. 2008;38(5):827–39. pmid:17882539
  28. Allison C, Auyeung B, Baron-Cohen S. Toward Brief “Red Flags” for Autism Screening: The Short Autism Spectrum Quotient and the Short Quantitative Checklist in 1,000 Cases and 3,000 Controls. Journal of the American Academy of Child & Adolescent Psychiatry. 2012;51(2):202–212.
  29. Thabtah F. Autism Datasets; 2019. Available from: http://fadifayez.com/autism-datasets/.
  30. Anzulewicz A, Sobota K, Delafield-Butt JT. Toward the Autism Motor Signature: Gesture patterns during smart tablet gameplay identify children with autism. Scientific Reports. 2016;6(July):1–13.
  31. Salter DA, Tamrakar A, Siddiquie B, Amer MR, Divakaran A, Lande B, et al. The Tower Game Dataset: A multimodal dataset for analyzing social interaction predicates. 2015 International Conference on Affective Computing and Intelligent Interaction, ACII 2015. 2015; p. 656–662.
  32. Rehg JM, Abowd GD, Rozga A, Romero M, Clements MA, Sclaroff S, et al. Decoding children’s social behavior. Proceedings of the IEEE Computer Society Conference on Computer Vision and Pattern Recognition. 2013; p. 3414–3421.
  33. Georgia Tech. The Multimodal Dyadic Behavior Dataset; 2019. Available from: http://www.cbi.gatech.edu/mmdb/.
  34. Ben-Youssef A, Clavel C, Essid S, Bilac M, Chamoux M, Lim A. UE-HRI: A new dataset for the study of user engagement in spontaneous human-robot interactions. ICMI 2017—Proceedings of the 19th ACM International Conference on Multimodal Interaction. 2017; p. 464–472.
  35. Lemaignan S, Edmunds CER, Senft E, Belpaeme T. The PInSoRo dataset: Supporting the data-driven study of child-child and child-robot social dynamics. PLOS ONE. 2018;13(10):e0205999. pmid:30339680
  36. Rehg JM, Rozga A, Abowd GD, Goodwin MS. Behavioral Imaging and Autism. IEEE Pervasive Computing. 2014;13(2):84–87.
  37. Gotham K, Pickles A, Lord C. Standardizing ADOS scores for a measure of severity in autism spectrum disorders. Journal of Autism and Developmental Disorders. 2009;39(5):693–705. pmid:19082876
  38. Baxter P, Wood R, Belpaeme T. A touchscreen-based ‘sandtray’ to facilitate, mediate and contextualise human-robot social interaction. HRI’12—Proceedings of the 7th Annual ACM/IEEE International Conference on Human-Robot Interaction. 2012; p. 105–106.
  39. Ingersoll B. The Social Role of Imitation in Autism. Infants & Young Children. 2008;21(2):107–119.
  40. Dawson G, Toth K, Abbott R, Osterling J, Munson J, Estes A, et al. Early Social Attention Impairments in Autism: Social Orienting, Joint Attention, and Attention to Distress. Developmental Psychology. 2004;40(2):271–283. pmid:14979766
  41. Wimpory DC, Hobson RP, Williams JM, Nash S. Are infants with autism socially engaged? A study of recent retrospective parental reports. Journal of Autism and Developmental Disorders. 2000;30(6):525–36. pmid:11261465
  42. Zhou X, Cai H, Li Y, Liu H. Two-eye model-based gaze estimation from a Kinect sensor. In: IEEE International Conference on Robotics and Automation. IEEE; 2017. p. 1646–1653.
  43. Dementhon DF, Davis LS. Model-based object pose in 25 lines of code. International Journal of Computer Vision. 1995;15(1-2):123–141.
  44. Viola P, Jones MJ. Robust Real-Time Face Detection. International Journal of Computer Vision. 2004;57(2):137–154.
  45. Xiong X, De La Torre F. Supervised descent method and its applications to face alignment. In: Proceedings of the IEEE Computer Society Conference on Computer Vision and Pattern Recognition; 2013. p. 532–539.
  46. Wang Y, Yu H, Dong J, Stevens B, Liu H. Facial expression-aware face frontalization. In: Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics). vol. 10113 LNCS. Springer Verlag; 2017. p. 375–388.
  47. Liu B, Yu H, Zhou X, Tang D, Liu H. Combining 3D joints Moving Trend and Geometry property for human action recognition. In: 2016 IEEE International Conference on Systems, Man, and Cybernetics, SMC 2016—Conference Proceedings. Institute of Electrical and Electronics Engineers Inc.; 2017. p. 332–337.
  48. Zhou X, Yu H, Liu H, Li Y. Tracking Multiple Video Targets with an Improved GM-PHD Tracker. Sensors. 2015;15(12):30240–30260. pmid:26633422
  49. Senft E, Lemaignan S, Baxter PE, Belpaeme T. SPARC: an efficient way to combine reinforcement learning and supervised autonomy. In: Future of Interactive Learning Machines Workshop at NIPS’16; 2016. p. 1–5.
  50. Creative Commons. Attribution-NonCommercial-ShareAlike 4.0 International—CC BY-NC-SA 4.0; 2020. Available from: https://creativecommons.org/licenses/by-nc-sa/4.0/.
  51. Poovathinal SA, Anitha A, Thomas R, Kaniamattam M, Melempatt N, Meena M. Global Prevalence of Autism: A Mini-Review. SciFed Journal of Autism. 2018;2(1):1–9.
  52. Volkmar FR, Paul R, Rogers SJ, Pelphrey KA, editors. Handbook of Autism and Pervasive Developmental Disorders. 4th ed. New York, NY: John Wiley & Sons; 2014.