Correction
12 May 2025: Flatebø S, Ngoc-Nha Tran V, Arfwedson Wang CE, Bongo LA (2025) Correction: Social robots in research on social and cognitive development in infants and toddlers: A scoping review. PLOS ONE 20(5): e0323770. https://doi.org/10.1371/journal.pone.0323770
Abstract
There is currently no systematic review of the growing body of literature on using social robots in early developmental research. Designing appropriate methods for early childhood research is crucial for broadening our understanding of young children’s social and cognitive development. This scoping review systematically examines the existing literature on using social robots to study social and cognitive development in infants and toddlers aged between 2 and 35 months. Moreover, it aims to identify the research focus, findings, and reported gaps and challenges when using robots in research. We included empirical studies published between 1990 and May 29, 2023. We searched for literature in PsycINFO, ERIC, Web of Science, and PsyArXiv. Twenty-nine studies met the inclusion criteria and were mapped using the scoping review method. Our findings reveal that most studies were quantitative, with experimental designs conducted in a laboratory setting where children were exposed to physically present or virtual robots in a one-to-one situation. We found that robots were used to investigate four main concepts: animacy understanding, action understanding, imitation, and early conversational skills. Many studies focused on whether young children regard robots as agents or social partners. The studies demonstrated that young children could learn from and understand social robots in some situations but not always. For instance, children’s understanding of social robots was often facilitated by robots that behaved interactively and contingently. This scoping review highlights the need to design social robots that can engage in interactive and contingent social behaviors for early developmental research.
Citation: Flatebø S, Tran VN-N, Wang CEA, Bongo LA (2024) Social robots in research on social and cognitive development in infants and toddlers: A scoping review. PLoS ONE 19(5): e0303704. https://doi.org/10.1371/journal.pone.0303704
Editor: Simone Varrasi, University of Catania, ITALY
Received: February 5, 2024; Accepted: April 29, 2024; Published: May 15, 2024
Copyright: © 2024 Flatebø et al. This is an open access article distributed under the terms of the Creative Commons Attribution License, which permits unrestricted use, distribution, and reproduction in any medium, provided the original author and source are credited.
Data Availability: All data are available from the OSF database doi.org/10.17605/OSF.IO/WF48R.
Funding: The author(s) received no specific funding for this work.
Competing interests: The authors have declared that no competing interests exist.
Introduction
Early childhood encompasses the infant and toddler years, a period marked by rapid growth in both social and cognitive development [1, 2]. Social development involves acquiring skills to interact and build social bonds with others, whereas cognitive development refers to developing skills related to thinking and reasoning processes [1, 2]. Research in these two subdisciplines focuses on a diverse range of abilities, such as attachment [3], imitation [4], play [5, 6], memory [7], theory of mind [8], social cognition [4], and language acquisition [9, 10]. Theory of Mind (ToM), the ability to attribute underlying mental states like beliefs, desires, and intentions to others [11–13], has traditionally not been studied in pre-verbal infants [14, 15]. However, recent methodological advances have demonstrated that a rudimentary ToM may emerge earlier than the traditionally assumed age of four [14, 15]. In line with this research, an interesting question is whether infants attribute mental states to non-human agents. Similarly, animacy understanding, the ability to classify entities as animate or inanimate [16–18], has been demonstrated in infants as young as two months [19–22], and by three years of age, children reliably make this distinction. Research on animacy examines how young children distinguish living beings from objects based on featural and dynamic cues such as faces, contingency behavior, and goal-directed or self-generated movement, and this research may involve non-human agents possessing such cues [16, 23–27].
Developmental psychology uses diverse methodologies, designs, data-gathering instruments and materials, and formats for stimuli presentation, and the research can be conducted in various research settings [28]. Using social robots as part of research methods has emerged as a promising way to gain insights into social and cognitive development [29–31]. Some pioneering studies have also demonstrated that social robots can contribute to cognitive assessments of elderly people and children with autism [32, 33]. Social robots are designed for social interactions with humans; they are often physically embodied, have human- or animal-like qualities, and can be autonomous or pre-programmed to perform specific actions [34, 35]. Social robots often have an anthropomorphic design with human-like appearance and behavior. For example, they commonly have heads with facial features and can display various social behaviors such as facial expressions, eye contact, pointing, or postural cues [36–38]. Two social robots commonly used for research on social and cognitive development are Robovie [39] and NAO [40]. In research settings, social robots can serve various roles, such as social partners in interactions [e.g., 40, 41] or teaching aids delivering learning content [40, 42, 43], and they can be equipped with sensors and cameras to record child behaviors [39].
There are several research advantages of using social robots that are not easily achievable through other means when studying young children. First, they provide a level of control and consistency that can be challenging to achieve with human experimenters [32, 44]. Second, because social robots are designed for social interactions, they might have potential in research on social learning situations such as imitation studies. Third, the socialness of robots in appearance and behavior [45], in addition to their novelty, makes them potentially better suited to capture a child’s attention and sustain engagement over longer time periods for a variety of testing purposes. Finally, social robots offer a compelling avenue for advancing our understanding of young children’s early ToM and animacy understanding in relation to non-human agents with rich social properties, and of how children represent social robots specifically.
The current review
Although social robots are increasingly used in various settings with children, little is known about their utility as a research tool for investigating social and cognitive concepts in infants and toddlers. We need to determine at which stages in early childhood children are receptive to and can learn from these robots. Currently, there is no scoping review or systematic review of the available body of literature in this field. A review of the existing literature is needed to advance our understanding of social robots’ relevance in research with younger age groups and to map the current state of knowledge in this field. Given the potential diversity in methodologies, research designs, and the wide range of developmental topics and concepts in this research field, we decided to conduct a scoping review. Consequently, the main objective of the current scoping review is to provide a comprehensive overview and summary of the available literature on the use of social robots as research tools for studying the social and cognitive development of typically developing infants and toddlers aged 2 to 35 months.
Our focus is on research using social robots to inform child development, rather than research exclusively focusing on robot skills and applications. We focus on typically developing children in the infancy and toddler years, younger than 3 years. We exclude neonates (0–2 months) and preschoolers (3–5 years) due to the notable distinctions in their developmental stages, which may necessitate different research methods compared to those used for infants and toddlers. Our definition of social robots is broad, encompassing all embodied robots exposed to children in a research context, irrespective of form and presentation format. However, we recognize the significance of eyes in early childhood communication [46] and, consequently, restrict our inclusion to robots featuring eyes. Our definition covers both robots commonly defined as social robots and robots with social features in form and/or behavior. We chose this definition because both types of robots might be relevant for how non-human agents with richer social features can inform social and cognitive development.
This review will provide an overview of the research literature, covering research on concepts of social and cognitive development using robots, the research methods employed, and the types of robots used and their purposes. We also aim to summarize research trends by identifying the primary research focuses and findings. Finally, we summarize the reported gaps and challenges in this research field. We hope the current review can be valuable for future research, helping to decide how to employ social robots in research settings with infants and toddlers and supporting the development of age-appropriate robots for children.
Method
We conducted a scoping review, which aims to explore and map the concepts and available literature in a given field of research [47]. Like systematic reviews, scoping reviews follow rigorous and transparent methods [47, 48]. However, unlike systematic reviews, scoping reviews ask broader rather than narrowly specified research questions to capture the extent and breadth of the available literature in a given field [47, 48]. We used the Preferred Reporting Items for Systematic Reviews and Meta-Analyses Extension for Scoping Reviews (PRISMA-ScR) (S1 Checklist) to improve this scoping review’s methodological and reporting quality. We preregistered the protocol for this study on the Open Science Framework on May 19, 2023 (see the updated version of the protocol: https://osf.io/2vwpn/). We followed the recommendations of the Joanna Briggs Institute (JBI) [49] and the first five stages of the methodological framework of Arksey and O’Malley [47], including Levac and colleagues’ advancements of this framework [50].
Stage 1: Identifying the research questions
The review was guided by three research questions: 1) What is the extent and nature of using social robots as a research tool to study social and cognitive development in infants and toddlers? 2) What are the primary research focuses and findings? 3) What are the reported research gaps and challenges when using social robots as a research tool?
Stage 2: Identifying relevant studies
Inclusion criteria.
We developed inclusion criteria related to the publication type, target child population, the robot type, and the research focus (Table 1) to focus the scope of the review.
In the full-text screening, we excluded studies by the first unmet inclusion criteria, i.e., we checked if the publication met the criteria for publication type first, then for the target population, robot type, and finally, the research focus.
We consulted multiple databases to identify studies, as social robotics is an interdisciplinary field. We included conference proceedings and preprints because studies within robotics are often published in this format [51–53].
Search strategy.
We searched for literature in PsycINFO (OVID), Education Resources Information Center (ERIC, EMBASE), and Web of Science. We searched for preprints using the Preprint Citation Index in Web of Science and in PsyArXiv. All searches were done on 29 May 2023. In consultation with an academic librarian, we developed a search strategy and search terms, which are presented in the S1 File. We used controlled vocabulary in addition to keywords when searching in PsycINFO and ERIC. Web of Science and PsyArXiv lack their own controlled vocabulary, so PsycINFO and ERIC keywords were used in these searches. We categorized the search terms into three categories: robot type, target child population, and social and cognitive developmental concepts. For a comprehensive search, we used the search terms “robot*”, “robotics”, “social robotics”, and “human robot interaction” for the robot type category. For the target child population category, we used terms like “infan*”, “toddler*”, “child*”, “infant development”, and “childhood development”. Lastly, for developmental concepts we used terms such as “cognitive development”, “social development”, “social cognition”, and “psychological development”.
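The full queries used in each database are reported in S1 File. Purely as an illustration, and assuming the conventional pattern of combining synonyms within a category with OR and joining the three categories with AND, the structure of such a search can be sketched as follows (only the example terms listed above are included):

```python
# Illustrative sketch only: the exact queries used in each database are in S1 File.
# Assumes the conventional pattern of OR-ing synonyms within each category and
# AND-ing the three categories (robot type, population, developmental concepts).

robot_terms = ['robot*', 'robotics', '"social robotics"', '"human robot interaction"']
population_terms = ['infan*', 'toddler*', 'child*',
                    '"infant development"', '"childhood development"']
concept_terms = ['"cognitive development"', '"social development"',
                 '"social cognition"', '"psychological development"']


def or_block(terms):
    """Join the synonyms of one category with OR, wrapped in parentheses."""
    return "(" + " OR ".join(terms) + ")"


# Require a match in every category by joining the OR-blocks with AND.
query = " AND ".join(or_block(terms) for terms in
                     (robot_terms, population_terms, concept_terms))
print(query)
```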
Stage 3: Study selection
We developed a screening questionnaire a priori (doi.org/10.17605/OSF.IO/4BGX6), which all reviewers (SF, LAB, and VT) initially piloted on a random sample of studies. After revising the screening questionnaire, we started screening studies for eligibility in the web-based software Covidence [54]. We removed duplicates manually and by using the Covidence duplicate check tool. All studies were screened by two reviewers independently using the screening questionnaire. The first author (SF) screened all studies, whereas LAB and VT each screened half of the studies. We resolved disagreements by team discussion. The studies were screened through a two-step process: 1) screening of titles and abstracts; 2) screening of full texts. In full-text screening, we followed the exclusion reason order in Table 1 and excluded studies by the first unmet inclusion criterion.
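To make the ordered exclusion rule concrete, the sketch below expresses the logic in Python; the criterion labels follow the inclusion criteria described above, but the dictionary keys and the example study are hypothetical illustrations, not part of the published screening questionnaire:

```python
# Sketch of the "first unmet criterion" exclusion rule used in full-text screening.
# The criterion checks below are hypothetical placeholders; the actual screening
# questionnaire is available at doi.org/10.17605/OSF.IO/4BGX6.

CRITERIA = [
    # (exclusion reason, predicate that returns True when the criterion is met)
    ("publication type", lambda s: s["is_empirical_publication"]),
    ("target population", lambda s: s["includes_2_to_35_month_olds"]),
    ("robot type", lambda s: s["uses_embodied_robot_with_eyes"]),
    ("research focus", lambda s: s["informs_social_or_cognitive_development"]),
]


def exclusion_reason(study):
    """Return the first unmet criterion in the fixed order, or None if all are met."""
    for reason, is_met in CRITERIA:
        if not is_met(study):
            return reason
    return None


# Example: a study meeting the first two criteria but using a robot without eyes
# is recorded only under "robot type", even if it would also fail "research focus".
example = {
    "is_empirical_publication": True,
    "includes_2_to_35_month_olds": True,
    "uses_embodied_robot_with_eyes": False,
    "informs_social_or_cognitive_development": False,
}
print(exclusion_reason(example))  # -> "robot type"
```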
Stage 4: Data charting
We developed a data charting template a priori in Covidence and used it to chart data from the included studies. The first author (SF) piloted the data charting template on five studies and iteratively modified it based on recommendations [50]. The main revisions included changes to the template layout, adding entities (i.e., final sample size and physical CRI contact), and providing more charting instructions and explanations of the entities. Details of the newest version of the charting template and the charted entities are available at OSF (doi.org/10.17605/OSF.IO/B32R6). The first author (SF) charted data from each publication, and a second reviewer (LAB or VT) checked the charted data for completeness and accuracy in Covidence. Disagreements were resolved by discussion in the research team. We charted data regarding general study characteristics (e.g., authors, publication year, publication type, and country of the first author), research aims, developmental concepts, methods (e.g., research methodology and design, research setting, procedure and conditions, material, outcome measures, and type of CRI), child population characteristics (e.g., sample size, age, and socioeconomic background), robot characteristics (e.g., robotic platform, developer, exposition, physical CRI contact, purpose of use, form, appearance, autonomy, and behavior), reported gaps and limitations, and research findings and conclusions. We exported the charted data from Covidence to Excel. All charted data are available at OSF (doi.org/10.17605/OSF.IO/WF48R).
Stage 5: Collation, summarizing, and reporting results
The reviewed studies are summarized, reported, and discussed in line with the fifth stage of Arksey and O’Malley’s scoping review framework in the following sections. We classified the studies based on the type of developmental concepts they involved.
Results
Search results
Overall, we identified 1747 studies from all database searches. After removing duplicates and screening titles and abstracts, we screened 187 full texts for eligibility. Of these, 158 studies were excluded. Finally, we included 29 studies in the review. Fig 1 shows the details of the search results and the study selection process in the PRISMA flowchart diagram [55].
The study selection process, including the identification and screening of studies. Studies were excluded based on a fixed order of exclusion reasons; only the first unmet criterion is recorded for each study in this diagram.
General characteristics
S1 Table provides an overview of all reviewed studies, including general characteristics, research methods, aims, sample characteristics, the robotic platform and other measures used, and a summary of the main findings and conclusions. There were 25 journal articles, three conference papers, and one magazine article. None of the studies were preprints. Studies were published between 1991 and 2023, and research activity grew slightly over the past three decades (Fig 2).
The cumulative number of studies per year from 1990 to 29 May 2023.
The authors came from different countries, and most studies were conducted in Japan, followed by the United States and Canada (Table 2).
Countries of the lead authors (N = 29).
Research methods
Almost all studies (n = 25) used quantitative methodology, while only two studies used qualitative methodology and one used a mixed approach. Twenty-five of the studies used an experimental design, while the remaining four used a descriptive, correlational, case study, or ethnomethodology design. Twenty-four studies were conducted in a laboratory or in a controlled laboratory setting. Two studies were conducted in ecological settings, such as classrooms. The remaining three studies were conducted in other locations: one study in a naturalistic setting at a science museum, and two studies across various locations (i.e., laboratory, ecological, and/or naturalistic settings).
Child characteristics
The final sample sizes of the studies ranged from 6 to 230 participants, with the ages of participants ranging from 2 to 35 months. While some studies [56–62] included participants older than the target age, this review only focuses on findings related to children in the target age group. Twenty studies included toddlers who were 12 months or older, while seven studies included infants under 12 months. Five studies reported the socioeconomic status of the families [63–67], all belonging to the middle class. For more details about the samples, see S1 Table.
Robot characteristics and interaction types
We identified 16 social robots (Table 3 and Fig 3); most studies used robots with a humanoid appearance (n = 24), whereas the remaining used animal-like robots (n = 4) or a ball-shaped robot (n = 1). The robots used were Robie Sr., Robovie, Robovie2, NAO, Dr. Robot Inc, HOAP-2, RUBI, RUBI-6, iRobiQ, Sphero, ReplieeQ2, MyKeepon, Bee-Bot, 210 AIBO, MiRoE, and Opie. Robovie (versions 1 and 2) was most frequently used (n = 8). Most robots were pre-programmed to perform specific behaviors to examine children’s responses to these acts (n = 24), such as making eye contact or gazing in the direction of an object [e.g., 68], or performing specific actions with objects [e.g., 62]. Two studies used autonomous robot dogs that acted by themselves and reacted to the children’s behavior [60, 61]. Additionally, some studies [57, 58, 69] exposed children to robots that were autonomous or pre-programmed at different phases of the experiment.
Images b, c, e, f, h, j, k, and l are modified cropped versions of the original work. Original images are licensed under CC-BY. For the robots Dr. Robot Inc., Opie, RUBI, and RUBI-6, we could not find images with a CC-BY (or similar) license. The android and mechanical configurations of the same robot are shown in image (h). The image sources are: a) [70]; b) [71]; c) [72]; d) [73]; e) [74]; f) [75]; g) [76]; h) [77]; i) [78]; j) [79]; k & l) [80].
H = humanoid; NH = non-humanoid; n = number of studies using a given robot.
In most studies, the robots were present in the same physical location as the child (n = 18), whereas the remaining robots were presented on video (n = 11). In most cases, the child-robot interaction did not involve any physical contact with the robot (n = 19). A total of 34 experiments were conducted in the 29 reviewed articles in which children were exposed to robots in some way. Most commonly, the child was exposed to the robot in a one-to-one interaction or situation (n = 20), including both live interactions and passive observations without social exchange. The remaining were bystander interactions (n = 5), where the child observed the robot interact with someone else, children-robot interactions in groups (n = 4), or a mixture of different interaction types (n = 5).
Outcome measures and other instruments and material
Details of the outcome measures are presented in the S1 Table. The most frequent measure in the studies was children’s looking behavior during stimuli presentation (n = 12). Looking behavior was measured using different instruments, such as eye tracking methods, video recordings captured by cameras, or observational notes. Various techniques were used to analyze looking behavior, such as visual habituation, preferential looking, violation of expectation, and anticipatory looking. Another common measure was children’s imitation behavior assessed in imitation tests by analyzing the performance of target actions (n = 7).
Research focus, key findings, and conclusions
The studies focused on several social and cognitive skills that we clustered into four main categories (Table 4). The key findings and conclusions of all studies are presented in the S1 Table.
The other category includes the concepts of computational thinking (n = 1), reading interest and skills (n = 1), and physical play and emotions during robot interaction (n = 1).
Animacy understanding.
Seven studies investigated children’s understanding of animacy (Table 4). They examined how children classify robots as animate or inanimate based on their appearance [77, 91], movements [81], and interactive behaviors [60, 61, 82, 91], using both humanoid and animal-like robots (Table 3 and Fig 3). The findings were diverse, with children sometimes perceiving robots as more like living beings when the robots had a highly human-like appearance [77] or behaved contingently [82, 91, 92]. For example, infants aged 6 to 14 months did not differentiate between a highly human-like android and a human, viewing both as animate, but they recognized the difference between a human and a mechanical-looking robot (Fig 3) [77]. Contingency behavior influenced children’s animacy understanding, with children’s reactions to robots varying depending on the robots’ contingency [82, 92]. Children aged 9 to 17 months who observed contingent interactions between a robot and a human were more likely to perceive the robot as a social being, suggesting the importance of responsive behavior in animacy perception [82, 92]. Nine- and twelve-month-old infants showed different expectations for human and robot movement, demonstrating increased negative affect when robots moved autonomously, suggesting that infants might consider robots inanimate regardless of self-generated motion [81]. Studies with robot dogs showed that children differentiated between robotic dogs and toy dogs, but they did not necessarily view the robotic dog as a living animal [60, 61]. However, they did engage with the robotic dog in a manner suggesting that they perceived it as a social partner [60, 61]. Observations of 12- to 24-month-old toddlers’ long-term interactions with a social robot indicated that they perceived the robot as a social partner [91]. The robot’s interactivity, appearance, and inscriptions of gender and social roles influenced toddlers’ attribution of animacy [91]. One study discussed anecdotal observations suggesting that toddlers may ascribe animacy to robots based on reciprocal vocalizations and social behaviors, such as inviting the robot to dance or apologizing to it after accidental contact [63]. Two studies connected children’s concepts of animacy with their understanding of actions, particularly goal-directed and contingent actions [77, 91], which will be discussed in the section below on action understanding.
Action understanding.
Ten studies used humanoid social robots to examine children’s understanding of various actions (Tables 3 and 4), including referential actions [66, 67, 72, 84–86], goal-directed actions [83, 87, 88], and intentions behind failed actions [68]. Action understanding refers to the ability to recognize and respond appropriately to others’ actions, infer the goals of actions, and detect the intentions underlying the actions [95].
Studies on referential actions [66, 67, 72, 84–86] showed that children aged 10 to 18 months can follow the gaze of humanoid robots, but their understanding of the robot’s intentions varied. For example, 12-month-olds respond to robot gaze, and this is not just an attentional reflex to its head movements [84]; however, they do not anticipate object appearance following robot gaze as they do for humans [84, 85]. Similarly, one study [72] found that 17-month-olds more frequently followed the human gaze than the robot gaze, suggesting that toddlers did not understand the referential intention of the robot’s gaze. Yet, toddlers may still understand the robot’s referential intentions, such as when the robots provide verbal cues during object learning [66, 86] or when the robot has previously engaged socially with adults [67]. Studies on goal-directed actions [83, 87, 88] showed that infants from 6.5 months could identify the goals of a humanoid robot moving towards a destination and evaluated whether the robot took the most efficient path to reach its goal [83]. However, they did not attribute goals to a featureless box, suggesting that the human-like appearance of an agent influences infants’ reasoning about an agent’s actions [83]. Moreover, 13-month-old toddlers did not expect cooperative actions between humans and robots, even with social cues present [87]. By 17 months, toddlers showed signs of predicting the goal-directed reaching actions of both humans and humanoid robots towards a target, indicating an understanding of goal-directed behavior irrespective of the agent [88]. Finally, toddlers aged 24 to 35 months recognized the intention behind a robot’s failed attempts to place beads inside a cup, but only when the robot made eye contact [68].
Imitation.
Social robots were used to study two kinds of imitation in young children, that is, their ability to learn by observing and imitating others [96]. Half of the studies focused on infants aged 2–8 months and their imitation of a humanoid robot’s bodily movements, also known as motor imitation, and contingency learning in a face-to-face interaction [69, 89, 90]. Although 2- to 5-month-olds paid more attention to the robot when it moved, only 6- to 8-month-olds imitated its motor movements and demonstrated contingency learning [69, 89, 90]. The remaining studies investigated 1- to 3-year-old toddlers’ imitation of a robot’s actions with objects, such as assembling a rattle and shaking it to make a sound [58, 62, 93]. The studies found that toddlers imitate both physically present [58] and on-screen robots [62] and that their imitation of robots increased with age [58, 62]. Toddlers who interacted more with the robot prior to the imitation test were more likely to imitate it [58], though they still imitated humans more frequently [58, 62]. Moreover, toddlers’ imitation of on-screen demonstrations by a human experimenter performing actions was not facilitated by embedding such videos in robots behaving socially [93].
Early conversational skills.
Three studies used a toy robot to investigate early conversational skills in toddlers (Tables 3 and 4). The robot provided consistent verbal stimulation through a built-in speaker. By using a robot, the researchers aimed to eliminate potential confounding nonverbal cues (e.g., gaze, gestures) inevitably present in human conversation that could affect toddlers’ responses [63–65]. For 24-month-olds, when the robot reciprocated toddlers’ utterances by repeating and expanding the topic, it led to more topic-maintaining conversation and increased linguistically mediated social play [63]. Moreover, 24-month-olds recognized when the robot’s responses were semantically relevant and on-topic, and in these situations, toddlers were more likely to continue and expand the conversational topic compared to when the robot was off-topic [64]. Older toddlers, aged 27 and 33 months, demonstrated an understanding of pragmatic quantity rules in conversations by responding appropriately to specific and general queries when conversing with the robot [65].
Gaps and challenges
To address our third research question, we summarize the gaps and challenges in using social robots as a research tool reported by the authors of the reviewed studies. The most frequently reported gaps were related to children’s familiarity with robots, testing the effect of specific robot appearance and/or behavior cues, the design of the robot, and testing across different settings. Many studies [58, 62, 72, 82, 85, 87, 88] discussed that future work should investigate whether children’s familiarity with robots might influence their understanding of and response to robots. For example, Okumura [85] discusses that infants might have stronger expectations for referential cues, such as gaze, from humans rather than robots due to their familiarity with human interaction. Moreover, future studies should investigate whether children’s increased exposure to robots can enhance their ability to understand and respond to a robot’s referential communication [85]. Several studies suggest that further research should investigate how a robot’s physical appearance and behavior impact children’s perception, comprehension, and learning from robots [66, 81–83, 85, 87]. For instance, Okumura et al. [86] suggest that future research should examine whether verbal cues provided by robots influence infants’ object learning. Regarding gaps related to robot design, Peca and colleagues [92] propose that future work should aim to develop robots that can interact autonomously with the child, without interference from a human operator. Most of the studies were conducted in experimental settings, and some studies [69, 72] suggest that future work should examine child-robot interactions in more naturalistic settings.
Most studies (n = 24) reported some challenges or limitations related to using social robots as a research tool. Many studies (n = 10) reported challenges related to the robot’s design, such as issues with its appearance and functionality. For example, additional human operators were required in the experimental procedures due to the robots’ technical constraints, it was difficult to make the robots’ movements resemble human movements, and robots in live tasks sometimes failed to provide the stimuli correctly or did not respond appropriately during interactions. Several studies (n = 7) reported that children had challenges understanding the robot, such as its actions, communicative cues, and underlying intentions. Relatedly, some studies discussed that children’s lack of familiarity and experience with robots may contribute to difficulty understanding them and make the robots more distracting (n = 4). Several studies (n = 5) reported children experiencing challenges with task focus, including too little or too much interest in the robot, irritability during robot inactivity, or children being distracted and leaving the task activity. Some studies (n = 3) discussed ecological validity issues, such as the generalization of findings across settings and from specific robots to other robot types or humans. Relatedly, we noticed that few studies used control conditions with human or non-human agents for comparison with the robots they used, and there is limited discussion of the absence of these controls. An overview of commonly reported challenges is presented in Table 5.
The category “no limitations reported” refers to studies that have not reported any challenges relevant to using social robots as a research tool.
Discussion
This scoping review is a novel contribution to the field as it is the first to systematically cover the breadth of the literature on how social robots have been used in early developmental research to investigate social and cognitive development. Our review provides an overview of general characteristics, methods, research focus, findings, and the reported gaps and challenges when social robots are used in early developmental research. Previous systematic reviews and scoping reviews have focused on using social robots with older children in other settings, such as in education [97], supporting the development of children with autism [98–102], or various health care contexts [103–106]. Although we maintained the broad approach of a scoping review, we found that an overarching research focus in the reviewed literature was to determine whether social robots can act as social partners for young children. According to this literature, children sometimes classify social robots as social partners and can interpret the social cues and actions of robots in certain situations. Thus, the studies demonstrate the potential of using various social robots in early developmental research, but they do not suggest that social robots can replace humans in research settings.
General characteristics and methods
The use of social robots in early developmental research is a small research field; we identified 29 studies for the review. Most studies were quantitative with experimental designs and conducted in controlled laboratory settings, in which the children were exposed to the robots in a one-to-one situation. Few studies used qualitative methodology [59, 60, 91], and only one study [91] observed child-robot interactions in a long-term context. Most robots were humanoid and pre-programmed to perform a specific social behavior of interest. We had a broad definition of social robots, including robots that fit typical descriptions of social robots, such as Robie Sr., Robovie, Robovie2, NAO, Dr. Robot Inc., HOAP-2, RUBI, RUBI-6, iRobiQ, ReplieeQ2, MyKeepon, 210 AIBO, MiRoE, and Opie (Table 3 and Fig 3). However, we also found robots not typically considered social robots, such as the robotic ball Sphero and Bee-Bot (Table 3 and Fig 3). Notably, the robots used in the studies varied in their level of advancement. Some were relatively simple and immobile, like the Robie Sr. robot, while others were capable of autonomous action, such as the NAO robot (Table 3 and Fig 3). Naturally, some of the more advanced robots were unavailable when the earliest studies were conducted; therefore, simpler robots were used in the studies published first.
Research focus and key findings
Our review shows research trends in using social robots to study social and cognitive concepts such as animacy understanding, action understanding, imitation, and early conversational skills. Some studies also used robots to examine reading abilities, computational thinking, and emotions. We found that most studies focused on whether children classify robots as social partners to interact with and acquire information from, or whether humans are a privileged source of information at these developmental stages [58, 60, 62, 66–69, 72, 77, 81–94]. Only a few studies [63–65] used robots instead of humans to provide more consistent stimuli, with the main focus on the developmental concepts examined. Furthermore, some had an additional focus on the application of robots [56, 59, 60], such as the therapeutic potential of robot dogs [60] or their use as a learning tool to improve reading [56]. Lastly, one study used a robot providing socially contingent behaviors to facilitate children’s imitation learning from a human experimenter [93].
The limited number of studies means that caution is necessary when interpreting the findings. Furthermore, research findings from one age group cannot be generalized to others. However, some key findings indicate that infants are attentive to robots and can learn from them at an early stage of development in several situations. Thus, humans are not necessarily the only information source for young children. For instance, 2-month-olds tend to be more attentive to robots that move [90], while 6-month-olds imitate robots [69]. Furthermore, 6.5-month-olds can attribute goals to a robot’s movements toward a specific destination [83]. Another key finding was that as children grow older, they show signs of becoming better at recognizing and interpreting the social cues provided by robots, and their learning from robots is enhanced. For example, 24- to 35-month-olds showed early signs of attributing intentions to robots by detecting what a robot intended to do when it failed to put beads inside a cup [68]. Additionally, 1- to 3-year-olds were able to imitate a robot’s actions with objects both on-screen and in real life, and imitation increased with age [58, 62]. Yet, in several situations, children in the reviewed studies did not understand the robots’ social behaviors and were not able to learn from them [66, 72, 84, 85, 87, 90]. Taken together, toddlers and infants may view robots as social partners, attributing mental states to them like older children do [107–110]. Moreover, this literature provides information on the ages at which young children can socially engage with social robots.
Yet another key finding was that it was not just the appearance of social robots but also how the robots behave that plays an important role in how young children perceive, understand, and respond to them [56, 58, 63, 64, 67, 82, 86, 91]. In particular, contingent and interactive behaviors facilitated how the robots were understood. For example, when young infants observed another person talking to or contingently interacting with a robot, they tended to classify the robot as animate [82, 92], and they showed increased sensitivity to its social cues such as eye gaze [67]. Additionally, toddlers who interacted more with the robot prior to the imitation test were more likely to imitate it [58]. In conversations with robots, toddlers tended to stay more engaged when the robot reciprocated their verbalizations and stayed on-topic [63, 64]. Moreover, adding more social factors to the robot, such as verbal cueing, increased 12-month-old infants’ ability to follow a robot’s gaze to an object [86]. Relatedly, Csibra [111] proposes that it is not how an agent looks that is important for children to identify it as an agent, but how it behaves. It is possible that social robots, with appearances and social behaviors resembling those of living beings, blur the line between living and non-living beings, and that children represent social robots as a new ontological category. As a result, young children might perceive and treat these robots as social partners and not just machines. Relatedly, Manzi et al. [88] discuss that robots with human-like characteristics might activate social mechanisms in young infants. Yet, in some cases, appearance and contingent behaviors were not enough to elicit an understanding of the robot’s intention [66].
Gaps and challenges
The authors reported several gaps and challenges related to using social robots in early developmental research. Most commonly, the authors reported that future work should investigate whether children’s familiarity with robots impacts their responses. Although social robots possess human-like qualities and behaviors already familiar to the child, their novelty may result in different responses from children when compared to interactions with human agents. Frequently reported challenges were related to robot design. For instance, in some studies, a human experimenter had to accompany the robot during an experiment because of the technical constraints of the robots [66, 92]. Relatedly, Peca and colleagues [92] discuss that future work should aim to make robots that do not require human operators.
Limitations
This scoping review is not without limitations. Although we conducted extensive searches across multiple databases, it is possible that some relevant studies were not included. Our inclusion criteria were limited to studies published in English, and we did not manually search reference lists to identify additional studies, which may have resulted in the exclusion of relevant studies. Furthermore, as scoping reviews do not typically aim to assess the quality of evidence, we did not perform a formal quality assessment of the studies included.
Future directions
This review has allowed us to identify important directions for future research, primarily within developmental psychology but also in social robotics. First, it is unclear how effective social robots are when acting as agents in early developmental research. This is indicated by diverse findings related to how children classify them as animate or inanimate and how children interpret their social cues and behaviors. Notably, few studies used any human or non-human controls for the robots. Thus, future studies should use other agent types in addition to robots to determine the relative effectiveness of social robots, humans, and other types of agents in early developmental research. Findings on which robot behaviors are crucial for young children may have implications for future work within social robotics aiming to develop age-appropriate robots. Second, we found that multiple robots were rarely used within the same study, and thus, it is unclear whether findings generalize to other types of robots or are specific to a particular robot type. Future work could use several robots to test generalizability across different robot types. Third, most studies investigated child-robot interactions in highly controlled settings that do not easily generalize to other environments. Future work should investigate naturalistic interactions between children and robots, in which the robots respond to the child’s behavior in the moment rather than being pre-programmed to do a specific task. Fourth, we noticed that the included studies rarely reported the reasons behind their choice of a specific robot type and the amount of time spent preparing the robot, such as learning to program it or having a skilled programmer do it. We suggest reporting such information to ease replication and to improve planning for future studies.
Conclusion
Our scoping review of 29 studies shows a small and emerging field of using social robots to study social and cognitive development in infants and toddlers. We identified four main areas of focus: animacy understanding, action understanding, imitation, and early conversational skills. An important question in the field is whether young children perceive social robots as social partners or agents. Findings vary on how children classify and understand the behaviors of social robots. According to the studies, young children can, from an early age, pay attention to social robots, learn from them, and recognize their social signals, although not in all situations. The studies suggest that certain robot behaviors, particularly those that are interactive and contingent, are critical for enhancing children’s perception of robots as social entities. Moreover, it seems that children’s understanding of robots improves with age. Our review indicates that even in infancy, social robots can be regarded as social partners, a perception that is essential in research settings that depend on social interaction. Consequently, our review highlights the need for careful selection of social robots that exhibit interactive and contingent behaviors to be effective in early developmental research. Furthermore, this review contributes knowledge on how children socially interact with and learn from non-human agents with rich social features. These insights are important for future studies within developmental psychology involving social robots and young children and for future work within social robotics on designing appropriate robot behaviors to facilitate social interaction with robots in early childhood.
Supporting information
S1 Checklist. Preferred Reporting Items for Systematic reviews and Meta-Analyses extension for Scoping Reviews (PRISMA-ScR) checklist.
https://doi.org/10.1371/journal.pone.0303704.s001
(DOCX)
S1 File. Search strategy.
Search queries and search terms used in the databases and preprint repository.
https://doi.org/10.1371/journal.pone.0303704.s002
(DOCX)
Acknowledgments
We thank Torstein Låg, Senior Academic Librarian at the UiT The Arctic University of Norway, for support in developing search strategies.
References
- 1.
Smith PK, Hart CH, Abecassis M, Barrett MD, Bellmore A, Bissaker K, et al. Blackwell handbook of childhood social development: Blackwell Publishers Oxford, UK; 2002.
- 2.
Goswami U. The Wiley-Blackwell handbook of childhood cognitive development. 2nd ed. Chichester: Wiley-Blackwell; 2011.
- 3.
Workman L, Taylor S, Barkow JH. Evolutionary perspectives on social development. The Wiley‐Blackwell handbook of childhood social development: Wiley-Blackwell; 2022. p. 84–100.
- 4.
Meltzoff AN. Social cognition and the origins of imitation, empathy, and theory of mind. The Wiley‐Blackwell handbook of childhood cognitive development: Wiley-Blackwell; 2010. p. 49–75.
- 5.
Lillard A, Pinkham AM, Smith E. Pretend play and cognitive development. The Wiley‐Blackwell handbook of childhood cognitive development. Blackwell Publishing Ltd; 2010. p. 285–311.
- 6.
Nicolopoulou A, Smith PK. Social play and social development. The Wiley‐Blackwell handbook of childhood social development: Wiley-Blackwell; 2022. p. 538–54.
- 7.
Bauer PJ, Larkina M, Deocampo J. Early memory development. The Wiley‐Blackwell handbook of childhood cognitive development: Wiley-Blackwell; 2010. p. 153–79.
- 8.
Wellman HM. Developing a theory of mind. The Wiley‐Blackwell handbook of childhood cognitive development: Wiley-Blackwell; 2010. p. 258–84.
- 9.
Tomasello M. Language development. The Wiley‐Blackwell handbook of childhood cognitive development: Wiley-Blackwell; 2010. p. 239–57.
- 10.
Waxman SR, Leddon EM. Early word-learning and conceptual development. The Wiley‐Blackwell handbook of childhood cognitive development: Wiley-Blackwell; 2010. p. 180–208.
- 11.
Gopnik A, Meltzoff AN. Words, thoughts, and theories. Cambridge, MA: MIT Press; 1997.
- 12.
Tomasello M. The cultural origins of human cognition. Cambridge, MA: Harvard University Press; 1999.
- 13.
Wellman Henry M. The child’s theory of mind. Cambridge, MA: MIT Press; 1990.
- 14.
Gergely G. Kinds of agents: The origins of understanding instrumental and communicative agency. The Wiley‐Blackwell Handbook of childhood cognitive development: Wiley-Blackwell; 2010. p. 76–105.
- 15. Johnson SC. The recognition of mentalistic agents in infancy. Trends in Cognitive Sciences. 2000;4(1):22–8. pmid:10637619
- 16.
Opfer JE, Gelman SA. Development of the animate–inanimate distinction. The Wiley‐Blackwell handbook of childhood cognitive development: Wiley-Blackwell; 2010. p. 213–38.
- 17.
Carey S. Conceptual change in childhood. Cambridge, MA: MIT Press; 1985.
- 18.
Keil FC. Concepts, kinds, and cognitive development. Cambridge, MA: Bradford. Cambridge, MA: MIT Press; 1989.
- 19. Klein RP, Jennings KD. Responses to social and inanimate stimuli in early infancy. The Journal of Genetic Psychology. 1979;135(1):3–9. pmid:512642
- 20. Field TM. Visual and cardiac responses to animate and inanimate faces by young term and preterm infants. Child Development. 1979:188–94. pmid:446203
- 21. Ellsworth CP, Muir DW, Hains SM. Social competence and person-object differentiation: An analysis of the still-face effect. Developmental Psychology. 1993;29(1):63–73.
- 22. Legerstee M, Pomerleau A, Malcuit G, Feider H. The development of infants’ responses to people and a doll: Implications for research in communication. Infant Behavior and Development. 1987;10(1):81–95.
- 23. Gelman SA, Gottfried GM. Children’s causal explanations of animate and inanimate motion. Child Development. 1996;67(5):1970–87.
- 24. Goren CC, Sarty M, Wu P. Visual following and pattern discrimination of face-like stimuli by newborn infants. Pediatrics. 1975;56(4):544–9. pmid:1165958
- 25. Csibra G, Gergely G, Bíró S, Koós O, Brockbank M. Goal attribution without agency cues: The perception of ‘pure reason’ in infancy. Cognition. 1999;72(3):237–67. pmid:10519924
- 26. Johnson S, Slaughter V, Carey S. Whose gaze will infants follow? The elicitation of gaze-following in 12-month-olds. Developmental Science. 1998;1(2):233–8.
- 27. Spelke ES, Phillips A, Woodward AL. Infants’ knowledge of object motion and human action. 1995.
- 28.
Mukherji P, Albon D. Research methods in early childhood: An introductory guide: Sage; 2022.
- 29.
Kozima H, Nakagawa C, Yano H, editors. Using robots for the study of human social development. AAAI Spring Symposium on Developmental Robotics; 2005: Citeseer. Available from: http://mainline.brynmawr.edu/DevRob05/schedule/papers/kozima.pdf.
- 30. Kozima H, Nakagawa C. Interactive robots as facilitators of children’s social development. Mobile robots: Towards new applications: IntechOpen; 2006. Available from: https://www.intechopen.com/chapters/59.
- 31. Scassellati B. How developmental psychology and robotics complement each other. NSF/DARPA workshop on development and learning;2000. Available from: https://groups.csail.mit.edu/lbr/hrg/2000/WDL2000.pdf.
- 32.
Varrasi S, Di Nuovo S, Conti D, Di Nuovo A, editors. Social robots as psychometric tools for cognitive assessment: A pilot test. Human Friendly Robotics; 2019; Cham: Springer International Publishing. Available from: https://doi.org/10.1007/978-3-319-89327-3_8.
- 33.
Conti D, Trubia G, Buono S, Di Nuovo S, Di Nuovo A, editors. Evaluation of a robot-assisted therapy for children with autism and intellectual disability. Towards Autonomous Robotic Systems; 2018; Cham: Springer International Publishing. Available from: https://doi.org/10.1007/978-3-319-96728-8_34.
- 34. Sarrica M, Brondi S, Fortunati L. How many facets does a “social robot” have? A review of scientific and popular definitions online. Information Technology & People. 2020;33(1):1–21.
- 35. Henschel A, Laban G, Cross ES. What makes a robot social? A review of social robots from science fiction to a home or hospital near you. Current Robotics Reports. 2021;2(1):9–19. pmid:34977592
- 36. Onyeulo EB, Gandhi V. What makes a social robot good at interacting with humans? Information. 2020;11(1):43.
- 37. Breazeal C, Scassellati B. A context-dependent attention system for a social robot. IJCAI International Joint Conference on Artificial Intelligence. 1999;2.
- 38.
Nakadai K, Hidai K-i, Mizoguchi H, Okuno H, Kitano H. Real-time auditory and visual multiple-object tracking for humanoids. Proceedings of the Seventeenth International Joint Conference on Artificial Intelligence, IJCAI, August 4–10, 2001; Seattle, Washington, USA; 2001. p. 1425–36.
- 39. Das D, Rashed MG, Kobayashi Y, Kuno Y. Supporting human–robot interaction based on the level of visual focus of attention. IEEE Transactions on Human-Machine Systems. 2015;45(6):664–75.
- 40. Amirova A, Rakhymbayeva N, Yadollahi E, Sandygulova A, Johal W. 10 years of human-NAO interaction research: A scoping review. Frontiers in Robotics and AI. 2021;8:744526. pmid:34869613
- 41. Kanda T, Sato R, Saiwaki N, Ishiguro H. A two-month field trial in an elementary school for long-term human-robot interaction. IEEE Transactions on Robotics. 2007;23(5):962–71. WOS:000250177900013.
- 42. Belpaeme T, Kennedy J, Ramachandran A, Scassellati B, Tanaka F. Social robots for education: A review. Science Robotics. 2018;3:eaat5954. pmid:33141719
- 43. Billard A. Robota: Clever toy and educational tool. Robotics and Autonomous Systems. 2003;42(3):259–69.
- 44.
Scassellati B. Investigating models of social development using a humanoid robot. Proceedings of the International Joint Conference on Neural Networks; 2003. p. 2704–9, vol. 4.
- 45. Roesler E, Manzey D, Onnasch L. A meta-analysis on the effectiveness of anthropomorphism in human-robot interaction. Science Robotics. 2021;6(58):eabj5425. pmid:34516745
- 46. Csibra G. Recognizing communicative intentions in infancy. Mind & Language. 2010;25(2):141–68.
- 47. Arksey H, O’Malley L. Scoping studies: Towards a methodological framework. International Journal of Social Research Methodology. 2005;8(1):19–32.
- 48. Munn Z, Peters MDJ, Stern C, Tufanaru C, McArthur A, Aromataris E. Systematic review or scoping review? Guidance for authors when choosing between a systematic or scoping review approach. BMC Medical Research Methodology. 2018;18(1):143. pmid:30453902
- 49. Peters MD, Godfrey C, McInerney P, Munn Z, Tricco AC, Khalil H. Chapter 11: Scoping reviews. Joanna Briggs Institute reviewer’s manual. p. 1–24.
- 50. Levac D, Colquhoun H, O’Brien KK. Scoping studies: Advancing the methodology. Implementation Science. 2010;5(1):69. pmid:20854677
- 51.
Xie B, Shen Z, Wang K. Is preprint the future of science? A thirty year journey of online preprint services. arXiv:2102.09066 [preprint]. 2021. Available from: https://ui.adsabs.harvard.edu/abs/2021arXiv210209066X.
- 52.
Baxter P, Kennedy J, Senft E, Lemaignan S, Belpaeme T, editors. From characterising three years of HRI to methodology and reporting recommendations. 2016 11th ACM/IEEE International Conference on Human-Robot Interaction (HRI); 2016: IEEE.
- 53. Shamir L. The effect of conference proceedings on the scholarly communication in computer science and engineering. Scholarly and Research Communication. 2010;1(2).
- 54.
Covidence systematic review software: Veritas Health Innovation, Melbourne, Australia. Available from: www.covidence.org.
- 55. Moher D, Liberati A, Tetzlaff J, Altman DG. Preferred reporting items for systematic reviews and meta-analyses: The PRISMA statement. BMJ. 2009;339:b2535. pmid:19622551
- 56. Hsiao H-S, Chang C-S, Lin C-Y, Hsu H-L. "iRobiQ": The influence of bidirectional interaction on kindergarteners’ reading motivation, literacy, and behavior. Interactive Learning Environments. 2015;23(3):269–92. EJ1058393.
- 57.
Boccanfuso L, Kim ES, Snider JC, Wang Q, Wall CA, DiNicola L, et al. Autonomously detecting interaction with an affective robot to explore connection to developmental ability. 2015 International Conference on Affective Computing and Intelligent Interaction (ACII); 2015. p. 1–7.
- 58. Sommer K, Slaughter V, Wiles J, Owen K, Chiba AA, Forster D, et al. Can a robot teach me that? Children’s ability to imitate robots. Journal of Experimental Child Psychology. 2021;203. pmid:33302129
- 59. Critten V, Hagon H, Messer D. Can pre-school children learn programming and coding through guided play activities? A case study in computational thinking. Early Childhood Education Journal. 2022;50(6):969–81. EJ1340706.
- 60. Barber O, Somogyi E, McBride A, Proops L. Exploring the role of aliveness in children’s responses to a dog, biomimetic robot, and toy dog. Computers in Human Behavior. 2023;142. WOS:000922494900001.
- 61. Kahn PH Jr., Friedman B, Perez-Granados DR, Freier NG. Robotic pets in the lives of preschool children. Interaction Studies. 2006;7(3):405–36. WOS:000244942600009.
- 62. Sommer K, Redshaw J, Slaughter V, Wiles J, Nielsen M. The early ontogeny of infants’ imitation of on screen humans and robots. Infant Behavior and Development. 2021;64:101614. pmid:34333263
- 63. Dunham P, Dunham F, Tran S, Akhtar N. The nonreciprocating robot: Effects on verbal discourse, social play, and social referencing at two years of age. Child Development. 1991;62(6):1489–502. pmid:1786730
- 64. Dunham P, Dunham F. The semantically reciprocating robot: Adult influences on children’s early conversational skills. Social Development. 1996;5(3):261–74.
- 65. Ferrier S, Dunham P, Dunham F. The confused robot: Two-year-olds’ responses to breakdowns in conversation. Social Development. 2000;9(3):337–47.
- 66. O’Connell L, Poulin-Dubois D, Demke T, Guay A. Can infants use a nonhuman agent’s gaze direction to establish word–object relations? Infancy. 2009;14(4):414–38. pmid:32693449
- 67. Meltzoff AN, Brooks R, Shon AP, Rao RPN. “Social” robots are psychological agents for infants: A test of gaze following. Neural Networks. 2010;23(8):966–72. pmid:20951333
- 68. Itakura S, Ishida H, Kanda T, Shimada Y, Ishiguro H, Lee K. How to build an intentional android: Infants’ imitation of a robot’s goal-directed actions. Infancy. 2008;13(5):519–32.
- 69. Fitter NT, Funke R, Carlos Pulido J, Eisenman LE, Deng W, Rosales MR, et al. Socially assistive infant-robot interaction using robots to encourage infant leg-motion training. IEEE Robotics & Automation Magazine. 2019;26(2):12–23. WOS:000471680800005.
- 70. Somma R. Robie Sr. Robot. openverse (CC BY 2.0 https://creativecommons.org/licenses/by/2.0/?ref=openverse) https://openverse.org/image/b616072e-5dcf-44e3-8a42-a4338ae72c72?q=Somma%20Robie%20Sr.%20Robot
- 71. Manzi F, Peretti G, Di Dio C, Cangelosi A, Itakura S, Kanda T, et al. A robot is not worth another: Exploring children’s mental state attribution to different humanoid robots. Frontiers in Psychology. 2020;11:Figure 1 (CC BY) https://www.frontiersin.org/journals/psychology/about#about-open. pmid:33101099
- 72. Manzi F, Ishikawa M, Di Dio C, Itakura S, Kanda T, Ishiguro H, et al. The understanding of congruent and incongruent referential gaze in 17-month-old infants: an eye-tracking study comparing human and robot. Scientific Reports. 2020;10(1):11918: Figure 3 (CC BY 4.0) https://creativecommons.org/licenses/by/4.0/. pmid:32681110
- 73. dullhunk. Nao social humanoid robot from aldebaran robotics at animation 2012. openverse (CC BY 2.0 https://creativecommons.org/licenses/by/2.0/?ref=openverse) https://openverse.org/image/965747b9-7372-4ef0-bc45-8b3f5a77a7d9?q=Nao%20social%20humanoid%20robot%20from%20aldebaran.
- 74. Kim J, Mishra AK, Limosani R, Scafuro M, Cauli N, Santos-Victor J, et al. Control strategies for cleaning robots in domestic applications: A comprehensive review. International Journal of Advanced Robotic Systems. 2019;16(4):Figure 4 (CC BY 4.0) https://creativecommons.org/licenses/by/4.0/.
- 75. Jeong G-M, Park C-W, You S, Ji S-H. A study on the education assistant system using smartphones and service robots for children. International Journal of Advanced Robotic Systems. 2014;11(4):71: Figure 1 (CC BY 3.0) https://creativecommons.org/licenses/by/3.0/.
- 76. Loimere. Sphero! openverse (CC BY 2.0 https://creativecommons.org/licenses/by/2.0/?ref=openverse) https://openverse.org/image/f8fe1444-9400-4a33-9597-dd2b8015d868?q=Sphero%21.
- 77. Matsuda G, Ishiguro H, Hiraki K. Infant discrimination of humanoid robots. Frontiers in Psychology. 2015;6:Figure 1 (CC BY 4.0) https://creativecommons.org/licenses/by/4.0/. pmid:26441772
- 78. gophodotcom. DSC_0096. openverse (CC BY 2.0 https://creativecommons.org/licenses/by/2.0/?ref=openverse) https://openverse.org/image/68971a4a-3deb-4a52-bc0d-811863c7bf4a?q=Keepon.
- 79. Cervera N, Diago PD, Orcos L, Yáñez DF. The acquisition of computational thinking through mentoring: An exploratory study. Education Sciences. 2020;10(8):202: Figure 2 (CC BY 4.0) https://creativecommons.org/licenses/by/4.0/.
- 80. Riddoch KA, Hawkins RD, Cross ES. Exploring behaviours perceived as important for human—Dog bonding and their translation to a robotic platform. PLOS ONE. 2022;17(9):e0274353: Figure 1 (CC BY 4.0) https://creativecommons.org/licenses/by/4.0/. pmid:36170337
- 81. Poulin-Dubois D, Lepage A, Ferland D. Infants’ concept of animacy. Cognitive Development. 1996;11(1):19–36.
- 82. Arita A, Hiraki K, Kanda T, Ishiguro H. Can we talk to robots? Ten-month-old infants expected interactive humanoid robots to be talked to by persons. Cognition. 2005;95(3):B49–B57. pmid:15788157
- 83. Kamewari K, Kato M, Kanda T, Ishiguro H, Hiraki K. Six-and-a-half-month-old children positively attribute goals to human action and to humanoid-robot motion. Cognitive Development. 2005;20(2):303–20.
- 84. Okumura Y, Kanakogi Y, Kanda T, Ishiguro H, Itakura S. The power of human gaze on infant learning. Cognition. 2013;128(2):127–33. pmid:23672983
- 85. Okumura Y, Kanakogi Y, Kanda T, Ishiguro H, Itakura S. Infants understand the referential nature of human gaze but not robot gaze. Journal of Experimental Child Psychology. 2013;116(1):86–95. WOS:000321723500008. pmid:23660178
- 86. Okumura Y, Kanakogi Y, Kanda T, Ishiguro H, Itakura S. Can infants use robot gaze for object learning? The effect of verbalization. Interaction Studies. 2013;14(3):351–65. WOS:000338351400004.
- 87. Wang Y, Park Y-H, Itakura S, Henderson AME, Kanda T, Furuhata N, Ishiguro H. Infants’ perceptions of cooperation between a human and robot. Infant and Child Development. 2020;29(2). WOS:000501682800001.
- 88. Manzi F, Ishikawa M, Di Dio C, Itakura S, Kanda T, Ishiguro H, et al. Infants’ prediction of humanoid robot’s goal-directed action. International Journal of Social Robotics. 2022. WOS:000882557200001.
- 89. Deng W, Sargent B, Havens K, Vanderbilt D, Rosales M, Pulido JC, et al. Correlation between performance and quantity/variability of leg exploration in a contingency learning task during infancy. Infant Behavior & Development. 2023;70:1–10. pmid:36399847
- 90. Funke R, Fitter NT, de Armendi JT, Bradley NS, Sargent B, Mataric MJ, et al. A data collection of infants’ visual, physical, and behavioral reactions to a small humanoid robot. 2018 IEEE Workshop on Advanced Robotics and its Social Impacts (ARSO); 2018. p. 99–104.
- 91. Alac M, Movellan J, Malmir M. Grounding a sociable robot’s movements in multimodal, situational engagements. New Frontiers in Artificial Intelligence (JSAI-ISAI 2013); 2014. p. 267–81.
- 92. Peca A, Simut R, Cao H-L, Vanderborght B. Do infants perceive the social robot Keepon as a communicative partner? Infant Behavior & Development. 2016;42:157–67. WOS:000372389100017. pmid:26589653
- 93. Sommer K, Slaughter V, Wiles J, Nielsen M. Revisiting the video deficit in technology-saturated environments: Successful imitation from people, screens, and social robots. Journal of Experimental Child Psychology. 2023;232:105673. MEDLINE:37068443.
- 94. Kahn PH Jr., Gary HE, Shen S. Children’s social relationships with current and near-future robots. Child Development Perspectives. 2013;7(1):32–7. WOS:000314975500007.
- 95. Thompson EL, Bird G, Catmur C. Conceptualizing and testing action understanding. Neuroscience & Biobehavioral Reviews. 2019;105:106–14. pmid:31394116
- 96. Heyes CM. Social learning in animals: Categories and mechanisms. Biological Reviews. 1994;69(2):207–31. pmid:8054445
- 97. Papakostas GA, Sidiropoulos GK, Papadopoulou CI, Vrochidou E, Kaburlasos VG, Papadopoulou MT, et al. Social robots in special education: A systematic review. Electronics. 2021;10(12):1398.
- 98. Pennisi P, Tonacci A, Tartarisco G, Billeci L, Ruta L, Gangemi S, et al. Autism and social robotics: A systematic review. Autism Research. 2016;9(2):165–83. pmid:26483270
- 99. Kouroupa A, Laws KR, Irvine K, Mengoni SE, Baird A, Sharma S. The use of social robots with children and young people on the autism spectrum: A systematic review and meta-analysis. PLoS ONE. 2022;17(6). pmid:35731805
- 100. Sani-Bozkurt S, Bozkus-Genc G. Social robots for joint attention development in autism spectrum disorder: A systematic review. International Journal of Disability, Development and Education. 2023;70(5):625–43.
- 101. Kohli M, Kar AK, Sinha S. Robot facilitated rehabilitation of children with autism spectrum disorder: A 10 year scoping review. Expert Systems. 2023;40(5). WOS:000894239600001.
- 102. Alabdulkareem A, Alhakbani N, Al-Nafjan A. A systematic review of research on robot-assisted therapy for children with autism. Sensors. 2022;22(3):944. pmid:35161697
- 103. Kabacinska K, Prescott TJ, Robillard JM. Socially assistive robots as mental health interventions for children: A scoping review. International Journal of Social Robotics. 2021;13(5):919–35. WOS:000552929400001.
- 104. Dawe J, Sutherland C, Barco A, Broadbent E. Can social robots help children in healthcare contexts? A scoping review. BMJ Paediatrics Open. 2019;3(1). pmid:30815587
- 105. Lau Y, Chee DGH, Chow XP, Wong SH, Cheng LJ, Lau ST. Humanoid robot-assisted interventions among children with diabetes: A systematic scoping review. International Journal of Nursing Studies. 2020;111:103749. pmid:32911362
- 106. Triantafyllidis A, Alexiadis A, Votis K, Tzovaras D. Social robot interventions for child healthcare: A systematic review of the literature. Computer Methods and Programs in Biomedicine Update. 2023;3:100108.
- 107. Melson GF, Kahn PH, Beck A, Friedman B, Roberts T, Garrett E, et al. Children’s behavior toward and understanding of robotic and living dogs. Journal of Applied Developmental Psychology. 2009;30(2):92–102.
- 108. Kahn PH Jr, Kanda T, Ishiguro H, Freier NG, Severson RL, Gill BT, et al. “Robovie, you’ll have to go into the closet now”: Children’s social and moral relationships with a humanoid robot. Developmental Psychology. 2012;48(2):303–14. pmid:22369338
- 109. Desideri L, Bonifacci P, Croati G, Dalena A, Gesualdo M, Molinario G, et al. The mind in the machine: Mind perception modulates gaze aversion during child–robot interaction. International Journal of Social Robotics. 2021;13(4):599–614.
- 110. Di Dio C, Manzi F, Peretti G, Cangelosi A, Harris PL, Massaro D, et al. Shall I trust you? From child–robot interaction to trusting relationships. Frontiers in Psychology. 2020;11. pmid:32317998
- 111. Csibra G. Teleological and referential understanding of action in infancy. Philosophical Transactions of the Royal Society of London Series B: Biological Sciences. 2003;358(1431):447–58. pmid:12689372