Abstract
Collaborative problem-solving (CPS) competency is critical for 21st century students. However, reports from the Programme for International Student Assessment (PISA) 2015 have revealed significant deficiencies in this competency among young students globally, indicating a critical need for the cultivation of CPS skills. Therefore, it is essential for educators and researchers to examine the factors that influence CPS competency and understand the potential role of CPS in secondary education. The present study aims to investigate the relationship between collaboration dispositions and students’ CPS competency as well as the relationships of CPS competency and inquiry-based science instruction (IBSI) with science achievement using the PISA 2015 data. A total of 408,148 students from 52 countries and economies (i.e., regions) were included in our analysis. Unlike most previous studies that only investigated one country at a time and neglected the multilevel data structure of PISA, this study provided a global view by adopting multilevel modeling to account for the cluster effect at the school and country levels. Our findings revealed that valuing relationships was positively associated with CPS, whereas valuing teamwork was negatively associated with CPS. Furthermore, CPS competency was found to be a dominant and positive predictor of science achievement among all study variables, underscoring the importance of integrating CPS into teaching practices to promote student success in science. Additionally, different IBSI activities showed varying relationships with science achievement, indicating that caution should be taken when recommending any specific practices associated with IBSI to teachers.
Citation: Tang X, Liu Y, Milner-Bolotin M (2023) Investigating student collaborative problem-solving competency and science achievement with multilevel modeling: Findings from PISA 2015. PLoS ONE 18(12): e0295611. https://doi.org/10.1371/journal.pone.0295611
Editor: Sonia Vasconcelos, Institute of Medical Biochemistry Leopoldo de Meis (IBqM) - Federal University of Rio de Janeiro (UFRJ), BRAZIL
Received: July 17, 2023; Accepted: November 25, 2023; Published: December 8, 2023
Copyright: © 2023 Tang et al. This is an open access article distributed under the terms of the Creative Commons Attribution License, which permits unrestricted use, distribution, and reproduction in any medium, provided the original author and source are credited.
Data Availability: All data are available from the Programme for International Student Assessment (PISA) 2015 database: https://www.oecd.org/pisa/data/2015database/.
Funding: The authors received no specific funding for this work.
Competing interests: The authors have declared that no competing interests exist.
Introduction
Collaborative problem-solving (CPS) has been increasingly recognized as being critical for succeeding in educational and work settings in the 21st century [1, 2]. The 2015 Programme for International Student Assessment (PISA) defines CPS as “the capacities of an individual to effectively engage in a process whereby two or more agents attempt to solve a problem by pooling their knowledge, skills and efforts” [3 p.32]. Similarly, the Assessment and Teaching of 21st Century Skills (ATC21S) conceptualizes CPS as “a set of distinguishable subskills which are deployed in accordance with situational needs” [4 p.41]. The Australian Council for Educational Research views it as “a division of labour with participants who are engaged in active discourse that results in a compilation of their efforts” [5 p.2]. Despite the variations in wording to define CPS, it can be seen from these definitions that CPS entails more than simply individuals working together; instead, it involves the synergistic combination of their diverse knowledge, skills, and efforts to reach a solution. As such, CPS competency comprises two important components: the cognitive problem-solving component and the social collaborative component [3, 6].
The evolving complexities associated with modern world problems often challenge students beyond their individual capacities in both content knowledge and problem-solving skills [7]. Consequently, collaboration among individuals with diverse expertise becomes essential, especially in science, technology, engineering and mathematics (STEM) education, as the field itself is highly collaborative and relies on collaborative problem-solving at its core [8]. In a meta-analysis of studies in STEM contexts, Lai and Wong [9] reviewed 33 publications involving a total of 4717 learners and found that learners who engaged in CPS activities exhibited better cognitive and affective learning outcomes compared to those who engaged in individual problem-solving activities.
However, reports from PISA 2015 have revealed significant deficiencies in CPS competency among young students globally. Only 8% of students from 52 countries achieved the highest level of CPS proficiency, whereas 29% performed at the lowest CPS level [10]. This indicates a critical need for the cultivation of CPS skills in young students and the implementation of relevant educational programs aimed at CPS training. To inform decision making, it is essential to examine the factors that influence CPS competency and understand the potential role of CPS in secondary education before the government and schools make large-scale investments in CPS training across the curriculum. In our literature review, we delve into some major factors related to CPS and how CPS and other important factors are related to secondary school student achievement, specifically in the field of science.
Collaboration dispositions and CPS competency
Previous studies have found a variety of factors associated with CPS competency, such as social media-related (e.g., attitude towards social media, and purposes and contexts of social media usage), teacher- and school-related (e.g., school location, the proportion of fully certified teachers, and teacher unfairness), and socioenvironmental (e.g., cultures and economic inequality) factors [11–13]. Although some of these factors were shown to be important to CPS development, it is challenging to directly manipulate these external factors in CPS intervention programs. In contrast, internal factors (e.g., motivation to complete goals, perceived locus of causality, enjoyment of learning [14]) are relatively easy to include in intervention programs and may be more important to CPS development because these factors can inherently shape individuals’ thinking and behaviors.
To the best of our knowledge, the research examining how internal factors affect CPS performance is currently limited [14]. Based on the existing literature, one such factor worth investigating is collaboration dispositions (i.e., attitudes towards collaboration). Collaboration dispositions have been shown to positively impact a range of outcomes related to attitudes, beliefs, and performance. These outcomes include team cohesiveness, satisfaction with teamwork, expected quality of teamwork, positive thinking, STEM self-efficacy, beliefs about the values of STEM, as well as perceived learning and academic performance [15–17]. In all these studies, collaboration dispositions were conceptualized as a unidimensional construct, regardless of the measures used.
In PISA research, however, collaboration dispositions were reflected by two distinct dimensions: valuing relationships, which measures the willingness to interact for altruistic reasons rather than one’s own benefit during collaborative activities, and valuing teamwork, which represents the extent to which students acknowledge the advantages of collaboration and tend to work in teams. The findings from PISA research deviated from previous studies that assumed a unidimensional structure of collaboration dispositions. Two peer-reviewed journal papers (China [11, 18]) and PISA national reports (Canada [19]; England [20]) found that valuing relationships was positively associated with CPS, but valuing teamwork was negatively related to CPS.
However, these findings are solely based on data from three countries (Canada, China, England), and the analytical methods used in PISA research may be problematic. For example, linear regression was employed as the analytical method in these national reports, which fails to account for the nested structure of PISA data. The nested data structure refers to the scenario in which students from the same school tend to be more similar to each other than students from other schools in the sample (also known as the cluster effect or design effect), so the observations are not independent. Given these limitations, the relationship between collaboration dispositions and CPS needs to be further evaluated with more appropriate statistical models and data from additional countries. Our study aims to fill this gap.
Several demographic variables, including economic, social, and cultural status (ESCS), gender, and Human Development Index (HDI), were included in this study as covariates to control for the confounding effects from these covariates and capture the true relationship between the primary predictors and the outcome variable. ESCS and gender were found to influence CPS performance in PISA 2015 test results, with girls scoring higher than boys in all participating countries [3] and socioeconomically privileged students scoring higher than socioeconomically underprivileged students [19, 20]. In addition, a country-level covariate HDI was retrieved from a source outside the PISA database and added to our analysis to control for the heterogeneity of country characteristics.
CPS competency, inquiry-based instruction, and science achievement
The integration of CPS training and inquiry-based instruction has been considered as a promising approach to improve student learning outcomes in contemporary science education reform [21, 22]. In scientific fields, CPS and inquiry are closely related, as students apply inquiry skills, such as idea testing, prediction making, and evidence-based reasoning, when engaging in CPS activities [23]. To obtain a further understanding of their roles in science education, we investigated the relationship of CPS competency and inquiry-based science instruction (IBSI) with student science achievement.
CPS competency and science achievement.
It has been argued that “science is best learned when students practice the language and tools of scientific problem-solving in socially situated activities” [24 p.646]. The CPS process would scaffold students to work collaboratively in groups, address conflict situations, apply critical and creative thinking to achieve solutions, and draw conclusions from empirical evidence, which are considered core elements of science literacy [25]. Palincsar et al.’s [24] study has demonstrated an increase in science performance after the implementation of a CPS-based instructional program, with a sample of 130 middle school students. Additionally, Chan et al.’s [26], Ebrahim’s [27], and Musalamani et al.’s [28] studies with sample sizes of 69, 163, and 120 students, respectively, have reported that CPS-based teaching methods have produced significantly more positive effects on student science achievement compared to traditional teaching methods. Although CPS seems a promising factor that may positively influence student science achievement, these findings are either predominantly based on qualitative analysis or derived from quantitative analysis with a small sample size. Recent findings from the PISA 2015 technical report showed that CPS scores had a moderately strong correlation with science scores, ranging from .65 to .83 in different countries [29]. However, to provide further support for the significance of CPS to student science achievement, more studies need to be conducted to examine the relationship between CPS and science achievement.
IBSI and science achievement.
Traditional lecture-based learning fails to adequately address students’ need for CPS competency in the 21st century. As a result, inquiry-based learning in which “students use an authentic problem as the context for an in-depth investigation of what they need and what to know” is attracting growing attention [30 p.26, 31–33]. Some prior studies employing a quasi-experimental design have consistently shown that inquiry-based teaching yields greater benefits for student science learning than conventional teaching approaches [34–36]. However, studies that used the PISA datasets to examine the relationship between IBSI and science achievement have produced conflicting results. When IBSI was used as a composite index variable, students who experienced lower levels of IBSI tended to perform better in science [37–40]. However, both Cairns [37] and Oliver et al. [40] found that individual items of IBSI had varying associations (positive, negative, and almost zero) with science performance. Using factor analysis, two studies found that one factor of IBSI was positively related to science achievement, while the other showed a negative association [41, 42]. Given the complexity of the relationships between IBSI and science achievement, individual item scores of IBSI were used in our study to examine how specific activities or practices of IBSI influenced science achievement scores.
In addition to CPS and IBSI, four attitudinal and three demographic factors were included in our study as covariates to adjust for their effects on the outcome variable and increase the precision of estimates. Attitudinal factors comprised a set of variables that reflected attitudes toward science measured by the PISA’s student questionnaire, including science self-efficacy, enjoyment of science, interest in broad science topics, and epistemological beliefs. These factors were all found to be positively associated with PISA science achievement [43–45]. For demographic variables, ESCS had a positive correlation with PISA science achievement across countries [40] and boys had higher scores than girls overall in the PISA 2015 science assessment [46]; hence, ESCS and gender must be controlled for. HDI was also used to adjust for the heterogeneity of country characteristics.
In order to uncover the key features of developing CPS and science competencies around the globe, we performed a comprehensive analysis using the PISA 2015 dataset. We used multilevel models to examine the effects of student- and country-level variables as well as within- and between-country variation, and addressed two research questions:
- RQ-1: How are dispositions towards collaboration (valuing relationships and valuing teamwork) associated with students’ CPS competency?
- RQ-2: How are CPS competency and IBSI related to students’ science achievement?
Of note, for RQ-1, gender, ESCS, and HDI were included to control for gender difference and differences in the individual’s and country’s socioeconomic status. For RQ-2, four more covariates (i.e., science self-efficacy, enjoyment of science, interest in broad science topics, and epistemological beliefs) were added to control for individual differences in their beliefs in and attitudes towards science.
Methods
Data source and sample
The data used in this study were retrieved from the PISA 2015 (https://www.oecd.org/pisa/data/2015database/), which is a large-scale, international assessment that evaluated both CPS competency and science achievement. It should be noted that the CPS assessment was conducted only in the PISA 2015 cycle. A two-stage stratified sampling design was adopted in PISA data collection to ensure the representativeness of the sample, with schools being sampled using probability proportional to their size (i.e., the estimated number of 15-year-olds enrolled in PISA-eligible schools), and students being sampled with equal probability within schools [29]. Out of 72 countries and economies (i.e., regions) that participated in PISA 2015, 20 did not take part in the CPS assessment due to its optional nature and requirement of computer access. Hence, this study only included 52 countries and economies with a total sample of 408,148 students. Ethics approval was not required for this study, as the extracted data were anonymized before access and analysis.
Measures
Predictors.
All measures used from the PISA 2015 student questionnaire are listed in S1 File. For collaboration dispositions, the valuing relationships index was derived from four items rated on a 4-point Likert-type scale, and the valuing teamwork index was derived from another set of four items. The range of Cronbach’s alpha reliability estimates was 0.59–0.75 for valuing relationships and 0.57–0.87 for valuing teamwork across countries [29]. Students’ total scores on each scale were transformed using a generalized partial credit model and standardized to a metric with a mean of 0 and a standard deviation of 1, with higher scores indicating more positive collaboration dispositions. For more detailed descriptions of the data transformation, please refer to the PISA technical report [29]. A total of nine items were included in the IBSI scale, and each was used as an individual predictor. All items were rated on a 4-point Likert scale and were reverse scored, with higher values representing a higher frequency of activities in science lessons. This scale demonstrated good reliability across countries, with Cronbach’s alpha ranging from 0.83 to 0.90 [29].
Covariates.
Seven covariates were included, i.e., gender, HDI, ESCS, science self-efficacy, enjoyment of science, interest in broad science topics, and epistemological beliefs. Gender was included as a control variable (0 = girls; 1 = boys). HDI is a composite index of life expectancy, education, and per capita income indicators, which reflects the social and economic development levels of countries [47]. It is scored on a scale from 0 to 1. ESCS was a composite score derived from three indices concerning economic, social, and cultural status: home possessions, highest parental occupational status, and highest parental educational level. The Cronbach’s alpha for this measure ranged from 0.36 to 0.77 across countries [29]. For the attitude variables, the scales of science self-efficacy, enjoyment of science, epistemological beliefs, and interest in broad science topics consisted of eight, five, six, and five items, respectively. The four measures showed acceptable reliability across countries, with Cronbach’s alpha of 0.83–0.94, 0.85–0.97, 0.80–0.94, and 0.72–0.86, respectively [29]. All these scales were rated on a 4-point scale except interest in broad science topics, which used a 5-point scale. The indices of ESCS and the attitudes toward science were standardized with a mean of 0 and a standard deviation of 1 by PISA researchers. Additional details about the transformation process can be found in the PISA technical report [29].
Outcomes.
CPS and science scores were the outcome variables in RQ-1 and RQ-2, respectively. CPS scores were also used as a predictor to address RQ-2. In the PISA 2015 CPS assessment, students collaborated with computer-simulated agents to complete various tasks involving real-life problem scenarios [10]. They made selections from lists of predefined messages and earned credits when choosing the correct answer. Aside from message communication, non-chat problem-solving actions (e.g., moving elements on the screen) were also recorded and scored as a reflection of CPS competency. With the use of predefined stimuli, PISA 2015 outlined the expected interaction patterns and collaborative behaviors, thereby providing standardized assessment conditions and ensuring the comparability of CPS scores.
Several validation studies have shown that the PISA 2015 CPS assessment is “functioning reasonably well” [48 p.12]. Scoular et al. [48] demonstrated that the CPS test was measuring an assumed unidimensional collaboration construct using item response theory analysis on PISA data. Additionally, they found the test to have good item coverage, with most items at a moderate difficulty level, and to be sufficiently sensitive to differentiate students with varying collaborative abilities. Other empirical studies applying released PISA 2015 CPS tasks also provided validity evidence. Herborn et al. [49] found no significant differences in CPS performance accuracy between students collaborating with human partners and those with computer agents, supporting the use of computer-simulated agents in collaborative environments. Stadler et al. [50] revealed that student CPS scores were moderately related to their self-rated scores on collaborative abilities and teacher-rated scores on student collaborative and reasoning abilities. Moreover, CPS scores strongly predicted students’ collaboration performance with their own team player [50]. Together, these studies supported that this test was assessing the CPS construct as intended by the PISA developers.
In the 2015 cycle, there were 184 items in the science assessment and 117 items in the CPS assessment, and each student only answered part of them due to time constraints. For CPS and science assessments, 10 plausible values (PVs) ranging from 1 to 1000 were estimated for each student to represent their overall competencies by PISA researchers, using item response theory scaling methods [29]. The traditional approach for handling PVs was to use only one of them or an average of all PVs, resulting in a limited amount of information being incorporated. To fully utilize all the information provided by PVs, it is recommended to adopt Aparicio et al.’s [51] approach, which was to analyze each model 10 times with each of the PVs and report an average of 10 sets of estimates. This approach allows for a more accurate representation of educational outcomes, as it uses the whole distribution of test scores as a proxy of student performance [51]. Following this recommendation, we used 10 CPS PVs for the outcome in RQ-1. However, in RQ-2 we encountered the challenge of having multiple PVs for both the predictor variable (i.e., CPS) and the outcome variable (i.e., science achievement), which adds complexity to data analysis. As a result, we incorporated 10 science PVs for the outcome but used the mean of 10 CPS PVs for the predictor variable to simplify the analysis.
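To make this plausible-value handling concrete, the following R sketch (not the authors' code) prepares one dataset per PV and computes the CPS PV mean used as the RQ-2 predictor; the column names PV1CLPS–PV10CLPS and PV1SCIE–PV10SCIE, as well as the input file, are assumptions to be checked against the PISA 2015 codebook.

```r
# Minimal sketch, assuming a pre-merged PISA 2015 student file whose plausible
# values are stored as PV1CLPS-PV10CLPS (CPS) and PV1SCIE-PV10SCIE (science).
pisa <- readRDS("pisa2015_students.rds")  # hypothetical file name

# RQ-2 predictor: mean of the 10 CPS plausible values for each student
pisa$cps_mean <- rowMeans(pisa[paste0("PV", 1:10, "CLPS")])

# Helper that returns one data frame per plausible value, with that PV copied
# to a common outcome column; each list later feeds 10 repeated model fits
make_pv_list <- function(data, pv_suffix, outcome) {
  lapply(1:10, function(i) {
    data[[outcome]] <- data[[paste0("PV", i, pv_suffix)]]
    data
  })
}
```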
Given that the analysis included variables with varying ranges (e.g., 0–1, 1–1000), a scaling technique was applied to ensure that all variables were on a comparable scale, facilitating meaningful interpretations and helping with model convergence. Specifically, the CPS mean was scaled down by a factor of 100 (e.g., a CPS mean of 668.04 in raw data was transformed to 6.68 for analysis) and the HDI was scaled up by a factor of 10 (e.g., an HDI index of 0.92 in raw data was transformed to 9.2 for analysis) to mitigate potential discrepancies in magnitudes.
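Continuing the illustrative preparation above, a brief sketch of this rescaling step follows; the hdi column is assumed to have been merged in from the UNDP report.

```r
# Put variables on comparable scales, as described in the text
pisa$cps_mean_s <- pisa$cps_mean / 100  # e.g., 668.04 -> 6.68
pisa$hdi_s      <- pisa$hdi * 10        # e.g., 0.92   -> 9.2

# Assemble the per-PV analysis datasets once all columns are in place
pv_data_cps <- make_pv_list(pisa, "CLPS", "cps")      # outcomes for RQ-1
pv_data_sci <- make_pv_list(pisa, "SCIE", "science")  # outcomes for RQ-2
```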
Data analysis
Multilevel modeling was performed using R version 4.0.4 [52] with the merTools R package [53]. To account for the cluster effect resulting from the sampling design in PISA data collection, three-level models with random intercepts were employed. To account for the 10 PVs for the outcome in both research questions, we used the lmerModList() function in the merTools R package to obtain the final estimates pooled across the 10 sets of analyses. We followed the model building strategies suggested by Raudenbush and Bryk [54] and added variables sequentially. For RQ-1, we started from the unconditional means model (Model-1A), entered the country-level covariate (Model-1B), subsequently added the student-level covariates (Model-1C), and eventually built the full model (Model-1D) by adding the predictors of interest. The equation of the full model is presented as follows:

CPS_{ijk} = γ_{000} + γ_{001}HDI_{k} + γ_{100}Gender_{ijk} + γ_{200}ESCS_{ijk} + γ_{300}ValuingRelationships_{ijk} + γ_{400}ValuingTeamwork_{ijk} + u_{00k} + r_{0jk} + e_{ijk}   (1)

where CPS_{ijk} denotes the outcome variable; γ_{000} denotes the intercept, i.e., the average CPS competency across all schools and all countries in the sample; γ_{001} to γ_{400} are the regression coefficients for HDI, gender, ESCS, valuing relationships, and valuing teamwork; and three random effects are included in the analysis: e_{ijk} representing the deviation of student i's score from the school mean score, r_{0jk} representing the deviation of school j's mean score from the country mean, and u_{00k} representing the deviation of country k's mean score from the grand mean.
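A sketch of how a model of this form can be fitted with lmerModList(), which runs the same lmer() specification on each of the 10 PV datasets and pools the results, is shown below; the predictor column names continue the hypothetical preparation above rather than reproducing the authors' exact code.

```r
library(lme4)
library(merTools)

# Full RQ-1 model (Model-1D): random intercepts for schools nested in countries,
# fitted once for each CPS plausible value in pv_data_cps
fit_1d <- lmerModList(
  cps ~ hdi_s + gender + escs + val_relationships + val_teamwork +
    (1 | country/school),
  data = pv_data_cps
)

modelFixedEff(fit_1d)  # fixed effects averaged over the 10 PV analyses
```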
Similarly, for RQ-2, we started from the unconditional means model (Model-2A), added both the country- and student-level covariates (Model-2B), followed by the addition of IBSI (Model-2C), and eventually entered CPS competency (Model-2D). The equation of the final model is presented as follows:

Science_{ijk} = γ_{000} + γ_{001}HDI_{k} + γ_{100}Gender_{ijk} + γ_{200}ESCS_{ijk} + γ_{300}EpistemologicalBeliefs_{ijk} + γ_{400}EnjoymentOfScience_{ijk} + γ_{500}InterestInScienceTopics_{ijk} + γ_{600}ScienceSelfEfficacy_{ijk} + γ_{700}IBSI1_{ijk} + … + γ_{1500}IBSI9_{ijk} + γ_{1600}CPS_{ijk} + u_{00k} + r_{0jk} + e_{ijk}   (2)

where Science_{ijk} is the outcome variable; γ_{001} to γ_{1600} are the regression coefficients of HDI, gender, ESCS, epistemological beliefs, enjoyment of science, interest in broad science topics, science self-efficacy, the nine IBSI items, and CPS competency on science achievement; IBSI1 to IBSI9 represent the individual items from the IBSI questionnaire; and the random effects e_{ijk}, r_{0jk}, and u_{00k} are defined as in Equation (1).
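A corresponding sketch of the RQ-2 full model (Model-2D) under the same assumptions follows, with the nine IBSI items entered as separate predictors (illustrative names ibsi1–ibsi9) alongside the rescaled CPS mean.

```r
# Full RQ-2 model (Model-2D), fitted once for each science plausible value
fit_2d <- lmerModList(
  science ~ hdi_s + gender + escs + epist_beliefs + enjoy_science +
    interest_science + sci_selfefficacy +
    ibsi1 + ibsi2 + ibsi3 + ibsi4 + ibsi5 + ibsi6 + ibsi7 + ibsi8 + ibsi9 +
    cps_mean_s + (1 | country/school),
  data = pv_data_sci
)

modelFixedEff(fit_2d)  # pooled fixed effects, as reported for Model-2D
```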
Missing values in the dataset were imputed using a random forests algorithm, as implemented in the R package missRanger [55]. This imputation approach has shown low imputation errors and performance similar to multiple imputation [56].
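A sketch of this imputation step is shown below; in the full workflow it would precede the per-PV dataset assembly sketched earlier, and the pmm.k and num.trees settings are illustrative rather than the authors' values.

```r
library(missRanger)

# Chained random-forest imputation; predictive mean matching (pmm.k) keeps
# imputed values within the observed range of each variable
set.seed(2015)
pisa <- missRanger(pisa, pmm.k = 3, num.trees = 100)
```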
Results
Descriptive statistics of all the variables used in this study are presented in Table 1.
Before running the full-scale data analyses, we conducted a small pilot study to examine whether the relationships tested in the present study held across countries. We selected seven countries with low, medium, and high CPS performance on the PISA tests and fitted the models described above for each country. The results showed consistent patterns across these seven countries, which were similar to the findings based on all 52 participating countries and economies. This pilot study supported our use of all-country data in the present study. Additionally, the multilevel modeling allows the outcome to vary across schools and countries. The results for the 52 countries and economies are presented below.
RQ-1: How are collaboration dispositions (valuing relationships and valuing teamwork) related to students’ CPS competency?
Models 1A to 1D were used to address RQ-1, and the results are provided in Table 2. The unconditional means model (Model-1A) showed that 21% of the total variance in students’ CPS competency was accounted for by school differences, with an intra-class correlation (ICC) of 0.21, and 17% by country differences (ICC = 0.17). These results supported the application of multilevel models to account for the cluster effect in the data.
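These ICCs can be recovered from the variance components of the unconditional means model, as in the following sketch, which fits Model-1A to a single CPS plausible value for illustration under the same hypothetical column names as above.

```r
library(lme4)

# Unconditional means model (Model-1A) on one CPS plausible value
m0 <- lmer(cps ~ 1 + (1 | country/school), data = pv_data_cps[[1]])

vc <- as.data.frame(VarCorr(m0))
v_school  <- vc$vcov[vc$grp == "school:country"]
v_country <- vc$vcov[vc$grp == "country"]
v_student <- vc$vcov[vc$grp == "Residual"]

icc_school  <- v_school  / (v_student + v_school + v_country)  # reported ~0.21
icc_country <- v_country / (v_student + v_school + v_country)  # reported ~0.17
```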
Model-1B examined the role of the country-level covariate. With a 0.1-unit increase in the HDI index, student CPS scores were predicted to increase on average by 43.56 points. Results on variance reduction indicated that HDI explained 49% of the country-level variance in students’ CPS competency. The effects of the student-level covariates were explored in Model-1C. Student ESCS was positively related to CPS scores, and female students scored 19.50 points higher than male students after controlling for the effects of HDI and other variables. These student-level covariates (i.e., gender and student-level ESCS) explained 3% of the student-level variance and 25% of the school-level variance. The literature has shown that school-level variance can be explained by adding a student-level variable if its mean varies over schools [57].
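The variance-reduction percentages reported here and below correspond to the proportional drop in a variance component relative to the preceding model, as in this sketch (m1b denotes a hypothetical Model-1B fit obtained the same way as m0 above).

```r
# Proportion of a variance component explained by the predictors added
# when moving from a reduced model to a fuller model
prop_var_explained <- function(model_reduced, model_full, grp) {
  v0 <- as.data.frame(VarCorr(model_reduced))
  v1 <- as.data.frame(VarCorr(model_full))
  (v0$vcov[v0$grp == grp] - v1$vcov[v1$grp == grp]) / v0$vcov[v0$grp == grp]
}

# e.g., country-level variance explained by adding HDI (Model-1A vs Model-1B);
# the text reports about 49%
prop_var_explained(m0, m1b, grp = "country")
```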
The key predictors, valuing relationships and valuing teamwork, were added to the full model (Model-1D). They showed opposite effects on CPS competency. After controlling for other variables, valuing relationships was positively related to CPS scores (γ300 = 14.75), whereas valuing teamwork was negatively associated with CPS scores (γ400 = -19.50). Together, collaboration dispositions explained 3% of the student-level variance and 5% of the school-level variance in CPS competency on top of the variance explained by the other covariates. The results of the full model showed that students with higher ESCS and from countries with higher HDI achieved higher CPS scores, and female students had better CPS competency than males.
RQ-2: How are CPS competency and IBSI related to students’ science achievement?
Table 3 presents the results of Models 2A to 2D. The unconditional means model (Model-2A) demonstrated the necessity of multilevel modeling, because 27% of the variance in students’ science achievement was accounted for by school differences (ICC = 0.27) and 16% by country differences (ICC = 0.16). Model-2B showed that all the covariates were significantly related to science achievement. Students from countries with higher HDI (γ001 = 31.31) performed better on the PISA science assessment. In terms of student-level covariates, male students (γ100 = 4.79) and students with higher ESCS (γ200 = 13.16) and more positive attitudes toward science, i.e., epistemological beliefs (γ300 = 16.65), enjoyment of science (γ400 = 8.60), interest in broad science topics (γ500 = 6.73), and science self-efficacy (γ600 = 4.70), had higher science scores.
Nine IBSI variables were added in Model-2C; they showed varying relationships with science achievement, and all of them were statistically significant. Four of these IBSI activities were positive predictors of science achievement: students explaining their ideas (γ700 = 0.54), students drawing conclusions from experiments (γ1000 = 2.85), teachers explaining applications of science ideas (γ1100 = 9.55), and teachers explaining the relevance of science concepts (γ1400 = 2.77). However, the other five activities showed a negative association with science achievement: spending time in the laboratory doing experiments (γ800 = -3.00), arguing about science questions (γ900 = -4.52), designing their own experiments (γ1200 = -11.56), having class debates about investigations (γ1300 = -7.30), and doing investigations to test ideas (γ1500 = -8.13). Adding the IBSI variables further reduced the student-, school-, and country-level variance in science achievement by 7%, 16%, and 18%, respectively.
Finally, CPS was added to the full model (Model-2D), which was used to examine the effect of CPS competency on science achievement after controlling for all other variables. Among all variables, CPS was shown to be a dominant predictor; every 100-point increase in CPS was accompanied by an increase of 81.97 points in science achievement. It explained more than half of the student- (55%), school- (78%), and country-level (84%) variance in science achievement on top of the variance explained by all other variables. While all the student-level covariates still had significant relationships with science achievement, HDI was no longer significant, and the effect of IBSI 1 (students explaining their ideas) changed from positive (γ700 = 0.54) to negative (γ700 = -0.24) in the presence of CPS.
Discussion and conclusions
The present study investigated the associations of collaboration dispositions (valuing teamwork and valuing relationships) with CPS as well as the relationships of CPS and IBSI with science achievement using the PISA 2015 dataset. Our results showed that valuing relationships was positively related to CPS, whereas valuing teamwork was negatively related to CPS. Additionally, higher CPS competency corresponded to higher science achievement, and different IBSI activities showed varying relationships with science achievement. Our findings contribute to the existing empirical evidence in two critical areas. First, this study offers a global view on important factors influencing CPS competency and science achievement by utilizing large-scale data from 52 countries and economies. Second, this study provides a comprehensive overview of the impact of these factors through the use of multilevel modeling, taking into account the cluster effect at the school and country levels. A thorough discussion of our findings is presented below.
The relationship between collaboration dispositions and students’ CPS competency
Results of the present study suggest that valuing relationships with others (e.g., considering different perspectives, being a good listener, taking into account what others are interested in, and enjoying seeing classmates be successful) was positively associated with students’ CPS competency, whereas valuing teamwork for its benefits (e.g., raising one’s own efficiency and making better decisions) was negatively associated with CPS competency. These findings are in line with previous studies that utilized PISA 2015 data [11, 18–20]. It is noteworthy that, while valuing teamwork and valuing relationships constituted integral components of students’ collaboration dispositions in PISA 2015, these two dimensions had a moderate correlation (r = 0.45) in our sample, indicating distinctions in their conceptualization. Expanding on the concept of valuing teamwork, although teamwork provides certain advantages and can increase individual willingness to collaborate, valuing it does not necessarily lead to more significant contributions to successful problem-solving. Effective collaboration requires shared responsibility and a dynamic distribution of labour [48]. Team members who exhibit reduced effort in group work (i.e., social loafing) [58] or lack competency [23] can dramatically impair overall team performance, as it places additional burdens on other team members to compensate and strive towards achieving team goals. Without the active and equal participation of each member, emphasizing the benefits of teamwork may have the opposite effect [59]. Further empirical studies are warranted to better understand the complex relationships between these collaboration dispositions and CPS as well as the underlying processes at play.
Furthermore, the beneficial impact of valuing relationships reveals the significance of social interaction quality within the CPS context, as indicated by the extant literature [60, 61]. In addition to being active participants, students need to acquire a range of social skills, such as perspective-taking, active listening, negotiation, and conflict resolution, to communicate effectively in a team and make sense of problems collectively [62]. Such findings have practical implications for educators aiming to design effective intervention programs. First, teachers are recommended to enhance students’ motivation and engagement by showing them the relevance and importance of collaboration in task success. Second, teachers should incorporate classroom activities focused on facilitating students’ communication, negotiation, as well as group and task management skills. By offering students abundant opportunities to practice and providing constructive feedback on their peer interactions, teachers can improve students’ current levels of collaboration, which leads to greater learning outcomes.
The relationships of CPS competency and IBSI with students’ science achievement
Another important finding is that CPS competency was a dominant and positive predictor of science achievement in PISA 2015, suggesting that CPS is conducive to science learning [63]. The cognitive aspect of CPS encompasses essential skills for scientific inquiry such as planning, executing, and monitoring, flexibility, and knowledge building, while the social aspect involves collaboration skills such as participation, perspective taking, and social regulation [4]. With the cognitive skills to analyze and solve problems and the social skills to collaborate and communicate, students can leverage their collective knowledge and skills to tackle scientific questions [14]. As such, students with higher CPS competency can perform better in science. It is worth noting that, unlike science, CPS is rarely taught as a school subject in most countries [64], although there have been some instructional programs specifically designed to foster CPS [23, 60, 65]. In order to benefit science learning outcomes, it is recommended to integrate CPS as a learning practice in the classroom and incorporate it into student work. Beyond the realm of science education, the advantages of implementing CPS can extend to other disciplines where problem-solving plays a central role. For instance, existing literature has demonstrated the positive outcomes of implementing CPS-based programs in STEM education [66, 67].
The relationship between IBSI and science achievement was found to vary depending on the particular activity being examined in this study. Four of the nine activities had a positive effect on students’ performance in science, while the remaining activities had a negative effect. These two groups of items are largely consistent with the two main forms of instructional practice that prior research has identified within the IBSI framework: guided inquiry, which involves some form of teacher guidance, and independent inquiry, which is student-led [41, 42]. Although there were slight differences in the specific items that fell under guided inquiry and independent inquiry between the current study and Aditomo and Klieme’s [41] study, our results support their conclusion that guided inquiry was positively related to science learning outcomes while independent inquiry was negatively related.
It is worth noting, however, that one should not interpret our results as implying that independent inquiry impedes science learning. One possible explanation for the negative relationships found in this study is that students from some countries have rarely, if ever, experienced such activities as argumentative discussions and debates in science classrooms [68, 69]. Hence, their ratings of occurrence of these activities were low, even though they had relatively high scores on the science assessment. Another possible reason is that the PISA 2015 science assessment has been designed to evaluate student performance on a much broader scope (e.g., chemistry, biology, physics, earth and space science), which requires knowledge that goes beyond lab experiments and scientific investigations. For example, one of the released PISA 2015 science items asks about the reason why a meteoroid speeds up when it approaches Earth and its atmosphere [70]. In the case of this astronomy problem, experiment and investigation activities may not be helpful experiences for students in answering the question.
Nonetheless, certain IBSI activities have shown consistent positive effects in this study and prior research [37, 41], such as teachers explaining the applications of science ideas, teachers explaining the relevance of science concepts, and students explaining their ideas. It is recommended that future research examine the effects of each IBSI practice on science achievement with experimental designs and provide more robust evidence for the use or avoidance of particular practices in science education.
The present study is not without limitations. First, we included predictors of interest based on the existing literature, but we might have missed other significant variables that are not yet identified. In particular, a substantial portion of the variation in students’ CPS and science scores was attributed to school-level and country-level differences; however, our analysis only included one variable at the country level (i.e., HDI) and did not incorporate any at the school level. It is necessary to add school- and country-level factors to explore their effects on CPS and science achievement in future research. Second, the PISA 2015 CPS assessment is fully computer-based; therefore, the lack of access to modern technology might limit student participation in certain countries, especially those with low HDI, and thus might limit the generalizability of our findings. The extent to which our findings can be generalized to countries that did not participate in the PISA 2015 CPS assessment needs to be examined in the future. Lastly, our study does not provide evidence for cause-effect relationships. Experimental studies are encouraged to investigate what factors can improve CPS skills and how CPS can improve student learning outcomes. Our study can serve as a starting point for researchers to delve into the development of CPS and science competencies and to inform educators who want to design instructional practices that aim at facilitating these essential skills among secondary school students.
Supporting information
S1 File. Measures selected from PISA2015 student questionnaire for the present study.
https://doi.org/10.1371/journal.pone.0295611.s001
(DOCX)
Acknowledgments
We acknowledge the Programme for International Student Assessment (PISA) 2015 for providing access to the valuable data used in this study. The availability of such a comprehensive and well-curated dataset has been instrumental in conducting our research. We are grateful for the opportunity to utilize this dataset and acknowledge its significant contribution to the outcomes of our study.
References
- 1. Graesser AC, Fiore SM, Greiff S, Andrews-Todd J, Foltz PW, Hesse FW. Advancing the science of collaborative problem solving. Psychological Science in the Public Interest. 2018;19(2):59–92. pmid:30497346
- 2. Griffin P, Care E, editors. Assessment and teaching of 21st century skills: methods and approach. Dordrecht: Springer Netherlands; 2015.
- 3. Organisation for Economic Co-operation and Development [OECD]. PISA 2015 results (volume V): collaborative problem solving [Internet]. 2017 Nov 21 [cited 2023 Sep 18]. https://www.oecd.org/publications/pisa-2015-results-volume-v-9789264285521-en.htm
- 4. Hesse F, Care E, Buder J, Sassenberg K, Griffin P. A framework for teachable collaborative problem solving skills. In: Griffin P, Care E, editors. Assessment and teaching of 21st century skills. Dordrecht: Springer Netherlands; 2015. p.37–56.
- 5. Scoular C, Duckworth D, Heard J, Ramalingam D. Collaboration: definition and structure. Camberwell: Australian Council for Educational Research; 2020.
- 6. Care E, Scoular C, Griffin P. Assessment of collaborative problem solving in education environments. Applied Measurement in Education. 2016;29(4):250–64.
- 7. Antonenko PD, Jahanzad F, Greenwood C. Fostering collaborative problem solving and 21st century skills using the DEEPER scaffolding framework. Journal of college science teaching. 2014;43(6):79–88.
- 8. Martinovic D, Milner-Bolotin M. Problematizing STEM: what it is, what it is not, and why it matters. In: Michelsen C, Beckmann A, Freiman V, Jankvist UT, Savard A, editors. Mathematics and its connections to the arts and sciences (MACAS). Cham: Springer International Publishing; 2022. p.135–62.
- 9. Lai X, Wong GK. Collaborative versus individual problem solving in computational thinking through programming: a meta‐analysis. Brit J Educational Tech. 2022;53(1):150–70.
- 10. OECD. PISA 2015 collaborative problem‑solving framework. In: PISA 2015 assessment and analytical framework [Internet]. OECD Publishing; 2017 [cited 2023 Sep 18]. p.131–88. http://www.oecd-ilibrary.org/education/pisa-2015-assessment-and-analytical-framework/pisa-2015-collaborative-problem-solving-framework_9789264281820-8-en
- 11. Tang P, Liu H, Wen H. Factors predicting collaborative problem solving: based on the data from PISA 2015. Front Educ. 2021;6:619450.
- 12. Wang M, Yu R, Hu J. The relationship between social media-related factors and student collaborative problem-solving achievement: an HLM analysis of 37 countries. Educ Inf Technol [Internet]. 2023 Apr 3 [cited 2023 Sep 18]. Available from: https://link.springer.com/10.1007/s10639-023-11763-z
- 13. Wu Y, Zhao B, Wei B, Li Y. Cultural or economic factors? which matters more for collaborative problem-solving skills: evidence from 31 countries. Personality and Individual Differences. 2022;190:111497.
- 14. Chen L, Inoue K, Goda Y, Okubo F, Taniguchi Y, Oi M, et al. Exploring factors that influence collaborative problem solving awareness in science education. Tech Know Learn. 2020;25(2):337–66.
- 15. Bravo R, Catalán S, Pina JM. Analysing teamwork in higher education: an empirical study on the antecedents and consequences of team cohesiveness. Studies in Higher Education. 2019;44(7):1153–65.
- 16. Chen SK, Yang YTC, Lin C, Lin SSJ. Dispositions of 21st-century skills in STEM programs and their changes over time. Int J of Sci and Math Educ. 2023;21(4):1363–80.
- 17. Wu M, Ho S, Lin H, Chang W. How do thinking styles influence collaborative dispositions? A study on the relationships between thinking styles and collaborative dispositions for youngsters in Taiwan. ESTP. 2013;13(4): 2059–70.
- 18. Zheng JQ, Cheung KC, Sit PS. The effects of perceptions toward interpersonal relationships on collaborative problem-solving competence: comparing four ethnic Chinese communities assessed in PISA 2015. Asia-Pacific Edu Res [Internet]. 2023 May 18 [cited 2023 Sep 18]. https://link.springer.com/10.1007/s40299-023-00744-y
- 19. Scerbina T, Deussing MA, O’Grady K. Measuring up: Canadian results of the OECD PISA 2015 study: the performance of Canadian 15-year-olds in collaborative problem solving [Internet]. Toronto (Canada): Council of Ministers of Education; 2019 [cited 2023 Sep 18]. https://www.voced.edu.au/content/ngv:82829
- 20. Jerrim J, Shure N. Achievement of 15-year-olds in England: PISA 2015: collaborative problem solving national report [Internet]. London (UK): Department for Education; 2017 [cited 2023 Sep 18]. http://dera.ioe.ac.uk/30540/1/PISA_2015_CPS_National_Report_FINAL-1.pdf
- 21. Crippen KJ, Antonenko PD. Designing for collaborative problem solving in STEM cyberlearning. In: Dori YJ, Mevarech ZR, Baker DR, editors. Cognition, metacognition, and culture in STEM education. Cham: Springer International Publishing; 2018. p.89–116.
- 22. Milner-Bolotin M, Aminov O, Wasserman W, Milner V. Pushing the boundaries of science demonstrations using modern technology. Can J Phys. 2020;98(6):571–8.
- 23. Rosen Y, Wolf I, Stoeffler K. Fostering collaborative problem solving skills in science: the Animalia project. Computers in Human Behavior. 2020;104:105922.
- 24. Palincsar AS, Anderson C, David YM. Pursuing scientific literacy in the middle grades through collaborative problem solving. The Elementary School Journal. 1993;93(5):643–58.
- 25. Musthafa MMA, Surekha PM. Fostering scientific literacy among students through collaborative problem solving. International Research Journal of Management Sociology & Humanities. 2014;5(3):200–7.
- 26. Chan CKK, Lam ICK, Leung RWH. Can collaborative knowledge building promote both scientific processes and science achievement? International Journal of Educational Psychology. 2012;(3):199–227.
- 27. Ebrahim A. The effect of cooperative learning strategies on elementary students’ science achievement and social skills in Kuwait. Int J of Sci and Math Educ. 2012;10(2):293–314.
- 28. Musalamani W, Yasin RM, Osman K. The effectiveness of the school based-cooperative problem-based learning (SB-CPBL) model in improving students’ achievement in science. Malaysian Journal of Education. 2022;47(1):75–87.
- 29. OECD. PISA 2015 technical report [Internet]. Paris (France): OECD Publishing; 2017 [cited 2023 Sep 18]. https://www.oecd.org/pisa/sitedocument/PISA-2015-technical-report-final.pdf
- 30. Akçay B. Problem-based learning in science education. Journal of Turkish Science Education. 2009;6(1):28–38.
- 31. Iwuanyanwu PN. Nature of problem-solving skills for 21st century STEM learners: what teachers need to know. JSTE. 2020;55(1):27–40.
- 32. Windschitl M, Thompson J, Braaten M. Beyond the scientific method: model-based inquiry as a new paradigm of preference for school science investigations. Sci Ed. 2008;92(5):941–67.
- 33. Wong KKH, Day JR. A comparative study of problem-based and lecture-based learning in junior secondary school science. Res Sci Educ. 2009;39(5):625–42.
- 34. Abdi A. The effect of inquiry-based learning method on students’ academic achievement in science course. UJER. 2014;2(1):37–41.
- 35. Ergül R, Şımşeklı Y, Çaliş S, Özdılek Z, Göçmençelebı Ş, & Şanli M. The effects of inquiry-based science teaching on elementary school students’ science process skills and science attitudes. BJSEP. 2011;5(1):48–68.
- 36. Nasution WN. The effects of inquiry-based learning approach and emotional intelligence on students’ science achievement levels. Journal of Turkish Science Education. 2018;15(4):104–15.
- 37. Cairns D. Investigating the relationship between instructional practices and science achievement in an inquiry-based learning environment. International Journal of Science Education. 2019;41(15):2113–35.
- 38. Cairns D, Areepattamannil S. Exploring the relations of inquiry-based teaching to science achievement and dispositions in 54 countries. Res Sci Educ. 2019;49(1):1–23.
- 39. Gómez RL, Suárez AM. Do inquiry-based teaching and school climate influence science achievement and critical thinking? Evidence from PISA 2015. IJ STEM Ed. 2020;7(1):43.
- 40. Oliver M, McConney A, Woods-McConney A. The efficacy of inquiry-based instruction in science: a comparative analysis of six countries using PISA 2015. Res Sci Educ. 2021;51(S2):595–616.
- 41. Aditomo A, Klieme E. Forms of inquiry-based science instruction and their relations with learning outcomes: evidence from high and low-performing education systems. International Journal of Science Education. 2020;42(4):504–25.
- 42. Lau KC, Lam TY. Instructional practices and science performance of 10 top-performing regions in PISA 2015. International Journal of Science Education. 2017;39(15):2128–49.
- 43. Lau KC, Ho SCE. Attitudes towards science, teaching practices, and science performance in PISA 2015: multilevel analysis of the Chinese and Western top performers. Res Sci Educ. 2022;52(2):415–26.
- 44. Ozel M, Caglak S, Erdogan M. Are affective factors a good predictor of science achievement? Examining the role of affective factors based on PISA 2006. Learning and Individual Differences. 2013;24:73–82.
- 45. Zhu Y. How Chinese students’ scientific competencies are influenced by their attitudes? International Journal of Science Education. 2019;41(15):2094–112.
- 46. Bijou M, Liouaeddine M. Gender and students’ achievements: evidence from PISA 2015. WJE. 2018;8(4):24–35.
- 47. United Nations Development Programme. Human development report 2015: work for human development [Internet]. United Nations: Human Development Reports; 2015 [cited 2023 Sep 18]. https://hdr.undp.org/content/human-development-report-2015
- 48. Scoular C, Eleftheriadou S, Ramalingam D, Cloney D. Comparative analysis of student performance in collaborative problem solving: what does it tell us? Australian Journal of Education. 2020;64(3):282–303.
- 49. Herborn K, Stadler M, Mustafić M, Greiff S. The assessment of collaborative problem solving in PISA 2015: can computer agents replace humans? Computers in Human Behavior. 2020;104:105624.
- 50. Stadler M, Herborn K, Mustafić M, Greiff S. The assessment of collaborative problem solving in PISA 2015: an investigation of the validity of the PISA 2015 CPS tasks. Computers & Education. 2020;157:103964.
- 51. Aparicio J, Cordero JM, Ortiz L. Efficiency analysis with educational data: how to deal with plausible values from international large-scale assessments. Mathematics. 2021;9(13):1579.
- 52. R Core Team. R: a language and environment for statistical computing [Internet]. R Foundation for Statistical Computing; 2022. https://www.R-project.org/
- 53. Knowles JE, Frederick C. merTools: tools for analyzing mixed effect regression models. R package version 0.3.0; 2016. https://rdrr.io/cran/merTools/
- 54. Raudenbush SW, Bryk AS. Hierarchical linear models: applications and data analysis methods. 2nd ed. Thousand Oaks: Sage Publications; 2002.
- 55. Mayer M. missRanger: fast imputation of missing values. R package version 2.0; 2019.
- 56. Waljee AK, Mukherjee A, Singal AG, Zhang Y, Warren J, Balis U, et al. Comparison of imputation methods for missing laboratory data in medicine. BMJ Open. 2013;3(8):e002847. pmid:23906948
- 57. Aitkin M, Longford N. Statistical modelling issues in school effectiveness studies. J R Stat Soc Ser A Gen. 1986;149(1):1–26.
- 58. Teng CC, Luo YP. Effects of perceived social loafing, social interdependence, and group affective tone on students’ group learning performance. Asia-Pacific Edu Res. 2015;24(1):259–69.
- 59. Strauß S, Rummel N. Promoting regulation of equal participation in online collaboration by combining a group awareness tool and adaptive prompts. But does it even matter? Intern J Comput-Support Collab Learn. 2021;16(1):67–104.
- 60. Gu X, Chen S, Zhu W, Lin L. An intervention framework designed to develop the collaborative problem-solving skills of primary school students. Education Tech Research Dev. 2015;63(1):143–59.
- 61. Pöysä-Tarhonen J, Care E, Awwal N, Häkkinen P. Pair interactions in online assessments of collaborative problem solving: case-based portraits. RPTEL. 2018;13(1):12. pmid:30595740
- 62. Soller AL. Supporting Social Interaction in an Intelligent Collaborative Learning System. Int J Artif Intell Educ. 2001;12(1):40–62.
- 63. Bonitasya DA, Widiyatmoko A, Sovansophal K. The effect of blended learning with a collaborative problem solving approach on students’ cognitive learning outcomes and collaboration skills in science learning. J Penelit Pembelajaran IPA. 2021;7(2):152–67.
- 64. Scalise K, Mustafic M, Greiff S. Dispositions for collaborative problem solving. In: Kuger S, Klieme E, Jude N, Kaplan D, editors. Assessing contexts of learning [Internet]. Cham: Springer International Publishing; 2016. p.283–99.
- 65. Song Y. Improving primary students’ collaborative problem solving competency in project-based science learning with productive failure instructional design in a seamless learning environment. Education Tech Research Dev. 2018;66(4):979–1008.
- 66. Felmer P. Collaborative problem-solving in mathematics. Current Opinion in Behavioral Sciences. 2023;52:101296.
- 67. Basu S, Kinnebrew JS, Shekhar S, Caglar F, Rafi TH, Biswas G, et al. Collaborative problem-solving using a cloud-based infrastructure to support high school STEM education. In: 2015 ASEE Annual Conference & Exposition; 2015 Jun 14. pp. 26–359.
- 68. Berland LK, Reiser BJ. Classroom communities’ adaptations of the practice of scientific argumentation. Sci Ed. 2011;95(2):191–216.
- 69. Sampson V, Blanchard MR. Science teachers and scientific argumentation: trends in views and practice. J Res Sci Teach. 2012;49(9):1122–48.
- 70. OECD. PISA 2015 science test questions [Internet]. [date unknown] [cited 2023 Sep 18]. https://www.oecd.org/pisa/pisa-2015-science-test-questions.htm