Twin and adoption studies find that non-shared environmental (NSE) factors account for variance in most behavioural traits and offer an explanation for why genetically identical individuals differ. Using data from a qualitative hypothesis-generating study we designed a quantitative measure of pupils’ non-shared experiences at the end of formal compulsory education (SENSES: Student Experiences of Non-Shared Environment Scales). In Study 1 SENSES was administered to n = 117 twin pairs aged 16–19. Exploratory Factor Analysis yielded a 49-item, 10-factor solution which explained 63% of the variance in responses. SENSES showed good internal consistency and convergent and divergent validity. In Study 2 this factor structure was confirmed with data from n = 926 twin pairs, and external validity was demonstrated via significant correlations between 9 SENSES factors and both public examination performance and life satisfaction. These studies lend preliminary support to SENSES, but further research is required to confirm its psychometric properties; to assess whether individual differences in SENSES are explained by NSE effects; and to explore whether SENSES explains variance in achievement and wellbeing.
Citation: Yerdelen S, Durksen T, Rimfeld K, Plomin R, Asbury K (2018) Developing SENSES: Student experience of non-shared environment scales. PLoS ONE 13(9): e0202543. https://doi.org/10.1371/journal.pone.0202543
Editor: Timo Gnambs, Leibniz Institute for Educational Trajectories, GERMANY
Received: May 5, 2017; Accepted: August 6, 2018; Published: September 6, 2018
Copyright: © 2018 Yerdelen et al. This is an open access article distributed under the terms of the Creative Commons Attribution License, which permits unrestricted use, distribution, and reproduction in any medium, provided the original author and source are credited.
Data Availability: Researchers are typically used to working with datasets like ELSA and the birth cohort studies, which are funded to provide data for secondary analysis, or with datasets collected for ‘one-off’ cross-sectional analysis. However, there is a distinction between these kinds of data and ongoing longitudinal studies like TEDS that are funded for primary data analysis. We have many PhD and postdoctoral researchers, as well as over 100 long-term collaborators who have contributed to the project over the years, whose work, grants and fellowships depend on the primary analysis of aspects of the study as the data are collected. Furthermore, there are data governance issues stipulated by the Ethics Review Board and Executive Committee of the TEDS study, which require sensitive handling of the data for the primary analysis purposes to which the study participants originally consented. In order to ensure that participants continue their longitudinal participation, we need to be sensitive to these governance issues. For these reasons we cannot publicly share our data. However, researchers can contact the TEDS PI, Professor Robert Plomin, to request data access on: firstname.lastname@example.org. TEDS has openly provided data for re-analyses and meta-analyses in the past, but has done so in accordance with the confidentiality stipulations set by the Ethics Review Board and the Executive Committee, which require the governance of this process to rest with the TEDS principal investigator.
Funding: This research was supported by a grant (EDU/40881 to R.P. and K.A.) from the Nuffield Foundation to the second and last authors. TEDS is supported by a programme grant to R.P. from the UK Medical Research Council (MR/M021475/1 and previously G0901245 to R.P.), with additional support from the US National Institutes of Health (AG046938 to R.P.) and the European Commission (602768; 295366 to R.P.). The funders had no role in study design, data collection and analysis, decision to publish, or preparation of the manuscript.
Competing interests: The authors have declared that no competing interests exist.
It has long been established by twin and adoption studies that non-shared environmental (NSE) effects explain variance in most behavioural and psychological traits, particularly after the preschool years e.g. [1, 2, 3]. NSE effects are those that make siblings brought up together differ from each other. They are uncorrelated with genetic effects and, for this reason, represent potentially interesting targets for intervention. However, it has proved almost as difficult to identify the specific experiences that can explain NSE variance as it has been to identify the specific genes that explain genetic variance. We have both a ‘missing heritability’ and a ‘missing environments’ problem [5,6]. The current study was motivated by a need to identify measured environments that can explain variance attributable to NSE in educationally relevant behaviour, such as achievement and wellbeing, with a view to possible intervention.
It has been difficult to identify NSE experiences because NSE effects, like genetic effects, tend to be many, small and involved in dynamic relationships with genes and other experiences. A further difficulty is that NSE effects include measurement error, and one possibility is that they represent nothing but measurement error. However, this seems unlikely as some specific NSE effects have been found to explain small proportions of variance and, indeed, to show a degree of stability over time. The largest body of research in this area has focused on parenting and has found effects that explain small proportions of variance and which, in some studies, explain more variance at the extremes. However, it remains possible that the measures of non-shared environment used in prior research do not explain much NSE variance because they do not accurately measure individuals’ experiences. It is important, therefore, to take a closer look at students’ experiences of the world in which they learn, and their perceptions of those experiences.
To that end, the current study was preceded by a qualitative hypothesis-generating MZ twin differences study designed to explore in detail the non-shared experiences of young people preparing for the public examinations (General Certificates of Secondary Education: GCSEs) taken by most UK pupils at age 16 [10, 11]. The focus was on MZ twins because behavioural or psychological differences between MZ twins cannot be explained by shared genes (because they can be assumed to have identical genotypes, albeit with a small chance of mutation) or shared environmental effects, and must therefore be explained by NSE effects, including measurement error. By asking MZ twins about differences between them in educationally relevant behaviour we were able to develop testable hypotheses about potential NSE influences at a transitional time, the point at which UK pupils make choices about further education, training and employment. The educationally-relevant traits we focused on were achievement and wellbeing (life satisfaction) as these are particularly salient variables at a time when young people are making important, potentially life-changing, decisions about their next steps in education or employment. Emerging hypotheses related to factors including perceived differences in teacher quality, teacher-pupil relationships and individual effort. Although psychology has already identified such factors as important correlates of achievement, this study was novel in suggesting that they may explain NSE variance specifically. The current study was designed to develop a quantitative measure that would make it possible to test these genetically-informed hypotheses.
We know that NSE factors, including measurement error, explain one-fifth of the variance in GCSE performance [12,13]. We therefore expected that families’ explanations of why one twin performed better than the other in their examinations could explain a maximum of 20% of variation in exam performance, and probably considerably less given that measurement error is likely to play a significant role. The study also focused on participants’ well-being and we know that there is more NSE variance to be explained here. In a recent meta-analysis, for instance, Bartels found that genetic effects explained 32% of the variance in self-reported life satisfaction and 36% of the variance in feelings of well-being. The remaining variance in both types of measure was explained by non-shared environmental factors. Identifying NSE influences on well-being therefore represents an important challenge.
The current research had two main aims: (1) to design a measure that reflected qualitative accounts of NSE experience at this transitional time; and (2) to assess the factor structure, reliability and validity of this newly developed measure. We aimed to make a useful contribution to research in this area by developing a measure that can explain a proportion of environmental variance in educationally relevant traits in late adolescence. Because NSE effects are uncorrelated with genetic effects such a measure may feasibly form a useful basis for environmental intervention in the future.
The aims of Study 1 were threefold:
- ■. To develop an item pool based on data collected in an earlier qualitative phase of the project.
- ■. To reduce the number of items needed to measure experiences that may explain NSE variance in educationally-relevant outcomes.
- ■. To extract underlying factors and assess reliability.
Participants were drawn from the UK Twins’ Early Development Study (TEDS). TEDS is an on-going longitudinal study of three cohorts of twins born in 1994, 1995 and 1996 [16]. The TEDS sample has been shown to be reasonably representative of the UK population of same-age adolescents and their parents [16, 17]. A total of 300 twin pairs were invited to take part and data were gathered from n = 115 pairs and 2 unpaired twins (n = 117) who provided informed consent: 58 dizygotic pairs (62% female) and 57 monozygotic pairs (58% female), plus one dizygotic male twin and one monozygotic female twin. Twins were provided with a detailed information sheet and questionnaire completion indicated their consent to participate. Data were subsequently received from a further 6 pairs, but too late to be incorporated into the analyses. Participants’ ages ranged from 16 to 19 (M = 18.28).
Measure development and procedure.
In an earlier phase of this project we gathered qualitative questionnaire data from n = 497 pairs of MZ twins (61% female) and interview data from n = 95 of these pairs [10, 11]. These twin pairs were all participants in TEDS and were all aged between 16 and 19 (M = 17.3). Both questionnaires and interviews asked MZ pairs, and one parent from each family, to describe and explain differences between them in a range of traits including GCSE achievement and wellbeing. This represented an attempt to generate new hypotheses about NSE influences on young people approaching the end of their formal compulsory education.
We drew on this rich dataset to build up an item bank for the current study. We prepared 175 draft items and revised them after conducting a small feasibility study with n = 6 young people aged 16–19. We then administered the items to our Study 1 sample of n = 117 twin pairs (n = 234 individuals). This initial questionnaire aimed to comprehensively represent the breadth and depth of our qualitative data and was therefore very long. We engaged in a process of extensive data reduction once the data were collected.
We organised our data into two related samples, Sample 1 and Sample 2, with one twin from each pair (randomly selected) represented in each. We identified items that could be excluded on the basis of data from both samples. More specifically, items were excluded for the following reasons:
- ■. Violations of univariate normality, i.e. skewness or kurtosis values outside the range -2 to +2.
- ■. Low correlations within the expected area (r < 0.20).
- ■. Multi-collinearity.
- ■. Not adequately representing the qualitative data e.g. only mentioned by a small number of participants.
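As an illustration, the univariate-normality criterion above can be sketched in a few lines of code. This is a minimal sketch with simulated 5-point responses; the `flag_nonnormal` helper and its data are ours, not part of the original analysis:

```python
import numpy as np
from scipy.stats import skew, kurtosis

rng = np.random.default_rng(0)
# Simulated responses: 200 respondents x 4 items on a 1-5 scale
items = rng.integers(1, 6, size=(200, 4)).astype(float)
# Make the last item degenerate: nearly everyone answers 5,
# producing extreme skewness and kurtosis
items[:, 3] = 5.0
items[0, 3] = 1.0

def flag_nonnormal(x, bound=2.0):
    """Flag an item whose skewness or excess kurtosis lies outside +/- bound."""
    return bool(abs(skew(x)) > bound or abs(kurtosis(x)) > bound)

# The degenerate item is flagged for exclusion; well-behaved items are kept
flags = [flag_nonnormal(items[:, j]) for j in range(items.shape[1])]
```

The same per-item loop extends naturally to the low-correlation criterion by checking each item's correlation with the total score of its intended scale.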
This initial data reduction process allowed us to discard 93 items, leaving 82. We then conducted Principal Component Analysis (PCA) with Sample 1 for the sole purpose of further data reduction, that is, not for factor extraction. PCA suggested the exclusion of 33 further items on grounds of cross-loadings, that is, items having a loading of 0.4 or higher on more than one component and also having a lower than 0.2 loading difference between the primary and alternative factors; or the clustering of fewer than three items, i.e. too few to constitute a viable factor. This data reduction process left us with 49 items with which to measure NSE influences on young people preparing to leave school (the PCA process suggested using 48 items and the reasons for retaining 49 are discussed later). All items used a 5-point Likert response scale ranging from 1 = ‘not at all true’ to 5 = ‘very true’. These 49 items make up the SENSES measure.
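The cross-loading rule described above can be expressed as a simple filter over a loading matrix. The sketch below is purely illustrative; the loading values and the `flag_cross_loading` helper are hypothetical, not the study's actual loadings:

```python
import numpy as np

def flag_cross_loading(loadings, high=0.4, min_gap=0.2):
    """Flag items that load >= `high` on more than one component and whose
    top two absolute loadings differ by less than `min_gap`."""
    flags = []
    for row in np.abs(np.asarray(loadings, dtype=float)):
        top_two = np.sort(row)[::-1][:2]
        flags.append(bool((row >= high).sum() > 1
                          and top_two[0] - top_two[1] < min_gap))
    return flags

# Hypothetical loading matrix: 3 items x 2 components
L = [[0.72, 0.10],   # clean primary loading: keep
     [0.45, 0.41],   # loads >= .4 on both components, gap < .2: exclude
     [0.65, 0.30]]   # secondary loading below .4: keep
flags = flag_cross_loading(L)
```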
Sample 2 data were used for the purposes of Exploratory Factor Analysis (EFA). Principal Axis Factoring extraction and promax rotation (kappa set at 4) were used to extract factors. Principal Axis Factoring is the most widely used method of factor analysis in the social sciences, and oblique rotation methods were suggested because some factors were expected to be correlated.
We began by assessing the suitability of our data for factor analysis. The Kaiser-Meyer-Olkin (KMO) measure of sampling adequacy was .72 and therefore higher than the suggested cut-off point of .60. Furthermore, Bartlett’s test of sphericity, χ2 (1176) = 4268.80, p < .05, showed that the correlation matrix was not an identity matrix and was appropriate for factor analysis. We therefore proceeded to EFA.
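Bartlett's test of sphericity can be computed directly from the determinant of the correlation matrix, as χ² = -(n - 1 - (2p + 5)/6)·ln|R| with p(p - 1)/2 degrees of freedom. A minimal sketch with simulated data (the variables and sample size here are hypothetical):

```python
import numpy as np
from scipy.stats import chi2

def bartlett_sphericity(data):
    """Bartlett's test that the correlation matrix is an identity matrix.

    Returns (chi_square, degrees_of_freedom, p_value). A significant result
    indicates the data are suitable for factor analysis.
    """
    n, p = data.shape
    R = np.corrcoef(data, rowvar=False)
    stat = -(n - 1 - (2 * p + 5) / 6) * np.log(np.linalg.det(R))
    df = p * (p - 1) / 2
    return stat, df, chi2.sf(stat, df)

# Simulated example: two correlated variables plus one independent variable
rng = np.random.default_rng(1)
x = rng.normal(size=(300, 1))
data = np.hstack([x, x + 0.5 * rng.normal(size=(300, 1)),
                  rng.normal(size=(300, 1))])
stat, df, p_value = bartlett_sphericity(data)  # p_value well below .05 here
```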
In order to identify the underlying factor structure we considered both Kaiser’s eigenvalues >1 and scree plots of the data. Both criteria yielded 9 factors. However, an eigenvalue of 0.935 for a tenth factor led to the decision to explore both 9 and 10 factor solutions. The research team ultimately decided that the 10 factor solution was conceptually more reasonable and therefore took the 10 factor solution forward to the main study. The 9 factor solution joined perceptions of self in Maths and Science in a single factor, whereas the 10 factor solution distinguished between perceptions relating to the three core GCSE subjects. The 10 factor solution explained 63% of the variance in participants’ scores and can be seen in Table 1.
It is interesting to note that while some of these factors relate to experiences that can relatively easily be classified as ‘environmental’, such as teachers, social media and the influence of family and work experience; others describe aspects of behaviour such as effort and confidence that would not typically be considered as environmental influences. However, it is important to note that behavioural discordance in MZ twins must have non-shared environmental origins, that is, if one MZ twin is more confident or hard-working than the other this has to be for environmental reasons, and that such discordant behaviour could have NSE effects. For this reason, we retained both types of factor and both types of item. In some cases it is difficult to draw a clear line between experience and behaviour. For instance, does an individual’s perception or interpretation of an event, such as an interaction with a teacher, represent environment or behaviour? Our focus here was on individual experiences (including perceptions) that can explain NSE variance.
Table 1 also shows a sample item for each factor, the number of items representing each factor, the proportion of variance explained and Cronbach’s alpha. It can be seen that all factors showed good internal consistency with Cronbach’s alphas ranging from .76 to .92. The English (perceptions of self and teacher) factor explained the largest amount of variance (19.15%). The combined variance explained by the split Science and Maths factors was a little lower (11.85% for Science and 11% for Maths). Items relating to Maths and Science did not combine in a single factor as they did for English. Effort during GCSE courses accounted for a similar amount of variance (9.56%). Factors relating to social media and to future plans explained smaller proportions of variance (<4%).
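Cronbach's alpha, reported in Table 1 for each factor, compares the sum of the item variances with the variance of the total score. A minimal sketch follows; the toy data are ours, chosen so that alpha is exactly 1:

```python
import numpy as np

def cronbach_alpha(items):
    """Cronbach's alpha: k/(k-1) * (1 - sum(item variances) / variance(total))."""
    items = np.asarray(items, dtype=float)
    k = items.shape[1]
    item_vars = items.var(axis=0, ddof=1)
    total_var = items.sum(axis=1).var(ddof=1)
    return k / (k - 1) * (1 - item_vars.sum() / total_var)

# Sanity check: four items that are exact copies of each other
# are perfectly internally consistent, so alpha equals 1
perfect = np.tile(np.array([[1.0], [2.0], [3.0], [4.0], [5.0]]), (1, 4))
alpha = cronbach_alpha(perfect)
```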
The 49th item (Item 5.5), ‘I was interested in what we were studying in maths’, was found to have close loadings (differences between primary and alternative factor loadings were less than .20) on 3 different factors (.44 on maths self-perception but also .34 on perceptions of maths teacher and .31 on science self-perception). However, it was noted that the equivalent items for English and Science loaded only on the appropriate factors, i.e. perceptions of self in English and Science. In order to maintain consistency of items across the three subject domains it was decided to take the 49th item forward to Study 2 for further testing. It has been suggested that the consequences of over-factoring are usually less marked than the consequences of under-factoring, and this approach allowed us to test the items and the factor structure with a larger sample, yielding more trustworthy results. Furthermore, it is generally agreed that substantive considerations should be taken into account alongside statistical considerations in EFA.
The 10 factors yielded by our multi-stage process were conceptually reasonable, representative of the qualitative data gathered in an earlier phase of this project and they showed good internal reliability. The scales showed both convergent item validity (high factor loadings on a relevant scale) and, with one exception, divergent item validity (low factor loadings on other scales).
The major limitation of this pilot study was that the ratio of participants to items was low (117 participants, i.e. one twin per pair in each sample, for an 82-item PCA). However, subsequent analysis found that our data were suitable for factor analysis. A further issue to consider in Study 2, where this problem was eliminated, is that the decision to proceed with a 10 factor rather than a 9 factor model was made on conceptual rather than statistical grounds. We need to ask whether the 49-item, 10 factor structure is confirmed. These issues are revisited in Study 2 and in the General Discussion.
The aims of Study 2 were twofold:
- ■. Conduct Confirmatory Factor Analysis (CFA) to evaluate the construct validity of SENSES.
- ■. Explore external validity via correlations with GCSE performance and self-reported life satisfaction.
Participants for Study 2 were also drawn from the Twins’ Early Development Study (TEDS) [16, 17]. We invited twins in 2165 families to participate and received SENSES and life-satisfaction data from n = 926 families (53% MZ). Twins were provided with a detailed information sheet and questionnaire completion implied consent. This approach was approved by our institutional ethics committee. In 908 cases we received data from both twins in the pair and in 18 cases from only one. Data were gathered, therefore, from n = 1834 individuals (Mean age = 18.4; Range = 17 to 19; 61.6% female). Of these, n = 1672 participants had previously provided us with academic achievement data and, in all but 3 cases, this included General Certificate of Secondary Education (GCSE) data; the remaining three had taken alternative examinations to GCSEs. The sample was not fully representative of the UK population, or of the original TEDS sample. The relatively high proportion of girls (up from close to 50% at first contact) is broadly representative of TEDS data at age 16, but not of the UK population; this discrepancy may reflect a greater willingness to engage with data collection among girls than boys at this age. Furthermore, standardized SES was higher in this sample than in the full TEDS sample (M = 0.31) and, more surprisingly, standardized g scores (measured at age 12) were slightly lower (M = -0.12). These discrepancies may be due to sample selection effects.
The 49 item SENSES measure developed in Study 1 was administered to Study 2 participants (n = 926 twin pairs). Data had previously been gathered on GCSE results and were also gathered on self-perceived life satisfaction using a well-validated five item measure of global life satisfaction . Items included ‘In most ways my life is close to my ideal’ and ‘If I could live my life over, I would change almost nothing’ and used a 5 point response scale from 1 = Strongly disagree through to 5 = Strongly agree. In the current study α = 0.86.
Questionnaires were posted to twin pairs along with an information sheet and separate envelopes for individual questionnaires so that twins could retain their privacy. Because the twins had already completed their GCSEs they were specifically asked to think back to Year 10 and 11 (when they were taking GCSE courses) when responding to items. Twins who returned completed questionnaires received a £5 gift voucher each and were entered into a prize draw with a chance of winning a pair of iPad minis.
Confirmatory Factor Analysis (CFA) was conducted using the maximum likelihood estimation method in LISREL 8.80 with SIMPLIS command language. One twin per pair was randomly selected for CFA and invariance analysis was conducted with this sample and the co-twin sample. Correlations between SENSES and our measures of achievement and life satisfaction were computed. All analyses were replicated with the co-twin sample.
CFA found an acceptable model fit to the data (χ2(1082) = 4992.29, p < .05; CFI = .93; NFI = .92; SRMR = .053; RMSEA = .071; 90% CI = .070, .073) [29,30]. All items loaded significantly on their intended factors and standardized parameter estimates (Lambda X) ranged from .49 to .92 (See Table 3). We note that the 49th item (i.e. ‘I was interested in what we were studying in Maths’), which we retained in the interests of consistency across domains, had a sufficiently high factor loading of .77. This analysis with a larger sample therefore supported our decision to retain the item.
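For readers less familiar with the fit indices above, the RMSEA point estimate can be derived from the model chi-square, its degrees of freedom and the sample size. One common formulation is sketched below; the sample size used is illustrative, and software packages differ in conventions, so this sketch need not reproduce LISREL's published value exactly:

```python
import math

def rmsea(chi_square, df, n):
    """Point estimate of RMSEA: sqrt(max(chi2 - df, 0) / (df * (n - 1)))."""
    return math.sqrt(max(chi_square - df, 0.0) / (df * (n - 1)))

# Illustrative call using the chi-square and df reported above with a
# hypothetical sample size of n = 926 (one twin per pair)
value = rmsea(chi_square=4992.29, df=1082, n=926)

# A model whose chi-square does not exceed its df has RMSEA = 0
perfect_fit = rmsea(chi_square=100.0, df=120, n=500)
```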
We looked at correlations between our 10 factors and found an average correlation of r = 0.16 (range = .01 to .69). This suggested that most factors showed low or no correlation and can, therefore, be reasonably considered to be measuring different things (See Table 4).
Three correlations were exceptions to this pattern. The correlation between SCIENCE 1 (Perceptions of Self) and SCIENCE 2 (Perceptions of Teacher) was r = .69, p < .05. A similar pattern was also observed for Maths in that MATHS 1 and MATHS 2 correlated r = .64, p < .01. Finally, self-perceptions in science (SCIENCE 1) also correlated r = 0.46, p < .05 with MATHS 2 (perceptions of Maths teacher). These correlations and their implications for the SENSES measure are discussed later.
Table 5 shows means, standard deviations and reliability coefficients for the 10 factors.
As in Study 1 all 10 factors showed good levels of internal reliability with Cronbach’s alphas ranging from .72 to .94. Mean scores (based on 5 point response formats with 1 = not at all true and 5 = very true) were highest for English (M = 3.53), Maths (MATHS 1 M = 3.73; MATHS 2 M = 3.39), Science (SCIENCE 1 M = 3.59; SCIENCE 2 M = 3.65), Self-confidence about future plans (PLANS 2 M = 3.66) and Social Media (M = 3.38). They were lowest for the influence of family (PLANS 1 M = 1.79) and work experience (PLANS 3 M = 1.99) and were mid-range for effort (M = 2.53). Standard deviations ranged from a low of .78 for PLANS 1 (family influence) to 1.11 for MATHS 2 (perceptions of Maths teachers).
We also looked at factorial invariance across the two samples (one twin per pair in each sample group) in order to cross-validate the 10 factor model. This was achieved by conducting five multi-group CFA models. Firstly, a baseline model was tested in order to examine whether both samples conceptualised the constructs in a similar manner. In the remaining models factor loadings, factor variance, factor covariance and variance of error terms were constrained to be equal across the two samples and invariance between the models was compared (See Table 6).
Comparisons of each model found that chi-squared differences were non-significant. Furthermore, ΔCFI between constrained and unconstrained models was less than .01, as suggested by Cheung & Rensvold. In summary, invariance testing supported the factorial invariance of SENSES across our two related samples.
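The two invariance criteria used above (a non-significant chi-square difference between nested models, and ΔCFI below .01) can be sketched as follows; all fit statistics in this example are hypothetical, not those in Table 6:

```python
from scipy.stats import chi2

def nested_model_test(chi_sq_constrained, df_constrained, chi_sq_free, df_free):
    """Chi-square difference (likelihood-ratio) test for nested CFA models."""
    d_chi = chi_sq_constrained - chi_sq_free
    d_df = df_constrained - df_free
    return d_chi, d_df, chi2.sf(d_chi, d_df)

# Hypothetical fit statistics: constraining loadings equal across groups
# costs 48 degrees of freedom and raises chi-square by 50
d_chi, d_df, p = nested_model_test(2210.0, 1130, 2160.0, 1082)
invariant = p > 0.05  # a non-significant difference supports invariance

# Cheung & Rensvold's complementary criterion: CFI should drop by < .01
delta_cfi_ok = abs(0.930 - 0.928) < 0.01
```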
In order to assess the external validity of the SENSES measure we looked at correlations between the 10 factors and both GCSE achievement in English, Maths and Science and self-reported life satisfaction (See Table 7).
In general, the domain specific factors (Perceptions of Self and Teacher in English, Maths and Science) were significant correlates of GCSE achievement in English, Maths and Science respectively. The ENGLISH factor correlated r = .39, p < .001 with GCSE English but only r = .11, p < .01 with Maths and r = .10, p < .01 with Science. Likewise, SCIENCE factors correlated more strongly with Science achievement than with English or Maths achievement, and MATHS factors correlated more strongly with Maths achievement than with English or Science. Perceptions of self in Science correlated r = .49, p < .001 with Science GCSE, r = .32, p < .001 with Maths and r = .21, p < .001 with English. Perceptions of Science teachers correlated r = .25, p < .001 with Science GCSE, r = .16, p < .001 with Maths and r = .11, p < .001 with English. Perceptions of self in Maths correlated r = .31, p < .001 with Maths GCSE, r = .19, p < .001 with Science and r = .09, p < .01 with English. Perceptions of Maths teachers correlated r = .53, p < .001 with Maths GCSE, r = .38, p < .001 with Science and r = .14, p < .001 with English. Effort correlated at a similar level with all three domains (average r = .39, p < .001).
However, the remaining four factors yielded few significant correlations with GCSE achievement, and those that did achieve statistical significance ranged from r = -.07, p < .05 for the correlation between self-confidence about the future (PLANS 2) and Maths achievement to r = -.11, p < .01 for the correlation between Social Media and Science achievement.
Correlations between the SENSES factors and our measure of life satisfaction were, with one exception, statistically significant but mainly weak, ranging from r = .05 (NS) for PLANS 1 (family influence) and r = .08, p < .05 for PLANS 3 (work experience) through to a moderate correlation of r = .48, p < .001 for PLANS 2 (self-confidence about the future). The average correlation was r = .13.
Study 2 did not have the principal limitations of Study 1 in that we administered a questionnaire with fewer items (49 compared with 82) to a larger sample (n = 926 twin pairs compared with n = 117 twin pairs). It was therefore pleasing to note that CFA confirmed the factor structure that emerged from Study 1, and justified our decisions to use a 10 factor structure (rather than a 9 factor structure) and to retain 49 rather than 48 items. It was also seen that all 10 of the SENSES factors retained good levels of internal reliability. Furthermore, invariance testing cross-validated the model across our two related samples.
One area of concern was that although most factors correlated at or below r ≈ .3, there were three exceptions. The two Maths factors correlated r = .64; the two Science factors correlated r = .69; and self-perceptions in Science correlated r = .46 with perceptions of teacher in Maths. We also know that the same variables in relation to English did not split into two separate factors, so the pattern is inconsistent across the three GCSE subjects. This raises the possibility either that English should be split into two factors or that the currently separate but correlated Maths factors should be joined (as should those for Science). It should remain a consideration that in the 9 factor structure suggested by EFA self-perceptions in Maths and Science were found to cluster on a single factor. This issue can only be satisfactorily resolved through further testing in different samples, and we cannot reasonably claim that the SENSES instrument is robust until we have tested it in more populations. However, in the meantime, it can be considered positive that ENGLISH factors correlated most strongly with English achievement, MATHS factors with Maths achievement and SCIENCE factors with Science achievement, suggesting external validity for the existing sub-scales.
The psychometric properties of the SENSES measure appear promising and indicate that it, or sub-scales from it, can make a useful contribution to research. However, some issues remain to be resolved in future research with different samples. Only by conducting further validation research will we gain confidence that we have identified the optimal factor structure. As the measure stands, domain specific factors show moderate correlations with domain specific GCSE achievement; and our measure of self-confidence about the future (PLANS 2) shows a moderate association with self-reported life satisfaction in late adolescence.
Four of the SENSES factors did not correlate with either GCSE achievement or life satisfaction. However, items were developed on the basis of discordance in a wider range of educationally relevant traits than this, and it is possible that these four factors could correlate with, for example, measures of occupational success, vocational interests or peer relationships. When undertaking further validation work with the SENSES measure it will be important to explore relationships with other variables.
It is important to note that some of the SENSES factors target traits such as effort or beliefs such as self-confidence about the future rather than environments per se. The suggestion from the qualitative data is that these factors differ between monozygotic twins and lead to non-shared outcomes. However, this discordance remains to be explained by differences in experience.
We have mentioned the limitation that while we gathered data on a wide range of traits in order to develop the SENSES measure we were only able to test it in relation to GCSE achievement and self-reported global life satisfaction. It would be interesting to explore relationships between SENSES factors and other variables, not measured in the current study, such as personality, peer relationships and mental health status. This can be achieved as we continue to test the validity of the SENSES measure as a whole, and individual sub-scales from it. Our study is also limited by a cross-sectional design that cannot speak to direction of effects or identify reverse causation. Furthermore, we have not yet established whether SENSES can do what it aims to do, that is, to explain NSE variance in outcomes including GCSE achievement and life satisfaction.
A particularly important limitation of this research is that it relied on retrospective data. Specifically, participants already knew their GCSE results when they provided data about their learning experiences during the GCSE course. This may have coloured their view of the GCSE experience in either positive or negative ways. Testing the SENSES measure with a sample of 14–16 year old UK pupils could address this concern.
The top priority for future research has to be reliability and validity testing of the SENSES measure in different populations. This will help us to address remaining concerns about whether we have identified the optimal factor structure. Beyond that, longitudinal work is needed if we are to begin to be able to understand the direction of any effects and to test for reverse causation i.e. the possibility that discordant GCSE results or life satisfaction are the precursor to discordant experiences, rather than the other way around. Finally, it will be important to assess whether SENSES factors can actually explain NSE variance in educationally relevant variables, and whether associations are mediated by genetic, shared or non-shared environmental effects (using multivariate twin analyses) and this research is already underway using the current sample.
S1 File. SENSES questionnaire.
S1 Table. Study 1 Covariance Matrix.
We thank TEDS families for their generous participation, and Andy McMillan and Rachel Ogden for their help and support in collecting and managing the data for this research.