
Constructing a digital twin maturity assessment framework for the building construction phase based on an improved matter-element model: A case study of a construction project in Xinyang, China

  • Qi Yang,

    Roles Data curation, Formal analysis, Methodology, Software, Validation, Visualization, Writing – original draft, Writing – review & editing

    Affiliations State Key Laboratory for Tunnel Engineering, Beijing, China, China University of Mining and Technology, Beijing, China

  • Zongjun Xia,

    Roles Conceptualization, Project administration, Resources, Software, Visualization

    Affiliation China RAILWAY 15 BUREAU Group Corporation, Shanghai, China

  • Xiaodan Li ,

    Roles Data curation, Formal analysis, Funding acquisition, Investigation, Project administration, Resources, Supervision, Writing – review & editing

    108840@cumtb.edu.cn

    Affiliation State Key Laboratory for Tunnel Engineering, Beijing, China

  • Zhen Liu,

    Roles Conceptualization, Formal analysis, Methodology, Validation, Writing – original draft, Writing – review & editing

    Affiliations State Key Laboratory for Tunnel Engineering, Beijing, China, China University of Mining and Technology, Beijing, China

  • Zhifei Chen,

    Roles Data curation, Validation, Visualization

    Affiliation China RAILWAY 15 BUREAU Group Corporation, Shanghai, China

  • Jing Li,

    Roles Data curation, Supervision, Validation, Visualization

    Affiliations State Key Laboratory for Tunnel Engineering, Beijing, China, China University of Mining and Technology, Beijing, China

  • Yangyang Wang,

    Roles Data curation, Supervision, Validation, Visualization

    Affiliations State Key Laboratory for Tunnel Engineering, Beijing, China, China University of Mining and Technology, Beijing, China

  • Jinfeng Wang

    Roles Data curation, Resources

    Affiliation China RAILWAY 15 BUREAU Group Corporation, Shanghai, China

Abstract

Digital twin technology has the potential to enhance construction efficiency, reduce costs, and minimize errors. However, its application during the construction phase remains at an early stage, largely constrained by the absence of standardized guidelines and principles. To address this challenge, it is essential to establish a comprehensive and universal maturity assessment framework to facilitate the effective implementation of this technology in the construction phase of building projects. This study focuses on two critical aspects: the development of the maturity assessment framework and its empirical validation. The proposed framework encompasses a maturity assessment indicator system covering five dimensions: acquisition layer, data layer, modeling layer, analysis layer, and application layer. For the first time, an optimized matter-element model based on dynamic thresholds and nonlinear correlation is introduced to improve the accuracy of maturity assessments. Furthermore, a feedback mechanism based on Importance-Performance Analysis (IPA) is utilized to clarify the formulation of optimization strategies. Finally, the framework is applied to the CAZ Innovation Industrial Park construction phase in Xinyang, Henan Province. The assessment results demonstrate that the system precisely measures the project’s maturity level and provides effective improvement recommendations. This study not only offers technological support for assessing and optimizing the digital twin maturity during the construction phase of building projects but also provides methodological insights into global digital twin maturity assessments.

1. Introduction

Digital Twin (DT) technology refers to the virtual representation of physical assets, processes, or systems. Through real-time monitoring and data analytics, DT enables performance prediction and optimization [1]. This innovative approach has been shown to significantly enhance energy efficiency [2], thereby contributing to the sustainable development of the construction industry. Since the early 21st century, the global construction sector has progressively adopted a range of international standards and methodologies in its transition toward digitalization and intelligent systems, laying a foundational framework for the widespread implementation of DT technologies. The institutional adoption of DT in construction has evolved through a coordinated progression from methodology to principles, standards, and policy. The Capability Maturity Model Integration (CMMI), introduced in 2002 and updated in 2018, provides a cross-sectoral paradigm for process improvement and maturity assessment [1]. In 2018, the United Kingdom introduced the Gemini Principles, which established value orientation and data governance guidelines for DT applications [2]. That same year, ISO 19650−1/-2 was published, setting international standards for lifecycle information management within Building Information Modeling (BIM) environments [3]. Building upon these foundations, policy diffusion has accelerated since 2020. The European Union has promoted the comprehensive adoption of BIM among member states to enhance efficiency and sustainability (EU BIM Policy). In 2021, the United Kingdom released its Digital Twin Strategy (UK DT Strategy), while China’s Ministry of Industry and Information Technology advocated for the development of DT platforms in its 14th Five-Year Plan for Intelligent Manufacturing (MIIT Policy). In 2022, the American National Standards Institute (ANSI) introduced DT standards encompassing data exchange, model management, and cybersecurity (ANSI DT Standards). Collectively, these frameworks are mutually reinforcing: CMMI provides organizational capability and evaluation pathways; the Gemini Principles ensure alignment with public value and data governance; and ISO 19650 guarantees cross-phase coordination and information consistency. Together, they facilitate national policy implementation and support industry-wide scalability. As global urbanization accelerates, the application of DT technology in the construction sector has emerged as an inevitable trend. By offering more precise and comprehensive insights into building performance, DT has the potential to fundamentally transform the industry and contribute to the development of intelligent, resilient, and sustainable cities [3].

Digital Twin (DT) technology refers to the virtual representation of physical assets, processes, or systems. By leveraging real-time monitoring and data analysis, it enables predictions and optimization of performance [4]. This innovative technology significantly enhances energy efficiency [5], thereby fostering sustainable development in the construction industry. In recent years, the global construction sector has increasingly embraced the applications of Digital Twin technology. For instance: In 2020, the European Union released the EU Building Information Modeling (BIM) Directive (https://single-market-economy.ec.europa.eu/sectors/construction-and-real-estate/bim_en), mandating all member states to adopt BIM standards by 2020 to improve project efficiency and sustainability. In 2021, the UK government published the UK Digital Twin Building Strategy (https://www.gov.uk/government/publications/digital-twin-strategy), which highlights the critical role of Digital Twin technology in enhancing the efficiency and sustainability of the construction sector and outlines various measures to promote its adoption. In the same year, China’s Ministry of Industry and Information Technology released the “14th Five-Year Plan for Intelligent Manufacturing Development” (https://www.miit.gov.cn/), advocating for the establishment of Digital Twin platforms, particularly in smart city applications. In 2022, the American National Standards Institute introduced the ANSI Digital Twin Building Standards (https://www.ansi.org/), which address aspects such as data exchange, model management, and security, aiming to promote widespread application of the technology in the U.S. construction industry. As urbanization continues to accelerate globally, the application of Digital Twin technology in the construction domain has become increasingly indispensable. This technology provides more accurate and comprehensive information about building performance, potentially revolutionizing the construction industry and contributing to the development of intelligent, resilient, and sustainable cities [6].

Digital Twin (DT) technology was initially applied in the aerospace field [7,8]. With the development of new technologies such as the Industrial Internet, Internet of Things (IoT), cloud computing, big data, and artificial intelligence, digital twin applications have expanded into a wide range of industries [9]. In the construction industry, research on DT primarily focuses on applications during the project design, construction, and operation and maintenance phases [10–12]. Among these, the construction phase constitutes a critical part of the building lifecycle [13]. The application of digital twin technology during this phase helps improve construction efficiency and quality while reducing costs and minimizing errors [5]. At present, DT technology is mainly used during the construction phase to assist in various management activities, such as construction safety management [14–17], uncertain site scheduling [18], fault detection, and quality management [19]. Related research topics can be broadly categorized into three areas: (1) Exploration of DT Implementation Frameworks: For instance, Boje C et al. proposed a conceptual framework for constructing digital twins from the perspectives of service, value chains, and methods [20]. Similarly, Pan Y developed a digital twin framework for structural health monitoring based on cloud computing and deep learning [21]; (2) Analysis of Specific DT Applications in Construction Processes: For example, Kaewunruen S and Xu N explored the specific application of BIM technology in railway station buildings from a digital twin perspective, finding that hollow wall insulation combined with fire barrier installation achieved the optimal carbon emission rate among all tested schemes [22]. Gerhard D et al. proposed a fault-tolerant and rapid module production method, using prefabricated, freely extensible high-performance concrete components conceptualized within a digital twin framework [23]. Additionally, Angjeliu G developed a simulation model for DT applications in historic masonry buildings to analyze structural behavior during different construction phases [24]; (3) Optimization of DT Technology in Construction: For instance, Chen S et al. proposed a novel method based on multi-view consistency constraints and deep learning to refine the missing regions in depth maps generated by MVS algorithms, providing improved technical support for DT systems [25]. Ritto T G et al. introduced a digital twin conceptual framework for dynamic structural damage issues, combining physical models with machine learning to enhance the accuracy of digital twins for damaged structures [26].

Digital Twin Maturity is utilized to evaluate the developmental level of digital twin systems, playing a crucial role in enabling enterprises to assess their digital twin application level and formulate future improvement strategies [27]. Scholars from various disciplines have conducted research on digital twin maturity assessment. The evaluation model frameworks primarily include: (1) the “Digital Model-Digital Shadow-Digital Twin” framework based on information flow [28]; (2) the “Digital Twin Capability-Phase Objectives-Technical Requirements” model based on development levels [29]; (3) the “Purpose-Trust-Function” model based on asset management [30]; (4) the “Context-Data-Computing Capacity-Model-Integration-Control-Human-Machine Interface” framework based on lifecycle [31]; and (5) the “Value-Function-Reliability” model framework based on equipment attributes [32]. The evaluation methods primarily include: (1) the Analytic Hierarchy Process (AHP) [33]; (2) the Decision-Making Trial and Evaluation Laboratory (DEMATEL) method [34]; (3) the weighted arithmetic mean method [31]; (4) the probability distribution function method [35]; and (5) the D-ANP method [36].

It is worth noting that significant progress has been made in evaluating the maturity of digital twins (DT) in the construction sector [34,37]. Specific advancements include:

  (1) In terms of evaluation frameworks: 1) Deng et al. [38] developed a five-level hierarchical classification system based on the building lifecycle to reflect the latest status of digital twin applications in the built environment; 2) Chen Z. S. et al. [39] proposed a maturity model comprising seven dimensions—instrument assets, models, data, interactions, functional services, systems, and organizations—to assess building digital twin projects. They emphasized that evaluation indicators should capture the temporal characteristics of the building lifecycle; 3) Afzal M. et al. [40] discussed the maturity levels of DT technologies in the construction sector based on the key components of the latest DT technologies. They proposed four distinct DT maturity stages: pre-DT, DT, adaptive DT, and intelligent DT. They further emphasized that learning capability is a key factor in enhancing DT maturity levels.
  (2) In terms of evaluation methods: 1) Chen Z. S. et al. [39] constructed an evaluation system to assess the maturity of construction DT projects by using a multi-objective optimization model based on fairness awareness, combined with a collective opinion generation paradigm. This approach excels in consolidating information for construction DT maturity assessments involving multiple stakeholders but requires high data accuracy and adherence to specific probability distributions; 2) Chen et al. [35] developed a maturity assessment framework for Building Information Modeling (BIM) projects in the architectural design and construction phases. They employed the Large-Scale Group Decision-Making (LSGDM) method to determine maturity levels. However, this approach still retains a degree of subjectivity; 3) Alnaser A. et al. [41] employed Structural Equation Modeling (SEM) to evaluate the maturity of BIM-DT adoption in the sustainable construction domain. This innovative approach aids in understanding the complex relationships among key factors but requires a large sample size and high-quality data, and its computational process is complex.

These studies confirm the feasibility of using maturity models to assess digital twin capabilities in the construction domain. However, the development of DT maturity models remains in its infancy. Particularly during the construction phase, literature related to the evaluation of digital twin maturity is still limited. The few existing studies include: (1) Li T. et al. [42] proposed five maturity levels specific to the characteristics of underground infrastructure. They constructed an evaluation indicator system comprising 5 dimensions and 13 rating criteria, and employed the D-ANP method to assess maturity levels; (2) Wei Y. et al. [43] developed an off-site construction digital twin model, encompassing data structures, physical-to-virtual transmission, modeling, and decision-making, to evaluate digital twin maturity levels. Overall, existing research primarily focuses on the theoretical framework for assessing the maturity of digital twins (DT). However, there is still no consensus on the precise definition of maturity stages. Additionally, compared to theoretical studies, empirical research based on case studies remains relatively underdeveloped, limiting a deeper understanding of DT application maturity.

In summary, although various frameworks and perspectives for evaluating digital twin maturity have been proposed in the construction field, significant challenges remain:

  (1) Lack of Standardization: The construction phase of building projects lacks standardized definitions, application frameworks, and maturity models for digital twins [44].
  (2) Limitations of Traditional Evaluation Methods: Traditional evaluation methods are predominantly qualitative and characterized by a high degree of subjectivity. Even quantitative approaches suffer from limited intuitiveness and an inadequate capacity to handle uncertainty and complex decision-making. Furthermore, the absence of a scientifically sound and well-structured feedback mechanism hampers the formulation of objective and precise improvement recommendations.
  (3) Insufficient Data and Case Studies: Limited data related to the construction phase and a lack of case studies impede practical research on digital twin maturity assessment.

To address these gaps, this study aims to systematically explore the evaluation of digital twin maturity during the construction phase, thereby promoting the application of digital twin technology in construction activities. For this purpose, we developed a standardized evaluation framework specifically designed for assessing digital twin maturity during the construction phase (Fig 1). Subsequently, this framework was applied to real construction projects, enriching the domain of practice-oriented case studies.

Fig 1. A standardized framework for assessing the maturity of digital twins during the construction phase.

https://doi.org/10.1371/journal.pone.0332449.g001

2. Research methods

The evaluation process for digital twin maturity encompasses three integral components: the establishment of a comprehensive evaluation indicator system, the determination of robust evaluation methodologies alongside clearly defined grade standards, and the development of systematic feedback mechanisms.

2.1 Evaluation indicator system

Building upon existing research definitions [42,43], this study posits that, in the construction phase, the maturity of digital twins refers to the capability level achieved through real-time dynamic mapping between virtual models and physical construction sites. This maturity is characterized by data-driven resource optimization, risk prediction, and collaborative management. The central objective is to transition from reactive responses to proactive interventions, ultimately establishing a closed-loop optimized intelligent construction system.

Additionally, considering existing interpretations of digital twin maturity within the construction industry [45], this study identifies several critical factors for evaluating the maturity of digital twins in the construction phase. These factors include data collection and integration capabilities, model development and simulation capabilities, real-time interaction and closed-loop control, collaborative management and decision support, intelligent prediction capabilities, as well as economic benefits and sustainability (Fig 2).

Fig 2. Key factors influencing the maturity of digital twins in the construction phase.

https://doi.org/10.1371/journal.pone.0332449.g002

2.1.1 Evaluation Indicators.

Given the unique characteristics of digital twin technology during the construction phase, this study builds upon the traditional three-dimensional digital twin model (physical, virtual, and connected) [45] and proposes five refined dimensions for maturity assessment: the acquisition layer, data layer, modeling layer, analysis layer, and application layer (refer to Table 1).

Table 1. Specific meanings of digital twin maturity assessment dimensions.

https://doi.org/10.1371/journal.pone.0332449.t001

2.1.2 Evaluation sub-dimensions.

Using keywords such as “Digital Twin Technology,” “Digital Twin Maturity Assessment,” and “Digital Twin Construction,” a literature search was conducted in the Web of Science and Scopus databases. A total of 102 academic papers and 12 doctoral dissertations were selected for detailed review and synthesis. Additionally, using “Digital Twin” as the search term, 70 relevant items and documents were retrieved from the official website of the Ministry of Housing and Urban-Rural Development, with the search timeframe set to August 1, 2024. Following an analysis of these research outcomes and policy documents, 35 maturity evaluation indicators across five dimensions were identified. However, redundant information within some indicators posed potential challenges to the accuracy of evaluation results, necessitating further refinement. The Delphi method was employed to filter these indicators [46]. Specifically, 20 experts—including six academic scholars, seven construction management professionals, and seven technical specialists from Digital Twin platform providers—were invited to assess the rationality of the initially proposed 35 indicators on a scale from 1 to 10. These experts, confirmed prior to the survey to possess substantial knowledge of Digital Twin research, responded positively to the evaluation. The selection of indicators was based on the level of consensus among the experts, as measured by Kendall’s coefficient of concordance (W). Indicators were retained if the significance test of W yielded P < 0.05 [46]. Statistical analysis was conducted using IBM SPSS software. Indicators with p-values greater than or equal to 0.05 were excluded, resulting in a final set of 30 evaluation indicators. Details of the excluded indicators are presented in Table 2.
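To illustrate the consensus test described above, the following sketch computes Kendall's coefficient of concordance W and its chi-square significance from an expert-rating matrix. It is a minimal stand-in for the SPSS analysis reported here: the 20 × 35 rating matrix is randomly generated, ties are ranked without the tie correction that SPSS applies, and all variable names are placeholders.

```python
import numpy as np
from scipy.stats import chi2

def kendalls_w(ratings):
    """Kendall's coefficient of concordance W for an (experts x indicators) rating matrix."""
    m, n = ratings.shape                                   # m experts, n indicators
    # Rank each expert's ratings across indicators (ties broken arbitrarily in this sketch)
    ranks = np.apply_along_axis(lambda r: np.argsort(np.argsort(r)) + 1.0, 1, ratings)
    rank_sums = ranks.sum(axis=0)
    s = ((rank_sums - rank_sums.mean()) ** 2).sum()
    w = 12.0 * s / (m ** 2 * (n ** 3 - n))
    chi_sq = m * (n - 1) * w                               # chi-square approximation, df = n - 1
    return w, chi2.sf(chi_sq, df=n - 1)

# Hypothetical data: 20 experts rating 35 candidate indicators on a 1-10 scale
rng = np.random.default_rng(0)
scores = rng.integers(1, 11, size=(20, 35)).astype(float)
w, p = kendalls_w(scores)
print(f"W = {w:.3f}, p = {p:.4f}")                         # indicators are retained only when p < 0.05
```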

To mitigate redundancy among indicators, repeated or overlapping indicators were eliminated, and those with identical meanings were merged. The main steps for indicator screening were as follows:

  (1) Regression Analysis: Each preliminary indicator was subjected to regression analysis, using the other indicators as predictor variables and the given indicator as the response variable. The coefficient of determination ($R_j^2$) for each regression model was calculated, with the entire process conducted in IBM SPSS software.
  (2) Variance Inflation Factor (VIF) Calculation: The variance inflation factor (VIF) was computed using the formula [43]:

$VIF_j = \frac{1}{1 - R_j^2}$ (1)

where $R_j^2$ is the coefficient of determination obtained when indicator $j$ is regressed on the remaining indicators within the same driving factor. Typically, indicators are considered to exhibit multicollinearity if VIF > 10, whereas 0 ≤ VIF ≤ 10 indicates the absence of multicollinearity [47].

Highly correlated indicators were eliminated using the Variance Inflation Factor (VIF) method, as detailed in Table 3. Following this refinement, a total of 26 evaluation indicators were retained (see Table 4).
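As a rough illustration of the VIF screening in Eq. (1), the sketch below regresses each indicator on the others and iteratively drops the most collinear one until all VIF values fall below 10. The iterative dropping policy, the synthetic data, and the indicator names are assumptions for demonstration only, not the study's actual screening procedure.

```python
import numpy as np
from sklearn.linear_model import LinearRegression

def vif_scores(X):
    """VIF_j = 1 / (1 - R_j^2), regressing indicator j on the remaining indicators (Eq. 1)."""
    vifs = []
    for j in range(X.shape[1]):
        y, others = X[:, j], np.delete(X, j, axis=1)
        r2 = LinearRegression().fit(others, y).score(others, y)
        vifs.append(1.0 / (1.0 - r2) if r2 < 1.0 else np.inf)
    return np.array(vifs)

def screen_by_vif(X, names, threshold=10.0):
    """Illustrative policy: drop the indicator with the largest VIF until all VIF <= threshold."""
    names = list(names)
    while X.shape[1] > 1:
        vifs = vif_scores(X)
        worst = int(np.argmax(vifs))
        if vifs[worst] <= threshold:
            break
        X = np.delete(X, worst, axis=1)
        names.pop(worst)
    return names

# Hypothetical expert-score matrix (20 responses x 6 indicators) with one engineered collinear column
rng = np.random.default_rng(1)
X = rng.normal(size=(20, 6))
X[:, 5] = 0.95 * X[:, 0] + rng.normal(scale=0.05, size=20)
print(screen_by_vif(X, ["x1", "x2", "x3", "x4", "x5", "x6"]))
```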

Table 3. Summary of highly correlated indicators removed based on VIF analysis.

https://doi.org/10.1371/journal.pone.0332449.t003

2.2 Evaluation methods and levels

This study initially adopts a combined AHP-entropy weight method to assign weights to each dimension level. The AHP method is utilized to determine the subjective weights of dimensions and indicators, while the entropy weight method is applied to calculate objective weights. Finally, the comprehensive weights are computed through the product normalization method. Subsequently, the matter-element model is employed, integrating expert scoring results with comprehensive weight results to derive the comprehensive correlation degree of the evaluated objects, thereby determining the evaluation level.

2.2.1 Determination of indicator weights.

  (1) Subjective Weight Calculation: AHP

Step 1: Constructing the Judgment Matrix

The judgment matrix reflects the relative importance of each element within the evaluation layer concerning a particular element in the higher layer. Its mathematical expression is as follows:

$X = (X_{ij})_{n \times n} = \begin{bmatrix} X_{11} & X_{12} & \cdots & X_{1n} \\ X_{21} & X_{22} & \cdots & X_{2n} \\ \vdots & \vdots & \ddots & \vdots \\ X_{n1} & X_{n2} & \cdots & X_{nn} \end{bmatrix}$ (2)

For any judgment matrix, the following conditions must be satisfied: $X_{ii} = 1$ and $X_{ij} = 1/X_{ji}$. $X_{ij}$ represents the judgment value of the relative importance of element $X_i$ to $X_j$. The graded standard values of $X_{ij}$ are determined based on Saaty’s 1–9 scale method (see Table 5).

Step 2: Conducting Hierarchical Single Sorting

  1) Calculate the product of the elements in each row of the judgment matrix, denoted as $M_i$:

$M_i = \prod_{j=1}^{n} X_{ij}$ (3)

  2) Compute the $n$th root of $M_i$, denoted as $W_i$: $W_i = \sqrt[n]{M_i}$.

  3) Normalize $W_i$ to obtain the weight of the indicators, denoted as $Z_i$, using the following formula:

$Z_i = \frac{W_i}{\sum_{i=1}^{n} W_i}$ (4)

Step 3: Conducting Consistency Checks on the Matrix

  1) Calculate the maximum eigenvalue ($\lambda_{\max}$) of each judgment matrix.

  2) Compute the consistency index (CI) using the formula:

$CI = \frac{\lambda_{\max} - n}{n - 1}$ (5)

where $n$ is the order of the judgment matrix.

The random consistency index (RI) is related to the order of the judgment matrix (see Table 6). Generally, as the order of the matrix increases, the likelihood of random deviations in consistency becomes higher.

Table 6. Standard values of the average Random Consistency Index (RI).

https://doi.org/10.1371/journal.pone.0332449.t006

  3) Calculate the consistency ratio (CR) using the formula:

$CR = \frac{CI}{RI}$ (6)

If CR < 0.10, the matrix is considered to meet the consistency requirement; otherwise, the judgment matrix must be adjusted until the condition is satisfied.
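A compact sketch of Steps 1–3 is given below: it derives weights from a judgment matrix by the row-product and nth-root method (Eqs. 3–4) and checks consistency via CI and CR (Eqs. 5–6). The example 5 × 5 matrix and the RI constants (standard Saaty values, which Table 6 may state slightly differently) are illustrative rather than taken from the study.

```python
import numpy as np

# Commonly cited Saaty random consistency index values for matrix orders 1-9 (cf. Table 6)
RI = {1: 0.0, 2: 0.0, 3: 0.58, 4: 0.90, 5: 1.12, 6: 1.24, 7: 1.32, 8: 1.41, 9: 1.45}

def ahp_weights(X):
    """Row-product / nth-root weights (Eqs. 3-4) plus consistency check (Eqs. 5-6)."""
    X = np.asarray(X, dtype=float)
    n = X.shape[0]
    M = X.prod(axis=1)                      # Eq. (3): product of each row
    W = M ** (1.0 / n)                      # nth root of M
    Z = W / W.sum()                         # Eq. (4): normalized weights
    lam_max = float(np.mean(X @ Z / Z))     # approximate maximum eigenvalue
    CI = (lam_max - n) / (n - 1)            # Eq. (5)
    CR = CI / RI[n] if RI[n] > 0 else 0.0   # Eq. (6)
    return Z, CR

# Illustrative pairwise comparisons for five dimensions (values are examples only)
X = np.array([[1, 1/2, 1/3, 1/5, 1/4],
              [2, 1,   1/2, 1/4, 1/3],
              [3, 2,   1,   1/3, 1/2],
              [5, 4,   3,   1,   2  ],
              [4, 3,   2,   1/2, 1  ]])
Z, CR = ahp_weights(X)
print(np.round(Z, 3), f"CR = {CR:.3f}")     # CR < 0.10 indicates acceptable consistency
```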

  (2) Objective Weight Calculation: Entropy Weight Method

The entropy weight method determines weights based on the degree to which each evaluation indicator influences the overall system. The specific calculation steps are as follows:

Step 1: Forming the Original Data Sequence Matrix

For the given evaluation objects Mi (i = 1, 2, 3, 4, …, m) and evaluation indicators Nf (f = 1, 2, 3, 4, …, n), the evaluation value of object Mi under indicator Nf is denoted as Rif (i = 1, 2, 3, … ,m; f = 1, 2, 3, …, n). This forms the matrix R.

$R = (R_{if})_{m \times n} = \begin{bmatrix} R_{11} & R_{12} & \cdots & R_{1n} \\ R_{21} & R_{22} & \cdots & R_{2n} \\ \vdots & \vdots & \ddots & \vdots \\ R_{m1} & R_{m2} & \cdots & R_{mn} \end{bmatrix}$ (7)

Step 2: Normalizing the Original Matrix

The original matrix is normalized to ensure that Vif falls within the range of 0–1. In this study, the threshold method (also known as the critical value method) is employed for processing indicators where “the larger, the better.”

$V_{if} = \frac{R_{if} - \min_i R_{if}}{\max_i R_{if} - \min_i R_{if}}$ (8)

Step 3: Calculating the Proportion of Indicator Values for Object i under Indicator f, Pif

$P_{if} = \frac{V_{if}}{\sum_{i=1}^{m} V_{if}}$ (9)

Step 4: Calculating the Entropy Value of Indicator f

$e_f = -\frac{1}{\ln m} \sum_{i=1}^{m} P_{if} \ln P_{if}$ (10)

Step 5: Calculating Entropy Weight Zf

The divergence coefficient df is introduced, defined as df = 1 − ef. A larger df implies greater divergence for the corresponding indicator, indicating that it carries more information and should be assigned a higher weight. The formula for calculating entropy weight Zf is as follows:

$Z_f = \frac{d_f}{\sum_{f=1}^{n} d_f}$ (11)
  (3) Comprehensive Weight Calculation

In the process of determining weights by combining the Analytic Hierarchy Process (AHP) and the entropy weight method, this approach not only takes expert experience into account but also reduces the influence of subjective factors on the results. This allows for a more objective and reasonable determination of the weights for each evaluation indicator.

The formula for calculating the comprehensive weight Wj is as follows:

$W_j = \frac{Z_j \cdot Z_j'}{\sum_{j=1}^{n} Z_j \cdot Z_j'}$ (12)

where $Z_j$ denotes the subjective (AHP) weight and $Z_j'$ the objective (entropy) weight of indicator $j$.
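The entropy and product-normalization steps in Eqs. (7)–(12) can be sketched as follows. The score matrix, the assumption that all indicators are of the "larger, the better" type, and the AHP weight vector are illustrative placeholders rather than values from the case study.

```python
import numpy as np

def entropy_weights(R):
    """Objective weights from an (objects x indicators) matrix of benefit-type indicators (Eqs. 7-11)."""
    R = np.asarray(R, dtype=float)
    m, _ = R.shape
    V = (R - R.min(axis=0)) / (R.max(axis=0) - R.min(axis=0) + 1e-12)   # Eq. (8): threshold (min-max) normalization
    P = V / (V.sum(axis=0) + 1e-12)                                      # Eq. (9): proportions
    logP = np.log(np.where(P > 0, P, 1.0))                               # log(0) avoided; zero entries contribute 0
    e = -(P * logP).sum(axis=0) / np.log(m)                              # Eq. (10): entropy per indicator
    d = 1.0 - e                                                          # divergence coefficients
    return d / d.sum()                                                   # Eq. (11): entropy weights

def combined_weights(ahp_w, ent_w):
    """Eq. (12): product normalization of subjective (AHP) and objective (entropy) weights."""
    prod = np.asarray(ahp_w) * np.asarray(ent_w)
    return prod / prod.sum()

# Hypothetical scores for 4 evaluation objects across 5 indicators
R = np.array([[72, 65, 80, 58, 70],
              [68, 70, 75, 62, 66],
              [75, 60, 82, 55, 73],
              [70, 68, 78, 60, 69]])
ent = entropy_weights(R)
print(np.round(ent, 3))
print(np.round(combined_weights([0.06, 0.10, 0.16, 0.42, 0.26], ent), 3))
```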

2.2.2 Construction of the optimized matter-element model.

The matter-element model, originally proposed by the Chinese mathematician Cai Wen in the 1980s, provides an effective method for resolving incompatibility issues in complex systems [78]. This model encompasses three fundamental steps: defining the classical and joint domains, establishing the correlation function, and calculating the comprehensive correlation degree [78]. As detailed in Section 2.2.1, the combination of the Analytic Hierarchy Process (AHP) and entropy weighting was used to derive indicator weights, thereby reducing subjectivity in scoring to some extent. However, traditional matter-element models still exhibit the following limitations:

  (1) Classical domains often rely on fixed thresholds, potentially causing abrupt changes at correlation boundaries that lead to unreasonable evaluation grades (e.g., scores of 0.79 and 0.81 are numerically close, yet they may be assigned starkly different grades).
  (2) Correlation functions are piecewise linear, which may introduce inaccuracies due to potential function distortions.

To overcome the two aforementioned limitations, we have improved the traditional matter-element model, as detailed in Fig 3.

Fig 3. Optimization process of the traditional matter-element model.

https://doi.org/10.1371/journal.pone.0332449.g003

The optimized matter-element model involves the following steps:

Step 1: Define the Matter-Element Structure

The matter-element structure is defined as:

$R = (N, C, V) = \begin{bmatrix} N & c_1 & v_1 \\ & c_2 & v_2 \\ & \vdots & \vdots \\ & c_n & v_n \end{bmatrix}$ (13)

Step 2: Define Classical and Joint Domains

  (1) Classical Domain Matrix:

$R_k = (N_k, c_j, v_{jk}) = \begin{bmatrix} N_k & c_1 & [a_{1k}, b_{1k}] \\ & c_2 & [a_{2k}, b_{2k}] \\ & \vdots & \vdots \\ & c_n & [a_{nk}, b_{nk}] \end{bmatrix}$ (14)

Here, $a_{jk}$ and $b_{jk}$ indicate the lower and upper threshold values of indicator $c_j$ within grade $N_k$, respectively.

  (2) Joint Domain:

$R_p = (N_p, c_j, v_{jp}) = \begin{bmatrix} N_p & c_1 & [a_{1p}, b_{1p}] \\ & c_2 & [a_{2p}, b_{2p}] \\ & \vdots & \vdots \\ & c_n & [a_{np}, b_{np}] \end{bmatrix}$ (15)

Step 3: Dynamic Threshold Optimization

  (1) The classical domain is adjusted adaptively using the following formula (see the illustrative sketch following this step):

$[a'_{jk},\ b'_{jk}] = [a_{jk} - \delta_j,\ b_{jk} + \delta_j]$ (16)

In this context, $\delta_j$ denotes the expansion amplitude. Existing studies suggest that setting $\delta_j$ proportional to the indicator magnitude $|X_j|$ is most appropriate for boundary adjustment in maturity evaluation models [79] (See S1 Appendix in the Supporting Information for details).

Based on expert scoring data collected in this study, six candidate values were comparatively analyzed: $\delta_j = 0.05|X_j|$, $0.06|X_j|$, $0.07|X_j|$, $0.08|X_j|$, $0.09|X_j|$, and $0.10|X_j|$. The evaluation focused on misclassification rates of maturity levels, boundary sensitivity, and the maximum correlation degree (See S2 Appendix in the Supporting Information for details). The results indicate that when $\delta_j = 0.07|X_j|$, the boundary misclassification rate is minimized (0%), and a favorable balance is achieved between sensitivity and stability, with a more pronounced maximum correlation degree. These findings support the adoption of $\delta_j = 0.07|X_j|$ as the optimal parameter for the evaluation model in this project.

  (2) The standard distance is then computed using the formula:

(17)

Here, $x_j$ is the normalized indicator score, $\mu_{jk}$ is the center of the adjusted grade interval $[a'_{jk}, b'_{jk}]$, and $\sigma_{jk}$ is the bandwidth of the adjusted grade interval.
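As flagged under item (1) above, the following sketch illustrates the dynamic threshold adjustment with $\delta_j = 0.07|X_j|$. The grade boundaries are made-up example values, and the grade center and bandwidth are taken as the midpoint and half-width of the adjusted interval, which is an assumption of this sketch since Eq. (17) is not reproduced here.

```python
import numpy as np

def adjust_classical_domain(a, b, x, delta_ratio=0.07):
    """Expand each classical-domain interval [a_jk, b_jk] by delta_j = delta_ratio * |x_j| (Eq. 16)."""
    a, b, x = (np.asarray(v, dtype=float) for v in (a, b, x))
    delta = delta_ratio * np.abs(x)[:, None]     # one expansion amplitude per indicator
    a_adj, b_adj = a - delta, b + delta
    mu = (a_adj + b_adj) / 2.0                   # assumed grade center of the adjusted interval
    sigma = (b_adj - a_adj) / 2.0                # assumed grade bandwidth (half-width)
    return a_adj, b_adj, mu, sigma

# Example: one indicator scored 68 on a 0-100 scale with five illustrative grade intervals
a = np.array([[0, 60, 70, 80, 90]])              # lower bounds of grades I-V
b = np.array([[60, 70, 80, 90, 100]])            # upper bounds of grades I-V
a_adj, b_adj, mu, sigma = adjust_classical_domain(a, b, x=[68.0])
print(np.round(a_adj, 2), np.round(b_adj, 2))    # each interval widened by 0.07 * 68 = 4.76
```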

Step 4: Nonlinear Correlation Degree Calculation

  (1) The nonlinear correlation degree $K_j(v_k)$ is calculated using the bimodal Sigmoid correlation function as follows:

(18)

where $\beta_1 = 8$ and $\beta_2 = 5$ are the decay rate parameters for the left and right sides, respectively [80], and $\alpha = 0.3$ denotes the center offset [81].

  (2) The local sensitivity correction term $\Delta K_j(v_k)$ is formulated as:

(19)

where the local bandwidth coefficient is adopted from [82], and $\eta = 0.2$ denotes the correction intensity coefficient [83].

  (3) The comprehensive correlation degree is computed using the following formula:
(20)

Step 5: Determination of Comprehensive Evaluation Grade

This study categorizes the application scenarios and functionalities of Digital Twin technology in the construction field into a hierarchical maturity framework consisting of five levels: (I) Basic, (II) Connected, (III) Integrated, (IV) Interactive, and (V) Autonomous. The proposed hierarchical structure represents a progressive classification based on the technological advancement and practical application of Digital Twin systems. Its objective is to assess the improvements in efficiency and advancements in technology. The corresponding score requirements and specific definitions for each level are provided in Table 7.

The correlation degree in the matter-element model extends the logical values from the [0,1] closed interval of fuzzy mathematics to the real number axis (−∞, +∞), which reveals more differentiation information:

  (1) When $0 \le K \le 1$, it indicates that the digital twin maturity level in this region meets the requirements of the standard level.
  (2) When $-1 < K < 0$, it suggests that the level does not meet the standard requirements but still has the potential to belong to this interval; the closer the value is to zero, the more likely the conversion.
  (3) When $K \le -1$, it signifies that the object is neither within this grade range nor meets the conditions for conversion into this grade.
  (4) When $K > 1$, it indicates that the level exceeds the upper limit of the standard grade.

By integrating the previously determined indicator weights, the grade membership degree $D_k$ is synthesized using the formula:

$D_k = \sum_{j=1}^{n} w_j K'_j(v_k)$ (21)

where $w_j$ represents the weight of indicator $j$, and $K'_j(v_k)$ denotes the comprehensive correlation degree of indicator $j$ for grade $v_k$.

The maturity grade $L$ and the corresponding maturity score are determined as:

$L = \arg\max_{k} D_k$ (22)

(23)
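A minimal sketch of Eqs. (21)–(22) follows: given a matrix of comprehensive correlation degrees (which the full model obtains from Eqs. 18–20, not reproduced here) and the combined indicator weights, it computes the grade membership degrees and selects the grade with the maximum membership. The correlation values and weights are invented for illustration.

```python
import numpy as np

GRADES = ["I Basic", "II Connected", "III Integrated", "IV Interactive", "V Autonomous"]

def maturity_grade(weights, K):
    """Eq. (21): D_k = sum_j w_j * K'_j(v_k); Eq. (22): grade L = argmax_k D_k."""
    D = np.asarray(weights, dtype=float) @ np.asarray(K, dtype=float)   # K has shape (indicators, grades)
    return D, GRADES[int(np.argmax(D))]

# Hypothetical comprehensive correlation degrees for 4 indicators across the 5 grades
K = np.array([[-0.6,  0.3,  0.1, -0.4, -0.9],
              [-0.8,  0.5, -0.1, -0.5, -1.0],
              [-0.7,  0.2,  0.4, -0.3, -0.8],
              [-0.9,  0.1, -0.2, -0.6, -1.1]])
D, level = maturity_grade([0.3, 0.2, 0.3, 0.2], K)
print(np.round(D, 3), "->", level)
```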

2.3 Feedback mechanism

The Importance-Performance Analysis (IPA), introduced by Martilla and James in 1977 [84], is a straightforward, intuitive, and visual evaluation method that has been extensively applied in the assessment of service quality across various domains [85]. This approach evaluates key elements from the dual perspectives of importance and performance, mapping the evaluation indicators across four quadrants based on their importance and performance levels, as demonstrated in Fig 4. In this study, the digital twin maturity evaluation employs weight values of indicators as the horizontal axis (importance) and maturity scores as the vertical axis (maturity). Using the mean values of these two dimensions as thresholds, the matrix is divided into four quadrants: Quadrant I (high importance, high maturity) represents the “Advantage Retention Zone,” Quadrant II (low importance, high maturity) constitutes the “Steady Development Zone,” Quadrant III (low importance, low maturity) corresponds to the “Low Priority Development Zone,” and Quadrant IV (high importance, low maturity) signifies the “Key Improvement Zone.” This methodology facilitates cross-dimensional analysis of indicators, offering improvement recommendations tied to maturity results and providing scientific guidance for evaluation processes.
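The quadrant assignment described above can be expressed compactly as below, using the mean indicator weight and the mean maturity score as the two thresholds. The indicator codes, weights, and scores in the example are placeholders, not the values reported in Section 3.

```python
import numpy as np

QUADRANTS = {
    (True, True):   "I  Advantage Retention Zone",
    (False, True):  "II  Steady Development Zone",
    (False, False): "III Low Priority Development Zone",
    (True, False):  "IV  Key Improvement Zone",
}

def ipa_quadrants(names, importance, performance):
    """Assign each indicator to an IPA quadrant using mean importance and mean performance as cut points."""
    importance = np.asarray(importance, dtype=float)    # indicator weights (horizontal axis)
    performance = np.asarray(performance, dtype=float)  # maturity scores (vertical axis)
    i_mean, p_mean = importance.mean(), performance.mean()
    return {n: QUADRANTS[(i >= i_mean, p >= p_mean)]
            for n, i, p in zip(names, importance, performance)}

# Hypothetical weights and maturity scores for four indicators
result = ipa_quadrants(["B4", "D5", "E6", "C1"],
                       importance=[0.045, 0.060, 0.030, 0.028],
                       performance=[58, 55, 82, 57])
for name, zone in result.items():
    print(name, "->", zone)
```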

2.4 Alignment and advantages over existing standards

By systematically reviewing the methodological foundations and applicable scopes of existing standards and models, this study clarifies the distinctive features and innovations of the proposed evaluation framework in terms of goal orientation, structural granularity, assessment methodology, construction-phase alignment, and complementary potential. The framework’s relationship with established principles and standards is characterized by translation, integration, and extension. A comparative analysis is presented across four dimensions: the Gemini Principles, ISO 19650, the CMMI framework, and representative academic literature.

The Gemini Principles emphasize governance values such as purpose and trust, including data quality, security, visualization, and access control [20,86]. This study translates these abstract principles into measurable indicators within the data and application layers. Through dynamic thresholding and weight adjustment, the framework balances privacy, security, and openness across varying contexts. While the Gemini Principles are non-quantitative and architecture-oriented, the proposed framework operationalizes them into an evaluable governance–technology coupling mechanism, facilitating project-level implementation.

ISO 19650 focuses on lifecycle information management, including data organization, exchange, versioning, and Common Data Environment (CDE) collaboration [87–89]. The indicators defined in this study—such as B1 to B6 (transmission, quality, storage, timeliness, security, visualization) and C2/C3 (standardization/synchronization)—serve as quantitative externalizations of the information management process. These indicators enable monitoring of CDE performance, fidelity in cross-organizational exchange, and timeliness. Unlike ISO 19650, which centers on process roles and compliance, the proposed framework bridges procedural conformity with performance quantification.

The CMMI framework provides process domains and maturity levels for aspects such as data quality, risk, measurement, and governance. However, it does not address the closed-loop applications specific to construction-oriented digital twins, such as equipment, materials, quality, safety, personnel, and technology [90–93]. This study maps these operational capabilities to performance-linked indicators (E1–E7) and employs Importance–Performance Analysis (IPA) to identify high-importance but low-performance gaps. In doing so, it advances from assessing process existence to evaluating process effectiveness, offering a scenario-specific quantitative extension of CMMI for construction digital twin applications.

Existing academic literature primarily emphasizes layered digital twin capabilities, core technologies (data–model–analysis–feedback), and semantic interoperability requirements [86,94]. The five-layer structure (A–E) adopted in this study aligns closely with these technical logics. However, the framework introduces quantifiable management-oriented indicators tailored to the construction phase, along with a novel evaluation methodology that integrates matter-element modeling, dynamic thresholds, and nonlinear correlation functions. These enhancements enrich the methodological toolkit for maturity assessment.

Table 8 presents a comparative summary of the proposed framework against the Gemini Principles, ISO 19650, CMMI, and academic models across five dimensions: goal orientation, structural granularity, assessment methodology, construction-phase alignment, and complementary potential.

Table 8. Framework alignment with standards and literature.

https://doi.org/10.1371/journal.pone.0332449.t008

3. Case study

3.1 Project overview

3.1.1 Project introduction.

The A and B buildings of the CAZ Innovation and Entrepreneurship Industrial Park in Xinyang, Henan Province, are currently the tallest landmark structures in the city. They are located in Pingqiao District, an area characterized by relatively flat terrain (as depicted in Fig 5). The project covers a land area of 39,685 square meters, with a total construction area of approximately 215,100 square meters. It comprises two tower buildings (Buildings A and B) and an associated commercial podium, with a standard floor height of 4.3 meters.

Fig 5. Location Map of Xinyang CAZ Innovation and Entrepreneurship Industrial Park.

Note: The satellite basemap on the left is sourced from the official Tianditu website of China (https://www.tianditu.gov.cn/), while the basemap on the right is derived from USGS (viewer.nationalmap.gov/viewer).

https://doi.org/10.1371/journal.pone.0332449.g005

Despite facing constraints such as a tight construction schedule and significant technical challenges, the project team effectively employed advanced technologies, including Building Information Modeling (BIM) three-dimensional modeling, intelligent network platforms, and Eagle Eye panoramic monitoring systems, throughout the construction process. Moreover, the adoption of digital twin technology enabled the comprehensive digitization of physical structures, production elements, and management processes.

This project has been recognized as a “Digital Twin Intelligent Construction Site Demonstration Project” by Henan Province and serves as the first high-rise building in Xinyang to utilize digital twin technology for construction monitoring and management. Additionally, it represents a seminal example of digital twin applications in the construction sector across China.

3.1.2 Overview of digital twin application in the project.

The project introduced an intelligent digital twin scheduling platform—referred to as the “Project Brain” (see Fig 6)—which enables real-time data aggregation, production tracking, and proactive risk mitigation. The system has been applied across multiple domains, including production management, equipment management, logistics, quality control, safety supervision, personnel coordination, and technical oversight. It has established a foundational level of usability in multi-source data acquisition, low-latency transmission, automated model interaction, platform governance, and scalability, thereby supporting continuous visualization, traceability, and predictive capabilities on-site. Key technical parameters are summarized in Table 9.

Table 9. Key technical parameters of the digital twin system.

https://doi.org/10.1371/journal.pone.0332449.t009

Fig 6. Illustration of the digital twin platform of the enterprise.

https://doi.org/10.1371/journal.pone.0332449.g006

The data acquisition framework integrates Ultra-Wideband (UWB) anchors, video feeds, RFID tags, point cloud and image data, and Network Time Protocol (NTP) synchronization. This multi-source configuration supports localization, monitoring, and temporal alignment, enabling stable tracking of personnel and materials in medium-scale construction environments. It facilitates parallel acquisition of structured and unstructured data, laying a solid foundation for multi-model fusion. However, improvements in positioning accuracy and update frequency remain necessary.

In terms of data processing, the system leverages 5G/private networks and local Wi-Fi to achieve low-latency transmission and a complete end-to-end processing pipeline, encompassing acquisition, transmission, computation, and visualization. Fault tolerance and redundancy mechanisms are in place to ensure continuous operation, though further optimization is needed under high-concurrency conditions, particularly in cross-source temporal alignment.

Model interaction capabilities are reflected in geometric accuracy (mean deviation of 32 mm), component recognition performance (F1 score of 0.75), and 4D verification coverage (70%). The system supports stable visualization and alerting for situational awareness and post-event review, with strong human–machine collaboration features that allow rapid manual correction in complex scenarios. Nonetheless, reducing alert response time could enhance real-time responsiveness.

Platform governance and analytical capabilities are well-developed, with clear budget allocation (CAPEX 0.85%, OPEX 0.15%) and a KPI system that supports scalability to larger or multi-regional projects. Enhancing material traceability and improving On-Time In-Full (OTIF) performance would further strengthen operational efficiency.

3.2 Evaluation of digital twin maturity

3.2.1 Determination of weights.

A panel of 20 experts was invited to score each indicator based on the actual conditions of the project, with questionnaire scores ranging from 0 to 100. The panel included six academic scholars specializing in digital twin applications in architecture, seven management personnel from construction enterprises, and seven technical staff members from digital twin platform providers. All experts responded positively, resulting in a 100% valid questionnaire recovery rate.

The combined weight of each indicator was calculated using the Analytic Hierarchy Process (AHP)-entropy weight method, as shown in Table 10. It was observed that among the primary indicators, the “Analysis Layer” and the “Application Layer” ranked the highest in weight, highlighting their significant impact on the digital twin maturity during the construction process. Among the secondary indicators, “Self-Learning Capability,” “Predictive Capability,” “Production Management Capability,” “Quality Management Capability,” and “Model Visualization Capability” emerged as the top five, emphasizing their crucial contributions to digital twin maturity at this stage.

Table 10. Combined weight results of evaluation indicators.

https://doi.org/10.1371/journal.pone.0332449.t010

In contrast, the Collection Layer received the lowest weight among the primary indicators, suggesting that its influence on digital twin maturity during the construction phase is relatively limited. Similarly, Data Security ranked lowest among the secondary indicators, indicating that its contribution to maturity in this context is also minimal. In the specific case examined, experts justified the reduced weighting of data security by noting that many digital twin systems deployed on Chinese construction sites operate within local area networks, enterprise intranets, or private 5G environments. These configurations significantly limit exposure to external internet connections, thereby reducing the risk of external cyberattacks [97].

3.2.2 Maturity evaluation results.

Based on the scoring of the digital twin maturity evaluation indicators for the Xinyang CAZ Innovation and Entrepreneurship Industrial Park by 20 experts, the average score across all participants was calculated and adopted as the final evaluation score for each indicator (refer to Table 11).

Based on the classification in Table 7 and the integration of the dynamic threshold optimization algorithm, the initial classical domain and adjusted classical domain for evaluating the maturity indicators of digital twins were identified (refer to Table 12). Subsequently, the linear and nonlinear correlations of the primary and secondary indicators were calculated using both traditional and optimized models, with the results detailed in Table 13. A comparison of the correlation degrees calculated by the two methods revealed that their levels were nearly identical. However, the correlation values derived from the optimized model were higher, providing a more intuitive representation of the correlation levels.

Table 12. Classical domain and node domain of digital twin maturity evaluation.

https://doi.org/10.1371/journal.pone.0332449.t012

Table 13. Results of nonlinear correlations for primary and secondary evaluation indicators across different levels.

https://doi.org/10.1371/journal.pone.0332449.t013

Based on the calculated comprehensive correlation degrees of all evaluation indicators for the Xinyang CAZ Innovation and Entrepreneurship Industrial Park (see Table 14), the results indicate that the traditional matter-element model yields a maturity rating of Level III (Integrated), whereas the optimized model produces a lower rating of Level II (Connected). This discrepancy suggests that the digital twin technology applied during the construction phase of the project exhibits moderate maturity. Notably, the analytical layer demonstrates the lowest level of maturity, highlighting a critical area for improvement in data integration and model responsiveness.

3.2.3 Analysis of evaluation results.

  (1) Descriptive statistics of indicator maturity scores

Based on the mean and median scores of the primary indicators evaluated by 20 experts (as shown in Fig 7(a) and 7(c)), the Modeling Layer (C) and Analysis Layer (D) had relatively lower mean scores and medians, ranking among the bottom two. This suggests that the interaction between the physical space and virtual models in this project is limited, and that the spatial algorithm library exhibits relatively weak capabilities in data processing, decision-making, and optimization. In contrast, the Data Layer (B) and Application Layer (E) achieved higher mean scores and medians, ranking among the top two, highlighting the superior quality, transmission efficiency, storage capability, and management effectiveness of the digital twin data during the construction process.

Fig 7. Descriptive statistical results of expert scores for evaluation indicators.

https://doi.org/10.1371/journal.pone.0332449.g007

Furthermore, most secondary indicators showed mean and median scores ranging between 60 and 80 (as illustrated in Fig 7(b) and 7(d)). Among these, E6 (personnel management capability) achieved the highest mean and median scores, reflecting that the project’s high management standards are primarily attributable to effective personnel management. In contrast, the indicators ranked among the bottom two in terms of both mean and median scores were C1 (model accuracy) and E3 (material management capability), which fell below the passing threshold of 60. These findings highlight that low model accuracy and insufficient material management represent significant barriers to advancing the project’s digital twin maturity. In the specific case examined, two diagnostic findings support these conclusions. First, the low model accuracy is evidenced by an average deviation of 32 mm in point cloud alignment and component recognition. Prior research indicates that when this deviation consistently exceeds 30 mm, automated progress verification and quality deviation alerts—core functions of a self-aware and self-diagnostic digital twin—tend to regress into manual inspection and sampling. This transition effectively downgrades the system from a “digital twin” (characterized by high-fidelity bidirectional synchronization) to a “digital shadow” (unidirectional and low-trust) [98]. Second, construction records show that delays due to material mismatches, shortages, or omissions occurred four times during the project. Existing studies have demonstrated that material shortages and fluctuations in lead times can amplify schedule deviations and increase the risk of acceleration and rework, thereby distorting performance and inflating costs [95]. These factors explain the experts’ relatively low scores for model accuracy and material management capability.

As depicted in Fig 7(e) and 7(f), the standard deviations of the scores assigned by 20 experts to each indicator were consistently less than 10% of their respective mean values. This high level of consistency among expert evaluations further supports the reliability of the scoring data and bolsters the validity of the study’s conclusions.

  (2) Analysis of Evaluation Levels

Based on the results in Table 13, it was observed that, for both primary and secondary indicators, the proportion of indicators with a correlation level of III was the highest (accounting for 60.00% and 46.15%, respectively). Furthermore, the overall maturity of the digital twin evaluation for this project was determined to be at Level II (Table 14). These findings suggest that the overall maturity level of the project’s digital twin technology is moderate, with considerable potential for enhancement in certain areas. Specifically:

  1) Among the primary indicators, the relevance level of the data layer (B) ranks highest (Grade IV), while that of the analysis layer (D) is the lowest (Grade II). This suggests that the quality, transmission, and storage capabilities of the digital twin data are relatively high, whereas the data processing capacity of the spatial algorithm library and the overall decision-making and optimization capabilities are considerably weaker.
  2) In the secondary indicators, personnel management capability (E6) achieved the highest correlation level (Level V), while eight indicators, including model accuracy (C1) and material management capability (E3), were evaluated at lower levels (Level II). This indicates that the personnel management level of digital twin technology in the project is relatively high. However, deficiencies in model accuracy and material management hinder the improvement of the project’s digital twin maturity.
  3) Based on the analysis of primary and secondary indicators, the correlation levels of predictive capability (D3) and autonomous learning capability (D5) in the Analysis Layer (D) were relatively low, both rated at Level II. This indicates that deficiencies in intelligent event prediction and autonomous decision-making are key factors contributing to the weaker analytical capabilities of the digital twin technology in this project. Meanwhile, in the Data Layer (B), which has the highest maturity level, the secondary indicator of data timeliness (B4) was only rated at Level II. This reveals that, despite the overall high maturity of the Data Layer, there are still shortcomings in real-time data synchronization capabilities.
  (3) Optimization Strategies for Feedback Evaluation

Through Importance-Performance Analysis (IPA) of primary and secondary indicators in the digital twin maturity evaluation, optimization strategies for feedback on project indicators have been determined, as illustrated in Fig 8(a)–8(f). The specific analysis is as follows:

Fig 8. IPA chart of digital twin indicators (maturity−weight).

https://doi.org/10.1371/journal.pone.0332449.g008

  1) Quadrant I (Advantage Retention Zone): This quadrant includes one primary indicator, the application layer (E), and five secondary indicators: sensing modalities (A1), data quality (B2), model visualization capability (C4), production management capability (E1), and quality management capability (E4). These indicators have generally met the expected standards of digital twin technology in the construction sector. They should remain well-maintained and continuously refined to sustain their performance.
  2) Quadrant II (Steady Development Zone): This quadrant contains two primary indicators, the acquisition layer (A) and the data layer (B), along with eleven secondary indicators: network infrastructure (A3), data storage (B3), data transmission (B1), model standardization (C2), model-integrated information capability (C5), analytical capability (D1), decision-making capability (D2), optimization capability (D4), safety management capability (E5), personnel management capability (E6), and technical management capability (E7). These indicators have relatively lower impact on the construction sector’s digital twin technology but demonstrate high levels of development. As such, they require moderate attention without excessive resource allocation.
  3) Quadrant III (Low Priority Development Zone): This quadrant includes one primary indicator, the modeling layer (C), and seven secondary indicators: data security (B5), data visualization (B6), model accuracy (C1), model synchronization capability (C3), prediction capability (D3), equipment management capability (E2), and material management capability (E3). These indicators exhibit both low impact and low levels of development. In the short term, they may be deprioritized, with resources instead allocated to urgent areas requiring improvement or refinement. Optimization of these indicators can be pursued as conditions mature.
  4) Quadrant IV (Key Improvement Zone): This quadrant contains one primary indicator, the analysis layer (D), and three secondary indicators: sensing capability (A2), data timeliness (B4), and autonomous learning capability (D5). These indicators have significant impact on the construction sector’s digital twin technology but display low levels of development. Immediate improvements are therefore essential. Managers should allocate substantial resources—human, material, and financial—to transform these weaknesses into strengths.

Optimization Strategies for Indicators Based on Current Conditions:

  1) Overall Strategic Framework

Project prioritization follows the sequence: Quadrant IV > Quadrant I > Quadrant II > Quadrant III. The investment strategy allocates over 60% of the incremental budget to areas characterized by high strategic importance but low current performance. In high-performance zones, technological leadership should be maintained; in stable zones, cost-effective optimization is recommended; and in low-priority zones, data accumulation and controlled experimentation should be sustained. An evaluation mechanism is proposed whereby all quadrant indicators undergo quarterly reassessment using the Importance–Performance Analysis (IPA) framework. This enables dynamic strategic realignment, mitigating the risk of performance degradation in high-performing areas and prolonged stagnation in underperforming zones.

  2) Specific Recommendations

Indicators in the Advantage Retention Zone: It is advisable to establish a dynamic KPI monitoring dashboard that automatically aggregates key performance metrics—such as latency, accuracy, and visualization refresh rates—on a monthly basis to prevent performance volatility. Redundant computational nodes and hot backup mechanisms should be integrated into critical modules (e.g., real-time visualization and quality inspection) to ensure zero-interruption in the event of node failure. Drawing on best practices from advanced digital twin platforms in smart construction and intelligent manufacturing, the visualization rendering engine should be upgraded to support millimeter-level 4D scene reconstruction. Additionally, root cause analysis (RCA) capabilities should be embedded within production and quality management modules, enabling quality alerts to be directly linked to material batches, construction teams, and suppliers, thereby facilitating closed-loop management.

Indicators in the Steady Development Zone: Without incurring substantial investment costs, continuous optimization of network and storage performance is recommended. For instance, implementing edge caching combined with incremental data update mechanisms can alleviate transmission pressure. Cross-module standards for model and data encoding should be formulated and enforced to enhance model integration and reusability, laying the groundwork for future cross-project data fusion. Under D1/D2 indicators, scenario simulation and solution optimization algorithms should be introduced to support multi-scheme comparisons and risk assessments in resource scheduling and project timeline forecasting. Regular (quarterly or semi-annual) emergency drills for information and site safety should be conducted, with digital twin simulation results used to refine standard operating procedures (SOPs) for emergency response.

Indicators in the Low Priority Development Zone: Although short-term maintenance may be minimal, medium- to long-term breakthroughs should be strategically planned. Historical data on construction, materials, and equipment can be leveraged to deploy machine learning models that enhance early detection of risks related to progress, quality, and cost. Higher-resolution sensor data should be used to drive dynamic updates of BIM models, reducing the update cycle from biweekly to weekly or even daily. A hybrid positioning system combining Radio Frequency Identification (RFID) and Ultra-Wideband (UWB) technologies, integrated with equipment health scoring systems, can facilitate predictive maintenance and material shortage alerts. Transitioning from static views to interactive 3D/4D scenes will enable users to annotate issues and retrieve historical data directly through the visualization interface.
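The equipment health scoring referred to above could, for instance, combine a few normalized condition features into a single score that triggers a maintenance flag; the features, normalization ranges, weights, and threshold below are illustrative rather than the project's actual configuration.

```python
def health_score(vibration_rms: float, temperature_c: float, run_hours: float) -> float:
    """Toy health score in [0, 100]: higher is healthier.

    Each feature is clipped to a plausible range and linearly normalized;
    the ranges and weights are illustrative.
    """
    def normalize(value: float, low: float, high: float) -> float:
        return min(max((high - value) / (high - low), 0.0), 1.0)

    score = (
        0.5 * normalize(vibration_rms, 0.5, 8.0)      # mm/s, assumed vibration band
        + 0.3 * normalize(temperature_c, 40.0, 95.0)
        + 0.2 * normalize(run_hours, 0.0, 10_000.0)   # hours since last overhaul
    )
    return round(100.0 * score, 1)

score = health_score(vibration_rms=5.6, temperature_c=78.0, run_hours=6200.0)
print(score, "-> schedule maintenance" if score < 55.0 else "-> healthy")
```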

Indicators in the Key Improvement Zone: To enhance the detection of emergent risks, a multi-dimensional environmental sensing matrix should be established through the deployment of diverse sensors (e.g., temperature, humidity, vibration, tilt, and gas concentration). Edge computing nodes should be introduced in high-risk operational zones to enable on-site preliminary data analysis and early warning, reducing end-to-end alert latency to under 10 seconds. Online learning and transfer learning frameworks should be incorporated to allow the digital twin platform to adapt its parameters and predictive models in response to varying construction phases and environmental conditions. A closed-loop system integrating field feedback, historical deviation analysis, and model optimization should be developed to continuously improve analytical accuracy and the practical relevance of recommendations.
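A minimal sketch of the online-learning idea: an exponentially weighted running estimate of mean and variance lets an edge node adapt its alert threshold to the current construction phase without retraining a full model; the smoothing factor and the three-sigma rule are illustrative choices.

```python
class AdaptiveAlarm:
    """Online mean/variance estimate with an EWMA; alarms on large deviations."""

    def __init__(self, alpha: float = 0.05, n_sigma: float = 3.0):
        self.alpha, self.n_sigma = alpha, n_sigma
        self.mean, self.var = 0.0, 1.0
        self.initialized = False

    def update(self, x: float) -> bool:
        if not self.initialized:
            self.mean, self.initialized = x, True
            return False
        deviation = x - self.mean
        alarm = abs(deviation) > self.n_sigma * self.var ** 0.5
        # Update the statistics after checking, so the alarm reflects prior state.
        self.mean += self.alpha * deviation
        self.var = (1 - self.alpha) * (self.var + self.alpha * deviation ** 2)
        return alarm

monitor = AdaptiveAlarm()
readings = [20.1, 20.3, 19.8, 20.0, 20.2, 27.5]   # e.g., structural tilt in mm
print([monitor.update(r) for r in readings])       # alarm only on the jump to 27.5
```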

4. Discussion

4.1 Evaluation of the reasonableness of the results

This study analyzed the digital twin maturity of the CAZ Innovation Industrial Park project in Xinyang through an actual case study, yielding evaluation results that demonstrate the system’s strong suitability.

  1. (1). Indicator System: The study revealed that among the primary indicators, the “Analysis Layer” accounted for the highest weight, while the “Acquisition Layer” carried the lowest. In terms of secondary indicators, “Self-Learning Capability” exhibited the highest global weight, whereas “Data Security Capability” ranked the lowest. These findings are consistent with the conclusions of Tao et al. [99], who analyzed the hierarchical structure of digital twins, emphasized the importance and central role of the analysis layer, and identified the acquisition layer as serving a more foundational function within the overall system.
  2. (2). Evaluation Method: The final evaluation result of the digital twin maturity for this project, calculated using the improved matter-element model, was determined to be at the “Connected Level” (Level II). Among the primary indicators, the “Data Layer” received the highest rating, while the “Analysis Layer” scored the lowest. For the secondary indicators, “Personnel Management Capability” ranked the highest, whereas indicators such as “Model Accuracy” and “Material Management Capability” ranked the lowest. These findings align with the conclusions of Boschert and Rosen [96], who emphasize the significance of data in the overall architecture of digital twins. Moreover, because the analysis layer relies heavily on accumulated data and model predictions, it often lags behind other layers in digital twin maturity evaluations.

Furthermore, compared to traditional models, the optimized matter-element model exhibits the following advantages:

  1. 1). Improved interpretability of boundary indicators.

As shown in Table 11, the maturity score of E6 is 90.1. While the traditional model defines the Level V classical domain as [90, 100), the optimized model, using dynamic thresholds with smooth transition bands, expands the Level V classical domain to [83, 100). Although both models assign E6 to Level V, the correlation degree under the traditional model is merely 0.01, leaving the result susceptible to noise interference and abrupt grade transitions near the boundary. In contrast, the optimized model achieves a correlation degree of 0.96 (a roughly 95-fold improvement). This demonstrates that the optimized model not only responds more precisely at grade boundaries but also remains far more stable near them.
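For readers less familiar with the extension correlation function behind these figures, the sketch below reproduces the traditional correlation degree of 0.01 for E6 (score 90.1) using the classical domain [90, 100] and the joint domain [0, 100]; the optimized value of 0.96 additionally reflects the dynamic-threshold and nonlinear-mapping steps, which are not reproduced here.

```python
def rho(x: float, a: float, b: float) -> float:
    """Extension distance of point x from the interval [a, b]."""
    return abs(x - (a + b) / 2) - (b - a) / 2

def correlation(x: float, classical: tuple, joint: tuple) -> float:
    """Classical matter-element correlation degree K(x).

    Assumes the score x lies inside the joint domain, as in this case study.
    """
    d_classical, d_joint = rho(x, *classical), rho(x, *joint)
    if d_classical <= 0:                        # x falls inside the classical domain
        return -d_classical / (classical[1] - classical[0])
    return d_classical / (d_joint - d_classical)

# Traditional Level V classical domain [90, 100] within the joint domain [0, 100]:
print(round(correlation(90.1, (90, 100), (0, 100)), 2))   # 0.01, as reported
# Widening the classical domain to [83, 100] moves 90.1 away from the boundary:
print(round(correlation(90.1, (83, 100), (0, 100)), 2))   # larger, more stable value
```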

Of note, dynamic thresholds also mitigate over-limit misjudgments, as illustrated below:

  1. (i). Assuming the true range is [0, 100], the extended Level V domain is [83,100].
  2. (ii). For over-limit values μ ∈ [90, 100], the proportion of the actual effective range is:
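Assuming, for illustration, that this proportion is read as the relative position of μ within the extended Level V domain [83, 100]:

$$p(\mu)=\frac{\mu-83}{100-83}\in\Big[\tfrac{7}{17},\,1\Big]\approx[0.41,\;1],\qquad \mu\in[90,100].$$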

This indicates that over-limit values retain a reasonable affiliation within the extended domain, avoiding forced classification into the highest level.

  2. 2). Enhanced clarity in correlation degree interpretation.

The optimized model achieves greater clarity in the representation of correlation degrees. By mapping the original correlation values—including potentially negative values—onto the [0,1] interval, the model effectively eliminates the distortions introduced by negative correlation degrees in traditional methods. This nonlinear mapping not only improves the intuitive interpretability of the results but also aligns the correlation measures more closely with probabilistic semantics, thereby enhancing their comprehensibility and practical applicability.

For example, as presented in Tables 11 and 13, the maturity score of indicator C1 is 46.5. Under the traditional model, the corresponding correlation degree is −0.15 (Level V), which could easily be misinterpreted as indicating a “strong negative correlation.” In contrast, within the optimized model, this value is remapped to 0.852 (Level V), thereby providing a more accurate reflection of its actual influence (refer to Table 15). Such a transformation substantially improves both the continuity and the rationality of the evaluation results, effectively preventing misclassification stemming from inappropriate sign conventions or scale definitions.

Table 15. Optimized correlation degree matrix for secondary indicators.

https://doi.org/10.1371/journal.pone.0332449.t015
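Whatever the exact transform adopted in the optimized model, the general idea of mapping possibly negative correlation degrees onto [0, 1] with a probabilistic reading can be sketched as a softmax-style normalization across the five maturity levels; the sharpness parameter and the sample correlation values below are illustrative and are not taken from Table 15.

```python
import math

def to_probabilities(correlations: list[float], sharpness: float = 8.0) -> list[float]:
    """Map raw correlation degrees (possibly negative) to a distribution on [0, 1].

    A softmax-style normalization is one monotone choice; the exact nonlinear
    mapping used by the optimized model may differ in form.
    """
    weights = [math.exp(sharpness * k) for k in correlations]
    total = sum(weights)
    return [round(w / total, 3) for w in weights]

# Hypothetical correlation degrees of one indicator with maturity levels I-V:
raw = [-0.62, -0.48, -0.30, -0.15, -0.55]
print(to_probabilities(raw))   # peaks at the level with the largest raw correlation
```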

  3. (3). Feedback Mechanism: The Importance-Performance Analysis (IPA) identified the “Analysis Layer” as the key area of focus, suggesting priority improvements in sensor capability, data real-time performance, and self-learning capability indicators. These findings are consistent with the conclusions of Kritzinger et al. [100], who explicitly emphasized that sensor technology and data real-time performance are critical elements for enhancing the accuracy of digital twins. Furthermore, the integration of self-learning algorithms allows digital twins to achieve more efficient data processing and intelligent decision-making.

This project employed the Importance–Performance Analysis (IPA) framework to optimize resource allocation, thereby facilitating the efficient implementation of a digital twin platform in the context of construction engineering. As demonstrated in Table 16, the strategy of prioritizing breakthroughs in critical improvement zones, consolidating performance in high-value areas, implementing cost-effective enhancements in stable zones, and maintaining strategic oversight in low-priority zones yielded outcomes that significantly exceeded initial expectations. Specifically, construction efficiency improved by 30%, the project timeline was shortened by nearly seven months, and cumulative cost savings surpassed RMB 16 million. These results provide robust empirical validation of the feasibility and forward-looking nature of IPA-guided strategies in enhancing both the economic performance and technological maturity of construction projects. The findings offer a replicable model and decision-making reference for future smart construction site development.

Table 16. Projected benefits vs. actual efficiency gains.

https://doi.org/10.1371/journal.pone.0332449.t016

4.2 Extending the applicability of the research framework

The proposed research framework demonstrates robust generalizability and adaptability across diverse regulatory and technological environments, contingent upon localized adjustments that preserve its structural integrity and evaluative rigor.

  1. (1). In regulatory contexts marked by legal heterogeneity, the framework must reconcile compliance requirements with engineering feasibility. The European Union’s General Data Protection Regulation (GDPR) imposes strict mandates on data quality, storage security, and data subject rights—including data minimization, anonymization, and breach notification within 72 hours. To operationalize these mandates, indicators such as anonymization coverage, degree of data minimization, and response timeliness should be embedded within the data layer, with dynamically adjusted weights to support privacy-by-design configurations [101,102]. A cost–benefit calibration mechanism is also essential to quantify the social welfare gains and implementation costs of privacy protection, thereby mapping these metrics to indicator thresholds and weights to avoid overregulation that may suppress data utility [103]. In contrast, the United States’ decentralized, sector-specific, and technology-driven governance model necessitates modular compliance and interoperability assessments. These should focus on interface standardization, semantic consistency, security validation, and cross-platform coordination, aligning with research trajectories in AEC semantic interoperability and semantic digital twins [20,88]. In emerging markets such as China, where policy emphasizes technological deployment and industrialization, the framework must prioritize technical integration and capacity building. Data and network security, along with industry-specific compliance, should be deeply embedded in indicator design. In high-density digital twin scenarios, joint constraints involving data, computational power, and connectivity must be holistically considered [86,94].
  2. (2). Given the dependency of digital twin maturity on data infrastructure and computational ecosystems, a stratified adaptation strategy is required. In low-resource environments, manual data entry may be permitted with credibility discounts, and thresholds for model accuracy and analytical latency may be relaxed. Intermediate conditions, often constrained by system silos and limited predictive capabilities, demand emphasis on connection efficiency, interface automation, and data backfill. High-resource environments should prioritize predictive control, real-time feedback, and closed-loop decision-making, supported by contingency protocols and manual review to mitigate risks associated with over-technologization [86,94]. Semantic interoperability should serve as the foundational layer, with ontology and semantic web technologies ensuring cross-platform consistency in data and model semantics [20,88]. Indicator design should draw from tiered maturity frameworks to structurally configure weights and thresholds, aligning the five-layer capability distribution with the project’s technological stage [87].
  3. (3). Project scale and contractual models further shape the depth and focus of evaluation. Small-scale projects typically emphasize core data acquisition and basic visualization, allowing for simplified indicator sets. Large and complex projects require assessment of advanced analytics and comprehensive data integration, consistent with empirical findings from digital twin capability stratification and BIM maturity frameworks [87]. Under Design–Bid–Build (DBB) models, characterized by elongated information chains and complex cross-organizational coordination, indicators should reinforce seamless design–construction integration, model conversion accuracy, and collaborative update mechanisms. The data layer should emphasize standard compliance, timeliness, and consistency in inter-organizational exchange, while the application layer should address digital change management and multi-party decision-making maturity. IPA analyses frequently identify the modeling and data layers as critical improvement domains [89,93,104]. Integrated delivery models such as DB/EPC and IPD, which feature centralized decision chains and collaborative emphasis, require elevated weighting of the analysis layer, focusing on real-time cost–schedule–quality analytics and predictive risk alerts. The application layer should support centralized command and rapid closure, while the modeling layer must enable collaborative design and model-driven construction. Comparative studies have demonstrated the efficiency and performance advantages of integrated delivery, providing empirical justification for these configurations [92]. Relational contracts and joint risk management mechanisms can further enhance incentive compatibility by linking data sharing and predictive control to production management and risk alert indicators [105].
  4. (4). To address cultural and organizational barriers in global deployment, a multi-dimensional strategy is essential. Trust and data governance should be reinforced through cross-organizational data sharing agreements, stratified responsibilities, and auditable catalogs, clarifying ownership, usage boundaries, and value distribution. Privacy-enhancing technologies such as anonymization and differential privacy should enable risk-tiered openness, with performance-linked incentives incorporating data contributions [90,106]. Organizational transformation and capability gaps must be addressed via tiered maturity frameworks that define staged competencies across acquisition, data, modeling, analysis, and application layers. Scenario-based training and change management initiatives can improve BIM/DT adoption and collaborative performance [91]. Semantic consistency should be ensured through domain ontologies and minimal viable semantic sets, with cross-platform fidelity and interoperability regression testing incorporated into evaluation metrics [20,88]. Privacy and ethical considerations require embedding indicators such as anonymization coverage, data minimization, response latency, and explainable appeal mechanisms within the data layer, with dynamic weighting based on jurisdictional context [101,102]. Economic trade-offs must be considered to prevent excessive constraints that undermine data value. Incentive misalignment and risk preferences should be addressed through relational contracts and joint risk management, binding predictive analytics and data sharing to compensation, risk allocation, and performance incentives. Indicator weights should be adjusted according to delivery models to reflect decision chain characteristics [90,105]. Finally, thresholds and weights must be contextually configured based on technological infrastructure. In low-to-medium resource settings, priority should be given to connectivity and data quality enhancement; in high-resource environments, emphasis should be placed on predictive control and real-time closure. IPA should be employed to identify high-importance, low-performance gaps, guiding iterative optimization [86,94].

4.3 Technical challenges and mitigation strategies

While the proposed evaluation framework leverages real-time data as a core advantage, its large-scale deployment may encounter several technical challenges. To ensure system stability, scalability, and assessment accuracy, this section systematically identifies potential issues across five critical technical dimensions and proposes targeted mitigation strategies.

  1. (1). Sensor Network Bandwidth Limitations

High-frequency, multi-source sensor data uploads can lead to link congestion and packet loss, constraining sampling rates and node scalability while amplifying end-to-end transmission bottlenecks [107]. To mitigate these issues, four strategies are proposed: (i) edge computing should be deployed near the data source to perform denoising, feature extraction, and lightweight inference, thereby reducing upstream traffic [108]; (ii) data compression and adaptive encoding techniques—such as keyframe extraction, point cloud voxelization, and differential compression—can minimize bandwidth consumption within acceptable error margins [109]; (iii) selective acquisition and in-network aggregation, enabled by event-triggered sampling and node-side filtering, ensure that only informative data variations are transmitted [110]; and (iv) event-driven control mechanisms can replace periodic full reporting, actively reducing link load [111]. These strategies collectively establish an efficient and controllable data acquisition and transmission architecture that enhances system responsiveness and reliability.
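As one concrete instance of the compression strategies in (ii), the sketch below voxel-downsamples a point cloud by averaging all points that fall into the same grid cell before upload; the voxel size and the coordinates are illustrative.

```python
from collections import defaultdict

def voxel_downsample(points, voxel_size=0.05):
    """Reduce a point cloud by averaging all points that fall in the same voxel.

    `points` is an iterable of (x, y, z) tuples in metres; voxel_size is the
    grid spacing. Both are illustrative choices.
    """
    buckets = defaultdict(list)
    for x, y, z in points:
        key = (int(x // voxel_size), int(y // voxel_size), int(z // voxel_size))
        buckets[key].append((x, y, z))
    return [tuple(sum(c) / len(c) for c in zip(*pts)) for pts in buckets.values()]

cloud = [(0.01, 0.02, 0.00), (0.02, 0.01, 0.01), (1.50, 0.40, 0.30)]
print(voxel_downsample(cloud))   # the first two points collapse into one voxel
```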

  2. (2). Data Latency and Jitter

Multi-hop routing and protocol stack overhead may impair real-time feedback and online collaborative simulation, making it difficult to maintain control and alert loops within millisecond-to-second latency budgets [112]. To address this, four optimization strategies are recommended: (i) edge–cloud layered closed-loop control, where high-frequency control logic is executed at the edge while the cloud handles low-frequency global optimization [108]; (ii) end-to-end QoS and flow control mechanisms, incorporating service-level differentiation and congestion feedback, to suppress queuing delays [113]; (iii) prioritized transmission of keyframes and event-driven data to ensure timely reporting during anomalies [111]; and (iv) high-precision time synchronization across sources, achieved through clock alignment and temporal calibration, to alleviate processing delays caused by time drift [114]. These measures collectively enhance system responsiveness and support stable collaborative simulation.
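Strategy (iii), prioritized transmission of keyframes and event data, can be sketched with a priority queue in which alarms preempt routine telemetry; the priority levels and payloads are illustrative.

```python
import heapq
import itertools

PRIORITY = {"alarm": 0, "keyframe": 1, "telemetry": 2}   # lower value = sent first
_counter = itertools.count()                              # FIFO tie-breaking
queue: list = []

def enqueue(kind: str, payload: dict) -> None:
    heapq.heappush(queue, (PRIORITY[kind], next(_counter), kind, payload))

def send_next() -> tuple:
    _, _, kind, payload = heapq.heappop(queue)
    return kind, payload

enqueue("telemetry", {"sensor": "T-12", "value": 21.7})
enqueue("alarm", {"sensor": "G-03", "gas_ppm": 410})
enqueue("keyframe", {"camera": "C-07", "frame": 18231})
print(send_next())   # the alarm leaves the queue first despite arriving later
```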

  3. (3). Computational Resource Constraints

Real-time stream processing, complex event detection, 3D simulation, and machine learning inference impose intensive demands on GPU, memory, and I/O resources. Centralized deployment may lead to performance bottlenecks and resource contention [115]. To alleviate this, four resource optimization strategies are proposed: (i) edge intelligence and distributed model deployment to offload computationally intensive tasks to peripheral nodes [108]; (ii) hybrid simulation strategies using high-fidelity models for critical scenarios and reduced-order or surrogate models elsewhere to balance accuracy and efficiency [116]; (iii) feature-level data uploads, rather than raw streams, to reduce computational and I/O burdens [109]; and (iv) elastic stream processing architectures incorporating load scaling and window-based scheduling to smooth peak system loads [113]. These strategies contribute to a scalable and efficient computational framework capable of supporting real-time evaluation and simulation in complex environments.
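The hybrid-simulation idea in (ii) can be illustrated by sampling an expensive high-fidelity model a few times, fitting a cheap polynomial surrogate, and querying the surrogate in non-critical scenarios; the "expensive" model below is only a stand-in.

```python
import numpy as np

def expensive_simulation(load: float) -> float:
    """Stand-in for a high-fidelity structural run (e.g., deflection vs. load)."""
    return 0.002 * load ** 2 + 0.15 * load + 1.2 + 0.05 * np.sin(load)

# Sample the expensive model sparsely, then fit a quadratic surrogate.
samples = np.linspace(0.0, 100.0, 6)
responses = np.array([expensive_simulation(s) for s in samples])
coefficients = np.polyfit(samples, responses, deg=2)
surrogate = np.poly1d(coefficients)

query = 37.5
print(round(expensive_simulation(query), 3), round(float(surrogate(query)), 3))
```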

  4. (4). Data Fusion Complexity

Integrating multi-source heterogeneous data presents significant challenges in synchronization, coordinate transformation, semantic alignment, and precision control, which affect the consistency and predictive reliability of the digital twin [111]. To address this, three fusion optimization strategies are proposed: (i) robust fusion mechanisms based on uncertainty modeling—using confidence scores and covariance matrices—to manage conflicts and enhance data consistency [117]; (ii) high-precision clock synchronization and spatiotemporal alignment techniques, such as PTP-level timing and sliding window calibration, to mitigate cross-source drift [114]; and (iii) unified semantic and data standards, including domain-specific ontologies for the built environment, to reduce integration friction and ensure semantic coherence [118]. These approaches significantly improve the accuracy and consistency of data fusion, thereby supporting high-quality modeling and prediction.
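Strategy (i), uncertainty-aware fusion, can be sketched as inverse-variance weighting of redundant measurements, in which each source contributes according to its reported confidence; the sources and variances are illustrative.

```python
def fuse(measurements):
    """Inverse-variance weighted fusion of scalar readings.

    `measurements` is a list of (value, variance) pairs; returns the fused
    value and its variance. Smaller variance = higher confidence.
    """
    weights = [1.0 / var for _, var in measurements]
    fused_value = sum(w * v for (v, _), w in zip(measurements, weights)) / sum(weights)
    fused_variance = 1.0 / sum(weights)
    return fused_value, fused_variance

# Crane-hook position along one axis from three sources (metres, variance in m^2):
sources = [(12.42, 0.04), (12.55, 0.25), (12.40, 0.01)]   # e.g., UWB, vision, total station
value, variance = fuse(sources)
print(round(value, 3), round(variance, 4))
```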

  5. (5). Architectural and Engineering Support Deficiencies

The absence of end-to-end QoS guarantees, flow control mechanisms, standardized metadata, and tiered data management strategies may lead to cascading degradation and traceability loss during scalable deployment [110,111]. To address this, three systemic improvement strategies are proposed: (i) integrated QoS and backpressure mechanisms enabling priority-based scheduling and congestion-aware traffic management, with edge-side aggregation to reduce network load [110,113]; (ii) event-triggered governance mechanisms to replace periodic full reporting and prevent systemic congestion [111]; and (iii) a standardized governance framework—including unified metadata structures, semantic standards, and data lineage tracking—to enhance system traceability and interoperability [20]. These improvements establish a stable and scalable architecture capable of supporting reliable operation in complex engineering environments.
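The standardized metadata and lineage tracking in strategy (iii) can be sketched as a minimal record attached to every derived dataset so that an evaluation result can be traced back to its sources; the field names and identifiers are illustrative.

```python
from dataclasses import dataclass, field
from datetime import datetime, timezone

@dataclass
class LineageRecord:
    dataset_id: str
    produced_by: str                     # processing step or model version
    source_ids: list = field(default_factory=list)
    schema_version: str = "1.0"
    created_at: str = field(
        default_factory=lambda: datetime.now(timezone.utc).isoformat()
    )

record = LineageRecord(
    dataset_id="progress-kpi-2025-07",
    produced_by="dt-analysis-layer@2.3",
    source_ids=["sensor-batch-0412", "bim-model-r17"],
)
print(record)
```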

4.4 Innovations and limitations of the study

  1. (1). Innovations: Existing research on the evaluation of digital twin maturity during the construction phase remains limited in terms of objective and quantitative evaluation methods and standards, with relatively few practical cases available. This study makes significant contributions to the methodology and empirical case analysis within the domain of digital twin maturity evaluation in construction. The innovations of this study are summarized as follows:
    1. 1). This research provides a universal methodological framework to support the evaluation of digital twin maturity in the global construction sector. The proposed comprehensive evaluation process facilitates the hierarchical processing of complex maturity assessment systems, offering simplicity in calculation, relative objectivity, and practical applicability. It enhances the accuracy of the maturity evaluation system during the construction phase and lays a solid foundation for further research on construction-phase maturity assessment.
    2. 2). The case study presented in this research fills a gap in empirical studies on digital twin maturity evaluation within the construction field. It offers practical insights for researchers and professionals in this area, helping to advance the development of digital twins in the global construction industry and providing reliable references for similar projects.
  2. (2). Limitations: This study primarily focuses on the application of digital twin technology during the construction phase, with an emphasis on progress and quality management. However, it does not incorporate the monitoring or analysis of material performance, thereby limiting the ability to directly correlate macro-level construction outcomes with micro-level material mechanisms. Prior research has demonstrated that the incorporation of polyaluminum chloride waste residue and citric acid into magnesium oxychloride cement can significantly enhance its water resistance, volumetric stability, and pore structure, while also delaying the setting time [118]. These findings suggest that future digital twin frameworks should integrate key material performance parameters to improve the comprehensiveness and explanatory power of the evaluation system. Moreover, the absence of a standardized evaluation framework for digital twin applications in China presents an additional challenge. Future research should aim to synthesize the latest global policy documents related to digital twin technologies and expand the evaluation scope to encompass the entire building lifecycle. Through extensive case-based analysis, the effectiveness of digital twin implementation can be systematically assessed. Consequently, the maturity assessment indicators should be dynamically updated in accordance with the evolving characteristics and specific objectives of digital twin technologies.

5. Conclusion

This study developed a novel quantifiable evaluation model for digital twin maturity, primarily aimed at assessing the maturity level of digital twins during the construction phase. The main conclusions are as follows:

  1. (1). Evaluation System: Based on the characteristics of digital twins during the construction phase, this study proposed a maturity model encompassing five dimensions—Acquisition Layer, Data Layer, Modeling Layer, Analysis Layer, and Application Layer—and 26 evaluation indicators, with five maturity levels defined as the classification criteria. In the weighting process, the AHP-Entropy method was utilized to mitigate the subjectivity of the weight assignments. In the comprehensive evaluation phase, an optimized matter-element model based on dynamic thresholds and nonlinear correlation degrees was introduced to resolve incompatibilities in complex systems more effectively and improve evaluation accuracy. In the feedback phase, the IPA method was employed to analyze the evaluation results and derive optimization priorities more intuitively.
  2. (2). Case Study: Through an analysis of the CAZ Innovation Industrial Park project in Xinyang, Henan Province, the digital twin maturity was determined to be at the Connected Level (Level II). Among the primary indicators, the “Analysis Layer” requires prioritized attention, while among the secondary indicators, sensor capability, data real-time performance, and self-learning capability call for improvement. Empirical validation demonstrated that the proposed maturity evaluation model is highly operable, providing accurate assessment results and optimization recommendations for the enterprise’s digital twin level and effectively guiding the application of digital twin technology.

The digital twin maturity model proposed in this study not only advances the development and application of digital twin technology during the construction phase but also provides both theoretical and practical value for the progress of digital twin technology within the construction domain. This model serves as a reference for the implementation of similar projects and offers the academic community a novel approach for further research in this field.

References

  1. 1. Chrissis MB, Konrad M, Shrum S. CMMI for development: guidelines for process integration and product improvement. 3rd ed. Boston: Addison-Wesley. 2011.
  2. 2. Bolton A, Butler L, Dabson I, Enzer M, Evans M, Fenemore T, et al. The Gemini principles. Cambridge: Centre for Digital Built Britain. 2018.
  3. 3. International Organization for Standardization. ISO 19650-2:2018 – Organization and digitization of information about buildings and civil engineering works, including BIM — Information management using BIM — Part 2: Delivery phase of the assets. Geneva: ISO. 2018.
  4. 4. Negri E, Fumagalli L, Macchi M. A Review of the Roles of Digital Twin in CPS-based Production Systems. Procedia Manufacturing. 2017;11:939–48.
  5. 5. Omrany H, Al-Obaidi KM, Husain A, Ghaffarianhoseini A. Digital Twins in the Construction Industry: A Comprehensive Review of Current Implementations, Enabling Technologies, and Future Directions. Sustainability. 2023;15(14):10908.
  6. 6. Khajavi SH, Motlagh NH, Jaribion A, Werner LC, Holmstrom J. Digital Twin: Vision, Benefits, Boundaries, and Creation for Buildings. IEEE Access. 2019;7:147406–19.
  7. 7. Glaessgen E, Stargel DS. The digital twin paradigm for future NASA and US Air Force vehicles. In: Proceedings of the 53rd AIAA/ASME/ASCE/AHS/ASC Structures, Structural Dynamics and Materials Conference, Reston (VA), 2012.
  8. 8. Grieves M. Digital twin: manufacturing excellence through virtual factory replication. https://www.3ds.com/fileadmin/PRODUCTS-SERVICES/DELMIA/PDF/Whitepaper/DELMIA-APRISO-Digital-Twin-Whitepaper.pdf. 2021.
  9. 9. Guo J, Lv Z. Application of Digital Twins in multiple fields. Multimed Tools Appl. 2022;81(19):26941–67. pmid:35194381
  10. 10. Yitmen I, Alizadehsalehi S, Akıner İ, Akıner ME. An adapted model of cognitive digital twins for building lifecycle management. Appl Sci. 2021;11:4276.
  11. 11. Akanmu AA, Anumba CJ, Ogunseiju OO. Towards next generation cyber-physical systems and digital twins for construction. J Inf Technol Constr. 2021;26(27):505–25.
  12. 12. Arslan M, Cruz C, Ginhac D. Visualizing intrusions in dynamic building environments for worker safety. Saf Sci. 2019;120:428–46.
  13. 13. Long W, Bao Z, Chen K, Ng ST, Wuni IY. Developing an integrative framework for digital twin applications in the building construction industry: a systematic literature review. Adv Eng Inform. 2024;59:102346.
  14. 14. Jiang W, Ding L, Zhou C. Digital twin: stability analysis for tower crane hoisting safety with a scale model. Autom Constr. 2022;138:104257.
  15. 15. Li M, Lu Q, Bai S, Zhang M, Tian H, Qin L. Digital twin-driven virtual sensor approach for safe construction operations of trailing suction hopper dredger. Autom Constr. 2021;132:103961.
  16. 16. Wu S, Hou L, Zhang GK, Chen H. Real-time mixed reality-based visual warning for construction workforce safety. Autom Constr. 2022;139:104252.
  17. 17. Kamari M, Ham Y. AI-based risk assessment for construction site disaster preparedness through deep learning-based digital twinning. Autom Constr. 2022;134:104091.
  18. 18. Gao Y, Chang D, Chen CH, Xu Z. Design of digital twin applications in automated storage yard scheduling. Adv Eng Inform. 2022;51:101477.
  19. 19. Tran H, Nguyen TN, Christopher P, Bui D-K, Khoshelham K, Ngo TD. A digital twin approach for geometric quality assessment of as-built prefabricated façades. Journal of Building Engineering. 2021;41:102377.
  20. 20. Boje C, Guerriero A, Kubicki S, Rezgui Y. Towards a semantic construction digital twin: directions for future research. Autom Constr. 2020;114:103179.
  21. 21. Pan Y, Zhang L. A BIM-data mining integrated digital twin framework for advanced project management. Autom Constr. 2021;124:103564.
  22. 22. Kaewunruen S, Xu N. Digital Twin for Sustainability Evaluation of Railway Station Buildings. Front Built Environ. 2018;4:430624.
  23. 23. Gerhard D, Neges M, Wolf M. Transfer of digital twin concepts to the production of precast concrete parts in the construction industry. ZWF. 2020;115:58–61.
  24. 24. Angjeliu G, Coronelli D, Cardani G. Development of the simulation model for digital twin applications in historical masonry buildings: the integration between numerical and experimental reality. Comput Struct. 2020;238:106282.
  25. 25. Chen S, Fan G, Li J. Improving completeness and accuracy of 3D point clouds by using deep learning for applications of digital twins to civil structures. Adv Eng Inform. 2023;58:102196.
  26. 26. Ritto TG, Rochinha FA. Digital twin, physics-based model, and machine learning applied to damage detection in structures. Mechanical Systems and Signal Processing. 2021;155:107614.
  27. 27. Evans S, Savian C, Burns A, Cooper C. Digital twins for the built environment: an introduction to the opportunities, benefits, challenges and risks. Built Environ News. 2019.
  28. 28. Grieves M. Digital model, digital shadow, digital twin. In: 2023.
  29. 29. Liu Y, Feng J, Lu J, Zhou S. A review of digital twin capabilities, technologies, and applications based on the maturity model. Adv Eng Inform. 2024;62:102592.
  30. 30. Chen L, Xie X, Lu Q, Parlikad AK, Pitt M, Yang J. Gemini Principles-Based Digital Twin Maturity Model for Asset Management. Sustainability. 2021;13(15):8224.
  31. 31. Uhlenkamp J-F, Hauge JB, Broda E, Lutjen M, Freitag M, Thoben K-D. Digital Twins: A Maturity Model for Their Classification and Evaluation. IEEE Access. 2022;10:69605–35.
  32. 32. Hu W, Fang J, Zhang T, Liu Z, Tan J. A new quantitative digital twin maturity model for high-end equipment. J Manuf Syst. 2023;66:248–59.
  33. 33. Bai L, Wang H, Huang N, Du Q, Huang Y. An Environmental Management Maturity Model of Construction Programs Using the AHP-Entropy Approach. Int J Environ Res Public Health. 2018;15(7):1317. pmid:29937508
  34. 34. Kou G, Dinçer H, Yüksel S, Deveci M. Synergistic integration of digital twins and sustainable industrial internet of things for new generation energy investments. J Adv Res. 2024;66:39–46. pmid:38008174
  35. 35. Chen ZS, Zhou MD, Chin KS, Darko A, Wang XJ, Pedrycz W. Optimized decision support for BIM maturity assessment. Autom Constr. 2023;149:104808.
  36. 36. Kocaoglu B, Kirmizi M. Prescriptive digital transformation maturity model: a development and validation study. Kybernetes. 2025;54(5):2662–705. https://doi.org/10.1108/K-02-2023-0243
  37. 37. Wang M, Ashour M, Mahdiyar A, Sabri S. Opportunities and Threats of Adopting Digital Twin in Construction Projects: A Review. Buildings. 2024;14(8):2349.
  38. 38. Deng M, Menassa CC, Kamat VR. From BIM to digital twins: a systematic review of the evolution of intelligent building representations in the AEC-FM industry. J Inf Technol Constr. 2021;26(5):58–83.
  39. 39. Chen ZS, Chen KD, Xu YQ, Pedrycz W, Skibniewski MJ. Multiobjective optimization-based decision support for building digital twin maturity measurement. Adv Eng Inform. 2024;59:102245.
  40. 40. Afzal M, Li RYM, Shoaib M, Ayyub MF, Tagliabue LC, Bilal M, et al. Delving into the digital twin developments and applications in the construction industry: A PRISMA approach. Sustainability. 2023;15(23):16436. https://doi.org/10.3390/su152316436
  41. 41. Alnaser AA, Hassan Ali A, Elmousalami HH, Elyamany A, Gouda Mohamed A. Assessment framework for BIM-digital twin readiness in the construction industry. Buildings. 2024;14(1):268.
  42. 42. Li T, Rui Y, Zhao S, Zhang Y, Zhu H. A quantitative digital twin maturity model for underground infrastructure based on D-ANP. Tunn Undergr Space Technol. 2024;146:105612.
  43. 43. Wei Y, Lei Z, Altaf S. An Off-Site Construction Digital Twin Assessment Framework Using Wood Panelized Construction as a Case Study. Buildings. 2022;12(5):566.
  44. 44. Duan H, Tian F. The development of standardized models of digital twin. IFAC-PapersOnLine. 2020;53(5):726–31.
  45. 45. Grieves M, Vickers J. Digital twin: mitigating unpredictable, undesirable emergent behavior in complex systems. In: Kahlen FJ, Flumerfelt S, Alves A, editors. Transdisciplinary perspectives on complex systems: new findings and approaches. Cham: Springer. 2017:85–113.
  46. 46. Liu H, Yin Z, Chen S, Yang Y, Tian H. Development of an Assessment of Ethics for Chinese Physical Education Teachers: A Study Using the Delphi and Expert Ranking Methods. Int J Environ Res Public Health. 2022;19(19):11905. pmid:36231205
  47. 47. Chuangchang P, Thinnukool O, Tongkumchum P. Modelling urban growth over time using grid-digitized method with variance inflation factors applied to spatial correlation. Arab J Geosci. 2016;9:1–13.
  48. 48. Gransberg DD, Popescu CM, Ryan R. Construction equipment management for engineers, estimators, and owners. Boca Raton (FL): CRC Press; 2006.
  49. 49. Madubuike OC, Anumba CJ, Khallaf R. A review of digital twin applications in construction. J Inf Technol Constr. 2022;27:1–27.
  50. 50. Zhang H, Zhou Y, Zhu H, Sumarac D, Cao M. Digital Twin-Driven Intelligent Construction: Features and Trends. Structural Durability & Health Monitoring. 2021;15(3):183–206.
  51. 51. Riku A. Sensor data transmission from a physical twin to a digital twin. Espoo (Finland): Aalto University. 2019.
  52. 52. Botín-Sanabria DM, Mihaita A-S, Peimbert-García RE, Ramírez-Moreno MA, Ramírez-Mendoza RA, Lozoya-Santos J de J. Digital Twin Technology Challenges and Applications: A Comprehensive Review. Remote Sensing. 2022;14(6):1335.
  53. 53. Broo DG, Bravo-Haro M, Schooling J. Design and implementation of a smart infrastructure digital twin. Autom Constr. 2022;136:104171.
  54. 54. Pregnolato M, Gunner S, Voyagaki E, De Risi R, Carhart N, Gavriel G. Towards Civil Engineering 4.0: Concept, Workflow and Application of Digital Twins for Existing Infrastructure. Autom Constr. 2022;141:104421.
  55. 55. International Organization for Standardization. ISO 23247-1:2021 – Automation systems and integration – Digital twin framework for manufacturing – Part 1: Overview and general principles. Geneva: ISO. 2021. https://www.iso.org/standard/75066.html
  56. 56. Lu Q, Parlikad AK, Woodall P, Ranasinghe GD, Heaton J. Developing a dynamic digital twin at a building level: using Cambridge campus as case study. In: International Conference on Smart Infrastructure and Construction 2019 (ICSIC): driving data-informed decision-making, Cambridge, UK, 2019. 67–75.
  57. 57. Lu Y, Liu C, Kevin I, Wang K, Huang H, Xu X. Digital twin-driven smart manufacturing: connotation, reference model, applications and research issues. Robot Comput Integr Manuf. 2020;61:101837.
  58. 58. Schroeder GN, Steinmetz C, Pereira CE, Espindola DB. Digital Twin Data Modeling with AutomationML and a Communication Methodology for Data Exchange. IFAC-PapersOnLine. 2016;49(30):12–7.
  59. 59. Lu Q, Chen L, Lee S, Zhao X. Activity theory-based analysis of BIM implementation in building O&M and first response. Autom Constr. 2018;85:317–32.
  60. 60. Hasan SM, Lee K, Moon D, Kwon S, Jinwoo S, Lee S. Augmented reality and digital twin system for interaction with construction machinery. Journal of Asian Architecture and Building Engineering. 2021;21(2):564–74.
  61. 61. Negri E, Ardakani HD, Cattaneo L, Singh J, Macchi M, Lee J. A Digital Twin-based scheduling framework including Equipment Health Index and Genetic Algorithms. IFAC-PapersOnLine. 2019;52(10):43–8.
  62. 62. Zhu Z, Liu C, Xu X. Visualisation of the Digital Twin data in manufacturing by using Augmented Reality. Procedia CIRP. 2019;81:898–903.
  63. 63. Fan Y, Yang J, Chen J, Hu P, Wang X, Xu J. A digital-twin visualized architecture for flexible manufacturing system. J Manuf Syst. 2021;60:176–201.
  64. 64. Haraguchi M, Funahashi T, Biljecki F. Assessing governance implications of city digital twin technology: A maturity model approach. Technological Forecasting and Social Change. 2024;204:123409.
  65. 65. Agrawal A, Fischer M, Singh V. Digital twin: from concept to practice. J Manag Eng. 2022;38(3):06022001.
  66. 66. Qi Q, Tao F. Digital Twin and Big Data Towards Smart Manufacturing and Industry 4.0: 360 Degree Comparison. IEEE Access. 2018;6:3585–93.
  67. 67. Lee D, Lee SH. Digital twin for supply chain coordination in modular construction. Appl Sci. 2021;11(13):5909.
  68. 68. Madni AM, Madni CC, Lucero SD. Leveraging Digital Twin Technology in Model-Based Systems Engineering. Systems. 2019;7(1):7.
  69. 69. Kor M, Yitmen I, Alizadehsalehi S. An investigation for integration of deep learning and digital twins towards Construction 4.0. Smart Sustain Built Environ. 2023;12(3):461–487. https://doi.org/10.1108/SASBE-08-2021-0148
  70. 70. Barth L, Ehrat M, Fuchs R, Haarmann J. Systematization of digital twins: ontology and conceptual framework. In: Proceedings of the 3rd International Conference on Information Science and Systems, Cambridge, UK, 2020. 13–23.
  71. 71. Moyne J, Qamsane Y, Balta EC, Kovalenko I, Faris J, Barton K, et al. A Requirements Driven Digital Twin Framework: Specification and Opportunities. IEEE Access. 2020;8:107781–801.
  72. 72. AlBalkhy W, Karmaoui D, Ducoulombier L, Lafhaj Z, Linner T. Digital twins in the built environment: definition, applications, and challenges. Autom Constr. 2024;162:105368.
  73. 73. Delgado JMD, Oyedele L. Digital twins for the built environment: learning from conceptual and process models in manufacturing. Advances in Engineering Information. 2021;49:101332.
  74. 74. Opoku DGJ, Perera S, Osei-Kyei R, Rashidi M. Digital twin application in the construction industry: a literature review. J Build Eng. 2021;40:102726.
  75. 75. Hu W, Lim KYH, Cai Y. Digital twin and industry 4.0 enablers in building and construction: a survey. Buildings. 2022;12(11):2004.
  76. 76. Sacks R, Brilakis I, Pikas E, Xie HS, Girolami M. Construction with digital twin information systems. DCE. 2020;1:e14.
  77. 77. Sepasgozar S, Khan A, Smith K, Romero J, Shen X, Shirowzhan S, et al. BIM and Digital Twin for Developing Convergence Technologies as Future of Digital Construction. Buildings. 2023;13(2):441.
  78. 78. Cai W. The extension set and incompatibility problem. J Sci Explor. 1983;1:81–93.
  79. 79. Islam T, Pruyt E. Scenario generation using adaptive sampling: the case of resource scarcity. Environ Model Softw. 2016;79:285–99.
  80. 80. Twomey D, Gorse D. ASTra: a novel algorithm-level approach to imbalanced classification. In: International Conference on Artificial Neural Networks; 2022 Sep 6–9; Bristol, UK. Cham: Springer Nature Switzerland; 2022:569–80.
  81. 81. Zhang H, Na J, Zhang B. Autonomous Internet of Things (IoT) Data Reduction Based on Adaptive Threshold. Sensors (Basel). 2023;23(23):9427. pmid:38067800
  82. 82. Silverman BW. Density estimation for statistics and data analysis. Boca Raton (FL): Routledge. 2018.
  83. 83. Friedman JH. Greedy function approximation: a gradient boosting machine. Ann Stat. 2001;29(5):1189–232.
  84. 84. Zhang GF. Investigation and research of college landscape satisfaction based on IPA. J Chaohu Univ. 2016;18(5):61–6.
  85. 85. Feng L, Zhao J. Evaluation of urban park landscape satisfaction based on the fuzzy‐IPA model: a case study of the Zhengzhou People’s Park. Math Probl Eng. 2022;2022:2116532.
  86. 86. Fuller A, Fan Z, Day C, Barlow C. Digital Twin: Enabling Technologies, Challenges and Open Research. IEEE Access. 2020;8:108952–71.
  87. 87. Succar B. Building information modelling framework: a research and delivery foundation. Autom Constr. 2009;18(3):357–75.
  88. 88. Pauwels P, Zhang S, Lee YC. Semantic web technologies in AEC industry: a literature overview. Autom Constr. 2017;73:145–65.
  89. 89. Merschbrock C, Munkvold BE. Effective digital collaboration in the construction industry: a case study of BIM deployment in a hospital construction project. Comput Ind. 2015;73:1–11.
  90. 90. Papadonikolaki E, Vrijhoef R, Wamelink H. Adoption of BIM in the supply chain: a systemic view. Int J Proj Manag. 2016;34(6):1025–46.
  91. 91. Kent DC, Becerik-Gerber B. Understanding construction industry experience with building information modeling. J Constr Eng Manag. 2010;136(6):815–25.
  92. 92. El Asmar M, Hanna AS, Loh WY. Quantifying performance for the integrated project delivery system as compared to established delivery systems. J Constr Eng Manag. 2013;139(11):04013012.
  93. 93. Martilla JA, James JC. Importance–performance analysis. J Mark. 1977;41(1):77–9.
  94. 94. Jones D, Snider C, Nassehi A, Yon J, Hicks B. Characterising the Digital Twin: A systematic literature review. CIRP Journal of Manufacturing Science and Technology. 2020;29:36–52.
  95. 95. Doloi H, Sawhney A, Iyer KC, Rentala S. Analysing factors affecting delays in Indian construction projects. Int J Proj Manag. 2012;30(4):479–89.
  96. 96. Boschert S, Rosen R. Digital twin—the simulation aspect. In: Hehenberger P, Bradley D, editors. Mechatronic futures: challenges and solutions for mechatronic systems and their designers. Cham: Springer. 2016:59–74.
  97. 97. Huang H, Zhang J, Li X. Application of digital twin in construction project safety management. J Civ Eng Manag. 2024;30(5):401–15.
  98. 98. Tang P, Huber D, Akinci B, Lipman R, Lytle A. Automatic reconstruction of as-built building information models from laser-scanned point clouds: a review of related techniques. Autom Constr. 2010;19(7):829–43.
  99. 99. Tao F, Zhang H, Liu A, Nee AYC. Digital Twin in Industry: State-of-the-Art. IEEE Trans Ind Inf. 2019;15(4):2405–15.
  100. 100. Kritzinger W, Karner M, Traar G, Henjes J, Sihn W. Digital twin in manufacturing: a categorical literature review and classification. IFAC-PapersOnLine. 2018;51(11):1016–22.
  101. 101. Tikkinen-Piri C, Rohunen A, Markkula J. EU General Data Protection Regulation: Changes and implications for personal data collecting companies. Computer Law & Security Review. 2018;34(1):134–53.
  102. 102. Sweeney L. k-anonymity: a model for protecting privacy. Int J Unc Fuzz Knowl Based Syst. 2002;10(05):557–70.
  103. 103. Acquisti A, Taylor C, Wagman L. The economics of privacy. J Econ Lit. 2016;54(2):442–92.
  104. 104. Songer JM, Molenaar KR. Project characteristics for successful public-sector design–build. J Constr Eng Manag. 1997;123(1):34–40.
  105. 105. Rahman MM, Kumaraswamy MM. Potential for implementing relational contracting and joint risk management. Constr Manag Econ. 2004;22(2):131–42.
  106. 106. Milberg SJ, Smith HJ, Burke SJ. Information privacy: corporate management and national regulation. J Int Bus Stud. 2000;31(6):823–46.
  107. 107. Beuchert J, Solowjow F, Trimpe S, Seel T. Overcoming Bandwidth Limitations in Wireless Sensor Networks by Exploitation of Cyclic Signal Patterns: An Event-triggered Learning Approach. Sensors (Basel). 2020;20(1):260. pmid:31906455
  108. 108. Shi W, Cao J, Zhang Q, Li Y, Xu L. Edge Computing: Vision and Challenges. IEEE Internet Things J. 2016;3(5):637–46.
  109. 109. Lindstrom P. Fixed-Rate Compressed Floating-Point Arrays. IEEE Trans Vis Comput Graph. 2014;20(12):2674–83. pmid:26356981
  110. 110. Fasolo E, Rossi M, Widmer J, Zorzi M. In-network aggregation techniques for wireless sensor networks: a survey. IEEE Wireless Commun. 2007;14(2):70–87.
  111. 111. Heemels WPMH, Donkers MCF, Teel AR. Periodic event-triggered control for linear systems. IEEE Trans Autom Control. 2013;58(4):847–61.
  112. 112. Mao Y, You C, Zhang J, Huang K, Letaief KB. A Survey on Mobile Edge Computing: The Communication Perspective. IEEE Commun Surv Tutorials. 2017;19(4):2322–58.
  113. 113. Al-Fuqaha A, Guizani M, Mohammadi M, Aledhari M, Ayyash M. Internet of Things: A Survey on Enabling Technologies, Protocols, and Applications. IEEE Commun Surv Tutorials. 2015;17(4):2347–76.
  114. 114. Sundararaman B, Buy U, Kshemkalyani AD. Clock synchronization for wireless sensor networks: a survey. Ad Hoc Networks. 2005;3(3):281–323.
  115. 115. Zhou Z, Chen X, Li E, Zeng L, Luo K, Zhang J. Edge Intelligence: Paving the Last Mile of Artificial Intelligence With Edge Computing. Proc IEEE. 2019;107(8):1738–62.
  116. 116. Gomes C, Thule C, Broman D, Larsen PG, Vangheluwe H. Co-simulation: state of the art. Simul Model Pract Theory. 2017;70:134–56.
  117. 117. Khaleghi B, Khamis A, Karray FO, Razavi SN. Multisensor data fusion: A review of the state-of-the-art. Information Fusion. 2013;14(1):28–44.
  118. 118. Xu P, Guo Y, Ding Y, Li H, Chen T, Wang H. Effect of polymeric aluminum chloride waste residue and citric acid on the properties of magnesium oxychloride cement. J Building Eng. 2025;101:111864.