
Digital Twin: Bringing MEP Models to Life


The AEC industry is moving toward Industry 4.0, and digital twins are a major part of that trend. As MEP design processes become more intelligent, digital twins are now feasible to build and offer many advantages. They are powerful tools that can enable building owners to create smart buildings and operate them with fewer resources. MEP models and point cloud scans are becoming more accessible as sensors become more affordable. So how can we use MEP models after project handover to enhance the operation of buildings throughout their lifecycles?

This article provides a general introduction to the digital twin. We also discuss the process of converting MEP models to digital twins using Internet of Things (IoT) devices—specifically, how to connect MEP models and sensors using cloud technology, Dynamo, and Revit 2020 software. We will also show how to visualize live IoT data with BI dashboards to increase awareness of how people use buildings and to enhance the building maintenance process. Learn how to bring MEP models to life as digital twins, meet client needs, and make the built environment more sustainable.

What Is a Digital Twin?

"Digital twin is a set of virtual information constructs that fully describes a potential or actual physical manufactured product from the micro atomic level to the macro geometrical level." —Michael Grievers

The AEC industry is slowly catching up with the aerospace and manufacturing industries when it comes to digital twins. There, the digital twin has been used extensively for prototyping new products and for verifying the performance of production lines, physical objects, and systems. It was not used widely within the AEC industry until very recently. With the increase in available IoT sensors and the desire for more data to verify asset performance, digital twins have become more common.

Gartner forecasts that within the next few years there will be over 20 billion connected sensors and potentially billions of connected objects, and that the number of organizations using digital twins will triple by 2022. Gartner placed digital twins at the top of its Hype Cycle for Emerging Technologies in 2018, with a forecasted peak in 5 to 10 years.


Historically, the only way to gain knowledge of a building was to have direct physical contact with the building itself. All the data about the building and its performance was contained within the building, stored in static documentation formats such as paper or computer files. Digital twins build the bridge between the physical and digital worlds, allowing data to flow in real time or near-real time, so the data becomes alive.

The digital twin concept was first noted in 2002 by Michael Grieves at the University of Michigan as part of Product Lifecycle Management (PLM). His idea was that the real-space and virtual-space worlds would be linked throughout the lifecycle of the system. Initially it was known as the Mirrored Spaces Model, later as the Mirrored Information Model, before becoming known as the digital twin.

Digital twin is described as the bi-directional flow of data between “virtual space” (the digital representation) and “real space” (the physical asset). The data then needs to be accessible in real time or near-real time to build a complete digital picture of the physical asset.

Digital twin relationship between the physical and virtual worlds.

Digital twins started as basic CAD documentation of the physical world in the late twentieth century, but with the growth of BIM working processes the concept has become a much closer representation of reality. With the ability to assign parametric data to objects, the representations could move beyond a purely physical description to a functional one as well. Recently, with the growth of IoT technologies, it became possible to stream live data from objects and systems in the physical world to a remote location, analyze the data, and react to modify conditions of the physical object in its actual location. This moves the CAD object from being a 2D/3D representation arbitrarily positioned in space to being a representation of the physical object that demonstrates not only its form but its behavior as well.

Digital twins can be used both for prototyping objects and for verifying and controlling physical objects. An object can be modeled in a pure digital environment with software and subjected to digital simulations to test its limits and functional qualities before it is produced. This is a huge cost and time savings. Previously, to conduct this simulation a precise physical representation would be created and subjected to physical tests, which often resulted in the destruction of the object and the need to create a new, modified representation to continue testing.

Related: Forge with BIM 360 Docs as an IoT Hub with Tiago Ricotta

Now when prototyping, the simulations can be conducted digitally and modifications made in real time without the need to produce any physical object before it is ready to be tested in the environment where it is intended to be used. When individual objects are assembled into a system the complexity increases and the ability to evaluate and understand the system decreases. Digital twins assist in testing and evaluating the individual parts of a complex system.

Design process.

When discussing digital twins, there are three distinct environments where they are active:

Digital twin prototype–This type of digital twin describes the prototype of a physical system. It contains the information sets necessary to describe and produce a physical version that duplicates or twins the virtual version.

Digital twin instance–This type of digital twin describes a specific corresponding physical system that the digital twin remains linked to throughout the life of that physical system.

Digital twin environment–This is an integrated, multidomain physics application space for operating on digital twins, used for a variety of purposes. There are two parts to this environment: the predictive, where the digital twin is used for predicting the future behavior and performance of a system and for testing the system to verify that it will function within an acceptable range; and the interrogative, where actual physical systems are queried for their current and historical performance. Multiple systems can be compared to find patterns where a system does not meet the design criteria, so future systems can be modified accordingly.

Digital twins can be made using 3D models, 3D scans, and even 2D documentation. The requirement to qualify as a digital twin is the link between the physical and virtual worlds where data is transmitted bi-directionally between the two worlds.


Historic Digital Twins

Looking back to see where the digital twin concept came from, one can look to NASA in the 1960s during the first trips to the moon. NASA built exact replicas of everything that was launched into space. During production these replicas were prototypes of the actual objects, and after the objects were on their way to remote destinations, they became a twin of the equipment in use. All modifications made by the astronauts on their way into space were also made to the twin. This is probably best documented during the ill-fated Apollo 13 mission to the moon, where there was a serious malfunction in the service module two days into the journey. Before mission control sent instructions to the astronauts, simulations were run on what we could call an “analog twin” to test all decisions before implementing them on the physical object hundreds of thousands of kilometers away. This is known as a Mirrored System, since physical modifications were made to a twin physical object even though digital calculations were made as well.

NASA mission control dashboard. 

Mission control could be seen as a dashboard for the mission where all data was displayed and used to monitor and verify the status of the mission in real time. With the technology available at the time, the dashboard required the space of an auditorium.

Digital twins are also used in Formula 1 racing. While the car is racing at 300 km/h on the track, a team of technicians and engineers sits remotely in the pit, monitoring all the stresses on the car in real time and making small modifications to ensure the car performs at the highest level possible. Before getting to the racetrack, digital twins are used to simulate the performance of the car so the best possible prototypes can be produced. If a physical prototype were needed for testing before every race, there would not be enough time to produce the best possible cars. Digital twins help save time and develop better products by reducing the design time required for each iteration of the car.

Vodafone McLaren Mercedes dashboard.

Adding sensors to a building is not difficult. The challenge is gathering, structuring, and analyzing the data so it is useful downstream without further investment to make it usable. One of the biggest challenges is sharing the data across the multitude of systems that could use it. Many data systems are closed, so getting access to the data and ensuring its structure is usable can be difficult.

Unlike industrial asset-centric businesses and discrete process manufacturing, the data captured in buildings has an extra, unpredictable factor: the interaction of people. This makes the assets more dynamic, so it is no longer just predictive maintenance of an asset; it becomes controlling a living entity.

Digital twins are useful at every stage of a project’s lifecycle. Currently, their focus is primarily on the operations and maintenance phases, where there is a lot of data available to be captured and the owner’s interest is focused on the ability to control the operations of their assets. The benefit of the digital twin is the insight gained from the harvested data, which can be used proactively to improve performance and inform the design of the next project.

Creating a Digital Twin

When creating a digital twin, the system needs to be planned from the beginning rather than imposed on a project at a late stage. The data required, how the data is generated, how the data is received, how the data is stored, who has access to the data, and the types of digital models required must be planned. After the framework is in place, then the technology to be integrated can be selected for the physical asset to enable the capture of the real time flow of data.

The function of the digital twin needs to be defined. Will it be just for monitoring an asset, or will it control the systems in the asset as well? Will the data be used for advanced analytics and predictive maintenance? The answers to these questions will drive the decisions that define the sensors to be used, the way data is captured, and the applications required for interpreting the data.

The implementation of the digital twin can start small with a single system and be expanded over time. It can start as a series of smaller system-specific digital twins that are assembled to create the full picture of an asset. It is better to layer the data rather than to continually start new digital twins.

IoT plays a central role in the creation of digital twins. IoT refers to uniquely identifiable objects and their virtual representations in an Internet-like structure. The transmission of data from sensors to storage devices is a critical connection; without it, the system cannot exist. It is necessary to select the correct network type, the correct protocol, and the correct transmission frequency to transmit the data.
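As a concrete illustration, MQTT is one lightweight protocol commonly used for this kind of sensor-to-cloud transmission. The minimal sketch below, which is not code from our project, shows how a single reading might be published from a sensor gateway in Python using the paho-mqtt library; the broker address, topic, and payload fields are all assumptions for illustration.

```python
# Minimal sketch: publish one sensor reading to an MQTT broker.
# Broker host, topic, and payload schema are hypothetical examples.
import json
import time

import paho.mqtt.client as mqtt  # pip install paho-mqtt

BROKER_HOST = "broker.example.com"   # assumed cloud IoT endpoint
TOPIC = "building/canteen/sound"     # assumed topic naming scheme

client = mqtt.Client()
client.connect(BROKER_HOST, 1883)

reading = {
    "sensor_id": "canteen-01",       # hypothetical sensor name
    "timestamp": time.time(),        # epoch seconds
    "value": 62.4,                   # e.g. sound level in dB
    "unit": "dB",
}

# Publish as JSON so downstream consumers can parse a consistent schema.
client.publish(TOPIC, json.dumps(reading))
client.disconnect()
```

Publishing a consistent JSON schema like this is one way to address the data-structuring challenge described earlier: every downstream system reads the same fields regardless of the sensor vendor.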

Evaluating Behavior

Digital twins can help prevent serious accidents through real-time monitoring of the physical asset. By combining 3D scanning and sensors, there is the opportunity to monitor existing assets for which a digital model has never been created.

Digital twins are used for predicting the future behavior and performance of physical systems. During the prototyping stage, this means checking the behavior of the designed system and its associated components to verify that the as-designed system meets the proposed requirements.

The design phase provides the perfect opportunity to use digital twins by evaluating the virtual representation of the designed system. The behavior of systems can be verified virtually, and the data structure of the model can be established early, so the data generated can be used downstream well into the construction and operation of the project. However, this requires that the information requirements are established before modeling and that the data is structured properly.

There are four possible outcomes when evaluating the behavior of a system. This is not only a measurement of the success or failure of the system, but a means to find faults and correct issues before creating the physical system.

Categories of system behavior. 

 

Predicted Desirable–The system performs as predicted.

Unpredicted Desirable–The performance of the system results in unexpected surprises. This outcome offers new benefits that were not originally planned for, with no detrimental effects from using the system as designed.

Predicted Undesirable–The system fails as predicted and will require modification. Still, the system is performing as planned.

Unpredicted Undesirable–The system fails when not expected to fail and requires redesign. If not addressed, this can result in catastrophic failures. Through simulations this outcome can be minimized, but there is always a risk of it in the physical system: if an outcome was never considered, it was never tested for.

An interrogative digital twin could apply to digital twin instances that could be queried for their current and past histories irrespective of where their physical counterpart resided in the world. Individual instances could be interrogated for their current system state.

System Lifecycle

During the system lifecycle, there are two flows of data. The first follows the creation of the physical system, where data flows forward, as it traditionally has, from creation to production to operation to disposal. Data for a digital twin, however, flows in reverse: data from a later phase informs the previous stage. This data can be used to improve the performance of the systems by finding the weaknesses and failures that need refinement.

Virtual and physical data flow. 

Creation–During this phase the characteristics and behavior of the system are defined. Desirable attributes are defined, and undesirable attributes are identified. Strategies to mitigate the undesirable attributes are developed to prevent them from occurring.

Production–The physical system is created. This is the phase where the manufacturability or constructability of the system is tested. At this point there is the possibility of undesirable behaviors starting to appear.

Operation–The physical system is tested. At this point all undesirable behaviors should be found and resolved, although there is still the possibility of unforeseen undesirable behaviors being discovered.

Disposal–This is the decommissioning of the system. Decommissioning is typically ignored, but it does require consideration. The knowledge acquired through the previous stages is often lost during decommissioning, yet the information generated during the disposal phase can be used in the design of the next generation of the element.

Digital twins are used to understand problems that are too complex for human understanding. Models and their associated data can be brought together to inexpensively check for conflicts and clashes so the physical model can be created more efficiently. Previously, the conversion of 2D documentation to a physical object was an inefficient, iterative process. Now models can be created and simulated in a virtual system, so when physical models are created it is primarily for final testing and verification. Destructive testing is conducted on the virtual models, which has minimal time and cost implications. More testing can be conducted, and time and waste material are minimized.

Digital twin implementation model. 

The Value of the Digital Twin

The value of the digital twin lies in the data and its connection from the physical system back to the virtual system. Large amounts of data can be generated and used to inform design and operational decisions. It is important that the data created is reviewed and analyzed to gain a greater understanding of the environment affecting the physical system. Data replaces wasted physical materials, time, labor, and energy over the lifecycle of the system. Data is never free to acquire; it requires resources such as planning, implementation, sensors, software, storage, and time. But the cost of acquiring data is less than the cost of the physical waste of operating an underperforming system. The greatest gains are made during the creation stage, which reduces the amount of trial and error during the production phase. For MEP projects, digital twins can improve the performance of systems, improve the indoor environment, reduce energy consumption, and reduce operating costs.

Issues with Data

There are still security risks associated with digital twins. Typically, the data is stored in the cloud, so there is no physical infrastructure to maintain on the data storage side. However, there is a massive amount of data being collected from endpoints, and each of these endpoints is a potential point of weakness in the system. Data can be compromised between the endpoints and the cloud. Users of the data should have defined roles, and it is best if the information transmitted is encrypted. The devices must also have rights to send data over the existing IT infrastructure.

Bad or misleading data can lead to errors, so the data must be validated and trustworthy. All sensors must send data that is correct, calibrated, in the correct format, and consistent with the other sensors connected to the same system.
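As a simple illustration of what such validation might look like, the sketch below (an assumption on our part, not code from the project) rejects readings that are stale, out of a plausible range, or in the wrong unit before they reach storage. The field names and thresholds are hypothetical.

```python
# Minimal validation sketch for incoming sensor readings.
# Field names and thresholds are hypothetical examples.
import time

MAX_AGE_S = 300              # reject readings older than 5 minutes
VALID_RANGE = (20.0, 120.0)  # plausible sound levels in dB
EXPECTED_UNIT = "dB"

def is_valid(reading: dict) -> bool:
    """Return True if a reading is fresh, in range, and in the right unit."""
    if reading.get("unit") != EXPECTED_UNIT:
        return False
    if time.time() - reading.get("timestamp", 0) > MAX_AGE_S:
        return False
    value = reading.get("value")
    return value is not None and VALID_RANGE[0] <= value <= VALID_RANGE[1]

# Example: drop bad readings before they are stored.
readings = [
    {"sensor_id": "canteen-01", "timestamp": time.time(), "value": 62.4, "unit": "dB"},
    {"sensor_id": "canteen-02", "timestamp": 0, "value": 250.0, "unit": "dB"},
]
clean = [r for r in readings if is_valid(r)]  # keeps only the first reading
```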

Our Process

We were inspired by Project Dasher and the Autodesk bridge project at Pier 9. The idea of linking sensors and models became the goal. After a call with Kean Walmsley we quickly found out that we would need to change our strategy. Getting the project into Project Dasher would not be as easy as we originally had hoped. We also saw the cabling challenges that the bridge project exposed, so we wanted to go wireless. From here we embarked on our digital twin journey.

Our initial intention was to create a digital twin of our office. This is an interrogative digital twin, where an existing building is monitored to verify its performance. First, we investigated creating a digital twin of the canteen in our building, with a live dashboard on the company intranet showing how much traffic was in the canteen at any given point in time. This would help find the best time to go to lunch. We thought this data would also be helpful for the canteen staff, so they could time the food preparation to the number of people in the canteen at any time. We had an older model of the canteen and a Matterport model, so there was our basis. We just needed data.

Initially, we investigated computer vision, but this was not possible due to the new European General Data Protection Regulation (GDPR). We had recently tested computer vision for other applications involving cars and street signage, but it could not be used in this situation. We investigated using the Building Management System (BMS), but the system was too old: we could not access the data, and the system could not record data. Even if we could get the BMS to work, it would not be possible to get historical data. An acoustics engineer in the company recommended that we use sound levels as an indicator of the number of people in the canteen. It would not be precise, but it could work and help us get around the GDPR problem. This would require a few sensors distributed around the canteen and some manual calibration.
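To give a sense of what that manual calibration might involve, here is a hedged sketch: sound levels measured at known head counts serve as calibration points, and occupancy for new readings is estimated by interpolating between them. The calibration pairs below are invented for illustration, not our measured values.

```python
# Sketch: estimate canteen occupancy from an averaged sound level.
# Calibration pairs (dB, people) are hypothetical, not measured values.
CALIBRATION = [
    (40.0, 0),    # empty-room baseline
    (55.0, 25),
    (65.0, 75),
    (72.0, 150),  # invented peak lunch-hour observation
]

def estimate_occupancy(level_db: float) -> int:
    """Linearly interpolate between the calibration points."""
    points = sorted(CALIBRATION)
    if level_db <= points[0][0]:
        return points[0][1]
    if level_db >= points[-1][0]:
        return points[-1][1]
    for (x0, y0), (x1, y1) in zip(points, points[1:]):
        if x0 <= level_db <= x1:
            t = (level_db - x0) / (x1 - x0)
            return round(y0 + t * (y1 - y0))
    return points[-1][1]

print(estimate_occupancy(60.0))  # about 50 people with these invented points
```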

Our project involved three processes. The first was documenting the physical environment with Revit models and laser scans. The second was capturing live conditions from the physical environment with sensors and converting them into digital data. The final part was making the live connection between the physical and digital worlds. This connection was made by modeling a digital instance of our physical sensor and connecting the two, so data could flow from the physical sensor in the physical world to our digital representation of the sensor in the digital world.
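For that last step, a Dynamo Python node is one way to push a live reading onto the modeled sensor. The sketch below shows the general pattern for writing a value to a Revit parameter from Dynamo; the parameter name and the upstream inputs are assumptions for illustration, not our exact graph.

```python
# Dynamo Python node sketch: write the latest sensor reading to a
# parameter on the modeled sensor family instance in Revit.
# IN[0]: the sensor element; IN[1]: the latest value fetched upstream
# (e.g. from a cloud endpoint). The parameter name is hypothetical.
import clr
clr.AddReference('RevitServices')
from RevitServices.Persistence import DocumentManager
from RevitServices.Transactions import TransactionManager

doc = DocumentManager.Instance.CurrentDBDocument
sensor = UnwrapElement(IN[0])
value = float(IN[1])

# Revit requires an open transaction to modify the model.
TransactionManager.Instance.EnsureInTransaction(doc)
param = sensor.LookupParameter('Sound_Level_dB')  # assumed parameter name
if param and not param.IsReadOnly:
    param.Set(value)
TransactionManager.Instance.TransactionTaskDone()

OUT = value
```

Running a graph like this on a schedule keeps the digital instance in step with the physical sensor, which is the bi-directional link that qualifies the model as a digital twin.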

Download the full class handout to learn how to connect MEP models and sensors using cloud technology, Dynamo, and Revit; how to visualize IoT live data with BI dashboards to enhance building maintenance; and more. 

David Fink is an American architect who has lived in Copenhagen for the past 18 years, where he has worked in design-build, architectural design, and consulting engineering companies as a BIM specialist. BIM is one of his passions, and he is constantly looking for new ways of expanding the boundaries and use of BIM. He has been involved with BIM implementation and the active use of BIM on projects since 2007, both locally in Denmark and internationally. In his daily work he looks for ways to increase the quality of projects as well as increase efficiency and consistency while having some fun. He is part of the generation of architects who have taken the ride from the Mayline through CAD to BIM, which gives him a holistic view of the AECO industry. At Ramboll he is part of the Integrated Digital Solutions Group, where he focuses on the architectural departments and how they work with other disciplines to deliver integrated solutions.

Alejandro Mata is an automation manager in the Integrated Digital Solutions department at Ramboll Denmark. He is an MSc HVAC design engineer with a background in civil engineering and architectural technology from DTU–Technical University of Denmark. Alejandro is passionate about enhancing the performance of the AEC industry by promoting better utilization of building technology and the automation of digital design processes. His focus is to work smarter and achieve the most effective practices to enhance data utilization and digital collaboration among AEC parties. He has used Autodesk products for the last 10 years, with a detailed focus on Revit MEP software and the Dynamo extension, complemented with Business Intelligence cloud solutions. He has gained experience over the last five years on well-known Nordic projects. Additionally, he has worked as a teaching assistant at DTU and loves sharing knowledge.