Houston, we have a problem.
Soon after an oxygen tank exploded early in Apollo 13’s mission in April 1970, the astronauts were fighting for their lives. The whole world held its breath as the now-famous rescue ensued. Engineers in Houston scrambled to sort through technical issues from hundreds of thousands of miles away.
The rescue was successful because back on Earth there was an identical copy of the spacecraft: Apollo 13’s twin. The team could quickly test solutions on the ground without adding risk to the astronauts in space.
Almost 50 years later, NASA uses the same strategy to understand and manage systems and machines across the solar system. Except today, the twins are virtual and there’s a fancy buzzword to describe them: Digital Twins.
Read any marketing-driven article today and you’ll learn that digital twins are about to save the world from climate change, create world peace, and wipe our virtual and physical butts. According to Gartner’s 2018 Hype Cycle for Emerging Technologies, digital twins were sitting at the Peak of Inflated Expectations and headed toward the Trough of Disillusionment. Gartner estimated they were 5 to 10 years away from the “Plateau of Productivity,” where a technology becomes mainstream and fully operational.
Gartner defines a digital twin as:
A software design pattern that represents a physical object with the objective of understanding the asset’s state, responding to changes, improving business operations and adding value.
By that definition, virtually every organization that manages physical assets has had some sort of digital twin in operation for decades. For example, consider a single data point representing one aspect of one asset’s health. Every commercial building I’ve ever been inside has that covered.
Here lies a paradox: How can a technology be simultaneously proven and in use for decades AND still 5 to 10 years from maturity? That’s very confusing.
I think it’s because we’re talking in circles. There’s little consensus on what a digital twin is, what it does, and how the latest ultra-hyped versions are any different than what we’ve been doing for years. This post is the first installment in my attempt to untangle that mess for myself, with the hope it will be useful for you as well. Since we’re mainly concerned with buildings, I think the history of digital tools for buildings is a great jumping-off point.
I’ve worked in the buildings industry for almost 10 years now (woohoo!). Much of that time has been spent with digital representations of mechanical systems or whole building energy models. These types of tools, along with the others listed below, are mainstream in our industry. It turns out that we’ve been inching closer to digital twins this whole time—it’s the next progression. To illustrate, let’s walk through them in the order that I encountered them, starting at the beginning of my career.
Whole building energy models, built in tools like eQuest, OpenStudio, and Trane Trace, are used in design and energy management practices to calculate the energy consumed in a building for every hour of the year.
Creating a digital version of the building lets you simulate and test different alternatives for hitting a target energy use (e.g., net zero) before implementing them in real life: a better envelope, a different orientation to the sun, more distributed energy resources, a different HVAC design, and so on.
This makes it possible to evaluate design alternatives in the context of the building’s specific characteristics and how its occupants actually use it. Knowing the savings and/or ROI of each alternative also helps building owners and investors decide whether or not to pursue it.
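To make that concrete, here’s a minimal sketch in Python of how an energy model’s outputs might feed an alternatives comparison. The annual kWh figures, install costs, and utility rate are invented placeholders standing in for simulation results, not output from any real tool:

```python
# Sketch: comparing design alternatives using whole-building energy model results.
# The annual_kwh values are placeholders for simulation output (e.g., from eQuest
# or OpenStudio); costs and the electricity rate are illustrative only.

BASELINE_KWH = 1_200_000       # simulated annual energy use of the baseline design
ELECTRICITY_RATE = 0.12        # $/kWh, assumed utility rate

alternatives = {
    "improved envelope":    {"annual_kwh": 1_050_000, "install_cost": 250_000},
    "reoriented massing":   {"annual_kwh": 1_140_000, "install_cost": 40_000},
    "high-efficiency HVAC": {"annual_kwh": 950_000,   "install_cost": 600_000},
}

for name, alt in alternatives.items():
    savings_kwh = BASELINE_KWH - alt["annual_kwh"]
    savings_usd = savings_kwh * ELECTRICITY_RATE
    payback_yrs = alt["install_cost"] / savings_usd if savings_usd else float("inf")
    print(f"{name}: saves {savings_kwh:,} kWh/yr "
          f"(${savings_usd:,.0f}/yr), simple payback {payback_yrs:.1f} years")
```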
The building automation system (BAS) is what we use to operate and monitor our buildings. It provides streaming data, a user interface with graphical ways to explore the equipment, and of course control of the equipment itself. The BAS was the IoT for buildings before IoT was a thing.
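As a rough sketch of that role, here’s what a monitor-and-command loop against a BAS might look like in code. HypotheticalBASClient and the point names are exactly that, hypothetical stand-ins, not any vendor’s or protocol library’s actual API:

```python
import time

# Sketch of the two things a BAS does for us: stream point data out and accept
# control commands back. HypotheticalBASClient is a made-up stand-in interface.

class HypotheticalBASClient:
    def read(self, point: str) -> float:
        """Return the current value of a point (stubbed with a constant here)."""
        return 74.5

    def write(self, point: str, value: float) -> None:
        """Command a point, e.g., a zone temperature setpoint."""
        print(f"wrote {value} to {point}")

bas = HypotheticalBASClient()
for _ in range(3):                               # poll a zone temperature and react
    zone_temp = bas.read("AHU-1/Zone-301/Temp")
    if zone_temp > 74.0:                         # crude comfort rule, for illustration only
        bas.write("AHU-1/Zone-301/CoolingSetpoint", 72.0)
    time.sleep(1)
```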
Building information models are three-dimensional representations of buildings that aid in the construction process. I’m not sure I could say it better than Jenana Roper did:
BIM is a small sub-set of a Digital Twin, frozen in time – typically during the design and construction phase. BIM is a finely tuned tool for more accurate design, collaboration, visualization, costing and construction sequencing phases of a building’s life. Its primary purpose is to design and construct a building, and post-construction, it serves to provide a digital record of a constructed asset. BIM is only focused on buildings – not people or processes. However, BIM is a small but very useful input into a Digital Twin, as it provides us with an accurate digital asset register and location data and is a great starting point for both a Smart Building and a Digital Twin.
An energy management and information system (EMIS), also known as building analytics, is another puzzle piece of the digital twin. As we’ve defined in the EMIS Framework, an EMIS is a system of devices, data services, and software applications that communicates with any building system or third-party data source to aggregate data and transform it into new capabilities that aid in the optimization of the building.
An EMIS typically isn’t one product or application, although it should perform like one. The EMIS stack is all of the devices, data services, and applications that meet the needs of the user. Depending on the EMIS implementation, the stack can have many different layers.
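Here’s a minimal sketch of that aggregate-and-transform idea, with invented point names and readings and a deliberately crude fault-detection rule standing in for a real application layer:

```python
from statistics import mean

# Sketch of the EMIS idea: aggregate data from multiple building systems and
# third-party sources, then transform it into a new capability (here, a toy
# fault-detection rule). Point names and readings are invented for illustration.

raw_readings = {
    "bas:AHU-1/SupplyAirTemp": [55.1, 55.3, 61.8, 62.0],   # deg F, from the BAS
    "meter:main/kW":           [290, 295, 340, 338],        # from the utility meter
    "weather:outdoor_temp":    [48, 49, 50, 50],            # third-party weather feed
}

def hourly_average(series):
    """Data-service layer: roll raw samples up into a single hourly value."""
    return mean(series)

hourly = {point: hourly_average(values) for point, values in raw_readings.items()}

# Application layer: flag a possible fault when supply air is warm while
# whole-building demand is elevated on a mild day.
if hourly["bas:AHU-1/SupplyAirTemp"] > 58 and hourly["meter:main/kW"] > 300:
    print("Possible fault: warm supply air with elevated demand -> investigate AHU-1")
```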
A smart building builds on the BAS and, in some ways, the EMIS. It integrates multiple previously siloed building systems into one platform and provides enhanced control functionality and engagement with occupants. Some have called these Building Engagement Platforms, Building Operating Systems, or IoT platforms. There are probably many more buzzwords grouped under the Smart Buildings umbrella.
A smart building platform is the “middleware” between devices, people, and software applications. It differs from the BAS in the scope of the data it connects with; ideally it connects everything digital: HVAC, lighting, plug loads, meters, access control, fire suppression, grid interaction, IoT devices, etc. It differs from an EMIS in its focus on connections at the edge rather than in the cloud, on human engagement, and on control. Just like an EMIS, a smart building platform provides capabilities to users and occupants through applications.
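A toy sketch of that middleware role, with invented topics and payloads: previously siloed systems publish onto one bus, and applications subscribe to whatever they need.

```python
from collections import defaultdict
from typing import Callable

# Sketch of the "middleware" role of a smart building platform. Topics,
# payloads, and zone names are invented for illustration.

class BuildingMessageBus:
    def __init__(self):
        self._subscribers: dict[str, list[Callable[[dict], None]]] = defaultdict(list)

    def subscribe(self, topic: str, handler: Callable[[dict], None]) -> None:
        self._subscribers[topic].append(handler)

    def publish(self, topic: str, payload: dict) -> None:
        for handler in self._subscribers[topic]:
            handler(payload)

bus = BuildingMessageBus()

# An occupant-facing app reacts to a badge-in event from access control by
# asking lighting and HVAC to prepare the occupant's zone.
def warm_up_zone(event: dict) -> None:
    bus.publish("lighting/zone-301/on", {"level": 80})
    bus.publish("hvac/zone-301/setpoint", {"value": 71.0})

bus.subscribe("access/badge-in/zone-301", warm_up_zone)
bus.subscribe("lighting/zone-301/on", lambda p: print("lights:", p))
bus.subscribe("hvac/zone-301/setpoint", lambda p: print("hvac:", p))

bus.publish("access/badge-in/zone-301", {"occupant": "anonymized-id-42"})
```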
What is a modern digital twin, then? What does it add to these historical versions? One way to think about it is that it combines all of the above.
Static models are enriched by live data and the systems using live data are enriched by better models. For instance, an energy model would be more useful if it was calibrated to actual conditions in the building as it evolves over time. If that could happen automatically, even better.
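As a simplified sketch of what that calibration step means, here’s a single least-squares scale factor fit between invented model predictions and invented metered data. A real calibration would adjust the physical inputs of the simulation itself (schedules, infiltration, efficiencies), not just scale the output:

```python
# Sketch of "automatic calibration": nudge a model's output toward metered data.
# The "model" here is a list of predicted monthly kWh; the calibration is one
# closed-form least-squares scale factor. All numbers are invented.

predicted_kwh = [98_000, 91_000, 85_000, 70_000, 66_000, 90_000]   # from the energy model
metered_kwh   = [104_500, 97_200, 90_100, 75_800, 70_300, 95_900]  # from utility bills

# Scale factor k minimizing sum((metered - k * predicted)^2)
k = sum(m * p for m, p in zip(metered_kwh, predicted_kwh)) / sum(p * p for p in predicted_kwh)

calibrated = [k * p for p in predicted_kwh]
print(f"calibration factor: {k:.3f}")
for month, (m, c) in enumerate(zip(metered_kwh, calibrated), start=1):
    print(f"month {month}: metered {m:,} kWh, calibrated model {c:,.0f} kWh")
```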
Similarly, a BIM needs to transition from construction-focused to O&M-focused, which means enriching it with static data (e.g., maintenance logs, O&M manuals, warranties) and live data from the BAS. Models and simulations are no longer operated in isolation from the physical world; there must be a connection between the physical and the digital systems. That requires two-way data exchange and the inclusion of humans in the roles of occupants and operators, so the focus on the human experience inherent in the latest smart building approaches is vital for the digital twin as well.
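To sketch what that combination might look like for a single asset, here’s a record that joins static data originating in BIM and O&M documents to live BAS points, with a path back for commands. The names, fields, and values are illustrative, not a real schema:

```python
from dataclasses import dataclass, field

# Sketch of one asset's "twin": static BIM / O&M data plus live BAS points,
# with two-way exchange represented by update_from_bas() and command().

@dataclass
class AssetTwin:
    asset_id: str                      # from the BIM asset register
    location: str                      # from BIM location data
    warranty_expires: str              # static O&M record
    live_points: dict[str, float] = field(default_factory=dict)  # streamed from the BAS

    def update_from_bas(self, point: str, value: float) -> None:
        """One direction of the two-way exchange: physical -> digital."""
        self.live_points[point] = value

    def command(self, point: str, value: float) -> None:
        """The other direction: digital -> physical (stubbed as a print here)."""
        print(f"[{self.asset_id}] commanding {point} = {value}")

ahu = AssetTwin("AHU-1", "Level 3 mechanical room", "2026-08-01")
ahu.update_from_bas("SupplyAirTemp", 55.2)
ahu.command("SupplyFanSpeedCmd", 0.8)
print(ahu.live_points)
```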
With the combined functionality of all of these approaches, the digital twin is more intelligent and able to provide better analytics and control. It’s greater than the sum of its parts. We’ll get to that in more detail in the next installment.
Another way to think of it is to define what a modern digital twin needs to include:
- a digital model of the building and its systems (the static piece: BIM, energy models, and the like)
- live data streaming in from the physical building
- two-way data exchange between the physical and the digital
- humans in the loop, as occupants and operators
Drop any one of these and, in my opinion, you no longer have a modern digital twin. In the next installment, we’ll walk through what digital twins will do for building owners, why they will change the buildings industry, and my questions about the future of digital twins.