AI Won’t Fix Your Building’s Data—How to Actually Make It Useful

James Dice | July 1, 2025 | 10 min read

The data streaming from our buildings is useless without context. 

A real example: imagine you’re trying to pull HVAC data to run analytics on how comfortable your spaces are. Your Building Automation System (BAS) exposes multiple ‘setpoint’ points, but the effective setpoint actually chosen by the occupant never makes it onto the network. Without reconfiguration, the data you do collect is meaningless for the question you’re asking.

This illustrates a key truth: most building data is raw exhaust, a byproduct of control systems, not formatted for analytics (or any other use case) without intentional context.

This lack of context, or metadata, is holding back smart building programs. Building owners often assume BAS or IoT data is useful by default, only to find it unlabeled, inconsistent, or incomplete. Readings like temperature or flow arrive without the critical “who/what/where” context, making them technically available but practically unusable without extensive cleanup.

While AI excels at pattern recognition, it cannot magically fill in missing context for raw building data. AI models struggle with cryptic point names or unrecorded information; training cannot recover a sensor's location or a valve's function if never captured. There's no easy button for transforming data exhaust into insight.

Building owners (and their facility and OT teams) cannot simply expect clean data to emerge from their systems. Leaving data quality to chance leads to increased labor costs during implementation, rework, delayed time to value, and failed use cases built on erroneous analytics.

Here’s the key reality: it’s on you, the building owner/operator, to run a program that builds and maintains context for your data. You don’t have to do it all yourself—there’s a growing ecosystem of vendors and tools to help—but you do need to take ownership of the outcome. 

Treating building data like a first-class asset means setting standards, enforcing them, and continuously validating that your data stays organized and reliable. With the right approach, you can ensure your building’s data is useful fuel for operations and analytics, not just exhaust.

Set Standards and Own the Stack (or Pay for the Rework)

Making building data useful begins with establishing metadata standards and enforcing them across your portfolio. This is not a “one and done” task or something you can completely delegate. Owners need to own their data stack—or be prepared to pay later when someone has to clean it up. 

Why? Because context in building systems is not static: equipment gets replaced, control sequences get modified, point names drift over time. If nobody is in charge of keeping the data model up to date, entropy takes over. And because every downstream software application depends on that model, this role is crucial.

It might be a new role in the facilities or OT team, part of the scope of your data layer software provider, or a responsibility given to your Master Systems Integrator (MSI) under contract. Without that accountability, you’re essentially letting each vendor or contractor do whatever they want with naming and tagging—which guarantees a mess down the road.

A practical framework for this comes from Mapped (an independent data layer vendor), whose entire business is centered on structuring building data. Jason Koh, Mapped’s Chief Data Officer and co-founder of the Brick Schema ontology, outlines a five-step data modeling process that building owners can adopt (whether using Mapped’s tools or otherwise):

  • Type Assignment: Identify and tag each data point with what kind of thing it is (e.g. zone temperature sensor, discharge air damper command, etc.).

  • Identification: Parse and interpret identifiers in the raw data to group related points and devices. From “VAV101_RM100_ZN_T”, you can infer and confirm that this is the zone temp for VAV unit 101 serving room 100.

  • Linking: Establish relationships between entities—which points belong to which piece of equipment, which equipment serves which space, and how equipment is nested.

  • Unification: Reconcile disparate data sources into a unified model. A single “real world” object might appear in the BAS, in mechanical schedules, and in drawing files, all with different names. Unification means mapping those together.

  • Enrichment: Add any additional context or custom metadata needed for your use cases. This could be human-friendly equipment labels, associations to floor or zone names, capacities, control sequences, commissioning dates, etc. 

By breaking the problem into these steps (or similar ones), owners (and their vendors) can systematically build a robust data model. 
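To make the first few steps concrete, here is a minimal Python sketch of type assignment, identification, and linking under stated assumptions: the point-naming pattern, the tag vocabulary, and all class and function names are hypothetical, and this is not Mapped’s actual pipeline or API.

```python
# Illustrative only: parse a raw BAS point name, assign a type, and link it
# into a small equipment/space model. Naming pattern and vocabulary are assumed.
import re
from dataclasses import dataclass, field

@dataclass
class Point:
    raw_name: str
    point_type: str = "unknown"      # e.g. "zone_air_temp_sensor"
    equipment: str | None = None     # e.g. "VAV-101"
    space: str | None = None         # e.g. "Room 100"

# Steps 1 and 2: type assignment and identification from the raw identifier.
NAME_PATTERN = re.compile(r"VAV(?P<vav>\d+)_RM(?P<room>\d+)_(?P<suffix>\w+)")
SUFFIX_TO_TYPE = {"ZN_T": "zone_air_temp_sensor", "DMP_CMD": "damper_command"}

def identify(raw_name: str) -> Point:
    m = NAME_PATTERN.match(raw_name)
    if not m:
        return Point(raw_name)  # unparseable names get flagged for human review
    return Point(
        raw_name=raw_name,
        point_type=SUFFIX_TO_TYPE.get(m["suffix"], "unknown"),
        equipment=f"VAV-{m['vav']}",
        space=f"Room {m['room']}",
    )

# Step 3: linking. Points belong to equipment; equipment serves spaces.
@dataclass
class Model:
    points_by_equipment: dict[str, list[Point]] = field(default_factory=dict)
    equipment_serves: dict[str, str] = field(default_factory=dict)

    def link(self, p: Point) -> None:
        if p.equipment:
            self.points_by_equipment.setdefault(p.equipment, []).append(p)
            if p.space:
                self.equipment_serves[p.equipment] = p.space

model = Model()
model.link(identify("VAV101_RM100_ZN_T"))
print(model.equipment_serves)  # {'VAV-101': 'Room 100'}
```

Unification and enrichment would layer on top of this: merging entities that appear under different names in the BAS, schedules, and drawings, then attaching labels, capacities, and other owner-specific metadata.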

As we discussed with Mapped co-founder Shaun Cooley in 2021, Mapped was founded on the vision of accelerating this process using AI. Over time, they've automated much of it, but they've learned there's no replacement for expert and owner involvement in providing critical context.

Mapped initially aimed for a ‘magic’ solution, but learned that human expert involvement is crucial and that customers want the ability to customize how their data is structured. They now request comprehensive information up front and have developed an ‘Expert Center’ that lets customers contribute directly to data cleaning and modeling.

Mapped’s Expert Center combines toolchains and AI agents that interact with users through a visual interface. Building operations teams can respond to specific prompts, such as confirming whether an AHU serves the third or fourth floor, or validate suggested mappings through the interface. This human-in-the-loop approach ensures expert context feeds back into the AI pipeline.
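In spirit, that loop might look something like the following. This is a minimal sketch with hypothetical names, not Mapped’s actual Expert Center interface or API.

```python
# Hypothetical human-in-the-loop sketch (not Mapped's Expert Center API):
# low-confidence inferences are queued for an operator to confirm, and the
# answer is written back into the building model.
from dataclasses import dataclass

@dataclass
class Suggestion:
    entity: str            # e.g. "AHU-2"
    question: str          # e.g. "Which floor does AHU-2 serve?"
    options: list[str]
    confidence: float      # confidence score from the automated pipeline

def review(queue: list[Suggestion], model: dict[str, str]) -> None:
    for s in queue:
        if s.confidence >= 0.9:
            continue                                     # high confidence: no prompt
        answer = input(f"{s.question} {s.options}: ")    # operator answers the prompt
        model[s.entity] = answer                         # expert context feeds back in

building_model: dict[str, str] = {}
review([Suggestion("AHU-2", "Which floor does AHU-2 serve?", ["Floor 3", "Floor 4"], 0.55)],
       building_model)
```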

By setting metadata standards and being willing to “own” the fidelity of your data, you enable the AI and experts to work in concert. If you don’t provide structure up front, a vendor like Mapped will either come back to you for help anyway or, worst case, have to send people on-site to figure it out. 

Smart building success demands collaboration: the owner’s team provides accurate info and context; the vendor provides tools and expertise to organize it. When owners establish clear standards (e.g. “we tag all our points with Haystack 4 conventions” or “here’s our equipment list for each site in Excel”), vendors can deliver solutions faster and more accurately. When owners don’t, the vendor has to first do the dirty job of cleaning the data or reverse-engineering your building.
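For instance, a point described under a Haystack-style convention might carry tags like these, written here as a plain Python dict. The tag choices and reference IDs are illustrative, not a normative Haystack 4 encoding.

```python
# Illustrative only: a zone temperature point with Haystack-style tags,
# expressed as a plain Python dict rather than a formal Haystack grid.
zone_temp_point = {
    "dis": "VAV-101 Zone Temp",     # human-friendly display name
    "point": True,                  # marker tags shown as booleans for readability
    "sensor": True,
    "zone": True,
    "air": True,
    "temp": True,
    "unit": "°F",
    "equipRef": "vav-101",          # which equipment this point belongs to
    "spaceRef": "room-100",         # which space that equipment serves
    "siteRef": "hq-main",
}
```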

In short, take charge of your metadata and data modeling program. As the next sections demonstrate, this principle extends throughout the entire building lifecycle—from design to daily operations.

Build the Model Early and Update It Often

One of the most common mistakes in smart building initiatives is treating data modeling or tagging as an afterthought—something to do after the BAS is installed or when analytics are being onboarded. 

But the best time to capture context is when the system is installed, not months or years later. Steve Dawson-Haggerty of Normal argues that modeling should be an integral part of the controls engineering workflow, not a separate project down the line. 

He envisions a world where “you cannot deploy the system without the ontology being constructed.” In other words, if a contractor is programming a building’s control system or setting up devices, they should be simultaneously building the digital model of those systems. 

Consider how a typical BAS graphics setup works: an integrator might draw out an air handling unit diagram and label points on it (supply temp, return fan status, etc.). In doing so, they’re implicitly mapping relationships (this point is part of AHU-1). Yet traditionally, that effort doesn’t get captured as structured data—it lives only in the graphic or in someone’s head. 

Controls vendors like 75F and J2 Innovations stand out in the BAS world for exactly this reason: Haystack data modeling is “native,” built into their products’ deployment workflows through tagged templates and engineering tools that automatically create relationships.

"Imagine a world where systems inherently speak a standardized language, making them self-describing from the moment of integration," says Scott Muench of J2 Innovations. Like Steve, Scott hopes this approach will become an industry standard, prompting manufacturers to incorporate data modeling as a foundation for their products, not as an afterthought. 

Normal’s approach, when that context is absent, is to help the user formalize those relationships during onboarding. An engineer can define “AirHandler” as a template with certain expected points, then create instances AHU-1, AHU-2, etc., each with its associated points or subsystems. By the time the system is up, you have a navigable equipment hierarchy and consistent point metadata. Point tagging isn’t an additional step—it “follows naturally” from the equipment definitions.
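A rough sketch of what template-driven onboarding looks like, with hypothetical names rather than Normal’s actual API:

```python
# Hypothetical sketch of template-driven onboarding (not Normal's API):
# define an equipment template with expected points, stamp out instances,
# and get consistent metadata plus a built-in completeness check.
from dataclasses import dataclass, field

@dataclass
class EquipmentTemplate:
    type_name: str
    expected_points: list[str]       # point types every instance should expose

@dataclass
class EquipmentInstance:
    name: str
    template: EquipmentTemplate
    points: dict[str, str] = field(default_factory=dict)   # point type -> BAS point name

    def missing_points(self) -> list[str]:
        """Points the template expects but this instance never bound."""
        return [p for p in self.template.expected_points if p not in self.points]

air_handler = EquipmentTemplate(
    "AirHandler",
    ["supply_air_temp", "return_fan_status", "outside_air_damper_cmd"],
)

ahu1 = EquipmentInstance("AHU-1", air_handler)
ahu1.points["supply_air_temp"] = "AHU1_SAT"
print(ahu1.missing_points())   # ['return_fan_status', 'outside_air_damper_cmd']
```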

For building owners, the lesson is to push your vendors and project teams to incorporate modeling from day one. When you’re kicking off a new construction or a major retrofit, require that the deliverables include a complete labeled model of the BAS and devices. 

Some owners are beginning to do this. Andrew Rodgers of ACE IoT Solutions mentioned that a few consultants have started specifying that the MSI (Master Systems Integrator) must shepherd the data model and use-case delivery throughout the project. 

The idea is that the MSI cannot just get the BAS online; they also have to ensure the data is modeled correctly for the analytics to work. However, Rodgers notes, “almost no one is doing business like that right now” except in some high-end greenfield projects. 

More often, the traditional process plays out: “Consultant builds some use cases into the spec, it goes out to a general contractor, and then it all gets value engineered to hell. And at the end of the day, a beleaguered controls tech implements whatever he did at the last job,” Rodgers said.

The result is not a rigorously tagged system, but rather whatever naming convention (if any) the technician felt like using, and use-case requirements that may have been stripped out to cut cost. Commissioning agents, for their part, typically aren’t tasked with validating data semantics—as Rodgers points out, they care if the HVAC works, not if it’s ready to share data with an app: “They’re there to make sure the damper on the VAV actuates… They don’t really care how the control signal is generated… They validate the dampers move and walk away.”

Dawson-Haggerty also says don’t get stuck on which ontology you use, a point which was echoed by a panel of data modeling experts at NexusCon ‘24. The exact choice matters far less than having one and using it consistently. Too many owners get stuck in analysis paralysis, awaiting a winner in the ontology wars or a new ASHRAE standard, while their contractors continue to deploy systems with zero metadata in the meantime. 

It’s more productive to decide and include that requirement in specs and contracts right now. You can always translate or expand to another standard later if needed (since these ontologies are often interoperable or mappable). What you can’t do easily is recover context for a building commissioned with no metadata at all.
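Translating between ontologies is, conceptually, a lookup from one vocabulary to another. Here is a toy sketch with an illustrative (not official) crosswalk between Haystack-style tag sets and Brick-style class names.

```python
# Toy crosswalk between tagging conventions; the table is illustrative,
# not an authoritative Haystack-to-Brick mapping.
HAYSTACK_TO_BRICK = {
    frozenset({"zone", "air", "temp", "sensor"}): "Zone_Air_Temperature_Sensor",
    frozenset({"discharge", "air", "temp", "sensor"}): "Discharge_Air_Temperature_Sensor",
}

def to_brick_class(tags: set[str]) -> str | None:
    """Return a Brick-style class for a Haystack-style tag set, if covered."""
    return HAYSTACK_TO_BRICK.get(frozenset(tags))

print(to_brick_class({"zone", "air", "temp", "sensor"}))  # Zone_Air_Temperature_Sensor
```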

Another benefit of building the model early is that it sets the stage for continuous updates. If you have an authoritative model or data layer from the start, you can implement processes to keep it updated as things change. 

Next, we’ll look at how to validate and sustain that data integrity through commissioning and beyond, so your upfront efforts don’t decay over time.

Validate at Commissioning and Beyond—or Pay for It Later


Many metadata and integration issues in buildings can be traced back to a simple root cause: nobody ever validated the data after installation. In traditional construction, there’s (usually) a commissioning step to verify the HVAC equipment works and meets performance specs, but it rarely extends to verifying that all relevant data points are accessible and correctly labeled for future use. 

The consultant or engineer might have written the spec with good intentions, but without enforcement, things slip through the cracks. And in construction, plenty gets “value engineered” out or changed under pressure. Field techs often revert to what’s fastest for them, rather than what’s best for data consistency.

What would it look like if we did validate data at commissioning? Increasingly, this is referred to as digital commissioning or data commissioning. It means that alongside functional tests, the project team runs data-centric tests (sketched in code after this list):

  • Are all devices connected and reporting? 
  • Do point names follow the standard and match the model? 
  • Are the relationships (AHU has all its VAV children, etc.) reflected correctly in the database? 
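A minimal sketch of automating these checks, using a made-up naming standard and equipment list; this is hypothetical and not ACE IoT’s Sentinel.

```python
# Hypothetical data-commissioning checks: every discovered point must match the
# owner's naming standard, and every equipment instance must report its full
# expected point set. The standard and expected points are assumptions.
import re

NAMING_STANDARD = re.compile(r"^(AHU|VAV)\d+_[A-Z0-9_]+$")    # assumed owner standard
EXPECTED = {"VAV101": {"ZN_T", "ZN_T_SP", "DMP_CMD"}}         # from the design model

def data_punch_list(discovered: dict[str, set[str]]) -> list[str]:
    """Compare a live device scan against the model and return a punch list."""
    issues = []
    for equip, points in discovered.items():
        for p in points:
            if not NAMING_STANDARD.match(f"{equip}_{p}"):
                issues.append(f"{equip}_{p}: name violates standard")
        for missing in EXPECTED.get(equip, set()) - points:
            issues.append(f"{equip}: expected point {missing} not reporting")
    return issues

print(data_punch_list({"VAV101": {"ZN_T", "DMP_CMD"}}))
# ['VAV101: expected point ZN_T_SP not reporting']
```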

This may sound tedious, but new tools are emerging to automate the process. Rodgers’ company, ACE IoT, has been developing a tool called Sentinel aimed at this exact need. Sentinel sits on the network during commissioning, scans the live devices and points, flags any missing points or off-pattern names, and creates a punch list for data completeness. 

This approach echoes what a very diligent MSI might do manually (some integrators do check all their points against a spec sheet), but automating it ensures nothing is overlooked and saves man-hours. Rodgers points out that commissioning agents could be ideal for this role, as they already test equipment operation (manually or automatically) and are familiar with the control sequences. They just need the tools and mandate to verify metadata while they’re at it. 

As Rodgers laments, it’s still early days; many building owners haven’t considered asking for this, or balk at the additional cost, real or perceived. Yet those same owners likely spend much more later when a retrofit or analytics install uncovers the mess.

Over time, if nobody monitors day-to-day changes, your once-accurate model will diverge from reality. This is exactly what happened when Bueno Analytics deployed their platform across 1,000+ grocery stores (the Woolworths chain) to gather equipment data for energy and refrigeration analytics. 

Bueno’s solution was to develop an automated reconciliation tool they call Synchro. Sentinel and Synchro represent a new wave of continuous data commissioning tools: they keep validating that your production data aligns with your expected model and surface any discrepancies so they get addressed.
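Conceptually, continuous reconciliation is a recurring diff between the approved model and what the site actually reports today. A minimal sketch, hypothetical and not Bueno’s Synchro:

```python
# Hypothetical drift check: diff the live point inventory against the approved model.
def reconcile(model_points: set[str], live_points: set[str]) -> dict[str, set[str]]:
    """Return points missing from the site and points present but never modeled."""
    return {
        "missing_from_site": model_points - live_points,   # broke, renamed, or removed
        "unmodeled_on_site": live_points - model_points,   # added but never mapped
    }

model = {"CASE12_DischargeTemp", "RACK3_SuctionPressure"}
live = {"CASE12_DischargeTemp", "RACK3_SuctPress"}          # a tech renamed a point
print(reconcile(model, live))
```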

From a responsibility standpoint, building owners have a choice: either build this capability internally or ensure a partner is doing it for you. What doesn’t work is assuming “someone else” is handling it when no one actually is. 

Rodgers voiced a passionate plea on this front: he argues owners should stop engaging integrators and vendors on a pure time-and-materials, break-fix basis and move to a more programmatic partnership. “Stop [f*ing] paying MSIs by the hour. Tie their payment to outcome… pay a retainer because this stuff needs to be maintained,” he urges. 

If a vendor knows they are accountable for data quality over time (with SLAs or ongoing fees contingent on it), they have an incentive to set up automated checks, alerts, and proactive fixes. If not, they might happily bill you each time a report breaks because a point was missing. The owner’s mindset should shift from one-off projects to continuous commissioning and lifecycle data management.

Conclusion: Treat Data as an Asset, Not an Afterthought

In the world of facilities and OT, the reliability and usefulness of your building data is quickly becoming as important as the reliability of the HVAC or power systems themselves. If there’s one overarching lesson from those on the cutting edge, it’s that data quality doesn’t take care of itself. 

If you put the right building blocks in place (standards, early modeling, validation processes, and accountability), all the promised ROI of smart building technology becomes attainable. The cliché fits: garbage data in yields garbage out. Conversely, well-managed data opens the door to optimization, cost savings, and innovation.

AI will then have a fertile ground to help you—not to figure out what a setpoint name means, but to truly optimize your building’s performance. The path to get there isn’t flashy, but it is doable.
