“It’s not first to market that wins, it’s the first to cross the chasm.”
—Geoffrey Moore
Do you ever feel like the Universe is speaking directly to you?
Over the holidays, my mom had a stack of books that she wanted to give away. She had set them out to see if I wanted any of them. One of them was not like the others: Crossing the Chasm by Geoffrey Moore.1
If you haven’t read it, here’s the main premise of the book: Motivations for buying a new product are vastly different, depending on who is buying. There are groups of buyer personas with similar motivations that line up along a bell curve called the Technology Adoption Lifecycle:
The innovators and visionary early adopters want huge changes and are willing to bet on them against the odds. But people in the early majority (and beyond) are much more pragmatic. They don’t want big changes and huge innovations, but rather incremental improvements based on using proven products and solutions.
This gap between the early adopters and the early majority is called the chasm. In order to cross it, you need to successfully progress from left to right… or else you get stuck and die.
Before we get to why this is relevant to us, back to the Universe: I have no clue why my mom owned this book. She doesn’t do new technology. She hasn’t started a startup, nor does she plan to cross the chasm someday. There was no rational reason for this hand-me-down to occur. This must be some sort of sign...
Since I don’t take these signs lightly, I took the book. After reading halfway through, I think the Universe was asking me (and therefore us) to think about this question:
How can analytics for buildings go mainstream?
Fair warning: this is a juicy one. Let’s dive in.
Of all the technologies we discuss here on Nexus, the broad and murky category of “analytics” has been around the longest.2 And yet, in the two months I’ve been thinking about this question and bouncing my ideas off of people (including 50+ commenters on LinkedIn), I haven’t had one person tell me this technology is already mainstream, at scale, realizing its potential, or viewed as part of the required technology to own and run a commercial building.
We’re still in the early days—in what Geoffrey Moore calls the “early market”. The world of innovators and early adopters.
And while we seem to have a consensus on our current location, we aren’t anywhere near consensus on why that is, what to do about it, or how to cross that big scary chasm.
If I may generalize, let’s level-set on what it’s like here📍:
We haven’t created what Moore calls a breakthrough application that produces an order-of-magnitude improvement. Before all of you with successful analytics case studies get upset with me, the exception to this rule is of course when we analytics nerds are paired with the product.
In the vast majority of those case studies, we human service providers produce that breakthrough with analytics as a tool in our toolbox. And it’s within these competent, often-external service provider organizations that I think analytics has already crossed a chasm.
But that chasm, and that early majority buyer, is a mirage, folks. Getting across it doesn’t mean it’s gone mainstream with building owners. In this sense, there are really two chasms…
And the second chasm is the key to scale. Crossing the first one is great, but if analytics are not integrated into the day-to-day operations of internal teams—the people that could benefit from the new technology the most—it’ll remain a niche tool operating on the fringes of organizations.
There are three reasons I believe that’s true. First, we only have so many analytics nerds. Look at all the “building performance engineer” job openings out there right now. It’s a bottleneck. Second, when propped up by an engineer, it’s too expensive to get to the majority of the market.
Third, I’ve provided this service for years and still provide it to this day. It’s extremely inefficient because it doesn’t put the onus on the building owner to actually change what they’re doing and how they’re doing it.
In Moore’s terminology, the dependence on service providers allows the technology to be adopted as a “continuous innovation”. But that approach will limit scale and impact4 because it’s neglecting the potential of the tool, which is to transform the operation of the building. It’s disruptive and should be treated as such.
Despite this murky spot we’re in, I remain just as bullish on this second chasm as I was on the first when I adopted analytics as a service provider almost ten years ago. The real estate industry is shifting under our feet and all signs are pointing towards analytics as a solution to today’s top issues for building owners:
In other words, building owners have new problems that analytics can uniquely solve. But not just any analytics—it will require the right approach. Back to those LinkedIn comments and conversations: most of the opinions on where we go from here assume today’s average analytics product is ready to go to scale.
As one commenter said, “the software performance is a given”. I wholeheartedly disagree.
I think those that create the right product will see the market develop and open up for them. Those that try to take the wrong product mainstream are going to get stuck. And money spent propping up the wrong products, like with legislation or MBCx incentives, will be wasted. It’s aimed at the wrong chasm.
So let’s now go back to the book. A useful framework for thinking about the right analytics product is Geoffrey Moore’s Whole Product concept.
“The single most important difference between early markets and mainstream markets is that the former are willing to take responsibility for piecing together the whole product, whereas the latter are not.”
—Geoffrey Moore
A hard truth: for most of today’s analytics deployments, there’s a gap between the promise made to the customer—that compelling value proposition that causes them to buy—and the ability of the shipped product to fulfill that promise.
For example, analytics is often pitched and sold on ambitious promises like enabling predictive maintenance and creating 15-20% energy savings—while most of today’s deployments only realize a portion of that vision.6 We’re pitching a mature potential that the actual product can’t hit consistently.
For the visionaries in the early market—like those early-adopting building owners who simply build a whole team around the tool—this gap is no big deal. They’re trying to change the game, so they don’t mind pitching in to fill it in.
For pragmatists on the other side of the chasm, this gap is a dealbreaker. They call BS. They need proof you’ve closed it before they jump in with two feet. They need proof of what Moore calls a “whole product”, meaning you’re bringing everything required to resolve their pain. Frankly, most of us aren’t coming anywhere close to wholeness.
Before we talk about the whole product, let’s talk about where most of the market is at. I’ve been calling this Analytics 1.0.
Let’s walk through each piece of the pie from the perspective of a pragmatist. To illustrate my point, I’m going to try and speak from the pragmatist’s cynicism. If this were a pizza, the pragmatist is the guy asking where the hell the other half is.
While the real estate technology laggards or late majority might not be ready for a SaaS product, I think the pragmatists accept this as the way software is purchased nowadays.
But even if the vendor provides all the normal SaaS stuff (customer support, training, regular updates, bug fixes, etc.), there are many ways Analytics 1.0 falls short of today’s mainstream SaaS expectations:
For the first chasm, none of these issues is a dealbreaker. For the second, each will be.
Centralization is all about pulling the data out of siloed building systems and unlocking it for analysis. While this is essentially table stakes for analytics, here too there are ways the pragmatist looks skeptically at 1.0.
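To make “centralization” concrete, here’s a minimal sketch of what normalizing readings from siloed systems can look like. The point names, payload shapes, and schema below are hypothetical illustrations, not any vendor’s actual API:

```python
from dataclasses import dataclass
from datetime import datetime, timezone

@dataclass
class Reading:
    """One normalized sample, regardless of which building system produced it."""
    building_id: str
    equipment_id: str
    point_name: str   # normalized name, e.g. "zone_air_temp"
    value: float
    unit: str
    timestamp: datetime

# Hypothetical raw payloads from two siloed systems: a BAS and a meter gateway.
bas_sample = {"objId": "AHU1.ZN-T", "val": 72.4, "units": "degF",
              "ts": "2021-01-04T15:00:00Z"}
meter_sample = {"meter": "MAIN-KW", "reading": 418.0, "time": 1609772400}

def from_bas(raw: dict, building_id: str) -> Reading:
    # Split a vendor-specific object ID into equipment and point.
    equipment, point = raw["objId"].split(".", 1)
    normalized = "zone_air_temp" if point == "ZN-T" else point
    return Reading(building_id, equipment, normalized, raw["val"], raw["units"],
                   datetime.fromisoformat(raw["ts"].replace("Z", "+00:00")))

def from_meter(raw: dict, building_id: str) -> Reading:
    # The meter gateway reports epoch seconds instead of ISO timestamps.
    return Reading(building_id, raw["meter"], "power", raw["reading"], "kW",
                   datetime.fromtimestamp(raw["time"], tz=timezone.utc))

readings = [from_bas(bas_sample, "bldg-01"), from_meter(meter_sample, "bldg-01")]
```

The hard part in practice isn’t the code shape—it’s mapping thousands of inconsistent point names per building into that common vocabulary.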
Analytics 1.0 is notorious for pretty visualizations, charts, dashboards, and graphics. And while they’re table stakes too, the mainstream market needs more than pretty screens to drive value.
Building analytics is different from business analytics, where Tableau and Power BI visualizations drive massive value in other industries. Someone needs to go turn a wrench, and the distance between the pretty chart and the wrench getting turned is too far in 1.0.
Detecting faults goes one step further than visualization, but as I detailed in my whitepaper Building Analytics Showdown, it doesn’t go far enough. As a result, it must be propped up by the aforementioned engineering service provider.
In Analytics 1.0, detection can also be riddled with data issues and false positives, which damage the trust of end users. The worst detection-only products flood users with too many unprioritized and unimportant issues—essentially alarms on abnormal measurements not too far outside of what their existing systems could already do.
In order to realize the customer’s compelling value proposition, these products need to be packaged with “man-analytics” services to investigate, prioritize, and diagnose faults. Again, this is a product problem, and it’s a massive barrier to scale.
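To illustrate the difference between a raw alarm flood and a prioritized insight, here’s a minimal sketch of scoring detected faults by estimated impact so a technician sees the worst few first. The scoring rule, cost figures, and weighting are hypothetical, not any product’s actual logic:

```python
from dataclasses import dataclass

@dataclass
class Fault:
    equipment: str
    description: str
    days_active: int
    est_daily_cost: float  # hypothetical $/day impact estimate
    comfort_impact: bool   # does it affect occupants?

def priority_score(f: Fault) -> float:
    """Rank by cumulative cost, with a bump for comfort-affecting faults."""
    score = f.days_active * f.est_daily_cost
    if f.comfort_impact:
        score *= 1.5  # hypothetical weighting: comfort complaints escalate fast
    return score

faults = [
    Fault("AHU-1", "Simultaneous heating and cooling", 14, 45.0, False),
    Fault("VAV-204", "Stuck damper, zone overcooling", 30, 8.0, True),
    Fault("CH-2", "Low delta-T, excess pump energy", 7, 12.0, False),
]

# Surface only the top issues instead of forwarding every raw alarm.
top_issues = sorted(faults, key=priority_score, reverse=True)[:2]
for f in top_issues:
    print(f"{f.equipment}: {f.description} (score={priority_score(f):.0f})")
```

The point isn’t the arithmetic—it’s that prioritization and diagnosis are product responsibilities, not something to outsource to the user’s patience.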
Do you see how looking skeptically at this technology from the eyes of the pragmatist adds a dose of reality? Sorry if I’m hurting feelings, but we’re only just getting started. Let’s now complete the pie.
The whole analytics product needs to make up for those shortcomings, plus add the bottom half.
Let’s walk through the rest of the pie.
The ideal tool produces insights that fully diagnose and prioritize each issue automatically, which minimizes the human effort required to produce results. Most of today’s tools lie somewhere in the middle of that ideal and the worst-case scenario described in the previous section.
As I’ve covered extensively, advanced supervisory control goes hand in hand with analytics, but that’s not what I’m talking about in this section. I’m bullish on ASC, but I think it’s on a different adoption curve from analytics.
What I think Analytics 2.0 needs to have covered is a smooth path to controls changes. If the product and the owner are ready for ASC, great. If not, when the software detects a change that’s needed in the underlying systems, there needs to be a reliable and timely process for getting those changes made.
Analytics 1.0 has been notorious for leaving this for someone else to figure out. With a whole product, it’s part of the plan from day one.
This is, in my opinion, the biggest shortcoming of Analytics 1.0. When we say “integration”, we’re talking about integrating analytics into existing systems. But the second, and equally important, type of integration is into existing teams of humans. Analytics needs to be integrated into the ways work gets done in the organization.
A classic example of this lack of integration is with FDD analytics. There’s a disconnect between FDD software that produces potential work orders and the software that O&M teams use to manage work orders, the CMMS. And then there’s the disconnect between the capital projects that the FDD software identifies and the organization’s capital project development process. An external (m)analytics service provider bridges both of these gaps, often through a monitoring-based commissioning (MBCx) process.
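As a sketch of what closing the FDD-to-CMMS gap could look like, the key is that a diagnosed fault becomes a work order automatically, without re-flooding the maintenance backlog with duplicates. The work-order fields, client class, and dedupe logic below are hypothetical stand-ins, not any particular CMMS’s API:

```python
from dataclasses import dataclass, field

@dataclass
class WorkOrder:
    asset_id: str
    summary: str
    priority: int  # 1 = highest

@dataclass
class FakeCMMS:
    """Stand-in for a real CMMS API client."""
    open_orders: list = field(default_factory=list)

    def create(self, wo: WorkOrder) -> None:
        self.open_orders.append(wo)

def push_fault_to_cmms(cmms: FakeCMMS, asset_id: str,
                       diagnosis: str, priority: int) -> bool:
    """Create a work order for a diagnosed fault, skipping duplicates on the same asset."""
    if any(wo.asset_id == asset_id and wo.summary == diagnosis
           for wo in cmms.open_orders):
        return False  # already queued; don't re-flood the backlog
    cmms.create(WorkOrder(asset_id, diagnosis, priority))
    return True

cmms = FakeCMMS()
created = push_fault_to_cmms(cmms, "AHU-1",
                             "Leaking heating valve: simultaneous heat/cool", 1)
created_again = push_fault_to_cmms(cmms, "AHU-1",
                                   "Leaking heating valve: simultaneous heat/cool", 1)
```

When this handoff is native to the product, the external service provider stops being the glue holding the workflow together.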
How about PropTech’s newest darling, occupancy analytics? There’s a disconnect between space allocation workflows and spatial analytics software. In most cases, these two applications are separate, missing the opportunity to validate a business unit’s request for additional space against how business units are currently utilizing the space they already have.
Again, if I’m a pragmatist buyer, I wouldn’t want to fill these gaps on my own. But let’s not stop there. Workflow integration is also the key to solving the fragmentation problem on the buyer side. In order to make it spread through fragmented organizations, the product needs to make it easy for adjacent stakeholders to adopt it into new workflows.
For example, the folks responsible for capital planning care about HVAC performance too. The folks buying the building do too. Same data, but new stakeholders with new questions for that data.
Traversing inside the organization is a great segue to the next requirement for Analytics 2.0: making a business impact. From the standpoint of an executive team, Analytics 1.0 has been stuck in vitamin territory: optimizing a cost center.
The typical 1.0 business case is based on creating energy savings. This has value in year 1, but the whole product must continue to produce value year over year to justify the operating expense. That means traversing to new workflows beyond those that save energy.
But what if you can’t even get to year 1 because the benefit just isn’t enticing enough? Eventually, Analytics 2.0 needs to be a painkiller by improving workflows that affect the top-line revenue of the organization’s business. And that might mean putting energy savings use cases on the back burner at first.7
I think we’ve barely scratched the surface here.
So where do we go from here? With all the different stakeholders needing insights from building data, and all the diversity in today’s connected building systems, here’s the kicker: no one is going to create the whole product on their own.
For all the above gaps to be overcome, today’s products must be augmented by a variety of services and ancillary products to become the whole product that pragmatist buyers need and want.
And according to Crossing the Chasm, these buyers don’t want to be the ones pulling all these pieces together. They fear the hidden, full price they must pay to get the thing to produce the ROI they want—and they want to see the full picture before they buy.
I think the ones that blast out of the early market into the mainstream need to act now on this reality. They’ll navigate consolidation, partnerships, native integrations, and set up platform marketplaces so that the pragmatist building owner doesn’t have to piece everything together on their own.8
Thanks for reading,
James
P.S. Now I want to hear from you: Where am I wrong? Where do we go from here?
P.P.S. Thanks to Tyson Soutter, Will Coleman, and Alex Grace for giving me feedback on pieces of this.
If you’re wondering what else was in my mom’s stack, it was The Five Invitations, The Effective Executive, and a cookbook.
I know people get hung up on the word “analytics”. I use it because we don’t have a better one, in my opinion. EMIS is the second-best choice. For why I use analytics instead, I outlined my problems with it here.
It’s not just LinkedIn… I think the dizzying array of 100+ analytics-related companies on the Nexus vendor landscape back me up here.
Alex’s feedback that I’ve yet to incorporate: I don’t see it as a problem per se if the service provider is running the tech…to me it’s more of a question of what the status quo workflows are today. In other words if you have a low skilled maintenance guy/janitor on site, and you have robust service contracts for mechanical and controls, I don’t have a bone to pick with that business model, probably makes a lot of sense for an 80k sq. ft. random office bldg. But the only way for this type of market – let’s say “mid-market” to cross the chasm is for those service contracts to transform.
When I say “Product” from here on out, I’m not talking about one company creating one product to rule them all. A better word for us might be whole solution.
The recently-published results of the Smart Energy Analytics Campaign (SEAC) back me up here. Just look at how inconsistent the savings were across organizations.
I can’t believe I just wrote that.
And while they’re at it, I hope they’ll collaborate to solve the one piece of the pie that is holding EVERY vendor back from becoming part of a whole product: getting the data and making sense of it. I hope they’ll realize we’re better off solving that together and giving the pragmatist confidence that the market is ready for their investment.
Despite this murky spot we’re in, I remain just as bullish on this second chasm as I was on the first when I adopted analytics as a service provider almost ten years ago. The real estate industry is shifting under our feet and all signs are pointing towards analytics as a solution to today’s top issues for building owners:
In other words, building owners have new problems that analytics can uniquely solve. But not just any analytics—it will require the right approach. Back to those LinkedIn comments and conversations: Most opinions on where we go from here seem to think today’s average analytics product is ready to go to scale.
As one commenter said, “the software performance is a given”. I wholeheartedly disagree.
I think those that create the right product will see the market develop and open up for them. Those that try to take the wrong product mainstream are going to get stuck. And money spent propping up the wrong products, like with legislation or MBCx incentives, will be wasted. It’s aimed at the wrong chasm.
So let’s now go back to the book. A useful framework for thinking about the right analytics product is Geoffrey Moore’s Whole Product concept.
“The single most important difference between early markets and mainstream markets is that the former are willing to take responsibility for piecing together the whole product, whereas the latter are not.”
—Geoffrey Moore
A hard truth: for most of today’s analytics deployments, there’s a gap between the promise made to the customer—that compelling value proposition that causes them to buy—and the ability of the shipped product to fulfill that promise.
For example, analytics is often pitched and sold on ambitious promises like enabling predictive maintenance and creating 15-20% energy savings—while most of today’s deployments only realize a portion of that vision.6 We’re pitching a mature potential that the actual product can’t hit consistently.
For the visionaries in the early market—like those early-adopting building owners who simply build a whole team around the tool—this gap is no big deal. They’re trying to change the game, so they don’t mind pitching in to fill it in.
For pragmatists on the other side of the chasm, this gap is a dealbreaker. They call BS. They need proof you’ve closed it before they jump in with two feet. They need proof of what Moore calls a “whole product”, meaning you’re bringing everything required to resolve their pain. Frankly, most of us aren’t coming anywhere close to wholeness.
Before we talk about the whole product, let’s talk about where most of the market is at. I’ve been calling this Analytics 1.0.
Let’s walk through each piece of the pie from the perspective of a pragmatist. To illustrate my point, I’m going to try and speak from the pragmatist’s cynicism. If this were a pizza, the pragmatist is the guy asking where the hell the other half is.
While the real estate technology laggards or late majority might not be ready for a SaaS product, I think the pragmatists accept this as the way software is purchased nowadays.
But even if the vendor provides all the normal SaaS stuff (customer support, training, regular updates, bug fixes, etc.) there are many ways Analytics 1.0 falls short of today’s mainstream SaaS expectations:
For the first chasm, none of these issues is a dealbreaker. For the second, each will be.
Centralization is all about pulling the data out of siloed building systems and unlocking it for analysis. While this is essentially table stakes for analytics, here too there are ways the pragmatist looks skeptically at 1.0.
Analytics 1.0 is notorious for pretty visualizations, charts, dashboards, and graphics. And while they’re table stakes too, the mainstream market needs more than pretty screens to drive value.
Building analytics is different than business analytics, where Tableau and PowerBI visualizations drive massive value for other industries. Someone needs to go turn a wrench, and the distance between the pretty chart and the wrench getting turned is too far in 1.0.
Detecting faults goes one step further than visualization, but as I detailed in my whitepaper Building Analytics Showdown, it doesn’t go far enough. As a result, it must be propped up by the aforementioned engineering service provider.
In analytics 1.0, detection can also be riddled with data issues and false positives, which damages the trust of end-users. The worst detection-only products flood users with too many unprioritized and unimportant issues—which are essentially alarms on abnormal measurements not too far outside of what their existing systems could do.
In order to realize the customer’s compelling value proposition, these products need to be packaged with “man-analytics” services to investigate, prioritize, and diagnose faults. Again, this is a product problem, and it’s a massive barrier to scale.
Do you see how looking skeptically at this technology from the eyes of the pragmatist adds a dose of reality? Sorry if I’m hurting feelings, but we’re only just getting started. Let’s now complete the pie.
The whole analytics product needs to make up for those shortcomings, plus add the bottom half.
Let’s walk through the rest of the pie.
The ideal tool produces insights that fully diagnose and prioritize each issue automatically, which minimizes the human effort required to produce results. Most of today’s tools lie somewhere in the middle of that ideal and the worst-case scenario described in the previous section.
As I’ve covered extensively, advanced supervisory control goes hand in hand with analytics, but that’s not what I’m talking about in this section. I’m bullish on ASC, but I think it’s on a different adoption curve from analytics.
What I think analytics 2.0 needs to have covered is a smooth path to controls changes. If the product and the owner are ready for ASC, great. If not, when the software detects a change that’s needed in the underlying systems, there needs to be a reliable and timely process for getting changes made.
Analytics 1.0 has been notorious for leaving this for someone else to figure out. With a whole product, it’s part of the plan from day one.
This is, in my opinion, the biggest shortcoming of Analytics 1.0. When we say “integration”, we’re talking about integrating analytics into existing systems. But the second, and equally important, type of integration is into existing teams of humans. Analytics needs to be integrated into the ways work gets done in the organization.
A classic example of this lack of integration is with FDD analytics. There’s a disconnect between FDD software that produces potential work orders and the software that O&M teams use to manage work orders, the CMMS. And then there’s the disconnect between the capital projects that the FDD software identifies and the organization’s capital project development process. An external (m)analytics service provider bridges both of these gaps, often through a monitoring-based commissioning (MBCx) process.
How about PropTech’s newest darling, occupancy analytics? There’s a disconnect between space allocation workflows and spatial analytics software. In most cases, these two applications are separate, missing the opportunity to be able to validate asks for additional space by a business unit against how business units are currently utilizing existing space.
Again, if I’m a pragmatist buyer, I wouldn’t want to fill these gaps on my own. But let’s not stop there. Workflow integration is also the key to solving the fragmentation problem on the buyer side. In order to make it spread through fragmented organizations, the product needs to make it easy for adjacent stakeholders to adopt it into new workflows.
For example, the folks responsible for capital planning care about HVAC performance too. The folks buying the building do too. Same data, but new stakeholders with new questions for that data.
Traversing inside the organization is a great segue to the next requirement for Analytics 2.0: making a business impact. From the standpoint of an executive team, Analytics 1.0 has been stuck in vitamin territory: optimizing a cost center.
The typical 1.0 business case is based on creating energy savings. This has value in year 1, but the whole product must continue to produce value year over year to justify the operating expense. That means traversing to new workflows beyond those that save energy.
But what if you can’t even get to year 1 because the benefit just isn’t enticing enough? Eventually, Analytics 2.0 needs to be a painkiller by improving workflows that affect the top-line revenue of the organization’s business. And that might mean putting energy savings use cases on the back burner at first.7
I think we’ve barely scratched the surface here.
So where do we go from here? With all the different stakeholders needing insights from building data, and all the diversity in today’s connected building systems, here’s the kicker: no one is going to create the whole product on their own.
For all the above gaps to be overcome, today’s products must be augmented by a variety of services and ancillary products to become the whole product that pragmatist buyers need and want.
And according to Crossing the Chasm, these buyers don’t want to be the ones pulling all these pieces together. They fear the hidden, full price they must pay to get the thing to produce the ROI they want—and they want to see the full picture before they buy.
I think the ones that blast out of the early market into the mainstream need to act now on this reality. They’ll navigate consolidation, partnerships, native integrations, and set up platform marketplaces so that the pragmatist building owner doesn’t have to piece everything together on their own.8
Thanks for reading,
James
P.S. Now I want to hear from you: Where am I wrong? Where do we go from here?
P.P.S Thanks to Tyson Soutter, Will Coleman, and Alex Grace for giving me feedback on pieces of this.
If you’re wondering what else was in my mom’s stack, it was The Five Invitations, The Effective Executive, and a cookbook.
I know people get hung up on the word “analytics”. I use it because we don’t have a better one, in my opinion. EMIS is the second-best choice. For why I use analytics instead, I outlined my problems with it here.
It’s not just LinkedIn… I think the dizzying array of 100+ analytics-related companies on the Nexus vendor landscape back me up here.
Alex’s feedback that I’ve yet to incorporate: I don’t see it as a problem per se if the service provider is running the tech…to me it’s more of a question of what the status quo workflows are today. In other words if you have a low skilled maintenance guy/janitor on site, and you have robust service contracts for mechanical and controls, I don’t have a bone to pick with that business model, probably makes a lot of sense for an 80k sq. ft. random office bldg. But the only way for this type of market – let’s say “mid-market” to cross the chasm is for those service contracts to transform.
When I say “Product” from here on out, I’m not talking about one company creating one product to rule them all. A better word for us might be whole solution.
The recently-published results of the Smart Energy Analytics Campaign (SEAC) back me up here. Just look at how inconsistent the savings were across organizations.
I can’t believe I just wrote that.
And while they’re at it, I hope they’ll collaborate to solve the one piece of the pie that is holding EVERY vendor back from becoming part of a whole product: getting the data and making sense of it. I hope they’ll realize we’re better off solving that together and giving the pragmatist confidence that the market is ready for their investment.
Despite this murky spot we’re in, I remain just as bullish on this second chasm as I was on the first when I adopted analytics as a service provider almost ten years ago. The real estate industry is shifting under our feet and all signs are pointing towards analytics as a solution to today’s top issues for building owners:
In other words, building owners have new problems that analytics can uniquely solve. But not just any analytics—it will require the right approach. Back to those LinkedIn comments and conversations: Most opinions on where we go from here seem to think today’s average analytics product is ready to go to scale.
As one commenter said, “the software performance is a given”. I wholeheartedly disagree.
I think those that create the right product will see the market develop and open up for them. Those that try to take the wrong product mainstream are going to get stuck. And money spent propping up the wrong products, like with legislation or MBCx incentives, will be wasted. It’s aimed at the wrong chasm.
So let’s now go back to the book. A useful framework for thinking about the right analytics product is Geoffrey Moore’s Whole Product concept.
“The single most important difference between early markets and mainstream markets is that the former are willing to take responsibility for piecing together the whole product, whereas the latter are not.”
—Geoffrey Moore
A hard truth: for most of today’s analytics deployments, there’s a gap between the promise made to the customer—that compelling value proposition that causes them to buy—and the ability of the shipped product to fulfill that promise.
For example, analytics is often pitched and sold on ambitious promises like enabling predictive maintenance and creating 15-20% energy savings—while most of today’s deployments only realize a portion of that vision.6 We’re pitching a mature potential that the actual product can’t hit consistently.
For the visionaries in the early market—like those early-adopting building owners who simply build a whole team around the tool—this gap is no big deal. They’re trying to change the game, so they don’t mind pitching in to fill it in.
For pragmatists on the other side of the chasm, this gap is a dealbreaker. They call BS. They need proof you’ve closed it before they jump in with two feet. They need proof of what Moore calls a “whole product”, meaning you’re bringing everything required to resolve their pain. Frankly, most of us aren’t coming anywhere close to wholeness.
Before we talk about the whole product, let’s talk about where most of the market is at. I’ve been calling this Analytics 1.0.
Let’s walk through each piece of the pie from the perspective of a pragmatist. To illustrate my point, I’m going to try and speak from the pragmatist’s cynicism. If this were a pizza, the pragmatist is the guy asking where the hell the other half is.
While the real estate technology laggards or late majority might not be ready for a SaaS product, I think the pragmatists accept this as the way software is purchased nowadays.
But even if the vendor provides all the normal SaaS stuff (customer support, training, regular updates, bug fixes, etc.) there are many ways Analytics 1.0 falls short of today’s mainstream SaaS expectations:
For the first chasm, none of these issues is a dealbreaker. For the second, each will be.
Centralization is all about pulling the data out of siloed building systems and unlocking it for analysis. While this is essentially table stakes for analytics, here too there are ways the pragmatist looks skeptically at 1.0.
Analytics 1.0 is notorious for pretty visualizations, charts, dashboards, and graphics. And while they’re table stakes too, the mainstream market needs more than pretty screens to drive value.
Building analytics is different than business analytics, where Tableau and PowerBI visualizations drive massive value for other industries. Someone needs to go turn a wrench, and the distance between the pretty chart and the wrench getting turned is too far in 1.0.
Detecting faults goes one step further than visualization, but as I detailed in my whitepaper Building Analytics Showdown, it doesn’t go far enough. As a result, it must be propped up by the aforementioned engineering service provider.
In Analytics 1.0, detection can also be riddled with data issues and false positives, which damages end users’ trust. The worst detection-only products flood users with too many unprioritized and unimportant issues, which are essentially alarms on abnormal measurements not far outside what their existing systems could already do.
In order to deliver the compelling value proposition the customer was promised, these products need to be packaged with “man-analytics” services to investigate, prioritize, and diagnose faults. Again, this is a product problem, and it’s a massive barrier to scale.
Do you see how looking skeptically at this technology from the eyes of the pragmatist adds a dose of reality? Sorry if I’m hurting feelings, but we’re only just getting started. Let’s now complete the pie.
The whole analytics product needs to make up for those shortcomings, plus add the bottom half.
Let’s walk through the rest of the pie.
The ideal tool produces insights that fully diagnose and prioritize each issue automatically, minimizing the human effort required to produce results. Most of today’s tools lie somewhere between that ideal and the worst-case scenario described in the previous section.
As I’ve covered extensively, advanced supervisory control goes hand in hand with analytics, but that’s not what I’m talking about in this section. I’m bullish on ASC, but I think it’s on a different adoption curve from analytics.
What I think Analytics 2.0 needs is a smooth path to controls changes. If the product and the owner are ready for ASC, great. If not, when the software detects a needed change in the underlying systems, there needs to be a reliable and timely process for getting that change made.
Analytics 1.0 has been notorious for leaving this for someone else to figure out. With a whole product, it’s part of the plan from day one.
This is, in my opinion, the biggest shortcoming of Analytics 1.0. When we say “integration”, we’re talking about integrating analytics into existing systems. But the second, and equally important, type of integration is into existing teams of humans. Analytics needs to be integrated into the ways work gets done in the organization.
A classic example of this lack of integration is with FDD analytics. There’s a disconnect between FDD software that produces potential work orders and the software that O&M teams use to manage work orders, the CMMS. And then there’s the disconnect between the capital projects that the FDD software identifies and the organization’s capital project development process. An external (m)analytics service provider bridges both of these gaps, often through a monitoring-based commissioning (MBCx) process.
How about PropTech’s newest darling, occupancy analytics? There’s a disconnect between space allocation workflows and spatial analytics software. In most cases, these two applications are separate, missing the opportunity to validate a business unit’s request for additional space against how business units are currently utilizing their existing space.
Again, if I’m a pragmatist buyer, I wouldn’t want to fill these gaps on my own. But let’s not stop there. Workflow integration is also the key to solving the fragmentation problem on the buyer side: to spread through fragmented organizations, the product needs to make it easy for adjacent stakeholders to adopt it into new workflows.
For example, the folks responsible for capital planning care about HVAC performance too. The folks buying the building do too. Same data, but new stakeholders with new questions for that data.
Traversing inside the organization is a great segue to the next requirement for Analytics 2.0: making a business impact. From the standpoint of an executive team, Analytics 1.0 has been stuck in vitamin territory: optimizing a cost center.
The typical 1.0 business case is based on creating energy savings. This has value in year 1, but the whole product must continue to produce value year over year to justify the operating expense. That means traversing to new workflows beyond those that save energy.
But what if you can’t even get to year 1 because the benefit just isn’t enticing enough? Eventually, Analytics 2.0 needs to be a painkiller by improving workflows that affect the top-line revenue of the organization’s business. And that might mean putting energy savings use cases on the back burner at first.7
I think we’ve barely scratched the surface here.
So where do we go from here? With all the different stakeholders needing insights from building data, and all the diversity in today’s connected building systems, here’s the kicker: no one is going to create the whole product on their own.
For all the above gaps to be overcome, today’s products must be augmented by a variety of services and ancillary products to become the whole product that pragmatist buyers need and want.
And according to Crossing the Chasm, these buyers don’t want to be the ones pulling all these pieces together. They fear the hidden, full price they must pay to get the thing to produce the ROI they want—and they want to see the full picture before they buy.
I think the vendors that blast out of the early market into the mainstream need to act now on this reality. They’ll navigate consolidation, partnerships, and native integrations, and they’ll set up platform marketplaces so that the pragmatist building owner doesn’t have to piece everything together on their own.8
Thanks for reading,
James
P.S. Now I want to hear from you: Where am I wrong? Where do we go from here?
P.P.S. Thanks to Tyson Soutter, Will Coleman, and Alex Grace for giving me feedback on pieces of this.
1. If you’re wondering what else was in my mom’s stack, it was The Five Invitations, The Effective Executive, and a cookbook.
2. I know people get hung up on the word “analytics”. I use it because we don’t have a better one, in my opinion. EMIS is the second-best choice. For why I use “analytics” instead, I outlined my problems with EMIS here.
3. It’s not just LinkedIn… I think the dizzying array of 100+ analytics-related companies on the Nexus vendor landscape backs me up here.
4. Alex’s feedback that I’ve yet to incorporate: “I don’t see it as a problem per se if the service provider is running the tech… to me it’s more a question of what the status quo workflows are today. In other words, if you have a low-skilled maintenance guy/janitor on site, and you have robust service contracts for mechanical and controls, I don’t have a bone to pick with that business model; it probably makes a lot of sense for an 80k sq. ft. random office building. But the only way for this type of market (let’s say ‘mid-market’) to cross the chasm is for those service contracts to transform.”
5. When I say “product” from here on out, I’m not talking about one company creating one product to rule them all. A better word for us might be “whole solution”.
6. The recently published results of the Smart Energy Analytics Campaign (SEAC) back me up here. Just look at how inconsistent the savings were across organizations.
7. I can’t believe I just wrote that.
8. And while they’re at it, I hope they’ll collaborate to solve the one piece of the pie that is holding EVERY vendor back from becoming part of a whole product: getting the data and making sense of it. I hope they’ll realize we’re better off solving that together and giving the pragmatist confidence that the market is ready for their investment.
We’re still in the early days—in what Geoffrey Moore calls the “early market”. The world of innovators and early adopters.
And while we seem to have a consensus on our current location, we aren’t anywhere near consensus on why that is, what to do about it, or how to cross that big scary chasm.
If I may generalize, let’s level-set on what it’s like here📍:
We haven’t created what Moore calls a breakthrough application that produces an order-of-magnitude improvement. Before all of you with successful analytics case studies get upset with me, the exception to this rule is, of course, when we analytics nerds are paired with the product.
In the vast majority of those case studies, we human service providers produce that breakthrough with analytics as a tool in our toolbox. And it’s within these competent, often-external service provider organizations that I think analytics has already crossed a chasm.
But that chasm, and that early majority buyer, is a mirage, folks. Getting across it doesn’t mean it’s gone mainstream with building owners. In this sense, there are really two chasms…
And the second chasm is the key to scale. Crossing the first one is great, but if analytics are not integrated into the day-to-day operations of internal teams—the people that could benefit from the new technology the most—it’ll remain a niche tool operating on the fringes of organizations.
There are three reasons I believe that’s true. First, we only have so many analytics nerds. Look at all the “building performance engineer” job openings out there right now. It’s a bottleneck. Second, when propped up by an engineer, it’s too expensive to get to the majority of the market.
Third, I’ve provided this service for years and still provide it to this day. It’s extremely inefficient because it doesn’t put the onus on the building owner to actually change what they’re doing and how they’re doing it.
In Moore’s terminology, the dependence on service providers allows the technology to be adopted as a “continuous innovation”. But that approach will limit scale and impact4 because it’s neglecting the potential of the tool, which is to transform the operation of the building. It’s disruptive and should be treated as such.