While most contractors today use digital tools to collect and track project information, many lack consistent, reliable processes for documenting it. This limits their ability to identify issues, improve processes, and positively influence outcomes.
The question is no longer whether contractors are using technology, but how they use it to improve data standards and processes.
A recent survey from Autodesk and Dodge Data & Analytics looked into standards and processes in the construction industry. Interestingly, almost twice as many general contractors (62%) as trade contractors (36%) said that capturing data was already part of their standard practice. However, top specialty contractors like McKinstry and Rosendin are raising the bar for setting data standards across the industry.
To learn how top specialty contractors are standardizing, we asked four construction thought leaders and experts from McKinstry and Rosendin to provide their insight:
- Brian Peguillan, Product Manager, McKinstry
- Dace Campbell, Director of Product Management in Construction, McKinstry
- Fred Meeske, Vice President, Rosendin
- Dr. Jad Chalhoub, Technology Solutions Implementation Lead, Rosendin
Dive into their insights below.
What do you think are the main benefits of standardizing construction data and processes?
Brian: The primary benefit of standardization is ensuring that everyone throughout the organization is operating on the same page. Everyone knows where to find the most up-to-date information and how fresh that information actually is. People can move from job to job without having to relearn an entirely new process. Keeping everyone on the same page prevents costly mistakes, keeps morale high with your crews, and is good for the overall health of the project. Armed with standardized information, leadership can get an accurate picture of how a project is performing and make well-informed decisions for their business.
Jad: It is important to remember that standardization in general, and data standardization especially, is a tool and not a goal. Data standardization serves an important purpose in enabling comparisons when repeated measurements with single-variable changes are essentially impossible, which makes it uniquely valuable in construction.
It’s a well-known fact that no two projects are alike, and no two tasks are perfectly repeatable: personnel change, experiences grow, and elements surrounding the task or project change all the time.
Standardization has provided several benefits for Rosendin, depending on where it was applied. For example, standardizing the data points we are collecting allows us to create reference points to compare against, especially when integrating new processes or comparing performance across several projects. In construction, even with the same project, the data is rarely static and directly comparable, so standardizing the collected data points enables a way to compare a project’s outcome against its goals, as well as compare it with other projects.
Dace: Standardizing our data and processes contributes directly to minimizing waste and rework and ultimately helps us mitigate risks, avoid write-downs on projects, and maintain our company’s financial health in a challenging market. It enables us to maintain fluidity and support turnover of our staff across our business operations, minimizing downtime associated with onboarding and re-orientation of labor crews new to any specific jobsite. Additionally, it allows us to consistently analyze and report on project performance across projects, markets, and geographies, empowering our leaders to stay abreast of overall profitability and focus resources in the right areas to shore up any projects that may need extra support.
What do you think are the most challenging aspects of collecting standardized data on projects?
Brian: One challenging aspect is the massive amount of variation not only between different businesses but also between different project teams. Each individual project manager has their own way of running work, and changing that can seem more difficult than just maintaining old processes. Different lines of business each have their own unique requirements that may not overlap with others. The biggest issue is that, as a subcontractor, we are often at the whims of the GC, making any standardized process seem more like double work.
Jad: Standardizing data types is more challenging, but it also allows us to connect data from our different software and sources. Having a single point of access to all our data allows for easier reporting and advanced data analytics.
One of the main challenges of collecting standardized data is figuring out what we want to use the data for, and subsequently what data needs to be collected and to what accuracy. Different types of applications require different tolerances and collection methods, so understanding the use case is extremely important.
It is also important to remember that standardization is merely a tool, one that is not usable in all occasions: while we can apply the same standardization techniques to the data we collect, it would not necessarily make the data comparable. For example, labor/non-labor splits in a project are standardized metrics, but comparing those splits in different sectors, size projects, or geographical locations might not be suitable. There’s always another layer you can standardize to, and it’s important to know when to stop and how to use the other tools in your toolbox.
Dace: Despite being a specialty contractor often required to follow the standard tools and processes required by the general contractor, we are striving to standardize the way we store and manage project data in the cloud. Currently, our data is stored on too many platforms, in too many formats, and in too many ways. We are challenged to develop a unified experience for our many project teams.
We aim to streamline that to store data in just two environments, leveraging BIM 360 (and Autodesk Construction Cloud) for our models and drawings while simultaneously leveraging Microsoft SharePoint (and Teams) for other project data (text documents, spreadsheets, PDFs, etc.). In this way, we hope to leverage each platform for the data controls and analysis of the file/data types best associated with those platforms. We are further challenged to standardize this information in a way that serves the needs of both current projects with “live ammunition” and historic project data, where data must be standardized in an archive for rapid search, retrieval, and analysis.
How are you currently implementing data standards on your projects?
Fred: Currently, Rosendin collects a lot of standardized data for both active projects and as metrics for measuring the performance of processes. These metrics allow us to understand the performance of projects relative to their stated goals. For example, whether we will hit our markup on a certain project, or how a project compares to historic capabilities.
There are countless examples of the use of data throughout Rosendin: within the BIM department, it has enabled the use and implementation of new conduit routing software, hanger placement processes, and interdepartmental reviews and discussions. One of the main differentiators that Rosendin has pioneered is connecting multiple data sources into a single funnel, giving us insights that span multiple departments and business units, which would previously have been impossible.
Brian: We currently collect Earned Value data (self-reported and estimated using tools in conjunction with earned value workbooks), financial data (captured from hours reported and purchases), and daily reporting impact data on our projects (captured through proprietary daily reporting tools for field and detailers). It is worth noting that each individual business unit has its own methods for EV.
Dace: We have a standardized work-breakdown structure (WBS) and phase codes that support standard data input, collection, analysis, and reporting. We are developing a “model data tracker” tool that allows us to track the quality and quantity of data in our documents as we move from the design to the detailing phases of our work, and to compare the scope of work to the original estimate to verify that any discrepancies relate to the level of detail rather than an increase in scope. We are also working to unify our field reporting, relating and leveraging the same data across:
- planning and forecasting work at the beginning of a shift (pre-task planning)
- reporting what was accomplished at the end of a shift (daily reporting)
- comparing the two and exploring root causes for any discrepancies (planned percent complete)
- associating work in place to progress payments (earned value tracking)
To truly unify reporting in this way, we are leveraging our own in-house software engineers to develop a standard mobile app that collects and integrates data from these tools.
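To make the metrics mentioned here concrete, below is a minimal sketch of the planned-percent-complete and earned-value arithmetic that standardized shift data supports. The task names and dollar figures are hypothetical illustrations, not Rosendin's or McKinstry's actual data or tooling:

```python
# Sketch of planned percent complete (PPC) and earned value (EV) metrics
# computed from standardized daily-report data. All tasks and numbers
# below are hypothetical examples.

def planned_percent_complete(tasks):
    """PPC = completed planned tasks / total planned tasks for a shift."""
    planned = [t for t in tasks if t["planned"]]
    done = [t for t in planned if t["completed"]]
    return len(done) / len(planned) if planned else 0.0

def earned_value(budget_at_completion, percent_complete):
    """EV = budgeted cost of the work actually performed."""
    return budget_at_completion * percent_complete

# One shift's pre-task plan compared against its daily report.
shift = [
    {"task": "pull feeder conduit", "planned": True, "completed": True},
    {"task": "set panel hangers",   "planned": True, "completed": False},
    {"task": "terminate panel A",   "planned": True, "completed": True},
]

ppc = planned_percent_complete(shift)   # 2 of 3 planned tasks finished
ev = earned_value(120_000, 0.45)        # 45% complete on a $120k scope
cpi = ev / 50_000                       # cost performance index vs. $50k actual cost

print(f"PPC: {ppc:.0%}, EV: ${ev:,.0f}, CPI: {cpi:.2f}")
# → PPC: 67%, EV: $54,000, CPI: 1.08
```

Because the same task records feed both the pre-task plan and the daily report, the discrepancy between them (the incomplete hanger task above) becomes a queryable data point rather than an anecdote.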
If you are using standardized data and processes on your projects, has it helped you unlock any valuable analytics or insights? What are some best practices for teams starting to build better data standards and processes?
Fred: First, Rosendin collects standardized metrics about its internal processes, especially when considering changing them or updating them using new tools and software. In general, it is easy to fall for “the magic trick,” or the cool factor behind new solutions being introduced. Having standardized metrics allows for more direct, analytical, and objective analysis and feedback, allowing us to continually improve and quantify this improvement.
Secondly, for teams starting to build data standards and processes, it is important to keep two things in mind: first, build an interdisciplinary team that focuses on easily achievable goals. This will enable you to learn while still providing immediate value. More intricate questions with more effort and time requirements can have a bigger impact, but when starting, nothing beats the immediate impact. Once momentum is built, and the questions are well defined, the team can start tackling harder questions.
Lastly, and possibly more importantly, build the metrics to be flexible.
There is a misconception that data is stringent and static, but in reality, data is very fluid, especially in terms of where it can be collected and generated.
It is essential to connect as many distinct data pieces together as possible to enable as much interconnectedness and deeper insights in the future. Always remember that good foundations will save you significantly more time and effort in the long run than having to repeat things multiple times.
Brian: In many cases, yes. The data we collect provides early and ongoing indicators of project health and identifies potential risk factors. We are currently in the process of identifying existing standards and processes and understanding pain points. After that, we can start to look at potential areas of improvement, standards creation, and solution ideation.
Dace: Our internal Product Management Organization (PdMO) is a key resource to the success of these standardization efforts. Best practices championed by our PdMO include a “diagnose before you prescribe” approach, with a thorough discovery process to identify pain points, root causes, and impacts of not having data standards. By qualifying and quantifying this pain in terms of business value, we can articulate and prioritize the need to standardize our data among other important process and technology improvements across our enterprise.