At the highest level, the goal of any integration project is the secure transfer of data needed to accomplish business objectives. To achieve that goal, facility managers need to understand their stakeholders and the stakeholders’ objectives.
Successful integration projects start by consulting with these leaders to define their specific needs. This requires developing detailed answers to a number of key questions.
Taking time to answer these questions helps designers and building owners identify "must-have" data and features, develop measurable goals and anticipate challenges. The next step is to define the enterprise relative to those objectives.
Facilitate IT/OT cooperation
Implementing a comprehensive data solution requires integrating the energy, facility and business domains of an organization. Each domain uses different technologies, performance indicators and skill sets as described below:
- Facility domain. The facility domain encompasses all systems that capture data related to the equipment, amenities and other physical assets of a high-performance building. This includes data regarding the operation, access and service of systems such as HVAC, lighting, physical security, utility metering and elevators.
- Energy domain. This domain consists of systems that capture the energy performance of a facility. This involves measuring upstream consumption through electricity, natural gas and water metering. It also includes submetering the downstream systems that consume those utilities, such as air conditioning, heating and restroom fixtures.
- Business domain. The business domain contains all systems related to capital planning, including those that allow an owner to use asset performance data, understand asset health and make informed financial decisions. This domain also includes data systems related to space use and the allocation of associated costs. It could also include integrating the systems that impact the occupant: room booking, digital signage, knowledge management, etc.
Integrating these three domains is a complex process that begins with defining the IT/OT workflow necessary to achieve overall interoperability goals. As part of this process:
- Define what type of data needs to be generated.
- Identify where various data currently reside.
- Put systems in place to generate any additional data needed.
- Specify how the relevant systems will communicate.
- Ensure that data can be transferred securely to where it is needed.
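The workflow steps above can be captured in a simple data inventory that records, for each feed, what is generated, where it resides, how systems communicate and whether the transfer is secured. This is only an illustrative sketch; the class and field names are hypothetical, not taken from any specific integration platform:

```python
from dataclasses import dataclass

# Hypothetical inventory entry describing one data feed in the IT/OT workflow.
@dataclass
class DataFeed:
    name: str            # what type of data needs to be generated
    source_system: str   # where the data currently resides
    protocol: str        # how the relevant systems will communicate
    encrypted: bool      # whether transfer to the destination is secured
    destination: str     # where the data is needed

feeds = [
    DataFeed("zone temperature", "BAS", "BACnet/IP", True, "analytics platform"),
    DataFeed("main electric meter", "utility meter", "Modbus TCP", True, "energy dashboard"),
]

# Flag any feed that would move data without encryption.
insecure = [f.name for f in feeds if not f.encrypted]
```

Walking such an inventory makes gaps visible early: a feed with no source system means new instrumentation is needed, and any entry in `insecure` is a security task before go-live.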
It is important to get IT and OT on the same page early in this process. IT managers are often the first to recognize the potential of networked building systems. They have the knowledge and experience needed to make data meaningful and accessible, as well as to implement effective cybersecurity protocols. However, they typically lack detailed knowledge of the equipment that needs to be connected.
To achieve the organization's networking goals, IT needs to support and be involved with the implementation of OT. However, simply understanding each other's needs and limitations often proves difficult. This is especially true on facility integration projects, where smart building devices are often unfamiliar to IT and require knowledge and experience across electrical, mechanical, cybersecurity and critical infrastructure systems.
Ideally, every project team would include multidisciplinary OT personnel with knowledge and experience in both OT and IT, but this combined skill set can be hard to find. When such multidisciplinary backgrounds and skills aren’t available in-house, a third-party expert can be a valuable addition to the team.
Ultimately, the data flow must be architected in a way that is secure and efficient, while also allowing third-party service providers remote access. Working together, IT and OT can recommend and implement best practices for secure network design. Typically, this will include hardening of systems, detection and protection against malware and ransomware, regular patching and other cybersecurity hygiene, continuous vulnerability management, proper segregation of IT and OT systems and use of demilitarized zones.
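The segregation and demilitarized-zone principle above can be made explicit by writing down which zones may exchange traffic and checking every proposed flow against that policy. The zone names and rules below are illustrative assumptions for a sketch, not a prescribed architecture:

```python
# Illustrative zone policy: OT devices never talk to IT (or remote vendors)
# directly; all cross-domain traffic must traverse the DMZ.
ALLOWED_FLOWS = {
    ("ot", "dmz"),
    ("dmz", "ot"),
    ("it", "dmz"),
    ("dmz", "it"),
}

def flow_allowed(src: str, dst: str) -> bool:
    """Return True if traffic from the src zone to the dst zone is permitted."""
    if src == dst:
        return True  # traffic within a zone is outside this policy's scope
    return (src, dst) in ALLOWED_FLOWS
```

Under this policy, a third-party service provider lands in the DMZ and reaches OT only through brokered connections there, never directly.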
System integration makes it possible to generate the operational data that managers need to achieve their sustainability goals. Many companies assume that if they have data, they can perform an effective analysis. But few companies have reliable access to high-quality data at the point of decision-making.
Much of the operational data available today is low quality. Data entry errors, a lack of business rules defining acceptable data values and problems migrating data from one system to another result in malformed, duplicative and missing data. For example, the thermostats in a building may be programmed to report "room temperature," while the algorithm used to monitor temperature searches for "zone temp." As a result, attempts to analyze temperature fail.
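One common mitigation for this naming mismatch is a normalization layer that maps vendor point names to a single canonical vocabulary before analysis runs. A minimal sketch, assuming a hand-built alias table (the names shown are hypothetical):

```python
# Hypothetical alias table mapping raw vendor point names to one canonical name.
ALIASES = {
    "room temperature": "zone temp",
    "room temp": "zone temp",
    "zone temperature": "zone temp",
}

def normalize(point_name: str) -> str:
    """Return the canonical point name, falling back to a cleaned-up original."""
    key = point_name.strip().lower()
    return ALIASES.get(key, key)
```

With this layer in place, a thermostat labeled "Room Temperature" and an algorithm searching for "zone temp" resolve to the same point instead of silently missing each other.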
To be valuable, operational data needs to be carefully managed from creation through extraction, translation and consumption. Data modeling is an important aspect of this process. Data modeling involves defining data entities (the objects tracked), their attributes and their relationships. The first step is to give the data meaning.
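Entities, attributes and relationships can be made concrete with a small sketch. The classes below are hypothetical, shown only to illustrate the modeling step of giving data meaning:

```python
from dataclasses import dataclass, field

@dataclass
class Point:
    """A data entity: one monitored value, with attributes describing it."""
    name: str
    unit: str

@dataclass
class Equipment:
    """A data entity related to its points (a one-to-many relationship)."""
    name: str
    points: list = field(default_factory=list)

# An air handler tracked as an entity, with one related point.
ahu = Equipment("AHU-1")
ahu.points.append(Point("discharge air temp", "degF"))
```

Even this trivial model answers questions raw telemetry cannot: which equipment a reading belongs to, and what unit it carries.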
Because of the vast quantity of data being generated, establishing and enforcing a consistent tagging system can be challenging. Open source initiatives like Project Haystack streamline working with internet of things data by standardizing semantic data models. Working within Project Haystack naming conventions and taxonomies eliminates the need to maintain rigid schemas.
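In Project Haystack, meaning is attached through tags rather than a rigid schema. The sketch below uses several real Haystack marker tags (`site`-level modeling aside: `point`, `sensor`, `zone`, `air`, `temp`, plus an `equipRef` relationship) on plain dictionaries; the identifiers and structure are simplified assumptions, not a full Haystack implementation:

```python
# A zone air temperature sensor described with Haystack-style marker tags.
point = {
    "id": "p-101",                 # hypothetical identifier
    "dis": "Floor 2 Zone Temp",    # display name
    "point": True,                 # marker: this record is a point
    "sensor": True,                # marker: a sensed (not commanded) value
    "zone": True, "air": True, "temp": True,  # markers: zone air temperature
    "equipRef": "ahu-1",           # relationship to its parent equipment
}

def matches(rec: dict, *tags: str) -> bool:
    """Return True if the record carries all of the given marker tags."""
    return all(rec.get(t) is True for t in tags)
```

An analytics rule can then find every zone temperature sensor with `matches(point, "zone", "temp", "sensor")`, regardless of how each vendor spelled the display name.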
Next, asset classification standards allow the team to define building elements as major components common to most buildings. Subject matter experts can provide common threads linking activities, costs and participants in a building project from initial planning through operations, maintenance and disposal. UniFormat, OmniClass and MasterFormat are three widely used classification systems for the construction industry. Modeling operational data in this way allows teams to normalize the data so that many systems can onboard it effectively and efficiently.
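Once assets carry a standard code, classification can be applied programmatically. The sketch below attaches UniFormat-style element codes to asset types; treat the specific codes and their meanings as illustrative assumptions to verify against the published standard:

```python
# Hypothetical mapping of asset types to UniFormat-style element codes.
UNIFORMAT = {
    "boiler": "D3020",         # heat generating systems (illustrative)
    "chiller": "D3030",        # cooling generating systems (illustrative)
    "lighting panel": "D5020", # lighting and branch wiring (illustrative)
}

def classify(asset_type: str) -> str:
    """Return the element code for an asset type, or flag it for review."""
    return UNIFORMAT.get(asset_type.lower(), "UNCLASSIFIED")
```

Normalizing assets to a shared code this way lets capital planning, maintenance and analytics systems onboard the same inventory without bespoke mappings for each tool.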