
Contact us for more information about our products and solutions, online presentations, demos and PoC requests.


Contact

İçerenköy Mah. Umut Sok. Quick Tower, Kozyatağı – İstanbul / Türkiye


+90 216 999 1394

Digital Transformation

Industry 4.0: Managing Operational Data

The concept of Digital Transformation rests on three pillars:

  • High processing power (xPU)
  • High-volume storage (Storage)
  • High-speed communication (Communications)

All three technological foundations relate to the data that is generated and processed. In the coming Industry x.0 era, every industry will be built on data and its analysis. To get the maximum benefit from data, we need to collect it reliably, tag it correctly, store it with sufficient volume and performance, retrieve the data that needs analysis quickly, and perform the right analysis on it.
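The collect-tag-store-retrieve cycle above can be sketched with a toy in-memory store. This is a minimal illustration only (the `Reading` and `TinyHistorian` names are hypothetical, not a real product API); it shows how tagged, time-ordered storage makes fast range retrieval possible:

```python
from dataclasses import dataclass, field
from bisect import insort, bisect_left

@dataclass(order=True)
class Reading:
    """A single tagged sensor measurement; ordering compares timestamps only."""
    timestamp: float                     # seconds since epoch
    tag: str = field(compare=False)      # e.g. "plant1.line3.pump7.temperature"
    value: float = field(compare=False)

class TinyHistorian:
    """Minimal in-memory time-series store: writes are kept sorted per tag,
    so range queries reduce to a binary search on timestamps."""
    def __init__(self):
        self._series = {}                # tag -> sorted list of Reading

    def write(self, reading):
        insort(self._series.setdefault(reading.tag, []), reading)

    def query(self, tag, start, end):
        """Return readings for `tag` with start <= timestamp < end."""
        series = self._series.get(tag, [])
        timestamps = [r.timestamp for r in series]
        return series[bisect_left(timestamps, start):bisect_left(timestamps, end)]

h = TinyHistorian()
for t, v in [(0, 20.0), (10, 21.5), (20, 22.1)]:
    h.write(Reading(t, "pump7.temp", v))
print([r.value for r in h.query("pump7.temp", 5, 25)])  # → [21.5, 22.1]
```

A production historian would add durable storage, compression and downsampling, but the core idea is the same: tag every point, keep it time-ordered, retrieve by range.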

Industrial and critical sectors such as Oil & Natural Gas, Energy and Manufacturing, which keep today's world running, use sensor data at high speed and volume, and need the improvements that analysis, artificial intelligence and machine learning of this data can provide. The intensive operational data generated by these facilities should be available both for instant operation and for future planning and optimization needs.

Putting all this together, organizations need to be able to transfer the experience and knowledge their teams and partners gain while managing their facilities into the software environment, and analyze it first-hand.

Industry 4.0 is driven by the digitization of production tools, with more sensors and data than ever before. Time series data opens new horizons both for business and for the development of new services.

The industry transformation driven by digital technologies is forcing us to rethink a value chain that is becoming more technically complex. Beyond the production process, it is profoundly changing relationships with both customers and suppliers. And operational data is playing a critical role at the crossroads of productivity, quality, cost control, the development of new services and competition.

This evolution is based on the integration of sensors and measurements at all levels. As a result, the data generated in operation can be used for anomaly detection, predictive maintenance plans and optimization, reducing downtime and production losses.
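As a concrete illustration of anomaly detection on operational data, a rolling z-score check (a generic statistical technique, not a method named in this article) can flag readings that deviate sharply from their recent history:

```python
from statistics import mean, stdev

def rolling_zscore_anomalies(values, window=20, threshold=3.0):
    """Flag indices whose value deviates more than `threshold` standard
    deviations from the preceding `window` samples. Window size and
    threshold are illustrative defaults, not tuned recommendations."""
    anomalies = []
    for i in range(window, len(values)):
        ref = values[i - window:i]
        mu, sigma = mean(ref), stdev(ref)
        if sigma > 0 and abs(values[i] - mu) / sigma > threshold:
            anomalies.append(i)
    return anomalies

# A steady signal with one spike at the end: only the spike is flagged.
vals = [10.0 + 0.1 * (i % 3) for i in range(30)] + [25.0]
print(rolling_zscore_anomalies(vals, window=10))  # → [30]
```

Real predictive-maintenance pipelines use far richer models, but even this simple check shows how continuous sensor streams can be turned into downtime-reducing alerts.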

Sensors have always had a special place in industry. Measurements of voltage or power consumption, pressure and temperature – along with their displays – are an integral part of the industrial world. Over the years, measurements have been concentrated on production or operational monitoring stations. Today, sensors are no longer limited to simple measurements of a few variables. They reflect the gradual and continuous digitalization of machines, systems and technical processes.

The industrial sector is just emerging from a long period of sometimes heavy computerization, with the generalization of ERP systems. The proliferation of sensors is paving the way for a transformation towards the Industry of the Future. The common thread, better known internationally as “Industry 4.0”, is data, which now has a strategic value.

The concept of “Industry 4.0” was initiated by Germany in 2011. Initially limited to industrial production, this concept mainly covered the factories of the future. Today, it generally encompasses everything related to the transformation of industry with the contribution of digital technologies.

The challenges for manufacturers go beyond simple technical or technological development of the means of production.

They combine two historically contradictory trends in industry: on the one hand, the drive to reduce production hazards, whether caused by interruptions in the supply or logistics chain or by the unavailability of production resources; on the other, competitive pressure for ever more personalized products, for ever more demanding customers in terms of service, in a context of ever more restrictive rules on energy, environment, safety and so on.

Key components of Industry 4.0

In 2015, the Boston Consulting Group formalized the nine pillars that define the transition to Industry 4.0. This widely used representation highlights the central role of data processing and analysis. Two other pillars are directly linked to it: the Industrial Internet of Things (IIoT) and simulation with Digital Twins. We can also add the "Horizontal and vertical system integration" pillar, which highlights the need for a technical organization of data that allows this type of integration.

Sensors, the foundation of IIoT, play a catalytic role in this evolution. These conditions dictate how manufacturers respond to the twofold constraint mentioned above:

  • Reducing downtime: machines and systems are becoming increasingly complex, with a strong penetration of electronics and technical software that optimizes each component, and the system as a whole, based on continuous sensor measurements. Just as in a car full of electronics, mechanical control alone is no longer enough. The sensors themselves make it possible to meet higher operating-quality requirements and to detect faults earlier. Both internal users and customers want better control, and to avoid being subject to the vagaries of complex systems.
  • Responding to the challenges of personalization in a more constrained competitive and regulatory context: everything is measured, both to accurately express the interdependencies of systems and to monitor everything that can be monitored, reducing the burden and pressure on people as much as possible.

Most of the other pillars of Industry 4.0 can be thought of as components that can be added incrementally. But the more complex horizontal and vertical system integration remains a technological and organizational challenge for transformation.

There are two reasons for this:

  • The systems are based on different levels of functional and organizational responsibility resulting in siloed information systems. Integrated architectures may be attractive in principle, but are technically difficult to implement and organizationally complex with financial implications (who pays for what?).
  • This verticality in silos implies strong technical constraints on the organization of data. Cross-referencing different data sources may look easy, but in practice it is tedious and technically difficult because of the heterogeneity of formats and of the real meaning of each data source.

In industry, these constraints are even more pronounced as the digital environment encompasses two groups that are still very far apart: the company’s information systems (or management information systems) on the one hand and its operational systems on the other.

The distinction between business systems and operational systems

The need for a global operational and financial vision and the constant search for process optimization have led to the proliferation of ERP systems, which are increasingly embedded in most industrial companies, even more than in other sectors. ERP systems cover purchasing, asset management, production management, maintenance management, sales, accounting and human resources with specialized modules. ERP systems therefore cover the whole of IT (Information Technology).

On the side of the machines and technical systems that have to be produced and then operated, we move into another world: industrial IT, or OT (Operational Technology). The control and command of machines and systems require specialized devices that adapt to a wide range of constraints.

In the case of a factory, the whole chain is more complex. It can be represented as a layered hierarchy: ERP and MES at the top, SCADA in the middle, and the PLC, DCS and RTU devices closest to the machines at the bottom.

More globally, the connection between the world of IT systems and the world of machines and technical systems (OT, Operational Technology), be it the factory itself or the machines or systems in production, is established at the SCADA (Supervisory Control and Data Acquisition) level. SCADA is the central component that controls one or more systems, machines or technical devices, based on the information they send back and on the orders and commands it receives from other applications or from operators, who work through a human-machine interface (HMI) for this purpose.

In conjunction with SCADA, a Data Historian, which records local data streams, is increasingly used. Analysis tools can be associated with Data Historians and used to answer basic questions about the operating statistics of machines, systems, or any device with sensors, such as valves. This data can also be fed back, fully or partially, to a centralized level.
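The kind of basic operating statistic such analysis tools answer can be sketched in a few lines. This is an illustrative example only (the function name and log shape are assumptions): it derives start counts and total running time from a historian-style chronological state log:

```python
def operating_stats(state_log):
    """Compute basic run statistics from a chronological state log of
    (timestamp, is_running) samples, as a Data Historian might record
    for a pump or valve. Returns (start events, total running time)."""
    starts, runtime = 0, 0.0
    last_t, last_on = None, False
    for t, on in state_log:
        if last_t is not None and last_on:
            runtime += t - last_t        # accumulate time spent running
        if on and not last_on:
            starts += 1                  # off -> on transition
        last_t, last_on = t, on
    return starts, runtime

# Two run intervals: 10-50 and 60-90 -> 2 starts, 70 time units running.
log = [(0, False), (10, True), (50, False), (60, True), (90, False)]
print(operating_stats(log))  # → (2, 70.0)
```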

SCADA can control several machines and systems. Depending on the situation, the connection to each of them can be direct or pass through other decentralized devices (PLCs, DCSs or RTUs). In some cases, these devices are integrated directly into the machine or system for operational control and monitoring.

The connection between the SCADAs controlling technical systems and IT is rarely direct. In general, it passes through software that manages production: its processes, its programming and its overall operational management. In the case of a factory, the MES (Manufacturing Execution System) has a special place in the scheduling of tasks and the overall organization of production lines.

In other words, the IT resources of an industrial player organized around its ERP and the world of machines with their control systems and automation have every reason to be distant from each other.

But the integration of IT and OT worlds is inevitable. In an increasingly competitive industrial context, two requirements make it necessary: on the one hand, the need to respond to operational constraints more effectively than ever before, and on the other, the transformation of customer service.

Operational Data Management is the key to IT/OT convergence

Besides its fragmented nature, OT comes with another feature that must be managed: time series data. Time series data is a critical part of the IT/OT integration effort, and it differs from the data commonly found in IT systems. It also varies over time, which leads to a few more complications. Overall, data managers need to solve two specific problems:

  • We cannot always interpret time series data as it is; contextual information needs to be added. If a temperature reading says "30°C", managers cannot know whether this is good or bad unless they have information about the equipment that generated the data point (is it a refrigeration unit or an oven? What are its specifications?).
  • Time series data often comes with quality issues, usually due to limitations of the sensors and the associated infrastructure, resulting in signal dropout, drift, spikes, stale data, and so on.
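A minimal sketch of how such quality issues might be flagged, assuming a simple list of (timestamp, value) samples; the flag names and thresholds are illustrative only:

```python
def quality_flags(samples, expected_interval, stale_after=5):
    """Tag each (timestamp, value) sample with basic quality flags:
    'gap'   - more than `expected_interval` elapsed since the previous sample
              (a possible signal dropout)
    'stale' - value unchanged for `stale_after` or more consecutive samples
              (a possibly frozen sensor)"""
    flags, unchanged = [], 0
    for i, (t, v) in enumerate(samples):
        f = set()
        if i > 0:
            prev_t, prev_v = samples[i - 1]
            if t - prev_t > expected_interval:
                f.add("gap")
            unchanged = unchanged + 1 if v == prev_v else 0
            if unchanged >= stale_after:
                f.add("stale")
        flags.append(f)
    return flags

# A dropout between t=2 and t=5, and a value frozen at 1.0 throughout.
samples = [(0, 1.0), (1, 1.0), (2, 1.0), (5, 1.0), (6, 1.0), (7, 1.0)]
print(quality_flags(samples, expected_interval=1, stale_after=3))
```

Detecting drift reliably needs a reference signal or a model, which is why this kind of check is only the first layer of data quality management.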

The best way to manage these problems is to create a "trusted data layer" into which all OT historian data is fed for analysis. Here, algorithms enrich the data with contextual metadata while addressing basic data quality management. Configured correctly, this layer also enables data-chain governance (from sensor to automation controls) as well as monitoring and support functionality.
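The enrichment step of such a trusted data layer can be sketched as a lookup against an asset registry. The `ASSETS` registry and its fields are hypothetical, but they show how context turns a bare "30°C" into an interpretable reading:

```python
# Hypothetical asset registry: tag -> context needed to interpret raw values.
ASSETS = {
    "cooler1.temp": {"type": "refrigeration unit", "unit": "°C", "max_ok": 8.0},
    "oven3.temp":   {"type": "oven",               "unit": "°C", "min_ok": 150.0},
}

def enrich(tag, value):
    """Attach contextual metadata to a raw reading and evaluate it
    against the asset's acceptable range."""
    meta = ASSETS.get(tag)
    if meta is None:
        return {"tag": tag, "value": value, "status": "unknown asset"}
    ok = (value <= meta.get("max_ok", float("inf"))
          and value >= meta.get("min_ok", float("-inf")))
    return {"tag": tag, "value": value, **meta,
            "status": "ok" if ok else "out of range"}

# The same 30°C is alarming for a refrigeration unit...
print(enrich("cooler1.temp", 30.0)["status"])  # → out of range
# ...while 4°C is perfectly fine for it.
print(enrich("cooler1.temp", 4.0)["status"])   # → ok
```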

A real-time data infrastructure is required to achieve operational goals and support business initiatives. The Cerrus Analytics Platform provides this enterprise-wide infrastructure and enables users to analyze and aggregate real-time, historical and future data and events into user-defined actionable insights, all in real-time.

Cerrus Analytics Platform features:

  • High Connectivity: A true data infrastructure solution needs to connect to a wide variety of systems. Without high connectivity, data cannot be centrally aggregated for in-depth analysis, timely visualization and broad reporting.
  • Effective Accessibility: For data to be valuable, it must be accessible to the people and systems that need it. This access should be easy and fast.
  • Easy Integration: Data should always be accessible to the right people and to the right third-party systems, so that seamless integrations extract more value from your critical process data.
  • Wide Scalability: With the rapidly increasing amount of data that IT solutions must handle, a robust, scalable system is needed: one that can deliver all data to all users in real time, even down to sub-second resolution.
  • High Performance: A high-performance solution that delivers fast and reliable results to every end user.
  • Flexibility: Every company has different metrics, legal and reporting requirements, and processes. A flexible data infrastructure gives you full control to define your own analyses and encode your own intellectual property.
  • Secure Data Transfer: In today's volatile environment, cyber security is a concern for every organization, and your data is a vital business asset that must be protected in transit.