How companies are using data to optimize manufacturing operations

Digitization efforts across manufacturing have created new opportunities with digital twins, predictive maintenance and cloud systems.
June 20, 2024

This article was first published on SupplyChainBrain.com on February 22, 2023.

Digitization efforts across manufacturing have created new opportunities with digital twins, predictive maintenance and cloud systems. And while these technologies are changing the industry, we have an opportunity to do more if we combine data in sophisticated ways that address the core problems slowing down operations today.

Multiple systems and sensors are providing plenty of operational data, but it’s often challenging to consolidate and combine it with relevant business information to maximize asset availability. In the transportation industry, however, there are opportunities to move beyond predictive analytics, and use data to coordinate operations.

As Livio Mariano, Director of Math and Systems at Altair, noted, “When we talk about digital evolution, smart manufacturing, Industry 4.0 or [the] internet of things, the common factor is data. Whether [it’s] coming from simulations or sensors, today the challenge is not about having data available; it’s more about efficiently and effectively using it.”

The move to the cloud is providing manufacturers with useful intelligence and powerful algorithms at scale, but most of that data is still siloed, not centralized. Many companies are still utilizing an on-premises primary data warehouse system, but are moving pieces of the data stack to a cloud-based data warehouse or data lake to support procedures like statistical process control and monitoring based on programmable logic controllers (PLCs).
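As a rough illustration of the statistical process control mentioned above, a monitoring job reading PLC sensor values from a centralized warehouse might flag out-of-control measurements against Shewhart-style limits. This is a minimal sketch; the readings and thresholds below are hypothetical, not drawn from any real system:

```python
import statistics

def spc_limits(readings, sigma=3):
    """Compute simple Shewhart control limits from a baseline run."""
    mean = statistics.fmean(readings)
    stdev = statistics.stdev(readings)
    return mean - sigma * stdev, mean + sigma * stdev

def out_of_control(readings, lcl, ucl):
    """Return indices of readings that fall outside the control limits."""
    return [i for i, x in enumerate(readings) if not lcl <= x <= ucl]

# Hypothetical PLC temperature readings (°C) from a stable baseline run
baseline = [70.1, 69.8, 70.3, 70.0, 69.9, 70.2, 70.1, 69.7, 70.0, 70.2]
lcl, ucl = spc_limits(baseline)

# New readings streamed from the line; the 73.5 spike should be flagged
new_readings = [70.0, 70.1, 73.5, 69.9]
print(out_of_control(new_readings, lcl, ucl))
```

In practice the baseline would come from historical warehouse data and the monitoring would run on each new batch of sensor readings, but the control-limit logic is the same.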

Real-time agility requires combining data from multiple sources to create new insights for use cases like machine learning. Manufacturers are familiar with streaming data today, but its use is nowhere near the saturation point. Everything from supply chain shortages to COVID-19 sick time and weather disruptions has made the simple task of getting shipments from point A to point B complicated and uncertain.

However, companies that consolidate data from asset-based sensors, predictive maintenance algorithms, and ordering and staffing systems (such as enterprise resource planning, supply chain management and human resources software) can use it to respond much more quickly and keep assets operating at maximum efficiency.
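A minimal sketch of what that consolidation enables: once a sensor alert, a parts inventory and a crew roster share asset and depot keys in one place, a single lookup can assemble a service plan. All names and records here are invented for illustration:

```python
# Hypothetical records from three siloed systems, keyed by asset and depot
sensor_alerts = {"loco-42": {"alert": "bearing_wear", "severity": "high"}}
parts_inventory = {"bearing_kit": {"depot": "Sacramento", "qty": 3}}
crew_roster = {"Sacramento": ["crew-A", "crew-C"]}

def plan_service(asset_id, part, alerts, inventory, roster):
    """Combine alert, inventory and staffing data into one service plan."""
    alert = alerts.get(asset_id)
    if alert is None:
        return None  # nothing to coordinate for this asset
    stock = inventory.get(part, {})
    depot = stock.get("depot")
    return {
        "asset": asset_id,
        "issue": alert["alert"],
        "depot": depot,
        "parts_available": stock.get("qty", 0) > 0,
        "crews": roster.get(depot, []),
    }

plan = plan_service("loco-42", "bearing_kit",
                    sensor_alerts, parts_inventory, crew_roster)
print(plan)
```

The point is not the lookup itself but that it is only possible once all three systems land in one queryable store.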

Industrial asset management, especially in transportation, allows companies to showcase the value of real- or near-real-time data in a range of scenarios, from airlines to railroads. Large-scale assets deployed in the past few years generate a tremendous amount of streaming data. But even predictive maintenance can’t tell you everything you need to know to keep equipment on the move.

Consider a locomotive in the U.S. The train engine is generating a ton of data on how the machine is performing, and this data is crucial for predictive maintenance and digital twin scenarios. But what happens when the alert goes off that an engine’s bearings are wearing down? Then the data management problem becomes a coordination problem. 
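The kind of bearing-wear alert described above can come from something as simple as a rolling average over streaming vibration data. The window, threshold and readings below are illustrative assumptions, not real telemetry:

```python
def bearing_alert(vibration_mm_s, window=5, threshold=4.0):
    """Flag likely bearing wear when the rolling-average vibration
    over the last `window` readings exceeds the threshold (mm/s)."""
    if len(vibration_mm_s) < window:
        return False  # not enough history yet
    recent = vibration_mm_s[-window:]
    return sum(recent) / window > threshold

# Hypothetical vibration readings streamed from a locomotive engine;
# the upward trend pushes the rolling average past the threshold
readings = [2.1, 2.3, 3.9, 4.2, 4.5, 4.8, 5.1]
print(bearing_alert(readings))  # True: the last five readings average 4.5 mm/s
```

Production systems would use far more sophisticated models, but even this sketch shows where the data problem ends and the coordination problem begins: the alert itself is the easy part.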

Once you know a piece of equipment is close to breaking down, you want to intervene before you face downtime and have to scramble to service the equipment. You want to do this preemptively because you need to coordinate across multiple data sources to accomplish the task without requiring additional inventory, emergency adjustments to the production line or supply chain, or an engineer to stand by.

Say a company has a large locomotive service center in Sacramento. To successfully repair this engine, it has to have the right parts on hand, requiring a system for ordering. Then the company needs a crew that’s trained to maintain that particular model, and has available capacity. There’s also the task of getting the locomotive there on the right day — and that’s literally a moving target. 

The most efficient approach is to have all of these pieces arrive in Sacramento at the same time. Bringing these components together isn’t a new challenge, but the ability to analyze and optimize each aspect is a new way to minimize downtime and inefficiencies. Anytime that locomotive is not hauling freight, it’s not driving revenue.

But what happens if Sacramento is hit with several days of record storms, and multiple tracks are closed due to flooding? In the past, the railroad would need to wait until things dried out while assets, parts and personnel stood by. But real-time information, integrated into a consolidated system, can give a railroad insight into new solutions.

Potentially, the railroad might discover that if it brings all the pieces together in Los Angeles instead of Sacramento, and have technicians fly or drive to L.A., they can get the job done before the waters recede. But without a combined real-time overview, companies can miss these opportunities to be agile. Ultimately, if a manufacturer can help customers maximize uptime and availability for that locomotive, it can deliver real value. 
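One way to frame the Sacramento-versus-Los-Angeles decision: with consolidated data, picking a service site reduces to minimizing the latest arrival among the locomotive, the parts and the crew. The day offsets below are invented for illustration:

```python
# Hypothetical earliest-arrival times (in days) at each candidate site
# for the three components that must converge for the repair
arrivals = {
    "Sacramento": {"asset": 2, "parts": 1, "crew": 9},   # crew delayed by flooding
    "Los Angeles": {"asset": 3, "parts": 2, "crew": 3},
}

def best_site(arrivals):
    """Pick the site where all components can converge earliest.

    The repair can start only when the slowest component arrives,
    so each site is scored by its worst-case arrival time.
    """
    return min(arrivals, key=lambda site: max(arrivals[site].values()))

print(best_site(arrivals))  # Los Angeles: everything converges by day 3 vs. day 9
```

Real scheduling would weigh repositioning costs and downstream freight commitments too, but without a combined real-time view the comparison cannot be made at all.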

This challenge can impact airlines as much as railroads. Industry data engineers can spend up to 40% of their time building and testing the ingestion part of the data pipeline, according to Ashley Van Name, JetBlue’s general manager of data engineering. “We're a 24/7 operation, and every minute requires information for our crew members to help us get our customers from one place to another efficiently,” she says, estimating that once the company combined its data sources, JetBlue’s data engineers needed to spend only 10% of their time testing, validating and making sure that data from 130 different systems was being pushed into the warehouse.

Today, we’re several years into this “Industry 4.0” transition. We’ve seen many manufacturers move more data from onsite servers to cloud-based services, and move beyond operational data to embrace software-as-a-service enterprise resource planning and finance systems. As companies move more of that data stack to cloud environments, the good news is that SaaS environments make it easier to start centralizing data. This will help maximize efficiency, highlight opportunities and create the type of situational awareness that allows companies to solve complicated problems in new ways.
