
We're all familiar with the time value of money — money that shows up today is effectively worth more than the same amount of money arriving at some point down the road. Information has a time value, too: Insights that are delivered substantially faster than you're used to can be a little like getting tomorrow's newspaper today. Imagine the opportunities.

That's why effectively consolidating data sources for enterprise analytics can be the springboard for a big leap forward. Right now, every industry is in the process of digital transformation, as companies draw on the power of AI, machine learning, IoT technology and cloud-based analytics to become smarter and more efficient.

For an example of how this can play out, and the rewards it can bring, let's look at how Columbus recently helped a mid-sized glass manufacturer transition to Azure technologies to unlock the power of better and faster data.

The Fiberglass Furnace: A Manufacturer with a Fragile Legacy System

In the case of this client, getting the right data solution in place was quite literally a make-or-break assignment.

In the spring of 2020, Columbus was introduced to a medium-sized fiberglass company with multiple plants in the United States and several hundred employees. The company's plants had been built in the middle of the 20th century around giant furnaces: raw materials flowed from large batch silos into melters and then through bushings to create glass fiber strands.

The challenge for the client was that they needed to dramatically shorten the time it took to respond to quality issues in their plants. Under their legacy setup, it was taking as long as several weeks to get the data they needed to diagnose problems and implement fixes. That cumbersome lag created substantial opportunity costs.

The most substantial deviations within the manufacturing process occurred in the form of breaks in the glass fiber, known as filament breaks. The rate of fiber breakage was measured in breakouts per bushing run hour, or BOBRH. The company collected and evaluated many data points to monitor this process, including throughput, product mix, temperature, glass level, oxides and glass properties. The quality of the mix (whether it melted well) was another key factor.
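
To make the metric concrete, here is a minimal sketch of how BOBRH could be computed. The function and example numbers are illustrative assumptions, not the client's actual tooling.

```python
# Minimal sketch of the BOBRH metric described above: breakouts per
# bushing run hour. Names and numbers are illustrative assumptions.

def bobrh(breakout_count: int, bushing_run_hours: float) -> float:
    """Filament-break rate: breakouts observed per hour of bushing run time."""
    if bushing_run_hours <= 0:
        raise ValueError("bushing_run_hours must be positive")
    return breakout_count / bushing_run_hours

# Example: 12 filament breaks over 48 bushing run hours -> 0.25 BOBRH
print(bobrh(12, 48.0))
```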

The company captured data using tens of thousands of IoT sensors located throughout the manufacturing process, measuring values like temperature, gas flow, oxygen flow and pressure, along with data captured at the end of the process. The result was a substantial pool of raw data that included not only product information but also material information used for comparison.
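
As a rough illustration of what a single telemetry record from one of those sensors might look like (the actual schema was not described, so every field here is an assumption):

```python
# Hypothetical shape of one IoT sensor reading; field names and units
# are assumptions for illustration, not the client's schema.
from dataclasses import dataclass
from datetime import datetime, timezone

@dataclass
class SensorReading:
    sensor_id: str       # e.g. "furnace-3/zone-2/thermocouple-17"
    plant: str
    measured_at: datetime
    metric: str          # "temperature", "gas_flow", "oxygen_flow", "pressure"
    value: float
    unit: str            # e.g. "degC", "m3/h", "kPa"

reading = SensorReading(
    sensor_id="furnace-3/zone-2/thermocouple-17",
    plant="plant-a",
    measured_at=datetime.now(timezone.utc),
    metric="temperature",
    value=1371.5,
    unit="degC",
)
print(reading)
```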

Diagnosing the Data Challenges

The difficult part, of course, was accessing and analyzing this data drawn from different sources to provide actionable insights in a timely and efficient way.

Instead of a centralized data warehouse, the company was relying on a bespoke Excel spreadmart: essentially several Excel spreadsheets cobbled together, with data that was neither synchronized nor automated. They depended on a manual Excel process to produce an analysis on a seven-day rolling basis, which was slow and labor-intensive. Quality problems could take up to eight weeks to identify and correct, and in the meantime product quality suffered.
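
For context, the kind of seven-day rolling view the spreadmart produced by hand can be expressed in a few lines of code. This is a hedged sketch; the file and column names are assumptions.

```python
# Sketch of the seven-day rolling break-rate analysis the manual Excel
# process produced; "daily_quality.csv" and its columns are hypothetical.
import pandas as pd

df = pd.read_csv("daily_quality.csv", parse_dates=["date"])
df = df.sort_values("date").set_index("date")

# Seven-day rolling BOBRH: total breakouts / total bushing run hours
rolling = df[["breakouts", "bushing_run_hours"]].rolling("7D").sum()
df["bobrh_7d"] = rolling["breakouts"] / rolling["bushing_run_hours"]

print(df[["bobrh_7d"]].tail())
```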

Additionally, because the company wasn't able to compare readings with production best practices over time, process improvements weren't sustainable — and in many cases weren't even achievable, since production had already switched to a different product.

The client knew they had a problem — that their data was stale and holding them back. They wanted a solution that would yield more actionable information from their production process.

Architecting the Solution with Azure

After initial conversations with the client to understand their needs, and an assessment to map out the details of the engagement, Columbus built a roadmap to visualize the customer's data journey. The roadmap was a critical tool for driving digital change within the organization, providing visibility and a time horizon for achieving their goals.

The solution for this client was to provide a consolidated view of the data across various systems that drove their production process and quality levels. Columbus recommended implementation of an ODS (Operational Data Store) framework utilizing best-practice architecture — and the right technology stack to make this a reality was the Microsoft Azure platform.

To put this into action, Columbus architected a solution using Azure Data Factory for ingestion, Azure Data Lake for raw storage, Azure Databricks for data manipulation, Cosmos DB for serving the curated data and a Power BI workspace for visual presentation, with the ability to showcase the reporting metrics the client could use to minimize quality issues.

These tools provided reliability from a data ingestion and manipulation standpoint, low-cost storage and richer visualization through Power BI.
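
To give a feel for the flow, here is a minimal Databricks-style sketch of the middle of that pipeline: reading raw sensor data from the Data Lake, aggregating a quality metric, and writing the curated result to Cosmos DB via its Spark connector. Paths, column names and connection settings are placeholders, not the client's configuration.

```python
# Hedged sketch of the Databricks "manipulation" step; all names,
# paths and credentials below are illustrative placeholders.
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.getOrCreate()

# Raw telemetry landed by Azure Data Factory (hypothetical path/format)
raw = spark.read.parquet("abfss://raw@<datalake>.dfs.core.windows.net/sensors/")

# Hourly break rate per furnace (illustrative column names)
hourly = (
    raw.groupBy("plant", "furnace", F.window("measured_at", "1 hour"))
       .agg(F.sum("breakouts").alias("breakouts"),
            F.sum("run_hours").alias("run_hours"))
       .withColumn("bobrh", F.col("breakouts") / F.col("run_hours"))
       .withColumn("window_start", F.col("window.start"))
       .drop("window")
       # Cosmos DB documents need an id field
       .withColumn("id", F.concat_ws("|", "plant", "furnace",
                                     F.col("window_start").cast("string")))
)

# Write the curated view to Cosmos DB for serving
(hourly.write.format("cosmos.oltp")
       .option("spark.cosmos.accountEndpoint",
               "https://<account>.documents.azure.com:443/")
       .option("spark.cosmos.accountKey", "<key>")
       .option("spark.cosmos.database", "quality")
       .option("spark.cosmos.container", "bobrh_hourly")
       .mode("append")
       .save())
```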

Columbus and the client decided to start with a proof of concept. For this pilot, Columbus focused on one site and one furnace to ensure control of multiple variables, and on identifying efficiencies and shortening the analysis and response time needed to execute corrections when a problem occurred.

Success: Putting the Brakes on Breakage

Columbus and the client shortened the quality-control response time from as long as eight weeks down to around two to four days. That represented a transformative win.

Thanks to customized Power BI dashboards and reports featuring trends and control charts, the client could consume information more easily and access and analyze data much earlier in the process, almost in real time.

This visibility allowed them to optimize production with less waste and breakage and to improve product quality, positioning them to generate more revenue.

And once they saw these improvements in action, they wanted to extend them to all their plants and production lines. At that point, Columbus and the client were ready to move beyond reaction to prediction, with the vision of building out machine-learning models that could anticipate quality issues, including potential filament breaks.
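
As a sketch of what that predictive direction might look like (the article does not describe the actual models, so the data, features and label here are hypothetical):

```python
# Illustrative sketch of a model that flags conditions likely to precede
# a filament break; data, features and label are hypothetical.
import pandas as pd
from sklearn.ensemble import GradientBoostingClassifier
from sklearn.model_selection import train_test_split

df = pd.read_csv("labeled_runs.csv")  # hypothetical historical extract
features = ["temperature", "gas_flow", "oxygen_flow", "pressure", "glass_level"]

X_train, X_test, y_train, y_test = train_test_split(
    df[features], df["break_within_next_hour"], test_size=0.2, random_state=42)

model = GradientBoostingClassifier().fit(X_train, y_train)
print(f"Holdout accuracy: {model.score(X_test, y_test):.2f}")
```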

Watch the Webinar for a Closer Look

Interested in learning more? Watch this webinar with Columbus’s Mike Simms, where he shares the story of this case and explores the solution in detail. It's available on the MSDynamicsWorld.com website (free registration required): Webinar: Azure Case Study: Consolidating Enterprise Data for Advanced Analytics
