
AI in Manufacturing: Why the Data Foundation Matters More Than the Algorithm

By: Lauren Dunford, Guidewheel
Updated: March 6, 2026
7 min read



In this webinar, David Ariens (Founder of The IT/OT Insider, former Industrial Digitalization Director at BASF) joins Lauren Dunford, CEO of Guidewheel, to discuss what it really takes to connect systems, teams, and data so industrial AI delivers on the factory floor.


Top 5 takeaways


Here are the five most important insights from the conversation:

  • Don't touch the OT layer. Treat PLCs, SCADA, and HMIs as a given. Build your data and AI layer on top of existing automation rather than ripping and replacing it.
  • Most AI in manufacturing today is a bolt-on. Chatbots on historians and LLMs in engineering tools add convenience, but they do not constitute embedded intelligence across operations.
  • The WIP black hole is the critical gap. ERP handles scheduling and finished goods well, but the messy middle of production is where visibility collapses and margin leaks.
  • Operator-to-operator advocacy scales adoption. Mandates from management create resistance. When operators sell a tool to their peers, every plant wants it.
  • Just start. The technology exists. The barrier is organizational. Stop debating and begin building your data foundation now.

Best practices and key learnings


Your factory has multiple digital twins, and they are all in silos


One of the most clarifying frameworks David shared is the idea that every manufacturer already has multiple digital twins. Not the flashy 3D model kind. The real kind: your data historian, your quality system, your maintenance records, your ERP, your engineering documents. Each one holds a partial digital representation of your physical assets.

The problem is none of them talk to each other. And many of them contain duplicated, outdated, or conflicting versions of the same information.

"If we look at how a typical manufacturer is working today, then you could say that on the one hand they have the physical twin... and then they have multiple digital twins... But you have all these different systems which contain a smaller or larger part of the digital representation of that physical asset."

David Ariens

This is the real barrier to machine learning and AI data analytics delivering value in manufacturing. It is not a technology problem. It is an organizational and data architecture problem. The data exists, but it lives in locked file cabinets of different ages, scattered across departments, with no master index.

For manufacturers running 10, 20, or 50 plants with mixed equipment ages and different systems at each site, this fragmentation compounds. A recent acquisition might have an MES. The legacy plants might not. Some critical knowledge lives in Access databases or copies of copies of P&IDs. Until these siloed digital twins are connected into a centralized platform with context, every AI initiative is building on sand.

David defines a unified namespace simply: centralized data access, in context. That means all your data, whether historian, MES, quality, or maintenance, is available in one platform. And it carries context: asset context (ISA-95), production context (what was being produced), quality context, and maintenance context. Without that foundation, industrial AI remains a collection of disconnected bolt-ons.
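To make "centralized data access, in context" concrete: a unified namespace is commonly implemented as a hierarchical topic structure (often published over MQTT) where every data point carries its asset path plus production, quality, and maintenance context. A minimal sketch, assuming hypothetical topic and field names (not a specific Guidewheel or BASF schema):

```python
from dataclasses import dataclass, asdict
import json

@dataclass
class UnsEvent:
    """One contextualized data point in a unified namespace (field names are illustrative)."""
    value: float
    unit: str
    asset_path: str        # ISA-95 style path: enterprise/site/area/line/machine
    production_order: str  # what was being produced when the value was captured
    quality_status: str    # e.g. "in-spec" / "out-of-spec"
    maintenance_state: str # e.g. "running" / "planned-downtime"

def uns_topic(enterprise: str, site: str, area: str,
              line: str, machine: str, metric: str) -> str:
    """Build an ISA-95 style topic path for publishing the event."""
    return "/".join([enterprise, site, area, line, machine, metric])

topic = uns_topic("acme", "plant-03", "packaging", "line-2", "filler-1", "temperature")
event = UnsEvent(78.4, "C", "acme/plant-03/packaging/line-2/filler-1",
                 "WO-10482", "in-spec", "running")
payload = json.dumps(asdict(event))
print(topic)  # acme/plant-03/packaging/line-2/filler-1/temperature
```

The point of the sketch is that the context rides with the data: any consumer subscribing to the topic gets the asset, production, quality, and maintenance context in the same payload, instead of reconstructing it later from four separate systems.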


Build on top of what you have, then find the use case that scales


When you are staring at 40 brownfield plants with equipment dating back decades, the instinct to standardize everything is strong. Standardize the PLC vendor. Standardize the naming conventions. Standardize the SCADA platform.

David has watched these projects fail repeatedly.

"You really do not want to touch your pure OT layer. So that means that, at a minimum, we should start by building a layer on top of that."

David Ariens

The pure OT layer, your PLCs, your SCADA, your HMIs, has a long lifecycle. It is deeply embedded in operations. Trying to standardize it across plants, especially after an acquisition, is a multi-year project with enormous risk and uncertain ROI. That is the definition of a nightmare project.

Instead, the practical path is to build a data and visibility layer on top of existing automation. This is where Guidewheel's approach fits: plug-and-play sensors that clip onto any machine regardless of age, make, or model, with no PLC connection, no OT network access, and no cybersecurity risk. You get real-time data flowing in hours, not months, without touching the automation foundation.


But choosing where to start matters. David recommends looking for use cases that meet three criteria: they deliver value at a specific site, they have clear scaling potential across multiple plants, and they have a local champion who is hungry for change. One of Guidewheel's customers described their selection criteria even more simply: find sites where leaders know what they will do with data, and where they actually want it. That combination of capability and motivation is the winning formula.

The standardization that matters is not in the OT layer. It is in your software stack, your hardware footprint, your cybersecurity approach, and your data architecture. Standardize the layer you are building, not the layer you inherited.


Let operators drive adoption, not PowerPoint decks


David shared a story from his time at BASF that perfectly illustrates why top-down mandates fail and bottom-up adoption wins.

He was tasked with replacing paper-based manual entries with tablets for operators. Management's assumption was straightforward: digital entry is faster than writing on paper, so operators will save time. That was the business case.

When David's team actually sat with operators shift after shift, they discovered something different. Operators did not care about speed. They cared about context. One operator asked: could the previous shift's value be pre-filled on the screen? Management resisted, worried operators would just click "next" without checking. But the operators saw it differently. If the previous value was visible immediately, they could spot changes instantly and act on them. That is a fundamentally different mindset from the management assumption.

The project succeeded because the team built with operators daily. The operators felt understood. The tool made their work easier, better, and more engaging.

When it came time to roll out to other plants, David refused to do a traditional deployment. Instead, he had operators talk to operators. He tapped into the site's existing internal communication channels, featured operators (not engineers, not managers) telling their own stories, and let word spread organically. Within a few years, every plant wanted the solution. Not because leadership mandated it, but because peers recommended it.

This is the difference between push and pull. And it is the difference between a pilot that dies and a platform that scales.



Why your digital transformation is failing (and how to fix it)


"Management went like, no, no, no, we can't do that. Because if you pre-fill the previous value, then maybe they are lousy and they're just clicking next, next, next."

David Ariens



How to put these insights into practice

The gap between knowing what to do and actually doing it is where most manufacturers stall. Here is a concrete sequence based on what David shared and what we see working across 400+ manufacturers using Guidewheel.

Step 1: Map your existing digital twins. Before buying anything, inventory the systems that already hold representations of your assets and processes. Historian, MES, ERP, quality, maintenance, engineering documents. Identify where data overlaps, where it conflicts, and where it is missing entirely. This is your honest starting point.
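One lightweight way to run this inventory is a simple table mapping each system to the asset attributes it claims to own, then flagging attributes held in more than one place (candidates for conflict) and attributes nobody holds (the gaps). A sketch with made-up system and attribute names:

```python
# Map each system to the asset attributes it holds (names are illustrative).
systems = {
    "historian":   {"run_state", "temperature", "cycle_count"},
    "mes":         {"run_state", "production_order", "cycle_count"},
    "erp":         {"production_order", "bom", "finished_goods"},
    "maintenance": {"last_service_date", "run_state"},
}

# Attributes stored in more than one system: candidates for duplicated,
# outdated, or conflicting versions of the same information.
all_attrs = set().union(*systems.values())
overlaps = {
    attr: sorted(name for name, attrs in systems.items() if attr in attrs)
    for attr in all_attrs
    if sum(attr in attrs for attrs in systems.values()) > 1
}

# Attributes you expect to have but no system holds: your visibility gaps.
expected = {"run_state", "wip_location", "scrap_reason"}
missing = expected - all_attrs

print(sorted(overlaps))  # ['cycle_count', 'production_order', 'run_state']
print(sorted(missing))   # ['scrap_reason', 'wip_location']
```

Even a spreadsheet version of this exercise gives you the honest starting point the step calls for: you know exactly where the same fact lives twice and where it lives nowhere.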

Step 2: Pick one use case with scaling potential. Do not try to build a unified namespace for everything at once. Find a specific, high-value use case, such as real-time downtime visibility, changeover tracking, or OEE across shifts, that can prove value at one site and then extend to others. The use case should matter to both the plant floor and the P&L.

Step 3: Deploy on top, not through. Use a non-intrusive approach to get data flowing without touching PLCs, SCADA, or OT networks. Guidewheel's clip-on sensors install in minutes per machine and deliver second-by-second data with zero IT involvement. This eliminates the biggest objection to getting started: risk to existing operations.

Step 4: Co-build with operators, not for them. Involve shift teams from day one. Ask them what would make their work easier. Let them shape the interface, the alerts, the workflows. The people closest to the work know what matters. Their buy-in is not optional; it is the mechanism by which the solution scales.

Step 5: Let early wins create pull. Once you have a site where operators are seeing value, resist the urge to mandate a rollout. Instead, create channels for operators to share their experience with peers at other plants. Feature their stories. Let the demand come from the floor. That is how you go from one plant to fifty.

Step 6: Build the business case as you go. David made an important point: for foundational tools, the ROI may not be obvious in the first weeks. The value compounds as teams start using the data to make decisions, spot patterns, and prevent problems before they escalate. Give the foundation time to prove itself, but track leading indicators like adoption rates, downtime response times, and shift-over-shift consistency.
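Those leading indicators need very little tooling to track. A sketch of a per-shift rollup covering adoption rate, downtime response time, and shift-over-shift consistency; the metric definitions and data are assumptions for illustration, not a prescribed Guidewheel report:

```python
from statistics import mean, pstdev

# One record per shift (illustrative data).
shifts = [
    {"logins": 11, "operators": 12, "downtime_events": 4, "minutes_to_respond": [6, 9, 4, 12]},
    {"logins": 9,  "operators": 12, "downtime_events": 3, "minutes_to_respond": [5, 7, 6]},
    {"logins": 12, "operators": 12, "downtime_events": 5, "minutes_to_respond": [4, 5, 6, 5, 8]},
]

# Adoption: what fraction of operators actually used the tool each shift.
adoption_rate = mean(s["logins"] / s["operators"] for s in shifts)

# Response: average minutes from a downtime event to someone acting on it.
avg_response = mean(m for s in shifts for m in s["minutes_to_respond"])

# Consistency: low spread in downtime events across shifts suggests
# stable, repeatable behavior rather than one heroic crew.
consistency = pstdev(s["downtime_events"] for s in shifts)

print(f"adoption {adoption_rate:.0%}, response {avg_response:.1f} min, spread {consistency:.2f}")
```

Numbers like these will not prove ROI on day one, but trending them week over week shows whether the foundation is taking hold before the financial case is visible.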


Building the data foundation for industrial AI

The conversation with David reinforced something we see every day at Guidewheel: the biggest barrier to AI in manufacturing is not the algorithm. It is the absence of clean, contextualized, real-time data from the factory floor. The WIP black hole, the siloed digital twins, the tribal knowledge walking out the door with retiring operators: these are the problems that must be solved first.

The good news is that solving them does not require a multi-year, rip-and-replace project. It requires a pragmatic, build-on-top approach that respects existing automation, starts with high-value use cases, and puts operators at the center.

As David put it: just start. Stop talking about it.

If you are ready to close the visibility gap between your ERP and your factory floor, and build the real-time data foundation that makes AI and automation actually work, we would love to show you how Guidewheel can help.

Book a Demo



Watch and listen

Watch the webinar now:

Listen on Apple Podcasts
