26 episodes • Length: 40 min • Monthly
How can we really digitalize our industry? Join us as we navigate the innovations and challenges shaping the future of manufacturing and critical infrastructure. From insightful interviews with industry leaders to deep dives into transformative technologies, this podcast is your guide to understanding the digital revolution at the heart of the physical world. We talk about IT/OT convergence and focus on people and culture, not buzzwords. To support the transformation, we explore which technologies (AI! Cloud! IIoT!) can enable this transition.
itotinsider.substack.com
The podcast The IT/OT Insider Podcast – Pioneers & Pathfinders is created by David Ariens and Willem van Lammeren. The podcast and the artwork on this page are embedded on this page using the public podcast feed (RSS).
Welcome to the final episode of our special Industrial DataOps podcast series. And what better way to close out the series than with Dominik Obermaier, CEO and co-founder of HiveMQ—one of the most recognized names when it comes to MQTT and Unified Namespace (UNS).
Dominik has been at the heart of the MQTT story from the very beginning—contributing to the specification, building the company from the ground up, and helping some of the world’s largest manufacturers, energy providers, and logistics companies reimagine how they move and use industrial data.
Every Company is Becoming an IoT Company
Dominik opened with a striking analogy:
"Just like every company became a computer company in the ‘80s and an internet company in the ‘90s, we believe every company is becoming an IoT company."
And that belief underpins HiveMQ’s mission—to build the digital backbone for the Internet of Things, connecting physical assets to digital applications across the enterprise.
Today, HiveMQ is used by companies like BMW, Mercedes-Benz, and Lilly to enable real-time data exchange from edge to cloud, using open standards that ensure long-term flexibility and interoperability.
What is MQTT?
For those new to MQTT, Dominik explains what it is: a lightweight, open protocol built for real-time, scalable, and decoupled communication.
Originally developed in the late 1990s for oil pipeline monitoring, MQTT was designed to minimize bandwidth, maximize reliability, and function in unstable network conditions.
It uses a publish-subscribe pattern, allowing producers and consumers of data to remain decoupled and highly scalable—ideal for IoT and OT environments, where devices range from PLCs to cloud applications.
"HTTP works for the internet of humans. MQTT is the protocol for the internet of things."
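That decoupling rests on MQTT's hierarchical topics and wildcard subscriptions: a subscriber never needs to know which device published a message, only which topic pattern it cares about. As a rough illustration, here is a simplified sketch of the topic-matching rules (not a full implementation of the MQTT specification):

```python
def topic_matches(filter_: str, topic: str) -> bool:
    """Check whether an MQTT topic matches a subscription filter.

    '+' matches exactly one topic level, '#' matches all remaining levels.
    A simplified sketch of the MQTT 3.1.1 matching rules.
    """
    f_parts = filter_.split("/")
    t_parts = topic.split("/")
    for i, f in enumerate(f_parts):
        if f == "#":               # multi-level wildcard: matches the rest
            return True
        if i >= len(t_parts):      # filter is longer than the topic
            return False
        if f != "+" and f != t_parts[i]:
            return False
    return len(f_parts) == len(t_parts)

print(topic_matches("plant1/+/temperature", "plant1/line3/temperature"))  # True
print(topic_matches("plant1/#", "plant1/line3/motor/status"))             # True
```

A producer publishes to `plant1/line3/temperature` without knowing who listens; any number of consumers can subscribe to `plant1/+/temperature` or `plant1/#` and scale independently.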
The real breakthrough came when MQTT became an open standard. HiveMQ has been a champion of MQTT ever since—helping manufacturers escape vendor lock-in and build interoperable data ecosystems.
From Broker to Backbone: Mapping HiveMQ to the Capability Model
HiveMQ is often described as an MQTT broker, but as Dominik made clear, it's far more than that. Let’s map their offerings to our Industrial DataOps Capability Map:
Connectivity & Edge Ingest →
* HiveMQ Edge: A free, open-source gateway to connect to OPC UA, Modbus, BACnet, and more.
* Converts proprietary protocols into MQTT, making data accessible and reusable.
Data Transport & Integration →
* HiveMQ Broker: The core engine that enables highly reliable, real-time data movement across millions of devices.
* Scales from single factories to hundreds of millions of data tags.
Contextualization & Governance →
* HiveMQ Data Hub and Pulse: Tools for data quality, permissions, history, and contextual metadata.
* Pulse enables distributed intelligence and manages the Unified Namespace across global sites.
UNS Management & Visualization →
* HiveMQ Pulse is a true UNS solution that provides structure, data models, and insights without relying on centralized historians.
* Allows tracing of process changes, root cause analysis, and real-time decision support.
Building the Foundation for Real-Time Enterprise Data
Few topics have gained as much traction recently as UNS (Unified Namespace). But as Dominik points out, UNS is not a product—it’s a pattern. And not all implementations are created equal.
"Some people claim a data lake is a UNS. Others say it’s OPC UA. It’s not. UNS is about having a shared, real-time data structure that’s accessible across the enterprise."
HiveMQ Pulse provides a managed, governed, and contextualized UNS, allowing companies to:
* Map their assets and processes into a structured namespace.
* Apply insights and rules at the edge—without waiting for data to reach the cloud.
* Retain historical context while staying close to real-time operations.
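What does "mapping assets into a structured namespace" look like in practice? A minimal sketch, using hypothetical enterprise/site/area/line levels in the ISA-95 spirit (this is an illustration of the pattern, not HiveMQ Pulse's actual data model):

```python
from dataclasses import dataclass

@dataclass(frozen=True)
class UnsNode:
    """One level path in an ISA-95-style namespace: enterprise/site/area/line/tag."""
    enterprise: str
    site: str
    area: str
    line: str
    tag: str

    def topic(self) -> str:
        # The node's position in the hierarchy doubles as its MQTT topic,
        # so every consumer sees the same structure.
        return "/".join([self.enterprise, self.site, self.area, self.line, self.tag])

node = UnsNode("acme", "berlin", "packaging", "line4", "temperature")
print(node.topic())  # acme/berlin/packaging/line4/temperature
```

Because the structure is shared, a dashboard in one plant and an analytics job at headquarters both resolve `acme/berlin/packaging/line4/temperature` to the same asset, with no per-application mapping tables.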
"A good data model will solve problems before you even need AI. You don’t need fancy tech—you need structured data and the ability to ask the right questions."
Fix the Org Before the Tech
One of the most important takeaways from this conversation was organizational readiness. Dominik was clear:
"You can’t fix an organizational problem with technology."
Successful projects often depend on having:
* A digital transformation bridge team between IT and OT.
* Clear ownership and budget—often driven by a C-level mandate.
* A shared vocabulary, so teams can align on definitions, expectations, and outcomes.
To help customers succeed, HiveMQ provides onboarding programs, certifications, and educational content to establish this common language.
Use Case
One specific use case we'd like to highlight comes from Lilly, a pharmaceutical company:
Getting Started with HiveMQ & UNS
Dominik shared practical advice for companies just starting out:
* Begin with open-source HiveMQ Edge and Cloud—no license or sales team required.
* Start small—connect one PLC, stream one tag, and build from there.
* Demonstrate value quickly—show how a single insight (like predicting downtime from a temperature drift) can justify further investment.
* Then scale—build a sustainable, standards-based data architecture with the support of experienced partners.
Final Thoughts: A Fitting End to the Series
This episode was the perfect way to end our Industrial DataOps podcast series—a conversation that connected the dots between open standards, scalable data architecture, organizational design, and future-ready analytics (and don’t worry, we have lots of other podcast ideas for the months to come :)).
HiveMQ’s journey—from a small startup to powering the largest industrial IoT deployments in the world—is proof that open, scalable, and reliable infrastructure will be the foundation for the next generation of digital manufacturing.
If you want to learn more about MQTT, UNS, or HiveMQ Pulse, check out the excellent content at www.hivemq.com.
Stay Tuned for More!
Subscribe to our podcast and blog to stay updated on the latest trends in Industrial Data, AI, and IT/OT convergence.
🚀 See you in the next episode!
YouTube: https://www.youtube.com/@TheITOTInsider
Disclaimer: The views and opinions expressed in this interview are those of the interviewee and do not necessarily reflect the official policy or position of The IT/OT Insider. This content is provided for informational purposes only and should not be seen as an endorsement by The IT/OT Insider of any products, services, or strategies discussed. We encourage our readers and listeners to consider the information presented and make their own informed decisions.
Welcome to Episode 11!
As we get closer to Hannover Messe 2025, we’re also approaching the final episodes of this podcast series. Today we have two fantastic guests from AVEVA: Roberto Serrano Hernández, Technology Evangelist for the CONNECT industrial intelligence platform, and Clemens Schönlein, Technology Evangelist for AI and Analytics.
Together, they bring a unique mix of deep technical insight, real-world project experience, and a passion for making industrial data usable, actionable, and valuable.
We cover a lot in this episode: from the evolution of AVEVA's CONNECT industrial intelligence platform, to real-world use cases, data science best practices, and the cloud vs. on-prem debate. It’s a powerful conversation on how to build scalable, trusted, and operator-driven data solutions.
What is CONNECT?
Let’s start with the big picture. What is the CONNECT industrial intelligence platform? As Roberto explains:
"CONNECT is an open and neutral industrial data platform. It brings together all the data from AVEVA systems—and beyond—and helps companies unlock value from their operational footprint."
This isn’t just another historian or dashboard tool. CONNECT is a cloud-native platform that allows manufacturers to:
* Connect to on-prem systems.
* Store, contextualize, and analyze data.
* Visualize it with built-in tools or share it with AI platforms like Databricks.
* Enable both data scientists and domain experts to collaborate on decision-making.
It’s also built to make the transition to cloud as seamless as possible—while preserving compatibility with legacy systems.
"CONNECT is for customers who want to do more – close the loop, enable AI, and future-proof their data strategy"
Where CONNECT Fits in the Industrial Data Capability Map
Roberto breaks it down neatly:
* Data Acquisition – Strong roots in industrial protocols and legacy system integration.
* Data Storage and Delivery – The core strength of CONNECT: clean, contextualized, and trusted data in the cloud.
* Self-Service Analytics & Visualization – Tools for both data scientists and OT operators to work directly with data.
* Ecosystem Integration – CONNECT plays well with Databricks, Snowflake, and other analytics platforms.
But Clemens adds an important point:
"The point isn’t just analytics—it’s about getting insights back to the operator. You can’t stop at a dashboard. Real value comes when change happens on the shop floor."
Use Case Spotlight: Stopping Downtime with Data Science at Amcor
One of the best examples of CONNECT in action is the case of Amcor, a global packaging manufacturer producing the plastic film used in things like chip bags and blister packs.
The Problem:
* Machines were stopping unpredictably, causing expensive downtime.
* Traditional monitoring couldn’t explain why.
* Root causes were hidden upstream in the process.
The Solution:
* CONNECT was used to combine MES data and historian data in one view.
* Using built-in analytics tools, the team found that a minor drift in a temperature setpoint upstream was causing the plastic’s viscosity to change—leading to stoppages further down the line.
* They created a correlation model, mapped it to ideal process parameters, and fed the insight back to operators.
"The cool part was the speed," said Clemens. "What used to take months of Excel wrangling and back-and-forth can now be done in minutes."
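The statistics behind this kind of finding don't have to be exotic: a plain correlation coefficient over aligned process series is often enough to surface the relationship. The data below is invented for illustration, not Amcor's:

```python
import math

def pearson(xs, ys):
    """Pearson correlation coefficient between two equally long series."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = math.sqrt(sum((x - mx) ** 2 for x in xs))
    sy = math.sqrt(sum((y - my) ** 2 for y in ys))
    return cov / (sx * sy)

# Hypothetical hourly data: upstream setpoint drift (°C) vs. stoppages per hour
drift     = [0.1, 0.2, 0.2, 0.5, 0.9, 1.4, 1.6, 2.0]
stoppages = [0,   0,   1,   1,   2,   3,   3,   4]
print(f"r = {pearson(drift, stoppages):.2f}")  # strong positive correlation
```

With the two series already contextualized in one platform, computing this takes seconds; the months of "Excel wrangling" were mostly spent getting the series aligned in the first place.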
The Human Side of Industrial Data: Start with the Operator
One of the most powerful themes in this episode is the importance of human-centric design in analytics.
Clemens shares from his own experience:
"I used to spend months building an advanced model—only to find out the data wasn't trusted or the operator didn’t care. Now I start by involving the operator from Day 1."
This isn’t just about better UX. It’s about:
* Getting faster buy-in.
* Shortening time-to-value.
* Ensuring that insights are actionable and respected.
Data Management and Scaling Excellence
We also touched on the age-old challenge of data management. AVEVA’s take? Don’t over-architect. Start delivering value.
"Standardization is important—but don’t wait five years to get it perfect. Show value early, and the standardization will follow."
And when it comes to building centers of excellence, Clemens offers a simple yet powerful principle:
"Talk to the people who press the button. If they don’t trust your model, they won’t use it."
Final Thoughts
As we edge closer to Hannover Messe, and to the close of this podcast series, this episode with Clemens and Roberto reminds us what Industrial DataOps is all about:
* Useful data
* Actionable insights
* Empowered people
* Scalable architecture
If you want to learn more about AVEVA's CONNECT industrial intelligence platform and their work in AI and ET/OT/IT convergence, visit: www.aveva.com
Welcome to Episode 10 of the IT/OT Insider Podcast. Today, we're pleased to feature Anupam Gupta, Co-Founder & President North America at Celebal Technologies, to discuss how enterprise systems, AI, and modern data architectures are converging in manufacturing.
Celebal Technologies is a key partner of SAP, Microsoft, and Databricks, specializing in bridging traditional enterprise IT systems with modern cloud data and AI innovations. Unlike many of our past guests who come from a manufacturing-first perspective, Celebal Technologies approaches the challenge from the enterprise side—starting with ERP and extending into industrial data, AI, and automation.
Anupam's journey began as a developer at SAP, later moving into consulting and enterprise data solutions. Now, with Celebal Technologies, he is helping manufacturers combine ERP data, OT data, and AI-driven insights into scalable Lakehouse architectures that support automation, analytics, and business transformation.
ERP as the Brain of the Enterprise
One of the most interesting points in our conversation was the role of ERP (Enterprise Resource Planning) systems in manufacturing.
"ERP is the brain of the enterprise. You can replace individual body parts, but you can't transplant the brain. The same applies to ERP—it integrates finance, logistics, inventory, HR, and supply chain into a single system of record."
While ERP is critical, it doesn't cover everything. The biggest gap? Manufacturing execution and OT data.
* ERP handles business transactions → orders, invoices, inventory, financials.
* MES and OT systems handle operations → machine status, process execution, real-time sensor data.
Traditionally, these two have been separated, but modern manufacturers need both worlds to work together. That's where integrated data platforms come in.
Bridging Enterprise IT and Manufacturing OT
Celebal Technologies specializes in merging enterprise and industrial data, bringing IT and OT together in a structured, scalable way.
Anupam explains: "When we talk about Celebal Tech, we say we sit at the right intersection of traditional enterprise IT and modern cloud innovation. We understand ERP, but we also know how to integrate it with IoT, AI, and automation."
Key focus areas include:
* Unifying ERP, MES, and OT data into a central Lakehouse architecture.
* Applying AI to optimize operations, logistics, and supply chain decisions.
* Enabling real-time data processing at the edge while leveraging cloud for scalability.
This requires a shift from traditional data warehouses to modern Lakehouse architectures—which brings us to the next big topic.
What is a Lakehouse and Why Does It Matter?
Most people are familiar with data lakes and data warehouses, but a Lakehouse combines the best of both.
Traditional Approaches:
* Data warehouses → Structured, governed, and optimized for business analytics, but not flexible for AI or IoT data.
* Data lakes → Can store raw data from many sources but often become data swamps—difficult to manage and analyze.
Lakehouse Benefits:
* Combines structured and unstructured data → Supports ERP transactions, sensor data, IoT streams, and documents in a single system.
* High performance analytics → Real-time queries, machine learning, and AI workloads.
* Governance and security → Ensures data quality, lineage, and access control.
"A Lakehouse lets you store IoT and ERP data in the same environment while enabling AI and automation on top of it. That's a game-changer for manufacturing."
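To make that concrete, here is a toy, dependency-free sketch of what a lakehouse query engine does at scale: joining structured ERP records with time-series readings on a shared key. All rows are hypothetical:

```python
# Hypothetical ERP work orders (structured) and sensor readings (time-series)
# that share a business key, living side by side in one environment.
work_orders = [
    {"order_id": "WO-100", "product": "film-A", "line": "L1"},
    {"order_id": "WO-101", "product": "film-B", "line": "L2"},
]
sensor_readings = [
    {"order_id": "WO-100", "ts": 1, "temp_c": 181.2},
    {"order_id": "WO-100", "ts": 2, "temp_c": 183.9},
    {"order_id": "WO-101", "ts": 1, "temp_c": 179.5},
]

# Join each reading to its business context (a toy version of the joins a
# lakehouse engine runs over governed tables).
orders_by_id = {o["order_id"]: o for o in work_orders}
enriched = [{**r, **orders_by_id[r["order_id"]]} for r in sensor_readings]

# Aggregate operational data by a business dimension (product).
temps_by_product = {}
for row in enriched:
    temps_by_product.setdefault(row["product"], []).append(row["temp_c"])
avg_by_product = {p: sum(v) / len(v) for p, v in temps_by_product.items()}
print(avg_by_product)  # average temperature per product
```

The point of the lakehouse is that this join happens once, in a governed environment, instead of being re-implemented in every analytics tool that needs both worlds.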
Celebal Tech is a top partner for Databricks and Microsoft in this space, helping companies migrate from legacy ERP systems to modern AI-powered data platforms.
There's More to AI Than GenAI
With all the hype around Generative AI (GenAI), it's important to remember that AI in manufacturing goes far beyond chatbots and text generation.
"Many companies are getting caught up in the GenAI hype, but the real value in manufacturing AI comes from structured, industrial data models and automation."
Celebal Tech is seeing two major AI trends:
* AI for predictive maintenance and real-time analytics → Using sensor and operational data to predict failures, optimize production, and automate decisions.
* AI-driven automation with agent-based models → AI is moving from just providing recommendations to executing complex tasks in ERP and MES environments.
GenAI has a role to play, but:
* Many companies are converting structured data into unstructured text just to apply GenAI—which doesn't always make sense.
* Enterprises need explainability and trust before AI can take over critical operations.
"Think of AI in manufacturing like self-driving cars—we're not fully autonomous yet, but we're moving toward AI-assisted automation."
The key to success? Good data governance, well-structured industrial data, and AI models that operators can trust.
Final Thoughts: Scaling DataOps and AI in Manufacturing
For manufacturers looking to modernize their data strategy, Anupam offers three key takeaways:
* Unify ERP and OT data → AI and analytics only work when data is structured and connected across systems.
* Invest in a Lakehouse approach → It's the best way to combine structured business data with real-time industrial data.
* AI needs governance → Without trust, transparency, and explainability, AI won't be adopted at scale.
"You don't have to replace your ERP or MES, but you do need a data strategy that enables AI, automation, and better decision-making."
If you want to learn more about Celebal Technologies and how they're bridging AI, ERP, and manufacturing data, visit www.celebaltech.com.
Welcome to Episode 9 in our special DataOps series. We're getting closer to Hannover Messe, and with it the end of this series. We still have some great episodes ahead of us, with AVEVA, HiveMQ, and Celebal Technologies joining us in the days to come (and don't worry, this is not the end of our podcasts; many other great stories are already recorded and will air in April!)
In this episode, we’re joined by David Rogers, Senior Solutions Architect at Databricks, to explore how AI, data governance, and cloud-scale analytics are reshaping manufacturing.
David has spent years at the intersection of manufacturing, AI, and enterprise data strategy, working at companies like Boeing and SightMachine before joining Databricks. Now, he’s leading the charge in helping manufacturers unlock value from their data—not just by dumping it into the cloud, but by structuring, governing, and applying AI effectively.
Databricks is one of the biggest names in the data and AI space, known for lakehouse architecture, AI workloads, and large-scale data processing. But how does that apply to the shop floor, supply chain, and industrial operations?
That’s exactly what we’re unpacking today.
What is Databricks and How Does It Fit into Manufacturing?
Databricks is a cloud-native data platform that runs on AWS, Azure, and Google Cloud, providing an integrated set of tools for ETL, AI, and analytics.
David breaks it down:
"We provide a platform for any data and AI workload—whether it’s real-time streaming, predictive maintenance, or large-scale AI models."
In the manufacturing context, this means:
* Bringing factory data into the cloud to enable AI-driven decision-making.
* Unifying different data types—SCADA, MES, ERP, and even video data—to create a complete operational view.
* Applying AI models to optimize production, reduce downtime, and improve quality.
"Manufacturers deal with physical assets, which means their data comes from machines, sensors, and real-world processes. The challenge is structuring and governing that data so it’s usable at scale."
Why Data Governance Matters More Than Ever
Governance is becoming a critical challenge in AI-driven manufacturing.
David explains why:
"AI is only as good as the data feeding it. If you don’t have structured, high-quality data, your AI models won’t deliver real value."
Some key challenges manufacturers face:
* Data silos → OT data (SCADA, historians) and IT data (ERP, MES) often remain disconnected.
* Lack of lineage → Companies struggle to track how data is transformed, making AI deployments unreliable.
* Access control issues → Manufacturers work with multiple vendors, suppliers, and partners, making data security and sharing complex.
Databricks addresses this through Unity Catalog, an open-source data governance framework that helps manufacturers:
* Control access → Manage who can see what data across the organization.
* Track data lineage → Ensure transparency in how data is processed and used.
* Enforce compliance → Automate data retention policies and regional data sovereignty rules.
"Data governance isn’t just about security—it’s about making sure the right people have access to the right data at the right time."
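The core idea of catalog-level access control can be sketched in a few lines. The table names, roles, and grant model below are hypothetical illustrations of the pattern, not Unity Catalog's actual API:

```python
# Hypothetical catalog: each table carries a grant list, and the query
# layer filters what a given role may see.
catalog = {
    "plant.scada.vibration": {"grants": {"ot_team", "data_science"}},
    "erp.finance.invoices":  {"grants": {"finance"}},
    "plant.mes.batch_log":   {"grants": {"ot_team"}},
}

def visible_tables(role):
    """Return the tables a role is allowed to query, sorted by name."""
    return sorted(name for name, meta in catalog.items() if role in meta["grants"])

print(visible_tables("ot_team"))  # ['plant.mes.batch_log', 'plant.scada.vibration']
```

Real governance layers add lineage tracking and policy automation on top, but the principle is the same: access decisions live with the data, not inside each consuming application.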
A Real-World Use Case: AI-Driven Quality Control in Automotive
One of the best examples of how Databricks is applied in manufacturing is in the automotive industry, where manufacturers are using AI and multimodal data to improve the yield of battery packs for EVs.
The Challenge:
* Traditional quality control relies heavily on human inspection, which is time-consuming and inconsistent.
* Sensor data alone isn’t enough—video, images, and even operator notes play a role in defect detection.
* AI models need massive, well-governed datasets to detect patterns and predict failures.
The Solution:
* The company ingested data from SCADA, MES, and video inspection cameras into Databricks.
* Using machine learning, they automatically detected defects in real time.
* AI models were trained on historical quality failures, allowing the system to predict when a defect might occur.
* All of this was done at cloud scale, using governed data pipelines to ensure traceability.
"Manufacturers need AI that works across multiple data types—time-series, video, sensor logs, and operator notes. That’s the future of AI in manufacturing."
Scaling AI in Manufacturing: What Works?
A big challenge for manufacturers is moving beyond proof-of-concepts and actually scaling AI deployments.
David highlights some key lessons from successful projects:
* Start with the right use case → AI should be solving a high-value problem, not just running as an experiment.
* Ensure data quality from the beginning → Poor data leads to poor AI models. Structure and govern your data first.
* Make AI models explainable → Black-box AI models won’t gain operator trust. Make sure users can understand how predictions are made.
* Balance cloud and edge → Some AI workloads belong in the cloud, while others need to run at the edge for real-time decision-making.
"It’s not about collecting ALL the data—it’s about collecting the RIGHT data and applying AI where it actually makes a difference."
Unified Namespace (UNS) and Industrial DataOps
David also touches on the role of Unified Namespace (UNS) in structuring manufacturing data.
"If you don’t have UNS, your data will be an unstructured mess. You need context around what product was running, on what line, in what factory."
In Databricks, governance and UNS go hand in hand:
* UNS provides real-time context at the factory level.
* Databricks ensures governance and scalability at the enterprise level.
"You can’t build scalable AI without structured, contextualized data. That’s why UNS and governance matter."
Final Thoughts: Where is Industrial AI Heading?
* More real-time AI at the edge → AI models will increasingly run on local devices, reducing cloud dependencies.
* Multimodal AI will become standard → Combining sensor data, images, and operator inputs will drive more accurate predictions.
* AI-powered data governance → Automating data lineage, compliance, and access control will be a major focus.
* AI copilots for manufacturing teams → Expect more AI-driven assistants that help operators troubleshoot issues in real time.
"AI isn’t just about automating decisions—it’s about giving human operators better insights and recommendations."
Final Thoughts
AI in manufacturing is moving beyond hype and into real-world deployments—but the key to success is structured data, proper governance, and scalable architectures.
Databricks is tackling these challenges by bringing AI and data governance together in a platform designed to handle industrial-scale workloads.
If you’re interested in learning more, check out www.databricks.com.
Welcome to Episode 8 of the IT/OT Insider Podcast. Today, we’re diving into real-time data, edge processing, and AI-driven analytics with Evan Kaplan, CEO of InfluxData.
InfluxDB is one of the most well-known time-series databases, used by developers, industrial companies, and cloud platforms to manage high-volume data streams. With 1.3 million open-source users and partners like Siemens, Bosch, and Honeywell, it’s a major player in the Industrial DataOps ecosystem.
Evan brings a unique perspective—coming from a background in networking, cybersecurity, and venture capital, he understands both the business and technical challenges of scaling industrial data infrastructure.
In this episode, we explore:
* How time-series data has become critical in manufacturing.
* The shift from on-prem to cloud-first architectures.
* The role of open-source in industrial data strategies.
* How AI and automation are reshaping data-driven decision-making.
Let’s dive in.
From Networking to Time-Series Data
Evan’s journey into time-series databases started in venture capital, where he met Paul Dix, the founder of InfluxData.
"At the time, I wasn't a data expert, but I saw an opportunity—everything in the world runs on time-series data. Sensors, machines, networks—they all generate metrics that change over time."
At the time, InfluxDB was a small open-source project with about 3,000 users. Today, it’s grown to 1.3 million users, powering everything from IoT devices and industrial automation to financial services and network telemetry.
One of the biggest drivers of this growth? Industrial IoT.
"Over the last decade, we’ve seen a shift. IT teams originally used InfluxDB for monitoring servers and applications. But today, over 60% of our business comes from industrial IoT and sensor data analytics."
How InfluxDB Maps to the Industrial Data Platform Capability Model
We often refer to our Industrial Data Platform Capability Map to understand where different technologies fit into the IT/OT data landscape.
So where does InfluxDB fit?
* Connectivity & Ingest → One of InfluxDB’s biggest strengths. It can ingest massive amounts of data from sensors, PLCs, MQTT brokers, and industrial protocols using Telegraf, their open source agent.
* Edge & Cloud Processing → Data can be stored and analyzed locally at the edge, then replicated to the cloud for long-term storage.
* Time-Series Analytics → InfluxDB specializes in storing, querying, and analyzing time-series data, making it ideal for predictive maintenance, OEE tracking, and process optimization.
* Integration with Data Lakes & AI → Many manufacturers use InfluxDB as the first stage in their data pipeline before sending data to Snowflake, Databricks, or other lakehouse architectures.
"Our strength is in real-time streaming and short-term storage. Most customers eventually downsample and push long-term data into a data lake."
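Downsampling before handing data off to a lake is a simple operation: bucket readings into fixed time windows and keep an aggregate per window. A minimal sketch (window size and data hypothetical):

```python
def downsample(points, window):
    """Average (timestamp, value) points into fixed windows, the kind of
    reduction applied before pushing long-term data to a data lake."""
    buckets = {}
    for ts, value in points:
        buckets.setdefault(ts - ts % window, []).append(value)
    return [(start, sum(vs) / len(vs)) for start, vs in sorted(buckets.items())]

# Raw readings every ~15 s, reduced to one averaged point per minute
raw = [(0, 10.0), (15, 12.0), (30, 11.0), (60, 20.0), (75, 22.0)]
print(downsample(raw, 60))  # [(0, 11.0), (60, 21.0)]
```

The full-resolution data stays at the edge for real-time queries; only the compact aggregates travel onward for long-term storage and cross-site analytics.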
A Real-World Use Case: ju:niz Energy’s Smart Battery Systems
One of the most compelling use cases for InfluxDB comes from ju:niz Energy, a company specializing in off-grid energy storage.
The Challenge:
* ju:niz needed to monitor and optimize distributed battery systems used in renewable energy grids.
* Each battery had hundreds of sensors generating real-time data.
* Connectivity was unreliable, meaning data couldn’t always be sent to the cloud immediately.
The Solution:
* Each battery system was equipped with InfluxDB at the edge to store and process local data.
* Data was compressed and synchronized with the cloud whenever a connection was available.
* AI models used InfluxDB data to predict battery failures and optimize energy usage.
The Results:
* Improved energy efficiency—By analyzing real-time data, ju:niz optimized battery charging and discharging across their network.
* Reduced downtime—Predictive maintenance prevented unexpected failures.
* Scalability—The system could be expanded without requiring a centralized cloud-only approach.
"This hybrid edge-cloud model is becoming more common in industrial IoT. Not all data needs to live in the cloud—sometimes, local processing is faster, cheaper, and more reliable."
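The store-and-forward pattern behind this hybrid setup is straightforward to sketch. This is a generic illustration of edge buffering, not ju:niz's or InfluxData's actual implementation:

```python
from collections import deque

class StoreAndForward:
    """Buffer readings at the edge; flush to the cloud when the link is up."""

    def __init__(self, capacity=10_000):
        self.buffer = deque(maxlen=capacity)  # oldest readings dropped if full

    def record(self, reading):
        self.buffer.append(reading)

    def flush(self, send):
        """Try to send everything; keep whatever fails for the next attempt.

        `send` is a callable returning True on success, False on failure.
        Returns the number of readings delivered.
        """
        sent = 0
        while self.buffer:
            if not send(self.buffer[0]):
                break                  # link dropped again; retry later
            self.buffer.popleft()
            sent += 1
        return sent
```

Each battery system records locally regardless of connectivity, and `flush` is retried whenever a connection appears; nothing is lost during an outage as long as the buffer capacity holds.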
Cloud vs. On-Prem: The Future of Industrial Data Storage
A common debate in industrial digitalization is whether to store data on-premise or in the cloud.
Evan sees a hybrid approach as the future:
"Pushing all data to the cloud isn’t practical. Factories need real-time decision-making at the edge, but they also need centralized visibility across multiple sites."
A few key trends:
* Cloud adoption is growing, with 55-60% of InfluxDB deployments now cloud-based.
* Hybrid architectures are emerging, where real-time data stays at the edge while historical data moves to the cloud.
* Data replication is becoming the norm, ensuring that insights aren’t locked into one location.
"The most successful companies are balancing edge processing with cloud-scale analytics. It’s not either-or—it’s about using the right tool for the right job."
AI and the Next Evolution of Industrial Automation
AI has been a major topic in every recent IT/OT discussion, but how does it apply to manufacturing and time-series data?
Evan believes AI will redefine industrial operations—but only if companies structure their data properly.
"AI needs high-quality, well-governed data to work. If your data is a mess, your AI models will be a mess too."
Some key AI trends he sees:
* AI-assisted predictive maintenance → Combining sensor data, historical trends, and real-time analytics to predict failures before they happen.
* Real-time anomaly detection → AI models can identify subtle changes in machine behavior and flag potential issues.
* Autonomous process control → Over time, AI will move from making recommendations to fully automating factory adjustments.
"Right now, AI is mostly about decision support. But in the next five years, we’ll see fully autonomous manufacturing systems emerging."
Final Thoughts: How Should Manufacturers Approach Data Strategy?
For companies starting their Industrial DataOps journey, Evan has a few key recommendations:
* Start with a strong data model → Don’t just collect data—structure it properly from day one.
* Invest in developers → The best data strategies aren’t IT-led or OT-led—they’re developer-led.
* Think hybrid → Balance edge and cloud storage to get the best of both worlds.
* Prepare for AI → Even if AI isn’t a priority now, organizing your data properly will make AI adoption easier in the future.
"Industrial data is evolving fast, but the companies that structure and govern their data properly today will have a huge advantage tomorrow."
Next Steps & More Resources
Industrial DataOps is no longer just a concept—it’s becoming a business necessity. Companies that embrace scalable data management and AI-driven insights will outpace competitors in efficiency and innovation.
If you want to learn more about InfluxDB and time-series data strategies, visit www.influxdata.com.
Stay Tuned for More!
Subscribe to our podcast and blog to stay updated on the latest trends in Industrial Data, AI, and IT/OT convergence.
🚀 See you in the next episode!
YouTube: https://www.youtube.com/@TheITOTInsider
Disclaimer: The views and opinions expressed in this interview are those of the interviewee and do not necessarily reflect the official policy or position of The IT/OT Insider. This content is provided for informational purposes only and should not be seen as an endorsement by The IT/OT Insider of any products, services, or strategies discussed. We encourage our readers and listeners to consider the information presented and make their own informed decisions.
Welcome back to the IT/OT Insider Podcast. In this episode, we dive deep into industrial data modeling, manufacturing execution systems (MES), and the rise of headless data platforms with Geoff Nunan, CTO and co-founder of Rhize.
Geoff has been working in industrial automation and manufacturing information systems for over 30 years. His experience spans multiple industries, from mining and pharmaceuticals to food & beverage. But what really drove him to start Rhize was a frustration many in the industry will recognize:
"MES solutions are either too rigid or too custom-built. We needed a third option—something flexible but structured, something that could scale without requiring endless software development."
Rhize is built around that idea. It’s a headless manufacturing data platform that allows companies to build custom applications on top of a standardized data backbone.
In today’s discussion, we explore why MES implementations often struggle, why data modeling is key to digital transformation, and how companies can avoid repeating the same mistakes when scaling industrial data solutions. Or in the words of Geoff:
“Data Modeling in manufacturing isn't optional. You're either going to end up with the model that you planned for or the one that you didn’t.”
Thanks for reading The IT/OT Insider! Subscribe for free to support our work:
Why Geoff co-founded Rhize: The MES Dilemma
Geoff’s journey to starting Rhize began with a frustrating experience at a wine bottling plant in Australia.
The company was implementing an MES solution to track downtime, manage inventory, and integrate with ERP. Sounds simple, right? But the project quickly became complex and expensive—and despite being an off-the-shelf solution, it required a lot of custom development.
"It was a simple MES use case, yet we spent 80% of our time on the 20% of requirements that didn’t fit the system. That’s the reality of most MES projects."
After seeing this pattern repeat across multiple industries, Geoff realized the problem wasn’t just the software—it was the entire approach.
* Off-the-shelf MES systems are often too rigid → They don’t adapt well to company-specific workflows.
* Custom-built solutions are too complex → They require too much development and long-term maintenance, especially in larger corporations.
* Manufacturing data needs structure, but also flexibility → There wasn’t a “headless” option that let companies build custom applications on a standardized data backbone.
So, seven years ago, Geoff and his team started Rhize, focusing on providing a flexible, open manufacturing data platform that supports modern low-code front-end applications.
"We don’t provide an MES. We provide the data foundation that lets you build MES-like applications the way you need them."
How Rhize Maps to the Industrial Data Platform Capability Model
One of the key themes of our podcast series is understanding where different solutions fit into the broader industrial data ecosystem.
So, how does Rhize align with our Industrial Data Platform Capability Map?
* Data Modeling → The core of Rhize. It provides a structured, standardized manufacturing data model based on ISA-95.
* Connectivity → Connects via open APIs and the most important industrial protocols.
* Workflow & Event Processing → Supports rules-based automation and event-driven manufacturing processes.
* Scalability → Built to support multi-site deployments with a common, reusable data architecture.
"Traditional MES forces you into a rigid workflow. With Rhize, you get the structure of MES but the flexibility to adapt it to your needs."
The Importance of Data Modeling in Manufacturing
A recurring theme in our conversation is data modeling—a topic that IT teams understand well, but OT teams often overlook.
Geoff explains why a strong data model is critical for industrial data success:
"Any IT system lives or dies by how well its data is structured. Yet in manufacturing, we often take a 'just send the data somewhere' approach without thinking about how to organize it for long-term use."
The problem? Without a structured approach:
* Data becomes siloed → Every plant has a different data format and naming convention.
* Scaling becomes impossible → A solution that works in one factory won’t work in another without extensive rework.
* AI and analytics won’t deliver value → Without consistent, contextualized data, AI models struggle to provide reliable insights.
Geoff believes companies need to adopt structured industrial data models—and the best foundation for that is ISA-95.
"ISA-95 gives us a common language to describe manufacturing. If companies start with this as their foundation, they avoid years of painful restructuring later."
A Real-World Use Case: Gold Traceability in Luxury Watchmaking
One of Rhize’s projects involved a luxury Swiss watchmaker trying to solve a complex traceability problem.
The Challenge:
* The company uses different grades of gold in its watches.
* Due to fluctuating gold prices, tracking material usage accurately was critical.
* The company needed mass balance tracking across all factories, but each plant had different processes and equipment.
The Solution:
* They implemented Rhize as a standardized data platform across all factories.
* They modeled gold usage at a granular level, ensuring every gram was accounted for.
* By unifying data across sites, they could benchmark efficiency and reduce material waste.
The Result:
* Improved material traceability, reducing financial loss from inaccurate tracking.
* More efficient use of gold, leading to millions in savings per year.
* A scalable system, enabling future expansion to other materials and components.
"They didn’t just solve a traceability problem. They built a data foundation that can now be extended to other manufacturing processes."
Why MES Projects Fail—and How to Avoid It
One of the biggest takeaways from our conversation is why MES implementations struggle.
Geoff has seen companies fail multiple times before getting it right, often repeating the same mistakes:
* Overcomplicating the data model → Trying to design for every possible scenario upfront.
* Lack of standardization → Each site implements MES differently, making it impossible to scale.
* Not considering long-term flexibility → A system that works now may not work five years from now.
His advice?
"Companies need to move away from 'big bang' MES rollouts. Start with a strong data model, implement a scalable data platform, and build applications on top of that."
The Role of UNS in Data Governance
Unified Namespace (UNS) has been a hot topic in recent years, but how does it fit into manufacturing data management?
Geoff sees UNS as a useful tool, but not a silver bullet:
* It helps with real-time data sharing, but without a structured data model, it can quickly become a mess.
* Companies should see UNS as part of their data strategy, not the entire strategy.
"If you don’t start with a structured data model, UNS can become an uncontrolled stream of unstructured data. Governance is key."
Final Thoughts
Industrial data is evolving fast, but companies that don’t invest in proper data modeling will struggle to scale.
Rhize is tackling this problem by providing a structured but flexible data platform, allowing manufacturers to build applications the way they need—without the limitations of traditional MES.
If you want to learn more about Rhize and their approach to industrial data, visit www.rhize.com.
Welcome to Episode 6 of our Industrial DataOps podcast series. Today, we’re diving into a conversation with Joel Jacob, Principal Product Manager at Splunk, about the company’s growing focus on OT, its approach to industrial data analytics, and how it fits into the broader ecosystem of industrial platforms.
Splunk is a name that’s well known in IT and cybersecurity circles, but its role in industrial environments is less understood. Now, as part of Cisco, Splunk is positioning itself at the intersection of IT observability, security, and industrial data analytics. This episode is all about understanding what that means in practice.
Thanks for reading The IT/OT Insider! Subscribe for free to receive new Industrial DataOps Insights and support our work.
From IT and Cybersecurity to Industrial Data
Joel’s journey into Splunk mirrors the company’s shift into OT. Coming from a background in robotics, automotive, and smart technology, he initially saw Splunk as a security and IT analytics company. But what he found was a growing demand from industrial customers who were already using Splunk for OT use cases.
"A lot of customers had already started using Splunk for OT, and the company realized it needed people with industrial experience to support that growing demand."
Splunk has built its reputation on handling log data, security monitoring, and IT observability. But as Joel explains, industrial data has its own challenges, and Splunk has had to adapt.
How Splunk Fits into the Industrial Data Platform Capability Map
To make sense of where Splunk fits, we look at our Industrial Data Platform Capability Map—a framework that defines the core building blocks of an industrial data strategy.
Splunk’s Strengths:
* Data Storage and Analytics: This is where Splunk is strongest. The platform can ingest, store, and analyze massive amounts of data, whether it’s sensor data, log files, or security events.
* Data Quality and Federation: Splunk allows companies to store raw data and extract value dynamically, rather than forcing them to clean and standardize everything upfront. Its federated search capabilities also mean that data doesn’t have to be centralized—a key advantage for IT/OT integration.
* Visualization and Dashboards: With Dashboard Studio, Splunk provides modern, customizable visualizations that stand out from traditional industrial software.
Where Splunk is Expanding:
* Connectivity and Edge Computing: Historically, getting industrial data into Splunk required external middleware. But in the last 18 months, the company has introduced an edge computing device with built-in AI capabilities, making it easier to ingest and process OT data directly.
* Edge Analytics and AI: The Splunk Edge Hub enables local AI inferencing and analytics on industrial equipment, addressing latency and connectivity challenges that arise when relying on cloud-based models.
Joel sees this as a natural evolution:
"We know that moving all industrial data to the cloud isn’t always practical. By adding edge computing capabilities, we make it easier for OT teams to process data where it’s generated."
A Real-World Use Case: Energy Optimization in Cement Manufacturing
One of Splunk’s key industrial customers, Cementos Argos, is a major cement producer facing a common challenge—high energy costs and carbon emissions.
The Problem:
* Cement manufacturing is one of the most energy-intensive industries in the world.
* The company needed a way to optimize kiln operations while ensuring consistent product quality.
* Traditional manual adjustments were slow and lacked real-time visibility.
The Solution:
* The company ingested data from OT systems into Splunk.
* Using the Machine Learning Toolkit, they built predictive models to optimize kiln temperature and pressure settings.
* These models were then pushed back to PLCs, allowing automated process adjustments.
The Results:
* $10 million in annual energy savings across multiple sites.
* The ability to push AI models to the edge reduced response times by 20%.
* Operators could now trust AI-generated recommendations, while still overriding changes if needed.
"The combination of machine learning and real-time process control created a true closed-loop optimization system."
Federated Search: A Different Approach to Industrial Data
One of Splunk’s unique contributions to industrial data management is federated search. Unlike traditional platforms that require all data to be centralized, Splunk allows companies to analyze data across multiple sources in real-time.
Joel explains the shift in thinking:
"Most industrial data strategies assume you need a single source of truth. But in reality, data lives in multiple places, and moving it all is expensive. With federated search, we can analyze data wherever it resides—whether it’s on-prem, in the cloud, or at the edge."
This is a major departure from the “data lake” approach that many industrial companies have pursued. Instead of trying to move and harmonize all data upfront, Splunk’s model is about leaving data where it makes the most sense and analyzing it dynamically.
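Conceptually, federated search fans the same query out to each source and merges the answers, rather than copying the data first. A toy sketch (a real engine pushes the predicate down to each source instead of pulling rows back):

```python
def federated_search(sources, predicate):
    """Run one predicate against several independent data sources and
    merge the hits, tagging each with where it came from.
    A conceptual sketch, not Splunk's federated search implementation."""
    results = []
    for name, rows in sources.items():
        for row in rows:
            if predicate(row):
                results.append({**row, "_source": name})
    return results
```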
How IT and OT Collaboration is Changing
Bridging the IT/OT divide has been a theme across this podcast series, and Splunk’s approach to security and data federation provides a unique perspective on this challenge.
Joel shares some key insights on what makes collaboration successful:
* Security is often the bridge. Since IT teams already use Splunk for security monitoring, they are more open to OT data integration when it’s part of a broader cybersecurity strategy.
* OT needs tools that don’t slow them down. Engineers don’t want to wait for IT approval to test new models. That’s why Splunk’s edge device was designed to be easily deployable by OT teams.
* The next generation of engineers is more IT-savvy. Younger engineers entering the workforce are more comfortable with IT tools and cloud environments, making collaboration easier.
One of the most interesting points was how Splunk leverages its Cisco partnership to expand into OT environments:
"Cisco has an enormous footprint in industrial networking. By running analytics on Cisco switches and edge devices, we can make OT data integration seamless."
The Role of AI in Industrial Data
Like many companies, Splunk is exploring the role of AI and generative AI in industrial environments. One of the most promising areas is automating data analysis and dashboard creation.
Joel shares how this is already happening:
* AI-generated dashboards: Engineers can simply describe what they want in natural language, and Splunk’s AI generates the necessary queries and visualizations.
* Low-code model deployment: Instead of manually writing Python scripts, users can export machine learning models with a single click.
* Multimodal AI: By combining sensor data, image recognition, and sound analysis, AI models can detect patterns that human operators might miss.
"In the next few years, AI will make it dramatically easier to analyze and visualize industrial data—without requiring deep programming expertise."
Final Thoughts
Splunk’s journey into OT is a great example of how traditional IT platforms are adapting to the realities of industrial environments. While the company’s core strength remains in data analytics and security, its expansion into edge computing and OT integration is opening up new possibilities for manufacturers.
If you want to learn more about how Splunk is evolving in the OT space, check out their website: www.splunk.com.
Welcome to another episode of the IT/OT Insider Podcast. In this special series on Industrial DataOps, we’re diving into the world of real-time industrial data, edge computing, and scaling digital transformation. Our guest today is John Younes, Co-founder and COO of Litmus, a company that has been at the forefront of industrial data platforms for the past 10 years.
Litmus is a name that keeps popping up when we talk about bridging OT and IT, democratizing industrial data, and making edge computing scalable. But what does that actually mean in practice? And how does Litmus help manufacturers standardize and scale their industrial data initiatives across multiple sites?
That’s exactly what we’re going to explore today.
Thanks for reading The IT/OT Insider! Subscribe for free to receive new DataOps insights and support our work.
Litmus, you say?
John introduces Litmus as an Industrial DataOps platform, designed to be the industrial data foundation for manufacturers. The goal? To make industrial data usable, scalable, and accessible across the entire organization.
"We help manufacturers connect to any type of equipment, normalize and store data locally, process it at the edge, and then integrate it into enterprise systems—whether that’s cloud, AI platforms, or business applications."
At the core of Litmus’ offering is Litmus Edge, a factory-deployable edge data platform. It allows companies to:
* Connect to industrial equipment using built-in drivers.
* Normalize and store data locally, enabling real-time analytics and processing.
* Run AI models and analytics workflows at the edge for on-premise decision-making.
* Push data to cloud platforms like Snowflake, Databricks, AWS, and Azure.
For enterprises with multiple factories, Litmus Edge Manager provides a centralized way to manage and scale deployments, allowing companies to standardize use cases across multiple plants.
"We don’t just want to collect data. We want to help companies actually use it—to make better decisions and improve efficiency."
How Litmus Maps to the Industrial Data Platform Capability Model
We always refer to our Industrial Data Platform Capability Map to understand how different technologies fit into the broader IT/OT data landscape. So where does Litmus fit in?
* Connectivity → One of Litmus’ core strengths. Their platform connects to PLCs, SCADA, MES, historians, and IoT sensors out of the box.
* Edge Compute and Store → Litmus processes and optionally stores data locally before sending it to the cloud, reducing costs and improving real-time responsiveness.
* Data Normalization & Contextualization → The platform includes a data modeling layer, making sure data is structured and usable for enterprise applications.
* Analytics & AI → Companies can run KPIs like OEE, asset utilization, and energy consumption directly on the edge.
* Scalability & Management → With Litmus Edge Manager, enterprises can deploy and scale their data infrastructure across dozens of plants without having to rebuild everything from scratch.
John explains:
"The biggest challenge in industrial data isn’t just connecting things—it’s making that data usable at scale. That’s why we built Litmus Edge Manager to help companies replicate use cases across their entire footprint."
A Real-World Use Case: Standardizing OEE Across 35 Plants
One of the most compelling Litmus deployments comes from a large European food & beverage manufacturer with 50+ factories.
The Challenge:
* The company had grown through acquisitions, meaning each factory had different equipment, different systems, and different data formats.
* They wanted to standardize OEE (Overall Equipment Effectiveness) across all plants to benchmark performance and identify inefficiencies.
* They needed a way to deploy an Industrial DataOps solution at scale—without taking years to implement.
The Solution:
* The company deployed Litmus Edge in 35 factories within 12-18 months.
* They standardized KPIs like OEE across all plants, providing real-time insights into performance.
* By filtering and compressing data at the edge, they reduced cloud storage costs by 90%.
* They also introduced energy monitoring, identifying unused machines running during non-production hours, leading to 4% energy savings per plant.
The Impact:
* Faster deployment: The project was rolled out with just a small team, proving that scalability in industrial data is possible.
* Cost savings: Less unnecessary cloud storage and lower energy usage translated to significant financial gains.
* Enterprise-wide visibility: For the first time, they could compare OEE across all plants and identify best practices for process optimization.
"With Litmus, they didn’t just deploy a one-off use case. They built a scalable, repeatable data foundation that they can expand over time."
The Challenge of Scaling Industrial Data
One of the biggest barriers to industrial digitalization is scalability. IT systems are designed to scale effortlessly—but factory environments are different.
John explains:
"Even within the same factory, two production lines might be completely different. How do you deploy a use case that works across all sites without starting from scratch every time?"
His answer? A standardized but flexible approach.
* 80% of the deployment can be standardized.
* 20% requires last-mile configuration to account for machine variations.
* A central management platform ensures that scaling doesn’t require an army of engineers.
"The key is having a platform that adapts to different machines and processes—without forcing companies to custom-build everything for each site."
Data Management: The Next Big IT/OT Challenge
As industrial companies push for enterprise-wide data strategies, data management is becoming a bigger issue.
John shares his take:
"IT teams have been doing data management for years. But in OT, data governance is still a new concept."
Some of the biggest challenges he sees:
* Legacy data formats and siloed systems make data hard to standardize.
* Different plants use different naming conventions, making data aggregation difficult.
* Lack of clear ownership—Who is responsible for defining the data model? IT? OT? Corporate?
To address this, Litmus introduced a Unified Namespace (UNS) solution, allowing companies to enforce data models from enterprise level down to individual assets.
"We’re seeing more companies set up dedicated data teams—because without good data management, AI and analytics won’t work properly."
The Role of AI in Industrial Data
AI is the hottest topic in manufacturing right now, but how does it actually fit into industrial data workflows?
John sees two major trends:
* AI-powered analytics at the edge
* Instead of just sending raw data to the cloud, companies are running AI models directly on edge devices.
* Example: AI detecting machine anomalies and recommending preventative actions to operators before failures occur.
* AI-assisted deployment & automation
* Litmus is using AI to simplify Industrial DataOps—automating edge deployments across multiple sites.
* Example: Instead of manually configuring devices, users can type a command like “Deploy Litmus Edge to 30 plants with Siemens drivers”, and the system automates the entire process.
"AI won’t replace humans on the shop floor anytime soon. But it will make deploying, managing, and using industrial data significantly easier."
Final Thoughts
Industrial DataOps is no longer just a technical experiment—it’s becoming a business necessity. Companies that don’t embrace scalable data management and AI-driven insights risk falling behind their competitors.
Litmus is tackling the problem head-on by providing a standardized but flexible way to ingest, process, and scale industrial data.
If you want to learn more about Litmus and their approach to Industrial DataOps, check out their website: www.litmus.io.
If you’re visiting Hannover Messe, find them in Hall 16 Booth B06. More information here: https://litmus.io/hannover-messe
Welcome to Episode 4 of our special podcast series on Industrial DataOps. Today, we’re joined by Aron Semle, CTO at HighByte, to discuss how contextualized industrial data, Unified Namespace (UNS), and Edge AI are transforming IT/OT collaboration.
Aron has spent over 15 years working in industrial connectivity, starting his career at Kepware (later acquired by PTC) before joining HighByte in 2020. With a deep understanding of industrial data integration, he shares insights on why DataOps matters, what makes or breaks a data strategy, and how organizations can scale their industrial data initiatives.
Thanks for reading The IT/OT Insider! Subscribe for free to receive our weekly insights.
Who is HighByte?
HighByte is focused on Industrial DataOps—helping companies connect, contextualize, and share industrial data at scale. The platform bridges the gap between OT and IT, ensuring that manufacturing data is structured, clean, and ready for enterprise systems.
Aron sums it up perfectly:
"We solved connectivity years ago, but we never put context around data. Industrial DataOps is about fixing that—so IT teams actually understand the data coming from OT systems."
This contextualization challenge is at the heart of Industrial DataOps, and it’s why companies are moving beyond simple connectivity toward structured, enterprise-ready industrial data.
What is Industrial DataOps?
Many organizations struggle with fragmented, unstructured data in manufacturing. Aron defines Industrial DataOps as:
* An IT-driven discipline applied to OT
* The process of structuring, transforming, and sharing industrial data
* A bridge between factory systems and enterprise applications
Unlike traditional IT DataOps tools, Industrial DataOps must handle:
* Unstructured, time-series data from OT systems
* Multiple industrial protocols (OPC UA, MQTT, Modbus, etc.)
* On-prem, edge, and cloud data architectures
In short, Industrial DataOps is not just about moving data—it’s about making it usable.
Mapping HighByte to the Industrial Data Platform Capability Model
In our podcast series, we’ve introduced the Industrial Data Platform Capability Map—a framework that helps organizations understand the building blocks of industrial data platforms.
Where Does HighByte Fit?
* Connectivity → HighByte ingests data from PLCs, SCADA, MES, historians, databases, and files.
* Contextualization → HighByte’s core strength. It structures data into reusable models before sending it to IT.
* Data Sharing → The platform delivers industrial data in IT-ready formats for BI tools, data lakes, and analytics platforms.
* Storage, Analytics & Visualization → HighByte does not store data or provide analytics. Instead, it feeds high-quality data to existing enterprise tools.
Aron explains the reasoning behind this approach:
"If we started adding storage and visualization, we’d just compete with existing factory systems. Instead, we make sure they work better."
A Real-World Use Case: Detecting Stuck AGVs in Warehouses
One of HighByte’s customers—a global manufacturer with hundreds of warehouses—used Industrial DataOps to optimize autonomous guided vehicles (AGVs).
The Challenge:
* The company used multiple AGV vendors, each with different protocols (Modbus, OPC UA, MQTT).
* Some AGVs would get stuck in corners, causing downtime and inefficiencies.
* Operators had no way to detect when an AGV was stuck across multiple sites.
The Solution:
* HighByte created a standardized data model for AGVs across all sites.
* The platform unified AGV data from different vendors and protocols.
* AWS Lambda functions processed AGV data in real-time to detect and alert operators.
The Results:
* Operators received real-time alerts when AGVs got stuck.
* Downtime was minimized, improving warehouse efficiency.
* The solution was scalable across all sites, reducing integration costs.
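Once AGV positions from every vendor land in one standardized model, "stuck" detection can be as simple as checking whether a vehicle's recent positions stay inside a small box. A vendor-neutral sketch (the sample count and tolerance are invented thresholds):

```python
def stuck_agvs(positions, min_samples=5, tol=0.1):
    """Flag AGVs whose reported position has barely moved over the
    last `min_samples` readings. `positions` maps AGV id to a list
    of (x, y) samples, newest last. Illustrative sketch only."""
    stuck = []
    for agv, pts in positions.items():
        if len(pts) < min_samples:
            continue                      # not enough history yet
        recent = pts[-min_samples:]
        xs = [p[0] for p in recent]
        ys = [p[1] for p in recent]
        if max(xs) - min(xs) <= tol and max(ys) - min(ys) <= tol:
            stuck.append(agv)
    return stuck
```

In the deployment described, this kind of check ran in AWS Lambda against the unified data stream, so one function covered every vendor and every site.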
Unified Namespace (UNS): Buzzword or Game-Changer?
The concept of Unified Namespace (UNS) has exploded in popularity, but what does it actually mean?
According to Aron:
"A lot of people think of UNS as just MQTT and a broker, but it’s more than that. It’s a logical way to structure and contextualize industrial data—making it accessible across IT and OT."
Aron warns against over-engineering UNS:
"If you spend six months defining the perfect UNS model, but no one uses it, what did you actually achieve?"
Instead, he recommends a use-case-driven approach, where UNS evolves organically as new applications require structured data.
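To make that concrete: a UNS is less about any single broker and more about an agreed hierarchical naming and payload convention. A minimal sketch, assuming an ISA-95-style hierarchy (the names and payload fields here are illustrative, not a standard):

```python
import json

def uns_topic(enterprise: str, site: str, area: str,
              line: str, tag: str) -> str:
    """Build an ISA-95-style Unified Namespace topic path."""
    return "/".join([enterprise, site, area, line, tag])

def uns_payload(value, unit: str, timestamp_ms: int) -> str:
    """Wrap a raw value with the context that makes it self-describing."""
    return json.dumps({"value": value, "unit": unit, "timestamp": timestamp_ms})

topic = uns_topic("acme", "antwerp", "packaging", "line3", "temperature")
# A real setup would publish this via an MQTT client, typically retained
# so late subscribers see the current state, e.g.:
# client.publish(topic, uns_payload(72.4, "degC", 1716800000000), retain=True)
```

The point of the use-case-driven approach is that this hierarchy can start with one line and one tag, and grow as new applications need structured data.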
Scaling DataOps: What Makes or Breaks a Data Strategy?
Aron has seen countless industrial data projects, and he knows what works—and what doesn’t.
Signs of a Failing Data Strategy:
🚩 IT wants to push all factory data to the cloud without defining use cases.
🚩 OT ignores IT and builds custom, local integrations that don’t scale.
🚩 No executive sponsorship to drive alignment across teams.
What Works?
✅ IT and OT collaboration—creating a DataOps team that manages data models and flows.
✅ Use-case-driven approach—focusing on practical business outcomes rather than just moving data.
✅ Scalable architecture—ensuring that data pipelines can expand over time without major rework.
Aron summarizes:
"If IT and OT aren’t working together, your data strategy is doomed. The best companies build cross-functional teams that manage data, not just technology."
Edge AI: The Next Big Thing?
While most AI in manufacturing has focused on cloud-based analytics, Aron believes Edge AI will change the game—especially for real-time operator assistance.
What is Edge AI?
* AI models run locally on edge devices, rather than in the cloud.
* Reduces latency, data transfer costs, and security risks.
* Ideal for operator support, real-time recommendations, and process optimization.
Early Use Cases:
* Operator guidance—Providing real-time suggestions to improve efficiency.
* Process optimization—AI-driven adjustments to production settings.
* Fault detection—Identifying anomalies at the edge before failures occur.
While AI isn’t ready for fully closed-loop automation yet, Aron sees huge potential for AI-driven insights to help human operators make better decisions.
Final Thoughts & What’s Next?
We had an amazing discussion with Aron Semle, who shared insights on Industrial DataOps, UNS, Edge AI, and scaling industrial data strategies.
If you’re interested in learning more about HighByte, check out their website: www.highbyte.com.
Stay Tuned for More!
Subscribe to our podcast and blog to stay updated on the latest trends in Industrial Data, AI, and IT/OT convergence.
🚀 See you in the next episode!
Youtube: https://www.youtube.com/@TheITOTInsider
Disclaimer: The views and opinions expressed in this interview are those of the interviewee and do not necessarily reflect the official policy or position of The IT/OT Insider. This content is provided for informational purposes only and should not be seen as an endorsement by The IT/OT Insider of any products, services, or strategies discussed. We encourage our readers and listeners to consider the information presented and make their own informed decisions.
Welcome to Episode 3 of our special podcast series on Industrial DataOps. Today, we’re excited to sit down with Andrew Waycott, President and Co-founder of TwinThread, to explore how AI and Digital Twins can transform manufacturing operations.
Andrew has been working with industrial data for over 30 years, from building MES and historian solutions to developing real-time AI-driven optimization at TwinThread. In this episode, we discuss the state of industrial data, the role of AI, and why closed-loop automation is the future of AI in manufacturing.
What is TwinThread?
TwinThread was founded with a simple but powerful mission: Make AI accessible to non-technical engineers in manufacturing.
As Andrew explains:
"Most engineers in manufacturing shouldn’t have to become data scientists to solve industrial problems. TwinThread is about giving them AI-powered tools they can actually use."
The platform covers data ingestion, contextualization, AI analytics, and closed-loop optimization, all while allowing manufacturers to start small, scale fast, and operationalize AI without massive IT overhead.
Mapping TwinThread to the Industrial Data Platform Capability Model
For those following our podcast series, you know we’ve been refining our Industrial Data Platform Capability Map—a framework to understand how different vendors fit into the industrial data ecosystem. Andrew breaks it down step by step:
* Connectivity: TwinThread ingests data from a wide range of industrial systems—Historians, OPC, MES, databases, IoT platforms, and MQTT.
* Digital Twin & Contextualization: The platform structures data into Digital Twins, modeling not just assets, but also maintenance, production, and process relationships.
* Data Cleaning & Quality: TwinThread automates the process of cleaning, organizing, and adding context to industrial data.
* Data Storage: While TwinThread functions as a cloud historian, it doesn’t require companies to replace existing on-prem historians.
* Analytics: The core strength of TwinThread is its ability to analyze and optimize processes using AI, applying predictive models to industrial operations.
* Data Sharing: The platform generates curated datasets—ready for BI tools like PowerBI, Snowflake, or Databricks—allowing manufacturers to turn raw data into actionable insights.
* Visualization & Dashboards: Unlike traditional generic dashboards, TwinThread provides visual tools optimized for operational decision-making.
As Andrew puts it:
"We don’t just show data. We help you solve problems—whether that’s quality optimization, energy efficiency, or predictive maintenance."
A Real-World Use Case: Quality Optimization at Hill’s Pet Food
One of TwinThread’s most successful deployments is with Hill’s Pet Food (a Colgate company), where they’ve transformed quality control across all global production lines.
The Challenge:
* Dog and cat food requires strict control of moisture, fat, and protein levels to ensure product consistency and compliance.
* Manual adjustments led to variability, waste, and inefficiencies.
* Traditional sampling-based quality control meant problems were discovered too late—after bad batches were already produced.
The Solution:
* TwinThread integrates with Hill’s existing infrastructure, pulling data from historians and process control systems.
* Their Perfect Quality AI Module predicts final product quality in real time—before production is complete.
* The system automatically optimizes setpoints at the beginning of the line, ensuring the process always stays within ideal quality parameters.
The Results:
* No more bad batches—quality issues are detected and corrected before they occur.
* Maximized yield & cost efficiency, as AI continuously fine-tunes production to hit quality targets at the lowest possible cost.
* Scalability—The system is now running on 18 production lines worldwide.
And perhaps most impressively:
"We implemented a fully closed-loop, AI-powered quality control system—probably the first of its kind in the food industry."
Closed-Loop AI: The Key to Scalable Industrial Automation
Many companies struggle to move beyond pilot projects because AI-driven insights still require manual intervention. TwinThread changes that with closed-loop AI.
Instead of just providing insights, the system automatically adjusts process parameters to maintain optimal performance.
Andrew explains:
"A lot of people think closed-loop automation means making adjustments every millisecond. But in reality, most industrial processes don’t need real-time micro-adjustments—what they need is the ability to make controlled, intelligent changes at regular intervals."
At Hill’s Pet Food, AI-generated adjustments are sent directly to the control system, where operators can:
* Manually review recommendations before applying them.
* Auto-accept adjustments within pre-set limits.
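The review/auto-accept guardrail can be pictured in a few lines of logic. This is a sketch of the pattern Andrew describes, not TwinThread's actual mechanism; the limits and names are illustrative:

```python
def apply_setpoint_change(current: float, recommended: float,
                          lo: float, hi: float,
                          max_step: float) -> tuple[float, bool]:
    """Guardrail for AI-recommended setpoint changes: auto-accept small
    moves inside the approved band, flag anything else for operator review.
    Returns (new_setpoint, needs_review)."""
    step_ok = abs(recommended - current) <= max_step
    in_band = lo <= recommended <= hi
    if step_ok and in_band:
        return recommended, False   # auto-accepted within pre-set limits
    return current, True            # hold current value, ask the operator
```

The design choice is deliberate: the AI never gets unconstrained write access to the process, which is what makes operators comfortable closing the loop.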
Why Closed-Loop AI Matters:
* Eliminates the risk of “shelfware”—AI models that aren’t actively used often get abandoned.
* Ensures long-term impact—AI insights become part of daily operations, not just a one-time report.
* Frees up operators—Instead of constantly tweaking processes, they focus on higher-value tasks.
The IT/OT Divide: What Makes AI Projects Succeed?
One of the biggest barriers to AI adoption in manufacturing is organizational silos between IT and OT.
Red flags in AI projects:
* No IT/OT collaboration—When IT and OT teams don’t align, AI solutions often fail to scale beyond pilots.
* No senior-level sponsorship—Without executive buy-in, projects get stuck in proof-of-concept mode.
* Lack of automation maturity—Companies still manually tracking process variables on paper aren’t ready for advanced AI-driven optimization.
Andrew sees a major shift happening:
"Nine years ago, getting buy-in for AI in manufacturing was nearly impossible. Today, leadership teams actively want AI solutions—but they need a clear roadmap to operationalize them."
Standardization: The Next Big Challenge for Industrial AI
Despite advances in AI and cloud data storage, the industrial world still lacks standardized ways to store and structure data.
Andrew warns:
"Every company is reinventing the wheel—creating their own custom data lakes with unique structures. That makes it nearly impossible to build scalable, interoperable AI solutions."
Andrew suggests the industry needs a standardized approach to cloud-based industrial data storage—similar to how Sparkplug B standardized MQTT architectures.
Final Thoughts
We had a fantastic conversation with Andrew Waycott, who shared insights on AI, Digital Twins, and scaling industrial automation.
If you’re interested in learning more about TwinThread, check out their website: www.twinthread.com.
Or visit them at the Hannover Messe at the AWS Booth, Hall 15, Stand D76. More information can be found on the HMI website.
Welcome to Episode 2 of our special podcast series on Industrial Data. Today, we’re joined by Martin Thunman, CEO and co-founder of Crosser. Together with David and Willem, we dive deep into Industrial DataOps, IT/OT integration, and how real-time processing is shaping the future of manufacturing.
What is Crosser?
Crosser is a next-generation integration platform built specifically for industrial environments. It acts as the intelligent layer between OT, IT, cloud, and SaaS applications. As Martin puts it:
"We see ourselves as a combination of Industrial DataOps, next-generation iPaaS, and a real-time stream and event processing platform—all in one."
For those unfamiliar with iPaaS (Integration Platform as a Service), Martin explains how traditional integration platforms started with enterprise service buses (ESB), then evolved into cloud-based solutions. Crosser takes this further by integrating both industrial and enterprise data in a way that not only moves data but also processes and transforms it in real time.
Mapping Crosser to the Industrial Data Platform Capability Model
The Industrial Data Platform Capability Map was created to help companies make sense of the complex ecosystem of industrial data platforms. When asked where Crosser fits in, Martin identified key areas where they outperform:
* Connectivity: Crosser enables companies to connect to over 800 different systems, from ERP and MES to QMS and supply chain applications. However, Martin emphasizes that connectivity alone is not enough.
* Data in Motion & Transformation: Crosser doesn’t store data; instead, it enables real-time analytics and transformation at the edge. Martin notes: "If you have a platform that connects data, why not take the opportunity to do something with it while moving it?"
* Analytics: Companies are increasingly running machine learning models at the edge for anomaly detection, predictive maintenance, and real-time decision-making. Crosser enables closed-loop automation, where anomalies can trigger automatic machine stoppages or dynamic work order creation.
One area where Crosser can also help is in the "supporting capabilities", such as deployment, monitoring, and user management. Or in Martin’s words:
"Boring enterprise features like deployment and monitoring are actually critical when rolling out solutions across multiple sites."
A Real-World Use Case: Real-Time Anomaly Detection & Automated Work Orders
One concrete example of Crosser in action involves real-time anomaly detection in an industrial setting. Here’s how it works:
* Step 1: Data is collected in real-time from a plant historian with thousands of data tags.
* Step 2: Anomalies are detected using fixed rules or machine learning models at the edge.
* Step 3: If an issue is found, an automated work order is sent to SAP, triggering maintenance actions without human intervention.
This closed-loop automation prevents failures before they happen and reduces downtime.
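The three steps above map naturally onto a small pipeline. The sketch below is purely illustrative: the tag names, limits, and work-order payload are made up, and a real deployment would hand the payload to SAP's API rather than just build a dict:

```python
def check_tag(tag: str, value: float, limits: dict):
    """Step 2: flag a reading that breaks a fixed rule (low/high limit).
    Returns an anomaly record, or None if the value is in range."""
    lo, hi = limits[tag]
    if lo <= value <= hi:
        return None
    return {"tag": tag, "value": value, "limit": (lo, hi)}

def to_work_order(anomaly: dict, equipment: str) -> dict:
    """Step 3: shape the anomaly into a maintenance work-order request."""
    return {
        "equipment": equipment,
        "priority": "high",
        "description": f"{anomaly['tag']}={anomaly['value']} "
                       f"outside {anomaly['limit']}",
    }

# Step 1 (collection from the historian) is simulated by a single reading.
LIMITS = {"pump1/vibration": (0.0, 4.0)}
anomaly = check_tag("pump1/vibration", 6.2, LIMITS)
if anomaly:
    order = to_work_order(anomaly, equipment="PUMP-001")
```

In the Crosser flow described above, each step would be a node in a low-code pipeline; the rule check could equally be swapped for a machine learning model at the edge.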
Breaking Down IT and OT Silos
One of the biggest challenges in industrial digitalization is the disconnect between IT and OT teams. Martin highlights how modern industrial environments require collaboration between multiple skill sets:
* OT Teams → Understand machine data, sensors, and processes.
* Data Science Teams → Develop machine learning models.
* IT Teams → Manage cloud, enterprise systems, and security.
Traditionally, these groups have worked in silos, making IT/OT convergence difficult. Crosser’s low-code approach aims to bridge the gap, allowing different teams to collaborate on the same workflows.
"OT knows their machines, IT knows their systems, and data scientists know their models. The challenge is getting them to work together."
Final Thoughts & What’s Next?
We had a fantastic discussion with Martin Thunman, who shared valuable insights into the future of industrial data processing.
If you’re interested in learning more about Crosser, check out their website: www.crosser.io.
Welcome to Episode 1 of our special podcast series on Industrial Data.
In this episode, David and Willem take you behind the scenes of their Industrial Data Platform Capability Map—a structured way to understand how organizations can truly leverage their industrial data. David talks about the role of a platform and which capabilities are needed to build it. He also focuses on the role of Data Management and how that is linked to building a Unified Namespace.
But that's just the beginning!
We’ve lined up a series of exciting conversations with industry tech leaders, showcasing their solutions and cutting-edge innovations in industrial data platforms. And the timing couldn’t be better: with Hannover Messe just around the corner, data will undoubtedly be one of the hottest topics on everyone’s mind.
If you are interested in Industrial Data, you should definitely review these earlier articles: Part 1 (The IT and OT view on Data), Part 2 (Introducing the Operational Data Platform), Part 3 (The need for Better Data), Part 4 (Breaking the OT Data Barrier: It's the Platform), Part 5 (The Unified Namespace) and Part 6 (The Industrial Data Platform Capability Map)
Manufacturing has long been the backbone of global economies, yet the industry often remains hidden in plain sight, tucked away in industrial parks and misunderstood by the public. In this episode David was joined by Mike Nager.
Mike Nager is a passionate advocate for Smart Manufacturing, with a career that began in electrical engineering, where he worked closely with manufacturers to automate and optimize their production processes. Over time, he visited hundreds of plants—ranging from automotive and pharmaceuticals to paper mills and tire factories—each with its own unique challenges and stories. From the carbon-black-coated environments of tire production to the ultra-clean rooms of semiconductor manufacturing, Mike witnessed firsthand the diversity of manufacturing and the dedication of the people behind the scenes.
“It’s a world most people never get to see and part of my mission is to provide a window into that world.”
He currently serves as a Business Development Executive at Festo Didactic, the technical education arm of the Festo Group, which provides equipment and solutions to prepare the workforce of tomorrow—a mission that’s more important now than ever.
As if that weren’t enough, Mike is also an author, having published several engaging books, including All About Smart Manufacturing, a children’s book with delightful illustrations, and his Smart Student's Guide, aimed at helping students navigate the path to manufacturing careers.
Addressing the Awareness Gap
One of Mike’s key messages is the need to bridge the “awareness gap” in manufacturing. For years, the perception of manufacturing as dirty, dangerous, and undesirable work has discouraged young people from pursuing these careers. However, as Mike explained, the tide is turning. Modern manufacturing offers high-paying, stable careers in fields like robotics, automation, and data analysis.
We talked about how technical education can be a pathway to well-paying jobs, even for those without four-year degrees. “In some regions, students who complete just a year or two of technical training can go from earning minimum wage to $40 or $50 an hour with overtime,” he said. “It’s a massive opportunity for those who are willing to learn.”
The Role of Education in Revitalizing Manufacturing
As part of his work, Mike collaborates with educators to create hands-on training programs that prepare students for real-world manufacturing environments. Inspired by the German apprenticeship model, these programs emphasize learning by doing, providing students with the skills they need to succeed on the factory floor.
Yet, as Mike pointed out, the U.S. education system faces unique challenges. Unlike Germany, where apprenticeships are embedded in the culture, the U.S. relies heavily on public education to develop technical skills. This gap in structured training has made it even more critical to create accessible and engaging educational resources.
A Mission to Inspire—From High School to Children’s Books
Mike has taken a creative approach to inspiring interest in manufacturing. In addition to his professional work, he’s authored a children’s book, All About Smart Manufacturing, and a high school-focused Smart Students Guide. These books introduce young readers to the possibilities of manufacturing careers, using relatable language and illustrations to make the subject approachable.
“The first book was aimed at high school students, but I realized they’d already chosen their paths,” Mike explained. “That’s when I decided to write for younger kids, to plant the seed of curiosity early on.”
The Future of Manufacturing Careers
We also touched on broader trends shaping the industry, such as the push for local manufacturing due to national security concerns and the growing need for technical talent in an increasingly automated world. Mike emphasized that while automation is transforming processes, people remain at the heart of manufacturing.
“The idea of a ‘lights-out factory’—completely automated with no people—has been talked about for decades. But in reality, people are still essential, and their roles are evolving to require more technical and analytical skills.”
Closing Thoughts
Mike’s passion for manufacturing and education is clear: from his hands-on work with educators to his mission of raising awareness through books and outreach. His vision for the future of manufacturing is one where education, automation, and human creativity come together to revitalize the industry.
Or as Mike put it:
“Manufacturing is one of the few industries that truly creates wealth. It’s not just about making things—it’s about building communities and creating opportunities.”
Whether you’re an educator, a parent, or simply curious about the future of manufacturing, Mike’s insights are a valuable reminder of the importance of inspiring the next generation. As the industry evolves, it’s clear that the need for skilled, passionate people will only grow.
Find Mike on LinkedIn: https://www.linkedin.com/in/mikenager/
Interested in one of his books? Printed and e-book versions available here: https://www.industrialinsightsllc.com/#books
This episode was one of our most engaging yet on the topic of AI. David sat down with Dr. Wilhelm Klein, an expert in Automated Quality Control and holder of a PhD in Ethics. As the co-founder and CEO of Zetamotion, Wilhelm brings a mix of hands-on experience and deep understanding of the ethical questions surrounding technology.
Over the hour, we covered a lot of ground—how AI has evolved, its role in manufacturing, and the challenges of scaling systems from Proof of Concept (PoC) to full production. Wilhelm explained how computer vision is changing quality control. We also explored the ethical questions raised by AI, touching on its impact on industries, jobs, and decision-making.
If you’re interested in AI’s practical applications and the questions it raises about the way we work and live, this episode has plenty to offer.
The Starship Enterprise
Wilhelm’s journey began with a childhood fascination with science fiction, tinkering, and a deep curiosity about the inner workings of technology and society. His academic path in technology ethics and sustainability, combined with his entrepreneurial work at Zetamotion, provides a unique perspective on AI's role in reshaping manufacturing processes, particularly in quality control.
Wilhelm focuses on integrating AI and machine vision to optimize manufacturing quality control. But as he emphasized during the conversation, the story of AI in this domain is more than just technology; it’s about aligning innovations with human values, operational realities, and societal needs.
The Last Mile Problem: Scaling AI Beyond Proof of Concept
One of the discussions revolved around the "last mile problem" in AI implementations. While many organizations can successfully deploy AI in Proof of Concept (PoC) stages, transitioning these systems into scalable, production-ready solutions is an entirely different challenge. This gap arises from unforeseen complexities, including technical integration, stakeholder alignment, and the adaptation of processes to new workflows.
“Scaling isn't just about having a functional prototype. It's about systemically embedding AI into the fabric of operations, which often reveals blind spots that were invisible during the PoC phase.”
Ethics and the Future of AI
We also delved into the ethics of AI—a field Wilhelm has explored extensively. In the current debate, people often find themselves polarized between AI optimism and AI doom. Wilhelm offered a refreshingly balanced view, recognizing both the transformative potential of AI and the risks inherent in its misuse or unregulated growth.
"What I find interesting," he noted, "is that both optimists and pessimists bring valid arguments. The critical task is to address these challenges proactively while ensuring that AI development remains aligned with societal well-being." From bias in algorithms to potential job displacement, Wilhelm argued for a more nuanced understanding of AI's broader impacts, advocating for policies and practices that emphasize transparency, accountability, and inclusivity.
Practical AI in Action
At Zetamotion, Wilhelm and his team are leveraging AI to transform quality control processes. By automating inspection workflows, AI not only reduces human error but also enables faster decision-making and significant cost savings. These advancements have profound implications for sustainability as well, minimizing waste and enhancing resource efficiency across industries.
Yet, as Wilhelm pointed out, technology alone isn’t enough. The success of such initiatives depends on an organization’s ability to integrate AI into human-centric processes. This means involving frontline workers, addressing their concerns, and creating systems that are intuitive and supportive rather than alienating.
Looking Ahead: AI’s Place in Industry and Society
"The next five to ten years are going to be revolutionary. AI has already transformed many aspects of business and personal life, but the scale and speed of change we’re about to witness will challenge us in ways we can barely imagine."
Whether it’s navigating the ethics of AI, bridging the gap between innovation and operational utility, or understanding the cultural shifts AI demands, this episode underscored the importance of thoughtful engagement with technology. Wilhelm’s insights remind us that the future of AI isn’t just about algorithms or automation—it’s about shaping a world where technology serves humanity, not the other way around.
If you’re interested in the practical and philosophical dimensions of AI—or simply want a deeper understanding of its implications for industry and society—this podcast is a must-listen. It’s a conversation that challenges, inspires, and equips us to navigate the extraordinary opportunities and challenges that lie ahead.
Want to learn more?
Connect with Wilhelm on LinkedIn.
More about AI & Quality Control: https://zetamotion.com/
Subscribe on Youtube, Apple or Spotify
In this episode of the IT/OT Insider podcast, host David interviews Davy Demeyer, an expert in industrial automation. Davy shares his extensive background in automation engineering, discussing the challenges of programming PLCs and the divide between IT and OT. He emphasizes the need for modern software development practices, such as DevOps and DesignOps, to improve automation workflows. Davy also explores the potential of generative AI in automation engineering and introduces the Society of Automation Software Engineers, a community focused on combining automation and software principles. The conversation highlights the importance of evolving engineering practices to meet the demands of Industry 4.0.
Chapters:
* 00:00 Introduction to Davy Demeyer and His Journey
* 04:50 Understanding the Control Layer in Automation
* 09:55 Programming PLCs: Standards and Challenges
* 14:59 Bridging the Gap: Learning from Software Development
* 19:49 The Future of Automation: DesignOps and Generative AI
* 27:54 The Society of Automation and Software Engineers
* 32:58 The Importance of Design in Automation Engineering
Davy Demeyer has spent his career bridging the gap between traditional automation and the rapidly advancing world of digital technology. With decades of experience working on automation projects, he’s a passionate advocate for rethinking how we approach automation in the age of Industry 4.0.
Understanding the Basics: What Are PLCs and DCS?
Davy broke down two cornerstone technologies in automation:
* PLCs (Programmable Logic Controllers): Often referred to as the backbone of automation, PLCs are specialized computers designed to control machinery and industrial processes. They are programmed in languages like Ladder Logic or Structured Text (standardized in IEC 61131-3, but implemented in vendor-specific tools), an approach that hasn’t evolved significantly in recent decades.
* DCS (Distributed Control Systems): These are more complex systems, typically used for large-scale, continuous processes such as in chemical plants or refineries. They offer a centralized view and control of entire plants, integrating with various PLCs and other devices.
Despite their importance, Davy highlighted how their programming methodologies remain rooted in the past, limiting their adaptability to modern software development practices.
The Programming Gap
We talked about the differences between traditional automation programming and modern software development. While the software industry has embraced Agile, DevOps, and cloud-native design, automation engineering often remains tied to rigid, manual workflows. This divergence creates a bottleneck for scalability and innovation in automation, which is essential for Industry 4.0. Even Excel still plays a critical role in ‘modern’ automation engineering… 😣
Davy emphasized how automation programming’s reliance on vendor-specific tools and proprietary languages makes collaboration difficult and slows down the pace of digital transformation.
Digital Transformation and Industry 4.0: The Bottleneck
Why does this gap matter for Industry 4.0? Digital transformation initiatives rely on seamless data flow, agile responses to changing conditions, and scalable solutions. However, the slow evolution of automation practices hinders:
* Scalability: New solutions remain siloed, with pilot projects often stuck in proof-of-concept stages.
* Integration: Connecting PLCs to IT systems, cloud platforms, or advanced analytics often requires costly custom solutions.
* Innovation: Without adopting modern practices, the automation industry risks falling behind in leveraging emerging technologies like AI or machine learning.
The Future: DesignOps for Automation
Davy proposed a vision for the future of automation: DesignOps for Automation Engineers. Borrowing from the software industry, DesignOps would focus on creating collaborative, integrated environments where engineers and developers work in harmony. He wants to Automate the Automation Engineer. This vision isn’t just theoretical—it’s already being championed in forward-thinking organizations.
SASE: Society of Automation Software Engineers
In line with this future, Davy introduced the Society of Automation Software Engineers (SASE), a community-driven initiative aimed at fostering collaboration and innovation in automation. SASE provides a platform for professionals to share best practices, develop new standards, and advocate for modernizing the industry.
Make sure to listen to this very interesting episode! (And subscribe to get our weekly new content 🙂)
Want to know more? Find Davy on LinkedIn: https://www.linkedin.com/in/demeyerdavy/
More about SASE: https://sase.space/
Welcome back to the IT/OT Insider Podcast, where we dig into the nuances of digital transformation and Industry 4.0. Today we welcome Jon “The Factory Guy” Weiss on the podcast! With a career that spans global leadership roles at GE Digital, Software AG, and Amazon, Jon now operates as an Industry 4.0 expert. His diverse experience gives him a unique perspective on how technology impacts the manufacturing landscape.
The State of Manufacturing Today
Kicking off the conversation, Jon reflects on a familiar theme: the manufacturing world is under strain, juggling aging infrastructure with a dwindling workforce. Labor shortages and skill gaps dominate the conversation, with manufacturers scrambling to retain institutional knowledge as veteran operators retire without adequate replacements. Jon contextualizes today’s challenges by tracing the evolution of industrial revolutions, especially post-WWII: during the post-war manufacturing boom, particularly in the U.S., companies expanded rapidly but without modernizing their infrastructure, leading to a reliance on aging systems.
Data’s Pivotal Role and DataOps
As the conversation shifts to data, Jon emphasizes the critical role of DataOps in optimizing manufacturing. Building a robust data foundation is a vital first step in deploying AI solutions effectively. Without it, any data-driven project risks failure. Manufacturing data isn’t just about numbers; it’s real-time insights on machine health, output efficiency, and product quality. A strong DataOps practice ensures that manufacturers can collect, clean, and utilize this data across various systems and departments.
What Can AI Bring to Manufacturing?
Jon offers a balanced view on AI, highlighting both its promise and its limits. While AI can automate specific tasks, optimize equipment, and predict maintenance needs, it’s not a silver bullet. In manufacturing, AI excels at identifying patterns and improving efficiency in structured, predictable environments. But its impact diminishes without high-quality, well-curated data. Manufacturers must recognize that implementing AI is a journey, requiring continuous improvement and the right expertise to maximize its benefits.
The Three Pitfalls in AI Implementation
Jon also shares insights into common pitfalls in manufacturing AI projects:
* Rushing into Production: Companies often move too quickly from pilot projects to full production without thorough testing, resulting in issues that can halt or complicate operations. A slower, phased approach ensures AI is reliable and integrated seamlessly.
* Neglecting the Data Foundation: Quality data is essential, yet many companies still overlook it. Investing in data infrastructure (collection, storage, and processing) is crucial to making AI effective.
* Missing a True Business Case: AI can be flashy, but without a clear, measurable business case, it’s easy for projects to fall short of expectations. Manufacturers must evaluate the ROI of AI, focusing on realistic goals that align with broader business objectives.
ROI Discussion: Is AI Worth It?
Jon wraps up with thoughts on ROI, stressing that AI projects need to demonstrate value beyond the hype. This includes both direct financial gains, such as cost savings through predictive maintenance, and indirect benefits, like improved safety and reduced downtime. Achieving ROI in AI requires patience, strategic planning, and a commitment to building a strong data infrastructure from the start.
Listen to the full episode to hear Jon’s insights on navigating the AI landscape in manufacturing. Subscribe to the IT/OT Insider Podcast for more discussions on the latest in digital transformation and smart manufacturing.
You can find Jon at https://www.thefactoryguy.ai/ or on LinkedIn.
Welcome to the 10th episode ( 🎉) of The IT/OT Insider Podcast! David talks with Gregory Grauwels, the current Group OT Manager at Cloetta, a European confectionery company. With years of experience in both industrial automation and digital transformation, Gregory has an impressive track record in OT management. In this episode, he shares his insights and advice for those aspiring to start a career as an OT manager. Among other things, we talked about doing things step by step, about “land and expand”, and about open architectures.
Gregory's Journey to OT Management
Gregory’s path to becoming an OT manager started with a deep technical background in industrial automation. After gaining experience in the petrochemical industry, where he worked on programming and integrating automation systems such as PLCs, HMIs, and SCADA systems, Gregory transitioned into more senior roles focused on digital transformation. Before joining Cloetta, Gregory held a role at Bayer, focusing on digital manufacturing initiatives like cybersecurity, digital maturity assessments, and overseeing Manufacturing Execution Systems (MES) and Manufacturing Operations Management (MOM).
His move to Cloetta marked a shift from the petrochemical sector to the confectionery industry, which brought unique challenges. As he describes it, "I went from petrochemical to confectionery, which was completely different but exciting." At Cloetta, he was tasked with modernizing their operational technology infrastructure while maintaining a balance between long-standing, traditional machinery and cutting-edge digital systems.
Key Challenges
One of the key challenges Gregory faces at Cloetta is managing a complex mix of old and new technologies. Cloetta, with its long history, still operates some older production machines alongside the latest modern equipment. “We have lines that have been in operation for decades, and some of these machines are still vital to our production processes,” Gregory explains. As an OT manager, one must ensure that these systems run smoothly and integrate with new digital initiatives.
Variation is also a unique challenge. Producing different types of confectionery—whether it’s chocolate, wine gums, or jelly beans—requires different technologies and processes. "Each product comes with its own set of machines and technology. The way we produce jelly beans is entirely different from how we make wine gums, and each of these technologies requires specialized knowledge and equipment,” he says.
Managing these varied technologies means an OT manager must be adept at navigating both older machinery and modern automation tools. Gregory emphasizes the importance of ensuring that every machine, whether old or new, works harmoniously to maintain production efficiency and quality.
Unified Namespace as the Data-Glue between all lines
A key concept in modern industrial digital transformation is the Unified Namespace (UNS), which also came up during our conversation. The UNS serves as a central repository or hub for real-time data exchange across all systems in an organization. In the context of IT/OT convergence, this approach allows for seamless communication between different systems—whether it's legacy equipment, modern IoT devices, or enterprise-level applications like ERP systems. Gregory explained that the Unified Namespace provides a structured, standardized framework that ensures data from various sources is consistently accessible and usable by both OT and IT teams. "The idea behind the UNS is to create a single source of truth for all operational data," Gregory noted. By doing so, organizations can eliminate data silos, improve interoperability, and enable more effective decision-making based on real-time insights. This is particularly useful for industries with diverse systems, where aligning data formats and communication protocols has traditionally been a significant challenge.
Advice for Aspiring OT Managers
* Develop a Strong Foundation in Industrial Technology: Gregory’s background in automation and digital manufacturing laid the groundwork for his success as an OT manager. Aspiring OT managers should focus on building a deep technical understanding of automation systems like PLCs, SCADA, and MES, as well as new digital technologies that are reshaping the industry. “A strong technical foundation is key because OT is all about managing the technology that keeps production running,” he advises.
* Learn to Manage Both Legacy and Modern Systems: In many industries, production lines often include a combination of legacy systems and the latest technologies. Gregory stresses the importance of balancing these two worlds. “You don’t replace a functioning machine just for the sake of digitalization. The challenge is to integrate new technologies in a way that complements the existing systems without disrupting production,” he explains.
* Focus on Practical Problem-Solving: Problem-solving is at the heart of OT management. Gregory emphasizes that a good OT manager needs to focus on practical, efficient solutions to keep production moving smoothly. “It’s not just about implementing the latest tools or systems; it’s about ensuring that everything works together seamlessly,” he says. This often involves finding creative solutions to integrate digital tools into established processes.
* Collaborate Across Teams and Departments: One of the most critical skills for an OT manager is the ability to collaborate effectively with different teams, from operators on the shop floor to upper management. Gregory highlights the importance of understanding the needs and challenges of each stakeholder involved in production. “As an OT manager, you’re the link between the technology and the people who use it. Strong communication and collaboration skills are essential,” he advises.
* Keep a Long-Term Vision for Digital Transformation: Gregory views digital transformation as an ongoing process that requires careful planning and a forward-looking mindset. “Digital transformation isn’t just about adopting new technologies; it’s about making continuous improvements to streamline production, reduce costs, and improve product quality,” he explains. For aspiring OT managers, having a strategic vision for how to integrate digital solutions into manufacturing is crucial for long-term success.
Thank you, Gregory, for joining us!
If you have an interesting story to share, feel free to reach out to David!
In this new podcast David had an insightful conversation with Klaas Dobbelaere, IIoT Connectivity Director at Electrolux (also known in some markets under their brands AEG or Frigidaire). Klaas shared valuable insights into the world of Industrial Internet of Things (IIoT) and how Electrolux is embracing IT/OT convergence to drive digital transformation in its manufacturing operations.
The Electrolux Transformation Journey
Klaas started by highlighting Electrolux's digital transformation efforts across its global footprint. As a leading household appliance manufacturer, Electrolux has to innovate continuously while ensuring its operations remain efficient and sustainable. The company's focus on leveraging IIoT for seamless connectivity across its operations helps optimize everything from energy consumption to predictive maintenance.
One of the key takeaways from our conversation was the importance of actionable data. For Klaas, collecting data from machines, sensors, and production lines is not enough—what matters is translating that data into insights that can inform better decisions and improve operational efficiency. He stressed the significance of finding the right balance between cutting-edge technologies and the practical, everyday needs of the plant floor.
Building Bridges Between IT and OT
Historically, IT and OT have operated in silos—IT managing information systems, while OT focuses on controlling physical operations. This divide has often caused friction in industrial environments, but as Klaas explained, the boundaries are blurring rapidly.
At Electrolux, IT/OT integration is a critical driver for innovation. By bridging the gap, teams can create a more collaborative environment where both the data insights from IT systems and the operational know-how from OT experts can come together to drive better outcomes. One concrete example Klaas gave was their efforts to deploy real-time monitoring systems that allow engineers to analyze machine performance instantly, identifying issues before they lead to costly downtimes.
Navigating the Challenges
Of course, IT/OT convergence isn’t without its challenges. Klaas was candid about the growing pains Electrolux faced, including technical hurdles like legacy equipment integration and organizational barriers that often slow down progress. However, he emphasized the need for patience and strong leadership to guide teams through these transitions.
One challenge particularly close to Klaas' heart is the cultural shift that needs to occur. At Electrolux, as in many manufacturing companies, there's a deeply ingrained culture of precision, safety, and reliability. While these are strengths in traditional operations, they can slow down the adoption of new, agile technologies. For Klaas, the solution lies in fostering a mindset of continuous learning among employees and providing the right training to bridge the knowledge gap between IT and OT.
Future-Proofing Manufacturing with IIoT
Looking ahead, Klaas believes that the future of manufacturing lies in smart, connected ecosystems. He painted a vision of a factory where every machine, sensor, and operator is linked in a vast network, feeding real-time data into AI-driven systems. These systems will not only make predictions but will autonomously make decisions to optimize production processes.
However, he also issued a word of caution: “Technology can only take you so far. Without the right people and processes in place, even the most advanced systems will fall short.” His message was clear—people remain the most important asset in any digital transformation.
Conclusion
Our conversation with Klaas Dobbelaere underscores the critical role IT/OT convergence plays in the modern manufacturing landscape. For companies like Electrolux, harnessing the power of IIoT and data-driven insights is key to staying competitive and driving innovation. But success doesn’t come easy—it requires breaking down silos, fostering a culture of collaboration, and being willing to embrace change.
To kick off the second season of IT/OT Insider, we’re diving into the world of Digital Transformation, MES and change management with Bram Van Genabet. With a career that spans from refining oil at ExxonMobil, via chocolate production at Barry Callebaut (Director of Digital Innovation), to his current role as independent consultant at La Lorraine Bakery Group (Director of Digital Strategy), Bram brings a wealth of experience and a fresh perspective on industrial digital transformation.
You really want to listen to this one… we have a very cool announcement to share 😀
Setting the Scene for Digital Transformation
As we (David and Willem) settle back into our editorial chairs after summer, we couldn’t have asked for a more fitting guest to help us explore the themes of innovation and transformation that will define the next wave of content on The IT/OT Insider.
In our conversation, Bram emphasized that digital transformation is more than just a technological shift; it’s a fundamental change in how businesses operate.
"Successful digital transformation is about aligning digital initiatives with the core business strategy, ensuring that technology serves as an enabler rather than just an addition."
This alignment, according to Bram, is what separates successful digital transformations from those that fail to deliver real value.
The Shift from Innovation to Execution
One of the key takeaways from our interview is the critical importance of moving from innovation to execution. Bram notes that while many organizations excel at generating innovative ideas, the real challenge lies in execution:
“It’s easy to get caught up in the excitement of new technology, but without a clear execution plan, those ideas rarely translate into tangible business outcomes.”
Bram’s approach underscores this philosophy. By focusing on scalable solutions that integrate seamlessly into existing operations, he’s driving meaningful change that not only improves efficiency but also enhances the overall customer experience.
Bram talked about these concepts at AVEVA World; you can watch his keynote here:
Special announcement
As we wrap up our conversation, there's still one more piece of exciting news to share…
Here it is: We are super excited that Bram is joining our IT/OT Insider team 🙂
Or, as Bram himself puts it:
“Over the past year, I’ve been following IT/OT Insider closely, reading the insights shared here, and even discussing them with David and Willem at various conferences. What struck me most was how much the day-to-day experiences shared by them resonated with what I’ve encountered in my own journey.
Despite not knowing each other for that long, I’ve found that we have so many common points to talk about. That’s truly fascinating and what motivates me to join this platform. Over the last 10 to 12 years, I’ve dealt with many challenges around effective IT/OT cooperation—bringing different teams together, and navigating the complexities of people, processes, technology, and data.
When I talk with other practitioners, it’s clear we’re all struggling with very similar issues. I’ve often wondered if there’s a magical formula or methodology that can make us more successful in these transformations. While I’m realistic enough to know there’s probably no silver bullet, I do believe that by sharing our experiences and being open to learning from each other, we can get closer to finding better ways of doing things.
That’s why I’m so excited to join IT/OT Insider as a co-author. I see it as a fantastic platform to share our experiences, learn from one another, and build a stronger community within digital manufacturing. Whether it’s talking about mistakes or celebrating successes, I’m eager to contribute and engage with this incredible community.”
Welcome to the team, Bram !
Make sure to subscribe to receive our weekly in-depth articles, expert interviews, and insights. Next week, we will start a new series titled “Unlocking Success in Digital Transformation”, especially focused on the implementation of Manufacturing Execution System (MES) projects. 🎉
Subscribe today! Don’t miss out on our new series “Unlocking Success in Digital Transformation“ by Bram, David and Willem
After nearly a year of delving into the intricacies of IT/OT convergence, it's time to shift gears and step into the real world where theory meets practice. Introducing our new series, "Idea to Shop Floor: How companies are figuring out IT/OT Convergence," where we'll be showcasing real-life case studies from companies that are at the forefront of digital transformation. These stories will highlight the challenges, triumphs, and lessons learned from businesses that are successfully (or not) integrating IT and OT, providing invaluable insights and inspiration for your own journey.
In this podcast David talks to Sophie Van Nevel, Global IT Lead for Strategy and Governance and for Data and Analytics at Bekaert.
Established in 1880, Bekaert is a global leader in steel wire transformation and coating technologies. You can find their steel products in various applications: Champagne cork wire, fishhook wire, steel cord inside tires, concrete reinforcement, very specialized applications in renewable energy, fencing and many others.
At Bekaert, you will find various types of industrial processes, one of which is cold-drawn steel wire. Cold-drawing changes the steel’s shape and size by pulling the material through a carbide die or Turk’s head. Often, steel will go through cold-rolling first and then cold-drawing to enhance its properties for better performance.
Obviously, minimizing energy consumption while increasing the reliability and uptime of the production lines is extremely important for Bekaert. This is where data and Sophie’s team come into play!
“The basic process is drawing the wire into a smaller diameter, then adding coatings or bundling wires together to create new material properties. This intricate process requires precise measurements and adjustments, making sensor data critical. Sensor data quality is essential to ensure we meet the desired parameters throughout the production process,” Sophie explains.
Building Digital Products
Sophie's team is responsible for creating digital products that leverage data analytics to optimize production processes. This includes using dashboards to monitor energy consumption and employing AI models to refine product quality. “We work closely with our business teams to drive intelligent processes, aiming to optimize our production with the help of technology and data,” Sophie explains. One of their innovative goals is to provide sensor data as a service to their customers, enhancing transparency and collaboration.
The Role of Data Governance
Data governance, often seen as a theoretical concept, is vital for managing both transactional and sensor data. Sophie emphasizes the importance of integrating data governance into everyday practices. “We’ve set up roles like data stewards and custodians, and provide training to ensure everyone understands their role in the data delivery value chain,” she says. This approach ensures high data quality and consistency, which are crucial for generating reliable insights and driving business value.
Case Study: Energy Management
A prime example of Bekaert’s data-driven approach is their energy management program. The company installed energy meters in their plants to monitor and reduce energy consumption, aligning with their sustainability goals. “We started by reporting the data from the meters, but soon realized the need for better accuracy,” Sophie recalls. By analyzing discrepancies between machines and understanding the underlying causes, such as temperature changes or data drift, Bekaert was able to develop predictive models to optimize energy use.
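As an illustration of the kind of check Sophie describes, a trailing-baseline comparison can flag a meter whose readings drift away from their recent history. This is a generic sketch under assumed parameters (the window size and threshold are arbitrary example values), not Bekaert’s actual method:

```python
from statistics import mean

def detect_drift(readings: list[float], window: int = 5,
                 threshold: float = 0.1) -> list[int]:
    """Return the indices of readings that deviate from the mean of the
    previous `window` readings by more than `threshold` (relative).

    Illustrative only: real meter-drift detection would also account for
    known effects such as temperature changes before flagging a sensor.
    """
    flagged = []
    for i in range(window, len(readings)):
        baseline = mean(readings[i - window:i])
        if baseline and abs(readings[i] - baseline) / abs(baseline) > threshold:
            flagged.append(i)
    return flagged

# Hypothetical energy-meter series (kWh): one reading jumps by 20%.
meter = [100.0, 100.0, 100.0, 100.0, 100.0, 100.0, 120.0, 100.0]
print(detect_drift(meter))  # -> [6]
```

A flagged index is only a prompt for investigation; as Sophie notes, the discrepancy may have a physical cause rather than being a data-quality problem.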
Cross-Team Collaboration
At Bekaert, the convergence of IT and OT is achieved through cross-team collaboration. “We bring together IT, business, and engineering teams to solve specific cases, focusing on driving end-to-end value,” Sophie explains. This collaborative approach leverages diverse expertise to tackle complex challenges, such as optimizing energy consumption, and ensures that solutions are practical and effective.
From Services to Manufacturing
Transitioning from a service-oriented organization to a manufacturing environment presented unique challenges for Sophie. “In banking, it’s more about services, whereas at Bekaert, the impact of data actions is very concrete and immediate,” she notes. This tangible impact underscores the importance of effective change management, particularly when dealing with existing assets and technology that may not be digital-native. “We focus on data literacy and involve people in the journey to ensure they see the value of their contributions,” Sophie adds.
Scaling and Sustainability
Scaling digital initiatives from pilot projects to full-scale implementations is a critical aspect of Bekaert’s strategy. Sophie outlines their governance approach, which assesses the value and applicability of pilots across different plants. “Not every plant or product is the same, so we define criteria to determine whether we can scale a digital product,” she explains. This method ensures resources are allocated to initiatives with the highest potential impact while discontinuing those that do not deliver expected results.
The Future of AI and Data at Bekaert
Looking ahead, Sophie is optimistic about the role of AI and data in driving business value. “We hope to continue leveraging AI and data to create significant business value. Data quality and governance will remain crucial as we develop more advanced AI models,” she asserts. The foundation of robust data practices will enable Bekaert to harness the full potential of digital technologies, ensuring sustainable growth and innovation. Bekaert's journey exemplifies how IT/OT convergence can transform traditional manufacturing processes through the strategic use of data and collaboration.
As we continue this series, we’ll explore more real-world stories to inspire and guide your digital transformation efforts. Stay tuned for more insights from industry leaders who are turning theory into practice.
Find Sophie on LinkedIn.
Thank you for reading The IT/OT Insider. This post is public so feel free to share it.
Further reading:
Can’t stop reading? We have organized our 30+ articles in three categories: Organization, Change, and Technology. Make sure to check them out!
Did you already subscribe to our podcast?
YouTube: https://www.youtube.com/@TheITOTInsider
But also here on Substack: https://itotinsider.substack.com/podcast
Welcome back to the IT/OT Insider Podcast. Today we have a special guest, Dr. Amir Cahn, CEO of the Smart Water Networks Forum (SWAN). Dr. Cahn brings a wealth of knowledge and experience in leveraging data-driven technologies to transform water, wastewater, and stormwater networks worldwide.
The SWAN Forum is a global hub for industry experts, innovators, and thought leaders dedicated to the digital transformation of the water sector. SWAN’s mission is to accelerate the adoption of data-driven solutions to improve the efficiency, sustainability, and resilience of water networks worldwide. It unites a diverse range of stakeholders, including water utilities, engineering firms, technology companies, startups, investors, and academics, fostering collaboration and innovation across the water sector.
Understanding Data-as-a-Service (DaaS)
I was particularly interested in learning more about Data-as-a-Service (DaaS), a transformative model for water utilities. Unlike traditional methods where utilities manage their hardware and their data, DaaS shifts the responsibility to a service provider, who handles data generation, transmission, and analytics. This model allows utilities to focus on outcomes rather than infrastructure management.
"Data-as-a-Service is about shifting the risk and responsibility from the utility to the service provider. This way, utilities can decide whether they want just the data, a summary report, or predictive analytics," explains Dr. Cahn.
The Services Staircase
We also talked about the Services Staircase, a framework that outlines the progressive stages utilities can follow to enhance their data management capabilities. There are three different types of service levels: base (product), intermediate (service), and advanced (capability). This structured approach helps utilities gradually improve their data capabilities, ensuring a sustainable and scalable transformation. By following the Services Staircase, utilities can systematically build their expertise and infrastructure, leading to smarter, more efficient water management practices.
Real-World Examples of DaaS Implementations
Gonzales, Louisiana: Enhancing Service Quality with Smart Metering
In Gonzales, Louisiana, a small utility faced significant budget constraints that limited its ability to upgrade its infrastructure. By adopting a smart metering DaaS model, the utility was able to implement advanced metering infrastructure without the need for substantial upfront capital. The DaaS provider handled the data generation, transmission, and analytics, delivering actionable insights directly to the utility. This approach allowed the utility to improve billing accuracy and efficiency, while simultaneously enhancing service quality. As a result, Gonzales saw a reduction in water losses, improved customer satisfaction, and better resource management, demonstrating the tangible benefits of DaaS in a cost-effective manner.
Jerusalem: Reducing Industrial Pollution through Data-Driven Monitoring
In Jerusalem, the city's water utility faced challenges in monitoring and managing industrial pollution, which posed significant environmental and public health risks. By partnering with a DaaS provider, the utility was able to implement a comprehensive monitoring system that continuously collected and analyzed data from various industrial sites. This system provided real-time alerts and predictive analytics, enabling the utility to identify pollution sources and respond promptly to potential issues. The DaaS model not only improved the utility's ability to manage industrial pollution but also facilitated compliance with environmental regulations. This proactive approach led to a significant reduction in pollution incidents, showcasing how DaaS can drive environmental improvements and operational efficiency in urban water management.
The Future of Water Management
As we look to the future, Dr. Cahn envisions increased collaboration between utilities, technology providers, and other sectors. This collaborative approach is essential for addressing pressing challenges like climate change and resource management. Dr. Cahn encourages utilities to embrace DaaS and other innovative models to enhance their operations and sustainability.
"We're at a pivot point where water management needs innovative solutions more than ever. By working together, we can advance the sector and address global challenges."
Join the 350+ members of the SWAN Network
To learn more about DaaS and other innovative water management solutions, visit the SWAN Forum's website. Now is a great time to get involved, as SWAN is offering a 10% discount on membership until July 30th. Join SWAN's global community to share insights, collaborate on projects, and drive the future of water management.
Resources
Find Dr. Amir Cahn on LinkedIn: https://www.linkedin.com/in/amir-cahn/
Download the free DaaS Playbook: https://swan-forum.com/publications/swan-daas-playbook/
Become a SWAN member: https://swan-forum.com/join/
In this episode of the IT/OT Insider podcast, David sat down with Toni Manzano, a veteran in the pharmaceutical industry and co-founder and Chief Scientific Officer at Aizon. We delve into how IT/OT convergence concepts can be applied to the pharmaceutical industry, an area where precision, regulation, and innovation intersect in complex ways.
Thank you all for the positive feedback we received. Please subscribe to our blog if you haven’t done so already. We really want to make an impact, so sharing is always highly appreciated. Finally, we are always on the lookout for interesting stories.
The life sciences industry, encompassing sectors like pharmaceuticals, biotechnology, and medical devices, has an impact on all of us. They are pivotal in advancing global health by innovating and producing therapies, diagnostics, and treatments that improve and save lives. This industry represents a significant portion of the global economy, with the pharmaceutical sector alone generating approximately $1.6 trillion in global revenue in 2023.
Let’s delve into this super interesting sector and discover how digital solutions are transforming the way medicines are developed and produced!
From Astrophysics to Pharma
Toni began by sharing his intriguing transition from teaching astrophysics to spearheading software innovation in the highly regulated pharmaceutical industry. He reflected on his contributions to the development of Laboratory Information Management Systems (LIMS) and Manufacturing Execution Systems (MES). "Twenty years ago, LIMS and MES were something ‘wow’, and today it's still ‘wow’," Toni noted, emphasizing the continuing relevance of these systems, but also the industry's slow pace of adoption.
AI in Pharma: Beyond the Buzz
We discussed the current hype surrounding artificial intelligence (AI) in pharma. Toni described AI as a "cocktail" whose ingredients are computing power, algorithms, mathematics, and, crucially for pharma, quality data. "The secret sauce of this cocktail in pharma is the quality data. Without quality data, you cannot bring AI to fruition," Toni explained.
Toni shared a story about handling human plasma. Unlike other raw materials where quality non-compliance could simply lead to a batch rejection, human plasma represents a unique and invaluable resource that cannot be discarded. Here, Toni illustrated how AI can play a critical role. He described a scenario where, despite the high quality of operations, fluctuations in the quality of plasma can affect the final product. This complexity is where AI excels—by integrating vast amounts of operational and quality data to optimize processes that traditional methods cannot. This example not only underscores the complexity inherent in pharma manufacturing but also highlights the transformative potential of AI in managing such complexities, ensuring that every batch of product meets quality standards without wasting precious resources.
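Toni's "quality data" point can be made concrete with a minimal sketch (illustrative only, not Aizon's product): before any model sees a batch record, verify the data itself is complete and plausible. The field names and limits below are hypothetical.

```python
# Minimal data-quality gate: reject or flag a batch record before it is
# used for analytics or AI. Field names and ranges are hypothetical.
def validate_batch(record, ranges):
    """Return a list of data-quality issues; an empty list means usable."""
    issues = []
    for field, (low, high) in ranges.items():
        value = record.get(field)
        if value is None:
            issues.append(f"{field}: missing")
        elif not (low <= value <= high):
            issues.append(f"{field}: {value} outside [{low}, {high}]")
    return issues

RANGES = {"temperature_c": (2.0, 8.0), "ph": (6.5, 7.5)}  # hypothetical limits

print(validate_batch({"temperature_c": 5.1, "ph": 7.0}, RANGES))  # []
print(validate_batch({"temperature_c": 12.3}, RANGES))            # two issues
```

Trivial as it looks, this kind of gate is the difference between a model trained on trustworthy records and one trained on noise.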
Challenges and Conservatism
Addressing the conservatism in the pharmaceutical industry, Toni pointed out the paradox of massive profitability discouraging rapid innovation. "If the industry is earning a lot of money with the status quo, there's less perceived need to evolve," he said. This highlights a significant barrier to adopting new technologies in an environment where traditional methods continue to yield high returns.
Regulatory Insights and the Path Forward
We also talked about the critical role of regulations in pharma, which ensure the safety, quality, and efficacy of medical products. Toni illuminated the evolving nature of regulatory frameworks, which are increasingly accommodating modern computational methods, including AI. "Regulatory bodies are promoting innovation and the modernization of the pharmaceutical industry," Toni stated, suggesting a gradual but inevitable shift towards more advanced, data-driven manufacturing processes.
Outlook
As we wrapped up our conversation, Toni expressed optimism about the future of IT/OT convergence in pharma, driven by societal demands for rapid innovation and a generational shift in executive leadership towards tech-savviness. "Young executives understand and live with digital technology daily. It's not something strange; it's necessary," he remarked.
Find Toni on LinkedIn: https://www.linkedin.com/in/tonimanzano/
Interested in finding out more about Aizon? Visit https://www.aizon.ai/
Welcome to a special episode of the IT/OT Insider Podcast! Today is the first time we, David and Willem, are interviewing each other instead of hosting a guest. As co-authors of this blog and both working in the field of industrial digitalization, we thought it would be fun to share our own stories. In this episode, we'll delve into our backgrounds, our journeys in IT and OT, and our joint presentation at the Enterprise Technology Leadership Summit Europe last week.
This conference was organized by IT Revolution, publisher of many of our favorite books, including The Phoenix Project, Team Topologies, Sooner Safer Happier, and The DevOps Handbook. We were introduced by Gene Kim, who shared that he too had witnessed the IT-OT divide in critical infrastructure. Our presentation is available on their website (a free trial account is required for access), or you can simply listen to this podcast ;)
The Paradox of Digital Solutions
We discussed the paradox that despite the promising potential of digital solutions, their implementation often fails when subjected to the realities of diverse and incompatible systems across manufacturing sites. "Billions have been poured into digitization, yet the average shop floor whispers tales of the 1980s," we noted during our talk. This mismatch between investment and outcome highlights the difficulty in scaling digital projects beyond pilot 'lighthouse' plants.
A personal note: thank you for all the positive feedback we have received from this amazing community. Subscribe to our blog if you haven't already. We really want to make an impact on this industry, so sharing with your peers is always highly appreciated. Finally, we are always on the lookout for inspiring IT/OT stories. Feel free to reach out to us if you have something to share.
The Socio-Technical Ecosystem
Our conversation emphasized that IT/OT integration is not merely a technical challenge but a socio-technical endeavor that involves people, processes, and entrenched cultural norms. We pointed out that the disparity between IT and OT spans not only systems but also cultures, with IT's rapid innovation cycle clashing with OT's priority for reliability and gradual evolution.
New! Because we already have 30+ articles, we have created three summary pages containing our most influential articles on Organization, Change, and Technology. Make sure to check them out!
Challenges of Top-Down Directives
We critiqued the common practice of dictating IT/OT convergence from the upper echelons of management, a top-down approach that often underestimates the intricate dynamics of integration. "Converging two organizations is not a management decision," we explained, emphasizing that these initiatives should consider the socio-technical aspects where both IT and OT bring unique strengths and cultural perspectives.
Anti-Patterns in Convergence
Highlighting the pitfalls in current convergence practices, we talked about how mismanaged efforts can lead to increased divergence instead of integration. "When done wrong, efforts to converge will quickly lead to quite the opposite: divergence," we observed. We discussed how organizational comfort zones and resistance to change can exacerbate friction and mistrust between IT and OT departments.
Our Vision for Collaboration
Looking forward, we advocate for fostering a culture of collaboration where IT and OT not only coexist but actively cooperate, leveraging each other's strengths. We propose building a bridge over the existing differences to create systems and processes that are IT-enhanced yet respect the operational imperatives of OT.
Moving Toward Effective Integration
Our discussion concludes with a call to action for organizations to rethink their approach to IT/OT integration. We emphasize the need for adaptive strategies that recognize the complexities of modern industrial environments and promote a balanced integration of technology and operational practice. By focusing on collaborative approaches and understanding each domain's unique contributions, companies can more effectively navigate the challenges of digital transformation in manufacturing.
In this latest episode of the IT/OT Insider podcast, David welcomed Jan Meskens, a seasoned data consultant with a rich background in data management and academia. The discussion provided deep insights into the evolving landscape of industrial data management, emphasizing the critical need for bridging the gap between Information Technology (IT) and Operational Technology (OT) to leverage the full potential of data in Industry 4.0.
User-Centric Data Management
“In the world of data, usability is often an afterthought”
Jan, who started his career in user experience design before transitioning into data consultancy, shared his unique perspective on making data systems as user-friendly as possible. “In the world of data, usability is often an afterthought,” Meskens explained. He highlighted the common industry challenge where crucial data is frequently trapped within complex systems like massive Excel spreadsheets, understandable only by their creators.
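Jan's "data trapped in spreadsheets" observation has a simple technical remedy: reshape creator-only layouts (for example, one column per machine) into explicit, self-describing records. A minimal sketch, with hypothetical column names:

```python
# Turn "wide" spreadsheet rows (meaning hidden in the column layout) into
# explicit records anyone can read. Column names are hypothetical.
def tidy_rows(header, rows, id_col):
    """Convert wide rows into one record per (id, column) pair."""
    idx = header.index(id_col)
    records = []
    for row in rows:
        for col, value in zip(header, row):
            if col != id_col:
                records.append({id_col: row[idx], "machine": col, "parts": value})
    return records

header = ["shift", "press_1", "press_2"]
rows = [["day", 412, 388], ["night", 397, 401]]
print(tidy_rows(header, rows, "shift"))
# 4 records, e.g. {"shift": "day", "machine": "press_1", "parts": 412}
```

The point is not the ten lines of code but the habit: data that only its creator can interpret is a liability, while data whose meaning is spelled out in every record is reusable.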
The Two-Pronged Approach to Data Projects: Bottom-up + Top-down
The conversation shifted towards how organizations initiate and drive data projects. Meskens outlined a dual approach seen in most successful enterprises: a bottom-up initiative driven by specific teams who see value in data and a visionary top-down strategy led by leaders who understand the broader benefits of data integration. “Both directions are crucial for cultivating a data-driven culture within any organization,” Meskens noted.
Proof of Concept: Learning or a Pitfall?
“The real success stories are those where PoCs serve as a stepping stone to full-scale implementation and integration.”
A significant focus was on the role of proofs of concept (PoCs) in data management projects. Meskens emphasized that PoCs should be learning instruments rather than final solutions. “The real success stories are those where PoCs serve as a stepping stone to full-scale implementation and integration,” he stated. This approach mitigates the risk of what he humorously referred to as "PoC purgatory," where projects perpetually cycle through the proof-of-concept phase without reaching full deployment.
Integrating IT and OT Perspectives
David and Jan also delved into the cultural and procedural nuances that differentiate IT and OT. Meskens pointed out that while IT projects can often pivot and adapt rapidly, operational technology demands a more methodical and safety-oriented approach due to the physical nature of the machinery and processes involved. This difference often leads to a clash of expectations and methodologies when managing data projects across IT and OT boundaries.
Facilitating Change through Sketches
Highlighting an innovative communication method, Meskens shared how sketching complex ideas has helped bridge the communication gap between various stakeholders in data projects. “Sketches open a dialogue—they are simple yet powerful tools for visualization and feedback,” he remarked, noting how this method helps stakeholders engage more constructively in project discussions.
Book Recommendations
Meskens recommended two influential books for those interested in deepening their understanding of data management and project dynamics: "The Phoenix Project" and "Data Management at Scale." These readings, he believes, provide foundational knowledge and advanced insights into effectively managing and scaling data projects.
Insights for the Future
The podcast wrapped up with a reflective discussion on the future of IT/OT convergence, with both David and Jan advocating for more integrated and cooperative approaches to managing industrial digital transformation.
This episode of the IT/OT Insider not only shed light on the technical and cultural facets of data management but also underscored the importance of strategic and human-centric approaches to digital transformation in the manufacturing sector. As industries worldwide continue to evolve, the principles discussed by David and Jan will undoubtedly influence future innovations and integrations across the IT-OT spectrum.
Book giveaway!
We are giving away a copy of The Phoenix Project ! Details can be found at the end of the episode. Send your answers to [email protected] or DM me on LinkedIn.
About our Guest
Find Jan on LinkedIn: https://www.linkedin.com/in/janmeskens/ or via his website https://sievax.be. Make sure to subscribe to his blog on Medium: https://medium.com/@meskensjan
In the heart of the digital revolution, companies across the globe are recognizing the need to adapt and transform. The industrial sector is no exception. David had the privilege of speaking with Mike Hughes, Zone President at Schneider Electric for the Nordics and Baltic region, who has been at the forefront of this transformation. With years of experience and a career that spans various regions, including the UK and Ireland, Mike shared valuable insights on the evolving landscape of digitalization in manufacturing.
Thanks for reading (and now also listening to) The IT/OT Insider! Subscribe for free to receive new articles and podcasts in your inbox :)
“Digitalization isn't a new concept; it's been on the corporate agenda for years. However, the urgency to adopt digital strategies has significantly increased.”
The Why and How of Digitalization
David: “Why should companies talk about digital transformation today?”
Mike Hughes: “Digitalization is not a new concept; it's been part of the dialogue for many years. However, what's changed is the urgency and the necessity for it. The advancements in sensor technology over the past decade, coupled with exponential growth in computing power, particularly through AI and companies like NVIDIA, have propelled us into a new era. It's not just about being able to gather data anymore; it's about analyzing and extracting valuable insights from that data to drive significant productivity gains.”
A compelling aspect of our conversation centered around the realization that digitalization transcends IT. This transition is not just about enhancing IT infrastructure but transforming industrial processes to unlock new value.
Key Drivers of Change
From the introduction of SCADA systems to the integration of industrial software, the landscape has changed. This evolution is fueled by the need for data-driven insights to optimize processes and reduce inefficiencies.
Mike: “Over the past five years, the shift has been remarkable. Historically, IT and OT have operated in silos, but we're seeing those barriers come down. The realization that the largest potential for value lies not within office productivity tools but within industrial processes has been pivotal. When you combine digital sensor technology with AI capabilities, you create a powerful tool for unlocking efficiency on the shop floor. This convergence of IT and OT is vital for leveraging data across the entire manufacturing ecosystem.”
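The "digital sensor technology plus analytics" combination Mike describes can be as simple as flagging readings that drift from a rolling baseline. A minimal sketch, with hypothetical data and thresholds:

```python
# Flag sensor readings that deviate sharply from the rolling baseline of
# the preceding readings. Data, window, and threshold are hypothetical.
from statistics import mean, stdev

def flag_anomalies(readings, window=5, k=3.0):
    """Return indices where a reading is more than k standard deviations
    from the mean of the preceding `window` readings."""
    flagged = []
    for i in range(window, len(readings)):
        baseline = readings[i - window:i]
        m, s = mean(baseline), stdev(baseline)
        if s > 0 and abs(readings[i] - m) > k * s:
            flagged.append(i)
    return flagged

temps = [70.1, 70.3, 69.9, 70.2, 70.0, 70.1, 84.5, 70.2]
print(flag_anomalies(temps))  # [6] -- the 84.5 spike
```

Real deployments would use far richer models, but even a baseline check like this turns raw sensor streams into actionable shop-floor signals.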
Sustainability + Digitalization
Another key theme that emerged was the role of digitalization in promoting sustainability.
Mike: “Sustainability and digitalization go hand in hand. As regulations around sustainability tighten, companies are compelled to not only track but actively manage their environmental impact. Digital tools allow for better energy management, supply chain transparency, and overall resource efficiency. It's a win-win scenario where companies can achieve sustainability targets while enhancing operational efficiency. [..] One of the most important lessons is that digital transformation goes beyond technology. It's fundamentally about rethinking processes and systems to unlock new value. Another critical aspect is the blurring lines between IT and OT, enabling seamless data flow and analytics. Lastly, sustainability can serve as a powerful catalyst for digital adoption, driving companies towards practices that are not only efficient but also environmentally friendly.”
Looking forward: what does the future hold?
Mike: “The future is incredibly promising. As we refine our approaches to integrating IT and OT and as technologies continue to advance, I believe we'll see even more innovative applications of digital tools in manufacturing. Sustainability will remain a key focus, driving further innovation in how we manage resources and reduce environmental impact. The journey of digitalization is ongoing, and it will continue to shape the manufacturing sector in profound ways.”
Lessons Learned
Reflecting on our conversation, several lessons stand out:
* Digitalization is a strategic imperative, not just a technological upgrade. It's about rethinking processes and systems to unlock new value.
* The integration of IT and OT is a game-changer, enabling seamless data flow and analytics across the manufacturing value chain.
* Sustainability can be a powerful catalyst for digital transformation, driving companies to adopt practices that are not only efficient but also environmentally friendly.
A big thank you to Mike, the Schneider Electric Marketing & Communication team, and AVEVA Select Scandinavia!
Welcome to today’s episode, where we dive deep into the world of IT/OT convergence with Shiv Trisal, the global manufacturing, transportation, and energy market leader at Databricks. Join us as we explore the transformative power of data across industries and discuss the future of industrial digital transformation.
Timestamps:
[00:00] Meet Shiv Trisal
[00:33] Databricks Explained
[01:36] IT vs. OT Data
[02:31] Shiv's Journey
[05:26] Industry Comparisons
[05:34] Greenfield vs. Brownfield
[10:16] Jargon Challenges
[10:20] Bridging Gaps
[17:41] Data Convergence
[22:56] Cloud Transformation
[29:32] Episode Wrap-Up
Key Takeaways:
Databricks is at the forefront of empowering organizations with data intelligence, providing tools for companies to harness specific insights from their data, leveraging both IT and operational technology (OT) information.
The convergence of IT and OT data is crucial for the future of industrial digital transformation, requiring a unified approach to data analysis and utilization.
Shiv Trisal highlights the significant shift in data utilization and perception during his transition from the aviation industry to Databricks, emphasizing the value of AI and machine learning in discovering patterns and insights.
The conversation underscores the challenges and opportunities in IT/OT convergence, emphasizing the need for domain expertise, mutual understanding, and the alignment of perspectives between IT and OT domains.
The future of IT/OT integration is collaborative, focusing on creating a shared data foundation and maintaining data quality to drive actionable insights and outcomes.
Resources and Links:
Databricks Official Website: https://www.databricks.com/
AVEVA Partnership Announcement: https://www.aveva.com/en/about/news/press-releases/2024/aveva-and-databricks-forge-strategic-collaboration-to-accelerate-industrial-ai-outcomes-and-enable-a-connected-industrial-ecosystem/
Shiv Trisal’s LinkedIn Profile: https://www.linkedin.com/in/shiv-trisal/
Thank you for tuning in to today’s episode. For more insightful discussions on the impact of data in the industrial sector and the journey towards IT/OT convergence, make sure to subscribe to our channel and hit the notification bell so you never miss an episode.
Find our blog here: https://itotinsider.substack.com/