This is a bonus nugget in my Make Anything Thursday, After AI series, focused on . . .
The Expanding Definition of Infrastructure
A recent explainer from McKinsey highlights an important shift. Infrastructure now includes more than bridges, highways, ports, and power plants. It increasingly includes fiber networks, renewable energy systems, EV charging corridors, data centers, and the digital technologies that operate them.
Their estimate is striking. Global infrastructure investment could reach $106 trillion by 2040, reflecting a world where energy systems, transportation networks, digital platforms, and water infrastructure are becoming deeply interconnected.
Technology sits at the center of this transformation. Artificial intelligence, predictive maintenance tools, and digital twins are now being used to simulate, monitor, and optimize large infrastructure systems.
In short, infrastructure itself is becoming intelligent. Yet reading the report raises a quieter question. Where does the data actually come from?
The Quiet Assumption Beneath Intelligent Systems
Most discussions about artificial intelligence, digital twins, and smart infrastructure start with the assumption that operational data is already available. Once the data is in place, powerful analytics engines and simulation models can begin their work.
This assumption is rarely stated directly, but it appears everywhere.
- Digital twins rely on real-time signals.
- Predictive maintenance depends on telemetry.
- AI models require structured historical data.
All of these capabilities rely on consistent streams of operational information. Without that data, the intelligence layer above has nothing steady to learn from.
Producing this data inside industrial facilities worldwide is far from easy. Power plants, manufacturing lines, refineries, water systems, and logistics networks generate enormous volumes of signals. Sensors must be installed and maintained, protocols translated, assets identified, and time series aligned before meaningful analysis can occur.
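The time-series alignment step alone is nontrivial. As a minimal sketch, assuming two hypothetical sensors on the same pump reporting at different, irregular rates, pandas can resample both streams onto a shared time grid so their readings can be compared row by row (the sensor names and values here are illustrative):

```python
import pandas as pd

# Two hypothetical sensor streams reporting at different, irregular rates.
pump_temp = pd.Series(
    [71.2, 71.8, 72.5],
    index=pd.to_datetime(["2024-01-01 00:00:03",
                          "2024-01-01 00:00:17",
                          "2024-01-01 00:00:31"]),
    name="pump_temp_c",
)
pump_vibration = pd.Series(
    [0.41, 0.44, 0.39, 0.47],
    index=pd.to_datetime(["2024-01-01 00:00:05",
                          "2024-01-01 00:00:15",
                          "2024-01-01 00:00:25",
                          "2024-01-01 00:00:35"]),
    name="pump_vibration_mm_s",
)

# Resample both onto a shared 10-second grid so the rows line up,
# interpolating the bins where a sensor happened not to report.
aligned = pd.concat(
    [s.resample("10s").mean().interpolate()
     for s in (pump_temp, pump_vibration)],
    axis=1,
)
print(aligned)
```

Only after this kind of alignment does a "temperature rose while vibration spiked" question become answerable at all; analytics tools further up the stack quietly assume someone has already done it.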
Only after those steps are completed can analytics platforms or artificial intelligence start generating value. This early stage of the process is rarely discussed. Yet it is where much of the complexity resides.
The First Mile of Industrial Data
For decades, most industrial systems were designed for control rather than data science. Control systems focused on reliability and safety. They were built to keep physical processes operating smoothly, often in isolated environments with little connection to enterprise analytics platforms.
Those systems performed very well for their specific purpose. They were not created to generate structured operational data for large-scale analytics or artificial intelligence.
As a result, many industrial organizations now confront a core challenge. Before advanced analytics or digital twins can operate effectively, industrial data must first be captured, structured, and governed in a consistent way. Signals from physical systems have to be translated into reliable telemetry that can move into digital environments without losing context, meaning, or trust.
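What "without losing context" means in practice can be sketched with a small, hypothetical example: a raw controller reading carries only a tag and a value, and the first-mile layer enriches it into a self-describing record before it leaves the site. The field names and tag registry below are illustrative assumptions, not a standard schema:

```python
from dataclasses import dataclass, asdict
from datetime import datetime, timezone

@dataclass(frozen=True)
class TelemetryRecord:
    """A structured reading that keeps its context: asset, unit, quality."""
    asset_id: str   # which physical asset produced the signal
    signal: str     # what was measured
    value: float
    unit: str       # engineering unit, so downstream tools need not guess
    timestamp: str  # ISO 8601, UTC, so time series can later be aligned
    quality: str    # e.g. "good" / "suspect", carried from the source

def structure_reading(raw_tag: str, raw_value: float,
                      tag_map: dict) -> TelemetryRecord:
    """Translate a bare controller tag into a governed telemetry record."""
    meta = tag_map[raw_tag]  # fails loudly if the asset is unidentified
    return TelemetryRecord(
        asset_id=meta["asset_id"],
        signal=meta["signal"],
        value=raw_value,
        unit=meta["unit"],
        timestamp=datetime.now(timezone.utc).isoformat(),
        quality="good",
    )

# Hypothetical tag registry: the "assets identified" step from above.
TAG_MAP = {
    "PLC1.AI3": {"asset_id": "pump-07",
                 "signal": "discharge_pressure",
                 "unit": "bar"},
}

record = structure_reading("PLC1.AI3", 4.82, TAG_MAP)
print(asdict(record))
```

The point of the sketch is not the schema itself but the direction of effort: everything a cloud model later takes for granted, which asset, which unit, which moment in time, has to be attached here, at the source.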
That first mile is not a minor technical detail. It is the quiet infrastructure layer beneath intelligent systems.
A New Layer of Infrastructure
The past fifteen years have seen major investment in cloud platforms, data lakes, and analytics environments designed to interpret data once it arrives in those systems.
Much less attention has been paid to the systems responsible for producing reliable data at the source. That is beginning to change. A new category of technology is emerging that treats the creation of operational data as infrastructure in its own right.
One way to understand this shift is through the rise of an Industrial Data Infrastructure Layer. Just as fiber networks transmit digital information across continents, this layer carries operational signals from physical systems into digital environments where they can be analyzed and optimized. Without that layer, intelligent infrastructure remains an incomplete idea.
The Infrastructure Beneath the Infrastructure
The physical infrastructure that supports modern life is visible. Roads, power plants, pipelines, and water systems shape the physical landscape of cities and industries. The data that describes how these systems behave is far less visible. Yet it is becoming just as important.
If the next decade truly marks an infrastructure moment, then the systems responsible for generating reliable operational data will play a key role in what follows. Artificial intelligence may offer the analytical power, but the effectiveness of those systems will depend on the quality of the underlying signals.
A small number of platforms are now being created specifically for this initial stage of industrial telemetry. One example is the Altior platform developed by Inkwell Data, which emphasizes structuring operational signals before they reach cloud analytics environments or digital twins.
The emergence of systems like this suggests that a new layer of infrastructure is quietly taking shape. In many ways, it may become the infrastructure beneath the infrastructure.
The Question Ahead
Artificial intelligence is progressing rapidly. Digital twins are growing more sophisticated. Governments and investors are preparing to commit trillions of dollars to infrastructure systems that will fuel the next wave of economic growth.
The real question may not be whether intelligence is ready. The question may be whether the data foundation beneath it is also ready.
