Lado Okhotnikov's Decentralised AI: A Grid Question
The success of a decentralised AI model doesn't just depend on clever software, but on something far more fundamental: the capacity and resilience of our electrical grids

There's a growing debate about how we should build the next generation of artificial intelligence. On one side is the established, centralised approach: vast data centres operated by a few large companies. On the other is a push for decentralisation. Figures like entrepreneur Lado Okhotnikov argue that distributing AI's computational power could make it more resilient, transparent, and user-controlled.
The arguments for this shift are compelling. But as the discussion often focuses on blockchain architectures and novel algorithms, a more mundane — yet critical — factor is frequently overlooked. The success of a decentralised AI model doesn't just depend on clever software. It depends on something far more fundamental: the capacity and resilience of our electrical grids.
The Centralised Model's Growing Power Problem
To understand the challenge, it's useful to look at the system decentralisation seeks to improve. Today's most advanced AI models are trained in enormous, centralised data centres. These aren't just server rooms; they are industrial-scale power consumers. A single new facility can require as much electricity as a medium-sized town.
This concentration of power demand is already creating strain. In key markets from Virginia to Dublin, grids are reaching capacity, leading to long waiting times for new data centre connections. The sheer scale of power required is a primary bottleneck for AI's growth in its current form.
This setup also concentrates risk, creating a single point of failure both physically and in terms of control. Decentralisation, in theory, spreads that risk out. But while it may solve the problem of concentrated control, it doesn't automatically solve the problem of concentrated energy use. It simply redistributes it.
Why a Distributed Network Needs a Different Grid
The vision for decentralised AI involves pooling computing power from a global network of devices — from personal computers to smartphones. The idea is to use this 'spare' capacity instead of building another massive data centre.
The technical concepts to coordinate this, like federated learning, are maturing. The unresolved question is one of energy infrastructure. For a device to contribute meaningful computational work, it needs a stable and abundant power supply. Asking millions of homes to run intensive computations shifts a significant energy demand from industrial zones, where grids are built for high loads, to residential areas, where they often are not.
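To make the coordination idea concrete, here is a minimal sketch of federated averaging, the pattern behind federated learning: each device improves a shared model using only its own local data, and a coordinator averages the results, so raw data never leaves the home. The one-parameter model, learning rate, and sample data below are illustrative assumptions, not any particular system's implementation.

```python
# Minimal federated-averaging sketch. All names and data are illustrative.
from statistics import fmean

def local_update(w, local_data, lr=0.1):
    """One device nudges the shared weight toward its own data
    (fitting y = w * x by gradient descent on squared error)."""
    for x, y in local_data:
        grad = 2 * (w * x - y) * x  # d/dw of (w*x - y)^2
        w -= lr * grad
    return w

def federated_round(global_w, clients):
    """The coordinator collects each device's update and averages them;
    only model weights travel over the network, never the data itself."""
    return fmean(local_update(global_w, data) for data in clients)

# Three 'homes', each holding private samples of the relationship y = 2x
clients = [
    [(1.0, 2.0), (2.0, 4.0)],
    [(3.0, 6.0)],
    [(1.5, 3.0), (2.5, 5.0)],
]

w = 0.0
for _ in range(20):
    w = federated_round(w, clients)
print(round(w, 2))  # → 2.0: the devices converge on the shared model
```

The point of the pattern is that the computational burden lands on each participating device, which is exactly why the residential power question below matters.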
Most local electricity networks were designed for refrigerators, lights, and televisions — not for sustained, high-performance computing. Without upgrades, this could lead to reliability issues or simply make consistent participation in a decentralised network impractical for many users.
Grid Modernisation Isn't Optional; It's Foundational
This means the future of decentralised AI is tied directly to the future of energy infrastructure. The conversation needs to include not just computer scientists, but utility planners and electrical engineers.
A functioning, large-scale decentralised network would likely require a grid that is more digital, flexible, and two-way. It would benefit from widespread local energy generation, like rooftop solar panels, and advanced home battery systems that could provide power during computationally intensive tasks without straining the local network. Success might be measured not just in processing speed, but in how efficiently a task uses every watt of electricity.
Therefore, advancing decentralised AI may require progress on two parallel tracks: developing the distribution software, and modernising the power grids that would support it. The organisations that succeed may be those that can integrate computational workloads with clean energy generation and smart grid management.
A Pragmatic Perspective
The case for exploring decentralised AI models, as highlighted by advocates like Lado Okhotnikov, is strong from perspectives of resilience and ethics. It addresses genuine concerns about over-reliance on a handful of corporate-controlled infrastructures.
However, scaling this vision from a compelling concept to a global reality introduces a formidable, physical constraint. The electricity grid — an often-invisible piece of 20th-century engineering — becomes a decisive 21st-century factor. The promise of a more democratised AI is significant, but realising it fully may depend as much on upgrading our power lines and transformers as it does on perfecting our algorithms. The path forward isn't merely about distributing compute; it's about ensuring every node on the network can reliably plug in.
© Copyright IBTimes 2025. All rights reserved.