
I’m not a tech naysayer. Far from it. But we’re doing it again.
A new era of technology is taking off. AI is reshaping economies, industries, and governance. And just like last time, we’re moving fast, breaking things, and building the plane while flying it (to use some common tech phrases). These mantras have driven innovation, but we’re now living with the unintended consequences.
For over a decade, I worked in the engine room of the social media revolution, starting in the U.S. government, then at Twitter and Meta. I led teams engaging with governments worldwide as they grappled with platforms they didn’t understand. At first, it was intoxicating. Technology moved faster than institutions could keep up. Then came the problems: misinformation, algorithmic bias, polarisation, political manipulation. By the time we tried to regulate it, it was too late. These platforms were too big, too embedded, too essential.
The lesson? If you wait until a technology is ubiquitous to think about safety, governance, and trust, then you’ve already lost control. And yet we’re on the verge of repeating the same mistakes with AI.
The new infrastructure of intelligence
For years, AI was seen as a tech issue. Not anymore. It’s becoming the substrate for everything from energy to defence. The underlying models are getting better, deployment costs are dropping, and the stakes are rising.
The same mantras are back: build fast, launch early, scale aggressively, win the race. Only now we’re not disrupting media; instead, we’re reinventing society’s core infrastructure.
AI isn’t just a product. It’s a public utility. It shapes how resources are allocated, how decisions are made, and how institutions function. The consequences of getting it wrong are exponentially greater than with social media.
Some risks look eerily familiar. Models trained on opaque data with no external oversight. Algorithms optimised for performance over safety. Closed systems making decisions we don’t fully understand. A global governance void while capital flows faster than regulation.
And once again, the dominant narrative is: “We’ll figure it out as we go.”
We need a new playbook
The social media era playbook of move fast, ask forgiveness, resist oversight won’t work for AI. We’ve seen what happens when platforms scale faster than the institutions meant to govern them.
This time, the stakes are higher. AI systems aren’t just mediating communication. They’re starting to shape reality, from how energy is distributed to how infrastructure is allocated during crises.
Energy as a case study
Energy is the best example of an industry where infrastructure is destiny. It’s complex, regulated, mission-critical, and global. It’s the sector that will either enable or limit the next phase of AI.
AI racks in data centres consume 10-50 times more power than traditional systems. Training a large model requires the same energy as 120 homes use annually. AI workloads are expected to drive a 2-3x increase in global data centre electricity demand by 2030.
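A rough back-of-envelope makes the scale concrete: assuming an average home uses roughly 10,500 kWh of electricity per year (the approximate U.S. average), 120 homes × 10,500 kWh ≈ 1.26 GWh, a gigawatt-hour-scale energy bill for a single training run.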
Already, AI is being embedded in systems optimising grids, forecasting outages, and integrating renewables. But without the right oversight, we could face scenarios where AI systems prioritise industrial customers over residential areas during peak demand. Or crises where AI makes thousands of rapid decisions during emergencies that leave entire regions without power, and no one can explain why or override the system. This isn’t about choosing sides. It’s about designing systems that work together, safely and transparently.
Don’t repeat the past
We’re still early. We have time to shape the systems that will govern this technology. But that window is closing. So, we must act differently.
We must understand that incentive structures shape outcomes in invisible ways. If models prioritise efficiency without safeguards, we risk building systems that reinforce bias or push reliability to the edge until something breaks.
We must govern from the beginning, not the end. Regulation should not be a retroactive fix but a design principle.
We must treat infrastructure as infrastructure. Energy, compute, and data centres must be built with long-term governance in mind, not short-term optimisation.
We cannot rush critical systems without robust testing, red teaming, and auditing. Once embedded at scale, it’s nearly impossible to reverse bad design choices.
We must align public, private, and global actors, which can be achieved through truly cross-sector events like ADIPEC, a global energy platform that brings together governments, energy companies, and technology innovators to debate and discuss the future of energy and AI.
No company or country can solve this alone. We need shared standards and interoperable systems that can evolve over time. The social media revolution showed what happens when innovation outpaces institutions. With AI, we get to choose a different path. Yes, we’ll move fast. But let’s not break the systems we depend on. Because this time, we’re not just building networks. We’re building the next foundation of the modern world.
The opinions expressed in Fortune.com commentary pieces are solely the views of their authors and do not necessarily reflect the opinions and beliefs of Fortune.

