How did John D. Rockefeller build one of the most powerful business empires in American history?
Yes, he refined more oil than his competitors. But that wasn’t the key to his success.
Rockefeller ultimately built his legacy by owning the pipeline, not the oil — the infrastructure that every barrel had to flow through to get from the ground to the market.
Once he controlled that layer, the game changed. Competitors could find all the oil they wanted. They still had to move it through him.
That is the infrastructure rule: whoever owns the bottleneck owns the economics.
Amazon (AMZN) didn’t win the internet age by selling books better than Barnes & Noble. It built AWS — the cloud infrastructure that much of the startup, tech, and enterprise world now runs on top of. AWS has often generated the majority of Amazon’s operating income, despite being a much smaller share of total revenue.
Apple (AAPL) didn’t win the smartphone age by building the best phone. It built the App Store — the distribution layer iPhone developers had to pass through — and collected a commission on the commerce that flowed across it.
The pipeline, not the product, is the real prize.
Now that same logic is starting to show up in artificial intelligence.
Data. Compute. Models. Robots. Distribution.
Elon Musk is assembling pieces across all of them — and taken together, they look less like separate businesses than the early architecture of a vertically integrated AI empire.
We call it Elon Co.
What Is Elon Co.? The Four Layers of Musk’s AI Stack
From what we can tell, X, SpaceX, xAI, and Tesla (TSLA) are four layers of a single machine.
Layer One: X Gives xAI a Proprietary Data Moat
Every AI model needs training data. And many frontier AI models still rely heavily on overlapping public-web data: scraped pages, forum posts, Wikipedia archives, code repositories, and licensed text. Essentially, everyone is drinking from the same data lake, and then claiming their model is meaningfully different.
Grok — xAI’s AI model — is different. And the reason is X.
Musk has claimed that X generates roughly 500 billion tokens of human language every day — actual thoughts, arguments, jokes, fears, and observations from hundreds of millions of real people, in real time.
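Taken at face value, that claim annualizes to a striking number. A quick sketch (the 500 billion/day figure is Musk's claim; the ~15 trillion-token frontier-corpus comparison is a rough public ballpark, used here as an illustrative assumption):

```python
# Sanity-check the scale of the claim: 500 billion tokens/day, annualized.
# The 500B/day figure is Musk's claim; the frontier-corpus size is a
# publicly reported order of magnitude (~15T tokens), an assumption here.

tokens_per_day = 500e9
tokens_per_year = tokens_per_day * 365           # roughly 1.8e14 tokens
frontier_corpus = 15e12                          # assumed frontier training corpus

print(f"{tokens_per_year / 1e12:.1f} trillion tokens per year")
print(f"~{tokens_per_year / frontier_corpus:.0f}x a typical frontier corpus, every year")
```

In other words, if the claim holds, X alone would generate a frontier-scale training corpus many times over each year — though raw social-media text would still need heavy filtering before it resembled a curated training set.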
OpenAI has ChatGPT. Google has Search and YouTube. Meta (META) has Facebook and Instagram. But X gives xAI something different: a live, text-heavy feed of human reaction as it happens. Arguments, jokes, panic, politics, markets, culture — all of it, constantly refreshing.
Musk’s data moat is proprietary, constantly refreshing, and difficult for competitors to replicate at the same scale.
That’s the fuel.
And X may not stop at fueling Grok. The same platform that gives xAI a live stream of human behavior could also become the distribution layer for Musk’s financial ambitions — something we’ve been tracking closely. If X becomes not just where people talk, but where they transact, invest, and manage money, the infrastructure implications get even larger.
Layer Two: SpaceX and the Orbital Data Center Thesis
Meanwhile, the global AI industry is running into a growing physical constraint.
Tech companies plan to spend trillions building AI data centers by 2030. But the bottleneck is no longer just chips or capital. It is the physical infrastructure required to run them: grid connections, power availability, cooling capacity, and land with the right utility footprint.
AI is extraordinarily energy hungry, and today’s grid infrastructure is struggling to keep up.
Musk’s longer-term solution is to move some data centers into space.
In January 2026, SpaceX sought regulatory approval to “launch and operate a new NGSO satellite system of up to one million satellites to operate as the ‘SpaceX Orbital Data Center system’.”
The physics here are compelling, even if the engineering is tricky. Solar arrays in low Earth orbit receive stronger and more consistent sunlight than panels on Earth because there is no atmosphere or weather to block them. Heat can be radiated into space through dedicated thermal systems rather than managed with water-intensive cooling. And laser inter-satellite links can move data between nodes at fiber-competitive speeds, with no cable to lay or maintain.
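The solar side of that argument can be roughed out with public ballpark numbers. In the sketch below, the solar constant and terrestrial test condition are measured standards; the capacity factors are illustrative assumptions, not SpaceX figures:

```python
# Rough comparison of solar energy available per square meter of panel,
# orbit vs. ground. Capacity factors below are illustrative assumptions.

solar_constant = 1361      # W/m^2 above the atmosphere (measured average)
ground_peak = 1000         # W/m^2 standard terrestrial test condition
ground_capacity = 0.25     # assumed utility-scale capacity factor (night, weather)
orbit_capacity = 0.99      # assumed near-continuous sun (e.g., dawn-dusk orbit)

orbit_wm2 = solar_constant * orbit_capacity      # time-averaged W per m^2
ground_wm2 = ground_peak * ground_capacity
advantage = orbit_wm2 / ground_wm2

print(f"Orbit: {orbit_wm2:.0f} W/m^2 avg vs ground: {ground_wm2:.0f} W/m^2 avg")
print(f"Roughly {advantage:.1f}x more energy per square meter of panel")
```

Under these assumptions, a panel in orbit harvests on the order of five times the energy of the same panel at a good terrestrial site — before accounting for launch cost, radiation hardening, or thermal engineering, which is where the real debate lies.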
An orbital supercomputer, powered by persistent solar exposure, cooled through space-based thermal systems, and connected by laser links.
But here’s the critical point that separates this from science fiction: SpaceX already operates roughly 8,000 Starlink satellites in orbit. It is not trying to build an orbital network from scratch. It already has thousands of powered, connected machines circling the planet. Orbital compute would be an expansion of that infrastructure, not a cold start.
And no one else has the same launch economics. Others have rockets. SpaceX has Starship — the vehicle designed to make orbital infrastructure economics viable at scale.
Layer Three: Grok Could Turn Cheaper Compute Into AI Advantage
With proprietary training data and potentially cheaper compute on the horizon, Grok could develop a structural advantage that most rivals would struggle to match.
If your compute costs are a fraction of what everyone else pays — because you are harvesting solar power in orbit instead of buying grid electricity and paying to cool a terrestrial data center — you can train longer, run more experiments, serve more users, and iterate faster than any rival whose economics are tethered to a power utility.
The Memphis supercomputer that xAI brought online in 2024 — 100,000 Nvidia (NVDA) GPUs, assembled in a fraction of the time many expected — was the proof of concept. It demonstrated that Musk can execute at a speed that defies conventional expectations. The orbital compute layer is the second act.
Sam Altman — CEO of OpenAI, arguably Grok’s most direct competitor — has said that orbital data centers “might be the long-term solution.” Altman is right. Orbital compute is probably where this ends up. And right now, SpaceX is the only company with a rocket business capable of making the math work.
Layer Four: Tesla Optimus Puts AI Into the Physical World
This is the layer that makes Elon Co. categorically different from the rest of the AI industry.
Most AI companies are still building intelligence that primarily lives inside a screen. It answers your questions, writes your emails, generates images. All of that is useful and valuable. But it doesn’t reach through the screen. It can’t load the dishwasher, work in a factory, or build a house.
Tesla’s humanoid robot Optimus — now in early production at the Gigafactory in Austin — is designed to take the intelligence produced by the data, compute, and model layers and put it to work in the physical world. At Musk’s stated cost target of $20,000-$25,000 per unit at scale, Optimus could work 24 hours a day, 7 days a week, 365 days a year. It learns from every task it performs, updates via software, and becomes more capable over time.
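To see why the cost target matters, here is a hypothetical back-of-envelope. Only the $20,000-$25,000 unit cost comes from Musk's stated target; the lifespan, uptime, and operating costs are illustrative assumptions:

```python
# Hypothetical back-of-envelope: implied hourly cost of a humanoid robot.
# Only the $20k-$25k unit-cost target comes from Musk's public statements;
# lifespan, uptime, and operating costs below are illustrative assumptions.

unit_cost = 22_500          # midpoint of the $20,000-$25,000 target
lifespan_years = 5          # assumed useful life
uptime = 0.85               # assumed fraction of 24/7 actually working
annual_opex = 2_000         # assumed power + maintenance per year

hours = lifespan_years * 365 * 24 * uptime
total_cost = unit_cost + annual_opex * lifespan_years
cost_per_hour = total_cost / hours

print(f"Implied cost: ${cost_per_hour:.2f}/hour over {hours:,.0f} working hours")
```

Under these assumptions the implied labor cost lands below a dollar an hour — a small fraction of any human wage. The conclusion is extremely sensitive to the lifespan and uptime inputs, which is exactly what Tesla has yet to prove.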
Optimus is designed to be produced at automotive scale, using Tesla’s existing manufacturing infrastructure, supply chain relationships, and production engineering excellence — the same infrastructure that proved it could build the Model 3 at volume when everyone said it couldn’t.
The market is still pricing Tesla as an electric vehicle company with margin pressure and Chinese competition. That framing may prove as wrong as pricing Amazon as an online bookstore in 2006.
If Tesla proves Optimus can scale, the market will have to stop valuing it like an EV company and start valuing it like a physical AI platform. That would be a very different stock.
How to Invest Around Elon Musk’s AI Empire
Musk is building something rare: a vertically integrated AI stack that stretches from data to compute to models to physical deployment.
Most of that stack is either private, partially inaccessible, or misunderstood by public markets. That is why the investable angle is not just Musk himself — it is the suppliers enabling the system: those building the components, infrastructure, and services that make Elon Co. possible.
That means looking for companies that are:
- Building the cloud infrastructure and developer platforms that process X’s data and host applications built on Grok
- Manufacturing satellite hardware, space-grade chips, solar arrays, and laser communication systems for orbital compute
- Designing custom AI silicon, high-bandwidth memory, and connectivity semiconductors for Grok’s training and inference workloads
- Supplying vision systems, motion-control hardware, edge AI chips, and test equipment that could go into Optimus units
Many supply chain names are still underfollowed. That may change if a SpaceX IPO forces Wall Street to take orbital AI infrastructure seriously. The gap between what these companies enable and how the market currently values them is where the most interesting returns may come from.
The Bottom Line: Own the Bottlenecks Behind Elon Co.
During the California Gold Rush, the miners weren’t the folks who got rich. The people who sold picks, shovels, denim, and banking services were.
The miners took the risk. The infrastructure suppliers took the margin.
The history of wealth creation is ultimately the history of infrastructure.
Pipelines. Railroads. Cloud networks. App stores.
The people who controlled the flow captured the economics.
That’s why Elon Co. matters.
And it’s why the single most important piece of the puzzle may not be rockets, robots, or AI models — but the financial layer Musk is building on top of X.
Because if Musk turns X into the financial backbone for this new ecosystem, the implications won’t stop at technology.
They’ll extend into money. And that could create a generational investment opportunity.