Weekly Dose of Optimism #84
AWS Nuclear, Gigascale Hydrocarbons, Helium, User-Owned Models, Arms
Hi friends 👋,
Happy Friday and welcome back to our 84th Weekly Dose of Optimism.
Packy here. Dan’s out this week and next slinging creatine gummies across the country, so I’m doing a Weekly Dose takeover. The Optimism Gods made my job easy this week — we have a lot of good news to cover.
Let’s get to it.
Today’s Weekly Dose is brought to you by… Secureframe
Secureframe is the only compliance automation platform with AI capabilities that help customers speed up cloud remediation and security questionnaires.
Secureframe empowers businesses to build trust with customers by simplifying information security and compliance through AI and automation. By automating manual tasks related to security, risk, and compliance, Secureframe allows businesses to focus on growth.
That’s why thousands of fast-growing businesses including AngelList, Ramp, Remote, and Coda trust Secureframe to expedite their compliance journey for security and privacy standards such as SOC 2, ISO 27001, PCI DSS, HIPAA, GDPR, and more.
As a long-term partner on my newsletter and podcast, and a fan of the team, I recommend it to all my portfolio companies.
Sign up for Secureframe and receive 10% off your first year:
(1) AWS acquires Talen’s nuclear data center campus in Pennsylvania
Dave Swinhoe for Data Center Dynamics
Talen Energy Corporation this week announced it has sold its 960MW Cumulus data center campus in Pennsylvania to a ‘major cloud service provider’ – listed as Amazon in a Talen investor presentation.
Welcome to the Age of Miracles, Amazon!
Data Center Dynamics reported that the bookseller has acquired a 960MW data center at a nuclear power station in my home commonwealth of Pennsylvania for $650 million.
While it’s not new nuclear, this is big news for a few reasons.
As Mark Nelson highlights, it’s Amazon’s first foray into nuclear. The company is the last of the hyperscalers to admit that nuclear is a good, clean energy source to power their energy-hungry data centers.
Those companies — Amazon, Google, Meta, and Microsoft — consume over 22 gigawatts of electricity in the US each year, and S&P conservatively estimates that number will grow to 33GW in the next few years. In order to go clean and maintain ESG ratings, they’ve contracted for 40GW of renewable power (the quoted capacity is higher than the 22GW need for a bunch of reasons, including intermittency). For context, the US has only 95GW of nuclear capacity.
Data centers may represent a huge catalyst for nuclear energy. Utilities are pretty terrible buyers for nuclear reactors, but hyperscalers are faster-moving and more economically motivated. Their soaring hunger for electricity may kickstart the process of building more reactors and driving them down the learning curve.
And not a moment too soon… the Russians and Chinese are discussing a partnership to build a nuclear reactor on the Moon by 2035.
Look, I’m all for a little international competition, but if anyone’s Putin the first nuclear reactor on the Moon, I want it to be the US of A.
(2) Gigascale Hydrocarbon Synthesis: Casey Handmer, Terraform Industries
I love all clean, cheap energy sources, and solar’s ride down the learning curve has been a thing of beauty.
It’s that learning curve that convinced Not Boring favorite Casey Handmer to start Terraform Industries, which is “scaling technology to produce cheap natural gas with sunlight and air.”
On the second episode of First Principles, Christian Keil digs into the details with Casey. They cover everything from why hydrocarbons are good to how to make synthetic methane (and eventually synthetic fuels), the Sabatier Reaction, and why cheap solar energy makes it all possible.
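For the chemistry-curious, the Sabatier Reaction is the textbook route from captured CO₂ and hydrogen to methane — run over a catalyst (classically nickel), and exothermic, so it gives off heat rather than consuming it:

```
CO₂ + 4H₂ → CH₄ + 2H₂O
```

That’s the general reaction; the specifics of how Terraform actually runs it are Casey’s to explain on the episode.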
My favorite part about Terraform is that if it’s successful, it will create even more demand for solar, driving down the price of one of its key inputs even further, and making solar a more abundant energy source for the world. Plus, it takes advantage of solar’s advantages without overloading the grid. Smart.
(3) Helium discovery in northern Minnesota may be biggest ever in North America
Jonah Kaplan for CBS News
According to Abraham-James, the helium concentration was measured at 12.4%, which is higher than forecasted and roughly 30 times the industry standard for commercial helium. "12.4% is just a dream. It's perfect," he said.
OK OK fine we’re getting really good at producing cheap, clean energy, but we can’t continue to grow because we’ll run out of resources, right? Wrong.
We just keep finding more of the things we need. Like the 2.34 billion metric tons of rare earth minerals found southwest of Wheatland, Wyoming last month. Or the enormous lithium deposits discovered in California’s Salton Sea.
And now, helium in Minnesota. This week, Pulsar Helium announced what might be the largest helium deposit ever discovered in North America.
Helium is best-known for its pivotal role in making balloons float and voices squeak, but it’s also a fantastic coolant, something that’s going to get more important as we build more big, hot things.
Helium is used in MRI machines (which might help beat cancer), semiconductor manufacturing, and fusion reactors. If things go well in the world, we’re going to need a lot more helium, and now we have billions of standard cubic meters of the good good 12.4% stuff.
The combination of human ingenuity, capitalism, and America’s blessed natural resources keeps winning.
(4) User-Owned Foundation Models
Foundation models tend towards monopolies, as they require huge upfront investments in the form of data and compute. It’s tempting to resign ourselves to the easy option: do the best we can with hand-me-down, open source models that are a few generations behind, the leftovers from large AI companies. But we shouldn’t settle for being a few generations behind and only eating leftovers! We as users should create our own best model — we have the data and compute to make it possible.
Anna Kazlauskas, the founder of our portfolio company Vana, has been building in crypto and AI since long before any token that even smells like AI went to infinity. The crypto x AI narrative is running years ahead of practical implementation, and most of those tokens will crash and burn, but there’s a path.
Anna paints a compelling vision for user-owned foundation models: “Users who contribute to the model would collectively own and govern it. They could get paid when the model is used, or even paid proportionally to how much their data improved the model.”
And she brings the facts. Models need a lot of two things to work: data and compute. Collectively, people have a ton of data and a ton of compute.
Data: 453 trillion words locked away as private data across big platforms (with an order of magnitude more coming as we start recording more of our lives).
Compute: Ethereum miners ran 16.4 million GPUs when the chain was proof of work, many multiples of the amount used to train GPT-4. When the incentives are right, people can pool vast resources.
There are challenges, of course. Not all data is high quality, and models perform better with high quality data. And distributed GPUs aren’t optimized for training the way a purpose-built data center is.
But with the battle over closed versus open AI raging on, with companies like Google showing that they want to put their own spin on their models’ outputs, and with AI becoming more powerful every week, it’s useful to think about alternatives that would give people more control and ownership.
* Speaking of AI getting better, you should really check out Anthropic’s Claude 3. It’s winning on benchmarks, but it also just feels smarter. It’s my favorite thought partner for now, until next week when someone else comes out with something better. What a time to be alive.
(5) Delhi Painter Gets Hands Back As Organ Donation Meets Surgical Excellence
Saikat Kumar Bose for India News
The surgery, which took more than 12 hours, involved connecting every artery, muscle, tendon and nerve between the donor's hands and the recipient's arms.
Four years ago, a 45-year-old Indian painter lost both of his hands in a train accident.
That’s a devastating accident for anyone, let alone a painter from an underprivileged background. And until recently, it would have been permanent.
But this week, a team of doctors at Sir Ganga Ram Hospital completed the first successful bilateral hand transplant in Delhi’s history.
“The surgery, which took more than 12 hours, involved connecting every artery, muscle, tendon and nerve between the donor's [Meena Mehta] hands and the recipient's arms.”
I mean, that’s just an unbelievable W for the painter, and for humanity.
Give those doctors a hand!
(I’m sorry. Time to wrap it up.)
We’ll be back in your inbox on Tuesday. Until then, spread the good news:
Thanks for reading,
Packy + Dan