Weekly Dose of Optimism #183
a16z AD Summit, Pi memory, Evo 2, Injectable Livers, Ginkgo Cloud Lab + 5 fundraising stories and my favorite sci-fi series.
Hi friends 👋,
Happy Friday and welcome back to our 183rd Weekly Dose of Optimism.
I was at a16z’s American Dynamism event in DC this week, where I got to meet a bunch of people who read Not Boring, including one who told me she never opens the Weekly Dose because the title doesn’t tell her anything.
If you’re reading this: thanks for the feedback, and I guess the new title worked. I’m running a title test; I’ll check open rates this week and decide what to do going forward.
Those of you who opened are in for a treat. What a week for the optimists.
Let’s get to it.
Today’s Weekly Dose is brought to you by… Guru
Guru is the best way to make sure that your employees get accurate company knowledge from whatever AI you use.
Companies like Spotify and Brex use Guru because it’s the only AI verification system that automatically validates company knowledge before your AI agents use it, like quality control for your AI’s brain.
Like theirs, your company’s knowledge wasn’t created to power automated systems. Your people send Slack messages, write docs, and even quasi-maintain wikis for other people to consume. Guru translates all of it to speak AI so that AI doesn’t give your people incorrect answers ~40% of the time.
Put your messy company data to clean use in a couple of clicks. Get a Guru to do it for you.
(1) a16z American Dynamism Summit
On Tuesday, I Joe Bidened myself1 down to Washington, DC for a16z’s American Dynamism Summit.
Complete with its very own Jason Carman hype video (see above), it was like a living, breathing version of the Weekly Dose of Optimism, and full of the people behind the stories we cover here each week.
Drew Baglino, whose Heron Power led off the Dose last week, was there, as was Noah Smith, whose Decade of the Battery I linked in that story. Cameron McCord, whose fundraise for Nominal you’ll read about today, was there, too. I spent time with probably a dozen people whose work has appeared in the Dose, and got to watch others on stage, like NASA Administrator Jared Isaacman.
My job is to spend time with these people, write about them, and invest in some of their companies, but there was something about seeing so many of them in one place that brought home the fact that a small group of sufficiently talented and motivated people can change how the future plays out.
Power generation, transmission, and storage, motors, power electronics, autonomous boats, supersonic planes, educational choice, modern manufacturing (terrestrial and in space), new weapons systems, cheap and secure communication everywhere, autonomous everything, safer neighborhoods, new cities altogether. If the people in that room succeed, America’s next 250 years will be better than its first 250.
(2) Physical Intelligence Models Get Memory So Robots Can Do Complex Tasks
Last week, I was talking to a Physical Intelligence investor who told me the progress the company had made in just the first couple of months of 2026 has blown his mind. Now, we know why.
Pi gave its models a new memory system that combines short-term visual memory, which recalls what the robot did recently in fine-grained detail, with long-term semantic memory, which lets it remember what it did for up to 15 minutes.
With better memory, robots using Pi’s model can do long, multi-step tasks, like clean up an entire kitchen, set up the ingredients for a recipe, and make a grilled cheese sandwich. The models also learn from their past mistakes.
Yesterday, I wrote that LLMs are producing a lot of dead text that we’re probably better off without. Robotics is way more interesting. I would love for the robots to spend billions of tokens doing my laundry and making me healthy gourmet meals while I kick back and read human-written classics. This is one more small step in that direction.
For a full primer on robotics, check out my cossay with Evan Beard:
(3) Arc Institute Releases Evo 2, Fully Open-Source Biological Foundation Model
A year ago, the Arc Institute released Evo 2, the largest fully open biological AI model ever built, as a preprint. This week it was published in Nature, and Arc dropped a one-year recap showing what the thing has actually done in the wild.
Trained on 9.3 trillion nucleotides from over 128,000 whole genomes, Evo 2 can read and write across all three domains of life: bacteria, archaea, and complex eukaryotes, including us. The model, built on an architecture called StripedHyena 2, trained on 30 times more data than its predecessor, Evo 1, and can reason over 8 times as many nucleotides at a time. Greg Brockman, the OpenAI co-founder and ex-Stripe CTO (Stripe co-founder Patrick Collison is a founder of Arc), helped build it during a sabbatical.
Evo 2 generalizes “across biological prediction and design tasks, across all modalities of the central dogma, across molecular to genome scale, and across all domains of life.”
It can predict which genetic variants cause disease and which don’t, without being specifically trained on that task. In tests on the breast cancer gene BRCA1, Evo 2 achieved greater than 90% accuracy in predicting which mutations are benign versus disease-causing.
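Under the hood, zero-shot variant effect prediction with a genomic language model typically works by comparing how likely the model finds the reference sequence versus the mutated one: disease-causing mutations tend to look “ungrammatical” to a model that has read a lot of DNA. Here’s a minimal toy sketch of that scoring idea, with a simple bigram model standing in for Evo 2 (this is not Evo 2’s actual API; the function names and the stand-in model are mine):

```python
import math
from collections import Counter

def train_bigram(seqs):
    """Toy stand-in for a genomic language model: bigram counts over A/C/G/T."""
    counts = Counter()
    for s in seqs:
        for a, b in zip(s, s[1:]):
            counts[(a, b)] += 1
    return counts

def log_likelihood(counts, seq, alphabet="ACGT"):
    """Log-probability of a sequence under the bigram model, with add-one smoothing."""
    total = sum(counts.values())
    ll = 0.0
    for a, b in zip(seq, seq[1:]):
        p = (counts[(a, b)] + 1) / (total + len(alphabet) ** 2)
        ll += math.log(p)
    return ll

def variant_score(counts, ref_seq, pos, alt):
    """Zero-shot variant effect score: log-likelihood(mutant) - log-likelihood(reference).
    More negative means the mutation looks less 'grammatical' to the model,
    i.e. more likely to be deleterious."""
    mut_seq = ref_seq[:pos] + alt + ref_seq[pos + 1:]
    return log_likelihood(counts, mut_seq) - log_likelihood(counts, ref_seq)
```

The real thing swaps the bigram counts for a multi-billion-parameter model trained on those 9.3 trillion nucleotides; the comparison of likelihoods is the part that lets it classify BRCA1 mutations without ever being trained on that task.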
It can generate new genomes. Entire genomes. The team demonstrated the first AI-designed and experimentally validated bacteriophage: 16 of 285 tested designs successfully propagated and inhibited growth of the target bacteria, with no impact on unrelated strains. For those who, like me, don’t fully speak the language of biology, that’s a whole new, functional organism.
And in a stunt that’s both scientifically real and deeply fun: the team guided Evo 2 to design DNA sequences with controllable chromatin accessibility patterns, then wrote Morse code messages (“EVO2,” “LO,” and “ARC”) into the epigenome of mouse embryonic stem cells, experimentally validated with ATAC-seq.
The whole thing is fully open: model weights, training code, inference code, and the OpenGenome2 dataset. So more scientists will be able to experiment with a model that understands the grammar of life as the cost of reading and writing biology keeps falling. The design space for medicine, agriculture, and materials is going to keep opening up in ways that are genuinely hard to imagine.
(4) Injectable “satellite livers” could offer an alternative to liver transplantation
Sangeeta Bhatia Lab / MIT / Cell Biomaterials
More than 10,000 Americans are on a waitlist for a liver transplant. There aren’t enough donated organs. And many patients with liver failure aren’t even eligible for the surgery because they’re too sick to survive it. For those people, the best current answer is: sorry.
MIT engineer Sangeeta Bhatia’s answer is: what if we just injected a second liver?
Her lab has developed “satellite livers,” tiny engineered tissue grafts made of hepatocytes (the liver's workhorse cells) mixed with hydrogel microspheres and supportive fibroblasts, all deliverable through a syringe. Once injected into fatty tissue in the abdomen, the spheres go in like a liquid and then solidify, giving the hepatocytes a scaffold to cluster around. Blood vessels grow in and the mini liver takes root. In mouse studies, the grafts remained viable and kept producing the right proteins for at least eight weeks (the full length of the study).
The insight that makes this work is a clever one: the hydrogel microspheres create an “engineered niche” for the cells. Without them, injected hepatocytes disperse and fail to integrate. With them, the cells localize, connect to the host circulation faster, and behave the way liver cells are supposed to.
The vision is a twofer. Short-term, these grafts could act as a bridge to transplantation, buying time for patients waiting on the donor list. Long-term, they could be the treatment: repeatedly administered, minimally invasive, titrated to how much liver function a patient needs. It could replace surgery altogether with routine injections.
The liver does about 500 essential things for the human body. The transplant chain is broken. People die. The fact that we might soon be able to restore meaningful liver function with a syringe, a handful of cells, and some hydrogel spheres is miraculous.
(5) Ginkgo Bioworks Launches Ginkgo Cloud Lab
In early February, we shared Ginkgo’s collaboration with OpenAI in which GPT-5 integrated with Ginkgo’s autonomous lab and autonomously designed, executed, and analyzed over 36,000 cell-free protein synthesis experiments. GPT-5 reduced costs by 40% compared to benchmarks (from $698 to $422 per gram). Good robot.
Now, Ginkgo is making its Cloud Labs available to anyone in a push to move all of its R&D services away from traditional benches and onto these futuristic ones. Building out a lab is expensive (Ginkgo CEO Jason Kelly tweeted that it can cost $10 million), and Ginkgo’s strategy has long been to offer its labs to startups, academic labs, and pharma companies as a service.
We hope that Cloud Labs will someday allow anyone to be a scientist with their own lab, just as personal computers and cloud data centers democratized programming and the web.
With this move, the service should get better as Ginkgo’s margins do. The starting price for Ginkgo’s first three protocols, two for cell-free protein expression and one for bacterial pixel art, is just $39.
Awesome news for the democratization of science, although I can’t imagine this is going to help my friend Noah Smith sleep any better at night.
EXTRA DOSE: Ulkar is out for the week, so in our Extra Dose, I’m rounding up a big week in fundraising news for software for hardware and sight restoration, plus recommending my favorite sci-fi series of all time.
Subscribe to Not Boring by Packy McCormick to keep reading this post and get 7 days of free access to the full post archives.