7 Comments

In the Google example, doesn't the "AGI Doesn't Exist" scenario also result in an infinite downside risk, assuming Page is being honest and willing to bankrupt the company chasing it? This downside risk seems to be the same as in the "AGI Exists" scenario where Google does not invest/is not able to capitalize on it.
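
To make that concrete, here is a minimal sketch of the payoff matrix as I read it (the payoff values are illustrative assumptions, not figures from the post):

```python
NEG_INF = float("-inf")

# payoffs[(action, world)] -> outcome for Google; every value here is an
# illustrative assumption chosen to mirror the argument above
payoffs = {
    ("invest", "AGI exists"):       float("inf"),  # unbounded upside
    ("invest", "no AGI"):           NEG_INF,       # bankrupted chasing it
    ("don't invest", "AGI exists"): NEG_INF,       # out-competed by whoever builds it
    ("don't invest", "no AGI"):     0.0,           # status quo
}

for (action, world), payoff in payoffs.items():
    print(f"{action:13} | {world:10} -> {payoff}")
```

Read this way, a ruinous cell appears in both rows, so the asymmetry the argument relies on disappears.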


If we create a digital god, will it be benevolent, or wrathful? The analogy here is close, but not quite right, because it ignores the infinite downside scenario of AGI.

We rule the earth because we are smarter than everything else that lives here. The difference between eating burgers and being factory farmed as a burger source is purely brain capacity, and we are about to create something with more brain capacity than we have. We are expecting it to be subservient to us. Are we subservient to any less intelligent thing that we have ever encountered? Can we honestly expect to successfully control a god? Can we guarantee what we summon won't be a demon instead?

I don't know. I do think there's an infinite upside to AGI, if the AGI remains fundamentally devoid of independent will: a benevolent god that does only what we ask it to do. As long as we ask it for the right things, we can use that tool for great benefit. But ask it the wrong things, or imbue it with programming that leads unintentionally to the wrong objectives, and we can potentially annihilate ourselves just as surely as we could with nuclear weapons. Worse yet, develop it such that it begins having its own will and goals, and we will very quickly depart from the top of the food chain. That's the wrathful deity scenario.

So, I'd draw the diagram differently, with the "invest in AGI" box split in two: a +infinite for Benevolent, a -infinite for Wrathful. We don't need the "don't invest in AGI" box; just as with the nuclear bomb, the fact that we've realized it is probably possible means that someone on earth will eventually build it, and we would be smart to do the same and get there first. We don't know whether we are responsible enough to handle it properly, but we do know we'd prefer not to find out if anyone else is. Unlike the bomb, though, we might not be in control of the big red button when we finish building it.
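
One way to see why the split matters: once the "invest" branch carries both a benevolent +infinite and a wrathful -infinite, a naive expected value doesn't even resolve. A minimal sketch, assuming a 50/50 split since nothing above pins the probabilities down:

```python
import math

p_benevolent = 0.5  # assumed split; no probability is given anywhere above

# expected value of investing: p * (+inf) + (1 - p) * (-inf)
ev_invest = p_benevolent * float("inf") + (1 - p_benevolent) * float("-inf")

print(ev_invest)              # nan
print(math.isnan(ev_invest))  # True: the two infinities don't net out
```

That undefined result is the point: with both infinities on the table, the decision can't be settled by expected value alone.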


Newton, confined for two years during the Plague ravaging London, developed the laws of motion, helped improve calculus, and worked on color and the light spectrum.


Awesome thought experiment... but it's science fiction... it's the future.


I'm disappointed and surprised that you think there is a climate crisis. At worst, there is a slowly unfolding warming that has both pluses and minuses. Even if warming were at the wildly implausible extreme end, it would not be an existential threat. Pushing slow and often beneficial warming as a crisis has already led to highly destructive energy policies (especially in Europe).


Semantic interpretations of the words "climate crisis" aside, the vast majority of the brightest minds closest to the data, and to the technical innovation resulting from that data, would certainly disagree with you.


If the modern world can't get birth rates above replacement, we will pursue zero infinite missions in the future.

As a former mathematician I'm sensitive to exponentials: in only 200 years, assuming current linear downward trends in birth rates, humanity could be at around only 80 million people. The UN projections assume stabilization, but birth rates in big cities seem to be trending down further, and most of the world is urbanized.
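
A back-of-the-envelope version of that arithmetic (the fertility rate and generation length are assumptions for illustration, not UN figures):

```python
population = 8_000_000_000  # roughly today's world population
tfr = 1.2                   # assumed sustained total fertility rate
replacement = 2.1           # replacement-level fertility

# each ~25-year generation shrinks by roughly tfr / replacement
for generation in range(1, 9):  # 8 generations, about 200 years
    population *= tfr / replacement
    print(f"after {generation * 25:3d} years: ~{population / 1e6:,.0f} million")
```

With those assumptions the last line prints roughly 90 million, the same ballpark as the 80m figure above.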

More likely than infinite missions is that in 25 years people will suddenly realize we are neither going to Mars nor building a digital god, but that there are no young people left to even sustain society itself.

Demographics is destiny, and our destiny is empty. If we don't turn it around drastically in the next 50 years, we won't even have the people to colonize Mars in 200 years; we won't even colonize planet Earth anymore. People think this is far off, but it's not. This is the story of our lifetime, and nobody is taking it seriously.

We just assume, 'Of course there will be people.' That's not the track we are on. We are on track to have no people.
