7 Comments
Nathan Smith

The theories of economic growth, technological change, and ideas are important to understand, and kudos for putting this explainer out there.

But let me challenge you to take it further, in a certain direction: *map the space of technological possibilities.*

Economists have a concept called a "production function," which describes outputs as a function of inputs. But overwhelmingly, if not exclusively, production functions are framed in abstract terms, like "GDP" being a function of "capital" and "labor."

Real supply chains, meanwhile, involve production functions that are more concrete and specific. You need such and such an amount of coal and turbines to make such and such an amount of electricity. That sort of thing.

You can think of it as a network of nodes and directional links.
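To make the node-and-link picture concrete, here is a rough Python sketch of what such a map could look like; the goods, the coefficients, and the `inputs_required` helper are all invented for illustration, not taken from any real dataset.

```python
from dataclasses import dataclass, field

# A minimal sketch of the "map" idea: each node is a good or process, and each
# directed link records how much of an input is needed per unit of output.
# All names and coefficients below are made up for illustration.

@dataclass
class Process:
    output: str
    inputs: dict[str, float] = field(default_factory=dict)  # input -> units per unit of output

network = [
    Process("electricity_MWh", {"coal_tonnes": 0.5, "turbine_hours": 1.0}),
    Process("aluminum_tonnes", {"electricity_MWh": 15.0, "bauxite_tonnes": 4.0}),
]

def inputs_required(network: list[Process], good: str, quantity: float) -> dict[str, float]:
    """Walk the directed links backwards to total the primary inputs for an output."""
    process = next((p for p in network if p.output == good), None)
    if process is None:                  # nothing produces it: treat it as a primary input
        return {good: quantity}
    totals: dict[str, float] = {}
    for inp, per_unit in process.inputs.items():
        for name, amount in inputs_required(network, inp, per_unit * quantity).items():
            totals[name] = totals.get(name, 0.0) + amount
    return totals

print(inputs_required(network, "aluminum_tonnes", 10))
# {'coal_tonnes': 75.0, 'turbine_hours': 150.0, 'bauxite_tonnes': 40.0}
```

In a representation like this, invention shows up as new nodes and links, while investment and organization show up as actually putting flows through the links that already exist.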

Invention expands the map. New nodes are added. That can be very useful. But exploitation of the technology space should not be taken for granted. There are inventions which are obsolete, no longer useful. There can be inventions for which the economy is not ready.

Invention may be less important as a determinant of progress than specialization and division of labor, and/or capital accumulation.

Where I think you're in an almost uniquely brilliant position to contribute is at the intersection of economics and engineering. Work on mapping what is known. I think you'll find that while more research may be needed in some places, there are a lot of useful but unimplemented, or insufficiently implemented, technologies already known, and that investment and better organization are more important at this stage than new discoveries.

Of course, if we were exploiting the known technology space more thoroughly, that would encourage discovery and invention as well.

Vakus Drake

One flaw I see in your analysis is that you are only considering growth over timescales that are still extremely short compared to that of human civilization overall. If you consider things centuries or millennia ahead, it's obvious that there are numerous physical limits that prevent you from sustaining even a small exponential growth rate long enough to colonize more than a minuscule fraction of the galaxy.

The issue is that the rate at which you can gather resources grows at most polynomially. Even if you were to expand and colonize everything at near lightspeed in an expanding sphere, the resources within reach would only grow with the cube of time, and exponential growth will always eventually outstrip any polynomial:

If you do the math for an 8 billion population with a 2% growth rate, then in 20k years every Planck volume needs to have twelve quadrillion people in it. With a 1% growth rate it's 1.4142695e-98 cubic cm per person or 3.35e+69 Planck volumes per person. Meanwhile a hydrogen atom is like ~200,000 times bigger than that at 1.5e+74 Planck volumes.
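For anyone who wants to check that first figure, here is one rough back-of-the-envelope version in Python. The setup is my own reading of the scenario: a sphere expanding at light speed for the full 20,000 years, standard values for the light year and Planck length, and a 2% annual growth rate.

```python
import math

# Rough check of the 2%-growth scenario, assuming an expansion sphere that
# grows at light speed for 20,000 years. All values are approximate.
POP_0 = 8e9                  # starting population
GROWTH = 0.02                # 2% annual growth
YEARS = 20_000
LIGHT_YEAR_M = 9.4607e15     # metres per light year
PLANCK_LENGTH_M = 1.616e-35  # metres

population = POP_0 * (1 + GROWTH) ** YEARS         # ~8e181 people
radius_m = YEARS * LIGHT_YEAR_M                    # 20,000 light years in metres
volume_m3 = (4 / 3) * math.pi * radius_m ** 3      # ~2.8e61 m^3
planck_volumes = volume_m3 / PLANCK_LENGTH_M ** 3  # ~6.7e165

print(f"people per Planck volume: {population / planck_volumes:.2e}")
# -> ~1.2e+16, i.e. on the order of ten quadrillion people per Planck volume
```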

So I've come to the conclusion that without a singleton AI or some authoritarian world government (which needs to emerge prior to major space colonization) you will *eventually* have the inner parts of the future human expansion sphere turn into a post-biological Malthusian hell.

The overpopulation issues I describe won't kick in for centuries at least, meaning that by the time people realize the danger, civilization will be too spread out to feasibly restrict every group's reproduction. I say post-biological because once resource scarcity becomes dire, digital minds will be able to support themselves for a lot longer, since they can in principle be vastly more resource-efficient.

So you have a strong impetus to try to stay on the frontier of expansion so you can avoid future resource scarcity (as long as your particular expansion fleet has some internal regulation to restrict its growth, for which many mediocre solutions and one good one exist). Then you want to leave the space behind you totally mined out and filled with automated defenses, so nobody finds it worthwhile to try to follow you, thus preventing competition for the resources you're trying to claim. If you keep going long enough, you will eventually be over the Hubble horizon from your competitors, at which point you never have to deal with outside competition again.

As for your point about birth rates reversing, I would point out a good data point you should have included: if you look specifically at wealthy women, their birth rates are notably higher than those of the rest of their society, and also in line with how many children women say they'd *like* to have given the opportunity. This suggests that underpopulation won't be a short- to medium-term problem if everyone reaches the QOL of currently rich people.

Nathan Smith

I made a presentation about it some years ago. I think you would find it very beneficial to take a look.

Let me know if you'd like me to write a guest post about it. I'd like to connect with your audience.

https://drive.google.com/file/d/1pc1s3rPb1y1bOdmjVUTUxnmcMnS09R0y/view?usp=drivesdk

Swami

Insightful comment.

I would offer two complementary responses.

First, worrying about growth rates centuries or millennia in advance is probably not very worthwhile. Too many unknown unknowns. We should focus on the part of the forward trail that is within our view and potential understanding.

Second, and more importantly, as JC implies in his discussion of AI researchers, it just is not the case that growth requires more resources. It can require more resources, and honestly usually does. However, it is possible to get more efficient and achieve growth at the same time.

A virtual reality hell and a virtual reality utopia, for example, could require exactly the same resources. But one is the ultimate regression, and the other the ultimate progress.

In other words, qualitative improvements do not necessarily require quantitative increases in energy or resources.

Vakus Drake

Had to send this on mobile, so pardon the formatting.

There's a reason I focused specifically on population: however few resources a digital mind requires, the minimum is set by the Landauer limit, and you can't make computers below the Planck scale. My analysis basically assumes infinite energy and shows that you still run out of physical space in which to put things and people, regardless of their substrate.
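For reference, the Landauer bound sets a floor of k_B·T·ln 2 joules on erasing one bit of information at temperature T. A quick illustrative calculation (the temperatures are my own example values, not from the thread):

```python
import math

K_B = 1.380649e-23  # Boltzmann constant, J/K

def landauer_energy_joules(temperature_kelvin: float) -> float:
    """Minimum energy needed to erase one bit at the given temperature."""
    return K_B * temperature_kelvin * math.log(2)

print(landauer_energy_joules(300))  # ~2.9e-21 J per bit at room temperature
print(landauer_energy_joules(2.7))  # ~2.6e-23 J per bit near the CMB temperature
```

Even running as cold as physics allows, each erased bit costs something, which is part of why, on this view, efficiency gains only delay the limits rather than remove them.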

As for timing:

Firstly, unless anti-aging research hits a wall pretty soon, these are things that may actually affect you. And you have a strong incentive to want to be on the frontier of expansion, which means setting off *before* things get bad. Also, things like super-fast (and fast-reproducing) digital minds, or people being sufficiently wasteful, could massively shorten the time before a Malthusian trap.

Secondly, these problems, if they are to be addressed, must be addressed well before they kick in, since enforcement of any countermeasures becomes impossible unless something like an enforcer AI is sent on every colony ship from the outset of expansion.

For instance, if these things aren't considered when making AI, you might lock yourselves out of being able to do anything about the problem, other than getting out of Dodge early.

Swami

Thanks, but I am still confused. The essay is not arguing for population growth. It is arguing for a growth in ideas — over the reasonably foreseeable future.

Perhaps it would be useful for me and other people following the conversation if you clarified what physical limits affect the issue over the next few decades?

Vakus Drake

Basically, you can only use resources so efficiently, so you do eventually reach a "peak resource" situation you genuinely can't escape from. Even if you had unlimited energy, physical space itself would eventually become a bottleneck.

While this is definitely a long-term issue, there are a number of things that could pull the problem forward to within a few decades. For instance, digital minds running at 1000x speed may also reproduce at 1000x speed. Additionally, self-replicating machines aren't limited by human birth rates, so if people hoard or waste enough resources via automation, that could dramatically speed things along.
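As a rough illustration of the speed-up point (every specific number here is my own assumption, not a claim from the thread), a 1000x subjective speedup turns a decades-long doubling time into days of wall-clock time:

```python
import math

# Illustrative only: how a 1000x subjective speedup compresses reproduction
# timescales. The doubling time and growth target are assumed, not sourced.
SUBJECTIVE_DOUBLING_YEARS = 30  # assumed doubling time at human speed
SPEEDUP = 1000                  # digital minds running 1000x faster

wallclock_doubling_days = SUBJECTIVE_DOUBLING_YEARS * 365 / SPEEDUP
print(f"doubling time: ~{wallclock_doubling_days:.0f} days of wall-clock time")

# Doublings needed for the population to grow a trillionfold:
doublings = math.log2(1e12)
print(f"a trillionfold increase takes ~{doublings * wallclock_doubling_days / 365:.1f} years")
```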

Additionally, even if these issues don't arise in the next few decades, we still likely need to plan for them soon, simply because we might otherwise pass the critical period in which anything can be done to address them.
