Monday, November 24, 2008

Human Extinction Greatly Exaggerated?

Apparently so.

I need to do some clarifying.

In the previous post I ended up adding a caveat: "If humans can live off the interest they make from money they have in the bank, this wouldn't apply to them."

Whether that caveat holds depends largely on whether AIs have agency of their own.


If AIs are not independent agents, then they will not make money; the capital stays with the humans. Money is spent maintaining the AIs, but the AIs themselves never actually get paid.

In that case, all the money spent in the economy eventually returns to humans as interest. When a human buys a good, the payment goes to the AI managing the business; from there it goes to the AI investor, and back to the bank to repay the money loaned to it. Humans receive that money as interest. They may earn crappy wages, if they work at all, but they can live well off the interest.
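The arithmetic of living off interest can be sketched with made-up numbers (the deposit size and interest rate below are purely hypothetical, not figures from Hanson or anyone else):

```python
# Toy sketch of the circular flow described above: humans spend, the
# money passes through AI-run businesses and AI investors back to the
# bank, and returns to human depositors as interest.

bank_deposits = 1_000_000   # hypothetical human savings in the bank
interest_rate = 0.05        # hypothetical annual interest rate

# Annual interest income paid out to human depositors.
interest_income = bank_deposits * interest_rate

# Suppose humans spend no more than their interest income on goods.
consumer_spending = interest_income

# That revenue goes to the AI business, then to the AI investor, who
# uses it to repay its bank loan -- so the humans' principal is never
# drawn down, and the same income arrives again next year.
bank_deposits_next_year = bank_deposits

print(interest_income)          # 50000.0
print(bank_deposits_next_year)  # 1000000
```

As long as spending stays at or below the interest, the loop is self-sustaining: the principal never shrinks.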


If AIs are independent agents, things are more precarious. Humanity's future, in that case, depends on how much humans have in the bank, and on whether people (or AIs) have an incentive to create more AIs.
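To see why the size of the bank balance matters, here is a hedged sketch with invented numbers: if independent AIs capture most wages and profits, humans may have to spend more than their interest income, and then the principal gets drawn down until it runs out.

```python
# Hypothetical drawdown: spending exceeds interest income, so savings
# shrink each year despite compounding. All numbers are made up.

savings = 1_000_000   # hypothetical starting principal
rate = 0.05           # hypothetical annual interest rate
spending = 80_000     # exceeds the 50,000 of annual interest

years = 0
while savings > 0:
    # Earn interest on what remains, then spend.
    savings = savings * (1 + rate) - spending
    years += 1

print(years)  # number of years until the principal is exhausted
```

With these particular numbers the principal lasts about two decades; the point is only that once outflow exceeds interest, the clock starts ticking.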


Here's an essay where Robin Hanson explains his vision (which I'm basing this on): Economics of the Singularity


I've also discovered that the talk that inspired the gnxp post was by Marshall Brain at The Singularity Summit. A video should be up soon, and I'll post it later with criticisms.

Thanks to Ryan for keeping me on my toes.
