July 19, 2006

Good/Bad AIs, accelerating returns and a lot of abundance

Several topics related to projected advanced technology are often analyzed in isolation: AI, accelerating technology, and the abundance made possible by technology and the resources of space.

There are various papers about achieving abundance from advanced technology like molecular nanotechnology.

There is the analysis by Ray Kurzweil that technology is providing accelerating returns.

There is also the concern about the need for friendly Artificial Intelligence (AI). This matters because the technological Singularity is mainly about the development of intelligences that are far greater than human and how that will cause an explosion of technological capability.

People can get a sense of the immense resources of energy and materials in space from the Kardashev scale of civilizations.
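A back-of-the-envelope comparison makes the scale concrete. The figures below are commonly cited approximations (my assumptions, not from the post): mid-2000s human energy use of roughly 1.5e13 W, and the standard Kardashev benchmarks for planetary, stellar, and galactic civilizations.

```python
# Rough comparison of Kardashev-scale power levels against humanity today.
# All numbers are approximate, commonly cited values (assumptions on my part).

HUMAN_POWER_W = 1.5e13   # approximate mid-2000s human energy use
TYPE_I_W = 1e16          # Type I: commands a planet's share of sunlight
TYPE_II_W = 3.8e26       # Type II: the Sun's total output (Dyson-shell scale)
TYPE_III_W = 1e36        # Type III: the output of an entire galaxy

ratio_type_ii = TYPE_II_W / HUMAN_POWER_W
print(f"Type II vs. humanity today: {ratio_type_ii:.1e}x")
```

The Type II ratio works out to roughly 2.5e13, i.e. tens of trillions of times current human energy use, which is the sense in which space resources are "trillions of times" what is available on earth.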

People fear that an AI that is vastly more intelligent than people will rapidly become very powerful and dangerous to them. If an AI is vastly superior in intelligence and is able to rapidly develop and extend technological capability, then it should rapidly be able to tap the resources of space, which are trillions of times greater than what is available on earth. The AI can make itself mobile, leave, and do whatever it wants. I have difficulty seeing the motivation, good or bad, for the AI to decide to kill people on earth. The AI can basically outclass any human that is not completely augmented. It would be like Bill Gates's parents being concerned that he might plot to kill them for his allowance. Even if the AI is very greedy or expansionist, what we have developed so far should be irrelevant to its aims. Maybe a bad AI won't help us out and will just leave. But why would it fumigate the old house on the way out?

The superior AI rapidly moves itself onto an entirely different level. Tiger Woods does not need to dominate the miniature golf courses.

There is also the discussion about whether or not to upgrade people. There is the concern that the non-upgraded and therefore weaker people would be at the mercy of those who upgrade. But the question is not whether the non-upgraded will be killed; again, abundance and accelerating returns from technology mean that those who do not upgrade simply become irrelevant.

Accelerating returns mean that the 21st century will not deliver merely 100 years of progress; it will be more like 20,000 years of progress (at today's rate). 200 years of progress will be more than 4,000,000 years of progress (at today's rate).
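Kurzweil's ballpark figure can be reproduced with a very simple model. The specific assumption here is mine (the rate of progress doubles every decade, with the year-2000 rate as the baseline); Kurzweil's essay argues along similar paradigm-shift-rate lines.

```python
# A minimal sketch of the accelerating-returns arithmetic, assuming the rate
# of progress doubles every decade (this doubling model is my assumption).

def equivalent_years(calendar_years, doubling_period=10):
    """Calendar progress re-expressed in 'years of progress at today's rate'."""
    total = 0.0
    for decade in range(calendar_years // doubling_period):
        # Each decade runs at double the previous decade's rate; the first
        # decade is taken to average ~2x today's rate.
        rate = 2 ** (decade + 1)
        total += doubling_period * rate
    return total

print(equivalent_years(100))   # ~20,460 -> Kurzweil's "about 20,000 years"
print(equivalent_years(200))   # ~21 million, comfortably "more than 4,000,000"
```

The 100-year sum lands near 20,000; the 200-year sum is far above the 4,000,000 figure quoted above, so the claim holds under this model.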

In about 20 years, those who have not upgraded are like the Amish, a few hundred years behind in technology. They are a quaint curiosity and barely connected to the advanced economy.

In about 100 years, they are like the cavemen and utterly removed from and unable to understand the advances being made.

In 200 years, they are like chimpanzees. The choice not to adopt the best technology is like choosing not to evolve.

For those who choose to advance and become transhuman, being generous to those who did not becomes very easy with abundance and the resources of space. The cost becomes an increasingly small fraction: initially like foreign aid (1-2%), then like setting aside nature preserves and reservations, then like setting up city zoos, then like keeping potted plants and ant colonies.

So one thing to remember is that abundant is really abundant, not a little abundant.

And AIs and radically augmented people can move themselves onto an entirely different level of operation. Fear not the bad AI, but the meticulously cruel AI.

Don't upgrade, and you will rapidly become irrelevant.


bw said...

Velcro City posted about this article.

I had cross-posted this to betterhumans.

One commenter talked some about AI emotions.

My response:
I do not consider AI emotions at all.

The AGI with any potential for danger is one that can and does rapidly improve its own technology. My assumption is that in order to be more effective at this task, the AGI must be very good at math, resource allocation, all of the sciences, the scientific method, and cost-benefit analysis.

If it outclasses all people in these abilities, then I would submit that it would be trivial for the AGI to create better access to space for itself. It could create a means to tap all of the Sun's energy and the materials of the asteroid belts, etc. It creates supernanotech and other tech for a Dyson shell of energy collectors.

Killing us for everything on Earth and everything that we have built or for the space we take up seems to be a waste of time and effort. Good/Evil/whatever. It is just a waste of time. It is like killing a toddler for its sand castle and access to its sand box. Sure you could do it easily, but why? If you want sand you can get it. The sand castle is useless to you because you can make something better.

Poverty is a human problem. I do not make any assumptions about whether AGI would help us on that. I would think that we should try to build better tech and create our own abundance and take care of it ourselves.

Rescaling the relative value of things.

When I talk about abundance, it is not about the "price of things": some kind of boomtown where even though you make a bunch of money, they start charging you more so you cannot really buy more. I am saying that if you have the tech, you can tap all of the power of the sun. Then the Saudi oil fields are like a thimble of energy.
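The thimble comparison can be made quantitative. The inputs below are rough, commonly cited figures and are my assumptions, not from the post: mid-2000s Saudi proven reserves of about 260 billion barrels, roughly 6.1e9 joules per barrel, and a solar luminosity of about 3.8e26 watts.

```python
# How big a "thimble" are the Saudi oil fields next to the Sun?
# Rough figures (assumptions on my part, not from the post).

SAUDI_BARRELS = 260e9           # ~mid-2000s proven reserves, barrels
JOULES_PER_BARREL = 6.1e9       # approximate energy content of a barrel of oil
SOLAR_LUMINOSITY_W = 3.8e26     # joules radiated by the Sun per second

saudi_energy_j = SAUDI_BARRELS * JOULES_PER_BARREL
seconds = saudi_energy_j / SOLAR_LUMINOSITY_W
print(f"The Sun radiates the energy of all Saudi reserves in "
      f"{seconds * 1e6:.0f} microseconds")
```

Under these assumptions the Sun emits the energy equivalent of the entire Saudi reserve in a few microseconds, which is the sense in which tapping the sun rescales the value of oil fields to a thimble.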

A new comment of mine: not choosing to upgrade in the early days is like not using the best computers and software. Then it is like not going to graduate school, then like not going to college, then like not going to high school, then like not going to grade school. The pool of jobs available to you shrinks as your skills and capabilities lag.