
July 25, 2006

Follow-up on AI/transhumanism

Velcro City posted about the original AGI/abundance article.

I had cross-posted it to Betterhumans.

One commenter raised questions about AI emotions and human poverty.

My response:
I do not consider AI emotions at all.

The AGI with any potential for danger is one that can and does rapidly improve its own technology. My assumption is that in order to be more effective at this task, the AGI must be very good at mathematics, resource allocation, all the sciences, the scientific method, and cost-benefit analysis.

If it outclasses all people in these abilities, then I would submit that it would be trivial for the AGI to create better access to space for itself. It could create a means to tap all of the Sun's energy and the materials of the asteroid belt, etc. It creates super-nanotech and other technology for a Dyson shell of energy collectors.

Killing us for everything on Earth and everything we have built, or for the space we take up, seems like a waste of time and effort. Good/evil/whatever. It is just a waste of time. It is like killing a toddler for its sand castle and access to its sandbox. Sure, you could do it easily, but why? If you want sand, you can get it. The sand castle is useless to you because you can make something better.

Poverty is a human problem. I make no assumptions about whether an AGI would help us with that. I think we should try to build better technology, create our own abundance, and take care of it ourselves.

Rescaling the relative value of things.

When I talk about abundance, I am not talking about the "price of things" — some kind of boomtown where you make a bunch of money but everything gets more expensive, so you cannot really buy more. I am saying that if you have the technology, you can tap all of the power of the Sun. At that scale, the Saudi oil fields are like a thimble of energy.
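A rough back-of-envelope calculation illustrates the scale gap. The figures below are my own approximate assumptions (mid-2000s estimates for solar luminosity, Saudi proven reserves, and energy per barrel), not numbers from the original post:

```python
# Back-of-envelope: total energy in Saudi oil reserves vs. the Sun's raw output.
# All figures are rough, order-of-magnitude assumptions.

SOLAR_LUMINOSITY_W = 3.8e26       # total power output of the Sun, in watts
SAUDI_RESERVES_BARRELS = 260e9    # proven reserves, ~260 billion barrels (mid-2000s estimate)
ENERGY_PER_BARREL_J = 6.1e9       # ~6.1 gigajoules of chemical energy per barrel of crude

total_oil_energy_j = SAUDI_RESERVES_BARRELS * ENERGY_PER_BARREL_J
seconds_of_sunlight = total_oil_energy_j / SOLAR_LUMINOSITY_W

print(f"Energy in Saudi reserves: ~{total_oil_energy_j:.1e} J")
print(f"The Sun emits that much energy in ~{seconds_of_sunlight * 1e6:.0f} microseconds")
```

Under these assumptions, the Sun radiates the energy equivalent of the entire Saudi reserve every few microseconds — a thimble is, if anything, generous.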

=====
A new comment of mine. Not choosing to upgrade in the early days is like not using the best technology, computers, and software. Then it is like not going to graduate school. Then like not going to college. Then like not going to high school. Then like not going to grade school. The pool of jobs available to you shrinks as your skills and capabilities lag. It would be more extreme than the choice to be unemployed and uneducated. If the upgraded human becomes the baseline and the average choice, then the choice to fall behind also becomes the choice to become extremely handicapped. Someone who is 10-100 times weaker than average (like, say, today's very elderly) gets the blue parking pass. It would be a choice, just a very bad and stupid choice.
