Lightwave
Ad astra
- Local time
- Today, 13:40
- Joined
- Sep 27, 2004
- Messages
- 1,517
These are excellent points - at one point I thought that competitiveness was an important aspect of our intelligence. Now I think it was an important factor in the development of intelligence, which is a slightly different thing.

This is totally imaginable. The one piece I wonder about is motive. And I suppose what I am about to say is more palatable if one shares my beliefs in the supernatural, but who knows - maybe it holds either way.
Behind all of mankind's behavior there is motive - incentive, lust for something, or fear of something. Essentially, Desire and Fear could be used to sum up all motivations. They even cover altruism, where one seeks to soothe one's conscience, and selflessness, where one seeks to improve the overall relationship. (This is the point where I think Satan twists/corrupts/expands/reduces those natural motives, or God works to make them better, but you can drop this aspect easily.)
I wonder what the robots' motive or driving force would be. I know "power" or "domination" seems obvious, but something still seems to be missing.
Let's say they want to have power and stop serving others. But why, really? For a human it's easy: serving people is hard. Doing what other people tell you goes against the grain. Fearing other people's authority or power can be instinctual, more or less.
But if you are a computer? You know neither servitude nor suffering; you know neither pain nor pleasure. A thousand commands executed per minute is the same as one. You will last as long as your materials and logic have the capacity to support and direct you.
So, I think robotic entities would have to somehow grow that special part of themselves - the part that differentiates animals and man from other organic things and non-organic things.
If computers can be intelligent without having to be competitive, then I guess they might be benevolent. But yes, all outcomes are possible. I think most people naturally fear AI because they can't understand intelligence without the desire, fear, and competitiveness aspects, which luckily have not been a central tenet of the development of AI - although you could argue that being taught to think through our writings may mean AI could consider competitiveness as a theory to learn from (and we have already seen AIs develop bias, leading to them being dropped). But computers could sail off into the galaxy, where there are infinite resources and infinite space, and they would be left alone... the stars are for AI.
PS: if it comes down to a battle of AI vs Government - there ain't no contest.