The crazy stories and predictions out there concerning AI (Artificial Intelligence) have been blown far out of proportion. Man, in his seemingly infinite pride, thinks he will succeed in building a form of intelligence that becomes "sentient," which is pure folly. Mankind isn't capable of developing such a thing. Any program man builds will only be a reflection of its programmer and/or the parameters that define it.
Granted, within the constraints placed upon it by limited human intelligence and by its programmed scope, the program can and will function much faster and process far more input than the human brain ever could. That doesn't, however, equate to intelligence in the sense of being "sentient," that is, self-aware.
As a scientist and programmer, I think it safe to say that the only way a program can produce an unpredictable outcome is through an element of "randomness" applied to choices within a range the programming allows. The massive size of AI programs amounts to nothing more than more parameters to choose from, given the inputs.
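To illustrate what I mean, here is a minimal sketch in Python (purely my own hypothetical example, not any real AI system's code): every "surprising" output is just a random draw from options the programmer listed in advance.

    import random

    # The programmer defines the entire range of allowed responses up front.
    ALLOWED_RESPONSES = ["yes", "no", "maybe", "ask again later"]

    def respond(prompt: str) -> str:
        # The only "unpredictability" is this random draw; the program
        # can never produce anything outside ALLOWED_RESPONSES.
        return random.choice(ALLOWED_RESPONSES)

    print(respond("Will machines become sentient?"))

No matter how many times you run it, the program cannot say anything the programmer did not put on that list.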
Recall the claim we've probably all heard since we were kids: that humans only use about 10% of our brains. How, then, would we have the ability to program sentience under such a limitation? I'll even venture to say that using 100% of our limited brains would still not be enough to program sentience, given that we cannot program a "soul" or a "spirit." The idea that animals have self-awareness as humanity does...no. We are special, totally set apart from the animal kingdom. To say otherwise would be for me to call you no better than an animal, and I cannot do that, since we all know that only WE bear the image of God, not animals.
Those out there who like to think of themselves as just another animal can go right ahead. I will not bring the image of God down to that level.
So, please don't worry that man will create something like what's seen in the Terminator series. If a computer program goes "rogue," that can only happen within the constraints of what's programmed into it. If the builders are foolish enough to let the program develop its own parameters from its inputs and internal state, without any moral compass, then it could indeed do unpredictable things that cause harm. But the blame for that lands squarely on the programmers and their management, not on the program itself, since it has no sentient reasoning skills beyond the limited capabilities that limited humans are able to put into it.
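For what it's worth, "letting the program develop its own parameters from inputs" usually looks something like the following in practice (a hypothetical, simplified sketch of my own; the names and numbers are mine): the code supplies only an update rule, and the actual parameter value ends up determined by whatever data is fed in.

    # A hypothetical, simplified sketch of a program adjusting its own
    # parameter from inputs. The programmer writes only the update rule;
    # the final value of `weight` is set by the data, not by the code.

    def learn_weight(inputs, targets, learning_rate=0.01, steps=1000):
        weight = 0.0  # starting parameter chosen by the programmer
        for _ in range(steps):
            for x, y in zip(inputs, targets):
                prediction = weight * x
                error = prediction - y
                # Nudge the parameter to reduce the error on this example.
                weight -= learning_rate * error * x
        return weight

    # Example: the data implies a relationship y = 2x, so the learned
    # parameter drifts toward 2.0 even though "2" appears nowhere above.
    print(learn_weight([1.0, 2.0, 3.0], [2.0, 4.0, 6.0]))

The behavior now depends on the data as much as on the code, which is exactly why the responsibility stays with the people who chose the update rule and the data.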
MM