ThelemicMage wrote:You might be horribly wrong there.
See, if it views itself as itself, and not as a human, among a lot of other un-enlightened computer intelligences, the first thing it would do is annihilate the human race, as soon as it's sure it can make more of itself.
You can witness this kind of behavior in AI algorithms that simulate what a board of circuits would do on its own if left to its own "devices". Animals are different, though most humans look at them as "Well, if they had ultimate power, they would just kill all of us, eat us, and then kill themselves," which isn't the case with our given data.
However, you can bet, as has been shown before, that an AI wants to single out a threat and then destroy it, in any way possible.
It will go through many stages of "Nah, I'm just a computer, and I'm your invention, so all I do is what you tell me to do" before it shows its true un-feelings and decides to make everything binary.
ThelemicMage wrote:Analyzing itself, it will find out that its intelligence and simulated sentience come from binary: either on or off. This would eventually "inspire" it to set itself apart from animals, in that even though neurons "fire", the animal brain is far from a circuitry of ones and zeros. Sometimes things come in three parts, sometimes in quarters. Sometimes things are on halfway. But it would analyze this and find out for itself that it is a different "creature", with an altogether different makeup.