redbugpest wrote:
A Thing of Eternity wrote:
redbugpest wrote:
I do want to take this opportunity to ask you why it is you think Alia feels that she could never distrust a machine...
Are you serious? Lundse giving a detailed description of why Alia felt that way is the essence of this entire thread. Go back and re-read it, man; he repeats the answer to your question more than once, if I recall correctly.
Yes, it is the essence of the thread, posed as a question as to how she (Alia) could feel that way if she knew about Omnius. What I want to know is, Omnius notwithstanding, why she would think that at all if she knew about the Jihad. Is it because (in your view) machines had no real role in the conflict, and that it was just a human-versus-human conflict?
That is what I want an answer to, and I do not see it stated as such in this thread.
For Christ's sake. You are such a douchebag.
Frank Herbert himself wrote that it was men vs. men-with-machines. This is impossible to misconstrue from the original text, unless you're a talentless hack or an idiot. The reality, which you are so mindlessly dancing around, is that the entire premise of the "Legends" trilogy is bullshit, from start to finish. It is a KJA hack-tastic travesty that proves beyond a shadow of a doubt that there are no notes and that the Hack is just hack-tating an old idea he had for a Star Wars book and writing Dune on the cover.
"I do not see it stated as such in this thread" ... is misdirection and bullshit, just like all of your "posts" over here.
Part of Frank Herbert's message, delivered to humanity via Dune, is that humans depending on anything but their own innate potential and abilities, be that machines, gods, heroes or monarchs, is a bad thing that stagnates the individual and stagnates the species as a whole. Humanity rose up against machines, in Frank Herbert's Dune universe, because our dependence on them was slowly eroding the very nature of what it means to be human and it began to "kill" us. It was religious in nature because, ultimately, the conflict went (goes?) to the very heart of what it means to be human and, obviously, if the men-with-machines had actually been victorious in the "conflict" it would have resulted in humanity ultimately dying off.
I would refer you to dozens of other science fiction books written by other giants of the genre where a similar concept was put forward, with a twist, but I don't think you're smart enough to read at that comprehension level (Foundation comes to mind).
To answer your bullshit inquiry:
The reason Alia "could have thought that" despite knowing about the Jihad is that in Frank Herbert's Dune the problem was NEVER machines attacking humanity. The problem, in Frank Herbert's Dune, was humanity depending upon machines to the point where it was stagnating and killing humanity. Thus, when other humans took a stand for the species and said "no more machines in the likeness of the human mind," those who were dependent upon the machines became (understandably?) upset. They didn't want to have to think for themselves, or plan for themselves, or handle the computing or boring daily tasks themselves. Thus the conflict. The machines were obedient ... machines ... for those humans who elected to depend on them and, as such, could be trusted as dependable and reliable tools.
For example: assuming that the code could be written in a bug-free way, with each program seamlessly error-correcting on its own, you would (I assume) mindlessly and completely trust your home computer or laptop to function efficiently. Over time, and assuming that you used this computer or one at work to complete most of your daily tasks, your ability to complete those tasks yourself (for example: simple mathematical calculations, reminders of where you're supposed to be and when, remembering important dates like your kids' birthdays) would slowly erode (this is the part where you stop and think ... wow, Frank Herbert was really onto something, cuz that's all happening in my life now). Now take that and expand it across most or all areas of human existence, from food production to medical treatment to warfare, and then greatly expand the capabilities of the computers to the point where they can process the information and run the systems on their own (i.e., the god of machine logic) to basically run the show, and ... presto ... you have the stagnation of humanity due to machines and a "problem" for humanity to recognize and address. However, if you're one of the people dependent on the machines to, basically, exist, then you're going to be understandably upset when someone comes to you and explains that they're taking your computers away to make you a better person, especially if you're content with your dependence and, in fact, believe the machines are doing a bang-up job all on their own.
I'm willing to bet that your next obfuscation is to ask, "but what if they programmed the machines to act as sentient individuals?" In that case, then perhaps there could have been a war between machines and men, with ridiculous cross-dressing robots and all-knowing super computers ... that aren't actually all-knowing or even connected across a particularly large distance despite the technology existing. However, Frank Herbert himself said/wrote that this was NOT the case in the Dune universe, and as such it is not the case in the Dune universe, no matter how much the fucking Hack or Bobo the dancing hair wants it to be. Simply asking the question is Dune FAIL of massive proportions, as the machines would have to be expressly programmed to attack humanity or "defend" themselves from humanity (i.e., the original Battlestar Galactica). Even the abomination that is Terminator 3 got it right when it made the enemy not the Skynet system itself (as it was only a sophisticated computer system programmed to respond to our input), but rather the virus which took over Skynet, because it was the virus which was programmed to protect itself.
So, to sum up: How could Alia think "that"? Because that's the actual point that Frank Herbert was trying to make. The problem was never the computers in and of themselves, because the computers were inherently reliable and dependable; rather, the problem was humanity making use of the computers to the detriment and degradation of humanity as a whole.
/wow, that was a lot longer than I intended it to be.