Re: The Jihad (and Alia), round... whatever.
Posted: 27 Oct 2009 12:57
It's a good, passionate, straightforward explanation
*thumbs up*
DUNE DISCUSSION FORUM FOR ORTHODOX HERBERTARIANS
http://www.jacurutu.com/
Thank you for clarifying that. I did have the ulterior motive of wanting your position to be clearly stated within the context of this conversation.

Lundse wrote: As Thing said, I don't think anyone is saying no machines ever took part in the fighting - what I am talking about, and what is really interesting, is how the conflict started and why. Who waged war on whom? Machines trying to kill humans, or humans trying to destroy machines? Skynet or Butler? In that light (or without it), I don't see why Alia could not claim "You could never distrust a machine". Trusting them was, as I have said somewhere before, obviously not the issue. The problem was:

redbugpest wrote: What I want to know is, Omnius notwithstanding, why she would think that at all if she knew about the Jihad. Is it because (in your view) machines had no real role in the conflict, and that it was just a human versus human conflict? That is what I want an answer to, and I do not see it stated as such in this thread.
- ...a machine-attitude as much as the machines...
- ...devices in the image of the mind...
- ...[a] "god of machine-logic"...
- ...turn[ing] their thinking over to machines in the hope that this would set them free...
- That "We must negate the machines-that-think. Humans must set their own guidelines."
The machines were trusted and did not malfunction or try to eradicate mankind for no good reason. They did as ordered (including, if you believe the very probable scenario that they were used in the actual fighting, following orders to kill humans). That was the problem - and does not clash with "You could never distrust the machines". That they tried to kill all of us does clash a bit...
But I see some flaws in this. You said:

Lundse wrote: I have a proof and I want to see if anyone can tear it down. They can't.
This is my justification for not only believing what I do, but also throwing it in people's faces. If I am not willing to defend my views, I have no right to push them on other people.
She had just mused in the sentence before about desiring a compliant machine, instead of needing to rely on the less than compliant Idaho.

Lundse wrote: (Also note that she uses "They could not have..." between these two quotes. If the machine she is thinking about is hypothetical and has nothing to do with the machines from the Jihad, of which Omnius is one, she would have thought: "Such a machine could not..." - notice the missing "have".)
It is machines in general, specifically including those from the Jihad (which logically entails Omnius) that she is thinking about. And she is thinking they are compliant and trustworthy.
Redstar wrote: Isaac Asimov loved robots. Man vs. machine wasn't a "theme" of his. He may have written a story or two with such a theme, but it wasn't a defining feature of his works.
I disagree.

Freakzilla wrote: Neither of those quotes states that machines were untrustworthy. Inspiring mistrust is not the same thing. What RM Mohiam feels is a prejudice.
Nowhere does FH say that there were machines killing people, ever. Men with machines were killing men. Machines only do what humans tell them to and humans told them to think for them, thus stagnating the human mind. That is what the revolt was against.
We should not have to read "homicidal robots" into those quotes for the prequels to make sense. It's just too much of a stretch.
Man vs. machine was a theme in sci-fi at the time (I only gave one Asimov reference because of its similarity to McNelly's proposed BJ story).

Redstar wrote: Isaac Asimov loved robots. Man vs. machine wasn't a "theme" of his. He may have written a story or two with such a theme, but it wasn't a defining feature of his works.
Because they are merely tools.

redbugpest wrote: I disagree.

Freakzilla wrote: Neither of those quotes states that machines were untrustworthy. Inspiring mistrust is not the same thing. What RM Mohiam feels is a prejudice.
Nowhere does FH say that there were machines killing people, ever. Men with machines were killing men. Machines only do what humans tell them to and humans told them to think for them, thus stagnating the human mind. That is what the revolt was against.
We should not have to read "homicidal robots" into those quotes for the prequels to make sense. It's just too much of a stretch.
"From the days of the Butlerian Jihad when "thinking machines" had been wiped from most of the universe, computers had inspired distrust."
That is FH telling us, through Scytale's inner monologue, that machines inspire distrust - so how could Alia feel that they could be trusted?
Ignored

From the DE (yes, I know it is not canon, but I believe that FH approved of everything except the reference to who Jessica's mother was):
“the Butlerian Jihad is attributed to Jehanne Butler, a Bene Gesserit whose developing fetus is therapeutically aborted due to apparent birth defects. She soon discovers that her child had in fact been healthy, but that the hospital director, the first self-programming computer on the planet, had been secretly carrying out a policy of unjustified abortions. This triggers further investigation into the extent to which such machines had been controlling society and altering the emotional and intellectual characteristics of planetary populations over a course of centuries.”
This is clearly a case of a Machine Intelligence acting on its own to damage humanity for its own benefit. FH may not have outright said that it was a machine vs. man conflict, but I think that there is plenty of room to argue that he did not intend it to be men vs. men, with AI just being an unwitting tool, doing as instructed. Otherwise he would not have mentioned sentient machines in his own definition of the Jihad.
Of course not, the Butlerians didn't want to eliminate the men, but their dependence on the machines. If you're not allowed to make a machine in the image of the human mind, you'll have to think for yourself.

JIHAD, BUTLERIAN: (see also Great Revolt) -- the crusade against computers,
thinking machines, and conscious robots begun in 201 B.G. and concluded in 108 B.G. Its chief commandment remains in the O.C. Bible as "Thou shalt not make a machine in the likeness of a human mind."
It is listed here in the index as a crusade against computers, not a crusade against men with computers.
No, the conscious robots were not the cause; human dependence on them was. It was a religious fever and anything even remotely resembling AI was destroyed.

redbugpest wrote: I could make the rationale, after reading this, that the "conscious robots" were the cause of the Jihad - acting on their own to undermine humanity (could be an Omnius-type entity or not), and that "thinking machines" were advanced adaptive AI systems that, under the control of other humans and/or "conscious robots", aided the machine cause, but otherwise could be trusted, because they would follow instruction without passion or question. In other words, completely trustworthy, but destroyed along with the rest because of their association with the losing element.
Maybe you're starting to get it...

And, yes, I am aware of this quote:
"Once men turned their thinking over to machines in the hope that this would set them free. But that only permitted other men with machines to enslave them."
It is more about the folly of the men who push us to the technological edge without thinking of the possible consequences to society. It is what I think he is alluding to here:
The assumption that a whole system can be made to work better through an assault on its conscious elements betrays a dangerous ignorance. This has often been the ignorant approach of those who call themselves scientists and technologists.
The Butlerian Jihad, by Harq al-Ada
Giskard and Daneel were STILL human creations reacting to their programming.

redbugpest wrote: Man vs. machine was a theme in sci-fi at the time (I only gave one Asimov reference because of its similarity to McNelly's proposed BJ story).

Redstar wrote: Isaac Asimov loved robots. Man vs. machine wasn't a "theme" of his. He may have written a story or two with such a theme, but it wasn't a defining feature of his works.
But yes, Asimov did focus on the unintended consequences of human-machine interactions, how they can sometimes go wrong even with supposed safeguards in place, and how ultimately the machines had covertly taken control of humanity's destiny "for its own good", including killing millions for the greater good.
That is the "Zeroth Law" that Giskard had figured out and passed to Daneel at the end of Robots and Empire. If you recall, they decide not to stop the irradiating of Earth, knowing that millions would suffer, because humanity needed to be pushed to colonize the stars to be better able to survive.
It is later explained that Daneel has been playing puppet master with the human race throughout the whole Foundation series in order to try and protect it from unknown threats - the Gaia solution...
Robot Genocide. It is just not done in a violent and bloody way as KJA/BH envision it for Dune.
If that is the case, since Omnius was the Evermind, and all of the other robots were just "following their programming", as was Omnius himself, how would it be a problem for Alia to feel she could trust a robot in that scenario as well? KJA/BH clearly state that men made Omnius, and used him to dominate other men (the Time of Titans), until one of them gave Omnius too much discretion in his programming.

Freakzilla wrote: Because they are merely tools.

redbugpest wrote: I disagree.

Freakzilla wrote: Neither of those quotes states that machines were untrustworthy. Inspiring mistrust is not the same thing. What RM Mohiam feels is a prejudice.
Nowhere does FH say that there were machines killing people, ever. Men with machines were killing men. Machines only do what humans tell them to and humans told them to think for them, thus stagnating the human mind. That is what the revolt was against.
We should not have to read "homicidal robots" into those quotes for the prequels to make sense. It's just too much of a stretch.
"From the days of the Butlerian Jihad when "thinking machines" had been wiped from most of the universe, computers had inspired distrust."
That is FH telling us, through Scytale's inner monologue, that machines inspire distrust - so how could Alia feel that they could be trusted?
Why? Just because it is McNelly?

Freakzilla wrote: Ignored

redbugpest wrote: From the DE (yes, I know it is not canon, but I believe that FH approved of everything except the reference to who Jessica's mother was)
“the Butlerian Jihad is attributed to Jehanne Butler, a Bene Gesserit whose developing fetus is therapeutically aborted due to apparent birth defects. She soon discovers that her child had in fact been healthy, but that the hospital director, the first self-programming computer on the planet, had been secretly carrying out a policy of unjustified abortions. This triggers further investigation into the extent to which such machines had been controlling society and altering the emotional and intellectual characteristics of planetary populations over a course of centuries.”
This is clearly a case of a Machine Intelligence acting on its own to damage humanity for its own benefit. FH may not have outright said that it was a machine vs. man conflict, but I think that there is plenty of room to argue that he did not intend it to be men vs. men, with AI just being an unwitting tool, doing as instructed. Otherwise he would not have mentioned sentient machines in his own definition of the Jihad.
I think it is more along the lines of the need to stop the "Conscious Robots", who would no longer be wholly under the control of the men who created them, and to stop the men engaged in creating these machines, who dehumanized mankind by making life so simplistic. The "Conscious Robots", allowed to control our lives, would turn us all into sheeple. Once the machines are making independent decisions, they are no longer "just tools".

Freakzilla wrote: Of course not, the Butlerians didn't want to eliminate the men, but their dependence on the machines. If you're not allowed to make a machine in the image of the human mind, you'll have to think for yourself.

redbugpest wrote: What was the whole point of Frank referring to "Conscious Robots" then?

JIHAD, BUTLERIAN: (see also Great Revolt) -- the crusade against computers,
thinking machines, and conscious robots begun in 201 B.G. and concluded in 108 B.G. Its chief commandment remains in the O.C. Bible as "Thou shalt not make a machine in the likeness of a human mind."
It is listed here in the index as a crusade against computers, not a crusade against men with computers.
See above. As soon as the machines are able to make decisions independent of the men who created them (free will), they become the cause. The religious fever was a mob-mentality response, resulting in more destruction than was necessary, holding all machines accountable for the few.

Freakzilla wrote: No, the conscious robots were not the cause; human dependence on them was. It was a religious fever and anything even remotely resembling AI was destroyed.

redbugpest wrote: I could make the rationale, after reading this, that the "conscious robots" were the cause of the Jihad - acting on their own to undermine humanity (could be an Omnius-type entity or not), and that "thinking machines" were advanced adaptive AI systems that, under the control of other humans and/or "conscious robots", aided the machine cause, but otherwise could be trusted, because they would follow instruction without passion or question. In other words, completely trustworthy, but destroyed along with the rest because of their association with the losing element.
I do get that there is a gap between the KJA/BH and FH visions of this. I also get that there is enough ambiguity for them to use the imagination that FH trusted his readers would employ to come up with what they did.

Freakzilla wrote: Maybe you're starting to get it...

redbugpest wrote:
The assumption that a whole system can be made to work better through an assault on its conscious elements betrays a dangerous ignorance. This has often been the ignorant approach of those who call themselves scientists and technologists.
The Butlerian Jihad, by Harq al-Ada
Programming that they had changed, allowing them to act in a freer manner, outside of the intentions of their creators. They had ceased to be tools once they could exercise free will and make decisions that would have gone against the wants and needs of their creators.

Freakzilla wrote: Giskard and Daneel were STILL human creations reacting to their programming.

redbugpest wrote: Man vs. machine was a theme in sci-fi at the time (I only gave one Asimov reference because of its similarity to McNelly's proposed BJ story).

Redstar wrote: Isaac Asimov loved robots. Man vs. machine wasn't a "theme" of his. He may have written a story or two with such a theme, but it wasn't a defining feature of his works.
But yes, Asimov did focus on the unintended consequences of human-machine interactions, how they can sometimes go wrong even with supposed safeguards in place, and how ultimately the machines had covertly taken control of humanity's destiny "for its own good", including killing millions for the greater good.
That is the "Zeroth Law" that Giskard had figured out and passed to Daneel at the end of Robots and Empire. If you recall, they decide not to stop the irradiating of Earth, knowing that millions would suffer, because humanity needed to be pushed to colonize the stars to be better able to survive.
It is later explained that Daneel has been playing puppet master with the human race throughout the whole Foundation series in order to try and protect it from unknown threats - the Gaia solution...
Robot Genocide. It is just not done in a violent and bloody way as KJA/BH envision it for Dune.
I know I've sworn to keep my distance, Brian, but ... sorry, can't let it pass.

redbugpest wrote: I also get that there is enough ambiguity for them to use the imagination that FH trusted his readers would employ to come up with what they did.
Well, keep in mind that I am not saying that their imagination is the only imagination that could be applied here. That comes down to a difference in writing styles. Both KJA and BH have said in interviews that they like to write the stuff that Frank shied away from.

TheDukester wrote: I know I've sworn to keep my distance, Brian, but ... sorry, can't let it pass.

redbugpest wrote: I also get that there is enough ambiguity for them to use the imagination that FH trusted his readers would employ to come up with what they did.
I'll let others do the actual smashing; I'll just quickly say: "not ambiguous at all, actually."
Good luck. I'd get the asbestos armor ready ...
Huh? You had my opinion already. Clearly stated. Several times.

redbugpest wrote: Thank you for clarifying that. I did have the ulterior motive of wanting your position to be clearly stated within the context of this conversation.
Because I meant it? Why do you ask? You do not address this again, as far as I can see...

redbugpest wrote: How can you say that:
Lundse wrote:I have a proof and I want to see if anyone can tear it down. They can't.
This is my justification for not only believing what I do, but also throwing it in people's faces. If I am not willing to defend my views, I have no right to push them on other people.
Interesting. But Scytale's thoughts are not to be counted alongside Alia's. He does not have OM. And "inspire distrust" is, at any rate, far more vague - the idea of using computers inspires

redbugpest wrote: The problem I have with the above is that Alia seems to be the only person to believe that machines could be trusted in the first place. Let's look at another quote.
Mohiam allowed her old eyes to go wide in surprise. "The ghola's a mentat?
That's a dangerous move."
"To be accurate," Irulan said, "a mentat must have accurate data. What if
Paul asks him to define the purpose behind our gift?"
"Hayt will tell the truth," Scytale said. "It makes no difference."
"So you leave an escape door open for Paul," Irulan said.
"A mentat!" Mohiam muttered.
Scytale glanced at the old Reverend Mother, seeing the ancient hates which
colored her responses. From the days of the Butlerian Jihad when "thinking
machines" had been wiped from most of the universe, computers had inspired
distrust. Old emotions colored the human computer as well.
If Mohiam, through her own insight as an RM, had a distrust of machines, why would Alia feel that they could be trusted? They would both have access to direct knowledge of the times, but it appears that they have quite different attitudes.
So you are saying that Alia did not count Omnius as a machine, because he was a sentient machine? So she is saying one can trust machines, except sentient ones?

redbugpest wrote: This is important. Computers, thinking machines, and conscious robots. Omnius would fall under the auspices of a “conscious robot”, would he not? So if he were the real threat behind the Jihad (as KJA/BH saw it), and Alia were aware that he was a conscious, sentient being, but desired an AI-driven non-sentient machine, she may very well feel that it could be trusted.
Is it a stretch? A little, admittedly, but if you take the Legends out of the picture completely, you still can’t help but wonder why she would feel that way at all, when everywhere else it seems that the machines themselves are distrusted.
Nonsense. Yes, the passage talks about the purge. Yes, we can imagine it to be about fighting machines, which I have already granted you is very likely. But that does not mean there is any reason to distrust machines. At all. That they could be depended on to slay humans does not make them untrustworthy - deciding to eradicate humanity on its own, that is untrustworthy! And even if such guesswork had some argumentative weight, it would be trumped by Alia's clear, OM-backed statement of fact.

redbugpest wrote: When I read this, I get the distinct impression that it was as much a war against the machines themselves as it was against those who embraced the unemotional, coldly logical, machine-like thought processes. And there are other passages that call out the feelings of distrust against thinking machines.
So I think that the only way your argument works is if you believe that there was no reason to distrust machines at all, which obviously there was.
I agree completely. That is why she was thinking those thoughts. It is entirely irrelevant to the knowledge she obviously has, and which she is employing in formulating the thoughts.

redbugpest wrote: The only rationale that I can see for her statement to begin with is to think that she is not thinking of the Jihad at all, but musing over a desire to have an adviser that she can use without the worry of them judging her and acting against her desires.
Asimov's thoughts on the matter do not count, though I happen to like "The Evitable Conflict" too.

redbugpest wrote: Sentient thinking machines were the real threat. It was also a theme that had been covered in other sci-fi, like Isaac Asimov and his I, Robot series of stories from circa 1950 ("The Evitable Conflict" in particular). It was this same theme that McNelly claimed he had forwarded to FH and gotten approved.
That is what I think, at any rate.
No, it really is not. It is a machine doing as programmed. No ulterior motive. But it was making an ethical decision, and that is why people rose up against them (in part).

redbugpest wrote: This is clearly a case of a Machine Intelligence acting on its own to damage humanity for its own benefit.
Then, as I understand it, your argument is:

The human-computer replaced the mechanical devices destroyed by the Butlerian Jihad. Thou shalt not make a machine in the likeness of a human mind! But Alia longed now for a compliant machine. They could not have suffered from Idaho's limitations. You could never distrust a machine.
1. Just to be nitpicky -- but can you "program" free will?

redbugpest wrote: Programming that they had changed, allowing them to act in a freer manner, outside of the intentions of their creators. They had ceased to be tools once they could exercise free will and make decisions that would have gone against the wants and needs of their creators.
Not as a "gotcha", but just as a fresh, unequivocal statement, because it is germane to this particular discussion. I wanted no contextual ambiguity since, from my point of view, the problem with your premise lies in the foundation on which you built it, so to speak...

Lundse wrote: Huh? You had my opinion already. Clearly stated. Several times.

redbugpest wrote: Thank you for clarifying that. I did have the ulterior motive of wanting your position to be clearly stated within the context of this conversation.
Your comment seems snide to me - are you saying "gotcha" because you got me to state my beliefs? Or what was the meaning here...?
See my above statement.

Lundse wrote: Because I meant it? Why do you ask? You do not address this again, as far as I can see...

redbugpest wrote: How can you say that:
Lundse wrote:I have a proof and I want to see if anyone can tear it down. They can't.
This is my justification for not only believing what I do, but also throwing it in people's faces. If I am not willing to defend my views, I have no right to push them on other people.
Not so fast. Why, if machines were just tools, and machine thinking, machine logic, and machine attitude were the real issue, would anyone feel that machines would inspire any distrust? Wouldn't it be the people who aspired to be machine-like that would inspire the distrust?

Lundse wrote: Interesting. But Scytale's thoughts are not to be counted alongside Alia's. He does not have OM. And "inspire distrust" is, at any rate, far more vague - the idea of using computers inspires

redbugpest wrote: The problem I have with the above is that Alia seems to be the only person to believe that machines could be trusted in the first place. Let's look at another quote.
Mohiam allowed her old eyes to go wide in surprise. "The ghola's a mentat?
That's a dangerous move."
"To be accurate," Irulan said, "a mentat must have accurate data. What if
Paul asks him to define the purpose behind our gift?"
"Hayt will tell the truth," Scytale said. "It makes no difference."
"So you leave an escape door open for Paul," Irulan said.
"A mentat!" Mohiam muttered.
Scytale glanced at the old Reverend Mother, seeing the ancient hates which
colored her responses. From the days of the Butlerian Jihad when "thinking
machines" had been wiped from most of the universe, computers had inspired
distrust. Old emotions colored the human computer as well.
If Mohiam, through her own insight as an RM, had a distrust of machines, why would Alia feel that they could be trusted? They would both have access to direct knowledge of the times, but it appears that they have quite different attitudes.
Note that Mohiam is against mentats because they are "like" computers - the Bene Gesserit share this trait. She is in no way the one to say that machines inspire distrust.
Also note that if the problem was that machines started murdering and torturing, her dislike of mentats makes no freakin' sense - mentats are not going to do that all of a sudden. Such coloring from old emotions only makes sense if you have a problem with... machine thinking, machine logic, machine-attitude.
So thanks for yet another quote to prove my point...
I am saying that if Alia were to feel that way, she would have to be differentiating between machine types. It's far more likely that FH was using it to highlight Alia's discomfort in having to deal with mentats at all. As I recall, the Baron never trusted Piter either. Alia wanted absolute control and devotion, something she did not feel she could get from Idaho or any of her advisers.

Lundse wrote: So you are saying that Alia did not count Omnius as a machine, because he was a sentient machine? So she is saying one can trust machines, except sentient ones?

redbugpest wrote: This is important. Computers, thinking machines, and conscious robots. Omnius would fall under the auspices of a “conscious robot”, would he not? So if he were the real threat behind the Jihad (as KJA/BH saw it), and Alia were aware that he was a conscious, sentient being, but desired an AI-driven non-sentient machine, she may very well feel that it could be trusted.
Is it a stretch? A little, admittedly, but if you take the Legends out of the picture completely, you still can’t help but wonder why she would feel that way at all, when everywhere else it seems that the machines themselves are distrusted.
Just as your argument attempts to make the Legends series not fit. It quits working unless it is all or nothing. Even FH broke them into categories: Machines, thinking machines, and Conscious Robots. If you want to play the word game, I can just as easily say Omnius is not a machine, he is a robot. But that would get us nowhere.

Lundse wrote: I am glad you brought up "farfetched" so I won't have to. I'd even go so far as to say that "You could never distrust a machine" means any machine. And she was just musing on powerful AIs, like Omnius would have been, so we have every reason to include him in the mix.
And the one reason to exclude him? So the passage will fit with KJA's writing!
See above. Even if they were doing as programmed, how could you trust that someone wouldn't change the program to turn it against you? That is untrustworthy as well. Alia does not clearly state that she can trust machines completely because of OM. That is an inference that you are making based on your definition of the Jihad.

Lundse wrote: Nonsense. Yes, the passage talks about the purge. Yes, we can imagine it to be about fighting machines, which I have already granted you is very likely. But that does not mean there is any reason to distrust machines. At all. That they could be depended on to slay humans does not make them untrustworthy - deciding to eradicate humanity on its own, that is untrustworthy! And even if such guesswork had some argumentative weight, it would be trumped by Alia's clear, OM-backed statement of fact.

redbugpest wrote: When I read this, I get the distinct impression that it was as much a war against the machines themselves as it was against those who embraced the unemotional, coldly logical, machine-like thought processes. And there are other passages that call out the feelings of distrust against thinking machines.
So I think that the only way your argument works is if you believe that there was no reason to distrust machines at all, which obviously there was.
I'm not sure where you were going with this statement.

Lundse wrote: I agree completely. That is why she was thinking those thoughts. It is entirely irrelevant to the knowledge she obviously has, and which she is employing in formulating the thoughts.

redbugpest wrote: The only rationale that I can see for her statement to begin with is to think that she is not thinking of the Jihad at all, but musing over a desire to have an adviser that she can use without the worry of them judging her and acting against her desires.
No, the machine has risen above its original programming. It is clearly stated that it is a self-programming machine - putting it outside the scope of control of its designers - and as a consequence it is doing what it feels is in the best interests of humanity. It does not have a human telling it whose baby to abort; it is making that decision on its own, using its own developed sense of ethics.

Lundse wrote: [On the Jehanne Butler story in the DE] No, it really is not. It is a machine doing as programmed. No ulterior motive. But it was making an ethical decision, and that is why people rose up against them (in part).

redbugpest wrote: This is clearly a case of a Machine Intelligence acting on its own to damage humanity for its own benefit.
So when Frank considered writing a Butlerian Jihad prequel with McNelly, it was because McNelly understood that the Jihad was about machines which were making decisions better left to mankind itself.
We are on point, because to understand why Alia made a comment like that in the first place, we need to understand what she knew and understood about the root causes of the Jihad.

Lundse wrote: If we return to the passage in question:
Then, as I understand it, your argument is:

The human-computer replaced the mechanical devices destroyed by the Butlerian Jihad. Thou shalt not make a machine in the likeness of a human mind! But Alia longed now for a compliant machine. They could not have suffered from Idaho's limitations. You could never distrust a machine.
The rest is really not addressing the original point. Though it might be relevant to the overall discussion, let's stay on Alia in this thread, ok?
- That Alia did not mean Omnius, but only all the other kinds of AI machines, when she said they were trustworthy.
- Which has no basis in anything in FH's work, and is obviously wrong because she was just thinking of strong AI like Omnius, and her statement encompasses all machines specifically.
- That Alia was wrong.
- For which you show little and certainly worse evidence. Scytale's opinion vs. OM?
- That because machines might have been ordered to kill people, they cannot be trustworthy.
- Which is just plain wrong.
By setting up a loosely governed adaptive AI system that can adjust its own code based on its understanding of the associated data. They have been using this in sci-fi for decades. The machine's quest to be more human - look at Data in ST:TNG.

Slugger wrote: 1. Just to be nitpicky -- but can you "program" free will?

redbugpest wrote: Programming that they had changed, allowing them to act in a freer manner, outside of the intentions of their creators. They had ceased to be tools once they could exercise free will and make decisions that would have gone against the wants and needs of their creators.
2. There is a difference between "Machine Intelligence" and "machines." You seem to be applying both definitions to Omnius (I would argue that Omnius, as he appears in the TBJ stories, adheres strictly to the former) when it benefits whatever temporary position you're arguing at the time.
I'm looking at this from a psychology perspective: you're arguing that, given enough programming, a machine could begin to modify its own base programming. This self-modification (which Omnius never really practiced) is somehow akin to free will? And, again, given enough programming, the machine can become sentient?

redbugpest wrote: By setting up a loosely governed adaptive AI system that can adjust its own code based on its understanding of the associated data. They have been using this in sci-fi for decades. The machine's quest to be more human - look at Data in ST:TNG.

Slugger wrote: 1. Just to be nitpicky -- but can you "program" free will?

redbugpest wrote: Programming that they had changed, allowing them to act in a freer manner, outside of the intentions of their creators. They had ceased to be tools once they could exercise free will and make decisions that would have gone against the wants and needs of their creators.
2. There is a difference between "Machine Intelligence" and "machines." You seem to be applying both definitions to Omnius (I would argue that Omnius, as he appears in the TBJ stories, adheres strictly to the former) when it benefits whatever temporary position you're arguing at the time.
Omnius is an intelligent machine that has achieved consciousness. That would set him apart from a machine, however well designed the AI, that has not become sentient. This is something that FH mentions too, with his three categories: machines, thinking machines, and conscious robots.
Harken ye unto al-Lisan al-Ghabi, who cannot even quote correctly.

redbugpest wrote: Even FH broke them into categories. Machines, thinking machines, and Conscious Robots. If you want to play the word game, I can just as easily say, Omnius is not a machine, he is a robot. But that would get us nowhere.
A minor detail, but it is said to be therein that the gods reside.

FH in Dune, breaking them into categories, wrote: computers, thinking machines, and conscious robots
Agree on #1.

SandChigger wrote: Computers are non-sentient calculating & data storage machines.
Thinking machines are sentient computers.
Conscious robots are thinking machines capable of moving about and altering their environment.
That at least is how I interpret them. Omnius was a thinking machine. Erasmus was a robot. ("Independent robot", I believe the hackneyed phrase is.)
You would hold some of them accountable, and possibly execute them (crimes against humanity). I happen to think that Frank Herbert's focus was more on the problem of how automation was affecting humanity, and less on how some people were using automation as a weapon against others.

Freakzilla wrote: If you want humans to stop killing each other, take their weapons away; you don't kill all the humans.