Re: Can machines be conscious?
Posted: 05 Apr 2010 16:19
I don't buy consciousness being all that special. I think we'll eventually figure out how the brain works and once we do we'll be able to reproduce it mechanically.
DUNE DISCUSSION FORUM FOR ORTHODOX HERBERTARIANS
http://www.jacurutu.com/
If I have to explain this to you, you're a lost cause. I'm going to try one last time, with a modified version of an example that came up earlier.

Leto Atreides II wrote: How is it grounded in reality to presume that a brain generates consciousness, when this cannot yet be demonstrated? A presumption is a presumption.

A Thing of Eternity wrote: Because our statement is grounded in reality and yours is off in never-never-land, the onus of proof falls on you.
Nope. I didn't say that - this is that classic BS tactic that I expect out of certain people. I said something about your specific argument and you try to make me look a fool by claiming that I apply it to all arguments. In MOST cases the simplest explanation is the correct one, but not in all. This is obvious, and you know it's obvious. You're making arguments that you know aren't even based on something I've said, because you don't have proper rebuttals for what I am ACTUALLY SAYING.

Leto Atreides II wrote: Is simpler science necessarily better science?
That's not a bad example for your side of the argument - maybe we'll get something out of you other than the "well, you can't prove I'm wrong, so therefore I'm just as right as you" arguments you've been repeating over and over! Nicely done. Yes, they would say that, and yes, they would be wrong. But you would be able to prove them wrong very quickly by demonstrating that the direct line-of-sight path between the two walkie-talkies is totally silent, thus disproving their idea.

Leto Atreides II wrote: If I were to introduce walkie-talkies to seventeenth-century Englishmen without explaining the underlying principle, and they assumed that the antennas broadcast the sound by directed sound-waves alone (if that were the conclusion they were to reach), should they discard my suggestion of invisible radio waves as being an unnecessarily complicated theory?
I don't reject ideas that don't mesh with my "rhetoric", worldview, or understanding of science. I welcome them happily - I'm not interested in proving my ideas right, I'm interested in finding the truth. This again is the same tactic as earlier - I reject 1 idea from 1 random internet fool and all of a sudden I'm rejecting all ideas that don't mesh? Come on kiddo, you can do better than that!

Leto Atreides II wrote: You seem to see yourself as enlightened, as discarding the fancies of the ignorant. Yet your attitude is not far off from that of the ultra-conservative Christian, who simply rejects any ideas that don't mesh with his religious rhetoric because they are 'obviously heretical'.
This is where you really think you'll win the argument, because everything you say here is true. The trick is that you're trying to convince me, yourself, others... someone, that I don't agree with this paragraph. I agree with it FULLY.

Leto Atreides II wrote: Purist science recognizes that our grasp of the world is built far more on theory than on any proven facts; at any time, a scientist must be prepared to let go of a long-held theory to make way for a new one, as old errors are discovered or new knowledge comes to light. The practical point of science fiction is that it uses the imagination to explore what may turn out to be science fact. "Even when we eventually prove that the brain does account for all of "consciousness"", you say, as if science could not possibly prove something completely different. I say science has brought the unseen to light before, and I shall be surprised if it never brings new unseen things to light again.
fair enough

Aquila ka-Hecate wrote: Hi Lotek,
Just to clear up my curt comment: what I meant was that mostly, the Loonies didn't know about Mike's consciousness.
I wasn't making any extra-mystical claims for machine souls.
I try not to believe in stuff. (I don't always succeed, but that's what I aim for.)

lotek wrote: fair enough

Aquila ka-Hecate wrote: Hi Lotek,
Just to clear up my curt comment: what I meant was that mostly, the Loonies didn't know about Mike's consciousness.
I wasn't making any extra-mystical claims for machine souls.
do you believe in hidden machine intelligence, even potentially, though?
I agree, I think it's a matter of degree. I see very few differences between us and animals. We are just animals with rituals.

A Thing of Eternity wrote: Just to add to that though, there are several animals that have been proven to be self-aware/sentient. The question might become not whether something is or isn't conscious, but to what degree exactly. I doubt that it is as black and white as on or off.
Maybe, but their morality may differ from ours. They may see this planet as having too many humans on it already and they would be doing the "moral" thing by thinning out the herd, just like we do here with deer and turkey.

lotek wrote: a self aware drone would be one who would refuse to "fight" for moral reasons
I partially agree - I think a lot of animals are an even zero (or close) on the self-consciousness scale. Some are better than others though. I've seen a few experiments (not repeated enough times or controlled enough to be proof, but very convincing) where chimps, bonobos and orangutans actually BEAT human children in terms of self-awareness. Most of these tests involved the ability to recognise self in a mirror (as I said, not proof positive).

Freakzilla wrote: I agree, I think it's a matter of degree. I see very few differences between us and animals. We are just animals with rituals.

A Thing of Eternity wrote: Just to add to that though, there are several animals that have been proven to be self-aware/sentient. The question might become not whether something is or isn't conscious, but to what degree exactly. I doubt that it is as black and white as on or off.
Maybe we can make drones that like to fight and they will volunteer.

lotek wrote: a self aware drone would be one who would refuse to "fight" for moral reasons
I disagree. Maybe I'm wrong, but I've never seen anything that suggests that emotions (the driving force of morality) have anything to do with consciousness, just that humans obviously have both. I think a machine/entity could have emotions without consciousness, and consciousness without emotion (it would just be a consciousness so unlike our own that most people might refuse to recognise it as consciousness). Same goes for morality, or as was just said, the morality may be different than ours - morality is a very fluid thing.

lotek wrote: a self aware drone would be one who would refuse to "fight" for moral reasons
Absolutely - I should have also mentioned the opposite option to make myself clear (I just wanted to avoid more talk of giant self-aware hunter-seekers bent on the destruction of the human race).

Nekhrun wrote: Maybe, but their morality may differ from ours. They may see this planet as having too many humans on it already and they would be doing the "moral" thing by thinning out the herd, just like we do here with deer and turkey.

lotek wrote: a self aware drone would be one who would refuse to "fight" for moral reasons
Yeah, I'd be interested to learn more about that piece of news - a computer developing and proving its own hypothesis would be quite a breakthrough!

Nekhrun wrote: I remember reading an article last year (can't find it at the moment) in which a computer successfully developed its own hypothesis and conducted research resulting in the creation of new knowledge without the aid of human beings. I want to say that it had something to do with a particular type of bacteria. I'll dig around for the reference, but this seems to be quite a leap forward in developing conscious machines. Considering they've not been around that long, it seems like only a matter of time before they become just as good at mimicking consciousness as most preeks.
Yeah, that's quite a philosophical ground to tread, but as I tried to be more precise in my answer to Nekhrun, I should have said "if the drone chooses to fight or not for its own personal reasons" and leave morality out of it...

A Thing of Eternity wrote: I disagree. Maybe I'm wrong, but I've never seen anything that suggests that emotions (the driving force of morality) have anything to do with consciousness, just that humans obviously have both. I think a machine/entity could have emotions without consciousness, and consciousness without emotion (it would just be a consciousness so unlike our own that most people might refuse to recognise it as consciousness). Same goes for morality, or as was just said, the morality may be different than ours - morality is a very fluid thing.

lotek wrote: a self aware drone would be one who would refuse to "fight" for moral reasons
If your moral guideline was to complete the mission, your moral decision would be based on that. Drones, if self-aware, do what they must, like any soldier. The only moral obligation anyone ever encounters comes out of human-philosophic idealism and religious reasoning. In feudal or tribal times, war was not as common. In today's society - no matter where you live - there is always a war going on. Always. That is the nature of governing.

Nekhrun wrote: I remember reading an article last year (can't find it at the moment) in which a computer successfully developed its own hypothesis and conducted research resulting in the creation of new knowledge without the aid of human beings. I want to say that it had something to do with a particular type of bacteria. I'll dig around for the reference, but this seems to be quite a leap forward in developing conscious machines. Considering they've not been around that long, it seems like only a matter of time before they become just as good at mimicking consciousness as most preeks.
The Recon is completely autonomous - landing, take-off and otherwise. I have the PopSci feature on drones at my work, so I'll get you the quotes from that soon. The feature on their site doesn't have it broken down like it does in the magazine.

Freakzilla wrote: BTW, AFAIK the US military drones are mostly remotely piloted, especially ones that carry weapons. Only recon drones are autonomous and I still don't think they trust them to launch and land themselves. Only the mission way-points are pre-programmed.
http://www.bbsrc.ac.uk/media/releases/2 ... ntist.aspx

lotek wrote: yeah I'd be interested to learn more about that piece of news - a computer developing and proving its own hypothesis would be quite a breakthrough!

Nekhrun wrote: I remember reading an article last year (can't find it at the moment) in which a computer successfully developed its own hypothesis and conducted research resulting in the creation of new knowledge without the aid of human beings. I want to say that it had something to do with a particular type of bacteria. I'll dig around for the reference, but this seems to be quite a leap forward in developing conscious machines. Considering they've not been around that long, it seems like only a matter of time before they become just as good at mimicking consciousness as most preeks.
And I think that any home PC is just as smart as, if not smarter than, the average preek.
Are you sure about that? I'm not saying you're wrong per se - I've never studied the numbers (if any numbers are kept) - but tribes and feudal states seem historically to be pretty fond of slaughtering each other. I doubt very much that there has ever been a time without war.

Orthodox wrote:
If your moral guideline was to complete the mission, your moral decision would be based on that. Drones if self-aware do what they must, like any soldier. The only moral obligation anyone ever encounters is out of human-philosophic idealism, and religious reasoning. In feudal or tribal times, war is not as common. In today's society -- no matter where you live -- there is always a war going on. Always. That is the nature of governing.
Numbers-wise, it depends on how you look at it: number of wars, or number of casualties. From what I understand about history, feudal and tribal wars happen out of necessity of survival for the people, not political convenience. Though Hiroshima and Nagasaki killed hundreds of thousands, millions more yet would have died if not for the use of atomics. The reason being that Japan invented suicide bombers; they were more than willing to die to prevent invasion of the homeland. Japan was obsessed with racial purity. For these reasons, modern society often sets down some very massive numbers in terms of destructive force in comparison to tribal or medieval wars. War in modern society is due to term-limited governments making decisions (which are so sporadic, no wonder it gets screwed up...) instead of a long-term decision made by the people and their leaders. When a tribe goes to war, it is a potential life-long commitment. When a society like the U.S. goes to war, how long it lasts is all based on political expediency.

A Thing of Eternity wrote: Are you sure about that? I'm not saying you're wrong per se - I've never studied the numbers (if any numbers are kept) - but tribes and feudal states seem historically to be pretty fond of slaughtering each other. I doubt very much that there has ever been a time without war.

Orthodox wrote: If your moral guideline was to complete the mission, your moral decision would be based on that. Drones, if self-aware, do what they must, like any soldier. The only moral obligation anyone ever encounters comes out of human-philosophic idealism and religious reasoning. In feudal or tribal times, war was not as common. In today's society - no matter where you live - there is always a war going on. Always. That is the nature of governing.
Found the article. I will list it as it does, in its "Field Guide".

Orthodox wrote: The Recon is completely autonomous - landing, take-off and otherwise. I have the PopSci feature on drones at my work, so I'll get you the quotes from that soon. The feature on their site doesn't have it broken down like it does in the magazine.

Freakzilla wrote: BTW, AFAIK the US military drones are mostly remotely piloted, especially ones that carry weapons. Only recon drones are autonomous and I still don't think they trust them to launch and land themselves. Only the mission way-points are pre-programmed.
Drones like the Predator and Reaper / Mantis are piloted for strikes.
And then there is the MANTIS, which is in testing phases.

PopSci wrote:
Current: RQ-4 Global Hawk (Northrop Grumman)
Habitat: High above Iraq, Afghanistan and Pakistan—or anywhere else the U.S. Central Command wants to keep under watch.
Behavior: Soaring at 65,000 feet with an endurance of 36 hours, the Global Hawk can keep watch over 40,000 nautical square miles per mission. Carrying a full suite of electro-optical, infrared and synthetic aperture radar sensors, it can operate day and night in all weather conditions. The larger variation has a 130-foot wingspan.
Notable Feature: The fact that it can take off and land autonomously greatly reduces the potential for crashes, which have handicapped the Predator and Reaper.
Then of course, this is all prep for the TARANIS, which is a Stealth Bomber equivalent of the MANTIS.

PopSci wrote:
Future: Mantis
Ronen Nadir/Bluebird Aero Systems
Class: Autonomous
Habitat: Up to 40,000 feet above any battlefield, disaster site or border, relaying intelligence data back to controllers on the ground
Behavior: All a soldier will have to do to send the self-piloted Mantis on a mission is push a button. From there, it can calculate flight plans, fly around obstacles, and check in with ground controllers when it spots something interesting, like smoke or troop movement. At the end of the mission, it flies home and lands itself. Mantis’s maiden flight went off without a hitch in Australia last October, an astoundingly fast development—it didn’t even exist in 2007. BAE Systems expects it to be ready for sale within two years and hopes to use it as a proving ground for systems in its forthcoming automated stealth bomber, the Taranis.
Notable Feature: Mantis is the first in a new breed of smart drones. A craft that can hone its searches requires less bandwidth than those that constantly stream images. Mantis can also monitor itself for damage—a sputtering engine, for example—and adjust its electronics to complete a mission. It can fly up to 345 miles an hour and operate for up to 36 hours.