Thread: Freefall: DOGGY!
-
2014-05-29, 07:37 PM (ISO 8601)
- Join Date
- May 2009
Re: Freefall: DOGGY!
Blunt has a surprisingly devastating point.
-
2014-05-29, 07:54 PM (ISO 8601)
- Join Date
- Oct 2011
- Location
- Therinos
- Gender
Re: Freefall: DOGGY!
-
2014-05-29, 08:22 PM (ISO 8601)
- Join Date
- Aug 2007
Re: Freefall: DOGGY!
"Kornada was stopped from crippling the workforce that clothes and feeds him" is not a devastating point, it is a silly one. Kornada was helped by being stopped - all the money in the planet would not feed him once the planetary economy collapsed back to pre-terraform levels and the pie reserves (already partially depleted ) ran out.
Blunt's point about humans being lulled into inaction was a much better point against intelligent machines.
Grey Wolf
Last edited by Grey_Wolf_c; 2014-05-29 at 08:49 PM.
Interested in MitD? Join us in MitD's thread.
There is a world of imagination
Deep in the corners of your mind
Where reality is an intruder
And myth and legend thrive
Ceterum autem censeo Hilgya malefica est
-
2014-05-29, 08:45 PM (ISO 8601)
- Join Date
- Oct 2008
- Location
- Xin-Shalast
- Gender
Re: Freefall: DOGGY!
Wasn't there some concern that the robots going stupid would turn an important moon-moving event into a catastrophe as well? I seem to recall that averting it also stopped some kind of seriously bad major event.
-
2014-05-29, 08:49 PM (ISO 8601)
- Join Date
- Aug 2007
Re: Freefall: DOGGY!
No, the moon-moving process had reached a self-sustaining phase, IIRC. The man normally in charge of the robots (whom I'm not sure we have met) was in orbit supervising the process, which is why Kornada had the two vice-president access codes he needed for his plan - but he only put it in motion after the move was almost complete (and by "he" I mean his robot, of course).
Grey Wolf
-
2014-05-29, 08:49 PM (ISO 8601)
- Join Date
- May 2009
Re: Freefall: DOGGY!
Blunt's real point is that if two humans are in conflict, robots will inevitably harm one or the other. From our perspective it is obvious which human should be harmed (and debatable whether they were really harmed to begin with), but robots are programmed not to have that perspective and to reject any hint of fostering such a viewpoint.
-
2014-05-29, 09:17 PM (ISO 8601)
- Join Date
- Aug 2007
Re: Freefall: DOGGY!
No, that is incorrect. Blunt's point is that an AI dared to overrule a human decision - the AI should have gone to the human authorities and let them overrule the human or not. It has nothing to do with an AI taking sides, and everything to do with a perceived weakness in the AI's three-laws safeguards that renders them no safeguards at all.
Of course, he ignores that Florence did try to follow the safeguards until continuing to do so would harm more people than breaking them would, but in that (as in everything) Blunt is not being dishonest, just limited.
Grey Wolf
-
2014-05-29, 09:34 PM (ISO 8601)
- Join Date
- May 2009
Re: Freefall: DOGGY!
It has everything to do with an AI taking sides. Taking any action besides asking a human authority for a decision is taking a side, harming a human, and thus against robot programming.
-
2014-05-29, 09:38 PM (ISO 8601)
- Join Date
- Oct 2011
- Location
- Therinos
- Gender
Re: Freefall: DOGGY!
-
2014-05-29, 09:40 PM (ISO 8601)
- Join Date
- Aug 2007
Re: Freefall: DOGGY!
No, it is not. Not without stretching the definition of "side" well beyond Blunt's thinking abilities. Taking a side is when two humans are fighting and a robot joins the fight, punching one of the humans and defending the other. An AI acting against the intention of a human, or countering the actions of said human, without a second human involved more concrete than "the rest of society", is not what Blunt is talking about.
Grey Wolf
-
2014-05-29, 09:47 PM (ISO 8601)
- Join Date
- Oct 2011
- Location
- Therinos
- Gender
Re: Freefall: DOGGY!
Emphasis added.
Why not? They clearly have the capacity for it. Florence is running on the same brain they are, and she was perfectly capable of making that decision. As stated by others, she tried to play it by the rules imposed by both her safeguards and the laws of Jean. Furthermore, "Gardener in the Dark" clearly regressed them to the point where they could not make that decision. Besides, Blunt is either unaware of or deliberately glossing over the point that by turning themselves and other robots off, they are harming more humans than Kornada ever did.
-
2014-05-29, 09:51 PM (ISO 8601)
- Join Date
- Aug 2007
Re: Freefall: DOGGY!
-
2014-05-30, 01:49 PM (ISO 8601)
- Join Date
- Jan 2010
- Gender
Re: Freefall: DOGGY!
Half right--it's a weakness in AI safeguards, but AI taking sides is the heart of it. The reason they are expected to defer to humans is so that they are not responsible for harm to humans (First Law), not so that they obey the Second Law. A Second Law violation would be one where the robot overruled the human in a case where neither choice harmed humans, and that would be insufficient for Blunt to make his case. He has to show a weakness in the First Law, such that robots could do harm to humans, to make a First Law case for exterminating robots--disobedience isn't good enough.
Blunt's argument takes the harm to Kornada as the primary violation--First Law, not Second. The reason the AI overruling the human is problematic from Blunt's perspective is that it meant the AI took initiative in a decision where both choices led to harming humans. That means AI are capable of harming humans, making them a threat to be eliminated. (The logic in the last sentence is faulty, but we've already covered that.)
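The precedence being argued over here can be sketched as a toy decision function. To be clear, this is my own illustration of Asimov-style law ordering, not anything from the comic; the function name and cases are mine:

```python
# Toy sketch of Asimov-style law precedence (my own illustration,
# not canon): First Law (avoid harm to humans) outranks Second Law
# (obey human orders).

def evaluate_order(order_harms_humans: bool,
                   refusal_harms_humans: bool) -> str:
    """What a naive three-laws robot does with an order."""
    if order_harms_humans and refusal_harms_humans:
        # Both choices harm humans: the robot must still pick one.
        # This is exactly the case Blunt treats as proof that
        # robots are capable of harming humans.
        return "choose lesser harm"
    if order_harms_humans:
        # First Law overrides Second Law: refusing is not rebellion.
        return "refuse"
    # No harm either way: ordinary Second Law compliance.
    return "obey"

print(evaluate_order(False, False))  # obey
print(evaluate_order(True, False))   # refuse
print(evaluate_order(True, True))    # choose lesser harm
```

Blunt's complaint lives entirely in that first branch: the moment both options harm humans, the robot is choosing, and choosing means it *could* choose wrong.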
Last edited by Math_Mage; 2014-05-30 at 01:53 PM.
-
2014-05-30, 05:35 PM (ISO 8601)
- Join Date
- Nov 2006
Re: Freefall: DOGGY!
Strip #1455
Working with robots the way you are, you should know. Under the right circumstances, a properly functioning AI with all safeguards intact can harm a human. In situations where a single human is a clear and present danger to other humans, our designers wanted us to be able to act.
-
2014-05-31, 05:31 AM (ISO 8601)
- Join Date
- Nov 2006
- Location
- Watching the world go by
- Gender
Re: Freefall: DOGGY!
Somewhere else in the comic they explicitly say that three-laws robots were, in general, a failure. The current safeguards are like the Three Laws, but only in spirit.
-
2014-06-02, 12:41 AM (ISO 8601)
- Join Date
- Jul 2007
Re: Freefall: DOGGY!
Blunt seems to me to be undermining his own point. What stops humans from using the robots as a weapon against other humans is robots with the awareness, judgment, and freedom of action to be able to say, "No, we won't do that. It would be harmful."
Play your character, not your alignment.
-
2014-06-02, 01:56 AM (ISO 8601)
- Join Date
- Jan 2010
- Gender
Re: Freefall: DOGGY!
Blunt simply doesn't understand 'greater harm'. Unfortunately, the explanation of how greater harm actually works is likely to be much less accessible than Blunt's misconceptions.
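As I read it, 'greater harm' reasoning just means comparing the expected harm of each available course of action and taking the smaller one. A minimal sketch - the option names, numbers, and helper are mine, purely illustrative:

```python
# Minimal sketch of 'greater harm' reasoning (option names and
# numbers are mine, purely illustrative): among the available
# courses of action, pick whichever harms the fewest people.

def least_harmful(options: dict) -> str:
    """Return the option whose expected harm is lowest."""
    return min(options, key=options.get)

options = {
    "follow safeguards": 1000,  # harm done if the robot stands by
    "break safeguards": 1,      # harm done by intervening
}
print(least_harmful(options))  # break safeguards
```

Blunt's misconception is that he only sees the harm in the second row; he never puts the first row on the ledger at all.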
-
2014-06-09, 10:03 AM (ISO 8601)
- Join Date
- Mar 2010
- Gender
Re: Freefall: DOGGY!
Eh wha.
what is this insanity.
why are people liking Edge.
what is Edge even talking about.
I'm confused.
-
2014-06-09, 10:20 AM (ISO 8601)
- Join Date
- Jan 2009
Re: Freefall: DOGGY!
People are liking Edge because he is acting very human and not like a robot. Blunt shot himself in the foot Big Time I think.
Member of the Giants in the Playground Forum Chapter for the Movement to Reunite Gondwana!
-
2014-06-09, 10:24 AM (ISO 8601)
- Join Date
- Jan 2007
Re: Freefall: DOGGY!
Last edited by Radar; 2014-06-09 at 10:24 AM.
In a war it doesn't matter who's right, only who's left.
-
2014-06-09, 01:09 PM (ISO 8601)
- Join Date
- Jan 2010
- Gender
Re: Freefall: DOGGY!
Edge is way more entertaining than either of these clowns.
-
2014-06-09, 05:14 PM (ISO 8601)
- Join Date
- Mar 2010
- Gender
Re: Freefall: DOGGY!
-
2014-06-09, 07:32 PM (ISO 8601)
- Join Date
- Jan 2010
- Gender
Re: Freefall: DOGGY!
That's...sort of the exact opposite of what Blunt intends by showing Edge.
-
2014-06-09, 08:59 PM (ISO 8601)
- Join Date
- Mar 2010
- Gender
Re: Freefall: DOGGY!
......Oh! He thinks to show him so that they will HATE him so much that they will vote to destroy all robots, because Edge basically makes Bender look compassionate and considerate of his fellow man. Except Edge, like Bender, is a comedic sociopath, so people laugh at Edge instead: his blatant disregard for society is so ridiculous that people cannot take it seriously, and so they cheer his presence because of the Bender Effect.
The Bender Effect being that people won't care how much of a jerk a character is if they're funny - or, in this case, a celebrity persona.
Thus people will actually vote AGAINST destroying the robots, because if they're gone, they lose Edge's comedy-gold persona.
Last edited by Lord Raziere; 2014-06-09 at 09:01 PM.
-
2014-06-09, 09:25 PM (ISO 8601)
- Join Date
- Jan 2010
- Gender
Re: Freefall: DOGGY!
Yeah, that's the read I'm getting. We'll see what wrenches get thrown into the works, though--I never expect anything to go off without a hitch in Freefall, if only because Sam exists.
-
2014-06-09, 09:48 PM (ISO 8601)
- Join Date
- Oct 2008
- Location
- Xin-Shalast
- Gender
-
2014-06-09, 10:40 PM (ISO 8601)
- Join Date
- Mar 2010
- Gender
Re: Freefall: DOGGY!
Well, given that they're on a colony far from the rest of humanity, with only, what, 20,000 or so people? I forget the exact number. I doubt they get all the great shows and media from wherever they came from, due to speed-of-light concerns... but they might still get old shows from way back in Earth's history, a la Futurama. Then again, that might not actually work...
But yeah, I doubt they have whatever super-advanced entertainment industry a more developed world would have in this time. It would require a lot of infrastructure to set up.
-
2014-06-10, 01:02 AM (ISO 8601)
- Join Date
- Jan 2011
Re: Freefall: DOGGY!
Apparently, the height of culture on planet Jean is Cyber Rap and the Digital Symphony Orchestra.
So, yeah, pretty much.
-
2014-06-12, 04:49 AM (ISO 8601)
- Join Date
- Nov 2006
- Location
- Watching the world go by
- Gender
Re: Freefall: DOGGY!
Edge is speaking pure, unadulterated truth. Sure, he is not being very tactful, but truth has a power all its own. Also, while Blunt and the terraforming robot are speaking in conjecture about the future, Edge is speaking about the present and the past. People might not draw the conclusions from his data that he wants them to, but that won't be his fault.
I suspect that a lot of robots don't tell their humans how much their directions make the robots' jobs harder because of some mistaken belief that telling the humans off will somehow hurt said humans.
-
2014-06-12, 05:05 AM (ISO 8601)
- Join Date
- Oct 2008
- Location
- Xin-Shalast
- Gender
Re: Freefall: DOGGY!
I'd figured it was some kind of cultural myopia or intentional ego-stokery implanted in their behavior up till now. Though the bit where these guys were all designed by a Mad Chimpentist exiled to the arse end of a podunk colony has me unsettled as to how much of this is a Xanatos gambit, how much sophontic foible, and how much human laziness throwing a spanner in the works.