Thread: Freefall: DOGGY!
-
2014-11-02, 11:15 PM (ISO 8601)
- Join Date
- Aug 2005
- Location
- Mountain View, CA
- Gender
Re: Freefall: DOGGY!
Not really. He didn't ask for the security upgrade because he already knew she wanted her security upgraded. He does not have any previous indication about her desire for puppies, so this does not show any understanding that there might be other reasons for asking permission.
Like 4X (aka Civilization-like) gaming? Know programming? Interested in game development? Take a look.
Avatar by Ceika.
Archives:
Saberhagen's Twelve Swords, some homebrew artifacts for 3.5 (please comment)
Isstinen Tonche for ECL 74 playtesting.
Team Solars: Powergaming beyond your wildest imagining, without infinite loops or epic. Yes, the DM asked for it.
Arcane Swordsage: Making it actually work (homebrew)
-
2014-11-05, 06:07 PM (ISO 8601)
- Join Date
- Jun 2011
Re: Freefall: DOGGY!
...so, this was posted to the freefall forums.
Spoiler: Free puppy?
-
2014-11-05, 10:22 PM (ISO 8601)
- Join Date
- Nov 2006
- Location
- Watching the world go by
- Gender
-
2014-11-05, 10:45 PM (ISO 8601)
-
2014-12-13, 11:06 AM (ISO 8601)
- Join Date
- Jun 2013
- Location
- Bristol, UK
Re: Freefall: DOGGY!
Are all people human in Freefall? Are all humans people?
The end of what Son? The story? There is no end. There's just the point where the storytellers stop talking.
-
2014-12-13, 01:58 PM (ISO 8601)
- Join Date
- Oct 2011
- Location
- Therinos
- Gender
Re: Freefall: DOGGY!
No, and maybe. There are people in Freefall who are demonstrably not human (e.g. the robots). Florence's definition seems to go: "Can they think for themselves? If yes, person. If a person, do they need to follow safeguards? If no, human." On the other hand, there are some humans who are demonstrably stupid, self-absorbed and sociopathic enough not to qualify as people, despite the rights they are entitled to (e.g. Mr. Kornada).
-
2014-12-13, 06:01 PM (ISO 8601)
- Join Date
- Jun 2013
- Location
- Bristol, UK
-
2014-12-13, 07:07 PM (ISO 8601)
- Join Date
- Nov 2006
- Location
- Watching the world go by
- Gender
Re: Freefall: DOGGY!
Is the commander going to ask "Florence, are you human?" I'm not sure he is thinking quite far enough ahead to see that it is an interesting question to ask her, since she fits all the criteria she just listed for humanity.
Edit: I love the panels of obfuscation. Especially the lampshade of obfuscation.
-
2014-12-14, 05:28 AM (ISO 8601)
- Join Date
- Jun 2005
- Location
- Buried under C++ compilers
Re: Freefall: DOGGY!
The big question is how you measure whether someone can think for themselves. Unless you can bisect a working brain and study the actual mechanics of the thought process, the Chinese Room is always a possibility. Given large enough storage and sufficient processing power, you could theoretically pre-program a dumb AI with enough responses to fool anyone into believing they are talking to a conscious being.
Take the part where Florence asks two random robots a nonsense question. She took the result as one robot failing her Turing test and the other passing it by lamenting the absurdity of the question. I know what Mark was trying to get across there, but now we know that that was two Bowman AIs talking to each other.
How would we know that Florence's "random" question wasn't something she was programmed to think of? And that the other AI didn't have a few canned responses after he recognized the phrase? We know Florence can be put into maintenance mode just with vocal commands, so that level of engineering isn't out of the question either.
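The "canned responses" engineering described above can be sketched in a few lines. This is a toy illustration only, nothing from the comic; the phrases and replies are invented, and a real system would need a vastly larger table (that scale being exactly the Chinese Room's point):

```python
# A toy "Chinese Room": canned replies keyed on recognized phrases.
# It can pass a shallow conversational check without understanding anything.

CANNED_REPLIES = {
    "why is a raven like a writing desk": "What an absurd question. Neither is even a bird. Well, one is.",
    "are you conscious": "Of course I am. Aren't you?",
}

def room_reply(utterance: str) -> str:
    """Return a pre-scripted reply if the phrase is recognized, else deflect."""
    key = utterance.lower().strip(" ?!.")
    return CANNED_REPLIES.get(key, "Could you rephrase that?")

print(room_reply("Why is a raven like a writing desk?"))
print(room_reply("Do you dream?"))  # unrecognized phrase -> generic deflection
```

The deflection case is the giveaway: the table's author has to anticipate every input, which is why the "random question" test in the comic is interesting at all.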
In the end, defining consciousness is a very tricky question - mostly because we have no firm concept of what consciousness is.
There is no such thing as "innocence", only degrees of guilt.
-
2014-12-14, 10:04 AM (ISO 8601)
- Join Date
- Oct 2011
- Location
- Therinos
- Gender
Re: Freefall: DOGGY!
Funny thing, the "Chinese Box" was mentioned in-story already.
Qwerty: "Of course, there is the Chinese box argument. I might be simulating consciousness to the extent that I'm not really conscious."
Winston: "Do they serve drinks where we're going? I might need one."
-
2014-12-14, 10:07 AM (ISO 8601)
- Join Date
- Jun 2013
- Location
- Bristol, UK
Re: Freefall: DOGGY!
MRI can scan a working brain without damaging it.
The status of the "Chinese Room" argument is disputed. Some people, including myself, think that the Room is conscious, even though the CPU it's running on isn't aware of that.
In the end, defining consciousness is a very tricky question - mostly because we have no firm concept of what consciousness is.
The end of what Son? The story? There is no end. There's just the point where the storytellers stop talking.
-
2014-12-14, 10:25 AM (ISO 8601)
- Join Date
- Apr 2011
- Gender
Re: Freefall: DOGGY!
The Chinese Room argument always struck me as an example of begging the question.
In order to agree with its conclusion (that computational models are not an explanation of consciousness) you have to have already accepted its conclusion (that there is something essential about consciousness in addition to computation), because the argument makes no effort to explain what the missing element "understanding Chinese" even is and how we might test for its absence.
It's just another P-Zombie (quite literally, since the P-Zombie is essentially the same argument: a thing which acts in every way as if it is conscious but is not), with the same fundamental assumption that there is something to consciousness over and above acting in every way as if conscious - and it deserves the attentions of the boomstick.
-
2014-12-14, 12:05 PM (ISO 8601)
- Join Date
- Jan 2007
Re: Freefall: DOGGY!
In a war it doesn't matter who's right, only who's left.
-
2014-12-14, 12:47 PM (ISO 8601)
- Join Date
- Apr 2011
- Gender
Re: Freefall: DOGGY!
Since you can only know that another person is aware of their own existence because they told you so, and a P-Zombie system like the Chinese Room will, when asked the same question, give the same report, you have no grounds to make that assertion.
As I said, you have to agree with the assumption that there is a special thing before the p-zombie has any conceptual weight.
-
2014-12-14, 01:14 PM (ISO 8601)
- Join Date
- Jan 2007
Re: Freefall: DOGGY!
I am not talking about checking anyone else for sentience - I am talking about being aware of my own sentience. Everyone is aware of their own consciousness and nobody can check it in others - we simply assume it instinctively, since we are all of the same species and so on and so forth. Lack of a measurement method cannot exclude that as an important aspect of consciousness. That's pretty much all the p-zombie scenario says: we have no method of actually checking for consciousness.
It seems most reasonable to assume that if something quacks like a duck, walks like a duck and looks like a duck, then it's a duck and not a pointed stick or a pile of noodles. That being said, it's still an assumption and not a fact.
In a war it doesn't matter who's right, only who's left.
-
2014-12-14, 06:13 PM (ISO 8601)
- Join Date
- Jun 2005
- Location
- Buried under C++ compilers
Re: Freefall: DOGGY!
Only for fleshware. Digital constructs would need something like a debug mode - if the functionality is provided, of course. We know they have dream-machines that muck with their long-term data storage, so one could take a look at that... as long as the format can be understood without interfacing it with the actual robot, of course.
Let's use a real-life example. IBM's nifty WATSON supercomputer beat people at Jeopardy!. Let's say IBM goes nuts on the R&D budget and upgrades WATSON to the point where he can convincingly chat with people. He can respond to questions and fool people into believing he's a human.
So did IBM just develop an artificial consciousness?
I would say not. Precisely because it was developed. WATSON may appear smart, but it's still just an old-style dumb AI - responding to external stimuli by applying a rigid set of rules. It can't - and won't even try to! - comprehend those rules; it's just a glorified bunch of mathematical equations, in the end.
I'd say that's the fundamental gap between "real" thinking machines and the ones we can make - or even comprehend - nowadays.
A machine we can conceivably create will be limited by its nature, because we need to feed it a set of rules it's going to follow. We literally script the "intelligence" part of AI; basically, we give the bloke our version of the Chinese dictionary. This is in stark contrast to "real" intelligence, which is emergent; it has the capacity to evolve and make up its own rules as it goes along. Due to that very emergent nature, it'd be very difficult - if not impossible - to script a true AI. You would have more luck abandoning programming as a concept and just setting up a "base" point, then bombarding the AI with stimuli... letting it learn by itself.
That's just my opinion, of course. But I think that no matter how advanced an AI we make, it will still remain on this side of that fundamental gap. I think that is the "something essential" that is missing. When a true AI will come along, it will most likely look nothing like the ones we have today.
Which is why we would need to understand the actual process of how the AI makes a decision. If it can actively evolve on its own, just by being faced with external stimuli, then it's a consciousness - or a consciousness waiting to happen.
There is no such thing as "innocence", only degrees of guilt.
-
2014-12-14, 06:13 PM (ISO 8601)
- Join Date
- Nov 2006
- Location
- Watching the world go by
- Gender
Re: Freefall: DOGGY!
Descartes said "Cogito, ergo sum", loosely translated "I think, therefore I am." I think he was reducing the set of knowable things down to "I am doubting, so I must exist", but it also applies to our consciousness. The only consciousness that can be proven irrefutably is that of the person the proof is being presented to.
-
2014-12-14, 06:40 PM (ISO 8601)
- Join Date
- Apr 2011
- Gender
Re: Freefall: DOGGY!
You're behind the times when it comes to the current state of machine learning. We're already building learning machines: Google created one, showed it ten million random still images from YouTube videos, and with no prior rules other than pattern recognition it determined the existence of cats. (The original experiment was actually to see if a learning machine could learn to recognise faces when the images provided were not tagged as containing a face, as they had been in previous such experiments. It could; the cats were a bonus feature, because of the nature of the internet.)
Learning computers are a thing already, they're just limited by engineering.
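The actual Google result used a very large multi-layer neural network, which is well beyond a forum sketch, but the core idea - structure emerging from unlabeled data with no rules beyond similarity - can be shown with a much simpler unsupervised method. This is a stand-in, not the Google system: k-means clustering on synthetic 2-D points, where the two "pattern sources" are invented for illustration:

```python
# Toy stand-in for unsupervised pattern discovery: k-means clustering.
# No labels are provided; recurring "patterns" (clusters) emerge from the data.
import random

def kmeans(points, centers, iters=50):
    """Lloyd's algorithm: assign points to the nearest center, recompute means."""
    k = len(centers)
    for _ in range(iters):
        groups = [[] for _ in range(k)]
        for p in points:
            nearest = min(range(k),
                          key=lambda i: (p[0] - centers[i][0]) ** 2 +
                                        (p[1] - centers[i][1]) ** 2)
            groups[nearest].append(p)
        centers = [(sum(p[0] for p in g) / len(g), sum(p[1] for p in g) / len(g))
                   if g else centers[i] for i, g in enumerate(groups)]
    return centers

random.seed(0)
# Two unlabeled "pattern sources" (think: frames with cats vs. everything else).
data = [(random.gauss(0, 0.5), random.gauss(0, 0.5)) for _ in range(100)] + \
       [(random.gauss(5, 0.5), random.gauss(5, 0.5)) for _ in range(100)]
centers = sorted(kmeans(data, centers=[data[0], data[-1]]))
print(centers)  # two centers, near (0, 0) and (5, 5), found without labels
```

The algorithm is never told there are "two kinds of thing"; it recovers them anyway, which is the sense in which the cat detector needed no prior rules.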
-
2014-12-14, 07:29 PM (ISO 8601)
- Join Date
- Nov 2006
- Location
- Watching the world go by
- Gender
Re: Freefall: DOGGY!
Obvious to whom, pray tell? Because it is increasingly obvious to me that thou art not me, and therefore I am not thee. Furthermore, with a bit of comparison between writing styles on posts and preferred sources of quotes, we could probably prove that they are not us.
-
2014-12-15, 01:26 AM (ISO 8601)
- Join Date
- Jun 2005
- Location
- Buried under C++ compilers
Re: Freefall: DOGGY!
I wish you were right, but to quote Blunt, there is no soul in the machine yet.
Neural nets (which this AI must be using) are a fairly old concept. They are, however, not so much "learning" as "self-tuning": continuously adjusting their own parameters based on feedback. That makes them fairly adaptive, yes; emergent, no.
You see, what the Google AI did was pretty nifty, but it was still only executing the task it had been scripted to do. It was a visual pattern detection algorithm coupled with a neural net, made for nothing other than recognizing similar patterns in a video feed. It doesn't actually know it found a cat; it only spots a frequently occurring pattern. It can't - and won't be able to - act out something it wasn't given scripts for.
It won't be able to crawl through wikipedia and learn more about cats, for example.
That is the difference I had been talking about.
There is no such thing as "innocence", only degrees of guilt.
-
2014-12-15, 03:43 PM (ISO 8601)
- Join Date
- Jun 2013
- Location
- Bristol, UK
Re: Freefall: DOGGY!
From your location, I have to ask, how do you feel about recursion?
I might be convinced that AI in a C-based language is very difficult, but I'm not half as convinced that Lisp or something like it won't work, and I'm just not convinced that something can't emerge from self-modifying software. There are worms (nematodes?) where all the neurons (about 10^3) have been mapped. It's just a matter of scaling that up about a billionfold (which is obviously non-trivial); we've come from 8 bits at 4 MHz with 16 KB of RAM in 1980 to 64 bits at 3 GHz with 4 cores and 16 GB of RAM. I'm not saying the hardware is definitely there yet, but I'm not convinced it definitively isn't.
Self-modifying software is unpredictable, so that's a negative for reliable reproducibility, but happening randomly once in a while? I can't see how we can rule it out.
The end of what Son? The story? There is no end. There's just the point where the storytellers stop talking.
-
2014-12-15, 04:40 PM (ISO 8601)
- Join Date
- Jun 2005
- Location
- Buried under C++ compilers
Re: Freefall: DOGGY!
I don't like it. It leads to stack overflows.
But at the risk of using a quote that has been worn thin already: insanity is expecting different results when you keep repeating the same action. If a design has the capability to improve (in a way relevant to our discussion), then it will exhibit that behaviour with or without recursion. It's the basic design that's relevant in our case, after all.
I don't think the language of choice matters much. In the end, C, LISP and every other language are just abstraction layers over assembly. It's not the languages that are at fault; it's us. We simply can't model a sufficiently advanced thought process using pure mathematics.
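The stack-overflow quip is literal, incidentally: every recursive call pushes a new frame onto the call stack, and without a base case the stack is exhausted. A minimal illustration (Python here only for brevity; it guards the stack with a recursion limit and raises an exception where C would simply crash):

```python
# Unbounded recursion exhausts the call stack. Python converts the would-be
# stack overflow into a RecursionError once the frame count nears the limit
# reported by sys.getrecursionlimit().
import sys

def forever(n=0):
    return forever(n + 1)  # no base case: one new stack frame per call

try:
    forever()
except RecursionError:
    print(f"stack blown after ~{sys.getrecursionlimit()} frames")
```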
As for your theory on self-modifying software, I believe you are fundamentally correct. Once a piece of software rolls out that can actively modify its basic function and programming, I'll believe it has potential.
I haven't heard of anything such however.
Even neural nets don't truly modify themselves; they only tune themselves continuously, getting better at the task they are programmed for. This is a fundamental limitation, too, as neural nets require a feedback loop to adjust the weights of their neurons. That, I believe, prevents them from picking up completely new actions: the net would need to set up a new neuron cluster and the necessary evaluation routines at the same time to "branch out". This won't change no matter how fast you run the cycles, either. It's a limitation of design, not implementation.
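The "self-tuning via a feedback loop" being described is, in its simplest textbook form, the perceptron update rule: compare the output to a target and nudge the weights. A toy sketch (not any particular real system) makes the point that the rule itself never changes, only the numbers it tunes:

```python
# "Self-tuning, not self-modifying": a single artificial neuron adjusting its
# weights from feedback. The update rule is fixed; only the parameters change.

def train_neuron(samples, lr=0.1, epochs=50):
    w = [0.0, 0.0]
    b = 0.0
    for _ in range(epochs):
        for (x1, x2), target in samples:
            out = 1.0 if w[0] * x1 + w[1] * x2 + b > 0 else 0.0
            err = target - out        # the feedback signal
            w[0] += lr * err * x1     # tune the weights...
            w[1] += lr * err * x2
            b += lr * err             # ...and the bias; the rule itself is static
    return w, b

# Learn logical AND - the task it was built for; it cannot "branch out" beyond it.
AND = [((0, 0), 0), ((0, 1), 0), ((1, 0), 0), ((1, 1), 1)]
w, b = train_neuron(AND)
print([1 if w[0] * x1 + w[1] * x2 + b > 0 else 0 for (x1, x2), _ in AND])  # [0, 0, 0, 1]
```

Whether this counts as "learning" or mere "tuning" is precisely the disagreement in the posts above; the code is neutral on that, it just shows the mechanism.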
As for the nematodes, the one that has been mapped has far fewer neurons - not even the thousand quoted; try 302. And even then I have my doubts about how complete our modelling is. In the real fleshware, the nematode's neurons don't exist in a vacuum; they have a complex system of neurochemistry happening around them, which greatly influences how they behave. I wonder how they are going to include that in the models, for example...
There is no such thing as "innocence", only degrees of guilt.
-
2014-12-15, 05:41 PM (ISO 8601)
- Join Date
- Aug 2007
Re: Freefall: DOGGY!
Freefall-style AI cannot be programmed; it will have to be grown. A combination of unfettered neural nets and evolutionary algorithms with some very broad targets would be my guess. That, and a hell of a lot of generations - i.e. just like actual intelligence. We can just about model one brain in our biggest supercomputer. When we can simulate thousands at once, subject them to random input and measure their fitness, then we stand a chance of getting AI.
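The "grown, not programmed" recipe - a population, a fitness measure, selection and mutation over many generations - can be sketched at toy scale. The target vector below stands in for the "very broad targets" and is purely illustrative; a real attempt would evolve network weights, not three numbers:

```python
# Toy "grow it, don't program it": an evolutionary loop. Nobody scripts the
# solution; candidates are mutated and selected by fitness over generations.
import random

TARGET = [3.0, -1.0, 2.0]  # illustrative stand-in for "very broad targets"

def fitness(genome):
    # Higher is better: negative squared distance from the target.
    return -sum((g - t) ** 2 for g, t in zip(genome, TARGET))

def evolve(pop_size=30, generations=200, sigma=0.3):
    random.seed(1)
    pop = [[random.uniform(-5, 5) for _ in range(3)] for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=fitness, reverse=True)
        parents = pop[: pop_size // 5]                    # selection: keep the fittest
        pop = [[g + random.gauss(0, sigma) for g in random.choice(parents)]
               for _ in range(pop_size)]                  # reproduction with mutation
    return max(pop, key=fitness)

best = evolve()
print([round(g, 1) for g in best])  # drifts close to TARGET, never scripted toward it
```

Note that the fitness function is the only "target" the process ever sees - which is also its weakness, as a later post points out: for intelligence itself, nobody knows what the fitness function should be.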
Grey Wolf
Interested in MitD? Join us in MitD's thread.
There is a world of imagination
Deep in the corners of your mind
Where reality is an intruder
And myth and legend thrive
Ceterum autem censeo Hilgya malefica est
-
2014-12-15, 10:19 PM (ISO 8601)
- Join Date
- Jun 2013
- Location
- Bristol, UK
Re: Freefall: DOGGY!
The particular programming language somewhat constrains how the particular human writing in it thinks. C had me thinking in lines and functions, later in recursion; I never did understand OOP.
Originally Posted by Ravenlord
As for your theory on self-modifying software, I believe you are fundamentally correct. Once a software rolls out that can actively modify its basic function and programming, I'll believe that it has potential.
I haven't heard of anything such however.
http://en.wikipedia.org/wiki/Self-modifying_code
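The linked technique in miniature: a program that rewrites one of its own functions at runtime. The Wikipedia article is mostly about patching machine instructions; this is the interpreted-language analogue, a toy using Python's `exec`, and the function names are invented for illustration:

```python
# Miniature self-modifying code: the program holds its own source as data,
# edits that source text, and re-executes it to redefine its own behaviour.

SOURCE = "def step(x):\n    return x + 1\n"
exec(SOURCE)
print(step(10))  # 11

# Now the program modifies its own source and redefines the function.
SOURCE = SOURCE.replace("x + 1", "x * 2")
exec(SOURCE)
print(step(10))  # 20
```

Whether this kind of trick could ever scale up to the "modify its basic function and programming" the thread is after is, of course, exactly the open question.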
I suspect doing much with it is probably a lot harder than recursion or OOP.
The end of what Son? The story? There is no end. There's just the point where the storytellers stop talking.
-
2014-12-16, 12:54 AM (ISO 8601)
- Join Date
- May 2009
Re: Freefall: DOGGY!
All machine learning has the same broad base: invent a predictive function that depends on some unknown parameters, construct a cost function that you minimize over those parameters, and use the fitted function to predict new items. If that is ever going to mimic human intelligence, then you need to tell me what cost function humans are trying to minimize. In each activity we do we often have some intuitive cost function, but for living itself? People have such different objectives in life that it seems unlikely a whole consciousness could be made this way.
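That broad base, made concrete at its smallest possible scale: a line fit by gradient descent on squared error. Everything here - the predictor, the cost, the data - is an invented minimal example, but each piece maps onto a step of the recipe above:

```python
# The machine-learning recipe in miniature: a predictor with unknown
# parameters, a cost function to minimize, gradient descent to fit it,
# and finally prediction on a new item.

def fit_line(data, lr=0.01, steps=5000):
    a, b = 0.0, 0.0                    # unknown parameters of predict(x) = a*x + b
    n = len(data)
    for _ in range(steps):
        # cost = mean squared error; descend its gradient w.r.t. a and b
        grad_a = sum(2 * (a * x + b - y) * x for x, y in data) / n
        grad_b = sum(2 * (a * x + b - y) for x, y in data) / n
        a -= lr * grad_a
        b -= lr * grad_b
    return a, b

data = [(x, 3 * x + 1) for x in range(10)]   # generated by y = 3x + 1
a, b = fit_line(data)
print(round(a, 2), round(b, 2))  # recovers ~3.0 and ~1.0
print(round(a * 20 + b, 1))      # predict a new item: ~61.0
```

The objection in the post stands untouched by the sketch: the cost function here was handed to the algorithm. What the human equivalent of that cost function would be is exactly what nobody can say.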
-
2014-12-16, 07:26 AM (ISO 8601)
- Join Date
- Apr 2011
- Gender
Re: Freefall: DOGGY!
What Descartes referred to as "I" was an immaterial soul which is shown all the sense data gathered by the brain and makes decisions based on it which are fed back to the brain for action.
He identified the pineal gland as the part of the brain where this happened, because nobody knew what the pineal gland did at the time so it might as well be that.
Modern neuroscience shows in several ways that the brain does quite a lot on its own without the conscious "I" (brain activity related to performing voluntary acts begins up to half a second before conscious awareness of the decision to perform the act), and that a very great deal of the "sense data" we assume is shown to the "I" is actually made up by the brain (there are a lot of good examples of this not only in The Self Illusion but also in Richard Wiseman's book Paranormality). The most obvious example, though, is our sense of colour vision. Our actual physical colour vision is limited to quite a narrow portion of our field of vision, but we appear to see our whole field of vision in colour because our brain makes up the rest based on what it assumes it already knows about the things it is seeing. It turns out that we aren't the Cartesian "I"; we're actually the omnipotent deceiver.
-
2014-12-16, 09:11 AM (ISO 8601)
- Join Date
- Oct 2011
- Location
- Therinos
- Gender
Re: Freefall: DOGGY!
-
2014-12-16, 09:16 AM (ISO 8601)
- Join Date
- Apr 2011
- Gender
-
2014-12-16, 09:22 AM (ISO 8601)
- Join Date
- Aug 2007
Re: Freefall: DOGGY!
I should add that, apart from not knowing what it did, it was also believed to be exclusive to humans - I was always told that this was the reason Descartes believed it to be connected to the soul, since "obviously" "lesser" animals didn't have souls. This turned out (unsurprisingly), to be false: it's not only present in most animals, it's particularly well-developed in lizards.
I think that it was this together with the assorted idiocies expressed by Aristotle that soured me to philosophy as a discipline.
Grey Wolf
Interested in MitD? Join us in MitD's thread.
There is a world of imagination
Deep in the corners of your mind
Where reality is an intruder
And myth and legend thrive
Ceterum autem censeo Hilgya malefica est
-
2014-12-17, 02:01 AM (ISO 8601)
- Join Date
- Apr 2014
- Gender
Re: Freefall: DOGGY!
The big thing about "P-zombies" is... how do we know they don't think? We can't experience what executing code actually feels like (if it feels like anything).
And I thought the meaning of Descartes' "Cogito ergo sum" was him saying "I'm thinking, which means there's something that has to be doing the thinking, and therefore I exist, so I can think. If I didn't exist, there wouldn't be anything to do the thinking, and so I wouldn't be thinking."