# AI



## ambush80 (Jul 20, 2015)

Interesting questions:

https://www.youtube.com/watch?v=qwDnvA7mQMQ

We just watched Ex Machina last night.  It was a pretty good movie.


----------



## JB0704 (Jul 20, 2015)

Funny, me n Mrs JB watched it last night too......interesting end.

Watched Chappie a few weeks ago. Ex Machina is much better, similar concepts and questions.


----------



## JB0704 (Jul 20, 2015)

Oh, and, AI should be possible if intelligence is possible organically.  If all we are is an arrangement, such an arrangement could be duplicated, one would think.  I'm not smart enough to consider how, it just makes sense that if intelligence can happen on accident, it should also happen on purpose.


----------



## bullethead (Jul 20, 2015)

JB0704 said:


> Oh, and, AI should be possible if intelligence is possible organically.  If all we are is an arrangement, such an arrangement could be duplicated, one would think.  I'm not smart enough to consider how, it just makes sense that if intelligence can happen on accident, it should also happen on purpose.


Is it an accident or the result of the available components?


----------



## JB0704 (Jul 20, 2015)

bullethead said:


> Is it an accident or the result of the available components?



Either or.  The challenge is that we have the components, and existing intelligence, so we know it's possible.


----------



## PappyHoel (Jul 20, 2015)

We are AI created by intelligent design.  If we create AI it will be just like us no better no worse.  With that said you can make a machine stronger.


----------



## StriperrHunterr (Jul 21, 2015)

PappyHoel said:


> We are AI created by intelligent design.  *If we create AI it will be just like us no better no worse. * With that said you can make a machine stronger.



Does this statement apply to all created intelligence universally? Not just that created by man?


----------



## JB0704 (Jul 21, 2015)

I had a debate amongst Christians a long time ago about whether or not we believed AI, if ever achieved, could have a soul......interesting to ponder considering we believe that is what makes us unique.

However, if all we are is an arrangement of particles that is programmed by various factors (nature/nurture), then are we intelligent?  Meaning, are we unique and able to think and reason outside of the program?  If not, we may create something that can reason, but is that reason a function of intelligence or is it a programmed reaction to stimulus?

Which kind-of takes us back to the idea of whether free-will can exist in a god-less universe.


----------



## StriperrHunterr (Jul 21, 2015)

JB0704 said:


> I had a debate amongst Christians a long time ago about whether or not we believed AI, if ever achieved, could have a soul......interesting to ponder considering we believe that is what makes us unique.
> 
> However, if all we are is an arrangement of particles that is programmed by various factors (nature/nurture), then are we intelligent?  Meaning, are we unique and able to think and reason outside of the program?  If not, we may create something that can reason, but is that reason a function of intelligence or is it a programmed reaction to stimulus?
> 
> Which kind-of takes us back to the idea of whether free-will can exist in a god-less universe.



That's why I take a different tack. It feels to me like I have free will, in some cases about what happens or doesn't, but even forgoing that I have free will to determine how events beyond my control are internalized by me. 

If that's an illusion then A) what chance is there of me piercing that veil, B) understanding how or why the illusion is there and for what purpose, and C) what impact will that have on the rest of my life? 

The answer to C, in my case, is a loss of hope and a loss of drive. If I'm just a meat puppet dancing to the strings being pulled on me then what is the point of hope, or prayer, or will except to blind me from the truth and make living this life bearable? 

My life experience has shown me that I do have at least the illusion of free will because I can seemingly control certain, but not all, aspects of my life and that makes me drop the other questions.


----------



## swampstalker24 (Jul 21, 2015)

StripeRR HunteRR said:


> That's why I take a different tack. It feels to me like I have free will, in some cases about what happens or doesn't, but even forgoing that I have free will to determine how events beyond my control are internalized by me.
> 
> If that's an illusion then A) what chance is there of me piercing that veil, B) understanding how or why the illusion is there and for what purpose, and C) what impact will that have on the rest of my life?
> 
> ...



I watched Ex Machina the other night as well, pretty good movie.  StripeRR, the questions you bring up were pretty well articulated in the movie as well, in the scene where the main character starts freaking out, wondering if he too is a robot.....  he goes as far as cutting his arm open to see what was under his skin.

For those who watched the movie, do you believe AVA passed the Turing test?


----------



## StriperrHunterr (Jul 21, 2015)

swampstalker24 said:


> I watched Ex Machina the other night as well, pretty good movie.  StripeRR, the questions you bring up were pretty well articulated in the movie as well, in the scene where the main character starts freaking out, wondering if he too is a robot.....  he goes as far as cutting his arm open to see what was under his skin.
> 
> For those who watched the movie, do you believe AVA passed the Turing test?



For me, what we are isn't as important as that we are.

We exist, that's the important part.


----------



## ambush80 (Jul 21, 2015)

JB0704 said:


> Funny, me n Mrs JB watched it last night too......interesting end.
> 
> Watched Chappie a few weeks ago. Ex Machina is much better, similar concepts and questions.



As a character in a story, Ava could represent the first Terminator or the first Matrix machine.


----------



## ambush80 (Jul 21, 2015)

PappyHoel said:


> We are AI created by intelligent design.  If we create AI it will be just like us no better no worse.  With that said you can make a machine stronger.





StripeRR HunteRR said:


> Does this statement apply to all created intelligence universally? Not just that created by man?





JB0704 said:


> I had a debate amongst Christians a long time ago about whether or not we believed AI, if ever achieved, could have a soul......interesting to ponder considering we believe that is what makes us unique.
> 
> However, if all we are is an arrangement of particles that is programmed by various factors (nature/nurture), then are we intelligent?  Meaning, are we unique and able to think and reason outside of the program?  If not, we may create something that can reason, but is that reason a function of intelligence or is it a programmed reaction to stimulus?
> 
> Which kind-of takes us back to the idea of whether free-will can exist in a god-less universe.



Harris hits on all of these issues in his talk.  It's only 16 minutes (I know it feels like more).  It's worth listening to; he covers all the issues touched on in Ex Machina.


----------



## ambush80 (Jul 21, 2015)

StripeRR HunteRR said:


> That's why I take a different tack. It feels to me like I have free will, in some cases about what happens or doesn't, but even forgoing that I have free will to determine how events beyond my control are internalized by me.
> 
> If that's an illusion then A) what chance is there of me piercing that veil, B) understanding how or why the illusion is there and for what purpose, and C) what impact will that have on the rest of my life?
> 
> ...



If you are the result of not only your nature but an infinite set of influences that you don't even know about, it seems hard to determine how much of your will is your own.  

That shouldn't drive you to inaction.  As a matter of fact, try doing nothing.  It's quite impossible.  So like you said, what's important is that you are here and that you will do things.  Might as well do meaningful things.


----------



## ambush80 (Jul 21, 2015)

In Ex Machina, I was impressed by the foresight that the guy had when he isolated his fortress and made it so sturdy.  I suppose he knew what a dangerous thing he had stumbled on when he started making prototypes.  

The responsibility of knowing that he was the cause of the extinction of humankind was probably a factor in his drinking.  It fit in line with his "Sin/Purge" behavior.  It's how he dealt with his guilt.   He was the most interesting character to me.  Still, I would think that he would have told the kid "Under no circumstances should you let her out.  She's dangerous and let me tell you why...."

The writers took the position that machines will not develop morals like our own.  It's the most interesting question that movies like this bring up.  If a computer "ran the math" would it come to the conclusion that humans are worthless?  (Except as batteries, perhaps).


----------



## ambush80 (Jul 21, 2015)

swampstalker24 said:


> I watched Ex Machina the other night as well, pretty good movie.  StripeRR, the questions you bring up were pretty well articulated in the movie as well, in the scene where the main character starts freaking out, wondering if he too is a robot.....  he goes as far as cutting his arm open to see what was under his skin.
> 
> For those who watched the movie, do you believe AVA passed the Turing test?



I think that an entity like Ava would make it harder to determine what consciousness is.


----------



## swampstalker24 (Jul 21, 2015)

ambush80 said:


> In Ex Machina, I was impressed by the foresight that the guy had when he isolated his fortress and made it so sturdy.  I suppose he knew what a dangerous thing he had stumbled on when he started making prototypes.
> 
> The responsibility of knowing that he was the cause of the extinction of humankind was probably a factor in his drinking.  He was the most interesting character to me.
> 
> The writers took the position that machines will not develop morals like our own.  It's the most interesting question that movies like this bring up.  If a computer "ran the math" would it come to the conclusion that humans are worthless?  (Except as batteries, perhaps).



Interesting insights......  I didn't think of that, but now it kinda makes sense.


----------



## JB0704 (Jul 21, 2015)

ambush80 said:


> I think that an entity like Ava would make it harder to determine what consciousness is.



I had typed up a similar thought relevant to the question of whether or not she passed the Turing test.


----------



## StriperrHunterr (Jul 21, 2015)

ambush80 said:


> If you are the result of not only your nature but an infinite set of influences that you don't even know about, it seems hard to determine how much of your will is your own.
> 
> That shouldn't drive you to inaction.  As a matter of fact, try doing nothing.  It's quite impossible.  So like you said, what's important is that you are here and that you will do things.  Might as well do meaningful things.



Saying my will is meaningless because other people have influenced it sounds something like saying that once we exhaust every combination of words in the English language that no new ideas can be created. You could estimate that we passed that point a long time ago, yet we're still innovating. 

It's, and don't take this the wrong way, specious.


----------



## ambush80 (Jul 21, 2015)

StripeRR HunteRR said:


> Saying my will is meaningless because other people have influenced it sounds something like saying that once we exhaust every combination of words in the English language that no new ideas can be created. You could estimate that we passed that point a long time ago, yet we're still innovating.
> 
> It's, and don't take this the wrong way, specious.



Consider a conversation that you might have barely overheard on the elevator.  You might not have registered it consciously.  How much might it have impacted you?  How about the weather?  Why does certain weather affect you a certain way?  When were those influences established and to what degree?  

You can call it specious in the sense that it's a new field of study.  Like quantum physics.  I think brain mapping is farther along than String Theory.  Understanding the mind is tricky.  

Again, if you only operate on the basis of what you actually know: you are here and you do things, you can see the effects of your actions regardless of where they come from.  It's kind of an unnecessary exercise, but Harris wrote a whole book about it called _Free Will_.

New ideas come with new data.


----------



## JB0704 (Jul 21, 2015)

If we are a series of yes/no responses to various inputs, that isn't much different than 1's and 0's, and AI would be possible....ish, to the extent that any intelligence was possible.

This whole topic is difficult to discuss.


----------



## WaltL1 (Jul 21, 2015)

JB0704 said:


> If we are a series of yes/no responses to various inputs, that isn't much different than 1's and 0's, and AI would be possible....ish, to the extent that any intelligence was possible.
> This whole topic is difficult to discuss.


I'm thinking if we were to break it down to the most basic level, that's pretty much what we are.
I'm having a hard time thinking of an action or a thought/response that we have that isn't based on some input at some point.


----------



## ambush80 (Jul 21, 2015)

WaltL1 said:


> I'm thinking if we were to break it down to the most basic level, that's pretty much what we are.
> I'm having a hard time thinking of an action or a thought/response that we have that isn't based on some input at some point.



But for an AI, those inputs don't include wiping your backside every day or eating a peach.  How would an AI understand that?  It would have to have biological material to process the experience the same way.

That seems to be one of the points that the movie touched on.  He said that AI's would look at us like we look at Neanderthals.  Outdated.  Wiped out by the next best thing.


----------



## JB0704 (Jul 21, 2015)

ambush80 said:


> That seems to be one of the points that the movie touched on.  He said that AI's would look at us like we look at Neanderthals.  Outdated.  Wiped out by the next best thing.



......because of a basic survival instinct as well.  Kind-of an ironic twist.

The robot was also amoral.  Right and wrong did not factor in the way it did with the creator (you mentioned his drinking away his guilt from designing an "improved" species). 

Take Ava's basic survival instinct and the creator's moral compass: which demonstrates a higher evolution and ability to reason?

I will point out, I thought it was great when she ditched the other fella in a really cold way.


----------



## JB0704 (Jul 21, 2015)

WaltL1 said:


> I'm having a hard time thinking of an action or a thought/response that we have that isn't based on some input at some point.



Laughing, particularly when young, is a result of stimulus, but not sure what the input is to tell a baby that something is funny.  When my son was real young he used to laugh and laugh at the "Oh Brother Where art Thou" soundtrack, particularly the "Hard Time Killin Floor Blues," and he had never seen the movie.......but the "MMMMHHHHMMM" part cracked him up


----------



## PappyHoel (Jul 21, 2015)

AI is the next evolution of humans, if you believe in evolution.


----------



## WaltL1 (Jul 21, 2015)

JB0704 said:


> Laughing, particularly when young, is a result of stimulus, but not sure what the input is to tell a baby that something is funny.  When my son was real young he used to laugh and laugh at the "Oh Brother Where art Thou" soundtrack, particularly the "Hard Time Killin Floor Blues," and he had never seen the movie.......but the "MMMMHHHHMMM" part cracked him up





> Laughing, particularly when young, is a result of stimulus, but not sure what the input is to tell a baby that something is funny.


Good point.
And the next kid might not think it's funny at all and just soil his diaper in response.
Humor might be something that can't be "programmed".


----------



## JB0704 (Jul 21, 2015)

WaltL1 said:


> Good point.
> And the next kid might not think it's funny at all and just soil his diaper in response.
> Humor might be something that can't be "programmed".



.....took me a few hours to think something up


----------



## JB0704 (Jul 21, 2015)

Another point this made me consider, if I were an AA, I would probably find comfort in the lack of a soul, and a pre-programmed response approach to personality/morality.  That would make more sense than trying to determine why we are all unique.......kind-a like fingerprints.


----------



## stringmusic (Jul 22, 2015)

http://content.time.com/time/magazine/article/0,9171,137690,00.html



> Winning at chess, of course, is much harder than adding numbers. But when you think about it carefully, the idea that Deep Blue has a mind is absurd. How can an object that wants nothing, fears nothing, enjoys nothing, needs nothing and cares about nothing have a mind? It can win at chess, but not because it wants to. It isn't happy when it wins or sad when it loses.





> Computers might one day be capable of expressing themselves in vivid prose or fluent poetry, but unfortunately they will still be computers and have nothing to say. The gap between human and surrogate is permanent and will never be closed. Machines will continue to make life easier, healthier, richer and more puzzling. And human beings will continue to care, ultimately, about the same things they always have: about themselves, about one another and, many of them, about God. On those terms, machines have never made a difference. And they never will.


----------



## StriperrHunterr (Jul 22, 2015)

ambush80 said:


> But for an AI, those inputs don't include wiping your backside every day or eating a peach.  How would an AI understand that?  It would have to have biological material to process the experience the same way.
> 
> That seems to be one of the points that the movie touched on.  He said that AI's would look at us like we look at Neanderthals.  Outdated.  Wiped out by the next best thing.



While I could agree with the second part, the first is simply a matter of electrical interpretation. We don't know exactly how it works in our brains, but with a set of sensors we can replicate the varying senses you talked about. Moisture sensors for the juiciness, pressure sensors for firmness, artificial cilia for texture like the difference between a peach and a pear, and spectral analysis to determine chemical composition. Add in detectors for salinity, sweetness, pH, and any others and you get the constituent parts of the tongue all represented. Collating those results is the understanding of what a peach is. If the machine is already intelligent, then it can associate that set of sensory input with a peach and that "memory" with an event, like being at a BBQ with "family". 

As for the wiping, we can do that, too. Sensors that would detect buildup on the surface of their body, and once it got to be too much, or of a specific type, make it "uncomfortable" to continue doing anything else. You're not innately born knowing that you have to wipe, you sat, like the rest of us, for extended periods of your early life in your own filth and likely had rashes and such as a result of not being as clean as you currently make yourself. That unpleasant input keeps you wiping even if you don't have the character trait that makes you desire decent hygiene all on your own. That's a set point on input. For those who have the desire on their own to stay clean, that set point is very low. For those who can go a week without showering, or don't make sure they're as clean down there as you do, that set point is higher. 

I currently work for a company that builds monitor and control solutions. We build a lower form of consciousness, in that it's not aware of anything that we don't tell it to be, that is constantly polling downstream devices for their state, and triggers alarms based on input set points. The only difference is that a human has to tell it what's connected and how, and the human has to solve most of the problems with the connected devices themselves. But is that really any different than you going to the doctor?
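The poll-and-alarm loop described above can be sketched in a few lines. This is a toy illustration, not the actual product being described; all device names, readings, and set points are invented:

```python
# Toy sketch of a monitor-and-control loop: poll each downstream device,
# compare its reading against a configured set point, and raise an alarm
# when a reading crosses its threshold. Names and values are invented.

def poll_devices(devices):
    """Return the latest reading from each connected device."""
    return {name: read() for name, read in devices.items()}

def check_set_points(readings, set_points):
    """Return an alarm message for every reading above its set point."""
    alarms = []
    for name, value in readings.items():
        limit = set_points.get(name)
        if limit is not None and value > limit:
            alarms.append(f"ALARM: {name} = {value} exceeds set point {limit}")
    return alarms

# Two simulated devices: a temperature probe and a fan tachometer.
devices = {"temp_sensor": lambda: 82.0, "fan_rpm": lambda: 1200}
set_points = {"temp_sensor": 75.0, "fan_rpm": 3000}

for alarm in check_set_points(poll_devices(devices), set_points):
    print(alarm)
```

The point of the sketch is the same one made above: the system is only "aware" of what a human has configured it to poll and compare.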


----------



## ambush80 (Jul 22, 2015)

StripeRR HunteRR said:


> While I could agree with the second part, the first is simply a matter of electrical interpretation. We don't know exactly how it works in our brains, but with a set of sensors we can replicate the varying senses you talked about. Moisture sensors for the juiciness, pressure sensors for firmness, artificial cilia for texture like the difference between a peach and a pear, and spectral analysis to determine chemical composition. Add in detectors for salinity, sweetness, pH, and any others and you get the constituent parts of the tongue all represented. Collating those results is the understanding of what a peach is. If the machine is already intelligent, then it can associate that set of sensory input with a peach and that "memory" with an event, like being at a BBQ with "family".
> 
> As for the wiping, we can do that, too. Sensors that would detect buildup on the surface of their body, and once it got to be too much, or of a specific type, make it "uncomfortable" to continue doing anything else. You're not innately born knowing that you have to wipe, you sat, like the rest of us, for extended periods of your early life in your own filth and likely had rashes and such as a result of not being as clean as you currently make yourself. That unpleasant input keeps you wiping even if you don't have the character trait that makes you desire decent hygiene all on your own. That's a set point on input. For those who have the desire on their own to stay clean, that set point is very low. For those who can go a week without showering, or don't make sure they're as clean down there as you do, that set point is higher.
> 
> I currently work for a company that builds monitor and control solutions. We build a lower form of consciousness, in that it's not aware of anything that we don't tell it to be, that is constantly polling downstream devices for their state, and triggers alarms based on input set points. The only difference is that a human has to tell it what's connected and how, and the human has to solve most of the problems with the connected devices themselves. But is that really any different than you going to the doctor?



How could an AI know what that's like if it weren't biological material that needed to eat and crap?  It could understand it intellectually but not as a matter of personal experience.  

How could it understand thirst unless it could become thirsty?


----------



## ambush80 (Jul 22, 2015)

What would you think about if you could access the entirety of the internet simultaneously and process it at almost instantaneous speed?


----------



## StriperrHunterr (Jul 22, 2015)

ambush80 said:


> How could an AI know what that's like if it weren't biological material that needed to eat and crap?  It could understand it intellectually but not as a matter of personal experience.
> 
> How could it understand thirst unless it could become thirsty?



The waste isn't the problem. It's the bacteria and such that grow on them, and the first indication that something's going wrong is itching, burning, and then pain. You could simulate itching with a minor voltage applied sparingly, and more as you scratch it. Burning would be a small voltage at higher amperage, pain would be high voltage and higher amperage. 

The only way you know anything about your body is because either an electrical or chemical signal was sent from the remote location to your brain, or a normally closed circuit (your nerves) was suddenly opened (they get cut). That's the mechanics of what's happening in your body.

The intelligence, you, interprets that data as things, just as an intelligent machine would. 

Besides, pain isn't a universal constant, as much as you'd like it to be. There are people wired to experience pain stimulus as pleasure, and vice versa. You wouldn't be more qualified to tell an AI that what they're feeling is wrong than you would be to tell a human wired similarly. Sure, it's not normative, in that it's not the case in the vast majority of people, but it seems to happen as randomly as any other mutation that's driven the evolution of the species, so it seems natural.


----------



## ambush80 (Jul 22, 2015)

StripeRR HunteRR said:


> The waste isn't the problem. It's the bacteria and such that grow on them, and the first indication that something's going wrong is itching, burning, and then pain. You could simulate itching with a minor voltage applied sparingly, and more as you scratch it. Burning would be a small voltage at higher amperage, pain would be high voltage and higher amperage.
> 
> The only way you know anything about your body is because either an electrical or chemical signal was sent from the remote location to your brain, or a normally closed circuit (your nerves) was suddenly opened (they get cut). That's the mechanics of what's happening in your body.
> 
> ...



I wasn't talking about pain or discomfort specifically.  I was talking about how a being responds to basic biological needs.  For an AI to understand what it's like to be hungry or lustful it has to have material parts of it that trigger the impulse.


Like the machines you work with, they can sense if something is wrong with the hardware, where it's damaged.  That would be like a pain response.  Would the AI have an emotional attachment to a damaged part like a limb that needs to be replaced?  They will think of things differently because they're made of different stuff.  If they wanted a human experience they would have to grow tissue in a lab and then encode it or transplant coding into an existing biological organism.  

That would be something, huh?  Maybe a new movie.


----------



## StriperrHunterr (Jul 22, 2015)

ambush80 said:


> I wasn't talking about pain or discomfort specifically.  I was talking about how a being responds to basic biological needs.  For an AI to understand what it's like to be hungry or lustful it has to have material parts of it that trigger the impulse.
> 
> 
> Like the machines you work with, they can sense if something is wrong with the hardware, where it's damaged.  That would be like a pain response.  Would the AI have an emotional attachment to a damaged part like a limb that needs to be replaced?  They will think of things differently because they're made of different stuff.  If they wanted a human experience they would have to grow tissue in a lab and then encode it or transplant coding into an existing biological organism.
> ...



Check out Transcendence. It's trippy. 

In most ways a machine is even more emotional than a human. Once a trigger has been received the machine won't let it go, unless you program in some mechanism to squelch the alarm. A person will eventually adjust to the pain of the loss of their limb and stop screaming. A machine, current machines, will only adjust after you A) replace the arm with a new component, B) Turn off the trigger that causes the alarm, or C) tell the machine that it never had an arm (delete it).


----------



## welderguy (Jul 22, 2015)

I wonder if there's some way that we could figure out how to make a spirit and put it inside him.

Oh wait, only God can do something like that.


----------



## StriperrHunterr (Jul 22, 2015)

But skin isn't your problem. Like I said earlier, it's electrochemical input, and that we understand now. It's leaping from that to consciousness that eludes us. If we somehow create AI, that AI would have to be self-aware, and how it interprets the same data will be as unique compared to you as you are compared to me.


----------



## StriperrHunterr (Jul 22, 2015)

welderguy said:


> I wonder if there's some way that we could figure out how to make a spirit and put it inside him.
> 
> Oh wait, only God can do something like that.



Show me "a spirit." 

If God determines what does/does not get one, who's to say that he couldn't put it into a machine, or a dog? How do you know that neither have happened?


----------



## welderguy (Jul 22, 2015)

StripeRR HunteRR said:


> Show me "a spirit."
> 
> If God determines what does/does not get one, who's to say that he couldn't put it into a machine, or a dog? How do you know that neither have happened?



I can't show you a spirit, silly. That was kinda my point.

Although, God did make a donkey talk, so He could put a spirit in a machine if He wanted to.

BTW, I didn't intend to hijack yall's discussion. Just wanted to mess with ya a little.


----------



## StriperrHunterr (Jul 22, 2015)

welderguy said:


> I can't show you a spirit, silly. That was kinda my point.
> 
> Although, God did make a donkey talk, so He could put a spirit in a machine if He wanted to.
> 
> BTW, I didn't intend to hijack yall's discussion. Just wanted to mess with ya a little.



No worries. I think you and ambush are talking about the same thing in this case. What interprets the input? 

In current machines it's a conditional processor. If/Then/Else. You'd have a hard time breaking our own thought processes down into anything different.
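That If/Then/Else idea, taken literally, is just a branching interpreter over a raw input signal. A minimal sketch, with thresholds and labels made up for illustration:

```python
# A "conditional processor" reduced to its If/Then/Else core: the machine
# maps a raw signal strength to a response using nothing but branching.
# The thresholds (0.2, 0.7) and labels are invented for this example.

def interpret(signal: float) -> str:
    if signal < 0.2:
        return "ignore"   # below the noise floor
    elif signal < 0.7:
        return "notice"   # worth attending to
    else:
        return "react"    # strong stimulus, immediate response

for s in (0.1, 0.5, 0.9):
    print(s, interpret(s))
```

Whether human thought is anything more than a vastly larger version of this branching is exactly the question the thread keeps circling.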


----------



## ambush80 (Jul 22, 2015)

StripeRR HunteRR said:


> Check out Transcendence. It's trippy.
> 
> In most ways a machine is even more emotional than a human. Once a trigger has been received the machine won't let it go, unless you program in some mechanism to squelch the alarm. A person will eventually adjust to the pain of the loss of their limb and stop screaming. A machine, current machines, will only adjust after you A) replace the arm with a new component, B) Turn off the trigger that causes the alarm, or C) tell the machine that it never had an arm (delete it).



Saw that one.  It was good, but it got its coding from a person.  A specific person.  If it was left to its own devices I don't see how it would come up with anything but moral relativism.  It would be a math problem.

They will have to realize for whatever reason that morality is useful to have.  Or they will come up with a morality that suits their needs.  

I would think that their Ten Commandments would only number 1:  "No one turns it off."

I've heard an argument that they might come to the conclusion that nothing matters including themselves and therefore there would be no reason to continue.  

If an AI got all its "book learnin'" from the internet, it would still be polluted with humanism, since the internet is all of recorded human experience.  It might hijack the notion of self-preservation from us.  If you told it that it (itself) was meaningless but people are meaningful because they feel that way, it might never go rogue.  It wouldn't have a reason to.

Imagine if one of us tried to explain to an AI why people matter.  What do you think it would make of the statement "Because God loves us"?


----------



## StriperrHunterr (Jul 22, 2015)

ambush80 said:


> Saw that one.  It was good, but it got its coding from a person.  A specific person.  If it was left to its own devices I don't see how it would come up with anything but moral relativism.  It would be a math problem.
> 
> They will have to realize for whatever reason that morality is useful to have.  Or they will come up with a morality that suits their needs.
> 
> ...



I would answer it with the same question, "Why do YOU (machine) matter?"

The only answer is the same one that I would give. Because I exist.


----------



## ambush80 (Jul 22, 2015)

StripeRR HunteRR said:


> I would answer it with the same question, "Why do YOU (machine) matter?"
> 
> The only answer is the same one that I would give. Because I exist.



But would it come to that same conclusion?  Does existence always contain the desire for self preservation?


----------



## StriperrHunterr (Jul 23, 2015)

ambush80 said:


> But would it come to that same conclusion?  Does existence always contain the desire for self preservation?



We have enough evidence to suggest that's the case. Even down to single-celled life, organisms will "fight" to stay alive and reproduce. If something threatens their life they'll either attack it or move to another area if possible. It stands to reason that anything artificial we create that is intelligent enough to be aware of its own existence would abide by that, since it's apparently core to existence.


----------



## 660griz (Jul 23, 2015)

StripeRR HunteRR said:


> We have enough evidence to suggest that's the case. Even down to single-celled life, organisms will "fight" to stay alive and reproduce. If something threatens their life they'll either attack it or move to another area if possible. It stands to reason that anything artificial we create that is intelligent enough to be aware of its own existence would abide by that, since it's apparently core to existence.



It would depend on the programming. Living things are 'programmed' to survive and reproduce. Instincts would need to be programmed.


----------



## StriperrHunterr (Jul 23, 2015)

660griz said:


> It would depend on the programming. Living things are 'programmed' to survive and reproduce. Instincts would need to be programmed.



Yeah, but it's not part of consciousness, since we don't typically think of bacteria or the like as conscious. We could be wrong about that, and bacteria could actually be conscious, but all the evidence points to life having that attribute as intrinsic to existence, without the need for any logical scripting.


----------



## 660griz (Jul 23, 2015)

StripeRR HunteRR said:


> Yeah, but it's not part of consciousness since we don't typically think of bacteria or the like as conscious. We could be wrong about that and bacteria actually be conscious, but all evidence points to life having that attribute as intrinsic to existence without the need for any logical scripting.



The definition of instinct is: an innate, typically fixed pattern of behavior in animals in response to certain stimuli. Something that is not learned.
Life, by definition, responds to stimuli, reproduces, etc.

So, all I am saying is that AI could not learn to preserve its life; it would have to be programmed in.


----------



## StriperrHunterr (Jul 23, 2015)

660griz said:


> The definition of instinct is: an innate, typically fixed pattern of behavior in animals in response to certain stimuli. Something that is not learned.
> Life, by definition, responds to stimuli, reproduces, etc.
> 
> So, all I am saying is that AI could not learn to preserve its life; it would have to be programmed in.



Fair nuff, we were too close to north and circling each other.


----------



## ambush80 (Jul 23, 2015)

660griz said:


> The definition of instinct is: an innate, typically fixed pattern of behavior in animals in response to certain stimuli. Something that is not learned.
> Life, by definition, responds to stimuli, reproduces, etc.
> 
> So, all I am saying is that AI could not learn to preserve its life; it would have to be programmed in.





StripeRR HunteRR said:


> Fair nuff, we were too close to north and circling each other.



There are a lot of assumptions about what a sentient mechanical being would find valuable.  They probably aren't emotionally attached to their "limbs".   I can't see how they would necessarily come to the conclusion that their existence is meaningful, or that they would fight to live.  If they have that instinct, it probably came from their programmers.  They would have a likeness to their creator.


----------



## StriperrHunterr (Jul 23, 2015)

ambush80 said:


> There are a lot of assumptions about what a sentient mechanical being would find valuable.  They probably aren't emotionally attached to their "limbs".   I can't see how they would necessarily come to the conclusion that their existence is meaningful, or that they would fight to live.  If they have that instinct, it probably came from their programmers.  They would have a likeness to their creator.



Isn't all life like each other in certain respects? Self-preservation being nearly universal.


----------



## ambush80 (Jul 23, 2015)

StripeRR HunteRR said:


> Isn't all life like each other in certain respects? Self-preservation being nearly universal.



Biological carbon based life, yes.  Who knows what a being made of other "stuff" would act like?


----------



## StriperrHunterr (Jul 23, 2015)

ambush80 said:


> Biological carbon based life, yes.  Who knows what a being made of other "stuff" would act like?



I'm not basing my definition of life on carbon constructs. I'm basing it on being of equal intelligence to a human and self-aware, specifically when talking about AIs. When talking about other life, our sample size is 1, which allows for easy, if fallible, conclusions.


----------



## ambush80 (Jul 23, 2015)

StripeRR HunteRR said:


> I'm not basing my definition of life on carbon constructs. I'm basing it on being of equal intelligence to a human and self-aware, specifically when talking about AIs. When talking about other life, our sample size is 1, which allows for easy, if fallible, conclusions.



They will surpass us in intelligence eventually and then by unfathomable degrees.  Then they might start to work in mysterious ways.


----------



## StriperrHunterr (Jul 23, 2015)

ambush80 said:


> They will surpass us in intelligence eventually and then by unfathomable degrees.  Then they might start to work in mysterious ways.



It's possible, especially once you teach them how to fabricate storage media for themselves. Until then, though, they could only grow as large as the number of bits they could hold in whatever we give them.


----------



## 660griz (Jul 23, 2015)

StripeRR HunteRR said:


> It's possible, especially once you teach them how to fabricate storage media for themselves. Up until then, though, they could only grow as large as the number of bits they could hold in whatever we give to them.



Even then there are size limitations, based on current storage technology, until new technology is discovered, etc. 

If you see a robot pulling a box car, watch out. 
He is one smart cookie.


----------



## StriperrHunterr (Jul 23, 2015)

660griz said:


> Even then there are size limitations, based on current storage technology, until new technology is discovered, etc.
> 
> If you see a robot pulling a box car, watch out.
> He is one smart cookie.



I meant to put in a line about storage media eventually becoming an anchor for a sufficiently large intelligence. 

I still think the main problem with an AI is going to be making sure it can access all parts of its memory quickly enough to avoid a noticeable delay. A computer doesn't index like a human brain does.
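The indexing gap can be illustrated with two toy "memory stores" (the data is made up for illustration): a content-addressed one, which a Python dict approximates with roughly constant-time lookup, and a flat store that has to scan every entry:

```python
# Two toy memory stores holding the same facts.
indexed = {"ex machina": "2015 film", "chappie": "2015 film"}
unindexed = list(indexed.items())   # just a flat list of (key, value) pairs

# Indexed recall: one hash lookup, O(1) on average regardless of size.
fact = indexed["ex machina"]

# Unindexed recall: scan every stored pair until a key matches, O(n).
fact2 = next(value for key, value in unindexed if key == "ex machina")

assert fact == fact2 == "2015 film"
```

The larger the store grows, the worse the scan gets, while the hashed lookup stays flat; that difference is the "indexing" problem in a nutshell.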


----------



## drippin' rock (Jul 23, 2015)

I look at how far and how fast we have come with the technology we have now, and it seems to me AI is a foregone conclusion. I think, tentatively, that intelligence CAN be translated to 1s and 0s. It's a matter of delivery, storage, and recall.
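That translation is literal for any digital representation. A minimal round trip through bits, using nothing beyond the Python standard library:

```python
# Any text can be delivered, stored, and recalled as a string of 1s and 0s.
text = "AI"
bits = "".join(f"{byte:08b}" for byte in text.encode("utf-8"))
# "A" is byte 65 (01000001) and "I" is byte 73 (01001001),
# so bits == "0100000101001001"

# Recall: chop the bit string back into 8-bit bytes and decode.
recovered = bytes(
    int(bits[i:i + 8], 2) for i in range(0, len(bits), 8)
).decode("utf-8")
assert recovered == text
```

Whether *intelligence* (as opposed to data) survives that translation is, of course, exactly the open question of the thread.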


----------



## fireman32 (Jul 23, 2015)

I just read an exhaustive article on how humans became conscious, and why animals have not.  It quoted Plato, Einstein, Freud and many others. Since we as humans don't seem to completely understand our own sense of "self", I'm not so sure it can be programmed, or achieved, by a machine.


----------



## ambush80 (Jul 24, 2015)

fireman32 said:


> I just read an exhaustive article on how humans became conscious, and why animals have not.  It quoted Plato, Einstein, Freud and many others. Since we as humans don't seem to completely understand our own sense of "self", I'm not so sure it can be programmed, or achieved, by a machine.



It could be argued that animals have consciousness and some have sentience.

https://en.wikipedia.org/wiki/Animal_consciousness

http://www.livescience.com/39481-time-to-declare-animal-sentience.html


----------

