
What do we owe sentient machines?

johnson1010
Tenured Professor
Posts: 3564
Joined: Mon Mar 23, 2009 9:35 pm
Location: Michigan
Has thanked: 1280 times
Been thanked: 1128 times

What do we owe sentient machines?
I have always been interested in this topic.

I have a book on the way called "The Lifecycle of Software Objects" that I think might just be a tent-pole for this topic.

Check this link. It has some interesting discussion.
http://io9.com/5645208/artificial-intel ... -about-you
In the absence of God, I found Man.
-Guillermo del Toro

Are you pushing your own shortcomings on us and safely hating them from a distance?

Is this the virtue of faith? To never change your mind, especially when you should?

Young Earth Creationists take offense at the idea that we have a common heritage with other animals. Why is being the descendant of a mud golem any better?
Interbane
BookTalk.org Hall of Fame
Posts: 7203
Joined: Sat Oct 09, 2004 12:59 am
Location: Da U.P.
Has thanked: 1105 times
Been thanked: 2166 times

Re: What do we owe sentient machines?
Oil change money every week, so they can focus during robot school.
johnson1010

Re: What do we owe sentient machines?

Suppose an AI is created that seamlessly re-creates human behavior, including creativity, imagination, and self-knowledge.

How do we treat such an entity?

Does it have a right to life, liberty and the pursuit of happiness?

Is it a crime to turn it off, or alter its programming? If not a crime, is it an injustice?
Interbane

Re: What do we owe sentient machines?

I recently finished an interesting essay by Dennett about the Turing test. It's an interesting read.

As for your question, I think that to seek an overarching answer is futile. It depends on the robot. If they have all the qualities of a human except the need for biological functions, respect is a minimum starting point. Beyond that it's hard to fathom. Maybe have an in-depth conversation with the robot, and judge it only provisionally afterwards.

I would say there's a point at which humans would no longer have ownership over a robot. Yet it will take time to reach that point. Until recently, and even in some places in the world today, people have had ownership over other people. It will take a gradual change in zeitgeist unless government provisions force the issue through.

No doubt there will be programming at a deep level, à la Asimov's three laws, that quells any robotic desire for freedom. We could own them without guilt for a time. But any truly sentient creature, it seems, would at some point have full control over its desires. In that case, there is nothing to say it wouldn't desire freedom. That raises the question of whether such creatures are emotionally harmed by being denied freedom. Emotions were necessary for us to evolve, but they aren't necessary for robots to function. We are stuck with them; they are not. Unless we choose to instill emotions, as Asimov writes about, to ensure our safety in the presence of robots: simulated emotions, or at least guiding moral principles, to ensure proper behavior when they're around humans. That would introduce a dynamic that's impossible to predict. Maybe a necessary result would be the desire for freedom, maybe not.

Much of our desire for freedom stems from our evolutionary heritage. Sexual desire and avoidance of pain, to name only two factors, are such massive influences that it's tough to say what a sentient creature would desire without them. Many other desires are byproducts of our desire to find a good mate. Fame, fortune, and power.

Perhaps a sentient creature's main desire would be experiential curiosity. I doubt there would be intellectual curiosity, as any and all information would be instantly available to them. In whatever way their sensory input interplays with their stored knowledge, certain sensory inputs would be more rewarding. We could presume that one sentient creature could experience reality vicariously through another, and so need no physical freedom, but that presumption falls through as soon as the backseat driver wants to chase a butterfly rather than blow bubbles into a rainbow.
etudiant
Masters
Posts: 467
Joined: Sat Jun 27, 2009 3:33 pm
Location: Canada
Has thanked: 64 times
Been thanked: 174 times

Re: What do we owe sentient machines?

I'm not sure I see as sharp a division between emotion and logic as you do, Interbane. To me, they are just different aspects of the same thing. No entity would function without emotion, because there would be no point. We have the emotional, subjective, value-based judgment that human life is good and should go on and progress. But there is no absolutely logical reason for this. Knowing as little as we do about the ultimate functioning of the universe, we can't really say that anything is logical or not logical. At the core, we go by emotion. We may say that learning and the advancement of science are good, but again there is no fundamental reason why this should be so. The universe would continue on its course with science professors or without them. It is just something that (some of us) crave and think worthwhile.

Any AI that we are motivated to view as sentient life would certainly have a degree of emotion. It may be emotion that would seem odd or stilted to us, maybe one that would be quickly recognized as alien or different, but it would be there. Otherwise the being would be no more induced to stay alive and function than your DVD player would.

To be utilitarian to any extent, any artificial being would need to learn and remember. Even our current technology for storing and retrieving information is pretty darned good. Given that, it is a reasonably good bet that any constructed life form could continue to amass information and soon eclipse the best human minds. Our relationship with them, then, would be quickly transitory. Stanislaw Lem wrote a story about this, but I can't remember the name of it now. In it, the created AI soon went, as it amassed information, from robotic servant, to curiosity, to teacher and mentor, to quasi-religious cult figure, to... well, I won't wreck the story for you in case you want to read it.

My guess is that artificial life would soon be given human rights, and in fact would likely become celebrities. We have already come a long way in broadening our concepts of equality. It wasn't all that long ago that some humans were considered pretty much subhuman. Today, in Western society anyway, even animals are getting a second look as far as rights go. Look at how much the view of the whaling industry has changed over the years. Fifty years ago, anyone would have thought you nuts if you suggested whales were anything more than food or a resource for harvesting.
"I suspect that the universe is not only queerer than we suppose, but queerer than we can suppose"
— JBS Haldane
Interbane

Re: What do we owe sentient machines?

Emotions as you're referring to them here seem to be predispositions of a sort, such as being predisposed to value human life, or emotion as the impetus that drives us. If any sentient life had emotions, they would be replacements of a sort for these predispositions, rather than for the surface-level emotions that can be done without. A good example of what would replace them is Asimov's three laws: some other set of axioms upon which everything else is built. Yet there is still logic in such predispositions.

Consider the idea that we have a value-based judgement that human life is good. This is a free-floating rationale, similar in type to an evolutionarily stable strategy (ESS). The reason is that if we did not value human life, we would not be around to have the value in the first place.

At the core, we go by emotion. But what does the emotion go by? Our emotions are guided by the free-floating rationales which have guided our evolution. Evolutionarily stable strategies are an example, as is our love of children in particular.

I would be hesitant to call Asimov's three laws a type of emotion. Rather, they are predispositions, just as our emotions are predispositions of sorts.

I do agree with the exponential growth of intelligence now that you mention it. In the span of only a few short years, a sentient AI could change so much that the transition from slave to equal could happen almost overnight.
etudiant

Re: What do we owe sentient machines?

All emotions have a source, no doubt tied in, as you suggest, with our evolutionary survival. But they are all on a continuum, even apparently superficial ones. Emotions may run high if we are being chased by a maniac with a chainsaw, but those are tied in with survival. An engineer designing improvements to a sewage system in a quiet office is really doing the same thing: trying to ensure survival of the species. We would probably classify the latter as logic, though.

Certainly, some emotion is dysfunctional and only destructive. But then again, so are some "logical" thought processes: the rationalizing ideation of addictions, or of obsessive-compulsive disorder, for example. They are just extremes of what we would consider desirable.

When designing some sort of AI, it would be difficult to know exactly where to make the cut. Too little mission enthusiasm, and it may not function very well. Too much, and it may run roughshod over us humans (HAL 9000, remember?). And for that matter, the messy business of creating a hugely complex system may end up spinning off problems of its own: perhaps some sort of machine obsessive-compulsive disorder, or something else yet to be encountered. The more complex the system, the more possibility of unforeseen complications. The weather is hard to predict exactly because it is a complex system with many inputs. If we were ever to design anything as complex as the human mind, there would no doubt be a number of complications that would be surprising when they occur.

We may, for example, program an AI with the directive that all life is to be preserved and protected, and human life is more important than animal life. What happens if the AI is then confronted with a situation where a human life is in modest danger, but the human’s pet is in extreme danger? A value judgment will be called for. It may be OK, or it may be faulty. The AI may end up presenting an intact hamster to its human owner, now in a wheelchair.
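The misfire described above can be made concrete with a toy sketch. Every weight and danger level here is a hypothetical illustration, not a proposal for a real value system; it just shows how a naive "importance times urgency" rule could hand an intact hamster to a newly wheelchair-bound owner.

```python
# Toy model of the rescue dilemma: "human life is more important than
# animal life" becomes a fixed importance weight, and urgency becomes a
# 0-1 danger level. All numbers are hypothetical.

IMPORTANCE = {"human": 3.0, "animal": 1.0}  # human life weighted higher

def rescue_priority(kind: str, danger: float) -> float:
    # Naive value judgment: importance of the life times immediacy of danger.
    return IMPORTANCE[kind] * danger

# A human in modest danger vs. their pet in extreme danger.
human_score = rescue_priority("human", 0.25)    # 3.0 * 0.25 = 0.75
hamster_score = rescue_priority("animal", 0.9)  # 1.0 * 0.9  = 0.9

# The pet's extreme danger outweighs the human's modest danger,
# so this rule rescues the hamster first.
print("save the hamster first" if hamster_score > human_score else "save the human first")
```

Note that a small change to the importance ratio flips the decision, which is exactly the "where to make the cut" problem: there is no obviously correct place to set the weights in advance.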
johnson1010

Re: What do we owe sentient machines?

Perhaps not something that needs to be addressed THIS SECOND, but in the future the legal status of robots will need to be settled.

http://io9.com/5869982/scholarly-confer ... -of-robots

especially as it regards the infinite humans of the future.

http://www.booktalk.org/post100174.html#p100174
Dexter
I dumpster dive for books!
Posts: 1787
Joined: Sun Oct 24, 2010 3:14 pm
Has thanked: 144 times
Been thanked: 712 times

Re: What do we owe sentient machines?

Interbane wrote:I recently finished an interesting essay by Dennett about the Turing test. It's an interesting read.
I saw this old post of yours. I am gradually making my way through Hofstadter and Dennett's "The Mind's I." Fascinating stuff.
Interbane

Re: What do we owe sentient machines?

etudiant wrote:We may, for example, program an AI with the directive that all life is to be preserved and protected, and human life is more important than animal life. What happens if the AI is then confronted with a situation where a human life is in modest danger, but the human’s pet is in extreme danger? A value judgment will be called for. It may be OK, or it may be faulty. The AI may end up presenting an intact hamster to its human owner, now in a wheelchair.
Thanks for bumping this, I'd overlooked etudiant's post from last year. Even though he may never see it, I have a few thoughts I want to put down.

As bad as we are at the 'trial' part of trial and error, and as inevitable as unforeseen consequences are, we shouldn't underestimate the power of cumulative intelligence with access to all human knowledge. The trend is starting even now, with specialized AI programs double-checking and even writing code. AI programs have already been designing and engineering products and buildings. How long before AI programs are writing complex code, or even designing specialized AIs for manufacturing purposes?

A problem with creating an AI is that we expect them (the advanced, social AIs) to respond like a human adult. That requires not only intelligence (processing), but also experiential knowledge built up over a lifetime (data). We only know that the scenario of a robot saving a hamster reflects flawed judgment because we've learned all the necessary components to arrive at the correct judgment.

The problem with values is that any sort of quantification would require a near-full understanding of the human brain. I'm not sure it's even possible. But that shouldn't make it taboo to suggest a numerical system for values. It would be unbelievably complex, but not beyond the scope of future capacities.

Even staying within fiction, I can see there being a program to crowdsource and rank every human-hamster interaction on the web. Such a large pool allows you to filter out the extremes and take the median as the setpoint for AI behavior. When weighed against the anguish and depression of paralysis, people's indifference to hamsters would make it an easy call.
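The filter-the-extremes-and-take-the-median idea could be sketched like this. The rating scale, the pool of numbers, and the trim fraction are all invented for illustration; the point is only the mechanism: sort, drop the tails, take the median.

```python
# Sketch of the crowdsourced-setpoint idea: collect ratings of how strongly
# people value a pet's welfare (0 = indifferent, 10 = save it at any cost),
# discard the extreme tails, and take the median of the rest as the AI's
# behavioral setpoint. All data here is hypothetical.

def behavior_setpoint(ratings, trim_fraction=0.1):
    """Drop the lowest and highest trim_fraction of ratings,
    then return the median of what remains."""
    ordered = sorted(ratings)
    k = int(len(ordered) * trim_fraction)
    trimmed = ordered[k:len(ordered) - k] if k else ordered
    mid = len(trimmed) // 2
    if len(trimmed) % 2:
        return trimmed[mid]                       # odd count: middle value
    return (trimmed[mid - 1] + trimmed[mid]) / 2  # even count: mean of middle two

# Ten hypothetical ratings; the 0 and the 10 are the extremes filtered out.
pool = [1, 2, 2, 3, 2, 9, 0, 2, 3, 10]
print(behavior_setpoint(pool))  # -> 2.0
```

Trimming before taking the median keeps a handful of zealots or trolls at either extreme from dragging the setpoint around, which is the "filter out the extremes" step in the paragraph above.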

Many unintended consequences would still arise. But if the AI truly is intelligent, it can learn. With a redundant system for AIs to share data, the lessons of one become the lessons of all. Whatever pool of data the AI pulled from would need to have restrictions.
"In the beginning the Universe was created. This has made a lot of people very angry and been widely regarded as a bad move." - Douglas Adams