Automation Ethics: Should your car serve you or serve society?

In summary: I think it's a great idea. Tesla is already selling its own brand of auto insurance. If you choose SELFISH, it will cost you an additional $100/hour, but you are allowed to choose. For rich people, the fee might be progressive and expressed as a percentage of net worth. That might be the way to manage the question of automation ethics if we can't ever agree.
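Just to make the arithmetic concrete, here is a minimal sketch of how such a progressive SELFISH surcharge could be computed. The $100/hour floor comes from the post itself; the 0.001%-of-net-worth-per-hour rate is a number made up purely for illustration.

```python
def selfish_surcharge_per_hour(net_worth: float,
                               flat_rate: float = 100.0,
                               net_worth_rate: float = 0.00001) -> float:
    """Hourly surcharge for choosing SELFISH mode.

    flat_rate: the $100/hour baseline mentioned in the post.
    net_worth_rate: hypothetical fraction of net worth charged per hour
                    (0.001% here, purely illustrative).
    The 'progressive' idea: richer owners pay whichever amount is larger.
    """
    return max(flat_rate, net_worth_rate * net_worth)

# Example: an owner worth $50M would pay $500/hour instead of the $100 floor.
print(selfish_surcharge_per_hour(50_000_000))  # 500.0
```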
  • #36
Mark44 said:
Which, of course, would drain the battery even more quickly.
Yes, it must have been designed by lawyers and accountants.

But to answer the original post, if my car is going to serve society first, then society should pay for it.
 
  • Like
Likes jack action, Dale and Tom.G
  • #37
Just a few tweaks, and ...
What Level would you say you are at?

LEVELS OF AUTOMATION: WHO DOES WHAT, WHEN
Level 0: The human driver does all the driving.
Level 1: WIFE can sometimes assist the human driver with either steering or braking/accelerating, but not both simultaneously.
Level 2: WIFE can itself actually control both steering and braking/accelerating simultaneously under some circumstances. The human driver must continue to pay full attention (“monitor the driving environment”) at all times and perform the rest of the driving task.
Level 3: WIFE can itself perform all aspects of the driving task under some circumstances. In those circumstances, the human driver must be ready to take back control at any time. In all other circumstances, the human driver performs the driving task.
Level 4: WIFE can itself perform all driving tasks and monitor the driving environment – essentially, do all the driving – in certain circumstances. The human need not pay attention in those circumstances.
Level 5: WIFE can do all the driving in all circumstances. The human occupants are just passengers and need never be involved in driving.

You could already have automated driver assistance and not know it, nor appreciate the technology.
 
  • Like
Likes symbolipoint
  • #38
Is the title question a silly question? Vehicles for "driving" grew out of using animals for transportation. Maybe the earlier use of animals for transport was as much for the benefit of a group as for the benefit of individual people. That has not changed in modern times.

Time For Some Humor:
One day we may have a self-driving unicycle.
 
  • #39
symbolipoint said:
Time For Some Humor:
One day we may have a self-driving unicycle.
We are halfway there...
 
  • #40
anorlunda said:
When society's interest conflict with the individual owner's interest, which takes priority?
I think that it is also important to distinguish between different levels of interest. For example, while a reasonable case can be made that the designers of the car should prioritize the owner's life over a pedestrian's life, the owner's property should not be prioritized over a pedestrian's life.

To me, the difference between the GM and Tesla approaches seemed to be more about user convenience than anything else. As far as I know, there is too little safety data to say which causes more harm.
 
  • Like
Likes russ_watters and jack action
  • #41
jack action said:
Why anyone would want to be part of a society that does not consider his or her interest? That would be totally absurd.
Strawman alert. The GM model does not exclude your interest, it just limits your ability to inflict damage on the rest of society in satisfying it. You seem to favour a return to the situation in the early years of motoring where pedestrian deaths tended to be viewed as the fault of the pedestrian and an inconvenience for the speeding (and not infrequently drunk) motorist.
 
  • #42
Ophiolite said:
The GM model does not exclude your interest, it just limits your ability to inflict damage on the rest of society in satisfying it.
The point I'm strongly defending is making sure no one thinks it is OK to have decisions made for you based on someone else's fears. Does the Tesla model inflict more damage than the GM model? That is the real question, and one that can be answered with data.

As an individual, you assess the situation and, based on your experience, you decide how much risk you are willing to take. This evaluation is very personal - and often very different - for every individual, and in my opinion it should stay that way. If you remove someone's right to take a risk, you remove all possible progress. It's often easy to identify extreme cases. But people who are too reckless or too careful see all cases as extreme, when in reality the true answer begins with «Well, it depends ...»

The situation at hand is dangerously close to that kind of thinking. Person A is afraid of self-driving cars, therefore person B shouldn't have one, even if person B is fine with it. Somehow person A's fear rules everyone else's decision process. When person A hides behind «society» to do so, I think it's wrong.

Implying that person B doesn't care about inflicting damage on person A just because he/she doesn't share the same fear is a logical fallacy. It is sometimes presented as an appeal to ignorance («We don't know if it is dangerous, therefore we shouldn't take any chance») or as a questionable cause («If they don't take this precaution, then they must want to hurt others»).

Recall OP's statement:
anorlunda said:
So their conclusion was that GM's version is clearly safer.
Where are the facts showing that Tesla's method gives worse results than GM's? As far as we know, neither one has more accidents - or even incidents - in tests or in the real world. On what basis can someone use the term «clearly»?

And if there are no incidents, why would «society» care? (Again, who is «society»?)
Ophiolite said:
Strawman alert. [...] You seem to favour a return to the situation in the early years of motoring where pedestrian deaths tended to be viewed as the fault of the pedestrian and an inconvenience for the speeding (and not infrequently drunk) motorist.
Mischaracterizing the opponent’s position for the sake of deceiving others IS a strawman argument. Thank you for the example.
 
  • #43
jack action said:
As an individual, you assess the situation and, based on your experience, you decide how much you are willing to risk.

At face value, that would argue against speed limits and other traffic regulations if they were enforced automatically and gave a driver no choice about whether to obey them. As it is, the driver has a choice about taking the risk of getting a ticket, causing an accident, etc.

Whatever the merits of that abstract theory, I think autopilot technology for cars will eventually lead to automatic enforcement of traffic laws by computers. That will still leave people choices - whether to hack the computers in their cars, whether to give up driving, etc.
 
  • #44
jack action said:
The point I'm strongly defending is making sure no one thinks having decisions made for you based on someone else's fears is OK.
To me, the key feature of the OP story was that these were private (GM and Tesla) decisions to be settled in the marketplace, free of government interference. Bringing in the argument of others forcing decisions upon you really is a strawman argument.
 
  • #45
This is a great example - we have two competing values and are asking machines to choose:
  • Obey the speed limit
  • Avoid accidents
What happens when the best way to avoid an accident is to speed up? And what if you don't know what the best way is? What if you think speeding up is just a pretty good way? People have trouble with these decisions - do we think machines will do any better?

Another (IMO, more likely) decision is whether to execute a maneuver if it a) decreases the probability of an accident, but b) increases the severity of an accident if it occurs. If you say "just look at expectation values", I would counter that you probably know neither the probability nor the impact well.
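As a toy illustration of that last point (all of the probabilities and severity numbers below are invented, and the factor-of-two uncertainty is just an assumption for the sketch), the ranking of two maneuvers by expected harm can flip once you admit how poorly you know the inputs:

```python
import random

def expected_harm(p_accident: float, severity: float) -> float:
    """Expected harm = probability of an accident times its severity."""
    return p_accident * severity

# Maneuver A: lower chance of a crash, but a worse crash if it happens.
# Maneuver B: higher chance, milder outcome. Point estimates say A wins...
print(expected_harm(0.01, 9.0), expected_harm(0.02, 5.0))  # 0.09 vs 0.10

# ...but if each input is only known to within a factor of ~2, the ranking
# flips in a sizeable fraction of plausible scenarios.
random.seed(0)
flips, trials = 0, 10_000
for _ in range(trials):
    a = expected_harm(0.01 * random.uniform(0.5, 2.0), 9.0 * random.uniform(0.5, 2.0))
    b = expected_harm(0.02 * random.uniform(0.5, 2.0), 5.0 * random.uniform(0.5, 2.0))
    flips += a > b
print(f"A looks worse than B in {flips / trials:.0%} of sampled scenarios")
```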
 
Last edited:
  • Like
Likes symbolipoint
  • #46
Stephen Tashi said:
At face value, that would argue against speed limits and other traffic regulations if they were enforced automatically and gave a driver no choice about whether to obey them.
What is the value of a law that forces you to wait at a red light when there is no one around? If all cars had autopilot, would you trust them to enter an intersection because the car is aware that it is alone? Why don't we trust humans to do that? Why does a human need to be punished for making a decision that has no consequences?

Interesting anecdote: One of my cousins lived in England for a year and told me there were basically no stop signs there. Every intersection was treated like a roundabout, where the car to your right has priority over you.

For people like us living in Québec, Canada, that seems chaotic. How do they do it? Here, people panic when there are no stop signs. Some trials of 'right of way' rules in residential areas have created confusion. Roundabouts have been installed lately, and a lot of people panic.

But actually, we were doing exactly the same thing as in England before the '80s, without realizing it. There were stop signs everywhere but, back then, everybody was doing what we called an 'American stop', i.e. we slowed down and, if there was nobody, we would proceed without halting. Priority was given to the first car arriving at the intersection. There were no laws regulating it; it was just common knowledge.

But in the '80s, some bright police officer actually read the law book and found that it said a vehicle must 'halt' at a stop sign. An opportunity for tickets arose. The notion soon spread throughout the police force and ticket traps were set everywhere. Things like 'you must halt and count to 3 before going on' were among the things police officers would tell you (not in the law). Suddenly, everyone not making a full stop was a dangerous driver. The notion sank in and is now well accepted. The police have since stopped giving tickets for stop signs (I can't remember the last time I heard of someone receiving one). But what have we gained as a society? People who are so afraid of intersections without stop signs that they panic. Somehow, people do not trust that other people will stop for them. "There are so many crazy drivers out there!"

This is a case where laws make no sense. Laws can be abused for reasons other than those they were intended for. People are not stupid. They don't drive to create accidents. Humans - although not without flaws - are more capable of accomplishing complex tasks than we give them credit for. But it is easy to destroy their confidence. The same goes for car companies (which are also run by humans). They don't build cars to kill people.
Stephen Tashi said:
That will still leave people choices - whether to hack the computers in their cars, whether to give up driving etc.
These are not choices about driving. These are choices about breaking the law (being an outlaw) or not participating (being an outcast). It sounds more like extortion to me than freedom of choice.

Who makes those laws, anyway? Who decided that over X km/h is too fast? How do you impose the same limit on a 20-year-old pickup truck and on a brand-new Porsche? On an old, tired man and on an alert young woman?
Vanadium 50 said:
This is a great example - we have two competing values and are asking machines to choose:
  • Obey the speed limit
  • Avoid accidents
This is where it all starts, by opposing two unrelated values: if you go over the speed limit, will you cause an accident? Or if you stay below the speed limit, will you avoid an accident?

There is no direct relationship between the two. There is no line drawn in the sand that can make such a clear distinction. You can have an accident regardless of the speed you are driving. There are a lot more variables in the equation. The only direct relationship between speed and accidents is the severity of the consequences WHEN you have an accident.
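(A quick bit of kinematics backs that last sentence, assuming nothing beyond a constant braking deceleration ##a##: the energy to be dissipated in a crash and the distance needed to stop both scale with the square of the speed,

$$E_k = \tfrac{1}{2}mv^2, \qquad d_\text{stop} = \frac{v^2}{2a},$$

so doubling your speed roughly quadruples both, while saying nothing about whether an accident happens in the first place.)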

This is again a causal fallacy: it is not a choice between respecting the speed limit and having an accident. Nobody chooses to have an accident, regardless of their speed.
anorlunda said:
To me, the key feature of the OP story was that these were private (GM and Tesla) decisions to be settled in the marketplace, free of government interference. Bringing in the argument of others forcing decisions upon you really is a strawman argument.
You seem to think governments make better decisions than the people they represent. The people driving cars or running private companies are the same ones who get elected. So such a hypothesis never made any sense to me, and there is no data supporting it (or there is an equal amount of data disproving it, if you prefer).

You think this:
anorlunda said:
we can have a SELFISH/ALTRUISTIC toggle switch on all our automated devices. If you choose SELFISH, it will cost you an additional $100/hour, but you are allowed to choose. For rich people, the fee might be progressive and expressed in percent of your net worth. That might be the way to manage the question of automation ethics if we can't ever agree.
is not about others forcing decisions?

I wonder in which category you place yourself. Nobody thinks they are in the SELFISH category. Nobody. The truth behind 'managing the question of automation ethics' is really that everyone else must agree with you, and those who won't will just be made to pay until they regain their senses and act as you wish. Do you really think someone will 'choose' to pay $100/hour because they don't agree with you?

One's life is so much easier when the law agrees with one's views or actions. Sadly, laws are always a very poor solution for people who disagree with each other, and certainly a poor way of 'living together' in society.
 
  • Sad
Likes Dale
  • #47
bob012345 said:
But to answer the original post, if my car is going to serve society first, then society should pay for it.
Yes. In a perfect world, you should pay society for the privilege of taking a minimally safe car out on the public highways. You should then be paid a rebate for a car that is better than required or for driving behavior or automation options that are extra safe. You should be penalized for unsafe equipment or behavior.

To some extent, laws and insurance companies make this a reality today.
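For what it's worth, here is a minimal sketch of that pay/rebate idea, with every dollar figure invented purely for illustration (no real insurer's or regulator's pricing is reflected here):

```python
def annual_road_fee(base_fee: float = 1000.0,
                    safety_rebate: float = 0.0,
                    unsafe_penalty: float = 0.0) -> float:
    """Hypothetical yearly fee for putting a minimally safe car on public roads.

    safety_rebate: credit for equipment, automation, or behaviour safer than required.
    unsafe_penalty: surcharge for unsafe equipment or behaviour.
    """
    return max(0.0, base_fee - safety_rebate) + unsafe_penalty

# A car with extra-safe automation and a clean record vs. one with bald tires.
print(annual_road_fee(safety_rebate=300.0))   # 700.0
print(annual_road_fee(unsafe_penalty=450.0))  # 1450.0
```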
 
Last edited:
  • Like
Likes bob012345 and russ_watters
