Incidentally, tuco posted a much longer reply to you in his personal passive-aggressive bitching 'world according to tuco' thread.
for driverless cars
felltoearth wrote::roll:
tuco wrote:The question was rhetorical. It was to underline the context, which was: it will not be technological experts but legal ones deciding how AI in autonomous vehicles will behave. Which, of course and as usual, is obvious, and it's beyond me why it prompted a reaction from you. I answered your question and you are welcome. In case you missed it, it's: both. Any other questions I can help you with?
Waymo's self-driving cars rack up 4 million miles on public roads - The ...
https://www.theverge.com/.../waymo-self ... ads-mile...
Nov 28, 2017 - Self-driving cars are still very much a technology of the future; however, their presence in the here and now is growing. Alphabet's Waymo is among the pioneers pushing self-driving vehicles out into real-world testing, and it has just revealed that its cars have passed the milestone of driving more than 4 million miles on ...
Self-driving vehicles passed a major milestone in November when Waymo’s minivans hit the streets of Phoenix without backup human drivers — reportedly making them the first fleet of fully autonomous cars on public roadways. Over the next few months, people will get a chance to take these streetwise vehicles for a free spin as the company tries to drum up excitement — and a customer base — for its launch of a driverless taxi service.
Self-driving cars
Who to save, who to sacrifice?
By Marc Lajoie
April 25, 2019
Rumraket wrote:aban57 wrote:It's acknowledged that no system is perfect. If harmful outcomes cannot be reduced to zero, at least it will be below the current human level.
If a collision is unavoidable, the report says systems must aim for harm minimisation. There must be no discrimination on the basis of age, gender, race, physical attributes or anything else of any potential accident victim.
All humans are considered equal for the purposes of harm minimisation.
This makes sense to me.
I think I'd only want to add the caveat that it should, if at all possible, try to hit fewer rather than more people.
So hitting the car with a single driver rather than one with a family seems like an obvious choice to me, if those are the only choices available. I don't know whether that sort of identification is technically feasible atm, though.
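The report's harm-minimisation rule, combined with the "fewer rather than more" caveat, amounts to a simple selection criterion: count the people each candidate manoeuvre endangers, weigh everyone equally, and pick the minimum. A minimal sketch of that idea (all names here are hypothetical; a real system would be vastly more complex and, per the report, must not consider protected attributes at all):

```python
from dataclasses import dataclass

@dataclass
class Outcome:
    """A candidate manoeuvre and the number of people it would endanger."""
    maneuver: str
    people_at_risk: int  # a head count only; no age, gender, race, etc.

def choose_maneuver(outcomes):
    """Pick the outcome that endangers the fewest people.

    Because every person counts equally (no discrimination on any
    attribute), harm minimisation reduces to minimising the head count.
    """
    return min(outcomes, key=lambda o: o.people_at_risk)

# Example: if the only choices are the family car or the single-driver
# car, the rule prefers endangering fewer people.
options = [
    Outcome("brake straight toward family car", people_at_risk=4),
    Outcome("swerve toward single-driver car", people_at_risk=1),
]
print(choose_maneuver(options).maneuver)  # swerve toward single-driver car
```

Whether a production system could reliably produce those head counts in real time is exactly the open question raised above.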
Ryan Robert Jenkins, assistant professor of philosophy at California Polytechnic State University, said his industry sources are quick to brush aside the trolley problem. He expects companies will program their cars to brake in a straight line even if there is a potential to save lives by taking other actions.
According to Jenkins, self-driving car manufacturers are likely to be sued in any accident involving their vehicles, and they would prefer not to confront a lawsuit where people were injured who otherwise would not have been. "I suspect you'll see companies programming a car to just brake in a straight line because that gives them a great amount of plausible deniability."