- cross-posted to:
- fuck_cars@lemmy.ml
- technews@radiation.party
- magnetosphere ( @HappyMeatbag@beehaw.org ) English82•1 year ago
Those damn things are not ready to be used on public roads. Allowing them is one of the more prominent examples of corruption that we’ve seen recently.
- Thorny_Thicket ( @Thorny_Thicket@sopuli.xyz ) 50•1 year ago
Statistically they’re still less prone to accidents than human drivers.
I never quite understood why so many people seem to be against autonomous vehicles, especially on Lemmy. It’s unreasonable to demand perfection before any of these are used on public roads. In my view, the bar to reach is human-level driving, and once that’s met it seems quite obvious that, from a safety point of view, they’re the better choice.
- evilviper ( @evilviper@beehaw.org ) 78•1 year ago
This is just such a bad take, and it’s so disappointing to see it parroted all over the web. So many things are just completely inaccurate about these “statistics”, and it’s probably why so many “seem” to be against autonomous vehicles.
- These are self-reported statistics coming from the very companies that have an extremely vested interest in making themselves look good.
- These statistics are for vehicles operating in an extremely small set of geo-fenced locations, picked because they’re the easiest to navigate while still letting the company say “hey, we totally work in a big city with lots of people”.
- These cars don’t even go onto highways or areas where accidents are more likely.
- These cars drive so defensively they literally shut down so as to avoid causing any accidents (hey, who cares if we block traffic and cause jams because we get to juice our numbers).
- They always compare against total human-driven miles, which is a complete apples-to-oranges comparison: their miles aren’t being driven
- In bad weather
- On dangerous, windy, old, unpaved, or otherwise poor road conditions
- In rural areas where there are deer/etc that wander into the road and cause accidents
- They also don’t adjust or use any median numbers, and I’m not interested in them driving better than the “average” driver when that average includes DUIs, crashes caused by neglect or improper maintenance, reckless drivers, elderly drivers, and the fast-and-furious types crashing their vehicle on some hill-climb driving course (see the toy sketch below).
- And that’s all just off the top of my head.
So no, I would absolutely not say they are “less prone to accidents than human drivers”. And that’s just the statistics, to say nothing of the legal questions that will come up, especially given just how averse companies seem to be to admitting fault for anything.
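To make the apples-to-oranges point concrete, here’s a minimal toy sketch. All of the road types, crash rates, and mileage shares below are made-up illustrative assumptions, not real data from any company; it only shows how a raw per-mile comparison can flatter a geo-fenced fleet.

```python
# Toy numbers only -- every figure below is made up for illustration.
# The point: an AV fleet that only drives the easiest miles can post a
# better overall crash rate than the human average even if it is no safer
# than a human on any individual road type (a Simpson's-paradox-style effect).

# Hypothetical human crash rates (crashes per million miles) by road type.
human_rate = {"easy_urban": 2.0, "highway": 1.5, "rural_bad_weather": 6.0}

# Hypothetical share of total miles driven on each road type.
human_mix = {"easy_urban": 0.40, "highway": 0.40, "rural_bad_weather": 0.20}
av_mix = {"easy_urban": 1.00, "highway": 0.00, "rural_bad_weather": 0.00}

# Assume the AV is exactly as good as a human on the roads it actually drives.
av_rate = dict(human_rate)

def overall_rate(rate_by_road, mix):
    """Mileage-weighted average crash rate for a given mix of road types."""
    return sum(rate_by_road[road] * share for road, share in mix.items())

human_overall = overall_rate(human_rate, human_mix)  # 2.6 crashes / M miles
av_overall = overall_rate(av_rate, av_mix)           # 2.0 crashes / M miles

print(f"human overall: {human_overall:.2f} crashes per million miles")
print(f"AV overall:    {av_overall:.2f} crashes per million miles")
# The AV looks ~23% "safer" overall purely because of where its miles are
# driven, not because it drives any better on a given road.
```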
- Kleinbonum ( @Kleinbonum@feddit.de ) 19•1 year ago
These cars don’t even go onto highways or areas where accidents are more likely.
Accidents are less likely on highways. Most accidents occur in urban settings. Most deadly accidents occur outside of cities, off-highway.
- evilviper ( @evilviper@beehaw.org ) 4•1 year ago
Sure, mile for mile they are less likely. But when they happen they are generally more serious, since higher speeds are involved, and if Tesla has shown anything, it’s that highways are a much more complicated environment for autonomous vehicles to navigate, with edge cases like vehicles stopped on the side of the road (emergency or otherwise). It’s much harder (and more dangerous) to just slam on the brakes and put on your hazards on a highway than on a side street if the car gets confused.
- uzay ( @uzay@beehaw.org ) 4•1 year ago
I could see accidents being more likely for autonomous cars on highways though
- Kornblumenratte ( @Kornblumenratte@feddit.de ) 8•1 year ago
Why? Driving on highways is the easiest kind of driving?
- uzay ( @uzay@beehaw.org ) 4•1 year ago
For humans, but not necessarily for camera-based autonomous cars? They also can’t just stop on a highway to prevent accidents.
- Kornblumenratte ( @Kornblumenratte@feddit.de ) 1•1 year ago
Well, I drive a car that can operate (almost) autonomously on a highway, so I know the tech to drive on highways has existed for several years.
All the difficult stuff – slow traffic, parked cars, crossings, pedestrians… – does not exist on highways.
The only problem that still remains is the problem you mention: what to do in case of trouble?
Of course you have to stop on a highway to prevent an accident or in case of an emergency. That’s exactly what humans do. But then humans get out of the car, set up warning signs, get help &c. Cars cannot do this. The result is reported in this article.
- abhibeckert ( @abhibeckert@beehaw.org ) 6•1 year ago
Avoiding dangerous scenarios is the definition of driving safely.
This technology is still an area under active development and nobody (not even Elon!) is claiming this stuff is ready to replace a human in every possible scenario. Are you actually suggesting they should be testing the cars in scenarios that they know wouldn’t be safe with the current technology? Why the fuck would they do that?
So no, I would absolutely not say they are “less prone to accidents than human drivers”.
OK… if you won’t accept the company’s reported data, whose data will you accept? Do you have a more reliable source that contradicts what the companies themselves have published?
to say nothing about the legality that will come up
No, that’s a non-issue. When a human driver runs over a pedestrian/etc. and causes a serious injury, then in a civilised country, with a sensible driver, an insurance company will pay the bill. This happens about a million times a week worldwide, and insurance is a well-established system that people are, for the most part, happy with.
Autonomous vehicles are also covered by insurance. In fact it’s another area where they’re better than humans - because humans frequently fail to pay their insurance bill or even deliberately drive after they have been ordered by a judge not to drive (which obviously voids their insurance policy).
There have been debates over who will pay the insurance premium, but that seems pretty silly to me. Obviously the human who ordered the car to drive them somewhere will have to pay for all costs involved in the drive. And part of that will be insurance.
- Thorny_Thicket ( @Thorny_Thicket@sopuli.xyz ) 5•1 year ago
Well hey - at least I provided some statistics to back me up. That’s not the case with the people refuting those stats.
- evilviper ( @evilviper@beehaw.org ) 3•1 year ago
I honestly can’t tell if that’s a passive-aggressive swipe at me or not; but just in case it was: stats mean very little without context. I believe the quote was “Lies, damned lies, and statistics”. I simply pointed out a few errors with the foundation of these “statistics”. I didn’t need to quote my own statistics because, as I was pointing out, this is a completely apples-to-oranges comparison. The AV companies want to preach about how many miles they go without an accident while, at the same time, comparing themselves to an average they know doesn’t match their own circumstances. Basically they are taking their best-case scenario and comparing it against average/worst-case scenario stats.
I’d give more weight to the stats if they were completely transparent, worked with a neutral third party, and gave them access to all their video/data/etc. to generate (at the very least) proper stats relative to their environment. Sure, I’ll believe Waymo’s and Cruise’s numbers far more readily than Tesla’s, but I still take them with a grain of salt. Because again, they have a HUGE incentive to tweak their numbers to put themselves in the very best light.
- Thorny_Thicket ( @Thorny_Thicket@sopuli.xyz ) 4•1 year ago
No, I see your point, and I agree. These companies are almost guaranteed to cherry-pick those stats, so only a fool would take that as hard evidence. However, I don’t think these stats flat-out lie either. If they show a self-driving car is three times less prone to accidents, I doubt the truth is that humans, in fact, are twice as good. I believe it’s safe to assume that these stats at least point us in the right direction, and that seems to correlate with the little personal experience I have as well. If these systems really sucked as much as the most hardcore AV-skeptics make it seem, I doubt we’d be seeing any of these in use on public roads because the issues would be apparent.
However, the point I’m trying to highlight here is that I make a claim about AV safety and then provide some stats to back me up. People then come telling me that’s nonsense and list a bunch of personal reasons why they feel that way, but provide no concrete evidence except maybe links to articles about individual accidents. That’s just not the kind of data that’s going to change my mind.
- Rikudou_Sage ( @rikudou@lemmings.world ) 19•1 year ago
Fine by me, as long as the companies making the cars take all responsibility for accidents. Which, you know, the human drivers do.
But the car companies want to sell you their shitty autonomous driving software and make you be responsible.
If they don’t trust it enough, why should I?
- Thorny_Thicket ( @Thorny_Thicket@sopuli.xyz ) 2•1 year ago
Well, you shouldn’t trust it, and the car company tells you this. It’s not foolproof and not something to be blindly relied on. It’s a system that assists driving but doesn’t replace the driver. Not in its current form at least, though they may be getting close.
- Rikudou_Sage ( @rikudou@lemmings.world ) 11•1 year ago
Then what’s the discussion even about? I don’t want autonomous cars on the street because even their creators don’t trust them to make it.
- Thorny_Thicket ( @Thorny_Thicket@sopuli.xyz ) 2•1 year ago
Most people consider cruise control a quite useful feature, though it still requires you to pay attention so that you stay in your lane and don’t run into a slower vehicle in front of you. You can then keep adding features such as radar for adaptive cruise control and lane assist, and this further decreases the amount of stuff you need to pay attention to, but you still need to sit there behind the wheel watching the road. These self-driving systems in their current form are no different. They’re just further along the spectrum towards self-driving. Some day we’ll reach the point where you sitting in the driver’s seat just introduces noise to the system, so you’d be better off taking a nap in the back seat. We’re not there yet, however. This is still just super-sophisticated cruise control.
It’s kind of like chess engines. First humans are better than computers. Then computer + human is better than just the computer, and then at some point the human is no longer needed and the computer will from then on always be better.
- Rikudou_Sage ( @rikudou@lemmings.world ) English7•1 year ago
I don’t feel like this is what we were talking about - at least I was talking about cars that drive alone.
- Thorny_Thicket ( @Thorny_Thicket@sopuli.xyz ) 2•1 year ago
Well, Cruise is offering a fully driverless taxi service where they don’t require you as a passenger to pay attention to traffic and take control if needed, so it’s not fair to say that they don’t trust it.
With Tesla, however, this is the case, but despite their rather aggressive marketing they still make it very clear that this is not finished yet: you’re allowed to use it, but you’re still the driver and its safe use is your responsibility. That’s the case with the beta version of any software; you get it early, which is what early adopters like, but you’re expected to encounter bugs, and that’s the trade-off you have to accept.
- Kornblumenratte ( @Kornblumenratte@feddit.de ) 4•1 year ago
The discussed incident does not involve driving assist systems, driverless autonomous taxis are already on the streets:
A number of Cruise driverless vehicles were stopped in the middle of the streets of the Sunset District after Outside Lands in Golden Gate Park on Aug. 11, 2023.
- TheHalc ( @TheHalc@sopuli.xyz ) 2•1 year ago
take responsibility [… like] human drivers do.
But do they really? If so, why’s there the saying “if you want to murder someone, do it in a car”?
I do think self-driving cars should be held to a higher standard than humans, but I believe the fundamental disagreement is in precisely how much higher.
While zero incidents is naturally what they should be aiming for, it’s more of a goal for continuous improvement, like it is for air travel.
What liability can/should we place on companies that provide autonomous drivers that will ultimately lead to safer travel for everyone?
- Rikudou_Sage ( @rikudou@lemmings.world ) English3•1 year ago
Well, the laws for sure aren’t perfect, but people are responsible for the accidents they cause. Obviously there are plenty of exceptions, like rich people, but if we’re talking about the ideal real-life scenario, there are consequences for causing an accident. Whether those consequences are appropriate or not is for another discussion.
- abhibeckert ( @abhibeckert@beehaw.org ) 2•1 year ago
While zero incidents is naturally what they should be aiming for, it’s more of a goal for continuous improvement, like it is for air travel.
As far as I know, proper self driving (not “autopilot”) AVs are pretty close to zero incidents if you only count crashes where they are at fault.
When another car runs a red light and smashes into the side of an autonomous vehicle at 40 mph… it wasn’t the AV’s fault. Those crashes shouldn’t be counted against the AV, but as far as I know they currently are in most stats.
What liability can/should we place on companies that provide autonomous drivers that will ultimately lead to safer travel for everyone?
I’m fine with exactly the same liability as human drivers have. Unlike humans, who are motivated to drive dangerously for fun or get home when they’re high on drugs or continue driving through the night without sleep to avoid paying for a hotel, autonomous vehicles have zero motivation to take risks.
In the absence of that motivation, the simple fact that insurance against accidents is expensive is more than enough to encourage these companies to continue to invest in making their cars safer. Because the safer the cars, the lower their insurance premiums will be.
Globally insurance against car accidents is approaching half a trillion dollars per year and increasing over time. With money like that on the line, why not spend a lazy hundred billion dollars or so on better safety? It won’t actually cost anything - it will save money.
- jarfil ( @jarfil@beehaw.org ) 1•1 year ago
the safer the cars, the lower their insurance premiums will be.
Globally insurance against car accidents is approaching half a trillion dollars per year
That… almost makes it sound like the main opposition to autonomous cars would be insurance companies: they can’t earn more by raising premiums if there are no accidents and a competing insurance company can offer much cheaper coverage.
- upstream ( @upstream@beehaw.org ) 16•1 year ago
I saw a video years ago discussing this topic.
How good is “good enough” for self-driving cars?
The bar is much higher than it is for human drivers because we downplay our own shortcomings and think that we have less risk than the average driver.
Humans can be good drivers, sure. But we have serious attention deficits. This means it doesn’t take a big distraction before we blow a red light or fail to observe a pedestrian.
Hell, a lot of humans fail to observe and yield to emergency vehicles as well.
None of that is newsworthy, but an autonomous vehicle failing to yield is.
My personal opinion is that the Cruise vehicles are about as ready for operational use as Tesla’s FSD, i.e. they should not be allowed.
Obviously corporations will push to be allowed so they can start making money, but this is probably also the biggest threat to a self-driving future: being regulated so strongly that humans end up being the ones in the driver’s seat for another few decades - with the cost in human lives that involves.
- millie ( @millie@beehaw.org ) 1•1 year ago
By definition nearly half of us are better than average drivers. Given that driving well is a matter of survival, I’ll take my own driving ability over any autonomous vehicle until they’re safer than 99% of drivers.
- upstream ( @upstream@beehaw.org ) 3•1 year ago
I mean, that’s an obvious one.
But how much better would it need to be? 99.9%, or 99.9999999999999999999999%, or just 99.01%?
A lot of people will have qualms as long as the chance of dying is higher than zero.
People have a very poor understanding of statistics and will cancel holidays because someone in the vicinity of where they’re going got bitten by a shark (the current 10-year average of unprovoked shark bites is 74 per year).
Similarly we can expect people to go “I would never get into a self-driving car” when the news inevitably reports on a deadly accident even if the car was hit by a falling rock.
And then there’s the other question:
Since 50% of drivers are worse than the average - would you feel comfortable with those being replaced by self driving cars that were (proven to be) better than the average?
- millie ( @millie@beehaw.org ) 2•1 year ago
Given that I have no way of communicating with the driverless car, and communication is often important to driving, I’d rather have the kinda-bad human driver. I can compensate for their bad driving when I spot it and give them room. Or sometimes I can even convey information that helps them be safer while they’re not paying attention. I’ve definitely stopped crashes that didn’t involve me by using my horn.
There’s no amount of discussion or frantic hand waving that will alter the course of an automated vehicle.
- upstream ( @upstream@beehaw.org ) 2•1 year ago
I think you’re optimistic about communicating with the worst percentile of drivers, but I can’t argue with your reasoning.
- millie ( @millie@beehaw.org ) 1•1 year ago
Once I was driving down what had become a narrow street with high snow banks when I came across an older woman stuck between the banks repeatedly backing into the door of her neighbor’s car as she tried to get out of her driveway. After watching her do this for a couple of minutes I offered to get her car straightened out for her. She was ecstatic and about 30 seconds later we were both able to go about our days.
- jarfil ( @jarfil@beehaw.org ) 1•1 year ago
discussion or frantic hand waving
I don’t think drivers are supposed to communicate like that… but it raises a better question: how is a cop directing traffic supposed to communicate with a driverless car?
If there is no mechanism in place, that’s a huge oversight… and if there is one, why wasn’t it used in this case?
- Turun ( @Turun@feddit.de ) 9•1 year ago
I’m not gonna join in the discussion, but if you cite numbers, please don’t link to the advertising website of the company itself. They have a strong interest in cherry picking the data to make positive claims.
- Thorny_Thicket ( @Thorny_Thicket@sopuli.xyz ) 3•1 year ago
These companies are the only ones with access to those stats. Nobody else has them. The alternative here is to not cite stats at all. If you think the stats are wrong, you can go find an alternative source and post it here.
- Turun ( @Turun@feddit.de ) 3•1 year ago
If they do not give researchers access to the data, then I can guarantee you they are cherry picking their results. A research paper in a reputable journal would be easy publicity and create a lot of trust in the public.
- Baggins ( @baggins@beehaw.org ) English8•1 year ago
They can’t come quick enough for me. I can go to work after a night out without fear I might still be over the limit. I won’t have to drive my wife everywhere. Old people will not be prisoners in their own homes. No more nobheads driving about with exhausts that sound like a shoot-out with the cops. No more arseholes speeding about and cutting you up. No more hit-and-runs. Traffic accident numbers falling through the floor. In fact it could even get to a point where the only accidents are the fault of pedestrians/cyclists not looking where they are going.
- nous ( @nous@programming.dev ) English16•1 year ago
All of these are solved by better public transport, safe bike routes, and more walkable city designs. All of which we can do now, without relying on some new shiny tech just to keep car companies’ profits up.
- Thorny_Thicket ( @Thorny_Thicket@sopuli.xyz ) 4•1 year ago
The possibilities really are endless.
When the light turns green, the entire row of cars can start moving at the same time, like in motorsports. Perhaps you don’t even need traffic lights, because they can all just drive into the intersection at the same time and keep barely missing each other but never crash, thanks to the superior reaction times and processing speeds of computers. You could also let your car taxi other people around when you don’t need it.
- Confetti Camouflage ( @Confetti_Camouflage@pawb.social ) English10•1 year ago
What if we tied that entire row of cars together as one unit so we could save cost on putting high end computers in each car? Give them their own dedicated lane because we will never have 100% fully autonomous cars on the road unless we make human drivers illegal.
I’ll call my invention a train.
- Baggins ( @baggins@beehaw.org ) English5•1 year ago
I think you might need lights for pedestrians at crossings.
I did wonder if ambulances would need sirens but again, pedestrians!
- Kornblumenratte ( @Kornblumenratte@feddit.de ) 4•1 year ago
Just ban pedestrians. Problem solved.
- TehPers ( @TehPers@beehaw.org ) English4•1 year ago
The day I can get in a car and not be simultaneously afraid of my own shortcomings and of the fact that there are strangers driving massive projectiles around me is a day I will truly celebrate. The fact is that automobiles are weapons, and I don’t want to be the one wielding one when a single mistake can cost an entire family their lives, although I would like to be there to slam on the brakes and prevent it if needed.
- const_void ( @const_void@lemmy.ml ) 4•1 year ago
For me it’s because they’re controlled by a few evil companies. I’m not against them in concept. Human drivers are the fucking worst.
- WagesOf ( @WagesOf@artemis.camp ) 2•1 year ago
It sure would be nice if the bar was the rational one of “better”, but people aren’t rational. It’s literally never going to be good enough, because even if it were perfect, people still wouldn’t accept it being used.
- Thorny_Thicket ( @Thorny_Thicket@sopuli.xyz ) 3•1 year ago
I think one of the big psychological issues with self-driving cars, one people find really hard to come to terms with, is that even with the best systems accidents are bound to happen, and without a driver there’s no one to blame - and we hate that.
- Rikudou_Sage ( @rikudou@lemmings.world ) 7•1 year ago
There is - the company. Which they obviously don’t like. I think a huge chunk of people would be fine with them if the companies took responsibility.
- TehPers ( @TehPers@beehaw.org ) English4•1 year ago
I remember something about Mercedes taking liability when self driving is active, although I don’t know if that still holds. Still, this seems like something that can be approached with proper legislation, assuming we can get past the lobbying BS in the US (though the EU will probably make the right call much sooner).
- Rikudou_Sage ( @rikudou@lemmings.world ) English5•1 year ago
Yep, I’m pretty confident they won’t be legally driving autonomously on EU roads until they conform to fairly strict legislation, which I’m pretty sure will include company liability.
Nice of Mercedes to do the right thing without being forced to, that’s surprisingly rare.
- TehPers ( @TehPers@beehaw.org ) English2•1 year ago
I believe they’re already allowed in Germany actually, although their autonomous driving feature is very limited in where it can be activated. Hopefully other vehicle manufacturers follow suit and take liability when doing autonomous driving (as opposed to “assisted driving”, which many vehicles currently have).
- gelberhut ( @gelberhut@lemdro.id ) English12•1 year ago
Are you talking about AVs, or about human drivers, who drive drunk, overtired, after a bad night, emotional, or texting while driving, etc.?
- NoPro ( @NoPro@kbin.social ) 5•1 year ago
drive drunk, texting while driving
those things are also illegal, mind you
- gelberhut ( @gelberhut@lemdro.id ) English4•1 year ago
some are, some are not, but they happen.
My point was, there are some cases where human drivers (still) act better, but there are a lot of other cases where they act worse (for many different reasons). And if a single indirectly lethal case means that “Those damn things are not ready to be used on public roads”, then human drivers are not ready either - they are responsible for far more lethal cases (per whatever unit you count).
- sub_o ( @sub_@beehaw.org ) English52•1 year ago
From what I’ve read, some of these driverless car companies in the US are releasing their fleets and flooding the streets 24/7. Some of them take up parking spaces, cause traffic jams, or just stall in the middle of the road.
Maybe it’s different in Europe, where there’s stricter regulation - judging from the comments here, many of those who are okay with driverless cars are from European countries. Unless you own stock in those companies, in which case there’s an incentive-driven bias.
Just like how drugs need to go on multiple clinical trials before going on the mass market, I believe that if you want driverless vehicles, a lot of testing is needed.
But this is not a testing / data-gathering phase: Cruise has 300 cars at night and 100 during the day in SF, while Waymo has around 250 cars. Again, this is not a testing phase - there’s no driver to take over in case things go wrong; these are actual driverless taxis that charge people.
The main rationale of these companies is not to bring a safer environment with driverless cars; the main rationale is to get rid of the gig workers who cause problems for Uber or Lyft - problems such as demanding a living wage, proper employment status, unions, etc.
If you want to look at a better approach, maybe look at how Singapore is doing it
- it’s operated by the SMRT and SBS bus operators, which are regulated and government-owned
- it’s a self-driving bus
- “drivers will remain essential to the operation of autonomous vehicles even when these do take off, although their job scope will change”
So if you want to support something, maybe don’t support what Cruise is doing, but rather what Singapore is doing:
- it’s still highly regulated
- it’s a bus, i.e. public transportation, so it still helps in tackling climate issues
- it’s not being used to fire workers
- there’s still a failsafe: drivers are on standby in case the bus goes haywire
- query ( @query@beehaw.org ) English3•1 year ago
Empty cars on roads, or anywhere they don’t need to be, should be treated like empty residential properties should be: tax them for wasting resources that others could use.
- jon ( @jon@lemmy.tf ) English28•1 year ago
Maybe don’t allow autonomous cars on public streets then? The tech is nowhere near ready for prime time.
- abhibeckert ( @abhibeckert@beehaw.org ) 19•1 year ago
We should ban police cars too - because allegedly an empty police car was also blocking the ambulance.
The AV spokesperson said they reviewed the footage and found there was room to pass their vehicle safely and another ambulance and other cars did so.
Or ban police 🤔
- sanzky ( @sanzky@beehaw.org ) 1•1 year ago
and cars
- ryan ( @ryan@the.coolest.zone ) 25•1 year ago
When these things were originally being tested, at least the Waymo ones I’m familiar with, there was a driver who could manually override in case of issues. Honestly, if these things still have issues with emergency situations (and other unexpected situations), they absolutely still need a driver with the ability to manually override the car. That way, they can still test the self-driving function while being able to actually maneuver the car out of the way of things like this.
- Sightline ( @Sightline@lemmy.ml ) English14•1 year ago
Don’t worry, they’ll continue to fail upwards.
- kitonthenet ( @kitonthenet@kbin.social ) 20•1 year ago
These people never should’ve been allowed to beta test with our lives when no one approved it
- thepianistfroggollum ( @thepianistfroggollum@lemmynsfw.com ) English18•1 year ago
I’d really like to see the stats on how many human driver issues they had during the same time span
The wonderful thing about human drivers is that they generally listen to instructions from first responders and are pretty good at realizing when they need to get out of the way. Even when they do not speak English, they are typically responsive to gestures.
Entirely unsurprisingly, existing resources are putting together plans on how to deal with this problem and what they’d like to see in terms of changes from AV operators and the companies which operate them.
- gelberhut ( @gelberhut@lemdro.id ) English11•1 year ago
I disagree that human drivers in general act more responsibly than AVs. And for exactly this use case as well - I’ve read too many stories about emergency vehicles stuck in traffic causing someone’s death.
The only country where the emergency corridor works well is Germany, afaik.
- Rikudou_Sage ( @rikudou@lemmings.world ) 3•1 year ago
They work quite well in Czechia as well.
- gelberhut ( @gelberhut@lemdro.id ) English2•1 year ago
Sounds good, can you share an example? In Germany this is called a “Rettungsgasse” and works (almost always) like here: https://www.youtube.com/watch?v=7kPT7VHVTb8
- Rikudou_Sage ( @rikudou@lemmings.world ) English1•1 year ago
Pretty much the same. A few years ago we even changed the law to be in tune with Germany and our other neighbors - previously the corridor was on the right side, now it’s on the left. IIRC we were the only ones who had it on the right.
- gelberhut ( @gelberhut@lemdro.id ) English1•1 year ago
This sounds really good. Thank you for sharing!
- n2burns ( @n2burns@lemmy.ca ) 12•1 year ago
It’s not really an apples-to-apples comparison. These are taxis, so they should only be compared to professional taxi drivers. Then, unless you’re comparing per-ride statistics, you have to factor in that drivers typically park between customers while AVs roam, leading to additional traffic and more chances for “glitches”.
This is before you begin to consider whether AV taxis are a societal benefit in one of the least car-centric places in the country.
- gelberhut ( @gelberhut@lemdro.id ) English4•1 year ago
I think the taxi use case was selected for AVs not because it’s the most painful area that must be improved (in that case I agree an apples-to-apples comparison would be needed), but because it’s a small, well-controlled domain that’s relatively easy to start in and to build an improvement feedback loop around.
- aeternum ( @aeternum@kbin.social ) 13•1 year ago
I thought this meant Tom Cruise lol.
- paper_clip ( @paper_clip@kbin.social ) 2•1 year ago
To be fair, those Mission Impossible chase scenes really do disrupt traffic.
- CanadaPlus ( @CanadaPlus@lemmy.sdf.org ) 11•1 year ago
I don’t get it, why isn’t there an option for a Cruise employee or a first responder to just take control of the thing when it gets stuck?
- abhibeckert ( @abhibeckert@beehaw.org ) 12•1 year ago
Drive to the right edge of the road and stop until the emergency vehicle(s) have passed
That is a direct quote from the California DMV and from the sounds of it that’s exactly what the autonomous car did.
The right answer, in my opinion, is to allow the first responders to take control of the car. This wasn’t just a lone ambulance that happened upon a stationary car. It was a major crash (where a human driven car ran over a pedestrian) with a road that was blocked by emergency vehicles. A whole bunch of cars, not just autonomous ones, were stopped in the middle of the road waiting for the emergency to be over so they could continue on their way. Not sure why only this one car is getting all the blame.
- CanadaPlus ( @CanadaPlus@lemmy.sdf.org ) 10•1 year ago
I just actually bothered to read the article, and it sounds like it was an empty police car blocking the way between two Cruise cars that had pulled over leaving a space, and there in fact was a way to manually move them but it took critical time.
These cars get stuck all the time and are a major local controversy, so I’m guessing this was the click-baitiest headline they could go with. “Police officer carelessly gets in the way of paramedics” just doesn’t have the same ring.
- Amju Wolf ( @amju_wolf@pawb.social ) English9•1 year ago
Not sure why only this one car is getting all the blame.
Because it generates clicks.
- EquipLordBritish ( @EquipLordBritish@beehaw.org ) 9•1 year ago
Two autonomous Cruise vehicles and an empty San Francisco police vehicle were blocking the only exits from the scene, according to one of the reports, forcing the ambulance to wait while first responders attempted to manually move the Cruise vehicles or **locate an officer who could move the police car**.
So, in conjunction with a cop car, the road was blocked. I’d love to see an actual picture or diagram of the blockage.
- jarfil ( @jarfil@beehaw.org ) 5•1 year ago
These AVs are programmed to give high priority to police cars, ambulances, road works, and whatnot. They’re also happy to interpret what they see in the strictest way possible.
IIRC, there was a YouTube video of one of them going crazy because of a traffic cone… then running away from the operator when they tried to override and correct what it was doing.
It could be as little as cops leaving the car “somewhat” blocking the normal flow of traffic, then the Cruise cars strictly obeying “pull over and wait”, while someone with more common sense might’ve reversed, gone onto the curb, or whatever.
Then again:
Cruise spokesperson Tiffany Testo countered that one of the cars cleared the scene and that traffic to the right of it remained unblocked. “The ambulance behind the AV had a clear path to pass the AV as other vehicles, including another ambulance, proceeded to do,”
…it could’ve been the “blocked” ambulance’s drivers who were on autopilot?
Seems like not enough data to draw a conclusion.
- cicapocok ( @cicapocok@lemm.ee ) 8•1 year ago
Obviously it is a sad story for the deceased and it’s family but according to the cruise spoke person there was supposed to be enough space so the emergency car could pass. And later the article mentioned there were 55 more situations where these cars caused problems. Well there are car accidents everywhere in the word every day because of careless drivers so this is kinda common. So I really don’t think banning these cars should be an answer, but to keep improving them.