Comments by "SmallSpoonBrigade" (@SmallSpoonBrigade) on "The Hidden Autopilot Data That Reveals Why Teslas Crash | WSJ" video.
And we the people who drive have a right to know whether other vehicles on the road are safe to operate in that mode, especially given that the rules about whom to sue after a crash aren't really clear.
61
Possibly, he may also have still been waking up. Driving a car at that hour of the morning is oftentimes extremely boring.
11
@daydreamer8373 Eventually, but there's no reason it needs to be this dangerous. Tesla is just pushing things far faster than what their technology can handle. If they were dead set on operating these with just cameras, there should have been more cameras, and there should have been a period of years where all the autopilot system did was identify potential hazards in the road and flag the information for adding to the model. And any time the vehicle crashed into something or was crashed into, that should have been sent off to be considered for inclusion. With a sufficient amount of that information, there should have been a period where drivers were allowed to engage it for no more than 5-10 minutes at a time, a short enough period that they could still reliably focus on monitoring what the car was doing. As things progressed, they could eventually allow longer periods, but how Tesla has handled it has been extremely irresponsible and has led to unnecessary fatalities among people who weren't even in the vehicles.
7
If a human didn't have enough light, the human would slow down so that there was enough stopping time. The Tesla in question apparently didn't.
7
@DannyPops One of the big issues is that Elon has an extremely large mouth and a long history of overpromising what they can deliver. And they're still having issues like people being trapped in their cars if the software doesn't think they should be allowed to open the doors and the cars still can't figure out how far away motorcycles are.
4
@snugglesjuggler It's harder to do than you'd think. The real problem is that Tesla has tried to skip a stage where all the computer does is record a ton of telemetry information about the cars as they drive around, compiled into a library that can be used to train the system. There's only so much you can do with professional test drivers; they aren't necessarily going to be driving in the sorts of conditions that regular people do. And as for things like that Tesla slamming into an upended semi, that doesn't happen a lot, and it's going to take an extremely large number of such crashes before the system learns to deal with them. Doubly so, since they don't appear to be using data from the situations where regular drivers actually encounter things like that.
2
@madhavdhilip The issue is that Tesla is being way too aggressive about pushing the automatic driving capabilities. What they should have done was spend a period of time with the system just trying to predict things like this and what the driver would do, feeding all of that into the AI, before even allowing drivers to have the level of self-driving they've got. These systems basically all suffer from the fact that they can't replicate the experience you get from millions of drivers driving billions of miles every year.
2
That's something that people don't seem to get. We humans mainly use our eyes when driving, but we also use our ears and our sense of touch. A car can use far more sensors, and the only real limits are how many can be processed quickly enough to keep up with the car as it drives, and the ability of car buyers to afford the systems. Also, these things have a tendency to get cheaper as manufacturing volumes increase. From what I can tell, a LIDAR array for a car is well under $1k at this point, and just the savings on insurance premiums from having the extra level of ability should more than pay for it within a few years.
2
@RichardBaran Yes, and not just that: Elon has been falsely claiming that they'd have full self-driving capabilities "any year now," and that's been going on for years.
2
Possibly. Or at that time of day, the driver might have been suffering from "white line fever," or been too tired to properly register what was happening, or perhaps overestimated what the autopilot was capable of doing. But it's been far too many years of these cars crashing into inanimate objects for them to still be allowed to sell the cars without any sensors to augment the cameras.
2
That's why other manufacturers use terms like "adaptive cruise control" and "lane centering" to describe their capabilities. It's much clearer what the car can do, without being misleading.
2
So, like P.T. Barnum said, there's a sucker born every minute. TSLA stock isn't priced like that on the basis of anything they're doing.
1
They can mostly all be avoided by driving at a rate appropriate to conditions. The only exception would be crashes like when a kid pops out from behind a parked car.
1
@daydreamer8373 It's not; the point is to have several different redundant types of sensors generating the data the car needs to understand what's going on in the area. Even human drivers use more than just their eyes when driving. We use our ears, and even our feel for the vibrations of the road, to help inform our sense of what's going on. Nobody should be saying that LIDAR is magic, but LIDAR does help the car understand a lot more about where physical things are in space, and if you add radar data and cameras, you can get a pretty comprehensive sense of what's going on around the car with which to have the AI make decisions.
1
@sblackm1 IMHO, a large part of the problem is that they are trying to do it without the sensors. My wife's Ford has a camera as well as some sensors, and I don't personally trust it to stop suddenly if something does enter the road. On paper it is supposed to, and it does seem to know how far ahead the car in front of me is when I'm using adaptive cruise control, but I haven't seen it actually use the emergency brake yet, and I operate under the assumption that it might not work, so I'll stop on my own before it kicks in. That being said, I wish the adaptive cruise control were set up to go even slower, because in traffic it keeps kicking me back to manual acceleration, and that really tires me out.
1
@JustXavier Cameras can be enough; however, the amount of training it would take to cover every possible scenario is enormous. It's ultimately cheaper to use LIDAR and other auxiliary sensors than to try to get a camera-only system to work. The sensors are expensive, but their cost has been coming down as the quantities being bought have increased, generating better economies of scale and more experience integrating them into designs.
1
Originally it was about cost and Elon's belief that it wasn't needed. It can add some complexity in terms of interpreting what the car sees, but if the system saw that many points in the road ahead, at bare minimum it should have slowed down.
1
@TheTomexification Yes, and it's unlikely that things have changed appreciably since then, as they're still using just the cameras and whatever AI improvements they could generate based on their own test drives. Things like the car slamming into an upended semi are just not easy to train for, as they don't happen that often. I've seen precisely one overturned cement truck in my life, and another time I saw a boat that had been flipped by the wind. I'd be terrified of coming across either while depending on a camera-only AI detection system, as those things are unlikely to come up during training.
1
@daydreamer8373 Teslas still regularly drive over the top of motorcyclists because they can't figure out how far away the motorcycle is using only the cameras. LIDAR and other sensors are needed to help these systems reduce issues like that. We may well get to a point where the extra sensors aren't needed, but we won't get there any time soon, and Elon seems to genuinely think that we're already there in terms of the AI modeling.
1
@Joelsef It's because Elon keeps promising that they'll move fast and break things. They have to move fast with things like this, because Elon refused to hire anybody who understood how to build a car factory.
1
@AtleyCarman-xe7rs Typically a human is expected to drive slower if they are having trouble seeing ahead of them; the computer vision system doesn't seem to do that.
1
It just uses camera feeds. I don't know about those cameras in particular, but the backup camera that I bought for my wife's old car was capable of seeing in some pretty dark conditions, so there's no reason why the cameras used on these cars shouldn't be able to see in near-complete darkness. Whether they're properly trained to interpret what they see at night is likely the issue.
1