Robotaxi, Waymo, and others

Oh, when Waymo does something bad it’s a software bug, but when Tesla does something bad, it’s an inferior product. Lol. Biased much? Just deal with it.

Is that a real stat or just what you have observed?

Per the company, they are doing many rides. I did not see a breakout of the data by city. Or do you think they are just lying?

Waymo is delivering more than 250,000 paid robotaxi rides a week, Alphabet said in its April earnings report.

Read better. It’s a serious problem.

Lack of hardware is an inferior product. But the story you posted is not about hardware. It’s a software glitch. And these can be very serious.

Do you see the difference?

What I have observed. I have my own car, so I don’t generally use cabs, Ubers, Lyfts, etc. However, when I see the Waymo vehicle driving around my office building (which it does regularly), it is always empty. I did a quick Google search, and the responses were basically that there is no data for the Atlanta market specifically, but that tens of millions have booked Waymo rides since it started in 2017. That seems like hopium if you ask me. I would be very skeptical.

Is every taxi, Uber full when you see it? I mean, half the time it’s usually on its way to pick someone up, right? That’s 50%+ right there, plus your confirmation bias.

Is every one full? Of course not. But they are occupied more often than not.

This is a problem too. Now, this may be a problem for Waymo as well, but I can’t find any reported instances.

I do find the Waymo method of using audio to help detect trains a bit crazy too.

Audio Detection:
Waymo uses audio receivers to detect the sounds of trains, which helps its vehicles identify and respond to upcoming rail crossings.
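
Waymo hasn’t published how that detector actually works, so for the curious, here’s a minimal toy sketch of what an acoustic train-horn detector could look like: flag frames where most of the spectral energy sits in the horn’s frequency band, sustained over several frames. The band edges, thresholds, and frame sizes are assumptions I picked for the demo, not anything from Waymo.

```python
# Illustrative only: NOT Waymo's implementation. A toy acoustic detector that
# flags sustained energy in the train-horn frequency band. All constants below
# are hypothetical values chosen for the demo.

import numpy as np

SAMPLE_RATE = 16_000          # Hz, assumed microphone sample rate
FRAME_SIZE = 2048             # samples per analysis frame (~128 ms)
HORN_BAND = (250.0, 520.0)    # Hz; North American horns chord around ~311-466 Hz
ENERGY_RATIO_THRESHOLD = 0.5  # fraction of frame energy inside the horn band
MIN_CONSECUTIVE_FRAMES = 4    # require a sustained tone, not a transient

def frame_has_horn_energy(frame: np.ndarray) -> bool:
    """Return True if most of this frame's spectral energy sits in the horn band."""
    windowed = frame * np.hanning(len(frame))
    spectrum = np.abs(np.fft.rfft(windowed)) ** 2
    freqs = np.fft.rfftfreq(len(frame), d=1.0 / SAMPLE_RATE)
    band = (freqs >= HORN_BAND[0]) & (freqs <= HORN_BAND[1])
    total = spectrum.sum()
    return total > 0 and spectrum[band].sum() / total >= ENERGY_RATIO_THRESHOLD

def detect_horn(signal: np.ndarray) -> bool:
    """Scan a mono signal frame by frame; flag a sustained horn-like tone."""
    consecutive = 0
    for start in range(0, len(signal) - FRAME_SIZE, FRAME_SIZE):
        if frame_has_horn_energy(signal[start:start + FRAME_SIZE]):
            consecutive += 1
            if consecutive >= MIN_CONSECUTIVE_FRAMES:
                return True
        else:
            consecutive = 0
    return False

if __name__ == "__main__":
    t = np.arange(SAMPLE_RATE) / SAMPLE_RATE  # 1 second of audio
    # Synthetic "horn": a three-note chord near real horn frequencies.
    horn = sum(np.sin(2 * np.pi * f * t) for f in (311.0, 370.0, 466.0))
    noise = np.random.default_rng(0).normal(scale=0.1, size=t.shape)
    print("horn + noise:", detect_horn(horn + noise))   # expected: True
    print("noise only:  ", detect_horn(noise))          # expected: False
```

A real system would presumably fuse audio with cameras, maps, and lidar; a band-energy trick alone would false-positive on anything loud in that frequency range.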

But back to the Tesla thing…

Now, two U.S. Senators are calling on the National Highway Traffic Safety Administration (NHTSA) to investigate Tesla FSD in relation to how it handles railroad crossings, which, according to several accounts, it does not handle well.

Thought this was interesting: Tesla moves the safety person to the driver’s seat for highway driving in its robotaxis.

Traditionally, they sit in the passenger’s seat. During highway driving, they move to the driver’s seat.

I guess putting the safety person in the passenger seat is to “prove” it is self-driving. But I don’t know how they could intervene as quickly from the passenger seat, so I understand why they would move.

Again, it’s a regulation during the trial phase. I’ve already linked articles showing Waymo doing the same when they rolled out their service, like they have to do in their current trials in New York, for example. But you guys don’t care about reading them, so there’s that. Just deal with it.

Correction: it’s the law in NY.

Another hurdle is that there is no permitting structure in New York that allows Waymo or any other AV company to test or deploy robotaxis without a human safety driver. While legislation has been introduced to create a framework for driverless operation, nothing has been passed into law yet.

Yes, like I said, if you read the articles I posted. But you only read the articles your feed gives you, I guess.

What you said is incorrect. There is currently no law passed to allow fully autonomous driving in NY. It’s not a trial-rollout thing per se. See the difference?

So why is Waymo testing there? See the logic? Is Waymo going to have drivers?

I get it. You want to cast anything Tesla in the absolute worst light, and will even spin anything that contradicts that.

All AVs in New York are required to have a human safety driver. It makes no difference whether testing has successfully completed or not, or which company is doing it. The state is working to get the law changed.

Once the law is passed, look for Waymo to quickly move to fully autonomous operation, like they did in Austin.

Why do you continue to not accept the actual status of the companies doing AV, and continue with this odd belief that it’s some media conspiracy to get Musk? Facts of the matter are just that: facts. Accept it and move on.

The media at it again!

Oh, never mind… this is the govmint joining in on the conspiracy.

For the TLDR crowd, here is the synopsis.

ODI has identified a number of incidents in which the inputs to the dynamic driving task commanded by FSD induced vehicle behavior that violated traffic safety laws. Although reports of this nature span a variety of behaviors, the reports appear to most commonly involve two types of scenarios. The first type of scenario involves a vehicle operating with FSD proceeding into an intersection in violation of a red traffic signal. The second type of scenario involves FSD commanding a lane change into an opposing lane of traffic.

ODI has identified six Standing General Order (“SGO”) reports in which a Tesla vehicle, operating with FSD engaged, approached an intersection with a red traffic signal, continued to travel into the intersection against the red light and was subsequently involved in a crash with other motor vehicles in the intersection. Of these incidents, four crashes resulted in one or more reported injuries.

October 7, 2025 | NHTSA Action Number: PE25012 | Open Investigation

Traffic safety violations while Full Self Driving (“FSD”) is engaged

NHTSA Action Number: PE25012

Components: ELECTRICAL SYSTEM

Opened From: October 7, 2025 – Present

Summary

The Office of Defects Investigation (“ODI”) is opening this Preliminary Evaluation (PE) to assess the scope, frequency, and potential safety consequences of FSD executing driving maneuvers that constitute traffic safety violations. This investigation concerns versions of FSD that Tesla has labeled as “FSD (Supervised)” and “FSD (Beta).” Tesla characterizes FSD as an SAE Level 2 partial automation system requiring a fully attentive driver who is engaged in the driving task at all times. Level 2 partial automation systems are designed to support and assist the driver in performing certain aspects of the driving task, requiring a driver to supervise and intervene as necessary. The driver remains fully responsible at all times for driving the vehicle, including complying with applicable traffic laws. ODI’s investigation will therefore focus, in particular, on whether certain driving inputs within the control authority of FSD forestall the driver’s supervision when they are unexpectedly performed.

ODI has identified a number of incidents in which the inputs to the dynamic driving task commanded by FSD induced vehicle behavior that violated traffic safety laws. Although reports of this nature span a variety of behaviors, the reports appear to most commonly involve two types of scenarios. The first type of scenario involves a vehicle operating with FSD proceeding into an intersection in violation of a red traffic signal. The second type of scenario involves FSD commanding a lane change into an opposing lane of traffic.

With respect to the first type of scenario, ODI has identified 18 complaints and 1 media report alleging that a Tesla vehicle, operating at an intersection with FSD engaged, failed to remain stopped for the duration of a red traffic signal, failed to stop fully, or failed to accurately detect and display the correct traffic signal state in the vehicle interface. Some complainants also alleged that FSD did not provide warnings of the system’s intended behavior as the vehicle was approaching a red traffic signal.

ODI has identified six Standing General Order (“SGO”) reports in which a Tesla vehicle, operating with FSD engaged, approached an intersection with a red traffic signal, continued to travel into the intersection against the red light and was subsequently involved in a crash with other motor vehicles in the intersection. Of these incidents, four crashes resulted in one or more reported injuries. At least some of the incidents appeared to involve FSD proceeding into the intersection after coming to a complete stop. ODI’s pre-investigative work, including coordination with the Maryland Transportation Authority and State Police, indicated that the problem may be repeatable, given that multiple subject incidents occurred at the same intersection in Joppa, Maryland. NHTSA understands that Tesla has since taken action to address the issue at this intersection.

With respect to the second type of scenario, ODI has identified 2 SGO reports, 18 complaints, and 2 media reports alleging that a Tesla vehicle, operating with FSD engaged, entered opposing lanes of travel during or following a turn, crossed double-yellow lane markings while proceeding straight, or attempted to turn onto a road in the wrong direction despite the presence of wrong-way road signs. Likewise, ODI has identified 4 SGO reports, 6 complaints, and 1 media report alleging that a Tesla vehicle, operating with FSD engaged, proceeded straight through an intersection in a turn-only lane or executed a turn at an intersection in a through lane despite the presence of lane markings or signals. Complaints also alleged that FSD did not provide warnings of the system’s intended behavior. Some complaints alleged that more than one of these failures occurred and, as such, the numbers are not cumulative. Some of the reported incidents appeared to involve FSD executing a lane change into an opposing lane of travel with little notice to a driver or opportunity to intervene.

https://www.nhtsa.gov/?nhtsaId=PE25012

Tesla is breaking traffic laws beyond Texas, investigation says


And among the recent stuff on Waymo, I find these very concerning.

These types of incidents highlight to me that these AVs have real flaws in understanding situational scenarios, and real problems adhering to traffic lights, signs, and pavement markings. I’m puzzled that, this far into real-world deployment, these types of failures still happen.

OTOH, if you accept Waymo’s safety analysis based on miles driven, AVs are 80-90% less likely to be involved in recordable motor vehicle incidents than a human driver. As with all stats like this, it probably requires a deeper dive.
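
To make concrete what a claim like that means (and why the deeper dive matters), here’s the arithmetic with made-up placeholder numbers, not Waymo’s actual data:

```python
# Back-of-the-envelope only: all figures below are made-up placeholders to show
# how an "X% fewer incidents per mile" claim is computed, not Waymo's real data.

def incidents_per_million_miles(incidents: int, miles: float) -> float:
    """Normalize an incident count by exposure (miles driven)."""
    return incidents / (miles / 1_000_000)

# Hypothetical exposure and incident counts for each group.
av_rate = incidents_per_million_miles(incidents=20, miles=50_000_000)
human_rate = incidents_per_million_miles(incidents=200, miles=100_000_000)

reduction = 1 - av_rate / human_rate
print(f"AV:    {av_rate:.2f} incidents per million miles")
print(f"Human: {human_rate:.2f} incidents per million miles")
print(f"Claimed reduction: {reduction:.0%}")  # 80% with these placeholder numbers

# The "deeper dive": the human baseline includes highways, bad weather, and
# rural roads, while robotaxi miles are mostly geofenced urban driving, so the
# two rates are not directly comparable without adjusting for the road mix.
```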

The other thing that concerns me is the use of lawsuits to prevent the release of public incident data. Not a fan of this approach.

In January 2022, Waymo sued the California Department of Motor Vehicles (DMV) to prevent data on driverless crashes from being released to the public. Waymo maintained that such information constituted a trade secret.[201] According to The Los Angeles Times, the “topics Waymo wants to keep hidden include how it plans to handle driverless car emergencies, what it would do if a robot taxi started driving itself where it wasn’t supposed to go, and what constraints there are on the car’s ability to traverse San Francisco’s tunnels, tight curves and steep hills.”[202]

In February 2022, Waymo was successful in preventing the release of robotaxi safety records. A Waymo spokesperson claimed that the company would be transparent about its safety record.[203]

Only in California.