The National Highway Traffic Safety Administration released on Wednesday nine months of crash data from vehicles using driver-assist technologies like Tesla Autopilot as well as fully autonomous vehicles like Waymo’s robotaxis. NHTSA broke the crash data into two categories based on the level of automation: driver-assist systems, which provide speed and steering input, and fully autonomous technologies, which are intended to one day function safely without human intervention.

NHTSA found that there were 367 crashes in the last nine months involving vehicles that were using driver-assist technologies. Of those, 273 involved a Tesla system, either its “full self-driving” software or its precursor, Tesla Autopilot. There were 130 crashes involving fully automated driving systems, 62 of which were Waymo crashes. Transdev, a shuttle operator, reported 34 crashes, and Cruise, which offers robotaxis for General Motors in San Francisco, reported 23.

The data lacks critical context such as fleet size and the number of miles traveled, making it impossible to fairly compare the safety of the different technologies. Not all relevant crashes may be included in the data set, NHTSA said, because crash data recording may vary widely among manufacturers.

“I would advise caution before attempting to draw conclusions based only on the data we’re releasing. In fact, the data alone may raise more questions than they answer,” NHTSA administrator Steven Cliff told reporters in a briefing Tuesday.

Two of the technologies with the most reported crashes are also two of the most commonly used systems. Tesla Autopilot, for example, comes standard on all of the company’s vehicles, unlike competing driver-assist systems from other automakers. Drivers say they use Autopilot regularly because it leaves them feeling less fatigued after long drives.
Waymo, the other company with the most total crashes, operates the most extensive robotaxi service in the country, with operations in much of metropolitan Phoenix, Arizona, and San Francisco.

For the first time, automakers and robotaxi operators have had to report data to NHTSA about crashes involving these vehicles. NHTSA says it will use the data to identify safety issues and intervene as necessary. Pony.ai, which is testing robotaxis in California, recalled three of its vehicles this year following data NHTSA gathered through this process.

Of the 497 total crashes, 43% occurred in California. The state is home to Silicon Valley, making it a hotspot for testing new technologies. NHTSA found that among the 367 driver-assist crashes reported, there were six fatalities and five serious injuries.

The safety risks of these new technologies have drawn the attention of safety advocates for years. There are no specific regulations for driver-assist systems, leaving automakers free to market and describe the systems as they choose. Tesla’s Autopilot and “full self-driving” software have been especially controversial. NHTSA’s investigation into Teslas rear-ending first responders’ vehicles was expanded last week and could lead to a recall. The National Transportation Safety Board has investigated fatal crashes involving Autopilot and called for the automaker to make changes, such as developing technology to more effectively sense the driver’s level of engagement and alert them when their engagement is lacking.

Tesla has released data since 2018 claiming that Autopilot has a lower crash rate per mile than typical driving. But safety experts caution that Tesla’s analysis compares apples to oranges, as most Autopilot driving takes place on highways, where crash rates per mile are much lower than for driving overall. Tesla states that drivers using Autopilot must remain alert and be prepared to take full control of the vehicle at a moment’s notice.
However, drivers using technologies like Autopilot risk becoming distracted, experts say. A 2021 MIT study found that Tesla drivers looked away from the road more frequently while using Autopilot than when driving without the driver-assist system. NHTSA said that its investigation into Teslas rear-ending emergency vehicles while using Autopilot found that in 37 of the 43 crashes with detailed car log data available, drivers had their hands on the wheel in the last second before the collision.

For years, Tesla relied on detecting torque on the steering wheel to determine whether a driver was engaged. It has since begun using an in-car camera to detect distraction, which many safety experts consider a superior method because cameras can track eye movement.

“We see value in having nationally standardized and uniform crash reporting during this early stage of the development and deployment of autonomous driving technology, and there’s public benefit in NHTSA sharing its findings,” Waymo said in response to the data. Tesla did not respond to a request for comment.