Tesla's Full Self-Driving (FSD) software has been under intense regulatory scrutiny due to safety concerns and allegations of misleading marketing. Despite being marketed as a step toward autonomous driving, the FSD system has been linked to numerous incidents, including fatalities, leading to investigations by regulatory bodies.

Safety Incidents and Investigations
The National Highway Traffic Safety Administration (NHTSA) has initiated multiple investigations into Tesla's Autopilot and FSD systems. As of 2024, there have been over 50 reported fatalities involving Tesla vehicles with these systems engaged. In response, Tesla issued a recall in December 2023, updating the software to address identified safety issues. However, the NHTSA opened a recall query in April 2024 to assess the effectiveness of this update.

Regulatory Actions and Legal Challenges
Tesla's marketing practices have also come under fire. In 2022, the California Department of Motor Vehicles (DMV) accused Tesla of misleading advertising regarding the capabilities of its Autopilot and FSD systems. This led to legal proceedings and the enactment of California's SB 1398, a law that prohibits the use of terms like "Full Self-Driving" to describe Level 2 driver assistance systems.