Tesla reached a significant milestone in self-driving technology in 2026, with billions of cumulative miles driven on its Full Self-Driving (FSD) system. But what does this new mode of transportation mean for pedestrians? As self-driving cars become more common and Tesla continues expanding FSD to more vehicles, it’s important to stay informed about the risks this technology can pose and who would be liable if an accident were to occur.

What Is Tesla’s Self-Driving System?

Tesla’s Full Self-Driving (FSD) system is an advanced driver-assistance system with automated capabilities that allow the vehicle to handle much of the driving task on its own. It builds on Tesla’s long-standing Autopilot features, which have evolved into this more advanced package. FSD-equipped Teslas include all of the standard Autopilot features plus additional capabilities, including:
  • Traffic-Aware Cruise Control
  • Autosteer
  • Auto Lane Change
  • Navigation on Autopilot
  • Autopark
  • Smart Summon
  • Traffic Light and Stop Sign Control
  • Autosteer on City Streets
Because these advanced features allow Tesla models to practically operate on their own, Tesla has marketed them under the name “Full Self-Driving.” Despite the name, FSD is still classified as a Level 2 driver-assistance system, meaning the driver is required to remain fully attentive and ready to take control at all times.

Common Pedestrian Risks

As self-driving cars become a bigger part of the future of driving, what risks do they pose to pedestrians, and how many accidents are actually caused by self-driving vehicles?

Failure to Detect Pedestrians

FSD vehicles’ advanced autonomous features have failed to consistently detect pedestrians in crosswalks, cyclists, and children, and we have already seen major lawsuits and headlines related to self-driving vehicles. Recently, a 2-year-old girl was struck and killed by a Tesla backing into the driveway where she was playing; the driver was a visitor to the home. The investigation is ongoing, but because the driver was not impaired, the vehicle’s automated features may have been a factor in the accident.

Low Visibility Conditions

Low-visibility conditions make it harder for Tesla’s autonomous features to detect a pedestrian, and several recent investigations have focused on self-driving vehicle crashes in these conditions. The National Highway Traffic Safety Administration (NHTSA) has reported four Tesla crashes that happened in areas with low visibility, including sun glare, fog, and airborne dust.

Traffic Violation Behaviors

Another NHTSA investigation, opened in 2025, covers nearly 2.9 million Tesla vehicles equipped with FSD, following reports of vehicles running red traffic signals and initiating lane changes into opposing traffic. NHTSA has received 58 reports of such safety violations linked to Tesla vehicles using FSD.

Who Is Liable When a Pedestrian Is Hurt?

With the rise of self-driving cars, determining your rights and who is liable after an accident can be confusing, and the answer varies from case to case. An extensive investigation should be conducted to determine fault, with close attention to both the circumstances of the crash and how the autonomous system was operating. Some of the questions to ask include:
  • Was the vehicle operating in FSD mode?
  • Was the pedestrian properly detected?
  • Was there a system lag, or was a software update needed?
  • Were roadway markings clear?
  • Was the pedestrian following traffic safety rules?

What Pedestrians Should Know

As more self-driving vehicles share the road with pedestrians, it’s important for pedestrians and drivers alike to stay aware of the inconsistencies commonly seen in autonomous vehicles. For pedestrians, staying visible in a crosswalk or on a sidewalk is crucial. Don’t assume that a self-driving vehicle will stop automatically, especially in low-visibility conditions. Stay aware of your surroundings and know your rights so you can document the scene and seek legal counsel if an accident occurs.