Tesla to remove sensors from cars in bet on cameras amid crash probes

Tesla to REMOVE sensors from the vehicle’s self-driving system and instead rely on eight cameras and artificial intelligence after intense scrutiny following a series of crashes

  • Elon Musk’s company is removing ultrasonic sensors that can detect obstacles from its Autopilot system in favor of one based on cameras and AI
  • The sensors, which emit high-frequency sounds that bounce off of objects, will be phased out of Model 3 and Model Y first, with other models impacted in 2023
  • Tesla is facing intense regulatory and legal scrutiny at the federal and state level over a series of crashes involving its self-driving system
  • Musk reportedly told the Autopilot team in 2021 that ‘humans could drive with only two eyes’ and this means ‘cars should be able to drive with cameras alone’

Tesla is removing sensors from its cars as it shifts toward a system based solely on eight cameras that feed information into its self-driving artificial intelligence. 

Ultrasonic sensors (USS), which emit high-frequency sounds that bounce off of potential obstacles, will in the coming months be phased out of new Model 3 and Model Y vehicles sold in North America, Europe, the Middle East and Taiwan, and then globally. They will be phased out of Model S and Model X cars next year.
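The principle behind those sensors is simple time-of-flight ranging: the sensor emits a pulse, listens for the echo and converts the round-trip time into a distance. The short sketch below illustrates that conversion; the function name, the example echo time and the speed-of-sound constant are assumptions made for this illustration, not values taken from Tesla’s software.

```python
# Illustration of ultrasonic time-of-flight ranging, not Tesla's implementation.

SPEED_OF_SOUND_M_S = 343.0  # approximate speed of sound in air at about 20 °C


def echo_time_to_distance(round_trip_seconds: float) -> float:
    """Convert a round-trip echo time into a one-way distance in meters."""
    # The pulse travels to the obstacle and back, so halve the total path.
    return SPEED_OF_SOUND_M_S * round_trip_seconds / 2.0


if __name__ == "__main__":
    # An echo returning after 5.8 milliseconds puts the obstacle roughly 1 m away.
    print(f"{echo_time_to_distance(0.0058):.2f} m")
```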

The announcement from the company led by CEO Elon Musk comes as Tesla is facing intense regulatory and legal scrutiny over a series of crashes involving its self-driving system. 

Data from the National Highway Traffic Safety Administration (NHTSA) identified 392 reported accidents as of May 2022 involving cars with driver-assist features – out of those, 273 involved Teslas.

Last year, Musk reportedly told members of the Autopilot team that ‘humans could drive with only two eyes and that this meant cars should be able to drive with cameras alone.’ Cameras and artificial intelligence will be the primary ways Teslas recognize the environment. 

Tesla has always said its Full Self-Driving beta software, currently available to 160,000 owners, requires a human to keep their hands on the wheel and pay attention to the road.

The company said the sensors’ removal won’t affect its crash safety ratings and maintains that safety is at the core of its design and engineering decisions. 

The 12 ultrasonic sensors were mostly used for short-range object detection in features such as Autopark. The automaker confirmed in a post that moving to the camera-only system will temporarily limit some of the vehicles’ features, including Park Assist, Summon and Smart Summon.

‘In the near future, once these features achieve performance parity to today’s vehicles, they will be restored via a series of over-the-air software updates. All other available Autopilot, Enhanced Autopilot and Full Self-Driving capability features will be active at delivery,’ Tesla explains. 

‘With today’s software, this approach gives Autopilot high-definition spatial positioning, longer range visibility and ability to identify and differentiate between objects. As with many Tesla features, our occupancy network will continue to improve rapidly over time.’

But it remains to be seen how smoothly this change will go. 

Last year, the company announced it would no longer ship cars equipped with forward-facing radar, which came at the same time as reports about ‘phantom braking’ – when the self-driving system applies the brakes because it mistakenly thinks the car is about to collide with something. 

The NHTSA is probing whether the Autopilot system increases the risk of crashes and it could issue a recall if it finds that the technology is to blame. 

Musk’s company has also been accused of false advertising over its Autopilot and Full Self-Driving beta systems by the California Department of Motor Vehicles. However, it’s unclear exactly what changes the agency wants, since Tesla famously does not advertise at all.

Following Tesla AI Day, the company released a video depicting how its neural networks operate – with 1 billion different parameters (or values) and the ability to complete an astounding 144 trillion operations per second.

How does Tesla’s Autopilot work?

Autopilot uses cameras, ultrasonic sensors and radar to see and sense the environment around the car. 

The sensor and camera suite provides drivers with an awareness of their surroundings that a driver alone would not otherwise have. 

A powerful onboard computer processes these inputs in a matter of milliseconds to help make driving, in the company’s words, ‘safer and less stressful.’

Autopilot is a hands-on driver assistance system that is intended to be used only with a fully attentive driver. 

It does not turn a Tesla into a self-driving car nor does it make a car autonomous.

Before enabling Autopilot, drivers must agree to ‘keep your hands on the steering wheel at all times’ and to always ‘maintain control and responsibility for your car.’

Once engaged, if insufficient torque is applied, Autopilot will also deliver an escalating series of visual and audio warnings, reminding drivers to place their hands on the wheel. 

If drivers repeatedly ignore the warnings, they are locked out from using Autopilot during that trip.
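In code form, that escalation behaves like a small state machine: the warning count resets while steering torque is detected, climbs while it is not, and Autopilot is locked out for the rest of the trip once the limit is reached. The sketch below is only an illustration of the behavior described above; the class name and the three-warning limit are assumptions, not Tesla’s actual software.

```python
# Illustrative state machine for the escalating hands-on-wheel warnings described
# above. The three-warning limit is an assumption, not a value from Tesla.


class AttentionMonitor:
    def __init__(self, warnings_before_lockout: int = 3):
        self.warnings_issued = 0
        self.warnings_before_lockout = warnings_before_lockout
        self.locked_out = False  # lockout persists for the rest of the trip

    def update(self, steering_torque_detected: bool) -> str:
        if self.locked_out:
            return "Autopilot unavailable for the rest of this trip"
        if steering_torque_detected:
            self.warnings_issued = 0  # hands detected on the wheel: reset
            return "ok"
        self.warnings_issued += 1  # escalate visual and audio warnings
        if self.warnings_issued >= self.warnings_before_lockout:
            self.locked_out = True
            return "Autopilot disengaged and locked out"
        return f"warning {self.warnings_issued}: place your hands on the wheel"
```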

Any of Autopilot’s features can be overridden at any time by steering or applying the brakes.

Autopilot does not function well in poor visibility (heavy rain, snow or fog), bright light (oncoming headlights or direct sunlight), mud, ice or snow; with interference or obstruction from objects mounted on the vehicle (such as a bike rack) or from excessive paint or adhesive products (such as wraps, stickers or rubber coating); on narrow, winding or high-curvature roads; with a damaged or misaligned bumper; with interference from other equipment that generates ultrasonic waves; or in extremely hot or cold temperatures.

 
