Toyota using more cameras, fewer sensors to reduce cost, speed up autonomous driving tech development

Toyota, through its subsidiary Woven Planet, is working to advance the development of autonomous vehicle technology by employing low-cost cameras to collect driving data, according to Reuters. Woven Planet says the ability to use low-cost cameras is a breakthrough that the manufacturer hopes will reduce costs.

Gathering diverse driving data from a large fleet of vehicles is crucial to developing a robust self-driving system; however, it is costly and not scalable to use expensive sensors in the testing of autonomous vehicles, Woven Planet told Reuters in an interview.

“We need a lot of data. And it’s not sufficient to just have a small amount of data that can be collected from a small fleet of very expensive autonomous vehicles. Rather, we’re trying to demonstrate that we can unlock the advantage that Toyota and a large automaker would have, which is a large corpus of data, but with a much lower fidelity,” Woven Planet vice president of engineering Michael Benisch told the news wire.

Using a large proportion of data from the lower-cost cameras raised the autonomous driving system's performance to a level similar to when it was trained exclusively on data from more expensive sensors, Woven Planet said. According to Reuters, the firm's cameras are 90% cheaper than the sensors it used previously, and they can be easily installed in fleets of passenger vehicles.

That said, Toyota will continue to use lidar and radar sensors for robotaxis and other autonomous vehicles that will be deployed on roads, as this remains the best and safest approach to developing robotaxis, Benisch said.

The Japanese automaker has also been working with Aurora to test an autonomous ride-hailing fleet of Toyota Sienna vehicles outfitted with lidar and radar sensors as well as cameras. Toyota previously held a stake in Uber's self-driving vehicle unit, although this was transferred when Uber sold the unit.

Mick Chan

Open roads and closed circuits hold great allure for Mick Chan. Driving heaven to him is exercising a playful chassis on twisty paths; he prizes ergonomics and involvement over gadgetry. He spent three years at a motoring newspaper and had a short stint with a magazine prior to joining this website.

 

Comments

  • Mr slim on Apr 25, 2022 at 10:31 pm

    Tesla safety rival

    • James on Apr 26, 2022 at 7:33 am

      Tesla no longer using radar sensors in Model 3 and Model Y,
      Cameras being used too.

      • Both realised that making current autonomous cars 99% safe is too complex and too costly for regular Joes to own, so they make them cheaper by compromising on safety levels and racing to the bottom of that barrel to market an autonomous car with a 90% safety level. The other 9% is considered acceptable losses.

  • Oh really? Uber & Waymo also tried the low-cost, camera-only way, and guess what happened to their autonomous cars? Between this cheapskate Toyota system and Nissan’s autonomous system in the article below this, I’d rather trust the latter.

    • I think you misunderstood the story

      cameras are used to collect data to train the AI

      implementation in the vehicle uses lidar

      • Aura89 on Apr 26, 2022 at 8:39 am

        They are trying their luck with sensorless autonomous driving. Gathering data is just one of their steps towards removing those sensors & lidar.

    • Anonymous on Apr 26, 2022 at 6:21 am

      Toyota is trying to collect more data for the same cost. Sensors plus cameras are the best way for autonomous vehicles to navigate; however, collecting data for the AI to train with at the development speed they want would be very expensive. They save 90% for every sensor they replace with a camera, but the data quality isn’t 90% lower. More importantly, being able to fit cameras to a much larger fleet diversifies the data collection, which compensates for the lack of fidelity and is better for the AI to train with.

      • Aura89 on Apr 26, 2022 at 8:38 am

        Don’t believe all that Toyota shill they feed you. Toyota is trying their luck, like Tesla, in doing autonomous driving without the costly but essential sensors. It is like trusting a one-eyed pirate trying to steer a stormy sea with a hooked hand. That is madness!

        • Anonymous on Apr 26, 2022 at 2:40 pm

          The technology has been proven to work. There is no hidden agenda here. Toyota clearly said the data from cameras is of ‘lower fidelity’. But if you read it carefully, they said they only want to reduce the dependency on Lidar for AI training.

          “Conventional” autonomous vehicles have at least 3 Lidars. At 90% less, you can have 9-10 cameras for every Lidar. For AI training purposes only and with the right algorithm, a 5-car fleet with 15 Lidars produces much less image data for the AI to sift through compared to 20 cars with 10 cameras each. Oh, and cameras work in the rain and fog. Lidars will map literally everything in sight, including raindrops and thick fog. So ironically, what you would use in more precarious weather is the “inferior” technology. You can’t have the AI learn from Lidar sensors in the rain; it would just be noise.

          This is why “Tesla Vision” is one of the most advanced machine vision systems for autonomous vehicles, if not the most advanced. Lidars are in essence 3D cameras. They map the surroundings with bouncing laser light and plot it in 3D very accurately for the AI to look at. Yet our pair of eyes sees 3D just as well, and we don’t shoot lasers out of them to perceive depth and distance. Our brains do all the work. This is what Tesla has been furiously teaching their AI to do, and they have a fleet of 1 million cars globally to train it. Meanwhile, other big manufacturers are progressing at a crawling pace, toying with the handful of Lidar-equipped fleets they have. Others have focused so much on reducing Lidar costs that they can’t spare enough resources to develop the brain of the cars.

  • Piano on Apr 25, 2022 at 11:53 pm

    Cool

 
