Rise and shine, top of the morning and all – though in the case of a Ford engineer working on the company’s self-driving technology, falling asleep at the wheel is part of a day’s work. Tasked with overseeing the autonomous test subjects, Ford’s engineers have been found to be dozing off in the cars, and seemingly not as part of a more radical approach to product testing.

According to Bloomberg Technology, the company has fitted a host of devices including alarms, buzzers, warning lights, even vibrating seats and steering wheels to said test vehicles, but test crew were still falling asleep, even with a second engineer in the car.

“These are trained engineers who are there to observe what’s happening,” said Ford’s product development chief Raj Nair. “But it’s human nature that you start trusting the vehicle more and more and that you feel you don’t need to be paying attention.”

This discovery has led Ford to consider skipping Level Three automation in developing its self-driving vehicles. At Level Three, the car conditionally automates both vehicle control and monitoring of the driving environment, but hands back control to the human driver in certain situations.

This consideration is in line with that of Waymo, the autonomous driving company spun off from Google's self-driving car project. “Level Three may turn out to be a myth. Perhaps it’s just not worth doing,” said Waymo chief executive officer John Krafcik. In this, Ford and Waymo differ from most automakers in thinking that the self-driving vehicle should fully assume control of driving.

The contention here is over how much control should be apportioned between human and computer at the aforementioned Level Three of automation. Audi’s Traffic Jam Pilot, for example, allows hands-free driving to an extent, but will request human intervention should the car detect a situation it cannot handle, giving the driver 10 seconds to assume command of the controls.

“We like the levels. It helps with consumer understanding and getting trust built into the marketplace, as opposed to going straight to the moonshot right off the bat,” said president of Audi of America Scott Keogh. Advocates of Level Three contend a situation of human backup allows consumers to become comfortable with autonomous driving technology, the report said.

Other automakers such as Nissan and Honda employ systems that give the driver 30 seconds to re-engage contact with the steering wheel, otherwise the vehicle will pull over to the roadside. “You can even go to sleep and the car can wake you up… waking up for 30 seconds is quite a long time,” said Amnon Shashua, co-founder and chief technical officer of autonomous technology supplier Mobileye NV.

Volvo sees it differently, however. “We don’t believe in five seconds, 10 seconds. It could even be dangerous. If you are doing something else, research shows that it will take two minutes or more before you can come back and take over. And that’s absolutely impossible. That really rules out Level Three,” said Volvo CEO Hakan Samuelsson.

Volvo’s concern appears to be somewhat shared by a non-automaker. “There’s evidence to suggest that Level Three may show an increase in traffic crashes. I don’t think there’s enough evidence to suggest that it should be prohibited at this time, but it does pose safety concerns,” said Rand Center for Decision Making Under Uncertainty co-director Nidhi Kalra during a US congressional hearing.

One reason for handing control back to the human driver is legal liability, said Deloitte’s global automotive sector leader Joe Vitale. “With a vehicle crash when it’s operating in Level Three, I’m sure manufacturers will believe the consumer is responsible because they have their hands on the wheel and they’ve been alerted, but I don’t think regulators are going to easily turn over on that issue,” he said.

To that end, Volvo has pledged that it will assume responsibility for any crash by its self-driving vehicles; Samuelsson said the part-human, part-computer division of tasks in Level Three automation could create confusion over which party is legally liable for a crash. “It should be black and white. With responsibility, you cannot tell anybody you are a bit responsible. Either you are responsible or you are not,” said Samuelsson.

One matter all parties agree upon is that too many requests for human intervention could spoil the autonomous driving experience, according to the Bloomberg Technology report. In the course of its testing, Ford discovered that systems meant to monitor driver readiness left drivers feeling they were being constantly reminded to pay attention. “The car is actually yelling at you all the time,” Nair said.

The constant reminders to be ready to take over also undermine the value of having an automated chauffeur. “Why did I spend that extra premium for this if I have to be alert and pay attention?” said Ford CEO Mark Fields of Level Three automation.