Whether it’s to change the music, look at a text, make sure a mobile phone is charging or check directions – in that brief instant, a driver’s focus is diverted. Concentration is no longer on driving or the road ahead, and research shows that taking your eyes off the road for just two seconds can increase the risk of an accident by up to 24 times.
A recent report issued by the Department for Transport found that use of a mobile phone behind the wheel led to 33 deaths, 90 serious injuries and 308 less serious injuries in the UK in 2017. So, why do drivers continue to flout the law?
One answer could lie with the 24/7 world we’re living in, where we feel the need to constantly multi-task, refresh newsfeeds, update profiles and speak to friends – perhaps we simply cannot switch off.
In the future, we may solve this problem through autonomous vehicles that require no human input. In the meantime, however, commercial drivers face hundreds of distractions every day with potentially catastrophic consequences.
But, if technology is the problem – could it also be the solution?
Machine Vision and Artificial Intelligence
Commercial vehicle drivers experience some of the toughest working environments of any profession; constantly having to deal with disruption in the form of varying road conditions, poor weather, cyclists, distracted pedestrians and changes in traffic levels, all while keeping to a pre-determined schedule. They also often have to do all of this alone, with just one set of eyes to rely on.
Two technologies are changing the game – machine vision (MV) and artificial intelligence (AI).
Artificial intelligence, which learns from categorised events and enables a machine to imitate human behaviour, acts as a purpose-built brain. A complex field of computer science, AI has been created using algorithms deriving from various areas of study including psychology, philosophy and linguistics. The power of AI-backed solutions lies in machine learning that gets better over time by integrating data from multiple sources. In a commercial transport setting, this allows AI to identify and categorise distracted driving behaviours such as eating, drinking, smoking, cell phone use, lane departure and following too closely. It learns from the objects it sees through machine vision.
Machine vision, a video-based type of machine learning, acts as a smart set of eyes that scans and recognises both the internal and external environment of the vehicle. This could include the driver’s behaviour – for example, if they are glancing down at their mobile phone or their eyes are closing – as well as other dangerous events such as following the vehicle in front too closely, straying outside of lane lines or failing to stop at a junction controlled by a stop sign. In summary, MV identifies the issue while AI determines how risky that issue is.
This combination of MV+AI is powerful. It can detect a vehicle nearing a junction through algorithms which monitor GPS location and municipal map data and, as the vehicle gets closer, kickstart its machine vision capabilities to ‘look’ for a stop sign. If it sees one, the algorithm checks the engine control unit and accelerometer to determine whether the vehicle comes to a complete stop.
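The sequence described above – trigger the vision check near a junction, then verify a complete stop from vehicle telemetry – can be sketched in a few lines of Python. Everything here is illustrative: the trigger radius, speed threshold and function names are assumptions for the example, not the vendor's actual system.

```python
# Sketch of the stop-sign compliance check. The thresholds below are
# invented for illustration; a real system would calibrate them carefully.

TRIGGER_RADIUS_M = 50.0   # assumed distance at which the vision model is woken
STOP_SPEED_KMH = 0.5      # assumed speed below which we count a "complete stop"

def should_check_for_stop_sign(distance_to_junction_m: float) -> bool:
    """GPS location plus municipal map data say a junction is near,
    so activate the machine-vision model to 'look' for a stop sign."""
    return distance_to_junction_m <= TRIGGER_RADIUS_M

def came_to_complete_stop(speed_samples_kmh: list[float]) -> bool:
    """Speed trace from the engine control unit while passing the junction:
    did the vehicle's speed ever drop to (effectively) zero?"""
    return any(v <= STOP_SPEED_KMH for v in speed_samples_kmh)

# Example: the vision model found a stop sign; did the driver actually stop?
rolling_stop = [22.0, 14.0, 8.0, 5.0, 6.0, 18.0]   # slows but never stops
full_stop    = [20.0, 10.0, 3.0, 0.0, 0.0, 12.0]   # reaches zero

print(came_to_complete_stop(rolling_stop))  # False
print(came_to_complete_stop(full_stop))     # True
```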
Much in the same way, it can also determine when a driver is losing concentration by ‘seeing’ the direction the driver’s head is facing, or if they only have one hand on the steering wheel. Artificial intelligence will then consider the riskiness of the situation and, if on a busy motorway where this behaviour is especially dangerous, alert the driver so they can self-correct.
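The "riskiness" step above amounts to weighting what the cameras see against the driving context before deciding to alert. A minimal sketch, with weights and thresholds invented purely for illustration:

```python
# Hedged sketch of context-aware risk scoring: the vision system reports
# observed behaviours, and a scoring rule decides whether the combination of
# behaviour and context is dangerous enough to alert the driver.
# All weights, labels and the threshold are assumptions for this example.

BEHAVIOUR_RISK = {"head_turned_away": 3, "one_hand_on_wheel": 1, "eyes_closing": 5}
CONTEXT_RISK = {"empty_car_park": 0, "quiet_road": 1, "busy_motorway": 3}
ALERT_THRESHOLD = 6

def should_alert(behaviours: list[str], context: str) -> bool:
    """Alert only when behaviour risk, scaled by context, crosses the threshold."""
    score = sum(BEHAVIOUR_RISK.get(b, 0) for b in behaviours)
    return score * max(CONTEXT_RISK.get(context, 1), 1) >= ALERT_THRESHOLD

print(should_alert(["one_hand_on_wheel"], "quiet_road"))    # False
print(should_alert(["head_turned_away"], "busy_motorway"))  # True
```

The same behaviour that passes unremarked on a quiet road triggers an alert on a busy motorway, which mirrors the article's point that risk depends on the situation, not just the action.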
Driving the industry forward
The power of integrating these two neural networks into one single model, running simultaneously, is unparalleled. The result is an assistant that never tires, never loses concentration and can detect behaviours associated with an object even when the object itself isn’t visible, because AI’s algorithms connect subtle human behaviour patterns with certain objects.
For example, the network can detect a driver looking down over a two-minute period and can confidently determine phone use without even seeing the hardware. In other words, the technology can reliably detect a risky behaviour, even when data itself is incomplete. It is also sophisticated enough to detect two behaviours at once, such as eating while not wearing a seatbelt.
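Inferring phone use without ever seeing a phone comes down to accumulating indirect evidence over a time window. A minimal sketch, assuming a one-sample-per-second head-pose signal and an invented threshold:

```python
# Illustrative sketch of inference from incomplete data: if the driver's head
# is down for a large enough share of a two-minute window, flag probable phone
# use even though the phone itself was never visible.
# The sampling rate and fraction threshold are assumptions for this example.

WINDOW_S = 120            # two-minute observation window, per the article
SAMPLE_HZ = 1             # assumed: one head-pose sample per second
HEAD_DOWN_FRACTION = 0.4  # assumed fraction indicating likely phone use

def likely_phone_use(head_down_samples: list[bool]) -> bool:
    """head_down_samples[i] is True when the vision model saw the
    driver looking down at sample i. Flag if the recent window is
    dominated by head-down frames."""
    window = head_down_samples[-WINDOW_S * SAMPLE_HZ:]
    return sum(window) / len(window) >= HEAD_DOWN_FRACTION

# Head down for half the window -> flagged; a brief glance -> not flagged.
print(likely_phone_use([True] * 60 + [False] * 60))    # True
print(likely_phone_use([False] * 110 + [True] * 10))   # False
```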
This second set of formally trained ‘eyes’ and intuitive ‘brain’ is already improving the safety, comfort and performance of commercial drivers and their vehicles.
The technology is also able to improve customer service and ensure the efficient allocation of resources. One example is using machine vision to gauge bus occupancy levels throughout any given week. AI could then use this data to predict days when the bus is in highest demand to ensure that operators are running services accordingly.
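The occupancy idea above reduces to a simple aggregation: machine vision supplies a passenger count per trip, and the busiest days fall out of the totals. A toy sketch with invented data:

```python
# Illustrative sketch of demand prediction from occupancy counts.
# The counts and day labels are made up for the example; a real system
# would use far richer data and models.
from collections import defaultdict

# (day_of_week, passengers_counted_by_machine_vision) per trip
counts = [
    ("Mon", 24), ("Mon", 31), ("Tue", 18),
    ("Fri", 45), ("Fri", 52), ("Sat", 12),
]

totals: dict[str, int] = defaultdict(int)
for day, riders in counts:
    totals[day] += riders

busiest = max(totals, key=totals.get)
print(busiest)  # Fri
```

An operator could then schedule extra services on the days this aggregation flags as highest demand.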
The ultimate goal is lofty, but crucially important: that no commercial or professional driver will ever be the cause of a collision. The first step in this journey is to support and protect those drivers who keep us moving forward with the most advanced safety technology available.
Author: Damian Penney, VP of Lytx Europe