Talk To Me, KITT


As a safety-led autonomous urban trial company focused on the challenging ‘zone 1’ restrictive metropolitan environments found around the world, we at StreetDrone are uniquely concerned with the interaction between a moving vehicle and its surroundings. A cityscape presents far more potential for hazard than most driving environments, and it is precisely that challenge that we are tackling with relish.

One of our areas of focus is the change that will need to happen in the unspoken language that has grown up over the years to enable pedestrians, cyclists and other road users to interact – both within the confines of formal driving regulations such as the Highway Code, and through the more informal patterns of interaction in which we all participate daily and which coalesce into a set of norms to which we all largely subscribe.

There are no rules save common sense that govern a motorist stopping to allow a pedestrian to cross a busy road, and the interaction and signalling between car and pedestrian are all the more difficult to codify when we consider who the pedestrian is. If they are manifestly old, or young, the driver’s propensity to stop is higher. If the driver’s assessment of the pedestrian’s age is masked by clothing in adverse weather – perhaps a raincoat and a hood – then the signals the driver will receive and act on shift again, and produce different outcomes.

All of this normalised behaviour – the complexity of signals ranging from flashed headlights to a nod of the head, or something as subtle as making eye contact with a pedestrian to acknowledge their wish to cross the road – will have to undergo a profound translation when cars start to drive themselves.

With a safety-first approach, we at StreetDrone are working with a variety of customers, such as Parkopedia, as well as with public authorities, all of whom are contributing to the development of new codes of communication between vehicles and their animated environment – that is, the pedestrians, cyclists and other road users around them.

Parking is a key testbed for autonomous vehicle intention communication

One organisation wrestling with this new set of communications standards is SAE International, the US-based standards body formerly known as the Society of Automotive Engineers. After a series of studies, it identified three main areas where new so-called ‘car2pedestrian’ communications protocols between vehicles and animated third parties (that is, pedestrians, cyclists, micromobility commuters and so on) need to be developed. These areas, sketched in code after the list below, are:

1) The need for communication to show or acknowledge that an autonomous vehicle has detected a pedestrian (I have seen you and I am incorporating your presence into my decision-making)

2) The need for vehicle-to-pedestrian and pedestrian-to-vehicle intent communication (I intend to cross the road in front of you)

3) The need for a communication interface to relay these messages
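
One way to picture these three areas is as message types that an autonomous vehicle and the people around it might exchange. The sketch below is purely illustrative – every class and field name is our own assumption, not part of any SAE protocol or StreetDrone software:

```python
from dataclasses import dataclass
from enum import Enum, auto


class PedestrianIntent(Enum):
    """Intent a pedestrian, cyclist or micromobility user might signal to a vehicle."""
    WAIT = auto()
    CROSS_IN_FRONT = auto()


class VehicleIntent(Enum):
    """Intent an autonomous vehicle might signal back."""
    YIELDING = auto()
    PROCEEDING = auto()


@dataclass
class DetectionAcknowledgement:
    """Area 1: 'I have seen you and I am incorporating your presence into my decision-making.'"""
    road_user_id: int
    detected: bool = True


@dataclass
class IntentMessage:
    """Area 2: intent communicated in either direction, vehicle-to-pedestrian or pedestrian-to-vehicle."""
    sender: str    # "vehicle" or "pedestrian"
    intent: Enum   # a VehicleIntent or PedestrianIntent value


class CommunicationInterface:
    """Area 3: the channel that relays these messages, e.g. status lamps or an external display."""
    def send(self, message: object) -> None:
        raise NotImplementedError("implemented by a concrete channel such as a lamp array")
```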

StreetDrone’s autonomous Twizy has provided the platform for Parkopedia’s driverless parking, including status indicator lights

The starting point for trialling these communications is status lamps – or lights – that indicate vehicle state (that is, whether the vehicle is acting autonomously or under human control), together with additional illuminated indicators to communicate intent. These ADS (Automated Driving System) lamps are being implemented by Parkopedia, for instance, for specific application in car parks.

The Parkopedia array distinguishes between the vehicle’s states by colour: green indicators for a vehicle under manual control, blue for an autonomous state, and red if the vehicle detects any system errors.
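
Expressed as code purely for illustration – the names below are our own and not taken from Parkopedia’s implementation – that colour scheme amounts to a simple lookup from vehicle state to lamp colour:

```python
from enum import Enum, auto


class VehicleState(Enum):
    MANUAL = auto()      # a human driver is in control
    AUTONOMOUS = auto()  # the automated driving system is in control
    ERROR = auto()       # the vehicle has detected a system error


# Colour scheme as described for the Parkopedia trial:
# green = manual control, blue = autonomous, red = system error.
LAMP_COLOURS = {
    VehicleState.MANUAL: "green",
    VehicleState.AUTONOMOUS: "blue",
    VehicleState.ERROR: "red",
}


def lamp_colour(state: VehicleState) -> str:
    """Return the status-lamp colour for a given vehicle state."""
    return LAMP_COLOURS[state]


print(lamp_colour(VehicleState.AUTONOMOUS))  # prints "blue"
```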

While developing a spectrum of lights for autonomous vehicles that becomes universally understood is a challenge in itself, there remains much to do at a practical level to ensure illuminated indicators offer 360° visibility – and then there is the ultimate conundrum: how to open a communications channel in the other direction, from the pedestrian to the vehicle. It’s the stuff we at StreetDrone get out of bed for…

To find out more, get in touch with StreetDrone at info@streetdrone.com