
Gender Bias In the Driving Systems of AI Autonomous Cars (from AI Trends)

  • 17 min read
  • 08 Oct 2020


By Lance Eliot, the AI Trends Insider   

Here’s a topic that entails intense controversy, oftentimes sparking loud arguments and heated responses. Prepare yourself accordingly. Do you think that men are better drivers than women, or do you believe that women are better drivers than men?   

Seems like most of us have an opinion on the matter, one way or another.   

Stereotypically, men are often characterized as fierce drivers with a take-no-prisoners attitude, while women are supposedly more forgiving and civil in their driving. Depending on how far you take these tropes, some would say that women shouldn’t be allowed on our roadways due to their timidity, while others would say that men should not be at the wheel due to their crazed pedal-to-the-metal predilection.

What do the stats say? According to the latest U.S. Department of Transportation data, based on its FARS (Fatality Analysis Reporting System), the number of males killed annually in car crashes is nearly twice the number of females killed in car crashes.

Ponder that statistic for a moment. Some would argue that it is clear evidence that male drivers are worse than female drivers. The reasoning seems sensible enough: since more males are being killed in car crashes than females, men must be getting into a lot more car crashes, ergo they must be worse drivers.

Presumably, it would seem that women are better able to avoid getting into death-producing car crashes, thus they are more adept at driving and are altogether safer drivers.   

Whoa, exclaim some who don’t interpret the data that way. Maybe women are somehow better able to survive deadly car crashes than men, and therefore it isn’t fair to compare counts of how many perished. Or, here’s one to get your blood boiling: perhaps women trigger car crashes by disrupting traffic flow and not being agile enough at the driving controls, and men somehow pay the dear price by getting into deadly accidents while contending with that kind of disruption.

There seems to be little evidentiary support for those contentions. A more straightforward counterargument is that men tend to drive more miles than women. By the very fact that men are on the roadways more than women, they are obviously vulnerable to a heightened risk of getting into bad car crashes. In a sense, it’s a matter of rolling the dice more times than women do.

Insurance companies opt for that interpretation, noting too that the stats show men are more likely to drive while intoxicated, more likely to be speeding, and more likely to forgo seatbelts.

There could be additional hidden factors involved in these outcomes. For example, some studies suggest that the gender differences begin to dissipate with age, namely that at older ages the chances of getting killed in a car crash become about equal for male and female drivers. Of course, even that measure stirs controversy; for some, it is a sign that men lose their driving edge and spirit as they get older, becoming more akin to the supposed skittishness of women.

Yikes, it’s all a can of worms and a topic that can readily lend itself to fisticuffs.   

Suppose there were some means to do away with all human driving and only AI-based driving took place. One would assume that the AI would not fall into any gender-based camp. In other words, since we all think of AI as a kind of machine, it wouldn’t seem to make much sense to say that an AI system is male or female.

As an aside, there have been numerous expressed concerns that the AI-fostered Natural Language Processing (NLP) systems increasingly permeating our lives are perhaps falling into a gender trap, as it were. When an Alexa or Siri voice speaks to you, do you perceive the system differently if it has a male intonation than if it has a female intonation?

Some believe that if every time you want to learn something new you invoke an NLP system that happens to have a female-sounding voice, it will tend to cause children especially to start to believe that women are the sole arbiters of the world’s facts. This could also cut other ways: if the female-sounding NLP system were telling you to do your homework, would that cause kids to be leery of women, as though women are always being bossy?

The same can be said about using a male voice in today’s NLP systems. If a male-sounding voice is always used, the context of what the NLP system tells you might become twisted into being associated with males versus females.

As a result, some argue that the NLP systems ought to have gender-neutral sounding voices.   

The aim is to get away from the potential of having people try to stereotype human males and human females by stripping out the gender element from our verbally interactive AI systems.   

There’s another perhaps equally compelling reason for wanting to excise any male or female intonation from an NLP system, namely that we might tend to anthropomorphize the AI system, unduly so.   

Here’s what that means. 

AI systems are not yet even close to being intelligent, yet the more an AI system takes on the appearance of human-like qualities, the more we are bound to assume that it is as intelligent as a human. Thus, when you interact with Alexa or Siri and it uses either a male or female intonation, the argument is that the male or female verbalization acts as a subtle and misleading signal that the underlying system is human-like and ergo intelligent.

You readily fall for the notion that Alexa or Siri must be smart, simply because it has a male- or female-sounding embodiment.

In short, there is ongoing controversy about whether the expanding use of NLP systems in our society ought to stop “cheating” by relying on a male- or female-sounding voice and instead be completely neutral in the spoken word, leaning toward neither gender.

Getting back to the topic of AI driving systems, there’s a chance that the advent of true self-driving cars might encompass gender traits, akin to how there’s concern about Alexa and Siri doing so.   

Say what? 

You might naturally be puzzled as to why AI driving systems would include any kind of gender specificity.   

Here’s the question for today’s analysis: Will AI-based true self-driving cars be male, female, gender fluid, or gender-neutral when it comes to the act of driving? 

Let’s unpack the matter and see. 

For my framework about AI autonomous cars, see the link here: https://aitrends.com/ai-insider/framework-ai-self-driving-driverless-cars-big-picture/ 

Why this is a moonshot effort, see my explanation here: https://aitrends.com/ai-insider/self-driving-car-mother-ai-projects-moonshot/   

For more about the levels as a type of Richter scale, see my discussion here: https://aitrends.com/ai-insider/richter-scale-levels-self-driving-cars/   

For the argument about bifurcating the levels, see my explanation here: https://aitrends.com/ai-insider/reframing-ai-levels-for-self-driving-cars-bifurcation-of-autonomy/  

The Levels Of Self-Driving Cars  

It is important to clarify what I mean when referring to true self-driving cars. 

True self-driving cars are ones where the AI drives the car entirely on its own and there isn’t any human assistance during the driving task. 

These driverless vehicles are considered Level 4 and Level 5, while a car that requires a human driver to co-share the driving effort is usually considered Level 2 or Level 3. The cars that co-share the driving task are described as semi-autonomous and typically contain a variety of automated add-ons referred to as ADAS (Advanced Driver-Assistance Systems).

There is not yet a true self-driving car at Level 5; we don’t yet know whether it will be possible to achieve, nor how long it will take to get there.

Meanwhile, the Level 4 efforts are gradually trying to get some traction by undergoing very narrow and selective public roadway trials, though there is controversy over whether this testing should be allowed per se (we are all life-or-death guinea pigs in an experiment taking place on our highways and byways, some point out). 

Since semi-autonomous cars require a human driver, the adoption of those types of cars won’t be markedly different from driving conventional vehicles, so there’s not much new per se to cover about them on this topic (though, as you’ll see in a moment, the points next made are generally applicable). 

For semi-autonomous cars, the public must be forewarned about a disturbing aspect that’s been arising lately: despite those human drivers who keep posting videos of themselves falling asleep at the wheel of a Level 2 or Level 3 car, we all need to avoid being misled into believing that a driver can take their attention away from the driving task while driving a semi-autonomous car.

You are the responsible party for the driving actions of the vehicle, regardless of how much automation might be tossed into a Level 2 or Level 3.   

For why remote piloting or operating of self-driving cars is generally eschewed, see my explanation here: https://aitrends.com/ai-insider/remote-piloting-is-a-self-driving-car-crutch/   

To be wary of fake news about self-driving cars, see my tips here: https://aitrends.com/ai-insider/ai-fake-news-about-self-driving-cars/   

The ethical implications of AI driving systems are significant, see my indication here: http://aitrends.com/selfdrivingcars/ethically-ambiguous-self-driving-cars/ 

Be aware of the pitfalls of normalization of deviance when it comes to self-driving cars, here’s my call to arms: https://aitrends.com/ai-insider/normalization-of-deviance-endangers-ai-self-driving-cars/   

Self-Driving Cars And Gender Biases 

For Level 4 and Level 5 true self-driving vehicles, there won’t be a human driver involved in the driving task.   

All occupants will be passengers.   

The AI is doing the driving.   

At first glance, it seems that the AI is going to drive like a machine does, without any type of gender influence or bias.

How could gender get somehow shoehorned into the topic of AI driving systems?   

There are several ways that the nuances of gender could seep into the matter.   

We’ll start with the acclaimed use of Machine Learning (ML) or Deep Learning (DL).   

As you’ve likely heard or read, part of the basis for today’s rapidly expanding use of AI is the advances made in ML/DL.

You might have also heard or read that one of the key underpinnings of ML/DL is the need for data, lots and lots of data.


In essence, ML/DL is a computational pattern matching approach. 

You feed lots of data into the algorithms being used, and they seek to discover patterns. Based on those patterns, the ML/DL can then potentially detect those same patterns in new data and report that they were found.

If I feed tons and tons of pictures that have a rabbit somewhere in each photo into an ML/DL system, the ML/DL can potentially statistically ascertain that a certain shape and color and size of a blob in those photos is a thing that we would refer to as a rabbit.   

Please note that the ML/DL is not likely to use any human-like common-sense reasoning, which is something not often pointed out about these AI-based systems. 

For example, the ML/DL won’t “know” that a rabbit is a cute furry animal, that we like to play with them, and that around Easter they are especially revered. Instead, the ML/DL has simply calculated, based on mathematical computations, that a blob in a picture can be delineated and possibly detected whenever you feed a new picture into the system, probabilistically stating whether such a blob is present or not.

There’s no higher-level reasoning per se, and we are a long way away from the day when human-like reasoning of that nature will be embodied in AI systems (which, some argue, we may never achieve, while others keep saying that the day of the grand singularity is nearly upon us).

In any case, suppose that we fed pictures of only white-furry rabbits into the ML/DL when we were training it to find the rabbit blobs in the images.   

One aspect that might arise would be that the ML/DL would associate the rabbit blob as always and only being white in color.   

When we later on fed in new pictures, the ML/DL might fail to detect a rabbit if it was one that had black fur, because the lack of white fur diminished the calculated chances that the blob was a rabbit (as based on the training set that was used). 
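
To make that training-set skew concrete, here is a minimal sketch in Python. Everything in it is invented for illustration and is not drawn from any real ML/DL system: the blob features (fur whiteness, ear length, body size), the synthetic numbers, and the bare-bones nearest-centroid “pattern matcher” are all assumptions, but they show how a spurious attribute (fur color) can get baked in when the training examples are one-sided.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical features per image blob: [fur_whiteness, ear_length, body_size].
# Every training rabbit happens to be white-furred; the non-rabbit examples are dark.
rabbits     = np.column_stack([rng.uniform(0.9, 1.0, 50),   # white fur
                               rng.uniform(0.7, 1.0, 50),   # long ears
                               rng.uniform(0.2, 0.4, 50)])  # small body
non_rabbits = np.column_stack([rng.uniform(0.0, 0.4, 50),
                               rng.uniform(0.0, 0.3, 50),
                               rng.uniform(0.5, 1.0, 50)])
X = np.vstack([rabbits, non_rabbits])
y = np.array([1] * 50 + [0] * 50)        # 1 = rabbit, 0 = not a rabbit

# Bare-bones "pattern matcher": nearest class centroid, pure math, no common sense.
centroid_rabbit = X[y == 1].mean(axis=0)
centroid_other  = X[y == 0].mean(axis=0)

def predict(sample):
    return 1 if (np.linalg.norm(sample - centroid_rabbit)
                 < np.linalg.norm(sample - centroid_other)) else 0

white_rabbit = np.array([0.95, 0.85, 0.3])  # matches the training pattern
black_rabbit = np.array([0.05, 0.85, 0.3])  # same ears and size, but dark fur

print(predict(white_rabbit))  # 1: detected as a rabbit
print(predict(black_rabbit))  # likely 0: fur color got treated as if it were essential
```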

In a prior piece, I emphasized that one of the dangers of using ML/DL is the possibility of getting stuck on various biases, such as the possibility that true self-driving cars could end up with a form of racial bias, due to the data that the AI driving system was trained on.

Lo and behold, it is also possible that an AI driving system could incur a gender-related bias.   

Here’s how.   

If you believe that men and women drive differently, suppose we collected a bunch of driving-related data based on human driving, and within that data there was a hidden element: some of the driving was done by men and some by women.

Let loose on this dataset, the ML/DL aims to find the driving tactics and strategies embodied in the data.

Excuse me for a moment as I leverage the stereotypical gender-differences to make my point. 

It could be that the ML/DL discovers “aggressive” driving tactics that are within the male-oriented driving data and will incorporate such a driving approach into what the true self-driving car will do while on the roadways.   

This could mean that when the driverless car roams on our streets, it is going to employ a male-focused driving style and presumably try to cut off other drivers in traffic, and otherwise be quite pushy.   

Or, it could be that the ML/DL discovers the “timid” driving tactics that are within the female-oriented driving data and will incorporate a driving approach accordingly, such that when a self-driving car gets in traffic, the AI is going to act in a more docile manner.   

I realize that the aforementioned seems objectionable due to the stereotypical characterizations, but the overall point is that if there is a difference between how males tend to drive and how females tend to drive, it could potentially be reflected in the data.   

And, if the data has such differences within it, there’s a chance that the ML/DL might either explicitly or implicitly pick up on those differences.

Imagine too that if we had a dataset that perchance was based only on male drivers, the chances of landing on a male-oriented driving approach would seem even more heightened (similarly, if the dataset was based only on female drivers, a female-oriented bias would presumably be heightened).
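
As a rough illustration of how a hidden, unlabeled attribute can tilt what gets “learned,” here is a small sketch with entirely synthetic, assumed numbers: the headway values and the 90/10 population split are invented solely to show the arithmetic, not drawn from any real driving dataset.

```python
import numpy as np

rng = np.random.default_rng(1)

# Suppose "aggressive" drivers keep ~1.0 s of following gap and "timid" drivers ~2.5 s,
# and the training data happens to be dominated by one group. The group label itself
# is never recorded -- it is a hidden element in the data.
def make_data(n_aggressive, n_timid):
    headway_a = rng.normal(1.0, 0.2, n_aggressive)   # seconds of following gap
    headway_t = rng.normal(2.5, 0.3, n_timid)
    return np.concatenate([headway_a, headway_t])

skewed_training_set = make_data(n_aggressive=900, n_timid=100)

# The "learned" target headway is simply the pattern in the data.
print(f"Learned target headway (skewed data): {skewed_training_set.mean():.2f} s")   # ~1.15 s

# A balanced dataset would teach a noticeably different behavior.
balanced_training_set = make_data(n_aggressive=500, n_timid=500)
print(f"Learned target headway (balanced data): {balanced_training_set.mean():.2f} s")  # ~1.75 s
```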

Here’s the rub. 

Since male drivers today have twice the number of deadly car crashes as female drivers, if an AI true self-driving car were perchance trained to drive via predominantly male-oriented driving tactics, would the resulting driverless car be more prone to car accidents than otherwise?

That’s an intriguing point and worth pondering. 

Assuming that no other factors come into play in the nature of the AI driving system, we might reasonably assume that a driverless car so trained might indeed falter in ways similar to the underlying “learned” driving behaviors.

Admittedly, there are a lot of other factors involved in the crafting of an AI driving system, and thus it is hard to say that the training datasets by themselves would lead to such a consequence.

That being said, it is also instructive to realize that there are other ways that gender-based elements could get infused into the AI driving system. 

For example, suppose that rather than only using ML/DL, there was also programming or coding involved in the AI driving system, which indeed is most often the case.   

It could be that the AI developers themselves allow their own biases to be encompassed in the coding. Since the stats indicate that AI software developers by and large tend to be male rather than female (though, thankfully, lots of STEM efforts are helping to change this dynamic), perhaps their male-oriented perspective gets included in the AI system’s coding.


In-The-Field Biases Too

Yet another example involves the AI dealing with other drivers on the roadways.   

For many years to come, we will have both self-driving cars on our highways and byways and simultaneously have human-driven cars. There won’t be a magical overnight switch of suddenly having no human-driven cars and only AI driverless cars.   

Presumably, self-driving cars are supposed to be crafted to learn from the driving experiences encountered while on the roadways. 

Generally, this involves the self-driving car collecting its sensory data during driving journeys and then uploading the data via OTA (Over-The-Air) electronic communications to the cloud of the automaker or self-driving tech firm. The automaker or self-driving tech firm then uses various tools, likely including ML/DL, to analyze the voluminous data and pushes out updates to the fleet of driverless cars based on what was gleaned from the roadway data collected.
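
Here is a simplified sketch of that fleet-learning loop. All of the names (FleetCloud, ingest, retrain_and_push, the event strings) are hypothetical placeholders invented for illustration, not any automaker’s actual API; the point is only the shape of the cycle: collect, upload, pool, analyze, push back out.

```python
from dataclasses import dataclass, field
from typing import Dict, List

@dataclass
class FleetCloud:
    collected_batches: List[Dict] = field(default_factory=list)

    def ingest(self, batch: Dict) -> None:
        # An OTA upload lands here; in practice this is telemetry infrastructure.
        self.collected_batches.append(batch)

    def retrain_and_push(self) -> Dict:
        # ML/DL analysis over the pooled roadway data goes here. Whatever biases
        # are present in the surrounding human traffic get pooled right along with it.
        num_batches = len(self.collected_batches)
        update = {"model_version": num_batches, "notes": "derived from fleet roadway data"}
        return update  # pushed OTA back out to the fleet

# One cycle of the loop:
cloud = FleetCloud()
for car_id in range(3):
    sensor_batch = {"car": car_id, "events": ["cut_off_observed", "hard_brake"]}
    cloud.ingest(sensor_batch)           # each driverless car uploads its sensory data
print(cloud.retrain_and_push())          # the resulting update is distributed to the fleet
```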

How does this pertain to gender?   

Assuming again that male drivers and female drivers do drive differently, the roadway experiences of the driverless cars will involve the driving aspects of the human-driven cars around them.   

It is quite possible that the ML/DL doing analysis of the fleet collected data would discover the male-oriented or the female-oriented driving tactics, though it and the AI developers might not realize that the deeply buried patterns were somehow tied to gender.   

Indeed, one of the qualms about today’s ML/DL is that it oftentimes is not amenable to explanation.   

The complexity of the underlying computations does not necessarily lend itself to being readily interpreted or explained in everyday terms (which is why the need for XAI, or Explainable AI, is becoming increasingly important).
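
One common XAI-style probe is permutation importance: scramble one input feature at a time and see how much the model’s output changes. The sketch below uses a toy, assumed “driving risk” model with made-up coefficients purely to show the mechanics; real driving models are vastly larger, which is exactly why post-hoc explanation is hard.

```python
import numpy as np

rng = np.random.default_rng(2)

# Toy "driving risk" model: depends mostly on speed, a little on following gap.
def model(X):
    speed, gap = X[:, 0], X[:, 1]
    return 0.8 * speed - 0.2 * gap

X = rng.uniform(0, 1, size=(500, 2))
baseline = model(X)

for name, col in [("speed", 0), ("gap", 1)]:
    X_perturbed = X.copy()
    perm = rng.permutation(len(X_perturbed))
    X_perturbed[:, col] = X_perturbed[perm, col]       # break this feature's link to the output
    change = np.mean(np.abs(model(X_perturbed) - baseline))
    print(f"Importance of {name}: {change:.3f}")       # bigger change => feature mattered more
```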

Conclusion 

Some people affectionately refer to their car as a “he” or a “she,” as though the car itself was of a particular gender.   

When an AI system is at the wheel of a self-driving car, it could be that the “he” or “she” labeling might be applicable, at least in the aspect that the AI driving system could be gender-biased toward male-oriented driving or female-oriented driving (if you believe such a difference exists). 

Some believe that the AI driving system will be gender fluid, meaning that based on how the AI system “learns” to drive, it will blend together the driving tactics that might be ascribed as male-oriented and those that might be ascribed as female-oriented.

If you don’t buy into the notion that there are any male versus female driving differences, presumably the AI will be gender-neutral in its driving practices. 

No matter what your gender-related driving beliefs might be, one thing is clear: the whole topic can drive one crazy.

Copyright 2020 Dr. Lance Eliot  

This content is originally posted on AI Trends.  

[Ed. Note: For readers interested in Dr. Eliot’s ongoing business analyses about the advent of self-driving cars, see his online Forbes column: https://forbes.com/sites/lanceeliot/]

http://ai-selfdriving-cars.libsyn.com/website