Kudos to the Financial Times, that bastion of male writers covering male-dominated endeavors and industries! It recognized that there are women who might be interested in its articles! It also noted that not enough women experts were being quoted in its articles! Finally, it acknowledged research suggesting that women may be put off by articles that quote heavily or exclusively from men! So, the FT has handed the effort to correct this situation to a bot – call it a “FemBot” if you will (my name, not theirs). The bot scans articles during the editing process to determine whether the sources named in them are male or female. Editors are then alerted when they are falling short on including women in their pieces. Later versions might even alert writers to their overly male sourcing as they type.
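The FT hasn’t published its bot’s internals, but the basic idea – scan an article for named sources, guess gender from first names, and alert the editor when women are underrepresented – can be sketched in a few lines. Everything below (the name lists, the quote pattern, the threshold) is a hypothetical illustration, not the FT’s actual implementation:

```python
import re

# Hypothetical name tables for illustration only; a real system would use
# a large names database or a dedicated gender-inference service.
FEMALE_NAMES = {"mary", "sarah", "christine", "janet"}
MALE_NAMES = {"john", "david", "michael", "james"}

def source_gender_counts(article_text):
    """Count quoted sources that look male vs. female, by first name."""
    counts = {"male": 0, "female": 0, "unknown": 0}
    # Naive pattern: matches 'Firstname Lastname said' or 'said Firstname Lastname'.
    pattern = re.compile(
        r"\b([A-Z][a-z]+) [A-Z][a-z]+ said\b|\bsaid ([A-Z][a-z]+) [A-Z][a-z]+"
    )
    for match in pattern.finditer(article_text):
        first = (match.group(1) or match.group(2)).lower()
        if first in FEMALE_NAMES:
            counts["female"] += 1
        elif first in MALE_NAMES:
            counts["male"] += 1
        else:
            counts["unknown"] += 1
    return counts

def editor_alert(counts, threshold=0.5):
    """Flag the piece if identifiable female sources fall below a threshold share."""
    named = counts["male"] + counts["female"]
    if named == 0:
        return None
    share = counts["female"] / named
    if share < threshold:
        return f"Only {share:.0%} of identifiable sources are women."
    return None
```

Running it on a toy article like `'"Rates will rise," John Smith said. Mary Jones said growth is slowing. David Lee said otherwise.'` would count two male and one female source and, with a 50% threshold, raise an editor alert.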
The FT isn’t stopping there. It is also examining the images it uses and intends to press for more pictures of women, because women are more likely to click through on pictures that include women than on those containing only men. The Opinion desk at the FT is also tracking behavior, noting gender, ethnicity, and geographical location, with the goal of supporting more female and minority voices in the publication.
The concept of bias baked into artificial intelligence systems by their developers and data sets is an emerging issue and a well-identified risk of those systems. Here, however, the FT appears to be embracing bias deliberately in order to counteract it. Well done, FT!