A More Mundane Example of Algorithm Design Error

Last week I posted about the errors in the Presidential Election polls for this General Election. These errors occurred despite the polling industry's assurances that it had learned from its mistakes in 2016 and that the polls would be more accurate this time. I used these errors as an example of how bad algorithm designs, combined with suspect data, can result in unacceptably wrong outcomes.

This week I am providing a more mundane example, but nonetheless one that shows the limitations and risks of algorithms. Facebook uses an algorithm to insert advertisements into subscribers’ Facebook pages. Facebook advertisers pay for the opportunity to aim advertising at a class of likely consumers, thereby promising “more bang for the buck”. If my page is anything like those of Facebook subscribers in general, then Facebook, despite its huge investment in advertising algorithms, is falling very short.

Recently, the vast majority of the ads on my Facebook page started to be for workout wear, shorts that are clearly designed to show off one’s assets (emphasis on the first syllable), and lounge wear for the apparently active man who actually sits around most of the day (one can wear these lounge pants for ‘twenty-four hours at a time in complete comfort’, although with apparently complete disregard for daily hygiene). I can only surmise that Facebook must have recently subtracted thirty-five years and thirty-five pounds from my personal dataset. Unfortunately, I can guarantee that I cannot fit my assets into these clothing lines, and if I could, the result would be uncomfortable for me and for anyone within eyesight of me. So every one of those placements is a waste of ad money for the advertiser, and a lost opportunity for more appropriate advertisers selling what would actually entice me. Might I suggest those offering food or cocktail of the month clubs, or clothes that fit an “Oh, Daddy!” (as my teen daughter disapprovingly uses that phrase for every occasion), not an “OH DADDY!” (as that phrase might be used in streaming porn).

The closest that Facebook’s algorithm gets to being right is with ads for biking tours through various wine “countries”. These ads would have been appropriate a few decades ago (before Mark Zuckerberg was born, let alone formulated the idea for easing hookups at Harvard that was to become Facebook), but I now prefer my touring to be via chauffeured vehicles, so I can drink without concern for driving, let alone pedaling. This placement can be deemed fifty percent accurate. Unfortunately, as with almost any real-world application of algorithms, fifty percent (or even eighty percent) accuracy is going to prove woefully inadequate, and thus useless or even harmful to the public.

I might add that I have clear proof that I am not the only recipient of these woefully inadequate ad placements. My nephew, who just finished advanced communications training with the Marines, let me know that the ads on his Facebook page are for rentals of private jets. He thought that perhaps Facebook knows best, and that the problem is simply that he is the only underpaid Marine in the United States. Ironically, he is a far better candidate for the workout wear ads that I am receiving, and I, in turn, at least enjoyed a few private jet trips in my professional career.

While this ad algorithm deficiency is amusing, no one would be laughing if the algorithm were for medical treatment, and my physical conditions were being ignored because the algorithm assumed I was as healthy as a young Marine, while the young Marine was being given needless treatments because the algorithm had concluded he suffered the health problems of an aging, sedentary attorney. That is why, in the Age of Algorithms, AI should not be considered a perfect miracle worker, and why humans must always be the final and complementary step in any AI process with serious ramifications.