Month: December 2020

A More Mundane Example of Algorithm Design Error

Last week I posted about the errors in Presidential Election polls for this General Election. These errors occurred despite exhortations by the polling industry that it had learned from its mistakes in 2016, and that the polls would be more accurate this time. I used these errors as an example of how bad algorithm designs, combined with suspect data, can result in unacceptably wrong outcomes.

This week I am providing a more mundane example, but one that nonetheless shows the limitations and risks of algorithms. Facebook uses an algorithm to insert advertisements into subscribers’ Facebook pages. Facebook advertisers pay for the opportunity to aim advertising at a class of likely consumers, thereby promising “more bang for the buck”. If my page is anything like those of Facebook subscribers in general, then Facebook, despite its huge investment in advertising algorithms, is falling very short.

Recently, the vast majority of the ads on my Facebook page started to be for workout wear, shorts that are clearly designed to show off one’s assets (emphasis on the first syllable), and lounge wear for the apparently active man who actually sits around most of the day (one can wear these lounge pants for ‘twenty-four hours at a time in complete comfort’, although with apparently complete disregard for daily hygiene). I can only surmise that Facebook must have recently subtracted thirty-five years and thirty-five pounds from my personal dataset. Unfortunately, I can guarantee I cannot fit my assets into these clothing lines, and if I could, the result would be uncomfortable for me and for anyone within eyesight of me. So every one of those postings is a waste of ad money for the advertiser, and a lost opportunity for more appropriate advertisers selling what would actually entice me. Might I suggest those offering food or cocktail-of-the-month clubs, or clothes that fit an “Oh, Daddy!” as my teen daughter disapprovingly uses that phrase for every occasion, not an “OH DADDY!” as that phrase might be used in streaming porn.

The closest Facebook’s algorithm comes to being right is with ads for biking tours through various wine “countries”. These ads would have been appropriate a few decades ago (before Mark Zuckerberg was born, let alone formulated the idea for easing hookups at Harvard that was to become Facebook), but I now prefer my touring to be via chauffeured vehicles, so I can drink without concern for driving, let alone pedaling. This placement can be deemed fifty percent accurate. Unfortunately, as with almost any real-world application of algorithms, fifty percent (or even eighty percent) accuracy is going to prove woefully inadequate, and thus useless or even harmful to the public.

I might add that I have clear proof that I am not the only recipient of these woefully inadequate ad placements. My nephew, who just finished advanced communications training with the Marines, let me know that the ads on his Facebook page are for private jet rentals. He thought that perhaps Facebook must know best, and that the problem was that he is the only underpaid Marine in the United States. Ironically, he is a far better candidate for the workout wear ads that I am receiving, and I, in turn, at least enjoyed a few private jet trips in my professional career.

While this ad algorithm deficiency is amusing, no one would be laughing if the algorithm were for medical treatment, and my physical conditions were being ignored because the algorithm assumed I was as healthy as a young Marine, while the young Marine was being given needless treatments because the algorithm had concluded he suffered the health problems of an aging, sedentary attorney. That is why, in the Age of Algorithms, AI should not be considered a perfect miracle worker, and why humans must always be the final and complementary step in any AI process with serious ramifications.

Posted by Alfred Cowger

The Bad Presidential Polling Results and their Ominous Implications in the Age of Algorithms

For the second Presidential election in a row, the polls were materially off. This suggests that the science of polling is regressing, not improving: pollsters are failing to create polling models that realistically reflect the electorate, and extrapolations based on faulty models are producing faulty polls. Not only is this bad news for polling in years to come, but it is a sadly superlative example of what society faces with algorithms in the future.

Just like polls, algorithms are dependent on three fundamental elements: 1) an accurate database, 2) a design that uses, or allows the algorithm to learn, accurate correlations between the database and the conclusions to be drawn by the algorithm, and 3) a design that properly uses those correlations to reach accurate conclusions. If any one of these elements fails, the entire algorithm will produce inaccurate, misleading, or simply incoherent results.
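How a failure in just the first element (an inaccurate database) poisons everything downstream can be sketched with a toy simulation. All numbers here are hypothetical, chosen only for illustration: an electorate with 52% support for one candidate, polled once with an unbiased sample and once with a sample in which that candidate’s supporters are modestly more likely to respond.

```python
import random

random.seed(42)

# Hypothetical electorate: 52% support candidate A (1 = supporter, 0 = not).
TRUE_SUPPORT = 0.52
electorate = [1 if random.random() < TRUE_SUPPORT else 0 for _ in range(100_000)]

def poll(voters, n, response_bias=0.0):
    """Sample n respondents from the electorate.

    response_bias skews who actually answers: a positive value makes
    supporters of A more likely to respond, simulating a flawed sample
    (a corrupted "database") that the rest of the model trusts blindly.
    """
    sample = []
    while len(sample) < n:
        v = random.choice(voters)
        respond_p = 0.5 + response_bias if v else 0.5 - response_bias
        if random.random() < respond_p:
            sample.append(v)
    return sum(sample) / n

unbiased = poll(electorate, 2000)                      # tracks the true 52%
biased = poll(electorate, 2000, response_bias=0.1)     # systematically too high
print(f"true: {TRUE_SUPPORT:.2f}  unbiased: {unbiased:.3f}  biased: {biased:.3f}")
```

Even a modest response skew pushes the biased estimate several points above reality, and no amount of careful arithmetic in steps two and three can repair it, which is the point of the three-element framing above.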

The bad polling results have ominous implications for what society faces in the Age of Algorithms. Just as bad polling models suggested that Biden might actually win my home state of Ohio, a bad algorithm might decide, for example, that some convicted criminals are more likely to be a harm to society, and thus instruct judges to increase the sentences for those guilty persons. However, when a poll is bad, the election results will eventually show the poll was erroneous, and the poll will not have done fundamental harm to the election, except perhaps to the pride of commentators who were paid to spout the poll results like an electoral oracle. In contrast, after a criminal is given a longer sentence by a faulty algorithm, there is no subsequent test by which that sentencing will be proven wrong, and the harm caused to the prisoner will mean an irreparable loss of additional and unnecessary years in prison, while criminals who perhaps should be in prison longer will be freed too soon to commit more crimes.

So the bad polling of the last two elections should serve as an omen for what society faces in the Age of Algorithms.

Posted by Alfred Cowger

Due to technical difficulties….

Due to a combination of several intermittent days without internet, as well as a glitch in the upgrade of my blog’s programming that knocked me offline for several days, I have been unable to update this blog until today. But that’s not a problem, because what possibly could have happened between October 10th and now that would give cause for legal analysis? Oy….

Posted by Alfred Cowger