Don’t Blame Me — The Algorithm Made Me Do It

Asked to leave, the doctor still would not collaborate in his randomly selected humiliation and schedule-breaking ouster.

The doctor being dragged from the plane

By now you have probably seen or at least heard about a United Airlines passenger being literally dragged off an overbooked airplane after a computer algorithm selected him at random.

The passenger refused to cooperate—he could have walked off the plane with some dignity. Instead he chose to make a statement by forcing the Chicago Aviation Police to manhandle him. You know, like Civil Rights protesters in the South.

Videos of the incident have gone viral and it has now become A Thing.

This story gives us several points to ponder.

Point #1:  The Offer

It seems that no one on the plane took UA’s offer of money to give up their seat to—wait for it—four United Airlines employees. Not paying customers. Not a bereaved family trying to get home. Not a medical emergency. Not even a politician or a celebrity. They chose to treat a customer badly in order to smooth things for four of their own.

To be fair, this was a flight crew that needed to get back to base to keep another flight on schedule. In any other circumstance, bumping customers for deadheading crew would be appalling.

This is, however, the logical extension of the airline companies’ growth plans, which have systematically and consistently put customer satisfaction at the bottom of their priority list. Once you have gotten away with treating your customers badly, once you have made maximizing profit at customer expense a policy, once you have inured yourself to the complaints and outrage of those who fill your narrow, flat, upright, hard and uncomfortable seats, well, you can do anything at all.

Dragging a customer down the aisle hardly seems like much of a leap.

Point #2:  The Options

A United Airlines plane

After getting no takers on their $400 offer for anyone who would give up a seat, they increased it to $800. Still no volunteers. At this point, United Airlines had other options. (I am assuming there were no empty seats in First Class.)

They could have:

  • Increased the amount of the offer
  • Offered First Class upgrades for a future United Airlines flight.
  • Offered a “couples rate” to get two volunteers.
  • Offered a free ticket to anywhere in the continental United States.
  • Offered a year’s membership in the United Club airport lounges.
  • Offered a package of two or three of these options.

There, was that so difficult? It took me five minutes to think of six things that would have made a customer or two happier about giving up their seat(s).

Point #3: The Algorithm

Having made the decision to throw a passenger from the plane, UA whiffed on making a logical selection about which human beings to target and had a computer decide.

Welcome to our Brave New World in which algorithms decide and human beings follow orders.

As Alistair Croll points out in his blog post, “I just did what the computer told me to,” this incident demonstrates what happens when machines make black-and-white decisions inside the gray areas that humans currently finesse.

These small infractions of the rules—looking the other way or allowing a small change—make life easier for us all and they depend on the kindness of strangers. Such generosity also avoids creating public relations debacles like this one.

Binary Decisions

But computers don’t operate in gray areas. They are binary: on or off, light or dark, yes or no, go or stay. Given instructions that someone had to go, the computer algorithm selected four.
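United has never published how its system actually picks passengers; news reports said only that the selection was random. As a purely illustrative sketch of how blunt such a "selection algorithm" can be (the function name, manifest, and numbers here are all hypothetical), the whole decision could be as simple as:

```python
import random

def select_for_bumping(passengers, seats_needed):
    """Pick passengers to bump purely at random.

    Illustrative assumption only: reports described the choice
    as random, but the airline's real criteria are not public.
    Note that nothing here can see what a human agent would --
    a doctor with patients, a family trying to get home.
    """
    return random.sample(passengers, seats_needed)

# Hypothetical full flight of 70 passengers, 4 seats needed for crew.
manifest = ["passenger_%02d" % n for n in range(1, 71)]
bumped = select_for_bumping(manifest, 4)
print(bumped)
```

The point is not the code's sophistication but its blindness: a random draw is binary by construction, with no field for nuance, context, or mercy.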

Three of the four got up and left. Number Four—a doctor—refused to move because he had patients to see. He would not cooperate. Well, why should he? He had a ticket. He had a seat. And he had a schedule.

Enter Chicago’s Aviation Police with instructions to deal with this non-violent, passively resisting human being. Asked to leave, the doctor still would not collaborate in his randomly selected humiliation, inconvenience, and schedule-breaking ouster. The Chicago Aviation Police are a hammer, and they were looking for a nail.

When they hauled the man out of his seat he struck his face on the armrest and began to bleed. This constitutes assault. They then dragged him down the aisle with his shirt rolling up and his glasses falling off.

I Did What I Was Told

We have gone from the infamous excuse, “I was just following orders,” to the new justification, “I just did what the computer told me to.” How quickly and easily we human beings stoop to inhuman behavior.

The officer could have told the airline that the overbooking was their fault and, thus, their issue to deal with, and that he would not drag a decent, passively resisting human being down the aisle. He could have, but he didn’t. Given a chance to exert force over another human being, he took it.

A human being tasked with taking four people off the plane might have noted that the passenger was a doctor and passed him by in favor of someone with fewer life-and-death responsibilities. A computer algorithm is blind to such nuance.

No Apology

United Airlines has apologized for the overbooking situation and for having to re-accommodate customers. But not for treating a human being, and a customer, badly enough to make him bleed.

To complicate an already outrageous situation, the doctor is Chinese, the videos of his mistreatment have gone viral in China, and discrimination is being alleged. The United Airlines PR department is undoubtedly working around the clock, but they are bailing against the tide.

This incident demonstrates so many failures of good judgment and so many bad decisions that it’s difficult to even list them. It does not, however, augur well for a future in which computer algorithms and programmed robots take over human functions. We have now seen the future in which humans execute the decisions of mechanical devices and that future is us.