By Mike Haberman
In 2002 the film Minority Report appeared with a plot line dealing with a special police unit of the future, in the year 2054, that was able to identify murderers before they committed their murders. The unit could arrest these people in order to prevent the crime, thus reducing murder to almost zero. Thirteen years ago that idea seemed far-fetched, but now, in the age of “big data” algorithms, the scenario seems much more realistic.
The company JP Morgan has implemented a new surveillance program with an algorithm designed to identify problem employees before they become problems. The algorithm reviews emails and texts looking for “employees who miss compliance classes, employees who violate personal trading rules, and workers who break market risk limits.”
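To make the idea concrete, here is a minimal sketch of how a rule-based text scan of this kind might work. The signal phrases, weights, and threshold below are invented purely for illustration; JP Morgan's actual system is not public and is surely far more sophisticated.

```python
# Hypothetical "predictive monitoring" sketch: scan message text for
# risk signals and score each employee. All phrases and weights are
# illustrative assumptions, not any real company's criteria.

RISK_SIGNALS = {
    "missed compliance": 2,   # e.g., skipping required compliance classes
    "personal trade": 3,      # possible personal-trading-rule violation
    "risk limit": 3,          # possible breach of market risk limits
}

def risk_score(messages):
    """Sum signal weights over a list of message strings."""
    score = 0
    for msg in messages:
        text = msg.lower()
        for phrase, weight in RISK_SIGNALS.items():
            if phrase in text:
                score += weight
    return score

def flag_employees(mailboxes, threshold=3):
    """Return the names of employees whose messages meet the threshold."""
    return sorted(
        name for name, msgs in mailboxes.items()
        if risk_score(msgs) >= threshold
    )
```

For example, `flag_employees({"alice": ["you missed compliance training", "near the risk limit again"], "bob": ["lunch?"]})` returns `["alice"]`. Even this toy version shows the legal tension the attorneys raise: a keyword match flags an employee long before any misconduct has actually occurred.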
As attorneys Keith Anderson and Amy Puckett point out, there is nothing new about employers having a right to review email, texts, or the devices employees use. My experience has been that most companies have such a policy in their employee handbooks. Anderson and Puckett, however, say that JP Morgan’s algorithm goes one step further and attempts to weed out bad employees before they ever have a chance to be bad. JP Morgan calls this “predictive monitoring.” To the company this is “a proactive step to aid in detecting intentions and preventing bad acts.” This is akin to the Minority Report police unit preventing murder by arresting the murderer who has not yet committed the murder.
As you might guess, the attorneys question what happens when JP Morgan actually identifies these employees. Does the employee get terminated? What kind of lawsuit will that result in? Anderson and Puckett seem to think the cost of defending a wrongful termination suit will be much cheaper than what the company has been paying to correct the illegal and improper financial actions of bad employees.
One thing the company is counting on is the deterrent effect of announcing such a program. Recall that when police announced they were going to use radar to catch speeders, many people started easing up on the gas pedal. But, as in that situation, such a program will only work if someone is actually caught and made an example. A deterrent without a consequence is no deterrent; it is just an empty threat.
Initially the threat of the algorithm may suppress bad behavior. But eventually someone will consider themselves “bulletproof,” or they will come up with some way to defeat the process, much as radar detectors flourished after radar was introduced.
The Role of HR
With such an algorithm does the role of HR get to be even more of a “police role” than it already is? Or is this just a logical extension of what HR already does in many organizations? Would you be interested in having such an algorithm reviewing your company’s email in order to anticipate “bad behavior?” Or do you already?