Can humans be fired by a bot?

While AI and machine learning can be leveraged for taking key decisions, they cannot be left to make these decisions on their own without a human interface

A Bloomberg report claims that an employee who had spent four years delivering packages for Amazon as a contract driver in Phoenix was suddenly axed. It wasn't the sacking that was sudden, but the process.

The driver received an automated e-mail informing him that his services had been terminated. A bot had fired a human! For Amazon, this may be quite acceptable, given that Jeff Bezos, chief executive officer, Amazon, believes that machines make more accurate decisions than humans because they are devoid of bias. What is cause for alarm, however, is the dependency on technology to carry out key functions, such as the termination of a contract, which is expected to be done in person.

Is it feasible?

The use of artificial intelligence and machine learning in the early stages of hiring isn't uncommon. Many organisations sieve CVs early on, using a set of algorithms to narrow down their searches. However, giving machines the power to take critical, high-risk decisions, such as analysing set parameters to conclude whether or not someone keeps a job, can be detrimental. In the Indian context, how feasible is this approach, now that the dependency on technology in HR has increased manifold?

Rajesh Balaji, CHRO, Matrimony.com, is rather confident that such a system will take at least a couple of decades to be widely accepted in India. He, however, doesn’t rule out the possibility completely, because the bot is just executing the decision of the organisation. He also suggests how it can be made a lot more compassionate despite being a mechanical process.

“An adequate ecosystem needs to be created for people to be made aware beforehand that they will be fired, and the bot will just execute it. The message should not come as a surprise. The concerned employees should be provided a warning about them being in the red. One more mistake and the person is fired. So, if adequate filters are put in place, then it will not matter whether it is a bot or a human at the other end,” Balaji explains.

He also points out that such things mostly occur at the entry level, because the higher up the pyramid one goes, the stronger the sensitivities.

The reason he thinks such a process will take decades to enter India is that there is currently no integrated system for it, though perhaps one day there will be. He also speaks of nudge technology: apps on phones that nudge people to meet deadlines without annoying or angering them. In fact, people so nudged often try to overachieve to compensate for their shortcomings.

“There is no question of being offended because there’s no individual emotion there. Therefore, it is all about creating a technological ecosystem, which will prepare people to expect a termination from a bot,” suggests Balaji.

Who will look into the intangible data?

What has been mapped into the algorithm is yet another matter to be factored in. The algorithm maps tangible data, such as achievements, performance and personal track records, but what about the intangible data?

Nihar Ghosh, president – HR, Emami, does not condemn the technology if it involves an AI programme that has been designed and mapped well, with defined parameters. However, the programme should only come back with a recommendation. The execution of the action should involve a human interface, where somebody talks to the individual with understanding. It should have human involvement rather than being impersonal.

“We run a human organisation, and if we don’t provide a human interface for a decision which involves an individual’s career and livelihood, then we’d better not be in this business. Let’s not employ human beings at all then. Tomorrow I may not even need a leadership team. I could just bring in AI-enabled machines and make them managers. Getting a system to come back and execute a termination is completely inhuman. An organisation choosing to do so shouldn’t be in the people business,” Ghosh asserts strongly.

Although he agrees machines cut down on biases, he is against them making the final call. “The world will not be governed by programmed learning. It is not an Avenger series. We created machines to serve us and not become subservient to them,” Ghosh points out.

Amit Das, CHRO, Bennett Coleman & Company, however, would not hastily label the Amazon case as one of machines taking over human jobs without understanding what really happened. A contractor signed a contract to perform a job to a pre-defined standard. When the contractor failed to perform to that standard, the contract was terminated through a pre-configured communication template. However, the decision to use this machine-monitored, machine-executed performance system was taken by humans. Machines can be designed to perform many routine and advanced tasks, but the division of tasks between man and machine is still a decision for humans to make.

“In performance-management processes, we rely on machines to give us performance data. We use that data to make performance decisions. In selection of resources as well, we leverage machines. The key difference is the next step — what do we do once we have the output from machines? Do we rely solely on the scores or also try and understand other factors that may have impacted performance, say, a personal exigency? How do we communicate the outcome to the concerned people? Can we communicate even bad news with deep empathy? That decision is for humans to make,” Das explains.

Does the answer lie in digital empathy?

Technology should be an enabler for us to take the right steps in the interest of the organisation, while preserving human dignity. It can perform monotonous tasks and take the cognitive load off decision-making, so that humans can perform higher-order functions. "To harness the power of technology, leaders will need to develop their digital empathy, and use it to design technical solutions. Only those organisations that focus on developing this critical competency in their leaders will be able to leverage the gift of technology. Others will relinquish their responsibility to machines in pursuit of hollow rational goals," asserts Das.

The firing of a person by a bot can come as a shock to many, and understandably so. When dealing with people, one has to keep human emotions and empathy in mind. So, while AI and machine learning should be leveraged to gather data, they should not be the ones making decisions such as terminating someone without any human interface.
