Algorithms: The Invisible Boss in the Digital Age

Although merely mentioning the subject sounds like science fiction, or a bold chapter of Black Mirror, millions of people around the world, in a very subtle but forceful way and often without realizing it, have an algorithm, a piece of software, a neural network or a code based on Artificial Intelligence as their boss or immediate superior.

What are the PROS and CONS of algorithm-based automated job monitoring?

By: Gabriel E. Leby B. – @galevy

One of the greatest challenges of digital transformation is that it unleashes deep changes that occur in front of our eyes yet generally go unnoticed by a large part of the population, whether due to a lack of rigorous reflection or simply because evolution has prepared us to adapt to change without putting up much resistance. Possibly for this reason, although the automation of hierarchy in labor relations is a palpable fact that has been consolidating in recent years, it is very likely that it has not been sufficiently examined and, what is even more serious, that the present and future challenges it represents have not been regulated.

The broken promise of working without bosses

Perhaps the most recurring promise made by transport applications such as Uber, Didi, Cabify and Lyft is that if you work for them you will be able to “Be your own boss”. This is not entirely false, given the tangible absence of “a human boss” supervising the fulfillment of tasks. However, it is imprecise to infer that there is no employer hierarchy: control rests in the hands of an automated structure based on algorithms, or computer code, which gives orders regarding the assigned client, the value of the service, the suggested route, work schedules and the fulfillment of the assigned task, among many other factors.

On this point, researcher Alex Rosenblat, co-author of research at Data & Society, a non-profit research institute in New York City (USA), cited in an article in MIT Technology Review, concluded that applications such as Uber:

“They control the way their drivers behave while on duty, interacting with an automated management system through a mobile app” [1].

The researcher suggests that although the applications present themselves as neutral intermediaries between users who offer services and others who demand them, in practice the algorithm underlying this technology behaves like a “boss” toward the drivers:

“As long as a driver’s account is active, the app assigns them nearby jobs; that is, it assigns the work that must be done. At the same time, the system tracks the proportion of jobs that the driver has accepted (both Lyft and Uber give drivers 15 seconds to decide) and averages the score that passengers have assigned to the driver after each service. Drivers can be suspended for not accepting enough jobs, or for low user scores; they receive incentives to work during certain hours, or in specific locations, through a “surge pricing” model that raises rates temporarily”. Alex Rosenblat, co-author of the Data & Society research [2]

In practical terms, working for applications such as Uber, Cabify or Lyft means accepting that an algorithm dictates what services to provide, at what rate and under what conditions, and then being supervised and evaluated on the quality of the activity, with the possibility of being suspended or rewarded for the work performed, just as would happen with a human “boss”.
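The supervision model Rosenblat describes can be sketched in code. The following is a deliberately simplified illustration: the field names, thresholds and suspension rule are assumptions made for clarity, not the actual implementation of Uber, Lyft or any other platform.

```python
from dataclasses import dataclass, field

@dataclass
class DriverRecord:
    """Hypothetical per-driver metrics that a dispatch algorithm might track."""
    jobs_offered: int = 0
    jobs_accepted: int = 0
    ratings: list = field(default_factory=list)

    @property
    def acceptance_rate(self) -> float:
        # Proportion of assigned jobs the driver accepted.
        return self.jobs_accepted / self.jobs_offered if self.jobs_offered else 1.0

    @property
    def average_rating(self) -> float:
        # Average of passenger scores after each service.
        return sum(self.ratings) / len(self.ratings) if self.ratings else 5.0

def should_suspend(driver: DriverRecord,
                   min_acceptance: float = 0.8,
                   min_rating: float = 4.6) -> bool:
    """Suspension rule described in the article; thresholds are illustrative."""
    return (driver.acceptance_rate < min_acceptance
            or driver.average_rating < min_rating)

# Example: a well-rated driver who declines too many assigned jobs
d = DriverRecord(jobs_offered=20, jobs_accepted=12, ratings=[4.9, 4.8, 5.0])
print(should_suspend(d))  # acceptance rate 0.6 < 0.8 -> True
```

The point of the sketch is that no human judgment enters anywhere: the driver's fate depends entirely on two numbers and two fixed thresholds.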

As with the so-called transportation applications, the same “modus operandi” is repeated in delivery apps such as Rappi, Uber Eats or Glovo, in which an algorithm assigns a worker to the service to be provided while controlling every stage of the product delivery process.

Automation processes in traditional companies

Although the best exponents of job automation are the digital service applications mentioned above, it is important to bear in mind that the phenomenon has also permeated traditional companies that have been concentrating their efforts on digital transformation, including, of course, the automation of many supervisory tasks previously performed by humans.

For companies, automation of management, monitoring and supervision processes can bring notably greater efficiency, economy and transparency to hierarchical relationships, which can end up being reflected in substantial improvements in some institutional processes and procedures.

An example of this transformation is the so-called “call center”, a company dedicated to outsourcing user support services, where supervision of the operators’ tasks is handled by algorithms. These automatically assign calls to each operator, control call handling times, average the service evaluations and calculate connected time for the payment of fees. Some very sophisticated algorithms even analyze the words used by the operator, penalizing in some cases certain types of expressions, such as mentioning competing brands, swearing or referring disparagingly to certain topics [3].
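A minimal sketch of that kind of transcript check might look like the following. The penalized word list, the scoring and the function name are all invented for illustration; real speech-analytics products are far more elaborate.

```python
# Hypothetical transcript check: the penalized terms and the simple
# word-count scoring are illustrative assumptions, not any vendor's rules.
PENALIZED_TERMS = {"competitorbrand", "damn"}  # e.g. competitor names, swearing

def transcript_penalties(transcript: str) -> int:
    """Count occurrences of penalized expressions in a call transcript."""
    words = transcript.lower().split()
    return sum(1 for w in words if w.strip(".,!?") in PENALIZED_TERMS)

print(transcript_penalties("Our plan beats CompetitorBrand, damn right!"))  # 2
```

Each flagged occurrence would then feed into the operator's automated performance report, again with no human reviewing the context in which the words were spoken.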

The telecommunications industry has also been automating supervision: many large companies assign home-service routes to their technical operators through information systems, and algorithms control service-provision times. The technical operators themselves request activations, schedules and authorizations through these systems, which authorize or deny those processes while monitoring the fulfillment of tasks and all service-provisioning activities.

Generally, this type of cyber-solution issues detailed monthly performance reports on each employee, who in many cases does not even work for the same company but is outsourced through another contractor.

Possibly the main difference between traditional industry and the new emerging applications is that the former uses a mixed human-software supervision model, while the latter entrust the supervision process entirely to the algorithm.

Are algorithms more objective?

Since software systems are not driven by the subjective aspects of human nature, it could be assumed that algorithms can be much more accurate, fair and balanced with their employees: unless programmed to do so, an “algorithm” will not be envious of its dependents, nor will it discriminate against them because of race, beauty, size or the way they dress and think, which could mean substantial progress in social inclusion and respect for human rights. However, although this seems logical in theory, many other variables are involved, such as the lack of tact and humanity; that so-called “objectivity” can be a double-edged sword and could eventually prove unfair under the rule of law.

An algorithm, for example, could fire a worker who suddenly left her post because of a serious accident involving one of her children, a factor that would be irrelevant to an automated code. Based simply on the employee’s behavior and not on her motivations, it could take a perfectly logical decision, yet one lacking the slightest principle of humanity.
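The problem can be made concrete with a toy attendance rule. Everything below, the record fields, the threshold, the function, is invented for illustration; what matters is what the data model leaves out.

```python
from dataclasses import dataclass

@dataclass
class Shift:
    scheduled_hours: float
    worked_hours: float
    # Note what is missing: the record carries no field for *why*
    # hours were missed (e.g. a family emergency).

def attendance_ok(shifts: list[Shift], min_ratio: float = 0.9) -> bool:
    """A context-blind rule: compares total hours worked to hours scheduled."""
    scheduled = sum(s.scheduled_hours for s in shifts)
    worked = sum(s.worked_hours for s in shifts)
    return worked / scheduled >= min_ratio

# A worker who left mid-shift for a child's accident looks identical,
# to this rule, to one who simply walked out.
shifts = [Shift(8, 8), Shift(8, 8), Shift(8, 2)]
print(attendance_ok(shifts))  # 18/24 = 0.75 -> False
```

Because the circumstance was never recorded as a variable, no amount of computation over this data can take it into account: the “unfairness” is built into the data model before a single line of decision logic runs.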

In an interview with El País RETINA magazine in Spain, the country’s labor minister, Yolanda Díaz, referred to this issue, offering insight not only into the complexity of the matter but also into possible strategies for its regulatory intervention:

“Algorithms are not abstract entities; there are processes behind them that must be analyzed, evaluated and monitored. Control of work activity by the employer, for example, is a right, but it is not unlimited. And the protection of fundamental rights, the protection of data and the guarantee of digital rights must always prevail. I remember an episode of ‘The Good Fight’ in which an algorithm penalized the best teacher in the school, condemning him to dismissal. We have to extract two basic lessons: they are not infallible systems and their operation must be controlled”. Yolanda Díaz, Labor Minister of the Kingdom of Spain [4]

Similarly, the minister draws attention to the complexity of regulating algorithms in the field of employment, pointing out that:

“Technology is a tool in the hands of the people, not the passport to another dimension in which the rights and protection of workers are not fulfilled or disappear! Our rights must go hand in hand with new technological developments. Before it was said that workers’ rights did not stop at the factory doors, now they do not stop at the computer keyboard either”. [5]

In conclusion, the possibility of an algorithm acting as a “boss” is not a matter of science fiction but a palpable reality that, often without our realizing it, is already part of our daily lives, making corporate control and management much more efficient, generating value, market confidence and potential new investment, and boosting the economy.

From a social, labor and, above all, regulatory perspective, this type of technology represents a great present and future challenge. In the coming years, regulation must emerge at a global level that guarantees a humanistic perspective on the quality of work and ensures that rights are not relegated by corporate logic, because in the end it is people, with feelings, illusions, expectations and aspirations, who will suffer the impact of inadequate regulation of a technology that, although it may be “efficient”, is not necessarily “fair”.

[1] Reference article in MIT Technology Review

[2] Reference article in MIT Technology Review

[3] Reference article on the working conditions of call centers in Argentina

[4] Interview with the Spanish Labor Minister, published in Retina magazine

[5] Interview with the Spanish Labor Minister, published in Retina magazine

Disclaimer: This article is a review and analysis in the context of digital transformation in the information society, duly supported by reliable and verified academic and/or journalistic sources. This is NOT an opinion article, and therefore the information it contains does not necessarily represent the position of Andinalink, its authors, or the entities with which they are formally linked, regarding the issues, persons, entities or organizations mentioned in the text.