This 2016 video from the USA says about itself:
The (uncomfortable) truth of HR and leadership development | Patrick Vermeren | TEDxKMA
Quackery and pseudoscience can be very dangerous. Not only in medicine but also in human resources. Alarmed by the nonsensical ideas of Transactional Analysis, Patrick Vermeren set out on a mission to reveal the truth about the many HR theories, models and questionnaires. The academic literature revealed that most HR practices (in recruitment, assessment, development, coaching…) are very problematic and some even dangerous. In this Talk valid alternatives are presented.
Translated from Dutch weekly De Groene Amsterdammer, 31 October 2018:
Workers as a collection of data
PostNL [the privatised Dutch postal service], Zeeman, Shell, Achmea [insurance corporation], banks, the Tax and Customs Administration – companies and services are increasingly going to digitally measure the characters, qualities and performances of their employees. Sometimes up to their DNA. What does this mean for our privacy?
By Karlijn Kuijpers, Thomas Muntz and Tim Staal
On a fresh September afternoon the postwoman walks her round. She knows this neighborhood well: which house numbers are houseboats, where each letterbox is (sometimes at the level of a cat flap).
Yet the boss of Ellen [a fictional name, for privacy] looks over her shoulder, through PostNL's 'My Work' app. The app tells Ellen to the minute how long she should take for this neighborhood and sets out the fastest route. 'You are obliged to walk exactly that route. But there is often something wrong with it', Ellen knows. 'The craziest example is that the app sends you through a narrow alley where you cannot get through with your mail bags!' The app also does not take into account mail left over from the previous day: no matter how much mail you have, whether it snows, storms or the sun shines, the app always gives you the same amount of time for your neighborhood.
Postal workers who cannot finish within the official time can choose to log extra minutes. In the beginning they were not paid for those extra hours; that only happened after postal workers rebelled shortly after the introduction of the app. …
PostNL is not the only company that closely monitors its employees. From ASR insurance to the ING bank, from McKinsey to Zeeman and from the municipality of The Hague to the Port of Rotterdam, at all these organizations employees hand over the smallest details about their state of mind, health and even enthusiasm to their bosses. Employers collect details about their performance, citizenship, ‘working memory’ and stress level, often without their staff knowing, to use the ‘human capital’ more efficiently. Technicians and mathematicians take over the ‘soft’ personnel management, to make every personnel decision [supposedly] fact-based.
Research by PwC consultants shows that forty percent of international companies use artificial intelligence in the personnel department. US American companies are at the forefront: research by the American Management Association shows that 43 percent of US companies read their employees’ e-mails. European companies are still lagging behind, but in the Netherlands organizations such as the ABN Amro bank, APG, Rabobank, Shell, Achmea, KLM, ASML, the AIVD [spying service], the Defense department and the Tax Authority all have their own departments for ‘Human Resource Analytics’: HR analytics.
Whoever applies for a job is expected to play online games or is interviewed automatically. Whether all that effort yields anything, you do not know, because it may well be that your intended employer only looks at your micro facial expressions or your ‘natural language’. Once you have found a job, it is possible that all your e-mails are read to determine your mood, or that you need to do some brain boosting, so that you can ‘get into your flow’ and ‘craft your job’. Employers put on a mirrored pair of sunglasses: they keep a close eye on their staff, without the staff being able to look back at them and without it being clear what exactly they are looking at.
But how do you measure people? What is this all based on? And what does this mean for our privacy? The platform for investigative journalism Investico investigated the latest form of robotisation for De Groene, radio program Argos and the Rotterdam online magazine Vers Beton: a form in which human labour is not made superfluous by robots, but in which workers themselves are turned into streamlined robots. …
“For example, frequent use of the word “you” in Dutch [emails] indicates that you are shifting responsibility.”
Antal van den Bosch, professor of language and speech technology and director of the Meertens Institute, is very skeptical about these methods. In itself he is not afraid of automatic language analysis: Van den Bosch develops such technologies himself. But on the basis of an explanation of how the algorithms work, which Investico received from Keencorp [an HR corporation], Van den Bosch concludes that these algorithms say very little …
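That skepticism is easy to illustrate. The sketch below is a hypothetical example, not Keencorp’s actual algorithm (which the article does not disclose): it simply counts how often the pronoun ‘jij’ (informal Dutch ‘you’) occurs in an email. A reproach and a compliment can get exactly the same score, so the frequency alone says nothing about ‘shifting responsibility’:

```python
import re
from collections import Counter

def pronoun_rate(text: str, pronoun: str = "jij") -> float:
    """Share of tokens equal to `pronoun` (informal Dutch 'you')."""
    tokens = re.findall(r"\w+", text.lower())
    if not tokens:
        return 0.0
    return Counter(tokens)[pronoun] / len(tokens)

# Two short emails with opposite intent, same length:
blame  = "jij had dit rapport gisteren al moeten afmaken"       # a reproach
praise = "jij hebt dit rapport gisteren echt uitstekend afgemaakt"  # a compliment

print(pronoun_rate(blame))   # → 0.125
print(pronoun_rate(praise))  # → 0.125, identical score, opposite meaning
```

Real systems are more elaborate than this, but the underlying problem Van den Bosch points at is the same: surface word statistics are a very thin proxy for intent or mood.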
And then of course there is the privacy aspect. Keencorp does not worry about that. … He [the Keencorp big cheese] actually thinks privacy is an outdated idea. ‘Look at Google, privacy is just gone. We recently had a company where a few guys did not want to be measured. They also do not attend any company party, I heard from the manager. That single person who does not want to take part should not take away the benefit for the rest of the company. These are also the kind of people who were always against nuclear missiles.’ …
Dear Mr Keencorp Big Cheese: most people in the Netherlands oppose nuclear weapons. The biggest demonstrations ever in the Netherlands were against nuclear weapons.
This Dutch video is about the 29 October 1983 demonstration of 600,000 people in The Hague against the plan to have United States nuclear cruise missiles in the Netherlands.
In the 2018 referendum, most Dutch voters rejected the government’s anti-privacy plans.
So, not at all just the few isolated ‘idiots’ that you, Mr Keencorp fat cat, make them out to be.
“I feel I am being treated unfairly when I am rejected for a job because of my place of residence. I do not understand why that would be relevant. Achmea probably does not even know that; that is exactly the weird thing about big data.” …
BrainCompass continues to work on its [pseudo]scientific foundation. For this, the company also registers each participant’s ‘race and ethnicity’, in order to be able to investigate whether there is a relationship between race and ‘the way in which DNA is expressed’. …
In reality, the models of companies such as BrainCompass and Keencorp are dodgy on all sides, as are the fact-based analyses of the national government and Achmea. …
Amazon worked from 2014 on a tool that automatically selects the best applicants. Analysts found that the algorithm discriminated against women, because it was trained on the CVs submitted to the company over the past ten years. Most of those resumes came from men, and so the algorithm taught itself to exclude women. The techies worked for a few years to eliminate that robotic sexism, but recently had to conclude that it just does not work. …
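The mechanism behind that failure can be shown in a few lines. The toy sketch below is in no way Amazon’s actual system; the data and word scores are invented. It scores resume words by how often they appeared in past hires versus rejections. Because the historical hires were mostly men, a word like “womens” (as in “women’s chess club”) picks up a penalty all by itself:

```python
from collections import Counter

# Invented training set mimicking a decade of mostly-male hires:
# (resume keywords, was the applicant hired?)
past = [
    (["java", "captain", "chess"], True),
    (["python", "football"], True),
    (["java", "python"], True),
    (["womens", "java", "chess"], False),   # "women's chess club"
    (["womens", "python"], False),
]

hired, rejected = Counter(), Counter()
for keywords, was_hired in past:
    (hired if was_hired else rejected).update(keywords)

def score(keywords):
    # Naive per-word score: appearances among hires minus rejections.
    return sum(hired[k] - rejected[k] for k in keywords)

# Two applicants with identical skills; one mentions a women's club:
print(score(["java", "python"]))             # → 2
print(score(["womens", "java", "python"]))   # → 0
```

Scrubbing the obvious word does not solve this: the model simply shifts the penalty onto other words that correlate with gender in the skewed data, which is roughly why the project reportedly could not be fixed.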
Yet most Dutch companies continue on the same path. Many employers do not seem to care about decent, accurate figures. For them the data is a new crowbar to make sackings easier or a tool to show impressive figures at the end of the quarter.
This is also confirmed by the HR companies we have spoken to.