Research finds robot bosses get less respect and worse results

If you’ve ever fantasized about swapping your demanding human manager for a robot, new research suggests you might want to reconsider. While you may feel less pressure to obey a mechanical superior, your performance may actually suffer under its synthetic supervision.
In a new study published January 6 in the journal Cognition, Technology and Work, Polish researchers uncovered surprising insights into how humans respond to robotic authority figures in the workplace. The results show that while people display considerable obedience to humanoid robots placed in positions of power, they are less compliant and less productive than when working under human supervision.
“We have shown that people display a high degree of obedience to humanoid robots, although it is slightly lower than toward humans (63% vs. 75%),” explains Dr. Konrad Maj of the HumanTech Center for Social and Technological Innovation at SWPS University.
On obedience alone, this 12-percentage-point gap seems moderate, but the impact on productivity was substantial. “As the experiments have shown, people may show a decrease in motivation when machines supervise their work: in our study, participants performed their tasks more slowly and less efficiently under the supervision of robots,” Maj noted.
The research comes at a critical moment, as robots increasingly step into traditionally human positions of authority across sectors including education, health care and law enforcement. For organizations that view automation as a pathway to greater efficiency, the findings present a sobering reality check.
“This means that automation, if not planned correctly from a psychological point of view, does not necessarily improve efficiency,” Maj said.
The laboratory study, led by Maj together with colleagues Dr. Tomasz Grzyb, Professor Dariusz Doliński and Magda Franjo, involved a deliberately tedious computer task. Participants were randomly assigned to work under the supervision of either a human experimenter or a humanoid Pepper robot.
The task itself was simple but monotonous: changing file extensions on a computer. When participants showed signs of reluctance to continue (e.g., pausing for more than 10 seconds), the robot or the human experimenter would verbally encourage them to persist.
The performance differences were striking. Under human supervision, participants completed an average of about 355 files. When supervised by the robot, each file took nearly four times as long (82 seconds per file), and participants completed an average of only 224 files, with productivity dropping by nearly 37%.
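That headline figure is simple arithmetic on the file counts: (355 - 224) / 355 ≈ 0.37, a decline of roughly 37% in output under robotic supervision.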
These results highlight the complex psychological dynamics of human-robot interaction. Although robots are often deployed with the expectation of greater efficiency, the research shows that human factors remain crucial, especially where authority and motivation are concerned.
The researchers point out that a robot’s appearance significantly affects how people respond to it. As the study notes: “Robots that look more human-like are considered to be more capable and trustworthy.” The relationship is not linear, however: robots that look very human without achieving perfect similarity trigger the so-called “uncanny valley” effect, which actually reduces trust and comfort.
Maj offers some possible explanations for this phenomenon: “If a machine has clear human characteristics but still exhibits various imperfections, this leads to cognitive conflict: we are confused about how to treat it, and we don’t know how to behave toward something like that. But we can also talk about emotional conflict: fascination and admiration mixed with disappointment and fear.”
There is also an evolutionary perspective. “Proponents of evolutionary explanations claim that humans are programmed to avoid various pathogens and threats, and a robot that imitates a human but is still imperfect registers as a threat. Why? Because it looks like someone who is sick, disturbed or unbalanced.”
The implications extend beyond workplace efficiency into the wider social sphere. As robots become more deeply integrated into daily life, questions about human-robot relationships grow more complex.
“Robots that look like humans and communicate like humans are easy to use,” Maj admits. “But there is also a dark side to this: if we create robots that are very similar to humans, we will no longer see the boundaries. People will start to befriend them, demand that they be granted various rights, and perhaps even marry them in the future.”
This blurring of boundaries could have unexpected social consequences. “In the long run, humanoid robots can create rifts between people. There will be more misunderstanding and resentment, because robots owned at home will be personalized, always available, and compassionate and understanding in communication. People can’t match that,” Maj notes.
For employers and HR departments, the takeaway is clear: placing a robot in a supervisory role requires careful consideration of psychological factors, including how it is perceived as an authority figure, how trust is built, and potential resistance to following its commands.
As workplaces automate at a rapid pace, this study suggests that the most effective approach may be to combine the strengths of human and robotic leadership, rather than to treat automation as a wholesale replacement for human supervision.