AI’s creeping influence: Are we handing over too much power?

AI is quietly (or not so quietly, depending on your experience) embedding itself into our daily lives, affecting the job market, media, governance, and even our cultural narratives. While much of the discussion about AI focuses on sudden, dramatic threats such as rogue Artificial General Intelligence (AGI) or deepfakes, there is another, more insidious risk: gradual disempowerment.
A recent study by Jan Kulveit of Charles University and Raymond Douglas of Telic Research shows how advances in AI could steadily erode human control over key social systems. What they describe is not an open AI rebellion but a slow, systematic shift in which AI increasingly replaces human decision-making in key areas such as the economy, governance, and culture. As these technologies are optimized for efficiency, market value, and forecast accuracy, human agency is quietly being sidelined.
Why does this matter? Because the mechanisms that align our society with human values (economic participation, cultural expression, and democratic governance) could slide outside our control. Left unchecked, the growing role of AI in decision-making may lead to a future in which human influence is marginalized and our ability to shape our own future is greatly weakened.
How AI reshapes the economy
The study warns that AI-driven automation is reshaping the global workforce, steadily replacing human labor across industries. While AI-powered tools can increase productivity and reduce costs, they also shift financial power away from workers, fundamentally changing the flow of wealth. As machines take on tasks that once depended on human cognition and expertise, traditional employment models are collapsing, leading to inequality and economic displacement.
An International Monetary Fund (IMF) report estimates that AI will affect nearly 40% of jobs worldwide, replacing some and complementing others.
One of the main economic consequences of AI dominance is wealth concentration. Companies that develop and control AI systems benefit disproportionately, while workers find themselves with fewer opportunities. This shift could create a world where financial power is concentrated in AI-driven businesses, leaving human labor with a shrinking role in the economy.
Another concern is AI’s growing role in economic decision-making. From stock market forecasting to resource allocation, AI systems operate at speeds and levels of complexity beyond human capability. While this can produce optimized financial strategies, it also removes human judgment from critical decisions, increasing the risk of economic instability. Without proper safeguards, AI-driven markets could prioritize efficiency and profit over broader social well-being, creating a system that favors AI-led entities at the expense of human workers.
When AI shapes creativity
AI is not only assisting human creativity, but also actively shaping the cultural landscape. In areas such as music, literature, and film, AI-generated content has become increasingly common, affecting not only the content produced, but also how audiences interact with art. While AI tools can help human artists by providing new technologies and inspiration, they also introduce risks that can fundamentally change creative expression.
One of the main concerns is that AI-generated content could overshadow human creativity. As AI systems produce music, writing, and visual art, the line between human-made and machine-made content blurs. This raises questions about originality, authorship, and artistic value – if algorithms drive the creative process, will human expression become obsolete?
Another risk is cultural homogenization. AI models generate content based on existing data, meaning they tend to reinforce dominant trends through algorithmic bias rather than encourage genuine innovation. Over time, cultural production optimized for engagement and algorithmic success may lead to a landscape in which originality is sacrificed for efficiency.
Beyond artistic expression, AI also shapes social narratives. AI-curated news, automated content moderation, and targeted media recommendations shape public discourse, filtering what people see and engage with. This creates a reality in which AI not only amplifies certain points of view but also determines which cultural narratives flourish and which fade into obscurity. Left unchecked, AI’s growing influence over media and communications could erode the diversity and autonomy of human-driven cultural expression.
The future of AI and governance
From predictive policing to automated social services, AI has also become a powerful force in political and bureaucratic decision-making. Governments around the world are integrating AI into their administrative frameworks to optimize operations for efficiency and scalability. However, this shift also raises concerns about the erosion of civic participation and democratic influence.
A key issue highlighted by the research team is that as AI is integrated into governance, states may prioritize technological efficiency over human rights and civic participation. AI-driven decision-making can streamline bureaucracy, but it can also depersonalize services and reduce accountability and transparency. For example, automated systems for welfare allocation or legal case assessment may prioritize data-driven efficiency over individual nuance.
There is also a risk that AI-driven states evolve into corporate-like entities whose governance is optimized for institutional stability rather than the public interest. AI-driven surveillance, predictive enforcement, and automated decision-making could produce governments that need less and less input from their citizens, further reducing human influence over governance.
Is this just another AI panic?
Skeptics might argue that AI is just another technological advancement, similar to past industrial revolutions. However, the study emphasizes that this is not about sudden AI domination but a structural shift in how society itself operates. Unlike previous technological disruptions, AI does not merely change industries – it actively replaces human roles in decision-making across multiple social domains.
The danger is a slow erosion of human influence, one that does not require AI superintelligence. Even without openly malicious intent, AI systems can gradually displace human judgment, leading to a future in which people have less and less control over the forces that shape their lives. The challenge is not to stop AI from progressing, but to ensure it remains aligned with human values and that we retain meaningful control over key social functions.
To mitigate the risk of gradual disempowerment, the team suggests taking proactive steps to protect human influence in economic, cultural, and governance systems:
- Implement human oversight policies: Governments and institutions must ensure AI-driven decisions remain transparent and subject to human scrutiny. Mechanisms should be adopted to prevent AI from making autonomous choices that affect fundamental rights.
- Strengthen democratic participation: As AI plays a greater role in governance, democratic institutions must adapt. This may include AI-assisted voting systems designed to enhance citizen participation rather than diminish it.
- Preserve human influence in creative and economic fields: Regulations should be introduced to maintain a balance between AI-generated and human-created content, ensuring that human creativity and labor are not overshadowed.
The study stresses that the risk of gradual disempowerment is not a distant hypothetical – it is already underway. Addressing it will require international cooperation, research on system-wide AI alignment, and an active public discourse about the role AI plays in shaping our society. The future is not predetermined, and with the right interventions, we can ensure that AI enhances human agency rather than diminishes it.