Why do AI systems meet resistance?

Speed and efficiency do not automatically make processes legitimate. When decisions are opaque, distance and resistance arise. Sustainable digitization requires systems that explain, enable human involvement and create trust.

Efficiency is not legitimacy

Why fast systems meet resistance without explanation

Automation is often used to make processes faster and more consistent. In many organizations this works: decisions are made more quickly and consistency improves. However, speed alone does not appear to be enough to secure support.

When systems make decisions without offering insight into how they work, distance arises, especially when the effects are immediately noticeable. People want to understand how a decision came about, even if the outcome is unfavorable.

In many digital systems, this explanation is difficult to give. The considerations are not clearly recorded, and it is unclear which data was decisive. This makes it difficult to assign responsibility or to make corrections.
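One way to make considerations traceable is to record, for every automated decision, which inputs were used and how each contributed to the outcome. The sketch below is a minimal illustration of that idea; all names (`DecisionRecord`, `factors`, `explain`) are hypothetical and not taken from any particular system.

```python
from dataclasses import dataclass, field
from datetime import datetime, timezone


@dataclass
class DecisionRecord:
    """Audit record for an automated decision: which data was used,
    how each factor contributed, and what the outcome was."""
    subject_id: str
    outcome: str
    factors: dict[str, float]  # factor name -> contribution to the decision
    timestamp: str = field(
        default_factory=lambda: datetime.now(timezone.utc).isoformat()
    )

    def explain(self) -> str:
        """Human-readable explanation, most decisive factor first."""
        ranked = sorted(self.factors.items(), key=lambda kv: -abs(kv[1]))
        lines = [f"Decision for {self.subject_id}: {self.outcome}"]
        lines += [f"  {name}: {weight:+.2f}" for name, weight in ranked]
        return "\n".join(lines)


record = DecisionRecord(
    subject_id="case-123",
    outcome="rejected",
    factors={"income": -0.6, "debt_ratio": -0.3, "tenure": 0.1},
)
print(record.explain())
```

Because the record names both the data used and each factor's weight, a recipient (or a review board) can see what was decisive and where a correction should start.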

This ambiguity leads to resistance. Not against automation itself, but against the feeling that processes are unapproachable. Those who receive a decision want to know where to turn and how redress is possible.

Administrative friction occurs when systems are set up faster than the accountability around them. Appeal procedures are increasing and decision-making is coming under pressure. The intended efficiency gain thus loses its value.

Legitimacy requires systems that allow for explanation and human involvement. Not as an exception, but as part of the design.

At Xuntos, we see efficiency as important, but not as a leading criterion. In public and regulated contexts, acceptance determines the success of digitization. Understandable systems are used sustainably.

That is why automation requires more than speed. It demands trust.

Scientific insights into resistance to AI acceptance

Recent studies show that acceptance of AI can decline after waves of hype, with growing demand for human control. Why? Resistance is often caused by:

  • lack of trust
  • ethical concerns
  • inequality between groups

How, then, can you still take the step towards successful adoption? Key factors include:

  • change management
  • transparency
  • learning organizational culture

Source: arxiv.org

AI adoption grows, but governance lags behind

Arrange governance properly: when AI is used responsibly, resistance falls.

Despite accelerating adoption, governance still does not seem to be a logical step for many companies. While 57% of organizations are at an advanced stage of AI adoption, only 27% have established a comprehensive AI governance framework. The breakdown by maturity is significant: 2% have no policy at all, 15% work ad-hoc without formal governance, 56% are developing policies, and barely 27% have an extensive framework with active enforcement and regular reviews. According to the study, this lack of governance can prevent many organizations from exploiting the full potential of AI, resulting in unused returns.

In short, many pilots are being scaled up to production, but organizations are still struggling with:

  • governance and regulation
  • training and skills
  • infrastructure and security

Source: dutchitleaders.nl

Getting started with responsible AI

At Xuntos, working together in a scalable and responsible way is the basis of everything we do. How do you arrange governance properly? It starts with the foundation. We begin with an analysis for a thorough plan of action: we analyse data, processes and risks, and translate them into a concrete architectural sketch.
Learn more about our approach at

Getting to know

Do you have questions? Or would you like to brainstorm together?

We are happy to think along with you about how your organization can accelerate digitally, whether it concerns UX, a technical challenge, or AI solutions. Send us a message and we will get in touch as soon as possible.