Episode 2. The consequences of the “AI Act” for the manufacturing and production industry

Alex Van Unnik on January 20, 2025

Frank Ferro, former Program Director GenAI at PostNL, explains that the AI Act will affect every manufacturing company. The impact will depend on whether you use AI for your product or just internal processes. What will be the consequences of non-compliance?

AI in the Manufacturing Industry: Differences Between Processes and Products

Alex: Welcome, everyone, to this knowledge-sharing session on data in the manufacturing industry.

Today, I have invited a few experts to discuss this topic. My first guest is Frank Ferro, Program Director of AI at PostNL.

What are the main challenges you see at the moment regarding the AI Act in the manufacturing industry?

Frank: The main challenge has to do with where you apply AI within the manufacturing industry. This is true in all industries, but particularly in manufacturing.

If you use AI in your internal processes, the conditions of the AI Act and the Data Act are very different from when AI is used in products. In the latter case, it affects society, consumers, and other businesses, and the legislation arising from the various acts becomes much stricter.

You must inform and notify people that AI is being used in a product, explain its effects, and how it is applied. If AI is only used in internal processes—for example, to manufacture products without directly affecting people—the regulations are much less stringent.

For instance, an algorithm that chooses between Machine A or Machine B has minimal impact. You still need to indicate that you are using AI, but the legislation is much more lenient in this case because there is no direct impact on society or people.
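For technically inclined readers, the distinction Frank draws can be sketched as a simple decision rule. This is only an illustrative sketch of the conversation's internal-process vs. product distinction, not the AI Act's actual risk classification; the function name and category labels are assumptions made for the example.

```python
def regulatory_burden(used_in_product: bool, affects_people: bool) -> str:
    """Illustrative sketch of the distinction described above.

    Not an official AI Act classification: the Act defines its own
    risk tiers. This only mirrors the rule of thumb from the
    conversation: AI in products or AI that directly affects people
    faces stricter obligations than AI confined to internal processes.
    """
    if used_in_product or affects_people:
        # AI applied to products, or making decisions about people
        # (employees, consumers), triggers the stricter obligations:
        # notification, explanation of effects, documentation.
        return "strict"
    # AI used purely inside internal processes (e.g. choosing between
    # Machine A and Machine B) still requires disclosure that AI is
    # used, but the regime is far more lenient.
    return "lenient"

# Example: an algorithm routing work between two machines
print(regulatory_burden(used_in_product=False, affects_people=False))
```

Running this with the machine-routing example prints `lenient`, matching Frank's point that such internal use carries minimal regulatory weight.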

However, once you apply AI to your products, the likelihood of a direct impact on people and businesses increases significantly.

Ethical Issues Surrounding AI and the AI Act

Alex: I’ve also read about ethical issues related to the new AI Act. Could you tell us more about that?

Frank: Yes, ethics is quite a complex concept. The AI Act and the Data Act aim to create a framework and regulate it, but within this regulation, you have freedoms. As a company, you need to fill in those freedoms yourself and apply your own values, norms, and ethics.

For example, if you have business principles, you need to assess whether they are still adequate in light of the new AI Act and the Data Act. Are they still appropriate, or do they need to be adjusted?

You have to consider questions such as:

  • How does AI affect consumers?
  • How does AI affect my employees?

Perhaps you need to adapt your business principles. For instance, we’ve added three business principles specifically focused on digital ethics.

However, as I mentioned, ethics is a challenging topic because my ethics might differ from yours. That doesn’t mean one is right and the other is wrong. At some point, as a company, you need to say: This is an ethical dilemma. What choices will we need to make? 

I often use the example of shell companies in Amsterdam: They’re legal, but are they ethical? You are allowed to establish shell companies, but does this align with your ethical framework and business principles as a company?

Alex: So yes, something can be generally accepted, but we can still have different opinions about it.

Frank: I believe a company should have a clear stance on such matters and be explicit about it. Legally, things are often straightforward: something is either allowed or not. But if something is legally permissible, you still need to decide whether you, as a company, want to engage in it.

Do you want to be known for this? Do you want to act in this way? Such dilemmas will become more common with the use of AI. For example, AI can exclude certain groups or favor others. In some cases, this might be ethically justifiable, but you need to have a conversation about it. You can’t just let it happen.

That’s the most important aspect of all these acts: they provide regulation, but they mainly emphasize that a company must be able to justify why it allows something or not, both internally and externally. This is especially true when you apply AI to the products you produce.

Documentation and Transparency in AI Usage

Alex: And by “justify,” do you mean that things need to be documented properly as well?

Frank: Yes, more and more, you need to be able to demonstrate that you’ve thought about these matters and that you’ve made a conscious decision to do something a certain way.

Alex: Could that also be one of the reasons for manufacturing companies’ attitude toward documentation, transparency, and compliance with the AI Act? We speak with professionals in the manufacturing industry regularly, and they’re not always positive about all this.

Benefits of the AI Act for Businesses

Frank: Yes, you see this a lot, right? I often compare it to Europe: Everyone complains about the more stringent regulations in Europe, but they also bring tremendous benefits. You need to highlight the benefits as well.

These acts are here to stay, and you can use them to your company’s advantage, no matter how challenging or time-consuming the process might be. I’m convinced that if you’re the one in a competitive market who handles this the best, it will work to your benefit in the long run.

In the eyes of consumers and other companies that purchase your products, the difference will be noticeable. Eventually, there will come a time when this will contribute to your success.

Alex: You mean that you distinguish yourself from the rest, showing that you’re socially responsible and complying with the act that’s been established?

Frank: Yes, or even beyond that: You don’t just show that you’re complying with the act because most people will say, “Well, of course; it’s the law.” You also demonstrate how you’re using it to your advantage. For example, in marketing, sales, and even in your products.

Consequences of Non-Compliance with the AI Act

Alex: As I mentioned earlier, we’ve spoken with multiple professionals in the manufacturing industry. Some have expressed concerns that this will result in additional costs or work and prevent them from focusing on their core business.

Do you agree?

Frank: There’s no need to be afraid of that—it will happen. That’s just the way it is. For example, NIS 2 is also on the horizon; it’s simply a fact.

Alex: So, it will indeed bring additional costs, or at least require some investment. But you mean that this is an investment that could pay off in the long run because it will help you stand out from the rest?

Frank: For me, there are two key points:

  1. The playing field is level—your competitors will need to do this as well.
  2. Ultimately, you can benefit from it if you do this effectively.

Alex: Can you talk about what happens if parties don’t comply with the AI Act? Are there serious consequences, and could these parties face significant problems?

Frank: Yes, if you engage in practices explicitly prohibited by the AI Act, the consequences can be severe. This can lead to hefty fines, often calculated as a percentage of your revenue. While it doesn’t always mean you’ll have to pay these fines immediately—there are often ways to get extensions or reductions—it does bring significant additional costs, such as legal fees and other expenses. These are costs that won’t go into your product and will place extra strain on your business.

There’s also the ethical aspect. If your company gains a reputation for being unethical—whatever that might mean in your industry—and your competitor is seen as ethical, it could have long-term repercussions for your market position and reputation.

Alex: How can companies in the manufacturing industry get help implementing the AI Act? You just mentioned NIS 2. We’ve noticed that with the introduction of NIS 2, a wave of NIS 2 experts has emerged to assist businesses.

Is something similar possible with the AI Act? Can companies, for example, hire an "AI expert" to answer questions and provide guidance?

The Need for Internal Knowledge of AI Legislation

Frank: Yes, there are certainly possibilities. With any legislation coming from Europe, the Netherlands, or even globally, consultancy firms naturally step in. They help you comply with the legislation to ensure you remain compliant. That’s already happening extensively.

When it comes to AI, this will increasingly become part of the core of your own business. Whether it’s about your production processes or your products, AI is becoming a fundamental part of who you are as a company.

In such cases, you must understand what’s happening and what the consequences are. For example, no one questions whether an HR department should be well-versed in labor law—it’s a no-brainer. The same will happen with AI and data. As it becomes deeply integrated into the core of your processes, you will need to ensure you have this knowledge internally.

If you manage to do this well, I’m convinced you’ll be able to stand out from the rest.

Predictive Analytics and Machine Learning in Production Processes

Alex: Okay, that all sounds very logical. So, we’ve put some statements about AI in a cup here. I’d like to ask you to take out the first one. Then, we’ll see whether we agree with it or not. Of course, a little explanation is appreciated—exciting, right? It’s a bit like drawing lots at Sinterklaas!

Frank: The Act’s rules make using predictive analytics and machine learning in production processes much more difficult.

No.

I want to focus on the word “much.” No, it doesn’t make it much more difficult. As I mentioned earlier, if you’re using AI within production processes—so within your company—the consequences of the AI Act are much smaller. This is especially true if the AI doesn’t make decisions that directly affect people, like your employees.

The rules only become more stringent if AI is used in a way that directly impacts people. For example, if AI makes decisions about your employees, the regulations are much tighter.

Similarly, if AI is applied to your products, it involves your customers. They use the product, so they must be informed about how AI is used. In such cases, the requirements are much stricter.

But the consequences are much smaller if you’re using AI in your production processes and it doesn’t directly impact people. So, the word “much” in the statement is important here: It’s not always much more difficult.

Alex: So essentially, the consequences or the impact, as you’re explaining it now, aren’t that significant.

Frank: Yes, it always depends on the context. But if you present the statement in a way that implies that these rules always make production processes harder, then I’d say no, that’s not true.

You do need to look into and investigate the rules, which will definitely require some effort. You need to clearly understand what AI does and also determine whether it directly affects people, yes or no. These are all things you need to address.

But to claim that it becomes much harder to use AI by default? No, that’s not correct.

Alex: So, does this mean that people who think this way might have a sort of fear culture around anything new? You’re saying that it does have an impact, just not in a way that directly affects your core business or product.

Frank: No, it doesn’t necessarily have to be that way. I don’t even know if it’s a fear culture. For example, we’ve often seen privacy legislation being used as a convenient argument to block other initiatives.

“You can’t do that because of privacy rules,” you’ll hear, even though the action doesn’t involve privacy-sensitive data at all. Why are we stopping these initiatives? Sometimes, it seems like we’re hiding behind the rules because it’s easy.

International Competition and Ethics

Alex: The AI Act will hinder rather than stimulate innovation in the production sector.

Well, this statement aligns somewhat with what we discussed earlier. Based on our experience and the conversations we’re currently having with parties in the manufacturing industry, I don’t think this will be a hindrance.

As you mentioned, this process often runs parallel to your core processes. It might mean you need to hire extra people or invest in it, but it certainly won’t negatively affect your core business.

Frank: Well, the risk could lie in competition with countries like China and the United States. That’s where I see a potential danger. If they’re allowed to do things that we don’t consider ethical here, yet we still allow their products into our market, that could have an impact.

However, I think it’s good that Europe is actively addressing this. It’s something we all have to deal with, and it’s important to uphold ethical standards. But this competitive risk must be addressed effectively to maintain a level playing field.

Alex: Yes, that would indeed create unfair competition.

Frank: It’s all about a level playing field, but you’re seeing more and more measures being implemented in Europe. For example, if we don’t know what’s happening in the supply chain regarding AI, then those products won’t be allowed in Europe.

Alex: Based on your experience, is there a chance that such an AI Act could also take off in China? You just mentioned the competitive playing field.

Frank: I don’t know much about China, but based on the news I follow and the discussions within, for example, the European Commission, I don’t expect China to go in the same direction as Europe.

What you do see—and this is very interesting—is how things work in the United States. There, individual states are often much stronger than the federal government. For example, California’s privacy legislation is even stricter than European regulations. Those laws are also being adopted by other states.

You see something similar in Canada. Depending on the political stance of a state or province, they sometimes adopt stricter legislation. They do this because certain rules, from a human and societal perspective, simply make sense.

However, things are different in China. The drivers there can be very different, which might lead to an approach unlike what we see in Europe or North America.

Alex: Yes, let’s keep that in mind. You already mentioned that you don’t know much about China. In this regard, competition with China will likely always be unfair, at least compared to what we’re discussing in the Netherlands right now.

Frank: Their motivations for success are entirely different.

Closing and Summary

Alex: It’s, let’s say, a matter of unfair competition.

Frank, I’d like to thank you for a great conversation. I’m glad you were able to provide us with additional insights into AI, especially about the possibilities. Of course, we also discussed some obstacles, but fortunately, neither of us sees them as too significant.

Thank you for your time and your valuable insights.

Frank: My pleasure.

Alex: We just had an in-depth discussion with Frank about the AI Act, international competition, and the importance of a level playing field.

If you’d like to learn more about data and ICT or discuss these topics further with us, we would love to invite you to sit down with us. We can even develop a proof of concept that is fully customized and tailored to your specific needs.

We look forward to hearing from you!