Artificial intelligence (AI) is visibly transforming corporate operations. Business adoption of the technology is spreading rapidly, and AI is playing an ever greater role in optimising financial processes and making them more efficient.
Lawmakers, however, are not standing still, and the regulatory environment is developing substantially. The European Union's new regulation, the EU AI Act, may pose adaptation challenges for Hungarian companies that could dampen their appetite for innovation.
The regulation, whose key provisions apply from August 2025, governs prohibited practices, the development and use of high-risk AI systems and their compliance, and, more broadly, the ethical boundaries of artificial intelligence. It classifies AI systems into distinct risk categories, ranging from low-risk systems through high-risk solutions to those posing unacceptable risk. It also defines prohibited practices that may threaten privacy, and restricts the use of AI systems that could significantly endanger the safety and fundamental rights of individuals or organisations.
The EU AI Act also imposes transparency obligations on both developers and users of AI systems, along with additional documentation and information duties for those concerned.
While EU-level regulation is being integrated into domestic law gradually but steadily, it is already a tangible concern for Hungarian companies. The recently published Act LXXV of 2025 governs the Hungarian implementation of Regulation (EU) 2024/1689 (the EU AI Act) and establishes the framework for two new national authorities, which will begin operating on the 31st day following the act's promulgation, expected in December 2025.
National Accreditation Authority
As the AI notifying authority, it will be responsible for designating and supervising the organisations that carry out conformity assessments of high-risk AI systems. Only organisations with accredited status may be designated, and the designation may be withdrawn if the accreditation ceases.
AI Market Surveillance Authority
Its tasks include, among others, ex-post monitoring of the lawful use of AI systems and conducting market surveillance procedures. It will also act as Hungary's single national contact point towards the EU.
In addition, it is empowered to impose administrative fines, which may be substantial for companies. The maximum fine amounts must be determined in line with Regulation (EU) 2024/1689, which sets a very high range for national authorities, from approximately HUF 285 million to HUF 13.3 billion.
Hungarian Artificial Intelligence Council
Alongside the two new authorities, the act also establishes a third body with a strategic and coordinating role. While the National Accreditation Authority will assess technical compliance and the AI Market Surveillance Authority will carry out legality checks, the Hungarian Artificial Intelligence Council will neither make decisions nor conduct procedures, and will therefore not directly affect corporate operations.
The Council’s purpose is to support AI strategy and policy, and to coordinate the domestic application of AI. It will serve in an advisory capacity to the government, make recommendations, provide opinions, and coordinate the activities of the organisations involved.
Its members include representatives of state institutions, professional organisations, academia, and economic actors — for example, the Hungarian National Bank, the Hungarian Competition Authority, the Hungarian Academy of Sciences, and the Hungarian Chamber of Commerce and Industry. The body will meet quarterly, and its chair will be appointed by the Prime Minister.
Adopting AI is not only a technological decision
The rise of artificial intelligence requires not only technological adaptation but also legal compliance from companies. As organisations strive to keep pace with innovation, it is essential that they prepare in time for the new AI regulations, particularly the requirements of the EU AI Act. For safe and lawful application, companies must understand the risk classification of AI systems, the regulatory expectations arising from their use, and the potential risks of non-compliance.