The revised Swiss Federal Act on Data Protection (FADP) has been in force since 1 September 2023. Over two years later, practice shows that many Swiss companies have implemented the basics but struggle with the details. Processing registers are incomplete, data protection impact assessments are missing, data processing agreements are outdated, and the integration of AI systems creates new compliance gaps that were not foreseeable in 2023.
This checklist is not a substitute for legal advice. It is a practical tool to review the current state of your FADP compliance and identify the most important areas for action.
Fundamentals: What Must Be in Place Since September 2023
The following points have been mandatory since the revised FADP entered into force. If you have gaps here, immediate action is required.
Processing register (Art. 12 FADP). In principle, every company must maintain a register of its data processing activities; companies with fewer than 250 employees are exempt only if their processing poses a low risk to the data subjects, which in particular excludes large-scale processing of particularly sensitive personal data and high-risk profiling. The register must document the purpose of processing, the categories of data subjects, the categories of personal data processed, the recipients, the retention period, and the technical and organisational measures in place. In practice, a processing register is recommended for all companies regardless of size.
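The required register content can be captured in a simple structure. The following is a minimal sketch, not a prescribed format: Art. 12 FADP specifies what must be documented, not how, and all field names and the sample entry below are illustrative.

```python
from dataclasses import dataclass, field

@dataclass
class ProcessingActivity:
    """One entry in a processing register, mirroring the content
    required by Art. 12 FADP. Field names are illustrative."""
    name: str                            # short label for the activity
    purpose: str                         # purpose of processing
    data_subject_categories: list[str]   # e.g. customers, employees
    data_categories: list[str]           # categories of personal data
    recipients: list[str]                # internal and external recipients
    retention_period: str                # retention period or deletion rule
    security_measures: list[str]         # technical and organisational measures
    cross_border: list[str] = field(default_factory=list)  # recipient countries, if any

# Illustrative sample entry
register: list[ProcessingActivity] = [
    ProcessingActivity(
        name="Newsletter dispatch",
        purpose="Marketing communication with existing customers",
        data_subject_categories=["customers"],
        data_categories=["name", "email address"],
        recipients=["marketing team", "email service provider"],
        retention_period="until unsubscription",
        security_measures=["TLS in transit", "role-based access control"],
        cross_border=["EU"],
    )
]
```

Keeping the register as structured data rather than free text makes it easier to spot gaps, for example entries without a retention period or without documented security measures.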
Privacy notice (Art. 19 FADP). Your privacy notice must transparently set out the identity of the controller, the processing purpose, the recipients, disclosures abroad, and the rights of data subjects. Check whether your privacy notice reflects the revised FADP or is still based on the old law.
Duty to inform upon collection (Art. 19 FADP). Data subjects must be informed whenever personal data is collected. This applies not only to web forms but also to telephone data collection, business card exchanges, recruitment processes, and every other form of data collection.
Data processing agreements (Art. 9 FADP). If you have personal data processed by third parties, you need a written contract governing the processing. This applies to cloud providers, IT service providers, payroll offices, marketing agencies, and every other data processor. The contracts must ensure that the processor only processes data according to your instructions and guarantees adequate data security.
Data protection impact assessment (Art. 22 FADP). If data processing poses a high risk to the personality or fundamental rights of the data subjects, you must conduct a data protection impact assessment (DPIA) before processing. This applies in particular to high-risk profiling, large-scale processing of particularly sensitive data, and systematic monitoring of public areas.
Notification obligation for data breaches (Art. 24 FADP). Data security breaches that are likely to pose a high risk to data subjects must be reported to the Federal Data Protection and Information Commissioner (FDPIC) as quickly as possible. Do you have a process for detecting, assessing, and reporting data breaches?
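The core of such a process is a documented triage step: record the breach, assess the risk, decide whether the FDPIC must be notified. The sketch below illustrates this flow; the risk indicators are simplified examples only, not the legal test, which under Art. 24 FADP is whether the breach is likely to pose a high risk to the data subjects.

```python
from dataclasses import dataclass
from datetime import datetime

@dataclass
class DataBreach:
    """Internal record of a detected data security breach."""
    detected_at: datetime
    description: str
    data_categories: list[str]   # categories of personal data affected
    subjects_affected: int       # estimated number of data subjects
    data_encrypted: bool         # was the affected data encrypted?

# Illustrative risk indicators only; the actual assessment under
# Art. 24 FADP must consider all circumstances of the breach.
SENSITIVE_CATEGORIES = {
    "health data", "biometric data", "criminal records",
    "religious or political views",
}

def likely_requires_fdpic_notification(breach: DataBreach) -> bool:
    """Rough triage: does this breach likely pose a high risk
    and therefore need to be reported to the FDPIC?"""
    touches_sensitive = bool(SENSITIVE_CATEGORIES & set(breach.data_categories))
    large_scale = breach.subjects_affected > 100
    # Strong encryption of the affected data typically lowers the risk.
    return (touches_sensitive or large_scale) and not breach.data_encrypted
```

Even when the triage concludes that no notification is required, the breach and the assessment should be documented internally so the decision can be justified later.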
AI and Data Protection: The New Challenges of 2026
Since 2023, the AI landscape has changed fundamentally. Most companies now use AI tools, often without having fully thought through the data protection implications. This is where the biggest compliance risks lie in 2026.
AI-assisted data processing. If you deploy AI systems that process personal data, the FADP’s principles apply without restriction. Purpose limitation, proportionality, data minimisation, accuracy, and transparency must be observed in AI-assisted processing as well. The fact that an algorithm processes the data rather than a human does not change the obligations.
Automated individual decisions (Art. 21 FADP). The FADP gives data subjects the right to be informed about exclusively automated individual decisions and to present their point of view. If your AI system makes decisions with significant impact on individuals (credit decisions, application pre-screening, insurance premium calculations), you must guarantee this right.
Disclosure abroad (Art. 16-17 FADP). Using cloud-based AI tools often means disclosing data abroad. If the AI provider’s server location is outside Switzerland or if data is transmitted to a third country for processing, you must verify whether the recipient country provides adequate data protection. For the United States, the Swiss-US Data Privacy Framework provides a basis, but the conditions are complex and not suitable for all types of processing.
Profiling (Art. 5 lit. f and g FADP). The revised FADP distinguishes between standard profiling and high-risk profiling. AI systems that create personal profiles, analyse behavioural patterns, or make predictions about individuals fall under the profiling provisions. For high-risk profiling, stricter requirements apply; in particular, where a private controller relies on consent, that consent must be given explicitly.
The Checklist: 15 Points for Your 2026 FADP Compliance
Go through the following points and assess for each whether it is fully implemented, partially implemented, or not implemented in your organisation.
1. Processing register. Is your register complete and up to date? Are AI systems and their data processing activities recorded?
2. Privacy notice. Does your privacy notice comply with the revised FADP? Are all processing purposes transparently set out, including AI-assisted processing?
3. Duty to inform. Are data subjects correctly informed at every point of data collection? Including telephone contacts, recruitment processes, and events?
4. Data processing agreements. Do you have current contracts with all data processors (cloud providers, IT service providers, AI providers) that meet FADP requirements?
5. Data protection impact assessments. Have you conducted a DPIA for all high-risk processing activities? Especially for AI systems that process personal data or engage in profiling?
6. Breach notification process. Is there a documented process for detecting, assessing, and reporting breaches to the FDPIC?
7. Data subject rights. Can you respond to access requests (Art. 25 FADP) within 30 days? Is there a process for correction and deletion requests?
8. Data disclosure abroad. Have you ensured the required safeguards for all data transfers abroad? Are the recipient countries and legal bases documented?
9. Technical and organisational measures. Are your security measures appropriate to the state of the art? Encryption, access controls, logging, backup concept?
10. Automated individual decisions. If you use AI for decisions about individuals: are data subjects informed? Can they present their point of view?
11. Privacy by design and by default. Are data protection requirements considered from the outset in new projects and systems? Are default settings privacy-friendly?
12. Training. Are your employees trained in handling personal data? Do they know how to detect and report data breaches?
13. Data deletion and retention. Do you have a deletion policy? Is data actually deleted after the retention period expires?
14. AI inventory. Do you have an overview of all AI systems in your organisation that process personal data? Are the data flows documented?
15. Responsibilities. Is it clearly defined who is responsible for data protection in your organisation? Is there a data protection advisor (voluntary but recommended)?
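The three-level assessment above can be turned into a simple gap analysis. This is a hypothetical helper, not an official scoring scheme; the point weights and the sample answers are illustrative.

```python
# Score each checklist item as "full", "partial", or "none"
# and summarise the overall result.
STATUS_POINTS = {"full": 2, "partial": 1, "none": 0}

def assess(answers: dict[str, str]) -> dict:
    """answers maps a checklist item to its implementation status."""
    score = sum(STATUS_POINTS[status] for status in answers.values())
    max_score = 2 * len(answers)
    gaps = [item for item, status in answers.items() if status != "full"]
    return {"score": score, "max": max_score, "gaps": gaps}

# Illustrative partial assessment covering three of the 15 points
result = assess({
    "processing register": "partial",
    "privacy notice": "full",
    "DPIA for AI systems": "none",
})
# result["score"] == 3 out of result["max"] == 6;
# result["gaps"] lists the register and the missing DPIAs
```

The list of gaps, sorted by risk, is a natural starting point for an action plan.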
Common Weaknesses in Practice
Based on experience with Swiss companies of various sizes and industries, recurring problems emerge.
Incomplete processing registers. Most registers were created in 2023 and have not been updated since. New AI tools, cloud services, and data flows are missing.
Missing DPIAs for AI systems. Many companies have introduced AI tools without conducting a data protection impact assessment. This is particularly critical for systems that use personal data for profiling, scoring, or automated decisions.
Outdated data processing agreements. Contracts with cloud providers and IT service providers were often concluded before the introduction of AI features. The new data flows are not covered.
Unclear data flows with AI. Many companies do not know exactly where data goes when employees use AI tools. Is input data used to train the model? Where is the data stored? Who has access?
Lack of transparency towards data subjects. Customers and employees are rarely informed that their data is being processed by AI systems.
Making Data Protection and AI Compatible
FADP compliance and the use of AI are not mutually exclusive. But they require the right infrastructure. When data is processed and stored in Switzerland, when data flows are transparent and documented, when every output is traceable, and when data subjects can exercise their rights, AI can be used in a data-protection-compliant manner.
Enclava was developed as a Swiss AI platform for regulated industries, with a full focus on data sovereignty, transparency, and traceability. All data is processed and stored in Switzerland. Every output is source-attributed and traceable. The architecture is designed for compliance from the ground up.
If you want to bring your FADP compliance up to date while also using AI, find more information at enclava.ch or write to us at [email protected].