Privacy · AI · 2026 Update

Privacy Protection Amendment 13: What AI and Technology Companies in Israel Must Do

This article provides general information only and does not constitute legal advice. Each situation is unique — consult with a qualified attorney for guidance specific to your circumstances.

In 2024 Israel passed a significant upgrade to its Privacy Protection Law: Amendment 13. After a transition period, the new obligations now apply to a wide range of businesses, and particularly to companies that develop, operate, or use artificial intelligence tools. Appointing a Data Protection Officer (DPO), conducting Data Protection Impact Assessments (DPIAs), enhanced transparency obligations, and significantly increased penalties have all become regulatory reality. This guide explains what changed, what you must do, and how to prepare.

1. What Is Amendment 13 — Background

The Privacy Protection Law, 1981, is Israel's primary legislation governing personal data. For decades it was considered outdated relative to digital-age challenges. Amendment 13, passed by the Knesset in 2024 and phased in progressively, has brought Israeli law closer to GDPR standards.

**Key changes:**

- Mandatory appointment of a Data Protection Officer (DPO) for certain entities
- Obligation to conduct Data Protection Impact Assessments (DPIAs) for high-risk processing
- Enhanced transparency obligations: detailed privacy policies and informed consent
- Expanded enforcement powers for the Israeli Privacy Protection Authority (ILPPA), including fines of up to millions of shekels
- Recognition of Privacy Enhancing Technologies (PETs) as the preferred approach to data processing

2. Who Must Appoint a DPO?

The DPO obligation applies to:

- Public bodies
- Organisations for which data processing is a core activity
- Organisations that process sensitive data at significant scale (health, biometric, location, or children's data)
- Organisations engaged in systematic large-scale monitoring of individuals

**For AI companies:** If your product analyses user data, generates personalised recommendations, or processes user-generated images, code, or text, you likely fall within scope. Specific legal advice is essential before drawing a firm conclusion.

**What a DPO does:** Acts as the point of contact with the authority, advises the organisation on compliance, supervises DPIA implementation, and serves as data protection trustee. The DPO is not directly liable; the organisation is.

3. DPIA — When and How

A DPIA is a structured process to identify and manage privacy risks before launching a new product or process that involves high-risk data processing.

**When a DPIA is mandatory:**

- Deployment of new technology processing personal data
- Large-scale processing of sensitive data
- Systematic monitoring of public spaces
- Profiling of individuals (recommendations, credit, employment)
- Processing of children's data

**For AI:** Almost every consumer-facing AI product requires a DPIA. If your product generates personal profiles, assesses predispositions, recommends content, assigns credit scores, or makes accept/reject decisions, a DPIA is mandatory.

**What a DPIA includes:** A description of the processing and its purposes; a necessity and proportionality assessment; identification of privacy risks; planned mitigation measures; and an outcome: either an acceptable-risk process or a process requiring authority approval.
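The mandatory-DPIA triggers above can be captured as a first-pass screening step in an internal compliance workflow. The sketch below is illustrative only: the field names and the any-trigger rule are assumptions for demonstration, not criteria defined by the law, and a positive result means "run a DPIA", not that the assessment itself is done.

```python
from dataclasses import dataclass

# Illustrative DPIA screening based on the triggers described above.
# Field names and logic are assumptions for demonstration, not legal criteria.

@dataclass
class ProcessingActivity:
    uses_new_technology: bool = False
    large_scale_sensitive_data: bool = False
    monitors_public_spaces: bool = False
    profiles_individuals: bool = False      # recommendations, credit, employment
    processes_childrens_data: bool = False

def dpia_required(activity: ProcessingActivity) -> bool:
    """Return True if any high-risk trigger applies, so a DPIA should be run."""
    return any([
        activity.uses_new_technology,
        activity.large_scale_sensitive_data,
        activity.monitors_public_spaces,
        activity.profiles_individuals,
        activity.processes_childrens_data,
    ])

# Example: a consumer recommendation engine that profiles users
recommender = ProcessingActivity(profiles_individuals=True)
print(dpia_required(recommender))  # → True
```

A checklist like this is only a gate; the DPIA itself still has to document purposes, necessity, proportionality, risks, and mitigations as described above.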

4. AI, Transparency, and Liability for Errors

Amendment 13 adds a new dimension to the AI discussion: a transparency obligation.

**Hallucinations and incorrect outputs:** What happens when an AI model generates false information about a person? For example: an AI credit-recommendation system based on erroneous data, or an HR system that rejects candidates based on faulty analysis. Amendment 13 requires: (a) disclosure that a decision was made by an automated system; (b) a right to review and correction; and (c) an appeals mechanism.

**Deepfakes:** Creating harmful AI-generated content (a deepfake video, a synthetic voice) increases exposure to combined privacy and defamation claims. Amendment 13 does not address deepfakes directly, but the transparency obligation continues to apply.

**AI vendor agreements:** If you use an API from OpenAI, Google, or another provider, you are a "data processor" rather than a "database owner". Obligations still apply: sign a Data Processing Agreement (DPA) with the vendor, verify that the vendor meets GDPR/Amendment 13 standards, and document the processing chain.

5. Fines and Enforcement: What It Costs

**Fines under Amendment 13:** The Israeli Privacy Protection Authority (ILPPA) now has authority to impose administrative fines:

- Minor violations: up to NIS 100,000
- Serious violations: up to NIS 1,000,000
- Aggravating circumstances (intent, negligence, large scale): up to millions of shekels

**Individuals can sue directly:** In addition to authority fines, individuals whose data was breached can bring civil claims, including damages without proving actual loss.

**What increases risk:**

- Failure to appoint a DPO when required
- Failure to conduct a DPIA for high-risk processes
- Failure to notify the authority of a data breach within 72 hours
- Retaining data beyond the required retention period

6. Privacy Enhancing Technologies (PETs) — The Preferred Approach

Amendment 13 encourages, and in some cases requires, the use of technologies that reduce privacy intrusion.

**PETs relevant to AI:**

- **Differential Privacy:** Adding statistical noise to training data or released statistics so that specific individuals cannot be identified
- **Federated Learning:** Decentralised model training in which data stays on the device and only model parameters are shared
- **Anonymisation & Pseudonymisation:** Removing or replacing personal identifiers before training
- **Synthetic Data:** Training on AI-generated synthetic data instead of real data

**Practical recommendation:** Build AI training pipelines with recognised PETs from the outset. This not only reduces legal risk but is also a sound investment: large buyers and enterprise clients increasingly require it.
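Two of the PETs above can be sketched in a few lines. This is a minimal illustration under stated assumptions: a keyed hash stands in for pseudonymisation, and the Laplace mechanism for a differentially private count. A production system should rely on a vetted library (for example, OpenDP) and a formal privacy analysis rather than hand-rolled noise.

```python
import hashlib
import hmac
import math
import random

def pseudonymise(user_id: str, secret_key: bytes) -> str:
    """Pseudonymisation sketch: replace a direct identifier with a keyed hash.
    Store the key separately from the data so the mapping is not trivially reversible."""
    return hmac.new(secret_key, user_id.encode(), hashlib.sha256).hexdigest()

def laplace_noise(scale: float) -> float:
    """Sample Laplace(0, scale) noise via inverse transform sampling (sketch)."""
    u = random.random() - 0.5            # uniform in [-0.5, 0.5)
    sign = 1.0 if u >= 0 else -1.0
    return -scale * sign * math.log(1.0 - 2.0 * abs(u))

def dp_count(true_count: int, epsilon: float, sensitivity: float = 1.0) -> float:
    """Differential privacy via the Laplace mechanism: release a noisy count
    with noise scale = sensitivity / epsilon. Smaller epsilon = stronger privacy."""
    return true_count + laplace_noise(sensitivity / epsilon)

# Usage: the same user always maps to the same pseudonym under the same key,
# and an aggregate statistic is released with calibrated noise, not exactly.
token = pseudonymise("user-12345", secret_key=b"store-me-in-a-vault")
noisy = dp_count(true_count=1_000, epsilon=0.5)
```

Note that pseudonymised data generally remains personal data under both the GDPR and Israeli law; it reduces risk but does not by itself take processing out of scope.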

Checklist — 10 Steps to Prepare

  • Determine whether your organisation is required to appoint a DPO — obtain legal advice
  • Map all personal data processing activities in your organisation
  • Identify high-risk processes requiring DPIA and conduct the assessments
  • Update your privacy policy to use clear language that meets Amendment 13's requirements
  • Sign Data Processing Agreements (DPA) with all AI vendors
  • Build a data breach notification process to alert the authority within 72 hours
  • Apply data minimisation — collect only what is necessary
  • Implement mechanisms for the right of access, correction, and erasure
  • Explore PET implementation in your training pipelines
  • Train technical and management staff on data protection obligations

Common Pitfalls

  • Assuming appointing a DPO is sufficient — a DPO is a tool, not a shield. Liability remains with the organisation
  • Not conducting a DPIA before launching a new AI product — 'we'll do it after' is not enough
  • Using an AI provider API without a DPA — may make you liable for the provider's processing
  • A privacy policy that says 'we may share with third parties' without specifics — does not meet Amendment 13
  • Retaining data indefinitely — obligation to delete/anonymise after the retention period

FAQ

Must every AI company appoint a DPO?

Not all of them, but companies that process personal data at significant scale, or that process sensitive data, likely must. The analysis depends on the specifics of your actual activity.

What is the difference between an Israeli DPO and a European DPO?

The requirements are very similar. An Israeli DPO under Amendment 13 needs essentially the same skills and performs the same functions as a European DPO under the GDPR. If your company has already appointed a DPO for GDPR purposes, that person may also meet the Israeli requirements, subject to review.

Does an AI tool that uses customer data require a DPIA?

Generally — yes. Using AI for profiling, recommendations, behavioural analysis, or large-scale personal data processing — almost always requires a DPIA.

What happens if a data breach is discovered?

Under Amendment 13, there is an obligation to notify the Privacy Protection Authority within 72 hours of discovery. A material breach may also require notifying the affected individuals.

Is Synthetic Data sufficient to be considered anonymous?

It depends on the quality of generation and the degree to which original data 'flows' into the synthetic data. Research shows that real data can sometimes be recovered from synthetic datasets. Technical-legal advice is recommended before relying on synthetic data as a compliance solution.

Need Professional Compliance Support?

We will review your organisation's compliance status together and build a preparation plan.
