Executive Summary. Between November 2025 and March 2026, the Turkish Personal Data Protection Board (KVKK) issued three guidelines on generative artificial intelligence: the "Guide on Generative AI and Personal Data Protection" (24 November 2025), the "Use of Generative AI Tools in the Workplace" (5 March 2026), and the 46-page "Agentic AI Guide" (12 March 2026). Read together, they establish a clear position: the practice of employees pasting client or customer data into ChatGPT, Copilot, Gemini and similar tools outside corporate policy, approval or oversight, which the Board calls "Shadow AI", is now expressly categorised as a risk by the regulator. For multinationals with Turkish operations, the implications go beyond translating existing GDPR policies. This note sets out the common framework of the three guidelines, the substantive obligations they create, and a 90-day roadmap.

1. Shadow AI: Definition and the Regulator's Position

In its 5 March 2026 guideline, the Board defines Shadow AI as the use of generative AI tools by employees outside corporate policy, approval or oversight frameworks — through personal accounts or unmanaged integrations. The defining feature of the term is invisibility. The company is unaware of the use; because it is unaware, it cannot manage it; because it cannot manage it, it cannot mitigate liability.

A practical scenario: an employee pastes a client contract into the free version of ChatGPT to summarise it. Depending on the terms of service, that document may then be incorporated into model training data, retained in system logs, or transferred to third-party processors. The company's technical control over the process is zero.

2. The Common Framework of the Three Guidelines

Guideline 1 — 24 November 2025: "Generative AI and Personal Data Protection (15 Questions)"

This guideline focuses on the system lifecycle: training data, data processed during use, retention of outputs, and reuse. Its core message to data controllers: Law No. 6698 applies regardless of technology; the fact that the model is "AI" does not relax KVKK's processing conditions.

Guideline 2 — 5 March 2026: "Use of Generative AI Tools in the Workplace"

The most operationally significant of the three. It addresses the risks of corporate use of publicly available, third-party generative AI tools (ChatGPT, Gemini, Copilot, Claude and similar), and it is here that the Board defines and expressly flags Shadow AI.

Guideline 3 — 12 March 2026: "Agentic AI Guide" — 46 pages

A comprehensive Turkish-language regulatory framework dedicated specifically to agentic AI, meaning multi-step, autonomous AI agents that integrate across systems, and to the principal risks such autonomy creates.

3. The Four Elements the Board Expects

Read together, the three guidelines surface four core elements the regulator will look for during inspection.

First: A written corporate policy

A written, approved policy on the use of generative AI tools, communicated to employees. The policy must distinguish three categories clearly: prohibited use (e.g. client data, special category data, financial secrets), restricted use (via corporate accounts, with anonymised data, for defined tasks), and permitted use (internal processes that contain no personal data). A one-page note is not enough; each category requires concrete examples and exceptions.
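The three-tier structure can be expressed as a simple decision rule. The sketch below is a hypothetical illustration only, not taken from the guideline: the data-category labels, the `classify_use` helper and the restricted-use conditions are invented for the example, following the policy tiers described above.

```python
# Hypothetical sketch of the three-tier AI-use policy described above.
# Data-category labels and condition names are illustrative, not from the guideline.

PROHIBITED_DATA = {"client_data", "special_category_data", "financial_secrets"}
RESTRICTED_CONDITIONS = {"corporate_account", "anonymised"}  # all must hold

def classify_use(data_categories: set[str], context: set[str]) -> str:
    """Return the policy tier for a proposed generative-AI use case."""
    if data_categories & PROHIBITED_DATA:
        return "prohibited"
    if "personal_data" in data_categories:
        # Personal data is tolerated only under every restricted-use condition.
        if RESTRICTED_CONDITIONS <= context:
            return "restricted"
        return "prohibited"
    # Internal processes that contain no personal data.
    return "permitted"
```

A policy document would express the same rule in prose with concrete examples per category; the point of the sketch is that each tier must be decidable, not a matter of employee judgment in the moment.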

Second: Access control

Access control means documenting which tool may be used, in which department, through which account type, and with which data category. The difference between corporate (Enterprise/Team) and personal accounts is contractual: corporate plans typically do not use submitted data for model training, while personal plans vary by provider and by user setting. How that distinction is communicated to employees, which tools are licensed corporately, and which are blocked categorically must all be documented.

Third: Personnel training

The purpose of training is not to blame employees, but to make use visible. The guideline emphasises that employees generally turn to Shadow AI not in bad faith but to accelerate their work. Training that dismisses this motivation will fail; training that shows how to achieve the same efficiency within the corporate framework will succeed.

Fourth: Audit and documentation

The gap between policy and reality becomes visible only through audit: periodic internal review, log analysis, employee surveys and incident analysis. What matters is not only that an audit was performed, but that its findings were documented and reported to the board.

4. Legal Consequences and Penalty Exposure

The three guidelines are not strictly binding; however, because they shape the Board's expectations and assessment criteria, they function as a practically binding reference. In the investigation of a Shadow AI incident, they provide the framework the Board will apply. The likely violation headings arise under KVKK Article 18, with the monetary limits set by the tariff published in the Official Gazette of 27 November 2025.

Applied cumulatively, the upper limits under Art. 12 and Art. 10 alone produce a theoretical ceiling in excess of TRY 18.8 million; with Art. 9 and Art. 5 assessments added, the figure rises further. The calculation rests on the Board's authority to impose separate penalties for multiple distinct violations arising from a single incident.
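Mechanically, the exposure estimate compounds two steps: each article's upper limit is revalued annually (25.49% for 2026, per the Official Gazette tariff cited in the Sources), and the revalued limits are then summed because the Board may penalise each violation separately. The base amounts below are hypothetical placeholders, not the actual statutory limits:

```python
# Illustrative only: base amounts are hypothetical placeholders,
# NOT the statutory upper limits under KVKK Art. 18.
REVALUATION_RATE_2026 = 0.2549  # 2026 revaluation rate per the 27.11.2025 tariff

hypothetical_2025_caps_try = {
    "Art. 12 (data security)": 10_000_000,   # placeholder TRY
    "Art. 10 (information duty)": 5_000_000,  # placeholder TRY
}

def revalue(cap_try: float, rate: float = REVALUATION_RATE_2026) -> float:
    """Apply the annual revaluation rate to a prior-year upper limit."""
    return cap_try * (1 + rate)

caps_2026 = {article: revalue(cap) for article, cap in hypothetical_2025_caps_try.items()}

# Separate violations arising from a single incident accumulate:
cumulative_ceiling = sum(caps_2026.values())
```

The design point is the accumulation step: a single Shadow AI incident is not assessed against one cap but against the sum of every violation heading the Board finds engaged.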

5. How KVKK Differs from GDPR — Where Multinationals Get Caught

For groups already prepared under GDPR, the gap to KVKK readiness is narrower than starting from zero, but it is not zero. Three patterns recur:

First, the translated global notice. A GDPR-style privacy notice translated into Turkish and posted to the local site. The Turkish regulator looks for a separate "aydınlatma metni" (information notice) with specific structural elements; the translated notice typically lacks them. Under the principle established by Board Decision 2026/347 (18 February 2026), the information notice and the explicit consent text must be presented as separate documents.

Second, the global breach response playbook. GDPR allows 72 hours "where feasible." The Turkish Board reads its 72-hour rule strictly, and counts from the moment the data controller — not the local subsidiary alone — could reasonably have known. The intra-group escalation chain matters.
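The escalation-chain point can be made concrete with a deadline calculation. The timeline below is hypothetical: it assumes the local team detects an incident two days before it is formally escalated to the group-level data controller, and starts the clock at the earlier moment, as the strict reading requires:

```python
from datetime import datetime, timedelta

BREACH_NOTIFICATION_WINDOW = timedelta(hours=72)

def notification_deadline(controller_awareness: datetime) -> datetime:
    """72-hour deadline, counted from when the controller could reasonably have known."""
    return controller_awareness + BREACH_NOTIFICATION_WINDOW

# Hypothetical timeline for illustration.
local_detection = datetime(2026, 4, 1, 9, 0)   # Turkish subsidiary detects the incident
group_escalation = datetime(2026, 4, 3, 9, 0)  # incident reaches the group controller

# The strict reading starts the clock at local detection, not at escalation:
deadline = notification_deadline(local_detection)
hours_already_spent = (group_escalation - local_detection) / timedelta(hours=1)
# A playbook that waits for formal escalation has already spent 48 of the 72 hours.
```

This is why the intra-group escalation chain, not the notification template, is usually the weak point in a global playbook applied to Türkiye.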

Third, intra-group data transfer. A Turkish subsidiary transferring HR or customer data to a global parent is conducting a cross-border transfer under KVKK. EU SCCs do not automatically satisfy Turkish requirements; KVKK has its own SCC framework and separate notification rules.

6. 90-Day Roadmap

Days 0–30 — Visibility

Map actual use: which generative AI tools are in circulation, through which account types, in which departments, and with which data categories. Log analysis and employee surveys are the fastest routes to an honest baseline.

Days 30–60 — Framework

Draft and approve the written policy with its three categories (prohibited, restricted, permitted), procure corporate licences for the tools that will be allowed, and block categorically those that will not.

Days 60–90 — Embedding

Deliver training built around the policy, run the first internal audit against it, and report the findings to the board so that the cycle becomes periodic.

7. A Note for Industries with Confidentiality Obligations

Beyond general corporate compliance, certain industries operate under additional confidentiality regimes that interact with KVKK and amplify the Shadow AI risk: legal services (attorney-client privilege under the Attorneys' Act and Bar Association Professional Rules), healthcare (medical confidentiality), banking (banking secrecy under Law No. 5411), and statutory financial reporting (auditor confidentiality). For these sectors, an employee pasting confidential material into a free generative AI tool may trigger not only KVKK exposure but a parallel professional or sectoral liability. AI policies in these industries must be drafted with both lenses.

Conclusion

Generative AI tools are not going to leave business processes. The Board recognises this: the framing of the guidelines is enabling rather than prohibitive. But if the company does not set the framework, the regulator will, and its version will arrive as an enforcement outcome. The right question for companies is no longer "should we use AI?" but "is our AI use defensible under inspection?"

When an incident occurs, a company without a policy cannot credibly say "we were not aware". A company with a policy, provided the policy is written, training has been delivered and audits can be evidenced, has a basis in the same incident to argue mitigation.

Sources

KVKK, "Guide on Generative AI and Personal Data Protection (15 Questions)", 24.11.2025 (kvkk.gov.tr)

KVKK, "Use of Generative AI Tools in the Workplace" Guideline, 05.03.2026 (kvkk.gov.tr)

KVKK, "Agentic AI Guide", 12.03.2026 (kvkk.gov.tr)

Law No. 6698 on the Protection of Personal Data, Arts. 5, 9, 10, 12, 18

KVKK Board Principle Decision 2026/347 dated 18.02.2026 (separation of information notice and explicit consent text)

Official Gazette, 27.11.2025 — 2026 revaluation rate tariff (25.49%)

This note is provided for general informational purposes and does not constitute legal advice.