Canva's Magic Layers feature was caught automatically replacing the word "Palestine" with "Ukraine" in user designs on April 27. Multiple users replicated the behavior independently, prompting Canva to issue an apology and deploy a fix. PetaPixel and Gizmodo first reported the incident.
What Happened
Magic Layers, a recently launched AI feature that converts flat images into editable layered components, modified text content without user instruction. When users uploaded designs containing the word "Palestine," the feature replaced it with "Ukraine" during the layering process. An X user posted that an image reading "cats for Palestine" was changed to "cats for Ukraine," and other users confirmed they could reproduce the error consistently. Similar geopolitical terms, such as "Gaza," were not affected.
Why It Matters
The incident highlights how AI text processing can introduce unintended political bias into creative tools. Magic Layers is designed to separate visual components into editable layers, not to modify text content. The fact that a specific geopolitical term was consistently targeted and replaced with another raises questions about training data composition and content filtering in production AI tools.
For creators who rely on Canva for client work, advocacy campaigns, or editorial content, an AI feature that silently alters text breaks trust. The tool changed the meaning of user content without any warning or opt-in, which is fundamentally different from a rendering bug or layout glitch.
Key Details
- Feature: Magic Layers (AI-powered image-to-layers conversion)
- Behavior: Automatically replaced "Palestine" with "Ukraine" in text content
- Reproducible: Multiple independent users confirmed the same behavior
- Scope: "Palestine" consistently targeted; "Gaza" and other terms unaffected
- Canva response: Spokesperson Louisa Green apologized, said the issue was "isolated" and fixed
- Root cause: Not disclosed; Canva launched an internal audit but shared no technical details
What to Do Next
Creators using AI-powered design tools should review output text carefully, especially for content involving geopolitical terms, brand names, or sensitive language. Canva says the fix is deployed, but the company has not explained what caused the substitution or what safeguards are now in place to prevent similar issues.
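One practical safeguard is an automated check that key terms survive AI processing. A minimal sketch, assuming you can get the output text out of the processed design (for example via OCR or a text export; `extract` here is a hypothetical stand-in, and the term list is illustrative):

```python
# Minimal sketch: verify that required terms survive AI processing.
# Assumes the output text has already been extracted from the design
# (e.g. via OCR or the tool's own text export).

REQUIRED_TERMS = ["Palestine"]  # illustrative; use your own key terms

def missing_terms(output_text: str, required: list[str]) -> list[str]:
    """Return the required terms that are absent from the output text."""
    return [term for term in required if term not in output_text]

# The reported substitution would be flagged like this:
processed = "cats for Ukraine"  # text after AI processing
print(missing_terms(processed, REQUIRED_TERMS))  # → ['Palestine']
```

A check like this catches silent substitutions before a design ships, but it only covers terms you anticipate; a visual review of the final output is still worthwhile.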