On February 13, 2026, Zhipu AI released GLM-5, a 744-billion-parameter open-source model trained entirely on Huawei Ascend chips, without a single NVIDIA GPU. The model ships under the MIT license, scores 50.4% on Humanity's Last Exam (beating Claude Opus 4.5), achieves 77.8% on SWE-bench Verified, and posts the lowest hallucination rate in its class.

What Happened

GLM-5 is the first frontier-class AI model built completely outside the NVIDIA ecosystem. Zhipu AI, a Tsinghua University spinoff that went public on the Hong Kong Stock Exchange in January 2026, trained the entire 744B parameter model on Huawei's Ascend 910B chips.

The architecture uses a Mixture of Experts (MoE) design: of the 744 billion total parameters, only about 44 billion are active for any given token. This keeps compute costs manageable despite the massive total parameter count. The 200K-token context window matches or exceeds most proprietary competitors.
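The arithmetic behind that claim is easy to sketch. Using the parameter counts from the article and the standard rule of thumb that a transformer spends roughly 2 FLOPs per active parameter per token (an estimate, not a published figure for GLM-5):

```python
# Back-of-envelope: in an MoE model, compute per token scales with the
# ACTIVE parameter count, not the total. Parameter figures come from the
# article; the 2-FLOPs-per-parameter rule is a common rough estimate.

TOTAL_PARAMS = 744e9    # all experts combined
ACTIVE_PARAMS = 44e9    # parameters actually used per token

flops_per_token_moe = 2 * ACTIVE_PARAMS
flops_per_token_dense = 2 * TOTAL_PARAMS  # a hypothetical dense 744B model

ratio = flops_per_token_moe / flops_per_token_dense
print(f"Per-token compute vs. an equally sized dense model: {ratio:.1%}")
# ~5.9%, i.e. roughly a 17x reduction in inference compute
```

The flip side, covered below, is that memory footprint still scales with the total parameter count, since every expert's weights must be loaded.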

Benchmark performance is competitive with the best proprietary models. GLM-5 scores 50.4% on Humanity's Last Exam, 77.8% on SWE-bench Verified, and 75.9 on BrowseComp. These numbers place it in the same tier as GPT-5.2 and Claude Opus 4.6 on multiple evaluations.

The MIT license means anyone can download, modify, and deploy GLM-5 without restrictions. The model weights are available on Hugging Face and ModelScope. Free API access is available through Zhipu's developer platform.

Why It Matters for Creative Professionals

An MIT-licensed frontier model means creators can run powerful AI locally without API costs or data privacy concerns. For professionals working with sensitive client content, local deployment eliminates the risk of data exposure to third-party services.

The "no NVIDIA required" aspect has broader implications. While NVIDIA's upcoming Vera Rubin GPU promises 5x performance gains, the fact that high-quality AI can be trained on alternative hardware could eventually lead to more competition in the GPU market and lower costs for everyone running AI workloads.

GLM-5's coding performance (77.8% SWE-bench) makes it a viable alternative for creators who use AI for web development, automation, or tooling. The MIT license means you can integrate it into commercial products without licensing concerns.

Key Details

Model: GLM-5 by Zhipu AI
Release date: February 13, 2026
Parameters: 744B total, 44B active (MoE)
Training hardware: Huawei Ascend 910B (zero NVIDIA GPUs)
License: MIT (fully open)
Context window: 200K tokens
Key benchmarks: 50.4% Humanity's Last Exam, 77.8% SWE-bench Verified
Availability: Hugging Face, ModelScope, free API at api.z.ai

What to Do Next

Try GLM-5 through the free API at api.z.ai to evaluate it against your current AI tools. Since access is free, testing carries zero risk.
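If the endpoint follows the OpenAI-compatible chat format that most hosted open models expose (an assumption; check Zhipu's developer docs for the exact path and model identifier, as both `/v1/chat/completions` and the `glm-5` model name below are our guesses), a first request looks roughly like this:

```python
import json
import urllib.request

API_KEY = "YOUR_KEY_HERE"          # from the api.z.ai developer console
BASE_URL = "https://api.z.ai/v1"   # assumed OpenAI-compatible base path

# Standard OpenAI-style chat payload; "glm-5" is an assumed model id.
payload = {
    "model": "glm-5",
    "messages": [
        {"role": "user", "content": "Summarize MoE routing in two sentences."}
    ],
    "max_tokens": 256,
}

req = urllib.request.Request(
    f"{BASE_URL}/chat/completions",
    data=json.dumps(payload).encode("utf-8"),
    headers={
        "Authorization": f"Bearer {API_KEY}",
        "Content-Type": "application/json",
    },
)

# Uncomment once you have a real key:
# with urllib.request.urlopen(req) as resp:
#     print(json.load(resp)["choices"][0]["message"]["content"])
```

If the same payload works against both this endpoint and your current provider, you can A/B the two models on your own prompts with no other code changes.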

If you run local AI models, check whether GLM-5 fits your hardware. The MoE architecture cuts per-token compute to roughly what a 44B dense model costs, but all 744B weights still have to be held in memory (or offloaded), so the total parameter count, not the active one, drives the RAM/VRAM requirement.
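A quick sanity check is to compare the full weight footprint at common quantization levels against your available memory. The parameter count comes from the article; the bytes-per-parameter values are standard quantization sizes, and the totals ignore KV cache and runtime overhead, so treat them as lower bounds:

```python
# Rough weight-memory footprint of GLM-5 at common quantization levels.
# Ignores KV cache, activations, and runtime buffers: real usage is higher.

TOTAL_PARAMS = 744e9

BYTES_PER_PARAM = {
    "fp16/bf16": 2.0,
    "int8": 1.0,
    "int4": 0.5,
}

for fmt, bytes_pp in BYTES_PER_PARAM.items():
    gb = TOTAL_PARAMS * bytes_pp / 1e9
    print(f"{fmt:>10}: ~{gb:,.0f} GB just for weights")
```

Even at 4-bit quantization the weights alone land in the hundreds of gigabytes, which is why multi-GPU rigs or aggressive CPU offloading are the realistic paths for local deployment.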

Watch for the ecosystem to develop. MIT-licensed models tend to spawn fine-tuned variants quickly. Expect specialized creative AI versions within weeks. For a look at how Chinese labs are optimizing for different hardware constraints, see Qwen 3.5 Small, which runs frontier-quality AI on phones and browser tabs.


This story was featured in Creative AI News, Week of March 3-7, 2026. Subscribe for free to get the weekly digest.