DeepSeek V4 Pro (LLM, Pro)

DeepSeek V4 Pro is a state-of-the-art large language model combining efficient sparse attention, strong reasoning, and integrated agent capabilities for robust long-context understanding and versatile AI applications.

GLM 5.1 (LLM)

GLM-5.1 is Z.AI’s latest flagship model, featuring upgrades in two key areas: enhanced programming capabilities and more stable multi-step reasoning/execution. It demonstrates significant improvements in executing complex agent tasks while delivering more natural conversational experiences and superior front-end aesthetics.

Context: 202.75K
Input: $1.39/M tokens
Output: $4.40/M tokens
Max output: 202.75K
Cache-based pricing
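
The per-million-token rates on the card translate directly into request cost. A minimal sketch, using GLM-5.1's listed rates; the token counts and the `request_cost` helper are hypothetical, for illustration only:

```python
def request_cost(input_tokens: int, output_tokens: int,
                 input_rate: float, output_rate: float) -> float:
    """Cost in USD, given per-million-token rates for input and output."""
    return (input_tokens / 1_000_000 * input_rate
            + output_tokens / 1_000_000 * output_rate)

# GLM-5.1 rates from the card: $1.39/M input, $4.40/M output.
# Example request: 10,000 input tokens, 2,000 output tokens (hypothetical).
cost = request_cost(10_000, 2_000, 1.39, 4.40)
print(f"${cost:.4f}")  # 0.0139 + 0.0088 = $0.0227
```

Note that with cache-based pricing, cached input tokens are typically billed at a lower rate than the listed input rate, so this is an upper bound for repeated-prefix workloads.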
GLM 5v Turbo (LLM, Turbo)

GLM-5v Turbo is Z.AI’s latest flagship model, featuring upgrades in two key areas: enhanced programming capabilities and more stable multi-step reasoning/execution. It demonstrates significant improvements in executing complex agent tasks while delivering more natural conversational experiences and superior front-end aesthetics.

Context: 202.75K
Input: $1.20/M tokens
Output: $4.00/M tokens
Max output: 131.07K
Cache-based pricing
GLM 5 Turbo (LLM, Turbo)

GLM-5 Turbo is Z.AI’s latest flagship model, featuring upgrades in two key areas: enhanced programming capabilities and more stable multi-step reasoning/execution. It demonstrates significant improvements in executing complex agent tasks while delivering more natural conversational experiences and superior front-end aesthetics.

Context: 262.14K
Input: $1.20/M tokens
Output: $4.00/M tokens
Max output: 131.07K
Cache-based pricing
GLM 5 (LLM)

GLM-5 is Z.AI’s latest flagship model, featuring upgrades in two key areas: enhanced programming capabilities and more stable multi-step reasoning/execution. It demonstrates significant improvements in executing complex agent tasks while delivering more natural conversational experiences and superior front-end aesthetics.

Context: 202.75K
Input: $0.95/M tokens
Output: $3.15/M tokens
Max output: 202.75K
Cache-based pricing
GLM 4.7 (LLM)

GLM-4.7 is Z.AI’s latest flagship model, featuring upgrades in two key areas: enhanced programming capabilities and more stable multi-step reasoning/execution. It demonstrates significant improvements in executing complex agent tasks while delivering more natural conversational experiences and superior front-end aesthetics.

Context: 202.75K
Input: $0.52/M tokens
Output: $1.85/M tokens
Max output: 202.75K
Cache-based pricing
GLM 4.6 (LLM)

GLM-4.6 is an efficient 357B-parameter mixture-of-experts (MoE) model from Zhipu AI.

Context: 202.75K
Input: $0.60/M tokens
Output: $2.20/M tokens
Max output: 202.75K
Cache-based pricing
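
Because the GLM cards above share a format, their rates can be compared on a single blended-price axis. A sketch, using the input/output rates as listed on the cards; the 3:1 input-to-output token mix is a hypothetical workload assumption, not something the catalog specifies:

```python
# Input/output rates (USD per million tokens) as listed on the cards above.
models = {
    "GLM 5.1":      (1.39, 4.40),
    "GLM 5v Turbo": (1.20, 4.00),
    "GLM 5 Turbo":  (1.20, 4.00),
    "GLM 5":        (0.95, 3.15),
    "GLM 4.7":      (0.52, 1.85),
    "GLM 4.6":      (0.60, 2.20),
}

def blended(rates: tuple[float, float], input_share: float = 0.75) -> float:
    """Blended $/M-token price for a hypothetical 3:1 input:output mix."""
    input_rate, output_rate = rates
    return input_share * input_rate + (1 - input_share) * output_rate

# Cheapest first under this mix.
for name, rates in sorted(models.items(), key=lambda kv: blended(kv[1])):
    print(f"{name}: ${blended(rates):.4f}/M tokens (blended)")
```

Under this particular mix, GLM 4.7 comes out cheapest (0.75 × $0.52 + 0.25 × $1.85 = $0.8525/M); a more output-heavy workload would shift the ranking.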

Join our Discord community

Join the Discord community for the latest model updates, prompts, and support.