DeepSeek V4 Pro (LLM · PRO)

DeepSeek V4 Pro is a state-of-the-art large language model combining efficient sparse attention, strong reasoning, and integrated agent capabilities for robust long-context understanding and versatile AI applications.

GLM 5.1 (LLM · NEW · HOT)

GLM-5.1 is Z.AI’s latest flagship model, featuring upgrades in two key areas: enhanced programming capabilities and more stable multi-step reasoning/execution. It demonstrates significant improvements in executing complex agent tasks while delivering more natural conversational experiences and superior front-end aesthetics.

Context: 202.75K
Input: $1.39 per million tokens
Output: $4.40 per million tokens
Max output: 202.75K
Cache-Based
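To see how the per-million-token rates on these cards translate into a request cost, here is a minimal sketch. The function name and token counts are illustrative, and it ignores cache-based discounts, which follow the provider's own billing rules:

```python
def request_cost_usd(input_tokens: int, output_tokens: int,
                     input_rate: float, output_rate: float) -> float:
    """Estimate one request's cost from USD-per-1M-token rates.

    Rates come straight from a model card; cache-based
    discounts are not modeled in this sketch.
    """
    return (input_tokens * input_rate + output_tokens * output_rate) / 1_000_000

# GLM 5.1 rates from the card above: $1.39 input, $4.40 output per 1M tokens.
cost = request_cost_usd(120_000, 8_000, input_rate=1.39, output_rate=4.40)
print(f"${cost:.4f}")  # 120K input + 8K output tokens
```

The same function applies to any card below by swapping in that model's rates.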
GLM 5v Turbo (LLM · TURBO · NEW · HOT)

GLM-5v Turbo is Z.AI’s latest flagship model, featuring upgrades in two key areas: enhanced programming capabilities and more stable multi-step reasoning/execution. It demonstrates significant improvements in executing complex agent tasks while delivering more natural conversational experiences and superior front-end aesthetics.

Context: 202.75K
Input: $1.20 per million tokens
Output: $4.00 per million tokens
Max output: 131.07K
Cache-Based
GLM 5 Turbo (LLM · TURBO · NEW · HOT)

GLM-5 Turbo is Z.AI’s latest flagship model, featuring upgrades in two key areas: enhanced programming capabilities and more stable multi-step reasoning/execution. It demonstrates significant improvements in executing complex agent tasks while delivering more natural conversational experiences and superior front-end aesthetics.

Context: 262.14K
Input: $1.20 per million tokens
Output: $4.00 per million tokens
Max output: 131.07K
Cache-Based
GLM 5 (LLM · NEW · HOT)

GLM-5 is Z.AI’s latest flagship model, featuring upgrades in two key areas: enhanced programming capabilities and more stable multi-step reasoning/execution. It demonstrates significant improvements in executing complex agent tasks while delivering more natural conversational experiences and superior front-end aesthetics.

Context: 202.75K
Input: $0.95 per million tokens
Output: $3.15 per million tokens
Max output: 202.75K
Cache-Based
GLM 4.7 (LLM · NEW · HOT)

GLM-4.7 is Z.AI’s flagship model, featuring upgrades in two key areas: enhanced programming capabilities and more stable multi-step reasoning/execution. It demonstrates significant improvements in executing complex agent tasks while delivering more natural conversational experiences and superior front-end aesthetics.

Context: 202.75K
Input: $0.52 per million tokens
Output: $1.85 per million tokens
Max output: 202.75K
Cache-Based
GLM 4.6 (LLM · NEW · HOT)

GLM-4.6 is a 357B-parameter efficient MoE model from Zhipu AI.

Context: 202.75K
Input: $0.60 per million tokens
Output: $2.20 per million tokens
Max output: 202.75K
Cache-Based

Join our Discord community

Join the Discord community for the latest model updates, prompts, and support.