117B-parameter MoE language model from OpenAI.
gpt-oss-120b — for production, general-purpose, high-reasoning use cases; fits on a single H100 GPU (117B parameters with 5.1B active parameters)
gpt-oss-120b runs on a single H100 GPU, and the gpt-oss-20b model runs within 16 GB of memory.
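A minimal sketch of loading the smaller gpt-oss-20b checkpoint for local inference, assuming the weights are published on the Hugging Face Hub under the `openai/gpt-oss-20b` identifier and that the installed `transformers` release supports the architecture (the larger gpt-oss-120b would be loaded the same way on a single H100):

```python
# Minimal sketch: generate text with gpt-oss-20b via the Transformers pipeline.
# Assumptions: the checkpoint is available as "openai/gpt-oss-20b" on the
# Hugging Face Hub and the installed transformers version supports it.
from transformers import pipeline

generator = pipeline(
    "text-generation",
    model="openai/gpt-oss-20b",
    torch_dtype="auto",   # select bf16/fp16 automatically where supported
    device_map="auto",    # place weights on the available GPU(s)
)

messages = [
    {"role": "user", "content": "Explain mixture-of-experts models in one paragraph."},
]

# The pipeline applies the model's chat template to the message list and
# returns the conversation with the assistant's reply appended.
output = generator(messages, max_new_tokens=256)
print(output[0]["generated_text"])
```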