OpenAI has released two new open models, gpt-oss-120b and gpt-oss-20b, its first open-weight models since GPT-2 more than five years ago. Both are available for free download on Hugging Face, giving developers and researchers direct access to the weights to build on.
The models differ in power and hardware requirements:
- gpt-oss-120b — a larger, more capable model that still fits on a single high-end NVIDIA GPU (roughly the 80 GB class);
- gpt-oss-20b — a lighter version suitable for use on a standard laptop with 16 GB of RAM.
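The hardware figures above follow from simple back-of-the-envelope arithmetic. A minimal sketch, assuming parameter counts of roughly 20B and 120B and weights quantized to about 4 bits each (an assumption; activations and KV cache add further overhead on top of the weight footprint):

```python
def model_memory_gb(n_params, bits_per_weight):
    """Rough weight-memory footprint: parameters x bits, converted to GB."""
    return n_params * bits_per_weight / 8 / 1e9

# Assumption: ~4-bit quantized weights; real memory use is somewhat higher.
print(round(model_memory_gb(20e9, 4), 1))   # 10.0 -> fits in a 16 GB laptop
print(round(model_memory_gb(120e9, 4), 1))  # 60.0 -> needs a data-center GPU
```

At full 16-bit precision the same arithmetic gives 40 GB and 240 GB, which is why aggressive quantization is what makes laptop-scale deployment plausible at all.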
OpenAI aims to establish an American open AI platform as an alternative to the increasing influence of Chinese labs like DeepSeek, Qwen (Alibaba), and Moonshot AI, which are actively developing their own powerful open models.
In competitive programming on Codeforces, the 120b model earned a rating of 2622 and the 20b model 2516, surpassing DeepSeek R1 but trailing OpenAI's closed o3 and o4-mini models. On the difficult Humanity's Last Exam (HLE) benchmark, the 120b scored 19% and the 20b 17.3%, ahead of other open models but behind o3.
The new models were trained using a methodology similar to that of OpenAI's closed models, employing a mixture-of-experts (MoE) approach that activates only a portion of parameters for each token, enhancing efficiency. Additional reinforcement learning fine-tuning enabled the models to construct logical reasoning chains and invoke tools such as web search or Python code execution.
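The routing idea behind mixture-of-experts can be shown in a few lines. This is a toy sketch of top-k expert routing, not OpenAI's actual architecture: the router scores every expert for a token, but only the top-k experts actually run, so the model holds far more parameters than it activates per token. All names and the use of plain linear layers as "experts" are illustrative assumptions.

```python
import numpy as np

def moe_forward(x, gate_w, experts, top_k=2):
    """Route one token through the top-k experts of a toy MoE layer.

    x: (d,) token embedding; gate_w: (d, n_experts) router weights;
    experts: list of (d, d) matrices standing in for expert networks.
    Only top_k of the n_experts are evaluated for this token.
    """
    logits = x @ gate_w                        # router score per expert
    top = np.argsort(logits)[-top_k:]          # indices of the top-k experts
    weights = np.exp(logits[top] - logits[top].max())
    weights /= weights.sum()                   # softmax over chosen experts only
    # Weighted sum of the selected experts' outputs; the rest stay idle.
    return sum(w * (x @ experts[i]) for w, i in zip(weights, top))
```

With, say, 8 experts and top_k=2, each token pays the compute cost of 2 experts while the layer stores the parameters of all 8, which is the efficiency gain the paragraph above describes.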
Both models are text-only; they do not generate images or audio. They are distributed under the Apache 2.0 license, which permits commercial use without OpenAI's approval, although the training data remains undisclosed, in part over copyright concerns.
The launch of gpt-oss aims to strengthen OpenAI's position within the developer community while also addressing political pressure in the United States to enhance the role of open American models in global competition.