
With the release of two open-weight AI language models, OpenAI has made its first move toward model openness since GPT-2 in 2019, an unusual departure from its long-standing emphasis on proprietary systems. The decision appears to be driven not only by mounting global pressure but also by China's recent lead in open-source AI, particularly the success of DeepSeek.
The newly released models, named gpt-oss-120b and gpt-oss-20b, are available for download on Hugging Face. The smaller gpt-oss-20b can run on a typical laptop with 16GB of RAM, while the larger gpt-oss-120b requires only a single Nvidia GPU, greatly lowering the barrier to entry for developers and researchers.
Until recently, OpenAI strictly guarded its most capable models, making them accessible only through chatbot interfaces and the cloud. However, CEO Sam Altman recently acknowledged the need for change, admitting during a Reddit AMA, “I think we need to figure out a different open source strategy. Not everyone at OpenAI shares this view, and it’s also not our current highest priority… We will produce better models, but we will maintain less of a lead than we did in previous years.”
This latest release is an open-weight model, not fully open source. Open-weight models allow users to access the model’s trained parameters but withhold details such as training data, architecture, and source code.
The global AI race has seen a dramatic shift, with China’s DeepSeek leading a movement in cost-efficient, high-performing open-source language models.
US firms such as OpenAI have had to reconsider their approach in light of the success of DeepSeek and of Meta's Llama, which has accumulated over a billion downloads despite restrictions on commercial licensing.
Altman’s comments, along with OpenAI’s community feedback form inviting suggestions on future open-weight models, hint at deeper introspection within the company. One question in the form asks, “What would you like to see in an open weight model from OpenAI?”
This also comes as the U.S. government increasingly advocates for open development in AI to keep up with global counterparts.
Although the release made headlines, experts point out that open-weight is not the same as open-source, and many developers continue to want greater transparency, modifiability, and ethical control in AI systems. In contrast to open-source models, which ship with comprehensive architectural and training documentation, open-weight models remain partially opaque.
However, the release is widely seen as a significant acknowledgement of the importance of openness in the advancement of AI. As competition intensifies and public trust becomes increasingly crucial, more companies may feel compelled to partially disclose their technologies.