SK Telecom announced on July 11 that it has open-sourced its self-developed lightweight large language model (LLM), ‘A.X 3.1 Light,’ on the global platform Hugging Face.
‘A.X 3.1 Light’ is a lightweight model with 7 billion (7B) parameters that SK Telecom designed and built entirely from scratch through its own development process. It is an upgraded version of the previous ‘A.X 3.0 Light’: while maintaining its predecessor’s efficiency and compact size, it offers improved language processing capabilities for a wider range of applications.
This model is optimized to deliver high-quality performance across various mobile devices, enabling companies to provide more efficient AI services.
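For readers who want to try the released weights, the short sketch below shows how a Hugging Face checkpoint of this kind is typically loaded with the transformers library. The repository id skt/A.X-3.1-Light, the chat-template usage, and the sample prompt are illustrative assumptions; the article only states that the model has been published on Hugging Face.

```python
# Minimal sketch, assuming the model is published under a repo id such as
# "skt/A.X-3.1-Light" (illustrative; check SK Telecom's Hugging Face page for
# the exact id) and ships as a standard causal-LM checkpoint.
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "skt/A.X-3.1-Light"  # assumed repo id, not confirmed by the article

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    torch_dtype="auto",   # use the dtype stored in the checkpoint
    device_map="auto",    # place weights on GPU if available (requires accelerate)
)

# Build a simple Korean prompt; assumes the tokenizer provides a chat template.
messages = [{"role": "user", "content": "대한민국의 수도는 어디인가요?"}]
inputs = tokenizer.apply_chat_template(
    messages, add_generation_prompt=True, return_tensors="pt"
).to(model.device)

outputs = model.generate(inputs, max_new_tokens=128)
# Decode only the newly generated tokens, skipping the prompt.
print(tokenizer.decode(outputs[0][inputs.shape[-1]:], skip_special_tokens=True))
```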
According to SK Telecom, ‘A.X 3.1 Light’ performs competitively with the company’s higher-tier ‘A.X 4.0 Light,’ particularly in Korean language understanding. On the Korean Massive Multitask Language Understanding (KMMLU) benchmark, it scored 61.70, approximately 96% of the 64.15 points achieved by A.X 4.0 Light. Notably, it outperformed A.X 4.0 Light on the Cultural and Linguistic Intelligence in Korea (CLIcK) benchmark, scoring 71.22 points versus 69.97, indicating strong cultural and linguistic understanding of Korean.
Furthermore, SK Telecom plans to release a follow-up model with around 34 billion (34B) parameters, roughly five times larger, within this month, complementing both the lightweight A.X 3 series and the larger A.X 4 series. This “two-track strategy” aims to foster a diverse AI ecosystem, with the from-scratch A.X 3 series focused on efficiency and technological self-reliance, and the A.X 4 series serving high-performance needs through continued pre-training.
Since beginning AI language model development in 2018, SK Telecom has continuously advanced domestic AI technology: it applied KoBERT to customer service chatbots in 2019, released Korea’s first open-source Korean GPT-2 model in 2020, and since 2022 has used its proprietary A.X models in its AI service ‘A. dot’ to enhance conversation and summarization features.
Looking ahead, SK Telecom plans to further boost LLM performance by expanding GPU resources and strengthening AI R&D. The company also intends to participate in the government-led ‘Independent AI Foundation Model Development Project.’
Kim Taeyoon, head of SK Telecom’s foundation models division, stated, “Building on our accumulated expertise in developing domestic LLMs, we aim to contribute to the self-reliance of the AI ecosystem and strengthen Korea’s AI competitiveness.”