Meet Korea's LLM Powerhouses: HyperCLOVA, A.X, Solar Pro, and More
Driven by strategic government investment, corporate research, and open-source collaboration, South Korea is rapidly establishing itself as a key innovator in large language models (LLMs), creating models tailored to Korean language processing and domestic applications. This focus helps reduce dependence on foreign AI technologies, strengthen data privacy, and support sectors such as healthcare, education, and telecommunications.
Government-backed promotion of sovereign AI
In 2025, the Ministry of Science and ICT launched a 240 billion won program, selecting five consortia led by Naver Cloud, SK Telecom, Upstage, LG AI Research, and NC AI to develop sovereign LLMs capable of operating on local infrastructure.
Regulatory advances include Ministry of Food and Drug Safety guidance for approving text-generating medical AI, issued in early 2025 as the first such framework worldwide.
Corporate and academic innovation
SK Telecom launched A.X 3.1 Lite, a 7-billion-parameter model trained from scratch on more than 1.65 trillion multilingual tokens, with a strong emphasis on Korean-language reasoning. It reaches about 96% of the larger A.X 3.1's score on KMMLU (Korean reasoning) and 102% on CLIcK (Korean cultural understanding), and is released as open source on Hugging Face for mobile and on-device applications.
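Because the model is distributed on Hugging Face, loading it follows the standard transformers pattern. The sketch below is illustrative only: the repository id is an assumption and should be verified against SK Telecom's Hugging Face organization page.

```python
# Minimal sketch: loading an open-source Korean LLM with Hugging Face
# transformers. The repo id "skt/A.X-3.1-Light" is an assumption; verify
# the exact name on SK Telecom's Hugging Face organization before running.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "skt/A.X-3.1-Light"  # assumed repository id

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    torch_dtype=torch.bfloat16,  # bf16 keeps a 7B model near ~14 GB of memory
    device_map="auto",
)

prompt = "한국의 수도는 어디인가요?"  # "What is the capital of Korea?"
inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
outputs = model.generate(**inputs, max_new_tokens=64)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```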
Naver advanced its HyperCLOVA series with HyperCLOVA X THINK in June 2025, enhancing Korea-specific search and dialogue capabilities.
Upstage's Solar Pro 2 is the only Korean model on the Frontier LM Intelligence rankings, matching the performance of larger international models with notable efficiency.
LG AI Research released EXAONE 4.0 in July 2025, a roughly 30-billion-parameter design that competes on global benchmarks.
Seoul National University Hospital developed Korea's first medical LLM, trained on 38 million de-identified clinical records; it scored 86.2% on the Korean medical licensing examination, against a human average of 79.7%.
MathPresso and Upstage jointly developed MathGPT, a small LLM with 13 billion parameters that surpassed GPT-4 on mathematical benchmarks (0.488 versus 0.425 accuracy) while using significantly fewer computing resources.
Open-source projects such as Polyglot-Ko (spanning 1.3 to 12.8 billion parameters) and Gecko 7B continually pretrain on Korean datasets to handle language nuances such as code-switching; a rough sketch of this approach appears below.
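As an illustration of continual pretraining, the sketch below continues causal-language-model training of a real Polyglot-Ko checkpoint on a local Korean text file. The corpus path and hyperparameters are placeholders, not these projects' actual recipes.

```python
# Sketch of continual pretraining on Korean text with the Hugging Face
# Trainer. The corpus file and hyperparameters are placeholders.
from datasets import load_dataset
from transformers import (AutoModelForCausalLM, AutoTokenizer,
                          DataCollatorForLanguageModeling, Trainer,
                          TrainingArguments)

base = "EleutherAI/polyglot-ko-1.3b"  # smallest Polyglot-Ko checkpoint
tokenizer = AutoTokenizer.from_pretrained(base)
model = AutoModelForCausalLM.from_pretrained(base)

# Placeholder corpus: any plain-text file with one document per line.
raw = load_dataset("text", data_files={"train": "korean_corpus.txt"})
tokenized = raw["train"].map(
    lambda batch: tokenizer(batch["text"], truncation=True, max_length=1024),
    batched=True,
    remove_columns=["text"],
)

trainer = Trainer(
    model=model,
    args=TrainingArguments(
        output_dir="polyglot-ko-continued",
        per_device_train_batch_size=2,
        num_train_epochs=1,
    ),
    train_dataset=tokenized,
    # mlm=False gives the causal (next-token) objective used in pretraining.
    data_collator=DataCollatorForLanguageModeling(tokenizer, mlm=False),
)
trainer.train()
```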
Technology trends
South Korean developers emphasize efficiency, optimizing token-to-parameter ratios in the spirit of the Chinchilla scaling laws so that 7-to-30-billion-parameter models can compete with larger Western peers despite resource constraints.
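For intuition, the commonly cited Chinchilla rule of thumb is roughly 20 training tokens per parameter at compute-optimal scale. The arithmetic below is an illustration of that ratio (not any vendor's actual recipe) and shows why a heavily over-trained 7B model such as A.X 3.1 Lite can punch above its weight.

```python
# Chinchilla-style back-of-the-envelope: ~20 tokens per parameter is the
# commonly cited compute-optimal ratio. Korean small models often train far
# past it, trading extra training compute for a stronger small model.
CHINCHILLA_RATIO = 20  # tokens per parameter, rule of thumb

for params_b in (7, 13, 30):
    optimal_tokens_b = CHINCHILLA_RATIO * params_b
    print(f"{params_b}B params -> ~{optimal_tokens_b}B compute-optimal tokens")

actual_ratio = 1_650 / 7  # A.X 3.1 Lite: ~1.65T tokens / 7B params
print(f"A.X 3.1 Lite trains at ~{actual_ratio:.0f} tokens per parameter")
```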
As Seoul National University Hospital's medical LLM and MathPresso's MathGPT show, domain-specific adaptation produces outstanding results in targeted areas.
Progress is measured against benchmarks including KMMLU, CLIcK, and the Frontier LM rankings, confirming parity with advanced global systems.
Market prospects
The South Korean LLM market is expected to grow from $182.4 million in 2024 to roughly $1,278 million in 2030, a CAGR of 39.4%, driven primarily by chatbots, virtual assistants, and sentiment-analysis tools. Telecom companies are integrating edge-computing LLMs, with initiatives such as AI infrastructure superhighways supporting lower latency and stronger data security.
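The 2030 figure above is reconstructed from the cited growth rate; the quick check below shows the implied CAGR lands near the reported 39.4% (small differences typically come from base-year conventions and rounding).

```python
# Sanity check of the market projection (USD millions).
start, end, years = 182.4, 1278.3, 6  # 2024 -> 2030

cagr = (end / start) ** (1 / years) - 1
print(f"Implied CAGR ≈ {cagr:.1%}")  # ≈ 38.3%, in line with the cited 39.4%
```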
The Korean large language models mentioned
# | Model | Developer/Lead Organization | Parameter Count | Notable Highlights
---|---|---|---|---
1 | A.X 3.1 Lite | SK Telecom | 7 billion | Korean mobile and on-device processing
2 | A.X 4.0 Lite | SK Telecom | 72 billion | Scalable sovereign applications
3 | HyperCLOVA X THINK | Naver | ~204 billion (est.) | Korean search and dialogue
4 | Solar Pro 2 | Upstage | ~30 billion (est.) | Efficiency on global frontier rankings
5 | MathGPT | MathPresso + Upstage | 13 billion | Mathematics specialization
6 | EXAONE 4.0 | LG AI Research | 30 billion | Multimodal AI features
7 | Polyglot-Ko | EleutherAI + KIFAI | 1.3 to 12.8 billion | Korean-only open-source training
8 | Gecko 7B | Beomi Community | 7 billion | Continual Korean pretraining
9 | SNUH Medical LLM | Seoul National University Hospital | Not disclosed (~15B est.) | Clinical decision support
These developments highlight South Korea’s approach to creating efficient, culturally relevant AI models, thus enhancing its position in the global technology field.