Moonshot AI releases Kimi-K2.6, a 1-trillion-parameter open-source model

Moonshot AI today released Kimi-K2.6, the latest addition to its popular Kimi series of large open source language models.
The Chinese artificial intelligence startup says the algorithm outperforms GPT-5.4 and Claude Opus 4.6 across several AI benchmarks.
Each of an LLM’s artificial neurons includes components called weights that prioritize input data based on correlation. From there, the data is passed to an algorithm called an activation function, which processes the input and decides whether the output is significant enough to be shared with other neurons.
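The weight-then-activate flow described above can be sketched as a single toy neuron. This is a minimal illustration, not Kimi-K2.6's actual implementation; the sigmoid activation is chosen only as a classic example:

```python
import math

def sigmoid(x):
    # Classic sigmoid activation: squashes the signal into (0, 1).
    return 1.0 / (1.0 + math.exp(-x))

def neuron(inputs, weights, bias):
    """A single artificial neuron: weight each input, sum the results,
    then pass the total through an activation function."""
    # Weights prioritize each input value based on learned correlation.
    weighted_sum = sum(x * w for x, w in zip(inputs, weights)) + bias
    # The activation function decides how strongly to pass the signal on.
    return sigmoid(weighted_sum)

result = neuron([1.0, 2.0], [0.5, -0.25], 0.0)  # 0.5
```
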
Kimi-K2.6 uses an activation function known as the Swish-Gated Linear Unit, or SwiGLU for short. It is more hardware-efficient than earlier activation functions and simplifies LLM training in some respects. SwiGLU has been adopted by several other open-source LLM families besides Kimi, notably Meta Platforms Inc.'s Llama series.
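SwiGLU combines the Swish activation with a learned gate: one projection of the input is passed through Swish and used to scale a second projection. A minimal NumPy sketch of the standard formulation, with toy matrix sizes chosen for illustration:

```python
import numpy as np

def swish(x):
    # Swish (also called SiLU): x * sigmoid(x)
    return x / (1.0 + np.exp(-x))

def swiglu(x, W, V):
    """SwiGLU feed-forward gate: Swish(xW) elementwise-scales xV.
    W and V are learned projection matrices."""
    return swish(x @ W) * (x @ V)

rng = np.random.default_rng(0)
x = rng.standard_normal(8)        # a token representation
W = rng.standard_normal((8, 16))  # gate projection
V = rng.standard_normal((8, 16))  # value projection
out = swiglu(x, W, V)             # shape (16,)
```
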
Kimi-K2.6's neurons are organized into 384 so-called experts, small neural networks that are each optimized for a different set of tasks. When the LLM receives a prompt, it activates only eight experts to generate a response. Reducing the number of neural networks involved in processing user input lowers hardware consumption.
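This activate-only-a-few design is known as a mixture-of-experts architecture. A minimal routing sketch, assuming a simple linear router and toy linear "experts" standing in for the small networks (the real routing logic is not described in the article):

```python
import numpy as np

def moe_route(token, router_weights, experts, k=8):
    """Mixture-of-experts routing sketch: score every expert,
    then run only the top-k on this token."""
    scores = token @ router_weights             # one score per expert
    top_k = np.argsort(scores)[-k:]             # indices of the k best experts
    gates = np.exp(scores[top_k])
    gates /= gates.sum()                        # normalize gate weights
    # Only the selected experts run; the other 376 stay idle, saving compute.
    return sum(g * experts[i](token) for g, i in zip(gates, top_k))

rng = np.random.default_rng(1)
dim, n_experts = 16, 384
router = rng.standard_normal((dim, n_experts))
# Each "expert" here is a toy linear layer standing in for a small network.
expert_mats = [rng.standard_normal((dim, dim)) for _ in range(n_experts)]
experts = [(lambda W: (lambda t: t @ W))(W) for W in expert_mats]
token = rng.standard_normal(dim)
output = moe_route(token, router, experts)      # shape (16,)
```
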
Kimi-K2.6’s neural networks use a technology called MLA, or multi-head latent attention, to identify the most important parts of the input. MLA is a hardware-efficient variant of the standard attention mechanism found in LLMs. It works in a similar way, except that it compresses the data it processes into a lightweight mathematical representation to reduce hardware requirements.
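The core idea behind that compression can be sketched in a few lines: keys and values are first squeezed into a small latent vector, and only that latent needs to be cached. This is a simplified single-head illustration with made-up dimensions, not Kimi-K2.6's actual attention code:

```python
import numpy as np

def latent_attention(q_tokens, kv_tokens, W_down, W_up_k, W_up_v):
    """Latent-attention sketch: compress keys/values into a low-rank
    latent, then expand them back when attention scores are computed.
    Caching only the latent cuts memory versus full K/V caching."""
    latent = kv_tokens @ W_down            # compress to a small latent
    K = latent @ W_up_k                    # reconstruct keys
    V = latent @ W_up_v                    # reconstruct values
    scores = q_tokens @ K.T / np.sqrt(K.shape[-1])
    weights = np.exp(scores - scores.max(axis=-1, keepdims=True))
    weights /= weights.sum(axis=-1, keepdims=True)  # softmax over keys
    return weights @ V

rng = np.random.default_rng(2)
d_model, d_latent, n = 64, 8, 10
q = rng.standard_normal((n, d_model))
kv = rng.standard_normal((n, d_model))
out = latent_attention(q, kv,
                       rng.standard_normal((d_model, d_latent)),
                       rng.standard_normal((d_latent, d_model)),
                       rng.standard_normal((d_latent, d_model)))
# out has shape (10, 64); only the (10, 8) latent would need caching
```
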
Kimi-K2.6's neural networks are supported by a vision encoder with 400 million parameters. It converts images into embeddings, mathematical representations that the LLM can readily process. The vision encoder enables Kimi-K2.6 to process not only text instructions but also multimedia input.
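A common way vision encoders produce such embeddings is to split the image into patches and project each patch into the model's embedding space. A toy front-end sketch under that assumption; the article does not describe the encoder's internals, and the sizes here are illustrative:

```python
import numpy as np

def patch_embed(image, patch_size, W):
    """Toy vision-encoder front end: split an image into square patches
    and project each flattened patch into the embedding space."""
    h, w = image.shape
    patches = []
    for i in range(0, h, patch_size):
        for j in range(0, w, patch_size):
            patches.append(image[i:i + patch_size, j:j + patch_size].ravel())
    patches = np.stack(patches)        # (num_patches, patch_size**2)
    return patches @ W                 # (num_patches, embed_dim)

rng = np.random.default_rng(3)
image = rng.standard_normal((32, 32))      # toy grayscale image
W = rng.standard_normal((8 * 8, 64))       # learned projection matrix
embeddings = patch_embed(image, 8, W)      # (16, 64): 16 patch embeddings
```
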
According to Moonshot AI, the model can turn simple user prompts and interface mockups into complete websites. When the LLM is given a particularly difficult, time-consuming task, it can spin up as many as 300 agents to speed up the workflow. The agents divide the work into smaller steps and perform them in parallel, which is faster than completing them sequentially.
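The divide-and-parallelize pattern can be sketched with a thread pool. The `toy_agent` here is hypothetical; a real agent would call the model itself to complete each subtask:

```python
from concurrent.futures import ThreadPoolExecutor

def run_agents(subtasks, agent):
    """Parallel agent dispatch sketch: split a big job into subtasks
    and run one agent on each, concurrently rather than sequentially."""
    with ThreadPoolExecutor(max_workers=len(subtasks)) as pool:
        return list(pool.map(agent, subtasks))

def toy_agent(subtask):
    # Hypothetical stand-in: a real agent would invoke the LLM here.
    return f"done: {subtask}"

results = run_agents(["parse spec", "write HTML", "write CSS"], toy_agent)
# results == ["done: parse spec", "done: write HTML", "done: write CSS"]
```
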
Kimi-K2.6 can also enlist human workers using a feature called claw teams. According to Moonshot AI, it enables the LLM to divide the work involved in a project between humans and agents. Kimi-K2.6 likewise improves on its predecessor in other areas, including Rust programming. Rust is a low-level language with complex syntax that is widely used to program connected devices.
Moonshot AI compared Kimi-K2.6 with GPT-5.4 and Claude Opus 4.6 across more than a dozen popular benchmarks. According to the company, its algorithm either beat the two LLMs or came within a few percentage points of their scores in most tests.
One of the tests in which Kimi-K2.6 pulled ahead is HLE-Full, which is among the most difficult benchmarks in the AI ecosystem. It includes nearly 2,500 doctoral-level questions spanning more than 100 academic fields. Kimi-K2.6 scored 54, while Opus 4.6 and GPT-5.4 scored 53 and 52.1, respectively.
Photo: Unsplash



