Chinese AI firm DeepSeek unveils V4 with 1 million-token context
DeepSeek, a Chinese tech company, has released its latest AI model, DeepSeek V4, which aims to compete with big names like ChatGPT and Gemini.
What stands out? A massive 1 million-token context window lets it take on huge tasks, so it's built for both power and efficiency.
There are two versions, V4-Pro and V4-Flash, both built on an architecture that cuts down on memory and computing needs. The Pro version is especially efficient: it has 1.6 trillion parameters in total but activates only 49 billion of them at a time.
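A quick back-of-the-envelope sketch shows what that sparsity means in practice. The numbers come from the article; the exact sparse-activation mechanism is not specified, so this is purely illustrative arithmetic:

```python
# Sparse models run only a subset of their weights for each token.
# Figures reported for DeepSeek V4-Pro:
total_params = 1.6e12   # 1.6 trillion total parameters
active_params = 49e9    # 49 billion parameters active per token

active_fraction = active_params / total_params
print(f"Active per token: {active_fraction:.1%}")  # prints "Active per token: 3.1%"
```

In other words, only about 3% of the model's weights do work on any given token, which is where the memory and compute savings come from.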
Trained on over 32 trillion tokens, DeepSeek V4 excels at coding tasks, though it trails US models on some knowledge benchmarks.
Plus, it's priced lower than rivals like Claude 4.7, making it an appealing option for anyone seeking affordable AI tools.