DeepSeek V3

LLM · Horizontal
Freemium · Open Source

Cost-efficient open-source MoE model rivaling GPT-4o in reasoning and math tasks

DeepSeek-V3 is a 671-billion-parameter Mixture-of-Experts (MoE) model with 37B parameters activated per token. It excels in coding, mathematics, and multilingual tasks, outperforming leading open-source models such as Qwen2.5-72B and Llama-3.1-405B and matching closed-source models such as GPT-4o and Claude-3.5-Sonnet on benchmarks. Trained on 14.8 trillion tokens with FP8 mixed precision, it combines state-of-the-art efficiency with a 128K context window and roughly 3x faster generation than its predecessor.
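The key to that efficiency claim is sparse activation: a gating network scores all experts for each token, but only a small top-k subset actually runs, so active parameters (37B) stay far below total parameters (671B). A minimal sketch of top-k expert routing, with toy shapes, a plain softmax gate, and hypothetical names like `moe_forward` (not DeepSeek's actual implementation):

```python
import numpy as np

def moe_forward(x, gate_w, experts, k=2):
    """Toy Mixture-of-Experts layer: score every expert, run only the
    top-k, and mix their outputs by the normalized gate weights."""
    scores = x @ gate_w                # (num_experts,) gate logits
    top = np.argsort(scores)[-k:]     # indices of the k best-scoring experts
    w = np.exp(scores[top])
    w /= w.sum()                      # softmax over the selected experts only
    # Only the selected experts compute; the rest stay idle, which is
    # why per-token compute tracks active, not total, parameters.
    return sum(wi * experts[i](x) for wi, i in zip(w, top))

rng = np.random.default_rng(0)
num_experts, d = 8, 4
gate_w = rng.normal(size=(d, num_experts))
# Each "expert" here is just a random linear map for illustration.
experts = [(lambda W: (lambda x: x @ W))(rng.normal(size=(d, d)))
           for _ in range(num_experts)]
y = moe_forward(rng.normal(size=d), gate_w, experts, k=2)
```

With k=2 of 8 experts, only a quarter of the expert weights are touched per token; DeepSeek-V3 applies the same idea at far larger scale and with additional load-balancing machinery.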

Metrics

Popularity Score: 23/100
Total Views: 890
Last 24h: 3
Last 7 days: 7
Last 30 days: 18
Upvotes: 1
Bookmarks: 0
