DeepSeek launches Expert Mode


Reporter|Ye Xiaodan

Editors|Jin Mingyu, Xu Shaohang, Du Bo  Proofreader|Zhang Jinhe

On April 8, DeepSeek rolled out Expert Mode. In the latest version, two new options, "Quick Mode" and "Expert Mode," appear above the DeepSeek input box. This is the first time since DeepSeek rose to popularity that it has introduced a tiered mode design at the product level.

A reporter from the Beijing Business Daily noted that Quick Mode is suited to everyday conversation, responding instantly and supporting recognition of text in images and files. Expert Mode excels at complex problems, supporting deep thinking and intelligent search.

In its in-dialogue introduction to Expert Mode, DeepSeek said the mode offers domain-depth enhancement, visualized multi-step reasoning, strengthened citation provenance, custom expert combinations, and optimized long-context compression.

Regarding the model support behind the new feature, DeepSeek stated that Expert Mode is powered by its next-generation Mixture of Experts (MoE) architecture. Its core foundation is the reinforcement-learning result of integrating DeepSeek-V3.2 (or a later successor version) with the reasoning layer, DeepSeek-R1. Expert Mode inherits R1's long chain-of-thought reasoning capabilities, but it also undergoes targeted distillation and fine-tuning for professional domains, making "fast thinking" and "slow thinking" better balanced within those fields.

In simple terms, Expert Mode = a combination of V3.2’s domain-expert routing + R1’s deep reasoning mechanism + professional retrieval enhancement.
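DeepSeek has not published the implementation behind this mode, but the "domain-expert routing" component the article names refers to the standard MoE idea: a gate scores all experts per input, only the top-k are run, and their outputs are combined by the renormalized gate weights. A minimal sketch of that general mechanism, with all experts and numbers purely hypothetical:

```python
import math

def softmax(xs):
    """Numerically stable softmax over a list of gate logits."""
    m = max(xs)
    exps = [math.exp(x - m) for x in xs]
    total = sum(exps)
    return [e / total for e in exps]

def top_k_route(gate_logits, k=2):
    """Select the k highest-scoring experts and renormalize their weights."""
    probs = softmax(gate_logits)
    top = sorted(range(len(probs)), key=lambda i: probs[i], reverse=True)[:k]
    mass = sum(probs[i] for i in top)
    return [(i, probs[i] / mass) for i in top]

def moe_layer(token, experts, gate_logits, k=2):
    """Output = gate-weighted sum of only the selected experts' outputs."""
    return sum(w * experts[i](token) for i, w in top_k_route(gate_logits, k))

# Toy "experts": each is just a different scaling of the input.
experts = [lambda x, s=s: s * x for s in (1.0, 2.0, 3.0, 4.0)]
out = moe_layer(10.0, experts, gate_logits=[0.1, 0.3, 2.0, 0.2], k=2)
```

The point of the sparse top-k selection is that only a small fraction of the model's parameters run per token, which is how MoE models keep inference cost low relative to their total parameter count.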

On April 8, a reporter from the Beijing Business Daily checked DeepSeek’s open platform update log and found that its latest updated model version is still the DeepSeek-V3.2 model updated on December 1, 2025.

|Beijing Business Daily nbdnews Original article|

Reprinting, excerpting, copying, mirroring, or other uses are prohibited without permission.


(Editor: Zhang Xiaobo)

