
凌度bit|Jan 28, 2025 07:13
Recently I saw people on Twitter still testing DeepSeek with their old GPT-style prompts and concluding that the results are poor. But that is not how a reasoning model is used!
Throw away your prompt templates: at its core this is a reasoning model, not an instruction-following model.
An instruction-following model needs you to spell out the process in detail so it knows how to do the task, which is why so many prompt templates exist.
A reasoning model, by contrast, can think for itself. You only need to tell it the purpose of what you want and some of your ideas, and it will work out a solution on its own. It is like a boss briefing an exceptionally capable colleague: you don't explain step by step what to do, you explain what you want to achieve.
Traditional way:
You are a fintech analyst. Please write a report following this framework:
1. Scale of mobile payment market
2. Analysis of main competitors
3. Current status of blockchain technology applications
4. Development Trends of Digital Currency
Requirement: 1000 words per section, citing data from the central bank and iResearch Consulting
This is a very common prompt template, but handed to DeepSeek it will most likely produce nothing but a dry set of boilerplate sections. If you actually want something meaningful, let the thinking model reason about the problem first.
Thinking model:
I will attend the central bank's digital currency seminar next Wednesday, but I only moved into fintech three months ago. Help me come across to the experts on the roundtable as a veteran with ten years of experience.
You only need to state your purpose and your requirements, and it will figure out how to meet them.
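The contrast above can be sketched in code. This is a minimal illustration, not a definitive recipe: it builds the two prompt styles as message lists for an OpenAI-compatible client. The model name `deepseek-reasoner` and base URL `https://api.deepseek.com` follow DeepSeek's public API docs, but treat them as assumptions and verify against the current documentation before use.

```python
# Sketch: the same task phrased two ways, as messages for an
# OpenAI-compatible chat API (assumed DeepSeek endpoint details below).

def template_prompt() -> list[dict]:
    """Instruction-style prompt: spells out role, outline, and word counts."""
    return [{
        "role": "user",
        "content": (
            "You are a fintech analyst. Write a report with these sections:\n"
            "1. Mobile payment market size\n"
            "2. Main competitors\n"
            "3. Blockchain adoption status\n"
            "4. Digital currency trends\n"
            "Each section 1000 words, citing central bank and iResearch data."
        ),
    }]

def goal_prompt() -> list[dict]:
    """Reasoning-style prompt: states the goal and context, not the steps."""
    return [{
        "role": "user",
        "content": (
            "I'm attending a central bank digital currency seminar next "
            "Wednesday, but I switched into fintech only three months ago. "
            "Help me prepare so the roundtable experts take me for a "
            "ten-year veteran."
        ),
    }]

# To actually call the model (requires an API key; model name and base_url
# are assumptions per DeepSeek's docs):
# from openai import OpenAI
# client = OpenAI(api_key="sk-...", base_url="https://api.deepseek.com")
# reply = client.chat.completions.create(
#     model="deepseek-reasoner", messages=goal_prompt())
```

Note the difference: the goal-style prompt carries the context (seminar, three months of experience, desired impression) and leaves the outline to the model.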
OpenAI's earlier o1 model could do this too, but it was expensive, so probably few people used it. Now that the price of DeepSeek's reasoning model has come down, there is enormous room for imagination: you can put a reasoning model anywhere in your workflow and let it think, and that will surely raise the quality of AI applications!