Asked the same question again and got a fairly neutral answer:

Q: Is DeepSeek's R1 model distilled from ChatGPT?

A:

As of now, I don’t have any specific, confirmed information on a model called “DeepSeek’s R1” or whether it is distilled from ChatGPT. “Distillation” typically refers to the process of transferring knowledge from a larger model (like GPT-3 or GPT-4) to a smaller, more efficient model. Companies and research labs often use distillation techniques to create smaller models that retain much of the larger model’s performance.
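To make the distillation idea above concrete, here is a minimal, illustrative sketch of the classic soft-target recipe (temperature-scaled softmax plus a KL term). All names and numbers below are hypothetical examples, not anything specific to DeepSeek or OpenAI.

```python
import math

def softmax(logits, temperature=1.0):
    """Temperature-scaled softmax; higher temperature softens the distribution."""
    scaled = [x / temperature for x in logits]
    m = max(scaled)                      # subtract max for numerical stability
    exps = [math.exp(x - m) for x in scaled]
    total = sum(exps)
    return [e / total for e in exps]

def distillation_loss(teacher_logits, student_logits, temperature=2.0):
    """KL(teacher || student) on temperature-softened output distributions.

    In the classic recipe the student minimizes this term (often combined
    with ordinary cross-entropy on the hard labels), so it learns to mimic
    the teacher's soft predictions rather than just the top answer.
    """
    p = softmax(teacher_logits, temperature)
    q = softmax(student_logits, temperature)
    return sum(pi * math.log(pi / qi) for pi, qi in zip(p, q))

# Hypothetical logits: a student that already resembles the teacher
# incurs a much smaller distillation loss than one that does not.
teacher = [3.0, 1.0, 0.2]
close_student = [2.8, 1.1, 0.3]
far_student = [0.1, 0.2, 3.0]
```

The loss is zero when student and teacher distributions match exactly, and grows as they diverge, which is what drives the smaller model toward the larger one's behavior.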


If Deepseek’s R1 is based on OpenAI’s models, it’s possible that it could be a fine-tuned or distilled version of a model like ChatGPT, but without more detailed information or an official source, it’s hard to confirm. Would you like me to look up more details?
