NVDA is in danger!

CALM, a new AI paper from Tsinghua and Tencent engineers: it directly upends existing LLMs

Robert Youssef
@rryssf_

Holy shit... this might be the next big paradigm shift in AI.

Tencent + Tsinghua just dropped a paper called Continuous Autoregressive Language Models (CALM), and it basically kills the “next-token” paradigm every LLM is built on.

Instead of predicting one token at a time, CALM predicts continuous vectors that represent multiple tokens at once.

Meaning: the model doesn’t think “word by word”… it thinks an idea at a time, several tokens per step.
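
How would that work mechanically? Roughly two pieces: a lightweight autoencoder learns to pack a chunk of K tokens into one continuous vector (and unpack it again), and the language model then autoregresses over those vectors instead of over tokens. Here's a minimal sketch, assuming PyTorch; the sizes, the GRU backbone, and the linear head are illustrative stand-ins, not the paper's architecture:

```python
# Minimal sketch of the CALM idea (illustrative, not the paper's code).
import torch
import torch.nn as nn

K = 4            # tokens packed into one continuous vector (per the thread)
VOCAB = 32000    # illustrative vocabulary size
D_TOK = 256      # illustrative token-embedding width
D_LAT = 128      # illustrative latent-vector width

class ChunkAutoencoder(nn.Module):
    """Compresses K tokens into one latent vector, and decodes them back."""
    def __init__(self):
        super().__init__()
        self.embed = nn.Embedding(VOCAB, D_TOK)
        self.enc = nn.Linear(K * D_TOK, D_LAT)
        self.dec = nn.Linear(D_LAT, K * VOCAB)

    def encode(self, tokens):                # tokens: (batch, K) int64
        return self.enc(self.embed(tokens).flatten(1))       # (batch, D_LAT)

    def decode_logits(self, z):              # z: (batch, D_LAT)
        return self.dec(z).view(-1, K, VOCAB)                # (batch, K, VOCAB)

# The language model runs autoregressively over latent vectors:
# one prediction step now covers ~K tokens.
backbone = nn.GRU(input_size=D_LAT, hidden_size=D_LAT, batch_first=True)
head = nn.Linear(D_LAT, D_LAT)   # placeholder for CALM's generative head

ae = ChunkAutoencoder()
tokens = torch.randint(0, VOCAB, (2, 3 * K))           # 12 tokens = 3 chunks
z = ae.encode(tokens.view(-1, K)).view(2, 3, D_LAT)    # (batch, 3, D_LAT)
out, _ = backbone(z)                                   # contextualize chunk vectors
z_next = head(out[:, -1])                              # predict chunk #4's vector
print(z_next.shape)                                    # torch.Size([2, 128])
```

The linear head is only a placeholder: a deterministic regression onto the next vector would collapse to an average and can't express uncertainty over continuations, which is why the paper pairs the backbone with a generative head (see the energy-score sketch further down).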

Here’s why that’s insane:

→ 4× fewer prediction steps (each vector = ~4 tokens)

→ 44% less training compute

→ No discrete vocabulary: pure continuous reasoning

→ New metric (BrierLM) replaces perplexity entirely
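
Why replace perplexity at all? Perplexity needs exact token probabilities, and a model that never materializes a softmax distribution can't report them. A Brier-style score, though, can be estimated from samples alone: for ground truth x and two independent model samples x1, x2, the statistic 1[x1=x] + 1[x2=x] − 1[x1=x2] has expectation 2·p(x) − Σ_y p(y)², a (negatively oriented) Brier score. Here's a minimal sketch of that estimator; the paper's BrierLM builds on this sample-based idea, and its exact aggregation over n-grams is omitted here:

```python
import random

def brier_estimate(model_sample, truth, trials=10_000):
    """Unbiased sample-only estimate of 2*p(truth) - sum_y p(y)^2.

    E[1(x1=truth) + 1(x2=truth)] = 2*p(truth)
    E[1(x1=x2)]                  = sum_y p(y)^2   (collision probability)
    Higher is better; a perfect deterministic predictor scores 1.0.
    """
    total = 0.0
    for _ in range(trials):
        x1, x2 = model_sample(), model_sample()
        total += (x1 == truth) + (x2 == truth) - (x1 == x2)
    return total / trials

# Toy check: a "model" that emits the right answer 80% of the time.
sample = lambda: "yes" if random.random() < 0.8 else "no"
print(round(brier_estimate(sample, "yes"), 3))   # ≈ 2*0.8 - (0.64 + 0.04) = 0.92
```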

They even built a new energy-based transformer that learns without softmax: no token sampling, no vocab ceiling.
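
How does anything learn "without softmax"? Cross-entropy is off the table, since there's no explicit distribution to score. One likelihood-free route that matches the description is the energy score, a strictly proper scoring rule computed purely from samples: penalize the samples' distance to the target, reward spread among the samples themselves. A minimal sketch, assuming PyTorch, with a small MLP as a stand-in for the paper's transformer-based head:

```python
import torch
import torch.nn as nn

D_LAT, D_NOISE = 128, 16

class GenerativeHead(nn.Module):
    """Maps (hidden state, random noise) -> a sampled latent vector."""
    def __init__(self):
        super().__init__()
        self.net = nn.Sequential(
            nn.Linear(D_LAT + D_NOISE, 256), nn.ReLU(), nn.Linear(256, D_LAT)
        )

    def forward(self, h, n_samples):
        eps = torch.randn(h.shape[0], n_samples, D_NOISE)   # fresh noise per sample
        h = h.unsqueeze(1).expand(-1, n_samples, -1)
        return self.net(torch.cat([h, eps], dim=-1))        # (batch, n_samples, D_LAT)

def energy_score_loss(samples, target):
    """E||X - y|| - 0.5 * E||X - X'||; in expectation, minimized when the
    sampler's distribution matches the data distribution."""
    to_target = (samples - target.unsqueeze(1)).norm(dim=-1).mean()
    x1, x2 = samples[:, 0::2], samples[:, 1::2]             # disjoint sample pairs
    spread = (x1 - x2).norm(dim=-1).mean()
    return to_target - 0.5 * spread

head = GenerativeHead()
h = torch.randn(8, D_LAT)        # hidden states from the backbone
target = torch.randn(8, D_LAT)   # ground-truth next latent vectors
loss = energy_score_loss(head(h, n_samples=4), target)
loss.backward()                  # trains the sampler with no softmax anywhere
```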

It’s like going from speaking Morse code… to streaming full thoughts.

If this scales, every LLM today is obsolete.

All replies:

From paper to real-world deployment could be one step away, or could be positive infinity. -食神OG- 11/04/2025 12:50:00

Shh! Hurry up and buy some puts. -FBE63- 11/04/2025 12:50:56

This is why OpenAI is in such a rush to monetize right now. -TalkToMi- 11/04/2025 12:53:48

Well, of course: algorithmic progress is what matters most. The people who assumed piling on compute would just work are probably not much good at algorithms. -过来人2- 11/04/2025 12:59:47

FOMO -大好时光- 11/04/2025 13:04:37

Maybe this affects OpenAI, but for now it doesn't affect NVDA, because as long as the AI scaling laws haven't broken down, compute will always be in short supply. -ocliving2005_4ever- 11/04/2025 13:09:34

Your ignorance is beyond compare. -cnrhm2017- 11/04/2025 13:13:08
