https://www.deeplearning.ai/the-batch/ai-transformed/
“Noam Shazeer helped spark the latest NLP revolution. He developed the multi-headed self-attention mechanism described in “Attention Is All You Need,” the 2017 paper that introduced the transformer network. That architecture became the foundation of a new generation of models that have a much firmer grip on the vagaries of human language. Shazeer’s grandparents fled the Nazi Holocaust to the former Soviet Union, and he was born in Philadelphia in 1976 to a multi-lingual math teacher turned engineer and a full-time mom. He studied math and computer science at Duke University before joining Google in 2000. Below, he discusses the transformer and what it means for the future of deep learning.”
Most of them are from prestigious universities. How come there isn't a single American? What is that supposed to suggest? I looked it up: Noam Shazeer was born in Philadelphia and studied at Duke, so he counts as an American.
This post was edited by [ sportfan ] on 2024-04-04 21:25:55. If there is a problem, please report it to a moderator or forum administrator for deletion.