Retentive Network: A Successor to Transformer for Large Language Models (Paper Explained)
#ai #retnet #transformers
Retention is an alternative to Attention in Transformers that can be written both in a parallel and in a recurrent form. This means the architecture achieves training parallelism while maintaining low-cost inference. The experiments in the paper look very promising.
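To make the parallel/recurrent duality concrete, here is a minimal numpy sketch (my own illustration, not the authors' code): single-head retention with an illustrative decay rate gamma, omitting the xPos-style rotation and normalization the paper adds. The parallel form computes (QK^T ⊙ D)V with a causal decay mask D; the recurrent form keeps a d×d state that is decayed and updated once per token, and both give the same output.

```python
import numpy as np

# Toy single-head retention (illustrative shapes; the paper's xPos
# rotation and group norm are omitted for clarity).
T, d = 8, 16                    # sequence length, head dimension
gamma = 0.9                     # per-head decay rate
rng = np.random.default_rng(0)
Q = rng.normal(size=(T, d))
K = rng.normal(size=(T, d))
V = rng.normal(size=(T, d))

# Parallel form: Retention(X) = (Q K^T ⊙ D) V, with decay mask
# D[n, m] = gamma^(n-m) for n >= m, else 0 (causal).
n = np.arange(T)
D = np.where(n[:, None] >= n[None, :],
             gamma ** (n[:, None] - n[None, :]), 0.0)
out_parallel = (Q @ K.T * D) @ V

# Recurrent form: S_n = gamma * S_{n-1} + K_n^T V_n,  o_n = Q_n S_n.
# One O(d^2) state update per token, i.e. O(1) memory in sequence length.
S = np.zeros((d, d))
out_recurrent = np.zeros((T, d))
for t in range(T):
    S = gamma * S + np.outer(K[t], V[t])
    out_recurrent[t] = Q[t] @ S

assert np.allclose(out_parallel, out_recurrent)  # the two forms agree
```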
OUTLINE:
0:00 - Intro
2:40 - The impossible triangle
6:55 - Parallel vs sequential
15:35 - Retention mechanism
21:00 - Chunkwise and multi-scale retention
24:10 - Comparison to other architectures
26:30 - Experimental evaluation
Paper: https://arxiv.org/abs/2307.08621
Abstract:
In this work, we propose Retentive Network (RetNet) as a foundation architecture for large language models, simultaneously achieving training parallelism, low-cost inference, and good performance. We theoretically derive the connection between recurrence and attention. Then we propose the retention mechanism for sequence modeling, which supports three computation paradigms, i.e., parallel, recurrent, and chunkwise recurrent. Specifically, the parallel representation allows for training parallelism. The recurrent representation enables low-cost O(1) inference, which improves decoding throughput, latency, and GPU memory without sacrificing performance. The chunkwise recurrent representation facilitates efficient long-sequence modeling with linear complexity, where each chunk is encoded in parallel while recurrently summarizing the chunks. Experimental results on language modeling show that RetNet achieves favorable scaling results, parallel training, low-cost deployment, and efficient inference. The intriguing properties make RetNet a strong successor to Transformer for large language models. Code will be available at this https URL.
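The chunkwise recurrent paradigm from the abstract can be sketched the same way (again my own illustration under the same simplifications, with hypothetical names and shapes): each chunk of size B is encoded with the parallel form, while a d×d state summarizes all earlier chunks and is folded in with the appropriate decay, giving linear complexity in sequence length.

```python
import numpy as np

# Chunkwise recurrent retention (illustrative; single head, no xPos
# rotation, no normalization).
T, d, B = 8, 16, 4              # sequence length, head dim, chunk size
gamma = 0.9
rng = np.random.default_rng(0)
Q = rng.normal(size=(T, d))
K = rng.normal(size=(T, d))
V = rng.normal(size=(T, d))

j = np.arange(B)                # position index inside a chunk
D = np.where(j[:, None] >= j[None, :],
             gamma ** (j[:, None] - j[None, :]), 0.0)  # intra-chunk mask
S = np.zeros((d, d))            # state summarizing all previous chunks
out = np.zeros((T, d))
for c in range(0, T, B):
    Qc, Kc, Vc = Q[c:c+B], K[c:c+B], V[c:c+B]
    inner = (Qc @ Kc.T * D) @ Vc                     # parallel within chunk
    cross = (gamma ** (j + 1))[:, None] * (Qc @ S)   # decayed past state
    out[c:c+B] = inner + cross
    # Fold this chunk into the state:
    # S <- gamma^B * S + sum_j gamma^(B-1-j) K_j^T V_j
    S = gamma ** B * S + ((gamma ** (B - 1 - j))[:, None] * Kc).T @ Vc

# Matches the fully parallel computation over the whole sequence.
n = np.arange(T)
D_full = np.where(n[:, None] >= n[None, :],
                  gamma ** (n[:, None] - n[None, :]), 0.0)
assert np.allclose(out, (Q @ K.T * D_full) @ V)
```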
Authors: Yutao Sun, Li Dong, Shaohan Huang, Shuming Ma, Yuqing Xia, Jilong Xue, Jianyong Wang, Furu Wei
Links:
Homepage: https://ykilcher.com
Merch: https://ykilcher.com/merch
YouTube: https://www.youtube.com/c/yannickilcher
Twitter: https://twitter.com/ykilcher
Discord: https://ykilcher.com/discord
LinkedIn: https://www.linkedin.com/in/ykilcher
If you want to support me, the best thing to do is to share out the content :)
If you want to support me financially (completely optional and voluntary, but a lot of people have asked for this):
SubscribeStar: https://www.subscribestar.com/yannickilcher
Patreon: https://www.patreon.com/yannickilcher
Bitcoin (BTC): bc1q49lsw3q325tr58ygf8sudx2dqfguclvngvy2cq
Ethereum (ETH): 0x7ad3513E3B8f66799f507Aa7874b1B0eBC7F85e2
Litecoin (LTC): LQW2TRyKYetVC8WjFkhpPhtpbDM4Vw7r9m
Monero (XMR): 4ACL8AGrEo5hAir8A9CeVrW8pEauWvnp1WnSDZxW7tziCDLhZAGsgzhRQABDnFy8yuM9fWJDviJPHKRjV4FWt19CJZN9D4n