Efficient Streaming Language Models with Attention Sinks (Paper Explained)
#llm #ai #chatgpt
How does one run inference with a generative autoregressive language model that has been trained with a fixed context size? Streaming LLMs keep the efficiency of windowed attention while avoiding its drop in performance by using attention sinks - an interesting phenomenon where the token at position 0 acts as an absorber of "extra" attention.
OUTLINE:
0:00 - Introduction
1:20 - What is the problem?
10:30 - The hypothesis: Attention Sinks
15:10 - Experimental evidence
18:45 - Streaming LLMs
20:45 - Semantics or position?
22:30 - Can attention sinks be learned?
27:45 - More experiments
30:10 - Comparison to Big Bird
Paper: https://arxiv.org/abs/2309.17453
Abstract:
Deploying Large Language Models (LLMs) in streaming applications such as multi-round dialogue, where long interactions are expected, is urgently needed but poses two major challenges. Firstly, during the decoding stage, caching previous tokens' Key and Value states (KV) consumes extensive memory. Secondly, popular LLMs cannot generalize to longer texts than the training sequence length. Window attention, where only the most recent KVs are cached, is a natural approach -- but we show that it fails when the text length surpasses the cache size. We observe an interesting phenomenon, namely attention sink, that keeping the KV of initial tokens will largely recover the performance of window attention. In this paper, we first demonstrate that the emergence of attention sink is due to the strong attention scores towards initial tokens as a "sink" even if they are not semantically important. Based on the above analysis, we introduce StreamingLLM, an efficient framework that enables LLMs trained with a finite length attention window to generalize to infinite sequence lengths without any fine-tuning. We show that StreamingLLM can enable Llama-2, MPT, Falcon, and Pythia to perform stable and efficient language modeling with up to 4 million tokens and more. In addition, we discover that adding a placeholder token as a dedicated attention sink during pre-training can further improve streaming deployment. In streaming settings, StreamingLLM outperforms the sliding window recomputation baseline by up to 22.2x speedup. Code and datasets are provided at this https URL.
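For intuition, here is a minimal Python sketch of the cache-eviction policy the abstract describes: keep the KV entries of the first few "attention sink" tokens plus a rolling window of the most recent tokens, evicting everything in between. The class name, the parameter values, and the list-based cache are illustrative assumptions for this sketch, not the authors' implementation.

# Sketch of the StreamingLLM cache policy: retain a few initial "sink" tokens
# plus a sliding window of recent tokens. Names and sizes are assumptions.
from collections import deque

class SinkKVCache:
    def __init__(self, n_sink=4, window=1020):
        self.n_sink = n_sink                # initial tokens kept as attention sinks
        self.sink = []                      # KV entries for positions 0..n_sink-1
        self.recent = deque(maxlen=window)  # rolling window of the most recent KV entries

    def append(self, kv):
        # Add one token's (key, value) pair; middle tokens get evicted automatically.
        if len(self.sink) < self.n_sink:
            self.sink.append(kv)
        else:
            self.recent.append(kv)          # deque drops the oldest non-sink entry

    def context(self):
        # KV entries visible to attention at the next decoding step.
        return self.sink + list(self.recent)

# Usage: cache size never exceeds n_sink + window, no matter how long the stream gets.
cache = SinkKVCache(n_sink=4, window=8)
for t in range(100):
    cache.append((f"k{t}", f"v{t}"))
print(len(cache.context()))  # 12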
Authors: Guangxuan Xiao, Yuandong Tian, Beidi Chen, Song Han, Mike Lewis
Links:
Homepage: https://ykilcher.com
Merch: https://ykilcher.com/merch
YouTube: https://www.youtube.com/c/yannickilcher
Twitter: https://twitter.com/ykilcher
Discord: https://ykilcher.com/discord
LinkedIn: https://www.linkedin.com/in/ykilcher
If you want to support me, the best thing to do is to share out the content :)
If you want to support me financially (completely optional and voluntary, but a lot of people have asked for this):
SubscribeStar: https://www.subscribestar.com/yannickilcher
Patreon: https://www.patreon.com/yannickilcher
Bitcoin (BTC): bc1q49lsw3q325tr58ygf8sudx2dqfguclvngvy2cq
Ethereum (ETH): 0x7ad3513E3B8f66799f507Aa7874b1B0eBC7F85e2
Litecoin (LTC): LQW2TRyKYetVC8WjFkhpPhtpbDM4Vw7r9m
Monero (XMR): 4ACL8AGrEo5hAir8A9CeVrW8pEauWvnp1WnSDZxW7tziCDLhZAGsgzhRQABDnFy8yuM9fWJDviJPHKRjV4FWt19CJZN9D4n