∞-former: Infinite Memory Transformer (aka Infty-Former / Infinity-Former, Research Paper Explained)
#inftyformer #infinityformer #transformer
Vanilla Transformers are excellent sequence models, but suffer from harsh constraints on the length of the sequences they can process. Several attempts have been made to extend the Transformer's sequence length, but few have successfully gone beyond a constant-factor improvement. This paper presents a method, based on continuous attention mechanisms, to attend to an unbounded past sequence by representing the past as a continuous signal rather than a discrete sequence. This enables the Infty-Former to effectively enrich the current context with global information, which improves performance on tasks with long-range dependencies. Further, the paper introduces the concept of sticky memories, which highlight past events of particular importance and elevate their representation in the long-term memory.
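To make the idea above concrete, here is a minimal NumPy sketch of the continuous-attention memory: past vectors are fit with a ridge regression onto a Gaussian radial-basis signal, a query attends via a Gaussian density over the signal's domain, and old memory is contracted to keep the cost fixed. Note this is a rough illustration of the mechanism, not the paper's implementation; all function names, the basis width, and the grid-based integration are my assumptions.

```python
import numpy as np

def rbf_basis(t, centers, width=0.05):
    # psi(t): N Gaussian radial basis functions evaluated at positions t in [0, 1]
    return np.exp(-((t[:, None] - centers[None, :]) ** 2) / (2 * width ** 2))  # (len(t), N)

def fit_signal(X, n_basis=32, ridge=1e-3):
    # Fit coefficients B so that x_bar(t) = B^T psi(t) approximates the L memory
    # vectors X (L, d) placed at equally spaced positions t in [0, 1].
    L = X.shape[0]
    t = np.linspace(0, 1, L)
    centers = np.linspace(0, 1, n_basis)
    F = rbf_basis(t, centers)  # (L, N) design matrix
    # Ridge regression: B = (F^T F + ridge * I)^-1 F^T X, shape (N, d)
    B = np.linalg.solve(F.T @ F + ridge * np.eye(n_basis), F.T @ X)
    return B, centers

def continuous_attention(B, centers, mu, sigma, n_quad=200):
    # Context vector: expectation of the signal under a Gaussian N(t; mu, sigma^2),
    # approximated on a grid (the paper integrates in closed form; a grid is
    # simpler to illustrate).
    t = np.linspace(0, 1, n_quad)
    density = np.exp(-0.5 * ((t - mu) / sigma) ** 2)
    density /= density.sum()                  # normalise on the grid
    p = rbf_basis(t, centers).T @ density     # (N,) expected basis values
    return B.T @ p                            # (d,) context vector

def extend_memory(B, centers, new_X, n_basis=32, n_resample=64):
    # Unbounded memory via concatenation & contraction: resample the old signal,
    # squeeze it into the first half of [0, 1], append fresh tokens, re-fit.
    # The number of coefficients (and hence the memory cost) stays fixed.
    t_old = np.linspace(0, 1, n_resample)
    old_samples = rbf_basis(t_old, centers) @ B  # (n_resample, d)
    return fit_signal(np.vstack([old_samples, new_X]), n_basis)
```

For example, fitting 100 past vectors of dimension 8 and querying near the end of the memory (`mu=0.9`) returns an 8-dimensional context vector, regardless of how many tokens the memory has absorbed.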
OUTLINE:
0:00 - Intro & Overview
1:10 - Sponsor Spot: Weights & Biases
3:35 - Problem Statement
8:00 - Continuous Attention Mechanism
16:25 - Unbounded Memory via concatenation & contraction
18:05 - Does this make sense?
20:25 - How the Long-Term Memory is used in an attention layer
27:40 - Entire Architecture Recap
29:30 - Sticky Memories by Importance Sampling
31:25 - Commentary: Pros and cons of using heuristics
32:30 - Experiments & Results
Paper: https://arxiv.org/abs/2109.00301
Sponsor: Weights & Biases
https://wandb.me/start
Abstract:
Transformers struggle when attending to long contexts, since the amount of computation grows with the context length, and therefore they cannot model long-term memories effectively. Several variations have been proposed to alleviate this problem, but they all have a finite memory capacity, being forced to drop old information. In this paper, we propose the ∞-former, which extends the vanilla transformer with an unbounded long-term memory. By making use of a continuous-space attention mechanism to attend over the long-term memory, the ∞-former's attention complexity becomes independent of the context length. Thus, it is able to model arbitrarily long contexts and maintain "sticky memories" while keeping a fixed computation budget. Experiments on a synthetic sorting task demonstrate the ability of the ∞-former to retain information from long sequences. We also perform experiments on language modeling, by training a model from scratch and by fine-tuning a pre-trained language model, which show benefits of unbounded long-term memories.
Authors: Pedro Henrique Martins, Zita Marinho, André F. T. Martins
Links:
TabNine Code Completion (Referral): http://bit.ly/tabnine-yannick
YouTube: https://www.youtube.com/c/yannickilcher
Twitter: https://twitter.com/ykilcher
Discord: https://discord.gg/4H8xxDF
BitChute: https://www.bitchute.com/channel/yann...
Minds: https://www.minds.com/ykilcher
Parler: https://parler.com/profile/YannicKilcher
LinkedIn: https://www.linkedin.com/in/yannic-ki...
BiliBili: https://space.bilibili.com/1824646584
If you want to support me, the best thing to do is to share out the content :)
If you want to support me financially (completely optional and voluntary, but a lot of people have asked for this):
SubscribeStar: https://www.subscribestar.com/yannick...
Patreon: https://www.patreon.com/yannickilcher
Bitcoin (BTC): bc1q49lsw3q325tr58ygf8sudx2dqfguclvngvy2cq
Ethereum (ETH): 0x7ad3513E3B8f66799f507Aa7874b1B0eBC7F85e2
Litecoin (LTC): LQW2TRyKYetVC8WjFkhpPhtpbDM4Vw7r9m
Monero (XMR): 4ACL8AGrEo5hAir8A9CeVrW8pEauWvnp1WnSDZxW7tziCDLhZAGsgzhRQABDnFy8yuM9fWJDviJPHKRjV4FWt19CJZN9D4n