Active Dendrites avoid catastrophic forgetting - Interview with the Authors
#multitasklearning #biology #neuralnetworks
This is an interview with the paper's authors: Abhiram Iyer, Karan Grewal, and Akash Velu!
Paper Review Video: https://youtu.be/O_dJ31T01i8
Check out Zak's course on Graph Neural Networks (discount with this link): https://www.graphneuralnets.com/p/int...
Catastrophic forgetting is a major problem in multi-task and continual learning. Gradients of different objectives tend to conflict, and new tasks tend to overwrite past knowledge. In biological neural networks, each neuron carries a complex tree of dendrites that mitigates such forgetting by recognizing the context of an input signal. This paper introduces Active Dendrites, which carries the principle of context-sensitive gating by dendrites over into the deep learning world. Various experiments show the benefit in combating catastrophic forgetting while preserving sparsity and limiting parameter counts.
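The core idea can be sketched in a few lines: each unit's feedforward activation is modulated by its best-matching dendritic segment's response to a context vector, and a k-winner-take-all step keeps the layer sparse. This is a simplified illustration, not the authors' exact implementation; all names and shapes here are assumptions for the sketch.

```python
import numpy as np

def active_dendrite_layer(x, context, W, b, dendrites, k):
    """Simplified Active-Dendrites layer (illustrative sketch).

    x:         (d_in,)  input vector
    context:   (d_ctx,) task-context vector
    W, b:      (d_out, d_in), (d_out,) feedforward weights and bias
    dendrites: (d_out, n_segments, d_ctx) per-unit dendritic segments
    k:         number of winners kept active (sparsity)
    """
    y = W @ x + b                              # standard feedforward activation
    seg = dendrites @ context                  # (d_out, n_segments) segment responses
    mod = 1.0 / (1.0 + np.exp(-seg.max(axis=1)))  # sigmoid of the strongest segment
    y = y * mod                                # context gates each unit's output
    out = np.zeros_like(y)                     # k-winner-take-all: zero all but top-k
    top = np.argsort(y)[-k:]
    out[top] = y[top]
    return out
```

Different context vectors light up different dendritic segments, so different sparse subnetworks of units win the k-WTA competition per task, which is what limits interference between tasks.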
OUTLINE:
0:00 - Intro
0:55 - Sponsor: GNN Course
2:30 - How did the idea come to be?
7:05 - What roles do the different parts of the method play?
8:50 - What was missing in the paper review?
10:35 - Are biological concepts viable if we still have backprop?
11:50 - How many dendrites are necessary?
14:10 - Why is there a plateau in the sparsity plot?
20:50 - How does task difficulty play into the algorithm?
24:10 - Why are there different setups in the experiments?
30:00 - Is there a place for unsupervised pre-training?
32:50 - How can we apply the online prototyping to more difficult tasks?
37:00 - What did not work out during the project?
41:30 - How do you debug a project like this?
47:10 - How is this related to other architectures?
51:10 - What other things from neuroscience are to be included?
55:50 - Don't miss the awesome ending :)
Paper: https://arxiv.org/abs/2201.00042
Blog: https://numenta.com/blog/2021/11/08/c...
Link to the GNN course (with discount): https://www.graphneuralnets.com/p/int...
Abstract:
A key challenge for AI is to build embodied systems that operate in dynamically changing environments. Such systems must adapt to changing task contexts and learn continuously. Although standard deep learning systems achieve state of the art results on static benchmarks, they often struggle in dynamic scenarios. In these settings, error signals from multiple contexts can interfere with one another, ultimately leading to a phenomenon known as catastrophic forgetting. In this article we investigate biologically inspired architectures as solutions to these problems. Specifically, we show that the biophysical properties of dendrites and local inhibitory systems enable networks to dynamically restrict and route information in a context-specific manner. Our key contributions are as follows. First, we propose a novel artificial neural network architecture that incorporates active dendrites and sparse representations into the standard deep learning framework. Next, we study the performance of this architecture on two separate benchmarks requiring task-based adaptation: Meta-World, a multi-task reinforcement learning environment where a robotic agent must learn to solve a variety of manipulation tasks simultaneously; and a continual learning benchmark in which the model's prediction task changes throughout training. Analysis on both benchmarks demonstrates the emergence of overlapping but distinct and sparse subnetworks, allowing the system to fluidly learn multiple tasks with minimal forgetting. Our neural implementation marks the first time a single architecture has achieved competitive results on both multi-task and continual learning settings. Our research sheds light on how biological properties of neurons can inform deep learning systems to address dynamic scenarios that are typically impossible for traditional ANNs to solve.
Authors: Abhiram Iyer, Karan Grewal, Akash Velu, Lucas Oliveira Souza, Jeremy Forest, Subutai Ahmad
Links:
TabNine Code Completion (Referral): http://bit.ly/tabnine-yannick
YouTube: https://www.youtube.com/c/yannickilcher
Twitter: https://twitter.com/ykilcher
Discord: https://discord.gg/4H8xxDF
BitChute: https://www.bitchute.com/channel/yann...
LinkedIn: https://www.linkedin.com/in/ykilcher
If you want to support me, the best thing to do is to share out the content :)
If you want to support me financially (completely optional and voluntary, but a lot of people have asked for this):
SubscribeStar: https://www.subscribestar.com/yannick...
Patreon: https://www.patreon.com/yannickilcher
Bitcoin (BTC): bc1q49lsw3q325tr58ygf8sudx2dqfguclvngvy2cq
Ethereum (ETH): 0x7ad3513E3B8f66799f507Aa7874b1B0eBC7F85e2
Litecoin (LTC): LQW2TRyKYetVC8WjFkhpPhtpbDM4Vw7r9m
Monero (XMR): 4ACL8AGrEo5hAir8A9CeVrW8pEauWvnp1WnSDZxW7tziCDLhZAGsgzhRQABDnFy8yuM9fWJDviJPHKRjV4FWt19CJZN9D4n