Data Fest Online 2020
Catalyst Workshop track
In the past few years, state-of-the-art architectures have become more and more complex, and the number of parameters has grown exponentially. But what if these networks are over-parametrized, and more than half of the parameters don't influence the result? The process of removing connections between neurons is called pruning.
Moreover, some examples come from biology. For instance, the human brain is also over-parametrized in the first stages of growth, and we learn by pruning unnecessary connections.
In my presentation, I will show how to prune your network with the Catalyst framework and briefly explain what the "Lottery Ticket Hypothesis" means.
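As a minimal sketch of the underlying idea (not the exact code from the talk): Catalyst builds on PyTorch, where magnitude pruning can be applied with `torch.nn.utils.prune`. The toy layer and the 50% sparsity level below are my own illustration.

```python
import torch
import torch.nn as nn
import torch.nn.utils.prune as prune

# A toy layer standing in for a real network (illustrative only).
layer = nn.Linear(16, 8)

# Zero out the 50% of weights with the smallest L1 magnitude.
# PyTorch keeps the original weights and applies a binary mask.
prune.l1_unstructured(layer, name="weight", amount=0.5)

# Measure how many connections were removed.
sparsity = (layer.weight == 0).float().mean().item()
print(f"sparsity: {sparsity:.2f}")  # → sparsity: 0.50
```

After pruning, `prune.remove(layer, "weight")` would make the sparsity permanent by folding the mask into the weight tensor.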
Register and get access to the tracks:
Join the community:
Nikita Balagansky - Pruning with Catalyst (DataFest 2020)