Loop Academy | آکادمیِ لوپ — @loopacademy, post 178
🎯 Deep Learning and Neural Networks Symposium and Workshop

👨🏻‍🎓 Speaker:

Dr. Mohammad Rostami,
University of Pennsylvania

Title: Learning to Transfer Knowledge Through Embedding Spaces

Abstract: The unprecedented processing demands posed by the explosion of big data challenge researchers to design efficient, adaptive machine learning algorithms that do not require persistent retraining and avoid learning redundant information. Inspired by the learning techniques of intelligent biological agents, identifying transferable knowledge across learning problems has become a significant research focus for improving machine learning algorithms. In this talk, we address the challenges of knowledge transfer through embedding spaces that capture and store hierarchical knowledge.

In the first part of the talk, we focus on the problem of cross-domain knowledge transfer. We first address zero-shot image classification, where the goal is to identify images from unseen classes using semantic descriptions of those classes. We train two coupled dictionaries that align the visual and semantic domains via an intermediate embedding space. We then extend this idea by training deep networks that match the data distributions of two visual domains in a shared cross-domain embedding space. Our approach addresses both the semi-supervised and the unsupervised domain adaptation settings.
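The abstract gives no code, but the core idea of the first part — mapping two domains into one shared space so that images and class descriptions become directly comparable — can be illustrated with a toy sketch. Everything below (the data, the shared embeddings `Z`, and the use of plain least-squares projections in place of the speaker's coupled-dictionary training) is an illustrative assumption, not the actual method from the talk:

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy paired data: 100 items described in a visual domain (dim 8)
# and a semantic-attribute domain (dim 5).
X_vis = rng.normal(size=(100, 8))
X_sem = rng.normal(size=(100, 5))

# Stand-in for the learned shared embedding space (dim 4).
Z = rng.normal(size=(100, 4))

# Fit one linear map per domain into the shared space via least squares,
# a crude analogue of training coupled projections that align both domains.
W_vis, *_ = np.linalg.lstsq(X_vis, Z, rcond=None)
W_sem, *_ = np.linalg.lstsq(X_sem, Z, rcond=None)

# Once aligned, a test image and the semantic class descriptions live in
# the same space, so zero-shot prediction reduces to nearest neighbour.
img_emb = X_vis @ W_vis          # embedded images
cls_emb = X_sem @ W_sem          # embedded class descriptions
pred = int(np.argmin(np.linalg.norm(img_emb[0] - cls_emb, axis=1)))
```

The point of the sketch is only the mechanism: after alignment, classes never seen with visual training data can still be matched to images through their semantic embeddings.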

In the second part of the talk, we investigate the problem of cross-task knowledge transfer. Here, the goal is to identify relations and similarities among multiple machine learning tasks to improve performance across them. We first address zero-shot learning in a lifelong machine learning setting, where the goal is to learn tasks with no data using high-level task descriptions. Our idea is to relate high-level task descriptors to the optimal task parameters through an embedding space. We then develop a method to overcome catastrophic forgetting in the continual-learning setting of deep neural networks by enforcing that the tasks share the same distribution in the embedding space. We further demonstrate that our model can address the challenges of domain adaptation in the continual-learning setting.
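The second part's key device — discouraging catastrophic forgetting by keeping tasks' embedding distributions aligned — can likewise be sketched. The moment-matching penalty below is a simplified, hypothetical regularizer (function name, statistics used, and toy data are all assumptions; the talk's actual distribution-matching objective may differ):

```python
import numpy as np

def distribution_penalty(emb_new, mean_old, var_old, weight=1.0):
    """Moment-matching regularizer: penalizes the gap between the current
    task's embedding statistics and those recorded for earlier tasks."""
    mean_new = emb_new.mean(axis=0)
    var_new = emb_new.var(axis=0)
    return weight * (np.sum((mean_new - mean_old) ** 2)
                     + np.sum((var_new - var_old) ** 2))

rng = np.random.default_rng(1)
old_emb = rng.normal(loc=0.0, size=(200, 4))  # embeddings from task 1
new_emb = rng.normal(loc=2.0, size=(200, 4))  # drifted embeddings, task 2

penalty = distribution_penalty(new_emb,
                               old_emb.mean(axis=0),
                               old_emb.var(axis=0))
```

Added to the new task's loss, a large penalty of this kind discourages the network from shifting the shared embedding distribution away from where earlier tasks' decision rules still work, which is the intuition behind mitigating forgetting through a shared embedding space.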

We demonstrate that despite major differences, problems within the above learning scenarios can be tackled through learning an intermediate embedding space.

⭕️ Check our website for more information

⚙️ Organizers: Institute for Cognitive and Brain Sciences, Shahid Beheshti University and Loop Academy


📢 @LoopAcademy
📢 @CMPLab

🌐 www.loopacademy.ir
🌐 www.cmplab.ir



tgoop.com/loopacademy/178