DL in NLP (@dlinnlp), post 1574
Scaling Down to Scale Up: A Guide to Parameter-Efficient Fine-Tuning
arxiv.org/abs/2303.15647

Our new paper! We survey parameter-efficient fine-tuning methods: from simple and popular ones like adapters or LoRA to trickier ones like Compacter or KronA.

Reposting my short Twitter summary of the paper here.

PEFT methods can target several goals: storage efficiency, multitask inference efficiency, and memory efficiency, among others. We are interested in fine-tuning large models, so memory efficiency is a must.
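For a rough sense of scale, here is a back-of-the-envelope memory estimate (my own sketch, not a number from the paper; it assumes fp16 weights with fp32 gradients and Adam states, ignores activations, and the exact figures depend on the training setup):

```python
# Rough fine-tuning memory estimate (a sketch; assumes fp16 weights,
# fp32 gradients and Adam states, and ignores activation memory).
def finetune_memory_gb(n_params: float, n_trainable: float) -> float:
    weights = 2 * n_params              # fp16 copy of all weights
    grads = 4 * n_trainable             # fp32 gradients, trainable params only
    adam_states = 2 * 4 * n_trainable   # fp32 first and second Adam moments
    return (weights + grads + adam_states) / 1e9

print(finetune_memory_gb(7e9, 7e9))  # full fine-tuning of a 7B model: ~98 GB
print(finetune_memory_gb(7e9, 2e7))  # PEFT with ~20M trainable params: ~14 GB
```

Optimizer states and gradients scale with the number of trainable parameters, which is exactly what PEFT shrinks.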

I feel like everyone knows about Adapters, BitFit, and LoRA, but there are even better methods out there! In the last two years, low-rank methods have taken off.
Compacter and KronA use a more rank-efficient way to build large matrices. The Kronecker product is the new matmul for PEFT.
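Why the Kronecker product helps, in one line: a LoRA-style product B @ A with inner dimension r has rank at most r, while kron(A, B) has rank rank(A) * rank(B) for roughly the same parameter count. A minimal KronA-style sketch in PyTorch (shapes and names are illustrative, not taken from the paper):

```python
import torch
import torch.nn as nn

class KroneckerDelta(nn.Module):
    """Kronecker-factored weight update: delta_W = A ⊗ B.

    With A of shape (16, 16) and B of shape (48, 48), delta_W is (768, 768)
    and its rank is rank(A) * rank(B), whereas a LoRA update with the same
    parameter count is capped at rank r.
    """
    def __init__(self, a_shape=(16, 16), b_shape=(48, 48)):
        super().__init__()
        self.A = nn.Parameter(torch.randn(*a_shape) * 0.01)
        self.B = nn.Parameter(torch.zeros(*b_shape))  # zero init: no change at start

    def forward(self, W_frozen: torch.Tensor) -> torch.Tensor:
        # Effective weight = frozen pretrained weight + Kronecker-factored update.
        return W_frozen + torch.kron(self.A, self.B)
```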

We dive into the details of 20 different PEFT methods in the paper. Still, because we understand not everyone has the time to read the full 15 pages, we give a one-sentence description of each method and provide pseudocode for it!
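In that spirit, here is the format applied to LoRA (my own minimal PyTorch sketch, not the pseudocode from the paper). One sentence: LoRA freezes the pretrained weight W and learns a rank-r update, so the effective weight becomes W + (alpha/r) * B A with only A and B trainable.

```python
import torch
import torch.nn as nn

class LoRALinear(nn.Module):
    """Frozen linear layer plus a trainable rank-r update (LoRA sketch)."""
    def __init__(self, base: nn.Linear, r: int = 8, alpha: float = 16.0):
        super().__init__()
        self.base = base
        self.base.weight.requires_grad_(False)  # freeze the pretrained weight
        self.A = nn.Parameter(torch.randn(r, base.in_features) * 0.01)
        self.B = nn.Parameter(torch.zeros(base.out_features, r))  # zero init
        self.scale = alpha / r

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # y = x W^T + scale * x A^T B^T; only A and B receive gradients.
        return self.base(x) + self.scale * (x @ self.A.T @ self.B.T)
```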
