LoRA+: Efficient Low Rank Adaptation of Large Models
181 points | 14 days ago | 7 comments | arxiv.org
batterseapower
14 days ago
[-]
The other recent improvement suggested for LoRA is DoRA: https://magazine.sebastianraschka.com/p/lora-and-dora-from-s.... It really does seem to strongly outperform LoRA - see also https://www.answer.ai/posts/2024-04-26-fsdp-qdora-llama3.htm...
reply
josalhor
14 days ago
[-]
I just skimmed LoRA+ and DoRA, and I see no reason why these improvements could not go hand in hand. LoRA+ seems to be about efficient training, while DoRA seems to be about improving the ability to actually learn, making it significantly more robust. Although I still have questions about how the LoRA+ improvements would apply to the magnitude vector.
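For what it's worth, LoRA+ only touches the optimiser (a much larger learning rate for B than for A), while DoRA only touches the parameterisation (an extra magnitude vector), so on paper they compose. A minimal sketch of what that could look like, with illustrative tensor names and shapes (not the actual PEFT/DoRA API):

    import torch

    # Hedged sketch: LoRA+'s learning-rate split applied alongside DoRA's magnitude vector.
    d, r = 4096, 16
    lora_A = torch.nn.Parameter(torch.randn(r, d) * 0.01)   # trainable low-rank factor A
    lora_B = torch.nn.Parameter(torch.zeros(d, r))          # trainable low-rank factor B
    magnitude = torch.nn.Parameter(torch.ones(d))           # DoRA's per-column scale

    base_lr = 2e-4
    ratio = 16  # LoRA+ sets lr(B) = ratio * lr(A); the ratio itself is a hyperparameter

    optimizer = torch.optim.AdamW([
        {"params": [lora_A], "lr": base_lr},
        {"params": [lora_B], "lr": base_lr * ratio},
        # The open question above: which rate the magnitude vector should get.
        # Giving it the base rate here is just a guess.
        {"params": [magnitude], "lr": base_lr},
    ])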
reply
WithinReason
14 days ago
[-]
The two methods seem to be independent; I wonder if you can combine them for even better performance.

Interestingly, both seem to indirectly modify the optimisation process, in my opinion effectively trying to fix a bad optimiser. Seems like we still have a long way to go after Adam...

reply
neodypsis
14 days ago
[-]
> Seems like we still have a long way to go after Adam...

A preprint on arXiv suggests that Adam works better than SGD for training LLMs because of class imbalance [0]. It appears that scaling the gradient step helps with training; see, for example, another approach suggested in [1].

[0] https://arxiv.org/pdf/2402.19449
[1] https://arxiv.org/pdf/2402.02347
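As a toy illustration of what "scaling the gradient step" buys you (this is just plain Adam vs. SGD, not the specific method from either paper): SGD applies the raw gradient, while Adam divides it by a running estimate of its magnitude, so parameters that only ever see small or infrequent gradients still take reasonably sized steps.

    import torch

    # SGD: the step is proportional to the raw gradient.
    def sgd_step(p, grad, lr=1e-2):
        return p - lr * grad

    # Adam-like update (bias correction omitted for brevity): the step is the
    # gradient rescaled per-parameter by an estimate of its recent magnitude.
    def adam_like_step(p, grad, m, v, lr=1e-3, b1=0.9, b2=0.999, eps=1e-8):
        m = b1 * m + (1 - b1) * grad        # first moment (mean of gradients)
        v = b2 * v + (1 - b2) * grad ** 2   # second moment (mean of squared gradients)
        return p - lr * m / (v.sqrt() + eps), m, v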

reply
Ger_Onimo
14 days ago
[-]
I've just started playing with DoRAs for fine-tuning TTS models towards particular styles of speech, and they're working extremely well!
reply
allpaca
14 days ago
[-]
Can you tell us more about it? Have you reported the results of your experiments in a post?
reply
mysfi
14 days ago
[-]
Count me interested here as well, especially if it is about the style of speech. I had a fun project in mind that involved style of speech.
reply
cooljoseph
13 days ago
[-]
Those blog posts are pretty bad. Just read the original paper, https://arxiv.org/pdf/2402.09353. The key section is 4.1.
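For anyone who doesn't want to open the PDF: as far as I remember, section 4.1 reparameterises the fine-tuned weight as a per-column magnitude times a normalised direction, where only the direction gets the low-rank update. A rough sketch (variable names mine):

    import torch

    # Rough sketch of the DoRA reparameterisation (my reading of section 4.1):
    # W' = m * (W0 + B A) / ||W0 + B A||_c, with column-wise normalisation.
    def dora_weight(W0, A, B, m):
        V = W0 + B @ A                          # direction: frozen weight plus low-rank update
        V = V / V.norm(dim=0, keepdim=True)     # normalise each column to unit length
        return m * V                            # rescale columns by the learned magnitude

    W0 = torch.randn(512, 512)                  # frozen pretrained weight
    A = torch.randn(8, 512) * 0.01              # trainable, rank 8
    B = torch.zeros(512, 8)                     # trainable, initialised to zero
    m = W0.norm(dim=0)                          # magnitude, initialised from W0
    W = dora_weight(W0, A, B, m)                # same shape as W0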
reply
cuuupid
14 days ago
[-]
I'm struggling to understand from this paper whether the approach is better in the general sense (all cases, with wider models seeing greater benefits) or purely for wider models (with narrower models seeing a detriment).

If it's the former, this could effectively halve fine-tuning cost overnight, which would go a significant way towards enabling a wider array of use cases for LoRA.

reply
ironbound
14 days ago
[-]
I've had success with GaLore: Memory-Efficient LLM Training by Gradient Low-Rank Projection: https://arxiv.org/abs/2403.03507
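For context, my understanding of the trick (a loose paraphrase, not the authors' code) is that the gradient of each big weight matrix is projected into a low-rank subspace, the Adam statistics are kept only in that small space, and the step is projected back up:

    import torch

    # Loose sketch of gradient low-rank projection: Adam moments live in a rank-r
    # subspace of the gradient, which is where the optimiser-memory saving comes from.
    def galore_step(W, grad, P, m, v, lr=1e-3, b1=0.9, b2=0.999, eps=1e-8):
        g = P.t() @ grad                        # project the full gradient down to rank r
        m = b1 * m + (1 - b1) * g               # Adam moments in the small space
        v = b2 * v + (1 - b2) * g ** 2
        update = P @ (m / (v.sqrt() + eps))     # project the step back to full size
        return W - lr * update, m, v

    d_out, d_in, r = 1024, 1024, 64
    W = torch.randn(d_out, d_in)
    grad = torch.randn(d_out, d_in)             # stand-in for a real backprop gradient
    # Projection from the top-r left singular vectors of the gradient; the paper
    # refreshes it periodically, here it is computed once for brevity.
    U, _, _ = torch.linalg.svd(grad, full_matrices=False)
    P = U[:, :r]
    m = torch.zeros(r, d_in)
    v = torch.zeros(r, d_in)
    W, m, v = galore_step(W, grad, P, m, v)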
reply
Scipio_Afri
13 days ago
[-]
This uses less memory, so you can fine-tune on hardware with less VRAM, but at the cost of longer training: there is a throughput penalty; the paper detailing the technique shows something like a 15% decrease in throughput.
reply
youssefabdelm
14 days ago
[-]
A better name would've probably been FastLoRA or something
reply
mobilemidget
13 days ago
[-]
I was expecting to read about LOng RAnge radio communication.

https://en.wikipedia.org/wiki/LoRa

reply
throwaway2562
14 days ago
[-]
fLORA
reply
yau8edq12i
14 days ago
[-]
What an unfortunate name... I initially thought this was about wireless communication. https://en.wikipedia.org/wiki/LoRa
reply
sorenjan
14 days ago
[-]
This gets mentioned here every time an article about LoRA is posted. Sometimes acronyms mean multiple things; these two aren't in the same field, so the risk of confusion beyond short headlines is negligible.

It's a bit like someone reading a bicycling article and getting annoyed that FTP means Functional Threshold Power instead of File Transfer Protocol, or reading about machine learning and getting confused that MLP doesn't mean My Little Pony.

reply
rakoo
14 days ago
[-]
"computer science" and "bicycles" aren't the same domain, it's fine to have the same acronym.

"computer science" and "tv shows" aren't the same domain, it's fine to have the same acronym.

"computer science" and "computer science" are the same domain, it's not a good idea to use the same acronym.

reply
dragonwriter
14 days ago
[-]
> "computer science" and "computer science" are the same domain, it's not a good idea to use the same acronym.

But “radio communication" is not “computer science”, even though people sometimes plug radio transceivers into computers, just like “tv shows” aren't “computer science” just because people sometimes view or store their shows on a computer, and “bicycles” aren’t “computer science” because sometimes people mount computers on their bikes.

reply
WithinReason
14 days ago
[-]
"Large Models" is in the title, so it's obviously not about radio.
reply
GuB-42
14 days ago
[-]
The acronym is also spelled out in the title: LoRA = Low Rank Adaptation.
reply
rakoo
14 days ago
[-]
"Large models" is not spelled out in full, and it doesn't explicitly say it's not about the communication protocol.
reply
squigz
14 days ago
[-]
Context is hard!
reply
rakoo
14 days ago
[-]
So instead of LoRa and anything else, everyone now has to say LoRa (the communication protocol) or LoRA (the large-model thing). Having to add context all the time makes everything so much simpler!
reply
squigz
13 days ago
[-]
Or potentially include the necessary context in the title of the post.
reply
rakoo
12 days ago
[-]
Or just pick another name
reply
squigz
12 days ago
[-]
That does seem to be more reasonable than expecting people to pick up on context
reply
WithinReason
14 days ago
[-]
Low rank adaptation is abbreviated LoRA
reply
nostrademons
14 days ago
[-]
"Computer science" isn't really one domain anymore - the field split into several subdomains in the 2010s. Just try to get a job as a "computer scientist" now - the recruiter would be like "No, are you a web developer? Mobile developer? Backend developer? Data scientist? Data engineer? Cloud engineer? AI engineer? Machine learning developer?"
reply
the__alchemist
14 days ago
[-]
I think the reason this keeps coming up is encoded in your second sentence, in conjunction with the HN medium: LoRa and LoRA are both, unfortunately, things that the target audience is likely to be interested in and/or knowledgeable about, but a general audience is not.

Also, both use a non-standard case mix.

reply
1024core
14 days ago
[-]
> Sometimes acronyms means multiple things

Exactly. Like WiFi: from ancient times it has meant "Wife's Fidelity".

reply
teaearlgraycold
14 days ago
[-]
Tell my WiFi love her
reply
EGreg
14 days ago
[-]
Finally! Some people have been screaming to change the acronym since 2001, but these tech bros didn't listen. Such hubris!

https://en.m.wikipedia.org/wiki/Crypto_naming_controversy

reply
yau8edq12i
14 days ago
[-]
> This gets mentioned here everytime an article about LoRA is posted.

I wonder why!

reply
IshKebab
14 days ago
[-]
Yes, but radio protocols and AI methods are a lot closer than most pairs of fields that share an acronym. This is obvious from the fact that it gets mentioned every time an article about LoRA is posted.
reply
mattlondon
14 days ago
[-]
But these are clearly both in the same field, as everyone keeps mentioning here! So clearly there is confusion. It certainly tricked me on first reading: "ah cool, efficient LoRA+, that sounds cool... ah wait, no, it's just some machine learning spam".
reply
kcorbitt
14 days ago
[-]
This specific variant "LoRA+" described in this paper is even harder to search for. I was doing some research on this technique recently and it turns out that "Lora+" matches with "Lora" in Discord search, which is quite unhelpful. :)
reply
SquareWheel
14 days ago
[-]
Discord search is one of the worst I've ever used. They remap words like "localization" to "local", which makes it impossible to search for more specific terms.
reply
rytill
14 days ago
[-]
That’s LoRa. This is LoRA.
reply
bee_rider
14 days ago
[-]
The idea of low-rank approximations is not new; truncated SVDs, for example, have been used for many decades.
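For illustration, the classic truncated SVD gives the best rank-r approximation in Frobenius norm in a few lines:

    import torch

    # Truncated-SVD low-rank approximation: keep only the top-r singular triplets.
    W = torch.randn(512, 512)
    r = 16
    U, S, Vh = torch.linalg.svd(W, full_matrices=False)
    W_r = U[:, :r] @ torch.diag(S[:r]) @ Vh[:r, :]

    print(torch.linalg.matrix_rank(W_r))   # tensor(16)
    print((W - W_r).norm())                # Frobenius-norm approximation error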
reply
yau8edq12i
14 days ago
[-]
The acronym LoRA used in the context of deep learning (2021) is about seven years younger than the radio communication protocol called LoRa (2014). Type "lora" in a search engine and see what you get.
reply
kleiba
14 days ago
[-]
Searching with Google now, the first 10 results give me a mix of the two technologies.
reply
blamestross
14 days ago
[-]
Yeah, I would prefer they didn't obfuscate the actually useful search term.
reply
allpaca
14 days ago
[-]
This is old; it was released in February... Why talk about it now?
reply
axpy906
14 days ago
[-]
In 2024 are folks still swapping out LoRA adapters? Is this still relevant?
reply
ac2u
14 days ago
[-]
Can’t tell if your tone is inquisitive or incredulous :)

If the latter, please point out the alternatives.

reply
bckr
14 days ago
[-]
Why would it not be?
reply