Onavo 2 days ago

> Removed KL Divergence

Wait, how are they computing the loss?

danielhanchen 2 days ago

Oh, it's the KL term, sorry: beta * KL, i.e. they set beta to 0.

The goal of it was to "force" the model not to stray too far from the original checkpoint, but it can also hinder the model from learning new things.
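
Roughly, the loss has the shape below (a minimal PyTorch-style sketch, not the exact trainer code; the PPO-style clipping/ratio machinery is left out and the function/variable names are just illustrative):

    import torch

    def grpo_style_loss(logprobs, ref_logprobs, advantages, beta=0.0):
        # Policy-gradient part: reinforce tokens with positive advantage.
        pg_loss = -(advantages * logprobs).mean()

        # Per-token estimate of KL(pi || pi_ref) in the GRPO style:
        # exp(ref - pi) - (ref - pi) - 1, which is always >= 0.
        delta = ref_logprobs - logprobs
        kl = torch.exp(delta) - delta - 1.0

        # With beta = 0 the penalty term vanishes entirely.
        return pg_loss + beta * kl.mean()

Setting beta = 0 makes the whole second term disappear, which is all "removing the KL divergence" amounts to.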

trc001 1 day ago

It's become trendy to delete it. I say "trendy" because many papers delete it without offering any proof that it was meaningless.

mjburgess 2 days ago

It's just a penalty term that they drop from the loss.