PodParley
Scaling Laws for Precision

EPISODE · Nov 18, 2024 · 18 MIN


from LlamaCast · host Shahriar Shariati

⚖️ Scaling Laws for Precision

This research paper investigates how the precision used in training and inference affects the performance of large language models. The authors explore how precision changes the effective parameter count and propose scaling laws that predict the performance degradation caused by low-precision training and by post-training quantization. They find that overtrained models are more sensitive to post-training quantization, and that training larger models in lower precision can be compute-optimal. Their unified scaling law accounts for both training-time and post-training effects and predicts loss across varied precision settings, ultimately suggesting that the standard practice of training models in 16-bit may be suboptimal.

📎 Link to paper
🌐 Read their Tweet
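The idea of precision reducing the "effective" parameter count can be sketched numerically. The snippet below is a hypothetical illustration only: it assumes a Chinchilla-style loss L(N, D) = A·N^(-α) + B·D^(-β) + E, and models the precision effect by replacing N with an effective count that shrinks as training precision P (in bits) drops, via an assumed form N_eff = N·(1 − e^(−P/γ)). All constants and the function `effective_params` are illustrative assumptions, not fitted values from the paper.

```python
import math

# Illustrative Chinchilla-style constants (NOT fitted values from the paper)
A, B, E = 406.4, 410.7, 1.69
ALPHA, BETA = 0.34, 0.28
GAMMA = 2.0  # hypothetical sensitivity of effective parameters to bit width


def effective_params(n_params: float, bits: float, gamma: float = GAMMA) -> float:
    """Assumed form: effective parameter count decays as precision drops."""
    return n_params * (1.0 - math.exp(-bits / gamma))


def predicted_loss(n_params: float, n_tokens: float, bits: float) -> float:
    """Chinchilla-style loss with the precision-adjusted parameter count."""
    n_eff = effective_params(n_params, bits)
    return A * n_eff ** -ALPHA + B * n_tokens ** -BETA + E


# For a fixed nominal size (1B params, 20B tokens), lower training
# precision shrinks N_eff and so raises the predicted loss:
for bits in (16, 8, 4):
    print(f"{bits:>2}-bit  loss ≈ {predicted_loss(1e9, 20e9, bits):.4f}")
```

Under this assumed form, the gap between 16-bit and 8-bit is tiny while 4-bit visibly hurts, which mirrors the paper's qualitative claim that very low precision degrades performance but 16-bit may carry unnecessary headroom.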
