Section 1 — The Model

    Parameters

    Matt Pocock

    The numbers inside a model — often billions of them — tuned during training. Everything the model "knows" lives in them. Training sets them; inference uses them unchanged. Also called weights.

    Usage:

    "Can we fine-tune it on our codebase?"

    "That'd update the parameters — different model afterwards. For one project it's almost always cheaper to load the codebase as context than to retrain."
