• sigmaklimgrindset@sopuli.xyz
    6 days ago

    second, more attention is given to the last bits of input, so as chat goes on, the first bits get less important, and that includes these guardrails

    This part is something that I really can’t grasp for some reason. Why do LLMs like…lose context the longer a chat goes on, if that makes any sense? Especially context that’s baked into the system prompts, which I would assume would be a perpetual thing?

    I’m sorry if this is a stupid question, but I truly am an AI luddite. My roommate set up a local Deepseek server to help me determine what to cook with what’s almost expired in our fridge. I’m not really having long, soulful conversations with it, you know?