• DrDystopia@lemy.lol · 19 days ago

    Meh, I wouldn’t mind an NPU in my laptop - chugging tokens on an i3 with only two performance cores is a chore.

    I would mind a Windows laptop though, and I’ve not seen rave reviews about how well Ollama on Linux utilises NPUs yet, so I’ll just wait for now. I’d expect the smarties will have found other ways to utilise the extra processing power by the time there’s full Linux support as well.

    • xcjs@programming.dev · 19 days ago

      It’s actually difficult to find software that supports it, assuming you’re not using Copilot. Ollama on my NPU-enabled laptop doesn’t even try to use it, and even if it did, performance might suffer anyway.

      • DrDystopia@lemy.lol · 19 days ago

        Thanks for an actual user confirmation: no NPU for Ollama. Yet. And that’s part of my point, it’s not there yet, but maybe some day. I’m pretty convinced. People will also figure out other ways of using the hardware, like general computational stuff.