Any experience with a self-hosted assistant along the lines of the modern Google Assistant? I'm looking for something LLM-powered that's smarter than the older assistants, which would just try to call third-party tools directly and miss or misunderstand requests half the time.

I’d like a mobile app integration so I can use it from my phone and while driving. I see Home Assistant has an Android Auto integration. Has anyone used this, or something similar? Any obvious limitations?

  • wildbus8979@sh.itjust.works · 1 hour ago

    Home Assistant can absolutely do that. If you’re OK with simple, intent-based phrasing, it’ll do it out of the box. If you want complex understanding and reasoning, you’ll have to run a local LLM, like Llama, on top of it.
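    To make that concrete, here’s a rough Python sketch of the “local LLM on top of Home Assistant” idea: it asks a local Llama model, via Ollama’s HTTP API, to turn a free-form phrase into a structured service call, then forwards that to Home Assistant’s REST API. The URLs, token, model name and prompt are placeholders for whatever your setup uses, not a recommendation.

        # Minimal sketch, assuming Ollama on localhost:11434 with a llama3 model
        # pulled, and Home Assistant on localhost:8123 with a long-lived token.
        import json
        import requests

        OLLAMA_URL = "http://localhost:11434/api/generate"
        HASS_URL = "http://localhost:8123"
        HASS_TOKEN = "YOUR_LONG_LIVED_TOKEN"  # hypothetical placeholder

        def phrase_to_service_call(phrase: str) -> dict:
            """Ask the local model to turn free-form speech into a service call."""
            prompt = (
                "Convert this smart-home request into JSON with keys "
                "'domain', 'service', and 'entity_id'. Request: " + phrase
            )
            resp = requests.post(OLLAMA_URL, json={
                "model": "llama3",
                "prompt": prompt,
                "format": "json",   # ask Ollama for JSON-only output
                "stream": False,
            }, timeout=60)
            return json.loads(resp.json()["response"])

        def call_home_assistant(call: dict) -> None:
            """Forward the structured call to Home Assistant's REST API."""
            url = f"{HASS_URL}/api/services/{call['domain']}/{call['service']}"
            requests.post(
                url,
                headers={"Authorization": f"Bearer {HASS_TOKEN}"},
                json={"entity_id": call["entity_id"]},
                timeout=10,
            )

        if __name__ == "__main__":
            call_home_assistant(phrase_to_service_call("it's a bit dark in the living room"))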

      • lyralycan@sh.itjust.works · edited 57 seconds ago

        I don’t think there’s a straightforward way to do it yet, like a HACS integration, but you can access Ollama from the web with open-webui and save that page to your phone’s home screen.

        Just be warned, you’ll need a lot of resources – Gemma3 4B takes around 3GB of storage, 0.5GB of RAM and 4GB of VRAM to respond. It’s a compromise, since I can’t get replacement RAM, and it tends to be wildly inaccurate with longer responses. The one I’d rather use, Dolphin-Mixtral 22B, takes 80GB of storage and a minimum of 17GB of RAM, and I can’t afford to take that much RAM from my other services.
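        If it helps anyone size this up, here’s a quick Python check (the host URL is an assumption about a default local Ollama install) that lists what each pulled model actually takes on disk, so you can compare something like Gemma3 4B against a Mixtral-class model before committing the storage:

            import requests

            OLLAMA = "http://localhost:11434"  # default Ollama port; adjust for your host

            # /api/tags lists every locally pulled model with its on-disk size in bytes
            for model in requests.get(f"{OLLAMA}/api/tags", timeout=10).json()["models"]:
                print(f"{model['name']}: {model['size'] / 1e9:.1f} GB on disk")

            # RAM/VRAM use while a model is actually loaded is easier to read from
            # `ollama ps` on the host than from the HTTP API.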