• LionyxML@alien.topOPB
    11 months ago

    Here’s another addition to my aspiring blog: https://www.rahuljuliato.com/posts/ellama

    This post outlines the process of setting up Ollama as a server and Ellama as an Emacs client. The result is a localized AI companion that enhances your Emacs experience.
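    For anyone who wants a taste before reading the post, a minimal configuration might look something like this sketch (the model name is just an example — use whatever you have pulled with `ollama pull`):

    ```emacs-lisp
    ;; Minimal Ellama setup talking to a local Ollama server.
    ;; Assumes Ollama is already running (`ollama serve`) and that
    ;; the zephyr model has been pulled (`ollama pull zephyr`).
    (use-package ellama
      :init
      (require 'llm-ollama)
      (setopt ellama-provider
              (make-llm-ollama
               :chat-model "zephyr"
               :embedding-model "zephyr")))
    ```

    From there, `M-x ellama-chat` and the other Ellama commands should use the local model.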

    Embracing the world of Emacs has been a delightful journey, and I can’t wait to share it with all of you!

    • wonko7@alien.topB
      11 months ago

      that’s so cool, thanks!

      I think I’m gonna need new hardware if I want to play with this…

      • LionyxML@alien.topOPB
        11 months ago

        Nice!

        Ollama runs in CPU-only mode if you don’t have a GPU, and there are models that run in as little as 4 GB of RAM.

        Things won’t be fast, but for experimenting it’s enough :)
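
        If RAM is tight, pointing Ellama at a small model keeps things usable. A sketch (the model name is an assumption — any small model from the Ollama library works):

        ```emacs-lisp
        ;; Swap in a small model for low-RAM machines,
        ;; e.g. after `ollama pull tinyllama`.
        (setopt ellama-provider
                (make-llm-ollama
                 :chat-model "tinyllama"
                 :embedding-model "tinyllama"))
        ```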