• 1984@lemmy.today
    1 month ago

    There are models you can download and run at home that don't have the politically correct censorship built in. It's very nice not to have the artificial politeness, for example, and the models actually answer your questions.

    You need a powerful computer for some of them though.
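
    For example, a minimal sketch of running a downloaded model locally with the llama-cpp-python bindings (the model file path is a placeholder for whatever GGUF model you grab):

    ```python
    from llama_cpp import Llama  # pip install llama-cpp-python

    # Load a locally downloaded GGUF model; runs fully offline.
    # The path is a placeholder; n_ctx sets the context window size.
    llm = Llama(model_path="./models/your-model.Q4_K_M.gguf", n_ctx=4096)

    out = llm.create_chat_completion(
        messages=[{"role": "user", "content": "Say hello."}],
        max_tokens=256,
    )
    print(out["choices"][0]["message"]["content"])
    ```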

      • merari42@lemmy.world
        1 month ago

        For a user without much technical experience, a ready-made GUI like Jan.ai is probably a good start: it downloads models automatically and can run them with the ggml library on consumer-grade hardware like Mac M-series chips or cheap GPUs from either Nvidia or AMD.

        For slightly more technically proficient users, Ollama is probably a great choice for hosting your own OpenAI-like API for local models. I mostly run gemma2 or small llama3.1-style models with it.
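
        As a sketch of what that looks like: once a model has been pulled (e.g. `ollama pull gemma2`) and the server is running, its OpenAI-compatible endpoint on localhost:11434 can be queried like this (the model name is assumed to match what you pulled):

        ```python
        import requests

        # POST to Ollama's OpenAI-compatible chat endpoint (default port 11434).
        # Assumes `ollama serve` is running and gemma2 has already been pulled.
        resp = requests.post(
            "http://localhost:11434/v1/chat/completions",
            json={
                "model": "gemma2",
                "messages": [{"role": "user", "content": "Why is the sky blue?"}],
            },
            timeout=120,
        )
        resp.raise_for_status()
        print(resp.json()["choices"][0]["message"]["content"])
        ```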