A chart titled “What Kind of Data Do AI Chatbots Collect?” lists and compares seven AI chatbots—Gemini, Claude, CoPilot, Deepseek, ChatGPT, Perplexity, and Grok—based on the types and number of data points they collect as of February 2025. The categories of data include: Contact Info, Location, Contacts, User Content, History, Identifiers, Diagnostics, Usage Data, Purchases, Other Data.

  • Gemini: Collects all 10 data types; highest total at 22 data points
  • Claude: Collects 7 types; 13 data points
  • CoPilot: Collects 7 types; 12 data points
  • Deepseek: Collects 6 types; 11 data points
  • ChatGPT: Collects 6 types; 10 data points
  • Perplexity: Collects 6 types; 10 data points
  • Grok: Collects 4 types; 7 data points
  • Kiuyn@lemmy.ml · 7 hours ago

    I recommend GPT4all if you want to run models locally on your PC. It is super easy.

    If you want to run it on a separate server, Ollama + some kind of web UI is the best option.

    Ollama can also be run locally, but IMO it takes more learning than a GUI app like GPT4all.
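
    In case it helps, here is a minimal sketch of that server setup, assuming Ollama’s default port (11434) and Open WebUI as one possible choice of web UI; exact flags and image tags may differ from what I show here:

    # install Ollama on the server; the installer normally registers it as a background service
    curl -fsSL https://ollama.com/install.sh | sh
    # to reach it from other machines, bind it to all interfaces instead of just localhost
    OLLAMA_HOST=0.0.0.0 ollama serve &
    ollama pull deepseek-r1:7b    # download a model to serve

    # Open WebUI run via Docker, pointed at the Ollama server
    # (<server-ip> is a placeholder for the address of the machine running ollama)
    docker run -d -p 3000:8080 \
      -e OLLAMA_BASE_URL=http://<server-ip>:11434 \
      -v open-webui:/app/backend/data \
      --name open-webui ghcr.io/open-webui/open-webui:main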

    • CodexArcanum@lemmy.dbzer0.com · 6 hours ago

      If by “more learning” you mean learning

      ollama run deepseek-r1:7b

      Then yeah, it’s a pretty steep curve!

      If you’re a developer, you can also search “$MyFavDevEnv use local ai ollama” to find guides on setting it up. I’m using the Continue extension for VS Codium (or Code), but there are easy-to-use modules for Vim and Emacs and probably everything else as well.
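
      Under the hood, those editor plugins are just talking to Ollama’s local HTTP API, so you can sanity-check a setup with plain curl (assuming the default port and the 7b model from above):

      # one-off, non-streaming request to the local Ollama server
      curl http://localhost:11434/api/generate -d '{
        "model": "deepseek-r1:7b",
        "prompt": "Write a hello world in Rust",
        "stream": false
      }'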

      The main problem is managing your expectations. The full Deepseek is a 671b model (that’s billions of parameters), and the model weights (the thing you download when you pull a model) are 404GB in size, which is roughly what you’d expect at the 4–5 bits per parameter the quantized downloads use. You need about that much RAM available to run one of those.

      They make distilled models though, which are much smaller but still useful. The 14b is 9GB and runs fine with only 16GB of RAM. They obviously aren’t as impressive as the big cloud-hosted versions, though.
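
      For example, grabbing a distilled model looks like this (the exact tags and sizes are whatever Ollama currently publishes, so treat the numbers as approximate):

      ollama pull deepseek-r1:14b   # the distilled 14b model, roughly a 9GB download
      ollama list                   # shows installed models and their size on disk
      ollama run deepseek-r1:14b    # chat with it interactively in the terminal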

      • Kiuyn@lemmy.ml · 2 hours ago

        My assumption is always that the person I am talking to is a normal Windows user who doesn’t know what a terminal is. Most of them freak out when they see “the black box with text on it”. I guess on Lemmy the situation is better; it is just my bad habit.

        • CodexArcanum@lemmy.dbzer0.com · 1 hour ago

          No worries! You’re probably right that it’s better not to assume, and it’s good of you to provide some different options.