It goes without saying: DVDs/Blu-rays.

  • @[email protected]
    link
    fedilink
    5
    edit-2
    10 days ago

    Not trusting ChatGPT results, which are known to hallucinate false information, as your primary search method is a silly take? AI was telling people to put glue on pizza to keep the cheese from falling off. If you can see that the source of that information is a Reddit shitpost, you are way more likely to make a good judgment call about the veracity of that information.

    If you want searches without sponsored results, use SearXNG or an equivalent that strips out the ads.
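
    For anyone self-hosting, here's a minimal sketch of querying a SearXNG instance from a script (this assumes a local instance at http://localhost:8080 and that the "json" output format is enabled under search formats in its settings.yml; both the URL and that setting are assumptions, not defaults everywhere):

    ```python
    import json
    import urllib.parse
    import urllib.request

    # Assumed local SearXNG instance; point this at wherever yours actually runs.
    SEARXNG_URL = "http://localhost:8080/search"

    def search(query: str, limit: int = 5) -> None:
        # SearXNG can return JSON instead of HTML, but only if the "json"
        # format is enabled in the instance's settings.yml.
        params = urllib.parse.urlencode({"q": query, "format": "json"})
        with urllib.request.urlopen(f"{SEARXNG_URL}?{params}") as resp:
            data = json.load(resp)
        # Each result carries its own URL, so you can judge the source yourself.
        for result in data.get("results", [])[:limit]:
            print(result.get("title"), "-", result.get("url"))

    if __name__ == "__main__":
        search("searxng self hosting")
    ```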

    • @[email protected]
      link
      fedilink
      49 days ago

      The silly take is that using it is “part of the problem”.

      Also, the glue on pizza thing is nearly a moot point. The models are much more advanced now and will only keep improving.

      The commercial LLMs can share their sources now, so that's also a moot point.

      It’s not going away. Learn to use it effectively.

    • @[email protected]
      link
      fedilink
      310 days ago

      You can actually ask it for its sources now and fact-check them yourself. But just like anything you read online, use common sense. I'd see those same results in a Google search too.

        • @[email protected]
          link
          fedilink
          510 days ago

          If it's something serious, yes. Like fixing something. I also use it as an idea generator. I needed to figure out why my toilet wasn't flushing, and it suggested the flapper. Once it pointed me in a direction, I went to YouTube and looked up a video on how to replace it.

          • @[email protected]
            link
            fedilink
            110 days ago

            If it’s something serious, yes.

            Good, then it's a bit less of a bad tool in this instance. Just don't lose the habit of checking your sources; it's a slippery slope.