• Fubarberry@sopuli.xyz
    3 months ago

    I noticed something similar on websites like Reddit. I’ve come across someone answering a question on something I’m well educated on, and their answer is definitively wrong but “sounds correct”. The reddit community will upvote them, and even downvote people who try correcting them.

    But then later on I’ll come across a post on a topic I don’t know, and I’m inclined to believe the answers because they sound right and there’s a group consensus backing them up.

    • FundMECFSResearch@lemmy.blahaj.zoneOP
      3 months ago

      Yeah, it is really frustrating to try and educate the reddit hivemind about your field of expertise.

      They like things that sound good and plausible and fit their biases, not necessarily where the scientific consensus points.

      • oce 🐆@jlai.lu
        3 months ago

        It’s not specific to Reddit, you’ll see that in any community, probably because we are social animals.

        • FundMECFSResearch@lemmy.blahaj.zoneOP
          3 months ago

          In real life, if I give people my academic title, they’ll trust me more than the random person arguing with me about basic facts in my field of expertise. For some reason, not on reddit though.

          • phdepressed@sh.itjust.works
            3 months ago

            On an anonymous platform like reddit there’s no verification. Unless you cite what you’re saying, one person is as likely to be an expert as anyone else.

    • BowtiesAreCool@lemmy.world
      3 months ago

      I get this with 5-10 minute educational-type youtube videos. When it’s a topic I know, it’s obvious they just slightly changed the Wikipedia entry or took Google result headlines. But when I don’t know the topic, I’m tempted to parrot the information without checking it.

      • Carrolade@lemmy.world
        3 months ago

        I get this with wikipedia articles. I have to force myself to click through the links provided and check the reliability of the sources. They’re usually fine, but every once in a while you find something questionable snuck in there.

        • whyNotSquirrel@sh.itjust.works
          3 months ago

          Do you correct it or mark it as incorrect then? Because I usually never go to the sources, hoping people did it for me… yeah, I’m lazy and ignorant.

          • Carrolade@lemmy.world
            3 months ago

            No, if I’m on wikipedia for something, I never really feel confident enough in my own knowledge to actually do anything significant. I just mentally mark the article as questionable as I read.

            And when I know something well, I’m never looking at its wikipedia entry. lol

            Maybe I should though.

    • Kecessa@sh.itjust.works
      3 months ago

      Don’t be too scared but… the same thing is happening on Wikipedia. I realized it when I tried to correct something benign in an article (a motorcycle being the brand’s first road-legal model in 40 years) and pointed to an article confirming my correction (about another road-legal model the same brand had released 5 years prior), and my edit got deleted.

      I then went looking and found an article by an expert on a subject who argued with people on Wikipedia for over a year before just giving up, because they wouldn’t accept that a bunch of sources all quoting one wrong source didn’t make the information true.

    • kobra@lemm.ee
      3 months ago

      This is my experience with AI, specifically ChatGPT.

      If I ask it questions about how to do technical things I already know how to do, ChatGPT often comes off as wildly inept.

      However, if I ask it something I don’t know and start working through its recommended processes, more often than not I end up where I want to be, even with missteps along the way.

      • Fubarberry@sopuli.xyz
        3 months ago

        This was a concern of mine with companies training AI on reddit. Both reddit and AI have a habit of confidently providing false info in a way that sounds true, so training AI on reddit seems like it would really compound the issue.

    • teft@lemmy.world
      3 months ago

      People upvote you if you sound right or confident and you’re early to post. Later posters don’t get the same number of eyeballs as earlier posts, so any correction generally won’t receive the same number of votes.

    • TranscendentalEmpire@lemm.ee
      3 months ago

      The reddit community will upvote them, and even downvote people who try correcting them.

      Yep… Especially with any topic where there’s a big hobbyist community.

      I work in orthotics and prosthetics for a university hospital as both an educator and a healthcare provider. I can’t tell you how many times I’ve been downvoted by 3d printer enthusiasts for critiquing untrained and uneducated people fitting children with medical devices that can severely injure or debilitate them.