• SkunkWorkz@lemmy.world · 3 points · 14 hours ago

    Well, it’s also that they used biased data, and biased data is garbage data. The problem with these neural networks is the human factor: humans tend to be biased, consciously or subconsciously, so the data they feed these networks will often be biased as well. It’s like that ML model built to rate human faces that consistently gave non-white people lower scores; it turned out the training data was overwhelmingly white faces.
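
    Here is a minimal, self-contained sketch of that failure mode (the features, group centres, and the 95/5 split are made-up numbers, not the actual face-scoring system): a scorer fit only to a skewed sample ends up rating the under-represented group lower, even though the scoring rule itself never looks at group membership.

    ```python
    # Toy illustration of dataset skew turning into output bias.
    # Features, group centres, and the 95/5 split are invented numbers.
    import numpy as np

    rng = np.random.default_rng(0)

    def make_group(center, n):
        # 8 arbitrary features per example; groups differ only in a harmless offset
        return rng.normal(loc=center, scale=1.0, size=(n, 8))

    # Skewed training set: 950 examples from group A, only 50 from group B
    train = np.vstack([make_group(0.0, 950), make_group(2.0, 50)])

    # "Model": score each input by its similarity to the training-set prototype
    prototype = train.mean(axis=0)

    def score(x):
        return -np.linalg.norm(x - prototype, axis=1)

    # Evaluate on balanced, unseen data from both groups
    test_a, test_b = make_group(0.0, 1000), make_group(2.0, 1000)
    print("mean score, group A:", round(score(test_a).mean(), 2))
    print("mean score, group B:", round(score(test_b).mean(), 2))
    # Group B scores markedly lower; the gap comes entirely from the
    # composition of the training data, not from the scoring rule.
    ```

    The same effect can show up in real pipelines where the "prototype" is implicit in learned weights rather than an explicit mean, which is why it is easy to miss.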

    • Nalivai@discuss.tchncs.de · 4 points · 14 hours ago

      I am convinced that unbiased data doesn’t exist, and at this point I’m not sure it can exist in principle. Then you take your data, full of unknown bias, and feed it to a black box that creates even more unknown bias.

      • jumping_redditor@sh.itjust.works · 1 point · 10 hours ago

        If you get enough data for a specific enough task, I’m fairly confident you can get something that is relatively unbiased. Almost no company wants to risk it, though, because the training would require that no human decisions are made.

        • Nalivai@discuss.tchncs.de · 4 points · 9 hours ago

          The problem with thinking that your data is unbiased is that you don’t know where your data is biased, and you’ve stopped looking.
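
          A minimal sketch of what "keeping looking" can mean in practice (the column names, score threshold, and numbers below are hypothetical placeholders, and it assumes you even have group labels to audit against): compare model outputs per group and treat large gaps as a prompt to investigate the data, not as a verdict either way.

          ```python
          # Minimal per-group audit over model outputs; all names and numbers
          # here are hypothetical placeholders for illustration.
          import pandas as pd

          def audit_by_group(df, group_col="group", score_col="model_score"):
              # Compare sample size, average score, and approval rate per group.
              # A large gap is a reason to dig into the data, not proof of bias
              # or fairness on its own.
              return df.groupby(group_col)[score_col].agg(
                  n="count",
                  mean_score="mean",
                  approval_rate=lambda s: (s > 0.5).mean(),
              )

          # Made-up model outputs for two groups
          df = pd.DataFrame({
              "group": ["A"] * 6 + ["B"] * 4,
              "model_score": [0.9, 0.8, 0.7, 0.85, 0.6, 0.75, 0.4, 0.55, 0.3, 0.45],
          })
          print(audit_by_group(df))
          ```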