• PiJiNWiNg@sh.itjust.works · 1 month ago

      There’s a lot we understand about the brain, but there’s so much more we don’t understand about it and about “awareness” in general. It may not be magic, but it certainly isn’t 100% understood.

      • nBodyProblem@lemmy.world · 1 month ago

        We don’t need to understand cognition, nor does it need to work the same way as machine learning models, for us to say it’s essentially a statistical model.

        It’s enough to say that cognition is a black-box process that takes in sensory inputs, grows and learns from them, and produces outputs like muscle commands.
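
        Something like this toy sketch is all that framing commits to: a system defined only by its inputs and outputs, whose internals adjust statistically from experience. Names and internals below are made up purely for illustration; nothing here is brain-specific.

        ```python
        # Toy "black box" learner: it consumes sensory inputs, updates opaque
        # internal state from them, and emits outputs. Only the input/output
        # behaviour is specified; the internals stand in for whatever the real
        # process actually is.
        class BlackBoxAgent:
            def __init__(self):
                self.state = {}  # opaque internal state, shaped only by past inputs

            def observe(self, stimulus: str, reward: float) -> None:
                # "grow and learn": update a running average from experience
                avg, n = self.state.get(stimulus, (0.0, 0))
                self.state[stimulus] = ((avg * n + reward) / (n + 1), n + 1)

            def act(self, stimulus: str) -> str:
                # produce an output (think: a muscle command) from learned statistics
                avg, _ = self.state.get(stimulus, (0.0, 0))
                return "approach" if avg > 0 else "avoid"

        agent = BlackBoxAgent()
        agent.observe("warm light", reward=1.0)
        agent.observe("loud noise", reward=-1.0)
        print(agent.act("warm light"), agent.act("loud noise"))  # approach avoid
        ```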

        • 0xD@infosec.pub · 1 month ago

          You can abstract everything down to that level; that doesn’t make it any more right.

          • FooBarrington@lemmy.world · 1 month ago

            Yes, that’s physics. We abstract things down to their constituent parts to figure out what they’re made of and how they work. Human brains aren’t straightforward computers, so if there’s nothing non-physical involved (a “soul” or something), they must ultimately rely on statistics.

      • Schmeckinger@lemmy.world · 1 month ago

        I’m not saying we understand the brain perfectly, but everything we learn about it will follow logic and math.

        • PiJiNWiNg@sh.itjust.works · 1 month ago

          Not necessarily. A number of modern philosophers and physicists posit that “experience” is incalculable, and further that it’s directly tied to the collapse of the wave function in quantum mechanics (Penrose–Hameroff orchestrated objective reduction, or Orch-OR). I’m not saying they’re right, but Penrose won a Nobel Prize in physics, and he says it can’t be explained by math.

          • bunchberry@lemmy.world · 1 month ago

            I agree experience is incalculable, but not because it is some special immaterial substance; it’s because experience just is objective reality from a particular context frame. I can do all the calculations I want on a piece of paper describing the properties of fire, but the paper they’re written on won’t suddenly burst into flames. A description of an object will never converge into the real object, and by no means will descriptions of reality ever become reality itself. In that sense, the claim that experience is incalculable is just uninteresting.

            Of course, we can say the same about the wave function. We use it as a tool to predict where we will see real particles. You cannot compute the real particles from the wave function either, because it isn’t a real entity but a description of relationships between observations (i.e. experiences) of real things.
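
            For concreteness, that last point is just the textbook Born rule: the wave function only hands you probabilities for where a particle will be observed; it is not itself the particle.

            ```latex
            % Born rule: |\psi(x)|^2 is the probability density for *observing*
            % the particle near x; the wave function is a predictive tool, not
            % the particle itself.
            P(x \le X \le x + \mathrm{d}x) = |\psi(x)|^{2}\,\mathrm{d}x,
            \qquad \int_{-\infty}^{\infty} |\psi(x)|^{2}\,\mathrm{d}x = 1
            ```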

    • LANIK2000@lemmy.world · 1 month ago

      (Working with the assumption we mean stuff like ChatGPT.) Mkay… though math and logic are A LOT more than just statistics. At no point have we shown that statistics alone is enough to reach cognition. I’d argue no statistical model can ever reach cognition, simply because it averages too much. The input we train it on is also fundamentally flawed: feeding it only text skips the entire thinking and processing step that goes into creating an answer. It literally just takes text and predicts what the most likely next text is, based on previous answers. It’s incapable of generating or reasoning in any way that wasn’t already spelled out somewhere in the dataset. At BEST it’s a chat simulator (or dare I say… a language model?); it’s nowhere near an intelligence emulator in any capacity.
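
      To make the “predicts the most likely text” point concrete, here is the idea in miniature: a toy next-word predictor built from nothing but counts. (Actual LLMs learn a neural approximation of the same next-token objective rather than storing literal count tables, so treat this only as a sketch of the statistical principle.)

      ```python
      from collections import Counter, defaultdict

      # Count which word follows which in the training text, then always emit
      # the most frequent continuation: pure statistics over the dataset.
      def train(corpus: str) -> dict:
          follows = defaultdict(Counter)
          words = corpus.split()
          for prev, nxt in zip(words, words[1:]):
              follows[prev][nxt] += 1
          return follows

      def generate(follows: dict, start: str, length: int = 10) -> str:
          out = [start]
          for _ in range(length):
              options = follows.get(out[-1])
              if not options:
                  break
              out.append(options.most_common(1)[0][0])  # the statistically likeliest next word
          return " ".join(out)

      model = train("the cat sat on the mat and the cat slept on the mat")
      print(generate(model, "the"))  # regurgitates likely continuations from the training text
      ```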