• sab@kbin.social · 1 year ago

    This is the key - it does not create, it can only copy. Which is good enough to fool us - there’s enough stuff to copy out there that you can spend your whole life copying other people and nobody will ever notice you’re not actually creating anything new. What’s more, you’ll probably come across as pretty clever. But you’re not creating anything new.

    For me, this poses an existential threat to academia. It might halt development in the field without researchers even noticing: their text looks fine, as if they had thought it through, and of course they read it to make sure it's logically consistent. But the creative force is gone. Nothing new will come under the sun - none of the genuinely new thoughts that only creative humans can produce, thoughts that have never been put on paper before.

    If we give that up, what's even the point of doing science in the first place?

    • Sinnerman@kbin.social · 1 year ago

      There’s a difference between:

      1. Using ChatGPT to help write parts of the text in the same way you’d use a grammar- or spell-checker (e.g. if English isn’t your first language) after you’ve finished the experiments

      2. Using ChatGPT to write a paper without even doing any experiments

      Clearly the second is academic misconduct. The first one is a lot more defensible.

      • sab@kbin.social · 1 year ago

        Yes, absolutely. But I still think it has its dangers.

        Using it to write the introduction doesn't change the substance of the paper, yet it frames how the reader interprets that substance, and it often decides whether the paper gets read at all.

        Maybe worse, I find that it's often in the painful writing and rewriting of the introduction and conclusion that I truly understand my own contribution - I've done the analysis and all that, but in forcing myself to think about the relevance for the field and the reader, I also come to better understand what the paper means in a deeper sense. I believe this kind of deep thinking at the end of the process is incredibly valuable, and it's what I'm afraid we might be losing with AI.

    • FaceDeer@kbin.social · 1 year ago

      This is the key - it does not create, it can only copy.

      I have asked ChatGPT to write poetry on subjects that I know with great certainty have never had poems written about them.

      You can of course shuffle around the meanings of “create” and “copy” to try to accommodate that, but eventually you end up with a “copying” process that’s so flexible and malleable that it might as well be creativity. It’s not like what comes out of human brains isn’t based on stuff that went into them earlier either.