I must confess to getting a little sick of seeing the endless stream of articles about this (along with the season finale of Succession and the debt ceiling), but what do you folks think? Is this something we should all be worrying about, or is it overblown?

EDIT: have a look at this: https://beehaw.org/post/422907

  • spoonful@beehaw.org · 2 years ago

    I’m actually very optimistic, and here’s why: it changes education and research completely.

    Generally, when learning something new, the first step is the hardest: figuring out where to start and what to learn is overwhelming, and these tools basically get rid of that. It’s amazing.

    I’m a fullstack engineer, and honestly I feel that with LLMs I have the tools to switch to basically any career I’d want. If AI takes away coding, then I’d happily let it build stuff for me and pivot somewhere else. Things get a bit weirder for people who can’t do that, but that’s not a new issue: we already have people who need assistance, and if anything we should be able to support them better now.

    • orclev@lemmy.ml · 2 years ago

      The problem with ALL of these LLMs is that they don’t actually understand anything at all. They produce output that looks like other, similar things but may or may not bear any actual relationship to reality. So they’re incredibly advanced bullshit generators.

      I would never trust a piece of code written by one of these things; you’d spend just as much time debugging what it wrote as it would have taken to write it yourself in the first place.

      For that matter, you could never really trust anything one told you either, since you have no way of knowing whether any given statement is true; you’d literally have to research everything it tells you just to find out.

      It could maybe work as a better interface to a search engine, though. You ask it a question and it redirects you to what it thinks are the most relevant search results. E.g., you ask “how do I do X” and it tells you “People who wanted to do X often needed to know about Y and Z; here are some of the top search results for that” — except you’d need to actually follow the links provided, not let it summarize them. A rough sketch of what that could look like follows.
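      A minimal sketch of that search-router idea, in Python. This is just an illustration of the shape, not any real product: llm_complete and web_search are hypothetical stand-ins for whatever LLM API and search API you actually have.

      # Hypothetical stand-ins: swap in your actual LLM and search APIs.
      def llm_complete(prompt: str) -> str:
          """Return the model's text completion for `prompt` (hypothetical)."""
          raise NotImplementedError

      def web_search(query: str, limit: int = 3) -> list[str]:
          """Return up to `limit` result URLs for `query` (hypothetical)."""
          raise NotImplementedError

      def route_question(question: str) -> list[str]:
          # Ask the model only for related search terms, never for answers,
          # so everything the user actually reads comes from a followable link.
          terms = llm_complete(
              "List the search terms someone asking the question below would "
              "need, one per line, with no commentary:\n" + question
          )
          links: list[str] = []
          for term in terms.splitlines():
              if term.strip():
                  links.extend(web_search(term.strip()))
          return links  # the user follows these; the model summarizes nothing

      The point of the design is that the model’s output is only ever used to pick links, so a hallucinated “fact” can’t reach the user without a source attached for them to check.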