• 0 Posts
  • 2 Comments
Joined 11 months ago
Cake day: August 2nd, 2023

  • Nevoic@lemm.ee to Technology@lemmy.world · Hello GPT-4o
    2 months ago

    “They can’t learn anything” is too reductive. Try feeding GPT-4 a specification for a language that didn’t exist at the time of its training, then tell it to program in that language using a library you also provide.

    It won’t do well, but neither would a junior developer working in raw vim/nano without compiler or linter feedback. It will roughly construct something that resembles the new language you fed it, a language it was never trained on. This is something LLMs can in principle do well, so GPT-5, GPT-6, and so on will do better, perhaps as well as any professional human programmer.

    Their context windows have increased many times over. We’re no longer operating in the 4k/8k range, but in the 128k to 1024k range. That’s enough context to, from the perspective of an observer, learn an entirely new language and framework and then write something almost usable in it. And 2024 isn’t the end of context window growth.
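    To make that concrete, here’s a rough sketch of the kind of prompt I mean. Everything here is illustrative: `ask_llm` stands in for whatever chat-completion API you use, and the file names are made up.

    ```python
    # Hypothetical sketch: put a post-training-cutoff language spec and
    # library docs into a large context window, then ask for a program.

    def build_prompt(spec: str, library_docs: str, task: str) -> str:
        # With a 128k+ token window, the entire spec and docs fit in the prompt.
        return (
            "Below is the specification of a programming language that did not "
            f"exist when you were trained:\n\n{spec}\n\n"
            f"Here is documentation for one of its libraries:\n\n{library_docs}\n\n"
            f"Using only this language and this library, write a program that: {task}"
        )

    # Illustrative usage (ask_llm = any chat-completion call):
    # prompt = build_prompt(open("spec.md").read(),
    #                       open("lib_docs.md").read(),
    #                       "parses a CSV file and prints per-column sums")
    # answer = ask_llm(prompt)
    ```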

    With the right tooling (e.g. feed compiler errors back in and have the LLM reflect on how to fix them), you’d get even more reliability out of modern-day LLMs alone. Make that loop reliable enough, and it effectively does what we do when we learn.
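    A minimal sketch of that loop, assuming a hypothetical `ask_llm` wrapper around whatever model API you’re using; the compiler command and file name are illustrative:

    ```python
    import subprocess

    def ask_llm(prompt: str) -> str:
        """Hypothetical stand-in for a real chat-completion API call."""
        raise NotImplementedError

    def generate_with_compiler_feedback(task: str, max_rounds: int = 3) -> str:
        # First draft from the model.
        code = ask_llm(f"Write a Rust program that does the following:\n{task}")
        for _ in range(max_rounds):
            with open("main.rs", "w") as f:
                f.write(code)
            # Compile and capture any diagnostics.
            result = subprocess.run(["rustc", "main.rs"],
                                    capture_output=True, text=True)
            if result.returncode == 0:
                return code  # compiles cleanly, stop iterating
            # Feed the compiler errors back and ask the model to reflect and fix.
            code = ask_llm(
                f"This code:\n{code}\n\nfailed to compile with these errors:\n"
                f"{result.stderr}\n\nFix it. Reply with only the corrected code."
            )
        return code  # best effort after max_rounds
    ```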

    So much work in programming isn’t novel. You’re not making something truly new; you’re piecing together work other people did. Even when you write an entirely new library, you’re using a language someone else wrote and libraries other people wrote, in an editor someone else wrote, on an OS someone else wrote. We’re all standing on the shoulders of giants.


  • Nevoic@lemm.ee to Technology@lemmy.world · Hello GPT-4o
    2 months ago

    Eighteen months ago, ChatGPT didn’t exist and GPT-3.5 wasn’t publicly available.

    At that same point 18 months ago, the iPhone 14 was available. Now we have the iPhone 15.

    People have gotten used to LLMs/AI developing incredibly fast, but you really have to keep in perspective how different this tech was 18 months ago. Comparing LLM plateaus to smartphone plateaus is just silly at the moment.

    Yes, they’ve been refining the GPT-4 model for about a year now, but we’ve also got major competitors in the space that didn’t exist 12 months ago. We got multimodality that didn’t exist 12 months ago. Sora is mind-bogglingly realistic, and it didn’t exist 12 months ago either.

    GPT-5 is just a few months away. If the 4→5 jump is anything like 3→4, my career as a programmer will be over within the next 5 years. GPT-4 already consistently outperforms the college students I help, and can often match junior developers in reliability (though with far more confidence, which is obviously problematic). I don’t think people realize how big a deal that is.