Greg Clarke

Mastodon: @greg@clar.ke

  • 19 Posts
  • 79 Comments
Joined 2 years ago
Cake day: November 9th, 2022

  • Yes, of course I’m asserting that. While the raw performance of LLMs may be plateauing, cost, context window size, and efficiency are all still improving quickly. When you chat with a modern chat bot, it’s not just sending your input to an LLM the way the first public version of ChatGPT did. Nowadays a single chat bot response may involve many LLM requests along with other techniques that mitigate the deficiencies of LLMs. Just ask the free version of ChatGPT a question that requires some calculation and you’ll get a better sense of what’s going on under the hood and where the industry is heading.
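    To make the calculation point concrete, here’s a minimal sketch of the “route to a tool instead of the model” idea. Everything here is hypothetical (no vendor’s actual API): the pipeline tries a real arithmetic evaluator first and only falls back to an LLM request for everything else.

    ```python
    import ast
    import operator

    # Instead of asking the LLM to do arithmetic (which it often gets wrong),
    # the chat pipeline evaluates the expression with real code and only sends
    # non-calculation inputs to the model. call_llm() is a stand-in stub.
    _OPS = {ast.Add: operator.add, ast.Sub: operator.sub,
            ast.Mult: operator.mul, ast.Div: operator.truediv,
            ast.Pow: operator.pow, ast.USub: operator.neg}

    def safe_eval(expr: str) -> float:
        """Evaluate a plain arithmetic expression without eval/exec."""
        def walk(node):
            if isinstance(node, ast.Expression):
                return walk(node.body)
            if isinstance(node, ast.Constant) and isinstance(node.value, (int, float)):
                return node.value
            if isinstance(node, ast.BinOp) and type(node.op) in _OPS:
                return _OPS[type(node.op)](walk(node.left), walk(node.right))
            if isinstance(node, ast.UnaryOp) and type(node.op) in _OPS:
                return _OPS[type(node.op)](walk(node.operand))
            raise ValueError("not plain arithmetic")
        return walk(ast.parse(expr, mode="eval"))

    def call_llm(prompt: str) -> str:
        # Hypothetical: in a real pipeline this would be an API request.
        return f"[LLM reply to: {prompt}]"

    def answer(user_input: str) -> str:
        """Route: try the calculator tool first, fall back to the LLM."""
        try:
            return f"The answer is {safe_eval(user_input)}"
        except (ValueError, SyntaxError):
            return call_llm(user_input)

    print(answer("12 * (3 + 4)"))  # handled by the tool, not the model
    ```

    A real chat bot generalizes this with many tools (search, code execution, retrieval), which is why one reply can cost several LLM requests.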

  • That data is available in the ActivityPub protocol, but as far as I know you don’t have access to the vote data as a regular Lemmy user. You would have access to it if you were running your own Lemmy server.

    But honestly, let it go. Someone stole fake Internet points from you. It cost you nothing, but it cost them 5 minutes of their lives. You’re having a fight with a stranger who doesn’t value their own time. Value yours and stop feeding these kinds of energy vampires. You deserve better! Have a wonderful day my friend :)
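    (For anyone curious what that server-side vote data looks like: federated votes travel between instances as ActivityPub Like/Dislike activities, which is why an admin can see who voted. This is a rough sketch with illustrative URLs, not a real Lemmy payload.)

    ```python
    import json

    # A federated upvote on the wire, roughly. "Like" is an upvote and
    # "Dislike" a downvote; the actor URL identifies who cast it. Only a
    # server admin sees these raw activities -- the Lemmy UI hides them.
    raw_activity = json.dumps({
        "@context": "https://www.w3.org/ns/activitystreams",
        "type": "Like",
        "actor": "https://example.social/u/some_user",      # illustrative
        "object": "https://lemmy.example/post/12345",       # illustrative
    })

    vote = json.loads(raw_activity)
    if vote["type"] in ("Like", "Dislike"):
        print(f'{vote["actor"]} voted {vote["type"]} on {vote["object"]}')
    ```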