• 3 Posts
  • 268 Comments
Joined 8 months ago
Cake day: March 22nd, 2024

  • I briefly wrote articles for an old-school PC hardware outlet (HardOCP, if anyone remembers)… And I’m surprised any such sites are still alive. Mine shut down, and not because they wanted to.

    Why?

    Who reads written text over their favorite YouTube personality, or the SEO garbage that pops up first in their search, or first-party articles/recs on Steam, and so on? No one, except me apparently. Journalistic integrity aside, I’m way too impatient for YouTube videos, and I’m apparently the only person on the planet who trusts influencers as far as I can throw them.

    And that was before Discord, Tiktok, and ChatGPT really started eating everything. And before a whole generation barely knew what a website is.

    They cited Eurogamer as an offender here, and that’s an outstanding/upstanding site. I’m surprised they can even afford to pay that much as a business.

    And I’m not sure what anyone is supposed to do about it.

  • Yeah, well Alibaba nearly matched (and sometimes beat) GPT-4 with a comparatively microscopic model you can run on a desktop. And released a whole series of them. For free! With a tiny fraction of the GPUs any of the American trainers have.

    Bigger is not better, but OpenAI has also just lost their creative edge, and all Altman’s talk about scaling up training with trillions of dollars is a massive con.

    o1 is kind of a joke; CoT and reflection strategies have been known for a while. You can do it for free yourself, to an extent, and some models have tried to fine-tune this in: https://github.com/codelion/optillm

    But one sad thing OpenAI has seemingly accomplished is to “salt” the open LLM space. There’s way less hacky experimentation going on than there used to be, which makes me sad, as many of the community’s “old” innovations still run circles around OpenAI.
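For the curious: the “do it yourself” reflection idea above boils down to a few chained prompts. Here is a minimal, hedged sketch of a two-pass draft/critique/revise loop, in the spirit of what proxies like optillm automate. The `ask` callable and all prompt wording are hypothetical stand-ins, not optillm’s actual API; a real `ask` would wrap whatever chat-completion endpoint you use.

```python
# Minimal sketch of a "reflection" strategy: draft, self-critique, revise.
# `ask` is a pluggable, hypothetical stand-in for any LLM completion call.

def reflect(ask, question):
    """Draft an answer, ask the model to critique it, then revise."""
    draft = ask(f"Question: {question}\nThink step by step, then answer.")
    critique = ask(
        f"Question: {question}\nDraft answer: {draft}\n"
        "List any mistakes or gaps in the draft."
    )
    return ask(
        f"Question: {question}\nDraft: {draft}\nCritique: {critique}\n"
        "Write an improved final answer."
    )

# Stub model for demonstration only; it keys off words in each prompt so the
# three stages can be seen without a network call.
_canned = {"Think": "draft", "mistakes": "critique", "improved": "final"}

def stub(prompt):
    return next(v for k, v in _canned.items() if k in prompt)

if __name__ == "__main__":
    print(reflect(stub, "What is 2+2?"))
```

The point of the sketch is that nothing here depends on the provider: the same three-call pattern works against any model, which is why it was circulating in the open community well before o1.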