• 10 Posts
  • 23 Comments
Joined 1 year ago
Cake day: July 13, 2024

  • Is the feature request page an astroturfing campaign or is there really such a pathetic crowd within Firefox’s userbase?

    Firefox could’ve added Mistral or DeepSeek and made money charging for zero-data-retention inference on Mozilla’s servers. But no, they would rather do free work for a marketing company (that’s what Perplexity is - they’re not even an “AI” company - their best products are finetunes of Meta’s and DeepSeek’s open-weight models, and those finetunes are somehow stupider than the originals).

  • BB84@mander.xyz (OP) to Memes@lemmy.ml · “someone teach them LaTeX”
    39 up · edited · 3 months ago

    The only times anyone would use the asterisk as a multiplication symbol are:

    • they are doing some fancy math, and it’s not the same kind of number multiplication we’re familiar with
    • they are on a computer, the keyboard does not have a (×) key, and they don’t know how to typeset it (\times in LaTeX), so they just use the asterisk instead (see the snippet below)

    The US government falls in the second category.
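    For anyone in the second category, here is a minimal LaTeX snippet (the document boilerplate is only there to make the example compile on its own):

        \documentclass{article}
        \begin{document}
        % The asterisk rendered literally, versus proper multiplication signs:
        $3 * 5$       % typesets a raised asterisk: 3 ∗ 5
        $3 \times 5$  % typesets the multiplication sign: 3 × 5
        $a \cdot b$   % \cdot is the usual choice between variables
        \end{document}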

  • BB84@mander.xyz to TechTakes@awful.systems · “OpenAI is so cooked and I'm all here for it”
    2 up · 5 down · edited · 6 months ago

    Can someone explain why I am being downvoted and attacked in this thread? I swear I am not sealioning. Genuinely confused.

    @sc_griffith@awful.systems asked how request frequency might impact cost per request. Batch inference is one answer (ask anyone in the self-hosted LLM community): a GPU forward pass costs roughly the same whether the batch is full or half-empty, so more traffic means fuller batches and a lower cost per request. I noted that this effect only applies at very small scale; once there is enough traffic to fill every batch, the cost stops falling, and that threshold is probably much smaller than what OpenAI is operating at (toy sketch below).
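    To make the batching point concrete, here is a toy model with entirely made-up numbers (the per-batch cost, batch size, and window length are hypothetical, not OpenAI’s):

        # Toy model: batched LLM inference, hypothetical numbers throughout.
        # One GPU forward pass costs roughly the same whether the batch is
        # full or half-empty, so cost per request falls as traffic rises,
        # until the batch is full, after which it flattens out.

        COST_PER_BATCH = 1.0   # hypothetical fixed cost of one forward pass
        MAX_BATCH_SIZE = 32    # hypothetical optimal batch size
        WINDOW_SECONDS = 2.0   # requests collected per batching window

        def cost_per_request(requests_per_second: float) -> float:
            arrivals = requests_per_second * WINDOW_SECONDS
            batch = min(max(arrivals, 1.0), MAX_BATCH_SIZE)
            return COST_PER_BATCH / batch

        for rps in (0.5, 2, 8, 16, 100):
            print(f"{rps:>5} req/s -> {cost_per_request(rps):.3f} per request")

        # Past 16 req/s every window fills a batch, so extra traffic no
        # longer lowers per-request cost: the regime a provider at
        # OpenAI's scale almost certainly sits in.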

    @dgerard@awful.systems why did you say I am demanding someone disprove the assertion? Are you misunderstanding “I would be very very surprised if they couldn’t fill [the optimal batch size] for any few-seconds window” to mean “I would be very very surprised if they are not profitable”?

    The tweet I linked shows that good LLMs can be much cheaper to run. I am saying that OpenAI is very inefficient and thus economically “cooked”, as the post title has it. How does this make me FYGM? @froztbyte@awful.systems