• chunkystyles@sopuli.xyz · 17 hours ago

    What are your plans when these AI companies collapse, or start charging the actual costs of these services?

    Because right now, you’re paying just a tiny fraction of what it costs to run these services. And these AI companies are burning billions to try to find a way to make this all profitable.

    • TurdBurgler@sh.itjust.works · 10 hours ago

      These tools are mostly deterministic applications following the same methodology we’ve used in the industry for years. The development cycle has been accelerated. We are decoupled from specific LLM providers through LiteLLM, prompt management, and abstractions in our application.

      Losing a hosted LLM provider means we point LiteLLM at a different backend without changing the contracts in our applications.
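
      Roughly, the pattern looks like this (a minimal sketch, not our actual code; the model names and the CHAT_MODEL env var are placeholders):

      ```python
      # Sketch only: the app depends on a logical "chat" contract, not on a provider.
      import os
      from litellm import completion

      # Swap e.g. "openai/gpt-4o-mini" for "anthropic/claude-3-5-sonnet-20240620"
      # or a self-hosted "ollama/llama3" without touching any calling code.
      CHAT_MODEL = os.getenv("CHAT_MODEL", "openai/gpt-4o-mini")

      def chat(prompt: str) -> str:
          """Application-facing contract: text in, text out."""
          response = completion(
              model=CHAT_MODEL,
              messages=[{"role": "user", "content": prompt}],
          )
          return response.choices[0].message.content

      if __name__ == "__main__":
          print(chat("Summarize this ticket in one sentence: the printer is on fire."))
      ```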

    • Eheran@lemmy.world · 16 hours ago

      What are your plans when the Internet stops existing or is made illegal (same result)? Or when…

      They are not going away. LLMs are already ubiquitous, and there is more than one company providing them.

      • chunkystyles@sopuli.xyz · 11 hours ago

        Ok, so you’re completely delusional.

        The current business model is unsustainable. For LLMs to be profitable, they will have to become many times more expensive.

        • TurdBurgler@sh.itjust.works · 9 hours ago (edited)

          What are you even trying to say? You have no idea what these products are, but you think they are going to fail?

          Our company does market research and runs pilots with customers; we aren’t just devs operating in a bubble pushing AI.

          We listen and respond to customer needs and invest in areas that drive revenue, using this technology sparingly.

          • chunkystyles@sopuli.xyz · 3 hours ago

            I don’t know what your products are. I’m speaking specifically about LLMs and LLMs only.

            Seriously, research the cost of LLM services and how companies like Anthropic and OpenAI are burning VC cash at an insane clip.

            • TurdBurgler@sh.itjust.works · 16 minutes ago

              That’s a straw man.

              You don’t know how often we make LLM calls in our workflow automation, what models we use, what our margins are, or what counts as a high cost for my organization.

              That aside, business processes exist to handle problems like this, and the business does a cost-benefit analysis.

              We monitor costs via LiteLLM and Langfuse, and we set budgets with our providers.

              Similar architecture to the Open Source LLMOps Stack https://oss-llmops-stack.com/
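
              As a rough sketch of what that monitoring looks like (not our exact setup; Langfuse keys are read from env vars like LANGFUSE_PUBLIC_KEY and LANGFUSE_SECRET_KEY, and the model name and budget figure are made up), it’s basically a logging callback plus a spend cap:

              ```python
              # Illustrative only: log every successful LLM call to Langfuse and cap spend.
              import litellm
              from litellm import completion

              litellm.success_callback = ["langfuse"]  # tokens, latency and cost per call land in Langfuse
              litellm.max_budget = 50.0                # USD; LiteLLM errors out once the budget is exceeded

              response = completion(
                  model="openai/gpt-4o-mini",  # placeholder model
                  messages=[{"role": "user", "content": "Classify this ticket: the printer is on fire"}],
                  metadata={"trace_name": "ticket-triage"},  # groups the call under a named trace in Langfuse
              )
              print(response.choices[0].message.content)
              ```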

              Also, your last note is hilarious to me. “I don’t want all the free stuff because the company might charge me more for it in the future.”

              Our design is decoupled, we run comparisons across models, and the costs are currently laughable anyway. The most expensive process is data loading, but a good data lifecycle helps contain costs.

              Inference is cheap, and LiteLLM supports response caching.
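
              A sketch of that, too (the import path can differ between LiteLLM versions, and Redis is an option if you need a cache shared across processes):

              ```python
              # Sketch: in-memory response cache, so identical (model, messages) pairs
              # are answered locally instead of hitting the provider again.
              import litellm
              from litellm import completion
              from litellm.caching import Cache  # some versions: litellm.caching.caching

              litellm.cache = Cache()

              msgs = [{"role": "user", "content": "What is our refund policy?"}]
              first = completion(model="openai/gpt-4o-mini", messages=msgs, caching=True)
              second = completion(model="openai/gpt-4o-mini", messages=msgs, caching=True)  # cache hit
              ```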