substack.com/@dilemmaworks/note/c-139036404

Preview meta tags from the substack.com website.

Linked Hostnames

2

Thumbnail

Search Engine Appearance

Google

https://substack.com/@dilemmaworks/note/c-139036404

Erik at Dilemma Works (@dilemmaworks)

Right, a chatbot has limited utility because its input and output modes are too general. You need constraint to achieve better outputs for specialized use cases. Doubao is on the way, but not quite there yet. I had a thought this morning: how come Lovable can sell tokens at a 100x markup? It's because they solve the black box problem of LLMs. Here's another comment I wrote earlier today:

Why did Lovable become the fastest startup to $100M ARR? Because they solved the black box problem inherent to language models and the standard chat-box interface. LLMs are powerful, but their full and exact capabilities are unknown and unknowable. Users don't know what to ask and don't trust what comes back. Most AI products hand users a blank chat box and hope for magic. Lovable took a different route: they guided both the user and the model. By wrapping their LLM in a tightly designed UI with smart defaults, pre-filled prompts, and clear output constraints, Lovable turned a vague interface into a reliable tool. Users get value in seconds, not after wrestling with prompt engineering. The result is faster time-to-value and productized outputs people will pay for. That's how they can sell inference tokens at a 100x markup: by solving the black box problem that keeps models from doing things that are valuable to users. Solve the black box, and you unlock the market and the margins.
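The "smart defaults, pre-filled prompts, and clear output constraints" pattern the note describes can be sketched in a few lines. This is a hypothetical illustration, not Lovable's actual implementation: the function names are invented, and a stub stands in for the real model call. The point is that the user supplies only a narrow input, the prompt is pre-filled by the product, and the output is rejected unless it meets a hard constraint (here, JSON with known keys).

```python
import json

def fake_llm(prompt: str) -> str:
    # Stand-in for a real model call; returns a canned JSON answer.
    return '{"headline": "Launch faster", "cta": "Try it free"}'

# Pre-filled prompt: the product, not the user, decides how to ask.
PROMPT_TEMPLATE = (
    "You are a landing-page copywriter.\n"
    "Product: {product}\n"
    "Tone: {tone}\n"
    'Respond ONLY with JSON of the form {{"headline": "...", "cta": "..."}}'
)

def generate_copy(product: str, tone: str = "confident", llm=fake_llm) -> dict:
    """Guide both user and model: the user gives one narrow input,
    defaults fill the rest, and the output must pass a hard check."""
    prompt = PROMPT_TEMPLATE.format(product=product, tone=tone)
    raw = llm(prompt)
    data = json.loads(raw)  # output constraint: must parse as JSON
    missing = {"headline", "cta"} - data.keys()
    if missing:
        raise ValueError(f"model omitted required fields: {missing}")
    return data

print(generate_copy("AI site builder"))
```

Compared with a blank chat box, the caller never touches the prompt, and a malformed response fails loudly instead of reaching the user, which is what turns a vague interface into a productized output.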



Bing

Erik at Dilemma Works (@dilemmaworks)

https://substack.com/@dilemmaworks/note/c-139036404




DuckDuckGo

https://substack.com/@dilemmaworks/note/c-139036404

Erik at Dilemma Works (@dilemmaworks)


  • General Meta Tags

    14
    • title
      Erik at Dilemma Works (@dilemmaworks): "Right, a chatbot has limited utility because its input and output modes are too general. You need constraint to achieve better outputs for specialized use cases. Doubao is on the way, but not quite there yet. I had a thought this morning - how come Lovable can sell tokens at a…"
  • Open Graph Meta Tags

    9
    • og:url
      https://substack.com/@dilemmaworks/note/c-139036404
    • og:image
      https://substackcdn.com/image/fetch/$s_!XCt4!,w_400,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack.com%2Fimg%2Freader%2Fnotes-thumbnail.jpg
    • og:image:width
      400
    • og:image:height
      400
    • og:type
      article
  • Twitter Meta Tags

    8
    • twitter:image
      https://substackcdn.com/image/fetch/$s_!XCt4!,w_400,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack.com%2Fimg%2Freader%2Fnotes-thumbnail.jpg
    • twitter:card
      summary
    • twitter:label1
      Likes
    • twitter:data1
      1
    • twitter:label2
      Replies
  • Link Tags

    17
    • alternate
      https://substack.com/@dilemmaworks/note/c-139036404
    • apple-touch-icon
      https://substackcdn.com/icons/substack/apple-touch-icon.png
    • canonical
      https://substack.com/@dilemmaworks/note/c-139036404
    • icon
      https://substackcdn.com/icons/substack/icon.svg
    • manifest
      /manifest.json

Links

5