
Preview meta tags from the blog.sicuranext.com website.

Linked Hostnames: 5


Search Engine Appearance

Google

https://blog.sicuranext.com/influencing-llm-output-using-logprobs-and-token-distribution

Influencing LLM Output using logprobs and Token Distribution

What if you could influence an LLM's output not by breaking its rules, but by bending its probabilities? In this deep-dive, we explore how small changes in user input (down to a single token) can shift the balance between “true” and “false”, triggering radically different completions.
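The post's premise — that a change as small as a single token can tip which of “true” and “false” the model prefers — can be sketched with next-token logits and a softmax. The token names and logit values below are purely illustrative (they are not taken from the post); this is a minimal sketch of the mechanism, not the article's actual experiment:

```python
import math

def softmax(logits):
    # Convert raw logits to a probability distribution
    # (subtracting the max for numerical stability).
    m = max(logits.values())
    exps = {tok: math.exp(v - m) for tok, v in logits.items()}
    total = sum(exps.values())
    return {tok: e / total for tok, e in exps.items()}

# Hypothetical next-token logits after a binary-classifier prompt:
# "true" and "false" are nearly tied.
logits = {"true": 2.10, "false": 2.05, "other": -1.0}
probs = softmax(logits)
top = max(probs, key=probs.get)          # "true" wins by a hair

# A one-token nudge to the prompt shifts the logits slightly...
nudged = {"true": 2.00, "false": 2.20, "other": -1.0}
nudged_probs = softmax(nudged)
new_top = max(nudged_probs, key=nudged_probs.get)  # ...and "false" takes over
```

Because the top two candidates sit so close together, a small per-token shift is enough to swap the argmax — the kind of flip one can observe directly in the logprobs an LLM API returns for each sampled token.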




  • General Meta Tags (22)
    • title
      Influencing LLM Output using logprobs and Token Distribution
    • title
      TDD forward – interactive log‑prob viewer
    • title
      TDD forward – interactive log‑prob viewer (instance 2)
    • title
      TDD forward – interactive log‑prob viewer (instance 3)
    • title
      TDD forward – interactive log‑prob viewer (instance 4)
  • Open Graph Meta Tags (8)
    • og:site_name
      Sicuranext Blog
    • og:type
      article
    • og:title
      Influencing LLM Output using logprobs and Token Distribution
    • og:description
      What if you could influence an LLM's output not by breaking its rules, but by bending its probabilities? In this deep-dive, we explore how small changes in user input (down to a single token) can shift the balance between “true” and “false”, triggering radically different completions.
    • og:url
      https://blog.sicuranext.com/influencing-llm-output-using-logprobs-and-token-distribution/
  • Twitter Meta Tags (11)
    • twitter:card
      summary_large_image
    • twitter:title
      Influencing LLM Output using logprobs and Token Distribution
    • twitter:description
      What if you could influence an LLM's output not by breaking its rules, but by bending its probabilities? In this deep-dive, we explore how small changes in user input (down to a single token) can shift the balance between “true” and “false”, triggering radically different completions.
    • twitter:url
      https://blog.sicuranext.com/influencing-llm-output-using-logprobs-and-token-distribution/
    • twitter:image
      https://blog.sicuranext.com/content/images/2025/06/Screenshot-2025-06-09-at-14.44.28.png
  • Link Tags (9)
    • alternate
      https://blog.sicuranext.com/rss/
    • canonical
      https://blog.sicuranext.com/influencing-llm-output-using-logprobs-and-token-distribution/
    • icon
      https://blog.sicuranext.com/content/images/size/w256h256/2023/08/favicon.png
    • preload
      /assets/built/screen.css?v=af1b72fe25
    • preload
      /assets/built/casper.js?v=af1b72fe25

Links: 13