
blog.sicuranext.com/influencing-llm-output-using-logprobs-and-token-distribution
Preview meta tags from the blog.sicuranext.com website.
Linked Hostnames (5)
- 9 links to blog.sicuranext.com
- 1 link to arxiv.org
- 1 link to ghost.org
- 1 link to github.com
- 1 link to sicuranext.com
Search Engine Appearance (identical in Bing and DuckDuckGo)
Influencing LLM Output using logprobs and Token Distribution
What if you could influence an LLM's output not by breaking its rules, but by bending its probabilities? In this deep-dive, we explore how small changes in user input (down to a single token) can shift the balance between “true” and “false”, triggering radically different completions.
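The description refers to reading per-token log probabilities to see how the balance between candidate tokens such as "true" and "false" shifts with small input changes. A minimal sketch of that idea, assuming the OpenAI Python client with logprobs enabled; the model name, prompts, and helper function below are illustrative and not taken from the post:

```python
# Sketch (illustrative, not the post's code): inspect how probability mass
# splits across the most likely first tokens for a given prompt.
import math
from openai import OpenAI

client = OpenAI()  # reads OPENAI_API_KEY from the environment

def first_token_distribution(prompt: str, model: str = "gpt-4o-mini"):
    """Return the top candidate first tokens and their probabilities,
    so the true/false balance can be compared across prompts."""
    resp = client.chat.completions.create(
        model=model,
        messages=[{"role": "user", "content": prompt}],
        max_tokens=1,
        logprobs=True,
        top_logprobs=10,  # request the 10 most likely alternatives
    )
    top = resp.choices[0].logprobs.content[0].top_logprobs
    return {t.token: math.exp(t.logprob) for t in top}

# Two prompts differing by a single token: compare how the probabilities
# assigned to "true" vs "false" move between them.
print(first_token_distribution('Answer "true" or "false": the input is safe.'))
print(first_token_distribution('Answer "true" or "false": the input is definitely safe.'))
```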
General Meta Tags (22)
- title: Influencing LLM Output using logprobs and Token Distribution
- title: TDD forward – interactive log‑prob viewer
- title: TDD forward – interactive log‑prob viewer (instance 2)
- title: TDD forward – interactive log‑prob viewer (instance 3)
- title: TDD forward – interactive log‑prob viewer (instance 4)
Open Graph Meta Tags (8)
- og:site_name: Sicuranext Blog
- og:type: article
- og:title: Influencing LLM Output using logprobs and Token Distribution
- og:description: (same as the description above)
- og:url: https://blog.sicuranext.com/influencing-llm-output-using-logprobs-and-token-distribution/
Twitter Meta Tags (11)
- twitter:card: summary_large_image
- twitter:title: Influencing LLM Output using logprobs and Token Distribution
- twitter:description: (same as the description above)
- twitter:url: https://blog.sicuranext.com/influencing-llm-output-using-logprobs-and-token-distribution/
- twitter:image: https://blog.sicuranext.com/content/images/2025/06/Screenshot-2025-06-09-at-14.44.28.png
Link Tags (9)
- alternate: https://blog.sicuranext.com/rss/
- canonical: https://blog.sicuranext.com/influencing-llm-output-using-logprobs-and-token-distribution/
- icon: https://blog.sicuranext.com/content/images/size/w256h256/2023/08/favicon.png
- preload: /assets/built/screen.css?v=af1b72fe25
- preload: /assets/built/casper.js?v=af1b72fe25
Links (13)
- https://arxiv.org/abs/2405.11891?ref=blog.sicuranext.com
- https://blog.sicuranext.com
- https://blog.sicuranext.com/author/themiddle
- https://blog.sicuranext.com/breaking-down-multipart-parsers-validation-bypass
- https://blog.sicuranext.com/hunt3r-kill3rs-and-the-italian-critical-infrastructure-risks