www.ailog.blog/p/on-confabulation

Preview meta tags from the www.ailog.blog website.

Linked Hostnames

14

Thumbnail

Search Engine Appearance

Google

https://www.ailog.blog/p/on-confabulation

On Confabulation

Hallucination is the wrong word to describe when ChatGPT or another generative AI model gets something wrong. Confabulation is a much better word. Confabulating seems to be how most people use large language models, even though enthusiasts want these models to do more than just talk to us. What value is there in an AI confabulator?



Bing

On Confabulation

https://www.ailog.blog/p/on-confabulation

Hallucination is the wrong word to describe when ChatGPT or another generative AI model gets something wrong. Confabulation is a much better word. Confabulating seems to be how most people use large language models, even though enthusiasts want these models to do more than just talk to us. What value is there in an AI confabulator?



DuckDuckGo

https://www.ailog.blog/p/on-confabulation

On Confabulation

Hallucination is the wrong word to describe when ChatGPT or another generative AI model gets something wrong. Confabulation is a much better word. Confabulating seems to be how most people use large language models, even though enthusiasts want these models to do more than just talk to us. What value is there in an AI confabulator?

  • General Meta Tags

    33
    • title
      On Confabulation - by Rob Nelson - 𝐀𝐈 𝐋𝐨𝐠
  • Open Graph Meta Tags

    5
    • og:url
      https://www.ailog.blog/p/on-confabulation
    • og:type
      article
    • og:title
      On Confabulation
    • og:description
      Hallucination is the wrong word to describe when ChatGPT or another generative AI model gets something wrong. Confabulation is a much better word. Confabulating seems to be how most people use large language models, even though enthusiasts want these models to do more than just talk to us. What value is there in an AI confabulator?
    • og:image
      https://substackcdn.com/image/fetch/$s_!zPpt!,w_1200,h_600,c_fill,f_jpg,q_auto:good,fl_progressive:steep,g_auto/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F17b6a262-d9b1-46a8-a42a-0da03fc43986_4000x4858.jpeg
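A preview tool derives the values above by parsing `<meta property="og:…">` tags out of the page head. As a minimal sketch (using only Python's standard-library `html.parser`, with a shortened example head reusing the values reported here), extraction looks like:

```python
from html.parser import HTMLParser

class OpenGraphParser(HTMLParser):
    """Collect Open Graph <meta property="og:..."> tags from a page."""

    def __init__(self):
        super().__init__()
        self.og = {}

    def handle_starttag(self, tag, attrs):
        if tag != "meta":
            return
        d = dict(attrs)
        prop = d.get("property", "")
        if prop.startswith("og:"):
            # Later duplicates overwrite earlier ones, matching how
            # most crawlers take the last tag seen.
            self.og[prop] = d.get("content", "")

# Shortened example head with the tag values reported above.
html = """
<head>
  <meta property="og:url" content="https://www.ailog.blog/p/on-confabulation">
  <meta property="og:type" content="article">
  <meta property="og:title" content="On Confabulation">
</head>
"""
parser = OpenGraphParser()
parser.feed(html)
print(parser.og["og:title"])  # → On Confabulation
```

The same pattern applies to the `twitter:*` tags below; only the attribute prefix differs (and Twitter tags conventionally use `name=` rather than `property=`).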
  • Twitter Meta Tags

    4
    • twitter:title
      On Confabulation
    • twitter:description
      Hallucination is the wrong word to describe when ChatGPT or another generative AI model gets something wrong. Confabulation is a much better word. Confabulating seems to be how most people use large language models, even though enthusiasts want these models to do more than just talk to us. What value is there in an AI confabulator?
    • twitter:image
      https://substackcdn.com/image/fetch/$s_!U7fJ!,f_auto,q_auto:best,fl_progressive:steep/https%3A%2F%2Failogblog.substack.com%2Fapi%2Fv1%2Fpost_preview%2F154960340%2Ftwitter.jpg%3Fversion%3D4
    • twitter:card
      summary_large_image
  • Link Tags

    32
    • alternate
      /feed
    • apple-touch-icon
      https://substackcdn.com/image/fetch/$s_!L39x!,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F3bef41fe-4cb2-422a-8694-f515142b375c%2Fapple-touch-icon-57x57.png
    • apple-touch-icon
      https://substackcdn.com/image/fetch/$s_!hKlr!,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F3bef41fe-4cb2-422a-8694-f515142b375c%2Fapple-touch-icon-60x60.png
    • apple-touch-icon
      https://substackcdn.com/image/fetch/$s_!cQo2!,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F3bef41fe-4cb2-422a-8694-f515142b375c%2Fapple-touch-icon-72x72.png
    • apple-touch-icon
      https://substackcdn.com/image/fetch/$s_!IC9F!,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F3bef41fe-4cb2-422a-8694-f515142b375c%2Fapple-touch-icon-76x76.png

Links

45