
web.archive.org/web/20230323055248/https:/ai.facebook.com/blog/large-language-model-llama-meta-ai
Preview meta tags from the web.archive.org website.
Linked Hostnames (1)
- web.archive.org

Thumbnail

Search Engine Appearance
https://web.archive.org/web/20230323055248/https:/ai.facebook.com/blog/large-language-model-llama-meta-ai
Introducing LLaMA: A foundational, 65-billion-parameter language model
Today, we’re releasing our LLaMA (Large Language Model Meta AI) foundational model with a gated release. LLaMA is more efficient and competitive with...
Bing
Introducing LLaMA: A foundational, 65-billion-parameter language model
https://web.archive.org/web/20230323055248/https:/ai.facebook.com/blog/large-language-model-llama-meta-ai
Today, we’re releasing our LLaMA (Large Language Model Meta AI) foundational model with a gated release. LLaMA is more efficient and competitive with...
DuckDuckGo
https://web.archive.org/web/20230323055248/https:/ai.facebook.com/blog/large-language-model-llama-meta-ai
Introducing LLaMA: A foundational, 65-billion-parameter language model
Today, we’re releasing our LLaMA (Large Language Model Meta AI) foundational model with a gated release. LLaMA is more efficient and competitive with...
General Meta Tags (14)
- title: Introducing LLaMA: A foundational, 65-billion-parameter language model
- charset: utf-8
- referrer: default
- description: Today, we’re releasing our LLaMA (Large Language Model Meta AI) foundational model with a gated release. LLaMA is more efficient and competitive with...
- viewport: width=device-width, initial-scale=1
Open Graph Meta Tags (1)
- og:image: https://web.archive.org/web/20230323054638im_/https://scontent-sjc3-1.xx.fbcdn.net/v/t39.2365-6/333095137_1286826058904423_4144395724304288774_n.png?_nc_cat=110&ccb=1-7&_nc_sid=ad8a9d&_nc_ohc=W1oBMAAkxWEAX_QIwUF&_nc_ht=scontent-sjc3-1.xx&oh=00_AfDpu71hoQGeyZcHVrvnZEfnXpDPQQb-JUAMPGIOaCLtjg&oe=6421466B
Twitter Meta Tags (1)
- twitter:card: summary
Link Tags (41)
- canonical: https://web.archive.org/web/20230323054638/https://ai.facebook.com/blog/large-language-model-llama-meta-ai/
- preload: https://web.archive.org/web/20230323054638/https://static.xx.fbcdn.net/rsrc.php/v3/yG/l/0,cross/XkHwTQuwprp.css?_nc_x=Ij3Wp8lg5Kz
- preload: https://web.archive.org/web/20230323054638/https://static.xx.fbcdn.net/rsrc.php/v3/yA/l/0,cross/GrvtJE3Ls_C.css?_nc_x=Ij3Wp8lg5Kz
- preload: https://web.archive.org/web/20230323054638/https://static.xx.fbcdn.net/rsrc.php/v3/yU/l/0,cross/oby7JQ9EDbX.css?_nc_x=Ij3Wp8lg5Kz
- preload: https://web.archive.org/web/20230323054638/https://static.xx.fbcdn.net/rsrc.php/v3/yV/l/0,cross/KxGJ10xTR_J.css?_nc_x=Ij3Wp8lg5Kz
Links (42)
- https://web.archive.org/web/20230323054638/https://ai.facebook.com
- https://web.archive.org/web/20230323054638/https://ai.facebook.com/blog
- https://web.archive.org/web/20230323054638/https://ai.facebook.com/blog/ai-math-theorem-proving
- https://web.archive.org/web/20230323054638/https://ai.facebook.com/blog/democratizing-access-to-large-scale-language-models-with-opt-175b
- https://web.archive.org/web/20230323054638/https://ai.facebook.com/blog/dino-paws-computer-vision-with-self-supervised-transformers-and-10x-more-efficient-training
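
For reference, a minimal Python sketch of how meta tags, link tags, and links like those listed above can be pulled from the archived page. It assumes the requests and beautifulsoup4 packages are available; the output format is illustrative, not the preview tool's own implementation.

```python
# Illustrative sketch: list meta tags, link tags, and outbound links of the archived page.
# Assumes requests and beautifulsoup4 are installed (pip install requests beautifulsoup4).
import requests
from bs4 import BeautifulSoup

ARCHIVE_URL = (
    "https://web.archive.org/web/20230323055248/"
    "https://ai.facebook.com/blog/large-language-model-llama-meta-ai"
)

def preview_tags(url: str) -> None:
    html = requests.get(url, timeout=30).text
    soup = BeautifulSoup(html, "html.parser")

    # General, Open Graph, and Twitter meta tags (name/property -> content).
    for tag in soup.find_all("meta"):
        if tag.get("charset"):
            print(f"meta  charset: {tag['charset']}")
        else:
            key = tag.get("name") or tag.get("property")
            if key:
                print(f"meta  {key}: {tag.get('content', '')}")

    # Link tags such as canonical and preload (rel -> href).
    for tag in soup.find_all("link"):
        rel = " ".join(tag.get("rel", []))
        print(f"link  {rel}: {tag.get('href')}")

    # Outbound links, as in the "Links" section above.
    for a in soup.find_all("a", href=True):
        print(f"a     {a['href']}")

if __name__ == "__main__":
    preview_tags(ARCHIVE_URL)
```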