
blog.mathieuacher.com/GTP2AndChess
Preview meta tags from the blog.mathieuacher.com website.
Linked Hostnames (7)
- 7 links to blog.mathieuacher.com
- 6 links to lichess.org
- 1 link to en.m.wikipedia.org
- 1 link to fr.wikipedia.org
- 1 link to github.com
- 1 link to twitter.com
- 1 link to www.twitter.com
Search Engine Appearance
GPT-2 and Chess
Shawn Presser has released an intriguing chess engine based on a deep-learning language model (GPT-2). The model was trained on the Kingbase dataset (3.5 million chess games in PGN notation) in 24 hours using 146 TPUs (ouch!). The engine is purely based on text prediction, with no built-in concept of chess. Though GPT-2 has already delivered impressive results for text generation, one can be skeptical and wonder whether it actually works for chess.
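The core idea described above is to treat PGN move sequences as ordinary text and predict the next move from the preceding ones. As a minimal sketch of that idea (not Presser's actual GPT-2 setup; the games below are made-up examples, not the Kingbase dataset), a trivial bigram model over PGN tokens already captures the "next move as next token" framing:

```python
from collections import Counter, defaultdict

# Toy stand-in for GPT-2: count which move follows which in PGN text,
# then predict the most frequent continuation. Hypothetical data.
games = [
    "e4 e5 Nf3 Nc6 Bb5 a6",
    "e4 e5 Nf3 Nc6 Bc4 Bc5",
    "e4 c5 Nf3 d6 d4 cxd4",
]

counts = defaultdict(Counter)
for game in games:
    moves = game.split()
    for prev, nxt in zip(moves, moves[1:]):
        counts[prev][nxt] += 1

def predict_next(move):
    """Return the most frequent move seen after `move`, or None."""
    if move not in counts:
        return None
    return counts[move].most_common(1)[0][0]

print(predict_next("e4"))  # -> "e5" (most frequent reply in the toy data)
```

A real language model conditions on the whole game prefix rather than the last move, which is exactly why GPT-2's long-range text prediction is an interesting (if dubious) fit for chess.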
General Meta Tags (8)
- title: GPT-2 and Chess – Mathieu Acher – Professor in Computer Science
- charset: utf-8
- Content-Type: text/html; charset=utf-8
- X-UA-Compatible: IE=edge
- viewport: width=device-width, initial-scale=1.0, maximum-scale=1.0
Open Graph Meta Tags (2)
- og:description: Shawn Presser has released an intriguing chess engine based on a deep-learning language model (GPT-2). The model was trained on the Kingbase dataset (3.5 million chess games in PGN notation) in 24 hours using 146 TPUs (ouch!). The engine is purely based on text prediction, with no built-in concept of chess. Though GPT-2 has already delivered impressive results for text generation, one can be skeptical and wonder whether it actually works for chess.
- og:title: GPT-2 and Chess
Link Tags (2)
- alternate: /feed.xml
- stylesheet: /style.css
Emails (1)
Links (18)
- https://blog.mathieuacher.com
- https://blog.mathieuacher.com/about
- https://blog.mathieuacher.com/tag/#artificial-intelligence
- https://blog.mathieuacher.com/tag/#chess
- https://blog.mathieuacher.com/tag/#gpt2