ieeexplore.ieee.org/document/7844355
Preview meta tags from the ieeexplore.ieee.org website.
Linked Hostnames (2)
Thumbnail

Search Engine Appearance
SHTM: A neocortex-inspired algorithm for one-shot text generation
Text generation is a typical natural language processing task and is the basis of machine translation and question answering. Deep learning techniques can achieve good performance on this task provided that a huge number of parameters and a mass of data are available for training. However, human beings do not learn in this way: people combine previously learned knowledge with something new from only a few samples. This process is called one-shot learning. In this paper, we propose a neocortex-based computational model, the Semantic Hierarchical Temporal Memory model (SHTM), for one-shot text generation. The model is refined from the Hierarchical Temporal Memory model. LSTM is used for a comparative study. Results on three public datasets show that SHTM performs much better than LSTM on the measures of mean precision and BLEU score. In addition, we utilize the SHTM model to do question answering in the fashion of text generation and verify its superiority.
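The abstract evaluates generated text with the BLEU score. As a minimal illustration of that metric (a standard single-reference BLEU sketch in pure Python, with add-one smoothing; this is not the paper's own evaluation script, and the example sentences below are hypothetical):

```python
import math
from collections import Counter

def ngrams(tokens, n):
    """All contiguous n-grams of a token list, as tuples."""
    return [tuple(tokens[i:i + n]) for i in range(len(tokens) - n + 1)]

def sentence_bleu(reference, hypothesis, max_n=4):
    """BLEU with uniform weights up to max_n and a brevity penalty.

    `reference` and `hypothesis` are token lists. Add-one smoothing keeps
    a single empty n-gram order from zeroing out the whole score.
    """
    precisions = []
    for n in range(1, max_n + 1):
        hyp_counts = Counter(ngrams(hypothesis, n))
        ref_counts = Counter(ngrams(reference, n))
        # Clipped overlap: each hypothesis n-gram counts at most as often
        # as it appears in the reference.
        overlap = sum(min(c, ref_counts[g]) for g, c in hyp_counts.items())
        total = max(sum(hyp_counts.values()), 1)
        precisions.append((overlap + 1) / (total + 1))
    geo_mean = math.exp(sum(math.log(p) for p in precisions) / max_n)
    # Brevity penalty punishes hypotheses shorter than the reference.
    bp = min(1.0, math.exp(1 - len(reference) / max(len(hypothesis), 1)))
    return bp * geo_mean

ref = "the cat sat on the mat".split()
hyp = "the cat sat on the mat".split()
print(round(sentence_bleu(ref, hyp), 3))  # → 1.0
```

A perfect match scores 1.0; shorter or divergent hypotheses score strictly below it, which is what makes BLEU usable for comparing SHTM against an LSTM baseline.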
General Meta Tags (12)
- title: SHTM: A neocortex-inspired algorithm for one-shot text generation | IEEE Conference Publication | IEEE Xplore
- google-site-verification: qibYCgIKpiVF_VVjPYutgStwKn-0-KBB6Gw4Fc57FZg
- Description: Text generation is a typical nature language processing task, and is the basis of machine translation and question answering. Deep learning techniques can get g
- Content-Type: text/html; charset=utf-8
- viewport: width=device-width, initial-scale=1.0
Open Graph Meta Tags (3)
- og:image: https://ieeexplore.ieee.org/assets/img/ieee_logo_smedia_200X200.png
- og:title: SHTM: A neocortex-inspired algorithm for one-shot text generation
- og:description: Text generation is a typical natural language processing task and is the basis of machine translation and question answering. Deep learning techniques can achieve good performance on this task provided that a huge number of parameters and a mass of data are available for training. However, human beings do not learn in this way: people combine previously learned knowledge with something new from only a few samples. This process is called one-shot learning. In this paper, we propose a neocortex-based computational model, the Semantic Hierarchical Temporal Memory model (SHTM), for one-shot text generation. The model is refined from the Hierarchical Temporal Memory model. LSTM is used for a comparative study. Results on three public datasets show that SHTM performs much better than LSTM on the measures of mean precision and BLEU score. In addition, we utilize the SHTM model to do question answering in the fashion of text generation and verify its superiority.
Twitter Meta Tags (1)
- twitter:card: summary
Link Tags (9)
- canonical: https://ieeexplore.ieee.org/document/7844355
- icon: /assets/img/favicon.ico
- stylesheet: https://ieeexplore.ieee.org/assets/css/osano-cookie-consent-xplore.css
- stylesheet: /assets/css/simplePassMeter.min.css?cv=20250812_00000
- stylesheet: /assets/dist/ng-new/styles.css?cv=20250812_00000
Links (17)
- http://www.ieee.org/about/help/security_privacy.html
- http://www.ieee.org/web/aboutus/whatis/policies/p9-26.html
- https://ieeexplore.ieee.org/Xplorehelp
- https://ieeexplore.ieee.org/Xplorehelp/overview-of-ieee-xplore/about-ieee-xplore
- https://ieeexplore.ieee.org/Xplorehelp/overview-of-ieee-xplore/accessibility-statement