ieeexplore.ieee.org/document/7844355

Preview meta tags from the ieeexplore.ieee.org website.

Linked Hostnames: 2


Search Engine Appearance

Google

https://ieeexplore.ieee.org/document/7844355

SHTM: A neocortex-inspired algorithm for one-shot text generation

Text generation is a typical natural language processing task and the basis of machine translation and question answering. Deep learning techniques can achieve good performance on this task provided that a huge number of parameters and a mass of training data are available. However, human beings do not learn in this way: people combine previously learned knowledge with something new, using only a few samples. This process is called one-shot learning. In this paper, we propose a neocortex-based computational model, the Semantic Hierarchical Temporal Memory model (SHTM), for one-shot text generation. The model is refined from the Hierarchical Temporal Memory model; LSTM is used for comparative study. Results on three public datasets show that SHTM performs much better than LSTM on the measures of mean precision and BLEU score. In addition, we apply the SHTM model to question answering framed as text generation and verify its superiority.
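The abstract above reports mean precision and BLEU as the comparison metrics. As a rough illustrative sketch (not the paper's evaluation code), a minimal sentence-level BLEU with clipped n-gram precision and a brevity penalty can be written with only the standard library:

```python
from collections import Counter
import math

def ngram_counts(tokens, n):
    """Count the n-grams (as tuples) in a token list."""
    return Counter(tuple(tokens[i:i + n]) for i in range(len(tokens) - n + 1))

def bleu(candidate, reference, max_n=2):
    """Sentence-level BLEU: geometric mean of clipped n-gram
    precisions (n = 1..max_n) times a brevity penalty."""
    cand, ref = candidate.split(), reference.split()
    precisions = []
    for n in range(1, max_n + 1):
        c_counts = ngram_counts(cand, n)
        r_counts = ngram_counts(ref, n)
        # Clip each candidate n-gram count by its count in the reference.
        overlap = sum(min(c, r_counts[g]) for g, c in c_counts.items())
        total = max(sum(c_counts.values()), 1)
        precisions.append(overlap / total)
    if min(precisions) == 0:
        return 0.0
    log_avg = sum(math.log(p) for p in precisions) / max_n
    # Brevity penalty: punish candidates shorter than the reference.
    bp = 1.0 if len(cand) > len(ref) else math.exp(1 - len(ref) / max(len(cand), 1))
    return bp * math.exp(log_avg)
```

A perfect match scores 1.0 and a candidate sharing no n-grams with the reference scores 0.0; real evaluations typically use max_n=4 with smoothing and multiple references.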



The Bing and DuckDuckGo appearances show the same URL, title, and description as Google.

  • General Meta Tags (12)
    • title
      SHTM: A neocortex-inspired algorithm for one-shot text generation | IEEE Conference Publication | IEEE Xplore
    • google-site-verification
      qibYCgIKpiVF_VVjPYutgStwKn-0-KBB6Gw4Fc57FZg
    • Description
      Text generation is a typical nature language processing task, and is the basis of machine translation and question answering. Deep learning techniques can get g
    • Content-Type
      text/html; charset=utf-8
    • viewport
      width=device-width, initial-scale=1.0
  • Open Graph Meta Tags (3)
    • og:image
      https://ieeexplore.ieee.org/assets/img/ieee_logo_smedia_200X200.png
    • og:title
      SHTM: A neocortex-inspired algorithm for one-shot text generation
    • og:description
      Text generation is a typical nature language processing task, and is the basis of machine translation and question answering. Deep learning techniques can get good performance on this task under the condition that huge number of parameters and mass of data are available for training. However, human beings do not learn in this way. People combine knowledge learned before and something new with only few samples. This process is called one-shot learning. In this paper, we propose a neocortex based computational model, Semantic Hierarchical Temporal Memory model (SHTM), for one-shot text generation. The model is refined from Hierarchical Temporal Memory model. LSTM is used for comparative study. Results on three public datasets show that SHTM performs much better than LSTM on the measures of mean precision and BLEU score. In addition, we utilize SHTM model to do question answering in the fashion of text generation and verifying its superiority.
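The Open Graph entries listed above are ordinary `<meta property="og:…" content="…">` elements in the page head. As a small illustration (the HEAD string below is a trimmed assumption, not the page's actual markup), they can be extracted with Python's standard-library HTML parser:

```python
from html.parser import HTMLParser

# Assumed, abbreviated head markup using the tag values listed above.
HEAD = """
<head>
<meta property="og:image" content="https://ieeexplore.ieee.org/assets/img/ieee_logo_smedia_200X200.png">
<meta property="og:title" content="SHTM: A neocortex-inspired algorithm for one-shot text generation">
<meta name="twitter:card" content="summary">
</head>
"""

class OGParser(HTMLParser):
    """Collect og:* properties from <meta> tags into a dict."""
    def __init__(self):
        super().__init__()
        self.og = {}

    def handle_starttag(self, tag, attrs):
        a = dict(attrs)
        if tag == "meta" and a.get("property", "").startswith("og:"):
            self.og[a["property"]] = a.get("content", "")

p = OGParser()
p.feed(HEAD)
print(p.og["og:title"])
```

Note that the `twitter:card` tag uses a `name` attribute rather than `property`, so it is deliberately not collected here.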
  • Twitter Meta Tags (1)
    • twitter:card
      summary
  • Link Tags (9)
    • canonical
      https://ieeexplore.ieee.org/document/7844355
    • icon
      /assets/img/favicon.ico
    • stylesheet
      https://ieeexplore.ieee.org/assets/css/osano-cookie-consent-xplore.css
    • stylesheet
      /assets/css/simplePassMeter.min.css?cv=20250812_00000
    • stylesheet
      /assets/dist/ng-new/styles.css?cv=20250812_00000

Links: 17