blog.plumerai.com/2021/10/cortex-m-inference-software

A preview of the meta tags served by the blog.plumerai.com website.

Linked Hostnames (7)


Search Engine Appearance

Google

https://blog.plumerai.com/2021/10/cortex-m-inference-software

The world’s fastest deep learning inference software for Arm Cortex-M | Plumerai Blog

Plumerai’s deep learning inference software has 40% lower latency and requires 49% less RAM than TensorFlow Lite for Microcontrollers with Arm’s CMSIS-NN kernels while retaining the same accuracy, making it the fastest and most memory-efficient deep learning inference software for Arm Cortex-M.



Bing and DuckDuckGo

Identical to the Google preview above: same title, URL, and description.

  • General Meta Tags (4)
    • title
      The world’s fastest deep learning inference software for Arm Cortex-M | Plumerai Blog
    • viewport
      width=device-width, initial-scale=1.0
    • title
      The world’s fastest deep learning inference software for Arm Cortex-M | Plumerai Blog
    • description
      Plumerai’s deep learning inference software has 40% lower latency and requires 49% less RAM than TensorFlow Lite for Microcontrollers with Arm’s CMSIS-NN kernels while retaining the same accuracy, making it the fastest and most memory-efficient deep learning inference software for Arm Cortex-M.
  • Open Graph Meta Tags (3)
    • og:title
      The world’s fastest deep learning inference software for Arm Cortex-M | Plumerai Blog
    • og:description
      Plumerai’s deep learning inference software has 40% lower latency and requires 49% less RAM than TensorFlow Lite for Microcontrollers with Arm’s CMSIS-NN kernels while retaining the same accuracy, making it the fastest and most memory-efficient deep learning inference software for Arm Cortex-M.
    • og:image
      https://blog.plumerai.com/images/cortex-m-inference-software/social-image.png
  • Twitter Meta Tags (5)
    • twitter:site
      @plumerai
    • twitter:card
      summary_large_image
    • twitter:title
      The world’s fastest deep learning inference software for Arm Cortex-M
    • twitter:description
      Plumerai’s deep learning inference software has 40% lower latency and requires 49% less RAM than TensorFlow Lite for Microcontrollers with Arm’s CMSIS-NN kernels while retaining the same accuracy, making it the fastest and most memory-efficient deep learning inference software for Arm Cortex-M.
    • twitter:image
      https://blog.plumerai.com/images/cortex-m-inference-software/social-image.png
  • Link Tags (12 reported, 5 shown)
    • alternate icon
      /images/favicon.ico
    • apple-touch-icon
      /images/apple-touch-icon.png
    • icon
      /images/favicon.svg
    • icon
      /images/favicon-32x32.png
    • icon
      /images/favicon-16x16.png
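
Reassembled as HTML, the tags listed above would sit in the page's `<head>` roughly as follows. This is a reconstruction from the reported values, not a copy of the page source: the long descriptions are shortened with `…` for readability, and the `type` and `sizes` attributes on the icon links are assumptions inferred from the filenames.

```html
<head>
  <!-- General meta tags (description shortened here with “…”) -->
  <title>The world’s fastest deep learning inference software for Arm Cortex-M | Plumerai Blog</title>
  <meta name="viewport" content="width=device-width, initial-scale=1.0">
  <meta name="description" content="Plumerai’s deep learning inference software has 40% lower latency…">

  <!-- Open Graph meta tags, used by Facebook, LinkedIn, and others for link previews -->
  <meta property="og:title" content="The world’s fastest deep learning inference software for Arm Cortex-M | Plumerai Blog">
  <meta property="og:description" content="Plumerai’s deep learning inference software has 40% lower latency…">
  <meta property="og:image" content="https://blog.plumerai.com/images/cortex-m-inference-software/social-image.png">

  <!-- Twitter card meta tags; summary_large_image renders the og/twitter image full-width -->
  <meta name="twitter:site" content="@plumerai">
  <meta name="twitter:card" content="summary_large_image">
  <meta name="twitter:title" content="The world’s fastest deep learning inference software for Arm Cortex-M">
  <meta name="twitter:description" content="Plumerai’s deep learning inference software has 40% lower latency…">
  <meta name="twitter:image" content="https://blog.plumerai.com/images/cortex-m-inference-software/social-image.png">

  <!-- Link tags (only the 5 reported of 12; type/sizes attributes are assumptions) -->
  <link rel="alternate icon" href="/images/favicon.ico">
  <link rel="apple-touch-icon" href="/images/apple-touch-icon.png">
  <link rel="icon" type="image/svg+xml" href="/images/favicon.svg">
  <link rel="icon" type="image/png" sizes="32x32" href="/images/favicon-32x32.png">
  <link rel="icon" type="image/png" sizes="16x16" href="/images/favicon-16x16.png">
</head>
```

Listing the SVG icon alongside PNG fallbacks is a common pattern: browsers that understand `image/svg+xml` pick the scalable icon, while older ones fall back to the sized PNGs or the `.ico`.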

Links (19)