
blog.chewxy.com/2016/12/06/a-direct-way-of-understanding-backpropagation-and-gradient-descent
Preview meta tags from the blog.chewxy.com website.
Linked Hostnames (12)
- 6 links to blog.chewxy.com
- 5 links to github.com
- 1 link to deanattali.com
- 1 link to deeplearning.net
- 1 link to disqus.com
- 1 link to en.wikipedia.org
- 1 link to gohugo.io
- 1 link to karpathy.github.io
Search Engine Appearance
A Direct Way of Understanding Backpropagation and Gradient Descent
Summary: I believe that there are better representations of neural networks that aid in faster understanding of backpropagation and gradient descent. I find that representing neural networks as equation graphs, combined with the values at run-time, helps engineers who don't have the necessary background in machine learning get up to speed faster. In this post, I generate a few graphs with Gorgonia to illustrate the examples. Backpropagation has been explained to death. It really is as simple as applying the chain rule to compute gradients. However, in my recent adventures, I have found that this explanation isn't intuitive to people who want to just get shit done. As part of my consultancy (hire me!) job (really, I need to pay the bills to get my startup funded), I provide a brief 1-3 day machine learning course to engineers who will maintain the algorithms that I designed. Whilst most of the work I do doesn't use neural networks (people who think deep learning can solve every problem are either people with deep pockets aiming to solve a very general problem, or people who don't understand the hype; I have found that most businesses do not have problems that involve a lot of non-linearities, and in fact a large majority of problems can be solved with linear regression), recently there was a case where deep neural networks were involved. This blog post documents what I found was useful to explain neural networks, backpropagation and gradient descent. It's not meant to be super heavy with theory - think of it as an enabler for an engineer to hit the ground running when dealing with deep networks. I may elide some details, so some basic understanding/familiarity of neural networks is recommended.
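The post itself renders its equation graphs with Gorgonia; as a library-free sketch of the same idea, here is a tiny equation graph evaluated forward, differentiated backward with the chain rule, and nudged by one gradient-descent step, in plain Go. The toy expression z = (x*y + x)^2, the variable names, and the learning rate are my own illustrative choices, not the post's actual example.

```go
package main

import "fmt"

func main() {
	// Toy equation graph for z = (x*y + x)^2, broken into nodes:
	// a = x*y, b = a + x, z = b*b
	x, y := 2.0, 3.0
	lr := 0.01 // learning rate for one gradient-descent step

	// Forward pass: compute the value at every node.
	a := x * y
	b := a + x
	z := b * b

	// Backward pass: apply the chain rule from z back to the leaves.
	dzdb := 2 * b             // z = b*b        => dz/db = 2b
	dzda := dzdb * 1          // b = a + x      => db/da = 1
	dzdx := dzdb*1 + dzda*y   // x feeds b directly (db/dx = 1) and via a (da/dx = y)
	dzdy := dzda * x          // a = x*y        => da/dy = x

	// One step of gradient descent: move each leaf against its gradient.
	x -= lr * dzdx
	y -= lr * dzdy

	fmt.Printf("z = %.2f, dz/dx = %.2f, dz/dy = %.2f\n", z, dzdx, dzdy)
	fmt.Printf("after one step: x = %.4f, y = %.4f\n", x, y)
}
```

With x = 2 and y = 3 this prints z = 64, dz/dx = 64, dz/dy = 32, then x = 1.36 and y = 2.68; a graph-based library like Gorgonia automates exactly this bookkeeping over much larger graphs.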
General Meta Tags (67)
- title: A Direct Way of Understanding Backpropagation and Gradient Descent - Bigger on the Inside
Open Graph Meta Tags (5)
- og:title: A Direct Way of Understanding Backpropagation and Gradient Descent
- og:description: Summary: I believe that there are better representations of neural networks that aid in faster understanding of backpropagation and gradient descent. …
- og:url: https://blog.chewxy.com/2016/12/06/a-direct-way-of-understanding-backpropagation-and-gradient-descent/
- og:type: website
- og:site_name: Bigger on the Inside
Twitter Meta Tags (5)
- twitter:title: A Direct Way of Understanding Backpropagation and Gradient Descent
- twitter:description: Summary: I believe that there are better representations of neural networks that aid in faster understanding of backpropagation and gradient descent. I find representing neural networks as equation …
- twitter:card: summary
- twitter:site: @chewxy
- twitter:creator: @chewxy
Link Tags (13)
- alternate: https://blog.chewxy.com/index.xml
- stylesheet: https://cdnjs.cloudflare.com/ajax/libs/KaTeX/0.10.0/katex.min.css
- stylesheet: https://use.fontawesome.com/releases/v5.5.0/css/all.css
- stylesheet: https://maxcdn.bootstrapcdn.com/bootstrap/3.3.7/css/bootstrap.min.css
- stylesheet: https://blog.chewxy.com/css/main.css
Links (21)
- http://deeplearning.net/software/theano
- http://karpathy.github.io
- http://lambda-the-ultimate.org/node/5335
- http://neuralnetworksanddeeplearning.com
- https://blog.chewxy.com