
neural networks and deep learning

Machine learning with neural networks breaks down into a few main parts:

Using neural networks to recognize handwritten digits

two important types of artificial neuron (the perceptron and the sigmoid neuron)

$$ output = \begin{cases} 0, & \text{if } \sum_{j} w_{j} x_{j} \leq \text{threshold} \\ 1, & \text{if } \sum_{j} w_{j} x_{j} > \text{threshold} \end{cases} $$

To simplify how we describe perceptrons, introduce a dot product and a bias:

$$ w \cdot x \equiv \sum_{j} w_{j} x_{j}, \qquad b \equiv -\text{threshold} $$

$$ output = \begin{cases} 0, & \text{if } w \cdot x + b \leq 0 \\ 1, & \text{if } w \cdot x + b > 0 \end{cases} $$
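A minimal sketch of a perceptron in Python/NumPy, using the $w \cdot x + b$ form above. The NAND weights and bias follow the book's standard example; the rest is illustrative:

```python
import numpy as np

def perceptron(x, w, b):
    """Fire (output 1) if w . x + b > 0, otherwise output 0."""
    return 1 if np.dot(w, x) + b > 0 else 0

# Example: a two-input perceptron computing NAND,
# with weights (-2, -2) and bias 3.
w = np.array([-2, -2])
b = 3
for x in [(0, 0), (0, 1), (1, 0), (1, 1)]:
    print(x, perceptron(np.array(x), w, b))  # 1, 1, 1, 0
```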

While the design of the input and output layers of a neural network is often straightforward, there can be quite an art to the design of the hidden layers

multilayer perceptrons, or MLPs

feedforward neural networks

In recurrent neural networks (RNNs), feedback loops are possible

Two problems: segmenting an image containing many digits into individual digits, and classifying each individual digit.

There are many approaches to solving the segmentation problem.

One approach is to trial many different ways of segmenting the image, using the individual digit classifier to score each trial segmentation. A trial segmentation gets a high score if the individual digit classifier is confident of its classification in all segments, and a low score if the classifier is having a lot of trouble in one or more segments.

The idea is that if the classifier is having trouble somewhere, then it’s probably having trouble because the segmentation has been chosen incorrectly. This idea and other variations can be used to solve the segmentation problem quite well.
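A hedged sketch of that scoring idea. Here `classify_digit` is a hypothetical per-segment classifier returning its confidence in the most likely digit, and each candidate segmentation is a list of image segments; both names are assumptions for illustration:

```python
def score_segmentation(segments, classify_digit):
    """Score a trial segmentation by the classifier's weakest segment.

    classify_digit(segment) -> (digit, confidence in [0, 1]).
    If any one segment is hard to classify, the split is probably
    wrong there, so the whole segmentation scores low.
    """
    return min(classify_digit(seg)[1] for seg in segments)

def best_segmentation(candidates, classify_digit):
    """Pick the trial segmentation whose worst segment scores highest."""
    return max(candidates,
               key=lambda segs: score_segmentation(segs, classify_digit))
```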

10 outputs vs 4 outputs: ten output neurons (one per digit) learn better in practice than four neurons encoding the digit in binary, even though 2^4 = 16 is enough to cover ten classes.

gradient descent $$ C(w,b) \equiv \frac{1}{2n} \sum_{x} \left\| y(x) - a \right\|^2 $$
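The quadratic cost can be computed directly from the definition. A sketch in Python/NumPy, where `training_inputs`, `y`, and `network_output` are assumed stand-ins for the training data, the desired-output function, and the network's output $a$:

```python
import numpy as np

def quadratic_cost(training_inputs, y, network_output):
    """C(w, b) = (1 / 2n) * sum over x of ||y(x) - a||^2,
    with a = network_output(x) under the current weights and biases."""
    n = len(training_inputs)
    return sum(np.linalg.norm(y(x) - network_output(x)) ** 2
               for x in training_inputs) / (2 * n)
```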

ball-rolling analogy $$ \Delta C \approx \frac{\partial C}{\partial v_1} \Delta v_1 + \frac{\partial C}{\partial v_2} \Delta v_2 $$
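Completing the analogy: writing the change as $\Delta C \approx \nabla C \cdot \Delta v$ and choosing $\Delta v = -\eta \nabla C$ for a small positive learning rate $\eta$ guarantees the ball rolls downhill, since

$$ \Delta C \approx -\eta \left\| \nabla C \right\|^2 \leq 0 $$

which gives the gradient descent update rule $v \rightarrow v' = v - \eta \nabla C$.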

$$ a' = \sigma(w a + b) $$
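Applied layer by layer, that update rule is the whole feedforward pass. A minimal NumPy sketch loosely in the spirit of the book's network.py; the 784-30-10 shape and random parameters here are illustrative, not trained:

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def feedforward(a, weights, biases):
    """Apply a' = sigma(w a + b) once per layer and return the output."""
    for w, b in zip(weights, biases):
        a = sigmoid(np.dot(w, a) + b)
    return a

# Example: a 784-30-10 network with randomly initialized parameters.
sizes = [784, 30, 10]
weights = [np.random.randn(y, x) for x, y in zip(sizes[:-1], sizes[1:])]
biases = [np.random.randn(y, 1) for y in sizes[1:]]
output = feedforward(np.random.randn(784, 1), weights, biases)
```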

How the backpropagation (BP) algorithm works

Improving the way neural networks learn

Neural networks can compute any function

Deep neural networks are hard to train

deep learning

