Jim Liang, Get Started with Machine Learning — Study Notes, Part 2 (pages 311 to 520)

∷ Neural Network

The neuron is the basic computational unit of the brain. A neuron, often called a node or unit, receives input from some other neurons, or from an external source, and computes an output. Each input has an associated weight (w), which is assigned on the basis of its relative importance to the other inputs, and the node applies a function to the weighted sum of its inputs. The idea is that the synaptic strengths (the weights w) are learnable and control the strength of influence, and its direction, of one neuron on another: excitatory (positive weight) or inhibitory (negative weight). In the basic model, the dendrites carry the signal to the cell body, where the signals all get summed. If the final sum is above a certain threshold, the neuron can fire, sending a spike along its axon. Artificial neural networks are computing systems inspired by these biological neural networks.

[Figure: a biological neuron (left), with impulses carried toward the cell body along the dendrites and away from it along the axon, and a common mathematical model (right), in which each input xᵢ is multiplied by a weight wᵢ, the products are summed together with a bias b, and an activation function f produces the output f(Σᵢ wᵢxᵢ + b).]

∷ Neural Network — Artificial neural network (ANN)

Inspired by the human brain, an ANN is composed of a network of artificial neurons (also known as "nodes"). These nodes are connected to each other, and each connection is assigned a value representing its strength; a high value indicates a strong connection. The network is trained by iteratively modifying the strengths of the connections so that given inputs map to the correct response.

∷ Neural Network — Feedforward Neural Network

A feedforward neural network is an artificial neural network whose connections between the units do not form a cycle. In this network, the information moves in only one direction, forward: from the input nodes, through the hidden nodes (if any), and to the output nodes. There are no cycles or loops in the network.

[Figure: a feedforward network with an input layer, two hidden layers, and an output layer.]

All these connections have weights associated with them; w is the weight, which represents the strength of a connection.
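To make the one-directional flow concrete, here is a minimal NumPy sketch of a forward pass through the kind of network pictured above (one input layer, two hidden layers, one output layer). It is not code from the notes: the layer sizes, the random weights, and the choice of a ReLU activation are arbitrary assumptions made for illustration.

import numpy as np

def relu(z):
    # Non-linear activation applied element-wise (an arbitrary choice for this sketch)
    return np.maximum(0.0, z)

def forward(x, weights, biases):
    # One forward pass: information flows input -> hidden -> output, with no cycles.
    a = x
    for W, b in zip(weights, biases):
        # Weighted sum of the previous layer's outputs, then the activation function.
        # (Applying the same activation at the output layer is a simplification.)
        a = relu(W @ a + b)
    return a

rng = np.random.default_rng(0)
layer_sizes = [3, 4, 4, 2]   # input, hidden 1, hidden 2, output (made-up sizes)
weights = [rng.normal(size=(m, n)) for n, m in zip(layer_sizes[:-1], layer_sizes[1:])]
biases = [np.zeros(m) for m in layer_sizes[1:]]

x = np.array([0.5, -1.2, 3.0])       # an arbitrary input vector
print(forward(x, weights, biases))   # activations of the two output nodes

Each `W @ a + b` is exactly the weighted sum described in the next section; the activation function is applied to it before the result moves forward to the next layer.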
∷ Neural Network — When neural networks are best used

Best used:
- for modeling highly non-linear systems;
- when data is available incrementally and you wish to constantly update the model;
- when there could be unexpected changes in your input data;
- when model interpretability is not a key concern.

This high performance doesn't come for free, though. Neural networks can take a long time to train, particularly for large data sets with lots of features. They also have more parameters than most algorithms, which means that parameter sweeping expands the training time.

[Figure: towns reporting snow plotted by latitude, with the decision regions found by a neural network for the classification problem; the boundaries learned by neural networks can be complex and irregular.]

∷ Neural Network — How does an artificial neuron work?

Artificial neurons are the most basic units of information processing in an artificial neural network. Each neuron takes information from a set of neurons, processes it, and gives the result to another neuron for further processing.

The artificial neuron receives one or more inputs and sums them to produce an output. Usually each input is separately weighted (each input has a weight associated with it), and the sum is passed through a non-linear function known as an activation function or transfer function.

[Figure: basic structure of an artificial neuron — inputs xᵢ, weights wᵢ, a summation step, and an activation function producing the output.]

∷ Neural Network — How does an artificial neuron work? (continued)

All the inputs xᵢ are multiplied by their weights wᵢ and added together with the bias b:

weighted sum = b·1 + x₁·w₁ + … + xₙ·wₙ = b + Σᵢ₌₁ⁿ xᵢwᵢ

The weighted sum is then passed to the activation function to produce the output f(x). There are many types of activation function; one of the simplest is the step function. For example, using the Heaviside step function as the activation function, the neuron outputs 1 if the input is at or above a certain threshold, and 0 otherwise:

Heaviside(x) = 1 for x ≥ 0, and 0 for x < 0
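As a small illustration of the weighted-sum-plus-activation computation just described, here is a sketch of a single artificial neuron using the Heaviside step function. The particular inputs, weights, and bias are made-up values, not anything taken from the notes.

import numpy as np

def heaviside_step(z, threshold=0.0):
    # Outputs 1 when the weighted sum reaches the threshold, otherwise 0
    return 1 if z >= threshold else 0

def neuron_output(x, w, b):
    # Weighted sum b + sum_i x_i * w_i, then the activation function
    weighted_sum = b + np.dot(x, w)
    return heaviside_step(weighted_sum)

x = np.array([1.0, 0.0, 1.0])    # example inputs (made up)
w = np.array([0.6, -0.4, 0.3])   # one weight per input, reflecting relative importance
b = -0.5                         # bias term

print(neuron_output(x, w, b))    # 0.6 + 0.3 - 0.5 = 0.4 >= 0, so the neuron "fires": 1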
∷ Neural Network — Single-layer Perceptron

Let's take a look at the Perceptron. As a linear classifier, the Perceptron is the simplest feedforward neural network. It is also termed the single-layer perceptron, to distinguish it from a multi-layer neural network: it does not contain any hidden layers, which means it consists of only a single layer of output nodes.

A Perceptron can be defined as a single artificial neuron that computes its weighted input and uses a threshold activation function to output a quantity. It is also called a TLU (Threshold Logic Unit).

Where do we use the Perceptron? The Perceptron is usually used to classify data into two parts; therefore, it is also known as a Linear Binary Classifier.

∷ Neural Network — Limitation of the Perceptron

A single-layer perceptron can only learn linearly separable problems. If the vectors are not linearly separable, learning will never reach a point where all vectors are classified properly.

The Boolean AND and Boolean OR functions are linearly separable, whereas the XOR (exclusive or) function is not: its positive and negative instances cannot be separated by a line or hyperplane, so a single perceptron cannot learn it.

Even when the data set is linearly separable, there may be many solutions, and which one is found will depend on the initialization of the parameters and on the order of presentation of the data points. Furthermore, for data sets that are not linearly separable, the perceptron learning algorithm will never converge (a concrete sketch of this behaviour on AND and XOR follows the next section).

∷ Neural Network — Multi-layer Perceptron can solve non-linearly separable problems

A multi-layer perceptron (MLP) is a class of feedforward artificial neural network. Given a set of features X = x₁, x₂, …, xₘ and a target y, a multi-layer perceptron can learn a non-linear function approximator for either classification or regression.

[Figure: a single-layer perceptron (input layer connected directly to the output layer) beside a multi-layer perceptron with an input layer, hidden layers, and an output layer.]

- Single-Layer Perceptron: it does not contain any hidden layer. It performs binary classification (either this or that), but it cannot learn non-linearly separable problems.
- Multi-Layer Perceptron (MLP): it contains hidden layers. It is able to learn not only linear problems but also non-linear problems (in most cases the data is not linearly separable). An MLP output neuron is free to perform either classification or regression, depending on its activation function.
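To make the linear-separability limitation concrete, below is a minimal sketch of the classic perceptron learning rule applied to the Boolean AND and XOR functions. It is an illustration rather than the notes' own code; the learning rate and epoch count are arbitrary choices.

import numpy as np

def train_perceptron(X, y, epochs=20, lr=0.1):
    # Classic perceptron learning rule; returns weights, bias and the last epoch's error count.
    w = np.zeros(X.shape[1])
    b = 0.0
    for _ in range(epochs):
        errors = 0
        for xi, target in zip(X, y):
            pred = 1 if (np.dot(w, xi) + b) >= 0 else 0   # threshold (Heaviside) activation
            update = lr * (target - pred)
            w += update * xi
            b += update
            errors += int(update != 0.0)
        if errors == 0:   # converged: every point classified correctly
            break
    return w, b, errors

X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]])
y_and = np.array([0, 0, 0, 1])   # linearly separable
y_xor = np.array([0, 1, 1, 0])   # not linearly separable

print(train_perceptron(X, y_and))   # error count reaches 0: the rule converges
print(train_perceptron(X, y_xor))   # error count never reaches 0, no matter how many epochs

On AND the error count drops to zero within a few epochs; on XOR at least one point is always misclassified during every pass, so the loop never terminates early — the non-convergence behaviour described above.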

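Continuing the sketch, a multi-layer perceptron with a single hidden layer can fit XOR. This example assumes scikit-learn is installed; the hidden-layer size, activation, solver, and random seed are arbitrary choices, and results can vary between runs.

from sklearn.neural_network import MLPClassifier

X = [[0, 0], [0, 1], [1, 0], [1, 1]]
y_xor = [0, 1, 1, 0]   # not linearly separable

# One hidden layer is enough to carve out a non-linear decision boundary.
# (Hidden size, solver and random_state are arbitrary for this sketch;
# training can land in a poor local minimum on some seeds.)
mlp = MLPClassifier(hidden_layer_sizes=(4,), activation="tanh",
                    solver="lbfgs", random_state=0, max_iter=2000)
mlp.fit(X, y_xor)

print(mlp.predict(X))        # typically [0, 1, 1, 0]
print(mlp.score(X, y_xor))   # typically 1.0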