Artificial Intelligence Full Course 2024 _ AI Tutorial For Beginners _ AI Full Course_ Intellipaat - Ep44

You know the direction, but you cannot take large steps. When you are standing on top of a mountain, you might know that the direction to go downwards is, let's say, towards your front, but you will obviously not run down the hill, right? You are not going to run down the mountain, because you might die. So you know the direction, but you are being cautious; you move in that particular direction slowly.

So, Raju, when you are standing on top of a mountain, that is a three-dimensional thing. On a mountain it is not that you can just step to the front; you have different directions, correct? Let's say you go towards the right-hand side, and after some point the level of the mountain starts to go up. Then what will you do? You might change the direction slightly towards the left-hand side. So when you are going down a mountain, you are keeping in mind all the different directions you might have to change to.

Just like that, if I have 20 weights: if you are able to understand the complexity of coming down a mountain, which is only three dimensions, you should be able to understand how complicated it will be for the gradient of a neural network to identify the direction of the minimum of the cost curve, the minimum error, in a space as big as 20 dimensions. Which weight to increase, which to decrease? That's what I'm saying: you need to differentiate your error with respect to the different weight values. Let's say the differentiation of the error with respect to weight one tells you that you have to move in a positive direction, and the differentiation of the error with respect to weight two tells you that you have to move in some other direction. You are going to consolidate all the moves together, and then you are going to move.

All right, so let's move forward. Now we'll try to understand the mathematics
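"Consolidating all the moves together" is exactly one gradient descent step: compute the partial derivative of the error with respect to every weight, then update all the weights at once, scaled by a small learning rate (the cautious step size from the mountain analogy). Here is a minimal sketch in Python on an assumed 20-weight linear model with squared error; the data, learning rate, and step count are illustrative, not from the lecture:

```python
# Gradient descent sketch: one step updates ALL 20 weights together,
# each in its own direction, scaled by a small (cautious) learning rate.
import numpy as np

rng = np.random.default_rng(0)
w = rng.normal(size=20)            # 20 weights: the 20-dimensional "mountain"
x = rng.normal(size=(100, 20))     # 100 made-up training examples
y = x @ np.ones(20)                # made-up target: true weights are all 1

learning_rate = 0.1                # small step: don't "run down the hill"
for step in range(300):
    error = x @ w - y
    # Partial derivative of mean squared error w.r.t. every weight at once
    grad = 2 * x.T @ error / len(y)
    w -= learning_rate * grad      # consolidate all the moves in one update

print(np.mean((x @ w - y) ** 2))   # the error has shrunk toward zero
```

With a learning rate that is too large, the updates overshoot and the error grows instead of shrinking, which is why the step stays small.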
behind backward propagation, and this is where we'll go for a break, because this is where it will get mathematical. In the next part we are going to understand internally how the neural network model does its calculation: we are going to take one feature, multiply it by the weight values, and calculate the activation function. This is the mathematical part, and it is going to be slightly complicated, so if you're able to follow it, great; even if you're not able to follow it totally, I would... Yes, everything is two, right? Correct. These bottom nodes are what? These are the bias values. And what are these values, 0.15, 0.20, 0.25, 0.30? For our example, these are weight values that have already been predefined as of now; we'll see how these values get multiplied. The inputs, if you see, are i1 = 0.05 and i2 = 0.10, and o1 and o2 are the output values, which are 0.01 and 0.99. I hope everyone is able to understand this.

Okay, now what is the first thing we will do over here? Can you give me the input that will go to node h1? I want you to think about this: what input value will go to node h1? I am only talking about the linear transformation, not the nonlinear transformation. You have all the values we have seen till now. What input is going to go to node h1? What is the value?

All right, this is the image, so let's see what happens in h1. net h1 is whatever is going inside this node. As you guys said, it will be 0.15 × 0.05 plus 0.20 × 0.10. If you look at this, 0.05 × 0.15 is one term, and then 0.10 × 0.20, because this is weight two. They are labelling this weight as w3 and this as w4, so that's where we got confused; I think they have not laid out the weight values in a very clear visual way, but that's okay. I hope this is clear to everyone. And then, if you see the bias, 0.35 × 1.

Okay, so this will go as an input to
the node h1, and this is the linear transformation, correct?

Now, assuming we have the sigmoid activation function inside this particular neuron h1, what will happen? The sigmoid activation function will get the input value of 0.3775, which was the linear transformation. Am I correct? And the output that will come out from h1 will be this: 1 / (1 + e^(−net h1)), because this is the formula for your sigmoid activation function, and net h1 is nothing but 0.3775, which will give me the output as 0.5933. Is this clear to everyone? You can go ahead and do the calculations once we are done with this; if you calculate this, you'll get similar values.

The bias is a constant, so I am giving its feature the value of one only. If you see, the bias is not a feature; it is just a constant value, and if you multiply it by one, it obviously remains the bias. Exactly.

Now we need to do the same thing for the output. Just like we had 0.05 and 0.10 here, now for h1 and h2 we have the values as, let's say, 0.59 and 0.59. So what I have over here now is this value as 0.59, and once again this value is 0.59. What do we need to do next? We need to multiply this 0.59 by the different weight values, add the bias values, and get an output. The output node will also have a function, so we are going to do the same process now. Is this clear to everyone?

All right, so let's go to the next part of this neural network: the output from o1. We need to go to o1 now. What will that be? If h1 and h2 are both giving me 0.59, the output of o1 will be what? 0.59 × 0.40 and
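The forward pass being walked through above can be written out directly. This sketch assumes the diagram values quoted in the lecture (inputs 0.05 and 0.10, hidden-layer weights 0.15/0.20/0.25/0.30, hidden bias 0.35, sigmoid activation); the variable names are mine:

```python
import math

def sigmoid(x):
    # Sigmoid activation: 1 / (1 + e^(-x))
    return 1.0 / (1.0 + math.exp(-x))

# Values from the lecture's diagram
i1, i2 = 0.05, 0.10          # inputs
w1, w2 = 0.15, 0.20          # weights into h1
w3, w4 = 0.25, 0.30          # weights into h2
b1 = 0.35                    # hidden-layer bias (multiplied by a constant 1)

# Linear transformation: weighted sum of the inputs plus the bias
net_h1 = w1 * i1 + w2 * i2 + b1 * 1
net_h2 = w3 * i1 + w4 * i2 + b1 * 1

# Nonlinear transformation: squash the net input through the sigmoid
out_h1 = sigmoid(net_h1)
out_h2 = sigmoid(net_h2)

print(round(net_h1, 4))      # 0.3775
print(round(out_h1, 4))      # 0.5933
print(round(out_h2, 4))      # 0.5969
```

The output layer repeats the same pattern: each hidden output is multiplied by its outgoing weight (e.g. out_h1 × 0.40, as the lecture starts to compute), the output bias is added, and the result goes through the sigmoid again.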
