block, which lets us get this one running. The simple part is to just go over there, where you have the Run Selected Cell option; we select that one and run it. While it runs, note that this line is just a comment, so you can choose to run it, but it doesn't actually do anything. What you would basically be doing is classifying the images into these different kinds of classes. These are all colour RGB images, and they are available directly within the torchvision datasets. We had imported the torchvision datasets over here, so now I can go into datasets, and from there I import the CIFAR-10 data set. Now the point is that it imports locally: either it was imported somewhere earlier, or another folder is created within my local directory. You can see it in your directory anyway, because we did not upload the data set; that's a huge, bulky file. So if you have it already downloaded, it serves the purpose; otherwise you need to download it from scratch. What it does here is go over there and see whether the files are already downloaded and are perfectly fine, and then within the cifar ten batches folder it will be creating my training and test batches over here. OK, so now once that's done, I move back to my main directory, and let's go to the next part. Here what I am trying to do is get the length of the data set; it just converts that to a string and prints it, and the testing data set is of ten thousand images.
Now once that's done, the next part is feature extraction on a single image, so we come down over here. Initially, let's see what these images look like. What I am doing is taking one of these images, which is at the (0, 0) location, so this is the first image present, and it will typically be coming down as some sort of a container. Now the image itself is really fuzzy to understand, but if you really go far off you can basically make it out. For feature extraction, you would need the main image array, so that's present over there. The first parameter here is basically the number of points you would be taking around the central point. You remember clearly from our discussion in the last class on LBP: you would be getting eight such neighbours around that point, which are at a fixed distance of separation. Now, what these functions allow is that you can choose any number of neighbours; you can choose four, five, six, seven. Typically, for the three cross three neighbourhood, that would not fall on a uniform pixel kind of distribution, but you can interpolate and go down to those kinds of forms. So what we choose to do is take a circular neighbourhood, and then we can see what this LBP feature looks like on a point-to-point basis. We compute this one, and it is hard to actually find out whether there is a frog or something or not from so many points; therefore, from the histogram of this, you can get down the energy and entropy as well.
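The circular-neighbourhood LBP and the histogram-based energy and entropy described above can be sketched with scikit-image; the synthetic grayscale image and the choice of P = 8 neighbours at radius 1 are assumptions for illustration.

```python
import numpy as np
from skimage.feature import local_binary_pattern

rng = np.random.default_rng(0)
gray = rng.integers(0, 256, size=(32, 32)).astype(np.uint8)  # stand-in grayscale image

# P neighbours sampled on a circle of radius R around each central pixel;
# off-grid sample positions are interpolated, as discussed in the lecture
P, R = 8, 1
lbp = local_binary_pattern(gray, P, R, method="uniform")

# normalised histogram of LBP codes -> a probability distribution
hist, _ = np.histogram(lbp, bins=int(lbp.max()) + 1)
p = hist / hist.sum()

lbp_energy = np.sum(p ** 2)                    # sum of squared probabilities
lbp_entropy = -np.sum(p * np.log2(p + 1e-12))  # Shannon entropy of the codes
```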
Now once you have all of these, you can basically use energy and entropy as two distinct features. The whole image needs to be represented in terms of single scalar values, and a set of those multiple scalar values will be your features which describe this image. So for that, we just evaluate this part over here, and I get down an LBP energy of this much and an LBP entropy of this much; together those define what is present in this image. OK, now once that goes down, the next part is to find it out on the co-occurrence matrix. In a co-occurrence matrix, what I need to do is get down a few parameters. There is the orientation of your vector, whether it's at zero degrees, forty-five degrees, and so on; the number two fifty-six is basically the number of gray levels you have in your gray-level image; and the remaining arguments are basically to show how to handle the boundary conditions present over there. From this we get scalar values: the first scalar value is basically to get down contrast, the second scalar value is the next measure we are interested in getting, and these are the different measures for that one particular image.
Now from there, the next one is to get into wavelets, and we choose to do it with Gabor filters. As you remember from your Gabor filter equations in the last class, there are certain parameters to set down over there, such as the frequency at which you would like to operate. The other part is the angle at which it is located, and there are other variables which you can also choose to give; you can read the details more over there. Now, given that, at any point you will be getting down two components of your wavelet decomposition: the real part and the imaginary part, and from these comes the consolidated magnitude response over there. The next part is that scalar measures are much easier to handle than these kinds of matrix representations, and they are basically your probability energy measures.
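The Gabor step can be sketched with `skimage.filters.gabor`, which returns the real and imaginary responses; the chosen frequency and angle, and summarising the magnitude response by its mean and variance, are assumptions for illustration.

```python
import numpy as np
from skimage.filters import gabor

rng = np.random.default_rng(0)
gray = rng.integers(0, 256, size=(32, 32)).astype(float) / 255.0  # stand-in image

# real and imaginary components of the wavelet decomposition at one
# frequency and one orientation (theta)
real, imag = gabor(gray, frequency=0.6, theta=0.0)
magnitude = np.sqrt(real ** 2 + imag ** 2)  # consolidated magnitude response

# two scalar summaries of the response matrix
gabor_mean = magnitude.mean()
gabor_var = magnitude.var()
```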
Now, till now, what we have done was just for one of these images, which was at the first location within my training data set. In order to do it for the whole training set, I would define some sort of a matrix, which is called the training features matrix. This is a two-d matrix: the number of rows in this matrix is equal to the length of the training data set, and the number of columns is equal to the number of features. Now, how many features we found out was basically two plus five plus two, and that makes it nine features which we are going to have over here. For this part, what we do is first write a loop over the whole length of the training data set. As you go over the whole length of the training data set, you need to find out one feature at a time; once you have one feature at a time coming down, you need to calculate all of these features, sorry, filters. Once you have all of them, you need to concatenate them into one row matrix, and then you keep on concatenating one below the other, and you get your two-d matrix coming down. So if we run this part, you see this verbose commenting coming down, and then it keeps on running; together that would finish it off. There might be certain warnings at certain positions, and it runs over fifty thousand of those, but if you look through it, it's pretty much fast and not really slow; in the duration of where we are speaking, you can already see quite a lot going on. We just have a verbose command given down over there, so if you would like to get rid of this part, the simple task is that you don't keep printing this part to show how many of them are done, and then you just need to wait till it's complete and find it out on your test set as well.
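The loop described above can be sketched as follows; `extract_features` is a hypothetical placeholder standing in for the LBP, co-occurrence, and Gabor routines, and the ten synthetic images stand in for the fifty-thousand-image training set.

```python
import numpy as np

def extract_features(img):
    # hypothetical stand-in: in the lecture this concatenates the
    # 2 LBP + 5 co-occurrence + 2 Gabor scalars into one row of 9
    return np.zeros(9)

images = [np.zeros((32, 32)) for _ in range(10)]  # stand-in for the training set
n_features = 9  # 2 + 5 + 2

train_features = np.zeros((len(images), n_features))
for i, img in enumerate(images):
    train_features[i] = extract_features(img)  # one row per image
    if i % 5 == 0:
        print(f"done {i} / {len(images)}")  # verbose progress; delete to silence it
```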
Let us do a basic revision in that case. What I did was, I have my pre-defined precursor coming in, depending on the type of the data set; but say if you are writing a full-fledged code over there, you would run it over all images in your data set. Now, if you don't want to look into what's getting extracted, it still keeps on running over here, so let's see how far it is; yeah, it should be quite close to finishing time. Now once your training features are extracted, the next part is to go down to your test data set and also extract out features there and completely show it, and then eventually you can go and basically save them down. Yeah, so now this is over, and the next part of it is basically to get down your testing features. One point about all the features is that each feature is dynamically varying within its own range, so the same normalization needs to be applied within your testing set, otherwise the nature of the normalizations is going to differ. You can save everything down to a file and then just print it all. So once this part is complete, you need to extract features for your training set and for your testing set, and then run the feature normalization.
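The normalization point above, that the statistics must come from the training set and then be reused unchanged on the test set, can be sketched as follows; the random feature matrices and the z-score form of the normalization are assumptions for illustration.

```python
import numpy as np

rng = np.random.default_rng(0)
train_features = rng.normal(5.0, 2.0, size=(100, 9))  # stand-in feature matrices
test_features = rng.normal(5.0, 2.0, size=(20, 9))

# per-feature mean and spread computed on the TRAINING set only
mu = train_features.mean(axis=0)
sigma = train_features.std(axis=0) + 1e-12

# the same transform applied to both sets, so the test set is scaled with
# training statistics and the nature of the normalization does not differ
train_norm = (train_features - mu) / sigma
test_norm = (test_features - mu) / sigma
```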
So far we have seen, on images, some basic operations using the classical way. As you start with any kind of image, you have that big corpus of pixel space available to you; from that, as you have seen, there are features which you have extracted out. The next question, as what we had defined in the first few lectures, is that you need to be able to relate certain features to certain categories, and that is what is called a classification problem. OK, now in order to make it even simpler, what it would essentially mean is that if I have these features, these are all maybe scalar parameters. Now if I arrange these scalar parameters into some sort of a matrix, that's what we would call down as a vector, or, in the standard parlance of our definitions, we would also be calling this a feature vector. Once you have that feature vector given to you, how do I associate a feature vector to one single categorical label? Now, from that perspective, here is where we start down. Today's lecture begins with the neuron model, and from there we will go down to the neural network formulation; so first we would define what a neuron is, since in a neural network you would always have a neuron as the basic unit.
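The single neuron the lecture is moving toward can be sketched as a weighted sum of inputs plus a bias, passed through an activation; the sigmoid choice and the example numbers are assumptions for illustration.

```python
import numpy as np

def neuron(x, w, b):
    """One neuron: weighted sum of inputs plus bias, squashed by a sigmoid."""
    z = np.dot(w, x) + b
    return 1.0 / (1.0 + np.exp(-z))

x = np.array([0.5, -1.0, 2.0])  # e.g. a small feature vector
w = np.array([0.1, 0.2, 0.3])   # one weight per input
b = 0.0
y = neuron(x, w, b)             # activation, a value in (0, 1)
```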