Hi, this is Abhilash Nelson, and I am thrilled to introduce you to my new course, Deep Learning and Neural Networks using Python: For Dummies.
The world has been revolving around the terms "Machine Learning" and "Deep Learning" lately. With or without our knowledge, we use these technologies every day: Google suggestions, translations, ads, movie recommendations, friend suggestions, sales and customer experience, and so on. There are tons of other applications too. No wonder "Deep Learning" and "Machine Learning", along with Data Science, are among the most sought-after skills in the technology world nowadays.
But the problem is that when you think about learning these technologies, there is a misconception that lots of maths, statistics, complex algorithms and formulas need to be studied first. It is just like someone trying to make you believe that you should understand the working of an internal combustion engine before you learn how to drive a car. The fact is that to drive a car, we only need to know how to use the user-friendly controls connected to the engine, like the clutch, brake, accelerator and steering wheel. And with a bit of experience, you can easily drive a car.
Basic knowledge of the internal working of the engine is of course an added advantage while driving a car, but it is not mandatory. In the same way, in our deep learning course we keep a balance between learning the basic concepts and implementing the built-in deep learning classes and functions from the Keras library using the Python programming language. These classes, functions and APIs are just like the car's controls, which we can use easily to build an efficient deep learning model.
Let's now see how this course is organized and get an overview of the topics included.
We will start with a few theory sessions giving an overview of deep learning and neural networks: the difference between deep learning and machine learning, the history of neural networks, the basic workflow of deep learning, biological and artificial neurons, and applications of neural networks.
In the next session, we will try to answer the popular yet confusing question of whether to choose deep learning or machine learning for an upcoming project involving artificial intelligence. We will compare the scenarios and factors that help us decide between machine learning and deep learning.
Then we will prepare the computer and install the Python environment for our deep learning coding. We will install the Anaconda platform, which is the most popular Python platform, and also install the necessary dependencies to proceed with the course.
Once we have our computer ready, we will learn the basics of the Python language, which will help if you are new to Python and get you familiar with the basic syntax needed for the projects in our course. We will cover Python assignments, flow control, functions, data structures, etc.
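Just to give you a feel for those basics, here is a tiny, self-contained sketch touching assignment, flow control, a function and a couple of data structures; the names and values are purely illustrative.

```python
# Assignment and basic data structures
species = ["setosa", "versicolor", "virginica"]    # a list
petal_lengths = {"setosa": 1.4, "virginica": 5.5}  # a dictionary

# A simple function
def describe(name, length):
    """Return a short description of a flower measurement."""
    return f"{name} has an average petal length of {length} cm"

# Flow control: loop over the dictionary and branch on a condition
for name, length in petal_lengths.items():
    if length > 3.0:
        print(describe(name, length) + " (long petals)")
    else:
        print(describe(name, length) + " (short petals)")
```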
Later we will install the libraries for our projects: Theano, TensorFlow and Keras, which are among the best and most popular deep learning libraries. We will try a sample program with each library to make sure it is working fine and also learn how to switch between them.
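As a small taste of what switching between them looks like, here is a hedged sketch: with the multi-backend Keras releases used in this course, the backend can be chosen through the KERAS_BACKEND environment variable (or the ~/.keras/keras.json file); exact behaviour depends on your Keras version.

```python
import os

# Choose the backend BEFORE importing keras.
# With multi-backend Keras, valid values include "tensorflow" and "theano".
os.environ["KERAS_BACKEND"] = "tensorflow"

import keras
print("Keras version:", keras.__version__)
print("Backend in use:", keras.backend.backend())
```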
Then we will have another theory session in which we will learn the concept of the multilayer perceptron, the basic building block of a deep learning neural network, and then the terminology and the major steps involved in training a neural network. We will discuss those steps in detail in this session.
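To preview the steps we will keep repeating throughout the course (define the network, compile it, fit it, evaluate it), here is a minimal sketch of a multilayer perceptron on random dummy data; the layer sizes and hyper-parameters are arbitrary and only meant to show the shape of the workflow.

```python
import numpy as np
from keras.models import Sequential
from keras.layers import Dense

# Dummy data: 100 samples, 8 input features, binary labels
X = np.random.rand(100, 8)
y = np.random.randint(2, size=100)

# 1. Define the network (a small multilayer perceptron)
model = Sequential()
model.add(Dense(12, input_dim=8, activation="relu"))
model.add(Dense(8, activation="relu"))
model.add(Dense(1, activation="sigmoid"))

# 2. Compile: choose loss, optimizer and metrics
model.compile(loss="binary_crossentropy", optimizer="adam", metrics=["accuracy"])

# 3. Fit: train on the data
model.fit(X, y, epochs=10, batch_size=10, verbose=0)

# 4. Evaluate: measure the performance
loss, accuracy = model.evaluate(X, y, verbose=0)
print("Accuracy on dummy data: %.2f" % accuracy)
```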
After all these fundamentals and concepts, we will move on to creating real-world deep learning models.
At first we will download and use the Pima Indians Onset of Diabetes dataset, which contains medical data for Pima Indian patients and whether they had an onset of diabetes within five years. We will build a classification model with this data, train the model and evaluate its accuracy. We will also try manual and automatic data splitting and k-fold cross-validation with this model.
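Here is a hedged sketch of those three evaluation approaches; it assumes the dataset has been downloaded as a local CSV file named pima-indians-diabetes.csv with eight input columns and one 0/1 outcome column (the file name is just an illustrative choice).

```python
import numpy as np
from sklearn.model_selection import train_test_split, StratifiedKFold

# 8 medical input features, 1 binary outcome column
dataset = np.loadtxt("pima-indians-diabetes.csv", delimiter=",")
X, y = dataset[:, 0:8], dataset[:, 8]

# Manual data splitting: hold back a test set ourselves
X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.33, random_state=7)

# Automatic data splitting: let Keras hold back a validation share during training
# model.fit(X, y, validation_split=0.33, epochs=150, batch_size=10)

# k-fold cross-validation: train and evaluate a fresh model on each of k splits
kfold = StratifiedKFold(n_splits=10, shuffle=True, random_state=7)
for train_idx, test_idx in kfold.split(X, y):
    # build a fresh Keras model here (as in the earlier sketch), then:
    # model.fit(X[train_idx], y[train_idx], epochs=150, batch_size=10, verbose=0)
    # score = model.evaluate(X[test_idx], y[test_idx], verbose=0)
    pass
```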
The next dataset we are going to use is the Iris Flowers Classification dataset, which classifies iris flowers into three species based on their petal and sepal dimensions. This is a multi-class dataset, so we will build a multi-class classification model with it, train the model and evaluate its accuracy.
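A rough sketch of how the multi-class version differs (one-hot encoded labels and a softmax output layer); purely for brevity, this assumes the Iris data is loaded through scikit-learn rather than from a CSV file.

```python
from sklearn.datasets import load_iris
from keras.models import Sequential
from keras.layers import Dense
from keras.utils import to_categorical

# 4 input features (petal/sepal dimensions), 3 species as the target
X, y = load_iris(return_X_y=True)
y_onehot = to_categorical(y)  # one-hot encode the 3 classes

model = Sequential()
model.add(Dense(8, input_dim=4, activation="relu"))
model.add(Dense(3, activation="softmax"))  # one output neuron per species
model.compile(loss="categorical_crossentropy", optimizer="adam", metrics=["accuracy"])

model.fit(X, y_onehot, epochs=100, batch_size=5, verbose=0)
loss, accuracy = model.evaluate(X, y_onehot, verbose=0)
print("Accuracy: %.2f%%" % (accuracy * 100))
```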
The next dataset is the Sonar Returns dataset, which contains the strength of sonar signal returns and whether they were reflected by a rock or by metal, like a mine, on the sea bed. We will build the base model and evaluate its accuracy. We will also try to improve the performance of the model with data preparation techniques like standardization, and by changing the topology of the neural network, making it deeper or shallower.
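Here is a hedged sketch of the standardization idea using scikit-learn's StandardScaler; it assumes a local sonar.csv file with 60 numeric columns and a final R/M label column (the file name is illustrative).

```python
import pandas as pd
from sklearn.preprocessing import StandardScaler, LabelEncoder
from keras.models import Sequential
from keras.layers import Dense

# 60 sonar signal strengths per row, last column is "R" (rock) or "M" (mine)
df = pd.read_csv("sonar.csv", header=None)
X = df.iloc[:, 0:60].values.astype(float)
y = LabelEncoder().fit_transform(df.iloc[:, 60].values)  # R/M -> 0/1

# Standardize each feature to zero mean and unit variance
X_std = StandardScaler().fit_transform(X)

# Baseline topology; the "deeper" or "shallower" variants just change these layers
model = Sequential()
model.add(Dense(60, input_dim=60, activation="relu"))
model.add(Dense(1, activation="sigmoid"))
model.compile(loss="binary_crossentropy", optimizer="adam", metrics=["accuracy"])
model.fit(X_std, y, epochs=100, batch_size=5, verbose=0)
```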
We will also use the Boston House Prices dataset. Unlike the previous ones, this is a regression dataset, which uses different factors to estimate the cost of owning a house in the city of Boston. For this one too, we will build the model and try to improve its performance with data preparation techniques like standardization and by changing the topology of the neural network.
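For the regression case the only structural changes are a single linear output neuron and a mean squared error loss. A minimal sketch; the data here is a random placeholder with 13 input columns just so the snippet runs, standing in for the real Boston features and prices.

```python
import numpy as np
from keras.models import Sequential
from keras.layers import Dense

# Placeholder data standing in for 13 housing-related features and the price target
X = np.random.rand(100, 13)
y = np.random.rand(100) * 50

model = Sequential()
model.add(Dense(13, input_dim=13, activation="relu"))
model.add(Dense(1))  # linear output: we predict a continuous price, not a class
model.compile(loss="mean_squared_error", optimizer="adam")

model.fit(X, y, epochs=100, batch_size=5, verbose=0)
print("MSE:", model.evaluate(X, y, verbose=0))
```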
Since we have spent valuable time designing and training a model, we need to save it so that we can use it for predictions later. We will see how we can save an already trained model structure to either a JSON or a YAML file, along with its weights as an HDF5 file. Then we will load it and convert it back to a live model. We will try this for all the datasets we have worked with so far.
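A hedged sketch of the save-and-reload round trip (JSON for the structure, HDF5 for the weights); the YAML variant works the same way through model.to_yaml() in the Keras versions used in this course.

```python
from keras.models import Sequential, model_from_json
from keras.layers import Dense

# A tiny example model to save (stands in for any trained model)
model = Sequential([Dense(4, input_dim=8, activation="relu"),
                    Dense(1, activation="sigmoid")])
model.compile(loss="binary_crossentropy", optimizer="adam")

# --- Save: structure as JSON, weights as HDF5 ---
with open("model.json", "w") as f:
    f.write(model.to_json())
model.save_weights("model_weights.h5")  # newer Keras releases may require a ".weights.h5" suffix

# --- Load: rebuild the structure, then load the weights back in ---
with open("model.json") as f:
    restored = model_from_json(f.read())
restored.load_weights("model_weights.h5")

# Compile again before evaluating or predicting with the restored model
restored.compile(loss="binary_crossentropy", optimizer="adam", metrics=["accuracy"])
```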
Now for the most awaited magic of deep learning. Our multilayer perceptron models will make predictions for custom input data based on the knowledge they have already learned. The Pima Indians model will predict whether I will get diabetes in the future by analysing my actual health statistics. Then the next model, the Iris Flower model, will predict the correct species of a newly blossomed iris flower in my garden.
We will also make predictions with the Sonar Returns model to check whether the data provided matches a mine or a rock under the sea.
Then, with our next multilayer perceptron model, the Boston House Price model, we will predict the median value of housing in Boston.
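Making a prediction with any of these trained models boils down to a single call. Here is a hedged sketch for the Pima Indians case; the eight numbers are made-up health statistics, and the untrained stand-in model is built here only so the sketch runs on its own (in the course, the trained or reloaded model is used instead).

```python
import numpy as np
from keras.models import Sequential
from keras.layers import Dense

# Untrained stand-in for the trained (or reloaded) Pima Indians classifier
model = Sequential([Dense(12, input_dim=8, activation="relu"),
                    Dense(1, activation="sigmoid")])
model.compile(loss="binary_crossentropy", optimizer="adam")

# One row of 8 health measurements (made-up values, in the dataset's column order)
my_stats = np.array([[2.0, 120.0, 70.0, 30.0, 0.0, 32.5, 0.45, 35.0]])

probability = model.predict(my_stats)[0][0]
print("Predicted probability of diabetes onset: %.2f" % probability)

# For a multi-class model such as the Iris one, the predicted species is the
# index of the most probable output: np.argmax(model.predict(sample), axis=1)
```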
Large deep learning models may take days or even weeks to finish training. It is a long-running process, and there is a good chance that some interruption will occur in between and all our hard work until then will be lost. To prevent that, we have a feature called checkpointing. We can record checkpoints, keep them safe, and load the model from a checkpoint at a later time. Checkpointing can be done on every improvement of the model during training, or only for the best model seen during training.
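A hedged sketch of checkpointing with Keras's ModelCheckpoint callback; the file name and the monitored metric are just illustrative choices, and the data and model are random stand-ins so the snippet runs on its own.

```python
import numpy as np
from keras.models import Sequential
from keras.layers import Dense
from keras.callbacks import ModelCheckpoint

# Stand-in data and model so the sketch runs on its own
X, y = np.random.rand(200, 8), np.random.randint(2, size=200)
model = Sequential([Dense(12, input_dim=8, activation="relu"),
                    Dense(1, activation="sigmoid")])
model.compile(loss="binary_crossentropy", optimizer="adam", metrics=["accuracy"])

# Save the weights every time the monitored metric improves; with save_best_only=True
# only the best model seen so far is kept on disk.
checkpoint = ModelCheckpoint("best_model.weights.h5",
                             monitor="val_accuracy",   # "val_acc" on older Keras releases
                             save_best_only=True,
                             save_weights_only=True,
                             verbose=1)

model.fit(X, y, validation_split=0.33, epochs=20, batch_size=10,
          callbacks=[checkpoint], verbose=0)
```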
At times, we may need to supervise and take a look at how the model is doing while it is being trained. We can access the model training history in Keras very easily and, if needed, visualize the progress with a graph.
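Accessing the history is as simple as keeping the object returned by fit(). Reusing the stand-in model and data from the checkpointing sketch above, here is a hedged plotting sketch; matplotlib is my own assumption here, any plotting library would do.

```python
import matplotlib.pyplot as plt

# fit() returns a History object whose .history dict holds one list per metric
history = model.fit(X, y, validation_split=0.33, epochs=20, batch_size=10, verbose=0)

plt.plot(history.history["loss"], label="training loss")
plt.plot(history.history["val_loss"], label="validation loss")
plt.xlabel("epoch")
plt.ylabel("loss")
plt.legend()
plt.show()
```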
Then we will deal with a major problem in deep learning called overfitting. Some neurons in the network gradually gain too much weight and start contributing to incorrect results. We will learn how to apply the dropout regularization technique to both the visible and the hidden layers to prevent this.
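A hedged sketch of where the Dropout layers go: one right after the input (the visible layer) and one between hidden layers. The 20% rate is just a common starting point, and the snippet follows the Keras 2-style idiom of attaching the input shape to the first layer.

```python
from keras.models import Sequential
from keras.layers import Dense, Dropout

model = Sequential()
model.add(Dropout(0.2, input_shape=(60,)))  # dropout on the visible (input) layer
model.add(Dense(60, activation="relu"))
model.add(Dropout(0.2))                     # dropout on a hidden layer
model.add(Dense(30, activation="relu"))
model.add(Dense(1, activation="sigmoid"))
model.compile(loss="binary_crossentropy", optimizer="adam", metrics=["accuracy"])
```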
We can also control the learning rate of a model. Just as we learn rigorously at first and slow down towards the end of a lesson to understand better, we will configure and evaluate both a time-based and a drop-based learning rate schedule for a new model, the Ionosphere classification model.
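A hedged sketch of the two scheduling styles: the decay argument on SGD is the Keras 2.x-era API used in the course, and step_decay() is a hypothetical helper name of my own for the drop-based version.

```python
import math
from keras.optimizers import SGD
from keras.callbacks import LearningRateScheduler

# Time-based decay: the learning rate shrinks a little after every update.
# (The decay argument exists in Keras 2.x; older releases use lr= instead of
# learning_rate=, and newer Keras drops decay in favour of schedule objects.)
sgd = SGD(learning_rate=0.1, momentum=0.9, decay=0.005)

# Drop-based decay: halve the learning rate every 10 epochs.
def step_decay(epoch):
    initial_rate, drop, epochs_per_drop = 0.1, 0.5, 10
    return initial_rate * math.pow(drop, math.floor(epoch / epochs_per_drop))

lr_callback = LearningRateScheduler(step_decay)
# model.compile(loss="binary_crossentropy", optimizer=sgd, metrics=["accuracy"])
# model.fit(X, y, epochs=50, callbacks=[lr_callback])
```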
In the sessions that follow, we will learn a powerful deep learning technique called Convolutional Neural Networks. These have proved very efficient at difficult computer vision and natural language processing tasks where a normal neural network architecture would struggle.
At first we will have an overview of convolutional neural networks, or CNNs: how they work and their architecture. Then we will proceed with some popular and interesting experiments with convolutional neural networks.
A major capability of deep learning techniques is object recognition in image data. We will build a CNN model in Keras to recognize handwritten digits, using the openly available MNIST dataset. We will first build a multilayer perceptron based neural network for the MNIST dataset and later upgrade it to a convolutional neural network.
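To preview what the CNN for MNIST roughly looks like, here is a hedged sketch; the layer sizes and epoch count are typical choices, not prescriptive ones.

```python
from keras.datasets import mnist
from keras.models import Sequential
from keras.layers import Conv2D, MaxPooling2D, Flatten, Dense
from keras.utils import to_categorical

# Load the 28x28 grayscale digit images and scale pixel values to [0, 1]
(X_train, y_train), (X_test, y_test) = mnist.load_data()
X_train = X_train.reshape(-1, 28, 28, 1).astype("float32") / 255.0
X_test = X_test.reshape(-1, 28, 28, 1).astype("float32") / 255.0
y_train, y_test = to_categorical(y_train), to_categorical(y_test)

model = Sequential()
model.add(Conv2D(32, (3, 3), activation="relu", input_shape=(28, 28, 1)))
model.add(MaxPooling2D(pool_size=(2, 2)))
model.add(Flatten())
model.add(Dense(128, activation="relu"))
model.add(Dense(10, activation="softmax"))  # one output per digit 0-9
model.compile(loss="categorical_crossentropy", optimizer="adam", metrics=["accuracy"])

model.fit(X_train, y_train, validation_data=(X_test, y_test),
          epochs=5, batch_size=200, verbose=0)
print("Test accuracy: %.2f%%" % (model.evaluate(X_test, y_test, verbose=0)[1] * 100))
```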
And you know what... we are bold enough to do a prediction for a handwritten digit using our MNIST model. We will take the time to train the model and save it, and later load it and do a quick prediction with the already saved model.
Later we will try to improve the performance of the model by making the network larger. We will also try techniques like image augmentation: sample standardization, ZCA whitening, and transformations like random rotations, random shifts and flips applied to the images. Finally, we will save the augmented images as a dataset for later use.
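A hedged sketch of those augmentation settings with Keras's ImageDataGenerator (an older-Keras preprocessing class); the specific ranges are just example values.

```python
from keras.datasets import mnist
from keras.preprocessing.image import ImageDataGenerator

(X_train, y_train), _ = mnist.load_data()
X_train = X_train.reshape(-1, 28, 28, 1).astype("float32")

# Feature standardization, ZCA whitening and random transformations in one generator
datagen = ImageDataGenerator(featurewise_center=True,
                             featurewise_std_normalization=True,
                             zca_whitening=True,
                             rotation_range=20,       # random rotations
                             width_shift_range=0.1,   # random horizontal shifts
                             height_shift_range=0.1,  # random vertical shifts
                             horizontal_flip=True)    # random flips

# Statistics-based options (centering, ZCA) need fit() on the data first
datagen.fit(X_train)

# The generator then yields augmented batches during training, for example:
# model.fit(datagen.flow(X_train, y_train, batch_size=32), epochs=10)
```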
Then we will go ahead with another important and challenging CNN project: object recognition in photographs. We will use another openly available dataset called CIFAR-10. We will learn about the CIFAR-10 object recognition dataset and how to load and use it in Keras. We will first create a simple convolutional neural network for object recognition, and later try to improve its performance with a deeper network. Once more we will have the guts to do a real prediction with the CIFAR-10 convolutional neural network, where the model will identify a cat and a dog in the images we supply to the system.
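A small hedged sketch of loading CIFAR-10 in Keras and what a single prediction looks like; the class-name list follows the dataset's usual label ordering, and the commented lines assume a CNN trained on this data as described above.

```python
import numpy as np
from keras.datasets import cifar10

# 32x32 colour images in 10 classes (the usual CIFAR-10 label ordering)
class_names = ["airplane", "automobile", "bird", "cat", "deer",
               "dog", "frog", "horse", "ship", "truck"]

(X_train, y_train), (X_test, y_test) = cifar10.load_data()
print("Training images:", X_train.shape)  # (50000, 32, 32, 3)
print("First training label:", class_names[int(y_train[0])])

# After training a CNN on this data, a prediction on one image looks like:
# probs = model.predict(X_test[:1] / 255.0)
# print("Predicted class:", class_names[int(np.argmax(probs))])
```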
Overall, this is a basic-to-advanced crash course in deep learning neural networks and convolutional neural networks using Keras and Python. I am sure that once you complete it, it will skyrocket your current career prospects, as this is one of the most wanted skills nowadays and, of course, the technology of the future. We will also provide you with a certificate after the completion of this course as proof of your expertise, and you may attach it to your portfolio.
A day is coming in the near future when deep learning models will outperform human intelligence. So be ready, and let's dive into the world of thinking machines.
See you soon in the classroom. Bye for now.