Machine Learning

Machine Learning with Python – for Beginners

Machine Learning with Python is a FREE 10+ hour course – a journey from zero to mastery.

The course consists of the following content.

  • 15 video lessons – which explain Machine Learning concepts, demonstrate models on real data, introduce projects and show solutions (YouTube playlist).
  • 30 Jupyter Notebooks – with the full code and explanations from the lectures and projects (GitHub).
  • 15 projects – with step-by-step guides to help you structure your solutions, and solutions explained at the end of the video lessons (GitHub).

TL;DR

How to get started:

Who is the Machine Learning with Python course for?

This course is for you if you want to learn Machine Learning in a simple, down-to-earth way.

You don’t need a strong background in math, statistics, or computer science, or any high-level degree.

All you need is the desire to learn Machine Learning and the time to follow along with the 15 lessons in this course.

It would be good to have some Python fundamentals – but don’t worry if you don’t have them – there is a FREE 8-hour Python beginners course available here. It comes with a practical eBook covering all you need to know and is structured in 17 lessons tailored to the course.

What will you learn in the Machine Learning with Python course?

It will be an amazing journey from zero experience through all the important concepts in Machine Learning, with real-life practical examples and projects you will make together with me.

This includes the following.

  • k-Nearest-Neighbors Classifier
  • Linear Classifier
  • Support Vector Classification
  • Linear Regression
  • Reinforcement Learning
  • Unsupervised Learning
  • Neural Networks
  • Deep Neural Networks (DNN)
  • Convolutional Neural Networks (CNN)
  • PyTorch classifier
  • Recurrent Neural Networks (RNN)
  • Natural Language Processing
  • Text Categorization
  • Information Retrieval
  • Information Extraction

Every concept is introduced with explanatory examples, followed by an in-depth project to play with on your own.

Worried you cannot solve the project? No worries – I will help you through the project at the end of the video tutorials.

How to start?

  1. Download all the Jupyter Notebooks from my GitHub (you can get them as a zip file here: download zip-file with full content).
    • Don’t know what Jupyter Notebook is?
  2. Launch Jupyter Notebook.
    • Don’t have Jupyter Notebook?
      • Don’t worry – get it for FREE here: Download Anaconda
      • Anaconda installs Jupyter Notebook and Python, and you will be ready to go.
  3. Open the first Jupyter Notebook from the zip-file (or download it from GitHub).
    • Don’t know how to do that?
      • Don’t worry – get the eBook from here or follow this course. The eBook explains how to get started with Anaconda, Jupyter Notebook and more.
  4. Start the first video on YouTube (YouTube).

Lessons

Lesson 00 – k-Nearest-Neighbors Classifier

In this first lesson you will learn about the following.

  • What the difference is between classical computing and Machine Learning.
  • How Machine Learning works.
  • How to get data, prepare data, train the model, and test the model.
  • The types of Machine Learning: supervised learning, unsupervised learning, and reinforcement learning (note: we will cover all of them in this course).
  • In this lesson we learn about the k-Nearest-Neighbors Classifier – a supervised learning model.
  • We learn how to use it.
  • We make a project on real-life weather data.

This will give you an understanding of what Machine Learning is and why it does not require high-level programming skills to master. Also, it will get you started with your first Machine Learning model – the k-Nearest-Neighbors Classifier.
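
If you want a quick taste before the video, here is a minimal k-Nearest-Neighbors sketch using scikit-learn. The toy feature values are invented for illustration – the lecture’s notebook works with real weather data and may use different parameters.

```python
# Minimal sketch (assumed setup): k-Nearest-Neighbors with scikit-learn on toy weather-like data.
from sklearn.model_selection import train_test_split
from sklearn.neighbors import KNeighborsClassifier

# Invented features: [humidity, pressure] -> rain (1) or no rain (0)
X = [[0.8, 1010], [0.9, 1005], [0.3, 1020], [0.2, 1025], [0.7, 1008], [0.4, 1018]]
y = [1, 1, 0, 0, 1, 0]

# Get data, prepare data, train the model, and test the model
X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.33, random_state=42)
model = KNeighborsClassifier(n_neighbors=3)   # classify by the 3 nearest neighbors
model.fit(X_train, y_train)
print("Accuracy:", model.score(X_test, y_test))
```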

Remember to get the Jupyter Notebooks used in the lecture from GitHub. This way you will be able to follow along and make the project in the prepared Jupyter Notebooks.

See the video below or read a more detailed tutorial here.

Lesson 01 – Linear Classifier

In this lesson we will explore the following.

  • How a Linear Classifier (supervised learning) works
  • How it differs from the k-Nearest-Neighbors Classifier.
  • Understand the theory behind the Perceptron classifier (the linear classifier)
  • How to prepare data for the model (Perceptron classifier).
  • Visualize the result of the model
  • Create a project using the Perceptron classifier on real weather data.

This lesson will give you a broader understanding of what Machine Learning is and how simple the concepts are to understand and use. The next model (Linear Classifier) will be used to show visually how it differs from the previous one (k-Nearest-Neighbors Classifier).
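
As a rough idea of what the code looks like, here is a minimal Perceptron (linear classifier) sketch with scikit-learn – the toy data and the scaling step are illustrative assumptions, not the exact notebook code.

```python
# Minimal sketch (assumed setup): a Perceptron classifier with scikit-learn on invented data.
from sklearn.linear_model import Perceptron
from sklearn.preprocessing import StandardScaler

X = [[0.8, 1010], [0.9, 1005], [0.3, 1020], [0.2, 1025]]   # invented [humidity, pressure]
y = [1, 1, 0, 0]                                           # rain / no rain

# Scaling the features is a common preparation step for linear models
scaler = StandardScaler().fit(X)
model = Perceptron()
model.fit(scaler.transform(X), y)
print(model.predict(scaler.transform([[0.75, 1009]])))
```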

Remember to get the Jupyter Notebooks used in the lecture from GitHub. This way you will be able to follow along and make the project in the prepared Jupyter Notebooks.

Lesson 02 – Support Vector Machines (SVM)

In this lesson we will continue with the following.

  • Learn about the problem of separation.
  • The idea of maximizing the distance (the margin).
  • Work with examples to demonstrate the issue.
  • Use the Support Vector Machine (SVM) model on data.
  • Explore the result of SVM on classification data.
  • Use the SVM model in a project to classify dog species.

In this lesson you will learn about the challenge of finding the best fit of a Machine Learning model. We will explore how the Support Vector Machine can help solve the problem of optimal classification.
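
Here is a minimal Support Vector Classification sketch with scikit-learn; the made-up height/weight features stand in for the real dog data used in the project.

```python
# Minimal sketch (assumed setup): Support Vector Classification with scikit-learn.
from sklearn.svm import SVC

X = [[30, 25], [35, 30], [55, 60], [60, 65]]   # invented [height (cm), weight (kg)]
y = ["small breed", "small breed", "large breed", "large breed"]

# A linear kernel keeps the separating boundary (maximum margin) easy to visualize
model = SVC(kernel="linear")
model.fit(X, y)
print(model.predict([[40, 35]]))
```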

Remember to get the Jupyter Notebooks used in the lecture from GitHub. This way you will be able to follow along and make the project in the prepared Jupyter Notebooks.

Lesson 03 – Linear Regression

The goals of this lesson are the following.

  • Learn about Linear Regression
  • Understand the difference from a discrete classifier
  • Understand that it is a supervised learning task
  • Get insight into how similar a linear classifier is to a discrete classifier
  • Get hands-on experience with Linear Regression

Here you will learn how to predict continuous values using the Linear Regression model – more specifically, how to predict house prices. You will also explore some common pitfalls, which demonstrate the importance of understanding what the data represents.
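
A minimal Linear Regression sketch with scikit-learn might look like this – the sizes and prices are invented; the project uses a real house-price dataset.

```python
# Minimal sketch (assumed setup): Linear Regression with scikit-learn on invented house data.
from sklearn.linear_model import LinearRegression

X = [[50], [70], [90], [120]]                 # size in square meters
y = [150_000, 210_000, 270_000, 360_000]      # invented prices

model = LinearRegression()
model.fit(X, y)
print(model.predict([[100]]))   # predicts a continuous value, not a class
```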

Remember to get the Jupyter Notebooks used in the lecture from GitHub. This way you will be able to follow along and make the project in the prepared Jupyter Notebooks.

Lesson 04 – Reinforcement Learning

In this lesson we will do the following.

  • Understand how Reinforcement Learning works
  • Learn about Agent and Environment
  • How Q-Learning and Q-Tables work
  • How the agent iterates and gets rewards based on its actions
  • How to continuously learn new things
  • Create your own Reinforcement Learning model from scratch

The Reinforcement Learning model will teach you how simple Machine Learning can be. You will create your own model from scratch, which will teach you how to think when creating Machine Learning models.
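
To show how little code a from-scratch model needs, here is a minimal Q-Learning sketch on a tiny made-up 1-D world – the states, rewards and hyperparameters are all illustrative, not the lesson’s exact environment.

```python
# Minimal sketch (assumed setup): Q-Learning from scratch on a tiny invented 1-D world.
# The agent starts in state 0 and gets a reward for reaching state 3.
import random

n_states, n_actions = 4, 2                        # actions: 0 = left, 1 = right
Q = [[0.0] * n_actions for _ in range(n_states)]  # the Q-Table
alpha, gamma, epsilon = 0.5, 0.9, 0.2             # learning rate, discount, exploration

for episode in range(200):
    state = 0
    while state != 3:
        # Epsilon-greedy: mostly exploit the Q-Table, sometimes explore
        if random.random() < epsilon:
            action = random.randrange(n_actions)
        else:
            action = max(range(n_actions), key=lambda a: Q[state][a])
        next_state = max(0, state - 1) if action == 0 else min(3, state + 1)
        reward = 1.0 if next_state == 3 else 0.0
        # Q-Learning update rule
        Q[state][action] += alpha * (reward + gamma * max(Q[next_state]) - Q[state][action])
        state = next_state

print(Q)   # "right" (index 1) should end up with the highest values in each state
```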

Remember to get the Jupyter Notebooks used in the lecture from GitHub. This way you will be able to follow along and make the project in the prepared Jupyter Notebooks.

Lesson 05 – Unsupervised Learning

Here we will explore and learn about the following.

  • Understand how Unsupervised Learning is different from Supervised Learning
  • How it can organize data without prior knowledge
  • Understand how k-Means Clustering works
  • Train a k-Means Clustering model

Here you will learn how to organize documents with no prior knowledge of them, and how to optimize the parameters of the k-Means Clustering algorithm.
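
Here is a minimal k-Means Clustering sketch with scikit-learn on toy 2-D points – the lesson’s project clusters documents, so treat this only as a picture of the API.

```python
# Minimal sketch (assumed setup): k-Means Clustering with scikit-learn on toy 2-D points.
from sklearn.cluster import KMeans

X = [[1, 2], [1, 4], [0, 2], [10, 2], [10, 4], [11, 0]]   # invented points in two groups

model = KMeans(n_clusters=2, n_init=10, random_state=42)
model.fit(X)                    # note: no labels are given - unsupervised learning
print(model.labels_)            # cluster assignment for each point
print(model.cluster_centers_)   # the learned cluster centers
```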

Remember to get the Jupyter Notebooks used in the lecture from GitHub. This way you will be able to follow along and make the project in the prepared Jupyter Notebooks.

Lesson 06 – Neural Network

In this lesson we will learn the following.

  • Understand Neural Networks
  • How you can model other machine learning techniques
  • Activation functions
  • How to make a simple OR function
  • Different ways to calculate weights
  • Use TensorFlow to build our model
  • What batch sizes and epochs are

You will learn about Neural Networks and how they work. They are an essential building block of modern Machine Learning.
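
A minimal sketch of a tiny Neural Network learning the OR function with TensorFlow (Keras) could look like this; the layer size, learning rate, batch size and number of epochs are illustrative choices.

```python
# Minimal sketch (assumed setup): a tiny Neural Network in TensorFlow learning the OR function.
import numpy as np
import tensorflow as tf

X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
y = np.array([0, 1, 1, 1], dtype=float)   # OR truth table

model = tf.keras.Sequential([
    tf.keras.Input(shape=(2,)),
    tf.keras.layers.Dense(1, activation="sigmoid"),   # one neuron is enough for OR
])
model.compile(optimizer=tf.keras.optimizers.Adam(0.1), loss="binary_crossentropy",
              metrics=["accuracy"])
model.fit(X, y, epochs=500, batch_size=4, verbose=0)   # batch size and epochs are illustrative
print(model.predict(X).round())   # should approximate [0, 1, 1, 1]
```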

Remember to get the Jupyter Notebooks used in the lecture from GitHub. This way you will be able to follow along and make the project in the prepared Jupyter Notebooks.

Lesson 07 – Deep Neural Network (DNN)

This lecture will cover the following.

  • Understand Deep Neural Network (DNN)
  • How algorithms calculate weights in DNN with Backpropagation
  • Show tools to visually understand what DNN can solve
  • The problem of overfitting models
  • How Dropout works and how to use it
  • Create our own DNN model
  • Explore how to solve the XOR problem with a DNN

This will teach you about Deep Neural Networks and demonstrate the power of this technique. It will teach you how to solve problems that are more complex than simple classification.
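
Here is a minimal Deep Neural Network sketch for the XOR problem in TensorFlow, with a small Dropout layer included because the lesson discusses it – the architecture and settings are illustrative, not the lecture’s exact model.

```python
# Minimal sketch (assumed setup): a Deep Neural Network in TensorFlow for the XOR problem,
# with a small Dropout layer as a nod to the overfitting discussion.
import numpy as np
import tensorflow as tf

X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
y = np.array([0, 1, 1, 0], dtype=float)   # XOR is not linearly separable

model = tf.keras.Sequential([
    tf.keras.Input(shape=(2,)),
    tf.keras.layers.Dense(8, activation="relu"),    # the hidden layer makes XOR solvable
    tf.keras.layers.Dropout(0.1),                   # Dropout is illustrative on this tiny dataset
    tf.keras.layers.Dense(1, activation="sigmoid"),
])
model.compile(optimizer=tf.keras.optimizers.Adam(0.05), loss="binary_crossentropy",
              metrics=["accuracy"])
model.fit(X, y, epochs=800, verbose=0)
print(model.predict(X).round())   # should approximate [0, 1, 1, 0]
```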

Remember to get the Jupyter Notebooks used in the lecture from GitHub. This way you will be able to follow along and make the project in the prepared Jupyter Notebooks.

Lesson 08 – Convolutional Neural Network (CNN)

Here we will explore the following.

  • Understand what Convolutional Neural Network (CNN) is
  • The strength of CNN
  • How to use it to detect handwriting
  • Extract features from pictures
  • Learn Convolution, Pooling and Flatten
  • How to create a CNN to classify pictures of birds, airplanes and more.

Convolutional Neural Networks (CNN) will teach you how to classify images – from handwritten letters to pictures of birds and airplanes.
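
A minimal CNN sketch in TensorFlow (Keras) on the MNIST handwriting dataset might look like this – the architecture is a simple illustrative choice, not necessarily the one built in the lecture.

```python
# Minimal sketch (assumed setup): a small CNN in TensorFlow (Keras) on the MNIST digits.
import tensorflow as tf

(X_train, y_train), (X_test, y_test) = tf.keras.datasets.mnist.load_data()
X_train = X_train.reshape(-1, 28, 28, 1) / 255.0   # add channel dimension and normalize
X_test = X_test.reshape(-1, 28, 28, 1) / 255.0

model = tf.keras.Sequential([
    tf.keras.Input(shape=(28, 28, 1)),
    tf.keras.layers.Conv2D(32, (3, 3), activation="relu"),   # convolution extracts features
    tf.keras.layers.MaxPooling2D((2, 2)),                    # pooling shrinks the feature maps
    tf.keras.layers.Flatten(),                               # flatten before the dense layers
    tf.keras.layers.Dense(64, activation="relu"),
    tf.keras.layers.Dense(10, activation="softmax"),
])
model.compile(optimizer="adam", loss="sparse_categorical_crossentropy", metrics=["accuracy"])
model.fit(X_train, y_train, epochs=3, batch_size=128, verbose=1)
print("Test accuracy:", model.evaluate(X_test, y_test, verbose=0)[1])
```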

Remember to get the Jupyter Notebooks used in the lecture from GitHub. This way you will be able to follow along and make the project in the prepared Jupyter Notebooks.

Lesson 09 – PyTorch

In this lecture we will cover the following.

  • What is PyTorch
  • PyTorch vs TensorFlow
  • Get started with PyTorch
  • Work with image classification and handwriting detection
  • Make a project detecting pictures of birds and airplanes

In this lesson you will learn how to use PyTorch, an alternative to TensorFlow. You will learn how to classify images with PyTorch using a DNN.
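
Here is a minimal PyTorch training-loop sketch on the MNIST handwriting dataset; the network and settings are illustrative, not the exact course project.

```python
# Minimal sketch (assumed setup): a simple PyTorch classifier on the MNIST handwriting data.
import torch
import torch.nn as nn
from torch.utils.data import DataLoader
from torchvision import datasets, transforms

train_data = datasets.MNIST(root="data", train=True, download=True,
                            transform=transforms.ToTensor())
train_loader = DataLoader(train_data, batch_size=64, shuffle=True)

model = nn.Sequential(nn.Flatten(), nn.Linear(28 * 28, 128), nn.ReLU(), nn.Linear(128, 10))
optimizer = torch.optim.Adam(model.parameters(), lr=0.001)
loss_fn = nn.CrossEntropyLoss()

for epoch in range(1):                    # one epoch is enough to see it learn
    for images, labels in train_loader:
        optimizer.zero_grad()
        loss = loss_fn(model(images), labels)
        loss.backward()
        optimizer.step()
    print("epoch", epoch, "loss", loss.item())
```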

Remember to get the Jupyter Notebooks used in the lecture from GitHub. This way you will be able to follow along and make the project in the prepared Jupyter Notebooks.

Lesson 10 – Recurrent Neural Network (RNN)

Here we will learn about the following.

  • Understand Recurrent Neural Networks (RNN)
  • Build an RNN on a time series
  • Briefly cover the theory of RNNs (LSTM cells)
  • Use the MinMaxScaler from sklearn
  • Create an RNN model with TensorFlow
  • Apply the Dropout technique
  • Predict stock prices and make weather forecasts using an RNN

Here you will learn how to use Recurrent Neural Networks (RNN), where data from previous steps is fed back into the model. In this lesson you will learn how to use an RNN on time-series data to predict stock prices and weather forecasts.
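
A minimal RNN (LSTM) sketch on a synthetic time series, using MinMaxScaler as mentioned in the lesson, could look like this – the sine-wave data and window size are stand-ins for the real stock and weather data.

```python
# Minimal sketch (assumed setup): an LSTM on a synthetic sine-wave time series,
# scaled with MinMaxScaler as mentioned in the lesson.
import numpy as np
import tensorflow as tf
from sklearn.preprocessing import MinMaxScaler

series = np.sin(np.linspace(0, 20, 200)).reshape(-1, 1)   # stand-in for stock/weather data
scaled = MinMaxScaler().fit_transform(series)

window = 10   # use the previous 10 values to predict the next one
X = np.array([scaled[i:i + window] for i in range(len(scaled) - window)])
y = scaled[window:]

model = tf.keras.Sequential([
    tf.keras.Input(shape=(window, 1)),
    tf.keras.layers.LSTM(32),
    tf.keras.layers.Dropout(0.2),
    tf.keras.layers.Dense(1),
])
model.compile(optimizer="adam", loss="mse")
model.fit(X, y, epochs=10, verbose=0)
print(model.predict(X[-1:]))   # predicted next value of the series
```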

Remember to get the Jupyter Notebooks used in the lecture from GitHub. This way you will be able to follow along and make the project in the prepared Jupyter Notebooks.

Lesson 11 – Natural Language Processing

In this lesson we will learn the following.

  • How the simple syntax of language can be parsed
  • What Context-Free Grammar (CFG) is
  • Use it to parse text
  • Understand word tokenization of text and trigrams
  • See how it can be used to generate predictions
  • Use the nltk toolkit.
  • A bit about Markov Chains/models
  • Show how to use the markovify library

You will learn the limitations as well as the strengths of computers’ understanding of language, and how this knowledge can be used to create models for natural language processing.
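
Here is a minimal nltk sketch of parsing with a Context-Free Grammar and building trigrams – the tiny grammar and sentence are invented for illustration.

```python
# Minimal sketch (assumed setup): parsing with a Context-Free Grammar and building
# trigrams using the nltk toolkit on an invented toy sentence.
import nltk

grammar = nltk.CFG.fromstring("""
S -> NP VP
NP -> Det N
VP -> V NP
Det -> 'the'
N -> 'dog' | 'cat'
V -> 'sees'
""")
parser = nltk.ChartParser(grammar)
for tree in parser.parse("the dog sees the cat".split()):
    print(tree)   # the parse tree for the sentence

# Word tokenization (here a simple split) and trigrams
tokens = "the dog sees the cat".split()
print(list(nltk.trigrams(tokens)))
```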

Remember to get the Jupyter Notebooks used in the lecture from GitHub. This way you will be able to follow along and make the project in the prepared Jupyter Notebooks.

Lesson 12 – Text Categorization and Sentiment Classification

This lecture will teach you the following.

  • What is Text Categorization
  • Learn about the Bag-of-Words Model
  • Understand Naive Bayes’ Rule
  • How to use Naive Bayes’ Rule for sentiment classification (text categorization)
  • What problem smoothing solves

This will teach you how to categorize documents and get an understanding of the sentiment of the text. This is helpful in classifying whether a review is positive or negative.
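
A minimal Bag-of-Words plus Naive Bayes sentiment sketch with scikit-learn might look like this; the example reviews are invented, and the notebook may build the model differently.

```python
# Minimal sketch (assumed setup): Bag-of-Words + Naive Bayes sentiment classification
# with scikit-learn on invented reviews.
from sklearn.feature_extraction.text import CountVectorizer
from sklearn.naive_bayes import MultinomialNB
from sklearn.pipeline import make_pipeline

reviews = ["great movie, loved it", "terrible plot and bad acting",
           "wonderful and touching", "boring and bad"]
labels = ["positive", "negative", "positive", "negative"]

# CountVectorizer builds the Bag-of-Words; MultinomialNB applies Bayes' rule,
# and alpha=1.0 is additive (Laplace) smoothing.
model = make_pipeline(CountVectorizer(), MultinomialNB(alpha=1.0))
model.fit(reviews, labels)
print(model.predict(["a wonderful movie"]))
```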

Remember to get the Jupyter Notebooks used in the lecture from GitHub. This way you will be able to follow along and make the project in the prepared Jupyter Notebooks.

Lesson 13 – Information Retrieval

Here we will learn about the following.

  • Learn what Information Retrieval is
  • Topic modeling of documents
  • How to use Term Frequency and understand its limitations
  • Implement Term Frequency – Inverse Document Frequency (TF-IDF)
  • This will teach you how search engines like Google can find the most relevant pages
  • Make our own TF-IDF calculation to demonstrate the power

You will learn how to find the most significant words in a collection of documents. This teaches you how search engines like Google can find the most relevant pages.
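
Here is a minimal hand-rolled TF-IDF sketch over a tiny invented document collection, in the spirit of the “make our own TF-IDF calculation” point above.

```python
# Minimal sketch (assumed setup): a hand-rolled TF-IDF score over a tiny invented collection.
import math
from collections import Counter

docs = [
    "the cat sat on the mat".split(),
    "the dog sat on the log".split(),
    "the birds fly in the sky".split(),
]

def tf_idf(term, doc, docs):
    tf = Counter(doc)[term] / len(doc)              # term frequency in this document
    df = sum(1 for d in docs if term in d)          # number of documents containing the term
    idf = math.log(len(docs) / df) if df else 0.0   # inverse document frequency
    return tf * idf

print(tf_idf("cat", docs[0], docs))   # distinctive term -> higher score
print(tf_idf("the", docs[0], docs))   # appears in every document -> score 0.0
```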

Remember to get the Jupyter Notebooks used in the lecture from GitHub. This way you will be able to follow along and make the project in the prepared Jupyter Notebooks.

Video lecture released on December 7 at 16:00 CET

Lesson 14 – Information Extraction and Word2Vec

In this final lesson we will explore the following.

  • What is Information Extraction
  • Extract knowledge from patterns
  • Word representation
  • Skip-Gram architecture
  • See how words relate to each other (this is surprising)
  • How to use Word2Vec

This will teach you how artificial intelligence can get an understanding of words and extract meaning from them. This lecture will surprise you.
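
As a rough illustration, here is a minimal Word2Vec (Skip-Gram) sketch using the gensim library – the library choice and the toy corpus are assumptions, and real word vectors need far more text than this.

```python
# Minimal sketch (assumed setup): Word2Vec with the Skip-Gram architecture via gensim,
# trained on a tiny invented corpus (far too small for meaningful vectors).
from gensim.models import Word2Vec

sentences = [
    "the king rules the kingdom".split(),
    "the queen rules the kingdom".split(),
    "the dog chases the cat".split(),
    "the cat runs from the dog".split(),
]

# sg=1 selects the Skip-Gram architecture mentioned in the lesson
model = Word2Vec(sentences, vector_size=10, window=2, min_count=1, sg=1, epochs=100)
print(model.wv.most_similar("king"))   # words that appear in similar contexts
```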

Remember to get the Jupyter Notebooks used in the lecture from GitHub. This way you will be able to follow along and make the project in the prepared Jupyter Notebooks.

Video lecture released on December 14 at 16:00 CET