Oct 21, 2020, 3:00 PM – Nov 13, 2020, 12:22 PM UTC

Machine Learning

• Week 1: Overview of Machine Learning

– Part 1: Data And Terms

◦ Lecture Introduction.

◦ What Is Machine Learning?

◦ Why Estimate f?

◦ Types of Learning.

◦ Data Types And Datasets.

◦ Model Performance.

– Part 2: Regression Models and Linear Regression

◦ What Is the Linear Regression Problem?

◦ Defining the Loss Function.

◦ Interpreting the Loss Function and Estimating Parameters.

◦ Finding Minima with the Gradient Descent Algorithm.

– Part 3: Regression Models and Logistic Regression

◦ Decision theory.

◦ What Is the Logistic Regression Problem?

◦ Defining the Loss Function.

◦ Interpreting the Loss Function and Estimating Parameters.
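Week 1 closes with minimizing a loss via gradient descent. As a rough sketch of that idea (the toy data, learning rate, and step count below are illustrative, not part of the course materials), a linear regression can be fit like this:

```python
import numpy as np

def gradient_descent(X, y, lr=0.1, steps=500):
    """Fit y ≈ X @ w + b by gradient descent on mean squared error."""
    n, d = X.shape
    w, b = np.zeros(d), 0.0
    for _ in range(steps):
        r = X @ w + b - y                # residuals: prediction minus target
        w -= lr * (2 / n) * (X.T @ r)    # gradient of MSE w.r.t. w
        b -= lr * (2 / n) * r.sum()      # gradient of MSE w.r.t. b
    return w, b

# Toy data drawn from the line y = 3x + 1
X = np.linspace(0, 1, 50).reshape(-1, 1)
y = 3 * X[:, 0] + 1
w, b = gradient_descent(X, y)
print(w[0], b)  # w ≈ 3, b ≈ 1
```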

• Week 2: Model Performance

– Part 1: Train - Validate - Test

◦ Why Do We Need to Evaluate the Model?

◦ Splitting the dataset.

◦ Definitions of datasets.

– Part 2: Evaluating Regression Models

◦ Evaluating Linear Regression.

◦ Evaluating Logistic Regression: Misclassification Error.

◦ Evaluating Logistic Regression: Confusion Matrix.

– Part 3: The Problem of Overfitting

◦ Bias-Variance Trade-off.

◦ Hyperparameters.

◦ Homework 1.
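Week 2's Part 1 topic, splitting a dataset into train, validation, and test sets, can be sketched with a hypothetical `train_val_test_split` helper (the fractions and seed below are illustrative):

```python
import numpy as np

def train_val_test_split(X, y, val_frac=0.2, test_frac=0.2, seed=0):
    """Shuffle once, then cut the data into three disjoint parts."""
    rng = np.random.default_rng(seed)
    idx = rng.permutation(len(X))
    n_test = int(len(X) * test_frac)
    n_val = int(len(X) * val_frac)
    test, val, train = idx[:n_test], idx[n_test:n_test + n_val], idx[n_test + n_val:]
    return (X[train], y[train]), (X[val], y[val]), (X[test], y[test])

X = np.arange(100).reshape(100, 1)
y = np.arange(100)
train, val, test = train_val_test_split(X, y)
print(len(train[0]), len(val[0]), len(test[0]))  # 60 20 20
```

The model is fit on the train set, hyperparameters are tuned against the validation set, and the test set is touched only once, for the final evaluation.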

• Week 3: Introduction To Deep Learning

– Part 1: Perceptrons and Forward Propagation

◦ Why Deep Learning?

◦ Single-layer Perceptrons.

◦ Activation Functions.

◦ Multi-layer Perceptrons.

– Part 2: Computational Graphs And Backpropagation

◦ Defining Computational Graph.

◦ Mathematics of Backpropagation.

◦ Backpropagation in Multi-layer Perceptrons.

– Part 3: Model, Loss, and Optimizer

◦ Common Deep Learning Architectures.

◦ Other Loss Functions.

◦ Optimizers In Deep Learning.

◦ Homework 2.
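Week 3's forward-propagation topic can be sketched as a tiny multi-layer perceptron in numpy (the layer sizes and random weights here are illustrative):

```python
import numpy as np

def relu(z):
    """ReLU activation: max(0, z) elementwise."""
    return np.maximum(0.0, z)

def forward(x, params):
    """One hidden layer: x -> ReLU(W1 x + b1) -> W2 h + b2."""
    W1, b1, W2, b2 = params
    h = relu(W1 @ x + b1)   # hidden activations
    return W2 @ h + b2      # raw output (no activation on the head)

rng = np.random.default_rng(0)
params = (rng.normal(size=(4, 3)), np.zeros(4),   # 3 inputs -> 4 hidden
          rng.normal(size=(2, 4)), np.zeros(2))   # 4 hidden -> 2 outputs
out = forward(np.ones(3), params)
print(out.shape)  # (2,)
```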

• Week 4: Images And Convolutional Neural Networks

– Part 1: Basic Image Processing

◦ Representation Of An Image And Color Space.

◦ Kernels.

◦ Morphological Operations.

– Part 2: Convolutional Neural Networks

◦ Convolution Operator.

◦ Feature Extraction and Classification.

◦ History Of Computer Vision.

– Part 3: Computer Vision Tasks

◦ Object Detection.

◦ Image Segmentation.

◦ Homework 3.
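The convolution operator from Week 4 Part 2 can be sketched as a naive loop over image patches (implemented as cross-correlation, as deep-learning libraries typically do; the image and kernel are toy values):

```python
import numpy as np

def conv2d(image, kernel):
    """Valid cross-correlation of a 2-D image with a small kernel."""
    kh, kw = kernel.shape
    oh = image.shape[0] - kh + 1
    ow = image.shape[1] - kw + 1
    out = np.zeros((oh, ow))
    for i in range(oh):
        for j in range(ow):
            out[i, j] = (image[i:i + kh, j:j + kw] * kernel).sum()
    return out

image = np.arange(16.0).reshape(4, 4)  # values 0..15, increasing left to right
edge = np.array([[1.0, -1.0]])         # horizontal difference kernel
out = conv2d(image, edge)
print(out)  # every entry is -1.0: each right neighbour is larger by 1
```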

• Week 5: Texts And Sequential Models

– Part 1: Vanilla Recurrent Neural Networks

◦ Main Idea Behind RNNs.

◦ RNN With Examples.

◦ Backpropagation Through Time.

◦ Multilayer & Bidirectional RNNs.

– Part 2: Canonical Recurrent Neural Networks

◦ Vanishing And Exploding Gradients.

◦ Types Of Canonical RNNs: LSTM, GRU, Echo State.

– Part 3: Introduction To Natural Language Processing

◦ Representation Of Text.

◦ Tokenization, Stemming, Lemmatization.

◦ N-Grams and Markov Assumption.

◦ Homework 4.
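The main idea behind vanilla RNNs from Week 5 Part 1 can be sketched as a single recurrence, h_t = tanh(Wxh x_t + Whh h_{t-1} + b), unrolled over a toy sequence (the dimensions and random weights are illustrative):

```python
import numpy as np

def rnn_forward(xs, Wxh, Whh, bh):
    """Run a vanilla RNN over a sequence, returning all hidden states."""
    h = np.zeros(Whh.shape[0])          # initial hidden state h_0 = 0
    states = []
    for x in xs:                        # h_t = tanh(Wxh x_t + Whh h_{t-1} + b)
        h = np.tanh(Wxh @ x + Whh @ h + bh)
        states.append(h)
    return np.stack(states)

rng = np.random.default_rng(0)
xs = rng.normal(size=(5, 3))            # sequence of 5 inputs, each 3-dim
H = rnn_forward(xs,
                rng.normal(size=(4, 3)),  # input-to-hidden weights
                rng.normal(size=(4, 4)),  # hidden-to-hidden weights
                np.zeros(4))
print(H.shape)  # (5, 4): one 4-dim hidden state per time step
```

Backpropagation through time differentiates through this same unrolled loop, which is where the vanishing and exploding gradients of Part 2 come from.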

• Week 6: Generative Models

– Part 1: Autoencoders

◦ Main Idea Behind Autoencoders.

◦ Sparse Autoencoders, Denoising Autoencoders.

◦ Variational Autoencoders.

– Part 2: Generative Adversarial Networks

◦ GANs.

◦ DCGANs.

– Part 3: Summary And Discussion

◦ Papers And Textbooks.

◦ Academics And Companies.

◦ Software.
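The autoencoder idea from Week 6 Part 1 can be sketched as a linear encode/decode pair trained on reconstruction error (real autoencoders add nonlinearities, and variational ones add a stochastic latent code; the data, sizes, and learning rate here are illustrative):

```python
import numpy as np

rng = np.random.default_rng(0)
X = rng.normal(size=(200, 8))            # toy dataset, 8 features

# Minimal linear autoencoder: 8 -> 2 -> 8, trained by gradient
# descent on the mean squared reconstruction error.
W_enc = rng.normal(scale=0.1, size=(8, 2))
W_dec = rng.normal(scale=0.1, size=(2, 8))

def mse():
    return float(np.mean((X @ W_enc @ W_dec - X) ** 2))

before = mse()
for _ in range(2000):
    Z = X @ W_enc                        # 2-dim bottleneck code
    err = Z @ W_dec - X                  # reconstruction error
    g_dec = Z.T @ err / len(X)           # gradient w.r.t. decoder weights
    g_enc = X.T @ (err @ W_dec.T) / len(X)  # gradient w.r.t. encoder weights
    W_dec -= 0.01 * g_dec
    W_enc -= 0.01 * g_enc

print(before > mse())  # reconstruction error drops during training
```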

## Elif Irem Kulcu

Yıldız Technical University

GDSC Lead

## Mert Tuna Kurnaz

Yıldız Technical University

Core Team Member

## İzzettin Karasayar

Core Team Member

## Fatih AY

Core Team Member

## Ömerhan Sancak

Core Team Member

## Elif Kundu

Yıldız Technical University

Core Team Member

## Özgür Ağar

Yıldız Technical University

Core Team Member

## Osman Bahadır Avcı

Core Team Member

## Yasemin Atmaca

Core Team Member