Course intended for:

The training course is aimed at developers and data analysts who want to learn the concepts of deep neural networks.

Course objective:

The course objective is to equip participants with knowledge about deep neural networks. Participants will learn to program and debug deep neural networks, including convolutional and recurrent neural networks. The neural network architectures discussed during the training will be presented through basic problems of computer vision (classification) and natural language processing (sentiment analysis, machine translation). Additionally, the newest research and the most popular applications of deep learning will be presented, such as automated generation of image descriptions and style transfer between images.

Course strengths:

The course is conducted by trainers who have both practical experience in applying deep neural networks to image and natural language processing and an excellent theoretical background gained through research work and participation in international workshops. The training curriculum is regularly updated after the most important professional conferences (NIPS, ACL, EMNLP) and major competitions (ILSVRC, MS COCO). The practical part of the training is conducted with the open-source software library TensorFlow.

Requirements:

The course requires basic programming skills in Python. In addition, participants are expected to know the basic concepts of probability, linear algebra (matrix multiplication) and mathematical analysis (partial derivatives). Basic knowledge of machine learning (at the level of a PYTHON/ML course) would also be helpful.

Parameters:

5 × 8 hours (5 × 7 net hours) of lectures and workshops.

Course curriculum:

  1. Introduction to neural networks
    1. Perceptron and multilayer perceptron
    2. Neural network learning
    3. Implementation of a neural network in Python (see the first sketch after this curriculum)
  2. Neural networks maintenance
    1. Initialization of neural network parameters
    2. Batch Normalization
    3. Optimization (update rules are sketched after the curriculum)
      1. SGD with Momentum
      2. Adagrad
      3. Adadelta
      4. RMSProp
      5. Adam
    4. Regularization:
      1. L1 and L2
      2. Dropout
  3. Convolutional neural network
    1. Basic concepts (a small convolutional classifier is sketched after the curriculum)
    2. Case studies: winners of ImageNet
      1. AlexNet (2012)
      2. ZFNet (2013)
      3. GoogLeNet (2014)
      4. VGGNet (2014)
      5. ResNet (2015)
    3. Interesting applications:
      1. "Pupy-snail", how Deep Dream works
      2. "A Neural Algorithm of Artistic Style", i.e. transfer of style between images
  4. Vector representations of words and documents
    1. Word2Vec
      1. CBOW (Continuous Bag of Words)
      2. Skip-gram (pair extraction is sketched after the curriculum)
      3. Application example: sentiment analysis
    2. Doc2Vec
      1. Distributed Memory Model
      2. Distributed Bag-of-Words
      3. Application example: recommendation of similar products on the basis of their reviews
  5. Recurrent Neural Networks
    1. Recurrent Neural Network
    2. Long Short-Term Memory
    3. Gated Recurrent Unit
    4. Application example: how the film "Sunspring" was made, i.e. text generation (sketched after the curriculum)
    5. Application example: automated generation of image descriptions
  6. Promising research directions
    1. Attention models
      1. Hard attention models
      2. Soft attention models (sketched after the curriculum)
      3. Application example:
    2. Recursive Neural Networks
    3. Memory models
      1. Neural Turing Machines
      2. Dynamic Memory Networks
      3. Application example: question answering about text and images
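
Illustrative code sketches:

The sketches below are minimal illustrations of selected curriculum topics, not the course's actual exercises. The first one corresponds to curriculum item 1: a two-layer perceptron trained with backpropagation. It assumes only NumPy; the XOR toy data set, the 2-4-1 architecture and the learning rate are illustrative choices.

    import numpy as np

    rng = np.random.default_rng(0)

    # XOR data: 4 samples, 2 features.
    X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
    y = np.array([[0], [1], [1], [0]], dtype=float)

    # Parameters of an illustrative 2-4-1 network.
    W1 = rng.normal(0, 1, (2, 4))
    b1 = np.zeros(4)
    W2 = rng.normal(0, 1, (4, 1))
    b2 = np.zeros(1)

    def sigmoid(z):
        return 1.0 / (1.0 + np.exp(-z))

    lr = 1.0
    for step in range(5000):
        # Forward pass.
        h = sigmoid(X @ W1 + b1)          # hidden activations
        p = sigmoid(h @ W2 + b2)          # output probabilities
        # Backward pass for the cross-entropy loss.
        dz2 = (p - y) / len(X)            # gradient at the output pre-activation
        dW2, db2 = h.T @ dz2, dz2.sum(0)
        dz1 = (dz2 @ W2.T) * h * (1 - h)  # gradient at the hidden pre-activation
        dW1, db1 = X.T @ dz1, dz1.sum(0)
        # Plain gradient-descent update.
        W1 -= lr * dW1; b1 -= lr * db1
        W2 -= lr * dW2; b2 -= lr * db2

    print(np.round(p, 2))  # should approach [[0], [1], [1], [0]]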
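The next sketch shows two of the update rules named in item 2.3, SGD with Momentum and Adam, again assuming only NumPy. The function `grad` is a hypothetical placeholder for any gradient routine; the toy objective at the end is illustrative.

    import numpy as np

    def sgd_momentum(w, grad, lr=0.01, gamma=0.9, steps=100):
        v = np.zeros_like(w)
        for _ in range(steps):
            v = gamma * v + lr * grad(w)   # accumulate a velocity term
            w = w - v                      # move against the smoothed gradient
        return w

    def adam(w, grad, lr=0.001, b1=0.9, b2=0.999, eps=1e-8, steps=100):
        m = np.zeros_like(w)
        v = np.zeros_like(w)
        for t in range(1, steps + 1):
            g = grad(w)
            m = b1 * m + (1 - b1) * g          # first-moment estimate
            v = b2 * v + (1 - b2) * g ** 2     # second-moment estimate
            m_hat = m / (1 - b1 ** t)          # bias correction
            v_hat = v / (1 - b2 ** t)
            w = w - lr * m_hat / (np.sqrt(v_hat) + eps)
        return w

    # Toy usage: minimize f(w) = ||w||^2, whose gradient is 2w.
    w0 = np.array([3.0, -2.0])
    print(sgd_momentum(w0, lambda w: 2 * w))
    print(adam(w0, lambda w: 2 * w, steps=2000))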
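For item 3 (convolutional neural networks), the following sketch defines a small image classifier with the Keras API of TensorFlow, the library named above. The layer sizes, the dropout rate and the MNIST data set are illustrative choices, not the course's actual exercise.

    import tensorflow as tf

    model = tf.keras.Sequential([
        tf.keras.Input(shape=(28, 28, 1)),
        tf.keras.layers.Conv2D(32, 3, activation="relu"),
        tf.keras.layers.MaxPooling2D(),
        tf.keras.layers.Conv2D(64, 3, activation="relu"),
        tf.keras.layers.MaxPooling2D(),
        tf.keras.layers.Flatten(),
        tf.keras.layers.Dropout(0.5),      # the dropout regularizer from item 2.4
        tf.keras.layers.Dense(10, activation="softmax"),
    ])
    model.compile(optimizer="adam",
                  loss="sparse_categorical_crossentropy",
                  metrics=["accuracy"])

    (x_train, y_train), _ = tf.keras.datasets.mnist.load_data()
    x_train = x_train[..., None] / 255.0   # scale pixels to [0, 1]
    model.fit(x_train, y_train, epochs=1, batch_size=128)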
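For item 4.1.2, this sketch shows how the skip-gram model extracts (center word, context word) training pairs from a corpus, in plain Python. The toy sentence and the window size are illustrative.

    corpus = "the quick brown fox jumps over the lazy dog".split()
    window = 2

    pairs = []
    for i, center in enumerate(corpus):
        # Each word predicts the words within `window` positions of it.
        for j in range(max(0, i - window), min(len(corpus), i + window + 1)):
            if j != i:
                pairs.append((center, corpus[j]))

    print(pairs[:6])
    # [('the', 'quick'), ('the', 'brown'), ('quick', 'the'), ...]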
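For item 5.4, the next sketch illustrates the character-level text generation idea behind "Sunspring": an LSTM is trained to predict the next character of a sequence. It assumes TensorFlow's Keras API; the tiny repeated corpus and the model size are illustrative, so the network will only memorize the phrase rather than write a screenplay.

    import numpy as np
    import tensorflow as tf

    text = "to be or not to be that is the question " * 20
    chars = sorted(set(text))
    idx = {c: i for i, c in enumerate(chars)}
    seq_len = 10

    # Build (sequence -> next character) training examples.
    X = np.array([[idx[c] for c in text[i:i + seq_len]]
                  for i in range(len(text) - seq_len)])
    y = np.array([idx[text[i + seq_len]] for i in range(len(text) - seq_len)])

    model = tf.keras.Sequential([
        tf.keras.layers.Embedding(len(chars), 16),
        tf.keras.layers.LSTM(64),
        tf.keras.layers.Dense(len(chars), activation="softmax"),
    ])
    model.compile(optimizer="adam", loss="sparse_categorical_crossentropy")
    model.fit(X, y, epochs=3, batch_size=64, verbose=0)

    # Predict one continuation character for a seed string.
    seed = "that is th"
    probs = model.predict(np.array([[idx[c] for c in seed]]), verbose=0)[0]
    print(seed + chars[int(np.argmax(probs))])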
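Finally, for item 6.1.2, this sketch shows the soft-attention computation in NumPy: a query (e.g. a decoder state) is scored against every encoder state, and the softmax weights give a convex combination of those states, the context vector. All shapes and values are illustrative.

    import numpy as np

    rng = np.random.default_rng(0)
    states = rng.normal(size=(5, 8))   # 5 encoder states of dimension 8
    query = rng.normal(size=8)         # current decoder state

    scores = states @ query                     # dot-product relevance scores
    weights = np.exp(scores - scores.max())
    weights /= weights.sum()                    # softmax over the 5 states
    context = weights @ states                  # weighted sum: the context vector

    print(np.round(weights, 3), context.shape)  # weights sum to 1, shape (8,)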


