About this Course
156,758 recent views

100% online

Start instantly and learn at your own pace.

Flexible deadlines

Reset deadlines to fit your schedule.

Advanced level

Approx. 33 hours to complete

Suggested: 5 weeks of study, 4-5 hours per week...

English

Subtitles: English

Skills you will gain

Chatterbot, Tensorflow, Deep Learning, Natural Language Processing

Syllabus - What you will learn in this course

5 hours to complete

Intro and text classification

This module has two parts: first, a broad overview of the NLP field and our course goals; second, a text classification task. Text classification is probably the most common task you will deal with in real life: news-flow classification, sentiment analysis, spam filtering, etc. You will learn how to go from raw texts to predicted classes with both traditional methods (e.g. linear classifiers) and deep learning techniques (e.g. Convolutional Neural Networks).
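The raw-texts-to-predicted-classes pipeline described above can be sketched with the simplest possible ingredients: whitespace tokenization, a bag-of-words featurizer, and a perceptron-style linear classifier. The toy sentiment data and vocabulary here are invented for illustration; they are not the course's assignment code.

```python
def tokenize(text):
    """Lowercase and split on whitespace: the simplest preprocessing step."""
    return text.lower().split()

def featurize(text, vocab):
    """Map a text to a sparse bag-of-words count vector (a dict)."""
    counts = {}
    for tok in tokenize(text):
        if tok in vocab:
            counts[tok] = counts.get(tok, 0) + 1
    return counts

def train_perceptron(data, vocab, epochs=10):
    """Learn one weight per vocabulary word for binary sentiment."""
    w = {tok: 0.0 for tok in vocab}
    b = 0.0
    for _ in range(epochs):
        for text, label in data:  # label: +1 positive, -1 negative
            x = featurize(text, vocab)
            score = b + sum(w[t] * c for t, c in x.items())
            if label * score <= 0:  # misclassified: nudge weights toward label
                for t, c in x.items():
                    w[t] += label * c
                b += label
    return w, b

def predict(text, w, b, vocab):
    x = featurize(text, vocab)
    return 1 if b + sum(w[t] * c for t, c in x.items()) > 0 else -1

# Toy sentiment data (invented for illustration).
train = [
    ("great movie loved it", 1),
    ("awful boring movie", -1),
    ("loved the acting great fun", 1),
    ("boring and awful plot", -1),
]
vocab = {tok for text, _ in train for tok in tokenize(text)}
w, b = train_perceptron(train, vocab)
```

A real pipeline would swap in TF-IDF features and a regularized linear model (or a CNN), but the shape of the computation, features in and a weighted score out, stays the same.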

11 videos (Total 114 minutes), 3 readings, 3 quizzes
11 videos
Welcome video5m
Main approaches in NLP7m
Brief overview of the next weeks7m
[Optional] Linguistic knowledge in NLP10m
Text preprocessing14m
Feature extraction from text14m
Linear models for sentiment analysis10m
Hashing trick in spam filtering17m
Neural networks for words14m
Neural networks for characters8m
3 readings
Prerequisites check-list2m
Hardware for the course5m
Getting started with practical assignments20m
2 practice exercises
Classical text mining10m
Simple neural networks for text10m
5 hours to complete

Language modeling and sequence tagging

In this module we will treat texts as sequences of words. You will learn how to predict the next word given some previous words. This task is called language modeling, and it is used for search suggestions, machine translation, chat-bots, etc. You will also learn how to predict a sequence of tags for a sequence of words, which can be used to determine part-of-speech tags, named entities, or any other tags, e.g. ORIG and DEST in the query "flights from Moscow to Zurich". We will cover methods based on probabilistic graphical models and deep learning.
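As a minimal illustration of language modeling, here is a bigram model with add-one (Laplace) smoothing and a perplexity computation over a toy corpus. The corpus and the smoothing scheme are illustrative choices, not the specific models used in the lectures.

```python
import math
from collections import Counter

# Toy corpus; "." acts as a sentence boundary token.
corpus = "the cat sat . the dog sat . the cat ran .".split()
vocab = set(corpus)
V = len(vocab)

unigrams = Counter(corpus)
bigrams = Counter(zip(corpus, corpus[1:]))

def p_bigram(w_prev, w):
    """P(w | w_prev) with add-one smoothing: (c(prev, w) + 1) / (c(prev) + V)."""
    return (bigrams[(w_prev, w)] + 1) / (unigrams[w_prev] + V)

def perplexity(tokens):
    """Perplexity = exp of the average negative log-probability of the bigrams."""
    logp = sum(math.log(p_bigram(a, b)) for a, b in zip(tokens, tokens[1:]))
    return math.exp(-logp / (len(tokens) - 1))
```

Add-one smoothing keeps unseen bigrams from getting zero probability, which would make perplexity infinite; the lectures discuss more refined smoothing schemes for the same problem.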

8 videos (Total 84 minutes), 2 readings, 3 quizzes
8 videos
Perplexity: is our model surprised with a real text?8m
Smoothing: what if we see new n-grams?7m
Hidden Markov Models13m
Viterbi algorithm: what are the most probable tags?11m
MEMMs, CRFs and other sequential models for Named Entity Recognition11m
Neural Language Models9m
Whether you need to predict a next word or a label - LSTM is here to help!11m
2 readings
Perplexity computation10m
Probabilities of tag sequences in HMMs20m
2 practice exercises
Language modeling15m
Sequence tagging with probabilistic models20m
5 hours to complete

Vector Space Models of Semantics

This module is devoted to a higher level of abstraction for texts: we will learn vectors that represent meanings. First, we will discuss traditional models of distributional semantics. They are based on a very intuitive idea: "you shall know a word by the company it keeps". Second, we will cover modern tools for word and sentence embeddings, such as word2vec, FastText, StarSpace, etc. Finally, we will discuss how to embed whole documents with topic models and how these models can be used for search and data exploration.
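The word-analogy arithmetic mentioned in the videos ("king - man + woman") can be sketched with hand-picked toy vectors and cosine similarity. Real embeddings such as word2vec would be learned from a corpus, and, as the lecture title hints, the arithmetic is subtler in practice than this sketch suggests.

```python
import math

# Hand-crafted 3-d vectors, invented for illustration only:
# dimensions roughly encode "royalty", "male", "female".
emb = {
    "king":  [0.9, 0.8, 0.1],
    "queen": [0.9, 0.1, 0.8],
    "man":   [0.1, 0.9, 0.1],
    "woman": [0.1, 0.1, 0.9],
}

def cosine(u, v):
    """Cosine similarity between two vectors."""
    dot = sum(a * b for a, b in zip(u, v))
    nu = math.sqrt(sum(a * a for a in u))
    nv = math.sqrt(sum(b * b for b in v))
    return dot / (nu * nv)

def analogy(a, b, c, exclude):
    """Return the word whose vector is closest to vec(a) - vec(b) + vec(c)."""
    target = [x - y + z for x, y, z in zip(emb[a], emb[b], emb[c])]
    return max((w for w in emb if w not in exclude),
               key=lambda w: cosine(emb[w], target))
```

Note that the query words themselves are excluded from the candidates; without that exclusion the nearest neighbor of "king - man + woman" is often "king" itself, which is part of why the lecture calls the clean analogy picture "magic".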

8 videos (Total 83 minutos), 3 quizzes
8 videos
Explicit and implicit matrix factorization13m
Word2vec and doc2vec (and how to evaluate them)10m
Word analogies without magic: king – man + woman != queen11m
Why words? From character to sentence embeddings11m
Topic modeling: a way to navigate through text collections7m
How to train PLSA?6m
The zoo of topic models13m
2 practice exercises
Word and sentence embeddings15m
Topic Models10m
5 hours to complete

Sequence to sequence tasks

Nearly any task in NLP can be formulated as a sequence-to-sequence task: machine translation, summarization, question answering, and many more. In this module we will study a general encoder-decoder architecture with attention that can be used to solve these tasks. We will cover machine translation in more detail, and you will see how the attention technique resembles the word alignment task in the traditional pipeline.
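The attention step inside an encoder-decoder model can be sketched in a few lines: the decoder query scores each encoder state, a softmax turns the scores into weights, and the context vector is their weighted sum. Dot-product scoring and the toy vectors below are assumptions for illustration; real models use learned hidden states and often a parameterized scoring function.

```python
import math

def softmax(xs):
    """Numerically stable softmax over a list of scores."""
    m = max(xs)
    exps = [math.exp(x - m) for x in xs]
    s = sum(exps)
    return [e / s for e in exps]

def attend(query, encoder_states):
    """Dot-product attention: score, normalize, then take a weighted sum."""
    scores = [sum(q * h for q, h in zip(query, state)) for state in encoder_states]
    weights = softmax(scores)
    context = [sum(w * state[i] for w, state in zip(weights, encoder_states))
               for i in range(len(query))]
    return weights, context

# Toy 2-d encoder states and a decoder query.
states = [[1.0, 0.0], [0.0, 1.0], [1.0, 1.0]]
weights, context = attend([1.0, 0.0], states)
```

The weights form a soft distribution over source positions, which is why an attention heatmap over a translated sentence looks so much like the word alignments produced by the traditional statistical pipeline.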

9 videos (Total 98 minutos), 4 quizzes
9 videos
Noisy channel: said in English, received in French6m
Word Alignment Models12m
Encoder-decoder architecture6m
Attention mechanism9m
How to deal with a vocabulary?12m
How to implement a conversational chat-bot?11m
Sequence to sequence learning: one-size fits all?10m
Get to the point! Summarization with pointer-generator networks12m
3 practice exercises
Introduction to machine translation10m
Encoder-decoder architectures20m
Summarization and simplification15m
84 reviews


started a new career after completing these courses


got a tangible career benefit from this course


got a pay increase or promotion

Top reviews for Natural Language Processing

by GY, Mar 24th 2018

Great thanks to this amazing course! I learned a lot on state-of-the-art natural language processing techniques! Really like your awesome programming assignments! See you HSE guys in next class!

by MV, Mar 18th 2019

Definitely best course in the Specialization! Lecturers, projects and forum - everything is super organized. Only StarSpace was pain in the ass, but I managed :)



Anna Potapenko

HSE Faculty of Computer Science

Alexey Zobnin

Associate Professor
HSE Faculty of Computer Science

Anna Kozlova

Team Lead

Sergey Yudin


Andrei Zimovnov

Senior Lecturer
HSE Faculty of Computer Science

About National Research University Higher School of Economics

National Research University - Higher School of Economics (HSE) is one of the top research universities in Russia. Established in 1992 to promote new research and teaching in economics and related disciplines, it now offers programs at all levels of university education across an extraordinary range of fields of study including business, sociology, cultural studies, philosophy, political science, international relations, law, Asian studies, media and communications, mathematics, engineering, and more. Learn more on

About the Advanced Machine Learning Specialization

This specialization gives an introduction to deep learning, reinforcement learning, natural language understanding, computer vision and Bayesian methods. Top Kaggle machine learning practitioners and CERN scientists will share their experience of solving real-world problems and help you to fill the gaps between theory and practice. Upon completion of 7 courses you will be able to apply modern machine learning methods in enterprise and understand the caveats of real-world data and settings....
Advanced Machine Learning

Frequently Asked Questions

  • Once you enroll for a Certificate, you will have access to all videos, quizzes, and programming assignments (if applicable). Peer-reviewed assignments can only be submitted and reviewed once your session has started. If you choose to explore the course without purchasing it, you may not be able to access certain assignments.

  • When you enroll in a course, you get access to all of the courses in the Specialization, and you earn a Certificate when you complete the work. Your electronic Certificate will be added to your Achievements page; from there, you can print your Certificate or add it to your LinkedIn profile. If you only want to read and view the course content, you can audit the course for free.

Have more questions? Visit the Learner Help Center.