About this Course

45,842 recent views
Shareable Certificate
Earn a Certificate upon completion
100% online
Start instantly and learn at your own pace.
Flexible deadlines
Reset deadlines to fit your schedule.
Intermediate Level
Approx. 23 hours to complete
English
Subtitles: English

Skills you will gain

Reformer Models, Neural Machine Translation, Chatterbot, T5+BERT Models, Attention Models

Offered by

deeplearning.ai

Syllabus - What you will learn in this course

Week 1

5 hours to complete

Neural Machine Translation

8 videos (Total 50 minutes), 2 readings, 1 quiz
8 videos
Alignment 4m
Attention 6m
Setup for Machine Translation 3m
Training an NMT with Attention 6m
Evaluation for Machine Translation 8m
Sampling and Decoding 9m
Beam Search (Optional) 5m
2 readings
Connect with your mentors and fellow learners on Slack! 10m
Content Resource 10m
Week 2

6 hours to complete

Text Summarization

7 videos (Total 43 minutes), 1 reading, 1 quiz
7 videos
Transformer Applications 8m
Dot-Product Attention 7m
Causal Attention 4m
Multi-head Attention 6m
Transformer decoder 5m
Transformer summarizer 4m
1 reading
Content Resource 10m
Week 3

6 hours to complete

Question Answering

10 videos (Total 44 minutes), 1 reading, 1 quiz
10 videos
Transfer Learning in NLP 6m
ELMo, GPT, BERT, T5 7m
Bidirectional Encoder Representations from Transformers (BERT) 4m
BERT Objective 2m
Fine tuning BERT 2m
Transformer: T5 3m
Multi-task training strategy 5m
GLUE Benchmark 2m
Question Answering 2m
1 reading
Content Resource 10m
Week 4

6 hours to complete

Chatbot

6 videos (Total 21 minutes), 3 readings, 1 quiz
6 videos
Transformer Complexity 3m
LSH Attention 4m
Motivation for Reversible Layers: Memory! 2m
Reversible Residual Layers 5m
Reformer 2m
3 readings
Optional KNN & LSH Review 20m
Acknowledgments 10m
References 10m

About the Natural Language Processing Specialization

Natural Language Processing (NLP) uses algorithms to understand and manipulate human language. This technology is one of the most broadly applied areas of machine learning. As AI continues to expand, so will the demand for professionals skilled at building models that analyze speech and language, uncover contextual patterns, and produce insights from text and audio.

By the end of this Specialization, you will be ready to design NLP applications that perform question-answering and sentiment analysis, create tools to translate languages and summarize text, and even build chatbots. These and other NLP applications are going to be at the forefront of the coming transformation to an AI-powered future.

This Specialization is designed and taught by two experts in NLP, machine learning, and deep learning. Younes Bensouda Mourri is an Instructor of AI at Stanford University who also helped build the Deep Learning Specialization. Łukasz Kaiser is a Staff Research Scientist at Google Brain and co-author of TensorFlow, the Tensor2Tensor and Trax libraries, and the Transformer paper.
Natural Language Processing

Frequently Asked Questions

More questions? Visit the Learner Help Center.