Traditional Language Models


Skills you will learn

Word Embedding, Sentiment with Neural Nets, Siamese Networks, Natural Language Generation, Named-Entity Recognition

Reviews

4.5 (838 ratings)

  • 5 stars: 71.24%
  • 4 stars: 17.54%
  • 3 stars: 5.72%
  • 2 stars: 2.74%
  • 1 star: 2.74%

BS

Sep 25, 2020

5 stars

Great course as usual. I tried Siamese models but got very different results; I will need to study the concepts and implementation behind them further. Overall, I am glad I got hands-on with LSTMs.

KT

Sep 24, 2020

5 stars

The lectures are well planned: short and to the point. The labs offer ample opportunity for practice, and the assignment notebooks are well written. Overall, the course is fantastic!

From the lesson

Recurrent Neural Networks for Language Modeling

Learn about the limitations of traditional language models and see how RNNs and GRUs use sequential data for text prediction. Then build your own next-word generator using a simple RNN on Shakespeare text data!
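The core idea of the next-word generator described above is that an RNN consumes a text sequence one token at a time, carrying a hidden state forward, and outputs a probability distribution over the next token. The sketch below illustrates this with a minimal character-level RNN forward pass in NumPy; it is an illustrative assumption, not the course's own implementation (the weights are random and untrained, so the predictions are arbitrary), and all names, sizes, and the sample text are hypothetical.

```python
import numpy as np

# A tiny Shakespeare-like snippet; vocabulary and sizes are illustrative.
text = "to be or not to be that is the question "
chars = sorted(set(text))
vocab, hidden = len(chars), 32
c2i = {c: i for i, c in enumerate(chars)}

rng = np.random.default_rng(0)
Wxh = rng.normal(0, 0.01, (hidden, vocab))   # input -> hidden
Whh = rng.normal(0, 0.01, (hidden, hidden))  # hidden -> hidden (the recurrence)
Why = rng.normal(0, 0.01, (vocab, hidden))   # hidden -> output logits
bh, by = np.zeros(hidden), np.zeros(vocab)

def step(h, x_idx):
    """One RNN step: update the hidden state, return next-char probabilities."""
    x = np.zeros(vocab)
    x[x_idx] = 1.0                            # one-hot encode the input char
    h = np.tanh(Wxh @ x + Whh @ h + bh)       # new hidden state
    logits = Why @ h + by
    p = np.exp(logits - logits.max())         # numerically stable softmax
    return h, p / p.sum()

def next_char_probs(prefix):
    """Run the RNN over a prefix; return the next-character distribution."""
    h = np.zeros(hidden)
    p = np.ones(vocab) / vocab
    for c in prefix:
        h, p = step(h, c2i[c])
    return p

p = next_char_probs("to be ")
print(chars[int(p.argmax())])  # most likely next char (arbitrary: weights untrained)
```

Training would adjust `Wxh`, `Whh`, and `Why` by backpropagation through time so that the distribution `p` assigns high probability to the character that actually follows the prefix in the corpus.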

Taught by:

  • Younes Bensouda Mourri, Instructor
  • Łukasz Kaiser, Instructor
  • Eddy Shyu, Senior Curriculum Developer
