Week Conclusion

View Syllabus

Skills You Will Gain

Reformer Models, Neural Machine Translation, Chatterbot, T5+BERT Models, Attention Models

Reviews

4.3 (821 ratings)

  • 5 stars
    66.13%
  • 4 stars
    14.98%
  • 3 stars
    9.13%
  • 2 stars
    5.35%
  • 1 star
    4.38%

AM

Oct 12, 2020

Great course! I really enjoyed the extensive non-graded notebooks on LSH attention. Some content was pretty challenging, but always very rewarding!

Thank you!

SB

Dec 31, 2020

One of the best courses I have ever taken. The course provides in-depth learning of transformers from the creators of Transformers.

From the Lesson

Neural Machine Translation

Discover some of the shortcomings of a traditional seq2seq model and how to address them by adding an attention mechanism, then build a Neural Machine Translation model with attention that translates English sentences into German.
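The attention mechanism described above lets the decoder weight every encoder state at each step instead of relying on a single fixed context vector. A minimal NumPy sketch of scaled dot-product attention for one decoder step (an illustration only; the function and variable names are assumptions, not the course's notebook code):

```python
import numpy as np

def softmax(x, axis=-1):
    # Numerically stable softmax.
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def dot_product_attention(query, keys, values):
    """Scaled dot-product attention for a single decoder step.

    query:  (d,)   current decoder hidden state
    keys:   (T, d) encoder hidden states
    values: (T, d) encoder hidden states (often identical to keys)
    Returns the context vector (d,) and attention weights (T,).
    """
    scores = keys @ query / np.sqrt(query.shape[-1])  # similarity of query to each position
    weights = softmax(scores)                         # normalize to a distribution over positions
    context = weights @ values                        # weighted sum of encoder states
    return context, weights

# Toy example: 4 source positions, hidden size 3.
rng = np.random.default_rng(0)
enc_states = rng.normal(size=(4, 3))
dec_state = rng.normal(size=(3,))
context, weights = dot_product_attention(dec_state, enc_states, enc_states)
```

Because `weights` is a proper probability distribution over source positions, the decoder can attend to different parts of the input sentence at every output step, which is the fix for the fixed-length bottleneck of plain seq2seq.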

Taught By

  • Younes Bensouda Mourri
    Instructor
  • Łukasz Kaiser
    Instructor
  • Eddy Shyu
    Curriculum Architect

Explore Our Catalog

Join for free and get personalized recommendations, updates and offers.