Seq2seq Model with Attention

Skills You'll Gain

Reformer Models, Neural Machine Translation, Chatterbot, T5+BERT Models, Attention Models

Reviews

4.3 (835 ratings)

  • 5 stars
    66.10%
  • 4 stars
    15.20%
  • 3 stars
    9.10%
  • 2 stars
    5.26%
  • 1 star
    4.31%

DB

Jan 24, 2023

I learned a lot from this course, and the ungraded and graded problems are relevant to understanding how to build a transformer or a reformer from scratch.

AM

Oct 12, 2020

Great course! I really enjoyed extensive non-graded notebooks on LSH attention. Some content was pretty challenging, but always very rewarding!

Thank you!

From the lesson

Neural Machine Translation

Discover some of the shortcomings of a traditional seq2seq model and how to address them by adding an attention mechanism, then build a Neural Machine Translation model with attention that translates English sentences into German.
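For a concrete sense of what the added attention mechanism computes, below is a minimal NumPy sketch of scaled dot-product attention between decoder states (queries) and encoder states (keys and values). The function and variable names are illustrative only, not the course's actual implementation.

```python
import numpy as np

def dot_product_attention(queries, keys, values):
    """Scaled dot-product attention.

    queries: (m, d) decoder states, one per target position
    keys:    (n, d) encoder states, one per source position
    values:  (n, d) encoder states to be averaged
    Returns an (m, d) matrix: for each query, a weighted sum of the
    values, weighted by query-key similarity.
    """
    d = queries.shape[-1]
    scores = queries @ keys.T / np.sqrt(d)          # (m, n) similarities
    weights = np.exp(scores - scores.max(axis=-1, keepdims=True))
    weights /= weights.sum(axis=-1, keepdims=True)  # softmax over source positions
    return weights @ values                         # (m, d) context vectors

# Toy example: 3 source tokens, 2 target positions, model dimension 4.
rng = np.random.default_rng(0)
encoder_states = rng.normal(size=(3, 4))
decoder_states = rng.normal(size=(2, 4))
context = dot_product_attention(decoder_states, encoder_states, encoder_states)
print(context.shape)  # (2, 4)
```

Each row of the result is a context vector: a weighted average of the encoder states, weighted by how strongly each source position matches the current decoder state. This is the extra signal that lets the decoder look back at the whole source sentence instead of relying on a single fixed-length encoding, which is the main seq2seq shortcoming the lesson addresses.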

Taught By

  • Younes Bensouda Mourri
    Instructor

  • Łukasz Kaiser
    Instructor

  • Eddy Shyu
    Curriculum Architect
