N-Grams: Overview

Skills You Will Learn

Word2vec, Parts-of-Speech Tagging, N-gram Language Models, Autocorrect

Reviews

4.7 (1,443 ratings)

  • 5 stars
    79.69%
  • 4 stars
    14.83%
  • 3 stars
    3.53%
  • 2 stars
    0.76%
  • 1 star
    1.17%

AH

Sep 28, 2020

Very good course! It helped me clearly learn about autocorrect, edit distance, Markov chains, n-grams, perplexity, backoff, interpolation, word embeddings, and CBOW. This was very helpful!

SR

Aug 4, 2021

Another great course introducing probabilistic modelling concepts and gradually moving toward neural networks. One must learn in detail how embeddings work.

From This Lesson

Autocomplete and Language Models

Learn how N-gram language models work by calculating sequence probabilities, then build your own autocomplete language model using a text corpus from Twitter!
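As a rough illustration of what "calculating sequence probabilities" means here, below is a minimal bigram sketch in Python. It uses a made-up toy corpus (not the course's Twitter data) and maximum-likelihood estimates: count word pairs, turn counts into conditional probabilities, and rank candidate next words for autocomplete.

```python
from collections import Counter, defaultdict

# Hypothetical toy corpus for illustration (the course uses a Twitter corpus).
corpus = [
    "i am happy because i am learning",
    "i am learning about n gram models",
    "i am happy today",
]

# Tokenize and add boundary markers so the first and last words are also modeled.
sentences = [["<s>"] + line.split() + ["</s>"] for line in corpus]

# Count bigrams: bigram_counts[prev][next] = number of times "prev next" occurs.
bigram_counts = defaultdict(Counter)
for tokens in sentences:
    for prev, nxt in zip(tokens, tokens[1:]):
        bigram_counts[prev][nxt] += 1

def next_word_probs(prev_word):
    """Maximum-likelihood estimate: P(w | prev_word) = count(prev_word, w) / count(prev_word, *)."""
    counts = bigram_counts[prev_word]
    total = sum(counts.values())
    return {word: count / total for word, count in counts.items()}

# Autocomplete: rank the most probable continuations of "am".
suggestions = sorted(next_word_probs("am").items(), key=lambda kv: -kv[1])
print(suggestions)  # e.g. [('happy', 0.5), ('learning', 0.5)]
```

A full autocomplete model, as built in this lesson, would extend this idea to higher-order n-grams and add smoothing and backoff so that unseen sequences still receive non-zero probability.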

Taught By

  • Younes Bensouda Mourri

    Instructor

  • Łukasz Kaiser

    Instructor

  • Eddy Shyu

    Curriculum Architect
