Fine Tune BERT for Text Classification with TensorFlow

4.6

165 ratings

Offered by

10,905 already enrolled

In this free Guided Project, you will:
2.5 hours
Intermediate
No download needed
Split-screen video
English
Desktop only

This is a guided project on fine-tuning a Bidirectional Encoder Representations from Transformers (BERT) model for text classification with TensorFlow. In this 2.5-hour-long project, you will learn to preprocess and tokenize data for BERT classification, build TensorFlow input pipelines for text data with the tf.data API, and train and evaluate a fine-tuned BERT model for text classification with TensorFlow 2 and TensorFlow Hub.

Prerequisites: In order to successfully complete this project, you should be competent in the Python programming language, be familiar with deep learning for Natural Language Processing (NLP), and have trained models with TensorFlow and its Keras API.

Note: This course works best for learners who are based in the North America region. We're currently working on providing the same experience in other regions.
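The "preprocess and tokenize data for BERT" step refers to WordPiece tokenization, which splits each word into sub-word pieces using greedy longest-match-first lookup against a fixed vocabulary. A minimal sketch of that idea follows; the `toy_vocab` here is a made-up stand-in for BERT's real vocabulary of roughly 30,000 entries, and the function name is illustrative, not part of any library API.

```python
# Simplified sketch of greedy longest-match-first ("WordPiece")
# tokenization as used by BERT. The tiny vocabulary below is an
# illustrative assumption, not the real BERT vocab.
def wordpiece_tokenize(word, vocab):
    """Split a single lowercased word into WordPiece sub-tokens."""
    tokens, start = [], 0
    while start < len(word):
        end, piece = len(word), None
        while start < end:
            candidate = word[start:end]
            if start > 0:              # non-initial pieces carry a "##" prefix
                candidate = "##" + candidate
            if candidate in vocab:
                piece = candidate      # longest matching piece found
                break
            end -= 1                   # shrink the window and retry
        if piece is None:              # no piece of any length matched
            return ["[UNK]"]
        tokens.append(piece)
        start = end
    return tokens

toy_vocab = {"play", "##ing", "##ed", "fine", "##tun", "##e"}
print(wordpiece_tokenize("playing", toy_vocab))    # ['play', '##ing']
print(wordpiece_tokenize("finetuned", toy_vocab))  # ['fine', '##tun', '##ed']
```

In practice the course uses a pretrained tokenizer shipped alongside the BERT model from TensorFlow Hub rather than a hand-rolled one, but the greedy matching above is the mechanism behind it.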

Prerequisites

Skills you will develop

  • natural-language-processing

  • TensorFlow

  • machine-learning

  • deep-learning

  • BERT

Learn step-by-step

In a video that plays in a split-screen with your work area, your instructor will guide you through each step:

How Guided Projects work

Your workspace is a cloud desktop right in your browser, no download required

In a split-screen video, your instructor guides you step-by-step

Reviews

Top reviews from FINE TUNE BERT FOR TEXT CLASSIFICATION WITH TENSORFLOW

View all reviews

Frequently asked questions