etude-Transformers

[Badges: Bump Version · Commitizen friendly · License: MIT]

[Screenshot 1] [Screenshot 2]

Transformers provides general-purpose architectures (BERT, GPT-2, RoBERTa, XLM, DistilBERT, XLNet, …) for Natural Language Understanding (NLU) and Natural Language Generation (NLG), with 32+ pretrained models in 100+ languages and deep interoperability between TensorFlow 2.0 and PyTorch.
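As a minimal illustration of the library described above (not code from this repository), the Hugging Face `pipeline` API can run a pretrained model in a few lines. This sketch assumes `pip install transformers` plus a backend (PyTorch or TensorFlow 2.0); the first run downloads the pipeline's default pretrained model.

```python
# Minimal sketch using the Hugging Face Transformers pipeline API.
# Assumes `transformers` is installed along with PyTorch or TensorFlow 2.0;
# the first call downloads a default pretrained sentiment-analysis model.
from transformers import pipeline

classifier = pipeline("sentiment-analysis")
result = classifier("Transformers provides general-purpose architectures for NLU and NLG.")
print(result[0]["label"], round(result[0]["score"], 3))
```

The task name ("sentiment-analysis") selects the architecture and checkpoint automatically; a specific model can be pinned with the `model` argument instead of relying on the default.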

Requirements

  • A (Version x.y.z or higher)
    • B
    • C
  • D
    • E

How to

Setup

(T.B.D.)

Develop

(T.B.D.)

Run

(T.B.D.)

Lint

(T.B.D.)

Test

(T.B.D.)

Deploy

(T.B.D.)

Document

(T.B.D.)

CHANGELOG.md

CHANGELOG.md is generated by Commitizen:

cz changelog

Misc

Notes

This repository is Commitizen friendly.
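For illustration, Commitizen enforces the Conventional Commits message format, which the `cz changelog` command above parses to build CHANGELOG.md. A commit message in that format looks like the following (the scope and description here are hypothetical examples, not commits from this repository):

```
fix(readme): correct typos in the project description
```

The leading type (`feat`, `fix`, `docs`, …) and optional scope determine how the commit is grouped in the generated changelog.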

LICENSE

See LICENSE.

Contributors
