GitHub repo with tutorials to fine-tune transformers for different NLP tasks
A flexible, adaptive classification system for dynamic text classification
Transformers 3rd Edition
An all-in-one AI audio playground using Cloudflare AI Workers to transcribe, analyze, summarize, and translate any audio file.
Models to perform neural summarization (extractive and abstractive) using machine learning transformers and a tool to convert abstractive summarization datasets to the extractive task.
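The extractive task mentioned above reduces summarization to scoring and selecting sentences. As a rough, hypothetical illustration of that selection step (the repo's transformer models replace the scoring with learned sentence representations), here is a toy frequency-based extractive summarizer:

```python
import re
from collections import Counter

def extractive_summary(text: str, k: int = 2) -> list[str]:
    """Toy extractive summarizer: score each sentence by the total
    document-level frequency of its words, then keep the top-k
    sentences in their original order. (Transformer-based models
    replace this scoring step with learned embeddings.)"""
    sentences = [s.strip() for s in re.split(r"(?<=[.!?])\s+", text) if s.strip()]
    freq = Counter(re.findall(r"\w+", text.lower()))
    scored = [(sum(freq[w] for w in re.findall(r"\w+", s.lower())), i, s)
              for i, s in enumerate(sentences)]
    # pick k highest-scoring sentences, then restore document order
    top = sorted(sorted(scored, reverse=True)[:k], key=lambda t: t[1])
    return [s for _, _, s in top]
```

The sentence-selection interface (text in, subset of sentences out) is what an abstractive-to-extractive dataset conversion targets.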
Sentiment analysis neural network trained by fine-tuning BERT, ALBERT, or DistilBERT on the Stanford Sentiment Treebank.
Build and train state-of-the-art natural language processing models using BERT
Pytorch-Named-Entity-Recognition-with-transformers
Simple State-of-the-Art BERT-Based Sentence Classification with Keras / TensorFlow 2. Built with HuggingFace's Transformers.
TensorFlow and Keras implementations of state-of-the-art research in dialog system NLU
Multi-class text classification for products based on their descriptions, using machine learning algorithms and neural networks (MLP, CNN, DistilBERT).
A collection of resources on using BERT (https://arxiv.org/abs/1810.04805) and related language models in production environments.
Distillation of BERT model with catalyst framework
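Knowledge distillation, the technique behind DistilBERT, trains a small student to match a large teacher's softened output distribution. A minimal numpy sketch of the temperature-scaled distillation loss, independent of any particular framework (Catalyst wraps a loss like this in its training loop; the function names here are illustrative, not Catalyst's API):

```python
import numpy as np

def softmax(logits: np.ndarray, T: float = 1.0) -> np.ndarray:
    """Temperature-scaled softmax; higher T softens the distribution."""
    z = logits / T
    z = z - z.max(axis=-1, keepdims=True)  # subtract max for numerical stability
    e = np.exp(z)
    return e / e.sum(axis=-1, keepdims=True)

def distillation_loss(student_logits, teacher_logits, T: float = 2.0) -> float:
    """KL divergence between the teacher's and student's softened
    distributions, scaled by T^2 as in Hinton et al.'s formulation."""
    p_teacher = softmax(np.asarray(teacher_logits, dtype=float), T)
    p_student = softmax(np.asarray(student_logits, dtype=float), T)
    kl = np.sum(p_teacher * (np.log(p_teacher) - np.log(p_student)), axis=-1)
    return float(T * T * np.mean(kl))
```

In practice this term is combined with the ordinary cross-entropy on hard labels (and, for DistilBERT, a cosine embedding loss between hidden states).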
FoodBERT: Food Extraction with DistilBERT
DistilBERT model pre-trained on 131 GB of Japanese web text. The teacher is a BERT-base model built in-house at LINE.
Aura is an emotion-aware music recommender that understands your mood from natural language input. It uses a fine-tuned BERT model trained on the GoEmotions dataset to detect your emotion. Based on the emotion, Aura suggests a song to comfort your mood or celebrate it through music.
Topic clustering library built on Transformer embeddings and cosine similarity metrics. Compatible with all BERT-base transformers from Hugging Face.
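The core of such a library is grouping texts whose embedding vectors exceed a cosine-similarity threshold. A minimal sketch with numpy, using made-up vectors in place of real BERT sentence embeddings (any Hugging Face encoder would produce the inputs; the greedy strategy here is one simple choice, not necessarily the library's):

```python
import numpy as np

def cosine_sim(a: np.ndarray, b: np.ndarray) -> float:
    """Cosine similarity between two vectors."""
    return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))

def threshold_cluster(embeddings: np.ndarray, threshold: float = 0.8) -> list[int]:
    """Greedy clustering: each vector joins the first existing cluster whose
    centroid it is similar enough to, otherwise it starts a new cluster.
    Returns one cluster id per input row."""
    centroids: list[np.ndarray] = []
    labels: list[int] = []
    for v in embeddings:
        for cid, c in enumerate(centroids):
            if cosine_sim(v, c) >= threshold:
                labels.append(cid)
                centroids[cid] = (c + v) / 2  # update running centroid
                break
        else:
            labels.append(len(centroids))
            centroids.append(v.astype(float))
    return labels
```

For example, two near-parallel vectors land in one cluster while an orthogonal one starts its own.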
Compares the DistilBERT and MobileBERT architectures for mobile deployments.
Task complexity classifier using a Transformer-based NLP model, with complexity levels based on Bloom's Taxonomy