goin2crazy/bard-cnn-large-finetuning


BART-Large-CNN LoRA Fine-Tuning with Transformers and PEFT

This repository contains a Jupyter Notebook (*.ipynb) that demonstrates LoRA (Low-Rank Adaptation) fine-tuning of the 'bart-large-cnn' model using the Transformers and PEFT libraries.

Instructions

Environment Setup

Before running the notebook, make sure the required libraries are installed:

  1. Open the todo.txt file.
  2. Follow the instructions to install the necessary libraries using pip or conda.

Running the Notebook

  1. Open the *.ipynb file in Jupyter Notebook or JupyterLab.
  2. Follow the instructions within the notebook to run different fine-tuning experiments with the 'bart-large-cnn' model.

Contents

  • *.ipynb: Jupyter Notebook with code for fine-tuning 'bart-large-cnn' using Transformers.
  • todo.txt: Instructions for installing required libraries.
  • Additional files for datasets and model configurations.
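For readers unfamiliar with how summarization data is fed to 'bart-large-cnn', the sketch below shows the usual tokenization step: the article becomes the encoder input and the reference summary becomes the labels. The article text and length limits here are placeholders, not taken from the repository's dataset files.

```python
# Sketch of preparing one summarization example for bart-large-cnn.
from transformers import AutoTokenizer

tokenizer = AutoTokenizer.from_pretrained("facebook/bart-large-cnn")

article = "The quick brown fox jumped over the lazy dog. " * 10  # placeholder text
summary = "A fox jumped over a dog."                             # placeholder target

# Encoder input: the article, truncated to the model's 1024-token limit.
inputs = tokenizer(article, max_length=1024, truncation=True, return_tensors="pt")

# Decoder targets: tokenize the summary via text_target so special tokens are handled.
labels = tokenizer(text_target=summary, max_length=128, truncation=True,
                   return_tensors="pt")
inputs["labels"] = labels["input_ids"]
```

A batch in this shape can be passed directly to the model's forward call, which then returns the cross-entropy loss against the label tokens.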

Acknowledgements
