Successfully developed a French text summarization model using the MLSum dataset and a Seq2Seq architecture with an attention mechanism to generate concise, coherent summaries of long-form news articles.
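The repository's actual code is not shown here, so as a rough illustration of the architecture the description names, the sketch below implements a minimal GRU encoder, an additive (Bahdanau-style) attention module, and a single attentional decoding step in PyTorch. All hyperparameters (vocabulary size, embedding and hidden dimensions) are placeholders, not values taken from the repo.

```python
import torch
import torch.nn as nn

# Hypothetical hyperparameters for illustration; the repo's real values are unknown.
VOCAB_SIZE, EMB_DIM, HID_DIM = 1000, 64, 128

class Encoder(nn.Module):
    def __init__(self):
        super().__init__()
        self.embed = nn.Embedding(VOCAB_SIZE, EMB_DIM)
        self.gru = nn.GRU(EMB_DIM, HID_DIM, batch_first=True)

    def forward(self, src):
        # src: (batch, src_len) -> outputs: (batch, src_len, HID_DIM)
        return self.gru(self.embed(src))

class Attention(nn.Module):
    """Additive attention: score each encoder position against the decoder state."""
    def __init__(self):
        super().__init__()
        self.attn = nn.Linear(HID_DIM * 2, HID_DIM)
        self.v = nn.Linear(HID_DIM, 1, bias=False)

    def forward(self, hidden, enc_outputs):
        # hidden: (1, batch, HID_DIM); enc_outputs: (batch, src_len, HID_DIM)
        src_len = enc_outputs.size(1)
        h = hidden.permute(1, 0, 2).repeat(1, src_len, 1)   # (batch, src_len, HID_DIM)
        energy = torch.tanh(self.attn(torch.cat((h, enc_outputs), dim=2)))
        scores = self.v(energy).squeeze(2)                  # (batch, src_len)
        return torch.softmax(scores, dim=1)                 # attention weights

class Decoder(nn.Module):
    def __init__(self):
        super().__init__()
        self.embed = nn.Embedding(VOCAB_SIZE, EMB_DIM)
        self.attention = Attention()
        self.gru = nn.GRU(EMB_DIM + HID_DIM, HID_DIM, batch_first=True)
        self.out = nn.Linear(HID_DIM, VOCAB_SIZE)

    def forward(self, token, hidden, enc_outputs):
        # One decoding step: embed the previous token, attend over the source,
        # feed [embedding; context] to the GRU, project to vocabulary logits.
        emb = self.embed(token).unsqueeze(1)                     # (batch, 1, EMB_DIM)
        weights = self.attention(hidden, enc_outputs)            # (batch, src_len)
        context = torch.bmm(weights.unsqueeze(1), enc_outputs)   # (batch, 1, HID_DIM)
        output, hidden = self.gru(torch.cat((emb, context), dim=2), hidden)
        return self.out(output.squeeze(1)), hidden

# Smoke test: one decoding step over a random batch of token ids.
enc, dec = Encoder(), Decoder()
src = torch.randint(0, VOCAB_SIZE, (2, 10))   # batch of 2 "articles", 10 tokens each
enc_outputs, hidden = enc(src)
logits, hidden = dec(torch.zeros(2, dtype=torch.long), hidden, enc_outputs)
print(logits.shape)  # torch.Size([2, 1000])
```

At inference time this step would be looped, feeding each step's argmax (or beam-search choice) back in as the next `token` until an end-of-summary symbol is produced.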

SayamAlt/MLSum-French-Summarization-using-Seq2Seq-Attention

