
Natural-Language-Generation

Code for natural language generation as described in this paper.

Abstract

This paper introduces two approaches to natural language generation: Markov Chains and Recurrent Neural Networks. A Markov Chain uses a probabilistic model of the relationships between words to estimate the probability of the next word, which can be applied to text generation. Recurrent Neural Networks are powerful sequence models that can remember and process sequential input, and they are widely used for text generation. This paper shows the differences between the two techniques through an experiment that uses the works of William Shakespeare to generate text in Shakespeare's writing style. The study shows that text generated by the Recurrent Neural Network is better than text generated by the Markov Chain.
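The Markov Chain approach described above can be sketched in a few lines of Python. This is a minimal illustration, not the repository's actual implementation: it builds a table mapping each word (or word tuple, for higher orders) to the words that follow it in the corpus, then samples the next word from that table. The function names and the tiny corpus are assumptions for the example.

```python
import random
from collections import defaultdict

def build_markov_chain(text, order=1):
    """Map each tuple of `order` consecutive words to the words
    that follow it in the corpus (duplicates preserve frequency)."""
    words = text.split()
    chain = defaultdict(list)
    for i in range(len(words) - order):
        key = tuple(words[i:i + order])
        chain[key].append(words[i + order])
    return chain

def generate(chain, length=20, seed=None):
    """Start from a random key and repeatedly sample the next word
    according to the observed transition frequencies."""
    rng = random.Random(seed)
    key = rng.choice(list(chain.keys()))
    out = list(key)
    for _ in range(length - len(out)):
        followers = chain.get(tuple(out[-len(key):]))
        if not followers:  # dead end: no observed successor
            break
        out.append(rng.choice(followers))
    return " ".join(out)

# Toy corpus stands in for the Shakespeare text used in the experiment.
corpus = "to be or not to be that is the question"
chain = build_markov_chain(corpus, order=1)
print(generate(chain, length=10, seed=42))
```

Because repeated followers are kept in the list, sampling uniformly from it reproduces the observed transition probabilities; raising `order` makes the output more coherent but more likely to copy the corpus verbatim.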

Prerequisites

  • python3.7
  • numpy
  • nltk

Usage

python train.py
