- Housing Prices (C1W1_Assignment.ipynb)
- Hello World Neural Network (C1_W1_Lab_1_hello_world_nn.ipynb)
- Handwriting Recognition (C1W2_Assignment.ipynb)
- Beyond Hello World, A Computer Vision Example (C1_W2_Lab_1_beyond_hello_world.ipynb)
- Callbacks (C1_W2_Lab_2_callbacks.ipynb)
- Improve MNIST with Convolutions (C1W3_Assignment.ipynb)
- Improving Accuracy with Convolutions (C1_W3_Lab_1_improving_accuracy_using_convolutions.ipynb)
- Exploring Convolutions (C1_W3_Lab_2_exploring_convolutions.ipynb)
- Handling Complex Images (C1W4_Assignment.ipynb)
- Image Generator (C1_W4_Lab_1_image_generator_no_validation.ipynb)
- Image Generator with Validation (C1_W4_Lab_2_image_generator_with_validation.ipynb)
- Compacted Images (C1_W4_Lab_3_compacted_images.ipynb)
- Cats vs. Dogs (C2W1_Assignment.ipynb)
- Using more sophisticated images with Convolutional Neural Networks (C2_W1_Lab_1_cats_vs_dogs.ipynb)
- Cats vs. Dogs using Augmentation (C2W2_Assignment.ipynb)
- Cats vs. Dogs with Augmentation (C2_W2_Lab_1_cats_v_dogs_augmentation.ipynb)
- Horses vs. Humans with Augmentation (C2_W2_Lab_2_horses_v_humans_augmentation.ipynb)
- Horses vs. Humans using Transfer Learning (C2W3_Assignment.ipynb)
- Exploring Transfer Learning (C2_W3_Lab_1_transfer_learning.ipynb)
- Multi-class Classifier (C2W4_Assignment.ipynb)
- Classifying Rock, Paper, and Scissors (C2_W4_Lab_1_multi_class_classifier.ipynb)
- Explore the BBC News Archive (C3W1_Assignment.ipynb)
- Simple Tokenizing (C3_W1_Lab_1_tokenize_basic.ipynb)
- Simple Sequences (C3_W1_Lab_2_sequences_basic.ipynb)
- Sarcasm (C3_W1_Lab_3_sarcasm.ipynb)
- Lab 1 - Covers tokenization: converting text into numbers so the machine can process it.
- Lab 2 -
- From tokenizing text to sequences: each word gets an index in word_index, and that mapping converts the raw text into sequences of indices.
- Pad the sequences to a uniform length, because that is what the model expects.
- Covers OOV: a word that does not appear in word_index is mapped to the OOV (Out of Vocabulary) token instead of being dropped.
- Lab 3 - Tokenizing real-world data from the sarcasm dataset, loaded with json (a combined sketch of these labs follows).
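A minimal sketch of the tokenize -> sequence -> pad flow covered by these labs, using tf.keras.preprocessing.text.Tokenizer; the example sentences and parameter values are illustrative:

```python
from tensorflow.keras.preprocessing.text import Tokenizer
from tensorflow.keras.preprocessing.sequence import pad_sequences

sentences = [
    'I love my dog',
    'I love my cat',
    'Do you think my dog is amazing?',
]

# Words outside the fitted vocabulary are mapped to the <OOV> token.
tokenizer = Tokenizer(num_words=100, oov_token='<OOV>')
tokenizer.fit_on_texts(sentences)
word_index = tokenizer.word_index  # e.g. {'<OOV>': 1, 'my': 2, ...}

# Convert each sentence into a sequence of word indices.
sequences = tokenizer.texts_to_sequences(sentences)

# Pad to a uniform length, because that is what the model expects.
padded = pad_sequences(sequences, maxlen=10, padding='post')

# Unseen words in new text all become the <OOV> index.
test_seq = tokenizer.texts_to_sequences(['my dog loves my manatee'])
```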
- Assignment :
- Implements tokenization together with a remove_stopwords step.
- Filters the words and outputs the text with all of the stopwords excluded (a sketch follows).
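A rough sketch of the stopword-filtering idea; the remove_stopwords helper and the tiny stopword list here are illustrative, not the assignment's exact code:

```python
# Illustrative stopword list; the assignment uses a much longer one.
STOPWORDS = {'a', 'an', 'the', 'and', 'is', 'are', 'in', 'of', 'to'}

def remove_stopwords(sentence):
    """Lowercase the sentence and drop every word found in STOPWORDS."""
    words = sentence.lower().split()
    filtered = [w for w in words if w not in STOPWORDS]
    return ' '.join(filtered)

print(remove_stopwords('The cat is in the hat'))  # -> 'cat hat'
```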
- Categorizing the BBC News Archive (C3W2_Assignment.ipynb)
- Positive or Negative IMDB Reviews (C3_W2_Lab_1_imdb.ipynb)
- Sarcasm Classifier (C3_W2_Lab_2_sarcasm_classifier.ipynb)
- IMDB Review Subwords (C3_W2_Lab_3_imdb_subwords.ipynb)
- Lab 1 - Trains a binary classifier on the IMDB dataset - introduces word embeddings and compiles a model built around an Embedding layer.
- Lab 2 - Trains a binary classifier on the sarcasm dataset.
- The dataset is tokenized from scratch, treating the vocabulary size as a hyperparameter.
- The texts are tokenized by building a vocabulary of full words (see the classifier sketch below).
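A minimal sketch of the binary-classifier shape used in both labs: an Embedding layer feeding a pooling layer and a dense head. vocab_size is the hyperparameter mentioned above, and all values here are illustrative:

```python
import tensorflow as tf

vocab_size = 10000    # treated as a hyperparameter
embedding_dim = 16

model = tf.keras.Sequential([
    tf.keras.layers.Embedding(vocab_size, embedding_dim),
    tf.keras.layers.GlobalAveragePooling1D(),            # average over the sequence
    tf.keras.layers.Dense(24, activation='relu'),
    tf.keras.layers.Dense(1, activation='sigmoid'),      # binary output
])
model.compile(loss='binary_crossentropy', optimizer='adam', metrics=['accuracy'])
```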
- Lab 3 - Subword tokenization.
- Subword text encoding can be a robust technique for avoiding out-of-vocabulary tokens, because unseen words are broken into known subword pieces (sketched below).
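A hedged sketch using the (now deprecated) SubwordTextEncoder from tensorflow_datasets, the encoder family behind the imdb_reviews/subwords8k dataset used in these labs; the corpus here is a stand-in:

```python
import tensorflow_datasets as tfds

corpus = ['TensorFlow is great for deep learning'] * 100  # stand-in corpus

# Build a subword vocabulary from the corpus.
encoder = tfds.deprecated.text.SubwordTextEncoder.build_from_corpus(
    corpus, target_vocab_size=1000)

# 'flawless' was never seen, but it still encodes via subword pieces,
# so there is no out-of-vocabulary token to fall back on.
ids = encoder.encode('TensorFlow is flawless')
print(encoder.decode(ids))  # round-trips back to the original text
```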
- Assignment :
- Struggled with the validation split (a minimal split sketch follows).
- Implemented a neural network capable of classifying text, and learned about embeddings and tokenization along the way.
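The validation split is plain list slicing done before tokenization; a minimal sketch, with dummy data standing in for the real sentences and labels:

```python
# Dummy data standing in for the assignment's sentences and labels.
sentences = [f'example sentence {i}' for i in range(10)]
labels = [i % 2 for i in range(10)]

training_split = 0.8
train_size = int(len(sentences) * training_split)

# Slice both lists at the same point so sentences and labels stay aligned.
train_sentences, train_labels = sentences[:train_size], labels[:train_size]
val_sentences, val_labels = sentences[train_size:], labels[train_size:]
```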
- Exploring Overfitting in NLP (C3W3_Assignment.ipynb)
- IMDB Subwords 8K with Single Layer LSTM (C3_W3_Lab_1_single_layer_LSTM.ipynb)
- IMDB Subwords 8K with Multi Layer LSTM (C3_W3_Lab_2_multiple_layer_LSTM.ipynb)
- IMDB Subwords 8K with 1D Convolutional Layer (C3_W3_Lab_3_Conv1D.ipynb)
- IMDB Reviews with GRU (and optional LSTM and Conv1D) (C3_W3_Lab_4_imdb_reviews_with_GRU_LSTM_Conv1D.ipynb)
- Sarcasm with Bidirectional LSTM (C3_W3_Lab_5_sarcasm_with_bi_LSTM.ipynb)
- Sarcasm with 1D Convolutional Layer (C3_W3_Lab_6_sarcasm_with_1D_convolutional.ipynb)
Models / architectures that can be used for text classification:
Embedding - LSTM - GRU / RNN - 1D Convolutional Network
Lab 1 - IMDB Subwords 8K with Single Layer LSTM
- Swap the Flatten or global pooling layer for an LSTM layer, optionally wrapped in Bidirectional (sketched below).
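A minimal sketch of that swap; the vocabulary and layer sizes are illustrative:

```python
import tensorflow as tf

model = tf.keras.Sequential([
    tf.keras.layers.Embedding(8000, 64),
    # Bidirectional LSTM replaces the Flatten / global pooling layer.
    tf.keras.layers.Bidirectional(tf.keras.layers.LSTM(64)),
    tf.keras.layers.Dense(64, activation='relu'),
    tf.keras.layers.Dense(1, activation='sigmoid'),
])
model.compile(loss='binary_crossentropy', optimizer='adam', metrics=['accuracy'])
```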
Lab 2 - IMDB Subwords 8K with Multiple Layer LSTM
- The additional LSTM layer lengthens training compared with the previous lab; every LSTM except the last needs return_sequences=True so the next layer receives a sequence (sketched below).
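A minimal stacked-LSTM sketch; the key detail is return_sequences=True on every recurrent layer except the last (sizes illustrative):

```python
import tensorflow as tf

model = tf.keras.Sequential([
    tf.keras.layers.Embedding(8000, 64),
    # return_sequences=True so the next LSTM receives a sequence, not a vector.
    tf.keras.layers.Bidirectional(tf.keras.layers.LSTM(64, return_sequences=True)),
    tf.keras.layers.Bidirectional(tf.keras.layers.LSTM(32)),
    tf.keras.layers.Dense(64, activation='relu'),
    tf.keras.layers.Dense(1, activation='sigmoid'),
])
```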
Lab 3 - IMDB Subwords 8K with 1D Convolutional Layer
- Build the model by simply appending a Conv1D and pooling layer after the embedding layer (sketched below).
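A minimal sketch of that pattern; whether the lab uses max or average pooling, the model shape is the same (filter count and kernel size are illustrative):

```python
import tensorflow as tf

model = tf.keras.Sequential([
    tf.keras.layers.Embedding(8000, 64),
    tf.keras.layers.Conv1D(128, 5, activation='relu'),  # learn local n-gram features
    tf.keras.layers.GlobalMaxPooling1D(),               # collapse to one feature vector
    tf.keras.layers.Dense(64, activation='relu'),
    tf.keras.layers.Dense(1, activation='sigmoid'),
])
```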
Lab 4 - Building Models for the IMDB Dataset
- Build with Flatten() -> very fast to train.
- Build with LSTM -> slower to train, but useful in applications where the order of the tokens is important.
- Build with GRU -> a simpler version of the LSTM; use it when the sequence is important but you want faster training and can sacrifice some accuracy (sketched below).
- Build with Conv1D -> extracts local features from the dataset.
- Each layer has a different function; GlobalMaxPooling1D reduces the results before passing them on to the dense layers.
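Of the four Lab 4 variants, the GRU one might look like the sketch below; the Flatten, LSTM, and Conv1D variants just swap the middle layer (vocabulary and layer sizes are illustrative):

```python
import tensorflow as tf

# Sketch of the GRU variant; replace the Bidirectional line with Flatten(),
# an LSTM, or Conv1D + GlobalMaxPooling1D to get the other Lab 4 models.
model = tf.keras.Sequential([
    tf.keras.layers.Embedding(8000, 64),
    tf.keras.layers.Bidirectional(tf.keras.layers.GRU(32)),  # lighter than LSTM
    tf.keras.layers.Dense(64, activation='relu'),
    tf.keras.layers.Dense(1, activation='sigmoid'),
])
model.compile(loss='binary_crossentropy', optimizer='adam', metrics=['accuracy'])
```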
Lab 5 - Training a Sarcasm Detection Model using Bidirectional LSTMs
Lab 6 - Training a Sarcasm Detection Model using a Convolution Layer
- Writing Shakespeare with LSTMs (C3W4_Assignment.ipynb)
- NLP with Irish Music (C3_W4_Lab_1.ipynb)
- Generating Poetry from Irish Lyrics (C3_W4_Lab_2_irish_lyrics.ipynb)
- Create and Predict Synthetic Data (C4W1_Assignment.ipynb)
- Time Series (C4_W1_Lab_1_time_series.ipynb)
- Forecasting (C4_W1_Lab_2_forecasting.ipynb)
- Predict with a DNN (C4W2_Assignment.ipynb)
- Preparing Features and Labels (C4_W2_Lab_1_features_and_labels.ipynb)
- Single Layer Neural Network (C4_W2_Lab_2_single_layer_NN.ipynb)
- Deep Neural Network (C4_W2_Lab_3_deep_NN.ipynb)
- Using RNNs and LSTMs for time series (C4W3_Assignment.ipynb)
- Recurrent Neural Network (RNN) (C4_W3_Lab_1_RNN.ipynb)
- Long Short-Term Memory (LSTM) (C4_W3_Lab_2_LSTM.ipynb)
- Daily Minimum Temperatures in Melbourne - Real Life Data (C4W4_Assignment.ipynb)
- Long Short-Term Memory (LSTM) (C4_W4_Lab_1_LSTM.ipynb)
- Sunspots (C4_W4_Lab_2_Sunspots.ipynb)
- Sunspots - DNN Only (C4_W4_Lab_3_DNN_only.ipynb)