
Welcome to my articles on Deep Learning, Reinforcement Learning, Meta-Learning, and Machine Learning. The purpose of this article is to index the series and articles I have written so far. If you want to reach me or have non-technical comments about the articles, please read the F.A.Q. first for the most common questions and contact methods.

Series

Not part of a series yet

NLP


In the two previous articles on GCNs and GNNs, we presented a design overview for Graph Neural Networks. In this final article, we will focus on GNN applications. Since the basic GNN theory is already covered by those articles, we will not repeat it here. For the design details of each application, please refer to the original research papers.

Medical Diagnosis & Electronic Health Records Modeling

A medical ontology can be described by a graph. For example, the following diagram represents the ontology as a DAG (directed acyclic graph).
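To make the structure concrete, here is a minimal sketch of such an ontology DAG with made-up concept names; EHR models such as GRAM build on exactly this kind of ancestor lookup when embedding medical codes.

# A tiny medical-ontology DAG (hypothetical concept names).
# Each key is a concept; its values are the more specific child concepts.
ontology = {
    "Cardiovascular disease": ["Heart disease", "Vascular disease"],
    "Heart disease": ["Arrhythmia", "Heart failure"],
    "Vascular disease": ["Hypertension"],
}

def ancestors(node, parents):
    """Walk the DAG upward to collect every ancestor of a code."""
    found = set()
    for parent, children in parents.items():
        if node in children:
            found.add(parent)
            found |= ancestors(parent, parents)
    return found

print(ancestors("Arrhythmia", ontology))
# {'Heart disease', 'Cardiovascular disease'}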


In general, Graph Neural Networks (GNNs) refer to the general concept of applying neural networks (NNs) to graphs. In a previous article, we covered GCN, which is one of the popular approaches in GNN. But in some literature, GNN refers to a more specific approach in which the hidden state of a node depends on its last state and on its neighbors. We can view this as message passing between a node and its neighboring nodes. Mathematically, it has the form:

$$\mathbf{h}_v^{(t)} = f\Big(\mathbf{x}_v,\ \big\{\mathbf{h}_u^{(t-1)} : u \in \mathcal{N}(v)\big\}\Big)$$

where $\mathbf{x}_v$ is the input feature of node $v$, $\mathcal{N}(v)$ is its set of neighbors, and $f$ is a learned update function.
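To make the update concrete, here is a minimal NumPy sketch of one message-passing step with mean aggregation; the aggregation and update choices are illustrative, not those of any specific paper.

import numpy as np

# Toy 3-node graph: node 0 is connected to nodes 1 and 2.
A = np.array([[0, 1, 1],
              [1, 0, 0],
              [1, 0, 0]], dtype=np.float32)   # adjacency matrix
X = np.random.randn(3, 4).astype(np.float32)  # node input features x_v
H = np.random.randn(3, 4).astype(np.float32)  # hidden states h_v^(t-1)
W = np.random.randn(12, 4).astype(np.float32) # learned weights (random here)

deg = A.sum(axis=1, keepdims=True)            # node degrees
messages = (A @ H) / deg                      # mean of neighbor hidden states
H_next = np.tanh(np.concatenate([X, H, messages], axis=1) @ W)  # h_v^(t)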
You know, who you choose to be around you lets you know who you are. — The Fast and the Furious: Tokyo Drift

What is the challenge?

In social networks, friend connections can be represented as a social graph.


In this article, we cover the following TensorFlow generative models:

  • DCGAN
  • CycleGAN
  • Pix2Pix
  • Neural style transfer
  • Adversarial FGSM
  • Autoencoder
  • Autoencoder Denoising
  • Autoencoder Anomaly detection
  • Convolutional Variational Autoencoder (VAE)
  • DeepDream

DCGAN

DCGAN is one popular design for GANs. It is composed of convolutional and transposed convolutional layers, without max pooling or fully connected layers. The figure below shows the network design for the generator. This example will be trained to generate MNIST digits. We start the article with this most basic GAN model.
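As a reference, here is a condensed sketch of the generator, close to the one in the TensorFlow DCGAN tutorial; treat the filter sizes and counts as illustrative.

import tensorflow as tf
from tensorflow.keras import layers

def make_generator():
    # Project 100-d noise to a 7x7x256 tensor, then upsample to 28x28x1.
    return tf.keras.Sequential([
        layers.Dense(7 * 7 * 256, use_bias=False, input_shape=(100,)),
        layers.BatchNormalization(),
        layers.LeakyReLU(),
        layers.Reshape((7, 7, 256)),
        layers.Conv2DTranspose(128, 5, strides=1, padding="same", use_bias=False),
        layers.BatchNormalization(),
        layers.LeakyReLU(),
        layers.Conv2DTranspose(64, 5, strides=2, padding="same", use_bias=False),  # 14x14
        layers.BatchNormalization(),
        layers.LeakyReLU(),
        layers.Conv2DTranspose(1, 5, strides=2, padding="same", use_bias=False,
                               activation="tanh"),                                 # 28x28x1
    ])

The tanh output keeps pixel values in [-1, 1], matching MNIST images scaled to that range for training.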


As part of the TensorFlow series, this article focuses on coding examples for BERT and the Transformer. The examples are:

  • IMDB files: Sentiment analysis with a pre-trained TF Hub BERT model and AdamW,
  • GLUE/MRPC BERT fine-tuning,
  • Transformer for language translation.

IMDB files: Sentiment analysis with a pre-trained TF Hub BERT model and AdamW

In this example, we use a pre-trained TensorFlow Hub model for BERT and an AdamW optimizer. Because the TF Hub model does most of the heavy lifting, we will keep the explanation for this example simple.
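Before the step-by-step walkthrough, here is a hedged sketch of where we will end up. The TF Hub handles are one compatible preprocessor/encoder pair from tfhub.dev, the AdamW optimizer comes from the tf-models-official package, and the hyperparameters are illustrative.

import tensorflow as tf
import tensorflow_hub as hub
import tensorflow_text  # registers the ops the BERT preprocessor needs
from official.nlp import optimization  # pip install tf-models-official

preprocess = hub.KerasLayer(
    "https://tfhub.dev/tensorflow/bert_en_uncased_preprocess/3")
encoder = hub.KerasLayer(
    "https://tfhub.dev/tensorflow/small_bert/bert_en_uncased_L-4_H-512_A-8/1",
    trainable=True)

text_input = tf.keras.layers.Input(shape=(), dtype=tf.string)
outputs = encoder(preprocess(text_input))
x = tf.keras.layers.Dropout(0.1)(outputs["pooled_output"])
logits = tf.keras.layers.Dense(1)(x)  # one logit: positive vs. negative review
model = tf.keras.Model(text_input, logits)

steps = 5 * 625  # epochs x batches per epoch, illustrative
optimizer = optimization.create_optimizer(
    init_lr=3e-5, num_train_steps=steps,
    num_warmup_steps=int(0.1 * steps), optimizer_type="adamw")
model.compile(optimizer=optimizer,
              loss=tf.keras.losses.BinaryCrossentropy(from_logits=True),
              metrics=["accuracy"])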

First, we download and prepare IMDB files.
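A minimal sketch of this step, following the TensorFlow tutorial's setup (the cache paths and the 80/20 validation split are choices, not requirements):

import os
import shutil
import tensorflow as tf

url = "https://ai.stanford.edu/~amaas/data/sentiment/aclImdb_v1.tar.gz"
archive = tf.keras.utils.get_file(
    "aclImdb_v1.tar.gz", url, untar=True, cache_dir=".", cache_subdir="")

train_dir = os.path.join(os.path.dirname(archive), "aclImdb", "train")
shutil.rmtree(os.path.join(train_dir, "unsup"))  # drop the unlabeled reviews

train_ds = tf.keras.preprocessing.text_dataset_from_directory(
    train_dir, batch_size=32, validation_split=0.2,
    subset="training", seed=42)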


Sequence-to-sequence models are particularly popular in NLP. This article, as part of the TensorFlow series, covers examples of the sequence-to-sequence model.


In the last article, we presented TensorFlow coding examples on computer vision. Now we will focus on NLP classification and BERT. Here is the list of examples, tested with TF 2.4.0 (released in Dec 2020). These examples originate from the TensorFlow tutorials.

  • IMDB files: Sentiment analysis with dataset mapping & Embedding,
  • IMDB files: TensorBoard & sentiment analysis with Embedding & a data preprocessing layer,
  • IMDB TF Dataset: Sentiment analysis with pre-trained TF Hub Embedding,
  • IMDB TF Dataset: Sentiment analysis with a bi-directional LSTM,
  • Iliad files classification using data processing with tf.text & TextLineDataset.

IMDB files: Sentiment analysis with dataset mapping & Embedding

This example performs sentiment analysis on IMDB…
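A minimal sketch of the pattern, with illustrative vocabulary and sequence sizes (in TF 2.4, TextVectorization lives under experimental.preprocessing; it moved to tf.keras.layers in later releases):

import tensorflow as tf

vectorize = tf.keras.layers.experimental.preprocessing.TextVectorization(
    max_tokens=10000, output_sequence_length=250)
# vectorize.adapt(raw_train_text)                              # build the vocabulary
# train_ds = raw_train_ds.map(lambda x, y: (vectorize(x), y))  # dataset mapping

model = tf.keras.Sequential([
    tf.keras.layers.Embedding(10000, 16),  # learn a 16-d vector per token
    tf.keras.layers.GlobalAveragePooling1D(),
    tf.keras.layers.Dense(1),              # single logit for the review label
])
model.compile(optimizer="adam",
              loss=tf.keras.losses.BinaryCrossentropy(from_logits=True),
              metrics=["accuracy"])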


Reading code is one effective way to become proficient in TensorFlow (TF). In this article, we reuse the examples in the TensorFlow tutorials, but we keep them condensed and remove code that exists only for tracing or demonstration purposes. We also keep our discussion to a minimum so you can browse through as many examples as possible to get a complete picture. If you have trouble following, in particular after reading the first example, please read the articles in this series first.

Yet, in most examples, we keep all the boilerplate code the examples require. So skip it if needed…


In this series, we cover major topics in deep learning coding with TensorFlow 2: computer vision, NLP, and generative models. At the end, we go through coding examples categorized by application type.

Jonathan Hui

