Welcome to my articles on Deep Learning, Reinforcement Learning, Meta-Learning, and Machine Learning. The purpose of this article is to index the series and articles I have written so far. If you want to reach me or have non-technical comments about the articles, please read the F.A.Q. first for the most common questions and contact methods.
NLP
In the two previous articles on GCN and GNN, we presented a design overview of Graph Neural Networks. In this final article, we focus on GNN applications. Since the basic GNN theory is already covered in those articles, we will not repeat it here. For the design details of each application, please refer to the original research papers.
Medical Diagnosis & Electronic Health Records Modeling
Medical ontology can be described by a graph. For example, the following diagram represents an ontology as a DAG (directed acyclic graph).
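To make this concrete, here is a toy sketch of encoding such a DAG as parent-to-child edges. The ontology slice below is made up for illustration only; it is not from any paper.

```python
# Toy sketch: a made-up slice of a medical ontology, encoded as a DAG
# that maps each concept to its child concepts. A node may have
# multiple parents, which is why a DAG (not a tree) is needed.
ontology_dag = {
    "Respiratory disease": ["Pneumonia", "Asthma"],
    "Infectious disease": ["Pneumonia"],
    "Pneumonia": ["Viral pneumonia", "Bacterial pneumonia"],
}

# Invert the edges so we can walk from a concept up to its ancestors.
parents = {}
for parent, children in ontology_dag.items():
    for child in children:
        parents.setdefault(child, []).append(parent)

def ancestors(node):
    """All concepts reachable from `node` by following parent edges."""
    found = set()
    stack = list(parents.get(node, []))
    while stack:
        p = stack.pop()
        if p not in found:
            found.add(p)
            stack.extend(parents.get(p, []))
    return found

print(ancestors("Viral pneumonia"))
# -> {'Pneumonia', 'Respiratory disease', 'Infectious disease'}
```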
In general, Graph Neural Network (GNN) refers to the general concept of applying neural networks (NNs) to graphs. In a previous article, we covered GCN, one of the popular approaches in GNN. But in some literature, GNN refers to a more specific approach in which the hidden state of a node depends on its last state and those of its neighbors. We can view this as message passing between a node and its neighbors. Mathematically, it has the form:
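A sketch of this update, following the classic GNN formulation of Scarselli et al. (the exact notation in the article may differ): $\mathbf{x}_v$ are the features of node $v$, $\mathbf{x}_{co[v]}$ the features of its incident edges, $ne[v]$ its neighbors, and $f$ a learned transition function.

```latex
% Hidden state of node v at step t, computed from the node's own features,
% its incident-edge features, and its neighbors' previous states and features.
\mathbf{h}_v^{(t)} = f\!\left(\mathbf{x}_v,\; \mathbf{x}_{co[v]},\;
                              \mathbf{h}_{ne[v]}^{(t-1)},\; \mathbf{x}_{ne[v]}\right)
```

The update is applied repeatedly, either until the states converge or for a fixed number of steps, which is the message-passing view described above.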
In this article, we cover the following TensorFlow generative models:
DCGAN is one popular design for GANs. It is composed of convolutional and transposed convolutional layers, without max pooling or fully connected layers. The figure below shows the network design for the generator. This example will be trained to generate MNIST digits. We start the article with this most basic GAN model.
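For reference, here is a minimal sketch of such a generator, modeled on the TensorFlow DCGAN tutorial; the layer sizes are illustrative.

```python
import tensorflow as tf
from tensorflow.keras import layers

def make_generator():
    """DCGAN generator: 100-d noise vector -> 28x28x1 MNIST-like image."""
    return tf.keras.Sequential([
        # Project and reshape the noise vector into a 7x7x256 feature map.
        layers.Dense(7 * 7 * 256, use_bias=False, input_shape=(100,)),
        layers.BatchNormalization(),
        layers.LeakyReLU(),
        layers.Reshape((7, 7, 256)),
        # Upsample with transposed convolutions: 7x7 -> 7x7 -> 14x14 -> 28x28.
        layers.Conv2DTranspose(128, 5, strides=1, padding="same", use_bias=False),
        layers.BatchNormalization(),
        layers.LeakyReLU(),
        layers.Conv2DTranspose(64, 5, strides=2, padding="same", use_bias=False),
        layers.BatchNormalization(),
        layers.LeakyReLU(),
        # tanh keeps pixel values in [-1, 1], matching the rescaled MNIST data.
        layers.Conv2DTranspose(1, 5, strides=2, padding="same",
                               use_bias=False, activation="tanh"),
    ])

generator = make_generator()
noise = tf.random.normal([1, 100])
fake_image = generator(noise, training=False)  # shape: (1, 28, 28, 1)
```

Note the absence of max pooling and fully connected layers after the initial projection: all upsampling is done by strided transposed convolutions, which is the defining trait of the DCGAN design.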
As part of the TensorFlow series, this article focuses on coding examples for BERT and Transformer. These examples are:
In this example, we use a pre-trained TensorFlow Hub model for BERT and an AdamW optimizer. Because most of the heavy lifting is done by the TF Hub model, we keep the explanation for this example brief.
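For reference, here is a minimal sketch of this setup in the style of the TensorFlow "Classify text with BERT" tutorial. The TF Hub handles are examples (any compatible preprocessor/encoder pair works), and the commented AdamW snippet assumes the TF Model Garden package is installed.

```python
import tensorflow as tf
import tensorflow_hub as hub
import tensorflow_text  # noqa: F401  (registers ops used by the preprocessing model)

# Example handles: a small BERT encoder and its matching preprocessing model.
preprocessor = hub.KerasLayer(
    "https://tfhub.dev/tensorflow/bert_en_uncased_preprocess/3")
encoder = hub.KerasLayer(
    "https://tfhub.dev/tensorflow/small_bert/bert_en_uncased_L-4_H-512_A-8/1",
    trainable=True)

text_input = tf.keras.layers.Input(shape=(), dtype=tf.string)
outputs = encoder(preprocessor(text_input))
# 'pooled_output' summarizes the whole review; attach a small classification head.
net = tf.keras.layers.Dropout(0.1)(outputs["pooled_output"])
net = tf.keras.layers.Dense(1)(net)  # one logit for positive vs. negative
model = tf.keras.Model(text_input, net)

# AdamW, e.g. via the TF Model Garden (assumed installed as `tf-models-official`):
# from official.nlp import optimization
# optimizer = optimization.create_optimizer(
#     init_lr=3e-5, num_train_steps=steps, num_warmup_steps=int(0.1 * steps),
#     optimizer_type="adamw")
```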
First, we download and prepare the IMDB files.
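A sketch of this step, following the TensorFlow text-classification tutorial:

```python
import os
import shutil
import tensorflow as tf

# Download and extract the IMDB reviews archive into the working directory.
url = "https://ai.stanford.edu/~amaas/data/sentiment/aclImdb_v1.tar.gz"
archive = tf.keras.utils.get_file(
    "aclImdb_v1.tar.gz", url, untar=True, cache_dir=".", cache_subdir="")
dataset_dir = os.path.join(os.path.dirname(archive), "aclImdb")

# Drop the unlabeled 'unsup' split so the folder layout matches what
# text_dataset_from_directory expects (one subfolder per class).
shutil.rmtree(os.path.join(dataset_dir, "train", "unsup"))
```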
In the last article, we presented TensorFlow coding examples on computer vision. Now, we focus on NLP classification and BERT. Here is the list of examples, tested with TF 2.4.0, released in Dec 2020. These examples originate from the TensorFlow tutorials.
This example performs sentiment analysis on IMDB…
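To make the setup concrete, here is a minimal sketch of such a classifier in the style of the TensorFlow text-classification tutorial, using the TF 2.4 experimental TextVectorization layer; the hyperparameters are illustrative, not the article's exact values.

```python
import tensorflow as tf
from tensorflow.keras import layers

# Illustrative hyperparameters: vocabulary size, sequence length, embedding width.
max_tokens, seq_len, embed_dim = 10000, 250, 16

# Map raw review strings to padded sequences of token ids.
vectorize = layers.experimental.preprocessing.TextVectorization(
    max_tokens=max_tokens, output_sequence_length=seq_len)
# vectorize.adapt(train_text)  # fit the vocabulary on the raw training reviews

model = tf.keras.Sequential([
    vectorize,
    layers.Embedding(max_tokens + 1, embed_dim),
    layers.Dropout(0.2),
    layers.GlobalAveragePooling1D(),  # average token embeddings per review
    layers.Dropout(0.2),
    layers.Dense(1),                  # one logit: positive vs. negative
])
model.compile(optimizer="adam",
              loss=tf.keras.losses.BinaryCrossentropy(from_logits=True),
              metrics=["accuracy"])
```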
Reading code is one effective way to become proficient in TensorFlow (TF). In this article, we reuse the examples in the TensorFlow tutorials, but we keep them condensed and remove code that exists only for tracing or demonstration purposes. We also keep our discussion to a minimum so you can browse through as many examples as possible to get the complete picture. If you have trouble following along, in particular after reading the first example, please read the earlier articles in this series first.
Yet, in most examples, we keep all the boilerplate code the examples require. So skip it if needed…
In this series, we cover major topics in deep learning coding with TensorFlow 2: computer vision, NLP, and generative models. At the end, we go through coding examples categorized by application type.
Deep Learning