TensorFlow Libraries and Extensions

In this article, we give an overview of some of the key extensions and libraries in TensorFlow 2.x (TF 2.x), including TF Datasets, TF Hub, XLA, model optimization, TensorBoard, TF Probability, Neural Structured Learning, TF Serving, TF Federated, TF Graphics, and MLIR.

TensorFlow Datasets

TensorFlow Datasets provides a collection of ready-to-use datasets that load as tf.data.Dataset objects. For example, to load MNIST and build an input pipeline:

import tensorflow as tf
import tensorflow_datasets as tfds

# Construct a tf.data.Dataset.
# The data is downloaded into ~/tensorflow_datasets/mnist.
ds = tfds.load('mnist', split='train', shuffle_files=True)

# Build the input pipeline.
ds = ds.shuffle(1024).batch(32).prefetch(tf.data.experimental.AUTOTUNE)
for example in ds.take(1):
    image, label = example["image"], example["label"]

assert image.shape == (32, 28, 28, 1)
assert label.shape == (32,)

TensorFlow Hub

TensorFlow Hub is a repository of reusable pretrained models. For example, to compute text embeddings with a pretrained model:

# pip install --upgrade tensorflow_hub
import tensorflow_hub as hub

# Load a pretrained text-embedding model from TF Hub.
model = hub.KerasLayer("https://tfhub.dev/google/nnlm-en-dim128/2")
embeddings = model(["The rain in Spain.", "falls",
                    "mainly", "In the plain!"])

assert embeddings.shape == [4, 128]

Model Optimization


Among the techniques in the TensorFlow Model Optimization Toolkit, weight clustering reduces a model's memory footprint: it first groups the weights of each layer into N clusters, then shares each cluster's centroid value among all the weights belonging to that cluster.
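
A minimal sketch with the TensorFlow Model Optimization Toolkit's clustering API; the base_model, the training data, and the 16-cluster setting are illustrative assumptions:

import tensorflow_model_optimization as tfmot

cluster_weights = tfmot.clustering.keras.cluster_weights
CentroidInitialization = tfmot.clustering.keras.CentroidInitialization

# Cluster each layer's weights into 16 clusters, initializing the
# centroids linearly between the layer's minimum and maximum weights.
clustering_params = {
    'number_of_clusters': 16,
    'cluster_centroids_init': CentroidInitialization.LINEAR,
}
clustered_model = cluster_weights(base_model, **clustering_params)

# Fine-tune, then strip the clustering wrappers before export.
clustered_model.compile(optimizer='adam',
                        loss='sparse_categorical_crossentropy',
                        metrics=['accuracy'])
clustered_model.fit(x_train, y_train, epochs=1)
final_model = tfmot.clustering.keras.strip_clustering(clustered_model)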

TensorBoard


In a TF application, we write summary information to log files, which the TensorBoard application then reads and visualizes.
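
For example, with Keras a TensorBoard callback writes the logs during training; the model, the data, and the "logs/" directory below are illustrative assumptions:

import tensorflow as tf

# A simple classifier on MNIST (illustrative).
(x_train, y_train), _ = tf.keras.datasets.mnist.load_data()
model = tf.keras.Sequential([
    tf.keras.layers.Flatten(input_shape=(28, 28)),
    tf.keras.layers.Dense(128, activation='relu'),
    tf.keras.layers.Dense(10)
])
model.compile(optimizer='adam',
              loss=tf.keras.losses.SparseCategoricalCrossentropy(from_logits=True),
              metrics=['accuracy'])

# Write training summaries to logs/ for TensorBoard to read.
tensorboard_cb = tf.keras.callbacks.TensorBoard(log_dir='logs/')
model.fit(x_train / 255.0, y_train, epochs=1, callbacks=[tensorboard_cb])

The UI is then launched with "tensorboard --logdir logs/".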

TensorFlow Probability (TFP)

The code below draws 100K samples from a normal distribution and uses them to generate 100K Bernoulli-distributed labels. It then fits a Bernoulli GLM to this data to recover the model coefficients.

import tensorflow as tf
import tensorflow_probability as tfp

# Generate a synthetic data set.
features = tfp.distributions.Normal(loc=0., scale=1.).sample(int(100e3))
labels = tfp.distributions.Bernoulli(logits=1.618 * features).sample()

# Specify the model.
model = tfp.glm.Bernoulli()

# Fit the model to the data.
coeffs, linear_response, is_converged, num_iter = tfp.glm.fit(
    model_matrix=features[:, tf.newaxis],
    response=tf.cast(labels, dtype=tf.float32),
    model=model)
# ==> coeffs is approximately [1.618] (We're golden!)

Neural Structured Learning (NSL)


For example, we can introduce a neighbor loss that penalizes the distance (D) between the embeddings of neighboring samples.
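
In symbols, one standard formulation (the embedding function h, the neighbor set N, and the weight α are notation assumed here, not from the original):

\mathcal{L} = \mathcal{L}_{\text{label}} + \alpha \sum_{(x_i, x_j) \in \mathcal{N}} D\big(h(x_i),\, h(x_j)\big)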

With the additional edge information, we can apply graph regularization to tasks such as document and sentiment classification.
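
A minimal sketch with NSL's graph regularization wrapper; base_model and train_dataset (whose examples carry neighbor features, e.g. packed with nsl.tools.pack_nbrs) are illustrative assumptions:

import neural_structured_learning as nsl
import tensorflow as tf

# Wrap a base model so that training also penalizes differences
# between each example's output and its neighbors' outputs.
graph_reg_config = nsl.configs.make_graph_reg_config(max_neighbors=2,
                                                     multiplier=0.1)
graph_reg_model = nsl.keras.GraphRegularization(base_model, graph_reg_config)

graph_reg_model.compile(
    optimizer='adam',
    loss=tf.keras.losses.SparseCategoricalCrossentropy(from_logits=True),
    metrics=['accuracy'])
graph_reg_model.fit(train_dataset, epochs=5)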

In computer vision, we can add noise to the pixels of an image to generate artificial neighbors, making the model robust against adversarial attacks. Here is how to build an adversarially regularized model on top of a deep learning model using NSL.

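A sketch following NSL's Keras API; the base model architecture, the multiplier and step-size values, and x_train/y_train are illustrative assumptions:

import neural_structured_learning as nsl
import tensorflow as tf

# A simple base classifier (illustrative; any Keras model works).
base_model = tf.keras.Sequential([
    tf.keras.Input(shape=(28, 28), name='feature'),
    tf.keras.layers.Flatten(),
    tf.keras.layers.Dense(128, activation='relu'),
    tf.keras.layers.Dense(10, activation='softmax')
])

# Wrap it with adversarial regularization: perturbed neighbors are
# generated on the fly and their loss is added to the label loss.
adv_config = nsl.configs.make_adv_reg_config(multiplier=0.2,
                                             adv_step_size=0.05)
adv_model = nsl.keras.AdversarialRegularization(base_model,
                                                label_keys=['label'],
                                                adv_config=adv_config)

adv_model.compile(optimizer='adam',
                  loss='sparse_categorical_crossentropy',
                  metrics=['accuracy'])
# NSL expects the features and labels packed into a single dict.
adv_model.fit({'feature': x_train, 'label': y_train},
              batch_size=32, epochs=5)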

TensorFlow Serving (TFS)


Below, the SavedModel “my_model” is served at port 8501.

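A sketch of the standard Docker workflow for TF Serving (the model path and the example input are illustrative assumptions):

# Serve the SavedModel over REST on port 8501.
docker run -p 8501:8501 \
  --mount type=bind,source=/path/to/my_model,target=/models/my_model \
  -e MODEL_NAME=my_model -t tensorflow/serving

# Query the model's predict API.
curl -d '{"instances": [[1.0, 2.0, 5.0]]}' \
  -X POST http://localhost:8501/v1/models/my_model:predict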

TensorFlow Federated (TFF)

In Federated Learning, client devices compute SGD updates on locally collected data. The model updates are collected and aggregated on a remote server, instead of sending the more sensitive user data to that server. Finally, the aggregated model is sent back to the clients.

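A sketch of Federated Averaging with TFF's simulation API; create_keras_model, input_spec, and federated_train_data are illustrative assumptions:

import tensorflow as tf
import tensorflow_federated as tff

def model_fn():
    # Any Keras model plus the element spec of the client datasets.
    return tff.learning.from_keras_model(
        create_keras_model(),
        input_spec=input_spec,
        loss=tf.keras.losses.SparseCategoricalCrossentropy())

# Build the Federated Averaging process: clients run local SGD and
# the server aggregates their model updates.
iterative_process = tff.learning.build_federated_averaging_process(
    model_fn,
    client_optimizer_fn=lambda: tf.keras.optimizers.SGD(learning_rate=0.02))

state = iterative_process.initialize()
for round_num in range(10):
    # federated_train_data: a list of tf.data.Datasets, one per client.
    state, metrics = iterative_process.next(state, federated_train_data)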

TensorFlow Graphics

For example, we can train a neural network model to decompose an image into the corresponding scene parameters, which can then be used to render the scene. Such a model is trained to minimize the reconstruction loss.


The following is a short example of manipulating an object's geometry with TensorFlow Graphics, one building block of rendering.

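A minimal sketch using TensorFlow Graphics' transformation ops to rotate an object's vertices (a full renderer would also model the camera and lighting); the vertices and angles are illustrative assumptions:

import numpy as np
import tensorflow as tf
from tensorflow_graphics.geometry.transformation import rotation_matrix_3d

# Three vertices of a triangle in 3D space (illustrative).
vertices = tf.constant([[0., 0., 0.],
                        [1., 0., 0.],
                        [0., 1., 0.]])

# Build a rotation matrix from Euler angles (radians) and apply it.
angles = tf.constant([0.0, 0.0, np.pi / 2.0])
rotation = rotation_matrix_3d.from_euler(angles)
rotated_vertices = rotation_matrix_3d.rotate(vertices, rotation)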

MLIR


XLA
