from tensorflow import keras

inputs = keras.Input(shape=(784,), name="digits")
x = keras.layers.Dense(64, activation="relu", name="dense_1")(inputs)
x = keras.layers.Dense(64, activation="relu", name="dense_2")(x)
outputs = keras.layers.Dense(10, name="predictions")(x)
functional_model = keras.Model(inputs=inputs, outputs=outputs, name="3_layer_mlp")

inputs = keras.Input(shape=(784,), name="digits")
x = keras.layers.Dense(64, activation="relu", name="dense_1")(inputs)
x = keras.layers.Dense(64, activation="relu", name="dense_2")(x)
# Add a dropout layer, which does not contain any weights.
x = keras.layers.Dropout(0.5)(x)
outputs = keras.layers.Dense(10, name="predictions")(x)
functional_model_with_dropout = keras.Model(
    inputs=inputs, outputs=outputs, name="3_layer_mlp"
)
functional_model_with_dropout.set_weights(functional_model.get_weights())
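A quick way to confirm that the copy above worked is to compare the two weight lists element-wise. This is a minimal, self-contained sketch (the `build` helper is introduced here for illustration; `Dropout` holds no weights, so the two lists line up one-to-one):

```python
import numpy as np
from tensorflow import keras


def build(with_dropout):
    # Rebuild the 3-layer MLP from above, optionally with dropout.
    inputs = keras.Input(shape=(784,), name="digits")
    x = keras.layers.Dense(64, activation="relu", name="dense_1")(inputs)
    x = keras.layers.Dense(64, activation="relu", name="dense_2")(x)
    if with_dropout:
        x = keras.layers.Dropout(0.5)(x)
    outputs = keras.layers.Dense(10, name="predictions")(x)
    return keras.Model(inputs=inputs, outputs=outputs, name="3_layer_mlp")


src, dst = build(False), build(True)
dst.set_weights(src.get_weights())

# Every weight array should now be identical between the two models.
for a, b in zip(src.get_weights(), dst.get_weights()):
    np.testing.assert_array_equal(a, b)
```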
APIs for saving weights to disk & loading them back |
Weights can be saved to disk by calling model.save_weights in the following formats: |
TensorFlow Checkpoint |
HDF5 |
The default format for model.save_weights is TensorFlow checkpoint. There are two ways to specify the save format: |
save_format argument: Set the value to save_format="tf" or save_format="h5". |
path argument: If the path ends with .h5 or .hdf5, then the HDF5 format is used. Other suffixes will result in a TensorFlow checkpoint unless save_format is set. |
Weights can also be retrieved as in-memory numpy arrays with model.get_weights(). Each API has its pros and cons, which are detailed below.
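The two ways of choosing a format can be sketched as follows. This assumes the TF 2.x tf.keras saving API described above (note that in Keras 3 the save_format argument was removed and weight files must end in .weights.h5); the file names and the tiny model are illustrative only:

```python
import os
import tempfile

from tensorflow import keras

model = keras.Sequential(
    [keras.Input(shape=(4,)), keras.layers.Dense(2, name="out")]
)

tmp = tempfile.mkdtemp()

# 1) Explicit `save_format` argument.
model.save_weights(os.path.join(tmp, "ckpt"), save_format="tf")
model.save_weights(os.path.join(tmp, "weights_a.h5"), save_format="h5")

# 2) Format inferred from the path suffix: ".h5" selects HDF5.
model.save_weights(os.path.join(tmp, "weights_b.h5"))

# The in-memory alternative: a list of plain numpy arrays.
arrays = model.get_weights()
print([a.shape for a in arrays])  # kernel (4, 2) and bias (2,)
```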
TF Checkpoint format |
Example: |
# Runnable example
sequential_model = keras.Sequential(
    [
        keras.Input(shape=(784,), name="digits"),
        keras.layers.Dense(64, activation="relu", name="dense_1"),
        keras.layers.Dense(64, activation="relu", name="dense_2"),
        keras.layers.Dense(10, name="predictions"),
    ]
)
sequential_model.save_weights("ckpt")
load_status = sequential_model.load_weights("ckpt")

# `assert_consumed` can be used as validation that all variable values have been
# restored from the checkpoint. See `tf.train.Checkpoint.restore` for other
# methods in the Status object.
load_status.assert_consumed()
<tensorflow.python.training.tracking.util.CheckpointLoadStatus at 0x151f01150> |
Format details |
The TensorFlow Checkpoint format saves and restores the weights using object attribute names. For instance, consider the tf.keras.layers.Dense layer. The layer contains two weights: dense.kernel and dense.bias. When the layer is saved to the tf format, the resulting checkpoint contains the keys "kernel" and "bias" and their corresponding weight values. For more information see "Loading mechanics" in the TF Checkpoint guide. |
Note that the attribute/graph edge is named after the name used in the parent object, not the name of the variable. Consider the CustomLayer in the example below: the variable CustomLayer.var is saved with "var" as part of the key, not "var_a".
class CustomLayer(keras.layers.Layer):
    def __init__(self, a):
        super().__init__()
        self.var = tf.Variable(a, name="var_a")


layer = CustomLayer(5)
layer_ckpt = tf.train.Checkpoint(layer=layer).save("custom_layer")
ckpt_reader = tf.train.load_checkpoint(layer_ckpt)
ckpt_reader.get_variable_to_dtype_map()
{'save_counter/.ATTRIBUTES/VARIABLE_VALUE': tf.int64, |
'layer/var/.ATTRIBUTES/VARIABLE_VALUE': tf.int32, |
'_CHECKPOINTABLE_OBJECT_GRAPH': tf.string} |
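Because the checkpoint key is derived from the attribute name "var", restoring works into any object whose matching attribute is also called var, regardless of the underlying tf.Variable name. A minimal round-trip sketch (the temporary path and the fresh CustomLayer(0) instance are illustrative):

```python
import os
import tempfile

import tensorflow as tf
from tensorflow import keras


class CustomLayer(keras.layers.Layer):
    def __init__(self, a):
        super().__init__()
        self.var = tf.Variable(a, name="var_a")


# Save a layer whose variable holds 5.
prefix = os.path.join(tempfile.mkdtemp(), "custom_layer")
ckpt_path = tf.train.Checkpoint(layer=CustomLayer(5)).save(prefix)

# Restore into a fresh object initialized with 0; the checkpointed value 5
# is matched via the attribute name "var", not the variable name "var_a".
restored = CustomLayer(0)
tf.train.Checkpoint(layer=restored).restore(ckpt_path)
print(int(restored.var.numpy()))  # -> 5
```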
Transfer learning example |
Essentially, as long as two models share the same architecture, they can share the same checkpoint.
Example: |
inputs = keras.Input(shape=(784,), name="digits")
x = keras.layers.Dense(64, activation="relu", name="dense_1")(inputs)
x = keras.layers.Dense(64, activation="relu", name="dense_2")(x)
outputs = keras.layers.Dense(10, name="predictions")(x)
functional_model = keras.Model(inputs=inputs, outputs=outputs, name="3_layer_mlp")

# Extract a portion of the functional model defined in the Setup section.
# The following lines produce a new model that excludes the final output
# layer of the functional model.
pretrained = keras.Model(
    functional_model.inputs, functional_model.layers[-1].input, name="pretrained_model"
)

# Randomly assign "trained" weights.
for w in pretrained.weights:
    w.assign(tf.random.normal(w.shape))
pretrained.save_weights("pretrained_ckpt")
pretrained.summary()

# Assume this is a separate program where only 'pretrained_ckpt' exists.
# Create a new functional model with a different output dimension.
inputs = keras.Input(shape=(784,), name="digits")
x = keras.layers.Dense(64, activation="relu", name="dense_1")(inputs)
x = keras.layers.Dense(64, activation="relu", name="dense_2")(x)
outputs = keras.layers.Dense(5, name="predictions")(x)
model = keras.Model(inputs=inputs, outputs=outputs, name="new_model")

# Load the weights from pretrained_ckpt into model.