Model Path, Parameters, & Metadata

For an experiment to output a model in Gradient, the resulting model files need to be written to the specified or default model path.

PS_MODEL_PATH

Using the PS_MODEL_PATH environment variable is the easiest way to make sure you are outputting to the correct location.

import os

model_dir = os.path.abspath(os.environ.get('PS_MODEL_PATH'))

# You can also use gradient_sdk:
from gradient_sdk.utils import model_dir
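As a minimal sketch of using `PS_MODEL_PATH`, the snippet below resolves the model directory and writes a file into it. The temp-directory fallback and the dummy checkpoint file are only assumptions so the sketch runs outside a Gradient container, where `PS_MODEL_PATH` would not be set.

```python
import os
import tempfile

# PS_MODEL_PATH is set inside Gradient experiment containers; the
# temp-directory fallback below is only for running this sketch locally.
model_dir = os.path.abspath(
    os.environ.get('PS_MODEL_PATH',
                   os.path.join(tempfile.gettempdir(), 'model')))
os.makedirs(model_dir, exist_ok=True)

# Any file written under model_dir becomes part of the experiment's
# model output.
checkpoint_path = os.path.join(model_dir, 'model.ckpt')
with open(checkpoint_path, 'w') as f:
    f.write('dummy checkpoint contents')
```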

Default paths

Single-node: --modelPath /artifacts is currently the default for single-node experiments. Output your model files there so that they appear in your Model Repository and can be deployed using Deployments.

Multi-node: The default model path for multi-node experiments is /storage/models/<experiment_id>/

Model Parameters

To store a Model in the Models list, add the following Model-specific parameters to the Experiment command when running an Experiment.

Model Type

--modelType defines the type of model that is being generated by the experiment. For example, --modelType Tensorflow will ensure that the model checkpoint files being generated are recognized as TensorFlow model files.

| Model Type Value | Description |
| --- | --- |
| `"Tensorflow"` | TensorFlow-compatible model outputs |
| `"ONNX"` | ONNX model outputs |
| `"Custom"` | Custom model type (e.g., a simple Flask server) |

Model Path

--modelPath defines where in the context of an Experiment the Model checkpoint files will be stored. This is a key argument that enables the evaluation and persistence of the generated model files.

One option is to set --modelPath "/artifacts", which keeps the checkpoint files only for the lifetime of that Experiment. Another option is to set --modelPath "/storage/models", which gives you permanent access to the generated model files in your Paperspace storage.
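The trade-off above can be captured in a small sketch; `resolve_model_path` is a hypothetical helper, not part of the Gradient SDK:

```python
def resolve_model_path(persist: bool) -> str:
    """Pick a model path for an Experiment.

    /artifacts keeps checkpoint files only for the lifetime of the
    Experiment; /storage/models persists them in Paperspace storage.
    """
    return '/storage/models' if persist else '/artifacts'
```

The returned value is what you would pass to the Experiment command via --modelPath.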

Enabling Support for Models with GradientCI

You can also specify the model path and model type parameters when running experiments with GradientCI. See GradientCI Models for more info.

Custom model metadata

When modelType is not specified, you can associate custom metadata with the model for later reference by creating a gradient-model-metadata.json file in the modelPath directory. Any valid JSON data can be stored in this file.

For models of type Tensorflow, metadata is automatically generated for your experiment, so any custom model metadata will be ignored.

An example of custom model metadata JSON is as follows:

{
  "metrics": [
    {
      "name": "accuracy-score",
      "numberValue": 60
    }
  ]
}
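A short sketch of writing that metadata from inside an experiment follows. It assumes `PS_MODEL_PATH` points at the modelPath directory; the temp-directory fallback is only so the sketch runs outside Gradient.

```python
import json
import os
import tempfile

# Inside a Gradient experiment, PS_MODEL_PATH is the model path; the
# temp directory is an assumed fallback for local runs.
model_dir = os.environ.get('PS_MODEL_PATH', tempfile.mkdtemp())

metadata = {
    "metrics": [
        {"name": "accuracy-score", "numberValue": 60}
    ]
}

# gradient-model-metadata.json must live in the modelPath directory.
metadata_path = os.path.join(model_dir, 'gradient-model-metadata.json')
with open(metadata_path, 'w') as f:
    json.dump(metadata, f, indent=2)
```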