Transformer#
Handle#
get_model_serving#
Connection.get_model_serving()
Get a reference to model serving to perform operations on. Model serving operates on top of a model registry, defaulting to the project's default model registry.
Example
import hopsworks
project = hopsworks.login()
ms = project.get_model_serving()
Returns
ModelServing: A model serving handle object to perform operations on.
Creation#
create_transformer#
ModelServing.create_transformer(script_file=None, resources=None)
Create a Transformer metadata object.
Example
# login into Hopsworks using hopsworks.login()

# get Dataset API instance
dataset_api = project.get_dataset_api()

# get Hopsworks Model Serving handle
ms = project.get_model_serving()

# create my_transformer.py Python script with the following contents
class Transformer(object):
    def __init__(self):
        ''' Initialization code goes here '''
        pass

    def preprocess(self, inputs):
        ''' Transform the request inputs here. The object returned by this method will be used as model input to make predictions. '''
        return inputs

    def postprocess(self, outputs):
        ''' Transform the predictions computed by the model before returning a response. '''
        return outputs

# upload the transformer script and build its absolute path within the project
import os
uploaded_file_path = dataset_api.upload("my_transformer.py", "Resources", overwrite=True)
transformer_script_path = os.path.join("/Projects", project.name, uploaded_file_path)

my_transformer = ms.create_transformer(script_file=transformer_script_path)

# or
from hsml.transformer import Transformer
my_transformer = Transformer(transformer_script_path)

Create a deployment with the transformer

# get a registered model to serve from the Model Registry (the model name is a placeholder)
mr = project.get_model_registry()
my_model = mr.get_model("my_model", version=1)

my_predictor = ms.create_predictor(my_model, transformer=my_transformer)
my_deployment = my_predictor.deploy()

# or
my_deployment = ms.create_deployment(my_predictor, transformer=my_transformer)
my_deployment.save()
Lazy
This method is lazy and does not persist any metadata or deploy any transformer. To create a deployment using this transformer, set it in the predictor.transformer property, as sketched below.
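For illustration, a minimal sketch of the property-based route, reusing my_model and my_transformer from the example above; that the transformer property is settable on an existing predictor is an assumption here:

# create the predictor first, then attach the lazy transformer via its property
# (assumes predictor.transformer has a setter; nothing is persisted until deploy)
my_predictor = ms.create_predictor(my_model)
my_predictor.transformer = my_transformer

my_deployment = my_predictor.deploy()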
Arguments
- script_file Optional[str]: Path to a custom transformer script implementing the Transformer class.
- resources Optional[Union[hsml.resources.PredictorResources, dict]]: Resources to be allocated for the transformer.
Returns
Transformer: The transformer metadata object.
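For the resources argument, a hedged sketch of allocating resources to a transformer is shown below; the field names used (num_instances, requests, limits, cores, memory, gpus) are assumptions, so check the Resources Reference for the exact schema.

# a minimal sketch; the field names below are assumptions, not the confirmed API
from hsml.resources import PredictorResources, Resources

transformer_resources = PredictorResources(
    num_instances=1,
    requests=Resources(cores=1, memory=1024, gpus=0),  # minimum resources
    limits=Resources(cores=2, memory=2048, gpus=0),    # maximum resources
)

my_transformer = ms.create_transformer(
    script_file=transformer_script_path,
    resources=transformer_resources,
)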
Retrieval#
predictor.transformer#
Transformers can be accessed from the predictor metadata objects.
predictor.transformer
Predictors can be found in the deployment metadata objects (see Predictor Reference). To retrieve a deployment, see the Deployment Reference.
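As a brief illustration, a hedged sketch of reaching a transformer starting from an existing deployment; the deployment name is a placeholder and exposing the transformer directly on the deployment object is an assumption:

# "mydeployment" is a hypothetical deployment name
my_deployment = ms.get_deployment("mydeployment")

# the transformer attached to the deployment's predictor (None if not set);
# direct attribute access here is an assumption, see the Deployment Reference
my_transformer = my_deployment.transformer
if my_transformer is not None:
    print(my_transformer.script_file)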
Properties#
inference_batcher#
Configuration of the inference batcher attached to the deployment component (i.e., predictor or transformer).
resources#
Resource configuration for the deployment component (i.e., predictor or transformer).
script_file#
Script file run by the deployment component (i.e., predictor or transformer).
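For illustration, a minimal sketch of reading these properties on a transformer metadata object, reusing my_transformer from the example above:

print(my_transformer.script_file)        # path to the transformer script
print(my_transformer.resources)          # resources allocated to the transformer
print(my_transformer.inference_batcher)  # inference batcher configuration, if any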
Methods#
describe#
Transformer.describe()
Print a description of the transformer.
to_dict#
Transformer.to_dict()
Return the transformer metadata as a dict (each deployment component type provides its own implementation).
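For illustration, a minimal usage sketch of both methods on the transformer object created earlier:

# print a human-readable description of the transformer
my_transformer.describe()

# get the transformer metadata as a Python dict
transformer_dict = my_transformer.to_dict()
print(transformer_dict)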