cldz quickdeploy

Deploy Notebooks

Syntax

cldz quickdeploy -n python PATH_TO_JUPYTER_NOTEBOOK {REQUIREMENTS_TXT}

The -n flag tells cldz that the deployment is of type notebook. Immediately after -n, specify the notebook kernel type. The following kernels are supported:

  • python
  • R (coming soon)
  • julia (coming soon)

If no flag is given, cldz assumes the first argument is a Python-based Jupyter notebook and the second argument is a requirements file.

Example

cldz quickdeploy -n python awesome-notebook.ipynb requirements.txt
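
Since python is the assumed kernel when no flag is given, the same deployment can also be written without -n:

cldz quickdeploy awesome-notebook.ipynb requirements.txt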

Async Endpoints

Notebook deployments support async serverless endpoints. This is useful because notebook execution is often a long-running operation. Async endpoints can be invoked with any HTTP client, such as curl. The caller can specify a callback URL (webhook) in the HTTP request under the header X-callback-url. The serverless function immediately replies with a 202 Accepted response code and continues to execute the notebook. Once notebook execution completes, the callback URL (webhook) is called with the result. In addition to this webhook, the output of the invocation can also be viewed under Invocation History on the Clouderizer console.
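
A minimal sketch of an async invocation with curl is shown below. YOUR_ENDPOINT_URL stands for your deployment's invocation URL, and the HTTP method and JSON payload are placeholders; only the X-callback-url header and the 202 behaviour come from the description above.

curl -X POST YOUR_ENDPOINT_URL \
  -H "Content-Type: application/json" \
  -H "X-callback-url: https://your-server.example.com/notebook-webhook" \
  -d '{"x": 1}'

The call returns immediately with 202; the notebook result is delivered later to the webhook and is also visible under Invocation History.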

Deploy Models

Syntax

cldz quickdeploy -m MODEL_TYPE PATH_TO_MODEL_FILE

The -m flag tells cldz that the deployment is of type model. Immediately after -m, specify the model type. The following model types are supported:

  • h2o.ai MOJO model
  • PMML
  • Python
  • ONNX

h2o / dai / pmml model deployment

cldz quickdeploy -m h2o PATH_TO_H2O_MODEL_FILE

The -m flag tells cldz that the deployment is of type model.

If -m is not specified, cldz assumes the deployment type is notebook.

h2o in the above example can be replaced with dai or pmml.
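
For example, deploying a dai model (PATH_TO_DAI_MODEL_FILE being the path to your dai model file):

cldz quickdeploy -m dai PATH_TO_DAI_MODEL_FILE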


python model deployment

Python models need your own predict code to be executed at scoring time, so the predict script is passed along with the model:

cldz quickdeploy -m python PATH_TO_PYTHON_MODEL --predict predict.py

PATH_TO_PYTHON_MODEL is usually the model file that gets loaded and scored in predict.py.
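
The exact entry point Clouderizer expects inside predict.py is not documented in this section, so the sketch below is illustrative only: the file name model.pkl and the predict(payload) function are hypothetical, but the overall shape (load the model once, expose a function that scores an incoming record) is typical of such scripts.

import pickle

# Load the model artifact that was deployed as PATH_TO_PYTHON_MODEL
# (model.pkl is a hypothetical file name -- use whatever you deployed).
with open("model.pkl", "rb") as f:
    model = pickle.load(f)

def predict(payload):
    # Hypothetical entry point: turn the incoming request into the
    # model's feature vector and return a JSON-serialisable result.
    features = [payload["feature_1"], payload["feature_2"]]
    return {"prediction": float(model.predict([features])[0])}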

Pre-processing and Post-processing scripts

Does your model require preprocess / postprocess scripts at the time of scoring?

Suppose you have a pmml model with both preprocess and postprocess files. The syntax is:

cldz quickdeploy -m pmml PATH_TO_PMML_FILE --preprocess preprocess.py --postprocess postprocess.py

VERY IMPORTANT NOTE: The preprocess and postprocess flags are optional. If a flag is misspelled, the model will still deploy, but without the misspelled argument.
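
The contract for these scripts is likewise not spelled out here, so the following is only an illustration: hypothetically, a preprocess script reshapes the raw request into the columns the model expects, and a postprocess script turns the raw score into the response you want to return (the function names are assumptions, not the documented interface).

def preprocess(payload):
    # Hypothetical: map raw request fields onto the model's input columns.
    return {"age": int(payload["age"]), "income": float(payload["income"])}

def postprocess(score):
    # Hypothetical: convert the raw model score into a friendlier response.
    return {"approved": score > 0.5, "score": float(score)}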

Infra Configuration Flag

The --infra flag tells cldz the type of infrastructure you want to run on.

Clouderizer currently supports 3 infra types:

  1. standard - 2GB Memory
  2. highmemory - 6GB Memory
  3. gpu - 16GB GPU Memory, 30GB System Memory

The standard infra type is selected by default. To start a project on gpu infra:

cldz start PROJECT_NAME --infra gpu

Note

Tensorflow 2.4+ is supported for GPU.

Image Configuration Flag

The --image flag tells cldz the type of base image you want to run on.

Clouderizer currently supports 3 image types:

  1. standard - python 3.8
  2. tensorflow - python 3.8 + Tensorflow 2.4
  3. torch - python 3.8 + Pytorch + Fastai + nbdev

The standard image type is selected by default. To use the torch image, for example:

cldz quickdeploy my_notebook.ipynb --image torch

Startup scripts

Your project might need a few additional system dependencies. We use Ubuntu, a Linux-based distro, for all project setup. In case the dependencies you need are missing, cldz quickdeploy lets you supply a dependency file with the --startup_scripts argument: a list of Linux shell commands that installs the needed dependencies.

Suppose you need the vim text editor in your project. Create a file project_shell_commands.txt with the following contents:
apt-get update
apt-get install -y vim

Passing this file at deploy time will install vim in your project environment:

cldz quickdeploy my_notebook.ipynb --startup_scripts project_shell_commands.txt

Privileged user

Installing some dependencies or running system commands within a notebook requires root user access. With cldz quickdeploy, you can pass --user root to get root user privileges in your environment:

cldz quickdeploy my_notebook.ipynb --user root