Class-based functions define `__setup__()` and `__predict__()` methods. The `__setup__()` method is run once during initialization and is typically used for heavy one-time operations, such as loading machine learning model weights into memory. The `__predict__()` method behaves like a standard function, with a single entry point and exit point.
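This lifecycle can be sketched in plain Python. The `Transcriber` class and its fake model below are illustrative stand-ins, not part of the Sieve SDK; on Sieve, the platform invokes these methods for you:

```python
class Transcriber:
    """Plain-Python stand-in for a class-based Sieve function."""

    def __init__(self):
        # On Sieve, __setup__ runs once per container, at initialization.
        self.__setup__()

    def __setup__(self):
        # Heavy one-time work, e.g. loading model weights into memory.
        self.model = lambda audio: f"transcript of {audio}"

    def __predict__(self, audio: str) -> str:
        # Called once per request: single entry point, single exit point.
        return self.model(audio)
```

Each call to `__predict__()` then reuses the state built once in `__setup__()`.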
All code is mounted by default into `/src` of the container.
Functions are deployed to Sieve by adding the `@sieve.function` decorator, which registers the decorated function (here, `referenced_function`) on the Sieve cloud.
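A minimal sketch of the decorator in use (the function name `add-one` and its body are illustrative assumptions, not from the Sieve docs):

```python
import sieve

@sieve.function(name="add-one")
def add_one(x: int) -> int:
    return x + 1
```

Once deployed, the function can be called remotely with `.run()` or `.push()`.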
The `.run()` method blocks and waits for the function to complete, then directly returns the value returned by the function.
The `.push()` method returns a `SieveFuture` object, which contains information about the job; you can later access the output using its `.result()` method. This is useful when you are pushing several jobs at once, such as when processing frames from a video: instead of waiting for earlier frames to finish before pushing later ones, you take full advantage of parallel processing.
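The push-then-collect pattern mirrors Python's own executor model. The sketch below uses `concurrent.futures` purely as a local stand-in for illustration; as noted in the caveat that follows, `SieveFuture` objects are not interchangeable with `concurrent.futures` objects:

```python
from concurrent.futures import ThreadPoolExecutor

def process_frame(i: int) -> int:
    # Stand-in for per-frame work a Sieve function would do remotely.
    return i * 2

with ThreadPoolExecutor() as pool:
    # Analogous to .push(): submit every frame up front, collecting futures.
    futures = [pool.submit(process_frame, i) for i in range(5)]
    # Analogous to SieveFuture.result(): block for each output in order.
    results = [f.result() for f in futures]

print(results)  # [0, 2, 4, 6, 8]
```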
Note that `SieveFuture` objects don't work with traditional `concurrent.futures` methods like `concurrent.futures.as_completed` or `concurrent.futures.wait`.
You can call both `.run()` and `.push()` locally, or within another Sieve function. Calling `.run()` or `.push()` within Sieve allows you to chain function calls together, enabling powerful AI applications!
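A sketch of chaining, assuming two already-deployed functions fetched with `sieve.function.get()` (the org and function names here are hypothetical):

```python
import sieve

@sieve.function(name="transcribe-video")
def transcribe_video(video: sieve.File) -> str:
    # Each .run() blocks on a remote job; its output feeds the next call.
    audio = sieve.function.get("your-org/extract-audio").run(video)
    return sieve.function.get("your-org/speech-to-text").run(audio)
```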
| Parameter | Type | Description | Default |
|---|---|---|---|
| `name` | `str` | Defines the name of the function | This field is required |
| `gpu` | `sieve.gpu` | Determines whether the function is deployed on a server with a GPU, and which GPU it should be deployed to. See the Parameter column of the table in GPU Acceleration for a list of options and for more information about GPU sharing. The default value runs on a machine with no GPU; any other value is specified with a call to one of the `sieve.gpu` constructors | `None` |
| `python_version` | `str` | Determines the Python version installed | `3.8` |
| `python_packages` | `List[str]` | List of Python packages to be installed during build | A minimum set of Sieve dependencies is installed by default |
| `system_packages` | `List[str]` | List of Linux packages to be installed during build | `[]` |
| `cuda_version` | `str` | Version of CUDA to be installed (for functions with `gpu` enabled). See below for possible versions | Determined by Python packages |
| `run_commands` | `List[str]` | List of shell commands to be run during build. Note: these commands currently do not have access to the uploaded code | `[]` |
| `environment_variables` | `List[sieve.Env]` | List of environment variables that let you pass org-level configs and secrets to your Sieve function. Check out our guide here for more | `[]` |
| `metadata` | `sieve.Metadata` | Extra information about the function to be shown on the dashboard. See below for an example | `None` |
| `restart_on_error` | `bool` | If `True`, the container is automatically restarted from scratch whenever your function (or `__predict__()` for class-based functions) errors. This takes time, but is useful in the event of irreversible failures like GPU corruption. Read more about it below | `True` |
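Putting several of these parameters together in one decorator (the function name, package pins, and GPU choice below are illustrative assumptions):

```python
import sieve

@sieve.function(
    name="transcribe",
    gpu=sieve.gpu.T4(),                 # one of the sieve.gpu constructors
    python_version="3.10",
    python_packages=["torch==2.1.0"],
    system_packages=["ffmpeg"],
)
def transcribe(audio: sieve.File) -> str:
    ...
```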
By default, Sieve restarts the container whenever your function errors, including errors raised in the `__setup__()` function for class-based functions. This behavior is intended as a safeguard against irreversible errors, including GPU errors and state-related errors. For production applications, you may choose to disable this so that functions don't restart on more trivial errors, such as invalid inputs or edge cases in code. Restarting a function takes longer, but guarantees a brand new state after every function error.
To disable this behavior, set the `restart_on_error` parameter in the function header as follows:
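For example (the function name and signature here are illustrative):

```python
import sieve

@sieve.function(name="my-function", restart_on_error=False)
def my_function(x: int) -> int:
    ...
```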
One exception is `sieve.FatalException`: a special Python exception in the Sieve ecosystem that lets you trigger a hard restart of the function when certain irreversible errors are caught, even if the `restart_on_error` parameter is set to `False`. Here is a brief example of this in action.
You can choose a CUDA version with the `cuda_version` parameter in the function decorator. You can also specify partial versions (e.g. `'12.2'` for `'12.2.2'`). Possible values for `cuda_version` are: