AWS SageMaker
flytekitplugins-awssagemaker
The plugin currently features a SageMaker deployment connector.
Installation
pip install flytekitplugins-awssagemaker

Quick Start (example, may need adjustment)
See full examples.
from flytekit import task, workflow
from flytekitplugins.awssagemaker_inference import BotoConnector, BotoConfig, BotoTask, SageMakerEndpointConnector
my_task = BotoTask(
    name="my_task",
    task_config=BotoConfig(...),
)

@workflow
def my_workflow() -> None:
    my_task()

Available Imports (6)
A general-purpose boto3 connector that can be used to call any boto3 method.
from flytekitplugins.awssagemaker_inference import BotoConnector
Configuration for a BotoTask: the boto3 service, method, request payload, and region to use.
extends dataclass — configuration or data structure for plugin setup
from flytekitplugins.awssagemaker_inference import BotoConfig
A task that executes a single boto3 method call as described by a BotoConfig.
extends PythonTask — a flyte task that can be used in workflows
from flytekitplugins.awssagemaker_inference import BotoTask
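As an illustrative sketch, a BotoTask wraps a single boto3 call, such as sagemaker:create_endpoint_config. The payload below is plain Python; the commented-out wiring shows how it would be handed to the plugin (the `{inputs.*}` placeholder syntax and the exact BotoConfig/BotoTask field names are assumptions based on typical usage, so verify them against the plugin's API):

```python
# Sketch of the boto3 payload a BotoTask would send to
# sagemaker:create_endpoint_config. "{inputs.*}" placeholders would be
# filled from the task's inputs at run time.
endpoint_config_payload = {
    "EndpointConfigName": "{inputs.endpoint_config_name}",
    "ProductionVariants": [
        {
            "VariantName": "variant-1",
            "ModelName": "{inputs.model_name}",
            "InitialInstanceCount": 1,
            "InstanceType": "ml.m4.xlarge",
        }
    ],
}

# With the plugin installed, this payload would be wrapped roughly as
# follows (field names assumed, not verified):
#
# from flytekit import kwtypes
# from flytekitplugins.awssagemaker_inference import BotoConfig, BotoTask
#
# create_endpoint_config = BotoTask(
#     name="create_endpoint_config",
#     task_config=BotoConfig(
#         service="sagemaker",
#         method="create_endpoint_config",
#         config=endpoint_config_payload,
#         region="us-east-2",
#     ),
#     inputs=kwtypes(endpoint_config_name=str, model_name=str),
# )
```

The payload mirrors the boto3 SageMaker client's create_endpoint_config request shape, which is why any boto3 method can be driven this way.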
This connector creates a SageMaker endpoint.
from flytekitplugins.awssagemaker_inference import SageMakerEndpointConnector
Creates a SageMaker deployment: model, endpoint configuration, and endpoint.
from flytekitplugins.awssagemaker_inference import create_sagemaker_deployment
Deletes a SageMaker deployment: endpoint, endpoint configuration, and model.
from flytekitplugins.awssagemaker_inference import delete_sagemaker_deployment
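A deployment chains three SageMaker resources: a model, an endpoint configuration referencing that model, and an endpoint referencing that configuration. The dictionaries below sketch that chain in plain Python; the commented-out calls show roughly how they would feed `create_sagemaker_deployment` and `delete_sagemaker_deployment` (argument names and the `{inputs.*}` placeholder syntax are assumptions, so check the plugin documentation before use):

```python
# Sketch of the config chain a SageMaker deployment is built from:
# model -> endpoint configuration -> endpoint.
model_config = {
    "ModelName": "{inputs.model_name}",
    "PrimaryContainer": {
        "ModelDataUrl": "{inputs.model_data_url}",
    },
    "ExecutionRoleArn": "{inputs.execution_role_arn}",
}
endpoint_config = {
    "EndpointName": "{inputs.endpoint_name}",
    # The endpoint points at the endpoint configuration, which in turn
    # points at the model by name.
    "EndpointConfigName": "{inputs.endpoint_config_name}",
}

# With the plugin installed (argument names assumed, not verified):
#
# deployment_wf = create_sagemaker_deployment(
#     name="my-deployment",
#     model_config=model_config,
#     ...,
#     region="us-east-2",
# )
#
# Tear-down mirrors creation, deleting the endpoint, endpoint
# configuration, and model:
#
# deletion_wf = delete_sagemaker_deployment(name="my-deployment", region="us-east-2")
```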
Dependencies
Related Plugins
AWS Athena
The Flyte backend can be connected with Athena. Once enabled, you can query the AWS Athena service (Presto with ANSI SQL support) and optionally retrieve a typed schema.
AWS Batch
The Flyte backend can be connected with AWS Batch. Once enabled, you can run Flyte tasks on the AWS Batch service.
Databricks
This plugin provides Databricks integration for Flyte, enabling you to run Spark jobs on Databricks as Flyte tasks.
Kubernetes Pod
By default, Flyte tasks decorated with @task are essentially single functions that are loaded in one container. But often, there is a need to run a job with more than one container.