laktory.models.Stack ¤

Bases: BaseModel

The Stack defines a collection of deployable resources, the deployment configuration, variables, and environment-specific settings.

ATTRIBUTE DESCRIPTION
backend

IaC backend used for deployment.

TYPE: Literal['pulumi', 'terraform']

description

Description of the stack

TYPE: str

environments

Environment-specific overwrite of config, resources or variables arguments.

TYPE: dict[str, EnvironmentSettings]

name

Name of the stack. If Pulumi is used as a backend, it should match the name of the Pulumi project.

TYPE: str

organization

Organization

TYPE: Union[str, None]

pulumi

Pulumi-specific settings

TYPE: Pulumi

resources

Dictionary of resources to be deployed. Each key should be a resource type and each value should be a dictionary of resources whose keys are the resource names and whose values are the resource definitions.

TYPE: Union[StackResources, None]

settings

Laktory settings

TYPE: LaktorySettings

terraform

Terraform-specific settings

TYPE: Terraform

variables

Dictionary of variables made available in the resources definition.

TYPE: dict[str, Any]
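Variables are referenced in resource definitions with the `${vars.<name>}` syntax, as in the example below. A minimal sketch of how such placeholders can be resolved (a hypothetical helper for illustration, not the library's implementation):

```python
import re

def resolve_vars(text: str, variables: dict) -> str:
    # Replace each ${vars.name} placeholder with its value from `variables`
    pattern = re.compile(r"\$\{vars\.(\w+)\}")
    return pattern.sub(lambda m: str(variables[m.group(1)]), text)

print(resolve_vars("${vars.org}-workspace", {"org": "okube"}))  # okube-workspace
```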

Examples:

from laktory import models

stack = models.Stack(
    name="workspace",
    backend="pulumi",
    pulumi={
        "config": {
            "databricks:host": "${vars.DATABRICKS_HOST}",
            "databricks:token": "${vars.DATABRICKS_TOKEN}",
        },
    },
    resources={
        "databricks_dltpipelines": {
            "pl-stock-prices": {
                "name": "pl-stock-prices",
                "development": "${vars.is_dev}",
                "libraries": [
                    {"notebook": {"path": "/pipelines/dlt_brz_template.py"}},
                ],
            }
        },
        "databricks_jobs": {
            "job-stock-prices": {
                "name": "job-stock-prices",
                "clusters": [
                    {
                        "name": "main",
                        "spark_version": "14.0.x-scala2.12",
                        "node_type_id": "Standard_DS3_v2",
                    }
                ],
                "tasks": [
                    {
                        "task_key": "ingest",
                        "job_cluster_key": "main",
                        "notebook_task": {
                            "notebook_path": "/.laktory/jobs/ingest_stock_prices.py",
                        },
                    },
                    {
                        "task_key": "pipeline",
                        "depends_ons": [{"task_key": "ingest"}],
                        "pipeline_task": {
                            "pipeline_id": "${resources.dlt-pl-stock-prices.id}",
                        },
                    },
                ],
            }
        },
    },
    variables={
        "org": "okube",
    },
    environments={
        "dev": {
            "variables": {
                "is_dev": True,
            }
        },
        "prod": {
            "variables": {
                "is_dev": False,
            }
        },
    },
)

print(stack)
'''
variables={'org': 'okube'} backend='pulumi' description=None environments={'dev': EnvironmentSettings(variables={'is_dev': True}, resources=None, terraform=Terraform(variables={}, backend=None)), 'prod': EnvironmentSettings(variables={'is_dev': False}, resources=None, terraform=Terraform(variables={}, backend=None))} name='workspace' organization=None pulumi=Pulumi(variables={}, config={'databricks:host': '${vars.DATABRICKS_HOST}', 'databricks:token': '${vars.DATABRICKS_TOKEN}'}, outputs={}) resources=StackResources(variables={}, databricks_alerts={}, databricks_catalogs={}, databricks_clusterpolicies={}, databricks_clusters={}, databricks_dashboards={}, databricks_dbfsfiles={}, databricks_directories={}, databricks_dltpipelines={'pl-stock-prices': DLTPipeline(resource_name_='pl-stock-prices', options=ResourceOptions(variables={}, is_enabled=True, depends_on=[], provider=None, ignore_changes=None, aliases=None, delete_before_replace=True, import_=None, parent=None, replace_on_changes=None), lookup_existing=None, variables={}, access_controls=[], allow_duplicate_names=None, catalog=None, channel='PREVIEW', clusters=[], configuration={}, continuous=None, development='${vars.is_dev}', edition=None, libraries=[PipelineLibrary(variables={}, file=None, notebook=PipelineLibraryNotebook(variables={}, path='/pipelines/dlt_brz_template.py'))], name='pl-stock-prices', name_prefix=None, name_suffix=None, notifications=[], photon=None, serverless=None, storage=None, target=None)}, databricks_externallocations={}, databricks_grants={}, databricks_groups={}, databricks_jobs={'job-stock-prices': Job(resource_name_='job-stock-prices', options=ResourceOptions(variables={}, is_enabled=True, depends_on=[], provider=None, ignore_changes=None, aliases=None, delete_before_replace=True, import_=None, parent=None, replace_on_changes=None), lookup_existing=None, variables={}, access_controls=[], clusters=[JobCluster(resource_name_=None, options=ResourceOptions(variables={}, is_enabled=True, 
depends_on=[], provider=None, ignore_changes=None, aliases=None, delete_before_replace=True, import_=None, parent=None, replace_on_changes=None), lookup_existing=None, variables={}, access_controls=None, apply_policy_default_values=None, autoscale=None, autotermination_minutes=None, cluster_id=None, custom_tags=None, data_security_mode='USER_ISOLATION', driver_instance_pool_id=None, driver_node_type_id=None, enable_elastic_disk=None, enable_local_disk_encryption=None, idempotency_token=None, init_scripts=[], instance_pool_id=None, is_pinned=None, libraries=None, name='main', node_type_id='Standard_DS3_v2', no_wait=None, num_workers=None, policy_id=None, runtime_engine=None, single_user_name=None, spark_conf={}, spark_env_vars={}, spark_version='14.0.x-scala2.12', ssh_public_keys=[])], continuous=None, control_run_state=None, description=None, email_notifications=None, format=None, health=None, max_concurrent_runs=None, max_retries=None, min_retry_interval_millis=None, name='job-stock-prices', name_prefix=None, name_suffix=None, notification_settings=None, parameters=[], queue=None, retry_on_timeout=None, run_as=None, schedule=None, tags={}, tasks=[JobTask(variables={}, condition_task=None, depends_ons=None, description=None, email_notifications=None, existing_cluster_id=None, health=None, job_cluster_key='main', libraries=None, max_retries=None, min_retry_interval_millis=None, notebook_task=JobTaskNotebookTask(variables={}, notebook_path='/.laktory/jobs/ingest_stock_prices.py', base_parameters=None, warehouse_id=None, source=None), notification_settings=None, pipeline_task=None, retry_on_timeout=None, run_if=None, run_job_task=None, sql_task=None, task_key='ingest', timeout_seconds=None, for_each_task=None), JobTask(variables={}, condition_task=None, depends_ons=[JobTaskDependsOn(variables={}, task_key='ingest', outcome=None)], description=None, email_notifications=None, existing_cluster_id=None, health=None, job_cluster_key=None, libraries=None, max_retries=None, 
min_retry_interval_millis=None, notebook_task=None, notification_settings=None, pipeline_task=JobTaskPipelineTask(variables={}, pipeline_id='${resources.dlt-pl-stock-prices.id}', full_refresh=None), retry_on_timeout=None, run_if=None, run_job_task=None, sql_task=None, task_key='pipeline', timeout_seconds=None, for_each_task=None)], timeout_seconds=None, trigger=None, webhook_notifications=None)}, databricks_metastoredataaccesses={}, databricks_metastores={}, databricks_mlflowexperiments={}, databricks_mlflowmodels={}, databricks_mlflowwebhooks={}, databricks_networkconnectivityconfig={}, databricks_notebooks={}, databricks_queries={}, databricks_repos={}, databricks_schemas={}, databricks_secrets={}, databricks_secretscopes={}, databricks_serviceprincipals={}, databricks_tables={}, databricks_users={}, databricks_vectorsearchendpoints={}, databricks_vectorsearchindexes={}, databricks_volumes={}, databricks_warehouses={}, databricks_workspacefiles={}, pipelines={}, providers={}) settings=None terraform=Terraform(variables={}, backend=None)
'''
METHOD DESCRIPTION
apply_settings

Required to apply settings before instantiating resources and setting default values

get_env

Complete definition of the stack for a given environment.

to_pulumi

Create a pulumi stack for a given environment env.

to_terraform

Create a terraform stack for a given environment env.

Functions¤

apply_settings classmethod ¤

apply_settings(data)

Required to apply settings before instantiating resources and setting default values

Source code in laktory/models/stacks/stack.py
@model_validator(mode="before")
@classmethod
def apply_settings(cls, data: Any) -> Any:
    """Required to apply settings before instantiating resources and setting default values"""
    settings = data.get("settings", None)
    if settings:
        LaktorySettings(**settings)

    return data
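The pattern above — a `mode="before"` model validator that processes raw input before field validation — can be illustrated with a minimal pydantic v2 model (illustrative class and field names, not part of laktory):

```python
from typing import Any

from pydantic import BaseModel, model_validator

class Config(BaseModel):
    name: str

    @model_validator(mode="before")
    @classmethod
    def apply_defaults(cls, data: Any) -> Any:
        # Runs on the raw input dict before field validation,
        # mirroring how Stack.apply_settings pre-processes data
        if isinstance(data, dict) and "name" not in data:
            data["name"] = "default"
        return data

print(Config().name)  # default
```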

get_env ¤

get_env(env_name)

Complete definition of the stack for a given environment. It takes into account both the default stack values and environment-specific overwrites.

PARAMETER DESCRIPTION
env_name

Name of the environment

TYPE: str

RETURNS DESCRIPTION
EnvironmentStack

Environment definitions.

Source code in laktory/models/stacks/stack.py
def get_env(self, env_name: str) -> EnvironmentStack:
    """
    Complete definition of the stack for a given environment. It takes into
    account both the default stack values and environment-specific
    overwrites.

    Parameters
    ----------
    env_name:
        Name of the environment

    Returns
    -------
    :
        Environment definitions.
    """

    if env_name is None:
        env = self
        env.push_vars()
        return env

    if env_name not in self.environments.keys():
        raise ValueError(f"Environment '{env_name}' is not declared in the stack.")

    if self._envs is None:
        ENV_FIELDS = ["pulumi", "resources", "terraform", "variables"]

        # Because options is an excluded field for all resources and
        # sub-resources, we need to manually dump it and add it to
        # the base dump
        def dump_with_options(obj: Any) -> Any:
            # Check data type, call recursively if not a BaseModel
            if isinstance(obj, list):
                return [dump_with_options(v) for v in obj]
            elif isinstance(obj, dict):
                return {k: dump_with_options(v) for k, v in obj.items()}
            elif not isinstance(obj, BaseModel):
                return obj

            # Get model dump
            model = obj
            data = model.model_dump(exclude_unset=True)

            # Loop through all model fields
            for field_name, field in model.model_fields.items():
                # Explicitly dump options if found in the model
                if field_name == "options" and field.annotation == ResourceOptions:
                    data["options"] = model.options.model_dump(exclude_unset=True)

                if field_name == "resource_name_" and model.resource_name_:
                    data["resource_name_"] = model.resource_name_

                if field_name == "lookup_existing" and model.lookup_existing:
                    data["lookup_existing"] = model.lookup_existing.model_dump(
                        exclude_unset=True
                    )

                # Parse list
                if isinstance(data.get(field_name, None), list):
                    data[field_name] = [
                        dump_with_options(v) for v in getattr(model, field_name)
                    ]

                # Parse dict (might result from a dict or a BaseModel)
                elif isinstance(data.get(field_name, None), dict):
                    a = getattr(model, field_name)

                    if isinstance(a, dict):
                        for k in a.keys():
                            data[field_name][k] = dump_with_options(a[k])
                    else:
                        data[field_name] = dump_with_options(a)

            return data

        envs = {}
        for _env_name, env in self.environments.items():
            d = dump_with_options(self)
            _envs = d.pop("environments")

            for k in ENV_FIELDS:
                v1 = _envs[_env_name].get(k, {})
                if k in d:
                    d[k] = merge_dicts(d[k], v1)
                elif k in _envs[_env_name]:
                    d[k] = v1

            envs[_env_name] = EnvironmentStack(**d)
            envs[_env_name].push_vars()

        self._envs = envs

    return self._envs[env_name]
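The environment overwrite applied above can be pictured as a recursive dictionary merge in which environment-specific values win over the defaults. A simplified sketch of that merge behavior (not the library's actual `merge_dicts`):

```python
def merge_dicts(base: dict, overwrite: dict) -> dict:
    # Recursively merge `overwrite` into `base`; overwrite wins on conflicts
    out = dict(base)
    for k, v in overwrite.items():
        if isinstance(v, dict) and isinstance(out.get(k), dict):
            out[k] = merge_dicts(out[k], v)
        else:
            out[k] = v
    return out

defaults = {"variables": {"org": "okube", "is_dev": None}}
dev = {"variables": {"is_dev": True}}
print(merge_dicts(defaults, dev))  # {'variables': {'org': 'okube', 'is_dev': True}}
```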

to_pulumi ¤

to_pulumi(env_name=None)

Create a pulumi stack for a given environment env.

PARAMETER DESCRIPTION
env_name

Target environment. If None, uses default stack values only.

TYPE: Union[str, None] DEFAULT: None

RETURNS DESCRIPTION
PulumiStack

Pulumi-specific stack definition

Source code in laktory/models/stacks/stack.py
def to_pulumi(self, env_name: Union[str, None] = None):
    """
    Create a pulumi stack for a given environment `env`.

    Parameters
    ----------
    env_name:
        Target environment. If `None`, uses default stack values only.

    Returns
    -------
    : PulumiStack
        Pulumi-specific stack definition
    """
    from laktory.models.stacks.pulumistack import PulumiStack

    env = self.get_env(env_name=env_name)

    # Resources
    resources = {}
    for r in env.resources._get_all().values():
        r.variables = env.variables
        for _r in r.core_resources:
            resources[_r.resource_name] = _r

    return PulumiStack(
        name=env.name,
        organization=env.organization,
        config=env.pulumi.config,
        description=env.description,
        resources=resources,
        variables=env.variables,
        outputs=env.pulumi.outputs,
    )
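The resource loop above expands every model into its core resources and indexes them by resource name. A toy sketch of that flattening step, with hypothetical classes standing in for laktory's resource models:

```python
from dataclasses import dataclass, field

@dataclass
class CoreResource:
    resource_name: str

@dataclass
class Model:
    core_resources: list = field(default_factory=list)

models = [
    Model([CoreResource("pl-stock-prices"), CoreResource("pl-stock-prices-permissions")]),
    Model([CoreResource("job-stock-prices")]),
]

# Flatten all models into a single {resource_name: resource} mapping,
# as to_pulumi does before building the PulumiStack
resources = {}
for m in models:
    for r in m.core_resources:
        resources[r.resource_name] = r

print(sorted(resources))
```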

to_terraform ¤

to_terraform(env_name=None)

Create a terraform stack for a given environment env.

PARAMETER DESCRIPTION
env_name

Target environment. If None, uses default stack values only.

TYPE: Union[str, None] DEFAULT: None

RETURNS DESCRIPTION
TerraformStack

Terraform-specific stack definition

Source code in laktory/models/stacks/stack.py
def to_terraform(self, env_name: Union[str, None] = None):
    """
    Create a terraform stack for a given environment `env`.

    Parameters
    ----------
    env_name:
        Target environment. If `None`, uses default stack values only.

    Returns
    -------
    : TerraformStack
        Terraform-specific stack definition
    """
    from laktory.models.stacks.terraformstack import TerraformStack

    env = self.get_env(env_name=env_name)

    # Providers
    providers = {}
    for r in env.resources._get_all(providers_only=True).values():
        for _r in r.core_resources:
            rname = _r.resource_name
            providers[rname] = _r

    # Resources
    resources = {}
    for r in env.resources._get_all(providers_excluded=True).values():
        r.variables = env.variables
        for _r in r.core_resources:
            resources[_r.resource_name] = _r

    # Update terraform
    return TerraformStack(
        terraform={"backend": env.terraform.backend},
        providers=providers,
        resources=resources,
        variables=env.variables,
    )

laktory.models.StackResources ¤

Bases: BaseModel

Resources definition for a given stack or stack environment.

ATTRIBUTE DESCRIPTION
databricks_alerts

Databricks Alerts

TYPE: dict[str, Alert]

databricks_dbfsfiles

Databricks DbfsFiles

TYPE: dict[str, DbfsFile]

databricks_catalogs

Databricks Catalogs

TYPE: dict[str, Catalog]

databricks_clusters

Databricks Clusters

TYPE: dict[str, Cluster]

databricks_clusterpolicies

Databricks Cluster Policies

TYPE: dict[str, ClusterPolicy]

databricks_dashboards

Databricks Dashboards

TYPE: dict[str, Dashboard]

databricks_directories

Databricks Directories

TYPE: dict[str, Directory]

databricks_dltpipelines

Databricks DLT Pipelines

TYPE: dict[str, DLTPipeline]

databricks_externallocations

Databricks External Locations

TYPE: dict[str, ExternalLocation]

databricks_groups

Databricks Groups

TYPE: dict[str, Group]

databricks_grants

Databricks Grants

TYPE: dict[str, Grants]

databricks_jobs

Databricks Jobs

TYPE: dict[str, Job]

databricks_metastores

Databricks Metastores

TYPE: dict[str, Metastore]

databricks_mlflowexperiments

Databricks MLflow Experiments

TYPE: dict[str, MLflowExperiment]

databricks_mlflowmodels

Databricks MLflow models

TYPE: dict[str, MLflowModel]

databricks_mlflowwebhooks

Databricks MLflow webhooks

TYPE: dict[str, MLflowWebhook]

databricks_networkconnectivityconfig

Databricks Network Connectivity Config

TYPE: dict[str, MwsNetworkConnectivityConfig]

databricks_notebooks

Databricks Notebooks

TYPE: dict[str, Notebook]

databricks_queries

Databricks Queries

TYPE: dict[str, Query]

databricks_repos

Databricks Repos

TYPE: dict[str, Repo]

databricks_schemas

Databricks Schemas

TYPE: dict[str, Schema]

databricks_secretscopes

Databricks SecretScopes

TYPE: dict[str, SecretScope]

databricks_serviceprincipals

Databricks ServicePrincipals

TYPE: dict[str, ServicePrincipal]

databricks_tables

Databricks Tables

TYPE: dict[str, Table]

databricks_users

Databricks Users

TYPE: dict[str, User]

databricks_vectorsearchendpoints

Databricks Vector Search Endpoints

TYPE: dict[str, VectorSearchEndpoint]

databricks_vectorsearchindexes

Databricks Vector Search Indexes

TYPE: dict[str, VectorSearchIndex]

databricks_volumes

Databricks Volumes

TYPE: dict[str, Volume]

databricks_warehouses

Databricks Warehouses

TYPE: dict[str, Warehouse]

databricks_workspacefiles

Databricks Workspace Files

TYPE: dict[str, WorkspaceFile]

pipelines

Laktory Pipelines

TYPE: dict[str, Pipeline]

providers

Providers

TYPE: dict[str, Union[AWSProvider, AzureProvider, AzurePulumiProvider, DatabricksProvider]]


laktory.models.stacks.stack.LaktorySettings ¤

Bases: BaseModel

Laktory Settings

ATTRIBUTE DESCRIPTION
dataframe_backend

DataFrame backend

TYPE: str

laktory_root

Laktory cache root directory. Used when a pipeline needs to write checkpoint files.

TYPE: str

workspace_laktory_root

Root directory of a Databricks Workspace (excluding `/Workspace`) to which Databricks objects like notebooks and workspace files are deployed.

TYPE: str


laktory.models.stacks.stack.EnvironmentSettings ¤

Bases: BaseModel

Settings overwrite for a specific environment

ATTRIBUTE DESCRIPTION
resources

Dictionary of resources to be deployed. Each key should be a resource type and each value should be a dictionary of resources whose keys are the resource names and whose values are the resource definitions.

TYPE: Any

variables

Dictionary of variables made available in the resources definition.

TYPE: dict[str, Any]

terraform

Terraform-specific settings

TYPE: Terraform


laktory.models.stacks.stack.EnvironmentStack ¤

Bases: BaseModel

Environment-specific stack definition.

ATTRIBUTE DESCRIPTION
backend

IaC backend used for deployment.

TYPE: Literal['pulumi', 'terraform']

description

Description of the stack

TYPE: str

name

Name of the stack. If Pulumi is used as a backend, it should match the name of the Pulumi project.

TYPE: str

organization

Organization

TYPE: str

pulumi

Pulumi-specific settings

TYPE: Pulumi

resources

Dictionary of resources to be deployed. Each key should be a resource type and each value should be a dictionary of resources whose keys are the resource names and whose values are the resource definitions.

TYPE: Union[StackResources, None]

settings

Laktory settings

TYPE: LaktorySettings

terraform

Terraform-specific settings

TYPE: Terraform

variables

Dictionary of variables made available in the resources definition.

TYPE: dict[str, Any]


laktory.models.stacks.stack.Pulumi ¤

Bases: BaseModel

ATTRIBUTE DESCRIPTION
config

Pulumi configuration settings. Generally used to configure providers.

outputs

Requested resource-related outputs.

laktory.models.stacks.stack.Terraform ¤

Bases: BaseModel

Terraform-specific settings.