Stack

laktory.models.Stack

Bases: BaseModel

The Stack defines a collection of deployable resources, the deployment configuration, variables, and environment-specific settings.

ATTRIBUTE DESCRIPTION
backend

IaC backend used for deployment.

TYPE: Literal['pulumi', 'terraform']

description

Description of the stack

TYPE: str

environments

Environment-specific overwrite of config, resources or variables arguments.

TYPE: dict[str, EnvironmentSettings]

name

Name of the stack. If Pulumi is used as a backend, it should match the name of the Pulumi project.

TYPE: str

organization

Organization

TYPE: Union[str, None]

pulumi

Pulumi-specific settings

TYPE: Pulumi

resources

Dictionary of resources to be deployed. Each key is a resource type and each value is a dictionary of resources whose keys are the resource names and whose values are the resource definitions.

TYPE: Union[StackResources, None]

terraform

Terraform-specific settings

TYPE: Terraform

variables

Dictionary of variables made available in the resources definition.

TYPE: dict[str, Any]

Examples:

from laktory import models

stack = models.Stack(
    name="workspace",
    backend="pulumi",
    pulumi={
        "config": {
            "databricks:host": "${vars.DATABRICKS_HOST}",
            "databricks:token": "${vars.DATABRICKS_TOKEN}",
        },
    },
    resources={
        "databricks_dltpipelines": {
            "pl-stock-prices": {
                "name": "pl-stock-prices",
                "development": "${vars.is_dev}",
                "libraries": [
                    {"notebook": {"path": "/pipelines/dlt_brz_template.py"}},
                ],
            }
        },
        "databricks_jobs": {
            "job-stock-prices": {
                "name": "job-stock-prices",
                "clusters": [
                    {
                        "name": "main",
                        "spark_version": "14.0.x-scala2.12",
                        "node_type_id": "Standard_DS3_v2",
                    }
                ],
                "tasks": [
                    {
                        "task_key": "ingest",
                        "job_cluster_key": "main",
                        "notebook_task": {
                            "notebook_path": "/.laktory/jobs/ingest_stock_prices.py",
                        },
                    },
                    {
                        "task_key": "pipeline",
                        "depends_ons": [{"task_key": "ingest"}],
                        "pipeline_task": {
                            "pipeline_id": "${resources.dlt-pl-stock-prices.id}",
                        },
                    },
                ],
            }
        },
    },
    variables={
        "org": "okube",
    },
    environments={
        "dev": {
            "variables": {
                "is_dev": True,
            }
        },
        "prod": {
            "variables": {
                "is_dev": False,
            }
        },
    },
)

print(stack)
'''
variables={'org': 'okube'} backend='pulumi' description=None environments={'dev': EnvironmentSettings(variables={'is_dev': True}, resources=None), 'prod': EnvironmentSettings(variables={'is_dev': False}, resources=None)} name='workspace' organization=None pulumi=Pulumi(variables={}, config={'databricks:host': '${vars.DATABRICKS_HOST}', 'databricks:token': '${vars.DATABRICKS_TOKEN}'}, outputs={}) resources=StackResources(variables={}, databricks_dashboards={}, databricks_dbfsfiles={}, databricks_catalogs={}, databricks_clusters={}, databricks_directories={}, databricks_externallocations={}, databricks_groups={}, databricks_jobs={'job-stock-prices': Job(resource_name_='job-stock-prices', options=ResourceOptions(variables={}, depends_on=[], provider=None, ignore_changes=None, aliases=None, delete_before_replace=True, import_=None, parent=None, replace_on_changes=None), lookup_existing=None, variables={}, access_controls=[], clusters=[JobCluster(resource_name_=None, options=ResourceOptions(variables={}, depends_on=[], provider=None, ignore_changes=None, aliases=None, delete_before_replace=True, import_=None, parent=None, replace_on_changes=None), lookup_existing=None, variables={}, access_controls=None, apply_policy_default_values=None, autoscale=None, autotermination_minutes=None, cluster_id=None, custom_tags=None, data_security_mode='USER_ISOLATION', driver_instance_pool_id=None, driver_node_type_id=None, enable_elastic_disk=None, enable_local_disk_encryption=None, idempotency_token=None, init_scripts=[], instance_pool_id=None, is_pinned=None, libraries=None, name='main', node_type_id='Standard_DS3_v2', num_workers=None, policy_id=None, runtime_engine=None, single_user_name=None, spark_conf={}, spark_env_vars={}, spark_version='14.0.x-scala2.12', ssh_public_keys=[])], continuous=None, control_run_state=None, email_notifications=None, format=None, health=None, max_concurrent_runs=None, max_retries=None, min_retry_interval_millis=None, name='job-stock-prices', notification_settings=None, parameters=[], retry_on_timeout=None, run_as=None, schedule=None, tags={}, tasks=[JobTask(variables={}, condition_task=None, depends_ons=None, description=None, email_notifications=None, existing_cluster_id=None, health=None, job_cluster_key='main', libraries=None, max_retries=None, min_retry_interval_millis=None, notebook_task=JobTaskNotebookTask(variables={}, notebook_path='/.laktory/jobs/ingest_stock_prices.py', base_parameters=None, source=None), notification_settings=None, pipeline_task=None, retry_on_timeout=None, run_if=None, run_job_task=None, sql_task=None, task_key='ingest', timeout_seconds=None), JobTask(variables={}, condition_task=None, depends_ons=[JobTaskDependsOn(variables={}, task_key='ingest', outcome=None)], description=None, email_notifications=None, existing_cluster_id=None, health=None, job_cluster_key=None, libraries=None, max_retries=None, min_retry_interval_millis=None, notebook_task=None, notification_settings=None, pipeline_task=JobTaskPipelineTask(variables={}, pipeline_id='${resources.dlt-pl-stock-prices.id}', full_refresh=None), retry_on_timeout=None, run_if=None, run_job_task=None, sql_task=None, task_key='pipeline', timeout_seconds=None)], timeout_seconds=None, trigger=None, webhook_notifications=None)}, databricks_metastoredataaccesses={}, databricks_metastores={}, databricks_networkconnectivityconfig={}, databricks_notebooks={}, databricks_dltpipelines={'pl-stock-prices': DLTPipeline(resource_name_='pl-stock-prices', options=ResourceOptions(variables={}, depends_on=[], 
provider=None, ignore_changes=None, aliases=None, delete_before_replace=True, import_=None, parent=None, replace_on_changes=None), lookup_existing=None, variables={}, access_controls=[], allow_duplicate_names=None, catalog=None, channel='PREVIEW', clusters=[], configuration={}, continuous=None, development='${vars.is_dev}', edition=None, libraries=[PipelineLibrary(variables={}, file=None, notebook=PipelineLibraryNotebook(variables={}, path='/pipelines/dlt_brz_template.py'))], name='pl-stock-prices', notifications=[], photon=None, serverless=None, storage=None, target=None)}, databricks_schemas={}, databricks_secrets={}, databricks_secretscopes={}, databricks_serviceprincipals={}, databricks_sqlqueries={}, databricks_tables={}, databricks_users={}, databricks_volumes={}, databricks_vectorsearchendpoints={}, databricks_vectorsearchindexes={}, databricks_warehouses={}, databricks_workspacefiles={}, pipelines={}, providers={}) terraform=Terraform(variables={}, backend=None)
'''

Functions

get_env

get_env(env_name, inject_vars=True)

Complete definition of the stack for a given environment. It takes into account both the default stack values and the environment-specific overwrites.

PARAMETER DESCRIPTION
env_name

Name of the environment

TYPE: str

inject_vars

If True, inject variables into the returned environment definition.

TYPE: bool DEFAULT: True

RETURNS DESCRIPTION
EnvironmentStack

Environment definitions.

Source code in laktory/models/stacks/stack.py (lines 368-460)
def get_env(self, env_name: str, inject_vars=True) -> EnvironmentStack:
    """
    Complete definition of the stack for a given environment. It takes into
    account both the default stack values and environment-specific
    overwrites.

    Parameters
    ----------
    env_name:
        Name of the environment
    inject_vars:
        If `True`, inject variables into the returned environment definition.

    Returns
    -------
    :
        Environment definitions.
    """

    if self._envs is None:

        ENV_FIELDS = ["pulumi", "resources", "terraform", "variables"]

        # Because options is an excluded field for all resources and
        # sub-resources, we need to manually dump it and add it to
        # the base dump
        def dump_with_options(obj: Any) -> Any:

            # Check data type, call recursively if not a BaseModel
            if isinstance(obj, list):
                return [dump_with_options(v) for v in obj]
            elif isinstance(obj, dict):
                return {k: dump_with_options(v) for k, v in obj.items()}
            elif not isinstance(obj, BaseModel):
                return obj

            # Get model dump
            model = obj
            data = model.model_dump(exclude_unset=True)

            # Loop through all model fields
            for field_name, field in model.model_fields.items():

                # Explicitly dump options if found in the model
                if field_name == "options" and field.annotation == ResourceOptions:
                    data["options"] = model.options.model_dump(exclude_unset=True)

                if field_name == "resource_name_" and model.resource_name_:
                    data["resource_name_"] = model.resource_name_

                if field_name == "lookup_existing" and model.lookup_existing:
                    data["lookup_existing"] = model.lookup_existing.model_dump(
                        exclude_unset=True
                    )

                # Parse list
                if isinstance(data.get(field_name, None), list):
                    data[field_name] = [
                        dump_with_options(v) for v in getattr(model, field_name)
                    ]

                # Parse dict (might result from a dict or a BaseModel)
                elif isinstance(data.get(field_name, None), dict):
                    a = getattr(model, field_name)

                    if isinstance(a, dict):
                        for k in a.keys():
                            data[field_name][k] = dump_with_options(a[k])
                    else:
                        data[field_name] = dump_with_options(a)

            return data

        envs = {}
        for _env_name, env in self.environments.items():

            d = dump_with_options(self)
            _envs = d.pop("environments")

            for k in ENV_FIELDS:
                v1 = _envs[_env_name].get(k, {})
                if k in d:
                    d[k] = merge_dicts(d[k], v1)
                elif k in _envs[_env_name]:
                    d[k] = v1

            # Inject Variables
            if inject_vars:
                d = self.environments[_env_name].inject_vars(d)

            envs[_env_name] = EnvironmentStack(**d)

        self._envs = envs

    return self._envs[env_name]
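
A minimal usage sketch, assuming the stack defined in the Examples section above. The value shown in the comment is the expected result of injecting the dev variables, not captured output.

dev_stack = stack.get_env("dev")

# The "${vars.is_dev}" placeholder is expected to resolve to the value
# defined for the "dev" environment
print(dev_stack.resources.databricks_dltpipelines["pl-stock-prices"].development)
#> True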

to_pulumi

to_pulumi(env_name=None)

Create a Pulumi stack for a given environment env_name.

PARAMETER DESCRIPTION
env_name

Target environment. If None, only the default stack values are used.

TYPE: Union[str, None] DEFAULT: None

RETURNS DESCRIPTION
PulumiStack

Pulumi-specific stack definition

Source code in laktory/models/stacks/stack.py (lines 466-502)
def to_pulumi(self, env_name: Union[str, None] = None):
    """
    Create a Pulumi stack for a given environment `env_name`.

    Parameters
    ----------
    env_name:
        Target environment. If `None`, only the default stack values are used.

    Returns
    -------
    : PulumiStack
        Pulumi-specific stack definition
    """
    from laktory.models.stacks.pulumistack import PulumiStack

    if env_name is not None and env_name in self.environments.keys():
        env = self.get_env(env_name=env_name, inject_vars=False)
    else:
        env = self

    # Resources
    resources = {}
    for r in env.resources._get_all().values():
        r.variables = env.variables
        for _r in r.core_resources:
            resources[_r.resource_name] = _r

    return PulumiStack(
        name=env.name,
        organization=env.organization,
        config=env.pulumi.config,
        description=env.description,
        resources=resources,
        variables=env.variables,
        outputs=env.pulumi.outputs,
    )
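
A short sketch, assuming the stack defined in the Examples section above. PulumiStack is a Pydantic model, so it can be inspected or dumped like any other laktory model.

pulumi_stack = stack.to_pulumi(env_name="dev")

# The converted stack holds the deployable resources keyed by resource name
print(list(pulumi_stack.resources.keys()))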

to_terraform

to_terraform(env_name=None)

Create a Terraform stack for a given environment env_name.

PARAMETER DESCRIPTION
env_name

Target environment. If None, only the default stack values are used.

TYPE: Union[str, None] DEFAULT: None

RETURNS DESCRIPTION
TerraformStack

Terraform-specific stack definition

Source code in laktory/models/stacks/stack.py (lines 508-549)
def to_terraform(self, env_name: Union[str, None] = None):
    """
    Create a Terraform stack for a given environment `env_name`.

    Parameters
    ----------
    env_name:
        Target environment. If `None`, only the default stack values are used.

    Returns
    -------
    : TerraformStack
        Terraform-specific stack definition
    """
    from laktory.models.stacks.terraformstack import TerraformStack

    if env_name is not None and env_name in self.environments.keys():
        env = self.get_env(env_name=env_name, inject_vars=False)
    else:
        env = self

    # Providers
    providers = {}
    for r in env.resources._get_all(providers_only=True).values():
        for _r in r.core_resources:
            rname = _r.resource_name
            providers[rname] = _r

    # Resources
    resources = {}
    for r in env.resources._get_all(providers_excluded=True).values():
        r.variables = env.variables
        for _r in r.core_resources:
            resources[_r.resource_name] = _r

    # Update terraform
    return TerraformStack(
        terraform={"backend": env.terraform.backend},
        providers=providers,
        resources=resources,
        variables=env.variables,
    )
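
A short sketch, assuming the stack defined in the Examples section above.

terraform_stack = stack.to_terraform(env_name="dev")

# The converted stack holds providers and resources keyed by resource name
print(list(terraform_stack.resources.keys()))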

laktory.models.StackResources

Bases: BaseModel

Resources definition for a given stack or stack environment.

ATTRIBUTE DESCRIPTION
databricks_dbfsfiles

Databricks DbfsFiles

TYPE: dict[str, DbfsFile]

databricks_catalogs

Databricks Catalogs

TYPE: dict[str, Catalog]

databricks_clusters

Databricks Clusters

TYPE: dict[str, Cluster]

databricks_dashboards

Databricks Dashboards

TYPE: dict[str, Dashboard]

databricks_directories

Databricks Directories

TYPE: dict[str, Directory]

databricks_externallocations

Databricks External Locations

TYPE: dict[str, ExternalLocation]

databricks_groups

Databricks Groups

TYPE: dict[str, Group]

databricks_jobs

Databricks Jobs

TYPE: dict[str, Job]

databricks_metastores

Databricks Metastores

TYPE: dict[str, Metastore]

databricks_networkconnectivityconfig

Databricks Network Connectivity Config

TYPE: dict[str, MwsNetworkConnectivityConfig]

databricks_notebooks

Databricks Notebooks

TYPE: dict[str, Notebook]

databricks_dltpipelines

Databricks DLT Pipelines

TYPE: dict[str, DLTPipeline]

databricks_schemas

Databricks Schemas

TYPE: dict[str, Schema]

databricks_secretscopes

Databricks SecretScopes

TYPE: dict[str, SecretScope]

databricks_serviceprincipals

Databricks ServicePrincipals

TYPE: dict[str, ServicePrincipal]

databricks_sqlqueries

Databricks SQLQueries

TYPE: dict[str, SqlQuery]

databricks_tables

Databricks Tables

TYPE: dict[str, Table]

providers

Providers

TYPE: dict[str, Union[AWSProvider, AzureProvider, AzurePulumiProvider, DatabricksProvider]]

databricks_users

Databricks Users

TYPE: dict[str, User]

databricks_vectorsearchendpoints

Databricks Vector Search Endpoints

TYPE: dict[str, VectorSearchEndpoint]

databricks_vectorsearchindexes

Databricks Vector Search Indexes

TYPE: dict[str, VectorSearchIndex]

databricks_volumes

Databricks Volumes

TYPE: dict[str, Volume]

databricks_warehouses

Databricks Warehouses

TYPE: dict[str, Warehouse]

databricks_workspacefiles

Databricks WorkspaceFiles

TYPE: dict[str, WorkspaceFile]

pipelines

Laktory Pipelines

TYPE: dict[str, Pipeline]
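
As a sketch, the resource mapping used in the Stack example above can also be built directly as a StackResources object:

from laktory import models

resources = models.StackResources(
    databricks_dltpipelines={
        "pl-stock-prices": {
            "name": "pl-stock-prices",
            "libraries": [
                {"notebook": {"path": "/pipelines/dlt_brz_template.py"}},
            ],
        }
    },
)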


laktory.models.stacks.stack.EnvironmentSettings

Bases: BaseModel

Settings overwrites for a specific environment.

ATTRIBUTE DESCRIPTION
resources

Dictionary of resources to be deployed. Each key is a resource type and each value is a dictionary of resources whose keys are the resource names and whose values are the resource definitions.

TYPE: Any

variables

Dictionary of variables made available in the resources definition.

TYPE: dict[str, Any]
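
Environment settings are typically declared through the environments argument of a Stack. A minimal sketch, reusing the job from the Stack example above; the max_concurrent_runs overwrite is illustrative only:

environments = {
    "dev": {
        "variables": {"is_dev": True},
    },
    "prod": {
        "variables": {"is_dev": False},
        # Illustrative resource overwrite applied only in prod
        "resources": {
            "databricks_jobs": {
                "job-stock-prices": {"max_concurrent_runs": 1},
            },
        },
    },
}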


laktory.models.stacks.stack.EnvironmentStack

Bases: BaseModel

Environment-specific stack definition.

ATTRIBUTE DESCRIPTION
backend

IaC backend used for deployment.

TYPE: Literal['pulumi', 'terraform']

description

Description of the stack

TYPE: str

name

Name of the stack. If Pulumi is used as a backend, it should match the name of the Pulumi project.

TYPE: str

organization

Organization

TYPE: str

pulumi

Pulumi-specific settings

TYPE: Pulumi

resources

Dictionary of resources to be deployed. Each key is a resource type and each value is a dictionary of resources whose keys are the resource names and whose values are the resource definitions.

TYPE: Union[StackResources, None]

terraform

Terraform-specific settings

TYPE: Terraform

variables

Dictionary of variables made available in the resources definition.

TYPE: dict[str, Any]


laktory.models.stacks.stack.Pulumi

Bases: BaseModel

ATTRIBUTE DESCRIPTION
config

Pulumi configuration settings. Generally used to configure providers. See references for more details.

outputs

Requested resources-related outputs. See references for details.
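
A sketch of the corresponding Stack argument, based on the Examples section above; the outputs entry is an assumed example of exposing a resource output:

pulumi = {
    "config": {
        "databricks:host": "${vars.DATABRICKS_HOST}",
        "databricks:token": "${vars.DATABRICKS_TOKEN}",
    },
    # Assumed example: expose a deployed resource id as a stack output
    "outputs": {
        "pipeline-id": "${resources.dlt-pl-stock-prices.id}",
    },
}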

References

laktory.models.stacks.stack.Terraform

Bases: BaseModel