Stack

laktory.models.Stack

Bases: BaseModel

The Stack defines a collection of deployable resources, the deployment configuration, a set of variables, and environment-specific settings.
ATTRIBUTE | DESCRIPTION
---|---
backend | IaC backend used for deployment.
description | Description of the stack.
environments | Environment-specific overwrites of config, resources or variables arguments.
name | Name of the stack. If Pulumi is used as a backend, it should match the name of the Pulumi project.
organization | Organization.
pulumi | Pulumi-specific settings.
resources | Dictionary of resources to be deployed. Each key should be a resource type and each value a dictionary of resources whose keys are the resource names and whose values are the resource definitions.
settings | Laktory settings.
terraform | Terraform-specific settings.
variables | Dictionary of variables made available in the resource definitions.
Examples:

```python
from laktory import models

stack = models.Stack(
    name="workspace",
    backend="pulumi",
    pulumi={
        "config": {
            "databricks:host": "${vars.DATABRICKS_HOST}",
            "databricks:token": "${vars.DATABRICKS_TOKEN}",
        },
    },
    resources={
        "databricks_dltpipelines": {
            "pl-stock-prices": {
                "name": "pl-stock-prices",
                "development": "${vars.is_dev}",
                "libraries": [
                    {"notebook": {"path": "/pipelines/dlt_brz_template.py"}},
                ],
            }
        },
        "databricks_jobs": {
            "job-stock-prices": {
                "name": "job-stock-prices",
                "clusters": [
                    {
                        "name": "main",
                        "spark_version": "14.0.x-scala2.12",
                        "node_type_id": "Standard_DS3_v2",
                    }
                ],
                "tasks": [
                    {
                        "task_key": "ingest",
                        "job_cluster_key": "main",
                        "notebook_task": {
                            "notebook_path": "/.laktory/jobs/ingest_stock_prices.py",
                        },
                    },
                    {
                        "task_key": "pipeline",
                        "depends_ons": [{"task_key": "ingest"}],
                        "pipeline_task": {
                            "pipeline_id": "${resources.dlt-pl-stock-prices.id}",
                        },
                    },
                ],
            }
        },
    },
    variables={
        "org": "okube",
    },
    environments={
        "dev": {
            "variables": {
                "is_dev": True,
            }
        },
        "prod": {
            "variables": {
                "is_dev": False,
            }
        },
    },
)
print(stack)
```
'''
variables={'org': 'okube'} backend='pulumi' description=None environments={'dev': EnvironmentSettings(variables={'is_dev': True}, resources=None, terraform=Terraform(variables={}, backend=None)), 'prod': EnvironmentSettings(variables={'is_dev': False}, resources=None, terraform=Terraform(variables={}, backend=None))} name='workspace' organization=None pulumi=Pulumi(variables={}, config={'databricks:host': '${vars.DATABRICKS_HOST}', 'databricks:token': '${vars.DATABRICKS_TOKEN}'}, outputs={}) resources=StackResources(variables={}, databricks_alerts={}, databricks_catalogs={}, databricks_clusterpolicies={}, databricks_clusters={}, databricks_dashboards={}, databricks_dbfsfiles={}, databricks_directories={}, databricks_dltpipelines={'pl-stock-prices': DLTPipeline(resource_name_='pl-stock-prices', options=ResourceOptions(variables={}, is_enabled=True, depends_on=[], provider=None, ignore_changes=None, aliases=None, delete_before_replace=True, import_=None, parent=None, replace_on_changes=None), lookup_existing=None, variables={}, access_controls=[], allow_duplicate_names=None, catalog=None, channel='PREVIEW', clusters=[], configuration={}, continuous=None, development='${vars.is_dev}', edition=None, libraries=[PipelineLibrary(variables={}, file=None, notebook=PipelineLibraryNotebook(variables={}, path='/pipelines/dlt_brz_template.py'))], name='pl-stock-prices', name_prefix=None, name_suffix=None, notifications=[], photon=None, serverless=None, storage=None, target=None)}, databricks_externallocations={}, databricks_grants={}, databricks_groups={}, databricks_jobs={'job-stock-prices': Job(resource_name_='job-stock-prices', options=ResourceOptions(variables={}, is_enabled=True, depends_on=[], provider=None, ignore_changes=None, aliases=None, delete_before_replace=True, import_=None, parent=None, replace_on_changes=None), lookup_existing=None, variables={}, access_controls=[], clusters=[JobCluster(resource_name_=None, options=ResourceOptions(variables={}, is_enabled=True, 
depends_on=[], provider=None, ignore_changes=None, aliases=None, delete_before_replace=True, import_=None, parent=None, replace_on_changes=None), lookup_existing=None, variables={}, access_controls=None, apply_policy_default_values=None, autoscale=None, autotermination_minutes=None, cluster_id=None, custom_tags=None, data_security_mode='USER_ISOLATION', driver_instance_pool_id=None, driver_node_type_id=None, enable_elastic_disk=None, enable_local_disk_encryption=None, idempotency_token=None, init_scripts=[], instance_pool_id=None, is_pinned=None, libraries=None, name='main', node_type_id='Standard_DS3_v2', no_wait=None, num_workers=None, policy_id=None, runtime_engine=None, single_user_name=None, spark_conf={}, spark_env_vars={}, spark_version='14.0.x-scala2.12', ssh_public_keys=[])], continuous=None, control_run_state=None, description=None, email_notifications=None, format=None, health=None, max_concurrent_runs=None, max_retries=None, min_retry_interval_millis=None, name='job-stock-prices', name_prefix=None, name_suffix=None, notification_settings=None, parameters=[], queue=None, retry_on_timeout=None, run_as=None, schedule=None, tags={}, tasks=[JobTask(variables={}, condition_task=None, depends_ons=None, description=None, email_notifications=None, existing_cluster_id=None, health=None, job_cluster_key='main', libraries=None, max_retries=None, min_retry_interval_millis=None, notebook_task=JobTaskNotebookTask(variables={}, notebook_path='/.laktory/jobs/ingest_stock_prices.py', base_parameters=None, warehouse_id=None, source=None), notification_settings=None, pipeline_task=None, retry_on_timeout=None, run_if=None, run_job_task=None, sql_task=None, task_key='ingest', timeout_seconds=None, for_each_task=None), JobTask(variables={}, condition_task=None, depends_ons=[JobTaskDependsOn(variables={}, task_key='ingest', outcome=None)], description=None, email_notifications=None, existing_cluster_id=None, health=None, job_cluster_key=None, libraries=None, max_retries=None, 
min_retry_interval_millis=None, notebook_task=None, notification_settings=None, pipeline_task=JobTaskPipelineTask(variables={}, pipeline_id='${resources.dlt-pl-stock-prices.id}', full_refresh=None), retry_on_timeout=None, run_if=None, run_job_task=None, sql_task=None, task_key='pipeline', timeout_seconds=None, for_each_task=None)], timeout_seconds=None, trigger=None, webhook_notifications=None)}, databricks_metastoredataaccesses={}, databricks_metastores={}, databricks_mlflowexperiments={}, databricks_mlflowmodels={}, databricks_mlflowwebhooks={}, databricks_networkconnectivityconfig={}, databricks_notebooks={}, databricks_queries={}, databricks_repos={}, databricks_schemas={}, databricks_secrets={}, databricks_secretscopes={}, databricks_serviceprincipals={}, databricks_tables={}, databricks_users={}, databricks_vectorsearchendpoints={}, databricks_vectorsearchindexes={}, databricks_volumes={}, databricks_warehouses={}, databricks_workspacefiles={}, pipelines={}, providers={}) settings=None terraform=Terraform(variables={}, backend=None)
'''
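Values such as `${vars.DATABRICKS_HOST}` and `${vars.is_dev}` above are variable references that Laktory resolves at deployment time. As an illustration only, here is a rough, self-contained sketch of this kind of substitution (the `resolve_vars` helper below is hypothetical, not part of the laktory API, which also handles environment variables and nested references):

```python
import re

def resolve_vars(value, variables):
    """Recursively replace ${vars.name} references with values from `variables`.

    Illustrative sketch only; not the laktory implementation.
    """
    if isinstance(value, str):
        # A reference spanning the whole string keeps the variable's type
        m = re.fullmatch(r"\$\{vars\.(\w+)\}", value)
        if m and m.group(1) in variables:
            return variables[m.group(1)]
        # Otherwise substitute inline, leaving unknown references untouched
        return re.sub(
            r"\$\{vars\.(\w+)\}",
            lambda m: str(variables.get(m.group(1), m.group(0))),
            value,
        )
    if isinstance(value, dict):
        return {k: resolve_vars(v, variables) for k, v in value.items()}
    if isinstance(value, list):
        return [resolve_vars(v, variables) for v in value]
    return value

pipeline = {"name": "pl-stock-prices", "development": "${vars.is_dev}"}
print(resolve_vars(pipeline, {"is_dev": True}))
# {'name': 'pl-stock-prices', 'development': True}
```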
METHOD | DESCRIPTION
---|---
apply_settings | Applies settings before instantiating resources and setting default values.
get_env | Completes the definition of the stack for a given environment, taking into account both the default stack values and environment-specific overwrites.
to_pulumi | Creates a Pulumi stack for a given environment.
to_terraform | Creates a Terraform stack for a given environment.
Functions

apply_settings classmethod

apply_settings(data)

Applies settings before instantiating resources and setting default values.

Source code in laktory/models/stacks/stack.py
get_env

get_env(env_name)

Completes the definition of the stack for a given environment. It takes into account both the default stack values and the environment-specific overwrites.

PARAMETER | DESCRIPTION
---|---
env_name | Name of the environment

RETURNS | DESCRIPTION
---|---
EnvironmentStack | Environment definitions.

Source code in laktory/models/stacks/stack.py
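Conceptually, `get_env` layers the environment-specific values over the stack defaults. A minimal sketch of that overwrite logic, using a hypothetical `deep_merge` helper rather than the laktory implementation (which also re-validates the merged model):

```python
def deep_merge(base, overwrite):
    """Return a copy of `base` with values from `overwrite` applied on top.

    Nested dictionaries are merged key by key; any other value in
    `overwrite` replaces the default. Illustrative sketch only.
    """
    merged = dict(base)
    for key, value in overwrite.items():
        if isinstance(merged.get(key), dict) and isinstance(value, dict):
            merged[key] = deep_merge(merged[key], value)
        else:
            merged[key] = value
    return merged

defaults = {"variables": {"org": "okube"}, "backend": "pulumi"}
dev_overwrites = {"variables": {"is_dev": True}}
print(deep_merge(defaults, dev_overwrites))
# {'variables': {'org': 'okube', 'is_dev': True}, 'backend': 'pulumi'}
```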
to_pulumi

to_pulumi(env_name=None)

Creates a Pulumi stack for the given environment `env_name`.

PARAMETER | DESCRIPTION
---|---
env_name | Target environment.

RETURNS | DESCRIPTION
---|---
PulumiStack | Pulumi-specific stack definition

Source code in laktory/models/stacks/stack.py
to_terraform

to_terraform(env_name=None)

Creates a Terraform stack for the given environment `env_name`.

PARAMETER | DESCRIPTION
---|---
env_name | Target environment.

RETURNS | DESCRIPTION
---|---
TerraformStack | Terraform-specific stack definition

Source code in laktory/models/stacks/stack.py
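As background for what a Terraform-specific stack definition looks like, Terraform's JSON configuration syntax nests resources as type → name → arguments, mirroring the shape of the `resources` mapping above. A generic illustration of that structure (generic Terraform JSON, not laktory's actual output):

```python
import json

# Generic Terraform JSON configuration: resource type -> resource name -> arguments
tf_config = {
    "resource": {
        "databricks_pipeline": {
            "pl-stock-prices": {"name": "pl-stock-prices"},
        }
    }
}
print(json.dumps(tf_config, indent=2))
```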
laktory.models.StackResources

Bases: BaseModel

Resources definition for a given stack or stack environment.

ATTRIBUTE | DESCRIPTION
---|---
databricks_alerts | Databricks Alerts
databricks_dbfsfiles | Databricks DbfsFiles
databricks_catalogs | Databricks Catalogs
databricks_clusters | Databricks Clusters
databricks_clusterpolicies | Databricks Cluster Policies
databricks_dashboards | Databricks Dashboards
databricks_directories | Databricks Directories
databricks_dltpipelines | Databricks DLT Pipelines
databricks_externallocations | Databricks External Locations
databricks_groups | Databricks Groups
databricks_grants | Databricks Grants
databricks_jobs | Databricks Jobs
databricks_metastores | Databricks Metastores
databricks_mlflowexperiments | Databricks MLflow Experiments
databricks_mlflowmodels | Databricks MLflow Models
databricks_mlflowwebhooks | Databricks MLflow Webhooks
databricks_networkconnectivityconfig | Databricks Network Connectivity Config
databricks_notebooks | Databricks Notebooks
databricks_queries | Databricks Queries
databricks_repos | Databricks Repos
databricks_schemas | Databricks Schemas
databricks_secretscopes | Databricks SecretScopes
databricks_serviceprincipals | Databricks ServicePrincipals
databricks_tables | Databricks Tables
databricks_users | Databricks Users
databricks_vectorsearchendpoints | Databricks Vector Search Endpoints
databricks_vectorsearchindexes | Databricks Vector Search Indexes
databricks_volumes | Databricks Volumes
databricks_warehouses | Databricks Warehouses
databricks_workspacefiles | Databricks WorkspaceFiles
pipelines | Laktory Pipelines
providers | Providers
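The resources mapping is nested two levels deep: resource type, then resource name, then definition. A small sketch of walking that structure (plain dicts for illustration, not laktory models):

```python
# Resource type -> resource name -> resource definition
resources = {
    "databricks_dltpipelines": {
        "pl-stock-prices": {"name": "pl-stock-prices"},
    },
    "databricks_jobs": {
        "job-stock-prices": {"name": "job-stock-prices"},
    },
}

# Flatten to (resource_type, resource_name, definition) triples
flat = [
    (rtype, rname, definition)
    for rtype, named in resources.items()
    for rname, definition in named.items()
]
for rtype, rname, _ in flat:
    print(f"{rtype}/{rname}")
# databricks_dltpipelines/pl-stock-prices
# databricks_jobs/job-stock-prices
```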
laktory.models.stacks.stack.LaktorySettings

Bases: BaseModel

Laktory Settings

ATTRIBUTE | DESCRIPTION
---|---
dataframe_backend | DataFrame backend
laktory_root | Laktory cache root directory. Used when a pipeline needs to write checkpoint files.
workspace_laktory_root | Root directory of a Databricks Workspace (excluding `"/Workspace"`) to which Databricks objects like notebooks and workspace files are deployed.
laktory.models.stacks.stack.EnvironmentSettings

Bases: BaseModel

Settings overwrites for a specific environment.

ATTRIBUTE | DESCRIPTION
---|---
resources | Dictionary of resources to be deployed. Each key should be a resource type and each value a dictionary of resources whose keys are the resource names and whose values are the resource definitions.
variables | Dictionary of variables made available in the resource definitions.
terraform | Terraform-specific settings
laktory.models.stacks.stack.EnvironmentStack

Bases: BaseModel

Environment-specific stack definition.

ATTRIBUTE | DESCRIPTION
---|---
backend | IaC backend used for deployment.
description | Description of the stack.
name | Name of the stack. If Pulumi is used as a backend, it should match the name of the Pulumi project.
organization | Organization.
pulumi | Pulumi-specific settings.
resources | Dictionary of resources to be deployed. Each key should be a resource type and each value a dictionary of resources whose keys are the resource names and whose values are the resource definitions.
settings | Laktory settings.
terraform | Terraform-specific settings.
variables | Dictionary of variables made available in the resource definitions.
laktory.models.stacks.stack.Pulumi

Bases: BaseModel

ATTRIBUTE | DESCRIPTION
---|---
config | Pulumi configuration settings. Generally used to configure providers. See references for more details.
outputs | Requested resources-related outputs. See references for details.

References
- Pulumi configuration
- Pulumi outputs