- `cronitor` pin to allow versions `>= 5.0.1` to enable use of `DayOfWeek` as 7. Cronitor `4.0.0` is still disallowed. (Thanks, @joshuataylor!)
- Added `checkDbReadyInitContainer` to optionally disable the db check initContainer.
- Added a new icon for `kind` tags. (Thanks, @dragos-pop!)
- Renamed the run lineage sidebar on the Run details page to `Re-executions`.
- The destination type for an Airbyte asset is now added as a `kind` tag for display in the UI.
- `DataprocResource` now receives an optional parameter `labels` to be attached to Dataproc clusters. (Thanks, @thiagoazcampos!)
- Added a `checkDbReadyInitContainer` flag to the Dagster Helm chart to allow disabling the default init container behavior. (Thanks, @easontm!)
- Added `build_materialize_workbook_assets_definition`, which can be used to build assets that run materialize schedules for a Sigma workbook.
- `SnowflakeResource` and `SnowflakeIOManager` both accept `additional_snowflake_connection_args` config. This dictionary of arguments will be passed to the `snowflake.connector.connect` method. This config will be ignored if you are using the `sqlalchemy` connector.
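  A minimal sketch of the new config, assuming placeholder credentials and an illustrative `client_session_keep_alive` argument (neither is part of this release note):

  ```python
  from dagster import Definitions, EnvVar
  from dagster_snowflake import SnowflakeResource

  defs = Definitions(
      resources={
          "snowflake": SnowflakeResource(
              account=EnvVar("SNOWFLAKE_ACCOUNT"),
              user=EnvVar("SNOWFLAKE_USER"),
              password=EnvVar("SNOWFLAKE_PASSWORD"),
              # Forwarded as-is to snowflake.connector.connect();
              # ignored when connector="sqlalchemy".
              additional_snowflake_connection_args={"client_session_keep_alive": True},
          )
      },
  )
  ```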
- Assets with self-dependencies and a `BackfillPolicy` are now evaluated correctly during backfills. Self-dependent assets no longer result in serial partition submissions or disregarded upstream dependencies.
- `PipesCloudWatchMessageReader` correctly identifies streams which are not ready yet and doesn't fail on `ThrottlingException`. (Thanks, @jenkoian!)
- Column metadata can now be fetched with `FivetranWorkspace.sync_and_poll(...).fetch_column_metadata()`.
- Fixed a typo in the `dlt_assets` API docs. (Thanks, @zilto!)
- The tags (`dagster/run-id`, `dagster/code-location`, `dagster/user`, and Dagster Cloud environment variables) typically attached to external resources are now available under `DagsterRun.dagster_execution_info`.
- `SensorReturnTypesUnion` is now exported for typing the output of sensor functions.
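  A minimal sketch of typing a sensor with the newly exported union, assuming it is importable from the top-level `dagster` module (the job name is a placeholder):

  ```python
  import dagster as dg

  @dg.sensor(job_name="nightly_job")  # placeholder job name
  def my_sensor(context: dg.SensorEvaluationContext) -> dg.SensorReturnTypesUnion:
      # A SkipReason, RunRequest(s), or SensorResult are all covered by the union type.
      return dg.SkipReason("nothing new to process")
  ```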
- Fivetran assets can now be materialized using the `@fivetran_assets` decorator.
- `load_fivetran_asset_specs` has been updated to accept an instance of `DagsterFivetranTranslator` or custom subclass.
- The `fivetran_assets` decorator was added. It can be used with the `FivetranWorkspace` resource and `DagsterFivetranTranslator` translator to load Fivetran tables for a given connector as assets in Dagster. The `build_fivetran_assets_definitions` factory can be used to create assets for all the connectors in your Fivetran workspace.
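  A minimal sketch of both APIs, assuming placeholder credentials, a placeholder connector id, and a resource key of `fivetran`:

  ```python
  import dagster as dg
  from dagster_fivetran import (
      FivetranWorkspace,
      build_fivetran_assets_definitions,
      fivetran_assets,
  )

  # Placeholder credentials -- substitute your own.
  workspace = FivetranWorkspace(
      account_id=dg.EnvVar("FIVETRAN_ACCOUNT_ID"),
      api_key=dg.EnvVar("FIVETRAN_API_KEY"),
      api_secret=dg.EnvVar("FIVETRAN_API_SECRET"),
  )

  @fivetran_assets(connector_id="my_connector_id", workspace=workspace)
  def my_connector_assets(context: dg.AssetExecutionContext, fivetran: FivetranWorkspace):
      # Trigger a sync and stream materializations back to Dagster.
      yield from fivetran.sync_and_poll(context=context)

  # Alternatively, build assets for every connector in the workspace.
  all_connector_assets = build_fivetran_assets_definitions(workspace=workspace)

  defs = dg.Definitions(
      assets=[my_connector_assets, *all_connector_assets],
      resources={"fivetran": workspace},
  )
  ```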
- `ECSPipesClient.run` now waits up to 70 days for tasks completion (waiter parameters are configurable). (Thanks, @jenkoian!)
- The `load_airbyte_cloud_asset_specs` function has been added. It can be used with the `AirbyteCloudWorkspace` resource and `DagsterAirbyteTranslator` translator to load your Airbyte Cloud connection streams as external assets in Dagster.
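  A minimal sketch, assuming placeholder Airbyte Cloud credentials:

  ```python
  import dagster as dg
  from dagster_airbyte import AirbyteCloudWorkspace, load_airbyte_cloud_asset_specs

  # Placeholder credentials -- substitute your own.
  workspace = AirbyteCloudWorkspace(
      workspace_id=dg.EnvVar("AIRBYTE_CLOUD_WORKSPACE_ID"),
      client_id=dg.EnvVar("AIRBYTE_CLOUD_CLIENT_ID"),
      client_secret=dg.EnvVar("AIRBYTE_CLOUD_CLIENT_SECRET"),
  )

  # Each Airbyte Cloud connection stream becomes an external asset spec.
  airbyte_cloud_specs = load_airbyte_cloud_asset_specs(workspace)

  defs = dg.Definitions(assets=airbyte_cloud_specs)
  ```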
- Added an icon for the `icechunk` kind.
- Added support for the `ConsolidatedSqliteEventLogStorage`, which is mostly used for tests. Deployments that have not been migrated since `1.6.0` may need to run `dagster instance migrate` to enable it.
- Added `map_asset_specs` to enable modifying `AssetSpec`s and `AssetsDefinition`s in bulk.
- Added `AssetSpec.replace_attributes` and `AssetSpec.merge_attributes` to easily alter properties of an asset spec.
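  A minimal sketch of bulk-editing specs and definitions, assuming `map_asset_specs` is importable from the top-level `dagster` module (the assets, owners, and tags are illustrative):

  ```python
  import dagster as dg

  @dg.asset
  def orders(): ...

  @dg.asset
  def customers(): ...

  # Apply the same attribute changes across definitions and specs in bulk.
  mapped_assets = dg.map_asset_specs(
      lambda spec: spec.replace_attributes(owners=["team:analytics"]).merge_attributes(
          tags={"domain": "sales"}
      ),
      [orders, customers, dg.AssetSpec("external_events")],
  )

  defs = dg.Definitions(assets=mapped_assets)
  ```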
- `PipesS3MessageReader` now has a new parameter `include_stdio_in_messages` which enables log forwarding to Dagster via Pipes messages.
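  A minimal sketch of constructing the reader with stdio forwarding enabled (the bucket name is a placeholder; the reader is then passed to a Pipes client as usual):

  ```python
  import boto3
  from dagster_aws.pipes import PipesS3MessageReader

  message_reader = PipesS3MessageReader(
      client=boto3.client("s3"),
      bucket="my-dagster-pipes-bucket",  # placeholder bucket
      # Forward the external process's stdout/stderr to Dagster as Pipes messages.
      include_stdio_in_messages=True,
  )
  ```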
- `log_external_stream` has been added. It can be used to forward external logs to Dagster via Pipes messages.
- Power BI workspace scans can now be disabled by passing `load_powerbi_asset_specs(..., use_workspace_scan=False)`.
- Introduced an experimental `dagster-sigma snapshot` command, allowing Sigma workspaces to be captured to a file for faster subsequent loading.
- Fixed a bug that caused `DagsterExecutionStepNotFoundError` errors when trying to execute an asset check step of a run launched by a backfill.
- Fixed an issue where `owners` added to `AssetOut`s when defining a `@graph_multi_asset` were not added to the underlying `AssetsDefinition`.
- Fixed an issue where using the `&` or `|` operators on `AutomationCondition`s with labels would cause that label to be erased.
- Workbooks filtered out by a `SigmaFilter` no longer fetch lineage information.
- `DagsterPowerBITranslator.get_asset_key` is deprecated in favor of `DagsterPowerBITranslator.get_asset_spec().key`.
- `DagsterLookerApiTranslator.get_asset_key` is deprecated in favor of `DagsterLookerApiTranslator.get_asset_spec().key`.
- `DagsterSigmaTranslator.get_asset_key` is deprecated in favor of `DagsterSigmaTranslator.get_asset_spec().key`.
- `DagsterTableauTranslator.get_asset_key` is deprecated in favor of `DagsterTableauTranslator.get_asset_spec().key`.
- Added `run_id` to the `run_tags` index to improve database performance. Run `dagster instance migrate` to update the index. (Thanks, @HynekBlaha!)
- Added icons for `kind` tags: Cassandra, ClickHouse, CockroachDB, Doris, Druid, Elasticsearch, Flink, Hadoop, Impala, Kafka, MariaDB, MinIO, Pinot, Presto, Pulsar, RabbitMQ, Redis, Redpanda, ScyllaDB, Starrocks, and Superset. (Thanks, @swrookie!)
- Added a new icon for the Denodo `kind` tag. (Thanks, @tintamarre!)
- Errors raised from defining more than one `Definitions` object at module scope now include the object names so that the source of the error is easier to determine.
- [ui] Asset metadata entries like `dagster/row_count` now appear on the events page and are properly hidden on the overview page when they appear in the sidebar.
- [dagster-aws] `PipesGlueClient` now attaches AWS Glue metadata to Dagster results produced during Pipes invocation.
- [dagster-aws] `PipesEMRServerlessClient` now attaches AWS EMR Serverless metadata to Dagster results produced during Pipes invocation and adds Dagster tags to the job run.
- [dagster-aws] `PipesECSClient` now attaches AWS ECS metadata to Dagster results produced during Pipes invocation and adds Dagster tags to the ECS task.
- [dagster-aws] `PipesEMRClient` now attaches AWS EMR metadata to Dagster results produced during Pipes invocation.
- [dagster-databricks] `PipesDatabricksClient` now attaches Databricks metadata to Dagster results produced during Pipes invocation and adds Dagster tags to the Databricks job.
- [dagster-fivetran] Added the `load_fivetran_asset_specs` function. It can be used with the `FivetranWorkspace` resource and `DagsterFivetranTranslator` translator to load your Fivetran connector tables as external assets in Dagster.
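  A minimal sketch, assuming placeholder Fivetran credentials:

  ```python
  import dagster as dg
  from dagster_fivetran import FivetranWorkspace, load_fivetran_asset_specs

  # Placeholder credentials -- substitute your own.
  workspace = FivetranWorkspace(
      account_id=dg.EnvVar("FIVETRAN_ACCOUNT_ID"),
      api_key=dg.EnvVar("FIVETRAN_API_KEY"),
      api_secret=dg.EnvVar("FIVETRAN_API_SECRET"),
  )

  # Each Fivetran connector table becomes an external asset spec.
  fivetran_specs = load_fivetran_asset_specs(workspace)

  defs = dg.Definitions(assets=fivetran_specs)
  ```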
- [dagster-looker] Errors are now handled more gracefully when parsing derived tables.
- [dagster-sigma] Sigma assets now contain extra metadata and kind tags.
- [dagster-sigma] Added support for direct workbook to warehouse table dependencies.
- [dagster-sigma] Added `include_unused_datasets` field to `SigmaFilter` to disable pulling datasets that aren't used by a downstream workbook.
- [dagster-sigma] Added `skip_fetch_column_data` option to skip loading Sigma column lineage. This can speed up loading large instances.
- [dagster-sigma] Introduced an experimental `dagster-sigma snapshot` command, allowing Sigma workspaces to be captured to a file for faster subsequent loading.
### `dagster-airlift` (experimental)

`dagster-airlift` is coming out of stealth. See the initial Airlift RFC here, and the documentation to learn more.

More Airflow-related content is coming soon! We'd love for you to check it out, and post any comments / questions in the `#airflow-migration` channel in the Dagster Slack.
- Fixed an issue where setting both `monitor_all_code_locations` and `monitored_jobs` did not raise the expected error. (Thanks, @apetryla!)
- Fixed an issue that would cause `AutomationCondition.any_deps_match()` and `AutomationCondition.all_deps_match()` to render incorrectly when `allow_selection` or `ignore_selection` were set.
- Fixed an issue which could cause errors when loading `CacheableAssetsDefinitions` in code locations that contained `AutomationConditions`.
- Fixed an issue that could cause `Too many open files` errors for jobs with many steps.
- Added `dagster-dingtalk` to the list of community supported libraries.
- Updated the `dagster-wandb` (Weights and Biases) documentation. (Thanks, @matt-weingarten!)
- Updated the `dagster-sigma` documentation.
- Introduced a new constructor, `AssetOut.from_spec`, that will construct an `AssetOut` from an `AssetSpec`.
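  A minimal sketch of the new constructor (asset names and attributes are illustrative):

  ```python
  import dagster as dg

  orders_spec = dg.AssetSpec("orders", group_name="ecommerce", tags={"tier": "gold"})
  users_spec = dg.AssetSpec("users", group_name="ecommerce")

  @dg.multi_asset(
      outs={
          # Carry each spec's key and attributes onto the corresponding output.
          "orders": dg.AssetOut.from_spec(orders_spec),
          "users": dg.AssetOut.from_spec(users_spec),
      }
  )
  def ecommerce_assets():
      yield dg.Output(1, output_name="orders")
      yield dg.Output(2, output_name="users")
  ```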
- Column tags are now displayed in the `Column name` section of the asset overview page.
- Added an icon for the `gcs` (Google Cloud Storage) kind tag.
- Added `report` and `semanticmodel` kind tags.
- Improvements to the `EcsRunLauncher`. (Thanks, @zyd14!)
- Added `DagsterDbtTranslator.get_code_version` to customize the code version for your dbt assets. (Thanks, @Grzyblon!)
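  A minimal sketch of overriding the new hook, assuming it receives the standard `dbt_resource_props` mapping like the other translator methods; deriving the version from the dbt checksum is just an illustration:

  ```python
  from typing import Any, Mapping, Optional

  from dagster_dbt import DagsterDbtTranslator

  class CustomDbtTranslator(DagsterDbtTranslator):
      def get_code_version(self, dbt_resource_props: Mapping[str, Any]) -> Optional[str]:
          # Use the dbt-computed checksum of the model file, when present.
          return (dbt_resource_props.get("checksum") or {}).get("checksum")
  ```

  The translator instance can then be passed wherever a `DagsterDbtTranslator` is accepted, for example `@dbt_assets(..., dagster_dbt_translator=CustomDbtTranslator())`.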
- Arbitrary metadata can now be passed to `PipesClientCompletedInvocation`. This metadata will be attached to all materializations and asset checks stored during the pipes invocation.
- Fixed an issue with `AutomationCondition.execution_in_progress` which would cause it to evaluate to `True` for unpartitioned assets that were part of a run that was in progress, even if the asset itself had already been materialized.
- Fixed an issue with `AutomationCondition.run_in_progress` that would cause it to ignore queued runs.
- Fixed an issue that would cause a `default_automation_condition_sensor` to be constructed for user code servers running on dagster version < 1.9.0 even if the legacy `auto_materialize: use_sensors` configuration setting was set to `False`.
- Fixed an issue that would display `0` in cases where all partitions were evaluated.
- Asset keys containing `/` characters now work correctly with Dagster Pipes.
- Updated documentation for `dagster-airlift`.
- The `types-sqlalchemy` package is no longer included in the `dagster[pyright]` extra package.
- `dagster project scaffold` now has an option to create dagster projects from templates with excluded files/filepaths.
- `parse_tableau_external_and_materializable_asset_specs` is now available to parse a list of Tableau asset specs into a list of external asset specs and materializable asset specs.
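  A minimal sketch, assuming a Tableau Cloud workspace with placeholder connected-app credentials and that the helper returns `(external_specs, materializable_specs)` in that order:

  ```python
  import dagster as dg
  from dagster_tableau import (
      TableauCloudWorkspace,
      load_tableau_asset_specs,
      parse_tableau_external_and_materializable_asset_specs,
  )

  # Placeholder credentials -- substitute your own.
  workspace = TableauCloudWorkspace(
      connected_app_client_id=dg.EnvVar("TABLEAU_CLIENT_ID"),
      connected_app_secret_id=dg.EnvVar("TABLEAU_SECRET_ID"),
      connected_app_secret_value=dg.EnvVar("TABLEAU_SECRET_VALUE"),
      username=dg.EnvVar("TABLEAU_USERNAME"),
      site_name=dg.EnvVar("TABLEAU_SITE_NAME"),
      pod_name=dg.EnvVar("TABLEAU_POD_NAME"),
  )

  tableau_specs = load_tableau_asset_specs(workspace)

  # Split published data sources (external) from sheets/dashboards (materializable).
  external_specs, materializable_specs = parse_tableau_external_and_materializable_asset_specs(
      tableau_specs
  )
  ```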
- Added the `DagsterFivetranTranslator` to customize assets loaded from Fivetran.
- `dagster_snowflake.fetch_last_updated_timestamps` now supports ignoring tables not found in Snowflake instead of raising an error.
- Fixed an issue that would cause a `default_automation_condition_sensor` to be constructed for user code servers running on dagster version < 1.9.0 even if the legacy `auto_materialize: use_sensors` configuration setting was set to `False`.
- Fixed an issue where running `dagster instance migrate` on Dagster version 1.9.0 constructed a SQL query that exceeded the maximum allowed depth.
- `ImportError`s are no longer raised when bigquery libraries are not installed. [#25708]

### New

- You no longer need to run `dagster schedule up` or press the Reconcile button before turning on a new schedule for the first time.

### Community Contributions

### Bugfixes

### Experimental

### New

### Bugfixes

- `@pipeline` decorated functions with `-> None` typing no longer cause unexpected problems.

### Breaking Changes
- The `CliApiRunLauncher` and `GrpcRunLauncher` have been combined into `DefaultRunLauncher`. If you had one of these run launchers in your `dagster.yaml`, replace it with `DefaultRunLauncher` or remove the `run_launcher:` section entirely.

### New
### Bugfixes

- User-defined Kubernetes config in the pipeline run tags (with the key `dagster-k8s/config`) will now be passed to the k8s jobs when using the `dagster-k8s` and `dagster-celery-k8s` run launchers. Previously, only user-defined k8s config in the pipeline definition's tags was passed down.

### Experimental

- The new `QueuedRunCoordinator` enables limiting the number of concurrent runs. The `DefaultRunCoordinator` launches jobs directly from Dagit, preserving existing behavior.

### New
### Community contributions

### Bug fixes

- `PipelineDefinition`s that do not meet resource requirements for their types will now fail at definition time.

### Deprecated

- Returning a value from a function decorated with `@pipeline` is deprecated. This return value actually had no impact at all and was ignored, but we are making changes that will use that value in the future. By changing your code to not return anything now, you will avoid any breaking changes with zero user-visible impact.

### Breaking Changes
- Removed the deprecated `DagsterKubernetesPodOperator` in `dagster-airflow`.
- Removed the deprecated `execute_plan` mutation from `dagster-graphql`.
- `ModeDefinition`, `PartitionSetDefinition`, `PresetDefinition`, `@repository`, `@pipeline`, and `ScheduleDefinition` names must pass the regular expression `r"^[A-Za-z0-9_]+$"` and not be Python keywords or disallowed names. See `DISALLOWED_NAMES` in `dagster.core.definitions.utils` for the exhaustive list of illegal names.
- `dagster-slack` is now upgraded to use slackclient 2.x - this means that this resource will only support Python 3.6 and above.
- The `dagster api grpc-health-check` CLI command is present in Dagster `0.9.16` and later.

### New
- Added support for the `K8sRunLauncher`, in place of the `CeleryK8sRunLauncher`.

### Community Contributions

- Added a `--limit` flag on the `dagster run list` command. (Thanks @haydarai!)

### Bugfixes