- The `cronitor` pin now allows versions `>= 5.0.1`, enabling use of `DayOfWeek` as 7. Cronitor `4.0.0` is still disallowed. (Thanks, @joshuataylor!)
- [helm] Added `checkDbReadyInitContainer` to optionally disable the db check initContainer.
- [ui] Added new `kind` tags. (Thanks, @dragos-pop!)
- [ui] Renamed the run lineage sidebar on the Run details page to `Re-executions`.
- Added a `kind` tag for display in the UI.
- [dagster-gcp] `DataprocResource` now receives an optional parameter `labels` to be attached to Dataproc clusters. (Thanks, @thiagoazcampos!)
- [helm] Added a `checkDbReadyInitContainer` flag to the Dagster Helm chart to allow disabling the default init container behavior. (Thanks, @easontm!)
- [dagster-sigma] Added `build_materialize_workbook_assets_definition`, which can be used to build assets that run materialize schedules for a Sigma workbook.
- [dagster-snowflake] `SnowflakeResource` and `SnowflakeIOManager` both accept `additional_snowflake_connection_args` config. This dictionary of arguments will be passed to the `snowflake.connector.connect` method. This config will be ignored if you are using the `sqlalchemy` connector.
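
  A minimal sketch of passing extra connector arguments; the credential env vars and the `client_session_keep_alive` argument are illustrative assumptions:

  ```python
  import dagster as dg
  from dagster_snowflake import SnowflakeResource

  # additional_snowflake_connection_args is forwarded to snowflake.connector.connect
  # (and ignored when the sqlalchemy connector is configured).
  snowflake = SnowflakeResource(
      account=dg.EnvVar("SNOWFLAKE_ACCOUNT"),
      user=dg.EnvVar("SNOWFLAKE_USER"),
      password=dg.EnvVar("SNOWFLAKE_PASSWORD"),
      additional_snowflake_connection_args={"client_session_keep_alive": True},
  )

  @dg.asset
  def snowflake_asset(snowflake: SnowflakeResource):
      with snowflake.get_connection() as conn:
          conn.cursor().execute("select 1")

  defs = dg.Definitions(assets=[snowflake_asset], resources={"snowflake": snowflake})
  ```
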
- Assets with a `BackfillPolicy` are now evaluated correctly during backfills. Self-dependent assets no longer result in serial partition submissions or disregarded upstream dependencies.
- [dagster-aws] `PipesCloudWatchMessageReader` correctly identifies streams which are not ready yet and doesn't fail on `ThrottlingException`. (Thanks, @jenkoian!)
- [dagster-fivetran] Column metadata can now be fetched with `FivetranWorkspace.sync_and_poll(...).fetch_column_metadata()`.
- Updated the `dlt_assets` API docs. (Thanks, @zilto!)
- Values (`dagster/run-id`, `dagster/code-location`, `dagster/user`, and Dagster Cloud environment variables) typically attached to external resources are now available under `DagsterRun.dagster_execution_info`.
- `SensorReturnTypesUnion` is now exported for typing the output of sensor functions.
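
  A minimal sketch of annotating a sensor with the exported union type, assuming it is importable from the top-level `dagster` module; the job name and sensor logic are illustrative:

  ```python
  import dagster as dg
  from dagster import SensorReturnTypesUnion

  @dg.sensor(job_name="nightly_job")  # hypothetical job name
  def my_sensor(context: dg.SensorEvaluationContext) -> SensorReturnTypesUnion:
      # A sensor may return RunRequests, a SkipReason, a SensorResult, etc.
      if context.cursor == "done":
          return dg.SkipReason("Nothing new to process")
      return dg.RunRequest(run_key="nightly")
  ```
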
- [dagster-fivetran] Improvements to the `@fivetran_assets` decorator.
- [dagster-fivetran] `load_fivetran_asset_specs` has been updated to accept an instance of `DagsterFivetranTranslator` or a custom subclass.
- [dagster-fivetran] The `fivetran_assets` decorator was added. It can be used with the `FivetranWorkspace` resource and `DagsterFivetranTranslator` translator to load Fivetran tables for a given connector as assets in Dagster. The `build_fivetran_assets_definitions` factory can be used to create assets for all the connectors in your Fivetran workspace.
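
  A minimal sketch of the decorator-based API described above; the credential fields (`account_id`, `api_key`, `api_secret`), the `connector_id` argument, and the `sync_and_poll` call pattern are assumptions:

  ```python
  import dagster as dg
  from dagster_fivetran import FivetranWorkspace, fivetran_assets

  fivetran_workspace = FivetranWorkspace(
      account_id=dg.EnvVar("FIVETRAN_ACCOUNT_ID"),
      api_key=dg.EnvVar("FIVETRAN_API_KEY"),
      api_secret=dg.EnvVar("FIVETRAN_API_SECRET"),
  )

  @fivetran_assets(
      connector_id="my_connector_id",  # hypothetical connector id
      workspace=fivetran_workspace,
  )
  def my_fivetran_assets(context: dg.AssetExecutionContext, fivetran: FivetranWorkspace):
      # Trigger a sync for the connector and stream materializations back to Dagster.
      yield from fivetran.sync_and_poll(context=context)

  defs = dg.Definitions(
      assets=[my_fivetran_assets],
      resources={"fivetran": fivetran_workspace},
  )
  ```
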
- [dagster-aws] `ECSPipesClient.run` now waits up to 70 days for task completion (waiter parameters are configurable). (Thanks, @jenkoian!)
- [dagster-airbyte] The `load_airbyte_cloud_asset_specs` function has been added. It can be used with the `AirbyteCloudWorkspace` resource and `DagsterAirbyteTranslator` translator to load your Airbyte Cloud connection streams as external assets in Dagster.
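
  A minimal sketch of loading Airbyte Cloud connection streams as external assets; the credential field names on `AirbyteCloudWorkspace` are assumptions:

  ```python
  import dagster as dg
  from dagster_airbyte import AirbyteCloudWorkspace, load_airbyte_cloud_asset_specs

  workspace = AirbyteCloudWorkspace(
      workspace_id=dg.EnvVar("AIRBYTE_CLOUD_WORKSPACE_ID"),
      client_id=dg.EnvVar("AIRBYTE_CLOUD_CLIENT_ID"),
      client_secret=dg.EnvVar("AIRBYTE_CLOUD_CLIENT_SECRET"),
  )

  # Each Airbyte Cloud connection stream becomes an external asset spec.
  airbyte_specs = load_airbyte_cloud_asset_specs(workspace)

  defs = dg.Definitions(assets=airbyte_specs)
  ```
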
- Added an `icechunk` kind tag.
- Updates to the `ConsolidatedSqliteEventLogStorage`, which is mostly used for tests.
- Deployments created before `1.6.0` may need to run `dagster instance migrate` to enable.
- Introduced `map_asset_specs` to enable modifying `AssetSpec`s and `AssetsDefinition`s in bulk.
- Introduced `AssetSpec.replace_attributes` and `AssetSpec.merge_attributes` to easily alter properties of an asset spec.
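
  A sketch of bulk-editing specs with the new APIs, assuming `map_asset_specs` is importable from the top-level `dagster` module and takes `(func, iterable)`; the metadata keys are illustrative:

  ```python
  import dagster as dg

  @dg.asset
  def orders(): ...

  @dg.asset
  def customers(): ...

  external_spec = dg.AssetSpec("warehouse_table")

  # Apply the same change to every spec across assets and plain specs.
  mapped = dg.map_asset_specs(
      lambda spec: spec.merge_attributes(metadata={"owner_team": "analytics"}),
      [orders, customers, external_spec],
  )

  # replace_attributes overwrites individual fields on a single spec.
  regrouped = external_spec.replace_attributes(group_name="warehouse")
  ```
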
- [dagster-aws] `PipesS3MessageReader` now has a new parameter `include_stdio_in_messages` which enables log forwarding to Dagster via Pipes messages.
- `log_external_stream` has been added. It can be used to forward external logs to Dagster via Pipes messages.
- [dagster-powerbi] Workspace scanning can be disabled by passing `load_powerbi_asset_specs(..., use_workspace_scan=False)`.
- [dagster-sigma] Introduced an experimental `dagster-sigma snapshot` command, allowing Sigma workspaces to be captured to a file for faster subsequent loading.
- Fixed `DagsterExecutionStepNotFoundError` errors when trying to execute an asset check step of a run launched by a backfill.
- Fixed an issue where `owners` added to `AssetOut`s when defining a `@graph_multi_asset` were not added to the underlying `AssetsDefinition`.
- Fixed an issue where using the `&` or `|` operators on `AutomationCondition`s with labels would cause that label to be erased.
- [dagster-sigma] Workbooks filtered out by a `SigmaFilter` no longer fetch lineage information.
- [dagster-powerbi] `DagsterPowerBITranslator.get_asset_key` is deprecated in favor of `DagsterPowerBITranslator.get_asset_spec().key`.
- [dagster-looker] `DagsterLookerApiTranslator.get_asset_key` is deprecated in favor of `DagsterLookerApiTranslator.get_asset_spec().key`.
- [dagster-sigma] `DagsterSigmaTranslator.get_asset_key` is deprecated in favor of `DagsterSigmaTranslator.get_asset_spec().key`.
- [dagster-tableau] `DagsterTableauTranslator.get_asset_key` is deprecated in favor of `DagsterTableauTranslator.get_asset_spec().key`.
- Added `run_id` to the `run_tags` index to improve database performance. Run `dagster instance migrate` to update the index. (Thanks, @HynekBlaha!)
- Added icons for `kind` tags: Cassandra, ClickHouse, CockroachDB, Doris, Druid, Elasticsearch, Flink, Hadoop, Impala, Kafka, MariaDB, MinIO, Pinot, Presto, Pulsar, RabbitMQ, Redis, Redpanda, ScyllaDB, Starrocks, and Superset. (Thanks, @swrookie!)
- Added a new icon for the Denodo `kind` tag. (Thanks, @tintamarre!)
- Errors raised from defining more than one `Definitions` object at module scope now include the object names so that the source of the error is easier to determine.
- [ui] Asset metadata entries like `dagster/row_count` now appear on the events page and are properly hidden on the overview page when they appear in the sidebar.
- [dagster-aws] `PipesGlueClient` now attaches AWS Glue metadata to Dagster results produced during Pipes invocation.
- [dagster-aws] `PipesEMRServerlessClient` now attaches AWS EMR Serverless metadata to Dagster results produced during Pipes invocation and adds Dagster tags to the job run.
- [dagster-aws] `PipesECSClient` now attaches AWS ECS metadata to Dagster results produced during Pipes invocation and adds Dagster tags to the ECS task.
- [dagster-aws] `PipesEMRClient` now attaches AWS EMR metadata to Dagster results produced during Pipes invocation.
- [dagster-databricks] `PipesDatabricksClient` now attaches Databricks metadata to Dagster results produced during Pipes invocation and adds Dagster tags to the Databricks job.
- [dagster-fivetran] Added `load_fivetran_asset_specs` function. It can be used with the `FivetranWorkspace` resource and `DagsterFivetranTranslator` translator to load your Fivetran connector tables as external assets in Dagster.
- [dagster-looker] Errors are now handled more gracefully when parsing derived tables.
- [dagster-sigma] Sigma assets now contain extra metadata and kind tags.
- [dagster-sigma] Added support for direct workbook to warehouse table dependencies.
- [dagster-sigma] Added `include_unused_datasets` field to `SigmaFilter` to disable pulling datasets that aren't used by a downstream workbook.
- [dagster-sigma] Added `skip_fetch_column_data` option to skip loading Sigma column lineage. This can speed up loading large instances.
- [dagster-sigma] Introduced an experimental `dagster-sigma snapshot` command, allowing Sigma workspaces to be captured to a file for faster subsequent loading.
### dagster-airlift (experimental)

`dagster-airlift` is coming out of stealth. See the initial Airlift RFC here, and the following documentation to learn more:

More Airflow-related content is coming soon! We'd love for you to check it out, and post any comments / questions in the #airflow-migration channel in the Dagster Slack.
- Fixed an issue where `monitor_all_code_locations` and `monitored_jobs` did not raise the expected error. (Thanks, @apetryla!)
- Fixed an issue that would cause `AutomationCondition.any_deps_match()` and `AutomationCondition.all_deps_match()` to render incorrectly when `allow_selection` or `ignore_selection` were set.
- Fixed an issue when loading `CacheableAssetsDefinitions` in code locations that contained `AutomationConditions`.
- Fixed `Too many open files` errors for jobs with many steps.
- Added `dagster-dingtalk` to the list of community-supported libraries.
- Updated the `dagster-wandb` (Weights & Biases) documentation. (Thanks, @matt-weingarten!)
- Updated the `dagster-sigma` documentation.
- Added `AssetOut.from_spec`, that will construct an `AssetOut` from an `AssetSpec`.
- [ui] Improvements to the `Column name` section of the asset overview page.
- Added a `gcs` (Google Cloud Storage) kind tag.
- Added `report` and `semanticmodel` kind tags.
- [dagster-aws] Updates to the `EcsRunLauncher`. (Thanks, @zyd14!)
- [dagster-dbt] Added `DagsterDbtTranslator.get_code_version` to customize the code version for your dbt assets. (Thanks, @Grzyblon!)
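
  A sketch of a custom translator, assuming `get_code_version` receives the raw dbt resource properties dictionary from the manifest:

  ```python
  from typing import Any, Mapping, Optional

  from dagster_dbt import DagsterDbtTranslator

  class ChecksumCodeVersionTranslator(DagsterDbtTranslator):
      def get_code_version(self, dbt_resource_props: Mapping[str, Any]) -> Optional[str]:
          # Use the dbt-computed file checksum as the asset code version.
          return dbt_resource_props.get("checksum", {}).get("checksum")

  # Pass the translator when defining dbt assets, e.g.
  # @dbt_assets(manifest=..., dagster_dbt_translator=ChecksumCodeVersionTranslator())
  ```
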
- Added the ability to pass metadata to `PipesClientCompletedInvocation`. This metadata will be attached to all materializations and asset checks stored during the pipes invocation.
- Fixed an issue with `AutomationCondition.execution_in_progress` which would cause it to evaluate to `True` for unpartitioned assets that were part of a run that was in progress, even if the asset itself had already been materialized.
- Fixed an issue with `AutomationCondition.run_in_progress` that would cause it to ignore queued runs.
- Fixed an issue that would cause a `default_automation_condition_sensor` to be constructed for user code servers running on dagster version `< 1.9.0` even if the legacy `auto_materialize: use_sensors` configuration setting was set to `False`.
- Fixed an issue that could result in a value of `0` in cases where all partitions were evaluated.
- Keys containing `/` characters now work correctly with Dagster Pipes.
- Updates to `dagster-airlift`.
- The `types-sqlalchemy` package is no longer included in the `dagster[pyright]` extra package.
- `dagster project scaffold` now has an option to create dagster projects from templates with excluded files/filepaths.
- [dagster-tableau] `parse_tableau_external_and_materializable_asset_specs` is now available to parse a list of Tableau asset specs into a list of external asset specs and materializable asset specs.
- [dagster-fivetran] Added `DagsterFivetranTranslator` to customize assets loaded from Fivetran.
- [dagster-snowflake] `dagster_snowflake.fetch_last_updated_timestamps` now supports ignoring tables not found in Snowflake instead of raising an error.
- Fixed an issue that would cause a `default_automation_condition_sensor` to be constructed for user code servers running on dagster version `< 1.9.0` even if the legacy `auto_materialize: use_sensors` configuration setting was set to `False`.
- Fixed an issue where running `dagster instance migrate` on Dagster version 1.9.0 constructed a SQL query that exceeded the maximum allowed depth.
- `ImportError`s are no longer raised when BigQuery libraries are not installed. [#25708]
- Fixed an issue where `EnvVar`s used in Sling source and target configuration would not work properly in some circumstances.
- [dagster-dbt] When running `DAGSTER_DBT_PARSE_PROJECT_ON_LOAD=1 dagster dev` in a new project scaffolded from `dagster-dbt project scaffold`, dbt logs from creating dbt artifacts to loading the project are now silenced.
- [dagster-airbyte] Added the `connection_meta_to_group_fn` argument, which allows configuring loaded asset groups based on the connection's metadata dict.
- The `QueuedRunCoordinatorDaemon` has been refactored to paginate over runs when applying priority sort and tag concurrency limits. Previously, it loaded all runs into memory, causing large memory spikes when many runs were enqueued.
- `UPathIOManager` has been updated to use the correct path delimiter when interacting with cloud storages from a Windows process.
- The `STEP_WORKER_STARTED` event now fires before importing code, in line with the other executors.
- Fixed an issue where `EnvVar` did not work properly.
- Fixed an issue where `IAttachDifferentObjectToOpContext` would pass the incorrect object to schedules and sensors.
- Fixed an issue when using the `materialize_on_cron` rule with dynamically partitioned assets.
- `without_checks` no longer fails by attempting to include the checks.
- [dagster-databricks] Fixed behavior when `DATABRICKS_HOST` is set. (Thanks, @zyd14!)
- [dagster-aws] `PipesLambdaClient`, an AWS Lambda Pipes client, has been added to `dagster_aws`.
- Added documentation on using `PipesLambdaClient` with Dagster Pipes.
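
  A minimal sketch of invoking a Lambda function through the Pipes client; the function name, event payload, and `run(...)` parameters are assumptions:

  ```python
  import boto3
  import dagster as dg
  from dagster_aws.pipes import PipesLambdaClient

  lambda_pipes_client = PipesLambdaClient(client=boto3.client("lambda"))

  @dg.asset
  def lambda_backed_asset(
      context: dg.AssetExecutionContext, pipes_lambda: PipesLambdaClient
  ):
      # Invoke the external function and relay its Pipes events back to Dagster.
      return pipes_lambda.run(
          context=context,
          function_name="my-pipes-function",  # hypothetical
          event={"partition_date": "2024-01-01"},
      ).get_materialize_result()

  defs = dg.Definitions(
      assets=[lambda_backed_asset],
      resources={"pipes_lambda": lambda_pipes_client},
  )
  ```
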
- Added the `MetadataValue.job` metadata type, which can be used to link to a Dagster job from other objects in the UI.
- The schema/dataset is now determined in the following order of precedence: `schema` metadata set on the `asset` or `op`, I/O manager `schema`/`dataset` configuration, `key_prefix` set on the `asset`. Previously, all methods for setting the schema/dataset were mutually exclusive, and setting more than one would raise an exception.
- [dagster-dbt] When running `DAGSTER_DBT_PARSE_PROJECT_ON_LOAD=1 dagster dev` in a new project scaffolded from `dagster-dbt project scaffold`, dbt artifacts for loading the project are now created in a static `target/` directory.
- Fixed an issue with `ScheduleEvaluationContext` when testing via `build_schedule_context`.
- `metadata` from a `Failure` exception is now hoisted up to the failure that culminates when retry limits are exceeded.
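
  A sketch of the behavior: metadata attached to the raised `Failure` surfaces on the terminal failure once the op's retry limit is exhausted. The endpoint and status code are illustrative:

  ```python
  import dagster as dg

  @dg.op(retry_policy=dg.RetryPolicy(max_retries=2, delay=5))
  def call_flaky_api():
      # After two retries, the run fails and this metadata is shown on the failure.
      raise dg.Failure(
          description="Upstream API returned a 500",
          metadata={"status_code": 500, "endpoint": "https://example.com/api"},
      )

  @dg.job
  def flaky_job():
      call_flaky_api()
  ```
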
- [dagster-k8s] `PipesK8sClient` now correctly raises on failed containers.
- [helm] Added `maxCatchupRuns` and `maxTickRetries` configuration options for the scheduler in the Helm chart.
- [dagster-dbt] Subsetted dbt invocations now set `DBT_INDIRECT_SELECTION=empty` where appropriate.
- Fixed a bug that caused `@asset(check_specs=...)` to not cooperate with the `key_prefix` argument of the `load_assets_from_modules` method and its compatriots.
- `define_asset_job` now accepts an `op_retry_policy` argument, which specifies a default retry policy for all of the ops in the job. (Thanks, Eugenio Contreras!)
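
  A sketch of setting a job-wide default retry policy for asset-materializing ops; the asset names and policy values are illustrative:

  ```python
  import dagster as dg

  @dg.asset
  def raw_events(): ...

  @dg.asset
  def cleaned_events(raw_events): ...

  nightly_job = dg.define_asset_job(
      name="nightly_job",
      selection=dg.AssetSelection.assets(raw_events, cleaned_events),
      # Every op in this job retries up to 3 times with a 30s delay unless it
      # defines its own policy.
      op_retry_policy=dg.RetryPolicy(max_retries=3, delay=30),
  )
  ```
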
- The `observable_source_asset` decorator now accepts a `key` argument.
- An `implicit_materializations` argument has been added to `get_results` and `get_materialize_result` to control whether an implicit materialization event is created or not.
- Added `SlingConnectionResource` to allow reusing sources and targets interoperably.
- [dagster-dbt] `build_dbt_asset_selection` now also selects asset checks based on their underlying dbt tests. E.g. `build_dbt_asset_selection([my_dbt_assets], dbt_select="tag:data_quality")` will select the assets and checks for any models and tests tagged with 'data_quality'.
- Added a comparison of `EnvVar` vs. `os.getenv` to the Environment variables documentation.
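
  A sketch of the difference: `EnvVar` defers resolution to runtime and keeps the raw value out of stored definitions, while `os.getenv` resolves as soon as the module is imported. The resource class and env var name are hypothetical:

  ```python
  import os

  import dagster as dg

  class ApiClientResource(dg.ConfigurableResource):
      api_token: str

  # Resolved lazily at runtime; shows up as an env var reference in the UI.
  deferred = ApiClientResource(api_token=dg.EnvVar("API_TOKEN"))

  # Resolved immediately when definitions are loaded; the literal value is baked in.
  resolved_at_import = ApiClientResource(api_token=os.getenv("API_TOKEN", ""))
  ```
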
- The new `AutoMaterializeRule.materialize_on_cron()` rule makes it possible to create policies which materialize assets on a regular cadence.
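
  A sketch of attaching the rule to an asset via an `AutoMaterializePolicy`; the cron schedule is illustrative:

  ```python
  import dagster as dg

  # Materialize daily at 09:00 UTC, in addition to the default eager rules.
  policy = dg.AutoMaterializePolicy.eager().with_rules(
      dg.AutoMaterializeRule.materialize_on_cron("0 9 * * *", timezone="UTC"),
  )

  @dg.asset(auto_materialize_policy=policy)
  def daily_report(): ...
  ```
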
- Returning a `SensorResult` from a sensor no longer overwrites a cursor if it was set via the context.
- Fixed an issue when using `can_subset=True` alongside assets which were upstream of some assets in the multi-asset, and downstream of others.
- Previously, when using an `HourlyPartitionsDefinition` with a non-UTC timezone and the default format string (or any format string not including a UTC offset), there was no way to disambiguate between the first and second instance of the repeated hour during a daylight saving time transition. Now, for the one hour per year in which this ambiguity exists, the partition key of the second instance of the hour will have the UTC offset automatically appended to it.
- Added support for `check_specs` to `AssetsDefinition.from_graph`.
- Fixed an issue in `dagster-dbt` that caused some dbt tests to not be selected as asset checks.
- The `email_on_failure` sensor called deprecated methods on the context. This has been fixed.
- `DagsterInstance.report_runless_asset_event` is now public.
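
  A sketch of recording a materialization that happened outside of any Dagster run; the asset key and metadata are illustrative:

  ```python
  import dagster as dg

  instance = dg.DagsterInstance.get()

  # Record an externally-performed materialization directly on the instance.
  instance.report_runless_asset_event(
      dg.AssetMaterialization(
          asset_key="warehouse_table",
          metadata={"source": "external backfill script"},
      )
  )
  ```
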
- `AutoMaterializeRule.materialize_on_parent_updated` now accepts an `updated_parents_filter` of type `AutoMaterializeAssetPartitionsFilter`, which allows only materializing based on updates from runs with a required set of tags.
- Fixed an issue where `EnvVar`s used in Airbyte or Fivetran resources would show up as their processed values in the launchpad when loading assets from a live Fivetran or Airbyte instance.