02 - Design and Develop Data Processing

43 bookmarks
Create tumbling window triggers - Azure Data Factory & Azure Synapse
Learn how to create a trigger in Azure Data Factory or Azure Synapse Analytics that runs a pipeline on a tumbling window.
Tumbling window triggers are a type of trigger that fires at a periodic time interval from a specified start time, while retaining state. Tumbling windows are a series of fixed-sized, non-overlapping, and contiguous time intervals. A tumbling window trigger has a one-to-one relationship with a pipeline and can reference only a single pipeline.
·docs.microsoft.com·
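As a quick reference alongside this bookmark, here is a minimal sketch of what a tumbling window trigger definition can look like, written as a Python dict mirroring the JSON payload. The trigger name, start time, window size, and referenced pipeline are placeholder assumptions, not values from the article.

```python
import json

# Sketch of a tumbling window trigger definition (JSON expressed as a Python dict).
# All names and times below are hypothetical placeholders.
tumbling_window_trigger = {
    "name": "HourlyWindowTrigger",
    "properties": {
        "type": "TumblingWindowTrigger",
        "typeProperties": {
            "frequency": "Hour",           # fixed-size, contiguous one-hour windows
            "interval": 1,
            "startTime": "2024-01-01T00:00:00Z",
            "delay": "00:00:00",
            "maxConcurrency": 4,           # how many windows may run in parallel
            "retryPolicy": {"count": 2, "intervalInSeconds": 30},
        },
        # One-to-one relationship: a single pipeline reference, with the window
        # boundaries passed down as pipeline parameters.
        "pipeline": {
            "pipelineReference": {
                "type": "PipelineReference",
                "referenceName": "CopyHourlySlicePipeline",  # hypothetical pipeline
            },
            "parameters": {
                "windowStart": "@trigger().outputs.windowStartTime",
                "windowEnd": "@trigger().outputs.windowEndTime",
            },
        },
    },
}

print(json.dumps(tumbling_window_trigger, indent=2))
```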
Create event-based triggers - Azure Data Factory & Azure Synapse
Learn how to create a trigger in an Azure Data Factory or Azure Synapse Analytics that runs a pipeline in response to an event.
Data integration scenarios often require customers to trigger pipelines based on events happening in a storage account, such as the arrival or deletion of a file in an Azure Blob Storage account. Data Factory and Synapse pipelines natively integrate with Azure Event Grid, which lets you trigger pipelines on such events.
·docs.microsoft.com·
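For comparison with the tumbling window example above, here is a minimal sketch of a storage event trigger definition, again as a Python dict standing in for the JSON. The storage account scope, blob path filters, and pipeline reference are placeholder assumptions.

```python
import json

# Sketch of a blob storage event trigger definition. Scope, path filters, and
# the referenced pipeline below are hypothetical placeholders.
blob_event_trigger = {
    "name": "NewCsvFileTrigger",
    "properties": {
        "type": "BlobEventsTrigger",
        "typeProperties": {
            # Resource ID of the storage account watched via Event Grid (placeholder).
            "scope": (
                "/subscriptions/<subscription-id>/resourceGroups/<rg>"
                "/providers/Microsoft.Storage/storageAccounts/<account>"
            ),
            "events": ["Microsoft.Storage.BlobCreated"],   # fire on file arrival
            "blobPathBeginsWith": "/landing/blobs/incoming/",
            "blobPathEndsWith": ".csv",
            "ignoreEmptyBlobs": True,
        },
        # An event trigger can start one or more pipelines.
        "pipelines": [
            {
                "pipelineReference": {
                    "type": "PipelineReference",
                    "referenceName": "IngestNewFilePipeline",  # hypothetical pipeline
                },
                "parameters": {
                    "folderPath": "@triggerBody().folderPath",
                    "fileName": "@triggerBody().fileName",
                },
            }
        ],
    },
}

print(json.dumps(blob_event_trigger, indent=2))
```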
Understand inputs for Azure Stream Analytics
This article describes the concept of inputs in an Azure Stream Analytics job, comparing streaming input to reference data input.
Azure Blob storage, Azure Data Lake Storage Gen2, and Azure SQL Database are currently supported as input sources for reference data.
Event Hubs, IoT Hub, Azure Data Lake Storage Gen2 and Blob storage are supported as data stream input sources.
A data stream is an unbounded sequence of events over time. Stream Analytics jobs must include at least one data stream input.
Reference data is either completely static or changes slowly. It is typically used to perform correlation and lookups.
·docs.microsoft.com·
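To make the stream-versus-reference distinction concrete, here is a sketch of a Stream Analytics query (held in a Python string) that joins a streaming input to a reference data input for a lookup. The input aliases, column names, and output name are assumptions, not taken from the article.

```python
# Sketch of a SAQL query joining a data stream input to a reference data input.
# Input/output aliases and columns are hypothetical.
reference_lookup_query = """
SELECT
    telemetry.deviceId,
    telemetry.temperature,
    devices.buildingName,                 -- looked up from reference data
    System.Timestamp() AS eventTimestamp
INTO enrichedOutput                       -- hypothetical output alias
FROM telemetry TIMESTAMP BY eventTime     -- streaming input (e.g. Event Hubs)
JOIN deviceMetadata AS devices            -- reference data input (e.g. blob / ADLS Gen2)
    ON telemetry.deviceId = devices.deviceId
"""

print(reference_lookup_query)
```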
Choose a real-time and stream processing solution on Azure
Learn about how to choose the right real-time analytics and streaming processing technology to build your application on Azure.
Azure Stream Analytics capabilities include built-in temporal operators, such as windowed aggregates, temporal joins, and temporal analytic functions; native Azure input and output adapters; support for slowly changing reference data (also known as lookup tables), including joining with geospatial reference data for geofencing; integrated solutions, such as anomaly detection; multiple time windows in the same query; and the ability to compose multiple temporal operators in arbitrary sequences.
·docs.microsoft.com·
Process real-time IoT data streams with Azure Stream Analytics
IoT sensor tags and data streams with Azure Stream Analytics and real-time data processing
Stream Analytics Query Language (SAQL)
How to write different queries in Azure Stream Analytics
·docs.microsoft.com·
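As an illustration of the kind of SAQL query this article covers, here is a sketch (held in a Python string) of a windowed aggregation over an IoT input. The input and output aliases, column names, and the 30-second window size are assumptions for illustration only.

```python
# Sketch of a SAQL query that aggregates IoT sensor readings per device over
# fixed 30-second tumbling windows. Names and columns are hypothetical.
iot_aggregation_query = """
SELECT
    deviceId,
    AVG(temperature) AS avgTemperature,
    COUNT(*) AS readingCount,
    System.Timestamp() AS windowEnd
INTO aggregatedOutput                        -- hypothetical output alias
FROM iotHubInput TIMESTAMP BY eventTime      -- hypothetical IoT Hub input alias
GROUP BY deviceId, TumblingWindow(second, 30)
"""

print(iot_aggregation_query)
```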
Conditional split transformation in mapping data flow - Azure Data Factory & Azure Synapse
Split data into different streams using the conditional split transformation in a mapping data flow in Azure Data Factory or Synapse Analytics
The conditional split transformation routes data rows to different streams based on matching conditions. The conditional split transformation is similar to a CASE decision structure in a programming language.
When disjoint is false, the data goes only to the first matching condition.
·docs.microsoft.com·
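Since the excerpt compares the transformation to a CASE decision structure, here is a small Python analogy (not ADF data flow script) of that routing. The condition names and sample rows are made up, and the behavior of disjoint=True (routing a row to every matching stream) is an assumption mirroring the first-match behavior described above.

```python
# Python analogy for the conditional split transformation (not data flow script).
# disjoint=False routes a row to the first matching stream only; this sketch
# assumes disjoint=True would route it to every matching stream.
def conditional_split(rows, conditions, default_stream, disjoint=False):
    streams = {name: [] for name in conditions}
    streams[default_stream] = []
    for row in rows:
        matched = False
        for name, predicate in conditions.items():
            if predicate(row):
                streams[name].append(row)
                matched = True
                if not disjoint:
                    break  # first matching condition wins
        if not matched:
            streams[default_stream].append(row)
    return streams

# Hypothetical sample rows and split conditions.
rows = [{"temp": 110, "pressure": 40}, {"temp": 80, "pressure": 60}]
conditions = {
    "highTemp": lambda r: r["temp"] > 100,
    "lowPressure": lambda r: r["pressure"] <= 50,
}
print(conditional_split(rows, conditions, default_stream="otherRows"))
```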
Prepare and transform data with Azure Synapse Analytics - Learn
Prepare and transform data with Azure Synapse Analytics
Azure Blob Storage (JSON, Avro, Text, Parquet)
Azure Data Lake Storage Gen1 (JSON, Avro, Text, Parquet)
Azure Data Lake Storage Gen2 (JSON, Avro, Text, Parquet)
Azure Synapse Analytics
Azure SQL Database
Azure Cosmos DB
·docs.microsoft.com·
Flatten transformation in mapping data flow - Azure Data Factory & Azure Synapse
Denormalize hierarchical data using the flatten transformation in Azure Data Factory and Synapse Analytics pipelines.
Use the flatten transformation to take array values inside hierarchical structures such as JSON and unroll them into individual rows. This process is known as denormalization.
·docs.microsoft.com·
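To show what "unrolling array values into individual rows" means in practice, here is a small Python sketch of the same denormalization applied to a nested record. It illustrates the concept only and is not ADF code; the field names are made up.

```python
# Illustration of the flatten (unroll) idea: one record containing an array
# becomes one output row per array element. Field names are hypothetical.
order = {
    "orderId": 1001,
    "customer": "Contoso",
    "items": [
        {"sku": "A-1", "qty": 2},
        {"sku": "B-7", "qty": 1},
    ],
}

# Unroll the `items` array: repeat the parent columns for every element.
flattened_rows = [
    {"orderId": order["orderId"], "customer": order["customer"], **item}
    for item in order["items"]
]

for row in flattened_rows:
    print(row)
# {'orderId': 1001, 'customer': 'Contoso', 'sku': 'A-1', 'qty': 2}
# {'orderId': 1001, 'customer': 'Contoso', 'sku': 'B-7', 'qty': 1}
```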