Upload image data in the cloud with Azure Storage - Azure Event Grid
This tutorial creates a web app that stores and displays images from Azure storage. It's a prerequisite for an Event Grid tutorial that's linked at the end of this article.
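As a rough sketch of the storage step in that tutorial, the snippet below uploads a local image to Azure Blob Storage with the azure-storage-blob Python SDK. The connection string environment variable, container name, and file name are placeholder assumptions, not values taken from the article.

```python
# Minimal sketch: upload a local image to Azure Blob Storage.
# The connection string variable, container, and file names are hypothetical.
import os

from azure.storage.blob import BlobServiceClient, ContentSettings

service = BlobServiceClient.from_connection_string(
    os.environ["AZURE_STORAGE_CONNECTION_STRING"]
)
container = service.get_container_client("images")  # assumed container name

with open("photo.jpg", "rb") as data:
    container.upload_blob(
        name="photo.jpg",
        data=data,
        overwrite=True,
        content_settings=ContentSettings(content_type="image/jpeg"),
    )
```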
Create a function in Go or Rust using Visual Studio Code - Azure Functions
Learn how to create a Go function as an Azure Functions custom handler, then publish the local project to serverless hosting in Azure Functions using the Azure Functions extension in Visual Studio Code.
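The custom handler model is language-agnostic: the Functions host starts your executable and forwards invocations over HTTP to the port given in the FUNCTIONS_CUSTOMHANDLER_PORT environment variable. Purely as a hedged illustration of that contract (the article itself builds the handler in Go), here is a skeletal handler written in Python for consistency with the other sketches in this digest; the function path and the HTTP output binding name res are assumptions.

```python
# Sketch of the Azure Functions custom handler contract (illustrative only;
# the linked article implements this in Go). The host forwards invocations to
# http://localhost:$FUNCTIONS_CUSTOMHANDLER_PORT/<FunctionName> and expects a
# JSON reply describing the output bindings.
import json
import os
from http.server import BaseHTTPRequestHandler, HTTPServer


class Handler(BaseHTTPRequestHandler):
    def do_POST(self):
        # Assumed: an HTTP output binding named "res" in function.json.
        reply = {
            "Outputs": {
                "res": {"statusCode": 200, "body": "Hello from a custom handler"}
            },
            "Logs": [f"invoked {self.path}"],
        }
        body = json.dumps(reply).encode()
        self.send_response(200)
        self.send_header("Content-Type", "application/json")
        self.end_headers()
        self.wfile.write(body)


if __name__ == "__main__":
    port = int(os.environ.get("FUNCTIONS_CUSTOMHANDLER_PORT", 8080))
    HTTPServer(("0.0.0.0", port), Handler).serve_forever()
```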
Create a Python function using Visual Studio Code - Azure Functions
Learn how to create a Python function, then publish the local project to serverless hosting in Azure Functions using the Azure Functions extension in Visual Studio Code.
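As a rough sketch of what the quickstart produces, here is a minimal HTTP-triggered function using the Python v2 programming model; the function name and route are placeholders rather than the exact names used in the article.

```python
# Minimal HTTP-triggered Azure Function (Python v2 programming model).
# Function and route names are placeholders.
import azure.functions as func

app = func.FunctionApp(http_auth_level=func.AuthLevel.ANONYMOUS)


@app.route(route="hello")
def hello(req: func.HttpRequest) -> func.HttpResponse:
    name = req.params.get("name", "world")
    return func.HttpResponse(f"Hello, {name}!", status_code=200)
```

Running the project locally with Azure Functions Core Tools (func start) serves this function under the default /api route prefix.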
UNION (Azure Stream Analytics) - Stream Analytics Query
Combines the results of two or more queries into a single result set that includes all the rows that belong to all queries in the union.
The following are basic rules for combining the result sets of two queries by using UNION (a short illustration follows the list below):
The number and the order of the columns must be the same in all queries.
The data types must be compatible.
Streams must have the same partition and partition count.
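Stream Analytics queries are written in the service's own SQL-like language, so the following is only a loose analogy: it illustrates the same column-count, column-order, and type-compatibility rules using PySpark's DataFrame union. The inputs and column names are hypothetical, and stream partitioning is not modeled here.

```python
# Loose analogy in PySpark (not Stream Analytics syntax): union() matches
# columns positionally, so both inputs need the same number and order of
# columns with compatible types, mirroring the basic rules listed above.
from pyspark.sql import SparkSession

spark = SparkSession.builder.appName("union-rules-sketch").getOrCreate()

# Two hypothetical toll-booth inputs with identical schemas.
entry_a = spark.createDataFrame([(1, "ABC-123", 4.25)], ["TollId", "LicensePlate", "Toll"])
entry_b = spark.createDataFrame([(2, "XYZ-987", 5.00)], ["TollId", "LicensePlate", "Toll"])

# Succeeds: same column count, same order, compatible types.
combined = entry_a.union(entry_b)
combined.show()

# Would fail at analysis time: the second input has a different column count.
# entry_a.union(entry_b.drop("Toll"))
```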
Spark Streaming - Different Output modes explained - Spark by {Examples}
This article describes the usage of and differences between the complete, append, and update output modes in Apache Spark Structured Streaming. outputMode specifies what data is written to the sink (console, Kafka, etc.) when new data becomes available in the streaming input (Kafka, socket, etc.).
Use complete mode (outputMode("complete")) when you aggregate the data and want to write the entire aggregated result to the sink on every trigger.
Update mode (outputMode("update")) is similar to complete mode with one exception: it writes only the aggregated rows that changed since the last trigger to the sink.
Use append mode (outputMode("append")) when you want to write only new rows to the output sink.
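As a hedged sketch of how a mode is selected in code, the PySpark snippet below runs the classic streaming word count against an assumed local socket source (for example, one started with nc -lk 9999) and picks the mode on the writeStream side; swap the outputMode argument to compare behaviors.

```python
# Streaming word count; outputMode chooses how results reach the sink.
# The socket host and port are assumptions for local experimentation.
from pyspark.sql import SparkSession
from pyspark.sql.functions import explode, split

spark = SparkSession.builder.appName("output-modes-sketch").getOrCreate()

lines = (
    spark.readStream.format("socket")
    .option("host", "localhost")
    .option("port", 9999)
    .load()
)

word_counts = (
    lines.select(explode(split(lines.value, " ")).alias("word"))
    .groupBy("word")
    .count()
)

# "complete": rewrites the whole aggregated result on every trigger.
# "update":   writes only the rows whose counts changed in this trigger.
# "append":   writes only new rows; with an unbounded aggregation like this
#             one it is rejected unless a watermark is used.
query = word_counts.writeStream.outputMode("complete").format("console").start()
query.awaitTermination()
```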
Continuous integration and delivery - Azure Data Factory
Learn how to use continuous integration and delivery to move Azure Data Factory pipelines from one environment (development, test, production) to another.