Azure Data Factory (ADF) has released a “quick re-use” option, in public preview, for the Azure Integration Runtime TTL to reduce data flow execution start-up time from 2 mins to under…
Azure Data Factory (ADF) has added data flow connectors for ETL data loading, transforming, and landing in Common Data Model (CDM) and Delta Lake formats.
Azure Data Factory (ADF) has made it easier to view and manage large, complex ETL patterns with new zoom controls for complex graph design. ADF has also added cached lookups…
Azure Data Factory (ADF) is adding new connector support to enable the Optimized Row Columnar (ORC) format in data flows in ADF and Synapse Analytics, for data in ADLS and Blob Store.
Azure Data Factory (ADF) is adding new connector support for SQL MI as a source and sink in data flows in ADF and Synapse Analytics.
Update your servers/machines running self-hosted integration runtimes that communicate with the Azure Data Factory backend for control plane actions to use transport layer security (TLS) 1.2 by May 11, 2020.
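A quick way to sanity-check the TLS requirement from Python is to pin a client-side SSL context to TLS 1.2, the minimum version the ADF control plane will accept. This is a minimal sketch for illustration only; actual integration runtime connectivity is negotiated by the runtime itself, not by user code.

```python
import ssl

# Build a client-side SSL context pinned to TLS 1.2, matching the
# minimum version ADF's control plane requires for self-hosted IRs.
context = ssl.create_default_context()
context.minimum_version = ssl.TLSVersion.TLSv1_2  # refuse TLS 1.0/1.1
context.maximum_version = ssl.TLSVersion.TLSv1_2  # pin to 1.2 exactly for testing

print(context.minimum_version == ssl.TLSVersion.TLSv1_2)
```

Wrapping a socket with this context against a server that only offers TLS 1.0/1.1 will raise an `ssl.SSLError`, which is a simple way to verify a machine's outbound stack before the deadline.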
Azure Data Factory added several new features to mapping data flows and pipeline activities this week: Flatten transformation, Lookup transformation, container UI.
Azure Data Factory added several new features to mapping data flows this week: Import schema and test connection from debug cluster, custom sink ordering.
Azure Data Factory users can now build Mapping Data Flows using Managed Identity (formerly MSI) for Azure Data Lake Store Gen 2, Azure SQL Database, and Azure Synapse Analytics (formerly…
Azure Data Factory copy activity now supports resuming from the last failed run when you copy files between file-based data stores, easing ingestion and migration of large volumes of data.
Azure Data Factory now supports SFTP as a sink and as a source. Use copy activity to copy data from any supported data store to your SFTP server located on-premises…
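As a rough sketch, a copy activity writing to an SFTP sink pairs a format-level sink type with `SftpWriteSettings` as the store settings. The activity and dataset names below are placeholders, not values supplied by ADF; treat this as an illustrative shape under those assumptions, not a complete pipeline definition.

```python
import json

# Hedged sketch of a Copy activity that writes files to an SFTP sink.
# "CopyToSftp", "BlobSourceDataset", and "SftpSinkDataset" are
# placeholder names.
copy_activity = {
    "name": "CopyToSftp",
    "type": "Copy",
    "inputs": [{"referenceName": "BlobSourceDataset", "type": "DatasetReference"}],
    "outputs": [{"referenceName": "SftpSinkDataset", "type": "DatasetReference"}],
    "typeProperties": {
        "source": {
            "type": "BinarySource",
            "storeSettings": {"type": "AzureBlobStorageReadSettings"},
        },
        # SFTP is now valid on the write side via SftpWriteSettings.
        "sink": {
            "type": "BinarySink",
            "storeSettings": {"type": "SftpWriteSettings"},
        },
    },
}

print(json.dumps(copy_activity, indent=2))
```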
You now have the ability to run your Azure Machine Learning service pipelines as a step in your Azure Data Factory pipelines. This allows you to run your machine learning…
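In pipeline JSON, this surfaces as an activity of type `AzureMLExecutePipeline` that references a published AML pipeline by ID. The linked-service name and the pipeline GUID below are placeholders; this is a minimal sketch of the activity payload, not a full pipeline.

```python
# Hedged sketch of a Machine Learning Execute Pipeline activity.
# "AzureMLService" and the GUID are placeholder values.
ml_step = {
    "name": "RunAmlPipeline",
    "type": "AzureMLExecutePipeline",
    "linkedServiceName": {
        "referenceName": "AzureMLService",  # placeholder AML linked service
        "type": "LinkedServiceReference",
    },
    "typeProperties": {
        # ID of the published Azure ML pipeline to invoke (placeholder)
        "mlPipelineId": "00000000-0000-0000-0000-000000000000",
    },
}

print(ml_step["type"])
```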
Azure Data Factory now supports Azure Database for PostgreSQL as a sink. Use the copy activity feature to load data into Azure Database for PostgreSQL from any supported data source.
Load data faster with new support from the Copy Activity feature in Azure Data Factory. Now, if you’re trying to copy data from any supported source into SQL database/data warehouse…
Azure Data Factory now supports copying data into Azure Database for MySQL. Use the Copy Activity feature to load data into Azure Database for MySQL from any supported data sources.
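The two database sinks above follow the same copy-activity pattern: a source in any supported format and a sink block naming the target store's sink type. The sketch below uses the MySQL variant; the activity name is a placeholder and the batch size is illustrative, so take this as an assumed shape rather than a complete definition.

```python
import json

# Hedged sketch: Copy activity loading delimited text into
# Azure Database for MySQL. "LoadIntoMySql" is a placeholder name;
# for PostgreSQL the sink type would be "AzurePostgreSqlSink" instead.
activity = {
    "name": "LoadIntoMySql",
    "type": "Copy",
    "typeProperties": {
        "source": {"type": "DelimitedTextSource"},
        "sink": {
            "type": "AzureMySqlSink",
            "writeBatchSize": 10000,  # illustrative batch size
        },
    },
}

print(json.dumps(activity["typeProperties"]["sink"]))
```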
Azure Data Factory has added the ability to execute custom SQL scripts from your SQL sink transformation in mapping data flows. Now you can easily perform operations such as disabling…
Azure Data Factory Mapping Data Flows provides a code-free design environment for building and operationalizing ETL data transformations at scale. Now, the ADF team has added parameter support for Data…
Azure Data Factory upgraded the Teradata connector with new features and enhancements, including a built-in Teradata driver, out-of-box data partitioning to performantly ingest data from Teradata in parallel, and more.
A new logging mode in Diagnostic Settings for an Azure Logs target, starting with Azure Data Factory, will allow you to take advantage of improved ingestion latency, query performance, data…
Azure Data Factory seamlessly integrates with PolyBase to empower you to ingest data into SQL DW performantly. ADF now adds support for loading data from ADLS Gen2 and from Blob…
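On the copy activity, PolyBase loading is switched on through the SQL DW sink settings. The sketch below shows that sink block in isolation; the reject thresholds are illustrative values, not recommendations.

```python
# Hedged sketch: a copy-activity sink block that enables PolyBase
# when loading into SQL DW (Azure Synapse). Reject settings are
# illustrative placeholders.
sink = {
    "type": "SqlDWSink",
    "allowPolyBase": True,  # use PolyBase instead of row-by-row insert
    "polyBaseSettings": {
        "rejectType": "percentage",
        "rejectValue": 10.0,       # tolerate up to 10% rejected rows
        "rejectSampleValue": 100,  # sample size for the percentage check
        "useTypeDefault": True,
    },
}

print(sink["allowPolyBase"])
```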
Azure Data Factory empowers you to copy data from Azure Data Lake Storage (ADLS) Gen1 to Gen2 easily and performantly. Furthermore, now you can choose to preserve the access control…
The new Mapping Data Flows feature in Azure Data Factory allows Data Engineers to visually design, debug, manage, and operationalize data transformations at scale in the cloud.