SqlBits 2016 was my first SqlBits ever, after a few attempts in past years. This time I managed to get to Liverpool and meet many friends from the UK and Europe whom I hadn't met in person before. It was a pleasure to meet you all there. In this post I've shared my session slides. I Read more about Slides of My Session at SqlBits 2016: Azure Data Factory vs SSIS[…]
It is with great pleasure that I announce I will be speaking at SqlBits 2016 Liverpool, which runs from 4th to 7th May 2016. I will be speaking on one of my favorite topics: comparing SSIS as an on-premises ETL tool with Azure Data Factory as a cloud data ingestion service. My session is scheduled for 4pm Read more about I Will Speak at SqlBits; Azure Data Factory vs SSIS[…]
Thanks to all attendees of my session today at SQL Saturday Sydney. It was a pleasure to have you all in front of me and to receive such great questions and feedback from you. This was my second time in Sydney, and I'm looking forward to the next gathering, hopefully next year. Special thanks to Grant Read more about Presentation Materials for My Session at SQL Saturday Sydney 2016[…]
I was honored to talk with Carlos L Chacon about Azure Data Factory after my session at PASS Summit, and he published the webcast on his website, SQLDataPartners. Thanks to Carlos and SQLDataPartners for the great opportunity to talk about one of my favorite topics: data movement with this new technology, Azure Data Factory. Read more about Podcast: Azure Data Factory[…]
It was my honor to present for the third year in a row at PASS Summit 2015. I saw many familiar faces of #SQLFamily: many MVPs, MCMs, speakers, and PASS HQ. I presented my favorite topic this year: Azure Data Factory vs SSIS. Slides are shared on the PASS website, but I also uploaded Read more about Slides of my Session at PASS Summit 2015: Azure Data Factory vs SSIS[…]
Azure Data Factory reached general availability 10 days ago. An extension for Visual Studio was also published a little earlier for Data Factory. The good news is that you can now create Azure Data Factory projects from Visual Studio. This is a great step forward in the development of Data Factory Read more about Azure Data Factory Templates for Visual Studio[…]
With the announcement of Power BI a week ago, and with the great features in this product, there are many opportunities to put this product and service to use. Power BI can be used alongside other Azure services, and even on-premises services, to build an end-to-end enterprise BI and data analysis solution. In this post I'll explain one example scenario of a hybrid, end-to-end BI solution that uses the technologies below:
- On-premises data sources
- Azure Data Factory, or SSIS with Azure Pack, for data extraction and transformation
- Azure SQL Database or Azure SQL Data Warehouse as the database / data warehouse engine in the cloud
- Power BI as the data analysis and front-end tool
In the previous post you've seen how to create an Azure Data Factory. In this post we want to take the first step in building the components of Azure Data Factory. Usually the very first step is creating Linked Services. Linked Services are connections to data sources and destinations. A data source or destination may be on Azure (such as Azure Blob Storage or Azure SQL Database) or on-premises (such as an on-premises SQL Server or an on-premises Oracle database). Linked Services need to work through the Data Management Gateway if the data source/destination is on-premises.
In this example we follow the previous post's solution: we want to copy data from some CSV files existing on Azure Blob Storage and load it into an Azure SQL Database. So we need two Linked Services for this example: one for Azure Blob Storage, and the other for Azure SQL Database. Creating Linked Services might not be so hard once you have the environment ready. However, as we want to do everything from scratch in this example, I'll explain how to create an Azure Blob Storage account and upload CSV files there to be the source of our operation. I'll also explain how to create the destination table in Azure SQL Database.
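To give a rough idea of the shape of these definitions, here is a sketch of the two Linked Services in Data Factory's JSON format; the names and the placeholder values in the connection strings are hypothetical, and you would substitute your own account, server, and credentials:

```json
{
  "name": "AzureStorageLinkedService",
  "properties": {
    "type": "AzureStorage",
    "typeProperties": {
      "connectionString": "DefaultEndpointsProtocol=https;AccountName=<account>;AccountKey=<key>"
    }
  }
}
```

```json
{
  "name": "AzureSqlLinkedService",
  "properties": {
    "type": "AzureSqlDatabase",
    "typeProperties": {
      "connectionString": "Server=tcp:<server>.database.windows.net,1433;Database=<database>;User ID=<user>;Password=<password>;Encrypt=True"
    }
  }
}
```

The first points at the Blob Storage account holding the CSV files; the second points at the Azure SQL Database that will receive the data.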
Previously, in another post, I explained what Azure Data Factory is, along with the tools and requirements for this service. In this post I want to go through a simple demo of Data Factory, so you get an idea of how a Data Factory project is built, developed, and scheduled to run. You may see some components of Azure Data Factory in this post that you don't fully understand, but don't worry; I'll go through them in future posts.
An overview from the previous section: Azure Data Factory is a Microsoft Azure service to ingest data from data sources, apply compute operations on the data, and load it into the destination. The main purpose of Data Factory is data ingestion, and that is the big difference between this service and ETL tools such as SSIS (I'll go through the differences between Data Factory and SSIS in a separate blog post). With Azure Data Factory you can:
- Access data sources such as on-premises SQL Server, SQL Azure, and Azure Blob Storage
- Apply data transformations through Hive, Pig, and C#
- Monitor data pipelines, and the validation and execution of scheduled jobs
- Load data into desired destinations such as on-premises SQL Server, SQL Azure, and Azure Blob Storage
- And last but not least: this is a cloud-based service
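To give a feel for how these pieces fit together in a project, here is a hedged sketch of a Data Factory pipeline definition with a single Copy activity moving data from a blob source to a SQL sink; the pipeline name, dataset names, and schedule window are hypothetical and would refer to datasets and Linked Services defined elsewhere in the project:

```json
{
  "name": "CopyBlobToSqlPipeline",
  "properties": {
    "description": "Copy CSV data from Blob Storage into Azure SQL Database",
    "activities": [
      {
        "name": "CopyFromBlobToSql",
        "type": "Copy",
        "inputs": [ { "name": "CsvBlobDataset" } ],
        "outputs": [ { "name": "SqlOutputDataset" } ],
        "typeProperties": {
          "source": { "type": "BlobSource" },
          "sink": { "type": "SqlSink" }
        }
      }
    ],
    "start": "2016-05-01T00:00:00Z",
    "end": "2016-05-02T00:00:00Z"
  }
}
```

The `start` and `end` properties define the active period over which Data Factory schedules the pipeline to run; the source and sink types tell the Copy activity how to read from and write to the referenced datasets.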
It is an honor for me to have been selected to speak at SQL PASS Summit 2015. SQL PASS Summit is the largest SQL Server conference and event in the world. Last year about 6,000 people from more than 55 countries attended this great event. I've been honored previously to speak at this great conference Read more about I'll Speak in SQL PASS Summit 2015; 3 Years in a row[…]