Today we are pleased to introduce the U-SQL database project, a new project type in Azure Data Lake Tools for Visual Studio (ADL Tools for VS) that accelerates U-SQL database development, management and deployment. All objects (except for credentials) in U-SQL databases can be created and managed with the U-SQL Data Definition Language (DDL). Before,…
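To make it concrete, here is a minimal sketch of the kind of DDL a U-SQL database project puts under source control and deploys; the database, table, and view names (SalesDb, dbo.DailyOrders, dbo.RecentOrders) are hypothetical and not from the announcement:

// Hypothetical objects; each statement is ordinary U-SQL DDL.
CREATE DATABASE IF NOT EXISTS SalesDb;

USE DATABASE SalesDb;

// Managed U-SQL tables require a clustered index and a distribution scheme.
CREATE TABLE IF NOT EXISTS dbo.DailyOrders
(
    OrderId long,
    OrderDate DateTime,
    Amount decimal,
    INDEX idx_Orders CLUSTERED (OrderId ASC) DISTRIBUTED BY HASH (OrderId)
);

CREATE VIEW IF NOT EXISTS dbo.RecentOrders AS
    SELECT OrderId, OrderDate, Amount
    FROM dbo.DailyOrders
    WHERE OrderDate >= DateTime.Parse("2018-01-01");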
Build hybrid cloud analytics solutions with ADLA Task in SSIS
Today, we are pleased to announce new support for the Azure Data Lake Analytics Task (ADLA Task) in the Azure Feature Pack for Integration Services (SSIS). The ADLA Task enables you to easily extend your existing SSIS workflows with big data compute capability in the cloud powered by ADLA. More and more customers are storing…
Easier Azure Data Lake Store management: alerts for folders and files
Companies regularly rely on the massive scale and capabilities of Azure Data Lake Store for big data storage. As the number of files, file types, and folders grows, things get harder to manage, and staying compliant becomes a greater challenge. Regulations such as GDPR (General Data Protection Regulation) have heightened requirements for…
Process more files than ever and use Parquet with Azure Data Lake Analytics
In a recent release, Azure Data Lake Analytics (ADLA) takes its capability to process large numbers of files in many different formats to the next level. This blog post is an end-to-end walk-through of generating many Parquet files from a rowset and processing them at scale with ADLA, as well as…
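To make the scenario concrete, here is a minimal sketch, with hypothetical paths and sample data, of writing a rowset to Parquet and then (in a separate job, since a script cannot read files it has just written) reading a whole set of Parquet files back through a file set pattern; depending on the release, Parquet support may still need to be enabled as a preview feature on the job:

// Job 1 (hypothetical sample data): write a constant rowset out as Parquet.
@rows =
    SELECT * FROM
        (VALUES
            (1, "2018-03-01", 10.5),
            (2, "2018-03-02", 20.0)
        ) AS T(Id, EventDate, Value);

OUTPUT @rows
TO "/output/events/part1.parquet"
USING Outputters.Parquet();

// Job 2: read many Parquet files at once; {FileName} is a virtual column bound by the file set pattern.
@input =
    EXTRACT Id int,
            EventDate string,
            Value double,
            FileName string
    FROM "/output/events/{FileName}.parquet"
    USING Extractors.Parquet();

@summary =
    SELECT EventDate,
           SUM(Value) AS TotalValue
    FROM @input
    GROUP BY EventDate;

OUTPUT @summary
TO "/output/parquet_summary.csv"
USING Outputters.Csv(outputHeader : true);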
Azure Data Lake Analytics and U-SQL Spring 2018 Updates: Parquet support, small files, dynamic output, fast file sets, and much more!
Hello Azure Data Lake and U-SQL fans and followers. It is high time for the release notes covering all the cool features we shipped over the winter, along with the pending deprecation items and breaking changes. There was so much cool new stuff that it took me several weeks to write the…
Get started with U-SQL: It’s easy!
Azure Data Lake Analytics combines declarative and imperative concepts in the form of a new language called U-SQL. The idea of learning a new language is daunting. Don’t worry! U-SQL is easy to learn. You can learn the vast majority of the language in a single day. If you are familiar with SQL or languages…
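As a taste of that first day, here is a minimal complete script; the input path and schema (a search log TSV) are assumed sample data, not something this post prescribes:

// Declarative part: schematize a file on read (hypothetical sample path and columns).
@searchlog =
    EXTRACT UserId   int,
            Start    DateTime,
            Region   string,
            Query    string,
            Duration int
    FROM "/input/SearchLog.tsv"
    USING Extractors.Tsv();

// Imperative part: ordinary C# expressions, such as string.Contains, mix directly into the query.
@result =
    SELECT Region,
           SUM(Duration) AS TotalDuration
    FROM @searchlog
    WHERE Query.Contains("data")
    GROUP BY Region;

OUTPUT @result
TO "/output/totalduration_by_region.csv"
USING Outputters.Csv(outputHeader : true);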
Use AU Analyzer for faster, lower cost Data Lake Analytics
Do you use Data Lake Analytics and wonder how many Analytics Units your jobs should have been assigned? Do you want to see if your job could consume a little less time or money? The recently announced AU Analyzer tool can help you today! See our announcement of the AU Analyzer, available in both Visual…
Keeping Data Lake Costs Under Control: Creating Alerts for AU Usage Thresholds
Have you ever been surprised by a larger-than-expected monthly Azure Data Lake Analytics bill? Creating alerts using Log Analytics will help you know when the bill is growing more than it should. In this post, I will show you how to create an alert that emails a message whenever the total AUs assigned to jobs…
Simple Trick to Stay on top of your Azure Data Lake: Create Alerts using Log Analytics
If you manage one or more Azure Data Lake accounts, do you ever find it hard to stay on top of everything that is happening? Ever feel the need to know more about them? Are you regularly asking yourself any or all of these questions: What are our most expensive jobs? When was a…
Using the first job run to optimize subsequent runs with Azure Data Lake job AU analyzer
Customers have been telling us it’s hard to find the right balance between Analytics Units (AUs) and job execution time. For too many users, optimizing a job means running the job in the cloud using trial and error to find the best AU allocation that balances cost and performance. Today we are happy to introduce…