This week in Orlando, Florida, Microsoft is hosting the 2018 Microsoft Ignite conference. Ignite is Microsoft's largest annual technology conference, drawing more than 20,000 attendees. This year there seem to have been more Microsoft Azure-related announcements made at, or around, the conference than in years past, along with many other Microsoft announcements as well.
This special edition of Build Azure Weekly highlights the top Microsoft Azure-related announcements made by Microsoft at, or around, the Microsoft Ignite 2018 conference. Be aware, this is a pretty long list, and it doesn't even include all the announcements made this week! Read More
The 70-777 Implementing Microsoft Azure Cosmos DB Solutions certification exam tests and validates your expertise in designing, building, and troubleshooting Azure Cosmos DB solutions. The exam focuses on the Azure Cosmos DB database service in the Microsoft Azure Cloud, and is targeted towards database developers, big data developers, and architects who are leveraging Azure Cosmos DB in their solutions. Read More
The 70-776 Performing Big Data Engineering on Microsoft Cloud Services certification exam tests and validates your expertise in designing analytics solutions and building operationalized solutions on Microsoft Azure. This exam covers data engineering topics around Azure SQL Data Warehouse, Azure Data Lake, Azure Data Factory, and Azure Stream Analytics.
Exam Target Audience
The 70-776 Performing Big Data Engineering on Microsoft Cloud Services certification exam is targeted towards Big Data professionals. The exam centers on designing analytics solutions and building operationalized solutions on Microsoft Azure. The primary Azure service areas covered on this Big Data exam are: Azure SQL Data Warehouse, Azure Data Lake, Azure Data Factory, and Azure Stream Analytics.
The exam targets candidates who are experienced and familiar with the capabilities of batch data processing, real-time processing, and operationalization technologies. These candidates will be able to apply Microsoft cloud technologies to solution designs and implement big data analytics solutions.
Here is a list of the skills and objectives measured on this exam. The percentage shown next to each high-level objective area represents the portion of the exam focused on that area.
- Design and Implement Complex Event Processing By Using Azure Stream Analytics (15-20%)
- Ingest data for real-time processing
- Select appropriate data ingestion technology based on specific constraints; design partitioning scheme and select mechanism for partitioning; ingest and process data from a Twitter stream; connect to stream processing entities; estimate throughput, latency needs, and job footprint; design reference data streams
- Design and implement Azure Stream Analytics
- Configure thresholds, use the Azure Machine Learning UDF, create alerts based on conditions, use a machine learning model for scoring, train a model for continuous learning, use common stream processing scenarios
- Implement and manage the streaming pipeline
- Stream data to a live dashboard, archive data as a storage artifact for batch processing, enable consistency between stream processing and batch processing logic
- Query real-time data by using the Azure Stream Analytics query language
- Use built-in functions, use data types, identify query language elements, control query windowing by using Time Management, guarantee event delivery
- Design and Implement Analytics by Using Azure Data Lake (25-30%)
- Ingest data into Azure Data Lake Store
- Create an Azure Data Lake Store (ADLS) account, copy data to ADLS, secure data within ADLS by using access control, leverage end-user or service-to-service authentication appropriately, tune the performance of ADLS, access diagnostic logs
- Manage Azure Data Lake Analytics
- Create an Azure Data Lake Analytics (ADLA) account, manage users, manage data sources, manage, monitor, and troubleshoot jobs, access diagnostic logs, optimize jobs by using the vertex view, identify historical job information
- Extract and transform data by using U-SQL
- Schematize data on read at scale; generate outputter files; use the U-SQL data types, use C# and U-SQL expression language; identify major differences between T-SQL and U-SQL; perform JOINS, PIVOT, UNPIVOT, CROSS APPLY, and Windowing functions in U-SQL; share data and code through U-SQL catalog; define benefits and use of structured data in U-SQL; manage and secure the Catalog
- Extend U-SQL programmability
- Use user-defined functions, aggregators, and operators, scale out user-defined operators, call Python, R, and Cognitive capabilities, use U-SQL user-defined types, perform federated queries, share data and code across ADLA and ADLS
- Integrate Azure Data Lake Analytics with other services
- Integrate with Azure Data Factory, Azure HDInsight, Azure Data Catalog, and Azure Event Hubs, ingest data from Azure SQL Data Warehouse
- Design and Implement Azure SQL Data Warehouse Solutions (15-20%)
- Design tables in Azure SQL Data Warehouse
- Choose the optimal type of distribution column to optimize workflows, select a table geometry, limit data skew and process skew through the appropriate selection of distributed columns, design columnstore indexes, identify when to scale compute nodes, calculate the number of distributions for a given workload
- Query data in Azure SQL Data Warehouse
- Implement query labels, aggregate functions, create and manage statistics in distributed tables, monitor user queries to identify performance issues, change a user resource class
- Integrate Azure SQL Data Warehouse with other services
- Ingest data into Azure SQL Data Warehouse by using AZCopy, Polybase, Bulk Copy Program (BCP), Azure Data Factory, SQL Server Integration Services (SSIS), Create-Table-As-Select (CTAS), and Create-External-Table-As-Select (CETAS); export data from Azure SQL Data Warehouse; provide connection information to access Azure SQL Data Warehouse from Azure Machine Learning; leverage Polybase to access a different distributed store; migrate data to Azure SQL Data Warehouse; select the appropriate ingestion method based on business needs
- Design and Implement Cloud-Based Integration by using Azure Data Factory (15-20%)
- Implement datasets and linked services
- Implement availability for the slice, create dataset policies, configure the appropriate linked service based on the activity and the dataset
- Move, transform, and analyze data by using Azure Data Factory activities
- Copy data between on-premises and the cloud, create different activity types, extend the data factory by using custom processing steps, move data to and from Azure SQL Data Warehouse
- Orchestrate data processing by using Azure Data Factory pipelines
- Identify data dependencies and chain multiple activities, model schedules based on data dependencies, provision and run data pipelines, design a data flow
- Monitor and manage Azure Data Factory
- Identify failures and root causes, create alerts for specified conditions, perform a redeploy, use the Microsoft Azure Portal monitoring tool
- Manage and Maintain Azure SQL Data Warehouse, Azure Data Lake, Azure Data Factory, and Azure Stream Analytics (20-25%)
- Provision Azure SQL Data Warehouse, Azure Data Lake, Azure Data Factory, and Azure Stream Analytics
- Provision Azure SQL Data Warehouse, Azure Data Lake, and Azure Data Factory, implement Azure Stream Analytics
- Implement authentication, authorization, and auditing
- Integrate services with Azure Active Directory (Azure AD), use the local security model in Azure SQL Data Warehouse, configure firewalls, implement auditing, integrate services with Azure Data Factory
- Manage data recovery for Azure SQL Data Warehouse, Azure Data Lake, Azure Data Factory, and Azure Stream Analytics
- Backup and recover services, plan and implement geo-redundancy for Azure Storage, migrate from an on-premises data warehouse to Azure SQL Data Warehouse
- Monitor Azure SQL Data Warehouse, Azure Data Lake, and Azure Stream Analytics
- Manage concurrency, manage elastic scale for Azure SQL Data Warehouse, monitor workloads by using Dynamic Management Views (DMVs) for Azure SQL Data Warehouse, troubleshoot Azure Data Lake performance by using the Vertex Execution View
- Design and implement storage solutions for big data implementations
- Optimize storage to meet performance needs, select appropriate storage types based on business requirements, use AZCopy, Storage Explorer and Redgate Azure Explorer to migrate data, design cloud solutions that integrate with on-premises data
You can also view the full objectives list for the 70-776 Performing Big Data Engineering on Microsoft Cloud Services certification exam on the official 70-776 exam page at Microsoft.com.
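One of the Stream Analytics objectives above, controlling query windowing with Time Management, is easier to grasp with a concrete model. A tumbling window slices a stream into fixed-size, non-overlapping time buckets, and every event lands in exactly one bucket. Here's a minimal Python sketch of that bucketing logic — the event timestamps and window size are hypothetical, and this models only the concept, not the actual Stream Analytics query language:

```python
from collections import defaultdict

def tumbling_window(events, window_seconds):
    """Group (timestamp, value) events into fixed, non-overlapping windows."""
    windows = defaultdict(list)
    for ts, value in events:
        # Each event belongs to exactly one window, keyed by the window's start time.
        window_start = (ts // window_seconds) * window_seconds
        windows[window_start].append(value)
    return dict(windows)

# Hypothetical sensor readings as (seconds-since-start, reading) pairs:
events = [(1, 10), (4, 20), (11, 30), (14, 40), (21, 50)]
print(tumbling_window(events, 10))
# {0: [10, 20], 10: [30, 40], 20: [50]}
```

Hopping and sliding windows differ in that their buckets can overlap, so one event may appear in several windows.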
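The SQL Data Warehouse table-design objectives above center on picking a good distribution column. Azure SQL Data Warehouse spreads a hash-distributed table across 60 distributions, so a low-cardinality distribution column concentrates rows onto only a few distributions and causes data skew. The Python sketch below models that effect; the hash function is a stand-in for the engine's internal one, and the sample data is hypothetical:

```python
from collections import Counter

# Azure SQL Data Warehouse spreads each hash-distributed table across 60 distributions.
NUM_DISTRIBUTIONS = 60

def assign_distribution(value, num_distributions=NUM_DISTRIBUTIONS):
    # Stand-in for the engine's internal hash function (which is not exposed).
    return hash(value) % num_distributions

def skew_report(values, num_distributions=NUM_DISTRIBUTIONS):
    """For a candidate distribution column, report how many distributions
    receive rows and the row count on the hottest one."""
    counts = Counter(assign_distribution(v, num_distributions) for v in values)
    return len(counts), max(counts.values())

# Hypothetical fact table keyed on a 3-value 'region' column: all rows
# pile onto at most 3 of the 60 distributions -- heavy data skew.
rows = ["east"] * 500 + ["west"] * 300 + ["north"] * 200
used, hottest = skew_report(rows)
print(used, hottest)  # at most 3 distributions used; the hottest holds >= 500 rows
```

A high-cardinality, evenly distributed column (an order ID, for instance) would spread the same rows across far more distributions and balance the work.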
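The Data Factory objectives above include identifying data dependencies and chaining multiple activities. Conceptually, a pipeline is a dependency graph, and activities execute in an order where every dependency finishes first. Here's a small Python sketch using the standard library's topological sorter; the pipeline and activity names are hypothetical, not Data Factory's actual JSON model:

```python
from graphlib import TopologicalSorter  # standard library, Python 3.9+

# Hypothetical pipeline: each activity maps to the activities it depends on,
# mirroring how Data Factory chains activities on data dependencies.
pipeline = {
    "copy_raw_to_lake": [],
    "transform_usql": ["copy_raw_to_lake"],
    "load_warehouse": ["transform_usql"],
    "refresh_dashboard": ["load_warehouse", "transform_usql"],
}

def run_order(activities):
    """Return an execution order in which every dependency runs first."""
    return list(TopologicalSorter({a: set(d) for a, d in activities.items()}).static_order())

print(run_order(pipeline))
# ['copy_raw_to_lake', 'transform_usql', 'load_warehouse', 'refresh_dashboard']
```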
There are not very many study or training materials designed specifically for the 70-776 Performing Big Data Engineering on Microsoft Cloud Services certification exam. For example, Microsoft Press has not yet published any Exam Ref or Guide books for this exam (at the time of this writing). However, there is still plenty of material available from various sources, both free and paid, ranging from official Microsoft product documentation, to other articles and videos from Microsoft, to training from companies like Opsgility and their SkillMeUp service.
Another interesting resource to utilize is the following recording of the 70-776 Cert Exam Prep session given by James Herring at Microsoft Ignite 2017:
The Developing Microsoft SQL Server Databases (70-464) certification exam is one of the elective exams that counts towards the MCSE: Data Management and Analytics certification. The exam centers on objectives covering the development of Microsoft SQL Server databases.
Certification Target Audience
The 70-464 Developing Microsoft SQL Server Databases certification exam focuses on SQL Server technologies. The exam was first published in 2012, with more recent updates in 2016, so it doesn't appear to cover the Azure SQL Database topics that some other exams do.
This exam targets database professionals who build and implement databases across organizations and who ensure high levels of data availability. Their responsibilities include creating database files, data types, and tables; planning, creating, and optimizing indexes; ensuring data integrity; implementing views, stored procedures, and functions; and managing transactions and locks.
Here’s a high-level outline of the skill objectives measured on the Developing Microsoft SQL Server Databases (70-464) certification exam. The percentage next to each exam objective represents the share of exam questions in that objective area.
- Implement database objects (30-35%)
- Create and alter tables
- Design, implement, and troubleshoot security
- Design the locking granularity level
- Implement indexes
- Implement data types
- Create and modify constraints
- Implement programming objects (15-20%)
- Design and implement stored procedures
- Design T-SQL table-valued and scalar functions
- Create, use, and alter user-defined functions (UDFs)
- Create and alter views
- Design database objects (25-30%)
- Design tables
- Design for concurrency
- Design indexes
- Design data integrity
- Design for implicit and explicit transactions
- Optimize and troubleshoot queries (25-30%)
- Optimize and tune queries
- Troubleshoot and resolve performance problems
- Optimize indexes
- Capture and analyze execution plans
- Collect performance and system information
When studying for this exam, you’ll definitely want to review the official exam page from Microsoft for the complete list of objectives covered. Plan to study every one of the measured objectives before attempting the exam.
At the time of this writing, the 70-464 Developing Microsoft SQL Server Databases exam has a limited amount of exam preparation material available. As a result, you may need to rely primarily on Microsoft documentation for the technologies and skills measured on this exam.
However, there is a fair bit of overlap between this exam and the newer 70-762 Developing SQL Databases exam. It’s unclear at this time whether 70-762 is meant to replace 70-464, but there is overlap either way. As a result, you may be able to use 70-762 study materials when preparing for the 70-464 exam. It’s at least worth checking out.
For years it has been difficult to migrate a database built in SQL Server to Microsoft Azure SQL Database. Originally, the T-SQL support in Azure SQL wasn’t the same as in SQL Server, which caused a lot of pain. T-SQL support in Azure SQL has been greatly enhanced over the years, but at times a migration is still necessary. Fortunately, SQL Server Management Studio includes the “Deploy Database to Microsoft Azure Database Wizard,” which has built-in support for migrating a database from SQL Server to Azure SQL Database. This tool actually works BOTH ways: from Azure to SQL Server and from SQL Server to Azure! Read More
The “Microsoft Azure Essentials: Migrating SQL Server Databases to Azure” book, written by Carl Rabeler, has been made available as a free eBook by Microsoft Press. It is part of the “Microsoft Azure Essentials” book series. With this book you will learn how SQL Server in Azure is similar to, and different from, SQL Server in an on-premises environment. Rabeler, a content lead for Azure.com, walks you through getting started with SQL Server in an Azure Virtual Machine as well as Azure SQL Database. Read More
When Microsoft announced the initial Technical Preview release of Azure Stack, it also announced that additional services would be released in the coming weeks. This week, Microsoft announced the addition of Web Apps, SQL database, and MySQL database Platform as a Service (PaaS) offerings to the Microsoft Azure Stack platform, along with some additional new tools!
The newly released tools and services for Azure Stack can be downloaded and installed on top of any installation of the Azure Stack Technical Preview. This is the first installment of many more Azure Stack services to come beyond what ships with the Technical Preview itself. Read More
Just as all Azure Web Apps need configuration values, most applications also need database Connection String values configured. With Azure Web Apps, Connection Strings are stored and retrieved in much the same way as Azure Web App Application Settings. Connection Strings are also Key/Value pairs of String values, but they are separated out into their own section.
Connection Strings are typically used to store the connection information for one or more databases the Web App needs to connect to for storing and retrieving data. The Connection String types supported are SQL Database, SQL Server, MySQL, and Custom. Most often, the Connection Strings used will be for some kind of SQL RDBMS, but the Custom type allows an additional Connection String to be configured for any other type of database connection necessary.
As with Application Settings, the Connection Strings are accessed as normal from .NET code, and the values will come from what is set within the Azure Management Portal. In other development environments (Node.js, Java, PHP, Python) the Connection Strings are exposed to code as Environment Variables. Additionally, the Connection Strings are editable within the Azure Management Portal, but are read-only when accessed through code. Read More
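For the non-.NET environments just mentioned, here's a sketch of reading a connection string from an environment variable. The platform prefixes the variable name with the connection string type (e.g. SQLAZURECONNSTR_ for SQL Database, MYSQLCONNSTR_ for MySQL, CUSTOMCONNSTR_ for Custom); the connection string name and value below are hypothetical:

```python
import os

def get_connection_string(name):
    """Look up an Azure Web App connection string from the environment.

    The platform prefixes the variable name by connection type:
    SQLAZURECONNSTR_ (SQL Database), SQLCONNSTR_ (SQL Server),
    MYSQLCONNSTR_ (MySQL), and CUSTOMCONNSTR_ (Custom).
    """
    for prefix in ("SQLAZURECONNSTR_", "SQLCONNSTR_", "MYSQLCONNSTR_", "CUSTOMCONNSTR_"):
        value = os.environ.get(prefix + name)
        if value is not None:
            return value
    return None

# Simulate what the platform would set for a SQL Database connection
# string named "MyDb" (the server and value are hypothetical):
os.environ["SQLAZURECONNSTR_MyDb"] = "Server=tcp:myserver.database.windows.net;Database=MyDb;"
print(get_connection_string("MyDb"))
```

Because the values come from the environment rather than config files, the same code works locally (with variables you set yourself) and in Azure (with variables the portal sets).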
Azure DocumentDB is a NoSQL document database that uses the JSON data format for storing and querying documents. As a Microsoft Azure service, DocumentDB offers a JSON document database that includes all of the benefits of Microsoft Azure and the cloud.
Here’s a list of some of the key features DocumentDB offers:
- PaaS and Scalability
- Schema-free JSON documents
- SQL language queries
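The schema-free point is worth a quick illustration: two documents in the same collection can have completely different shapes, and queries simply address whatever properties a document has. Here's a small Python sketch — the documents are hypothetical, and the filter only approximates what a DocumentDB SQL query over the `name` property would do:

```python
# Two documents in the same collection need not share a schema --
# each is stored as plain JSON (both documents here are hypothetical).
doc1 = {"id": "1", "name": "Contoso", "industry": "Retail"}
doc2 = {"id": "2", "name": "Fabrikam", "employees": 4200, "offices": ["Seattle", "Paris"]}
collection = [doc1, doc2]

def query_by_name(docs, name):
    """Rough analogue of a DocumentDB SQL query filtering on a property:
    documents missing the property simply don't match."""
    return [d for d in docs if d.get("name") == name]

print(query_by_name(collection, "Fabrikam"))
# [{'id': '2', 'name': 'Fabrikam', 'employees': 4200, 'offices': ['Seattle', 'Paris']}]
```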
There are many things to consider when migrating an existing application, or building a new one, to host on Microsoft Azure. Data storage is definitely a top concern, and along with it, in many cases, the ability to use SQL Server. Fortunately, Microsoft Azure provides a couple of options for using a SQL database. Read More