The AI-100 Designing and Implementing an Azure AI Solution certification exam tests and validates your expertise in using the various services within the Microsoft Azure Artificial Intelligence (AI) portfolio.
Passing this exam is required to earn the Microsoft Certified: Azure AI Engineer Associate certification. This certification is for Azure AI Engineers who use Cognitive Services, Machine Learning, and Knowledge Mining to architect and implement Microsoft AI solutions involving natural language processing, speech, computer vision, bots, and agents.
I’m thrilled to be a part of the inaugural Azure + AI Conference in Las Vegas on December 3-6, 2018! Co-located with DevIntersection, it will be a first-of-its-kind Azure- and AI-focused conference! Join me along with industry experts Scott Guthrie, Zoiner Tejada, Michele Leroux Bustamante, Eric Boyd, Donovan Brown, and Scott Hanselman.
You will have the opportunity to train and network with Microsoft engineers and Azure and AI industry experts. Registrants who sign up for the conference and workshops will take home some great hardware, such as a Surface Go, an Xbox, and more.
Save $100 by using the code “BuildAzure”.
The convergence of cloud and AI opens entire new worlds of opportunity to achieve new capabilities, but it also brings a lot of new technologies to learn. Whether you are a born-in-the-cloud developer looking to increase your AI capabilities, a data scientist looking to build powerful AI in the cloud using the tools you already know and love, or a data engineer with some expertise in both who wants to learn the latest cutting-edge approaches, the Azure + AI Conference is the one place you can attend that will help you tie Azure and AI together to build amazing AI-powered solutions. In this casual environment, experts are there to talk with you, share their knowledge and experience, and help you build the knowledge, skills, and network you need to succeed in your Azure + AI endeavors.
The 70-776 Performing Big Data Engineering on Microsoft Cloud Services certification exam tests and validates your expertise in designing analytics solutions and building operationalized solutions on Microsoft Azure. This exam covers data engineering topics around Azure SQL Data Warehouse, Azure Data Lake, Azure Data Factory, and Azure Stream Analytics.
This exam retires on June 30, 2019.
Exam Target Audience
The 70-776 Performing Big Data Engineering on Microsoft Cloud Services certification exam is targeted towards Big Data Professionals. This exam is centered around designing analytics solutions and building operationalized solutions on Microsoft Azure. The primary Azure service areas covered on this Big Data exam are: Azure SQL Data Warehouse, Azure Data Lake, Azure Data Factory, and Azure Stream Analytics.
Candidates who are experienced and familiar with the capabilities and features of batch data processing, real-time processing, and operationalization technologies are the target audience for this exam. These candidates will be able to apply Microsoft cloud technologies to solution designs and implement big data analytics solutions.
Here is a list of the skills and objectives measured on this exam. The percentages on the high-level objective areas represent the percentage of the exam that is focused on that objective area.
- Design and Implement Complex Event Processing By Using Azure Stream Analytics (15-20%)
- Ingest data for real-time processing
- Select appropriate data ingestion technology based on specific constraints; design partitioning scheme and select mechanism for partitioning; ingest and process data from a Twitter stream; connect to stream processing entities; estimate throughput, latency needs, and job footprint; design reference data streams
- Design and implement Azure Stream Analytics
- Configure thresholds, use the Azure Machine Learning UDF, create alerts based on conditions, use a machine learning model for scoring, train a model for continuous learning, use common stream processing scenarios
- Implement and manage the streaming pipeline
- Stream data to a live dashboard, archive data as a storage artifact for batch processing, enable consistency between stream processing and batch processing logic
- Query real-time data by using the Azure Stream Analytics query language
- Use built-in functions, use data types, identify query language elements, control query windowing by using Time Management, guarantee event delivery
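The windowing bullets above are the heart of the Stream Analytics query language: events are grouped by time rather than by row. As a rough illustration of a tumbling window (written in Python rather than the Stream Analytics SQL dialect, with made-up event data), each event falls into exactly one fixed, non-overlapping time bucket:

```python
from collections import defaultdict
from datetime import datetime

def tumbling_window_counts(events, window_seconds):
    """Group timestamped events into fixed, non-overlapping windows and
    count events per window -- the same behavior a TumblingWindow in a
    Stream Analytics GROUP BY clause produces."""
    counts = defaultdict(int)
    for ts, _payload in events:
        # Align each event's timestamp down to the start of its window.
        epoch = ts.timestamp()
        window_start = epoch - (epoch % window_seconds)
        counts[datetime.fromtimestamp(window_start)] += 1
    return dict(counts)

# Hypothetical sensor events: the first two share a 10-second window.
events = [
    (datetime(2019, 1, 1, 0, 0, 2), "a"),
    (datetime(2019, 1, 1, 0, 0, 7), "b"),
    (datetime(2019, 1, 1, 0, 0, 12), "c"),
]
result = tumbling_window_counts(events, window_seconds=10)
```

In the real query language the same grouping would be expressed with `GROUP BY TumblingWindow(second, 10)`, with hopping and sliding windows available for overlapping buckets.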
- Design and Implement Analytics by Using Azure Data Lake (25-30%)
- Ingest data into Azure Data Lake Store
- Create an Azure Data Lake Store (ADLS) account, copy data to ADLS, secure data within ADLS by using access control, leverage end-user or service-to-service authentication appropriately, tune the performance of ADLS, access diagnostic logs
- Manage Azure Data Lake Analytics
- Create an Azure Data Lake Analytics (ADLA) account, manage users, manage data sources, manage, monitor, and troubleshoot jobs, access diagnostic logs, optimize jobs by using the vertex view, identify historical job information
- Extract and transform data by using U-SQL
- Schematize data on read at scale; generate outputter files; use the U-SQL data types, use C# and U-SQL expression language; identify major differences between T-SQL and U-SQL; perform JOINS, PIVOT, UNPIVOT, CROSS APPLY, and Windowing functions in U-SQL; share data and code through U-SQL catalog; define benefits and use of structured data in U-SQL; manage and secure the Catalog
- Extend U-SQL programmability
- Use user-defined functions, aggregators, and operators, scale out user-defined operators, call Python, R, and Cognitive capabilities, use U-SQL user-defined types, perform federated queries, share data and code across ADLA and ADLS
- Integrate Azure Data Lake Analytics with other services
- Integrate with Azure Data Factory, Azure HDInsight, Azure Data Catalog, and Azure Event Hubs, ingest data from Azure SQL Data Warehouse
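"Schematize data on read," from the U-SQL objectives above, means a file stays untyped text on disk and column types are imposed only when a U-SQL EXTRACT statement reads it. A minimal Python sketch of that schema-on-read idea (the column names and data here are hypothetical, standing in for U-SQL's own extractors):

```python
import csv
import io

# Hypothetical schema, playing the role of the column list in a U-SQL
# EXTRACT statement; the underlying file remains plain delimited text.
SCHEMA = [("user_id", int), ("name", str), ("score", float)]

def extract(raw_text, schema):
    """Impose types on delimited text at read time (schema on read)."""
    rows = []
    for record in csv.reader(io.StringIO(raw_text)):
        rows.append({name: cast(value)
                     for (name, cast), value in zip(schema, record)})
    return rows

raw = "1,alice,9.5\n2,bob,7.25\n"
rows = extract(raw, SCHEMA)
```

A different schema could be applied to the same raw file for a different job, which is exactly the flexibility schema-on-read buys over schema-on-write.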
- Design and Implement Azure SQL Data Warehouse Solutions (15-20%)
- Design tables in Azure SQL Data Warehouse
- Choose the optimal type of distribution column to optimize workflows, select a table geometry, limit data skew and process skew through the appropriate selection of distributed columns, design columnstore indexes, identify when to scale compute nodes, calculate the number of distributions for a given workload
- Query data in Azure SQL Data Warehouse
- Implement query labels, aggregate functions, create and manage statistics in distributed tables, monitor user queries to identify performance issues, change a user resource class
- Integrate Azure SQL Data Warehouse with other services
- Ingest data into Azure SQL Data Warehouse by using AZCopy, Polybase, Bulk Copy Program (BCP), Azure Data Factory, SQL Server Integration Services (SSIS), Create-Table-As-Select (CTAS), and Create-External-Table-As-Select (CETAS); export data from Azure SQL Data Warehouse; provide connection information to access Azure SQL Data Warehouse from Azure Machine Learning; leverage Polybase to access a different distributed store; migrate data to Azure SQL Data Warehouse; select the appropriate ingestion method based on business needs
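Choosing the distribution column is the central table-design decision in the objectives above: Azure SQL Data Warehouse hash-distributes a table's rows across 60 distributions, and a low-cardinality or lopsided column concentrates rows (and work) in a few of them. A small Python simulation of that skew effect (the column values are made up; CPython's hash of a small integer is the integer itself, which keeps the demo deterministic):

```python
from collections import Counter

NUM_DISTRIBUTIONS = 60  # Azure SQL Data Warehouse spreads each table across 60 distributions

def skew_report(values, num_distributions=NUM_DISTRIBUTIONS):
    """Simulate hash-distributing a candidate column and report the
    heaviest and lightest distributions; a large gap means data skew."""
    counts = Counter(hash(v) % num_distributions for v in values)
    per_dist = [counts.get(d, 0) for d in range(num_distributions)]
    return max(per_dist), min(per_dist)

# High-cardinality column: 60,000 rows spread evenly, 1,000 per distribution.
even_max, even_min = skew_report(range(60_000))

# Low-cardinality column (say, a 3-value region code): 3 hot distributions, 57 empty.
skewed_max, skewed_min = skew_report([v % 3 for v in range(60_000)])
```

The skewed case leaves 57 of the 60 distributions idle while 3 do all the work, which is why the exam emphasizes selecting a high-cardinality, evenly spread distribution column.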
- Design and Implement Cloud-Based Integration by using Azure Data Factory (15-20%)
- Implement datasets and linked services
- Implement availability for the slice, create dataset policies, configure the appropriate linked service based on the activity and the dataset
- Move, transform, and analyze data by using Azure Data Factory activities
- Copy data between on-premises and the cloud, create different activity types, extend the data factory by using custom processing steps, move data to and from Azure SQL Data Warehouse
- Orchestrate data processing by using Azure Data Factory pipelines
- Identify data dependencies and chain multiple activities, model schedules based on data dependencies, provision and run data pipelines, design a data flow
- Monitor and manage Azure Data Factory
- Identify failures and root causes, create alerts for specified conditions, perform a redeploy, use the Microsoft Azure Portal monitoring tool
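Chaining activities on their data dependencies, as in the orchestration bullets above, amounts to a topological ordering: an activity runs only after every activity whose output it consumes has finished. Azure Data Factory derives this ordering from the datasets themselves; the Python sketch below (with hypothetical activity names) just illustrates the idea:

```python
def execution_order(activities):
    """Order pipeline activities so each runs only after its dependencies,
    via a depth-first topological sort. `activities` maps an activity name
    to the list of activity names whose output it consumes."""
    order, seen = [], set()

    def visit(name):
        if name in seen:
            return
        seen.add(name)
        for dep in activities.get(name, []):
            visit(dep)          # ensure dependencies are scheduled first
        order.append(name)

    for name in activities:
        visit(name)
    return order

# Hypothetical three-activity pipeline: copy, transform, then load.
pipeline = {
    "CopyFromOnPrem": [],
    "TransformWithUSQL": ["CopyFromOnPrem"],
    "LoadWarehouse": ["TransformWithUSQL"],
}
order = execution_order(pipeline)
```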
- Manage and Maintain Azure SQL Data Warehouse, Azure Data Lake, Azure Data Factory, and Azure Stream Analytics (20-25%)
- Provision Azure SQL Data Warehouse, Azure Data Lake, Azure Data Factory, and Azure Stream Analytics
- Provision Azure SQL Data Warehouse, Azure Data Lake, and Azure Data Factory, implement Azure Stream Analytics
- Implement authentication, authorization, and auditing
- Integrate services with Azure Active Directory (Azure AD), use the local security model in Azure SQL Data Warehouse, configure firewalls, implement auditing, integrate services with Azure Data Factory
- Manage data recovery for Azure SQL Data Warehouse, Azure Data Lake, Azure Data Factory, and Azure Stream Analytics
- Backup and recover services, plan and implement geo-redundancy for Azure Storage, migrate from an on-premises data warehouse to Azure SQL Data Warehouse
- Monitor Azure SQL Data Warehouse, Azure Data Lake, and Azure Stream Analytics
- Manage concurrency, manage elastic scale for Azure SQL Data Warehouse, monitor workloads by using Dynamic Management Views (DMVs) for Azure SQL Data Warehouse, troubleshoot Azure Data Lake performance by using the Vertex Execution View
- Design and implement storage solutions for big data implementations
- Optimize storage to meet performance needs, select appropriate storage types based on business requirements, use AZCopy, Storage Explorer, and Redgate Azure Explorer to migrate data, design cloud solutions that integrate with on-premises data
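For the monitoring bullet above, the usual tool in Azure SQL Data Warehouse is the `sys.dm_pdw_exec_requests` Dynamic Management View, which lists recent and running requests. Below is a sketch of such a DMV query plus a small Python filter over its rows; the sample rows are fabricated for illustration, and in practice they would come back over a normal SQL connection:

```python
# DMV query to surface the longest-running active requests in
# Azure SQL Data Warehouse (run over any normal SQL connection).
LONG_RUNNING_SQL = """
SELECT request_id, status, total_elapsed_time, command
FROM sys.dm_pdw_exec_requests
WHERE status = 'Running'
ORDER BY total_elapsed_time DESC;
"""

def long_running(rows, threshold_ms):
    """Pick out requests whose elapsed time exceeds a threshold.
    Each row is a dict keyed by the DMV's column names."""
    return [row["request_id"] for row in rows
            if row["total_elapsed_time"] > threshold_ms]

# Fabricated sample rows standing in for a real DMV result set.
sample = [
    {"request_id": "QID101", "total_elapsed_time": 125_000},
    {"request_id": "QID102", "total_elapsed_time": 4_000},
]
slow = long_running(sample, threshold_ms=60_000)
```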
You can also view the full objectives list for the 70-776 Performing Big Data Engineering on Microsoft Cloud Services certification exam on the official 70-776 exam page at Microsoft.com.
There are not very many study and training materials designed specifically for the 70-776 Performing Big Data Engineering on Microsoft Cloud Services certification exam. For example, Microsoft Press has not yet published any Exam Ref or other guide books for this exam (at the time of this writing). However, there is still plenty of material available from various sources, both free and paid, ranging from official product documentation and other articles and videos from Microsoft, to training from companies like Opsgility and their SkillMeUp service.
Another interesting resource to utilize is the following recording of the 70-776 Cert Exam Prep session given by James Herring at Microsoft Ignite 2017:
The book “Python Data Science Handbook: Essential Tools for Working with Data,” published by O’Reilly and written by Jake VanderPlas, is available for purchase in print, and is also offered completely free as an online eBook.
Here’s the book description from the publisher:
Microsoft Connect() is a virtual conference meant to inspire developers to build the apps of the future. Today, Day 1 of Connect() started with a keynote from Scott Guthrie, Executive Vice President, Microsoft Cloud and Enterprise Group, sharing how Microsoft is empowering developers to lead the new digital revolution by creating apps that will have a profound impact on the world. In addition to the inspiration from Scott and others, Connect() includes lots of feature announcements and other great content and highlights. This post lists out some of the things announced and discussed at Microsoft Connect() 2017.
It seems there is constant news on the Microsoft certification front. Near the end of 2016, Microsoft shook up the entire MCP program with some amazing changes to how Azure certifications are integrated into the tracks, as well as the ability to renew MCSD and MCSE certifications annually with an elective exam. Now they are continuing to expand with a new certification targeting the extremely popular realm of Machine Learning. The all-new MCSA: Machine Learning certification is being added as an option to earn!