Designing and Implementing an Azure AI Solution v1.0 (AI-100)


HOTSPOT -
Your company plans to build an app that will perform the following tasks:
✑ Match a user's picture to a picture of a celebrity.
✑ Tag a scene from a movie, and then search for movie scenes by using the tags.
You need to recommend which Azure Cognitive Services APIs must be used to perform the tasks.
Which Cognitive Services API should you recommend for each task? To answer, select the appropriate options in the answer area.
NOTE: Each correct selection is worth one point.
Hot Area:



Answer :

Box 1: Computer Vision -
Azure's Computer Vision service provides developers with access to advanced algorithms that process images and return information.
Computer Vision Detect Faces: Detect faces in an image and provide information about each detected face. Computer Vision returns the rectangle coordinates, gender, and age for each detected face.
Computer Vision provides a subset of the Face service functionality. You can use the Face service for more detailed analysis, such as facial identification and pose detection.
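As an illustration, a minimal sketch of recognizing a celebrity with the Computer Vision Analyze API (v3.2, using the celebrities domain model) might look like the following; the endpoint, key, and image URL are placeholders, not values from this question:

```python
# Hypothetical sketch: identify celebrities in an image with the Computer
# Vision Analyze API. Endpoint, key, and image URL are placeholders.
import requests

ENDPOINT = "https://<your-resource>.cognitiveservices.azure.com"  # placeholder
KEY = "<your-computer-vision-key>"                                 # placeholder

url = f"{ENDPOINT}/vision/v3.2/analyze"
params = {"details": "Celebrities"}  # domain-specific celebrity model
headers = {"Ocp-Apim-Subscription-Key": KEY, "Content-Type": "application/json"}
body = {"url": "https://example.com/user-photo.jpg"}  # placeholder image URL

resp = requests.post(url, params=params, headers=headers, json=body)
resp.raise_for_status()
# Celebrity matches, when found, appear under categories[].detail.celebrities
for category in resp.json().get("categories", []):
    for celeb in category.get("detail", {}).get("celebrities", []):
        print(celeb["name"], celeb["confidence"])
```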

Box 2: Bing Video Search -
Search for videos and get comprehensive results
With Bing Video Search API v7, find videos across the web. Results provide useful metadata including creator, encoding format, video length, view count, improved & simplified paging, and more.
Incorrect Answers:
Video Indexer:
Automatically extract metadata, such as spoken words, written text, faces, speakers, celebrities, emotions, topics, brands, and scenes, from video and audio files.
Custom Vision:
Easily customize your own state-of-the-art computer vision models for your unique use case. Just upload a few labeled images and let Custom Vision Service do the hard work. With just one click, you can export trained models to be run on device or as Docker containers.
References:
https://docs.microsoft.com/en-us/azure/cognitive-services/computer-vision/home
https://azure.microsoft.com/en-us/services/cognitive-services/bing-video-search-api/

You need to evaluate trends in fuel prices during a period of 10 years. The solution must identify unusual fluctuations in prices and produce visual representations.
Which Azure Cognitive Services API should you use?

  • A. Anomaly Detector
  • B. Computer Vision
  • C. Text Analytics
  • D. Bing Autosuggest


Answer : A

Explanation:
The Anomaly Detector API enables you to monitor and detect abnormalities in your time series data with machine learning. The Anomaly Detector API adapts by automatically identifying and applying the best-fitting models to your data, regardless of industry, scenario, or data volume. Using your time series data, the API determines boundaries for anomaly detection, expected values, and which data points are anomalies.
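As an illustration, a minimal sketch of batch anomaly detection over monthly price points with the Anomaly Detector REST API might look like this; the endpoint, key, and data are placeholders (the service requires at least 12 points per request):

```python
# Hypothetical sketch: detect anomalies across an entire monthly time series
# with the Anomaly Detector API. Endpoint, key, and values are placeholders.
import requests

ENDPOINT = "https://<your-resource>.cognitiveservices.azure.com"  # placeholder
KEY = "<your-anomaly-detector-key>"                                # placeholder

# Two years of synthetic monthly prices (>= 12 points are required).
series = [
    {"timestamp": f"2018-{m:02d}-01T00:00:00Z", "value": 2.00 + 0.05 * m}
    for m in range(1, 13)
] + [
    {"timestamp": f"2019-{m:02d}-01T00:00:00Z", "value": 2.60 + 0.05 * m}
    for m in range(1, 13)
]

url = f"{ENDPOINT}/anomalydetector/v1.0/timeseries/entire/detect"
headers = {"Ocp-Apim-Subscription-Key": KEY}
resp = requests.post(url, headers=headers,
                     json={"granularity": "monthly", "series": series})
resp.raise_for_status()
# isAnomaly is a list of booleans aligned with the input points
print(resp.json()["isAnomaly"])
```
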
References:
https://docs.microsoft.com/en-us/azure/cognitive-services/anomaly-detector/overview

HOTSPOT -
Your company plans to deploy several apps that will use Azure Cognitive Services APIs.
You need to recommend which Cognitive Services APIs must be used to meet the following requirements:
✑ Must be able to identify inappropriate text and profanities in multiple languages.
✑ Must be able to interpret user requests sent by using text input.
✑ Must be able to identify named entities in text.
Which API should you recommend for each requirement? To answer, select the appropriate options in the answer area.
NOTE: Each correct selection is worth one point.
Hot Area:




Answer :

Box 1: Content Moderator -
The Azure Content Moderator API is a cognitive service that checks text, image, and video content for material that is potentially offensive, risky, or otherwise undesirable. When such material is found, the service applies appropriate labels (flags) to the content. Your app can then handle flagged content in order to comply with regulations or maintain the intended environment for users.
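As an illustration, a minimal sketch of screening text with the Content Moderator text moderation endpoint might look like this; the endpoint and key are placeholders:

```python
# Hypothetical sketch: screen text for profanity and risky content with the
# Content Moderator ProcessText/Screen operation. Endpoint/key are placeholders.
import requests

ENDPOINT = "https://<your-resource>.cognitiveservices.azure.com"  # placeholder
KEY = "<your-content-moderator-key>"                               # placeholder

url = f"{ENDPOINT}/contentmoderator/moderate/v1.0/ProcessText/Screen"
params = {"language": "eng", "classify": "True"}
headers = {"Ocp-Apim-Subscription-Key": KEY, "Content-Type": "text/plain"}

resp = requests.post(url, params=params, headers=headers,
                     data="Sample text to screen.".encode("utf-8"))
resp.raise_for_status()
result = resp.json()
# "Terms" lists matched profanity; "Classification" flags risky categories
print(result.get("Terms"), result.get("Classification"))
```
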
Box 2: Language Understanding (LUIS)
Designed to identify valuable information in conversations, LUIS interprets user goals (intents) and distills valuable information from sentences (entities), for a high quality, nuanced language model. LUIS integrates seamlessly with the Azure Bot Service, making it easy to create a sophisticated bot.
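As an illustration, a minimal sketch of querying a published LUIS app with the v3 prediction API might look like this; the endpoint, key, app ID, and query are placeholders:

```python
# Hypothetical sketch: get the top intent and entities for an utterance from
# the LUIS v3 prediction endpoint. Endpoint, key, and app ID are placeholders.
import requests

ENDPOINT = "https://<your-resource>.cognitiveservices.azure.com"  # placeholder
KEY = "<your-luis-prediction-key>"                                 # placeholder
APP_ID = "<your-luis-app-id>"                                      # placeholder

url = f"{ENDPOINT}/luis/prediction/v3.0/apps/{APP_ID}/slots/production/predict"
params = {"subscription-key": KEY, "query": "Book me a flight to Paris"}

resp = requests.get(url, params=params)
resp.raise_for_status()
prediction = resp.json()["prediction"]
print(prediction["topIntent"], prediction["entities"])
```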

Box 3: Text Analytics -
The Text Analytics API is a cloud-based service that provides advanced natural language processing over raw text, and includes four main functions: sentiment analysis, key phrase extraction, named entity recognition, and language detection.
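As an illustration, a minimal sketch of named entity recognition with the Text Analytics v3.0 REST API might look like this; the endpoint, key, and sample text are placeholders:

```python
# Hypothetical sketch: recognize named entities in text with the Text
# Analytics v3.0 API. Endpoint and key are placeholders.
import requests

ENDPOINT = "https://<your-resource>.cognitiveservices.azure.com"  # placeholder
KEY = "<your-text-analytics-key>"                                  # placeholder

url = f"{ENDPOINT}/text/analytics/v3.0/entities/recognition/general"
headers = {"Ocp-Apim-Subscription-Key": KEY}
body = {"documents": [
    {"id": "1", "language": "en",
     "text": "Microsoft was founded by Bill Gates and Paul Allen."}
]}

resp = requests.post(url, headers=headers, json=body)
resp.raise_for_status()
for doc in resp.json()["documents"]:
    for entity in doc["entities"]:
        print(entity["text"], entity["category"], entity["confidenceScore"])
```
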
References:
https://docs.microsoft.com/bs-latn-ba/azure/cognitive-services/content-moderator/overview
https://www.luis.ai/home
https://docs.microsoft.com/en-us/azure/cognitive-services/text-analytics/

You plan to perform analytics of the medical records of patients located around the world.
You need to recommend a solution that avoids storing and processing data in the cloud.
What should you include in the recommendation?

  • A. Azure Machine Learning Studio
  • B. the Text Analytics API that has container support
  • C. Azure Machine Learning services
  • D. an Apache Spark cluster that uses MMLSpark


Answer : D

Explanation:
The Microsoft Machine Learning Library for Apache Spark (MMLSpark) assists in provisioning scalable machine learning models for large datasets, especially for deep learning problems. MMLSpark works with SparkML pipelines, including Microsoft CNTK and the OpenCV library, which provide end-to-end support for the ingress and processing of image input data, categorization of images, and text analytics using pre-trained deep learning algorithms.
References:
https://subscription.packtpub.com/book/big_data_and_business_intelligence/9781789131956/10/ch10lvl1sec61/an-overview-of-the-microsoft-machine-learning-library-for-apache-spark-mmlspark

Your company has an on-premises datacenter.
You plan to publish an app that will recognize a set of individuals by using the Face API. The model is trained.
You need to ensure that all images are processed in the on-premises datacenter.
What should you deploy to host the Face API?

  • A. a Docker container
  • B. Azure File Sync
  • C. Azure Application Gateway
  • D. Azure Data Box Edge


Answer : A

Explanation:
A container is a standard unit of software that packages up code and all its dependencies so the application runs quickly and reliably from one computing environment to another. A Docker container image is a lightweight, standalone, executable package of software that includes everything needed to run an application: code, runtime, system tools, system libraries and settings.
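As an illustration, assuming the Face container image has already been pulled and started in the on-premises datacenter (listening on port 5000, with billing and EULA parameters supplied at container startup), a minimal sketch of detecting faces against the local endpoint might look like this; the address and file name are placeholders:

```python
# Hypothetical sketch: call a locally hosted Face container so image data
# never leaves the datacenter. Address and image file are placeholders.
import requests

LOCAL_ENDPOINT = "http://localhost:5000"  # assumed container address

url = f"{LOCAL_ENDPOINT}/face/v1.0/detect"
headers = {"Content-Type": "application/octet-stream"}

with open("employee.jpg", "rb") as f:  # placeholder image file
    resp = requests.post(url, headers=headers, data=f)
resp.raise_for_status()
for face in resp.json():
    print(face["faceId"], face["faceRectangle"])
```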
Incorrect Answers:
D: Azure Data Box Edge is an AI-enabled edge computing device with network data transfer capabilities.
Data Box Edge is a Hardware-as-a-Service solution: Microsoft ships you a cloud-managed device with a built-in Field-Programmable Gate Array (FPGA) that enables accelerated AI inferencing and has all the capabilities of a storage gateway.
References:
https://www.docker.com/resources/what-container

You have a Bing Search service that is used to query a product catalog.
You need to identify the following information:
✑ The locale of the query
✑ The top 50 query strings
✑ The number of calls to the service
✑ The top geographical regions of the service
What should you implement?

  • A. Bing Statistics
  • B. Azure API Management (APIM)
  • C. Azure Monitor
  • D. Azure Application Insights


Answer : A

Explanation:
The Bing Statistics add-in provides metrics such as call volume, top queries, API response, code distribution, and market distribution. The rich slicing-and-dicing capability lets you gain a deeper understanding of your users and their usage to inform your business strategy.
References:
https://www.bingapistatistics.com/

You have a Face API solution that updates in real time. A pilot of the solution runs successfully on a small dataset.
When you attempt to use the solution on a larger dataset that continually changes, the performance degrades and the time it takes to recognize existing faces increases.
You need to recommend changes to reduce the time it takes to recognize existing faces without increasing costs.
What should you recommend?

  • A. Change the solution to use the Computer Vision API instead of the Face API.
  • B. Separate training into an independent pipeline and schedule the pipeline to run daily.
  • C. Change the solution to use the Bing Image Search API instead of the Face API.
  • D. Distribute the face recognition inference process across many Azure Cognitive Services instances.


Answer : B

Incorrect Answers:
A: The purpose of Computer Vision is to inspect each image associated with an incoming article to (1) scrape out written words from the image and (2) determine what types of objects are present in the image.
C: The Bing API provides an experience similar to Bing.com/search by returning search results that Bing determines are relevant to a user's query. The results include Web pages and may also include images, videos, and more.
D: That would increase cost.
References:
https://github.com/Azure/cognitive-services

HOTSPOT -
You plan to create a bot that will support five languages. The bot will be used by users located in three different countries. The bot will answer common customer questions. The bot will use Language Understanding (LUIS) to identify which skill to use and to detect the language of the customer.
You need to identify the minimum number of Azure resources that must be created for the planned bot.
How many QnA Maker, LUIS and Text Analytics instances should you create? To answer, select the appropriate options in the answer area.
NOTE: Each correct selection is worth one point.
Hot Area:




Answer :

Explanation:

QnA Maker: 5 -
If the user plans to support multiple languages, they need to have a new QnA Maker resource for each language.

LUIS: 5 -
If you need a multi-language LUIS client application such as a chatbot, you have a few options. If LUIS supports all the languages, you develop a LUIS app for each language. Each LUIS app has a unique app ID, and endpoint log. If you need to provide language understanding for a language LUIS does not support, you can use Microsoft Translator API to translate the utterance into a supported language, submit the utterance to the LUIS endpoint, and receive the resulting scores.
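As an illustration of the translate-then-LUIS fallback described above, a minimal sketch using the Translator Text v3 API might look like this; the key, region, and utterance are placeholders:

```python
# Hypothetical sketch: translate an unsupported-language utterance to English
# before submitting it to the LUIS endpoint. Key and region are placeholders.
import requests

TRANSLATOR_KEY = "<your-translator-key>"  # placeholder
TRANSLATOR_REGION = "<your-region>"       # placeholder, e.g. "westeurope"

url = "https://api.cognitive.microsofttranslator.com/translate"
params = {"api-version": "3.0", "to": "en"}
headers = {
    "Ocp-Apim-Subscription-Key": TRANSLATOR_KEY,
    "Ocp-Apim-Subscription-Region": TRANSLATOR_REGION,
}
body = [{"text": "Réserve-moi un vol pour Paris"}]  # placeholder utterance

resp = requests.post(url, params=params, headers=headers, json=body)
resp.raise_for_status()
english_utterance = resp.json()[0]["translations"][0]["text"]
# english_utterance can now be sent to the LUIS prediction endpoint
print(english_utterance)
```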

Language detection: 1 -
The Language Detection feature of the Azure Text Analytics REST API evaluates text input for each document and returns language identifiers with a score that indicates the strength of the analysis.
This capability is useful for content stores that collect arbitrary text, where language is unknown. You can parse the results of this analysis to determine which language is used in the input document. The response also returns a score that reflects the confidence of the model. The score value is between 0 and 1.
The Language Detection feature can detect a wide range of languages, variants, dialects, and some regional or cultural languages. The exact list of languages for this feature isn't published.
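As an illustration, a minimal sketch of language detection with the Text Analytics v3.0 REST API might look like this; the endpoint, key, and sample text are placeholders:

```python
# Hypothetical sketch: detect the language of a document with the Text
# Analytics Language Detection feature. Endpoint and key are placeholders.
import requests

ENDPOINT = "https://<your-resource>.cognitiveservices.azure.com"  # placeholder
KEY = "<your-text-analytics-key>"                                  # placeholder

url = f"{ENDPOINT}/text/analytics/v3.0/languages"
headers = {"Ocp-Apim-Subscription-Key": KEY}
body = {"documents": [
    {"id": "1", "text": "Hola, necesito ayuda con mi pedido."}
]}

resp = requests.post(url, headers=headers, json=body)
resp.raise_for_status()
lang = resp.json()["documents"][0]["detectedLanguage"]
print(lang["iso6391Name"], lang["confidenceScore"])  # e.g. "es", 1.0
```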
Reference:
https://docs.microsoft.com/en-us/azure/cognitive-services/qnamaker/overview/language-support
https://docs.microsoft.com/en-us/azure/cognitive-services/luis/luis-language-support
https://docs.microsoft.com/en-us/azure/cognitive-services/text-analytics/how-tos/text-analytics-how-to-language-detection

You have a database that contains sales data.
You plan to process the sales data by using two data streams named Stream1 and Stream2. Stream1 will be used for purchase order data. Stream2 will be used for reference data.
The reference data is stored in CSV files.
You need to recommend an ingestion solution for each data stream.
Which two solutions should you recommend? Each correct answer presents a complete solution.
NOTE: Each correct selection is worth one point.

  • A. an Azure event hub for Stream1 and Azure Blob storage for Stream2
  • B. Azure Blob storage for Stream1 and Stream2
  • C. an Azure event hub for Stream1 and Stream2
  • D. Azure Blob storage for Stream1 and Azure Cosmos DB for Stream2
  • E. Azure Cosmos DB for Stream1 and an Azure event hub for Stream2


Answer : AB

Explanation:

Stream1: Azure Event Hub -

Stream2: Blob Storage -
Azure Event Hubs is a highly scalable data streaming platform and event ingestion service, capable of receiving and processing millions of events per second.
Event Hubs can process and store events, data, or telemetry produced by distributed software and devices. Data sent to an event hub can be transformed and stored using any real-time analytics provider or batching/storage adapters. Event Hubs provides publish-subscribe capabilities with low latency at massive scale, which makes it appropriate for big data scenarios.
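As an illustration, a minimal sketch of sending purchase-order events (Stream1) to an event hub with the azure-eventhub Python SDK might look like this; the connection string, hub name, and payload are placeholders:

```python
# Hypothetical sketch: publish a purchase-order event to Azure Event Hubs.
# Connection string and event hub name are placeholders.
from azure.eventhub import EventHubProducerClient, EventData

producer = EventHubProducerClient.from_connection_string(
    conn_str="<your-event-hubs-connection-string>",  # placeholder
    eventhub_name="purchase-orders",                 # placeholder
)

with producer:
    batch = producer.create_batch()                       # size-aware batch
    batch.add(EventData('{"orderId": 1, "amount": 42.50}'))
    producer.send_batch(batch)                            # send atomically
```
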

Stream1 and Stream2: Blob Storage -
Stream Analytics has first-class integration with Azure data streams as inputs from three kinds of resources:

Azure Event Hubs -

Azure IoT Hub -

Azure Blob storage -
These input resources can live in the same Azure subscription as your Stream Analytics job or a different subscription.
References:
https://docs.microsoft.com/en-us/azure/architecture/data-guide/technology-choices/real-time-ingestion

Note: This question is part of a series of questions that present the same scenario. Each question in the series contains a unique solution that might meet the stated goals. Some question sets might have more than one correct solution, while others might not have a correct solution.
After you answer a question, you will NOT be able to return to it. As a result, these questions will not appear in the review screen.
You are developing an application that uses an Azure Kubernetes Service (AKS) cluster.
You are troubleshooting a node issue.
You need to connect to an AKS node by using SSH.
Solution: You create a managed identity for AKS, and then you create an SSH connection.
Does this meet the goal?

  • A. Yes
  • B. No


Answer : B

Explanation:
Instead, add an SSH key to the node, and then create an SSH connection.
References:
https://docs.microsoft.com/en-us/azure/aks/ssh

Note: This question is part of a series of questions that present the same scenario. Each question in the series contains a unique solution that might meet the stated goals. Some question sets might have more than one correct solution, while others might not have a correct solution.
After you answer a question, you will NOT be able to return to it. As a result, these questions will not appear in the review screen.
You are developing an application that uses an Azure Kubernetes Service (AKS) cluster.
You are troubleshooting a node issue.
You need to connect to an AKS node by using SSH.
Solution: You change the permissions of the AKS resource group, and then you create an SSH connection.
Does this meet the goal?

  • A. Yes
  • B. No


Answer : B

Explanation:
Instead, add an SSH key to the node, and then create an SSH connection.
References:
https://docs.microsoft.com/en-us/azure/aks/ssh

Note: This question is part of a series of questions that present the same scenario. Each question in the series contains a unique solution that might meet the stated goals. Some question sets might have more than one correct solution, while others might not have a correct solution.
After you answer a question, you will NOT be able to return to it. As a result, these questions will not appear in the review screen.
You are developing an application that uses an Azure Kubernetes Service (AKS) cluster.
You are troubleshooting a node issue.
You need to connect to an AKS node by using SSH.
Solution: You add an SSH key to the node, and then you create an SSH connection.
Does this meet the goal?

  • A. Yes
  • B. No


Answer : A

Explanation:
By default, SSH keys are generated when you create an AKS cluster. If you did not specify your own SSH keys when you created your AKS cluster, add your public SSH keys to the AKS nodes.
You also need to create an SSH connection to the AKS node.
References:
https://docs.microsoft.com/en-us/azure/aks/ssh

You are developing a Computer Vision application.
You plan to use a workflow that will load data from an on-premises database to Azure Blob storage, and then connect to an Azure Machine Learning service.
What should you use to orchestrate the workflow?

  • A. Azure Kubernetes Service (AKS)
  • B. Azure Pipelines
  • C. Azure Data Factory
  • D. Azure Container Instances


Answer : C

Explanation:
With Azure Data Factory, you can use workflows to orchestrate data integration and data transformation processes at scale.
You can build data integration pipelines, and easily combine big data processing and machine learning, by using the visual interface.
References:
https://azure.microsoft.com/en-us/services/data-factory/

DRAG DROP -
You are designing an AI solution that will use IoT devices to gather data from conference attendees, and then later analyze the data. The IoT devices will connect to an Azure IoT hub.
You need to design a solution to anonymize the data before the data is sent to the IoT hub.
Which three actions should you perform in sequence? To answer, move the appropriate actions from the list of actions to the answer area and arrange them in the correct order.
Select and Place:




Answer :

Explanation:
Step 1: Create a storage container
ASA Edge jobs run in containers deployed to Azure IoT Edge devices.
Step 2: Create an Azure Stream Analytics Edge Job
Azure Stream Analytics (ASA) on IoT Edge empowers developers to deploy near-real-time analytical intelligence closer to IoT devices so that they can unlock the full value of device-generated data.
Step 3: Add the job to the IoT Edge devices in IoT Hub
References:
https://docs.microsoft.com/en-us/azure/stream-analytics/stream-analytics-edge

HOTSPOT -
You are designing a solution that will ingest data from an Azure IoT Edge device, preprocess the data in Azure Machine Learning, and then move the data to Azure HDInsight for further processing.
What should you include in the solution? To answer, select the appropriate options in the answer area.
NOTE: Each correct selection is worth one point.
Hot Area:




Answer :

Explanation:

Box 1: Export Data -
Use the Export data to Hive option of the Export Data module in Azure Machine Learning Studio. This option is useful when you are working with very large datasets and want to save your machine learning experiment data to a Hadoop cluster or HDInsight distributed storage.

Box 2: Apache Hive -
Apache Hive is a data warehouse system for Apache Hadoop. Hive enables data summarization, querying, and analysis of data. Hive queries are written in HiveQL, which is a query language similar to SQL.

Box 3: Azure Data Lake -
Default storage for the HDFS file system of HDInsight clusters can be associated with either an Azure Storage account or an Azure Data Lake Storage account.
References:
https://docs.microsoft.com/en-us/azure/machine-learning/studio-module-reference/export-to-hive-query
https://docs.microsoft.com/en-us/azure/hdinsight/hadoop/hdinsight-use-hive
