Designing and Implementing Cloud-Native Applications Using Microsoft Azure Cosmos DB v1.0 (DP-420)


You have an Azure Cosmos DB Core (SQL) API account.
The change feed is enabled on a container named invoice.
You create an Azure function that has a trigger on the change feed.
What is received by the Azure function?

  • A. only the changed properties and the system-defined properties of the updated items
  • B. only the partition key and the changed properties of the updated items
  • C. all the properties of the original items and the updated items
  • D. all the properties of the updated items


Answer : D

The change feed contains the most recent version of each changed item, including all of its properties together with the system-defined properties (such as _ts); it is not a diff limited to the changed properties.
Change feed is available for each logical partition key within the container, and it is sorted by the order of modification within each logical partition key value.
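
For illustration, a minimal sketch of such a trigger using the Azure Functions Python v2 programming model (extension bundle 4.x). The database name, lease container, and connection setting name are assumptions:

```python
import logging

import azure.functions as func

app = func.FunctionApp()

@app.cosmos_db_trigger(
    arg_name="documents",
    database_name="db1",                      # hypothetical database name
    container_name="invoice",
    connection="CosmosDbConnection",          # hypothetical app setting
    lease_container_name="leases",
    create_lease_container_if_not_exists=True,
)
def on_invoice_change(documents: func.DocumentList) -> None:
    for doc in documents:
        # Each entry is the latest full version of a changed item: every
        # user-defined property plus system properties such as _ts and _etag.
        logging.info("Changed invoice id=%s", doc.get("id"))
```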
Reference:
https://docs.microsoft.com/en-us/azure/cosmos-db/change-feed

DRAG DROP -
You have an Azure Synapse Analytics workspace named workspace1 that contains a serverless SQL pool.
You have an Azure Table Storage account that stores operational data.
You need to replace the Table storage account with Azure Cosmos DB Core (SQL) API. The solution must meet the following requirements:
✑ Support queries from the serverless SQL pool.
✑ Only pay for analytical compute when running queries.
✑ Ensure that analytical processes do NOT affect operational processes.
Which three actions should you perform in sequence? To answer, move the appropriate actions from the list of actions to the answer area and arrange them in the correct order.
Select and Place:



Answer :

Step 1: Create an Azure Cosmos DB Core (SQL) API account.
Step 2: Enable Azure Synapse Link.
Synapse Link creates a tight, seamless integration between Azure Cosmos DB and Azure Synapse Analytics. A serverless SQL pool lets you query and analyze data in Azure Cosmos DB containers that are enabled for Azure Synapse Link, in near real time and without affecting the performance of your transactional workloads, and you pay only for the queries you run.
Step 3: Create a database and a container that has the analytical store enabled.
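
As a sketch of step 3, the azure-cosmos Python SDK can create an analytical-store-enabled container once Synapse Link is enabled on the account. The endpoint, key, and database/container names are assumptions:

```python
from azure.cosmos import CosmosClient, PartitionKey

client = CosmosClient(ACCOUNT_ENDPOINT, ACCOUNT_KEY)  # hypothetical credentials
database = client.create_database_if_not_exists("operationaldata")

# analytical_storage_ttl=-1 retains data in the analytical store indefinitely;
# the transactional (operational) store is unaffected, and serverless SQL pool
# queries run against the analytical store only.
database.create_container_if_not_exists(
    id="operations",
    partition_key=PartitionKey(path="/id"),
    analytical_storage_ttl=-1,
)
```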
Reference:
https://docs.microsoft.com/en-us/azure/cosmos-db/configure-synapse-link

You have a database named db1 in an Azure Cosmos DB Core (SQL) API account named account1.
You need to write JSON data to db1 by using Azure Stream Analytics. The solution must minimize costs.
What should you do before you can use db1 as an output of Stream Analytics?

  • A. In account1, add a private endpoint
  • B. In db1, create containers that have a custom indexing policy and analytical store disabled
  • C. In db1, create containers that have an automatic indexing policy and analytical store enabled
  • D. In account1, enable a dedicated gateway


Answer : B

Stream Analytics does not create the target container; the containers must already exist in db1 before db1 can be used as an output. To minimize costs, create the containers with a custom indexing policy that indexes only the paths your queries need, which reduces the RU charge of each write, and leave the analytical store disabled, which avoids analytical storage charges.
Using Azure Cosmos DB as an output in Stream Analytics prompts for the following information.

Field: Description
Output alias: An alias to refer to this output in your Stream Analytics query.
Subscription: The Azure subscription.
Account ID: The name or endpoint URI of the Azure Cosmos DB account.
Etc.
Incorrect:
Not A: A private endpoint provides private network connectivity (for example, from a VPN); it is not a prerequisite for using db1 as a Stream Analytics output.
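
A minimal sketch of creating such a container with the azure-cosmos Python SDK; the container name, partition key, and indexed path are assumptions:

```python
from azure.cosmos import CosmosClient, PartitionKey

client = CosmosClient(ACCOUNT_ENDPOINT, ACCOUNT_KEY)  # hypothetical credentials
db = client.get_database_client("db1")

# Index only the paths the workload queries and exclude everything else,
# lowering the RU charge of each write from Stream Analytics.
indexing_policy = {
    "indexingMode": "consistent",
    "includedPaths": [{"path": "/deviceId/?"}],
    "excludedPaths": [{"path": "/*"}],
}

# Omitting analytical_storage_ttl leaves the analytical store disabled,
# avoiding analytical storage charges.
db.create_container_if_not_exists(
    id="telemetry",
    partition_key=PartitionKey(path="/deviceId"),
    indexing_policy=indexing_policy,
)
```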
Reference:
https://docs.microsoft.com/en-us/azure/stream-analytics/stream-analytics-documentdb-output

You have a database named db1 in an Azure Cosmos DB Core (SQL) API account.
You have a third-party application that is exposed through a REST API.
You need to migrate data from the application to a container in db1 on a weekly basis.
What should you use?

  • A. Azure Migrate
  • B. Azure Data Factory
  • C. Database Migration Assistant


Answer : B

You can use the Copy activity in Azure Data Factory to copy data from and to Azure Cosmos DB (SQL API).
The Azure Cosmos DB (SQL API) connector is supported for the following activities:
  • Copy activity, with a supported source/sink matrix
  • Mapping data flow
  • Lookup activity
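
In the pipeline itself you would configure a REST source, a Cosmos DB sink, and a weekly trigger rather than write code, but conceptually the Copy activity automates something like the following sketch (the REST endpoint and container name are hypothetical):

```python
import requests
from azure.cosmos import CosmosClient

client = CosmosClient(ACCOUNT_ENDPOINT, ACCOUNT_KEY)  # hypothetical credentials
container = client.get_database_client("db1").get_container_client("imported")

# Pull records from the third-party REST API and upsert them into db1.
for record in requests.get("https://api.example.com/records").json():
    container.upsert_item(record)  # each record must carry an "id" property
```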
Incorrect:
Not A: Azure Migrate provides a centralized hub to assess and migrate on-premises servers, infrastructure, applications, and data to Azure. It assesses on-premises databases and migrates them to Azure SQL Database or to SQL Managed Instance; it does not copy data from a REST API.
Not C: Data Migration Assistant (DMA) enables you to upgrade to a modern data platform by detecting compatibility issues that can impact database functionality on your new version of SQL Server. It recommends performance and reliability improvements for your target environment.
Reference:
https://docs.microsoft.com/en-us/azure/data-factory/connector-azure-cosmos-db

HOTSPOT -
You have an Apache Spark pool in Azure Synapse Analytics that runs the following Python code in a notebook.

For each of the following statements, select Yes if the statement is true. Otherwise, select No.
NOTE: Each correct selection is worth one point.
Hot Area:



Answer :

Box 1: No -
Append output mode is an outputMode in which only the new rows in the streaming DataFrame/Dataset are written to the sink. This is the default mode; use outputMode("append") when you want to output only new rows to the sink.
Note:
Complete output mode is an outputMode in which all the rows in the streaming DataFrame/Dataset are written to the sink every time there are updates.
Update output mode is an outputMode in which only the rows that were updated in the streaming DataFrame/Dataset are written to the sink every time there are updates.

Box 2: No -
Structured Streaming is a scalable and fault-tolerant stream processing engine built on the Spark SQL engine. You can express your streaming computation the same way you would express a batch computation on static data. The Spark SQL engine will take care of running it incrementally and continuously and updating the final result as streaming data continues to arrive.

Box 3: Yes -
Synapse Apache Spark also allows you to ingest data into Azure Cosmos DB. It is important to note that data is always ingested into Azure Cosmos DB containers through the transactional store. When Synapse Link is enabled, any new inserts, updates, and deletes are then automatically synced to the analytical store.
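
A minimal PySpark sketch of such an ingestion from a Synapse notebook, following the pattern in the Synapse Link documentation; the linked service and container names are assumptions, and streaming_df is an existing streaming DataFrame:

```python
# Writes always land in the transactional store; with Synapse Link enabled,
# changes are then synced automatically to the analytical store.
(streaming_df.writeStream
    .format("cosmos.oltp")
    .outputMode("append")                     # write only new rows to the sink
    .option("spark.synapse.linkedService", "CosmosDbLinkedService")
    .option("spark.cosmos.container", "telemetry")
    .option("checkpointLocation", "/tmp/streaming-checkpoints")
    .start())
```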
Reference:
https://sparkbyexamples.com/spark/spark-streaming-outputmode/
https://spark.apache.org/docs/latest/structured-streaming-programming-guide.html
https://docs.microsoft.com/en-us/azure/synapse-analytics/synapse-link/how-to-query-analytical-store-spark

Note: This question is part of a series of questions that present the same scenario. Each question in the series contains a unique solution that might meet the stated goals. Some question sets might have more than one correct solution, while others might not have a correct solution.
After you answer a question in this section, you will NOT be able to return to it. As a result, these questions will not appear in the review screen.
You have a container named container1 in an Azure Cosmos DB Core (SQL) API account.
You need to make the contents of container1 available as reference data for an Azure Stream Analytics job.
Solution: You create an Azure function to copy data to another Azure Cosmos DB Core (SQL) API container.
Does this meet the goal?

  • A. Yes
  • B. No


Answer : B

Instead: Copy the contents of container1 to a store that Stream Analytics supports for reference data, such as Azure Blob Storage, for example by using an Azure Data Factory or Azure Synapse pipeline. Stream Analytics reference data inputs must come from Azure Blob Storage, Azure Data Lake Storage Gen2, or Azure SQL Database; an Azure Cosmos DB container cannot be used directly as reference data.
Note: The Azure Cosmos DB change feed is a mechanism to get a continuous and incremental feed of records from an Azure Cosmos container as those records are created or modified. Change feed support works by listening to the container for any changes. It then outputs the sorted list of documents that were changed in the order in which they were modified.
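
A minimal sketch of the kind of copy such a pipeline performs, using the azure-cosmos and azure-storage-blob SDKs; the credentials, storage container, and blob path are assumptions:

```python
import json

from azure.cosmos import CosmosClient
from azure.storage.blob import BlobClient

cosmos = CosmosClient(COSMOS_ENDPOINT, COSMOS_KEY)  # hypothetical credentials
source = cosmos.get_database_client("db1").get_container_client("container1")

# Snapshot the container; Stream Analytics then reads the blob as reference data.
snapshot = list(source.read_all_items())

blob = BlobClient.from_connection_string(
    STORAGE_CONNECTION_STRING,               # hypothetical connection string
    container_name="reference",
    blob_name="container1/snapshot.json",
)
blob.upload_blob(json.dumps(snapshot), overwrite=True)
```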
Reference:
https://docs.microsoft.com/en-us/azure/stream-analytics/stream-analytics-use-reference-data
https://docs.microsoft.com/en-us/azure/cosmos-db/sql/changefeed-ecommerce-solution

You develop an application that uses Azure Cosmos DB Core (SQL) API.

You create an Azure pipeline to build and deploy the application.

You need to change the pipeline to run integration tests that you wrote for the application. The solution must execute entirely in the pipeline.

What should you add to the pipeline?

  • A. a deployment group named Cosmos DB testing
  • B. an Azure Cosmos DB Emulator task
  • C. a NuGet service connection that uses an Azure Cosmos DB API key
  • D. a secret variable that has a connection string to an Azure Cosmos DB database


Answer : B
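
The Azure Cosmos DB Emulator task starts a local emulator inside the pipeline, so the integration tests never leave the build agent. A minimal pytest-style sketch of such a test; the database, container, and environment variable names are assumptions:

```python
import os
import uuid

from azure.cosmos import CosmosClient, PartitionKey

# The emulator listens on a well-known local endpoint with a documented fixed key.
EMULATOR_ENDPOINT = "https://localhost:8081"


def test_item_round_trip():
    # If TLS fails against the emulator's self-signed certificate, import the
    # emulator certificate into the agent's trust store first.
    client = CosmosClient(EMULATOR_ENDPOINT, os.environ["COSMOS_EMULATOR_KEY"])
    db = client.create_database_if_not_exists("testdb")
    container = db.create_container_if_not_exists(
        id="items", partition_key=PartitionKey(path="/pk")
    )
    item = {"id": str(uuid.uuid4()), "pk": "p1", "value": 42}
    container.create_item(item)
    assert container.read_item(item["id"], partition_key="p1")["value"] == 42
```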

You plan to create an operational system that will store data in an Azure Cosmos DB Core (SQL) API account.

You need to configure the account to meet the following requirements:

• Support Spark queries.
• Support the analysis of data from the last six months.
• Only pay for analytical compute when running queries.

Which three actions should you perform? Each correct answer presents part of the solution.

NOTE: Each correct selection is worth one point.

  • A. Enable Azure Synapse Link for the account.
  • B. Create a container and set the analyticalTTL property to six months.
  • C. Create an Azure Databricks notebook.
  • D. Create an Azure Synapse linked service.
  • E. Create a container and set the time to live to six months.
  • F. Create an Azure Synapse pipeline.


Answer : ABD

Enable Azure Synapse Link for the account (A), create a container with the analyticalTTL property set to six months so that the analytical store retains only the last six months of data (B), and create an Azure Synapse linked service so that a Synapse Spark pool, which is billed only while it runs, can query the analytical store (D). Azure Databricks cannot query the Azure Cosmos DB analytical store, so a Databricks notebook does not meet the requirements.
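
The analyticalTTL value is expressed in seconds; six months is roughly 180 days. A minimal sketch (database and container names are assumptions):

```python
from azure.cosmos import CosmosClient, PartitionKey

SIX_MONTHS_IN_SECONDS = 180 * 24 * 60 * 60  # 15,552,000

client = CosmosClient(ACCOUNT_ENDPOINT, ACCOUNT_KEY)  # hypothetical credentials
db = client.create_database_if_not_exists("ops")

# The analytical store keeps roughly the last six months of data, while the
# transactional store and operational workloads are unaffected.
db.create_container_if_not_exists(
    id="events",
    partition_key=PartitionKey(path="/id"),
    analytical_storage_ttl=SIX_MONTHS_IN_SECONDS,
)
```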

You have an Azure Cosmos DB account named account1.

You need to access account1 from an on-premises environment by using a Site-to-Site VPN.

What should you use?

  • A. a private endpoint
  • B. a dedicated gateway
  • C. Azure Synapse Link


Answer : A

A private endpoint gives account1 a private IP address inside a virtual network, which can be reached from the on-premises environment over the Site-to-Site VPN.

HOTSPOT
-

You plan to create an Azure Cosmos DB database named db1 that will contain two containers. One of the containers will contain blog posts, and the other will contain users. Each item in the blog post container will include:

• A single blog post
• All the comments associated with the blog post
• The names of the users who created the blog post and added the comments

You need to design a solution to update usernames in the user container without causing data integrity issues. The solution must minimize administrative and development effort.

What should you include in the solution? To answer, select the appropriate options in the answer area.

NOTE: Each correct selection is worth one point.



Answer :

Note: This question is part of a series of questions that present the same scenario. Each question in the series contains a unique solution that might meet the stated goals. Some question sets might have more than one correct solution, while others might not have a correct solution.

After you answer a question in this section, you will NOT be able to return to it. As a result, these questions will not appear in the review screen.

You have a container named container1 in an Azure Cosmos DB for NoSQL account.

You need to make the contents of container1 available as reference data for an Azure Stream Analytics job.

Solution: You create an Azure function to copy data to another Azure Cosmos DB for NoSQL container.

Does this meet the goal?

  • A. Yes
  • B. No


Answer : B

The following is a sample of a document in orders.



The orders container uses customerId as the partition key.

You need to provide a report of the total items ordered per month by item type. The solution must meet the following requirements:

• Ensure that the report can run as quickly as possible.
• Minimize the consumption of request units (RUs).

What should you do?

  • A. Configure the report to query orders by using a SQL query.
  • B. Configure the report to query a new materialized view. Populate the view by using the change feed.
  • C. Configure the report to query orders by using a SQL query through a dedicated gateway.
  • D. Configure the report to query a new materialized view. Populate the view by using SQL queries that run daily.


Answer : B

A materialized view that is kept current by the change feed precomputes the monthly totals per item type, so the report reads a handful of small items instead of running a cross-partition aggregate over orders (which is partitioned by customerId). That makes the report both faster and cheaper in RUs, and the change feed keeps the view more current than daily SQL queries would.
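
A minimal sketch of maintaining such a view with an Azure Functions change feed trigger (Python v2 model) and the azure-cosmos SDK. The view container, its /pk partition key, the id scheme, and the order document fields (orderDate, items, itemType, quantity) are assumptions, and a production version would need etag-based retries for concurrent updates:

```python
import azure.functions as func
from azure.cosmos import CosmosClient
from azure.cosmos.exceptions import CosmosResourceNotFoundError

app = func.FunctionApp()
view = (
    CosmosClient(ACCOUNT_ENDPOINT, ACCOUNT_KEY)    # hypothetical credentials
    .get_database_client("db1")
    .get_container_client("ordersByMonthAndType")  # view container, pk path /pk
)

@app.cosmos_db_trigger(
    arg_name="orders",
    database_name="db1",                  # hypothetical database name
    container_name="orders",
    connection="CosmosDbConnection",      # hypothetical app setting
    lease_container_name="leases",
    create_lease_container_if_not_exists=True,
)
def update_view(orders: func.DocumentList) -> None:
    for order in orders:
        for line in order.get("items", []):
            # One view item per month and item type, e.g. "2024-05|widget".
            key = f"{order['orderDate'][:7]}|{line['itemType']}"
            try:
                doc = view.read_item(key, partition_key=key)
            except CosmosResourceNotFoundError:
                doc = {"id": key, "pk": key, "totalItems": 0}
            doc["totalItems"] += line.get("quantity", 1)
            view.upsert_item(doc)
```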

Note: This question is part of a series of questions that present the same scenario. Each question in the series contains a unique solution that might meet the stated goals. Some question sets might have more than one correct solution, while others might not have a correct solution.

After you answer a question in this section, you will NOT be able to return to it. As a result, these questions will not appear in the review screen.

You have a container named container1 in an Azure Cosmos DB Core (SQL) API account.

You need to make the contents of container1 available as reference data for an Azure Stream Analytics job.

Solution: You create an Azure function that uses Azure Cosmos DB for NoSQL change feed as a trigger and Azure event hub as the output.

Does this meet the goal?

  • A. Yes
  • B. No


Answer : B

Azure Event Hubs is a streaming input for Stream Analytics, not a supported source of reference data. Reference data must come from Azure Blob Storage, Azure Data Lake Storage Gen2, or Azure SQL Database.

Note: This question is part of a series of questions that present the same scenario. Each question in the series contains a unique solution that might meet the stated goals. Some question sets might have more than one correct solution, while others might not have a correct solution.

After you answer a question in this section, you will NOT be able to return to it. As a result, these questions will not appear in the review screen.

You have a container named container1 in an Azure Cosmos DB Core (SQL) API account.

You need to make the contents of container1 available as reference data for an Azure Stream Analytics job.

Solution: You create an Azure Synapse pipeline that uses Azure Cosmos DB for NoSQL as the input and Azure Blob Storage as the output.

Does this meet the goal?

  • A. Yes
  • B. No


Answer : A

Azure Blob Storage is a supported source of reference data for Stream Analytics, and a Synapse pipeline can copy the contents of container1 from Azure Cosmos DB to Blob Storage.

Note: This question is part of a series of questions that present the same scenario. Each question in the series contains a unique solution that might meet the stated goals. Some question sets might have more than one correct solution, while others might not have a correct solution.

After you answer a question in this section, you will NOT be able to return to it. As a result, these questions will not appear in the review screen.

You have a container named container1 in an Azure Cosmos DB Core (SQL) API account.

You need to make the contents of container1 available as reference data for an Azure Stream Analytics job.

Solution: You create an Azure Data Factory pipeline that uses Azure Cosmos DB for NoSQL as the input and Azure Blob Storage as the output.

Does this meet the goal?

  • A. Yes
  • B. No


Answer : A

Azure Blob Storage is a supported source of reference data for Stream Analytics, and an Azure Data Factory pipeline can copy the contents of container1 from Azure Cosmos DB to Blob Storage.
