Developing SQL Data Models v7.0 (70-768)


You need to configure the server to optimize the afternoon report generation based on the
OrderAnalysis cube.
Which property should you configure?

  • A. LowMemoryLimit
  • B. VertiPaqPagingPolicy
  • C. TotalMemoryLimit
  • D. VirtualMemoryLimit


Answer : A

Explanation:
LowMemoryLimit: For multidimensional instances, a lower threshold at which the server first begins releasing memory allocated to infrequently used objects.
From scenario: Reports that are generated based on data from the OrderAnalysis cube take more time to complete when they are generated in the afternoon each day. You examine the server and observe that it is under significant memory pressure.
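This property is normally changed in the Analysis Server Properties dialog in SQL Server Management Studio, or scripted against the instance. Below is a minimal, hedged XMLA sketch of the kind of script the dialog's Script button can produce; the instance ID and the value (values from 1 to 100 are treated as a percentage of total physical memory) are illustrative assumptions, not values taken from the scenario.

<Alter AllowCreate="true" ObjectExpansion="ObjectProperties"
       xmlns="http://schemas.microsoft.com/analysisservices/2003/engine">
  <Object />
  <ObjectDefinition>
    <Server xmlns:xsd="http://www.w3.org/2001/XMLSchema"
            xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance">
      <ID>SSAS01</ID>      <!-- hypothetical instance ID -->
      <Name>SSAS01</Name>
      <ServerProperties>
        <ServerProperty>
          <!-- Lower threshold at which the server begins releasing memory -->
          <Name>Memory\LowMemoryLimit</Name>
          <Value>60</Value>      <!-- assumed example value (percent) -->
        </ServerProperty>
      </ServerProperties>
    </Server>
  </ObjectDefinition>
</Alter>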

You need to resolve the issues that the users report.
Which processing options should you use? To answer, drag the appropriate processing option to the correct location or locations. Each processing option may be used once, more than once, or not at all. You may need to drag the split bar between panes or scroll to view content.




Answer :

Explanation:


Box 1: Process Full -
When Process Full is executed against an object that has already been processed, Analysis Services drops all data in the object, and then processes the object. This kind of processing is required when a structural change has been made to an object, for example, when an attribute hierarchy is added, deleted, or renamed. (An XMLA sketch of these processing types follows below.)

Box 2: Process Default -
Detects the process state of database objects, and performs processing necessary to deliver unprocessed or partially processed objects to a fully processed state. If you change a data binding, Process Default will do a Process Full on the affected object.

Box 3: Not Process Update -
Process Update forces a re-read of data and an update of dimension attributes. Flexible aggregations and indexes on related partitions will be dropped.
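For reference, these processing types can also be issued directly as XMLA. The sketch below runs Process Full on a cube and then Process Default on its database; because the exhibit for this question is not reproduced here, the database and cube IDs are placeholders rather than objects named in the scenario.

<Batch xmlns="http://schemas.microsoft.com/analysisservices/2003/engine">
  <!-- Process Full: drop all data in the cube and rebuild it -->
  <Process>
    <Object>
      <DatabaseID>OrderAnalysisDB</DatabaseID>    <!-- placeholder -->
      <CubeID>OrderAnalysis</CubeID>              <!-- placeholder -->
    </Object>
    <Type>ProcessFull</Type>
  </Process>
  <!-- Process Default: bring any unprocessed or partially processed objects to a processed state -->
  <Process>
    <Object>
      <DatabaseID>OrderAnalysisDB</DatabaseID>
    </Object>
    <Type>ProcessDefault</Type>
  </Process>
</Batch>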

You need to create the cube processing job and the dimension processing job.
Which processing task should you use for each job? To answer, drag the appropriate processing tasks to the correct locations. Each processing task may be used once, more than once, or not at all. You may need to drag the split bar between panes or scroll to view content.




Answer :

Explanation:



Box 1: Process Data -
Processes data only without building aggregations or indexes. If there is data in the partitions, it will be dropped before re-populating the partition with source data. (See the XMLA sketch below.)

Box 2: Process Update -
Forces a re-read of data and an update of dimension attributes. Flexible aggregations and indexes on related partitions will be dropped.
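Each job would typically run an XMLA Process command, for example from a SQL Server Agent job step of type SQL Server Analysis Services Command. The hedged sketch below shows one command per job; the database, cube, and dimension IDs are placeholders because the exhibit is not reproduced here.

<!-- Cube processing job: Process Data loads data only, without building aggregations or indexes -->
<Process xmlns="http://schemas.microsoft.com/analysisservices/2003/engine">
  <Object>
    <DatabaseID>SalesDW</DatabaseID>      <!-- placeholder -->
    <CubeID>Sales</CubeID>                <!-- placeholder -->
  </Object>
  <Type>ProcessData</Type>
</Process>

<!-- Dimension processing job: Process Update re-reads data and updates dimension attributes -->
<Process xmlns="http://schemas.microsoft.com/analysisservices/2003/engine">
  <Object>
    <DatabaseID>SalesDW</DatabaseID>
    <DimensionID>Customer</DimensionID>   <!-- placeholder -->
  </Object>
  <Type>ProcessUpdate</Type>
</Process>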
References:https://docs.microsoft.com/en-us/sql/analysis-services/multidimensional-models/processing-options-and-settings-analysis-services

Topic 2, Case Study #2 -

Background -
Wide World Importers has multidimensional cubes named SalesAnalysis and
ProductSales. The SalesAnalysis cube is refreshed from a relational data warehouse. You have a Microsoft SQL Server Analysis Services instance that is configured to use tabular mode. You have a tabular data model named CustomerAnalysis.

Sales Analysis -
The SalesAnalysis cube contains a fact table named CoffeeSale loaded from a table named FactSale in the data warehouse. The time granularity within the cube is 15 minutes.
The cube is processed every night at 23:00. You determine that the fact table cannot be fully processed in the expected time. Users have reported slow query response times.
The SalesAnalysis model contains tables from a SQL Server database named SalesDB.
You set the DirectQueryMode option to DirectQuery. Data analysts access data from a cache that is up to 24 hours old. Data analysts report performance issues when they access the SalesAnalysis model.
When analyzing sales by customer, the total of all sales is shown for every customer, instead of that customer's sales value. When analyzing sales by product, the correct totals for each product are shown.

Customer Analysis -
You are redesigning the CustomerAnalysis tabular data model that will be used to analyze customer sales. You plan to add a table named CustomerPermission to the model. This table maps the Active Directory login of an employee with the CustomerId keys for all customers that the employee manages.
The CustomerAnalysis data model will contain a large amount of data and needs to be shared with other developers even if a deployment fails. Each time you deploy a change during development, processing takes a long time.
Data analysts must be able to analyze sales for financial years, financial quarters, months, and days. Many reports are based on analyzing sales by month.

Product Sales -
The ProductSales cube allows data analysts to view sales information by product, city, and time. Data analysts must be able to view ProductSales data by Year to Date (YTD) as a measure. The measure must be formatted as currency, associated with the Sales measure group, and contained in a folder named Calculations.

Requirements -
You identify the following requirements:
*Data available during normal business hours must always be up-to-date.
*Processing overhead must be minimized.
*Query response times must improve.
*All queries that access the SalesAnalysis model must be executed in near real time.

You need to configure the SalesAnalysis cube to correct the sales analysis by customer calculation.
Which four actions should you perform in sequence? To answer, move the appropriate actions from the list of actions to the answer area and arrange them in the correct order.




Answer :

Explanation:


Step 1: Open the cube editor, and open the Dimension Usage tab.
Step 2: Configure a relationship between the Customer dimension and the Sales measure group. Use Day as the granularity.
From scenario: The SalesAnalysis cube contains a fact table named CoffeeSale loaded from a table named FactSale in the data warehouse. The time granularity within the cube is
15 minutes. The cube is processed every night at 23:00. You determine that the fact table cannot be fully processed in the expected time. Users have reported slow query response times.
Step 3: Reprocess the cube.
Step 4: Deploy the project changes.

You need to configure the project option settings to minimize deployment time for the
CustomerAnalysis data model.
What should you do? To answer, select the appropriate setting from each list in the answer area.




Answer :

Explanation:


Scenario:
Box 1, Processing option: Default
Process Default detects the process state of database objects, and performs processing necessary to deliver unprocessed or partially processed objects to a fully processed state.
If you change a data binding, Process Default will do a Process Full on the affected object.
Note: The Processing Method setting controls whether the deployed objects are processed after deployment and the type of processing that will be performed. There are three processing options:
Default processing (default)
Full processing
None
Box 2, Transactional deployment: False
If this option is False, Analysis Services deploys the metadata changes in a single transaction, and deploys each processing command in its own transaction. (Both settings are stored in the project's .deploymentoptions file; see the sketch below.)
Scenario: The CustomerAnalysis data model will contain a large amount of data and needs to be shared with other developers even if a deployment fails. Each time you deploy a change during development, processing takes a long time.
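Both settings are written to the project's .deploymentoptions file when the project is built, and that file is what the Deployment Wizard/Utility reads. A minimal sketch of the relevant elements is shown below; the root element's namespace attributes and the file's other elements are omitted for brevity.

<DeploymentOptions>
  <!-- Deploy metadata in one transaction; run each processing command in its own transaction -->
  <TransactionalDeployment>false</TransactionalDeployment>
  <!-- Process Default after deployment: only unprocessed or affected objects are processed -->
  <ProcessingOption>Default</ProcessingOption>
</DeploymentOptions>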
References:https://docs.microsoft.com/en-us/sql/analysis-services/multidimensional-models/deployment-script-files-specifying-processing-options

You need to configure the CoffeeSale fact table environment.
Which four actions should you perform in sequence? To answer, move the appropriate actions from the list of actions to the answer area and arrange them in the correct order.
NOTE: More than one order of answer choices is correct. You will receive credit for any of the correct orders you select.




Answer :

Explanation:


Step 1: Partition the CoffeeSale fact table (an XMLA sketch of Steps 1 and 2 follows below).
Step 2: Set the storage mode for all partitions to HOLAP.
Partitions stored as HOLAP are smaller than the equivalent MOLAP partitions because they do not contain source data and respond faster than ROLAP partitions for queries involving summary data.
Step 3: Alter the processing job to ensure that it rearranges the partition structure each evening.
Step 4: Test that the cube meets the functional requirement for data currency and query performance.
From scenario:
Data analysts must be able to analyze sales for financial years, financial quarters, months, and days. Many reports are based on analyzing sales by month.
The SalesAnalysis cube contains a fact table named CoffeeSale loaded from a table named FactSale in the data warehouse. The time granularity within the cube is 15 minutes.
The cube is processed every night at 23:00. You determine that the fact table cannot be fully processed in the expected time. Users have reported slow query response times.
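A hedged XMLA (ASSL) sketch of Steps 1 and 2 appears below: it creates one monthly partition for the CoffeeSale measure group and sets its storage mode to HOLAP. The database, cube, measure group, and data source IDs, the partition name, and the source query are placeholders inferred from the scenario, not values taken from the exhibit.

<Create xmlns="http://schemas.microsoft.com/analysisservices/2003/engine">
  <ParentObject>
    <DatabaseID>SalesAnalysis</DatabaseID>          <!-- placeholder -->
    <CubeID>SalesAnalysis</CubeID>                  <!-- placeholder -->
    <MeasureGroupID>CoffeeSale</MeasureGroupID>
  </ParentObject>
  <ObjectDefinition>
    <Partition xmlns:xsd="http://www.w3.org/2001/XMLSchema"
               xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance">
      <ID>CoffeeSale 2017-07</ID>
      <Name>CoffeeSale 2017-07</Name>
      <Source xsi:type="QueryBinding">
        <DataSourceID>DataWarehouse</DataSourceID>  <!-- placeholder -->
        <!-- Placeholder monthly slice of the fact table -->
        <QueryDefinition>
          SELECT * FROM FactSale
          WHERE SaleDate BETWEEN '20170701' AND '20170731'
        </QueryDefinition>
      </Source>
      <!-- HOLAP: aggregations are stored in MOLAP; detail rows stay in the relational source -->
      <StorageMode>Holap</StorageMode>
    </Partition>
  </ObjectDefinition>
</Create>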
References:https://docs.microsoft.com/en-us/sql/analysis-services/multidimensional-models-olap-logical-cube-objects/partitions-partition-storage-modes-and-processing

Topic 3, Case Study #3 -

Background -
You are a developer for a Seattle-based company. The company is expanding globally.
Many company employees speak fluent Mandarin and read Simplified Chinese.
You have six tabular data models that are deployed to two instances of Microsoft SQL
Server Analysis Services (SSAS).
Users report that the query takes a long time to complete.
You are planning the disk space allocations for a new Microsoft SQL Server Analysis
Services deployment. You plan to move several relational data file databases to the new
SSAS instance. The databases require a total of 10 GB of disk space.
You also plan to deploy Cubes and Aggregations and use Object Processing. Cubes will have small fact tables and few dimension members. No unnecessary aggregations will be created. You plan to process an entire cube in a single transaction.

Data Models -
One of the data models is named CustomerSales. This data model contains eight tables.
The model includes a table named Sales that defines several measures, including a measure named PriorYearSales. The PriorYearSales measure is referenced by other measures, and is not intended to be analyzed directly by users. You must translate the metadata for all of the CustomerSales data model to Simplified Chinese. Team members from the Shanghai office assist with identifying appropriate translations.
A data model named OrderAnalysis is deployed to one of the SSAS instances. Order data is loaded into the OrderAnalysis data model as part of an overnight process. You observe that the model is not up-to-date.
The business analysis team uses a variety of client applications to issue MDX queries against OrderAnalysis. Order data must be completely up-to-date.
The OrderAnalysis model has two user-defined hierarchies.

A database named DB2 uses the InMemory query mode. Users frequently run the following query:


You need to configure SQL Server Profiler to determine why the query is performing poorly.
Which three events should you monitor on the SQL Server Profiler trace events configuration page? To answer, select the appropriate options in the answer area.



Answer :

Explanation:


By using SQL Server Profiler, you can intercept two classes of trace events from Analysis Services, DAX Query Plan and DirectQuery events, both generated by the DirectQuery engine. Here, in this scenario, we have a DAX query.
DAX Query Plan events are generated by the DAX formula engine.
By using the In-Memory mode, you store a copy of data in the xVelocity (VertiPaq) storage engine.
Figure: This is how a query is executed by using In-Memory mode.

References: Microsoft SQL Server 2012 Analysis Services, The BISM Tabular Model,
Microsoft Press (July 2012), page 331
From Scenario: Users report that the query takes a long time to complete.

A database named DB2 uses the InMemory query mode. Users frequently run the following query:


You need to reconfigure the SSAS instance that hosts DB1.
Which three actions should you perform in sequence? To answer, move the appropriate actions from the list of actions to the answer area and arrange them in the correct order.



Answer :

Explanation:


Step 1: Set the default mode for the data model to DirectQuery.
You discover that the project has been deployed with the Direct Query Mode option set to
OFF.
Step 2: Set the mode for the FactInternetSales table's partition to DirectQueryOnly.
Initially, even DirectQuery models are always created in memory. The default query mode for the workspace database is also set to DirectQuery with In-Memory. This hybrid working mode lets you use the cache of imported data for improved performance during the model design process, while validating the model against DirectQuery requirements.
From Scenario: Most queries that use the SalesAnalysis data model use data from a table named FactInternetSales that is 20 gigabyte (GB) in size. Cached data must be available for the FactInternetSales table. All queries accessing the SalesAnalysis model must be executed in near real time.
Step 3: Run Process Full for the FactInternetSales partition.
When Process Full is executed against an object that has already been processed,
Analysis Services drops all data in the object, and then processes the object. This kind of processing is required when a structural change has been made to an object, for example, when an attribute hierarchy is added, deleted, or renamed.

A database named DB2 uses the InMemory query mode. Users frequently run the following query:


You need to ensure no users see the PriorYearSales measure in the field list for the Sales table.
What should you do?

  • A. Create a perspective, and ensure that the PriorYearSales measure is not added to the perspective. Ensure that users connect to the model by using the perspective.
  • B. Set the Display Folder property for PriorYearSales to Hidden.
  • C. Remove the PriorYearSales measure from the default field set of the Sales table.
  • D. Create a role using Read permissions, and define a DAX expression to filter out the PriorYearSales measure. Add all users to the role.


Answer : A

Explanation:
Using perspectives in the data model might help you expose a subset of tables, columns, and measures that are useful for a particular type of analysis.
Usually, every user needs only a subset of data you create, and showing him or her the model through perspectives can offer a better user experience.
From scenario: The PriorYearSales measure is referenced by other measures, and is not intended to be analyzed directly by users.
References: Microsoft SQL Server 2012 Analysis Services, The BISM Tabular Model,
Microsoft Press (July 2012), page 305

Topic 4, Mix Questions -

Note: This question is part of a series of questions that present the same scenario. Each question in the series contains a unique solution. Determine whether the solution meets the stated goals.
You have a Microsoft SQL Server Analysis Services (SSAS) multidimensional database that stores customer and order data for customers in the United States only. The database contains the following objects:


You must create a KPI named Large Sales Target that uses the Traffic Light indicator to display status. The KPI must contain:

You need to create the KPI.
Solution: You set the value of the Status expression to:

Does the solution meet the goal?

  • A. Yes
  • B. No


Answer : B

Note: This question is part of a series of questions that use the same or similar answer choices. An answer choice may be correct for more than one question in the series. Each question is independent of the other questions in this series. Information and details provided in a question apply only to that question.
You administer a Microsoft SQL Server Analysis Services (SSAS) tabular model for a travel agency that specializes in vacation packages. Vacation bookings and packages are stored in a SQL Server database. You use the model as the basis for customer emails that highlight vacation packages that are currently underbooked, or projected to be underbooked.
The company plans to incorporate cruise ship vacation packages. Cruise ship vacation packages include new features such as region availability and cruise line specialties that require changes to the tabular model.
You must ensure that the tabular model reflects the new vacation packages.
You need to configure the tabular data model.
What should you do?

  • A. Ensure that DirectQuery is enabled for the model.
  • B. Ensure that DirectQuery is disabled for the model.
  • C. Ensure that the Transactional Deployment property is set to True.
  • D. Ensure that the Transactional Deployment property is set to False.
  • E. Process the model in Process Full mode.
  • F. Process the model in Process Data mode.
  • G. Process the model in Process Defrag mode.


Answer : E

Explanation:
Process Full processes an Analysis Services object and all the objects that it contains.
When Process Full is executed against an object that has already been processed,
Analysis Services drops all data in the object, and then processes the object. This kind of processing is required when a structural change has been made to an object, for example, when an attribute hierarchy is added, deleted, or renamed.

You are a business analyst for a retail company that uses a Microsoft SQL Server Analysis
Services (SSAS) multidimensional database for reporting. The database contains the following objects:


You must create a report that shows, for each month, the Internet sales for that month and the total Internet sales for the calendar year up to and including the current month.
You create the following MDX statement (Line numbers are included for reference only.):

You need to complete the MDX statement to return data for the report.
Which MDX segment should you use in line 01?

  • A.
  • B.
  • C.
  • D.


Answer : B

Explanation:
The following example returns the sum of the Measures.[Order Quantity] member, aggregated over the first eight months of calendar year 2003 that are contained in the Date dimension, from the Adventure Works cube.

WITH MEMBER [Date].[Calendar].[First8Months2003] AS
    Aggregate(
        PeriodsToDate(
            [Date].[Calendar].[Calendar Year],
            [Date].[Calendar].[Month].[August 2003]
        )
    )
SELECT
    [Date].[Calendar].[First8Months2003] ON COLUMNS,
    [Product].[Category].Children ON ROWS
FROM
    [Adventure Works]
WHERE
    [Measures].[Order Quantity]
References:https://docs.microsoft.com/en-us/sql/mdx/aggregate-mdx

Note: This question is part of a series of questions that use the same or similar answer choices. An answer choice may be correct for more than one question in the series. Each question is independent of the other questions in this series. Information and details provided in a question apply only to that question.
You are developing Microsoft SQL Server Analysis Services (SSAS) tabular model.
The model must meet the following requirements:
You need to configure the model.
What should you do?

  • A. Ensure that DirectQuery is enabled for the model.
  • B. Ensure that DirectQuery is disabled for the model.
  • C. Ensure that the Transactional Deployment property is set to True.
  • D. Ensure that the Transactional Deployment property is set to False.
  • E. Process the model in Process Full mode.
  • F. Process the model in Process Data mode.
  • G. Process the model in Process Defrag mode.


Answer : A

Explanation:
DAX originally emerged from the Power Pivot add-in for Excel as a formula language extension for creating calculated columns and measures for data analysis (both of which are also integral elements of an SSAS Tabular model database), but when Microsoft added support for DAX queries in SQL Server 2012, BI practitioners began using DAX to query Tabular model databases directly.
That trend continues because of DAX's simplicity and fast query execution (related to DirectQuery mode in SSAS Tabular).
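Client tools ultimately send such DAX queries to the Tabular instance wrapped in an XMLA Execute request. The minimal sketch below only illustrates the shape of that request; the catalog, table, column, and measure names are hypothetical and do not come from any of the scenarios.

<Execute xmlns="urn:schemas-microsoft-com:xml-analysis">
  <Command>
    <!-- Any DAX (or MDX) query can be submitted in the Statement element -->
    <Statement>
      EVALUATE
      SUMMARIZECOLUMNS ( 'Product'[Category], "Total Sales", SUM ( 'Sales'[Amount] ) )
    </Statement>
  </Command>
  <Properties>
    <PropertyList>
      <Catalog>SalesTabular</Catalog>    <!-- hypothetical tabular database -->
      <Format>Tabular</Format>
    </PropertyList>
  </Properties>
</Execute>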
References:https://www.sqlshack.com/query-ssas-tabular-model-database-using-dax-functions/

Note: This question is part of a series of questions that use the same or similar answer choices. An answer choice may be correct for more than one question in the series. Each question is independent of the other questions in this series. Information and details provided in a question apply only to that question.
You administer a Microsoft SQL Server Analysis Services (SSAS) tabular model for a retail company. The model is the basis for reports on inventory levels, popular products, and regional store performance.
The company recently split up into multiple companies based on product lines. Each company starts with a copy of the database and tabular model that contains data for a specific product line.
You need to optimize performance of queries that use the copied tabular models while minimizing downtime.
What should you do?

  • A. Ensure that DirectQuery is enabled for the model.
  • B. Ensure that DirectQuery is disabled for the model.
  • C. Ensure that the Transactional Deployment property is set to True.
  • D. Ensure that the Transactional Deployment property is set to False.
  • E. Process the model in Process Full mode.
  • F. Process the model in Process Data mode.
  • G. Process the model in Process Defrag mode.


Answer : C

Explanation:
The Transactional Deployment setting controls whether the deployment of metadata changes and process commands occurs in a single transaction or in separate transactions.
If this option is True (default), Analysis Services deploys all metadata changes and all process commands within a single transaction.
If this option is False, Analysis Services deploys the metadata changes in a single transaction, and deploys each processing command in its own transaction.
References:https://docs.microsoft.com/en-us/sql/analysis-services/multidimensional-models/deployment-script-files-specifying-processing-options

Note: This question is part of a series of questions that present the same scenario. Each question in the series contains a unique solution. Determine whether the solution meets the stated goals.
You have an existing multidimensional cube that provides sales analysis. The users can slice by date, product, location, customer, and employee.
The management team plans to evaluate sales employee performance relative to sales targets. You identify the following metrics for employees:
You need to implement the KPI based on the Status expression.
Solution: You design the following solution:


Does the solution meet the goal?

  • A. Yes
  • B. No


Answer : B
