Question 46

You are designing a statistical analysis solution that will use custom proprietary Python functions on near real-time data from Azure Event Hubs.
You need to recommend which Azure service to use to perform the statistical analysis. The solution must minimize latency.
What should you recommend?
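To make the scenario concrete, below is a minimal sketch of the kind of custom statistical Python function the question describes: a single-pass (online) mean and variance computed with Welford's algorithm, which is well suited to near real-time streams because it never needs to buffer the full data set. The class name and sample readings are illustrative assumptions, not part of the question.

```python
# Illustrative custom statistical function (assumption: Welford's
# online algorithm standing in for the "proprietary" logic).
class RunningStats:
    def __init__(self):
        self.n = 0
        self.mean = 0.0
        self.m2 = 0.0  # running sum of squared deviations from the mean

    def update(self, x: float) -> None:
        # Incorporate one new streamed value in O(1) time and space.
        self.n += 1
        delta = x - self.mean
        self.mean += delta / self.n
        self.m2 += delta * (x - self.mean)

    @property
    def variance(self) -> float:
        # Sample variance; defined only once two values have arrived.
        return self.m2 / (self.n - 1) if self.n > 1 else 0.0

stats = RunningStats()
for reading in [10.0, 12.0, 11.0, 13.0]:  # hypothetical sensor readings
    stats.update(reading)
print(stats.mean, stats.variance)
```

Because each `update` call is constant-time, a function like this can be applied per event as messages arrive, which is what keeps end-to-end latency low.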
Question 47

    You are implementing an Azure Stream Analytics solution to process event data from devices.
    The devices output events when there is a fault and emit a repeat of the event every five seconds until the fault is resolved. The devices output a heartbeat event every five seconds after a previous event if there are no faults present.
    A sample of the events is shown in the following table.

    You need to calculate the uptime between the faults.
    How should you complete the Stream Analytics SQL query? To answer, select the appropriate options in the answer area.
    NOTE: Each correct selection is worth one point.
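As a plain-Python illustration of the computation the query must perform (the question's event table is not reproduced here, so the sample timestamps below are assumptions): uptime runs from the first heartbeat after a fault clears until the next fault event.

```python
from datetime import datetime

# Hypothetical sample events: (timestamp, status), emitted every
# five seconds, where status is "fault" or "heartbeat".
events = [
    (datetime(2021, 1, 1, 0, 0, 0), "fault"),
    (datetime(2021, 1, 1, 0, 0, 5), "fault"),
    (datetime(2021, 1, 1, 0, 0, 10), "heartbeat"),
    (datetime(2021, 1, 1, 0, 0, 15), "heartbeat"),
    (datetime(2021, 1, 1, 0, 0, 20), "heartbeat"),
    (datetime(2021, 1, 1, 0, 0, 25), "fault"),
]

def uptime_spans(events):
    """Return (start, end) spans of uptime between faults: from the
    first heartbeat after a fault clears to the next fault event."""
    spans = []
    up_since = None
    for ts, status in events:
        if status == "heartbeat" and up_since is None:
            up_since = ts          # device just came back up
        elif status == "fault" and up_since is not None:
            spans.append((up_since, ts))  # uptime ends at the new fault
            up_since = None
    return spans

for start, end in uptime_spans(events):
    print(start, "->", end, "uptime:", end - start)
```

In the actual Stream Analytics query this pairing of a current event with the previous event of a different status is done with an analytic function over the event stream rather than an explicit loop.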

    Question 48

    Note: This question is part of a series of questions that present the same scenario. Each question in the series contains a unique solution that might meet the stated goals. Some question sets might have more than one correct solution, while others might not have a correct solution.
    After you answer a question in this section, you will NOT be able to return to it. As a result, these questions will not appear in the review screen.
    You are designing an Azure Stream Analytics solution that will analyze Twitter data.
    You need to count the tweets in each 10-second window. The solution must ensure that each tweet is counted only once.
    Solution: You use a session window that uses a timeout size of 10 seconds.
    Does this meet the goal?
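The distinction the question turns on can be sketched in plain Python: a tumbling window partitions time into fixed, non-overlapping intervals, so every event lands in exactly one window and is counted exactly once, whereas a session window's boundaries shift with gaps between events. The timestamps below are illustrative assumptions.

```python
from collections import Counter

def tumbling_window_counts(timestamps, window_seconds=10):
    """Count events per fixed 10-second tumbling window.
    Each event's window starts at floor(ts / window) * window,
    so no event can fall into two windows."""
    counts = Counter()
    for ts in timestamps:
        window_start = (ts // window_seconds) * window_seconds
        counts[window_start] += 1
    return dict(counts)

# Hypothetical tweet arrival times, in seconds.
tweets = [1, 3, 9, 11, 14, 22, 29]
print(tumbling_window_counts(tweets))
```

Note that the per-window counts sum to the total number of events, which is the "counted only once" guarantee the question asks about.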
Question 49

    You are designing an application that will use an Azure Data Lake Storage Gen2 account to store petabytes of license plate photos from toll booths. The account will use zone-redundant storage (ZRS).
    You identify the following usage patterns:
    * The data will be accessed several times a day during the first 30 days after the data is created. The data must meet an availability SLA of 99.9%.
    * After 90 days, the data will be accessed infrequently but must be available within 30 seconds.
    * After 365 days, the data will be accessed infrequently but must be available within five minutes.

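A hedged sketch of how the stated usage patterns might map to blob access tiers by data age. The tier names and thresholds below are assumptions for illustration only; note that the Archive tier is offline (rehydration can take hours) and is not available on ZRS accounts, so the sketch assumes tiers that keep the data online.

```python
def access_tier_for_age(age_days: int) -> str:
    """Assumed mapping of data age to an online blob access tier
    (illustrative, not the question's answer key)."""
    if age_days < 90:
        return "Hot"    # accessed several times a day early on
    elif age_days < 365:
        return "Cool"   # infrequent access, online within seconds
    else:
        return "Cold"   # infrequent access, still online (minutes)

print([access_tier_for_age(d) for d in (10, 120, 400)])
```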
    Question 50

    You have data stored in thousands of CSV files in Azure Data Lake Storage Gen2. Each file has a header row followed by a properly formatted carriage return (\r) and line feed (\n).
    You are implementing a pattern that batch loads the files daily into an enterprise data warehouse in Azure Synapse Analytics by using PolyBase.
    You need to skip the header row when you import the files into the data warehouse. Before building the loading pattern, you need to prepare the required database objects in Azure Synapse Analytics.
    Which three actions should you perform in sequence? To answer, move the appropriate actions from the list of actions to the answer area and arrange them in the correct order.
    NOTE: Each correct selection is worth one point.
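For context, the effect the loading pattern must achieve is simply "discard row 1, load the rest". In PolyBase this is expressed declaratively (for example via the FIRST_ROW option of a delimited-text external file format) rather than in code; the Python below only illustrates the concept with hypothetical sample data.

```python
import csv

# Hypothetical CSV content: a header row followed by \r\n-terminated
# data rows, as the question describes.
sample = (
    "plate,booth,ts\r\n"
    "ABC123,12,2021-01-01T08:00:00\r\n"
    "XYZ789,3,2021-01-01T08:00:05\r\n"
)

reader = csv.reader(sample.splitlines())
header = next(reader)            # consume and discard the header row
rows = [row for row in reader]   # only the data rows remain
print(header)
print(rows)
```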