
Snowflake ARA-R01 - SnowPro Advanced: Architect Recertification Exam


You are a Snowflake Architect in an organization. The business team has asked you to deploy a use case that requires loading some data which they can visualize through Tableau. Every day new data comes in, and the old data is no longer required.

Which type of table will you use in this case to optimize cost?

A. TRANSIENT
B. TEMPORARY
C. PERMANENT
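
A minimal sketch of the transient-table approach, assuming the data only needs to persist between daily loads (the table, stage, and column names are illustrative):

    -- Transient tables have no Fail-safe period and at most 1 day of Time Travel,
    -- which reduces storage cost for data that is replaced every day.
    CREATE TRANSIENT TABLE daily_sales (
        sale_id   NUMBER,
        sale_date DATE,
        amount    NUMBER(10,2)
    ) DATA_RETENTION_TIME_IN_DAYS = 0;   -- optionally disable Time Travel entirely

    -- Each day, replace the old data with the new load.
    TRUNCATE TABLE daily_sales;
    COPY INTO daily_sales FROM @daily_stage;   -- @daily_stage is a hypothetical stage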

An Architect needs to design a Snowflake account and database strategy to store and analyze large amounts of structured and semi-structured data. There are many business units and departments within the company. The requirements are scalability, security, and cost efficiency.

What design should be used?

A. Create a single Snowflake account and database for all data storage and analysis needs, regardless of data volume or complexity.
B. Set up separate Snowflake accounts and databases for each department or business unit, to ensure data isolation and security.
C. Use Snowflake's data lake functionality to store and analyze all data in a central location, without the need for structured schemas or indexes.
D. Use a centralized Snowflake database for core business data, and use separate databases for departmental or project-specific data.
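
A minimal sketch of how the hybrid layout in option D could look within a single account, with access isolated by roles (all database and role names are illustrative):

    -- A shared database for core business data, plus per-department databases.
    CREATE DATABASE core_business;
    CREATE DATABASE dept_finance;
    CREATE DATABASE dept_marketing;

    -- Departmental roles see their own database plus the shared core data.
    CREATE ROLE finance_analyst;
    GRANT USAGE ON DATABASE dept_finance TO ROLE finance_analyst;
    GRANT USAGE ON DATABASE core_business TO ROLE finance_analyst;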

How is the change of local time due to daylight saving time handled in Snowflake tasks? (Choose two.)

A. A task scheduled in a UTC-based schedule will have no issues with the time changes.
B. Task schedules can be designed to follow specified or local time zones to accommodate the time changes.
C. A task will move to a suspended state during the daylight saving time change.
D. A frequent task execution schedule like minutes may not cause a problem, but will affect the task history.
E. A task schedule will follow only the specified time and will fail to handle lost or duplicated hours.
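
A minimal sketch of the two scheduling styles involved, using a hypothetical procedure load_nightly() and warehouse load_wh:

    -- A fixed UTC cron schedule is unaffected by daylight saving time changes.
    CREATE TASK nightly_load_utc
      WAREHOUSE = load_wh
      SCHEDULE  = 'USING CRON 0 2 * * * UTC'
    AS
      CALL load_nightly();

    -- A schedule in a named time zone follows that zone's daylight saving shifts.
    CREATE TASK nightly_load_local
      WAREHOUSE = load_wh
      SCHEDULE  = 'USING CRON 0 2 * * * America/Los_Angeles'
    AS
      CALL load_nightly();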

How can the Snowpipe REST API be used to keep a log of data load history?

A. Call insertReport every 20 minutes, fetching the last 10,000 entries.
B. Call loadHistoryScan every minute for the maximum time range.
C. Call insertReport every 8 minutes for a 10-minute time range.
D. Call loadHistoryScan every 10 minutes for a 15-minute time range.

A company’s client application supports multiple authentication methods, and is using Okta.

What is the best practice recommendation for the order of priority when applications authenticate to Snowflake?

A.
1) OAuth (either Snowflake OAuth or External OAuth)
2) External browser
3) Okta native authentication
4) Key Pair Authentication, mostly used for service account users
5) Password

B.
1) External browser, SSO
2) Key Pair Authentication, mostly used for development environment users
3) Okta native authentication
4) OAuth (either Snowflake OAuth or External OAuth)
5) Password

C.
1) Okta native authentication
2) Key Pair Authentication, mostly used for production environment users
3) Password
4) OAuth (either Snowflake OAuth or External OAuth)
5) External browser, SSO

D.
1) Password
2) Key Pair Authentication, mostly used for production environment users
3) Okta native authentication
4) OAuth (either Snowflake OAuth or External OAuth)
5) External browser, SSO

A new user user_01 is created within Snowflake. The following two commands are executed:

Command 1 -> show grants to user user_01;

Command 2 -> show grants on user user_01;

What inferences can be made about these commands?

A. Command 1 defines which user owns user_01; Command 2 defines all the grants which have been given to user_01.
B. Command 1 defines all the grants which are given to user_01; Command 2 defines which user owns user_01.
C. Command 1 defines which role owns user_01; Command 2 defines all the grants which have been given to user_01.
D. Command 1 defines all the grants which are given to user_01; Command 2 defines which role owns user_01.
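
For reference, a minimal sketch contrasting the two forms of the command; the comments describe the general behavior of SHOW GRANTS for user objects:

    show grants to user user_01;   -- lists the roles that have been granted TO user_01
    show grants on user user_01;   -- lists privileges ON the user object itself,
                                   -- including which role holds OWNERSHIP of user_01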

A table contains five columns and it has millions of records. The cardinality distribution of the columns is shown below:

Columns C4 and C5 are mostly used by SELECT queries in the GROUP BY and ORDER BY clauses, whereas columns C1, C2, and C3 are heavily used in the filter and join conditions of SELECT queries.

The Architect must design a clustering key for this table to improve the query performance.

Based on Snowflake recommendations, how should the clustering key columns be ordered while defining the multi-column clustering key?

A. C5, C4, C2
B. C3, C4, C5
C. C1, C3, C2
D. C2, C1, C3
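
A minimal sketch of defining and inspecting a multi-column clustering key; the table and column names are hypothetical and the column order shown only illustrates the syntax, not the answer:

    -- Snowflake's general guidance is to order clustering key columns from lowest
    -- to highest cardinality, favoring columns used in filters and joins.
    ALTER TABLE sales_facts CLUSTER BY (col_a, col_b, col_c);

    -- Inspect how well the table is clustered on those columns.
    SELECT SYSTEM$CLUSTERING_INFORMATION('sales_facts', '(col_a, col_b, col_c)');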

The diagram shows the process flow for Snowpipe auto-ingest with Amazon Simple Notification Service (SNS) with the following steps:

Step 1: Data files are loaded in a stage.

Step 2: An Amazon S3 event notification, published by SNS, informs Snowpipe, by way of Amazon Simple Queue Service (SQS), that files are ready to load. Snowpipe copies the files into a queue.

Step 3: A Snowflake-provided virtual warehouse loads data from the queued files into the target table based on parameters defined in the specified pipe.

If an AWS Administrator accidentally deletes the SQS subscription to the SNS topic in Step 2, what will happen to the pipe that references the topic to receive event messages from Amazon S3?

A. The pipe will continue to receive the messages, as Snowflake will automatically restore the subscription to the same SNS topic and will recreate the pipe by specifying the same SNS topic name in the pipe definition.
B. The pipe will no longer be able to receive the messages, and the user must wait for 24 hours from the time when the SNS topic subscription was deleted. Pipe recreation is not required, as the pipe will reuse the same subscription to the existing SNS topic after 24 hours.
C. The pipe will continue to receive the messages, as Snowflake will automatically restore the subscription by creating a new SNS topic. Snowflake will then recreate the pipe by specifying the new SNS topic name in the pipe definition.
D. The pipe will no longer be able to receive the messages. To restore the system immediately, the user needs to manually create a new SNS topic with a different name and then recreate the pipe by specifying the new SNS topic name in the pipe definition.
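
For context, a minimal sketch of how an auto-ingest pipe references an SNS topic in its definition; the database, schema, stage, and table names and the topic ARN are all illustrative:

    CREATE OR REPLACE PIPE my_db.my_schema.s3_pipe
      AUTO_INGEST   = TRUE
      AWS_SNS_TOPIC = 'arn:aws:sns:us-east-1:123456789012:s3_events_topic'
    AS
      COPY INTO my_db.my_schema.target_table
      FROM @my_db.my_schema.s3_stage;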

What does a Snowflake Architect need to consider when implementing a Snowflake Connector for Kafka?

A. Every Kafka message is in JSON or Avro format.
B. The default retention time for Kafka topics is 14 days.
C. The Kafka connector supports key pair authentication, OAuth, and basic authentication (for example, username and password).
D. The Kafka connector will create one table and one pipe to ingest data for each topic. If the connector cannot create the table or the pipe, it will result in an exception.

Which technique will efficiently ingest and consume semi-structured data for Snowflake data lake workloads?

A. IDEF1X
B. Schema-on-write
C. Schema-on-read
D. Information schema
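
A minimal sketch of the schema-on-read pattern in Snowflake: land raw JSON in a VARIANT column and apply structure at query time (the table, stage, and field names are illustrative):

    CREATE TABLE raw_events (v VARIANT);

    -- Ingest the files as-is, without defining a schema up front.
    COPY INTO raw_events
    FROM @events_stage
    FILE_FORMAT = (TYPE = 'JSON');

    -- Apply structure when the data is read.
    SELECT
        v:event_id::STRING       AS event_id,
        v:payload.amount::NUMBER AS amount
    FROM raw_events;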