
Snowflake ARA-R01 - SnowPro Advanced: Architect Recertification Exam


There are two databases in an account, named fin_db and hr_db, which contain payroll and employee data, respectively. Accountants and Analysts in the company require different permissions on the objects in these databases to perform their jobs. Accountants need read-write access to fin_db but only require read-only access to hr_db, because that database is maintained by human resources personnel.

An Architect needs to create a read-only role for certain employees working in the human resources department.

Which permission sets must be granted to this role?

A.

USAGE on database hr_db, USAGE on all schemas in database hr_db, SELECT on all tables in database hr_db

B.

USAGE on database hr_db, SELECT on all schemas in database hr_db, SELECT on all tables in database hr_db

C.

MODIFY on database hr_db, USAGE on all schemas in database hr_db, USAGE on all tables in database hr_db

D.

USAGE on database hr_db, USAGE on all schemas in database hr_db, REFERENCES on all tables in database hr_db
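For illustration, the permission set described in option A maps onto the following Snowflake grants; this is a minimal sketch, and the role name hr_read_only is hypothetical:

-- Hypothetical read-only role for human resources staff
CREATE ROLE IF NOT EXISTS hr_read_only;

-- USAGE on the database and its schemas lets the role resolve the objects inside them
GRANT USAGE ON DATABASE hr_db TO ROLE hr_read_only;
GRANT USAGE ON ALL SCHEMAS IN DATABASE hr_db TO ROLE hr_read_only;

-- SELECT provides read-only access to table data
GRANT SELECT ON ALL TABLES IN DATABASE hr_db TO ROLE hr_read_only;

Note that ON ALL ... grants cover only existing objects; separate FUTURE grants would be needed for tables created later.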

An Architect needs to meet a company requirement to ingest files from the company's AWS storage accounts into the company's Snowflake Google Cloud Platform (GCP) account. How can the ingestion of these files into the company's Snowflake account be initiated? (Select TWO).

A.

Configure the client application to call the Snowpipe REST endpoint when new files have arrived in Amazon S3 storage.

B.

Configure the client application to call the Snowpipe REST endpoint when new files have arrived in Amazon S3 Glacier storage.

C.

Create an AWS Lambda function to call the Snowpipe REST endpoint when new files have arrived in Amazon S3 storage.

D.

Configure AWS Simple Notification Service (SNS) to notify Snowpipe when new files have arrived in Amazon S3 storage.

E.

Configure the client application to issue a COPY INTO command to Snowflake when new files have arrived in Amazon S3 Glacier storage.

Note: Option E is not viable. It relies on the standard COPY INTO command, a batch loading method, rather than Snowpipe, and COPY INTO does not support ingesting files from Amazon S3 Glacier storage [7].
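For context, calling the Snowpipe REST endpoint (options A and C) presupposes a pipe created without AUTO_INGEST. A minimal sketch, with hypothetical table, stage, and pipe names:

-- REST-triggered pipe: no AUTO_INGEST, so loads start only when the
-- insertFiles REST endpoint is called with a list of staged file names
CREATE PIPE reviews_pipe AS
COPY INTO raw_reviews
FROM @s3_ext_stage
FILE_FORMAT = (TYPE = 'CSV');

A client application or an AWS Lambda function would then call the insertFiles endpoint to notify Snowflake of newly arrived files.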

References:

1. SnowPro Advanced: Architect | Study Guide
2. Snowflake Documentation | Snowpipe Overview
3. Snowflake Documentation | Using the Snowpipe REST API
4. Snowflake Documentation | Loading Data Using Snowpipe and AWS Lambda
5. Snowflake Documentation | Supported File Formats and Compression for Staged Data Files
6. Snowflake Documentation | Using Cloud Notifications to Trigger Snowpipe
7. Snowflake Documentation | Loading Data Using COPY into a Table


A company is trying to ingest 10 TB of CSV data into a Snowflake table using Snowpipe as part of its migration from a legacy database platform. The records need to be ingested in the MOST performant and cost-effective way.

How can these requirements be met?

A.

Use ON_ERROR = CONTINUE in the COPY INTO command.

B.

Use PURGE = TRUE in the COPY INTO command.

C.

Use PURGE = FALSE in the COPY INTO command.

D.

Use ON_ERROR = SKIP_FILE in the COPY INTO command.
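As a syntax reference, the PURGE copy option removes staged files after they load successfully, which keeps storage costs down during a large migration. A sketch with hypothetical table and stage names:

-- PURGE = TRUE deletes each staged file once it has been loaded successfully
COPY INTO payroll_raw
FROM @migration_stage
FILE_FORMAT = (TYPE = 'CSV')
PURGE = TRUE;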

An Architect would like to save quarter-end financial results for the previous six years.

Which Snowflake feature can the Architect use to accomplish this?

A.

Search optimization service

B.

Materialized view

C.

Time Travel

D.

Zero-copy cloning

E.

Secure views
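For reference, because Time Travel retention is capped at 90 days, multi-year snapshots such as quarter-end results are typically preserved with zero-copy clones. A sketch with hypothetical table names:

-- Zero-copy clone taken at quarter end; consumes no extra storage until the data diverges
CREATE TABLE financials_2024_q4 CLONE financials;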

A company needs to have the following features available in its Snowflake account:

1. Support for Multi-Factor Authentication (MFA)

2. A minimum of 2 months of Time Travel availability

3. Database replication in between different regions

4. Native support for JDBC and ODBC

5. Customer-managed encryption keys using Tri-Secret Secure

6. Support for Payment Card Industry Data Security Standards (PCI DSS)

In order to provide all the listed services, what is the MINIMUM Snowflake edition that should be selected during account creation?

A.

Standard

B.

Enterprise

C.

Business Critical

D.

Virtual Private Snowflake (VPS)
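As a point of reference for requirement 2, Time Travel retention beyond the one-day default (up to 90 days, available from the Enterprise edition upward) is configured per object. A sketch using a hypothetical database name:

-- Roughly two months of Time Travel; requires Enterprise edition or higher
ALTER DATABASE fin_db SET DATA_RETENTION_TIME_IN_DAYS = 60;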

A company is designing its serving layer for data that is in cloud storage. Multiple terabytes of the data will be used for reporting. Some data does not have a clear use case but could be useful for experimental analysis. This experimentation data changes frequently and is sometimes wiped out and replaced completely in a few days.

The company wants to centralize access control, provide a single point of connection for the end-users, and maintain data governance.

What solution meets these requirements while MINIMIZING costs, administrative effort, and development overhead?

A.

Import the data used for reporting into a Snowflake schema with native tables. Then create external tables pointing to the cloud storage folders used for the experimentation data. Then create two different roles with grants to the different datasets to match the different user personas, and grant these roles to the corresponding users.

B.

Import all the data in cloud storage to be used for reporting into a Snowflake schema with native tables. Then create a role that has access to this schema and manage access to the data through that role.

C.

Import all the data in cloud storage to be used for reporting into a Snowflake schema with native tables. Then create two different roles with grants to the different datasets to match the different user personas, and grant these roles to the corresponding users.

D.

Import the data used for reporting into a Snowflake schema with native tables. Then create views that have SELECT commands pointing to the cloud storage files for the experimentation data. Then create two different roles to match the different user personas, and grant these roles to the corresponding users.
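For context, the experimentation data in option A would be exposed through external tables that point at the cloud storage folders, so frequent wipes and reloads of those folders never require reloading data into Snowflake. A minimal sketch; the table, stage, and file format are hypothetical:

-- External table over the experimentation folder; the data stays in cloud storage
CREATE EXTERNAL TABLE experiment_events
WITH LOCATION = @experiment_stage/events/
FILE_FORMAT = (TYPE = PARQUET)
AUTO_REFRESH = TRUE;  -- pick up file additions and removals via event notifications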

A healthcare company is deploying a Snowflake account that may include Personal Health Information (PHI). The company must ensure compliance with all relevant privacy standards.

Which best practice recommendations will meet data protection and compliance requirements? (Choose three.)

A.

Use, at minimum, the Business Critical edition of Snowflake.

B.

Create Dynamic Data Masking policies and apply them to columns that contain PHI.

C.

Use the Internal Tokenization feature to obfuscate sensitive data.

D.

Use the External Tokenization feature to obfuscate sensitive data.

E.

Rewrite SQL queries to eliminate projections of PHI data based on current_role().

F.

Avoid sharing data with partner organizations.
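For illustration, the Dynamic Data Masking approach in option B attaches a column-level policy whose body decides, per query, whether the caller sees the real value. The policy, table, column, and authorized role names below are hypothetical:

-- Mask PHI for everyone except an explicitly authorized role
CREATE MASKING POLICY phi_mask AS (val STRING) RETURNS STRING ->
CASE
  WHEN CURRENT_ROLE() IN ('PHI_AUTHORIZED') THEN val
  ELSE '***MASKED***'
END;

-- Bind the policy to a column containing PHI
ALTER TABLE patients MODIFY COLUMN ssn SET MASKING POLICY phi_mask;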

A media company needs a data pipeline that will ingest customer review data into a Snowflake table and apply some transformations. The company also needs to use Amazon Comprehend for sentiment analysis and make the de-identified final data set publicly available to advertising companies that use different cloud providers in different regions.

The data pipeline needs to run continuously and efficiently as new records arrive in the object storage, leveraging event notifications. The operational complexity, the maintenance of the infrastructure (including platform upgrades and security), and the development effort should also be minimal.

Which design will meet these requirements?

A.

Ingest the data using COPY INTO and use streams and tasks to orchestrate transformations. Export the data into Amazon S3 to do model inference with Amazon Comprehend and ingest the data back into a Snowflake table. Then create a listing in the Snowflake Marketplace to make the data available to other companies.

B.

Ingest the data using Snowpipe and use streams and tasks to orchestrate transformations. Create an external function to do model inference with Amazon Comprehend and write the final records to a Snowflake table. Then create a listing in the Snowflake Marketplace to make the data available to other companies.

C.

Ingest the data into Snowflake using Amazon EMR and PySpark using the Snowflake Spark connector. Apply transformations using another Spark job. Develop a python program to do model inference by leveraging the Amazon Comprehend text analysis API. Then write the results to a Snowflake table and create a listing in the Snowflake Marketplace to make the data available to other companies.

D.

Ingest the data using Snowpipe and use streams and tasks to orchestrate transformations. Export the data into Amazon S3 to do model inference with Amazon Comprehend and ingest the data back into a Snowflake table. Then create a listing in the Snowflake Marketplace to make the data available to other companies.
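For context, the external function in option B keeps the whole pipeline inside Snowflake by wrapping the Amazon Comprehend call behind an API integration; the function, integration, and proxy URL below are hypothetical:

-- External function that forwards rows to a proxy service calling Amazon Comprehend
CREATE EXTERNAL FUNCTION get_sentiment(review_text STRING)
RETURNS VARIANT
API_INTEGRATION = comprehend_api_integration
AS 'https://example.execute-api.us-east-1.amazonaws.com/prod/sentiment';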

At which object type level can the APPLY MASKING POLICY, APPLY ROW ACCESS POLICY, and APPLY SESSION POLICY privileges be granted?

A.

Global

B.

Database

C.

Schema

D.

Table
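For syntax reference, these APPLY privileges are granted at the account level; a sketch with a hypothetical role name:

USE ROLE ACCOUNTADMIN;
GRANT APPLY MASKING POLICY ON ACCOUNT TO ROLE governance_admin;
GRANT APPLY ROW ACCESS POLICY ON ACCOUNT TO ROLE governance_admin;
GRANT APPLY SESSION POLICY ON ACCOUNT TO ROLE governance_admin;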

A data share exists between a data provider account and a data consumer account. Five tables from the provider account are being shared with the consumer account. The consumer role has been granted the IMPORTED PRIVILEGES privilege.

What will happen to the consumer account if a new table (table_6) is added to the provider schema?

A.

The consumer role will automatically see the new table and no additional grants are needed.

B.

The consumer role will see the table only after this grant is given on the consumer side:

grant imported privileges on database PSHARE_EDW_4TEST_DB to DEV_ROLE;

C.

The consumer role will see the table only after this grant is given on the provider side:

use role accountadmin;

Grant select on table EDW.ACCOUNTING.Table_6 to share PSHARE_EDW_4TEST;

D.

The consumer role will see the table only after this grant is given on the provider side:

use role accountadmin;

grant usage on database EDW to share PSHARE_EDW_4TEST ;

grant usage on schema EDW.ACCOUNTING to share PSHARE_EDW_4TEST ;

Grant select on table EDW.ACCOUNTING.Table_6 to database PSHARE_EDW_4TEST_DB ;