
Google Professional-Cloud-Database-Engineer - Google Cloud Certified - Professional Cloud Database Engineer

You are setting up a Bare Metal Solution environment. You need to update the operating system to the latest version. You need to connect the Bare Metal Solution environment to the internet so you can receive software updates. What should you do?

A. Set up a static external IP address in your VPC network.
B. Set up bring your own IP (BYOIP) in your VPC.
C. Set up a Cloud NAT gateway on the Compute Engine VM.
D. Set up the Cloud NAT service.
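
As background on the Cloud NAT options above: Cloud NAT is not deployed on an individual VM; it is a configuration attached to a Cloud Router. A minimal sketch using the google-api-python-client library follows, assuming hypothetical project, region, and router names.

```python
from googleapiclient import discovery

# Compute Engine API client (uses Application Default Credentials).
compute = discovery.build("compute", "v1")

# Hypothetical identifiers -- replace with your own project, region, and router.
project = "my-project"
region = "us-central1"
router = "my-router"

# A Cloud NAT config lives on a Cloud Router; AUTO_ONLY lets Google allocate
# the external NAT IPs, and ALL_SUBNETWORKS_ALL_IP_RANGES NATs every subnet.
nat_config = {
    "nats": [
        {
            "name": "bms-internet-nat",
            "natIpAllocateOption": "AUTO_ONLY",
            "sourceSubnetworkIpRangesToNat": "ALL_SUBNETWORKS_ALL_IP_RANGES",
        }
    ]
}

operation = compute.routers().patch(
    project=project, region=region, router=router, body=nat_config
).execute()
print(operation["name"])  # name of the returned Operation
```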

You need to perform a one-time migration of data from a running Cloud SQL for MySQL instance in the us-central1 region to a new Cloud SQL for MySQL instance in the us-east1 region. You want to follow Google-recommended practices to minimize performance impact on the currently running instance. What should you do?

A. Create and run a Dataflow job that uses JdbcIO to copy data from one Cloud SQL instance to another.
B. Create two Datastream connection profiles, and use them to create a stream from one Cloud SQL instance to another.
C. Create a SQL dump file in Cloud Storage using a temporary instance, and then use that file to import into a new instance.
D. Create a CSV file by running the SQL statement SELECT...INTO OUTFILE, copy the file to a Cloud Storage bucket, and import it into a new instance.
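
For reference, the dump-file option above maps to the export operation of the Cloud SQL Admin API. A rough sketch with google-api-python-client is below; the project, instance, bucket, and database names are illustrative assumptions, and the resulting file can later be imported into the new us-east1 instance.

```python
from googleapiclient import discovery

# Cloud SQL Admin API client (Application Default Credentials).
sqladmin = discovery.build("sqladmin", "v1")

# Hypothetical names -- substitute your own project, instance, bucket, and database.
project = "my-project"
source_instance = "mysql-us-central1"

export_body = {
    "exportContext": {
        "fileType": "SQL",
        "uri": "gs://my-migration-bucket/dump.sql.gz",
        "databases": ["inventory"],
    }
}

# Kick off the export; the dump file can later be loaded into the new
# instance in us-east1 with instances().import_().
operation = sqladmin.instances().export(
    project=project, instance=source_instance, body=export_body
).execute()
print(operation["name"])
```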

You are deploying a new Cloud SQL instance on Google Cloud using the Cloud SQL Auth proxy. You have identified snippets of application code that need to access the new Cloud SQL instance. The snippets reside and execute on an application server running on a Compute Engine machine. You want to follow Google-recommended practices to set up Identity and Access Management (IAM) as quickly and securely as possible. What should you do?

A. For each application code snippet, set up a common shared user account.
B. For each application code snippet, set up a dedicated user account.
C. For the application server, set up a service account.
D. For the application server, set up a common shared user account.
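
For context on the service-account option: code running on the application server can reach the instance through the Cloud SQL Auth proxy or the Cloud SQL Python Connector, authorizing with the VM's attached service account instead of per-snippet credentials. A minimal sketch with the Cloud SQL Python Connector follows; the instance connection name, database user, and database are placeholders.

```python
from google.cloud.sql.connector import Connector

# The connector picks up the VM's attached service account through
# Application Default Credentials -- no key files in the application code.
connector = Connector()

# Hypothetical instance connection name and database credentials.
conn = connector.connect(
    "my-project:us-central1:orders-db",  # project:region:instance
    "pymysql",                           # driver for Cloud SQL for MySQL
    user="app_user",
    password="app_password",
    db="orders",
)

with conn.cursor() as cursor:
    cursor.execute("SELECT 1")
    print(cursor.fetchone())

conn.close()
connector.close()
```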

Your organization has a ticketing system that needs an online marketing analytics and reporting application. You need to select a relational database that can manage hundreds of terabytes of data to support this new application. Which database should you use?

A. Cloud SQL
B. BigQuery
C. Cloud Spanner
D. Bigtable

You are designing a new gaming application that uses a highly transactional relational database to store player authentication and inventory data in Google Cloud. You want to launch the game in multiple regions. What should you do?

A. Use Cloud Spanner to deploy the database.
B. Use Bigtable with clusters in multiple regions to deploy the database.
C. Use BigQuery to deploy the database.
D. Use Cloud SQL with a regional read replica to deploy the database.
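
As background on the Cloud Spanner option, transactional updates such as inventory changes would normally run inside a read-write transaction so they remain strongly consistent across regions. A short sketch with the google-cloud-spanner client is shown below; the instance, database, table, and column names are assumptions.

```python
from google.cloud import spanner
from google.cloud.spanner_v1 import param_types

client = spanner.Client()
# Hypothetical instance and database names.
database = client.instance("game-instance").database("game-db")

def grant_item(transaction):
    # Transactionally add an item to a player's inventory.
    transaction.execute_update(
        "UPDATE PlayerInventory SET Quantity = Quantity + 1 "
        "WHERE PlayerId = @player_id AND ItemId = @item_id",
        params={"player_id": "player-123", "item_id": "sword-01"},
        param_types={
            "player_id": param_types.STRING,
            "item_id": param_types.STRING,
        },
    )

database.run_in_transaction(grant_item)
```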

You need to migrate a 1 TB PostgreSQL database from a Compute Engine VM to Cloud SQL for PostgreSQL. You want to ensure that there is minimal downtime during the migration. What should you do?

A. Export the data from the existing database, and load the data into a new Cloud SQL database.
B. Use Migrate for Compute Engine to complete the migration.
C. Use Datastream to complete the migration.
D. Use Database Migration Service to complete the migration.

Your ecommerce website captures user clickstream data to analyze customer traffic patterns in real time and support personalization features on your website. You plan to analyze this data using big data tools. You need a low-latency solution that can store 8 TB of data and can scale to millions of read and write requests per second. What should you do?

A. Write your data into Bigtable and use Dataproc and the Apache HBase libraries for analysis.
B. Deploy a Cloud SQL environment with read replicas for improved performance. Use Datastream to export data to Cloud Storage and analyze it with Dataproc and the Cloud Storage connector.
C. Use Memorystore to handle your low-latency requirements and for real-time analytics.
D. Stream your data into BigQuery and use Dataproc and the BigQuery Storage API to analyze large volumes of data.
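
For context on the Bigtable option, clickstream events are typically written as individual rows keyed by user and event time. A minimal write sketch with the google-cloud-bigtable client is below; the project, instance, table, and column family names are assumptions.

```python
import datetime
from google.cloud import bigtable

# Hypothetical project and instance names.
client = bigtable.Client(project="my-project")
table = client.instance("clickstream-instance").table("clickstream")

# Row key that groups a user's events and orders them by time.
event_time = datetime.datetime.utcnow()
row_key = f"user-42#{event_time.isoformat()}".encode()

row = table.direct_row(row_key)
# Column family "events" is assumed to already exist on the table.
row.set_cell("events", "page", b"/checkout", timestamp=event_time)
row.set_cell("events", "session_id", b"sess-789", timestamp=event_time)
row.commit()
```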

Your company wants to migrate an Oracle-based application to Google Cloud. The application team currently uses Oracle Recovery Manager (RMAN) to back up the database to tape for long-term retention (LTR). You need a cost-effective backup and restore solution that meets a 2-hour recovery time objective (RTO) and a 15-minute recovery point objective (RPO). What should you do?

A. Migrate the Oracle databases to Bare Metal Solution for Oracle, and store backups on tapes on-premises.
B. Migrate the Oracle databases to Bare Metal Solution for Oracle, and use Actifio to store backup files on Cloud Storage using the Nearline storage class.
C. Migrate the Oracle databases to Bare Metal Solution for Oracle, and back up the Oracle databases to Cloud Storage using the Standard storage class.
D. Migrate the Oracle databases to Compute Engine, and store backups on tapes on-premises.
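
Several options above involve landing RMAN backup pieces in Cloud Storage instead of on tape. A small sketch with the google-cloud-storage client that creates a Nearline bucket and uploads a backup piece follows; the bucket, path, and location are illustrative.

```python
from google.cloud import storage

client = storage.Client()

# Hypothetical bucket for long-term retention of RMAN backup pieces.
bucket = storage.Bucket(client, "rman-ltr-backups")
bucket.storage_class = "NEARLINE"
bucket = client.create_bucket(bucket, location="us-central1")

# Upload a backup piece produced by RMAN on the database host.
blob = bucket.blob("prod-db/2024-06-01/backup_piece_001.bkp")
blob.upload_from_filename("/backups/backup_piece_001.bkp")
print(f"Uploaded to gs://{bucket.name}/{blob.name}")
```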

Your online delivery business that primarily serves retail customers uses Cloud SQL for MySQL for its inventory and scheduling application. The required recovery time objective (RTO) and recovery point objective (RPO) must be in minutes rather than hours as part of your high availability and disaster recovery design. You need a high availability configuration that can recover without data loss during a zonal or a regional failure. What should you do?

A. Set up all read replicas in a different region using asynchronous replication.
B. Set up all read replicas in the same region as the primary instance with synchronous replication.
C. Set up read replicas in different zones of the same region as the primary instance with synchronous replication, and set up read replicas in different regions with asynchronous replication.
D. Set up read replicas in different zones of the same region as the primary instance with asynchronous replication, and set up read replicas in different regions with synchronous replication.
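
The options above differ in where replicas are placed and how they replicate. For reference, a cross-region Cloud SQL read replica can be created through the Cloud SQL Admin API; a rough sketch with google-api-python-client follows, using placeholder project, instance, tier, and region values.

```python
from googleapiclient import discovery

sqladmin = discovery.build("sqladmin", "v1")

# Hypothetical project and instance names.
project = "my-project"

replica_body = {
    "name": "inventory-replica-east",
    "region": "us-east1",                       # different region from the primary
    "masterInstanceName": "inventory-primary",  # existing primary instance
    "settings": {"tier": "db-custom-4-15360"},
}

operation = sqladmin.instances().insert(project=project, body=replica_body).execute()
print(operation["name"])
```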

An analytics team needs to read data out of Cloud SQL for SQL Server and update a table in Cloud Spanner. You need to create a service account and grant least privilege access using predefined roles. What roles should you assign to the service account?

A. roles/cloudsql.viewer and roles/spanner.databaseUser
B. roles/cloudsql.editor and roles/spanner.admin
C. roles/cloudsql.client and roles/spanner.databaseReader
D. roles/cloudsql.instanceUser and roles/spanner.databaseUser
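
To make the data flow concrete, a service account that can connect to the Cloud SQL instance and write to the Spanner database could run something like the sketch below. The instance connection name, driver, credentials, and table and column names are assumptions, not values from the question.

```python
from google.cloud import spanner
from google.cloud.sql.connector import Connector

# Read rows from Cloud SQL for SQL Server via the Cloud SQL Python Connector,
# authorizing with the service account's Application Default Credentials.
connector = Connector()
conn = connector.connect(
    "my-project:us-central1:sqlserver-instance",  # hypothetical connection name
    "pytds",                                      # SQL Server driver
    user="analytics",
    password="analytics_password",
    db="sales",
)
cursor = conn.cursor()
cursor.execute("SELECT order_id, total FROM orders")
rows = cursor.fetchall()
conn.close()
connector.close()

# Write the results into a Cloud Spanner table (hypothetical names).
spanner_db = spanner.Client().instance("analytics-instance").database("reporting")
with spanner_db.batch() as batch:
    batch.insert_or_update(
        table="OrderTotals",
        columns=("OrderId", "Total"),
        values=[(str(order_id), float(total)) for order_id, total in rows],
    )
```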