
Google Professional-Cloud-Security-Engineer - Google Cloud Certified - Professional Cloud Security Engineer

You are the security admin of your company. Your development team creates multiple GCP projects under the "implementation" folder for several dev, staging, and production workloads. You want to prevent data exfiltration by malicious insiders or compromised code by setting up a security perimeter. However, you do not want to restrict communication between the projects.

What should you do?

A.

Use a Shared VPC to enable communication between all projects, and use firewall rules to prevent data exfiltration.

B.

Create access levels in Access Context Manager to prevent data exfiltration, and use a shared VPC for communication between projects.

C.

Use an infrastructure-as-code software tool to set up a single service perimeter and to deploy a Cloud Function that monitors the "implementation" folder via Stackdriver and Cloud Pub/Sub. When the function notices that a new project is added to the folder, it executes Terraform to add the new project to the associated perimeter.

D.

Use an infrastructure-as-code software tool to set up three different service perimeters for dev, staging, and prod and to deploy a Cloud Function that monitors the "implementation" folder via Stackdriver and Cloud Pub/Sub. When the function notices that a new project is added to the folder, it executes Terraform to add the new project to the respective perimeter.
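
Options C and D both hinge on automating VPC Service Controls membership with a Cloud Function that watches the "implementation" folder. A minimal sketch of such a function follows, assuming an aggregated log sink on the folder forwards CreateProject audit-log entries to the triggering Pub/Sub topic; for brevity it calls the Access Context Manager API directly instead of executing Terraform, and the access policy and perimeter names are placeholders.

    import base64
    import json

    from googleapiclient import discovery  # google-api-python-client

    # Placeholder names -- substitute your own access policy and perimeter.
    ACCESS_POLICY = "accessPolicies/123456789"
    PERIMETER = f"{ACCESS_POLICY}/servicePerimeters/implementation_perimeter"

    def add_project_to_perimeter(event, context):
        """Pub/Sub-triggered Cloud Function (1st gen signature)."""
        entry = json.loads(base64.b64decode(event["data"]).decode("utf-8"))
        # Field path for CreateProject audit entries; may vary by API version.
        project_id = entry["protoPayload"]["request"]["project"]["projectId"]

        # The perimeter lists project *numbers*, so resolve the number first.
        crm = discovery.build("cloudresourcemanager", "v1")
        number = crm.projects().get(projectId=project_id).execute()["projectNumber"]

        # Append the new project to the perimeter's resource list.
        acm = discovery.build("accesscontextmanager", "v1")
        perimeter = acm.accessPolicies().servicePerimeters().get(
            name=PERIMETER).execute()
        resources = perimeter.get("status", {}).get("resources", [])
        resources.append(f"projects/{number}")
        acm.accessPolicies().servicePerimeters().patch(
            name=PERIMETER,
            updateMask="status.resources",
            body={"status": {"resources": resources}},
        ).execute()

The function's service account needs permission to read projects and to edit the perimeter (for example, an Access Context Manager editor role on the policy).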

How should a customer reliably deliver Stackdriver logs from GCP to their on-premises SIEM system?

A.

Send all logs to the SIEM system via an existing protocol such as syslog.

B.

Configure every project to export all of its logs to a common BigQuery dataset, which will be queried by the SIEM system.

C.

Configure Organizational Log Sinks to export logs to a Cloud Pub/Sub Topic, which will be sent to the SIEM via Dataflow.

D.

Build a connector for the SIEM to query for all logs in real time from the GCP RESTful JSON APIs.
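
Option C's first step is an aggregated, organization-level log sink whose destination is a Pub/Sub topic in a dedicated project. A minimal sketch with the gcloud CLI driven from Python; the organization ID, project, and topic names are placeholders.

    import subprocess

    # Placeholder identifiers.
    ORG_ID = "123456789012"
    SIEM_PROJECT = "siem-project"
    TOPIC = "org-logs-to-siem"

    # Aggregated organization sink: logs from every project flow to one topic.
    subprocess.run(
        [
            "gcloud", "logging", "sinks", "create", "org-siem-sink",
            f"pubsub.googleapis.com/projects/{SIEM_PROJECT}/topics/{TOPIC}",
            f"--organization={ORG_ID}",
            "--include-children",
        ],
        check=True,
    )

After creation, grant the sink's writerIdentity (shown by gcloud logging sinks describe) the Pub/Sub Publisher role on the topic, then have a Dataflow pipeline deliver the entries to the on-premises SIEM.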

You are a member of the security team at an organization. Your team has a single GCP project with credit card payment processing systems alongside web applications and data processing systems. You want to reduce the scope of systems subject to PCI audit standards.

What should you do?

A.

Use multi-factor authentication for admin access to the web application.

B.

Use only applications certified compliant with PA-DSS.

C.

Move the cardholder data environment into a separate GCP project.

D.

Use VPN for all connections between your office and cloud environments.

Your organization hosts a financial services application running on Compute Engine instances for a third-party company. The third-party company’s servers that will consume the application also run on Compute Engine in a separate Google Cloud organization. You need to configure a secure network connection between the Compute Engine instances. You have the following requirements:

    The network connection must be encrypted.

    The communication between servers must be over private IP addresses.

What should you do?

A.

Configure a Cloud VPN connection between your organization's VPC network and the third party's VPC network, with traffic controlled by VPC firewall rules.

B.

Configure a VPC peering connection between your organization's VPC network and the third party's VPC network, with traffic controlled by VPC firewall rules.

C.

Configure a VPC Service Controls perimeter around your Compute Engine instances, and provide access to the third party via an access level.

D.

Configure an Apigee proxy that exposes your Compute Engine-hosted application as an API, encrypts traffic with TLS, and allows access only to the third party.
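
Option A pairs Cloud VPN with VPC firewall rules. A minimal sketch of one side of an HA VPN setup, driven from Python via the gcloud CLI; the region, network, peer gateway path, and shared secret are placeholders, and the third party must configure the mirror image in its own organization.

    import subprocess

    def run(*args):
        """Run a gcloud command and raise if it fails."""
        subprocess.run(["gcloud", *args], check=True)

    # Placeholder values for our side of the connection.
    REGION = "us-central1"
    NETWORK = "finserv-vpc"
    PEER_GATEWAY = ("projects/THIRD_PARTY_PROJECT/regions/us-central1/"
                    "vpnGateways/their-ha-gateway")

    # 1. HA VPN gateway and Cloud Router in our VPC.
    run("compute", "vpn-gateways", "create", "our-ha-gateway",
        f"--network={NETWORK}", f"--region={REGION}")
    run("compute", "routers", "create", "our-vpn-router",
        f"--network={NETWORK}", f"--region={REGION}", "--asn=65001")

    # 2. First of the two tunnels toward the third party's HA VPN gateway.
    run("compute", "vpn-tunnels", "create", "tunnel-to-third-party-0",
        f"--region={REGION}", "--ike-version=2",
        "--shared-secret=REPLACE_WITH_A_STRONG_SECRET",
        "--router=our-vpn-router", "--vpn-gateway=our-ha-gateway",
        f"--peer-gcp-gateway={PEER_GATEWAY}", "--interface=0")

The second tunnel, the Cloud Router BGP interfaces and peers, and the firewall rules that restrict traffic to the third party's private ranges are omitted from this sketch.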

You are implementing data protection by design and in accordance with GDPR requirements. As part of design reviews, you are told that you need to manage the encryption key for a solution that includes workloads for Compute Engine, Google Kubernetes Engine, Cloud Storage, BigQuery, and Pub/Sub. Which option should you choose for this implementation?

A.

Cloud External Key Manager

B.

Customer-managed encryption keys

C.

Customer-supplied encryption keys

D.

Google default encryption
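
Customer-managed encryption keys (option B) live in Cloud KMS and are then referenced by each service. A minimal sketch of creating a key ring and key and attaching the key as a Cloud Storage bucket's default encryption key, using the gcloud CLI from Python; the project, location, and resource names are placeholders.

    import subprocess

    # Placeholder values.
    PROJECT = "example-project"
    LOCATION = "europe-west1"
    KEY = (f"projects/{PROJECT}/locations/{LOCATION}/"
           "keyRings/gdpr-ring/cryptoKeys/gdpr-key")

    def run(*args):
        subprocess.run(["gcloud", *args], check=True)

    run("kms", "keyrings", "create", "gdpr-ring",
        f"--location={LOCATION}", f"--project={PROJECT}")
    run("kms", "keys", "create", "gdpr-key", "--keyring=gdpr-ring",
        f"--location={LOCATION}", "--purpose=encryption", f"--project={PROJECT}")

    # Example consumer: make the key the bucket's default encryption key.
    run("storage", "buckets", "update", "gs://example-gdpr-bucket",
        f"--default-encryption-key={KEY}")

Compute Engine disks, GKE, BigQuery, and Pub/Sub accept the same key resource name in their own CMEK settings, and each service's agent account needs roles/cloudkms.cryptoKeyEncrypterDecrypter on the key.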

Your team needs to obtain a unified log view of all development cloud projects in your SIEM. The development projects are under the NONPROD organization folder with the test and pre-production projects. The development projects share the ABC-BILLING billing account with the rest of the organization.

Which logging export strategy should you use to meet the requirements?

A.

1. Export logs to a Cloud Pub/Sub topic with folders/NONPROD parent and includeChildren property set to True in a dedicated SIEM project.

2. Subscribe SIEM to the topic.

B.

1. Create a Cloud Storage sink with billingAccounts/ABC-BILLING parent and includeChildren property set to False in a dedicated SIEM project.

2. Process Cloud Storage objects in SIEM.

C.

1. Export logs in each dev project to a Cloud Pub/Sub topic in a dedicated SIEM project.

2. Subscribe SIEM to the topic.

D.

1. Create a Cloud Storage sink with a publicly shared Cloud Storage bucket in each project.

2. Process Cloud Storage objects in SIEM.
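
In option A's second step, the SIEM consumes the exported entries from the topic. A minimal sketch of a pull subscriber using the google-cloud-pubsub client library; the project, subscription name, and forwarding step are placeholders.

    import json
    from concurrent.futures import TimeoutError

    from google.cloud import pubsub_v1

    # Placeholder identifiers.
    PROJECT_ID = "siem-project"
    SUBSCRIPTION_ID = "nonprod-logs-to-siem"

    subscriber = pubsub_v1.SubscriberClient()
    subscription_path = subscriber.subscription_path(PROJECT_ID, SUBSCRIPTION_ID)

    def callback(message):
        entry = json.loads(message.data.decode("utf-8"))  # a LogEntry as JSON
        # Forward the entry to the SIEM here (syslog, HTTPS collector, etc.).
        print(entry.get("logName"), entry.get("severity"))
        message.ack()

    streaming_pull = subscriber.subscribe(subscription_path, callback=callback)
    try:
        streaming_pull.result(timeout=60)  # pull for 60 seconds in this sketch
    except TimeoutError:
        streaming_pull.cancel()

The subscription is attached to the topic that the folder-level log sink publishes to, so a single export covers every project the sink includes.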

You want data on Compute Engine disks to be encrypted at rest with keys managed by Cloud Key Management Service (KMS). Cloud Identity and Access Management (IAM) permissions to these keys must be managed in a grouped way because the permissions should be the same for all keys.

What should you do?

A.

Create a single KeyRing for all persistent disks and all Keys in this KeyRing. Manage the IAM permissions at the Key level.

B.

Create a single KeyRing for all persistent disks and all Keys in this KeyRing. Manage the IAM permissions at the KeyRing level.

C.

Create a KeyRing per persistent disk, with each KeyRing containing a single Key. Manage the IAM permissions at the Key level.

D.

Create a KeyRing per persistent disk, with each KeyRing containing a single Key. Manage the IAM permissions at the KeyRing level.
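
Options B and D manage permissions at the KeyRing level; in Cloud KMS that is a single IAM binding on the key ring, inherited by every key inside it. A minimal sketch with the gcloud CLI from Python, assuming the key ring and its keys already exist; the names, location, and member are placeholders.

    import subprocess

    # Placeholder values.
    KEYRING = "disk-encryption-ring"
    LOCATION = "us-central1"
    MEMBER = "group:disk-key-admins@example.com"

    # One binding at the key ring level covers every key in the ring.
    subprocess.run(
        ["gcloud", "kms", "keyrings", "add-iam-policy-binding", KEYRING,
         f"--location={LOCATION}",
         f"--member={MEMBER}",
         "--role=roles/cloudkms.cryptoKeyEncrypterDecrypter"],
        check=True,
    )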

You need to centralize your team’s logs for production projects. You want your team to be able to search and analyze the logs using Logs Explorer. What should you do?

A.

Enable Cloud Monitoring workspace, and add the production projects to be monitored.

B.

Use Logs Explorer at the organization level and filter for production project logs.

C.

Create an aggregate org sink at the parent folder of the production projects, and set the destination to a Cloud Storage bucket.

D.

Create an aggregate org sink at the parent folder of the production projects, and set the destination to a logs bucket.
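
Options C and D differ only in the sink destination; a Cloud Logging log bucket keeps the exported entries searchable in Logs Explorer, whereas Cloud Storage objects are not. A minimal sketch of a folder-level aggregated sink routed to a log bucket, using the gcloud CLI from Python; the folder ID, central project, and bucket name are placeholders.

    import subprocess

    # Placeholder identifiers.
    FOLDER_ID = "123456789"
    CENTRAL_PROJECT = "central-logging-project"
    BUCKET = "prod-team-logs"

    def run(*args):
        subprocess.run(["gcloud", *args], check=True)

    # Log bucket in the central project that the team will query.
    run("logging", "buckets", "create", BUCKET, "--location=global",
        f"--project={CENTRAL_PROJECT}")

    # Aggregated sink on the production folder, routed to that bucket.
    run("logging", "sinks", "create", "prod-central-sink",
        "logging.googleapis.com/projects/"
        f"{CENTRAL_PROJECT}/locations/global/buckets/{BUCKET}",
        f"--folder={FOLDER_ID}", "--include-children")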

Your organization processes sensitive health information. You want to ensure that data is encrypted while in use by the virtual machines (VMs). You must create a policy that is enforced across the entire organization.

What should you do?

A.

Implement an organization policy that ensures that all VM resources created across your organization use customer-managed encryption keys (CMEK) protection.

B.

Implement an organization policy that ensures all VM resources created across your organization are Confidential VM instances.

C.

Implement an organization policy that ensures that all VM resources created across your organization use Cloud External Key Manager (EKM) protection.

D.

No action is necessary because Google encrypts data while it is in use by default.
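
Option B maps to the constraints/compute.restrictNonConfidentialComputing organization policy, a list constraint whose denied values are API service names. A minimal sketch of enforcing it organization-wide with the gcloud CLI from Python; the organization ID is a placeholder.

    import subprocess

    ORG_ID = "123456789012"  # placeholder organization ID

    # Deny non-Confidential VM usage of the Compute Engine API org-wide.
    subprocess.run(
        ["gcloud", "resource-manager", "org-policies", "deny",
         "compute.restrictNonConfidentialComputing", "compute.googleapis.com",
         f"--organization={ORG_ID}"],
        check=True,
    )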

Your organization is using Vertex AI Workbench Instances. You must ensure that newly deployed instances are automatically kept up-to-date and that users cannot accidentally alter settings in the operating system. What should you do?

A.

Enable the VM Manager and ensure the corresponding Google Compute Engine instances are added.

B.

Enforce the disableRootAccess and requireAutoUpgradeSchedule organization policies for newly deployed instances.

C.

Assign the AI Notebooks Runner and AI Notebooks Viewer roles to the users of the AI Workbench Instances.

D.

Implement a firewall rule that prevents Secure Shell access to the corresponding Google Compute Engine instances by using tags.
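
Option B refers to the Vertex AI Workbench organization policy constraints ainotebooks.disableRootAccess and ainotebooks.requireAutoUpgradeSchedule. A minimal sketch of setting both with gcloud org-policies set-policy from Python, assuming each can be expressed as a simple enforce rule; the organization ID is a placeholder.

    import subprocess
    import tempfile

    ORG_ID = "123456789012"  # placeholder organization ID

    # Assumed here to be enforce-style constraints.
    CONSTRAINTS = [
        "ainotebooks.disableRootAccess",
        "ainotebooks.requireAutoUpgradeSchedule",
    ]

    for constraint in CONSTRAINTS:
        policy = (
            f"name: organizations/{ORG_ID}/policies/{constraint}\n"
            "spec:\n"
            "  rules:\n"
            "  - enforce: true\n"
        )
        with tempfile.NamedTemporaryFile("w", suffix=".yaml", delete=False) as f:
            f.write(policy)
            path = f.name
        subprocess.run(["gcloud", "org-policies", "set-policy", path], check=True)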