

Google Cloud Certified - Professional Cloud Security Engineer: Questions and Answers

Question # 4

A business unit at a multinational corporation signs up for GCP and starts moving workloads into GCP. The business unit creates a Cloud Identity domain with an organization resource that contains hundreds of projects.

Your team becomes aware of this and wants to take over managing permissions and auditing the domain resources.

Which type of access should your team grant to meet this requirement?

A.

Organization Administrator

B.

Security Reviewer

C.

Organization Role Administrator

D.

Organization Policy Administrator

Question # 5

Your organization is deploying a serverless web application on Cloud Run that must be publicly accessible over HTTPS. To meet security requirements, you need to terminate TLS at the edge, apply threat mitigation, and prepare for geo-based access restrictions. What should you do?

A.

Make the Cloud Run service public by enabling allUsers access. Configure Identity-Aware Proxy (IAP) for authentication and IP-based access control. Use custom SSL certificates for HTTPS.

B.

Assign a custom domain to the Cloud Run service. Enable HTTPS. Configure IAM to allow allUsers to invoke the service. Use firewall rules and VPC Service Controls for geo-based restriction and traffic filtering.

C.

Deploy an external HTTP(S) load balancer with a serverless NEG that points to the Cloud Run service. Use a Google-managed certificate for TLS termination. Configure a Cloud Armor policy with geo-based access control.

D.

Create a Cloud DNS public zone for the Cloud Run URL. Bind a static IP to the service. Use VPC firewall rules to restrict incoming traffic based on IP ranges and threat signatures.
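For context, a geo-based Cloud Armor rule of the kind described in option C can be sketched as follows. This is illustrative only; the policy name, backend service name, and region code are hypothetical placeholders.

```shell
# Sketch: create a Cloud Armor policy with a geo-based deny rule and
# attach it to the load balancer backend in front of the Cloud Run service.
gcloud compute security-policies create edge-policy \
    --description "Edge policy for the serverless NEG backend"

# Deny requests originating outside an allowed country (example: allow only US).
gcloud compute security-policies rules create 1000 \
    --security-policy edge-policy \
    --expression "origin.region_code != 'US'" \
    --action deny-403

# Attach the policy to the load balancer's backend service.
gcloud compute backend-services update serverless-backend \
    --security-policy edge-policy --global
```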

Question # 6

A customer’s company has multiple business units. Each business unit operates independently, and each has its own engineering group. Your team wants visibility into all projects created within the company and wants to organize the Google Cloud Platform (GCP) projects based on different business units. Each business unit also requires separate sets of IAM permissions.

Which strategy should you use to meet these needs?

A.

Create an organization node, and assign folders for each business unit.

B.

Establish standalone projects for each business unit, using gmail.com accounts.

C.

Assign GCP resources in a project, with a label identifying which business unit owns the resource.

D.

Assign GCP resources in a VPC for each business unit to separate network access.

Question # 7

A customer wants to make it convenient for their mobile workforce to access a CRM web interface that is hosted on Google Cloud Platform (GCP). The CRM can only be accessed by someone on the corporate network. The customer wants to make it available over the internet. Your team requires an authentication layer in front of the application that supports two-factor authentication.

Which GCP product should the customer implement to meet these requirements?

A.

Cloud Identity-Aware Proxy

B.

Cloud Armor

C.

Cloud Endpoints

D.

Cloud VPN

Question # 8

Your organization is using Google Cloud to develop and host its applications. Following Google-recommended practices, the team has created dedicated projects for development and production. Your development team is located in Canada and Germany. The operations team works exclusively from Germany to adhere to local laws. You need to ensure that admin access to Google Cloud APIs is restricted to these countries and environments. What should you do?

A.

Create dedicated firewall policies for each environment at the organization level, and then apply these policies to the projects. Create a rule to restrict access based on geolocations.

B.

Group all development and production projects in separate folders. Activate the organization policy on the folders to restrict resource location according to the requirements.

C.

Create dedicated VPC Service Controls perimeters for development and production projects. Configure distinct ingress policies to allow access from the respective countries.

D.

Create dedicated IAM Groups for the Canadian and German developers. Grant access to the development and production projects according to the requirements.

Question # 9

You are responsible for the operation of your company's application that runs on Google Cloud. The database for the application will be maintained by an external partner. You need to give the partner team access to the database. This access must be restricted solely to the database and cannot extend to any other resources within your company's network. Your solution should follow Google-recommended practices. What should you do?

A.

Add a public IP address to the application's database. Create database users for each of the partner's employees. Securely distribute the credentials for these users to the partner team.

B.

Create accounts for the partner team in your corporate identity provider. Synchronize these accounts with Google Cloud Identity. Grant the accounts access to the database.

C.

Ask the partner team to set up Cloud Identity accounts within their own corporate environment and identity provider. Grant the partner’s Cloud Identity accounts access to the database.

D.

Configure Workforce Identity Federation for the partner. Connect the identity pool provider to the partner's identity provider. Grant the workforce pool resources access to the database.

Question # 10

A patch for a vulnerability has been released, and a DevOps team needs to update their running containers in Google Kubernetes Engine (GKE).

How should the DevOps team accomplish this?

A.

Use Puppet or Chef to push out the patch to the running container.

B.

Verify that auto upgrade is enabled; if so, Google will upgrade the nodes in a GKE cluster.

C.

Update the application code or apply a patch, build a new image, and redeploy it.

D.

Configure containers to automatically upgrade when the base image is available in Container Registry.

Question # 11

Your organization has hired a small, temporary partner team for 18 months. The temporary team will work alongside your DevOps team to develop your organization's application that is hosted on Google Cloud. You must give the temporary partner team access to your application's resources on Google Cloud and ensure that partner employees lose access if they are removed from their employer's organization. What should you do?

A.

Implement just-in-time privileged access to Google Cloud for the temporary partner team.

B.

Create a temporary username and password for the temporary partner team members. Auto-clean the usernames and passwords after the work engagement has ended.

C.

Add the identities of the temporary partner team members to your identity provider (IdP).

D.

Create a workforce identity pool and federate the identity pool with the identity provider (IdP) of the temporary partner team.

Question # 12

Your organization is migrating its primary web application from on-premises to Google Kubernetes Engine (GKE). You must advise the development team on how to grant their applications access to Google Cloud services from within GKE according to security recommended practices. What should you do?

A.

Create an application-specific IAM service account and generate a user-managed service account key for it. Inject the key to the workload by storing it as a Kubernetes secret within the same namespace as the application.

B.

Enable Workload Identity for GKE. Assign a Kubernetes service account to the application and configure that Kubernetes service account to act as an Identity and Access Management (IAM) service account. Grant the required roles to the IAM service account.

C.

Configure the GKE nodes to use the default Compute Engine service account.

D.

Create a user-managed service account with only the roles required for the specific workload. Assign this service account to the GKE nodes.
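The Workload Identity setup described in option B can be sketched with the following commands. All cluster, project, namespace, and account names are hypothetical placeholders.

```shell
# 1. Enable the workload pool on the GKE cluster.
gcloud container clusters update my-cluster --region us-central1 \
    --workload-pool=my-project.svc.id.goog

# 2. Create the IAM service account; grant it only the roles the app needs.
gcloud iam service-accounts create app-gsa --project my-project

# 3. Allow the Kubernetes service account to impersonate the IAM service account.
gcloud iam service-accounts add-iam-policy-binding \
    app-gsa@my-project.iam.gserviceaccount.com \
    --role roles/iam.workloadIdentityUser \
    --member "serviceAccount:my-project.svc.id.goog[app-namespace/app-ksa]"

# 4. Annotate the Kubernetes service account with the IAM service account.
kubectl annotate serviceaccount app-ksa --namespace app-namespace \
    iam.gke.io/gcp-service-account=app-gsa@my-project.iam.gserviceaccount.com
```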

Question # 13

A customer’s data science group wants to use Google Cloud Platform (GCP) for their analytics workloads. Company policy dictates that all data must be company-owned and all user authentications must go through their own Security Assertion Markup Language (SAML) 2.0 Identity Provider (IdP). The Infrastructure Operations Systems Engineer was trying to set up Cloud Identity for the customer and realized that their domain was already being used by G Suite.

How should you best advise the Systems Engineer to proceed with the least disruption?

A.

Contact Google Support and initiate the Domain Contestation Process to use the domain name in your new Cloud Identity domain.

B.

Register a new domain name, and use that for the new Cloud Identity domain.

C.

Ask Google to provision the data science manager’s account as a Super Administrator in the existing domain.

D.

Ask customer’s management to discover any other uses of Google managed services, and work with the existing Super Administrator.

Question # 14

All logs in your organization are aggregated into a centralized Google Cloud logging project for analysis and long-term retention. While most of the log data can be viewed by operations teams, there are specific sensitive fields (i.e., protoPayload.authenticationInfo.principalEmail) that contain identifiable information that should be restricted to security teams only. You need to implement a solution that allows different teams to view their respective application logs in the centralized logging project. It must also restrict access to specific sensitive fields within those logs to only a designated security group. Your solution must ensure that other fields in the same log entry remain visible to other authorized groups. What should you do?

A.

Configure field-level access in Cloud Logging by defining data access policies that specify sensitive fields and the authorized principals.

B.

Use Cloud IAM custom roles with specific permissions on logging.privateLogEntries.list. Define field-level access within the custom role's conditions.

C.

Implement a log sink to exclude sensitive fields before logs are sent to the centralized logging project. Create separate sinks for sensitive data.

D.

Create a BigQuery authorized view on the exported log sink to filter out the sensitive fields based on user groups.
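The field-level access approach in option A can be sketched as follows, assuming the centralized logs land in a dedicated log bucket. The bucket name, project ID, and group address are hypothetical placeholders.

```shell
# Sketch: mark the sensitive field as restricted on the central log bucket.
gcloud logging buckets update central-logs --location=global \
    --restricted-fields="protoPayload.authenticationInfo.principalEmail"

# Grant only the security group the ability to view restricted fields.
gcloud projects add-iam-policy-binding my-logging-project \
    --member "group:security-team@example.com" \
    --role roles/logging.fieldAccessor
```

Other teams keep their normal log-viewing roles and see every field except the restricted one.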

Question # 15

Your team uses a service account to authenticate data transfers from a given Compute Engine virtual machine instance to a specified Cloud Storage bucket. An engineer accidentally deletes the service account, which breaks application functionality. You want to recover the application as quickly as possible without compromising security.

What should you do?

A.

Temporarily disable authentication on the Cloud Storage bucket.

B.

Use the undelete command to recover the deleted service account.

C.

Create a new service account with the same name as the deleted service account.

D.

Update the permissions of another existing service account and supply those credentials to the applications.
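The undelete operation in option B can be sketched as below. Note that undelete only works within 30 days of deletion and takes the account's numeric unique ID (visible in audit logs), not its email; the ID shown is a placeholder.

```shell
# Sketch: restore a recently deleted service account by its numeric ID.
gcloud beta iam service-accounts undelete 123456789012345678901
```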

Question # 16

Your organization's application is being integrated with a partner application that requires read access to customer data to process customer orders. The customer data is stored in one of your Cloud Storage buckets. You have evaluated different options and determined that this activity requires the use of service account keys. You must advise the partner on how to minimize the risk of a compromised service account key causing a loss of data. What should you advise the partner to do?

A.

Define a VPC Service Controls perimeter, and restrict the Cloud Storage API. Add an ingress rule to the perimeter to allow access to the Cloud Storage API for the service account from outside of the perimeter.​

B.

Scan the Cloud Storage bucket with Sensitive Data Protection when new data is added, and automatically mask all customer data.​

C.

Ensure that all data for the application that is accessed through the relevant service accounts is encrypted at rest by using customer-managed encryption keys (CMEK).​

D.

Implement a secret management service. Configure the service to frequently rotate the service account key. Configure proper access control to the key, and restrict who can create service account keys.​

Question # 17

A retail customer allows users to upload comments and product reviews. The customer needs to make sure the text does not include sensitive data before the comments or reviews are published.

Which Google Cloud Service should be used to achieve this?

A.

Cloud Key Management Service

B.

Cloud Data Loss Prevention API

C.

BigQuery

D.

Cloud Security Scanner

Question # 18

You want to limit the images that can be used as the source for boot disks. These images will be stored in a dedicated project.

What should you do?

A.

Use the Organization Policy Service to create a compute.trustedImageProjects constraint on the organization level. List the trusted project as the whitelist in an allow operation.

B.

Use the Organization Policy Service to create a compute.trustedImageProjects constraint on the organization level. List the trusted projects as the exceptions in a deny operation.

C.

In Resource Manager, edit the project permissions for the trusted project. Add the organization as member with the role: Compute Image User.

D.

In Resource Manager, edit the organization permissions. Add the project ID as member with the role: Compute Image User.
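A policy file for the constraint in option A could look like the sketch below. The project name and organization ID are hypothetical placeholders.

```shell
# Sketch: restrict boot-disk image sources to a single trusted project.
cat > trusted-images-policy.yaml <<'EOF'
constraint: constraints/compute.trustedImageProjects
listPolicy:
  allowedValues:
    - projects/trusted-images-project
EOF

gcloud resource-manager org-policies set-policy trusted-images-policy.yaml \
    --organization=123456789012
```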

Question # 19

Your company is concerned about unauthorized parties gaining access to the Google Cloud environment by using a fake login page. You must implement a solution to protect against person-in-the-middle attacks.

Which security measure should you use?

A.

Text message or phone call code

B.

Security key

C.

Google Authenticator application

D.

Google prompt

Question # 20

Your organization is rolling out a new continuous integration and delivery (CI/CD) process to deploy infrastructure and applications in Google Cloud. Many teams will use their own instances of the CI/CD workflow, which will run on Google Kubernetes Engine (GKE). The CI/CD pipelines must be designed to securely access Google Cloud APIs.

What should you do?

A.

1. Create a dedicated service account for the CI/CD pipelines. 2. Run the deployment pipelines in a dedicated node pool in the GKE cluster. 3. Use the service account that you created as the identity for the nodes in the pool to authenticate to the Google Cloud APIs.

B.

1. Create service accounts for each deployment pipeline. 2. Generate private keys for the service accounts. 3. Securely store the private keys as Kubernetes Secrets accessible only by the pods that run the specific deployment pipeline.

C.

1. Create individual service accounts for each deployment pipeline. 2. Add an identifier for the pipeline in the service account naming convention. 3. Ensure each pipeline runs on dedicated pods. 4. Use Workload Identity to map a deployment pipeline pod to a service account.

D.

1. Create two service accounts: one for the infrastructure and one for the application deployment. 2. Use workload identities to let the pods run the two pipelines and authenticate with the service accounts. 3. Run the infrastructure and application pipelines in separate namespaces.

Question # 21

You are a Cloud Identity administrator for your organization. In your Google Cloud environment, groups are used to manage user permissions. Each application team has a dedicated group. Your team is responsible for creating these groups, and the application teams can manage the team members on their own through the Google Cloud console. You must ensure that the application teams can only add users from within your organization to their groups.

What should you do?

A.

Change the configuration of the relevant groups in the Google Workspace Admin console to prevent external users from being added to the group.

B.

Set an Identity and Access Management (IAM) policy that includes a condition that restricts group membership to user principals that belong to your organization.

C.

Define an Identity and Access Management (IAM) deny policy that denies the assignment of principals that are outside your organization to the groups in scope.

D.

Export the Cloud Identity logs to BigQuery. Configure an alert for external members added to groups. Have the alert trigger a Cloud Function instance that removes the external members from the group.

Question # 22

A company migrated their entire data center to Google Cloud Platform. It is running thousands of instances across multiple projects managed by different departments. You want to have a historical record of what was running in Google Cloud Platform at any point in time.

What should you do?

A.

Use Resource Manager on the organization level.

B.

Use Forseti Security to automate inventory snapshots.

C.

Use Stackdriver to create a dashboard across all projects.

D.

Use Security Command Center to view all assets across the organization.

Question # 23

Which international compliance standard provides guidelines for information security controls applicable to the provision and use of cloud services?

A.

ISO 27001

B.

ISO 27002

C.

ISO 27017

D.

ISO 27018

Question # 24

Your company's Chief Information Security Officer (CISO) creates a requirement that business data must be stored in specific locations due to regulatory requirements that affect the company's global expansion plans. After working on the details to implement this requirement, you determine the following:

The services in scope are included in the Google Cloud Data Residency Terms.

The business data remains within specific locations under the same organization.

The folder structure can contain multiple data residency locations.

You plan to use the Resource Location Restriction organization policy constraint. At which level in the resource hierarchy should you set the constraint?

A.

Folder

B.

Resource

C.

Project

D.

Organization

Question # 25

Your organization previously stored files in Cloud Storage by using Google-managed encryption keys (GMEK), but has recently updated the internal policy to require customer-managed encryption keys (CMEK). You need to re-encrypt the files quickly and efficiently with minimal cost.

What should you do?

A.

Encrypt the files locally, and then use gsutil to upload the files to a new bucket.

B.

Copy the files to a new bucket with CMEK enabled in a secondary region.

C.

Reupload the files to the same Cloud Storage bucket specifying a key file by using gsutil.

D.

Change the encryption type on the bucket to CMEK, and rewrite the objects.
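The rewrite approach in option D can be sketched with gsutil as follows. The project, key ring, key, and bucket names are hypothetical placeholders.

```shell
# Sketch: set a default CMEK on the bucket, then rewrite the objects in
# place so they are re-encrypted with the new key (no local download/upload).
gsutil kms encryption -k \
    projects/my-project/locations/us/keyRings/my-ring/cryptoKeys/my-key \
    gs://my-bucket

gsutil -m rewrite -k \
    projects/my-project/locations/us/keyRings/my-ring/cryptoKeys/my-key \
    gs://my-bucket/**
```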

Question # 26

A customer wants to move their sensitive workloads to a Compute Engine-based cluster using Managed Instance Groups (MIGs). The jobs are bursty and must be completed quickly. They have a requirement to be able to manage and rotate the encryption keys.

Which boot disk encryption solution should you use on the cluster to meet this customer’s requirements?

A.

Customer-supplied encryption keys (CSEK)

B.

Customer-managed encryption keys (CMEK) using Cloud Key Management Service (KMS)

C.

Encryption by default

D.

Pre-encrypting files before transferring to Google Cloud Platform (GCP) for analysis

Question # 27

Your organization relies heavily on virtual machines (VMs) in Compute Engine. Due to team growth and resource demands, VM sprawl is becoming problematic. Maintaining consistent security hardening and timely package updates poses an increasing challenge. You need to centralize VM image management and automate the enforcement of security baselines throughout the virtual machine lifecycle. What should you do?

A.

Activate Security Command Center Enterprise. Use VM discovery and posture management features to monitor hardening state and trigger automatic responses upon detection of issues.

B.

Create a Cloud Build trigger to build a pipeline that generates hardened VM images. Run vulnerability scans in the pipeline, and store images with passing scans in a registry. Use instance templates pointing to this registry.

C.

Configure the sole-tenancy feature in Compute Engine for all projects. Set up custom organization policies in Policy Controller to restrict the operating systems and image sources that teams are allowed to use.

D.

Use VM Manager to automatically distribute and apply patches to VMs across your projects. Integrate VM Manager with hardened, organization-standard VM images stored in a central repository.

Question # 28

A company is deploying their application on Google Cloud Platform. Company policy requires long-term data to be stored using a solution that can automatically replicate data over at least two geographic places.

Which Storage solution are they allowed to use?

A.

Cloud Bigtable

B.

Cloud BigQuery

C.

Compute Engine SSD Disk

D.

Compute Engine Persistent Disk

Question # 29

You are deploying a web application hosted on Compute Engine. A business requirement mandates that application logs are preserved for 12 years and data is kept within European boundaries. You want to implement a storage solution that minimizes overhead and is cost-effective. What should you do?

A.

Create a Cloud Storage bucket to store your logs in the EUROPE-WEST1 region. Modify your application code to ship logs directly to your bucket for increased efficiency.

B.

Configure your Compute Engine instances to use the Google Cloud's operations suite Cloud Logging agent to send application logs to a custom log bucket in the EUROPE-WEST1 region with a custom retention of 12 years.

C.

Use a Pub/Sub topic to forward your application logs to a Cloud Storage bucket in the EUROPE-WEST1 region.

D.

Configure a custom retention policy of 12 years on your Google Cloud's operations suite log bucket in the EUROPE-WEST1 region.

Question # 30

Your company is deploying a three-tier web application—web, application, and database—on Google Cloud. You need to configure network isolation between tiers to minimize the attack surface. The web tier needs to be accessible from the public internet, the application tier should only be accessible from the web tier, and the database tier should only be accessible from the application tier. Your solution must follow Google-recommended practices. What should you do?

A.

Create three separate VPC networks, one for each tier. Configure VPC Network Peering between the web and application VPCs, and between the application and database VPCs. Use firewall rules to control the traffic.

B.

Create a single subnet for all tiers. Create firewall rules that allow all traffic between instances within the same subnet. Use application-level security to prevent unauthorized access.

C.

Create three subnets within the VPC, one for each tier. Create firewall rules that allow traffic on specific ports on each subnet. Use network tags or service accounts on the VMs to apply the firewall rules.

D.

Create three subnets within the VPC, one for each tier. Enable Private Google Access on each subnet. Create a single firewall rule allowing all traffic between the subnets.

Question # 31

A customer has an analytics workload running on Compute Engine that should have limited internet access.

Your team created an egress firewall rule (priority 1000) to deny all traffic to the internet.

The Compute Engine instances now need to reach out to the public repository to get security updates. What should your team do?

A.

Create an egress firewall rule to allow traffic to the CIDR range of the repository with a priority greater than 1000.

B.

Create an egress firewall rule to allow traffic to the CIDR range of the repository with a priority less than 1000.

C.

Create an egress firewall rule to allow traffic to the hostname of the repository with a priority greater than 1000.

D.

Create an egress firewall rule to allow traffic to the hostname of the repository with a priority less than 1000.
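Two details drive the answer here: VPC firewall rules match IP ranges (CIDRs), not hostnames, and a lower numeric priority takes precedence. An allow rule that beats the priority-1000 deny rule could therefore be sketched as below; the network name and CIDR are hypothetical placeholders.

```shell
# Sketch: an egress allow rule that takes precedence over the priority-1000
# deny-all rule because its numeric priority value is lower (900 < 1000).
gcloud compute firewall-rules create allow-repo-updates \
    --network=analytics-vpc --direction=EGRESS --action=ALLOW \
    --rules=tcp:443 --destination-ranges=203.0.113.0/24 --priority=900
```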

Question # 32

Your organization has a hybrid cloud environment with a data center connected to Google Cloud through a dedicated Cloud Interconnect connection. You need to configure private access from your on-premises hosts to Google APIs, specifically Cloud Storage and BigQuery, without exposing traffic to the public internet. What should you do?

A.

Configure Shared VPC to extend your Google Cloud VPC network to your on-premises environment. Use Private Google Access to access Google APIs.

B.

Use Private Google Access for on-premises hosts. Configure DNS resolution to point to the private.googleapis.com domain.

C.

Configure Cloud NAT on your on-premises network. Configure DNS records in a private DNS zone to send requests to 199.36.153.8/30 to access Google APIs.

D.

Establish VPC peering between your on-premises network and your Google Cloud VPC network. Configure Cloud Firewall rules to allow traffic to Google API IP ranges.

Question # 33

Your team needs to prevent users from creating projects in the organization. Only the DevOps team should be allowed to create projects on behalf of the requester.

Which two tasks should your team perform to handle this request? (Choose two.)

A.

Remove all users from the Project Creator role at the organizational level.

B.

Create an Organization Policy constraint, and apply it at the organizational level.

C.

Grant the Project Editor role at the organizational level to a designated group of users.

D.

Add a designated group of users to the Project Creator role at the organizational level.

E.

Grant the billing account creator role to the designated DevOps team.
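The two IAM changes described in options A and D can be sketched as follows. The organization ID, domain, and group address are hypothetical placeholders.

```shell
# Sketch: remove the broad project-creation grant at the organization level...
gcloud organizations remove-iam-policy-binding 123456789012 \
    --member "domain:example.com" \
    --role roles/resourcemanager.projectCreator

# ...then grant the Project Creator role to the DevOps group only.
gcloud organizations add-iam-policy-binding 123456789012 \
    --member "group:devops-team@example.com" \
    --role roles/resourcemanager.projectCreator
```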

Question # 34

An organization is moving applications to Google Cloud while maintaining a few mission-critical applications on-premises. The organization must transfer the data at a bandwidth of at least 50 Gbps. What should they use to ensure secure continued connectivity between sites?

A.

Dedicated Interconnect

B.

Cloud Router

C.

Cloud VPN

D.

Partner Interconnect

Question # 35

Your company has deployed an application on Compute Engine. The application is accessible by clients on port 587. You need to balance the load between the different instances running the application. The connection should be secured using TLS, and terminated by the Load Balancer.

What type of Load Balancing should you use?

A.

Network Load Balancing

B.

HTTP(S) Load Balancing

C.

TCP Proxy Load Balancing

D.

SSL Proxy Load Balancing

Question # 36

You want data on Compute Engine disks to be encrypted at rest with keys managed by Cloud Key Management Service (KMS). Cloud Identity and Access Management (IAM) permissions to these keys must be managed in a grouped way because the permissions should be the same for all keys.

What should you do?

A.

Create a single KeyRing for all persistent disks and all Keys in this KeyRing. Manage the IAM permissions at the Key level.

B.

Create a single KeyRing for all persistent disks and all Keys in this KeyRing. Manage the IAM permissions at the KeyRing level.

C.

Create a KeyRing per persistent disk, with each KeyRing containing a single Key. Manage the IAM permissions at the Key level.

D.

Create a KeyRing per persistent disk, with each KeyRing containing a single Key. Manage the IAM permissions at the KeyRing level.
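Granting IAM at the key ring level, as in option B, means every key in the ring inherits the same policy. A sketch, with placeholder names:

```shell
# Sketch: one key ring holds all persistent-disk keys; IAM is granted once
# at the key ring level so all keys share identical permissions.
gcloud kms keyrings create disk-keys --location us-central1

gcloud kms keyrings add-iam-policy-binding disk-keys --location us-central1 \
    --member "group:disk-admins@example.com" \
    --role roles/cloudkms.cryptoKeyEncrypterDecrypter
```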

Question # 37

You control network traffic for a folder in your Google Cloud environment. Your folder includes multiple projects and Virtual Private Cloud (VPC) networks. You want to enforce at the folder level that egress connections are limited to IP range 10.58.5.0/24, and only from the VPC network "dev-vpc". You want to minimize implementation and maintenance effort.

What should you do?

A.

1. Attach external IP addresses to the VMs in scope. 2. Configure a VPC firewall rule in "dev-vpc" that allows egress connectivity to IP range 10.58.5.0/24 for all source addresses in this network.

B.

1. Attach external IP addresses to the VMs in scope. 2. Define and apply a hierarchical firewall policy at the folder level to deny all egress connections and to allow egress to IP range 10.58.5.0/24 from network "dev-vpc".

C.

1. Leave the network configuration of the VMs in scope unchanged. 2. Create a new project including a new VPC network "new-vpc". 3. Deploy a network appliance in "new-vpc" to filter access requests and only allow egress connections from "dev-vpc" to 10.58.5.0/24.

D.

1. Leave the network configuration of the VMs in scope unchanged. 2. Enable Cloud NAT for "dev-vpc" and restrict the target range in Cloud NAT to 10.58.5.0/24.

Question # 38

Your company operates an application instance group that is currently deployed behind a Google Cloud load balancer in us-central-1 and is configured to use the Standard Tier network. The infrastructure team wants to expand to a second Google Cloud region, us-east-2. You need to set up a single external IP address to distribute new requests to the instance groups in both regions.

What should you do?

A.

Change the load balancer backend configuration to use network endpoint groups instead of instance groups.

B.

Change the load balancer frontend configuration to use the Premium Tier network, and add the new instance group.

C.

Create a new load balancer in us-east-2 using the Standard Tier network, and assign a static external IP address.

D.

Create a Cloud VPN connection between the two regions, and enable Google Private Access.

Question # 39

Your company has deployed an artificial intelligence model in a central project. This model has a lot of sensitive intellectual property and must be kept strictly isolated from the internet. You must expose the model endpoint only to a defined list of projects in your organization. What should you do?

A.

Within the model project, create an internal Application Load Balancer that points to the model endpoint. Expose this load balancer with Private Service Connect to a configured list of projects.

B.

Create a central project to host Shared VPC networks that are provided to all other projects. Centrally administer all firewall rules in this project to grant access to the model.

C.

Within the model project, create an external Application Load Balancer that points to the model endpoint. Create a Cloud Armor policy to restrict IP addresses to Google Cloud.

D.

Activate Private Google Access in both the model project and in each project that needs to connect to the model. Create a firewall policy to allow connectivity to Private Google Access addresses.

Question # 40

Your company plans to move most of its IT infrastructure to Google Cloud. They want to leverage their existing on-premises Active Directory as an identity provider for Google Cloud. Which two steps should you take to integrate the company’s on-premises Active Directory with Google Cloud and configure access management? (Choose two.)

A.

Use Identity Platform to provision users and groups to Google Cloud.

B.

Use Cloud Identity SAML integration to provision users and groups to Google Cloud.

C.

Install Google Cloud Directory Sync and connect it to Active Directory and Cloud Identity.

D.

Create Identity and Access Management (IAM) roles with permissions corresponding to each Active Directory group.

E.

Create Identity and Access Management (IAM) groups with permissions corresponding to each Active Directory group.

Full Access
Question # 41

Your organization is using Google Workspace, Google Cloud, and a third-party SIEM. You need to export events such as user logins, successful logins, and failed logins to the SIEM. Logs need to be ingested in real time or near real-time. What should you do?

A.

Create a Cloud Storage bucket as a sink for all logs. Configure the SIEM to periodically scan the bucket for new log files.

B.

Create a Cloud Logging sink to export relevant authentication logs to a Pub/Sub topic for SIEM subscription.

C.

Poll Cloud Logging for authentication events using the gcloud logging read command. Forward the events to the SIEM.

D.

Configure Google Workspace to directly send logs to the API endpoint of the third-party SIEM.
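
The sink approach in option B hinges on a Cloud Logging filter that keeps only login events. The sketch below models that filter locally against sample audit entries; the filter string, organization ID, and method names are illustrative assumptions, not captured from a live project.

```python
# Hypothetical filter you might attach to a Pub/Sub log sink (values assumed):
LOGIN_FILTER = (
    'logName="organizations/123/logs/cloudaudit.googleapis.com%2Fdata_access" AND '
    '(protoPayload.methodName="google.login.LoginService.loginSuccess" OR '
    'protoPayload.methodName="google.login.LoginService.loginFailure")'
)

def matches_login_filter(entry: dict) -> bool:
    """Local stand-in for the sink filter: keep only login events."""
    method = entry.get("protoPayload", {}).get("methodName", "")
    return method in (
        "google.login.LoginService.loginSuccess",
        "google.login.LoginService.loginFailure",
    )

sample = [
    {"protoPayload": {"methodName": "google.login.LoginService.loginSuccess"}},
    {"protoPayload": {"methodName": "google.login.LoginService.loginFailure"}},
    {"protoPayload": {"methodName": "storage.objects.get"}},  # not a login event
]
forwarded = [e for e in sample if matches_login_filter(e)]
```

Only the two login entries would be published to the Pub/Sub topic for the SIEM subscription; unrelated data-access events are filtered at the sink, not at the SIEM.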

Full Access
Question # 42

You must ensure that the keys used for at-rest encryption of your data are compliant with your organization's security controls. One security control mandates that keys get rotated every 90 days. You must implement an effective detection strategy to validate if keys are rotated as required. What should you do?

A.

Analyze the crypto key versions of the keys by using data from Cloud Asset Inventory. If an active key is older than 90 days, send an alert message through your incident notification channel.

B.

Identify keys that have not been rotated by using Security Health Analytics. If a key is not rotated after 90 days, a finding in Security Command Center is raised.

C.

Assess the keys in the Cloud Key Management Service by implementing code in Cloud Run. If a key is not rotated after 90 days, raise a finding in Security Command Center.

D.

Define a metric that checks for timely key updates by using Cloud Logging. If a key is not rotated after 90 days, send an alert message through your incident notification channel.
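
The detection logic behind option A is an age check over exported key-version metadata. A minimal sketch, assuming Cloud Asset Inventory rows have been flattened into dicts (the field names here are assumptions for illustration):

```python
from datetime import datetime, timedelta, timezone

ROTATION_PERIOD = timedelta(days=90)

def stale_key_versions(versions, now=None):
    """Flag ENABLED crypto key versions older than the 90-day rotation window.

    `versions` is a list of dicts shaped like flattened Cloud Asset Inventory
    export rows for KMS crypto key versions (shape assumed for this sketch).
    """
    now = now or datetime.now(timezone.utc)
    return [
        v["name"]
        for v in versions
        if v["state"] == "ENABLED" and now - v["create_time"] > ROTATION_PERIOD
    ]

now = datetime(2024, 6, 1, tzinfo=timezone.utc)
versions = [
    {"name": "key-a/v1", "state": "ENABLED",
     "create_time": datetime(2024, 1, 1, tzinfo=timezone.utc)},   # 152 days old
    {"name": "key-b/v3", "state": "ENABLED",
     "create_time": datetime(2024, 5, 1, tzinfo=timezone.utc)},   # 31 days old
    {"name": "key-a/v0", "state": "DESTROYED",
     "create_time": datetime(2023, 1, 1, tzinfo=timezone.utc)},   # ignored
]
alerts = stale_key_versions(versions, now=now)  # ["key-a/v1"]
```

Each name in `alerts` would then be sent through the incident notification channel.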

Full Access
Question # 43

You perform a security assessment on a customer architecture and discover that multiple VMs have public IP addresses. After providing a recommendation to remove the public IP addresses, you are told those VMs need to communicate to external sites as part of the customer's typical operations. What should you recommend to reduce the need for public IP addresses in your customer's VMs?

A.

Google Cloud Armor

B.

Cloud NAT

C.

Cloud Router

D.

Cloud VPN

Full Access
Question # 44

You work for an organization in a regulated industry that has strict data protection requirements. The organization backs up their data in the cloud. To comply with data privacy regulations, this data can only be stored for a specific length of time and must be deleted after this specific period.

You want to automate the compliance with this regulation while minimizing storage costs. What should you do?

A.

Store the data in a persistent disk, and delete the disk at expiration time.

B.

Store the data in a Cloud Bigtable table, and set an expiration time on the column families.

C.

Store the data in a BigQuery table, and set the table's expiration time.

D.

Store the data in a Cloud Storage bucket, and configure the bucket's Object Lifecycle Management feature.

Full Access
Question # 45

Your company wants to determine what products they can build to help customers improve their credit scores depending on their age range. To achieve this, you need to join user information in the company's banking app with customers' credit score data received from a third party. While using this raw data will allow you to complete this task, it exposes sensitive data, which could be propagated into new systems.

This risk needs to be addressed using de-identification and tokenization with Cloud Data Loss Prevention while maintaining the referential integrity across the database. Which cryptographic token format should you use to meet these requirements?

A.

Deterministic encryption

B.

Secure, key-based hashes

C.

Format-preserving encryption

D.

Cryptographic hashing
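
What makes deterministic encryption (option A) suitable here is that the same plaintext always maps to the same token, so joins across tables still work. Cloud DLP implements this with AES-SIV in its CryptoDeterministicConfig; the sketch below uses a keyed HMAC purely as a local, runnable illustration of the referential-integrity property, with a throwaway demo key.

```python
import hashlib
import hmac

SECRET_KEY = b"demo-key-not-for-production"  # stand-in for a securely held key

def tokenize(value: str) -> str:
    """Deterministic token: identical plaintexts yield identical tokens,
    which preserves referential integrity across joined datasets."""
    return hmac.new(SECRET_KEY, value.encode(), hashlib.sha256).hexdigest()[:16]

# The banking-app table and the third-party credit-score table can still be
# joined on the tokenized user ID without exposing the raw identifier:
bank_row = {"user_id": tokenize("user-123"), "age_range": "30-39"}
credit_row = {"user_id": tokenize("user-123"), "score": 710}
```

Random-IV encryption or salted hashing would break the join, and plain cryptographic hashing is not reversible if re-identification is ever required.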

Full Access
Question # 46

Your organization has on-premises hosts that need to access Google Cloud APIs. You must enforce private connectivity between these hosts, minimize costs, and optimize for operational efficiency.

What should you do?

A.

Route all on-premises traffic to Google Cloud through an IPsec VPN tunnel to a VPC with Private Google Access enabled.

B.

Set up VPC peering between the on-premises hosts and the VPC through the internet.

C.

Enforce a security policy that mandates all applications to encrypt data with a Cloud Key Management Service (KMS) key before you send it over the network.

D.

Route all on-premises traffic to Google Cloud through a Dedicated Interconnect or Partner Interconnect connection to a VPC with Private Google Access enabled.

Full Access
Question # 47

Which type of load balancer should you use to maintain client IP by default while using the standard network tier?

A.

SSL Proxy

B.

TCP Proxy

C.

Internal TCP/UDP

D.

TCP/UDP Network

Full Access
Question # 48

Your organization is building a real-time recommendation engine using ML models that process live user activity data stored in BigQuery and Cloud Storage. Each new model developed is saved to Artifact Registry. This new system deploys models to Google Kubernetes Engine and uses Pub/Sub for message queues. Recent industry news has been reporting attacks exploiting ML model supply chains. You need to enhance the security in this serverless architecture, specifically against risks to the development and deployment pipeline. What should you do?

A.

Limit external libraries and dependencies that are used for the ML models as much as possible. Continuously rotate encryption keys that are used to access the user data from BigQuery and Cloud Storage.

B.

Enable container image vulnerability scanning during development and pre-deployment. Enforce Binary Authorization on images deployed from Artifact Registry to your continuous integration and continuous deployment (CI/CD) pipeline.

C.

Thoroughly sanitize all training data prior to model development to reduce risk of poisoning attacks. Use IAM for authorization, and apply role-based restrictions to code repositories and cloud services.

D.

Develop strict firewall rules to limit external traffic to Cloud Run instances. Integrate intrusion detection systems (IDS) for real-time anomaly detection on Pub/Sub message flows.

Full Access
Question # 49

A company is running their webshop on Google Kubernetes Engine and wants to analyze customer transactions in BigQuery. You need to ensure that no credit card numbers are stored in BigQuery.

What should you do?

A.

Create a BigQuery view with regular expressions matching credit card numbers to query and delete affected rows.

B.

Use the Cloud Data Loss Prevention API to redact related infoTypes before data is ingested into BigQuery.

C.

Leverage Security Command Center to scan for the assets of type Credit Card Number in BigQuery.

D.

Enable Cloud Identity-Aware Proxy to filter out credit card numbers before storing the logs in BigQuery.
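
Option B redacts the CREDIT_CARD_NUMBER infoType before ingestion; in production this is a call to the DLP API's deidentify method. As a local, runnable stand-in, the regex below approximates the detection step (a real infoType detector also applies checksum and context rules, so this pattern is only illustrative):

```python
import re

# Rough stand-in for the DLP CREDIT_CARD_NUMBER infoType: 13-16 digits,
# optionally separated by spaces or hyphens. Production code would call the
# Cloud DLP deidentify API instead of this regex.
CARD_RE = re.compile(r"\b\d(?:[ -]?\d){12,15}\b")

def redact(record: str) -> str:
    """Replace anything that looks like a card number before the row is
    written to BigQuery."""
    return CARD_RE.sub("[REDACTED]", record)

row = "order=9913 card=4111 1111 1111 1111 total=42.00"
clean = redact(row)  # card number replaced, other fields untouched
```

Redacting before ingestion (rather than querying and deleting afterwards, as in option A) means the sensitive data never lands in BigQuery at all.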

Full Access
Question # 50

Your organization wants to protect its supply chain from attacks. You need to automatically scan your deployment pipeline for vulnerabilities and ensure only scanned and verified containers can be executed in your production environment. You want to minimize management overhead. What should you do?

A.

Deploy all container images to a staging environment and use Container Threat Detection to detect malicious content before promoting them to production.

B.

Review container images before deployment to production, checking for known vulnerabilities using a public vulnerability database. Use Grafeas and Kritis to prevent deployment of containers that haven't been built using your build pipeline.

C.

Use Cloud Next Generation Firewall (Cloud NGFW) Enterprise with traffic inspection to restrict access to containerized applications in the production environment.

D.

Integrate Artifact Registry vulnerability scanning and Binary Authorization into your CI/CD pipeline to ensure only verified images are deployed to production.

Full Access
Question # 51

You have been tasked with implementing external web application protection against common web application attacks for a public application on Google Cloud. You want to validate these policy changes before they are enforced. What service should you use?

A.

Google Cloud Armor's preconfigured rules in preview mode

B.

Prepopulated VPC firewall rules in monitor mode

C.

The inherent protections of Google Front End (GFE)

D.

Cloud Load Balancing firewall rules

E.

VPC Service Controls in dry run mode

Full Access
Question # 52

While migrating your organization’s infrastructure to GCP, a large number of users will need to access GCP Console. The Identity Management team already has a well-established way to manage your users and wants to keep using your existing Active Directory or LDAP server along with the existing SSO password.

What should you do?

A.

Manually synchronize the data in Google domain with your existing Active Directory or LDAP server.

B.

Use Google Cloud Directory Sync to synchronize the data in Google domain with your existing Active Directory or LDAP server.

C.

Users sign in directly to the GCP Console using the credentials from your on-premises Kerberos compliant identity provider.

D.

Users sign in using OpenID (OIDC) compatible IdP, receive an authentication token, then use that token to log in to the GCP Console.

Full Access
Question # 53

A large e-retailer is moving to Google Cloud Platform with its ecommerce website. The company wants to ensure payment information is encrypted between the customer’s browser and GCP when customers check out online.

What should they do?

A.

Configure an SSL Certificate on an L7 Load Balancer and require encryption.

B.

Configure an SSL Certificate on a Network TCP Load Balancer and require encryption.

C.

Configure the firewall to allow inbound traffic on port 443, and block all other inbound traffic.

D.

Configure the firewall to allow outbound traffic on port 443, and block all other outbound traffic.

Full Access
Question # 54

You are in charge of creating a new Google Cloud organization for your company. Which two actions should you take when creating the super administrator accounts? (Choose two.)

A.

Create an access level in the Google Admin console to prevent super admin from logging in to Google Cloud.

B.

Disable any Identity and Access Management (IAM) roles for super admin at the organization level in the Google Cloud Console.

C.

Use a physical token to secure the super admin credentials with multi-factor authentication (MFA).

D.

Use a private connection to create the super admin accounts to avoid sending your credentials over the Internet.

E.

Provide non-privileged identities to the super admin users for their day-to-day activities.

Full Access
Question # 55

Your company has multiple teams needing access to specific datasets across various Google Cloud data services for different projects. You need to ensure that team members can only access the data relevant to their projects and prevent unauthorized access to sensitive information within BigQuery, Cloud Storage, and Cloud SQL. What should you do?

A.

Grant project-level group permissions by using specific Cloud IAM roles. Use BigQuery authorized views, Cloud Storage uniform bucket-level access, and Cloud SQL database roles.

B.

Configure an access level to control access to the Google Cloud console for users managing these data services. Require multi-factor authentication for all access attempts.

C.

Use VPC Service Controls to create security perimeters around the projects for BigQuery, Cloud Storage, and Cloud SQL services, restricting access based on the network origin of the requests.

D.

Enable project-level data access logs for BigQuery, Cloud Storage, and Cloud SQL. Configure log sinks to export these logs to Security Command Center to identify unauthorized access attempts.

Full Access
Question # 56

Your organization’s Google Cloud VMs are deployed via an instance template that configures them with a public IP address in order to host web services for external users. The VMs reside in a service project that is attached to a host (VPC) project containing one custom Shared VPC for the VMs. You have been asked to reduce the exposure of the VMs to the internet while continuing to service external users. You have already recreated the instance template without a public IP address configuration to launch the managed instance group (MIG). What should you do?

A.

Deploy a Cloud NAT Gateway in the service project for the MIG.

B.

Deploy a Cloud NAT Gateway in the host (VPC) project for the MIG.

C.

Deploy an external HTTP(S) load balancer in the service project with the MIG as a backend.

D.

Deploy an external HTTP(S) load balancer in the host (VPC) project with the MIG as a backend.

Full Access
Question # 57

Your organization wants to publish yearly reports of your website usage analytics. You must ensure that no data with personally identifiable information (PII) is published by using the Cloud Data Loss Prevention (Cloud DLP) API. Data integrity must be preserved. What should you do?

A.

Encrypt the PII from the report by using the Cloud DLP API.

B.

Discover and transform PII data in your reports by using the Cloud DLP API.

C.

Detect all PII in storage by using the Cloud DLP API. Create a cloud function to delete the PII.

D.

Discover and quarantine your PII data in your storage by using the Cloud DLP API.

Full Access
Question # 58

Your organization uses Google Workspace Enterprise Edition for authentication. You are concerned about employees leaving their laptops unattended for extended periods of time after authenticating into Google Cloud. You must prevent malicious people from using an employee's unattended laptop to modify their environment.

What should you do?

A.

Create a policy that requires employees to not leave their sessions open for long durations.

B.

Review and disable unnecessary Google Cloud APIs.

C.

Require strong passwords and 2SV through a security token or Google Authenticator.

D.

Set the session length timeout for Google Cloud services to a shorter duration.

Full Access
Question # 59

A customer deployed an application on Compute Engine that takes advantage of the elastic nature of cloud computing.

How can you work with Infrastructure Operations Engineers to best ensure that Windows Compute Engine VMs are up to date with all the latest OS patches?

A.

Build new base images when patches are available, and use a CI/CD pipeline to rebuild VMs, deploying incrementally.

B.

Federate a Domain Controller into Compute Engine, and roll out weekly patches via Group Policy Object.

C.

Use Deployment Manager to provision updated VMs into new serving Instance Groups (IGs).

D.

Reboot all VMs during the weekly maintenance window and allow the StartUp Script to download the latest patches from the internet.

Full Access
Question # 60

You define central security controls in your Google Cloud environment. For one of the folders in your organization, you set an organization policy to deny the assignment of external IP addresses to VMs. Two days later, you receive an alert about a new VM with an external IP address under that folder.

What could have caused this alert?

A.

The VM was created with a static external IP address that was reserved in the project before the organizational policy rule was set.

B.

The organization policy constraint wasn't properly enforced and is running in "dry run" mode.

C.

At the project level, the organization policy constraint has been overridden with an "allow" value.

D.

The policy constraint at the folder level has no effect because of an "allow" value for that constraint at the organization level.
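
The scenario turns on how organization policies resolve down the resource hierarchy. The sketch below is a deliberately simplified local model, under the assumption that the policy set closest to the resource takes effect (the real engine also supports merged inheritance for list constraints), to show how a project-level "allow" can defeat a folder-level "deny":

```python
# Simplified model of org-policy resolution for an external-IP constraint.
# Assumption: the deepest node that sets the constraint wins; this ignores
# merge semantics the real engine applies to list constraints.
HIERARCHY = ["organization", "folder", "project"]  # root to leaf

def effective_policy(policies: dict) -> str:
    """Return the enforced value for a resource at the bottom of HIERARCHY."""
    effective = "allow"  # default when no policy is set anywhere
    for level in HIERARCHY:
        if level in policies:
            effective = policies[level]
    return effective

folder_only = effective_policy({"folder": "deny"})                      # deny
overridden = effective_policy({"folder": "deny", "project": "allow"})   # allow
```

With only the folder policy set, the VM creation is blocked; once a project-level override exists, VMs in that project can again receive external IPs, which is the situation the alert describes.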

Full Access
Question # 61

Your organization acquired a new workload. The Web and Application (App) servers will be running on Compute Engine in a newly created custom VPC. You are responsible for configuring a secure network communication solution that meets the following requirements:

Only allows communication between the Web and App tiers.

Enforces consistent network security when autoscaling the Web and App tiers.

Prevents Compute Engine Instance Admins from altering network traffic.

What should you do?

A.

1. Configure all running Web and App servers with respective network tags. 2. Create an allow VPC firewall rule that specifies the target/source with respective network tags.

B.

1. Configure all running Web and App servers with respective service accounts. 2. Create an allow VPC firewall rule that specifies the target/source with respective service accounts.

C.

1. Re-deploy the Web and App servers with instance templates configured with respective network tags. 2. Create an allow VPC firewall rule that specifies the target/source with respective network tags.

D.

1. Re-deploy the Web and App servers with instance templates configured with respective service accounts. 2. Create an allow VPC firewall rule that specifies the target/source with respective service accounts.

Full Access
Question # 62

Your team wants to centrally manage GCP IAM permissions from their on-premises Active Directory Service. Your team wants to manage permissions by AD group membership.

What should your team do to meet these requirements?

A.

Set up Cloud Directory Sync to sync groups, and set IAM permissions on the groups.

B.

Set up SAML 2.0 Single Sign-On (SSO), and assign IAM permissions to the groups.

C.

Use the Cloud Identity and Access Management API to create groups and IAM permissions from Active Directory.

D.

Use the Admin SDK to create groups and assign IAM permissions from Active Directory.

Full Access
Question # 63

You are setting up Cloud Identity for your company's Google Cloud organization. User accounts will be provisioned from Microsoft Entra ID through Directory Sync, and there will be single sign-on through Entra ID. You need to secure the super administrator accounts for the organization. Your solution must follow the principle of least privilege and implement strong authentication. What should you do?

A.

Create dedicated accounts for super administrators. Ensure that 2-step verification is enforced for the super administrator accounts in Entra ID.

B.

Create dedicated accounts for super administrators. Enforce Google 2-step verification for the super administrator accounts.

C.

Create accounts that combine the organization administrator and the super administrator privileges. Ensure that 2-step verification is enforced for the super administrator accounts in Entra ID.

D.

Create accounts that combine the organization administrators and the super administrator privileges. Enforce Google 2-step verification for the super administrator accounts.

Full Access
Question # 64

Your team needs to make sure that a Compute Engine instance does not have access to the internet or to any Google APIs or services.

Which two settings must remain disabled to meet these requirements? (Choose two.)

A.

Public IP

B.

IP Forwarding

C.

Private Google Access

D.

Static routes

E.

IAM Network User Role

Full Access
Question # 65

An organization’s typical network and security review consists of analyzing application transit routes, request handling, and firewall rules. They want to enable their developer teams to deploy new applications without the overhead of this full review.

How should you advise this organization?

A.

Use Forseti with Firewall filters to catch any unwanted configurations in production.

B.

Mandate use of infrastructure as code and provide static analysis in the CI/CD pipelines to enforce policies.

C.

Route all VPC traffic through customer-managed routers to detect malicious patterns in production.

D.

All production applications will run on-premises. Allow developers free rein in GCP as their dev and QA platforms.

Full Access
Question # 66

You are consulting with a client that requires end-to-end encryption of application data (including data in transit, data in use, and data at rest) within Google Cloud. Which options should you utilize to accomplish this? (Choose two.)

A.

External Key Manager

B.

Customer-supplied encryption keys

C.

Hardware Security Module

D.

Confidential Computing and Istio

E.

Client-side encryption

Full Access
Question # 67

A customer needs an alternative to storing their plain text secrets in their source-code management (SCM) system.

How should the customer achieve this using Google Cloud Platform?

A.

Use Cloud Source Repositories, and store secrets in Cloud SQL.

B.

Encrypt the secrets with a Customer-Managed Encryption Key (CMEK), and store them in Cloud Storage.

C.

Run the Cloud Data Loss Prevention API to scan the secrets, and store them in Cloud SQL.

D.

Deploy the SCM to a Compute Engine VM with local SSDs, and enable preemptible VMs.

Full Access
Question # 68

You are setting up Cloud Identity for your company's Google Cloud organization. User accounts will be provisioned from Microsoft Entra ID through Directory Sync and there will be a single sign-on through Entra ID. You need to secure the super administrator accounts for the organization. Your solution must follow the principle of least privilege and implement strong authentication. What should you do?

A.

Create dedicated accounts for super administrators. Ensure that 2-step verification is enforced for the super administrator accounts in Entra ID.

B.

Create dedicated accounts for super administrators. Enforce Google 2-step verification for the super administrator accounts.

C.

Create accounts that combine the organization administrator and the super administrator privileges. Ensure that 2-step verification is enforced for the super administrator accounts in Entra ID.

D.

Create accounts that combine the organization administrators and the super administrator privileges. Enforce Google 2-step verification for the super administrator accounts.

Full Access
Question # 69

You are a member of the security team at an organization. Your team has a single GCP project with credit card payment processing systems alongside web applications and data processing systems. You want to reduce the scope of systems subject to PCI audit standards.

What should you do?

A.

Use multi-factor authentication for admin access to the web application.

B.

Use only applications certified compliant with PA-DSS.

C.

Move the cardholder data environment into a separate GCP project.

D.

Use VPN for all connections between your office and cloud environments.

Full Access
Question # 70

Your organization has established a highly sensitive project within a VPC Service Controls perimeter. You need to ensure that only users meeting specific contextual requirements such as having a company-managed device, a specific location, and a valid user identity can access resources within this perimeter. You want to evaluate the impact of this change without blocking legitimate access. What should you do?

A.

Configure a VPC Service Controls perimeter in dry run mode, and enforce strict network segmentation using firewall rules. Use multi-factor authentication (MFA) for user verification.

B.

Use Cloud Audit Logs to monitor user access to the project resources. Use post-incident analysis to identify unauthorized access attempts.

C.

Establish a Context-Aware Access policy that specifies the required contextual attributes, and associate the policy with the VPC Service Controls perimeter in dry run mode.

D.

Use the VPC Service Controls violation dashboard to identify details about access denials by service perimeters.

Full Access
Question # 71

Your company’s new CEO recently sold two of the company’s divisions. Your Director asks you to help migrate the Google Cloud projects associated with those divisions to a new organization node. Which preparation steps are necessary before this migration occurs? (Choose two.)

A.

Remove all project-level custom Identity and Access Management (IAM) roles.

B.

Disallow inheritance of organization policies.

C.

Identify inherited Identity and Access Management (IAM) roles on projects to be migrated.

D.

Create a new folder for all projects to be migrated.

E.

Remove the specific migration projects from any VPC Service Controls perimeters and bridges.

Full Access
Question # 72

Your Google Cloud environment has one organization node, one folder named "Apps," and several projects within that folder. The organization node enforces the constraints/iam.allowedPolicyMemberDomains organization policy, which allows members from the terramearth.com organization. The "Apps" folder enforces the constraints/iam.allowedPolicyMemberDomains organization policy, which allows members from the flowlogistic.com organization. It also has the inheritFromParent: false property.

You attempt to grant access to a project in the "Apps" folder to the user testuser@terramearth.com.

What is the result of your action and why?

A.

The action fails because a constraints/iam.allowedPolicyMemberDomains organization policy must be defined on the current project to deactivate the constraint temporarily.

B.

The action fails because a constraints/iam.allowedPolicyMemberDomains organization policy is in place and only members from the flowlogistic.com organization are allowed.

C.

The action succeeds because members from both organizations, terramearth.com or flowlogistic.com, are allowed on projects in the "Apps" folder.

D.

The action succeeds and the new member is successfully added to the project's Identity and Access Management (IAM) policy because all policies are inherited by underlying folders and projects.
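
The key detail is inheritFromParent: false, which discards the parent's allowed domains rather than merging with them. A minimal local model of that resolution (a simplification of the real org-policy engine, with made-up dict shapes):

```python
def allowed_domains(node_policies, path):
    """Effective allowed-domain set for the last node in `path`.

    Each policy is {"domains": set, "inherit_from_parent": bool}; a node with
    inherit_from_parent=False discards everything inherited from above it.
    Simplified model of constraints/iam.allowedPolicyMemberDomains resolution.
    """
    effective = set()
    for node in path:
        policy = node_policies.get(node)
        if policy is None:
            continue
        if not policy["inherit_from_parent"]:
            effective = set()  # drop the parent's allowed domains
        effective |= policy["domains"]
    return effective

policies = {
    "org": {"domains": {"terramearth.com"}, "inherit_from_parent": True},
    "apps-folder": {"domains": {"flowlogistic.com"}, "inherit_from_parent": False},
}
eff = allowed_domains(policies, ["org", "apps-folder"])
grant_allowed = "testuser@terramearth.com".split("@")[1] in eff  # False
```

Because only flowlogistic.com survives at the folder, the IAM grant for the terramearth.com user is rejected, matching answer B.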

Full Access
Question # 73

You recently joined the networking team supporting your company's Google Cloud implementation. You are tasked with familiarizing yourself with the firewall rules configuration and providing recommendations based on your networking and Google Cloud experience. What product should you recommend to detect firewall rules that are overlapped by attributes from other firewall rules with higher or equal priority?

A.

Security Command Center

B.

Firewall Rules Logging

C.

VPC Flow Logs

D.

Firewall Insights

Full Access
Question # 74

A customer is collaborating with another company to build an application on Compute Engine. The customer is building the application tier in their GCP Organization, and the other company is building the storage tier in a different GCP Organization. This is a 3-tier web application. Communication between portions of the application must not traverse the public internet by any means.

Which connectivity option should be implemented?

A.

VPC peering

B.

Cloud VPN

C.

Cloud Interconnect

D.

Shared VPC

Full Access
Question # 75

Your organization is using Vertex AI Workbench Instances. You must ensure that newly deployed instances are automatically kept up-to-date and that users cannot accidentally alter settings in the operating system. What should you do?

A.

Enable the VM Manager and ensure the corresponding Google Compute Engine instances are added.

B.

Enforce the disableRootAccess and requireAutoUpgradeSchedule organization policies for newly deployed instances.

C.

Assign the AI Notebooks Runner and AI Notebooks Viewer roles to the users of the AI Workbench Instances.

D.

Implement a firewall rule that prevents Secure Shell access to the corresponding Google Compute Engine instances by using tags.

Full Access
Question # 76

Your team wants to limit users with administrative privileges at the organization level.

Which two roles should your team restrict? (Choose two.)

A.

Organization Administrator

B.

Super Admin

C.

GKE Cluster Admin

D.

Compute Admin

E.

Organization Role Viewer

Full Access
Question # 77

You need to audit the network segmentation for your Google Cloud footprint. You currently operate Production and Non-Production infrastructure-as-a-service (IaaS) environments. All your VM instances are deployed without any service account customization.

After observing the traffic in your custom network, you notice that all instances can communicate freely, despite the tag-based VPC firewall rules with a priority of 1000 that are in place to segment traffic properly. What are the most likely reasons for this behavior?

A.

All VM instances are missing the respective network tags.

B.

All VM instances are residing in the same network subnet.

C.

All VM instances are configured with the same network route.

D.

A VPC firewall rule is allowing traffic between source/targets based on the same service account with priority 999.

E.

A VPC firewall rule is allowing traffic between source/targets based on the same service account with priority 1001.
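
VPC firewall evaluation picks, among the rules whose conditions match, the one with the lowest numeric priority. The sketch below simulates that rule, showing how a priority-999 allow keyed on a shared service account shadows the priority-1000 tag-based segmentation (rule shapes and names are assumptions for illustration):

```python
# Simplified VPC firewall evaluation: lowest numeric priority wins among
# matching rules; unmatched ingress traffic falls through to the implied deny.
def evaluate(rules, packet):
    applicable = [r for r in rules if r["match"](packet)]
    if not applicable:
        return "deny"  # implied ingress deny
    return min(applicable, key=lambda r: r["priority"])["action"]

rules = [
    # Tag-based segmentation: deny traffic whose target lacks the right tag.
    {"priority": 1000, "action": "deny",
     "match": lambda p: p["target_tag"] != p["required_tag"]},
    # Broad allow keyed on the shared default service account (the culprit).
    {"priority": 999, "action": "allow",
     "match": lambda p: p["source_sa"] == p["target_sa"]},
]

# Instances deployed without service account customization all share one SA:
packet = {"target_tag": "web", "required_tag": "app",
          "source_sa": "default-sa", "target_sa": "default-sa"}
verdict = evaluate(rules, packet)  # "allow": the 999 rule wins over the 1000 deny
```

A service-account-based allow at priority 1001 would lose to the tag rules, which is why only the priority-999 variant explains unrestricted communication.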

Full Access
Question # 78

You work for a large organization that recently implemented a 100 Gbps Cloud Interconnect connection between your Google Cloud and your on-premises edge router. While routinely checking the connectivity, you noticed that the connection is operational but there is an error message that indicates MACsec is operationally down. You need to resolve this error. What should you do?

A.

Ensure that the active pre-shared key created for MACsec is not expired on both the on-premises and Google edge routers.

B.

Ensure that the active pre-shared key matches on both the on-premises and Google edge routers.

C.

Ensure that the Cloud Interconnect connection supports MACsec.

D.

Ensure that the on-premises router is not down.

Full Access
Question # 79

You work for a financial organization in a highly regulated industry that is subject to active regulatory compliance. To meet compliance requirements, you need to continuously maintain a specific set of configurations, data residency, organizational policies, and personnel data access controls. What should you do?

A.

Create an Assured Workloads folder for your required compliance program to apply defined controls and requirements.

B.

Create a posture.yaml file with the required security compliance posture. Apply the posture with the gcloud scc postures create POSTURE_NAME --posture-from-file=posture.yaml command in Security Command Center Premium.

C.

Apply an organizational policy constraint at the organization level to limit the location of new resource creation.

D.

Go to the Compliance page in Security Command Center. View the report for your status against the required compliance standard. Triage violations to maintain compliance on a regular basis.

Full Access
Question # 80

Your organization has had a few recent DDoS attacks. You need to authenticate responses to domain name lookups. Which Google Cloud service should you use?

A.

Cloud DNS with DNSSEC

B.

Cloud NAT

C.

HTTP(S) Load Balancing

D.

Google Cloud Armor

Full Access
Question # 81

You are in charge of migrating a legacy application from your company datacenters to GCP before the current maintenance contract expires. You do not know what ports the application is using and no documentation is available for you to check. You want to complete the migration without putting your environment at risk.

What should you do?

A.

Migrate the application into an isolated project using a “Lift & Shift” approach. Enable all internal TCP traffic using VPC Firewall rules. Use VPC Flow logs to determine what traffic should be allowed for the application to work properly.

B.

Migrate the application into an isolated project using a “Lift & Shift” approach in a custom network. Disable all traffic within the VPC and look at the Firewall logs to determine what traffic should be allowed for the application to work properly.

C.

Refactor the application into a micro-services architecture in a GKE cluster. Disable all traffic from outside the cluster using Firewall Rules. Use VPC Flow logs to determine what traffic should be allowed for the application to work properly.

D.

Refactor the application into a micro-services architecture hosted in Cloud Functions in an isolated project. Disable all traffic from outside your project using Firewall Rules. Use VPC Flow logs to determine what traffic should be allowed for the application to work properly.
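The deny-and-observe approach described in option B can be sketched with a low-priority deny-all ingress rule that has Firewall Rules Logging enabled; the network name is a hypothetical placeholder, and the log filter shown is illustrative:

```shell
# Low-priority rule that denies all ingress in the custom network and
# logs every matched connection attempt (legacy-app-net is a placeholder)
gcloud compute firewall-rules create deny-all-ingress \
    --network=legacy-app-net \
    --direction=INGRESS \
    --action=DENY \
    --rules=all \
    --priority=65534 \
    --enable-logging

# Read the firewall logs to see which ports the application actually
# tried to use (the filter is a simplified illustration)
gcloud logging read 'resource.type="gce_subnetwork" AND log_name:"firewall"' \
    --limit=20
```

Higher-priority allow rules can then be added one port at a time as legitimate flows appear in the logs.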

Full Access
Question # 82

You are exporting application logs to Cloud Storage. You encounter an error message that the log sinks don't support uniform bucket-level access policies. How should you resolve this error?

A.

Change the access control model for the bucket

B.

Update your sink with the correct bucket destination.

C.

Add the roles/logging.logWriter Identity and Access Management (IAM) role to the bucket for the log sink identity.

D.

Add the roles/logging.bucketWriter Identity and Access Management (IAM) role to the bucket for the log sink identity.

Full Access
Question # 83

Your security team wants to reduce the risk of user-managed keys being mismanaged and compromised. To achieve this, you need to prevent developers from creating user-managed service account keys for projects in their organization. How should you enforce this?

A.

Configure Secret Manager to manage service account keys.

B.

Enable an organization policy to disable service accounts from being created.

C.

Enable an organization policy to prevent service account keys from being created.

D.

Remove the iam.serviceAccounts.getAccessToken permission from users.
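The org-policy approach in option C maps to the iam.disableServiceAccountKeyCreation boolean constraint. A hedged sketch (the organization ID is a placeholder) using the legacy resource-manager surface:

```shell
# Enforce the boolean constraint that blocks user-managed service
# account key creation org-wide (123456789012 is a placeholder org ID)
gcloud resource-manager org-policies enable-enforce \
    iam.disableServiceAccountKeyCreation \
    --organization=123456789012
```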

Full Access
Question # 84

Your security team uses encryption keys to ensure confidentiality of user data. You want to establish a process to reduce the impact of a potentially compromised symmetric encryption key in Cloud Key Management Service (Cloud KMS).

Which steps should your team take before an incident occurs? (Choose two.)

A.

Disable and revoke access to compromised keys.

B.

Enable automatic key version rotation on a regular schedule.

C.

Manually rotate key versions on an ad hoc schedule.

D.

Limit the number of messages encrypted with each key version.

E.

Disable the Cloud KMS API.
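Automatic rotation (option B) limits how much ciphertext any single key version protects, which shrinks the blast radius of a compromise. A minimal sketch, with placeholder key, keyring, location, and timestamp values:

```shell
# Configure automatic rotation every 30 days on an existing key
# (key, keyring, location, and next-rotation-time are placeholders)
gcloud kms keys update my-symmetric-key \
    --keyring=my-keyring \
    --location=us-central1 \
    --rotation-period=30d \
    --next-rotation-time=2026-01-01T00:00:00Z
```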

Full Access
Question # 85

Your team wants to make sure Compute Engine instances running in your production project do not have public IP addresses. The frontend application Compute Engine instances will require public IPs. The product engineers have the Editor role to modify resources. Your team wants to enforce this requirement.

How should your team meet these requirements?

A.

Enable Private Access on the VPC network in the production project.

B.

Remove the Editor role and grant the Compute Admin IAM role to the engineers.

C.

Set up an organization policy to only permit public IPs for the front-end Compute Engine instances.

D.

Set up a VPC network with two subnets: one with public IPs and one without public IPs.
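The org-policy approach in option C uses the compute.vmExternalIpAccess list constraint, which allows external IPs only on explicitly listed instances. A hedged sketch with placeholder project, zone, and instance names:

```shell
# Permit external IPs only on the approved frontend instance
# (prod-project, us-central1-a, and frontend-1 are placeholders)
gcloud resource-manager org-policies allow \
    compute.vmExternalIpAccess \
    projects/prod-project/zones/us-central1-a/instances/frontend-1 \
    --project=prod-project
```

Because the policy is enforced at resource creation time, engineers with the Editor role cannot bypass it.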

Full Access
Question # 86

Your company wants to deploy 2-step verification (2SV). The organizational unit (OU) structure of your company is divided into four departmental units: Human Resources, Finance, Engineering, and Marketing. You need to prevent many access issues from occurring at the same time. Your solution should minimize complexity in management and configuration. What should you do?

A.

Create a single new OU to configure enforcement of 2SV to certain users but not others.

B.

Create configuration groups, and enable a phased migration to control the number of individuals in which to enforce 2SV.

C.

In the Admin console, for each OU, check the checkbox to Allow users to turn on 2-Step Verification and set Enforcement to Off.

D.

In the Admin console, for each OU, uncheck the checkbox to Allow users to turn on 2-Step Verification and set Enforcement to On.

Full Access
Question # 87

Your company must follow industry-specific regulations. Therefore, you need to enforce customer-managed encryption keys (CMEK) for all new Cloud Storage resources in the organization called org1.

What command should you execute?

A.

• organization policy: constraints/gcp.restrictStorageNonCmekServices
• binding at: org1
• policy type: deny
• policy value: storage.googleapis.com

B.

• organization policy: constraints/gcp.restrictNonCmekServices
• binding at: org1
• policy type: deny
• policy value: storage.googleapis.com

C.

• organization policy: constraints/gcp.restrictStorageNonCmekServices
• binding at: org1
• policy type: allow
• policy value: all supported services

D.

• organization policy: constraints/gcp.restrictNonCmekServices
• binding at: org1
• policy type: allow
• policy value: storage.googleapis.com
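A deny entry in gcp.restrictNonCmekServices for storage.googleapis.com means new Cloud Storage resources must use CMEK. As a hedged sketch (the organization ID is a placeholder), this could be applied with:

```shell
# Deny CMEK-less resource creation for Cloud Storage across the org
# (123456789012 is a placeholder organization ID)
gcloud resource-manager org-policies deny \
    gcp.restrictNonCmekServices \
    storage.googleapis.com \
    --organization=123456789012
```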

Full Access
Question # 88

Your company is migrating a customer database that contains personally identifiable information (PII) to Google Cloud. To prevent accidental exposure, this data must be protected at rest. You need to ensure that all PII is automatically discovered and redacted, or pseudonymized, before any type of analysis. What should you do?

A.

Implement Cloud Armor to protect the database from external threats and configure firewall rules to restrict network access to only authorized internal IP addresses.

B.

Configure Sensitive Data Protection to scan the database for PII using both predefined and custom infoTypes and to mask sensitive data.

C.

Use Cloud KMS to encrypt the database at rest with a customer-managed encryption key (CMEK). Implement VPC Service Controls.

D.

Create Cloud Storage buckets with object versioning enabled, and use IAM policies to restrict access to the data. Use Data Loss Prevention API (DLP API) on the buckets to scan for sensitive data and generate detection alerts.

Full Access
Question # 89

A customer wants to run a batch processing system on VMs and store the output files in a Cloud Storage bucket. The networking and security teams have decided that no VMs may reach the public internet.

How should this be accomplished?

A.

Create a firewall rule to block internet traffic from the VM.

B.

Provision a NAT Gateway to access the Cloud Storage API endpoint.

C.

Enable Private Google Access on the VPC.

D.

Mount a Cloud Storage bucket as a local filesystem on every VM.
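Private Google Access lets VMs with only internal IPs reach Google APIs such as the Cloud Storage endpoint. A minimal sketch, with placeholder subnet and region names:

```shell
# Enable Private Google Access on the batch subnet so VMs without
# external IPs can still reach the Cloud Storage API
# (batch-subnet and us-central1 are placeholders)
gcloud compute networks subnets update batch-subnet \
    --region=us-central1 \
    --enable-private-ip-google-access
```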

Full Access
Question # 90

You are a security administrator at your company and are responsible for managing access controls (identification, authentication, and authorization) on Google Cloud. Which Google-recommended best practices should you follow when configuring authentication and authorization? (Choose two.)

A.

Use Google default encryption.

B.

Manually add users to Google Cloud.

C.

Provision users with basic roles using Google's Identity and Access Management (1AM) service.

D.

Use SSO/SAML integration with Cloud Identity for user authentication and user lifecycle management.

E.

Provide granular access with predefined roles.

Full Access
Question # 91

You are a member of your company's security team. You have been asked to reduce your Linux bastion host external attack surface by removing all public IP addresses. Site Reliability Engineers (SREs) require access to the bastion host from public locations so they can access the internal VPC while off-site. How should you enable this access?

A.

Implement Cloud VPN for the region where the bastion host lives.

B.

Implement OS Login with 2-step verification for the bastion host.

C.

Implement Identity-Aware Proxy TCP forwarding for the bastion host.

D.

Implement Google Cloud Armor in front of the bastion host.
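IAP TCP forwarding tunnels SSH through Google's edge after authenticating the user, so the bastion needs no public IP; ingress only has to be allowed from the documented IAP source range. A hedged sketch with placeholder network, instance, and zone names:

```shell
# Allow ingress from Google's IAP TCP-forwarding range to the bastion
# (internal-vpc is a placeholder network name)
gcloud compute firewall-rules create allow-iap-ssh \
    --network=internal-vpc \
    --direction=INGRESS \
    --action=ALLOW \
    --rules=tcp:22 \
    --source-ranges=35.235.240.0/20

# SREs then SSH through the IAP tunnel instead of a public IP
# (bastion-host and us-central1-a are placeholders)
gcloud compute ssh bastion-host \
    --zone=us-central1-a \
    --tunnel-through-iap
```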

Full Access
Question # 92

A manager wants to start retaining security event logs for 2 years while minimizing costs. You write a filter to select the appropriate log entries.

Where should you export the logs?

A.

BigQuery datasets

B.

Cloud Storage buckets

C.

Stackdriver Logging

D.

Cloud Pub/Sub topics
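Cloud Storage is the low-cost destination for multi-year log retention. As a minimal sketch (sink name, bucket, and filter are placeholders), the filtered export could be created with:

```shell
# Route filtered security event logs to a Cloud Storage bucket
# (security-events-sink, the bucket name, and the filter are placeholders)
gcloud logging sinks create security-events-sink \
    storage.googleapis.com/security-events-archive \
    --log-filter='logName:"cloudaudit.googleapis.com"'
```

A bucket lifecycle or retention policy can then enforce the two-year retention window.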

Full Access
Question # 93

Your organization has established a highly sensitive project within a VPC Service Controls perimeter. You need to ensure that only users meeting specific contextual requirements—such as having a company-managed device, a specific location, and a valid user identity—can access resources within this perimeter. You want to evaluate the impact of this change without blocking legitimate access. What should you do?

A.

Configure a VPC Service Controls perimeter in dry run mode, and enforce strict network segmentation using firewall rules. Use multi-factor authentication (MFA) for user verification.

B.

Use the VPC Service Controls violation dashboard to identify the impact of access denials by service perimeters.

C.

Use Cloud Audit Logs to monitor user access to the project resources. Use post-incident analysis to identify unauthorized access attempts.

D.

Establish a Context-Aware Access policy that specifies the required contextual attributes, and associate the policy with the VPC Service Controls perimeter in dry run mode.

Full Access
Question # 94

Your financial services company is migrating its operations to Google Cloud. You are implementing a centralized logging strategy to meet strict regulatory compliance requirements. Your company's Google Cloud organization has a dedicated folder for all production projects. All audit logs, including Data Access logs from all current and future projects within this production folder, must be securely collected and stored in a central BigQuery dataset for long-term retention and analysis. To prevent duplicate log storage and to enforce centralized control, you need to implement a logging solution that intercepts and overrides any project-level log sinks for these audit logs, to ensure that logs are not inadvertently routed elsewhere. What should you do?

A.

Create an aggregated log sink at the production folder level with a destination of the central BigQuery dataset. Configure an inclusion filter for all audit and Data Access logs. Grant the Logs Bucket Writer role to the sink's service account on the production folder.

B.

Create a log sink in each production project to route audit logs to the central BigQuery dataset. Set the writer_identity field of each sink to a service account with BigQuery Data Editor permissions on the central dataset.

C.

Create an aggregated log sink at the organization level with a destination of the central BigQuery dataset and a filter for all audit logs. Use the --include-children flag and configure a log view for the production folder.

D.

Create an intercepting aggregated log sink at the production folder level with the central BigQuery dataset as the destination. Configure an inclusion filter for the necessary audit logs. Grant the appropriate IAM permissions to the sink's writer_identity on the BigQuery dataset.
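An intercepting aggregated sink at the folder level routes matching logs centrally and stops them from being processed by sinks lower in the hierarchy, which is what prevents duplicate storage. A hedged sketch (folder ID, destination project, and dataset names are placeholders):

```shell
# Intercepting aggregated sink at the production folder: matching audit
# logs go to the central dataset and are NOT handed to project-level sinks
# (folder ID, sec-logging, and prod_audit are placeholders)
gcloud logging sinks create central-audit-sink \
    bigquery.googleapis.com/projects/sec-logging/datasets/prod_audit \
    --folder=123456789012 \
    --include-children \
    --intercept-children \
    --log-filter='logName:"cloudaudit.googleapis.com"'
```

After creating the sink, grant its writer_identity service account the BigQuery Data Editor role on the destination dataset so the export can write.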

Full Access