AWS Solutions Architect Certification Question Bank 20241112
Amazon Web Services (AWS) complete certification question series, the latest 2024 question bank, continuously updated and the most complete available. AWS certifications carry high value and are essential for self-study and for moving into the cloud industry. Recent versions are updated regularly so you can track the latest trends.
QUESTION 641
A company wants to use an event-driven programming model with AWS Lambda. The company wants to reduce startup latency for Lambda functions that run on Java 11. The company does not have strict latency requirements for the applications. The company wants to reduce cold starts and outlier latencies when a function scales up.
Which solution will meet these requirements MOST cost-effectively?
A. Configure Lambda provisioned concurrency
B. Increase the timeout of the Lambda functions
C. Increase the memory of the Lambda functions
D. Configure Lambda SnapStart.
Correct Answer: D
Section: (none)
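Note: a minimal sketch of how SnapStart (answer D) could be enabled with boto3. The function name is a placeholder; SnapStart applies only to published versions of a function on a supported Java runtime.

```python
import boto3

lambda_client = boto3.client("lambda")

# Enable SnapStart for published versions of the (hypothetical) function.
lambda_client.update_function_configuration(
    FunctionName="orders-processor",
    SnapStart={"ApplyOn": "PublishedVersions"},
)

# Snapshots are taken at publish time, so publish a new version to activate SnapStart.
response = lambda_client.publish_version(FunctionName="orders-processor")
print("Published version:", response["Version"])
```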
QUESTION 642
An ecommerce company uses Amazon Route 53 as its DNS provider. The company hosts its website on premises and in the AWS Cloud. The company's on-premises data center is near the us-west-1 Region. The company uses the eu-central-1 Region to host the website. The company wants to minimize load time for the website as much as possible.
Which solution will meet these requirements?
A. Set up a geolocation routing policy. Send the traffic that is near us-west-1 to the on-premises data center. Send the traffic that is near eu-central-1 to eu-central-1
B. Set up a simple routing policy that routes all traffic that is near eu-central-1 to eu-central-1 and routes all traffic that is near the on-premises data center to the on-premises data center
C. Set up a latency routing policy. Associate the policy with us-west-1
D. Set up a weighted routing policy. Split the traffic evenly between eu-central-1 and the on-premises data center
Correct Answer: A
Section: (none)
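Note: as an illustration of answer A, the sketch below creates two geolocation records with boto3. The hosted zone ID, record values, and continent codes are assumptions; in practice a default record is also recommended for locations that match neither rule.

```python
import boto3

route53 = boto3.client("route53")
HOSTED_ZONE_ID = "Z0123456789EXAMPLE"  # hypothetical hosted zone

route53.change_resource_record_sets(
    HostedZoneId=HOSTED_ZONE_ID,
    ChangeBatch={
        "Changes": [
            {   # North American visitors go to the on-premises data center near us-west-1.
                "Action": "UPSERT",
                "ResourceRecordSet": {
                    "Name": "www.example.com",
                    "Type": "A",
                    "SetIdentifier": "on-prem-us-west",
                    "GeoLocation": {"ContinentCode": "NA"},
                    "TTL": 60,
                    "ResourceRecords": [{"Value": "203.0.113.10"}],
                },
            },
            {   # European visitors go to the site hosted in eu-central-1.
                "Action": "UPSERT",
                "ResourceRecordSet": {
                    "Name": "www.example.com",
                    "Type": "A",
                    "SetIdentifier": "aws-eu-central-1",
                    "GeoLocation": {"ContinentCode": "EU"},
                    "TTL": 60,
                    "ResourceRecords": [{"Value": "198.51.100.20"}],
                },
            },
        ]
    },
)
```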
QUESTION 643
A company will migrate 10 PB of data to Amazon S3 in 6 weeks. The current data center has a 500 Mbps uplink to the internet. Other on-premises applications share the uplink. The company can use 80% of the internet bandwidth for this one-time migration task.
Which solution will meet these requirements?
A. Configure AWS DataSync to migrate the data to Amazon S3 and to automatically verify the data
B. Use rsync to transfer the data directly to Amazon S3
C. Use the AWS CLI and multiple copy processes to send the data directly to Amazon S3
D. Order multiple AWS Snowball devices. Copy the data to the devices. Send the devices to AWS to copy the data to Amazon S3
Correct Answer: D
Section: (none)
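Note: the arithmetic behind answer D. Even with 80% of the 500 Mbps uplink dedicated to the transfer, 10 PB cannot move over the wire in 6 weeks, which is why offline transfer with Snowball devices is the only workable option. A quick back-of-the-envelope check:

```python
# Rough transfer-time estimate (decimal units; real-world throughput would be lower still).
data_bits = 10 * 10**15 * 8            # 10 PB expressed in bits
usable_bps = 500 * 10**6 * 0.80        # 80% of the 500 Mbps uplink

seconds = data_bits / usable_bps
days = seconds / 86400
print(f"~{days:.0f} days")             # roughly 2,300 days, i.e. over 6 years

deadline_days = 6 * 7
print("Fits in 6 weeks?", days <= deadline_days)   # False
```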
QUESTION 644
A company wants to use Amazon FSx for Windows File Server for its Amazon EC2 instances that have an SMB file share mounted as a volume in the us-east-1 Region.
The company has a recovery point objective (RPO) of 5 minutes for planned system maintenance or unplanned service disruptions. The company needs to replicate the file system to the us-west-2 Region. The replicated data must not be deleted by any user for 5 years.
Which solution will meet these requirements?
A. Create an FSx for Windows File Server file system in us-east-1 that has a Single-AZ 2 deployment type. Use AWS Backup to create a daily backup plan that includes a backup rule that copies the backup to us-west-2. Configure AWS Backup Vault Lock in compliance mode for a target vault in us-west-2. Configure a minimum duration of 5 years
B. Create an FSx for Windows File Server file system in us-east-1 that has a Multi-AZ deployment type. Use AWS Backup to create a daily backup plan that includes a backup rule that copies the backup to us-west-2. Configure AWS Backup Vault Lock in governance mode for a target vault in us-west-2. Configure a minimum duration of 5 years
C. Create an FSx for Windows File Server file system in us-east-1 that has a Multi-AZ deployment type. Use AWS Backup to create a daily backup plan that includes a backup rule that copies the backup to us-west-2. Configure AWS Backup Vault Lock in compliance mode for a target vault in us-west-2. Configure a minimum duration of 5 years
D. Create an FSx for Windows File Server file system in us-east-1 that has a Single-AZ 2 deployment type. Use AWS Backup to create a daily backup plan that includes a backup rule that copies the backup to us-west-2. Configure AWS Backup Vault Lock in governance mode for a target vault in us-west-2. Configure a minimum duration of 5 years
Correct Answer: C
Section: (none)
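Note: a sketch of the vault-lock piece of answer C, assuming a backup vault already exists in us-west-2; the vault name is a placeholder. In compliance mode the lock becomes immutable once the cooling-off period (ChangeableForDays) expires, which is what prevents anyone, including administrators, from deleting the copies for 5 years.

```python
import boto3

# Run against the destination Region that holds the copied backups.
backup = boto3.client("backup", region_name="us-west-2")

backup.put_backup_vault_lock_configuration(
    BackupVaultName="fsx-replica-vault",   # hypothetical vault name
    MinRetentionDays=5 * 365,              # nothing can be deleted for at least 5 years
    ChangeableForDays=3,                   # after 3 days the lock is immutable (compliance mode)
)
```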
QUESTION 645
A company hosts multiple applications on AWS for different product lines. The applications use different compute resources, including Amazon EC2 instances and Application Load Balancers. The applications run in different AWS accounts under the same organization in AWS Organizations across multiple AWS Regions. Teams for each product line have tagged each compute resource in the individual accounts.
The company wants more details about the cost for each product line from the consolidated billing feature in Organizations.
Which combination of steps will meet these requirements? (Select TWO.)
A. Select a specific AWS generated tag in the AWS Billing console
B. Select a specific user-defined tag in the AWS Billing console
C. Select a specific user-defined tag in the AWS Resource Groups console
D. Activate the selected tag from each AWS account
E. Activate the selected tag from the Organizations management account
Correct Answer: BE
Section: (none)
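Note: for answer E, cost allocation tags are activated once from the Organizations management account. A hedged sketch using the Cost Explorer API; the tag key is an assumption.

```python
import boto3

# Must be called with credentials for the Organizations management (payer) account.
ce = boto3.client("ce")

ce.update_cost_allocation_tags_status(
    CostAllocationTagsStatus=[
        {"TagKey": "product-line", "Status": "Active"}   # hypothetical user-defined tag key
    ]
)
```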
QUESTION 646
A company runs a website that stores images of historical events. Website users need the ability to search and view images based on the year that the event in the image occurred. On average, users request each image only once or twice a year. The company wants a highly available solution to store and deliver the images to users.
Which solution will meet these requirements MOST cost-effectively?
A. Store images in Amazon Elastic Block Store (Amazon EBS). Use a web server that runs on Amazon EC2
B. Store images in Amazon Elastic File System (Amazon EFS). Use a web server that runs on Amazon EC2
C. Store images in Amazon S3 Standard. Use S3 Standard to directly deliver images by using a static website
D. Store images in Amazon S3 Standard-Infrequent Access (S3 Standard-IA). Use S3 Standard-IA to directly deliver images by using a static website
Correct Answer: C
Section: (none)
QUESTION 647
A company has a large data workload that runs for 6 hours each day. The company cannot lose any data while the process is running. A solutions architect is designing an Amazon EMR cluster configuration to support this critical data workload.
Which solution will meet these requirements MOST cost-effectively?
A. Configure a long-running cluster that runs the primary node and core nodes on On-Demand Instances and the task nodes on Spot Instances
B. Configure a transient cluster that runs the primary node and core nodes on On-Demand Instances and the task nodes on Spot Instances
C. Configure a transient cluster that runs the primary node on an On-Demand Instance and the core nodes and task nodes on Spot Instances
D. Configure a long-running cluster that runs the primary node on an On-Demand Instance, the core nodes on Spot Instances, and the task nodes on Spot Instances
Correct Answer: B
Section: (none)
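Note: a trimmed sketch of answer B with boto3, showing a transient cluster (it terminates when its steps finish) whose primary and core nodes are On-Demand, with Spot reserved for task nodes that hold no HDFS data. Names, counts, and the release label are assumptions.

```python
import boto3

emr = boto3.client("emr")

emr.run_job_flow(
    Name="daily-batch",                         # hypothetical cluster name
    ReleaseLabel="emr-6.15.0",
    Applications=[{"Name": "Spark"}],
    ServiceRole="EMR_DefaultRole",
    JobFlowRole="EMR_EC2_DefaultRole",
    Instances={
        "KeepJobFlowAliveWhenNoSteps": False,   # transient: terminate after the steps complete
        "InstanceGroups": [
            {"InstanceRole": "MASTER", "Market": "ON_DEMAND",
             "InstanceType": "m5.xlarge", "InstanceCount": 1},
            {"InstanceRole": "CORE", "Market": "ON_DEMAND",   # core nodes store HDFS data
             "InstanceType": "m5.xlarge", "InstanceCount": 3},
            {"InstanceRole": "TASK", "Market": "SPOT",        # task nodes are safe to lose
             "InstanceType": "m5.xlarge", "InstanceCount": 4},
        ],
    },
    Steps=[],   # the daily processing steps would be added here
)
```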
QUESTION 648
A company has multiple AWS accounts with applications deployed in the us-west-2 Region. Application logs are stored within Amazon S3 buckets in each account. The company wants to build a centralized log analysis solution that uses a single S3 bucket. Logs must not leave us-west-2, and the company wants to incur minimal operational overhead. Which solution meets these requirements and is MOST cost-effective?
A. Create an S3 Lifecycle policy that copies the objects from one of the application S3 buckets to the centralized S3 bucket
B. Use S3 Same-Region Replication to replicate logs from the S3 buckets to another S3 bucket in us-west-2. Use this S3 bucket for log analysis
C. Write a script that uses the PutObject API operation every day to copy the entire contents of the buckets to another S3 bucket in us-west-2. Use this S3 bucket for log analysis
D. Write AWS Lambda functions in these accounts that are triggered every time logs are delivered to the S3 buckets (s3:ObjectCreated:* event). Copy the logs to another S3 bucket in us-west-2. Use this S3 bucket for log analysis
Correct Answer: B
Section: (none)
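Note: answer B in boto3 form. Each application account adds a Same-Region Replication rule that points at the central bucket. The bucket names, account ID, and IAM role are placeholders, and versioning must already be enabled on both buckets.

```python
import boto3

s3 = boto3.client("s3")

# Run in each application account; requires versioning on both source and destination buckets.
s3.put_bucket_replication(
    Bucket="app-logs-account-a",                       # hypothetical source bucket
    ReplicationConfiguration={
        "Role": "arn:aws:iam::111122223333:role/log-replication-role",
        "Rules": [
            {
                "ID": "replicate-logs-to-central-bucket",
                "Status": "Enabled",
                "Priority": 1,
                "Filter": {},                           # replicate every object
                "DeleteMarkerReplication": {"Status": "Disabled"},
                "Destination": {
                    "Bucket": "arn:aws:s3:::central-log-analysis-us-west-2",
                    # Cross-account destinations also need Account and AccessControlTranslation.
                },
            }
        ],
    },
)
```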
QUESTION 649
A company has a business-critical application that runs on Amazon EC2 instances. The application stores data in an Amazon DynamoDB table. The company must be able to revert the table to any point within the last 24 hours.
Which solution meets these requirements with the LEAST operational overhead?
A. Configure point-in-time recovery for the table
B. Use AWS Backup for the table
C. Use an AWS Lambda function to make an on-demand backup of the table every hour.
D. Turn on streams on the table to capture a log of all changes to the table in the last 24 hours. Store a copy of the stream in an Amazon S3 bucket
Correct Answer: A
Section: (none)
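Note: answer A is a single API call; a minimal sketch with a placeholder table name.

```python
import boto3

dynamodb = boto3.client("dynamodb")

# Point-in-time recovery keeps 35 days of continuous backups; restores can target any second
# in that window, which covers the 24-hour requirement with no custom code.
dynamodb.update_continuous_backups(
    TableName="critical-app-table",   # hypothetical table name
    PointInTimeRecoverySpecification={"PointInTimeRecoveryEnabled": True},
)
```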
QUESTION 650
A company hosts an application used to upload files to an Amazon S3 bucket. Once uploaded, the files are processed to extract metadata, which takes less than 5 seconds. The volume and frequency of the uploads varies from a few files each hour to hundreds of concurrent uploads. The company has asked a solutions architect to design a cost-effective architecture that will meet these requirements.
What should the solutions architect recommend?
A. Configure AWS CloudTrail trails to log S3 API calls. Use AWS AppSync to process the files.
B. Configure an object-created event notification within the S3 bucket to invoke an AWS Lambda function to process the files.
C. Configure Amazon Kinesis Data Streams to process and send data to Amazon S3. Invoke an AWS Lambda function to process the files.
D. Configure an Amazon Simple Notification Service (Amazon SNS) topic to process the files uploaded to Amazon S3. Invoke an AWS Lambda function to process the files.
Correct Answer: B
Section: (none)
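Note: a sketch of answer B, wiring an s3:ObjectCreated:* notification to a Lambda function with boto3. The bucket, function ARN, and the prerequisite invoke permission for S3 are assumptions.

```python
import boto3

s3 = boto3.client("s3")

# The Lambda function's resource policy must already allow s3.amazonaws.com to invoke it
# (for example via lambda add-permission); otherwise this call fails validation.
s3.put_bucket_notification_configuration(
    Bucket="upload-bucket",   # hypothetical bucket
    NotificationConfiguration={
        "LambdaFunctionConfigurations": [
            {
                "Id": "extract-metadata-on-upload",
                "LambdaFunctionArn": "arn:aws:lambda:us-east-1:111122223333:function:extract-metadata",
                "Events": ["s3:ObjectCreated:*"],
            }
        ]
    },
)
```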
QUESTION 651
A company is developing a mobile game that streams score updates to a backend processor and then posts results on a leaderboard. A solutions architect needs to design a solution that can handle large traffic spikes, process the mobile game updates in order of receipt, and store the processed updates in a highly available database. The company also wants to minimize the management overhead required to maintain the solution.
What should the solutions architect do to meet these requirements?
A. Push score updates to Amazon Kinesis Data Streams. Process the updates in Kinesis Data Streams with AWS Lambda. Store the processed updates in Amazon DynamoDB.
B. Push score updates to Amazon Kinesis Data Streams. Process the updates with a fleet of Amazon EC2 instances set up for Auto Scaling. Store the processed updates in Amazon Redshift.
C. Push score updates to an Amazon Simple Notification Service (Amazon SNS) topic. Subscribe an AWS Lambda function to the SNS topic to process the updates. Store the processed updates in a SQL database running on Amazon EC2.
D. Push score updates to an Amazon Simple Queue Service (Amazon SQS) queue. Use a fleet of Amazon EC2 instances with Auto Scaling to process the updates in the SQS queue. Store the processed updates in an Amazon RDS Multi-AZ DB instance.
Correct Answer: A
Section: (none)
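Note: the ordering requirement in answer A comes from Kinesis partition keys: records that share a partition key land on the same shard and are read in order. A small producer sketch, with the stream name and payload shape assumed.

```python
import json
import boto3

kinesis = boto3.client("kinesis")

def publish_score(player_id: str, score: int) -> None:
    """Send one score update; using the player ID as the partition key keeps that player's updates in order."""
    kinesis.put_record(
        StreamName="game-score-updates",            # hypothetical stream name
        Data=json.dumps({"player": player_id, "score": score}).encode("utf-8"),
        PartitionKey=player_id,
    )

publish_score("player-42", 1180)
```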
QUESTION 652
A company maintains about 300 TB in Amazon S3 Standard storage month after month. The S3 objects are each typically around 50 GB in size and are frequently replaced with multipart uploads by their global application. The number and size of S3 objects remain constant, but the company's S3 storage costs are increasing each month.
How should a solutions architect reduce costs in this situation?
A. Switch from multipart uploads to Amazon S3 Transfer Acceleration
B. Enable an S3 Lifecycle policy that deletes incomplete multipart uploads
C. Configure S3 inventory to prevent objects from being archived too quickly
D. Configure Amazon CloudFront to reduce the number of objects stored in Amazon S3
Correct Answer: B
Section: (none)
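Note: answer B as a lifecycle-rule sketch; the bucket name and the 7-day window are assumptions. Parts of failed multipart uploads otherwise keep accruing storage charges even though no complete object ever appears.

```python
import boto3

s3 = boto3.client("s3")

s3.put_bucket_lifecycle_configuration(
    Bucket="global-app-objects",        # hypothetical bucket
    LifecycleConfiguration={
        "Rules": [
            {
                "ID": "abort-stale-multipart-uploads",
                "Status": "Enabled",
                "Filter": {},           # apply to the whole bucket
                "AbortIncompleteMultipartUpload": {"DaysAfterInitiation": 7},
            }
        ]
    },
)
```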
QUESTION 653
A company is deploying an application that processes streaming data in near-real time. The company plans to use Amazon EC2 instances for the workload. The network architecture must be configurable to provide the lowest possible latency between nodes.
Which combination of network solutions will meet these requirements? (Select TWO.)
A. Enable and configure enhanced networking on each EC2 instance.
B. Group the EC2 instances in separate accounts.
C. Run the EC2 instances in a cluster placement group.
D. Attach multiple elastic network interfaces to each EC2 instance.
E. Use Amazon Elastic Block Store (Amazon EBS) optimized instance types.
Correct Answer: AC
Section: (none)
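Note: a sketch of answers C and A together: launch the instances into a cluster placement group and use a current-generation instance type and AMI on which enhanced networking (ENA) is enabled by default. The AMI ID, instance type, and subnet are placeholders.

```python
import boto3

ec2 = boto3.client("ec2")

# Cluster placement groups pack instances close together in one AZ for low node-to-node latency.
ec2.create_placement_group(GroupName="stream-cluster-pg", Strategy="cluster")

ec2.run_instances(
    ImageId="ami-0123456789abcdef0",        # hypothetical ENA-enabled AMI
    InstanceType="c5n.9xlarge",             # network-optimized type with enhanced networking
    MinCount=4,
    MaxCount=4,
    SubnetId="subnet-0123456789abcdef0",    # hypothetical subnet
    Placement={"GroupName": "stream-cluster-pg"},
)
```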
QUESTION 654
A company built an application with Docker containers and needs to run the application in the AWS Cloud. The company wants to use a managed service to host the application. The solution must scale in and out appropriately according to demand on the individual container services. The solution also must not result in additional operational overhead or infrastructure to manage.
Which solutions will meet these requirements? (Select TWO.)
A. Use Amazon Elastic Container Service (Amazon ECS) with AWS Fargate
B. Use Amazon Elastic Kubernetes Service (Amazon EKS) with AWS Fargate
C. Provision an Amazon API Gateway API. Connect the API to AWS Lambda to run the containers
D. Use Amazon Elastic Container Service (Amazon ECS) with Amazon EC2 worker nodes
E. Use Amazon Elastic Kubernetes Service (Amazon EKS) with Amazon EC2 worker nodes.
Correct Answer: AB
Section: (none)
QUESTION 655
A company has an AWS Direct Connect connection from its on-premises location to an AWS account. The AWS account has 30 different VPCs in the same AWS Region. The VPCs use private virtual interfaces (VIFs). Each VPC has a CIDR block that does not overlap with other networks under the company's control. The company wants to centrally manage the networking architecture while still allowing each VPC to communicate with all other VPCs and on-premises networks.
Which solution will meet these requirements with the LEAST amount of operational overhead?
A. Create a transit gateway, and associate the Direct Connect connection with a new transit VIF. Turn on the transit gateway's route propagation feature.
B. Create a Direct Connect gateway. Recreate the private VIFs to use the new gateway. Associate each VPC by creating new virtual private gateways.
C. Create a transit VPC. Connect the Direct Connect connection to the transit VPC. Create a peering connection between all other VPCs in the Region. Update the route tables.
D. Create AWS Site-to-Site VPN connections from on premises to each VPC. Ensure that both VPN tunnels are UP for each connection. Turn on the route propagation feature.
Correct Answer: A
Section: (none)
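Note: a partial sketch of answer A: create the transit gateway with default route propagation and attach each VPC to it. The Direct Connect side (a transit VIF to a Direct Connect gateway that is then associated with the transit gateway) is omitted here; IDs are placeholders.

```python
import boto3

ec2 = boto3.client("ec2")

tgw = ec2.create_transit_gateway(
    Description="hub for 30 VPCs and Direct Connect",
    Options={
        "DefaultRouteTableAssociation": "enable",
        "DefaultRouteTablePropagation": "enable",   # propagate attachment routes automatically
    },
)
tgw_id = tgw["TransitGateway"]["TransitGatewayId"]

# Repeat for each of the 30 VPCs (placeholder IDs shown for one VPC).
ec2.create_transit_gateway_vpc_attachment(
    TransitGatewayId=tgw_id,
    VpcId="vpc-0123456789abcdef0",
    SubnetIds=["subnet-0123456789abcdef0"],
)
```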
QUESTION 656
A solutions architect is designing a shared storage solution for a web application that is deployed across multiple Availability Zones. The web application runs on Amazon EC2 instances that are in an Auto Scaling group. The company plans to make frequent changes to the content. The solution must have strong consistency in returning the new content as soon as the changes occur.
Which solutions meet these requirements? (Select TWO.)
A. Use AWS Storage Gateway Volume Gateway Internet Small Computer Systems Interface (iSCSI) block storage that is mounted to the individual EC2 instances.
B. Create an Amazon Elastic File System (Amazon EFS) file system. Mount the EFS file system on the individual EC2 instances.
C. Create a shared Amazon Elastic Block Store (Amazon EBS) volume. Mount the EBS volume on the individual EC2 instances
D. Use AWS DataSync to perform continuous synchronization of data between EC2 hosts in the Auto Scaling group
E. Create an Amazon S3 bucket to store the web content. Set the metadata for the Cache-Control header to no-cache. Use Amazon CloudFront to deliver the content.
Correct Answer: BE
Section: (none)
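Note: for answer E, strong read-after-write consistency comes from S3 itself; the Cache-Control: no-cache metadata keeps CloudFront and browsers from serving stale copies by forcing revalidation. A sketch of the upload, with the bucket and key as placeholders.

```python
import boto3

s3 = boto3.client("s3")

s3.put_object(
    Bucket="shared-web-content",            # hypothetical bucket behind CloudFront
    Key="pages/home.html",
    Body=b"<html>...latest content...</html>",
    ContentType="text/html",
    CacheControl="no-cache",                # CloudFront revalidates with S3 before serving
)
```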
QUESTION 657
A company has an organization in AWS Organizations that has all features enabled. The company requires that all API calls and logins in any existing or new AWS account must be audited. The company needs a managed solution to prevent additional work and to minimize costs. The company also needs to know when any AWS account is not compliant with the AWS Foundational Security Best Practices (FSBP) standard.
Which solution will meet these requirements with the LEAST operational overhead?
A. Deploy an AWS Control Tower environment in the Organizations management account. Enable AWS Security Hub and AWS Control Tower Account Factory in the environment
B. Deploy an AWS Control Tower environment in a dedicated Organizations member account. Enable AWS Security Hub and AWS Control Tower Account Factory in the environment
C. Use AWS Managed Services (AMS) Accelerate to build a multi-account landing zone (MALZ). Submit an RFC to self-service provision Amazon GuardDuty in the MALZ
D. Use AWS Managed Services (AMS) Accelerate to build a multi-account landing zone (MALZ). Submit an RFC to self-service provision AWS Security Hub in the MALZ
Correct Answer: B
Section: (none)
QUESTION 658
A company that uses AWS needs a solution to predict the resources needed for manufacturing processes each month. The solution must use historical values that are currently stored in an Amazon S3 bucket. The company has no machine learning (ML) experience and wants to use a managed service for the training and predictions.
Which combination of steps will meet these requirements? (Select TWO.)
A. Deploy an Amazon SageMaker model. Create a SageMaker endpoint for inference
B. Use Amazon SageMaker to train a model by using the historical data in the S3 bucket
C. Configure an AWS Lambda function with a function URL that uses Amazon SageMaker endpoints to create predictions based on the inputs
D. Configure an AWS Lambda function with a function URL that uses an Amazon Forecast predictor to create a prediction based on the inputs
E. Train an Amazon Forecast predictor by using the historical data in the S3 bucket
Correct Answer: BE
Section: (none)
QUESTION 659
A company collects and processes data from a vendor. The vendor stores its data in an Amazon RDS for MySQL database in the vendor's own AWS account. The company's VPC does not have an internet gateway, an AWS Direct Connect connection, or an AWS Site-to-Site VPN connection. The company needs to access the data that is in the vendor database.
Which solution will meet this requirement?
A. Instruct the vendor to sign up for the AWS Hosted Connection Direct Connect Program. Use VPC peering to connect the company's VPC and the vendor's VPC
B. Configure a client VPN connection between the company's VPC and the vendor's VPC. Use VPC peering to connect the company's VPC and the vendor's VPC
C. Instruct the vendor to create a Network Load Balancer (NLB). Place the NLB in front of the Amazon RDS for MySQL database. Use AWS PrivateLink to integrate the company's VPC and the vendor's VPC
D. Use AWS Transit Gateway to integrate the company's VPC and the vendor's VPC. Use VPC peering to connect the company's VPC and the vendor's VPC
Correct Answer: A
Section: (none)
QUESTION 660
A company has an organization in AWS Organizations. The company runs Amazon EC2 instances across four AWS accounts in the root organizational unit (OU). There are three nonproduction accounts and one production account. The company wants to prohibit users from launching EC2 instances of a certain size in the nonproduction accounts. The company has created a service control policy (SCP) to deny access to launch instances that use the prohibited types.
Which solutions to deploy the SCP will meet these requirements? (Select TWO.)
A. Attach the SCP to the root OU for the organization
B. Attach the SCP to the three nonproduction Organizations member accounts
C. Attach the SCP to the Organizations management account
D. Create an OU for the production account. Attach the SCP to the OU. Move the production member account into the new OU
E. Create an OU for the required accounts. Attach the SCP to the OU. Move the nonproduction member accounts into the new OU
Correct Answer: AB
Section: (none)
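Note: whichever target is chosen, attaching an SCP is a single Organizations call made from the management account. A sketch with placeholder IDs; TargetId may be a root, an OU ID, or an account ID.

```python
import boto3

org = boto3.client("organizations")

# Attach the deny-instance-types SCP to a target (here a hypothetical nonproduction account ID).
org.attach_policy(
    PolicyId="p-examplepolicyid",     # hypothetical SCP ID
    TargetId="111122223333",          # could also be a root or OU ID
)
```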
QUESTION 661
A company's website hosted on Amazon EC2 instances processes classified data stored in Amazon S3. Due to security concerns, the company requires a private and secure connection between its EC2 resources and Amazon S3.
Which solution meets these requirements?
A. Set up S3 bucket policies to allow access from a VPC endpoint
B. Set up an IAM policy to grant read-write access to the S3 bucket.
C. Set up a NAT gateway to access resources outside the private subnet
D. Set up an access key ID and a secret access key to access the S3 bucket
Correct Answer: A
Section: (none)
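Note: answer A has two halves: a gateway VPC endpoint for S3 so traffic never leaves the AWS network, and a bucket policy that only allows requests arriving through that endpoint. A sketch with placeholder IDs and bucket name.

```python
import json
import boto3

ec2 = boto3.client("ec2")
s3 = boto3.client("s3")

# Gateway endpoint for S3, added to the private subnets' route table.
endpoint = ec2.create_vpc_endpoint(
    VpcId="vpc-0123456789abcdef0",
    ServiceName="com.amazonaws.us-east-1.s3",
    VpcEndpointType="Gateway",
    RouteTableIds=["rtb-0123456789abcdef0"],
)
vpce_id = endpoint["VpcEndpoint"]["VpcEndpointId"]

# Bucket policy that denies any access not coming through the endpoint.
policy = {
    "Version": "2012-10-17",
    "Statement": [{
        "Sid": "AllowOnlyFromVpcEndpoint",
        "Effect": "Deny",
        "Principal": "*",
        "Action": "s3:*",
        "Resource": ["arn:aws:s3:::classified-data-bucket",
                     "arn:aws:s3:::classified-data-bucket/*"],
        "Condition": {"StringNotEquals": {"aws:SourceVpce": vpce_id}},
    }],
}
s3.put_bucket_policy(Bucket="classified-data-bucket", Policy=json.dumps(policy))
```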
QUESTION 662
A company's data platform uses an Amazon Aurora MySQL database. The database has multiple read replicas and multiple DB instances across different Availability Zones. Users have recently reported errors from the database that indicate that there are too many connections. The company wants to reduce the failover time by 20% when a read replica is promoted to primary writer.
Which solution will meet this requirement?
A. Switch from Aurora to Amazon RDS with Multi-AZ cluster deployment
B. Use Amazon RDS Proxy in front of the Aurora database
C. Switch to Amazon DynamoDB with DynamoDB Accelerator (DAX) for read connections
D. Switch to Amazon Redshift with relocation capability
Correct Answer: B
Section: (none)
QUESTION 663
A company uses an AWS Batch job to run its end-of-day sales process. The company needs a serverless solution that will invoke a third-party reporting application when the AWS Batch job is successful. The reporting application has an HTTP API interface that uses username and password authentication.
Which solution will meet these requirements?
A. Configure an Amazon EventBridge rule to match incoming AWS Batch job SUCCEEDED events. Configure the third-party API as an EventBridge API destination with a username and password. Set the API destination as the EventBridge rule target
B. Configure Amazon EventBridge Scheduler to match incoming AWS Batch job SUCCEEDED events. Configure an AWS Lambda function to invoke the third-party API by using a username and password. Set the Lambda function as the EventBridge rule target
C. Configure an AWS Batch job to publish job SUCCEEDED events to an Amazon API Gateway REST API. Configure an HTTP proxy integration on the API Gateway REST API to invoke the third-party API by using a username and password
D. Configure an AWS Batch job to publish job SUCCEEDED events to an Amazon API Gateway REST API. Configure a proxy integration on the API Gateway REST API to an AWS Lambda function. Configure the Lambda function to invoke the third-party API by using a username and password
Correct Answer: D
Section: (none)
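Note: option A describes the EventBridge API-destination pattern; a sketch of what that wiring looks like with boto3, where the endpoint URL, credentials, and target role ARN are placeholders: a connection holding the Basic-auth credentials, an API destination pointing at the reporting endpoint, and a rule that matches AWS Batch SUCCEEDED events.

```python
import json
import boto3

events = boto3.client("events")

conn = events.create_connection(
    Name="reporting-api-basic-auth",
    AuthorizationType="BASIC",
    AuthParameters={"BasicAuthParameters": {"Username": "report-user", "Password": "example-secret"}},
)

dest = events.create_api_destination(
    Name="reporting-api",
    ConnectionArn=conn["ConnectionArn"],
    InvocationEndpoint="https://reports.example.com/batch-complete",   # hypothetical endpoint
    HttpMethod="POST",
)

events.put_rule(
    Name="batch-job-succeeded",
    EventPattern=json.dumps({
        "source": ["aws.batch"],
        "detail-type": ["Batch Job State Change"],
        "detail": {"status": ["SUCCEEDED"]},
    }),
)

events.put_targets(
    Rule="batch-job-succeeded",
    Targets=[{
        "Id": "reporting-api-destination",
        "Arn": dest["ApiDestinationArn"],
        "RoleArn": "arn:aws:iam::111122223333:role/eventbridge-invoke-api-destination",
    }],
)
```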
QUESTION 664
A pharmaceutical company is developing a new drug. The volume of data that the company generates has grown exponentially over the past few months. The company's researchers regularly require a subset of the entire dataset to be immediately available with minimal lag. However, the entire dataset does not need to be accessed on a daily basis. All the data currently resides in on-premises storage arrays, and the company wants to reduce ongoing capital expenses.
Which storage solution should a solutions architect recommend to meet these requirements?
A. Run AWS DataSync as a scheduled cron job to migrate the data to an Amazon S3 bucket on an ongoing basis
B. Deploy an AWS Storage Gateway file gateway with an Amazon S3 bucket as the target storage. Migrate the data to the Storage Gateway appliance
C. Deploy an AWS Storage Gateway volume gateway with cached volumes with an Amazon S3 bucket as the target storage. Migrate the data to the Storage Gateway appliance
D. Configure an AWS Site-to-Site VPN connection from the on-premises environment to AWS. Migrate data to an Amazon Elastic File System (Amazon EFS) file system
Correct Answer: C
Section: (none)
QUESTION 665
An ecommerce company runs its application on AWS. The application uses an Amazon Aurora PostgreSQL cluster in Multi-AZ mode for the underlying database. During a recent promotional campaign, the application experienced heavy read load and write load. Users experienced timeout issues when they attempted to access the application.
A solutions architect needs to make the application architecture more scalable and highly available.
Which solution will meet these requirements with the LEAST downtime?
A. Create an Amazon EventBridge rule that has the Aurora cluster as a source. Create an AWS Lambda function to log the state change events of the Aurora cluster. Add the Lambda function as a target for the EventBridge rule. Add additional reader nodes to fail over to
B. Modify the Aurora cluster and activate the zero-downtime restart (ZDR) feature. Use Database Activity Streams on the cluster to track the cluster status
C. Add additional reader instances to the Aurora cluster. Create an Amazon RDS Proxy target group for the Aurora cluster
D. Create an Amazon ElastiCache for Redis cache. Replicate data from the Aurora cluster to Redis by using AWS Database Migration Service (AWS DMS) with a write-around approach
Correct Answer: C
Section: (none)
QUESTION 666
A company has a mobile game that reads most of its metadata from an Amazon RDS DB instance. As the game increased in popularity, developers noticed slowdowns related to the game's metadata load times. Performance metrics indicate that simply scaling the database will not help. A solutions architect must explore all options that include capabilities for snapshots, replication, and sub-millisecond response times.
What should the solutions architect recommend to solve these issues?
A. Migrate the database to Amazon Aurora with Aurora Replicas
B. Migrate the database to Amazon DynamoDB with global tables
C. Add an Amazon ElastiCache for Redis layer in front of the database
D. Add an Amazon ElastiCache for Memcached layer in front of the database
Correct Answer: C
Section: (none)
QUESTION 667
A company maintains its accounting records in a custom application that runs on Amazon EC2 instances. The company needs to migrate the data to an AWS managed service for development and maintenance of the application data. The solution must require minimal operational support and provide immutable, cryptographically verifiable logs of data changes.
Which solution will meet these requirements MOST cost-effectively?
A. Copy the records from the application into an Amazon Redshift cluster
B. Copy the records from the application into an Amazon Neptune cluster
C. Copy the records from the application into an Amazon Timestream database
D. Copy the records from the application into an Amazon Quantum Ledger Database (Amazon QLDB) ledger
Correct Answer: D
Section: (none)
QUESTION 668
A company is building an application on AWS that connects to an Amazon RDS database. The company wants to manage the application configuration and to securely store and retrieve credentials for the database and other services.
Which solution will meet these requirements with the LEAST administrative overhead?
A. Use AWS AppConfig to store and manage the application configuration. Use AWS Secrets Manager to store and retrieve the credentials
B. Use AWS Lambda to store and manage the application configuration. Use AWS Systems Manager Parameter Store to store and retrieve the credentials
C. Use an encrypted application configuration file. Store the file in Amazon S3 for the application configuration. Create another S3 file to store and retrieve the credentials
D. Use AWS AppConfig to store and manage the application configuration. Use Amazon RDS to store and retrieve the credentials
Correct Answer: A
Section: (none)
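Note: the credential half of answer A is one Secrets Manager call at runtime; a sketch that assumes the RDS secret is stored as the usual JSON blob with username and password keys.

```python
import json
import boto3

secrets = boto3.client("secretsmanager")

# Fetch the database credentials at startup (or cache them with a short TTL).
secret = secrets.get_secret_value(SecretId="prod/app/rds-credentials")   # hypothetical secret name
creds = json.loads(secret["SecretString"])

db_user = creds["username"]
db_password = creds["password"]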
QUESTION 669
A company wants to standardize its Amazon Elastic Block Store (Amazon EBS) volume encryption strategy. The company also wants to minimize the cost and configuration effort required to operate the volume encryption check.
Which solution will meet these requirements?
A. Write API calls to describe the EBS volumes and to confirm the EBS volumes are encrypted. Use Amazon EventBridge to schedule an AWS Lambda function to run the API calls
B. Write API calls to describe the EBS volumes and to confirm the EBS volumes are encrypted. Run the API calls on an AWS Fargate task
C. Create an AWS Identity and Access Management (IAM) policy that requires the use of tags on EBS volumes. Use AWS Cost Explorer to display resources that are not properly tagged. Encrypt the untagged resources manually
D. Create an AWS Config rule for Amazon EBS to evaluate if a volume is encrypted and to flag the volume if it is not encrypted
Correct Answer: D
Section: (none)
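Note: answer D maps to the AWS Config managed rule ENCRYPTED_VOLUMES. A sketch of enabling it; a configuration recorder must already be running in the account and Region.

```python
import boto3

config = boto3.client("config")

config.put_config_rule(
    ConfigRule={
        "ConfigRuleName": "ebs-volumes-encrypted",
        "Description": "Flags EBS volumes that are not encrypted.",
        "Scope": {"ComplianceResourceTypes": ["AWS::EC2::Volume"]},
        "Source": {"Owner": "AWS", "SourceIdentifier": "ENCRYPTED_VOLUMES"},
    }
)
```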
QUESTION 670
A company's marketing data is uploaded from multiple sources to an Amazon S3 bucket. A series of data preparation jobs aggregate the data for reporting. The data preparation jobs need to run at regular intervals in parallel. A few jobs need to run in a specific order later.
The company wants to remove the operational overhead of job error handling, retry logic, and state management.
Which solution will meet these requirements?
A. Use an AWS Lambda function to process the data as soon as the data is uploaded to the S3 bucket. Invoke other Lambda functions at regularly scheduled intervals
B. Use Amazon Athena to process the data. Use Amazon EventBridge Scheduler to invoke Athena on a regular interval
C. Use AWS Glue DataBrew to process the data. Use an AWS Step Functions state machine to run the DataBrew data preparation jobs
D. Use AWS Data Pipeline to process the data. Schedule Data Pipeline to process the data once at midnight
Correct Answer: C
Section: (none)
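Note: a hedged sketch of the state machine in answer C: a Parallel state fans out the regular DataBrew jobs, then the order-dependent job runs afterward. It assumes the Step Functions integration for DataBrew (databrew:startJobRun.sync) and hypothetical job names; Step Functions supplies the retry, error handling, and state management the company wants to offload.

```python
import json
import boto3

sfn = boto3.client("stepfunctions")

definition = {
    "StartAt": "PrepareInParallel",
    "States": {
        "PrepareInParallel": {
            "Type": "Parallel",
            "Branches": [
                {"StartAt": "CleanJob", "States": {"CleanJob": {
                    "Type": "Task",
                    "Resource": "arn:aws:states:::databrew:startJobRun.sync",
                    "Parameters": {"Name": "clean-marketing-data"},   # hypothetical DataBrew job
                    "End": True}}},
                {"StartAt": "EnrichJob", "States": {"EnrichJob": {
                    "Type": "Task",
                    "Resource": "arn:aws:states:::databrew:startJobRun.sync",
                    "Parameters": {"Name": "enrich-marketing-data"},  # hypothetical DataBrew job
                    "End": True}}},
            ],
            "Next": "AggregateForReporting",
        },
        "AggregateForReporting": {
            "Type": "Task",
            "Resource": "arn:aws:states:::databrew:startJobRun.sync",
            "Parameters": {"Name": "aggregate-report-data"},          # runs only after the parallel jobs
            "End": True,
        },
    },
}

sfn.create_state_machine(
    name="marketing-data-prep",
    definition=json.dumps(definition),
    roleArn="arn:aws:iam::111122223333:role/stepfunctions-databrew-role",   # placeholder role
)
```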
QUESTION 671
A company is designing a web application on AWS. The application will use a VPN connection between the company's existing data centers and the company's VPCs. The company uses Amazon Route 53 as its DNS service. The application must use private DNS records to communicate with the on-premises services from a VPC.
Which solution will meet these requirements in the MOST secure manner?
A. Create a Route 53 Resolver outbound endpoint. Create a resolver rule. Associate the resolver rule with the VPC
B. Create a Route 53 Resolver inbound endpoint. Create a resolver rule. Associate the resolver rule with the VPC
C. Create a Route 53 private hosted zone. Associate the private hosted zone with the VPC
D. Create a Route 53 public hosted zone. Create a record for each service to allow service communication
Correct Answer: A
Section: (none)
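Note: a sketch of answer A: an outbound Resolver endpoint in the VPC, a forwarding rule for the on-premises domain that targets the data-center DNS servers over the VPN, and the rule association. The domain, IPs, subnets, and security group are placeholders.

```python
import boto3

r53resolver = boto3.client("route53resolver")

endpoint = r53resolver.create_resolver_endpoint(
    CreatorRequestId="outbound-endpoint-2024",
    Name="to-on-premises",
    Direction="OUTBOUND",
    SecurityGroupIds=["sg-0123456789abcdef0"],
    IpAddresses=[{"SubnetId": "subnet-0123456789abcdef0"},
                 {"SubnetId": "subnet-0fedcba9876543210"}],
)

rule = r53resolver.create_resolver_rule(
    CreatorRequestId="corp-forward-rule-2024",
    Name="forward-corp-example-com",
    RuleType="FORWARD",
    DomainName="corp.example.com",                       # hypothetical on-premises domain
    ResolverEndpointId=endpoint["ResolverEndpoint"]["Id"],
    TargetIps=[{"Ip": "10.10.0.2", "Port": 53}],         # on-premises DNS servers reachable over the VPN
)

r53resolver.associate_resolver_rule(
    ResolverRuleId=rule["ResolverRule"]["Id"],
    VPCId="vpc-0123456789abcdef0",
)
```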
QUESTION 672
An ecommerce company runs applications in AWS accounts that are part of an organization in AWS Organizations. The applications run on Amazon Aurora PostgreSQL databases across all the accounts. The company needs to prevent malicious activity and must identify abnormal failed and incomplete login attempts to the databases.
Which solution will meet these requirements in the MOST operationally efficient way?
A. Attach service control policies (SCPs) to the root of the organization to identify the failed login attempts
B. Enable the Amazon RDS Protection feature in Amazon GuardDuty for the member accounts of the organization
C. Publish the Aurora general logs to a log group in Amazon CloudWatch Logs. Export the log data to a central Amazon S3 bucket
D. Publish all the Aurora PostgreSQL database events in AWS CloudTrail to a central Amazon S3 bucket
Correct Answer: C
Section: (none)
QUESTION 673
A company needs to extract the names of ingredients from recipe records that are stored as text files in an Amazon S3 bucket. A web application will use the ingredient names to query an Amazon DynamoDB table and determine a nutrition score. The application can handle non-food records and errors. The company does not have any employees who have machine learning knowledge to develop this solution.
Which solution will meet these requirements MOST cost-effectively?
A. Use S3 Event Notifications to invoke an AWS Lambda function when PutObject requests occur. Program the Lambda function to analyze the object and extract the ingredient names by using Amazon Comprehend. Store the Amazon Comprehend output in the DynamoDB table
B. Use an Amazon EventBridge rule to invoke an AWS Lambda function when PutObject requests occur. Program the Lambda function to analyze the object by using Amazon Forecast to extract the ingredient names. Store the Forecast output in the DynamoDB table
C. Use S3 Event Notifications to invoke an AWS Lambda function when PutObject requests occur. Use Amazon Polly to create audio recordings of the recipe records. Save the audio files in the S3 bucket. Use Amazon Simple Notification Service (Amazon SNS) to send a URL as a message to employees. Instruct the employees to listen to the audio files and calculate the nutrition score. Store the ingredient names in the DynamoDB table
D. Use an Amazon EventBridge rule to invoke an AWS Lambda function when a PutObject request occurs. Program the Lambda function to analyze the object and extract the ingredient names by using Amazon SageMaker. Store the inference output from the SageMaker endpoint in the DynamoDB table.
Correct Answer: A
Section: (none)
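Note: the extraction step in answer A can be as small as one Comprehend call inside the Lambda handler. A sketch with a hypothetical bucket/key event, a hypothetical table name, and a naive filter on the detected entities (real code would map entity types to ingredients more carefully).

```python
import boto3

s3 = boto3.client("s3")
comprehend = boto3.client("comprehend")
table = boto3.resource("dynamodb").Table("recipe-ingredients")   # hypothetical table

def handler(event, context):
    # The S3 event notification delivers the bucket and key of the uploaded recipe file.
    record = event["Records"][0]["s3"]
    body = s3.get_object(Bucket=record["bucket"]["name"], Key=record["object"]["key"])["Body"].read()

    entities = comprehend.detect_entities(Text=body.decode("utf-8")[:4500], LanguageCode="en")
    ingredients = [e["Text"] for e in entities["Entities"]]      # naive: keep every detected entity

    table.put_item(Item={"recipe_id": record["object"]["key"], "ingredients": ingredients})
```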
QUESTION 674
A company needs a solution to prevent AWS CloudFormation stacks from deploying AWS Identity and Access Management (IAM) resources that include an inline policy or "*" in the statement. The solution must also prohibit deployment of Amazon EC2 instances with public IP addresses. The company has AWS Control Tower enabled in its organization in AWS Organizations.
Which solution will meet these requirements?
A. Use AWS Control Tower proactive controls to block deployment of EC2 instances with public IP addresses and inline policies with elevated access or "*"
B. Use AWS Control Tower detective controls to block deployment of EC2 instances with public IP addresses and inline policies with elevated access or "*"
C. Use AWS Config to create rules for EC2 and IAM compliance. Configure the rules to run an AWS Systems Manager Session Manager automation to delete a resource when it is not compliant
D. Use a service control policy (SCP) to block actions for the EC2 instances and IAM resources if the actions lead to noncompliance
Correct Answer: A
Section: (none)
QUESTION 675
An online gaming company hosts its platform on Amazon EC2 instances behind Network Load Balancers (NLBs) across multiple AWS Regions. The NLBs can route requests to targets over the internet. The company wants to improve the customer playing experience by reducing end-to-end load time for its global customer base.
Which solution will meet these requirements?
A. Create Application Load Balancers (ALBs) in each Region to replace the existing NLBs. Register the existing EC2 instances as targets for the ALBs in each Region
B. Configure Amazon Route 53 to route equally weighted traffic to the NLBs in each Region
C. Create additional NLBs and EC2 instances in other Regions where the company has large customer bases
D. Create a standard accelerator in AWS Global Accelerator. Configure the existing NLBs as target endpoints
Correct Answer: D
Section: (none)
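Note: a sketch of answer D: a standard accelerator with a TCP listener and one endpoint group per Region that fronts the existing NLB. ARNs and ports are placeholders; Global Accelerator's anycast IPs move user traffic onto the AWS backbone close to the player, which is what cuts end-to-end load time.

```python
import boto3

# Global Accelerator's control plane lives in us-west-2.
ga = boto3.client("globalaccelerator", region_name="us-west-2")

accelerator = ga.create_accelerator(Name="game-platform", IpAddressType="IPV4", Enabled=True)

listener = ga.create_listener(
    AcceleratorArn=accelerator["Accelerator"]["AcceleratorArn"],
    Protocol="TCP",
    PortRanges=[{"FromPort": 443, "ToPort": 443}],
)

# One endpoint group per Region, each pointing at that Region's existing NLB (placeholder ARN).
ga.create_endpoint_group(
    ListenerArn=listener["Listener"]["ListenerArn"],
    EndpointGroupRegion="eu-central-1",
    EndpointConfigurations=[{
        "EndpointId": "arn:aws:elasticloadbalancing:eu-central-1:111122223333:loadbalancer/net/game-nlb/0123456789abcdef",
        "Weight": 128,
    }],
)
```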
QUESTION 676
A solutions architect needs to design the architecture for an application that a vendor provides as a Docker container image. The container needs 50 GB of storage available for temporary files. The infrastructure must be serverless.
Which solution meets these requirements with the LEAST operational overhead?
A. Create an AWS Lambda function that uses the Docker container image with an Amazon S3 mounted volume that has more than 50 GB of space
B. Create an AWS Lambda function that uses the Docker container image with an Amazon Elastic Block Store (Amazon EBS) volume that has more than 50 GB of space
C. Create an Amazon Elastic Container Service (Amazon ECS) cluster that uses the AWS Fargate launch type. Create a task definition for the container image with an Amazon Elastic File System (Amazon EFS) volume. Create a service with that task definition
D. Create an Amazon Elastic Container Service (Amazon ECS) cluster that uses the Amazon EC2 launch type with an Amazon Elastic Block Store (Amazon EBS) volume that has more than 50 GB of space. Create a task definition for the container image. Create a service with that task definition
Correct Answer: C
Section: (none)
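Note: the heart of answer C is the task definition that mounts an EFS file system into the Fargate task. A trimmed sketch with placeholder names, image, roles, and file system ID; EFS has no fixed size limit, so the 50 GB of temporary space is covered.

```python
import boto3

ecs = boto3.client("ecs")

ecs.register_task_definition(
    family="vendor-app",
    requiresCompatibilities=["FARGATE"],
    networkMode="awsvpc",
    cpu="1024",
    memory="2048",
    executionRoleArn="arn:aws:iam::111122223333:role/ecsTaskExecutionRole",   # placeholder role
    containerDefinitions=[{
        "name": "vendor-container",
        "image": "111122223333.dkr.ecr.us-east-1.amazonaws.com/vendor-app:latest",  # placeholder image
        "essential": True,
        "mountPoints": [{"sourceVolume": "scratch", "containerPath": "/tmp/scratch"}],
    }],
    volumes=[{
        "name": "scratch",
        "efsVolumeConfiguration": {
            "fileSystemId": "fs-0123456789abcdef0",     # hypothetical EFS file system
            "transitEncryption": "ENABLED",
        },
    }],
)
```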
QUESTION 677
A company's applications use Apache Hadoop and Apache Spark to process data on premises. The existing infrastructure is not scalable and is complex to manage.
A solutions architect must design a scalable solution that reduces operational complexity. The solution must keep the data processing on premises.
Which solution will meet these requirements?
A. Use AWS Site-to-Site VPN to access the on-premises Hadoop Distributed File System (HDFS) data and application. Use an Amazon EMR cluster to process the data
B. Use AWS DataSync to connect to the on-premises Hadoop Distributed File System (HDFS) cluster. Create an Amazon EMR cluster to process the data
C. Migrate the Apache Hadoop application and the Apache Spark application to Amazon EMR clusters on AWS Outposts. Use the EMR clusters to process the data
D. Use an AWS Snowball device to migrate the data to an Amazon S3 bucket. Create an Amazon EMR cluster to process the data
Correct Answer: C
Section: (none)
QUESTION 678
A social media company is creating a rewards program website for its users. The company gives users points when users create and upload videos to the website. Users redeem their points for gifts or discounts from the company's affiliated partners. A unique ID identifies users. The partners refer to this ID to verify user eligibility for rewards.
The partners want to receive notification of user IDs through an HTTP endpoint when the company gives users points. Hundreds of vendors are interested in becoming affiliated partners every day. The company wants to design an architecture that gives the website the ability to add partners rapidly in a scalable way.
Which solution will meet these requirements with the LEAST implementation effort?
A. Create an Amazon Timestream database to keep a list of affiliated partners. Implement an AWS Lambda function to readthe list. Configure the Lambda function to send user IDs to each partner when the company gives users points
B. Create an Amazon Simple Notification Service (Amazon SNS) topic. Choose an endpoint protocol. Subscribe the partners to the topic. Publish user IDs to the topic when the company gives users points
C. Create an AWS Step Functions state machine. Create a task for every affiliated partner. Invoke the state machine with user IDs as input when the company gives users points
D. Create a data stream in Amazon Kinesis Data Streams. Implement producer and consumer applications. Store a list of affiliated partners in the data stream. Send user IDs when the company gives users points
Correct Answer: B
Section: (none)
QUESTION 679
A company is designing a tightly coupled high performance computing (HPC) environment in the AWS Cloud. The company needs to include features that will optimize the HPC environment for networking and storage.
Which combination of solutions will meet these requirements? (Select TWO.)
A. Create an accelerator in AWS Global Accelerator. Configure custom routing for the accelerator
B. Create an Amazon FSx for Lustre file system. Configure the file system with scratch storage
C. Create an Amazon CloudFront distribution. Configure the viewer protocol policy to be HTTP and HTTPS
D. Launch Amazon EC2 instances. Attach an Elastic Fabric Adapter (EFA) to the instances
E. Create an AWS Elastic Beanstalk deployment to manage the environment
Correct Answer: BD
Section: (none)
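Note: a sketch of the storage-side optimization in option B (FSx for Lustre with a scratch deployment type); the subnet and capacity are placeholders. The networking-side optimization in option D is an Elastic Fabric Adapter attached at instance launch, omitted here.

```python
import boto3

fsx = boto3.client("fsx")

# Scratch deployment types target short-lived, high-throughput HPC working storage.
fsx.create_file_system(
    FileSystemType="LUSTRE",
    StorageCapacity=1200,                       # GiB; smallest scratch increment
    SubnetIds=["subnet-0123456789abcdef0"],     # hypothetical subnet in the HPC cluster's AZ
    LustreConfiguration={"DeploymentType": "SCRATCH_2"},
)
```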
QUESTION 680
A company uses high concurrency AWS Lambda functions to process a constantly increasing number of messages in a message queue during marketing events. The Lambda functions use CPU intensive code to process the messages. The company wants to reduce the compute costs and to maintain service latency for its customers.
Which solution will meet these requirements?
A. Configure reserved concurrency for the Lambda functions. Decrease the memory allocated to the Lambda functions
B. Configure reserved concurrency for the Lambda functions. Increase the memory according to AWS Compute Optimizer recommendations
C. Configure provisioned concurrency for the Lambda functions. Decrease the memory allocated to the Lambda functions
D. Configure provisioned concurrency for the Lambda functions. Increase the memory according to AWS Compute Optimizer recommendations
Correct Answer: B
Section: (none)
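Note: the two levers in answer B are one API call each. A sketch with placeholder values, where the memory size would come from an AWS Compute Optimizer recommendation; more memory also means proportionally more CPU, which shortens each CPU-bound invocation and can lower total cost.

```python
import boto3

lambda_client = boto3.client("lambda")

# Cap concurrency so a traffic spike cannot scale costs (or downstream load) without bound.
lambda_client.put_function_concurrency(
    FunctionName="message-processor",          # hypothetical function
    ReservedConcurrentExecutions=200,
)

# Apply the memory setting recommended by Compute Optimizer (the value here is a placeholder).
lambda_client.update_function_configuration(
    FunctionName="message-processor",
    MemorySize=1769,                           # roughly one full vCPU at this memory size
)
```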