AWS Solutions Architect Certification Question Bank (Complete Collection), 20240926
A full-series collection of Amazon Web Services (AWS) exam questions, with the latest 2024 question bank, continuously updated and among the most complete available online. AWS certifications carry significant value for self-study and for moving into the cloud industry; follow this bank to keep up with recent version updates and the latest trends.
QUESTION 121
A global company is using Amazon API Gateway to design REST APIs for its loyalty club users in the us-east-1 Region and the ap-southeast-2 Region. A solutions architect must design a solution to protect these API Gateway managed REST APIs across multiple accounts from SQL injection and cross-site scripting attacks.
Which solution will meet these requirements with the LEAST amount of administrative effort?
A. Set up AWS WAF in both Regions. Associate Regional web ACLs with an API stage
B. Set up AWS Firewall Manager in both Regions. Centrally configure AWS WAF rules.
C. Set up AWS Shield in both Regions. Associate Regional web ACLs with an API stage.
D. Set up AWS Shield in one of the Regions. Associate Regional web ACLs with an API stage
Correct Answer: B
Section: (none)
QUESTION 122
A company delivers files in Amazon S3 to certain users who do not have AWS credentials. These users must be given access for a limited time.
What should a solutions architect do to securely meet these requirements?
A. Configure public access on an Amazon S3 bucket.
B. Generate a presigned URL to share with the users
C. Encrypt files using AWS KMS and provide keys to the users
D. Create and assign IAM roles that will grant GetObject permissions to the users
Correct Answer: B
Section: (none)
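For reference, a minimal boto3 sketch of option B, generating a time-limited presigned URL; the bucket name and object key below are hypothetical placeholders.

```python
import boto3

s3 = boto3.client("s3")

# The presigned URL embeds temporary credentials, so recipients need no
# AWS account; access expires automatically after ExpiresIn seconds.
url = s3.generate_presigned_url(
    ClientMethod="get_object",
    Params={"Bucket": "example-delivery-bucket", "Key": "reports/summary.pdf"},
    ExpiresIn=3600,  # valid for one hour
)
print(url)  # share this URL with the external users
```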
QUESTION 123
A company is building an interactive knowledge base system to help its call center staff work more efficiently. The system will be deployed on AWS and will be used 24 hours a day, 7 days a week. The company has more than 10 years of customer support transcripts stored in plaintext files. The company needs a database solution in which the company can bulk load existing transcripts and can individually load new transcripts as they are captured. The solution must provide search functionality to identify transcripts in which keywords and phrases occur.
Which solution meets these requirements MOST cost-effectively?
A. Amazon Athena
B. Amazon RDS
C. Amazon OpenSearch Service (Amazon Elasticsearch Service)
D. Amazon DynamoDB
Correct Answer: C
Section: (none)
QUESTION 124
A company recently experienced a DDoS attack on its web application. The application is hosted on Amazon EC2 instances and is architected for high availability with Elastic Load Balancers and Auto Scaling groups. The DDoS attack lasted for an extended period of time and resulted in additional cost to the company. The company wants to acquire financial protection from future DDoS attacks. Which solution will provide this protection MOST cost-effectively?
A. AWS WAF
B. AWS Shield Standard
C. AWS Shield Advanced
D. Amazon Detective
Correct Answer: C
Section: (none)
QUESTION 125
A gaming company hosts a browser-based application on AWS. The users of the application consume a large number of videos and images that are stored in Amazon S3. This content is the same for all users. The application has increased in popularity, and millions of users worldwide are accessing these media files. The company wants to provide the files to the users while reducing the load on the origin. Which solution meets these requirements MOST cost-effectively?
A. Deploy an AWS Global Accelerator accelerator in front of the web servers
B. Deploy an Amazon CloudFront web distribution in front of the S3 bucket
C. Deploy an Amazon ElastiCache for Redis instance in front of the web servers
D. Deploy an Amazon ElastiCache for Memcached instance in front of the web servers
Correct Answer: B
Section: (none)
QUESTION 126
A company runs its two-tier ecommerce website on AWS. The web tier consists of a load balancer that sends traffic to Amazon EC2 instances. The database tier uses an Amazon RDS DB instance. The EC2 instances and the RDS DB instance should not be exposed to the public internet. The EC2 instances require internet access to complete payment processing of orders through a third-party web service. The application must be highly available.
Which combination of configuration options will meet these requirements? (Select TWO.)
A. Use an Auto Scaling group to launch the EC2 instances in private subnets. Deploy an RDS Multi-AZ DB instance in private subnets
B. Configure a VPC with two private subnets and two NAT gateways across two Availability Zones. Deploy an Application Load Balancer in the private subnets
C. Use an Auto Scaling group to launch the EC2 instances in public subnets across two Availability Zones. Deploy an RDS Multi-AZ DB instance in private subnets
D. Configure a VPC with one public subnet, one private subnet, and two NAT gateways across two Availability Zones. Deploy an Application Load Balancer in the public subnet.
E. Configure a VPC with two public subnets, two private subnets, and two NAT gateways across two Availability Zones. Deploy an Application Load Balancer in the public subnets
Correct Answer: AE
Section: (none)
QUESTION 127
A company is serving an application through Amazon CloudFront. The company wants to protect the application from common SQL injection and cross-site scripting attacks. The company also wants the ability to block IP addresses and apply rate limiting.
Which AWS service meets these requirements?
A. Amazon GuardDuty
B. AWS Shield
C. Amazon Inspector
D. AWS WAF
Correct Answer: D
Section: (none)
QUESTION 128
A solutions architect is designing a new hybrid architecture to extend a company's on-premises infrastructure to AWS. The company requires a highly available connection with consistent low latency to an AWS Region. The company needs to minimize costs and is willing to accept slower traffic if the primary connection fails.
What should the solutions architect do to meet these requirements?
A. Provision an AWS Direct Connect connection to a Region. Provision a VPN connection as a backup if the primary Direct Connect connection fails
B. Provision a VPN tunnel connection to a Region for private connectivity. Provision a second VPN tunnel for private connectivity and as a backup if the primary VPN connection fails
C. Provision an AWS Direct Connect connection to a Region. Provision a second Direct Connect connection to the same Region as a backup if the primary Direct Connect connection fails
D. Provision an AWS Direct Connect connection to a Region. Use the Direct Connect failover attribute from the AWS CLI to automatically create a backup connection if the primary Direct Connect connection fails
Correct Answer: A
Section: (none)
QUESTION 129
A company wants to store large amounts of data in Amazon S3 buckets. Numerous applications access the data, and the access pattern of the data is irregular. A solutions architect must recommend a cost-effective storage solution that does not affect performance or require operational overhead.
Which solution meets these requirements?
A. AWS Trusted Advisor
B. S3 Analytics
C. S3 Intelligent-Tiering
D. Cost Explorer
Correct Answer: C
Section: (none)
QUESTION 130
A company needs to store its accounting records in Amazon S3. The records must be immediately accessible for 1 year and then must be archived for an additional 9 years. No one at the company, including administrative users and root users, can be able to delete the records during the entire 10-year period. The records must be stored with maximum resiliency.
Which solution will meet these requirements?
A. Store the records in S3 Glacier for the entire 10-year period. Use an access control policy to deny deletion of the records for a period of 10 years
B. Store the records by using S3 Intelligent-Tiering. Use an IAM policy to deny deletion of the records. After 10 years, change the IAM policy to allow deletion
C. Use an S3 Lifecycle policy to transition the records from S3 Standard to S3 Glacier Deep Archive after 1 year. Use S3 Object Lock in compliance mode for a period of 10 years
D. Use an S3 Lifecycle policy to transition the records from S3 Standard to S3 One Zone-Infrequent Access (S3 One Zone-IA) after 1 year. Use S3 Object Lock in governance mode for a period of 10 years
Correct Answer: C
Section: (none)
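A rough boto3 sketch of option C, assuming a hypothetical bucket created in us-east-1: Object Lock must be enabled at bucket creation, the default retention uses compliance mode for 10 years, and a lifecycle rule transitions records to S3 Glacier Deep Archive after 1 year.

```python
import boto3

s3 = boto3.client("s3")
bucket = "example-accounting-records"  # placeholder bucket name

# Object Lock can only be turned on when the bucket is created.
s3.create_bucket(Bucket=bucket, ObjectLockEnabledForBucket=True)

# Compliance mode: no one, including the root user, can delete or
# overwrite locked object versions during the retention period.
s3.put_object_lock_configuration(
    Bucket=bucket,
    ObjectLockConfiguration={
        "ObjectLockEnabled": "Enabled",
        "Rule": {"DefaultRetention": {"Mode": "COMPLIANCE", "Years": 10}},
    },
)

# Lifecycle rule: move records to S3 Glacier Deep Archive after 1 year.
s3.put_bucket_lifecycle_configuration(
    Bucket=bucket,
    LifecycleConfiguration={
        "Rules": [
            {
                "ID": "archive-after-1-year",
                "Status": "Enabled",
                "Filter": {"Prefix": ""},
                "Transitions": [{"Days": 365, "StorageClass": "DEEP_ARCHIVE"}],
            }
        ]
    },
)
```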
QUESTION 131
A company is designing an application where users upload small files into Amazon S3. After a user uploads a file, the file requires one-time simple processing to transform the data and save the data in JSON format for later analysis.
Each file must be processed as quickly as possible after it is uploaded. Demand will vary. On some days, users will upload a high number of files. On other days, users will upload a few files or no files. Which solution meets these requirements with the LEAST operational overhead?
A. Configure Amazon EMR to read text files from Amazon S3. Run processing scripts to transform the data. Store the resulting JSON file in an Amazon Aurora DB cluster.
B. Configure Amazon S3 to send an event notification to an Amazon Simple Queue Service (Amazon SQS) queue. Use Amazon EC2 instances to read from the queue and process the data. Store the resulting JSON file in Amazon DynamoDB.
C. Configure Amazon S3 to send an event notification to an Amazon Simple Queue Service (Amazon SQS) queue. Use an AWS Lambda function to read from the queue and process the data. Store the resulting JSON file in Amazon DynamoDB.
D. Configure Amazon EventBridge (Amazon CloudWatch Events) to send an event to Amazon Kinesis Data Streams when a new file is uploaded. Use an AWS Lambda function to consume the event from the stream and process the data. Store the resulting JSON file in an Amazon Aurora DB cluster.
Correct Answer: C
Section: (none)
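A minimal sketch of the Lambda function in option C, assuming the function is subscribed to the SQS queue that receives the S3 event notifications; the DynamoDB table name and the transformation step are placeholders.

```python
import json
import urllib.parse
import boto3

s3 = boto3.client("s3")
table = boto3.resource("dynamodb").Table("ProcessedFiles")  # hypothetical table


def handler(event, context):
    # Each SQS record body carries the S3 event notification JSON.
    for record in event["Records"]:
        s3_event = json.loads(record["body"])
        for s3_record in s3_event.get("Records", []):
            bucket = s3_record["s3"]["bucket"]["name"]
            key = urllib.parse.unquote_plus(s3_record["s3"]["object"]["key"])
            raw = s3.get_object(Bucket=bucket, Key=key)["Body"].read().decode("utf-8")
            item = {"source_key": key, "payload": raw}  # stand-in for the real transform
            table.put_item(Item=item)
```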
QUESTION 132
A company hosts a marketing website in an on-premises data center. The website consists of static documents and runs on a single server. An administrator updates the website content infrequently and uses an SFTP client to upload new documents.
The company decides to host its website on AWS and to use Amazon CloudFront. The company's solutions architect creates a CloudFront distribution. The solutions architect must design the most cost-effective and resilient architecture for website hosting to serve as the CloudFront origin.
Which solution will meet these requirements?
A. Create a virtual server by using Amazon Lightsail. Configure the web server in the Lightsail instance. Upload website content by using an SFTP client.
B. Create an AWS Auto Scaling group for Amazon EC2 instances. Use an Application Load Balancer. Upload website content by using an SFTP client.
C. Create a private Amazon S3 bucket. Use an S3 bucket policy to allow access from a CloudFront origin access identity (OAI). Upload website content by using the AWS CLI.
D. Create a public Amazon S3 bucket. Configure AWS Transfer for SFTP. Configure the S3 bucket for website hosting. Upload website content by using the SFTP client.
Correct Answer: C
Section: (none)
QUESTION 133
A company has a website hosted on AWS. The website is behind an Application Load Balancer (ALB) that is configured to handle HTTP and HTTPS separately. The company wants to forward all requests to the website so that the requests will use HTTPS.
What should a solutions architect do to meet this requirement?
A. Update the ALB's network ACL to accept only HTTPS traffic
B. Create a rule that replaces the HTTP in the URL with HTTPS
C. Create a listener rule on the ALB to redirect HTTP traffic to HTTPS.
D. Replace the ALB with a Network Load Balancer configured to use Server Name Indication (SNI)
Correct Answer: C
Section: (none)
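A possible boto3 (elbv2) sketch of option C: the HTTP:80 listener's default action redirects to HTTPS:443 with an HTTP 301. The load balancer ARN is a placeholder.

```python
import boto3

elbv2 = boto3.client("elbv2")

# Redirect all HTTP requests on port 80 to HTTPS on port 443.
elbv2.create_listener(
    LoadBalancerArn="arn:aws:elasticloadbalancing:us-east-1:111122223333:loadbalancer/app/example-alb/abc123",
    Protocol="HTTP",
    Port=80,
    DefaultActions=[
        {
            "Type": "redirect",
            "RedirectConfig": {
                "Protocol": "HTTPS",
                "Port": "443",
                "StatusCode": "HTTP_301",  # permanent redirect
            },
        }
    ],
)
```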
QUESTION 134
A startup company wants to decouple its three-tier application stack into microservices and use container technologies to build out the microservices. The company does not want to provision and manage the compute infrastructure for its containers on AWS.
What should a solutions architect recommend to meet these requirements?
A. Provision an Amazon Elastic Container Service (Amazon ECS) cluster. Attach ECS nodes to an Amazon EC2 Auto Scaling group to host containers.
B. Provision an Amazon Elastic Container Service (Amazon ECS) cluster with AWS Fargate. Deploy containers to Fargate tasks.
C. Create AWS Lambda functions to host microservices. Integrate the functions with an Amazon API Gateway API to redirect application traffic.
D. Provision an Amazon Elastic Kubernetes Service (Amazon EKS) cluster. Attach EKS nodes to an Amazon EC2 Auto Scaling group to host containers.
Correct Answer: B
Section: (none)
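A simplified boto3 sketch of option B: register a Fargate-compatible task definition and run it as an ECS service, so no EC2 container instances need to be managed. The cluster, image, role, and subnet values are hypothetical.

```python
import boto3

ecs = boto3.client("ecs")

# Task definition that can run on Fargate: awsvpc networking plus
# task-level CPU and memory are required.
task_def = ecs.register_task_definition(
    family="orders-service",
    requiresCompatibilities=["FARGATE"],
    networkMode="awsvpc",
    cpu="256",
    memory="512",
    executionRoleArn="arn:aws:iam::111122223333:role/ecsTaskExecutionRole",
    containerDefinitions=[
        {
            "name": "orders",
            "image": "111122223333.dkr.ecr.us-east-1.amazonaws.com/orders:latest",
            "portMappings": [{"containerPort": 8080, "protocol": "tcp"}],
            "essential": True,
        }
    ],
)

# Fargate launch type: AWS provisions and manages the compute.
ecs.create_service(
    cluster="microservices",
    serviceName="orders",
    taskDefinition=task_def["taskDefinition"]["taskDefinitionArn"],
    desiredCount=2,
    launchType="FARGATE",
    networkConfiguration={
        "awsvpcConfiguration": {
            "subnets": ["subnet-0123456789abcdef0"],
            "assignPublicIp": "DISABLED",
        }
    },
)
```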
QUESTION 135
A company needs to store data in Amazon S3 and must prevent the data from being changed. The company wants new objects that are uploaded to Amazon S3 to remain unchangeable for a nonspecific amount of time until the company decides to modify the objects. Only specific users in the company's AWS account can have the ability to delete the objects.
What should a solutions architect do to meet these requirements?
A. Create an S3 Glacier vault. Apply a write-once, read-many (WORM) vault lock policy to the objects.
B. Create an S3 bucket with S3 Object Lock enabled. Enable versioning. Set a retention period of 100 years. Use governance mode as the S3 bucket's default retention mode for new objects.
C. Create an S3 bucket. Use AWS CloudTrail to track any S3 API events that modify the objects. Upon notification, restore the modified objects from any backup versions that the company has.
D. Create an S3 bucket with S3 Object Lock enabled. Enable versioning. Add a legal hold to the objects. Add the s3:PutObjectLegalHold permission to the IAM policies of users who need to delete the objects.
Correct Answer: D
Section: (none)
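A brief boto3 sketch of the legal-hold part of option D; the bucket must have Object Lock enabled, and the names are placeholders. A legal hold has no retention period, so the object stays immutable until an authorized user removes the hold.

```python
import boto3

s3 = boto3.client("s3")

# Place a legal hold: the object version cannot be overwritten or deleted
# while the hold is ON, regardless of any retention period.
s3.put_object_legal_hold(
    Bucket="example-locked-bucket",
    Key="records/2024/object-001.dat",
    LegalHold={"Status": "ON"},
)

# Only principals with s3:PutObjectLegalHold can later release the hold.
s3.put_object_legal_hold(
    Bucket="example-locked-bucket",
    Key="records/2024/object-001.dat",
    LegalHold={"Status": "OFF"},
)
```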
QUESTION 136
A global company hosts its web application on Amazon EC2 instances behind an Application Load Balancer (ALB). The web application has static data and dynamic data. The company stores its static data in an Amazon S3 bucket. The company wants to improve performance and reduce latency for the static data and dynamic data. The company is using its own domain name registered with Amazon Route 53. What should a solutions architect do to meet these requirements?
A. Create an Amazon CloudFront distribution that has the S3 bucket and the ALB as origins. Configure Route 53 to route traffic to the CloudFront distribution.
B. Create an Amazon CloudFront distribution that has the ALB as an origin. Create an AWS Global Accelerator standard accelerator that has the S3 bucket as an endpoint. Configure Route 53 to route traffic to the CloudFront distribution.
C. Create an Amazon CloudFront distribution that has the S3 bucket as an origin. Create an AWS Global Accelerator standard accelerator that has the ALB and the CloudFront distribution as endpoints. Create a custom domain name that points to the accelerator DNS name. Use the custom domain name as an endpoint for the web application.
D. Create an Amazon CloudFront distribution that has the ALB as an origin. Create an AWS Global Accelerator standard accelerator that has the S3 bucket as an endpoint. Create two domain names. Point one domain name to the CloudFront DNS name for dynamic content. Point the other domain name to the accelerator DNS name for static content. Use the domain names as endpoints for the web application.
Correct Answer: A
Section: (none)
QUESTION 137
A company recently migrated to AWS and wants to implement a solution to protect the traffic that flows in and out of the production VPC. The company had an inspection server in its on-premises data center. The inspection server performed specific operations such as traffic flow inspection and traffic filtering. The company wants to have the same functionalities in the AWS Cloud.
Which solution will meet these requirements?
A. Use Amazon GuardDuty for traffic inspection and traffic filtering in the production VPC.
B. Use Traffic Mirroring to mirror traffic from the production VPC for traffic inspection and filtering.
C. Use AWS Network Firewall to create the required rules for traffic inspection and traffic filtering for the production VPC.
D. Use AWS Firewall Manager to create the required rules for traffic inspection and traffic filtering for the production VPC.
Correct Answer: C
Section: (none)
QUESTION 138
A company has an ordering application that stores customer information in Amazon RDS for MySQL. During regular business hours, employees run one-time queries for reporting purposes. Timeouts are occurring during order processing because the reporting queries are taking a long time to run. The company needs to eliminate the timeouts without preventing employees from performing queries. What should a solutions architect do to meet these requirements?
A. Create a read replica. Move reporting queries to the read replica.
B. Create a read replica. Distribute the ordering application to the primary DB instance and the read replica.
C. Migrate the ordering application to Amazon DynamoDB with on-demand capacity.
D. Schedule the reporting queries for non-peak hours.
Correct Answer: A
Section: (none)
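A small boto3 sketch of option A, assuming hypothetical instance identifiers; reporting tools then connect to the replica's endpoint instead of the primary.

```python
import boto3

rds = boto3.client("rds")

# Create a read replica of the ordering database; replication is
# asynchronous, so reporting queries no longer block order processing.
response = rds.create_db_instance_read_replica(
    DBInstanceIdentifier="orders-db-reporting-replica",
    SourceDBInstanceIdentifier="orders-db",
    DBInstanceClass="db.r6g.large",
)

# Point the reporting tools at the replica endpoint once it is available.
print(response["DBInstance"]["DBInstanceIdentifier"])
```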
QUESTION 139
A company stores its application logs in an Amazon CloudWatch Logs log group. A new policy requires the company to store all application logs in Amazon OpenSearch Service (Amazon Elasticsearch Service) in near-real time.
Which solution will meet this requirement with the LEAST operational overhead?
A. Configure a CloudWatch Logs subscription to stream the logs to Amazon OpenSearch Service (Amazon Elasticsearch Service).
B. Create an AWS Lambda function. Use the log group to invoke the function to write the logs to Amazon OpenSearch Service (Amazon Elasticsearch Service)
C. Create an Amazon Kinesis Data Firehose delivery stream. Configure the log group as the delivery stream's source.Configure Amazon OpenSearch Service (Amazon Elasticsearch Service) as the delivery stream's destination.
D. Install and configure Amazon Kinesis Agent on each application server to deliver the logs to Amazon Kinesis Data Streams. Configure Kinesis Data Streams to deliver the logs to Amazon OpenSearch Service (Amazon Elasticsearch Service)
Correct Answer: A
Section: (none)
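A hedged boto3 sketch of option A. Assumption: the console workflow for streaming a log group to Amazon OpenSearch Service has already provisioned a delivery Lambda function; the subscription filter then targets that function's ARN (placeholder below) and forwards every log event in near-real time.

```python
import boto3

logs = boto3.client("logs")

# Subscription filter on the log group; an empty pattern matches all events.
# The destination ARN is the delivery Lambda function assumed above.
logs.put_subscription_filter(
    logGroupName="/app/application-logs",
    filterName="stream-to-opensearch",
    filterPattern="",
    destinationArn="arn:aws:lambda:us-east-1:111122223333:function:LogsToOpenSearch",
)
```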
QUESTION 140
A company stores data in an Amazon Aurora PostgreSQL DB cluster. The company must store all the data for 5 years and must delete all the data after 5 years. The company also must indefinitely keep audit logs of actions that are performed within the database. Currently, the company has automated backups configured for Aurora.
Which combination of steps should a solutions architect take to meet these requirements? (Select TWO.)
A. Take a manual snapshot of the DB cluster.
B. Create a lifecycle policy for the automated backups.
C. Configure automated backup retention for 5 years.
D. Configure an Amazon CloudWatch Logs export for the DB cluster
E. Use AWS Backup to take the backups and to keep the backups for 5 years.
Correct Answer: DE
Section: (none)
QUESTION 141
A rapidly growing ecommerce company is running its workloads in a single AWS Region. A solutions architect must create a disaster recovery (DR) strategy that includes a different AWS Region. The company wants its database to be up to date in the DR Region with the least possible latency. The remaining infrastructure in the DR Region needs to run at reduced capacity and must be able to scale up if necessary. Which solution will meet these requirements with the LOWEST recovery time objective (RTO)?
A. Use an Amazon Aurora global database with a pilot light deployment.
B. Use an Amazon Aurora global database with a warm standby deployment.
C. Use an Amazon RDS Multi-AZ DB instance with a pilot light deployment.
D. Use an Amazon RDS Multi-AZ DB instance with a warm standby deployment
Correct Answer: B
Section: (none)
QUESTION 142
A company produces batch data that comes from different databases. The company also produces live stream data from network sensors and application APIs. The company needs to consolidate all the data into one place for business analytics. The company needs to process the incoming data and then stage the data in different Amazon S3 buckets. Teams will later run one-time queries and import the data into a business intelligence tool to show key performance indicators (KPIs). Which combination of steps will meet these requirements with the LEAST operational overhead? (Select TWO.)
A. Use Amazon Athena for one-time queries. Use Amazon QuickSight to create dashboards for KPIs.
B. Use Amazon Kinesis Data Analytics for one-time queries. Use Amazon QuickSight to create dashboards for KPIs.
C. Create custom AWS Lambda functions to move the individual records from the databases to an Amazon Redshift cluster.
D. Use an AWS Glue extract, transform, and load (ETL) job to convert the data into JSON format. Load the data into multiple Amazon OpenSearch Service (Amazon Elasticsearch Service) clusters.
E. Use blueprints in AWS Lake Formation to identify the data that can be ingested into a data lake. Use AWS Glue to crawl the source, extract the data, and load the data into Amazon S3 in Apache Parquet format.
Correct Answer: AE
Section: (none)
QUESTION 143
A company runs its ecommerce application on AWS. Every new order is published as a message in a RabbitMQ queue that runs on an Amazon EC2 instance in a single Availability Zone. These messages are processed by a different application that runs on a separate EC2 instance. This application stores the details in a PostgreSQL database on another EC2 instance. All the EC2 instances are in the same Availability Zone.
The company needs to redesign its architecture to provide the highest availability with the least operational overhead.
What should a solutions architect do to meet these requirements?
A. Migrate the queue to a redundant pair (active/standby) of RabbitMQ instances on Amazon MQ. Create a Multi-AZ Auto Scaling group for EC2 instances that host the application. Create another Multi-AZ Auto Scaling group for EC2 instances that host the PostgreSQL database.
B. Migrate the queue to a redundant pair (active/standby) of RabbitMQ instances on Amazon MQ. Create a Multi-AZ Auto Scaling group for EC2 instances that host the application. Migrate the database to run on a Multi-AZ deployment of Amazon RDS for PostgreSQL.
C. Create a Multi-AZ Auto Scaling group for EC2 instances that host the RabbitMQ queue. Create another Multi-AZ Auto Scaling group for EC2 instances that host the application. Migrate the database to run on a Multi-AZ deployment of Amazon RDS for PostgreSQL.
D. Create a Multi-AZ Auto Scaling group for EC2 instances that host the RabbitMQ queue. Create another Multi-AZ Auto Scaling group for EC2 instances that host the application. Create a third Multi-AZ Auto Scaling group for EC2 instances that host the PostgreSQL database.
Correct Answer: B
Section: (none)
QUESTION 144
A company is deploying a new public web application to AWS. The application will run behind an Application Load Balancer (ALB). The application needs to be encrypted at the edge with an SSL/TLS certificate that is issued by an external certificate authority (CA). The certificate must be rotated each year before the certificate expires.
What should a solutions architect do to meet these requirements?
A. Use AWS Certificate Manager (ACM) to issue an SSL/TLS certificate. Apply the certificate to the ALB. Use the managed renewal feature to automatically rotate the certificate.
B. Use AWS Certificate Manager (ACM) to issue an SSL/TLS certificate. Import the key material from the certificate. Apply the certificate to the ALB. Use the managed renewal feature to automatically rotate the certificate.
C. Use AWS Certificate Manager (ACM) Private Certificate Authority to issue an SSL/TLS certificate from the root CA. Apply the certificate to the ALB. Use the managed renewal feature to automatically rotate the certificate.
D. Use AWS Certificate Manager (ACM) to import an SSL/TLS certificate. Apply the certificate to the ALB. Use Amazon EventBridge (Amazon CloudWatch Events) to send a notification when the certificate is nearing expiration. Rotate the certificate manually.
Correct Answer: D
Section: (none)
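A minimal boto3 sketch of the import step in option D; the PEM file names are placeholders. Because the certificate comes from an external CA, ACM cannot renew it, so each year a freshly issued certificate is re-imported (passing the existing CertificateArn keeps the ALB association intact).

```python
import boto3

acm = boto3.client("acm")

# Import (or re-import) the externally issued certificate into ACM.
with open("certificate.pem", "rb") as cert_file, \
        open("private-key.pem", "rb") as key_file, \
        open("chain.pem", "rb") as chain_file:
    response = acm.import_certificate(
        Certificate=cert_file.read(),
        PrivateKey=key_file.read(),
        CertificateChain=chain_file.read(),
        # CertificateArn="arn:aws:acm:...",  # pass the existing ARN when rotating
    )

print(response["CertificateArn"])  # attach this ARN to the ALB HTTPS listener
```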
QUESTION 145
A company uses NFS to store large video files in on-premises network attached storage. Each video file ranges in size from 1 MB to 500 GB. The total storage is 70 TB and is no longer growing. The company decides to migrate the video files to Amazon S3. The company must migrate the video files as soon as possible while using the least possible network bandwidth.
Which solution will meet these requirements?
A. Create an S3 bucket. Create an IAM role that has permissions to write to the S3 bucket. Use the AWS CLl to copy all files locally to the S3 bucket.
B. Create an AWS Snowball Edge job. Receive a Snowball Edge device on premises. Use the Snowball Edge client to transfer data to the device. Return the device so that AWS can import the data into Amazon S3.
C. Deploy an S3 File Gateway on premises. Create a public service endpoint to connect to the S3 File Gateway. Create an S3 bucket. Create a new NFS file share on the S3 File Gateway. Point the new file share to the S3 bucket. Transfer the data from the existing NFS file share to the S3 File Gateway.
D. Set up an AWS Direct Connect connection between the on-premises network and AWS. Deploy an S3 File Gateway on premises. Create a public virtual interface (VIF) to connect to the S3 File Gateway. Create an S3 bucket. Create a new NFS file share on the S3 File Gateway. Point the new file share to the S3 bucket. Transfer the data from the existing NFS file share to the S3 File Gateway.
Correct Answer: B
Section: (none)
QUESTION 146
A company uses Amazon S3 as its data lake. The company has a new partner that must use SFTP to upload data files. A solutions architect needs to implement a highly available SFTP solution that minimizes operational overhead.
Which solution will meet these requirements?
A. Use AWS Transfer Family to configure an SFTP-enabled server with a publicly accessible endpoint. Choose the S3 data lake as the destination.
B. Use Amazon S3 File Gateway as an SFTP server. Expose the S3 File Gateway endpoint URL to the new partner. Share the S3 File Gateway endpoint with the new partner.
C. Launch an Amazon EC2 instance in a private subnet in a VPC. Instruct the new partner to upload files to the EC2 instance by using a VPN. Run a cron job script on the EC2 instance to upload files to the S3 data lake.
D. Launch Amazon EC2 instances in a private subnet in a VPC. Place a Network Load Balancer (NLB) in front of the EC2 instances. Create an SFTP listener port for the NLB. Share the NLB hostname with the new partner. Run a cron job script on the EC2 instances to upload files to the S3 data lake.
Correct Answer: A
Section: (none)
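A rough boto3 sketch of option A: a fully managed, SFTP-enabled AWS Transfer Family server with a public endpoint that writes directly into the S3 data lake. The role, bucket, and key values are hypothetical.

```python
import boto3

transfer = boto3.client("transfer")

# Managed SFTP endpoint backed by Amazon S3; no servers to operate.
server = transfer.create_server(
    Protocols=["SFTP"],
    Domain="S3",
    EndpointType="PUBLIC",
    IdentityProviderType="SERVICE_MANAGED",
)

# Service-managed user for the partner, landing uploads in the data lake bucket.
transfer.create_user(
    ServerId=server["ServerId"],
    UserName="partner-upload",
    Role="arn:aws:iam::111122223333:role/TransferS3AccessRole",
    HomeDirectory="/example-data-lake/partner-uploads",
    SshPublicKeyBody="ssh-rsa AAAA...partner-public-key",
)
```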
QUESTION 147
A company has migrated a two-tier application from its on-premises data center to the AWS Cloud. The data tier is a Multi-AZ deployment of Amazon RDS for Oracle with 12 TB of General Purpose SSD Amazon Elastic Block Store (Amazon EBS) storage. The application is designed to process and store documents in the database as binary large objects (blobs) with an average document size of 6 MB. The database size has grown over time, reducing the performance and increasing the cost of storage. The company must improve the database performance and needs a solution that is highly available and resilient. Which solution will meet these requirements MOST cost-effectively?
A. Reduce the RDS DB instance size. Increase the storage capacity to 24 TiB. Change the storage type to Magnetic.
B. Increase the RDS DB instance size. Increase the storage capacity to 24 TiB. Change the storage type to Provisioned IOPS.
C. Create an Amazon S3 bucket. Update the application to store documents in the S3 bucket. Store the object metadata in the existing database.
D. Create an Amazon DynamoDB table. Update the application to use DynamoDB. Use AWS Database Migration Service (AWS DMS) to migrate data from the Oracle database to DynamoDB.
Correct Answer: C
Section: (none)
QUESTION 148
A company has a small Python application that processes JSON documents and outputs the results to an on-premises SQL database. The application runs thousands of times each day. The company wants to move the application to the AWS Cloud. The company needs a highly available solution that maximizes scalability and minimizes operational overhead.
Which solution will meet these requirements?
A. Place the JSON documents in an Amazon S3 bucket. Run the Python code on multiple Amazon EC2 instances to process the documents. Store the results in an Amazon Aurora DB cluster.
B. Place the JSON documents in an Amazon S3 bucket. Create an AWS Lambda function that runs the Python code to process the documents as they arrive in the S3 bucket. Store the results in an Amazon Aurora DB cluster.
C. Place the JSON documents in an Amazon Elastic Block Store (Amazon EBS) volume. Use the EBS Multi-Attach feature to attach the volume to multiple Amazon EC2 instances. Run the Python code on the EC2 instances to process the documents. Store the results on an Amazon RDS DB instance.
D. Place the JSON documents in an Amazon Simple Queue Service (Amazon SQS) queue as messages. Deploy the Python code as a container on an Amazon Elastic Container Service (Amazon ECS) cluster that is configured with the Amazon EC2 launch type. Use the container to process the SQS messages. Store the results on an Amazon RDS DB instance.
Correct Answer: B
Section: (none)
QUESTION 149
A company is running several business applications in three separate VPCs within the us-east-1 Region. The applications must be able to communicate between VPCs. The applications also must be able to consistently send hundreds of gigabytes of data each day to a latency-sensitive application that runs in a single on-premises data center.
A solutions architect needs to design a network connectivity solution that maximizes cost-effectiveness. Which solution meets these requirements?
A. Configure three AWS Site-to-Site VPN connections from the data center to AWS. Establish connectivity by configuring one VPN connection for each VPC.
B. Launch a third-party virtual network appliance in each VPC. Establish an IPsec VPN tunnel between the data center and each virtual appliance.
C. Set up three AWS Direct Connect connections from the data center to a Direct Connect gateway in us-east-1. Establish connectivity by configuring each VPC to use one of the Direct Connect connections.
D. Set up one AWS Direct Connect connection from the data center to AWS. Create a transit gateway, and attach each VPC to the transit gateway. Establish connectivity between the Direct Connect connection and the transit gateway.
Correct Answer: D
Section: (none)
QUESTION 150
A company needs to store data from its healthcare application. The application's data frequently changes. A new regulation requires audit access at all levels of the stored data. The company hosts the application on an on-premises infrastructure that is running out of storage capacity. A solutions architect must securely migrate the existing data to AWS while satisfying the new regulation.
Which solution will meet these requirements?
A. Use AWS DataSync to move the existing data to Amazon S3. Use AWS CloudTrail to log data events.
B. Use AWS Snowcone to move the existing data to Amazon S3. Use AWS CloudTrail to log management events.
C. Use Amazon S3 Transfer Acceleration to move the existing data to Amazon S3. Use AWS CloudTrail to log data events.
D. Use AWS Storage Gateway to move the existing data to Amazon S3. Use AWS CloudTrail to log management events.
Correct Answer: A
Section: (none)
QUESTION 151
A company has an ecommerce checkout workflow that writes an order to a database and calls a service to process the payment. Users are experiencing timeouts during the checkout process. When users resubmit the checkout form, multiple unique orders are created for the same desired transaction. How should a solutions architect refactor this workflow to prevent the creation of multiple orders?
A. Configure the web application to send an order message to Amazon Kinesis Data Firehose. Set the payment service to retrieve the message from Kinesis Data Firehose and process the order.
B. Create a rule in AWS CloudTrail to invoke an AWS Lambda function based on the logged application path request. Use Lambda to query the database, call the payment service, and pass in the order information.
C. Store the order in the database. Send a message that includes the order number to Amazon Simple Notification Service (Amazon SNS). Set the payment service to poll Amazon SNS, retrieve the message, and process the order.
D. Store the order in the database. Send a message that includes the order number to an Amazon Simple Queue Service (Amazon SQS) FIFO queue. Set the payment service to retrieve the message and process the order. Delete the message from the queue.
Correct Answer: D
Section: (none)
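A short boto3 sketch of option D: an SQS FIFO queue with an explicit deduplication ID, so a resubmitted checkout form does not create a second payment message. The queue name and order values are placeholders.

```python
import json
import boto3

sqs = boto3.client("sqs")

# FIFO queue: duplicates with the same deduplication ID are dropped
# within the 5-minute deduplication window.
queue = sqs.create_queue(
    QueueName="checkout-orders.fifo",
    Attributes={"FifoQueue": "true", "ContentBasedDeduplication": "false"},
)

order = {"order_id": "ORD-12345", "amount": 42.50}

# Using the order ID as the deduplication ID means a resubmitted checkout
# produces the same ID and SQS silently discards the duplicate message.
sqs.send_message(
    QueueUrl=queue["QueueUrl"],
    MessageBody=json.dumps(order),
    MessageGroupId="checkout",
    MessageDeduplicationId=order["order_id"],
)
```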
QUESTION 152
A company used an Amazon RDS for MySQL DB instance during application testing. Before terminating the DB instance at the end of the test cycle, a solutions architect created two backups. The solutions architect created the first backup by using the mysqldump utility to create a database dump. The solutions architect created the second backup by enabling the final DB snapshot option on RDS termination. The company is now planning for a new test cycle and wants to create a new DB instance from the most recent backup. The company has chosen a MySQL-compatible edition of Amazon Aurora to host the DB instance. Which solutions will create the new DB instance? (Select TWO.)
A. Import the RDS snapshot directly into Aurora.
B. Upload the RDS snapshot to Amazon S3. Then import the RDS snapshot into Aurora.
C. Upload the database dump to Amazon S3. Then import the database dump into Aurora.
D. Use AWS Database Migration Service (AWS DMS) to import the RDS snapshot into Aurora.
E. Upload the database dump to Amazon S3. Then use AWS Database Migration Service (AWS DMS) to import the database dump into Aurora.
Correct Answer: AC
Section: (none)
QUESTION 153
A company is creating an application that runs on containers in a VPC. The application stores and accesses data in an Amazon S3 bucket. During the development phase, the application will store and access 1 TB of data in Amazon S3 each day. The company wants to minimize costs and wants to prevent traffic from traversing the internet whenever possible.
Which solution will meet these requirements?
A. Enable S3 Intelligent-Tiering for the S3 bucket.
B. Enable S3 Transfer Acceleration for the S3 bucket.
C. Create a gateway VPC endpoint for Amazon S3. Associate this endpoint with all route tables in the VPC.
D. Create an interface endpoint for Amazon S3 in the VPC. Associate this endpoint with all route tables in the VPC.
Correct Answer: C
Section: (none)
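A compact boto3 sketch of option C; the VPC and route table IDs are placeholders. A gateway endpoint for S3 has no hourly or data processing charge, and the route table entries keep S3 traffic on the AWS network.

```python
import boto3

ec2 = boto3.client("ec2")

# Gateway VPC endpoint for S3: traffic to S3 stays off the public internet.
ec2.create_vpc_endpoint(
    VpcEndpointType="Gateway",
    VpcId="vpc-0123456789abcdef0",
    ServiceName="com.amazonaws.us-east-1.s3",
    RouteTableIds=["rtb-0a1b2c3d4e5f67890", "rtb-0f9e8d7c6b5a43210"],
)
```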
QUESTION 154
A rapidly growing global ecommerce company is hosting its web application on AWS. The web application includes static content and dynamic content. The website stores online transaction processing (OLTP) data in an Amazon RDS database. The website's users are experiencing slow page loads. Which combination of actions should a solutions architect take to resolve this issue? (Select TWO.)
A. Configure an Amazon Redshift cluster.
B. Set up an Amazon CloudFront distribution.
C. Host the dynamic web content in Amazon S3.
D. Create a read replica for the RDS DB instance.
E. Configure a Multi-AZ deployment for the RDS DB instance.
Correct Answer: BD
Section: (none)
QUESTION 155
A company needs to store contract documents. A contract lasts for 5 years. During the 5-year period, the company must ensure that the documents cannot be overwritten or deleted. The company needs to encrypt the documents at rest and rotate the encryption keys automatically every year. Which combination of steps should a solutions architect take to meet these requirements with the LEAST operational overhead? (Select TWO.)
A. Store the documents in Amazon S3. Use S3 Object Lock in governance mode.
B. Store the documents in Amazon S3. Use S3 Object Lock in compliance mode.
C. Use server-side encryption with Amazon S3 managed encryption keys (SSE-S3). Configure key rotation.
D. Use server-side encryption with AWS Key Management Service (AWS KMS) customer managed keys. Configure key rotation.
E. Use server-side encryption with AWS Key Management Service (AWS KMS) customer provided (imported) keys. Configure key rotation.
Correct Answer: BD
Section: (none)
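A boto3 sketch combining options B and D under assumed names: a customer managed KMS key with automatic annual rotation, default SSE-KMS encryption on the bucket, and S3 Object Lock in compliance mode for the 5-year contract period (the bucket itself must be created with Object Lock enabled).

```python
import boto3

kms = boto3.client("kms")
s3 = boto3.client("s3")
bucket = "example-contract-documents"  # placeholder bucket name

# Customer managed key with automatic annual rotation (option D).
key = kms.create_key(Description="Contract document encryption key")
key_id = key["KeyMetadata"]["KeyId"]
kms.enable_key_rotation(KeyId=key_id)

# Default bucket encryption with the customer managed key.
s3.put_bucket_encryption(
    Bucket=bucket,
    ServerSideEncryptionConfiguration={
        "Rules": [
            {
                "ApplyServerSideEncryptionByDefault": {
                    "SSEAlgorithm": "aws:kms",
                    "KMSMasterKeyID": key_id,
                }
            }
        ]
    },
)

# Object Lock in compliance mode for 5 years (option B).
s3.put_object_lock_configuration(
    Bucket=bucket,
    ObjectLockConfiguration={
        "ObjectLockEnabled": "Enabled",
        "Rule": {"DefaultRetention": {"Mode": "COMPLIANCE", "Years": 5}},
    },
)
```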
QUESTION 156
A solutions architect is migrating a document management workload to AWS. The workload keeps 7 TiB of contract documents on a shared storage file system and tracks them on an external database. Most of the documents are stored and retrieved eventually for reference in the future. The application cannot be modified during the migration, and the storage solution must be highly available. Documents are retrieved and stored by web servers that run on Amazon EC2 instances in an Auto Scaling group. The Auto Scaling group can have up to 12 instances. Which solution meets these requirements MOST cost-effectively?
A. Provision an enhanced networking optimized EC2 instance to serve as a shared NFS storage system.
B. Create an Amazon S3 bucket that uses the S3 Standard-Infrequent Access (S3 Standard-IA) storage class. Mount the S3 bucket to the EC2 instances in the Auto Scaling group.
C. Create an SFTP server endpoint by using AWS Transfer for SFTP and an Amazon S3 bucket. Configure the EC2 instances in the Auto Scaling group to connect to the SFTP server.
D. Create an Amazon Elastic File System (Amazon EFS) file system that uses the EFS Standard-Infrequent Access (EFS Standard-IA) storage class. Mount the file system to the EC2 instances in the Auto Scaling group.
Correct Answer: D
Section: (none)
QUESTION 157
A company is using a SQL database to store movie data that is publicly accessible. The database runs on an Amazon RDS Single-AZ DB instance. A script runs queries at random intervals each day to record the number of new movies that have been added to the database. The script must report a final total during business hours.
The company's development team notices that the database performance is inadequate for development tasks when the script is running. A solutions architect must recommend a solution to resolve this issue. Which solution will meet this requirement with the LEAST operational overhead?
A. Modify the DB instance to be a Multi-AZ deployment.
B. Create a read replica of the database. Configure the script to query only the read replica.
C. Instruct the development team to manually export the entries in the database at the end of each day.
D. Use Amazon ElastiCache to cache the common queries that the script runs against the database.
Correct Answer: B
Section: (none)
QUESTION 158
A gaming company is moving its public scoreboard from a data center to the AWS Cloud. The company uses Amazon EC2 Windows Server instances behind an Application Load Balancer to host its dynamic application. The company needs a highly available storage solution for the application. The application consists of static files and dynamic server-side code.
Which combination of steps should a solutions architect take to meet these requirements? (Select TWO.)
A. Store the static files on Amazon S3. Use Amazon CloudFront to cache objects at the edge.
B. Store the static files on Amazon S3. Use Amazon ElastiCache to cache objects at the edge.
C. Store the server-side code on Amazon Elastic File System (Amazon EFS). Mount the EFS volume on each EC2 instance to share the files.
D. Store the server-side code on Amazon FSx for Windows File Server. Mount the FSx for Windows File Server volume on each EC2 instance to share the files.
E. Store the server-side code on a General Purpose SSD (gp2) Amazon Elastic Block Store (Amazon EBS) volume. Mount the EBS volume on each EC2 instance to share the files.
Correct Answer: AD
Section: (none)
QUESTION 159
A company has an application that places hundreds of .csv files into an Amazon S3 bucket every hour. The files are 1 GB in size. Each time a file is uploaded, the company needs to convert the file to Apache Parquet format and place the output file into an S3 bucket.
Which solution will meet these requirements with the LEAST operational overhead?
A. Create an AWS Lambda function to download the .csv files, convert the files to Parquet format, and place the output files in an S3 bucket. Invoke the Lambda function for each S3 PUT event.
B. Create an Apache Spark job to read the .csv files, convert the files to Parquet format, and place the output files in an S3 bucket. Create an AWS Lambda function for each S3 PUT event to invoke the Spark job.
C. Create an AWS Glue table and an AWS Glue crawler for the S3 bucket where the application places the .csv files. Schedule an AWS Lambda function to periodically use Amazon Athena to query the AWS Glue table, convert the query results into Parquet format, and place the output files into an S3 bucket.
D. Create an AWS Glue extract, transform, and load (ETL) job to convert the .csv files to Parquet format and place the output files into an S3 bucket. Create an AWS Lambda function for each S3 PUT event to invoke the ETL job.
Correct Answer: D
Section: (none)
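A minimal sketch of the Lambda function in option D, invoked on each S3 PUT event and starting the AWS Glue ETL job; the job name, argument names, and output bucket are hypothetical.

```python
import boto3

glue = boto3.client("glue")


def handler(event, context):
    # Triggered by an S3 PUT event; start the Glue ETL job for each new .csv file.
    for record in event["Records"]:
        bucket = record["s3"]["bucket"]["name"]
        key = record["s3"]["object"]["key"]
        glue.start_job_run(
            JobName="csv-to-parquet",  # hypothetical Glue job name
            Arguments={
                "--source_path": f"s3://{bucket}/{key}",
                "--target_path": "s3://example-parquet-output/",  # placeholder
            },
        )
```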
QUESTION 160
A company is creating a prototype of an ecommerce website on AWS. The website consists of an Application Load Balancer, an Auto Scaling group of Amazon EC2 instances for web servers, and an Amazon RDS for MySQL DB instance that runs with the Single-AZ configuration. The website is slow to respond during searches of the product catalog. The product catalog is a group of tables in the MySQL database that the company does not update frequently. A solutions architect has determined that the CPU utilization on the DB instance is high when product catalog searches occur. What should the solutions architect recommend to improve the performance of the website during searches of the product catalog?
A. Migrate the product catalog to an Amazon Redshift database. Use the COPY command to load the product catalog tables.
B. Implement an Amazon ElastiCache for Redis cluster to cache the product catalog. Use lazy loading to populate the cache.
C. Add an additional scaling policy to the Auto Scaling group to launch additional EC2 instances when database response is slow.
D. Turn on the Multi-AZ configuration for the DB instance. Configure the EC2 instances to throttle the product catalog queries that are sent to the database.
Correct Answer: B
Section: (none)
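An illustrative lazy-loading (cache-aside) sketch for option B using the redis-py client; the cluster endpoint, key naming, and db_query helper are assumptions, not part of the question.

```python
import json
import redis

# Hypothetical ElastiCache for Redis endpoint.
cache = redis.Redis(host="catalog-cache.example.use1.cache.amazonaws.com", port=6379)


def search_catalog(term, db_query):
    """Lazy loading: return cached results on a hit; on a miss, query the
    MySQL database once and populate the cache for subsequent searches."""
    key = f"catalog:search:{term}"
    cached = cache.get(key)
    if cached is not None:
        return json.loads(cached)  # cache hit: no load on the DB instance

    results = db_query(term)  # cache miss: db_query stands in for the SQL call
    cache.setex(key, 3600, json.dumps(results))  # keep for 1 hour
    return results
```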