AWS Certified Solutions Architect - Associate (SAA-C03)

Disclaimers:
  • ExamTopics website is not related to, affiliated with, endorsed or authorized by Amazon.
  • Trademarks, certification & product names are used for reference only and belong to Amazon.

Topic 1 - Exam A

Question #46 Topic 1

A company has applications that run on Amazon EC2 instances in a VPC. One of the applications needs to call the Amazon S3 API to store and read objects. According to the company's security regulations, no traffic from the applications is allowed to travel across the internet. Which solution will meet these requirements?

  • A Configure an S3 gateway endpoint.
  • B Create an S3 bucket in a private subnet.
  • C Create an S3 bucket in the same AWS Region as the EC2 instances.
  • D Configure a NAT gateway in the same subnet as the EC2 instances.
Suggested Answer: A
Explanation: Configuring an S3 gateway endpoint in the VPC lets the applications call the Amazon S3 API over the AWS network through route-table entries, so no traffic traverses the internet, which satisfies the company's security regulations.
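A gateway endpoint is attached to the VPC's route tables rather than to a subnet. Below is a minimal sketch using boto3; the VPC ID, route table ID, and Region are hypothetical placeholders.

```python
import boto3

ec2 = boto3.client("ec2", region_name="us-east-1")

# Create an S3 gateway endpoint in the VPC (IDs below are placeholders).
# The endpoint adds a route for the S3 prefix list to the given route
# tables, so S3 API calls from the instances stay on the AWS network.
response = ec2.create_vpc_endpoint(
    VpcEndpointType="Gateway",
    VpcId="vpc-0123456789abcdef0",            # hypothetical VPC ID
    ServiceName="com.amazonaws.us-east-1.s3",
    RouteTableIds=["rtb-0123456789abcdef0"],  # hypothetical route table
)
print(response["VpcEndpoint"]["VpcEndpointId"])
```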
Question #47 Topic 1

A company runs an on-premises application that is powered by a MySQL database. The company is migrating the application to AWS to increase the application's elasticity and availability. The current architecture shows heavy read activity on the database during times of normal operation. Every 4 hours, the company's development team pulls a full export of the production database to populate a database in the staging environment. During this period, users experience unacceptable application latency. The development team is unable to use the staging environment until the procedure completes. A solutions architect must recommend replacement architecture that alleviates the application latency issue. The replacement architecture also must give the development team the ability to continue using the staging environment without delay. Which solution meets these requirements?

  • A Use Amazon Aurora MySQL with Multi-AZ Aurora Replicas for production. Populate the staging database by implementing a backup and restore process that uses the mysqldump utility.
  • B Use Amazon Aurora MySQL with Multi-AZ Aurora Replicas for production. Use database cloning to create the staging database on-demand.
  • C Use Amazon RDS for MySQL with a Multi-AZ deployment and read replicas for production. Use the standby instance for the staging database.
  • D Use Amazon RDS for MySQL with a Multi-AZ deployment and read replicas for production. Populate the staging database by implementing a backup and restore process that uses the mysqldump utility.
Suggested Answer: B
Explanation: Amazon Aurora MySQL with Multi-AZ Aurora Replicas increases the application's elasticity and availability, and the replicas absorb the heavy read traffic. Aurora database cloning uses a copy-on-write protocol, so the staging database can be created on demand in minutes without a full export. This removes the application latency users experience every 4 hours and lets the development team use the staging environment without delay.
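Aurora exposes cloning through the point-in-time-restore API with a copy-on-write restore type. A minimal sketch with boto3 follows; the cluster identifiers and instance class are hypothetical.

```python
import boto3

rds = boto3.client("rds", region_name="us-east-1")

# Clone the production Aurora cluster for staging (identifiers are
# placeholders). RestoreType="copy-on-write" creates a clone that shares
# storage with the source, so it is fast and does not load production
# the way a mysqldump export does.
rds.restore_db_cluster_to_point_in_time(
    DBClusterIdentifier="staging-cluster",           # new clone
    SourceDBClusterIdentifier="production-cluster",  # existing cluster
    RestoreType="copy-on-write",
    UseLatestRestorableTime=True,
)

# The clone starts with no instances; add one so it can serve connections.
rds.create_db_instance(
    DBInstanceIdentifier="staging-instance-1",
    DBClusterIdentifier="staging-cluster",
    DBInstanceClass="db.r6g.large",
    Engine="aurora-mysql",
)
```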
Question #48 Topic 1

A survey company has gathered data for several years from areas in the United States. The company hosts the data in an Amazon S3 bucket that is 3 TB in size and growing. The company has started to share the data with a European marketing firm that has S3 buckets. The company wants to ensure that its data transfer costs remain as low as possible. Which solution will meet these requirements?

  • A Configure the Requester Pays feature on the company's S3 bucket.
  • B Configure S3 Cross-Region Replication from the company's S3 bucket to one of the marketing firm's S3 buckets.
  • C Configure cross-account access for the marketing firm so that the marketing firm has access to the company's S3 bucket.
  • D Configure the company's S3 bucket to use S3 Intelligent-Tiering. Sync the S3 bucket to one of the marketing firm's S3 buckets.
Suggested Answer: B
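The suggested answer relies on S3 Cross-Region Replication, which copies each object to the marketing firm's bucket once rather than serving repeated cross-Region reads. A minimal sketch of a replication configuration with boto3 is below, assuming versioning is already enabled on both buckets; the bucket names, account ID, and IAM role are hypothetical.

```python
import boto3

s3 = boto3.client("s3")

# Replicate new objects from the company's bucket to the marketing
# firm's bucket (names, account ID, and role ARN are placeholders).
# Both buckets must have versioning enabled for replication to work.
s3.put_bucket_replication(
    Bucket="survey-company-data",
    ReplicationConfiguration={
        "Role": "arn:aws:iam::111122223333:role/s3-replication-role",
        "Rules": [
            {
                "ID": "replicate-to-marketing-firm",
                "Status": "Enabled",
                "Priority": 1,
                "Filter": {},  # empty filter: replicate the whole bucket
                "DeleteMarkerReplication": {"Status": "Disabled"},
                "Destination": {
                    "Bucket": "arn:aws:s3:::marketing-firm-bucket",
                    "Account": "444455556666",  # marketing firm's account
                    "AccessControlTranslation": {"Owner": "Destination"},
                },
            }
        ],
    },
)
```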
Question #49 Topic 1

A company needs to configure a real-time data ingestion architecture for its application. The company needs an API, a process that transforms data as the data is streamed, and a storage solution for the data. Which solution will meet these requirements with the LEAST operational overhead?

  • A Deploy an Amazon EC2 instance to host an API that sends data to an Amazon Kinesis data stream. Create an Amazon Kinesis Data Firehose delivery stream that uses the Kinesis data stream as a data source. Use AWS Lambda functions to transform the data. Use the Kinesis Data Firehose delivery stream to send the data to Amazon S3.
  • B Deploy an Amazon EC2 instance to host an API that sends data to AWS Glue. Stop source/destination checking on the EC2 instance. Use AWS Glue to transform the data and to send the data to Amazon S3.
  • C Configure an Amazon API Gateway API to send data to an Amazon Kinesis data stream. Create an Amazon Kinesis Data Firehose delivery stream that uses the Kinesis data stream as a data source. Use AWS Lambda functions to transform the data. Use the Kinesis Data Firehose delivery stream to send the data to Amazon S3.
  • D Configure an Amazon API Gateway API to send data to AWS Glue. Use AWS Lambda functions to transform the data. Use AWS Glue to send the data to Amazon S3.
Suggested Answer: C
Explanation: This solution uses fully managed, integrated AWS services, which minimizes operational overhead. Amazon API Gateway provides the API and sends data to an Amazon Kinesis data stream for real-time ingestion. A Kinesis Data Firehose delivery stream reads from the data stream and invokes AWS Lambda functions, which are serverless and scale automatically, to transform the records in flight. Firehose then delivers the transformed data to Amazon S3 for storage.
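Firehose hands the transformation Lambda a batch of base64-encoded records and expects each one back with its recordId, a result status, and the transformed data. A minimal sketch of such a handler, assuming newline-delimited JSON records and a hypothetical field normalization:

```python
import base64
import json


def lambda_handler(event, context):
    """Kinesis Data Firehose transformation Lambda.

    Firehose invokes this function with a batch of records whose data is
    base64-encoded. Each record must be returned with its original
    recordId, a result of "Ok" (or "Dropped"/"ProcessingFailed"), and the
    transformed, re-encoded data.
    """
    output = []
    for record in event["records"]:
        payload = json.loads(base64.b64decode(record["data"]))

        # Hypothetical transform: normalize a field before delivery to S3.
        payload["source"] = payload.get("source", "unknown").lower()

        output.append({
            "recordId": record["recordId"],
            "result": "Ok",
            "data": base64.b64encode(
                (json.dumps(payload) + "\n").encode("utf-8")
            ).decode("utf-8"),
        })
    return {"records": output}
```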
Question #50 Topic 1

A company needs to keep user transaction data in an Amazon DynamoDB table. The company must retain the data for 7 years. What is the MOST operationally efficient solution that meets these requirements?

  • A Use DynamoDB point-in-time recovery to back up the table continuously.
  • B Use AWS Backup to create backup schedules and retention policies for the table.
  • C Create an on-demand backup of the table by using the DynamoDB console. Store the backup in an Amazon S3 bucket. Set an S3 Lifecycle configuration for the S3 bucket.
  • D Create an Amazon EventBridge (Amazon CloudWatch Events) rule to invoke an AWS Lambda function. Configure the Lambda function to back up the table and to store the backup in an Amazon S3 bucket. Set an S3 Lifecycle configuration for the S3 bucket.
Suggested Answer: B
Explanation: AWS Backup is a fully managed service that centralizes and automates backups across AWS services. A backup plan can back up the DynamoDB table on a schedule and apply a retention policy that keeps the data for 7 years, with no custom scripts or manual intervention, making it the most operationally efficient solution.
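A backup plan carries the schedule and the retention policy together, and a selection assigns the table to the plan. A minimal sketch with boto3 follows; the plan name, vault, cron schedule, role, and table ARN are hypothetical, and 7 years is expressed as 2,555 days.

```python
import boto3

backup = boto3.client("backup", region_name="us-east-1")

# Daily backups for resources assigned to this plan, retained for 7 years
# (~2555 days). Names and schedule are placeholders.
plan = backup.create_backup_plan(
    BackupPlan={
        "BackupPlanName": "dynamodb-7-year-retention",
        "Rules": [
            {
                "RuleName": "daily-backup",
                "TargetBackupVaultName": "Default",
                "ScheduleExpression": "cron(0 5 * * ? *)",  # 05:00 UTC daily
                "Lifecycle": {"DeleteAfterDays": 2555},     # keep for 7 years
            }
        ],
    }
)

# Assign the DynamoDB table to the plan (ARNs are placeholders).
backup.create_backup_selection(
    BackupPlanId=plan["BackupPlanId"],
    BackupSelection={
        "SelectionName": "transactions-table",
        "IamRoleArn": "arn:aws:iam::111122223333:role/aws-backup-role",
        "Resources": [
            "arn:aws:dynamodb:us-east-1:111122223333:table/Transactions"
        ],
    },
)
```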