AWS Certified Database Specialty (DBS-C01)

The AWS Certified Database Specialty (DBS-C01) questions were last updated today.
Disclaimers:
  • ExamTopics is not related to, affiliated with, endorsed, or authorized by Amazon.
  • Trademarks, certification, and product names are used for reference only and belong to Amazon.

Topic 1 - Exam A

Question #46 Topic 1

A retail company uses Amazon Redshift Spectrum to run complex analytical queries on objects that are stored in an Amazon S3 bucket. The objects are joined with multiple dimension tables that are stored in an Amazon Redshift database. The company uses the database to create monthly and quarterly aggregated reports. Users who attempt to run queries are reporting the following error message: "error: Spectrum Scan Error: Access throttled". Which solution will resolve this error?

  • A Check file sizes of fact tables in Amazon S3, and look for large files. Break up large files into smaller files of equal size between 100 MB and 1 GB.
  • B Reduce the number of queries that users can run in parallel.
  • C Check file sizes of fact tables in Amazon S3, and look for small files. Merge the small files into larger files of at least 64 MB in size.
  • D Review and optimize queries that submit a large aggregation step to Redshift Spectrum.
Suggested Answer: A
NOTE: Reference: https://aws.amazon.com/premiumsupport/knowledge-center/s3-upload-large-files/
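For context, a minimal boto3 sketch (with a hypothetical bucket and prefix) that lists the fact-table objects in Amazon S3 and flags files outside the size range mentioned in the answer choices could look like this:

    import boto3

    # Hypothetical bucket and prefix holding the Redshift Spectrum fact-table objects.
    BUCKET = "example-retail-datalake"
    PREFIX = "facts/sales/"

    s3 = boto3.client("s3")
    paginator = s3.get_paginator("list_objects_v2")

    # Flag objects outside the roughly 100 MB - 1 GB range referenced in the answer choices.
    for page in paginator.paginate(Bucket=BUCKET, Prefix=PREFIX):
        for obj in page.get("Contents", []):
            size_mib = obj["Size"] / 1024 ** 2
            if size_mib < 100 or size_mib > 1024:
                print(f"{obj['Key']}: {size_mib:.1f} MiB")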
Question #47 Topic 1

A company's database specialist is building an Amazon RDS for Microsoft SQL Server DB instance to store hundreds of records in CSV format. A customer service tool uploads the records to an Amazon S3 bucket. An employee who previously worked at the company already created a custom stored procedure to map the necessary CSV fields to the database tables. The database specialist needs to implement a solution that reuses this previous work and minimizes operational overhead. Which solution will meet these requirements?

  • A Create an Amazon S3 event to invoke an AWS Lambda function. Configure the Lambda function to parse the .csv file and use a SQL client library to run INSERT statements to load the data into the tables.
  • B Write a custom .NET app that is hosted on Amazon EC2. Configure the .NET app to load the .csv file and call the custom stored procedure to insert the data into the tables.
  • C Download the .csv file from Amazon S3 to the RDS D drive by using an AWS msdb stored procedure. Call the custom stored procedure to insert the data from the RDS D drive into the tables.
  • D Create an Amazon S3 event to invoke AWS Step Functions to parse the .csv file and call the custom stored procedure to insert the data into the tables.
Suggested Answer: B
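The suggested answer describes a custom .NET app; purely to illustrate the same pattern of reusing the existing stored procedure, here is a minimal Python sketch with hypothetical bucket, connection-string, and procedure names. It downloads the uploaded CSV from Amazon S3 and then calls the custom stored procedure to load it:

    import boto3
    import pyodbc  # SQL Server client library; any client that can call the procedure works

    # Hypothetical names; the real bucket, key, endpoint, and procedure come from the scenario.
    BUCKET = "example-customer-service-uploads"
    KEY = "incoming/records.csv"
    LOCAL_PATH = "/tmp/records.csv"

    # 1. Pull the uploaded CSV from Amazon S3.
    boto3.client("s3").download_file(BUCKET, KEY, LOCAL_PATH)

    # 2. Reuse the previously written stored procedure to map the CSV fields into the tables.
    conn = pyodbc.connect(
        "DRIVER={ODBC Driver 17 for SQL Server};"
        "SERVER=example-sqlserver.abc123.us-east-1.rds.amazonaws.com,1433;"
        "DATABASE=customers;UID=app_user;PWD=example-password"
    )
    cur = conn.cursor()
    cur.execute("{CALL dbo.usp_load_csv_records (?)}", LOCAL_PATH)
    conn.commit()
    conn.close()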
Question #48 Topic 1

A database specialist is working on an Amazon RDS for PostgreSQL DB instance that is experiencing application performance issues due to the addition of new workloads. The database has 5 TB of storage space with Provisioned IOPS. Amazon CloudWatch metrics show that the average disk queue depth is greater than 200 and that the disk I/O response time is significantly higher than usual. What should the database specialist do to improve the performance of the application immediately?

  • A Increase the Provisioned IOPS rate on the storage.
  • B Increase the available storage space.
  • C Use General Purpose SSD (gp2) storage with burst credits.
  • D Create a read replica to offload Read IOPS from the DB instance.
Suggested Answer: C
NOTE: General Purpose SSD. Reference: https://docs.aws.amazon.com/AmazonRDS/latest/UserGuide/CHAP_Storage.html
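If the storage type is switched as in the suggested answer, a minimal boto3 sketch (with a hypothetical DB instance identifier) would be:

    import boto3

    rds = boto3.client("rds")

    # Hypothetical DB instance identifier; change the storage to the type named in the answer.
    rds.modify_db_instance(
        DBInstanceIdentifier="example-postgres-instance",
        StorageType="gp2",       # General Purpose SSD with burst credits
        ApplyImmediately=True,   # apply the change now rather than at the next maintenance window
    )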
Question #49 Topic 1

A gaming company uses Amazon Aurora Serverless for one of its internal applications. The company's developers use Amazon RDS Data API to work with the Aurora Serverless DB cluster. After a recent security review, the company is mandating security enhancements. A database specialist must ensure that access to RDS Data API is private and never passes through the public internet. What should the database specialist do to meet this requirement?

  • A Modify the Aurora Serverless cluster by selecting a VPC with private subnets.
  • B Modify the Aurora Serverless cluster by unchecking the publicly accessible option.
  • C Create an interface VPC endpoint that uses AWS PrivateLink for RDS Data API.
  • D Create a gateway VPC endpoint for RDS Data API.
Suggested Answer: C
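A minimal boto3 sketch of the interface VPC endpoint from the suggested answer, using hypothetical VPC, subnet, and security group IDs (the service name follows the usual com.amazonaws.<region>.rds-data pattern; confirm it for your Region):

    import boto3

    ec2 = boto3.client("ec2", region_name="us-east-1")

    # Hypothetical VPC, subnet, and security group IDs.
    ec2.create_vpc_endpoint(
        VpcEndpointType="Interface",
        VpcId="vpc-0123456789abcdef0",
        ServiceName="com.amazonaws.us-east-1.rds-data",  # RDS Data API service name
        SubnetIds=["subnet-0aaa1111bbb2222cc"],
        SecurityGroupIds=["sg-0333ddd4444eee555"],
        PrivateDnsEnabled=True,  # lets the default Data API endpoint resolve to the private ENIs
    )

With PrivateDnsEnabled set, SDK and CLI calls to RDS Data API from inside the VPC stay on the AWS network instead of traversing the public internet.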
Question #50 Topic 1

A large IT hardware manufacturing company wants to deploy a MySQL database solution in the AWS Cloud. The solution should quickly create copies of the company's production databases for test purposes. The solution must deploy the test databases in minutes, and the test data should match the latest production data as closely as possible. Developers must also be able to make changes in the test database and delete the instances afterward. Which solution meets these requirements?

  • A Leverage Amazon RDS for MySQL with write-enabled replicas running on Amazon EC2. Create the test copies by taking a mysqldump backup of the RDS for MySQL DB instances and importing it into the new EC2 instances.
  • B Leverage Amazon Aurora MySQL. Use database cloning to create multiple test copies of the production DB clusters.
  • C Leverage Amazon Aurora MySQL. Restore previous production DB instance snapshots into new test copies of Aurora MySQL DB clusters to allow them to make changes.
  • D Leverage Amazon RDS for MySQL. Use database cloning to create multiple developer copies of the production DB instance.
Suggested Answer: C
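If the snapshot-restore approach in the suggested answer is used, a minimal boto3 sketch with hypothetical snapshot, cluster, and instance names might look like the following; the restored cluster is independent of production, so developers can change it and delete it afterward:

    import boto3

    rds = boto3.client("rds")

    # Hypothetical snapshot and identifiers.
    SNAPSHOT_ID = "prod-aurora-mysql-nightly"
    TEST_CLUSTER_ID = "dev-test-aurora-mysql"

    # 1. Restore the latest production snapshot into a new, writable test cluster.
    rds.restore_db_cluster_from_snapshot(
        DBClusterIdentifier=TEST_CLUSTER_ID,
        SnapshotIdentifier=SNAPSHOT_ID,
        Engine="aurora-mysql",
    )

    # 2. Add a DB instance so developers can connect, make changes, and later delete it.
    rds.create_db_instance(
        DBInstanceIdentifier=f"{TEST_CLUSTER_ID}-instance-1",
        DBClusterIdentifier=TEST_CLUSTER_ID,
        Engine="aurora-mysql",
        DBInstanceClass="db.r6g.large",
    )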