AWS Certified Database Specialty (DBS-C01)

The AWS Certified Database Specialty (DBS-C01) questions were last updated today.
Disclaimers:
  • The ExamTopics website is not related to, affiliated with, endorsed, or authorized by Amazon.
  • Trademarks, certification, and product names are used for reference only and belong to Amazon.

Topic 1 - Exam A

Question #26 Topic 1

A company with 500,000 employees needs to supply its employee list to an application used by human resources. Every 30 minutes, the data is exported using the LDAP service to load into a new Amazon DynamoDB table. The data model has a base table with Employee ID for the partition key and a global secondary index with Organization ID as the partition key. While importing the data, a database specialist receives ProvisionedThroughputExceededException errors. After increasing the provisioned write capacity units (WCUs) to 50,000, the specialist receives the same errors. Amazon CloudWatch metrics show a consumption of 1,500 WCUs. What should the database specialist do to address the issue?

  • A Change the data model to avoid hot partitions in the global secondary index.
  • B Enable auto scaling for the table to automatically increase write capacity during bulk imports.
  • C Modify the table to use on-demand capacity instead of provisioned capacity.
  • D Increase the number of retries on the bulk loading application.
Suggested Answer: A
NOTE: The table has 50,000 WCUs provisioned, but CloudWatch shows only about 1,500 WCUs consumed, so aggregate capacity is not the bottleneck and auto scaling (which reacts to aggregate consumption) would not help. The throttling comes from a hot partition in the global secondary index: during the bulk import, many employees share the same Organization ID, so writes concentrate on a few GSI partitions. The data model must be changed (for example, by write-sharding the GSI partition key) to spread the writes. Reference: https://docs.aws.amazon.com/amazondynamodb/latest/developerguide/bp-partition-key-design.html
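As a rough illustration of option A, the sketch below (hypothetical table and index names, assuming boto3) write-shards the GSI partition key by appending a random suffix to the Organization ID, so a bulk import for a large organization is spread across several GSI partitions instead of one.

```python
import random

import boto3
from boto3.dynamodb.conditions import Key

dynamodb = boto3.resource("dynamodb")
table = dynamodb.Table("Employees")  # hypothetical table name

GSI_SHARDS = 10  # number of write shards per organization (illustrative value)


def put_employee(employee_id, organization_id, attributes):
    """Write one employee item with a sharded GSI partition key.

    The GSI key becomes e.g. 'ORG123#7', so writes for one organization are
    spread over GSI_SHARDS partitions of the index instead of a single one.
    """
    item = {
        "EmployeeID": employee_id,  # base table partition key
        "OrganizationID": f"{organization_id}#{random.randrange(GSI_SHARDS)}",
        **attributes,
    }
    table.put_item(Item=item)


def employees_in_organization(organization_id):
    """Read side: query every shard of the organization and merge the results."""
    items = []
    for shard in range(GSI_SHARDS):
        resp = table.query(
            IndexName="OrganizationIndex",  # hypothetical GSI name
            KeyConditionExpression=Key("OrganizationID").eq(f"{organization_id}#{shard}"),
        )
        items.extend(resp["Items"])
    return items
```
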
Question #27 Topic 1

A company plans to use AWS Database Migration Service (AWS DMS) to migrate its database from one Amazon EC2 instance to another EC2 instance as a full load task. The company wants the database to be inactive during the migration. The company will use a dms.t3.medium instance to perform the migration and will use the default settings for the migration. Which solution will MOST improve the performance of the data migration?

  • A Increase the number of tables that are loaded in parallel.
  • B Drop all indexes on the source tables.
  • C Change the processing mode from the batch optimized apply option to transactional mode.
  • D Enable Multi-AZ on the target database while the full load task is in progress.
Suggested Answer: A
NOTE: This is a full load only task against an inactive database, so change-processing options such as batch optimized apply versus transactional apply (which control how ongoing changes are applied) do not affect it. The main lever for full load performance is parallelism: by default, AWS DMS loads eight tables in parallel, and increasing that number (the MaxFullLoadSubTasks task setting) shortens the load as long as the replication instance has the resources. Indexes are best dropped on the target, not the source, and enabling Multi-AZ on the target slows a full load rather than speeding it up. Reference: https://docs.aws.amazon.com/dms/latest/userguide/CHAP_BestPractices.html
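As a sketch of option A (assuming boto3; the task ARN and the parallelism value are placeholders), the snippet below raises the number of tables AWS DMS loads in parallel through the MaxFullLoadSubTasks task setting.

```python
import json

import boto3

dms = boto3.client("dms")

TASK_ARN = "arn:aws:dms:us-east-1:123456789012:task:EXAMPLE"  # placeholder ARN

# Raise the number of tables loaded in parallel above the default of 8.
# The task must be stopped before its settings can be modified, and the
# replication instance needs enough CPU and memory for the extra threads,
# so the value below is only an example.
settings = {"FullLoadSettings": {"MaxFullLoadSubTasks": 16}}

dms.modify_replication_task(
    ReplicationTaskArn=TASK_ARN,
    ReplicationTaskSettings=json.dumps(settings),
)
```
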
Question #28 Topic 1

An ecommerce company uses a backend application that stores data in an Amazon DynamoDB table. The backend application runs in a private subnet in a VPC and must connect to this table. The company must minimize any network latency that results from network connectivity issues, even during periods of heavy application usage. A database administrator also needs the ability to use a private connection to connect to the DynamoDB table from the application. Which solution will meet these requirements?

  • A Use network ACLs to ensure that any outgoing or incoming connections to any port except DynamoDB are deactivated. Encrypt API calls by using TLS.
  • B Create a VPC endpoint for DynamoDB in the application's VPC. Use the VPC endpoint to access the table.
  • C Create an AWS Lambda function that has access to DynamoDB. Restrict outgoing access only to this Lambda function from the application.
  • D Use a VPN to route all communication to DynamoDB through the company's own corporate network infrastructure.
Suggested Answer: B
NOTE: A gateway VPC endpoint for DynamoDB gives the application in the private subnet a private connection to the table that stays on the AWS network, with no internet gateway, NAT device, or VPN required. Keeping traffic inside AWS avoids the latency and reliability issues of routing through the corporate network or the public internet. Reference: https://docs.aws.amazon.com/amazondynamodb/latest/developerguide/vpc-endpoints-dynamodb.html
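A minimal sketch of option B (assuming boto3; the Region, VPC ID, and route table ID are placeholders) creates a gateway VPC endpoint for DynamoDB and attaches it to the private subnet's route table.

```python
import boto3

ec2 = boto3.client("ec2", region_name="us-east-1")  # example Region

# A gateway endpoint keeps DynamoDB traffic on the AWS network; associating it
# with the private subnet's route table adds the required route automatically.
response = ec2.create_vpc_endpoint(
    VpcEndpointType="Gateway",
    VpcId="vpc-0123456789abcdef0",            # placeholder VPC ID
    ServiceName="com.amazonaws.us-east-1.dynamodb",
    RouteTableIds=["rtb-0123456789abcdef0"],  # placeholder route table ID
)
print(response["VpcEndpoint"]["VpcEndpointId"])
```
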
Question #29 Topic 1

A software company uses an Amazon RDS for MySQL Multi-AZ DB instance as a data store for its critical applications. During an application upgrade process, a database specialist runs a custom SQL script that accidentally removes some of the default permissions of the master user. What is the MOST operationally efficient way to restore the default permissions of the master user?

  • A Modify the DB instance and set a new master user password.
  • B Use AWS Secrets Manager to modify the master user password and restart the DB instance.
  • C Create a new master user for the DB instance.
  • D Review the IAM user that owns the DB instance, and add missing permissions.
Suggested Answer: A
NOTE: If you accidentally delete the permissions for the master user, you can restore them by modifying the DB instance and setting a new master user password. Reference: https://docs.aws.amazon.com/AmazonRDS/latest/UserGuide/UsingWithRDS.MasterAccounts.html
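A minimal sketch of option A (assuming boto3; the instance identifier and password are placeholders) resets the master user password by modifying the DB instance, which also restores the master user's default privileges.

```python
import boto3

rds = boto3.client("rds")

# Setting a new master user password on the instance restores the master
# user's default permissions; ApplyImmediately avoids waiting for the next
# maintenance window.
rds.modify_db_instance(
    DBInstanceIdentifier="critical-mysql-db",      # placeholder identifier
    MasterUserPassword="REPLACE_WITH_NEW_SECRET",  # placeholder password
    ApplyImmediately=True,
)
```
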
Question #30 Topic 1

A company has an AWS CloudFormation stack that defines an Amazon RDS DB instance. The company accidentally deletes the stack and loses recent data from the DB instance. A database specialist must change the CloudFormation template for the RDS resource to reduce the chance of accidental data loss from the DB instance in the future. Which combination of actions should the database specialist take to meet this requirement? (Choose three.)

  • A Set the DeletionProtection property to True.
  • B Set the MultiAZ property to True.
  • C Set the TerminationProtection property to True.
  • D Set the DeleteAutomatedBackups property to False.
  • E Set the DeletionPolicy attribute to No.
  • F Set the DeletionPolicy attribute to Retain.
Suggested Answer: ADF
NOTE: On the AWS::RDS::DBInstance resource, set the DeletionProtection property to true so the instance cannot be deleted while protection is enabled, set DeleteAutomatedBackups to false so automated backups are kept if the instance is deleted, and set the DeletionPolicy attribute to Retain so CloudFormation leaves the instance in place when the stack is deleted. Valid DeletionPolicy values are Delete, Retain, and Snapshot (there is no No value), termination protection applies to the CloudFormation stack itself rather than to the DB instance resource, and Multi-AZ improves availability but does not prevent accidental data loss. Reference: https://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/aws-attribute-deletionpolicy.html
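To make the three settings concrete, the sketch below assembles a minimal CloudFormation resource definition as a Python dictionary (the engine, instance class, and credential values are placeholders, not taken from the question) with DeletionPolicy set to Retain, DeletionProtection enabled, and DeleteAutomatedBackups disabled.

```python
import json

# Minimal AWS::RDS::DBInstance definition showing the three protections:
#   DeletionPolicy: Retain        -> CloudFormation keeps the instance when the stack is deleted
#   DeletionProtection: true      -> the instance cannot be deleted while enabled
#   DeleteAutomatedBackups: false -> automated backups survive instance deletion
template = {
    "AWSTemplateFormatVersion": "2010-09-09",
    "Resources": {
        "Database": {
            "Type": "AWS::RDS::DBInstance",
            "DeletionPolicy": "Retain",
            "Properties": {
                "Engine": "mysql",                  # placeholder engine
                "DBInstanceClass": "db.t3.medium",  # placeholder instance class
                "AllocatedStorage": "20",
                "MasterUsername": "admin",          # placeholder credentials
                "MasterUserPassword": "{{resolve:ssm-secure:/example/db/password:1}}",
                "DeletionProtection": True,
                "DeleteAutomatedBackups": False,
            },
        }
    },
}

print(json.dumps(template, indent=2))
```
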