AWS Certified Developer Associate Practice Test (DVA-C02)

AWS Certified Developer Associate DVA-C02 Information
AWS Certified Developer - Associate showcases knowledge and understanding of core AWS services, uses, and basic AWS architecture best practices, and proficiency in developing, deploying, and debugging cloud-based applications by using AWS. Preparing for and attaining this certification gives certified individuals more confidence and credibility. Organizations with AWS Certified developers have the assurance of having the right talent to give them a competitive advantage and ensure stakeholder and customer satisfaction.
The AWS Certified Developer - Associate (DVA-C02) exam is intended for individuals who perform a developer role. The exam validates a candidate’s ability to demonstrate proficiency in developing, testing, deploying, and debugging AWS Cloud-based applications. The exam also validates a candidate’s ability to complete the following tasks:
- Develop and optimize applications on AWS.
- Package and deploy by using continuous integration and continuous delivery (CI/CD) workflows.
- Secure application code and data.
- Identify and resolve application issues.
Free AWS Certified Developer Associate DVA-C02 Practice Test
- Questions: 15
- Time: Unlimited
- Included Topics: Development with AWS Services, Security, Deployment, Troubleshooting and Optimization
What is an essential characteristic of a unit test when developing applications?
It should interact with a live database to validate integration points.
It should be able to be run in a production environment to test real user scenarios.
It should cover multiple components and their interactions to ensure integration.
It should be isolated, testing a single component without external dependencies.
Answer Description
A unit test should be isolated, meaning it must test a single component (a 'unit') of the software without dependencies on external resources or the state of other units. Isolation allows for more predictable and faster tests which can pinpoint errors directly within the tested unit. Tests that rely on external state or resources might be integration tests or functional tests but are not considered 'unit' tests.
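As a sketch of this principle, the hypothetical `apply_discount` function below is unit tested in isolation, with its pricing-service dependency replaced by a mock (all names here are illustrative, not from the question):

```python
import unittest
from unittest.mock import Mock

def apply_discount(pricing_service, item_id: str, percent: float) -> float:
    """Return the discounted price for an item.

    The price lookup is delegated to an injected pricing service, so the
    function can be tested without any real network or database call.
    """
    base_price = pricing_service.get_price(item_id)
    return round(base_price * (1 - percent / 100), 2)

class ApplyDiscountTest(unittest.TestCase):
    def test_ten_percent_discount(self):
        # The external dependency is replaced by a mock, keeping the test
        # isolated, fast, and deterministic.
        pricing = Mock()
        pricing.get_price.return_value = 100.0
        self.assertEqual(apply_discount(pricing, "sku-1", 10), 90.0)
        pricing.get_price.assert_called_once_with("sku-1")
```

Run with `python -m unittest`; because no database or network is involved, a failure points directly at the unit under test.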
Ask Bash
Bash is our AI bot, trained to help you pass your exam. AI Generated Content may display inaccurate information, always double-check anything important.
What are external dependencies in unit testing?
What tools can be used to create isolated unit tests?
How do unit tests differ from integration tests?
Your system is responsible for storing a series of user-generated events, each characterized by a unique identifier, the time of occurrence, and the type of event. Given the need to prevent disproportionate use of any single data partition for storage and retrieval operations, which scheme should you adopt for the partition key assignment?
Assign a unique identifier as the sole partition key
Designate the event type as the partition key
Combine the unique identifier with the occurrence time to formulate a partition key
Use the occurrence time as the sole partition key
Answer Description
A partition key scheme that combines a unique identifier with the time of occurrence provides high cardinality, ensuring that data entries are distributed uniformly across the database's partitions. This reduces the risk of overloaded partitions, known as 'hot partitions', which occur when a single partition key value is accessed far more frequently than others, as can happen when a heavily accessed value such as a standalone identifier or the time field alone serves as the key.
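A minimal sketch of such a composite key, assuming an `evt-`-style identifier and ISO-8601 timestamps (both illustrative, not from the question):

```python
from datetime import datetime, timezone

def build_partition_key(event_id: str, occurred_at: datetime) -> str:
    # Combining the unique identifier with the occurrence time yields a
    # high-cardinality value, spreading writes evenly across partitions.
    return f"{event_id}#{occurred_at.strftime('%Y-%m-%dT%H:%M:%SZ')}"

# Example item as it might be written with the low-level DynamoDB API
# (attribute names are illustrative):
event_time = datetime(2024, 5, 1, 12, 30, 0, tzinfo=timezone.utc)
item = {
    "pk": {"S": build_partition_key("evt-42", event_time)},
    "event_type": {"S": "login"},
}
```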
Ask Bash
Bash is our AI bot, trained to help you pass your exam. AI Generated Content may display inaccurate information, always double-check anything important.
What does 'high cardinality' mean in the context of partition keys?
What are 'hot partitions' and why are they problematic?
Why is combining a unique identifier with occurrence time effective for partition key assignment?
A developer is implementing an application that requires frequent retrieval of items from an Amazon DynamoDB table. To optimize performance, the application needs to minimize latency and reduce the number of network calls. Given the need for efficient data access patterns, which method should the developer use when implementing code that interacts with the DynamoDB table using the AWS SDK?
Perform individual GetItem operations for each item.
Utilize BatchGetItem for batch retrieval of items.
Use PutItem calls with a filter to only insert the desired items.
Employ a Scan operation to fetch all table items and filter as needed.
Answer Description
Using the batch operations of the AWS SDK, such as BatchGetItem for Amazon DynamoDB, allows the application to retrieve up to 100 items from one or more DynamoDB tables in a single operation. This reduces the number of network calls compared with individual GetItem requests and therefore lowers network latency. Query and Scan would not be as efficient because they are designed for different purposes: Query retrieves items using a key condition expression, and Scan reads every item in a table, which is inefficient for frequently accessed individual items. PutItem is used for data insertion, not retrieval, so it is not suitable for this scenario.
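A sketch of how such a batch request could be assembled, assuming a table whose primary key attribute is named `id` (an illustrative choice, not from the question):

```python
def build_batch_get_request(table_name: str, ids: list[str]) -> dict:
    """Build a BatchGetItem request body for up to 100 keys.

    BatchGetItem accepts at most 100 items per call, so callers with
    more keys must chunk their id list.
    """
    if len(ids) > 100:
        raise ValueError("BatchGetItem accepts at most 100 keys per call")
    return {
        "RequestItems": {
            table_name: {"Keys": [{"id": {"S": item_id}} for item_id in ids]}
        }
    }

# With boto3 the request would be issued roughly like this (not executed here;
# the table name is illustrative):
#   import boto3
#   client = boto3.client("dynamodb")
#   response = client.batch_get_item(**build_batch_get_request("Orders", ids))
#   items = response["Responses"]["Orders"]
```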
Ask Bash
What is `BatchGetItem` and how does it work in DynamoDB?
What are the differences between `GetItem`, `Query`, and `Scan` in DynamoDB?
Why is `Scan` not recommended for frequently accessed individual items?
A developer is required to manage the lifecycle of objects in an Amazon S3 bucket used for data analysis. The objects are frequently accessed for the first 30 days, occasionally accessed for the next 60 days, and rarely accessed thereafter. Which lifecycle configuration should the developer apply to optimize cost without compromising data availability?
Transition to S3 Standard-Infrequent Access after 30 days and to S3 Glacier after 90 days.
Transition to S3 One Zone-Infrequent Access after 30 days and delete objects after 90 days.
Leave the objects in S3 Standard for unlimited time to ensure rapid access.
Immediately store the objects in S3 Glacier upon creation.
Answer Description
The developer should transition the objects to S3 Standard-Infrequent Access after 30 days because it is designed for data that is accessed less frequently but requires rapid access when needed. After 90 days, the objects should be moved to Amazon S3 Glacier or S3 Glacier Deep Archive, which are suitable for archiving data that is rarely accessed. The other answers are incorrect because they do not appropriately align with the change in access patterns described, or they incur higher costs without providing the necessary access speed.
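The transition schedule described above could be expressed as a lifecycle configuration along these lines (a sketch; the rule ID and bucket name are illustrative):

```python
# Lifecycle rule matching the access pattern in the question:
# S3 Standard for 30 days, Standard-IA for the next 60, Glacier thereafter.
lifecycle_configuration = {
    "Rules": [
        {
            "ID": "analytics-tiering",      # illustrative rule name
            "Status": "Enabled",
            "Filter": {"Prefix": ""},       # apply to the whole bucket
            "Transitions": [
                {"Days": 30, "StorageClass": "STANDARD_IA"},
                {"Days": 90, "StorageClass": "GLACIER"},
            ],
        }
    ]
}

# With boto3 this would be applied roughly as (not executed here):
#   import boto3
#   boto3.client("s3").put_bucket_lifecycle_configuration(
#       Bucket="my-analytics-bucket",
#       LifecycleConfiguration=lifecycle_configuration,
#   )
```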
Ask Bash
What is Amazon S3 and how does it work?
What are the different Amazon S3 storage classes?
What is lifecycle configuration in Amazon S3?
Automated testing within AWS CI/CD cannot include unit tests that are executed as part of the build stage in AWS CodePipeline.
True
False
Answer Description
This statement is false because AWS CI/CD, specifically AWS CodePipeline, can include a build stage where various types of automated tests, such as unit tests and integration tests, can be executed. AWS services such as AWS CodeBuild can be used within a pipeline to run tests, ensuring that every code change is validated before it is deployed.
Ask Bash
What is AWS CodePipeline?
What are unit tests and how are they used in AWS CI/CD?
What is the role of AWS CodeBuild in the CI/CD process?
Which AWS service can trigger a Lambda function in response to changes in data within a database table?
Amazon CloudWatch
Amazon S3
Amazon DynamoDB Streams
Amazon Simple Queue Service (SQS)
Answer Description
AWS Lambda executes code in response to triggers from AWS services such as Amazon DynamoDB. A DynamoDB stream invokes a Lambda function when data changes in a DynamoDB table, enabling real-time processing of updates. Amazon S3 is used for storage and can trigger Lambda on object-level operations, but not on changes within a database table. Amazon SQS is a message queuing service that can also trigger Lambda functions, but it does not monitor changes within a database table. Amazon CloudWatch monitors and manages AWS resources and applications, but it does not capture item-level changes in a database table the way DynamoDB Streams does.
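A minimal sketch of a stream-triggered handler, assuming items carry a string attribute named `id` (illustrative, not from the question):

```python
def handler(event, context):
    """Minimal sketch of a Lambda handler for DynamoDB Streams.

    Each record carries the event name (INSERT, MODIFY, REMOVE) and,
    depending on the stream view type, the new and/or old item images.
    """
    processed = []
    for record in event.get("Records", []):
        event_name = record["eventName"]
        new_image = record.get("dynamodb", {}).get("NewImage", {})
        # Attribute values arrive in DynamoDB JSON form, e.g. {"S": "abc"}.
        item_id = new_image.get("id", {}).get("S")
        processed.append((event_name, item_id))
    return {"processed": len(processed), "changes": processed}
```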
Ask Bash
What are DynamoDB Streams and how do they work?
How does AWS Lambda integrate with DynamoDB Streams?
What are the limitations of using DynamoDB Streams?
A development team is building a social media analytics platform that rapidly processes and analyzes streaming data from various sources. The application must dynamically adapt to unpredictable schemas and requires immediate consistency for write operations due to the interdependent nature of the data. Considering the need for auto-scaling capabilities and high-throughput performance, which Amazon database service should the development team leverage for their data tier?
Amazon Keyspaces (for Apache Cassandra)
Amazon DynamoDB
Amazon Timestream
Amazon Aurora
Answer Description
Amazon DynamoDB is suitable for applications requiring schema flexibility, immediate consistency, and high-throughput performance. DynamoDB's ability to auto-scale and handle unpredictable workloads matches the application requirements specified in the scenario. Amazon Keyspaces offers Apache Cassandra-compatible features, but it does not inherently guarantee immediate consistency. Amazon Aurora is a relational database and might struggle with schema flexibility and the unpredictable workloads implied in the scenario. Amazon Timestream is built for time-series data and might not be the best fit when immediate consistency for write operations is a primary requirement.
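As a sketch of the 'immediate consistency' point: DynamoDB reads are eventually consistent by default, and a strongly consistent read is requested per call via the ConsistentRead flag (the key attribute name below is illustrative):

```python
def build_consistent_get(table_name: str, key_value: str) -> dict:
    """Parameters for a strongly consistent GetItem call.

    DynamoDB reads are eventually consistent by default; setting
    ConsistentRead to True requests a strongly consistent read instead.
    """
    return {
        "TableName": table_name,
        "Key": {"id": {"S": key_value}},  # 'id' is an illustrative key name
        "ConsistentRead": True,
    }

# With boto3 (not executed here):
#   import boto3
#   boto3.client("dynamodb").get_item(**build_consistent_get("Events", "evt-1"))
```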
Ask Bash
What are the benefits of using Amazon DynamoDB for our application?
What does immediate consistency mean in the context of database operations?
How does DynamoDB's auto-scaling feature work?
When developing serverless applications, it is possible to execute and debug your code on your local machine, simulating the cloud environment, without having to upload your functions to the actual cloud service each time during testing.
The statement is false; the framework requires all tests to be performed directly in the cloud and does not support local testing.
The statement is true; the framework allows for local testing and debugging which simulates the cloud environment.
Answer Description
The framework for building serverless applications provides a local environment that mimics the cloud execution environment. This feature allows developers to iterate quickly by executing and debugging their code on local machines, saving time and resources by not requiring code to be deployed to the cloud service for initial testing phases. This is particularly useful for developing, testing, and debugging Lambda functions.
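At its simplest, local testing works because a Lambda handler is ordinary code that can be called directly with a hand-built event; tools such as the AWS SAM CLI automate and extend this idea (the handler below is illustrative):

```python
def handler(event, context):
    # A Lambda handler is plain code; calling it directly with a crafted
    # event reproduces, at the simplest level, what local-testing tools
    # like `sam local invoke` automate with a simulated cloud environment.
    return {"statusCode": 200, "body": f"Hello, {event.get('name', 'world')}"}

# Local invocation with a hand-built test event, no deployment needed:
result = handler({"name": "dev"}, None)
```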
Ask Bash
What tools can I use for local testing of serverless applications?
What are Lambda functions and how do they fit into serverless architecture?
What are the benefits of local testing versus cloud testing in serverless application development?
Which AWS service provides a managed in-memory data store that is compatible with Redis or Memcached?
Amazon S3
Amazon ElastiCache
Amazon DynamoDB
Amazon RDS
Answer Description
Amazon ElastiCache is a managed service that provides in-memory caching capabilities that are compatible with Redis or Memcached. Understanding the purpose and capabilities of Amazon ElastiCache is key for developers who need to implement caching strategies in order to reduce database load, increase throughput, and improve latency in their cloud-based applications.
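The cache-aside (lazy loading) strategy commonly used with ElastiCache can be sketched as follows; a plain dict stands in for the Redis node so the example stays runnable locally:

```python
import time

class CacheAside:
    """Cache-aside (lazy loading) sketch.

    A plain dict stands in for an ElastiCache for Redis node here; in
    production the get/set calls would go to a Redis client instead.
    """

    def __init__(self, loader, ttl_seconds=300):
        self._loader = loader   # e.g. a database query function
        self._ttl = ttl_seconds
        self._store = {}        # key -> (expires_at, value)

    def get(self, key):
        entry = self._store.get(key)
        if entry and entry[0] > time.time():
            return entry[1]                           # cache hit
        value = self._loader(key)                     # cache miss: query the database
        self._store[key] = (time.time() + self._ttl, value)
        return value
```

With ElastiCache for Redis, the dict would be replaced by Redis GET/SETEX calls, but the hit/miss logic that reduces database load is the same.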
Ask Bash
What is Amazon ElastiCache and how does it work?
What are the benefits of using a managed service like ElastiCache?
How do Redis and Memcached differ, and why would I choose one over the other in ElastiCache?
A developer must provision an external service's credentials to an application hosted on a virtual server without embedding them directly into the codebase or configuration files. Which service should the developer implement to securely manage and inject these credentials at runtime?
Permission assignment service
Encrypted object storage service
Version-controlled code repository
Parameter store without additional encryption
Dedicated secrets management service
Environment variables with plain text values
Answer Description
The correct service for securely managing and injecting credentials at runtime is a dedicated secrets management service. It allows for the encryption, storage, and on-demand retrieval of sensitive configuration details such as database credentials. Specifically, a secrets management service provided by a major cloud provider would enable these credentials to be fetched dynamically by the application without being stored insecurely in the environment or within the application’s source code. Services intended for key management or permissions management are not suited for the dynamic provision of secret material to applications.
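A sketch of the retrieval flow, assuming the secret is stored as a JSON document (the secret name in the comment is illustrative):

```python
import json

def parse_secret(secret_string: str) -> dict:
    """Parse a SecretString payload returned by a secrets manager.

    AWS Secrets Manager commonly stores credentials as a JSON document,
    e.g. '{"username": "app", "password": "..."}'.
    """
    return json.loads(secret_string)

# With boto3 the secret would be fetched at runtime roughly like this
# (not executed here; the secret name is illustrative):
#   import boto3
#   client = boto3.client("secretsmanager")
#   resp = client.get_secret_value(SecretId="prod/app/db")
#   creds = parse_secret(resp["SecretString"])
```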
Ask Bash
What is a dedicated secrets management service?
How does a secrets management service handle encryption?
Why should I avoid using environment variables for storing sensitive credentials?
A development team is working on a new feature in a project hosted in AWS CodeCommit. They want to ensure that any changes pushed to the master branch have been reviewed and approved by at least two team members before being merged. Which feature in CodeCommit can they utilize to enforce this requirement?
Implementing stage locking on the master branch
Enabling branch protections for the master branch
Requiring a pre-signed commit policy on the master branch
Configuring a pull request approval rule in CodeCommit for the master branch
Answer Description
Pull request approvals are a feature in AWS CodeCommit that teams can use to enforce code review policies. By requiring a certain number of approvals before changes can be merged into a particular branch, like the master branch, teams can ensure that multiple team members have reviewed the changes. Branch protections in other services, such as GitHub, may offer similar functionality but are not the correct answer here because the scenario specifically involves AWS CodeCommit. The pre-signed commit policy is not applicable here, as it is not a feature of AWS CodeCommit. Lastly, the concept of 'stage locking' does not directly apply to Git-based version control systems and thus is incorrect in this context.
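An approval rule requiring two approvals can be expressed as a rule-content document along these lines (a sketch; consult the CodeCommit documentation for the exact schema, and note the destination reference follows the scenario's master branch):

```python
# Approval rule content requiring two approvals before a pull request
# targeting master can be merged.
approval_rule_content = {
    "Version": "2018-11-08",
    "DestinationReferences": ["refs/heads/master"],
    "Statements": [
        {
            "Type": "Approvers",
            "NumberOfApprovalsNeeded": 2,
        }
    ],
}
```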
Ask Bash
What are pull request approval rules in CodeCommit?
How do I configure pull request approval rules in CodeCommit?
What is branch protection, and how does it differ from pull request approval rules?
You are in charge of deploying an application that must access a database using specific credentials. The deployment requires setting environment variables that the application will utilize at runtime. How should you securely store and supply these database access details to the application?
Encrypt the database credentials and include them in the versioned application configuration, decrypting them with a stored key when the application starts.
Use a configuration management service with KMS to store the database credentials and control access through roles.
Adopt a managed secrets management service to handle the database credentials and dynamically provide them to the application when needed.
Implement environment variables in the application source code with encryption logic that decrypts these values on initialization.
Answer Description
Utilizing a managed service for secrets management to store credentials and securely retrieving them during application execution is the optimal solution. This approach avoids hardcoding sensitive information and leverages automatic rotation of secrets, as well as fine-grained permissions for access control. Encrypting environment variables within the application code provides some level of security but lacks the management features and automated rotations offered by a dedicated secrets management service. Storing encrypted credentials in the deployment configuration introduces the additional burden of managing encryption keys and does not offer the same level of access control or rotation capabilities. While parameter stores provide secure storage for configuration data, a service specifically designed for secrets management will include enhanced features suitable for managing sensitive credentials, such as automated secret rotation, which is particularly important for database credentials.
Ask Bash
What is a managed secrets management service?
Why is avoiding hardcoding credentials important?
What are the benefits of automated secret rotation?
Which policy type allows you to specify access permissions directly on an AWS resource such as an S3 bucket or an RDS instance?
Resource-based policy
Managed policy
Principal policy
Service policy
Answer Description
Resource-based policies are attached directly to the resources and specify who has what permissions for that resource. Service and principal policies do not attach directly to resources. Instead, service policies dictate the permissions that the service has, while principal policies are attached to IAM users, groups, and roles to specify their permissions.
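A minimal example of a resource-based policy, here an S3 bucket policy; note the Principal element, which identity-based policies do not carry (the account ID and bucket name are placeholders):

```python
# A minimal S3 bucket policy (a resource-based policy) granting another
# account read access to the bucket's objects.
bucket_policy = {
    "Version": "2012-10-17",
    "Statement": [
        {
            "Sid": "AllowCrossAccountRead",
            "Effect": "Allow",
            # The Principal element names WHO gets access -- the defining
            # feature of a resource-based policy.
            "Principal": {"AWS": "arn:aws:iam::111122223333:root"},
            "Action": ["s3:GetObject"],
            "Resource": "arn:aws:s3:::example-bucket/*",
        }
    ],
}
```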
Ask Bash
What is a resource-based policy in AWS?
Can you give an example of when to use a resource-based policy?
What are the main advantages of using resource-based policies?
Your development team needs access to review configurations from a specific storage bucket, 'dev-configurations', but as per the company protocol, all permission policies must be centrally controlled by the security administrators. How would you proceed to grant the appropriate review privileges to your team without overstepping the bounds of the prescribed protocol?
Utilize a pre-existing general access policy from AWS and apply it to the development team's roles.
Alter the permission boundaries for your developers to incorporate the read privileges for the bucket in question.
Contact the security administrators to provision a tailored policy with appropriate permissions for your team.
Configure a new access policy directly for the users to review the specific bucket contents.
Answer Description
Since developers are not authorized to create or manage permission policies, the best action is to request the security team to craft a custom policy with the necessary permissions and subsequently associate this policy with the appropriate team members. Leveraging existing managed policies or embedding permissions directly would bypass the internal control processes and undermine the organizational rules.
Ask Bash
What is an access policy in AWS?
What are permission boundaries in AWS?
What role do security administrators play within AWS permissions management?
Your team is implementing a continuous delivery pipeline for a new application to ensure rigorous quality standards are met. To achieve compliance, they must insert a control point that requires a specific reviewer's sign-off before the application is released to the live environment. Which feature should you use to integrate this requirement within your deployment pipeline?
Define a stage timeout in the pipeline configuration to allow enough time for a manual review.
Insert a Manual Approval action within the appropriate stage of your delivery pipeline.
Create prerequisite conditions for progression that halt the process until a team member resolves them.
Incorporate a custom Lambda function, triggered to solicit a review from the necessary personnel.
Answer Description
A Manual Approval action is specifically designed for scenarios where human intervention is required in the pipeline. This gatekeeping step necessitates a designated reviewer's explicit approval, ensuring an additional layer of scrutiny is applied before the application reaches the live environment. Conditions and stage timeouts do not fulfill the requirement for a reviewer's conscious decision, as they do not specifically call for a review. While Lambda can introduce automation and logic into pipelines, it does not cater to the need for manual review and should be used for other automated tasks within the pipeline.
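A Manual Approval action appears in a CodePipeline stage definition roughly as follows (a sketch; the action name is illustrative):

```python
# Manual Approval action declaration for a pipeline stage. The pipeline
# pauses at this action until the designated reviewer approves or rejects.
approval_action = {
    "Name": "ReviewerSignOff",   # illustrative action name
    "ActionTypeId": {
        "Category": "Approval",
        "Owner": "AWS",
        "Provider": "Manual",
        "Version": "1",
    },
    "RunOrder": 1,
}
```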
Ask Bash
What does a Manual Approval action entail in a deployment pipeline?
How do you set up a Manual Approval action in AWS CodePipeline?
Why can't conditions or timeouts replace Manual Approval actions?