A DevOps engineer uses a code repository with automated pipelines. The engineer wants every new code submission to trigger a build that stops if tests fail, with the logs retained for later review. Which approach achieves this goal?
Add a pipeline stage that runs the test suite, blocks progression when a test fails, and preserves the test output within the pipeline
Set up a weekly job that executes builds and deletes log data after confirming success
Remove testing from the pipeline and rely on external tools to scrutinize the compiled code
Rely on local checks by each contributor, then manually upload logs after changes are merged
Adding a dedicated stage that runs the tests and blocks further tasks when something fails ensures errors are caught before code is merged, and preserving the logs lets the team diagnose issues accurately. Scheduled jobs or manual steps lose the immediacy of triggering on every submission, and removing the test stage forfeits automated oversight, making problems harder to detect.
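As a rough illustration of that behavior, the script below sketches what a test stage might invoke: it runs the test suite, saves the output as a log artifact, and exits with a non-zero status so the CI server halts later stages. The pytest command and the artifacts/test-results.log path are assumptions for the example, not part of any specific tool's configuration.

    import subprocess
    import sys
    from pathlib import Path

    # Hypothetical artifact path the pipeline is configured to retain after the run.
    LOG_FILE = Path("artifacts/test-results.log")

    def run_test_stage() -> int:
        """Run the test suite, capture its output for later review, and
        return its exit code so the pipeline can block on failure."""
        LOG_FILE.parent.mkdir(parents=True, exist_ok=True)
        # Run the project's test command; pytest is assumed here for illustration.
        result = subprocess.run(
            ["pytest", "-q"],
            capture_output=True,
            text=True,
        )
        # Preserve stdout and stderr as a log artifact regardless of the outcome.
        LOG_FILE.write_text(result.stdout + result.stderr)
        return result.returncode

    if __name__ == "__main__":
        # A non-zero exit status tells the CI server to stop the pipeline
        # before any merge or deploy stage runs.
        sys.exit(run_test_stage())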