A global retail chain has experienced partial corruption in its transaction records when the data is streamed to a backup site. Investigations show that disruptive noise bursts in the environment are altering the information mid-transfer. Which measure would best preserve the accuracy of the logs despite the ongoing disruptions?
Use data parity or hashing with continuous verification of received data
Disable encryption measures to reduce slowdown incidents
Increase packet size to streamline transmission sessions
Reduce the broadcast interval to curtail collisions
Applying data parity or hashing with continuous validation allows the receiving site to confirm that transmissions were not altered by disruptive bursts, preserving the reliability of the logs for audits. Disabling encryption does nothing to validate the transmitted data and weakens security, reducing the broadcast interval addresses collisions and timing rather than integrity, and increasing packet size alone does not guarantee that the information arrives intact.
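To illustrate the correct answer, here is a minimal sketch of hash-based integrity verification, using Python's standard `hashlib`. The sender computes a SHA-256 digest of each record before transmission; the receiver recomputes the digest over what it actually received and compares. Any alteration mid-transfer, even a single flipped byte from a noise burst, produces a mismatch. The record contents and function names are illustrative, not from the question.

```python
import hashlib

def sha256_digest(data: bytes) -> str:
    # Sender side: compute a SHA-256 digest of the payload before transmission.
    return hashlib.sha256(data).hexdigest()

def verify(received: bytes, expected_digest: str) -> bool:
    # Receiver side: recompute the digest over the received bytes and compare.
    return hashlib.sha256(received).hexdigest() == expected_digest

record = b"txn:1001,store:42,amount:19.99"
digest = sha256_digest(record)

# An intact transfer passes verification.
print(verify(record, digest))      # True

# A single altered byte (simulating a disruptive burst) fails verification.
corrupted = b"txn:1001,store:42,amount:19.98"
print(verify(corrupted, digest))   # False
```

In a real deployment the digest travels alongside (or ahead of) each record, and the backup site re-requests any record whose digest does not match, which is the "continuous verification" the answer refers to.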