CSA CCM DSP-10
Sensitive Data Transfer

Ensuring the privacy and security of sensitive data is critical, especially when that data needs to be transferred. The Cloud Security Alliance (CSA) Cloud Controls Matrix (CCM) specifies controls organizations should implement to protect personal and sensitive data from unauthorized access during transfer. This article explores Control DSP-10, which focuses on defining processes and deploying technical measures to safeguard data in transit.

Where did this come from?

This control comes from the CSA Cloud Controls Matrix v4.0.10, released on 2023-09-26. You can download the full CCM document from the Cloud Security Alliance website.

The CCM provides a controls framework for cloud computing that is aligned to other industry-accepted security standards, regulations, and control frameworks such as ISO/IEC 27001, HIPAA, AICPA TSC, PCI DSS, and NIST SP 800-53. Implementing the CCM controls helps organizations manage cybersecurity risks associated with cloud services.

Who should care?

Several roles with different needs should pay attention to Control DSP-10:

  • Chief Information Security Officers (CISOs) responsible for managing information security risks
  • Data Protection Officers (DPOs) responsible for ensuring compliance with data protection regulations
  • Application owners who need to securely transfer sensitive data to/from their apps
  • IT operations staff responsible for securely configuring and monitoring data transfer mechanisms
  • Compliance auditors who need to verify that sensitive data transfers meet policy and regulatory requirements

What is the risk?

Transferring sensitive data introduces the risk that the data could be intercepted and accessed by unauthorized parties while in transit. This could lead to:

  • Exposure of sensitive data (e.g. Personally Identifiable Information, financial data, health records, intellectual property)
  • Violations of data protection regulations which can result in significant fines
  • Reputational damage and loss of customer trust

Properly implementing the processes and technical measures specified in DSP-10 can significantly reduce the likelihood and impact of these adverse events. Strong encryption makes intercepted data unreadable and unusable by attackers.

What's the care factor?

For most organizations, securing sensitive data transfers is a high priority for several reasons:

  • Data breaches are increasing in frequency and cost. The global average cost of a data breach reached $4.35 million in 2022 according to IBM.
  • Regulations like GDPR and HIPAA have severe penalties for non-compliance. GDPR fines can be up to €20 million or 4% of annual global turnover.
  • Customers are increasingly aware of privacy issues and less forgiving of companies that fail to protect their data. 31% of customers discontinue their relationship with a company after a breach according to PwC.

So while there are costs to implementing DSP-10 controls, the risk reduction is well worth the investment for the vast majority of organizations handling sensitive data.

When is it relevant?

DSP-10 is highly relevant for:

  • Cloud applications and APIs that send/receive sensitive data
  • Migration of on-premises data to the cloud
  • Integration between cloud services that exchange sensitive data
  • Anytime employees, partners or customers upload or download sensitive files
  • Backup data being transferred to an alternate site

It's less relevant for:

  • Public, non-sensitive data
  • Data transfers within a completely private, on-premises network that is physically controlled
  • Fully anonymized or encrypted data that would be unusable if intercepted

What are the trade-offs?

Implementing DSP-10 does have some costs and potential downsides:

  • Encryption adds some processing overhead, which can slightly increase transfer times and costs
  • Not all legacy systems support modern encryption standards, which may require upgrades
  • Encryption can break network monitoring tools that need to inspect payloads
  • Key management introduces complexity and is critical to get right
  • Some countries restrict the use or export of cryptography, which can affect multi-national operations
  • Proper implementation requires changes to application code, network configuration, and business processes, which takes significant time and effort

However, for most organizations the risk reduction far outweighs these costs, and many of the leading cloud platforms and transfer tools now have encryption enabled by default.

How to make it happen?

Here are the high-level steps to implement DSP-10:

  1. Discover and classify your sensitive data. You can't protect what you don't know about.
  2. Define data protection policies that specify when encryption must be used, which encryption standards are approved, and any exceptions. Get sign-off from legal, compliance and business stakeholders.
  3. Define security requirements for each data transfer interface and document in a secure design pattern. For example:
    • Web UIs must use HTTPS with TLS 1.2+
    • APIs must use mutually authenticated TLS or VPN
    • Files must be encrypted with AES-256 prior to transfer
    • Access keys must be rotated every 90 days
    • Logs must record who transferred what data when
  4. Train developers on how to implement the secure design patterns. Provide secure coding guidelines and examples.
  5. Implement code changes in a dev environment (the first two sketches after this list show TLS enforcement and envelope encryption). This typically involves:
    • Upgrading to latest SSL/TLS libraries
    • Enabling HTTP Strict Transport Security (HSTS)
    • Configuring certificate validation
    • Generating and rotating access keys
    • Adding envelope encryption prior to upload/download
    • Logging access to sensitive data
  6. Deploy to production using a blue/green deployment to minimize downtime. Restrict access to production keys.
  7. Test all data transfer interfaces to sensitive data (a test sketch follows this list)
    • Verify HTTPS is used with a valid certificate
    • Confirm connections fail if the certificate is invalid
    • Check encryption key access is restricted and logged
    • Attempt transfers from unauthorized IP addresses
    • Validate sensitive data is unreadable in transit
  8. Set up ongoing monitoring and auditing (a simple log-scanning sketch follows this list)
    • Alert on any insecure transfers (e.g. HTTP)
    • Monitor encryption key access & rotation
    • Audit all sensitive data transfers
    • Scan for exposed access keys in GitHub
    • Perform penetration tests annually
  9. Train users on secure data handling practices
    • How to identify sensitive data
    • When encryption should be used
    • How to report any issues
  10. Review and update processes at least annually and after any significant change.
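
To make steps 3 and 5 concrete, here is a minimal Python sketch of an outbound transfer that enforces TLS 1.2+ and strict certificate validation using only the standard library. The endpoint URL and the PUT-style upload are placeholder assumptions; adapt them to your own interface.

    import ssl
    import urllib.request

    # Strict client context: certificate and hostname validation are on by default,
    # and anything older than TLS 1.2 is refused.
    context = ssl.create_default_context()
    context.minimum_version = ssl.TLSVersion.TLSv1_2

    def upload_report(url: str, payload: bytes) -> int:
        request = urllib.request.Request(url, data=payload, method="PUT")
        # The TLS handshake fails (and raises) if the server certificate is invalid,
        # so sensitive data is never sent over an untrusted connection.
        with urllib.request.urlopen(request, context=context) as response:
            return response.status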
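
Step 5 also calls for envelope encryption prior to upload/download. The sketch below shows one way to do this with AES-256-GCM from the widely used cryptography package; wrap_data_key is a stand-in for whatever key-management service wraps your data keys.

    import os
    from cryptography.hazmat.primitives.ciphers.aead import AESGCM

    def encrypt_for_transfer(plaintext: bytes, wrap_data_key) -> dict:
        data_key = AESGCM.generate_key(bit_length=256)  # fresh AES-256 data key per object
        nonce = os.urandom(12)                          # 96-bit nonce, never reused with a key
        ciphertext = AESGCM(data_key).encrypt(nonce, plaintext, None)
        return {
            "ciphertext": ciphertext,
            "nonce": nonce,
            # Only the wrapped (KMS-encrypted) data key travels alongside the ciphertext.
            "wrapped_key": wrap_data_key(data_key),
        }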
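
The checks in step 7 can be automated. This sketch assumes pytest and two placeholder hostnames: one serving a valid certificate and one serving an invalid (for example, self-signed) certificate.

    import socket
    import ssl
    import urllib.request

    import pytest

    TRANSFER_ENDPOINT = "https://transfer.example.com/health"  # placeholder
    UNTRUSTED_HOST = "self-signed.example.com"                 # placeholder, invalid certificate

    def test_https_with_valid_certificate():
        context = ssl.create_default_context()
        context.minimum_version = ssl.TLSVersion.TLSv1_2
        with urllib.request.urlopen(TRANSFER_ENDPOINT, context=context) as response:
            assert response.status == 200
            assert response.geturl().startswith("https://")  # no downgrade to plain HTTP

    def test_invalid_certificate_is_rejected():
        context = ssl.create_default_context()
        with pytest.raises(ssl.SSLCertVerificationError):
            with socket.create_connection((UNTRUSTED_HOST, 443), timeout=5) as sock:
                with context.wrap_socket(sock, server_hostname=UNTRUSTED_HOST):
                    pass  # the handshake should fail before we ever get here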
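
Step 8's alerting can start very simply. The sketch below assumes transfer logs are plain text lines that include the destination URL, and flags any line recording a plain-HTTP transfer; a real deployment would feed these findings into your SIEM.

    import sys

    def find_insecure_transfers(log_lines):
        # "http://" never matches inside "https://", so only plain-HTTP destinations are flagged
        for line in log_lines:
            if "http://" in line:
                yield line.rstrip()

    # Usage: python check_transfers.py < transfer.log
    for finding in find_insecure_transfers(sys.stdin):
        print(f"ALERT insecure transfer: {finding}")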

What are some gotchas?

A few things to watch out for when implementing DSP-10:

  • Legacy systems may not support TLS 1.2+. Upgrades or compensating controls may be needed. Don't allow fallback to insecure SSL protocols.
  • Certificates expire. Use automation to track expiry and renew in time to avoid service interruptions (see the sketch after this list).
  • Certificate validation is critical. Attacks often use invalid certificates. Enforce strict validation.
  • Encryption is only as secure as the key management. Restrict access to keys and rotate regularly. Use envelope encryption for long-term data protection.
  • Cloud security groups must be configured to allow HTTPS (TCP 443) but deny HTTP (TCP 80).
  • Transfers to third parties require extra scrutiny. Only allow transfers to approved endpoints. Verify compliance with your policies. Get documented agreement on protection requirements.
  • Be careful with caching. Avoid caching sensitive data wherever possible. If caching is required, ensure it is secured and has a short expiry time.
  • Test regularly. Automated testing should run daily. Manual penetration tests should run at least annually and after major changes.
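
As a concrete example of the certificate gotchas above, here is a small Python sketch that could run on a schedule to flag certificates nearing expiry. The hostname and the 30-day threshold are assumptions; connecting with a strict default context also means an already-invalid certificate raises an error immediately.

    import socket
    import ssl
    import time

    def days_until_expiry(host: str, port: int = 443) -> int:
        context = ssl.create_default_context()  # strict validation: invalid certs raise here
        with socket.create_connection((host, port), timeout=5) as sock:
            with context.wrap_socket(sock, server_hostname=host) as tls:
                cert = tls.getpeercert()
        expires_at = ssl.cert_time_to_seconds(cert["notAfter"])
        return int((expires_at - time.time()) // 86400)

    if days_until_expiry("transfer.example.com") < 30:  # placeholder host and threshold
        print("WARNING: certificate expires in under 30 days - renew it now")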

What are the alternatives?

There are a few alternatives to the specific controls in DSP-10:

  • Virtual Private Networks (VPNs) can provide a secure tunnel for data transfer without requiring application-level changes. However, VPNs typically have high overhead and are clunky for users.
  • Tokenization replaces sensitive data with a meaningless token. This can reduce risk but fundamentally the real data must still be transferred and protected at some point.
  • Managed file transfer services can handle the encryption and secure transfer for you. However, you are placing trust in the provider to handle your sensitive data appropriately.

In general, application-level encryption as specified in DSP-10 is considered best practice because it protects the data itself rather than relying on external network controls that can fail.

Explore Further

Here are some great resources to learn more about securing sensitive data transfers:

  • OWASP Cheat Sheet for Transport Layer Security
  • NIST SP 800-52 Rev 2 Guidelines for TLS Implementations
  • AWS Best Practices for DDoS Resiliency
  • Google Cloud's 13 Best Practices for User Account, Authentication and Password Management

This control also relates closely to these CIS Controls:

  • CIS Control 3: Data Protection