Understanding and Selecting Data Masking: Buyer’s Guide
The final installment in our masking series closes with a simplified buyer's guide for product selection. As with most security product buyer's guides, we offer a fairly involved process to help customers identify their needs and evaluate solutions against each other. These guides address the difficulty of getting all stakeholders to agree on a set of use cases and priorities, which is harder than it sounds. We also offer guidance on avoiding pitfalls and vendor BS. Of course you still need to ensure that your requirements are identified and prioritized before you start testing, but the process with masking technologies is a bit less complicated than with other technologies. The field of vendors has dwindled rapidly for one simple reason: customer requirements are narrowly defined along a few principal use cases (test data management, compliance, and database security), so most masking platforms focus their solutions along those lines. Only a couple of full-featured platforms provide the deployment models and database coverage necessary to compete in all cases. But we often see a full-featured platform pitted against others that focus on a single use case, because not every customer needs or wants every possible capability. So don't focus solely on the 'leaders' in whatever analyst reports you read; cast your net across a wider group of vendors to start your 'paper' evaluations. That should give you a better idea of what's available before you conduct a proof of concept deployment.

Define Requirements

Over and over again, we see dissatisfaction with security products stemming from a failure to fully understand internal requirements before product selection. We understand that it is impossible to fully evaluate questions such as ease of use across an entire organization before a product is in full deployment. But more often the real issue is a lack of understanding of both the internal expectations for the product and where the organization is headed. So defining needs and getting input from all stakeholders are necessary for a successful product evaluation and selection.

Create selection team: Even small firms should have their technically focused security and IT operations groups cooperate during the selection process, and typically different business units, along with risk, audit, and compliance, have input as well. Identify the major stakeholders and designate a spokesperson for each group.

Define what needs protecting: Identify the systems (file servers, databases, etc.) and data types to be protected. Summarize what the data is and how the systems are used, and map the desired data flow if possible.

Define how data will be protected: Map your protection and compliance needs to the systems, processes, and data from the previous step. Accept input from each stakeholder on the security and compliance requirements for each data type, and on the risk or criticality of that data.

Design your ideal deployment: Now that you understand what needs to be protected and how, document the specifics of integration and deployment. Determine what masks are appropriate for each data type, how data flows through your systems, and where your integration points should be.

Define tests: Determine how you will verify that vendors meet your requirements. Decide what sample data sources and data types need to be tested. Confirm that adequate resources are available to thoroughly test the system; pulling an old laptop from a drawer or an aging server from a closet to run tests on is a way to ensure failure. Determine and assign responsibilities for who will test and who will evaluate the results. Tier the tests so the most critical elements are tested first, to weed out unworthy products as quickly as possible. Finally, figure out how you will validate the efficacy of the masks and whether they genuinely produce suitable results; a sketch of one such check follows this list.

Formalize requirements: At this point you should have a very clear picture of what you need, so it's time to document your requirements in a formal Request For Information (RFI) and Request For Proposals (RFP) to identify which vendors offer appropriate solutions, and then select the ones that best match your requirements for further evaluation. You should also have a good idea of your budget by this point; it will help guide your selection, and may force a phased deployment.
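To make the mask validation step concrete, here is a minimal sketch in Python of the kind of automated check we mean. The file names, column name, and CSV export format are assumptions for illustration, not any vendor's interface. It compares pre-mask and post-mask extracts of the same table and verifies three properties: no original value leaks through, masked values preserve the expected format, and a deterministic mask maps each input to exactly one output so referential integrity survives across tables.

```python
import csv
import re

# Hypothetical exports of the same customer table, in the same row order,
# before and after masking. Swap in however your tooling dumps test data.
SSN_FORMAT = re.compile(r"^\d{3}-\d{2}-\d{4}$")  # expected shape of a format-preserving SSN mask

def load_column(path, column):
    """Read one column from a CSV export into a list."""
    with open(path, newline="") as f:
        return [row[column] for row in csv.DictReader(f)]

original = load_column("original.csv", "ssn")
masked = load_column("masked.csv", "ssn")

# 1. No leakage: no original value may survive anywhere in the masked output.
leaked = set(original) & set(masked)
assert not leaked, f"original values leaked into masked data: {len(leaked)} distinct values"

# 2. Format preservation: masked values must still look like SSNs, so the
#    applications and test suites that consume them keep working.
bad_format = [v for v in masked if not SSN_FORMAT.match(v)]
assert not bad_format, f"{len(bad_format)} masked values break the expected format"

# 3. Consistency: a deterministic mask must map each original value to exactly
#    one masked value, or joins across masked tables will break.
mapping = {}
for orig, mask in zip(original, masked):
    if mapping.setdefault(orig, mask) != mask:
        raise AssertionError(f"inconsistent mask: one input mapped to multiple outputs")

print("mask validation passed:", len(masked), "rows checked")
```

Checks like these belong in the early tiers of your test plan, so a product that fails on leakage or consistency washes out before you spend time on usability testing.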
Vendor Selection

Deployment architecture: Architecture is key because it determines compatibility with your environment, and it directly correlates with performance, scalability, management, and ease of deployment. Centralized masking servers, distributed deployments, on-database masking, and agents are all options, but which is best depends entirely on your environment and how you want to deploy. Testing your deployment model across sufficient systems is essential to developing a good idea of how well the masking solution fits your environment.

Platform coverage: Verify that vendors support the relational and quasi-relational databases you need, and that they can work with the applications and file servers you wish to integrate with. This is typically the first area where vendors 'wash out' of an evaluation, when they fail to adequately support one of your critical platforms. Review vendors' published support matrices, but we suggest you also test your critical platforms to make sure they work to your satisfaction. How data is collected and managed varies from vendor to vendor, and how well each solution works with different database types can be an eye-opening comparison.

Use, customization, and management: Test the day-to-day tasks of adding data sources, performing discovery, adding masks, and customizing reports. You will be living with this UI and workflow daily, so ease of use is a major consideration. If a product is annoying during the evaluation, it is unlikely to become more pleasant with familiarity. Poor user interfaces make administrators less likely to tune the system, and poor workflows are more likely to cause mistakes. Ease of use is rarely listed as an evaluation criterion, but it should weigh heavily in your choice of platform.

Scale and performance: Vendor-reported performance and real-world performance are quite distinct, so you need to measure throughput yourself, against production-sized data in your own deployment model, rather than trusting data sheet benchmarks. One way to structure such a test is sketched below.
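As an illustration of that kind of scale test, the sketch below times a candidate product against progressively larger samples. The vendor-mask-cli command and its flags are hypothetical placeholders; substitute whatever CLI or API the product under evaluation actually exposes. The point is the measurement harness: wall-clock throughput at row counts that bracket your real production tables.

```python
import subprocess
import time

# Row counts should bracket your real production tables; these are examples.
SAMPLE_SIZES = [100_000, 1_000_000, 10_000_000]

def run_masking_job(sample_rows: int) -> None:
    """Invoke the candidate product on a test dataset of the given size.

    Hypothetical placeholder: replace with the vendor's actual CLI or API call.
    """
    subprocess.run(
        ["vendor-mask-cli", "--source", f"test_{sample_rows}.csv",
         "--policy", "pii-policy.yaml"],
        check=True,
    )

results = []
for rows in SAMPLE_SIZES:
    start = time.perf_counter()
    run_masking_job(rows)
    elapsed = time.perf_counter() - start
    results.append((rows, elapsed, rows / elapsed))

# Throughput that collapses as row counts grow is the red flag to watch for:
# it usually signals a masking engine that does not scale past its demo data.
for rows, elapsed, rate in results:
    print(f"{rows:>12,} rows  {elapsed:8.1f}s  {rate:10,.0f} rows/s")
```

If rows per second collapses between the one-million and ten-million row runs, you have learned something no data sheet would have told you.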