Having spent almost two decades in Testing, I have seen many ‘trends’ adopted that have changed how testing is performed in organisations. I have witnessed the move from integrated testers within build teams to a centralised test function (and now a move to decentralise again!), the shift from religiously following a Waterfall methodology to a more ‘Agile’ approach, the introduction of DevOps, and plenty more.
While these trends come and go, one element remains consistently at the foundation of all good testing practice: Test Data Management.
A good Test Data Management strategy has always been important to the success (or failure) of the test process. So, why is NOW the time to let this often-ignored aspect of testing take centre stage in the continual drive to make testing more effective and efficient?
Here are my top 5 reasons why a good Test Data Management strategy is essential for today’s Test functions:
1) Data Protection Legislation - The increasing use and development of new technologies, especially digital, has resulted in a significant increase in the amount of personal data being held and transferred electronically. It is now more important than ever to understand the risks this poses to you, your business and your customers, and to agree an approach to mitigating them. Organisations must prevent sensitive information from falling into the wrong hands or being misused.
2) Cloud Testing - With organisations embracing the new Digital world, compatibility and accessibility testing across devices and operating systems is essential. Use of the cloud has grown significantly and is now an almost essential approach to testing on multiple devices. Private clouds, while having their uses, can be hugely expensive and a cost that many organisations cannot justify in the current economic climate. The public cloud only becomes a viable option for most organisations if it is supported by a robust Test Data Management strategy which assures the protection of sensitive data.
3) Service Virtualisation - Virtualised components require realistic test data to simulate the behaviour of the live service or software they are emulating. A Test Data Management strategy that subsets production data while masking sensitive information meets these requirements (a minimal sketch follows after this list).
4) DevOps - The ability to ‘build and burn’ targeted build or test environments quickly sharpens the focus on rapidly identifying and sub-setting test data to align with the latest physical or virtual environment requirements. Speed to market is essential, and organisations can no longer take weeks or months out of their busy schedules to refresh their test data.
5) 3rd Party Integration - More and more organisations are choosing to integrate with systems hosted and managed by 3rd parties. Integration testing of such systems often requires the test data to be masked to reduce the risk of sensitive data falling into the wrong hands. In addition, 3rd parties will rarely accept production data during testing, as they then carry the risk of disposing of that data correctly once testing is complete.
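To make the subset-and-mask idea from point 3 a little more concrete, here is a minimal sketch in Python. The dataset, column names and masking rule are all invented for illustration; they are assumptions on my part, not a specific tool or project’s approach.

```python
import hashlib

# Hypothetical production extract; the column names here are illustrative only.
production_rows = [
    {"customer_id": "C001", "name": "Jane Smith", "email": "jane@example.com", "segment": "retail"},
    {"customer_id": "C002", "name": "Raj Patel", "email": "raj@example.com", "segment": "business"},
    {"customer_id": "C003", "name": "Li Wei", "email": "li@example.com", "segment": "retail"},
]

def mask(value: str) -> str:
    """Replace a sensitive value with a deterministic, irreversible token."""
    return hashlib.sha256(value.encode()).hexdigest()[:10]

def subset_and_mask(rows, segment):
    """Keep only the rows the virtualised component needs, masking personal fields."""
    return [
        {**row, "name": mask(row["name"]), "email": mask(row["email"])}
        for row in rows
        if row["segment"] == segment
    ]

# The virtualised service only ever sees the masked, subsetted records.
for row in subset_and_mask(production_rows, segment="retail"):
    print(row)
```

A deterministic hash is used here so that relationships between records survive masking; in practice the masking rules would be driven by your data lineage and the fields your organisation classes as sensitive.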
What do you think the key components to a good Test Data Management Strategy are? I’ll share my ideas as a starter for 10 …
- Data Reusability framework
- Identification of a good Test Data Management tool (or tools) to facilitate a quick response to requests for Data Obfuscation and Data Sub-setting
- Automation framework for data generation to meet specific test scenario requirements (see the sketch after this list)
- Access to individuals with a strong understanding of data lineage
- Documented data flows across all data sources and a mechanism to keep these up to date
- Dedicated resources to manage and accommodate the project’s test data needs
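On the data generation point above, here is a small sketch of what a scenario-driven generator might look like. The scenario names, fields and values are invented assumptions for illustration, not the API of any particular tool.

```python
import random
import string

# Illustrative scenario definitions; a real project would map these to business rules.
SCENARIOS = {
    "new_customer": {"account_age_days": 0, "balance": 0.0},
    "dormant_account": {"account_age_days": 900, "balance": 12.50},
    "high_value": {"account_age_days": 365, "balance": 250000.00},
}

def random_reference(length=8):
    """Generate a synthetic, non-production account reference."""
    return "".join(random.choices(string.ascii_uppercase + string.digits, k=length))

def generate_test_data(scenario, count=5):
    """Create records matching a named test scenario, using no real customer data."""
    template = SCENARIOS[scenario]
    return [{"account_ref": random_reference(), **template} for _ in range(count)]

# Produce three records for the 'dormant account' scenario.
for record in generate_test_data("dormant_account", count=3):
    print(record)
```

Because the data is generated rather than copied from production, there is nothing sensitive to obfuscate, and the same scenarios can be regenerated on demand whenever an environment is rebuilt.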
Have you had to adapt your approach to Test Data Management to meet the demands of the Digital Age? If so, what changes have you made?