by Kalai Elumalai – Data Platform Solution Architect
This article attempts to answer two questions:
Why should Enterprise Architects care about usability and security controls?
How can the latest features in SQL Server 2016 (and Azure SQL Database) help address those challenges?
Usability and Security:
I recently attended a course (one of the best I’ve been to) on “People and Security” at Oxford University, given by Professor Angela Sasse, who has done extensive research in the field of usability and security. Her main research interest is the design and implementation of innovative technologies that are fit for purpose, offer value for money, and improve the productivity and well-being of individuals and society. The course prompted me to examine, from a usability perspective, some of the security implementations used in Data Warehouse architectures to meet complex security requirements (both internal and external).
The process of building a secure system is pivotal to its effectiveness. The process is the methodology used to elicit the security needs, assess those needs, define the security policies and implement the countermeasures. When designing countermeasures, designers should remember that security is not the primary goal of the users or the organisation. For example, a data analyst’s primary goal is to extract insight from information assets. Security is a supporting function: it protects the assets and the operations that system users perform on those assets as part of their core business activity.
Zurko and Simon were among the first to point out that security mechanisms create additional overhead for all stakeholders: system administrators, application developers, and end users. This overhead adds to the work and complexity involved both in securing systems and in using them while adhering to the security controls in place.
The crux of the issue is that security measures should not impede the usability of the system. For example, a database administrator implementing a security policy should introduce measures that are resilient enough to accommodate changes to security policies and are also easy to implement. Both the technical effort of implementing a security measure and the effort required of users to adhere to the security controls play a crucial role in the compliance decision. “…The ‘pain tolerance’ – the amount of extra effort an individual is prepared to make for no personal gain – is what we call the Compliance Budget. The limit of the Compliance Budget is referred to as the compliance threshold; this being the point at which the individual no longer has the will to comply with official requirements.“
Viewed from a Data Protection Act perspective, the Information Commissioner’s Office has defined a set of data protection principles that guide organisations in managing the security of the data (personal, sensitive, etc.) they hold. Principle 7 states that organisations must have the right technical and physical security measures, underpinned by information security policies and procedures. With the proposed changes to the EU Directive on Privacy and the introduction of Privacy by Design (Article 23), privacy is no longer the responsibility of a single siloed team or department within an organisation. It is gaining more focus now than ever before, and the report “Privacy and Data Protection by Design” provides guidance on building privacy into the system development lifecycle rather than bolting it on afterwards. Organisations therefore need to be better equipped to implement information security measures effectively and to comply with the regulations.
In this section, I discuss how one security requirement was implemented in an Enterprise Data Warehouse project I worked on recently. To comply with the Data Protection Act and the organisation’s information security policy framework, data was classified at the attribute level into groupings such as highly sensitive, confidential, and financially sensitive. Based on the data model, the data classification and the security policy were defined during the design phase, drawing guidance from the organisation’s information classification standards.
The bespoke solution was to create views on top of the underlying tables. Entities whose attributes fall into multiple data classification types were segregated into separate views, so that a given role could access only the data items it was permitted to see. This solution met our security requirements and was implemented.
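For context, this view-based approach can be sketched roughly as follows. The table, view, and role names here are hypothetical, not the project’s actual schema:

```sql
-- Hypothetical table mixing classification groupings at the attribute level.
CREATE TABLE dbo.Customer (
    CustomerId  INT PRIMARY KEY,
    FullName    NVARCHAR(100),   -- confidential
    Email       NVARCHAR(256),   -- highly sensitive
    CreditLimit DECIMAL(10, 2)   -- financially sensitive
);
GO

-- One view per classification grouping, exposing only the permitted columns.
CREATE VIEW dbo.vCustomer_Marketing AS
    SELECT CustomerId, FullName, Email FROM dbo.Customer;
GO
CREATE VIEW dbo.vCustomer_Finance AS
    SELECT CustomerId, CreditLimit FROM dbo.Customer;
GO

-- Roles are granted access to the views only, never to the base table.
CREATE ROLE MarketingAnalyst;
CREATE ROLE FinanceAnalyst;
GRANT SELECT ON dbo.vCustomer_Marketing TO MarketingAnalyst;
GRANT SELECT ON dbo.vCustomer_Finance   TO FinanceAnalyst;
```

The security logic lives entirely in which columns each view happens to select, which is precisely what makes this design hard to evolve, as the challenges below illustrate.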
Before looking at the challenges of this bespoke solution, let’s look at a few business scenarios that will be useful for our discussion.
The data management team in the business, which supports the customer insight and campaign management teams, wanted to run data quality rules on the completeness/conformance of some attributes currently deemed sensitive. For instance, from a marketing perspective, we would like to check whether a field contains an email address such as email@example.com. The analyst doesn’t necessarily need to view the email address; all they care about is whether the column has any value.
The customer segmentation team may not need access to the date of birth of the customer – they only need the year of birth to understand the age band of their customers.
Let’s look at the downside of implementing such a bespoke solution.
All or nothing: In the above examples, the end user (campaign data planner, customer insight analyst) either gets full access to the underlying data or no access at all.
It is impractical to anticipate, at design time, every combination of attributes a role will use, especially when roles are based on business functions. For instance, a few months after go-live, the Operational Insight team wanted access to some of the financial data to understand the debt position and allocate resources between customer service and credit handling, something the design team had not anticipated during the design phase.
The rationale behind, and knowledge of, those database views are lost when the delivery team and business transformation team move on (either to a different project or, in a third-party system integrator scenario, to different assignments). The security logic is hidden in a multitude of views and is susceptible to manual errors too.
When a view needs to change, you need someone with knowledge of those views to make the change and the DBA team to deploy it to Dev, Test, Pre-Prod and Production. Multiple teams, especially those with domain knowledge, are involved, which means more red tape, time and cost. This encourages end users to request access to all the data rather than go through the complex process of changing the view to gain access to just the data they need.
Security controls that are difficult to change, and burdensome to adhere to, mean more bureaucratic processes. Much of the time is then wasted on non-productive effort, all of which competes with the primary business goal and uses up the compliance budget.
How does Dynamic Data Masking which is a built-in feature in SQL Server 2016 (and Azure SQL Database) help to secure sensitive data?
In simple terms, this feature masks the data returned by the database engine. This new security programmability feature limits access to sensitive data by obfuscating query results. It offers:
Sensitive Data Protection: Protects sensitive data against unauthorized access through built-in or custom masking rules, while privileged users can view the unmasked data. For example, DDM can mask sensitive information in an HR database so that only authorized users see the unmasked values.
Regulatory Compliance: Enabler for applications to meet regulatory compliance needs such as PCI DSS, Sarbanes-Oxley, HIPAA and other regulations which demand protection of sensitive financial and personal data.
Agility and Transparency: The biggest benefit of this feature is its ease of implementation and ease of use. The data is masked on the fly and the underlying data remains intact. And it is completely transparent to the application.
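As a hedged sketch of what this looks like in practice (the Employee table and column names here are illustrative, not the project’s actual schema), masks are declared inline using the built-in DDM functions:

```sql
-- Illustrative table; email() and default() are built-in DDM masking
-- functions in SQL Server 2016 and Azure SQL Database.
CREATE TABLE dbo.Employee (
    EmployeeId INT IDENTITY PRIMARY KEY,
    FullName   NVARCHAR(100),
    Email      NVARCHAR(256)  MASKED WITH (FUNCTION = 'email()'),
    Salary     DECIMAL(10, 2) MASKED WITH (FUNCTION = 'default()')
);

-- A mask can also be added to an existing, unmasked column; partial()
-- here keeps the first character and pads the rest.
ALTER TABLE dbo.Employee
    ALTER COLUMN FullName ADD MASKED WITH (FUNCTION = 'partial(1, "XXXXXXX", 0)');
```

For a non-privileged user, `email()` exposes only the first letter and the constant suffix of the address (e.g. `aXXX@XXXX.com`), which is enough for a completeness check without revealing the value itself.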
In the example above, the Email address column of the Employee table has been masked using a simple set of T-SQL statements based on the security policy. The data quality analyst can query the email address column and still perform their primary tasks without compromising security compliance, whilst the privileged user “admin1”, which could be used by applications such as a campaign management tool, can still access the unmasked email address and trigger email campaigns.
From an implementation perspective, all the DBA or security admin needs to do is:
define the security policy for the sensitive attributes in the entities.
apply those policies in T-SQL over the sensitive columns.
these data masking policies will then obfuscate the sensitive data in the query results when accessed by non-privileged users.
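The permission side of these steps can be sketched as follows, assuming a masked Employee table and hypothetical user names:

```sql
-- Hypothetical users; a real deployment would map these to logins or roles.
CREATE USER DataQualityAnalyst WITHOUT LOGIN;
CREATE USER CampaignAppUser    WITHOUT LOGIN;

GRANT SELECT ON dbo.Employee TO DataQualityAnalyst;
GRANT SELECT ON dbo.Employee TO CampaignAppUser;

-- Only the privileged application user is allowed to see unmasked data.
GRANT UNMASK TO CampaignAppUser;

-- The analyst sees masked values in the result set, yet can still run a
-- completeness check, because predicates evaluate against the real
-- underlying data.
EXECUTE AS USER = 'DataQualityAnalyst';
SELECT COUNT(*) AS MissingEmails
FROM dbo.Employee
WHERE Email IS NULL OR Email = '';
REVERT;
```

Note that because predicates operate on the unmasked underlying data, DDM limits exposure in query results but is not a substitute for access control; it complements, rather than replaces, permissions and auditing.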
If I were doing the project discussed above today, I would use SQL Server 2016 DDM. It would not only have simplified the implementation but also provided effective governance through a central data masking policy, along with the agility required to meet changing business data access needs.
Find out more information on “how to”:
 M. A. Sasse and I. Flechais, “Usable Security: Why Do We Need It? How Do We Get It?”
 M. E. Zurko and R. T. Simon, “User-Centered Security,” New Security Paradigms Workshop, 1997.
 A. Beautement, M. A. Sasse and M. Wonham, “The Compliance Budget: Managing Security Behaviour in Organisations,” in Proceedings of the 2008 Workshop on New Security Paradigms, 2008.
 “Privacy and Data Protection by Design – from Policy to Engineering,” European Union Agency for Network and Information Security, December 2014.