March 20, 2020
Traditionally, Usability and Security have been set in opposition to each other: with tight security, we end up with a painful user experience. In this blog, Guy focuses on financial services as an exemplar of how we can introduce usability into a vertical with challenging security and compliance requirements.
Data is central to IT activities. It provides information about business activities, about customer behaviour and about the performance of our systems. It is used extensively across the business for reporting, analysis and powering services.
To optimise our use of data, we need services which store it reliably, provide interfaces for analysis and automate transformation. In developing and configuring these services we must walk a fine line between security and usability.
Usability, because business value depends on frictionless access to data. We don’t want work to be hindered by challenging interfaces and unreliable systems.
Security, because we work with sensitive data, client information and intellectual property. We cannot afford for this data to make its way into the wrong hands and must comply with regulatory mandates.
Traditionally, Usability and Security have been set in opposition to each other: with tight security, we end up with a painful user experience. With emphasis on user experience, we leave controls loose and services insecure. We believe that building usable systems can actually improve security by reducing the temptation to work around over-restrictive security controls. By humanising data we can improve security as well as motivation, satisfaction and productivity.
By making data security usable we also expect to make systems easier to manage, operate and understand.
In this blog, we will focus on financial services as an exemplar of how we can introduce usability into a vertical with challenging security and compliance requirements.
The challenge for banks and other financial services providers is how to strike the right balance between security and providing a great user experience.
We must control both access to data and where data can be sent by those with access to it. Naturally, we wish to prevent unauthorised data access; exfiltration attacks can occur when sensitive datasets are maliciously or accidentally transferred outside of your organisation by users who are authorised to access the data.
To understand access requirements we must characterise user behaviour: what are the personas involved, and what permissions does each group require? Least privilege is often the best guideline; however, applying it at too fine a granularity can create significant friction, with back-and-forth access control change requests as permissions are tuned. We recommend a pragmatic approach in which access control changes are handled simply and transparently, with clear expectations set around timescales. It is easy for security to fall back on draconian restrictions governed by complex approval processes; truly understanding your users' needs takes somewhat more effort, but heavy-handed controls damage business value directly and are unhelpful.
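One lightweight way to keep persona-based least privilege simple and transparent is to make the persona-to-permission mapping explicit and reviewable in code. A minimal sketch; the persona names and permission strings below are illustrative assumptions, not a real access model:

```python
# Map each persona to the smallest permission set its role requires
# (least privilege). Keeping this mapping in version control makes
# access changes reviewable and auditable.
PERSONA_PERMISSIONS = {
    "analyst":  {"read:reports", "read:curated_data"},
    "engineer": {"read:raw_data", "write:pipelines"},
    "auditor":  {"read:audit_logs", "read:reports"},
}

def is_allowed(persona: str, permission: str) -> bool:
    """Check whether a persona's permission set includes the requested permission."""
    return permission in PERSONA_PERMISSIONS.get(persona, set())
```

A change request then becomes a small, transparent diff to the mapping rather than an opaque approval workflow.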
The use of automation can significantly accelerate change and, more importantly, provide accurate and complete logging of change. Cloud tools like AWS CloudWatch provide the opportunity to execute business logic (in the form of Lambda functions) whenever an event happens within the infrastructure. AWS CloudTrail provides a full audit trail of all actions performed over the AWS API. With these kinds of tools we can seek to detect insecure behaviour and converge on a consensus between onerous control and usability.
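The event-driven pattern above can be sketched as a small Lambda-style handler that scans CloudTrail records for actions we have chosen to treat as risky. The event names below are genuine CloudTrail event names, but the watch-list, handler wiring and alerting logic are illustrative assumptions rather than a complete detection policy:

```python
# Actions we have chosen to watch for (an illustrative, incomplete list).
RISKY_EVENTS = {
    "PutBucketAcl",                    # S3 ACL changes can expose data publicly
    "DeleteTrail",                     # disables audit logging
    "AuthorizeSecurityGroupIngress",   # opens network access
}

def flag_risky_actions(records):
    """Return the CloudTrail records whose eventName is on the watch-list."""
    return [r for r in records if r.get("eventName") in RISKY_EVENTS]

def handler(event, context=None):
    """Lambda-style entry point: receive a CloudTrail log payload and report findings."""
    findings = flag_risky_actions(event.get("Records", []))
    for f in findings:
        actor = f.get("userIdentity", {}).get("arn", "unknown")
        print(f"ALERT: {f['eventName']} by {actor}")
    return {"findings": len(findings)}
```

In a real deployment the alert would go to a notification channel rather than a log line, but the shape is the same: detect, surface, and converge on controls that users can live with.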
Data must not only be stored securely, it must also be handled in accordance with various compliance requirements. Regulations such as GDPR place controls on the management of Personally Identifiable Information (PII) and on international data transfers. We must demonstrate that we are conforming by providing audit trails and reports.
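In practice, such controls translate into concrete engineering steps, for example masking PII before data leaves a controlled zone. A minimal sketch, assuming (unrealistically) that email addresses are the only PII of concern; real PII detection is far broader than a single pattern:

```python
import re

# Illustrative pattern only: real PII covers names, addresses, identifiers
# and much more, and detection is usually handled by dedicated tooling.
EMAIL = re.compile(r"[\w.+-]+@[\w-]+\.[\w.]+")

def mask_emails(text: str) -> str:
    """Replace anything that looks like an email address with a redaction marker."""
    return EMAIL.sub("[REDACTED EMAIL]", text)
```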
To do this we recommend involving Security and Compliance teams early in the project process and ensuring that their view is incorporated from the initial designs. This includes understanding what data needs they have about any system and how this data is to be analysed and reported.
Again automation can help – allowing rapid response to security and compliance challenges as well as ensuring that controls are systematically and accurately deployed across the whole system.
Data engineering involves getting the right data, to the right place, at the right time and in the right format. Whilst we may often have a single data lake with raw or lightly processed data, this must generally be moved or transformed before significant analysis. So if we wish to use a specific cloud service then the data may need to be moved to a cloud region. We may need to perform Feature Engineering prior to machine learning analysis. These are complex engineering tasks which must be developed and appropriately controlled.
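A transformation step of this kind can be sketched as a small feature-engineering function that turns raw records into model-ready features. The field names and derived features below are illustrative assumptions, not a real schema:

```python
import math
from datetime import datetime

def engineer_features(txn: dict) -> dict:
    """Derive flat, model-ready features from one raw transaction record."""
    ts = datetime.fromisoformat(txn["timestamp"])
    return {
        "log_amount": math.log1p(txn["amount"]),   # compress heavy-tailed amounts
        "day_of_week": ts.weekday(),               # 0 = Monday
        "is_weekend": ts.weekday() >= 5,
        "is_international": txn["currency"] != "GBP",
    }
```

The point is not the specific features but that this logic is explicit, versioned code: it can be reviewed, tested and controlled like any other engineering artefact.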
As algorithmic access to and processing of data intensifies with the increased adoption of AI and Machine Learning, we must govern access to regulated data by these processes. We must also ensure that the models and algorithms are generating insights and decisions which are fair and legal. Interpretability ensures that we can predict the behaviour of a model; explainability ensures that we can understand the mechanics behind it. Since machine learning models have traditionally been opaque, we must ensure that they are explainable and interpretable, and that an appropriate level of transparency is provided around the behaviour of data analyses.
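One widely used, model-agnostic way to probe what a model relies on is permutation importance: shuffle a single feature and measure how much the model's accuracy degrades. A minimal sketch of this generic technique (not a method proposed above, and far simpler than production tooling):

```python
import random
import statistics

def permutation_importance(predict, rows, target, feature, n_repeats=10, seed=0):
    """Estimate a feature's importance as the mean increase in absolute
    error when that feature's values are shuffled across rows."""
    rng = random.Random(seed)

    def error(data):
        return statistics.mean(abs(predict(r) - r[target]) for r in data)

    base = error(rows)
    deltas = []
    for _ in range(n_repeats):
        shuffled_vals = [r[feature] for r in rows]
        rng.shuffle(shuffled_vals)
        shuffled = [dict(r, **{feature: v}) for r, v in zip(rows, shuffled_vals)]
        deltas.append(error(shuffled) - base)
    return statistics.mean(deltas)  # larger = model relies more on this feature
```

A feature the model ignores will score near zero; a feature it depends on will score high, giving a first, coarse handle on model behaviour.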
There is much emerging research in this space and it remains challenging. We do, however, see signs of commoditisation: see, for example, https://cloud.google.com/explainable-ai/.
To enable the efficient processes described above, we need the right tools for the job. We need good observability so that we can see what is going on and when. The logging and monitoring must, of course, be themselves usable.
In general, industry-standard tools will yield the best results. What we are trying to achieve in operability is often best served by generic, commodified tooling (boring, even). For analysis, there is generally a small set of tools that your teams will be familiar with and that are easy to hire for. It is worth asking what your teams want to use.
As we have touched on earlier – the cloud vendors provide many of the tools required and these are provided as a managed service meaning minimal operational overhead and continuous improvement behind the scenes. These can be supplemented by best-of-breed open source and commercial software. Developing your own tools can lead to poor results and an ongoing burden as they must be maintained.
Balancing usability and security is challenging but rewarding: it requires two different mindsets to reach consensus on how best to implement controls in services and infrastructure. The earlier these conversations take place, the more harmoniously security can be baked into the design and observability of the architecture.
OpenCredo has partnered with our parent organisation Trifork to bring Design Thinking to our deep technical projects with Accelerate Workshops. We seek to create alignment between design and technology and to create IT systems which are secure and a joy to use, improving the work life and productivity of staff in your organisation. If you are interested in finding out more, contact: firstname.lastname@example.org or email@example.com
This blog is written exclusively by the OpenCredo team. We do not accept external contributions.