Our ethics

We treat data and people with care

Applying big data and advanced analytics in the public sector is a sensitive topic, and rightly so: it can involve the most personal information about people’s lives.

But we believe the issue isn’t where you apply this approach. It’s how and why.
We don’t use data and analytics to identify and escalate your most serious cases. We use them to help you understand which cases might become serious in a few months’ time. So rather than building a punitive model (one that, for example, triggers an at-risk child being taken into care), we build models designed to prevent that from happening.

This preventative approach underpins our ethics and everything we do. And we only partner with clients who share our view.

We hold ourselves to a higher data-sharing standard than the system

To achieve what we want, we need to combine data from multiple agencies. We’ve overcome the challenges this poses by becoming experts in information governance, and by controlling the flow of data more carefully than the current system would.

When we work with a client, we engage with people from across the organisation and capture their policies and processes in a data-sharing agreement. It’s the client’s data owners, not us, who confirm which data to share with the platform. We just configure the controls in OneView to reflect who’s allowed to see what.
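In practice, controls like these boil down to a mapping from roles to the datasets a data-sharing agreement permits them to see. The sketch below illustrates the idea only — the role names, dataset names and structure are invented for this example, not OneView’s actual configuration:

```python
# Hypothetical role-based visibility controls. The agreement below is
# illustrative: real agreements are captured with the client's data owners.
SHARING_AGREEMENT = {
    "adult_social_care": {"health", "housing", "adult_services"},
    "children_services": {"education", "children_services"},
}

def can_view(role: str, dataset: str) -> bool:
    """True if the data-sharing agreement allows this role to see the dataset."""
    return dataset in SHARING_AGREEMENT.get(role, set())
```

The key design point is that the agreement itself is the single source of truth: the platform only enforces what the data owners have already signed off.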

We’re constantly fighting bias

Human bias will always exist. We fight it by excluding data that could encourage bias, like gender or disability, from our predictive models. We also check for bias before we include data in OneView. (For example, if a particular percentage of the adult services caseload is BAME, does the risk model reflect the same proportion?) And we check for algorithmic bias in our regular reviews of the case summaries OneView generates, as well as in our user acceptance testing.
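The proportion check described above can be sketched as a simple disparity test: compare each group’s rate of being flagged high-risk against the overall rate. This is a minimal illustration with invented field names and an assumed tolerance, not our production bias review:

```python
def flag_rate(records, group):
    """Share of a demographic group's records that are flagged high risk."""
    in_group = [r for r in records if r["group"] == group]
    if not in_group:
        return 0.0
    return sum(r["high_risk"] for r in in_group) / len(in_group)

def disparity_check(records, groups, tolerance=0.05):
    """Return groups whose flag rate deviates from the overall flag rate
    by more than the tolerance — candidates for a closer bias review."""
    overall = sum(r["high_risk"] for r in records) / len(records)
    return {
        g: flag_rate(records, g)
        for g in groups
        if abs(flag_rate(records, g) - overall) > tolerance
    }
```

A check like this doesn’t prove or disprove bias on its own; it flags where the model’s output diverges from the caseload so a human reviewer can investigate.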

We empower people, not replace them

OneView saves councils money by reducing the caseloads of social workers, along with the time it takes them to assimilate the information they need. This means they can support the right people, at the right time. It also allows team leaders to prioritise caseloads and allocate cases to the right person. But OneView doesn’t decide how to act on its own insights – social workers do. We strongly believe we’ll always need social workers to deliver person-to-person care. Our technology simply supports them to do it better.

We only re-identify data under strict conditions

We believe in balancing the need for privacy with the need to protect the vulnerable.

We do this by using pseudonymised data and a system that requires individuals to consent to us sharing it. But because this approach doesn’t safeguard all vulnerable people, we also work with clients to define precisely when OneView can re-identify data without consent.

This tends to be:

  • if another public professional, like a police officer, has referred a case to the client agency (no advanced analytics involved)
  • if there are already causes for concern and advanced analytics reveal that the situation is getting worse.

In all cases, OneView makes an appropriate amount of identifiable data available, and only to professionals with the right level of access. But a case worker always decides what action to take – not the platform.
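The two conditions above amount to a small set of agreed rules that must hold before consent can be overridden. The sketch below shows the shape of such a rule check — field names and structure are assumptions for illustration, not OneView’s implementation, and as the text says, the platform only surfaces information; a case worker decides what to do:

```python
def may_reidentify(case: dict) -> bool:
    """A case may be re-identified without consent only under the agreed
    conditions: a referral from another public professional, or existing
    causes for concern that analytics show are getting worse."""
    if case.get("professional_referral"):
        return True
    if case.get("existing_concerns") and case.get("risk_trend") == "worsening":
        return True
    return False
```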

We take an active part in the debate around ethics

Anyone can build a predictive model in a lab. But as soon as you try to implement it in the real world, your ethics come under scrutiny. We’ve engaged with a wide range of organisations to make sure we’re adhering to the latest thinking, including the Cabinet Office, the Centre for Data Ethics and Innovation and the Department for Digital, Culture, Media and Sport. We’ve also implemented approaches the Government recommends in its Data Ethics Workbook.

We’re open to competition as well as to our community

We aren’t precious about what we do. We want other companies to build models so data can change more lives. And we want public sector organisations to be less afraid of using them. It’s why we actively try to raise awareness by speaking at relevant government events.

We’re also passionate about giving back to the community we’re part of. We do this by partnering with LDN Apprenticeships to give apprenticeships (seven so far) to young people from London’s East End who want to get into tech.

