Future Government services must be data-driven
How do public sector agencies harness the power of the vast datasets in the sector to design and deliver services that create better outcomes for citizens?
This was the focus of discussion involving the Crown Prosecution Service, Oil and Gas Authority, and UK digital transformation specialist CloudSource, on the opening day of the Government Data Show, an online conference bringing together over 530 UK public sector data professionals. The event runs until 23rd September.
John Seabourn, Chief Digital Officer, Oil & Gas Authority
At the Oil & Gas Authority (OGA) we work with industry and government on the economic recovery of oil and gas, and support the UK government's efforts to reach net zero greenhouse gas emissions by 2050. I'm responsible for our digital and data strategy, along with some of the other functions that support it, such as cybersecurity and IT.
We are the national data repository for subsurface data, so our top priority is the public release of enormous amounts of data: 100 terabytes to start with, rising quickly to four petabytes within five years, to help carbon capture and storage and to maximise the economic recovery of oil and gas.
We're all aware that COP26 is coming up, and this is a first of its kind for a G7 country. We've got a cast-iron agreement with Government about our role within that: reducing emissions as a sector by 10% over the next few years, by 25% over the next five years, and eventually reaching net zero by 2050.
None of that is possible without good-quality data from the oil and gas sector. We have generated petabytes of data over the last 60 years, and if we look at onshore data, probably over the last 100 years. So it is about how we can reuse that data for things that weren't thought possible when it was generated.
We have to make our data available to academia for applications that weren't even on the horizon as recently as five years ago.
Interoperability is a difficult challenge because we deal with large amounts of legacy data. We've spent a lot of time talking with operators about the data supply chain, and how they can collaborate and work better together to knock down some of these barriers. It's very complex given some of the commercial models in place, but we're focusing on promoting the value that comes out the other end. Where there's no competition, once you've got an active field and you're looking for value, you might as well collaborate: work together with other companies and then share a larger portion of that pie.
Mark Williams, CEO of CloudSource
I think the biggest problem is around collecting the data that's needed to support better services from a myriad of technology silos, and extracting just the data we need to create a data model that's of real value, and that can be used to drive innovation.
If we were building a new organisation from scratch, it would be clear how we would want it to perform and deliver these services.
However, it is the technical debt that really makes this complicated. Over decades of rolling out new services and information points within the public sector, we've implemented systems to deliver those services, and those systems have their own data collection and may have their own security models.
Each may have its own access point, and it's really about bringing those things together through an integration layer that enables that. Organisations know where they want to end up, but often don't really know where to start.
In this situation I recommend starting small: creating pipelines that can tap into that legacy data and push it somewhere with real innovation and processing power, on top of which we can start to build new processes.
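Purely as an illustration of the "start small" pipeline pattern described above, the sketch below uses in-memory SQLite as a stand-in for both the legacy system and the new platform; all table and database names are hypothetical, not anything the organisations quoted here actually use.

```python
import sqlite3

def extract(conn):
    """Pull raw rows from a (hypothetical) legacy table."""
    return conn.execute("SELECT id, value FROM legacy_records").fetchall()

def transform(rows):
    """Keep only well-formed rows and normalise the values."""
    return [(rid, value.strip().lower()) for rid, value in rows if value]

def load(conn, rows):
    """Write the cleaned rows to a staging area on the new platform."""
    conn.execute("CREATE TABLE IF NOT EXISTS staging (id INTEGER, value TEXT)")
    conn.executemany("INSERT INTO staging VALUES (?, ?)", rows)
    conn.commit()

# Stand-ins for the legacy system and the new platform.
legacy = sqlite3.connect(":memory:")
legacy.execute("CREATE TABLE legacy_records (id INTEGER, value TEXT)")
legacy.executemany("INSERT INTO legacy_records VALUES (?, ?)",
                   [(1, "  Alpha "), (2, None), (3, "BETA")])
target = sqlite3.connect(":memory:")

# The pipeline: extract from legacy, clean, push to the new platform.
load(target, transform(extract(legacy)))
print(target.execute("SELECT * FROM staging").fetchall())
```

The point of a small pipeline like this is that new processes can be built against the clean staging data without touching, or waiting to retire, the legacy system.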
The second part is about the level of disruption an organisation can absorb at any one time while it's delivering those services. So if we're looking to build out new datasets and new ways of working, your strategy needs to be clear that you only move so many pieces of the data puzzle at any one time. There needs to be a timeline attached to this as well.
So we've got the technical piece around how we knit it all together, and then we also need to be realistic about how much change we can roll out at any one time. It's those two things combined that really bring this to life.
Fiona James, Deputy Director for Digital Technology, Crown Prosecution Service
I help oversee the digital service, including systems, security, innovation and architecture, and I have a background largely in Government departments and the Cabinet Office. So I bring a deep understanding of public policy and strategy, and of some of the delivery challenges faced by public sector organisations.
I'm very much motivated by the enabling power of data, digital and technology, and I've seen the impact it can have: for example, right at the beginning of the pandemic, in terms of join-up and collaboration. I'm also the civil service digital wellbeing champion, and there's been an increasing need to focus on digital wellbeing and inclusion as we've all moved to remote ways of working.
In criminal justice, there's a significant backlog of cases which has worsened because of the pandemic. At CPS the current live caseload is 44% higher than the pre-Covid baseline and that's not just the backlog, because we have a duty to keep cases under continual review.
Obviously the organisation is doing everything it can to address this, including looking at how to improve workflow management, workload prioritisation and driving more automation across the organisation.
We operate in a complex ecosystem, interacting with 43 different police forces, and one of the challenges is that we rely a lot on our users to join up those systems and that data. They're still dealing with high volumes of data, and low-quality data as well, particularly at the data entry stage. We're working to help our prosecutors and legal staff focus their time where it adds the most value, rather than trying to join up different systems. We also want better data reporting and modelling, so that we can join up and do more systems thinking across the justice system and help forecast our user needs.
We are now facing a choice about how we move to the future with our core casework system, which carries an amount of technical debt. I think the pandemic has shown us that the ability to interoperate has got much better, because of collaboration but also because the underlying technology has improved.
So we are now able to plan our future deliveries around interoperating with other justice systems, for example by putting APIs in place and interfacing with the common platform, rather than migrating our data onto those systems and causing more disruption for our users.
We need to break things down into manageable chunks. We're thinking about microservices, about how we create more flexibility to access data in the future, and about how we create more of that join-up, but through a microservices approach.
Engaging with the business and engaging with your users is absolutely key. I am currently here in Leeds doing exactly that: trying to design our future setup, map out our pain points, and work out what the transition plan is. We're going to have to live with the technical debt for some time, but as you move forward you can build on the newest technologies and transition gradually. As Mark said, it's not a big bang. It's about having an approach that is resilient, stable, and future-proof.