Quality Data for Children in Care

Contents:

  1. Project outputs
  2. Project timeline

Children in care often have poor lifetime outcomes arising from neglect, abuse and trauma. Improving the quality of that care and investing in early intervention and prevention can improve life chances and generate wider benefits, including reduced criminal justice costs and better educational outcomes.

This beta project, led by Wigan Council, is building on the prototype developed during the previous alpha phase. The team will now build a web-based tool that enables analysts to identify errors in their data and automatically fix 37% of them.

The project team will also build a community of engaged analysts who will contribute to developing the tool further and become skilled in using Python to automate tasks in a practical way.

Watch a recording of the Quality LAC Data Beta Tool Preview.

Project timeline

February 2021

The ‘Improving data and evidence on children in care’ project becomes ‘Quality Data for Children in Care’. 

The project is awarded £250,000 in continuous funding from the Local Digital Fund to move into a beta phase. 

Wigan Council is now the lead council for this project with partners Data to Insight.

June 2021

The team started planning their next steps for the beta phase of the project, including two workshops with their community of data analysts to understand their expectations for the project and to inform other planning activity.

Thirty analysts have expressed interest in supporting the development of the tool.

August 2021

The project kicked off with the team presenting their first show and tell for this phase of the project on 16 August.

The team completed the recordings of their introductions to Python, Replit, GitHub and more, and they’ve released the resources to their community of analysts. The resources can be accessed via the Data to Insight online learning hub.

24 Children’s Social Care data analysts signed up to take the Python training and to help code checks in Python for the various errors seen in 903 returns data.
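As an illustration of what coding one of these checks might involve (a minimal sketch using pandas; the rule, column names and logic here are hypothetical examples, not the project's actual code), a 903-return validation rule can be written as a function that returns the rows failing the check:

```python
import pandas as pd

def rule_episode_starts_before_dob(episodes: pd.DataFrame) -> pd.Index:
    """Hypothetical rule: flag episodes recorded as starting before the
    child's date of birth. Returns the index of failing rows."""
    dob = pd.to_datetime(episodes["DOB"], dayfirst=True, errors="coerce")
    start = pd.to_datetime(episodes["DECOM"], dayfirst=True, errors="coerce")
    return episodes[start < dob].index

# Small made-up extract: the second episode starts before the child's DOB
episodes = pd.DataFrame({
    "CHILD": ["1001", "1002"],
    "DOB": ["15/04/2010", "01/09/2012"],
    "DECOM": ["20/06/2015", "01/01/2012"],
})
print(list(rule_episode_starts_before_dob(episodes)))  # [1]
```

Writing each rule as a small, self-contained function like this is what makes it practical to divide the work across many volunteer analysts.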

The team also made some front-end improvements to the tool to display placement distance and postcode, and completed the videos supporting analysts in delivering advanced error coding.

The team made great progress with over 30 rules coded in the first month.

September 2021

The team published some great videos supporting analysts in coding the most complex errors; these are now available on the Data to Insight website.

The team has successfully coded 68 errors in Python in total, with two new advanced rules coded this week.

Wigan joined the team on 27 September. This helped to increase the number of participating analysts and bolster the number of rules coded per week.

October 2021

On 11 October, the team held an event attended by 20 people to demonstrate a prototype of the error checking and validation tool.

The team put out a call for volunteers to test the tool and are now working with volunteers on a one-to-one basis, which allows them to respond to feedback directly.

Following the team’s ‘Beta Tool Preview’ and ‘Mid Project Call for Volunteers’ Teams event, they gained 4 volunteers from local authorities to support with front end testing and 6 volunteers who have committed to code five rules a week.

As of 26 October, the team had coded a total of 98 rules.

The team also held a ‘front end’ user testing session on 28 October and received some positive and constructive initial feedback on the tool when presenting demos to regional groups.

November 2021

The Data to Insight team held front end user testing sessions with local authorities. These sessions enabled the team to find out what changes needed to be made to improve the tool.

More than 170 rules have now been coded, with fewer than 100 remaining.

Everyone who volunteered to code rules attended the second coding session and continues to help the team.

The team is planning to launch the MVP (minimum viable product) version of the tool to users on 3 December 2021.

December 2021

The Quality Data for Children in Care project launched the ‘CLA Placement Modelling Tool’ on 3 December. 

The team hosted a virtual launch event on 8 December where they:

  • demoed the tool in action
  • explained how to access the tool
  • shared next steps for further development and maintenance of the tool

March 2022

The project is awarded £349,560 in follow-on funding through the Continuous Funding Model.

The project will use the funding to migrate data to a new platform built on the user needs of children’s services analysts.

April-May 2022

The team is preparing for the full beta phase to kick off in June, with weekly catch-ups for the core project team and periodic meet-ups with local authority analyst colleagues.

Local authority analysts will have capacity to engage in the next phase of the project in July, so in the meantime the team is working on their approach to building a generic code platform out of the 903 data quality tool.

July 2022

The team held a session to work through the key outcomes they want to achieve in the next phase of work. The next phase will focus on working with data analysts in Children’s Services to rework the data quality tool (created as part of their previous funding, which focused on improving data quality for the 903 return) so that it can be used for Children in Need Census data. The next steps are for a smaller group to refine these objectives and bring them back to the team to work up into an action plan.

September 2022

The team has agreed to use Codespaces instead of Replit for this coding season, as it provides a better development and collaboration experience for this project.

The team has also made the way the tool’s rules are implemented far more flexible. This is a significant improvement on the way the rules were structured in the 903 tool, and allows the team to enhance how error locations are displayed. The new structure is robust enough to be changed at any time without breaking the tool.
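One common way to achieve this kind of flexibility (a sketch under assumed design choices; the registry, error code and rule below are hypothetical, not the team's actual architecture) is to register each rule in a central registry with its metadata, so rules can be added, removed or restructured without touching the tool's core:

```python
from dataclasses import dataclass
from typing import Callable, Dict, List

@dataclass
class Rule:
    code: str         # the published error code for the return
    description: str
    func: Callable[[dict], List[str]]  # returns fields that fail the check

RULES: Dict[str, Rule] = {}

def rule(code: str, description: str):
    """Decorator that registers a validation rule under its error code."""
    def wrap(func):
        RULES[code] = Rule(code, description, func)
        return func
    return wrap

@rule("100", "Age must be under 18 at episode start (hypothetical rule)")
def check_age(record: dict) -> List[str]:
    return ["AGE"] if record.get("AGE", 0) >= 18 else []

# The tool can run every registered rule over a record and collect the
# field locations of any failures, keyed by error code.
record = {"CHILD": "1001", "AGE": 19}
failures = {code: r.func(record) for code, r in RULES.items() if r.func(record)}
print(failures)  # {'100': ['AGE']}
```

Because each rule reports the fields it failed on, the front end can highlight error locations directly, and restructuring or reordering rules never requires changes to the runner.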

In the past week, the team held a brainstorming session on the user interface. They analysed how the feedback they’ve received, the requirements of the CIN project, and the new, improved project architecture can come together to provide a good user experience.