About Us

P2 Consulting was started by a group of award-winning consultants who recognised the opportunity to build a global consultancy firm with clients’ needs at its heart. We understand the challenges clients face – the pace of globalisation, technological change and increasing regulation – and the pressure they are under to respond to these changes rapidly and efficiently.

What We Do

We work on some of the largest transformation programmes in the corporate world and the public sector. Partnering closely with our clients, we help them deliver successful business change. Our reputation as a consultancy is built on excellence in portfolio and programme management, business architecture and design, testing and quality assurance, and implementation.

Insights

Understanding the challenges that keep our clients awake at night is essential. In this section we demonstrate our expertise in solving your problems. We have deep insight into the business and technology issues facing all sectors.

Join Us

Are you looking to join a company where a challenge is welcomed, change is anticipated, and expertise is expected? Then have a look at our job listings and please get in touch.

Success Stories

We’ve worked with clients across a range of sectors and gained excellent results – but don’t just take our word for it. Have a browse through some of the work we’ve done.

IT glitch blamed for 16,000 coronavirus cases being overlooked

By Phil Rolfe, CEO, P2 Consulting

07.10.20


The latest episode in the Government’s testing debacle is an IT glitch blamed for thousands of coronavirus cases being left out of the daily briefings last week. A data blunder meant almost 16,000 cases were left off the official numbers between late September and early October. Apparently the glitch occurred while data was being transferred from one system to another, and the Government has blamed a combination of IT failure and human error.

But the reputational fallout from issues like this can be significant, particularly in a situation as sensitive as this one. The Government can ill afford negative headlines that were avoidable. Systems do not break themselves – there are several ways a failure like this can happen, and they are worth bearing in mind when dealing with new systems:

  1. Bugs: a bug in the system had not been identified, or had not been prioritised for a fix, and the incoming data triggered it. Platforms are often tested to their expected capacity and then assumed to work with any volume of data – but databases and feeds slow down and break if they cannot scale to cope with significant surges. A hidden data volume limit may only surface when volumes balloon, and then it has to be fixed once identified.
  2. Lack of testing: another possible cause is that an update was made to the system without sufficient testing before the change went live, resulting in an error that took a few days to spot – for example, a break in the feed to the contact-tracing part of the system that went unnoticed while backlogs were being worked through. Eventually someone noticed that new contacts to trace were no longer being added, and only then was the bug found and corrected. A reconciliation check of the kind sketched just after this list is one way to catch that sort of silent loss early.
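To make that concrete, here is a minimal sketch in Python of the kind of reconciliation check that catches records silently dropped when data moves between systems. The function, record shape and volumes are purely illustrative assumptions – we have no visibility of the actual systems involved.

    # Minimal sketch of a transfer reconciliation check (all names and data
    # are illustrative assumptions, not the real systems).
    def reconcile_transfer(source_records, destination_records):
        """Return the IDs of records sent by the source but missing downstream."""
        source_ids = {record["id"] for record in source_records}
        destination_ids = {record["id"] for record in destination_records}
        return sorted(source_ids - destination_ids)

    if __name__ == "__main__":
        # Illustrative volumes only: 16,000 records sent, 1,000 silently dropped.
        sent = [{"id": i} for i in range(16000)]
        received = [{"id": i} for i in range(15000)]
        missing = reconcile_transfer(sent, received)
        if missing:
            print(f"ALERT: {len(missing)} records lost in transfer - hold the daily figures")

The point is not the code itself but the discipline: every automated transfer should be reconciled at both ends before the numbers are published.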

With new platforms there are always bugs, and fixing them should be prioritised by potential impact: a ‘Criticality One’ defect should stop a go-live, because it is fundamental and will cause the system to fall over, whereas a ‘Criticality Two’ defect might be allowed into the live environment, provided there aren’t so many of them that together they compromise the system. Volume testing the platform is also important, to ensure it can cope with unexpected data surges.
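As a sketch of what a severity-based go-live gate might look like in practice – the criticality labels come from the paragraph above, while the structure, threshold and function name are my own assumptions – the release decision can be reduced to a simple, auditable check:

    # Hypothetical go-live gate based on open defect severity. The tolerance for
    # Criticality Two defects would be agreed with the business, not hard-coded.
    CRITICALITY_ONE = 1   # fundamental: the system will fall over
    CRITICALITY_TWO = 2   # significant, but tolerable in small numbers

    def can_go_live(open_defects, max_criticality_two=5):
        """Return (decision, reason) for a go-live given the list of open defects."""
        crit_one = [d for d in open_defects if d["criticality"] == CRITICALITY_ONE]
        crit_two = [d for d in open_defects if d["criticality"] == CRITICALITY_TWO]
        if crit_one:
            return False, f"{len(crit_one)} Criticality One defect(s) still open"
        if len(crit_two) > max_criticality_two:
            return False, f"{len(crit_two)} Criticality Two defects exceed the agreed tolerance"
        return True, "open defects are within the agreed tolerance"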

Detecting glitches and testing is key

Once live, systems are always subject to change: improvements are always happening and fixes are being applied. Good practice means that before a fix is put into the live environment, it is fully tested so that no unexpected consequences occur. If a change is rushed through and too little time is given to testing, or if the test environment doesn’t resemble the ‘live’ environment, a fix can introduce a new error.
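A minimal illustration of that discipline, using a hypothetical feed-parsing function and test names of my own: the existing tests are re-run alongside a new test written for the fix, ideally against data shaped like the live feed, before anything is deployed.

    # Hypothetical example: a fix for blank rows in the feed is accompanied by a
    # new test, and the pre-existing test must still pass before the fix goes live.
    def parse_daily_feed(rows):
        """Parse raw feed rows into case records, skipping blank lines (the fix)."""
        return [{"case_id": row.strip()} for row in rows if row.strip()]

    def test_existing_behaviour_still_holds():
        # Regression check: the original happy path is unchanged by the fix.
        assert parse_daily_feed(["A1", "A2"]) == [{"case_id": "A1"}, {"case_id": "A2"}]

    def test_fix_for_blank_rows():
        # New test written alongside the fix itself.
        assert parse_daily_feed(["A1", "", "A2"]) == [{"case_id": "A1"}, {"case_id": "A2"}]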

Detecting glitches and testing is key, but it is a balance between time, rigour and cost – all tricky to evaluate in an intense environment. This situation is a pressure cooker for the Government – and it can’t afford to make any more mistakes.

If you’d like to find out more about how we can help you, please call me on +44 (0) 7813 900 337 or email [email protected]

