The Phoenix audit we could have had – Part 2
Earlier, I ranted about the actual audit of Phoenix that was done by the Office of the Auditor General (A disappointing audit of the Phoenix problems). And in my post yesterday (The Phoenix audit we could have had – Part 1), I talked about what I expected to see, or at least thought we could have seen, regarding governance and oversight.
Today I want to talk about the current state of pay requests outstanding.
What were the criteria?
There were two criteria for the state of pay, and the first was:
Problems related to paying public service employees are identified, and the nature and impact of these problems are understood.
To assess this first criterion, the auditors relied upon the following documents:
- Pay Disbursement Administrative Services Order, 2011
- Directive on Financial Management of Pay Administration, Treasury Board
- Policy on Results, Treasury Board
- Directive on Results, Treasury Board
- Supporting Effective Evaluations: A Guide to Developing Performance Measurement Strategies, Treasury Board of Canada Secretariat
- COBIT 5: Enabling Processes, Information Systems Audit and Control Association (ISACA)
As with yesterday’s review, the Policy on Results, the Directive on Results, the guide to performance measurement strategies, and COBIT 5 are virtually worthless to the exercise. They talk heavily about programmatic delivery results (the external results of spending) and have very little to offer on measuring or monitoring internal services. To the extent they do, they describe the types of things departments should do in general terms; they don’t dictate or give explicit instructions. The first two documents, however, are a lot more detailed and do include some directive language, along with some indication of actual service standards and duties/obligations. Not enough to run the Phoenix system, but at least PSPC managers would have had SOMETHING to rely upon.
The second criterion was:
The resolution of problems related to paying public service employees is being effectively and efficiently managed.
For this criterion, a few of the documents are the same, but the ones from TBS are different and quite telling:
- Pay Disbursement Administrative Services Order, 2011
- Policy on the Management of Projects, Treasury Board
- Policy on Learning, Training, and Development, Treasury Board
- Directive on the Administration of Required Training, Treasury Board
- Management Accountability Framework, Treasury Board
- Guidelines on Costing, Treasury Board of Canada Secretariat
- A Guide to Costing of Service Delivery for Service Standards, Treasury Board of Canada Secretariat
- COBIT 5: Enabling Processes, ISACA
- Information Technology Infrastructure Library (ITIL) Service Strategy, second edition, 2011
What the REAL criterion should have focused on
Now, if you take the pieces above and break them down into manageable chunks to audit, you would expect to find some of the following:
- A comprehensive inventory of all the pay action requests in the system;
- Detailed reports of nature (type, age, and department) and impact (materiality, $$ estimate, $$ as a percentage of annual salary);
- Clear project management principles showing differentiated approaches based on nature and impact;
- A risk-based triage process and analysis of the various PARs (a rough sketch follows below);
- Cost breakdowns of the steps taken to date and how they impacted the resolution of numbers outstanding;
- Key performance measures in place for overall and individual workload management, tied to nature and impact; and,
- Training in place to cover the basics, ongoing maintenance, and emerging issues.
While there are other things you COULD see, those seven items are pretty basic tools.
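To make the triage point concrete, here is a minimal sketch of what a risk-based triage score could look like. To be clear, this is illustrative only: the fields, weights, and caps are all invented, not anything PSPC actually uses.

```python
from dataclasses import dataclass
from datetime import date

@dataclass
class PayActionRequest:
    """One pay action request (PAR). All field names are hypothetical."""
    par_type: str            # e.g. "new_hire", "acting_pay", "promotion"
    opened: date             # date the request entered the queue
    est_amount_owed: float   # rough dollar estimate of the pay error
    annual_salary: float     # employee's annual salary
    tax_years_affected: int  # how many tax years the error touches

def triage_score(par: PayActionRequest, today: date) -> float:
    """Score a PAR by nature and impact so the worst files surface first.

    The weights are placeholders; a real scheme would be set and
    validated by the pay centre, not hard-coded like this.
    """
    age_months = (today - par.opened).days / 30.4
    salary_share = par.est_amount_owed / max(par.annual_salary, 1.0)
    score = (
        2.0 * min(age_months, 24.0)       # older files hurt more, capped
        + 50.0 * min(salary_share, 1.0)   # impact as a share of salary
        + 5.0 * par.tax_years_affected    # multi-year files mean tax pain
    )
    if par.par_type == "new_hire":        # someone not being paid at all
        score += 25.0
    return score

# Work the queue from the highest score down:
# queue.sort(key=lambda p: triage_score(p, date.today()), reverse=True)
```

Even a crude score like this would let managers show that the differentiated approaches were driven by analysis rather than by whichever pressure point was loudest that month.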
You could also likely examine three other items that deal not with the PARs themselves, but with the client service function:
- Clear identification and public sharing of service standards for various types of PARs and how the system is doing against them, likely updated weekly;
- A system for people to check the status of their individual file and know what is happening, even if that is only a queue number when the answer is “nothing yet” (a minimal sketch follows below); and,
- A detailed communications plan in place to transparently share the detailed reports.
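The queue-number idea, in particular, is not exotic. A stub like the following, with an invented file ID scheme and a hypothetical processing rate, would at least have answered “what is happening with my file”:

```python
from typing import Optional

FILES_PER_WEEK = 5_000  # hypothetical pay-centre throughput, for illustration

def file_status(file_id: str, ordered_queue: list[str]) -> Optional[dict]:
    """Return a status stub for one employee's file.

    ordered_queue is the triaged work list, highest priority first.
    """
    try:
        pos = ordered_queue.index(file_id)
    except ValueError:
        return None  # not in the queue: resolved, or never logged
    return {
        "position": pos + 1,
        "files_ahead_of_you": pos,
        "rough_wait_weeks": round(pos / FILES_PER_WEEK, 1),
    }
```

Not a substitute for actually fixing files, but it converts “we have no idea” into something an affected employee can plan around.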
What did the audit find?
The audit’s conclusions were blunt: “The number of pay problems continues to increase”; “Public Services and Procurement Canada did not have a full understanding of the extent and causes of pay problems”; and “Departments and agencies had significant difficulties in providing timely and accurate pay information and in supporting employees in resolving pay problems.”
What COULD the audit have found?
It is clear to anyone and everyone that the solutions in place are not meeting the needs. And on some of the elements I mentioned above, the auditors did have some views:
- The inventory was not comprehensive; there were clear gaps;
- The reports were rudimentary at best and didn’t give details on nature or impact;
- Differentiated approaches based on nature and impact were attempted, driven mainly by various pressure points over time, but with little analysis or evidence of the results for each group; and,
- The training was not done before launch and hasn’t kept up since.
They could have also expressed concern that there were clearly other gaps:
- No detailed risk-based triage;
- No cost breakdowns;
- Little in the way of performance metrics or service standards; and,
- No exception management system, nor any feedback and status mechanism.
Missed opportunities
The audit failed in this area on two counts. First, the recommendations could have been quite prescriptive and detailed, saying “We recommend you do x or y, and do it by such and such a date”, which PSPC would then have had to commit to doing, and to doing publicly. Instead, the recommendations are more general than that, telling PSPC to do better rather than saying it failed to meet even the most basic standards. As a result, PSPC was basically able to respond that it would develop a plan for what its overall plan should be. A plan to have a plan, not even the final plan itself.
More importantly, though, the auditors had access to the internal data. While it is clear that the PSPC system is not robust enough to generate the reports needed, some more rudimentary reports could have been developed and calculated. And, given the public spectacle surrounding the audit, part of the auditors’ role is to report on what is happening and how the system is performing. Instead of giving us the reports we need, or at least coming as close as they could, they went with straight-up overall volumes. Stakeholders, namely employees, ended up with almost no more useful info or data than before the audit.
Here is the most minimal of tables that I had expected and hoped to see:
| Type of Pay Action Request | < 1 month | 1-6 months | > 6 months |
| --- | --- | --- | --- |
| Put in pay (i.e., new employees) | # | # | # |
| Acting pay (< 2w) | # | # | # |
| Acting pay (> 2w) | # | # | # |
| Promotions | # | # | # |
| Removals from pay (retirement) | # | # | # |
| Removals from pay (special leave) | # | # | # |
| Removals from pay (separation) | # | # | # |
| Contract settlements | # | # | # |
Don’t get me wrong, I think that table is insufficient. I just think it was the most basic table and that they should have been able to provide it, or even generate it as part of the audit, even if there was a gaping sub-area / black hole called “other”. Basically, they hadn’t triaged yet, so they weren’t even sure what was in there. But arguing there are 500K requests doesn’t tell me anything about nature OR impact.
Now, I expected that table, and it didn’t come. Nor were there any details on maternity leave, sick leave, overpayments, assignments, secondments, other administrative changes, etc. There could be another 10-20 categories for the type of PAR, but it is not just the categories so much as the time factor beside them: how old are the requests, and how big is the queue? Because once the report is generated the first time, it can be generated again, showing changes since the last one. I suspect the age figures would have to be even more disaggregated (maybe 1m, 2-3m, 4-6m, 6m-1y, 1-2y, 2-3y, etc.). That would give you decent details on the nature, at least for the type of PAR and the age of the request.
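Generating that kind of report doesn’t require a sophisticated system, either. Assuming even a flat extract of the queue exists, with a type and an open date per request (an assumption on my part, since the audit suggests nothing richer was on hand), a few lines would produce the crosstab:

```python
from collections import Counter
from datetime import date

def age_bucket(months: float) -> str:
    """Bucket a request's age; the cutoffs match the table above."""
    if months < 1:
        return "< 1 month"
    if months <= 6:
        return "1-6 months"
    return "> 6 months"

def nature_report(queue: list[tuple[str, date]], today: date) -> Counter:
    """Crosstab of PAR type x age bucket from a flat (type, opened) extract."""
    return Counter(
        (par_type, age_bucket((today - opened).days / 30.4))
        for par_type, opened in queue
    )

# Run it again next week and diff the two results to show movement:
# growth = this_week - last_week   # Counter subtraction keeps only increases
```

Once that runs weekly, the “changes since the last report” column comes for free.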
Where I was apparently dreaming in technicolour was in thinking that they might even go further. Think of the above table, with the same PAR categories down the side, but instead of age, imagine the dollar value owed across the top. For some reason, the materiality threshold set by PSPC was $100. Almost every request would be over that threshold in gross terms, particularly as acting pay is now only paid when the acting period is longer than 3 days, so their threshold is meaningless. Instead, I’d hope to see something along the lines of 0-1000, 1000-2500, 2500-5000, 5-10K, 10-20K, and > 20K as the cutoffs.
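The impact version of that table is the same crosstab with a different bucketing function. The bands below are the cutoffs I just suggested; the hard part, admittedly, is getting an estimated amount owed onto each file in the first place:

```python
# Hypothetical dollar bands for the impact crosstab, per the cutoffs above.
DOLLAR_BANDS = [
    (1_000, "0-1K"),
    (2_500, "1-2.5K"),
    (5_000, "2.5-5K"),
    (10_000, "5-10K"),
    (20_000, "10-20K"),
]

def dollar_band(est_owed: float) -> str:
    """Bucket a request by the rough dollar value of the pay error."""
    for limit, label in DOLLAR_BANDS:
        if est_owed < limit:
            return label
    return "> 20K"
```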
Why? Because it would give a clear and compelling indication of the magnitude of the problem. There are lots of stories of people complaining about Phoenix, but not all complaints are created equal. Some are life-altering disasters, with people not being paid for over a year and owed tens of thousands of dollars. At the same time, there’s Joe Worker next to them in the queue who didn’t get their three-day acting pay last week. Both people deserve to be paid, in full and on time, but when I have to triage the files, the person who is owed more money is likely facing a larger personal impact, particularly when that is combined with a longer period of time and multiple affected tax years.
Now, the auditors hinted that the info isn’t available, and to be blunt, I don’t entirely believe them. I believe the info isn’t READILY available, but I don’t believe that it couldn’t be generated with some basic methodology. Even if they had 300K files in the queue and didn’t really have a way to code 40% of them, they would still know the profile of the ones they could code and be able to extrapolate what it means for the others. Is it a perfect methodology? Nope, but there are lots of previous audits that did more with less.
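That basic methodology could be as crude as scaling the coded profile onto the uncoded remainder. Every number below is invented, but the arithmetic is the point:

```python
def extrapolate(coded_counts: dict[str, int], n_uncoded: int) -> dict[str, int]:
    """Scale the profile of coded files onto the uncoded remainder.

    Crude, and it assumes uncoded files look like coded ones, but it
    turns "we can't code 40% of the queue" into a stated estimate
    instead of no information at all.
    """
    n_coded = sum(coded_counts.values())
    return {k: round(v * n_uncoded / n_coded) for k, v in coded_counts.items()}

# Invented example: 180K coded files, 120K uncoded.
coded = {"new_hire": 36_000, "acting_pay": 90_000, "promotions": 54_000}
print(extrapolate(coded, 120_000))
# -> {'new_hire': 24000, 'acting_pay': 60000, 'promotions': 36000}
```

State the assumption, publish the caveat, and you still have a far better picture than a single headline volume.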
While there were some basic charts and tables included, they really didn’t provide much info on the scope at all. I thought they at least might have profiled one of the participating departments as an example, but they didn’t even do that.
It felt like a completely missed opportunity to pull back the curtain and provide SOME info to all the affected employees. They deserved an audit that went further and produced more, particularly once the auditors saw that PSPC didn’t already have the data available.
