Program management is often likened to firefighting: whatever the fire, it demands immediate attention. Whether your fire consists of volatile stakeholders, unreasonable customers or the latest unforeseen exception case, the connotation is clear – nothing else can be dealt with until this fire is duly smothered. And so it comes as no surprise that the primary reason program owners are unable to address bigger-picture strategic needs is the unending demand of the ringing telephone, the urgent-flagged emails or the failed VMS FTP upload.

It stands to reason that it is difficult to invest time in fire prevention while putting out existing fires. But without that investment, the fires will only keep coming until you are a full-time firefighter, accepting the status quo as the reality of the job. It is this acceptance of fate, the belief that it has always been this way and therefore always will be, that is most damning to the spirit of progress.

How to push forward, then? The answer lies where progress has historically found its push: science and data, the practice of determining factual answers through the statistical analysis of empirical observations. More specifically, it lies in testing an educated guess as to the cause of a given phenomenon by systematically refuting the other possible hypotheses.

Businesspeople of nearly every type have grown to love data, but few are properly trained in formal methods of gleaning truth from it. For the most part, data has been delivered in the form of reports. When people complained that these reports were not answering their questions, the reports were made bigger, in both size and number. The result is a classic case of information overload: in some cases the answer is in there, but it is buried under so much other information that the needle is hard to find in the haystack. The deficiency of the reports in our industry is not a lack of data, or of observations, but a lack of the answers people expect to magically pop out and smack them in the face.

So, before you can figure out how to get answers out of your data, you must first understand what questions the data are capable of answering, and how to ask those questions of the data. For those of you who have been through Module 3 of Staffing Industry Analysts’ CCWP training curriculum, you know that the first step is to graph key variables on a scatterplot and to resist the temptation to start with averages. The scatterplot tells you whether your data are distributed uniformly, whether there are apparent correlations with other variables, and where to focus your attention in the next steps of the analysis. By taking averages right off the bat, you are likely to miss the most relevant details — like mixing all the paint on your canvas into a muddy brown, sacrificing otherwise vibrant color — and be left wondering why your program isn’t performing better.
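To make the muddy-brown problem concrete, here is a minimal sketch in Python using entirely hypothetical pay rates (the figures and the two job categories are invented for illustration, not drawn from any real program): two distinct rate clusters billed together collapse into a single average that describes neither group, while a scatterplot would show the two clusters immediately.

```python
# Hypothetical pay rates from two job categories reported together.
pay_rates = [18, 19, 20, 21, 22,      # cluster A: e.g. light-industrial roles
             55, 57, 60, 62, 65]      # cluster B: e.g. IT roles

# The overall average lands between the clusters and matches no one.
overall_avg = sum(pay_rates) / len(pay_rates)

# Splitting the data first (as a scatterplot would prompt you to do)
# recovers the two meaningful averages.
cluster_a = [r for r in pay_rates if r < 40]
cluster_b = [r for r in pay_rates if r >= 40]
avg_a = sum(cluster_a) / len(cluster_a)
avg_b = sum(cluster_b) / len(cluster_b)

print(f"overall average: {overall_avg:.1f}")            # 39.9
print(f"cluster averages: {avg_a:.1f} and {avg_b:.1f}")  # 20.0 and 59.8
```

The overall average of 39.9 sits far from every actual rate in the data, which is exactly the detail a premature average would hide.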

There are many further steps in making good science out of program management, but getting better acquainted with your data is a critical first one, and scatterplots leave nothing hidden from you. Try this: plot a couple of obvious variable pairs, such as mark-up vs. pay rate, mark-up vs. tenure or time-to-fill vs. bill rate. This may not answer your big questions outright, but it ought to focus where you should be asking them.
