Recently I was scanning my mail when I noticed a solicitation from a well-known retailer trying to sell me a conditioner that promises “tangle-free hair”. Now, even from a mile away one can make out that tangled hair should be the least of my worries. I scrolled down and, surprisingly, there was another email from the same sender, this time about a product promising to take me back to my “million-hair” days!
Though amused, I wondered why the same sender would try to sell me solutions for problems that are mutually exclusive. After all, this is the era of targeted promotions backed by big data analytics. Or maybe not… Availability of data does not automatically translate to optimal usage – there is a tendency to take all of the available data and “play with it”. In this case the seller decided to bombard me with every product in its arsenal in the hope that at least one would stick, rather than analytically figuring out the product I am likely to buy (none, in this particular case).
Sadly, the story is not very different with operations personnel in many organizations. Routinely I see operations personnel at our prospects download all active orders every morning into a gargantuan spreadsheet (one of them used the term “Mother of all Spreadsheets”). While high-level trends & patterns do tell a story, for operations it is always about the specific few orders or scenarios they need to focus on NOW. So the next step is to apply a bunch of filters & formulae to zero in on the specific transactions they will work on first, and then the set they will work on next. Features like VLOOKUP (MS Excel) are often used to consolidate notes from the previous day and apply them to the “latest” data.
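The daily “carry the notes forward” routine described above can be sketched in plain Python; the field names (`order_id`, `note`) and sample data are illustrative assumptions, not taken from any particular system:

```python
# Sketch of the daily consolidation step: yesterday's spreadsheet
# holds analyst notes keyed by order ID; today's fresh download has
# the latest order data but no notes. A VLOOKUP-style lookup copies
# any matching note onto the new data.

def carry_notes_forward(todays_orders, yesterdays_notes):
    """Attach yesterday's note (if any) to each of today's orders.

    todays_orders: list of dicts, each with an 'order_id' key.
    yesterdays_notes: dict mapping order_id -> note text.
    """
    for order in todays_orders:
        # Equivalent of =VLOOKUP(order_id, notes_range, 2, FALSE),
        # with a blank instead of #N/A when no note exists.
        order["note"] = yesterdays_notes.get(order["order_id"], "")
    return todays_orders

todays = [{"order_id": "SO-1001"}, {"order_id": "SO-1002"}]
notes = {"SO-1001": "Awaiting credit approval"}
merged = carry_notes_forward(todays, notes)
# merged[0]["note"] -> "Awaiting credit approval"; merged[1]["note"] -> ""
```

The fragility is easy to see: the join key is whatever the analyst typed, and any order that changed or closed overnight silently carries a stale note.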
Spreadsheets give the user a lot of power in terms of filtering & jotting notes. But that power comes with a set of problems.
Firstly, a good chunk of the data downloaded in the morning is outdated by the time meaningful work starts. Any attempt to keep the spreadsheet updated turns prohibitive given the long run times of the reports that spew out the data and the difficulty of handling such volumes in a tool that was never meant to work like a database table. In fact, I am yet to come across anybody who refreshes such spreadsheets more than once a day.
The second problem relates to collaboration – while spreadsheets can accommodate notes & comments, they are not built for collaboration. The result is a series of email trails where clarifications, approvals etc. are sought. Of course the operations personnel diligently copy comments from the emails onto the spreadsheet… another source of outdated information & mistakes.
Unfortunately the second problem leads to a third – the proliferation of spreadsheets. The moment a spreadsheet, or part of it, is emailed out, people start responding with their own version of where the transaction is or should be.
The fourth & last problem is probably the most important – it is very difficult to get a handle on the downstream impact on the process, or the impact from an upstream process, through a set of filters and comments. One needs to apply more sophisticated measures like cycle time, exception resolution histories etc. in conjunction with the current status of the transaction. The net result is that, more often than not, some critical transactions that need to be worked on get ignored, while precious time is spent on issues that would get resolved in a timely manner given the normal progression of the transaction.
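One way to combine current status with a measure like cycle time is sketched below. The stage names, the historical averages, and the slack factor are all illustrative assumptions; the point is only that an order is flagged when its elapsed time in a stage is well beyond the norm, while orders progressing normally are left alone:

```python
# Illustrative prioritization: flag an order only when its elapsed
# cycle time exceeds the historical average for its stage by some
# margin. Orders moving at a normal pace are not flagged, so the
# user's time goes to the transactions that won't move on their own.

from datetime import date

# Hypothetical historical averages: days an order typically spends
# in each stage.
AVG_DAYS_IN_STAGE = {"credit_check": 1, "picking": 2, "shipping": 3}

def needs_attention(order, today, slack_factor=1.5):
    """True if the order has sat in its current stage well beyond the norm."""
    elapsed = (today - order["stage_entered"]).days
    norm = AVG_DAYS_IN_STAGE.get(order["stage"], 1)
    return elapsed > norm * slack_factor

orders = [
    {"id": "SO-1001", "stage": "picking", "stage_entered": date(2024, 1, 2)},
    {"id": "SO-1002", "stage": "picking", "stage_entered": date(2024, 1, 8)},
]
today = date(2024, 1, 9)
flagged = [o["id"] for o in orders if needs_attention(o, today)]
# SO-1001 has been in picking for 7 days against a 2-day norm, so it
# is flagged; SO-1002 has only been there 1 day and is left alone.
```

A status-only filter would treat both orders identically (“in picking”); it is the cycle-time comparison that separates the stuck order from the healthy one.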
While working on OpsVeda, we always knew that what the operations user does with his giant spreadsheet is akin to finding the proverbial needle in a haystack. So the end user's ability to add filters on any attribute and apply pattern-based measures to zoom in on the few transactions that really need attention was a must-have feature. We also recognized that frequent, detailed collaboration is typically what leads to the resolution of exceptions. Hence, enabling collaboration in the context of the transaction in question, and making all the relevant communication automatically available in one place (the transaction, of course!), was another key feature that was part of the product from the beginning. And finally, none of this is of any practical value unless the data is current – so making sure that data is captured from the source systems in REAL-TIME and immediately analyzed for exceptions/alerts based on user-defined rules/filters was paramount.
So what does all this mean for the operations user? (1) They work with the most current data ALL THE TIME, (2) they have better exception-definition capabilities, which means their time is channeled towards the orders that wouldn't move unless they intervene, and (3) they have a single workbench for managing their exceptions, reviewing operational data and collaborating with internal or external stakeholders. So now the user finally has time to focus on the customer.