The Redundant Back-Up

I worked at a location that was running a single, world-wide instance of SAP. The whole of their HR, Finance and some of their manufacturing ran from this single instance, twenty-four hours per day, seven days per week. Naturally, the process analyst in me (and to some extent, the pragmatist) had a number of questions about this:

- “How do you back up?” - “We mirror and back up the mirror”

- “How do you do maintenance?” - “We have a back-up machine at a separate location across town. We transfer the users to that and perform the maintenance then”

The questions went on and on. Each one had a suitable reply. But then I started thinking about this single instance and it didn’t seem to sit right with me. For some reason I could foresee problems with it. Finally, I was able to verbalise the issues I was having: “How do you deal with disaster recovery?”

Most of my readers will know what disaster recovery is, but for those that do not, a quick explanation. When something happens to a computer system that renders it unusable, a business continuity plan (BCP) needs to kick in. This is where processes are enacted that allow the business to continue operating without the damaged/unusable machine. Running in parallel with this is a recovery process that attempts to repair or replace the damaged/unusable machine and reinstate all the data and transactions that were damaged or missing as a result of the disaster. I had concerns that because this was a single instance, and because it was being used globally, BCP and disaster recovery would be a nightmare.

The project team had already thought about this.

They told me that for BCP purposes, every affiliate running the software had processes on-site that would kick in if the system was unavailable. Even in the worst case scenario, the system would not be offline for any more than 48 hours because they had an agreement with a third-party provider to have a hot-site at a remote location which would be implemented at the first signs the redundant back-up didn’t work.

I asked if this had been tested. It hadn’t. But I was assured that it had been implemented in other organisations within 48 hours.

I turned my attention to the redundant back-up. What happens if there is a fire that destroys the main machine? The redundant back-up would kick in, and all systems would be re-routed. How is the rerouting done? Through underground cables that link the two sites. What happens if someone cuts through the underground cables? There are redundant underground cables that route North and South of the city between the two sites. The chances of both being cut are infinitesimally small. What happens if there is a power spike that takes out both systems? Both systems run off separate power supplies. A power spike would only take out one, not both. What happens if a small nuclear device takes out the town, both sites and the power sources? We would have bigger things to worry about than the system, but in that case the hot-site would kick into action.

I grilled the technical operators and designers of this system for almost two days, coming back to them time and time again when a new scenario occurred to me. They had a suitable answer for every single one of my questions. So, at the end of it - despite my reservations - I had nothing concrete to justify them.

Then, a couple of weeks after I left the site, they ran a new interface program. It was an HR population interface that filled in specific records on an HR master file. The interface had not been tested properly and it went into a loop. The program ran all day and most of the night, populating the file over and over, making it larger and larger and larger. It made the file so large, in fact, that it completely filled all the disk space on the system.

The main computer, and global SAP instance, shut down.

As designed, it failed over to the redundant back-up across town. The system was on-line, ready to go as expected. It dutifully continued running all the processes it should be running - including the HR population interface. This ran for the rest of the night until it, too, had filled up the disk space on the redundant back-up.

That machine failed.

Plan C was, of course, the hot-site located in another town. A town which was, in fact, almost 500 miles away from where the two main sites had been. The hot-site didn’t work. It took almost 72 hours to get up and on-line, and even then all the information needed to make the system run globally wasn’t there. The comms lines to the global sites were not connected. Anything that could go wrong, did go wrong.

The rest of the story gets a little fuzzy. All I know is that the global SAP system did not come back on line in the host town for almost six weeks. Retrofitting all the missed transactions from within the BCP took another few months.

Nobody lost their job over this. The hot-site provider lost their contract, I believe. I sat there shaking my head.

But that wasn’t the worst of it. The worst part came when I spoke to people who were affected by the outage. I asked them what their BCP was for when they had no access to the system. (Remember, I had been told by the tech crew that each affiliate had processes for dealing with on-going business when the system was down.) They told me: “We wait.” The BCP for operating a multi-billion dollar global business across numerous affiliates, sites and countries was “Wait”. Wait until the redundant back-up kicks in and, if this doesn’t work, wait 48 hours until the hot-site is ready to run. I wondered how they dealt with a six-week backlog of waiting. Nobody was able to tell me. But I’m reasonably sure it involved manual workarounds.

The moral of the story?

Your redundant back-up isn’t. There is always a scenario where you will find yourself without it. Plan for that scenario. Make sure that you have processes in place to deal with it - even if the chances of it happening are remote. Remember, the Twin Towers of the World Trade Center were designed to withstand the impact of a commercial airliner. The chances of an airliner destroying more than a few floors were negligible. But it wasn’t the impact of the jets that brought down the towers. It was the thousands of gallons of jet fuel burning away at vital support beams. Once a couple of these had weakened, gravity did the rest.
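There is a technical lesson buried in the story, too: a fault that travels with the workload (here, a runaway interface filling the disk) defeats any amount of mirroring, because the mirror faithfully replicates the fault. One cheap defence is a resource guard inside the job itself. The sketch below is illustrative Python only - the threshold, file name and function names are mine, not anything from the SAP system described above:

```python
import shutil

# Illustrative threshold -- refuse to write when less than 5 GB remain free.
MIN_FREE_BYTES = 5 * 1024**3

def disk_guard(path="/"):
    """Return True while it is still safe to keep writing to this volume."""
    return shutil.disk_usage(path).free >= MIN_FREE_BYTES

def append_records(records, out_path):
    """Append records to a master file, refusing to run the disk out of space.

    A runaway job that fills the primary's disk will fill the mirror too,
    so the limit has to live inside the job: failing over to a back-up
    merely replicates the fault.
    """
    with open(out_path, "a") as f:
        for rec in records:
            if not disk_guard():
                raise RuntimeError("aborting: disk nearly full")
            f.write(rec + "\n")
```

The point is not this particular check, but that the guard sits in the process, not in the infrastructure - it fires on the primary and the back-up alike, which is exactly what the looping HR interface lacked.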

Photo Credit: pchow98 via Compfight cc

Reminder: 'The Perfect Process Project Second Edition' is now available. Don't miss the chance to get this valuable insight into how to make business processes work for you. Click this link and follow the instructions to get this book.

All information is Copyright (C) G Comerford
 See related info below

The futility of ERP's

Time for another story from the Comerford archives.

Several years ago I was involved in a global project to implement an ERP system in a multinational pharmaceutical company.

This project was huge. It was the biggest thing the company had ever done internally (and this is a company that had developed some of the most well-known drugs ever made). At one point we had so many external contractors working on it that we were spending over $100,000 per week just on them. It was unsustainable (and this was ten or fifteen years ago, too).

All we were doing was replacing financial systems worldwide along with HR systems.

But here's the kicker: nobody outside the project really cared.

The people who were working in the labs discovering drugs didn't care. The salesmen out on the beat trying to persuade doctors and pharmacists to prescribe the drugs didn't care. The manufacturing folks physically making the end product didn't care.

More importantly, the customer didn't care.

The IT component of the business cared, as did the CFO who was convinced he would be able to get more accurate data about the state of the company quicker than before. HR cared because they would be able to see more accurate information about the number of people employed worldwide, and in what capacity. But in reality this was never going to happen. The financial people were always going to keep their own sets of books locally and only report the figures they had approved to the CFO, rather than the actual day-to-day figures. Excel was the main way of manipulating this data and the system allowed Excel to be used in just this way.

So this hugely expensive project (which had dodgy cost-benefit justification in the first place) was being run purely to give the CFO and head of HR more accurate data - and it wasn't even doing that appropriately. None of the key departments were affected by this project (and the associated process change that came from it), and the customers were blissfully unaware that they were paying a huge amount of money for their pharmaceuticals so that a lot of it could be hived off to pay for this extremely expensive internal project.

So why do it?

Somebody obviously thought there was a cost benefit to doing this project. 'Better information flow' was bandied about as one justification, as was 'Single Worldwide System'. In my capacity as an auditor I got to see some of the documents that were not widely available to others on the project and can tell you that some of the justification and cost-benefit was tenuous to say the least.

But it does bring up the whole question of why we implement ERP systems in the first place. Sure, there are companies with disparate and widespread systems that would benefit from some sort of unification across their enterprise. But, in my opinion, the benefit comes not necessarily from the software itself, but from the unification of the underlying processes and settings. Merging a couple of companies together always benefits from implementing a common chart of accounts and common processes to underlie it, but the implementation of a common (usually expensive) system is not always required.

Or am I wrong?
Photo Credit: kaniths via Compfight cc


It's Just Not Cricket!

I was sitting over the weekend watching the guys practise their cricketing skills on the local cricket green. (For those of you who want an in-depth explanation of cricket I would recommend this, or for a more in-depth and less humorous look, this.)

The exercise was simple. A batter would lob a ball really high into the air and the fielder would have to catch the high ball, immediately throw it back accurately towards the batter who would knock it along the ground. The fielder would then catch the low ball and return it to the batter who would lob another high ball for the next fielder in the line.

Simple, straightforward and practical.

The problem came when one of the fielders didn't catch the ball. It threw off the rhythm of the practice and the guys would have to regroup and start again.

I got to thinking how this applies to process.

At first glance it’s a simple process with a number of moving parts (or steps). The process is designed so that each step can be executed in a sequence and the output of one step is used as the input of another. The problem comes when the output of one step is not what is expected by the following step. This is the case with the cricket practice. The batter was expecting a ball to come at him from the previous fielder so he could launch that to the next participant in the process. When the ball didn’t come back (because the fielder dropped it) it made the process grind to a halt.

In reality the solution is really simple: have a second ball ready to throw into the process when the previous step fails to deliver. But this is what a lot of process definitions fail to take into account. They are designed to run with an optimum process flow (i.e. they assume that the output from previous steps is valid). Designing some error handling into a process up front is always easier than trying to fix a process when it doesn't work as designed.
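The "second ball" idea - each step substituting a fallback instead of halting when its input never arrives - can be sketched in a few lines of Python. The step names here are made up for illustration; this is a pattern, not any particular workflow engine's API:

```python
def run_pipeline(steps, initial, fallback):
    """Feed each step's output into the next step.

    If a step 'drops the ball' (raises, or returns None), substitute
    the fallback value so the rest of the process keeps moving rather
    than grinding to a halt.
    """
    value = initial
    for step in steps:
        try:
            result = step(value)
        except Exception:
            result = None
        value = result if result is not None else fallback(value)
    return value

# Hypothetical steps: the middle one fails to deliver its output.
double = lambda x: x * 2
drop = lambda x: None        # the dropped catch
add_one = lambda x: x + 1

result = run_pipeline([double, drop, add_one], 5, lambda _: 0)
# double(5) -> 10; drop fails, so the fallback supplies 0; add_one(0) -> 1
```

The point is not the code but the shape: every step has an explicit answer to "what happens if my input never arrives?"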

How many error processing steps do you have in your processes?

Photo Credit: siddharthkhajuria via Compfight cc


Have the BPM goalposts moved?

My last post asked Where is the BPM Market going? and opened the way for discussion (mostly on Twitter) about the changing state of the BPM marketplace.

Thanks to Craig, The Process Ninja, we can now look at the latest analysts offerings from Gartner and Forrester in the form of the Forrester Wave for BPM suites and the Gartner Magic Quadrant for Intelligent Business Process Management Software.

Wait! Hold up now. "Intelligent Business Process Management Software", you say? What the hell is that?

Gartner has made changes to the classification of BPM solutions by redefining the marketplace. They are referring to this as "an evolution of the BPMS market" that is "centered on a new IBO use case". IBO in this case means Intelligent Business Operations. It is the same thing they did several years ago when the Magic Quadrant for Pure-Play BPM morphed into the Magic Quadrant for Business Process Management Suites.

In other words, they've moved the goalposts!

Further analysis of the underlying reason for this reveals that Gartner feels that iBPMS represents a maturation of the capability and is used typically at higher levels of BPM maturity.

But the problem is that they are pushing the same products as they had in the previous Magic Quadrant for BPMS - which they say cannot be compared with this iBPMS Magic Quadrant.

My recollection of the earlier Magic Quadrant showed a group of vendors in the top right sector of the diagram with a number of other vendors trailing down a diagonal to the bottom left. The new Magic Quadrant shows a wider spread of vendors, many of whom can be found in the lower right quadrant. In Gartner speak this indicates that they are visionaries, but lack the ability to execute on their vision.

So what am I to do if I am a Gartner customer looking to identify which vendor I can pursue to fulfill my needs?

I might previously have gone for a Metastorm offering (for example) as they were highly regarded and in the top right quadrant. They have since been purchased by Open Text and now languish in the lower left quadrant as being niche players with an incomplete vision and a lack of ability to execute. Does this mean the purchase has been a failure? Not at all. But the goalposts have moved.

I worked for American multinationals who would only look at vendors who landed in the top right quadrant of the Gartner grid. As of the current offering this would reduce the market down to three vendors. In many multinationals that's not even enough to put out an Invitation to Tender, as a minimum of four vendors is needed.

But is that a problem?

Well, it might be if you are one of the vendors who was in a more elevated position and now finds itself in a less elevated one. But if we look at the Forrester Wave report for BPM suites we find some sobering statistics. In a survey of 520 IT decision makers in Q4 2012, when asked "What are your firm's plans to adopt BPM tools?", fully 43% said they were not interested or had no plans, and only 27% said they were planning to implement.

When the same group was asked "What are your firm's plans to use Software as a Service (SaaS) to complement or replace your BPM software?", 52% answered that they did not know or had no plans to use SaaS, and only 33% had plans to do so within two years.

In a market where Forrester have identified 52 different vendors competing in the broader BPM market, where Gartner have redefined the goalposts about what a good vendor is, and where half the IT decision makers are not looking at using BPM, I sense a serious disconnect.

Who are the 52 vendors marketing themselves at? Can the market sustain this onslaught?

Photo Credit: OnTask via Compfight cc


Where's the BPM market going?

I remember that back in the deep, dark mists of time (about ten years ago, actually), the BPM market used to have several players in it. Gartner's Magic Quadrant had a diverse number of players in each of the quadrants, and it was easy to look at and understand the fragmentation. Things were called 'BPM' and everyone knew where they stood - although, in reality, very few people could adequately define 'BPM' as a concept.

More recently, though, the market has started to amalgamate. Major companies were purchased by competitors and their products merged together (Metastorm and Provision is one example). The fragmentation of the market decreased suddenly. The Magic Quadrant (and Forrester's Wave) had fewer parts to it. Things looked good for the BPM vendors, but not necessarily for the market.

People like Gartner then started to split their BPM Magic Quadrant into different areas. We got BPMS, ACM and the like. Different companies were invited to join, and pretty soon the market seemed to be just as wide-ranging as before.

But is it really? Or have we just moved the goalposts?

Is this a classic reorganisation the likes of which we experience in companies at regular intervals? Movement for the sake of movement.

Is it a way for some of the consulting companies and business integrators to muddy the waters for customers and justify large consulting fees?

Or is the market genuinely in the throes of some major increase in the number of vendors working in a particular niche? Are we on the cusp of an explosion of products that will help customers conquer the BPM beast?

I'm not sure I know the answer myself, but I suspect a number of my readers will have opinions on this. Feel free to share in the comments below.

Photo Credit: Nathan O'Nions via Compfight cc


The Path of Least Resistance

Not far from where I live, there is a DIY store on an out-of-town trading estate. It is surrounded on all sides by dual carriageway roads and the only real access to it is by vehicular transport. However, on the opposite side of this dual carriageway is a housing estate. I was waiting at the nearby traffic lights yesterday and noticed that from the fence surrounding the estate there appeared to be the beginnings of a pathway, worn by pedestrians, across the central grass reservation of the dual carriageway and into a hole in the fence surrounding the DIY store. As I watched, I saw at least three people take the route from the estate, across the dual carriageway, into the store.

It struck me as being a prime example of people finding a way of doing things that wasn't originally anticipated in the design of the thing.

When they first built the store and surrounded it by roads, nobody imagined that people would actually want to walk to the store. But people found a way. What's more they found the path of least resistance to achieve their goals.

The same happens in processes. You can design a process in whatever way you want, but people (users) will always find the easiest way to achieve their goal, and it may not be by using the process in the way you anticipated. More often than not, that easiest way is how the process should have been designed in the first place.

Always bear this in mind when designing processes.

Photo Credit: s_gibson72 via Compfight cc


Size Matters.

One thing I have learned since leaving the comfortable role I held in a huge American multinational and starting my own business is that large companies do not have a clue about how to run a responsive organisation.

They are slow (and very resistant) to change, unreactive and laborious. Their processes are usually large and complicated, and this results in an inability to make the quick changes that are often required.

The governance process I worked with within the multinational involved tortuous meetings with nineteen interested parties, where prospective change agents would plead their case for why their particular affiliate/department/competency was different and required something to be changed. This would all then be discussed, moderated and voted upon. More often than not the stupid, nonsensical ideas would be passed, whereas the ones that would genuinely make a difference to the business were blocked. Sure, we would let the Spanish affiliate have its own, Spanish-language, portal. And if we were doing that then we would also let Italy have an Italian-language one, Portugal have their own, and Germany and France have theirs. But would we hold off on all of these and sanction a European (or company-wide) portal that was multi-lingual and customisable? No, not a chance.

Why? Size.

An affiliate that deals with its own portal can budget for its own portal. It can manage its own portal and it can pay for its own portal. If we put something in that is cross European (or global) then money has to be found from other budgets and responsibility for maintenance has to be found too. The fact that the money and resource all come, in effect, from the same big bucket is carefully swept under the table. It becomes a political game.

But in a small company, such as the one that I and millions of other small businessmen run around the world, something like this is a simple, no-brainer. Do we need a portal? Yes. Can we afford it? Yes. Do it! The governance is light and quick and the decision is made pretty much instantly.

The same can be said for process. Every company runs on processes. In smaller companies the process is probably very light and fluid. Checks and balances might not be as necessary as they are in larger companies. But the ability to modify or redesign a process is a lot easier in smaller companies. We can react to market conditions and change direction/strategy/market a lot quicker.

So if small companies can do this so much quicker, why are they not ruling the world? Well, when they start to rule the world they get bigger, and the ability to react as quickly disappears. Companies like Amazon and Google were once small start-ups. They had small staff numbers, small capital and big ideas. They were responsive to what the market wanted and they could pivot on a sixpence if needed. They could fail quickly and move on. This mentality is no longer there to the same extent (although a lot of this is cultural: Google's "Spend 20% of your time on your own projects" ethos has now fallen pretty much by the wayside, for example).

So what are larger companies to do? How can they become leaner and more reactive? The answer is easy (although the implementation is not). To become leaner and quicker you have to... become leaner and quicker. Remove the levels of bureaucracy that slow down changes. Keep the organisational chart shallow enough that you can get the decision makers into a room and make decisions quickly. Put the decision makers at the right place in the organisation. I've talked previously about process owners and the need for them to be at board level. This is an immediate example of why.

Sure, size matters. 

But not in the way you think.

Photo Credit: S@Z via Compfight cc


Entrenched Thinking

I’ve written before on this blog about entrenched thinking - “The Way It’s Always Been Done”. But I think the time is ripe to revisit this topic, because it still occurs a lot more than we would like in companies.

What is it?
The example I give - one passed down from my father - is the dry cleaning business which had a rather erratic and nonsensical route for the delivery truck to take. More details can be found here:

What that story illustrates is that there are decisions which are taken at a corporate level each and every day which are not always based on sound business judgement, but are based on historical reasons for doing things.

That’s bad, right?
Not that there is anything wrong with checking history when looking at why we do things. But businesses must remember that situations change over time, and what was historically true may no longer be something to consider. My other example is related to the insurance industry and can be found here:

Both of these stories have an underlying symptom which is more relevant to today's working environment: Resistance to change.

Resistance to change
Sure, it’s easy to continue doing things the way you’ve always done them. It’s familiar. It’s comfortable. It’s easy. It works. Changing the way something is done can lead to confusion, uncertainty, unfamiliarity, even a decrease in the quality of the final product - but all of these things are just temporary. When I was nearing the end of high school, a classmate had an accident which resulted in him severing most of the tendons in his right arm (the one he wrote with). Overnight he was forced to learn to write left-handed. The initial results were not good, but by the time examinations came around his handwriting was as good with his left hand as it had been with his right.

This shows that change - whilst not always welcome or expected - does not have to be bad. But it does need to be managed.

People need to understand why change has to occur. People need to have help in understanding how to change. Most of all people need to feel that they are being listened to and that their input is being heard.

Of course, this isn’t easy. But research has shown (and my own anecdotal evidence has confirmed) that bad change management is one of the key points of failure in projects.

End users are not often told why things need to change. They are not told what the benefit of changing is. Mostly, though, they are not compensated for following the new behaviours.

It has often been said that what gets measured gets rewarded. If this is the case, then measuring adherence to implemented changes and rewarding users for it will certainly increase uptake. Conversely, punishing users for adherence to the old way of doing things will have a similar effect, but will be viewed in a slightly less positive light.

Change is good - at least, change with the intention of improving things is. Entrenched thinking can be a source of inefficiency, resistance and cost, and needs to be overcome to improve process. The opposite, of course, is also true: needless change for the sake of it isn’t going to win you many friends either.


Filming for TV: Some thoughts on process ownership

Those of you who have been reading this blog for a while will realise that I do, occasionally, like to look into things that happen in everyday life and try to understand the process issues inherent within them. I want to move on to something related, but a little different.


I was fortunate enough recently to spend time working on a new comedy series being produced for UK television. It was filmed (as many comedy programs are nowadays) in front of a live studio audience.

This is different to a lot of things I have filmed in the past, for two reasons:
1) There is a live studio audience!
2) The dynamics of command and control are very subtly different.
Let me explain.

In ‘filmed’ television (‘Downton Abbey’, for example), the director is in charge of the filming production and the first assistant director (1st AD) is in charge of running the set. The chain of command goes Director --> 1st AD --> heads of department. It’s a little slow, but it works and it keeps the unions happy. Filming is usually done with one camera at a time and the final footage is edited together separately for transmission.

Studio television works quite differently. For a start there are more cameras. On the show I did recently there were five cameras running independently. These were big broadcast cameras which didn’t have on-board recording facilities. The feed from each was sent to a control room where they were mixed together by a producer. He is - effectively - editing the show as it is being filmed.

Takes are quite long and complex involving a lot of camera movement and choreography. If a camera isn’t in the right place at the right time the take is blown and we have to go again.

So far, so good. But here’s the rub: I couldn’t work out who was in charge on the set. Sure, there is a director and a 1st AD. These two worked together in a similar way to a film set. But there was also the producer, who was involved in all the artistic decisions because he had to make it all work in the control room. The issue came when the director wanted one thing and the producer wanted another. It became a case of review and decide, cajole and threaten, in order to reach a compromise. And a compromise is never good, artistically.

But the whole discussion got me to thinking about a topic which is close to my heart : Process Ownership. I think it's accepted that processes need to have somebody responsible for them. But is it accepted what the scope of process ownership should be? I don't think so.

I think that process ownership is often equated to project ownership at the senior level. In many cases someone is allocated project ownership at C-level purely as a way of ensuring that the project is seen as having “clout”. In reality the assigned C-level executive has minimal, if any, ‘skin in the game’ for the project.

And so it is with processes.

An executive Vice President for finance might be nominated as the process owner for a process in the finance department, but - in the big scheme of things - have very little, if any, involvement in the day-to-day running or execution of the process. Some would say that this is fine - after all, why would a senior executive need to be involved at that level? But I have a different opinion. I am sure there are arguments that can equate the ROI of having the exec manage a process versus delegating it, and these are all totally valid calculations.

But they miss the big picture. 

Process is not something that happens in parts of an organisation. Process is something that happens across the whole organisation and having someone who can manage that at the organisational level makes a lot of sense. Any lower in the organisation and you start to suffer from the problem of silo mentality and not invented here syndrome. But at the senior level you have someone who has both the executive clout and the mandate to manage a process from start to finish right across the organisation.

However, the logical extension of this is that there are going to be senior executives who manage processes but who will not manage them appropriately. Take, for example, a senior Vice President of Finance who is managing a process which touches more areas than just finance. If a change needs to be made he will, most likely (and politically), favour his own department if anything positive needs to be done, and favour other departments if negative changes need to be made. This is human nature. The simple way to counter this is by following the old guideline of “whatever gets measured gets managed”. If you recompense the finance executive on his ability to appropriately manage the whole of the process, rather than on the results of the finance department alone, this will start to remove any political bias that may exist.

On the other side of things is the issue we experienced in the TV studio, where the process owner is not adequately defined and two people end up with differing ideas about a change. They settle on a compromise, and this is - by definition - less than optimal.

Of course this isn't easy. Nothing at this level ever is - and process even more so, because it covers a larger part of the organisation. But these are the challenges that need to be addressed to make process management work as a competency in your company.

Photo Credit: Corvair Owner via Compfight cc

Reminder: 'The Perfect Process Project Second Edition' is now available. Don't miss the chance to get this valuable insight into how to make business processes work for you. Click this link and follow the instructions to get this book.

All information is Copyright (C) G Comerford
See related info below


In the style of Seth Godin.

If process is so important, and everything that happens in a company is a result of process occurring, why doesn't every company have a Chief Process Officer?

Reason: We haven't built a compelling, cost-effective, and succinct reason to install a CPO.

But the bigger question is: Why haven't we?


BPM Disruption

"asao al palo"
There are certain individuals in the BPM stream who seek to disrupt the status quo. Some of them are blatantly provocative with what they say, others are more subtle and seek to try and influence people. They have made it their mission in life (sometimes as individuals, sometimes as heads of companies) to be deliberately inflammatory in their comments. Usually this is done to provoke a reaction and start a dialogue or a conversation. 

This isn’t to say that they are wrong. Far from it. Many of the ideas they espouse are just what the discipline needs to move itself forward. But in this post I’d like to talk about this disruption and analyse what it means for our particular skill set.

What sort of disruption are we talking about?
Disruption in this sense refers to those bloggers who advocate completely rethinking how we work. They turn accepted principles on their head and attempt to make us consider things from a different point of view.

Some bloggers are less forceful and will suggest alternatives to commonly accepted practices in areas such as workflow design and case management, for example.

The tone taken by these individuals can vary from the basic “Maybe we should be considering…” right to “Why the hell don’t you just wake up and smell the coffee!”.

A simple example of this is the basic ‘What is BPM?’ conversation. It is one that I have had with several people both as an individual blogger and as the head of the now defunct BPM Nexus. We were looking to define BPM as a discipline and unite several different factions within the industry. Sadly it never came together. Others have tried to deal with this with manifestos and similar. But there are bloggers and practitioners who will look at this and say ‘Now isn’t the time to start defining what BPM is currently. We should be looking at what we want it to be and completely redefining the whole concept.’  An excellent challenge, I think, and one which perfectly fits the concept of disruption.

Is the BPM industry in need of disruption?
The question then arises about whether the BPM industry needs such disruption. After all it is an industry which has been in its current state (or very nearly) for several years. Standards have been defined (BPMN, for example), market leaders have been established for software provision, and lots of System Integration companies have made a healthy living from implementing solutions for companies that probably have no idea whether the correct solution is being advocated or not. Does this need disruption?

It’s a great question. History has shown us that industries which maintain the status quo are apt to be swallowed up by those that attempt to disrupt it. Only recently in England two major high-street shopping chains (HMV and Blockbuster Video) have filed for bankruptcy protection after their underlying business model was eroded by companies such as Apple and Amazon. The disruptor identified a potential niche, filled it, and expanded to the point where the incumbents were forced to react. Those that reacted were able to survive; those that didn’t, failed. More recently Microsoft - once THE major player in the world of personal computers and technology - is now languishing as an also-ran in the world of tech, its recent hardware/software launch of Windows 8 and the Surface failing to excite consumer attention. Indeed, Microsoft (and computer manufacturers such as Dell) are barely mentioned as tech players nowadays, with companies such as Google having taken much of the wind out of their sales (sic). No doubt, as we move on, Google will become complacent over time and be usurped by another company with a different business model that attracts user attention.

Is the BPM industry in need of similar disruption? Hard to say. One thing is for certain: if the industry sees itself as immune to innovation there is every chance that it will follow the likes of HMV, Blockbuster and Microsoft and be overtaken by some new innovation.

Is it appropriate - should we tolerate this?
So the question arises ‘Is the disruption of BPM by such bloggers something that should be tolerated?’ On a base level there’s nothing we can do about it. The First Amendment in the US, and similar laws in free countries around the world, allow for anyone to say what they want both in person, on broadcast and print media, and on the internet. So there’s nothing we can do to stop them saying what they want.

But should we listen to them?

That’s the important question. Whether you agree with what they say or not, the key factor is always ‘Do they have anything interesting to say?’. They may be seeking the removal of workflow diagrams, for example, or advocating the dissolution of non-standardised training certifications, or any of a hundred other bug-bears that bloggers (myself included) have written about. You may not agree with any given idea, but if the idea has merit to someone then of course they should be heard. A true civilisation is one where all voices are listened to and all voices are heard. Once we get into the practice of suppressing opinions we are on a long, slippery slope.

Voices with dissenting opinions have always been around in free society. One of the key definers of a free society is how it treats people whose views differ from those advocated by the leaders. We may not like what they say, or we may like what they say but not how they say it, but as has been shown time and time again, “Those who ignore history are condemned to repeat it”.

Can there be too much inappropriate BPM disruption?

Photo Credit: antitezo via Compfight cc


How does BYOD affect BPM?

Tablet Device Comparison

In this post I'll be looking at the concept of BYOD (or Bring Your Own Device).

BYOD is a concept that, frankly, didn't exist a couple of years ago. Mobile devices were laptops and that was, pretty much, all you could use. You took one to a client (or an affiliate/branch site) and plugged it into the network. It became your mobile desk.

But since the advent of the iPhone and the subsequent boom in smartphones and tablets, the concept of someone 'bringing their own device' to a location is a much more diverse affair. (Of course, 'devices' have been around a lot longer than the iPhone. In the attached picture you can see a Palm Pilot, Apple Newtons, and Treos. But the iPhone - and its ability to connect outside the device itself - has really pushed this sector forward.)

I do a lot of work at the 'offices' of a well-known Seattle-based coffee shop, and whenever I go there I link into their Wi-Fi with both my smartphone and tablet and can work pretty much seamlessly between the two. I don't need my laptop, which stays at home 99% of the time nowadays. I know that this is something a large number of 'mobile' workers do.

Connie Moore from Forrester wrote an article on the concept of Bring Your Own Technology back in November. In it she stated:
According to Forrester's Forrsights Workforce Employee Survey, Q4 2011, of nearly 10,000 workers worldwide, 53% bring their own technology for work. The rapid growth of mobile BYOT devices within business is reminiscent of Web adoption during the mid-1990s. After early handwringing and resistance, followed by rapid growth and innovation, the Web emerged as an indispensable tool. No one thinks twice now about using the Web for work. BYOT will follow a similar pattern.

What does this mean for BPM?

As with a lot of 'new-fangled stuff', there are issues to be overcome.

The first issue to be tackled is acceptance.

Connie makes mention of an underlying issue with this concept in many businesses: security. In regulated firms the concept of allowing a 'foreign' machine to attach to the network is anathema. RIM's BlackBerry phones are, pretty much, the only tools exempt from this, and that's because they are controllable by a central IT function. As someone who previously worked in the highly regulated pharmaceutical industry I can align with this sentiment.

In more 'open' firms this is less of an issue. I know creative designers who edit video at a remote location on their iPad, using the LogMeIn app to control their home PC and running renders on their work servers. For them this is a case where the BYOD concept has made them more productive and able to work in places they would never have been able to before. (This particular instance occurred when the employee in question was sitting in a vehicle service reception waiting for his car to have some work done on it!)

Growth factors
A lot of this BYOD expansion has been brought about by increased usage of 'The Cloud' and associated applications. Companies like Dropbox and Evernote have pioneered the ability to produce something in one location on one machine and have it instantly available on all machines at any location. Application developers are linking in to this ability by designing connections to these apps (and others) into their products. This very post you are reading was started on my PC at home, continued on my iPad in a remote location, and finished back on the laptop at home, all through the use of Dropbox and apps that connect to it. The ease with which these tools are now integrated into many people's lives means that their use in a work environment is bound to increase. Wouldn't it be nice to be able to work on a document at your desk, save it to the cloud, and continue work on it on the train home in the evening via Dropbox or some other cloud application? I can see this becoming more prevalent over time.

The BPM world needs to understand this and incorporate it into their workflow.  Many workflow automation tools use email as a way of notifying users that tasks are awaiting their attention within the tool. In normal situations the user would sign on to the app, process the task and sign out. This is fine if you are in the office, but what happens if you want to use your commute, or time at a coffee shop, or a spare ten minutes after the kids have gone to bed, to process these things on your phone or tablet? If the security of the company is set up to inhibit access to your network from non-approved hardware this, effectively, rules that out.

For reasons we have discussed earlier, this is something that will work differently depending on the sort of business you are working in. Highly regulated industries will have to work on finding some sort of alternative. Some of the less regulated industries will probably look at this and understand that there are benefits to allowing user devices on their network. Either way, this is not something that can be ignored.

BYOD is here to stay. The 'phablet' (phone and tablet in one device) is widely rumoured to be the next big thing. As the adoption of tools like these increases (my parents now have iPhones and iPads!), companies engaged in BPM need to look at this and understand the synergies and benefits of allowing BYOD.

Allowing people to use their own devices (if they want) to do things they might not otherwise have done can only be a benefit.

Are you allowing BYOD in relation to BPM? What are the results?

Photo Credit: Jamais Cascio via Compfight cc


10 BPM blogs you should be following (2013 version)

Back in 2010 I wrote a blog post called 10 BPM Blogs you should be following.

Looking back on my stats it is still one of the most viewed posts I have written.

What I wanted to do today was to update the list of the folks I read on a regular basis to make sure you are linking in to the best.

There have been a number of changes in this list, and they basically fall into two categories:

1) Lack of Updates - If you're not updating your web site you probably shouldn't be on this list
2) Low quality updates - If you're updating but your content isn't worth reading then you shouldn't be on this list.

So, without further ado, here is the updated list. (oh, and there's 11 on this list now...)

  • Bruce Silver: Bruce is the daddy of BPMN, has been in the business for years and knows BPMN like the back of his hand (he should do - he helped write it)
  • Jim Sinur: He's been with Global360 and Gartner and he is the industry analyst for the BPM sector. His writing is often formal and rigid, but that doesn't take away from the value of his content.
  • Theo Priestley: He's the Process Maverick, always ready to try and upset the applecart when it comes to BPM. When he talks it pays to listen to what he's saying
  • Sandy Kemsley: One of only two women on the list (which is a discussion point in itself). She attends and presents a lot at BPM conferences around the world and always has some useful insight into the latest movements in the BPM market. Her blog is 'Column 2'
  • The Process Ninja: He's Australia-based and blogs about real-life applications of process. I look forward to his posts.
  • Connie Moore: The Forrester analyst for BPM and the other woman on the list. Finger on the pulse, covers the industry and the general BPM environment.
  • Bouncing Thoughts - Jaisundar from Stanford on BPM, CRM and CPM.
  • Ashish Bhagwat - Posts on BPM at The Eclectic Zone
  • Keith Swenson - writes thoughtful and informative posts on BPM and ACM (adaptive case management) that often inspire long conversations in the comments, and manages to do so without pushing his own company's products
  • Alberto Manuel - If it's pure BPM research you're looking for, Alberto's your man.
  • BP3 Blog - Scott Francis posts regular updates on BPM and associated topics.

If you were in the original list, but have dropped out, don't worry. It happened to me last year when one of my BPM blogging colleagues dropped me from his list. Don't take it to heart, but have a look at what you might want to do to update your blog.



Priorities, priorities...


It occurred to me recently that I haven’t asked a question of my readers for a while - one that will provoke some conversation, I mean.

Obviously there are different forums in which these conversations can take place - and if you want to do so, please continue them there and link to them in the comments so I can follow along.

My question today is focused on all those people who have responsibility for process (process definition, process management, process implementation, etc. - in whatever form you wish to define it: Case Management, Workflow, BPM, etc.).

What is your main process priority this year? (and why?)

Potential answers could include (but are not limited to):

  • Understanding process gaps.
  • Defining process owners.
  • Documenting my processes.
  • Rationalising my current process inventory.
  • Getting senior management commitment for process change.
I’m interested to understand what people are focusing on this year.

If you feel like indicating how or if this has changed since last year, please do.

Thank you.
Photo Credit: Leo Reynolds via Compfight cc


Why our processes aren't perfect.

The Pareto Principle Photo Credit: pshegubj via Compfight cc

A perfect process is - by definition - the ultimate way in which a process can be defined and managed. So you would think that everyone would be striving for the perfect process.

Unfortunately, that isn’t always the case. People tend to strive for ‘good enough’ when it comes to many things. That report you are creating could do with a further grammar check, but we won’t do it. It’s good enough. The decorations on the Christmas tree could do with being tidied up and made absolutely symmetrical, but we’ll leave them as they are. They’re good enough. 

Why is this?
A lot of it is to do with the Pareto rule. The Pareto rule states that 80% of the results come from 20% of the effort; achieving the remaining 20% of the results will take 80% of the effort.

(From Wikipedia: The distribution is claimed to appear in several different aspects relevant to entrepreneurs and business managers. For example:
80% of your profits come from 20% of your customers
80% of your complaints come from 20% of your customers
80% of your profits come from 20% of the time you spend
80% of your sales come from 20% of your products
80% of your sales are made by 20% of your sales staff
Therefore, many businesses have an easy access to dramatic improvements in profitability by focusing on the most effective areas and eliminating, ignoring, automating, delegating or retraining the rest, as appropriate.)

The same happens with processes. An amount of time and energy can be spent reviewing, revising, improving, and implementing processes, but on further review it is discovered that they are only 80% of the way there. To move the process the remaining 20% will take a disproportionate amount of time and energy, sometimes to the point that it becomes economically unwise to pursue.

But is this right?
On a fundamental level the answer is “yes”. If the cost of the additional 80% of effort outweighs the benefit of the remaining 20% of improvement, then ‘good enough’ will suffice. But there are certain points where it becomes less a matter of ROI and more a matter of other things.

Let’s take airplane construction, for example: Would you fly on an aircraft that had only had 80% of the testing done on it? Would you trust an airline navigation system that had only had 80% of the coding tested? What about a surgeon who had only done 80% of his training?

I know I wouldn’t.

These are examples of when the 80/20 rule needs to be ignored and 100% of the effort needs to be put in.

Can you think of any of your processes where this is the same?


The problem with BPM Solutions....

BPM software is big business today. There are, literally, dozens of companies in this particular software space. Gartner classifies some of them on various Magic Quadrants. Forrester classifies them on various Waves. Every day companies are pitched by software vendors to put in the latest and greatest tool to help them.

But are they any better than people?

At a fundamental level, the answer is - obviously - yes. Software well designed to fulfil a particular task will always beat a human doing the same task (providing the task is something that can be automated or computerised - I wouldn't want a computer performing open heart surgery on me - although computers are now driving cars by themselves). A machine which is programmed to deal with workflow simulation is always going to be able to simulate thousands of transactions through a workflow better than a human.

But do we need them? Do we need to purchase a BPM system in order to be able to implement BPM in an organisation? Is it necessary to spend outlandish amounts of money to make BPM part of your everyday workflow? No. It isn’t. 

BPM - at a fundamental level - is the analysis and implementation of ideal process flows. These can be repetitive, or on a case-by-case basis. But they do not HAVE to be implemented via a computerised system. 

Working on basics it is simple enough to perform process analysis using pens, sticky notes and brown paper. This can then be documented in a workflow drawing, or a process flow, or a procedure. Employees can be trained and the new procedures can be implemented. All of this can be done without a dedicated BPM system.

But why would you?

Well, for one thing, you would do this to determine whether you have the right mindset to implement BPM. It is wrong, in my opinion, to think that implementing a tool to solve a problem will solve that problem. Time and time again we see examples of computer systems being put in place to solve problems that they don't actually solve.

The problem itself is, of course, that the problem under consideration is the wrong problem. (There were a lot of problems in that last sentence, eh?) By this I mean that identifying a problem that needs a tool solution is framing the wrong problem. It’s similar to saying “This patient has a heart problem that needs a stent”. NO. The problem is that the patient has blocked arteries. The stent is a solution for that problem.

Likewise companies become enamoured of the software solution to help them become more efficient, more effective, more responsive, whereas the problem may not be that the company is inefficient, ineffective or unresponsive in the first place.

This is why the first step in deciding whether to implement a BPM solution is NOT to decide to implement a BPM solution. It is to decide what the issue is that you are trying to address. Remove the system from the equation. Ask yourself  “If I was doing this manually, what is going wrong?” Understand what is the fundamental underlying item that needs to be changed to make your problem go away.

Losing customers? The problem might be that you need a CRM solution to better handle them. But it might also be that your product doesn’t give them what they want or you aren’t selling it right (See Jeffrey Gitomer for more about that) 

Manufacturing not fast enough? It might be that you are wasting time on a particular piece of the process and creating a bottleneck. But it might also be that your line is old-fashioned or your employees aren't trained to operate it correctly.

In the examples listed above, knowing what the actual problem is can lead to a completely different solution to the one first anticipated.

That's a good thing.


Progress. It is inevitable.

Arri Alexa

I was reading an article recently on the progress being made in the manipulation of electronic data on film sets (I know, right...). In particular, the article was talking about how the current methods of dealing with digital film vary between films, film-makers and equipment. In effect, many high-profile film-makers are creating brand new workflows for each of their films depending on who they are working with and what equipment they are using.

The new breed of digital cinema cameras (Alexa, Red, Sony, Black Magic) are all slightly different in the way they deal with the pixels. As a consequence they all need to be handled in a slightly different manner once the scene has been shot. There is a world of work that might need to be done to take the electronic data stored in a (sometimes proprietary) storage device and turn it into something that can be accessed by an editor on an NLE (non-linear editor) such as Final Cut or Premiere Pro. On top of this, the storage and processing needs are becoming increasingly large. Consider the recent film The Hobbit, which shot 3D scenes at 4K resolution at 48 frames per second, and compare that with, say, Skyfall, which shot on the Arri Alexa in 2D at 24 frames per second with a frame size of a little over 1080p. To put it another way, each second of The Hobbit needed almost 16 times as much storage as Skyfall. That's a huge amount of data to be manipulated.
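That 16x figure is easy to sanity-check with some back-of-envelope arithmetic. (This is a rough sketch only: I'm assuming approximate frame sizes of 3840x2160 for 4K and 1920x1080 for the Skyfall frame, and ignoring the very different raw formats and compression the two cameras actually use.)

```python
# Rough pixels-per-second comparison between the two shoots.
hobbit = 3840 * 2160 * 48 * 2    # ~4K frame, 48 fps, 3D (two eyes)
skyfall = 1920 * 1080 * 24 * 1   # ~1080p frame, 24 fps, 2D

ratio = hobbit / skyfall
print(ratio)  # 16.0
```

The 16x comes from three independent doublings and quadruplings: roughly 4x the pixels per frame, 2x the frames per second, and 2x for the two stereo eyes.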

Added to this is the fact that "The Hobbit" shot on Red Epic cameras, which use a different proprietary format to the Arri Alexa used on Skyfall, and you have a completely different workflow needed to deliver the Bond film than the Middle-Earth tale.

Is this a good thing, though?

Let's go back and look at things as they were before digital cameras arrived. Someone would buy some celluloid film, rent a camera (several types were available but they all did exactly the same thing), expose the film on set, send it to someone like Technicolor to be developed, and edit using the footage sent back from there. It took a day or so for the footage to be developed, but everyone went through the same process. Sure, there were permutations you could follow (Janusz Kaminski famously 'flashed' the film stock on Saving Private Ryan prior to shooting to help create the bleach-bypass look which gave it its distinct appearance. It also locked the film-makers into a specific look throughout the movie), but the process was, essentially, identical the world over.

Then we got to the stage where digital effects needed to be inserted into these movies. This involved an extra couple of steps. The exposed celluloid was scanned into a computer so that the scenes needing digital effects (e.g. a dinosaur being inserted into the scene) could be manipulated digitally. Thereafter the finished scene would be printed back onto celluloid and edited into the film with all the other scenes.

This worked for a few years until the technology caught up again. During the making of the Star Wars prequels, George Lucas specifically shot certain scenes on digital cameras to see if anyone would notice which were which. Apparently very few people did. This allowed him to take the step of shooting the final prequel completely digitally.

That was when the rush started towards digital. Some of you may know that I also spend a lot of time on film and television sets, and have done so for about six years now. Over the last two years I cannot remember a production I have been involved in that has shot anything on film. Everything is now digital. The Arri Alexa is the camera of choice and each one has a small team of operators who are responsible for taking the data on the storage media and processing it so that it is suitable for the editor to review.

In terms of personnel, the camera team isn't much bigger than it used to be. The person responsible for lugging the huge film magazines around has been replaced by someone who is responsible for the digital data cards or drives. But whereas the film used to get sent to a processing lab, the digital storage is now given to a DIT (Digital Imaging Technician) or similar - the people who have responsibility for taking the raw camera footage and backing it up before erasing the storage media so it can be used again later in the shoot. They also take the backed-up data and synchronise the sound (recorded separately), as well as logging the contents of the footage to enable the editor to work efficiently and find what needs to be found. So, overall, the number of people needed on the crew has increased. If you are a low-budget production this means that you either need to get someone else in to provide the service detailed above, or you need to be multi-skilled and have the basic knowledge and equipment to do this as part of your workload.

I shot a corporate video recently where I was both the director and the DIT specialist. I purchased a separate backup device and made sure that at the end of the day all of the footage was offloaded to this device and a separate hard drive (dual redundancy), as well as logging the shots so that editing knew what was where. It certainly made for long, busy days on the film set!
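The core of that dual-redundancy offload can be sketched in a few lines of Python. This is a minimal illustration of the idea, not real DIT software: the function name and paths are hypothetical, and a real workflow would also handle sound sync, shot logging, and would never erase the card automatically.

```python
# Sketch of a dual-redundancy offload: copy every file from the camera
# card to two independent destinations, and verify each copy against
# the original's checksum before the card is trusted for erasure.
import hashlib
import shutil
from pathlib import Path


def sha256(path: Path) -> str:
    """Checksum a file in 1 MiB chunks (camera files can be huge)."""
    h = hashlib.sha256()
    with path.open("rb") as f:
        for chunk in iter(lambda: f.read(1 << 20), b""):
            h.update(chunk)
    return h.hexdigest()


def offload(card: Path, destinations: list[Path]) -> bool:
    """Copy all files on `card` to each destination; return True only
    if every copy's checksum matches its original."""
    ok = True
    for src in card.rglob("*"):
        if not src.is_file():
            continue
        src_hash = sha256(src)
        for dest_root in destinations:
            dest = dest_root / src.relative_to(card)
            dest.parent.mkdir(parents=True, exist_ok=True)
            shutil.copy2(src, dest)  # preserves timestamps for logging
            if sha256(dest) != src_hash:
                ok = False  # a copy is corrupt - do NOT erase the card
    return ok
```

The checksum-before-erase step is the whole point: with only one backup, a silent copy error destroys the footage the moment the card is reused.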

So what has this got to do with process? Well, the sharper minds amongst you will have noticed that there are two facets to be looked at here. The first is the fact that different cameras and different people have different post-processing workflows. In an ideal process world there would be a common workflow across all cameras. In addition to this, we have the issue of different players needing to be involved to enable the process to work (or, at the very least, of having performers with multiple skill sets to enable the work to be done). Neither of these is ideal from a process point of view.

But we must ask ourselves whether this needs to be the case. As technology has evolved, are we in a situation where the cameras themselves can make the workload easier? In a recent blog post the owner of a DIT company said that he expects technology to move on at such a rate that his company will not be needed in three or four years. He expects the camera to have a large amount of this functionality built into it.

Does this mean that the current processes set up to deal with things like this are wrong? I think it means they are immature. Of course the process professionals amongst us would like more standardised handling of digital data - and no doubt the film crews and production companies would also like to be able to handle things the way they did in the old paradigm of celluloid. But until the types of data, the amount of data and the storage media are standardised, there are always going to be some sort of workarounds and camera-specific actions that need to be performed. Does this make the process wrong? No. But it makes it less efficient.

Things will mature and standardise. They always do.

Reminder: 'The Perfect Process Project Second Edition' is now available. Don't miss the chance to get this valuable insight into how to make business processes work for you. Click this link and follow the instructions to get this book.

All information is Copyright (C) G Comerford


From within the walls

I've written on this blog before about some of the issues and problems with BPM as a concept, and with the implementation of those concepts. Mainly the issues boil down to a few key items: lack of correct change management, lack of executive sponsorship, and lack of ownership.

The key thing that these have in common is that they identify something that isn’t there - be it an owner, a manager, an objective. But today I want to talk briefly about something that is usually there - and often in too many different places to be useful: Standards.

Time for the obligatory anecdote:

Back when I worked with a pharmaceutical company we developed a set of standards that would be applied to technology across the organisation. These standards detailed the operating systems we would adhere to, the specification of PCs we would use, the types of devices we would allow to attach to our networks, and so on. This was referred to as "The Standards List", for obvious reasons.

However, there was one part of the company who believed that these standards didn't apply to them. They were always looking to use something different, or not use something on the standards list. They were applying for exceptions on a regular basis and getting these approved. It helped that they were a large part of the organisation with a huge budget and some political clout at the CIO level.

But the result of this was that we, effectively, ended up with two lists. There was "The Standards List" that was used by 90% of the company, and there were the exceptions that were used by the remaining 10% (usually this one functional area).

In effect we had two sets of standards. Which, in effect, meant we had no standards. Because it then became known that if you wanted to bring something into the organisation that was non-standard it could be done because there was a precedent. You only had to say ‘This area uses Mac computers on our networks so we want to use them as well - it’s on the list', and we couldn’t stop you.

Why are standards important?
While each department had a perfectly legitimate reason for wanting to bring in something that was non-standard, it created exactly the problem the standards list was trying to avoid: maintenance and associated costs.

One reason why the company decided on Windows 2000 (at the time) as the standard for desktop operating systems was because we had agreements in place with Microsoft to supply the software and associated upgrades. These were then placed on a disk image which could be quickly and easily replicated onto new machines. Centralised software roll-outs were also provided, which meant that a new piece of software could be rolled out to the 30,000 PCs in the company virtually overnight and under controlled circumstances. All this resulted in lower maintenance costs for the organisation.

However, once we started to allow Ubuntu and MacOS into the company we had to add new steps to our maintenance processes. These steps were exceptions to the standard. They cost time and money to implement and they slowed down software roll-out and burned resources at a time when the company was resource constrained.

Of course these costs were not borne by the department that requested the exceptions. They were borne by the support organisations that had to roll out new software and manage the desktop environment. These costs came out of their budget. They cost the company money it didn't have.

I suppose it would have been easy to take these costs and bill them back to the appropriate department and make them feel the pain (and in fact I believe that's what they did towards the end), but this merely gave the departments justification to take the view that "If you're billing us for maintenance of our own systems, why should we adhere to your standards anyway? We might as well do our own thing". As a result they created a parallel support organisation, which drove up overall costs across the company.

You can see the problem. It all came down to standards. Standards are there for a reason. They have been defined to allow consistency across organisations and to allow ease of use and maintenance. It's the same with process diagram notation. BPMN is a standard, as are TOGAF, Enterprise Architecture and Rummler-Brache. I'm not about to give any credence to one over the other, other than to say that as long as you have made your choice and know what you want, it doesn't matter which one you use. But you must ALL use the same one within an organisation.

This can become particularly onerous when you are part of a company that has recently been bought by another company. You will, no doubt, have your own set of standards for many things in the business. The purchasing company will have similar. The chances of the two being the same are very small. So at some point there has to be a rationalisation. The two standards have to become one. This can either be through wholesale replacement of one set with another, or through the merging of the two into a third, common, standard. Either way the newly defined standard has to be accepted and implemented across the organisation.

But what is more important - and often overlooked - is a process for managing the standards. The set of standards you put in place when a business is two years old working in a manual environment manufacturing goods, for example, will not be the same set of standards used by that company after it has been running for twenty years and has automated many manufacturing steps. The management of that evolution should be in place as well. This is an area that is too often left to chance. Without a standardised process to manage your standards, you will end up with a process of half-measures. But you will have a process.

But I’m not sure it’s a process you really want.
