Showing posts with label Business Process Management.

Wednesday, March 10, 2010

Tips for the Business Process Developer - Activity Data from the Participant's Perspective

This posting is a continuation of the ongoing Process Data Perspectives conversation that I have been having with my Teamworks BPM colleagues...  In a nutshell, we think that you'll create a better process data model if you consider the data from the perspectives of the folks who are actually engaged in the process.  We've identified the following perspectives that are common to most Business Processes:
  • Process Owner
  • Process Analyst
  • Process Participant
  • Process Builder
Today I'll be focusing on the Process Participant's perspective, particularly how they see data within a single Activity:
Participants in the process are the folks who are asked to perform specific tasks. For example, in a Loan Application process, the participants would include the Loan Officers who review and make approval decisions about loan applications.

Participants care about now. Their perspective of process data is immediate. What forms do they need to fill out? What fields are on each form? What are the acceptable values for each field on a form? Participants view data from a task level - What information do they need to consult and what information do they need to gather and modify to accomplish their task.
Given this understanding, let's delve a bit deeper to help translate the Participant's perspective into terms that are more familiar to Professional Programmers.

From the Participant's perspective: When performing a specific activity, the data (information) falls into the following categories:
  1. Information that I am required to provide
  2. Information that I can optionally provide
  3. Information that I am required to verify
  4. Information that I am required to correct
  5. Information that is presented to me to help me with all of the above
There seems to be a lot of nuance in the above list because there is a lot of nuance in the above list.  The deeper you get into process activities, the more likely you are to encounter nuance.

"Information that is presented to me to help me with all of the above":

Let's start our journey at the end of the list: "Information that is presented to me to help me with all of the above".  Tell that to a programmer and they'll cringe.  We need to break it down a bit further into Instructions and Context.

Instructions:

All Activities need instructions. You need to tell the Participant what to do.  Some instructions are "static" - they don't change from instance to instance.  Other instructions are "context sensitive" and only displayed when appropriate.

Here's a static instruction: "Verify that the Postal Code is valid for the City."

Here's a context sensitive instruction: "Because the loan amount is over $200, you must record a co-signer."
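A minimal sketch of how this distinction might look in code - using a hypothetical `build_instructions` helper, not any particular BPM suite's API - where static instructions always appear and context-sensitive ones are guarded by a condition on the instance data:

```python
# Hypothetical sketch: assembling the instruction list for one Activity.
# Static instructions always appear; context-sensitive instructions are
# included only when a condition on the process instance data holds.

def build_instructions(instance):
    """Return the instruction list for a 'Review Application' activity."""
    instructions = [
        # Static: shown for every instance.
        "Verify that the Postal Code is valid for the City.",
    ]
    # Context-sensitive: shown only when the condition holds.
    if instance.get("loan_amount", 0) > 200:
        instructions.append(
            "Because the loan amount is over $200, you must record a co-signer."
        )
    return instructions

print(build_instructions({"loan_amount": 150}))
print(build_instructions({"loan_amount": 250}))
```

If the instruction strings were pulled from a table the Business folks could edit (instead of being hard coded as above), you'd be most of the way to the collaborative improvement idea discussed next.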

Instructions aren't generally thought of as part of a software application's data model (by programmers) since they're generally "hard coded" on screens.  We programmers usually don't worry about where or how they are stored - they're just "part of the code".   In a Process Application we really ought to worry a bit more - The instructions need to come from the Business folks, and those folks really ought to be able to change the instructions when they need to.  This is particularly true when you have "localization" needs (instructions translated to multiple languages)... but there's a wider opportunity here for us to explore.
Wouldn't it be great if the Participants who perform the Activities could collaboratively improve the Instructions that are presented to them?
Phil Gilbert is always talking about BPM on every desktop - and this is a simple example of what that might look like.  Lousy instructions can make a simple task hard to perform.  Great instructions can simplify a complex task.  Give the users a mechanism to provide feedback, and over time they'll transform lousy instructions into great instructions, and process performance improvements might follow.

Context:

All Activities need context.  The Participant needs to know exactly what it is that they are working on.

Here's an example of context information: "The applicant is John Q. Public."

Context information has to come from somewhere... and that's why it's a crucial part of the data model.  Some context information comes from the Process Payload - that's information that's "passed" from Activity to Activity as the process progresses (think of this as an analog to the stuff inside the envelopes that are routed from worker to worker).  Other context information is retrieved from a System Of Record (SOR) when the Activity needs it (think of this as an analog for retrieving something from a filing cabinet when the worker actually starts the task).
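Here's a hedged sketch of that distinction, with an invented `FAKE_SOR` dictionary standing in for a real System Of Record: the payload travels with the process instance, while SOR data is fetched only when the Activity actually starts.

```python
# Hypothetical sketch: context data drawn from two sources.  The payload
# travels with the process instance (the envelope); the System Of Record
# is queried on demand (the filing cabinet).

FAKE_SOR = {"APP-123": {"applicant_name": "John Q. Public"}}  # stand-in for a real SOR

def fetch_from_sor(application_id):
    """Simulate retrieving a record from a System Of Record."""
    return FAKE_SOR[application_id]

def build_activity_context(payload):
    """Merge payload data with SOR data retrieved at activity start."""
    context = {"loan_amount": payload["loan_amount"]}   # from the payload
    record = fetch_from_sor(payload["application_id"])  # from the SOR, on demand
    context["applicant_name"] = record["applicant_name"]
    return context

ctx = build_activity_context({"application_id": "APP-123", "loan_amount": 250})
print(ctx["applicant_name"])  # John Q. Public
```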

Determining which context information should be part of the Process Payload and which should be retrieved from an SOR is both a science and an art that we'll leave for another discussion - for now just be aware that you'll need to iron out these details.

Determining the proper context information to display is on par with determining the right instructions to display - which means that you really need to validate context information with the users.  Paradoxically, context information is often "context sensitive" - complex rules about what to display often arise, based not only on the instance data of the Activity, but also on attributes of the Participants themselves (a Manager may see more or less context data than a Worker).

Once again, this would be a great opportunity for BPM on every desktop - Give the Participants the power to define the context information that they need to see to efficiently perform their tasks.

"Information that I provide, modify, verify or correct":

From our original list of information types I've collapsed several entries for the sake of brevity...  Activities (with very few exceptions) require the user to provide, modify, verify or correct information.  Let's lump all of this data under the term "Manipulated Data".  There's often a blurry line, or no distinction at all, between Context data and Manipulated data, so it might be best to just think of this as the mutable (things that can be changed) subset of the Context data.

As with Context data, Manipulated data can either be part of the Process Payload, or it can reside in Systems Of Record... this is where the underlying architecture of your data systems is really going to come into play... If the Activity involves legacy systems, then System Integrations may be in your future.

Data Validation Rules:

From the Participant's perspective, they may or may not be aware of where specific information comes from, but they should be moderately aware of "what the rules are" around that data.  For example, the Participants must know whether a field is required or optional.  They must also know whether they are supposed to correct information (or ignore mistakes).

Validation of information that users supply or manipulate is a crucial concern that must be addressed - and you should start addressing this concern while defining the process data model.  Each field may have its own validation rules.  Collections of fields may have inter-dependencies.  The rules that are applied to fields may change as the process progresses, and they may change based on "who" is manipulating the field.
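To make that concrete, here's a small hypothetical sketch of declarative validation: each rule pairs a field predicate with a message, and the rule set varies by participant role.  None of this reflects a specific product's rule engine - it's just the shape of the problem.

```python
# Hypothetical sketch: declarative, role-dependent field validation.
# Each rule is (field_name, predicate, message); a field passes when its
# predicate returns a truthy value for the submitted fields.

RULES = {
    "worker": [
        ("co_signer",
         lambda f: f.get("loan_amount", 0) <= 200 or f.get("co_signer"),
         "A co-signer is required for loans over $200."),
    ],
    "manager": [],  # in this sketch, managers may waive the co-signer rule
}

def validate(fields, role):
    """Return the list of validation messages for the given role."""
    return [msg for name, pred, msg in RULES[role] if not pred(fields)]

print(validate({"loan_amount": 500}, "worker"))   # one error message
print(validate({"loan_amount": 500}, "manager"))  # no errors
```

Keeping the rules in a data structure like `RULES` - rather than scattering them across screens - is one way to keep them visible as part of the process data model.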

Purists may not consider the rules that apply to the data to be part of the data model, but pragmatists don't really care about distinctions.  The data requirements of a process are just as much about the data validation rules as they are about the data fields.

The Payoff:

That's a lot of stuff to consider when defining the Data Model for a single Process Activity, but you have to admit that it's mostly common sense from the perspective of the Participant who will be performing the Activity.  If you don't work with the Participants to define the data requirements to this level of detail "up front", then you're likely to have problems crop up when you least expect them.

Walking a mile in the shoes of the Participant at the beginning of your journey will probably make your journey shorter ;-)

Wednesday, March 3, 2010

An Error Occurred - Live With It

Have you ever seen a screen like this?
Helpful... Not.  But the color scheme and fonts are really pleasant.

On this particular screen, when you "Click here for more details" you get the following gem:
Much Better... Not.

This, my dear friends, is an example of what happens when you create a Business Process Definition without Participant Feedback.  If you had put mock-ups of these screens in front of any "real" users they'd say three things:

  1. Who is my "system administrator"?
  2. How do I contact my "system administrator"?
  3. What the heck am I supposed to do with the "more details" that you're showing me?
If you can't answer any of those questions, then you might as well just put up a message that says "An Error Occurred - Live With It".

Wednesday, January 27, 2010

Event Processing and Process Management

I am of the opinion that we in the software industry are obsessed, yet fickle, about acronyms... We insist on coining acronyms, yet we tire of them easily and introduce new ones that (almost) mean what the old ones did.  Must be something in our genetic makeup...

If you are acronym obsessed - today's topic relates to CEP and BPM.  Both relate to Business. The former is all about "Event Processing" and the latter is all about "Process Management".  They are very closely related.

Event Processing is conceptually really simple:  When X happens, do Y.
Process Management is also conceptually really simple: Perform these Activities in this order.

In reality, Process Management is Event Processing:
  1. When the Process Starts, do Activity One
  2. When Activity One completes, do Activity Two
  3. etc. etc. etc.
The primary difference between a CEP system and a BPM system lies in defining relationships between events.  In a BPMN diagram we define a flow of events - an order in which we expect those events to occur.  For example, Activity One always comes first in a process instance... Its completion triggers the start of Activity Two.  We're diagramming the expected sequence of events.

Truth be told, all real BPM solutions have to include a bit of CEP because real business processes must always take into account "out of band" events.  For example, an "Order" can be cancelled at any step of the "Order Fulfillment Process". When an Order is cancelled, we need to perform some cleanup steps.  Based on where we were in the process, we may need to perform different cleanup steps.

We certainly would not want to "mess up" a nice clean BPMN diagram by adding a "Cancel Order" handler to each Activity, but we do have to deal with this eventuality or our solution isn't worth spit.  In general we will model an "Order Cancelled" sub-process - which is kicked off by a "Cancel Order" event.  Sounds kind of like "Complex Event Processing", doesn't it?
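The idea can be sketched as a toy event router (no relation to any real engine): the BPMN sequence flow becomes a mapping from events to next activities, while the out-of-band "Cancel Order" event is handled separately and routed based on where the process currently is.

```python
# Toy sketch: "process management is event processing".  The diagrammed
# sequence flow is just an event -> next-activity mapping; the out-of-band
# cancel event bypasses the mapping and picks a cleanup step based on the
# current position in the process.

SEQUENCE = {
    "process_started": "activity_one",
    "activity_one_done": "activity_two",
    "activity_two_done": None,  # process complete
}

def next_activity(event, current_activity=None):
    """Resolve the next activity for an event; cancel is out-of-band."""
    if event == "order_cancelled":
        # Which cleanup sub-process to run depends on where we were.
        return f"cleanup_after_{current_activity}"
    return SEQUENCE[event]

print(next_activity("process_started"))                  # activity_one
print(next_activity("order_cancelled", "activity_two"))  # cleanup_after_activity_two
```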

Acronyms and catch phrases are great... but they are highly overrated.  When you are building a Process solution for a Business, you're going to touch on CEP, BAM, Business Rules, Business Performance Monitoring, etc. etc. etc.  These are all aspects of what your Business needs to succeed, and you are going to have to deal with all of these aspects.

The challenge is to integrate all of these concepts in a manner that is comprehensible and maintainable by the Business... Your BPM solution must work (of course), but it can't be a magic trick and it can't be a black box or you haven't really met your goals. 

When it's time to modify the Process (notice that I said "when" and not "if") there will probably be a whole new crop of acronyms to confuse things - Don't let your solution get caught in that trap.

Tuesday, January 26, 2010

Bonitasoft - BPM Game Changer?

Bonitasoft rolled out their Bonita Open Solution 5.0 this morning... declaring it a "game changer" for BPM.

I like Bonitasoft's approach: Keep it simple. Make the common tasks easy and the complex tasks possible.  Don't distract the process developers with any unnecessary "under the hood" details.

They've done a great job building an excellent suite that really feels like a seamless solution... and I'm sure that many folks are going to love it.

That said, there's nothing "game changing" in any of Bonitasoft's features.  "Nothing new here" in terms of being able to build anything with Bonitasoft that you couldn't already build with a number of BPM suites, both proprietary and open source.

The "game changing" aspect of Bonitasoft isn't features - it's marketing.

Bonitasoft builds on the rich legacy of the ObjectWeb open source projects - many of which have been in commercial use for years.  The huge difference between Bonitasoft and its ancestors is packaging... both the packaging of the software components themselves and the packaging of the BPM message.

With most open source projects, if you want to use them you have to engage very skilled developers to install and configure the various packages that you want to use.  You have to spend a lot of time just getting the development infrastructure set up.  You have to spend a lot of money before you can even begin to work on your actual problem.

Contrast that to Bonitasoft where everything that you need is in a nice little bundle - ready to go in a few minutes.  That's a huge win on the Technical front.

A win on the Technical front is great - but that's not where the battle is.  The battle is on the Business front.

Despite what most IT folks would like to believe, it's really the Business that picks the solution in the BPM space.  Business comes first in BPM, so you have to convince Business first.

This is where Bonitasoft's marketing comes in - Their marketing team is transforming a Techno-Geek centric open source tool into a Business-Centric save-your-company tool.  That's what changes the game (if they pull it off).

I work for a company with a great proprietary BPM suite... and our suite is getting better all the time... but I really wish Bonitasoft success.  By lowering the barriers to entry in the BPM space they are helping to spread the message and educate the masses.  They are helping to raise the BPM tide, and a rising tide raises all boats (both large and small).

Good job Bonitasoft, and best wishes.

Wednesday, January 6, 2010

Multi-Manager Workflow Solutions

If you've worked with Filenet, Alfresco, or any number of Content Management suites, then you've most likely encountered their embedded workflow capabilities. It's very common to enforce some sort of process around adding, changing, and deleting content from your corporate system of record - so embedding workflow capabilities in these products just plain makes sense.

If you've worked on a BPM project that involves a wider scope than your corporate management system, then you've probably had to answer the dreaded question:
Which product should I use to implement this process?
My first encounter with this conundrum was with an Identity Management product rather than with a Content Management product, but the factors to consider were pretty much the same.

The process that we needed to implement involved employee credentials that were managed by Sun's Identity Management product. The Identity Management product's embedded workflow features weren't enough to handle the whole process, so we were left with the choice of implementing everything in Lombardi's Teamworks or implementing the "master process" in Teamworks and key sub-processes in the Identity Manager.

Our original approach was to go with a pure Teamworks solution to keep all the process logic in a single system, but during implementation we changed our minds.  Using Identity Manager's embedded features for a few key sub-processes shortened our development cycle, so we went with it.

In an ideal world - at least from my perspective - Process (workflow) Managers would be decoupled from specialized products.  The specialized products would all expose Services (both Human-Powered and Automated), and the external Process Manager would choreograph and orchestrate those Services.

Unfortunately, my perspective is far from ideal if you are the vendor of a specialized product.  Leaving the process and workflow elements of your solutions to an external product just doesn't work.  Your product needs to provide as near an "Out Of The Box" solution as is possible, or people won't buy it.  You could (of course) partner with a Process Manager vendor, but that requires inter-vendor collaboration... and the end result probably won't be as tailored to your specific domain as you'd like.

Consequently, product vendors will embed workflow capabilities into their products, and we'll be left to answer the dreaded question:
Which product should I use to implement this process?
Clearly, it's not unreasonable to assume that you may have to deal with multiple "Process Managers" to implement a single process...  Teamworks may handle most of the process, Sun Identity Manager may handle "identity" aspects, and Filenet may handle "Document" aspects.  What's needed is a standard API for each Process Manager to use when interacting with another Process Manager...  Here's a partial list:
  • Start a Process Instance
  • Signal that a Process Instance has completed
I could easily come up with many more APIs... but these two are the biggies.  If I can start a process, and get notified when a process completes, then I can incorporate any process that's managed by an external Process Manager into a process that I am managing myself.
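A minimal sketch of those two APIs - with an invented `ExternalProcessManager` class standing in for Process Manager "B" - might look like this: "A" starts an instance on "B" and registers a completion callback so its own parent process can resume.

```python
# Hypothetical sketch of the two core inter-manager APIs:
#   1. Start a Process Instance
#   2. Signal that a Process Instance has completed
# ExternalProcessManager is a stand-in for a remote engine ("B").

class ExternalProcessManager:
    """Stand-in for a remote process engine exposing the two operations."""

    def __init__(self):
        self._callbacks = {}

    def start_process(self, process_name, on_complete):
        """API 1: start an instance; remember who to notify on completion."""
        instance_id = f"{process_name}-1"
        self._callbacks[instance_id] = on_complete
        return instance_id

    def signal_complete(self, instance_id, result):
        """API 2: notify the caller that the instance has completed."""
        self._callbacks.pop(instance_id)(result)

completed = []
b = ExternalProcessManager()
iid = b.start_process("identity-subprocess", completed.append)
b.signal_complete(iid, "approved")
print(completed)  # ['approved']
```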

So far, so good - but I'm not quite done yet with my wish list...  I'm fine with Process Manager "A" handling the overall process, and with Process Manager "B" handling some key sub-processes... but only if Process Manager "B" keeps Process Manager "A" updated on what's going on.

For example - Let's assume that "A" kicked off a sub-process with a dozen activities that's handled by "B", and for some reason it's taking more time to complete than expected.  What's going on?  Which of "B's" tasks have been completed?  Which of B's tasks are left to complete?  Enquiring minds want to know :-)
 
If users have to switch between process managers to get a clear picture of "who is doing what" (or to claim tasks) then they're not going to be happy.  Multiple process managers are fine "under the hood" - but keep the hood closed and let the users concentrate on driving.

Friday, January 1, 2010

Missing pieces

I came across an article by David Rotman that describes W. Brian Arthur's efforts to understand why it takes so long to commercialize new technologies... and I was drawn to the following statement of why "lab on a chip" technologies weren't commercialized back in the 90s:
"The problem, as Arthur might put it, was that the toolbox was missing key pieces."
It occurs to me that this is the reason the status quo in software development remains the status quo... New and better approaches are all around us, but the incomplete nature of a new approach leads to its rejection.

Case in point, a few years ago I penned my frustration that Java developers resisted the adoption of BPM technologies on my java.net blog.  I got a huge number of mostly angry comments... but they all really boiled down to missing features in the BPM tools.  The BPM tools were more primitive than the "state of the art" development tools, and those missing features invalidated the advantages of BPM.

The BPM toolboxes were missing key features.

A lot has happened in the ensuing years, and the BPM toolboxes are much, much better... but I still wonder if there aren't just a few key features that we're missing that would really make the "Business drives IT" promise of BPM a reality.  It's great that "real" programmers are now comfortable with BPM suites, but I maintain my allegiance to making a BPM suite as natural for a Business person to use as a spreadsheet.

What key pieces are missing?  What are the few nifty tools that will enable those "Power Business Users" to keep control of their BPM projects?

I've got my own thoughts on the subject... but I'd love to know what others think.

Friday, December 18, 2009

Departmental BPM joins Enterprise BPM - IBM Acquires Lombardi

I agree with Forrest Gump's assessment:

"I don't know if we each have a destiny or if we're all just floatin' around accidental like on a breeze... but i think, maybe it's both. Maybe both are happening at the same time..."

Whether by destiny or by accident, this week IBM announced that it will acquire Lombardi Software and my professional life got a whole lot more like a box of chocolates.

As a Lombardi employee there's not a lot that I am at liberty to say about this, and I don't really know any of the details anyway, but I wanted to throw in my two cents regarding the mention of "Departmental" versus "Enterprise" BPM in the announcement.

There are many, many approaches to BPM adoption in a company, but if you just look at the ends of the spectrum you can categorize the extremes as "Top Down" versus "Bottom Up" endeavors.

"Top Down" BPM is a great idea - That's where your company executives make a firm transformation commitment for the entire enterprise.  Beyond just rolling out a few managed business processes, the entire IT infrastructure of the corporation needs to be analyzed, expanded, and updated.

"Bottom Up" BPM is also a great idea - That's where you have an operational problem that could really benefit from the application of BPM, so you take the bull by the horns, attack the problem, and deploy a managed business process to save the day.

The "Top Down" approach is most likely going to be an SOA based - ESB centric - BPM solution.  A lot of hard-core, high-powered Software Engineering is involved when your entire company depends on the solution - and that sounds a lot like IBM to me.

The "Bottom Up" approach is much more of a "just get it done" approach.  You need to deploy a managed business process as soon as you can.  Stop the bleeding now.  Once you've got the process deployed, you can evolve the solution over time... but time's money and you've got to do something now - that sounds a lot like Lombardi's Teamworks to me.

It's not really a technology issue - Lombardi's solution scales quite nicely.  It's a methodology issue...  Some tools really enhance the "Top Down" (Enterprise) approach, while others really enhance the "Bottom Up" (Departmental) approach.

Offering BPM tools that support both types of projects seems like a pretty good idea when you think about it, because (as Forrest muses) "maybe both are happening at the same time".

Tuesday, December 8, 2009

Process Debt and the Business Process Developer

My friend Scott Francis has authored an excellent blog posting on "Process Debt and BPM".

Scott's definition of Process Debt focuses on when our organizations fail to adapt to a changing business climate:
"If you rolled out a process two years ago and haven’t made any tweaks in the meantime, I believe you have acquired process debt – a steady, growing gap between what your software and processes are designed to handle, and what the reality of current business conditions requires." 
That steady, growing gap between what your software does and what your business needs is inevitable and inexorable. It's the real demon that business programmers need to slay.

My work is primarily with folks who are embarking on their first BPM project...  I help them implement their first managed business process with as little pain as possible, and I always goad them to "just get it done".  Process Applications aren't supposed to be pretty, and the longer it takes to build and deploy them, the longer it will be before your business realizes a return on their investment.

Despite preaching "just get it done", we mustn't forget that the process application that we are building today won't be what the business needs next year.  The business universe is in constant flux, and our solutions will likely be dated as soon as they are deployed.  We have to build our process apps with an eye to changing them in the future.

As Business Process Developers we really have to exert that extra effort to keep the linkage between each Process Requirement and how we have implemented that Process Requirement as clear as possible.  If you can't pinpoint how a requirement has been implemented then you can't easily adapt your code when that requirement changes.

Even with a great BPM suite, it's really tempting to implement process logic "wherever you need it".  For example, how many times have you embedded client-side JavaScript on a web page to implement a process rule?  I'm certainly guilty of doing this, and I know better.  If you aren't careful, your process logic will end up scattered across web pages, stored procedures, AJAX services, etc.  Your BPMN diagram may be clean, but under the covers it's a mess.

Hidden process logic may be expedient, and it may even be necessary, but it's the single biggest obstacle to continuous improvement of your process applications.  Be sure that you know where your process requirements are implemented, and put it in writing for those folks who will inherit your code down the road.

If we maintain a clear linkage between our process requirements and the "code" that implements those requirements, then the cost of adapting our process applications will be lower - and if the cost of adapting our process applications is lower we should be able to pay off those process debts a whole lot sooner.

Process debt will happen... it's up to us as developers to reduce the cost of addressing those debts.

Monday, November 23, 2009

How to make Sudoku boring - Follow the Process


I spend a lot of time travelling to and from client sites, and when I travel I often pass the time by playing Sudoku on my iPhone (I use a very good version from Mighty Good Games).  Once upon a time I relied on the paper versions in the airline's magazines - but the iPhone app is now my norm.
One incentive for playing Sudoku was to keep my little grey cells from getting lazy - but it is also a fun way to pass the time, so I was rather surprised last year when my friend Roberta told me that she had given up the game after just a few months of playing. According to Roberta - who is a math teacher - the game wasn't challenging any more.  She'd figured it out, and the puzzles were no longer puzzling.

My first reaction to Roberta's statement was typical for me - I just assumed that the Sudoku puzzles she had access to weren't "hard enough".  Many, many times I would finish a flight with an unfinished puzzle from the American Way Magazine staring me in the face.  I certainly hadn't "figured out" Sudoku.

Then in March I saw an article in USA Today about J. F. Crook and a Sudoku "solution".  Crook is a mathematician, but his "solution" isn't a mathematical proof - it's a series of steps.  I'm a process guy - so when you say "series of steps" I say "process".  Crook had published a process definition for solving any Sudoku puzzle.

I read Crook's process definition, and it wasn't that far from recommendations that I had seen earlier.  Sudoku puzzles are grids of 81 squares, nine across and nine down. Some squares have a number filled in; the rest are blank. Players must fill in the blank squares with numbers between 1 and 9 without repeating any number in a row, column, or any of the nine interior 3-by-3 boxes of the puzzle.

Crook's Sudoku Solving Process can be summed up as follows:  Mark all the squares in the puzzle that could be a "1" - based on "what could be a 1", apply some simple rules to "solve" a few squares.  Repeat this process for numbers 2 through 9.  Crook admits that you may still find some squares that might be one of two numbers, but you just "guess" a number for that square and proceed until your guess is either confirmed or invalidated.
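The "marking" step can be sketched in a few lines of Python: for each blank square, compute the digits still allowed by its row, column, and 3-by-3 box.  The grid below is a well-known sample puzzle (0 for blanks), not one from Crook's paper.

```python
# Sketch of the candidate-marking step: which digits could legally fill
# a given blank square?  The grid is a 9x9 list of lists, 0 for blanks.

def candidates(grid, row, col):
    """Return the set of digits that could legally fill grid[row][col]."""
    if grid[row][col] != 0:
        return set()                               # already solved
    used = set(grid[row])                          # digits in the row
    used |= {grid[r][col] for r in range(9)}       # digits in the column
    br, bc = 3 * (row // 3), 3 * (col // 3)        # top-left of the 3x3 box
    used |= {grid[r][c] for r in range(br, br + 3) for c in range(bc, bc + 3)}
    return set(range(1, 10)) - used

grid = [
    [5, 3, 0, 0, 7, 0, 0, 0, 0],
    [6, 0, 0, 1, 9, 5, 0, 0, 0],
    [0, 9, 8, 0, 0, 0, 0, 6, 0],
    [8, 0, 0, 0, 6, 0, 0, 0, 3],
    [4, 0, 0, 8, 0, 3, 0, 0, 1],
    [7, 0, 0, 0, 2, 0, 0, 0, 6],
    [0, 6, 0, 0, 0, 0, 2, 8, 0],
    [0, 0, 0, 4, 1, 9, 0, 0, 5],
    [0, 0, 0, 0, 8, 0, 0, 7, 9],
]

print(sorted(candidates(grid, 0, 2)))  # [1, 2, 4]
```

A square whose candidate set has exactly one member is "solved"; repeating the marking after each solved square is, in essence, the tedious-but-reliable process Crook describes.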

The problem with Crook's Process is that it's really tedious. While solving a Sudoku puzzle you'll often have insights or spot patterns that lead you to solutions for a whole bunch of squares at one time.  Consequently, when I read Crook's Process I said to myself "That's nice" and went on solving my puzzles the same way I always had.

My method for Sudoku was to "solve" the obvious squares first, then scan the whole puzzle looking for the less-obvious relationships between the squares.  Using that rather random process I could solve many "Moderate" puzzles in ten to fifteen minutes, but I was also frequently left with puzzles that I just couldn't crack.

When I switched from paper Sudoku to iPhone Sudoku I initially followed my old-but-not-reliable process... but the iPhone app's features opened my eyes to "the better way".

With paper and pencil it's really messy to follow Crook's Process.  There's a lot of erasing that goes on, and it's really easy to make mistakes.  With the software version I can mark/unmark squares to my heart's content without reducing the legibility of the puzzle.  The software version also immediately flags my mistakes, so I don't multiply those mistakes by continuing on with bad data driving my subsequent decisions.

A little bit of software can make it a lot easier to follow a process - Who would have imagined that? :-)

Even with my newfound ability to make notes on the puzzle - I still stuck to my old process for the most part.  I'd use my process to solve as much as I could, and then resort to Crook's process... and sure enough I would still get stuck with puzzles that I just couldn't solve.  This didn't happen often, but it happened.

I should mention that up until this time I was limiting myself to the "Moderate" puzzles on my iPhone. It's very unsatisfying to have an unsolved puzzle when my plane lands, so I stuck to the relatively easy ones for the most part.

On a recent flight I decided to tackle an "Expert" Sudoku puzzle - and it was a bear.  I was so intimidated that I followed Crook's Process.  I diligently marked all the "1's", then the "2's", etc., and sure enough in 30 minutes I had solved the puzzle.  I tried again with a new puzzle, and sure enough, in 30 minutes I had solved that one too.

Since that time I have improved a bit, I'd say my average is about 25 minutes, but sure enough I can solve any puzzle as long as I keep to the process.  Just like Roberta, the puzzles aren't really puzzling anymore.  Give me 30 minutes and I will solve the puzzle.

The trade-off here is that I won't be able to solve any puzzle in less time.  The process is tedious and it's not easy to speed it up.  I sacrifice "brilliant flashes of insight" for dependable, predictable, and plodding.  The process makes Sudoku boring.

As always I have to turn this experience into an analogy for the domain of Managed Business Processes:  I could have followed Crook's Sudoku Solving Process all along - but a little bit of software really made the process easier to follow. Software made each step of the process easier, but it didn't force me to stick to the process - If the software had managed the process (forced me to stick to the process) I would have achieved the goal of consistently solving Sudoku puzzles sooner.

You might be saying - "Yeah, but it also made it boring".

That's certainly something to consider when introducing a Managed Business Process into your organization... We don't want to turn our process participants into mindless drones.  We don't want to lessen the importance of creativity and insight - but we do need to make things more predictable and ensure success.  Finding the balance between predictable process and bored employees in your organization is a puzzle that won't ever get boring.

Friday, November 20, 2009

Tips for the Business Process Developer - Process Data Perspectives

Those of us who build Managed Business Process solutions for a living have to spend a lot of time thinking about data. Our BPM suites (like Teamworks) make it a snap to define Process Flow and Routing, but we still have to make sure that each Participant in the process has access to all the data that they need to accomplish their tasks.

I always insist that folks implement their process flow first. Build a place-holder for each Activity in the process, connect them up with flow lines and any necessary Decision Gateways, and add in just enough process data and build just enough user interface to make sure that you can play back each step through every path in the process with your business folks.

Building this much of your process first has huge paybacks - There's nothing like stepping through a process to validate whether or not the business folks agree that you've got the process right. Validate the process requirements first - worry about everything else later.

Once you have this first playback of the process done you've got to turn your attention to defining the rest of the process data... and suddenly it's not as clear how to proceed. Where do you start?

The data aspects of a Process encompass many things that you might not consider when developing other types of applications. There are several common process data patterns to consider - processes gather, transport and transform data across activities that are performed by multiple participants. Data models often dynamically evolve and transform as the process proceeds.

I've had several recent discussions with Fahad Osmani and some of my other Lombardi colleagues about how to effectively develop process data models. We've all implemented a number of managed business processes - We've all thoroughly mastered the craft of implementing processes - But we haven't really formalized our approach for designing process data models. Knowing in your gut what to do is great, but formalizing that into something that you can share with others is better.

Here's what we've come up with so far:

In every business process you've got a variety of actors with a variety of perspectives that you have to deal with - we've abstracted this into the following list:

1. Process Owners
2. Process Analysts
3. Process Participants
4. Process Builders


We've found it best to design a process data model from each perspective in this order, and then consolidate those models into a consistent unified model.

The Owner's Perspective of Process Data:

Start with the owner's perspective of process data - the folks with overarching concerns to make sure that the outcome of a process is achieved. If you're working with a Bank on a Loan Application process, these are the managers of the Loan Department who are held accountable for efficient and effective handling of the loan applications.

Owners have a business perspective on the process data. They've got the big picture business view of the process payload - What are the basic business objects that the process deals with? What data is being gathered? How is the data transformed during the process? What happens to the data when the process ends?

Owners often see process payload data as starting points and outcomes. Their major concern is with the state of the business data "now". They care about the state of the business objects when the process starts, the state of the business objects at key milestones, and the state of the business objects when the process finishes. The owner's data model provides that information.

Beyond the payload data, owners also perceive data in terms of process rules - What data determines which activities to perform? What data determines when a task is due? What data determines who should perform a task? The owner's data model answers these questions.
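To make that concrete for the programmers in the audience, here's a hypothetical sketch of the rule-driving data an owner's model might surface for the Loan Application example. The field names and thresholds are invented for illustration - they don't come from any particular BPM product:

```python
# Hypothetical sketch: owner-level data answering the three rule questions --
# which activities to perform, when a task is due, who should perform it.

def routing_decisions(loan):
    """Derive routing, due date, and assignment from owner-level data."""
    return {
        # What data determines which activities to perform?
        "needs_senior_review": loan["amount"] > 500_000,
        # What data determines when a task is due?
        "due_in_days": 1 if loan["expedited"] else 5,
        # What data determines who should perform a task?
        "assigned_group": "commercial" if loan["type"] == "business"
                          else "consumer",
    }
```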

The Analyst's Perspective of Process Data:

Process Analysts are the folks who look at historical process metrics to figure out if the process can be tweaked to work better. In some cases the Owner may also be the Analyst - but the Analyst perspective really is different. In a Loan Application process, the analysts are the folks who evaluate the effectiveness of the process itself in the hopes it can be improved in the future.

Analysts deal with understanding the past to improve the future.

Analysts view the process data as snapshots in an album. Their view of data is intricately linked to the process diagram - The snapshots are meaningless to them unless they know where the snapshots were taken.

Analysts care about how long things take and why some things take longer than other things. Analysts care about "rework" (tasks that have to be redone). They don't really care much about what's happening now - they care about what happened over some time period. To meet the analyst's needs, process context data (and time stamps) are stored at specific points in the process. Making sure that the right snapshot is taken at the right time is their major concern. The analyst's data model works in conjunction with "snapshot events" in the process flow to capture the information the analyst needs.
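A "snapshot event" might be sketched like this - a timestamped copy of selected context fields, tagged with the point in the flow where it was taken. The structure is a hypothetical illustration, not any BPM suite's actual API:

```python
# Illustrative sketch of a snapshot event: capture a timestamped copy of
# the analyst-relevant fields at a named point in the process flow.

import copy
import datetime

def take_snapshot(snapshot_point, process_context, fields):
    """Record selected process context data at a specific point in the flow."""
    return {
        "point": snapshot_point,  # meaningless unless we know where it was taken
        "taken_at": datetime.datetime.now(datetime.timezone.utc),
        "data": {f: copy.deepcopy(process_context.get(f)) for f in fields},
    }
```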

The Participant's Perspective of Process Data:

Participants in the process are the folks who are asked to perform specific tasks. Sticking with a Loan Application process, these would include the Loan Officers who review and make approval decisions about loan applications.

Participants care about now. Their perspective of process data is immediate. What forms do they need to fill out? What fields are on each form? What are the acceptable values for each field on a form? Participants view data from a task level - What information do they need to consult and what information do they need to gather and modify to accomplish their task.

Some information is passed to the participant when they are assigned a task, and some information they look up - but it's all information that is necessary to accomplish a task. Making sure that they can access data and modify the data that they need is their major concern.

One special caveat when considering the Participant's view of data is to ask whether or not any of the data that the Participant changes must be "immediately" seen by others. Data visibility requirements like this are critically important to the Process Builders.

The Builder's Perspective of Process Data:

Finally we get down to the Process Builder's view of the process data - that's us. We really can't do our work until we understand what the Owners, Analysts and Participants need.

Builders care about the actual structure of the data. What data are we going to pass around and how are we going to pass the data around? What data needs to be stored and retrieved and how are we going to store and retrieve the data? How are we going to meet the data visibility requirements?

The Owner's perspective helps the Builder define the major business objects that the process will deal with. The owner's data model also identifies the data that is used to determine which paths are taken, how long each task should take, and who should perform each task.

The Analyst's perspective helps the Builder define the data structures for the snapshots that are taken throughout the process - and informs the Builder where in the process the snapshot needs to be taken.

The Participant's perspective helps the Builder define the process payload (the data that is passed to and from tasks) and it helps them define the data system architecture that is required to retrieve and modify data within activities.

The Payoff:

The end result of this multi-perspective approach is a model for process data that just plain works.

It takes a bit more up-front effort, but you are less likely to get blind-sided by unanticipated data requirements popping up late in your development cycle. The data requirements may change, and any requirement change may still require substantial rework - but odds are the rework will be much easier for you to accomplish.

Clearly understanding requirements from the perspectives of the process actors always pays off - and this axiom is doubly true when data is involved.

Wednesday, July 22, 2009

Get the Words (and Notation) Right

Dan North (the chief evangelist for Behavior Driven Development) loves to say:
"Get the Words Right"

In the context of creating software, if you don't know what somebody wants you to build, then you can't build it for them... That statement is true for lots of contexts but let's stick to software for the moment.

If the person who wants something built uses terms that you don't fully understand, then you'll have to translate those terms into something that you do fully understand. Translation is never perfect, so you're bound to screw up something. And when you're ready to explain what you're building to the client (to confirm that you are in fact building the right thing), if you use terms that the client doesn't fully understand, then they'll have to translate your words - and they are likely to misunderstand what you're building.

Sometimes a 3rd party will be on hand who understands both the client's and the builder's terms (I do that a lot)... but then you are dependent on the translation skills of the 3rd party. The good news is that both you and the client can blame the 3rd party, but your software still stinks.

Dan's solution to that problem is to spend more time up front agreeing on a common vocabulary between builder and client... with a definite prejudice towards adopting the client's vocabulary whenever possible.

Seems pretty much like common sense, but it's very hard to actually do it.

Many programmers just don't really want to understand their client's businesses... they're excited about programming, not business. Business concerns are mundane and boring compared to the challenges that they face when creating software... in their opinion. Many business people are just as bored by technical details as programmers are by business details.

It's hard - but in truth it's the easiest way to succeed. When everyone is on the same page, building projects are actually fun... and the "buildings" are really fun to "live in".

In the Business Process software domain, we have a common notation for expressing the "flow" aspects of processes - BPMN. BPMN is relatively simple to learn and it has proved to be a very effective means for Business and Programmer to communicate about the "flow" aspects of their business processes. Unfortunately, BPMN isn't good at all when communicating all of the other aspects of a business process (data requirements, routing rules, etc.) but it's a good start.

BPMN is Notation rather than words... but Dan's point is hugely relevant - you have to get the Notation right.

I've worked with BPMN for a few years, and I think the Notation has elements that just aren't right. They don't convey the meaning of the symbol to most Business people.

Here's my least favorite symbol - the Multiple Instance Activity:

When you see this symbol, you are supposed to know that many copies (instances) of this Activity will be performed at the same time. You "know" this because there are those two little parallel lines on the Activity - They're parallel to each other, so of course that means there are parallel Activities taking place.

I am not a Business person, but I doubt that many would describe Activities-That-Happen-At-The-Same-Time as parallel Activities. If someone out there knows of a study that proves me wrong, please tell me.

I think a better symbol would be something like the following:

With this symbol everyone can see at a glance that Multiple-Things-Are-Happening (hopefully at once). I'd also add some indicator of the number of "Things"... although often that's only determined when the process is running.

I suspect that the current Multi-Instance symbol was chosen because it is easy to draw by hand. Fair enough... but I find that most of my hand-drawing is done on white boards, and I almost always draw something that looks a lot more like my proposed symbol.

Fortunately, with the current state of BPMN editors it is usually pretty easy to swap out the "official" notation's symbols for your own... There are much bigger problems to solve in the realm of conveying Business Requirements to Programmers than this little icon - so please don't waste any time lobbying for it to change. I'm just using it to illustrate the point:
Words Matter - Notations Matter - Diagrams Matter

Pictures are worth a thousand words, so take the time to get the picture right before you start building your next project.

Tuesday, July 14, 2009

The Toilet Paper Replenishment Process

Don Norman has an excellent blog posting about the poor design of many Toilet Paper Dispensers - focusing mainly on the poor algorithms that we simple humans use when given a choice of which roll to use from a multi-roll dispenser... Multi-roll TP dispensers prove Don's thesis that many everyday things just aren't designed well. I had read Don's book and seen his blog long ago, but it came back to mind when I encountered the following in a bathroom in Stanford University's Terman Engineering Building...




No less than six partial rolls of toilet paper perched on top of a standard two roll toilet paper dispenser.

Obviously Don is right about the design of the dispenser itself... Given the choice of which roll to use we'll pull sheets off each. But obviously there's something more at work here: The process for refilling the dispenser is suspect.

I suspect the process is something like the following:


Check the dispenser each day. If either of the rolls is less than 1/4 full
replace it with a full roll and place the partial roll on top of the dispenser.


The objective of this process is to make sure that there's always plenty of toilet paper available... Obviously you don't want to run out.

My guess is that the partial roll is left on the dispenser in the hopes that folks will use it - This is much better than throwing it away, and given frugal college students I'm surprised that the partials aren't disappearing.

The flaw in the process seems pretty simple... "Enough Toilet Paper on Hand" doesn't take into account the partial rolls. Instead of basing replacement on the size of the rolls installed in the dispenser, it really ought to be based on the total amount of tissue on hand - the rolls in the dispensers plus the partials.
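To make the flaw concrete, here's a sketch of the two rules side by side. Roll sizes are expressed as fractions of a full roll, and the numbers are purely illustrative:

```python
# The Stanford rule vs. the "total tissue on hand" rule.

def flawed_rule(dispenser_rolls, partials):
    """Restock whenever any installed roll is less than 1/4 full."""
    return any(r < 0.25 for r in dispenser_rolls)

def better_rule(dispenser_rolls, partials):
    """Restock only when the total tissue on hand drops below one full roll."""
    return sum(dispenser_rolls) + sum(partials) < 1.0
```

With one nearly-empty roll installed but six half-rolls perched on top, the flawed rule says "restock" while the better rule correctly says there's plenty on hand.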

At first glance this appears to be a classic example of mis-stating the objective of the process. The janitor who refills the dispenser has been told something like "No roll on the dispenser should be less than 1/4 full". The real objective should probably be something like "There should always be the equivalent of a full roll on hand".

Having restated the objective: What if you ended up with an "equivalent full roll" made up of sixteen partial rolls? That certainly wouldn't be acceptable to most folks.



Returning to Don Norman's point - the design of the dispenser itself is really at the heart of this "process" problem. The process for refilling the dispenser is overly complex due to the poor design of the dispenser itself. Try as we might to give proper instructions to the janitor, success is going to be highly dependent on the janitor's judgement on whether or not enough toilet paper is on hand.

What can we learn from this example?

If a process that should be simple isn't, then look beyond the process definition: It may be better to buy a new dispenser than to figure out how to refill the one that you have.



UPDATE:


Prompted by the comment of a good friend and Stanford Alum I delved a bit deeper into the conundrum and sadly discovered that this is both a Process and a Design problem... In another bathroom at Stanford I discovered the following shocking sight:

The superior design of this multi-roll dispenser has been thwarted by the installation of two such dispensers... The point of the design was to eliminate choice (thus simplifying the refill process) - introducing a second dispenser nixes that advantage... and as you can see the "refill process" is still needlessly removing partial rolls.

Such a thing would never have happened at my alma mater (Rice University) ;-)

Thursday, July 9, 2009

Changing planes while in the air

I fly a lot... way more than I ever thought I would... and I can say with some confidence that it's sometimes much harder to get from point A to point B than you thought it would be.

My home bases are Austin Texas and Santa Fe New Mexico, and neither is an airline "hub". Airline hubs are those wonderful airports where you can fly to and from a lot of places (non stop)... Neither Austin nor Santa Fe, as I just said, is one of those wonderful places.

Virtually every trip that I take requires changing planes. I land, scurry to another gate, and take off again. "Scurry" is often a nice way of saying "run flat out as fast as I can". For those of you who don't fly often, you may not realize that most flights board 30 minutes prior to departure, so if you think you have an hour between flights, it's really only 30 minutes... even a minor flight delay can turn "plenty of time" into a missed connection or lost baggage (if you are brave enough to check your bags).

Each trip that I take is (in a sense) a process. I fly from home to an airline hub, then I fly from that hub to another airport. On rare occasions I have to fly from the second airport to a third... You get the idea. On each flight, I am sitting along with a couple of hundred other people who are each taking part in their own similar processes. As long as all of the planes that we'll use are on time (the plane that we are on along with all the planes that we'll connect to) all of our processes will run smoothly. If the plane that we are on has mechanical difficulties and has to return to the airport, or if weather or some other obstacle keeps us from landing where we expected to, then our processes are all in a heap of trouble.




From a process design perspective the events that I have described are exceptions. We don't "expect" these things to happen, but we have to handle them if they occur. Some airlines have processes in place that work really well... other airlines... well... let's just say that I've slept on a bench in O'Hare airport and leave it at that.

Exception handling is actually pretty well understood in the realm of Process Design. With BPMN notation you can attach event handlers to Activities and model "what needs to happen" when something out-of-the-ordinary happens. Proper event handling requires a lot of thought... but in a sense it's a well understood aspect of programming that isn't specific to process engineering... and it's not really what I want to blog about today.

Let's go back to flying... This example is admittedly kind of absurd, but imagine that the airline changes their schedule while I am in the air. Let's say that the itinerary that I am on is now supposed to go from Austin to Denver to Chicago instead of from Austin to Dallas to Chicago. From now on, whenever I want to go to Chicago I will change planes in Denver instead of Dallas.

What I've described here isn't a process exception - it's a process change. The airline has changed the process of getting from Austin to Chicago.

For somebody starting a new trip, it's obvious that they will follow the steps of the new process definition. For folks who are already in the air on the first leg of the trip, there are two options:
  1. Continue following the "old" process steps until the process finishes - this depends on the airline keeping a flight from Dallas to Chicago until everyone in the air gets there.
  2. Switch to the new process while "in-flight" - this depends on the airline re-routing the plane mid-flight to land at Denver instead of Dallas - or perhaps jumping from one plane to another as they pass each other.

Option number one is really safe. The old process is well understood and it's very clear where everyone will end up... we know we have enough gas to get to Dallas. Unfortunately continuing an old process until it is complete isn't always a viable option. Some processes take a really long time to finish, and it's unreasonable to carry on as if nothing has happened (sometimes for years) - the airline can't keep an unfilled plane waiting at Dallas.

Option number two - metaphorically changing planes while in the air - can be very risky to your in-flight process instances - the planes in the air may not have enough gas to get to the new destination, and it's quite likely that some of the passengers on the flight planned to get off the plane in Dallas. Let's not even think about jumping from one plane to another.

It's this ability to handle in-flight process changes that makes the implementation of a really good process manager difficult. Creating an "engine" that can execute a process definition is pretty straightforward... it's mostly a state machine.
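To illustrate just how simple the happy-path "engine" is, here's a toy sketch: a process definition as a map from each step to a transition function over the instance data. The step names and data are invented, and a real engine is of course vastly more involved:

```python
# A toy process "engine": walk an instance through a definition, where
# each step's transition function picks the next step from instance data.

def run_process(definition, start, instance_data):
    """Advance an instance from the start step until it reaches 'done'."""
    trail, step = [], start
    while step != "done":
        trail.append(step)
        step = definition[step](instance_data)  # transition function
    return trail

# Hypothetical definition for illustration.
loan_process = {
    "review":  lambda d: "approve" if d["score"] >= 700 else "reject",
    "approve": lambda d: "done",
    "reject":  lambda d: "done",
}
```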



Switching from one process definition to another while "in flight" can be a heap more complex...


Process improvement is a key goal of BPM adoption... When you figure out how to improve a process you want to be able to implement that change as soon as you can... but process change requires more than modelling the new process... you also really need to model the process of getting instances from the old process definition to the new.


For every possible step in the original process definition, you need to define the corresponding step in the new process.

If the correlation isn't exact (one process definition has steps that don't really exist in the other), then you must tell the process manager what to do. In some cases, an in-flight instance may need to "go back" and redo a "previous" step in the new process... or the in-flight process may have to perform some "transformation" steps before merging with the new process flow.
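A migration map might be sketched like this - for each step in the old definition, the transformation steps to run and the corresponding step in the new definition. The step names are invented to match the airline analogy:

```python
# Hypothetical migration map for in-flight instances: old step ->
# (transformation steps, corresponding new step).

MIGRATION_MAP = {
    "fly_to_dallas":     (["reroute_to_denver"], "fly_to_denver"),
    "connect_in_dallas": ([],                    "connect_in_denver"),
    "fly_to_chicago":    ([],                    "fly_to_chicago"),
}

def migrate(old_step):
    """Return the steps an in-flight instance takes under the new definition."""
    transforms, new_step = MIGRATION_MAP[old_step]
    return transforms + [new_step]
```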




Unfortunately, BPM tools don't generally have explicit support for "migration" processes... it's generally left to the practitioner to figure out what to do with in-flight instances, and sometimes there's a rather painful transition when a new process definition is deployed. I'm sure this will change as BPM adoption increases, but in the mean time you'll have to build your own parachutes in case something goes wrong.

Thursday, June 25, 2009

People Manage the Process?

Gerhard Basson has posted a very nice article Process-oriented Systems Paradigm for the Process Age over at the BPM Institute web site. I agree with almost everything that Gerhard says... but as usual one little phrase pops out and spawns a blog of my own.

Gerhard asserts:
"processes do not manage people – people manage processes"
I understand what Gerhard means... very clever word-smithing... but I squirmed when I read it. I know that I am splitting hairs here, but "manage" just isn't the right verb for the phrase:
"people verb processes"

That aside, I vehemently agree (as opposed to vehemently disagree) with Gerhard when he says:
"People will bypass and reject any system that does not help them perform work in a natural way."

It's always about the people... From the conception of the process to the implementation of the process to the running of the process to the improvement of the process... it's always about the people.

If we don't adapt our systems for the People of the Process Age we're not going to get very far.

Friday, May 8, 2009

Process Manager Interoperability - Wf-XML

Scott Francis sent me a snippet from a recent Keith Swenson blog posting on Wf-XML...

I've been looking for a standard like this... Process Folks have been pretty focused on Process Definitions that can be executed on any Process Manager (Process Portability), but in my day-to-day it seems a lot more important to ensure that Process Managers can interoperate with each other...

Specifically, I want to be able to incorporate a Process that is running on one Process Manager as a Sub-Process of a Process that is running on another Process Manager. I want to be able to start the Process, Monitor the Process, and get Information back from the Process when it completes.

Much as we might like a world in which there is only one Process Manager, this just isn't going to happen. Consider the Java app server world... Despite corporate attempts to mandate the use of only one Java app server you still find a mixed bag. Once a WebApplication is developed and deployed on one flavor of Java app server it's seldom migrated to another. It's just not worth the expense or the hassle - even though migration might be relatively straight-forward.

In the Process World we should be able to take full advantage of previously deployed Process Applications regardless of their BPM platform. If an application has been developed with an embedded jBPM Process Manager I should be able to incorporate it in my Lombardi solution. If my Lombardi implemented Process can be used by a wider Pega implemented Process, then that should work too.

Wf-XML is an attempt to make that happen... I haven't had time to delve into the details, but the concept is right on the mark.

Wednesday, April 29, 2009

College new-hires and the zen of BPM

I'm supposed to be preparing a one hour lecture for some college new-hires on: "Basic and Advanced BPM Concepts"... but yet I blog instead ;-)

For those of you who know me well, you'll probably think that my biggest concern is that I only have an hour. I tend to go on and on (and on) about topics that I am passionate about, leading a former co-worker to coin the term "johntification" in my honor.

Explaining BPM is what I do - focussing pretty much on what's now being called BPM-tech more than BPM-bus. When I first encountered BPM I felt like I had found the missing link... a paradigm and technology that closed the gaps between what I knew how to do and what my business colleagues really needed.

I was blown-away by BPM because I had experienced the pain of the problems that BPM addresses. This stuff fixed something that I knew needed fixing.

My worry about introducing BPM to college new-hires is that they probably haven't experienced "Process Pains"... at least not from the perspective of one who writes or maintains software, or from the perspective of someone who tries to get the "right" software written.

When I describe the problems that BPM tackles they may say "So what?". They may scoff at the magnitude of the problems - and they probably assume that the solutions that BPM provides have always been around.

With my standard BPM audience I'm fairly assured that heads will begin to nod in recognition of shared pain in thirty seconds or less...

Most in my audience have experienced meetings where a dozen people had to be present to figure out how that incoming *application* finally ended up as an outgoing *disbursement* (*substitute the inputs and outputs of your own business).

Most in my audience have also experienced "Office Heroes" - those harried individuals who on a daily basis keep a company running through sheer force of will. Whenever anything falls through the cracks... Whenever anything gets lost or derailed... Whenever any critical deadline is in danger of being missed... Office Heroes jump in and save the day. If a truck hits an Office Hero your business will really have to scramble to recover.

BPM helps business people understand how their company really runs, and it helps reduce their company's reliance on Office Heroes. BPM helps IT people provide tools that make it easier to run the company, easing the burden of the Office Heroes.

My college new-hire audience will certainly understand what I tell them about BPM - but will they relate? Can I make BPM relevant to their own experiences, or will I just sound like an old guy droning on about the old days: "When I was your age, computers only had 4K of memory....."

These "kids" are smart... they just haven't had as much experience. Instead of relying on shared experience to grok BPM I'll have to find another approach... When (or if) I figure it out I'll let you know.

Friday, April 17, 2009

Applications as Process Activities

I return to this topic a lot... BPM solutions automate the transitions between Activities in a Process, but they don't necessarily automate the Activities themselves. Some Activities can be automated, but others require Human interaction or Human judgment to complete. If we are convinced that BPM is a good idea, and we want BPM to be as pervasive as spreadsheets, then we need to make it easier to implement the Human-centric Activities that our Processes need.

Many BPM suites support building Human-centric Activities within the suite itself, but sometimes you'd rather re-purpose an existing application to do your bidding.

Let me walk you through an example from my own wonderful life as a Travelling Guy...

When I get a new assignment there are a lot of things that I have to do. I've got a mental checklist that I follow, but I'd really rather have a Managed Process to keep me from screwing up.

Most of my gigs are somewhere else, and that makes it very important for me to make travel and lodging arrangements. I almost always need a flight and a hotel, and based on the location I might need a rental car.

To perform this Activity I use my company's preferred Travel site - it happens to be Orbitz, but it could just as easily be Expedia, Travelocity or any number of sites. Each of these sites supports the concept of a "Trip" and each lets me make flight, hotel, and car reservations. These travel sites also provide feedback - they'll send me email when a reservation is confirmed or "something" changes.

As I am going through my process of getting ready for a gig, I log on to Orbitz and make the necessary reservations. When I am "done" I will "check off" that task from my list of things to do.

This happens a lot in BPM solutions. An Activity (a task) is assigned to an individual Participant. The Participant gets directions on how to perform the Activity (what it is that they need to do) but they have to use some "external" application in order to actually perform the task (in this example, the Travel site). When they are "done" with the task they "check off" the task and the process continues.

Processes that are implemented like this can be really error prone. The Process Manager has to rely on the Participant's assertion that the task has been performed correctly. You just have to trust the Participant to do the right thing.

If you had the time and if you had the money, then you could build all of the tools that the Participant may need to use in order to complete their tasks. Quite frankly, that would be a huge waste of both your time and your money.

The better approach for all concerned is to encourage the makers of those "external" tools to become Process Aware. Get them to modify their existing applications to be aware of Process Flow.

Let's go back to my "Get Ready for a Gig" Process... Some time after the Process starts I will need to perform the "Make Reservations" Activity. The Process Instance knows the Location of the Assignment, and it knows the Dates of the Assignment (the Process was started by a message that contained this information).

When I run the "Make Reservations" Activity from my Task List I'd like Orbitz to open with the Dates and Location of my Trip pre-populated. After I make my reservations, I would like the site to send (structured) confirmation information directly to the Process Manager.

Based on the information that Orbitz returns, the Process Manager would either move on to the next task, or require me to try again. For example - if Orbitz says that my reserved flight is on the wrong day the Process Manager should tell me.

If you are a Programmer, then I think you can easily envision how you would make the Webapp that implements the Orbitz Travel Site "Process Aware". The trick is in recognizing that there is a beginning and an end to planning for each Trip. Modify the Webapp to support "Dates" and "Location" on the invoking URL - along with a "Return Address". When the Webapp determines that planning for the Trip is complete, use the "Return Address" to send structured information (most likely XML) back to the Process Manager.
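Here's a rough sketch of those two hooks. The URL, parameter names, and XML shape are all invented for illustration - no travel site actually supports this interface today:

```python
# Hypothetical "Process Aware" hooks: compose the invoking URL with trip
# parameters and a return address, and build the structured result to
# send back to the Process Manager when planning is complete.

from urllib.parse import urlencode
from xml.etree.ElementTree import Element, SubElement, tostring

def build_invoke_url(base, location, start, end, return_address):
    """Compose the task URL that pre-populates the external web app."""
    query = urlencode({"location": location, "start": start, "end": end,
                       "returnAddress": return_address})
    return f"{base}?{query}"

def build_result_xml(trip_id, confirmations):
    """Structured confirmation payload posted back to the Process Manager."""
    root = Element("tripResult", id=trip_id)
    for kind, number in confirmations.items():
        SubElement(root, "confirmation", type=kind, number=number)
    return tostring(root, encoding="unicode")
```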

For Web Services (automated Activities) we've done this for several years. Service providers (like Orbitz) publish WSDLs that define the Services that we can invoke. It's time to do the same for Activities that cannot be Automated. It's time to standardize interactions with what I've called Human Powered Web Services - Web Applications that can serve as Process Activities.

In my ideal world, all Travel Sites would support "the same" interface for starting a new "Trip" and for returning results. That might be a bit much to hope for... but it's a good target to shoot for. Even if every site had a unique interface, we could build adapters for our Processes - a much better situation than today (from a Process Manager's perspective).

If you are a Developer who builds sites where folks go to "do things", then I hope I've inspired you to think a bit about those "things" as part of a larger Process. It's not a huge leap to build into your site the hooks that a Process Guy like me needs to incorporate your site into my Managed Process. I'll be grateful, and I'll bet the users of your site will be too.

Wednesday, April 8, 2009

Business Process and Activity Interfaces

Business Processes are made up of Business Activities. In a BPM solution the transitions between each Activity are automated, but the Activities themselves may either be automated or they may be performed by Humans (That's why I use the term Managed Business Process instead of Automated Business Process).

Most of the focus in the community of BPM practitioners of late has been on the Business Process Definitions (BPDs) that the Process Managers use to control the transitions between Business Activities. BPDs provide the information that the Process Manager needs to assign the right Activities to the right Participants at the right time. Google BPMN, BPEL or XPDL and you'll find plenty to read.


BPDs are crucial... but they only define the Process Flow. Activities are identified in BPDs, but only the interfaces to the Activities are actually defined (The interface to the Activity defines the information that is passed from the Process to start the Activity and the information that is returned to the Process from the Activity when it completes).
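To make that concrete, here's a toy sketch (with names I've invented) of everything a BPD actually pins down about an Activity - the data in at the start and the data out at completion, and nothing about how the work gets done:

```python
from dataclasses import dataclass, field

@dataclass
class ActivityInterface:
    """The contract a BPD defines for an Activity: inputs passed from the
    Process to start it, outputs returned when it completes. The field
    names are illustrative, not from any BPMN/XPDL schema."""
    name: str
    inputs: dict = field(default_factory=dict)   # parameter name -> type
    outputs: dict = field(default_factory=dict)  # parameter name -> type

# The "Make Reservations" Activity seen purely through its interface:
make_reservations = ActivityInterface(
    name="Make Reservations",
    inputs={"location": "string", "startDate": "date", "endDate": "date"},
    outputs={"confirmationNumber": "string", "flightDate": "date"},
)
```

Whether a Human fills out a form or a web service does the work, the Process Manager only ever sees this contract.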

All of the BPM suites that I am aware of generate executable process definitions (in one form or another). Some BPM suites generate "portable" definitions using standards like BPEL and XPDL, but many (including the Lombardi Teamworks suite that I use) produce proprietary definitions that will only run on their own Process Managers. One of the big drivers behind the efforts to make BPMN 2 "executable" is to push the industry towards "portable" BPDs that can run on Process Managers from any vendor. Great idea... but it's going to take some time to see if it gains traction.

Many BPM suites also incorporate tools to implement Business Activities. A Business Activity is really just an application that is tailored to perform a specific task. The Process Manager kicks off the application with some initial information, the application runs (usually gathering information from a Human) and when it completes it passes information back to the Process Manager.

I've had discussions with BPM practitioners who feel that implementing Business Activities should fall outside the realm of a BPM suite. Perhaps I've been spoiled by Teamworks, but I can't imagine creating my BPD in one tool and implementing my Activities in another tool. It's just incredibly nice to be able to drill down from an Activity on a Process Diagram to the underlying implementation of that Activity.

Despite my bias toward integrated Activities, I freely admit that you must be able to implement an Activity outside the limitations of your BPM suite of choice. Some Activities do require a very sophisticated user interface - and in many cases a pre-existing application can be fairly easily adapted to perform an Activity. I've had to resort to "external" Activities on several occasions (for key Activities).

If the implementation of an Activity is beyond a simple Form that the Participant needs to fill out, then you'll have to build some sort of task focused application for the Participant to use to complete the Activity.

BPM suites such as Teamworks allow you to build rather sophisticated task focused applications - but sometimes it makes more sense to go outside the suite and build the application using JSP, ASP, .Net, Ruby - whatever tools make the most sense to get the job done.

The "problem" is the lack of standard interfaces for connecting these "external" task oriented applications with your Process Manager/Process Engine. Most of the good BPM Process Managers provide the necessary interfaces, but they are almost always proprietary interfaces.
In the early days of BPEL, Activities were "just" standard web services. An Activity (web service) that was invoked from one BPEL engine could be invoked by any other.

BPEL (by itself) could only implement processes that could be completely automated. Most Business Processes include Human Activities - so pure BPEL implementations were few and far between.

Unfortunately we've yet to come up with a standard (or even a de-facto standard) for implementing Human Activities. Some BPM suites (like Intalio) make use of standards (like XForms) to implement the user interfaces for their Activities, but the back-end interfaces between the Process Manager and the Activities are still unique to each vendor. There's no standard for registering an "Activity Application" with a Process Manager. There's no standard way for an "Activity Application" to authenticate its user with the Process Manager. There's no standard way for an "Activity Application" to query a Process Manager for Activities that belong to its user.
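Here's a rough sketch of those three missing operations gathered into a single interface, plus a toy in-memory stand-in. Every method name and signature is hypothetical - this is the shape a standard *could* take, not any vendor's actual API:

```python
from abc import ABC, abstractmethod

class ProcessManagerAPI(ABC):
    """The three operations that every Activity Application needs and
    that no standard covers today (names are my own invention)."""

    @abstractmethod
    def register_activity_app(self, activity_name, callback_url):
        """Tell the Process Manager this app implements an Activity."""

    @abstractmethod
    def authenticate(self, username, credentials):
        """Establish that the app's user is a known Participant."""

    @abstractmethod
    def query_tasks(self, user_token):
        """Return the open Activity instances assigned to this user."""

class InMemoryProcessManager(ProcessManagerAPI):
    """A throwaway stand-in, just to show the shape in action."""
    def __init__(self):
        self.apps, self.tasks = {}, {}
    def register_activity_app(self, activity_name, callback_url):
        self.apps[activity_name] = callback_url
    def authenticate(self, username, credentials):
        return f"token-{username}"          # a toy token, not real security
    def query_tasks(self, user_token):
        return self.tasks.get(user_token, [])
```

With agreement on just those three operations, an Activity Application could plug into any compliant Process Manager.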

Until we have these standards, there's really no such thing as a "Portable Activity". I could build a framework that helped me build task oriented applications, but I'd need adapters for any BPM suite that I wanted to interface with.
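The adapter idea fits in a few lines: the framework codes against one interface of its own, and each BPM suite gets a concrete adapter. The vendor classes below are stubs I've invented, not real APIs:

```python
class TaskListAdapter:
    """The common interface my hypothetical framework would code against;
    one concrete adapter per BPM suite hides each vendor's quirks."""
    def fetch_tasks(self, user):
        raise NotImplementedError

class TeamworksAdapter(TaskListAdapter):
    def fetch_tasks(self, user):
        # Would call Teamworks' proprietary task API; stubbed here.
        return [{"activity": "Make Reservations", "user": user}]

class IntalioAdapter(TaskListAdapter):
    def fetch_tasks(self, user):
        # Would call Intalio's API; stubbed here.
        return [{"activity": "Make Reservations", "user": user}]

def render_task_list(adapter, user):
    """Framework code stays vendor-neutral: it only sees the adapter."""
    return [t["activity"] for t in adapter.fetch_tasks(user)]
```

Swapping suites would then mean swapping one adapter - tedious to build, but far better than rewriting every task-oriented application.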

Hopefully this will change soon. As more BPM suites begin to support "standard" BPDs it will become obvious that we need Portable Activities too.

Saturday, February 21, 2009

Seeing is Believing - Business Process Visualization

Time to introduce a new TLA (Three Letter Acronym) to this blog: BPV - Business Process Visualization.

I often sum up basics of BPM (Business Process Management) with the following pitch:
  • Model your Process 
  • Manage your Process 
  • Monitor your Process 
  • iMprove your Process
For me, the enabling factor for BPM was the Executable Process Diagram.  Use tools to graphically depict the Activities and Flows of a Process, and then auto-generate the software necessary to manage the transitions between those Activities.  
With any good BPM Suite I can rest assured that the Right People will Execute the Right Activities at the Right Time to keep my Processes Flowing Smoothly.
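Here's about the smallest possible illustration of the idea: a process definition reduced to a table of transitions, and a tiny "engine" that walks it. The activity names are invented, and real BPM engines obviously add routing, roles, timers and much more:

```python
# A toy "executable process diagram": Activity -> next Activity.
PROCESS = {
    "Submit Application": "Review Application",
    "Review Application": "Approve Loan",
    "Approve Loan": None,               # end of the process
}

def run_process(start, perform_activity):
    """Manage the transitions; perform_activity does the actual work
    (which may be automated, or may mean waiting on a Human)."""
    step = start
    while step is not None:
        perform_activity(step)
        step = PROCESS[step]
```

The point is that the diagram *is* the table: draw the flow, generate the transitions, and the Right Activities happen in the Right Order.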

The BPM Industry has more than met the PFS (Processes Flow Smoothly) challenge - The decision for a company to adopt BPM technology and methodology is pretty much a no-brainer (Which suite to adopt may require some thought, but whether or not to adopt isn't even an issue any more).

So what's next?  

In a recent post I attempted to describe a BPMS in terms that might have been familiar to any 19th Century businessman.  One of the comments I received really piqued my interest:

Roeland Loggen said...
Maybe your story reflects one of the key issues with current BPM thinking: if the software provides automated concepts of a year 1890 shopfloor, then not much innovation has been created. Since 1890 many many new ideas have been created on people working in processes. The mechanistic view of typical BPM software has not clue of these more modern concepts. 
The 20th century knowledge worker's reality and productivity is muh different from Taylor based process thinking... not based on orchestration, central coordination, workflow items, inbox/task screen.... but adhoc, collaboration, changing processes on the fly based on new insights and events....


For the sake of argument: If every 1890 shop floor had an exceptional Process Manager on hand, productivity would have soared... but Roeland's right: We can't be satisfied to stop with systems that automate 19th century businesses.  It's the 21st Century, and we can do better than that.

The key for doing better is Business Process Visualization (BPV).

In my 19th Century BPMS I discussed someone that I dubbed the "Process Visualizer".  This is the person who took the raw metrics that the Process Manager had gathered and turned them into reports, charts and graphs that the Business Owner could use to understand the past and to plan for the future.  These reports, charts and graphs are an integral part of BPV, but there's a whole lot more to the full picture.

BPV begins with the Process Diagram.  The Process Diagram distills the essence of the problem into a form that Business and IT can share.  It's a crucial part of validating that the process is correct and for communicating the requirements from the process owner to programmers who will implement the process application.  In most of today's BPM Systems the life cycle of the Process Diagram often goes something like this...

The Business decides to improve their operations, so they hire an Analyst to come in and help them.  

The Analyst talks to a lot of people and produces an "As Is" Business Process Diagram.  This diagram is often used by the Analyst to reassure the Business that  the Analyst really understands what's going on.

Now that the Analyst really understands what's going on, a new "To Be" Business Process Diagram is created.  This diagram is used by the Business to capture exactly how they want the new process to operate.

At this point simulations can be run to predict how the new process will behave relative to the old.  Simulation is always tricky, but if the assumptions are accurate (hopefully based on historical data), then it's sometimes possible to tune the "To Be" definition to increase the likelihood of success.
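At its simplest, simulation can be a little Monte Carlo exercise like the sketch below. The activity names and duration assumptions are invented, and real simulators model routing, queuing and resource contention, but the principle is the same:

```python
import random

# Assumed activity durations in hours, as (min, mode, max) triangular
# distributions -- hopefully grounded in historical data, here invented.
ACTIVITY_TIMES = {
    "Review Application": (1.0, 2.0, 6.0),
    "Approve Loan":       (0.5, 1.0, 3.0),
}

def simulate_cycle_time(runs=10000, seed=42):
    """Estimate mean end-to-end hours for the 'To Be' process."""
    random.seed(seed)
    totals = [
        sum(random.triangular(lo, hi, mode)
            for lo, mode, hi in ACTIVITY_TIMES.values())
        for _ in range(runs)
    ]
    return sum(totals) / len(totals)
```

Rerun it with the "To Be" assumptions versus the "As Is" numbers and you get a (rough, assumption-laden) prediction of whether the change is worth making.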

Once the Business and the Analyst are happy with the "To Be" diagram, it can be passed on to the Implementers (and it now becomes known as the "As Planned" diagram).  The Implementers now morph the diagram into an executable diagram, usually adding way more details than the Business wishes to see.

Slight digression:  There's a lot of controversy swirling around BPMN 2.0 over the issue of adding execution semantics to process diagrams.  Implementers (in general) want truly executable diagrams but Business folks (in general) don't want to clutter the diagram with non-Business related details.  I think there's a simple solution... which I will get to later.

The Implementers work with the Analyst and Business until all are happy and the process application is deployed into production... at this point the Implementer's diagram becomes known as the "As Built" diagram.

Note to all: The "As Built" diagram is a clear differentiation from all pre-BPM Systems.  In a BPMS, the "As Built" really is the process that is running in production.  Before BPMS the diagram may have been accurate once, but you never really had confidence that the diagram ever matched reality.

In some BPM Systems, the "As Built" diagram is often the end of the line.  Once the process application is deployed they don't think about it much until the next time a process change is contemplated... However, in many BPM Systems the "As Built" morphs into a living and breathing "As Running" diagram.

Live and historical data breathes life into the "As Running" diagram.  My old friend the Process Visualizer takes the raw data from the Process Manager and animates the "As Running" diagram to give the Business a clear view of what's happening where (and what happened when).  The "As Running" diagram gives you your first glimpse of what BPV can achieve.

Thus far the Process Diagram has been limited to a pretty small audience - those involved in designing the process and those involved in implementing the process.  BPV should expand the Process Diagram's visibility to a much wider audience.

Go into just about any office and ask a typical worker the following question:

"What Processes do you work on?"

Most likely, the response will be a blank stare.

Now ask the worker:

"What Activities do you perform?"

The blank stare will vanish and you'll get a very accurate litany of tasks.

BPV needs to give each worker the answer to that first question.  It's not enough to know what Activities you need to work on - To do your best you really need to know how those Activities fit into a bigger picture.  Process knowledge is that bigger picture, and it goes a long way toward helping the workers understand why they are doing what they are doing.

A task that seems stupid in isolation may make sense when seen in a process context - or it may really be stupid.  In the former case the worker may do a better job, and in the latter case BPV may give the worker the context needed to communicate their frustration.

When folks such as Lombardi's Phil Gilbert talk about BPM on every desk in an enterprise, this is what they are talking about.  Process Visualization helps everyone put things in context, and that helps everyone do a better job.

Now for a bit of Devil's advocate:  Most workers couldn't make heads or tails out of a BPMN diagram, and if you add all the execution semantics proposed for BPMN 2.0, the number who can is going to drop even further.

Granted, but there's a solution: Different views for different folks.

Returning to my programming roots... I am a huge fan of Aspect Oriented Programming.

Aspect Oriented Programming evolved from a simple fact: Most worthwhile endeavors can be broken down into distinct concerns, and often you can meet the goals sooner by dealing with these concerns (aspects) in isolation.  Obviously there are limits to this when concerns overlap, but it's a good place to start.

With Aspects as my inspiration it's easy to see where Process Diagrams need to go: Aspect Oriented Diagrams.  If we can identify the aspects of a diagram that the Business Owner is interested in, and those that the Operations Folks are interested in, and those that each Worker is interested in, then we can generate the Views that mean the most to the intended audience. It's all just a matter of Perspective, and BPV enables many, many perspectives.  Some views may look like process diagrams, others may look quite different - doing BPV right will employ a lot of artistic scientists.

Here are just a few process views that come to mind:

  • Process Owner's "40,000 foot" view
  • Activity Owner's "Escalation logic" view
  • Participant's "Where does this fit?" view
  • Operation's "What's happening now?" view
  • Operation's "What happened?" view
  • IT's "Capacity planning" view
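One way to sketch the "different views for different folks" idea: tag each diagram element with the aspects it belongs to, and let each view be nothing more than a filter. The elements and tags below are invented for illustration:

```python
# Each element of the process diagram carries the aspects it belongs to.
ELEMENTS = [
    {"name": "Review Application",    "aspects": {"owner", "participant"}},
    {"name": "Escalate after 2 days", "aspects": {"escalation", "operations"}},
    {"name": "Write audit record",    "aspects": {"it"}},
]

def view_for(aspect):
    """Generate the sub-diagram a particular audience cares about."""
    return [e["name"] for e in ELEMENTS if aspect in e["aspects"]]
```

The Process Owner's view and the Escalation view come from the same underlying model - each audience just sees the slice that matters to them.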

As Roeland said, the future is about "ad hoc, collaboration, changing processes on the fly based on new insights and events"...

BPV's going to help us find those insights.