In this article, I would like to address some of the difficulties involved in replacing an existing client system with a completely new one. Having gone through this process several times in my career, I have learned some lessons that can make this transition easier for the end user. The key is not to take an initial set of requirements at face value, but to work with the future users of the new system (in conjunction with their management) to make sure what's delivered is what's needed.
Any new computing system that's being brought into an organization is replacing an existing system. By "system," we mean any set of processes that work together to perform a business function. The points of interaction with the processes in place may involve a mainframe interface, batch files, or even simple chits of paper. The processes may even be completely verbal. A computing system (the one you're bringing in) is simply an automated kind of system.
If the new system is being developed or significantly customized, then there will be an analysis phase. Because you're replacing a system (which is a set of processes), one of the biggest challenges you and your target users will face is discriminating form from function. In this article, we take form to mean the processes by which a business function is performed. Form versus function: this age-old dichotomy is at the heart of systems analysis.
Usually, users will have operated within an existing system for some length of time. Almost immediately, users begin to see these processes as their business functions. The two are confounded in their minds: by following Process A, they perform business Function Z; this often leads to the subconscious conviction that Process A is a necessary precondition for Function Z. However, in reality, any number of processes could be used to perform Function Z; the fact that you're replacing the existing system proves that this is so. After all, if the system in place performed its functions at the desired level of efficiency and accuracy, would you be asked to replace it?
Your job as an analyst is to lead your target users (the local domain experts) from discussion of how they do things at present to why they do them. This may be something they really haven't thought very much about before. Until you arrive at this stage, any requirements you get will reflect the existing processes, which (if you will recall) you're trying to replace, not mimic.
Of course, there's another factor at play here, and that's resistance to change. Hey, your future users have identified their current processes with their functional roles in the business; threaten to change those, and you threaten them, inasmuch as you take away some measure of their expertise. This, however, is not an analysis problem, but one of gaining trust and respect. More on that later.
Not changing anything is the unstated initial requirement of any new system I've been asked to develop. Nobody really wants to change the way things are done, even though they recognize the problems. It's an expensive proposition to replace an existing system with a new one, and whoever's paying for it is going to want to see some bang for their buck. It may not be the end users, who have learned to live within the limitations of the current system and probably have all sorts of protocols for dealing with the stuff that goes wrong on a frequent basis. Instead, it will likely be the upper-level manager or executive who sees in the bottom line just how costly such workarounds can be. That manager may also understand that you "can't get there from here" with the system in place, while the end users may not possess such strategic vision.
So you, as an analyst, are stuck in the middle. You have two customers to please, or at least not wholly alienate. What can you do?
Your target user is the domain expert at his or her job, but often you have to work with a target group of such domain experts, each with a personal perspective on how the overall process works. Often, these experts will disagree about (or at least remember differently) some aspect of a job. Since your system has to reflect the global reality, you need to watch out for a few key words from your experts in response to your analysis questions. These are the key words I generally pounce on when I hear them: "yes," "no," "always," "never," and "sometimes."
On these responses hinge fundamental design decisions that will have a profound impact on the new system's data model, logic structure, process flow, and error/status dialogs. The first four are definite answers that reflect a certainty that may be misplaced. "Yes" or "no" responses should be corroborated with other members of a group, or the question should be rephrased and asked again. You're conducting an interrogation session, and must be sure of your answers or nasty surprises await you in the future.
"Always" and "never" are really tricky, because people often overlook the rare exception in analysis sessions, and accounting for these later can have profound consequences for the data model and logic of an application. "Always," for instance, can change to "almost always" when further questions are asked. That's a large difference in a computing system. Usually, the exceptional case occurs rarely enough that nobody will notice the error in requirements until sometime after the system goes live, when it can be expensive to change. What's worse is that often what distinguishes "always" from "almost always" is a whole unexplored use case with even broader ramifications for system redesign.
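The data-model consequence can be made concrete with an invented example (the names here are illustrative, not from any real system): "always" suggests a single field, while "almost always" forces a one-to-many relation, with the join logic and list-handling UI that go with it.

```java
import java.util.List;

// Invented example: how "always" vs. "almost always" reshapes a data model.

// "An admission always has exactly one attending physician":
// one column, simple equality checks.
class AdmissionAlways {
    String attendingPhysician;
}

// "...almost always one": the rare exception forces a one-to-many
// relation -- a child table, join logic, and list-handling UI.
class AdmissionAlmostAlways {
    List<String> attendingPhysicians;
}
```

A one-word hedge in an interview answer, in other words, can be the difference between a column and a table.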
I think it's important that you see your end users as your customers, even though management is paying the bills. While it's true that you can't ignore requirements from the management side, you can at least convince your end users that any additional data they have to key in, or any change in methods pertaining to new business processes, are not due to your own capriciousness. You have to identify with the end user, and recognize his or her pain as your pain. When you've done that, you see to it that change occurs in the least painful way possible.
In extreme cases, you may decide that the user's view of their job is more in line with reality than that of management. This can range from the trivial to the fundamental. Here are two cases from my own experience.
I once worked for a startup that built interdepartmental software for rural hospitals. One of the components of this system was an application for hospital admission. Admitting clerks can get quite harried at times, and it can be stressful getting vital patient information from someone who's in pain and has no patience for the necessary paperwork involved. As a consequence, admitting clerks learn to be fast. Sometimes they can be a little too fast, and will skip entering non-medical data that under less harried circumstances they would obtain.
One of the frequently skipped fields was for Social Security Number (SSN). Back in those days, you weren't required to have an SSN until you were employed, so sometimes there really wasn't one to enter. However, SSN was a very useful identifier for accounting. When a new manager subsequently arrived on the scene, the request was made that the admitting application be changed so that something had to be entered in the SSN field, if only to get the clerks' attention. Although it was a simple program change to make the field required, I knew such a change was not only going to be ineffective (after all, entering a dummy "0" is only attention-grabbing until you get habituated to it, and doesn't force anybody to make a valid entry); it would also disrupt the data-entry cadence of those admitting clerks who had by this time become lightning-fast at keying in patient data. Despite my arguments to the contrary, I was forced to make the change.
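To see why a bare "required" check is ineffective, here's a sketch (with invented names, not the actual admitting code): a non-empty test happily accepts a dummy "0", while even a simple format check would at least reject it.

```java
// Sketch only; names are invented and this is not the original system.
public class SsnField {
    // The requested change: the field must merely be non-empty.
    static boolean isRequiredSatisfied(String entry) {
        return entry != null && !entry.isEmpty();
    }

    // A format check: nnn-nn-nnnn. Rejects the habitual dummy "0".
    static boolean isPlausibleSsn(String entry) {
        return entry != null && entry.matches("\\d{3}-\\d{2}-\\d{4}");
    }

    public static void main(String[] args) {
        System.out.println(isRequiredSatisfied("0"));      // true: dummy passes
        System.out.println(isPlausibleSsn("0"));           // false
        System.out.println(isPlausibleSsn("123-45-6789")); // true
    }
}
```

Even format validation can be gamed, of course, which was the deeper point: the rule doesn't create valid data, it just changes which keystrokes the clerks habituate to.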
Well, I lost that battle, but only for the day. As expected, the clerks revolted over the change, and the next day, I was asked to change it back by the now-sheepish manager.
Sometimes, advocacy is not over such a simple matter. I was also a member of a development team building an in-house publishing system for a large company. Our end users were two groups of engineers responsible for writing aircraft maintenance manuals. The executive in charge of funding this new system wanted both groups to use the same interface, as a way to coerce them into working more in tandem. In other words, the computing system was supposed to fix a management problem. In the right circumstances, it's possible to do that, but you have to be careful.
The current situation was that each group had its own mainframe application for its separate tasks. The larger of these groups wrote most of the narrative of the manual; the smaller group would take bits of that narrative for the primary product it produced, called "task cards." Task cards described the procedures to be followed in a concise format that allowed mechanics to take them along to the aircraft they were working on. While a task card would usually contain a few lines of narrative from the maintenance manual, much of the information on it was derived from other sources.
While on the surface it looked as though both groups were producing documents that could be managed by an SGML editing tool, in fact, the two groups worked separately precisely because their jobs were very different. The input that drove the production of the task cards wasn't very narrative at all. In fact, it was virtually all data-record driven: there were relations between tasks, durations, frequencies, priorities, part numbers, and all sorts of data fields of specific type, value range, and structure. In addition, there were requirements for schedule load-balancing so that daily maintenance hours for a fleet of aircraft weren't rising and falling like so many hills and valleys. The smaller group didn't need an editor; they needed a data-record input screen, with drop-down lists, combo boxes, range checking and the like.
Could they have worked in an SGML editor? Not likely; even if the data fields were properly marked up and type-validated, there were still dependencies among data fields that had to be cross-validated, which would require a very complex, in-memory document model to enforce. Not to mention that doing any real-time schedule balancing in an editor would be a real challenge.
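Here's a sketch of the kind of cross-field dependency involved (field names invented, not from the actual task-card system): per-field type checks are easy to express in markup validation, but rules relating one field to another need a record-level model.

```java
// Sketch with invented field names; not the actual task-card system.
public class TaskCardRecord {
    int intervalDays;   // how often the task recurs
    int durationHours;  // estimated labor per occurrence
    char priority;      // 'A' (highest) through 'C'

    // Per-field checks (type, range) could live in markup validation;
    // the cross-field rules below are what demand a record-level model.
    boolean isValid() {
        if (intervalDays <= 0 || durationHours <= 0) return false;
        if (priority < 'A' || priority > 'C') return false;
        // Invented cross-field rule: a priority-A task must recur
        // at least every 30 days.
        if (priority == 'A' && intervalDays > 30) return false;
        return true;
    }
}
```

A data-entry form with drop-downs and range checks can enforce this kind of rule at keystroke time; an editor operating on a document tree would have to reconstruct the whole record model just to say "these two fields disagree."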
In the end, what suited the task-card group was a database application that handled record entry and updates. It took considerable end-user advocacy to make management and executives aware of the need for separate, interacting systems for each group. It was not an easy battle, but in the end we were persuasive, and after building the system for the task card group, we had a high level of user satisfaction.
You, the software developer, are an agent of change, and people as a rule don't like change. They don't like it for a number of reasons: it requires adjusting to new ways of doing things, there's fear that automation will eliminate jobs, or that the new system may not work well, or that insufficient input was given in choosing the new system, etc., etc. You're often seen as the guy in the black hat, which (I'm sorry to say) is not the good guy.
You're going to need all of your mojo to convince them otherwise.
There may be a lot of griping at first. Pretend you're a bartender, or a Rogerian psychologist. You want them to know that their opinions count, even if you aren't able to resolve all of the issues.
Get involved in seeing how your target users currently do their jobs. I can't stress this enough: there's nothing like in situ observation for learning the true nature of someone's work. You won't get this kind of information from written requirements. You will likely see things that you can fix or make easier that the customer hasn't even noticed as being a problem or a likely candidate for automation. Ron Jeffries wrote a very good article on this topic some time back, placing this type of interaction within the framework of Extreme Programming.
Pick the Low-Hanging Fruit First
Deliver early and often. Look for tasks that you know you can easily automate in a way that gives the end users immediate payback and satisfaction. This allows them input into the process, and that gives them confidence in the ultimate result. You don't have to implement core features right away if there are high-payback ancillary features that have value and that you can deliver early.
Remember: There Are Two Domain Experts in the Room
In the final stages, you'll know enough about the users' jobs to be able, superficially, to do them yourself. Your target users won't have that same advantage regarding your expertise, but at this point, if all has gone well, you will have gained their respect and trust. This means that you are now able to explore options with the users that they haven't foreseen, or perhaps for which they don't yet see the need. Yet no matter how badly you think a change in process is needed, you must still take a passive stance in suggesting new features or changes in processes; hopefully, though, at this point they trust you enough to take what you say very seriously.
These two things will make your job easier:
An Inside Advocate
If you're developing or customizing a new system, there's nothing that's going to make your job easier than having an advocate user on the inside who can see the potential benefits you're bringing. An ideal advocate is willing to provide ideas for improvement, test new beta changes, and help convince others of the value of the new application. You'll be able to bounce ideas off of this person, and in turn, this person can vouch for how open and responsive you are to others in the organization.
A Thick Hide
No system has ever satisfied all of the people all of the time. Some of your users' jobs are very demanding, and they'll likewise be demanding and critical of just about anything you do.
Remember those admissions clerks? No matter what I did, it seemed that I could never completely please them. But I never let it upset me: you do the best job you can and take the criticism with a grain of salt. One of the more amusing situations in my career was when these same users finally got a new admitting system from another vendor. I came in one day to help with the transition and the first thing they said to me was:
"Oh, we wish we had the old system back, yours was so much better!"
"Really?" I asked.
"Oh, yes, it was perfect," they said.
Ah, well. Perhaps you can please everybody. You just can't get them to admit it.
Jeff Lowery is a JDO expert and advocate and is experienced at using Exolab's Castor, Jakarta Ant, and more.
Copyright © 2009 O'Reilly Media, Inc.