Preface
Chapter 2: Why Killer Applications Can Be Assisted by ERP and CRM Software
2.2 Organizational Studies, Reengineering, and ERP Support
As Chapter 2.1 demonstrated, there have been two major classes of killer applications in off-the-shelf programming. One is commodity software such as the spreadsheet, a genuinely new invention that caught the market's eye almost instantaneously. The other is the word processing package, a smart evolution of what already existed, albeit in hardware form. My prognostication is that advanced software solutions for smart materials and the Internet supply chain will largely fall into this second class, to which ERP also belongs.
Management worth its salt sees to it that technology investment decisions are taken in a way that makes it feasible to capitalize on this second class of advanced applications, which is much more common than the first. In terms of return on investment, as Exhibit 2.4 suggests, this is equivalent to taking the high road. This is precisely what the winners are doing: they are embracing high technology.
In contrast, the losers, and that is the majority of firms, choose the low road because they lack the vision and the guts to manage change.
Exhibit 2.4: Only Winners Know How to Reap Results from Spending Money on Technology

Just as the able use of commodity word processing software drove the reengineering of the office as people knew it in the 1970s, the advent of killer applications in the smart supply chain will call for major restructuring efforts before it bears benefits. Therefore, this and the following chapter section focus on what is needed for reengineering and for advanced organizational solutions.
Consider first an example from the author's personal experience of the disconnection that currently exists between investments and return on investment. When asked by a company to audit its technology status, my bet is that there is a better than 80 percent chance it has allowed both its headquarters and many of its subordinate organizations to maintain separate, functionally duplicative systems that are difficult to integrate; has bought software that has been massaged beyond recognition to fit "this" or "that" old procedure; and has tolerated significant diversity in the support of operational requirements across various business components.
Because all of this makes up a highly heterogeneous environment that is difficult to maintain, it becomes that much more complex to collect, analyze, and process data at higher organizational levels. In fact, when diverse platforms and incompatible operating systems are in use, collecting vital information for auditing purposes within a short timeframe is a nearly impossible task. Forget about real-time response and intraday virtual balance sheets.
From a bill-of-health perspective, it matters less whether the company has a properly managed ERP implementation in place than whether its computers and communications supports are streamlined and fairly homogeneous. When this is not the case, the probability is that the ERP software is misused, and invariably the technical audit proves it so.
Senior management should understand that what matters most is to get its company ready for a quantum leap and to do so with clear objectives in mind. The best way to proceed, in my judgment, is to follow the advice of Jean Monnet, the investment banker and father of the European Union, who suggested the pathway depicted in Exhibit 2.5.
Exhibit 2.5: Working on Goals and Accomplishments: From End Results to the Allocation of Resources
For planning purposes, start at the end with the deliverables and the time at which they should be presented.
Obviously, the execution path will start at the beginning, observing the milestones the planning phase has established.
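The Monnet pathway of Exhibit 2.5 amounts to scheduling backward from the deliverables. A minimal sketch of that planning logic, using hypothetical milestone names and durations for an ERP rollout (all figures here are illustrative assumptions, not from the text):

```python
from datetime import date, timedelta

def backward_schedule(deadline, tasks):
    """Plan from the end result backward: given a delivery deadline and an
    ordered list of (task_name, duration_in_days) pairs, compute the latest
    start and finish date of each task so the deliverable lands on time."""
    schedule = []
    finish = deadline
    for name, days in reversed(tasks):
        start = finish - timedelta(days=days)
        schedule.append((name, start, finish))
        finish = start
    schedule.reverse()  # present the plan in execution order
    return schedule

# Hypothetical milestones, planned back from a December 1 delivery date.
plan = backward_schedule(
    date(2025, 12, 1),
    [("requirements", 30), ("package selection", 20),
     ("reengineering", 60), ("pilot rollout", 40)],
)
for name, start, finish in plan:
    print(f"{name}: {start} -> {finish}")
```

Planning runs from the end result back to the allocation of resources; execution then follows the schedule in the forward direction, observing the computed milestones.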
The second guess when confronted with heterogeneity, this too dictated by experience, is that the company's top management is not really in charge of information technology. The old closed-shop practices are still around, and they do the organization a great disservice because they blunt its competitiveness.
It does not really matter whether management has asked for a consolidation of data from lower levels. This collection process depends on subordinate organizational units feeding information upstream, a job quite often performed manually despite the huge amounts of money spent on IT. Delays and costs aside, there is also a problem of timeliness and accuracy. Under the conditions described, experience teaches that maintaining consistent, accurate data flows is practically impossible; hence the need for reengineering.
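The consolidation problem just described can be made concrete with a minimal sketch: hypothetical feeds from subordinate units report the same account, and the merge surfaces the inconsistencies that manual upstream reporting lets slip through. The unit names, account codes, and figures are all illustrative assumptions.

```python
def consolidate(feeds):
    """Merge account balances reported by subordinate units and flag
    accounts whose figures disagree across feeds."""
    merged, conflicts = {}, []
    for unit, records in feeds.items():
        for account, balance in records.items():
            if account in merged and merged[account][1] != balance:
                # Same account, different figure: record the audit trail.
                conflicts.append((account, merged[account][0], unit))
            else:
                merged[account] = (unit, balance)
    return {a: b for a, (u, b) in merged.items()}, conflicts

feeds = {
    "headquarters": {"1010-cash": 500_000, "2020-payables": 120_000},
    "subsidiary_a": {"1010-cash": 480_000},  # disagrees with headquarters
}
totals, conflicts = consolidate(feeds)
print(conflicts)
```

Detecting the conflict is the easy part; reconciling it is exactly the manual, error-prone work that reengineering the data flows is meant to eliminate.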
If one listens only to the assertions of IT managers, it would appear that this need to reengineer a major part of the company's technological infrastructure is not urgent; it is not even self-evident. This attitude sidesteps some of the key questions facing companies today, questions to which there are no easy answers:
What kinds of new skills are necessary to capitalize on technology investments?
How long after a new technology is used should it be substituted?
How can one avoid the transition period being accompanied by problems that affect the end user's ability to perform his or her daily work?
Valid answers to the first two queries are most urgent if one accounts for the fact that practically every company still carries, at least in some part of its operations, EDP remnants from the 1970s and even the 1960s, characterized by a crying need for restructuring. But rarely is management willing or able to answer the query: "How long are the technological Middle Ages going to continue?"
Restructuring is, to a large extent, an organizational mission, invariably accompanied by the need for new, more efficient programming products. That is where ERP comes in, because re-equipping in software should not be done by reinventing the wheel but by following the policy described in Exhibit 2.6. This policy rests, in large part, on packages purchased off the shelf, which do, however, require skill to implement appropriately.
Exhibit 2.6: The Policy to be Followed in Connection with Programming Products
The principle is that of setting the right priorities. Neither the introduction of bought software nor the reengineering effort that should address the technological dark corners solves the salient problems by itself. Change is necessary, but such change should not disrupt current IT support. On the contrary, it should preserve the continuity of ongoing information services until such time, in the near future, as they are replaced by a system that is state-of-the-art, integrated, functional, seamless, reliable, flexible, and operating at reasonable cost.
The speed of reengineering complex systems, and the associated expenditures, depend on how the work is organized, the availability of appropriate human capital, the retooling necessary to get results, the ability to reuse relevant solutions previously attained and therefore tested, and other factors along this frame of reference.
Two of the processes that have been successfully tested over the years, and that are part and parcel of the infrastructural layer of reorganization and reengineering, are classification and identification. Both are discussed in detail, with practical examples, in Chapters 10 and 11. Classification raises the perceptual knowledge of objects into the conceptual knowledge of relationships, sequences, and laws. This knowledge belongs to us through the inherent nature and structure of the mind, but it can be brought into perspective only through rigorous organizational work.
Classification and identification are so crucial to the successful implementation of ERP and to the advent and use of smart materials that Section II provides not only the principles but also hands-on experience in implementing a rational classification solution that is valid throughout the organization, and describes how to associate with it a parallel identification structure.
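To make the pairing of the two structures concrete, here is a minimal sketch (not the book's own scheme, which Section II develops) of a hierarchical classification code alongside a parallel, meaning-free identification number. The level names, field widths, and check-digit rule are illustrative assumptions; the point is that the classification code carries conceptual position while the identifier stays structure-free, so the two can evolve independently.

```python
def classify(family, group, item):
    """Build a hypothetical three-level classification code,
    e.g. family 12, group 34, item 567 -> '12.034.567'."""
    return f"{family:02d}.{group:03d}.{item:03d}"

def identify(serial):
    """Attach a simple mod-10 check digit to a meaning-free serial
    number, so data-entry errors in the identifier can be caught."""
    digits = [int(d) for d in str(serial)]
    check = (10 - sum(digits) % 10) % 10
    return f"{serial}-{check}"

code = classify(12, 34, 567)   # where the item sits in the taxonomy
ident = identify(900142)       # unique identifier, no embedded meaning
print(code, ident)
```

Keeping identification parallel to, rather than embedded in, the classification code is the design choice that lets a taxonomy be reorganized without renumbering every item in the database.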
For starters, classification has to do with order, and also with the intellect and the senses. The philosopher Locke was wrong when he said, "There is nothing in the intellect except what was in the senses." The mathematician Leibniz was right when he added, "Nothing except the intellect itself."
Another philosopher, Immanuel Kant, advised that perceptions weave themselves into ordered thought. If the mind were not an active entity hammering order out of chaos, how could the same experience leave one man mediocre while raising another, more tireless person to the light of wisdom?
Classification and identification are valuable intangibles. As discussed in Chapter 2.4, the measurement of benefits derived from a forward leap in intangibles could be instrumental in evaluating the killer applications in the years to come. Some others will concern idea databases and complex concepts embedded in documents.
Historically, computers have processed information while people have handled documents. While computers have been capable of both scanning documents and processing the information in them for many years, legacy systems have been unable to search documents through concepts and content — which is what underpins idea databases.[1]
Quite similarly, with legacy applications, corporate information is kept in many places and in many incompatible forms. While a great wealth of data, even knowledge, is stored on a company's computer systems, it is neither readily accessible nor easy to find. With legacy software, data retrieval procedures are time-consuming, and accessing the database online becomes expensive. This has also led to massive ongoing errors that must be corrected through reengineering. Unless a thorough classification and identification job is performed, however, solutions remain mostly elusive.
[1] D.N. Chorafas and H. Steinmann, Supercomputers, McGraw-Hill, New York, 1990.