KNOWLEDGE TRANSFER
CASE 7-2 REAL-WORLD CASE
HEWLETT-PACKARD SAP IMPLEMENTATION
Hewlett-Packard was founded in 1939 by Bill Hewlett and Dave Packard, both students at Stanford University. They built an audio oscillator, an electronic test instrument used by sound engineers. One of their first sales was to Walt Disney Studios, which used the device to develop and test a new sound system for the movie Fantasia. From 1939 to the present, HP has grown and changed as technology has grown and changed, often inventing new and useful technology products for businesses and consumers. It is now a worldwide information technology company headquartered in Palo Alto, California, with $85 billion in revenues. The company is currently organized into three divisions or groups:
• Personal Systems - business and consumer PCs, mobile computing devices, and workstations
• Imaging and Printing - inkjet, LaserJet, and commercial printing, printing supplies, digital photography, and entertainment
• Technology Solutions - business products including storage and servers, managed services, and software.5
For several years Hewlett-Packard had been working to centralize its ERP systems. They had migrated five product groups into two SAP systems and had been very successful. A couple of years earlier, HP had purchased Compaq and as a result needed to incorporate the two companies' operations into a single model. In May 2004, however, Hewlett-Packard was implementing the SAP ERP system in its largest North American division. The ERP implementations in previous divisions had been successful, and there was no reason to think that this one would be problematic.
5. Extracted from the HP website: www.hp.com
The company had a number of successful implementations under its belt and believed that even though this was a much larger division, it had a good, experienced team that could address almost any implementation issue. The Go-live plan allowed for about 3 weeks of problems and issues related to interfacing between the legacy order-entry system and the new ERP, SAP. The project manager had identified this interface as one of the biggest risks and had a plan in place to address it.
When the system went live, however, there were some technical glitches between the legacy system and SAP. Although the technical problems were not a big issue and were mostly resolved in 4 or 5 weeks, about 20 percent of orders were stopped dead in the water until the problems were fixed. This created a backlog of orders, and the manual workarounds were not sufficient to keep orders flowing to meet customer demand. Customers called HP to complain but, even worse, they also called HP's competitors to deliver the products HP could not supply. HP estimated the financial impact at about $160 million: $120 million in order backlogs and $40 million in lost revenue.
The implementation was considered a disaster. It was in fact the result of some very minor technical problems that created a snowball effect on the business. The implementation team did many things right. They tested the system and the interface between the legacy system and SAP. The team also trained the end users 2 weeks prior to Go-live and made them pass a test to certify that they knew how to use the system. A number of the issues could have been addressed prior to Go-live with some added investigation and more timely training.
CONCLUSION
Hindsight being what it is, the obvious conclusion to be drawn from this implementation is that care needs to be taken when assessing readiness. The contingency plan was lacking and needed to be expanded to include not only the technical issues but also workarounds that addressed the business issues.
Two specific issues involving the end users were problematic:
• The training did not coincide with going live. The 2-week gap between the training and going live allowed the users to forget some of the details of how to use the system. This might have been alleviated by providing a practice instance for end users from the time they were trained until beyond Go-live.
• The second issue involved more complete testing. In a supply chain ERP implementation, developing a robust test plan and test data, along with testing using "real" data and "real" customer information, is essential for a successful Go-live (see the sketch below). This ensures that orders can be filled on a timely basis and that end users develop a high level of confidence in the system and its processes.
QUESTIONS
• What were the common threads between the Hugger-Mugger and HP ERP implementations?
• What key project management strategies might have been used to minimize Go-live problems with the HP SAP Go-live process?
• When implementing an ERP system, especially supply chain systems, identifying risks and minimizing them requires planning. Discuss how IT needs to work with the business to address Go-live planning and issue resolution.
GO-LIVE READINESS CHECKLIST

Each checklist item below lists its Seq number and Criterion, followed in parentheses by its Priority (H/M/L) and the X marks the item carries in the checklist's four checkpoint columns, then its Key Measures, Contingent Workaround(s), and Minimum "Pass Status." Fields left blank in the checklist are omitted. The Current Status and Actual Status columns are blank, to be completed during the readiness assessment.
1  Tech Infrastructure Readiness

1.01  Database servers. (M; X)
  Key Measures: Production servers installed, tested, stable. All required software installed, tested (system, database, network, application). Production servers available during scheduled hours. Utilization assessments.
  Contingent Workaround(s): Go-live on production server. Go-live on reporting server. Go-live on development server. Implement disaster recovery process.
  Minimum "Pass Status": 95% HR available for 1 month prior to Go-live; 80% fin; 80% rep.

1.02  Application servers. (M; X)
  Key Measures: Application servers installed, tested, stable. All required software installed, tested (system, network, application). Application server available during scheduled hours.
  Contingent Workaround(s): Go-live on production server using logical three-tier approach.
  Minimum "Pass Status": 95% available for 1 month prior to Go-live.

1.04  Web servers. (M; X)
  Key Measures: Web server installed, tested, stable. All required software installed, tested. Server available during scheduled hours.
  Contingent Workaround(s): Go-live on production Web server.
  Minimum "Pass Status": 95% available for 1 month prior to Go-live.
1.05  Report distribution (online view and remote distribution). (M; X)
  Key Measures: Product installed, configured, tested. Ability to view reports online. Ability to print to remote printers.
  Contingent Workaround(s): Print centrally and distribute manually through internal mail.
  Minimum "Pass Status": Product/operations test results indicate Go-live readiness.

1.06  Patches and fixes applied to database environments. (H; X)
  Key Measures: Patches and fixes applied.
  Contingent Workaround(s): None.
  Minimum "Pass Status": Patches and fixes issued 1 month prior to the freeze date have been applied.

1.07  Network availability. (H; X)
  Key Measures: Network availability during scheduled hours; performance test complete.
  Contingent Workaround(s): None.
  Minimum "Pass Status": 95% availability for 1 month prior to Go-live for all campuses.

1.08  Connectivity between sites and internet (if needed). (M; X X X X)
  Key Measures: Connectivity from each site LAN to the production server established through connectivity test.
  Contingent Workaround(s): Move user(s) to alternate site(s), dependent upon site. Go-live with a subset of the user population.
  Minimum "Pass Status": Minimum latency of 100 milliseconds from point-to-point.

1.09  Local site network. (M; X X X)
  Key Measures: Planned upgrades complete, if any. Site network performance test complete.
  Contingent Workaround(s): Move user(s) to alternate site(s), dependent upon campus; need to assess validity.
  Minimum "Pass Status": Minimum latency of 100 milliseconds from point-to-point.
1.11  Workstation. (H; X X)
  Key Measures: Workstations in place; connectivity test complete; all required software installed, tested (system, network, application, OA).
  Contingent Workaround(s): Move user(s) to alternate site.
  Minimum "Pass Status": Percentage of campus key users: A (80%), B (80%), L (80%), and W (75%).

1.12  Application availability during business hours - production. (H; X X X X)
  Key Measures: Application available during scheduled hours.
  Minimum "Pass Status": 95% available for 1 month prior to Go-live.

1.13  Application availability during business hours - reporting database. (M; X X X X)
  Key Measures: Application available during scheduled hours.
  Minimum "Pass Status": 95% available for 1 month prior to Go-live.

1.14  Application availability during business hours - development database. (M; X X X X)
  Key Measures: Application available during scheduled hours.
  Minimum "Pass Status": 95% available for 1 month prior to Go-live.

1.15  Batch window. (H; X)
  Key Measures: Batch performance test passed; heaviest batch schedule can complete within the allocated batch window.
  Contingent Workaround(s): Extend batch processing into normal online hours; quantify the number of hours the schedule would extend if necessary. Move or eliminate some processes if possible.
  Minimum "Pass Status": During the performance test, batch processes can be completed in the designated window.
1.16  Key transaction throughput. (H; X X X)
  Key Measures: Online performance test passed.
  Contingent Workaround(s): Implement "shifts" for online users to minimize concurrency. Workload balancing.
  Minimum "Pass Status": Performance test results indicate Go-live readiness; performance test scripts test the minimum requirements.

1.17  Production user classes and user IDs loaded. (M; X X X X)
  Key Measures: Production classes configured and tested. Production user IDs loaded. Production user authorization received. Production user IDs linked to classes.
  Contingent Workaround(s): Deploy additional staff on a temporary basis. Work overtime. Manual creation of operator IDs and passwords.
  Minimum "Pass Status": 95% of classes and IDs loaded.

1.18  Production environment is stable. (H; X X X X)
  Key Measures: Production environment tested during conversion dress rehearsal (after data is moved over).
  Minimum "Pass Status": Test results indicate Go-live readiness.
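Several of the infrastructure criteria above set "95% available for 1 month prior to Go-live" as the minimum pass status. The sketch below shows one way such an availability figure could be computed from an outage log; the scheduled-hours window and the outage records are illustrative assumptions, not values from the case.

```python
# Minimal sketch: availability over the month prior to Go-live, measured against
# a 95% pass threshold. Scheduled hours and the outage log are illustrative.

SCHEDULED_HOURS_PER_DAY = 20   # assumed scheduled window, e.g. 04:00 to 24:00
DAYS_IN_WINDOW = 30            # "1 month prior to Go-live"
PASS_THRESHOLD = 0.95

def availability(outage_minutes: list[int]) -> float:
    """Fraction of scheduled time the server was up during the readiness window."""
    scheduled_minutes = SCHEDULED_HOURS_PER_DAY * 60 * DAYS_IN_WINDOW
    downtime = sum(outage_minutes)
    return (scheduled_minutes - downtime) / scheduled_minutes

if __name__ == "__main__":
    # Illustrative outage log (minutes of downtime per incident) for one application server.
    outages = [45, 120, 30, 600]
    avail = availability(outages)
    status = "PASS" if avail >= PASS_THRESHOLD else "FAIL"
    print(f"Availability: {avail:.1%} -> {status}")  # e.g. 97.8% -> PASS
```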
2  Operational Readiness

2.01  Backup/restore ability. (H; X)
  Key Measures: Production instances backed up on a regular schedule. Restore works successfully. Production instances available for restore on demand. Both proven through ad-hoc testing.
  Contingent Workaround(s): None.
  Minimum "Pass Status": Operations test cycle 2 passed.

2.02  Disaster recovery plan (includes both technical and business aspects) - HIGH AVAILABILITY. (L; X X X X)
  Key Measures: High availability plan defined, documented (including manual steps). Recovery plan incorporated into operations test plan.
  Contingent Workaround(s): Utilize backup offsite server.
  Minimum "Pass Status": First draft complete.

2.03  Disaster recovery plan (includes both technical and business aspects) - CATASTROPHIC. (L; X X X X)
  Key Measures: Procedures defined; include manual steps.
  Minimum "Pass Status": First draft complete.

2.04  Central high-speed printer and other central printing services. (M; X)
  Key Measures: Equipment in place (incl. backup); all required software installed, configured, tested.
  Contingent Workaround(s): Split print file for multiple printers. Employ temps for folding. Utilize backup printer. Utilize backup folder.
  Minimum "Pass Status": Operations test cycle N passed.
2.05  Critical forms in place. (L; X X X)
  Key Measures: Available for use during functional training.
  Contingent Workaround(s): Utilize existing forms. Utilize "baseline" forms.
  Minimum "Pass Status": Approved forms distributed to campuses, available to end users.

2.06  Batch scheduler and production schedule. (H; X)
  Key Measures: Product installed, configured, tested; batch performance test complete; scheduled jobs entered and tested; ad-hoc jobs entered and tested.
  Contingent Workaround(s): Project team executes jobs until the scheduler is trained and available.
  Minimum "Pass Status": Operations test cycle 1 passed.

2.07A  Table maintenance - key procedures in place for the high-impact data areas. (L; X)
  Key Measures: Procedures defined, documented.
  Contingent Workaround(s): Update tables centrally.
  Minimum "Pass Status": First draft complete.

2.07B  Additional key site procedures in place - system requests, development, release management, interface development, testing, training, migration, post-implementation. (M; X X X X)
  Key Measures: Procedures defined, documented.
  Minimum "Pass Status": Operations test passed for critical tables.
2.08  Define production user IDs. (H; X X X X)
  Key Measures: Map of users to classes complete.
  Contingent Workaround(s): Sign-off operator classes.

2.09  Method to handle reporting requests from the sites. (L; X X X X)

3  Testing

3.01  Critical function 1. (H; X X X X)
  Key Measures: Ability to complete test of function 1.
  Contingent Workaround(s): None.
  Minimum "Pass Status": Successful pass of key scripts with no outstanding critical issues.

3.02  Critical function 2. (H; X X X X)
  Key Measures: Ability to complete test of function 2.
  Contingent Workaround(s): None.
  Minimum "Pass Status": Successful pass of key scripts with no outstanding critical issues.

3.03  Parallel test completes. (H; X X X X)
  Key Measures: 100% reconciliation; discrepancies identified and understood; new processes defined; manual steps defined.
  Minimum "Pass Status": 100% reconciliation with discrepancies identified.

3.04  Report security - ensure that security is set up appropriately. (H; X X X X)

4  Conversion Readiness

4.01  Conversion criteria 1. (H; X)
  Key Measures: Data converted successfully.
  Contingent Workaround(s): Postconversion fixes of small volume (manual).
  Minimum "Pass Status": XX% of key data successfully converted.
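Once the Current Status and Actual Status columns are filled in, a checklist like this lends itself to a simple mechanical roll-up. The sketch below models a criterion and applies one possible aggregation rule, requiring every high-priority criterion to meet its minimum pass status before recommending Go. The rule and the sample assessment results are assumptions for illustration; the case does not specify how statuses roll up.

```python
# Minimal sketch of evaluating a Go-live readiness checklist. The aggregation rule
# (all "H" criteria must pass; "M"/"L" failures are reported as open risks) is an
# illustrative assumption, not a rule stated in the case.

from dataclasses import dataclass

@dataclass
class Criterion:
    seq: str
    category: str
    name: str
    priority: str        # "H", "M", or "L"
    minimum_pass: str    # the documented minimum "Pass Status"
    passed: bool | None  # actual assessment result (None = not yet assessed)

def go_no_go(checklist: list[Criterion]) -> tuple[bool, list[Criterion]]:
    """Recommend Go only if every high-priority criterion passed; return all open risks."""
    risks = [c for c in checklist if c.passed is not True]
    blockers = [c for c in risks if c.priority == "H"]
    return (len(blockers) == 0, risks)

if __name__ == "__main__":
    # Two entries transcribed from the checklist, with illustrative assessment results.
    checklist = [
        Criterion("1.06", "Tech Infrastructure Readiness",
                  "Patches and fixes applied to database environments", "H",
                  "Patches issued 1 month prior to freeze date applied", True),
        Criterion("3.03", "Testing", "Parallel test completes", "H",
                  "100% reconciliation with discrepancies identified", False),
    ]
    go, risks = go_no_go(checklist)
    print("Recommendation:", "Go" if go else "No-Go")
    for c in risks:
        print(f"  Open risk {c.seq} [{c.priority}]: {c.name}")
```

In practice the team would tune the rule, for example also requiring a contingent workaround to be in place for any non-passing medium-priority item before recommending Go.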