
Calibration



Figure 3.22 Basic airflow measurement system, 45-gallon fuel drum.

Figure 3.23 A Lucas-Dawe air mass flow meter (central electrode, insulating supports, collector electrodes).

[Figure 3.22 labels: air inlet; British Standards orifice plate; airbox (volume to be a minimum of 10 times the capacity of the engine in order to damp cyclic pulsations); atmospheric pressure P1; engine.]

Lucas-Dawe Air Mass Flow Meters

This flow meter was originally intended for use with engine management systems.

(These were superseded by Wheatstone-bridge hot-wire systems.) However, this meter is better suited to laboratory use. The principle, illustrated in Figure 3.23, relies on a central electrode maintained at approximately 10 kV so that a corona discharge is formed. The exact voltage is varied so that the sum of the currents flowing to the two collector electrodes is constant. When air flows through the duct, the ion flow is deflected, causing an imbalance in the current flowing to the two collector electrodes. The difference in current flow is proportional to the mass airflow rate.
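A minimal sketch of this relationship follows, assuming a simple proportionality constant K obtained by calibrating against a reference flow standard; the constant and the current values are illustrative, not from the text.

```python
# Sketch of the Lucas-Dawe measurement principle described above.
# The supply voltage is regulated so that i1 + i2 is constant; the
# imbalance (i1 - i2) caused by ion deflection is proportional to
# the mass airflow rate.

def mass_flow_rate(i1_ma: float, i2_ma: float, k: float) -> float:
    """Estimate mass airflow (kg/s) from the two collector-electrode
    currents (mA), using an assumed calibration constant k."""
    return k * (i1_ma - i2_ma)

# Example: a 0.2 mA imbalance with an assumed k of 0.5 (kg/s per mA).
print(mass_flow_rate(5.1, 4.9, k=0.5))  # -> 0.1 kg/s (illustrative)
```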



… that the test results are questionable, with the test series being a waste of valuable time and money. The calibration principles detailed in this chapter are based on the general requirements of the European quality assurance standards EN 45001 and EN ISO 9000.

The calibration process can be defined as the set of operations that, under controlled conditions, establishes the relationship between the values reported by the target measuring chain and the corresponding known values measured via a calibration transfer standard.

The calibration process should target a number of calibrated points that are representative of real values anticipated during normal operating conditions:

• To calibrate prior to use, and to adjust at prescribed intervals, the measuring and test equipment against certified equipment having a known valid relationship to nationally recognized standards.

• To establish and maintain procedures to ensure that non-conforming equipment is prevented from inadvertent use or installation.
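The relationship in the definition above can be captured, for a linear measuring chain, with a simple least-squares fit of chain readings against transfer-standard values. The sketch below is illustrative only: the data points and the assumption of linearity are not from the text.

```python
import numpy as np

# Invented calibration points; real points should span the operating range.
standard = np.array([0.0, 25.0, 50.0, 75.0, 100.0])   # transfer-standard values
reported = np.array([0.4, 25.1, 50.3, 75.6, 100.9])   # measuring-chain readings

# Fit reported = gain * standard + offset, then invert to correct readings.
gain, offset = np.polyfit(standard, reported, 1)

def corrected(reading: float) -> float:
    """Map a raw chain reading back onto the transfer-standard scale."""
    return (reading - offset) / gain

print(f"gain={gain:.4f}, offset={offset:.4f}")
print(corrected(50.3))  # reading mapped back onto the standard scale
```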

Calibration for all elements critical to the utility of the test should always be traceable to national standards. The frequency of calibration should be determined by a scientific approach, paying due attention to the past behavior of the measurement system used.

Evidence of calibration should be readily discernible on all measurement equipment.

Wherever possible, the calibration of each parameter should be performed with reference to the whole measurement system. Calibration should include all instrumentation between the physical quantity being measured and the point at which data are logged for use in the test report. For example, calibration of a temperature sensor would require the sensor to be placed in a temperature-controlled bath and the measurement recorded using the instrumentation employed during the test. Alternatively, persuasive evidence should be provided for using other direct methods where these are considered appropriate.

Where possible, all critical equipment shall be tagged or labeled with the serial number, frequency of calibration, and calibration status shown on the tag or label.

Calibration of each parameter and its associated measuring chain must be performed at regular intervals not exceeding 12 months. On completion of calibration, the date of the calibration should be documented. Regular calibration intervals should be defined, based on the history of each parameter and the associated measuring chain. If any parameter/measuring chain is suspected to be out of calibration, an intermediate calibration check must be performed. Similarly, if any component in the measuring chain is replaced, an intermediate calibration check must be performed.

Any excursions outside the permitted limit of error for that parameter must be documented, with the analyzed reason for the unexpected occurrence recorded.
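A minimal sketch of such an excursion check is shown below; the function name, the data points, and the ±0.5 limit of error are hypothetical.

```python
# Flag calibration points whose error exceeds the permitted limit of
# error (LOE) so they can be documented with an analyzed reason.

def loe_excursions(points, loe):
    """Return (standard, reported, error) for points outside the LOE."""
    excursions = []
    for standard_value, reported_value in points:
        error = reported_value - standard_value
        if abs(error) > loe:
            excursions.append((standard_value, reported_value, error))
    return excursions

calibration_points = [(0.0, 0.2), (50.0, 50.3), (100.0, 100.9)]
for std, rep, err in loe_excursions(calibration_points, loe=0.5):
    print(f"standard={std}: reported={rep} (error {err:+.2f}) outside LOE")
```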

Definitions of Calibration Terms

Operational range: The widest possible range of values (the difference between the maximum and minimum values) anticipated for any given parameter throughout the duration of the test.

Parameter/measuring chain: All items of hardware with the potential to affect the reported value in engineering units must be identified. Examples include the sensor, cable, signal conditioning, and display/logging resolution.


Parameter calibration range: The maximum recommended range over which each specific parameter should be calibrated.

Calibration range: The range (minimum to maximum) over which each specific parameter/measuring chain is calibrated; it should exceed the operating range.

Full-scale deflection (FSD): The maximum single-sided value for each measuring chain having a calibrated range intercepting at or through zero.

Limits of error (LOE): The maximum plus or minus error acceptable with reference to the calibration transfer standard, or the defined limits of error excluding uncertainty errors associated with the calibration transfer standard and the method of calibration.

Accuracy of measurement: The closeness of the agreement between the result of a measurement and the true value of the quantity being measured.

Uncertainty of measurement: The result of an evaluation aimed at characterizing the range within which the true value of a measurement is estimated to lie, generally with a given likelihood (see the reporting convention following these definitions).

Traceability: The property of the result of a measurement whereby it can be related to appropriate measurement standards (generally international or national) through an unbroken chain of comparisons.
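As a concrete note on how uncertainty of measurement is commonly reported (this coverage-factor convention is general metrology practice, not stated in the text), a calibrated value is quoted with an expanded uncertainty built from the combined standard uncertainty:

```latex
% Common reporting convention: expanded uncertainty U = k * u_c, where
% u_c is the combined standard uncertainty and k is the coverage factor
% (k = 2 corresponds to a likelihood of roughly 95%).
y = \bar{y} \pm U, \qquad U = k\,u_c, \qquad k \approx 2
```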

Calibration Personnel and Equipment

All personnel carrying out calibration should be adequately trained and regularly reviewed to assess their competence with the calibration equipment and procedures. All channels defined as critical calibration parameters must be calibrated using equipment that is fit for that purpose and traceable to national standards. Equipment records must be kept, detailing the history of all calibration standards used.

A pressure transducer can be calibrated by direct comparison against a reference pressure instrument or by subjecting the transducer to a verifiable pressure using a dead-weight tester (Figures 3.24 and 3.25). Dead-weight testers comprise a piston, a reservoir, and a means to connect a pressure-sensing device to the reservoir. In the operation of some devices, the piston must be spun to indicate that it is riding on the reservoir liquid as opposed to resting on the supports. These devices are useful for calibration of pressure sensors in the range of 70 Pa to 700 MPa. The frequency response and rise time of a pressure transducer must be found by dynamic calibration.
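To make the dead-weight principle concrete: the pressure generated is the applied force divided by the piston area, P = F/A = mg/A. The sketch below uses an assumed 10 kg weight and a 1 cm² piston, values chosen purely for illustration.

```python
# Worked example of the pressure generated by a dead-weight tester:
# P = F / A = (m * g) / A.

G = 9.80665              # standard gravity, m/s^2
mass_kg = 10.0           # applied dead weight (assumed)
piston_area_m2 = 1.0e-4  # 1 cm^2 piston (assumed)

pressure_pa = mass_kg * G / piston_area_m2
print(f"{pressure_pa:.0f} Pa (~{pressure_pa / 1e5:.2f} bar)")
# -> 980665 Pa (~9.81 bar)
```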

Temperature Calibration

Calibration can be taken to mean the establishment of the relationship between the transducer output and the transducer temperature, within a tolerance or band of uncertainty.
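A minimal sketch of establishing that relationship follows: invented bath temperatures and thermocouple EMF readings are fitted with a low-order polynomial, and the worst-case residual gives the achieved tolerance band. All values and the choice of a quadratic fit are assumptions for illustration.

```python
import numpy as np

# Invented readings; a real calibration would use a controlled bath
# or fixed points, recorded through the test instrumentation.
bath_temp_c = np.array([0.0, 50.0, 100.0, 150.0, 200.0])
emf_mv = np.array([0.00, 2.02, 4.10, 6.14, 8.14])  # thermocouple output

# Thermocouples are mildly nonlinear, so fit a low-order polynomial
# mapping EMF back to temperature.
coeffs = np.polyfit(emf_mv, bath_temp_c, 2)
predicted = np.polyval(coeffs, emf_mv)

band = np.max(np.abs(predicted - bath_temp_c))
print(f"worst-case residual: {band:.3f} deg C")  # achieved tolerance band
```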

Calibrations can be considered to fall into one of four categories:

1. Acceptance tests
2. Batch calibration
3. Calibration of a single thermocouple for a fixed application
4. Calibration of reference standard thermocouples


Figure 3.24 A dead-weight tester (labels: applied force, oil reservoir, adjustable plunger).

Figure 3.25 A commercial dead-weight tester. (Courtesy of the University of Sussex)

Chapter 4

An Introduction to Mr. Diesel
