International Journal of Advanced Computer Engineering and Communication Technology (IJACECT)

Performance Analysis of Wavelet based Medical Image Compression using EZW, SPIHT, STW and WDR Algorithms for Cloud Computing

1D. Ravichandran, 2Ramesh Nimmatoori, 3Ashwin Dhivakar MR

1Dept of CSE, ATRI, Hyderabad, 2Aurora Group of Colleges, Hyderabad, 3Dept of CSE, JNU, Jaipur

Abstract - The objective of this paper is to investigate, evaluate and analyze the effectiveness of the wavelet based Embedded Zerotree Wavelet (EZW), Set Partitioning in Hierarchical Trees (SPIHT), Spatial Orientation Tree Wavelet (STW) and Wavelet Difference Reduction (WDR) compression techniques on medical images. A comparative analysis of these algorithms is carried out based on performance parameters such as Peak Signal to Noise Ratio (PSNR), Mean Square Error (MSE), Compression Ratio (CR) and Bits per Pixel (BPP) for a selected set of medical images of different modalities. The algorithms discussed in this paper have been implemented using the wavelet toolbox of MATLAB, and the results of the study are presented here. The simulation results show that the proposed approach is a good option for efficient, effective and economical image storage and retrieval in cloud based medical image computing.

Keywords - SPIHT, EZW, STW, WDR, wavelet, medical image compression, cloud computing

I. INTRODUCTION

IT IS well known that medical imaging has become one of the most important methods for visualization and diagnostic interpretation in medicine [1]-[3],[15]-[18],[20]. With the advent of high performance digital computer systems over the past decade, the field has witnessed tremendous development of advanced modalities and instruments for detecting, storing, transmitting, analyzing, and displaying images. Nowadays a large volume of medical image data is being generated in hospitals and health care institutions through different modalities, namely, Magnetic Resonance Imaging (MRI), Ultrasound Imaging (US), Single Photon Emission Computed Tomography (SPECT), Positron Emission Tomography (PET), Nuclear Medicine (Scintigraphy), Computed Tomography (CT), Digital Subtraction Angiography (DSA), Digital Fluorography (DF) and X-ray imaging (Radiography) [15]-[18].

Wearable Internet of Things (IoT) devices, Ubiquitous Sensor Networks (USN) and Body Sensor Networks (BSN) also generate massive collections of biosignals such as heart rate, oxygen level, respiration and blood pressure at low cost [1]-[3].

Biomedical images and biosignals play a major role in modern e-health services and have become an integral part of medical data communication systems [2]. A medical communication system is a technology that allows any type of medical data to be transmitted from the point of care to the desired specialist(s). The data is transmitted securely and rapidly to mobile devices or computers so that physicians can review it and provide opinions [2]-[3]. The increasing use of multimedia technologies and utility computing, such as cloud computing, grid computing and cluster computing, in the medical domain makes e-health services successful, viable and inevitable for cost-effective delivery to the common man [15]-[18]. E-health services have been trying to utilize technologies such as teleradiology, tele-consultation, telemedicine, telediagnosis and telematics for better patient care and timely services [1],[2],[18]-[19].

Storing and transmitting such a large volume of medical image data and biosignals for e-health services on utility computing platforms such as the cloud, across the globe and for telediagnosis, is a critical and time-consuming job [1],[15]-[19]. For the past two decades, efficient compression algorithms have been proposed and used in order to reduce transmission time and storage costs.

The quality evaluation of medical image compression is an essential process in order to provide cost-effective services to the common man in the healthcare sector. In this paper, we have implemented a medical image compression system based on high-efficiency coding schemes, namely the Embedded Zerotree Wavelet (EZW), Set Partitioning in Hierarchical Trees (SPIHT), Spatial Orientation Tree Wavelet (STW) and Wavelet Difference Reduction (WDR) techniques, applied to medical images. In our experiments, we have investigated the trade-off between the quality of the reconstructed image and the type of coder. The performance of each image coder is evaluated based on the Peak Signal to Noise Ratio (PSNR), Mean Square Error (MSE), Compression Ratio (CR) and Bits per Pixel (BPP) for a selected set of medical images of different modalities. The results show that diagnostically reliable compressed images can be obtained through these advanced wavelet based algorithms.


The rest of the paper is organized as follows. Section II discusses the related work and motivation of this paper. Section III describes contemporary wavelet based image compression algorithms. The research methodology and working environment of the proposed study are given in Section IV. The experimental results are provided and discussed in Section V. Finally, Section VI draws conclusions and suggests directions for future research.

II. RELATED WORK

It is well known that image compression is a process of reducing the size of an image for minimal storage and faster transmission. Digital images generally contain significant amounts of spatial and spectral redundancy: spatial redundancy is due to the correlation between neighboring pixel values, and spectral redundancy is due to the correlation between different color planes. Image compression (coding) techniques reduce the number of bits required to represent an image by taking advantage of these redundancies [4]-[7]. An inverse process called decompression (decoding) is applied to the compressed data to obtain the reconstructed image. The objective of image compression is not only to save storage space but also to keep the resolution and visual quality of the reconstructed image as close to the original as possible [12]. Many compression algorithms have been designed, developed and used over the past two decades, and each algorithm has its own merits and demerits.

Historically, the concept of "ondelettes" or "wavelets" originated from the study of time-frequency signal analysis, wave propagation, and sampling theory [4]. One of the main reasons for the development of wavelets and wavelet transforms is that Fourier transform analysis does not capture the local information of signals, so the Fourier transform cannot be used for analyzing signals jointly in time and frequency [4]-[7]. The wavelet transform has good time-frequency locality and effectively overcomes the limitations of the Fourier transform in decomposing and reconstructing smooth, complex image signals [7]. Due to its higher coding efficiency and superior spatial and quality scalability over DCT based coding, discrete wavelet transform (DWT) coding has been adopted as the core technology of the JPEG 2000 still image coding standard. J. M. Shapiro [5] proposed the EZW (Embedded Zerotree Wavelet) algorithm in 1993; its complexity is low, its bit stream is embedded, and it is easy to control the compression ratio and realize scalable coding. A. Said and W. A. Pearlman [10] proposed an efficient improvement in 1996, namely SPIHT (Set Partitioning In Hierarchical Trees), using spatial orientation trees. This method represents the zerotree structure of wavelet coefficients efficiently and accurately, which increases the compression efficiency and reduces the complexity of the coding.

James Walker et al. [8] introduced the Wavelet Difference Reduction (WDR) coder in 2000 and its enhanced version, the Adaptively Scanned Wavelet Difference Reduction (ASWDR) coder, in 2001; both remain very efficient and fast image compression schemes to date.

III. WAVELET BASED IMAGE COMPRESSION ALGORITHMS

A typical wavelet based image coder consists of three major parts: a wavelet filter, a quantizer and an entropy coder (Fig 1). The wavelet filter bank decomposes the image into wavelet coefficients [6]-[9]. The quantizer then quantizes these coefficients, and the entropy coder encodes the quantized coefficients to produce the output bit stream. Although the overall performance of the compression system depends on all three parts, the choice of wavelet filter decomposition and the method of bit stream generation ultimately determine the performance of the coder. If the wavelet filter performs poorly, the picture quality will not be maintained [9],[12],[14].
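This three-stage structure can be illustrated with the following Python sketch (assuming the NumPy and PyWavelets packages); the uniform quantizer and the entropy estimate used as a stand-in for the entropy coder are simplifications introduced here for illustration, not the coders studied in this paper.

import numpy as np
import pywt

def wavelet_coder_stages(image, wavelet="bior4.4", level=3, step=8.0):
    # Stage 1 - wavelet filter bank: decompose the image into subband coefficients.
    coeffs = pywt.wavedec2(image.astype(float), wavelet, level=level)
    flat, slices = pywt.coeffs_to_array(coeffs)
    # Stage 2 - quantizer: uniform scalar quantization of the coefficients.
    symbols = np.round(flat / step).astype(np.int32)
    # Stage 3 - entropy coder (placeholder): estimate the zeroth-order entropy of
    # the quantized symbols; a real coder would emit an arithmetic- or
    # zerotree-coded bit stream at this point.
    _, counts = np.unique(symbols, return_counts=True)
    probs = counts / symbols.size
    bits_per_symbol = -np.sum(probs * np.log2(probs))
    return symbols, slices, bits_per_symbol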

Fig 1. Wavelet Compression Scheme

A. Embedded Zero Tree Algorithm (EZW)

The Embedded Zerotree Wavelet (EZW) coder is an image compression algorithm which yields the best image quality for a given bit rate and accomplishes this in an embedded fashion [11]. The EZW coder is based on the transform coding method. The Discrete Wavelet Transform (DWT) is used as a mapper that converts the 2-D image into a set of wavelet coefficients organized for zerotree coding. These coefficients are quantized using successive-approximation quantization (SAQ), and finally adaptive arithmetic coding is used for entropy coding. The main advantages of the EZW coder are precise rate control of the bit stream and effective compression performance [5].

The first step in this algorithm is setting up an initial threshold. A wavelet coefficient is said to be significant if its absolute value is greater than the threshold. In a hierarchical sub-band system, every coefficient is spatially related to a coefficient in the lower band; such coefficients in the higher bands are called 'descendants'. This is shown in Fig 2 as a parent-child relationship in a zerotree. If a coefficient is significant and positive, it is coded as 'positive significant' (ps). If a coefficient is significant and negative, it is coded as 'negative significant' (ns). If a coefficient is insignificant and all its descendants are insignificant as well, it is coded as 'zerotree root' (ztr). If a coefficient is insignificant but not all of its descendants are insignificant, it is coded as 'isolated zero' (iz). The algorithm involves two passes, a dominant pass and a subordinate pass. The EZW scanning order is shown in Fig 3.

In the dominant pass, the initial threshold is set to one half of the maximum wavelet coefficient magnitude, and each subsequent pass uses a threshold of one half of the previous value. The coefficients are coded as ps, ns, iz or ztr according to their values. Thus, as the number of passes increases, the precision of the coefficients increases.
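A minimal Python sketch of the dominant-pass symbol decision described above is given below; the descendants helper is a hypothetical function assumed to return the descendant coefficient values of a given position, and is not part of the original algorithm description.

# Minimal sketch of the EZW dominant-pass symbol decision.
# `descendants(idx)` is a hypothetical helper that returns the coefficient
# values of all descendants of position `idx` in the zerotree.
def ezw_symbol(coeff, idx, threshold, descendants):
    if abs(coeff) >= threshold:
        return "ps" if coeff >= 0 else "ns"   # positive / negative significant
    if all(abs(d) < threshold for d in descendants(idx)):
        return "ztr"                          # zerotree root
    return "iz"                               # isolated zero: some descendant is significant

After each dominant pass the threshold is halved and the scan is repeated, which is what makes the output bit stream embedded.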

Fig 2. Parent Child relationship in a Zerotree

Fig 3. EZW scanning order

B. Set Partitioning In Hierarchical Trees (SPIHT)

The Set Partitioning In Hierarchical Trees (SPIHT) coder is a wavelet based image compression algorithm that is fast and efficient [10]. The SPIHT coder is a modified and enhanced form of EZW and generally operates on an entire image at once: the whole image is loaded and transformed, and the algorithm then requires repeated access to all coefficient values. The SPIHT coder gives good image quality with a high PSNR, and faster coding and decoding than the EZW coder. It also supports a fully progressive bit stream and can be used for lossless compression and error protection.

The SPIHT coding algorithm is performed in two passes, called the sorting pass and the refinement pass. SPIHT performs a recursive partitioning of the trees in such a way that it identifies the positions of significant coefficients among the descendants of the considered coefficient [13]. During the sorting pass, the coefficients in the list of insignificant pixels (LIP) are tested, and those that become significant at the current threshold are moved to the list of significant pixels (LSP).

The outline of the SPIHT coding algorithm is given as follows:

Step 1. Initialization.

Set the list of significant pixels (LSP) to empty. Place the roots of the similarity trees in the list of insignificant pixels (LIP) and the list of insignificant sets (LIS). Set the significance threshold to

$T = 2^{\lfloor \log_2 \max_{(x,y)} |c_{x,y}| \rfloor}$

where the maximum is taken over the coefficient magnitudes and $c_{x,y}$ is the wavelet coefficient at position $(x, y)$.

Step 2. Sorting pass.

Using the set partitioning algorithm distribute the appropriate indices of the coefficients to the LIP, LIS and LSP.

Step 3. Refinement pass:

For each entry in the LSP that became significant at a higher threshold (i.e., in an earlier pass), send the nth most significant bit to the decoder.

Step 4. Decrement n by one and return to Step 2, until the specified bit rate is reached.
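As a rough illustration of the initialization step, the following Python/NumPy sketch computes the starting bit plane and threshold; the root_coords argument (coordinates of the coarsest-subband coefficients) is an assumption introduced here for illustration and the list handling is simplified relative to the full algorithm.

import numpy as np

def spiht_init(coeff_array, root_coords):
    # n indexes the most significant bit plane; the starting threshold is
    # T = 2**n, matching the formula given in Step 1 above.
    n = int(np.floor(np.log2(np.max(np.abs(coeff_array)))))
    threshold = 2 ** n
    LSP = []                  # list of significant pixels starts empty
    LIP = list(root_coords)   # tree roots start in the list of insignificant pixels
    # In the full algorithm only the roots that have descendants are placed in
    # the list of insignificant sets (LIS); all roots are kept here for simplicity.
    LIS = list(root_coords)
    return n, threshold, LIP, LIS, LSP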

C. Spatial Orientation Tree Wavelet (STW)

The Spatial Orientation Tree Wavelet (STW) algorithm is closely related to the SPIHT image coder and was proposed by Said et al. [10] in 1993. A wavelet image decomposition provides a hierarchical data structure for representing images, with each coefficient corresponding to a spatial region in the image. A spatial orientation tree is defined as the tree-structured set of coefficients whose root is located in one of the directional bands (i.e. LH, HL and HH) at any level (Fig 5). The three direct descendants of any LL band coefficient are the roots of three full depth spatial orientation trees (Fig 4); these three trees carry the high frequency information in the three orientations (horizontal, vertical and diagonal) of the corresponding spatial region. A tree is called a full depth spatial orientation tree if its root starts in the highest level directional bands. The STW coder's performance is based on three concepts: partial ordering of the transformed coefficients by magnitude, transmission of coordinates via a set partitioning algorithm, and exploitation of the hierarchical structure of the subband transformation.
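The parent-to-child indexing underlying such trees can be sketched as follows; this is the standard quad-tree relation for a dyadic wavelet decomposition, written in Python for illustration, not the authors' code.

# A coefficient at (r, c) in a directional subband has four children at the
# corresponding position in the next finer subband of the same orientation.
def children(r, c, rows, cols):
    kids = [(2 * r, 2 * c), (2 * r, 2 * c + 1),
            (2 * r + 1, 2 * c), (2 * r + 1, 2 * c + 1)]
    # Children exist only while the indices stay inside the transform array.
    return [(i, j) for (i, j) in kids if i < rows and j < cols]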

Fig 4. Set Partitioning in Hierarchical Trees (SPIHT) coding


D. Wavelet Difference Reduction (WDR)

The Wavelet Difference Reduction (WDR) algorithm is a wavelet based image compression coder designed and developed by James S. Walker in 2000 [8]. The important features of WDR are low complexity, region-of-interest coding, embeddedness, and progressive SNR scalability. When arithmetic compression is used, the rate-distortion performance of the ASWDR algorithm is only slightly worse than that of SPIHT, while the perceptual quality of ASWDR images is clearly superior to SPIHT.

The outline of the WDR algorithm on a grey scale image is given below:

Step 1. Perform a wavelet transform of the image.

Step 2. Choose a scanning order for the transformed image, whereby the transform values are scanned via a linear ordering in a zig-zag through subbands from lower to higher. Row-based scanning is used in the low- pass/high-pass subbands and column-based scanning is used in the high-pass/low-pass subbands.

Step 3. Choose an initial threshold T such that all transform values have magnitude less than T and at least one transform value has magnitude greater than or equal to T/2.

Step 4. (Significance pass). Record the positions of new significant values: new indices whose transform values have magnitude greater than or equal to the present threshold. Encode these new significant indices using difference reduction.

Step 5. (Refinement pass). Record refinement bits for transform values that were found significant at larger (earlier) threshold values.

Step 6. (New scan order). Run through the significant values at each level of the wavelet transform. Each significant value, called a parent value, induces a set of child values (four child values for all levels except the last, and three child values for the last), as described in the quad-tree structure.

Step 7. Divide the present threshold by 2. Repeat the above steps 4–6 until either a bit budget is exhausted or a distortion metric is satisfied.
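The difference-reduction idea used in the significance pass (Step 4) can be sketched in Python as follows; this illustrates the encoding principle (index gaps sent as binary expansions with the leading 1 dropped, with the sign symbol acting as a separator) rather than the authors' implementation, and the function name and '+'/'-' symbols are chosen here for illustration.

def difference_reduction(significant_indices, signs):
    last = 0
    symbols = []
    for idx, sign in zip(significant_indices, signs):
        gap = idx - last
        bits = bin(gap)[2:]        # binary expansion of the index gap
        symbols.extend(bits[1:])   # the most significant bit is always 1, so drop it
        symbols.append(sign)       # '+' or '-' terminates this gap's bit string
        last = idx
    return symbols

# Example: indices [2, 3, 7, 12] with signs ['+', '-', '+', '+'] (gaps 2, 1, 4, 5)
# encode as ['0', '+', '-', '0', '0', '+', '0', '1', '+'].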

IV. RESEARCH METHODOLOGY AND WORKING ENVIRONMENT

The research methodology of the proposed work and its algorithmic steps (Fig. 10) are summarized as follows; an illustrative code sketch of this pipeline is given after the list:

1. Load medical image in MATLAB using Image Acquisition system.

2. Preprocessing of colour images

3. Study preprocessing effects of the given image and draw histogram of the original image.

4. Apply the Forward Discrete Wavelet Transform (2D-DWT) using the bior4.4 mother wavelet

5. Apply 3-level wavelet decomposition

6. Apply any one of the wavelet image compression coders (EZW / SPIHT / STW/ WDR)

7. Perform wavelet reconstruction

8. Study the histogram probability reduction on the RGB components using mean intensities, energy, entropy and image gradients.

9. Study quality assessment of the compressed image based on CR, MSE and PSNR

10. Repeat all of the above steps for the rest of the images.
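As a rough illustration of this methodology, the following Python sketch (assuming the NumPy and PyWavelets packages) performs the bior4.4 three-level decomposition, applies a simple hard threshold as a stand-in for the EZW/SPIHT/STW/WDR coders, reconstructs the image, and computes CR, MSE and PSNR. It is not the authors' MATLAB implementation, and the threshold value is illustrative.

import numpy as np
import pywt

def compress_and_evaluate(image, wavelet="bior4.4", level=3, threshold=20.0):
    # Steps 4-5: forward 2-D DWT with the bior4.4 mother wavelet, 3-level decomposition.
    img = image.astype(float)
    coeffs = pywt.wavedec2(img, wavelet, level=level)
    arr, slices = pywt.coeffs_to_array(coeffs)
    # Step 6 (stand-in): hard-threshold the coefficients instead of running a
    # full EZW / SPIHT / STW / WDR coder.
    kept = np.where(np.abs(arr) >= threshold, arr, 0.0)
    # Step 7: wavelet reconstruction from the retained coefficients.
    rec = pywt.waverec2(pywt.array_to_coeffs(kept, slices, output_format="wavedec2"),
                        wavelet)
    rec = rec[:img.shape[0], :img.shape[1]]
    # Step 9: quality assessment; CR is approximated by the ratio of total to
    # retained coefficients, and MSE / PSNR assume an 8-bit image.
    cr = arr.size / max(np.count_nonzero(kept), 1)
    mse = np.mean((img - rec) ** 2)
    psnr = float("inf") if mse == 0 else 10 * np.log10(255.0 ** 2 / mse)
    return rec, cr, mse, psnr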

Fig 5. Subband Notation in the Hierarchical Tree

Fig 6. Original image and its histogram

Fig 7. Compressed image and its histogram


Fig 8 Wavelet Simulation Diagram of Biorthogonal (bior4.4) at Decomposition level 3

Fig 9 Test Medical Images used for simulating the proposed experiment

Fig 10. Flow Diagram of the proposed method

V. EXPERIMENTAL RESULTS AND DISCUSSIONS

The effectiveness of the proposed method is elucidated by means of the experimental results. The experiments were implemented in MATLAB 2014(a) and performed on several standard 256 x 256 grey scale medical images of different modalities (Fig 6-8). The medical images used for this research work were downloaded from free on-line medical databases available for public use. The digitized images consist of 256 x 256 pixels with a depth of 8 bits/pixel (256 density levels).

The input images used in the experiments include Xray, MRI, CT and Ultrasound images (Fig 9).

The biorthogonal (bior4.4) wavelet filter is used to decompose the image (Fig 8). A three-level wavelet decomposition is performed, and simulations are then carried out using the SPIHT, EZW, STW and WDR coders [15]-[18]. The performance of the compressed image is determined by measuring the Compression Ratio (CR), Peak Signal to Noise Ratio (PSNR) and Mean Squared Error (MSE). The following tables (Table 1 - Table 4) summarize the experimental performance on the medical images in our simulation studies under different hard threshold values.

Table 1 - Performance of EZW coder

LOOP   BPP      CR        MSE       PSNR
8      0.7207   9.0088    205.501   25.003
9      1.195    14.9414   84.777    28.848
10     2.072    25.9033   31.511    33.146
11     3.439    42.9932   8.451     38.862
12     5.109    63.8672   1.59      46.118


Table 2 - Performance of SPIHT coder

LOOP   BPP      CR        MSE       PSNR
8      0.5156   6.4453    257.604   24.021
9      0.8379   10.4736   106.77    27.846
10     1.385    17.3096   43.772    31.719
11     2.227    27.832    15.331    36.275
12     3.248    40.6006   5.277     40.907

Table 3 - Performance of STW coder

LOOP   BPP      CR        MSE       PSNR
8      0.7207   9.0088    205.501   25.003
9      1.195    14.9414   84.777    28.848
10     2.072    25.9033   31.511    33.146
11     3.439    42.9932   8.451     38.862
12     5.109    63.8672   1.59      46.118

Table 4 - Performance of WDR coder

LOOP   BPP      CR        MSE       PSNR
8      0.7969   9.9609    208.813   24.933
9      1.326    16.5771   87.275    28.722
10     2.283    28.54     34.071    32.807
11     3.836    47.9492   10.802    37.796
12     5.816    72.7051   3.647     42.512

PSNR is the metric most commonly used to measure the reconstruction quality of any compression system and is based on the following formula:

$$\mathrm{PSNR} = 10 \log_{10} \frac{(2^{n}-1)^{2}}{\mathrm{MSE}}$$

where n is the bit depth of the image (8 bits/pixel here) and the MSE is calculated as

$$\mathrm{MSE} = \frac{1}{MN}\sum_{i=1}^{M}\sum_{j=1}^{N}\bigl(p(i,j)-\hat{p}(i,j)\bigr)^{2}$$

The signal p(i, j) in this case is the original image and p̂(i, j) is the reconstructed image, so their difference is the error introduced by compression. The higher the PSNR value, the better the image reconstruction quality.
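For reference, a direct Python/NumPy rendering of these two formulas might look as follows; this is an illustrative sketch, with 8-bit images assumed by default.

import numpy as np

def mse(original, reconstructed):
    diff = original.astype(float) - reconstructed.astype(float)
    return np.mean(diff ** 2)

def psnr(original, reconstructed, n_bits=8):
    err = mse(original, reconstructed)
    peak = (2 ** n_bits - 1) ** 2   # (2^n - 1)^2, the squared peak signal value
    return float("inf") if err == 0 else 10 * np.log10(peak / err)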

Fig 11. Performance of coders based on CR

The simulation results show that the WDR coder performs better than the rest of the coders in terms of high compression ratio and low mean square error (Fig 11 and Fig 12). For high PSNR values, the STW coder is better (Fig 13). PSNR values, however, are not always a reliable criterion for image fidelity [8]. Regarding the bits per pixel (BPP) measurements, the WDR coder outperforms the rest of the coders, especially EZW and SPIHT (Fig 14).

Fig 12 Mean Square Error (MSE) of coders

Fig 13 Performance of the coders based on PSNR

Fig 14 Bits per Pixel vs Encoding loops

VI. CONCLUSION AND FUTURE SCOPE

In this study, we have used state-of-the-art quality metrics to analyze the performance of wavelet based medical image compression using the EZW, SPIHT, STW and WDR algorithms for cloud computing. Based on the above investigation, we conclude that the WDR coder is a reasonable choice for better compression performance on medical images when computation time is limited, while for high PSNR values the STW coder is better. PSNR values, however, are not always a reliable criterion for image fidelity [8]. It is also observed that adaptive decomposition is required to achieve a balance between image quality and computational complexity.

Future work will study the implications of image statistics and image features on coding performance using evolutionary and soft computing techniques such as Genetic Algorithms (GAs) and Fuzzy Logic.

ACKNOWLEDGEMENT

The authors would like to express their gratitude to the management of the Aurora group of colleges, Hyderabad, where this work was performed, and to thank Dr Suresh Babu, Professor in CSE, Nalla Malla Reddy Engineering College, Hyderabad, for helpful discussions during the course of this work.

REFERENCES

[1] A. M. Rahmani, N. Thanigaivelan, T. N. Gia, J. Granados, B. Negash, and H. Tenhunen, "Smart e-Health Gateway: Bringing Intelligence to Internet-of-Things Based Ubiquitous Healthcare Systems", 12th Annual IEEE Consumer Communications and Networking Conference (CCNC), pp. 826-834, 2015.

[2] D. Del Testa and M. Rossi, "Lightweight Lossy Compression of Biometric Patterns via Denoising Autoencoders", IEEE Signal Processing Letters, Vol. 22, No. 12, pp. 2304-2308, 2015.

[3] A. Paul, T. Z. Khan Podder, P. Ahmed, M. M. Rahman and M. H. Khan, "Iris image compression using wavelets transform coding", IEEE 2nd Int. Conf. on Signal Processing and Integrated Networks (SPIN), pp. 544-548, 2015.

[4] S. Mallat, "A theory for multiresolution signal decomposition: The wavelet representation", IEEE Trans. Pattern Anal. Mach. Intell., vol. 11, pp. 674-693, July 1989.

[5] J. M. Shapiro, "Embedded image coding using zerotrees of wavelet coefficients", IEEE Trans. on Signal Processing, 41(12):3445-3462, December 1993.

[6] C. Christopoulos, A. Skodras, and T. Ebrahimi, "The JPEG2000 Still Image Coding System: An Overview", IEEE Trans. Cons. Elect., Vol. 46, No. 4, pp. 1103-1127, Nov. 2000.

[7] B. E. Usevitch, "A Tutorial on Modern Lossy Wavelet Image Compression: Foundations of JPEG 2000", IEEE Signal Processing Magazine, pp. 22-35, Sept. 2001.

[8] J. S. Walker and T. Q. Nguyen, "Adaptive Scanning Methods For Wavelet Difference Reduction In Lossy Image Compression", IEEE Int. Conf. Proc. ICIP 2000, Vancouver, Canada, Sept. 2000.

[9] M. Antonini, M. Barlaud, P. Mathieu, and I. Daubechies, "Image coding using wavelet transform", IEEE Trans. on Image Processing, 1(2):205-220, April 1992.

[10] A. Said and W. A. Pearlman, "A new, fast, and efficient image codec based on set partitioning in hierarchical trees (SPIHT)", IEEE Trans. on Circuits and Systems for Video Technology, 6(3):243-250, June 1996.

[11] C. D. Creusere, "A new method of robust image compression based on the embedded zerotree wavelet algorithm", IEEE Trans. on Image Processing, 6(10):1436-1442, Oct. 1997.

[12] S. Grgic, M. Grgic, and B. Zovko-Cihlar, "Performance Analysis of Image Compression using Wavelets", IEEE Trans. on Industrial Electronics, Vol. 48, No. 3, pp. 682-695, June 2001.

[13] E. H. Adelson, E. Simoncelli, and R. Hingorani, "Orthogonal pyramid transforms for image coding", Proc. SPIE, vol. 845, Cambridge, MA, pp. 50-58, Oct. 1987.

[14] I. H. Witten, R. Neal, and J. G. Cleary, "Arithmetic coding for data compression", Comm. ACM, vol. 30, pp. 520-540, June 1987.

[15] D. Ravichandran, R. Nimmatoori, Ashwin Dhivakar MR, "Performance of Wavelet based Image Compression on Medical Images for Cloud Computing", IEEE 3rd Int. Conf. on Computing for Sustainable Global Development, New Delhi, March 2016.

[16] Ashwin Dhivakar MR, D. Ravichandran and V. Dakha, "Medical Image Compression based on Discrete Wavelet Transform (DWT) and Huffman Techniques for Cloud Computing", 2nd Int. Conf. on Advances in Computing, Control and Networking, Bangkok, Thailand, 28-29 August 2015.

[17] D. Ravichandran, R. Nimmatoori, Ashwin Dhivakar MR, "A study on Image Statistics and Image Features on Coding Performance of Medical Images", Int. J. of Advanced Computer Engineering and Communication Technology (IJACECT), Vol. 5, No. 1, pp. 1-6, Jan 2016.

[18] D. Ravichandran, R. Nimmatoori, Ashwin Dhivakar MR, "Medical Image Compression based on Daubechies Wavelet, Global Thresholding and Huffman Encoding Algorithm", Int. J. of Advanced Computer Engineering and Communication Technology (IJACECT), Vol. 5, No. 1, pp. 7-12, Jan 2016.

[19] B. Bradie, "Wavelet Packet-Based Compression of Single Lead ECG", IEEE Trans. on Biomedical Engineering, Vol. 43, pp. 493-501, May 1996.

[20] A. K. Naik and R. S. Holambe, "New Approach to the Design of Low Complexity 9/7 Tap Wavelet Filters with Maximum Vanishing Moments", IEEE Trans. on Image Processing, Vol. 23, No. 12, pp. 5722-5732, Dec. 2014.


