
Abstract- Computer technology in various fields has simplified the work of human beings, but it has also resulted in large amounts of digital data. The challenge is managing this data, i.e. storing and retrieving it. People are sharing, transmitting and storing millions of images every moment. In image compression, we can reduce the quantity of data used in image representation without excessively changing the image's visual appearance. The procedure of reducing data size without losing the crucial information is known as data compression; it is mostly done to avoid occupying more memory and to increase the effective capacity of memory devices.


There are various data compression techniques which can be used. These techniques can be classified into two types, i.e. lossy and lossless compression. In this paper some of the lossless image compression techniques are discussed in detail.

Introduction

Digital images have become popular for transferring, sharing, storing and visualizing information, and hence high-speed compression techniques are needed. The most important goal is to reduce the time taken in the transmission of images. Data compression, and particularly image compression, plays a very crucial role in the field of multimedia computer services and other telecommunication applications.

The field of image compression has a wide spectrum of methods, ranging from classical lossless techniques and popular transform approaches to the more recent segmentation-based coding methods.

LOSSLESS COMPRESSION TECHNIQUES

In lossless compression there is no loss of data, i.e. after decompression the image is retrieved without any loss of information. The lossless compression techniques discussed below include:

1. Run length encoding
2. Huffman encoding
3. Arithmetic coding
4. Area coding
5. SCZ coding – Simple Compression Utilities and Library
6. Entropy encoding
7. Delta encoding algorithm
8. Dictionary techniques: LZW (Lempel-Ziv-Welch) coding is a dictionary-based coding
   a) LZ77
   b) LZ78
   c) LZW
9. Bit plane coding

Run length encoding

Run-length encoding (RLE) is a very simple form of data compression in which runs of data (that is, sequences in which the same data value occurs in many consecutive data elements) are stored as a single data value and count, rather than as the original run. This is most useful on data that contains many such runs: for example, simple graphic images such as icons, line drawings, and animations.

It is not useful with files that don't have many runs, as it could greatly increase the file size. Run-length encoding performs lossless data compression and is well suited to palette-based iconic images. It does not work well at all on continuous-tone images such as photographs, although JPEG uses it quite effectively on the coefficients that remain after transforming and quantizing image blocks.
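As a rough illustration of the idea, the short Python sketch below encodes a single row of pixel values as (value, count) pairs and decodes it back; the pair representation and the helper names are illustrative choices, not a standard file format.

# Illustrative sketch, not from any particular library: run-length encode a row of
# pixel values as (value, count) pairs and decode it back.

def rle_encode(pixels):
    """Collapse consecutive runs of equal pixel values into (value, count) pairs."""
    encoded = []
    i = 0
    while i < len(pixels):
        run_value, run_length = pixels[i], 1
        while i + run_length < len(pixels) and pixels[i + run_length] == run_value:
            run_length += 1
        encoded.append((run_value, run_length))
        i += run_length
    return encoded

def rle_decode(pairs):
    """Expand (value, count) pairs back into the original pixel sequence."""
    pixels = []
    for value, count in pairs:
        pixels.extend([value] * count)
    return pixels

row = [255, 255, 255, 0, 0, 255, 255, 255, 255]
pairs = rle_encode(row)          # [(255, 3), (0, 2), (255, 4)]
assert rle_decode(pairs) == row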

Huffman Algorithm

The general idea of the Huffman encoding algorithm is to allocate very short code-words to those blocks of input with high probabilities, while long code-words are allocated to those with low probabilities. The Huffman coding process is based on the two observations mentioned below:

a. Symbols that occur very frequently will have shorter code-words than symbols that occur less frequently.
b. The two symbols that occur least frequently will have code-words of equal length.

The Huffman code is prepared by combining together the two least probable characters and repeating this process until only one character remains.

A code tree is thus prepared, and the Huffman code is generated by labelling the code tree. It is the optimal prefix code generated from the given set of probabilities and has been used in many different compression applications. The generated code-words are of varying lengths, each using an integral number of bits. This results in a decrease in the average code length, and hence the overall size of the compressed data becomes smaller than the original. Huffman's algorithm was the first to provide a solution to the problem of constructing codes with minimum redundancy.
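The construction described above can be sketched in a few lines of Python using a priority queue; the function name is illustrative, and the sketch builds only the code-words, not a serialized bitstream.

# Illustrative sketch: repeatedly merge the two least probable nodes,
# prefixing '0' to one side and '1' to the other, until one node remains.
import heapq
from collections import Counter

def huffman_codes(symbols):
    """Build a prefix code in which frequent symbols get shorter code-words."""
    freq = Counter(symbols)
    # Each heap entry is (frequency, tie-breaker, {symbol: partial code-word}).
    heap = [(f, i, {s: ""}) for i, (s, f) in enumerate(freq.items())]
    heapq.heapify(heap)
    if len(heap) == 1:                      # degenerate case: only one distinct symbol
        return {s: "0" for s in heap[0][2]}
    tie = len(heap)
    while len(heap) > 1:
        f1, _, c1 = heapq.heappop(heap)     # the two least probable nodes
        f2, _, c2 = heapq.heappop(heap)
        merged = {s: "0" + code for s, code in c1.items()}
        merged.update({s: "1" + code for s, code in c2.items()})
        heapq.heappush(heap, (f1 + f2, tie, merged))
        tie += 1
    return heap[0][2]

codes = huffman_codes("aaaabbc")
# The most frequent symbol gets the shortest code-word: {'a': '1', 'b': '01', 'c': '00'}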

Entropy Based Encoding

In this compression process the algorithm first counts the frequency of occurrence of each pixel value in the image. The compression technique then replaces the pixels with algorithm-generated codes. These generated codes are fixed for a given pixel value of the original image and do not depend on the content of the image. The length of a generated code is variable and depends on the frequency of the corresponding pixel value in the original image.

Arithmetic Coding

Arithmetic coding is a form of entropy encoding used in lossless data compression. Normally, a string of characters such as the words "hello there" is represented using a fixed number of bits per character, as in the ASCII code. When a string is converted to arithmetic encoding, frequently used characters are stored with fewer bits and not-so-frequently occurring characters are stored with more bits, resulting in fewer bits used in total. Arithmetic coding differs from other forms of entropy encoding, such as Huffman coding, in that rather than separating the input into component symbols and replacing each with a code, arithmetic coding encodes the entire message into a single number.
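A minimal sketch of the interval-narrowing idea follows, using exact fractions so the single-number result is easy to see; a practical coder would instead use fixed-precision integer arithmetic with renormalization, and the helper names here are illustrative.

# Illustrative sketch: encode a whole message as one number inside a nested interval.
from fractions import Fraction
from collections import Counter

def build_intervals(message):
    """Assign each symbol a sub-interval of [0, 1) proportional to its frequency."""
    freq, total = Counter(message), len(message)
    intervals, low = {}, Fraction(0)
    for symbol, count in sorted(freq.items()):
        width = Fraction(count, total)
        intervals[symbol] = (low, low + width)
        low += width
    return intervals

def arithmetic_encode(message, intervals):
    """Narrow [low, high) once per symbol; any number in the final interval encodes the message."""
    low, high = Fraction(0), Fraction(1)
    for symbol in message:
        s_low, s_high = intervals[symbol]
        span = high - low
        low, high = low + span * s_low, low + span * s_high
    return (low + high) / 2          # a single number representing the entire message

def arithmetic_decode(code, intervals, length):
    """Recover the message by locating the code inside successive sub-intervals."""
    message = []
    for _ in range(length):
        for symbol, (s_low, s_high) in intervals.items():
            if s_low <= code < s_high:
                message.append(symbol)
                code = (code - s_low) / (s_high - s_low)
                break
    return "".join(message)

msg = "hello"
iv = build_intervals(msg)
code = arithmetic_encode(msg, iv)
assert arithmetic_decode(code, iv, len(msg)) == msg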

Delta encoding

Delta encoding represents a stream of compressed pixels as the difference between the current pixel and the previous pixel. The first pixel in the delta-encoded file is the same as the first pixel in the original image. Each following pixel in the encoded file is equal to the difference (delta) between the corresponding value in the input image and the previous value in the input image. In other words, delta encoding increases the probability that each pixel value will be near zero and decreases the probability that it will be far from zero.

This uneven probability distribution is exactly what Huffman encoding needs to operate effectively. If the original signal is not changing, or is changing in a straight line, delta encoding will result in runs of samples having the same value.
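A minimal sketch of delta encoding and decoding for a one-dimensional sequence of pixel values; the helper names are illustrative.

# Illustrative sketch: keep the first value, replace every later value by its
# difference from the previous one, and reverse the process to decode.

def delta_encode(pixels):
    return [pixels[0]] + [pixels[i] - pixels[i - 1] for i in range(1, len(pixels))]

def delta_decode(deltas):
    pixels = [deltas[0]]
    for d in deltas[1:]:
        pixels.append(pixels[-1] + d)
    return pixels

row = [100, 101, 103, 103, 103, 102]
encoded = delta_encode(row)      # [100, 1, 2, 0, 0, -1]  -> values cluster near zero
assert delta_decode(encoded) == row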

Area encoding

Area encoding is a refined form of run-length encoding and offers some major advantages over other lossless methods for binary images. In constant area coding, special code words are used to identify large areas of contiguous 1's and 0's: the image is segmented into blocks, and each block is classified as containing only black pixels, only white pixels, or mixed intensities. Another variant of constant area coding uses an iterative approach in which the binary image is decomposed into successively smaller blocks, and a hierarchical tree is built from these blocks. The decomposition stops when a block reaches a certain predefined size or when all pixels of the block have the same value; the nodes of this tree are then coded. For compressing white text a simpler approach, known as white block skipping, is used: blocks containing only white pixels are coded as 0, and all other blocks are coded as 1 followed by the block's bit pattern.
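As a rough sketch of the white block skipping variant just described, the code below emits a single '0' for every all-white block and a '1' followed by the block's bit pattern otherwise; the 4x4 block size and the string output format are illustrative assumptions.

# Illustrative sketch of white block skipping on a binary image (0 = black, 1 = white).

def white_block_skipping(image, block=4):
    """Emit '0' for an all-white block, otherwise '1' followed by the block's bit pattern."""
    height, width = len(image), len(image[0])
    out = []
    for r in range(0, height, block):
        for c in range(0, width, block):
            bits = [image[i][j]
                    for i in range(r, min(r + block, height))
                    for j in range(c, min(c + block, width))]
            if all(b == 1 for b in bits):
                out.append("0")                                # solid white block: one bit
            else:
                out.append("1" + "".join(map(str, bits)))      # mixed block: prefix + pattern
    return "".join(out)

page = [[1] * 8 for _ in range(4)]
page[1][5] = 0                        # one black pixel in the right-hand block
print(white_block_skipping(page))     # left block -> '0'; right block -> '1' + 16 bits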

Lempel-Ziv-Welch coding

Lempel-Ziv-Welch (LZW) is a universal lossless data compression algorithm created by Abraham Lempel, Jacob Ziv, and Terry Welch. It was published by Welch in 1984 as an improved implementation of the LZ78 algorithm published by Lempel and Ziv in 1978. Dictionary-based coding can be static or dynamic: in static dictionary coding the dictionary is fixed during the encoding and decoding processes, whereas in dynamic dictionary coding, as used by LZW, the dictionary is updated on the fly. The algorithm is simple to implement and has the potential for very high throughput in hardware implementations. It was the algorithm of the widely used UNIX file compression utility compress.

A large English text file can typically be compressed via LZW to half its original size.
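A minimal sketch of the dynamic-dictionary encoder follows: the dictionary starts with all single-byte strings and grows as longer strings are seen. The function name is illustrative and the matching decoder is omitted.

# Illustrative sketch of LZW encoding: grow a dictionary of substrings on the fly
# and emit the integer code of the longest already-known string at each step.

def lzw_encode(data):
    dictionary = {chr(i): i for i in range(256)}    # start from all single-byte strings
    next_code = 256
    current, output = "", []
    for ch in data:
        candidate = current + ch
        if candidate in dictionary:
            current = candidate                     # keep extending the current match
        else:
            output.append(dictionary[current])      # emit code for the longest known string
            dictionary[candidate] = next_code       # add the new string to the dictionary
            next_code += 1
            current = ch
    if current:
        output.append(dictionary[current])
    return output

text = "TOBEORNOTTOBEORTOBEORNOT"
codes = lzw_encode(text)
print(len(codes), "codes emitted for", len(text), "input symbols")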