Definition - What does Wavelet mean?
A wavelet is a mathematical function used in digital signal processing and image compression. It is a basis function that is localized in both frequency (or wavenumber) and time (or spatial location). Images compressed with wavelet technology are typically smaller than comparable JPEG images and can therefore be transmitted and downloaded over networks more quickly. Wavelet technology is used in image compression, signal compression and video compression.
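To make the localization idea concrete, here is a minimal sketch (not part of the definition above) of the Haar wavelet, the simplest example of a wavelet: unlike a sine or cosine, it is exactly zero outside the interval [0, 1), so it is isolated in time as well as in frequency.

```python
def haar_wavelet(t):
    """Haar mother wavelet: +1 on [0, 0.5), -1 on [0.5, 1), 0 elsewhere.

    Because it vanishes outside [0, 1), it is localized in time,
    unlike the sinusoids used by the Fourier transform.
    """
    if 0.0 <= t < 0.5:
        return 1.0
    if 0.5 <= t < 1.0:
        return -1.0
    return 0.0
```

Scaled and shifted copies of this one function form the basis with characteristic scale and position mentioned below.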
Techopedia explains Wavelet
Wavelet technology can compress color and grayscale images by roughly a factor of five. Every wavelet has a characteristic scale and position. WIF is the file extension for a wavelet-compressed image. Wavelet compression works by analyzing an image and decomposing it into a set of mathematical expressions, which can be transmitted to and decoded by the receiver. A wavelet transform differs from a Fourier transform in that it captures time (or spatial) information as well as frequency information, whereas the Fourier transform captures frequency information only. Wavelets can therefore address some of the inherent problems of Fourier analysis, such as relating transform coefficients to the local, rather than only the global, behavior of a function. On the other hand, many wavelets are not infinitely differentiable, and wavelet methods can lose spectral accuracy when computing derivatives.
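The decompose-transmit-decode idea above can be sketched with a one-level Haar discrete wavelet transform. This is an illustrative toy, not the scheme used by any particular WIF codec: the signal is split into coarse averages and fine details, small details are discarded (the lossy compression step), and the result is inverted by the receiver.

```python
import math

def haar_dwt(signal):
    """One level of the Haar discrete wavelet transform.

    Returns (approximation, detail): local averages (low-pass)
    and local differences (high-pass), each scaled by 1/sqrt(2).
    """
    approx, detail = [], []
    for i in range(0, len(signal) - 1, 2):
        a, b = signal[i], signal[i + 1]
        approx.append((a + b) / math.sqrt(2))
        detail.append((a - b) / math.sqrt(2))
    return approx, detail

def threshold(coeffs, eps):
    """Zero out small coefficients -- the basis of lossy wavelet compression."""
    return [c if abs(c) >= eps else 0.0 for c in coeffs]

def haar_idwt(approx, detail):
    """Inverse one-level Haar transform, as run by the receiver."""
    signal = []
    for a, d in zip(approx, detail):
        signal.append((a + d) / math.sqrt(2))
        signal.append((a - d) / math.sqrt(2))
    return signal
```

With no thresholding the reconstruction is exact; raising the threshold discards more detail coefficients, trading fidelity for a smaller encoded size. Real image codecs apply this transform in two dimensions and over several levels.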