A spatiotemporal transferable image fusion technique for GeoEye-1 satellite imagery

Abstract This study proposed a novel technique to solve the problem of color distortion in the fusion of the GeoEye-1 satellite's panchromatic (PAN) and multispectral (MS) images. This technique suggested reducing the difference in radiometry between the PAN and MS images by using modification coefficients for the MS bands in the definition of the intensity (I) equation, which guarantees using only the wavelengths that overlap with the PAN band. These modification coefficients achieve spatiotemporal transferability for the proposed fusion technique. As the reflectance of vegetation is high in the NIR band and low in the RGB bands, this technique suggested using an additional coefficient for the NIR band in the definition of the I equation, which varies based on the ratio of the agricultural features within the image, to indicate the correct impact of vegetation. This vegetation coefficient provides stability for the proposed fusion technique across all land cover classes. This study used three datasets of GeoEye-1 satellite PAN and MS images in Tanta City, Egypt, with different land cover classes (agricultural, urban, and mixed areas), to evaluate the performance of this technique against five different standard image fusion techniques. In addition, it was validated using six additional datasets from different locations and acquired at different times to test its spatiotemporal transferability. The proposed fusion technique demonstrated spatiotemporal transferability as well as great efficiency in producing fused images of superior spatial and spectral quality for all types of land cover.
Keywords Image fusion · Pan sharpening · Panchromatic · Multispectral · Color distortion · Intensity-hue-saturation

Mohamed Elshora (corresponding author), Mohammad.elShora@f-eng.tanta.edu.eg
Department of Public Works Engineering, Faculty of Engineering, Tanta University, Tanta 31511, Egypt

Aerospace Systems

1 Introduction

Remote sensing aims to extract information about the Earth's features by interpreting spectral data obtained from a distance [1]. Modern satellites provide images with various spatial, spectral, temporal, and radiometric resolutions. Merging the key characteristics of each image may generate a new image with more information than any of the input images [2]. There are two forms of satellite imagery: the panchromatic (PAN) image and the multispectral (MS) image. The key feature of the PAN image is its superior spatial resolution, which means that small objects appear with high accuracy. The key feature of the MS image is its superior spectral resolution, which means that objects appear in their correct colors [3].

Image fusion, or pan-sharpening, enhances the low spatial quality of the MS image by adding the details of the high spatial quality PAN image [4]. Recently, the fusion of PAN and MS images has become an important requirement for many remote sensing applications. The primary drawback of the existing fusion techniques is color distortion, which results from radiometric disparities between the PAN and MS images. The proposed fusion technique aims to solve this problem by (1) modifying the MS bands based on their intersecting areas with the PAN band so that the PAN and MS bands cover the same spectral wavelengths, and (2) using a variable coefficient for the NIR band to represent the correct reflectance of the NIR in agricultural areas.

In this study, the performance of the proposed fusion technique was assessed by comparison with five different fusion techniques on three datasets of GeoEye-1 PAN and MS images acquired over Tanta City, in the northern part of Egypt, covering different types of land cover (urban, agricultural, and mixed areas). These fusion techniques are fast intensity-hue-saturation (FIHS), principal component analysis (PCA), Gram-Schmidt fusion (GS), hyper-spherical color space (HCS), and Ehlers fusion. Statistical analysis and visual examination were used to evaluate the output fused images and their correlation with the original PAN and MS images. In addition to the first three datasets from Tanta, Egypt, the proposed fusion technique was validated using six additional datasets from different locations, acquired at different times, and covering different land cover classes (urban, agricultural, and mixed) to test its spatiotemporal transferability.

2 Related work

Several image fusion techniques have been developed to merge the spectral characteristics of the MS image with the spatial characteristics of the PAN image to produce fused images of high spectral and spatial quality; some of these techniques are explained below.

2.1 FIHS fusion technique

The traditional IHS fusion technique depends on the transformation from the RGB color space to the IHS color space, which separates the spectral characteristics into the hue (H) and saturation (S) components and the spatial characteristics into the intensity (I) component [5]. The PAN image replaces the I component [6], and the reverse transformation from the IHS color space to the RGB color space is then applied to PAN, H, and S to create the fused image.

The fast IHS fusion technique [7] is a fast implementation of the traditional IHS fusion technique [8] that can be expressed by the following equation:

[R′]   [1  −1/√2   1/√2] [Pan]   [R + (Pan − I)]   [R + δ]
[G′] = [1  −1/√2  −1/√2] [v1 ] = [G + (Pan − I)] = [G + δ]   (1)
[B′]   [1    √2      0 ] [v2 ]   [B + (Pan − I)]   [B + δ]

where R′, G′, and B′ are the fused bands; δ is the spatial details, δ = (Pan − I); and v1 and v2 are spectral characteristics variables calculated from the hue and saturation components: v1 = S cos(H) and v2 = S sin(H).

The FIHS fusion technique thus transforms the RGB bands into IHS components, subtracts the I component from the histogram-matched PAN image to obtain the spatial details, and adds these details to the original MS bands by simple addition to generate the fused image.

Due to the differences in radiance between the PAN and I images, the fused image may contain color distortion. It was suggested to add the NIR band to the I equation to minimize the radiance difference between the PAN and I images. The formula for deriving the I component is:

I = (R + G + B + NIR)/4   (2)
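The spatial-detail injection of Eqs. (1)–(2) can be sketched in a few lines of NumPy. This is an illustrative implementation, not the paper's code: the function name and the assumption of co-registered, equal-size floating-point arrays are ours.

```python
import numpy as np

def fihs_fuse(pan, ms):
    """FIHS fusion (Eqs. 1-2): add delta = Pan - I to every MS band.

    pan : 2-D array, histogram-matched panchromatic band.
    ms  : 3-D array of shape (4, H, W) holding R, G, B, NIR.
    Assumes both inputs are co-registered and resampled to the same grid.
    """
    intensity = ms.sum(axis=0) / 4.0   # I = (R + G + B + NIR) / 4  (Eq. 2)
    delta = pan - intensity            # spatial details, delta = Pan - I
    return ms + delta                  # R' = R + delta, G' = G + delta, ...

# toy usage with random data standing in for real bands
rng = np.random.default_rng(0)
ms = rng.random((4, 8, 8))
pan = rng.random((8, 8))
fused = fihs_fuse(pan, ms)
```

Broadcasting applies the same 2-D detail image `delta` to all four bands, which is exactly the simple-addition form on the right of Eq. (1).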
2.2 PCA fusion technique

This technique utilizes the principal component transformation to transfer the correlated MS bands into uncorrelated principal components (PCs) [9]. The uncorrelated PCs carry the variance information of the original MS bands [10]. This technique supposes that the first component (PC1), which has the highest variance, contains the overall luminance and is close to the PAN image [11]. Hence, PC1 is replaced by the histogram-matched PAN image, and the inverse transformation from PCA to RGB is applied to the histogram-matched PAN band and the remaining PCs to produce the fused image.

2.3 GS fusion technique

This technique primarily uses the Gram-Schmidt transformation to reduce the correlation between the MS bands [9]. It creates a low spatial resolution PAN image by taking the average of the MS bands. A Gram-Schmidt transformation is performed with the low-resolution PAN image as the first band together with the MS bands to produce the GS components. Then, GS1 is replaced by the histogram-matched PAN image, and the reverse GS transformation is performed for the histogram-matched PAN band and the rest of the GS components to produce the fused image [12].

2.4 HCS fusion technique

This technique depends on the transformation between the RGB and hyper-spherical color spaces [13], and its steps are: (1) calculating an intensity component (I) from the MS bands; (2) performing a hyper-spherical color space transformation; (3) matching the PAN image to the I component; (4) using the matched PAN to calculate an adjusted intensity (I_adj); (5) replacing the I component with I_adj; and (6) performing the reverse transformation from the HCS color space back to the RGB color space to produce the fused image.

2.5 Ehlers fusion technique

This technique is based on a combination of color and Fourier transforms [14]. Firstly, an IHS transformation is performed to separate the spectral characteristics of the hue (H) and saturation (S) components and the spatial characteristics of the intensity (I) component. Secondly, by using a fast Fourier transform, the PAN and I images are converted into the frequency domain, where the spatial information can be easily enhanced or suppressed. A low-pass filter is used for the I component, whereas a high-pass filter is used for the PAN image. Thirdly, by using the reverse fast Fourier transform, the filtered PAN and I images are converted back to the spatial domain and combined to create the fused intensity. Fourthly, the original I component is replaced by the histogram-matched fused intensity, and the reverse IHS transformation is performed for the histogram-matched fused intensity and the H and S components to produce the fused image.

3 Study site and data sets

The entire PAN and MS images, shown in Fig. 1, were acquired by the GeoEye-1 satellite over Tanta City in northern Egypt. The GeoEye-1 satellite produces PAN images with a resolution of 41 cm and MS images with a resolution of 164 cm in four spectral bands (blue, green, red, and near-infrared). By using ground control points (GCPs) collected by the normalized cross-correlation method, the MS image was first registered to the PAN image. The MS image was then resampled by using the cubic convolution approach to have the same pixel size as the PAN image, and the digital values of the pixels in the registered MS image were calculated. After the registration and resampling preprocessing steps, three different datasets including different classes of land cover (agricultural, urban, and mixed areas) were chosen from the PAN image and its corresponding registered MS image. The size of the PAN and MS images in each dataset is 1024 × 1024 pixels, and the pixel size equals 0.5 m. The three datasets are shown in Figs. 2, 3, and 4.

Fig. 1 a The entire PAN image and the study datasets. b The entire MS image and the study datasets
Fig. 2 GeoEye-1 dataset 1 (Urban area)
Fig. 3 GeoEye-1 dataset 2 (Agricultural area)

4 The proposed fusion technique

The proposed fusion technique, shown in Fig. 5, depends on the accurate calculation of the intensity (I) component from the MS bands. The I component is subtracted from the histogram-matched PAN image to get the spatial details, which are injected into the original MS bands to generate the fused image. Therefore, the accuracy of calculating the I component controls the quality of the output fused images. The radiometric differences between the PAN and MS images are the main reason for the color distortion in the fused images. The GeoEye-1 spectral response curve, shown in Fig. 6, demonstrates that the MS bands and the PAN band have different ranges of wavelengths. While the blue, green, and red bands have acceptable overlap with the PAN band, the NIR band has poor overlap. Such spectral dissimilarity of the PAN band with the MS bands, and consequently with the I component, leads to color distortion. Accordingly, the equal representation of the MS bands in the formulation of the I equation, as in Eq. (2), was a mistake.

The proposed fusion technique suggested modifying the MS bands before their participation in the I equation to exclude the parts of the MS bands outside the PAN band range, which reduces the radiance differences between the PAN band and the I image. This technique proposed using a modification coefficient (α_i) for each MS band according to its overlapping area with the PAN band, so that only the shared wavelengths between the PAN and MS images are used in the calculation of the I component. Therefore, the I component of the proposed fusion technique is an average of four modified MS bands, as shown in Eq. (3). The intersection area of each MS band with the PAN band is divided by the entire area of that MS band to calculate the modification coefficient:

I = (α1 × R + α2 × G + α3 × B + α4 × NIR)/4,   α_i = Ā_i / A_i   (3)

where α_i is the modification coefficient of MS band i, Ā_i is the intersection area of MS band i with the PAN band, and A_i is the full area of MS band i. After calculating the full areas of the MS bands and their overlapping areas with the PAN band from the GeoEye-1 spectral response curve, the modification coefficients were as follows: α_red = 0.9885, α_green = 0.9470, α_blue = 0.8480, and α_NIR = 0.1733. Then the I component will be:

I = (0.9885 × R + 0.9470 × G + 0.8480 × B + 0.1733 × NIR)/4
  = 0.247 × R + 0.237 × G + 0.212 × B + 0.043 × NIR   (4)
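Equations (3)–(4) amount to a fixed weighted intensity. A minimal sketch follows; the coefficient dictionary simply restates the paper's α values, while the function name is our own:

```python
# Modification coefficients alpha_i = (overlap area with PAN) / (full band
# area), as reported in the paper for GeoEye-1 (Eq. 3).
alpha = {"red": 0.9885, "green": 0.9470, "blue": 0.8480, "nir": 0.1733}

def modified_intensity(r, g, b, nir, coeff=alpha):
    """Eq. (3)-(4): I = (a1*R + a2*G + a3*B + a4*NIR) / 4.

    Works elementwise on scalars or NumPy arrays of equal shape.
    """
    return (coeff["red"] * r + coeff["green"] * g
            + coeff["blue"] * b + coeff["nir"] * nir) / 4.0
```

Dividing each α by 4 reproduces the rounded weights of Eq. (4): 0.247, 0.237, 0.212, and 0.043.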
Because the reflectance of vegetation is high in the NIR band and low in the RGB bands, it was a mistake for the modification coefficient of the NIR band to be constant across the different classes of land cover. It should increase as the agricultural area within the image increases. Therefore, this technique suggested including an extra coefficient (β) for the NIR band in the formulation of the I equation, as shown in Eq. (5), to add the correct impact of vegetation:

I = 0.247 × R + 0.237 × G + 0.212 × B + 0.043 × β × NIR   (5)

where β is the vegetation coefficient, which varies based on the percentage of the agricultural features within the image.

To determine the appropriate β coefficient for the agricultural dataset, it was fused by the proposed technique under different values of the β coefficient. It was found that increasing the β coefficient significantly enhances the spectral and spatial quality, as shown in Fig. 7. After testing numerous agricultural datasets acquired at different times and places, a β coefficient of 7 significantly improved the spectral and spatial resolution of the fused images. For mixed datasets, it was found that increasing the β coefficient enhances the spectral quality for agricultural features while reducing it for urban features, so the proportion of the agricultural features within the image controls the spectral quality improvement. Many mixed datasets with different percentages of agricultural features were used to determine the appropriate β coefficient, which was found as shown in Table 1.

Fig. 4 GeoEye-1 dataset 3 (Mixed area)
Fig. 5 The procedures of the proposed fusion technique
Fig. 6 The intersection areas of the GeoEye-1 spectral response curve
Fig. 7 The impact of β coefficient on the spectral and spatial quality of the agricultural dataset

Table 1 The appropriate β coefficient for the mixed datasets with different ratios of agricultural features

Agricultural features ratio (%)                  0–20   20–50   50–80   80–100
The appropriate β coefficient for mixed datasets   1      2       3       4

After calculating the β coefficient for the agricultural datasets and the mixed datasets with different ratios of agricultural features, the I equation for each land cover type will be as follows:

(1) For agricultural areas:

I = 0.247 × R + 0.237 × G + 0.212 × B + 0.301 × NIR   (6)

(2) For mixed areas:

• The percentage of agricultural areas is > 80%:

I = 0.247 × R + 0.237 × G + 0.212 × B + 0.172 × NIR   (7)

• The percentage of agricultural areas is 50–80%:

I = 0.247 × R + 0.237 × G + 0.212 × B + 0.129 × NIR   (8)

• The percentage of agricultural areas is 20–50%:

I = 0.247 × R + 0.237 × G + 0.212 × B + 0.086 × NIR   (9)

• The percentage of agricultural areas is < 20%:

I = 0.247 × R + 0.237 × G + 0.212 × B + 0.043 × NIR   (10)

(3) For urban areas:

I = 0.247 × R + 0.237 × G + 0.212 × B + 0.043 × NIR   (11)
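The land-cover-dependent choice of β in Table 1 and Eqs. (5)–(11) can be expressed as a small lookup. This is a sketch under our own naming conventions; the fractional `agri_ratio` argument (0–1) is an assumption, the paper states the ratios as percentages:

```python
def vegetation_coefficient(land_cover, agri_ratio=None):
    """Return beta per Table 1 and Eqs. (6)-(11).

    land_cover : 'agricultural', 'urban', or 'mixed'.
    agri_ratio : fraction of agricultural pixels in [0, 1]; required
                 for 'mixed' areas.
    """
    if land_cover == "agricultural":
        return 7.0                 # 0.043 * 7 = 0.301, Eq. (6)
    if land_cover == "urban":
        return 1.0                 # Eq. (11)
    if agri_ratio is None:
        raise ValueError("mixed areas need the agricultural ratio")
    if agri_ratio > 0.8:
        return 4.0                 # 0.043 * 4 = 0.172, Eq. (7)
    if agri_ratio > 0.5:
        return 3.0                 # 0.129, Eq. (8)
    if agri_ratio > 0.2:
        return 2.0                 # 0.086, Eq. (9)
    return 1.0                     # 0.043, Eq. (10)

def proposed_intensity(r, g, b, nir, beta):
    """Eq. (5): I = 0.247 R + 0.237 G + 0.212 B + 0.043 * beta * NIR."""
    return 0.247 * r + 0.237 * g + 0.212 * b + 0.043 * beta * nir
```

With β = 7 for a purely agricultural scene, the NIR weight becomes 0.301, matching Eq. (6).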
5 Experiments and results

After the preprocessing steps, the MS image was geometrically aligned with the PAN image, and its pixel size was transformed to equal that of the PAN image. Three different datasets including different classes of land cover (agricultural, urban, and mixed areas) were chosen from the PAN image and its corresponding registered MS image. Then, the fusion methods were utilized to fuse the three datasets. Figures 8, 9, and 10 show the original and fused images of all datasets. The following statistical parameters were used to evaluate the spectral characteristics of the output fused images:

• The correlation coefficient (CC)

The CC uses the following equation to compute the correlation between each fused band and the associated MS band:

CC = Σ_{i=1}^{n} [(X_i − X_m)(Y_i − Y_m)] / √( Σ_{i=1}^{n} (X_i − X_m)² × Σ_{i=1}^{n} (Y_i − Y_m)² )   (12)

where X_i is the digital number of pixel i in the MS band, X_m is the mean of the digital numbers of the MS band, Y_i is the digital number of pixel i in the fused band, Y_m is the mean of the digital numbers of the fused band, and n is the number of pixels. High values of the CC indicate that the fused image has great spectral quality. The sum of the CC values of all bands is divided by the number of bands to calculate the average CC between the output fused image and the original MS image.

• Relative dimensionless global error in synthesis (ERGAS)

ERGAS summarizes the errors in all bands and can be calculated according to the following procedure. The standard deviation of the difference image (SDD) between each fused band and its corresponding MS band is calculated according to the following equation:

SDD = √( Σ_{i=1}^{n} ((X_i − Y_i) − (X_m − Y_m))² / n )   (13)

The absolute value of the mean difference (bias) between each fused band and its corresponding MS band is calculated according to the following equation:

bias = |X_m − Y_m|   (14)

The root mean square error (RMSE) between each fused band and its corresponding MS band is calculated according to the following equation:

RMSE = √( SDD² + bias² )   (15)

Fig. 8 The original and fused images of GeoEye-1 dataset 1 (urban area), a original PAN, b original MS, c FIHS, d proposed, e PCA, f GS, g HCS, h Ehlers
Fig. 9 The original and fused images of GeoEye-1 dataset 2 (agricultural area), a original PAN, b original MS, c FIHS, d proposed, e PCA, f GS, g HCS, h Ehlers
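Equations (12)–(15), with the ERGAS aggregation of Eq. (16), can be sketched as follows. Function names are ours, and the default ratio h/l = 0.41/1.64 = 0.25 assumes the GeoEye-1 pixel sizes given in Sect. 3:

```python
import numpy as np

def cc(x, y):
    """Correlation coefficient between two bands, Eq. (12)."""
    xm, ym = x.mean(), y.mean()
    num = np.sum((x - xm) * (y - ym))
    den = np.sqrt(np.sum((x - xm) ** 2) * np.sum((y - ym) ** 2))
    return num / den

def ergas(ms, fused, ratio=0.41 / 1.64):
    """ERGAS, Eqs. (13)-(16).

    ms, fused : 3-D arrays of shape (bands, H, W).
    ratio     : PAN pixel size over MS pixel size (h/l).
    """
    terms = []
    for x, y in zip(ms, fused):
        bias = x.mean() - y.mean()            # Eq. (14), kept signed here
        sdd = np.std(x - y)                   # Eq. (13)
        rmse = np.sqrt(sdd ** 2 + bias ** 2)  # Eq. (15)
        terms.append((rmse / x.mean()) ** 2)
    return 100.0 * ratio * np.sqrt(np.mean(terms))  # Eq. (16)
```

A perfectly fused image (identical to the reference MS image) gives CC = 1 for every band and ERGAS = 0.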
Tables 2, 3, and 4 show the values of the CCs where X  the digital number of pixel i in the MS band, X and ERGAS for all datasets. i m the mean of the digital numbers of the MS band, Y  the digital number of pixel i in the fused band, Y  the mean of the digital numbers of the fused band, n  the number of 123 Aerospace Systems Table 2 Evaluation criteria (the Criterion FIHS Proposed PCA GS HCS Ehlers CCs and ERGAS) for dataset 1 (Urban area) CC Red 0.8682 0.9429 0.8584 0.8582 0.8399 0.8818 CC Green 0.8215 0.9206 0.8592 0.8579 0.7947 0.8366 CC Blue 0.7652 0.8914 0.8645 0.8619 0.8014 0.8470 CC NIR 0.9208 0.9666 0.8578 0.8615 0.8365 0.9033 CC (Av.) 0.8439 0.9304 0.8600 0.8599 0.8181 0.8672 ERGAS 6.0875 3.9674 5.8064 5.7991 9.3792 5.6587 HPF CC Red 0.9563 0.9727 0.9944 0.9927 0.8867 0.9722 HPF CC Green 0.9561 0.9757 0.9898 0.9884 0.8498 0.9656 HPF CC Blue 0.9547 0.9762 0.9816 0.9810 0.8445 0.9609 HPF CC NIR 0.9541 0.9604 0.9955 0.9927 0.8868 0.9772 HPF CC (Av.) 0.9553 0.9713 0.9903 0.9887 0.8670 0.9690 Table 3 Evaluation criteria (the Criterion FIHS Proposed PCA GS HCS Ehlers CCs and ERGAS) for dataset 2 (Agricultural area) CC Red 0.9728 0.9935 0.9444 0.9493 0.9690 0.9780 CC Green 0.9279 0.9631 0.9300 0.9349 0.9465 0.9654 CC Blue 0.9171 0.9661 0.9424 0.9473 0.9621 0.9745 CC NIR 0.9828 0.9928 0.9811 0.9860 0.9609 0.9810 CC (Av.) 0.9502 0.9789 0.9495 0.9544 0.9596 0.9747 ERGAS 3.8501 2.5824 3.9846 3.7736 3.6081 2.7036 HPF CC Red 0.9661 0.9670 0.9882 0.9887 0.9450 0.9676 HPF CC Green 0.9680 0.9738 0.9921 0.9926 0.9630 0.9598 HPF CC Blue 0.9670 0.9737 0.9897 0.9902 0.9398 0.9555 HPF CC NIR 0.9598 0.9475 0.9803 0.9808 0.9328 0.9625 HPF CC (Av.) 
0.9652 0.9655 0.9876 0.9881 0.9452 0.9614 Table 4 Evaluation criteria (the Criterion FIHS Proposed PCA GS HCS Ehlers CCs and ERGAS) for dataset 3 (Mixed area) CC Red 0.9423 0.9826 0.9163 0.9262 0.9257 0.9712 CC Green 0.9259 0.9724 0.9068 0.9157 0.9079 0.9666 CC Blue 0.9261 0.9729 0.9142 0.9245 0.9190 0.9699 CC NIR 0.9547 0.9951 0.9770 0.9674 0.9178 0.9729 CC (Av.) 0.9373 0.9808 0.9286 0.9335 0.9176 0.9702 ERGAS 4.9025 2.9060 5.3197 5.1217 6.2521 3.2974 HPF CC Red 0.9679 0.9749 0.9742 0.9750 0.8518 0.9747 HPF CC Green 0.9686 0.9765 0.9794 0.9792 0.8372 0.9754 HPF CC Blue 0.9680 0.9768 0.9809 0.9810 0.8259 0.9720 HPF CC NIR 0.9651 0.9692 0.9809 0.9822 0.8215 0.9614 HPF CC (Av.) 0.9674 0.9744 0.9789 0.9794 0.8341 0.9709 To evaluate the spatial characteristics of the output fused the original PAN image. The high values of the HPF CC indi- images, the PAN and fused images were filtered by a high- cate that the fused image has great spatial quality. Tables 2, pass Laplacian filter. Then, the CC values between the filtered 3, and 4 show the values of the HPF CCs for all datasets. PAN band and the filtered fused bands were determined according to Eq. (10). The sum of the HPF CC values of all bands is divided by the bands’ number to calculate the average of the HPF CC between the output fused image and 123 Aerospace Systems Table 5 The information of the Dataset Dataset 6 Dataset 7 Dataset 8 Dataset 9 other four datasets Type of land Agriculture Agriculture Mixed Mixed cover Study site Soca River, Karlovcic, Serbia Cape Town, South Genoa, Italy Slovenia Africa Location 46° 13 53.53 44° 47 12.98 N 33° 55 51.82 S 44° 20 53.68 N 19° 59 50.73 E 18° 26 5.46 E N 13° 36 9° 13 5.41" E 27.84 E Date 2020-07-31 2011-06-01 2013-07-31 2018-09-19 6 Analysis of results the green and blue bands have acceptable overlap, the NIR band has poor overlap. 
6 Analysis of results

In terms of spectral quality, the results in Tables 2, 3, and 4 show that the proposed fusion technique, followed by the Ehlers method, offered the greatest spectral quality for the three datasets. Furthermore, the spectral quality of the Ehlers method is close to that of the proposed fusion technique for datasets 2 and 3 (agricultural and mixed areas). However, for dataset 1 (urban area), the spectral quality of the proposed fusion technique is very high and far ahead of the Ehlers method.

The reasons for the high spectral quality of the proposed fusion technique are: (1) using the modification coefficients for the MS bands in the formulation of the I image makes the PAN and I images spectrally very close to each other and reduces the gray-level differences between them; and (2) adding the variable coefficient for the NIR band in the formulation of the I image provides the correct reflectance of the vegetation in the NIR band.

The reason for the high spectral quality of the Ehlers fusion technique is the use of the fast Fourier transform, which transforms the PAN and I images into the frequency domain, where the spatial information can be easily enhanced or suppressed. Using the high-pass and low-pass filters for the PAN image and the I component, respectively, prevents inserting new gray-level values into the MS image during the process of injecting the spatial details.

The FIHS fusion method produces low spectral quality because all bands are equally represented in the definition of the I equation, while the spectral association between the PAN image and each of the MS bands is different. According to the GeoEye-1 spectral response curve, while the red band has perfect overlap with the PAN band and the green and blue bands have acceptable overlap, the NIR band has poor overlap. The spectral differences between the PAN and I images are the main source of color distortion.

The other fusion methods are ranked from best to worst as follows: PCA, GS, FIHS, HCS for dataset 1 (urban area); HCS, GS, FIHS, PCA for dataset 2 (agricultural area); and FIHS, GS, PCA, HCS for dataset 3 (mixed area). It should be noted that the type of the scene and how the scene's various features are distributed affect how well a given fusion technique performs.

In terms of spatial quality, the results in Tables 2, 3, and 4 show that all fusion techniques provide spatial improvements in the fused images with different levels of sharpness. The edges of fields in the agricultural areas and the sharp edges and small objects in the urban areas are visible in the output fused images. For all datasets, the GS and PCA fusion techniques provided the greatest spatial quality. Moreover, the PCA fusion method provided a spatial quality slightly better than the GS fusion method for dataset 1 (urban area), while the GS fusion method provided a spatial quality slightly better than the PCA fusion method for datasets 2 and 3 (agricultural and mixed areas). The proposed fusion technique followed the PCA and GS fusion techniques and produced significant and stable results for all land cover types. The spatial quality obtained with the Ehlers fusion technique is also acceptable. For all datasets, the worst spatial characteristics were shown in the fused images of the HCS fusion method.

In terms of visual inspection, Figs. 8, 9, and 10 show that, in parallel with the statistical assessment, the proposed fusion technique produced undistorted fused images for the three datasets. The Ehlers fusion technique provided acceptable spectral quality with low color distortion. On the other hand, the FIHS, PCA, and GS fusion methods produced some color distortion in the green areas, and the HCS fusion method produced some color distortion in the dark areas. The PCA and GS fusion techniques produced high spatial quality in the three datasets, followed by the proposed fusion technique. The HCS fused images are the smoothest among the resultant fused images for the three datasets.

Overall evaluation: The analysis of the results in Tables 2, 3, and 4 and the visual inspection of Figs. 8, 9, and 10 revealed that the FIHS fusion technique had a problem with the spectral quality because of the equal representation of the MS bands in the definition of the I component. It showed poor spectral quality for the urban and mixed areas, while its performance was acceptable for the agricultural areas. On the other hand, it provided high spatial quality for all datasets. The PCA and GS fusion techniques produced the highest spatial quality for all land cover classes. This was at the expense of spectral quality, as they produced poor spectral quality for the urban and mixed datasets and acceptable quality for the agricultural dataset. While the spectral and spatial performance of the HCS fusion technique was acceptable for the agricultural areas, it was very poor for the urban and mixed areas. The Ehlers fusion technique provided significant spatial quality for all land cover types. Its spectral performance was significant in the agricultural and mixed datasets but poor in the urban datasets. As the proposed fusion technique uses modification coefficients for the MS bands in the formulation of the I component to use only the wavelengths that overlap with the PAN band, it reduces color distortion and accomplishes spatiotemporal transferability. Additionally, this technique uses an additional coefficient for the NIR band based on the ratio of the agricultural features within the image to indicate the correct impact of vegetation, which provides performance stability across the different land cover classes. Therefore, the proposed fusion technique could overcome the limitations of the existing fusion techniques and produce significant spectral and spatial quality for all land cover types.

7 Validation of the proposed fusion technique

When a novel method is proposed, it should be thoroughly assessed from the perspective of spatiotemporal transferability to guarantee its effectiveness across various locations and times. In addition to the first three datasets from Tanta, Egypt, the proposed fusion technique was validated using six additional datasets from different locations, acquired at different times, and covering different land cover classes (urban, agricultural, and mixed). Vivone et al. (2021) established a reference dataset consisting of 14 PAN and MS image pairs acquired by several satellites over a variety of topography to evaluate new fusion techniques [15]. There were two PAN and MS image pairs of the GeoEye-1 satellite in this dataset. Therefore, they were used in the validation process under the names of dataset 4 and dataset 5, which cover urban areas in Trenton, United States of America, and London, United Kingdom, respectively. Furthermore, four other datasets, under the names of dataset 6 through dataset 9, covering agricultural and mixed areas in different places and acquired at different times, were also used in the validation process; their information is shown in Table 5.

The proposed fusion technique was validated against the original FIHS fusion technique for the six datasets to make sure it maintained its superiority. The original and fused images of the six datasets are shown in Figs. 11, 12, 13, 14, 15, and 16. The spectral quality of the fused images was assessed by the correlation coefficients between the fused bands and the original MS bands. Also, the spatial quality of the fused images was assessed by the correlation coefficients between the filtered fused bands and the filtered PAN band. For the six datasets, the spectral and spatial quality of the fused images is shown in Tables 6, 7, 8, 9, 10, and 11.

The results in Tables 6, 7, 8, 9, 10, and 11 show that the proposed fusion technique had the highest spectral quality for all datasets. It provided a spectral quality roughly 1.5–5.5% better than the FIHS fusion technique. This is because (1) using only the parts of the MS bands inside the area of the PAN band in the definition of the I equation reduces the difference in radiometry between the PAN band and the MS bands and decreases the resultant color distortion, and (2) using the additional coefficient β for the NIR band in the definition of the I equation, which varies across the types of land cover based on the percentage of the agricultural areas within the image, adds the correct effect of vegetation and produces significant and stable performance for all types of land cover. Both fusion techniques produced close spatial quality for the urban datasets, but for the agricultural and mixed datasets, the proposed fusion technique outperformed the original FIHS fusion technique by about 1 to 5%. The visual inspection of Figs. 11, 12, 13, 14, 15, and 16 demonstrated that the proposed fusion technique significantly enhanced the spectral quality, reduced color distortion, and produced sharp fused images for all land cover types.

Fig. 11 The original and fused images of GeoEye-1 dataset 4 (urban area), a original PAN, b original MS, c proposed, d FIHS
Fig. 12 The original and fused images of GeoEye-1 dataset 5 (urban area), a original PAN, b original MS, c proposed, d FIHS
Fig. 13 The original and fused images of GeoEye-1 dataset 6 (agricultural area), a original PAN, b original MS, c proposed, d FIHS
Fig. 14 The original and fused images of GeoEye-1 dataset 7 (agricultural area), a original PAN, b original MS, c proposed, d FIHS
Fig. 15 The original and fused images of GeoEye-1 dataset 8 (mixed area), a original PAN, b original MS, c proposed, d FIHS
Fig. 16 The original and fused images of GeoEye-1 dataset 9 (mixed area), a original PAN, b original MS, c proposed, d FIHS

Table 6 Evaluation criteria (the correlation coefficients) for dataset 4 (Urban area)

Criterion   CC Red    CC Green    CC Blue    CC NIR    CC (Av.)
Proposed    0.9470    0.9431      0.9418     0.9391    0.9428
FIHS        0.9280    0.9233      0.9194     0.9336    0.9261

Criterion   HPF CC Red   HPF CC Green   HPF CC Blue   HPF CC NIR   HPF CC (Av.)
Proposed    0.9777       0.9776         0.9757        0.9320       0.9658
FIHS        0.9701       0.9711         0.9684        0.9566       0.9666

Table 7 Evaluation criteria (the correlation coefficients) for dataset 5 (Urban area)

Criterion   CC Red    CC Green    CC Blue    CC NIR    CC (Av.)
Proposed    0.9221    0.9302      0.9111     0.9369    0.9251
FIHS        0.8819    0.9061      0.8562     0.9327    0.8942

Criterion   HPF CC Red   HPF CC Green   HPF CC Blue   HPF CC NIR   HPF CC (Av.)
Proposed    0.9681       0.9616         0.9643        0.9203       0.9536
FIHS        0.9574       0.9588         0.9539        0.9482       0.9546

Table 8 Evaluation criteria (the correlation coefficients) for dataset 6 (Agricultural area)

Criterion   CC Red    CC Green    CC Blue    CC NIR    CC (Av.)
Proposed    0.9454    0.9037      0.8917     0.9865    0.9318
FIHS        0.9313    0.8858      0.8721     0.9763    0.9164

Criterion   HPF CC Red   HPF CC Green   HPF CC Blue   HPF CC NIR   HPF CC (Av.)
Proposed    0.8735       0.9032         0.8612        0.8101       0.8620
FIHS        0.8264       0.8490         0.8173        0.7513       0.8110

Table 9 Evaluation criteria (the correlation coefficients) for dataset 7 (Agricultural area)

Criterion   CC Red    CC Green    CC Blue    CC NIR    CC (Av.)
Proposed    0.9730    0.8683      0.9428     0.9886    0.9432
FIHS        0.9489    0.7927      0.8857     0.9884    0.9039

Criterion   HPF CC Red   HPF CC Green   HPF CC Blue   HPF CC NIR   HPF CC (Av.)
Proposed    0.8704       0.8904         0.8738        0.8578       0.8731
FIHS        0.8578       0.8718         0.8609        0.8398       0.8576

Table 10 Evaluation criteria (the correlation coefficients) for dataset 8 (Mixed area)

Criterion   CC Red    CC Green    CC Blue    CC NIR    CC (Av.)
Proposed    0.9161    0.9553      0.8976     0.9854    0.9386
FIHS        0.8454    0.9085      0.8131     0.9680    0.8838

Criterion   HPF CC Red   HPF CC Green   HPF CC Blue   HPF CC NIR   HPF CC (Av.)
Proposed    0.9817       0.9779         0.9830        0.9141       0.9642
FIHS        0.9627       0.9638         0.9611        0.9281       0.9539

Table 11 Evaluation criteria (the correlation coefficients) for dataset 9 (Mixed area)

Criterion   CC Red    CC Green    CC Blue    CC NIR    CC (Av.)
Proposed    0.9537    0.8989      0.9012     0.9863    0.9350
FIHS        0.9279    0.8480      0.8541     0.9765    0.9016

Criterion   HPF CC Red   HPF CC Green   HPF CC Blue   HPF CC NIR   HPF CC (Av.)
Proposed    0.9506       0.9574         0.9511        0.8978       0.9392
FIHS        0.9383       0.9416         0.9376        0.8941       0.9279
11, 12, 13, 14, different times, and covering different land cover classes 15, and 16 demonstrated that the proposed fusion technique (urban, agricultural, and mixed). Vivone et al. (2021) estab- significantly enhanced the spectral quality, reduced color dis- lished a reference dataset consisting of 14 PAN and MS tortion, and produced sharp fused images for all land cover image pairs acquired by several satellites over a variety of types. topography to evaluate new fusion techniques [15]. There were two PAN and MS image pairs of the GeoEye-1 satellite 123 Aerospace Systems 8 Conclusion 2. Elshora M, Afify H, Younes S (2018) Implementation of fusion techniques on GeoEye-1 satellite imagery. J Al-Azhar Univ Eng Sector 40(4):191–207 This study demonstrated that the proposed fusion technique 3. Gharbia A, Hassanien A, El-Baz AH, Elhoseny M, Gunasekarane can significantly solve the problem of color distortion and M (2018) Multi-spectral and panchromatic image fusion approach produce fused images of superior spectral and spatial res- using stationary wavelet transform and swarm flower pollination optimization for remote sensing applications. Future Gener Com- olution for all land cover classes because of: (1) using put Syst 88:501–511. https://doi.org/10.1016/j.future.2018.06.022 modification coefficients for the MS bands, in the formu- 4. Wady SMA, Bentoutou Y, Bengermikh A, Bounoua A, Taleb lation of the I equation, which are calculated based on their N (2020) A new IHS and wavelet based pansharpening algo- overlapping areas with the PAN band; and (2) adding a vari- rithm for high spatial resolution satellite imagery. Adv Sp Res 66:1507–1521. https://doi.org/10.1016/j.asr.2020.06.001 able coefficient for the NIR band, which is dependent on 5. Firouz AW, Kalyankar NV, Al-Zuky AA (2011) The IHS the proportion of the agricultural regions within the image. transformations-based image fusion. 
J Glob Res Comput Sci In addition to reducing the radiometric differences between 2(5):70–77. https://doi.org/10.48550/arXiv.1107.4396 the PAN image and the I component, the modification coeffi- 6. Ozay EK, Tunga B (2020) A novel method for multispectral image pansharpening based on high dimensional model represen- cients provide spatiotemporal transferability for the proposed tation. Expert Syst Appls 170:114512. https://doi.org/10.1016/j. fusion technique. While the vegetation coefficient indicates eswa.2020.114512 the correct impact of vegetation in the NIR band, it also pro- 7. Tu TM, Huang PS, Hung CL, Chang CP (2004) A fast intensity- vides stability for the proposed fusion technique across all hue-saturation fusion technique with spectral adjustment for IKONOS imagery. IEEE Geosci Remote Sens Lett 1(4):309–312. land cover classes. In future work, this technique will be https://doi.org/10.1109/LGRS.2004.834804 developed to be used for satellites with different spectral 8. Zeng Y, Yi W, Deng J, Chen W, Xu S, Huang S (2020) Remote response curves. sensing image fusion using improved IHS and non-subsampled Contourlet transform. In: Yuan X, Elhoseny M (eds) Urban intelli- Funding The authors have not disclosed any funding. gence and applications, studies in distributed intelligence. Springer, Cham, pp 55–67. https://doi.org/10.1007/978-3-030-45099-1_5 9. Song S, Liu J, Pu H, Liu Y, Luo J (2019) The comparison of fusion methods for hsrrsi considering the effectiveness of land cover (Fea- Declarations tures) object recognition based on deep learning. Remote Sens 11(12):1435. https://doi.org/10.3390/rs11121435 10. Ricotta C, Avena GC, Volpe F (1999) The influence of princi- Conflict of interest The authors declare no competing interests. pal component analysis on the spatial structure of multispectral dataset. Int J Remote Sens 20(17):3367–3376. https://doi.org/10. Ethical approval Not applicable. 1080/014311699211381 11. 
Pohl C, Van Genderen JL (1998) Multisensor image fusion in Consent to participate Not applicable. remote sensing: concepts, methods and applications. Int J Remote Sens 19(5):823–854. https://doi.org/10.1080/014311698215748 Consent to publish Not applicable. 12. Maurer T (2013) How to pan-sharpen images using the Gram- Schmidt pan sharpen method. Int Arch Photogramm Remote Open Access This article is licensed under a Creative Commons Sens Spatial Inf Sci XL-1/W1:239–244. https://doi.org/10.5194/ Attribution 4.0 International License, which permits use, sharing, adap- isprsarchives-XL-1-W1-239-2013 tation, distribution and reproduction in any medium or format, as 13. Padwick C, Deskevich M, Pacifici F, Smallwood S (2010) long as you give appropriate credit to the original author(s) and the Worldview-2 Pan-sharpening. ASPRS 2010 Annual Conference, source, provide a link to the Creative Commons licence, and indi- San Diego, California. cate if changes were made. The images or other third party material 14. Ehlers M, Klonus S (2013) Image fusion using the ehlers spec- in this article are included in the article’s Creative Commons licence, tral characteristics preservation algorithm. GISci Remote Sens unless indicated otherwise in a credit line to the material. If material 44(2):93–116. https://doi.org/10.2747/1548-1603.44.2.93 is not included in the article’s Creative Commons licence and your 15. Vivone G, Mura MD, Garzelli A, Pacifici F (2021) A benchmark- intended use is not permitted by statutory regulation or exceeds the ing protocol for pansharpening: dataset, preprocessing, and quality permitted use, you will need to obtain permission directly from the copy- assessment. IEEE J Sel Top Appl Earth Observ Remote Sens right holder. To view a copy of this licence, visit http://creativecomm 14:6102–6118. https://doi.org/10.1109/JSTARS.2021.3086877 ons.org/licenses/by/4.0/. References 1. Ghassemian H (2016) A review of remote sensing image fusion methods. 
Inf Fusion 32:75–89. https://doi.org/10.1016/j.inffus. 2016.03.003 http://www.deepdyve.com/assets/images/DeepDyve-Logo-lg.png Aerospace Systems Springer Journals

A spatiotemporal transferable image fusion technique for GeoEye-1 satellite imagery

Aerospace Systems , Volume OnlineFirst – Mar 15, 2023

Publisher: Springer Journals
Copyright: © The Author(s) 2023
ISSN: 2523-3947
eISSN: 2523-3955
DOI: 10.1007/s42401-023-00208-7

Abstract

This study proposed a novel technique to solve the problem of color distortion in the fusion of the GeoEye-1 satellite's panchromatic (PAN) and multispectral (MS) images. This technique reduces the difference in radiometry between the PAN and MS images by using modification coefficients for the MS bands in the definition of the intensity (I) equation, which guarantees that only the wavelengths overlapping the PAN band are used. These modification coefficients achieve spatiotemporal transferability for the proposed fusion technique. As the reflectance of vegetation is high in the NIR band and low in the RGB bands, this technique also uses an additional coefficient for the NIR band in the definition of the I equation, which varies based on the ratio of the agricultural features within the image, to indicate the correct impact of vegetation. This vegetation coefficient provides stability for the proposed fusion technique across all land cover classes. This study used three datasets of GeoEye-1 satellite PAN and MS images in Tanta City, Egypt, with different land cover classes (agricultural, urban, and mixed areas), to evaluate the performance of this technique against five different standard image fusion techniques. In addition, it was validated using six additional datasets from different locations and acquired at different times to test its spatiotemporal transferability. The proposed fusion technique demonstrated spatiotemporal transferability as well as great efficiency in producing fused images of superior spatial and spectral quality for all types of land cover.

Keywords: Image fusion · Pan sharpening · Panchromatic · Multispectral · Color distortion · Intensity-hue-saturation

1 Introduction

Remote sensing aims to extract information about the Earth's features by interpreting the spectral data obtained from a distance [1]. Modern satellites provide images with various spatial, spectral, temporal, and radiometric resolutions. Merging the key characteristics of each image may generate a new image with more information than any of the input images [2]. There are two forms of satellite imagery: the panchromatic (PAN) image and the multispectral (MS) image. The key feature of the PAN image is its superior spatial resolution, which means that small objects appear with high accuracy. The key feature of the MS image is its superior spectral resolution, which means that objects appear in their correct colors [3].

Image fusion, or pan-sharpening, proposes enhancing the low spatial quality MS image by adding the details of the high spatial quality PAN image [4]. Recently, the fusion of PAN and MS images has become an important requirement for many remote sensing applications. The primary drawback of the existing fusion techniques is color distortion, which results from radiometric disparities between PAN and MS images. The proposed fusion technique aims to solve this problem by (1) modifying the MS bands based on their intersecting areas with the PAN band to make the PAN and MS bands have the same spectral wavelengths, and (2) using a variable coefficient for the NIR band to indicate the correct reflectance of the NIR in the agricultural areas.

In this study, the performance of the proposed fusion technique was assessed by comparison with five different fusion techniques on three different datasets of GeoEye-1 PAN and MS images acquired in Tanta City, in the north part of Egypt, and covering different types of land cover (urban, agricultural, and mixed areas). These fusion techniques are fast intensity-hue-saturation (FIHS), principal component analysis (PCA), Gram-Schmidt fusion (GS), hyper-spherical color space (HCS), and Ehlers fusion. Statistical analysis and visual examination were used to evaluate the output fused images and their correlation with the original PAN and MS images. In addition to the first three datasets from Tanta, Egypt, the proposed fusion technique was validated using six additional datasets from different locations, acquired at different times, and covering different land cover classes (urban, agricultural, and mixed) to test its spatiotemporal transferability.

Corresponding author: Mohamed Elshora (Mohammad.elShora@f-eng.tanta.edu.eg), Department of Public Works Engineering, Faculty of Engineering, Tanta University, Tanta 31511, Egypt.

2 Related work

Several image fusion techniques have been developed to merge the spectral characteristics of the MS image with the spatial characteristics of the PAN image to produce fused images of high spectral and spatial quality; some of these techniques are explained below.

2.1 FIHS fusion technique

The traditional IHS fusion technique depends on the transformation from the RGB color space to the IHS color space, which can separate the spectral characteristics into the hue (H) and saturation (S) components, and the spatial characteristics into the intensity (I) component [5]. The PAN image replaces the I component [6], and the reverse transformation from IHS color space to RGB color space is then applied to PAN, H, and S to create the fused image.

The fast IHS fusion technique [7] is a fast implementation of the traditional IHS fusion technique [8] that can be carried out according to the following equation:

[R']   [1  -1/√2   1/√2] [Pan]   [1  -1/√2   1/√2] [I + (Pan - I)]   [R + δ]
[G'] = [1  -1/√2  -1/√2] [v1 ] = [1  -1/√2  -1/√2] [v1           ] = [G + δ]    (1)
[B']   [1    √2     0  ] [v2 ]   [1    √2     0  ] [v2           ]   [B + δ]

where R', G', and B' are the fused bands; δ = Pan - I is the spatial detail; and v1 and v2 are the spectral characteristics variables, which are calculated from the hue and saturation components: v1 = S cos(H) and v2 = S sin(H).

The FIHS fusion technique is based on transforming the RGB bands into IHS components; the I component is then subtracted from the histogram-matched PAN image to obtain the spatial details, which are added to the original MS bands by simple addition to generate the fused image.

Due to the differences in radiance between the PAN and I images, the fused image may contain color distortion. It was therefore suggested to add the NIR band into the I equation to minimize the radiance difference between the PAN and I images. The formula for deriving the I component is:

I = (R + G + B + NIR) / 4    (2)

2.2 PCA fusion technique

This technique utilizes the principal component transformation to transfer the correlated MS bands into uncorrelated principal components (PCs) [9]. The uncorrelated PCs carry the variance information of the original MS bands [10]. This technique supposes that the first component (PC1), which has the highest variance, contains the overall luminance and is close to the PAN image [11]. Hence, PC1 is replaced by the histogram-matched PAN image, and the inverse transformation from PCA to RGB is implemented for the histogram-matched PAN band and the rest of the PCs to produce the fused image.

2.3 GS fusion technique

This technique primarily uses the Gram-Schmidt transformation to reduce the correlation between the MS bands [9]. It creates a low spatial resolution PAN image by taking the average of the MS bands. A Gram-Schmidt transformation is performed for the low-resolution PAN image as the first band together with the MS bands to produce the GS components. Then, GS1 is replaced by the histogram-matched PAN image, and the reverse GS transformation is performed for the histogram-matched PAN band and the rest of the GS components to produce the fused image [12].

2.4 HCS fusion technique

This technique depends on the transformation between the
RGB and hyper-spherical color spaces [13], and its steps are: (1) calculating an intensity component (I) from the MS bands; (2) performing a hyper-spherical color space transformation; (3) matching the PAN image to the I component; (4) using the matched PAN to calculate an adjusted intensity (I_adj); (5) replacing the I component by I_adj; and (6) performing the reverse transformation from the HCS color space back to the RGB color space to produce the fused image.

2.5 Ehlers fusion technique

This technique is based on a combination of color and Fourier transforms [14]. Firstly, an IHS transformation is performed to separate the spectral characteristics of the hue (H) and saturation (S) components and the spatial characteristics of the intensity (I) component. Secondly, by using a fast Fourier transform, the PAN and I images are converted into the frequency domain, where the spatial information can be easily enhanced or suppressed. A low-pass filter is used for the I component, whereas a high-pass filter is used for the PAN image. Thirdly, by using the reverse fast Fourier transform, the filtered PAN and I images are converted back to the spatial domain and combined to create the fused intensity. Fourthly, the original I component is replaced by the histogram-matched fused intensity, and the reverse IHS transformation is performed for the histogram-matched fused intensity and the H and S components to produce the fused image.

3 Study site and data sets

The entire PAN and MS images, shown in Fig. 1, were acquired by the GeoEye-1 satellite over Tanta City in northern Egypt. The GeoEye-1 satellite produces PAN images with a resolution of 41 cm and MS images with a resolution of 164 cm in four spectral bands (blue, green, red, and near-infrared). By using ground control points (GCPs) collected by the normalized cross-correlation method, the MS image was first registered to the PAN image. The MS image was then resampled by using the cubic convolution approach to have the same pixel size as the PAN image, and the digital values of the pixels in the registered MS image were calculated. After the registration and resampling preprocessing steps, three different datasets including different classes of land cover (agricultural, urban, and mixed areas) were chosen from the PAN image and its corresponding registered MS image. The size of the PAN and MS images in each dataset is 1024 × 1024 pixels, and the pixel size equals 0.5 m. The three datasets are shown in Figs. 2, 3, and 4.

Fig. 1 a The entire PAN image and the study datasets. b The entire MS image and the study datasets
Fig. 2 GeoEye-1 dataset 1 (Urban area)
Fig. 3 GeoEye-1 dataset 2 (Agricultural area)
Fig. 4 GeoEye-1 dataset 3 (Mixed area)

4 The proposed fusion technique

The proposed fusion technique, shown in Fig. 5, depends on the accurate calculation of the intensity (I) component from the MS bands. The I component is then subtracted from the histogram-matched PAN image to obtain the spatial details, which are injected into the original MS bands to generate the fused image. Therefore, the accuracy of calculating the I component controls the quality of the output fused images. The radiometric differences between the PAN and MS images are the main reason for the color distortion in the fused images. The GeoEye-1 spectral response curve, shown in Fig. 6, demonstrates that the MS bands and the PAN band have different ranges of wavelengths. While the blue, green, and red bands have acceptable overlap with the PAN band, the NIR band has poor overlap. Such spectral dissimilarity of the PAN band with the MS bands, and consequently with the I component, will lead to color distortion. Accordingly, the equal representation of the MS bands in the formulation of the I equation, as in Eq. (2), was a mistake.

Fig. 5 The procedures of the proposed fusion technique

The proposed fusion technique suggested modifying the MS bands before their participation in the I equation to avoid the parts of the MS bands outside the PAN band range, which reduces the radiance differences between the PAN band and the I image. This technique proposed using a modification coefficient (α_i) for each MS band according to its overlapping area with the PAN band, so that only the shared wavelengths between the PAN and MS images are used in the calculation of the I component. Therefore, the I component of the proposed fusion technique is an average of four modified MS bands, as shown in Eq. (3). The intersection area of each MS band with the PAN band was divided by the entire area of that MS band to calculate the modification coefficients (α_i). After calculating the full areas of the MS bands and their overlapping areas with the PAN band from the GeoEye-1 spectral response curve, the modification coefficients were as follows: α_red = 0.9885, α_green = 0.9470, α_blue = 0.8480, and α_NIR = 0.1733.

I = (α1 × R + α2 × G + α3 × B + α4 × NIR) / 4,  with α_i = Ā_i / A_i    (3)

where α_i is the modification coefficient of the MS band i, Ā_i is the intersection area of the MS band i with the PAN band, and A_i is the full area of the MS band i.

Then the I component will be as follows:

I = (0.9885 × R + 0.9470 × G + 0.8480 × B + 0.1733 × NIR) / 4
I = 0.247 × R + 0.237 × G + 0.212 × B + 0.043 × NIR    (4)

Because the reflectance of vegetation is high in the NIR band and low in the RGB bands, it was a mistake for the modification coefficient of the NIR band to be constant across the different classes of land cover. It should be increased as the agricultural areas within the image increase. Therefore, this technique suggested including an extra coefficient (β) for the NIR band in the formulation of the I equation, as shown in Eq. (5), to add the correct impact of vegetation. This coefficient depends on the percentage of agricultural areas within the image and varies across the types of land cover.

I = 0.247 × R + 0.237 × G + 0.212 × B + 0.043 × β × NIR    (5)

where β is the vegetation coefficient, which varies based on the percentage of the agricultural features within the image.

To determine the appropriate β coefficient for the agricultural dataset, it was fused by the proposed technique under different values of the β coefficient. It was found that increasing the β coefficient significantly enhances the spectral and spatial quality, as shown in Fig. 7. After testing numerous agricultural datasets acquired at different times and places, a β coefficient of 7 significantly improved the spectral and spatial resolution of the fused images. For mixed datasets, it was found that increasing the β coefficient enhances the spectral quality for agricultural features while reducing it for urban features. So, the proportion of the agricultural features within the image controls the spectral quality improvement. A lot of mixed datasets with different percentages of agricultural features were used to determine the appropriate β coefficient, which was found as shown in Table 1.

After calculating the β coefficient for the agricultural datasets and the mixed datasets with different ratios of agricultural features, the I equation for each land cover type will be as follows:

(1) For agricultural areas:

I = 0.247 × R + 0.237 × G + 0.212 × B + 0.301 × NIR    (6)

(2) For mixed areas:

• The percentage of agricultural areas is > 80%:

I = 0.247 × R + 0.237 × G + 0.212 × B + 0.172 × NIR    (7)

• The percentage of agricultural areas is 50–80%:

I = 0.247 × R + 0.237 × G + 0.212 × B + 0.129 × NIR    (8)

• The percentage of agricultural areas is 20–50%:

I = 0.247 × R + 0.237 × G + 0.212 × B + 0.086 × NIR    (9)

• The percentage of agricultural areas is < 20%:

I = 0.247 × R + 0.237 × G + 0.212 × B + 0.043 × NIR    (10)
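The intensity definitions above can be condensed into a small sketch. The α weights 0.247/0.237/0.212/0.043 and the β values (7 for purely agricultural scenes; 4, 3, 2, 1 by agricultural ratio for mixed and urban scenes) are taken from Eqs. (4)–(10) and Table 1; the function names and the flat-list band representation are illustrative assumptions.

```python
# Sketch of the proposed intensity component (Eqs. 4-10), with beta chosen
# from the agricultural-feature ratio as in Table 1. Bands are flat lists.

def beta_coefficient(agri_ratio, agricultural_only=False):
    """Return the vegetation coefficient beta for a scene."""
    if agricultural_only:       # purely agricultural dataset (Eq. 6)
        return 7.0
    if agri_ratio > 0.8:        # mixed, > 80% agricultural (Eq. 7)
        return 4.0
    if agri_ratio > 0.5:        # mixed, 50-80% (Eq. 8)
        return 3.0
    if agri_ratio > 0.2:        # mixed, 20-50% (Eq. 9)
        return 2.0
    return 1.0                  # mixed < 20% or urban (Eqs. 10-11)

def proposed_intensity(r, g, b, nir, beta):
    """Eq. (5): I = 0.247 R + 0.237 G + 0.212 B + 0.043 * beta * NIR, per pixel."""
    return [0.247 * ri + 0.237 * gi + 0.212 * bi + 0.043 * beta * ni
            for ri, gi, bi, ni in zip(r, g, b, nir)]
```

With β = 7 the NIR weight becomes 0.043 × 7 ≈ 0.301, matching Eq. (6); with β = 1 it collapses to the baseline Eq. (4) weights.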
6 The intersection areas of the GeoEye-1 spectral response curve Fig. 7 The impact of β coefficient on the spectral and spatial quality of the agricultural dataset Table 1 The appropriate β coefficient for the mixed datasets with different ratios of agricultural features Agricultural features ratio (%) 0–20 20–50 50–80 80–100 The appropriate β coefficient for mixed datasets 1 2 3 4 (3) For urban areas: datasets including different classes of land cover (agricul- tural, urban, and mixed areas) were chosen from the PAN image and its corresponding registered MS image. Then, the I  0.247 × R+ 0.237 × G fusion methods were utilized to fuse the three datasets. Fig- (11) +0.212 × B+ 0.043 × NIR ures 8, 9, and 10 show the original and fused images of all datasets. The following statistical parameters were used to evaluate the spectral characteristics of the output fused images: 5 Experiments and results After preprocessing steps, the MS image was geometrically aligned with the PAN image, and its pixel size was trans- formed to equal that of the PAN image. Three different • The correlation coefficient (CC) 123 Aerospace Systems Fig. 8 The original and fused images of GeoEye-1 dataset 1 (urban area), a original PAN, b original MS, c FIHS, d proposed, e PCA, f GS, g HCS, h Ehlers The CC uses the following equation to compute the corre- digital number of pixel i in the fused band, Y  the mean lation between each fused band and the associated MS band: of the digital numbers of the fused band, and n  the number of the pixels. i n The high values of the CC indicate that the fused image has [(X − X )(Y − Y )] i m i m great spectral quality. The sum of the CC values of all bands i 1 CC  (12) is divided by the bands’ number to calculate the average CC i n i n 2 2 between the output fused image and the original MS image. 
(X − X ) × (Y − Y ) i m i m i 1 i 1 where X  the digital number of pixel i in the MS band, X i m the mean of the digital numbers of the MS band, Y  the • Relative dimensionless global error in synthesis (ERGAS) 123 Aerospace Systems Fig. 9 The original and fused images of GeoEye-1 dataset 2 (agricultural area), a original PAN, b original MS, c FIHS, d proposed, e PCA, f GS, g HCS, h Ehlers ERGAS summarizes the errors in all bands. ERGAS can The absolute value of the mean difference (bias) between be calculated according to the following procedure: each fused band and its corresponding MS band is calculated The standard deviation of the difference image (SDD) according to the following equation: between each fused band and its corresponding MS band bias  X − Y (14) is calculated according to the following equation: m m The root mean square error (RMSE) between each fused band and its subsequent MS band is calculated according to i n the following equation: (X − Y ) i i i 1 SDD  (13) 2 2 RM SE  SDD + bi as (15) 123 Aerospace Systems Fig. 10 The original and fused images of GeoEye-1 dataset 3 (mixed area), a original PAN, b original MS, c FIHS, d proposed, e PCA, f GS, g HCS, h Ehlers ERGAS is calculated for the fused image as the following the pixels, (h/l)  the ratio between the pixel size of the PAN equation: image and that of the MS image, and N  the fused bands’ number. ERGAS measures the error in the fused bands because i N h 1 RM SE(B ) it is based on the RMSE of each band. Therefore, the low ERG AS  100 (16) l N (X ) values of the ERGAS indicate that the fused image has great i 1 spectral quality. Tables 2, 3, and 4 show the values of the CCs where X  the digital number of pixel i in the MS band, X and ERGAS for all datasets. 
i m the mean of the digital numbers of the MS band, Y  the digital number of pixel i in the fused band, Y  the mean of the digital numbers of the fused band, n  the number of 123 Aerospace Systems Table 2 Evaluation criteria (the Criterion FIHS Proposed PCA GS HCS Ehlers CCs and ERGAS) for dataset 1 (Urban area) CC Red 0.8682 0.9429 0.8584 0.8582 0.8399 0.8818 CC Green 0.8215 0.9206 0.8592 0.8579 0.7947 0.8366 CC Blue 0.7652 0.8914 0.8645 0.8619 0.8014 0.8470 CC NIR 0.9208 0.9666 0.8578 0.8615 0.8365 0.9033 CC (Av.) 0.8439 0.9304 0.8600 0.8599 0.8181 0.8672 ERGAS 6.0875 3.9674 5.8064 5.7991 9.3792 5.6587 HPF CC Red 0.9563 0.9727 0.9944 0.9927 0.8867 0.9722 HPF CC Green 0.9561 0.9757 0.9898 0.9884 0.8498 0.9656 HPF CC Blue 0.9547 0.9762 0.9816 0.9810 0.8445 0.9609 HPF CC NIR 0.9541 0.9604 0.9955 0.9927 0.8868 0.9772 HPF CC (Av.) 0.9553 0.9713 0.9903 0.9887 0.8670 0.9690 Table 3 Evaluation criteria (the Criterion FIHS Proposed PCA GS HCS Ehlers CCs and ERGAS) for dataset 2 (Agricultural area) CC Red 0.9728 0.9935 0.9444 0.9493 0.9690 0.9780 CC Green 0.9279 0.9631 0.9300 0.9349 0.9465 0.9654 CC Blue 0.9171 0.9661 0.9424 0.9473 0.9621 0.9745 CC NIR 0.9828 0.9928 0.9811 0.9860 0.9609 0.9810 CC (Av.) 0.9502 0.9789 0.9495 0.9544 0.9596 0.9747 ERGAS 3.8501 2.5824 3.9846 3.7736 3.6081 2.7036 HPF CC Red 0.9661 0.9670 0.9882 0.9887 0.9450 0.9676 HPF CC Green 0.9680 0.9738 0.9921 0.9926 0.9630 0.9598 HPF CC Blue 0.9670 0.9737 0.9897 0.9902 0.9398 0.9555 HPF CC NIR 0.9598 0.9475 0.9803 0.9808 0.9328 0.9625 HPF CC (Av.) 0.9652 0.9655 0.9876 0.9881 0.9452 0.9614 Table 4 Evaluation criteria (the Criterion FIHS Proposed PCA GS HCS Ehlers CCs and ERGAS) for dataset 3 (Mixed area) CC Red 0.9423 0.9826 0.9163 0.9262 0.9257 0.9712 CC Green 0.9259 0.9724 0.9068 0.9157 0.9079 0.9666 CC Blue 0.9261 0.9729 0.9142 0.9245 0.9190 0.9699 CC NIR 0.9547 0.9951 0.9770 0.9674 0.9178 0.9729 CC (Av.) 
[Tables 2–4 (continued): per-technique ERGAS and HPF CC (Red, Green, Blue, NIR, and average) values for the six fusion techniques on the first three datasets; the flattened column headers are not recoverable in this excerpt.]

To evaluate the spatial characteristics of the output fused images, the PAN and fused images were filtered by a high-pass Laplacian filter. Then, the CC values between the filtered PAN band and the filtered fused bands were determined according to Eq. (10). The sum of the HPF CC values of all bands is divided by the number of bands to calculate the average HPF CC between the output fused image and the original PAN image. High HPF CC values indicate that the fused image has great spatial quality. Tables 2, 3, and 4 show the values of the HPF CCs for all datasets.

Table 5 The information of the other four datasets
Dataset 6: Agriculture; Soca River, Slovenia; 46°13′53.53″N, 13°36′27.84″E; 2020-07-31
Dataset 7: Agriculture; Karlovcic, Serbia; 44°47′12.98″N, 19°59′50.73″E; 2011-06-01
Dataset 8: Mixed; Cape Town, South Africa; 33°55′51.82″S, 18°26′5.46″E; 2013-07-31
Dataset 9: Mixed; Genoa, Italy; 44°20′53.68″N, 9°13′5.41″E; 2018-09-19

6 Analysis of results

In terms of spectral quality, the results in Tables 2, 3, and 4 show that the proposed fusion technique, followed by the Ehlers method, offered the greatest spectral quality for the three data sets. Furthermore, the spectral quality of the Ehlers method is close to that of the proposed fusion technique for data sets 2 and 3 (agricultural and mixed areas). However, for data set 1 (urban area), the spectral quality of the proposed fusion technique is very high and far from that of the Ehlers method.

The reasons for the high spectral characteristics of the proposed fusion technique are: (1) using the modification coefficients for the MS bands in the formulation of the I image makes PAN and I spectrally very close to each other and reduces the gray-level differences between them; and (2) adding the variable coefficient for the NIR band in the formulation of the I image provides the correct reflectance of the vegetation in the NIR band.

The reason for the high spectral quality of the Ehlers fusion technique is the use of the fast Fourier transform, which transforms the PAN and I images into the frequency domain, where the spatial information can be easily enhanced or suppressed. Applying the high-pass and low-pass filters to the PAN image and the I component, consequently, prevents inserting new gray-level values into the MS image during the process of injecting the spatial details.

The other fusion methods are ranked from best to worst as follows: PCA, GS, FIHS, HCS for data set 1 (urban area); HCS, GS, FIHS, PCA for data set 2 (agricultural area); and FIHS, GS, PCA, HCS for data set 3 (mixed area). It should be noted that the type of scene and how the scene's various features are distributed affect how well a given fusion technique performs.

The FIHS fusion method produces low spectral characteristics because all bands are equally represented in the definition of the I equation, while the spectral association between the PAN image and each of the MS bands is different. According to the GeoEye-1 spectral response curve, while the red band has perfect overlap with the PAN band and the green and blue bands have acceptable overlap, the NIR band has poor overlap. The spectral differences between the PAN and I images are the main source of color distortion.

In terms of spatial quality, the results in Tables 2, 3, and 4 show that all fusion techniques provide spatial improvements in the fused images with different levels of sharpness. The edges of fields in the agricultural areas and the sharper edges and small objects in the urban areas are visible in the output fused images. For all data sets, the GS and PCA fusion techniques provided the greatest spatial quality. Moreover, the PCA fusion method provided a spatial quality that is slightly better than the GS fusion method for data set 1 (urban area), while the GS fusion method provided a spatial quality that is slightly better than the PCA fusion method for data sets 2 and 3 (agricultural and mixed areas).

The proposed fusion technique followed the PCA and GS fusion techniques and produced significant and stable results for all land cover types. The spatial quality obtained by the Ehlers fusion technique is also acceptable. For all data sets, the worst spatial characteristics were shown in the fused images of the HCS fusion method.

In terms of visual inspection, Figs. 8, 9, and 10 show that, parallel to the statistical assessment, the proposed fusion technique produced undistorted fused images for the three data sets. The Ehlers fusion technique provided acceptable spectral quality with low color distortion. On the other hand, the FIHS, PCA, and GS fusion methods produced some color distortion in the green areas, and the HCS fusion method produced some color distortion in the dark areas.

The PCA and GS fusion techniques produced high spatial quality in the three data sets, followed by the proposed fusion technique. The HCS fused images are the smoothest among the resultant fused images for the three data sets.

Overall evaluation: The analysis of the results in Tables 2, 3, and 4 and the visual inspection of Figs. 8, 9, and 10 revealed that the FIHS fusion technique had a problem with spectral quality because of the equal representation of the MS bands in the definition of the I component. It showed poor spectral quality for the urban and mixed areas, while its performance was acceptable for the agricultural areas. On the other hand, it provided high spatial quality for all datasets. The PCA and GS fusion techniques produced the highest spatial quality for all land cover classes. This was at the expense of spectral quality, as they produced poor spectral quality for the urban and mixed datasets and acceptable quality for the agricultural dataset. While the spectral and spatial performance of the HCS fusion technique was acceptable for the agricultural areas, it was very poor for the urban and mixed areas. The Ehlers fusion technique provided significant spatial quality for all land cover types; its spectral performance was significant in the agricultural and mixed datasets but poor in the urban datasets. As the proposed fusion technique uses modification coefficients for the MS bands in the formulation of the I component to use only the wavelengths that overlap with the PAN band, it reduces color distortion and accomplishes spatiotemporal transferability. Additionally, this technique uses an additional coefficient for the NIR band, based on the ratio of the agricultural features within the image, to indicate the correct impact of vegetation, which provides performance stability across the different land cover classes. Therefore, the proposed fusion technique could overcome the limitations of the existing fusion techniques and produce significant spectral and spatial quality for all land cover types.
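The assessment above rests on three measures: the band-to-band correlation coefficient (CC) for spectral quality, the high-pass-filtered correlation coefficient (HPF CC) for spatial quality, and ERGAS. The following is a minimal NumPy sketch of these metrics, not the authors' code; the particular 3×3 Laplacian kernel and the GeoEye-1 MS/PAN resolution ratio of 4 (nominal 2 m MS vs. 0.5 m PAN products) are assumptions of this sketch.

```python
import numpy as np

def cc(a, b):
    """Pearson correlation coefficient (CC) between two bands."""
    return float(np.corrcoef(a.ravel().astype(float), b.ravel().astype(float))[0, 1])

def laplacian(img):
    """High-pass Laplacian filtering with an assumed 3x3 kernel
    (center 8, neighbors -1); border pixels are left at zero."""
    img = img.astype(float)
    out = np.zeros_like(img)
    out[1:-1, 1:-1] = (8 * img[1:-1, 1:-1]
                       - img[:-2, :-2] - img[:-2, 1:-1] - img[:-2, 2:]
                       - img[1:-1, :-2] - img[1:-1, 2:]
                       - img[2:, :-2] - img[2:, 1:-1] - img[2:, 2:])
    return out

def hpf_cc(pan, fused_band):
    """Spatial quality: CC between the high-pass filtered PAN band
    and a high-pass filtered fused band (the HPF CC of Eq. 10)."""
    return cc(laplacian(pan), laplacian(fused_band))

def ergas(ms, fused, ratio=4.0):
    """ERGAS (lower is better); `ratio` is the MS/PAN pixel-size ratio,
    assumed to be 4 for GeoEye-1. `ms` and `fused` are (bands, H, W)."""
    terms = [(np.sqrt(np.mean((m.astype(float) - f.astype(float)) ** 2)) / m.mean()) ** 2
             for m, f in zip(ms, fused)]
    return float(100.0 / ratio * np.sqrt(np.mean(terms)))
```

With these helpers, per-band values like those in Tables 2–4 can be recomputed from any original/fused image pair; averaging `cc` over the four bands gives the CC (Av.) column.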
7 Validation of the proposed fusion technique

When a novel method is proposed, it should be thoroughly assessed from the perspective of spatiotemporal transferability to guarantee its effectiveness across various locations and times. In addition to the first three datasets from Tanta, Egypt, the proposed fusion technique was validated using six additional datasets from different locations, acquired at different times, and covering different land cover classes (urban, agricultural, and mixed). Vivone et al. (2021) established a reference dataset consisting of 14 PAN and MS image pairs acquired by several satellites over a variety of topography to evaluate new fusion techniques [15]. There were two PAN and MS image pairs of the GeoEye-1 satellite in this dataset. Therefore, they were used in the validation process under the names of dataset 4 and dataset 5, which cover urban areas in Trenton, United States of America, and London, United Kingdom, respectively. Furthermore, four other datasets, under the names of dataset 6 through dataset 9, covering agricultural and mixed areas in different places and acquired at different times, were also used in the validation process; their information is shown in Table 5.

The proposed fusion technique was validated against the original FIHS fusion technique for the six datasets to make sure it maintained its superiority. The original and fused images of the six datasets are shown in Figs. 11, 12, 13, 14, 15, and 16. The spectral quality of the fused images was assessed by the correlation coefficients between the fused bands and the original MS bands. Also, the spatial quality of the fused images was assessed by the correlation coefficients between the filtered fused bands and the filtered PAN band. For the six datasets, the spectral and spatial quality of the fused images is shown in Tables 6, 7, 8, 9, 10, and 11.

Fig. 11 The original and fused images of GeoEye-1 dataset 4 (urban area): a original PAN, b original MS, c proposed, d FIHS
Fig. 12 The original and fused images of GeoEye-1 dataset 5 (urban area): a original PAN, b original MS, c proposed, d FIHS
Fig. 13 The original and fused images of GeoEye-1 dataset 6 (agricultural area): a original PAN, b original MS, c proposed, d FIHS
Fig. 14 The original and fused images of GeoEye-1 dataset 7 (agricultural area): a original PAN, b original MS, c proposed, d FIHS
Fig. 15 The original and fused images of GeoEye-1 dataset 8 (mixed area): a original PAN, b original MS, c proposed, d FIHS
Fig. 16 The original and fused images of GeoEye-1 dataset 9 (mixed area): a original PAN, b original MS, c proposed, d FIHS

Table 6 Evaluation criteria (the correlation coefficients) for dataset 4 (urban area)
Criterion   CC Red   CC Green   CC Blue   CC NIR   CC (Av.)
Proposed    0.9470   0.9431     0.9418    0.9391   0.9428
FIHS        0.9280   0.9233     0.9194    0.9336   0.9261
Criterion   HPF CC Red   HPF CC Green   HPF CC Blue   HPF CC NIR   HPF CC (Av.)
Proposed    0.9777       0.9776         0.9757        0.9320       0.9658
FIHS        0.9701       0.9711         0.9684        0.9566       0.9666

Table 7 Evaluation criteria (the correlation coefficients) for dataset 5 (urban area)
Criterion   CC Red   CC Green   CC Blue   CC NIR   CC (Av.)
Proposed    0.9221   0.9302     0.9111    0.9369   0.9251
FIHS        0.8819   0.9061     0.8562    0.9327   0.8942
Criterion   HPF CC Red   HPF CC Green   HPF CC Blue   HPF CC NIR   HPF CC (Av.)
Proposed    0.9681       0.9616         0.9643        0.9203       0.9536
FIHS        0.9574       0.9588         0.9539        0.9482       0.9546

Table 8 Evaluation criteria (the correlation coefficients) for dataset 6 (agricultural area)
Criterion   CC Red   CC Green   CC Blue   CC NIR   CC (Av.)
Proposed    0.9454   0.9037     0.8917    0.9865   0.9318
FIHS        0.9313   0.8858     0.8721    0.9763   0.9164
Criterion   HPF CC Red   HPF CC Green   HPF CC Blue   HPF CC NIR   HPF CC (Av.)
Proposed    0.8735       0.9032         0.8612        0.8101       0.8620
FIHS        0.8264       0.8490         0.8173        0.7513       0.8110

Table 9 Evaluation criteria (the correlation coefficients) for dataset 7 (agricultural area)
Criterion   CC Red   CC Green   CC Blue   CC NIR   CC (Av.)
Proposed    0.9730   0.8683     0.9428    0.9886   0.9432
FIHS        0.9489   0.7927     0.8857    0.9884   0.9039
Criterion   HPF CC Red   HPF CC Green   HPF CC Blue   HPF CC NIR   HPF CC (Av.)
Proposed    0.8704       0.8904         0.8738        0.8578       0.8731
FIHS        0.8578       0.8718         0.8609        0.8398       0.8576

Table 10 Evaluation criteria (the correlation coefficients) for dataset 8 (mixed area)
Criterion   CC Red   CC Green   CC Blue   CC NIR   CC (Av.)
Proposed    0.9161   0.9553     0.8976    0.9854   0.9386
FIHS        0.8454   0.9085     0.8131    0.9680   0.8838
Criterion   HPF CC Red   HPF CC Green   HPF CC Blue   HPF CC NIR   HPF CC (Av.)
Proposed    0.9817       0.9779         0.9830        0.9141       0.9642
FIHS        0.9627       0.9638         0.9611        0.9281       0.9539

Table 11 Evaluation criteria (the correlation coefficients) for dataset 9 (mixed area)
Criterion   CC Red   CC Green   CC Blue   CC NIR   CC (Av.)
Proposed    0.9537   0.8989     0.9012    0.9863   0.9350
FIHS        0.9279   0.8480     0.8541    0.9765   0.9016
Criterion   HPF CC Red   HPF CC Green   HPF CC Blue   HPF CC NIR   HPF CC (Av.)
Proposed    0.9506       0.9574         0.9511        0.8978       0.9392
FIHS        0.9383       0.9416         0.9376        0.8941       0.9279

The results in Tables 6, 7, 8, 9, 10, and 11 show that the proposed fusion technique had the highest spectral quality for all datasets. It provided a spectral quality that was roughly 1.5–5.5% better than that of the FIHS fusion technique. This is because (1) using only the parts of the MS bands inside the area of the PAN band in the definition of the I equation reduces the difference in radiometry between the PAN band and the MS bands and decreases the resultant color distortion, and (2) using the additional coefficient β for the NIR band in the definition of the I equation, which is variable for all types of land cover based on the percentage of the agricultural areas within the image, adds the correct effect of vegetation and produces significant and stable performance for all types of land cover. Both fusion techniques produced close spatial quality for the urban datasets, but for the agricultural and mixed datasets, the proposed fusion technique outperformed the original FIHS fusion technique by about 1–5%. The visual inspection of Figs. 11, 12, 13, 14, 15, and 16 demonstrated that the proposed fusion technique significantly enhanced the spectral quality, reduced color distortion, and produced sharp fused images for all land cover types.
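The validation discussion restates the core of the method: build a weighted intensity image I from the MS bands, then inject the PAN detail into each band as F_k = M_k + (PAN − I). The sketch below illustrates only that structure; the (R, G, B, NIR) band order, the placeholder weights, the NDVI-based vegetation fraction, and any mapping from that fraction to β are all assumptions, since this excerpt does not give the paper's actual coefficient values.

```python
import numpy as np

def fihs_like_fusion(pan, ms, weights, beta):
    """Generalized FIHS-style injection, F_k = M_k + (PAN - I).

    `ms` is a (4, H, W) array in assumed (R, G, B, NIR) order; `weights`
    stand in for the paper's per-band modification coefficients and
    `beta` for its variable NIR coefficient (placeholder values here).
    """
    r, g, b, nir = (band.astype(float) for band in ms)
    w_r, w_g, w_b, w_n = weights
    intensity = ((w_r * r + w_g * g + w_b * b + beta * w_n * nir)
                 / (w_r + w_g + w_b + beta * w_n))
    delta = pan.astype(float) - intensity  # PAN detail to inject
    return np.stack([r + delta, g + delta, b + delta, nir + delta])

def vegetation_fraction(red, nir, threshold=0.3):
    """Share of pixels whose NDVI exceeds a threshold: an assumed proxy
    for the 'ratio of agricultural features' that drives beta."""
    ndvi = (nir - red) / np.maximum(nir + red, 1e-9)
    return float(np.mean(ndvi > threshold))
```

With equal weights and β = 1, the intensity reduces to the classic FIHS definition I = (R + G + B + NIR)/4, i.e., the baseline against which the proposed technique is compared.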
8 Conclusion

This study demonstrated that the proposed fusion technique can significantly solve the problem of color distortion and produce fused images of superior spectral and spatial resolution for all land cover classes because of: (1) using modification coefficients for the MS bands, in the formulation of the I equation, which are calculated based on their overlapping areas with the PAN band; and (2) adding a variable coefficient for the NIR band, which is dependent on the proportion of the agricultural regions within the image. In addition to reducing the radiometric differences between the PAN image and the I component, the modification coefficients provide spatiotemporal transferability for the proposed fusion technique. While the vegetation coefficient indicates the correct impact of vegetation in the NIR band, it also provides stability for the proposed fusion technique across all land cover classes. In future work, this technique will be developed for use with satellites that have different spectral response curves.

Funding The authors have not disclosed any funding.

Declarations

Conflict of interest The authors declare no competing interests.

Ethical approval Not applicable.

Consent to participate Not applicable.

Consent to publish Not applicable.

Open Access This article is licensed under a Creative Commons Attribution 4.0 International License, which permits use, sharing, adaptation, distribution and reproduction in any medium or format, as long as you give appropriate credit to the original author(s) and the source, provide a link to the Creative Commons licence, and indicate if changes were made. The images or other third party material in this article are included in the article's Creative Commons licence, unless indicated otherwise in a credit line to the material. If material is not included in the article's Creative Commons licence and your intended use is not permitted by statutory regulation or exceeds the permitted use, you will need to obtain permission directly from the copyright holder. To view a copy of this licence, visit http://creativecommons.org/licenses/by/4.0/.

References

1. Ghassemian H (2016) A review of remote sensing image fusion methods. Inf Fusion 32:75–89. https://doi.org/10.1016/j.inffus.2016.03.003
2. Elshora M, Afify H, Younes S (2018) Implementation of fusion techniques on GeoEye-1 satellite imagery. J Al-Azhar Univ Eng Sector 40(4):191–207
3. Gharbia A, Hassanien A, El-Baz AH, Elhoseny M, Gunasekarane M (2018) Multi-spectral and panchromatic image fusion approach using stationary wavelet transform and swarm flower pollination optimization for remote sensing applications. Future Gener Comput Syst 88:501–511. https://doi.org/10.1016/j.future.2018.06.022
4. Wady SMA, Bentoutou Y, Bengermikh A, Bounoua A, Taleb N (2020) A new IHS and wavelet based pansharpening algorithm for high spatial resolution satellite imagery. Adv Sp Res 66:1507–1521. https://doi.org/10.1016/j.asr.2020.06.001
5. Firouz AW, Kalyankar NV, Al-Zuky AA (2011) The IHS transformations-based image fusion. J Glob Res Comput Sci 2(5):70–77. https://doi.org/10.48550/arXiv.1107.4396
6. Ozay EK, Tunga B (2020) A novel method for multispectral image pansharpening based on high dimensional model representation. Expert Syst Appl 170:114512. https://doi.org/10.1016/j.eswa.2020.114512
7. Tu TM, Huang PS, Hung CL, Chang CP (2004) A fast intensity-hue-saturation fusion technique with spectral adjustment for IKONOS imagery. IEEE Geosci Remote Sens Lett 1(4):309–312. https://doi.org/10.1109/LGRS.2004.834804
8. Zeng Y, Yi W, Deng J, Chen W, Xu S, Huang S (2020) Remote sensing image fusion using improved IHS and non-subsampled contourlet transform. In: Yuan X, Elhoseny M (eds) Urban intelligence and applications, studies in distributed intelligence. Springer, Cham, pp 55–67. https://doi.org/10.1007/978-3-030-45099-1_5
9. Song S, Liu J, Pu H, Liu Y, Luo J (2019) The comparison of fusion methods for HSRRSI considering the effectiveness of land cover (features) object recognition based on deep learning. Remote Sens 11(12):1435. https://doi.org/10.3390/rs11121435
10. Ricotta C, Avena GC, Volpe F (1999) The influence of principal component analysis on the spatial structure of multispectral dataset. Int J Remote Sens 20(17):3367–3376. https://doi.org/10.1080/014311699211381
11. Pohl C, Van Genderen JL (1998) Multisensor image fusion in remote sensing: concepts, methods and applications. Int J Remote Sens 19(5):823–854. https://doi.org/10.1080/014311698215748
12. Maurer T (2013) How to pan-sharpen images using the Gram-Schmidt pan sharpen method. Int Arch Photogramm Remote Sens Spatial Inf Sci XL-1/W1:239–244. https://doi.org/10.5194/isprsarchives-XL-1-W1-239-2013
13. Padwick C, Deskevich M, Pacifici F, Smallwood S (2010) WorldView-2 pan-sharpening. In: ASPRS 2010 Annual Conference, San Diego, California
14. Ehlers M, Klonus S (2013) Image fusion using the Ehlers spectral characteristics preservation algorithm. GISci Remote Sens 44(2):93–116. https://doi.org/10.2747/1548-1603.44.2.93
15. Vivone G, Mura MD, Garzelli A, Pacifici F (2021) A benchmarking protocol for pansharpening: dataset, preprocessing, and quality assessment. IEEE J Sel Top Appl Earth Observ Remote Sens 14:6102–6118. https://doi.org/10.1109/JSTARS.2021.3086877

Journal: Aerospace Systems (Springer Journals)

Published: Mar 15, 2023

