In this paper, image fusion is performed using images acquired by different cameras mounted on an unmanned aerial vehicle (UAV). Producing the fused image improves both the spatial resolution of the multispectral (MS) image and the classification accuracy. First, however, the horizontal and vertical accuracy of the generated products (orthophoto mosaics and digital surface models) is determined using checkpoints that did not participate in the processing of the image blocks, and the change in these accuracies with a 50% increase (or decrease) of the UAV's flight height is also determined. The study area is the Early Christian Basilica C and the flanking Roman buildings at the archaeological site of Amphipolis (Eastern Macedonia, Greece).
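The checkpoint-based accuracy assessment described above amounts to computing per-axis root-mean-square errors over the CP residuals. The sketch below illustrates this computation; the residual values are purely illustrative (in the study they come from GNSS-surveyed checkpoints compared against the orthophoto mosaics and DSMs), and the function name is ours, not from the paper's software:

```python
# Sketch of the checkpoint (CP) accuracy assessment: per-axis RMSE over
# paired measured/reference coordinates. Data below are hypothetical.
import math

def rmse_components(measured, reference):
    """Return (RMSE_X, RMSE_Y, RMSE_Z) over paired (x, y, z) tuples."""
    n = len(measured)
    sq = [0.0, 0.0, 0.0]
    for m, r in zip(measured, reference):
        for i in range(3):
            sq[i] += (m[i] - r[i]) ** 2
    return tuple(math.sqrt(s / n) for s in sq)

# Hypothetical CP coordinates measured on the products vs. reference (metres)
measured = [(0.02, -0.01, 0.04), (-0.03, 0.02, -0.05), (0.01, 0.00, 0.03)]
reference = [(0.0, 0.0, 0.0)] * 3

rmse_x, rmse_y, rmse_z = rmse_components(measured, reference)
rmse_xy = math.hypot(rmse_x, rmse_y)  # combined horizontal accuracy
```

Repeating the computation on the 30 m and 45 m flight-height products is what allows the height-dependent accuracy comparison reported in the study.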
Greece in Europe and the location of ancient Amphipolis within the Greek territory.
On the left, the wider area of ancient Amphipolis: the contour lines (relief) in light gray, the modern road network in dark gray, the walls of the ancient city in black, and, in red at the center, the position of the Early Christian Basilica III. On the right, the MS orthophoto mosaic from the Sequoia+ (bands: G, R, near-infrared (NIR)) of the Early Christian Basilica III and the adjacent Roman buildings.
Panoramic photograph of the study area, taken from the ground at the path in the eastern part of the right image of Figure 2. Among other things, the elevation differences (relief) are evident.
Distribution of GCPs (symbol: triangle) and CPs (symbol: circle) (background: Sequoia+ MS orthophoto mosaic, bands: G, R, NIR).
Excerpt of an RGB Phantom image. On the left, the mosaic of the narthex of the temple, part of which is depicted in the section on the production of the fused image. On the right, the 24 × 24 cm paper targets of the GCPs and CPs.
Two images acquired with a flight direction from south to north. On the left, the RGB image of the RGB Phantom; the yellow frame outlines the surface covered by the images (RGB or MS) of the Sequoia+. On the right, the Green band of the MS image of the Sequoia+.
(**a**) DSM and (**b**) RGB orthophoto mosaic of the RGB Phantom, 30 m flight height; (**c**) DSM and (**d**) RGB orthophoto mosaic of the RGB Phantom, 45 m flight height; (**e**) DSM and (**f**) MS orthophoto mosaic (bands: G, R, NIR) of the Sequoia+, 30 m flight height.
(**g**) DSM and (**h**) MS orthophoto mosaic (bands: G, R, NIR) of the Sequoia+, 45 m flight height; (**i**) DSM and (**j**) RGB orthophoto mosaic of the Sequoia+, 30 m flight height; (**k**) DSM and (**l**) RGB orthophoto mosaic of the Sequoia+, 45 m flight height.
Graph of the average values and standard deviations of the CP errors for both flight heights.
Excerpt of the narthex mosaic of Figure 5. (**a**) Excerpt of the RGB Phantom's RGB orthophoto mosaic; (**b**) excerpt of the PPAN image of the RGB Phantom; (**c**) excerpt of the MS orthophoto mosaic (bands: R, RedEdge, NIR) of the Sequoia+; (**d**) excerpt of the fused image (PCA2: R, PCA3: RedEdge, PCA4: NIR).
Mosaic excerpt of the nave. (**a**) Excerpt of RGB Phantom's RGB orthophoto mosaic; (**b**) Excerpt of the PPAN image of RGB Phantom; (**c**) Excerpt of the MS orthophoto mosaic (bands: R, RedEdge, NIR) of Sequoia+; (**d**) Excerpt of the fused image (PCA2: R, PCA3: RedEdge, PCA4: NIR).
(**a**) Excerpt of the MS orthophoto mosaic (bands: R, RedEdge, NIR) of the Sequoia+ of the narthex mosaic; (**b**) the classification image of (**a**); (**c**) excerpt of the fused image (PCA2: R, PCA3: RedEdge, PCA4: NIR); (**d**) the classification image of (**c**).
(**a**) Excerpt of the MS orthophoto mosaic (bands: R, RedEdge, NIR) of the Sequoia+ of the nave's mosaic; (**b**) the classification image of (**a**); (**c**) excerpt of the fused image (PCA2: R, PCA3: RedEdge, PCA4: NIR); (**d**) the classification image of (**c**).
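The PCA-based component-substitution fusion referenced in the captions above (a high-resolution pseudo-panchromatic band replacing the first principal component of the MS bands) can be sketched as follows. This is a minimal illustration under common assumptions, not the exact pipeline used in the study; the MS stack is assumed already co-registered and resampled to the PPAN grid:

```python
# Minimal PCA component-substitution pan-sharpening sketch (illustrative).
import numpy as np

def pca_fusion(ms, pan):
    """ms: (H, W, B) multispectral stack; pan: (H, W) pseudo-panchromatic."""
    h, w, b = ms.shape
    X = ms.reshape(-1, b).astype(float)
    mean = X.mean(axis=0)
    Xc = X - mean
    # Principal components of the MS bands (eigh returns ascending order)
    eigval, eigvec = np.linalg.eigh(np.cov(Xc, rowvar=False))
    eigvec = eigvec[:, np.argsort(eigval)[::-1]]
    pcs = Xc @ eigvec
    p = pan.reshape(-1).astype(float)
    # eigh fixes eigenvector signs arbitrarily; flip PC1 so it correlates
    # positively with the pan band before substitution
    if np.corrcoef(pcs[:, 0], p)[0, 1] < 0:
        eigvec[:, 0] *= -1
        pcs[:, 0] *= -1
    # Histogram-match (mean/std) the pan band to PC1, then substitute it
    p = (p - p.mean()) / (p.std() + 1e-12) * pcs[:, 0].std() + pcs[:, 0].mean()
    pcs[:, 0] = p
    # Inverse transform back to band space (eigvec is orthonormal)
    return (pcs @ eigvec.T + mean).reshape(h, w, b)
```

The fused bands then feed the same classifier as the original MS bands, which is how the classification comparison in the last two figures is set up.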