09:30-09:50: László Bertalan, Brigitta Tóth – Welcome from the local organizers
09:50-10:05: Salvatore Manfreda – The achievements of the HARMONIOUS COST Action
10:05-10:25: Plenary talk: Francesco Nex – Towards real-time UAV mapping: examples, challenges and opportunities
10:25-10:55: Plenary talk: James Dietrich – Drones for River Monitoring, a ten-year perspective
10:55-11:20 Coffee Break
11:20-11:35: Eyal Ben-Dor – Summary of WG5: Harmonization of methods and results
11:35-11:50: Sorin Herban – Summary of WG1: UAS data processing
11:50-12:05: Jana Müllerová – Summary of WG2: Vegetation status (part 1)
12:05-12:20: Antonino Maltese – Summary of WG2: Vegetation status (part 2)
12:20-12:35: Yijian Zeng – Summary of WG3: Soil moisture content
12:35-12:50: Dariia Strelnikova – Summary of WG4: River monitoring
12:50-14:30: Lunch break
14:30-14:45: Gábor Papp – HungaroControl’s Air-Ground-Air communication concept to enable the UAV ecosystem
14:45-15:00: Géza Király et al. – UAS and their application in forest monitoring
15:00-15:15: Gábor Bakó et al. – HRAM: High Spatial Resolution Aerial Monitoring Network for Nature Conservation
15:15-15:30: Ferenc Kovács et al. – Application of UAV imagery in environmental research at the University of Szeged
15:30-15:45: Anette Eltner et al. – Hydro-morphological mapping of river reaches using videos captured with UAS
15:45-16:00: Ilyan Kotsev et al. – UAS-aided bedform and habitat mapping of Bolata Cove, Bulgarian Black Sea
16:00-16:30: Coffee Break
16:30-16:50: Lance R. Brady – UAS for Research and Applied Science in the United States Geological Survey
16:50-17:05: Kamal Jain et al. – Crop identification and classification from UAV images using conjugated dense convolutional neural network
17:05-17:20: Nicolas Francos et al. – Mapping Water Infiltration Rate Using Ground and UAV Hyperspectral Data: A Case Study of Alento, Italy
17:20-17:35: Martin Jolley et al. – Considerations When Applying UAS-based Large-Scale PIV and PTV for Determining River Flow Velocity
17:35-17:50: Adrian Gracia-Romero et al. – UAS plant phenotyping under abiotic stresses
17:50-18:05: Shawn C. Kefauver et al. – High-resolution UAV Imaging for Forest Productivity Monitoring
Considering this year’s meeting situation, the organizing team has decided it is necessary to postpone the event by one year, to ensure a successful and safe on-site meeting and good travel conditions for all participants.
The new conference date is 19–22 September 2022.
Looking forward to an exciting exchange in the historical center of sunny Naples!
The organizing committee:
Günter Blöschl (TU Wien, Austria)
Isabelle Braud (Irstea, France)
Gabrielle de Lannoy (KU Leuven, Belgium)
Karsten Høgh Jensen (University of Copenhagen, Denmark)
Laurent Pfister (Luxembourg Institute of Science and Technology LIST, Luxembourg)
Nunzio Romano, Salvatore Manfreda and Paolo Nasta (University of Naples Federico II, Italy)
Sonia Seneviratne (ETH Zurich, Switzerland)
Ana Maria Tarquis (Universidad Politécnica de Madrid, Spain)
Ilja van Meerveld (University of Zurich, Switzerland)
Harry Vereecken, Heye Bogena, Ralf Kunkel and Roland Baatz (Forschungszentrum Jülich, Germany)
Marc Voltz (INRAE, France)
Yijian Zeng (Twente University, the Netherlands)
University of Naples Federico II and University of Thessaly are organizing the 5th EWaS (Efficient Water Systems) International Conference on “Water Security and Safety Management: emerging threats or new challenges? Moving from Therapy and Restoration to Prognosis and Prevention”. The meeting will take place in Naples (Italy) from 13th to 16th July, 2022.
Abstracts should be submitted by 15 December 2021 using the following platform:
UAS-based surveys and structure from motion (SfM) can lead to extraordinary and realistic 3D models to preserve our cultural heritage.
In recent applications, our members have been developing new strategies to build extremely detailed point clouds using UAS and portable cameras. Below, we provide some examples developed within the HARMONIOUS partnership:
During this meeting, WG1 finalized the Glossary of terms used for UAS-based applications, organized into three macro-categories: platforms and equipment, software, and outputs.
1 Category: Platforms and Equipment
Global Navigation Satellite System (GNSS) is a constellation of satellites used for positioning a receiver on the ground.
GALILEO is the European GNSS solution used to determine the ground position of an object.
GPS is the most common GNSS, based on the reception of signals from about 24 orbiting satellites operated by the USA, and is used to determine the ground position of an object. This global and accurate system allows users to know their exact location, velocity, and time 24 hours per day, anywhere in the world.
Light Detection and Ranging (LiDAR) uses laser pulses to measure distances and locate the acquired point cloud in 3D remote sensing. LiDAR data products are often managed within a gridded or raster data format.
Multispectral imaging captures image data within specific wavelength ranges across the electromagnetic spectrum. The spectral regions used are often at least partially outside the visible range, covering parts of the infrared and ultraviolet regions. For example, a multispectral imager may provide wavelength channels for near-UV, red, green, blue, near-infrared, mid-infrared and far-infrared light – sometimes even thermal radiation.
Near Infrared (NIR) is a subset of the infrared band that lies just outside the range of what humans can see. NIR cameras typically cover the wavelength range of 900 to 1700 nm, which is well suited for analysing absorption and radiation characteristics.
Noise is an irregular fluctuation that accompanies a transmitted electrical signal but is not part of it and tends to obscure it. The main sources of noise fall into two categories: physical noise, linked to physical constraints such as the corpuscular nature of light, and hardware noise, linked to electronic and mechanical limitations of the camera.
Optical Camera is a photographic device aimed to form and record an image of an object. An optical camera sensor is an imager that collects visible light (400~700nm).
Payload is the weight a drone or unmanned aerial vehicle (UAV) can carry on board. It is usually counted outside of the weight of the drone itself and includes anything additional to the drone – such as extra cameras, sensors, or packages for delivery.
Pixel size of an image identifies the spatial resolution and it is dependent on the sensor capabilities. It provides a measure of the image resolution, which is higher with finer grids, where the degree of recognizable details increases.
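The link between pixel size, sensor, and flight height can be made concrete with the standard ground sampling distance (GSD) formula for a nadir image. A minimal sketch in Python, with hypothetical sensor values (2.4 µm pixel pitch, 8.8 mm focal length, 100 m flight height):

```python
def ground_sampling_distance(pixel_pitch_m, focal_length_m, altitude_m):
    """Ground sampling distance (metres per pixel) of a nadir image:
    the on-ground footprint of one sensor pixel."""
    return pixel_pitch_m * altitude_m / focal_length_m

# Hypothetical sensor: 2.4 um pixel pitch, 8.8 mm focal length, flown at 100 m
gsd = ground_sampling_distance(2.4e-6, 8.8e-3, 100.0)
print(f"GSD = {gsd * 100:.1f} cm/pixel")  # finer GSD = higher spatial resolution
```

Flying lower or using a longer focal length shrinks the GSD, which is why spatial resolution is a mission-planning choice as much as a sensor property.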
RGB Camera is equipped with a standard Complementary Metal Oxide Semiconductor (CMOS) sensor through which colour images of persons and objects are acquired. In a CMOS sensor, the charge from each photosensitive pixel is converted to a voltage at the pixel site, and the signal is read out by row and column to multiple on-chip Analog-to-Digital Converters (ADCs). In an RGB camera, still-photo resolution is commonly expressed in megapixels, which define the number of pixels in a single photo, while video resolution is usually expressed with terms such as Full HD or Ultra HD.
Thermal Camera is a non-contact temperature measurement sensor. All objects above absolute zero emit infrared energy as a function of their temperature: the higher the temperature of an object, the faster its atoms and molecules move and the more infrared radiation they emit. This radiation is invisible to the naked eye, although our skin can feel it as heat. Thermal imaging uses special infrared sensors to visualise this part of the spectrum. Thermal energy can be emitted, absorbed, or reflected; infrared cannot see through objects, but it can detect differences in radiated thermal energy between materials.
Unmanned Aerial System (UAS) is a remotely controlled professional system integrating several technological components (e.g., navigation system, gyroscope, and sensors) in order to perform spatial observations.
Unmanned Aerial Vehicle (UAV) is a remotely controlled vehicle able to perform several operations and observations.
2 Category: Software
Aero-triangulation is the method most frequently applied in photogrammetry to determine the X, Y, and Z ground coordinates of individual points based on photo-coordinate measurements. Its purpose is to increase the density of a geodetic network in order to provide images with a sufficient number of control points for topographic mapping. Deliverables from aero-triangulation may be three-dimensional or planimetric, depending on the number of point coordinates determined.
Checkpoints are Ground Control Points (GCPs) used to validate the relative and absolute accuracy of the geo-localization of maps. The checkpoints are not used for processing. Instead, they are used to calculate the error of the map by comparing the known measured locations of the checkpoints to the coordinates of the checkpoints shown on the map.
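The map error computed from checkpoints is typically summarised as a root-mean-square error (RMSE) between the surveyed coordinates and the coordinates read off the map. A minimal sketch with hypothetical checkpoint coordinates (the function name and values are illustrative):

```python
import math

def checkpoint_rmse(measured, mapped):
    """Horizontal RMSE between surveyed checkpoint coordinates and the
    coordinates of the same checkpoints as shown on the final map.
    Both inputs are lists of (x, y) tuples in the same units."""
    sq = [(mx - px) ** 2 + (my - py) ** 2
          for (mx, my), (px, py) in zip(measured, mapped)]
    return math.sqrt(sum(sq) / len(sq))

# Hypothetical checkpoints (metres, local grid)
surveyed = [(100.00, 200.00), (150.00, 250.00)]
on_map   = [(100.03, 199.98), (149.96, 250.02)]
print(f"RMSE = {checkpoint_rmse(surveyed, on_map):.3f} m")
```

Because checkpoints are withheld from processing, this error estimate is independent of the GCPs used to georeference the map.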
Flight Type refers to the flight mission mode (manual or autonomous). In manual mode, a pilot controls the UAS during the flight. In autonomous mode, the mission is pre-programmed and can react to various types of events in a preset, direct way by means of dedicated sensors; this makes the flight predictable, with the remote pilot intervening only if necessary.
Flight Time is a measurement of the total time needed to complete a mission, from the first to the last image taken during a flight. Flight time can be used to characterize the wind impacts on flight performance of UAS.
Ground Control Points (GCPs) are user-defined, previously surveyed tie points within the mapping polygon used to indirectly georeference UAS images. Such points can be permanent or portable markers, with or without georeferenced data.
Masking is the procedure of excluding some part of the scene from image analysis. For instance, clouds, trees, bushes and their shadows should not be considered in further processing, such as in vegetation studies for the evaluation of crop vegetation indices.
Orthorectification is a process of linearly scaling the image pixel size to real-world distances. This is achieved by accounting for the impacts of camera perspective and relative height above the sensed object. The objective is the reprojection of the original image, which could be captured from oblique viewing angles looking at unlevelled terrain, into an image plane to generate a distortion-free photo.
Point Cloud is a collection of data points in three-dimensional space. Each point contains several measurements, including its coordinates along the X, Y, and Z-axes, and sometimes additional data such as a colour value, which is stored in RGB format, and a luminance value, which determines how bright the point is.
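As an illustration of the per-point colour attributes, a luminance value can be derived from the stored RGB triplet. The sketch below uses the Rec. 709 weighting, one common convention (an assumption here, since the text does not prescribe a formula):

```python
def luminance(r, g, b):
    """Relative luminance of an 8-bit RGB colour value, using the
    Rec. 709 weights (one common convention, assumed here)."""
    return 0.2126 * r + 0.7152 * g + 0.0722 * b

# A pure white point has maximum luminance; green dominates the weighting
print(luminance(255, 255, 255))
print(luminance(0, 128, 0))
```

In practice, point-cloud formats such as LAS store intensity or colour directly; deriving luminance is only needed when brightness must be computed from RGB.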
Radiometric Calibration is a process that allows the transformation of the intensities or digital numbers (DN) of multiple images in order to describe an area and detect relative changes of the landscape, removing anomalies due to atmospheric factors or illumination conditions.
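One widely used way to perform such a DN transformation is the empirical line method, which fits a linear DN-to-reflectance mapping from reference panels of known reflectance imaged in the scene. A minimal sketch with hypothetical panel values (the glossary entry does not prescribe this particular method):

```python
def empirical_line(dn_dark, refl_dark, dn_bright, refl_bright):
    """Fit a linear DN -> surface reflectance mapping from two
    reference panels of known reflectance (empirical line method).
    Returns a function converting any DN to reflectance."""
    gain = (refl_bright - refl_dark) / (dn_bright - dn_dark)
    offset = refl_dark - gain * dn_dark
    return lambda dn: gain * dn + offset

# Hypothetical dark (3%) and bright (56%) panels imaged in the same scene
to_reflectance = empirical_line(dn_dark=800, refl_dark=0.03,
                                dn_bright=3200, refl_bright=0.56)
print(round(to_reflectance(2000), 3))
```

Applying the same fitted line to every image of a flight removes much of the illumination-dependent variation, which is the prerequisite for comparing scenes acquired at different times.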
Structure from Motion (SfM) is the process of reconstructing a three-dimensional model from the projections derived from a series of images taken from different viewpoints. Camera orientation and scene geometry are reconstructed simultaneously through the automatic identification of matching features in multiple images.
Tie Point is a feature that can be clearly identified in the same location in two or more overlapping images or aerial photographs and selected as a reference point, and whose ground coordinates are not known a priori; those coordinates are computed during block triangulation. Tie points represent matches between key points detected on two (or more) different images and provide the link between images needed to obtain 3D relative positioning.
Precision is a description of random errors in the 2D/3D representations.
Quality Assessment is an estimation of the statistical geometric and radiometric errors of the final products, obtained using ground truth data.
3 UAS-based Outputs
2D Model is a bidimensional representation of the earth that uses two coordinates, X and Y.
3D Model is a mathematical or virtual representation of a three dimensional object.
2.5D Model (Pseudo 3D Model) is a three-dimensional representation in which each X, Y coordinate pair is associated with a single elevation value.
Digital Elevation Model (DEM) or Digital Height Model (DHM) is a gridded image describing the altitude of the earth’s surface, excluding all artificial and natural objects above the bare ground.
Digital Surface Model (DSM) is a gridded image describing the altitude of the earth’s surface, including all artificial and natural objects. For instance, the DSM provides information about the dimensions of buildings and forests.
Digital Terrain Model (DTM) is a vector or raster dataset consisting of a virtual representation of the land environment in the mapping polygon. In a DTM, the heights of the points refer to the bare ground.
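The relationship between these gridded products can be illustrated by differencing them: subtracting the bare-ground terrain model from the surface model cell by cell yields object heights (buildings, canopy), often called a normalised DSM. A toy sketch with hypothetical elevation grids:

```python
# Hypothetical 2x3 grids of elevations (metres): the DSM includes objects,
# the DTM describes the bare ground beneath them.
dsm = [[12.0, 15.5, 12.1],
       [12.2, 18.0, 12.3]]
dtm = [[12.0, 12.1, 12.1],
       [12.1, 12.2, 12.2]]

# Normalised DSM (nDSM): per-cell object height above bare ground
ndsm = [[s - t for s, t in zip(srow, trow)]
        for srow, trow in zip(dsm, dtm)]
print(max(max(row) for row in ndsm))  # tallest object in this toy grid
```

This simple difference is how canopy height models and building heights are commonly derived from UAS photogrammetric or LiDAR products.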
Orthophoto is an aerial or terrestrial photograph that has been geometrically corrected so that its scale is uniform and it can be used as a map. Since each pixel of the orthophoto has X and Y coordinates, it can be overlaid on other orthophotos and used to measure true distances between features within the photograph.
Orthomosaic is a high resolution image made by the combination of many orthophotos. It is a single, radiometrically corrected image that offers a photorealistic representation of an area that can produce surveyor-grade measurements of topography, infrastructure, and buildings.
Feature Identification is vector information computed from images using artificial-intelligence algorithms in order to identify objects (roads, buildings, bridges, etc.) automatically.
Point Cloud is a set of data points in space representing a three-dimensional object. Each point position has its set of Cartesian coordinates (X, Y, Z). It can be generated from overlapping images or LiDAR sensors.
Point Cloud Classification is the output of an algorithm that classifies the points of a cloud by computing a set of geometric and radiometric attributes.
Image Segmentation is a process that partitions an image into regions that are clearly distinguishable based on texture and colour.
Triangulated Irregular Network (TIN) is a pseudo three-dimensional representation obtained from the relations in a point cloud using triangles.
Vegetation Indices (VIs) are combinations of surface reflectance at two or more wavelengths designed to highlight a particular property of vegetation. VIs are designed to maximize sensitivity to the vegetation characteristics while minimizing confounding factors such as soil background reflectance, directional, or atmospheric effects. VIs can be found in the scientific literature under different forms such as NDVI, EVI, SAVI, etc.
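The most common of these indices, NDVI, combines surface reflectance in the near-infrared and red bands. A minimal sketch with hypothetical reflectance values:

```python
def ndvi(nir, red):
    """Normalised Difference Vegetation Index from NIR and red
    surface reflectance (both in the range 0-1)."""
    return (nir - red) / (nir + red)

# Hypothetical reflectances: healthy vegetation reflects strongly in NIR
# and absorbs red light, giving NDVI close to 1; bare soil sits near 0.
print(round(ndvi(nir=0.45, red=0.08), 2))
print(round(ndvi(nir=0.25, red=0.20), 2))
```

The normalisation by the sum of the two bands is what reduces sensitivity to overall illumination, one of the confounding factors mentioned above.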
Aerial photograph is an image taken from an air-borne (i.e., UAS) platform using a precision camera. From aerial photographs, it is possible to derive qualitative information of the depicted areas, such as land use/land cover, topographical forms, soil types, etc.
Terrestrial photograph is an image taken from the earth’s surface using a camera whose orientation is, in most cases, non-nadiral.
Research and innovation driving transformative change.
To become the world’s first climate-neutral continent by 2050, Europe needs to modernize its approach to engineering design and ensure an inclusive ecological transition.
Research and innovation will play a central role in accelerating and navigating the necessary transition to climate-neutral engineering.
This PhD School aims to promote the green transition among young researchers in the fields of civil, architectural and environmental engineering.
DICEA School series
This is the second event of a series of PhD Schools that our Department, DICEA, will organize annually in the framework of the Department of Excellence project funded by the Italian Ministry of University and Research.
PhD students in any field are invited to participate free of charge. Awards are reserved for PhD students in the Civil, Architectural and Environmental Engineering area.
Topic of the school
The PhD school will include a plenary session (yellow), focusing on the Ecological Transition, and four parallel thematic sessions (red) on Hydraulic, Transportation, Architectural and Geotechnical Engineering. A substantial effort will be devoted to applications (blue).
Unmanned Aerial Systems (UAS) play an increasingly important role in collecting data for environmental monitoring. The primary challenges for UAS in environmental studies include creating consistent, standardised guidelines for data collection and establishing practices that apply to a range of environments. Dr Salvatore Manfreda from the University of Naples Federico II, along with the HARMONIOUS team, identified critical steps in planning, acquiring, and processing UAS data to ensure best practices and a streamlined, effective workflow.
As drone technology has improved over the last decade, Unmanned Aerial Systems (UAS) have become a fundamental part of environmental monitoring, bridging the gap between traditional field studies and satellite remote sensing. UAS are an inexpensive way of acquiring visual data at high temporal frequency across the electromagnetic spectrum, making them an invaluable technology for monitoring dynamic environmental processes.
UAS can provide real-time aerial photography or video to map and monitor natural and artificial ecosystems, giving a unique insight into the environment. The versatility, adaptability, and flexibility of UAS make them an essential tool for environmental studies such as forestry planning, tracking glacier geomorphology and precision agriculture, to name but a few applications.
The continual improvements in UAS and sensor technologies, coupled with the variety of environmental settings in which they are deployed, have led to a diversity of methodologies in how data is collected, analysed, and processed. The inconsistencies in the UAS study designs have triggered multiple issues regarding the quality of the final imagery and data collected and have led to overblown budgets. These issues highlighted the necessity for a standardised protocol in UAS environmental mapping and monitoring to be developed.
Dr Manfreda from the University of Naples Federico II, with the international team of researchers of the HARMONIOUS COST Action, explored the primary issues in utilising UAS in environmental studies and produced guidance to improve planning, acquisition, and processing of data and the quality and reproducibility of research. They created a generalised workflow methodology with five interconnected steps:
study design;
pre-flight fieldwork;
flight mission;
processing of aerial data;
quality assurance.
UAS limitations
There are clear economic, temporal, and qualitative benefits in using UAS over satellites or manned aircraft, which are limited by their cost and how often a survey can use them. However, as UAS are still a maturing technology, limitations exist in how data are collected and analysed.
Previous studies have indicated that many UAS surveys fail to consider the planning and processing of UAS imagery. When the speed and height of the UAS and the calibration of the sensors are not considered at the planning stage, and the weather is not accounted for on the day of the flight, the resulting imagery may be blurred or of incorrect resolution.
These limitations could be mitigated through a structure of standardisation which can work as a checklist for UAS surveys to ensure accurate collection and analysis of data.
Standardising UAS data collection
Although every UAS survey will be slightly different owing to the wide variety of vegetation, topography, climate, and local legislation in study environments, a standardised workflow, which accounts for every stage of the survey and applies to every environment, will be incredibly beneficial in assuring appropriate planning for high-quality results.
Through creating a generalised workflow in five interrelated steps, HARMONIOUS’s research aims to improve the final quality of data and analysis. The workflow was designed based on harmonising multiple methods collated from recent research and reviews of different UAS surveys.
Workflow design
Every UAS study can vary greatly and therefore requires a bespoke study design to set out a detailed mission plan for the study area. Consequently, the initial step in the workflow process is to design the study; this step is essential to set up the parameters of the survey and consider the specifics of the environment and the research question, as this will shape where, how, and when the flight can take place and what sensors will be used.
When all factors are considered, the study design can be an incredibly complex problem. The final quality of the model is dependent on all of these interconnected factors being correctly accounted for.
In general, mission plans for environmental studies focus on four primary elements:
UAS regulations and legislation;
platform and sensor choice;
camera settings and UAS control software;
target geo-referencing.
Local UAS regulations and legislation will have to be understood first to ensure the mission will get permission to fly in the study area. The platform, sensor, camera settings, and UAS control software choices are purely dependent on the survey’s requirements and limitations – concerning budget and time limitations, or the image quality, spectral and spatial resolution, and the survey area’s size. Finally, in the study design, target geo-referencing must be conducted to ensure the imagery is taken correctly. The best way to do this is to find ground control points (GCP) for reference.
Once the study design is complete, the next step in the workflow is to conduct a pre-flight study. This section of the workflow entails reconnaissance and a terrestrial survey of the survey area. The area’s reconnaissance will reveal take-off and landing points, any possible visual or flight obstructions, and any GCPs for the flight to be geo-referenced. The field study will be highly dependent on the environmental medium being studied but will supplement and influence any data collected from the UAS study.
Following the pre-flight, the workflow explains how to safely and most effectively conduct the flight itself. The challenge at this stage is to account for the weather accurately. Wind speed, humidity, light levels, and fog can affect data quality, so it must be compensated for before the flight takes place.
The final stage in the workflow describes how to best process the imagery and data from the flight. When processing, it is essential for surveyors to account for the distortions often present in UAS imagery, which can misrepresent the radiometry and geometry of the study object. A series of processing steps quantifies these radiometric and geometric problems, each of which has a corrective method.
A critical aspect of HARMONIOUS’s method is that quality assurance must be evaluated at every step to guarantee a quality survey outcome. One approach, which can save time and money while ensuring quality images, is the use of a portable resolution test chart. When used correctly, these charts give assurance that cameras are calibrated correctly before the flight takes place.
A new standard practice
Recent advances in UAS have meant that low-cost and near real-time data collection has become possible in an array of environmental studies. With their essential work, Dr Manfreda and his fellow researchers have created a harmonised workflow and accompanying checklists that will be a vital element of any UAS survey in the future, furthering the efficacy of UAS and making them a more valuable tool in studying the environment.
The researchers have designed the workflow to reduce error in data collection and processing and ensure flights are conducted within budget, safely, and effectively. This research will undoubtedly improve future UAS studies and be a template by which all reviews can be guided, streamlining the study process and making results easily reproducible.
HARMONIOUS’s research assists in furthering UAS procedures and ensuring that UAS studies in the future will have more accurate results if they utilise the workflow checklists referenced in this article.
As new iterations of UAS technologies are developed, could the workflow process become more automated?
We are now focusing on the preparation of a book edited by Elsevier providing more detailed guidelines for UAS applications in environmental monitoring.
This Special Issue is dedicated to machine learning-based methods in:
• proximal and digital global mapping of soil properties (e.g., basic, hydraulic, thermal, functional, ecosystem services);
• computing systems/algorithms/approaches using Earth observation data to derive global gridded soil datasets;
• preprocessing Earth observation data to feed into global soil mapping;
• data-intensive computing methods for incorporating Earth observation data for predictive soil mapping;
• optimizing temporal resolution to globally track the changes of soil properties;
• uncertainty assessment of the derived gridded soil information;
• other related topics.
A 40% discount can be granted to papers received from this conference/project on the basis that the manuscript is accepted for publication following the peer review process.
Can you tell us how you started working on using UASs for environmental monitoring? What was your motivation, and what did you find the most interesting in this research field? What are the knowledge gaps and major challenges in this research field?
I have always been interested in spatial patterns of natural ecosystems. Nature is able to create an incredible diversity of elements that have been inspiring for all of us. The driving processes that produce such patterns are open questions stimulating many of my studies. In this context, UAS offers the opportunity to explore such patterns at a level of detail that was unimaginable a few years ago. Therefore, I envisaged the possibility to use this tool to tackle my research questions in the field of hydrological and ecohydrological science.
Can you share with us any current specific project, activity, or initiative that you are particularly excited about?
I’m particularly proud to be the Chair of the COST Action “Harmonization of UAS techniques for agricultural and natural ecosystems monitoring – HARMONIOUS”, which includes more than 100 scientists from 36 countries. The HARMONIOUS Action is one of the largest Actions funded by COST (https://www.cost.eu), focusing on the development of guidelines for the use of UAS in hydrological monitoring. Members of the HARMONIOUS Action are now focusing on the preparation of a book, published by Elsevier, providing more detailed guidelines for UAS applications in hydrology, which will be one of the main deliverables of the project.
More details about the project activities can be found on the web-page
What are some of the areas of research you’d like to see tackled over the next ten years?
UAS offers the opportunity of acquiring high-resolution data for monitoring environmental processes, bridging the gap between traditional field studies and satellite remote sensing [An important paper in this context is https://doi.org/10.3390/rs10040641]. Their versatility, adaptability, and flexibility may allow the implementation of new strategies to support the validation of satellite products, which are systematically adopted in a series of operational weather and hydrological models. This may help to develop an integrated global monitoring system of higher accuracy and precision.
Can you share with us your perspectives and experiences on how UAS remote sensing has changed the way the world addresses environmental monitoring and conservation agendas? What do you think is the role of remote sensing and geospatial information science in achieving a sustainable environment?
With the evolution of drone technologies over the last decade, UAS have become an inexpensive way of mapping environmental processes for forestry planning, landslide tracking, river monitoring and precision agriculture. Environmental agencies and civil protection authorities are increasingly adopting UAS photogrammetry, but a wealth of additional information may be retrieved by UAS (e.g., streamflow, morphological evolution, soil moisture, and vegetation status, among others). It is our responsibility to simplify the use of UAS and make their products accessible to anyone.
What are some of the biggest challenges you face (or have you faced) as a scientist in your field? Are there any common misconceptions about this area of research?
It is common to underestimate the complexity associated with the use of these tools. UAS require a wide range of competencies and knowledge, which should be embedded in clear protocols in order to transform the huge amount of acquired data into useful information. Therefore, one challenge is the standardization of the procedures adopted for UAS surveys across different operating configurations and environmental conditions. In this context, the members of the HARMONIOUS COST Action have published some preliminary studies to support this process [see the manuscript].
Finally, what are you most passionate about? What is your advice to students and young professionals who are pursuing research on UAS remote sensing and environmental protection, and nature conservation? Which areas in this research field remain understudied and should be considered for future research?
I believe that UAS remote sensing will evolve in the coming years, offering new monitoring opportunities. One of the main limitations we are encountering right now in the description of hydrological processes is the limited extent of UAS imagery. There is a pressing need to extend the limits of surveyed areas in order to allow intercomparison between UAS and satellite data. This may help to define downscaling procedures for the estimation of environmental variables at high resolution and over large scales. This will become possible with the use of long-range UAS or swarms of drones, which will be fundamental for future advances in remote sensing.