Relative reflectance value
-
This dataset contains UAV multispectral imagery collected during a field trial of the Uncrewed Aerial System (UAS) to be used for the TERN Drone project. The UAS platform is a DJI Matrice 300 RTK (M300) carrying two sensors: a Zenmuse P1 (35 mm) RGB mapping camera and a MicaSense RedEdge-MX Dual (10-band multispectral sensor). P1 imagery was georeferenced using the M300's onboard GNSS and the D-RTK 2 Mobile Station, and the P1 camera positions were post-processed using AUSPOS (https://www.ga.gov.au/scientific-topics/positioning-navigation/geodesy/auspos). The flight was conducted between 10:26 am and 10:47 am AEDT at a flying height of 80 m, with forward and side overlaps for the Zenmuse P1 set to 80%. The MicaSense RedEdge-MX Dual was triggered in timer mode (one capture per second).

The MicaSense multispectral sensor positions were interpolated from the P1 positions, after which a standard Agisoft Metashape workflow was used to generate this orthomosaic (5 cm resolution). Reflectance calibration was performed using captures of the MicaSense Calibration Panel taken before the flight. The orthomosaic raster contains relative reflectance (unitless) for the 10 bands (Coastal Blue, Blue, Green 531, Green, Red 650, Red, RedEdge 705, RedEdge, RedEdge 740, NIR). The cloud optimised GeoTIFF (COG) was created using the rio command-line interface. The coordinate reference system of the COG is EPSG:7855 (GDA2020 / MGA Zone 55).

In the raw data, the RedEdge-MX image file suffixes correspond to bands as follows: 1: Blue, 2: Green, 3: Red, 4: NIR, 5: Red Edge, 6: Coastal Blue, 7: Green 531, 8: Red 650, 9: RedEdge 705, 10: RedEdge 740. In the processed orthomosaic GeoTIFF, however, bands 1-10 are ordered by central wavelength (Coastal Blue, Blue, Green 531, Green, Red 650, Red, RedEdge 705, RedEdge, RedEdge 740, NIR).
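As a reading aid, the following minimal Python sketch shows how the wavelength-ordered bands of the COG could be read with rasterio. The file name orthomosaic_cog.tif is a placeholder, not the actual dataset path, and the snippet assumes rasterio is installed.

```python
# Minimal sketch: read the 10-band relative-reflectance COG with rasterio.
# "orthomosaic_cog.tif" is a placeholder path, not part of the dataset.
import rasterio

# Band order in the processed GeoTIFF follows central wavelength (see above).
BAND_NAMES = [
    "Coastal Blue", "Blue", "Green 531", "Green", "Red 650",
    "Red", "RedEdge 705", "RedEdge", "RedEdge 740", "NIR",
]

with rasterio.open("orthomosaic_cog.tif") as src:
    print(src.crs)   # expected: EPSG:7855 (GDA2020 / MGA Zone 55)
    print(src.res)   # expected: ~0.05 m pixel size
    for i, name in enumerate(BAND_NAMES, start=1):
        band = src.read(i, masked=True)  # relative reflectance, unitless
        print(f"Band {i:2d} ({name}): mean reflectance = {band.mean():.3f}")
```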
-
This dataset is a collection of drone RGB and multispectral imagery from plots across Australia (AusPlots, SuperSites, and Cal/Val sites to be established in the future). Standardised data collection and processing protocols are used to collect the drone imagery and to generate orthomosaics. The protocols developed in 2022 are based on the DJI Matrice 300 (M300) RTK drone platform. DJI Zenmuse P1 and MicaSense RedEdge-MX/Dual sensors are mounted on the M300 to capture RGB and multispectral imagery simultaneously. The data is georeferenced using the DJI D-RTK 2 base station and onboard GNSS RTK. In the processing workflow, the multispectral image positions (captured with navigation-grade accuracy) are interpolated using image timestamps and the RGB image coordinates. Dense point clouds and the fine-resolution smoothed RGB surface are used to generate co-registered RGB (1 cm/pixel) and multispectral (5 cm/pixel) orthomosaics. Mission-specific metadata for each plot is provided in the imagery/metadata folder. The Drone Data Collection and the RGB and Multispectral Imagery Processing protocols can be found at https://www.tern.org.au/field-survey-apps-and-protocols/.
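The position-interpolation step mentioned above can be illustrated with a short sketch. It assumes the P1 camera positions and the MicaSense capture times have been exported to simple CSV tables with numeric timestamps; the file and column names are illustrative and are not part of the published protocols.

```python
# Minimal sketch of interpolating multispectral image positions from the
# RTK-grade P1 (RGB) camera track, assuming numeric timestamps (e.g. seconds
# since epoch). File and column names are hypothetical.
import numpy as np
import pandas as pd

p1 = pd.read_csv("p1_positions.csv")        # columns: timestamp, easting, northing, altitude
ms = pd.read_csv("micasense_captures.csv")  # column: timestamp (one capture per second)

p1_t = p1["timestamp"].to_numpy(dtype=float)
ms_t = ms["timestamp"].to_numpy(dtype=float)

# Linearly interpolate each coordinate of the P1 track at the MicaSense
# capture times to replace the navigation-grade GNSS positions.
for col in ("easting", "northing", "altitude"):
    ms[col] = np.interp(ms_t, p1_t, p1[col].to_numpy(dtype=float))

ms.to_csv("micasense_positions_interpolated.csv", index=False)
```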