
    This dataset contains UAV multispectral imagery collected as part of a field trial of the Unmanned Aerial System (UAS) to be used for the TERN Drone project. The UAS platform is a DJI Matrice 300 RTK (M300) with two sensors: a Zenmuse P1 (35 mm) RGB mapping camera and a MicaSense RedEdge-MX (5-band multispectral sensor). P1 imagery was geo-referenced using the onboard GNSS of the M300 and the D-RTK 2 Mobile Station. P1 camera positions were post-processed using <a href="https://www.ga.gov.au/scientific-topics/positioning-navigation/geodesy/auspos">AUSPOS</a>. The flights took place between 14:58 and 03:08 at a height of 80 m with a flying speed of 5 m/s. Forward and side overlaps of photographs were set to 80%.

    MicaSense multispectral sensor positions were interpolated from the P1 positions, after which a standard Agisoft Metashape workflow was used to generate this orthomosaic (5 cm resolution). Reflectance calibration was performed using captures of the MicaSense Calibration Panel taken before the flight. The orthomosaic raster contains the relative reflectance (unitless) for the 5 bands (Blue, Green, Red, Red Edge, NIR). This cloud-optimised GeoTIFF (COG) was created using the rio command-line interface. The coordinate reference system of the COG is EPSG:7855 (GDA2020 / MGA zone 55).

    In the raw data, RedEdge-MX image file suffixes correspond to bands as follows: 1: Blue, 2: Green, 3: Red, 4: NIR, 5: Red Edge. In the processed orthomosaic GeoTIFF, however, the bands are ordered by wavelength (Blue, Green, Red, Red Edge, NIR).
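    The suffix-to-wavelength band reordering described above can be sketched in a few lines of Python. This is an illustrative example only, not part of the published processing workflow; the array names and the dummy stack are hypothetical.

    ```python
    import numpy as np

    # Raw RedEdge-MX file suffixes _1.._5 correspond to these bands:
    RAW_SUFFIX_ORDER = ["Blue", "Green", "Red", "NIR", "RedEdge"]
    # The processed orthomosaic stores bands in wavelength order:
    WAVELENGTH_ORDER = ["Blue", "Green", "Red", "RedEdge", "NIR"]

    # Index into the raw stack for each output band.
    reorder = [RAW_SUFFIX_ORDER.index(b) for b in WAVELENGTH_ORDER]

    # Hypothetical 5-band stack in raw suffix order, shape (band, row, col),
    # where each band's pixels equal its raw band index for demonstration.
    raw_stack = np.arange(5)[:, None, None] * np.ones((1, 2, 2))
    ortho_stack = raw_stack[reorder]  # now Blue, Green, Red, Red Edge, NIR
    ```

    The only bands that swap places are NIR and Red Edge, so `reorder` works out to `[0, 1, 2, 4, 3]`.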


    This dataset is a collection of drone RGB and multispectral imagery from plots across Australia (AusPlots, SuperSites, and Cal/Val sites to be established in the future). Standardised data collection and processing protocols are used to collect drone imagery and to generate orthomosaics. The protocols, developed in 2022, are based on the DJI Matrice 300 RTK (M300) drone platform. DJI Zenmuse P1 and MicaSense RedEdge-MX/Dual sensors are used with the M300 to capture RGB and multispectral imagery simultaneously. The data is georeferenced using the DJI D-RTK 2 base station and the onboard GNSS RTK. In the processing workflow, the multispectral image positions (captured with navigation-grade accuracy) are interpolated using image timestamps and the RGB image coordinates. Dense point clouds and the fine-resolution smoothed RGB surface are used to generate co-registered RGB (1 cm/pixel) and multispectral (5 cm/pixel) orthomosaics. Mission-specific metadata for each plot is provided in the imagery/metadata folder. The Drone Data Collection and the RGB and Multispectral Imagery Processing protocols can be found at <em>https://www.tern.org.au/field-survey-apps-and-protocols/</em>.
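    The timestamp-based position interpolation described above can be sketched as simple linear interpolation of each coordinate against the RTK-accurate RGB capture times. This is a minimal illustration, not the published workflow; all timestamps and coordinates below are made up, and real processing would also handle height and timing offsets between the sensors.

    ```python
    import numpy as np

    # Hypothetical RTK-accurate P1 (RGB) capture times (s) and positions (m,
    # e.g. MGA eastings/northings); real values come from the image metadata.
    rgb_t = np.array([0.0, 2.0, 4.0, 6.0])
    rgb_e = np.array([500000.0, 500010.0, 500020.0, 500030.0])
    rgb_n = np.array([5950000.0, 5950005.0, 5950010.0, 5950015.0])

    # Hypothetical MicaSense frame timestamps, which fall between RGB captures.
    ms_t = np.array([1.0, 3.0, 5.0])

    # Interpolate each coordinate along the timestamp axis.
    ms_e = np.interp(ms_t, rgb_t, rgb_e)
    ms_n = np.interp(ms_t, rgb_t, rgb_n)
    ```

    Here the drone moves at a constant speed, so the interpolated multispectral positions land midway between consecutive RGB positions.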