Hot Water

The above image is a thermal orthophoto mosaic captured by a FLIR TAU2 camera, which was housed in one of my Cessna camera pods and flown on a Cessna 172. Imagery was captured from around 4000 ft AMSL just on sundown, and comprises 4 flight lines and around 100 individual thermal frames.

The orthophoto mosaic is of the hot water outlet of a nearby power station, which discharges its cooling water into a small bay of Lake Macquarie on the NSW Central Coast. I chose this site as I have previously been involved with seagrass monitoring of this particular bay, using high resolution RGB aerial imagery. In these previous studies I worked closely with an environmental consultant who had placed temperature loggers in the bay, to monitor the water temperatures that the seagrasses were being exposed to over the summer period. Having temperature values in an Excel spreadsheet is one thing, but being able to actually see the thermal plume is another thing entirely. To date the only thermal imagery of the site available is satellite, but that data is extremely coarse with a pixel size of around 40 m, compared to the 0.7 m pixels we captured.

One other notable feature in the imagery is how visible the road network is. This is because the imagery was captured just after sundown on a hot day. The roads that had been baking in the sun all day are significantly hotter than the vegetated areas surrounding them and stand out in the imagery.



Thermal Mapping Pods

Over the last few months I have been playing with integrating a FLIR thermal camera into our portable mapping pods for light aircraft (Cessna 100 series) and helicopters. The main image for this article is from one of our trial flights: a thermal image of the town of Clermont in Central QLD, captured using one of my portable mapping pods carried by a Cessna 172. Imagery was captured at around 11pm to avoid artifacts caused by the sun. Our data set comprises around 6 flight lines and a few hundred images, processed in Agisoft PhotoScan. While it's only early days, I am really happy with the data coming out of the camera.

Cheers, and I hope everyone had a good Xmas and New Year.

Mapping Pods in Antarctica

Portable Mapping Pods in Antarctica!

One of my portable mapping systems has made it all the way to Antarctica. It will be used for various aerial survey projects around Davis Station during the short summer period. So far it's racked up two survey flights, attached to a lovely Squirrel B3 operated by Helicopter Resources from Tasmania. The main photo shows a technician fitting the pod to the helicopter before one of its survey flights.




Portable Mapping Pods on Helicopters

Recently I was contacted by a helicopter operator in Tasmania to see if our portable mapping pods could be adapted to fit a Squirrel B3 helicopter bound for Antarctica over the summer. Our camera pods were initially designed to fit to the landing gear leg of a Cessna 172, so the obvious problem was where to mount the thing. After a bit of head scratching it was decided that the best place to fit the system was the end of the skid. A small adapter bracket was engineered and the pod was fitted to this. The rest of the install was fairly painless, as the system relies on only one cable, which can be closed in the door, simplifying cable routing.

Never having flown a camera on a helicopter before, we had some initial concerns about the extra vibration from the helicopter rotors and how this might affect the imagery. While our camera pods do have a vibration isolation plate in them, this was designed around the expected engine vibration frequencies of a Cessna 172, not a helicopter, so we were unsure how it would perform. It turns out it performed fairly well, with no blur evident in the imagery.

Final inspection of system before test flight
Not a bad view from the office – Flight testing the mapping pod


In order to test the vibration and demonstrate the accuracy of the system, a test flight was in order. To minimise aircraft costs a survey area close to the airport was chosen. We wanted to give the system a reasonable test, so an area of around 30 sq km was selected. Imagery was captured over this area at a GSD of 8 cm, and a local surveyor collected 15 ground control points (GCPs) to test the system's spatial accuracy. The system performed well, and the resulting mosaic and digital surface model from the test flight can be seen below.


8 cm GSD Orthophoto and resulting digital surface model (DSM) from test flight. Area is approx 30 sq km
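Checking spatial accuracy against GCPs like this usually comes down to computing the RMSE of the residuals between each point's surveyed coordinates and the same point measured in the mosaic and DSM. A minimal sketch of that calculation, with invented residuals standing in for the 15 real check points:

```python
import math

# Hypothetical residuals (metres) between surveyed GCP positions and the
# same points measured in the orthophoto / DSM. Real values would come
# from the check points collected by the surveyor.
residuals = [
    (0.05, -0.03, 0.08), (-0.06, 0.04, -0.10), (0.02, 0.07, 0.05),
    (-0.04, -0.05, 0.09), (0.03, 0.02, -0.07),
]

def rmse(values):
    """Root-mean-square error of a list of residuals."""
    return math.sqrt(sum(v * v for v in values) / len(values))

rmse_x = rmse([r[0] for r in residuals])
rmse_y = rmse([r[1] for r in residuals])
rmse_z = rmse([r[2] for r in residuals])
rmse_horizontal = math.sqrt(rmse_x ** 2 + rmse_y ** 2)

print(f"RMSE X: {rmse_x:.3f} m, Y: {rmse_y:.3f} m, Z: {rmse_z:.3f} m")
print(f"Horizontal RMSE: {rmse_horizontal:.3f} m")
```

The horizontal and vertical RMSE figures are what you would quote back to a client as the system's demonstrated accuracy.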

Weed Mapping With a Drone

Recently a colleague and I used a DJI multi-rotor drone to capture high resolution RGB imagery over a small 20 ha project site in Sydney, with the aim of seeing if it was possible to use the imagery to map the species composition and volume of weeds within the site. As we did not have access to radiometric calibration targets, no atmospheric correction was performed on the raw imagery. A simple image based classification technique was then used to create a classified map from the high resolution RGB imagery. Overall the interpretation worked well, however it did struggle in some areas, particularly in trying to separate species of grasses, and manual re-coding of some sections of the image was needed. Volumes and the area coverage of weed species were then derived from the classified map. These were used to produce an estimate of the man hours and the volume of herbicide required to remove the weed species from the site.

Before people start jumping up and down about the project methodology, I would like to point out that yes, I know we ideally should have performed some type of atmospheric correction on the imagery and used a multispectral camera. However this was a simple test using only the equipment we had on hand, which was a Zenmuse RGB camera. I acknowledge that for this type of work a multispectral camera like the Parrot Sequoia or similar would have yielded more spectral information, and I hope to investigate one of these in the near future.

Why map weeds with a drone?

Normally I am not a big fan of doing something with a drone simply because you can. However in this case I think there are real potential benefits in using drones to help map weed species. There are two main benefits that drone surveys can provide: helping to provide an initial scope of works for a regeneration site, and then providing ongoing monitoring of that site. Currently the initial scoping study of a regeneration site is performed by an experienced bush regenerator visiting the site and estimating the scope of works manually, simply using experience as a guide. While this methodology works, it is highly dependent on the experience of the estimator and presents opportunities for errors in the estimation. In this study we wanted to see if we could improve this process by augmenting the site visit with an initial site survey carried out by a drone. From the imagery a 'classified map' can then be produced that shows the species composition of a site, and the area that each species of weed occupies on the site. The area information for each weed species can then be used to accurately predict the man hours required to manually remove that species, and also the amount of chemical herbicide needed to treat the weed. Having access to accurate site metrics is not only helpful to bush regeneration companies bidding on projects, but is also useful to land managers like local councils, community and not-for-profit groups.

Equipment used

For this study we flew a DJI Matrice 100 with a Zenmuse X5 camera and 15 mm lens at a height of 60 m AGL. The Matrice 100 is DJI's professional quadcopter and has an endurance of up to 30 minutes, enabling it to cover reasonable areas in one flight. As far as UAVs go it's reasonably affordable; I think our unit as flown was about $6K AUD worth of drone. Note: this was the older Matrice 100 version 1, not the newer version 2.
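For readers wondering what pixel size those flight parameters give, the GSD can be estimated from flying height, focal length and pixel pitch. A quick sketch, assuming approximate published Zenmuse X5 specs (Micro Four Thirds sensor, roughly 17.3 mm wide, 4608 pixels across), which may differ slightly from the actual camera:

```python
def ground_sample_distance(height_m, focal_mm, sensor_width_mm, image_width_px):
    """Ground size of one pixel (metres) for a nadir-pointing camera."""
    pixel_pitch_mm = sensor_width_mm / image_width_px
    return height_m * pixel_pitch_mm / focal_mm

gsd = ground_sample_distance(height_m=60, focal_mm=15,
                             sensor_width_mm=17.3, image_width_px=4608)
print(f"GSD at 60 m AGL: {gsd * 100:.1f} cm/pixel")  # roughly 1.5 cm
```

That works out to around 1.5 cm per pixel, which is plenty for picking out individual weed clumps.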


The flight plan and data capture were created with the DroneDeploy application, and were planned with 80/60 overlaps. As this was a small site of around 20 ha, the imagery was able to be captured in a single flight lasting around 15-20 minutes. Data was captured in JPEG format, and a GPS string was written into the image header from the L1 GPS in the drone. I am going to guess that the GPS in the Matrice 100 is similar in accuracy to that of the Inspire, which I tested in a previous post and found to have a vertical accuracy of around 3-5 m. An initial orthophoto was created using the GPS airstations only, producing an orthophoto with a spatial accuracy of 1-2 m. Using this image an initial image classification was run. Accuracy of this initial classification was assessed using visual interpretation methods. Believe it or not, visual interpretation of aerial photography is a dying art, and is still a very effective method of mapping vegetation communities. After running a few classifications and fine tuning the performance, we then ground truthed a number of sample sites. A ground truthing site is like ground control, but is used to both check and guide the performance of the image classification. To collect a ground truth point, the lat and long of each point is first sourced from our orthophoto. We then use a simple hand held GPS to navigate to that point and record the vegetation that occurs there. The vegetation actually recorded at each truthing site is then compared to the classified vegetation class for that site, and the accuracy assessed.
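The ground truthing step described above boils down to comparing, at each truth point, the vegetation recorded in the field against the class the image classification assigned there. A minimal sketch with made-up truth points (the species and results are illustrative, not our actual field data):

```python
# Each pair is (vegetation recorded in the field, class from the imagery).
truth_points = [
    ("Crofton Weed", "Crofton Weed"),
    ("Crofton Weed", "Crofton Weed"),
    ("Pampas Grass", "Pampas Grass"),
    ("African Love Grass", "Pampas Grass"),  # a mis-classification
    ("Crofton Weed", "Crofton Weed"),
]

def overall_accuracy(points):
    """Fraction of truth points where field and classified classes agree."""
    hits = sum(1 for field, classified in points if field == classified)
    return hits / len(points)

print(f"Overall accuracy: {overall_accuracy(truth_points):.0%}")  # 80%
```

In practice you would break this down per class (a confusion matrix) to see exactly which species the classifier is confusing, but the overall figure gives a quick health check.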

As stated above, imagery was captured with 80 percent forward lap and 60 percent side lap. High overlaps were flown for two reasons. Firstly, high forward and side lap protects against any gaps in the data caused by platform pitching and rolling, and against any camera firing / cycle issues. Secondly, flying high overlaps gives a true vertical perspective on the imagery, and serves to minimise any tree or vegetation lean. For this project a vertical perspective was paramount, as any tree lean could occlude smaller weeds like ground covers or small shrubs from view. As the saying goes, if you can't see it, then you can't map it. On the day of the flight the imagery was captured under a high overcast, limiting harsh shadows but leaving the imagery a little washed out and lacking contrast. When trying to classify pixels using their spectral characteristics, a high overcast is not ideal, especially if the overcast is not even. It does have some benefits however, namely that it reduces the amount and depth of shadows cast by tall vegetation, allowing better observation of the mid-storey and ground cover vegetation. Our survey site was a small clearing ringed by tall eucalypts on all sides, so had we flown under full sun these trees may have cast shadows across the survey area, potentially causing problems.
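For those curious how 80/60 overlaps translate into an actual flight plan, the overlaps set the spacing between photo centres and between flight lines as a fraction of the frame's ground footprint. A quick sketch (the footprint figures below are illustrative, not the exact X5 frame size at 60 m):

```python
def survey_spacing(footprint_along_m, footprint_across_m,
                   forward_lap, side_lap):
    """Distance between photo centres and between flight lines (metres)."""
    photo_spacing = footprint_along_m * (1 - forward_lap)
    line_spacing = footprint_across_m * (1 - side_lap)
    return photo_spacing, line_spacing

photo, line = survey_spacing(footprint_along_m=52, footprint_across_m=69,
                             forward_lap=0.80, side_lap=0.60)
print(f"Photo every {photo:.1f} m along track, lines {line:.1f} m apart")
```

Higher overlaps shrink both spacings, which is why 80/60 costs more flight time than the classic 60/30 but buys the redundancy and vertical perspective described above.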

About the site

Firstly a disclaimer: I am not a botanist and not fantastic with my plant identification. I do have some basic skills but relied solely on the regeneration professionals for plant ID, so if I don't sound like I know what I am talking about in this section, it's because I probably don't. From my limited knowledge I did manage to pick up that this is a fairly disturbed site, and does not have a lot of remnant native vegetation. It is mostly comprised of Crofton Weed and various grasses like African Love Grass and Pampas Grass. Both the Pampas Grass and the Crofton Weed stuck out like the dog's proverbial in the aerial imagery, which made them easy to map. Grasses were a different story and were hard to separate.


RGB mosaic of the project site, captured with DJI Matrice 100


So How Did We Go?

OK is the short answer, but there are lots of areas where we could improve in future surveys. We were able to successfully map and estimate the volume of the two main weed species on the site, Crofton Weed and Pampas Grass. However we were not able to map most of the various grass species and smaller ground covers, as we could not separate them in the classification. Using a multispectral or hyperspectral camera may give us the extra spectral firepower to separate out these species, but I am not confident of this.

Overall, for our application, I think the trial was a success. Our weed species map had enough information to allow our regenerators to estimate the volume of weed species on the site and formulate an estimate of both the man hours and the chemical needed to remove them.

Also, the spatial accuracy of our final map was not fantastic, being good to around 3-5 m real world. For our purposes this level of accuracy was fine, but different clients may require better. This would mean using an RTK enabled drone, or laying ground control prior to the flight, both of which would drive up the cost of the survey.

UAV Weed Map
Final classified weed map, showing the weed species and the expected amount of man hours and herbicide required for treatment.

Future Surveys

I think we could have improved our classification results by using the following gear:

  • Multispectral camera (Parrot Sequoia, Tetracam, MicaSense etc.)
  • Atmospheric calibration targets, or a camera with in-built radiometric correction like the Parrot
  • PPK or RTK GPS, to enable a spatially accurate map to be produced with little or no ground control

We used an image based classification workflow in this project to create our final weed map. While an image based classification produces good results, it does require expensive software and is labor intensive; to get to a final map as shown in this study, including photogrammetry, took about 2-2.5 days labor. A far more cost effective method may be a simple digitization workflow, where the high resolution orthophoto is opened in a GIS package like Global Mapper and an operator simply traces around the target weed species, creating a polygon. This is not quite as accurate as an image based classification, but it would be cheaper, possibly faster, and works with a standard RGB camera.
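Whichever workflow produces the classified map, deriving per-species area from it is straightforward: count the cells assigned to each class and multiply by the ground area of one pixel. A toy sketch (the grid, class codes and pixel size are illustrative only):

```python
# Toy classified grid: 0 = other, 1 = Crofton Weed, 2 = Pampas Grass.
classified = [
    [0, 1, 1, 0],
    [2, 2, 1, 0],
    [2, 2, 0, 0],
]

pixel_size_m = 0.015  # approx GSD of the X5 at 60 m AGL
pixel_area_m2 = pixel_size_m ** 2

def class_area(grid, code):
    """Total ground area (m2) covered by one class code."""
    count = sum(row.count(code) for row in grid)
    return count * pixel_area_m2

for code, name in [(1, "Crofton Weed"), (2, "Pampas Grass")]:
    print(f"{name}: {class_area(classified, code):.6f} m2")
```

On a real project this would run over the full classified raster in a GIS or with a raster library, but the arithmetic is the same.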

Future Applications

I can see real applications here for using high resolution imagery to aid in weed mapping in certain situations. It will only really work on sites that are reasonably open / disturbed, with no canopy cover to occlude the mid and ground cover vegetation. For sites with thick, low, scrubby vegetation that is hard to traverse on foot, like coastal dunes, I think this type of survey would be ideal, especially when combined with ongoing monitoring flights.

As with all things UAV, the million dollar question is: just because it can be done with a drone, is it the most cost effective way of producing the answer? In our particular case, is the extra cost involved with a UAV survey worth the extra information it provides? I think this can only really be answered on a site by site basis. For small sites, a simple site visit and walk around may be sufficient, and a drone survey won't likely add much value. For a larger, more complex site like a coastal dune with access issues, a drone survey may be the most efficient method and provide savings.

Cheers Erron



GNSS Inertial Solutions – Mapping with UAV’s


I have been looking around lately for a GPS+IMU solution that can be carried in our portable mapping pods to enable the direct geo-referencing of aerial imagery and minimise the use of ground control. Whilst checking out the Applanix web site I came across this really cool webinar on GNSS inertial solutions for UAV mapping. It's a really comprehensive look at this technology as applied to UAV mapping. As you would expect, being sponsored by Applanix they push their own GNSS solutions a bit, which is fair enough. Apart from GNSS they also talk about different types of airborne sensors, including a UAV hyperspectral system. It's packed full of info, and definitely worth a watch if you are new to the world of UAV remote sensing. At a bit over 1 hour 20 minutes it's a long one, so grab yourself a cuppa and a Tim Tam or 3, settle back and enjoy.



Ever wondered how Airborne LIDAR Works?

Whilst mucking about on the interweb, I came across this great video on how LIDAR works. It does a great job of explaining a complex subject, so enjoy!








Portable Mapping Pod Update

As I have alluded to in earlier posts, I have been working on a portable mapping pod system over in my day job at Aerial Acquisitions. Designing, building, testing and refining the pods and sensors has been a long and ongoing exercise, so it has been great to see the first versions of our systems taking flight around the country on various projects.

Mapping pod installed on our company 172 for early test flights around Sydney

One of these projects was a forestry project in the wilds of Tasmania, which is turning out to be a great test of the system and our mapping pod concept. Now, I have to admit to being a little nervous about sending a system to Tasmania, far away from technical support, on one of its first major hit outs. Aerial survey tends to have an amendment to Murphy's Law: if something can fail it will, and the likelihood of failure increases with distance from home base. So far Murphy has been kind and only thrown us two small issues, both easily resolved.

Why Tassie is a good test for the mapping pod

Tasmania is a difficult place to capture imagery, due to really variable weather and lots of cloud cover. As the saying goes, if you don’t like the weather in Tasmania just wait an hour! Constant cloud makes capturing imagery difficult, and you may only get 2-3 cloud free days per month to capture imagery. Often you may only get ½ day clear either in the morning or afternoon. So this presents some interesting problems for image capture. It’s not economically viable to have a survey aircraft and crew sitting on the ground for 28 days per month watching cloud drift past from the comfort of a bar stool at the local pub (been there!). But to take advantage of the short gaps in cloud you need a local aircraft on the spot that can be mobilised quickly. So enter the portable mapping pod strapped to a local aircraft, using local aircrews that know the weather.

Project Methodology

I thought I would briefly outline the methodology of our project in Tasmania, as it's fairly typical of how I see our pods operating once released into the wild.

Getting there – Mobilisation

A major design feature of our portable pods is that they had to be completely portable, so we designed them to fit into a small pelican case, roughly the size of a small suitcase, to make travel and shipping easy. All packed in its case ready to ship, the system weighs only around 8 kg, allowing it to be taken with you as luggage on a commercial flight, or sent via mail or courier. For this project my business partner and I opted to jump on a commercial flight and take the pod with us as luggage, as we wanted to train some local pilots on the system and oversee the initial install on a local aircraft. Now that we have a local aircraft and pilots lined up, for any future projects we will simply ship the pod down.

This is an early test version of one of our mapping pods, packed up ready for shipping. As well as the DSLR there is also a Tetracam MCA6 in the pelican case

Being able to jump on a commercial flight with the system offers large savings over having to ferry our company survey aircraft down. I estimated that ferrying our company Cessna 172 from Sydney to Tasmania would have taken around 15 hours return depending on winds, and cost around $8k AUD including pilot wages and expenses. In contrast, two of us were able to fly down on a burner for less than $1k. For future projects the savings are even greater, as I can simply ship the system to Tasmania for a few hundred dollars. If I were to take on a project in Perth or Darwin, the savings in mobilisation would be greater still.

Fitting to a Local aircraft

For this project we were able to find a local Cessna 172XP, which we were able to fit our pod to in around 40 minutes. The 172XP is a nifty aircraft: it has a bigger donk than the standard 172, a constant speed propeller, and fuel injection, giving slightly better cruise and climb performance than the standard 172. Normally the increased performance of the XP would not really matter, but for this project a number of our coupes need to be flown at close to 10,000 ft above sea level, so the increased climb performance will come in handy.

Mapping Pod being fitted – In this photo the step is being removed ready to fit the pod
Step that needs to be removed when fitting the pod

We found a local engineer who was able to oversee the fitting of the pod to the aircraft in around 40 minutes. Fitting the pod is really simple. Firstly, the co-pilot step is removed from the landing gear leg by removing 3 bolts. Our pod then simply fits in place of the step, and the 3 bolts are re-fitted. Lastly, a few cable ties are used to secure the single cable from the pod into the aircraft cabin. This single cable runs out to the pod from the flight management system in the cabin; it triggers the camera and records when the camera fires.

One of the reasons the system is so simple to install is that it does not require aircraft power to operate; our mapping pods were designed to run independently of aircraft power. Once you start running powered cables out along wing struts or landing gear legs, engineers get understandably nervous, and this starts to greatly complicate things.

Mapping pod installed and ready for survey. Note the clean installation, with only one small cable running out to the pod


Flight Management System (FMS)

Flight management system installed in C172 cockpit

Our systems are designed to be easily operated by a single pilot, with no camera operator / navigator required. After hours and hours of flight testing we have fine-tuned our cockpit set-up into a simple system that works. It comprises a small touch screen tablet that runs the flight management software, mounted in the eye line of the pilot. Tablet mounting is flexible, to allow individual pilots to set the system up to their own preference and allow easy access.

Once airborne the FMS acts like a big video game and gives the pilot guidance to fly the project. It also fires the camera at pre-planned photo locations, and records a GPS event for each photo. Because the camera is buried in the pod away from the pilot’s sight, the FMS also gives a confirmation on the tablet screen that the camera has fired. This saves the rather expensive exercise of the pilot flying around for hours with a camera that’s not working.

Flight management system in action
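The "fires the camera at pre-planned photo locations" part of the FMS is worth a quick illustration: before take-off the software lays out evenly spaced exposure points along each flight line. A minimal sketch of that pre-planning step, using a toy local coordinate grid in metres rather than real survey data (this is my own simplification, not our actual FMS code):

```python
def trigger_points(start, end, photo_spacing_m):
    """Evenly spaced photo centres from start to end of a flight line."""
    dx, dy = end[0] - start[0], end[1] - start[1]
    length = (dx ** 2 + dy ** 2) ** 0.5
    n = int(length // photo_spacing_m) + 1  # exposures that fit on the line
    return [(start[0] + dx * i * photo_spacing_m / length,
             start[1] + dy * i * photo_spacing_m / length)
            for i in range(n)]

points = trigger_points(start=(0, 0), end=(1000, 0), photo_spacing_m=250)
print(points)  # five exposures along a 1 km line
```

In flight, the system watches the GPS position and fires the camera as each of these points is crossed, logging an event for every exposure.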

We were able to train our local pilot, who had never flown survey before, on our system in around an hour. After flying a number of test runs over the airport, he was confident enough to head off into the wild blue yonder and have a go at some real projects. He has now flown over 30 small forestry areas without issue, and has commented on how comfortable he feels with the system.

Processing Data

Once the aircraft lands, the data card is removed from the Nikon D800 camera and the data uploaded to me via the interweb for processing in Agisoft PhotoScan. For our Tasmanian project the client only requires simple orthophotos to update their GIS, and our pods are proving ideal for this application.

Having been around in the good old days of aerial survey, when we still used 250 ft long rolls of film, it still amazes me how seamless the whole process is now. When I started in aerial photography, turning data around quickly meant processing the film myself in a small tin shed on the airfield after landing. Once the film was processed I would then rush the still-wet film to the office so they could start on the photogrammetry overnight. This wasn't too bad in winter, but that tin shed was not much fun in 36°C summer heat!

Section of a forestry orthophoto created with a portable mapping pod

Wrap up

I have to say I think it’s pretty cool that I can be remotely managing an active aerial survey project from a different state without leaving my office. Using our portable mapping pods and a local aircraft and crew to capture the data is proving to be very effective. There is still some fine tuning to be done on transferring data but overall it’s working really well. Next step is to integrate in some more sensor options, with multispectral and thermal coming soon.

What’s really exciting for me is that using portable mapping pods carried by local aircraft has the potential to slash data capture costs over traditional methods, delivering data at bargain basement prices. I see real potential here for our portable mapping pods to be used for a range of projects not just commercial mapping work. I would love to see some of these systems deployed on environmental projects, rapid response events, and humanitarian projects.





Holiday Break

Apologies for the lack of posts over the last few weeks; I have been on a family holiday over the Christmas and New Year break. I spent some laptop and phone free time soaking my pasty backside in the ocean down on the beautiful NSW South Coast. I hope everyone had a merry Christmas and a happy New Year. I aim to crank out some more posts in the next week or two, so stay tuned…



First Images out of WorldView-4

DigitalGlobe has released the first images from its WorldView-4 satellite. WorldView-4's first image was captured on November 26 and features the Yoyogi National Gymnasium, as seen above.

WorldView-4 doing its thing

Here are some quick stats on the WorldView-4 bird. It's claimed to have a circular error 90% (CE90) of 3 m, which is impressive when you consider the image was taken from an orbit 617 km above the earth's surface. Even more impressive is the image resolution of 30 cm GSD, which is getting into manned aircraft territory. Before you get too excited, this is for the PAN (black and white) band only. If you want standard colour imagery (RGB) then the resolution drops to 1.2 m GSD, which for a satellite is still rather good. There is also always the possibility of producing 30 cm colour imagery through pan sharpening, but to me this always looks a little unnatural, with weird halos and colour tones. If you are wondering what pan sharpened imagery looks like, Google Earth uses a lot of it outside of capital cities, especially in Australia.
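For anyone curious what pan sharpening actually does, one common family of methods (Brovey-style) rescales the colour bands so their brightness matches the higher resolution panchromatic band. A single-pixel sketch of the idea, with made-up values; this is just one well-known technique, not necessarily what DigitalGlobe uses, and it hints at why the halos and colour shifts appear:

```python
def brovey_sharpen(r, g, b, pan):
    """Scale RGB so their mean intensity matches the pan value."""
    intensity = (r + g + b) / 3.0
    scale = pan / intensity if intensity else 0.0
    return r * scale, g * scale, b * scale

# Low-res colour pixel (60, 90, 30) fused with a brighter 80 pan value:
print(brovey_sharpen(60, 90, 30, pan=80))
```

Because each colour band is stretched by the same ratio, small mismatches between the pan and colour bands (different spectral responses, slight misregistration) show up as the unnatural tones mentioned above.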

WorldView-4 will collect over 680,000 sq km of data every day, and has a revisit time of around 4.5 days. Revisit time may be an issue if you live somewhere with only a few cloud free days per month.

Lastly, like most modern satellites, WorldView-4 can collect data in stereo, meaning that DSMs and DEMs can be created from the system.