Well, we’re between coup attempts and attempts to throw out a seditionist president, so I’m going to take the opportunity to describe this research that was recently accepted in @SERestoration for the grassland ecology SF.

Led by @Blackburn_RC and in collaboration with @barberecology and others not on the Twitters, this work was from Ryan’s MSc research @NIULive. I say research and not thesis, dear reader, bc it didn’t make it into his thesis.
Truth be told, this was an epic battle of computer vs. researcher, and many times it felt like the computer won. I encourage students in my lab to pick a risky part and a less risky part of their research. Drone imagery was Ryan's risky part.
As sometimes happens w the risky part, this one didn't go in the thesis, but Ryan persevered to get it published nonetheless.
More often than not, in fact, it felt like the drone work went like this:
Ask @Blackburn_RC for the tale of the battle because today, I’m just going to stick to the findings. We wanted to see whether drone imagery could accurately predict the fine-scale ecological data we were gathering @nachusa, a tallgrass prairie #restoration.
We used a fixed-wing @Parrot DiscoProAg drone paired w a Sequoia sensor, which collects green, red, red-edge, and near-infrared wavelength bands.

Parrot provided this drone to us via a climate change grant (thanks, Parrot!).

Here is how takeoffs are supposed to go, btw.
Ryan flew this puppy over all of Nachusa (how many flights, Ryan?), and we compared indices derived from the sensor data (e.g. NDVI and many others) to the plant/soil data we'd collected at the local scale.
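(Quick aside for the curious: NDVI is just arithmetic on two of those bands. Here's a minimal sketch, with made-up arrays standing in for real Sequoia rasters.)

```python
import numpy as np

# Made-up reflectance rasters standing in for the Sequoia's red and
# near-infrared bands (2-D arrays scaled to 0-1).
red = np.random.rand(100, 100)
nir = np.random.rand(100, 100)

# NDVI = (NIR - Red) / (NIR + Red); values run from -1 to 1, with
# higher values indicating denser green vegetation.
ndvi = (nir - red) / (nir + red + 1e-9)  # tiny epsilon avoids divide-by-zero
```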
Our broad question was: Could we use indices collected by drone imagery to predict soil nutrients, plant biomass, plant composition or functional traits?
A lot of research has shown that in monocrops, drone imagery can be used to approximate leaf N content, biomass, etc. But what about in one of the most diverse grasslands?
Of the 66 responses we measured (b/c we looked at the max/min/mean/variance of various responses, response numbers blew up quickly), four could actually be predicted by the models we built from our training sets.
Specifically, mean graminoid (grass-like stuff) cover, dead aboveground biomass, dry biomass, and mean soil K were predicted with multispectral drone imagery! This is cool bc multispectral sensors are cost-effective compared to hyperspectral sensors, which are super expensive.
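(Side note on where 66 responses came from: a toy sketch of how per-site summary stats multiply one measured variable into several responses. The column names here are made up, not our actual field variables.)

```python
import pandas as pd

# Toy plot-level measurements; hypothetical stand-ins for the real data.
df = pd.DataFrame({
    "site": ["A", "A", "B", "B"],
    "graminoid_cover": [0.42, 0.55, 0.31, 0.64],
    "soil_K": [118.0, 97.0, 142.0, 110.0],
})

# One raw variable becomes four responses once you take per-site
# summaries, which is how response counts accumulate quickly.
summaries = df.groupby("site").agg(["min", "max", "mean", "var"])
print(summaries)
```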
Another note: bc we chose which sites to use for the test data randomly, our only remnant site in the study and a woodier site ended up in the test data set.
That meant we were using drone indices from restoration sites to try to predict soil, plant, and biomass dynamics of ecologically pretty different sites.
What’s that mean? I think that if we had better-paired sites ecologically, perhaps even more biodiversity indices could have been predicted, even in the super heterogeneous landscape pictured.
So, take home: Drones w relatively cheap multispectral sensors might be a good tool to predict landscape-scale changes in ecological data that are too time- and cost-intensive to measure over such large scales.
Another take home: Drone image processing, stitching, etc. can be a huge pain in the butt.
Lastly, we used Ridge Regression to analyze the tons of independent variables we had (24 indices and all their mins/maxes, variances, means, medians, etc., more than the dependent variables). That tool is underutilized in drone studies, but we found it effective.
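(For the curious, here's a minimal sketch of that kind of ridge fit using scikit-learn. The data shapes, variable names, and penalty grid are my assumptions for illustration, not the paper's actual pipeline.)

```python
import numpy as np
from sklearn.linear_model import RidgeCV
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler

# Hypothetical design matrix: rows = field plots, columns = the many
# index summaries (mins/maxes/means/variances of NDVI and friends).
rng = np.random.default_rng(0)
X = rng.random((40, 96))   # e.g., 24 indices x 4 summary stats
y = rng.random(40)         # e.g., mean graminoid cover per plot

# Ridge shrinks coefficients toward zero (L2 penalty), which keeps the
# fit stable when predictors are numerous and highly correlated.
# RidgeCV picks the penalty strength by cross-validation.
model = make_pipeline(
    StandardScaler(),
    RidgeCV(alphas=np.logspace(-3, 3, 13)),
)
model.fit(X, y)
print(model.score(X, y))   # R^2 on the training data
```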
So...any of y'all looking for the riskier sides of projects, give drone imagery a try...it will give you endless data, punctuated with a few headaches, but also lots of fun new stats and questions to ask. /end
And...just in case you all missed it, in typical Ryan fashion, he did better than me at conveying the drone launch fail: https://t.co/gIC5gLJ4vn
