Creating a Mapping Mission

Imagery captured by Unmanned Aerial Vehicles (UAVs) can be stitched into orthomosaics that help us better understand ecosystems. To plan the flight path of a UAV for orthomosaic creation, we have developed a plugin compatible with QGIS.

UAV Mapping Path Generator (for Litchi)

Our UAV Mapping Path Generator plugin takes into account camera parameters, altitude, overlaps, and more to prepare an optimized flight plan. This flight plan is compatible with the FlyLitchi app, which operates flights for DJI drones.

The steps below describe how to install the plugin, the logic behind it, and how to use it. To install the plugin:

  1. Go to Plugins > Manage and Install Plugins.

  2. Search for UAV Mapping Path Generator (for Litchi) and install it.

Fig. 01a: Install the plugin.

Fig. 01b: Install the plugin.

The plugin will be available for use from the ‘Vector’ menu.

Fig. 02: UAV Path Generator Plugin. 

SETTING UP THE PLUGIN

Some default drone camera specifications are pre-filled in the plugin. However, you can adjust these settings by accessing the drone specifications for your specific model here.

Mandatory Inputs for the Plugin:

  • Altitude

  • Field of View (FoV)

  • Side Overlap %

  • Front Overlap %

  • Aspect Ratio

  • Image Height & Width

  • Speed

  • Gimbal Pitch Angle

These drone camera parameters are readily available in the drone specifications. The overlap percentages you choose will influence both the area covered and the time taken for the drone to complete the flight. The plugin calculates and displays the approximate flight time at the end of its execution (see Fig. 05).
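To make the geometry concrete, here is a minimal Python sketch of how these inputs interact. This is our own illustration with example values, not the plugin's actual code, and it assumes the image width lies across the flight direction:

```python
import math

# Example values; substitute the specifications of your own drone camera.
altitude = 80.0        # flight altitude above ground, in metres
fov_w = 62.0           # field of view across the flight direction, in degrees
fov_l = 38.0           # field of view along the flight direction, in degrees
side_overlap = 0.70    # 70% overlap between adjacent flight lines
front_overlap = 0.80   # 80% overlap between consecutive photos

# Ground footprint of a single image, from the altitude and the FoV half-angles
footprint_w = 2 * altitude * math.tan(math.radians(fov_w) / 2)
footprint_l = 2 * altitude * math.tan(math.radians(fov_l) / 2)

# Higher overlap percentages shrink the spacing, so more flight lines and
# more photos are needed, and the flight takes longer.
line_spacing = footprint_w * (1 - side_overlap)    # between parallel flight lines
photo_spacing = footprint_l * (1 - front_overlap)  # between photo waypoints

print(f"Footprint: {footprint_w:.1f} m x {footprint_l:.1f} m")
print(f"Line spacing: {line_spacing:.1f} m; photo spacing: {photo_spacing:.1f} m")
```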

Note: The flight time shown in FlyLitchi is slightly longer than that calculated here.

The parameters and the input types are as described below.

Fig. 03: Parameters and input types.

Fig. 04: Calculated parameters and formulae.

How to use the Plugin:

First, the user has to upload a shapefile of the area of interest. This shapefile must be of polygon geometry type and in the EPSG:4326 geographic coordinate system. Users are advised to select a polygon and fill in this input on every execution, and to click the ‘Load’ option after filling it in. After selecting the AOI, the user must draw a line - ‘Input_line.shp’:

Notes on the line:

  • This line must be outside the AOI polygon.

  • It should be in a direction parallel to the desired drone path.

  • The drone path will be drawn to the right/above this line.

  • This line is plotted in the EPSG:4326 geographic coordinate system.

As soon as you click the ‘Draw Line’ option, toggle editing is enabled and you will be prompted to draw a line in the required direction. To end this action, save the edit and untoggle the edit mode; the plugin dialog box will then pop back up in the map interface with the draw line box populated.

Next, the user should input an approximate number of lines parallel to the input line that would cover the entire area of interest. When the user clicks the ‘Draw Flight Path’ option, the parallel lines are drawn and clipped to the area of interest, and the waypoints are then created according to the overlap percentages, as shown in Fig. 06.

The flight time is also calculated as soon as ‘Draw Flight Path’ is clicked. Fig. 05 shows a flight that would take 60 minutes to complete. In such a case, the user must split the area across multiple flights, since one drone battery gives about 20 minutes of effective flight time.

Fig. 05: Flight time calculated.
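As a rough cross-check of that estimate, flight time is essentially the total path length divided by the speed. Below is a hedged back-of-the-envelope sketch; the turn-padding multiplier is our own assumption, not the plugin's formula:

```python
import math

def flights_needed(total_path_m, speed_ms, battery_min=20.0, turn_padding=1.15):
    """Estimate flight time and the number of ~20-minute flights required.

    turn_padding is an assumed multiplier for slowing down at waypoints; the
    plugin's estimate may differ, and FlyLitchi's will be slightly longer still.
    """
    flight_min = (total_path_m / speed_ms) * turn_padding / 60.0
    return flight_min, math.ceil(flight_min / battery_min)

minutes, flights = flights_needed(total_path_m=15000, speed_ms=5.0)
print(f"~{minutes:.0f} min of flying -> split the AOI across {flights} flights")
```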

Example of input line and the drone path:

(a) Input horizontal line; (b) drone path with horizontal input line at the bottom;

(c) input vertical line; (d) drone path with vertical input line on the left.

Fig. 06: Input for a drone path.

Fig. 07: The flight path in FMH.

A FlyLitchi-compatible CSV file is then created, which can be opened in the FlyLitchi Mission Hub (FMH). Make sure the waypoints in the CSV file show up in the map view.

From this point, one can use the FlyLitchi app to check settings and execute the mapping mission. For more instructions on using the FlyLitchi app, please refer to our other blog on planning photo and video missions for UAVs.

Below is a visual walk-through of using the above plugin.

Planning Photo and Video Missions for UAVs

When conducting a drone flight or mapping mission over a precise area of interest, one needs to plan the flight path beforehand. This ensures that the entire area is covered and removes visual errors or barriers that might occur during fieldwork. Further, fixing flight settings in advance allows for more consistent data collection compared to manual flights on the go.

We use a combination of two software packages for planning flights. The first is the FlyLitchi Mission Hub (FMH) and the second is QGIS. The two can also be used in conjunction with one another. The process of using both is described in this blog.

Using FlyLitchi:

FMH is a web app that enables us to plan drone flights from the desktop and later execute them using the Android-based Litchi app (Litchi for DJI drones), which is available for a one-time purchase on the Google Play Store.

The FlyLitchi Mission Hub interface displays many flights that are publicly available for viewing, as shown in Fig. 01. We can also share our own flight paths and drone videos through this application.

Fig. 01: FlyLitchi mission hub interface.

Login and Register: 

FMH is free to use, but one needs to register in order to create, upload and save flight missions. To do this, click the ‘Log in’ button at the top right of the interface and create your account.

Set your area of interest:

a. Search for your Area Of Interest (AOI) in the search bar which is at the top left of the interface.

b. Click around the AOI to create way points to set the route of the mission.

Fig. 02: Way Point settings


c. This mission can be stored by clicking on the MISSIONS option (bottom left) and then clicking Save.

Way Point Settings:

I. Lat-Long: The location of the point is automatically logged as soon as you create a point.

II. Altitude (m): One needs to set the altitude of the drone above ground for all the points. In India, the legal limit as per the Directorate General of Civil Aviation (DGCA) is 120m. 

There is an ‘Above Current’ option under ‘Altitude settings’, available when you are batch editing the way point attributes. It allows you to add a specific altitude to the current altitude of the selected waypoints, a convenient way to elevate all waypoints by a set distance without adjusting each one separately (see the sketch after this list for what this looks like in a mission file).

III. Speed (m/s): One can set the speed depending on the purpose of the mission. Max speed is 15m/s.

In a video mission like this, the speed can be set between 10 m/s and 15 m/s.

Pro tip: If you’re designing a mission to create an orthomosaic, keep the maximum speed below 8m/s to avoid distortion.

IV. Curve (m): If the curve is 0 m, the drone will fly exactly to each point and then head to the next. If the purpose of the mission does not require reaching the exact location of a point (especially at turning points), a curve greater than 0 m makes the drone follow a curved path (blue line in Fig. 03) near that point.

Fig. 03: Curve at the turns 

V. Heading: This is the direction the drone points, set by a turn-right/turn-left rotation of the drone at a point location. It ranges from 0 to 360° and is marked by a triangle-like marker on each way point.

VI. POI: One can set one or multiple Points of Interest (POI) in FMH. The drone automatically points to the POI if set to do so. Here you can select which POI the drone should focus on if there are multiple POIs.

VII. Gimbal Pitch: This is the angle defining the up-down tilt of the camera. One can choose between +30° and -90°, where -90° points straight down.

If you have chosen to focus on a POI, the drone automatically recalibrates its gimbal pitch to keep the POI in frame throughout its mission. Alternatively, by picking the ‘Interpolate’ option, you can set a constant pitch angle that is maintained consistently through the mission.

VIII. Intervals and actions: One can add actions at each waypoint, such as Start/Stop recording, Take a picture, or Stay/Hover. This option allows you to set an action and, where applicable, its duration.

Fig. 04: Way Point settings and Mission settings.
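As an aside on the ‘Above Current’ option from point II: because missions can also be exported as CSV, the same batch offset can be applied outside FMH. The sketch below is hypothetical; the column name 'altitude(m)' is an assumption based on Litchi's CSV export format, so verify it against your own file:

```python
import pandas as pd

# Hypothetical file names; "altitude(m)" is assumed from Litchi's CSV format.
mission = pd.read_csv("mission.csv")
mission["altitude(m)"] += 10.0  # raise every waypoint by 10 m, like "Above Current"
mission.to_csv("mission_raised.csv", index=False)
```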

Mission settings: 

I. Units: One can choose the unit of measurement here.

II. Map Type: One can choose the base map upon which the flight missions appear. 

III. Heading Mode: This sets the direction the drone faces.

A. Auto: The drone faces the next way point.

B. Initial: The heading of the first way point is used for all way points.

C. Manual: The heading is controlled manually during the flight.

D. Custom: You can set the heading for each way point in FlyLitchi.

IV. Finish Action: At the end of the automated flight, the drone performs this action.

A. RTH: Return to home - flies back to the drone launch point.

B. Land: It will land at the end of the mission.

C. Back to Start: The drone returns to the start of the mission.  

D. Reverse: The drone returns along the same path as the mission.

V. Path Mode: This overrides the curve setting in the way point settings. If path mode is set to ‘Straight Lines’, the flight path cannot have curved turns. If ‘Curved Turns’ is selected, one can set the curves at the points.

VI. Cruising Speed and Max Flight Speed (m/s): The speed can be set here. 

VII. Default Curve Size (%): If path mode = ‘Curved turns’, then the new points that you add to the flight path will have the default curve size % that you specify. Default curve size will not work if the Path mode is set to ‘Straight Lines’.

VIII. Default Gimbal Pitch Mode: Same as in the way point settings, but the mission-level setting overrides it.

Note: Mission settings override the way point settings.


Using QGIS to aid flight planning:

The boundary of the video mission can also be created in QGIS. This method is useful when the area of interest is not easily distinguishable in a satellite image. For example, if one owns a parcel of land, its boundaries need to be incorporated precisely and cannot be determined accurately from an image alone.

Here one has two options:

a. Load a boundary file of the AOI that you may already have.

b. Create a shapefile layer from the Layer menu.

If you already have a shapefile for your area of interest, load it into QGIS and export it as a .kml file. This is because FMH only processes .kml and .csv files at present. The process for importing the .kml into FMH is described towards the end of this blog.
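If you prefer to script this export, a minimal PyQGIS sketch (run from the QGIS Python console) would look like the following; the layer name and output path are placeholders:

```python
from qgis.core import (QgsCoordinateTransformContext, QgsProject,
                       QgsVectorFileWriter)

# Placeholder layer name; use your AOI layer's name as it appears in QGIS.
layer = QgsProject.instance().mapLayersByName("AOI_boundary")[0]

options = QgsVectorFileWriter.SaveVectorOptions()
options.driverName = "KML"  # FMH only accepts .kml and .csv files

# Placeholder output path; KML files are always written in EPSG:4326.
QgsVectorFileWriter.writeAsVectorFormatV3(
    layer, "/path/to/aoi_boundary.kml",
    QgsCoordinateTransformContext(), options,
)
```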

If you don’t already have a shapefile, you can create one using the following steps:

a. Go to Layer > Create Layer and select New Shapefile Layer.

Fig. 05: Creating a shapefile in QGIS.

b. Choose the name, source folder and projection to save your file.

Fig. 06: Creating a line shapefile.

c. Now right-click on the layer and select Toggle Editing. Then click on the Add Line Feature button. You can now create a feature by digitising the boundary.

Fig. 07: Digitising a line shapefile.

d. Export this line file to a .kml file. 

e. Import the .kml file to FMH using the ‘import’ option from the MISSIONS tab on the bottom left of the screen. 

Fig. 08: Import kml to FMH

f. Check and edit settings of each way point as mentioned in the previous section. 

Thus, FlyLitchi and QGIS can be used together to create video missions. The planned flights saved in FMH can then be executed using the Litchi for DJI drones Android app. The videos from the drone are recorded as .mp4 files.

The next step in our workflow would be processing these videos to sync them with telemetry data. More information on this process can be found on our blog on using Dashware.

The Practical Nuances of Calculating Field of View

In this blog, we explore single-image photogrammetry – which is the extraction of information from a single image – including measurements and creation of 3D models. As we delve into this topic, we are learning a lot about the practical nuances of applying these techniques to solve real-world problems. We are currently working on calculating the real-life size of a Ganges river dolphin from the drone footage we acquired from our fieldwork in Bihar earlier this year. This blog continues from our previous post on measuring object size using a nadir image. While that post covered theoretical formulas, here we focus on the practical application of those formulas.

A single frame extracted from drone footage of a Ganges river dolphin surfacing, with zoomed inset of the same.

To get the real-life size of our dolphin, we need to know the number of pixels the dolphin occupies in an extracted frame (the image), along with the size of each pixel in real-life units (e.g., metres or centimetres). We then multiply the number of pixels by the real-life size of each pixel to get estimates of the area covered, breadth, and length. This real-life size that each pixel corresponds to is known as the Ground Sampling Distance, or GSD. The GSD is an extremely useful metric for any georeferenced image, since it represents the distance between two consecutive pixel centres measured on the ground, or equivalently, the distance one side of a pixel represents.

This exploration is driven by the absence of a reference object of a known size in any of our drone footage acquired during our fieldwork from earlier this year. It's challenging to have a reference object consistently in the frame since the drone moves with the dolphin sightings, which are sporadic and spaced out. If there had been a reference object, we could have used the ratio between the object's pixel count and its known size to establish a scale. This scale would then be used to determine the real-life size of a dolphin based on its pixel count.

Theoretically, measuring the GSD of a nadir image is straightforward. One of the formulas we can use to figure out the GSD is as follows, taken from our aforementioned blog:

GSD = A_L / I_L

where 

A_L is the real-life length of the area being captured in an image, in metres

I_L is the number of pixels that make up the length of that image

To calculate A_L we can use the following formula:

A_L = 2 * H * tan(FOV_L / 2)

where

H is the altitude of the drone, in metres

FOV_L is the angular field of view along the length axis of the image


I_L and H are parameters that are easy to obtain. While I_L can be found by simply counting the number of pixels along the length of the image, H is captured by the drone itself as metadata. Finding the FOV_L for our drones, though, took quite some time!
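Putting the two formulas together, a short Python sketch (with made-up example numbers) shows how an object's real-life size falls out once FOV_L is known:

```python
import math

H = 50.0      # drone altitude, in metres (read from the drone's metadata)
FOV_L = 40.0  # field of view along the image's length axis, in degrees
I_L = 2160    # number of pixels along the length of the extracted frame

A_L = 2 * H * math.tan(math.radians(FOV_L) / 2)  # real-life length of the frame
GSD = A_L / I_L                                  # metres on the ground per pixel

object_pixels = 120  # e.g., the pixel length of a dolphin in the frame
print(f"GSD: {GSD * 100:.2f} cm/pixel; object length: {object_pixels * GSD:.2f} m")
```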

This parameter, more generally called the field of view (FOV), is needed to calculate the GSD of our drone images using the above formula. We tried to figure out the FOV of our drone cameras by exploring forums and official drone manuals, which inevitably led us to some extremely interesting finds, especially regarding the factors and camera settings that might affect a drone's FOV.

So let's take a look at what exactly the FOV is, how we can find the FOV for a given drone camera, and which settings one would have to take into account when measuring the FOV on one's own.

WHAT IS THE FIELD OF VIEW?

The term field of view refers to the viewable area across a specific axis as seen in an image: the amount of area that a particular lens system can capture. The larger the FOV, the more area the camera can see and capture. Conversely, the smaller the FOV, the less area is seen. Thus, FOV is directly proportional to the extent of the area being captured by a camera.

A graphic showing an aerial drone camera's field of view¹ - Luo, Y., & Chen, Y. (2021), CC BY 4.0

This term usually refers to either the actual physical area that is captured (in units such as mm) or the angle at which a lens is capturing that area. In this context, we will be using FOV to refer to the angular extent of the captured image. Thus, it should be assumed to be expressed in degrees for the rest of this blogpost.

For any given image, there are multiple FOVs, each specific to the axis considered when measuring the area seen by the lens. By axis, we mean an imaginary line that travels across the diagonal, width, or length of an image. Corresponding to these dimensions, there are three distinct FOVs: FOV_D, FOV_W, and FOV_L, respectively. We will refer to these collectively as FOVs unless specified otherwise.

A diagram showing the angles captured by the three different FOVs for a given image.
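For a distortion-free (rectilinear) lens, the three FOVs are tied together by the aspect ratio: the tangent of each half-angle is proportional to that axis' share of the image diagonal. The sketch below converts a diagonal FOV into the other two under that assumption; it does not hold for wide-angle lenses with strong distortion, nor for modes that crop the sensor (more on those below):

```python
import math

def axis_fovs(fov_d_deg, aspect_w, aspect_l):
    """Split a diagonal FOV into width and length FOVs for a rectilinear lens."""
    diagonal = math.hypot(aspect_w, aspect_l)
    tan_half_d = math.tan(math.radians(fov_d_deg) / 2)
    fov_w = 2 * math.degrees(math.atan(tan_half_d * aspect_w / diagonal))
    fov_l = 2 * math.degrees(math.atan(tan_half_d * aspect_l / diagonal))
    return fov_w, fov_l

# Example: a manual-quoted 77 degree diagonal FOV at a native 4:3 aspect ratio
# (made-up values for illustration)
fov_w, fov_l = axis_fovs(77.0, 4, 3)
print(f"FOV_W ~ {fov_w:.1f} deg, FOV_L ~ {fov_l:.1f} deg")
```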

To gain a deeper intuition about this term we can carry out a simple exercise. 

Pull out your mobile phone and open your camera to photo mode. Try to keep your phone fixed in a particular position and observe how the edges of the area being captured in your screen change when you do the following:

  1. Change aspect ratio of picture

  2. Change from photo to video mode

  3. Change from HD to 4K or to some other photo/video quality setting

The above are three images taken by a smartphone from the same position but with different aspect ratios: 9:16, 1:1, and 3:4, from left to right. The vertical and horizontal areas being covered change drastically as we shift from one aspect ratio to another. Thus, all three FOVs change too when we switch between these settings.

FACTORS THAT AFFECT A DRONE’S FIELD OF VIEW

One's first instinct when figuring out the FOV of a drone would be to simply check the official manual for the listed specs. Since we use off-the-shelf drones for our work, understanding any limitations was also crucial. As it turns out, the specs in the official DJI manuals aren't specific or exhaustive. They don't really mention how the FOV changes with different camera settings, but usually state a single default FOV. This measurement indicates the diagonal FOV and applies only when taking pictures at the camera's native aspect ratio.

Consider this scenario: say you need the FOV for recording a video at 60 fps with a 16:9 aspect ratio at a resolution of 3840 x 2160. The first thought would be to apply the FOV from the manual and proceed. But that would be incorrect, because the FOV tends to change with the settings you apply. Chances are you won't be able to simply search online for the FOV of your particular combination of settings, or even find it in the manual. Rather, you will have to try and figure it out on your own.

The discussion around how to build an experiment to measure the FOV of a drone at some given settings is best left for another day. For now, let’s take a look at some of the factors that change the FOV of the drone’s camera.

Aspect Ratio

The aspect ratio is the ratio of width of the image to its height. It is typically expressed as two numbers separated by a colon, such as 4:3 or 16:9. Camera sensors, devices inside the camera that capture light to create an image, are manufactured in various shapes and sizes, each having a native aspect ratio. The native aspect ratio is determined by the number of pixels along the width and height of the sensor. When shooting in the sensor's native aspect ratio, the entire sensor area is used, maximising the resolution and image quality.

When we shift from one aspect ratio to another, we will usually notice a change in the extent of the area being captured in the frame. This occurs because the camera either crops the image or resizes it to fit the desired aspect ratio. Cropping involves trimming parts of the image, effectively reducing the number of pixels used from the sensor. For instance, changing from a 3:2 to a 1:1 (square) aspect ratio would mean cutting off parts of the image on the sides. Since the extent of area being captured is changing, the FOVs end up changing too.

Consider these two images captured by the same drone camera with different aspect ratios:

Images taken with a DJI quadcopter showing a change in the field of view when aspect ratio is changed from 3:2 (top) to 16:9 (base). Notice how the upper and lower portions of the top image get cropped in the bottom image.
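The size of this effect can be estimated with the same half-angle geometry, assuming a pure crop with no resampling (a sketch with a made-up starting FOV, not measured values for any particular drone):

```python
import math

def cropped_fov(fov_deg, crop_fraction):
    """FOV along one axis after that axis is cropped to crop_fraction of the sensor."""
    half_tan = math.tan(math.radians(fov_deg) / 2) * crop_fraction
    return 2 * math.degrees(math.atan(half_tan))

# Cropping a 3:2 image to 16:9 keeps (3/2) / (16/9) = 27/32 of the vertical axis.
vertical_crop = (3 / 2) / (16 / 9)
print(f"A 45 deg vertical FOV becomes ~{cropped_fov(45.0, vertical_crop):.1f} deg")
```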

Zoom Level

This is one of the more obvious factors. The more you zoom in when taking an image, the less physical area you capture in it. The more zoomed out the frame is, the greater the extent of the area captured by the camera. As the captured area changes, so does the FOV.

A still from this video shows how an increase in the optical zoom level of a DJI Mavic 3 Pro results in smaller captured areas.

Video Mode

If there are additional modes in which you can record your video, the FOVs of the different modes might vary significantly. Even when the resolution and aspect ratio remain the same, a change in the recording mode can alter the FOV. Take the DJI Mavic 2 Pro as an example. It has two different modes for recording a 4K video: Full FOV mode and HQ mode. Videos recorded using these modes have the same resolution, 3840 x 2160. However, they have different FOVs!

In Full FOV mode, more or less the full camera sensor is used, followed by downsampling of the video to 4K resolution. In HQ mode, however, a cropped portion of the sensor is used to capture video at 4K resolution² directly. Different portions of the camera sensor are used, and therefore different amounts of area are captured. Thus, their FOVs are different too.

This image, taken from this video, clearly illustrates the difference between Full FOV mode and HQ mode on the DJI Mavic 2 Pro. Both modes have the same resolution and aspect ratio, but they capture different areas.

Frames per second

Commonly written as fps, it is the measurement of how many individual image frames appear in one second of video. The higher the fps, the more frames there are in a video, and the smoother that video appears.

This one was surprising, to be honest. One wouldn't expect the FOV to change with a change in fps. While fps is simply the number of frames recorded per second of video, FOV is a completely different concept that governs the amount of area a camera captures. The two should, in theory, be independent of each other. However, for some drones, changing the fps at which you record a video changes the FOV.

Take the Air 2S³ ⁴ for example. Changing the fps while recording a video changes the FOV of the resulting video. The higher the fps, the more frames the drone has to record and process. When recording high-quality video at 4K resolution, the drone might lack the computational power to process all the frames at once. As a fix, the drone crops the video to reduce the amount of data it has to process. Because the video is cropped, the FOV is also reduced.

The top image was taken by a DJI Air 2S at 24 fps, while the one at the base was taken at 60 fps by the same drone with exactly the same other settings. Notice how the captured area and the field of view shrink in the bottom image.

In conclusion, there are many practical nuances that one might overlook when focusing on the theoretical aspects of a problem. Listed in this blogpost are just some of the factors that affect the FOV when working with drones. Building equations to model how all the above factors influence FOVs could be a fascinating challenge. A parametric equation would be a valuable tool for estimating FOVs effectively, but it would require significant effort to collect data across all the setting combinations via multiple field experiments for a number of different drones. Moreover, for every combination of camera settings one employs, one would need to understand exactly how each setting affects the FOV, which might not be a trivial exercise. Therefore, instead of taking that route, we are only measuring the FOVs for the settings we commonly employ during fieldwork. We have worked on some interesting fieldwork and experiments for this, and we look forward to discussing them soon!

REFERENCES

1. Luo, Y., & Chen, Y. (2021). Energy-Aware Dynamic 3D Placement of Multi-Drone Sensing Fleet. Sensors, 21(8), 2622. https://doi.org/10.3390/s21082622

2. DJI Mavic 2 Pro 4K HQ vs Full FOV EXPLAINED + TESTS

3. DJI Air 2S video crop at high fps?

4. Air 2S FOV - 4K 30FPS VS. 4K 60FPS
