Gaze-Based Drone Navigation

Description/Abstract/Artist Statement

This project aims to develop a novel drone navigation activity, called Gaze Augmentation (a partial immersion in an alternate reality), that allows a drone operator to perform proximity navigation by gaze alone. In particular, the operator controls point-to-point navigation by moving their gaze to a series of desired target waypoints in the visual field. We propose using eye tracking via the Pupil Labs Core eye tracker (sampling frequency of 200 Hz) to control the drone's point-to-point navigation. We use ArUco markers to designate waypoints in the visual field; an ArUco marker is a synthetic square marker composed of a wide black border and an inner binary matrix that determines its identifier. Before controlling the drone with eye tracking, we create a simple navigation plan using gaze-guided waypoints: the operator builds a gaze-position map over the designated ArUco markers in the visual field. From there, we use these gaze data points to compute the drone's initial rotation angle before it moves toward the ArUco markers. In the evaluation, the drone makes an initial rotation, then detects and flies to an ArUco marker's position. Once arrival at a marker is confirmed, the drone performs another rotation, detects the next waypoint marker, and so on.
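
Since the abstract does not specify how gaze positions are read from the Pupil Labs Core headset, the sketch below shows one plausible way to stream live gaze samples through Pupil Capture's ZMQ-based Network API. The default port, the "gaze." topic prefix, and the norm_pos/confidence field names follow Pupil Labs' published conventions; the mapping from gaze samples to waypoint markers is only indicated in a comment, since the abstract does not detail that step.

import zmq
import msgpack

ctx = zmq.Context()

# Ask Pupil Remote (default tcp://127.0.0.1:50020) for the data publisher port.
pupil_remote = ctx.socket(zmq.REQ)
pupil_remote.connect("tcp://127.0.0.1:50020")
pupil_remote.send_string("SUB_PORT")
sub_port = pupil_remote.recv_string()

# Subscribe to gaze datums only.
subscriber = ctx.socket(zmq.SUB)
subscriber.connect(f"tcp://127.0.0.1:{sub_port}")
subscriber.subscribe("gaze.")

while True:
    topic, payload = subscriber.recv_multipart()
    gaze = msgpack.loads(payload)
    # norm_pos is the gaze point in normalized scene-camera coordinates:
    # (0, 0) is the bottom-left and (1, 1) the top-right of the visual field.
    x, y = gaze["norm_pos"]
    # A plan-building step would check whether (x, y) falls on a detected
    # ArUco marker in the scene image and record that marker's ID as the
    # next waypoint; here we simply print the sample.
    print(f"{topic.decode()}: gaze at ({x:.3f}, {y:.3f}), "
          f"confidence {gaze['confidence']:.2f}")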
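
The abstract names neither the drone platform nor the control library, so the following is a minimal sketch of the evaluation loop under two assumptions: a DJI Tello flown through the djitellopy library, and marker detection via OpenCV's contrib aruco module (the legacy functional API; OpenCV 4.7+ exposes the same functionality through cv2.aruco.ArucoDetector). The waypoint IDs are hypothetical placeholders for the output of the gaze-planning step. The loop rotates the drone until the next waypoint marker is roughly centered in the onboard camera, then advances toward it.

import cv2
from djitellopy import Tello

HFOV_DEG = 82.6  # approximate horizontal field of view of the Tello camera

def marker_yaw_offset(frame, target_id, aruco_dict):
    """Return the horizontal angle (degrees) from the image center to the
    target marker's center, or None if the marker is not visible."""
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    corners, ids, _ = cv2.aruco.detectMarkers(gray, aruco_dict)
    if ids is None:
        return None
    for marker_corners, marker_id in zip(corners, ids.flatten()):
        if marker_id == target_id:
            cx = marker_corners[0][:, 0].mean()  # mean x of the 4 corners
            w = frame.shape[1]
            return (cx - w / 2) / w * HFOV_DEG
    return None

tello = Tello()
tello.connect()
tello.streamon()
tello.takeoff()

aruco_dict = cv2.aruco.getPredefinedDictionary(cv2.aruco.DICT_4X4_50)
waypoint_ids = [3, 7, 11]  # hypothetical marker IDs from the gaze plan

for target_id in waypoint_ids:
    # Rotate in small steps until the waypoint marker is roughly centered.
    while True:
        frame = tello.get_frame_read().frame
        offset = marker_yaw_offset(frame, target_id, aruco_dict)
        if offset is None:
            tello.rotate_clockwise(20)  # keep scanning for the marker
        elif abs(offset) > 5:
            step = max(1, min(20, int(abs(offset))))
            if offset > 0:
                tello.rotate_clockwise(step)
            else:
                tello.rotate_counter_clockwise(step)
        else:
            break
    tello.move_forward(100)  # advance toward the marker (distance in cm)

tello.land()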

Presenting Author Name/s

Kayla Pineda

Faculty Advisor/Mentor

Sampath Jayarathna

College Affiliation

College of Sciences

Presentation Type

Poster

Disciplines

Artificial Intelligence and Robotics | Data Science | Graphics and Human Computer Interfaces

Session Title

Poster Session

Location

Learning Commons @ Perry Library

Start Date

3-19-2022 9:00 AM

End Date

3-19-2022 11:00 AM
