Eye-Tracking based Object Detection using Drone Captured Video Streams

Description/Abstract/Artist Statement

The use of unmanned aerial vehicles (UAVs), or drones, has increased significantly over the past few years. Growing demand in the drone industry is creating new workforce opportunities in package delivery, search and rescue, real estate, transportation, agriculture, infrastructure inspection, and many other areas, underscoring the importance of effective and efficient control techniques. We propose a scheme for controlling a drone through gaze extracted from an eye tracker, enabling an operator to focus the drone on an object in a scene. Using the object detection model You Only Look Once (YOLO), our plan is to have a user select an object in the drone's video feed. As the user watches the feed on a screen, YOLO identifies objects in each frame. The user then fixates on one of the detected objects, and the gaze point is matched to the corresponding detection. Once the object is selected and identified, the drone begins to track it.
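The sketch below is a minimal illustration of the gaze-to-object matching step described above, assuming detections arrive as bounding boxes in frame coordinates and gaze samples are already mapped onto the same frame. The Detection record and the select_gazed_object helper are hypothetical names used only for illustration; in a real setup the boxes would come from YOLO and the gaze point from the eye tracker's API.

```python
from dataclasses import dataclass
from typing import List, Optional, Tuple

# Hypothetical detection record; in practice these fields would be
# populated from YOLO's per-frame output.
@dataclass
class Detection:
    label: str
    x1: float
    y1: float
    x2: float
    y2: float


def select_gazed_object(gaze: Tuple[float, float],
                        detections: List[Detection]) -> Optional[Detection]:
    """Return the detection whose bounding box contains the gaze point.

    If several boxes contain the point, pick the one whose center is
    closest to the gaze point; return None if no box contains it.
    """
    gx, gy = gaze
    hits = [d for d in detections
            if d.x1 <= gx <= d.x2 and d.y1 <= gy <= d.y2]
    if not hits:
        return None

    def center_dist(d: Detection) -> float:
        cx, cy = (d.x1 + d.x2) / 2, (d.y1 + d.y2) / 2
        return (cx - gx) ** 2 + (cy - gy) ** 2

    return min(hits, key=center_dist)


if __name__ == "__main__":
    # One example frame with two hypothetical detections and one gaze sample.
    frame_detections = [
        Detection("person", 100, 50, 220, 400),
        Detection("car", 300, 200, 560, 380),
    ]
    gaze_point = (410.0, 290.0)  # screen gaze mapped into frame coordinates
    target = select_gazed_object(gaze_point, frame_detections)
    print(target.label if target else "no object under gaze")
```

Once a detection is selected this way, the tracking stage would follow that object's box across subsequent frames and steer the drone accordingly.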

Presenting Author Name/s

Kayla Pineda

Faculty Advisor/Mentor

Sampath Jayarathna

Faculty Advisor/Mentor Department

Computer Science Department

College Affiliation

College of Sciences

Presentation Type

Poster

Disciplines

Artificial Intelligence and Robotics | Data Science | Graphics and Human Computer Interfaces

Session Title

Poster Session

Location

Learning Commons Lobby @ Perry Library

Start Date

3-25-2023 8:30 AM

End Date

3-25-2023 10:00 AM
