The REU site will provide undergraduates with a comprehensive research experience in the context of unmanned systems, a rapidly growing field of scientific and technological research. REU students will participate in carefully prepared research projects on unmanned systems, with topics including unmanned aerial vehicle control, behavior studies of robots in complex environments around humans, characterizing and developing vehicles for atmospheric profiling, and asset management during wildfires with unmanned aerial vehicles. Research activities across the projects will be structured to provide a systematic research experience for the cohort, while giving each participant the flexibility to focus on a particular area. The projects will be hosted at the Nebraska Intelligent MoBile Unmanned Systems (NIMBUS) Lab at the University of Nebraska-Lincoln (UNL), which has graduated over 30 undergraduate, master's, and doctoral students in the last five years and is a leader in aerial unmanned systems and their application to the environment.
Competitive stipend: $6,000
Suite-style room and meal plan
Travel expenses to and from Lincoln
Campus parking and/or bus pass
Full access to the Campus Recreation Center and campus library system
Despite decades of control theory research and education, engineering students often complete their undergraduate degrees still unable to fully design, verify, and implement controllers for computer-controlled systems. Furthermore, the techniques involved in this process are changing rapidly, yet educational strategies rarely leverage them. In this project, students will learn to model, design, and implement in software a controller for an Unmanned Aircraft System (UAS). The focus will be on rapidly developing models, verifying them, and synthesizing controllers for the vehicle. Each controller will also be field tested, giving students the opportunity to experimentally validate the underlying theory while developing skills in robotics field experimentation.
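To give a flavor of the model-design-implement loop described above, the following is a minimal sketch of a discrete PID controller regulating the altitude of a toy vehicle model. The controller structure, gains, and the one-dimensional plant are illustrative assumptions for this page, not the lab's actual flight code.

```python
# Minimal discrete PID controller sketch (illustrative only; the
# controllers, gains, and vehicle interfaces used in the project will differ).

class PID:
    def __init__(self, kp, ki, kd, dt):
        self.kp, self.ki, self.kd, self.dt = kp, ki, kd, dt
        self.integral = 0.0
        self.prev_error = 0.0

    def update(self, setpoint, measurement):
        """Return a control effort driving measurement toward setpoint."""
        error = setpoint - measurement
        self.integral += error * self.dt
        derivative = (error - self.prev_error) / self.dt
        self.prev_error = error
        return self.kp * error + self.ki * self.integral + self.kd * derivative

# Toy plant: a vehicle whose climb rate is the control input.
pid = PID(kp=1.2, ki=0.1, kd=0.3, dt=0.1)
altitude = 0.0
for _ in range(600):  # simulate 60 seconds
    climb_rate = pid.update(setpoint=10.0, measurement=altitude)
    altitude += climb_rate * 0.1  # integrate over one time step

print(round(altitude, 2))  # settles near the 10 m setpoint
```

In the project itself, the same closed-loop structure would be validated in simulation and then field tested on hardware, which is where the simple plant model above stops being a good stand-in.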
Development and Characterization of Low-Level Atmospheric Profiling UAVs
Understanding the lowest few hundred meters of the atmosphere is critical for everything from better predicting severe weather development to understanding the impact of vegetation on the carbon cycle. Small UAVs have significant potential to aid in these measurements. Current weather monitoring relies on radars, weather balloons, and airplanes, which primarily measure the atmosphere more than 1,000 meters above ground level, and on ground-based weather stations, which measure only the lowest ten to fifty meters of the atmosphere and are fixed in a single location.
The NIMBUS Lab is developing systems and algorithms for deploying and recovering UAVs and other robot systems to better monitor the lower atmosphere in complex environments, from windy farms to within the canopy of rainforests. This project entails working with a range of faculty and students with backgrounds in CS, ME, EE, and Atmospheric Sciences to develop systems and algorithms that help address this challenging problem. Students with interests or backgrounds in any of these or related fields are welcome to apply.
Much of the current research on how unmanned aerial systems approach humans during human-robot interaction occurs in simulated environments or under otherwise unnatural testing conditions, which may make the results difficult to generalize. A significant portion of this research takes place in laboratory spaces and does not consider the impact of the robot's approach on the quality of the interaction. This line of research seeks to understand human attitudes toward unmanned system-initiated assistance in public spaces, both to help designers build systems that are genuinely helpful and to help public space administrators decide how and when to deploy unmanned systems that will benefit patrons. This project will assess whether giving the system a visible purpose (such as distributing items) and adding sound- or light-based communication affects acceptance of, and distancing from, various unmanned systems.
Assessing User Expertise with Small Unmanned Aerial Systems
Unlike drivers of autonomous cars or pilots using autopilots in manned aircraft, users of UAS cannot be assumed to have undergone training and skills-based testing before operating a system on their own. Eye tracking provides fine-grained insight into what users are looking at while performing a task, in contrast to asking them what they did after the fact, when they may forget or misreport their actions. Several eye tracking measures (such as the number and duration of fixations, and regressions, to name a few) will be assessed over various areas of interest during the task. Leveraging state-of-the-art eye tracking technology, user testing in varied environments (indoors and outdoors) will improve the fundamental understanding and measurement of how users develop expertise as they work with UAS. This will lead to richer, more accurate representations of how eye tracking data can inform interactions in lab, field, and mobile environments.
The results of this data collection and analysis will provide better recommendations for when to use different types of eye trackers, identify which metrics best convey information such as trust or expertise, and yield large data sets for other researchers to use in their own work.
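The metrics mentioned above (fixation counts, fixation durations, regressions over areas of interest) can be computed from a gaze trace with a few lines of code. The sketch below is a toy illustration; the area-of-interest names, the fixation record format, and the regression definition are assumptions, not the lab's actual analysis pipeline.

```python
# Illustrative sketch of simple gaze metrics over areas of interest (AOIs).
# The AOI labels, data format, and regression definition are assumptions.

from collections import defaultdict

def summarize_fixations(fixations):
    """fixations: list of (aoi, duration_ms) tuples in viewing order.
    Returns per-AOI fixation counts, per-AOI mean durations, and a
    regression count (returns to an AOI that was previously left)."""
    counts = defaultdict(int)
    total_ms = defaultdict(float)
    seen, regressions = set(), 0
    prev_aoi = None
    for aoi, duration_ms in fixations:
        counts[aoi] += 1
        total_ms[aoi] += duration_ms
        if aoi != prev_aoi and aoi in seen:
            regressions += 1  # returned to an AOI after looking away
        seen.add(aoi)
        prev_aoi = aoi
    means = {a: total_ms[a] / counts[a] for a in counts}
    return dict(counts), means, regressions

# Toy trace: operator fixates the vehicle, glances at the controller,
# then looks back at the vehicle (one regression).
trace = [("uas", 320), ("uas", 180), ("controller", 250), ("uas", 400)]
counts, mean_ms, regressions = summarize_fixations(trace)
print(counts, mean_ms, regressions)
```

Real eye-tracker exports include timestamps and gaze coordinates rather than pre-labeled AOIs, so a mapping step from gaze position to AOI would precede a summary like this.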
Developing Prototypes of Learning-Based Autonomous Systems
The future of transportation is likely to rely heavily on smart autonomous vehicles interacting with each other and the physical world via the 5G communication revolution. Rigorous efforts have been made in both industry and academia to build autonomous vehicles that can "observe" and "understand" their surrounding environments and safely adjust their behavior as those environments change. This complex task is beyond the reach of traditional control methodologies, which have been investigated for decades. A new generation of autonomous systems must be able to "learn" from experience like a human and automatically "make decisions" in new scenarios.
Thanks to enormous progress in deep learning and hardware development, we are a step closer to fully autonomous systems that learn using deep neural networks (DNNs). This project aims to develop two prototypes of learning-based autonomous systems: an autonomous car and a drone. Students will: 1) learn about and build such systems in simulation, 2) train controllers using reinforcement learning, and 3) deploy and test the controllers on hardware prototypes.
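As a taste of step 2, the following is a minimal tabular Q-learning sketch: an agent learns, purely from trial and error, to move a "drone" down a one-dimensional corridor to a goal. The corridor world, rewards, and hyperparameters are toy assumptions; the project would use a full vehicle simulator and deep reinforcement learning rather than a lookup table.

```python
# Minimal tabular Q-learning sketch (toy stand-in for the project's
# simulator-based deep RL training; all parameters are illustrative).

import random

N = 6              # states 0..5; the "drone" starts at 0, the goal is state 5
ACTIONS = [-1, 1]  # move left / move right
Q = [[0.0, 0.0] for _ in range(N)]  # Q[state][action_index]

random.seed(0)
alpha, gamma, eps = 0.5, 0.9, 0.1
for _ in range(500):  # training episodes
    s = 0
    while s != N - 1:
        # epsilon-greedy action selection
        if random.random() < eps:
            a = random.randrange(2)
        else:
            a = max((0, 1), key=lambda i: Q[s][i])
        s2 = min(max(s + ACTIONS[a], 0), N - 1)
        r = 1.0 if s2 == N - 1 else -0.01  # goal reward, small step cost
        # Q-learning update toward the bootstrapped target
        Q[s][a] += alpha * (r + gamma * max(Q[s2]) - Q[s][a])
        s = s2

# After training, the greedy policy should step toward the goal everywhere.
policy = [max((0, 1), key=lambda i: Q[s][i]) for s in range(N - 1)]
print(policy)
```

The same loop structure (observe state, pick an action, receive a reward, update the value estimate) carries over when the table is replaced by a deep neural network and the corridor by a driving or flight simulator.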