The goal of this research is to produce a compact, lightweight sensor that will enable an unmanned aerial vehicle to autonomously traverse trajectories through cluttered environments, such as a forest, while sensing and avoiding obstacles. The sensor is inspired by biological echolocation as performed by bats, which involves emitting an ultrasonic signal and closely listening to its reflections. By analyzing how each received echo differs from the emitted signal, and how the echoes differ between its two ears, the bat can determine where the reflecting object is located. Additionally, using sequences of these echoes makes it possible to determine movement through the environment. We will mimic these features in our system to achieve the same results. For our application we use radio waves (radar) instead of sound (sonar), because radio waves travel at much greater speeds while allowing the sensor to operate under circumstances where optical cameras would fail, such as at night or in rain, fog, or smoke.

Furthermore, we propose a control scheme inspired by cognition, in particular insect intelligence, to steer the robot. The idea is to implement a layered system of behavioral units, each with its own goal. Examples of these units include stopping to avoid a collision, dodging an obstacle, and following a corridor. At each moment the system executes the highest-priority behavior that is active, creating an overall emergent intelligence.
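To illustrate the arbitration idea, the layered behavioral scheme could be sketched as follows. This is a minimal, hypothetical sketch, not the project's implementation: the behavior names, the sensor dictionary, and the `arbitrate` function are all illustrative assumptions.

```python
# Hypothetical sketch of priority-based behavior arbitration: each behavioral
# unit has a trigger condition and a steering command; at every control step
# the highest-priority active unit wins. All names here are illustrative.

from dataclasses import dataclass
from typing import Callable, Dict, List, Optional

@dataclass
class Behavior:
    name: str
    priority: int                                 # higher value = more urgent
    is_active: Callable[[Dict[str, float]], bool] # does this unit apply now?
    command: Callable[[Dict[str, float]], str]    # steering command it issues

def arbitrate(behaviors: List[Behavior],
              sensors: Dict[str, float]) -> Optional[str]:
    """Execute the highest-priority behavior whose trigger condition holds."""
    active = [b for b in behaviors if b.is_active(sensors)]
    if not active:
        return None
    winner = max(active, key=lambda b: b.priority)
    return winner.command(sensors)

# Example layers mirroring the units named in the text: emergency stop,
# obstacle dodging, and corridor following (thresholds are made up).
behaviors = [
    Behavior("emergency-stop", 3,
             lambda s: s["range_m"] < 0.5, lambda s: "stop"),
    Behavior("dodge-obstacle", 2,
             lambda s: s["range_m"] < 3.0, lambda s: "turn-away"),
    Behavior("follow-corridor", 1,
             lambda s: True, lambda s: "center-in-corridor"),
]

print(arbitrate(behaviors, {"range_m": 2.0}))  # obstacle at 2 m -> "turn-away"
print(arbitrate(behaviors, {"range_m": 0.3}))  # very close     -> "stop"
```

The emergent intelligence arises because no unit plans the whole trajectory: each layer handles one concern, and the priority ordering resolves conflicts between them at every sensing cycle.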