Research Associateship Programs
Fellowships Office
Policy and Global Affairs

Participating Agencies - AFRL


Opportunity at Air Force Research Laboratory (AFRL)

Bio-Inspired Image Fusion


Airman Systems Directorate, RHX/Human Centered ISR Division

RO#: 13.15.13.B5699
Location: Wright-Patterson AFB, OH 45433


Name: Warren, Richard
E-mail:
Phone: 937.469.2223


Modern information systems produce a variety of imagery, which must be integrated and assimilated for proper use. For some time, there have been serious attempts to produce algorithms for optimal sensor fusion, but the results have been uninspiring. While such algorithms are essential to autonomous multisensor robotic systems, man-in-the-loop systems might do well to simply optimize the presentation for the human observer. Humans effortlessly and pre-attentively fuse information from many sensory subsystems (e.g., form, color, motion) as long as the presentation modes are compatible. How the human brain accomplishes this “binding” feat is controversial, but there is little doubt that the visual cortex exploits spatiotemporal correlations between the sensory subsystems to be bound together. The goal of this research is to investigate several presentation modes for multisensor imagery that would harness and exploit these powerful human abilities.

For example, humans have an astonishing ability to derive a coherent worldview from limited correlation information. There are many examples. (1) Random-dot images with hidden disparity correlations evoke a sense of depth (random-dot stereopsis). (2) Orientation correlations between random dots, created by dilating or rotating a copy of a dot pattern relative to the original, give a strong impression of radial or circular symmetry in the random-dot patterns (Glass patterns). (3) Point-light walkers, created by encoding just a few points on the joints of otherwise invisible actors, yield an immediate sense of the identity and actions of the actors; through correlated motion alone, observers discriminate gender, distinguish sham weightlifting from the real exercise, and readily identify individuals locked together in dance. We believe that this inherent visual information processing capability could be exploited in biological sensor data fusion.
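As a concrete illustration of example (2), the construction of a rotational Glass pattern can be sketched in a few lines of NumPy. This is a minimal sketch for illustration only (the function name and parameters are our own, not part of the research program): each random dot is paired with a copy rotated by a small angle about the image center, and it is the orientation correlation between the paired dots that evokes the impression of circular structure.

```python
import numpy as np

rng = np.random.default_rng(0)

def glass_pattern(n_dots=500, angle_deg=4.0):
    """Return (original, partner) dot coordinates in [-1, 1]^2.

    Each partner dot is the original dot rotated by angle_deg about
    the center, so every dot pair is locally oriented along a circle.
    """
    pts = rng.uniform(-1.0, 1.0, size=(n_dots, 2))
    theta = np.deg2rad(angle_deg)
    rot = np.array([[np.cos(theta), -np.sin(theta)],
                    [np.sin(theta),  np.cos(theta)]])
    partners = pts @ rot.T  # rotate each row vector about the origin
    return pts, partners

orig, partners = glass_pattern()
# Rotation preserves each dot's distance from the center, which is why
# the pairs trace concentric circular structure rather than radial lines.
assert np.allclose(np.linalg.norm(orig, axis=1),
                   np.linalg.norm(partners, axis=1))
```

Superimposing the two dot sets (e.g., as a single scatter plot) yields the classic circular Glass percept; replacing the rotation with a scaling matrix would instead produce the radial (dilation) variant mentioned above.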


Keywords: Vision; Sensor fusion; Data fusion


Citizenship:  Open to U.S. citizens
Level:  Open to Postdoctoral and Senior applicants


Base Stipend: $76,542.00
Travel Allotment: $4,000.00
Supplementation: (see below)

$3,000 Supplement for Doctorates in Engineering & Computer Science

Experience Supplement:
Postdoctoral and Senior Associates will receive an appropriately higher stipend based on the number of years of experience beyond their PhD.

Copyright © 2019. National Academy of Sciences. All rights reserved.