Title: Biologically-inspired Neural Networks for Shape and Color Representation
Contributors: Tsotsos, John K.; Mehrani, Paria
Dates: 2021-12; 2022-03-03
URI: http://hdl.handle.net/10315/39150
Type: Electronic Thesis or Dissertation
Subject: Neurosciences
Keywords: Learning shape selectivity; Part-based shape representation; Biologically-inspired neural networks; Hue-selective neural network; Figure border ownership assignment
Rights: Author owns copyright, except where explicitly noted. Please contact the author directly with licensing requests.

Abstract: The goal of human-level performance in artificial vision systems has yet to be achieved. Toward this goal, a reasonable choice is to simulate the biological visual system with computational models that mimic its processing. A complication with this approach is that the human brain, and its visual system in particular, is not fully understood. On the bright side, remarkable findings in visual neuroscience have answered many questions about visual processing in the primate brain over the past few decades. Nonetheless, there is an evident lag in incorporating these discoveries into biologically-inspired systems. The present work introduces novel biologically-inspired models that incorporate new findings on shape and color processing into analytically-defined neural networks. In contrast to most current methods, which attempt to learn all aspects of behavior from data, we propose to bootstrap such learning by building upon existing knowledge rather than learning from scratch. Put simply, the processing networks are defined analytically where current neural understanding permits, and learned where such knowledge is not available. This hybrid strategy aims to combine the best of both worlds. Experiments on the artificial neurons in the proposed networks demonstrate that these neurons mimic the studied behavior of biological cells, suggesting a path forward for incorporating analytically-defined artificial neural networks into computer vision systems.