[6] Hodierna D G. L'Occhio Della Mosca[M]. Palermo: Decio Cirillo, 1664.
[7] Müller J. Zur Vergleichenden Physiologie Des Gesichtssinnes Des Menschen Und Der Thiere[M]. Leipzig & Wien: Cnobloch, 1826.
[8] Exner S. Die Physiologie Der Facettirten Augen Von Krebsen Und Insecten[M]. Leipzig & Wien: Franz Deuticke, 1891.
[11] Kunze P, Hausen K. Inhomogeneous refractive index in the crystalline cone of a moth eye[J]. Nature, 1971, 231: 392-393.
[12] Tang S, Wolf R, Xu S, et al. Visual pattern recognition in Drosophila is invariant for retinal position[J]. Science, 2004, 305: 1020-1022.
[16] Pericet-Camara R, Dobrzynski M, L'Eplattenier G, et al. CURVACE-CURVed artificial compound eyes[J]. Procedia Computer Science, 2011, 7: 308-309.
[19] Tanida J, Kumagai T, Yamada K, et al. Thin observation module by bound optics (TOMBO): concept and experimental verification[J]. Applied Optics, 2001, 40(11): 1806-1813.
[20] Duparre J, Dannberg P, Schreiber P, et al. Artificial apposition compound eye fabricated by micro-optics technology[J]. Applied Optics, 2004, 43(22): 4303-4310.
[21] Duparre J, Radtke D, Tünnermann A. Spherical artificial compound eye captures real images[C]// Proceedings of SPIE, 2007, 6466: 64660K-1-64660K-9.
[22] Afshari H, Jacques L, Bagnato L, et al. The PANOPTIC camera: a plenoptic sensor with real-time omnidirectional capability[J]. Journal of Signal Processing Systems, 2013, 70: 305-328.
[35] Nie J. Morphogenesis of Drosophila Photoreceptor Cells[D]. Indiana University, 2014.
[38] Wardill T J, List O, Li X F, et al. Multiple spectral inputs improve motion discrimination in the Drosophila visual system[J]. Science, 2012, 336 (6083): 925-931.
[39] Hassenstein B, Reichardt W. Systemtheoretische Analyse der Zeit-, Reihenfolgen- und Vorzeichenauswertung bei der Bewegungsperzeption des Rüsselkäfers Chlorophanus[J]. Zeitschrift für Naturforschung B, 1956, 11(9): 513-524.
[40] Egelhaaf M, Borst A. Transient and steady-state response properties of movement detectors[J]. Journal of the Optical Society of America A, 1989, 6(1): 116-127.
[41] Borst A, Egelhaaf M. Principles of visual motion detection[J]. Trends in Neurosciences, 1989, 12(8): 297-306.
[42] Missler J M, Kamangar F A. A neural network for pursuit tracking inspired by the fly visual system[J]. Neural Networks, 1995, 8(3): 463-480.
[43] Rind F C, Bramwell D I. Neural network based on the input organization of an identified neuron signaling impending collision[J]. Journal of Neurophysiology, 1996, 75(3): 967-985.
[44] Yue S G, Rind F C. Postsynaptic organisations of directional selective visual neural networks for collision detection[J]. Neurocomputing, 2013, 103: 50-62.
[50] Özel M N, Simon F, Jafari S, et al. Neuronal diversity and convergence in a visual system developmental atlas[J]. Nature, 2020, 589(7840): 88-95.
[51] Li P H, Lindsey L F, Januszewski M, et al. Automated reconstruction of a serial-section EM Drosophila brain with flood-filling networks and local realignment[J]. Microscopy and Microanalysis, 2019, 25(S2): 1364-1365.
[54] Schneider J, Murali N, Taylor G W, et al. Can Drosophila melanogaster tell who's who?[J]. PLOS ONE, 2018, 13(10): 1-10.
[55] Zhao F F, Zeng Y, Guo A K, et al. A neural algorithm for Drosophila linear and nonlinear decision-making[J/OL]. Scientific Reports, 2020. https://doi.org/10.1038/s41598-020-75628-y.