[1] J. W. Krakauer, A. A. Ghazanfar, A. Gomez-Marin, M. A. MacIver, and D. Poeppel, "Neuroscience Needs Behavior: Correcting a Reductionist Bias," Neuron, vol. 93, no. 3, pp. 480-490, 2017.
[2] S. R. Datta, D. J. Anderson, K. Branson, P. Perona, and A. Leifer, "Computational Neuroethology: A Call to Action," Neuron, vol. 104, no. 1, pp. 11-24, Oct. 2019.
[3] A. I. Dell et al., "Automated image-based tracking and its application in ecology," Trends Ecol. Evol., vol. 29, no. 7, pp. 417-428, 2014.
[4] Y. Hao, A. M. Thomas, and N. Li, "Fully autonomous mouse behavioral and optogenetic experiments in home-cage," eLife, vol. 10, May 2021.
[5] L. Berg, J. Gerdey, and O. A. Masseck, "Optogenetic Manipulation of Neuronal Activity to Modulate Behavior in Freely Moving Mice," J. Vis. Exp., no. 164, p. e61023, Oct. 2020.
[6] D. J. Wallace, D. S. Greenberg, J. Sawinski, S. Rulla, G. Notaro, and J. N. D. Kerr, "Rats maintain an overhead binocular field at the expense of constant fusion," Nature, vol. 498, no. 7452, pp. 65-69, Jun. 2013.
[7] H. L. Payne and J. L. Raymond, "Magnetic eye tracking in mice," eLife, vol. 6, Sep. 2017.
[8] R. Paylor, "Simultaneous behavioral characterizations: Embracing complexity," Proc. Natl. Acad. Sci. U.S.A., vol. 105, no. 52, pp. 20563-20564, 2008.
[9] L. von Ziegler, O. Sturman, and J. Bohacek, "Big behavior: challenges and opportunities in a new era of deep behavior profiling," Neuropsychopharmacology, vol. 46, no. 1, pp. 33-44, 2021.
[10] A. F. Meyer, J. Poort, J. O'Keefe, M. Sahani, and J. F. Linden, "A Head-Mounted Camera System Integrates Detailed Behavioral Monitoring with Multichannel Electrophysiology in Freely Moving Mice," Neuron, vol. 100, no. 1, pp. 46-60.e7, Oct. 2018.
[11] M. O. Pasquet et al., "Wireless inertial measurement of head kinematics in freely-moving rats," Sci. Rep., vol. 6, pp. 1-13, Oct. 2016.
[12] R. Fayat et al., "Inertial measurement of head tilt in rodents: Principles and applications to vestibular research," Sensors, vol. 21, no. 18, pp. 1-22, 2021, doi: 10.3390/s21186318.
[13] W. Abbas and D. Masip Rodo, "Computer methods for automatic locomotion and gesture tracking in mice and small animals for neuroscience applications: A survey," Sensors, vol. 19, no. 15, 2019.
[14] S. Arvin, R. N. Rasmussen, and K. Yonehara, "EyeLoop: An Open-Source System for High-Speed, Closed-Loop Eye-Tracking," Front. Cell. Neurosci., vol. 15, p. 494, Dec. 2021.
[15] M. de Jeu and C. I. De Zeeuw, "Video-oculography in Mice," J. Vis. Exp., no. 65, p. e3971, Jul. 2012, doi: 10.3791/3971-v.
[16] V. Panadeiro, A. Rodriguez, J. Henry, D. Wlodkowic, and M. Andersson, "A review of 28 free animal-tracking software applications: current features and limitations," Lab Anim. (NY), vol. 50, no. 9, pp. 246-254, 2021.
[17] E. Insafutdinov, L. Pishchulin, B. Andres, M. Andriluka, and B. Schiele, "DeeperCut: A Deeper, Stronger, and Faster Multi-Person Pose Estimation Model," Lect. Notes Comput. Sci., vol. 9910, pp. 34-50, 2016.
[18] A. Mathis et al., "DeepLabCut: markerless pose estimation of user-defined body parts with deep learning," Nat. Neurosci., vol. 21, no. 9, pp. 1281-1289, Sep. 2018.
[19] M. W. Mathis and A. Mathis, "Deep learning tools for the measurement of animal behavior in neuroscience," Curr. Opin. Neurobiol., vol. 60, pp. 1-11, Feb. 2020.
[20] A. Mathis, S. Schneider, J. Lauer, and M. W. Mathis, "A Primer on Motion Capture with Deep Learning: Principles, Pitfalls, and Perspectives," Neuron, vol. 108, no. 1, pp. 44-65, 2020.
[21] T. D. Pereira et al., "SLEAP: A deep learning system for multi-animal pose tracking," Nat. Methods, vol. 19, no. 4, pp. 486-495, Apr. 2022.
[22] J. M. Graving et al., "DeepPoseKit, a software toolkit for fast and robust animal pose estimation using deep learning," eLife, vol. 8, p. e47994, 2019.
[23] T. Nath, A. Mathis, A. C. Chen, A. Patel, M. Bethge, and M. W. Mathis, "Using DeepLabCut for 3D markerless pose estimation across species and behaviors," Nat. Protoc., vol. 14, no. 7, pp. 2152-2176, Jun. 2019, doi: 10.1038/s41596-019-0176-0.
[24] O. Sturman et al., "Deep learning-based behavioral analysis reaches human accuracy and is capable of outperforming commercial solutions," Neuropsychopharmacology, vol. 45, no. 11, pp. 1942-1952, 2020.
[25] G. Lopes et al., "Bonsai: An event-based framework for processing and controlling data streams," Front. Neuroinform., vol. 9, pp. 1-14, Apr. 2015.
[26] J. Lauer et al., "Multi-animal pose estimation, identification and tracking with DeepLabCut," Nat. Methods, vol. 19, no. 4, pp. 496-504, Apr. 2022, doi: 10.1038/s41592-022-01443-0.
[27] G.-W. Zhang, L. Shen, Z. Li, H. W. Tao, and L. I. Zhang, "Track-Control, an automatic video-based real-time closed-loop behavioral control toolbox," bioRxiv, 2019.12.11.873372, Dec. 2019.
[28] T. Imai, Y. Takimoto, N. Takeda, A. Uno, H. Inohara, and S. Shimada, "High-Speed Video-Oculography for Measuring Three-Dimensional Rotation Vectors of Eye Movements in Mice," PLoS ONE, vol. 11, no. 3, 2016.
[29] S. S. Oh and H. L. Narver, "Mouse and Rat Anesthesia and Analgesia," Curr. Protoc., vol. 4, no. 2, Feb. 2024.
[30] H. Leinonen and H. Tanila, "Vision in laboratory rodents: Tools to measure it and implications for behavioral research," Behav. Brain Res., vol. 352, pp. 172-182, Oct. 2018.
[31] S. O. H. Madgwick, A. J. L. Harrison, and R. Vaidyanathan, "Estimation of IMU and MARG orientation using a gradient descent algorithm," in Proc. IEEE Int. Conf. Rehabil. Robot., 2011, pp. 179-185.