Published: Nov 5, 2024, Vol. 14, Iss. 21. DOI: 10.21769/BioProtoc.5098
Reviewed by: Edgar Soria-Gomez, Noah Cowan
Abstract
Behavioral neuroscience requires precise and unbiased methods for animal behavior assessment to elucidate complex brain–behavior interactions. Traditional manual scoring methods are often labor-intensive and can be prone to error, necessitating advances in automated techniques. Recent innovations in computer vision have led to both marker-based and markerless tracking systems. In this protocol, we outline the procedures required for utilizing Augmented Reality University of Cordoba (ArUco) markers, a marker-based tracking approach, to automate the assessment and scoring of rodent engagement during an established intracortical microstimulation-based nose-poking go/no-go task. In short, this protocol provides detailed instructions for building a suitable behavioral chamber, installing and configuring all required software packages, constructing and attaching an ArUco marker pattern to a rat, running the behavioral software to track marker positions, and analyzing the engagement data to determine optimal task durations. These methods provide a robust framework for real-time behavioral analysis without the need for extensive training data or high-end computational resources. The main advantages of this protocol include its computational efficiency, ease of implementation, and adaptability to various experimental setups, making it an accessible tool for laboratories with diverse resources. Overall, this approach streamlines the process of behavioral scoring, enhancing both the scalability and reproducibility of behavioral neuroscience research. All resources, including software, 3D models, and example data, are freely available at https://github.com/tomcatsmith19/ArucoDetection.
Key features
• The ArUco marker mounting hardware is lightweight, compact, and detachable for minimizing interference with natural animal behavior.
• Requires minimal computational resources and commercially available equipment, ensuring ease of use for diverse laboratory settings.
• Instructions for extracting necessary code are included to enhance accessibility within custom environments.
• Developed for real-time assessment and scoring of rodent engagement across a diverse array of pre-loaded behavioral tasks; instructions for adding custom tasks are included.
• Engagement analysis allows for the quantification of optimal task durations for consistent behavioral data collection without confirmation biases.
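The engagement analysis referenced above is detailed later in the protocol. As a rough illustration of the idea only, an optimal task duration can be estimated as the time at which a rolling engagement fraction, computed from per-frame marker detections, first drops below a cutoff. The function name, 60-second window, and 50% cutoff below are illustrative assumptions, not the protocol's actual parameters.

```python
import numpy as np

def optimal_task_duration(engaged, fps=30.0, window_s=60.0, threshold=0.5):
    """Return elapsed seconds until the rolling engagement fraction first
    drops below `threshold`, or the full session length if it never does.

    engaged: 1-D boolean array, one entry per video frame (True when the
    ArUco marker was detected inside the task-engagement zone).
    """
    win = int(window_s * fps)
    # Rolling fraction of engaged frames over a trailing window.
    kernel = np.ones(win) / win
    rolling = np.convolve(engaged.astype(float), kernel, mode="valid")
    below = np.nonzero(rolling < threshold)[0]
    if below.size == 0:
        return len(engaged) / fps
    # Index into `rolling` + window length = last frame of the first
    # window whose engagement fell below threshold.
    return (below[0] + win) / fps
```

For example, a session with 120 s of continuous engagement followed by disengagement yields an estimated optimal duration of roughly 150 s with these settings, because the trailing 60 s window must be half empty before the cutoff is crossed.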
Keywords: ArUco markers
Background
Behavioral neuroscience relies on an accurate assessment of animal behavior to understand complex brain–behavior relationships. Traditionally, such assessments have been performed using labor-intensive manual scoring methods, which are time-consuming, subject to human error, and can become impractical for large-scale studies [1,2]. Recent advancements in computer vision and tracking technologies have led to the development of automated behavioral scoring systems, offering more efficient and objective ways to analyze animal behavior [3–7]. Of these technologies, markerless approaches, which leverage machine-learning algorithms for position estimation, have been favored for their ability to capture complex animal movements by tracking an object's inherent features instead of an attached marker [5,8]. However, they often require high-end computational equipment and extensive training data for robust performance [3,6]. For instance, DeepLabCut, a leading markerless system, recommends a powerful GPU for training models, as relying solely on the CPU can significantly slow execution [3]. Alternatively, marker-based approaches utilize physical markers that can be attached directly to animals, tracking only the markers themselves as a robust estimate of an object's movement and orientation [4,7,9]. Among these marker-based tracking systems, Augmented Reality University of Cordoba (ArUco) markers have emerged as a promising tool for automated behavioral scoring procedures [10]. ArUco markers consist of black-and-white square grid patterns of various sizes, serving as fiducial target markers that are easily detected and tracked by cameras. Unlike markerless systems, ArUco markers do not rely on machine learning or extensive training data to be tracked, making them computationally lightweight and feasible for real-time analyses [10].
This simplicity enables ArUco markers to be tracked with minimal computational resources, such as low-end CPUs, ensuring accessibility and scalability for a wide range of laboratories. However, one limitation of this approach is the selectivity of the tracked animal behavior. In contrast to a markerless system's ability to track quick, complex movements with predictions between unmarked frames, the ArUco approach is best suited to simpler movement analyses, such as locomotion, because motion blur during camera capture can disrupt marker detection [10–12]. Nevertheless, ArUco markers offer flexibility in experimental setups: researchers can adapt their tracking to suit specific experimental needs and study a wide range of behaviors in diverse animal models. Although the method is simple by design, detailed protocols that enable researchers with different backgrounds to readily employ it for animal tracking are lacking. Here, we present detailed protocols for establishing a simple and consistent method for automated behavioral scoring of task engagement in an intracortical microstimulation-based nose-poke go/no-go task using ArUco markers. Our step-by-step approach provides clear instructions, allowing researchers to implement ArUco marker-based tracking systems effectively, thereby advancing the accessibility and reproducibility of behavioral neuroscience research.
Materials and reagents
Biological materials
Male Sprague-Dawley rats (Charles River Laboratories Inc., catalog number: 001CD)
¼ in. diameter vinyl tubing (the length required depends on the setup)
¼ in. diameter, 20 threads/inch, 1 in. long bolt with corresponding nut and (8×) washers
3 mm diameter heat shrink tubing (NTE Electronics, catalog number: HS-ASST-5)
¾ in. binder clip (DUIJINYU, catalog number: W003)
4 in. cable ties (Grand Rapids Industrial Products, catalog number: 54157)
0.6 in. binder clip (OWLKELA, catalog number: Clips-Black-0.6inch-120pcs)
6 in. cable ties (Utilitech, catalog number: SGY-CT33)
(2×) 4 ft. Bayonet Neill–Concelman (BNC) cables
6.4 mm diameter heat shrink tubing (Gardner Bender, catalog number: HST-101B)
Acoustic foam wedges (Ultimate Support, catalog number: UA-KIT-SBI)
Break away male header pins, straight (Leo Sales Ltd., catalog number: LS-00004)
Clear packing tape (Duck Brand, catalog number: 287206)
ClearWeld quick-set epoxy (J-B Weld, catalog number: 50112)
Double-sided foam tape
Dustless sugar precision reward pellets (Bio-Serv, catalog number: F0021)
Electrical tape
Hot glue gun with thermal-plastic adhesive (hot glue) sticks
M3 10 mm machine screws with corresponding nuts (6×) (K Kwokker, catalog number: TLEEP2324+FBADE-28)
M3 20 mm machine screws with corresponding nuts (12×) (K Kwokker, catalog number: TLEEP2324+FBADE-28)
M3 30 mm machine screw with corresponding nut (Everbilt, model: 837841)
Medical-grade air tank with regulator (Cuevas Distribution Inc., catalog number: AI USPxxx)
Polylactic acid (PLA) fused deposition modeling (FDM) 3D printer filament
Rubber bands (Advantage, catalog number: 2632A)
Solder wire (Shenzhen Joycefook Technology Co. Ltd, catalog number: JF850)
Stainless steel spring cable shielding (Protech International Inc., catalog number: 6Y000123101F)
Super glue (Loctite, catalog number: 234796-6)
Supplemental food pellets (LabDiet, catalog number: 5LL2, Prolab RMH 1800)
TIP120 negative-positive-negative (NPN) transistors (3×) (BOJACK, catalog number: BJ-TIP-120)
White stock printer paper
Equipment
12V 1A DC power supplies (2×) (Mo-gu, catalog number: B09TXJ9RK6)
150 mm T-Slot 2020 aluminum extrusion rails (4×) (MECCANIXITY, catalog number: mea231123ee000975)
200 mm T-Slot 2020 aluminum extrusion rails (8×) (Tsnamay, catalog number: TSCP5035)
3D-printed ArUco marker mounting assembly hardware (see Procedure)
3D-printed camera mounts (see Procedure)
3D-printed RGB LED strip mounting kit (see Procedure)
3D-printed T-Slot rail connector pieces (see Procedure)
4-pin 12V RGB LED strip
5 mm wide ~4.5 in. long hex wrench
57 in. × 42 in. × 56 in. sound-attenuating chamber (Maze Engineers, catalog number: 5831)
5V activation relay module (Inland, catalog number: 509687)
ArUco tracking cameras (2×) (Logitech, catalog number: 960-001105)
ATmega328 UNO R3 microcontroller (Inland, model: UNO R3 BOARD)
Behavior camera (j5create, catalog number: JVCU100)
Cable tensioner bar (see Procedure)
Commutator (Moog Inc., catalog number: AC6023-18)
Disposable lighter
Electrical stimulator (Plexon Inc., model: PlexStim)
Fused deposition modeling (FDM) 3D printer
Heat gun (Wagner, catalog number: FBA_503008-cr)
Luxmeter (Leaton, catalog number: HRD-PN-79081807)
Multimeter (Innova, model: 3320)
Omnetics tether adapter (Omnetics Connector Corporation, catalog number: A79021-001)
Operant conditioning chamber (Vulintus, model: OmniTrak)
Oscilloscope (Tektronix Inc., model: TBS 1052B)
Pneumatic solenoid (AOMAG, catalog number: SKUSKD1384729)
Soldering iron (Guangzhou Yihua Electronic Equipment Co. Ltd., model: YIHUA 926 III)
Stackable headers, female 10 pin (2×) (Leo Sales Ltd., catalog number: LS-00009)
Workstation computer with Windows 10–11 (Dell, model: precision 5860 workstation)
Software and datasets
All data and code have been deposited to GitHub: https://github.com/tomcatsmith19/ArucoDetection
Cura v4.13.0 (Ultimaker)
Microsoft Excel
MATLAB vR2023b (MathWorks; requires a student license or higher)
Add-On: MATLAB Support Package for Arduino Hardware v23.1.0 (MathWorks)
Add-On: MATLAB Support Package for USB Webcams v23.1.0 (MathWorks)
Add-On: Instrument Control Toolbox v4.8 (MathWorks)
PlexStim v2.3 (Plexon Inc.)
Prism v10.0.2 (GraphPad; requires a student license or higher)
Python v3.10.11 (Python Software Foundation)
TekVisa v4.0.4 (Tektronix)
Visual Studio Code v1.85 (Microsoft)
Extension: Pylance v2024.6.1 (Microsoft)
Extension: Python v2024.8.0 (Microsoft)
Extension: Python Debugger v2024.6.0 (Microsoft)
Package: numpy v1.25.0
Package: opencv-contrib-python v4.7.0.72 (Open Source Vision Foundation)
Package: scipy v1.13.0 (SciPy)
Microsoft Word
Procedure
Article information
Manuscript history
Submitted: Jun 28, 2024
Accepted: Sep 8, 2024
Published online: Sep 28, 2024
Published: Nov 5, 2024
Copyright
© 2024 The Author(s); This is an open access article under the CC BY-NC license (https://creativecommons.org/licenses/by-nc/4.0/).
How to cite
Readers should cite both the Bio-protocol article and the original research article where this protocol was used:
Categories
Neuroscience > Behavioral neuroscience > Sensorimotor responses