Quantifying animal behaviour during microscopy is crucial for associating optically recorded neural activity with behavioural outputs and states. Here, I describe an imaging and tracking system for head-restrained larval zebrafish compatible with functional microscopy. This system is based on the Raspberry Pi computer, Pi NoIR camera and open-source software for real-time segmentation and skeletonization of the zebrafish tail at over 100 Hz. This allows for precise and long-term analyses of swimming behaviour, which can be related to functional signals recorded in individual neurons. This system offers a simple but performant solution for quantifying the behaviour of head-restrained larval zebrafish, and can be built for 340€.

A chief application of the larval zebrafish in neuroscience is to image the activity of neurons in the intact and behaving animal using microscopy. This is facilitated by its small and translucent brain, measuring approximately 0.1 mm³. By expressing genetically encoded indicators, such as the GCaMP Ca2+ sensors (Akerboom et al., 2012; Chen et al., 2013), signals related to the activity of practically any or all neurons can be recorded from the larval zebrafish brain (Ahrens et al., 2012; Portugues et al., 2014).

Ca2+ imaging can be performed with standard microscopes, but such systems are not equipped for monitoring the behaviour of the animal. Therefore, any analyses directly relating neural activity to behaviour will require the integration of a behavioural recording apparatus. Behavioural recording is typically done in the context of custom-built microscopes, which can be designed explicitly with this behaviour-monitoring goal in mind. However, many research groups (including my own) have neither the financial nor the technical means to implement such a complete system. We rely on microscope equipment in a shared core facility. Such microscopes generally cannot be substantially or permanently modified, and often present physical and optical constraints that make installing a behaviour imaging system challenging.

Here, I present a solution for this problem based on the Raspberry Pi computer, that I call pi_tailtrack. The system includes illumination, camera, computer and software, yielding a complete setup that is compact, inexpensive and self-contained. The pi_tailtrack system can reliably track larval zebrafish behaviour in real-time at over 100 Hz while performing functional imaging experiments.

Animal ethics statement

Adult zebrafish used to generate larvae were housed in the Plateau de Recherche Expérimentale en Criblage In Vivo (PRECI) zebrafish facility, approved by the animal welfare committee (comité d’éthique en expérimentation animale de la Région Rhône-Alpes: CECCAPP, agreement no. C693870602). Behaviour and microscopy experiments were performed at the 5 days post fertilization (dpf) stage and are thus not subject to ethical review; these procedures do not harm the larvae.

Animals

All experiments were performed on larval zebrafish at 5 dpf, raised at a density of ∼1 larva ml−1 of E3 medium in a 14 h:10 h light:dark cycle at 28–29°C. Adult zebrafish were housed, cared for and bred at the Lyon PRECI zebrafish facility. mitfa/nacre mutant animals (ZDB-ALT-990423-22) were used to prevent pigmentation.

Larval zebrafish were mounted and head-restrained for two-photon imaging and behavioural analysis by placing them in a very small drop of E3 medium in the lid of a 35 mm Petri dish (Greiner Bio-One, 627102). Molten (∼42°C) 2% low melting point agarose (Sigma A9414) in E3 medium was added to the dish in an approximately 10 mm diameter droplet around the fish, and the zebrafish was repositioned within the solidifying agarose using a gel-loading pipette tip, such that it was oriented symmetrically for imaging with the dorsal surface of the head at the surface of the agarose. After the agarose had solidified (∼10 min), E3 medium was added to the dish, and the agarose around the tail was cut away using a scalpel in two strokes emanating laterally from just below the swim bladder. It is critical not to scratch the dish in the vicinity of the freed tail, as scratches can interfere with tail tracking.

Hardware

I used a Raspberry Pi 4 Model B Rev 1.4 computer, running Raspbian GNU/Linux 11 (Bullseye). Table 1 lists the hardware components used, their approximate prices and a possible supplier (the latter two are subject to change and will rapidly become outdated).

Table 1.

Bill of materials


The short 2 cm focal distance between the animal and the camera allowed for a compact and direct imaging setup, where the camera is mounted directly below the larva (Fig. 1). This avoids the need for any mirrors, and frees the space above the animal for the microscope objective, and any stimulus apparatus necessary. In our case, we used red LEDs to provide visual stimuli to the larvae (Lamiré et al., 2023).

Fig. 1.

pi_tailtrack apparatus. (A) The zebrafish larva being imaged under the microscope is illuminated with infrared (IR) LEDs, and imaged with the IR-sensitive Raspberry Pi NoIR camera. Image acquisition and processing is done with a Raspberry Pi computer and open-source Python packages. The zebrafish tail is identified and segmented in real-time as a sequence of 10 tail segments (green × symbols). (B) Rendering of the main components of the apparatus. IR LEDs illuminate the zebrafish larva that is head-restrained in agarose in a 35 mm diameter Petri dish lid. An IR filter blocks the visible stimulus lights (red LEDs) and the microscope laser from reaching the Raspberry Pi NoIR camera suspended below the fish. (i) Wiring diagram for powering the IR LEDs. (C) Rendering including the 3D-printed mount and microscope objective. (D) Annotated photograph of the pi_tailtrack apparatus.


To illuminate the larvae and visualize the tail, I used 890 nm infrared (IR) LEDs. Using the IR LEDs as an oblique illumination source generated a nicely resolved image of the mitfa mutant zebrafish tail that was sufficient for reliable identification and tracking (Fig. 2). IR LEDs were wired in a simple circuit, with 10 LEDs in a series, powered by an 18 V DC power supply and a 47 Ω current limiting resistor (Fig. 1Bi). Using these exact voltage/resistance configurations is not important, provided a relevant power supply and resistor are chosen to match the LED characteristics (for our 890 nm LEDs: forward voltage=1.4 V, current=100 mA; see e.g. www.amplifiedparts.com/tech-articles/led-parallel-series-calculator).
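
For reference, the standard series-LED calculation with these components gives R = (V_supply − n × V_forward)/I = (18 V − 10 × 1.4 V)/0.1 A = 40 Ω; 47 Ω is the nearest common resistor value above this, and limits the current to (18 V − 14 V)/47 Ω ≈ 85 mA, safely below the 100 mA rating.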

Fig. 2.

Larval zebrafish tail tracking examples. (A) Screenshot of a single frame of a tracking image, showing the image from the camera (‘camera image’) with the resultant tracking points overlaid as white dots. The final tracking point is shown as a white semicircle, which is used in the coordinate search algorithm. ‘Threshold’ shows the result of the adaptive thresholding operation, and ‘threshold+filtering’ the result of the morphological opening and closing operations. Displayed along the top are: frame (current frame number of the experiment), sec_elapsed (number of seconds elapsed in the experiment), fps (current frame rate, frames per second) and stim_val (the current value read on the stimulus recording pin: GPIO pin 4). A schematic of the image field, depicting the agarose mounting medium, the position of the zebrafish and the microscope objective visible in the background, is shown in the left panel. (B) Probability density distribution of individual frame periods from two representative experiments. (Ci) Example frames during a swimming event. (Cii) Tail angle deflections during three distinct swim events. (Ciii) 3D plot of tail coordinates through the same time period as Cii, drawn in the same time color code. (D) Same as C, but for a period in which the larva executes a struggle/escape maneuver and associated high-amplitude tail deflections.


We used an 880 nm bandpass filter in front of the Raspberry Pi NoIR camera module to selectively pass the IR LED light. This filter is essential to block the intense microscope laser light, which would otherwise obscure the image of the fish by saturating (and likely damaging) the camera sensor. Notably, this filter is the most expensive part of the setup, costing more than the computer and camera combined (Table 1). With our typical two-photon GFP/GCaMP imaging settings and the laser tuned to 930 nm, laser light is not visible in the camera image. Using such a bandpass filter in the 880 nm range should make this system compatible with many other imaging modalities (confocal, epifluorescence, brightfield, etc.), provided that the excitation wavelengths are not in the ∼870–900 nm range and the microscope system effectively filters out the 890 nm light from the LEDs. If necessary, these wavelength characteristics can be adapted using different LED and filter components.

To house the system components I used a 3D-printed mount (Fig. 1C,D). This was designed using FreeCAD (freecad.org, FreeCAD file), and 3D printed in black PETG using a Creality Ender 3 Pro 3D printer. It consists of the main body shape that holds the camera, IR filter, red stimulus LEDs above the fish and IR LEDs in the oblique illumination configuration (Main Shape). An insert is placed into the depression above the IR filter, forming the platform onto which the zebrafish dish is placed (Depression Insert). The final 3D-printed component is a semicircular shape that completes the encirclement of the objective and helps minimize light scattering (Fig. 1C, Semicircle STL file).

I would note that I built up the size of the platform of the mount to match the relatively spacious configuration of the microscope I was using (Fig. 1D). A much more compact configuration is possible, as only ∼26 mm of clearance is required from the fish to the bottom of the ∼6 mm-thick camera. The base design could easily be adapted to match different microscope stage configurations. For example, the entire system could be inverted to accommodate an inverted microscope, to image ventral structures during behaviour such as the lateral line ganglia or the heart. Or, if stimuli need to be bottom-projected, a small 45 deg hot mirror could be used to divert the image to the camera, freeing the space directly beneath the animal for stimuli.

Tail tracking approach

Software was written in Python, using the picamera library for camera control (https://picamera.readthedocs.io/en/release-1.13/). Tail tracking was performed using OpenCV (cv2 version 4.5.5; https://opencv.org/) and NumPy (version 1.19.5; Harris et al., 2020). All code is provided in the file record_tail.py.

Image frames are acquired directly from the camera buffer as an 8-bit NumPy array, and thresholded using adaptive thresholding (cv2.adaptiveThreshold) to identify bright objects in the image (Fig. 2A, ‘threshold’), using a threshold of −10 and a 33 pixel neighborhood. This binary image is then filtered using a morphological opening and closing operation (cv2.morphologyEx). This combination generally results in a nicely segmented fish blob in the final binary image (Fig. 2A, ‘threshold+filtering’). Thresholding and filtering parameters can be adjusted in real-time using the w/s and a/d keys. However, this method identifies all large bright objects in the image, including borders of the agarose block and reflections on the microscope objective, and therefore we need a method to identify the fish object among these various segmented blobs.
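
In outline, this step can be reproduced with a few OpenCV calls (a minimal sketch, assuming `frame` holds the 8-bit greyscale camera image; the adaptive method and the 3×3 structuring element are my assumptions, and record_tail.py holds the exact code):

    import cv2
    import numpy as np

    frame = cv2.imread('camera_frame.png', cv2.IMREAD_GRAYSCALE)  # placeholder input

    # bright objects: pixels brighter than their local mean by >10 grey levels
    thresh = cv2.adaptiveThreshold(
        frame, 255,
        cv2.ADAPTIVE_THRESH_MEAN_C,  # threshold relative to the local mean
        cv2.THRESH_BINARY,
        33,   # 33 pixel neighborhood (blockSize must be odd)
        -10)  # threshold constant of -10

    # opening removes small speckle; closing fills small holes in the fish blob
    kernel = np.ones((3, 3), np.uint8)
    filtered = cv2.morphologyEx(thresh, cv2.MORPH_OPEN, kernel)
    filtered = cv2.morphologyEx(filtered, cv2.MORPH_CLOSE, kernel)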

The fish object is identified with a pre-defined coordinate that acts as the first tracking point of the fish. The fish object is then skeletonized into up to 10 tail segments (Fig. 2A, ‘tracking points’), which can be used to reconstruct the posture of the tail and identify swimming events (Fig. 2C,D). To perform this skeletonization, the tracking points are iteratively identified based on the intersection of a semicircle and the fish object, offset 7 pixels (0.19 mm) from the previous tracking point and oriented in the direction of the previous segment (similar to Štih et al., 2019; Randlett et al., 2019). For the first search, this direction is toward the right of the image. This strategy therefore relies on the zebrafish larva being oriented with its tail pointed towards the right, and being placed in the same position such that the exit point of the tail from the agarose block intersects with the first tracking point; the starting coordinate can be adjusted using the arrow keys. The strategy also requires that no other bright objects intersect with the fish object after binarization. Therefore, it is critical to avoid distracting objects in the imaging scene, such as scratches in the dish or stray pieces of agarose.
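
The logic of this iterative search can be sketched as follows (an illustration of the approach, not the exact record_tail.py implementation; the function and parameter names are mine):

    import numpy as np

    def track_tail(binary, start, n_seg=10, radius=7):
        # Iteratively intersect a semicircle of `radius` pixels with the
        # binarized fish object, following the tail segment by segment.
        pts = []
        prev = np.asarray(start, dtype=float)
        angle = 0.0  # first search points toward the right of the image
        arc = np.linspace(-np.pi / 2, np.pi / 2, 21)  # semicircle around heading
        for _ in range(n_seg):
            thetas = angle + arc
            xs = np.round(prev[0] + radius * np.cos(thetas)).astype(int)
            ys = np.round(prev[1] + radius * np.sin(thetas)).astype(int)
            inside = (xs >= 0) & (xs < binary.shape[1]) & (ys >= 0) & (ys < binary.shape[0])
            hits = np.zeros(thetas.shape, dtype=bool)
            hits[inside] = binary[ys[inside], xs[inside]] > 0
            if not hits.any():  # tail lost: pad remaining points with NaN
                pts += [(np.nan, np.nan)] * (n_seg - len(pts))
                break
            angle = thetas[hits].mean()  # follow the middle of the intersection
            prev = prev + radius * np.array([np.cos(angle), np.sin(angle)])
            pts.append((prev[0], prev[1]))
        return pts  # 10 (x, y) tracking points, as saved to *_coords.txt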

Tracking data format

The tail tracking data are saved in a comma-separated text file (‘*_coords.txt’). The x and y coordinates of the 10 tail points are saved as separate rows, giving two rows of 10 columns for every tracked frame. ‘NaN’ values represent instances where a tail point was not identified.

The timing of the data is saved in a separate text file (‘*_tstamps.txt’), which also has two rows for each frame. The first value is the ‘timestamp’, reflecting the time elapsed since the beginning of the tracking experiment. This is used to relate the tail posture and behavioural events to specific points in time, which is necessary because the frame rate is not fixed and can fluctuate during the experiment (see above). Note, however, that the timestamp is based on the time at which the frame is received from the camera buffer, which may lag the time at which it was actually acquired by the camera. This could be problematic if millisecond-level precision on behavioural timing is critical, for example when differentiating between short- and long-latency acoustic stimulus responses (Burgess and Granato, 2007).

The second value in the ‘*_tstamps.txt’ file is the value recorded on GPIO pin 4 of the Raspberry Pi. This value reads either ‘low’=0 for a voltage less than 1.8 V, or ‘high’=1 for 1.8–3.3 V. I use these recordings to synchronize the behavioural recordings with the frames recorded on the microscope. In our typical setup, an analog output pin from the DAQ board on the microscope controls the red stimulus LEDs (Fig. 1B), and this DAQ output is also connected to GPIO pin 4 on the Raspberry Pi. In this way, we can synchronize the stimuli, the microscope imaging frames and the behavioural recordings.
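
Reading this synchronization input amounts to a simple digital read (a minimal sketch using the RPi.GPIO library; the setup details are assumptions, and record_tail.py holds the actual code):

    import RPi.GPIO as GPIO

    GPIO.setmode(GPIO.BCM)    # use Broadcom (GPIO) pin numbering
    GPIO.setup(4, GPIO.IN)    # GPIO pin 4, wired to the microscope DAQ output
    stim_val = GPIO.input(4)  # returns 0 (<1.8 V) or 1 (1.8-3.3 V)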

These datasets can be read into Python for analysis. For example (a minimal sketch assuming the two-rows-per-frame layout described above; the filenames are placeholders):

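    import numpy as np

    # x and y coordinates alternate as rows of 10 columns per tracked frame
    coords = np.genfromtxt('experiment_coords.txt', delimiter=',')
    x = coords[0::2]  # shape (n_frames, 10); NaN where a point was not found
    y = coords[1::2]

    # timestamps and GPIO values alternate as rows, one value per row
    tstamps = np.genfromtxt('experiment_tstamps.txt', delimiter=',')
    t = tstamps[0::2]     # seconds elapsed since the start of tracking
    stim = tstamps[1::2]  # stimulus state read on GPIO pin 4 (0 or 1)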

Ca2+ imaging and analysis

Two-photon Ca2+ imaging was performed and analyzed as described in Lamiré et al. (2023). Briefly, a 5 dpf Tg2(elavl3:GCaMP6s) (ZDB-ALT-180502-2; Dunn et al., 2016) larva was imaged using a 20×1.0 NA water-dipping objective (Olympus) on a Bruker Ultima microscope at the CIQLE imaging platform (Lyon, LYMIC). Frames were acquired using a resonant scanner over a rectangular region of 1024×512 pixels (0.6 µm x/y resolution), with a piezo objective scanner used to acquire 12 planes at 10 µm steps and a volume repeat rate of 1.98 Hz. The position of the functional imaging stack within the brain was stabilized in the x, y and z dimensions ‘online’ during acquisition using Bruker's PrairieLink API and Python (brukerPL_stable_tseries.py): the central imaging plane was compared with a high-quality anatomical stack acquired before functional imaging using the registration.phase_cross_correlation function from the scikit-image package (Van der Walt et al., 2014).
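
The drift estimate at the heart of this stabilization can be sketched as follows (a minimal illustration; the variable names, filenames and upsampling factor are assumptions, and applying the correction via PrairieLink is not shown):

    import numpy as np
    from skimage.registration import phase_cross_correlation

    ref = np.load('anatomy_plane.npy')   # matching plane from the anatomical stack
    live = np.load('current_plane.npy')  # central plane of the live functional stack

    # subpixel shift of the live plane relative to the reference
    shift, error, phasediff = phase_cross_correlation(ref, live, upsample_factor=10)
    dy, dx = shift  # pixel offsets in y and x, to be fed back as a correction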

Regions of interest (ROIs) were identified and fluorescence time series were extracted using suite2p (Pachitariu et al., 2016 preprint). The zebrafish was stimulated with 60 ‘dark flash’ stimuli at 60 s intervals (Lamiré et al., 2023), though responses to these stimuli were not incorporated into the analyses presented here, other than to synchronize the behavioural tracking with the microscope acquisition timing.

To identify neurons tuned to turning direction (Fig. 3C), the fluorescence trace from each ROI was compared with vectors derived from the pi_tailtrack recordings reflecting leftward or rightward turns. These ‘behaviour state’ vectors were convolved with the GCaMP response kernel to generate ‘regressors’ reflecting the predicted Ca2+ response in neurons that are activated during the relevant behavioural state (as in Miri et al., 2011). Tuning images were then generated reflecting the Pearson correlation coefficient between the z-scored fluorescence trace of each ROI and the relevant regressor. Images output from the analysis were adjusted for brightness/contrast and lookup table (LUT) using FIJI/ImageJ (Schindelin et al., 2012). The same approach was used to identify the relationship between ROIs and ‘swim’ and ‘struggle’ motor events (Fig. 3D,E).
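
In outline, this regressor analysis reduces to a convolution and a correlation (a minimal sketch; the exponential-decay kernel time constant and the input files are assumptions for illustration):

    import numpy as np
    from scipy.stats import zscore

    rate = 1.98  # Hz, volume repeat rate of the functional imaging
    tau = 1.8    # s, assumed GCaMP6s decay time constant
    t = np.arange(0, 10, 1 / rate)
    kernel = np.exp(-t / tau)  # GCaMP response kernel

    behaviour = np.load('turn_left_state.npy')  # hypothetical binary state vector
    fluo = np.load('roi_trace.npy')             # hypothetical ROI fluorescence trace

    # predicted Ca2+ response if the neuron is active during this behaviour
    regressor = np.convolve(behaviour, kernel)[:len(behaviour)]
    tuning = np.corrcoef(zscore(fluo), zscore(regressor))[0, 1]  # Pearson r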

Fig. 3.

Identification of behaviour-associated neurons in a larval zebrafish brain via two-photon Ca2+ imaging. (A) Histogram for mean tail angle during individual movement bouts for a single larva over an 80 min imaging session. Bouts are classified as left or right turns based on a threshold value of 0.004 radians/bout. (B) Histogram for bout vigor, quantified using a rolling standard deviation of absolute tail angle. Movements are classified as ‘swims’ or ‘struggles’ based on a threshold value of 0.017. (C) Tuning of Ca2+ traces in regions of interest to turns to the left (green) or right (magenta), as classified in A. Images are the Pearson correlation coefficient to each behavioural regressor (left or right turns), scaled from 0 to 0.3. Tg2(elavl3:GCaMP6s) expression pattern is shown in grey. Arrows highlight the anterior rhombencephalic turning region (ARTR), with ipsilateral tuning to turning direction. A, anterior; P, posterior. (D,E) Tuning of neurons to swims (D) and struggles (E), as classified in B.


Design goals

I wanted to track the swimming behaviour of head-restrained larval zebrafish while performing Ca2+ imaging. There are many ways that this might be accomplished, but I wanted a system that was: (1) able to identify and characterize individual swimming events while imaging the brain using two-photon microscopy; (2) compact and self-contained, so that it can be easily and rapidly installed and removed for imaging sessions on a shared microscope; and (3) made using low-cost and open-source hardware and software to facilitate re-use in other contexts.

Using a Raspberry Pi camera to image the larval zebrafish tail

The Raspberry Pi is a very inexpensive, credit-card-sized computer that plugs into a standard monitor, keyboard and mouse. The Raspberry Pi's open-source nature and large user community, and its ability to control and interface with a variety of devices and sensors, make it a powerful and accessible platform for developing and sharing custom neuroscience and behavioural research tools. Indeed, many such systems have been developed in recent years based around the Raspberry Pi and the Pi Camera, and especially the IR-sensitive Pi NoIR camera, as an acquisition device (Geissmann et al., 2017; Maia Chagas et al., 2017; Saunders et al., 2019; Tadres and Louis, 2020; Broussard et al., 2022).

However, obtaining sufficient resolution and contrast to resolve the larval zebrafish tail is challenging because the tail is very narrow (∼0.25 mm diameter) and nearly transparent. This is especially true in de-pigmented animals, which are generally used for brain imaging owing to their lack of melanin pigment over the brain (e.g. mitfa/nacre mutants, or larvae treated with N-phenylthiourea). De-pigmentation also removes melanin from the tail, increasing its transparency and making it harder to image and track. Thus, it was not clear whether the 26€ Pi NoIR camera would be up to this task.

The stock lens configuration on the Pi Camera is also not designed for macro photography, and has a minimum focus distance of 50 cm. However, extension tubes are a well-known macro-photography hack that works by increasing the distance between the lens and the sensor. Increasing this distance decreases the focus distance of the optical system, increasing the maximal magnification. By unscrewing the lens of the Pi NoIR camera until just before it falls off, it is possible to focus on objects at a 2 cm distance, allowing for sufficient magnification to observe and track the tail of mitfa mutant zebrafish (Figs 1 and 2).

A second challenge is that larval zebrafish move their tails very rapidly, with tail-beat frequencies of 20–40 Hz during normal swimming that can increase to 100 Hz during burst/escape swimming (Budick and O'Malley, 2000; Muller, 2004; Severi et al., 2014). The V2.1 camera documentation indicates a maximum frame rate of 30 Hz, which is insufficient for imaging tail dynamics. However, by adopting a cropped sensor configuration and omitting the JPG compression step in image processing, the camera can be pushed to image at up to 1000 Hz (Elder, 2019). I adopted a configuration with a fixed gain/ISO of 800 in auto-exposure mode, and a cropped sensor of 128×128 pixels covering a 3.5×3.5 mm field of view. This gives sufficient spatial resolution to observe and track the tail of the fish (27 µm pixel−1) and, most importantly, a minimal CPU load. This frees the limited CPU resources on the Raspberry Pi for real-time image processing and tail tracking.
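
Such a configuration can be set up in a few lines of picamera code (a minimal sketch; the framerate and zoom fractions shown here are assumptions, and record_tail.py holds the exact settings):

    import numpy as np
    import picamera

    with picamera.PiCamera(resolution=(128, 128), framerate=120) as cam:
        cam.iso = 800                      # fixed gain; exposure mode stays 'auto'
        cam.zoom = (0.45, 0.45, 0.1, 0.1)  # (x, y, w, h) crop, as sensor fractions
        frame = np.empty((128, 128, 3), dtype=np.uint8)
        # capture unencoded data via the video port: no JPG compression step
        cam.capture(frame, format='rgb', use_video_port=True)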

Tail tracking

Tracking objects in images and videos has undergone a revolution with deep learning and neural network frameworks, where the tracking and reconstruction of complex animal postures is possible after training networks on only a few example images (Mathis et al., 2018; Pereira et al., 2022). However, such approaches are computationally intensive and generally require dedicated GPU hardware beyond the capabilities of the standard Raspberry Pi, making them incompatible with our project goals. In contexts where the image background is predictable and stable, classical computer vision methods such as background subtraction, filtering and thresholding may still be preferable to network-based object identification, especially when speed or computational resources are priorities (Mirat et al., 2013; Štih et al., 2019; Zhu et al., 2023). Here, I have used the NumPy (Harris et al., 2020) and OpenCV (https://opencv.org/) libraries to handle the image data and computer vision tasks (Fig. 1).

I used a computationally lean segmentation and skeletonization strategy (see Materials and Methods) to segment the tail into 10 segments (Fig. 2), which takes less than 10 ms on the Raspberry Pi CPU. The imaging frame rate when using the picamera Python package adjusts to the throughput of the analysis system, which can change with the complexity of the binary images being processed or with external CPU demands, but runs at approximately 104 frames s−1 (Fig. 2B). This is sufficient to clearly distinguish different types of movement events (such as ‘swims’ from ‘struggles’, Fig. 2C versus D), and to resolve individual tail beats during swimming events. However, this is not true during rapid/burst swimming, in which tail-beat frequency exceeds our frame rate: at ∼104 frames s−1, only tail half-beat frequencies of ≤50 Hz can be tracked reliably, so if such temporal resolution is required our setup will be insufficient. This system is therefore not capable of comprehensive behavioural characterization, but can be used to identify different types of swim events.

During the experiment, the software provides a visual display, as shown in the screenshots (Fig. 2) and the screen-capture video (Movie 1). Results of the thresholding, filtering and skeleton tracking are visible and updated in real time. This display can be used to optimize the position of the zebrafish, the adaptive thresholding parameters (neighborhood, threshold) using the ‘w/a/s/d’ keys, and the position of the first tracking point using the arrow keys.

Behavioural analysis of Ca2+ imaging data

To test the performance of the pi_tailtrack system, I analyzed Ca2+ imaging data from an 80-min-long volumetric recording covering a large proportion of the brain (as in Lamiré et al., 2023). To identify neurons tuned to behavioural parameters, I used ‘regressors’ derived from the pi_tailtrack recordings reflecting different motor states, convolved with the GCaMP response kernel (as in Miri et al., 2011). Zebrafish swim bouts can be classified as either forward swims or turns, and an area within the anterior hindbrain is associated with turning direction. This area is known as the anterior rhombencephalic turning region (ARTR: Dunn et al., 2016; also called the HBO: Ahrens et al., 2013; Wolf et al., 2017), and shows a conspicuous activity pattern with stripes of neurons tuned to the ipsilateral turning direction. By looking at correlations to regressors reflecting right and left turns, I identified these stripes of neurons in the ARTR, indicating that I can successfully identify the ARTR using pi_tailtrack (Fig. 3A,C). A similar analysis comparing ‘swims’ with ‘struggles’, with ‘struggles’ reflecting high-amplitude tail-flicking events (Figs 2D and 3B), identified differential neuronal activation between these two movement categories (Fig. 3D,E): lateral hindbrain populations of neurons were negatively correlated with ‘swims’, whereas a broader population was positively correlated with ‘struggles’.

Future developments

Here, I have used the pi_tailtrack system to simply record the behaviour of the animal independent of the microscopy or any stimulus delivery. Therefore, the timing of microscope image acquisition is controlled by the microscope computer and is independent of pi_tailtrack. These separate experimental clocks (microscope frames versus Pi Camera frames) must be synchronized, and in the present case I have used the GPIO input pin on the Raspberry Pi to record the timing of the stimuli delivered by the microscope relative to the Pi Camera frames. An alternative solution would be to use the Raspberry Pi to deliver the stimuli, perhaps by integrating a video projector system to allow for the delivery of arbitrary and complex visual stimuli. This would also open up possibilities for performing ‘virtual reality’ experiments, where the behaviour of the animal dictates the stimulus in closed loop. In some microscope systems it should also be possible to use the Raspberry Pi GPIO to trigger microscope acquisitions. This may be preferable if the synchronization between imaging and behaviour frames is critical.

It is also important to note that hardware in this microcomputer/Raspberry Pi space is rapidly evolving. Indeed, a new suite of Raspberry Pi V3 cameras has just been released, offering increased resolution, dynamic range and frame rate. Using these cameras, it may be possible to increase the tracking frame rate into the hundreds of Hertz, which would allow individual tail half-beats to be resolved more reliably. The Raspberry Pi ‘Global Shutter’ camera has also recently been released, and is likely to be of particular interest for behavioural neuroscience, as a global shutter avoids the rolling-shutter artifacts that distort images of rapidly moving objects. The software introduced here could be further optimized for speed, for example by moving to a multi-threaded architecture to distribute the image acquisition and tracking computations (Zhu et al., 2023; Randlett et al., 2019), by using a compiled language (e.g. C/C++ or Julia), or perhaps by moving image processing onto the Raspberry Pi GPU.

Conclusions

Here, I have described our system for tracking the tail of the larval zebrafish during microscopy. Many of the practical considerations of this setup may be specific to our application, and may therefore need modification for use in other experiments in other labs. However, I feel that the simple core idea of using an IR-sensitive Raspberry Pi camera and a Python script, coupled with IR LEDs and an IR filter, provides an approachable and flexible solution that may be widely useful for observing and tracking the behaviour of zebrafish (or perhaps other animals) during imaging experiments. These attributes may also make the system an ideal tool for community engagement activities such as school outreach programs, serving as a platform for learning about microelectronics, behavioural analyses, machine vision, and hardware design and construction.

Acknowledgements

I thank the Centre d'Imagerie Quantitative Lyon-Est (CIQLE) imaging facility for providing access to the two-photon microscope equipment, and the Plateau de Recherche Expérimentale en Criblage In Vivo (PRECI) for zebrafish care and maintenance.

Funding

This work was supported by funding from the ATIP-Avenir program of the Centre national de la recherche scientifique (CNRS) and Inserm, a Fondation Fyssen research grant, and the IDEX-Impulsion initiative of the University of Lyon.

Data availability

Software and analysis code is available here: https://github.com/owenrandlett/pi_tailtrack/. Datasets are available here: pi_tailtrack datasets.

References

Ahrens, M. B., Li, J. M., Orger, M. B., Robson, D. N., Schier, A. F., Engert, F. and Portugues, R. (2012). Brain-wide neuronal dynamics during motor adaptation in zebrafish. Nature 485, 471-477.
Ahrens, M. B., Orger, M. B., Robson, D. N., Li, J. M. and Keller, P. J. (2013). Whole-brain functional imaging at cellular resolution using light-sheet microscopy. Nat. Methods 10, 413-420.
Akerboom, J., Chen, T.-W., Wardill, T. J., Tian, L., Marvin, J. S., Mutlu, S., Calderón, N. C., Esposti, F., Borghuis, B. G., Sun, X. R. et al. (2012). Optimization of a GCaMP calcium indicator for neural activity imaging. J. Neurosci. 32, 13819-13840.
Broussard, G. J., Kislin, M., Jung, C. and Wang, S. S.-H. (2022). A flexible platform for monitoring cerebellum-dependent sensory associative learning. J. Vis. Exp. 19, 179.
Budick, S. A. and O'Malley, D. M. (2000). Locomotor repertoire of the larval zebrafish: swimming, turning and prey capture. J. Exp. Biol. 203, 2565-2579.
Burgess, H. A. and Granato, M. (2007). Sensorimotor gating in larval zebrafish. J. Neurosci. 27, 4984-4994.
Chen, T.-W., Wardill, T. J., Sun, Y., Pulver, S. R., Renninger, S. L., Baohan, A., Schreiter, E. R., Kerr, R. A., Orger, M. B., Jayaraman, V. et al. (2013). Ultrasensitive fluorescent proteins for imaging neuronal activity. Nature 499, 295-300.
Dunn, T. W., Mu, Y., Narayan, S., Randlett, O., Naumann, E. A., Yang, C.-T., Schier, A. F., Freeman, J., Engert, F. and Ahrens, M. B. (2016). Brain-wide mapping of neural activity controlling zebrafish exploratory locomotion. eLife 5, e12741.
Elder, R. (2019). A guide to recording 660fps video on a $6 Raspberry Pi camera. https://blog.robertelder.org/recording-660-fps-on-raspberry-pi-camera/ (accessed 5 May 2023).
Geissmann, Q., Garcia Rodriguez, L., Beckwith, E. J., French, A. S., Jamasb, A. R. and Gilestro, G. F. (2017). Ethoscopes: an open platform for high-throughput ethomics. PLoS Biol. 15, e2003026.
Harris, C. R., Millman, K. J., van der Walt, S. J., Gommers, R., Virtanen, P., Cournapeau, D., Wieser, E., Taylor, J., Berg, S., Smith, N. J. et al. (2020). Array programming with NumPy. Nature 585, 357-362.
Lamiré, L.-A., Haesemeyer, M., Engert, F., Granato, M. and Randlett, O. (2023). Functional and pharmacological analyses of visual habituation learning in larval zebrafish. eLife 12, 84926.
Maia Chagas, A., Prieto-Godino, L. L., Arrenberg, A. B. and Baden, T. (2017). The €100 lab: a 3D-printable open-source platform for fluorescence microscopy, optogenetics, and accurate temperature control during behaviour of zebrafish, Drosophila, and Caenorhabditis elegans. PLoS Biol. 15, e2002702.
Mathis, A., Mamidanna, P., Cury, K. M., Abe, T., Murthy, V. N., Mathis, M. W. and Bethge, M. (2018). DeepLabCut: markerless pose estimation of user-defined body parts with deep learning. Nat. Neurosci. 21, 1281-1289.
Mirat, O., Sternberg, J. R., Severi, K. E. and Wyart, C. (2013). ZebraZoom: an automated program for high-throughput behavioral analysis and categorization. Front. Neural Circuits 7, 107.
Miri, A., Daie, K., Burdine, R. D., Aksay, E. and Tank, D. W. (2011). Regression-based identification of behavior-encoding neurons during large-scale optical imaging of neural activity at cellular resolution. J. Neurophysiol. 105, 964-980.
Muller, U. K. (2004). Swimming of larval zebrafish: ontogeny of body waves and implications for locomotory development. J. Exp. Biol. 207, 853-868.
Pachitariu, M., Stringer, C., Dipoppa, M., Schröder, S., Rossi, L. F., Dalgleish, H., Carandini, M. and Harris, K. D. (2016). Suite2p: beyond 10,000 neurons with standard two-photon microscopy. bioRxiv.
Pereira, T. D., Tabris, N., Matsliah, A., Turner, D. M., Li, J., Ravindranath, S., Papadoyannis, E. S., Normand, E., Deutsch, D. S., Wang, Z. Y. et al. (2022). SLEAP: a deep learning system for multi-animal pose tracking. Nat. Methods 19, 486-495.
Portugues, R., Feierstein, C. E., Engert, F. and Orger, M. B. (2014). Whole-brain activity maps reveal stereotyped, distributed networks for visuomotor behavior. Neuron 81, 1328-1343.
Randlett, O., Haesemeyer, M., Forkin, G., Shoenhard, H., Schier, A. F., Engert, F. and Granato, M. (2019). Distributed plasticity drives visual habituation learning in larval zebrafish. Curr. Biol. 29, 1337-1345.
Saunders, J. L., Ott, L. A. and Wehr, M. (2019). AUTOPILOT: automating experiments with lots of Raspberry Pis. bioRxiv, 807693. doi:10.1101/807693.
Schindelin, J., Arganda-Carreras, I., Frise, E., Kaynig, V., Longair, M., Pietzsch, T., Preibisch, S., Rueden, C., Saalfeld, S., Schmid, B. et al. (2012). Fiji: an open-source platform for biological-image analysis. Nat. Methods 9, 676-682.
Severi, K. E., Portugues, R., Marques, J. C., O'Malley, D. M., Orger, M. B. and Engert, F. (2014). Neural control and modulation of swimming speed in the larval zebrafish. Neuron 83, 692-707.
Štih, V., Petrucco, L., Kist, A. M. and Portugues, R. (2019). Stytra: an open-source, integrated system for stimulation, tracking and closed-loop behavioral experiments. PLoS Comput. Biol. 15, e1006699.
Tadres, D. and Louis, M. (2020). PiVR: an affordable and versatile closed-loop platform to study unrestrained sensorimotor behavior. PLoS Biol. 18, e3000712.
Van der Walt, S., Schönberger, J. L., Nunez-Iglesias, J., Boulogne, F., Warner, J. D., Yager, N., Gouillart, E. and Yu, T. (2014). scikit-image: image processing in Python. PeerJ 2, e453.
Wolf, S., Dubreuil, A. M., Bertoni, T., Böhm, U. L., Bormuth, V., Candelier, R., Karpenko, S., Hildebrand, D. G. C., Bianco, I. H., Monasson, R. et al. (2017). Sensorimotor computation underlying phototaxis in zebrafish. Nat. Commun. 8, 651.
Zhu, Y., Auer, F., Gelnaw, H., Davis, S. N., Hamling, K. R., May, C. E., Ahamed, H., Ringstad, N., Nagel, K. I. and Schoppik, D. (2023). SAMPL is a high-throughput solution to study unconstrained vertical behavior in small animals. Cell Rep. 42, 112573.

Competing interests

The author declares no competing or financial interests.

Supplementary information