Physiology & Behavior 73 (2001) 731 – 744
The EthoVision video tracking system—
A tool for behavioral phenotyping of transgenic mice
A.J. Spink*, R.A.J. Tegelenbosch, M.O.S. Buma, L.P.J.J. Noldus
Noldus Information Technology B.V., P.O. Box 268, 6700 AG Wageningen, the Netherlands
Received 1 September 2000; accepted 21 December 2000
* Corresponding author. Tel.: +31-317-497-677; fax: +31-317-424496.
E-mail address: [email protected] (L.P.J.J. Noldus).
Abstract
Video tracking systems enable behavior to be studied in a reliable and consistent way, and over longer time periods than is possible with
manual recording. The system takes an analog video signal, digitizes each frame, and analyzes the resultant pixels to determine the location
of the tracked animals (as well as other data). Calculations are performed on a series of frames to derive a set of quantitative descriptors of the
animal’s movement. EthoVision (from Noldus Information Technology) is a specific example of such a system, and its functionality that is
particularly relevant to transgenic mice studies is described. Key practical aspects of using the EthoVision system are also outlined, including
tips about lighting, marking animals, the arena size, and sample rate. Four case studies are presented, illustrating various aspects of the
system: (1) The effects of disabling the Munc 18-1 gene were clearly shown using the straightforward measure of how long the mice took to
enter a zone in an open field. (2) Differences in exploratory behavior between short and long attack latency mice strains were quantified by
measuring the time spent in inner and outer zones of an open field. (3) Mice with hypomorphic CREB alleles were shown to perform less
well in a water maze, but this was only clear when a range of different variables were calculated from their tracks. (4) Mice with the trkB
receptor knocked out in the forebrain also performed poorly in a water maze, and it was immediately apparent from examining plots of the
tracks that this was due to thigmotaxis. Some of the latest technological developments and possible future directions for video tracking
systems are briefly discussed. © 2001 Elsevier Science Inc. All rights reserved.
Keywords: Video tracking; Automated observation; Water maze; Rodent
1. Introduction
1.1. Why automated observation?
The behavior of animals, including transgenic mice, is
commonly recorded in either a manual or semiautomated
way. Traditionally, a researcher observes the animal, and if
he or she considers that a certain behavior pattern is
displayed, the behavior is noted — either by writing it
down, or by entering the data into an event-recording
program such as The Observer [1,2]. Although manual
recording of behavior can be implemented with a relatively
low investment, and for some behaviors it may be the only
way to detect and record their occurrence, automated
observation (such as video tracking) can provide significant
advantages. The behaviors are recorded more reliably
because the computer algorithm always works in the same
way, and the system does not suffer from observer fatigue or
drift. In contrast to manual observation, video tracking
carries out pattern analysis on a video image of the observed
animals to extract quantitative measurements of the animals’
behavior (for more details about how this works, see below).
1.2. The development of automated observation systems
Technology for automated detection and recording of
behaviors has evolved dramatically in the past decade.
Early systems, using hard-wired electronics, were only able
to track a single animal in highly artificial environments.
For example, an open field can be sampled with a grid of
infrared beams either as the sole detectors [3– 5] or in
combination with other methods such as placing a series
of strain gauge transducers under the arena to estimate the
animal’s position [6]. The magnitude of an animal’s motion
can also be estimated using various types of touch-sensitive
sensors ranging from a crude estimate of movement by
placing the animal on a bass loudspeaker and monitoring
the loudspeaker’s electrical output when its cone was
moved by the rat [7], or measuring the movement of a
mouse’s wheel by attaching the wheel’s axle in place of the
ball of a computer mouse [8], to sensors that can also
measure the position of the animal (and hence locomotion),
for instance by changes in the capacitance of a plate when
the animal is in proximity to it [9], or changes in body
resistance [10]. Other comparable detection methods have
included use of ultrasound [11] and microwave Doppler
radar [12]. The position of a rat in an open field has been
recorded by attaching the rat to a computer joystick via a
series of rods attached by a collar to the rat’s neck [13].
Animals’ behavior can also be measured using a computerized version of a Skinner box [14], in which the subject
has to tap a touch-sensitive monitor [15,16]. The motion of
individual limbs can be monitored automatically using
actigraph sensors, which detect movement by means of a
piezoelectric accelerometer [17].
Early video tracking systems offered clear advantages of
flexibility, precision, and accuracy over the various hardware devices listed above. However, the actual path of the
animal still had to be entered manually by the experimenter,
either by following the track of the animal with the computer mouse [18] or joystick [19], or a digitizing tablet or
similar device [20].
Another early method was to feed the analog video signal
to a dedicated video tracking unit, which detected peaks in
the voltage of the video signal (indicating a region of high
contrast between the tracked animal and background), and
used this to produce the x, y coordinates of the tracked
animals as output which was then fed to the serial port of a
computer [21,22]. These analog systems have the disadvantage of being relatively inflexible (dedicated to particular
experimental setups) and can normally only track one animal, in rather restricted lighting and background conditions.
The first video digitizers were of the column scan type.
These had no internal memory of their own, and were only
able to sample the video signal at rather low rates [23]. In
contrast, a modern frame grabber uses a high-speed analog-to-digital converter to enable real-time conversion of the entire
video image to a high-resolution grid of pixels [23].
It is also possible to acquire digitized video images by
first converting the video input to a digital video format
such as AVI, and then using the AVI file as the input for
object detection [24]. However, this method has the disadvantages that the AVI file quickly gets very large (and so only
trials of a limited duration can be carried out), and the
method does not allow for real-time live data acquisition. It
is also possible to use a digital overlay board to obtain
positional data of tracked animals without the need for a
frame grabber [25,26].
Modern systems, which are based on full-color video
frame grabbers and have flexible software, can track multiple animals simultaneously against a variety of complex
backgrounds. Whereas an infrared detector system typically operates in an arena of 0.5 × 0.5 m with 12 – 32
beams (though systems do exist with a grid of 24 × 24
beams, in an arena of 2 m diameter [4]), the frame grabber
used by EthoVision has a resolution of 768 × 576 pixels
and can track a rat in an arena with a variable shape and up
to 20 m diameter.
1.3. The application of video tracking
Video tracking is particularly suitable for measuring three
types of behaviors: behaviors that occur briefly and are then
interspersed with long periods of inaction (rare behaviors
[27]), behaviors that occur over many hours [24,28] (such as
diurnal variation in behavior), and spatial measurements
(e.g., Refs. [29,30]) (distance, speed, turning, etc.) that the
human observer is unable to accurately estimate. If a
behavior can be automatically detected using a video tracking system, that can greatly reduce the human effort
required [28]. This not only reduces costs, but enables a
larger and therefore statistically more responsible number of
replicates to be used, whilst at the same time reducing the
total number of animals used in an experiment (because
each mouse can be used much more efficiently). This is
particularly important for transgenic mice because in that
case the experiment examines a Heredity × Environment
interaction and a larger sample size is required to test if
interaction effects are significant [31].
It is usually obvious if an animal is experiencing an
extreme behavior such as an epileptic seizure. However,
certain other behaviors, such as behaving in a stressed
manner, may be more open to interpretation in borderline
cases. A key advantage of automated video tracking is that it
forces researchers to define exactly what they mean by a
given behavior, and so enables standardization of methodologies [32] — which is one of the most pressing needs in
mouse behavioral phenotyping [31,33,34].
In this paper we present EthoVision — a general-purpose
video tracking, movement analysis, and behavior-recognition system, and illustrate its use in behavioral phenotyping
with a number of case studies. Noldus Information Technology introduced the first version of EthoVision in 1993.
The system has undergone numerous updates over the years,
based on feedback from users around the world, which has
resulted in a very comprehensive package for studies of
movement and behavior [35]. The software has recently
been comprehensively redesigned for 32-bit Windows platforms, with a highly interactive graphical user interface for
display of experimental design, experimental arena, and
tracks of movement.
2. How video tracking works
In a video tracking system such as EthoVision, the basis
of the system is that a CCD video camera records the area in
which the subject animals are (the scene). The camera's
signal can either be fed directly to a computer, or via a video
cassette recorder (see Fig. 1).

Fig. 1. The basic EthoVision setup. The analog image from the camera is fed into a frame grabber in the computer, which digitizes the image. EthoVision then
analyzes the signal to produce quantitative descriptions of the tracked objects' behavior.
In the case of an analog camera (the most practical
option with current technology) the signal is then digitized
(by a hardware device called a frame grabber) and passed
on to the computer’s memory. The video signal consists of
a series of frames (25 or 30 a second, depending on the
TV standard of the camera), and each digitized frame is
composed of a grid of pixels. Each pixel has either a gray
scale value (for a monochrome signal) or values describing the color of each pixel (red, green, and blue, or hue,
saturation, and intensity [36]). The software then analyzes
each frame in order to distinguish the tracked objects from
the background. Having detected the objects, the software
extracts the required features from each frame (in the case
of EthoVision 2.0 this includes the position of the
mathematical center of each object, and its surface area).
Calculations are carried out on the features to produce
quantified measurements of the animals’ behavior. For
instance, if the position of an animal is known for each
frame, and the whole series of frames is analyzed, the
average speed of locomotion (velocity) of an animal
during an experiment can be calculated. In addition, if
certain regions are identified as being of interest (the
center and edges of an open field experiment, for example), the proportion of time spent by the animals in those
regions can also be calculated. Measurements such as
average velocity and time in the center of an open field
can be used as indicators of anxiety (e.g., Case Study 2)
and so in this way an accurate quantified measure can be
obtained of whether animals receiving a particular treatment were more stressed than other animals, and these
measurements are directly comparable with data from
other workers with similar questions.
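To make the kind of calculation described above concrete, the following minimal sketch (not EthoVision code; the function names and the (t, x, y) track format are assumptions for illustration) computes distance moved, mean velocity, and time spent in a zone from a series of frame-by-frame positions:

```python
import math

def distance_moved(track):
    """Total path length of a track given as a list of (t, x, y) tuples (t in s, x/y in cm)."""
    return sum(
        math.hypot(x2 - x1, y2 - y1)
        for (_, x1, y1), (_, x2, y2) in zip(track, track[1:])
    )

def mean_velocity(track):
    """Average speed of locomotion: total distance divided by elapsed time."""
    elapsed = track[-1][0] - track[0][0]
    return distance_moved(track) / elapsed if elapsed > 0 else 0.0

def time_in_zone(track, in_zone):
    """Time spent in a region of interest, where in_zone(x, y) returns True inside the zone."""
    total = 0.0
    for (t1, x1, y1), (t2, _, _) in zip(track, track[1:]):
        if in_zone(x1, y1):          # sample-and-hold: the position at t1 applies until t2
            total += t2 - t1
    return total

# Example: proportion of a 60 s trial spent within 10 cm of the arena center (0, 0)
track = [(i * 0.2, 30 * math.cos(i / 10), 30 * math.sin(i / 10)) for i in range(300)]
center = lambda x, y: math.hypot(x, y) < 10
print(mean_velocity(track), time_in_zone(track, center) / 60)
```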
3. Practical aspects of video tracking
When using a video tracking system such as EthoVision, the data obtained will only be as good as the quality
of the experimental design. Of course, basic considerations
such as sufficient replication and avoiding pseudoreplication [37] must still be adhered to. Likewise, the better the
practical setup, the better the quality of the data obtained.
If the video signal is of poor quality (for example,
containing reflections), the tracked animals may not be
detected, or other objects (such as the reflections) may be
mistaken for the animals. A number of factors should (if
possible) be optimized:
3.1. Illumination
Lighting should be indirect and even. The level of
illumination (brightness) is not so critical, provided that
it is above the detection level of the camera (down to 0.01
lx for a good monochrome CCD camera, about 1 lx for a
color CCD camera). Monochrome cameras can also detect
light in the near infrared (to about 850 nm). However, if
one part of the arena is brighter than another, or if the
brightness changes during an experiment, this may cause
problems, both technically and because it might affect the
behavior of the mice [38]. Above all, reflections must be
avoided. The easiest way to do this is to ensure that the
light only reaches the scene by first bouncing off a mat
surface (such as the wall of the laboratory). For instance, if
an overhead light is used, a mat white wooden board can
be suspended in between the light bulb and the scene.
Alternatively the lights can be placed sufficiently far away
from the camera so that they do not produce a reflection
(see Fig. 2 [39]). If color tracking is used, the incident
light source must contain a sufficient range of wavelengths
to enable separation of the colors being tracked. Color
cameras cannot detect infrared light.

Fig. 2. Possible lighting setup for a Morris water maze (after Ref. [39]). The lights are either placed so far away from the camera that the angle is too great to
cause reflections (as in the diagram) or next to the pool and below the water level.
3.2. Marking
EthoVision can distinguish between two mice in the same
arena on the basis of size, for example an adult mouse and
a pup. To distinguish two mice that are the same size, the
researcher can color part of one of the mice the same
grayness as the background, thus reducing its apparent
size (Fig. 3). If the mouse is light-colored, dark hair dye
can be used; if it is dark, a patch can be bleached on it.
The procedure is to mix the hair bleach with dilute
hydrogen peroxide, then leave it to soak for 30 min,
and then rinse it thoroughly.
Color tracking can be used to distinguish more than two
mice (up to 16 per arena in the case of EthoVision [40]). If
the mice being tracked are the same color, livestock markers
can be used to paint blobs on the mice, which the system
will be able to track. However, these blobs will only last 1 or
2 days, so can only be used for short-term experiments.
Mice can also be colored with permanent marker pens or
highlight markers for short-term experiments. For longer-term work the mice can be dyed with a brightly colored
human hair dye. If the mouse has dark fur, it can first be
bleached before the dye is applied, using the method
described above. Colored fluorescent tags (illuminated with
a UV light) can also be used to mark the animals. It is also
possible to track mice in the dark by surgically attaching a
small plastic cap to their skulls, and filling the cap with
fluorescent dye. When tracking a relatively large number of
mice, the marker colors must be as different from each other
as possible, in terms of their hue and saturation. Colors
should also be selected to be of a similar intensity (brightness) to each other, and to be not too low an intensity (that
is, not too dull).
Fig. 3. Mouse marked with dark coloring, to distinguish it from another
mouse by reducing its apparent size.
If different colored markers are applied to different parts
of the body of an animal (head, back, and tail, for
example), these parts can be tracked individually, and
parameters such as orientation (in absolute terms of the
body axis and with respect to other animals) or head contact
can be calculated [41].
3.3. Arena size
When a video signal is digitized by a frame grabber, the
resulting grid of pixels produced for each video frame will
be of certain dimensions (determined by the hardware that
digitizes the signal). The ratio of the minimum number of
pixels of the mouse’s image (usually 3), to the number of
pixels in the entire image, determines the ratio of the
actual size of the tracked mouse to the maximum size of
the scene. If one cage is filmed, this is also the maximum
size of that cage. If more than one cage is placed within
the camera’s field of view, so that they can be analyzed
simultaneously, the maximum total of all the cages’ widths
is the maximum size of the scene. For instance if a frame
grabber has an image resolution of 576 × 768, and you are
tracking mice that are 2 cm wide, the maximum size of the
scene is 576/3 × 2 = 384 cm. If the cages are placed in a
4 × 2 grid under the camera, the external width of each
cage can be no more than 384/4 = 96 cm. In practice, if a
high-resolution frame grabber is used (such as the one in
the EthoVision system), the size of the scene tends to be
limited by how big an area the camera can ‘see’ when
suspended from the laboratory ceiling, with a lens that is
not too wide-angle (less than 4), for a given size of the
camera’s image detector (CCD).
3.4. Sampling rate

The video camera produces frames at a rate of 25 or 30 per second (depending on its TV standard: NTSC in America and Japan, PAL in most of the rest of the world). Because a mouse usually does not move a significant distance in 1/25th of a second, this sample rate can usually be reduced to 5 – 10 samples/second if a measurement such as velocity or distance moved is required (5 samples/s was used in most of the case studies detailed below).

4. The EthoVision system

The EthoVision for Windows software has a number of features that make it particularly suitable as a tool in studies such as those described in the Case Studies section (below).

4.1. Data and experiment management

Since a water maze experiment with different transgenic mouse strains tends to generate over 1000 data files, as well as their associated profiles (which contain all the settings), data management within EthoVision is particularly important. The EthoVision Workspace Explorer (Fig. 5) enables the user to manage these files through a simple interface (similar to the Windows Explorer), so that, for instance, settings defining the acquisition method (the tracking profile) can be dragged and dropped from one experiment to another. An entire experiment or workspace can be backed up to a single file to facilitate both safekeeping of the data and transfer between different computers.

In addition, users are able to find their way around the parts of the program by means of the Experiment menu. They can use the program's functions in a logical order, starting with experiment design and arena and zone definition, then data acquisition, followed by track visualization, and finally data analysis.

4.2. Independent variable definition

As well as tracking objects, EthoVision allows a researcher to define a complete experimental protocol, in terms of the independent variables of an experiment, and their values. Up to 99 independent variables (such as treatment, room temperature, genetic strain of the mouse) can be defined, and in addition EthoVision automatically records a series of system independent variables such as the profile (setting file) used, time and date of the trial, trial duration, etc. These independent variables can be used to select, sort, and group data, both when plotting tracks and analyzing the data. For instance, you can select to plot all the tracks from mice of a particular genotype, or calculate the mean time taken to reach the target of a water maze by control mice compared with treated mice.

4.3. Object detection methods

When studying different mouse strains, the object detection method is very important; each method has its advantages and disadvantages. EthoVision offers three detection methods. Gray scaling is the fastest. It defines all connected pixels that are both darker than a high gray scale threshold and brighter than a low threshold as the animal, and all other pixels as the background. The thresholds can be set either manually by the user, or calculated automatically by the program. It is a fast method (giving the highest sample rate), but cannot be used if the same gray scale values are present both in the mouse's image and the background. The subtraction method first makes a reference image, with no mice present, then subtracts the gray scale value of each pixel of the reference image from the equivalent pixel of the live image. Any pixels which have a subtracted value other than zero are different between the reference and live images, due to the presence of the mouse. The user can define whether the mouse has to be lighter than or darker than the background, or just different. This method is more tolerant of differences in light intensity across the scene. The third detection method uses the hue and saturation of the tracked
mouse (or a marker painted on it) to define its color, and
pixels with the defined colors are tracked. Up to 16 different
colors can be tracked in each arena at once. With all three
methods, objects that are either smaller than the mouse
(such as droppings) or larger (such as reflections) can be
excluded on the basis of their size.
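As an illustration of the gray scaling and subtraction methods described above, the following is a simplified sketch using NumPy, not EthoVision's actual implementation; all function names and threshold values are assumptions:

```python
import numpy as np

def detect_gray_scaling(frame, low, high, min_area=5, max_area=500):
    """Gray scaling: pixels whose gray value lies between the low and high thresholds
    are candidate animal pixels; detections outside the size range are discarded."""
    mask = (frame >= low) & (frame <= high)
    return _filter_by_size(mask, min_area, max_area)

def detect_subtraction(frame, reference, delta=20, min_area=5, max_area=500):
    """Subtraction: pixels that differ from a reference image (taken with no animal
    present) by more than `delta` gray values are candidate animal pixels."""
    diff = np.abs(frame.astype(int) - reference.astype(int))
    return _filter_by_size(diff > delta, min_area, max_area)

def _filter_by_size(mask, min_area, max_area):
    """Very crude size filter: reject the detection if it is too small (droppings) or
    too large (reflections). A real system would label connected components
    individually before filtering."""
    area = int(mask.sum())
    return mask if min_area <= area <= max_area else np.zeros_like(mask)

def object_center(mask):
    """Mathematical center (centroid) of the detected pixels, or None if nothing detected."""
    ys, xs = np.nonzero(mask)
    if len(xs) == 0:
        return None
    return float(xs.mean()), float(ys.mean())
```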
4.4. Arena and zone definitions
Up to 16 enclosures can be placed under one camera, and
each enclosure can be treated as a separate independent
replicate (called an arena; see Fig. 4). The animals in these
arenas can be tracked simultaneously. Multiple arena definitions and zone definitions can be defined for each experiment. This is particularly handy when, for instance, a water
maze experiment is prepared, with multiple platform positions. When setting up the experiment, the user can assign
the proper arena definition to each track file before the
experiments are started. The user can define up to 99 regions
of interest (zones; see Fig. 4). These can be used in the
analysis (for example, the time taken for a mouse to reach
the platform of a water maze, see Case study 1) and also for
automatic start and stop conditions. For instance the trial can
be stopped automatically when the mouse has reached the
platform of a water maze. Zones can be added together to
make a cumulative zone (for instance, if a maze has more
than one target). Zones can also be defined as hidden zones,
for use with nest boxes, burrows, etc. The system assumes
that when an animal disappears from view and it was last
seen adjacent to a hidden zone, it is inside the hidden zone.
Fig. 4. Arenas and zones of a Morris water maze. The arena is the circular area bounding the outside of the water maze, and it is divided up into four
zones (‘‘quadrants,’’ see Case Study 3) labeled North, South, East, and West.
A.J. Spink et al. / Physiology & Behavior 73 (2001) 731–744
All the zones can be defined and altered either before or after
data acquisition, allowing iterative exploratory data analysis
of the effects of changing zone positions and shapes.
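The zone logic described above can be sketched as follows. This is a hedged illustration rather than EthoVision code: the quadrant convention assumes y increases upwards, and the function names are invented for this example:

```python
import math

def in_circular_arena(x, y, cx, cy, radius):
    """True if the point lies inside a circular water-maze arena centered at (cx, cy)."""
    return math.hypot(x - cx, y - cy) <= radius

def quadrant(x, y, cx, cy):
    """Which quadrant ('N', 'E', 'S', 'W') of the arena a point falls in, taking the
    arena center as the origin (assumes y increases upwards)."""
    angle = math.degrees(math.atan2(y - cy, x - cx)) % 360
    if 45 <= angle < 135:
        return "N"
    if 135 <= angle < 225:
        return "W"
    if 225 <= angle < 315:
        return "S"
    return "E"

def infer_hidden_zone(last_visible_xy, hidden_zones):
    """Hidden-zone rule described in the text: when the animal disappears from view,
    assume it is inside the hidden zone it was last seen next to (if any).
    `hidden_zones` maps a zone name to a function contains(x, y)."""
    x, y = last_visible_xy
    for name, contains in hidden_zones.items():
        if contains(x, y):
            return name
    return None
```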
The arena can be calibrated so that the parameters such as
velocity are calculated in meaningful units. If more than one
arena is used, they can each be calibrated separately, or one
calibration can be used for all arenas. There are two
calibration methods; in standard calibration the user measures a series of lines (for example, a meter ruler placed on
the floor of a cage). If the camera image is distorted (by a
fish-eye lens, or a lens that is not directly overhead) the
standard calibration will not be accurate (the standard
deviation of the calibration factors is given to enable the
user to determine this), and the user can opt for advanced
calibration. In that method a grid of points is mapped onto
the arena, and a series of coordinate points are entered,
giving a calibration that is accurate for the entire image,
even when it is distorted.
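A minimal sketch of the standard-calibration idea follows; the numbers and function names are assumptions for illustration, and the advanced calibration would instead fit a mapping to a grid of reference points rather than a single factor:

```python
import statistics

def calibration_factor(measured_lines):
    """Standard calibration: each entry pairs a line's true length (cm) with its length
    in pixels. Returns the mean cm-per-pixel factor and its standard deviation, which
    indicates whether the image is too distorted for a single factor to be accurate."""
    factors = [true_cm / pixels for true_cm, pixels in measured_lines]
    mean = statistics.mean(factors)
    sd = statistics.stdev(factors) if len(factors) > 1 else 0.0
    return mean, sd

# A meter ruler measured in three places across the image (hypothetical pixel lengths)
mean, sd = calibration_factor([(100.0, 310), (100.0, 305), (100.0, 298)])
if sd / mean > 0.05:
    print("Large spread in calibration factors; consider grid-based (advanced) calibration.")
```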
4.5. Behavior recording
In addition to automatically tracking the movement of the
animal, EthoVision can automatically detect certain behaviors, such as rearing. EthoVision also allows the researcher
to manually record (by keystroke or mouse-click) behaviors
that cannot be detected automatically (such as sniffing).
4.6. Image filtering
As in most video tracking systems, EthoVision calculates
the mathematical point at the center of the digitized picture
of the animals’ body, including the tail. This means this
point is further towards the mouse’s posterior than is correct.
Furthermore, a tracking system will still detect movement of
the mouse when only the tail is moving. In order to prevent
this, the tail can be removed from the image, using image
filtering. The bars of a cage can also be filtered using a
similar technique.
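EthoVision's own filtering algorithm is not described here; as one plausible illustration, a morphological opening removes thin structures such as the tail before the center point is calculated. The sketch below uses SciPy, and the radius value is an assumption:

```python
import numpy as np
from scipy import ndimage

def remove_tail(mask, radius=3):
    """Morphological opening with a disk-like structuring element: thin structures
    (such as a tail only a few pixels wide) are removed, while the body, which is
    wider than 2 * radius pixels, survives largely intact."""
    y, x = np.ogrid[-radius:radius + 1, -radius:radius + 1]
    disk = x * x + y * y <= radius * radius
    return ndimage.binary_opening(mask, structure=disk)

def body_center(mask):
    """Centroid of the detected pixels after tail removal, or None if nothing remains."""
    ys, xs = np.nonzero(remove_tail(mask))
    return (float(xs.mean()), float(ys.mean())) if len(xs) else None
```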
4.7. Track visualization
The user can specify data sets at several hierarchical
levels for the analysis and visualization of the data. The
data can be selected, sorted, and grouped by independent
variables. This way a selection of the tracks in an experiment can be plotted on a single screen, by treatment or
genetic strain (see Fig. 5). The track plots can be displayed
with or without the arena and zones or the background
image of the experimental setup. The plots can be exported
as graphic files in a variety of standard formats.

Fig. 5. This figure displays the EthoVision Workspace Explorer in the left panel, and in the right panel eight tracks of a water maze experiment are plotted. The
tracks are sorted by animal ID and trial number. As well as plotting the complete tracks, the user can also play the plots back in a customized way.
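As an illustration of this kind of grouped track plot (not part of EthoVision; the data structures and matplotlib usage are assumptions for the sketch):

```python
import matplotlib.pyplot as plt

def plot_tracks(tracks, group_of):
    """Plot each track (a list of (x, y) points), colored and labeled by the value of
    an independent variable such as genotype or treatment. `tracks` maps a track ID
    to its points; `group_of` maps a track ID to its group label."""
    colors, labeled = {}, set()
    for track_id, points in tracks.items():
        group = group_of[track_id]
        color = colors.setdefault(group, f"C{len(colors)}")  # one matplotlib color per group
        xs, ys = zip(*points)
        plt.plot(xs, ys, color=color, alpha=0.7,
                 label=None if group in labeled else group)
        labeled.add(group)
    plt.gca().set_aspect("equal")   # keep the arena's shape undistorted
    plt.legend()
    plt.show()
```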
4.8. Data analysis
After initially selecting the tracks for analysis (using
independent variables), the user can carry out further data
selection by nesting over time-windows, zones, and behaviors (either ones automatically detected by the system, or
behaviors entered by the researcher manually). The user can
also group by the values of independent variables (such as
treatment values), and further fine-tune the data using filter
settings like minimal distance moved, to eliminate slight
‘‘apparent’’ movements, and down-sampling steps, to eliminate redundant data if a lower sample rate describes the
path better. A wide range of quantitative measures of
behavior is available in EthoVision, and these can be
described with a full range of descriptive statistics:
Distance moved — For example, the total path distance followed by a mouse in a water maze until it reached
the target.
Velocity — For example, the mean speed of the mouse
whilst swimming in a water maze.
In zone — For example, the time taken for a mouse to
reach the target of a maze. The zones can be completely
redefined at any time before or after data acquisition.
Distance to zone and Distance to zone border — For
example, the average distance that a mouse was from the
walls of an open field, except when it was near a novel object.
Heading — For example, the mean direction of a
mouse’s path in relation to a maze’s target.
Turn angle — A large mean and variance in the
turn angle may indicate stereotypical behavior such as
circling movements.
Angular velocity — High values of absolute angular
velocity can be used to indicate a variety of behaviors
including local searching patterns, stereotypic movements
and abnormal behaviors, reaction to toxins, and measures of
behavioral asymmetries in animals with unilateral brain
lesions [42,43].
Meander — The change in direction of movement of
an object relative to the distance it moves. Therefore it can
be used to compare the amount of turning of objects
travelling at different speeds.
Sinuosity — The amount of random turning in a given
path length. This is one value for the entire track and can be
used, for instance, to quantify the foraging behavior of an
animal [44,45].
Moving — Whether or not the mouse is moving,
compared to user-defined thresholds. For example, the
percentage of the time that the mouse in a water maze
was floating or the number of bouts of movement whilst the
mouse was in the central zone of an open field.
Rearing — A useful parameter for assessing exploratory
behavior, for example the frequency of rearing in an open
field with and without a novel object. EthoVision detects
rearing on the basis of the change in surface area of the
mouse (as seen from above), using user-defined thresholds.
Distance between objects — In EthoVision, the user
can select as many combinations of actors and receivers as
wanted to calculate social interactions, including this parameter and the ones listed below (which are based on the
distance between objects). Both other animals and zones can
be selected as receivers of social interactions.
Relative movement — Whether a mouse is moving
towards or away from another mouse. This is a good
example of providing an objective measure of behavior,
which can later be explicitly interpreted as aggression or
avoidance [46].
Proximity — Whether a mouse is close to or far away
from another mouse (in relation to user-defined thresholds).
This parameter has been used to study the effect of individual or group housing on social behavior [46] and social
isolation as a symptom of schizophrenia [47,48].
Speed of moving to/from — The distance-weighted
speed at which an object moves towards or away from
another object. These parameters can be used to quantify the
intensity of avoidance or aggression.
Net relative movement — A combined measure for the
above two parameters, this parameter is also frequently used
in studies of social interaction [46,49].
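To make the path-shape parameters above concrete, the following sketch shows one way heading, turn angle, angular velocity, and meander could be computed from consecutive (t, x, y) samples; the exact definitions used by EthoVision may differ, and the function names are illustrative:

```python
import math

def headings(track):
    """Direction of movement (degrees) between consecutive (t, x, y) samples."""
    return [
        math.degrees(math.atan2(y2 - y1, x2 - x1))
        for (_, x1, y1), (_, x2, y2) in zip(track, track[1:])
    ]

def turn_angles(track):
    """Change in heading between consecutive steps, wrapped to [-180, 180) degrees."""
    h = headings(track)
    return [((b - a + 180) % 360) - 180 for a, b in zip(h, h[1:])]

def angular_velocity(track):
    """Turn angle per unit time (degrees/s) for each step."""
    dt = [t2 - t1 for (t1, _, _), (t2, _, _) in zip(track, track[1:])]
    return [ta / d for ta, d in zip(turn_angles(track), dt[1:]) if d > 0]

def meander(track):
    """Turn angle per unit distance moved (degrees/cm) for each step."""
    steps = [
        math.hypot(x2 - x1, y2 - y1)
        for (_, x1, y1), (_, x2, y2) in zip(track, track[1:])
    ]
    return [ta / s for ta, s in zip(turn_angles(track), steps[1:]) if s > 0]
```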
The analysis report is a spreadsheet that can be manipulated for optimal presentation and exported in different file
formats (or copied and pasted) to other spreadsheets or
statistics packages for further analysis. The parameters can
be calculated for every individual sample, and their statistics calculated both per track and across groups of tracks
with the same values of user-defined independent variables
(for example, all tracks of mice receiving the same dosage
of a treatment).
5. Case studies
The following case studies are presented in order to
illustrate the use of EthoVision for detection and quantification of various behaviors in mice. Therefore, neither full
details of the experimental setups nor the full conclusions drawn
from the data are given; the reader is referred to the
researchers or the original publications for that information
[55,70]. However, the raw data from some of these case
studies have been placed on the web; follow the link from
http://www.noldus.com/products/ethovision/. The data can
be downloaded and opened in EthoVision for visualization
and analysis.
5.1. Effect of genetic composition on open field behavior of
mice. Behavioral phenotyping of the effects of disabling the
Munc 18-1 gene
The following case study is based on unpublished data
kindly provided by R.A. Hensbroek and B.M. Spruijt
(Rudolf Magnus Institute for Neuroscience, Utrecht University). For further details about the experimental protocol, please contact them at: [email protected].
Mice that lack the gene for munc18-1 have no secretion
of neurotransmitters and consequently die at birth [50].
Heterozygotes still have one functional copy of the gene
and a 50% reduction in munc18-1 protein levels. Earlier
work had indicated that heterozygotes were more active
than wild type mice, but it was not clear whether this was
due to the munc18-1 deletion, or due to differences in
genetic background. Heterozygotes were therefore repeatedly backcrossed with the inbred strain 129/Sv. The
resulting mice were also crossed once with C57Bl/6. The
effects of the munc18-1 deletion could now be tested both
in a purely 129/Sv background and in hybrids of 129/Sv
and C57Bl/6.
The mice were placed in an empty circular open field for
15 min, firstly when the field was empty, and secondly with
a 10 × 5 cm metallic cylinder in the middle. Heterozygotes
were more active than wild types, both for the hybrids and
129/Sv mice (Fig. 6). The genetic background had a clear
effect on the behavior of the mice: in 129/Sv, heterozygotes
spent more time in the central area of the open field, while
such a difference was not seen in hybrids.
Fig. 6. The behavior of mice with various genotypes (genetic background and munc18-1 mutation) in an open field expressed as total distance moved and time spent in the central area. Thick shading represents heterozygotes and thin shading wild types. Left-slanting shading represents 129/Sv mice, right-slanting shading represents hybrids. The bars represent ± 1 standard error of the mean.

5.2. Open field exploration strategies in mice selected for short and long attack latency
The following case study is based on unpublished data
kindly provided by M.C. de Wilde and J.M. Koolhaas
(University of Groningen). For further details about the
experimental protocol, please contact them at: [email protected] or [email protected].
Male wild house mice selected for short and long attack
latency (SAL and LAL, respectively) differ in their way of
coping with environmental challenges [51,52]. SAL males
show an active behavioral response whereas LAL males
show a reactive coping strategy. For instance, in the defensive burying test SAL males actively cover the shock prod
with bedding material and LAL males freeze in a corner of
the cage. Several studies suggest that the coping styles have
a differential fitness in nature, depending on environmental
conditions such as food availability [53,54]. The current
experiment studied the exploratory behavior of SAL and
LAL mice in an open field.
The LAL mice (white bars in Fig. 7) were more active
than the SAL genotype, moving a greater total distance, and
exploring the outer zone more than the SAL mice, whereas
the distance traveled in the inner zone was the same for the
two groups. The SAL animals showed no difference in
behavior in the second time period, compared with the first
time period, whereas the LAL mice were significantly more
active after the first 5-min period was over, moving a greater
distance in both the inner zone and outer zone.
These data are consistent with other work describing the
two strategies. SAL mice continue to exhibit a behavior that
they have found to be successful, whereas LAL mice will
develop new behaviors with time. In the beginning the SAL
and LAL mice explore the open field in a similar way, and
after 5 min the SAL mice continue to explore the arena in
that way. In contrast, the LAL mice develop a new exploratory strategy, both moving more, and spending more time
in the inner zone as they adapt to the new situation in the
second period.
Fig. 7. Distance moved by short attack latency (n = 16, black bars, ± standard error of the mean [S.E.M.]) and long attack latency (n = 12, white bars, ± S.E.M.) wild male house mice in an open field experiment.
5.3. The effects of CREB gene dosage on the performance of
mice in the water maze learning task
The following case study was based on data kindly
provided by Dr. D. Wolfer and his colleagues (Department
of Anatomy, University of Zurich). For details, please see
the publication in Learning & Memory [55].
The Morris water maze task is one of the most frequently
used experimental paradigms designed to assess cognitive
performance [56]. EthoVision is used by several groups to
track and analyze water maze experiments (e.g., Refs. [57 –
60]). A tank is filled with opaque water (chalk or milk
powder is added) and the animal swims until it locates a
hidden platform just below the surface of the water. Its
speed of learning can be assessed by how rapidly the time to
locate the platform is reduced when the task is repeated.
Most studies have used rats as subjects, whereas mice have
been less frequently used [61,62]. Mice have been shown to
produce a normal learning curve [62]. The behavior of mice
in tasks of learning and memory is less well characterized than in rats. In view of the development of transgenic mouse models with predefined deficiencies as new tools to identify putative cognition enhancers, it is important to conduct preliminary tests on the wild type strains that are used to produce these transgenic animals, to calibrate the behavioral paradigm before testing of the transgenic or knockout mice begins.
The transcription factor CREB is involved in the
formation and retention of long-term memory [63,64].
Although learning in mice is strongly influenced by the
genetic background of the mice [65,66], early water maze
studies did not take this into account. The current study
aimed to validate previous studies using mice with well-defined genetic backgrounds (see the original paper [55] for details) and with different dosages of the CREB gene.
Several variables were calculated from the animals’
tracks (the EthoVision data were processed using the
WinTrack program [67]), see Table 1. It was found that
mice with one hypomorphic CREB allele (αΔ) and one null CREB allele (CREBcomp, in which all CREB isoforms are disrupted) showed statistically significant (Scheffé test) defects for most measures of performance in the water maze (compared with wild type mice), whereas mutants with the α and Δ isoforms disrupted and the β isoform up-regulated showed less disruption. Previous
studies [68,69] reported major deficits for mice with these
genetic defects in all paradigms tested. It is likely that the
difference is due to the differing genetic background of the
mice used in different studies.
Table 1
Variables calculated from the animals' tracks in the water maze experiment in Case Study 3

Variable                       WT vs. αΔ (P)    WT vs. comp (P)
Swim time                      n.s.             < .05
Swim path length               n.s.             < .01
% time in goal zone            n.s.             < .05
Mean distance to goal          < .05            < .002
Swimming parallel to wall      < .05            < .001
Swimming < 22 mm from wall     < .05            < .002
Number of wall contacts        n.s.             n.s.
Swimming speed                 n.s.             n.s.

Table legend: n.s. = not significant (Scheffé test), WT = wild type, αΔ = mice with the α and Δ isoforms disrupted, comp = mice with all CREB isoforms disrupted.
5.4. Learning behavior of mice in a water maze with
the trkB gene knocked out in the forebrain during
postnatal development
The following case study was based on data kindly
provided by Dr. D. Wolfer (Department of Anatomy, University of Zurich). For details, please see the publication in
Neuron [70].
The TrkB receptor for brain-derived neurotrophic factor
regulates short-term synaptic functions and long-term potentiation of brain synapses [71]. Mice were studied which had
had the trkB gene knocked-out in the forebrain during
postnatal development. The mice developed without gross
morphological defects (the mice do not survive until adulthood if the gene is knocked-out for the entire brain [72]),
and the activity of the trkB-mice in an open field was
similar to that of wild-type mice. However, in a Morris water maze
experiment (similar to that of Case study 1), although the
control mice learnt the route to the hidden platform (left
track in Fig. 8), the mutant mice tended to hug the walls of
the maze (thigmotaxis) and therefore did not find the platform (right track in Fig. 8).
Fig. 8. Two typical tracks of wild-type (right) and trkB-mice in the Morris water maze. The control mouse finds the hidden platform (the top-left square), on the basis of previous training, whereas the mutant mouse stays close to the edges of the tank (the gray circle).
6. Discussion and conclusions
The four case studies illustrate different aspects of using
EthoVision to track mice. The first case study (knock-out of
the Munc 18-1 gene) looked at mice in an open field which
were hypothesized to be hyperactive as a result of the
knocked-out gene. Plots of two simple variables quantifying
the mice’s behavior demonstrated the experiment’s effects.
Simple, straightforward parameters such as distance moved
or time in center can be powerful measures to quantify
differences in behavior between different genotypes. Likewise the case study of SAL and LAL mice (Case study 2)
showed clear differences in behavior between the genotypes
on the basis of a variable (total distance moved) calculated
from the EthoVision tracks, even though the actual differences in behavior were rather complex. The Morris water
maze study on mice with differing CREB gene dosage
illustrated that there are occasions when one or two variables are
insufficient to provide a good description of the experimental effects. It was only when a batch of variables was
calculated that the interaction between the mice’s genetic
background and specific effects of the CREB gene became
clear. An advantage of measuring behavior using a video
tracking system is that once the tracks of the animals have
been derived, a large number of dependent variables can
easily be calculated from those tracks. Finally, the study of
mice with the trkB gene disabled in the forebrain illustrated
the fact that although proper statistical analysis is, of course,
necessary, much valuable information, especially about
exactly what to test, can be gained from viewing the plots
of the animals’ tracks.
The case studies demonstrate that video tracking provides
an accurate and powerful means to quantify mouse behavior.
Because it is repeatable and not subject to observer variance,
it is eminently suitable for comparison of different data sets
between different research groups. The case studies shown
are all examples of relatively straightforward uses of a
system like EthoVision. Even with the current system
(EthoVision 2), more complex experiments and analyses
can be carried out. In the future, EthoVision and other video
tracking systems will be able to carry out more complex
tasks. An indication of what these might be can be seen from
work that has been carried out in research projects, or from
customizations of the software. Whether these functions
ever become part of the standard commercial package will
depend on both how technically feasible the solution is, and
how much interest there is in that application.
Synchronization of video (behavioral) and physiological data. It is often the case that an experiment involves not
only quantifying the behavior of an animal, but also
measuring a variety of physiological parameters [73]. In
order to carry out a full statistical analysis of the data
obtained, it is necessary for the various data streams to be
synchronized. At the time of writing (November 2000) there
is no straightforward and uniform solution to this problem.
However, Noldus Information Technology, together with
several partners, is carrying out a research project to resolve
this issue [74].
Direct extraction of physiological data from video
data. A customized version of EthoVision has been developed which uses an infrared thermographic camera both to
record an animal’s behavior, and simultaneously to measure
its body temperature (at several points on the body simultaneously) [75]. The method is noninvasive and thus particularly appropriate for monitoring the effects of stress.
Automatic detection of complex behaviors. EthoVision
is able to detect whether a mouse is moving, or whether it is
rearing, but (in common with other commercial systems) is
not able to automatically detect other behaviors (unless, of
course, it is possible to define them in terms of the existing
parameters). A behavior such as an epileptic seizure (characterized by the animal turning over with the tail in a
characteristic position over the body) can be detected by
using EthoVision in combination with neural networks [76]
and statistical classification [77].
3-D tracking. If more than one camera is used (or one
camera and a mirror [18]), it is in theory possible to obtain
information about the movement of animals in all three
dimensions. Although tracking in 3-D was already possible
with some simple hardware devices [13] and more primitive video-based systems [19], ironically the more sophisticated the tracking system, the more difficult this
becomes. However, a special version of EthoVision has
been used to successfully track objects moving in three
dimensions [78].
Identification of large and indeterminate numbers of
animals. To track a large and continually varying number of
animals, especially if crowded close together, calls for quite
different techniques than the object identification methods
used by EthoVision. Some progress has been made using
techniques such as active modeling and prediction of the
animals’ shapes [79,80] and dynamic imaging with statistical analysis of detected body features [81].
Acknowledgments
We are grateful to those who provided us with their data
for use in the case studies section of this paper. All members
of the EthoVision team at Noldus Information Technology
have contributed towards the product since it was initially
launched in 1993; their hard work and dedication are
acknowledged here.
References
[1] Noldus LPJJ. The Observer: a software system for collection and analysis of observational data. Behav Res Methods, Instrum Comput
1991;23:415 – 29.
[2] Noldus LPJJ, Trienes RJH, Hendriksen AH, Jansen H, Jansen RG.
The Observer Video-Pro: new software for the collection, management, and presentation of time-structured data from videotapes and
digital media files. Behav Res Methods, Instrum Comput 2000;32:
197 – 206.
[3] Clarke RL, Smith RF, Justesen DR. An infrared device for detecting
locomotor activity. Behav Res Methods, Instrum Comput 1985;17:
519 – 25.
[4] Robles E. A method to analyze the spatial distribution of behavior.
Behav Res Methods, Instrum Comput 1990;22:540.
[5] Kirkpatrick T, Schneider CW, Pavloski R. A computerized infrared
monitor for following movement in aquatic animals. Behav Res Methods, Instrum Comput 1991;23:16 – 22.
[6] Gapenne O, Simon P, Lannou J. A simple method for recording the
path of a rat in an open field. Behav Res Methods, Instrum Comput
1990;22:443 – 8.
[7] Silverman R, Chang AS, Russell RW. A microcomputer-controlled
system for measuring reactivity in small animals. Behav Res Methods,
Instrum Comput 1988;20:495 – 8.
[8] Szalda-Petree AD, Karkowski AM, Brooks LR, Haddad NF. Monitoring running-wheel movement using a serial mouse and an IBM-compatible system. Behav Res Methods, Instrum Comput 1994;26:
54 – 6.
[9] Clarke RL, Smith RF, Justesen DR. A programmable proximity-contact sensor to detect location or locomotion of animals. Behav Res
Methods, Instrum Comput 1992;24:515 – 8.
[10] Tarpy RM, Murcek RJ. An electronic device for detecting activity in
caged rodents. Behav Res Methods, Instrum Comput 1984;16:383 – 7.
[11] Akaka WH, Houck BA. The use of an ultrasonic monitor for recording
locomotor activity. Behav Res Methods, Instrum Comput 1980;12:
514 – 6.
[12] Martin PH, Unwin DM. A microwave Doppler radar activity monitor.
Behav Res Methods, Instrum Comput 1988;20:404 – 7.
[13] Brodkin J, Nash JF. A novel apparatus for measuring rat locomotor
behavior. J Neurosci Methods 1995;57:171 – 6.
[14] Skinner BF. The behavior of organisms. New York: Appleton-Century-Crofts, 1938.
[15] Morrison SK, Brown MF. The touch-screen system in the pigeon
laboratory: an initial evaluation of its utility. Behav Res Methods,
Instrum Comput 1990;22:1236 – 66.
[16] Sahgal A, Steckler T. TouchWindows and operant behaviour in rats.
J Neurosci Methods 1994;55:59 – 64.
[17] May CH, Sing HC, Cephus R, Vogel S, Shaya EK, Wagner HN. A
new method of monitoring motor activity in baboons. Behav Res
Methods, Instrum Comput 1996;28:23 – 6.
[18] Pereira P, Oliveira RF. A simple method using a single video camera
to determine the three-dimensional position of a fish. Behav Res
Methods, Instrum Comput 1994;26:443 – 6.
[19] Morrel-Samuels P, Krauss RM. Cartesian analysis: a computer – video
interface for measuring motion without physical contact. Behav Res
Methods, Instrum Comput 1990;22:466 – 70.
[20] Santucci AC. An affordable computer-aided method for conducting
Morris water maze testing. Behav Res Methods, Instrum Comput
1995;27:60 – 4.
[21] Vorhees CV, Acuff-Smith KD, Minck DR, Butcher RE. A method for
measuring locomotor behavior in rodents: contrast-sensitive computer-controlled video tracking activity assessment in rats. Neurotoxicol Teratol 1992;14:43 – 9.
[22] Klapdor K, Dulfer BG, van der Staay FJ. A computer-aided method to
analyse foot print patterns of rats, mice and humans. Poster presented
at Measuring Behavior ’96, International Workshop on Methods and
Techniques in Behavioral Research, 16 – 18 October 1996, Utrecht,
the Netherlands, 1996.
[23] Olivo RF, Thompson MC. Monitoring animals’ movements using
digitized video images. Behav Res Methods, Instrum Comput
1988;20:485 – 90.
[24] Derry JF, Elliott JH. Automated 3-D tracking of a video-captured
movement using the example of an aquatic mollusk. Behav Res Methods, Instrum Comput 1997;29:353 – 7.
[25] Hartmann MJ, Assad C, Rasnow B, Bower JM. Applications of video
mixing and digital overlay to neuroethology. Methods. Companion
Methods Enzymol 2000;21:385 – 91.
[26] Rasnow B, Assad C, Hartmann MJ, Bower JM. Applications of multimedia computers and video mixing to neuroethology. J Neurosci
Methods 1997;76:83 – 91.
[27] Martin BR, Prescott WR, Zhu M. Quantification of rodent catalepsy
by a computer-imaging technique. Pharmacol, Biochem Behav 1992;
43:381 – 6.
[28] Spruijt BM, Gispen WH. Prolonged animal observations by use of
digitized videodisplays. Pharmacol, Biochem Behav 1983;19:
765 – 9.
[29] Buresova O, Bolhuis JJ, Bures J. Differential effects of cholinergic
blockade on performance of rats in the water tank navigation task and
in a radial water maze. Behav Neurosci 1986;100:476 – 82.
[30] Spruijt BM, Pitsikas N, Algeri S, Gispen WH. Org2766 improves
performance of rats with unilateral lesions in the fimbria fornix in a
spatial learning task. Brain Res 1990;527:192 – 7.
[31] Wahlsten D. Standardizing tests of mouse behaviour: reasons, recommendations, and reality (submitted for publication).
[32] Spruijt BM, Buma MOS, van Lochem PBA, Rousseau JBI. Automatic
behavior recognition; what do we want to recognize and how do we
measure it? Proceedings of Measuring Behavior ’98 (Groningen, 18 –
21 August) 1998;264 – 6.
[33] Brown RE, Stanford L, Schellinck HM. Developing standardized behavioral tests for knockout and inbred mice. ILAR J 2000;41:163 – 74.
[34] Drai D, Elmer G, Benjamini Y, Kafkafi N, Golani I. Phenotyping of
mouse exploratory behavior. Proceedings of Measuring Behavior
2000 (Nijmegen, 15 – 18 August) 2000;86 – 8.
[35] Buma M, Smit J, Noldus LPJJ. Automation of behavioral tests using
digital imaging. Neurobiology 1997;4:277.
[36] Endler JA. On the measurement and classification of colour in studies
of animal colour patterns. Biol J Linn Soc 1990;41:315 – 52.
[37] Hurlbert SH. Pseudoreplication and the design of ecological field
experiments. Ecol Monogr 1984;54:187 – 211.
[38] Fentrop N, Wotjak CT. Fiat lux! Spotting a common experimental
problem. Proceedings of Measuring Behavior 2000 (Nijmegen,
15 – 18 August) 2000;108.
[39] Evets H, Koolhaas J. Still waters run deep: EthoVision and the Morris
water maze. Noldus News 1999;6(1):3 (February).
[40] Spink AJ, Buma MOS, Tegelenbosch RAJ. EthoVision color identification: a new method for color tracking using both hue and saturation. Proceedings of Measuring Behavior 2000 (Nijmegen, 15 – 18
August) 2000;295 – 6.
[41] Šustr P, Špinka M, Newberry RC. Automatic computer analysis of pig
play. Proceedings of Measuring Behavior 2000 (Nijmegen, 15 – 18
August) 2000;307 – 8.
[42] Bjerselius R, Olsen KH. Behavioural and endocrinological responses
of mature male goldfish to the sex pheromone 17α,20β-dihydroxy-4-pregnen-3-one in the water. J Exp Biol 1995;198:747 – 54.
[43] Schwarting RKW, Goldenberg R, Steiner H, Fornaguera J, Huston JP.
A video image analyzing system for open-field behavior in the rat
focusing on behavioral asymmetries. J Neurosci Methods 1993;49:
199 – 210.
[44] Bovet P, Benhamou S. Spatial analysis of animals’ movements using a
correlated random walk model. J Theor Biol 1988;131:419 – 33.
[45] Bovet P, Benhamou S. Optimal sinuosity in central place foraging
movements. Anim Behav 1991;42:57 – 62.
[46] Spruijt BM, Hol T, Rousseau JBI. Approach, avoidance and contact
behavior of individually recognized animals automatically quantified
with an imaging technique. Physiol Behav 1992;51:747 – 52.
[47] Sams-Dodd F. Automation of the social interaction test by a video
tracking system: behavioural effects of repeated phencyclidine treatment. J Neurosci Methods 1995;59:157 – 67.
[48] Sams-Dodd F. The effects of dopamine agonists and antagonists on
PCP-induced stereotyped behaviour and social isolation in rats. Behav
Pharmacol 1996;7(Suppl. 1):99.
[49] Hol T, Spruijt BM. The MSH/ACTH(4 – 9) analog Org2766 counteracts isolation-induced enhanced social behavior via the amygdala.
Peptides 1992;13:541 – 4.
[50] Verhage M, Maia AS, Plomp JJ, Brussaard AB, Heeroma JH, Vermeer H, Toonen RF, Hammer RE, Van den Berg TK, Missler M,
Geuze HJ, Südhof TC. Synaptic assembly of the brain in the absence
of neurotransmitter secretion. Science 2000;287:864 – 9 (Feb. 4).
[51] Sluyter F, Bult A, Meeter F, Van Oortmerssen GA, Lynch CB. A
comparison between F1 reciprocal hybrids of house mouse lines bidirectionally selected for attack latency or nest-building behavior: no Y
chromosomal effects on alternative behavioral strategies. Behav Genet
1997;27:477 – 82.
[52] Sluyter F, Korte SM, Van Baal GCM, De Ruiter AJH, Van Oortmerssen GA. Y chromosomal and sex effects on the behavioral stress
response in the defensive burying test in wild house mice. Physiol
Behav 1999;67:579 – 85.
[53] Monahan EJ, Maxson SC. Y chromosome, urinary chemosignals, and
an agonistic behavior (offense) of mice. Physiol Behav 1998;64:
123 – 32.
[54] Bult A, Lynch CB. Breaking through artificial selection limits of an
adaptive behavior in mice and the consequences for correlated responses. Behav Genet 2000;30:193 – 206.
[55] Gass P, Wolfer DP, Balschun D, Rudolph D, Frey U, Lipp HP, Schütz
G. Deficits in memory tasks of mice with CREB mutations depend on
gene dosage. Learning & Memory 1998;5:274 – 88.
[56] Morris R. Development of a water-maze procedure for studying spatial learning in the rat. J Neurosci Methods 1984;11:47 – 60.
[57] Oitzl MS. Behavioral approaches to study function of corticosteroids
in brain. Methods Neurosci 1994;22:483 – 95.
[58] Everts HGJ, Koolhaas JM. Differential modulation of lateral septal
vasopressin receptor blockade in spatial learning, social recognition, and anxiety-related behaviors in rats. Behav Brain Res
1999;99:7 – 16.
[59] Leggio MG, Neri P, Graziano A, Mandolesi L, Molinari M, Petrosini
L. Cerebellar contribution to spatial event processing: characterization
of procedural learning. Exp Brain Res 1999;127:1 – 11.
[60] Dere E, Frisch C, De Souza Silva MA, Huston JP. Improvement and
deficit in spatial learning of the eNOS-knockout mouse (eNOS−/−)
dependent on motivational demands of the task; water maze versus
radial maze. Proceedings of Measuring Behavior 2000 (Nijmegen,
15 – 18 August) 2000;79.
[61] Paylor R, Baskall L, Wehner JM. Behavioural dissociations between
C57/BL6 and DBA/2 mice on learning and memory tasks: a hippocampal-dysfunction hypothesis. Psychobiology 1993;21:11 – 26.
[62] Van der Staay FJ, Stollenwerk A, Hovarth E, Schuurman T. Unilateral
middle cerebral artery occlusion does not affect water-escape behaviour of CFW1 mice. Neurosci Res Commun 1992;11:11 – 8.
[63] Frank DA, Greenberg ME. CREB: A mediator of long-term memory
from mollusks to mammals. Cell 1994;79:5 – 8.
[64] Bailey CH, Bartsch D, Kandel ER. Toward a molecular definition of
long-term memory storage. Proc Natl Acad Sci 1996;93:13445 – 52.
[65] Gerlai R. Gene-targeting studies of mammalian behavior: is it the
mutation or the background genotype? Trends Neurosci 1996;19:
177 – 81.
[66] Owen EH, Logue SF, Rasmussen DL, Wehner JM. Assessment of
learning by the Morris water task and fear conditioning in inbred mice
strains and F1 hybrids: implications of genetic background for single
gene mutations and quantitative trait loci analyses. Neuroscience
1997;80:1087 – 99.
[67] Wolfer DP, Lipp H-P. A new computer program for detailed off-line
analysis of swimming navigation in the Morris water maze. J Neurosci
Methods 1992;41:65 – 74.
[68] Hummler E, Cole TJ, Blendy JA, Ganss R, Aguzzi A, Schmid F,
Beermann F, Schütz G. Targeted mutation of the cAMP response
element binding protein (CREB) gene: compensation within the
CREB/ATF family of transcription factors. Proc Natl Acad Sci
1994;91:5647 – 51.
[69] Bourtchuladze R, Frenguelli B, Blendy J, Cioffi G, Schütz G, Silva
AJ. Deficient long-term memory in mice with a targeted mutation of
the cAMP responsive element binding (CREB) protein. Cell 1994;79:
59 – 68.
[70] Minichiello L, Korte M, Wolfer DP, Kühn R, Unsicker K, Cestari V,
Rossi-Arnaud C, Lipp HP, Bonhoeffer T, Klein R. Essential role for
TrkB receptors in hippocampus-mediated learning. Neuron 1999;24:
401 – 14.
[71] McAllister AK, Katz LC, Lo DC. Neurotrophins and synaptic plasticity. Annu Rev Neurosci 1999;22:295 – 318.
[72] Minichiello L, Klein R. TrkB and TrkC neurotrophin receptors cooperate in promoting survival of hippocampal and cerebellar granule
neurons. Genes Dev 1996;10:2849 – 58.
[73] Koolhaas JM, de Boer SF, Sgoifo A, Buwalda B, Meerlo P, Kato K.
Measuring behavior: integrating behavior and physiology. Proceedings of Measuring Behavior ’98, 2nd International Conference on
Methods and Techniques in Behavioral Research, 18 – 21 August,
Groningen, The Netherlands 1998;190 – 1.
[74] Hoogeboom PJ. Data synchronisation through post-processing. Proceedings of Measuring Behavior 2000 (Nijmegen, 15 – 18 August)
2000;147 – 50.
[75] Houx BB, Buma MOS, Spruijt BM. Non-invasive temperature tracking with automated infrared thermography: measuring inside out. Proceedings of Measuring Behavior 2000, 3rd International Conference
on Methods and Techniques in Behavioral Research, 15 – 18 August,
Nijmegen, The Netherlands 2000;151 – 2.
[76] Spruijt BM, Buma MOS, van Lochem PBA, Rousseau JBI. Automatic behavior recognition: what do we want to recognize and
how do we measure it? Proceedings of Measuring Behavior ’98,
2nd International Conference on Methods and Techniques in Behavioral Research, 18 – 21 August, Groningen, The Netherlands
1998;264 – 6.
[77] van Lochem PBA, Buma MOS, Rousseau JBI, Noldus LPJJ. Automatic recognition of behavioral patterns of rats using video imaging and statistical classification. Proceedings of Measuring Behavior '98, 2nd International Conference on Methods and Techniques in Behavioral Research, 18 – 21 August, Groningen, The Netherlands 1998;
203 – 4.
[78] Takken W, Huisman PWT, Buma MOS, Noldus LPJJ. Three-dimensional video tracking and analysis of the flight of nocturnal anopheline
mosquitoes. Poster presented at Measuring Behavior ’96, International
Workshop on Methods and Techniques in Behavioral Research,
16 – 18 October, Utrecht, 1996.
[79] Sergeant DM, Boyle RD, Forbes JM. Computer visual tracking of
poultry. Comput Electron Agric 1998;21:1 – 18.
[80] Bulpitt AJ, Boyle RD, Forbes JM. Monitoring behavior of individuals
in crowded scenes. Proceedings of Measuring Behavior 2000, 3rd
International Conference on Methods and Techniques in Behavioral
Research, 15 – 18 August, Nijmegen, The Netherlands 2000;28 – 30.
[81] van Schelt J, Buma MOS, Moskal J, Smit J, van Lenteren JC.
EntoTrack: using video imaging to automate the quality control of
mass-reared arthropods. Proc IOBC Workshop Quality Control of
Mass-Reared Arthropods 1995 (Santa Barbara, 9 – 12 October).