Based on results of early as well as recent behavioural studies, the present review compares the performance of different eye regions in exploiting information on shape, colour and motion, relevant to the honeybee’s foraging task. The comparisons reveal similarities, as well as differences, among the performances of various eye regions, depending on the visual parameter involved in the task under consideration. The outcome of the comparisons is discussed in the light of anatomical and optical regional specializations found in the bee’s peripheral visual pathway, as well as in the light of the foraging bee’s natural habits. It is concluded that the functional differences found among different eye regions are based on neural mechanisms subserving the bee’s natural needs, rather than on peripheral specializations.

The worker honeybee’s compound eye consists of approximately 5500 facets (ommatidia), with different eye regions looking at different portions of a nearly spherical view, thus providing the bee with a large amount of visual information at any time. With her relatively small brain, however, the bee is not expected to process and exploit more than a fraction of that information. The preferential use of a particular cue may thus depend not only on the task in hand (Lehrer, 1994) but, in addition, on the eye region that happens to be confronted with that particular cue.

In many insect species, the significance of a particular eye region can be predicted on the basis of peripheral anatomical, optical or physiological specializations that enhance spatial resolution, temporal acuity or colour vision. In the context of spatial vision, these so-called acute zones, or foveas (for reviews, see Horridge, 1980; Wehner, 1981; Land, 1989, 1997), are mainly characterized by increased facet density and enlarged facet diameters. Whenever such specializations have been considered in the light of behaviour, they have proved to constitute adaptations to the ecological needs of the animal (see Wehner, 1981; Land, 1997).

In the worker honeybee’s eye, the interommatidial angles in the horizontal direction are smallest in the frontal eye region, increasing towards the medial and lateral directions, whereas in the vertical direction, the smallest interommatidial angles are found around the equator of the eye, increasing towards the dorsal and ventral poles (for references, see Land, 1989, 1997). These two gradients result in two zones of potentially enhanced spatial acuity, one in the central frontal visual field and another around the eye equator. The latter predicts enhanced spatial resolution in the vertical direction, but not in the horizontal one. However, with respect to temporal acuity, it predicts the opposite, namely that images moving horizontally should be resolved better than images moving vertically (see Land, 1989, 1997).
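The link between interommatidial angle and resolvable detail can be made concrete with the standard sampling relation ν<sub>s</sub> = 1/(2Δφ) (see Land, 1997): the finer the angular spacing Δφ of the ommatidial lattice, the higher the spatial frequency it can sample. The sketch below uses invented placeholder angles purely to illustrate the two gradients described above; they are not the measured values.

```python
def sampling_limit(interommatidial_angle_deg):
    """Highest spatial frequency (cycles/degree) that an ommatidial
    lattice with angular spacing delta_phi can sample without aliasing:
    nu_s = 1 / (2 * delta_phi)."""
    return 1.0 / (2.0 * interommatidial_angle_deg)

# Illustrative angles (degrees) -- placeholders, not measured values:
regions = {
    "frontal, horizontal direction": 0.9,    # smallest horizontal angles
    "lateral, horizontal direction": 1.8,    # angles increase laterally
    "equatorial, vertical direction": 1.0,   # smallest vertical angles
    "dorsal pole, vertical direction": 2.5,  # angles increase polewards
}
for region, dphi in regions.items():
    print(f"{region}: up to {sampling_limit(dphi):.2f} cycles/degree")
```

On these placeholder figures, the frontal and equatorial zones resolve roughly twice the spatial frequency of the lateral and polar regions, which is the sense in which the two gradients define "acute zones".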

With respect to colour vision, all eye regions are expected to perform equally well, because the distribution (Menzel and Blakers, 1976) and the sensitivities (Bernard and Wehner, 1980) of the bee’s three spectral types of photoreceptors (green, blue and ultraviolet) do not differ among ommatidia situated in different eye regions.

However, it is only the animal’s behaviour that can reveal whether the final product of information processing is determined as early as at the level of the receptors. Although the bee’s performance in exploiting a variety of visual cues for pinpointing and recognizing a food source has been investigated in countless studies over many decades, an attempt to relate the behavioural findings to the peripheral specializations has hardly ever been undertaken. Furthermore, only a few studies were aimed specifically at comparing the performances among different eye regions; most of them were conducted independently in different eye regions without considering such a comparison. In the present review, the results of early as well as recent behavioural experiments will be compared in the light of both the environmental constraints and the peripheral specializations.

We will distinguish among the ventral, frontal, lateral and dorsal eye regions. Because the bee’s eye is elongated in the dorsoventral (vertical) direction (Fig. 1A,B), the frontal and the lateral visual fields will be further subdivided in this direction. The dorsal eye region (Fig. 1C) should not be confused with the uppermost dorsal ‘rim area’ (‘POL region’, depicted by the black sickle shapes in Fig. 1C). The unique function of the POL region cannot be compared with that of any other eye region. We will return to this point in the Discussion.

Fig. 1.

Schematic drawing (after Seidl and Kaiser, 1981) illustrating the elongation of the honeybee’s eye in the dorsoventral (vertical) direction. (A) Frontal, (B) lateral and (C) dorsal views of the worker bee’s head. The eye is shaded. The dorsal rim regions (see text) are depicted by the black sickle-shaped areas in C. The extent of the ventral eye region (not shown) is similar to that of the dorsal region.

The individual sections describing the experimental findings are concerned with (i) shape discrimination, (ii) colour discrimination, (iii) responses to moving stimuli, (iv) the use of self-generated image motion, and (v) navigation. We will only consider performances that have, over the years, been investigated in more than just one eye region.

With one exception (see the section on the optomotor response), all the results to be reviewed here were obtained by training freely flying honeybees to make regular visits to an artificial food source, where they learned to associate the food reward with a particular visual stimulus. The trained bees were then usually tested by giving them a choice between the learned stimulus and others that differed from it in one parameter or another, but sometimes other test procedures, to be specified in due context, were employed. In some cases, two stimuli differing in a particular parameter were presented simultaneously during the training, one positive (i.e. rewarded) and the other negative (unrewarded), thus encouraging the bees to learn that parameter and ignore others. The two stimuli were interchanged at regular intervals to prevent the bees from using positional cues. The use of olfactory cues was excluded in all cases, but the measures taken towards this end will not be specified.

Although bees may fly forwards, sideways, upwards, downwards and even backwards prior to selecting a target, landings only occur from above or frontally. Therefore, whenever landing on the target serves as the criterion for the bee’s choice, it is the ventral or the frontal eye region that is involved. At an artificial food source, bees can be made to use either the former or the latter by presenting the stimuli on a horizontal or a vertical plane, respectively.

Comparison between the ventral and the frontal eye regions in pattern recognition tasks

Most of the earlier workers on pattern discrimination in the bee presented the stimuli on a horizontal plane. All of them agreed that the main spatial cue used in this task is contrast frequency, i.e. the number of contours, or of on-and-off stimulation (flicker), per area of the pattern (e.g. Hertz, 1930, 1933; Zerrahn, 1934; Wolf and Zerrahn-Wolf, 1935; Free, 1970; Anderson, 1977). However, patterns presented on a horizontal plane can be approached from any direction. Therefore, parameters that require space-constant learning, such as spatial alignment, are not expected to be used unless pattern recognition is space-invariant. Thus, indirectly, the early results suggest that pattern recognition in the honeybee is not space-invariant.
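As a rough illustration of this parameter, contrast frequency can be approximated as total contour length per unit pattern area. The sketch below applies this to a black-and-white sectored disc; the formula is a simplification introduced here for illustration (the disc's outer rim is ignored), not the exact measure used in the cited studies.

```python
import math

def contrast_frequency(num_sectors, radius):
    """Approximate contour density of a black-and-white sectored disc:
    total length of the radial black/white borders divided by disc area.
    Simplification: the disc's outer rim is not counted as a contour."""
    edge_length = num_sectors * radius      # radial black/white borders
    area = math.pi * radius ** 2
    return edge_length / area

# Doubling the number of sectors doubles the contour density,
# and hence the amount of on-and-off stimulation (flicker) during flight:
print(contrast_frequency(8, 1.0))
print(contrast_frequency(16, 1.0))
```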

That this is, indeed, the case was demonstrated in extensive studies using patterns presented on vertical planes (for a review, see Wehner, 1981). Although contrast frequency was found to be an effective parameter even in the frontal visual field (Wehner, 1981; Lehrer et al. 1994; Horridge, 1997), further spatial parameters were shown to be used as reliably as contrast frequency. These include the orientation of contours (Wehner and Lindauer, 1966; van Hateren et al. 1990; Srinivasan, 1994; Horridge, 1997), the distribution of contrasting areas (Wehner, 1972a,b, 1981; Menzel and Lieke, 1983; Srinivasan and Lehrer, 1988; Lehrer, 1990, 1997), geometry (Lehrer et al. 1994; Zhang and Srinivasan, 1994; Horridge, 1997) and symmetry (Lehrer et al. 1994; Giurfa et al. 1996; Horridge, 1996).

Thus, the frontal eye region provides the bee with a larger variety of spatial information than does the ventral one. Viewed in the light of co-evolution, this finding would explain the large variety of shapes and patterns found in zygomorphic flower species, many of which present themselves in a vertical plane (Neal et al. 1998), compared with actinomorphic species that are approached from above and therefore need not differ from one another in more than their spatial frequency in order to be discriminated.

Eye-region-specific pattern learning in the frontal visual field

However, the frontal eye region consists of more than just the central forward-looking fovea (see Fig. 1A). The question of whether different frontal eye regions perform equally well in tasks involving spatial vision would only make sense if pattern recognition were found to be eye-region specific, i.e. if a pattern that has been learned with a particular eye region can later be recognized exclusively by that eye region, but not by any other.

The method for achieving eye-region-specific learning was first introduced by von Frisch (1915) in the context of a quite different problem. When patterns are presented on a vertical plane, the reward of sugar water cannot be offered directly on the pattern against the force of gravity. Instead, a feeder containing sugar water is placed in a dark box fixed behind the pattern. To collect the reward, the bees must first land on the entrance of a horizontal tube penetrating the centre of the pattern and then walk into the box. This method proved, more than 50 years later, to offer an important advantage: it ensures that a bee approaching the tube entrance views different elements of the pattern with different, well-defined frontal eye regions. Using this method, it was shown that bees memorize an eidetic (‘photographic’) image of the pattern, i.e. individual pattern elements are mapped topographically on the ommatidial array (Wehner and Lindauer, 1966; Wehner, 1972a,b). A pattern element that has projected onto a particular eye region during training is not recognized when that region has been occluded prior to the test (Wehner, 1974), although other eye regions are free to view it. Later it was shown that pattern learning occurs during a fixation phase in which the bee hovers on the spot in front of the tube entrance prior to landing (Wehner and Flatt, 1977). Very recently, Horridge (1997, 1998) demonstrated that two pattern elements that are discriminated well when they project onto the same frontal eye region are not discriminated when one projects onto one side and the other onto the other side of the fixation point. Eye-region-specific pattern learning was also demonstrated in experiments in which a sectored disc to which the bees had been trained was tested against an identical disc that had been rotated by half a period (Wehner, 1981). An example is shown in Fig. 5A below.

Dorsoventral asymmetry of pattern vision in the frontal visual field

The eye-regional specificity of pattern learning made it possible to compare the accuracy of pattern recognition among different frontal eye regions. This comparison was undertaken in two independent studies, one concerned with pattern detection, the other with the discrimination of spatial frequencies.

Pattern detection

Wehner (1972a,b) trained honeybees to a white disc and then offered them a choice between it and each of a series of white discs in which a black sector was inserted at a different position. The test results (Fig. 2) show that the sector is detected best when it is presented in the exact ventral position.

Fig. 2.

Eye-region-specific performance in a pattern detection task in the frontal visual field. Percentage of choices in favour of the learned white disc (mean values ± S.D.) are shown as a function of the position of the black sector presented in the test disc. N is the number of choices. Data from Wehner (1972a).

Discrimination of spatial frequencies

Honeybees were trained to a white disc that displayed a black-and-white sectored pattern in a quarter of its area (Fig. 3, insets). The pattern was presented in the lower, the lateral or the upper position, with a new group of bees being trained in each case (Lehrer, 1997). In subsequent tests, the bees had to choose between the learned pattern and each of a series of patterns that differed from it in frequency, all presented in the trained position. Best discrimination between the trained pattern and each of the test patterns was obtained when training and tests were conducted with the patterns presented in the ventral position (black bars in Fig. 3A,B). Thus, discrimination of spatial frequencies, like pattern detection (see Fig. 2), is best in the ventral part of the frontal visual field.

Fig. 3.

Eye-region-dependent discrimination of spatial frequencies in the frontal visual field. The rewarded stimulus was a sectored pattern projecting onto the ventral, lateral or dorsal eye region (right-hand insets), using a fresh group of bees in each case. In A, bees were trained to a high-frequency pattern that was then tested against lower-frequency ones (abscissa). In B, this situation was reversed. Mean values + S.D. of choice frequencies are shown as calculated from several tests conducted at each frequency. N is the number of choices. λ is spatial period. Modified after Lehrer (1997).

Indeed, when a bee flies above a meadow, it is the ventral eye region that is most likely to be involved in detecting and recognizing flowers. Thus, stimuli perceived in this eye region are assigned more weight than stimuli perceived in other frontal eye regions.

Discrimination of contour orientation

The ability of bees to discriminate between patterns that differ in the spatial orientation of contours was demonstrated more than 30 years ago using patterns presented on vertical planes (Wehner and Lindauer, 1966). More recently, an extensive series of experiments (for reviews, see Srinivasan, 1994; Srinivasan et al. 1993, 1994), using a Y-maze apparatus, was concerned with the possible neural mechanisms underlying the bee’s use of this parameter (see also Horridge, 1997). Giger and Srinivasan (1997) showed that neither the dorsal nor the ventral eye region is capable of exploiting contour orientation in a pattern discrimination task. Indeed, under natural conditions, the dorsal eye region is hardly ever confronted with the target, and the ventral eye region is not suitable for determining spatial orientation, which is a space-variant parameter. In the lateral visual field, however, contour orientation was shown to be learned as reliably as in the frontal one (Giger and Srinivasan, 1997).

Eye-region-specific learning and regional differences in the non-frontal visual field

The use of contours presented laterally has been demonstrated in the context of yet another task. In Osmia bees (Wehner, 1979), as well as in the honeybee (Wehner, 1981), lateral horizontal marks were shown to be very effective in guiding the insect to a frontally positioned target.

To examine the role that other non-frontal eye regions play in this task, bees were trained to collect sugar water from a small box placed behind a vertical circular board presenting an array of 89 holes (Fig. 4A,B) (Lehrer, 1990). The entrance to the box was through the central hole of the array. To reach it, bees had to fly through an opaque white cylinder that carried a horizontal black stripe whose position was varied from one experiment to another, with a new group of bees being trained in each experiment. Each bee was then tested individually, with no reward present, by recording her choices among the 89 holes. The percentage of choices was then calculated for the so-called mark-band (Fig. 4B), which is the band of three rows (or columns) of holes at which the stripe projects onto the bee’s eye in (roughly) the same retinal position as it does when viewed from the rewarded hole during the training. The results (Fig. 4C, filled symbols) show that a stripe offered in lateral positions is much more effective than is a stripe offered in any other position. When the mark was displaced to a new position, on one or the other side of the original mark-band, the choices of the bees were shifted to the newly defined mark-band (Fig. 4C, open symbols), showing that the stripe has been learned eye-region-specifically. The best performance was, again, in the exact lateral visual field.

Fig. 4.

Eye-region-specific differences in a task involving the localization of a frontal target with the help of non-frontal marks. (A) View of the experimental apparatus and definition of the nine positions in which a horizontal stripe mark was offered. (B) View of the array of 89 holes. Entrance to the reward box is through the central hole of the array. A definition of the mark-band (shaded) (see text) is shown, as an example, for a stripe at 90 °. In ‘displacement tests’, the stripe was offered in a neighbouring position in two separate types of experiment, defining a new mark-band on either side of the original mark-band. (C) Percentage of choices on the mark-band as a function of stripe position in the training situation (filled symbols) and in the displacement tests (open symbols; the two types of displacement test taken together). N is the number of choices. Values are means ± S.D. Modified after Lehrer (1990).

The ecological significance of the particularly good performance in the lateral eye region is likely to be based on the fact that the most conspicuous and omnipresent natural mark perceived by the bee, namely the horizon line, projects onto the non-frontal eye regions in a lateral position. It is conceivable that bees use the horizon line as a mark in several visual tasks (see also Wehner, 1981).

Colour is a most powerful cue in target recognition tasks (for references, see von Frisch, 1965; Chittka and Menzel, 1992; Menzel and Shmida, 1993). Some colours are learned faster than others (Menzel, 1967), and the acuity of colour discrimination depends on the pair of colours to be discriminated (e.g. Daumer, 1956; von Helversen, 1972; Menzel and Backhaus, 1989). However, until quite recently, the dependence of colour discrimination on the eye region involved had not been examined specifically.

Colour discrimination in different eye regions

Giger and Srinivasan (1997) trained bees to discriminate between a blue and a yellow disc each presented in one of the two arms of a Y-maze, one rewarded, the other not. In four separate experiments, the stimuli were presented in the frontal, the lateral, the ventral and the dorsal eye region, respectively.

Discrimination was found to be excellent in the ventral, frontal and lateral visual fields. The dorsal eye region, however, proved to be totally incapable of colour discrimination. Indeed, in the bee’s natural world, the dorsal visual field is hardly ever confronted with a colour discrimination task.

Eye-region-specific colour learning in the frontal visual field

The question of whether colour, like pattern (see above), is stored topographically in such a way that it can only be recognized when viewed in the trained retinal position was investigated independently in the frontal and in the lateral visual fields. (In the ventral visual field, position-specific learning is, a priori, not expected to occur.)

As in the case of black-and-white sectored discs (see Fig. 5A), a two-coloured sectored disc is discriminated well from an identical disc that has been rotated by half a period (Fig. 5B,C), showing that even colours are stored topographically (Srinivasan and Lehrer, 1988). In these experiments, the orientation of contours could not have served as a discrimination cue, because it did not differ between the two patterns.

Fig. 5.

Eye-region-specific pattern learning in the frontal visual field. (A) Bees trained to a black-and-white sectored disc (period 45 °) are offered a choice between it and an identical one rotated by half a period. (B,C) As in A, but two-coloured sectored discs are used. Percentage of choices is shown under each pattern. (A) Data from Wehner (1981); (B,C) Data from Srinivasan and Lehrer (1988).

Is the retinal position of coloured areas as effective even when edge orientation is available as a cue? To examine this question, bees were trained to a half-yellow, half-blue disc, with the edge oriented horizontally (0 °) (Fig. 6). In one experiment, conducted in 1997, yellow was in the upper half and blue in the lower. In another experiment (1998), this arrangement was reversed. In the tests, the trained bees were offered a choice between the previously rewarded disc and one of four identical discs that had been rotated by 45 °, 90 °, 135 ° or 180 ° (Fig. 6, abscissa) (M. Lehrer, unpublished results).

Fig. 6.

(A,B) The dominance of eye-region-specific colour distribution over edge orientation in the frontal visual field. In 1997, the trained disc has yellow in the upper and blue in the lower half of the visual field. In 1998, this situation is reversed. In either case, the trained disc is tested against identical discs in which the orientation of the edge, and therefore also the distribution of the colours, is varied (bottom insets). The number of choices is given above each column (M. Lehrer, unpublished data).

The deviation in edge orientation from the trained disc is largest for the 90 ° disc and does not differ between the 45 ° and the 135 ° discs. The 135 ° disc, however, deviates from the trained disc in colour distribution much more than does the 45 ° disc. If edge orientation were the crucial cue, discrimination from the trained disc should therefore be best with the 90 ° disc and should not differ between the 45 ° and 135 ° discs. Instead, discrimination of these three discs from the trained disc improved the more the test disc deviated from the trained one in the distribution of the two colours, rather than in the orientation of the edge (Fig. 6). Still, discrimination of the 180 ° disc was poorer than that of the 135 ° disc, showing that the orientation of the edge was not totally ignored.
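The two candidate cues can be quantified directly from the geometry of the stimulus. For a half-and-half disc rotated by θ, the fraction of the area whose colour disagrees with the trained disc is θ/180, whereas the edge-orientation difference is axial and wraps at 180 °. The sketch below illustrates this geometry (it is not the experimental analysis itself): the 45 ° and 135 ° discs are equivalent in orientation but not in colour distribution, and the 180 ° disc differs maximally in colour but not at all in orientation.

```python
def colour_mismatch_fraction(rotation_deg):
    """Fraction of a half-and-half disc whose colour differs from the
    trained disc after rotation by rotation_deg (0-180)."""
    return rotation_deg / 180.0

def edge_orientation_difference(rotation_deg):
    """Unsigned difference in edge orientation (degrees). Orientation is
    axial, so a 180-degree rotation leaves the edge orientation unchanged."""
    r = rotation_deg % 180
    return min(r, 180 - r)

for rot in (45, 90, 135, 180):
    print(f"{rot:3d} deg rotation: colour mismatch "
          f"{colour_mismatch_fraction(rot):.2f}, orientation difference "
          f"{edge_orientation_difference(rot):3.0f} deg")
```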

In a set of earlier, similar experiments, Menzel and Lieke (1983) used test discs rotated by either +45 ° or −45 ° (rather than 135 °) with respect to the trained disc. When the edge in the trained disc was oriented horizontally, as in Fig. 6, the +45 ° and −45 ° discs were discriminated from it equally well, which is as expected, because these two test discs deviate from the trained disc by the same amount with respect to both orientation and colour distribution.

Dorsoventral asymmetry of colour discrimination in the frontal visual field

The eye-region specificity of colour learning demonstrated above provided the basis for examining whether colour discrimination is subject to a dorsoventral asymmetry similar to that found in pattern vision.

Bees were trained to a half-blue, half-yellow disc, employing two reciprocal training procedures, as in Fig. 6. Bees trained in either situation were given a choice between the trained disc and a one-coloured disc presenting either the trained yellow or the trained blue (Fig. 7Aa,b, Ba,b). Thus, bees had to discriminate between the same two colours in either the lower (Fig. 7Aa and Ba) or the upper (Fig. 7Ab and Bb) visual field. The results of these tests (as well as the results of a more detailed study to be published elsewhere) show that colour discrimination is significantly better when it involves the lower half of the visual field than when it involves the upper half.

Fig. 7.

Dorsoventral asymmetry in a colour discrimination task in the frontal visual field. A and B differ in the colour distribution of the trained pattern. The trained bees were tested in three situations (a–c). In Aa and Ba, discrimination between blue and yellow involves the lower visual field. In Ab and Bb, the same discrimination task is presented in the upper visual field. In Ac and Bc, the two trained colours are pitted against each other. The mean values of the choice frequencies obtained for each pattern are shown. N is the number of choices (M. Lehrer, unpublished data).

However, the difference between test a and test b is greater in Fig. 7A than in Fig. 7B, suggesting that there exists still another type of dorsoventral asymmetry in the frontal visual field: bees prefer to view blue in the upper visual field, as they indeed would when flying under blue sky. This conclusion is corroborated by the results shown in Fig. 7Ac, Bc. In these tests, the two trained colours were pitted against each other. Bees previously trained with blue in the lower half preferred blue over yellow, whereas bees trained with yellow in the lower half preferred yellow over blue, which is as expected if the lower visual field is indeed weighted more strongly than the upper visual field. However, the preference for blue in Fig. 7Ac was much stronger than that for yellow in Fig. 7Bc. In the former case, the stronger weighting of the lower visual field is added to the preference for blue in the upper position, whereas in the latter case the two tendencies conflict with each other.

In the experiments by Menzel and Lieke (1983) mentioned above, when the edge of the trained disc was oriented at 45 ° with respect to the horizontal, rotation by +45 ° and by −45 ° with respect to it rendered asymmetrical results, revealing a preference for ultraviolet in the upper visual field.

Position-specific colour learning in the lateral visual field

Bees were trained to collect sugar water from a small box placed behind a vertical board containing an array of 27 holes, arranged in nine rows and three columns (Fig. 8, inset) (Lehrer, 1990). The entrance to the box was through the central hole of the array. To reach it, the bees had to fly between two lateral walls, each carrying a half-yellow, half-blue pattern, with yellow in the upper half. The edge between the two coloured areas was at the height of the central (rewarded) hole. In the tests, with no reward present, the choices of the bees among the 27 holes were recorded. The percentage of choices was then calculated for the upper, central and lower subarray of holes, each comprising nine holes.

Fig. 8.

The use of colour distribution in the lateral visual field in the task of localizing a frontal target. The top inset gives the definition of the upper, central and lower subarray of holes viewed frontally. To reach the central (rewarded) hole, bees had to fly between two lateral walls each carrying a two-coloured pattern, with the edge positioned at the height of the central hole. Tests were conducted with the edge at the training height (A), with the edge displaced to either a higher (B) or a lower (C) position and with the colour distribution reversed (D). The dashed line denotes random-choice level. The number of choices is given above each set of columns. Modified after Lehrer (1990).

The results (Fig. 8) show that the bees have learned to use the lateral stimulus in the task of localizing the frontal target. When, in the test, the edge was displaced to a lower or a higher position, searching was shifted accordingly. However, when the two colours were interchanged, the trained bees failed to localize the target, showing that the crucial cue is the distribution of the two colours in the visual field, rather than the retinal position of the edge. Thus, even in the lateral visual field, colours are learned eye-region-specifically and cannot be used in the task when they are viewed with the wrong eye regions.

Bees are spontaneously attracted to small moving targets (Zhang et al. 1990; Lehrer and Srinivasan, 1992), suggesting that motion cues play a role in attracting pollinators. It has already been shown that bees land much more often on flowers that sway in the wind than on neighbouring, motionless flowers (Wolf, 1933; Kevan, 1973). There exist, however, several types of response to image motion that have little to do with attraction.

The optomotor response to rotational stimuli

An insect flying tethered in a rotating black-and-white striped drum responds to the stimulus by turning in the direction of motion, thus stabilizing the image of the pattern on the eye. This reflex-like behaviour, termed the optomotor response (for references, see Wehner, 1981), constitutes a directionally sensitive reaction to large-field motion that would, under natural conditions, be the result of an involuntary deviation of the animal from its intended course of locomotion. Depending on the direction of motion and on the eye region that is stimulated, different turning responses (yaw, pitch or roll) are elicited, all of which, however, are aimed at stabilizing the image on the retina by compensating for the perceived image motion.

The bee’s optomotor yaw response: differences between the lateral and the medial eye regions

Tethered flying bees were found to display a striking lateral–medial asymmetry of the optomotor yaw response, as revealed by experiments in which the medial or the lateral eye region was occluded (Moore and Rankin, 1982). The lateral regions of both eyes were found to be sensitive exclusively to front-to-back motion, whereas the medial eye regions responded exclusively to back-to-front motion. The same study showed that optomotor stimulation elicits stronger responses in the lateral eye regions than in the medial ones. This finding might be based on a stronger weighting of the input provided by the lateral eye regions. Indeed, during forward flight, the lateral visual field perceives a much larger amount of image motion than does the frontal one.

The spectral sensitivity of the bee’s optomotor system

For reasons that will become obvious later, I here include, without going into the details, a result obtained (e.g. Kaiser and Liske, 1974) from an investigation of the optomotor yaw response of tethered flying bees. By using moving gratings constructed of different combinations of two spectral colours, the authors found that the bee’s optomotor system is mediated exclusively by the input of the green receptor. Because a single spectral type of receptor cannot encode colour, this finding implies that the bee’s optomotor system is colour-blind.

The colour-blindness of the optomotor response had already been suggested by Schlieper (1928) on the basis of experiments on several insect species, including the bee. However, he was unable to explain it by the participation of a single spectral type of photoreceptor.

The movement avoidance response

The study to be summarized in the present section was, originally, designed to investigate the bee’s power of temporal resolution. Our first attempt to do this was by training bees to discriminate between a steady coloured light (green, blue or ultraviolet) and a flickering light of the same colour presented on a horizontal plane (Srinivasan and Lehrer, 1984a). However, irrespective of the colour and the flicker frequency used, the bees did not accomplish the discrimination. We therefore set out to examine the question by using moving, rather than flickering, stimuli (Srinivasan and Lehrer, 1984b).

Bees were rewarded in a vial inserted in the centre of a black-and-white sectored disc (period 60 °) presented on a vertical plane. The disc rotated at 50 revs s−1, thus producing a temporal frequency of 300 Hz. Because the bee’s photoreceptors resolve flicker only up to a frequency of 200 Hz (Autrum and Stöcker, 1950), the black and the white sectors in this disc are fused to grey (Fig. 9A, inset). An identical disc, unrewarded, was presented simultaneously, but it rotated at a much lower speed, producing a temporal frequency of only 30 Hz, at which the individual sectors are expected to be resolved. In subsequent tests, with the reward absent, the trained bees were given a choice between the fused disc and the alternative one, but now the latter rotated at different speeds in different tests. The idea was to determine the frequency at which the bees would choose randomly between the two stimuli, indicating that the sectors in the test disc are now fused as well.
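The contrast-frequency arithmetic behind these numbers can be checked with a short sketch (a simple illustration, not code from the study):

```python
# Contrast (temporal) frequency seen at one point of a rotating
# sectored disc: a 60 deg period means 360/60 = 6 black-white
# cycles per revolution.

def contrast_frequency_hz(period_deg: float, revs_per_s: float) -> float:
    return (360.0 / period_deg) * revs_per_s

print(contrast_frequency_hz(60, 50))  # rewarded disc: 300.0 Hz, above fusion
print(contrast_frequency_hz(60, 5))   # alternative disc: 30.0 Hz, resolved
```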

Fig. 9.

The movement avoidance response in the frontal visual field. The positive and negative stimuli (inset) are identical sectored discs (period 60 °), but the former rotates at high speed, producing a contrast frequency of 300 Hz at which the sectors are fused. The percentage of landings on the positive disc as a function of the temporal frequency of the alternative disc is shown. (A) Black-and-white discs. (B) The sectored discs are constructed of two pigment papers that produce contrast detectable exclusively by the bee’s green receptor. (C) As in B, but using a colour combination that produces no green-contrast. Values are means ± S.D. Data from Srinivasan and Lehrer (1984b).

The results (Fig. 9A) revealed a fusion frequency of 200 Hz, in agreement with the electrophysiological findings. However, the experiment provided another result: in a broad range of temporal frequencies (between approximately 20 and 120 Hz), the bees avoid landing on the test disc and land almost exclusively on the grey disc.

This behaviour, which we termed the ‘movement avoidance response’, is clearly distinct from the optomotor response, mainly because it is active at much higher temporal frequencies. The bee’s optomotor response is optimal at approximately 8 Hz (Kaiser and Liske, 1974), and at 100 Hz nothing is left of it (Kunze, 1961). Therefore, the discovery of the movement avoidance response provided an opportunity to examine whether colour-blindness (see above) is restricted to the optomotor response or whether it is instead a general principle in tasks involving motion detection.

The experiment presented in Fig. 9A cannot provide an answer to this question, because black-and-white stimuli offer high contrasts to all three spectral types of receptor. Therefore, we repeated the experiment using two-coloured sectored discs (Srinivasan and Lehrer, 1984b). Two combinations of blue and yellow pigment papers were used. In one, the contrast between the two colours was restricted to the green receptor. We refer to this contrast as ‘green-contrast’. The other colour combination offered contrast (termed ‘blue-contrast’) to the blue and the ultraviolet receptors, but not to the green receptor. With green-contrast (Fig. 9B), movement avoidance was as strong as before. In the absence of green-contrast, however (Fig. 9C), the bees landed on the test disc at all frequencies, just as in the flicker experiments mentioned above. It follows that the movement avoidance response is a colour-blind behaviour mediated by the green receptor, as is the optomotor response.

The movement avoidance response in the ventral visual field

More recently, the experiments shown in Fig. 9 were repeated presenting the stimuli on a horizontal plane (M. Lehrer, unpublished results). With black-and-white discs (Fig. 10A), as well as with green-contrast ones (Fig. 10B), the movement avoidance response was similar to that in the frontal visual field (see Fig. 9A,B). In the absence of green-contrast, however (Fig. 10C), the preference for the fused disc was as strong as with green-contrast at frequencies of 18 Hz or above and very much stronger than the latter at all lower frequencies of the test disc. In control tests, the same bees (trained to the fused blue-contrast disc, Fig. 10C) were presented with black-and-white discs, as in Fig. 10A. Their response changed dramatically, choice frequency for the fused disc being only 35 % at 0 Hz, 40 % at 1.8 Hz and 76 % at 9 Hz. A choice frequency of 100 % was only reached at 18 Hz, as in Fig. 10A. Thus, in the ventral visual field, when green-contrast is present, the bees avoid the moving stimuli for as long as motion is still resolved, just as they do in the frontal visual field. However, when green-contrast, and therefore motion, is absent (Fig. 10C), they switch to the use of a different cue, namely colour. Their choice behaviour in this experiment seems to be based on a discrimination between the previously rewarded mixture of two colours and an alternative stimulus in which the two colours can still be resolved individually. Indeed, in an earlier study, we obtained similar results, again in the ventral visual field, by training bees to discriminate between a steady mixture of two coloured lights (green and blue, blue and ultraviolet, or ultraviolet and green) and a heterochromatic flickering stimulus in which the same two lights alternated at variable frequencies (Srinivasan and Lehrer, 1985). The preference of the bees for the colour mixture was very similar to that shown in Fig. 10C at both low and high frequencies of heterochromatic flicker, and so was the fusion frequency. It thus seems that, in the ventral visual field, when motion is invisible, the two-coloured rotating discs are treated as if they constituted heterochromatic flicker.

Fig. 10.

The movement avoidance response in the ventral visual field. As in Fig. 9, but stimuli are presented on a horizontal plane (M. Lehrer, unpublished data). For further details, see Fig. 9.

The ecological significance of the differences found between the ventral and the frontal eye regions with respect to the use of heterochromatic flicker may be sought in the fact that colours change continuously when a bee flies above a meadow in search of a flower. Thus, in the ventral visual field, colour resolution during flight seems to be as important as motion resolution. In contrast, motion in the frontal visual field (for example, when a bee forages within a tree or a bush) does not produce frequent colour changes as the bee flies from one flower to the next. In this situation, it is more important to focus on collision avoidance, a task that, as will be shown below, can only be mastered by using motion cues.

In the studies on the optomotor response and the movement avoidance response summarized above, the stimuli used were actually moving. In the following sections, we will be concerned with image motion that is a consequence of the bee’s own, voluntary locomotion.

Depth from image motion

Like most insects, the bee lacks all the mechanisms that vertebrates have evolved for perceiving the third dimension, such as stereoscopic vision, convergence of the eyes and lens accommodation. How, then, does the bee measure the distance of an object?

One way would be to exploit the object’s angular size: a near object subtends a larger visual angle at the eye than does a more distant object. The bee’s capacity to learn angular size was demonstrated in both the frontal (Wehner and Flatt, 1977; Wehner, 1981) and the ventral (Schnetter, 1972; Mazochin-Porshnyakov et al. 1977; Ronacher, 1979; Horridge et al. 1992) visual fields, and there is much evidence that the bee uses this cue in distance estimation tasks (frontal visual field, Cartwright and Collett, 1979, 1983; Collett, 1992; Lehrer and Collett, 1994; ventral visual field, Horridge et al. 1992).

However, when object size is unknown (as, for example, when the bee arrives at a novel feeding site), the only distance information available is the speed of image motion: the contours of a near object move faster on the eye than do those of a more distant object. To examine the bees’ use of image speed as a cue to distance, therefore, bees must be prevented from learning the angular size of the relevant object.
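The geometry behind this cue is simply ω = v/d: for a given flight speed, angular image speed is inversely proportional to object distance. A minimal sketch, with hypothetical flight speed and distances:

```python
import math

# Angular image speed of a contour abeam of the flight path:
# omega = v / d, so for a fixed flight speed the image of a near
# object moves faster than that of a distant one -- a distance cue
# that is independent of the object's size.

def angular_speed_deg_s(flight_speed_mm_s: float, distance_mm: float) -> float:
    return math.degrees(flight_speed_mm_s / distance_mm)

near = angular_speed_deg_s(300, 70)    # object at 70 mm
far  = angular_speed_deg_s(300, 280)   # object at 280 mm (4x farther)
# near is 4x far: image speed alone ranks the two distances
```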

The bee’s performance in using motion cues in distance estimation tasks was examined independently in the ventral, the frontal and the lateral eye regions, as described below.

Size-independent distance estimation in the ventral visual field

Bees were trained to visit a white ‘meadow’ offering seven black discs, each of a different size (Lehrer et al. 1988). One of them, placed on a stalk 70 mm above the ground, was provided with a drop of sugar water, whereas the others were placed flat on the ground and each carried a drop of plain water. The positions of all seven discs were varied between rewarded visits, and, at the same time, the size of the rewarded disc was altered. The only parameter that always remained constant was the height of the rewarded disc above the ground. In subsequent tests, five discs, each of a different size, were placed at five different heights. Their sizes and positions were varied between tests.

The distribution of the landings of the bees on the five test discs (Fig. 11A) was strictly correlated with the height of the discs, showing that bees discriminate range irrespective of size. Similar results were obtained with blue discs on a yellow ground, using the green-contrast combination mentioned above (Fig. 11B). In the absence of green-contrast, however (Fig. 11C), range discrimination broke down, showing that it is a green-sensitive, colour-blind motion detection system that extends the bee’s vision into the third dimension.

Fig. 11.

The use of image motion for distance estimation in the ventral visual field. The insets (top left) show the training and test situations. The rewarded dummy flower (one of seven dummy flowers) was placed at a constant height (70 mm) above the ground, but its size and position were randomized between rewarded visits. Tests were conducted using five dummy flowers of different sizes placed at different heights. (A–C) The distribution of the bees’ landings on the five test flowers as a function of the height of the flowers. (A) Black discs on a white ground. (B,C) Blue discs on a yellow ground; (B) green-contrast, (C) blue-contrast. N is the number of landings. Modified from Lehrer et al. (1988).

The use of self-generated image motion for distance estimation in the ventral visual field was also demonstrated in recent experiments in which bees were video-recorded whilst landing on a horizontal black-and-white patterned surface. The bees were found to adjust their flight speed according to their height above the ground (Srinivasan et al. 1996; Srinivasan and Zhang, 1997).

Size-independent distance estimation in the frontal visual field

Bees were trained to discriminate between two black discs, one rewarded, the other not, placed each in one of the two arms of a Y-maze (Horridge et al. 1992). During training, the bees were presented alternately with four situations in which the distance of the positive disc from the arm entrance was kept constant but its angular size (as viewed from the arm entrance) was varied. The distance of the negative disc from the arm entrance differed from that of the positive disc in each of the four situations, but its size was adjusted so that it always subtended the same visual angle as did the latter. On every arrival, each bee’s first decision between the two arms was recorded at the arm entrance. The percentage of choices in favour of either arm in each of the four situations (Fig. 12) shows that the bees have learned the distance of the rewarded disc despite the fact that its angular size could not be used in this discrimination task.

Fig. 12.

Size-independent distance estimation in the frontal visual field. Bees were trained in a Y-maze to discriminate between two black discs presented in four situations that alternated in random succession. In all situations, the distance of the positive disc from the arm entrance was kept constant, but its angular size was varied. The distance of the negative disc from the arm entrance was varied (abscissa), but its angular size was always the same as that of the positive disc. The percentage of choices (as measured at the arm entrance) for the positive (black columns) and the negative (hatched columns) arms is shown. N is the number of choices. Data from Horridge et al. (1992).

Bees can even exploit self-produced image motion in the frontal visual field to estimate the distance of landmarks (Lehrer and Collett, 1994). The use of self-generated image motion for distance estimation in the frontal visual field has been demonstrated in several further insect species (locusts, Wallace, 1959; Collett, 1978; Horridge, 1988; Sobel, 1990; crickets, Campan et al. 1981; mantids, Horridge, 1988; Walcher and Kral, 1994; wasps, Zeil, 1993a,b; solitary bees, Brünnert et al. 1994).

Motion-dependent distance estimation in the lateral visual field

Bees were trained to collect a food reward at the end of a tunnel flanked by two black-and-white vertical gratings (Kirchner and Srinivasan, 1989). Frame-by-frame evaluation of video recordings conducted from above revealed that the bees fly along the midline of the tunnel, indicating that they strive to equalize the motion perceived from the two sides. This ‘centring response’ is manifest even when the gratings on the two walls differ in their spatial period (Fig. 13A,B) (Srinivasan et al. 1991), showing that the relevant cue, as opposed to the optomotor response, is not the contrast frequency of the pattern, but rather the speed of image motion. When one grating (either the low- or the high-frequency one) is moved in the direction of the bee’s flight, thus reducing the apparent speed of image motion perceived on that side, the bees fly on a route that is nearer to the moving wall (Fig. 13C,D), and

Fig. 13.

Motion-based estimation of lateral distance. Results of a frame-by-frame evaluation of video-recordings of flight trajectories of bees trained to collect a food reward at the end of a tunnel flanked by two gratings (top panel). The position of the bees’ route (mean ±2 S.D.) is depicted in A–F by the shaded horizontal bars. Arrows within the bars denote the bee’s flight direction. In A and B, both gratings are stationary. In C and D, one of the gratings is moved in the bee’s flight direction; in E and F, one of the gratings is moved against the bee’s flight direction. λ is stripe period. Data from Srinivasan et al. (1991); illustration modified from Lehrer (1994).

when the grating moves in the opposite direction, thus increasing the apparent speed of image motion on that side, the bees fly nearer to the stationary grating (Fig. 13E,F). Srinivasan and Zhang (1997) propose that the mechanism underlying the centring response is the same as that governing the movement avoidance response.
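A minimal sketch of the centring logic is consistent with these results, assuming the bee simply balances the apparent image speeds on the two sides (the function and numbers are illustrative, not taken from the original analysis):

```python
# Centring response sketch: choose the lateral position at which
# the apparent image speeds on the two walls are equal. A wall
# moving in the bee's flight direction reduces the relative contour
# speed on that side, shifting the equilibrium toward that wall.

def equilibrium_position(width: float, flight_v: float,
                         left_wall_v: float = 0.0,
                         right_wall_v: float = 0.0) -> float:
    """Distance from the left wall (0..width) where image speeds match."""
    vl = abs(flight_v - left_wall_v)   # relative contour speed, left wall
    vr = abs(flight_v - right_wall_v)  # relative contour speed, right wall
    # speeds equal when vl / y == vr / (width - y) -> y = width * vl / (vl + vr)
    return width * vl / (vl + vr)

print(equilibrium_position(120, 300))                     # walls static: 60.0 (midline)
print(equilibrium_position(120, 300, left_wall_v=150))    # left wall moves with bee: 40.0
print(equilibrium_position(120, 300, right_wall_v=-150))  # right wall moves against bee: 48.0
```

The last two calls reproduce the direction of the shifts in Fig. 13C–F: toward a wall moving with the bee, away from a wall moving against her.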

Summing up the present section, self-generated image motion serves the bee for distance estimation in all three planes of the visual world, which is what one would indeed expect from an animal that moves in three dimensions.

Object–ground discrimination

The bee’s capacity to discriminate among different speeds of image motion demonstrated above is expected to enable her to cope with yet another task, namely object–ground discrimination. An object that is nearer to the flying bee than is the background will move faster than the latter on the bee’s eye, thus creating relative motion (motion parallax) between it and the background. Such an object is expected to be discriminated from the background even if the two differ in neither brightness nor colour.
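The size of this parallax cue can be sketched as follows (flight speed and heights are hypothetical, chosen only for illustration):

```python
# Motion parallax: a bee flying at speed v at height h sees the
# ground move at v/h; a disc raised by dh sits at distance h - dh
# and moves faster, creating a speed discontinuity at its edges.

def parallax_rad_s(v: float, h: float, dh: float) -> float:
    return v / (h - dh) - v / h   # object image speed minus ground image speed

# hypothetical values: v = 300 mm/s, flight height h = 120 mm
for dh in (10, 30, 60):
    print(round(parallax_rad_s(300, 120, dh), 2))
# the discontinuity grows with the disc's height above the ground,
# consistent with higher-placed discs being detected more easily
```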

To test this prediction, bees were trained to a randomly patterned black-and-white disc placed on a transparent Perspex sheet raised above a similarly patterned horizontal surface (Srinivasan et al. 1990). In the tests, the landings of the bees on the disc, as well as elsewhere on the Perspex sheet, were recorded. The percentage of landings on the disc (Fig. 14) shows that the disc is better detected the higher it is placed, i.e. the larger the amount of motion parallax. This performance was independent of whether the density of the pattern on the disc was the same as that on the ground, showing that object–ground discrimination is not based on pattern discrimination.

Fig. 14.

The use of motion parallax for figure–ground discrimination. Bees were trained to collect a food reward from a patterned disc (inset) placed on a transparent Perspex sheet raised above a similarly patterned ground. The proportion of landings on the disc as a function of its height above the ground is shown. The dashed line depicts random-choice level. Values are means ± S.D. Data from Srinivasan et al. (1990).

In a more recent study (Zhang and Srinivasan, 1994), bees were shown to use motion parallax for object–ground discrimination even in the frontal visual field. The task is accomplished only in the presence of green-contrast, but not in its absence (Zhang et al. 1995), supporting the conclusion that object–ground discrimination is based on motion perception.

Edge detection

Edges in the ventral visual field

The experiments of Srinivasan et al. (1990) described above showed that landings on the raised figure occur mainly at the figure boundaries. Thus, object–ground discrimination is based on the detection of a motion discontinuity perceived at the edge between the object and the background. This conclusion is corroborated by the finding that the preference for edges disappears in the absence of green-contrast (Lehrer et al. 1990).

Evaluation of video-taped flight trajectories (Lehrer and Srinivasan, 1993) revealed that the majority of landings on an edge occur when bees fly from the low surface towards the raised one (see also Kern et al. 1997). Bees flying in the opposite direction usually crossed the edge without landing on it. Thus, landings are triggered by the local increase in the speed of image motion perceived at the edge. This conclusion is corroborated by the results of model simulations that took a motion detection mechanism to be responsible for the observed behaviour (Kern et al. 1997). The model bees behaved much the same as did the experimental bees with respect to both the frequency and the direction of landings on edges.

Edges in the frontal visual field

In the frontal visual field, landing on edges cannot be investigated, because bees will not land on a vertical plane unless a small horizontal surface is provided on which landing is possible. Still, the significance of edges in the frontal visual field is evident from the bees’ flight behaviour. Evaluation of video-taped flight trajectories of bees flying in front of different black-and-white patterns revealed that bees follow the contours contained in the pattern (Lehrer et al. 1985). This behaviour, which we termed ‘scanning’, might constitute some type of image stabilization or motion avoidance, because crossing contours produces retinal image motion, whereas following contours does not. This interpretation is supported by the finding that scanning occurs only in the presence of green-contrast, but not in its absence (Lehrer et al. 1985).

Bees follow the contours of linear gratings even when these are presented on a horizontal plane (Lehrer and Srinivasan, 1994). However, when the task requires discrimination between a low and a raised grating, and thus the use of image motion, the bees abandon the otherwise innate scanning behaviour and select oblique or perpendicular directions with respect to the orientation of the contours, thus actively acquiring depth information (Lehrer and Srinivasan, 1994).

Edges in the lateral visual field

The role of edges in the lateral visual field was examined using the experimental arrangement shown in Fig. 8. A half-blue and half-yellow pattern was placed on each of the two lateral walls. This time, however, blue and yellow, respectively, were presented alternately in the lower and the upper visual fields (Lehrer, 1990). In this situation, the bees could not rely on the distribution of the colours and were forced to use the retinal position of the edge. With the green-contrast colour combination, the bees were very successful in using the edge in the task of localizing the frontal target (Fig. 15A). However, in the absence of green-contrast, the edge was ineffective in guiding the bees to the goal (Fig. 15B) although, with the same colour combination, bees were perfectly able to use the distribution of the two colours for accomplishing the same task (see Fig. 8). The use of the edge in the task shown in Fig. 15A is thus similar to the scanning behaviour in that it is mediated by a colour-blind, green-sensitive mechanism that acts to stabilize the position of the edge on the retina.

Fig. 15.

The use of an edge between two coloured areas presented in the lateral visual field in the task of localizing a frontal target. As in Fig. 8 except that, during training, the polarity of the edge was reversed between rewarded visits to prevent the bees from using the colour distribution of the lateral stimuli. The number of choices is given above each set of columns. For further details, see Fig. 8. (A) Green-contrast. (B) Blue-contrast. Data from Lehrer (1990).

An animal planning to travel over a relatively long distance to a particular goal needs knowledge about the bearing of the goal as well as its distance. Honeybee foragers returning to the hive from a profitable food source communicate, using the dance language (reviewed by von Frisch, 1965), the direction as well as the distance that potential recruits should select to arrive at that food source. The dancing bee’s knowledge of the direction of the food source has been shown many times to be based on visual information derived from the skylight pattern (von Frisch, 1965; Wehner and Rossel, 1985; Wehner, 1997). The source of her information on the distance flown, however, has been the subject of much controversy. For several decades, it was believed that this information is inferred from the energy expenditure associated with the journey (for references, see von Frisch, 1965; Esch and Burns, 1996). However, in the light of new results (for reviews, see Wehner, 1992; Ronacher and Wehner, 1995; Esch and Burns, 1996), there is good reason to abandon the energy hypothesis in favour of an ‘optic flow hypothesis’ based on the use of image motion.

The use of optic flow in the ventral visual field for the estimation of the distance flown was investigated by observing the dances of foragers trained to a food source attached to a balloon flying above the ground at various heights (Esch and Burns, 1995, 1996). As the balloon’s altitude increases, the amount of energy needed to reach it increases accordingly, but the speed of image motion perceived from the ground decreases. In these experiments, the dancing foragers indicated a distance that decreased, rather than increased, as the height of the balloon was increased, showing that the speed of optic flow, rather than the energy expenditure, constitutes the relevant cue for estimating the distance flown.
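The optic flow hypothesis can be sketched as a visual odometer that integrates ventral image speed over the journey (an illustrative model with hypothetical numbers, not the authors’ implementation):

```python
# Visual odometer sketch: integrate ventral angular image speed
# omega = v / h over the flight. For the same ground distance, a
# higher altitude h yields a smaller integrated flow, so the
# odometer reads a *shorter* flight -- the direction of the balloon
# result, and independent of energy expenditure.

def integrated_flow(ground_distance: float, speed: float,
                    height: float, dt: float = 0.01) -> float:
    total, travelled = 0.0, 0.0
    while travelled < ground_distance:
        total += (speed / height) * dt   # accumulate angular flow (rad)
        travelled += speed * dt
    return total

low  = integrated_flow(10_000, 500, height=70)    # flying near the ground
high = integrated_flow(10_000, 500, height=700)   # food source raised 10x
# high is one-tenth of low: the indicated distance shrinks with altitude
```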

In the lateral visual field, the same question was investigated by training bees to collect food in a tunnel carrying, on each of the two lateral walls, a vertical linear grating (Srinivasan et al. 1996, 1997a,b) or a random-pixel pattern (Srinivasan et al. 1997b). In different experiments, the feeder was placed at different distances from the tunnel entrance. In the tests, the trained bees searched for the food at the correct distance in all the experiments, although the feeder was absent during the tests. When a tail wind or head wind was introduced, the distance flown was neither underestimated nor overestimated, respectively (Srinivasan et al. 1996, 1997b), showing again that energy expenditure is not the relevant cue in this task.

Interestingly, a pattern placed on the floor of the tunnel was not effective in indicating the distance flown (Srinivasan et al. 1997b), a result that seems to contradict the finding of Esch and Burns (1996), as well as results obtained from desert ants (Ronacher and Wehner, 1995), where patterns viewed ventrally were found to be effective. We will return to this point below.

Most of the behavioural studies reviewed here were, originally, aimed neither at comparing visual performance among different eye regions nor at testing the correlation between the performance and the specializations found in the peripheral visual pathway. However, the large amount of information that has accumulated over the years allows the comparisons undertaken in the present review.

The comparisons reveal similarities, as well as differences, among the performances of the various eye regions. Here, the outcome of these comparisons will be discussed in the light of (i) ecological aspects, and (ii) the peripheral anatomical specializations summarized in the Introduction.

Ecological aspects

In the individual sections describing the various results, I have included some considerations pointing to the correlation between the behavioural findings and the expectations inferred from the foraging bee’s natural habits. Here I sum up these findings by listing the results that reveal such a correlation, without repeating the considerations already made in their respective contexts above.

(i) Shape detection (Fig. 2), (ii) pattern discrimination (Fig. 3) and (iii) colour discrimination (Fig. 7) are accomplished best in the ventral part of the frontal visual field. (iv) In colour discrimination tasks, the frontal (Figs 5–7), the ventral (Fig. 10C) and the lateral (Fig. 8) eye regions perform well, whereas the dorsal eye region does not (Giger and Srinivasan, 1997). (v) Discrimination of spatial frequencies is accomplished in both the ventral (e.g. Anderson, 1977) and the frontal (Wehner, 1981, and Fig. 4) visual field. (vi) Contour orientation is used as a discrimination cue in the frontal (Srinivasan, 1994) and the lateral (Giger and Srinivasan, 1997) eye regions, but not in the ventral and the dorsal regions (Giger and Srinivasan, 1997). (vii) Responses to edges during free flight are elicited in all the eye regions investigated. However, the functional significance of the response differs among the various eye regions depending on the task. In the frontal (Lehrer et al. 1985) and the ventral (Lehrer and Srinivasan, 1993) visual fields, edges elicit scanning behaviour (image stabilization). The use of edges presented in non-frontal positions for guiding the insect to a frontal target (Figs 4, 15) might also constitute some type of image stabilization. In this case, however, the lateral visual field performs best (Fig. 4). In the frontal and the ventral visual fields, edges serve, in addition, for object–ground discrimination (frontal visual field, Zhang and Srinivasan, 1994; ventral visual field, Fig. 14; see also Lehrer et al. 1990; Kern et al. 1997). In the ventral visual field, edges trigger, in addition, landing responses (Srinivasan et al. 1990; Lehrer and Srinivasan, 1993; Kern et al. 1997). 
(viii) Rotational optomotor stimulation evokes a response in all the eye regions investigated (see the section on the optomotor response), but (ix) optomotor stimuli elicit a stronger response in the lateral visual field than in the medial field (Moore and Rankin, 1982). (x) Temporal resolution, as measured by the movement avoidance response, is as good in the ventral eye region as it is in the frontal region (Figs 9, 10). However, the performance in the ventral eye region is based not only on motion resolution but, in addition, on colour resolution (Fig. 10C). (xi) Range estimation based on the speed of translational image motion is accomplished in all three planes (ventral eye region, Fig. 11; frontal eye region, Fig. 12, and Lehrer and Collett, 1994; lateral eye region, Fig. 13). (xii) Adjustment of flight height or of lateral distance, respectively, and adjustment of flight speed are accomplished in the ventral visual field (Kirchner and Heusipp, 1996; Srinivasan et al. 1996; Srinivasan and Zhang, 1997) as well as in the lateral visual field (Srinivasan and Zhang, 1997; Srinivasan et al. 1996, 1997a,b), and (xiii) the same holds true for estimation of the distance flown (ventral visual field, Esch and Burns, 1995, 1996; lateral visual field, Srinivasan et al. 1996, 1997a,b).

All these findings are correlated with the bee’s natural needs, irrespective of whether they can be explained, in addition, by some of the peripheral specializations.

Correlation with peripheral specializations

It remains to compare the various performances in the light of the peripheral specializations. Spatial vision, colour vision and motion vision will each be discussed separately.

Spatial resolution

The peripheral specializations predict better spatial resolution in the frontal eye region than in the other regions, as well as enhanced vertical resolution around the eye equator. In contrast to these predictions, we find the following. (i) Pattern detection (Fig. 2) and (ii) pattern discrimination (Fig. 3) are best in the lower frontal part of the visual field, an eye region that does not contain an acute zone. (iii) Spatial resolution of sectored patterns (Fig. 3) is better in the ventral frontal eye region than in the lateral frontal region, although the latter lies on the eye equator, whereas the former does not. (iv) Spatial frequency is discriminated in the ventral visual field (Anderson, 1977) as reliably as in the frontal field (Fig. 3; see also Fig. 59 in Wehner, 1981), although the latter contains an acute zone, whereas the former does not. (v) Using patterns presented in the frontal visual field, Srinivasan and Lehrer (1988) found that spatial resolution of vertically striped patterns is as accurate as that of horizontally striped patterns although, on the basis of anatomical findings, spatial resolution in the vertical direction, and thus of the horizontally striped pattern, is expected to be better than that of the vertically striped pattern. (vi) Discrimination of angular size (Schnetter, 1972; Wehner, 1981) and of (vii) absolute size (Horridge et al. 1992) are as accurate in the ventral visual field as they are in the frontal field. (viii) The same holds true for the detection of small objects against a contrasting background (Zaccardi et al. 1997). (ix) The finding that a horizontal stripe in an exactly lateral position is more effective than are more dorsal or ventral ones in guiding the bee to a frontal goal (Fig. 4) cannot be explained in terms of anatomical specializations. 
Although the acute zone around the eye equator would, indeed, predict a particularly good spatial resolution in the vertical direction, and thus of the lateral stripe, the width of the stripe (14 °) was well above resolution threshold in all the eye regions in which it was presented (Lehrer, 1990). (x) The finding that the frontal eye region makes use of several spatial parameters that the ventral eye region cannot make use of (such as contour orientation and the distribution of contrasting areas) is explained better by the finding that spatial vision in the bee is not space-invariant than by the particularly good resolution expected from the frontal visual field.

None of the results listed above (some of which have not been described in previous sections of this review) is in accordance with expectations based on peripheral specializations.

Colour vision

With respect to colour vision, the physiological findings predict similar performances in all eye regions. What we find, however, is (i) that colour discrimination in the lower half of the frontal eye region is better than it is in the upper half (Fig. 7), (ii) that the dorsal eye region is incapable of colour discrimination (Giger and Srinivasan, 1997), and (iii) that, in tasks that require the use of image motion, the bee behaves as if she were colour-blind, regardless of the eye region being investigated (e.g. Figs 9–11; for a review, see Lehrer, 1993), although there are no peripheral correlates for colour blindness.

Motion resolution

On the basis of the anatomical findings, stimuli moving in a horizontal direction are expected to be resolved better than stimuli moving in a vertical direction. Although stabilization of an edge on the eye was found to be based on motion detection (Lehrer et al. 1985; Lehrer, 1990), the particular efficacy of horizontal edges presented in the lateral visual field (Figs 4, 15) cannot be due to this specialization, because a horizontal edge can only move on the eye in the vertical direction.

On the other hand, the particularly strong optomotor response to vertical gratings moving horizontally in the lateral visual field (Moore and Rankin, 1982) would be in accordance with the anatomical findings, predicting a better resolution of horizontal motion in the lateral visual field than in the frontal field. However, the optomotor system is only active at very low contrast frequencies, and thus the stimuli used are expected to have been resolved easily even in the frontal eye region.

One finding that might be explained by the anatomical specializations is that of Srinivasan et al. (1997b). In their experiments, estimation of the distance flown did not function in the ventral visual field, whereas in the lateral visual field the bees’ performance in this task was excellent. It is possible that the pattern on the ground moved too fast at the bee’s eye to be resolved, whereas resolution of the same pattern in the lateral visual field was still possible because of the larger horizontal interommatidial angles there. Using the movement avoidance response, temporal resolution in the ventral visual field (Fig. 10A,B) was found to be as high as in the frontal visual field (Fig. 9A,B). However, movement avoidance requires no more than motion detection, whereas estimation of the distance flown requires the integration of motion speed over time. It might be of some value to evaluate the bees’ speed of flight and thus the speed of image motion perceived by them in the tunnel used by Srinivasan et al. (1997b) or to vary the spatial frequency of the pattern, as has been done by Ronacher and Wehner (1995).
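The distinction drawn here between detection and integration can be illustrated with a toy model (everything below is a hypothetical sketch, not a model taken from the studies cited): a movement-avoidance response needs only a yes/no motion signal, whereas an odometer must integrate an accurate speed estimate over the whole flight, so a speed signal that is lost above some resolution limit ruins odometry without necessarily abolishing detection.

```python
def estimated_speed(true_speed, resolution_limit):
    """Toy speed estimator: faithful below the resolution limit,
    uninformative (here, zero) above it."""
    return true_speed if true_speed <= resolution_limit else 0.0

def motion_detected(true_speed):
    """Movement avoidance needs only a binary motion signal."""
    return true_speed > 0.0

def odometer(true_speed, duration, resolution_limit):
    """Distance estimate: the estimated speed integrated over the flight."""
    return estimated_speed(true_speed, resolution_limit) * duration

LIMIT = 50.0   # hypothetical resolution limit (arbitrary units)
fast = 80.0    # ventral image speed over a nearby ground pattern
slow = 20.0    # lateral image speed from the more distant tunnel walls

detected = motion_detected(fast)       # True: avoidance still possible
ventral = odometer(fast, 10.0, LIMIT)  # 0.0: speed signal lost, odometry fails
lateral = odometer(slow, 10.0, LIMIT)  # 200.0: odometry intact
```

On this toy account, a ground pattern sweeping past the ventral eye faster than its resolution limit would still trigger detection-based responses while contributing nothing to the odometric integral, whereas the slower lateral flow would remain usable.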

The special case of the dorsal rim region

We have not considered the bee’s uppermost dorsal rim region (POL area, see black sickle-shaped areas in Fig. 1C), which is the only eye region capable of analysing the orientation of the E-vector of the skylight pattern (Wehner and Strasser, 1985). This function is correlated with several very conspicuous specializations (for a review and references, see Wehner, 1994) that are unique to the POL area and are lacking in all the other eye regions. The POL area seems to be the only eye region in which the correlation between the behaviourally measured performance and the peripheral specializations has been demonstrated beyond any doubt.

The present review illustrates the large variety of visual tasks that different eye regions must be prepared to undertake, depending on the situation. Because it is impossible to construct foveas all over the eye, the best way to render all the eye regions suitable for all possible types of performance is by evolving neural, rather than anatomical, specializations. It seems that each eye region is capable of admitting all types of incoming visual information, and then of extracting, via particular neural pathways, the particular information that is relevant to the task in hand. The differences found among the performances of different eye regions may thus be a consequence of different degrees of facilitation associated with the different neural pathways. The degree to which this facilitation is effective might be correlated with the probability of a particular visual cue being encountered in a particular eye region. The facilitation might thus be a consequence of individual experience and therefore of learning processes.

I am greatly indebted to Mandyam Srinivasan for many thoughtful comments on the manuscript. Thanks are due to Eric Meyer for preparing the coloured illustrations and the electronic versions of the figures. The data shown in Figs 6, 7 and 10 were collected with the enthusiastic help of several students to whom I extend my gratitude. Last but not least, I wish to thank William Harvey, the review editor of this journal, for having accepted this review for publication despite its unusual length.

References

Anderson, A. M. (1977). Parameters determining the attractiveness of stripe patterns in the honey bee. Anim. Behav. 25, 80–87.
Autrum, H. and Stöcker, M. (1950). Die Verschmelzungsfrequenzen des Bienenauges. Z. Naturforsch. 5b, 38–43.
Bernard, G. D. and Wehner, R. (1980). Intracellular optical physiology of the bee’s eye. J. comp. Physiol. 137, 193–203.
Brünnert, U., Kelber, A. and Zeil, J. (1994). Ground-nesting bees determine the location of their nest relative to a landmark by other than angular size cues. J. comp. Physiol. A 175, 363–370.
Campan, R., Goulet, M. and Lambin, M. (1981). L’appréciation de l’éloignement relatif entre deux objets chez le grillon Nemobius sylvestris (Bosc) et l’aptérygote Lepismachilis targionii (Grassi). Bull. Soc. Hist. nat. Toulouse 117, 41–50.
Cartwright, B. A. and Collett, T. S. (1979). How honey-bees know their distance from a near-by visual landmark. J. exp. Biol. 82, 367–372.
Cartwright, B. A. and Collett, T. S. (1983). Landmark learning in bees: experiments and models. J. comp. Physiol. 121, 521–543.
Chittka, L. and Menzel, R. (1992). The evolutionary adaptation of flower colors and the insect pollinator’s color vision system. J. comp. Physiol. A 171, 171–181.
Collett, T. S. (1978). Peering – a locust behaviour pattern for obtaining motion parallax information. J. exp. Biol. 76, 237–241.
Collett, T. S. (1992). Landmark learning and guidance in insects. Phil. Trans. R. Soc. Lond. B 337, 295–303.
Daumer, K. (1956). Reizmetrische Untersuchung des Farbensehens der Biene. Z. vergl. Physiol. 38, 413–478.
Esch, H. E. and Burns, J. E. (1995). Honeybees use optic flow to measure the distance of a food source. Naturwissenschaften 82, 38–40.
Esch, H. E. and Burns, J. E. (1996). Distance estimation by foraging honeybees. J. exp. Biol. 199, 155–162.
Free, J. B. (1970). Effect of flower shapes and nectar guides on the behaviour of foraging bees. Behaviour 37, 269–285.
Giger, A. D. and Srinivasan, M. V. (1997). Honeybee vision: analysis of orientation and colour in the lateral, dorsal and ventral fields of view. J. exp. Biol. 200, 1271–1280.
Giurfa, M., Eichmann, B. and Menzel, R. (1996). Symmetry perception in an insect. Nature 382, 458–461.
Hertz, M. (1930). Die Organisation des optischen Feldes bei der Biene II. Z. vergl. Physiol. 11, 107–145.
Hertz, M. (1933). Über figurale Intensitäten und Qualitäten in der optischen Wahrnehmung der Biene. Biol. Zbl. 54, 10–40.
Horridge, G. A. (1980). Apposition eyes of large diurnal insects as organs adapted to seeing. Proc. R. Soc. Lond. B 285, 1–59.
Horridge, G. A. (1988). A theory of insect vision: velocity parallax. Proc. R. Soc. Lond. B 229, 13–17.
Horridge, G. A. (1996). The honeybee (Apis mellifera) detects bilateral symmetry and discriminates its axis. J. Insect Physiol. 42, 755–764.
Horridge, G. A. (1997). Spatial and non-spatial coding of patterns by the honey-bee. In From Living Eyes to Seeing Machines (ed. M. V. Srinivasan and S. Venkatesh), pp. 52–79. Oxford: Oxford University Press.
Horridge, G. A. (1998). Spatial coincidence of cues in visual learning by the honeybee (Apis mellifera). J. Insect Physiol. 44, 343–350.
Horridge, G. A., Zhang, S. W. and Lehrer, M. (1992). Bees can combine range and visual angle to estimate absolute size. Phil. Trans. R. Soc. Lond. B 337, 49–57.
Kaiser, W. and Liske, E. (1974). Die optomotorischen Reaktionen von fixiert fliegenden Bienen bei Reizung mit Spektrallichtern. J. comp. Physiol. 80, 391–408.
Kern, R., Egelhaaf, M. and Srinivasan, M. V. (1997). Edge detection by landing honeybees: behavioural analysis and model simulations of the underlying mechanism. Vision Res. 15, 2103–2117.
Kevan, P. G. (1973). Flowers, insects and pollination ecology in the Canadian High Arctic. Polar Records 16, 667–674.
Kirchner, W. H. and Heusipp, M. (1990). Freely flying honeybees use retinal image motion and motion parallax in visual course control. Proceedings of the Göttingen Neurobiology Conference 18, 84. Stuttgart, New York: George Thieme Verlag.
Kirchner, W. H. and Srinivasan, M. V. (1989). Freely flying honeybees use image motion to estimate object distance. Naturwissenschaften 76, 281–282.
Kunze, P. (1961). Untersuchung des Bewegungssehen fixiert fliegender Bienen. Z. vergl. Physiol. 44, 656–684.
Land, M. F. (1989). Variations in the structure and design of compound eyes. In Facets of Vision (ed. D. G. Stavenga and R. C. Hardie), pp. 90–111. Berlin, Heidelberg: Springer-Verlag.
Land, M. F. (1997). The resolution of insect compound eyes. Israel J. Plant Sci. 45, 79–92.
Lehrer, M. (1990). How bees use peripheral eye regions to localize a frontally positioned target. J. comp. Physiol. A 167, 173–185.
Lehrer, M. (1993). Parallel processing of motion, colour and shape in the visual system of the honeybee. In Arthropod Sensory Systems (ed. K. Wiese, F. G. Gribakin, A. V. Popow and G. Renninger), pp. 266–272. Basel, Boston, Berlin: Birkhäuser.
Lehrer, M. (1994). Spatial vision in the honeybee: The use of different cues in different tasks. Vision Res. 34, 2363–2385.
Lehrer, M. (1997). Honeybee’s use of spatial parameters for flower discrimination. Israel J. Plant Sci. 45, 159–169.
Lehrer, M. and Collett, T. S. (1994). Approaching and departing bees learn different cues to the distance of a landmark. J. comp. Physiol. A 175, 171–177.
Lehrer, M., Horridge, G. A., Zhang, S. W. and Gadagkar, R. (1994). Shape vision in bees: innate preference for flower-like patterns. Phil. Trans. R. Soc. Lond. B 347, 123–137.
Lehrer, M. and Srinivasan, M. V. (1992). Freely flying bees can discriminate between moving and stationary objects: performance and possible mechanisms. J. comp. Physiol. A 171, 457–467.
Lehrer, M. and Srinivasan, M. V. (1993). Object–ground discrimination in bees: Why do they land on edges? J. comp. Physiol. A 173, 23–32.
Lehrer, M. and Srinivasan, M. V. (1994). Active vision in honeybees: task-oriented suppression of an innate behaviour. Vision Res. 34, 511–516.
Lehrer, M., Srinivasan, M. V. and Zhang, S. W. (1990). Visual edge detection in the honeybee and its spectral properties. Proc. R. Soc. Lond. 238, 321–330.
Lehrer, M., Srinivasan, M. V., Zhang, S. W. and Horridge, G. A. (1988). Motion cues provide the bee’s visual world with a third dimension. Nature 332, 356–357.
Lehrer, M., Wehner, R. and Srinivasan, M. V. (1985). Visual scanning behaviour in honeybees. J. comp. Physiol. A 157, 405–415.
Mazochin-Porshnyakov, G. A., Semyonova, S. A. and Milevskaya, I. A. (1977). Characteristic features of the identification by Apis mellifera of objects by their size (in Russian). J. Obsch. Biol. 38, 855–962.
Menzel, R. (1967). Untersuchungen zum Erlernen von Spektralfarben durch die Honigbiene, Apis mellifica. Z. vergl. Physiol. 56, 22–62.
Menzel, R. and Backhaus, W. (1989). Colour vision in honeybees: Phenomena and physiological mechanism. In Facets of Vision (ed. D. G. Stavenga and R. C. Hardie), pp. 281–297. Berlin, Heidelberg: Springer.
Menzel, R. and Blakers, M. (1976). Colour receptors in the bee eye – morphology and spectral sensitivity. J. comp. Physiol. 108, 11–33.
Menzel, R. and Lieke, E. (1983). Antagonistic color effects in spatial vision of honeybees. J. comp. Physiol. 151, 441–448.
Menzel, R. and Shmida, A. (1993). The ecology of flower colours and the natural colour vision of insect pollinators: The Israeli flora as a study case. Biol. Rev. 68, 81–120.
Moore, D. and Rankin, M. A. (1982). Direction-sensitive partitioning of the honeybee optomotor system. Physiol. Ent. 7, 25–36.
Neal, P. R., Dafni, A. and Giurfa, M. (1998). Floral symmetry and its role in plant–pollinator systems: Terminology, distribution and hypotheses. A. Rev. ecol. Syst. 29, 345–373.
Ronacher, B. (1979). Äquivalenz zwischen Grössen- und Helligkeitsunterschieden im Rahmen der visuellen Wahrnehmung der Honigbiene. Biol. Cybernetics 32, 63–75.
Ronacher, B. and Wehner, R. (1995). Desert ants Cataglyphis fortis use self-induced optic flow to measure distance travelled. J. comp. Physiol. A 177, 21–28.
Schlieper, C. (1928). Über die Helligkeitsverteilung im Spektrum bei verschiedenen Insekten. Z. vergl. Physiol. 8, 281–282.
Schnetter, B. (1972). Experiments on pattern discrimination in honey bees. In Information Processing in the Visual Systems of Arthropods (ed. R. Wehner), pp. 195–200. Berlin, Heidelberg, New York: Springer.
Seidl, R. and Kaiser, W. (1981). Visual field size, binocular domain and the ommatidial array of the compound eyes in the worker honeybee. J. comp. Physiol. 143, 17–26.
Sobel, E. C. (1990). The locust’s use of motion parallax to measure distance. J. comp. Physiol. A 167, 579–588.
Srinivasan, M. V. (1994). Pattern recognition in the honeybee: Recent progress. J. Insect Physiol. 40, 183–194.
Srinivasan, M. V., Chahl, J. S., Nagle, M. G. and Zhang, S. W. (1997a). Embodying natural vision into machines. In From Living Eyes to Seeing Machines (ed. M. V. Srinivasan and S. Venkatesh), pp. 249–266. Oxford: Oxford University Press.
Srinivasan, M. V. and Lehrer, M. (1984a). Temporal acuity of honeybee vision: behavioural studies using flickering stimuli. Physiol. Ent. 9, 447–457.
Srinivasan, M. V. and Lehrer, M. (1984b). Temporal acuity of honeybee vision: behavioural studies using moving stimuli. J. comp. Physiol. A 155, 297–312.
Srinivasan, M. V. and Lehrer, M. (1985). Temporal resolution of colour vision in the honeybee. J. comp. Physiol. A 157, 579–586.
Srinivasan, M. V. and Lehrer, M. (1988). Spatial acuity of honeybee vision and its chromatic properties. J. comp. Physiol. A 162, 159–172.
Srinivasan, M. V., Lehrer, M. and Horridge, G. A. (1990). Visual figure–ground discrimination in the honeybee: the role of motion parallax at boundaries. Proc. R. Soc. Lond. 238, 331–350.
Srinivasan, M. V., Lehrer, M., Kirchner, W. and Zhang, S. W. (1991). Range perception through apparent image speed in freely flying honeybees. Visual Neurosci. 6, 519–536.
Srinivasan, M. V. and Zhang, S. W. (1997). Visual control of honeybee flight. In Orientation and Communication in Arthropods (ed. M. Lehrer), pp. 95–114. Basel, Boston, Berlin: Birkhäuser.
Srinivasan, M. V., Zhang, S. W. and Bidwell, N. J. (1997b). Visually mediated odometry in honeybees. J. exp. Biol. 200, 2513–2522.
Srinivasan, M. V., Zhang, S. W., Lehrer, M. and Collett, T. S. (1996). Honeybee navigation en route to the goal: visual flight control and odometry. J. exp. Biol. 199, 237–244.
Srinivasan, M. V., Zhang, S. W. and Rolfe, B. (1993). Pattern vision in insects: ‘cortical’ processing? Nature 362, 539–540.
Srinivasan, M. V., Zhang, S. W. and Whitney, K. (1994). Visual discrimination of pattern orientation by honeybees. Phil. Trans. R. Soc. Lond. B 343, 199–210.
Van Hateren, H. J., Srinivasan, M. V. and Wait, P. B. (1990). Pattern recognition in bees: orientation discrimination. J. comp. Physiol. A 167, 649–654.
Von Frisch, K. (1915). Der Farbensinn und Formensinn der Bienen. Zool. Jb. Abt. allg. Zool. Physiol. 35, 1–182.
Von Frisch, K. (1965). Tanzsprache und Orientierung der Bienen. Berlin, Heidelberg, New York: Springer.
Von Helversen, O. (1972). Zur spektralen Unterschiedsempfindlichkeit der Honigbiene. J. comp. Physiol. 80, 439–472.
Walcher, F. and Kral, K. (1994). Visual deprivation and distance estimation in the praying mantis larva. Physiol. Ent. 19, 230–240.
Wallace, G. K. (1959). Visual scanning in the desert locust Schistocerca gregaria, Forskål. J. exp. Biol. 36, 512–525.
Wehner, R. (1972a). Dorsoventral asymmetry in the visual field of the bee, Apis mellifica. J. comp. Physiol. 77, 256–277.
Wehner, R. (1972b). Pattern modulation and pattern detection in the visual system of Hymenoptera. In Information Processing in the Visual System of Arthropods (ed. R. Wehner), pp. 183–194. Berlin, Heidelberg, New York: Springer.
Wehner, R. (1974). Pattern recognition. In The Compound Eye and Vision of Insects (ed. G. A. Horridge), pp. 75–113. Oxford: Clarendon Press.
Wehner, R. (1979). Mustererkennung bei Insekten: Lokalisation und Identifikation visueller Objekte. Verh. dt. zool. Ges. 1979, 19–41.
Wehner, R. (1981). Spatial vision in arthropods. In Handbook of Sensory Physiology, vol. VII/6C (ed. H. Autrum), pp. 287–616. Berlin, Heidelberg, New York: Springer.
Wehner, R. (1992). Homing in arthropods. In Animal Homing (ed. F. Papi), pp. 45–144. London: Chapman & Hall.
Wehner, R. (1994). The polarization-vision project: championing organismic biology. Fortschr. Zool. 39, 104–143.
Wehner, R. (1997). The ant’s celestial compass system: spectral and polarization channels. In Orientation and Communication in Arthropods (ed. M. Lehrer), pp. 145–186. Basel, Boston, New York: Birkhäuser.
Wehner, R. and Flatt, I. (1977). Visual fixation in freely flying bees. Z. Naturforsch. 32c, 469–471.
Wehner, R. and Lindauer, M. (1966). Zur Physiologie des Formensehens bei der Honigbiene. I. Winkelunterscheidung an vertikal orientierten Streifenmustern. Z. vergl. Physiol. 52, 290–324.
Wehner, R. and Rossel, S. (1985). The bee’s celestial compass – A case study in behavioural neurobiology. Fortschr. Zool. 31, 11–53.
Wehner, R. and Strasser, S. (1985). The POL area of the honey bee’s eye: behavioural evidence. Physiol. Ent. 10, 337–349.
Wolf, E. (1933). Das Verhalten der Bienen gegenüber flimmernden Feldern und bewegten Objekten. Z. vergl. Physiol. 20, 151–161.
Wolf, E. and Zerrahn-Wolf, G. (1935). The effect of light intensity, area and flicker frequency on the visual reactions of the honeybee. J. gen. Physiol. 18, 853–863.
Zaccardi, G., Giurfa, M. and Vorobyev, M. (1997). How bees detect different targets using different regions of their compound eyes. Proceedings of the Göttingen Neurobiology Conference 25, 479. Stuttgart, New York: George Thieme Verlag.
Zeil, J. (1993a). Orientation flights of solitary wasps (Cerceris; Sphecidae; Hymenoptera). I. Description of flight. J. comp. Physiol. A 172, 189–205.
Zeil, J. (1993b). Orientation flights of solitary wasps (Cerceris; Sphecidae; Hymenoptera). II. Similarity between orientation and return flights and the use of motion parallax. J. comp. Physiol. A 172, 209–224.
Zerrahn, G. (1934). Formdressur und Formunterscheidung bei der Honigbiene. Z. vergl. Physiol. 20, 117–150.
Zhang, S. W. and Srinivasan, M. V. (1990). Visual tracking of moving targets by freely flying honeybees. Visual Neurosci. 4, 379–386.
Zhang, S. W. and Srinivasan, M. V. (1994). Prior experience enhances pattern discrimination in insect vision. Nature 368, 330–332.
Zhang, S. W., Srinivasan, M. V. and Collett, T. S. (1995). Convergent processing in honeybee vision: Multiple channels for the recognition of shape. Proc. natn. Acad. Sci. U.S.A. 92, 3029–3031.