
1-56 Principles of Sound and Hearing

Loudness: This self-explanatory dimension is a useful check on the accuracy with which the
sound levels of comparison sounds have been matched. It should, however, be noted that some
listeners seem to regard the adjective loud as a synonym for sharp, hard, or painful.

The relative importance of these dimensions in describing overall sound quality changes slightly according to the specific nature of the devices under test, the form of the listener questionnaire, the program material, and, to some extent, the listeners themselves. In general, Gabrielsson and colleagues [13, 15] have found that clarity, or definition, brightness versus darkness, and sharpness, or hardness, versus softness are major contributors to the overall impression of sound quality.

1.4.3f Audibility of Variations in Amplitude and Phase

Other things being equal, very small differences in sound level can be heard: down to a fraction of a decibel in direct A/B comparisons. Level differences that exist over only a small part of the spectrum tend to be less audible than differences that occupy a greater bandwidth. In other words, a small difference that extends over several octaves may be as significant as a much larger difference that is localized in a narrow band of frequencies. Spectral tilts of as little as 0.1 dB per octave are audible. For simple sounds the only audible difference may be loudness, but for complex sounds differences in timbre may be more easily detectable.
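The cumulative effect of such a tilt can be checked with simple arithmetic: a constant tilt in decibels per octave accumulates with the base-2 logarithm of the frequency ratio. A minimal sketch (the function name and band limits are illustrative, not from the text):

```python
import math

def tilt_total_db(db_per_octave, f_low, f_high):
    """Total level change of a constant spectral tilt across a band.

    The tilt accumulates linearly with the number of octaves spanned,
    i.e. with log2(f_high / f_low).
    """
    octaves = math.log2(f_high / f_low)
    return db_per_octave * octaves

# A 0.1 dB/octave tilt across the audible band (20 Hz to 20 kHz)
# spans just under 10 octaves:
total = tilt_total_db(0.1, 20.0, 20000.0)
print(f"{total:.2f} dB end to end")
```

Across the roughly ten octaves of the audible band this works out to only about 1 dB overall, which underlines how sensitive listeners are to broadband tilts compared with narrowband deviations of similar size.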

The audibility of phase shift is a very different matter. This hotly debated issue assumes major proportions because of the implication that if phase shifts are not audible, then the waveform of a complex sound, per se, is not important. Several independent investigations over many years have led to the conclusion that while there are some special signals and listening situations where phase effects can be heard, their importance when listening to music in conventional environments is small [19]. Psychophysical studies indicate that, in general, sensitivity to phase is small compared with sensitivity to the amplitude spectrum and that sensitivity to phase decreases as the fundamental frequency of the signal increases. At the same time, it appears to be phase shifts in the upper harmonics of a complex signal that contribute most to changes in timbre [20].

The notion that phase, and therefore waveform, information is relatively unimportant is consistent with some observations of normal hearing. Sounds from real sources (voices and musical instruments) generally arrive at our ears after traveling over many different paths, some of which may involve several reflections. The waveform at the ear therefore depends on various factors

Figure 1.4.6 The estimated average transformation of sound pressure level from the free field to the eardrum as a function of frequency, showing the variations as a function of the angle of elevation for sounds arriving from the forward direction. (From [1]. Used with permission.)

Downloaded from Digital Engineering Library @ McGraw-Hill (www.digitalengineeringlibrary.com)

Copyright © 2004 The McGraw-Hill Companies. All rights reserved.

Any use is subject to the Terms of Use as given at the website.

The Physical Nature of Hearing



other than the source itself. Even the argument that the direct sound is especially selected for audition and that later arrivals are perceptually suppressed does not substantially change the situation because sources themselves do not radiate waveforms that are invariably distinctive. With musical instruments radiating quite different components of their sound in different directions (consider the complexity of a grand piano or the cello, for example), the sum of these components—the waveform at issue—will itself be different at every different angle and distance; a recording microphone is in just such a situation.

The fact that the ear seems to be relatively insensitive to phase shifts would therefore appear to be simply a condition born of necessity. It would be incorrect to assume, however, that the phase performance of devices is totally unimportant. Spectrally localized phase anomalies are useful indicators of the presence of resonances in systems, and very large accumulations of phase shift over a range of frequencies can become audible as group delays.

While the presence of resonances can be inferred from phase fluctuations, their audibility may be better predicted from evidence in the amplitude domain [19]. It should be added that resonances of low Q in sound reproduction systems are more easily heard than those of higher Q [21–23]. This has the additional interesting ramification that evidence of sustained ringing in the time domain may be less significant than ringing that is rapidly damped; waveform features and other measured evidence that attract visual attention do not always correspond directly with the sound colorations that are audible in typical listening situations.
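The link between Q and ringing duration can be made concrete with the standard second-order resonance model (a textbook approximation, not a derivation from this chapter): the impulse-response envelope decays as exp(−π·f0·t/Q), so the time to fall 60 dB is ln(1000)·Q/(π·f0), or roughly 2.2·Q/f0. A sketch:

```python
import math

def ring_t60(f0_hz, q):
    """Approximate 60 dB decay time of a second-order resonance.

    The impulse-response envelope decays as exp(-pi * f0 * t / Q),
    so the 60 dB decay time is ln(1000) * Q / (pi * f0),
    approximately 2.2 * Q / f0.
    """
    return math.log(1000.0) * q / (math.pi * f0_hz)

# At the same center frequency, a Q of 50 rings ten times longer
# than a Q of 5, even though the low-Q resonance is reported to be
# the more audible coloration [21-23]:
print(ring_t60(1000.0, 50))  # sustained ringing
print(ring_t60(1000.0, 5))   # rapidly damped
```

This is the quantitative sense in which visually dramatic, sustained ringing can accompany a resonance that is actually harder to hear than a broader, quickly damped one.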

1.4.3g Perception of Direction and Space

Sounds are commonly perceived as arriving from specific directions, usually coinciding with the physical location of the sound source. This perception may also carry with it a strong impression of the acoustical setting of the sound event, which normally is related to the dimensions, locations, and sound-reflecting properties of the structures surrounding the listener and the sound source as well as objects in the intervening path.

Blauert, in his thorough review of the state of knowledge in this field [17], defines spatial hearing as embracing “the relationships between the locations of auditory events and other parameters—particularly those of sound events, but also others such as those that are related to the physiology of the brain.” This statement introduces terms and concepts that may require some explanation. The adjective sound, as in sound event, refers to a physical source of sound, while the adjective auditory identifies a perception. Thus, the perceived location of an auditory event usually coincides with the physical location of the source of sound. Under certain circumstances, however, the two locations may differ slightly or even substantially. The difference is then attributed to other parameters having nothing whatever to do with the physical direction of the sound waves impinging on the ears of the listener, such as subtle aspects of a complex sound event or the processing of the sound signals within the brain.

Thus have developed the parallel studies of monaural, or one-eared, hearing and binaural, or two-eared, hearing. Commercial sound reproduction has stimulated a corresponding interest in the auditory events associated with sounds emanating from a single source (monophonic) and from multiple sources that may be caused to differ in various ways (stereophonic). In common usage it is assumed that stereophonic reproduction involves only two loudspeakers, but there are many other possible configurations. In stereophonic reproduction the objective is to create many more auditory events than the number of real sound sources would seem to permit. This is accomplished by presenting to the listener combinations of sounds that take advantage of certain

inbuilt perceptual processes in the brain to create auditory events in locations other than those of the sound events and in auditory spaces that may differ from the space within which the reproduction occurs.

Understanding the processes that create auditory events would ideally permit the construction of predictable auditory spatial illusions in domestic stereophonic reproduction, in cinemas, in concert halls, and in auditoria. Although this ideal is far from being completely realized, there are some important patterns of auditory behavior that can be used as guides for the processing of sound signals reproduced through loudspeakers as well as for certain aspects of listening room, concert hall, and auditorium design.

1.4.3h Monaural Transfer Functions of the Ear

Sounds arriving at the ears of the listener are subject to modification by sound reflection, diffraction, and resonances in the structures of the external ear, head, shoulders, and torso. The amount and form of the modification are dependent on the frequency of the sound and the direction and distance of the source from which the sound emanates. In addition to the effect that this has on the sensitivity of the hearing process, which affects signal detection, there are modifications that amount to a kind of directional encoding, wherein sounds arriving from specific directions are subject to changes characteristic of those directions.

Each ear is partially sheltered from sounds arriving from the other side of the head. The effect of diffraction is such that low-frequency sounds, with wavelengths that are large compared with the dimensions of the head, pass around the head with little or no attenuation, while higher frequencies are progressively more greatly affected by the directional effects of diffraction. There is, in addition, the acoustical interference that occurs among the components of sound that have traveled over paths of slightly different length around the front and back and over the top of the head.
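The scale of this effect can be estimated from the wavelength alone; the head-diameter figure below is an assumed typical value for illustration, not a number taken from the text:

```python
def wavelength_m(f_hz, c=343.0):
    """Acoustic wavelength in air (c ~= 343 m/s at room temperature)."""
    return c / f_hz

def shadow_onset_hz(head_diameter_m=0.175, c=343.0):
    """Frequency at which the wavelength equals an assumed head
    diameter; above this, head shadowing becomes increasingly strong."""
    return c / head_diameter_m

print(wavelength_m(100.0))   # several meters: diffracts around the head freely
print(shadow_onset_hz())     # roughly 2 kHz for a ~17.5 cm head
```

Below a couple of hundred hertz the wavelength dwarfs the head and interaural level differences are small; the transition toward strong shadowing sits near the frequency the second function returns.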

Superimposed on these effects are those of the pinna, or external ear. The intriguingly complex shape of this structure has prompted a number of theories of its behavior, but only relatively recently have some of its important functions been properly put into perspective. According to one view, the folds of the pinna form reflecting surfaces, the effect of which is to create, at the entrance to the ear canal, a system of interferences between the direct and these locally reflected sounds that depends on the direction and distance of the incoming sound [24]. The small size of the structures involved compared with the wavelengths of audible sounds indicates that dispersive scattering, rather than simple reflection, is likely to be the dominant effect. Nevertheless, measurements have identified some acoustical interferences resembling those that such a view would predict, and these have been found to correlate with some aspects of localization [18, 25].

In the end, however, the utility of the theory must be judged on the basis of how effectively it explains the physical functions of the device and how well it predicts the perceptual consequences of the process. From this point of view, time-domain descriptions would appear to be at a disadvantage since the hearing process is demonstrably insensitive to the fine structure of signals at frequencies above about 1.5 kHz [17]. Partly for this reason most workers have favored descriptions in terms of spectral cues.

It is therefore convenient that the most nearly complete picture of external-ear function has resulted from examinations of the behavior of the external ear in the frequency domain. By carefully measuring the pressure distributions in the standing-wave patterns, the dominant resonances in the external ear have been identified [26]. These have been related to the physical structures and to the measured acoustical performance of the external ear [1].

A particularly informative view of the factors involved in this discussion comes from an examination of curves showing the transformation of SPL from the free field to the eardrum [27]. These curves reveal, as a function of frequency, the amplitude modifications imposed on incident sounds by the external hearing apparatus. Figure 1.4.5 shows the family of curves representing this transformation for sounds arriving from different directions in the horizontal plane. Figure 1.4.6 shows the estimated transformations for sound sources at different elevations.

An interesting perspective on these data is shown in Figure 1.4.7, where it is possible to see the contributions of the various acoustical elements to the total acoustical gain of the ear. It should be emphasized that there is substantial acoustical interaction among these components, so that the sum of any combination of them is not a simple arithmetic addition. Nevertheless, this presentation is a useful means of acquiring a feel for the importance of the various components.

It is clear from these curves that there are substantial direction-dependent spectral changes, some rather narrowband in influence and others amounting to significant broadband tilts. Several studies in localization have found that, especially with pure tones and narrowband signals, listeners could attribute direction to auditory events resulting from sounds presented through only one ear (monaural localization) or presented identically in two ears, resulting in localization in the median plane (the plane bisecting the head vertically into symmetrical left-right halves). So strong are some of these effects that they can cause auditory events to appear in places different from the sound event, depending only on the spectral content of the sound. Fortunately such confusing effects are not common in the panorama of sounds we normally encounter, partly because of familiarity with the sounds themselves, but the process is almost certainly a part of the mechanism by which we are able to distinguish between front and back and between up and down, directions that otherwise would be ambiguous because of the symmetrical locations of the two ears.

Figure 1.4.7 Contributions of various body parts to the total acoustic gain of the external hearing system for a sound source at a horizontal angle of 45°. Note that the interactions between these components prevent simple arithmetic addition of their individual contributions. (From [1]. Used with permission.)


Interaural Differences

As useful as the monaural cues are, it is sound localization in the horizontal plane that is dominant, and for this the major cues come from the comparison of the sounds at the two ears and the analysis of the differences between them. From the data shown in Figure 1.4.5 it is evident that there is a substantial frequency-dependent interaural amplitude difference (IAD) that characterizes sounds arriving from different horizontal angles. Because of the path length differences there will also be an associated interaural time difference (ITD) that is similarly dependent on horizontal angle.

Figure 1.4.8 shows IADs as a function of frequency for three angles of incidence in the horizontal plane. These have been derived from the numerical data in [28], from which many other such curves can be calculated.

The variations in IAD as a function of both frequency and horizontal angle are natural consequences of the complex acoustical processes in the external hearing apparatus. Less obvious is the fact that there is frequency dependency in the ITDs. Figure 1.4.9 shows the relationship between ITD and horizontal angle for various pure tones and for broadband clicks. Also shown are the predictive curves for low-frequency sounds, based on diffraction theory, and for high-frequency sounds, based on the assumption that the sound reaches the more remote ear by traveling as a creeping wave that follows the contour of the head. At intermediate frequencies (0.5 to 2 kHz) the system is dispersive, and the temporal differences become very much dependent on the specific nature of the signal [29, 30].
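The two predictive regimes can be sketched with standard approximations from the psychoacoustics literature; the formulas and the effective head radius below are assumptions for illustration, not values given in this chapter:

```python
import math

HEAD_RADIUS_M = 0.0875   # assumed effective head radius (~17.5 cm head)
C = 343.0                # speed of sound in air, m/s

def itd_low_freq(theta_rad, a=HEAD_RADIUS_M, c=C):
    """Low-frequency ITD from diffraction theory:
    approximately 3 * (a / c) * sin(theta)."""
    return 3.0 * a / c * math.sin(theta_rad)

def itd_high_freq(theta_rad, a=HEAD_RADIUS_M, c=C):
    """High-frequency ITD from the creeping-wave (Woodworth) model:
    (a / c) * (theta + sin(theta))."""
    return a / c * (theta_rad + math.sin(theta_rad))

# For a source directly to one side (90 degrees), the low-frequency
# prediction exceeds the high-frequency one by a substantial margin:
th = math.pi / 2
print(f"low-frequency  ITD: {itd_low_freq(th) * 1e6:.0f} us")
print(f"high-frequency ITD: {itd_high_freq(th) * 1e6:.0f} us")
```

The gap between the two curves at a given angle is one way of seeing why ITD is frequency dependent: the same head produces a larger time difference for low-frequency diffracted sound than for a high-frequency creeping wave.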

It is evident from these data that at different frequencies, especially the higher frequencies, there are different combinations of ITD and IAD associated with each horizontal angle of incidence. Attempts at artificially manipulating the localization of auditory events by means of fre-

Figure 1.4.8 The interaural amplitude difference as a function of frequency for three angles of incidence. (After [28].)
