Typographie mailing list archive

Subject:    Re: Mais où est passé Re: Re :Typographie du Gaffiot
Date:    Mon, 12 Mar 2001 12:26:01 -0500
From:    "Jean Fontaine" <jfontain@xxxxxxxxxxx>

> since I know you are a fanatic about measurements... I have this in my
> satchel...

Speaking of units, are you familiar with twips and kyus?

twip
a unit of distance used in computer graphics for high-resolution control of
the elements of an image. One twip is equal to 1/1440 inch, about 17.639
micrometers, or 0.070 556 kyu. "Twip" is an acronym for "twentieth of a
point," which is accurate if the point [2] is interpreted as being exactly
1/72 inch.

kyu
a metric unit of distance used in typography and graphic design. The kyu,
originally written Q, is equal to exactly 0.25 millimeter, about 0.71 point
[2], or about 14.173 twips. The spelling "kyu" seems to have been introduced
by the software company Macromedia.
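
For anyone who wants to check these figures, here is a quick Python sketch.
It assumes the Adobe/PostScript point of exactly 1/72 inch (so 1 twip = 1/1440
inch) and the kyu of exactly 0.25 mm, both as stated in the entries above:

# Quick numerical check of the twip and kyu figures quoted above.
INCH_MM = 25.4            # 1 inch = 25.4 mm exactly
TWIP_MM = INCH_MM / 1440  # 1 twip = 1/1440 inch (1/20 of a 1/72-inch point)
KYU_MM = 0.25             # 1 kyu (Q) = 0.25 mm exactly

print(f"1 twip = {TWIP_MM * 1000:.3f} micrometers")  # prints ~17.639
print(f"1 twip = {TWIP_MM / KYU_MM:.6f} kyu")        # prints ~0.070556
print(f"1 kyu  = {KYU_MM / TWIP_MM:.3f} twips")      # prints ~14.173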

point (pt) [2]
a unit of length used by typographers and printers. When printing was done
from hand-set metal type, one point represented the smallest element of type
that could be handled, roughly 1/64 inch. Eventually, the point was
standardized in Britain and America as exactly 0.013 837 inch, which is
about 0.35 mm (351.46 micrometers) and a little bit less than 1/72 inch. In
continental Europe, typographers traditionally used a slightly larger point
of 0.014 83 inch (about 1/72 pouce, 0.377 mm, or roughly 1/67 English inch),
called a Didot point after the French typographer Firmin Didot (1764-1836).
In the U.S., Adobe software defines the point to be exactly 1/72 inch (0.013
888 9 inch or 0.352 777 8 millimeters) and TeX software uses a slightly
smaller point of 0.351 459 8035 mm. The German standards agency DIN has
proposed that all these units be replaced by multiples of 0.25 millimeters
(1/101.6 inch).
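
To put the point variants above side by side, a small Python sketch. The TeX
point is taken here as 1/72.27 inch, which reproduces the 0.351 459 8035 mm
figure quoted; the other values come straight from the entry:

# Compare the point variants in millimeters.
INCH_MM = 25.4

points_mm = {
    "British/American point (0.013837 in)": 0.013837 * INCH_MM,
    "Didot point (0.01483 in)":             0.01483 * INCH_MM,
    "Adobe/PostScript point (1/72 in)":     INCH_MM / 72,
    "TeX point (1/72.27 in)":               INCH_MM / 72.27,
    "DIN proposal (0.25 mm)":               0.25,
}

for name, mm in points_mm.items():
    print(f"{name:40s} {mm:.6f} mm")  # e.g. the first line prints 0.351460 mm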

inch (in or ") [1]
a traditional unit of distance, equal to 1/12 foot or exactly 2.54
centimeters. The Old English word ynce is derived from the Latin uncia,
meaning a 1/12 part; thus "inch" and "ounce" actually have the same root.
The inch was originally defined in England in two ways: as the length of
three barleycorns laid end to end, or as the width of a man's thumb at the
base of the nail. The barleycorn definition is peculiarly English, but the
thumb-width definition is generic. In fact, in many European languages the
word for inch also means thumb: examples include the Dutch duim, Swedish
tum, French pouce, and Spanish pulgada. The inch seems to predate the foot
in the history of English units: the foot was defined after the Norman
conquest of 1066 to equal 12 inches, rather than the inch being defined as
1/12 foot.

foot (ft or ')
a traditional unit of distance. Almost every culture has used the human foot
as a unit of measurement. The natural foot (pes naturalis in Latin), an
ancient unit based on the length of actual feet, is about 25 centimeters
(9.8 inches). This unit was replaced in early civilizations of the Middle
East by a longer foot, roughly 30 centimeters or the size of the modern
unit, because this longer length was conveniently expressed in terms of
other natural units:
1 foot = 3 hands = 4 palms = 12 inches (thumb widths) = 16 digits (finger
widths)
This unit was used in both Greece and Rome; the Greek foot is estimated at
30.8 centimeters (12.1 inches) and the Roman foot at 29.6 centimeters (11.7
inches). In northern Europe, however, there was a competing unit known in
Latin as the pes manualis or manual foot. This unit was equal to 2
shaftments, and it was measured "by hand," grasping a rod with both hands,
thumbs extended and touching. The manual foot is estimated at 33.3
centimeters (13.1 inches).
In England, the Roman foot was replaced after the fall of Rome by the
natural foot and the Saxon shaftment (16.5 centimeters). The modern foot
(1/3 yard or about 30.5 centimeters) did not appear until after the Norman
conquest of 1066. It may be an innovation of Henry I, who reigned from 1100
to 1135. Later in the 1100s a foot of modern length, the "foot of St.
Paul's," was inscribed on the base of a column of St. Paul's Church in
London, so that everyone could see the length of this new foot. From at least
1300 to the present day there appears to be little or no change in the length
of the foot.
Late in the nineteenth century, after both Britain and the U.S. signed the
Treaty of the Meter, the foot was officially defined in terms of the new
metric standards. In the U.S., the Metric Act of 1866 defined the foot to
equal exactly 1200/3937 meter, or about 30.480 060 96 centimeters. This
unit, still used for geodetic surveying in the United States, is now called
the survey foot. In 1959, the U.S. National Bureau of Standards redefined
the foot to equal exactly 30.48 centimeters (about 0.999 998 survey foot).
This definition was also adopted in Britain by the Weights and Measures Act
of 1963, so the foot of 30.48 cm is now called the international foot.
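
The gap between the two feet is tiny but measurable; a quick check of the
figures above in Python, using exact fractions:

# Survey foot (1200/3937 m) vs. international foot (0.3048 m).
from fractions import Fraction

survey_foot_m = Fraction(1200, 3937)   # exactly 1200/3937 meter
intl_foot_m   = Fraction(3048, 10000)  # exactly 30.48 cm

print(f"survey foot        = {float(survey_foot_m) * 100:.8f} cm")       # 30.48006096
print(f"international foot = {float(intl_foot_m) * 100:.2f} cm")         # 30.48
print(f"intl / survey      = {float(intl_foot_m / survey_foot_m):.6f}")  # 0.999998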

Source:
How Many? A Dictionary of Units of Measurement
http://www.unc.edu/~rowlett/units/index.html

Jean Fontaine