Internationally, a unified system of typographic measurement
does not yet exist. In fact, printing was an inexact science
until the 18th century. Until that time, typographic characters
cast in different foundries were incompatible.
In 1737, a Parisian typefounder named Pierre Fournier le jeune
proposed the first standard unit of measure, known as the point.
This unit was equal to 0.349 millimeter, which allowed for the
precise measurement of small sizes of type.
Metric System
smallest whole unit = 1 millimeter (mm)
1 mm = 2.85 points (pt)
10 mm = 1 centimeter (cm)
100 mm = 10 cm = 1 decimeter (dm)
1000 mm = 100 cm = 10 dm = 1 meter (m)

Inch System
smallest whole unit = 1 inch
1 inch = 72 points
12 inches = 1 foot
3 feet = 1 yard
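The two tables above amount to simple conversion ratios. As an illustrative sketch (not part of the original text), here is how the conversions work out in Python, assuming the modern digital point of exactly 72 per inch, with 1 inch defined as 25.4 mm:

```python
# Conversions implied by the tables above, using the modern digital
# point of exactly 72 per inch (1 inch = 25.4 mm by definition).
MM_PER_INCH = 25.4
POINTS_PER_INCH = 72

def points_to_mm(pt: float) -> float:
    """Convert a length in points to millimeters."""
    return pt / POINTS_PER_INCH * MM_PER_INCH

def mm_to_points(mm: float) -> float:
    """Convert a length in millimeters to points."""
    return mm / MM_PER_INCH * POINTS_PER_INCH

# One digital point is about 0.3528 mm, so 1 mm is about 2.83 points;
# the table's 2.85 figure reflects the older 0.3515 mm point.
print(round(points_to_mm(1), 4))
print(round(mm_to_points(1), 2))
```

Note that the small discrepancy between 2.83 and the table's 2.85 comes from the rounding of the point during digitization, described below.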
Eventually, in 1886, the American Type Founders’ Association
standardized the point to 0.3515 millimeter. This new unit
was adopted in America and Britain as the standard unit of
measure. To make things more confusing, a larger point had been
devised about 40 years after Fournier's proposal by François-Ambroise
Didot. This unit was equal to 0.376 millimeter and was adopted by
the rest of continental Europe.
Since digitization, the point has been rounded to exactly
72 per inch, both for simplicity and to match the
72-dots-per-inch resolution of early computer display
monitors. Europe, however, now uses the millimeter as the
standard for measuring type.
British-American Point System (Pierre Fournier le jeune)
smallest whole unit = 1 point (0.3515 mm)
12 points = 1 Pica
Didot Point System (François-Ambroise Didot)
smallest whole unit = 1 point (0.376 mm)
12 points = 1 Cicero (Germany, Austria and Switzerland)
= 1 Douze (France)
= 1 Riga Tipografica (Italy)
= 1 Augustijn (The Netherlands)
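Since each system defines both a point and a 12-point compound unit (the pica, or the cicero and its regional names), the differences are easy to tabulate. A small sketch using the millimeter values given above; the "Digital (72 per inch)" entry is derived from the rounded modern point, and the labels are mine, not the text's:

```python
# Point sizes in millimeters for the systems described above,
# and the resulting size of their 12-point units (pica / cicero).
SYSTEMS = {
    "Fournier (1737)": 0.349,
    "British-American (1886)": 0.3515,
    "Didot": 0.376,
    "Digital (72 per inch)": 25.4 / 72,  # about 0.3528 mm
}

for name, point_mm in SYSTEMS.items():
    twelve_mm = 12 * point_mm  # 12 points = 1 pica or 1 cicero
    print(f"{name}: 1 pt = {point_mm:.4f} mm, 12 pt = {twelve_mm:.3f} mm")
```

The comparison makes the practical problem concrete: a 12-point Didot cicero is roughly 4.5 mm, noticeably larger than a 12-point British-American pica at roughly 4.2 mm, so type sized in one system does not align with type sized in the other.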