Fog Creek Software
Discussion Board

Metric Units on Windows

I have been attempting to display a grid, measured in millimeters, on the screen.

If you have ever tried this, you will have quickly discovered that the grid is not quite perfect. I'm guessing this is because of rounding errors that occur during display. Even when using the Windows mapping modes MM_LOMETRIC or MM_HIMETRIC, the grid is still not perfect.

It may also be because Windows measures the display in pixels per inch not pixels per (insert metric unit here). 

I've come to the conclusion that it is impossible to display a nice, even metric grid on a display device under Windows, because Windows does not natively support the metric system. But I thought I'd ask, in case it is possible, or in case anyone has run into this problem before and can offer some advice.

(*) Maybe someone could also point me to a resource that explains why Windows and physical devices measure units in "units per inch" rather than, or in addition to, "units per metric measure". I would guess the answer lies in the origins of the development of Windows, printers and other devices, and their need for backward compatibility.

* For example, it may be convenient if the screen device driver could change the screen format and report 96 pixels per inch or 20 pixels per centimeter. I don't know if this would require a hardware change or not (i.e., are monitors physically made to have 96 pixels per inch?).

Dave B.
Wednesday, January 21, 2004

Monitors can be set to display multiple resolutions. Does that answer your question as to whether they're manufactured to show a certain number of pixels per inch? ;)

Sum Dum Gai
Thursday, January 22, 2004

You mean you understand all that mapping mode sh*t? You may be the only one. Or I may be the only one who doesn't.

Thursday, January 22, 2004


A twip is a twentieth of a point; this is 1/1440th of an inch and roughly 1/567th of a centimetre. There are algorithms to convert from twips to pixels at the desired resolution, and while a twip varies in physical size on the screen depending on its resolution, it will always be a twentieth of a point for printing purposes.

It seems there's a do-it-all digital ruler, which I haven't used or have any experience of, at

Simon Lucy
Thursday, January 22, 2004

MS is really screwy. Metafiles are natively in 100ths of a millimetre. The print preview stuff I did a while ago may help you here (must finish the notes section).

However, when printing to PostScript printers the native mode is in points. The native screen resolution on most Windows machines is 96 dpi (it's changeable), which, if you convert it into twips, gives a nice round 15 twips per pixel.

Peter Ibbotson
Thursday, January 22, 2004

Hrm... I guess it's hard for me to explain it any simpler than I have but I can try.

As Peter mentioned there are 15 twips per pixel on a 96 dpi screen.

1440 twips per inch / 96 pixels per inch = 15 twips per pixel.

Now let's try to determine how many twips there are per millimeter.

We know there are 96 pixels per inch and 25.4 millimeters per inch so we can now calculate how many pixels per millimeter.

96 pixels per inch / 25.4 millimeters per inch = 3.779527559 pixels per millimeter.

We know from our earlier calculation that there are 15 twips per pixel and 3.779527559 pixels per millimeter.

15 twips per pixel * 3.779527559 pixels per millimeter = 56.69291339 twips per millimeter.

The result, 56.69291339 twips per millimeter is obviously not a nice round number like 1440 twips per inch and does not map nicely to 15 twips per pixel.
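The arithmetic above is easy to verify (in Python, just to check the numbers; the 96 dpi and 1440 twips/inch figures are the ones from this thread):

```python
# Twips arithmetic from the thread: 1440 twips/inch, 96 pixels/inch, 25.4 mm/inch.
TWIPS_PER_INCH = 1440
PIXELS_PER_INCH = 96
MM_PER_INCH = 25.4

twips_per_pixel = TWIPS_PER_INCH / PIXELS_PER_INCH  # 15.0 -- exact
pixels_per_mm = PIXELS_PER_INCH / MM_PER_INCH       # 3.7795... -- not exact
twips_per_mm = twips_per_pixel * pixels_per_mm      # 56.6929... -- not exact

print(twips_per_pixel)  # 15.0
print(twips_per_mm)     # 56.69291338582677
```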

This is because the whole thing is based on inches and not millimeters (or whatever metric unit suits your fancy).

If Windows or the hardware (I'm not sure which is at fault) would instead provide separate modes for inches and millimeters, so that they would report 96 dpi or X dpm (or something similar), then the calculations would turn out nice and even and would not suffer from rounding errors.

Dave B.
Thursday, January 22, 2004


Wouldn't you have similar discrepancies in the displayed grid using inch measurements? For example, if you wanted to draw a grid spaced at one-tenth inch intervals, then each grid-line would be 9.6 pixels apart so the lines on the screen would be drawn to the nearest pixel and therefore spaced at 10 pixels, then 9 pixels, then 10 pixels, 9, 10, 10, 9, etc.
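The alternating spacing described above can be reproduced directly (a quick Python check of the rounding, not Windows' actual drawing code):

```python
# Grid lines every 0.1 inch on a 96 dpi screen: ideal spacing is 9.6 pixels.
# Snapping each line position to the nearest whole pixel makes the gaps
# alternate between 10 and 9 pixels.
positions = [round(i * 9.6) for i in range(11)]
gaps = [b - a for a, b in zip(positions, positions[1:])]
print(gaps)  # [10, 9, 10, 9, 10, 10, 9, 10, 9, 10]
```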

I haven't used them, but I thought the purpose of the mapping modes MM_LOMETRIC or MM_HIMETRIC (and MM_LOENGLISH, etc) was to allow you to draw on the device context using virtual units in the selected measurement system and then allow Windows or the device driver to map appropriately to the device. That is, using MM_HIMETRIC for example, if you want to draw a 1mm square then you draw it with dimensions of 100 "units" (because MM_HIMETRIC is mapped to 0.01mm).
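To make that mapping concrete, here is a rough model of the logical-to-device conversion (a sketch assuming a 96 dpi device, not actual GDI code; the function name is made up):

```python
# Sketch of an MM_HIMETRIC-style logical-to-device mapping at an assumed 96 dpi.
# One logical unit = 0.01 mm, so 2540 logical units per inch map onto 96 pixels.
def himetric_to_pixels(logical_units, dpi=96):
    inches = logical_units / 2540.0  # 2540 units of 0.01 mm per inch
    return round(inches * dpi)       # the device must snap to whole pixels

print(himetric_to_pixels(100))   # a 1 mm square: 100 logical units -> 4 pixels
print(himetric_to_pixels(2540))  # 1 inch -> 96 pixels, exact
```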

Philip Dickerson
Thursday, January 22, 2004

If you choose to draw in 10ths or 100ths of an inch then yes, you may run into those problems.

If you choose to draw in the standard units of measure (i.e. the ones you find on a ruler or tape measure), such as 1/2, 1/4, 1/8, 1/16, or 1/32 of an inch, then no, you do not run into those problems, because everything maps evenly, right down to 3 pixels.

1 / 1 in. * 1440 = 1440 / 15 twips per pixel = 96 pixels
1 / 2 in. * 1440 = 720 / 15 twips per pixel = 48 pixels
1 / 4 in. * 1440 = 360 / 15 twips per pixel = 24 pixels
1 / 8 in. * 1440 = 180 / 15 twips per pixel = 12 pixels
1 / 16 in. * 1440 = 90 / 15 twips per pixel = 6 pixels
1 / 32 in. * 1440 = 45 / 15 twips per pixel = 3 pixels

1 / 64 in. * 1440 = 22.5 - precision stops here
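The halving table above can be checked in a couple of lines (plain Python, just to confirm the arithmetic):

```python
# Binary fractions of an inch in twips (1440/inch) and pixels (15 twips/pixel
# at 96 dpi). Everything is a whole number of pixels until 1/64 in.
TWIPS_PER_INCH = 1440
TWIPS_PER_PIXEL = 15

pixels = {}
for denom in (1, 2, 4, 8, 16, 32, 64):
    twips = TWIPS_PER_INCH / denom
    pixels[denom] = twips / TWIPS_PER_PIXEL
    print(f"1/{denom} in = {twips} twips = {pixels[denom]} pixels")
# 1/64 in gives 22.5 twips = 1.5 pixels -- the first fraction that no longer
# lands on a whole pixel.
```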

Of course, this is all based on there being 72 points to the inch (the point size of a nominally 1-inch-high font).

Millimeters, OTOH, are based on units of 10, and they do not map evenly to 72 points per inch because there are 25.4 millimeters per inch.

The above is the reason why the ticks of a ruler drawn on the screen using standard units appear evenly spaced, while the ticks on a metric ruler are not even: some are closer together and some are further apart.

When you specify a number of units of .1 mm to the Windows functions, Windows must internally map these to 96 pixels per inch.

Given there are 254 units of .1 mm in an inch and they map to 96 pixels per inch:

1in. = 254 units of .1 mm = 96 pixels
1/2in. = 127 units of .1 mm = 48 pixels
1/4in. = 63.5 units of .1 mm = precision stops here for millimeters
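The same check in 0.1 mm units (again assuming 96 dpi) shows the halving breaking down almost immediately:

```python
# 254 units of 0.1 mm per inch, mapping onto 96 pixels per inch.
UNITS_PER_INCH = 254  # units of 0.1 mm
units = {denom: UNITS_PER_INCH / denom for denom in (1, 2, 4)}
for denom, u in units.items():
    print(f"1/{denom} in = {u} units of 0.1 mm")
# 254.0, 127.0, 63.5 -- the halving already breaks at a quarter inch.
```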

Sorry if I'm not making myself clear, and perhaps I just answered my own question (if I haven't lost sight of it). But the history of why all this stuff is so would be interesting, would give meaning to it, and would perhaps help in understanding it better. (i.e. Why does Windows choose to use inches to map pixels, at 96 pixels per inch?) I suppose it's because of the historical value of 72 points per inch for a font one inch high.

Dave B.
Thursday, January 22, 2004

Forgot the English units chart - one level more precise than metric:

1/1 in. = 100 units of .1in = 96 pixels
1/2 in. = 50 units of .1in = 48 pixels
1/4 in. = 25 units of .1in = 24 pixels
1/8 in. = 12.5 units of .1in = precision stops here

Dave B.
Thursday, January 22, 2004

Oops... looks like I should have used .01mm and .01in (HIMETRIC & HIENGLISH), and then the metric units would have the same precision as the English, because it would be:

1in. = 2540 units of .1 mm = 96 pixels
1/2in. = 1270 units of .1 mm = 48 pixels
1/4in. = 635 units of .1 mm = 24 pixels
1/8in. = 317.5 units of .1mm = precision stops here

Dave B.
Thursday, January 22, 2004

Ok, last post should also read .01mm. /sigh

Dave B.
Thursday, January 22, 2004

Or you could just scale: declare that the monitor is something like 200 pixels/cm, draw as appropriate, and save as mm.

Most monitors aren't actually 96 px/in anyway.

Thursday, January 22, 2004

It doesn't really make sense to try to draw a correct measure on a monitor.  Windows knows the resolution, but it doesn't know the physical size of the displayed image, which in general has little relation to the number of pixels displayed.  Even with a single monitor at fixed resolution, you can stretch and shrink the image.

Mike McNertney
Friday, January 23, 2004
