Why are good metric micrometers so hard to find?

All tooling will end up being more expensive if you go against the grain, not just micrometers. It’s cheapest to use inches in North America, and metric almost everywhere else.

Early in the hobby I tried to acquire only inch tooling, because I'm in the US. That didn't work either, though, because many of the things I work on were made outside the US. So I've been forced to acquire a lot of metric tooling as well.

I still don’t own any metric (or digital) micrometers, though. I just multiply and divide by 25.4 a lot.

Mitutoyo, being from Japan, is a metric company first and foremost.
Japan went metric during the Meiji restoration, I believe. Try buying an inch micrometer from Mitutoyo or any other vendor inside of Japan.

The old shakkanhou system carpenters used was actually outlawed: everything had to use metric.

To keep selling their rules and squares marked with the old system, the clever manufacturers started stamping “/33” onto their tools, because one shaku is 10/33 of a meter (30.3 cm, coincidentally almost exactly one foot, but divided into ten smaller units rather than twelve, so each of those units is exactly 1/33 m).
 
0.001 mm is just under half a tenth (0.0000393701").

0.01 mm is just under half a thou (0.0003937008").
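If you want to sanity-check those equivalences yourself, here's a quick sketch in Python (the only fact it relies on is that one inch is defined as exactly 25.4 mm):

```python
# 1 inch is defined as exactly 25.4 mm.
MM_PER_INCH = 25.4

def mm_to_inch(mm: float) -> float:
    """Convert millimeters to inches."""
    return mm / MM_PER_INCH

print(f'{mm_to_inch(0.001):.7f}"')  # 0.0000394" -- just under half a tenth
print(f'{mm_to_inch(0.01):.7f}"')   # 0.0003937" -- just under half a thou
```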

I would think that for the majority of small to medium sized shops doing 'paid for' work, measuring at half a thou would be perfectly acceptable, given that their tolerances for most parts are likely to be in the low single-digit thou (a couple of thou, for example).

For most hobbyists, being able to measure at a level of half a thou is entirely sufficient.

Take a look in Machinery's Handbook at the standard fit charts. The tolerances for most of the fits are shown in thou.

I'd think the use case for 0.001 mm metrology would be calibration of metrology tools in in-house toolrooms, not in a hobbyist workshop.

I think you may have fallen into a trap that gets all of us in the hobbyist world (it's a trap you see in other hobbies too: recreational shooters chasing half MOA at 500 yards, for example). We get excited about a level of precision that is rarely, if ever, relevant to us.

If you get a genuine Mitutoyo micrometer that has a resolution of 0.01 mm and don't chuck it around the shop, you'll get a quality of measurement that will definitely be sufficient. Here are the relevant specs for an economy Mitutoyo analog mic:

Mitutoyo 103-137 (0.01mm) Ratchet Stop Economy Design Micrometer 0-25mm
  • Resolution: 0.01mm
  • Accuracy: ± 0.002mm
That will give you a measurement accurate to plus or minus about 0.00008", a total spread of about 0.00016".

Now sadly, just like on any forum, there'll be "willy-wavers" who insist that their hobby work demands micron level accuracy. Also there'll be some others who are/were production machinists/engineers and genuinely forget that this is a hobby forum and have their 'precision production' head on when posting.

Either way, for hobby shop work, a metrology resolution of 0.01 mm should by and large be entirely sufficient. :)
Yeah, ok. I think others had been hinting at this, but that was very clearly laid out.
 
Yeah, ok. I think others had been hinting at this, but that was very clearly laid out.
Yeah, sorry. If you'd already taken the 'hints' in the other posts, my post may have made you want to roll your eyes and mutter to yourself "Enough already! I've got the damn message!" :grin:

I do like to be unambiguous though (and that can lead to me being over-wordy:oops:) and in this case, others may well come along later and read this thread. If I can post something that explicitly lays out the case for 0.01 mm (or indeed half a thou) resolution being sufficient for hobby shop usage, then that might help someone who hasn't got the 'hints'. :)
 
Yeah, 0.01 mm graduation is not hard to find, and it is affordable. But as I see things, you should ideally be able to measure with greater accuracy than you can machine, because otherwise you'd lose certainty on those good 0.01 mm accurate parts you're making, wouldn't you?

To measure repeatably below 0.01 mm, surface finish plays a big role. If you hold the micrometer in your hand too long, the body will warm up and you will measure a thicker part. Measuring in the micrometer range is a real challenge.

Measure the same part 10 times using the same micrometer. You probably won't get the same value all 10 times, and that is not due to the micrometer but due to how you hold and measure the part. A micrometer with 10 times more resolution won't change that.

If a tool is not commonly available, then probably there is not a big market for it.

Using a 0.01 mm micrometer, readable to 0.005 mm, I do measure more "accurately" than I can repeatably produce. In general, I estimate the value to 1/10 of the resolution (analyst's background).

Despite climate control or cooling fluid, parts heat up when being machined. When turning large diameter parts, I let the part cool down for a few hours before I take the last 2 finishing passes. That is not something you can do in a commercial machine shop. If they produce the same part in larger quantities, they can set the machining parameters so that the parts are within spec after they cool down.
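To put a rough number on that warm-up effect, here's a sketch assuming a typical linear expansion coefficient for carbon steel of about 11.7 µm/m/°C (the exact value varies by alloy; the part size and temperature rise are just illustrative):

```python
# Linear thermal expansion: dL = alpha * L * dT
# alpha for common carbon steel is roughly 11.7e-6 per deg C (alloy dependent).
ALPHA_STEEL = 11.7e-6

def growth_mm(length_mm: float, delta_t_c: float) -> float:
    """Size change of a steel dimension for a temperature rise of delta_t_c."""
    return ALPHA_STEEL * length_mm * delta_t_c

# A 100 mm diameter turned 10 deg C above ambient grows past a whole 0.01 mm graduation:
print(f"{growth_mm(100, 10):.4f} mm")  # 0.0117 mm
```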
When machining wood or plastics, a 0.01 mm tolerance is generally not feasible at all.
 
If I can post something that explicitly lays out the case for 0.01 mm (or indeed half a thou) resolution being sufficient for hobby shop usage, then that might help someone who hasn't got the 'hints'. :)

I agree and appreciate that philosophy. I learn stuff on this forum all the time because people take the time to do this.

My 25.4 cents, taking the precept to heart (and keeping in mind that I mostly work on small parts):

I definitely agree that machining to the closest 0.01 mm (0.00039") or 0.001" (0.0254 mm) is all most hobbyists need to strive for most of the time. If your parts are more than a couple inches (~50 mm) in size, even this level of accuracy is overkill.

My "tenths" indicators (0.0001"/div) are relegated to surface plate work or particularly finicky stuff (usually involving high speed spindles, reference standards/tooling, hand-scraping, and/or ball bearings). A tenths indicator moves if you look at it wrong. Same goes for a 0.001 mm/div indicator, only worse.

But "machining to the closest thou" specifies an absolute target, not a tolerance (accuracy, not precision). Targeting, say, 0.250" with a tolerance of +/- 0.001" is far too sloppy for much of the work I do. I often strive for +/- 0.0002" (roughly +/- 0.005 mm), especially for "piston fits", and pretty much always strive for at least +/- 0.0005" (+/- 0.0127 mm).

I'd argue strongly that for those working in inches to the closest thou, a 0.0005"/div indicator is generally more useful and surprisingly more pleasant to work with than a 0.001"/division indicator. For similar reasons, I also insist that all of my "good" micrometers include a vernier scale to read to the closest "tenth" (0.0001" or 0.00254 mm).

I'd sorely miss the latter. An inch micrometer without a vernier scale to read finer than 0.001" would be exactly equivalent to a 0.001"/division indicator: you have to guesstimate how close you are to a line.

I don't use metric micrometers, but my understanding is that they generally read to the closest 0.01mm directly, without a vernier. That would definitely be good enough in my book, but only because I would be working to +/- 0.01 mm or worse precision (a range spanning 0.02mm or 0.0008").

The general principle is that you want references and measuring devices with finer precision than the tolerances you work to. If a carpenter wants to measure to the closest 1/16" s/he should generally use a 1/32"/div scale.
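That principle is easy to state as a rule-of-thumb check. Here's a sketch (resolution_ok is just a made-up helper; the post implies at least 2:1, while 4:1 or 10:1 are common metrology guidelines not specified here):

```python
def resolution_ok(tolerance: float, resolution: float, ratio: float = 2.0) -> bool:
    """True if the instrument resolves at least `ratio` times finer than the tolerance."""
    return resolution * ratio <= tolerance

print(resolution_ok(1 / 16, 1 / 32))   # True:  1/32"/div scale for 1/16" work (2:1)
print(resolution_ok(0.001, 0.001))     # False: 0.001"/div indicator for thou work (1:1)
print(resolution_ok(0.001, 0.0005))    # True:  0.0005"/div indicator for thou work (2:1)
```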

I spend a lot of time building things where I want a piston fit between a shaft and a bore, and I'm usually machining both.

From experience, I know it's easier to make fine adjustments to a shaft diameter than it is to adjust a bore (though there are tricks) so I usually bore the hole first.

Let's say I bored a hole that was nominally 1/4" (6.35 mm).

A QA metrologist cares about absolute sizes and tolerances. They would test to discover that, say, a 0.250+ ZZ grade go pin fits, ensuring the bore was at least 0.2500", but a .251- no-go pin doesn't fit, ensuring the bore was smaller than that pin's actual size, which is somewhere between 0.2508" and 0.2510" (ZZ pins are toleranced 0.0002", plus pins upward and minus pins downward). With grade ZZ pins they could only say that the bore was 0.250" +0.001/-0.
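In code form, that worst-case arithmetic looks like this (a sketch assuming class ZZ pins carry a unilateral 0.0002" tolerance, as above; bore_bounds is just a made-up helper):

```python
ZZ_TOL = 0.0002  # unilateral tolerance on class ZZ gauge pins, in inches

def bore_bounds(go_nominal: float, nogo_nominal: float) -> tuple[float, float]:
    """Guaranteed (min, max) bore size when the plus go pin fits
    and the minus no-go pin does not."""
    # A plus pin's actual size is [nominal, nominal + ZZ_TOL]; if it fits,
    # the bore is at least the nominal.
    lo = go_nominal
    # A minus pin's actual size is [nominal - ZZ_TOL, nominal]; if it does
    # NOT fit, all we can guarantee is that the bore is below the nominal.
    hi = nogo_nominal
    return lo, hi

lo, hi = bore_bounds(0.250, 0.251)
print(f'bore is {lo:.3f}" +{hi - lo:.4f}/-0')  # bore is 0.250" +0.0010/-0
```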

I'm a hobbyist, though, and mostly only care about relative sizes. I just want to ensure my shaft is slightly smaller than the bore, so I don't bother with "go" aka "plus" pins. It's good enough to know that a .250- pin has the fit that I want (and even easier to tell that a 0.251- pin doesn't fit).

It's pretty easy to tell by feel how tight the fit is with a pin. If that .250- pin feels like a piston fit, I can be fairly confident that I want to make my shaft just a hair under 0.250" without going over. If the .250- pin fits the way I want, I just turn my shaft until it mics the same as my pin. If the pin has a loose feel, I know I can turn the shaft a tenth or two larger.

If the bore is truly 0.2500" and I turn a shaft to 0.2500", it will definitely be a press fit. But if I turn it to 0.2490" it would be a surprisingly loose slip fit. Relative tenths really do matter. They can make the difference between a sloppy feel and a precision feel.
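Purely to illustrate, here's that shaft-size arithmetic as rough "feel" bands. The thresholds are loosely inferred from the examples in this post, not measured data; real feel also depends on finish, roundness, and length of engagement:

```python
def fit_feel(clearance_in: float) -> str:
    """Very rough fit 'feel' for a ~1/4" shaft/bore pair. Illustrative only."""
    if clearance_in <= 0:
        return "press/interference fit"
    if clearance_in < 0.0005:
        return "piston/hydraulic fit"
    return "loose, sloppy slip fit"

bore = 0.2500
for shaft in (0.2500, 0.2499, 0.2490):
    print(f'{shaft:.4f}" shaft: {fit_feel(bore - shaft)}')
```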

THAT's why I find it useful to have that tenths vernier on my micrometers! I'll first find a pin that fits the way I want. Then I'll mic that pin to the closest tenth. I actually don't really care about the absolute measurement of either the bore or the shaft, but I often want piston fits, and that's hard to achieve without finer than 0.001" resolution.

It's pretty easy to tell if a reading is on a line or between lines, but without a tenths vernier it's hard to tell 0.2499 from 0.2500 from 0.2501 (especially with my eyes), and that can definitely be the difference between a piston fit and an interference fit when a .250- pin is the largest that fits in the hole. Similarly, the difference between 0.2494" and 0.2496" is hard to tell without a vernier and can be the difference between a sloppy feel and a hydraulic feel.

So to net it out: 0.01 mm direct-reading mics are fine for hobbyists, but if you use inch mics, a vernier that reads to "tenths" (0.0001") is extremely helpful.
 
So to net it out: 0.01 mm direct-reading mics are fine for hobbyists, but if you use inch mics, a vernier that reads to "tenths" (0.0001") is extremely helpful.
Eloquently stated. Piston and slip vs. interference fits are exactly why I have a micrometer set reading to tenths from 0-4", and a digital micrometer reading to 0.001 mm. If I can't measure it when I'm making the part, how can I ensure the proper fit for the assembly? It's also the reason I installed a 1 µm scale on my lathe cross slide.

Can't say I hit the mark every time, but at least I can get very close. I try to achieve my drawing numbers every time, but only concentrate on getting them for the things that truly require it. It's basically practicing the art. Without practice and repetition it's hard to get better. 99% of the time the accuracy may not be needed, but if you need that special fit, it's darn hard to get if you haven't been sharpening the saw in the interim.

I'm nowhere near being a machinist, but I do understand that saying: practice makes perfect.
 
I often strive for +/- 0.0002" (roughly +/- 0.005 mm), especially for "piston fits", and pretty much always strive for at least +/- 0.0005" (+/- 0.0127 mm).

Very neatly explained, thank you! So, if I follow you, that 0.0002" difference in your example would be about 0.005 mm. And that is something one could probably eyeball on a 0.01 mm graduated mic even without a vernier scale, wouldn't it?

What I am interested in are model steam engines. So reading all of this I think it would probably be best if I did buy the 0.001 mm Mitutoyo mics in the 0-50 mm range. I think I'll leave 50-150 for later.
 
about 0.005 mm. And that is something one could probably eyeball on a 0.01 mm graduated mic even without a vernier scale, wouldn't it?
Yes. If you were doing something finicky it might be worth distinguishing between "on a line" and "between two lines" on the barrel of a 0.01 mm micrometer.

But I can’t imagine myself ever needing to read down to microns — at that level you really do need temperature control for consistent readings, anyway.

If I were after a piston fit, though, I can imagine looking for 6.35- mm vs 6.35+ mm.
 
Unless you have a climate-controlled hobby shop, I don't think any mics are going to help hold that tolerance.
With temperature, everything is growing and shrinking, including the micrometer. Everything is rubber, moving and distorting, and our common-sense notions of the world don't hold true. The smaller the scale we look at, the weirder it gets.

Think we are merely wanting to ensure that things fit together correctly (at some common temperature), which is a subtly different issue. This is a relative measurement, which doesn't require long-term stability, only short-term. Think that is the difference. Even so, it requires good technique to get meaningful results. It's all too easy to be fooled. In the microwave and mm-wave world, it was very difficult to get accurate absolute measurements, but relatively easy to perform very accurate relative measurements.

Absolute metrology is hard - that takes temperature control and sophisticated measurement equipment.
 