Well, then I have more diagnostic work to do. Thanks for the info, Yuriy!
I am not sure if my expectations are reasonable... that gauge blocks measured with a 0.0001" dial indicator ought to give the right answer within maybe a few microns. You think?
Before you go too deep into the rabbit hole (sorry, I think I'm actually digging said hole here...):
Your tenths indicator (if you have a really good one) has +/-0.0001" maximum error when new (that's what Mitutoyo states in their specs, for example), so even if you do everything perfectly, you can be off by +/-2.5 um. This is not even taking into account temperature, etc.
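If you want to put rough numbers on that, here's a quick back-of-the-envelope sketch (the 4" stack height, the 2 C temperature offset, and the steel expansion coefficient are just typical assumed values, not anything measured on your setup):
```
# Rough error budget sketch -- all inputs below are assumed typical values.
INCH_TO_UM = 25400.0          # 1 inch = 25,400 micrometers

# Indicator spec alone: +/-0.0001" maximum permissible error
indicator_error_in = 0.0001
indicator_error_um = indicator_error_in * INCH_TO_UM
print(f"Indicator error alone: +/-{indicator_error_um:.2f} um")   # ~2.54 um

# Thermal growth of an assumed 4" steel gauge block stack that is 2 C
# away from the 20 C reference (11.5e-6 / C is a typical coefficient).
alpha_steel = 11.5e-6
stack_length_um = 4.0 * INCH_TO_UM
delta_T = 2.0
thermal_um = alpha_steel * stack_length_um * delta_T
print(f"Thermal growth of a 4\" stack at +{delta_T} C: {thermal_um:.2f} um")  # ~2.3 um
```
So temperature alone can contribute about as much as the indicator's stated error.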
I don't know how you tested/calibrated your scales, but with a 0.0001" indicator and a gauge block, the approach you can take is:
"When my DRO shows X, does the indicator show X as well". You can't go the other way around (i.e. "my indicator shows X, does the DRO show X".
To properly measure the error in the scales you need something that has more resolution and is inherently more accurate. One of these (photo) can measure down to double-digit nanometers (well, when it's not sitting on my beat up kitchen table), and it takes a LOT of effort to get repeatable measurements of 1um scales.
I.e. what I'm saying is "don't stress about it". Given that you bought your scale for a decent price, it doesn't look like it was a "fell off the truck" unit. As long as you get reasonable readings from your tenths indicator, you won't be lacking in accuracy.
Regards
Yuriy