[Newbie] Math Question

I'm going to have to respectfully disagree here; mathematically they are identical. The only difference would be in a tolerance callout on a drawing. The four-decimal-place callout would normally require a tighter tolerance.
You need to read the posts. I already stated this.
 
An interesting question, and subject to interpretation. I refer to "a ten-thousandth" as .0001", but if I said I was off by ten thousandths, I would mean .01", so context matters. A thousandth is also referred to as a "mil," which confuses some people because the same expression also means a millimeter, particularly in Europe, so I stopped using it when dealing with colleagues, vendors, and customers. The best bet is to write out the number as .001". That is unambiguous.
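
To make the slang concrete, here's a small illustrative Python snippet; the names and values are just the common US shop usages discussed above, nothing official:

```python
# Illustrative only: common US shop slang for decimal inch values.
# "mil" here is the US machinist usage (0.001"), not a millimeter.
SHOP_TERMS = {
    'a tenth (ten-thousandth)': 0.0001,
    'a thou (a "mil")': 0.001,
    'ten thousandths': 0.010,
}

for name, inches in SHOP_TERMS.items():
    print(f'{name:<26} = {inches:.4f}" = {inches * 25.4:.4f} mm')
```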

Regarding zeros after the last significant digit, they don't change the value. Informally, I use additional zeros to indicate a level of accuracy or precision. E.g., if I say 1", that can mean 1 +/- 1/4", but if I say 1.000", that would imply 1.000 +/- .002", say. Note: this is my own personal convention and does not in any way reflect a standard. When I am working on my own drawing, for my eyes only, I have my decimal places set at four, so 1 would read 1.0000. I know what my design intent is, so I apply my interpretation to that number as required by the situation.
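
As a toy sketch of that purely personal convention (the tolerance bands below are just my examples from above, not any drafting standard):

```python
# A sketch of the informal convention above: more trailing zeros on a
# dimension imply a tighter assumed tolerance. The bands are personal
# examples from the post, not any standard.
ASSUMED_TOL = {
    0: 0.25,    # "1" might mean 1 +/- 1/4"
    3: 0.002,   # "1.000" might imply +/- .002"
}

def assumed_tolerance(dim: str) -> float:
    """Tolerance implied by the number of decimal places in a dimension."""
    places = len(dim.split('.')[1]) if '.' in dim else 0
    # More decimal places than listed: assume the tightest band.
    return ASSUMED_TOL.get(places, min(ASSUMED_TOL.values()))

print(assumed_tolerance('1'))      # 0.25
print(assumed_tolerance('1.000'))  # 0.002
```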

Please excuse me if I am repeating what has already been said. I am a slow typist. ;)

Bob
 
If I couldn't rely on my tenth reading measuring tools to be accurate over their range, I would junk them and buy new ones. That's why we calibrate against traceable standards.
 
It's good that you guys love math. I am sure to learn from all your responses.
 
If I couldn't rely on my tenth reading measuring tools to be accurate over their range, I would junk them and buy new ones. That's why we calibrate against traceable standards.
+1 Jim. The rule of thumb in metrology is to use a calibration standard that is ten times more accurate than the device being calibrated. If I am calibrating a micrometer to .0001", I need a set of gage blocks accurate to +/- .00001". In order to certify to NIST standards, those blocks in turn have to be calibrated with a standard accurate to one millionth of an inch, and so on down the line to a primary standard based on the wavelength of light from a HeNe laser.

A micrometer could instead be calibrated with a pin gage of stated accuracy -.0002/+.0000, in which case the accuracy that could be certified would only be ten times the gage pin tolerance (about .002") per good metrology practice. However, that doesn't mean the micrometer isn't actually accurate to +/- .0001". For hobbyist use, this is probably sufficient to confidently use the micrometer to its full resolution.
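
The 10:1 rule is simple enough to put in a quick sketch (illustrative numbers only, walking the chain down from the micrometer):

```python
# A sketch of the 10:1 rule above: each standard in the traceability
# chain should be ten times more accurate than the thing it calibrates.
def calibration_chain(device_accuracy, levels=3):
    """Required accuracy at each tier below the device, per the 10:1 rule."""
    return [device_accuracy / 10 ** i for i in range(1, levels + 1)]

# A micrometer trusted to +/- .0001" needs gage blocks good to .00001",
# which need a master good to a millionth, and so on down to the laser.
for acc in calibration_chain(0.0001):
    print(f'+/- {acc:.7f}"')
```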
 
"OPINIONS" no references cited
Haha!
No citations needed for common knowledge. Anyone with formal schooling in the physical sciences learned sig figs (and probably hated them).

But, just to add to the general confusion:
https://en.wikipedia.org/wiki/Significant_figures
You will probably hate yourself if you read it.

and the word from the horse's mouth:
http://www.mitutoyo.com/wp-content/uploads/2013/04/E11003_2_QuickGuide.pdf
Includes info on tolerable error, what kind of problem produces what kind of error, and other fun facts, including the effects of the heat from your hand on measurements and the effects of various fixtures.

The fact is that a 0.0001" measuring tool exceeds the user's ability to position it.
All that said, I still sleep better at night knowing I measure with a high-end (and expensive) tool, even though I know it is nonsense.
I measure to a tenth, eyeball the dial on my 0.001" DTI, and cut right to the mark. The accuracy of my measurement only makes me feel warm and fuzzy; the real measurements come from my senses, not my fancy Mitutoyo micrometer.
 
I will concede that to get an accurate absolute measurement you need controlled conditions. Most of the measurements done in the shop, and especially the home shop, are comparative. So let's say I need to fit a bearing to a bore with a light press fit. The bearing measures 2.2504", and I want a 0.0005" interference fit, so I'm going to make the bore 2.2499" as measured with the same tools in the same ambient conditions. This is not an absolute measurement but a comparative one. With care and experience, it's pretty easy to hold +/- 0.0001" or better when machining.
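
The arithmetic is trivial, but as a sketch of the comparative approach (using the numbers from my example above):

```python
# The comparative-fit arithmetic from the example above: measure the
# bearing, subtract the desired interference, and bore to that target
# with the same tools under the same ambient conditions.
def target_bore(bearing_od, interference):
    """Bore diameter for a press fit: smaller than the OD by the fit."""
    return bearing_od - interference

print(f'{target_bore(2.2504, 0.0005):.4f}"')  # 2.2499"
```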

With practice, it is easy to get a repeatable measurement to 0.0001", even when transferring from a snap gauge or an inside mic. But again, these are not absolute measurements; they're close, but not exact compared to a traceable standard. It's all about feel and, as you say, your senses.
 