# Accuracy Of Dial Calipers



## randyc (Mar 28, 2015)

A post on another forum alluded to the inaccuracy of calipers.  This was a raging argument six or eight months ago on PM and I put in my two cents there as follows:

_"I read negative comments constantly regarding dial calipers and certainly some question exists as to repeatability and accuracy in a dirty environment. But they surely have their place and vernier calipers are reliable and tolerant of some uncleanliness -

The opinion that dial calipers can be trusted only within some arbitrary number (usually much greater than .001) makes me wonder. Just for grins, I went out to the shop and grabbed a $15 set from you know where that I use for WOODWORKING - measuring wood thickness to correct the planer depth setting.

I wiped the jaws clean and then measured a series of gage blocks twice, recording the error as closely as I could interpolate from the dial (yeah I know, this is sort of a visual crap shoot). The first number is the gage block dimension, followed by the caliper measurement:

.0500 : .0503
.1110 : .1110
.1430 : .1433
.2500 : .2502
.4500 : .4500
.6500 : .6502
.9000 : .9002
2.000 : .2001
3.000 : 3.002
4.000 : 4.000

Apparently I could trust the accuracy/repeatability of these particular dial calipers - in a clean environment at a reasonable temperature- to within .0003, following good practice like wiping the jaws and checking zero before making a measurement.

Metrology texts suggest that a measuring instrument should have an accuracy/repeatability ten times better than the accuracy requirement of the part being measured. This is certainly no problem using micrometers when tolerances are on the order of .001. However, it is not unusual to make measurements on the order of .0003 using "tenths" micrometers. This certainly doesn't satisfy the suggested ten times accuracy. Perhaps an "acceptable" scale of accuracy might be three times the tolerances being measured ?

If so, then the above calipers would be acceptable to measure within .001, again assuming good practice. And remember these are cheap $15 imports, not Mitutoyos. And regarding vernier calipers, why would anyone not trust these instruments to be just as accurate as a vernier height gage ?"_

FYI:  the consensus of the majority of machinists was pretty much in agreement with the above although there were a few that disagreed violently, LOL.

edit to add:  I wouldn't trust dial calipers to be accurate when measuring large dimensions in cold temperatures.  I don't know the temperature range over which an imported pair of calipers are accurate but I'm OK with using mine from 60 deg F to 80 deg F.
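The 10x (or relaxed 3x) rule of thumb discussed in the quote can be written as a quick check. This is only a sketch of the rule of thumb, not any standard's wording; the function name and the `ratio` parameter are mine:

```python
def instrument_acceptable(part_tolerance, instrument_accuracy, ratio=10):
    """Rule-of-thumb check: the instrument should be 'ratio' times
    more accurate than the tolerance it is used to verify."""
    return instrument_accuracy <= part_tolerance / ratio

# The $15 calipers, demonstrated good to about .0003, against a .001 tolerance:
print(instrument_acceptable(0.001, 0.0003, ratio=10))  # False -- fails the 10x rule
print(instrument_acceptable(0.001, 0.0003, ratio=3))   # True  -- passes a 3x rule
```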


----------



## VFM3 (Mar 28, 2015)

Caliper use is always being viciously debated on the internet.

I see two issues with this quote:
- Repeatability and accuracy are two different things; the fellow didn't state that he actually tested repeatability.
- It takes an experienced hand to keep the same amount of pressure on the jaws.


----------



## kd4gij (Mar 28, 2015)

I believe that calipers are only as good as the person using them. I worked with an old-timer who very rarely used a mic and did some of the best work. And then there were some who couldn't measure anything with calipers.


----------



## John Hasler (Mar 28, 2015)

The "10 times" metrology rule assumes that you are making a part that has to work with one being made by a guy in California and that the only thing your instruments and his have in common is the NIST traceability of their calibration.


----------



## randyc (Mar 28, 2015)

VFM3 said:


> Caliper use is always being viciously debated on the internet.
> 
> I see two issues with this quote:
> - Repeatability and accuracy are two different things; the fellow didn't state that he actually tested repeatability.
> - It takes an experienced hand to keep the same amount of pressure on the jaws.



Howdy, the fellow was me and as I said, I made two measurements on each gauge block.  Granted, that's not a lot of measurements to confirm repeatability but what the heck, these were $15 calipers and the two measurements agreed in every case.

I don't think pressure on the jaws is at all critical.  Using the same $15 calipers and a one inch gauge block, I just made the following experiment.  From barely snugging the caliper jaws (on the gauge block) to exerting a considerable amount of pressure, I note a variation of about .0005 on the dial.

Not trying to argue with you, but you might try the same experiment yourself - it only takes a few seconds and you might be surprised at how good your calipers are


----------



## randyc (Mar 28, 2015)

John Hasler said:


> The "10 times" metrology rule assumes that you are making a part that has to work with one being made by a guy in California and that only thing your instruments and his have in common is the NIST traceability of their calibration.



Yep - nobody could dispute that the "rule" would guarantee interchangeability, because the point of having measuring tools that are capable of ten times the accuracy of the part to be measured is to reduce measurement uncertainty.

The "go to" guy for this kind of discussion is _Gordon B. Clarke_ on "Practical Machinist".  He's spent his entire career dealing with the nit-picky topics of metrology and is a wealth of information.


----------



## RJSakowski (Mar 28, 2015)

randyc said:


> A post on another forum alluded to the inaccuracy of calipers.  This was a raging argument six or eight months ago on PM and I put in my two cents there as follows:
> 
> _"I read negative comments constantly regarding dial calipers and certainly some question exists as to repeatability and accuracy in a dirty environment. But they surely have their place and vernier calipers are reliable and tolerant of some uncleanliness -
> 
> ...


Are your 2 and 3 inch measurements meant to be .001" and .002" high?  If so, I would question the reason why they are so far off nominal when the remainder of the measurements are within .0003".  If they are indeed .0001" and .0002" high, you would have an average error of +.00015" and a standard deviation of .00011.  If I remember correctly, three standard deviations gives you a confidence level of 99%, so you're pretty darn sure your reading is within +.00045/-.00015 of the actual value.  Not bad for an instrument that is normally only read to the nearest .001".

Also, I had heard that in calibration of instruments, the calibrating instrument or standard had to have at least a tenfold or better accuracy than the expected accuracy of the instrument being calibrated.  i.e. if I were calibrating a micrometer which read to ten thousandths, my gage blocks had to be accurate to within 10 ppm in order to maintain the traceability chain and to certify the micrometer.

I was not aware that that requirement held for measurements.
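The arithmetic above is easy to check with a few lines. The two corrected error values (+.0001 at 2" and +.0002 at 3") are an assumption based on randyc's later confirmation that those table entries were typos:

```python
import statistics

# Errors (caliper reading minus gage block) from randyc's table, in inches;
# the 2" and 3" entries are taken as +.0001 and +.0002 per his follow-up post.
errors = [0.0003, 0.0000, 0.0003, 0.0002, 0.0000,
          0.0002, 0.0002, 0.0001, 0.0002, 0.0000]

mean = statistics.mean(errors)
sigma = statistics.pstdev(errors)  # population standard deviation

print(f"average error : {mean:+.5f}")   # +0.00015
print(f"std deviation : {sigma:.5f}")   # 0.00011
print(f"mean +/- 3 sigma: {mean - 3*sigma:+.5f} / {mean + 3*sigma:+.5f}")
```

The resulting three-sigma band comes out close to, though not exactly equal to, the +.00045/-.00015 quoted above; the small difference is just rounding and how the band is anchored on the mean.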


----------



## 4GSR (Mar 29, 2015)

randyc said:


> ......The "go to" guy for this kind of discussion is _Gordon B. Clarke_ on "Practical Machinist".  He's spent his entire career dealing with the nit-picky topics of metrology and is a wealth of information.



Gordon has a wealth of information on most things.  But he's nowhere near the "go to" person on the subject.  I respect Gordon for what he has done and support him 100%.  He has a nice screw-measuring system for measuring pitch diameter but, like many, he can't write instructions that most of us can understand and use. 

BTW: He was a member for a short period before being banned several years back.


----------



## chips&more (Mar 29, 2015)

RJSakowski said:


> Are your 2 and 3 inch measurements meant to be .001" and .002" high?  If so, I would question the reason why they are so far off nominal when the remainder of the measurements are within .0003".  If they are indeed .0001" and .0002" high, you would have an average error of +.00015" and a standard deviation of .00011.  If I remember correctly, three standard deviations gives you a confidence level of 99%, so you're pretty darn sure your reading is within +.00045/-.00015 of the actual value.  Not bad for an instrument that is normally only read to the nearest .001".
> 
> Also, I had heard that in calibration of instruments, the calibrating instrument or standard had to have at least a tenfold or better accuracy than the expected accuracy of the instrument being calibrated.  i.e. if I were calibrating a micrometer which read to ten thousandths, my gage blocks had to be accurate to within 10 ppm in order to maintain the traceability chain and to certify the micrometer.
> 
> I was not aware that that requirement held for measurements.



A +1 on what you said. Yes, this is more like it. And accuracy is not repeatability. Precision is repeatability…Good Luck, Dave.


----------



## randyc (Mar 29, 2015)

RJSakowski said:


> Are your 2 and 3 inch measurements meant to be .001" and .002" high?  If so, I would question the reason why they are so far off nominal when the remainder of the measurements are within .0003".  If they are indeed .0001" and .0002" high, you would have an average error of +.00015" and a standard deviation of .00011.  If I remember correctly, three standard deviations gives you a confidence level of 99%, so you're pretty darn sure your reading is within +.00045/-.00015 of the actual value.  Not bad for an instrument that is normally only read to the nearest .001".
> 
> Also, I had heard that in calibration of instruments, the calibrating instrument or standard had to have at least a tenfold or better accuracy than the expected accuracy of the instrument being calibrated.  i.e. if I were calibrating a micrometer which read to ten thousandths, my gage blocks had to be accurate to within 10 ppm in order to maintain the traceability chain and to certify the micrometer.
> 
> I was not aware that that requirement held for measurements.



The 3.000 inch measurement is a typo, one of the zeros took a lunch break.  The 2.000 inch measurement - ditto, the decimal point became liberal and migrated to the left.  Although it would be natural to question the measurements, I am positive that those measurements are good ones WITHIN the ability of my eyes to interpolate the divisions on the dial (which, as I wrote, is sort of a crap shoot).

I strongly suggest that others make the same measurements to confirm (or not) the surprising results, given the presumed low quality of the tool I was using !

Metrology is not a strong suit for me - at least not mechanical metrology (I'm much better at electronics metrology, specifically RF and microwave instrumentation).  On reflection, although three standard deviations would seem to guarantee a much greater accuracy than either claimed or assumed, I would be uncomfortable accepting measurements of that precision, as I'm sure most others would be !

Similarly, while calibration standards almost always require an order of magnitude greater precision than the device to be measured, when making measurements down to the tenths (as I noted in my .0003 example) I personally would prefer something with that same magnitude of precision to reduce uncertainty when making bearing fits and the like.

Without spending a bundle, however, I'm stuck with tenths micrometers and a set of gauge blocks to compare them against.  A sometimes useful technique in the absence of gauge blocks is to use three instruments to make the measurement, discarding the worst measurement of the three.  This is simple enough for small dimensions where micrometers are cheap but gets pricey for large dimensions which are the ones that I'm most doubtful about !

A condition that I mentioned but didn't stress highly enough, perhaps, is the temperature environment.  Maybe my instrumentation (micrometers) is not as stable as it could be, but I have to recalibrate if shop temperatures change more than 15 or 20 degrees F.  Example:  A micrometer calibrated to a stainless steel standard measures an aluminum workpiece at 1.7500 inches in diameter when shop temperature is 60 degrees F.  When the same workpiece reaches a temperature of 80 degrees, it now measures 1.7502 inches.

Might be trivial - depends on the application -
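The differential-expansion arithmetic behind that example can be sketched as follows. The expansion coefficients here are typical handbook values I've assumed (they vary by alloy and are not figures from the post), so the result shows the order of magnitude rather than an exact match to the observed .0002:

```python
# Differential thermal growth of an aluminum part measured against a
# stainless-steel micrometer standard.  Coefficients are assumed typical
# handbook values in inches per inch per degree F; alloys vary.
alpha_aluminum = 12.3e-6
alpha_stainless = 9.6e-6

diameter = 1.7500   # inches at 60 F
delta_t = 20.0      # 60 F -> 80 F

growth = (alpha_aluminum - alpha_stainless) * diameter * delta_t
print(f"apparent growth: {growth:.5f} in")   # roughly .0001 in
```

The exact number depends on which coefficients apply and on whether the micrometer frame itself has reached temperature, but the effect is clearly large enough to matter when working in tenths.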


----------



## randyc (Mar 29, 2015)

4gsr said:


> Gordon has a wealth of information on most things.  But he's nowhere near the "go to" person on the subject.  I respect Gordon for he has done and support him 100%.  He has a nice screw measuring system for measuring pitch diameter, but like many, can't write instructions for most of us to understand and use.
> 
> BTW: He was a member for a short period before being banned several years back.



Perhaps I reveal my own lack of knowledge but Gordon always seemed to have the last word on measurement techniques and instrumentation.  His manner can be condescending but everything that I've read by him made sense (to me, at least).  I can't think of anyone on PM that is his obvious superior in knowledge and experience ... my personal opinion of course.


----------



## RJSakowski (Mar 29, 2015)

randyc said:


> The 3.000 inch measurement is a typo, one of the zeros took a lunch break.  The 2.000 inch measurement - ditto, the decimal point became liberal and migrated to the left.  Although it would be natural to question the measurements, I am positive that those measurements are good ones WITHIN the ability of my eyes to interpolate the divisions on the dial (which, as I wrote, is sort of a crap shoot).
> 
> I strongly suggest that others make the same measurements to confirm (or not) the surprising results, given the presumed low quality of the tool I was using !
> 
> ...


Randy, IMO your data supports your ability to accurately interpolate between divisions.  When I first learned to make scientific measurements, way back in Physics 101, we were taught how to interpolate.  I personally feel comfortable interpolating to the nearest .2 divisions and am fairly good at hitting .1 divisions.  I have a B & S pseudo-digital micrometer.  It reads out every five thousandths.  Past that, there are divisions for half and full thousandths and a vernier scale to determine where you are between the half-thousandth intervals.  I usually don't bother with the vernier and interpolate directly.  When I do check the vernier, I find that invariably I am within half a tenth of what the vernier says.  It sounds like you have that same skill.

As much as we tend to rag on Chinese tooling, I believe that every manufacturer tries to make things accurately.  Like your experience, every time that I have checked one of the cheap calipers or mikes against a higher standard, I have found them to be accurate to their expected reading.  At work, I had a pair of Harbor Freight digital calipers that I used for my personal work. They were included in the calibration schedule when the metrology lab came in to do our annual ISO calibrations and they were certified.
I expect that there is more error due to technique than to the tool.
Regarding the dimensional changes with temperature, I would guess that the difference is due more to the thermal expansion of the part than to the mike.  An easy test: keep one at a constant temperature and change the temperature of the other.  That same metrology lab would stabilize the temperature of any instrument they calibrated before they would calibrate it.  Surface plates had to sit for a week before they would touch them.  I am fortunate in that my shop is located in my basement and varies no more than 10F year-round. Most of the time it is within 5F.


----------



## randyc (Mar 29, 2015)

RJSakowski said:


> ..Regarding the dimensional changes with temperature, I would guess that the difference is due more to the thermal expansion of the part than to the mike...



That's definitely correct.  The example that I posted was derived from the difference between T/C for stainless and aluminum multiplied by the temperature difference and the work diameter.  There are so many variables that can affect precision, I will illustrate an example at the end of this post, extracted from:

http://www.hobby-machinist.com/threads/making-a-high-performance-yo-yo-what.32255/

I have an old micrometer from Sears, purchased new for slightly more than $5 in 1965.  It was cheap because of various shortcuts in the manufacturing process including making the frame from some mystery metal that is non-ferrous ("Mazak" ?).  When this tool is brought into the house from the shop, you can literally watch the "zero" changing half a division in five minutes !  At a constant temperature, the tool is just fine and can be easily interpolated, as in your own experience, to .0002, verifiable by gauge blocks.

You are indeed fortunate in the location/environment of your shop !  Mine is just an insulated 2-1/2 car garage ... the insulation does little because the big garage door is uninsulated 

OK, here's the quote detailing an experience that I had a few years back:



randyc said:


> Deflection of the outer rim of the hub was problematical because high spindle speeds caused the O.D. of the rim to centrifugally expand.  For example:  without moving the cutting tool location and changing the spindle RPM from 400 to 950 produced a measurable difference in the turned diameter - nearly .001.  This wasn't due to a "spring" cut, just centrifugal expansion of the perimeter of the workpiece due to the flexible cross section !
> 
> 
> 
> ...


----------



## RJSakowski (Mar 30, 2015)

My woodworking shop is in a 100+ year old granary where the temperature swing runs between -30F and +110F.  I have seen a 60 degree swing in a single day.  My welding/blacksmithing shop is in an old carriage house which can go between -30 and +100.  At least there, I can fire up the coal forge to take the chill out.  Needless to say, woodworking is a three season event and sometimes, summer days are better off spent fishing.  Fortunately, the fallback is the machine shop.  I also have my old Miller buzz box and Miller MIG welder there for small winter tasks.  I open the back door to the outside world and weld quickly.

The one problem with the basement shop is the summer humidity.  At temperatures in the high 60's and low 70's, it is the heat sink and it is a fight to control moisture.  We run the central air all summer and I run a dehumidifier all summer, but the old stone foundation is porous so moisture accumulates.  I pull about 20 pints of water out each day.


----------



## randyc (Mar 30, 2015)

Holy Cow !!!  I must be fortunate for experiencing only a fifteen or twenty degree swing throughout the day


----------



## VFM3 (Apr 5, 2015)

randyc said:


> Howdy, the fellow was me and as I said, I made two measurements on each gauge block.  Granted, that's not a lot of measurements to confirm repeatability but what the heck, these were $15 calipers and the two measurements agreed in every case.
> 
> I don't think pressure on the jaws is at all critical.  Using the same $15 calipers and a one inch gauge block, I just made the following experiment.  From barely snugging the caliper jaws (on the gauge block) to exerting a considerable amount of pressure, I note a variation of about .0005 on the dial.
> 
> Not trying to argue with you, but you might try the same experiment yourself - it only takes a few seconds and you might be surprised at how good your calipers are



Greetings Randyc,

I have confidence in my hands and in my calipers; however, sometimes I need to demonstrate to the other guys how measuring pressure will change readings. So I bought a specially designed (spring-assisted) caliper from Mitutoyo for that purpose; Gordon C. also manufactures a better design.

Some blokes were surprised at how they were either 1) measuring too hard or 2) measuring too soft on various materials, from plastics to steels.

I am not surprised that your cheap $15 calipers can pass calibration. Too many people believe that China is not capable of releasing decent, bargain-priced stuff.


----------



## RJSakowski (Apr 5, 2015)

VFM3 said:


> Greetings Randyc,
> 
> I have confidence in my hands and in my calipers; however, sometimes I need to demonstrate to the other guys how measuring pressure will change readings. So I bought a specially designed (spring-assisted) caliper from Mitutoyo for that purpose; Gordon C. also manufactures a better design.
> 
> ...


A tenth with calipers is impressive!   Have you made a comparison of readings with a mike on real world measurements (rounds, narrow flats, at the tips, etc.)?


----------



## randyc (Apr 5, 2015)

VFM3 said:


> ...I have confidence in my hands and in my calipers; however, sometimes I need to demonstrate to the other guys how measuring pressure will change readings....  Some blokes were surprised at how they were either 1) measuring too hard or 2) measuring too soft on various materials, from plastics to steels...



If I interpret your post correctly, you are of the school that calipers are sensitive to measuring pressure to which I earlier said that they are relatively insensitive.  I just made the experiment that follows:

The normally movable jaw was clamped securely with the calipers oriented vertically while the normally fixed jaw was allowed to move freely.  A one inch micrometer standard was inserted between the jaws and the dial was "zeroed" with light finger pressure against the top jaw.  Two different weights were then carefully balanced on top of the calipers, forcing the two jaws together.  Both weights were steel, the first 1.25 diameter x 1.375 long (0.48 pounds) and the second 3.75 diameter x 2.75 long (8.6 pounds).

The photo shows the variation of the needle for the two weights (remember that the needle was zeroed with jaws closed and no weight other than finger pressure).  I stand by my statement that calipers - at least THESE calipers - are relatively insensitive to pressure.  The variation from almost zero pressure to almost 9 pounds of pressure was about .001.




When it comes to plastics, obviously a soft touch is best
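Treating the jaw deflection as roughly linear in the applied force (an assumption on my part; the post gives only two load points), the weight test implies a compliance figure something like this:

```python
# Rough linear-compliance estimate from randyc's weight test:
# about .001 in of deflection at about 8.6 lb of load.
# Linearity is assumed here, not demonstrated.
deflection = 0.001   # inches at full load
force = 8.6          # pounds

compliance = deflection / force          # inches per pound
hand_force_error = compliance * 1.0      # at a typical ~1 lb hand pressure

print(f"compliance     : {compliance:.6f} in/lb")
print(f"error at ~1 lb : {hand_force_error:.5f} in")
```

On that estimate, ordinary hand pressure contributes only about a tenth of a thou, consistent with randyc's claim that these calipers are relatively insensitive to measuring pressure.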


----------



## VFM3 (Apr 6, 2015)

I am not sure what I am shocked at: the fact that the caliper withstood all that weight without damage, or that you managed to balance that weight precariously on the step end.

And I would like to restate my position on this: calipers are not sensitive to measuring pressure, but the parts you measure will be affected if you do not repeat the pressure exerted.


----------



## RJSakowski (Apr 6, 2015)

I find that variability in readings with applied pressure is greatly affected by the sliding fit of the movable jaw.  If the adjustments are loose, it is almost impossible to get consistent measurements.  My B & S dial calipers are similar in response to Randy's.  Measurements are most reliable when made close to the beam.


----------



## randyc (Apr 6, 2015)

VFM3 said:


> I am not sure what I am shocked at: the fact that the caliper withstood all that weight without damage or that you managed to balance that weight precariously on the step end .
> 
> And I would like to restate my position on this: calipers are not sensitive to measuring pressure, but the parts you measure will be affected if you do not repeat the pressure exerted.



Ha-ha, you're right to be shocked at that balancing act.  It probably required a full minute to stabilize the larger weight on the end of the calipers and since I live in earthquake country, I hastened to take the photo before quickly removing the weight.  

Regarding damage, recall that we are discussing a $15 set of calipers (that are over ten years old, BTW).  I had little to lose but these devices are so simply constructed and sturdy that I would have confidently used a heavier weight if I could have balanced it, LOL.

OK, your position is now clear that you are NOT of the school that feels caliper pressure is important   Thanks for the clarification.

Insofar as the parts being measured being affected by pressure - well, sure, for very soft materials; that's why I mentioned using a light touch on plastics.  I don't think repeatability of measurement pressure is an issue for the typical HSM (unless measuring urethane foam or something similar, in which case precise machining is all but impossible anyway).

In an industrial environment (like yours, I infer), you are best qualified to determine the proper measurement technique for characterizing the dimensions of the material you are processing.

For harder materials, the experiment indicates that the pressure exerted, and its repeatability, are uncritical.  For softer materials, I simply repeat the initial statement:



randyc said:


> ....When it comes to plastics, obviously a soft touch is best






RJSakowski said:


> I find that variability in readings with applied pressure is greatly affected by the sliding fit of the movable jaw.  If the adjustments are loose, it is almost impossible to get consistent measurements.  My B & S dial calipers are similar in response to Randy's.  Measurements are most reliable when made close to the beam.



Deflection of the jaws due to fit will of course multiply the error depending on the location of the part in the caliper jaws.  In the experiment that I conducted, the near nine-pound load would have multiplied the error at the end of the jaws resulting in about .002 rather than .001.

In my opinion, that's a fairly trivial difference given the magnitude of the load.  And the fact that this incredibly cheap set of calipers performs much like your B & S pair suggests a few things:

- this particular pair of imported calipers is obviously of good quality
- as a mass-produced product, one _might_ extrapolate that most calipers like this one are of good quality
- B & S calipers are known to be of high quality
- based on a performance comparison between low end and high end (my cheap calipers and your nice B & S pair), we _might_ presume that _the design/construction of this class of tools in general is excellent_, capable of most tasks (other than high precision) required by the HSM

Now the above comments - all of them - are not meant to be argumentative, and I mostly agree with the posts made by the two forum members !  And, if one accepts my bullets, the consistent accuracy and repeatability of these common tools is outstanding - the mechanical design/construction is adequate to minimize measurement uncertainty even under extreme conditions !

Makes me want to order a few more sets from HF


----------



## RJSakowski (Apr 6, 2015)

randyc said:


> Ha-ha, you're right to be shocked at that balancing act.  It probably required a full minute to stabilize the larger weight on the end of the calipers and since I live in earthquake country, I hastened to take the photo before quickly removing the weight.
> 
> Regarding damage, recall that we are discussing a $15 set of calipers (that are over ten years old, BTW).  I had little to lose but these devices are so simply constructed and sturdy that I would have confidently used a heavier weight if I could have balanced it, LOL.
> 
> ...


The last two digital calipers that I got from HF were terrible.  The grinding was rough and uneven on the pairs (Pittsburgh brand).  I bought two pairs because they were cheap.  Throwaways almost.  
One pair will randomly jump in multiples of .200"; someone else mentioned the same problem. The second pair was adjusted so loosely that it was impossible to get consistent readings.  When I tightened them up, they would bind terribly at about 2.5-4.0 inches.  Undoubtedly, this was the reason they were set so loose.  A little judicious hand work with my stones and they are now serviceable. 
In all fairness, over the course of more than ten years, I have purchased eight pairs of HF digital calipers for work and for my own use and found them to perform well with the exception of the last two pairs.   But I will definitely check any I buy in the future before I leave the store parking lot.


----------



## randyc (Apr 6, 2015)

Oh well, experiences differ   I have three pair, all are excellent but all were purchased over ten years ago.


----------



## 18w (Apr 6, 2015)

Just like death and taxes, the one thing you can count on these days is that what you bought a year ago will invariably be better than the same item bought tomorrow. It doesn't seem to matter whether we are talking Starrett, Craftsman, or Harbor Freight. At least that has been my experience lately.

Darrell


----------



## RJSakowski (Apr 6, 2015)

VFM3 said:


> I am not sure what I am shocked at: the fact that the caliper withstood all that weight without damage or that you managed to balance that weight precariously on the step end .
> 
> And I would like to restate my position on this: calipers are not sensitive to measuring pressure, but the parts you measure will be affected if you do not repeat the pressure exerted.


I would argue that calipers are sensitive to pressure.  The opposing forces are not directly in line giving rise to a torque moment.  If there is clearance in the beam tracking, it will twist the head slightly, giving a slightly different reading.  How much is dependent upon the clearance.  I adjust mine more on the tight side to minimize the effect.  

If you are measuring metal, the amount of force you apply by hand will not significantly compress the metal.  Plastics, wood, and rubber are obviously more susceptible to pressure, although on a piece of 1/2" Delrin, I couldn't detect any effect.


----------



## David S (Apr 7, 2015)

As an ex-quality-assurance person, the best way to validate what you are doing with a set of calipers, or any other measuring device, is to conduct a GR&R study.  By going through the exercise you can determine whether the measurement error is due to the instrument or the person.  Quite simple to do: we used three people and took three readings of each feature.  Google will give the method for those interested.

David
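A bare-bones version of the repeatability/reproducibility split David describes might look like the sketch below. The readings are made-up illustration data, and the simple variance split is mine; the full AIAG average-and-range or ANOVA GR&R methods add correction constants that this sketch omits:

```python
import statistics

# Three operators, three readings each of the same feature (inches).
# These numbers are invented purely to illustrate the breakdown.
readings = {
    "op1": [0.5002, 0.5003, 0.5002],
    "op2": [0.5005, 0.5004, 0.5005],
    "op3": [0.5002, 0.5002, 0.5003],
}

# Repeatability: spread of each operator's own readings (equipment variation)
within = statistics.mean(statistics.pstdev(r) for r in readings.values())

# Reproducibility: spread between the operators' averages (appraiser variation)
between = statistics.pstdev([statistics.mean(r) for r in readings.values()])

print(f"repeatability   ~ {within:.6f} in")
print(f"reproducibility ~ {between:.6f} in")
```

With data like this, the between-operator spread dominates the within-operator spread, which is exactly the kind of conclusion a GR&R study is meant to surface: the error is in the people (or their technique), not the instrument.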


----------



## Optic Eyes (Apr 28, 2021)

randyc said:


> The 3.000 inch measurement is a typo, one of the zeros took a lunch break.  The 2.000 inch measurement - ditto, the decimal point became liberal and migrated to the left.  Although it would be natural to question the measurements, I am positive that those measurements are good ones WITHIN the ability of my eyes to interpolate the divisions on the dial (which, as I wrote, is sort of a crap shoot).
> 
> I strongly suggest that others make the same measurements to confirm (or not) the surprising results, given the presumed low quality of the tool I was using !
> 
> ...


Just use the same instrument for all the measurements.  If you measure a shaft to bore a pulley, use the same caliper for both measurements.


----------

