[Metrology] Dial indicator contact point alignment

MrWhoopee, can dial test indicators like the one shown be calibrated for linearity, or just repeatability? I know the Starrett back plunger test indicators can only be calibrated for repeatability.


Lolz. I have an SPI (the I stands for Imports) back plunger dial with a US Navy cal certificate. A calibration program is as strong or as weak as its authors intend for it to be.
 
The test indicator measures a swept arc, so it is prone to what machinists oddly call cosine error. I do not agree that it is an error at all; it is a fact of life and is easily resolved into vector components. So it is true that a DTI can't be calibrated for linearity, but there are plenty of ways to write a cal routine for a variety of uses based on knowledge of the instrument.
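For what "resolved into vector components" looks like in practice, here is a minimal Python sketch of the usual cosine correction; the 0.0042" reading and 30-degree tilt are made-up numbers for illustration only.

```python
import math

def true_displacement(indicated, stylus_angle_deg):
    """Correct a lever-type DTI reading for cosine error.

    indicated        -- value shown on the dial
    stylus_angle_deg -- angle between the contact-point arm and the surface
                        being measured (0 deg = arm parallel to the surface)
    """
    return indicated * math.cos(math.radians(stylus_angle_deg))

# Made-up numbers: the dial shows 0.0042" with the arm tilted 30 deg to the surface.
print(f"{true_displacement(0.0042, 30):.4f}")  # ~0.0036" of actual movement
```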

Most of us use it as an indicator, to indicate and locate a difference, not (necessarily) to quantify one. That's why it is a setup tool, not a measuring tool in principle.

In a way, metrology is the science of mincing words.
Excellent answer. It is a comparative tool, with increments allowing adjustment until the two things being compared are adjacent, parallel, or centered. If I missed your point, let me know.
 
Some test indicators actually can read directly, if the manufacturer specifies the angle of the arm for a typical measurement. For a Starrett Last Word, I found out from a factory tech that the angle of the arm should be 15 degrees to measure tenths accurately. Mitutoyo refuses to document the angle, but if you set up gauge blocks you can determine the angle that allows direct measurement - for a few degrees, anyway. On a tenths indicator, you can read directly and accurately for just over a thou before sine (or cosine) error begins to get significant, but only if you know the correct angle and use it. (too much trouble)...
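To put a rough number on "for a few degrees, anyway": assuming the 15-degree reference angle quoted above, here is a short sketch of how far a 0.001" indicated sweep drifts from truth as the arm wanders off that angle. The cos(arm)/cos(reference) ratio is the standard cosine-error relationship; the specific angles in the loop are just illustrations.

```python
import math

def direct_reading_error(indicated, arm_angle_deg, ref_angle_deg=15.0):
    """Error in a 'direct' DTI reading when the arm is off the reference angle.

    The standard cosine relationship gives
    corrected = indicated * cos(arm angle) / cos(reference angle);
    the error is what's left over.
    """
    corrected = indicated * math.cos(math.radians(arm_angle_deg)) / math.cos(
        math.radians(ref_angle_deg))
    return indicated - corrected

# How far does a 0.0010" indicated sweep drift as the arm wanders off 15 deg?
for angle in (15, 18, 20, 25):
    err = direct_reading_error(0.0010, angle)
    print(f"arm at {angle:2d} deg -> error ~{err * 10000:.2f} tenths")
```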

The normal practice is to use plunge indicators to read directly and test indicators to make relative measurements, as stated above several times.
 
Just a curiosity question along the lines of metrology; I will ask it because there are some really sharp guys in this thread. Have any of you ever heard of, or run across, a new lathe with a factory-installed DRO where "both X and Y" are generating, for lack of a better term, creeping errors? If you run the reader out using the carriage/cross feed, then back to a solid stop, the error starts at 20 or 30 hundred-thousandths (0.0002-0.0003"), and each time you cycle it the error grows; after the 3rd or 4th cycle the error will be 0.003-0.004"+. I am puzzled and have not figured out the cause. The vendor is sending me new read heads; I have checked the current read heads and do not see any issues. I am wondering whether, if the ground for one of the scales is messed up, I could be getting electrical noise..... :distress:
 
Dabbler and Pontiac are both right. Also, "mincing words" is done to show the customer you have control of your calibration process - and, in the end, control of the process that produces the product.
 
Thank you, gentlemen. After some investigation, the test point is definitely bent; it is quite obvious when unscrewing it. Now the question is, do I want the $8 steel-tipped or the $25 carbide-tipped test point? lol
I got a carbide point for my Shars 0.0005” DTI only because it was a couple of dollars more.

More important is to get the correct-length point for your specific DTI, assuming you are interested in using it for absolute measurements: a point that is too long or too short will affect actual measurements ‘way more than cosine error (or lack thereof).
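To put numbers on the point-length effect: the contact point is a lever arm, so a point of a different length rescales every graduation by the ratio of the new length to the length the dial was calibrated for. A minimal sketch with made-up lengths (check your own indicator's spec for the real ones):

```python
def corrected_reading(indicated, fitted_point_mm, calibrated_point_mm):
    """Rescale a DTI reading for a non-standard-length contact point.

    The contact point is a lever arm, so actual tip travel scales with the
    ratio of the fitted point length to the length the dial was calibrated for.
    """
    return indicated * (fitted_point_mm / calibrated_point_mm)

# Made-up lengths: a dial calibrated for a 12.7 mm point, fitted with a 20 mm point.
print(corrected_reading(0.0005, 20.0, 12.7))  # ~0.00079" actual per 0.0005" indicated
```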
 
Lolz. I have an SPI (the I stands for Imports) back plunger dial with a US Navy cal certificate. A calibration program is as strong or as weak as its authors intend for it to be.
Back plunger test indicators CAN be calibrated for repeatability but NOT linearity. Our back plunger test indicators were in our calibration program but we all knew they were not accurate for linearity, only repeatability.
 
Just a curiosity question along the lines of metrology; I will ask it because there are some really sharp guys in this thread. Have any of you ever heard of, or run across, a new lathe with a factory-installed DRO where "both X and Y" are generating, for lack of a better term, creeping errors? If you run the reader out using the carriage/cross feed, then back to a solid stop, the error starts at 20 or 30 hundred-thousandths (0.0002-0.0003"), and each time you cycle it the error grows; after the 3rd or 4th cycle the error will be 0.003-0.004"+. I am puzzled and have not figured out the cause. The vendor is sending me new read heads; I have checked the current read heads and do not see any issues. I am wondering whether, if the ground for one of the scales is messed up, I could be getting electrical noise..... :distress:
Lathes have X and Z; Y would be a milling attachment.
 
Readers are susceptible to noise, but most of the commercial ones, including the China imports, have metal-shielded twisted-pair DC wiring and AC inductor filters built in. Modern encoding on the strip is digital and absolute-addressed. A reader sitting still against a mag strip or glass scale isn't traversing any "bar codes" to create drift. Voltage float will not affect the reading as long as it is in the TTL range (3.3-5 V), so wandering readings are most likely a fault in the head unit. Even if a read head is splitting two codes and jumping back and forth, the software should take care of that. It sounds to me like the seller is shipping you the cheapest component to swap. It makes no sense: if the fault shows up on both channels, it's not a bad read head.
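One way to pin down the symptom before swapping parts is just to log the reading at the hard stop every cycle and look at the per-cycle step; the sketch below is only an illustration (the sample readings loosely mirror the figures in the question, not real data).

```python
def per_cycle_drift(readings_at_stop):
    """Print the step between successive DRO readings taken at the same hard stop.

    A steady step each cycle points at a systematic, accumulating count error;
    a random walk looks more like electrical noise.
    """
    steps = [b - a for a, b in zip(readings_at_stop, readings_at_stop[1:])]
    for i, step in enumerate(steps, start=1):
        print(f"cycle {i} -> {i + 1}: {step:+.4f} in")
    return steps

# Illustration only, loosely based on the figures quoted in the question above.
per_cycle_drift([0.0000, 0.0003, 0.0012, 0.0025, 0.0038])
```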
 