Been having some issues with my drill press - either bent drill bits or something else. Thought it would be good to measure the TIR. Used a ground 3/8" rod mounted in my drill press chuck, checking it with both a DTI and a dial indicator. The DTI was a no-name thing. It seemed to indicate a lot of runout - 0.003" or so. I then used an old Federal dial (plunger) indicator (0.0001" per division); it measured 0.0012" TIR (I think, but see below). Both were held by my Noga holder. Made sure neither of them was bottomed out. The DTI has a little ball, whereas the old Federal has a relatively large hemispherical contact point.
Contact point was 1.5" down from the chuck. I'd like to believe the Federal. I have an old Enco (made in Japan) DTI that I can try. Since I'm just a newbie, I'm well aware that technique (or lack of it) can change the answer - like when I forgot to tighten all three points on the Jacobs chuck; tightening them properly reduced the TIR from 0.005" to 0.003" on the no-name DTI.
Perhaps I don't quite know how to use the DTI. I kept the lever relatively flat against the work to reduce cosine error. How flat does it have to be? For measuring a rod, does the lever have to be 90 degrees to the axis of the workpiece and tangential to the surface?
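For what it's worth, the cosine error follows a simple relation: the indicator reads roughly actual motion × cos(θ), where θ is the angle between the stylus lever and the surface. A quick sketch of the arithmetic (the runout numbers here are made up for illustration, not from my measurements):

```python
import math

def cosine_error_factor(angle_deg):
    """Fraction of the true motion a DTI indicates when the
    stylus lever sits at angle_deg to the measured surface."""
    return math.cos(math.radians(angle_deg))

# Hypothetical actual TIR, just to show the size of the error.
true_runout = 0.0030  # inches

for angle in (0, 10, 20, 30):
    indicated = true_runout * cosine_error_factor(angle)
    print(f"{angle:2d} deg off flat: reads {indicated:.4f}\" "
          f"of {true_runout:.4f}\" actual")
```

At 10 degrees the under-read is only about 1.5%, so "relatively flat" doesn't have to be perfect - the error grows quickly past 20 to 30 degrees, though.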
When using the Federal dial indicator I noticed the runout didn't strictly repeat. By that I mean the low point never got that low again; the high point always seemed to repeat. Say it started at 0, went up to 0.0014", then I kept rotating until it should have been back at 0" - instead it read 0.0002". Rotate some more and it's 0.0014" again; keep going (same direction) and it comes down to 0.0002". If I change direction of rotation, the Federal will drop to 0.0001", but not 0. (If I keep rotating, I get 0.0014" and 0.0002" again.) So is it sticking, or is this sort of what happens? (0 wasn't really zero - it was 0.1000".) I put a drop of oil on the tip of the "ball" and the behavior stayed the same.
Don't know if this belongs here, or in the measurement forum. Either way, I'm a little puzzled. Can someone enlighten me?