[Newbie] DRO Error

I would use gage blocks, at multiple distances.

That is standard calibration procedure, and being lucky enough to have a Mitutoyo inspection grade set that's what I'd do, but are you suggesting that everyone buying a DRO has or should buy a gauge block set? :D
 
You don't have to assume... that IS the problem. When you take an .011 cut on a rotating piece of work, it takes .011 off all the way around, not just on one side. Some DROs will work in diameter or radius, operator's choice. Don't know if yours has that feature or not.

When you take a .011 cut it takes .055 off each side.
 
When you take a .011 cut it takes .055 off each side.
I think you mean .0055, but in any case, I should have said "depth of cut", not just "cut". I corrected it in post #7.
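The radius/diameter bookkeeping being corrected above can be sketched in a few lines. This is just an illustrative helper (the function name and numbers are my own, not from the thread): a depth of cut removes material from both sides of rotating work, so the diameter shrinks by twice the infeed.

```python
# Illustrative sketch of the radius/diameter arithmetic discussed above.
# A depth of cut "d" (radius infeed) on a lathe removes material from BOTH
# sides of the rotating work, so the diameter shrinks by 2*d.

def new_diameter(start_dia: float, depth_of_cut: float) -> float:
    """Diameter after one pass at the given depth of cut (radius infeed)."""
    return start_dia - 2 * depth_of_cut

# Example: a .011" depth of cut takes .022" off the diameter,
# i.e. .011" off each side.
print(round(new_diameter(1.000, 0.011), 4))  # 0.978
```

This is also why a DRO in diameter mode shows twice the number a radius-mode DRO shows for the same slide movement.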

Tom
 
That is standard calibration procedure, and being lucky enough to have a Mitutoyo inspection grade set that's what I'd do, but are you suggesting that everyone buying a DRO has or should buy a gauge block set? :D
Beg, borrow, buy, or otherwise find a suitably accurate substitute. I know half a dozen people locally who I could borrow what is needed from if I did not have my own. It helps to have machinist friends... ;) Not everyone wants or needs a set of gage blocks. They are nice for the job because you can wring them together into different sized accurate stacks and test at multiple distances, quite easily. With my ordinary gage block set I can wring 4", 2", 1", and 1/2" blocks into many combinations to test with. If you have or can borrow micrometer standards of various sizes you can use them. Think outside the box, but maybe a yard stick is not the direction to steer towards... :eek 2:
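To show how far those four blocks go, here is a small sketch (my own illustration, assuming the 4", 2", 1", and 1/2" blocks mentioned above) that enumerates every distinct stack length you can wring from them:

```python
from itertools import combinations

# Illustrative sketch: with just the 4", 2", 1", and 1/2" blocks mentioned
# above, wringing them in different combinations yields many distinct
# test lengths along an axis.
blocks = [4.0, 2.0, 1.0, 0.5]
stacks = sorted({sum(c) for r in range(1, len(blocks) + 1)
                 for c in combinations(blocks, r)})
print(stacks)
# Every nonempty subset has a unique sum: 15 test lengths from 0.5" to 7.5"
```

Fifteen accurately known distances from four blocks is plenty for checking a DRO axis at multiple points.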
 
Something else to consider: your toolpost and tool are not perfectly inflexible. The DRO will measure the tool movement, but inevitably there will be some slop in the actual movement when the tool pushes against the work. I normally reset my DRO settings after an actual micrometer measurement as I get close to the final desired diameter. In this instance, it's true this large error is the radius/diameter thing, but be aware that DRO readings in the real world are not absolute; you have to adjust for the flex in your setup as you near your desired diameter.
 
With my ordinary gage block set I can wring 4", 2", 1", and 1/2" blocks into many combinations to test with.

For calibration of measuring devices, including micrometers and DROs, you should use some intermediate fractional sizes and small increments, as whole units only will not detect some problems.

Something else to consider, your toolpost and tool are not perfectly inflexible. So the DRO will measure the tool movement, but inevitably there will be some slop in the actual movement when the tool pushes against the work.

Setting your DRO after an initial skim, and then taking a cut at the depth of your planned finishing cut, will get you very close.
 
That is standard calibration procedure, and being lucky enough to have a Mitutoyo inspection grade set that's what I'd do, but are you suggesting that everyone buying a DRO has or should buy a gauge block set? :D
For calibration of measuring devices, including micrometers and DROs, you should use some intermediate fractional sizes and small increments, as whole units only will not detect some problems.
OK, so first you tell me that not everyone can afford test equipment like a gage block set, so I offer some simpler ideas that will also work better than a caliper or a dial indicator. Then you say that my basic gage block stacks are not good enough. I have offered several accurate but low-dollar approaches to doing the calibration. I think it is now your turn to tell us the answer that makes you happy. :)

BTW, I really don't think a DRO cares much about the increments (micrometers, yes), just testing correctly and at multiple accurately known distances. Please don't keep us in suspense, enlighten us!
 
Since we are all here to learn: does it matter whether the compound slide is positioned at 90° or 29.5° (or any other angle)? Based on my understanding, shouldn't it also affect the correct reading of the DRO?
I know the DRO on my LMS mini lathe measures radius, and if I don't divide the amount to be removed from the diameter in half (.010" max depth of cut on mild steel), I could potentially stall the motor.
 
Please don't keep us in suspense, enlighten us!

I based my replies to you on your own statements of what is or isn't suitable. If you need gauge blocks and want to follow an acceptable calibration procedure for use with blocks, then whole measurement units only are unsuitable ;-)

In fact, a good dial gauge plus a good vernier caliper, along with suitable sight aids if required, are adequate for the job of checking a DRO axis; if they were not, then they would also be unsuitable for their other common uses, which quite obviously they are not. :D
 
After reading this thread, I decided to check the DRO on my old mill/drill. I hadn't realized that the scales could be calibrated. (Thanks, Bob!)
I set up two 1-2-3 blocks to align with the x and y axes and mounted a tenths-reading test indicator on the head. I zeroed the indicator and DRO on the 1-2-3 block and then moved out 6" on the x axis using a 6" parallel as a spacer.

I had previously measured the parallel using my 6" micrometer, so I knew the actual length to .0001" and the accuracy of my micrometer. As I recall, it came out at 6.0004". I then corrected the DRO calibration to read the same. I repeated for the y axis.
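The correction step above amounts to scaling the DRO so its reading matches the known distance. A minimal sketch, assuming the numbers from my setup (the function name is just illustrative, not a DRO feature name):

```python
# Illustrative sketch of the calibration step described above: after
# measuring a known length (here a parallel checked with a micrometer),
# scale the DRO so its displayed value matches the actual distance.

def dro_correction_factor(actual: float, displayed: float) -> float:
    """Multiplier to apply to the DRO scale so displayed matches actual."""
    return actual / displayed

# e.g. the parallel actually measured 6.0004" but the DRO read 6.0000":
factor = dro_correction_factor(6.0004, 6.0000)
print(round(factor, 7))  # 1.0000667
```

Most DROs accept either this kind of scale factor directly or an "actual vs. displayed" pair and compute it internally.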

There is an assumption that the micrometer is accurate and the only way to tell for certain would be to calibrate it using a traceable standard. I do not have one so I have to assume that it is reading correctly. For my hobby purposes, this is not a serious issue for me.

This brings up the more general problem of how a hobbyist goes about verifying his/her metrology instruments. My particular approach is to try to measure known values. To that extent, I use gage pins for values under one inch. For longer distances, making a set of blocks or pins of appropriate lengths, measuring them with calibrated instruments, and reserving them as your internal standard works. I realize that proper calibration dictates that the standard used have at least three times, and preferably ten times, better accuracy than the instrument you're calibrating, but it isn't necessary for most purposes. If I can say that a measurement is good to +/- .0002", it will meet almost all of my needs.
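The 3:1/10:1 rule of thumb above is easy to check numerically. A hedged sketch (names and tolerances are my own illustration):

```python
# Illustrative sketch of the accuracy-ratio rule of thumb mentioned above:
# the standard should be at least 3x (preferably 10x) more accurate than
# the instrument being checked.

def accuracy_ratio(instrument_tol: float, standard_tol: float) -> float:
    """How many times tighter the standard's tolerance is."""
    return instrument_tol / standard_tol

# Checking a +/-0.001" DRO axis against a +/-0.0001" standard:
ratio = accuracy_ratio(0.001, 0.0001)
print(ratio >= 3)   # True: meets the 3:1 minimum
print(ratio >= 10)  # True: also meets the preferred 10:1
```

If the ratio falls below 3:1, the standard's own uncertainty starts to dominate what you think you're measuring.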

I also take the viewpoint that if two instruments give exactly the same reading, they are most likely both accurate. Errors rarely happen in exactly the same amount. For smaller micrometers, I have a number of them to compare in this way. For my 0-6" set, I can measure the calibration standards with two different micrometers (e.g., the 5" standard can be used to check both the 5" and 6" micrometers). In this way, I can build up enough confidence in my metrology instruments to permit functional use of them in my work.
 