I wouldn't trust the lead screw pitch to be accurate, especially on a HF machine. Many of the Chinese machines use a metric lead screw with Imperial-marked dials, on the assumption that 1 mm (.03937") is equal to .0400". Even if the lead screw has a true Imperial pitch, you would be trusting that the manufacturer's machining accuracy is spot on. Given the reputation of Chinese machinery, that is a pretty big trust.
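To put a number on that assumption, here is a quick sanity check (a sketch in Python, with the 1 mm pitch purely illustrative) of how much systematic error the metric-marked-as-Imperial shortcut builds in:

```python
# The metric-marked-as-Imperial assumption: a 1 mm pitch screw
# actually advances 0.03937" per turn, but the dial claims 0.0400".
actual_per_turn = 1.0 / 25.4   # true advance in inches (1 mm)
assumed_per_turn = 0.0400      # what the dial claims, inches

error_per_inch = (assumed_per_turn - actual_per_turn) / actual_per_turn
print(f"Systematic error: {error_per_inch:.4f} in/in")   # ~0.016 in/in
print(f"Over a 6\" move:  {6 * error_per_inch:.3f} in")  # ~0.096 in
```

About 1.6%, or nearly a tenth of an inch over 6" of travel, which is why you calibrate against a standard rather than trusting the screw.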
The accuracy of the calibration depends upon the accuracy of your primary standard. It also depends upon the distance between the calibration points, so the greater the distance, the better. I used my 6" micrometer as the primary reference standard. Ideally, it would have been calibrated against a good gage block, which it wasn't, so I have to trust that it is accurate. My basis for that trust is that it agrees with the other measurement devices that I possess.
Even my cheap 1/2/3 blocks are fairly accurate. They typically measure about .0002" under nominal. It should be noted that even name-brand 1/2/3 blocks can vary from nominal; they are often sold slightly oversize to allow for lapping to final size. Assuming the .0002" difference and using the 3" dimension of a 1/2/3 block as the standard, its contribution to the calibration error would be about .00007"/1", which is close enough for any hobby-class work. (This assumes good technique and care in measurement.) I used a .0001" test indicator, so repeatability should not be a factor in the calibration. If using a .0005" indicator, one should still be able to repeat to +/- .0001" with care.
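The arithmetic behind that .00007"/1" figure, and why a longer calibration distance dilutes the standard's error, looks like this (a minimal sketch; the 6" comparison is my own illustration, not a measurement):

```python
# A fixed error in the standard is spread over the whole calibration
# distance, so the per-inch contribution shrinks as distance grows.
block_error = 0.0002   # 1/2/3 block measures this far under nominal, inches
distance = 3.0         # calibration distance (the 3" side of the block), inches

print(f"{block_error / distance:.5f} in/in")  # 0.00007 in/in over 3"
print(f"{block_error / 6.0:.6f} in/in")       # ~0.000033 in/in over 6"
```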
My setup was as follows. I clamped a bar to my table parallel to the axis being calibrated, as verified by sweeping its edge with my test indicator. Next, I clamped a 1/2/3 block at one end of my calibration distance as a stop. (I used the 1/2/3 block because it has a reasonably good surface and is reasonably square.) Using my test indicator, I zeroed my DRO at the stop, then moved the table slightly more than the calibration distance and inserted my calibration block. Then I used the test indicator to find the surface of my calibration block and noted the DRO reading. Finally, I measured my calibration block and compared that number to the DRO reading to determine the scale factor. For my calibration block I used one of my parallels, which actually measured slightly longer than 6" but was still within the range of the 6" micrometer.
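The last step is just a ratio. Here is a sketch of it with made-up readings (the numbers are illustrative, not my actual measurements; how you feed the factor back in depends on your particular DRO):

```python
# Scale (correction) factor from the calibration move.
measured_length = 6.0137   # calibration block per the 6" micrometer, inches
dro_reading = 6.0102       # what the DRO showed over the same move, inches

scale_factor = measured_length / dro_reading
print(f"Scale factor: {scale_factor:.6f}")
# The DRO's displayed travel times this factor gives the true travel;
# most DRO setups let you enter the factor (or equivalent counts-per-inch).
```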
Yuriy's method is basically the same; the difference is that I swept the surface with a test indicator rather than using a square to set the block parallel to the axis. Any error in the angle will show up as cosine error: a 1° misalignment amounts to about .00015"/1" of error. By sweeping the surface, you greatly reduce that possibility.
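For reference, the cosine error at a few small misalignments (a short sketch; the angles chosen are just examples):

```python
import math

# Cosine error: if the block sits at a small angle to the axis,
# the measured distance is the true distance times cos(angle).
for degrees in (0.25, 0.5, 1.0):
    error_per_inch = 1 - math.cos(math.radians(degrees))
    print(f"{degrees:>4} deg misalignment -> {error_per_inch:.6f} in/in")
# 1 deg -> ~0.00015 in/in, which is why sweeping the surface
# with the indicator is worth the extra trouble.
```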
Lacking a test indicator, it is still possible to do the calibration using an edge finder. With care, it is possible to locate an edge to +/- .0001".
When the opportunity presents itself, I would suggest preparing a set of calibration standards, as long as is compatible with your machines, and measuring them accurately with a calibrated micrometer. That might mean taking them in to work and measuring them with a NIST-traceable calibrated micrometer. Mark them with the measured size and store them safely for future use.