How do I calibrate micrometers and gage blocks?

HMF

I have seen notations on micrometers and sets that they were calibrated on so and so date.
Gage blocks have a certificate of calibration.

Stupid newbie questions:


1- What does that mean?

2- How do you calibrate these tools?


Nelson
 
Ultimately, calibration traces the measuring tool back to a standard kept at the National Physical Laboratory or a similar national institution... The "local" standard used for in-house calibration won't be as accurate, and will itself be calibrated against a more precise standard at a calibration or test house.

An example would be the standard that should come with a 1 to 2 inch micrometer if it's a reputable instrument. That standard will have been precision ground at a stable temperature (often 68 F) by the manufacturer, who checks it against a snap gauge that is calibrated against a local working standard, which is checked against a master snap gauge, which is in turn checked against another (known to be more accurate) standard. Eventually the increasingly precise standards in the chain are referenced to a national master, which is measured by counting wavelengths of a specific frequency of light (which is how standard lengths are now defined)!

Gauge blocks, micrometers and other measuring tools in a production environment are subject to wear and tear, so they need to be periodically checked (calibrated) against a more accurate (and less worn!) standard. Often a production shop (with aspirations to precision) will keep a set of reference standards to test measuring tools in-house, and these will in turn be sent to a calibration lab with more accurate (and more authoritative) standards, who will do the same with their own standards, ultimately reaching the good-to-less-than-a-wavelength precision back at the national lab.

So... ultimately, you can't calibrate them yourself without a lot of *very* expensive equipment, but you can have them calibrated for a price :)

For the hobbyist, a good solution would be to find (and buy a few drinks for) a friendly inspector at a commercial precision engineering shop who is willing to at least check your equipment and write down the errors he finds. In a real scrape, buy a set of 1-2-3 blocks - these are mostly precision ground and accurate to within a few tenths, and can be used to adjust micrometers etc. fairly accurately, particularly if they come with their own calibration certificates. But they can't then be used for setup on the mill, for instance, as the wear they suffer will affect the dimensions and take *them* out of calibration!
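
To make that "write down the errors" idea concrete, here is a rough sketch in Python of the kind of error chart you could keep in the mic case. Everything in it is invented for illustration - the block names, certified sizes and readings are hypothetical, not real cert data:

# A minimal sketch of the "write down the errors" idea above.
# All certified sizes and readings here are invented examples.

# Reference sizes you can reach with 1-2-3 blocks
# (hypothetical certified actual sizes, in inches)
references = {
    "1 in block": 1.00005,
    "2 in block": 2.00010,
    "3 in block": 2.99995,
}

# What your micrometer read at each reference (hypothetical)
readings = {
    "1 in block": 1.0002,
    "2 in block": 2.0003,
    "3 in block": 3.0001,
}

print("Reference      Actual     Reading   Error")
for name, actual in references.items():
    reading = readings[name]
    error = reading - actual   # positive = mic reads oversize at this point
    print(f"{name:<14} {actual:.5f}    {reading:.4f}   {error:+.4f}")

Once you have a chart like that, you just subtract the listed error from whatever the mic reads near that size.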

Dave H.
 
Absolutely nothing "stupid" about this, very good questions. In a manufacturing organization or a governmental agency (responsible for adherence to specifications) there is a formal procedure, including documentation, designed to assure that any measurements made with the organization's instruments are within a specific range of accuracy - allowable tolerances. This goes back to the birth of interchangeable parts (oversimplified: that parts made by a shop in Cleveland will fit properly with parts made in Detroit, etc.). Over the decades the "science" of all this evolved into "metrology." There is a lot of material on this on Wikipedia and other sites. In general the maintenance of the tools of measure falls under the organization's "quality control" (QC) or "quality assurance" (QA) departments or programs. I have always been confused as to the difference between "QC" and "QA," but I will venture that quality control is the physical testing of the tools and quality assurance is the formal recording of results and verification of adherence to the program.
In any case, all tools are tested to be within a given tolerance that matches the original tool manufacturer's tolerance of accuracy; it being generally accepted that these tolerances are traceable to standards held in the (US) National Bureau of Standards - now known as the National Institute of Standards and Technology (NIST). Overseas manufacturers may have different governing standards, such as DIN or British Standards, etc. The tolerances are transferred to the tool manufacturer or to independent testing laboratories via "secondary" bars or blocks, etc. These standards come with a written certification specifying the degree of accuracy. These certs also detail the temperature and method of testing, etc.
Gauge blocks come with this kind of certification. I may be wrong, but I believe that the first precise standardized gauge blocks were made by Ford Motor Company and subsequently produced by Starrett. Perhaps others can confirm this.
The "line" measuring tools have a certain time interval that requires periodic re-certification or when other factors (such as dropping). The certification of tolerances not only applies to direct measuring tools but to optical comparators, electronic, optical, temperature, and radiological frequency devices, and a host of other components that have a function to maintain tolerances.
Hope this gives a reasonable answer. Your gauge blocks will more than likely hold their tolerance within the design temperature range for both of our lifetimes combined and can perform a very close check on your mikes, calipers, etc.
Geoff
 
Hopefuldave is spot on! Much less formal than mine and more practical.
Cheers, Dave
 
In practical terms, you cannot calibrate (or, more accurately stated, certify to a size or tolerance) gage (Jo) blocks at home. Calibration labs are equipped with clean rooms and environmental controls to keep their high precision measuring equipment as accurate as possible. Gage blocks are manufactured to precise requirements, and graded according to where they actually measure. There are various classes of gage blocks, pins and angle blocks. In the QC lab, you buy the grade required to ensure that your in-house instruments meet their respective accuracy requirements. Many people use the terms "calibrate" and "verify" or "certify" interchangeably. They are different, really. To calibrate an instrument really means to adjust the instrument to meet a specified degree of accuracy. So in order to calibrate something, it is implied that the instrument is adjustable. Not all are. Gage blocks are just one example of a non-adjustable instrument. There is no way to adjust one, so you really can't calibrate it, only certify whether or not it meets a particular level of accuracy, or grade. Thread gages are the same, except tapered thread gages such as those used in API tool joint gaging. Those can be reground to restore proper thread profile, and their gaging surfaces replaced or reground to yield acceptable accuracy.

Certification simply means comparing the instrument to a known standard and evaluating it: either accept it for use as is (and mark it as such) or reject it. An instrument, or a standard for that matter, can be measured and marked with its "actual" size and be perfectly acceptable and usable. For example, suppose you have a micrometer standard rod that is not exactly 3.00000 (and few of them are dead on). A lab with the proper equipment can accurately (and remember, accuracy is relative) measure the standard rod and mark its actual size. Say it accurately measures 3.00120. Does that mean toss it? No, just be aware of its actual size and make sure that when you use it, you refer to its actual size, not its nominal size of 3.00000. The micrometer should read 3.00120, exactly the size of the standard rod. Same with gage (Jo) blocks. You will find acceptance standards for blocks and pins that allow them to be categorized as A, B, C, etc., or one of the newer grade designations. If you require a gage block set to meet those standards, then you must replace the out-of-tolerance blocks or pins individually. For all practical purposes in a home shop, a grade B (or current equivalent grade) set is plenty sufficient for any work. And there is no real reason to have them re-certified unless there has been damage or abuse.
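
To put numbers on that nominal-versus-actual point, here is a minimal sketch in Python. The 3.00120 figure is the hypothetical example above, and the micrometer reading is invented:

# Sketch of "refer to the actual size, not the nominal size".
nominal = 3.00000      # what is engraved on the standard rod
actual = 3.00120       # what the lab measured and marked on it (hypothetical)

mic_reading = 3.00125  # hypothetical reading when the rod is miked

error_vs_nominal = mic_reading - nominal  # looks like the mic is way off
error_vs_actual = mic_reading - actual    # the mic is really only 0.00005 off

print(f"Error judged against nominal: {error_vs_nominal:+.5f} in (misleading)")
print(f"Error judged against actual:  {error_vs_actual:+.5f} in (what matters)")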

Once you have a known good standard, whether it be rods or blocks, you can do a basic calibration of your micrometers. Measure the standard, and if needed, adjust the micrometer to agree with the standard.....whatever actual size it is. This you can do at home. There are two more things to check on OD micrometers: flatness of the contact surfaces (anvil and spindle), and lead screw error. Lead screw error is checked by comparing the reading at zero with the reading at the 1.0000 point (the micrometer at its widest opening). Hopefully they agree, but they seldom do, because of lead screw error. Some is to be expected, and some people prefer that the micrometer be set as accurately as possible at the 0.5000 point, allowing the lead screw error to be averaged out on either side of the halfway point. The flatness of the anvil and spindle faces is checked with optical flats, which can be purchased and used at home. There is usually not much you can do at home to fix the faces if they aren't flat and parallel, but at least you can know whether they are good or not. Again, it depends on your specific needs as to whether they are acceptable. A special micrometer checking block set has blocks at odd intervals so that the measurements fall every 90 degrees of spindle rotation instead of only on the "zero" line.
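
Here is a rough sketch in Python of recording a full-range check on a 0-1 inch micrometer, along the lines described above. The check sizes are one commonly quoted set of odd intervals - use whatever your own checking block set actually contains - and every reading below is hypothetical:

# Full-range check of a 0-1 in micrometer (all readings are invented examples).
check_sizes = [0.105, 0.210, 0.315, 0.410, 0.500, 0.605, 0.710, 0.815, 0.920, 1.000]
readings =    [0.1051, 0.2101, 0.3152, 0.4102, 0.5002, 0.6053, 0.7103, 0.8154, 0.9204, 1.0005]

errors = [r - s for r, s in zip(readings, check_sizes)]
for size, err in zip(check_sizes, errors):
    print(f"{size:.3f} in block: error {err:+.4f} in")

# Setting the mic dead-on at the 0.500 point instead of at zero shifts every
# error by the same amount, splitting the lead screw error between the two
# ends of travel (the "average it out" idea above).
mid_error = errors[check_sizes.index(0.500)]
adjusted = [e - mid_error for e in errors]
print(f"\nWorst error when zeroed at 0:       {max(abs(e) for e in errors):.4f} in")
print(f"Worst error when set at 0.500:      {max(abs(e) for e in adjusted):.4f} in")

In this made-up data the worst-case error drops from half a thou to three tenths when the mic is set dead-on at mid-range instead of at zero.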

That is only a cursory discussion of the topic. There is much more, but I don't want to overload the thread.
 
Tony, that's the best condensed explanation of calibration I've ever heard. Thank you.

Tom
 
Tony: Very well stated and a pleasure to read. Thanks for the explanation. Roger
 
Tony, sounds like you have been there and done this a few times in the past. I know I have.

Some of the places I have worked at in the past calibrate your mics to within +/-.0005 and slap a "green" sticker on them and tell you to come back in 6 months.

Believe me, I've seen worn out mics with green stickers on them being used to manufacture parts!
 