That's not controversial with me at all, since it's all the accuracy that's required, but I do have a "slight" counterpoint. Maybe not even a counterpoint, perhaps just a footnote...
When looking for a standard micrometer, I'd recommend one that reads to tenths. Why, when you'll probably never need it? Because the cost of a tenths micrometer versus a 0.001" one is NOT much different, if different at all. And specific to this type of micrometer, because the tenths reading is "separate" from the thousandths reading, it can (and should) be ignored for 99.9 percent of all practical purposes. It's also self-supporting: it doesn't rely on a separate holder to keep it in place, so it won't drive you mad. And tenths is the highest resolution you'll get meaningful results from outside of a highly controlled environment.

The benefit of having it is twofold: A, it's a zero to low cost way to add that precision to the collection, and B, having that unnecessary precision available to "play with" is a very good opportunity to learn good habits and practices that will apply to just about every measurement you can take, along with getting comfortable with a vernier scale, if one isn't already. I still say it's not needed, just that having it there for "playing with" is a good opportunity for skill building, and for learning just how much the world is made out of rubber, even at the one thousandth level.

I'm speaking in inches, but I'd say the same about a 0.002mm micrometer. You can do the math and see that 0.002mm is not exactly 0.0001 inches, but let me assure you that in a home shop, you need not do that math, because they are the same...
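For anyone who does want to do that math anyway, here's a quick sketch of the conversion (the 25.4 mm-per-inch factor is exact by definition):

```python
MM_PER_INCH = 25.4  # exact, by international definition of the inch

metric_resolution_mm = 0.002
inch_equivalent = metric_resolution_mm / MM_PER_INCH

# 0.002 mm works out to roughly 0.00008", a bit finer than a tenth (0.0001")
print(f"0.002 mm = {inch_equivalent:.6f} in")
```

So the metric version is actually slightly finer on paper, but as noted above, at home-shop temperatures and finger pressures the difference is meaningless.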
Link for visual reference only. Literally the first hit in the search. I'm not recommending for or against this one, just that this style of micrometer is the type that I think is a great way to "play with" unreasonable accuracy when you're playing with your tools, and to ignore that excessive accuracy when you're actually doing something useful.