No, I have mine on a mill, and the default raw display is a decimal point plus five digits.
It's not more accurate, but the 1 µm scale has finer resolution, so it can potentially be more precise.
The issue is with diametric feed, where Dia = infeed x 2: your .0002 step resolution becomes a .0004 step resolution. But I don't see it that way. The read head is reading two tracks at once, coarse+absolute and fine, and it can interpolate positions between the counted lines with some math tricks, so I don't think you would actually see .0004 steps on your screen. My mill scales are all 1:1, my display is set for four zeros (.0001), and I get valid microstep DRO data without steps at .0002 intervals, thanks to the interpolated result.
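Here's a toy sketch of the idea, not the actual firmware: assume an idealized head putting out a clean quadrature sine/cosine pair, and a hypothetical 5 µm pitch (roughly that .0002" line spacing). The fractional position inside one line falls out of an arctangent, so the usable resolution ends up far finer than the pitch:

import math

PITCH_UM = 5.0  # hypothetical 5 um grating pitch, roughly a .0002" line spacing

def interpolated_position(line_count, sin_v, cos_v):
    # Whole lines counted on the fine track, plus the fractional
    # position inside one pitch recovered from the quadrature pair.
    fraction = (math.atan2(sin_v, cos_v) / (2 * math.pi)) % 1.0
    return (line_count + fraction) * PITCH_UM  # position in um

phase = 2 * math.pi * 0.25  # a point a quarter of the way into a line
print(interpolated_position(10, math.sin(phase), math.cos(phase)))  # 51.25 um

So even though the scale only has a line every 5 µm, the reported position can sit anywhere between lines.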
Wikipedia has a good article on linear encoders that explains how this works; the technique is typical of multiplexing schemes and isn't that cutting-edge. I don't buy the talk of rounding error here; it doesn't fit this particular implementation of the technology. For all we know, the head could be reporting a 16-bit value and incrementing it; it's a black box until we see the source code. I don't think .0002 would be an acceptable limitation otherwise.
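And a quick sanity check on the diameter-doubling worry, again with made-up numbers and assuming the display simply doubles an interpolated position for diameter mode and rounds to four places: you land on odd ten-thousandths as well as even ones, so there's no hard .0004 grid.

# Fake interpolated radial readings 1 um apart, far finer than a 5 um pitch
for microns in range(26):
    radius_in = microns / 25400.0   # 25400 um per inch
    print(f"{2 * radius_in:.4f}")   # diameter display = 2 x infeed, 4 places
# Output includes .0001, .0003, .0005, ... so no even-only step pattern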