Actually, no. The margin of error is GREATER for imprecise measuring devices than it is for extremely precise ones.
For example, let's say I'm 5'5" tall and that I have a 3-foot-long piece of wood with no markings on it whatsoever. I stand it next to me end over end: I'm taller than one length but shorter than two, so I call it almost 6 feet. I can't say how much shorter than 6 feet I am, because my measuring device can't resolve anything finer. All the stick really tells me is that my height is between 3 and 6 feet. That's a HUGE margin of error!
Now let's say I use a 1-foot-long piece of wood instead. Stacking it end over end, I can tell my height is a bit more than 5 feet, so it must be somewhere between 5 and 6 feet. But again, I can't get any closer to my actual height than that, because my measuring instrument is still too coarse. My margin of error is now 1 foot.
However, if I used a yardstick or measuring tape and got a height of 5'5", I'd know that reading was accurate to within about 1/16" (or 1/32", depending on how finely the yardstick or tape is marked). That's a far more reasonable margin of error than a foot, because it's a minuscule fraction of my height.
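If it helps to see the idea in numbers, here's a tiny Python sketch (my own illustration, not any standard formula). It models each device as only resolving whole multiples of its smallest unit: you can see which mark you've passed, so all you learn is that your true height falls somewhere between that mark and the next one.

```python
import math

def measurement_interval(true_height_in, resolution_in):
    """Return the (low, high) bounds, in inches, that a device with the
    given resolution can pin the true height to: you can see which mark
    you've passed, but not how far you are toward the next one."""
    low = math.floor(true_height_in / resolution_in) * resolution_in
    high = low + resolution_in
    return low, high

true_height = 65.0  # 5'5" in inches

for name, resolution in [("3-foot stick", 36.0),
                         ("1-foot stick", 12.0),
                         ("measuring tape", 1.0 / 16.0)]:
    low, high = measurement_interval(true_height, resolution)
    print(f"{name:>14}: between {low}\" and {high}\"  "
          f"(margin of error {high - low}\")")
```

Same person, three wildly different error bars: 36 inches for the big stick, 12 inches for the small one, and 1/16 of an inch for the tape.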
Now, let's pretend I treat the margin of error for all three measuring devices as negligible. My first measurement says 6 feet, my second says 5 feet, and my third says 5'5". So between measurements I apparently shrank by a foot and then grew five inches. Clearly, that's nonsense.
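And that's exactly what keeping the margin of error fixes: report each result as an interval instead of a single number and the three measurements don't contradict each other at all, because every interval contains the true height. (Again, just a quick illustration of my own; the endpoints are the ones from the examples above.)

```python
# Intervals from the three examples above, in inches (my own restatement).
readings = {
    "3-foot stick":   (36.0, 72.0),        # between 3 and 6 feet
    "1-foot stick":   (60.0, 72.0),        # between 5 and 6 feet
    "measuring tape": (64.9375, 65.0625),  # 5'5", give or take 1/16"
}
true_height = 65.0  # 5'5"

for name, (low, high) in readings.items():
    consistent = low <= true_height <= high
    print(f"{name:>14}: [{low}\", {high}\"] contains 65\"? {consistent}")
```

All three print True. The "shrinking and growing" only shows up once you throw the error bars away.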