That is weird and annoying. It is bad to have units that force measurements to overstate how accurate they really are. If someone says they are 60 inches, we know that figure is probably accurate to within an inch. If they say they are 1524 mm, they are wrong.
Customarily you wouldn't say 1524mm unless you were actually claiming that level of accuracy. Assuming this is an estimate, it's 1.5m. Anything over 1000mm is described in decimals of a metre. So 1.1 metres. 1.23m. 3.9m.
This lets an estimate carry exactly as much precision as you intend, and it's very simple.
If it's less than a metre, you're going to say something is 300 "mill". But funny you should be talking about accuracy, as there's really no reason to understate it either. If you're measuring, you're measuring in "mills". The misunderstanding you have is that you're still thinking about what something is in inches, and converting.
We're not converting.
The one mistake the US government made when it was pegging its units to metric was that it should have made "1 inch" = exactly 25mm. Instead they defined it as 25.4mm for accuracy (which I understand, but they still rounded…), and this makes the transition very tough.
In fact, if it had been 25mm = 1in, and therefore 300mm = 1ft, and 900mm = 1yd, you'd probably have dropped the yard and be working in metric by now.
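Just to make that arithmetic concrete, here's a quick sketch comparing the real 25.4mm inch with that hypothetical 25mm one (the variable names are mine, nothing official):

```python
# Real definition vs the hypothetical "neat" inch from the paragraph above.
REAL_MM_PER_INCH = 25.4   # the definition the US actually adopted
NEAT_MM_PER_INCH = 25.0   # the hypothetical round value

for name, inches in [("inch", 1), ("foot", 12), ("yard", 36)]:
    print(f"1 {name} = {inches * REAL_MM_PER_INCH:g}mm today, "
          f"would be {inches * NEAT_MM_PER_INCH:g}mm")

# 1 inch = 25.4mm today, would be 25mm
# 1 foot = 304.8mm today, would be 300mm
# 1 yard = 914.4mm today, would be 900mm
```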
Just remember that if we're starting out measuring in millimetres, we're not losing accuracy to back-and-forth conversions. If something's measured at a rough 300mm and it's actually 298, it's just 2mm off. That's still smaller than an eighth of an inch.
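If you want to check that, it's one line of arithmetic (a sketch, using only the numbers from above):

```python
MM_PER_INCH = 25.4

error = abs(300 - 298)          # rough estimate vs actual, in mm
eighth_inch = MM_PER_INCH / 8   # 3.175mm

print(f"{error}mm error vs {eighth_inch:g}mm for 1/8 inch")
print("still finer than 1/8 inch:", error < eighth_inch)  # True
```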
And by the way… imagine a measurement system without those stupid fractions…