The displayed numbers now use a fixed decimal precision of 3 by default; it can be set between 1 and 6 (inclusive) via a runtime-per-user setting.
But after some thinking, I am not convinced anything needs to be done about the /m and /s measurements. For the reported numbers, I think it's better for them to be measured per second because:
1) In the case of raw resources, it's easier to compute how many miners / pumpjacks you need, since their mining / extraction rates are also measured per second.
2) It's easier to figure out when you need red/blue belts for a certain item or when yellow belts are just fine, since their transport speeds are also measured per second. A similar argument can be made for the different types of inserters.
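To make the argument concrete, here is a minimal sketch of the arithmetic a per-second display enables; the function names and the miner/belt throughput figures are my own illustrative assumptions, not values taken from the tool or the game:

```python
import math

def machines_needed(target_rate_per_s: float, machine_rate_per_s: float) -> int:
    """How many miners/pumpjacks are needed to sustain a target item rate,
    when both rates are expressed per second."""
    return math.ceil(target_rate_per_s / machine_rate_per_s)

def belts_needed(target_rate_per_s: float, belt_throughput_per_s: float) -> int:
    """How many belts of a given tier are needed to carry the target rate."""
    return math.ceil(target_rate_per_s / belt_throughput_per_s)

# Hypothetical numbers, for illustration only.
print(machines_needed(7.5, 0.5))   # 15 miners
print(belts_needed(20.0, 15.0))    # 2 belts
```

Because both sides of each division are already per-second, no unit conversion is needed; that is exactly the convenience the two points above describe.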
And I think the user-specified final product rate should be measured per minute, because it saves the user from having to input ugly decimal values (for example, you might want to produce 1 Productivity Module 3 every minute; it's easier to write 1/m than 0.0166666/s in that case).
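A sketch of how this could fit together, accepting per-minute input while keeping per-second internally, with the configurable display precision from above (the function names are hypothetical, not the tool's actual API):

```python
def parse_rate(text: str) -> float:
    """Parse a rate like '1/m' or '0.25/s' into items per second."""
    value, _, unit = text.partition("/")
    per_second = float(value)
    if unit == "m":
        per_second /= 60.0
    elif unit != "s":
        raise ValueError(f"unknown rate unit: {unit!r}")
    return per_second

def format_rate(per_second: float, precision: int = 3) -> str:
    """Format a per-second rate with the configured precision (1 to 6, default 3)."""
    if not 1 <= precision <= 6:
        raise ValueError("precision must be between 1 and 6")
    return f"{per_second:.{precision}f}/s"

print(format_rate(parse_rate("1/m")))      # 0.017/s
print(format_rate(parse_rate("1/m"), 6))   # 0.016667/s
```

This also shows why 1/m is friendlier as input: the ugly repeating decimal only ever appears in the displayed output, rounded to the chosen precision.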
Can you think of any good reasons why anyone would want to choose differently? And if I were to implement the feature, should it be changeable dynamically through the GUI, or would runtime-per-user settings (separate ones for displayed numbers and input production rate) be just fine?