Some of our products use MicroPython, though we also use a whole host of other technologies. Some of our devices are proofs of concept (often built to test a theory), but we also deliver up to Class B solutions.
Carefully, at least for devices with higher classifications. Pre-/early allocation helps, but more importantly we monitor memory use over time in realistic scenarios. We've built tooling, such as a memory profiler [1], that lets us observe memory consumption and quantify performance over long runs.
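To give a flavour of that kind of monitoring (this is a minimal sketch, not the profiler from [1]): MicroPython exposes gc.mem_free(), and polling it after a collection at fixed points in a realistic workload lets you spot a downward trend that indicates a leak or creeping fragmentation. The fallback to gc.get_objects() is only there so the sketch also runs under CPython.

```python
import gc
import time

def mem_snapshot():
    # Force a collection first so we measure genuinely free memory,
    # not garbage awaiting collection.
    gc.collect()
    if hasattr(gc, "mem_free"):
        # MicroPython: bytes free on the heap.
        return gc.mem_free()
    # CPython fallback (illustrative only): live object count.
    return len(gc.get_objects())

def log_memory(samples, interval_s=0):
    # Take repeated snapshots; in practice you'd interleave these
    # with your device's real workload and persist the readings.
    readings = []
    for _ in range(samples):
        readings.append(mem_snapshot())
        if interval_s:
            time.sleep(interval_s)
    return readings
```

A flat series of readings across many workload cycles is the healthy signal; a steady decline is what you investigate.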
However, it turns out that MicroPython has a simple and efficient GC, and once you eliminate code that gratuitously fragments memory, it behaves quite predictably. We've tested devices running realistic scenarios for months without failure.
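One common way to avoid gratuitous fragmentation (a sketch of a standard technique, not necessarily what these particular devices do) is to allocate buffers once at startup and reuse them, rather than creating fresh objects on every iteration:

```python
import io

# Allocate once, at startup, while the heap is still unfragmented.
BUF = bytearray(64)
VIEW = memoryview(BUF)

def read_packet(stream):
    # readinto() fills the pre-allocated buffer in place, so the
    # hot path performs no per-call heap allocation of the data.
    n = stream.readinto(BUF)
    # Slicing a memoryview returns a view of the bytes, not a copy.
    return VIEW[:n]
```

Patterns like this keep steady-state allocation near zero, which is a large part of why the GC then behaves so predictably.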