It's an empirical question what the ratio of false positives to true positives is. If the ratio is 1 in 10,000, for example, I think that's tolerable and a net positive. You also have to account for humans making similar mistakes.
But humans aren't always logical, especially when it comes to low-probability events that have a high impact if they do occur, like plane crashes or, to a lesser degree, car crashes.
Yeah, but in this case the impact is the same whether you die from human error or automation error. I would rather reduce the overall probability of death by 50% even if the probability of death from malfunctioning equipment goes up by 1%.
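To make the trade-off concrete, here's a minimal sketch with entirely made-up numbers (the risk values are hypothetical, chosen only to illustrate the arithmetic): even if the equipment-failure risk rises by 1%, automation can still cut the total risk in half as long as it removes enough human-error deaths.

```python
# Toy comparison with hypothetical annual death risks.
baseline_human_error = 0.0100    # deaths attributable to human error (made up)
baseline_equipment   = 0.0001    # deaths from equipment failure (made up)

automated_human_error = 0.004949  # automation removes most human-error deaths
automated_equipment   = 0.000101  # equipment-failure risk up 1% (hypothetical)

baseline_total  = baseline_human_error + baseline_equipment
automated_total = automated_human_error + automated_equipment

reduction = 1 - automated_total / baseline_total
print(f"overall risk: {baseline_total:.4f} -> {automated_total:.4f}")
print(f"relative reduction: {reduction:.0%}")
```

The point of the sketch is that the two risk sources add, so what matters for the "would I rather" question is the sum, not which component caused any individual death.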