A frequently held position is that the best results are achieved by letting the free market operate, and that government attempts to correct it in the interest of fairness only make matters worse. This is a typical conservative economic philosophy, sometimes called "free market fundamentalism," and it does not seem to me to be supported by the evidence.

On the other hand, because of a pre-existing condition I was never able to get health insurance until the Affordable Care Act was passed, and without insurance I could not have gotten the surgery I needed six years ago. (I realize that what is good for me personally might be bad in general, but I would like to see some proof that this is the case.) Would the free market have mandated, for instance, warning labels on cigarettes, or even ingredient information on canned goods?
Is there good reason to believe this position? If so, what is it?