[Image: cover of Cory Doctorow's book Radicalized]

US health insurers get more and more federal funding, deliver less and less care

The American healthcare system is the worst of all possible worlds. Unlike every other wealthy country, the US leaves health insurance to the private sector, where your health and your life run a distant second to shareholder profits.