Walmart Clinics, Health Insurance Sales Plans Show Retailer’s Rising Interest In Health Care Business
The world’s largest retailer wants a bigger piece of America’s growing health care market.
Walmart, which already earns substantial health care revenue through its pharmacies and retail stores, is eyeing a new line of business selling health insurance to small employers and plans to expand its chain of in-store health care clinics, the Orlando Business Journal reported in two stories Friday.