A majority of Americans think Obamacare will make health care in our country worse, and they're right.— Phil Gingrey