by Star Parker
Now that Democrats have won the White House and widened their margin of control in Congress, does it follow that American voters have moved to the left?
Many Republicans question this claim. And a new report from the Pew Research Center seems to confirm that America is still a right-of-center country. But the picture gets murky when you look at the details. And this murkiness presents a considerable challenge for Republicans who are trying to figure out where to steer their party.