America Has Become a Nation of Lies
America has become a nation of pernicious lies, and citizens are increasingly required not only to embrace those lies but also to espouse and defend them. Those who dare to refuse are ostracized and publicly castigated by the led-by-the-nose politically correct, who are as legitimately ignorant as they are illegitimate to all that …