Whatever happened to trust and faith in Uncle Sam? What happened to the root ideals of democracy? Wasn't the purpose of a democratic republic to give people faith in the government because they control it? Whatever happened to people not just being paranoid about the government, but actually doing something because they can? Why are we so afraid of the government, when none of us has really been affected by its "oppressiveness"? Why are Americans, who live in the most free country in the entire world, so fearful of their government? What happened to a time when liberals weren't just saying "government sucks, blah, blah"? Why doesn't anybody who fears an oppressive government actually run for office and do something? Why do people feel that nothing is being done? Why can't people realize that, with the benefit of hindsight, the United States is actually the best country to live in?