God doesn't support America's national interests. He doesn't fight on America's side, and America has no special claim to God's blessing or protection. Americans are not God's chosen people, and American patriotism isn't a Biblical value. It's time to set the record straight.
America isn't a Christian nation. America was founded during a time when nearly everyone in the Western world believed in the God of the Bible, or at least in His existence, and early America was settled and shaped by religious, God-fearing people, many of them Christians. Therefore, God, the Bible, and religion featured prominently in the personal language of our Founding Fathers and influenced the laws set forth in the documents that govern our nation. There is no question that the basic principles of this country are strongly informed by Judeo-Christian ethics and the gospel of Jesus Christ. America wouldn't be the nation we know today were it not for its Christian heritage. That said, the Founders intentionally crafted those founding documents to prevent government from establishing a national religion or exercising any form of religious intolerance. In this representative democracy, everyone's voice must be respected; the majority may rule, but they never have the right to tyrannize the minority. Even this openly tolerant Constitutional principle is informed by Scripture: it acknowledges the truth that all people have equal value and promotes Christ's command to love your neighbor (even your political or religious enemies), despite individual differences.
So, while you may (rightly) argue that America has a Christian heritage, or (accurately) suggest that America used to be a "nation of Christians" in the sense that, at one time in history, nearly every American citizen professed some form of Christian faith, the fact remains that this country has never been officially Christian in policy, practice, fact, or intent. There's certainly nothing inherently Christian about a democratic government or capitalist economy. America has only ever been as Christian as its people. If America has become "less Christian" over the decades, you'll find it's because a smaller percentage of its citizens call themselves Christian, and even fewer of those practice what they profess. This isn't a trend you can (or should) reverse with legislation.
From another angle, Christians should bear in mind that America came to exist through rebellion and selfish defiance (however heroic and eloquently justified) and, since then, many of our dealings have fallen far short of anything remotely Christ-like. Examine our history (our treatment of Native Americans, our oppression of women, our support of slavery, our racism against Japanese Americans during WWII and against African Americans and Hispanics still today) or our current affairs (our deplorable stewardship, our neglect of the poor, our national arrogance, to say nothing of the greed and self-interest that fuel the political and economic engines that drive the country). Do followers of Jesus really want this country representing Christianity to the world? Make no mistake, this is a great nation and I'm thankful for it. But God's not an American. Personally, I'd prefer it if the name of Christ and the reputation of His followers weren't automatically on the line for every action of these United States.
3 comments:
Great points! I hope it starts some good comment-dialogue.
Sadly, I think the church in the U.S. is too wrapped up in patriotism. Good words.
America was "One Nation Under God", "In God we trust" etc. Not disputable.
Now that "America" has been flooded and is drowning in idols and foreign gods you can accurately say that we are not a "Christian" nation. I'm not even sure if we are American anymore!