mattfivefour
Well-Known Member
While America is not specifically mentioned in the Bible, I believe she figured prominently in God's plans. When Great Britain abdicated her role as defender of the world's weak and proclaimer of the Gospel to the nations, America took over. And I believe she still has a role to play, broken and bruised as she is on the spiritual battlefield. Her situation today is much like Israel's of old, when they allowed the riches and comforts provided by God's blessings to deceive them into thinking they had earned all those things by their own hands.

Why do you assume God is going to bring back America? The only nation with a promise is Israel. The Bible seems to suggest that all nations will fall to the deception, and that includes America. America is not biblically special. Unless America actually is Mystery Babylon, but that would mean active judgment and destruction.
I know it is painful, but we have to accept that the West is dying. There is too much rot.
God is calling the truly saved in America back to repentance and prayer. There is still work to be done around the world, and He wants His people to step up. Hopefully we will all listen and obey. If we humble ourselves, repent, and turn from our wicked ways, He will hear and use us. We may be but a remnant, but little is much when God is in it.