I understand not everyone here is from the United States. But for those of us who do live there, American society has become ever more secular and materialistic, and increasingly hostile to Christianity and its associated family values - there are rising divorce rates, increasing amounts of vulgarity and violence in popular culture such as Hollywood, widespread mental illness, our broken health care system and student debt problems, the systemic failures in our education system, a drastic rise in violent crime in recent years, mass shootings, substance abuse and drug addiction, and joblessness - and of course, faith and spirituality both seem to be in very heavy decline.
And if you've been on social media or some "mainstream" online forums recently, you'll find people who are SO hateful and dismissive of Christianity that even some atheists would frankly be somewhat shocked at their intensity. For example, one feminist argued that Christianity's emphasis on modesty for women helped promote a culture of assault against them. And that is far from the most extreme view I have encountered.
There is a definite sense of a lack of real purpose and deeper meaning in life for so many Americans today - a kind of nihilistic and jaded self-absorption. There is no deeper moral compass to anchor them as they navigate the myriad complexities of an increasingly complicated and fragmented society. And then there are the ongoing political problems amidst all this recent chaos and confusion - both parties seem to hate each other more than ever, so much so that I must wonder whether the two-party system itself is outdated or fundamentally flawed.
Is it not telling that most Americans now despise each other more than they despise any other country? Is ANY other society as deeply divided and polarized as we are now? Add to that the entire complicated mess in Afghanistan (even Democrats are no longer as supportive of Joe Biden after recent developments), and we have a very sad situation, objectively speaking. All of this has, quite honestly, disillusioned me for quite a while now - not just one or two issues, but all of them piled on top of each other.
I do believe that change is necessary for any society - but what most people forget is that it is possible for a society to "regress" as well as "progress" - or that not all change is necessarily for the better in the long run. And sometimes, seeing so much darkness and mutual hatred almost everywhere I look, I cannot help but wonder whether America is a fallen nation already.
In U.S., Decline of Christianity Continues at Rapid Pace
https://www.washingtonpost.com/religion/2021/03/29/church-membership-fallen-below-majority/
My brothers and sisters in Christ, what do you think? Do you think these concerns are valid? Do you think it is legitimate to consider leaving the United States for another country?
"No man can serve two masters: for either he will hate the one, and love the other; or else he will hold to the one, and despise the other. Ye cannot serve God and mammon." (Matthew 6:24)