We have been bombarded lately by the media with news of infanticide, socialism, women’s rights (or what is masquerading as such), and forced medical treatments.
I am here to tell you that the God who made us did so in His perfect image. While we do have sin in the world, nowhere in the Bible does God tell us to rely on the world instead of Him. Nowhere does He say we should stray from His teachings.
Our religious freedoms are under attack daily in the media and among our lawmakers. If we stick our heads in the sand, we will end up with a mouth full of sand and a nation under the rule of the liberal left.
I am also here to tell you that your body is sovereign from the government. You are not owned by them, nor should you place your trust or empowerment in them.
Therefore, I state outright: do not give to others what is not theirs to take. I know that we have become a nation that relies on doctors.
So I conclude by stating: before you throw your hands up and believe the media hype and fear-mongering, before you divide and classify people based on your own biases, make sure of one thing: Is it in Scripture? Is it Biblically sound? Does what you are thinking and saying come from the point of the Gospel?