Suppressing the Truth--Romans Chapter 1 Come to Life
Why has American culture changed so dramatically in the past 25 years? Was there a tipping point that triggered significant cultural change, and if not, what contributed to the shifts in values and behaviors we're now seeing in the streets, on campuses, even in the halls of Congress? How does American culture's view of truth, or the lack thereof, play into this? According to the great historian Will Durant, "a great civilization is not conquered from without until it has destroyed itself from within." This is a scary thought. Is America sliding toward its demise? How should Christians respond? For more Christian commentary, check my website at rexmrogers.com.