The west has fallen
An older catchphrase of the alt-right and adjacent groups. It expresses the idea that the social decay and fall from grace these groups perceive in Western countries is akin to the fall of Rome; that the Americas and Europe are beginning to collapse or lose their former glory.
Paul: The Western countries are declining: crime is rising, whites are being replaced, values are lost, and cultures are being eradicated.
Thomas: The west has fallen!