Who actually looks at western "first world" societies these days and concludes that they're an example worth following?
A large part of the West is now islamified, and people haven't even been talking about it since the refugee "crisis" began in 2015. Looking back twenty years, these changes seem crazy. You only experience western culture in movies from the distant past, endlessly repeated on TV.