America is often accused of having no culture. The left says that there is no “American culture”. Is that true? Even for a second?