• Yeah, in general, the US is definitely right wing. By the standards of most of the US's peers, the Democrats are, at best, center right, and the Republicans sit where those countries would place what they'd call their Fascist parties.