What are the main reasons people care so much about American politics?

This is not meant to be controversial, I'm simply curious. I've spent time traveling around the world, and no matter what country I visited, everyone seemed to have an opinion on US politics. I haven't noticed this happening with any other country's politics.