Well, everybody born on the American continent is technically “American” too, including Central and South America. Is there a specific term in English for these people?
Edit: Thanks for all your answers, especially the wholesome ones and those patient enough to explain it thoroughly. Since we (South Americans) and you (North Americans) use different conventions for drawing continent boundaries, it makes sense for you to go by “Americans,” while it doesn’t for us.
I didn’t know that, thanks.
Look, man, I’m not American, and I didn’t ask the question to start some debate about the ethics of it or whatever. I just wanted to know if there was a specific word for that.
Eh, I agree the common and mostly unambiguous usage is that “America” refers to the USA, but even in English it feels incongruous sometimes.
Just to be clear, I didn’t think you were being offensive. It came across entirely as a good-faith question from a foreigner, but it ties into the (ironically arrogant) advocacy from some foreigners who call Americans arrogant for using the term “American.”