Do Americans really believe everyone actually wants to be American?
When the boys were talking about taking Greenland, they seemed to have the attitude of “Why wouldn’t they want to be American?” With all this talk about taking over Canada, I’ve seen similar takes. As a Canadian, literally not one person I know or have talked to wants to be American. Do y’all really believe other countries want to be a part of the United States?