• RadioFreeArabia@lemmy.cafe
    4 months ago

    No, the US has been an empire from the start. Unless you don’t count conquering and colonizing the indigenous peoples because they aren’t “civilized” or something.