• CharlesDarwin@lemmy.world · 5 months ago

    To do a “well, actually”: I guess we didn’t really have an empire until well past the country’s founding. We were a mostly irrelevant backwater for quite some time.

    ETA: And as someone who freaking loves this country, I realize that a comment like the above could probably get you into a fight two days from now, especially when all kinds of fetishistic performative military nonsense and uncritical screeds about this country are at peak ridiculousness (and I love July 4th in spite of these flaws). I consider myself a real patriot, but one who is unwilling to whitewash this country.

    • RadioFreeArabia@lemmy.cafe · 5 months ago

      No, the US has been an empire from the start. Unless you don’t count conquering and colonizing the indigenous peoples because they weren’t “civilized” or something.