Kind of a vague question, but I guess anyone who responds can state their own interpretation.

Edit: I guess I’m asking because everything I’ve learned about America seems to not be what I was told? Idk how to explain it. It feels like the USA is one event away from civil war, outright corruption, and turning into a D-class country.

  • Flummoxed@lemmy.world · 8 months ago

    I have to wonder if our teachers (the good ones in elementary, at least) meant to teach us how it should work, because that was all we could grasp at the time. Maybe it was their (misguided?) attempt to make us feel serious anger and a call to action once we discovered the truth of the system for ourselves. I’m a teacher, and I have sometimes realized students are not capable of understanding a complex situation; in those cases I at least try to ensure they understand I am giving them an idealized, simplified perspective that does not reflect how it works in reality. I try to plant the seeds for a critical understanding later on, but I’m sure there are students out there who believe I lied to them about how the world really works.

    ETA: added “good” to modify “ones” in first sentence for clarity