Kiwi asks…
I've often seen the "Empire" appear as the main antagonist in Western media. Why is that? Does the empire symbolize something oppressive?
Answer from a Native speaker

Rebecca
Yes. Historically, empires have grown and thrived by colonizing and taking over other cultures and states; that is how they become empires in the first place. Once they successfully invade a place, they force their language, politics, and culture onto the local people in order to control them more easily. The British Empire and the Japanese Empire, for example, were known to be cruel and oppressive toward the peoples and places they took over. This is why empires are usually portrayed in a negative light: historically, they have caused a great deal of human suffering.
Ex: The British Empire colonized much of Asia, Africa, Oceania, and the Americas.
Ex: If the Empire invades, they will kill us all.
Listening Quiz