If there is one true thing about being a famous woman in Hollywood, it's that you MUST announce whether or not you're a #feminist upon entering Tinseltown's pearly gates.
Today, Taylor Swift realized that she's been a feminist THIS ENTIRE TIME!
She said, "As a teenager, I didn't understand that saying you're a feminist is just saying that you hope women and men have equal rights and equal opportunities."
"What it seemed to me, the way it was phrased in culture, society, was that you hate men."
"And now, I think a lot of girls have had a feminist awakening because they understand what the word means."
"For so long, it's been made to seem like something where you'd picket against the opposite sex, whereas it's not about that at all."
"Becoming friends with Lena [Dunham]…"
"Without her preaching to me, but just seeing why she believes what she believes, why she says what she says, why she stands for what she stands for - has made me realize that I've been taking a feminist stance without actually saying so."
So, there you have it. Taylor Swift has always been a feminist and just didn't know it.