So tell me, why do women feel the need to 'play the victim'?
I recently broke up with a girl I'd been in a relationship with for a while. We both knew it was time for it to end, we agreed on it, and we decided we'd still be cool with each other now that it's over.
Then, just days after we break up, she's complaining to her friends, talking about how she was torn to pieces and how I totally mistreated her (in fact, I don't think I could have treated her any better; practically like a queen). I know she's just doing it to get sympathy from her friends, but in the process she's making me out to be a bad person.
Seriously, this kind of thing happens quite often. I see it in other relationships as well, and always from the female side. I don't mean to sound sexist, but this is just what I see from my experience. Is it REALLY necessary to do all that?