Many younger people and feminists today insist that American culture intrinsically promotes inequality and assault against women, and the topic comes up constantly on social media. While I am not necessarily a big supporter of the "Me Too" movement per se, I do feel this is a serious problem.
Was this always the case in America? Is the situation in America more severe than in Canada and other countries? It is 2021, and you still read about all kinds of allegations and crimes against women in the news almost every day, which is sad and shocking.