Gender Roles Changing - Research Shows Changing Gender Roles
Kathryn Walker on 04 Nov 13
According to a survey of 3,500 Americans conducted by the Families and Work Institute and released in March 2009, traditional gender roles are changing: there has been an increase in the expectation that men and women will share paid work as well as the care of the home and children. The article discusses some interesting shifts in percentages (compared with prior years), reflecting men's increased role in the home and women's increased ambition for jobs with more responsibility.