Feminization (sociology)
Shift in cultural norms
In sociology, feminization is the shift of gender roles and sex roles in a society, group, or organization toward a focus on the feminine. It can also refer to the incorporation of women into a group or profession that was once dominated by men.[1]