Gender norms define what society considers male and female behavior, and they lead to the formation of gender roles: the roles males and females are expected to take in society.
