Radical feminism is a perspective within feminism that calls for a radical
re-ordering of society in which male supremacy is eliminated in all social and
economic contexts, while recognizing that women's experiences are also
affected by other social divisions such as in race, class, and sexual
orientation. The ideology and movement emerged in the 1960s.
Radical feminists view society fundamentally as a patriarchy in which men
dominate and oppress women. Radical feminists seek to abolish this patriarchy
in order to liberate women and girls from what they regard as an unjust
society, challenging existing social norms and institutions.