Assuming you are specifically looking for information about the roles and rights of women in the United States, the Nineteenth Amendment to the Constitution, ratified in 1920, gave women the right to vote. This was a huge change in society, allowing women to campaign for or against legislation that affected them and the issues they cared about.
The Civil Rights Act of 1964 affected women in many ways. Title VII established the requirement that employers treat male and female employees equally. It was among the first pieces of legislation to address concerns such as pay inequities and discriminatory hiring practices.
The Supreme Court's 1973 decision in Roe v. Wade held that the right to privacy of every individual woman was "broad enough to encompass a woman's decision whether or not to terminate her pregnancy." The ruling had the immediate effect of legalizing abortion in the early stages of pregnancy, and its interpretation and reinterpretation continue as opponents and advocates test it against various sets of specific circumstances.
These changes have allowed women to become more actively involved in political and social life outside of the traditional domestic setting. They have brought greater financial independence and expanded employment opportunities, and they have transformed the impact of reproductive issues on women's lives.