When did women in America first get their rights?

Expert Answers
saintfester eNotes educator| Certified Educator

Women actually had many rights prior to the passage of the 19th Amendment to the U.S. Constitution, which granted them the right to vote.

In certain religious groups, such as the Quakers, women were allowed to be ministers, speak in public, and vote in community meetings.

Prior to the passage of the Constitution, some states actually extended women the right to vote. New Jersey allowed both women and free Black residents to vote until 1807.

In the mid-1800s, women began joining reform movements aimed at addressing social issues in the U.S. Many were allowed to speak in public, hold important offices in large organizations, and travel without their husbands to rallies and meetings. The abolition movement especially would not have been nearly as successful without the help of women.

In 1841, the first women in the U.S. earned their bachelor's degrees from Oberlin College.

In 1848, the women's rights movement officially began after a Women's Rights Convention was held in Seneca Falls, New York. These conventions were held regularly between 1850 and 1861 and succeeded in getting many states to create legal protections for women. Illinois passed divorce laws that favored women, and New York allowed women to own their own property.

And of course, in 1920 American women finally received the right to vote.

hanny-banany | Student

Due to the women's suffrage movement, women began to gain rights in 1920, such as the right to vote.

cutiepie92298 | Student

yup 1920 i knew i was close...
