The role of women in society has been contested throughout the ages across Western Europe and, to a large extent, the rest of the world. Before the Enlightenment, women were generally regarded as secondary to men in nearly every aspect of life; some of the reasons were religious, while others were simply matters of custom. Many changes occurred during the Enlightenment of the late eighteenth century. During that century, for instance, married women's lives revolved largely around managing the household, a role which in many cases included partnership in running farms or home businesses.
The defiance of English rule and the onset of the Revolutionary War disrupted these traditional patterns.
Women were a vital element of the army's support, carrying out paid tasks such as laundering and nursing. Men were unwilling to do this work, and had these tasks gone undone, the army would have been even more seriously depleted by disease. In addition, women performed unpaid duties as cooks, food foragers, spies, and water carriers. However, the number of women generally exceeded what was required, and they often represented a nuisance to commanding officers: women and their accompanying children consumed scarce rations and slowed the army's movement. Nevertheless, they were tolerated, both because they performed jobs important to the welfare of the armies and for fear that the men would desert if their families were sent home.
In colonial America, women who earned their own living usually became seamstresses or kept boardinghouses. But some women worked in professions and trades available mostly to men; there were women doctors, lawyers, preachers, teachers, writers, and singers. By the early 19th century, however, acceptable occupations for women had narrowed to factory labor or domestic work. Women were excluded from the professions, except for writing and teaching.
The medical profession is an example of how attitudes changed in the 19th and 20th centuries about what was regarded as suitable work for women. Prior to the 1800s there were almost no medical schools, and virtually any enterprising person could practice medicine. Beginning in the 19th century, the educational preparation required for practice, particularly in medicine, increased. This tended to prevent many young women, who married early and bore many children, from entering professional careers. Although home nursing was considered a proper female occupation, nursing in hospitals was done almost exclusively by men. Explicit discrimination against women also began to appear. The American Medical Association, founded in 1847, barred women from membership. Barred also from attending men's medical colleges, women enrolled in colleges of their own, such as the Female Medical College of Pennsylvania, established in 1850. By the 1910s, however, women were attending many leading medical schools, and in 1915 the American Medical Association began to admit women members.
Throughout most of history women generally have had fewer legal rights and career opportunities than men. Wifehood and motherhood were regarded as women's most significant professions. In the 20th century, however, women in most nations won the right to vote and increased their educational and job opportunities. Perhaps most important, they fought for and to a large degree accomplished a reevaluation of traditional views of their role in society.
The modern world is proud of the progress made toward equality across age, gender, and race. Women are increasingly treated as equals with men in customs, lifestyle, society, and the economy. Today, women enjoy freedoms and opportunities that earlier generations could scarcely have imagined.