Of course, any answer to this will rest largely on somewhat stereotypical ideas about gender roles and about what sorts of things appeal to men and to women.
Women are typically said to be more nurturing and caring. One aspect of Christianity is the idea that God cares for us and that Jesus loves each of us personally. This side of the faith emphasizes compassion toward others and proper behavior, and it is said to appeal more to women.
Some churches feel that Christianity has become too feminized and try to present a more “masculine” version of the faith. They emphasize the paternal aspect of God and the idea that God wants men to lead their families. They often portray Christianity as a religion that encourages assertive action aimed at getting ahead in life. These emphases are said to appeal more to men, since men are expected to be assertive and somewhat aggressive.