Homework Help

Do you think colleges have a moral obligation to help students become more ethical...

bertyfitz | Student, College Freshman | Valedictorian

Posted August 10, 2012 at 5:14 AM via web

Do you think colleges have a moral obligation to help students become more ethical individuals?

9 Answers

mwalter822 | Teacher | (Level 2) Educator

Posted August 6, 2012 at 7:36 PM (Answer #2)

Ethics are systems of moral principles in various disciplines and situations. I would not say that colleges have a “moral” obligation to make people more ethical, because I think that ethics are too personal in many cases. As we all know and have experienced, what is right from one person’s point of view isn’t necessarily right in another person’s view.

However, colleges definitely should encourage ethical debate in the various disciplines in which they confer degrees. A class with the word “ethics” in its title should mean that certain moral principles are discussed and explored in that particular area, but students should never be indoctrinated into a particular moral or ethical system.

wordprof | College Teacher | (Level 3) Senior Educator

Posted August 6, 2012 at 10:17 PM (Answer #3)

Perhaps the obligation of a college is to make sure students are taught how ethics began (the Greek origins), how ethical systems are formed within a society, and how to differentiate between moral decisions and logical decisions -- in other words, a college graduate should be expected to know what ethics means.

pohnpei397 | College Teacher | (Level 3) Distinguished Educator

Posted August 10, 2012 at 5:21 AM (Answer #4)

No, colleges do not have a moral obligation to teach students how to be more ethical individuals. I agree with the first post because I do not believe there is any one objective way to be an ethical person. If a college teaches students an ethical system, it is necessarily choosing its own vision of ethics and imposing it on them. For example, we would not think it appropriate for a public university to teach its students to adhere to ethical values that the political left would like but that the political right would find abhorrent. Since teaching ethics involves trying to impose a particular worldview on students, colleges (or at least public colleges) should refrain from doing so.

http://en.wikipedia.org/wiki/Ethics

litlady33 | High School Teacher | (Level 3) Assistant Educator

Posted August 10, 2012 at 9:07 AM (Answer #5)

I think the main purpose of colleges is to provide an education that gives students the necessary skills and training for their chosen field. In some cases, this might lead to the teaching of ethics related to that job or field, such as ethics in teaching or ethics in medicine; however, I do not think colleges in any way have a moral obligation to teach ethics to all students, especially when ethics is not related to the subject of a class. Students pay tuition so that college will prepare them for their field. If ethical training is not necessary for that job, it does not need to be taught.

Now, there are classes that deal heavily with ethics, but even these center more on the discussion and debate of ethics than on the teaching of one set of ethics that an institution deems universally "right."

http://en.wikipedia.org/wiki/Higher_education

Kristen Lentz | Middle School Teacher | (Level 1) Educator Emeritus

Posted August 10, 2012 at 1:00 PM (Answer #6)

I don't think that is the job of colleges or universities. To me, the job of the university is to provide a safe, enriching, and challenging environment that affords students a variety of opportunities to discover their academic and personal strengths and weaknesses. What the student does with that opportunity is up to the individual.

rrteacher | College Teacher | (Level 1) Educator Emeritus

Posted August 10, 2012 at 3:11 PM (Answer #7)

While I agree that it is not always the role of colleges to impose ethical systems on their students, I do think there are many ethical behaviors that schools, by their very nature, ought to encourage. For example, academic integrity, which involves crucial ethical concerns, is absolutely appropriate for colleges to encourage, and even demand of their students. I also think it is entirely appropriate for colleges to encourage appreciation for diversity, which is just a reality on our college campuses as well as in society.

larrygates | College Teacher | (Level 1) Educator Emeritus

Posted August 10, 2012 at 8:42 PM (Answer #8)

A number of the previous posts seem to conflate ethics with political correctness. Obviously, colleges have no obligation to teach the latter by either precept or example. Still, colleges are training the next generation of leaders, professionals, teachers, etc. There are certain universal standards of ethical behavior, such as fairness toward individuals, academic integrity, honesty, and fidelity, that colleges can and should inculcate. It is neither necessary nor proper for higher education officials to teach positions on matters of social debate; however, those universal principles on which all agree can and should be consistently and regularly integrated into the curriculum. To suggest otherwise, as some of the above posts do, is to revert to "situation ethics," a philosophy that has long since been abandoned.

http://en.wikipedia.org/wiki/Ethics

http://en.wikipedia.org/wiki/Higher_education

Tamara K. H. | Middle School Teacher | (Level 1) Educator Emeritus

Posted August 10, 2012 at 9:58 PM (Answer #9)

Ethics, especially ethical behavior, is far more than the theoretical study of Ancient Greek philosophy. I agree with the post above that all universities should teach, or rather hold students to, ethical standards of behavior, such as integrity, honesty, and fairness. Some may even argue that the drive to be self-educated and self-motivated is itself an ethical behavior. However, I would also argue that by the time a student reaches the university level, it is too late to learn these ethical behaviors; hence, it is not the university's job to teach ethical conduct.

While I know that some universities are more lenient when it comes to matters like plagiarism and only issue fines, at the colleges and universities I come from, a student would be immediately dismissed for plagiarism and laughed off the face of the academic planet. Thus, I say that if a student does not know how to behave ethically by the time he or she reaches college, it is simply too late. Where I attended school, a warning might be issued for some behaviors, such as, "If you continue to allow your alcoholism to affect your studies, you will be dismissed," but in general my schools expected us to be self-motivated, moral, ethical adults and would not take it upon themselves to teach us otherwise.

http://en.wikipedia.org/wiki/Plagiarism

litteacher8 | Middle School Teacher | (Level 1) Distinguished Educator

Posted August 11, 2012 at 1:10 AM (Answer #10)

I think that colleges need to teach their subjects in such a way as to incorporate the ethics of the field. This should extend beyond one basic ethics class. It is the responsibility of colleges to help ensure that students act ethically in their chosen professions. Instructors should be good role models of ethical behavior.
