Ethics, especially ethical behavior, is far more than the theoretical study of Ancient Greek philosophy. I agree with the post above that all universities should teach, or rather hold students to, ethical standards of behavior such as integrity, honesty, and fairness. Some may even argue that the drive to be self-educated and self-motivated is itself an ethical behavior. However, I would also argue that by the time a student reaches the university level, it is too late to still be learning these ethical behaviors; hence, it is not the university's job to teach ethical conduct. While I know that some universities are more lenient when it comes to matters like plagiarism and only issue fines, at the colleges and universities I come from, a student would be immediately dismissed for plagiarism and laughed off the face of the academic planet. Thus, I say that if a student does not know how to behave ethically by the time he or she reaches college, it is simply too late. Where I attended school, a warning might be issued for some behaviors, such as, "If you continue to allow your alcoholism to affect your studies, you will be dismissed," but in general my schools expected us to be self-motivated, moral, ethical adults and did not trouble themselves to teach us to be so.
A number of the previous posts seem to conflate ethics with political correctness. Obviously, colleges have no obligation to teach the latter by either precept or example. Still, colleges are training the next generation of leaders, professionals, teachers, and so on. There are certain universal standards of ethical behavior, such as fairness toward individuals, academic integrity, honesty, and fidelity, that colleges can and should inculcate. It is neither necessary nor proper for higher education officials to teach positions on matters of social debate; however, those universal principles on which all agree can and should be consistently and regularly integrated into the curriculum. To suggest otherwise, as some of the above posts do, is to revert to "situation ethics," a philosophy that has long since been abandoned.
While I agree that it is not always the role of colleges to impose ethical systems on their students, I do think there are many ethical behaviors that schools, by their very nature, ought to encourage. For example, academic integrity, which involves crucial ethical concerns, is absolutely appropriate for colleges to encourage, and even demand of, their students. I also think it is entirely appropriate for colleges to encourage appreciation for diversity, which is simply a reality both on our college campuses and in society.
I don't think that is the job of colleges or universities. To me, the job of the university is to provide a safe, enriching, and challenging environment that affords students a variety of opportunities to discover their academic and personal strengths and weaknesses. What the student does with those opportunities is up to the individual.
I think the main purpose of colleges is to provide an education that gives students the necessary skills and training for their chosen field. In some cases, this might lead to the teaching of ethics related to that job or field, such as ethics in teaching or ethics in medicine; however, I do not think colleges have any moral obligation to teach ethics to all students, especially if it's not related to the subject of the class. Students pay tuition so that college will prepare them for their field. If ethical training is not necessary for that job, it doesn't need to be taught.
Now, there are classes that deal heavily with ethics, but even these center more on the discussion and debate of ethics than on the teaching of one set of ethics that an institution deems universally "right."
No, colleges do not have a moral obligation to teach students how to be more ethical individuals. I agree with the first post because I do not believe that there is any one objective way to be an ethical person. If a college teaches students an ethical system, it is necessarily choosing its own vision of ethics and imposing it on its students. For example, we would not think it appropriate for a public university to teach its students to adhere to ethical values that the political left would like but that the political right would find abhorrent. Since teaching ethics involves trying to impose a particular world view on students, colleges (or at least public colleges) should refrain from doing so.
Perhaps the obligation of a college is to make sure students are taught how ethics began (its Greek origins), how ethical norms are formed inside a society, and how to differentiate between moral decisions and logical decisions -- in other words, a college graduate should be expected to know what ethics means.
Ethics are systems of moral principles in various disciplines and situations. I would not say that colleges have a “moral” obligation to make people more ethical, because I think that ethics are too personal in many cases. As we all know and have experienced, what is right from one person’s point of view isn’t necessarily right in another person’s view.
However, colleges definitely should encourage ethical debate in the various disciplines in which they confer degrees. To have a class that has the word “ethics” in the title should mean that certain moral principles are discussed and explored in that particular area, but students should never be indoctrinated into a particular moral or ethical system.