This is a good question, because it is important to know why something is worthwhile before committing to it. There are several reasons why being a nurse is important; let me offer three.
First, there is a shortage of nurses, so you would be entering a field where there is real need. This occupation therefore benefits society, and it is also a wise choice economically, since steady demand for nurses translates into job security.
Second, all people will be in need of health care professionals at some point in their lives. So, you will be going into a field that is absolutely essential for the comfort and well-being of all people.
Third, as we face greater economic hardships and various budget cuts, nurses will take on a larger role in medicine, and in my opinion this is a good thing. For instance, nurses will be able to prescribe medicine and treat various other ailments. In short, they will play a greater role in medicine. So, if you have a desire to help people in practical ways, then nursing is a great option.
Having grown up in a family full of doctors and nurses, I fully understand the important role that nurses play in the medical system. Today's health care industry is bogged down in insurance paperwork, so nurses are pretty much the doctors today. Think about your recent doctor visits: who spends more time with you in the examination room, the nurse practitioner or the doctor? The nurse spends the time, whereas the doctor swoops in for a moment, puts the final checks on the list the nurse made, and then swoops out to pop in on yet another patient. The nurses are the front-line soldiers, whereas the doctors are the generals. A nurse friend whom I work with part time at a gym told me a "joke" that nurses have about doctors at her hospital: "If you see someone who is doing something, it's not a doctor, it's a nurse." Basically, her point was that doctors are so bogged down with paperwork and the like that nurses spend much more time doing the "doctoring" than the doctors themselves do.
From my own experience, which parallels that of other editors here, the nurse is all-important in providing compassionate patient care. In particular, I think the role of the nurse can be usefully compared to that of the doctor. On the whole, I have always found the doctor to be a rather unsympathetic figure. It is the nurse who has actually looked after me and helped explain what the doctor has been talking about. The nurse is the first point of contact with the patient, whereas the doctor is rather more removed.
Unlike (most) doctors, nurses are the first and last to interact with patients, and they usually interact with patients MORE than the doctors do. I'll consider your question from the patient's point of view, based on my own experience with nurses. Personally, if I enter a new medical practice and have a bad experience with the doctor, I can be persuaded to stay if the nursing staff is excellent. The reverse is also true: even when a doctor is excellent, if I do not like the nurses, I usually do not stay.
Though nurses' clinical duties are all important (giving shots, recording statistics, providing information), in my opinion the most important part of the profession is the relationships built with patients. When you consider how much mood and attitude can affect the body during the healing process, the nurse becomes even more important.
Nurses are vital to quality healthcare and patient education. Nurses are the ones who spend the most time with patients, and they form more of a personal relationship with them. Good nurses are able to provide comfort to their patients and get them to relax, laugh, or calm down, and in many cases these things can be beneficial to healing. Good nurses also take the time to answer patients' questions and explain various procedures or medications. This ongoing communication helps patients understand and manage their health issues.
Nurses are essential! In my experience, doctors have lost their bedside manner. For example, when my mom's oncologist gave her a one-year prognosis, the doctor promptly left the room without trying to comfort my mom or my dad. The nurses who helped my mom through her six-month cancer ordeal, however, brought kind and loving care that made her transition to hospice and death human.
So, besides all they do for the body, I believe nurses are necessary to nurture patient souls.
We really need compassionate health professionals. Nursing is a difficult job, emotionally and physically. If the hours don't drain you, the things you see will. Yet you are there for people in the time of their most need. When people are afraid and uncomfortable, you have the opportunity to make things at least a little bit better. That's important, and meaningful.
One reason is fairly obvious: nurses tend to spend more time in direct contact with patients than physicians do, and so the patient experience is highly influenced by what goes on with nursing. As we continue to move full speed into the era of patient-centered medicine, the ability of nurses to affect the perceived quality of care is becoming more significant.