I have to go on a little rant for a second.
I was sitting in lecture today when my professor started talking about advanced practice nursing and how nurse practitioners (and physician assistants) are going to be at the forefront of primary care in the coming years in response to the high demand for care from aging baby boomers. Under the 2010 Patient Protection and Affordable Care Act, funds will be allocated for the establishment and maintenance of nurse-managed care facilities. In short, nurse practitioners and registered nurses would staff and run these primary-care-oriented facilities without the supervision of physicians. When my professor mentioned this, she added that doctors would basically act as "technicians" who work under the direction of the nurses.
To which I thought,
Hold up a second.
What?
The definition of a "technician" who works in the medical field, in my mind, is a unit tech: someone who changes bed linens and takes vital signs, among other tasks that require minimal training. Was my professor honestly trying to say that physicians would come in and do that after completing 7+ years of intense medical education and training? Was she saying that physicians would be expected to take instructions from people who receive roughly half the education they do?
So I raised my hand.
"Um, can you please clarify what exactly you mean by doctors being 'technicians' at these places? I'm not sure I understand."
Immediately she was on edge. I've written about having a husband in medical school in my journal reflections for this class and talked about it in practicum, so she's very familiar with who I am married to and what he does. She explained through pursed lips that our state's licensing rules and regulations bar nurse practitioners from performing certain tasks and procedures that only physicians are permitted to do. She went on to say that these nurse-managed centers are being established because physicians don't want to go into primary care, solely because of how little money they would make there. She redeemed herself before I had the chance to open my mouth and mention MEDICAL SCHOOL DEBT by saying that this was understandable given their debt, but then somewhat brushed it off with a shrug. At the tail end of her response, she started to say something about how she knows I have a connection with physicians, but I quickly said, "I was just curious about what you meant, that's all. I didn't understand what 'technician' meant in this instance," not wanting the conversation to go any further.
I've encountered instances like this before in my nursing classes. I feel like nurses try to one-up doctors so often, making themselves out to be "above" physicians, with a better model of providing patient care. Whatever happened to collaborative care? Do they think they're better than respiratory therapists, too? What about physical therapists, dietitians, or occupational therapists?
Then there's the whole thing about doctors only being in it for the money. Sure, there are some doctors out there with a God complex who treat others with incredible disrespect and who therefore don't deserve respect themselves. But I want to say this: Who in their right mind would willingly put themselves through the hardship that is four years of medical school, racking up hundreds of thousands of dollars of debt along the way, then work 80+ hours per week through 3+ years of residency for less than minimum wage by the hour, sacrificing time with family and friends countless times, if they were only in it for the money and didn't genuinely want to help others?
Nobody.
I wish there were more cohesion within the medical field. Yes, some doctors are jerks. Well, some nurses aren't real peaches, either. Why isn't there more collaborative care? Why isn't the provision of health care more of a team effort, with each person contributing as appropriate to their level and type of training? For that matter, why is it always Western medicine versus naturopathic medicine and never a combination of the two? Both have positives and negatives, and both have research-based evidence behind them. I really don't think health care is as black and white as people think it has to be.
And that's my little rant.