Sunday, October 2, 2011

An important Division of Urology project comes home to roost

A couple of years ago, I attended a presentation about implementing change.  The speaker showed us this slide:


And waited...

At first, there was silence as we examined the picture for some hidden meaning.  Then came nervous laughter as the speaker remained silent.  After what seemed like 5 or 10 minutes (but was likely only 2 or 3), there was annoyed muttering around the room.

Finally, she moved on to this slide:


"It takes 21 days for a chicken egg to hatch," she told us.  "How much of the chick's development do you think happens on the 21st day?"

Her point was that, in change initiatives, even though we celebrate the dramatic final outcome, much of the ongoing effort toward achieving the goal is under-appreciated, yet critical to eventual success.

Last week, in the division of urology, one of our eggs hatched.


Thanks to the efforts of St. Paul's Hospital Foundation several years ago, generous donors have contributed to the establishment of a Urology Centre of Health.  While there will be a bricks-and-mortar aspect to this Centre, the real value is the service we'll provide for patients.  A crucial part of that is the development of a staff position that is new to our division and the Saskatoon Health Region: Nurse Navigator.  Our Nurse Navigator, Nicole, works on a range of quality improvement projects in our division, and one of those projects recently broke out of its shell.  A little background information is in order.

A common reason for urologic consultation is to evaluate a man for the possibility of prostate cancer.  Our current process is for the man to see one of us in the office for discussion and examination.  We then decide whether he needs to undergo a prostate biopsy.  We contact him with the biopsy results and, if cancer is found, arrange an appointment to discuss treatment options.

Because of the nature of prostate cancer and the available treatments, that discussion takes between 45 and 60 minutes.  For some urologic cancers, such as kidney tumours, there is a single effective treatment, and so the discussion is fairly brief.  However, prostate cancer may be treated with surgery, radiation (with 2 varieties offered), or even observation.  It's a complex discussion that involves not only the technical aspects of treatment and statistics about success rates, but also each man's relative preference for, or aversion to, various side-effects.

Not only does the discussion take a significant amount of specialist time, it is also challenging to find room in our schedules for this urgent conversation.  As such, men, having just been informed that they have cancer, may wait up to 2 weeks to hear about their treatment options.  We offer written and online material so they can inform themselves before the discussion, but those resources don't take the place of individual consultation.

Since the spring, we've been working toward having Nicole carry out those discussions.  Over several months, she's familiarized herself with the details of prostate cancer treatment.  Her previous work on the inpatient urology ward gave her experience with surgical treatment. She's visited the Cancer Clinics in Saskatoon and Vancouver to learn about radiation treatments.  Much of her time has been spent "shadowing" urologists as we discuss prostate cancer treatment with our patients.

After spending time as an observer in those discussions, Nicole then started to lead the discussion, with the urologist present as a resource.  More recently, she has been meeting with men independently.   Nicole had shadowed me on several occasions, but 2 weeks ago was the first time I had been solely the observer, with her leading the conversation.  All 4 of us - the man and his wife, Nicole and I - then reviewed any questions that arose.  I was completely satisfied that the man had received the same information I would have given him, and in an unbiased fashion.

The next day, another man was scheduled for "the talk".  Nicole met with him and his wife, and I joined them afterward.  The questions they asked of Nicole and me showed that they had gained a good understanding of the complexities around the decision for prostate cancer treatment.

I had a sudden appreciation of how this new process would change things for our patients and our practice.  Having Nicole available to lead these discussions would free up 45-60 minutes of specialist time.  Those appointments had usually been scheduled during the most precious hours of our workday, that is, late afternoon, after we had finished operating and wanted to return phone calls and review lab results.  Each of our 8 urologists may see 2 to 4 men a month with newly diagnosed prostate cancer.  Also, because our schedules are usually filled weeks in advance, our staff have to scramble to find openings in which we can have these urgent discussions, and the available openings are rarely as soon as we and our patients would like.  Nicole's ability to schedule more prompt appointments means that men will save up to 2 weeks in the journey from diagnosis to treatment.
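
Out of curiosity, here's the back-of-the-envelope arithmetic (my own rough sketch, assuming every one of these discussions is handed over to Nicole):

```python
# Rough estimate of specialist time freed per month, using the figures above.
# Assumes every treatment-options discussion is led by the Nurse Navigator.
urologists = 8
men_per_urologist_per_month = (2, 4)      # newly diagnosed prostate cancer
minutes_per_discussion = (45, 60)

low = urologists * men_per_urologist_per_month[0] * minutes_per_discussion[0] / 60
high = urologists * men_per_urologist_per_month[1] * minutes_per_discussion[1] / 60
print(f"Specialist time freed: roughly {low:.0f} to {high:.0f} hours per month")
# -> roughly 12 to 32 hours per month, much of it late-afternoon time
```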

When I thought about this a-ha/hatching moment, I also knew that a lot of work had gone into achieving a very satisfying result.  Nicole had designed her own education program, as there was no formal curriculum to guide her.  Several of my partners had spent time with Nicole, discussing the complexities of prostate cancer treatment.  My office staff made sure that Nicole knew about upcoming appointments for her to attend.

But, in the developmental stages of a project, team members may find it difficult to keep up their motivation when the final goal seems far away.   Going back to our egg-hatching example, this would be the equivalent of trying to keep a group of kindergarten students interested in incubating and hatching baby chicks.  When they see an unchanging shell day after day, their enthusiasm will wane.

To keep them interested, you could do this:



If you shine a bright light through an egg, you can see what's going on inside.  That will help our young students to understand that the chick is developing.

For our next divisional project, perhaps we should give all stakeholders regular peeks at progress by setting milestones and reporting when they're reached.  While it's a nice surprise for everyone to see a project finally hatch, those who are less immediately involved may be more inclined to nurture and protect the fragile work-in-progress if they can see what's going on inside the shell.



Now, let's drop the egg analogy.  We've learned something else through our Nurse Navigator's work.  And this may have more important benefits than the improved timeliness she brings to discussions around prostate cancer treatment.  As Nicole has observed all our urologists discussing treatment with men, she's noted differences in individual practices.  Some of the variation she's noticed involves recommendations for treatment and followup.  She wants to give consistent, best practice information to men, but also doesn't want to confuse them by telling them something that an individual urologist may contradict later, based on his/her own practice habits.

This is an opportunity for us to decide, as a group, whether there is a standard, best practice that our division should follow when advising men about prostate cancer treatment and followup.  Because we manage our individual office practices in isolation, we rarely have conversations about these more mundane (to us...) aspects of urology.  In academic centres with residency training programs, the postgraduate trainee serves as the bee, cross-pollinating ideas and practices from one staff urologist to another.  We don't have a residency program, so it looks like our Nurse Navigator will be the one to point out areas in which we can address practice variation.

Sunday, July 31, 2011

I ♥ Calgary's online ER wait times project

I have a huge (data-) crush on Calgary's Health Region!

They have captured and posted online the ER wait times at the city's healthcare facilities.

The website shows estimated wait times for 4 hospitals and 2 health centres.  The information is automatically updated every 2 minutes.  There's a comprehensive disclaimer that reminds people that ERs are unpredictable places, that wait times may change significantly within a short period, and that patients will be seen according to the severity of their condition.

Health region representatives said they hope that making this information easily available will help patients to decide whether to go to the closest ER, or the one with the shortest wait time, and thus distribute the workload more evenly.

There's an interesting "behind the scenes" page linked from the main page.  It explains how the online system works and how the wait times are calculated: the displayed estimates are based on the number of patients waiting to be seen, their disease acuity, and the number of medical staff available to see patients.

The times are automatically calculated by Calgary Health's IT system, so there's no additional clerical work needed.  Nice!
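
Calgary hasn't published the actual formula, but given the inputs described above, I imagine the calculation looks something like this rough sketch of my own (the per-triage-level minutes are invented, not theirs):

```python
# A hypothetical sketch of an ER wait-time estimate built from the inputs the
# Calgary page describes: patients waiting, their acuity, and available staff.
# The minutes assigned to each triage level are invented for illustration only.

MINUTES_BY_ACUITY = {1: 60, 2: 45, 3: 30, 4: 20, 5: 15}   # 1 = most urgent

def estimated_wait_minutes(waiting_patients, physicians_on_duty):
    """waiting_patients: list of triage levels for everyone still to be seen."""
    if physicians_on_duty <= 0:
        raise ValueError("need at least one physician on duty")
    queued_work = sum(MINUTES_BY_ACUITY[level] for level in waiting_patients)
    # Total queued physician-minutes shared across the staff currently available
    return queued_work / physicians_on_duty

# Example: 12 patients waiting, mostly mid-acuity, 3 physicians on duty
queue = [3, 3, 4, 2, 3, 5, 4, 3, 2, 4, 3, 3]
print(f"Estimated wait: about {estimated_wait_minutes(queue, 3):.0f} minutes")
```

Recomputed every couple of minutes against the live patient list, a formula like this would rise and fall with the department's workload, which fits the disclaimer's warning that displayed times can change quickly.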

A few thoughts on this national first:

I'd be interested to see how the calculated wait time correlates with the actual patient experience.  This will likely be studied and posted as part of the evaluation phase of this project.

Might patients be discouraged from seeking urgent medical care if they see how long the wait will be?  People already realize they will have to wait for ER attention, but if they have already invested the time and effort to get to the ER, I suspect they are more likely to stick around until they are seen.  Will advance knowledge of ER wait times change patients' behaviour?  If so, is this necessarily a bad thing? That is, might some people be more likely to seek care for less urgent problems from their family physician if the ER is "less convenient"?  This would be a tough one to measure, because patients who decide to stay away never generate an ER record.  Maybe family medicine clinics will anecdotally report that patients are deciding not to go to the ER.

Power to the people!  Now that this information is available publicly and in real time, I'm keen to see who will be the first to use it for something other than its stated purpose.  I don't mean using the information for a nefarious reason (although there may be some way to do that...), I mean a mashup, combining online data sets to produce new functionality beyond the original intent.  For example, someone could combine Calgary traffic and transit system data with the ER wait time to show the patient's real wait time experience.  (Similar to how we now consider a patient's entire wait for surgery to be "Wait 1" - the wait for consultation with the surgeon - plus "Wait 2" - the time from the OR booking being submitted to the actual date of surgery.)  Depending on where someone lives and the transportation available to them, it might make more sense to visit the ER that nominally has a longer wait time, because the total patient wait (combined transit + ER wait) is actually shorter.  If that were the case, and it resulted in more congestion in an already busy ER, perhaps Calgary Health IT would communicate with Calgary Transit and more buses could be put on the routes that lead to the less congested ER.  (Mmm, mmm, mmm! System integration!)  There's a rough sketch of this total-wait idea after this list.

Some enterprising computer science student will create an app that pulls the data to smart phones, so a single click will let people know which ER they should head for.  As long as that app is in the works, why not link it to a health advice FAQ site (official Alberta Health, of course) that gives suggestions for self-management of common conditions that often lead to low-acuity ER visits?

Similarly enterprising engineering or business students will track the publicly posted data and identify trends of ER congestion.  Queueing theory experts insist that, even in the unpredictable world of the ER, there is enough predictability to guide staffing plans.  Analyzing the trends in Calgary's ERs would be a great student project.
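
Here's the sketch of the "total wait" mashup mentioned above.  All the site names, travel times, and posted waits are invented, and this is only my own illustration of the idea; a real version would pull posted waits from the health region's feed and travel times from a transit or traffic service.

```python
# Minimal sketch of the transit + ER wait "mashup" described above. Site names,
# travel times, and posted waits are all invented; a real version would pull
# posted waits from the health region's feed and travel times from a transit API.

posted_er_wait = {            # minutes, as displayed online (hypothetical)
    "Hospital A": 95,
    "Hospital B": 40,
    "Health Centre C": 120,
    "Health Centre D": 30,
}
travel_time_from_home = {     # minutes by transit from one address (hypothetical)
    "Hospital A": 10,
    "Hospital B": 55,
    "Health Centre C": 15,
    "Health Centre D": 70,
}

def best_er(posted_wait, travel_time):
    """Return the site with the shortest combined travel + posted wait, plus all totals."""
    totals = {site: travel_time[site] + posted_wait[site] for site in posted_wait}
    return min(totals, key=totals.get), totals

best, totals = best_er(posted_er_wait, travel_time_from_home)
for site, minutes in sorted(totals.items(), key=lambda kv: kv[1]):
    print(f"{site}: {minutes} min total (travel + posted ER wait)")
print(f"Shortest total wait: {best}")
```

In this made-up example, the closest site (Hospital A, 10 minutes away) isn't the best choice once its posted wait is added in.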

The greatest thing about this project is just that they did it.  Plain and simple, they did it!  Alberta has shown that meaningful, real-time health system data can be collected and displayed in a way that helps the public make better decisions about their health care.  Once the bugs are worked out, this can be spread across Alberta.  Soon, people in other provinces will come to expect this service.

We can use Alberta's ER model to help manage other health care congestion, for example, hospital beds.  Hospital ward managers tell me they spend a big part of their day figuring out which patients are ready for discharge and then facilitating discharge or transfer.  Sometimes, a message will be posted in the OR: "Please arrange patient discharge as soon as possible today.  Wards are full and surgery may be cancelled."  By the time word gets around, it's at least 10 am, and the prime opportunities for deciding about discharge have passed.

How about pushing real-time data to each hospital physician?  Include the number of patients he/she has in hospital, the "national expected length of stay" for each patient's condition, the current length of stay, hospital occupancy, and a flag showing whether a planned discharge date has been entered.  This information could be sent to the physician's phone every evening so discharge planning can be done that night, or early the next morning.  The information is already available; it just needs to be aggregated.
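
To show how little new infrastructure this would need, here's a rough sketch of the kind of nightly summary that could be assembled from data the hospital already holds.  The field names and numbers are hypothetical:

```python
# Sketch of a nightly discharge-planning summary pushed to each physician.
# The data layout and field names are hypothetical; the point is that these
# pieces already exist in hospital systems and only need to be aggregated.
from collections import defaultdict
from dataclasses import dataclass
from datetime import date
from typing import Optional

@dataclass
class Inpatient:
    name: str
    physician: str
    admitted: date
    expected_los_days: int              # "national expected length of stay"
    planned_discharge: Optional[date]   # None if no discharge date entered yet

def nightly_summaries(patients, occupancy_pct, today):
    by_md = defaultdict(list)
    for p in patients:
        by_md[p.physician].append(p)
    messages = {}
    for md, plist in by_md.items():
        lines = [f"Occupancy {occupancy_pct}%; you have {len(plist)} patient(s) in hospital:"]
        for p in plist:
            current_los = (today - p.admitted).days
            over = " (over expected LOS)" if current_los > p.expected_los_days else ""
            plan = f"planned discharge {p.planned_discharge}" if p.planned_discharge else "no planned discharge date"
            lines.append(f"  {p.name}: day {current_los} of ~{p.expected_los_days}{over}; {plan}")
        messages[md] = "\n".join(lines)
    return messages     # one short message per physician, sent each evening

# Made-up example
pts = [
    Inpatient("Patient A", "Dr. X", date(2011, 7, 25), 4, None),
    Inpatient("Patient B", "Dr. X", date(2011, 7, 29), 6, date(2011, 8, 2)),
]
for md, msg in nightly_summaries(pts, occupancy_pct=98, today=date(2011, 7, 31)).items():
    print(f"{md}:\n{msg}")
```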

Show us the way, Alberta!

Wednesday, March 23, 2011

97% is not a passing mark! (More to the CIHI report on wait times than meets the eye)

Can you score 97% on a test and still fail?

CIHI just released its 2010 results on which provinces are meeting national benchmarks on wait times for specified procedures.   I took vague note of the results for hip and knee replacement, surgery for fractured hips, cataracts and cardiac bypass.  I don't have much to do with those procedures in my urology practice.

But, I was interested in the Star-Phoenix's report that 97% of Saskatchewan patients receive radiotherapy (for cancer) within the benchmark time (4 weeks).   Nice!  But, wait...

I regularly refer patients for radiation treatment, usually for prostate cancer.  My impression is that patients usually wait longer than 4 weeks for their treatment.  I often get phone calls from patients I've referred, asking when their treatment will start.  I usually quote a wait of 8-10 weeks from when I send a referral letter to when they start their treatment.

Perhaps I'm not speaking the same language as CIHI.

The CIHI report states that the 4-week wait is measured from when patients are "ready to receive care".  Interesting.  I would consider most of my patients to be ready to receive care from the moment I refer them.  Some still require x-ray testing to be completed, but there would be very few men who are medically unfit to receive treatment.  So, why the discrepancy between my perception of patient wait and CIHI's report?

On the Saskatchewan Cancer Agency website, "ready to treat" is explained as "the date that the patient is ready to be treated, taking into account clinical factors and patient preference".  So, "ready to treat" equals "ready to be treated"...

I was no further ahead after reading this, so I asked a senior physician at our Cancer Clinic what "ready to treat" meant.  His answer was more enlightening:


Ready to Treat means the patient has been assessed by a Radiation Oncologist with all necessary work-up completed, treatment options considered and a consent for Radiation therapy signed.  It means that if simulation and planning could be done in minutes the patient is ready to start treatment that day. It means the patient is available and willing to start.

OK, now I get it.  It means we've stacked the measurement deck by ignoring all the heavy lifting necessary to get the patient to the point of "ready to treat".  Here's what goes on before the official clock starts:

Referral letter generated and sent to Cancer Clinic
Letter reviewed by triage clerk
Letter reviewed by Radiation Oncologist
Appointment date assigned
Consultation with Radiation Oncologist
Further testing (possibly)
Patient decision to proceed with treatment

Each gap between the steps in that list is its own wait time.  Who is measuring those waits?  Our patients sure are, but CIHI isn't.  I have no doubt that CIHI recognizes the importance of each of these wait times.  But, there isn't a system in place to track them.
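
A toy example makes the point.  The stage durations below are hypothetical (though the total is in line with the 8-10 weeks I typically quote), and only the final interval counts against the published 4-week benchmark:

```python
# Toy illustration of measured vs. experienced wait. Stage durations are
# hypothetical; only the final "ready to treat -> treatment starts" interval
# is counted against the 4-week radiotherapy benchmark.
stages = [
    ("Referral letter sent -> reviewed by triage clerk", 0.5),   # weeks
    ("Triage -> reviewed by Radiation Oncologist",       0.5),
    ("Review -> consultation appointment",               3.0),
    ("Consultation -> further testing / decision",       2.0),
    ("Ready to treat -> treatment starts",               3.0),   # the measured wait
]

total = sum(weeks for _, weeks in stages)
measured = stages[-1][1]
print(f"Patient-experienced wait: {total:.0f} weeks")
print(f"Wait counted by the benchmark: {measured:.0f} weeks "
      f"({measured / total:.0%} of the journey)")
```

With numbers like these, the system can hit the 4-week target every single time while the patient still waits more than 2 months.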

It's relatively easy to track wait times once a patient is in the Cancer Clinic system.  It's harder to track all the other times.  It's even more difficult if you want to measure the patient's real waiting experience, that is, from the time the patient is referred by their family doctor, or even when they first consult their doctor with symptoms.  Who decided on this benchmark anyway?  Did anyone ask patients whether this was truly reflective of what was important to them?

There's a chance that making these easy measurements could actually hamper efforts at overall system improvement.  What if health administrators and politicians look at the "success story" of radiotherapy across the country and decide that it's "fixed", and that attention and resources can be moved elsewhere?

If my son came home from his basketball game and told me that he had scored 50 points, I'd be curious how that had happened.  I wouldn't be surprised to find out that the baskets had been lowered to 6 feet high.  Easy slam dunk.

In a health care system that has universal struggles with access, we should be suspicious when one area seemingly slam dunks the access problem.  Their basket is too low.

97% = Fail.

Monday, February 1, 2010

Semi-transparent

I’ve been feeling guilty since my last post. I hadn’t shown you our 3rd NAA/wait time chart for many months, and if you’ve been following our adventures, you know that the 3rd NAA was the raison d’être of this project. When I finally posted the recent data, it was in anticipation of our upcoming backlog blitz that should drop the 3rd NAA to our target level of 2 weeks.
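
(For newer readers: the 3rd NAA is the third next available appointment, the standard Advanced Access yardstick for how far ahead a new patient would have to book.  Here's a minimal sketch of how it can be calculated from the clinic schedule; the slot dates are made up.)

```python
# Minimal sketch of the 3rd NAA (third next available appointment) measure:
# the number of days from today until the third open slot in the schedule.
# Slot dates below are made up for illustration.
from datetime import date

def third_naa(open_slots, today):
    """open_slots: dates that still have at least one unbooked appointment."""
    upcoming = sorted(d for d in open_slots if d >= today)
    if len(upcoming) < 3:
        return None                    # not enough open capacity to measure
    return (upcoming[2] - today).days

slots = [date(2010, 2, 9), date(2010, 2, 23), date(2010, 3, 16), date(2010, 3, 30)]
print(third_naa(slots, today=date(2010, 2, 1)), "days")   # -> 43 days (target: 14)
```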

Our Advanced Access project has broadened into a Clinical Practice Redesign effort, so there is now a wider range of goings-on to share in this blog. However, I’m aware that I’ve used that wealth of material as an excuse to avoid exposing our biggest failure: we have not beaten the backlog, and our patients continue to wait too long for their consultations.

I rationalized it beautifully in the last post, didn’t I? I pointed out that the number of FTE urologists in Saskatchewan had dropped over the last few years, and that we were lucky the wait times hadn’t soared as a result of the manpower situation. And, I sweetened the bitterness of showing a stagnant 3rd NAA trend by breaking the exciting news of the backlog blitz.

Why did I keep this under wraps for so long? Here are a few reasons:

As the project lead, I find it frustrating and embarrassing to admit that, while we’ve had success in other areas (there’s that rationalization again!), the main goal eludes us.

When I share our results at meetings and with colleagues, I feel it undermines my credibility as a “champion” for this type of quality improvement.

Other physicians may be reluctant to start similar projects if they see early adopters are struggling to achieve durable results.

Blog posts about an unchanging 3rd NAA would be pretty dry. (Lame reason, I admit.)

I have no malicious intent, and I have never knowingly posted misleading data. However, I recognize that withheld information can affect decisions, impressions and outcomes as much as incorrect information can.

In this case, our Clinical Practice Redesign project continues because we’re excited about the positive changes that we see coming from it. The 3rd NAA data is simply a way we measure our progress and consider other improvements that we can make. As such, apart from the reasons noted above, there’s little risk in sharing the data (flattering or not) with you.

But that’s the case in our group; what if the situation were different? What if we were part of a “pay-for-performance” compensation plan, where our remuneration was dependent on providing prompt consultation? Or, if there were another urology group in town, there would be competition for referrals, and a shorter wait time would be a potent marketing tool.

Most importantly, what does a lack of transparency mean for patients? If all else (demeanor, aptitude and location) were equal, people would likely choose the specialist with the shortest wait time. Perhaps wait time would be the prime criterion for some to make their choice. Controlling access to the information then takes on a new importance.

So who controls the access? Ontario and Alberta share some of their acute care wait times online. Information about wait times to see Saskatchewan surgeons is already collated in an online database and available to referring physicians. They could (and are intended to!) share this information with their patients, to assist in making an informed decision about a specialist referral. The information, therefore, is not considered a secret; yet, at present, it is password-protected.

If a patient wished to obtain wait time information, she could do so without relying on a physician to grant her access to the database. The information is available, but not without doing a lot of work. She would call all the offices of that particular specialty and ask what the wait time would be for a new referral appointment. (This is essentially the same process used to fill the database, i.e. self-reported wait time.) If she required a sub-specialty consultation (such as a shoulder problem, rather than a knee problem), she would also ask if that surgeon dealt with that area – also information contained in the database.

So why would we make our patients jump through hoops to gain access to information that we already have, and that they can laboriously obtain of their own accord? (Could anyone make a case that they have a right to the information?) There are good reasons why we might restrict access. We want to be sure that the self-reported data is accurate. After all, if livelihoods may be affected by this information, even the most earnest professional may be tempted to fudge the figures slightly.

But, surely the information physicians’ clinics would report to the database would be the same as they would give to our fictional, diligent patient over the phone. If so, she’s no worse off. I suspect that information reported by physicians to the Department of Health would be at least as accurate as that given out ad hoc to curious patients, as physicians would realize that there would be some auditing/confirmation process applied eventually.

If I have been reluctant to share our wait time data for reasons that bear trivial consequences for me, how will people behave when the stakes are higher? What expectations and rights do patients have about access to information that is critical in their informed decision-making around their healthcare?

Monday, October 19, 2009

A Thousand Cuts

Initiatives to reduce wait times for surgery generally focus on the interval from when the surgeon submits a booking to when the surgery is completed. It's hard to imagine a less client-centred measurement.

The time from booking to surgery describes the system’s awareness of the client's need. But, that person has been aware of their need since the onset of symptoms, or the finding of an abnormal lab or x-ray result by their primary care practitioner. A common example of this in urologic practice is the man who has an abnormal PSA (prostate-specific antigen) blood test during his annual medical review. This triggers a series of other events (read: waits) that may culminate in the diagnosis and treatment of prostate cancer.

The series of events looks like this:
  1. PSA blood test
  2. Consultation with Urologist
  3. Prostate biopsy
  4. Definitive treatment (radiation or surgery), if cancer is diagnosed
That's a pretty high-level view of the man's journey through the system. Of course, I mean that's how the system usually looks at the process. The man may see it like this:

Friday, August 7, 2009

Heaven

I’ve been to wait line heaven... it’s a Wal-Mart.

I studiously avoid shopping at Wal-Mart. I know it’s a popular spot, and that’s the problem – the more people who shop there, the longer the wait at the checkout. And I hate wait lines.

But, last month, while looking for a piece of summer camp equipment for my son, I paid my first visit to our local Wal-Mart outlet. They had the item in stock, so I prepared to brave the wait for the till. I headed for the express checkout line. There were over a dozen people in the first line. I looked around for a shorter line. But, there was only one queue for multiple cashiers. Now, that’s odd for a department store.

Whether by tradition, or based on hard statistical analysis and marketing research, various businesses manage wait lines differently; for example, grocery store lines vs. bank lines. At the bank, you form a single queue, at the front of which you look for the next available teller. At the grocery store (and most department stores), you size up individual lines, trying to judge who has the most groceries, which teller is the chattiest, and who will be paying with loose pennies dredged up from the bottom of their purse. Then, while standing in line, you kick yourself for not picking another line that seems to be zipping along. Queue-er’s remorse.
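
For fellow students of queues, here's a quick simulation sketch of why the single bank-style line works so well.  The assumptions are the textbook ones (random arrivals and service times, and "grocery" shoppers who pick a line at random and never switch), and the parameters are invented:

```python
# Quick simulation sketch: one shared queue feeding several cashiers vs. a
# separate line per cashier. Assumes exponential arrival and service times, and
# that shoppers in the per-cashier setup pick a line at random and never switch.
import random

def average_wait(shared_queue, n_customers=50000, n_cashiers=4,
                 arrival_rate=3.5, service_rate=1.0, seed=42):
    random.seed(seed)
    t = 0.0
    cashier_free = [0.0] * n_cashiers       # when each cashier finishes their line
    total_wait = 0.0
    for _ in range(n_customers):
        t += random.expovariate(arrival_rate)        # next customer arrives
        service = random.expovariate(service_rate)
        if shared_queue:
            i = min(range(n_cashiers), key=lambda k: cashier_free[k])  # next free till
        else:
            i = random.randrange(n_cashiers)          # pick a line blindly, stay in it
        start = max(t, cashier_free[i])
        total_wait += start - t
        cashier_free[i] = start + service
    return total_wait / n_customers

print("one shared line for all cashiers:", round(average_wait(True), 1))
print("a separate line per cashier:    ", round(average_wait(False), 1))
```

At the same staffing level, the shared line wins handily, simply because no cashier ever sits idle while a customer waits in the wrong line.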

Friday, January 9, 2009

I Love Lines

I love standing in line.

Or, more accurately, I love what I learn from standing in line. Being stuck in traffic, waiting at the grocery store checkout – they're all golden learning experiences if you're a student of queues. But nothing beats air travel...

Over the holidays, I enjoyed a tremendous learning opportunity courtesy of a leading national airline. So many of the problems I observed at Toronto airport were analogous to the situation in physicians' offices. Because so many people have experienced the frustration of waiting in line at the airport, perhaps this could be an effective model to explain Advanced Access/Clinical Practice Redesign to novices.

Before we even arrived at the airport, we had been primed to expect a long wait. Airlines establish cultural norms with the advisory printed on every ticket: Be at the airport at least 60 (or 90, or 120) minutes before your flight departure. So we shrug our shoulders and drag our suitcases to the end of the line, because... that's the way it's always been!

Sound familiar?  It takes forever to get in to see my doctor. You'll wait a long time to see a specialist. Health care sets the same norms. Earlier this week, I heard a presentation about a new project in the Saskatoon Health Region, aimed at reducing the time patients wait when they come for assessment and education at the Pre-operative Clinic. The project coordinator showed a sign currently posted at the entrance of the clinic. It showed a drawing of a man resigned to his fate (shrugging his shoulders in a C'est la vie kind of way) and said: Your visit to the pre-operative clinic may take 4-5 hours. Those are the expectations we establish for our patients. That's the promise of service we give as our patients come through our door.