Cancer Care Ontario: How research evidence helped improve outcomes

Within the span of 10 years, Cancer Care Ontario (CCO) went from an ailing organization to a leading health-care agency. As former CCO head Dr. Terry Sullivan explained at the 2010 Alf Nachemson Memorial Lecture, much of the turn-around can be attributed to a culture of quality improvement tied to research evidence.

Published: February 15, 2011

When Terry Sullivan left his job as president of the Institute for Work & Health (IWH) in 2001 to join Cancer Care Ontario, the health-care agency was in political crisis. Two-thirds of cancer patients faced unacceptable delays in treatment, and patients were being sent to U.S. border cities for care. Indeed, on the very day Sullivan took up his new job, the headline story in The Toronto Star read: “Cancer agency facing the axe.”

As Sullivan put it, "I wondered if I had my brains about me when I decided to move over there." But move he did, and 10 years later, in his seventh year as Cancer Care Ontario's chief executive officer, Sullivan found himself at the helm of a global and national leader in cancer care.

How the agency got from there to here was the subject of Sullivan’s talk at the 2010 Alf Nachemson Memorial Lecture, held in mid-November in Toronto. The lectureship is awarded annually by IWH to a person who has integrated research knowledge into decision-making to improve the health of working Canadians.

Clinician practice leaders key to culture of evidence

According to Sullivan, building a culture of quality improvement by bringing research evidence to the front lines – involving clinical practitioners every step of the way – was a key ingredient in the organization's turn-around. "We now have a very strong culture of evidence …," Sullivan said. "There has been a very large investment in the production of systematic reviews, consensus statements and evidence guidance statements for cancer."

The improvement cycle began in 2004, when CCO brought all the practice leaders together to benchmark where the organization stood. "We asked them to draw a picture showing the state of quality in each of the areas we're involved in: surveillance, prevention and screening, radiation therapy/chemotherapy, surgical oncology, palliative care and associated cancer information systems," Sullivan said.

The Cancer Quality Council of Ontario was then set up, populated by people largely external to the organization so that it could report independently to the public – something CCO itself was historically not encouraged to do. To this day, the Quality Council reports annually on 30 to 40 indicators of cancer-system performance.

Based on the identified gaps in knowledge, practice leaders were called upon to produce guidance documents, which were circulated to the specialty and subspecialty practitioners working within each cancer-site community. "It's not medical politics. It's not organized marketing," Sullivan said. "It's practice leaders producing guidance documents in their field of activity.… The guidelines are based on a systematic extraction and summary of the evidence."

Research-based standards improve outcomes

Sullivan provided a number of examples showing how evidence-based standards were implemented to improve outcomes. For instance, in the last two years, CCO has been focusing on multidisciplinary case conferences as the best way to help patients make decisions about their treatment. "There's reasonable evidence that higher quality of care evolves from this," he said.

In the past, Sullivan explained, a patient's treatment might depend on the specialty of the attending clinician. "If you saw a surgeon first, you might be offered surgery. If you saw a radiation oncologist, you might get radiation," he said. "But particularly complex patients need to see a multidisciplinary team to make the best decisions based on a full and complete picture."

A February 2010 performance report shows that all regions in Ontario have begun conducting multidisciplinary case conferences. About 80 per cent are meeting the specified standards set for such meetings.

In another example, CCO took note of an evidence-informed organizational standard from 2005, which was subsequently published in 2007 in the Annals of Thoracic Surgery. The standard suggested surgeons performing lung/thoracic/oesophageal surgeries should be doing at least 150 lung procedures and 20 oesophagectomies a year in order to bring down the high mortality rates associated with these complicated surgeries. The standard also specified surgical criteria, as well as training and practice-setting requirements.

Based on this evidence, CCO went from hospital to hospital and from surgeon to surgeon, to convince them that these surgeries should be performed in designated thoracic surgery centres only. It used a carrot-and-stick approach. The regional consolidation centres received extra money for thoracic/oesophageal surgery; the low-volume centres did not. Ultimately, low-volume hospitals were told that CCO would discount funding for other procedures if they continued to do thoracic surgery.

Now, only 14 designated hospitals perform these complicated surgeries, down from close to 50. By the end of 2010, 90 per cent of thoracic surgeries were being performed in the designated centres. By the end of 2011, it will be 100 per cent: the last hospital, which had one surgeon doing 20 such surgeries a year, has just agreed to stop performing the surgeries.

"It's been a great triumph because there's a clear volume-outcome relationship for some of the more complex procedures," Sullivan said. The same thing is now happening with pancreatic and liver surgeries, which are being consolidated within a small number of hospitals.

Sullivan to leave organization on solid ground

Sullivan, who announced in May 2010 that he is stepping down as president of CCO, is excited about the changes that have taken place at the agency. The once ailing organization is now on solid ground. And that change, he said, is based on some very simple concepts: having clearly identified and accountable leaders; creating a plan that sets specific targets; creating an organization that collects, measures and reports on the attainment of those targets; and, importantly, building a culture of evidence by engaging the front lines – in this case, clinicians – in a very active way in the development of guidance documents.

"Getting the information, driving towards an agenda, managing it and feeding it back – this seems to have utility across the board," said Sullivan. "It has utility in cancer, and I think it has utility in the area of the clinical management of work-related disorders, where you … have to continuously specify the strength of the evidence and make decisions based on that."

To hear an interview with Sullivan or to view his presentation slides, go to