Neoadjuvant chemotherapy increases rates of breast-conserving surgery in early operable breast cancer

Hong Kong Med J 2017 Jun;23(3):251–7 | Epub 9 May 2017
DOI: 10.12809/hkmj164972
© Hong Kong Academy of Medicine. CC BY-NC-ND 4.0
 
ORIGINAL ARTICLE
Neoadjuvant chemotherapy increases rates of breast-conserving surgery in early operable breast cancer
Vivian CM Man, MB, BS, MRCS1; Polly SY Cheung, FCSHK, FHKAM (Surgery)2
1 Department of Surgery, Queen Mary Hospital, Pokfulam, Hong Kong
2 Breast Care Centre, Hong Kong Sanatorium & Hospital, Happy Valley, Hong Kong
 
Corresponding author: Dr Polly SY Cheung (pollyc@pca.hk)
 
 
Abstract
Introduction: Neoadjuvant chemotherapy is commonly used in stage III breast cancer for disease down-staging. Its use has now been extended to early breast cancer to increase the rate of breast-conserving surgery. This study aimed to evaluate the effectiveness of neoadjuvant chemotherapy in early operable cancers.
 
Methods: A retrospective study was carried out at the Hong Kong Sanatorium & Hospital of 102 patients with stage I to III primary breast cancer. All patients who underwent neoadjuvant chemotherapy followed by definitive breast surgery between January 2004 and July 2013 were included. Their pathological complete response and rate of breast-conserving surgery were studied. Data were compared using the chi squared test and Student’s t test.
 
Results: After neoadjuvant chemotherapy, 23% of patients achieved a pathological complete response, of whom 80% had human epidermal growth factor receptor 2 (HER2)–positive disease or triple-negative disease. Hormonal receptor negativity was associated with a higher pathological complete response rate (P<0.05) that was in turn associated with a higher likelihood of breast-conserving surgery (P=0.028). Patients with stage II disease were more likely to convert from mastectomy to breast-conserving surgery following neoadjuvant chemotherapy.
 
Conclusions: Neoadjuvant chemotherapy is a useful treatment to downsize tumour in early breast cancer, thereby increasing the rate of breast-conserving surgery. It is especially effective in patients with HER2-positive/oestrogen receptor–negative disease or triple-negative disease.
 
 
New knowledge added by this study
  • Neoadjuvant chemotherapy for breast cancer can downsize the tumour with a consequent higher rate of breast-conserving surgery, especially in patients with human epidermal growth factor receptor 2–positive/oestrogen receptor–negative disease or triple-negative disease.
Implications for clinical practice or policy
  • Neoadjuvant chemotherapy is a useful alternative in early breast cancer for women considering breast-conserving surgery.
 
 
Introduction
Breast cancer is the leading cancer affecting women in Hong Kong, followed by colorectal and lung malignancy.1 The number of new breast cancer cases in Hong Kong has tripled since the 1990s and the lifetime breast cancer risk in women is currently one in 17.1 Among the 12 345 patients studied in the cohort of the Hong Kong Breast Cancer Registry from 2008 to February 2014, 55% were diagnosed with stage II disease or above and 5% of the cohort received neoadjuvant chemotherapy.1 This cohort of patients is estimated to cover approximately 40% of patients reported by the Hong Kong Cancer Registry of the Hospital Authority.
 
Neoadjuvant chemotherapy has played an increasing role in the management of breast cancer over the last few decades. It was considered at least as effective as postoperative chemotherapy in terms of disease-free survival (DFS) and overall survival (OS) in the National Surgical Adjuvant Breast and Bowel Project B-18 trial.2 Neoadjuvant chemotherapy allows disease down-staging, thus increasing the probability of successful breast-conserving therapy.3 4 In addition, tumour response can be monitored ‘in vivo’ and chemotherapeutic regimens modified accordingly. Studies have also suggested its role in disease prognostication, especially the presence of pathological complete response in highly proliferative tumours.3
 
The aims of this study were to identify possible tumour characteristics that may benefit from neoadjuvant chemotherapy and to evaluate the effectiveness of neoadjuvant chemotherapy in increasing the rates of breast-conserving surgery in early operable breast cancer.
 
Methods
This was a retrospective study carried out at the Hong Kong Sanatorium & Hospital and approved by the hospital research committee in September 2013; the requirement for patient informed consent was waived because of the retrospective nature of the study. The study was conducted in accordance with the principles outlined in the Declaration of Helsinki. All patients with breast cancer who underwent neoadjuvant chemotherapy followed by definitive breast surgery from January 2004 to July 2013 were recruited. The choice of definitive breast surgery, either breast-conserving surgery or mastectomy, was determined by an experienced breast surgeon (CSY) based on the expected oncological and cosmetic outcome for each patient. Patients who presented with distant metastases (stage IV disease) and those who underwent neoadjuvant hormonal therapy were excluded.
 
Patient records were retrieved from the breast cancer database at the Hong Kong Sanatorium & Hospital and out-patient clinic of one of the authors (CSY) by an independent research assistant who was blinded to the study hypothesis and outcome. All recruited patients had their surgery performed by CSY, who is one of the breast surgery specialists at the hospital. Patients were followed up perioperatively in the out-patient clinic of CSY. Patient demographics, pre-chemotherapy and post-chemotherapy disease staging, tumour characteristics, positron emission tomography–computed tomography findings, and prescribed chemotherapeutic agents were evaluated.
 
Effectiveness of neoadjuvant chemotherapy was assessed in two ways: presence of pathological complete response and the feasibility of breast-conserving surgery after chemotherapy. Intrinsic tumour characteristics that influenced treatment response were analysed. Tumour size, nodal status, tumour grade, hormonal receptor status, human epidermal growth factor receptor 2 (HER2) receptor status, Ki67 level, and chemotherapeutic agents used were the independent variables in this study. Statistical analysis was performed with SPSS (Windows version 20.0; IBM Corp, Armonk [NY], United States) and a P value of <0.05 was considered statistically significant. Univariate analysis was performed with Student’s t test and Chi squared test where appropriate. Definitions of various terms used in this study are listed in the Appendix.
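The univariate comparisons described above rest on the Pearson chi squared test for 2×2 tables. A minimal sketch in Python illustrates the calculation; the counts below are hypothetical (only the 14 ER-negative responders appear in the Results — the non-responder split is invented for illustration), so the P value shown is not the study’s.

```python
import math

def chi_squared_2x2(a, b, c, d):
    """Pearson chi squared statistic and P value for a 2x2 table
    [[a, b], [c, d]] with 1 degree of freedom."""
    n = a + b + c + d
    chi2 = n * (a * d - b * c) ** 2 / ((a + b) * (c + d) * (a + c) * (b + d))
    # With 1 df, the chi squared survival function reduces to erfc(sqrt(x/2))
    p = math.erfc(math.sqrt(chi2 / 2))
    return chi2, p

# Hypothetical counts: rows = pCR yes/no, columns = ER-negative/ER-positive
chi2, p = chi_squared_2x2(14, 9, 30, 49)
print(f"chi2 = {chi2:.2f}, P = {p:.3f}")
```

In practice the study used SPSS for these tests; the sketch only shows the arithmetic behind a single univariate comparison.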
 

Appendix. Definitions
 
Results
Patient characteristics
From January 2004 to July 2013, 2156 patients underwent breast cancer surgery at the Hong Kong Sanatorium & Hospital performed by an experienced breast surgeon (CSY). Stage II or III disease was diagnosed in 48% of these patients, and 105 (5%) underwent neoadjuvant chemotherapy. Three patients were excluded because of significant missing data, leaving 102 patients in the final analysis.
 
Characteristics of patients are summarised in Table 1. Almost all recruited patients had stage II or III disease before commencement of neoadjuvant chemotherapy. Invasive ductal carcinoma constituted more than 90% of all diagnosed breast cancers. One quarter of the recruited patients had triple-negative disease and one third had HER2-positive disease. Of the 102 patients, 48 received sequential anthracycline-taxane–based chemotherapy and 52 received taxane-based chemotherapy only. One patient received four cycles of anthracycline-based chemotherapy only and another received gemcitabine and vinorelbine. Thirty-five patients were prescribed trastuzumab (Herceptin) as part of their neoadjuvant chemotherapy.
 

Table 1. Patient characteristics
 
Tumour size
After commencement of neoadjuvant chemotherapy, the mean tumour size was reduced by more than half, from 4 cm to <2 cm. The HER2-positive group showed a relatively greater reduction in tumour size, of almost 75% (Fig). In contrast, the mean tumour size in luminal A breast cancers remained relatively static despite neoadjuvant chemotherapy.
 

Figure. Changes in mean tumour size after neoadjuvant chemotherapy
 
Nodal status
More than 80% of the studied population presented with N1 disease or above. After neoadjuvant chemotherapy, the proportion of patients with N0 disease increased from 15% to 43%, and just over half (51%) of the studied group achieved a reduction in nodal staging (Table 2). Nodal down-staging after chemotherapy was significantly more pronounced in patients with HER2-positive or triple-negative disease (P=0.007).
 

Table 2. Change in nodal status after neoadjuvant chemotherapy for different biological subtypes
 
Pathological complete response
Effectiveness of neoadjuvant chemotherapy was determined by the presence of pathological complete response. Pathological complete response was achieved by 23% (n=23) of patients and 60% had a partial tumour response. Among these 23 patients, 18 (78%) had triple-negative disease or HER2-positive disease; oestrogen receptor (ER) status was negative in 14 patients and progesterone receptor (PR) status was negative in 17 patients. Four patients with triple-negative disease or HER2-positive disease had nodal down-staging from N2 or N3. Breast cancers with negative ER status (P=0.039) or negative PR status (P=0.029) had a higher chance of pathological complete response in univariate analysis (Table 3). Other factors including the Ki67 value, tumour grade, and the prescribed chemotherapeutic regimen did not appear to influence the rate of pathological complete response.
 

Table 3. Univariate analysis for pathological complete/partial response
 
Does pathological complete response predict likelihood of breast-conserving surgery?
Pathological complete response was achieved by 23 patients, of whom 15 (65%) underwent breast-conserving surgery, whereas only 39% of those with a partial or no response had breast conservation. Univariate analysis revealed that patients who achieved a pathological complete response following neoadjuvant chemotherapy had a higher chance of successful breast-conserving surgery (P=0.028). Mastectomy was nonetheless required in eight patients despite a pathological complete response, because of large pre-chemotherapy tumour size, extensive carcinoma in situ, or central location of the tumour.
 
Among those with pathological complete response, 11 (79%) of 14 stage II patients and four (50%) of eight stage III patients eventually had breast-conserving surgery. Patients with stage II disease showed a trend for more breast-conserving surgery after neoadjuvant therapy although this was not statistically significant (P=0.15).
 
Feasibility of breast-conserving surgery
The change of treatment plan after neoadjuvant chemotherapy is shown in Table 4. Before the commencement of neoadjuvant chemotherapy, one quarter of patients (n=26) were scheduled for breast-conserving surgery and approximately three quarters (n=72) for mastectomy. After chemotherapy, one third of those scheduled for mastectomy (24 patients) converted to breast-conserving surgery. The number of breast-conserving surgeries consequently increased from 26 to 45, an absolute increase of 19 patients (19% of all patients).
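The net change in surgical plan can be tallied directly from the counts reported in Table 4 (a trivial arithmetic sketch; variable names are ours):

```python
# Counts reported in the text (Table 4)
planned_bcs = 26       # initially scheduled for breast-conserving surgery
to_bcs = 24            # converted from mastectomy after chemotherapy
to_mastectomy = 5      # converted from breast-conserving surgery to mastectomy
total_patients = 102

final_bcs = planned_bcs + to_bcs - to_mastectomy
increase = final_bcs - planned_bcs
print(final_bcs, increase, round(100 * increase / total_patients))  # 45 19 19
```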
 

Table 4. Change of treatment plan after neoadjuvant chemotherapy
 
After neoadjuvant chemotherapy, 24 patients with planned mastectomy underwent breast-conserving surgery and 48 continued with mastectomy. Conversely, five patients with planned breast-conserving surgery underwent mastectomy after neoadjuvant chemotherapy as a result of disease progression or patient preference (Table 4). Among the 24 patients with successful conversion from mastectomy to breast-conserving surgery, 21 had a tumour size of <5 cm and 18 had stage II disease. Pre-chemotherapy disease staging (P=0.001) and tumour size (P=0.005) were important determinants of successful conversion to breast-conserving treatment in univariate analysis (Table 5). The ratio of breast-conserving surgery to mastectomy in patients with stage II disease was 32:14, ie 2.3 to 1. In contrast, 13 patients with stage III disease underwent breast-conserving surgery and 38 underwent mastectomy, a ratio of approximately 1:3. Among those who underwent breast-conserving surgery, 93% had a tumour size of <5 cm; the corresponding proportion in those who underwent mastectomy was 60%. Tumours of <5 cm were thus more likely to be amenable to successful breast-conserving surgery. Other factors, including the Ki67 index, tumour grade, and the prescribed chemotherapeutic regimen, did not appear to influence the rate of breast-conserving surgery.
 

Table 5. Univariate analysis for successful conversion to breast-conserving surgery
 
Discussion
Neoadjuvant chemotherapy was introduced in the 1980s as standard treatment for locally advanced breast cancers, defined as stage III disease (and a subset of stage IIB disease).5 6 In the last decade, the use of neoadjuvant chemotherapy has been extended to patients with early operable primary breast cancers with promising results. The aim of this study was to evaluate the response of early operable breast cancers to neoadjuvant chemotherapy and the predictors of good responders.
 
Efficacy of neoadjuvant chemotherapy and adjuvant chemotherapy has been carefully evaluated in a number of publications. A prospective randomised trial of the Austrian Breast and Colorectal Cancer Study Group (ABCSG-07) recruited 423 breast cancer patients with stage II to III disease and randomised them to neoadjuvant CMF (cyclophosphamide, methotrexate, fluorouracil) or adjuvant CMF.7 The adjuvant CMF group showed superior results in recurrence-free survival, although the OS was similar. Nonetheless, this ‘old’ chemotherapeutic regimen has now mostly been replaced by anthracycline-taxane-based chemotherapy.
 
With the emergence of newer chemotherapeutic agents, the National Surgical Adjuvant Breast and Bowel Project B-18 published the largest prospective study with the use of AC (doxorubicin and cyclophosphamide).2 Neoadjuvant chemotherapy was at least as effective as adjuvant chemotherapy after a 9-year follow-up. A similar study by the European Organization for Research and Treatment of Cancer published an update after 10 years of follow-up.8 There was no difference in OS or relapse rate between patients who received preoperative and postoperative chemotherapy, and those given neoadjuvant chemotherapy more often received breast-conserving treatment. Further subgroup analysis showed a comparable loco-regional recurrence rate between patients initially allocated to breast-conserving treatment and those who became eligible for it after tumour downsizing.8
 
A meta-analysis of 14 randomised controlled trials that included patients with mostly stage II or III disease showed similar results.9 The loco-regional recurrence rate was comparable between the neoadjuvant and adjuvant groups, and there was a statistically significant decrease in the mastectomy rate in favour of neoadjuvant chemotherapy.
 
In our study, patients with stage II to III disease were further stratified in the subgroup analysis. Stage II disease was considered early operable breast cancer while patients with stage III disease represented those with locally advanced disease. This stratification was in line with the MD Anderson Cancer Centre Classification of locally advanced disease.5 Patients with early operable breast cancer showed comparatively greater benefits following neoadjuvant chemotherapy in terms of the rate of pathological complete response and breast-conserving surgery.
 
Pathological complete response has been one of the commonly used study endpoints in publications. It has been suggested to correlate with a better long-term outcome. Meta-analysis by Mieog et al9 found improved OS in patients with pathological complete response. The definition of pathological complete response varies from institution to institution, however. In our study, we adopted the definition recognised by the MD Anderson Cancer Centre and in the ABCSG study,10 in which there should be no invasive residual disease in breast or nodes although non-invasive breast residuals are allowed. Studies have shown no difference in DFS or OS between patients with ypT0ypN0 and ypTisypN0 tumours.3 11
 
Of note, the rate of pathological complete response appears to be different among various intrinsic types of breast cancer.12 In 2005, Rouzier et al13 stratified breast cancer patients into four molecular classes using the genetic profile from a fine-needle aspiration specimen. Patients with basal-like and c-erbB2+ breast cancers had the highest rate of pathological complete response. Age younger than 50 years and ER-negative status were independent variables with a higher likelihood of pathological complete response. In our study, core biopsies with immunohistochemical staining and proliferation index were used to classify patients into luminal A, luminal B, triple-negative, or HER2-positive subgroups and also showed consistent findings.
 
Carey et al14 described the phenomenon of triple-negative paradox in 2007. Basal-like and HER2+/ER- subtypes were more chemosensitive than their luminal counterparts. They were more likely to have pathological complete response but those with residual disease also had a higher likelihood of relapse and worse outcome. The study by the German Breast Group in 2012 highlighted the impact of pathological complete response on prognosis in different intrinsic subtypes of breast cancer.10 Patients with ypT0N0 tumours had the best DFS (P<0.001) and a trend of better OS. More importantly, pathological complete response was predictive of DFS and OS in highly aggressive tumours only such as those with negative ER or PR status. Patients with HER2-positive or triple-negative tumours did better if they achieved pathological complete response after neoadjuvant chemotherapy. Residual disease in breast and nodes, on the contrary, was associated with worse distant DFS.15
 
Finally, recent publications have described possible changes in receptor status between pre- and post-neoadjuvant chemotherapy specimens, although the significance of such changes remains controversial.16 In our study, a change in ER status was evident in 10% of the study group and a change in HER2 status in 50%.
 
The current study has several limitations. First, this was a retrospective study and the database in the earlier period was incomplete; three patients with significant missing information were excluded from this small study. Second, there may be selection bias, as patients chosen for neoadjuvant chemotherapy were subject to surgeon assessment and patient preference. This study represents the experience of neoadjuvant chemotherapy by one experienced breast surgery specialist in one private hospital in Hong Kong, so the findings may not apply to breast cancer patients in public hospitals or in other countries. Third, long-term survival data were not available, so the prognostic significance of pathological complete response could not be assessed. Lastly, the number of cases was small, precluding further subgroup analysis of patients with pathological complete response or successful conversion to breast-conserving surgery, as well as multivariate analysis to control for potential confounders. A future study with a larger sample size may help guide patient selection for systemic treatment of breast cancer in the neoadjuvant setting.
 
Conclusions
The indications for neoadjuvant chemotherapy have expanded from rendering locally advanced breast cancers operable to downsizing early operable breast cancers to enable breast-conserving surgery. The current study showed an increased rate of breast-conserving surgery with neoadjuvant chemotherapy, especially in the early operable group. Negative hormonal receptor status was an independent variable that predicted pathological complete response.
 
Declaration
All authors have disclosed no conflicts of interest.
 
References
1. Hong Kong Breast Cancer Registry Report No. 8. Hong Kong Breast Cancer Foundation. Available from: http://www.hkbcf.org. Accessed Feb 2016.
2. Fisher B, Brown A, Mamounas E, et al. Effect of preoperative chemotherapy on local-regional disease in women with operable breast cancer: findings from National Surgical Adjuvant Breast and Bowel Project B-18. J Clin Oncol 1997;15:2483-93. Crossref
3. Gampenrieder S, Rinnerthaler G, Greil R. Neoadjuvant chemotherapy and targeted therapy in breast cancer: past, present, and future. J Oncol 2013;2013:732047. Crossref
4. Thompson AM, Moulder-Thompson SL. Neoadjuvant treatment of breast cancer. Ann Oncol 2012;23 Suppl 10:x231-6. Crossref
5. Giordano SH. Update on locally advanced breast cancer. Oncologist 2003;8:521-30. Crossref
6. Alassas M, Chu Q, Burton G, Ampil F, Mizell J, Li BD. Neoadjuvant chemotherapy in stage III breast cancer. Am Surg 2005;71:487-92.
7. Taucher S, Steger GG, Jakesz R, et al. The potential risk of neoadjuvant chemotherapy in breast cancer patients—results from a prospective randomized trial of the Austrian Breast and Colorectal Cancer Study Group (ABCSG-07). Breast Cancer Res Treat 2008;112:309-16. Crossref
8. van Nes JG, Putter H, Julien JP, et al. Preoperative chemotherapy is safe in early breast cancer, even after 10 years of follow-up; clinical and translational results from the EORTC trial 10902. Breast Cancer Res Treat 2009;115:101-13. Crossref
9. Mieog JS, van der Hage JA, van de Velde CJ. Neoadjuvant chemotherapy for operable breast cancer. Br J Surg 2007;94:1189-200. Crossref
10. von Minckwitz G, Untch M, Blohmer JU, et al. Definition and impact of pathologic complete response on prognosis after neoadjuvant chemotherapy in various intrinsic breast cancer subtypes. J Clin Oncol 2012;30:1796-804. Crossref
11. Mazouni C, Peintinger F, Wan-Kau S, et al. Residual ductal carcinoma in situ in patients with complete eradication of invasive breast cancer after neoadjuvant chemotherapy does not adversely affect patient outcome. J Clin Oncol 2007;25:2650-5. Crossref
12. Bhargava R, Beriwai S, Dabbs DJ, et al. Immunohistochemical surrogate markers of breast cancer molecular classes predicts response to neoadjuvant chemotherapy: a single institutional experience with 359 cases. Cancer 2010;116:1431-9. Crossref
13. Rouzier R, Perou CM, Symmans WF, et al. Breast cancer molecular subtypes respond differently to preoperative chemotherapy. Clin Cancer Res 2005;11:5678-85. Crossref
14. Carey LA, Dees EC, Sawyer L, et al. The triple negative paradox: primary tumor chemosensitivity of breast cancer subtypes. Clin Cancer Res 2007;13:2329-34. Crossref
15. Corben AD, Abi-Raad R, Popa I, et al. Pathologic response and long-term follow-up in breast cancer patients treated with neoadjuvant chemotherapy: a comparison between classifications and their practical application. Arch Pathol Lab Med 2013;137:1074-82. Crossref
16. Pedrini JL, Francalacci Savaris R, Casales Schorr M, Cambruzi E, Grudzinski M, Zettler CG. The effect of neoadjuvant chemotherapy on hormone receptor status, HER2/neu and prolactin in breast cancer. Tumori 2011;97:704-10.

Comparison of a commercial interferon-gamma release assay and tuberculin skin test for the detection of latent tuberculosis infection in Hong Kong arthritis patients who are candidates for biologic agents

Hong Kong Med J 2017 Jun;23(3):246–50 | Epub 27 Jan 2017
DOI: 10.12809/hkmj164880
© Hong Kong Academy of Medicine. CC BY-NC-ND 4.0
 
ORIGINAL ARTICLE
Comparison of a commercial interferon-gamma release assay and tuberculin skin test for the detection of latent tuberculosis infection in Hong Kong arthritis patients who are candidates for biologic agents
H So, MSc, FHKAM (Medicine); Carol SW Yuen, BNurs, MSc; Ronald ML Yip, FHKCP, FHKAM (Medicine)
Department of Medicine and Geriatrics, Kwong Wah Hospital, Yaumatei, Hong Kong
 
An earlier version of this paper was presented at the ASM of the Hong Kong Society of Rheumatology held in Hong Kong on 22 November 2015.
 
Corresponding author: Dr H So (h99097668@hotmail.com)
 
 
Abstract
Introduction: It is universally agreed that screening for latent tuberculosis infection prior to biologic therapy is necessary, especially in endemic areas such as Hong Kong. There are still, however, controversies regarding how best to accomplish this task. The tuberculin skin test has been the routine screening tool for latent tuberculosis infection in Hong Kong for the past decade although accuracy is far from perfect, especially in patients who have been vaccinated with Bacillus Calmette–Guérin, who are immunocompromised, or who have atypical mycobacterium infection. The new interferon-gamma release assays have been shown to improve specificity and probably sensitivity. This study aimed to evaluate agreement between the interferon-gamma release assay and the tuberculin skin test in the diagnosis of latent tuberculosis infection in patients with arthritic diseases scheduled to receive biologic agents.
 
Methods: We reviewed 38 patients with rheumatoid arthritis, psoriatic arthritis, or spondyloarthritis at a local hospital in Hong Kong from August 2013 to April 2014. They were all considered candidates for biologic agents. The patients underwent both the interferon-gamma release assay (ASACIR.TB; A.TB) and the tuberculin skin test simultaneously. Concurrent medications were documented. Patients who tested positive for either test (ie A.TB+ or TST+) were prescribed treatment for latent tuberculosis if they were to be given biologic agents. All patients were followed up regularly for 1 year and the development of active tuberculosis infection was evaluated.
 
Results: Based on an induration of 10 mm in diameter as the cut-off value, 13 (34.2%) of 38 patients had a positive tuberculin skin test. Of the 38 patients, 11 (28.9%) also had a positive interferon-gamma release assay. The agreement between interferon-gamma release assay and tuberculin skin test was 73.7% (kappa=0.39). Six patients were TST+/A.TB– and four were TST–/A.TB+. When positive tuberculin skin test was defined as an induration of 5-mm diameter, the agreement between the two tests improved with a kappa value of 0.47. In that case, half of the patients had a positive tuberculin skin test; among them, nine were TST+/A.TB–. Only one was TST–/A.TB+. Subgroup analysis showed that the agreement between both tests improved further (kappa=0.69) in patients not taking a concurrent systemic steroid. For patients prescribed systemic steroid, the agreement was only slight with a kappa value of 0.066. Finally, none of the 38 patients, of whom 32 had an exposure to biologic agents, developed active tuberculosis during the 1-year follow-up period.
 
Conclusion: In a tuberculosis-endemic population, although 10-mm diameter induration is the usual cut-off for a positive tuberculin skin test, the level of agreement between the interferon-gamma release assay and tuberculin skin test improved from fair to moderate when the cut-off was lowered to 5 mm. A dual testing strategy of tuberculin skin test and interferon-gamma release assays appeared to be effective and should be pursued especially in patients who are on systemic steroid therapy. Nonetheless, the issue of potential overtreatment is yet to be evaluated.
 
 
New knowledge added by this study
  • In Hong Kong, a tuberculosis-endemic area, the level of agreement between tuberculin skin test (TST) and interferon-gamma release assay (IGRA) for detecting latent tuberculosis infection was only fair in arthritis patients scheduled to receive biologic therapy.
  • Although 10 mm is the cut-off for positive TST according to the local guideline, the level of agreement between the two tests improved when a 5-mm cut-off was used.
Implications for clinical practice or policy
  • Dual testing strategy with TST and IGRA appeared to be effective and should be employed, especially in patients who are prescribed systemic steroid therapy.
 
 
Introduction
The advent of biologic agents has revolutionised the treatment of patients with rheumatoid arthritis (RA), psoriatic arthritis (PSA), and spondyloarthritis (SPA), and outcomes are now greatly improved. This, however, comes at the price of a clearly heightened risk of active tuberculosis (TB) from progression of latent TB infection (LTBI).1 It is therefore universally agreed that screening for LTBI prior to biologic therapy is necessary, especially in endemic areas such as Hong Kong.2 Unfortunately, there remains controversy regarding how best to accomplish this task.
 
The tuberculin skin test (TST) has been the routine screening tool in Hong Kong for the past decade.3 Its accuracy, however, is far from perfect, especially in patients who have been vaccinated with Bacillus Calmette–Guérin (BCG), who are immunocompromised, or who have been infected with atypical mycobacteria.4 Recently, interferon-gamma release assays (IGRAs) that measure interferon-gamma secretion in response to Mycobacterium tuberculosis–specific antigens have become available to detect LTBI. They have been shown to offer improved specificity and probably sensitivity.5 6 Other shortcomings of the TST, such as the need for return visits and reader variability, are also overcome. One of the IGRAs, the ASACIR.TB (A.TB; Haikou VTI Biological Institute, Hainan, China), has shown encouraging results in a large-scale clinical trial conducted in China and might be more appropriate for Chinese populations.7
 
On the other hand, IGRAs are not flawless. The rate of indeterminate results has been reported to be as high as 40%.8 The immunocompromised state of arthritic patients may also depress the T-cell response, leading to inaccurate IGRA results. Recent data suggest that an IGRA alone is insufficient to identify all patients at risk.9 10 Furthermore, various studies have reported very different concordance between the IGRA and the TST, likely as a result of heterogeneity (eg differing background TB prevalence, variable immunosuppressive therapies, or underlying BCG status).11
 
This study aimed to evaluate the agreement between the IGRA and the TST in the diagnosis of LTBI in patients with arthritic diseases scheduled to receive biologic agents in Hong Kong.
 
Methods
Patients
We reviewed 38 patients with RA, PSA, or SPA at a local hospital in Hong Kong from August 2013 to April 2014. They were diagnosed according to the 2010 classification criteria for RA of the American College of Rheumatology/European League Against Rheumatism, the Classification Criteria for Psoriatic Arthritis, and the Assessment of SpondyloArthritis international Society classification criteria, respectively. Patients were included if they were considered candidates for biologic agents. Concurrent medications were documented. Candidates were excluded if they had active TB infection, a history of incomplete TB treatment, or no measured induration. Patients underwent both the IGRA and the TST simultaneously. Those who tested positive for either test and who were due to be prescribed biologic agents were given latent TB treatment with isoniazid or rifampicin for 9 months. All patients were followed up regularly for 1 year and the development of active TB infection was evaluated. This study conforms to the provisions of the Declaration of Helsinki and the guidelines of the local ethical committee. Informed consent was considered not necessary due to the retrospective nature of the study.
 
Tuberculin skin test
The TST was performed by rheumatologists. A dose of 0.1 mL of purified protein derivative (PPD) containing 2 tuberculin units (TU) was injected intradermally into the volar aspect of the forearm. Indurations were measured in millimetres 48 hours after inoculation by rheumatologists who were blinded to the IGRA results. According to the local guideline, an induration of ≥10 mm was considered a positive result for LTBI.3
 
Interferon-gamma release assay
We performed the A.TB IGRA test (Haikou VTI Biological Institute) in all study patients. This assay employs Haikou VTI’s patented technology (US patent number 7754219) that enables intracellular delivery of the full-length protein CFP-10 and the antigen ESAT-6 to stimulate antigen-specific T cells through the major histocompatibility complex class I pathway.12 13 The assay was performed according to the user manual. In brief, a negative control of phosphate-buffered saline (N), a positive control of concanavalin A (P), and the TB stimulators CFP-10 and ESAT-6 (T) were mixed with fresh heparinised whole blood and incubated for approximately 24 hours at 37°C. The plasma was collected and stored at 4 to 8°C for up to 2 weeks. The interferon-gamma level in the plasma was then determined by enzyme-linked immunosorbent assay. If N was <0.5 IU/mL and (T−N)/(P−N) was ≥0.6, or if N was ≥0.5 IU/mL and (T−N)/(P−N) was ≥0.85, the test was considered positive (A.TB+); otherwise the result was negative (A.TB–).
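The positivity rule above can be sketched as a small Python function. This is a minimal illustration of the stated thresholds only; the function name and the example values are hypothetical and are not part of the assay's own software.

```python
def atb_igra_result(n_ctrl, p_ctrl, t_stim):
    """Classify an A.TB IGRA result from the interferon-gamma levels
    (IU/mL) of the negative control (N), positive control (P), and
    TB-stimulated (T) wells, per the thresholds stated in the text."""
    ratio = (t_stim - n_ctrl) / (p_ctrl - n_ctrl)
    cutoff = 0.6 if n_ctrl < 0.5 else 0.85  # stricter cutoff when N is high
    return "positive" if ratio >= cutoff else "negative"

# Hypothetical example values, for illustration only:
print(atb_igra_result(0.1, 10.0, 6.5))  # positive (ratio ~0.65 >= 0.6)
print(atb_igra_result(0.6, 10.0, 8.8))  # positive (N >= 0.5, ratio ~0.87 >= 0.85)
```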
 
Statistical analysis
Descriptive statistics are presented as frequencies and means ± standard deviations, as appropriate. The concordance between the TST and the IGRA was evaluated by Cohen’s kappa (κ) statistic. A kappa value of >0.60 represents substantial agreement; 0.41 to 0.60, moderate agreement; 0.21 to 0.40, fair agreement; and <0.21, slight agreement. The concordance was subanalysed in patients receiving and not receiving prednisolone.
 
Results
The demographic and clinical characteristics of the 38 patients are summarised in Table 1. All patients were residents of Hong Kong. Half of the patients were prescribed systemic steroid therapy, comprising prednisolone at a dose of 2.5 to 15 mg daily. All except three patients with SPA were taking various conventional disease-modifying antirheumatic drugs.
 

Table 1. Demographic and clinical characteristics of patients
 
The results of the concomitant TST and IGRA are shown in Table 2. Of the 38 patients, based on an induration of 10 mm as the cut-off value, 13 (34.2%) had a positive TST and 11 (28.9%) had a positive IGRA. The agreement between the A.TB IGRA and the TST was 73.7%. Six patients were TST+/A.TB– and four were TST–/A.TB+. Subgroup analysis showed that four of the six discordant TST+/A.TB– results occurred in patients on a systemic steroid, and only three patients on a systemic steroid were A.TB+ versus eight patients who were not. When a positive TST was defined as an induration of 5 mm, half of the patients had a positive TST, of whom nine were A.TB–; only one patient was TST–/A.TB+. In patients prescribed a systemic steroid, with 5-mm induration as the cut-off for positivity, the TST missed no patients with a positive IGRA.
 

Table 2. Tuberculin skin test (TST) and A.TB test results
 
Analysis of the agreement between the two tests, assessed by the kappa statistic, showed only fair agreement, with a kappa value of 0.39 (Table 3). When a 5-mm induration was taken as a positive TST, however, the agreement improved to moderate (kappa=0.47). Subgroup analysis revealed that the agreement improved further (kappa=0.57) in patients not taking a concurrent systemic steroid, whereas in patients taking a systemic steroid the agreement was only slight (kappa=0.066). With a 5-mm induration regarded as positive, the agreement between the TST and the IGRA in patients not on systemic steroid therapy was substantial (kappa=0.69).
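As an arithmetic check, the reported 73.7% raw agreement and kappa of 0.39 can be recomputed from the 2×2 counts in the Results (38 patients; 13 TST+, 11 A.TB+, 6 TST+/A.TB–, 4 TST–/A.TB+). This sketch assumes the standard unweighted Cohen's kappa, κ = (p_o − p_e)/(1 − p_e), which coincides with the weighted form for a binary 2×2 table.

```python
# Recompute Cohen's (unweighted) kappa from the 2x2 counts reported in
# the Results: 38 patients; 13 TST+, 11 A.TB+, 6 TST+/A.TB-, 4 TST-/A.TB+.
n = 38
tst_pos, igra_pos = 13, 11
both_pos = tst_pos - 6            # TST+ and A.TB+ -> 7
both_neg = n - both_pos - 6 - 4   # TST- and A.TB- -> 21

p_observed = (both_pos + both_neg) / n    # raw agreement
p_expected = (tst_pos * igra_pos + (n - tst_pos) * (n - igra_pos)) / n**2
kappa = (p_observed - p_expected) / (1 - p_expected)

print(round(p_observed * 100, 1))  # 73.7 (% agreement, as reported)
print(round(kappa, 2))             # 0.39 (fair agreement)
```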
 

Table 3. Agreement between the two tests
 
At the end of the study, 32 of the initial 38 patients had received biologic agents. None of them developed active TB during the 1-year follow-up period.
 
Discussion
In clinical practice there is no gold standard test for diagnosing LTBI. Both IGRA and TST have strengths and weaknesses. In a meta-analysis performed on an unselected population, the specificity of IGRA was 99% in a non-BCG–vaccinated population and 96% in a BCG-vaccinated population.14 The specificity of the TST was 97% in a non-BCG–vaccinated population but dropped to 59% in a BCG-vaccinated population.14 In addition to BCG history, comparison between the two tests must take into account the underlying disease, the immunosuppression status, and the background TB burden of the population being screened.
 
In this study, we found only fair agreement (kappa=0.39) between the results of the TST and the A.TB IGRA in arthritis patients from a TB-endemic area. Some studies have also reported a discrepancy between the two tests in countries with an intermediate TB burden.15 16 In contrast, the reported agreement between the TST and the IGRA was good in low-burden countries (kappa of 0.72 in the United Kingdom17 and 0.87 in Denmark18). The background incidence of TB therefore appears to be a crucial determinant of agreement between the two tests.
 
In this study, the concordance between the TST and the IGRA was also affected by immune status. There was only slight agreement in patients taking concurrent prednisolone, but the agreement improved when these patients were excluded from the analysis. Previous studies of immunosuppressed RA patients have shown similarly poor concordance between the two tests regardless of TB burden.19 20 Discordance between the two tests in the diagnosis of LTBI has also been reported among individuals infected with human immunodeficiency virus.21 The patients on prednisolone in our study had a lower rate of positivity on both tests. It is intuitive that an immunosuppressed state would blunt the T-cell response; indeed, a systematic review showed that both positive IGRA and positive TST results were significantly influenced by immunosuppressive therapy.22
 
The Hong Kong guideline TST cut-off for LTBI diagnosis before anti-tumour necrosis factor treatment is an induration of ≥10 mm.3 In the current study, we showed that a TST cut-off of 5 mm achieved better agreement between IGRA and TST results. If the IGRA alone cannot be relied on to diagnose LTBI, it may be more appropriate to lower the TST cut-off to 5 mm. We also showed that our approach of screening for LTBI with both the TST and the IGRA was successful in preventing the development of active TB in patients receiving biologic therapy. This dual-testing strategy may be especially applicable to patients on a systemic steroid, as they are at higher risk of developing active TB and the two test results are more discordant in this group. The improved sensitivity will inevitably lower specificity and may lead to overtreatment. In these high-risk settings, however, it is reasonable to favour sensitivity when screening for LTBI.
 
Only five patients could give a definite history of BCG vaccination; for the remainder, this information was uncertain. While this may reflect the local clinical situation, it is a limitation of the present study. Moreover, because of the different interpretation methods employed and the lack of trials comparing the performance of the A.TB IGRA with that of other IGRAs, the conclusions drawn from the current study may not be applicable to patients given a different IGRA, despite the mechanistic similarity of these assays. Further studies using individual IGRAs may be warranted to address the same question.
 
Conclusion
In arthritis patients in a TB-endemic population, the level of agreement between the TST and the A.TB IGRA for detecting LTBI was only fair. Although 10 mm is the usual TST cut-off, the agreement between the two tests improved from fair to moderate when a 5-mm cut-off was used. A dual-testing strategy with the TST and an IGRA appeared to be effective and should be pursued, especially in patients prescribed a systemic steroid. The issue of potential overtreatment has yet to be evaluated.
 
Declaration
All authors have disclosed no conflicts of interest.
 
References
1. Furst DE. The risk of infections with biologic therapies for rheumatoid arthritis. Semin Arthritis Rheum 2010;39:327-46. Crossref
2. World Health Organization. Guidelines on the management of latent tuberculosis infection. Geneva, Switzerland: World Health Organization; 2015.
3. Mok CC. Consensus statements on the indications and monitoring of anti-tumor necrosis factor (TNF) therapy for rheumatic diseases in Hong Kong. Hong Kong Bull Rheum Dis 2005;5:19-25.
4. Huebner RE, Schein MF, Bass JB Jr. The tuberculin skin test. Clin Infect Dis 1993;17:968-75. Crossref
5. Matulis G, Jüni P, Villiger PM, Gadola SD. Detection of latent tuberculosis in immunosuppressed patients with autoimmune diseases: performance of a Mycobacterium tuberculosis antigen-specific interferon gamma assay. Ann Rheum Dis 2008;67:84-90. Crossref
6. Ponce de Leon D, Acevedo-Vasquez E, Alvizuri S, et al. Comparison of an interferon-gamma assay with tuberculin skin testing for detection of tuberculosis (TB) infection in patients with rheumatoid arthritis in a TB-endemic population. J Rheumatol 2008;35:776-81.
7. Song Q, Guo H, Zhong H, et al. Evaluation of a new interferon-gamma release assay and comparison to tuberculin skin test during a tuberculosis outbreak. Int J Infect Dis 2012;16:e522-6. Crossref
8. Ferrara G, Losi M, Meacci M, et al. Routine hospital use of a new commercial whole blood interferon-gamma assay for the diagnosis of tuberculosis infection. Am J Respir Crit Care Med 2005;172:631-5. Crossref
9. Kleinert S, Tony HP, Krueger K, et al. Screening for latent tuberculosis infection: performance of tuberculin skin test and interferon-gamma release assays under real-life conditions. Ann Rheum Dis 2012;71:1791-5. Crossref
10. Mariette X, Baron G, Tubach F, et al. Influence of replacing tuberculin skin test with ex vivo interferon gamma release assays on decision to administer prophylactic antituberculosis antibiotics before anti-TNF therapy. Ann Rheum Dis 2012;71:1783-90. Crossref
11. Winthrop KL, Weinblatt ME, Daley CL. You can’t always get what you want, but if you try sometimes (with two tests—TST and IGRA—for tuberculosis) you get what you need. Ann Rheum Dis 2012;71:1757-60. Crossref
12. Cao H, Agrawal D, Kushner N, Touzjian N, Essex M, Lu Y. Delivery of exogenous protein antigens to major histocompatibility complex class I pathway in cytosol. J Infect Dis 2002;185:244-51. Crossref
13. McEvers K, Elrefaei M, Norris P, et al. Modified anthrax fusion proteins deliver HIV antigens through MHC class I and II pathways. Vaccine 2005;23:4128-35. Crossref
14. Pai M, Zwerling A, Menzies D. Systematic review: T-cell–based assays for the diagnosis of latent tuberculosis infection: an update. Ann Intern Med 2008;149:177-84. Crossref
15. Yilmaz N, Zehra Aydin S, Inanc N, Karakurt S, Direskeneli H, Yavuz S. Comparison of QuantiFERON-TB Gold test and tuberculin skin test for the identification of latent Mycobacterium tuberculosis infection in lupus patients. Lupus 2012;21:491-5. Crossref
16. Lee JH, Sohn HS, Chun JH, et al. Poor agreement between QuantiFERON-TB Gold test and tuberculin skin test results for the diagnosis of latent tuberculosis infection in rheumatoid arthritis patients and healthy controls. Korean J Intern Med 2014;29:76-84. Crossref
17. Ewer K, Deeks J, Alvarez L, et al. Comparison of T-cell-based assay with tuberculin skin test for diagnosis of Mycobacterium tuberculosis infection in a school tuberculosis outbreak. Lancet 2003;361:1168-73. Crossref
18. Brock I, Weldingh K, Lillebaek T, Follmann F, Andersen P. Comparison of tuberculin skin test and new specific blood test in tuberculosis contacts. Am J Respir Crit Care Med 2004;170:65-9. Crossref
19. Huang YW, Shen GH, Lee JJ, Yang WT. Latent tuberculosis infection among close contacts of multidrug-resistant tuberculosis patients in central Taiwan. Int J Tuberc Lung Dis 2010;14:1430-5.
20. Shalabi NM, Houssen ME. Discrepancy between the tuberculin skin test and the levels of serum interferon-gamma in the diagnosis of tubercular infection in contacts. Clin Biochem 2009;42:1596-601. Crossref
21. Mandalakas AM, Hesseling AC, Chegou NN, et al. High level of discordant IGRA results in HIV-infected adults and children. Int J Tuberc Lung Dis 2008;12:417-23.
22. Shahidi N, Fu YT, Qian H, Bressler B. Performance of interferon-gamma release assays in patients with inflammatory bowel disease: a systematic review and meta-analysis. Inflamm Bowel Dis 2012;18:2034-42. Crossref

A prospective interventional study to examine the effect of a silver alloy and hydrogel-coated catheter on the incidence of catheter-associated urinary tract infection

Hong Kong Med J 2017 Jun;23(3):239–45 | Epub 17 Feb 2017
DOI: 10.12809/hkmj164906
© Hong Kong Academy of Medicine. CC BY-NC-ND 4.0
 
ORIGINAL ARTICLE
A prospective interventional study to examine the effect of a silver alloy and hydrogel–coated catheter on the incidence of catheter-associated urinary tract infection
Patrick HY Chung, FRCSEd(Paed), FHKAM (Surgery)1; Carol WY Wong, MB, BS, MRCSEd1; Christopher KC Lai, MB, ChB, FRCPath2; HK Siu, BSc (Statistics), MPhil (CUHK)3; Dominic NC Tsang, MB, BS, FRCPath2,3; KY Yeung, MNurs, BNurs4; Dennis KM Ip, MB, BS, MPhil(Epidemiology)(Cantab)5; Paul KH Tam, FRCS (Edin, Glasg, Irel), FHKAM (Surgery)1
1 Department of Surgery, Li Ka Shing Faculty of Medicine, The University of Hong Kong, Pokfulam, Hong Kong
2 Department of Pathology, Queen Elizabeth Hospital, Jordan, Hong Kong
3 Chief Infection Control Officer’s Office, Hospital Authority, Hong Kong
4 Infection Control Team, Central Nursing Department, Kowloon Hospital, Argyle Street, Hong Kong
5 School of Public Health, Li Ka Shing Faculty of Medicine, The University of Hong Kong, Pokfulam, Hong Kong
 
Corresponding author: Dr Christopher KC Lai (laikcc@ha.org.hk)
 
 Full paper in PDF
 
Abstract
Introduction: Catheter-associated urinary tract infection is a major hospital-acquired infection. This study aimed to analyse the effect of a silver alloy and hydrogel–coated catheter on the occurrence of catheter-associated urinary tract infection.
 
Methods: This was a 1-year prospective study conducted at a single centre in Hong Kong. Adult patients with an indwelling urinary catheter for longer than 24 hours were recruited. The incidence of catheter-associated urinary tract infection in patients with a conventional latex Foley catheter without hydrogel was compared with that in patients with a silver alloy and hydrogel–coated catheter. Urinary tract infection was defined according to the latest surveillance definition of the National Healthcare Safety Network managed by the Centers for Disease Control and Prevention.
 
Results: A total of 306 patients were recruited, with a similar ratio of males to females. The mean (standard deviation) age was 81.1 (10.5) years. The total numbers of catheter-days were 4352 and 7474 in the silver-coated and conventional groups, respectively, and the incidences of catheter-associated urinary tract infection per 1000 catheter-days were 6.4 and 9.4, respectively (P=0.095), a 31% reduction in the silver-coated group. Escherichia coli was the most commonly involved pathogen, accounting for 36.7% of all cases. Subgroup analysis revealed that the protective effect of the silver-coated catheter was more pronounced in long-term users and in female patients, with respective reductions of 48% (P=0.027) and 42% (P=0.108) in the incidence of catheter-associated urinary tract infection. The mean catheterisation time per person was longest in patients using a silver-coated catheter (17.0 days) compared with those using a conventional catheter (10.8 days) or both types of catheter (13.6 days) [P=0.01].
 
Conclusions: Silver alloy and hydrogel–coated catheters appear to be effective in preventing catheter-associated urinary tract infection based on the latest surveillance definition. The effect is perhaps more prominent in long-term users and female patients.
 
 
New knowledge added by this study
  • The use of a silver alloy and hydrogel–coated (SAH) catheter has the potential to reduce catheter-associated urinary tract infection (CA-UTI), especially in certain subgroups of patients (long-term users and female patients).
Implications for clinical practice or policy
  • The use of a SAH catheter potentially reduces the incidence of CA-UTI. This will lead to less morbidity and medical costs associated with CA-UTI.
  • This study provides pilot data for future research.
 
 
Introduction
Catheter-associated urinary tract infection (CA-UTI) is a major cause of hospital-acquired infection, with local data showing 4.9 infections per 1000 catheter-days.1 Internationally, an estimated 900 000 nosocomial UTIs occur every year, prolonging the mean duration of hospital stay by 1 to 3.8 days. It has been estimated that approximately 80% of UTIs are related to the presence of an indwelling urinary catheter. In severe cases, these infections may lead to bacteraemia, urosepsis, and even mortality.2 3 A case-control study also suggested that patients with CA-UTI had excess costs of US$3803 compared with patients without infection.4 Therefore, by prevention of CA-UTI, a significant reduction in morbidity and mortality, as well as the health care economic burden, can be anticipated.
 
Bactiguard-coated Foley catheters (Bactiguard, Sweden) were approved by the US Food and Drug Administration in 1994. These catheters have a stable noble metal alloy and hydrogel coating (also referred to as a silver alloy and hydrogel [SAH] coating) on the outer and inner luminal surfaces, providing repellent and anti-infective properties by preventing the formation of microbial biofilm. The coating consists of gold, silver, and palladium; it also preserves urethral mucosal integrity and helps to prevent inflammation. Previous studies of CA-UTI prevention used asymptomatic bacteriuria (ASB), alone or in combination with symptomatic UTI, as the endpoint, so their clinical relevance has been questioned. We conducted a prospective, interventional study to provide additional data on the effectiveness of the noble metal alloy urinary catheter in the prevention of CA-UTI, using the updated surveillance definition of the National Healthcare Safety Network (NHSN) managed by the Centers for Disease Control and Prevention (CDC). This surveillance definition, adopted in 2009, modified the criteria for symptomatic infection, added a category and definition for asymptomatic bacteraemic UTI, and removed ASB entirely.5 To study the effect on ASB, we adopted the criteria of the Infectious Diseases Society of America practice guideline developed in 2009.6
 
Methods
This single-centre 1-year prospective study was completed in 2012 in a regional rehabilitation hospital in Hong Kong. The study population was in-patients in two medical rehabilitation wards. All patients over 18 years of age on either ward during the study period with an indwelling catheter for longer than 24 hours were recruited after giving informed consent. Patients were excluded if they underwent suprapubic catheterisation, single in-and-out catheterisation for collection of a urine specimen, intermittent catheterisation for urine drainage, or catheterisation for less than 24 hours; if they were catheterised with a silicone Foley catheter; or if they had been treated with antibiotics for a UTI. The two study wards rotated through the two interventions over two 6-month periods, so that each ward served as its own control, minimising the variability in medical and nursing practice that might otherwise affect the outcomes. Conventional latex Foley catheters without hydrogel (sizes Fr 12, 14, and 16) were used on both wards during the first half of the study period; SAH catheters (sizes Fr 12, 14, and 16) were used during the second half. If a catheter was changed because of infection, the catheter type appropriate to that month of the study was used. Thus, patients who required a catheter for a long time and underwent catheter exchange could be exposed to both types of urinary catheter (Fig 1).
 

Figure 1. Flowchart showing the study design and patient distribution
 
The definition of CA-UTI was adopted and modified from the CDC/NHSN definition of symptomatic UTI (Appendix5 6). Routine, screening, and clinical urine samples were collected from all subjects according to the hospital protocol. Routine urine samples were taken from all subjects at four fixed time-points: on admission, on catheterisation, before removal of the catheter, and before hospital discharge. Screening samples were taken weekly. Clinical samples were taken whenever a patient showed symptoms or signs of UTI, or as part of a sepsis workup. The incidence of CA-UTI in the two groups was analysed in terms of the absolute number of CA-UTI episodes and the number of CA-UTI episodes per 1000 catheter-days. Values are expressed as mean ± standard deviation. Comparisons between the two groups were performed with Pearson’s Chi squared test, Student’s t test, and one-way analysis of variance, as appropriate, with a two-sided significance level of 0.05. The rate ratios of CA-ASB and CA-UTI between the two groups were compared by the exact Poisson test for rate ratio. The occurrence of CA-UTI in the two groups was also compared with Kaplan-Meier analysis. Results were analysed using the Statistical Package for the Social Sciences (Windows version 21.0; IBM Corp, Armonk [NY], United States) and R version 3.1.2.
 

Appendix. Definition of CA-UTI5 and CA-ASB6 adopted in the current study
 
This study was done in accordance with the principles outlined in the Declaration of Helsinki.
 
Results
During the 1-year study period, 306 patients were recruited. The male-to-female ratio was 1:1.13 and the mean age was 81.1 ± 10.5 years (Table 1). Overall, 187 patients used a conventional catheter only, 36 patients used a SAH catheter only, and 83 patients used both a conventional and a SAH catheter (Fig 1).
 

Table 1. Characteristics of the study population, specimens collected, and catheter used
 
The total numbers of catheter-days were 4352 and 7474 in the SAH and conventional groups, respectively, and the numbers of CA-UTI episodes were 28 and 70, respectively. Thus the incidences of CA-UTI per 1000 catheter-days in the SAH and conventional groups were 6.4 and 9.4, respectively (P=0.095), with a rate ratio of 0.69 (95% confidence interval [CI], 0.42-1.08): a 31% reduction in CA-UTI incidence in the SAH group. On Kaplan-Meier analysis with the log-rank test, the SAH catheter was associated with a significantly lower rate of CA-UTI (P=0.045; Fig 2). For CA-ASB, the incidences per 1000 catheter-days in the SAH and conventional groups were 70.8 and 67.2, respectively (P=0.467), with a rate ratio of 1.05 (95% CI, 0.91-1.22). Results are summarised in Table 2. Blood cultures were taken from patients who developed CA-UTI; in both groups, none of these patients developed bacteraemia. Escherichia coli was the most commonly involved urinary pathogen, accounting for 36.7% of all cases, followed by Candida albicans (17.3%) and Proteus mirabilis (14.3%) [Table 3]. The same pathogens were observed in both groups.
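The headline figures can be reproduced directly from the reported episode counts and catheter-days. This brief sketch recomputes the per-1000-catheter-day incidences and the rate ratio; the P value and confidence interval require the exact Poisson test and are not reproduced here.

```python
# Recompute the per-1000-catheter-day incidences and the rate ratio from
# the reported counts (SAH: 28 episodes / 4352 days; conventional: 70 / 7474).
sah_episodes, sah_days = 28, 4352
conv_episodes, conv_days = 70, 7474

sah_rate = sah_episodes / sah_days * 1000
conv_rate = conv_episodes / conv_days * 1000
rate_ratio = (sah_episodes / sah_days) / (conv_episodes / conv_days)

print(round(sah_rate, 1))    # 6.4
print(round(conv_rate, 1))   # 9.4
print(round(rate_ratio, 2))  # 0.69, ie a 31% reduction
```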
 

Figure 2. Comparison of CA-UTI occurrence between SAH and conventional catheters in the entire study population using Kaplan-Meier analysis
 

Table 2. Overall comparison of CA-UTI and CA-ASB episodes between SAH catheters and conventional catheters
 

Table 3. Organisms identified from CA-UTI specimens (some specimens showed mixed flora)
 
This study was not a randomised controlled trial, so to minimise patient selection bias, a subgroup analysis was performed among patients who used both types of catheter (n=83). These patients had more catheters used and more catheter-days than patients who used only one type of catheter (Table 4a); this was a consequence of the study design, in which longer-term users had a higher chance of exposure to both types of urinary catheter. Among them, the total numbers of catheter-days were 3210 and 3457 in the SAH and conventional groups, respectively, and the numbers of CA-UTI episodes were 17 and 35, respectively. The incidences of CA-UTI per 1000 catheter-days in the SAH and conventional groups were thus 5.3 and 10.1, respectively (P=0.027), with a rate ratio of 0.52 (95% CI, 0.27-0.96): a statistically significant 48% reduction in CA-UTI incidence in the SAH group (Table 4b). Because catheters were exchanged when an infection occurred, the reduction in CA-UTI meant that SAH catheters needed to be exchanged less often: the mean catheterisation time per person was 17.0 days for the SAH catheter, compared with 10.8 days for the conventional catheter and 13.6 days for patients using both catheters (Table 4a).
 

Table 4. (a) Characteristics of patients who used both types of catheter, SAH catheter only, or conventional catheter only. (b) Comparison of CA-UTI and CA-ASB incidences between SAH and conventional catheters in 83 patients who used both types of catheter (cross-over group). (c) Comparison of CA-UTI and CA-ASB incidences between SAH and conventional catheters in male and female patients
 
To examine gender-related outcome differences in the entire study population, we also performed a subgroup analysis by gender (Table 4c). In male patients (n=144), the number of CA-UTI episodes was 15 in the SAH group (total catheter-days, 1966) and 33 in the conventional group (total catheter-days, 3529); the incidences of CA-UTI per 1000 catheter-days were thus 7.6 and 9.4, respectively (P=0.551), with a rate ratio of 0.82 (95% CI, 0.41-1.54). In female patients (n=162), the number of CA-UTI episodes was 13 in the SAH group (total catheter-days, 2386) and 37 in the conventional group (total catheter-days, 3945); the incidences of CA-UTI per 1000 catheter-days were 5.4 and 9.4, respectively (P=0.108), with a rate ratio of 0.58 (95% CI, 0.28-1.12).
 
Discussion
Urinary tract infection is one of the most commonly encountered infections in daily clinical practice, and the majority of cases are catheter-related. Although a number of clinical practices such as aseptic technique for catheter insertion, closed drainage systems, and shorter duration of catheterisation have been introduced in an attempt to reduce CA-UTI, the incidence remains high.3 7 Research into strategies and new technologies to prevent CA-UTI is therefore still needed. Since the early 1990s, research has focused on different anti-infective catheter-coating materials, but results have been generally inconclusive. Bactiguard-coated Foley catheters, with a noble metal (gold, silver, and palladium) alloy and hydrogel coating, have been introduced to slow bacterial colonisation.
 
In the early 2000s, a randomised cross-over study by Karchmer et al8 demonstrated that the risk of UTI could be decreased by 21% on wards and by 32% among patients when a noble metal alloy catheter was used instead of a conventional catheter. Since then, more studies comparing anti-infective urinary catheters with conventional urinary catheters have been carried out. The noble metal alloy indwelling catheter has been shown in multiple large clinical trials and smaller case studies to reduce the incidence of CA-UTI compared with conventional catheters.9 10 11 12 13 14 15 These studies have examined endpoints such as bacteriuria, symptomatic CA-UTI, or surveillance-defined UTI.8 16 17 In a study by Pickard et al,17 noble metal alloy catheters were found to be ineffective in reducing the incidence of symptomatic surveillance-defined UTI when used in short-term (mean, 2 days) surgical patients, and the authors did not support the routine use of these catheters in that patient group. The lack of effect is unsurprising given the short catheterisation time and the low-risk patient group. In a more recent multicentre cohort study in 2014, Lederer et al4 examined the impact of noble metal alloy catheters on symptomatic CA-UTI and antibiotic use based on NHSN surveillance, and reported a 58% relative reduction (P<0.0001) in the NHSN-defined CA-UTI rate and 60% less antibiotic use compared with conventional catheters.
 
In the present study, we demonstrated a 31% reduction in the incidence of CA-UTI episodes per 1000 catheter-days in the SAH group, although this did not reach statistical significance, likely owing to the small group sizes. We believe that the incidence rate per catheter-day is a more appropriate comparator to reflect the risk of infection associated with different types of catheter, as it takes into account the duration of catheterisation, which is known to be an important factor in the incidence of CA-UTI. This is also reflected by the fact that the noble metal alloy catheter could be left in situ for the longest period. Although each SAH catheter (approximately HK$100) costs more than a conventional catheter (approximately HK$15), we believe the longer duration of use and the potential reduction in CA-UTI justify its use.
 
With subgroup analysis, the effect of the noble metal alloy catheter on CA-UTI reduction was more prominent in long-term users and in female patients. In patients who used both catheters, and thus served as their own controls, a significant reduction (48%, P=0.027) was observed in the SAH group. The same reduction was not observed in those who used only one type of urinary catheter, whose numbers of catheters used and catheter-days were significantly smaller (Tables 4a and 4b). We cannot offer an exact explanation for this observation, but we believe the protective effect of Bactiguard catheters is best seen in patients who require long-term urinary catheterisation. Nonetheless, it must be emphasised that the effect of mixed catheter use is unknown. The reduction in CA-UTI was also slightly more prominent in female patients (rate ratio of CA-UTI episodes per 1000 catheter-days, 0.58; Table 4c). Whether these are genuine and significant findings warrants confirmation in future randomised controlled studies.
 
This study has several limitations. First, it was a non-randomised study, and outcome observers were not blinded. Second, some patients used both catheters, and the effects of each catheter type might have confounded the results. Third, as patients were recruited from a regional rehabilitation hospital, differences in their underlying medical conditions and risk factors might have affected the outcomes. Because patients admitted during the two 6-month periods might not have been comparable, confounding by underlying risk factors for CA-UTI could not be excluded.
 
Conclusions
Our findings suggest that SAH-coated catheters may be effective in reducing CA-UTI based on CDC’s NHSN surveillance definition. The effect seems to be more pronounced in high-risk patients such as long-term users and female patients. Future randomised controlled studies on this subject should be carried out based on these pilot data.
 
Declaration
All authors have disclosed no conflicts of interest.
 
References
1. Kong MY. Systematic review of the effective approach for limiting urinary catheter use and duration to reduce nosocomial catheter-associated urinary tract infections in hospitalized patients. Hong Kong: Faculty of Health and Social Sciences, the Hong Kong Polytechnic University; 2010.
2. Centers for Disease Control. Public health focus: surveillance, prevention, and control of nosocomial infections. MMWR Morb Mortal Wkly Rep 1992;41:783-7.
3. Maki DG, Tambyah PA. Engineering out the risk for infection with urinary catheters. Emerg Infect Dis 2001;7:342-7. Crossref
4. Lederer JW, Jarvis WR, Thomas L, Ritter J. Multicenter cohort study to assess the impact of a silver-alloy and hydrogel-coated urinary catheter on symptomatic catheter-associated urinary tract infections. J Wound Ostomy Continence Nurs 2014;41:473-80. Crossref
5. Division of Healthcare Quality Promotion, Centers for Disease Control and Prevention. The National Healthcare Safety Network manual. Atlanta, GA: Centers for Disease Control and Prevention; 2009.
6. Hooton TM, Bradley SF, Cardenas DD, et al. Diagnosis, prevention, and treatment of catheter-associated urinary tract infection in adults: 2009 International Clinical Practice Guidelines from the Infectious Diseases Society of America. Clin Infect Dis 2010;50:625-63. Crossref
7. Salgado CD, Karchmer TB, Farr BM. Prevention of catheter-associated urinary tract infection. In: Wenzel RP, editor. Prevention and control of nosocomial infections. 4th ed. Philadelphia, PA: Lippincott Williams & Wilkins; 2003: 297-311.
8. Karchmer TB, Giannetta ET, Muto CA, Strain BA, Farr BM. A randomized crossover study of silver-coated urinary catheters in hospitalized patients. Arch Intern Med 2000;160:3294-8. Crossref
9. Gentry H, Cope S. Using silver to reduce catheter-associated urinary tract infections. Nurs Stand 2005;19:51-4. Crossref
10. Newton T, Still JM, Law E. A comparison of the effect of early insertion of standard latex and silver-impregnated latex foley catheters on urinary tract infections in burn patients. Infect Control Hosp Epidemiol 2002;23:217-8. Crossref
11. Gould CV, Umscheid CA, Agarwal RK, Kuntz G, Pegues DA; Healthcare Infection Control Practices Advisory Committee. Guideline for prevention of catheter-associated urinary tract infections 2009. Infect Control Hosp Epidemiol 2010;31:319-26. Crossref
12. Schumm K, Lam TB. Types of urethral catheters for management of short-term voiding problems in hospitalized adults: a short version Cochrane review. Neurourol Urodyn 2008;27:738-46. Crossref
13. Seymour C. Audit of catheter-associated UTI using silver alloy-coated Foley catheters. Br J Nurs 2006;15:598-603. Crossref
14. Rupp ME, Fitzgerald T, Marion N, et al. Effect of silver-coated urinary catheters: efficacy, cost-effectiveness, and antimicrobial resistance. Am J Infect Control 2004;32:445-50. Crossref
15. Verleyen P, De Ridder D, Van Poppel H, Baert L. Clinical application of the Bardex IC Foley catheter. Eur Urol 1999;36:240-6. Crossref
16. Johnson JR, Kuskowski MA, Wilt TJ. Systematic review: antimicrobial urinary catheters to prevent catheter-associated urinary tract infection in hospitalized patients. Ann Intern Med 2006;144:116-26. Crossref
17. Pickard R, Lam T, MacLennan G, et al. Antimicrobial catheters for reduction of symptomatic urinary tract infection in adults requiring short-term catheterisation in hospital: a multicentre randomised controlled trial. Lancet 2012;380:1927-35. Crossref

Outcomes after oesophageal perforation: a retrospective cohort study of patients with different aetiologies

Hong Kong Med J 2017 Jun;23(3):231–8 | Epub 10 Mar 2017
DOI: 10.12809/hkmj164942
© Hong Kong Academy of Medicine. CC BY-NC-ND 4.0
 
ORIGINAL ARTICLE  CME
Outcomes after oesophageal perforation: a retrospective cohort study of patients with different aetiologies
TT Law, FRCSEd, FHKAM (Surgery); Jonathan YL Chan, MB, BS; Desmond KK Chan, FRCSEd, FHKAM (Surgery); Daniel Tong, MS, PhD; Ian YH Wong, FRCSEd, FHKAM (Surgery); Fion SY Chan, FRCSEd, FHKAM (Surgery); Simon Law, MS, FRCSEd
Division of Esophageal and Upper Gastrointestinal Surgery, Department of Surgery, The University of Hong Kong, Queen Mary Hospital, Pokfulam, Hong Kong
 
Corresponding author: Prof Simon Law (slaw@hku.hk)
 
 Full paper in PDF
 
Abstract
Introduction: The mortality rate after oesophageal perforation is high despite advances in operative and non-operative techniques. In this study, we sought to identify risk factors for hospital mortality after oesophageal perforation treatment.
 
Methods: We retrospectively examined patients treated for oesophageal perforation in a university teaching hospital in Hong Kong between January 1997 and December 2013. Their demographic and clinical characteristics, aetiology, management strategies, and outcomes were recorded and analysed.
 
Results: We identified a cohort of 43 patients treated for perforation of the oesophagus (28 men; median age, 66 years; age range, 30-98 years). Perforation was spontaneous in 22 (51.2%) patients (15 with Boerhaave’s syndrome and seven with malignant perforation), iatrogenic in 15 (34.9%), and provoked by foreign body ingestion in six (14.0%). Of the patients, 14 (32.6%) had pre-existing oesophageal disease. Perforation occurred in the intrathoracic oesophagus in 30 (69.8%) patients. Emergent surgery was undertaken in 23 patients: 16 underwent primary repair, six surgical drainage or exclusion, and one oesophagectomy. Twenty patients were managed non-operatively, 13 of whom underwent stenting. Two stented patients subsequently required oesophagectomy. Four patients had clinical signs of leak after primary repair: two were treated conservatively and two required oesophagectomy. Overall, six (14.0%) patients required oesophagectomy, one of whom died. Nine other patients also died in hospital; the hospital mortality rate was 23.3%. Pre-existing pulmonary and hepatic disease, and perforation associated with malignancy were significantly associated with hospital mortality (P=0.03, <0.01, and <0.01, respectively).
 
Conclusions: Most oesophageal perforations were spontaneous. Mortality was substantial despite modern therapies. Presence of pre-existing pulmonary disease, hepatic disease, and perforation associated with malignancy were significantly associated with hospital mortality. Salvage oesophagectomy was successful in selected patients.
 
 
New knowledge added by this study
  • We report the outcomes of a cohort of patients with oesophageal perforation managed in a single centre.
  • Mortality rate was substantial despite advances in surgery and endoscopic therapy.
Implications for clinical practice or policy
  • Surgical and non-operative treatment options are available.
  • The aetiology, timing of presentation, and patients’ co-morbidities should be considered carefully when managing oesophageal perforation.
  • Oesophagectomy may be indicated in selected patients.
 
 
Introduction
Oesophageal perforation is uncommon, yet its management remains a substantial challenge to surgeons. Diagnosis and treatment are often delayed owing to a lack of clinical suspicion and of accurate diagnostic tools; reported mortality rates range from 10% to 25%.1 2 3
 
Oesophageal perforation can occur spontaneously from forceful vomiting (Boerhaave’s syndrome), in the presence of pre-existing pathology (such as oesophageal cancer), or in association with ingestion of a foreign body. Iatrogenic perforation usually occurs after therapeutic endoscopic procedures such as dilatation, and is the predominant cause of perforation reported in many studies.1 2 4 5
 
Diagnosis and treatment within 24 hours of perforation are critical if favourable outcomes are to be achieved.1 6 After diagnosis and initial resuscitation, a wide range of treatment options is available, informed by the presentation, aetiology, location of perforation, and extent of mediastinal or intrathoracic contamination. Surgery remains the mainstay of treatment; the conventional operative approach is primary repair of the perforation site with drainage.7 8 9 Some surgeons advocate primary repair only for patients who present within 24 hours of perforation,10 while others attempt primary repair as the initial treatment regardless of the timing of presentation.9 11 Endoscopic treatment, including stenting, is becoming an increasingly popular means of treating oesophageal perforation in selected patients, and reportedly has a high technical success rate.12 13 14 15 16
 
Oesophageal perforation should be managed in specialised centres. In this study, we report the characteristics, treatment, and outcomes of a cohort of patients with oesophageal perforation treated at a single tertiary centre in Hong Kong over a period of 16 years.
 
Methods
We retrospectively identified patients treated for perforation of the oesophagus at a university teaching hospital in Hong Kong between January 1997 and December 2013. Patients’ demographic characteristics, presentation, investigations, management, and outcomes were recorded.
 
Diagnosis of perforation was confirmed by one or more of the following: oesophagogastroduodenoscopy (OGD), water-soluble contrast swallow study, and contrast-enhanced computed tomography of the neck, thorax, and abdomen. After confirmation of the diagnosis, patients were resuscitated to address homoeostatic and haemodynamic disturbances, followed by definitive treatment. All patients were kept ‘nil by mouth’ and administered parenteral broad-spectrum antibiotics and proton pump inhibitors, and chest drains were inserted if clinically indicated. Patients with significant haemodynamic instability or respiratory distress requiring intubation and mechanical ventilation were admitted to the intensive care unit (ICU) for optimisation before definitive treatment.
 
Definitive treatment depended on the location of the perforation, its aetiology, the extent of mediastinal and intrathoracic contamination, and the patient’s physical status. In general, patients with malignant perforation or perforation contained within the mediastinal pleura were treated non-operatively. For the former, self-expanding metallic stents were inserted under fluoroscopic guidance. In selected patients with a benign cause of perforation and limited contamination, a polyester oesophageal stent (Polyflex; Boston Scientific, Natick [MA], United States) was placed under fluoroscopic guidance. For patients in whom the site of perforation could not be identified, and in the absence of clinical signs of sepsis, a conservative management strategy was adopted. This entailed placement of a nasogastric feeding tube under endoscopic guidance, followed by enteral feeding for 7 days. Thereafter, a water-soluble contrast swallow study was undertaken to confirm the absence of a leak before oral feeding was resumed.
 
When a surgical management strategy was decided, patients with perforation of the intra-abdominal oesophagus were treated with laparotomy, primary repair of the perforation, and feeding jejunostomy. For an intrathoracic perforation with significant contamination of the pleural cavity, thoracotomy and primary repair was the preferred approach. A left-sided thoracotomy was the usual approach for Boerhaave’s perforation of the distal thoracic oesophagus. Necrotic tissue was debrided, the edges of the perforation were trimmed, and the defect was closed with fine sutures in two layers. The mucosal edges of the perforation were approximated using interrupted absorbable sutures, and the muscular defect was approximated using interrupted monofilament absorbable sutures. Lung decortication was performed. One drain was placed in close proximity to the repair, generally accompanied by one basal and one apical large-bore chest drain. Feeding jejunostomy was performed in selected patients. Postoperatively patients remained nil by mouth, and were given nutritional support and intravenous antibiotics. A contrast swallow study was generally performed 7 to 10 days postoperatively; oral intake was commenced if there was no evidence of leak. The choice of antibiotics and duration of treatment were guided by microbiology culture findings.
 
In selected patients who presented late, and in those who developed a persistent leak after primary repair, oesophageal exclusion (cervical oesophagostomy and jejunostomy) followed by second-stage oesophagectomy might be considered. In the first stage, the oesophagus was excluded proximally in the neck with an oesophagostomy, and the abdominal oesophagus was stapled. A drain was placed from the neck into the oesophageal stump for decompression. Oesophagectomy was performed once sepsis had subsided. A gastric tube was used for reconstruction via the retrosternal route, and cervical oesophagogastrostomy was performed.
 
The principles outlined in the Declaration of Helsinki have been followed.
 
Statistical analysis
Continuous data were represented as the median (range), unless otherwise stated. Fisher’s exact test was used to compare categorical variables and the Mann-Whitney U test for continuous variables. We undertook univariate analysis to identify factors associated with hospital mortality. P<0.05 was considered statistically significant. Data were analysed using SPSS 20.0 (IBM Corp, Armonk [NY], United States).
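As a concrete illustration of the univariate analysis described above, Fisher’s exact test for a 2×2 table can be computed directly from the hypergeometric distribution. The sketch below is illustrative only — the authors used SPSS, and the function name and the way the cohort counts are laid out in the table are our own assumptions (5 of 7 malignant perforations died vs 5 of 36 other perforations, taken from the Results).

```python
from math import comb

def fisher_exact_2x2(a, b, c, d):
    """Two-sided Fisher's exact test p-value for the 2x2 table
    [[a, b], [c, d]]: sums hypergeometric probabilities of all
    tables at least as extreme (as or less probable) as observed."""
    row1, col1, n = a + b, a + c, a + b + c + d

    def p_table(x):  # probability of a table with x in the top-left cell
        return comb(col1, x) * comb(n - col1, row1 - x) / comb(n, row1)

    p_obs = p_table(a)
    lo = max(0, row1 - (n - col1))  # smallest feasible top-left count
    hi = min(row1, col1)            # largest feasible top-left count
    return sum(p_table(x) for x in range(lo, hi + 1)
               if p_table(x) <= p_obs * (1 + 1e-9))

# Hospital mortality by malignant-perforation status in this cohort:
# 5 of 7 malignant perforations died vs 5 of 36 other perforations.
p = fisher_exact_2x2(5, 2, 5, 31)
print(round(p, 4))  # ≈ 0.004, consistent with the reported P<0.01
```

The exact test is appropriate here because several cells have expected counts below 5, which invalidates the chi-squared approximation.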
 
Results
During the study period, 43 patients with oesophageal perforation were identified. Patients’ demographic and clinical characteristics are summarised in Table 1. The median age of the cohort was 66 years (range, 30-98 years); 28 (65.1%) were men. Medical co-morbidities were present in 27 (62.8%) patients, and pre-existing oesophageal pathologies were present in 14 (32.6%; of whom half had oesophageal cancer). Spontaneous perforation occurred in 22 (51.2%) patients: 15 occurred as a result of Boerhaave’s syndrome and seven as a result of malignant perforation. Fifteen (34.9%) patients had an iatrogenic perforation: 13 occurred after an endoscopic procedure (three after endoscopic retrograde cholangiopancreatography and 10 after OGD), one occurred after attempted endotracheal intubation, and one occurred during thyroidectomy. Of the 10 OGDs, eight had been therapeutic. Six (14.0%) perforations were associated with ingestion of a foreign body.
 

Table 1. Patients’ clinical and demographic characteristics
 
Chest pain and vomiting were the most common presenting symptoms in patients with spontaneous perforation, occurring in 13 and 10 patients, respectively. Surgical emphysema and dysphagia were the least common presenting signs and symptoms, each present in only two patients. Over half the patients presented and were diagnosed within 24 hours of symptom onset. Of the cohort of 43 patients, 29 (67.4%) underwent two of the three diagnostic imaging modalities. The presenting symptoms and investigations of patients with spontaneous perforation are shown in Table 2.
 

Table 2. Presenting symptoms in patients with spontaneous perforation, and investigations undertaken in all patients
 
The management and outcomes of patients are shown in the Figure. Of the 15 patients with Boerhaave’s perforation, 10 underwent primary repair: four repairs were complicated by a leak and two patients subsequently required oesophagectomy. The remaining five patients were initially treated non-operatively: four underwent endoscopic stent placement and one endoscopic clipping of the perforation. Three patients required subsequent operations: one underwent oesophagectomy, one bypass operation, and one surgical drainage. There were no deaths in the group of patients with Boerhaave’s syndrome.
 

Figure. Outcomes of patients with oesophageal perforation according to treatment algorithm
 
Seven patients had malignant perforation: five were treated with endoscopic placement of a metallic stent. All but one of these procedures were successful; the patient in whom stenting failed underwent oesophagectomy. Five (71.4%) of the seven patients with malignant perforation died during their hospital stay.
 
There were 15 iatrogenic perforations (Fig). Nine of these patients underwent early operative treatment: five primary repair, one exclusion, two drainage, and one oesophagectomy. There were no leaks after primary repair in this group. Six patients were initially treated non-operatively: four with stents, one with a feeding tube, and one was judged unfit for treatment. Two of the six ultimately required surgery: one exclusion and one surgical drainage. Five of the 15 patients with iatrogenic perforation died during their hospital stay, a mortality rate of 33.3%.
 
Six oesophageal perforations were associated with foreign body ingestion (Fig). Three patients were treated non-operatively; of the remainder, one underwent primary repair, one exclusion, and one surgical drainage. None of these patients died during hospitalisation.
 
Overall, 16 of the 43 patients underwent primary repair as the initial treatment, and four (25%) subsequently developed clinical signs of a leak, all after repair of Boerhaave’s perforations. Two required oesophagectomy and two were managed conservatively.
 
Overall, six of the 43 patients underwent oesophagectomy, generally as a salvage treatment due to failure of other treatment modalities. Three patients with Boerhaave’s syndrome required oesophagectomy, two with persistent leak after primary repair and one with a persistent leak after stenting. All had presented >24 hours from symptom onset. One patient with a perforated oesophageal cancer developed a leak after stenting and required oesophagectomy. Two patients with iatrogenic perforation in the presence of caustic strictures underwent oesophagectomy. Only one patient who underwent oesophagectomy died in hospital.
 
Overall, 10 patients died in hospital, with a mortality rate of 23.3%. The 30-day mortality rate was 16.3%. The median length of hospital stay was 36.5 days (range, 6-241 days), and median ICU stay was 6 days (range, 0-71 days).
 
All 10 patients who died had pre-existing oesophageal disease: five had cancer of the oesophagus, one a caustic stricture, and four oesophageal varices secondary to hepatic cirrhosis. Malignant perforation carried a substantially higher mortality rate of 71.4%. Median survival for patients with perforated oesophageal cancer was 28.5 days (range, 13-848 days).
 
The results of univariate analysis of factors potentially associated with hospital mortality are shown in Table 3. The presence of pulmonary disease, hepatic disease (liver cirrhosis), and malignant perforation were significantly associated with hospital mortality (P=0.03, <0.01, and <0.01, respectively), but the site of perforation and timing of presentation were not.
 

Table 3. Univariate analysis of factors associated with hospital mortality
 
Discussion
Oesophageal perforation may be difficult to diagnose. Patients can present with a wide variety of symptoms, which can be non-specific. It is not uncommon for the diagnosis to be missed in the acute phase. Computed tomography imaging (preferably with oral contrast) should be undertaken when the index of clinical suspicion is high, because it allows the site of mediastinal or intra-abdominal collections to be identified and rules out other pathologies. Of note, OGD performed by an experienced endoscopist using minimal insufflation is an effective means of detecting the site and size of perforation, and is reported to have a sensitivity and specificity of 100% and 83% for intrathoracic perforation, respectively.17 A positive OGD therefore has a substantial influence on clinical decision making.
 
Spontaneous perforation was the most common aetiology in our cohort; around one third of these perforations were associated with underlying cancer of the oesophagus. Squamous cell carcinoma remains the most common malignant cell type globally, despite the rising incidence of adenocarcinoma in Western populations. Patients often present at an advanced stage. All but one of the patients with malignant perforation had a squamous cell carcinoma of the intrathoracic oesophagus. Perforation either occurs spontaneously or results from concurrent chemoradiotherapy. Ohtsu et al18 reported a perforation rate of 13.9% (five of 36 patients) in T4-stage cancer of the oesophagus treated with concurrent chemoradiotherapy. In our cohort, perforation occurred shortly after completion of radiotherapy in one patient.
 
The prognosis for patients with perforated oesophageal cancer is poor. The disease is often inoperable and in these circumstances treatment is palliative.19 20 Non-operative treatment, such as insertion of a metallic covered stent, is the usual practice at our centre. Stenting of the intrathoracic portion of the oesophagus is technically straightforward and is successful in most cases. Sealing of the perforation site can be confirmed by a subsequent contrast study, and oral intake can be resumed in the absence of a leak. Nevertheless, the prognosis of this group of patients is poor despite the successful placement of a stent, and the hospital mortality rate remains high. Patients most often succumb as a consequence of sepsis caused by the perforation.
 
Many treatment options are available for non-malignant perforation, and the treatment strategy should be tailored to the individual. Factors to be considered include the site of perforation, extent of contamination, pre-existing oesophageal disease, and patient co-morbidities. Operative treatment is favoured for perforation of the intra-abdominal oesophagus or perforation that involves the oesophagogastric junction (OGJ). These patients often present with abdominal pain and peritonitis. Laparotomy, primary repair of the perforation, and fashioning of a feeding jejunostomy allow alimentation in the event of persistent leak. The placement of an oesophageal stent that crosses the OGJ has a higher chance of migration and is not recommended.
 
The intrathoracic oesophagus is the most common site of perforation. Of the three most common causes (Boerhaave’s syndrome, iatrogenic perforation, and foreign body ingestion), Boerhaave’s syndrome is the most challenging. Traditionally, Boerhaave’s syndrome is associated with a mortality rate of up to 30%.11 Patients may present late, the site of perforation is usually at the distal thoracic oesophagus, and there may be extensive contamination due to the high pressure generated by vomiting. Contamination with food particles is common. Operative treatment with primary closure of the perforation and drainage is favoured by many7 8 9; this is also our preferred approach. Many surgeons advocate primary repair irrespective of the timing of presentation.9 11 21 22 Leak rates after primary repair range from 17% to 32%.9 11 21 22 23 24 Minor leaks can be managed conservatively with drainage, while further surgery (usually exclusion) is required for larger leaks and in the presence of sepsis. Lin et al23 reported that the incidence of postoperative leak was 37.5% in patients in whom treatment was delayed for more than 48 hours, compared with 0% in those who were treated more promptly. Wright et al22 reported that three out of the four leaks in their patient cohort were repaired more than 24 hours after perforation. The incidence of leak after primary repair was 25.0% in our study, which is comparable to other reports in the literature. Of the four leaks, two patients required reoperation and ultimately oesophagectomy; both had presented more than 24 hours after symptom onset.
 
Endoscopic stenting for benign perforation has been reported in several small case series. Freeman et al13 14 reported the outcomes of stent placement in patients with iatrogenic and spontaneous perforation. They proposed a hybrid approach, namely a combination of endoscopic and minimally invasive surgical techniques to drain intrathoracic and/or intra-abdominal collections. The main advantage of this strategy is the avoidance of thoracotomy and/or laparotomy. The incidence of stent migration was approximately 20% in their cohort of patients with spontaneous oesophageal perforation.14 Relative contra-indications to stent insertion include a perforation that crosses the OGJ and circumferential necrosis of the oesophagus. In our experience, operative treatment is recommended for Boerhaave’s syndrome unless the patient is unfit for surgery or declines surgical treatment. Five patients in our series initially treated with stenting subsequently required surgery, of whom four had benign perforations (two Boerhaave’s syndrome and two iatrogenic perforations). One patient in our cohort with oesophageal dissection complicated by perforation underwent stenting in another hospital before transfer to our centre; this patient developed a persistent leak, which the stent appeared to have aggravated, and oesophagectomy was eventually required.25 In our opinion, stent placement in benign perforation is suitable only for selected patients who present early and have minimal contamination. However, stenting may allow more time for optimisation of a patient’s condition if they are initially judged unfit for surgery.
 
Oesophagectomy as a treatment for perforation was first reported in the 1950s.26 Single-stage oesophageal resection and reconstruction was first reported by Hendren and Henderson in 1968.27 Altorjay et al28 reported a hospital mortality rate of 3.7% in a series of patients undergoing oesophagectomy for intrathoracic perforation; in that series, iatrogenic perforation accounted for 55.6% of all perforations. Some surgeons have opined that oesophagectomy may be superior to primary repair in the presence of pre-existing oesophageal disease or of extensive perforation with substantial sepsis, although the patient’s general condition should always be taken into account.28 29 There is no consensus on the optimum surgical approach or the timing of reconstruction after oesophagectomy. We advocate primary repair as the initial treatment irrespective of the timing of presentation, with oesophagectomy reserved as a salvage treatment. In our experience, patients with persistent leak and sepsis after primary repair should undergo oesophageal exclusion to control sepsis before oesophagectomy is contemplated. Oesophagectomy with primary reconstruction can be performed safely after patient optimisation. Oesophagectomy was undertaken in six patients in our cohort; three had pre-existing oesophageal disease. All had a cervical oesophagogastric anastomosis fashioned via the retrosternal route. A cervical anastomosis distant from the infected mediastinum appears to be a safe option.29 Thoracotomy is the most common surgical approach, but Yeo et al30 reported using transhiatal oesophagectomy to treat perforated oesophageal cancer in four patients. The transhiatal approach avoids thoracotomy, but can be considered only for perforations of the distal oesophagus with minimal mediastinal contamination.
 
Oesophageal perforation after foreign body ingestion in adults is more common in China as a result of local dietary habits; the foreign body is usually a fish, chicken, or pork bone. An impacted foreign body can usually be retrieved endoscopically; however, perforation can occur if the foreign body penetrates deeply or there is extensive manipulation during retrieval. The site of perforation is usually the cervical oesophagus, followed by the intrathoracic oesophagus. In severe cases, operative management is indicated; the approach depends on the site of perforation and the site and size of any collection. The aims of management are to drain any collection, remove any residual foreign body, repair the perforation, and protect the airway. In the absence of sepsis or imaging evidence of a peri-oesophageal collection, conservative treatment may be warranted. Operative drainage may be necessary if there is a sizeable collection or sepsis. Mediastinitis and sepsis are more likely after intrathoracic perforation and would dictate the treatment strategy.
 
It is essential to identify factors associated with mortality after oesophageal perforation so as to improve treatment and outcomes. Early diagnosis and management (in the ‘golden 24 hours’) are reportedly associated with superior outcomes.1 6 Malignant perforation, sepsis, the need for mechanical ventilation on presentation, and pulmonary co-morbidity are reported to have a significant impact on overall survival.5 In our cohort, pulmonary co-morbidity, hepatic disease, and malignant perforation were associated with risk of death. A recent meta-analysis of 75 studies that included 2971 patients reported a pooled mortality rate of 11.9% (95% confidence interval, 9.7%-14.3%).3 Of the different aetiologies, spontaneous perforation had the highest mortality rate of 14.8%.3
 
Oesophageal perforation remains a difficult condition to treat despite advances in surgery, endoscopic treatment, and ICU care. The mortality rate is still substantial with modern therapies. The presence of pre-existing pulmonary disease, hepatic disease, and perforation associated with malignancy was significantly associated with hospital mortality in our cohort. Oesophagectomy for salvage had a reasonable success rate in selected patients.
 
Declaration
The authors have disclosed no conflicts of interest.
 
References
1. Vallböhmer D, Hölscher AH, Hölscher M, et al. Options in the management of esophageal perforation: analysis over a 12-year period. Dis Esophagus 2010;23:185-90. Crossref
2. Søreide JA, Konradsson A, Sandvik OM, Øvrebø K, Viste A. Esophageal perforation: clinical patterns and outcomes from a patient cohort of Western Norway. Dig Surg 2012;29:494-502. Crossref
3. Biancari F, D’Andrea V, Paone R, et al. Current treatment and outcome of esophageal perforations in adults: systematic review and meta-analysis of 75 studies. World J Surg 2013;37:1051-9. Crossref
4. Abbas G, Schuchert MJ, Pettiford BL, et al. Contemporaneous management of esophageal perforation. Surgery 2009;146:749-55. Crossref
5. Bhatia P, Fortin D, Inculet RI, Malthaner RA. Current concepts in the management of esophageal perforations: a twenty-seven year Canadian experience. Ann Thorac Surg 2011;92:209-15. Crossref
6. Shaker H, Elsayed H, Whittle I, Hussein S, Shackcloth M. The influence of the ‘golden 24-h rule’ on the prognosis of oesophageal perforation in the modern era. Eur J Cardiothorac Surg 2010;38:216-22. Crossref
A descriptive study of Lewy body dementia with functional imaging support in a Chinese population: a preliminary study

Hong Kong Med J 2017 Jun;23(3):222–30 | Epub 5 May 2017
DOI: 10.12809/hkmj166023
© Hong Kong Academy of Medicine. CC BY-NC-ND 4.0
 
ORIGINAL ARTICLE
A descriptive study of Lewy body dementia with functional imaging support in a Chinese population: a preliminary study
YF Shea, MRCP(UK), FHKAM (Medicine); LW Chu, MD, FRCP; SC Lee, BHS(Nursing)
Geriatrics Division, Department of Medicine, Queen Mary Hospital, Pokfulam, Hong Kong
 
Corresponding author: Dr YF Shea (elphashea@gmail.com)
 
 
Abstract
Introduction: Lewy body dementia includes dementia with Lewy bodies and Parkinson’s disease dementia. There have been limited clinical studies among Chinese patients with Lewy body dementia. This study aimed to review the presenting clinical features and identify risk factors for complications including falls, dysphagia, aspiration pneumonia, pressure sores, and mortality in Chinese patients with Lewy body dementia. We also wished to identify any difference in clinical features of patients with Lewy body dementia with and without an Alzheimer’s disease pattern of functional imaging.
 
Methods: We retrospectively reviewed 23 patients with Lewy body dementia supported by functional imaging. Baseline demographics, presenting clinical and behavioural and psychological symptoms of dementia, functional and cognitive assessment scores, and complications during follow-up were reviewed. Patients with Lewy body dementia were further classified as having an Alzheimer’s disease imaging pattern if functional imaging demonstrated bilateral temporoparietal hypometabolism or hypoperfusion with or without precuneus and posterior cingulate gyrus hypometabolism or hypoperfusion.
 
Results: The pre-imaging accuracy of clinical diagnosis was 52%. Behavioural and psychological symptoms of dementia were evident in 83% of patients. Falls, dysphagia, aspiration pneumonia, pressure sores, and death occurred in 70%, 52%, 26%, 26%, and 30% of patients, respectively, with corresponding event rates per person-year of 0.32, 0.17, 0.18, 0.08, and 0.10. Patients with aspiration pneumonia were more likely than those without to have dysphagia (100% vs 35%; P=0.01). Deceased patients with Lewy body dementia, compared with surviving patients, had a higher presenting Clinical Dementia Rating score (median [interquartile range]: 1 [1-2] vs 0.5 [0.5-1.0]; P=0.01), a lower mean (± standard deviation) baseline Barthel index (13 ± 7 vs 18 ± 4; P=0.04), and were more likely to have been prescribed levodopa (86% vs 31%; P=0.03). Patients with Lewy body dementia and an Alzheimer’s disease pattern of functional imaging, compared with those without the pattern, were younger at presentation (mean ± standard deviation: 73 ± 6 vs 80 ± 6 years; P=0.02) and had a lower Mini-Mental State Examination score at 1 year (15 ± 8 vs 22 ± 6; P=0.05).
 
Conclusions: Falls, dysphagia, aspiration pneumonia, and pressure sores were common among patients with Lewy body dementia. Those with an Alzheimer’s disease pattern of functional imaging had a younger age of onset and lower 1-year Mini-Mental State Examination score.
 
 
New knowledge added by this study
  • Behavioural and psychological symptoms of dementia were present in 83% of patients with Lewy body dementia (LBD).
  • Falls, dysphagia, aspiration pneumonia, and pressure sores were common complications in LBD patients.
  • Chinese LBD patients with an Alzheimer’s disease pattern of functional imaging had a younger age of onset and lower 1-year Mini-Mental State Examination score.
Implications for clinical practice or policy
  • Such information is useful in the formulation of a management plan, including advance care planning, for Chinese LBD patients.
 
 
Introduction
Lewy body dementia (LBD) includes dementia with Lewy bodies (DLB) and Parkinson’s disease dementia (PDD).1 Pathological hallmarks of LBD include α-synuclein neuronal inclusions (Lewy bodies and Lewy neurites) with subsequent neuronal loss.1 The difference between DLB and PDD lies in the sequence of onset of dementia and parkinsonism, although syndromes and pathological changes become similar with progression.1 2 Thus, they are regarded as a continuum instead of two separate entities. In western studies, approximately 10% to 15% of patients with dementia had DLB.1 In contrast, only 3% of 1532 patients with dementia followed up at the memory clinic of Queen Mary Hospital in Hong Kong had LBD (unpublished data). It is likely that LBD remains under-recognised among the Chinese population.
 
Compared with autopsy, the sensitivity and specificity of clinical diagnosis of DLB have been reported to be 32% and 95%, respectively.1 In our memory clinic, 50% of DLB patients were initially misdiagnosed, most often (in 75% of cases) as Alzheimer’s disease (AD).3 There has been only one case series of 35 Chinese DLB patients, and diagnosis was based mainly on clinical criteria.4 The presence of occipital hypometabolism on [18F]-2-fluoro-2-deoxy-D-glucose positron emission tomography (18FDG-PET) has a sensitivity of 90% and specificity of 71% to 80% in differentiating AD and DLB patients.5
 
Pathologies of AD are common in DLB patients; 35% of patients with Parkinson’s disease fulfil the pathological diagnostic criteria of AD, while deposition of amyloid plaques is present in approximately 85% of DLB patients.6 A meta-analysis revealed a positive amyloid scan in 57% and 35% of patients with DLB and PDD, respectively.5 A higher cortical amyloid burden has been associated with greater cortical and medial temporal lobe atrophy in LBD patients.6 Significant cortical amyloid burden may accelerate the cognitive decline in LBD patients, suggesting the possibility of a synergistic contribution of AD pathologies to LBD dementia.6 On 18FDG-PET, apart from bilateral occipital hypometabolism, some LBD patients also demonstrate an AD pattern of hypometabolism, ie temporoparietal, posterior cingulate gyrus or precuneus hypometabolism (Fig).7 In a recent study comparing 12 patients with DLB and an AD pattern of hypometabolism on 18FDG-PET with 11 patients with DLB and no AD pattern of hypometabolism, the former had a higher prevalence of visual hallucinations and extracampine hallucination.7 As far as we are aware, an AD pattern of functional imaging has not been studied in Chinese patients with DLB.
 
In contrast with AD, the clinical features of DLB are unfamiliar to the general public. In a recent study that involved 125 carers of LBD patients, 82% to 96% expressed a wish to have information and support about visual hallucinations, changes in the brain and the body, and ways to cope with behavioural changes.8 Unfortunately, clinical studies of LBD among the Chinese population are limited and none has examined the long-term outcomes of LBD, including falls, dysphagia, aspiration pneumonia, pressure sores, mortality, and behavioural and psychological symptoms of dementia (BPSD).
 
Based on a review of the clinical records of all LBD patients followed up in our memory clinic, the current study aimed to review the presenting clinical features and identify risk factors for long-term outcomes including falls, dysphagia, aspiration pneumonia, pressure sores, and mortality in Chinese patients with LBD. It was hoped that this would provide useful clinical information for carers of such patients. We also wished to identify any difference in clinical features of LBD patients with and without an AD pattern of functional imaging. We hypothesised that LBD patients with an AD pattern of functional imaging would have a younger age at presentation or diagnosis due to concomitant AD pathologies.
 
Methods
This was a retrospective case series of Chinese LBD patients. The case records of Chinese patients with LBD who attended a memory clinic at Queen Mary Hospital between 1 January 2007 and 31 December 2015 were reviewed. This study was done in accordance with the principles outlined in the Declaration of Helsinki. Probable DLB was diagnosed according to the McKeith criteria2 (Table 11 2). Probable PDD was diagnosed when the patient met the Queen Square Brain Bank criteria for Parkinson’s disease and dementia developed in the context of established Parkinson’s disease, with cognitive impairment in more than one domain that was severe enough to impair daily life (Table 11 2). The differentiation between DLB and PDD was based on the temporal sequence of symptoms—for DLB, dementia developed before or within 1 year of parkinsonism (ie the clinical syndrome characterised by tremor, bradykinesia, rigidity, and postural instability); for PDD, dementia developed more than 1 year after an established diagnosis of Parkinson’s disease.1 All patients underwent functional imaging in the form of 18FDG-PET or technetium-99m hexamethylpropylene amine oxime single-photon emission computed tomography (SPECT), demonstrating hypometabolism or hypoperfusion of the occipital lobes, respectively. Data on baseline demographics, baseline and first-year Mini-Mental State Examination (MMSE) score,9 Clinical Dementia Rating (CDR) score,10 age-adjusted Charlson Comorbidity Index,11 baseline Neuropsychiatric Inventory (NPI) score,12 baseline Barthel index–20,13 presenting cognitive symptoms, and BPSD were derived from clinical records. ‘Time to diagnosis’ was defined as the difference between the date of first presentation to the memory clinic and the date of first diagnosis of LBD.
Patients with DLB were further classified as having an ‘AD imaging pattern’ if the functional imaging demonstrated bilateral temporoparietal hypometabolism or hypoperfusion with or without precuneus and posterior cingulate gyrus hypometabolism or hypoperfusion (Fig).7 14
 

Table 1. Diagnostic criteria for DLB and PDD1 2
 

Figure. 18FDG-PET brain imaging of patients with LBD
(a) A patient with an AD pattern of hypometabolism over the bilateral temporoparietal and occipital lobes and posterior cingulate gyrus (arrows). (b) A patient without an AD pattern of hypometabolism; hypometabolism occurred mainly over the bilateral occipital lobes, with mild hypometabolism over the bilateral parietal lobes (arrows)
 
Clinical outcomes including falls, dysphagia, aspiration pneumonia, development of pressure sores, and mortality were traced. For patients with a history of falls, geriatric day hospital (GDH) training was traced including the pre-/post-training Elderly Mobility Scale15 and Berg Balance Scale.16 Parkinsonism medication was often titrated at the GDH. Dysphagia was further subclassified as oral or pharyngeal according to the clinical assessment by a speech therapist (ST) or, if available, a video fluoroscopic swallowing study (VFSS).17 Penetration was defined as barium material entering the airway but not passing below the vocal cords; aspiration was defined as barium material passing below the level of the vocal cords.18 The locations of pressure sores and staging, according to National Pressure Ulcer Advisory Panel,19 were recorded. ‘Time to event’ was defined as the difference between the date of diagnosis and first appearance of these events.
 
Statistical analyses
Parametric variables were expressed as mean ± standard deviation and non-parametric variables as median (interquartile range [IQR]). Descriptive statistics were used to express the frequency of defining features of LBD. The Chi squared test or Fisher’s exact test was used to compare categorical variables. The independent-samples t test or Mann-Whitney U test was used to compare continuous variables, as appropriate. Statistical significance was inferred by a two-tailed P value of <0.05. All statistical analyses were carried out using the Statistical Package for the Social Sciences (Windows version 18.0; SPSS Inc, Chicago [IL], United States).
 
Results
Baseline demographics and clinical characteristics
There were 23 patients with LBD (16 with DLB and 7 with PDD). The mean age at presentation was 76 ± 7 years and the mean MMSE score at presentation was 19 ± 7, with a total duration of follow-up of 72 patient-years (mean follow-up, 1138 ± 698 days). The baseline demographics of the patients are summarised in Table 2. There was no statistically significant difference in baseline demographics between DLB and PDD patients. The time to diagnosis appeared longer for PDD patients, but the difference was not statistically significant, possibly because of the very small numbers in the two groups. The overall accuracy of diagnosis was 52%. Six (38%) of the 16 DLB patients were initially misdiagnosed as AD. The frequency of defining clinical characteristics of LBD among DLB and PDD patients is summarised in Table 3. There were no statistically significant differences (results not shown). Of note, 69% of DLB patients presented with parkinsonism and 74% of LBD patients had vivid visual hallucinations. Information about the content of visual hallucinations was available for 14 of 17 patients: 50% (n=7) involved persons, 7% (n=1) involved objects, 21% (n=3) a combination of persons and animals, 7% (n=1) a combination of insects and animals, 7% (n=1) a combination of insects and objects, and 7% (n=1) a combination of animals and objects. An NPI score was available for 78% (18/23) at baseline, of whom 83% (15/18) had a score of ≥1, and 78% (14/18) had at least one NPI subcategory rated as severe, ie ≥4. The three most common BPSD as indicated by an NPI score of ≥1 were visual hallucination (56%), anxiety (50%), and apathy (50%). There was no significant difference in BPSD in terms of NPI score for DLB and PDD patients (results not shown).
 

Table 2. Baseline demographics for patients with LBD
 

Table 3. Frequency of defining clinical characteristics of Lewy body dementia at the time of initial presentation
 
Falls
Of the patients, 16 (70%) had a total of 23 falls (all non-syncopal), with four complicated by bone fracture and two associated with intracerebral haemorrhage. The event rate was 0.32 per person-year. Ten patients underwent GDH training after their fall(s). The median time to first GDH training (from the time of diagnosis) was 56 (IQR, 56-663) days, with a mean time of 93 ± 44 days. Paired-samples t test did not identify any significant pre-/post-training difference in Elderly Mobility Scale15 or Berg Balance Scale16 scores (10 ± 4 vs 11 ± 5, P=0.81 and 25 ± 13 vs 25 ± 14, P=1.0, respectively). Comparison of LBD patients with and without a fall history revealed no significant difference in clinical features (including visual hallucination, parkinsonism, and fluctuation of consciousness) or medication (including benzodiazepines or antipsychotics) [results not shown]. Parkinsonism was numerically more prevalent among fallers (88% vs 57%; P=0.14).
 
Dysphagia and aspiration pneumonia
Of the patients, six (26%) had a total of 13 episodes of pneumonia, and 12 (52%) had dysphagia. The mean time to first episode of aspiration pneumonia was 866 ± 689 days (median, 634; IQR, 315-1456 days; n=6) and to first diagnosis of dysphagia 951 ± 734 days (median, 862; IQR, 311-1624 days; n=12). Five patients underwent VFSS (time to VFSS: 831 ± 728 days; median, 723; IQR, 154-1562 days). Three of four patients with penetration developed aspiration pneumonia; both patients with aspiration on VFSS developed aspiration pneumonia. The event rates were 0.17 and 0.18 per person-year for dysphagia and aspiration pneumonia, respectively. Patients with LBD with a history of aspiration pneumonia, compared with those without, were more likely to have been identified by the ST to have dysphagia (100% vs 35%; P=0.01), oral dysphagia (83% vs 29%; P=0.04), pharyngeal dysphagia (83% vs 29%; P=0.04), or dysphagia that required Ryle’s tube insertion (67% vs 12%; P=0.02) [Table 4].
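The dysphagia comparison above (100% vs 35%; P=0.01) can be reproduced from the reported counts: all 6 patients with aspiration pneumonia had dysphagia, versus 6 of the 17 without (35%). Below is a stdlib-only sketch of the two-sided Fisher's exact test; the authors' analysis used SPSS, and this implementation (summing all tables no more probable than the observed one) is one common definition of the two-sided p value:

```python
from math import comb

def fisher_exact_two_sided(table):
    """Two-sided Fisher's exact test for a 2x2 table: sum the
    hypergeometric probabilities of every table with the same margins
    that is no more probable than the observed one."""
    (a, b), (c, d) = table
    row1 = a + b                 # group 1 size (aspiration pneumonia)
    col1 = a + c                 # total with the feature (dysphagia)
    n = a + b + c + d
    def prob(x):                 # P(top-left cell == x), margins fixed
        return comb(col1, x) * comb(n - col1, row1 - x) / comb(n, row1)
    lo = max(0, row1 - (n - col1))
    hi = min(row1, col1)
    p_obs = prob(a)
    return sum(prob(x) for x in range(lo, hi + 1)
               if prob(x) <= p_obs * (1 + 1e-9))

# Counts derived from the reported percentages: 6/6 pneumonia patients
# with dysphagia vs 6/17 (35%) of those without pneumonia.
p = fisher_exact_two_sided([[6, 0], [6, 11]])
print(round(p, 2))  # 0.01, in line with the reported P=0.01
```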
 

Table 4. Comparison of clinical features in LBD patients with and without aspiration pneumonia
 
Pressure sores
Six (26%) patients developed pressure sores: four over the sacrum and two over the heel (two in stage 1, two in stage 3, and two in stage 4). The mean time to development of a pressure sore was 978 ± 599 days (median, 994; IQR, 379-1528 days). The event rate was 0.08 per person-year. A comparison between LBD patients with and without pressure sores did not identify any significant difference in clinical features (results not shown).
 
Mortality
Seven (30%) patients died of various causes: three (43%) of pneumonia, two (29%) of unexplained cardiac arrest (UCA), and one (14%) each of pressure sore sepsis and choking. The mean time to death was 894 ± 617 days (median, 798; IQR, 312-1597 days). The event rate was 0.10 per person-year. Deceased patients with LBD, compared with surviving patients, had a higher presenting CDR score (median [IQR]: 1 [1-2] vs 0.5 [0.5-1.0]; P=0.01), a lower mean baseline Barthel index (13 ± 7 vs 18 ± 4; P=0.04), and were more likely to have been prescribed levodopa (86% vs 31%; P=0.03).
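The crude event rates quoted throughout the Results are simply the number of events divided by the 72 patient-years of total follow-up; a minimal sketch using the reported counts:

```python
# Crude event rate per person-year = number of events / total follow-up.
# Event counts and the 72 patient-years of follow-up are taken from the
# Results section above.
follow_up_years = 72
events = {
    "falls": 23,                 # 23 falls in 16 patients
    "dysphagia": 12,             # 12 patients diagnosed with dysphagia
    "aspiration pneumonia": 13,  # 13 episodes in 6 patients
    "pressure sores": 6,
    "deaths": 7,
}
rates = {name: round(count / follow_up_years, 2)
         for name, count in events.items()}
print(rates)
# {'falls': 0.32, 'dysphagia': 0.17, 'aspiration pneumonia': 0.18,
#  'pressure sores': 0.08, 'deaths': 0.1}
```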
 
Alzheimer’s disease pattern of functional imaging
Of the patients, 19 underwent 18FDG-PET and four underwent perfusion SPECT imaging. Twelve patients (9 DLB and 3 PDD) had an AD pattern of functional imaging: all 12 had bilateral temporoparietal lobe hypometabolism/hypoperfusion; three patients concomitantly had hypoperfusion/hypometabolism over the posterior cingulate gyrus and one patient had additional hypometabolism over the precuneus. Patients with LBD with an AD pattern of functional imaging compared with those without were younger at presentation (73 ± 6 vs 80 ± 6 years; P=0.02) and had a lower MMSE score at 1 year (15 ± 8 vs 22 ± 6; P=0.05). There was no difference in the presentation of visual hallucination between the two groups of patients (Table 5).
 

Table 5. Comparison of clinical features in LBD patients with and without Alzheimer’s disease pattern of hypoperfusion/hypometabolism on SPECT or 18FDG-PET
 
Discussion
An accurate diagnosis of LBD is important. It allows prescription of acetylcholinesterase inhibitors or avoidance of antipsychotics in view of the risk of neuroleptic sensitivity. The overall accuracy of clinical diagnosis of 52% in our study is similar to findings in western studies, eg 62% by a neurologist.20 Our results show that no single diagnostic criterion or test is infallible; the diagnosis requires review at follow-up and support from functional imaging when appropriate. In our series, 38% of DLB patients were initially misdiagnosed as AD. This is thought to have been related to the presence of amnesia in all patients at initial presentation: only 69% of DLB patients presented with parkinsonism. There was no difference in the clinical features of DLB and PDD patients. Braak et al21 have proposed a pathological staging of Parkinson’s disease in which Lewy body pathology starts in the dorsal IX/X motor nucleus or adjoining intermediate reticular zone and spreads rostrally in the brainstem, then to the limbic system, and subsequently to the neocortex; the proposed underlying mechanism is the cell-to-cell spread of α-synuclein. This might explain why DLB and PDD can progress and later overlap clinically.
 
Compared with a previous literature review of Chinese LBD case reports (1980-2012) by Han et al,4 which included 31 DLB and four PDD patients with a younger mean age of onset (67 ± 10 years), more patients in our series presented with cognitive decline (100% vs 60%), parkinsonism (78% vs 9%), visual hallucinations (74% vs 9%), and rapid eye movement sleep behaviour disorder (48% vs 11%). These differences may be related to the heightened awareness of the core features of LBD among Chinese doctors in recent years, or because patients in our series presented at a more advanced stage of dementia (Han et al4 did not state the severity of dementia). Nonetheless, both case series reported similar rates of postural hypotension (9% vs 3%) and BPSD (83% vs 86%).4 The rate of postural hypotension in both case series is much lower than the 50% reported in other literature on DLB patients,22 which may reflect under-recognition. Similar to our findings, in a study of 22 Chinese DLB patients (mean age, 74 ± 8 years; mean MMSE score, 16 ± 7; mean NPI score, 24 ± 16), the three most commonly observed BPSDs were visual hallucinations (86%), delusions (64%), and anxiety (59%); total NPI score was an independent predictor of caregiver burden (odds ratio=1.537; P=0.048).22 Clinicians should pay particular attention to BPSD, particularly visual hallucinations and anxiety symptoms, when managing Chinese LBD patients.
 
In addition, falls, dysphagia, and pressure sores can contribute to carer stress, but they were not examined in previous studies of LBD patients.23 24 Nearly 70% of our LBD patients experienced falls, a rate greater than the previously reported 11% to 44%.25 26 Although visuospatial impairment, cognitive fluctuation, parkinsonism, and visual hallucinations have been proposed as possible mechanisms contributing to falls, only parkinsonism was identified as an independent risk factor in one study of 51 AD and 27 DLB patients.25 26 Because of the limited sample size, no significant risk factors could be identified in our series. With regard to the limited improvement in mobility after GDH training, it is likely that many factors affect the mobility of LBD patients, including dementia, postural hypotension, and poor balance from the disease. Clinicians should alert carers to the risk of falls and offer advice about general measures for falls prevention, including addressing environmental risk factors and use of safety alarms. Compared with the finding of Londos et al17 that 29% of 82 LBD patients (median age, 77 years; median MMSE score, 20) had dysphagia on VFSS, we reported a higher rate of 52%. When dysphagia is identified by STs, LBD patients and their carers should be given advice about diet modification (eg use of thickeners) and postural changes (eg chin tuck). In addition, clinicians should titrate the levodopa dose as far as possible.27
 
Given that LBD is an irreversible neurodegenerative disease, advance care planning (ACP) forms a major part of care.28 Our data on dysphagia, aspiration pneumonia, pressure sores, and mortality can offer useful information during ACP for Chinese LBD patients. Since deceased patients had a higher presenting CDR score, a lower Barthel index, and greater usage of levodopa (which probably reflects more severe parkinsonism), ACP should be initiated earlier in LBD patients with these features. It has been reported that LBD patients can experience UCA, and in our series two (29%) patients died of UCA. Unexplained cardiac arrest is proposed to be related to pathological involvement of the intermediolateral columns of the spinal cord, autonomic ganglia, and sympathetic neurons, affecting either respiration or heart rhythm.29 Such risk of UCA should be explained during ACP. Presence of hypometabolism/hypoperfusion over the temporoparietal lobes/precuneus/posterior cingulate gyrus was used as a surrogate marker of concomitant AD pathologies in our LBD patients. As far as we are aware, this is the first study to show that concomitant AD pathologies among Chinese LBD patients can result in an earlier age at presentation or diagnosis and a lower MMSE score at 1 year. Our findings provide further evidence of the synergistic contribution of AD pathologies to LBD dementia.6
 
This study has several limitations. It was a single-centre retrospective case series; therefore, clinical features were considered present only when clearly stated as such in the records, which might have affected our results. Pathological diagnosis, including pathological proof of concomitant AD pathologies, was not obtained. Since all subjects were recruited from the memory clinic, LBD patients who present to a psychiatric clinic may differ, eg with more BPSD or visual hallucinations. The severity of parkinsonism was not graded, so its influence on long-term outcomes such as falls or aspiration pneumonia was not fully analysed. Although our case series comprised the largest number of Chinese patients with LBD supported by functional imaging, the number remained limited. Our findings should be confirmed by a larger study with Pittsburgh compound B imaging to delineate the concomitant presence of amyloid plaques.
 
Conclusions
Falls, dysphagia, aspiration pneumonia, and pressure sores were common among LBD patients. Lewy body dementia patients with an AD pattern of neuroimaging had an earlier age of diagnosis or presentation and lower 1-year MMSE scores. Such information is useful in the formulation of a management plan for Chinese LBD patients.
 
Declaration
All authors have disclosed no conflicts of interest.
 
References
1. Walker Z, Possin KL, Boeve BF, Aarsland D. Lewy body dementias. Lancet 2015;386:1683-97. Crossref
2. McKeith IG, Dickson DW, Lowe J, et al. Diagnosis and management of dementia with Lewy bodies: third report of the DLB consortium. Neurology 2005;65:1863-72. Crossref
3. Shea YF, Ha J, Lee SC, Chu LW. Impact of 18FDG PET and 11C-PIB PET brain imaging on the diagnosis of Alzheimer’s disease and other dementias in a regional memory clinic in Hong Kong. Hong Kong Med J 2016;22:327-33.
4. Han D, Wang Q, Gao Z, Chen T, Wang Z. Clinical features of dementia with lewy bodies in 35 Chinese patients. Transl Neurodegener 2014;3:1. Crossref
5. Valkanova V, Ebmeier KP. Neuroimaging in dementia. Maturitas 2014;79:202-8. Crossref
6. Gomperts SN. Imaging the role of amyloid in PD dementia and dementia with Lewy bodies. Curr Neurol Neurosci Rep 2014;14:472. Crossref
7. Chiba Y, Fujishiro H, Ota K, et al. Clinical profiles of dementia with Lewy bodies with and without Alzheimer’s disease-like hypometabolism. Int J Geriatr Psychiatry 2015;30:316-23. Crossref
8. Killen A, Flynn D, De Brún A, et al. Support and information needs following a diagnosis of dementia with Lewy bodies. Int Psychogeriatr 2016;28:495-501. Crossref
9. Chiu FK, Lee HC, Chung WS, Kwong PK. Reliability and validity of the Cantonese version of Mini-Mental State Examination: a preliminary study. J Hong Kong Coll Psychiatr 1994;4(2 Suppl):S25-8.
10. Morris JC. The Clinical Dementia Rating (CDR): current version and scoring rules. Neurology 1993;43:2412-4. Crossref
11. Charlson ME, Pompei P, Ales KL, MacKenzie CR. A new method of classifying prognostic comorbidity in longitudinal studies: development and validation. J Chronic Dis 1987;40:373-83. Crossref
12. Cummings JL, Mega M, Gray K, Rosenberg-Thompson S, Carusi DA, Gornbein J. The Neuropsychiatric Inventory: comprehensive assessment of psychopathology in dementia. Neurology 1994;44:2308-14. Crossref
13. Collin C, Wade DT, Davies S, Horne V. The Barthel ADL Index: a reliability study. Int Disabil Stud 1988;10:61-3. Crossref
14. McKhann GM, Knopman DS, Chertkow H, et al. The diagnosis of dementia due to Alzheimer’s disease: recommendations from the National Institute on Aging-Alzheimer’s Association workgroups on diagnostic guidelines for Alzheimer’s disease. Alzheimers Dement 2011;7:263-9. Crossref
15. Smith R. Validation and reliability of the Elderly Mobility Scale. Physiotherapy 1994;80:744-7. Crossref
16. Berg KO, Wood-Dauphinee SL, Williams JI, Maki B. Measuring balance in the elderly: validation of an instrument. Can J Public Health 1992;83 Suppl 2:S7-11.
17. Londos E, Hanxsson O, Alm Hirsch I, Janneskog A, Bülow M, Palmqvist S. Dysphagia in Lewy body dementia—a clinical observational study of swallowing function by videofluoroscopic examination. BMC Neurol 2013;13:140. Crossref
18. Rosenbek JC, Robbins JA, Roecker EB, Coyle JL, Wood JL. A penetration-aspiration scale. Dysphagia 1996;11:93-8. Crossref
19. National Pressure Ulcer Advisory Panel. NPUAP pressure injury stages. Available from: http://www.npuap.org/resources/educational-and-clinical-resources/npuap-pressure-injury-stages/. Accessed 7 May 2016.
20. Galvin JE. Improving the clinical detection of Lewy body dementia with the Lewy body composite risk score. Alzheimers Dement (Amst) 2015;1:316-24. Crossref
21. Braak H, Del Tredici K, Rüb U, de Vos RA, Jansen Steur EN, Braak E. Staging of brain pathology related to sporadic Parkinson’s disease. Neurobiol Aging 2003;24:197-211. Crossref
22. Takemoto M, Sato K, Hatanaka N, et al. Different clinical and neuroimaging characteristics in early stage Parkinson’s disease with dementia and dementia with Lewy bodies. J Alzheimers Dis 2016;52:205-11. Crossref
23. Liu S, Jin Y, Shi Z, et al. The effects of behavioral and psychological symptoms on caregiver burden in frontotemporal dementia, Lewy body dementia, and Alzheimer’s disease: clinical experience in China. Aging Ment Health 2016 Feb 16:1-7. Epub ahead of print. Crossref
24. Leggett AN, Zarit S, Taylor A, Galvin JE. Stress and burden among caregivers of patients with Lewy body dementia. Gerontologist 2011;51:76-85. Crossref
25. Imamura T, Hirono N, Hashimoto M, et al. Fall-related injuries in dementia with Lewy bodies (DLB) and Alzheimer’s disease. Eur J Neurol 2000;7:77-9. Crossref
26. Kudo Y, Imamura T, Sato A, Endo N. Risk factors for falls in community-dwelling patients with Alzheimer’s disease and dementia with Lewy bodies: walking with visuocognitive impairment may cause a fall. Dement Geriatr Cogn Disord 2009;27:139-46. Crossref
27. Alagiakrishnan K, Bhanji RA, Kurian M. Evaluation and management of oropharyngeal dysphagia in different types of dementia: a systematic review. Arch Gerontol Geriatr 2013;56:1-9. Crossref
28. Jethwa KD, Onalaja O. Advance care planning and palliative medicine in advanced dementia: a literature review. BJPsych Bull 2015;39:74-8. Crossref
29. Molenaar JP, Wilbers J, Aerts MB, et al. Sudden death: an uncommon occurrence in dementia with Lewy bodies. J Parkinsons Dis 2016;6:53-5. Crossref

Improving medication safety and diabetes management in Hong Kong: a multidisciplinary approach

Hong Kong Med J 2017 Apr;23(2):158–67 | Epub 17 Mar 2017
DOI: 10.12809/hkmj165014
© Hong Kong Academy of Medicine. CC BY-NC-ND 4.0
 
ORIGINAL ARTICLE  CME
Improving medication safety and diabetes management in Hong Kong: a multidisciplinary approach
Agnes YS Chung, BPharm1; Shweta Anand, BDS1; Ian CK Wong, PhD2; Kathryn CB Tan, MD, MBBCH3; Christine FF Wong, PharmD4; William CM Chui, MSc4; Esther W Chan, PhD1
1 Department of Pharmacology and Pharmacy, Li Ka Shing Faculty of Medicine, The University of Hong Kong, Pokfulam, Hong Kong
2 Research Department of Practice and Policy, School of Pharmacy, University College London, United Kingdom
3 Department of Medicine, Queen Mary Hospital, Pokfulam, Hong Kong
4 Department of Pharmacy, Queen Mary Hospital, Pokfulam, Hong Kong
 
Corresponding author: Dr Esther W Chan (ewchan@hku.hk)
 
 
Abstract
Introduction: Patients with diabetes often require complex medication regimens. The positive impact of pharmacists on improving diabetes management or its co-morbidities has been recognised worldwide. This study aimed to characterise drug-related problems among diabetic patients in Hong Kong and their clinical significance, and to explore the role of pharmacists in the multidisciplinary diabetes management team by evaluating the outcome of their clinical interventions.
 
Methods: An observational study was conducted at the Diabetes Clinic of a public hospital in Hong Kong from October 2012 to March 2014. Following weekly screening, and prior to the doctor’s consultation, selected high-risk patients were interviewed by a pharmacist for medication reconciliation and review. Drug-related problems were identified and documented by the pharmacist who presented clinical recommendations to doctors to optimise a patient’s drug regimen and resolve or prevent potential drug-related problems.
 
Results: A total of 522 patients were analysed and 417 drug-related problems were identified. The incidence of patients with drug-related problems was 62.8%, with a mean of 0.9 (standard deviation, 0.6) drug-related problems per patient. The most common categories of drug-related problems were associated with dosing (43.9%), drug choice (17.3%), and non-allergic adverse reactions (15.6%). Drugs most frequently involved targeted the endocrine or cardiovascular system. The majority (71.9%) of drug-related problems were of moderate clinical significance and 28.1% were considered minor problems. Drug-related problems were completely resolved (50.1%) or partially resolved (11.0%) through doctors' acceptance of pharmacist recommendations; a further 5.5% were acknowledged by doctors.
 
Conclusions: Pharmacists, in collaboration with the multidisciplinary team, demonstrated a positive impact by identifying, resolving, and preventing drug-related problems in patients with diabetes. Further plans for sustaining pharmacy service in the Diabetes Clinic would enable further studies to explore the long-term impact of pharmacists in improving patients’ clinical outcomes in diabetes management.
 
 
New knowledge added by this study
  • Pharmacists make an important contribution to the identification, resolution, and prevention of drug-related problems through medication reconciliation and review. Most problems were related to dosing and were of moderate clinical significance according to Dean and Barber’s validated scale for scoring medication errors. Over half of the clinical interventions initiated by pharmacists were accepted or acknowledged by doctors to improve medication management.
Implications for clinical practice or policy
  • Collaboration between pharmacists and other health care professionals is valuable for the improvement of medication safety in the management of diabetes.
 
 
Introduction
Diabetes mellitus is a chronic disease that is prevalent worldwide.1 Patients with diabetes often require complex medication regimens and are likely to develop multiple irreversible complications that significantly worsen their quality of life.2 Effective diabetes management requires collaboration among health care professionals in a multidisciplinary diabetes management team (DMT). Pharmacists, as a part of the DMT, are well positioned to optimise pharmacological treatment, educate patients about diabetes management, and promote medication compliance.3
 
The major role of a pharmacist in a DMT is to conduct medication reconciliation (MR) and medication review. MR is the process of comparing a patient’s prescriptions with all their usual medications to establish the most complete and up-to-date medication history4; medication review aims to review a patient’s medical and drug history, assess their current prescriptions, and ascertain their drug knowledge and compliance.5 This enables pharmacists to identify drug-related problems (DRPs) that actually or potentially interfere with optimum health outcomes in individual patients.6 7 Polypharmacy (the concurrent use of multiple medications) is common in people with chronic diseases and can give rise to DRPs.8 9 These DRPs might be overlooked by prescribers and interfere with diabetes management. In several overseas studies, pharmacists have implemented timely interventions to resolve or prevent DRPs by offering recommendations to prescribers, with an acceptance rate of over 60%.10 11 12 13
 
The positive impact of pharmacists in improving diabetes management or its co-morbidities has also been recognised by interventional and controlled observational studies worldwide.14 Greater overall improvement in glycosylated haemoglobin, fasting plasma glucose, blood pressure, cholesterol levels, renal outcomes, and medication adherence has been demonstrated in patients who received pharmacist-led diabetes services compared with standard care.12 14 15 16 17 18 19 20 21 22 23 24 25 26 27 28 29 30 Few studies, however, have been conducted in Hong Kong.17 29 In view of the inadequate available data and the potential for expansion of local pharmacy services, more studies are required to investigate the future development of a sustainable diabetes service provided by pharmacists.
 
This study aimed to characterise DRPs among Chinese diabetic out-patients and to define the clinical significance and outcome of pharmacist interventions, thereby highlighting the contribution of pharmacists to the detection, resolution, and prevention of DRPs to improve medication safety and diabetes management.
 
Methods
Study design and setting
An observational study was conducted weekly in the Diabetes Clinic at Queen Mary Hospital (QMH) from October 2012 to March 2014. The study protocol was approved by the Institutional Review Board of the University of Hong Kong (HKU)/Hospital Authority (HA) Hong Kong West Cluster. Informed consent was not required for the study.
 
Inclusion and exclusion criteria
Patients were included if they were at ‘high risk’ due to their multiple disease state and complex drug regimen and if they fulfilled the following criteria:
  • Aged ≥65 years (elderly patients are considered to be at high risk of DRPs because they usually take more drugs than younger patients)
  • Taking five or more medications by any route of administration, including over-the-counter medications (regular or as needed)
  • Taking medications that have a low therapeutic index or require monitoring
  • Attending multiple specialist clinics
 
Nursing home residents were excluded due to their relatively low risk for non-compliance, compared with community-dwelling elderly patients.
 
Procedure and materials
The day before the scheduled weekly clinic consultation, two researchers screened the medical history, previous consultation notes, current medications, and latest laboratory results of Chinese elderly patients with diabetes to select high-risk patients. Selected patient records were printed and prepared for quick reference during the medication interview. To facilitate data collection, a memo was attached to the patient’s records to indicate patient selection.
 
Two pharmacists from QMH and one from HKU attended the clinic on alternate Wednesdays to compile a thorough medication history from selected patients and conduct an independent medication review prior to the medical consultation. During the review, pharmacists also recorded medications not shown in the Clinical Management System (CMS), such as drugs prescribed by general practitioners (GPs), over-the-counter products, vitamins, and herbal supplements.
 
An MR form (Appendix 1) was then completed by pharmacists, documenting the identified DRPs and formulating an intervention proposal. The MR forms were collected following medical consultation, either on the same day or within the next few days.
 

Appendix 1. Medication reconciliation form
 
Pharmacist intervention
For the selected high-risk patients, pharmacists reviewed the patient’s drug regimen and made recommendations to doctors for adjustment, provided doctors with an updated drug list after MR, suggested a need to further investigate a patient’s condition, provided drug education to patients and caregivers, reinforced the importance of drug compliance to patients, and suggested lifestyle modifications such as dietary control.
 
Drug-related problems were identified from the completed MR forms, and pharmacist recommendations were collected for analysis. The CMS was checked for outcome of intervention.
 
Data collection
Demographic data—for example, age, gender, drug allergy status, number of regular medications obtained from the HA clinics, and the most current laboratory results, including glycosylated haemoglobin, fasting plasma glucose, and lipids (Appendix 2)—were retrieved from the CMS. Additional information in terms of medication, drug storage methods, smoking status, drinking habits, vaccination record, and latest readings from self-monitoring of blood glucose (SMBG) was also collected.
 

Appendix 2. Form for patient’s laboratory values
 
Data analysis
Demographic data were tabulated as frequency and percentage using Microsoft Excel 2010. Primary outcomes included the frequency and categories of DRPs, drug classes involved, clinical significance of DRPs, and outcome of pharmacist interventions. The incidence of DRPs was also calculated as the percentage of patients with at least one DRP.
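The two summary statistics described above (incidence of DRPs and mean DRPs per patient) can be sketched in a few lines of Python. The per-patient counts below are purely illustrative, not the study data; the real analysis covered 522 patients and 417 DRPs.

```python
from statistics import mean, stdev

# Hypothetical per-patient counts of drug-related problems (DRPs)
drps_per_patient = [0, 1, 2, 0, 1, 1, 0, 3, 1, 0]

n_patients = len(drps_per_patient)
total_drps = sum(drps_per_patient)

# Incidence: percentage of patients with at least one DRP
incidence = 100 * sum(1 for d in drps_per_patient if d >= 1) / n_patients

# Mean (standard deviation) number of DRPs per patient
mean_drps = mean(drps_per_patient)
sd_drps = stdev(drps_per_patient)

print(f"Incidence of patients with DRPs: {incidence:.1f}%")
print(f"DRPs per patient: {mean_drps:.1f} ({sd_drps:.1f})")
```

For these illustrative counts, 6 of 10 patients have at least one DRP (incidence 60.0%) and the mean is 0.9 DRPs per patient, mirroring the form of the figures reported in the Results.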
 
Definition and classification of drug-related problems
Using the Pharmaceutical Care Network Europe (PCNE) classification system for DRPs V5.01, DRPs were categorised as ‘adverse reactions’, ‘drug choice problem’, ‘dosing problem’, ‘drug use problem’, ‘interactions’, or ‘others’.7 This is an established system that has been revised several times with tested validity and reproducibility11 31 and has been used in many studies.9 32 33 When a single drug was associated with more than one possible DRP category, the one that best described the clinical scenario was chosen. Drugs involved in DRPs were categorised according to their British National Formulary classification.34
 
The clinical significance of DRPs was assessed to determine their actual or potential consequences for patient health outcomes. Using a validated scale,35 four independent reviewers (two pharmacists and two doctors) scored the severity of each DRP from 0 (no potential effect on the patient) to 10 (would lead to a fatal event). A mean score of <3 indicated a minor problem (very unlikely to cause adverse effects), 3 to 7 a moderate problem (likely to cause some adverse effects or interfere with therapeutic goals), and >7 a severe DRP likely to cause death or lasting impairment.
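The grading step can be expressed as a small function over the four reviewers' scores, using the thresholds stated above (mean <3 minor, 3–7 moderate, >7 severe). The function name and the example scores are illustrative, not taken from the study.

```python
from statistics import mean

def grade_severity(scores):
    """Grade one DRP from independent reviewers' scores (0-10 each),
    using the thresholds described in the text:
    mean <3 minor, 3-7 moderate, >7 severe."""
    m = mean(scores)
    if m < 3:
        return "minor"
    elif m <= 7:
        return "moderate"
    return "severe"

# Illustrative scores from two pharmacists and two doctors for one DRP
print(grade_severity([2, 3, 4, 3]))  # mean 3.0 -> "moderate"
```

Note that the boundary values follow the text: a mean of exactly 3 or 7 counts as moderate, consistent with the "3 to 7" band.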
 
To evaluate prescribers’ acceptance level, the outcome of pharmacist interventions was categorised as ‘not known’, ‘solved’, ‘partially solved’, or ‘not solved’ according to PCNE classification V5.01.7
 
Results
Patient demographics and characteristics
During the study period, a total of 652 patients were included based on the selection criteria; 526 (80.7%) were interviewed, of whom 522 (99.2%) were analysed (Fig).
 

Figure. Selection of study patients
 
The mean (± standard deviation) age of the 522 patients was 75.2 ± 5.4 years (range, 65-91 years). The number of prescribed regular HA medications ranged from 5 to 17 with a mean of 9 ± 2. The demographics and characteristics of patients are shown in Table 1.
 

Table 1. Demographics and characteristics of patients
 
Categories of drug-related problems
A total of 417 DRPs were identified. Among the 522 patients analysed, 328 (62.8%) had at least one DRP and the mean number of DRPs per patient was 0.9 ± 0.6. The most prevalent DRP category was related to dosing (n=183, 43.9%), followed by drug choice (n=72, 17.3%) and non-allergic adverse reaction (n=65, 15.6%). The subcategories of each of them are listed in Table 2.
 

Table 2. Categories of drug-related problems
 
Classes of medications involved in drug-related problems
The most common classes of medication involved were those targeting the endocrine system with 190 (45.6%) DRPs, followed by cardiovascular system with 159 (38.1%) DRPs (Table 3).
 

Table 3. Classes of medications involved in drug-related problems
 
Clinical significance of drug-related problems
The mean clinical severity scores assigned to DRPs ranged from 0.50 to 7.00. The majority of DRPs (n=300, 71.9%) were classified as moderate with the remainder (n=117, 28.1%) considered minor. No clinically severe DRP was identified (Table 4).
 

Table 4. Clinical severity scores assigned to drug-related problems
 
Outcome of pharmacist interventions
As Table 5 shows, 209 (50.1%) DRPs were resolved when doctors modified drug regimens, reinforced compliance, or referred patients to pharmacists. A further 46 (11.0%) DRPs were partially resolved by doctors adjusting prescriptions, although not in accordance with pharmacist recommendations; 62 (14.9%) DRPs were not resolved because patients were reluctant to change prescriptions, resolution was considered unnecessary, or for unknown reasons; and 23 (5.5%) DRPs had an unknown outcome because they were non-compliance issues not acknowledged by doctors.
 

Table 5. Outcome of pharmacist interventions
 
Discussion
The incidence of patients with DRPs (62.8%) and the mean number of DRPs per patient analysed (0.9) in this study were comparable to a Norwegian study (59.2% and 1.2, respectively)10 but considerably lower than those identified in four overseas studies (incidence of 80.7%-90.5%, and mean number of DRPs per patient between 1.9 ± 1.2 and 4.6 ± 1.7).9 11 12 36 Such discrepancies might be attributed to variations in patient selection criteria, data collection methods, pharmacists’ clinical experience, as well as study duration and setting.9 36 37
 
The majority of DRPs were dosing problems (43.9%), with “drug dose too low or dosage regimen insufficient” as the largest subcategory. In contrast to the lower percentages (5.9%-21.6%) in five overseas studies,9 10 11 12 36 our high prevalence of dosing problems was in line with a local study of medication incidents among hospital in-patients,38 mostly arising from self-adjustment of dosage or frequency, confusion about previous dose changes, and dosage modification by GPs or doctors overseas. These findings highlight the pivotal role of local pharmacists in conducting MR, reviewing drug dosages to ensure safety and efficacy, monitoring patients’ metabolic control regularly, and reminding patients and/or their caregivers to maintain an updated medication list and follow the latest drug label instructions.
 
Drug choice problem was the second most common DRP category, accounting for 17.3% of DRPs. This is comparable to the findings of two overseas studies (9.1% and 23%)9 36 but deviates from others (30.2%-31.8%).10 11 The most common subcategory was “no drug prescribed but clear indication”, such as omission of an angiotensin-converting enzyme inhibitor/angiotensin-receptor blocker (ACEI/ARB) in patients with microalbuminuria, or a patient’s reluctance to use insulin. Hence, pharmacists have a role in advising doctors to adhere to the latest treatment guidelines and in educating patients about the treatment benefits of each drug class.39 Other causes of problems surrounding drug choice included drug duplication and changes of drug choice by GPs to prevent side-effects. This suggests that some DRPs might have arisen from the lack of a common platform between the public and private health care sectors for sharing patient information. Pharmacists can make a valuable contribution by establishing a patient’s drug history through MR and by liaising with different health care sectors.
 
Adverse reactions were the third most common DRP (15.6%). The major types of “side-effects suffered (non-allergic)” were insulin-induced hypoglycaemia, gastrointestinal disturbances, and dizziness caused by antidiabetic drugs, for which pharmacists recommended changes in drug choice or dosage. Adverse reactions could lead to other DRP categories,7 such as drug choice and drug use problems. This reflects the pharmacist’s pivotal role in reviewing prescribed doses, suggesting dosage adjustments to doctors, monitoring adverse effects, and providing information about prevention of side-effects (such as performing SMBG regularly to prevent hypoglycaemia).39
 
Drug use issues were the fourth most common category, with a prevalence (12.0%) comparable with that of a Malaysian study,9 although prevalence ranged widely among other studies (3.8%-54.2%).10 11 36 Reasons for the subcategory of “drug not taken/administered at all” included inability to afford a self-financed item, ignorance of the indications, concern about side-effects, and confusion about previous regimen changes.40 In our study, pharmacists mainly intervened by counselling patients directly, recommending that doctors reinforce patient compliance, or suggesting changes to drug regimens. Pharmacists could also work closely with other DMT members to educate patients about their disease and the most up-to-date regimen, address concerns about drug costs or side-effects, and encourage patients to update their medication list and use dose administration aids such as pill boxes.41
 
The low prevalence of drug interactions (1.0%) was similar to that (0.6%) in a Danish study,36 although much higher percentages were found in three other studies (8.0%-16.3%),9 10 11 possibly attributable to differences in prescribing practice and in the references used to define drug interactions,9 and to the fact that the CMS can already detect a range of clinically significant interactions when doctors issue prescriptions. Nonetheless, system checks and prompts cannot replace clinical judgement or recommendations for alternative regimens. Other categories of DRPs included “insufficient awareness of health and diseases” (such as poor dietary control) and “inappropriate timing of administration”; because this category could also encompass therapy failure and inappropriate lifestyle choices, its prevalence varied more widely among overseas studies (6.8%-46.6%).9 10 11 36 Pharmacists are ideally positioned to advise patients about the importance of diet, smoking cessation, regular exercise, and SMBG.22
 
The drug classes most implicated in DRPs were for the endocrine system (45.6%) followed by cardiovascular system (38.1%). These findings were not surprising as insulins, oral antidiabetic drugs, antihypertensive, antihyperlipidaemic, antiplatelet agents, and ACEI/ARB are most commonly prescribed to manage diabetes, its co-morbidities and complications.11 39
 
The majority of DRPs were classified as moderate. Among similar overseas studies, only one analysed the clinical significance of DRPs, in which 87% had high or medium clinical/practical relevance.10 These findings could not be readily compared with the present study because of different assessment scales, potential variations in reviewers’ clinical experience,35 and unknown relative proportions of cases with medium and high relevance.
 
Over half of the DRPs were completely resolved when doctors implemented pharmacist recommendations. This acceptance rate was similar to that observed in two overseas studies (60.2%-62.7%).12 13 Physicians acknowledged the service provided by pharmacists and became more aware of their written recommendations. In particular, the value of verbal communication between different health care professionals in resolving or preventing DRPs has been recognised in earlier studies,10 42 43 44 45 suggesting that the acceptance rate might improve if pharmacists had more time to hand over DRPs in person.
 
The outcome of pharmacist interventions could also be influenced by doctors’ clinical experience and familiarity with the new service. Doctors’ acceptance level could have been underestimated since some of them might have neglected or missed written information from pharmacists. This highlights the importance of promoting the role of pharmacists to doctors and keeping all participating doctors well-informed.
 
Difficulties and limitations
This pilot study provided an opportunity to assess the proportion of patients who might be seen by clinical pharmacists in a busy specialist out-patient clinic at a teaching hospital. Approximately 10% of patients were selected each week, and not all eligible patients could be included owing to time restrictions. The number of patients interviewed was further limited by time constraints, patient absence, or refusal. Local figures from the QMH Diabetes Clinic indicate that approximately 7% to 8% of all patients who attend the clinic are deemed ‘high risk’, based on ongoing work and prioritisation of those taking five or more regular medications. Limited work space was another consideration: a designated area is required to conduct patient interviews, and further arrangements could be made with the medical and nursing staff in the Diabetes Clinic to secure better space.
 
This study only described the current situation of DRPs. It did not assess the implementation of interventions and their impact on patient health outcome. As the majority of patients did not bring their drugs to the clinic and had no medication list available, the MR process was not always comprehensive or effective. Only a minority of patients could name their regular drugs. The majority relied on pharmacist investigation and prompts about the colour, shape, package, or indication of each drug. Due to the potential for misinterpretation, DRP prevalence may be underestimated. One possible solution might be to show patients samples of commonly prescribed medications. Alternatively, selected patients could be telephoned in advance and asked to bring along their medications, although this measure may not be sustainable. A multifaceted promotional campaign could be introduced to encourage patients to bring their regular medications to clinic. This has been shown to be effective in an emergency setting.46
 
Although completed MR forms were presented to doctors after the interviews, some written information might have been missed with a consequent lack of response to certain DRPs. Pharmacists should ideally have informed doctors about every DRP in person, but this was not always possible due to time constraints and the great volume of patients. In the long run, pharmacists should document DRPs and their recommendations in the CMS. This would enhance visibility and allow doctors to input their response electronically and facilitate organised documentation and easy data retrieval.
 
Future directions
Upon completion of this study, pharmacists have been continuing to provide MR and medication review services in QMH Diabetes Clinic. They have also been collecting data about DRPs to plan for a sustainable service. Following a longer study period, patient and staff satisfaction surveys could be introduced and also control groups added to enable comparison of the effectiveness of pharmacist intervention. This would further support the extension of hours of service and potentially the setup of similar pharmacy services to other hospitals and diabetes clinics in Hong Kong.
 
Conclusions
Approximately two thirds of patients at the Diabetes Clinic had at least one DRP. The most frequent categories of DRPs were related to dosing, drug choice, and non-allergic adverse reaction. Drugs targeting the endocrine and cardiovascular systems were most commonly involved. The majority of DRPs were of moderate clinical significance. Pharmacist interventions for over half the DRPs were accepted or acknowledged by prescribers. Through effective communication and collaboration within the multidisciplinary health care team, pharmacists had a positive impact on identifying, resolving, and preventing DRPs. Future plans to sustain the diabetes service will enable more local research to enhance medication safety and optimise patients’ medication regimens in diabetes management.
 
Acknowledgements
We would like to acknowledge Ms Cyan Chan for her assistance in patient screening and data collection, and pharmacists Ms Phoebe Chan (HKU); Ms Amy Chan, Ms Dominique Yeung, Ms Katie Chan, and Mr Ric Fung (QMH); Prof Karen Lam (QMH); nursing and medical staff in S6 Diabetes Clinic, QMH for their advice and contributions to service provision in the study. We would also like to thank Mr Michael Ling and Ms Elaine Lo (Kwong Wah Hospital); Dr Michael Mok (Geelong Hospital, Victoria, Australia); Dr Vickie Tse (HKU) contributing to the independent assessment of clinical severity of DRPs; and Dr Anthony Tam (HKU) and Sharon Law (HKU) for proofreading the manuscript.
 
References
1. International Diabetes Federation. IDF Diabetes Atlas. 6th ed. Available from: http://www.diabetesatlas.org/resources/previous-editions.html. Accessed Mar 2014.
2. Fowler MJ. Microvascular and macrovascular complications of diabetes. Clin Diabetes 2008;26:77-82. Crossref
3. Tapp H, Phillips SE, Waxman D, Alexander M, Brown R, Hall M. Multidisciplinary team approach to improved chronic care management for diabetic patients in an urban safety net ambulatory care clinic. J Am Board Fam Med 2012;25:245-6. Crossref
4. Hellström LM, Bondesson Å, Höglund P, Eriksson T. Errors in medication history at hospital admission: prevalence and predicting factors. BMC Clin Pharmacol 2012;12:9. Crossref
5. Krska J, Cromarty JA, Arris F, et al. Pharmacist-led medication review in patients over 65: a randomized, controlled trial in primary care. Age Ageing 2001;30:205-11. Crossref
6. Draft statement on pharmaceutical care. ASHP Council on Professional affairs. American Society of Hospital Pharmacists. Am J Hosp Pharm 1993;50:126-8.
7. Pharmaceutical Care Network Europe. The PCNE Classification V 5.01. 2006. Available from: http://www.pcne.org/upload/files/16_PCNE_classification_V5.01.pdf. Accessed 22 Oct 2013.
8. Viktil KK, Blix HS, Moger TA, Reikvam A. Polypharmacy as commonly defined is an indicator of limited value in the assessment of drug-related problems. Br J Clin Pharmacol 2007;63:187-95. Crossref
9. Zaman Huri H, Fun Wee H. Drug related problems in type 2 diabetes patients with hypertension: a cross-sectional retrospective study. BMC Endocr Disord 2013;13:2. Crossref
10. Granas AG, Berg C, Hjellvik V, et al. Evaluating categorisation and clinical relevance of drug-related problems in medication reviews. Pharm World Sci 2010;32:394-403. Crossref
11. van Roozendaal BW, Krass I. Development of an evidence-based checklist for the detection of drug related problems in type 2 diabetes. Pharm World Sci 2009;31:580-95. Crossref
12. Borges AP, Guidoni CM, Ferreira LD, de Freitas O, Pereira LR. The pharmaceutical care of patients with type 2 diabetes mellitus. Pharm World Sci 2010;32:730-6. Crossref
13. DeName B, Divine H, Nicholas A, Steinke DT, Johnson CL. Identification of medication-related problems and health care provider acceptance of pharmacist recommendations in the DiabetesCARE program. J Am Pharm Assoc 2008;48:731-6. Crossref
14. Kiel PJ, McCord AD. Pharmacist impact on clinical outcomes in a diabetes disease management program via collaborative practice. Ann Pharmacother 2005;39:1828-32. Crossref
15. Wubben DP, Vivian EM. Effects of pharmacist outpatient interventions on adults with diabetes mellitus: a systematic review. Pharmacotherapy 2008;28:421-36. Crossref
16. Evans CD, Watson E, Eurich DT, et al. Diabetes and cardiovascular disease interventions by community pharmacists: a systematic review. Ann Pharmacother 2011;45:615-28. Crossref
17. Chan CW, Siu SC, Wong CK, Lee VW. A pharmacist care program: positive impact on cardiac risk in patients with type 2 diabetes. J Cardiovasc Pharmacol Ther 2012;17:57-64. Crossref
18. Pepper MJ, Mallory N, Coker TN, Chaki A, Sando KR. Pharmacists’ impact on improving outcomes in patients with type 2 diabetes mellitus. Diabetes Educ 2012;38:409-16. Crossref
19. Jarab AS, Alqudah SG, Mukattash TL, Shattat G, Al-Qirim T. Randomized controlled trial of clinical pharmacy management of patients with type 2 diabetes in an outpatient diabetes clinic in Jordan. J Manag Care Pharm 2012;18:516-26. Crossref
20. Jacobs M, Sherry PS, Taylor LM, Amato M, Tataronis GR, Cushing G. Pharmacist Assisted Medication Program Enhancing the Regulation of Diabetes (PAMPERED) study. J Am Pharm Assoc 2012;52:613-21. Crossref
21. Ali M, Schifano F, Robinson P, et al. Impact of community pharmacy diabetes monitoring and education programme on diabetes management: a randomized controlled study. Diabet Med 2012;29:e326-33. Crossref
22. Al Mazroui NR, Kamal MM, Ghabash NM, Yacout TA, Kole PL, McElnay JC. Influence of pharmaceutical care on health outcomes in patients with type 2 diabetes mellitus. Br J Clin Pharmacol 2009;67:547-57. Crossref
23. Mehuys E, Van Bortel L, De Bolle L, et al. Effectiveness of a community pharmacist intervention in diabetes care: a randomized controlled trial. J Clin Pharm Ther 2011;36:602-13. Crossref
24. Shah M, Norwood CA, Farias S, Ibrahim S, Chong PH, Fogelfeld L. Diabetes transitional care from inpatient to outpatient setting: pharmacist discharge counseling. J Pharm Pract 2013;26:120-4. Crossref
Mothers’ attitude to the use of a combined oral contraceptive pill by their daughters for menstrual disorders or contraception

Hong Kong Med J 2017 Apr;23(2):150–7 | Epub 24 Feb 2017
DOI: 10.12809/hkmj164993
© Hong Kong Academy of Medicine. CC BY-NC-ND 4.0
 
ORIGINAL ARTICLE
Mothers’ attitude to the use of a combined oral contraceptive pill by their daughters for menstrual disorders or contraception
KW Yiu, MRCOG, FHKAM (Obstetrics and Gynaecology); Symphorosa SC Chan, FRCOG, FHKAM (Obstetrics and Gynaecology); Tony KH Chung, FRANZCOG, FRCOG
Department of Obstetrics and Gynaecology, The Chinese University of Hong Kong, Prince of Wales Hospital, Shatin, Hong Kong
 
Corresponding author: Dr Symphorosa SC Chan (symphorosa@cuhk.edu.hk)
 
 
Abstract
Introduction: Mothers’ attitudes may affect the use of combined oral contraceptive pills by their daughters. We explored Chinese mothers’ knowledge of and attitudes towards the use of combined oral contraceptive pills by their daughters for menstrual disorders or contraception, and evaluated the factors affecting their attitudes.
 
Methods: This survey was conducted from October 2012 to March 2013, and recruited Chinese women who attended a gynaecology clinic or accompanied their daughter to a gynaecology clinic, and who had one or more daughters aged 10 to 18 years. They completed a 41-item questionnaire to assess their knowledge of and attitude towards use of the combined oral contraceptive pills by their daughters. The demographic data of the mothers and their personal experience in using the pills were also collected.
 
Results: A total of 300 women with a mean age of 45.2 (standard deviation, 5.0) years completed the questionnaire. Only 58.3% of women reported that they had knowledge about combined oral contraceptive pills; among them, a majority (63.3%) reported that their knowledge came from medical professionals. Of a total possible score of 22, their mean knowledge score for the risks, side-effects, benefits, and contra-indications of combined oral contraceptive pills was only 5.0 (standard deviation, 4.7). If the medical recommendation to use an oral contraceptive was to manage their daughter’s dysmenorrhoea, menorrhagia, acne, or contraception needs, 32.0%, 39.3%, 21.0%, and 29.7% of women, respectively, would accept this advice. Women who were ever-users of combined oral contraceptive pills or who were more knowledgeable about them had a higher acceptance rate.
 
Conclusions: Chinese women had a low acceptance level of using combined oral contraceptive pills as a legitimate treatment for their daughters. This was associated with lack of knowledge or a high degree of uncertainty about their risks and benefits. It is important that health caregivers provide up-to-date information about combined oral contraceptive pills to women and their daughters.
 
 
New knowledge added by this study
  • Chinese women had a low acceptance level of combined oral contraceptive (COC) pills as a legitimate treatment for their daughters. This was associated with lack of knowledge or a high degree of uncertainty about the risks and benefits of COC use.
Implications for clinical practice or policy
  • Health caregivers should provide up-to-date information to potential COC users.
 
 
Introduction
The combined oral contraceptive (COC) pill is an effective contraceptive. It has a very low-risk profile documented over several decades and its protective effect on endometrial and ovarian carcinoma has been well established.1 It is also an effective treatment for menstrual problems and polycystic ovarian syndrome,2 3 which are common in adolescents.4 5 6 7 The prevalences of menorrhagia, dysmenorrhoea, and menstrual symptoms in adolescent girls have been reported to be 17.9%, 68.7%, and 37.7%, respectively.4 The prevalence of polycystic ovarian syndrome in adolescent girls has been reported to be 16% in those who attended a local paediatric and adolescent gynaecology clinic.5 Nonetheless, the use of COC pills in Chinese women has remained low; only 1% of women of reproductive age (20-49 years) in China used the pills in 2010.8 Obstacles to access remain, although family planning is a relatively well-funded area of health care in China and has been implemented for decades. In Hong Kong, the situation is only slightly different. In an online survey conducted in Hong Kong, 12.6% of women had used an oral contraceptive in the year prior to the survey, but many of them had stopped using it.9 According to the annual report of the Family Planning Association of Hong Kong in 2014-2015, 22% of the 48 363 clients who practised birth control, both married and unmarried, used an oral contraceptive.10 Only 6% of teenage girls who attended the youth health care centres used COC pills for contraception.11
 
Although sex education has been integrated into the primary and secondary educational curriculum for many years, efforts to provide quality sex education have been limited.12 According to a survey conducted by the Hong Kong SAR Government, sex education at the junior secondary school level is limited to an average of 3 to 4 school hours only.13 The concepts emphasised have sometimes been limited to protection of self and avoidance of sex, especially prior to marriage.14 The median age of marriage in Hong Kong is now close to 30 years. It is notable that Hong Kong has a high rate of therapeutic abortion that is underestimated by official statistics, because an indeterminate proportion of procedures is performed in mainland China for cost reasons. Among women attending an antenatal visit at our institute, a high proportion (36.5%) reported a previous therapeutic abortion (unpublished data). This suggests that women of reproductive age may not have been educated about contraception. There is little published information on the use of COC pills for the management of menstrual problems in Hong Kong, but such use is likely to be low. Misconceptions and myths about COC pills are likely to be the main obstacles to use. Although extensive high-quality information about COC use is currently available from various sources, many women remain unaware of the non-contraceptive benefits of COC and have little awareness of its risks.15 For female adolescents, the decision about whether to use COC is likely to be influenced by their parents, especially their mothers, who may be giving advice based on little or erroneous knowledge. This may lead mothers to make decisions that are not in their daughter’s best interests.
 
Focused education about COC may lead to a more balanced view, both in adolescents and their mothers. In a study of Korean and Japanese university students, a significant correlation between knowledge of and positive attitude towards COC pills was reported.16 Mothers in Asia are also often involved in their teenage daughters’ decisions about beginning sexual relationships, using contraceptives, and even vaccination.17 18 Since mothers’ attitudes may affect the use of COC by their daughters, we explored Chinese mothers’ knowledge of and attitudes towards such use. Mothers’ knowledge about COC pills and the factors affecting their attitudes were also evaluated.
 
Methods
The study was conducted from October 2012 to March 2013 in the gynaecology clinic of a tertiary teaching hospital in Hong Kong. Women who attended the clinic or accompanied a daughter to the clinic, and who had one or more daughters aged 10 to 18 years, were recruited. Women who did not speak or read Chinese were excluded. The participants completed a 41-item questionnaire to assess their knowledge of and attitude towards use of COC pills by their daughters. Firstly, they were asked to self-assess their own knowledge of the COC pill. The knowledge domain consisted of 19 items testing their knowledge of the non-contraceptive health benefits and side-effects of COC pills, and three items on contra-indications to use of COC pills. For each item, participants were asked to respond “yes”, “no”, or “don’t know”. They were then asked about their attitudes to the use of COC pills by their daughters aged 10 to 18 years for the management of dysmenorrhoea, menorrhagia, acne, or as a contraceptive, responding from “strongly agree”, “agree”, “disagree”, to “strongly disagree”. Participants were also asked about their reasons for agreeing or disagreeing with the use of COC pills and the appropriate age or life-events for their daughters to use them. Finally, demographic data and their personal experience in using COC pills were collected. Knowledge and uncertainty scores were calculated for each participant based on her responses15 16: the knowledge score was the number of correct answers, with one point given for each (ie, range 0-22), and each “don’t know” reply added one point to the uncertainty score. The participants provided written informed consent, and approval was obtained from the local ethics committee (CRE-2012.186).
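The scoring rule described above can be sketched as follows. This is a hedged illustration only: the item identifiers and answer key are hypothetical, not the study's actual 22-item questionnaire.

```python
# Hypothetical sketch of the scoring rule described above (item identifiers
# and answer key are illustrative, not the study's actual questionnaire):
# one point per correct answer gives the knowledge score (range 0-22 in the
# study), and each "don't know" reply adds one point to the uncertainty score.

def score_responses(responses, answer_key):
    """Return (knowledge_score, uncertainty_score) for one participant.

    responses and answer_key map item id -> "yes" / "no"; responses may
    also contain "don't know".
    """
    knowledge = sum(1 for item, correct in answer_key.items()
                    if responses.get(item) == correct)
    uncertainty = sum(1 for item in answer_key
                      if responses.get(item) == "don't know")
    return knowledge, uncertainty

# Example: one correct answer, one "don't know", one wrong answer.
key = {"q1": "no", "q2": "yes", "q3": "no"}
reply = {"q1": "no", "q2": "don't know", "q3": "yes"}
print(score_responses(reply, key))  # (1, 1)
```

Note that a wrong answer contributes to neither score, matching the definitions above: only correct answers and "don't know" replies are counted.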
 
Statistical analyses
Descriptive statistics were used to summarise participants’ demographic information. Associations between participant characteristics and overall attitude were explored using the Chi squared test and the independent-samples t test. A P value of <0.05 was considered statistically significant. All statistical analyses were conducted using SPSS (Windows version 18.0; SPSS Inc, Chicago [IL], United States). Assuming that 50% of the women accepted the use of COC pills by their daughters, with an accepted error of 0.05, 278 women were required. An additional 10% were recruited to allow for incomplete questionnaires.
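As a minimal, self-contained illustration of the group comparison described above, the following sketch computes a 2x2 Pearson chi-squared statistic (without continuity correction). The counts are invented for demonstration and are not the study's data; the study itself used SPSS.

```python
# Self-contained sketch of the 2x2 Pearson chi-squared test (no continuity
# correction) of the kind used to compare groups above. The counts are
# hypothetical, for illustration only -- not the study's data.

def chi_squared_2x2(table):
    """table: [[a, b], [c, d]] observed counts; returns the test statistic."""
    row_totals = [sum(row) for row in table]
    col_totals = [sum(col) for col in zip(*table)]
    total = sum(row_totals)
    stat = 0.0
    for i, row in enumerate(table):
        for j, observed in enumerate(row):
            expected = row_totals[i] * col_totals[j] / total
            stat += (observed - expected) ** 2 / expected
    return stat

# Hypothetical counts: accept / not accept among ever-users vs never-users.
stat = chi_squared_2x2([[55, 70], [45, 130]])
# With 1 degree of freedom, a statistic above 3.841 corresponds to P < 0.05.
print(round(stat, 2), stat > 3.841)  # 10.97 True
```

With these made-up counts the statistic exceeds the 3.841 critical value, so the acceptance rates of the two groups would differ significantly at the 5% level.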
 
Results
After exclusion of 150 women who did not have a daughter aged 10 to 18 years, a total of 317 women were invited to participate; 302 agreed and 300 (94.6%) completed the questionnaire. Demographic characteristics of the participants are shown in Table 1. Their mean age was 45.2 years (range, 28-58 years). The median number of daughters was one and most participants (88.3%) were married. Most (>70%) had high school education. In all, 125 (41.7%) were ever-users of COC pills, including both current and ex-users. Overall, 175 (58.3%) reported that they had knowledge about COC pills, while 125 (41.7%) reported no knowledge.
 

Table 1. Demographics of the participants
 
The rates of giving correct answers about the COC pill and the comparison between ever- and never-users of COC pills are shown in Table 2. Of a total possible score of 22, the mean (± standard deviation) knowledge score of all the participants was 5.0 ± 4.7. Only approximately 20% of the mothers correctly answered that COC pills do not cause carcinoma of the ovary and corpus; 26.0%, 29.7%, and 30.3%, respectively, correctly answered that COC pills have no proven teratogenicity, do not cause pelvic inflammatory disease, and do not cause infertility; 10.3% knew that COC pills do not cause weight gain and 25.7% answered that COC pills do not lead to a depressive mood. In all, 43.3%, 33.0%, and 25.3% knew that COC pills have the benefits of regulating the menstrual cycle, decreasing menstrual flow, and helping to relieve dysmenorrhoea, respectively. Moreover, only 20% knew that COC pills are not contra-indicated in those with a family history of breast cancer but are contra-indicated in thromboembolism. The knowledge score of the 175 women who reported having knowledge of COC pills was significantly higher than that of those who reported a lack of knowledge (8.0 ± 4.4 vs 3.0 ± 3.7; P<0.001). Among those who declared knowledge about the use of COC pills, the sources of knowledge were medical professionals (63.3%), the media (30.3%), friends (24.6%), family members (6.9%), school (2.9%), and others (1.0%).
 

Table 2. Rates of giving correct answers about COC and comparison between ever- and never-users of COC pills
 
The rate of responding “uncertain” to the health benefits, side-effects, or contra-indications of COC use ranged from 43.7% to 71.0% for each item. The mean uncertainty score among all participants was 13.0 ± 7.6. The uncertainty score was significantly higher in participants who reported a lack of knowledge than in those who reported having knowledge about COC pills (15.6 ± 7.1 vs 9.1 ± 6.7; P<0.001).
 
Among the ever-users, 43 (34%), five (4%), and 96 (77%) women reported that they had used COC pills to manage their own menstrual problems, to manage acne, and for contraception, respectively. Table 3 lists the participants’ acceptance rates of COC use by their daughters for different gynaecological conditions and the comparison between ever- and never-users of COC pills. More ever-users than never-users accepted the use of COC for their daughter’s gynaecological indications. Participants who accepted their daughter’s use of COC also had a higher knowledge score and a lower uncertainty score (Table 4). Table 5 shows the participants’ reasons for accepting use of COC pills by their daughters. Recommendation by medical professionals was the major reason, followed by the knowledge that COC pills provide effective contraception.
 

Table 3. Participants’ acceptance of COC use by their daughters for different gynaecological conditions and comparison between ever- and never-users of COC pills
 

Table 4. Comparison of the knowledge score and uncertainty score in participants who agreed or disagreed with the use of COC pills by their daughters under different gynaecological conditions
 

Table 5. Participants’ reasons for accepting use of COC pills by their daughters for menstrual problems
 
Age, education level, and previous experience of side-effects of COC pills were not associated with participants’ acceptance of COC use by their daughters. Among the 125 ever-users of COC pills, 65 (52.0%) reported that they had experienced side-effects, including weight gain (n=45), fluid retention (n=25), headache (n=12), increase in appetite (n=8), mood disturbance (n=8), and acne (n=4). Table 6 lists the reasons for disagreement with the use of COC pills by their daughters for menstrual problems. Finally, only 71 (23.6%) participants thought that the use of COC pills was appropriate in girls aged 12 to 18 years.
 

Table 6. Participants’ reasons for disagreement with the use of COC pills by their daughters for menstrual problems
 
Discussion
Our study highlights a notable lack of knowledge about the use of COC pills among many Hong Kong Chinese mothers. Many were uncertain or held erroneous beliefs about the use of COC pills, believing that such use would lead to cancer, fetal deformity, infertility, and pelvic inflammatory disease. These misconceptions and uncertainties may further reinforce their non-acceptance of the COC pill as an appropriate medication for their daughters, and this often leads to suboptimal treatment.
 
Fear of an increased risk of cancer is an important reason for low acceptance of COC pills; only 22% of our participants thought the pills did not increase the risk of carcinoma of the ovary or uterus, and more than 60% were uncertain about the cancer risk of COC use (results not shown). Research has shown that oral contraceptives have a significant protective effect against carcinoma of the ovary and corpus uteri.19 20 In fact, a collaborative re-analysis of individual data from 53 297 women with breast cancer and 100 239 women without breast cancer from 54 epidemiological studies revealed a small increase in the relative risk of breast cancer while women were taking COC pills and in the 10 years after stopping.21 There was, however, no significant excess risk of breast cancer diagnosed ≥10 years after stopping use of COC pills. Breast cancer incidence rises steeply with age, so the estimated excess number of cancers diagnosed between starting use and 10 years after stopping increases with age at last use. The estimated excess number of breast cancers diagnosed up to 10 years after stopping use from the age of 16 to 19 years has been reported to be only 0.5 per 10 000 women (95% confidence interval, 0.3-0.7).21 The Nurses’ Health Study, with 121 701 participants followed up for 36 years, revealed that longer duration of COC use was strongly associated with premature mortality due to breast cancer.22 Also of note, only 20.7% of our participants knew that a family history of breast cancer is not a contra-indication to use of COC pills.
 
Another common misconception is that the use of COC pills can lead to future subfertility. In fact, the COC pill preserves fertility by diminishing the risk of ectopic pregnancy.23 According to a review, 1-year pregnancy rates after discontinuation of COC ranged from 79% to 96%, similar to those reported following discontinuation of barrier methods or no contraception.24 Moreover, the progestogen effect of COC pills results in production of thick, tenacious cervical mucus that resists penetration by bacteria and spermatozoa and reduces the risk of upper genital tract infection. Use of COC pills has also been reported to protect against symptomatic pelvic inflammatory disease, with a 50% reduction in the rate of hospitalisation for the disease, which is itself a risk factor for subfertility.25 The COC pill does not, however, protect against sexually transmitted infections. On the other hand, there is no evidence to support the notion that the use of COC pills is associated with high-risk sexual behaviour in adolescents, which is a very common fear among Hong Kong mothers.
 
Primary dysmenorrhoea is prevalent during adolescence. Approximately 6.4% of adolescents or 29% of those reporting severe dysmenorrhoea seek help from a physician.4 A review and meta-analysis of five trials of the use of COC pills concluded that it was more effective than placebo in managing dysmenorrhoea.2 One of the most common problems reported by adolescents is irregular and/or profuse menstruation. The COC pill is also effective in treating and preventing heavy menstrual bleeding. In our study, only 25% to 43% of the participants knew that it is an appropriate treatment for menstrual problems and 50% were uncertain.
 
Fear of side-effects often leads to reluctance to use new treatments.18 Such fears are often unfounded. In our study, participants believed that weight gain and depressive mood were side-effects of COC pills, although a pooled analysis of placebo-controlled trials showed no difference in weight gain.26 Furthermore, depressive symptoms are common in adolescence.27 In a controlled study, there was no difference in mood changes throughout the menstrual cycle between COC users and non-users.28 In a prospective study of 43 adolescents, subjects anticipated more side-effects than they actually experienced after 6 months of taking COC pills.29
 
It is important to provide correct information to women and their teenage daughters if they are contemplating the use of COC pills. In our study, 40% of participants indicated that recommendation from a medical professional was a critical factor in their acceptance of the use of COC pills by their daughters.
 
Only 19% of participants were aware that thromboembolism is a contra-indication to COC use. The incidence of venous thromboembolism in Asians has been reported to be low.30 A recent case-control study confirmed that current exposure to any COC carries an approximately 3-fold higher risk of venous thromboembolism compared with no exposure in the previous year.31 The risk is higher with COC pills containing desogestrel (odds ratio, 4.3), gestodene (3.6), drospirenone (4.1), and cyproterone (4.3) than with second-generation COC pills containing levonorgestrel (2.4) and norgestimate (2.5).31 The risk of venous thromboembolism is also increased for COC users with a family history of venous thromboembolism.32 Clinicians should assess a woman’s personal and family history of thromboembolism, and provide information about the warning symptoms of venous thromboembolism, before prescribing a new-generation pill. As recommended by the World Health Organization, the COC pill is not contra-indicated in smokers aged <35 years33; approximately 30% of our participants answered this correctly.
 
This study has several limitations. First, there may be selection bias, as subjects were women who presented to a gynaecology clinic or accompanied their daughter to one for a gynaecological problem; the results may not be generalisable to the whole population of Hong Kong. Second, the questionnaire was not validated and the questions did not cover all aspects of the use of COC pills. Third, we relied on women’s self-reported use of COC pills and could not verify the information. Despite these limitations, the questionnaire included the most widely studied aspects of the non-contraceptive benefits and risks of the COC pill, and the knowledge and uncertainty scores have been used in previous studies.15 16 Although this study was conducted in only one centre and in Chinese women only, it helps clinicians to understand the low levels of acceptance of and compliance with prescribed COC pills.
 
The degree of misconception among Hong Kong mothers about COC use is of concern. Hong Kong has a well-developed education system with many highly regarded universities, so the reported limited sex education in schools may be responsible for this knowledge gap among Hong Kong mothers.13 This may in turn affect the advice they give their daughters, who are usually compliant with their mother’s wishes. Specific training in communication and counselling skills should be provided to health care professionals when promoting sexual health to women and adolescents.34
 
Conclusions
Our study found that Hong Kong Chinese women who attended the gynaecology clinic of a tertiary centre had a low acceptance rate of the use of COC pills by their daughters. This low acceptance was associated with a lack of knowledge and misconceptions about the risks and benefits of COC pills. Such ignorance may adversely influence the choice of treatment for many gynaecological problems in their teenage daughters.
 
Declaration
All authors have disclosed no conflicts of interest.
 
References
1. Vessey M, Painter R. Oral contraceptive use and cancer. Findings in a large cohort study, 1968-2004. Br J Cancer 2006;95:385-9. Crossref
2. Proctor ML, Roberts H, Farquhar CM. Combined oral contraceptive pill (OCP) as treatment for primary dysmenorrhoea. Cochrane Database Syst Rev 2001;(4):CD002120.
3. Nader S, Diamanti-Kandarakis E. Polycystic ovary syndrome, oral contraceptives and metabolic issues: new perspectives and a unifying hypothesis. Hum Reprod 2007;22:317-22. Crossref
4. Chan SS, Yiu KW, Yuen PM, Sahota DS, Chung TK. Menstrual problems and health-seeking behaviour in Hong Kong Chinese girls. Hong Kong Med J 2009;15:18-23.
5. Chung PW, Chan SS, Yiu KW, Lao TT, Chung TK. Menstrual disorders in a Paediatric and Adolescent Gynaecology Clinic: patient presentations and longitudinal outcomes. Hong Kong Med J 2011;17:391-7.
6. Esmaelizadeh S, Delavar MA, Amiri M, Khafri S, Pasha NG. Polycystic ovary syndrome in Iranian adolescents. Int J Adolesc Med Health 2014;26:559-65. Crossref
7. Christensen SB, Black MH, Smith N, et al. Prevalence of polycystic ovary syndrome in adolescents. Fertil Steril 2013;100:470-7. Crossref
8. Wang C. Trends in contraceptive use and determinants of choice in China: 1980-2010. Contraception 2012;85:570-9. Crossref
9. Lo SS, Fan SY. Acceptability of the combined oral contraceptive pill among Hong Kong women. Hong Kong Med J 2016;22:231-6.
10. The Family Planning Association of Hong Kong. 2014-2015 Annual report. Available from: http://www.famplan.org.hk/fpahk/en/template1.asp?style=template1.asp&content=about/annualreport.asp. Accessed 28 Apr 2016.
11. The Family Planning Association of Hong Kong. Youth sexuality in Hong Kong secondary school survey. Available from: http://www.famplan.org.hk/fpahk/en/template1.asp?content=info/research.asp. Accessed 28 Apr 2016.
12. Che FS. A study of the implementation of sex education in Hong Kong secondary schools. Sex Educ 2005;5:281-94. Crossref
13. Survey of life skills-based education on HIV/AIDS at junior level of secondary schools in Hong Kong. Red Ribbon Centre, Department of Health, Hong Kong SAR Government; 2014.
14. Wong WC, Lee A, Tsang KK, Lynn H. The impact of AIDS/sex education by schools or family doctors on Hong Kong Chinese adolescents. Psychol Health Med 2006;11:108-16. Crossref
15. Voqt C, Schaefer M. Disparities in knowledge and interest about benefits and risks of combined oral contraceptives. Eur J Contracept Reprod Health Care 2011;16:183-93. Crossref
16. Lim HJ, Lee MS, Cho YH, Kazumi U. A comparative study of knowledge about and attitudes toward the combined oral contraceptives among Korean and Japanese university students. Pharmacoepidemiol Drug Saf 2004;13:741-7. Crossref
17. Bachar R, Yogev Y, Fisher M, Geva A, Blumberg G, Kaplan B. Attitudes of mothers toward their daughters’ use of contraceptives in Israel. Contraception 2002;66:117-20. Crossref
18. Chan SS, Cheung TH, Lo WK, Chung TK. Women’s attitudes on human papillomavirus vaccination to their daughters. J Adolesc Health 2007;41:204-7. Crossref
19. Cibula D, Gompel A, Mueck AO, et al. Hormonal contraception and risk of cancer. Hum Reprod Update 2010;16:631-50. Crossref
20. Bitzer J. Oral contraceptives in adolescent women. Best Pract Res Clin Endocrinol Metab 2013;27:77-89. Crossref
21. Collaborative Group on Hormonal Factors in Breast Cancer. Breast cancer and hormonal contraceptives: collaborative reanalysis of individual data on 53 297 women with breast cancer and 100 239 women without breast cancer from 54 epidemiological studies. Lancet 1996;347:1713-27. Crossref
22. Charlton BM, Rich-Edwards JW, Colditz GA, et al. Oral contraceptive use and mortality after 36 years of follow-up in the Nurses’ Health Study: prospective cohort study. BMJ 2014;349:g6356. Crossref
23. Burkman R, Schlesselman JJ, Zieman M. Safety concerns and health benefits associated with oral contraception. Am J Obstet Gynecol 2004;190 (4 Suppl):S5-22. Crossref
24. Mansour D, Gemzell-Dianielsson K, Inki P, Jensen JT. Fertility after discontinuation of contraception: a comprehensive review of the literature. Contraception 2011;84:465-77. Crossref
25. Guillebaud J, MacGregor A. Contraception: your questions answered. 6th ed. London: Churchill Livingstone; 2013.
26. Gallo MF, Lopez LM, Grimes DA, Carayon F, Schulz KF, Helmerhorst FM. Combination contraceptives: effects on weight. Cochrane Database Syst Rev 2014;(1):CD003987. Crossref
27. Saluja G, Iachan R, Scheidt PC, Overpeck MD, Sun W, Giedd JN. Prevalence of and risk factors for depressive symptoms among young adolescents. Arch Pediatr Adolesc Med 2004;158:760-5. Crossref
28. Walker A, Bancroft J. Relationship between premenstrual symptoms and oral contraceptive use: a controlled study. Psychosom Med 1990;52:86-96. Crossref
29. Rosenthal SL, Cotton S, Ready JN, Potter LS, Succop PA. Adolescents’ attitudes and experiences regarding levonorgestrel 100 mcg/ethinyl estradiol 20 mcg. J Pediatr Adolesc Gynecol 2002;15:301-5. Crossref
30. Lee WS, Kim KI, Lee HJ, Kyung HS, Seo SS. The incidence of pulmonary embolism and deep vein thrombosis after knee arthroplasty in Asians remains low: a meta-analysis. Clin Orthop Relat Res 2013;471:1523-32. Crossref
31. Vinogradova Y, Coupland C, Hippisley-Cox J. Use of combined oral contraceptives and risk of venous thromboembolism: nested case-control studies using the QResearch and CPRD databases. BMJ 2015;350:h2135. Crossref
32. Zöller B, Ohlsson H, Sundquist J, Sundquist K. Family history of venous thromboembolism is a risk factor for venous thromboembolism in combined oral contraceptive users: a nationwide case-control study. Thromb J 2015;13:34. Crossref
33. Medical eligibility criteria for contraceptive use. 5th ed. Geneva: World Health Organization; 2015.
34. Yip BH, Sheng XT, Chan VW, Wong LH, Lee SW, Abraham AA. ‘Let’s talk about sex’—a knowledge, attitudes and practice study among paediatric nurses about teen sexual health in Hong Kong. J Clin Nurs 2015;24:2591-600. Crossref

A hospital-wide screening programme to control an outbreak of vancomycin-resistant enterococci in a large tertiary hospital in Hong Kong

Hong Kong Med J 2017 Apr;23(2):140–9 | Epub 24 Feb 2017
DOI: 10.12809/hkmj164939
© Hong Kong Academy of Medicine. CC BY-NC-ND 4.0
 
ORIGINAL ARTICLE
A hospital-wide screening programme to control an outbreak of vancomycin-resistant enterococci in a large tertiary hospital in Hong Kong
Christopher KC Lai, MB, ChB, FHKCPath1,2; Stephenie YN Wong, MB, BS, FHKCPath1,2; Shirley SY Lee, BSc (Nursing), MSC (Nursing)2; HK Siu, BSc (Statistics), MPhil (Social Medicine)3; CY Chiu, BSc (Biomedical Sciences), MSc (Medical Laboratory Sciences)4; Dominic NC Tsang, MB, BS, FHKCPath1,2,3; Margaret PY Ip, FRCP, FRCPath4; CT Hung, FANZCA, FHKAM (Anaesthesiology)5
1 Department of Pathology, Queen Elizabeth Hospital, Hong Kong
2 Infection Control Team, Queen Elizabeth Hospital, Hong Kong
3 Chief Infection Control Officer’s Office, Hospital Authority, Hong Kong
4 Department of Microbiology, The Chinese University of Hong Kong, Hong Kong
5 Queen Elizabeth Hospital, Hong Kong
 
Corresponding author: Dr Christopher KC Lai (laikcc@ha.org.hk)
 
 
Abstract
Introduction: Apart from individual small-scale outbreaks, infections with vancomycin-resistant enterococci are uncommon in Hong Kong. A major outbreak of vancomycin-resistant enterococci, however, occurred at a large tertiary hospital in 2013. We describe the successful control of this outbreak and share the lessons learned.
 
Methods: In 2013, there was an abnormal increase in the incidence of vancomycin-resistant enterococci carriage compared with baseline in multiple clinical departments at Queen Elizabeth Hospital. A multipronged approach was adopted that included a 10-week hospital-wide active screening programme, which aimed to identify and isolate hidden vancomycin-resistant enterococci carriers among all in-patients. The identified carriers were completely segregated in designated wards where applicable. Other critical infection control measures included directly observed hand hygiene and environmental hygiene. A transparent and open disclosure approach was adopted throughout the outbreak.
 
Results: The infection control measures were successfully implemented. The active screening of vancomycin-resistant enterococci was conducted between 30 September and 10 November 2013. A total of 7053 rectal swabs were collected from patients in 46 hospital wards from 11 departments. The overall carriage rate of vancomycin-resistant enterococci was 2.8% (201/7053). Pulsed-field gel electrophoresis showed a predominant outbreak clone. We curbed the outbreak and kept the colonisation of vancomycin-resistant enterococci among patients at a pre-upsurge low level.
 
Conclusions: We report the largest cohesive effort to control the spread of vancomycin-resistant enterococci in Hong Kong. Coupled with other infection control measures, we successfully returned vancomycin-resistant enterococci to the pre-outbreak level. We have demonstrated that such a monumental task can be achieved with meticulous planning, and thorough communication and understanding between all stakeholders.
 
 
New knowledge added by this study
  • This is the largest vancomycin-resistant enterococci control study ever conducted in Hong Kong.
  • We have demonstrated the infection control measures required in controlling a large outbreak in a Hong Kong public hospital setting.
  • The key infection control measures are active case finding followed by case-cohorting, directly observed hand hygiene, and equipment and environmental hygiene.
Implications for clinical practice or policy
  • Control of large infectious disease outbreaks and effective implementation of infection control measures can be achieved with meticulous planning, thorough communication, and understanding between all stakeholders.
 
 
Introduction
Vancomycin-resistant enterococci (VRE) are an important cause of health care–associated infection and are known to prolong hospital stay and to increase treatment cost, patient morbidity, and mortality.1 2 3 4 5 In this study, a VRE carrier was defined as any patient with VRE isolated from a clinical or surveillance specimen. The first case of VRE in Hong Kong was identified at Queen Elizabeth Hospital (QEH) in 1997.6 In 2010, VRE constituted 0.4% of all Enterococcus isolates. Apart from individual small-scale outbreaks,7 8 VRE had not gained a foothold in Hong Kong. Queen Elizabeth Hospital is the largest public acute general tertiary hospital under the administration of the Hospital Authority (HA), with 1800 beds, more than 160 000 admissions, and 104 000 in-patients treated annually. A major VRE outbreak occurred in QEH in 2013, with an abnormal increase in the incidence of VRE carriage in multiple clinical departments compared with baseline. Prior to this outbreak, VRE control measures were stipulated by the HA Guideline on Control of VRE; active screening was not mandatory but was usually performed in contact investigations after VRE was recovered from clinical specimens. The baseline incidence of VRE never exceeded five cases per week prior to December 2012. Thereafter the incidence crept up, and by March 2013 a total of 34 VRE carriers had been identified in week 13 alone. This study aimed to describe in detail the approach used to rapidly control VRE in our hospital.
 
Methods
Multipronged infection control measures for vancomycin-resistant enterococci
The hospital’s control measures can be divided into two phases according to their intensity, with the constitution of the QEH VRE Task Group marking the transition between them.
 
Emerging phase (1 January 2012 to 13 May 2013)
(1) Find and confine—active case finding by admission screening in high prevalence wards with additional weekly screening for outbreak wards. Carriers of VRE were cohorted in either a single room or designated cubicles with a mobile curtain as segregation. Signage for contact precautions was posted at the entrance to the cohort area and at the patients’ bedside. Gloves and gowns were worn when in contact with the patient or patient environment. All VRE cases and their contacts were tagged in the corporate electronic Clinical Management System.
(2) Hand hygiene—chlorhexidine-alcohol hand rub was used in clinical areas with high VRE prevalence. Only two visitors were allowed per VRE patient, and their hand hygiene compliance was monitored.
(3) Nursing care—all patients in Intensive Care Unit were bathed with chlorhexidine daily. Wards were advised that excreta and tube feeding should be handled by separate teams.
(4) Equipment and environment—we introduced colour-coding to all clinical wards. Two-in-one disinfectants and disposable wipes were provided to clinical wards to improve two-step cleaning. Dedicated non-critical patient care equipment was provided for all VRE cases. Hydrogen peroxide vaporisation sessions were used to disinfect non-critical patient care equipment. Cleaners were coached by infection control nurses and their performance was gauged by environmental sampling and fluorescence markers.
(5) Open disclosure—all outbreaks were disclosed through press release.
 
Intensive control phase (13 May 2013 to 10 November 2013)
(1) Command and control—a VRE Task Group was formed with clear administrative mandates from the Hospital Chief Executive, head of nursing, and head of administrative services. The Task Group included senior representatives from clinical departments, human resources, laboratories, and infection control teams. Weekly meetings were held. Local experts from HA Head Office, Centre for Health Protection, and a local university were also invited to jointly devise an intensive VRE control programme.
(2) Active screening—the pan-hospital VRE screening was the hallmark of this period and exemplified the determination of the hospital administration. Rectal swabs were collected to identify VRE carriers in stages: each ward performed a point prevalence screening followed by 2 weeks of admission and discharge screening. The screening of 46 hospital wards from 11 departments was to be completed within 10 weeks.
 
Carriage of VRE is associated with additional length of stay.1 A sudden surge in VRE cases would block admissions and force redirection of emergency admissions to other hospitals. Based on prevalence figures from contact investigations in previous localised VRE outbreaks (range, 0%-20%), together with bed status and occupancy rates, we estimated that, if 10% of our in-patients were VRE carriers, 126 VRE cases would be identified on the first day of screening alone and 566 by the end of screening. To avoid overwhelming the hospital services due to inadequate isolation facilities, a modified risk-based pan-hospital screening was adopted with consideration of the following parameters: daily number of specimens, daily number of VRE carriers identified, consequent additional length of stay, and designated cohort ward capacity. The final schedule specified the exact number of specimens to be taken by ward and date over a 10-week period and was agreed by all stakeholders.
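The capacity projection above is simple arithmetic; a minimal sketch illustrates how the 126 and 566 figures arise. The census numbers below are hypothetical, only the assumed 10% carriage rate is stated in the text.

```python
# Sketch of the screening-capacity estimate; census figures are hypothetical,
# only the assumed 10% carriage rate comes from the article.

def expected_carriers(n_screened: int, assumed_prevalence: float) -> int:
    """Expected number of VRE carriers among screened in-patients."""
    return round(n_screened * assumed_prevalence)

first_day_census = 1260   # hypothetical number screened on day 1
total_screened = 5660     # hypothetical total screened over 10 weeks

print(expected_carriers(first_day_census, 0.10))  # 126 carriers on day 1
print(expected_carriers(total_screened, 0.10))    # 566 by the end of screening
```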
 
To segregate VRE carriers, a VRE ward was created to avert cross-transmission. Bed capacity was ‘created’ by rescheduling elective procedures from both medical and surgical teams.
 
To avoid inadvertently overloading the hospital’s capacity during active screening, two ‘brake points’ were set: the number of patients waiting at the emergency department at 7 am each morning for emergency hospital admission should not exceed 30, and the total number of VRE cases identified should not exceed 25 per day. If either point was reached, screening on that particular day would stop. A real-time close monitoring communication group using instant messaging (WhatsApp) was formed to connect all key stakeholders on a 24/7 basis.
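The two brake points amount to a daily go/no-go check. A minimal sketch of that rule (function and variable names are ours, not from the article):

```python
# Daily go/no-go check for the two 'brake points' described above.
AE_QUEUE_LIMIT = 30    # patients awaiting emergency admission at 7 am
DAILY_VRE_LIMIT = 25   # new VRE carriers identified per day

def may_screen_today(ae_queue_at_7am: int, new_vre_today: int) -> bool:
    """Screening proceeds only while neither brake point is exceeded."""
    return ae_queue_at_7am <= AE_QUEUE_LIMIT and new_vre_today <= DAILY_VRE_LIMIT

print(may_screen_today(28, 20))  # True: both thresholds respected
print(may_screen_today(31, 20))  # False: emergency queue brake point hit
```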
 
Other additional measures included:
  • Hand hygiene—audit results of hand hygiene compliance were reported to department and hospital administration on a weekly basis. Alcoholic hand rub dispensers were installed in patient toilets. Hand hygiene before meals and medications was directly observed in all conscious hospitalised patients.
  • Nursing care—disposable disinfection wipes were provided to optimise disinfection of commodes, bedpans, and urinals. On-site coaching was provided by infection control nurses about contamination-prone procedures, particularly napkin change and care for nasogastric tube.
  • Equipment and environment—we increased cleaning staff manpower by recruiting additional external cleaning staff and instigating an overtime allowance for existing staff. The frequency of changing privacy curtains was shortened from monthly to biweekly for all VRE carriers. Cleaning efficacy was monitored by regular environmental sampling using Polywipe (Medical Wire & Equipment, Wiltshire, United Kingdom) in wards where the outbreak was detected.
  • Staff engagement, education, and communication—staff forums were organised so all parties would understand the importance of VRE and their role as health care workers, with dedicated sessions in Cantonese for supporting staff.
  • Open disclosure—the result from the pan-hospital screening was released to hospital administration and HA head office on a daily basis.
 
Laboratory protocol
Rectal swabs and stool specimens were inoculated onto chromID VRE agar (bioMérieux, France) and incubated at 35°C ± 2°C according to the manufacturer’s recommendations. The agar plates were examined daily for 2 days. Suspected colonies were identified as Enterococcus species by both MALDI-TOF mass spectrometry (Vitek-MS; bioMérieux) and conventional microbiological methods (Gram stain and biochemical reactions). Vancomycin resistance was confirmed by E-test (bioMérieux, France) according to Clinical and Laboratory Standards Institute breakpoints.9 Detection of vancomycin resistance genes was performed by the local reference laboratory. Strains were typed by pulsed-field gel electrophoresis (PFGE), and patterns of SmaI-restricted chromosomal DNA were analysed by the unweighted pair group method with arithmetic mean (UPGMA) using BioNumerics software (Applied Maths).10
 
Hand hygiene compliance audit
We adopted the World Health Organization (WHO) hand hygiene observation tools by directly observing compliance with the WHO five moments. The observation was conducted by infection control nurses using a WHO standardised audit form. Nurses, supporting staff, doctors, and allied health personnel were included for observation.
 
Antibiotics consumption data
The consumption of vancomycin, ceftazidime, and ceftriaxone in QEH between week 1 of 2012 and week 39 of 2015 was retrieved from the Clinical Data Analysis and Reporting System. Consumption data were presented in defined daily dose.
 
Statistical analysis
The relationship between VRE carriage (a binary dependent variable) and five independent variables related to patients’ demographic background and hospitalisation history was analysed by univariate methods: the Chi squared test, supplemented with a measure of association (odds ratio for binary variables; Spearman’s correlation for ordinal variables). Significant independent variables were then included in a multiple logistic regression model. The 30-day mortality between groups was compared by the Chi squared test.
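For a binary risk factor, the univariate step reduces to a Pearson chi-squared statistic and an odds ratio on a 2×2 table. The sketch below uses hypothetical counts purely for illustration (the actual analysis was run in SPSS):

```python
# Pearson chi-squared statistic and odds ratio for a 2x2 table
# [[a, b], [c, d]] = [[exposed carriers, exposed non-carriers],
#                     [unexposed carriers, unexposed non-carriers]].

def chi_squared_2x2(a, b, c, d):
    """Pearson chi-squared statistic (1 degree of freedom)."""
    n = a + b + c + d
    observed = [a, b, c, d]
    expected = [(a + b) * (a + c) / n, (a + b) * (b + d) / n,
                (c + d) * (a + c) / n, (c + d) * (b + d) / n]
    return sum((o - e) ** 2 / e for o, e in zip(observed, expected))

def odds_ratio(a, b, c, d):
    """Odds of carriage in the exposed group relative to the unexposed."""
    return (a * d) / (b * c)

# Hypothetical counts: 30/100 exposed vs 10/100 unexposed patients carry VRE
print(round(chi_squared_2x2(30, 70, 10, 90), 2))  # 12.5
print(round(odds_ratio(30, 70, 10, 90), 2))       # 3.86
```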
 
In the multiple logistic regression, one category of each independent variable was selected as the ‘reference category’ against which the other categories were compared, and odds ratios were calculated. The likelihood ratio test was used to compare the final model with the null model, and the Hosmer-Lemeshow test was used to evaluate the goodness-of-fit of the final model. The Statistical Package for the Social Sciences (SPSS Windows version 21.0; IBM Corp, Armonk [NY], United States) was used for data analysis.
 
Results
Our multipronged infection control measures successfully brought VRE down to the pre-outbreak level. Prior to screening, 150 non-emergency procedures were rescheduled. The screening was conducted between 30 September and 10 November 2013. A total of 7053 specimens from 4966 patients were collected—1422 at point prevalence screening, 4225 on admission, and 1406 on discharge (Table 1). We completed the screening schedule without reaching the brake points.
 

Table 1. Results of the pan-hospital screening of VRE based on clinical departments
 
The baseline incidence of VRE never exceeded five per week prior to the current outbreak. After December 2012, it crept up and peaked at week 13 of 2013 with 34 new VRE cases identified. After the pan-hospital screening, the incidence dropped to no more than five cases per week after March 2015 (Fig 1).
 

Figure 1. VRE epidemiology in Queen Elizabeth Hospital from January 2012 to September 2015
 
Of all the specimens screened, 2.8% (201/7053) were positive for VRE; 65.7% (132/201) of positive specimens came from the specialty of medicine and 19.9% (40/201) from the surgical stream (all surgical subspecialties except neurosurgery and orthopaedics). The point prevalence of VRE was 5.8% (83/1422), admission prevalence was 1.8% (75/4225), and discharge prevalence was 3.1% (43/1406). Risk factors for VRE carriage included male gender, residence in a home for the elderly, older age, longer hospital stay, and more hospitalisation episodes in the 90 days prior to screening (Table 2). Logistic regression showed, relative to the reference group, a progressive increase in the risk of VRE carriage with increasing age and with increasing days of hospitalisation in the 90 days prior to screening, but not with increasing episodes of hospitalisation in that period (Table 3).
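The carriage rates quoted above follow directly from the raw counts; a quick arithmetic check:

```python
# Verifying the reported VRE carriage rates from the raw counts.
overall = 201 / 7053      # all specimens
point = 83 / 1422         # point prevalence screening
admission = 75 / 4225     # admission screening
discharge = 43 / 1406     # discharge screening

print(f"{overall:.1%} {point:.1%} {admission:.1%} {discharge:.1%}")
# 2.8% 5.8% 1.8% 3.1%
```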
 

Table 2. Demographic data for VRE-positive patients
 

Table 3. Logistic regression results (outcome: patient ever has VRE-positive result = yes)
 
Infection control measures
A total of 28 588 hand hygiene observations were made in 2013. The compliance rate improved from 37% in the first quarter of 2013 to 73% in the fourth quarter of 2013. The improvement was seen across all departments and all staff groups. A total of 30 sessions of on-site education about napkin change, nasogastric tube care, and environmental cleaning were provided with 88 napkin care procedures observed in 28 wards. Furthermore, 37 hydrogen peroxide vapour sessions were offered to disinfect non-critical equipment; and 15 staff forums dedicated to VRE control were held with a total of 1339 attendances.
 
Microbiology
During the screening period, all 105 VRE isolates recovered from the pan-hospital screening were vanA gene–carrying Enterococcus faecium. They were analysed together with eight unrelated archived VRE strains. The PFGE patterns of SmaI-restricted chromosomal DNA of the 113 VRE isolates are shown in Figure 2. A dendrogram of PFGE patterns was obtained by the UPGMA method. A predominant cluster A was classified using a cut-off at 90% similarity, calculated by Dice coefficient with 1% position tolerance and 2% band optimisation. Cluster A comprised 49 strains from the current pan-hospital screening and one unrelated archived strain from another hospital.
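The Dice band-matching coefficient underlying the 90% cut-off can be sketched as follows. Band positions here are hypothetical, and BioNumerics’ position tolerance and band optimisation are omitted for clarity.

```python
# Dice coefficient between two PFGE band patterns, modelled as sets of
# band sizes (kb). Real typing software also applies a position tolerance.

def dice_similarity(bands_a: set, bands_b: set) -> float:
    """2 x shared bands / total bands in both patterns."""
    shared = len(bands_a & bands_b)
    return 2 * shared / (len(bands_a) + len(bands_b))

isolate_1 = {48.5, 97.0, 145.5, 194.0, 242.5}  # hypothetical band sizes
isolate_2 = {48.5, 97.0, 145.5, 194.0, 291.0}

print(dice_similarity(isolate_1, isolate_2))  # 0.8: below the 90% cut-off
```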
 

Figure 2. PFGE patterns of SmaI-restricted chromosomal DNA of 113 VRE isolates
 
Carriage of vancomycin-resistant enterococci and 30-day mortality
During the pan-hospital screening period, 30-day all-cause mortality was 20.5% in VRE carriers identified by the screening and 6.1% in non-carriers (odds ratio=3.93; 95% confidence interval, 2.68-5.78). Compared with previously known VRE carriers who screened negative during the same period, mortality was 20.5% versus 13.6% (odds ratio=1.64; 95% confidence interval, 0.71-3.76).
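An odds ratio of this kind, with its confidence interval, can be reproduced from a 2×2 mortality table using the Woolf (log) method. The counts below are hypothetical, chosen only to be roughly consistent with the reported rates; the article itself reports only the percentages and the intervals.

```python
from math import exp, log, sqrt

def or_with_ci(a, b, c, d, z=1.96):
    """Odds ratio and Woolf 95% CI for [[died, survived], [died, survived]]."""
    or_ = (a * d) / (b * c)
    se = sqrt(1 / a + 1 / b + 1 / c + 1 / d)  # standard error of log(OR)
    return or_, exp(log(or_) - z * se), exp(log(or_) + z * se)

# Hypothetical counts: 41/201 VRE carriers vs 290/4765 non-carriers died
print(or_with_ci(41, 160, 290, 4475))  # OR about 3.95, near the reported 3.93
```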
 
Antibiotic consumption
There was no significant change in the consumption of vancomycin or ceftazidime during the emerging phase, or during and beyond the intensive control phase. An apparent increase in the consumption of ceftriaxone was noted after the intensive phase, in the first half of 2014 (Fig 1).
 
Discussion
Identification of VRE carriers, segregation of primary sources, hand hygiene, and environmental hygiene were the critical success factors in controlling the VRE outbreak. The territory-wide effort to control the emergence of VRE in public hospitals in Hong Kong has been discussed elsewhere.11 Our study revealed the critical elements involved in controlling a multi-sourced VRE outbreak in a major tertiary hospital. We believe our failure to contain VRE in the emerging phase was partly due to staff’s lack of a perceived need for VRE control, as well as scepticism about the effectiveness of infection control measures. Senior clinicians may have been ambivalent towards our approach owing to a perceived loss of autonomy. Frontline staff rebuffed the screening programme because they anticipated extra work and doubted its effectiveness. Overseas experience has shown that once VRE becomes endemic in a hospital, eradication is difficult despite the best efforts.12 13 14
 
We faced an additional challenge of an absence of facilities to completely segregate VRE carriers. Our hospital faces overcrowding on a daily basis with bed occupancy often exceeding 100%, and reaching as high as 130% during influenza seasons. Studies have shown that bed occupancy, isolation room availability, and staffing have a direct impact on ease of VRE control.15 16 Our difficulties were compounded by lack of inter-bed spacing and limited toilet facilities as the hospital was designed more than 60 years ago, and the need to keep the hospital functioning at all times.
 
In the intensive control phase, commitment from hospital administration became visible as a result of the pan-hospital screening. Close liaison between departments, careful and extensive planning with input from the frontline at every step, effective communication, and staff engagement were also key to our success. Some researchers have questioned the effectiveness of active surveillance cultures in reducing VRE transmission.17 Others have suggested that VRE will not be successfully controlled if the policy excludes asymptomatic VRE colonisation.18 19 20 21 We believed it was necessary to take drastic action and perform active screening of the whole hospital.
 
Our planning took reference from similar overseas experiences. Christiansen et al18 successfully controlled VRE by screening 19 658 patients over 6 months and found 169 patients from 23 wards to be colonised with vanB-containing Enterococcus faecium; their experience differed from ours in that they had fewer cases. Moretti et al19 reported extensive active surveillance with enhanced infection control measures in a Brazilian teaching hospital: they performed 8692 rectal swabs for VRE (mean, 300 swabs/month) with an overall positive rate of 3.7%, and over their 2.5-year intervention the VRE-positive rate decreased from 7.2% in 2007 to 1.5% in 2009. Kurup et al20 reported their experience in a large Singaporean hospital, screening 4924 patients over 2 months and successfully reducing the positive rate from 11.4% at the peak of the outbreak to 4.2% at the end of screening. We did not observe a decline over the pan-hospital screening period as in the Singaporean experience, because we deliberately spaced out the departments with a high VRE prevalence throughout the 10-week period to avoid overwhelming the hospital’s facilities.
 
Rapid laboratory turnaround time is another key element.22 It was soon evident that the hospital laboratory could not handle the additional specimens alone, so assistance was sought from three other HA microbiology laboratories. Extensive liaison and communication between laboratory directors, senior medical technologists, and scientific officers followed to ensure the smooth running of this unprecedented inter-laboratory cooperation. A unified set of logistics was established, governing even the smallest details. Procurement of key reagents such as chromogenic agar was coordinated centrally with support from the HA head office.
 
Hygiene management has been shown to be important in controlling VRE in endemic areas.16 23 24 Contamination of the hospital environment by VRE, and occurrence of cross-contamination, either through the hands of health care workers, equipment, or surfaces is well known.25 26 27 The association of environmental contamination and the occurrence of an outbreak has also been well established.18 28 29 30
 
The improvement in hand hygiene compliance from approximately 37% to 73% was remarkable. Several explanations are postulated: (1) the VRE Task Group escalated the need for urgent improvement. The weekly reporting of hand hygiene compliance rate via the VRE workgroup created a driving force at an administrative level; (2) we implemented directly observed hand hygiene before meals and when taking medications in all conscious hospitalised patients; (3) we actively engaged infection control link nurses, creating a collective learning opportunity that has facilitated collaboration and system thinking; and (4) making the hand hygiene compliance data visible (and comparing with other wards/departments) might change the behaviour of many.31
 
All the VRE recovered in the pan-hospital screening were vanA-containing Enterococcus faecium. The PFGE patterns showed that 49 of the 105 pan-hospital screening isolates belonged to a single cluster (cluster A), signifying the possibility of clonal spread of a dominant strain with co-circulation of various less dominant strains. Some clones may have developed de novo. Further analysis of these strains will allow a more thorough understanding of the transmission dynamics within the hospital, and of whether the outbreak clone has a survival advantage over other clones.
 
We identified residence in a home for the elderly, advanced age, and prolonged hospitalisation as risk factors for VRE carriage. These associations are most likely due to the associated co-morbidities rather than the individual factors per se. It is unknown why men were at higher risk than women; it might have been a chance finding, since more outbreaks occurred in male wards than in female wards before and during the study period.
 
Antibiotics, especially vancomycin and third-generation cephalosporins such as ceftazidime and ceftriaxone, are known risk factors for VRE colonisation. We did not observe significant changes in the consumption of vancomycin or ceftazidime throughout the study period. Nonetheless, an increase in consumption of ceftriaxone was observed in the first half of 2014. We hypothesise that this increase might be a squeeze-the-balloon effect from active avoidance of ‘big gun’ antibiotics, or an artefact due to irregularities in returning ward antibiotic stock to the hospital pharmacy.
 
We observed a significant increase in 30-day mortality in VRE carriers identified in the pan-hospital screening compared with those who tested negative for VRE during the same period. However, when we compared the VRE carriers identified in the pan-hospital screening with those who tested VRE negative but were known to have had previous VRE carriage, the difference was not significant. Confounding factors such as length of hospitalisation and co-morbidities are likely causes of this observation; further analysis of these factors is required for a more definitive answer.
 
The pan-hospital screening was immediately followed by a 10-week HA-wide targeted surveillance screening: any patient with a history of admission to any hospital in Hong Kong within the previous 3 months, or on haemodialysis, was actively screened for VRE on admission. The VRE level has remained low in the 3 years since the intensive period ended in 2013. This is important because a one-time effort is often difficult and does not always have a lasting effect unless a system and culture change has been brought about.
 
A limitation of this study is its retrospective design: odds ratios were studied after both the exposure and the outcomes had already occurred, in contrast to prospective cohort studies in which participants are enrolled and then followed over time for the occurrence of VRE carriage. In addition, sustained control of VRE is multifactorial and not attributable to any single intervention. Although there were no concurrent large-scale outbreaks or VRE control programmes in other hospitals, interdependence among hospitals and other health care facilities is well described.
 
Conclusions
We have successfully controlled a multiple-sourced hospital-wide VRE outbreak in a tertiary hospital with multipronged infection control measures. The need to establish a close working relation between all stakeholders in the hospital cannot be overemphasised. Our experience is useful to other hospitals challenged by VRE or other multidrug-resistant bacteria.
 
Acknowledgements
We are grateful to all the medical, nursing, and supporting staff in Queen Elizabeth Hospital who assisted in VRE control. We thank the microbiology laboratories of Princess Margaret Hospital, United Christian Hospital, and Queen Mary Hospital for their excellent support in handling VRE specimens during pan-hospital screening.
 
Declaration
We would like to acknowledge the Food and Health Bureau, Hong Kong SAR for supporting the typing of the VRE strains under Health and Medical Research Fund (Commissioned HMRF Project No. CU-15-B5).
 
References
1. Cheah AL, Spelman T, Liew D, et al. Enterococcal bacteraemia: factors influencing mortality, length of stay and costs of hospitalization. Clin Microbiol Infect 2013;19:E181-9. Crossref
2. Vergis EN, Hayden MK, Chow JW, et al. Determinants of vancomycin resistance and mortality rates in enterococcal bacteremia: A prospective multicenter study. Ann Intern Med 2001;135:484-92. Crossref
3. Lloyd-Smith P, Younger J, Lloyd-Smith E, Green H, Leung V, Romney MG. Economic analysis of vancomycin-resistant enterococci at a Canadian hospital: assessing attributable cost and length of stay. J Hosp Infect 2013;85:54-9. Crossref
4. Carmeli Y, Eliopoulos G, Mozaffari E, Samore M. Health and economic outcomes of vancomycin-resistant enterococci. Arch Intern Med 2002;162:2223-8. Crossref
5. Muto CA, Giannetta ET, Durbin LJ, Simonton BM, Farr BM. Cost-effectiveness of perirectal surveillance cultures for controlling vancomycin-resistant Enterococcus. Infect Control Hosp Epidemiol 2002;23:429-35. Crossref
6. Chuang VW, Tsang DN, Lam JK, Lam RK, Ng WH. An active surveillance study of vancomycin-resistant Enterococcus in Queen Elizabeth Hospital, Hong Kong. Hong Kong Med J 2005;11:463-71.
7. Cheng VC, Tai JW, Ng ML, et al. Extensive contact tracing and screening to control the spread of vancomycin-resistant Enterococcus faecium ST414 in Hong Kong. Chin Med J 2012;125:3450-7.
8. Cheng VC, Chan JF, Tai JW, et al. Successful control of vancomycin-resistant Enterococcus faecium outbreak in a neurosurgical unit at non-endemic region. Emerg Health Threats J 2009;2:e9. Crossref
9. Clinical and Laboratory Standards Institute (CLSI). Performance standards for antimicrobial susceptibility testing: twenty-third informational supplement M100-S23. Wayne, PA: CLSI; 2013.
10. Miranda AG, Singh KV, Murray BE. DNA fingerprinting of Enterococcus faecium by pulsed-field gel electrophoresis may be a useful epidemiologic tool. J Clin Microbiol 1991;29:2752-7.
11. Cheng VC, Tai JW, Chau PH, et al. Successful control of emerging vancomycin-resistant enterococci by territory-wide implementation of directly observed hand hygiene in patients in Hong Kong. Am J Infect Control 2016;44:1168-71. Crossref
12. Willems RJ, Top J, van Santen M, et al. Global spread of vancomycin-resistant Enterococcus faecium from distinct nosocomial genetic complex. Emerg Infect Dis 2005;11:821-8. Crossref
13. Arias CA, Murray BE. The rise of the Enterococcus: beyond vancomycin resistance. Nat Rev Microbiol 2012;10:266-78. Crossref
14. Werner G, Coque TM, Hammerum AM, et al. Emergence and spread of vancomycin resistance among enterococci in Europe. Euro Surveill 2008;13:pii=19046.
15. Arias CA, Mendes RE, Stilwell MG, Jones RN, Murray BE. Unmet needs and prospects for oritavancin in the management of vancomycin-resistant enterococcal infections. Clin Infect Dis 2012;54 Suppl 3:S233-8. Crossref
16. Aumeran C, Baud O, Lesens O, Delmas J, Souweine B, Traoré O. Successful control of a hospital-wide vancomycin-resistant Enterococcus faecium outbreak in France. Eur J Clin Microbiol Infect Dis 2008;27:1061-4. Crossref
17. Huskins WC, Huckabee CM, O’Grady NP, et al. Intervention to reduce transmission of resistant bacteria in intensive care. N Engl J Med 2011;364:1407-18. Crossref
18. Christiansen KJ, Tibbett PA, Beresford W, et al. Eradication of a large outbreak of a single strain of vanB vancomycin-resistant Enterococcus faecium at a major Australian teaching hospital. Infect Control Hosp Epidemiol 2004;25:384-90. Crossref
19. Moretti ML, de Oliveira Cardoso LG, Levy CE, et al. Controlling a vancomycin-resistant enterococci outbreak in a Brazilian teaching hospital. Eur J Clin Microbiol Infect Dis 2011;30:369-74. Crossref
20. Kurup A, Chlebicki MP, Ling ML, et al. Control of a hospital-wide vancomycin-resistant Enterococci outbreak. Am J Infect Control 2008;36:206-11. Crossref
21. Lee SC, Wu MS, Shih HJ, et al. Identification of vancomycin-resistant enterococci clones and inter-hospital spread during an outbreak in Taiwan. BMC Infect Dis 2013;13:163. Crossref
22. Delmas J, Robin F, Schweitzer C, Lesens O, Bonnet R. Evaluation of a new chromogenic medium, ChromID VRE, for detection of vancomycin-resistant Enterococci in stool samples and rectal swabs. J Clin Microbiol 2007;45:2731-3. Crossref
23. Nolan SM, Gerber JS, Zaoutis T, et al. Outbreak of vancomycin-resistant enterococcus colonization among pediatric oncology patients. Infect Control Hosp Epidemiol 2009;30:338-45. Crossref
24. Morris-Downes M, Smyth EG, Moore J, et al. Surveillance and endemic vancomycin-resistant enterococci: some success in control is possible. J Hosp Infect 2010;75:228-33. Crossref
25. Ramsey AM, Zilberberg MD. Secular trends of hospitalization with vancomycin-resistant enterococcus infection in the United States, 2000-2006. Infect Control Hosp Epidemiol 2009;30:184-6. Crossref
26. Muto CA, Jernigan JA, Ostrowsky BE, et al. SHEA guideline for preventing nosocomial transmission of multidrug-resistant strains of Staphylococcus aureus and enterococcus. Infect Control Hosp Epidemiol 2003;24:362-86.
27. Morris JG Jr, Shay DK, Hebden JN, et al. Enterococci resistant to multiple antimicrobial agents, including vancomycin. Establishment of endemicity in a university medical center. Ann Intern Med 1995;123:250-9. Crossref
28. Rossini FA, Fagnani R, Leichsenring ML, et al. Successful prevention of the transmission of vancomycin-resistant enterococci in a Brazilian public teaching hospital. Rev Soc Bras Med Trop 2012;45:184-8. Crossref
29. Boyce JM. Environmental contamination makes an important contribution to hospital infection. J Hosp Infect 2007;65 Suppl 2:50-4. Crossref
30. Harris AD. How important is the environment in the emergence of nosocomial antimicrobial-resistant bacteria? Clin Infect Dis 2008;46:686-8. Crossref
31. Kirkland KB, Homa KA, Lasky RA, Ptak JA, Taylor EA, Splaine ME. Impact of a hospital-wide hand hygiene initiative on healthcare-associated infections: results of an interrupted time series. BMJ Qual Saf 2012;21:1019-26. Crossref

Ten-year review of survival and management of malignant glioma in Hong Kong

Hong Kong Med J 2017 Apr;23(2):134–9 | Epub 2 Dec 2016
DOI: 10.12809/hkmj164879
© Hong Kong Academy of Medicine. CC BY-NC-ND 4.0
 
ORIGINAL ARTICLE
Ten-year review of survival and management of malignant glioma in Hong Kong
Danny TM Chan, FRCS, FHKAM (Surgery); Sonia YP Hsieh, MB, BS, MSc; Claire KY Lau, MSc; Michael KM Kam, FRCR, FHKAM (Radiology); Herbert HF Loong, MB, BS, MRCP (UK); WK Tsang, FRCR, FHKAM (Radiology); Darren MC Poon, FRCR, FHKAM (Radiology); WS Poon, FRCS, FHKAM (Surgery)
CUHK Otto Wong Brain Tumour Centre, 1/F, Sir Yue-kong Pao Centre for Cancer, Prince of Wales Hospital, Shatin, Hong Kong
 
Corresponding author: Dr Danny TM Chan (tmdanny@surgery.cuhk.edu.hk)
 
 
Abstract
Introduction: Surgical resection used to be the mainstay of treatment for glioma. In the last decade, however, opinion has changed about the goal of surgical resection in treating glioma. Ample evidence shows that maximum safe resection in glioblastoma improves survival. Neurosurgeons have therefore revised their objective of surgery from diagnostic biopsy or limited debulking to maximum safe resection. Given these changes in the management of glioma, we compared the survival of local Chinese patients with glioblastoma multiforme over a period of 10 years.
 
Methods: We retrospectively reviewed the data of the brain tumour registry of the CUHK Otto Wong Brain Tumour Centre in Hong Kong. Data of patients with glioblastoma multiforme were reviewed for two periods, during 1 January 2003 to 31 December 2005 and 1 January 2010 to 31 December 2012. Overall survival during these two periods of time was assessed by Kaplan-Meier survival estimates. Risk factors including age, type and extent of resection, use of chemotherapy, and methylation status of O6-methylguanine-DNA methyltransferase were also assessed.
 
Results: There were 26 patients with glioblastoma multiforme with a mean age of 52.2 years during 2003 to 2005, and 42 patients with a mean age of 55.1 years during 2010 to 2012. The median overall survival during these two periods was 7.4 months and 12.7 months, respectively (P<0.001). The proportion of patients who underwent surgical resection was similar: 69.2% in 2003 to 2005 versus 78.6% in 2010 to 2012 (P=0.404). A higher proportion of patients achieved total removal in 2010 to 2012 than in 2003 to 2005 (35.7% and 7.7%, respectively; P=0.015). During 2010 to 2012, patients who were given concomitant chemoradiotherapy survived significantly longer than those who were not (17.9 months vs 4.5 months; P=0.001). The proportion of patients who survived 2 years after surgery increased from 11.5% in 2003 to 2005 to 21.4% in 2010 to 2012.
 
Conclusions: Hong Kong has made substantial improvements in the management of glioblastoma multiforme over the last decade, with correspondingly improved survival outcomes. The combination of an aggressive surgical strategy and concomitant chemoradiotherapy is probably the driving force for the improvement.
 
 
New knowledge added by this study
  • Maximum safe resection of glioblastoma multiforme (GBM) is feasible and has improved survival of patients over the last decade.
  • Concomitant chemoradiotherapy has been shown to improve overall survival of patients with GBM.
Implications for clinical practice or policy
  • A combined multidisciplinary approach with surgery, radiotherapy, and chemotherapy should be adopted for treatment of GBM.
 
 
Introduction
Glioblastoma multiforme (GBM) is a malignant primary brain tumour with an incidence of 1 to 2 per 100 000 population in Hong Kong.1 The survival of patients with GBM remains dismal, mainly due to its inevitable progression and recurrence. Little progress was made until the last decade. The establishment of concomitant chemoradiotherapy (CCRT) with temozolomide (TMZ), and the discovery that O6-methylguanine-DNA methyltransferase (MGMT) promoter methylation is associated with significantly better outcome, were the two major and inspiring breakthroughs.2 3 Before these developments, treatment for GBM was uniform but of limited efficacy, comprising surgery and irradiation only.
 
In 2001, TMZ was first used in the treatment of recurrent high-grade glioma in Hong Kong. Its favourable anti-tumour activity and acceptable safety profile were proven in a local study.4 In 2005, TMZ was the first chemotherapy to show objective survival benefit as a primary treatment when used together with radiotherapy as part of CCRT in GBM.2 Since then, CCRT for GBM has become the norm in Hong Kong.
 
In the last decade, opinion has changed about the goal of surgical resection in treating glioma. Ample evidence has shown that maximum safe resection in GBM improves survival. Neurosurgeons have therefore revised their objective of surgery from diagnostic biopsy or limited debulking to maximum safe resection. Given the infiltrative nature of the tumour, surgeons face the demanding task of balancing maximum resection against safe surgery. Awake craniotomy and cortical mapping are two essential techniques that enable safe resection.5 The goal of maximum resection can be supported by fluorescence-guided surgery with 5-aminolevulinic acid (5-ALA).6
 
Given these changes to the management of GBM, we therefore analysed the changes in overall survival of GBM over the past 10 years in Hong Kong.
 
Methods
Data were retrieved from the brain tumour registry of the Chinese University of Hong Kong Otto Wong Brain Tumour Centre. The registry has collected data on all patients with histologically proven glioma diagnosed in the institute since January 2003; patients aged 18 years or above were included. Patients with histologically confirmed World Health Organization grade IV GBM diagnosed during January 2003 to December 2005 and January 2010 to December 2012 were recruited and grouped by period. Patients treated between 2006 and 2009 were excluded because the surgical policy was evolving and the availability of chemotherapy was variable during that period; they would therefore have formed a heterogeneous group with various treatments determined by availability or affordability. Patients with an unstable neurological condition, or who were considered a poor medical risk after surgery resulting in a Karnofsky Performance Scale score below 70, were excluded, as were those who received initial chemotherapeutics other than TMZ, ie procarbazine, lomustine, vincristine, or bevacizumab. Data on type of surgery, extent of resection, tumour histology, and irradiation and chemotherapy parameters were collected, as well as patient age and gender. The registry defined the date of death according to the electronic patient record in the Hospital Authority Clinical Management System; for patients who defaulted from clinical follow-up, the date of death was ascertained by telephone. The study end date was 30 June 2015.
 
During 2003 to 2005, all patients were treated with surgical resection and adjuvant radiotherapy; TMZ was only used in patients with recurrent disease. Ability to pay for chemotherapy was the key determinant of its application and utility. In our hospital, TMZ was prescribed at a dose of 200 mg/m2 once per day for 5 days in a 28-day cycle.
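The per-day dose on this schedule is scaled to body surface area (BSA). As a minimal sketch, assuming the commonly used Du Bois formula (the paper does not state which BSA formula was applied) and a hypothetical patient:

```python
def tmz_daily_dose(height_cm, weight_kg, dose_per_m2=200.0):
    """Per-day temozolomide dose in mg, scaled to body surface area.

    BSA is estimated with the Du Bois formula (an assumption; the
    source does not state which BSA formula the hospital used):
        BSA = 0.007184 * height_cm**0.725 * weight_kg**0.425
    """
    bsa = 0.007184 * height_cm ** 0.725 * weight_kg ** 0.425
    return dose_per_m2 * bsa

# Hypothetical 170 cm, 65 kg patient on the 200 mg/m2, 5-days-per-cycle schedule
daily_mg = tmz_daily_dose(170, 65)   # roughly 350 mg/day
```

In practice the calculated dose is rounded to available capsule strengths; the sketch above shows only the BSA scaling step.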
 
With regard to the contouring methodology of irradiation, either the European Organisation for Research and Treatment of Cancer or the Radiation Therapy Oncology Group protocol was chosen according to serial assessment of both pre- and post-operative magnetic resonance imaging (MRI) scans. A total dose of 60 Gy was delivered to the tumour bed and its adjacent tissue in 30 fractions of 2 Gy each.
 
Since 2009, neuroradiologists have been responsible for assessing the extent of resection by MRI on postoperative day 1. Total resection was defined as no remaining contrast enhancement on MRI T1-weighted and subtraction scans of T1 plain with T1 plus contrast. For patients in whom the enhancing lesion was still noticeable, the resection was categorised as debulking.
 
In the 2010-2012 cohorts, TMZ was recommended to all patients. The dosage was 75 mg/m2/day concomitant with radiotherapy, then 150-200 mg/m2/day on the first 5 days every 4 weeks for 6 cycles, in accordance with the regimen described by Stupp et al.2 Methylation status of the MGMT was detected using methylation-specific polymerase chain reaction at our institution. The method has been explained in detail in one of our earlier studies.7 Survival was calculated from the date of surgery for brain tumour to death. Kaplan-Meier survival curves were used to compare different groups of biopsy versus surgical resection and chemoradiotherapy versus radiotherapy alone.
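The survival comparisons rest on the Kaplan-Meier product-limit estimator. A minimal pure-Python sketch of the median-survival calculation, using illustrative follow-up numbers rather than patient data:

```python
from fractions import Fraction

def km_median(times, events):
    """Kaplan-Meier estimate of median survival.

    times  : follow-up in months
    events : 1 = death observed, 0 = censored
    Returns the earliest time at which the survival curve S(t)
    drops to 0.5 or below, or None if the median is not reached.
    """
    # Sort by time; at tied times, deaths are handled before censorings
    order = sorted(range(len(times)), key=lambda i: (times[i], -events[i]))
    at_risk = len(times)
    s = Fraction(1)                   # exact rationals avoid a float
    for i in order:                   # knife-edge test at S(t) = 0.5
        if events[i]:
            s *= Fraction(at_risk - 1, at_risk)  # product-limit step
            if s <= Fraction(1, 2):
                return times[i]
        at_risk -= 1
    return None

# Illustrative cohort of six subjects (months); one censored at 6 months
median = km_median([2, 4, 4, 6, 8, 10], [1, 1, 1, 0, 1, 1])  # 4
```

The log-rank comparisons reported in this study additionally test whether two such curves differ; that step is omitted here for brevity.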
 
This audit review was done in accordance with the principles outlined in the Declaration of Helsinki.
 
Results
Demographics, management, and survival of patients are shown in the Table.
 

Table. Demographics, management, and survival of patients
 
During the period 1 January 2003 to 31 December 2005, 26 patients with a mean age of 52.2 years were eligible for study. Two patients below the age of 18 years were excluded from the registry. The median overall survival for this cohort was 7.4 months (Fig 1). Eight (30.8%) patients underwent biopsy only, with a median overall survival similar to that of the remaining 18 patients who underwent resection (7.2 months vs 7.4 months; P=0.988, log-rank test; Fig 2). Total removal was achieved in only two patients, with overall survival of 14.7 and 28.8 months, respectively. For the remaining 16 patients who underwent debulking surgery, the median overall survival was 7.2 months.
 

Figure 1. Kaplan-Meier curves for overall survival of patients during the two periods
2003-2005 (n=26): 7.4 months; 2010-2012 (n=42): 12.7 months
 

Figure 2. Kaplan-Meier curves for overall survival of the biopsy group and surgical resection group during 2003 to 2005
Biopsy (n=8): 7.2 months; resection (n=18): 7.4 months
 
During the period 1 January 2010 to 31 December 2012, 42 patients with a mean age of 55.1 years were identified. One patient was excluded because he declined CCRT after surgery and opted for alternative medicine. The median overall survival was markedly prolonged to 12.7 months (P<0.001, log-rank test; Fig 1). The proportion of patients who underwent biopsy only (9/42, 21.4%) during 2010 to 2012 remained similar to that of 10 years earlier (8/26, 30.8%), and the proportion who underwent resection was not significantly different between the two periods (P=0.404, Chi squared test). Overall survival of the surgical resection group was distinctly longer than that of the biopsy group (15.2 months and 4.5 months, respectively; P=0.026, log-rank test; Fig 3). A higher proportion of patients achieved total surgical removal in 2010-2012 than in 2003-2005, being 35.7% (15/42) and 7.7% (2/26), respectively (P=0.015, Chi squared test). Median survival did not differ significantly between debulking and total resection in the 2010-2012 arm at the time of analysis (13.0 months vs 16.0 months; P=0.966, log-rank test).
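The proportion comparisons above use the Chi squared test on a 2x2 table. A minimal stdlib-only sketch with Yates' continuity correction, applied to the total-removal comparison; the paper does not state whether a correction was used, so the P value here only approximates the reported P=0.015:

```python
from math import erfc, sqrt

def chi2_2x2_yates(a, b, c, d):
    """Chi squared test with Yates continuity correction for the
    2x2 table [[a, b], [c, d]]; returns (statistic, p) for 1 df."""
    n = a + b + c + d
    num = n * (abs(a * d - b * c) - n / 2) ** 2
    den = (a + b) * (c + d) * (a + c) * (b + d)
    stat = num / den
    # For 1 degree of freedom the chi-squared survival function
    # reduces exactly to the complementary error function.
    p = erfc(sqrt(stat / 2))
    return stat, p

# Total removal: 15/42 in 2010-2012 vs 2/26 in 2003-2005
stat, p = chi2_2x2_yates(15, 27, 2, 24)   # p is about 0.02, ie significant
```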
 

Figure 3. Kaplan-Meier curves for overall survival of the biopsy group and surgical resection group during 2010 to 2012
Biopsy (n=9): 4.5 months; resection (n=33): 15.2 months
 
Of the 42 patients with GBM during 2010-2012, CCRT was initiated in 23 and was associated with a significantly longer median survival of 17.9 months, compared with only 4.5 months for those given radiotherapy alone (P=0.001, log-rank test; Fig 4). Data for MGMT were available for 33 patients. The overall survival of the 19 patients with a methylated MGMT promoter was longer than that of the 14 patients with an unmethylated MGMT promoter, being 28.4 months and 6.3 months, respectively (P<0.001, log-rank test).
 

Figure 4. Kaplan-Meier curves for overall survival of the surgery plus concomitant chemoradiotherapy (CCRT) group and surgery plus radiotherapy (RT) group during 2010 to 2012
Surgery + CCRT (n=23): 17.9 months versus surgery + RT (n=19): 4.5 months
 
Improvement in 2-year survival was also evident, from 11.5% in the earlier cohort, to 21.4% in the later one.
 
Discussion
Glioma has attracted international research interest over the last 20 years in both the clinical and the laboratory setting. These efforts yielded a proven survival benefit of TMZ in recurrent high-grade glioma in 2000.8 A 6-month event-free survival of 21% with TMZ compared favourably with 9% for procarbazine.8 The full effect of TMZ was reported in a randomised trial of primary treatment for GBM in 2005.2 The regimen included two phases of TMZ, starting with a concomitant phase of daily low-dose TMZ during the course of radiotherapy, followed by an adjuvant phase of high-dose TMZ for 5 days of each 28-day cycle for 6 cycles.2 The results of that study set the benchmark for chemotherapy in the treatment of GBM: the median survival of 14.6 months in the CCRT arm compared favourably with 12.1 months in the radiotherapy-alone control arm.2
 
In Hong Kong, TMZ was first introduced in 2001. Its safety and efficacy were tested and reported in a small series of recurrent high-grade glioma.4 The use of CCRT in Hong Kong was also reported, with favourable results.9 Overall survival improved substantially over these 10 years, from 7.4 months in 2003-2005 to 12.7 months in 2010-2012. Among the 2010-2012 cohort, however, only 54.8% (23/42) received CCRT. This can be attributed to the fact that in 2010, TMZ, whilst already incorporated into the Hospital Authority Drug Formulary, was listed as a self-financed item only. The financial burden on patients was the major cause of low usage at that time.
 
In 2011, TMZ was granted conditional funding through the Samaritan Fund scheme. Approval was based on the financial situation of the patient and the tumour's MGMT methylation status, and was granted only for tumours with MGMT methylation. This may have been a cost-effectiveness consideration, because the largest survival benefit is seen in MGMT-methylated GBM. Local data showed that only 43% of local GBM were MGMT-methylated,9 essentially limiting funding to less than half of the patients with GBM. Thus, within the 2010-2012 cohort, only patients diagnosed from 2011 onwards with tumours of methylated MGMT status (roughly a further half of the patient population) would have benefited from the scheme. This may account for the relatively low number of patients treated with CCRT. The policy of restricting funding based on MGMT status was subsequently re-examined, and the criterion was removed in 2013. Currently, Samaritan Fund support for TMZ is available to eligible patients with GBM based on their financial situation, regardless of tumour MGMT status.
 
CCRT has made an impact not only on clinical outcomes, but also on the working dynamics between the professional disciplines involved in the management of patients with GBM. The need for timely arrangement and administration of radiotherapy and chemotherapy within a short postoperative window has encouraged a multidisciplinary team approach, which continues to be the treatment delivery model for patients with GBM in many hospitals in Hong Kong. Better clinical outcomes encouraged professional enthusiasm, and in this atmosphere a local group of clinicians founded the Hong Kong Neuro-Oncology Society in 2011.
 
The reasons for longer survival of GBM in recent years are likely multifactorial. The extent of surgical resection has been intensely studied over the last two decades. Nonetheless, since a prospective randomised surgical study would be unethical, evidence to support maximum safe resection must be gleaned retrospectively. Despite this, neurosurgeons became convinced that surgical resection was the first and major treatment for GBM: surgical conservatism was abandoned and maximum safe resection became the goal. This change was reflected in the decrease in the biopsy-only rate from 30.8% in 2003-2005 to 21.4% in 2010-2012, and in the increase in total surgical removal of the contrast-enhancing tumour from 7.7% in 2003-2005 to 35.7% in 2010-2012. Local neurosurgeons have acquired two surgical techniques to achieve maximum safe resection in the last 10 years. Cortical mapping with awake surgery, which allows safer resection of tumour at or near the eloquent areas of the brain, was taught to local neurosurgeons in two workshops of commissioned training organised by the Hospital Authority in 2003 and 2010. A tumour fluorescence technique (5-ALA), introduced in 2009, facilitates detection of residual tumour for maximum resection. In 2003-2005, survival of the surgical resection and biopsy groups was similar, but in 2010-2012 the surgical resection group survived longer; the difference was probably due to both aggressive surgical resection and CCRT in the latter group.
 
The major limitation of this study was the presence of potential confounding factors over the 10-year study period, including incomplete data on MGMT methylation status and extent of resection in the 2003-2005 group: there was no MGMT methylation testing or day-1 MRI after resection in 2003-2005. The interval between surgery and commencement of radiotherapy has been kept within 4 weeks since 2009, but this was not the case in 2003-2005. All these confounding factors made valid comparison of the effect of surgical resection or chemotherapy between the two periods difficult. Moreover, the registry included only surgical patients who had undergone biopsy or resection, and excluded a small group of patients, usually elderly (age >70 years) or with significant co-morbidities, who may have received radiotherapy or chemotherapy alone.
 
Conclusions
Hong Kong has made substantial improvements in the management of GBM, with improved survival, over the last decade. The combination of an aggressive surgical strategy and CCRT is probably the driving force for the improvement.
 
Declaration
None of the authors has disclosed any conflicts of interest.
 
References
1. Pu JK, Ng GK, Leung GK, Wong CK. One-year review of the incidence of brain tumours in Hong Kong Chinese patients as part of the Hong Kong Brain and Spinal Tumours Registry. Surg Pract 2012;16:133-6.
2. Stupp R, Mason WP, van den Bent MJ, et al. Radiotherapy plus concomitant and adjuvant temozolomide for glioblastoma. N Engl J Med 2005;352:987-96.
3. Hegi ME, Diserens AC, Gorlia T, et al. MGMT gene silencing and benefit from temozolomide in glioblastoma. N Engl J Med 2005;352:997-1003.
4. Chan DT, Poon WS, Chan YL, Ng HK. Temozolomide in the treatment of recurrent malignant glioma in Chinese patients. Hong Kong Med J 2005;11:452-6.
5. Chan DT, Kan PK, Lam JM, et al. Cerebral motor cortical mapping: awake procedure is preferable to general anaesthesia. Surg Pract 2010;14:12-8.
6. Stummer W, Pichlmeier U, Meinel T, et al. Fluorescence-guided surgery with 5-aminolevulinic acid for resection of malignant glioma: a randomised controlled multicentre phase III trial. Lancet Oncol 2006;7:392-401.
7. Dong SM, Pang JC, Poon WS, et al. Concurrent hypermethylation of multiple genes is associated with grade of oligodendroglial tumors. J Neuropathol Exp Neurol 2001;60:808-16.
8. Yung WK, Albright RE, Olson J, et al. A phase II study of temozolomide vs. procarbazine in patients with glioblastoma multiforme at first relapse. Br J Cancer 2000;83:588-93.
9. Chan DT, Kam MK, Ma BB, et al. Association of molecular marker O6-methylguanine DNA methyltransferase and concomitant chemoradiotherapy with survival in Southern Chinese glioblastoma patients. Hong Kong Med J 2011;17:184-8.

Preimplantation genetic diagnosis and screening by array comparative genomic hybridisation: experience of more than 100 cases in a single centre

Hong Kong Med J 2017 Apr;23(2):129–33 | Epub 17 Feb 2017
DOI: 10.12809/hkmj164883
© Hong Kong Academy of Medicine. CC BY-NC-ND 4.0
 
ORIGINAL ARTICLE
Preimplantation genetic diagnosis and screening by array comparative genomic hybridisation: experience of more than 100 cases in a single centre
Judy FC Chow, MPhil1; William SB Yeung, PhD1; Vivian CY Lee, FHKAM (Obstetrics and Gynaecology)2; Estella YL Lau, PhD2; PC Ho, FRCOG, FHKAM (Obstetrics and Gynaecology)1; Ernest HY Ng, FRCOG, FHKAM (Obstetrics and Gynaecology)1
1 Department of Obstetrics and Gynaecology, The University of Hong Kong, Queen Mary Hospital, Pokfulam, Hong Kong
2 Department of Obstetrics and Gynaecology, Queen Mary Hospital, Hong Kong
 
Corresponding author: Dr William SB Yeung (wsbyeung@hku.hk)
 
 
Abstract
Introduction: Preimplantation genetic screening has been proposed to improve the in-vitro fertilisation outcome by screening for aneuploid embryos or blastocysts. This study aimed to report the outcome of 133 cycles of preimplantation genetic diagnosis and screening by array comparative genomic hybridisation.
 
Methods: This case series was conducted in a tertiary assisted reproductive centre in Hong Kong. Patients who underwent preimplantation genetic diagnosis for chromosomal abnormalities or preimplantation genetic screening between 1 April 2012 and 30 June 2015 were included. They underwent in-vitro fertilisation with intracytoplasmic sperm injection. An embryo biopsy was performed on day-3 embryos and the blastomere was subjected to array comparative genomic hybridisation. Embryos with normal copy numbers were replaced. The ongoing pregnancy rate, implantation rate, and miscarriage rate were studied.
 
Results: During the study period, 133 cycles of preimplantation genetic diagnosis for chromosomal abnormalities or preimplantation genetic screening were initiated in 94 patients. Overall, 112 cycles proceeded to embryo biopsy and 65 cycles had embryo transfer. The ongoing pregnancy rate per transfer cycle after preimplantation genetic screening was 50.0% and that after preimplantation genetic diagnosis was 34.9%. The implantation rates after preimplantation genetic screening and diagnosis were 45.7% and 41.1%, respectively, and the miscarriage rates were 8.3% and 28.6%, respectively. There were 26 frozen-thawed embryo transfer cycles, in which vitrified, biopsied, genetically transferrable embryos were replaced, resulting in an ongoing pregnancy rate of 36.4% in the screening group and 60.0% in the diagnosis group.
 
Conclusions: The clinical outcomes of preimplantation genetic diagnosis and screening using comparative genomic hybridisation in our unit were comparable to those reported internationally. Genetically transferrable embryos replaced in a natural cycle may improve the ongoing pregnancy rate and implantation rate when compared with transfer in a stimulated cycle.
 
 
New knowledge added by this study
  • Array comparative genomic hybridisation is a reliable method for preimplantation genetic diagnosis for translocation/inversion carriers, and for patients with mosaic sex chromosome aneuploidy. Replacement of vitrified embryos after warming in a natural cycle may improve the ongoing pregnancy rate and implantation rate.
Implications for clinical practice or policy
  • Preimplantation genetic diagnosis by array comparative genomic hybridisation shall be offered as an alternative to prenatal diagnosis for translocation/inversion carriers, and for patients with mosaic sex chromosome aneuploidy. The results of this local case series provide information, such as the anticipated percentage of genetically transferrable embryos and the expected ongoing pregnancy rate, which is useful for patient counselling before preimplantation genetic diagnosis or screening.
 
 
Introduction
Preimplantation genetic diagnosis (PGD) is an alternative to prenatal diagnosis for detection of chromosomal abnormalities in translocation or inversion carrier couples. In the past 13 years, more than 6000 cycles of PGD for chromosomal abnormalities have been performed.1 Fluorescence in-situ hybridisation (FISH) was first used in PGD for translocation carriers.2 Owing to its technical limitations, however,3 4 5 it has been replaced by array comparative genomic hybridisation (aCGH) in many centres. In our centre, we have previously shown that the use of aCGH for PGD in translocation carriers results in a significantly higher rate of ongoing pregnancy than PGD by FISH.6
 
Aneuploidy is the most common abnormality found in embryos derived from in-vitro fertilisation (IVF), and leads to poor outcomes.7 8 9 10 11 12 13 Morphological assessment of embryos or blastocysts alone, however, cannot negate the potential risk of replacing aneuploid embryos or blastocysts.14 Preimplantation genetic screening (PGS) has been proposed to improve the IVF outcomes by screening for aneuploid embryos or blastocysts. More than 26 000 PGS cycles have been performed worldwide.1 The aCGH technique enables us to screen all 24 chromosomes within 24 hours and makes fresh transfer possible after blastomere biopsy or trophectoderm biopsy.15 A randomised study has shown that PGS by aCGH plus selection by morphology of blastocysts can significantly improve the ongoing pregnancy rate in patients with good prognosis when compared with selection of blastocysts by morphology alone.16 Another randomised study also showed an improvement in the implantation rate after PGS by aCGH in addition to morphological assessment of embryos.17 We report here the clinical outcome of 133 cycles of PGD/PGS by aCGH in a local unit.
 
Methods
Study population
Data from all treatment cycles performed for PGD and PGS in the Department of Obstetrics and Gynaecology, Queen Mary Hospital/The University of Hong Kong from 1 April 2012 to 30 June 2015 were retrieved. This study was done in accordance with the principles outlined in the Declaration of Helsinki, and patient consent was obtained. Data were stored in a database and coded for indication. Indications for PGS were defined as follows: (1) advanced maternal age (AMA) group: patients aged >38 years; (2) recurrent miscarriage (RM) group: patients with at least two clinical miscarriages and negative investigations for RM; (3) repeated implantation failure (RIF) group: patients who failed to become pregnant after three embryo transfer cycles with at least six good-quality embryos replaced; and (4) optional PGS group: patients with a normal karyotype who had experienced a previous pregnancy with an abnormal karyotype, and those who opted for PGS when undergoing PGD for monogenic disease.
 
Indications for PGD by aCGH were divided as follows: (1) mosaic group: patients with mosaic sex chromosome abnormalities on karyotyping, including mosaic Klinefelter's or mosaic Turner's syndrome; (2) Robertsonian translocation; (3) reciprocal translocation; (4) inversion; and (5) double translocations.
 
Treatment regimen
The details of the ovarian stimulation regimen, gamete handling, and frozen-thawed embryo transfer (FET) have been described previously.18 Surplus good-quality blastocysts with no aneuploidy or unbalanced chromosome detected were vitrified by the CVM Vitrification System (CryoLogic, Victoria, Australia). If the patient did not become pregnant in the stimulated cycle, the vitrified blastocysts were warmed and replaced in subsequent FET cycles. The details of biopsy and PGD/PGS by aCGH have been described elsewhere.6 19 In brief, a single blastomere was removed from good-quality day-3 embryos (6- to 8-cell stage) and underwent whole-genome amplification (SurePlex; BlueGnome, Cambridge, United Kingdom). Array CGH was performed using 24sure+ (BlueGnome) for reciprocal translocation and inversion cases, while other cases were tested with 24sure V3 (BlueGnome) according to the manufacturer's protocol. All results were interpreted independently by two laboratory staff, usually with a high concordance rate (>95%); discrepancies were resolved through consensus.
 
Results
Between 1 April 2012 and 30 June 2015, 94 couples underwent 133 cycles of ovarian stimulation for PGD for chromosomal abnormalities, or PGS with indications listed in Table 1. The most frequent indication for PGD/PGS was reciprocal translocation (35.3%) followed by RM (27.1%) and Robertsonian translocation (16.5%). The median age of the women was 36.5 (range, 25-44) years. Embryo biopsy was performed in 112 cycles. The mean number of embryos biopsied per retrieval cycle was 5.6 (740/133), with 99.2% of biopsies resulting in a conclusive diagnosis, of which only 25.8% (191/740) were genetically transferrable. The whole-genome amplification failed in all the samples with inconclusive diagnosis.
 

Table 1. Indications for PGD and PGS
 
Overall, PGD/PGS was cancelled in 21 (15.8%) cycles after ovarian stimulation due to poor response (19 cycles), failed fertilisation (1 cycle), or no sperm found in the testicular biopsy (1 cycle). In cases of poor response (<4 good-quality embryos on day 3), cleavage-stage embryos were frozen/vitrified, subsequently thawed/warmed, and pooled with fresh embryos from the following stimulation cycle for diagnosis. Fresh embryo transfer was cancelled in 47 (42.0%) cycles after biopsy due to unavailability of a genetically transferrable embryo (31 cycles), high serum progesterone level on the day of human chorionic gonadotropin administration (>5 nmol/L; 10 cycles), risk of ovarian hyperstimulation (2 cycles), delayed assay (3 cycles), or patient request (1 cycle). Overall, 65 PGD/PGS cycles proceeded to embryo transfer in the stimulated cycle with one or two blastocysts replaced on day 5 (mean, 1.4). As shown in Table 2, the results of aCGH were further subdivided into two categories (PGS and PGD) based on indication. The ongoing pregnancy rates (pregnancy beyond 8-10 weeks of gestation) for PGS and PGD were 50.0% (11/22) and 34.9% (15/43), respectively.
 

Table 2. Results of PGS / PGD by aCGH in stimulated and frozen-thawed embryo transfer cycles
 
There were 26 cycles of FET in a natural cycle in which one or two biopsied and vitrified blastocysts were replaced (mean, 1.2), resulting in a pregnancy rate of 36.4% (4/11) in the PGS group and 66.7% (10/15) in the PGD group. Ongoing pregnancy rates in the PGS and PGD group were 36.4% (4/11) and 60.0% (9/15), respectively (Table 2). The miscarriage rates in the stimulated embryo transfer cycles and FET cycles were 21.2% (7/33) and 7.1% (1/14), respectively. The differences in ongoing pregnancy rate and miscarriage rate between stimulated embryo transfer and FET cycle were not statistically significant. All pregnant women following PGD for chromosomal abnormalities were referred to the Prenatal Counselling and Diagnosis team at Tsan Yuk Hospital for counselling and confirmation of the PGD result by prenatal diagnosis or postnatal cord blood karyotyping. Based on the available results of the confirmation tests, no misdiagnosis was found in this small series.
 
Discussion
The 13th data report of the ESHRE PGD Consortium includes a total of 1071 oocyte retrieval cycles for chromosomal abnormalities and 2979 oocyte retrieval cycles for PGS, resulting in a delivery rate of 21%-25% per transfer and an implantation rate of 22%-26%.1 The ongoing pregnancy rate and implantation rate of the present series are 34.9%-50.0% and 41.1%-45.7%, respectively.
 
As shown in Table 1, the percentage of transferrable embryos varies among different indications for PGD/PGS. In cases of PGD for chromosomal abnormalities, as expected, the lowest percentage of genetically transferrable embryos was found in the reciprocal translocation group (17.5%), followed by the Robertsonian translocation group (31.8%) and the mosaic Turner’s / Klinefelter’s syndrome group (32.7%). These data are in line with those of the ESHRE PGD consortium,1 of which the corresponding percentages are 16.6%, 33.5%, and 36.8%, respectively. The high proportion of unbalanced gametes can be explained by the segregation modes and behaviour of the translocated chromosomes during meiosis.20
 
In the PGS group (RM, RIF, AMA, and optional PGS), the overall percentage of genetically transferrable embryos was 27.5% (69/251), similar to that of the ESHRE PGD Consortium (30%). It is noteworthy that there were no transferrable embryos in any of the four cases of AMA (median age, 42.5 years). It is well known that the rate of chromosomal aneuploidy increases exponentially with maternal age.21 22 Patients of advanced age should therefore be counselled accordingly before initiation of PGS cycles.
 
The cancellation rate for PGD/PGS after initiation of stimulation was 15.8% (21/133), and the reason for cancellation in the great majority of cases was poor ovarian response (19/21). Furthermore, of those cases proceeding to biopsy, 42.0% (31/47) did not have an embryo transfer, mainly because no normal/balanced embryos were available. When a low percentage of normal/balanced embryos is expected, patients can consider pooling embryos from several stimulation cycles and performing PGD/PGS in a single batch. Such 'batching' can increase the chance of having normal/balanced embryos and allow selection of the best-quality genetically transferrable embryos for replacement in the PGD/PGS cycle, instead of having multiple cycles with no embryo transfer.
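The rationale for batching can be made concrete with a simple probability sketch. Assuming, purely for illustration, that each biopsied embryo is independently transferrable with the same probability (e.g. the 17.5% per-embryo rate observed for reciprocal translocation carriers in Table 1, an assumption real cohorts need not satisfy), the chance of obtaining at least one transferrable embryo from n pooled embryos is 1 − (1 − p)^n:

```python
# Illustrative sketch only: assumes each embryo is independently
# transferrable with a fixed probability, which real cohorts need not satisfy.
def p_at_least_one(p_transferrable: float, n_embryos: int) -> float:
    """Probability of at least one transferrable embryo among n pooled embryos."""
    return 1.0 - (1.0 - p_transferrable) ** n_embryos

# 17.5% per-embryo rate, as observed for reciprocal translocations (Table 1)
for n in (3, 5, 10):
    print(f"{n:2d} embryos: {p_at_least_one(0.175, n):.1%}")
```

Under these assumptions, pooling 10 embryos rather than 3 raises the chance of having at least one transferrable embryo from roughly 44% to roughly 85%, which is the intuition behind batching across stimulation cycles.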
 
There were 26 cycles of vitrified-warmed blastocyst transfer (11 after PGS and 15 after PGD) performed during a natural cycle. The ongoing pregnancy rate per transfer in these natural cycles after PGD appeared to be higher, and the miscarriage rate lower, than after transfer in a stimulated cycle. These differences did not reach statistical significance, however, owing to the small number of cases. Some reports have suggested that transfer of embryos in a natural cycle may result in higher pregnancy and implantation rates than in a stimulated cycle because of the better receptivity of the endometrium in the absence of gonadotropin stimulation.23 24 25 26
 
A limitation of the present study was the small number of cases for each indication for PGS. Moreover, it was not a randomised controlled trial; the usefulness of PGS by aCGH in these settings needs to be confirmed in a large randomised controlled trial. It is noteworthy that aCGH cannot detect gene mutations or small chromosomal aberrations (<10 Mb for Robertsonian translocation, mosaic sex chromosome aneuploidy, and PGS; <5 Mb for reciprocal translocation and inversion). False results can also arise from embryo mosaicism, although no misdiagnosis was found in the present study.
 
Conclusions
The clinical outcomes of PGD and PGS in our unit were comparable to those reported internationally. A genetically transferrable embryo after PGD that is replaced during a natural cycle may improve the ongoing pregnancy rate and implantation rate when compared with transfer during a stimulated cycle.
 
Acknowledgements
We would like to thank the patients, nurses, clinicians, technicians, and embryologists at the Centre of Assisted Reproduction and Embryology, Queen Mary Hospital–The University of Hong Kong for their contribution in the PGD programme.
 
Declaration
All authors have disclosed no conflicts of interest.
 
References
1. De Rycke M, Belva F, Goossens V, et al. ESHRE PGD Consortium data collection XIII: cycles from January to December 2010 with pregnancy follow-up to October 2011. Hum Reprod 2015;30:1763-89.
2. DeUgarte CM, Li M, Surrey M, Danzer H, Hill D, DeCherney AH. Accuracy of FISH analysis in predicting chromosomal status in patients undergoing preimplantation genetic diagnosis. Fertil Steril 2008;90:1049-54.
3. Li M, DeUgarte CM, Surrey M, Danzer H, DeCherney A, Hill DL. Fluorescence in situ hybridization reanalysis of day-6 human blastocysts diagnosed with aneuploidy on day 3. Fertil Steril 2005;84:1395-400.
4. Velilla E, Escudero T, Munné S. Blastomere fixation techniques and risk of misdiagnosis for preimplantation genetic diagnosis of aneuploidy. Reprod Biomed Online 2002;4:210-7.
5. Wells D, Alfarawati S, Fragouli E. Use of comprehensive chromosomal screening for embryo assessment: microarrays and CGH. Mol Hum Reprod 2008;14:703-10.
6. Lee VC, Chow JF, Lau EY, Yeung WS, Ho PC, Ng EH. Comparison between fluorescent in-situ hybridisation and array comparative genomic hybridisation in preimplantation genetic diagnosis in translocation carriers. Hong Kong Med J 2015;21:16-22.
7. Bielanska M, Tan SL, Ao A. Chromosomal mosaicism throughout human preimplantation development in vitro: incidence, type, and relevance to embryo outcome. Hum Reprod 2002;17:413-9.
8. Munné S, Sandalinas M, Magli C, Gianaroli L, Cohen J, Warburton D. Increased rate of aneuploid embryos in young women with previous aneuploid conceptions. Prenat Diagn 2004;24:638-43.
9. Kuliev A, Cieslak J, Verlinsky Y. Frequency and distribution of chromosome abnormalities in human oocytes. Cytogenet Genome Res 2005;111:193-8.
10. Magli MC, Gianaroli L, Ferraretti AP, Lappi M, Ruberti A, Farfalli V. Embryo morphology and development are dependent on the chromosomal complement. Fertil Steril 2007;87:534-41.
11. Munné S, Chen S, Colls P, et al. Maternal age, morphology, development and chromosome abnormalities in over 6000 cleavage-stage embryos. Reprod Biomed Online 2007;14:628-34.
12. Hassold T, Hunt P. Maternal age and chromosomally abnormal pregnancies: what we know and what we wish we knew. Curr Opin Pediatr 2009;21:703-8.
13. Vanneste E, Voet T, Le Caignec C, et al. Chromosome instability is common in human cleavage-stage embryos. Nat Med 2009;15:577-83.
14. Alfarawati S, Fragouli E, Colls P, et al. The relationship between blastocyst morphology, chromosomal abnormality, and embryo gender. Fertil Steril 2011;95:520-4.
15. Rubio C, Rodrigo L, Mir P, et al. Use of array comparative genomic hybridization (array-CGH) for embryo assessment: clinical results. Fertil Steril 2013;99:1044-8.
16. Yang Z, Liu J, Collins GS, et al. Selection of single blastocysts for fresh transfer via standard morphology assessment alone and with array CGH for good prognosis IVF patients: results from a randomized pilot study. Mol Cytogenet 2012;5:24.
17. Yang Z, Salem SA, Liu X, Kuang Y, Salem RD, Liu J. Selection of euploid blastocysts for cryopreservation with array comparative genomic hybridization (aCGH) results in increased implantation rates in subsequent frozen and thawed embryo transfer cycles. Mol Cytogenet 2013;6:32.
18. Ng EH, Yeung WS, Lau EY, So WW, Ho PC. High serum oestradiol concentrations in fresh IVF cycles do not impair implantation and pregnancy rates in subsequent frozen-thawed embryo transfer cycles. Hum Reprod 2000;15:250-5.
19. Chow JF, Yeung WS, Lau EY, et al. Singleton birth after preimplantation genetic diagnosis for Huntington disease using whole genome amplification. Fertil Steril 2009;92:828.e7-10.
20. Scriven PN, Handyside AH, Ogilvie CM. Chromosome translocations: segregation modes and strategies for preimplantation genetic diagnosis. Prenat Diagn 1998;18:1437-49.
21. Spandorfer SD, Davis OK, Barmat LI, Chung PH, Rosenwaks Z. Relationship between maternal age and aneuploidy in in vitro fertilization pregnancy loss. Fertil Steril 2004;81:1265-9.
22. Hassold T, Hall H, Hunt P. The origin of human aneuploidy: where we have been, where we are going. Hum Mol Genet 2007;16 Spec No. 2:R203-8.
23. Evans J, Hannan NJ, Edgell TA, et al. Fresh versus frozen embryo transfer: backing clinical decisions with scientific and clinical evidence. Hum Reprod Update 2014;20:808-21.
24. Shapiro BS, Daneshmand ST, Garner FC, Aguirre M, Hudson C. Clinical rationale for cryopreservation of entire embryo cohorts in lieu of fresh transfer. Fertil Steril 2014;102:3-9.
25. Roque M. Freeze-all policy: is it time for that? J Assist Reprod Genet 2015;32:171-6.
26. Roque M, Valle M, Guimarães F, Sampaio M, Geber S. Freeze-all policy: fresh vs. frozen-thawed embryo transfer. Fertil Steril 2015;103:1190-3.