Silver-Russell syndrome in Hong Kong

Hong Kong Med J 2016 Dec;22(6):526–33 | Epub 29 Jul 2016
DOI: 10.12809/hkmj154750
© Hong Kong Academy of Medicine. CC BY-NC-ND 4.0
 
ORIGINAL ARTICLE
Silver-Russell syndrome in Hong Kong
HM Luk, MB, BS, FHKAM (Paediatrics)1#; KS Yeung, BSc, MPhil2#; WL Wong, BSc, MPhil2; Brian HY Chung, FHKAM (Paediatrics), FCCMG (Clinical Genetics)2; Tony MF Tong, MPhil, MSc1; Ivan FM Lo, MB, ChB, FHKAM (Paediatrics)1
1 Clinical Genetic Service, Department of Health, 3/F, Cheung Sha Wan Jockey Club Clinic, 2 Kwong Lee Road, Sham Shui Po, Hong Kong
2 Department of Paediatrics and Adolescent Medicine, Queen Mary Hospital, The University of Hong Kong, Pokfulam, Hong Kong
 
# Co-first author
 
Corresponding author: Dr Ivan FM Lo (con_cg@dh.gov.hk)
 
Abstract
Objectives: To examine the molecular pathogenetic mechanisms, (epi)genotype-phenotype correlation, and the performance of the three clinical scoring systems—namely Netchine et al, Bartholdi et al, and Birmingham scores—for patients with Silver-Russell syndrome in Hong Kong.
 
Methods: This retrospective case series was conducted at two tertiary genetic clinics in Hong Kong, the Clinical Genetic Service of the Department of Health and the clinical genetic clinic at Queen Mary Hospital. All records of patients with suspected Silver-Russell syndrome under the care of the two genetic clinics between January 2010 and September 2015 were retrieved from the computer database.
 
Results: Of the 28 live-birth patients with Silver-Russell syndrome, 35.7% had H19 loss of DNA methylation, 21.4% had maternal uniparental disomy of chromosome 7, 3.6% had mosaic maternal uniparental disomy of chromosome 11, and the remaining 39.3% had Silver-Russell syndrome of unexplained molecular origin. No significant (epi)genotype-phenotype correlation could be identified between the H19 loss of DNA methylation and maternal uniparental disomy of chromosome 7 groups. Comparison of molecularly confirmed patients and patients with Silver-Russell syndrome of unexplained origin revealed that postnatal microcephaly and café-au-lait spots were more common in the latter group, and body and limb asymmetry more common in the former. Performance analysis showed that the Netchine et al and Birmingham scoring systems had similar sensitivity in identifying Hong Kong Chinese subjects with Silver-Russell syndrome.
 
Conclusion: This is the first territory-wide study of Silver-Russell syndrome in Hong Kong. The clinical features and the spectrum of underlying epigenetic defects were comparable to those reported in western populations.
 
New knowledge added by this study
  • The epigenetic defects of Silver-Russell syndrome (SRS) in Hong Kong Chinese patients are comparable to those reported in western populations.
  • No epigenotype-phenotype correlation was demonstrated among SRS patients in this study.
Implications for clinical practice or policy
  • All suspected SRS patients should be referred to a genetic clinic for assessment.
  • A new diagnostic algorithm has been proposed for Chinese patients with SRS.
 
 
Introduction
Silver-Russell syndrome (SRS) [OMIM 180860] is a clinically and genetically heterogeneous congenital imprinting disorder. It was first described in 1953 by Dr Henry Silver and his colleagues, who reported two children with short stature and congenital hemihypertrophy.1 In the following year, Dr Alexander Russell reported five similar cases with intrauterine dwarfism and craniofacial dysostosis.2 The term SRS has been used since 1970 to describe a constellation of features comprising intrauterine growth retardation without postnatal catch-up, distinct facial characteristics, relative macrocephaly, body asymmetry, and/or fifth finger clinodactyly.3 4 The prevalence of SRS has been estimated at 1 in 100 000,5 although this is probably an underestimate given the diverse and variable clinical manifestations. The majority of SRS cases are sporadic, although occasional familial cases have been reported.
 
Two major molecular mechanisms have been implicated in SRS—maternal uniparental disomy of chromosome 7 (mUPD7)6 and loss of DNA methylation (LOM) of imprinting control region 1 (ICR1) on the paternal allele of the chromosome 11p15 region, which regulates the IGF2/H19 locus.6 7 8 9 According to these studies, LOM of ICR1 and mUPD7 account for roughly 45% to 50% and 5% to 10% of SRS cases, respectively.6 7 8 9 Rare cytogenetic rearrangements have also been reported in 1% to 2% of cases.4 10 11 In the remaining 30% to 40% of SRS cases, however, the molecular mechanism remains elusive.
 
Owing to the wide spectrum of clinical presentations of SRS, there is considerable clinical overlap with other growth retardation syndromes. At present there is no consensus on the diagnostic criteria, making the diagnosis of SRS challenging. Several scoring systems have been proposed to facilitate clinical diagnosis and to guide genetic testing.7 11 12 13 14 Based on the prevalence of the different molecular mechanisms, methylation study of the 11p15 region is the recommended first-tier investigation for patients with suspected SRS, and mUPD7 analysis is the second tier.14
 
A comprehensive clinical and molecular study of SRS has not previously been reported in the Chinese population. We therefore conducted a retrospective review to summarise the clinical and genetic findings of all SRS patients in Hong Kong, and also evaluated the sensitivity and specificity of different scoring systems7 11 12 13 14 in identifying Hong Kong Chinese SRS patients.
 
Methods
Patients
The Clinical Genetic Service (CGS), Department of Health and the clinical genetic clinic at Queen Mary Hospital (QMH), The University of Hong Kong, are the only two tertiary genetic referral centres that provide comprehensive genetic counselling and diagnostic and laboratory services for the Hong Kong population. Patients with clinically suspected growth failure due to genetic causes, including possible SRS, were referred for assessment and genetic testing.
 
In this review, all records of patients with suspected SRS seen at the CGS or the clinical genetic clinic of QMH between January 2010 and September 2015 were retrieved from the computer database system using the key words “Silver Russell syndrome” and “failure to thrive and growth retardation”. The clinical and laboratory data of these patients were retrospectively analysed. Patients with an alternative diagnosis after assessment and genetic investigation were excluded. This study was conducted in accordance with the principles outlined in the Declaration of Helsinki.
 
Clinical diagnostic criteria for Silver-Russell syndrome in this study
Currently, there is no universal consensus on the diagnostic criteria for SRS, but the criteria of Hitchins et al15 are the most commonly used in clinical practice. In this study, SRS was diagnosed when a patient fulfilled three major criteria, or two major and two minor criteria.
 
Major criteria included (1) intrauterine growth retardation/small for gestational age (<10th percentile); (2) postnatal growth with height/length <3rd percentile; (3) normal head circumference (3rd-97th percentile); and (4) limb, body, and/or facial asymmetry.
 
Minor criteria included (1) short arm span with normal upper-to-lower segment ratio; (2) fifth finger clinodactyly; (3) triangular facies; and (4) frontal bossing/prominent forehead.
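To make the decision rule concrete, it can be expressed in code. The following is a minimal illustrative sketch (not part of the study protocol); the criterion labels are our own shorthand for the items listed above.

```python
# Hedged sketch of the Hitchins et al diagnostic rule used in this study:
# SRS is diagnosed when >=3 major criteria are met, or >=2 major plus
# >=2 minor criteria. Criterion labels are illustrative shorthand only.

MAJOR = ["iugr_sga", "postnatal_short_stature",
         "normal_head_circumference", "asymmetry"]
MINOR = ["short_arm_span", "fifth_finger_clinodactyly",
         "triangular_facies", "frontal_bossing"]

def meets_srs_criteria(findings: set) -> bool:
    """Return True if the recorded findings satisfy the diagnostic rule."""
    n_major = sum(c in findings for c in MAJOR)
    n_minor = sum(c in findings for c in MINOR)
    return n_major >= 3 or (n_major >= 2 and n_minor >= 2)

# Example: two major plus two minor criteria -> diagnosis made
print(meets_srs_criteria({"iugr_sga", "asymmetry",
                          "triangular_facies", "frontal_bossing"}))  # True
```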
 
Epimutation in imprinting control region 1
Investigation of the methylation status and copy number change of the H19 differentially methylated region (H19 DMR) and KvDMR1 at the chromosome 11p15 region was performed with the methylation-specific multiplex ligation-dependent probe amplification (MS-MLPA) method, using the SALSA MLPA ME030-B1 BWS/RSS kit (MRC-Holland, Amsterdam, The Netherlands). Following the manufacturer’s instructions, approximately 100 ng genomic DNA was first denatured and hybridised overnight with the probe mixture supplied with the kit. The samples were then split into two portions, treated either with ligase alone or with ligase and HhaI. Polymerase chain reaction (PCR) was then performed with the reagents and primers supplied in the kit. The PCR products were separated by capillary electrophoresis (model 3130xl; Applied Biosystems, Foster City [CA], US). The electropherograms were analysed using GeneScan software (Applied Biosystems, Foster City [CA], US), and the relative peak area was calculated using Coffalyser version 9.4 software (MRC-Holland, Amsterdam, The Netherlands).
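The principle by which methylation status is derived from the two reactions can be sketched as follows (a simplified illustration under our own assumptions, not the Coffalyser algorithm): methylated DNA is protected from HhaI digestion, so the normalised signal remaining in the digested reaction approximates the methylated fraction.

```python
# Simplified sketch of an MS-MLPA methylation ratio. Assumes peak areas
# have already been normalised within-sample against reference probes;
# this is the underlying idea only, not the Coffalyser implementation.

def methylation_ratio(digested_area: float, undigested_area: float) -> float:
    """Fraction of alleles protected from HhaI digestion (methylated)."""
    return digested_area / undigested_area

# H19 DMR is normally ~50% methylated (paternal allele only), so a ratio
# well below 0.5 would be consistent with loss of methylation (LOM).
print(f"{methylation_ratio(0.22, 1.0):.2f}")  # 0.22 -> suggestive of LOM
```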
 
Analysis of maternal uniparental disomy of chromosome 7
We studied mUPD7 with eight polymorphic microsatellite markers, three on 7p and five on 7q (D7S531, D7S507, D7S2552, D7S2429, D7S2504, D7S500, D7S2442, and D7S2465), using a standard protocol. Haplotype analysis was then performed. A diagnosis of mUPD7 required evidence of exclusive maternal inheritance at two or more informative markers.
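The inference rule can be illustrated schematically (a minimal sketch under simplifying assumptions; it ignores mosaicism and the heterodisomy/isodisomy distinction, and the genotypes shown are invented for illustration):

```python
# Sketch of the mUPD7 call described above: at a marker that is informative
# (the father carries an allele the mother lacks), absence of any paternal
# allele in the child supports exclusive maternal inheritance. A call of
# mUPD7 requires such evidence at two or more informative markers.

def supports_maternal_upd(child, mother, father):
    paternal_only = set(father) - set(mother)
    if not paternal_only:            # marker uninformative for this trio
        return None
    return not (set(child) & paternal_only)

def call_mupd7(trio_genotypes) -> bool:
    """trio_genotypes: dict of marker -> (child, mother, father) alleles."""
    calls = [supports_maternal_upd(c, m, f)
             for c, m, f in trio_genotypes.values()]
    return sum(call is True for call in calls) >= 2

trio = {"D7S531": (("1", "1"), ("1", "2"), ("3", "4")),
        "D7S500": (("2", "2"), ("2", "3"), ("4", "5"))}
print(call_mupd7(trio))  # True: no paternal allele at 2 informative markers
```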
 
Data analysis and (epi)genotype-phenotype correlation
Epidemiological data, physical characteristics, growth records, and molecular findings were collected for analysis. Clinical photographs were taken during consultation (Fig 1). In order to delineate the (epi)genotype-phenotype correlation, we divided the patients according to their (epi)genotype, namely H19 LOM, mUPD7, mosaic maternal uniparental disomy of chromosome 11 (mUPD11), or SRS of unexplained origin. SRS of unexplained origin was defined as a negative result for both the 11p15 region epimutation and mUPD7 studies. For statistical analysis, Student’s t test was used for continuous variables and Fisher’s exact test for categorical variables. All P values were two-tailed, and differences were considered statistically significant when P≤0.05.
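These comparisons can be reproduced with standard statistical libraries; the following sketch uses SciPy in place of the authors' (unstated) software, with invented counts for illustration only.

```python
# Hedged sketch of the statistical comparisons described above, using
# SciPy. All counts and values below are illustrative, not study data.
from scipy import stats

# Fisher's exact test for a categorical feature, e.g. presence of a
# clinical sign in H19 LOM vs mUPD7 patients: rows = groups,
# columns = [sign present, sign absent].
table = [[7, 3],   # hypothetical H19 LOM group
         [3, 3]]   # hypothetical mUPD7 group
odds_ratio, p_cat = stats.fisher_exact(table, alternative="two-sided")

# Student's t test for a continuous variable, e.g. birth weight z-scores.
group_a = [-2.8, -3.1, -2.5, -2.9]
group_b = [-2.2, -2.6, -2.4]
t_stat, p_cont = stats.ttest_ind(group_a, group_b)

# Differences were considered significant when the two-tailed P <= 0.05.
print(f"Fisher exact P={p_cat:.3f}; t test P={p_cont:.3f}")
```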
 

Figure 1. Clinical photos for molecularly confirmed SRS in this study
Patients with (a to d) mUPD7 and (e to g) H19 LOM. All had relative macrocephaly, frontal bossing, triangular face, and pointed chin. Patients showing (e) fifth finger clinodactyly and (f) body asymmetry. (h) Informative microsatellite markers in UPD study that shows mUPD7
 
Clinical score
Three clinical scoring systems, namely the Netchine et al score,7 the Bartholdi et al score,12 and the Birmingham score,14 were applied to all patients referred with suspected SRS. An overview of the three SRS scoring systems is summarised in Table 1. Using the criteria of Hitchins et al15 as the standard in this study, the sensitivity and specificity of these three scoring systems in identifying SRS were compared.
 

Table 1. Comparison of three common clinical scoring systems for SRS
 
Results
During the study period, 83 patients with suspected SRS were referred to the two genetic clinics. After clinical assessment and investigations, 54 patients had an alternative diagnosis. The remaining 29 patients were clinically diagnosed with SRS using the criteria of Hitchins et al.15 All were Chinese. One was a prenatal case with maternal H19 duplication; since the pregnancy was terminated at 23 weeks of gestation, this case was excluded from downstream analysis. For the remaining 28 SRS patients, age at the end of the study (September 2015) ranged from 2 years to 22 years 9 months, with a median of 9 years 4 months. The male-to-female ratio was 9:5. Sequential MS-MLPA study of the chromosome 11p15 region and mUPD7 study were performed in all SRS patients. Among the 28 live-birth SRS patients, 35.7% (n=10) had H19 LOM, 21.4% (n=6) had mUPD7, 3.6% (n=1) had mosaic mUPD11, and 39.3% (n=11) had SRS of unexplained origin. The clinical features of the SRS cohort are summarised in Table 2. The clinical features of some molecularly confirmed SRS patients in this study and one illustrative microsatellite electropherogram from the mUPD7 analysis are shown in Figure 1.
 

Table 2. Summary of the clinical features in different subgroups of SRS patients
 
In order to study the (epi)genotype-phenotype correlation between the H19 LOM and mUPD7 groups, their clinical features were compared. There was no significant difference between the two groups (data not shown). When comparing the 17 molecularly confirmed SRS patients with the 11 patients with SRS of unexplained origin, postnatal microcephaly (P=0.01) and café-au-lait spots (P=0.05) were more common in SRS of unexplained origin, while body asymmetry (P<0.01) and limb asymmetry (P<0.01) were more common in the molecularly confirmed group.
 
The performance of the three clinical scoring systems, namely the Netchine et al score,7 Bartholdi et al score,12 and Birmingham score,14 in identifying SRS in our cohort was compared. The proportions of molecularly confirmed cases among those classified as ‘likely SRS’ or ‘unlikely SRS’ by each scoring system are summarised in Table 3. The sensitivity and specificity of the different scoring systems for identifying SRS are summarised in Table 4.
 

Table 3. Proportion of different SRS subtypes with ‘likely SRS’ and ‘unlikely SRS’ score in different scoring systems in our cohort
 

Table 4. The sensitivity and specificity of the three clinical scoring systems compared with Hitchins et al’s criteria15 in identifying SRS in our cohort
 
Discussion
Silver-Russell syndrome is a clinically and genetically heterogeneous disorder. This is the first comprehensive clinical and epigenetic study of SRS in Hong Kong. With sequential 11p15 epimutation analysis and mUPD7 study of the SRS patients in this cohort, molecular confirmation was achieved in 60.7% of cases; H19 LOM and mUPD7 accounted for 35.7% and 21.4% of cases, respectively. Although the proportion of H19 LOM–related SRS cases was similar to that reported in western and Japanese populations,6 7 8 9 16 the proportion of mUPD7 in our cohort was significantly higher. Nonetheless, given the small sample size, this observation might not reflect a true ethnic-specific pattern of epigenetic alteration in the Chinese population. Further studies are necessary to confirm this difference.
 
In previous studies of (epi)genotype-phenotype correlation4 7 12 17 18 19 20 in SRS, patients with mUPD7 had a milder phenotype but were more likely to have developmental delay. In contrast, patients with H19 LOM appeared to have more typical SRS features such as the characteristic facial profile and body asymmetry. Such a correlation could not be demonstrated in our cohort. When comparing the molecularly confirmed and SRS of unexplained origin groups, postnatal microcephaly and café-au-lait spots were more common in the SRS of unexplained origin group, while body/limb asymmetry was more common in the molecularly confirmed group. This observation has also been reported in Japanese SRS patients.16 It might reflect the greater clinical and genetic heterogeneity among molecularly negative SRS patients.
 
Although SRS has been extensively studied, there remains no universal consensus on the clinical diagnostic criteria. Hitchins et al’s criteria15 are currently the most commonly used. To facilitate clinical diagnosis, several additional scoring systems have been proposed, including the Netchine et al,7 Bartholdi et al,12 and Birmingham scores.14 Each has its advantages and limitations. The major caveats of these scoring systems include the relative subjectivity of clinical signs, and time-dependent, evolving clinical features. The heterogeneity of clinical manifestations also limits their application. Several studies have evaluated the accuracy of these scoring systems in predicting the molecular genetic testing result.14 21 We also evaluated the performance of the three scoring systems in this Chinese cohort. All three scoring systems were 100% specific in diagnosing SRS, but the sensitivities of the Netchine et al score,7 Bartholdi et al score,12 and Birmingham score14 were 75%, 53.6%, and 71.4%, respectively, when compared with Hitchins et al’s criteria.15 This suggests that Hitchins et al’s criteria15 remain the most sensitive diagnostic criteria for SRS in clinical use.
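As a worked check of these figures (our own arithmetic, assuming each sensitivity is computed over the 28 clinically diagnosed patients):

```latex
\text{sensitivity} = \frac{TP}{TP + FN}:\qquad
\frac{21}{28} = 75\%,\qquad
\frac{15}{28} \approx 53.6\%,\qquad
\frac{20}{28} \approx 71.4\%
```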
 
The management of SRS is challenging and requires multidisciplinary input. Growth hormone (GH) treatment is currently recommended for children born small for gestational age without spontaneous catch-up growth and for those with GH deficiency. In SRS, abnormalities in spontaneous GH secretion and subnormal responses to provocative GH stimulation have been well documented.20 The proposed mechanism is dysregulation of growth factors and their major binding proteins,4 particularly in the H19 LOM group. In addition, SRS patients are expected to have poor catch-up growth. Nonetheless, GH therapy is not a universal standard treatment for SRS. In Hong Kong, the Hospital Authority indications for GH therapy do not include SRS22 in the absence of an abnormal GH response. In our cohort, only three patients, each with a suboptimal GH provocative stimulation test, are currently receiving GH treatment. The long-term outcome is not yet known.
 
Although tissue-specific epigenetic manifestation has been reported in SRS,23 mosaic genetic or epigenetic alteration is uncommon.24 One patient in our cohort had mosaic mUPD11 confirmed by molecular testing of both peripheral blood and buccal swab samples. Mosaicism should be considered when a patient has a typical SRS phenotype but negative routine testing. Testing of other tissues should then be pursued to provide an accurate molecular diagnosis that can guide subsequent genetic counselling and clinical management.
 
Finally, it is well established in the literature that gain-of-function mutations of the CDKN1C gene25 and maternal UPD14 (Temple syndrome)26 27 can result in a phenotype mimicking SRS. Other syndromic growth retardation disorders, such as mulibrey nanism and 3-M syndrome, also share many clinical features with SRS.28 29 Therefore, based on the latest understanding of the molecular pathogenetic mechanisms of SRS, together with published evidence21 30 31 and the results of this study, we propose a diagnostic algorithm for Chinese SRS patients, as depicted in Figure 2. All clinically suspected SRS patients should be assessed by a clinical geneticist. Although the Netchine et al score,7 Bartholdi et al score,12 and Birmingham score14 are highly specific, they are less sensitive than Hitchins et al’s criteria15 for diagnosing SRS in our Chinese cohort. Therefore, Hitchins et al’s criteria15 should be used clinically to classify suspected SRS patients as ‘likely’ or ‘unlikely’ SRS. For ‘likely SRS’ patients, sequential 11p15 region methylation study and mUPD7 analysis should be performed, because 11p15 epigenetic alteration is more prevalent than mUPD7 in SRS. For molecularly unconfirmed SRS, further testing for other SRS-like syndromes, including Temple syndrome and CDKN1C-related disorders, should be pursued if indicated.
 

Figure 2. Proposed algorithm for management and genetic investigations for suspected SRS in Hong Kong
 
Conclusion
This 5-year review is the first territory-wide study of Chinese SRS patients in Hong Kong. It showed that the clinical features and underlying epigenetic mechanisms of SRS in Chinese patients are similar to those reported in western populations. Early diagnosis and multidisciplinary care are important in managing SRS patients, and vigilant clinical suspicion with confirmation by molecular testing is essential. Based on the current evidence and a performance evaluation of different clinical scoring systems, a comprehensive diagnostic algorithm is proposed. We hope that, with increased understanding of the underlying pathophysiology and the (epi)genotype-phenotype correlation in Chinese SRS patients, the quality of medical care will be greatly improved in the near future.
 
Acknowledgements
We thank all the paediatricians and physicians who referred their SRS patients to our service and QMH. We are also grateful to all the laboratory staff of the CGS for their technical support. The work at HKU was supported by HKU small project funding and The Society for the Relief of Disabled Children in Hong Kong.
 
Declaration
All authors have disclosed no conflicts of interest.
 
References
1. Silver HK, Kiyasu W, George J, Deamer WC. Syndrome of congenital hemihypertrophy, shortness of stature, and elevated urinary gonadotropins. Pediatrics 1953;12:368-76.
2. Russell A. A syndrome of intra-uterine dwarfism recognizable at birth with cranio-facial dysostosis, disproportionately short arms, and other anomalies (5 examples). Proc R Soc Med 1954;47:1040-4.
3. Wollmann HA, Kirchner T, Enders H, Preece MA, Ranke MB. Growth and symptoms in Silver-Russell syndrome: review on the basis of 386 patients. Eur J Pediatr 1995;154:958-68.
4. Wakeling EL, Amero SA, Alders M, et al. Epigenotype-phenotype correlations in Silver-Russell syndrome. J Med Genet 2010;47:760-8.
5. Christoforidis A, Maniadaki I, Stanhope R. Managing children with Russell-Silver syndrome: more than just growth hormone treatment? J Pediatr Endocrinol Metab 2005;18:651-2.
6. Kotzot D, Schmitt S, Bernasconi F, et al. Uniparental disomy 7 in Silver-Russell syndrome and primordial growth retardation. Hum Mol Genet 1995;4:583-7.
7. Netchine I, Rossignol S, Dufourg MN, et al. 11p15 Imprinting center region 1 loss of methylation is a common and specific cause of typical Russell-Silver syndrome: clinical scoring system and epigenetic-phenotypic correlations. J Clin Endocrinol Metab 2007;92:3148-54.
8. Gicquel C, Rossignol S, Cabrol S, et al. Epimutation of the telomeric imprinting center region on chromosome 11p15 in Silver-Russell syndrome. Nat Genet 2005;37:1003-7.
9. Schönherr N, Meyer E, Eggermann K, Ranke MB, Wollmann HA, Eggermann T. (Epi)mutations in 11p15 significantly contribute to Silver-Russell syndrome: but are they generally involved in growth retardation? Eur J Med Genet 2006;49:414-8.
10. Azzi S, Abi Habib W, Netchine I. Beckwith-Wiedemann and Russell-Silver Syndromes: from new molecular insights to the comprehension of imprinting regulation. Curr Opin Endocrinol Diabetes Obes 2014;21:30-8.
11. Price SM, Stanhope R, Garrett C, Preece MA, Trembath RC. The spectrum of Silver-Russell syndrome: a clinical and molecular genetic study and new diagnostic criteria. J Med Genet 1999;36:837-42.
12. Bartholdi D, Krajewska-Walasek M, Ounap K, et al. Epigenetic mutations of the imprinted IGF2-H19 domain in Silver-Russell syndrome (SRS): results from a large cohort of patients with SRS and SRS-like phenotypes. J Med Genet 2009;46:192-7.
13. Eggermann T, Gonzalez D, Spengler S, Arslan-Kirchner M, Binder G, Schönherr N. Broad clinical spectrum in Silver-Russell syndrome and consequences for genetic testing in growth retardation. Pediatrics 2009;123:e929-31.
14. Dias RP, Nightingale P, Hardy C, et al. Comparison of the clinical scoring systems in Silver-Russell syndrome and development of modified diagnostic criteria to guide molecular genetic testing. J Med Genet 2013;50:635-9.
15. Hitchins MP, Stanier P, Preece MA, Moore GE. Silver-Russell syndrome: a dissection of the genetic aetiology and candidate chromosomal regions. J Med Genet 2001;38:810-9.
16. Fuke T, Mizuno S, Nagai T, et al. Molecular and clinical studies in 138 Japanese patients with Silver-Russell syndrome. PLoS One 2013;8:e60105.
17. Bliek J, Terhal P, van den Bogaard MJ, et al. Hypomethylation of the H19 gene causes not only Silver-Russell syndrome (SRS) but also isolated asymmetry or an SRS-like phenotype. Am J Hum Genet 2006;78:604-14.
18. Bruce S, Hannula-Jouppi K, Peltonen J, Kere J, Lipsanen-Nyman M. Clinically distinct epigenetic subgroups in Silver-Russell syndrome: the degree of H19 hypomethylation associates with phenotype severity and genital and skeletal anomalies. J Clin Endocrinol Metab 2009;94:579-87.
19. Kotzot D. Maternal uniparental disomy 7 and Silver-Russell syndrome—clinical update and comparison with other subgroups. Eur J Med Genet 2008;51:444-51.
20. Binder G, Seidel AK, Martin DD, et al. The endocrine phenotype in Silver-Russell syndrome is defined by the underlying epigenetic alteration. J Clin Endocrinol Metab 2008;93:1402-7.
21. Azzi S, Salem J, Thibaud N, et al. A prospective study validating a clinical scoring system and demonstrating phenotypical-genotypical correlations in Silver-Russell syndrome. J Med Genet 2015;52:446-53.
22. But WM, Huen KF, Lee CY, Lam YY, Tse WY, Yu CM. An update on the indications of growth hormone treatment under Hospital Authority in Hong Kong. Hong Kong J Paediatr 2012;17:208-16.
23. Azzi S, Blaise A, Steunou V, et al. Complex tissue-specific epigenotypes in Russell-Silver Syndrome associated with 11p15 ICR1 hypomethylation. Hum Mutat 2014;35:1211-20.
24. Bullman H, Lever M, Robinson DO, Mackay DJ, Holder SE, Wakeling EL. Mosaic maternal uniparental disomy of chromosome 11 in a patient with Silver-Russell syndrome. J Med Genet 2008;45:396-9.
25. Brioude F, Oliver-Petit I, Blaise A, et al. CDKN1C mutation affecting the PCNA-binding domain as a cause of familial Russell Silver syndrome. J Med Genet 2013;50:823-30.
26. Ioannides Y, Lokulo-Sodipe K, Mackay DJ, Davies JH, Temple IK. Temple syndrome: improving the recognition of an underdiagnosed chromosome 14 imprinting disorder: an analysis of 51 published cases. J Med Genet 2014;51:495-501.
27. Kagami M, Mizuno S, Matsubara K, et al. Epimutations of the IG-DMR and the MEG3-DMR at the 14q32.2 imprinted region in two patients with Silver-Russell Syndrome–compatible phenotype. Eur J Hum Genet 2015;23:1062-7.
28. Hämäläinen RH, Mowat D, Gabbett MT, O’Brien TA, Kallijärvi J, Lehesjoki AE. Wilms’ tumor and novel TRIM37 mutations in an Australian patient with mulibrey nanism. Clin Genet 2006;70:473-9.
29. van der Wal G, Otten BJ, Brunner HG, van der Burgt I. 3-M syndrome: description of six new patients with review of the literature. Clin Dysmorphol 2001;10:241-52.
30. Scott RH, Douglas J, Baskcomb L, et al. Methylation-specific multiplex ligation-dependent probe amplification (MS-MLPA) robustly detects and distinguishes 11p15 abnormalities associated with overgrowth and growth retardation. J Med Genet 2008;45:106-13.
31. Spengler S, Begemann M, Ortiz Brüchle N, et al. Molecular karyotyping as a relevant diagnostic tool in children with growth retardation with Silver-Russell features. J Pediatr 2012;161:933-42.

Management of health care workers following occupational exposure to hepatitis B, hepatitis C, and human immunodeficiency virus

Hong Kong Med J 2016 Oct;22(5):472–7 | Epub 26 Aug 2016
DOI: 10.12809/hkmj164897
© Hong Kong Academy of Medicine. CC BY-NC-ND 4.0
 
ORIGINAL ARTICLE  CME
Management of health care workers following occupational exposure to hepatitis B, hepatitis C, and human immunodeficiency virus
Winnie WY Sin, MB, ChB, FHKAM (Medicine); Ada WC Lin, MB, BS, FHKAM (Medicine); Kenny CW Chan, MB, BS, FHKAM (Medicine); KH Wong, MB, BS, FHKAM (Medicine)
Special Preventive Programme, Centre for Health Protection, Department of Health, Kowloon Bay Health Centre, Hong Kong
 
Corresponding author: Dr Kenny CW Chan (kcwchan@dh.gov.hk)
 
 
Abstract
Introduction: Needlestick injury or mucosal contact with blood or body fluids is well recognised in the health care setting. This study aimed to describe the post-exposure management and outcome in health care workers following exposure to hepatitis B, hepatitis C, or human immunodeficiency virus (HIV) during needlestick injury or mucosal contact.
 
Methods: This case series study was conducted in a public clinic in Hong Kong. All health care workers with a needlestick injury or mucosal contact with blood or body fluids who were referred to the Therapeutic Prevention Clinic of Department of Health from 1999 to 2013 were included.
 
Results: A total of 1525 health care workers were referred to the Therapeutic Prevention Clinic following occupational exposure. Most sustained a percutaneous injury (89%), in particular during post-procedure cleaning or tidying up. Gloves were worn in 62.7% of instances. The source patient could be identified in 83.7% of cases, but the infection status was usually unknown, with baseline positivity rates of hepatitis B, hepatitis C, and HIV of all identified sources, as reported by the injured, being 7.4%, 1.6%, and 3.3%, respectively. Post-exposure prophylaxis of HIV was prescribed to 48 health care workers, of whom 14 (38.9%) had been exposed to known HIV-infected blood or body fluids. The majority (89.6%) received HIV post-exposure prophylaxis within 24 hours of exposure. Drug-related adverse events were encountered by 88.6%. The completion rate of post-exposure prophylaxis was 73.1%. After a follow-up period of 6 months (or 1 year for those who had taken HIV post-exposure prophylaxis), no hepatitis B, hepatitis C, or HIV seroconversions were detected.
 
Conclusions: Percutaneous injury in the health care setting is not uncommon but post-exposure prophylaxis for HIV is infrequently indicated. There was no hepatitis B, hepatitis C, or HIV transmission via sharps or mucosal injury in this cohort of health care workers.
 
 
New knowledge added by this study
  • The risk of hepatitis B (HBV), hepatitis C (HCV), and human immunodeficiency virus (HIV) transmission following occupational sharps or mucosal injury in Hong Kong is small.
Implications for clinical practice or policy
  • Meticulous adherence to infection control procedures and timely post-exposure management prevents HBV, HCV, and HIV infection following occupational exposure to blood and body fluids.
 
 
Introduction
Needlestick injury or mucosal contact with blood or body fluids is well recognised in the health care setting. These incidents pose a small but definite risk for health care workers of acquiring blood-borne viruses, notably hepatitis B virus (HBV), hepatitis C virus (HCV), and human immunodeficiency virus (HIV). The estimated risk of contracting HBV infection through occupational exposure to known infected blood via needlestick injury varies from 18% to 30%, while that for HCV infection is 1.8% (range, 0%-7%).1 The risk of HIV transmission following percutaneous or mucosal exposure to HIV-contaminated blood is 0.3% and 0.09%, respectively.1 The risk is further affected by the type of exposure, body fluid involved, and infectivity of the source.
 
In Hong Kong, injured health care workers usually receive initial first aid and immediate management in the Accident and Emergency Department. They are then referred to designated clinics for specific post-exposure management. Currently, aside from staff of the Hospital Authority, who are managed at two designated clinics post-exposure, all other health care workers from private hospitals and from government or private clinics and laboratories are referred to the Therapeutic Prevention Clinic (TPC) of the Integrated Treatment Centre, Department of Health. Since its launch in mid-1999, the TPC has provided comprehensive post-exposure management to people with documented percutaneous, mucosal, or breached-skin exposure to blood or body fluids in accordance with the local guidelines set out by the Scientific Committee on AIDS and STI, and the Infection Control Branch of the Centre for Health Protection, Department of Health.2 The present study describes the characteristics and outcome of health care workers who attended the TPC from mid-1999 to 2013 following occupational exposure to blood or body fluids.
 
Methods
The study included all health care workers seen in the TPC from July 1999 to December 2013 following occupational exposure to blood or body fluids, who attended following secondary referral by an accident and emergency department of a public hospital. Using two standard questionnaires (Appendices 1 and 2), data were collected by the attending nurse and doctor during a face-to-face interview with each health care worker on the following: demography and occupation of the exposed client, type and pattern of exposure, post-exposure management, and clinical outcome.
 
Appendix 1. TPC First Consultation Assessment Form
 
Appendix 2. Therapeutic Prevention Clinic (TPC) human immunodeficiency virus (HIV) Post-exposure Prophylaxis Registry Form (to be completed on completion or cessation of post-exposure prophylaxis)
 
Details of the exposure, including type of exposure and the situation in which it occurred, were noted. The number of risk factors (see definitions below) for HIV transmission was counted for each exposure and further classified as high risk or low risk. Where known and reported by the injured party, hepatitis B surface antigen (HBsAg), HCV, and HIV status of the source were recorded.
 
The timing of the first medical consultation in the accident and emergency department, any prescription of HIV post-exposure prophylaxis (PEP), and the time since injury were noted. Exposed health care workers who received HIV PEP were reviewed at clinic visits every 2 weeks until completion of the 4-week course of treatment, and any treatment-related adverse effects were reported. Blood was obtained as appropriate at these visits for measurement of complete blood count, renal and liver function, and amylase, creatine kinase, fasting lipid, and glucose levels.
 
Apart from HIV PEP–related side-effects (reported and rated by patients as mild, moderate, or severe), the rate of completion of PEP and the number of HBV, HCV, and HIV seroconversions following the incident were also recorded. The HBsAg, anti-HBs, anti-HCV, and anti-HIV were checked at baseline and 6 months post-exposure to determine whether seroconversion had occurred. Those exposed to a known HCV-infected source or a source known to be an injecting drug user had additional blood tests 6 weeks post-exposure for liver function, anti-HCV, and HCV RNA. Additional HIV antibody testing at 3 and 12 months post-exposure was arranged for those who received HIV PEP. For those who contracted HCV infection from a source co-infected with HCV and HIV, further HIV testing was performed at 1 year post-exposure to detect delayed seroconversion.
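For clarity, the follow-up protocol described above can be recast as a lookup table (our own representation; keys and labels are shorthand, not clinic terminology):

```python
# Sketch of the follow-up testing schedule described in the text.
# Timepoints are post-exposure, in weeks or months as noted.
FOLLOW_UP = {
    "all_exposures": {
        "baseline": ["HBsAg", "anti-HBs", "anti-HCV", "anti-HIV"],
        "6 months": ["HBsAg", "anti-HBs", "anti-HCV", "anti-HIV"],
    },
    "source_HCV_positive_or_injecting_drug_user": {
        "6 weeks": ["liver function", "anti-HCV", "HCV RNA"],
    },
    "received_HIV_PEP": {
        "3 months": ["anti-HIV"],
        "12 months": ["anti-HIV"],
    },
    "HCV_seroconversion_from_HCV_HIV_coinfected_source": {
        "12 months": ["anti-HIV"],  # to detect delayed HIV seroconversion
    },
}
```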
 
Definitions
Health care workers included doctors and medical students, dentists and dental workers, nurses, midwives, inoculators, laboratory workers, phlebotomists, ward or clinic attendants, and workmen. Staff working in non–health care institutions (eg elderly homes, hostels, and sheltered workshops) were excluded. Five factors were classified as high-risk exposure: (i) deep percutaneous injury, (ii) procedures involving a device placed in a blood vessel, (iii) use of a hollow-bore needle, (iv) device that was visibly contaminated with blood, and (v) source person with acquired immunodeficiency syndrome (AIDS).3 Another five factors were classified as low-risk exposure: (i) moderate percutaneous injury, (ii) mucosal contact, (iii) contact with deep body fluids other than blood, (iv) source person HIV-infected but without AIDS or with uncertain AIDS status, and (v) any other reason contributing to increased risk according to clinical judgement.
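A minimal sketch of this classification follows (an illustration under our own assumption that any high-risk factor takes precedence over low-risk factors; the factor labels are shorthand for the items above):

```python
# Sketch of the exposure risk classification defined in the text.
HIGH_RISK = {"deep_percutaneous_injury", "device_in_blood_vessel",
             "hollow_bore_needle", "visible_blood_on_device",
             "source_has_AIDS"}
LOW_RISK = {"moderate_percutaneous_injury", "mucosal_contact",
            "deep_body_fluid_other_than_blood",
            "source_HIV_positive_AIDS_status_uncertain",
            "other_clinical_judgement"}

def classify_exposure(factors: set) -> str:
    """Classify an exposure by the risk factors present."""
    if factors & HIGH_RISK:
        return "high risk"
    if factors & LOW_RISK:
        return "low risk"
    return "no identified risk factor"

print(classify_exposure({"hollow_bore_needle", "mucosal_contact"}))
# -> "high risk"
```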
 
Results
From July 1999 to December 2013, 1525 health care workers (75-168 per year) with occupational exposure to HBV, HCV, or HIV were referred to the TPC (Fig). Females constituted 77% of all attendees. The median age was 33 years (range, 17-73 years). The majority came from the dental profession (36.8%) and nursing profession (33.4%), followed by ward/clinic ancillary staff (11.6%) and the medical profession (4.7%).
 

Figure. Referrals of health care workers with occupational exposure to Therapeutic Prevention Clinic and the post-exposure prophylaxis (PEP) prescription
 
Type and pattern of exposure
The majority of exposures occurred in a public clinic or laboratory (n=519, 34.0%), followed by public hospital (n=432, 28.3%), private clinic or laboratory (n=185, 12.1%), and private hospital (n=23, 1.5%). Most were percutaneous injuries (88.9%). Mucosal contact, breached-skin contact, and human bite were infrequent (Table 1). Approximately 60% of the incidents occurred in one of four situations: (a) cleaning/tidying up after procedures (the most common), (b) other bedside/treatment room procedures, (c) injection, including recapping of needles, or (d) blood taking/intravenous catheter insertion. The contact specimen was blood or blood products in 30.6%, blood-contaminated fluid in 5.8%, and saliva or urine in 14.1% of cases. The technical device involved was a hollow-bore needle in 48.1%, a dental instrument in 20.7%, and a lancet in 7.7%. More than 80% considered the injury superficial.
 

Table 1. Details of occupational exposure in health care workers
 
High-risk and low-risk factors were noted in 869 (57%) and 166 (11%) exposures, respectively. Blood taking/intravenous catheter insertion carried the highest risk among all the procedures, with a mean of 1.29 risk factors per exposure (Table 2). Gloves were used in 956 (62.7%) exposures, goggles/mask in 50 (3.3%), and gown/apron in 55 (3.6%). Nonetheless, 101 (6.6%) health care workers indicated that they had not used any personal protective equipment during the exposure.
 

Table 2. Risk factors in health care workers with higher-risk occupational exposure during various activities/procedures from 1999 to 2013
 
The source patient could be identified in 1277 (83.7%) cases but the infectious status was unknown in most. The baseline known positivity rate for HBV, HCV, and HIV of all identified sources was 7.4%, 1.6%, and 3.3%, respectively (Table 1).
 
Care and clinical outcome
Nearly half of the injured health care workers attended a medical consultation within 2 hours (n=720, 47.2%) and another 552 (36.2%) attended between 2 and 12 hours following exposure. The median time between exposure and medical consultation was 2.0 hours.
 
During the study period, 48 (3.1%) health care workers received HIV PEP for occupational exposure, ranging from zero to eight per year (Fig). One third received PEP within 2 hours of exposure, and the majority (89.6%) within 24 hours. The median time to PEP was 4.0 hours post-exposure (interquartile range, 2.0-8.1 hours). A three-drug regimen was prescribed in 85.7% of cases. The most common regimen was zidovudine/lamivudine/indinavir (39.6%), followed by zidovudine/lamivudine/ritonavir-boosted lopinavir (31.3%), and zidovudine/lamivudine (12.5%) [Table 3]. Upon consultation and risk assessment at the TPC, 36 (75%) workers had treatment continued from the accident and emergency department. Among them, the source was confirmed to be HIV-positive in 14 (38.9%) cases. Of the 35 clients with known outcome, drug-related adverse events were seen in 31 (88.6%) health care workers; more than half of these events (n=18, 58.1%) were considered moderate or severe. Treatment-related side-effects led to early termination of PEP in eight (22.9%) health care workers. Excluding nine clients in whom prophylaxis was stopped when the source was established to be HIV-negative, 19 (73.1%) clients completed the 28-day course of PEP. All 14 clients who sustained injury from an HIV-infected source patient received PEP, but two did not complete the course; the completion rate was 85.7%.
 

Table 3. Post-exposure prophylaxis regimens of human immunodeficiency virus
 
At baseline, none of the injured health care workers tested positive for HCV or HIV, while 49 (3.2% of all health care workers seen in TPC) tested HBsAg-positive. Almost half of the health care workers (n=732, 48.0%) were immune to HBV (anti-HBs positive). After follow-up of 6 months (1 year for those who took PEP), no case of HBV, HCV, or HIV seroconversion was detected in this cohort.
 
Discussion
Health care workers may be exposed to blood-borne viruses when they handle sharps and body fluids. Adherence to the standard precautions of infection control is thus an integral component of occupational health and safety for health care workers. In this cohort, percutaneous injury with sharps during cleaning or tidying up after procedures remained the most common mechanism of injury. Many of these incidents could have been prevented by safer practice, for instance, by not recapping needles or by disposing of needles directly into a sharps box after use. The use of gloves as part of standard precautions was suboptimal, and greater emphasis should be placed on the importance of wearing appropriate personal protective equipment during staff training at induction and on refresher courses. Technical devices with safety or needleless features may reduce sharps injuries. Improvements to the system (eg placing a sharps box near the work area) or to the workflow to minimise distraction may also help compliance with infection control measures.
 
Once exposure occurs, PEP is the last defence against HBV and HIV. For HBV infection, PEP with hepatitis B immunoglobulin followed by hepatitis B vaccination has long been standard practice in Hong Kong. For HIV infection, the efficacy of PEP in health care workers following occupational exposure was demonstrated in a landmark overseas case-control study3: prescription of zidovudine achieved an 81% reduction in the risk of HIV seroconversion following percutaneous exposure to HIV-infected blood.3 Local and international guidelines now recommend a combination of three antiretroviral drugs for PEP.2 4 5 6 In this cohort, although more than half of the exposures had higher risk factors for HIV acquisition, it was uncommon for the source patient to have known HIV infection (2.8% of these exposures). Thus, in accordance with the local guideline, PEP was not commonly prescribed. Nevertheless, PEP was prescribed in all 14 exposures to a known HIV-positive source and in another 34 exposures after risk assessment. Our experience is comparable with that of health care services in the UK and US. In the UK, 78% of health care workers exposed to an HIV-infected source patient were prescribed PEP.7 In a report from the US, only 68% of health care workers with such exposure took PEP.8 For HCV, PEP with antiviral therapy is not recommended according to the latest guidelines from the American Association for the Study of Liver Diseases/Infectious Diseases Society of America.9 If seroconversion occurs and early treatment is considered desirable, patients with acute hepatitis C can be treated with direct-acting antivirals using the same regimen recommended for chronic hepatitis C.
 
If indicated, HIV PEP should be taken as early as possible after exposure to achieve maximal effect; initiation of PEP more than 72 hours after exposure was shown to be ineffective in animal studies.10 The timing of PEP initiation in our cohort appeared to be less prompt (33.3% within 2 hours, compared with more than 60% and 80% within 3 hours in the UK and US, respectively). Overall, however, 89.6% managed to start PEP within 24 hours, in line with experience in the UK and US. Health care workers should be reminded about post-exposure management and the need for timely medical assessment following occupational exposure. In the accident and emergency department, priority assessment should be given to health care workers exposed to blood-borne viruses. The median duration of PEP intake was 28 days, in line with the local guidelines. With the availability of newer drugs with fewer toxicities, tolerance and the compliance rate should improve.
 
Finally, applying the estimated 0.3% risk of HIV transmission per percutaneous injury, we would expect four HIV seroconversions among the 1356 percutaneous exposures in the TPC if all had involved HIV-infected blood. Because the source HIV status was unknown in most of these exposures, and likely negative in this region of overall low HIV prevalence (approximately 0.1%11), the actual risk of HIV transmission in the health care setting of Hong Kong was much lower. This is consistent with the fact that no HIV seroconversion occurred in this cohort. In addition, those with exposures of the highest risk received HIV PEP. In the UK, there were 4381 significant occupational exposures from 2002 to 2011, of which 1336 were exposures to HIV-infected blood or body fluid; no HIV seroconversions occurred among these exposures.7 In the US, there has been one confirmed case of occupational transmission of HIV to a health care worker since 1999.12 Similarly, the local prevalence of HCV infection is low (<0.1% in new blood donors13), partly explaining the absence of HCV transmission in this cohort. In contrast, 20 cases of HCV seroconversion in health care workers were reported between 1997 and 2011 in the UK.7 Hepatitis B is considered endemic in Hong Kong, with an HBsAg positivity of 1.1% in new blood donors and 6.5% in antenatal women in 2013.13 Nonetheless, the HBV vaccination programme for health care workers coupled with HBV PEP has proven successful in preventing HBV transmission to health care workers. With concerted efforts in infection control and timely PEP, transmission of blood-borne viruses via sharps and mucosal injury in the health care setting is largely preventable.
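The expected count quoted above follows directly from multiplying the number of exposures by the per-exposure transmission risk:

```latex
E = n \times p = 1356 \times 0.003 \approx 4.1
```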
 
There are several limitations to our study. First, data were collected from a single centre and based on secondary referral. We did not have data for other health care workers who had occupational exposure but who were not referred to the TPC for post-exposure management, or who were referred but did not attend. Thus, we were not able to draw any general conclusions on the true magnitude of the problem. Second, details of the exposure and the infection status of the source were self-reported by the exposed client and prone to bias and under-reporting.
 
Conclusions
Percutaneous injury with sharps during cleaning or tidying up after procedures was the most common cause of occupational exposure to blood or body fluids in this cohort of health care workers. The majority of source patients were not confirmed HIV-positive, and HIV PEP was not generally indicated. Prescriptions of HIV PEP were appropriate and timely in most cases. There were no HIV, HBV, or HCV seroconversions in health care workers who attended the TPC following sharps or mucosal injury from mid-1999 to 2013.
 
Declaration
All authors have disclosed no conflicts of interest.
 
References
1. Pruss-Ustun A, Rapiti E, Hutin Y. Sharps injuries: Global burden of disease from sharps injuries to health-care workers (World Health Organization Environmental Burden of Disease Series, No. 3). Available from: http://www.who.int/quantifying_ehimpacts/publications/en/sharps.pdf?ua=1. Accessed 2 Feb 2016.
2. Scientific Committee on AIDS and STI (SCAS), and Infection Control Branch, Centre for Health Protection, Department of Health. Recommendations on the management and postexposure prophylaxis of needlestick injury or mucosal contact to HBV, HCV and HIV. Hong Kong: Department of Health; 2014.
3. Cardo DM, Culver DH, Ciesielski CA, et al. A case-control study of HIV seroconversion in health care workers after percutaneous exposure. Centers for Disease Control and Prevention Needlestick Surveillance Group. N Engl J Med 1997;337:1485-90.
4. Kuhar DT, Henderson DK, Struble KA, et al. Updated US Public Health Service guidelines for the management of occupational exposures to human immunodeficiency virus and recommendations for postexposure prophylaxis. Infect Control Hosp Epidemiol 2013;34:875-92.
5. UK Department of Health. HIV post-exposure prophylaxis: guidance from the UK Chief Medical Officers’ Expert Advisory Group on AIDS. 19 September 2008 (last updated 29 April 2015).
6. WHO Guidelines Approved by the Guidelines Review Committee. Guidelines on Post-Exposure Prophylaxis for HIV and the Use of Co-Trimoxazole Prophylaxis for HIV-Related Infections Among Adults, Adolescents and Children: Recommendations for a Public Health Approach: December 2014 supplement to the 2013 consolidated guidelines on the use of antiretroviral drugs for treating and preventing HIV infection. Geneva: World Health Organization; December 2014.
7. Eye of the Needle. United Kingdom surveillance of significant occupational exposures to bloodborne viruses in healthcare workers. London: Health Protection Agency; December 2012.
8. US Department of Health and Human Services, Centers for Disease Control and Prevention. The National Surveillance System for Healthcare Workers (NaSH): Summary report for blood and body fluid exposure data collected from participating healthcare facilities (June 1995 through December 2007).
9. American Association for the Study of Liver Diseases/Infectious Diseases Society of America. HCV guidance: recommendations for testing, managing, and treating hepatitis C (updated 24 February 2016). Available from: http://www.hcvguidelines.org. Accessed 5 May 2016.
10. Tsai CC, Emau P, Follis KE, et al. Effectiveness of postinoculation (R)-9-(2-phosphonylmethoxypropyl) adenine treatment for prevention of persistent simian immunodeficiency virus SIVmne infection depends critically on timing of initiation and duration of treatment. J Virol 1998;72:4265-73.
11. HIV surveillance report—2014 update. Department of Health, The Government of the Hong Kong Special Administrative Region; December 2015.
12. Joyce MP, Kuhar D, Brooks JT. Occupationally acquired HIV infection among health care workers—United States, 1985-2013. MMWR Morb Mortal Wkly Rep 2015;63:1245-6.
13. Surveillance of viral hepatitis in Hong Kong—2014 update. Department of Health, The Government of the Hong Kong Special Administrative Region; December 2015.

Violence against emergency department employees and the attitude of employees towards violence

Hong Kong Med J 2016 Oct;22(5):464–71 | Epub 26 Aug 2016
DOI: 10.12809/hkmj154714
© Hong Kong Academy of Medicine. CC BY-NC-ND 4.0
 
ORIGINAL ARTICLE
Violence against emergency department employees and the attitude of employees towards violence
Halil İ Çıkrıklar, MD1; Yusuf Yürümez, MD1; Buket Güngör, MD2; Rüstem Aşkın, MD2; Murat Yücel, MD1; Canan Baydemir, MD3
1 Department of Emergency Medicine, Sakarya University, Medical Faculty, Sakarya, Turkey
2 Psychiatry Clinic, Ministry of Health, Şevket Yilmaz Training and Research Hospital, Bursa, Turkey
3 Department of Biostatistics, Eskişehir Osmangazi University, Medical Faculty, Eskişehir, Turkey
 
Corresponding author: Dr Halil İ Çıkrıklar (halilcikriklar@hotmail.com)
 
 
Abstract
Introduction: This study was conducted to evaluate the occurrence of violent incidents in the workplace among the various professional groups working in the emergency department. We characterised the types of violence encountered by different occupation groups and the attitude of individuals working in different capacities.
 
Methods: This cross-sectional study included 323 people representing various professional groups working in two distinct emergency departments in Turkey. The participants were asked to complete questionnaires prepared in advance by the researchers. The data were analysed using the Statistical Package for the Social Sciences (Windows version 15.0).
 
Results: A total of 323 subjects, including 189 (58.5%) men and 134 (41.5%) women, participated in the study. Their mean (± standard deviation) age was 31.5 ± 6.5 years and 32.0 ± 6.9 years, respectively. In all, 74.0% of participants had been subjected to verbal or physical violence at some point since starting employment in a medical profession, and 50.2% stated that they had been subjected to violence more than five times. Among those who reported being subjected to violence, 42.7% had formally reported the incident(s). Furthermore, 74.3% of participants did not enjoy their profession, did not want to work in the emergency department, or would prefer employment in a non–health care field after being subjected to violence. According to the study participants, the most common cause of violence was the attitude of patients or their family members (28.7%). In addition, 79.6% (n=257) of participants stated that they did not have adequate safety protection in their working area. According to the study participants, there is a need for legal regulations to effectively deter violence and for increased safety measures to reduce the incidence of violence in the emergency department.
 
Conclusion: Violence against employees in the emergency department is a widespread problem with a strong negative effect on employee satisfaction and work performance. To reduce the incidence of violence in the emergency department, patients and their families should be better informed so that they have realistic expectations of emergency care, deterrent legal regulations should be put in place, and increased efforts should be made to provide enhanced security for emergency department personnel. These measures would reduce workplace violence and the stress experienced by emergency workers, and we expect this to have a positive impact on emergency health care service delivery.
 
 
New knowledge added by this study
  • The prevalence of violence against employees in emergency departments is high.
Implications for clinical practice or policy
  • Various measures can be implemented to reduce the incidence of violence in the emergency department.
 
 
Introduction
Violence, which has been ever present throughout the history of humanity, is defined as the threatened or actual use of power or strength against another person, oneself, a group, or a community in order to cause injury and/or loss.1 The World Health Organization defines violence as “physical assault, homicide, verbal assault, emotional, sexual or racial harassment”.2
 
Workplace violence is defined as “abuse or attacks by one or more people on an employee within the workplace”.3 The health care field, which encompasses a wide range of employees, is among those in which workplace violence is common.4 Violence in the health care field is defined as “risk to a health worker due to threatening behaviour, verbal threats, physical assault and sexual assault committed by patients, patient relatives, or any other person”.3
 
According to the 2002 Workplace Violence in the Health Sector report, 25% of all violent incidents occurred in the health care sector.5 A study conducted in the United States determined that the risk of being subjected to violence is 16 times higher in the health care sector relative to other service sectors.6 Within the health care field, the department that is most frequently exposed to violence is the emergency department (ED).3 7 8 9 In this context, verbal and physical attacks by dissatisfied patients and their relatives are at the forefront.10 11
 
In this study we aimed to determine the extent of violence towards ED employees, analyse the attitude of the staff exposed to violence, and propose possible solutions.
 
Methods
This cross-sectional study was conducted in the EDs of Şevket Yilmaz Training and Research Hospital and Sakarya University between 1 July and 15 August 2012. Employees of ED—including doctors, nurses, health care officials, Emergency Medical Technicians (EMT), secretaries, laboratory technicians, radiology technicians, and security and cleaning staff—were included in the study. The questionnaire was prepared in accordance with previous publications3 10 11 and distributed to participants. All study participants were provided with information regarding the objectives of the study and were given instructions for completing the form. Of the 437 ED employees working in the two hospitals, 323 (73.9%) agreed to participate in the study and returned a completed questionnaire.
 
In addition to demographic information, the questionnaire contained questions about the number of violent incidents to which the individual had been subjected, the type of violence, and whether the subject reported the incident or the reason for not reporting. Additional questions concerned a description of the person(s) responsible for the violence, the estimated age of the person(s) responsible, and the severity of the violence. We also asked participants about their attitude following the violent incident and their suggestions for reducing violence in the ED.
 
This study was conducted in accordance with the principles of the 2008 Helsinki Declaration. The data were analysed using the Statistical Package for the Social Sciences (Windows version 15.0; SPSS Inc, Chicago [IL], US). Both proportions and means ± standard deviations were used to represent the results. Student’s t test, Pearson’s chi squared test, and the Monte Carlo chi squared test were used to evaluate observed differences between groups, and a P value of <0.05 was considered to represent a statistically significant difference.
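These group comparisons can be reproduced with standard statistical libraries; the sketch below uses SciPy rather than SPSS, with invented counts for illustration only.

```python
# Hedged sketch of the group comparisons described above, using SciPy.
# Counts below are illustrative, not the study data.
from scipy import stats

# Pearson's chi squared test on a 2x2 table, e.g. exposure to violence by
# sex: rows = male/female, columns = [exposed, not exposed].
table = [[140, 49],   # hypothetical male counts
         [ 99, 35]]   # hypothetical female counts
chi2, p, dof, expected = stats.chi2_contingency(table)
print(f"chi2={chi2:.2f}, P={p:.3f}")  # P>0.05 -> no significant difference
```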
 
Results
Among the 323 participants included in the study, 189 (58.5%) were male and 134 (41.5%) were female. The mean age of the male participants was 31.5 ± 6.5 years (range, 18-55 years) and that of the female participants was 32.0 ± 6.9 years (range, 20-52 years). There was no significant difference in the age distribution between the male and female participants (P=0.476).
 
When participants were asked if they had ever been exposed to verbal or physical violence in the workplace during the course of their career, 239 (74.0%) indicated that they had been subjected to at least one form of violence, and 57 (17.6%) reported being subjected to both verbal and physical violence. Among the participants who were subjected to violence, 162 (67.8%) reported being the victim of more than five violent incidents (Table 1).
 

Table 1. Frequency of exposure to violence for male and female employees
 
The frequency of exposure to violence and the frequency of exposure to more than five violent incidents were similar for both men and women (P=0.185 and 0.104, respectively). Nonetheless, 25.9% of men reported both verbal and physical violence compared with only 6.0% of women, suggesting that the incidence of verbal and physical violence against men was greater than that against women (P<0.001) [Table 1].
 
We investigated the frequency of exposure to violence and the reported incidence of violence among various occupation groups (Table 2). The prevalence of exposure to violence was highest among health care officials, EMTs, doctors, and security staff (P<0.001). In addition, only 102 (42.7%) of 239 participants reported these violent incidents. Notably, although the rate of incident reporting was 100% among security staff, none of the laboratory technicians reported the violent incidents (P<0.001).
 

Table 2. The distribution of occupation groups according to frequency of exposure to violence and rate of reporting
 
A total of 43 (31.4%) out of the 137 study participants who had been exposed to violence but had not reported the incident provided reasons (Table 3). The most common reason for not notifying the authorities was the perception that “no resolution will be reached”. Other important reasons included the heavy workload, not wanting to deal with the legal process, disregarding verbal attacks, understanding/sympathising with the emotions of patients and their relatives, fear of the threat from patients and their relatives, and not knowing how and where to report such incidents.
 

Table 3. Reasons for not reporting a violent incident (n=43)
 
A total of 248 participants responded to a question regarding who was to blame for violence in the ED in general (not their own experiences). Accordingly, 65.3% (n=162) stated that the patient’s relatives were responsible, 27.0% (n=67) stated that both the patients and their relatives were responsible, and 5.2% (n=13) placed sole responsibility on the patients. Six (2.4%) participants stated that they had been subjected to violence from other health care professionals.
 
When we asked individuals to estimate the age of the person(s) causing the violence that they had experienced, respondents who were exposed to multiple violent incidents answered this question by selecting multiple options and a total of 405 answers were obtained. As shown in Table 4, the majority (71.4%) of people responsible for violent incidents were young patients and patient relatives between the ages of 18 and 39 years.
 

Table 4. Estimated age of violent patients/family members (n=405)
 
When participants who were exposed to violence were asked who caused the violent incident, three (1.3%) participants stated that they themselves were responsible, five (2.1%) indicated that both sides were responsible, and the remaining 231 (96.7%) held the attacker responsible.
 
Participants were asked “What do you think is the reason for the violence?”. A total of 181 (56.0%) participants responded to this question. Some participants indicated more than one reason and a total of 188 answers were obtained. The top 10 most common responses are given in descending order of frequency in Table 5. The most commonly cited cause of violence was ignorance and lack of education of patients and their relatives (28.7%), followed by impatient attitudes and demands for priority treatment (23.4%), and the heavy workload and prolonged waiting times (10.6%).
 

Table 5. Answers to the questions: “What do you think is the reason for the violence?” and “How do you think violence against health care workers can be reduced?”
 
Participants were asked “How do you think violence against health care workers can be reduced?”. Some participants indicated more than one measure and a total of 509 answers were obtained. The most important steps suggested to reduce violence against ED employees were the enactment of deterrent legislation (42.6%), increased security measures in hospitals (28.5%), and improved public education (16.7%) [Table 5].
 
Participants were asked about their attitude after experiencing violence. Some respondents gave more than one answer and a total of 498 answers were obtained. Overall, 27.1% of participants no longer enjoyed working in their current profession, 25.7% wanted to work in a non–health care field, and 21.5% did not want to work in the ED (Table 6).
 

Table 6. The attitude of health care workers after exposure to violence (n=498)
 
A total of 96.3% (n=311) of participants answered “Yes” to the question “Do you think that the violence against health care workers has increased in recent years?” Moreover, 90.7% (n=293) answered “Yes” to the question “Do news reports regarding violence against health care workers affect you?”. When participants were then asked “How does the news affect you?”, 64.7% (n=209) reported that they were “sad”, 44.3% (n=143) “angry”, and 18.9% (n=61) “scared”.
 
When participants were asked “Are there sufficient security measures in your workplace?”, only 66 (20.4%) participants gave a positive response, while 257 (79.6%) responded negatively. Among the 41 participants working as security staff, 33 (80.5%) found the safety measures inadequate. Thus, both the security staff and the general employee population agreed that hospital security was inadequate.
 
Discussion
Workplace violence is most prevalent in the health care sector.4 The ED is the health care unit with the highest frequency of exposure to violence.3 7 8 9 According to several previous studies, the proportion of health care professionals who report prior exposure to violence in the workplace ranges from 45% to 67.6%.3 8 12 13 14 The rate of violence against ED employees (79%-99%), however, is higher than the average for the health care field.15 16 17
 
Emergency services are high-risk areas for patients and staff with regard to workplace violence18 19 20 21; 24-hour accessibility, a high-stress environment, and the apparent lack of trained security personnel are underlying factors.22 Workplace violence negatively affects the morale and health of health care workers, and the effectiveness of the services they provide.23 24 25 26
 
Our study was conducted among ED employees of two different hospitals. We investigated the rate of exposure to verbal or physical violence. Among the participants, 239 (74.0%) stated that they had been exposed to violence, and 57 (17.6%) reported having been exposed to both verbal and physical violence. A study of ED employees, including nurses, in the İzmir province of Turkey found that 98.5% of respondents had been subjected to verbal violence and 19.7% had been exposed to physical violence.16 In another study conducted in Turkey, 88.6% of ED employees were subjected to verbal violence and 49.4% reported having been the victim of physical violence.17
 
In the present study, the rate of exposure to violence by profession was 95.7% among health care officials/EMTs, 90.7% among doctors, and 80.5% among security personnel. According to Ayrancı et al,3 exposure to violence was most common among practitioners (67.6%) and nurses (58.4%). In another study, Alçelik et al27 reported that nurses were exposed to violence 3 times more often than other health care professionals. In the present study, the frequency of exposure to violence among nurses was 62.7%, which is lower than that in other professional groups.
 
In the present study, the estimated age distribution of patients and patient relatives responsible for violent incidents showed that the majority (71.4%) were between 18 and 39 years of age. Other studies have reported that individuals prone to violence are generally younger than 30 years.28
 
Health care workers are often subjected to verbal and physical attacks from patients and their relatives who are dissatisfied with the services provided.10 11 In the present study, the most common cause of violence was the lack of education and ignorance of the patients and their relatives. Heavy workload was identified as another cause of workplace violence. Factors such as patient stress and anxiety regarding their condition, high expectations of the patients and their relatives, lack of effective institutional and legal arrangements aimed at preventing violence, and the failure to effectively document the extent of workplace violence contribute to the high frequency of violence.12 There are several factors that increase the risk of violence in health care institutions, including 24-hour service, long waiting time for patients, poor access to health care services, heavy workload, limited staff, inadequate employee training, and lack of security personnel.29 30
 
Previous studies conducted in Turkey revealed that 60% of ED employees who were exposed to violence did not report the incident. Among the reasons for not reporting was a lack of confidence in health care and executive leadership as well as the justice system.12 In the present study, the incident reporting rate was also low (42.7%) and the most important reason (34.9%) for not reporting was the perception that “no resolution will be reached”. Indeed, a study found that there were no repercussions for the attacker in 77% of instances.12 This suggests the perception that “no resolution will be reached” is a valid one.
 
A heavy workload consumes the energy of employees and reduces their ability to empathise with patients and to tolerate confrontational situations. Verbal or physical conflicts may arise between stressed patients facing long waiting times and exhausted, stressed health care workers. Training in communication with patients helps health care professionals to avoid these problems.31 Effective communication alone, however, is not sufficient, and additional steps must be taken to reduce patient waiting times. Previous studies have indicated that waiting time is the most important reason for patient dissatisfaction in the ED.32 33 Yet the most important reason for long waiting times is the heavy workload, caused in part by inappropriate use of the ED: studies have shown that more than half of patients who present to the ED are not ‘emergency patients’.34 35 36 Further education regarding the definition of “emergency” and the practice of effective triage may reduce the heavy workload in the ED and the associated violent incidents.
 
One previous study reported that verbal and physical attacks by patients and their relatives are the most important factors contributing to stress among ED employees.37 Constant exposure to the high-stress conditions created by verbal and physical violence results in both physical and mental exhaustion. As a result, a situation commonly known as ‘burnout syndrome’ emerges.38 39 Burnout syndrome is characterised by a negative view of current events, frequent despair, and lost productivity and motivation.40 Reluctance among physicians to work in the ED is one consequence of burnout syndrome.41 In the present study, among the participants who were subjected to violence, 21.5% indicated that they wanted to work in a department other than the ED, while 25.7% stated a desire to work outside the health care field. In a study conducted in Canada, 18% of participants who had been exposed to violence stated that they did not want to work in the ED, and 38% wanted to work outside the health care field.9 Others indicated that they had quit their jobs because of workplace stress.9 In the present study, 10.4% of ED employees stated that they were afraid of patients and their relatives. In the same Canadian study, 73% of respondents stated that after experiencing violence they were afraid of patients.9
 
In our study, 96.3% of respondents thought that there had been an increase in violence against ED health care workers in recent years. Moreover, 79.6% of respondents stated that the safety measures in their institutions were insufficient. The participants suggested that the preparation of deterrent legislation, increased security measures, and efforts to better educate the general population regarding the appropriate use of ED resources would help to reduce violence against health care workers.
 
Limitations
The study was carried out in only two hospitals in Turkey, which may not be representative of all hospitals. In addition, participants could decide whether or not to answer all questions and some questionnaires were incomplete. The response rate was only 74% and this might give rise to self-selection bias; that is, those who did not respond may have had a higher (or lower) exposure to violence than those who did. Hence, the various percentages reported in this paper might be over- or under-estimated.
 
Conclusion
The results of the current study, as well as those of earlier studies, indicate that the prevalence of violence against ED employees is high. Factors such as patient and health care provider stress, prolonged waiting times due to overcrowding in the ED, the negative attitude of discourteous patients and their relatives, insufficient security measures, and the lack of sufficiently dissuasive legal regulations may contribute to increased violence in the ED. These factors in turn increase stress among ED employees, reduce job satisfaction, and lower the quality of services provided. Measures to decrease the workload in the ED and shorten patient waiting times, the adoption of legal policies that deter violent behaviour, and increased security measures in health care facilities should be considered. Steps should also be taken to educate the public in order to reduce violence against health care workers.
 
Declaration
All authors have disclosed no conflicts of interest.
 
References
1. Kocacik F. On violence [in Turkish]. Cumhuriyet Univ J Econ Adm Sci 2001;2:1-2.
2. World Health Organization. Violence and injury prevention. Available from: http://www.who.int/violence_injury_prevention/violence/activities/workplace/documents/en/index.html. Accessed Nov 2012.
3. Ayrancı Ü, Yenilmez Ç, Günay Y, Kaptanoğlu C. The frequency of being exposed to violence in the various health institutions and health profession groups. Anatol J Psychiatry 2002;3:147-54.
4. Wells J, Bowers L. How prevalent is violence towards nurses working in general hospitals in the UK? J Adv Nurs 2002;39:230-40. Crossref
5. Workplace violence in the health sector. Framework guidelines for addressing workplace violence in the health sector. Available from: http://www.ilo.org/wcmsp5/groups/public/---ed_dialogue/---sector/documents/publication/wcms_160912.pdf. Accessed Nov 2012.
6. Kingma M. Workplace violence in the health sector: a problem of epidemic proportion. Int Nurs Rev 2001;48:129-30. Crossref
7. Gülalp B, Karcıoğlu Ö, Köseoğlu Z, Sari A. Dangers faced by emergency staff: experience in urban centers in southern Turkey. Ulus Travma Acil Cerrahi Derg 2009;15:239-42.
8. Lau J, Magarey J, McCutcheon H. Violence in the emergency department: a literature review. Aust Emerg Nurs J 2004;7:27-37. Crossref
9. Fernandes CM, Bouthillette F, Raboud JM, et al. Violence in the emergency department: a survey of health care workers. CMAJ 1999;161:1245-8.
10. Yanci H, Boz B, Demirkiran Ö, Kiliççioğlu B, Yağmur F. Medical personal subjected to the violence in emergency department—enquiry study. Turk J Emerg Med 2003;3:16-20.
11. Sucu G, Cebeci F, Karazeybek E. Violence by patient and relatives against emergency service personnel. Turk J Emerg Med 2007;7:156-62.
12. Çamci O, Kutlu Y. Determination of workplace violence toward health workers in Kocaeli. J Psychiatr Nurs 2011;2:9-16.
13. Stirling G, Higgins JE, Cooke MW. Violence in A&E departments: a systematic review of the literature. Accid Emerg Nurs 2001;9:77-85. Crossref
14. Sönmez M, Karaoğlu L, Egri M, Genç MF, Günes G, Pehlivan E. Prevalence of workplace violence against health staff in Malatya. Bitlis Eren Univ J Sci Technol 2013;3:26-31.
15. Stene J, Larson E, Levy M, Dohlman M. Workplace violence in the emergency department: giving staff the tools and support to report. Perm J 2015;19:e113-7. Crossref
16. Senuzun Ergün F, Karadakovan A. Violence towards nursing staff in emergency departments in one Turkish city. Int Nurs Rev 2005;52:154-60. Crossref
17. Boz B, Acar K, Ergin A, et al. Violence toward health care workers in emergency departments in Denizli, Turkey. Adv Ther 2006;23:364-9. Crossref
18. Joa TS, Morken T. Violence towards personnel in out-of-hours primary care: a cross-sectional study. Scand J Prim Health Care 2012;30:55-60. Crossref
19. Magnavita N, Heponiemi T. Violence towards health care workers in a Public Health Care Facility in Italy: a repeated cross-sectional study. BMC Health Serv Res 2012;12:108. Crossref
20. Arimatsu M, Wada K, Yoshikawa T, et al. An epidemiological study of work-related violence experienced by physicians who graduated from a medical school in Japan. J Occup Health 2008;50:357-61. Crossref
21. Taylor JL, Rew L. A systematic review of the literature: workplace violence in the emergency department. J Clin Nurs 2011;20:1072-85. Crossref
22. Gacki-Smith J, Juarez AM, Boyett L, Homeyer C, Robinson L, MacLean SL. Violence against nurses working in US emergency departments. J Nurs Adm 2009;39:340-9. Crossref
23. Kowalenko T, Gates D, Gillespie GL, Succop P, Mentzel TK. Prospective study of violence against ED workers. Am J Emerg Med 2013;31:197-205. Crossref
24. Position statement: violence in the emergency care setting. Available from: https://www.ena.org/government/State/Documents/ENAWorkplaceViolencePS.pdf. Accessed Nov 2012.
25. Workplace violence. Washington, DC: United States Department of Labor; 2013. Available from: http://www.osha.gov/SLTC/workplaceviolence/index.html. Accessed Nov 2012.
26. Adib SM, Al-Shatti AK, Kamal S, El-Gerges N, Al-Raqem M. Violence against nurses in healthcare facilities in Kuwait. Int J Nurs Stud 2002;39:469-78. Crossref
27. Alçelik A, Deniz F, Yeşildal N, Mayda AS, Ayakta Şerifi B. Health survey and life habits of nurses who work at the medical faculty hospital at AIBU [in Turkish]. TAF Prev Med Bull 2005;4:55-65.
28. Young GP. The agitated patient in the emergency department. Emerg Med Clin North Am 1987;5:765-81.
29. Stathopoulou HG. Violence and aggression towards health care professionals. Health Sci J 2007;2:1-7.
30. Hoag-Apel CM. Violence in the emergency department. Nurs Manage 1998;29:60,63.
31. Yardan T, Eden AO, Baydın A, Genç S, Gönüllü H. Communication with relatives of the patients in emergency department. Eurasian J Emerg Med 2008;7:9-13.
32. Al B, Yıldırım C, Togun İ, et al. Factors that affect patient satisfaction in emergency department. Eurasian J Emerg Med 2009;8:39-44.
33. Yiğit Ö, Oktay C, Bacakoğlu G. Analysis of the patient satisfaction forms about Emergency Department services at Akdeniz University Hospital. Turk J Emerg Med 2010;10:181-6.
34. Kiliçaslan İ, Bozan H, Oktay C, Göksu E. Demographic properties of patients presenting to the emergency department in Turkey. Turk J Emerg Med 2005;5:5-13.
35. Ersel M, Karcıoğlu Ö, Yanturali S, Yörüktümen A, Sever M, Tunç MA. Emergency Department utilization characteristics and evaluation for patient visit appropriateness from the patients’ and physicians’ point of view. Turk J Emerg Med 2006;6:25-35.
36. Aydin T, Aydın ŞA, Köksal Ö, Özdemir F, Kulaç S, Bulut M. Evaluation of features of patients attending the Emergency Department of Uludağ University Medicine Faculty Hospital and emergency department practices. Eurasian J Emerg Med 2010;9:163-8. Crossref
37. Kalemoglu M, Keskin O. Evaluation of stress factors and burnout in the emergency department staff [in Turkish]. Ulus Travma Derg 2002;8:215-9.
38. Ferns T, Stacey C, Cork A. Violence and aggression in the emergency department: Factors impinging on nursing research. Accid Emerg Nurs 2006;14:49-55. Crossref
39. Keser Özcan N, Bilgin H. Violence towards healthcare workers in Turkey: A systematic review [in Turkish]. Turkiye Klinikleri J Med Sci 2011;31:1442-56. Crossref
40. Maslach C. Burned-out. Hum Behav 1976;5:16-22.
41. Dwyer BJ. Surviving the 10-year ache: emergency practice burnout. Emerg Med Rep 1991;23:S1-8.

Population-based survey of the prevalence of lower urinary tract symptoms in adolescents with and without psychotropic substance abuse

Hong Kong Med J 2016 Oct;22(5):454–63 | Epub 12 Aug 2016
DOI: 10.12809/hkmj154806
© Hong Kong Academy of Medicine. CC BY-NC-ND 4.0
 
ORIGINAL ARTICLE
Population-based survey of the prevalence of lower urinary tract symptoms in adolescents with and without psychotropic substance abuse
YH Tam, FHKAM (Surgery)1; CF Ng, FHKAM (Surgery)2; YS Wong, FHKAM (Surgery)1; Kristine KY Pang, FHKAM (Surgery)1; YL Hong, MSc1; WM Lee, MSc2; PT Lai, BN2
1 Division of Paediatric Surgery and Paediatric Urology, Department of Surgery, Prince of Wales Hospital, The Chinese University of Hong Kong, Shatin, Hong Kong
2 Division of Urology, Department of Surgery, Prince of Wales Hospital, The Chinese University of Hong Kong, Shatin, Hong Kong
 
Corresponding author: Dr YH Tam (pyhtam@surgery.cuhk.edu.hk)
 
 Full paper in PDF
 
Abstract
Objective: To investigate the prevalence of lower urinary tract symptoms in adolescents and the effects of psychotropic substance use.
 
Methods: This was a population-based cross-sectional survey using a validated questionnaire in students from 45 secondary schools in Hong Kong randomly selected over the period of January 2012 to January 2014. A total of 11 938 secondary school students (response rate, 74.6%) completed and returned a questionnaire that was eligible for analysis. Individual lower urinary tract symptoms and history of psychotropic substance abuse were documented.
 
Results: In this study, 11 617 non-substance abusers were regarded as control subjects and 321 (2.7%) were psychotropic substance users. Among the control subjects, 2106 (18.5%) had experienced at least one lower urinary tract symptom, with urinary frequency being the most prevalent symptom (10.2%). Females had more daytime urinary incontinence (P<0.001) and males had more voiding symptoms (P=0.01). Prevalence of lower urinary tract symptoms increased with age, from 13.9% in those aged <13 years to 25.8% in young adulthood at age ≥18 years (P<0.001). Among the substance users, ketamine was most commonly abused. Substance users had significantly more lower urinary tract symptoms than control subjects (P<0.001). In multivariate analysis, increasing age and psychotropic substance abuse increased the odds for lower urinary tract symptoms. Non-ketamine substance users and ketamine users were respectively 2.8-fold (95% confidence interval, 2.0-3.9) and 6.2-fold (4.1-9.1) more likely than control subjects to develop lower urinary tract symptoms. Females (odds ratio=9.9; 95% confidence interval, 5.4-18.2) were more likely to develop lower urinary tract symptoms than males (4.2; 2.5-7.1) when ketamine was abused.
 
Conclusions: Lower urinary tract symptoms are prevalent in the general adolescent population. It is important to obtain an accurate history regarding psychotropic substance use when treating teenagers with lower urinary tract symptoms.
 
 
New knowledge added by this study
  • Prevalence of lower urinary tract symptoms (LUTS) increases consistently from onset of adolescence towards adulthood. Psychotropic substance abuse, particularly ketamine, is associated with an increased risk of developing LUTS in adolescents. Girls are more susceptible than boys if ketamine is abused.
Implications for clinical practice or policy
  • It is important to obtain an accurate history regarding psychotropic substance use when treating teenagers with LUTS.
 
 
Introduction
Lower urinary tract symptoms (LUTSs) are prevalent worldwide. An estimated 45.2% of the 2008 worldwide population aged ≥20 years were affected by at least one LUTS.1 A large-scale population-based survey has reported that LUTS prevalence increases with advancing age, reaching 60% at the age of 60 years.2 Evaluation and treatment of LUTS for the general population have incurred significant costs to the health care system. In children, the association of LUTS with urinary tract infection, persistent vesicoureteric reflux, renal scarring, and constipation has drawn substantial attention over the years.3 4 Among the various LUTSs, urinary incontinence (UI) has been most extensively investigated in children, with a reported prevalence varying from 1.8% to 20%.5 Previous studies of the prevalence of individual LUTSs using the International Children’s Continence Society (ICCS) definitions6 have focused primarily on pre-adolescent children in primary schools.7 8 9 10 To date, no large-scale studies have investigated the prevalence of LUTSs in adolescents.
 
Psychotropic substance use among adolescents is a growing concern worldwide and creates psychosocial, security, and health care issues. In recent years, ketamine abuse has been found to cause severe LUTSs and Hong Kong is one of the earliest countries/regions to report the newly established clinical entity of ketamine-associated uropathy.11 12 13 Ketamine is the most popular psychotropic substance being abused by people aged <21 years in our society.14 The aim of the present study was to investigate the prevalence of LUTSs in our adolescents and the differences between those with and without psychotropic substance use.
 
Methods
Study design, sample size estimation, and participant recruitment
This was a cross-sectional questionnaire survey that recruited adolescents from secondary schools serving Hong Kong local residents during the period of January 2012 to January 2014. There were almost 500 secondary schools in Hong Kong serving approximately 470 000 adolescents in 2009/10. Based on the data of children and young adults available in the literature,2 9 we assumed the prevalence of LUTSs among our adolescents to be 20%. A study sample of 6050 participants would be required to estimate this prevalence with an error of ±1%. Government sources suggested that 2.3% of our secondary school students used psychotropic substances in 2011/12.15 We assumed that the prevalence of LUTSs among secondary students using psychotropic substances was 15% higher than in normal subjects. To detect this difference with a type I error of 0.05 and a power of 0.8, a sample size of 4500 participants would be required. Based on the above two assumptions and a predicted response rate of 50% to 60%, we determined that a potential target of not less than 10 000 participants would be required.
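
The exact formulas behind these two estimates are not stated in the paper. The sketch below (not the authors’ code) applies standard normal-approximation formulas to the stated assumptions, so its outputs only approximate the quoted figures of 6050 and 4500.

```python
# A minimal sketch, assuming standard normal-approximation formulas; the
# paper does not state which method was used, so results only approximate
# the quoted sample sizes.
from scipy.stats import norm
from statsmodels.stats.power import NormalIndPower
from statsmodels.stats.proportion import proportion_effectsize

# 1. Precision-based estimate: assumed prevalence 20%, margin of error 1%.
p, e = 0.20, 0.01
z = norm.ppf(0.975)                          # two-sided 95% confidence
n_precision = z ** 2 * p * (1 - p) / e ** 2
print(round(n_precision))                    # ~6147, close to the paper's 6050

# 2. Power-based estimate: detect a 15% higher prevalence among substance
# users (read here as 35% vs 20%) at alpha = 0.05 and power = 0.8, with
# users assumed to be 2.3% of the sample (a highly unbalanced comparison).
users_fraction = 0.023
ratio = (1 - users_fraction) / users_fraction    # controls per substance user
h = proportion_effectsize(0.35, 0.20)            # Cohen's h effect size
n_users = NormalIndPower().solve_power(
    effect_size=h, alpha=0.05, power=0.8, ratio=ratio)
print(round(n_users), round(n_users / users_fraction))
# The implied total differs from the paper's 4500; the gap may reflect a
# continuity correction or a different formula in the original calculation.
```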
 
In the selection of schools we included all government, aided, and Direct Subsidy Scheme schools. Private international schools and special schools were excluded. Co-educational, boys’, and girls’ schools were included. The list of secondary schools was provided by the Education Bureau and schools were grouped into 18 geographical districts. As the prevalence of psychotropic substance use might vary significantly between schools, we arbitrarily set a target of recruiting participants from not less than 8% to 10% of the secondary schools in order to reduce sampling bias.
 
The random selection process started with drawing a district followed by a school within the selected district. Based on a rough estimation of population distribution, we intended to select schools from Hong Kong Island (HKI), Kowloon (Kln), and New Territories (NT) in an approximate ratio of 1:2:3. We invited the selected schools to participate in the study. If the invitation was declined, the next school in the drawing sequence was contacted. The above procedure was repeated until the target sampling size was reached. Finally, 45 of the 121 schools that were selected and approached agreed to participate in the study (HKI, n=7; Kln, n=13; NT, n=25), giving a potential target of 16 000 participants.
 
The grades/classes of students participating in the survey from each school were not randomly selected but were determined after discussion and mutual agreement with the school management. In order to avoid the possible bias of intentional selection or exclusion of a particular class of students, school management was invited to express their preferences about which grade/grades of students would participate provided that all students of the selected grade/grades participated. Although we tried to avoid over-representation of a particular grade of students by making some suggestions to the school management, their preferences were always respected and accepted. Of the 45 participating schools, we recruited two or three grades of Form 1-3 students, two or three grades of Form 4-6 students, and all the students in 18, 10, and 8 schools, respectively. In the remaining nine schools, we recruited only one grade of their students.
 
Study measures
The measuring tool was an anonymous self-reported questionnaire accompanied by an information sheet. In both the information sheet and the questionnaire, we stated clearly that participation in the study was voluntary and consent to participate was presumed on receipt of a completed questionnaire that was returned in the envelope provided. Individuals who did not consent to participate were told to disregard the questionnaire. The questionnaire consisted of three parts: demographic data on gender and age, LUTS assessment, and history of psychotropic substance use (Appendix).
 

Appendix. The questionnaire
 
Age was divided into four categories: <13, 13-15, 16-17, and ≥18 years. Participants were asked to respond to an 8-item LUTS assessment that included storage symptoms (urinary frequency, urgency, nocturia, and daytime UI), voiding symptoms (intermittent stream, straining, and dysuria), and a post-micturition symptom (incomplete emptying). The recall period was the last 4 weeks. The LUTS questions were adapted from the Hong Kong Chinese version of the International Prostate Symptom Score questionnaire that has been validated to assess LUTSs in our local adult population.16 We believed that the level of comprehension of most of our adolescent participants in secondary education was close to that of an average adult. The response options for most of the LUTSs were on a 6-point Likert scale: “never”, “seldom (<20% of the time)”, “sometimes (20-50% of the time)”, “often (50% of the time)”, “always (>50% of the time)”, and “almost every time”. Any LUTS with a frequency of ‘≥20% of the time’ was defined as present in the study subject. Daytime UI and nocturia were assessed on a different 5-point Likert scale according to their frequency, and were defined as present if they occurred ≥1 to 3 times per month and ≥2 times per night, respectively.2 17
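
To make these dichotomisation rules concrete, the following sketch (hypothetical response coding and function names, not the study’s actual data processing) maps raw questionnaire responses onto the ‘present’/‘absent’ definitions just described.

```python
# Illustrative only: applying the symptom-definition thresholds described
# above. The response coding and function names are hypothetical.
LIKERT_6 = ["never", "seldom", "sometimes", "often", "always",
            "almost every time"]  # "sometimes" and above = >=20% of the time

def luts_present(response: str) -> bool:
    """Most LUTSs were defined as present at a frequency of >=20% of the time."""
    return LIKERT_6.index(response) >= LIKERT_6.index("sometimes")

def daytime_ui_present(times_per_month: int) -> bool:
    """Daytime UI: present in the '>=1 to 3 times per month' category or above."""
    return times_per_month >= 1

def nocturia_present(times_per_night: int) -> bool:
    """Nocturia: present if it occurred >=2 times per night."""
    return times_per_night >= 2

print(luts_present("seldom"))     # False: <20% of the time
print(luts_present("sometimes"))  # True: 20-50% of the time
print(nocturia_present(1))        # False: once per night is below threshold
```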
 
Responses to questions on psychotropic substance use were dichotomised as either “yes” or “no”. Those with positive responses were directed to questions on the type of substance being abused, which included ketamine, ecstasy, methamphetamine, cough mixture, marijuana, and others. Participants were allowed to indicate more than one substance. According to the response to questions on psychotropic substance use, the participants were classified as control subjects or psychotropic substance users. The psychotropic substance users were further subdivided into either ketamine users or non-ketamine users.
 
Statistical analysis
The responses to each LUTS were dichotomised as “present” versus “absent” and the prevalence rate for each LUTS was expressed as a percentage with 95% confidence interval (CI). Missing data were excluded from analysis. Chi squared and trend tests were performed in univariate analysis to compare prevalence differences between groups divided by gender, age, and psychotropic substance use. Using the outcome of “at least one LUTS”, dichotomised into “yes” or “no”, a binary logistic regression model using the enter method was set up to investigate risk factors including gender, age, and psychotropic substance use. The odds ratio (OR) of “at least one LUTS” was estimated with 95% CI for each potential risk factor. A P value of <0.05 was considered significant.
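
For readers unfamiliar with this modelling step, the sketch below fits a comparable binary logistic regression on invented data (variable names are hypothetical). ORs and their 95% CIs are obtained by exponentiating the fitted coefficients, which is how figures such as those reported in the Results are derived.

```python
# Illustrative only: a binary logistic regression of "at least one LUTS" on
# age-group and substance use, fitted to invented data to show how ORs and
# 95% CIs are obtained from the model coefficients.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(0)
n = 5000
df = pd.DataFrame({
    "age_group": rng.choice(["<13", "13-15", "16-17", ">=18"], size=n),
    "substance_user": rng.choice([0, 1], size=n, p=[0.973, 0.027]),
})
# Simulate the outcome with higher odds for older and substance-using subjects.
logit_p = (-1.8 + 0.8 * df["substance_user"]
           + 0.4 * (df["age_group"] == ">=18").astype(int))
df["any_luts"] = (rng.random(n) < 1 / (1 + np.exp(-logit_p))).astype(int)

# "Enter method" here simply means all predictors enter the model at once,
# with the youngest age-group as the reference category.
model = smf.logit(
    "any_luts ~ C(age_group, Treatment('<13')) + substance_user",
    data=df).fit(disp=False)
summary = pd.concat(
    [np.exp(model.params).rename("OR"),
     np.exp(model.conf_int()).rename(columns={0: "2.5%", 1: "97.5%"})],
    axis=1)
print(summary)  # exponentiated coefficients = odds ratios with 95% CIs
```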
 
The study protocol was approved by the Joint CUHK-NTEC Clinical Research Ethics Committee.
 
Results
A total of 16 000 questionnaires were sent to schools and 11 938 eligible questionnaires were returned (estimated response rate, 74.6%). The response rate was estimated because the number of questionnaires delivered to each school was not necessarily equal to the number of students of that school who received one. The conduct of the survey at schools was not supervised, and we were uncertain whether students absent from school received our questionnaire. The number of questionnaires requested by each school was always rounded off to the nearest 10 and was not necessarily equal to the actual number of students in the selected classes. It therefore seems logical to assume that the actual number of students who received our questionnaires was less than 16 000 and that the actual response rate might be higher. There were similar numbers of males (n=6040) and females (n=5819) among the participants who responded to the question on gender. Among the 11 938 participants, 11 617 did not report use of any psychotropic substance and were defined as control subjects; the 321 (2.7%) participants who reported having used one or more types of psychotropic substance were defined as substance users.
 
Of the 11 617 control subjects, 2106 (18.5%; including only the valid subjects) had experienced at least one LUTS at a frequency of ‘≥20% of the time’ in the last 4 weeks (Table 1). The most prevalent LUTSs were urinary frequency (10.2%), incomplete emptying (5.4%), and nocturia ≥2 times per night (4.4%). Daytime UI ≥1 to 3 times per month was reported by 3.7% of control subjects. Females had more daytime UI than males (5.2% vs 2.2%; P<0.001), while males had significantly more voiding symptoms and incomplete emptying. There was a significant increase in the prevalence of all LUTSs except daytime UI across the age-groups from <13 years to the young adulthood age-group of ≥18 years (Table 2).
 

Table 1. Prevalence of LUTS in control subjects and comparison by gender
 

Table 2. Comparison of LUTS in control subjects by age
 
Compared with control subjects, the psychotropic substance users experienced significantly more LUTSs in all areas (Table 3). Of the 321 substance abusers, 305 responded to the question about the types of psychotropic substance abused. Ketamine was the most commonly abused substance (n=139; 45.6%), followed by cough mixture (n=96; 31.5%), ecstasy (n=77; 25.2%), methamphetamine (n=76; 24.9%), and marijuana (n=70; 23.0%). Of the ketamine users, 60.7% had at least one LUTS. Ketamine users experienced significantly more LUTSs than non-ketamine substance users in all areas except daytime UI, for which a higher, albeit non-significant, prevalence was still observed. Female ketamine users appeared to be more affected by LUTSs than males (Table 4).
 

Table 3. Comparison of control subjects with psychotropic substance users
 

Table 4. Comparison of ketamine users with non-ketamine substance users, and male with female ketamine users
 
In multivariate analysis, increasing age and psychotropic substance use were found to increase the odds for experiencing at least one LUTS. With reference to age of <13 years, the ORs of experiencing at least one LUTS at age 13-15, 16-17, and ≥18 years were 1.3 (95% CI, 1.1-1.5), 1.7 (95% CI, 1.4-2.0), and 2.1 (95% CI, 1.7-2.7), respectively. With reference to the control subjects, the ORs of experiencing at least one LUTS were 2.8 (95% CI, 2.0-3.9) for those who used substances other than ketamine, and 6.2 (95% CI, 4.1-9.1) for those who used ketamine. When assessing the two genders separately in multivariate analysis, female ketamine users were 9.9-fold (95% CI, 5.4-18.2) and male ketamine users were 4.2-fold (95% CI, 2.5-7.1) more likely than their non-abuser counterparts to develop LUTSs.
 
Discussion
Large-scale population-based surveys of LUTS prevalence have been conducted in adults.2 17 Recently a few paediatric studies using the ICCS definitions have reported LUTS prevalence in children varying from 9.3% to 46.4%.7 8 9 The wide variation in prevalence can be attributed to the differences in the study population, questions used to assess LUTS, and the criteria to define the presence of symptoms. Vaz et al8 reported a prevalence of 21.8% in 739 Brazilian children aged 6 to 12 years while Yüksel et al7 found 9.3% of their 4016 Turkish children aged 6 to 15 years had LUTSs. In both studies, the investigators used validated scoring systems for a combination of LUTSs being assessed and pre-determined cut-off points in the total scores to define the presence or absence of LUTS.7 8 In contrast, Chung et al9 investigated 16 516 Korean children aged 5 to 13 years by measuring the presence of individual LUTS and reported the highest prevalence of 46.4% experiencing at least one LUTS. The high prevalence rate in the Korean study can be partly explained by their methodology wherein the responses to the LUTS questions were dichotomised into “yes” or “no” and a positive symptom was defined without considering its frequency.9
 
To the best of our knowledge, the present study is the first large-scale prevalence study focused on adolescents. We used a similar methodology to other major adult studies to measure each LUTS individually and define its presence by a frequency threshold of ‘≥20% of the time’.2 17 18 19 We agree with others that using a scoring system to define LUTS in a prevalence study may not reflect the true impact of individual LUTS as it is possible that a highly prevalent symptom may happen alone and the summed score may not reach the threshold.17
 
In our adolescents without any substance abuse, 18.5% experienced at least one LUTS. Our finding suggests that LUTS prevalence in adolescents appears to be lower than that in young adults. Previous studies including two conducted in Chinese populations have reported that 17% to 42% of men and women aged 18 to 39 years experience at least one LUTS.2 18 19 Notably, LUTS prevalence increased with age during adolescence from 13.9% in those <13 years to 25.8% in those aged ≥18 years in this study. In children, the prevalence of LUTS peaks at age 5 to 7 years and then declines with increasing age up to 13 to 14 years.7 8 9 10 20 The decline in prevalence has been attributed to the maturation of urinary bladder function along with the growth and development of children. Our study is the first to provide evidence that LUTS prevalence rises from the trough at the onset of adolescence and continues to increase throughout adolescence into adulthood. Our reported prevalence of 25.8% in those participants aged ≥18 years is in agreement with the trend in young adults reported elsewhere.2 18 19
 
Little is known in the existing literature regarding the trend of LUTS prevalence from adolescence to adulthood. In a Finnish study of 594 subjects aged 4 to 26 years, the authors reported that individuals aged 18 to 26 years had more urgency than the other two age-groups of 8 to 12 years and 13 to 17 years.10 Although adolescence spans less than a decade, it is unique with rapid physical, psychological, and developmental changes. Reasons for the increase in LUTS prevalence from adolescence to young adulthood are largely unknown but likely to be multifactorial. Changes in lifestyle, altered micturition behaviour, habitual postponement of micturition, unhealthy bowel habits, attitudes to the use of school toilets, anxiety associated with academic expectations, or worsening of relationships with family may all contribute to newly developed LUTS during adolescence. Further studies are warranted to investigate this phenomenon.
 
Our findings that storage symptoms were more prevalent than voiding symptoms are in agreement with the reported results in young adults.2 18 19 Urinary frequency (10.2%) and nocturia ≥2 times per night (4.4%) were the two most prevalent storage symptoms among the control subjects. We agree with others that nocturia once per night is very common in the general population and using the threshold of nocturia ≥2 times per night as LUTS is more appropriate.2 17 18 19 Only 2.9% of our control subjects had urgency suggestive of overactive bladder (OAB) according to ICCS definitions,6 in contrast to 12% of Korean children aged 13 years.20 Children with OAB may have urinary frequency in addition to urgency. The much lower prevalence of urgency than urinary frequency in our study suggests that many of our study subjects had urinary frequency unrelated to OAB. Glassberg et al21 found over 70% of their paediatric patients with dysfunctional voiding (DV) and primary bladder neck dysfunction (PBND) experienced urinary frequency; DV and PBND are also associated with high residual urine volume. Our finding that the feeling of incomplete emptying (5.4%) was the second most prevalent LUTS suggests that in some participants urinary frequency was secondary to incomplete bladder emptying associated with DV or PBND.
 
In our study, male non-substance users experienced more voiding symptoms while females had more daytime UI. Literature has consistently found female gender to be a risk factor for daytime UI in children.5 22 23 Our finding suggests that the gender association with daytime UI extends from childhood to adolescence. There are inconsistencies in the paediatric literature with respect to gender differences in voiding symptoms. Kyrklund et al10 found more voiding symptoms in boys than girls only in the age-group of 4 to 7 years, while such difference was not noted by others.8 24
 
Psychotropic substance use increased the risk of LUTS in our adolescents. Notably, 60% of our adolescents who abused ketamine had experienced at least one LUTS and had high prevalence rates of 28% to 47% in all areas of LUTS. Our finding that 2.7% of our participants abused psychotropic substances is consistent with the latest figure of 2.3% estimated by our government in its survey conducted in 2011/12.15 Ketamine-associated uropathy has emerged as a new clinical entity in our society since 2007.13 This chemically induced cystitis as a result of the urinary metabolites of ketamine is associated with severe LUTS with the possible consequence of irreversible bladder damage.12 25 Little information is available in the medical literature about the prevalence of LUTS among ketamine users. An online survey conducted in the UK reported a prevalence of 26.6% of at least one LUTS in the last 12 months among 1285 participants who had illicitly used ketamine.26 The LUTS prevalence is likely influenced by variation in dose and frequency of ketamine use of the study population. We have recently reported that both the dose and frequency of ketamine use and female gender are associated with the severity of the LUTS at presentation among the young patients who sought urological treatment for ketamine-associated uropathy.25 In the present study, female ketamine users were at a higher risk of developing LUTS than males. This observation is in agreement with our previous findings and our postulation that females appear to be more susceptible to the chemically induced injury following illicit use of ketamine for unknown reasons.25
 
Non-ketamine substance users also experienced more LUTSs than the control subjects in this study although the prevalence was not as high as that of ketamine users. Most recently Korean investigators have reported a 77% prevalence rate of LUTS among a group of young methamphetamine (also known as ‘ice’) users, and suggested that a pathological dopaminergic mechanism plays a predominant role in methamphetamine-associated LUTS.27 There has been a rising trend of using methamphetamine in recent years and it is now the second most popular psychotropic substance abused by youths aged <21 years in our community.14 It would not be surprising if we encountered more and more young patients presenting with LUTS associated with methamphetamine use in the foreseeable future.
 
Limitations of this study
There was potential bias in the sampling process: almost two thirds of the schools that we selected and approached refused to participate, the grades of the participants were not randomly selected, and the non-response rate was approximately 20%. Young participants of lower grades may not have been able to comprehend the LUTS questions that were designed for adults. Nevertheless, our finding of 2.7% psychotropic substance use appears to be consistent with the 2.3% reported by the 2011/12 government survey of over 80 000 secondary school students.15 We did not study other potential risk factors that may be associated with LUTSs in adolescents, such as bowel function, urinary tract infection, stressful events, lifestyle, and toilet environment. The 0.5% to 2% missing data in each of the LUTS questions, though small, may still affect the estimated prevalence of each LUTS among our control subjects. Although daytime UI was not a prevalent symptom, the fact that less than half of the participants were asked this question because of a printing error may lead to underestimation of the overall prevalence of experiencing at least one LUTS among the different subgroups. The 4-week recall period only allowed a crude assessment of LUTSs. A more-prevalent symptom may not necessarily cause more inconvenience than a less-prevalent symptom; how much each individual LUTS bothered the participant, and whether substance abusers and non-substance abusers were bothered differently, were not investigated in this study. Therefore individuals, particularly the non-substance abusers, who reported experiencing LUTSs did not necessarily suffer from any established lower urinary tract condition that warranted medical attention. The dose and frequency of illicit psychotropic substance use would certainly have an impact on the prevalence of LUTSs but this was not investigated in this survey.
 
Despite these limitations, our study provides important data on the prevalence of LUTSs in adolescents and the effect of psychotropic substance use. LUTSs are prevalent in the general adolescent population. It is important for clinicians to obtain a history of psychotropic substance use when treating teenagers with LUTSs, as there is a substantial possibility that the LUTSs are caused by organic pathology associated with psychotropic substance use rather than by functional voiding disorders.
 
Appendix
Additional material related to this article can be found on the HKMJ website. Please go to <http://www.hkmj.org>, and search for the article.
 
Declaration
The study was supported by the Beat Drugs Fund (BDF101012) of the Hong Kong SAR Government. The funding source had no role in the study design, data collection, data analysis, results interpretation, writing of the manuscript, or the decision to submit the manuscript for publication. All authors have no conflicts of interest relevant to this article to disclose.
 
References
1. Irwin DE, Kopp ZS, Agatep B, Milsom I, Abrams P. Worldwide prevalence estimates of lower urinary tract symptoms, overactive bladder, urinary incontinence and bladder outlet obstruction. BJU Int 2011;108:1132-8. Crossref
2. Irwin DE, Milsom I, Hunskaar S, et al. Population-based survey of urinary incontinence, overactive bladder, and other lower urinary tract symptoms in five countries: results of the EPIC study. Eur Urol 2006;50:1306-14; discussion 1314-5. Crossref
3. Koff AS, Wagner TT, Jayanthi VR. The relationship among dysfunctional elimination syndromes, primary vesicoureteral reflux and urinary tract infections in children. J Urol 1998;160:1019-22. Crossref
4. Leonardo CR, Filgueiras MF, Vasconcelos MM, et al. Risk factors for renal scarring in children and adolescents with lower urinary tract dysfunction. Pediatr Nephrol 2007;22:1891-6. Crossref
5. Sureshkumar P, Jones M, Cumming R, Craig J. A population based study of 2,856 school-age children with urinary incontinence. J Urol 2009;181:808-15; discussion 815-6. Crossref
6. Nevéus T, von Gontard A, Hoebeke P, et al. The standardization of terminology of lower urinary tract function in children and adolescents: report from the Standardization Committee of the International Children’s Continence Society. J Urol 2006;176:314-24. Crossref
7. Yüksel S, Yurdakul AC, Zencir M, Cördük N. Evaluation of lower urinary tract dysfunction in Turkish primary schoolchildren: an epidemiological study. J Pediatr Urol 2014;10:1181-6. Crossref
8. Vaz GT, Vasconcelos MM, Oliveira EA, et al. Prevalence of lower urinary tract symptoms in school-age children. Pediatr Nephrol 2012;27:597-603. Crossref
9. Chung JM, Lee SD, Kang DI, et al. An epidemiologic study of voiding and bowel habits in Korean children: a nationwide multicenter study. Urology 2010;76:215-9. Crossref
10. Kyrklund K, Taskinen S, Rintala RJ, Pakarinen MP. Lower urinary tract symptoms from childhood to adulthood: a population based study of 594 Finnish individuals 4 to 26 years old. J Urol 2012;188:588-93. Crossref
11. Wood D, Cottrell A, Baker SC, et al. Recreational ketamine: from pleasure to pain. BJU Int 2011;107:1881-4. Crossref
12. Chu PS, Ma WK, Wong SC, et al. The destruction of the lower urinary tract by ketamine abuse: a new syndrome? BJU Int 2008;102:1616-22. Crossref
13. Chu PS, Kwok SC, Lam KM, et al. ‘Street ketamine’–associated bladder dysfunction: a report of ten cases. Hong Kong Med J 2007;13:311-3.
14. Central Registry of Drug Abuse Sixty-third Report 2004-2013. Narcotics Division, Security Bureau, The Government of the Hong Kong Special Administrative Region. Available from: http://www.nd.gov.hk/en/crda_63rd_report.htm. Accessed Dec 2015.
15. The 2011/12 survey of drug use among students. Narcotics Division, Security Bureau, The Government of the Hong Kong Special Administrative Region. Available from: http://www.nd.gov.hk/en/survey_of_drug_use_11-12.htm. Accessed Dec 2015.
16. Yee CH, Li JK, Lam HC, Chan ES, Hou SS, Ng CF. The prevalence of lower urinary tract symptoms in a Chinese population, and the correlation with uroflowmetry and disease perception. Int Urol Nephrol 2014;46:703-10. Crossref
17. Coyne KS, Sexton CC, Thompson CL, et al. The prevalence of lower urinary tract symptoms (LUTS) in the USA, the UK and Sweden: results from the Epidemiology of LUTS (EpiLUTS) study. BJU Int 2009;104:352-60. Crossref
18. Zhang L, Zhu L, Xu T, et al. A population-based survey of the prevalence, potential risk factors, and symptom-specific bother of lower urinary tract symptoms in adult Chinese women. Eur Urol 2015;68:97-112. Crossref
19. Wang Y, Hu H, Xu K, Wang X, Na Y, Kang X. Prevalence, risk factors and the bother of lower urinary tract symptoms in China: a population-based survey. Int Urogynecol J 2015;26:911-9. Crossref
20. Chung JM, Lee SD, Kang DI, et al. Prevalence and associated factors of overactive bladder in Korean children 5-13 years old: a nationwide multicenter study. Urology 2009;73:63-7; discussion 68-9. Crossref
21. Glassberg KI, Combs AJ, Horowitz M. Nonneurogenic voiding disorders in children and adolescents: clinical and videourodynamic findings in 4 specific conditions. J Urol 2010;184:2123-7. Crossref
22. Kajiwara M, Inoue K, Usui A, Kurihara M, Usui T. The micturition habits and prevalence of daytime urinary incontinence in Japanese primary school children. J Urol 2004;171:403-7. Crossref
23. Hellström A, Hanson E, Hansson S, Hjälmås K, Jodal U. Micturition habits and incontinence in 7-year-old Swedish school entrants. Eur J Pediatr 1990;149:434-7. Crossref
24. Akil IO, Ozmen D, Cetinkaya AC. Prevalence of urinary incontinence and lower urinary tract symptoms in school-age children. Urol J 2014;11:1602-8.
25. Tam YH, Ng CF, Pang KK, et al. One-stop clinic for ketamine-associated uropathy: report on service delivery model, patients’ characteristics and non-invasive investigations at baseline by a cross-sectional study in a prospective cohort of 318 teenagers and young adults. BJU Int 2014;114:754-60. Crossref
26. Winstock AR, Mitcheson L, Gillatt DA, Cottrell AM. The prevalence and natural history of urinary symptoms among recreational ketamine users. BJU Int 2012;110:1762-6. Crossref
27. Koo KC, Lee DH, Kim JH, et al. Prevalence and management of lower urinary tract symptoms in methamphetamine abusers: an under-recognized clinical identity. J Urol 2014;191:722-6. Crossref

Clinical transition for adolescents with developmental disabilities in Hong Kong: a pilot study

Hong Kong Med J 2016 Oct;22(5):445–53 | Epub 19 Aug 2016
DOI: 10.12809/hkmj154747
© Hong Kong Academy of Medicine. CC BY-NC-ND 4.0
 
ORIGINAL ARTICLE
Clinical transition for adolescents with developmental disabilities in Hong Kong: a pilot study
Tamis W Pin, PhD; Wayne LS Chan, PhD; CL Chan, BSc (Hons) Physiotherapy; KH Foo, BSc (Hons) Physiotherapy; Kevin HW Fung, BSc (Hons) Physiotherapy; LK Li, BSc (Hons) Physiotherapy; Tina CL Tsang, BSc (Hons) Physiotherapy
Department of Rehabilitation Sciences, Hong Kong Polytechnic University, Hunghom, Hong Kong
 
Corresponding author: Dr Tamis W Pin (tamis.pin@polyu.edu.hk)
 
This paper was presented as a poster at the Hong Kong Physiotherapy Association Conference 2015, Hong Kong on 3-4 November 2015.
 
 Full paper in PDF
 
Abstract
Introduction: Children with developmental disabilities usually move from the paediatric to adult health service after the age of 18 years. This clinical transition is fragmented in Hong Kong. There are no local data for adolescents with developmental disabilities and their families about the issues they face during the clinical transition. This pilot study aimed to explore and collect information from adolescents with developmental disabilities and their caregivers about their transition from paediatric to adult health care services in Hong Kong.
 
Methods: This exploratory survey was carried out in two special schools in Hong Kong. Convenient samples of adolescents with developmental disabilities and their parents were taken. The questionnaire was administered by interviewers in Cantonese. Descriptive statistics were used to analyse the answers to closed-ended questions. Responses to open-ended questions were summarised.
 
Results: In this study, 22 parents (mean age ± standard deviation: 49.9 ± 10.0 years) and 13 adolescents (19.6 ± 1.0 years) completed the face-to-face questionnaire. The main diagnoses of the adolescents were cerebral palsy (59%) and cognitive impairment (55%). Of the study parents, 77% were reluctant to transition. For the 10 families who did move to adult care, 60% of the parents were not satisfied with the services. The main reasons were reluctant to change and dissatisfaction with the adult medical service. The participants emphasised their need for a structured clinical transition service to support them during this challenging time.
 
Conclusions: This study is the first in Hong Kong to present preliminary data on adolescents with developmental disabilities and their families during transition from paediatric to adult medical care. Further studies are required to understand the needs of this population group during clinical transition.
 
 
New knowledge added by this study
  • These results are the first published findings on clinical transition for adolescents with developmental disabilities in Hong Kong.
  • Dissatisfaction with the adult health services and reluctance to change were the main barriers to clinical transition.
  • The concerns and needs of the families were similar regardless of whether adolescents had physical or cognitive disabilities.
Implications for clinical practice or policy
  • A structured clinical transition service is required for adolescents with developmental disabilities and their parents.
  • Further in-depth studies are required to examine the needs for and concerns about clinical transition for all those involved. This should include adolescents with developmental disabilities, their parents or caregivers, and service providers in both paediatric and adult health services.
 
 
Introduction
Advances in medical management now enable children with developmental disabilities (DD) who may previously have died to live well into adulthood.1 Such disabilities are defined as any condition that is present before the age of 22 years and due to physical or cognitive impairment or a combination of both that significantly affects self-care, receptive and expressive language, mobility, learning, independent living, or the economic independence of the individual.2 The transition from adolescence to adulthood is a critical period for all young people.3 In a clinical context, adult transition is “the purposeful, planned movement of adolescents and young adults with chronic physical and medical conditions from child-centered to adult-oriented health-care systems”.4 In developed countries such as the United States, a consensus statement with guidelines was endorsed in 2001 to ensure that adolescents with DD, who depend on coordinated health care services, make a smooth transition to the adult health care system and receive the services that they need.5
 
Researchers have identified needs and factors necessary for the successful transition of adolescents with DD.6 7 From the adolescent’s perspective, barriers to success include their dependence on others, reduced treatment time and access to specialists in the adult health service, lack of information about transition, and lack of involvement in the decision-making process. Parents of adolescents with DD were reluctant about, or confused by, their changing responsibilities during the transition period. The majority of challenges came from the service systems and included unclear eligibility criteria and procedures, limited time and lack of carer training, fragmented adult health service provision, lack of communication between service providers, and inaccessible resources, including information.6 7
 
Based on the 2015 census of ‘Persons with disabilities and chronic diseases in Hong Kong’ from the Hong Kong Census and Statistics Department, there were 22 100 (3.8%) people with disability (excluding cognitive impairment) aged between 15 and 29 years, ie those transitioning from the paediatric to the adult health service.8 According to the Hospital Authority, Hong Kong, all public hospitals and specialist and general out-patient clinics are organised into seven hospital clusters based on geographical location.9 The Duchess of Kent Children’s Hospital (DKCH) is a tertiary centre that provides specialised services for children with orthopaedic problems, spinal deformities, cerebral palsy and neurodevelopmental disorders, and neurological and degenerative diseases. Unlike many overseas health care systems, Hong Kong has no children’s hospital that provides an acute health service. All paediatric patients go to the same hospital as adult patients but are triaged into the paediatric section for management by both in-patient and out-patient services. The specialised out-patient clinic list under each hospital cluster varies. Children with DD might receive services from a general paediatric clinic, a cerebral palsy clinic, a Down syndrome clinic, a behavioural clinic, or a paediatric neurology out-patient clinic. Once a child reaches the age of 18 years, they are referred to the adult section of the same hospital for continued care. They are then followed up in neurology or movement disorder clinics, alongside patients with adult-onset neurological conditions or movement disorders such as stroke, Parkinson’s disease, and multiple sclerosis. There is no separate specialised clinic for complex child-onset DD.10
 
Although adult transition for adolescents with DD has been recognised as a crucial area in health care overseas, it is an under-developed service in Hong Kong.11 A local study found that there is a service gap in adult transition for young people with chronic medical conditions, such as asthma, diabetes, and epilepsy. Training and education are urgently required for both service providers and young people with chronic health conditions and their families.11 It is unclear if the challenges and barriers identified in overseas literature6 7 are applicable to Hong Kong, where the paediatric and adult health services, especially the medical services, are located in the same building. At present, no study has been conducted with adolescents with DD and their families in Hong Kong about the issues they face during clinical transition. As a start, in this pilot study, we aimed to explore the acceptance of clinical transition and identify the main barriers to successful clinical transition for adolescents with DD and their caregivers in Hong Kong.
 
Methods
Participants
A survey study was conducted on a convenience sample of adolescents and/or their caregivers, who were recruited from two special high schools (Hong Kong Red Cross John F Kennedy Centre [JFK] and Haven of Hope Sunnyside School [HOH]) in Hong Kong. Students from JFK have primarily physical and multiple disabilities, including cerebral palsy or muscular dystrophy, and those from HOH have severe cognitive impairment. Both schools provide rehabilitation services on site including physiotherapy, occupational therapy, speech therapy, nursing support, and family support via the school social workers. The medical out-patient services for the students fall under different hospital clusters, depending on where the students’ families live. The parents are responsible for taking their adolescent children for medical review. As there was no previous study on which to base a sample size calculation, and as this was a pilot study, we aimed to recruit 10 adolescents and their parents/caregivers from each school.
 
The inclusion criteria of the adolescents were: (1) aged 16 to 19 years and (2) a diagnosis of DD. All participating adolescents and/or their parents or legal guardians gave written informed consent before the survey. For adolescents with severe cognitive impairment, consent was sought from their parents as proxy and only their parents participated in the survey. This pilot study was approved by the Human Subjects Ethics Sub-committee of the Hong Kong Polytechnic University.
 
Survey
A specific questionnaire was developed for this pilot study to collect information relating to: (1) demographic characteristics of the participants; (2) whether or not the study participants were aware of the transition and the information source(s); (3) whether the study participants were willing to transition to the adult health service and the underlying reasons; (4) for those who had transitioned, whether they were satisfied with the adult health service and the underlying reasons; and (5) the opinion of the study participants on a clinical transition service. As this was a pioneering pilot study, ‘health service’ included both medical and rehabilitation services, and the study aimed to explore the general issues faced by this group of adolescents and their families during clinical transition. Lastly, as we predicted that the competency level of the adolescents and their caregivers in managing their disabilities might influence their perception of clinical transition, information about their self-rated competency level was also collected. Questions in the latter part were based on the Adolescent Transition Care Assessment Tool, which was designed to assist health care professionals to provide a better transition for adolescents with chronic illness.12 The whole survey was administered by interviewers, in Cantonese, to the study adolescents and/or their caregivers separately. All interviews were recorded for data analysis.
 
The survey comprised closed- and open-ended questions. The closed-ended questions were designed to require a dichotomous answer, ie ‘yes’ or ‘no’, or an answer from a set of choices. For example, when asked about the sources of information about clinical transition, the study participants could choose their answers from a list of professionals such as paediatricians, social workers, physiotherapists, and school teachers. The open-ended questions focused on the reasons for an earlier response. For example, when asked if they wished to move to the adult health service, the individual would first answer ‘yes’ or ‘no’, then give their reasons. For the questions about self-perceived competency level, study participants read a number of statements (8 for the adolescents and 14 for the parents) and indicated how much they agreed with each statement on a Likert scale from ‘strongly disagree’ to ‘strongly agree’.
 
Data analyses
Descriptive statistics—including mean, median, standard deviation, or quartiles—were used to analyse the responses to the closed-ended questions. Responses to the open-ended questions were transcribed and summarised by five team members (CLC, KHF, KHWF, LKL, and TCLT). The content was analysed and themes were identified independently by two other team members (TWP and WLSC). These themes were discussed and a consensus was reached by all team members.
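For illustration only, the minimal sketch below (Python; not part of the study, with hypothetical column names and values) shows how such descriptive statistics for closed-ended responses could be computed.

import pandas as pd

# Hypothetical survey extract; the study data are not reproduced here
responses = pd.DataFrame({
    "parent_age": [49, 52, 38, 61, 50, 47],       # years (hypothetical)
    "willing_to_transition": [0, 0, 1, 0, 1, 0],  # 1 = yes, 0 = no (hypothetical)
})

# Mean, standard deviation, and median for continuous answers
print(responses["parent_age"].agg(["mean", "std", "median"]))
# Quartiles for continuous answers
print(responses["parent_age"].quantile([0.25, 0.75]))

# Counts and percentages for dichotomous answers
counts = responses["willing_to_transition"].value_counts()
print((counts / len(responses) * 100).round(1))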
 
Results
Thirteen potential families were approached at the JFK via the school physiotherapist, in anticipation of possible refusals. All the students and their parents agreed to participate, so all were included. Ten families were approached at the HOH via the school social worker, but one parent declined and no further families could be recruited. Since the students from HOH, who were cognitively impaired, could not be interviewed, only their parents/caregivers were interviewed. As a result, 22 parents (13 from the JFK and 9 from the HOH) and 13 adolescents (all from the JFK) completed the face-to-face survey. The demographic data of the participants are listed in Table 1. Cerebral palsy and cognitive impairment were the principal types of DD. All adolescents received rehabilitation from the Hospital Authority and/or from their special school. All the adolescents accessed between four and seven paediatric medical specialists (eg paediatrician, neurologist, orthopaedic surgeon) and rehabilitation services (eg physiotherapy, occupational therapy, speech therapy), indicating their complex needs. Over 90% of the JFK students were followed up at the DKCH, which is within walking distance of the school (personal communication, Senior Physiotherapist at JFK).
 

Table 1. Demographic information of study participants
 
Table 2 summarises the participant responses about clinical transition. The majority of the parents (77%) and adolescents (85%) knew that clinical transition to adult care would occur at 18 years of age. They were mainly informed by their paediatrician (50% of parents and 69% of adolescents). Most parents (77%) were reluctant to make this move. Ten parents stated that their adolescent child was already receiving care from the adult sector, and over half of them (60%) were dissatisfied with the service. During the survey, four of the 13 adolescents clearly stated that they had transitioned to the adult health service, and all were happy with the transition. Among the 17 families who knew that clinical transition would occur at 18 years of age, the 10 adolescents who had transitioned were all over 18 years old; the adolescents in the remaining seven (41%) families were also over 18 years old but were still receiving services from the paediatric sector at the time of the study.
 

Table 2. Summary of responses to questions about clinical transition
 
When asked why they were unwilling to transition or why they were dissatisfied after the transition, the parent responses could be summed up as two main areas of concern: reluctance to change and dissatisfaction with the adult health services. Most parents (16/22, 73%) did not want to change their existing care circumstances. When asked why, some parents cited dissatisfaction with the adult health service or with the health system (for the latter, 13/22, 59% of parents). For example, parents found it difficult to attend follow-up appointments using public transport. Although there was a free shuttle bus service for families who needed it, parents were frustrated by its limited availability.
 
Some parents also wanted more flexible visiting hours in the adult hospital so that they could look after their adolescent children, especially those who were cognitively impaired. The parents worried about the quality of care for their children, who were entirely dependent on others for their daily activities. They were also unhappy about the waiting time for medical appointments and stated that their children with DD had a short attention span and were unable to control their behaviour. Long waiting times in a crowded waiting area, which are commonly observed in the adult setting, could easily trigger their behavioural problems.
 
There was also dissatisfaction with the adult health service providers (13/22, 59% of parents). Parents often found that the adult medical staff demonstrated limited understanding and knowledge of their child’s clinical presentation and abilities, especially for those with severe cognitive impairment. The adult health service providers did not know how to communicate with the cognitively impaired adolescents and treated them like any other adult patient.
 
There is no formal clinical transition service in Hong Kong, but when asked, the majority of parents (21/22, 95%) and adolescents (11/13, 85%) stated that they would welcome such a service. About two thirds of the study parents and adolescents (23/35, 66%) would like such a service to support them through the transition. About one third of the parents (7/22, 32%) believed that the service could act as a bridge linking the paediatric and adult health services, providing information about available services in the adult sector.
 
The Figure summarises the responses of adolescents for self-perceived competency in managing their disability, and Table 3 summarises the study parents’ responses. Most adolescents demonstrated understanding of instructions (11/13, 85%), confidence in communicating with the service providers about their condition (10/13, 77%), and understanding the importance of treatments for their condition (12/13, 92%) [Fig]. About half were confident in seeking help from different specialties according to their condition (6/13, 46%) and making medical decisions (7/13, 54%) [Fig]. Over half of the adolescents, however, lacked the confidence to attend routine medical visits on their own (8/13, 62%) and worried about the unfamiliar adult medical service (6/13, 46%) [Fig]. Most parents stated that they were familiar with their children’s medical conditions and treatments (20/22, 91%) and able to seek help from different medical specialties based on their child’s condition (14/22, 64%) [Table 3]. Only a minority of parents (1/22, 5%), however, believed that their children were capable of attending medical appointments on their own. Less than half of the parents believed that their children would be able to explain their medical condition (9/22, 41%) or make independent clinical decisions in the future (7/22, 32%). None of the parents of an adolescent with cognitive impairment believed that the child would ever be able to manage their own health.
 

Figure. Responses of study adolescents about their self-perceived competency in managing their disability
 

Table 3. Summary of responses of study parents about their self-perceived competency in managing the disability
 
Discussion
The present pilot study aimed to determine how adolescents with DD and their parents in Hong Kong accept the clinical transition, and to identify the main barriers to successful transition. This was the first step towards understanding the issues of this population group during clinical transition and enabling planning for the future. As far as we know, this study is the first to be conducted in Hong Kong for this population group. Overall, 22 parents and 13 adolescents were recruited from two special schools (one primarily for physically disabled children and the other for severely physically and/or cognitively impaired individuals), with the aim of understanding the acceptance level of, and the barriers faced by, these two vastly different groups with DD. The results were very similar between these two subgroups, indicating that the study parents had similar issues during clinical transition regardless of the type of DD of their child. Hence, the results from the two subgroups were discussed as one group.
 
Most of the study participants were aware of the clinical transition necessary at the age of 18 years. Only 10 (45%) of the 22 families had shifted to the adult health service, despite the fact that their adolescent child was close to or over 18 years old (Tables 1 and 2). The reasons for this delay were not thoroughly explored, but it has been suggested that medical practitioners in the paediatric service felt the adolescents were not ready for the transition and so continued to see them well into adulthood, while the parents and adolescents were reluctant to make the move.11 The latter appeared to be true because, when asked, most study participants did not want to change and move to the adult health service. This contradicts the results of a previous local study of adolescents with chronic medical conditions, in which over 80% of the study participants (adolescents and parents) were willing to move to the adult health service.11 The difference is likely due to the complexity of the health conditions of the present cohort. Adolescents with DD usually have varying degrees of physical and/or cognitive impairment and so depend more on others for managing their health condition, making them and their carers more anxious about any change.6 7 For those with chronic medical conditions, the physical and cognitive abilities of the adolescent were unlikely to be affected, and hence the adolescents could manage their condition more independently and adapt more readily after the transition.13 This speculation was supported by the findings about self-perceived competency level. Most study adolescents, who had mainly physical disabilities, were not confident about attending a medical appointment alone because of their limited physical abilities (Fig). The reluctance to change may also be due to fear of the unknown and of not being well prepared.11 In Hong Kong, clinical transition is non-structured and unplanned.11 Parents are often informed just before the transition, leading to poor preparation and confusion. Early and continuous transition planning from early adolescence can enable parents and adolescents with DD to be prepared and to participate actively in the process.7 14 Although the clinical transition service is not well known in Hong Kong, the study participants had a positive attitude towards a service that would help them navigate the process by bridging the paediatric and adult health services. In addition, the study parents wished to have more information about available adult health services, eg rehabilitation and wheelchair maintenance services. More information about the unknown has been shown to reduce the reluctance of parents and adolescents to change and to further improve their confidence about moving to adult care.5 6 15 16
 
Another barrier was dissatisfaction with the health care system and service providers in the adult setting (Table 2), in line with the existing literature.7 Some parents found it difficult to arrange transport to the adult hospital for follow-up, whereas most current appointments were at the special school. More accessible public transport might help, especially in Hong Kong, where private vehicles are not a common option. Flexible visiting hours in the adult hospital that would enable parents to care for their dependent adolescent child may also reduce their dissatisfaction. Longer waiting times for medical appointments in the adult setting were frequently mentioned by the study parents. In the adult sector, patients with all kinds of neurological conditions, both child-onset and adult-onset, are reviewed in the same clinic, so the number of patients attending is far higher than in the paediatric setting. In addition, patients with adult-onset neurological conditions and their families may not understand the characteristics of DD. Stressed behaviour of an adolescent child with DD may be perceived by other families as impatience. In the paediatric clinic, where all clinic attendees were children or adolescents with DD and their parents, the waiting time was shorter, and families as well as clinic staff had a full understanding of DD and were more tolerant. It is likely that this lack of support in the adult setting further discouraged the study parents from making the transition willingly. Changes to the existing health care system, such as a separate clinic for child-onset DD conditions, may be a small step towards assisting this group to make a smooth transition. Education about clinical transition for staff in both the paediatric and adult settings would allow them to prepare families in advance. Education about paediatric conditions and about communication with adolescents with DD can also give adult staff the confidence to develop a strong rapport with the families.16
 
Interestingly, the four adolescents who had transitioned stated that they were happy with the adult health service. Two (50%) indicated that because the ‘new’ doctors did not know them well, the doctors paid more attention to them; one adolescent did not give any reason for his satisfaction. It is likely that, in the paediatric sector, these adolescents had been followed up from early childhood by the same medical staff, who had virtually watched them grow up, so a change of scene and personnel in the adult sector was welcome. In addition, they might have welcomed starting to participate actively in the consultation as the ‘patient’, unlike in the paediatric sector, where the consultation was directed at their parents rather than at them.14
 
Parents of adolescents with physical and/or cognitive impairment shared a similar perception of their adolescent child: that he or she would never be able to attend a medical appointment alone, presumably because of the disability (responses to questions 9 and 10 in Table 3). None of the parents of a cognitively impaired adolescent believed their child to be capable of explaining their medical condition to others or making an independent medical decision. In contrast, parents of a physically disabled child thought that, while it might not apply at present, their child would be able to make their own decisions in the future (responses to questions 11 to 14 in Table 3). Nonetheless, there was a discrepancy in this perception between the study adolescents and parents (Fig and Table 3). Most adolescents believed they could explain their condition to others (question 2 in the Fig) and over half believed that they could make an independent medical decision at the time of the study (question 7 in the Fig). Further studies are needed to determine whether this discrepancy is due to confusion on the part of the parents because of changing responsibilities during the transition.13 While western literature emphasises the importance of active participation by adolescents during clinical transition,14 it would be interesting to see if this can be endorsed in a parent-dominant Chinese society such as Hong Kong, where cultural differences may influence attitudes towards clinical transition.17
 
We were unable to analyse other challenges identified in overseas literature, such as unclear eligibility criteria and procedures and limited time for clinical transition, because comparable data were not available for Hong Kong. In other developed countries, adolescents are shifted with the assistance of a clinical transition service based in the children’s hospital (if any) or by applying clinical guidelines for best service.18 In Hong Kong, adolescents are referred from the paediatric section to the adult section within the same hospital. Each hospital cluster may have different procedures for this ‘transition’ and there is no defined department within the hospital structure to assist adolescents and their families through this process. Nor do the families receive any advice about how to negotiate this process. A future in-depth study is recommended to understand the existing situation of clinical transition in different hospital clusters and determine how to establish a more formal approach among all the hospital clusters in Hong Kong to support this group of adolescents and their families during this confusing time.
 
Limitations of the present study
There may have been selection bias in the study sample, as the families were approached by convenience sampling through the school staff. Due to the small sample size, no statistical analysis was conducted to compare the subgroups of adolescents with physical and cognitive impairments. Although the sample size was small, the purpose of the present pilot study was not to generalise the findings to all adolescents with DD but to begin to understand the acceptance of clinical transition and the main barriers to success for adolescents with DD and their families in Hong Kong. The present results are also in line with the literature in this area.4 7 11 14 17 Future studies with a larger sample size and more in-depth qualitative data are required to verify the present results. The potential for subjective bias in the results, especially for the open-ended questions, was another limitation, but we attempted to minimise this through consensus agreement among the team.
 
Conclusions
In the present exploratory study, close to half of the study families had delayed clinical transition to the adult health service. Most study parents were reluctant for their adolescent children to shift to the adult health service because of unwillingness to change and dissatisfaction with the adult medical service. A structured and well-planned clinical transition service was urged by the study participants to bridge the paediatric and adult health services and to provide support to the family. Further studies are required to analyse the needs and concerns of adolescents with DD and their families, as well as the service providers in the adult medical setting, to facilitate the future development of a clinical transition service in Hong Kong.
 
Acknowledgements
The authors would like to thank all the participating families from the Hong Kong Red Cross John F Kennedy Centre and Haven of Hope Sunnyside School.
 
Declaration
All authors have disclosed no conflicts of interest.
 
References
1. Westbom L, Bergstrand L, Wagner P, Nordmark E. Survival at 19 years of age in a total population of children and young people with cerebral palsy. Dev Med Child Neurol 2011;53:808-14. Crossref
2. Public Law 98-527, Developmental Disabilities Act of 1984.
3. Staff J, Mortimer JT. Diverse transitions from school to work. Work Occup 2003;30:361-9. Crossref
4. Blum RW, Garell D, Hodgman CH, et al. Transition from child-centered to adult health-care systems for adolescents with chronic conditions. A position paper of the Society for Adolescent Medicine. J Adolesc Health 1993;14:570-6. Crossref
5. American Academy of Pediatrics, American Academy of Family Physicians, American College of Physicians-American Society of Internal Medicine. A consensus statement on health care transitions for young adults with special health care needs. Pediatrics 2002;110(6 Pt 2):1304-6.
6. Bindels-de Heus KG, van Staa A, van Vliet I, Ewals FV, Hilberink SR. Transferring young people with profound intellectual and multiple disabilities from pediatric to adult medical care: parents’ experiences and recommendations. Intellect Dev Disabil 2013;51:176-89. Crossref
7. Stewart D, Stavness C, King G, Antle B, Law M. A critical appraisal of literature reviews about the transition to adulthood for youth with disabilities. Phys Occup Ther Pediatr 2006;26:5-24. Crossref
8. Persons with disabilities and chronic diseases in Hong Kong. Hong Kong: Hong Kong Census and Statistics Department; 2016. Available from: http://www.statistics.gov.hk/pub/B71501FB2015XXXXB0100.pdf. Accessed Jul 2016.
9. Clusters, hospitals & institutions. Hospital Authority. 2016. Available from: http://www.ha.org.hk/visitor/ha_visitor_index.asp?Content_ID=10036&Lang=ENG&Dimension=100&Parent_ID=10004. Accessed Jul 2016.
10. Hospital Authority Statistical Report 2012-2013. Hong Kong: Hospital Authority; 2013.
11. Wong LH, Chan FW, Wong FY, et al. Transition care for adolescents and families with chronic illnesses. J Adolesc Health 2010;47:540-6. Crossref
12. Hong Kong Society for Adolescent Health. Adolescent Transition Care Assessment Tool, Public Education Series No. 12 (2013). Available from: http://hksah.blogspot.hk/2013/11/adolescent-transition-care-assessment.html. Accessed 6 Feb 2016.
13. Stewart DA, Law MC, Rosenbaum P, Willms DG. A qualitative study of the transition to adulthood for youth with physical disabilities. Phys Occup Ther Pediatr 2002;21:3-21. Crossref
14. Viner RM. Transition of care from paediatric to adult services: one part of improved health services for adolescents. Arch Dis Child 2008;93:160-3. Crossref
15. Blum RW. Introduction. Improving transition for adolescents with special health care needs from pediatric to adult-centered health care. Pediatrics 2002;110(6 Pt 2):1301-3.
16. Stewart D. Transition to adult services for young people with disabilities: current evidence to guide future research. Dev Med Child Neurol 2009;51 Suppl 4:169-73. Crossref
17. Barnhart RC. Aging adult children with developmental disabilities and their families: challenges for occupational therapists and physical therapists. Phys Occup Ther Pediatr 2001;21:69-81. Crossref
18. Department of Health. National Service Framework for Children, Young People and Maternity Services. Transition: getting it right for young people. Improving the transition of young people with long term conditions from children’s to adult health services. 2006. Available from: http://dera.ioe.ac.uk/8742/1/DH_4132145%3FIdcService%3DGET_FILE%26dID%3D23915%26Rendition%3DWeb. Accessed Jul 2016.

Can surgical need in patients with Naja atra (Taiwan or Chinese cobra) envenomation be predicted in the emergency department?

Hong Kong Med J 2016 Oct;22(5):435–44 | Epub 12 Aug 2016
DOI: 10.12809/hkmj154739
© Hong Kong Academy of Medicine. CC BY-NC-ND 4.0
 
ORIGINAL ARTICLE
Can surgical need in patients with Naja atra (Taiwan or Chinese cobra) envenomation be predicted in the emergency department?
HY Su, MD1; MJ Wang, PhD2; YH Li, PhD3; CN Tang, MD4; MJ Tsai, MD, PhD5
1 Department of Emergency Medicine, E-Da Hospital and I-Shou University, Kaohsiung, Taiwan; Department of Emergency Medicine, Buddhist Tzu Chi General Hospital, Hualien, Taiwan
2 Department of Medical Research, Buddhist Tzu Chi General Hospital, Hualien, Taiwan
3 Department of Public Health, Tzu Chi University, Hualien, Taiwan
4 Department of Family Medicine, Buddhist Tzu Chi General Hospital, Hualien, Taiwan
5 Department of Emergency Medicine, Ditmanson Medical Foundation Chiayi Christian Hospital, Chiayi, Taiwan; Department of Sports Management, Chia Nan University of Pharmacy and Science, Tainan, Taiwan
 
Corresponding author: Dr MJ Tsai (tshi33@gmail.com)
 
An earlier version of this paper was presented at the 7th Asian Conference on Emergency Medicine held in Tokyo, Japan on 23-25 October 2013.
 
 
Abstract
Objectives: To investigate the clinical predictors and the aetiologies for surgery in patients with Naja atra (Taiwan or Chinese cobra) envenomation.
 
Methods: This case series was conducted in the only tertiary care centre in eastern Taiwan. Patients who presented to the emergency department with Naja atra bite between January 2008 and September 2014 were included. Clinical information was collected and compared between surgical and non-surgical patients.
 
Results: A total of 28 patients with Naja atra envenomation presented to the emergency department during the study period. Of these, 60.7% (n=17) required surgery. Necrotising fasciitis (76.5%) was the main finding in surgery. Comparisons between surgical and non-surgical patients showed skin ecchymosis (odds ratio=34.36; 95% confidence interval, 2.20-536.08; P=0.012) and a high total dose of antivenin (≥6 vials; odds ratio=14.59; 95% confidence interval, 1.10-192.72; P=0.042) to be the most significant predictors of surgery. The rate of bacterial isolation from the surgical wound was 88.2%. Morganella morganii (76.5%), Enterococcus faecalis (58.8%), and Bacteroides fragilis (29.4%) were the most common pathogens involved. Bacterial susceptibility testing indicated that combined broad-spectrum antibiotics were needed to cover mixed aerobic and anaerobic bacterial infection.
 
Conclusions: Patients with Naja atra envenomation who present with skin ecchymosis or the need for a high dose of antivenin may require early surgical assessment. Combined broad-spectrum antibiotics are mandatory.
 
 
New knowledge added by this study
  • Among the six major venomous snakebites in Taiwan, Naja atra envenomation most commonly leads to surgical intervention.
  • Ecchymosis on the bite wound may be a good indicator for surgical need in N atra envenomation.
  • Adequate antibiotic treatment may play an important role in the early management of N atra envenomation.
Implications for clinical practice or policy
  • Surgical debridement and broad-spectrum antibiotic treatment are suggested in patients with N atra envenomation who develop ecchymosis. Surgery is more likely when high-dose antivenin has been used.
 
 
Introduction
Snakebites are an important public health and wilderness medical issue in Taiwan. Because of the warm and humid climate in Taiwan, there are more than 40 terrestrial snake species, of which 15 are venomous. Six of the venomous species are of high clinical importance, including Protobothrops mucrosquamatus (Taiwan Habu), Trimeresurus stejnegeri (Taiwan bamboo viper), Naja atra (Taiwan or Chinese cobra), Bungarus multicinctus (banded krait), Deinagkistrodon acutus (hundred pacer), and Daboia russelii siamensis (Russell’s viper).1 2
 
Naja atra belongs to the Elapidae family, and in addition to Taiwan, it inhabits southern China, Hong Kong, northern Laos, and northern Vietnam.3 Cobra venom contains a mixture of components, including cardiotoxin, cobrotoxin, haemotoxin, and phospholipase A2.4 Patients envenomed by a cobra experience varying degrees of neurotoxicity and cytotoxicity depending upon the proportions of the venom components. Owing to evolutionary and geographical variation, different cobra species cause distinct clinical effects. For example, Naja philippinensis (northern Philippine cobra) causes a purely neurotoxic effect without local cytotoxicity.5 In contrast, N atra envenomation is associated with more cytotoxic effects.3 6 7 Although an equine-derived bivalent F(ab)2 antivenin has been produced by the Centers for Disease Control, ROC (Taiwan) to neutralise the venom of N atra, the surgical intervention rate remains high.1 8 The main objective of this study was to investigate the clinical presentations and predictors for surgery in patients with N atra envenomation. Because of the high wound infection rate, the bacteria isolated from surgical wounds and their antimicrobial susceptibility were also analysed.
 
Methods
Study design and patient population
The Buddhist Tzu Chi General Hospital is the only tertiary care centre in eastern Taiwan. There are 1000 beds and the emergency department (ED) has more than 55 000 patient visits per year. This hospital is also the toxicant, drug information, and antidote control centre for eastern Taiwan. A retrospective study was conducted to analyse data from patients admitted to the ED with N atra envenomation between 1 January 2008 and 30 September 2014.
 
Data collection, processing, and categorisation
A medical assistant was responsible for collecting the medical records of patients admitted with snakebite during the study period by using the computerised chart system and International Classification of Diseases, 9th Revision, Clinical Modification codes 989.5, E905.0, E905.9, E906.2, and E906.5. Two physicians (the first and fifth authors) independently reviewed the charts and categorised these patients as having venomous or non-venomous snakebites based on the patient’s presentation with or without toxic effects. For venomous snakebites, classification of the snake species was based on the identification of the snake brought in by the patient or identification by the patient from a picture. All the included patients had a compatible presentation and consistent antivenin use as recorded in the patient chart. Patients who were initially recognised as having venomous snakebites but did not receive antivenin treatment were excluded from the study because of the high probability of a dry bite or misidentification of the snake species. Patients with a toxic presentation who could not identify the snake species or who received more than one type of antivenin were recorded as having an unknown poisonous snakebite.
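As a purely illustrative aside, the sketch below (Python; the table and column names are hypothetical, not the hospital’s actual chart system) shows how snakebite records could be retrieved with the ICD-9-CM codes listed above.

import pandas as pd

# The ICD-9-CM codes used for record retrieval, as listed in the text
SNAKEBITE_CODES = {"989.5", "E905.0", "E905.9", "E906.2", "E906.5"}

# Hypothetical de-identified chart extract
charts = pd.DataFrame({
    "patient_id": [1, 2, 3],
    "icd9_code": ["989.5", "401.9", "E905.0"],  # hypothetical values
})

# Keep only visits coded as snakebite
snakebite_charts = charts[charts["icd9_code"].isin(SNAKEBITE_CODES)]
print(snakebite_charts)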
 
Here we only report patients who were bitten by N atra. To identify the early clinical predictors of surgery, we categorised the patients into surgical and non-surgical groups. All surgical interventions were performed after surgical consultation in the ED or after admission when patients presented with progressive signs suggesting tissue necrosis, necrotising fasciitis, or suspected compartment syndrome. The final diagnoses of necrotising fasciitis and compartment syndrome were made according to surgical pathological findings and intracompartmental pressure measurement, respectively. The surgical procedures included debridement, fasciotomy, skin graft, and digit or limb amputation. The potential clinical predictors of surgery in N atra envenomation included the patient’s age, gender, season of snakebite, co-morbidities, details of envenomation, site of snakebite, initial vital signs on arriving at the ED, clinical presentation, laboratory data, treatment, timing of initial antivenin therapy, and total dose of antivenin.
 
For the laboratory analyses, the initial data obtained in the ED were collected, including haematology, biochemistry, and coagulation profiles. In regard to clinical presentation, the local signs and symptoms, local complications, and systemic manifestations and complications were classified. Local signs and symptoms included swelling, ecchymosis, necrosis, numbness, and bulla formation. Local complications included necrotising fasciitis and suspected compartment syndrome. Systemic manifestations and complications included neurological symptoms such as ptosis, blurred vision, drooling, and paralysis of facial, limb, or respiratory muscles; leukocytosis, defined as a white blood cell count of >11.0 x 10^9/L; thrombocytopenia, defined as a platelet count of <150 x 10^3/mm^3;2 prothrombin time (PT) prolongation, defined as PT of >11.6 seconds; activated partial thromboplastin time (aPTT) prolongation, defined as aPTT of >34.9 seconds (prolonged PT and aPTT were defined according to our clinical laboratory reference range); fibrinogen consumption, defined as a fibrinogen level of <1.8 g/L; elevated D-dimer level, defined as a D-dimer level of >500 µg/L; acute renal impairment, defined as a creatinine level of >123.8 µmol/L9; and rhabdomyolysis, defined as a creatine kinase level of >1000 U/L.10 Two physicians reviewed the charts of the enrolled patients and rechecked the accuracy of the data collection. If the patient’s initial vital signs were not measured or laboratory tests were not performed in the ED, this was recorded as a missing value in the database. Any discrepancy regarding the collected data was resolved through discussion with the third physician on the research team. The study protocol was approved by the institutional review board of the Buddhist Tzu Chi General Hospital (IRB102-38). All patient records and information were anonymised and de-identified prior to analysis.
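To make the cut-offs above concrete, the following minimal sketch (Python; a hypothetical helper, not part of the study protocol) applies them to one set of initial laboratory results.

def classify_labs(results):
    """Return the systemic laboratory abnormalities defined in the text.

    Expected keys and units (hypothetical input format): wbc (x10^9/L),
    platelets (x10^3/mm^3), pt (s), aptt (s), fibrinogen (g/L),
    d_dimer (ug/L), creatinine (umol/L), ck (U/L).
    """
    cutoffs = [
        ("leukocytosis",           results["wbc"] > 11.0),
        ("thrombocytopenia",       results["platelets"] < 150),
        ("PT prolongation",        results["pt"] > 11.6),
        ("aPTT prolongation",      results["aptt"] > 34.9),
        ("fibrinogen consumption", results["fibrinogen"] < 1.8),
        ("elevated D-dimer",       results["d_dimer"] > 500),
        ("acute renal impairment", results["creatinine"] > 123.8),
        ("rhabdomyolysis",         results["ck"] > 1000),
    ]
    return [name for name, abnormal in cutoffs if abnormal]

# Hypothetical patient values, for illustration only
print(classify_labs({"wbc": 12.4, "platelets": 180, "pt": 12.0, "aptt": 30.0,
                     "fibrinogen": 2.0, "d_dimer": 650, "creatinine": 90,
                     "ck": 400}))
# -> ['leukocytosis', 'PT prolongation', 'elevated D-dimer']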
 
Statistical analyses
To identify significant early clinical presentations and laboratory data associated with surgery in patients with N atra envenomation, univariate analyses were performed using the Student’s t test or the Mann-Whitney U test for continuous variables and the Chi squared test for categorical variables. A P value of <0.05 was considered statistically significant, and all statistical tests were two-tailed. For multivariate analysis, the categorical variables with a P value of <0.05 in the initial univariate analysis were selected and entered into a forward stepwise (Wald) logistic regression to calculate the odds ratios (ORs). The Statistical Package for the Social Sciences (Windows version 12.0; SPSS Inc, Chicago [IL], US) was used to perform the statistical analyses.
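For readers who wish to reproduce this style of univariate analysis outside SPSS, a minimal sketch in Python with SciPy follows; the continuous values are hypothetical, while the 2x2 table uses the ecchymosis counts reported in the Results (14/17 surgical vs 1/11 non-surgical).

from scipy import stats

# Hypothetical antivenin doses (vials) per patient, for illustration only
surgical = [9, 12, 7, 10, 8, 14, 6]
non_surgical = [4, 3, 5, 2, 4, 3]

t_stat, p_t = stats.ttest_ind(surgical, non_surgical)     # Student's t test
u_stat, p_u = stats.mannwhitneyu(surgical, non_surgical)  # Mann-Whitney U test

# Chi squared test on a 2x2 table: ecchymosis vs surgery
table = [[14, 1],   # ecchymosis present: surgical, non-surgical
         [3, 10]]   # ecchymosis absent:  surgical, non-surgical
chi2, p_chi, dof, expected = stats.chi2_contingency(table)
print(p_t, p_u, p_chi)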
 
Results
Epidemiology and surgical intervention rate for snake envenomation
Between 1 January 2008 and 30 September 2014, a total of 245 patients with venomous snakebites were recorded. Among these, 64 (26.1%) patients had P mucrosquamatus envenomation, 56 (22.9%) had T stejnegeri envenomation, 28 (11.4%) had N atra envenomation, five (2.0%) had B multicinctus envenomation, six (2.4%) had D acutus envenomation, seven (2.9%) had D r siamensis envenomation, and 79 (32.2%) had unknown poisonous snake envenomation.
 
The snakebites associated with the highest surgical intervention rates were N atra (60.7%), followed by D acutus (33.3%), and P mucrosquamatus (12.5%).
 
Characteristics and clinical status of patients with Naja atra envenomation
Of the 28 patients with a N atra bite, 20 (71.4%) were male. The mean (± standard deviation) age of patients was 52.3 ± 3.2 years. Of the patients, 22 (78.6%) were bitten in the summer or fall; 17 (60.7%) were bitten on an upper limb; and 17 (60.7%) with N atra envenomation received surgical treatment. These patients had a significantly longer duration of hospitalisation than non-surgical patients (27.5 ± 10.2 days vs 2.7 ± 3.1 days; P<0.001). The main operative diagnosis was necrotising fasciitis (n=13, 76.5%) with confirmation by histopathology. The clinical characteristics of the 17 surgical patients are shown in Table 1. The mean duration from the time of initial presentation to the day of surgery was 5.5 ± 4.3 days. All 13 patients with necrotising fasciitis underwent emergency fasciotomy and debridement, and two required limb or digit amputation. The other four surgical patients without necrotising fasciitis only received local debridement with or without skin graft due to local tissue necrosis. Therefore, a smaller surgical wound and a shorter duration of hospitalisation were observed for these patients (Table 1). Nearly all surgical patients presented with local swelling and ecchymosis on the bite wound. Only one non-surgical patient presented with ecchymosis on a finger and was discharged from the ED 1 day later after four vials of antivenin were administered. The Figure shows the initial ecchymosis and necrosis of a N atra bite wound, the development of extensive tissue necrosis, and the postoperative wounds of a surgical patient (patient No. 9 in Table 1).
 

Table 1. Clinical characteristics of the 17 surgical patients with Naja atra envenomation
 

Figure. Patient No. 9 in Table 1
A 59-year-old man bitten by Naja atra on his left foot visited our hospital 6 hours after the snakebite. (a) Despite the use of 10 vials of antivenin, progressive ecchymosis and necrosis of the bite wound developed. (b) Fasciotomy and debridement were performed on the second day after presentation. (c) Progressive wound necrosis and necrotising fasciitis of the leg developed 5 days later. (d and e) He underwent a second surgical debridement of the foot and fasciotomy of the leg
 
Demographic and clinical characteristics associated with surgical treatment in patients with Naja atra envenomation
The demographic and clinical characteristics were compared between the surgical and non-surgical patients with N atra envenomation (Tables 2 and 3). Overall, the surgical patients received significantly higher doses of antivenin (9.2 ± 4.9 vials vs 3.8 ± 2.4 vials; P=0.002) and had significantly higher white blood cell counts (11.0 ± 3.7 x 10^9/L vs 8.2 ± 2.4 x 10^9/L; P=0.043). A higher respiratory rate was also evident in surgical patients (median [interquartile range]: 20 [20-21] vs 18 [16-18] breaths/min; P=0.015), but the proportion of missing data in both groups for this factor was high (Table 2). A significantly higher proportion of surgical patients received six or more vials of antivenin in total compared with non-surgical patients (82.4% vs 18.2%; P=0.001) [Table 3]. For local signs, symptoms, and complications, a significantly higher proportion of surgical patients presented with local swelling (100% vs 72.7%; P=0.05), ecchymosis (82.4% vs 9.1%; P<0.001), necrosis (58.8% vs 0%; P=0.002), bulla formation (41.2% vs 0%; P=0.023), and necrotising fasciitis (76.5% vs 0%; P<0.001) [Table 3]. Age, season and site of snakebite, co-morbidity with diabetes, allergy to antivenin, and other systemic manifestations were not found to be significantly different between surgical and non-surgical patients. None of the patients with N atra envenomation presented with neurological symptoms. One patient with a small area of ecchymosis on the bite wound of his left hand did not receive surgical intervention, because the condition of the local wound improved and healed after administration of four vials of antivenin and intravenous antibiotics.
 

Table 2. Clinical and laboratory characteristics of 28 patients with Naja atra envenomation
 

Table 3. Demographics, and clinical and laboratory characteristics of 28 patients with Naja atra envenomation
 
Independent predictors of surgery in patients with Naja atra envenomation
To determine clinical predictors of surgery, a multivariate logistic regression analysis was conducted for the significant variables derived from the univariate analysis. Necrotising fasciitis was not included in the multivariate analysis because it was a surgical finding and not an early sign that could be identified in the ED. The results showed that local ecchymosis (OR=34.36; 95% confidence interval [CI], 2.20-536.08; P=0.012) and a high total dose of antivenin (≥6 vials; OR=14.59; 95% CI, 1.10-192.72; P=0.042) were the most significant clinical predictors of surgery in patients with N atra envenomation.
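As an illustration of how such ORs and 95% CIs are obtained, the sketch below (Python with statsmodels; simulated data, not the study data) fits a logistic regression and exponentiates the coefficients.

import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(0)
n = 200  # simulated patients, for illustration only
ecchymosis = rng.integers(0, 2, n)
high_antivenin = rng.integers(0, 2, n)
# Simulated binary outcome loosely driven by the two predictors
surgery = (1.5 * ecchymosis + 1.0 * high_antivenin
           + rng.normal(size=n) > 1.2).astype(int)

# Fit logistic regression: surgery ~ ecchymosis + high_antivenin
X = sm.add_constant(np.column_stack([ecchymosis, high_antivenin]))
fit = sm.Logit(surgery, X).fit(disp=0)

odds_ratios = np.exp(fit.params)   # OR = exp(coefficient)
ci = np.exp(fit.conf_int())        # 95% CI on the OR scale
print(odds_ratios)
print(ci)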
 
Bacterial isolates identified from the snakebite wounds of surgical patients with Naja atra envenomation, and bacterial susceptibility to common antibiotics
To analyse the cause of necrotising fasciitis in surgical patients, the bacterial isolates identified from their snakebite wounds were further analysed. The positive culture rate was 88.2% (n=15). More than one type of bacterium was isolated from the snakebite wound in 14 (82.4%) surgical patients. The isolated pathogens included aerobic Gram-positive and Gram-negative bacteria, as well as anaerobic bacteria. The most commonly identified pathogen was Morganella morganii (76.5%), followed by Enterococcus faecalis (58.8%) and Bacteroides fragilis (29.4%) [Table 4].
 

Table 4. Bacterial isolates from snakebite wounds of surgical patients
 
The susceptibility of the bacteria to common antibiotics was analysed (Table 5). All Gram-positive bacteria were susceptible to vancomycin and teicoplanin. All Gram-negative bacteria were susceptible to cefotaxime and amikacin. Cefmetazole, gentamicin, levofloxacin, and trimethoprim/sulfamethoxazole were also effective against the isolated Gram-negative bacteria. Nearly all anaerobic bacteria were susceptible to clindamycin and metronidazole (Table 5).
 

Table 5. Susceptibility of bacteria isolated from snakebite wounds to common antibiotics
 
Discussion
In our study, ecchymosis on the bite wound was a good clinical predictor of surgery for N atra envenomation. N atra venom is predominantly cytotoxic rather than haemorrhagic. The cardiotoxin and phospholipase A2 in N atra venom are direct cytotoxic polypeptides and cause degradation of cell membranes. They induce cell death by activating calcium-dependent proteases and inhibit mitochondrial respiration. Hyaluronidase in N atra venom destroys interstitial constituents and facilitates the spread of venom.11 A histopathological study of N atra bite wounds demonstrated thrombotic and fibrinoid deposits in superficial and deep dermal vessels, and leukocytoclastic vasculitis.12 Hence, both the cytotoxic and ischaemic effects of N atra venom may lead to blood extravasation from the destroyed subcutaneous vessels or capillaries and result in the characteristic ecchymosis on the bite wound. This finding may be a potentially important clinical sign of irreversible subcutaneous tissue necrosis due to development of tissue ischaemia.3 If management at this stage is inadequate, tissue destruction may progress to involve the fascia rapidly and extensively, with ultimate development of necrotising fasciitis.13 In our patients, extensive tissue destruction beyond the original bite site was evident once necrotising fasciitis developed. Further study is required to verify whether early surgical intervention can prevent the development of necrotising fasciitis, reduce the size of the surgical wound, or shorten the length of hospital stay. Nonetheless, surgical assessment may be needed in patients with N atra bite who present with local ecchymosis on the bite wound.
 
Traditionally, immediate injection of antivenin to neutralise N atra venom was considered the only effective management.14 A study using an enzyme-linked immunosorbent assay to detect the amount of N atra venom revealed that two to eight vials of antivenin are sufficient to eliminate systemic circulating venom if presentation is early.6 The efficacy of systemically administered antivenin in diminishing local tissue destruction is still controversial, however, and needs further study.3 In an animal study, the cytotoxic venom of N atra was shown to bind with high affinity to tissues, leading to high levels of local tissue destruction.15 This finding may explain the difficulties associated with neutralisation of local venom toxicity, especially in cases of delayed presentation. Thus, the adequate dose of antivenin for preventing advanced tissue destruction remains unknown. In our study, nearly all patients presented within 1 hour following envenoming. Intravenous injection of antivenin was administered as soon as clinically possible following identification of cobra envenoming. Interestingly, the use of higher doses of antivenin in patients with N atra envenomation did not decrease surgical rates even in cases of early presentation. More than half of the patients underwent surgery and the majority were diagnosed with necrotising fasciitis. Surgical intervention appears to be crucial for the management of N atra envenomation. Hence, the identification of clinical predictors of surgical need and sufficient evidence to support surgeons’ decisions to carry out early surgical intervention are important issues in N atra management.
 
High bacterial isolation rates and the growth of a mixed spectrum of bacteria from bite wounds indicate bacterial infection (which may be another cause of necrotising fasciitis in N atra envenomation), bacterial colonisation, or both. Morganella morganii and Enterococcus species were the most common pathogens cultured from N atra bite wounds in this study. This finding is consistent with the bacterial cultures taken from oral swabs of N atra in Hong Kong.16 Similar results were also described in a previous study in western Taiwan.17 Hence, the use of adequate antibiotics is important in N atra envenomation management. In accordance with the results of our antibiotic susceptibility tests of the isolated bacteria, treatment with a glycopeptide antibiotic (vancomycin or teicoplanin) combined with a third-generation cephalosporin (cefotaxime), with or without anti-anaerobic antibiotics (clindamycin or metronidazole), is recommended.
 
Limitations
There are several limitations to our study. First, this was a retrospective comparative chart review. Non-uniform description of symptoms and signs documented by different providers may have influenced the validity of the statistics. Second, the small sample size may limit the statistical power of the multivariate analysis. Third, there are no definitive guidelines for the management of venomous snakebites in Taiwan, and various treatment strategies were employed; this may have influenced the final outcome. A large-scale prospective study is warranted to verify the risk factors we have identified and to provide more accurate data for early risk stratification, treatment, and management of these patients.
 
Conclusions
Of the six common venomous snakes in eastern Taiwan, bites by N atra most frequently lead to surgical intervention. Severe tissue necrosis and necrotising fasciitis were the main findings during surgery. Patients who present with ecchymosis on the bite wound or who require higher doses of antivenin may have a higher probability of surgical intervention. In addition to early and adequate antivenin treatment, combined broad-spectrum antibiotics and surgical intervention may be needed in the management of N atra snakebites.
 
Acknowledgement
This work was supported by Buddhist Tzu Chi General Hospital Grant TCRD103-53 (to the first author).
 
Declaration
All authors have disclosed no conflicts of interest.
 
References
1. Liau MY, Huang RJ. Toxoids and antivenoms of venomous snakes in Taiwan. Toxin Rev 1997;16:163-75.
2. Hung DZ. Taiwan’s venomous snakebite: epidemiological, evolution and geographic differences. Trans R Soc Trop Med Hyg 2004;98:96-101. Crossref
3. Wong OF, Lam TS, Fung HT, Choy CH. Five-year experience with Chinese cobra (naja atra)–related injuries in two acute hospitals in Hong Kong. Hong Kong Med J 2010;16:36-43.
4. Li S, Wang J, Zhang X, et al. Proteomic characterization of two snake venoms: Naja naja atra and Agkistrodon halys. Biochem J 2004;384:119-27. Crossref
5. Watt G, Padre L, Tuazon L, Theakston RD, Laughlin L. Bites by the Philippine cobra (Naja naja philippinensis): prominent neurotoxicity with minimal local signs. Am J Trop Med Hyg 1988;39:306-11.
6. Hung DZ, Liau MY, Lin-Shiau SY. The clinical significance of venom detection in patients of cobra snakebite. Toxicon 2003;41:409-15. Crossref
7. Wang W, Chen QF, Yin RX, et al. Clinical features and treatment experience: A review of 292 Chinese cobra snakebites. Environ Toxicol Pharmacol 2014;37:648-55. Crossref
8. Huang LW, Wang JD, Huang JA, Hu SY, Wang LM, Tsan YT. Wound infections secondary to snakebite in central Taiwan. J Venom Anim Toxins Incl Trop Dis 2012;18:272-6. Crossref
9. Hung DZ, Wu ML, Deng JF, Lin-Shiau SY. Russell’s viper snakebite in Taiwan: differences from other Asian countries. Toxicon 2002;40:1291-8. Crossref
10. Chen YW, Chen MH, Chen YC, et al. Differences in clinical profiles of patients with Protobothrops mucrosquamatus and Viridovipera stejnegeri envenoming in Taiwan. Am J Trop Med Hyg 2009;80:28-32.
11. Harris JB. Myotoxic phospholipases A2 and the regeneration of skeletal muscles. Toxicon 2003;42:933-45. Crossref
12. Pongprasit P, Mitrakul C, Noppakun N. Histopathology and microbiological study of cobra bite wounds. J Med Assoc Thai 1988;71:475-80.
13. Gozal D, Ziser A, Shupak A, Ariel A, Melamed Y. Necrotizing fasciitis. Arch Surg 1986;121:233-5. Crossref
14. Russell FE. Snake venom immunology: historical and practical considerations. Toxin Rev 1988;7:1-82. Crossref
15. Guo MP, Wang QC, Liu GF. Pharmacokinetics of cytotoxin from Chinese cobra (Naja naja atra) venom. Toxicon 1993;31:339-43. Crossref
16. Lam KK, Crow P, Ng KH, et al. A cross-sectional survey of snake oral bacterial flora from Hong Kong, SAR, China. Emerg Med J 2011;28:107-14. Crossref
17. Chen CM, Wu KG, Chen CJ, Wang CM. Bacterial infection in association with snakebite: a 10-year experience in a northern Taiwan medical center. J Microbiol Immunol Infect 2011;44:456-60. Crossref

Multimodal analgesia model to achieve low postoperative opioid requirement following bariatric surgery

Hong Kong Med J 2016 Oct;22(5):428–34 | Epub 15 Jul 2016
DOI: 10.12809/hkmj154769
© Hong Kong Academy of Medicine. CC BY-NC-ND 4.0
 
ORIGINAL ARTICLE
Multimodal analgesia model to achieve low postoperative opioid requirement following bariatric surgery
Katherine KY Lam, FHKCA, FHKAM (Anaesthesiology); Wilfred LM Mui, FCSHK, FHKAM (Surgery)
Hong Kong Bariatric and Metabolic Institute and Evangel Hospital Weight Management Centre, Room 610, Champion Building, 301-309 Nathan Road, Jordan, Hong Kong
 
Corresponding author: Dr Katherine KY Lam (katherinelamky@gmail.com)
 
Abstract
Objective: To investigate whether a new anaesthesia protocol can reduce opioid use in obese patients following laparoscopic sleeve gastrectomy.
 
Methods: This prospective observational case series was conducted in a private hospital in Hong Kong that has been accredited as a Centre of Excellence for Bariatric Surgery. Thirty consecutive patients scheduled for laparoscopic sleeve gastrectomy from 1 January 2015 to 31 March 2015 were reviewed.
 
Results: Of the 30 patients, 14 (46.7%) did not require any opioids for rescue analgesia during the entire postoperative period; six (20.0%) required rescue opioids only in the post-anaesthetic care unit, but not in the surgical ward. The mean postoperative total opioid requirement per patient was 32 mg of pethidine.
 
Conclusion: With a combination of multimodal analgesia and local anaesthetic wound infiltration, it is possible to avoid giving potent long-acting opioids in anaesthesia for bariatric surgery.
 
New knowledge added by this study
  • It is possible to avoid giving potent long-acting opioids in anaesthesia for bariatric surgery by using multimodal analgesia with a combination of paracetamol, pregabalin, COX-2 inhibitors, tramadol, ketamine, dexmedetomidine, and local anaesthetic wound infiltration.
Implications for clinical practice or policy
  • The use of this opioid-sparing anaesthetic technique can potentially reduce the adverse effects and morbidity associated with the use of opioids in obese patients. The technique can be extended to other types of surgery in obese patients.
 
 
Introduction
Obese patients are particularly sensitive to the sedative and respiratory depressive effects of long-acting opioids. Many obese patients also have obstructive sleep apnoea syndrome (OSAS) and are prone to airway obstruction and desaturation in the postoperative period, especially if opioids have been given.1 2 Given this background, multimodal analgesia is advocated for bariatric surgery with the aim of reducing opioid use.3 4 At the time of writing, no study had demonstrated a technique that consistently removes the need for any postoperative opioid analgesia. In this study, we report the use of an anaesthesia protocol that allowed a significant proportion of our patients undergoing laparoscopic sleeve gastrectomy to remain completely free of potent long-acting opioids in the intra-operative and postoperative period.
 
Methods
Patient selection
This was a prospective observational study, conducted in a private hospital in Hong Kong that has been accredited as a Centre of Excellence for Bariatric Surgery. All patients scheduled for laparoscopic sleeve gastrectomy for management of obesity or type 2 diabetes from 1 January 2015 onwards were anaesthetised using the same protocol. We analysed 30 consecutive cases between 1 January 2015 and 31 March 2015 to investigate the postoperative opioid requirements under this anaesthesia protocol. Patients were excluded from the case series if they had contra-indications or allergy to any of the anaesthetic or analgesic drugs, or if anaesthesia deviated from the standard protocol for any reason. Three patients were excluded: one was taking a selective serotonin reuptake inhibitor antidepressant, so pethidine was avoided to prevent serotonin syndrome (morphine was given instead); one was allergic to non-steroidal anti-inflammatory drugs (NSAIDs), so intravenous parecoxib and oral etoricoxib were withheld; and one accidentally received a larger intra-operative dose of ketamine than the protocol allowed. Concomitant laparoscopic cholecystectomy was performed with laparoscopic sleeve gastrectomy in three patients who were included in the study.
 
The anaesthesia protocol
All patients were fasted from midnight on the night before surgery. All operations were scheduled in the morning. Patients were premedicated with oral pantoprazole 40 mg on the night before surgery, and 2 g of oral paracetamol and 150 mg or 300 mg of oral pregabalin (for patients of body mass index <35 kg/m2 or ≥35 kg/m2, respectively) 2 hours before surgery.
 
Upon arrival in the operating theatre, intravenous access was established and 1 to 2 mg of intravenous midazolam was administered, followed by an infusion of dexmedetomidine. The dose of dexmedetomidine was titrated according to the lean body weight (LBW) calculated using the Hume formula.5 The starting dose of dexmedetomidine was 0.2 µg/kg/h based on LBW.6 No loading dose was given.
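For readers who wish to reproduce the dosing arithmetic, the sketch below (ours, not part of the published protocol) implements the Hume formula and derives the LBW-based starting infusion rate; the 120-kg example patient and the function names are hypothetical.

```python
def hume_lbw(weight_kg: float, height_cm: float, male: bool) -> float:
    """Estimate lean body weight (kg) with the Hume (1966) formula."""
    if male:
        return 0.32810 * weight_kg + 0.33929 * height_cm - 29.5336
    return 0.29569 * weight_kg + 0.41813 * height_cm - 43.2933

def dexmedetomidine_rate_ug_h(lbw_kg: float, dose_ug_kg_h: float = 0.2) -> float:
    """Starting dexmedetomidine infusion rate (µg/h) at 0.2 µg/kg/h of LBW."""
    return dose_ug_kg_h * lbw_kg

# Hypothetical patient: 120 kg, 170 cm, male.
lbw = hume_lbw(120, 170, male=True)      # ≈ 67.5 kg
rate = dexmedetomidine_rate_ug_h(lbw)    # ≈ 13.5 µg/h
print(f"LBW {lbw:.1f} kg, starting infusion {rate:.1f} µg/h")
```

Dosing the same hypothetical patient by total body weight would give 24 µg/h, illustrating why the LBW scalar yields a much lower overall dexmedetomidine dose (see the Discussion).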
 
Standard monitoring was applied to the patient together with a bispectral index (BIS) monitor and peripheral nerve stimulation monitor. Graduated compression stockings and sequential compression devices were used for all patients. Induction of anaesthesia was accomplished with fentanyl 100 µg, a titrated dose of propofol, and either suxamethonium or rocuronium as appropriate. The trachea was intubated and patients were ventilated with a mixture of air, oxygen, and desflurane.
 
Intra-operatively, desflurane was titrated to maintain BIS value between 40 and 60. Muscle relaxation was maintained with a rocuronium infusion to keep a train-of-four count of 1. Dexmedetomidine infusion continued at 0.2 µg/kg/h or higher if necessary. Shortly after induction, the various supplementary analgesic drugs were given. A loading dose of ketamine 0.3 mg/kg LBW was given followed by intermittent boluses roughly equivalent to 0.2 to 0.3 mg/kg/h of LBW. Intravenous parecoxib 40 mg and tramadol 100 mg were given. Dexamethasone 8 mg and tropisetron 5 mg were given intravenously for prophylaxis of postoperative nausea and vomiting (PONV).
 
For intravenous fluids, patients were given 10 mL/kg actual body weight of either lactated Ringer’s solution or normal saline, then more were given as appropriate. Hypotension was treated with either ephedrine or phenylephrine.
 
When the surgeon started to close the wounds, the rocuronium infusion was stopped and the dexmedetomidine infusion rate was reduced to 0.1 µg/kg/h. Wounds were infiltrated with 20 mL of 0.5% levobupivacaine. When all wounds were closed, the dexmedetomidine infusion was stopped, desflurane was switched off, and muscle relaxation was reversed with neostigmine and atropine. Patients were extubated when awake and able to obey commands.
 
After extubation, patients were transferred to the post-anaesthetic care unit (PACU) for observation for 30 minutes, or longer if appropriate. If a patient required rescue analgesia, intravenous pethidine 20 mg with intravenous ketamine 5 mg was given, and the dose was repeated if necessary. Once 10 mg of intravenous ketamine had been given, further rescue analgesia was intravenous pethidine 20 mg without any more ketamine; this avoided giving too much ketamine to an awake patient, which can cause dizziness or hallucinations. When patients had good pain control and stable vital signs, they were transferred back to the ward. The standard postoperative protocol was initiated: if patients requested analgesics, an intramuscular injection of pethidine 50 mg was given, and repeated after 4 hours if necessary. By early evening, when vital signs were stable, patients were allowed sips of water followed by a fluid diet of 60 mL/h. Regular oral paracetamol and etoricoxib were given, and oral pregabalin was added to the protocol the next day. Opioid requirements were reviewed for 24 hours after surgery.
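The PACU rescue ladder described above amounts to a simple dosing rule. The function below is a schematic restatement of that rule for illustration only, under the assumption that every rescue dose is given in full; it is not clinical software.

```python
def pacu_rescue_doses(n_requests: int) -> list:
    """Schematic PACU rescue ladder: each rescue is IV pethidine 20 mg;
    IV ketamine 5 mg accompanies it until 10 mg of ketamine has been given."""
    doses, ketamine_total = [], 0
    for _ in range(n_requests):
        ketamine = 5 if ketamine_total < 10 else 0
        ketamine_total += ketamine
        doses.append({"pethidine_mg": 20, "ketamine_mg": ketamine})
    return doses

# Three rescue requests: the third dose is pethidine alone.
print(pacu_rescue_doses(3))
# [{'pethidine_mg': 20, 'ketamine_mg': 5},
#  {'pethidine_mg': 20, 'ketamine_mg': 5},
#  {'pethidine_mg': 20, 'ketamine_mg': 0}]
```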
 
As part of the standard postoperative protocol, patients were asked to get off the bed and walk around the ward with the assistance of nursing or physiotherapy staff by the evening of the day of surgery. Provided there were no complications, patients were discharged on the second postoperative day. The anaesthesia protocol is summarised in Table 1.
 

Table 1. Anaesthesia protocol
 
Results
Patient characteristics are shown in Table 2, and postoperative opioid requirements are listed in Table 3.
 

Table 2. Patient characteristics (n=30)
 

Table 3. Postoperative opioid requirements (n=30)
 
Of the 30 patients, no opioid rescue analgesia was required in 14 (46.7%) throughout the postoperative period; six (20%) required intravenous pethidine for rescue analgesia in the PACU, but not after their return to the ward. The remaining 10 (33.3%) patients were given intramuscular pethidine injections in the ward on request.
 
The mean postoperative opioid requirement per patient in the whole case series was 32 mg of pethidine. Among the 16 patients who required rescue analgesia in the ward or the PACU, the mean opioid requirement was 60 mg of pethidine (range, 20-150 mg), consistent with the overall mean (16 × 60 mg / 30 patients = 32 mg).
 
This anaesthetic protocol included a dexmedetomidine infusion that might cause hypotension and bradycardia due to its alpha-2 adrenoceptor agonist action. In our case series, 11 (36.7%) patients developed transient hypotension despite intravenous fluid loading and required either intravenous ephedrine or phenylephrine. One patient had transient intra-operative bradycardia requiring atropine, probably due to preoperative use of a beta blocker and a low resting heart rate.
 
Discussion
Importance of reducing postoperative opioid use in obese patients
Opioids are among the world’s oldest known drugs. Traditionally they have been used as part of balanced anaesthesia to provide hypnosis and analgesia and to blunt the sympathetic response to surgery, and they remain the mainstay of postoperative analgesia in many situations. Morbidly obese patients, however, are particularly sensitive to the respiratory depressant effects of opioids. Taylor et al2 found that the use of opioids per se is a risk factor for respiratory events in the first 24 hours after surgery. Ahmad et al1 demonstrated, in their study of 40 morbidly obese patients undergoing laparoscopic bariatric surgery with desflurane and remifentanil-morphine–based anaesthesia, that hypoxaemic episodes in the first 24 hours were common; 14 of their 40 patients had more than five hypoxaemic episodes per hour despite supplementary oxygen.
 
Another concern with the use of opioids in bariatric patients is the high incidence (>70%) of OSAS.7 In our study, 30% (n=9) of patients had OSAS confirmed by an overnight sleep study. The remaining patients were not tested, although many had varying symptoms of OSAS; these untested patients were assumed to have OSAS unless proven otherwise. The American Society of Anesthesiologists recommends that in patients with OSAS, methods should be used to reduce or eliminate the requirement for systemic opioids.8 Hence, reducing perioperative opioid use in these obese patients can potentially reduce morbidity.
 
How can the anaesthetist avoid or reduce the use of perioperative opioids, and yet still provide balanced anaesthesia with hypnosis, analgesia, haemodynamic stability, and satisfactory postoperative analgesia? The first method is to combine general anaesthesia with regional analgesia techniques, such that anaesthetic agents will provide hypnosis while the regional blocks will provide analgesia and block sympathetic responses to surgery. Any form of major regional block in a morbidly obese patient can be technically challenging, however. Furthermore, with respect to bariatric surgery, most procedures are now performed laparoscopically, so that thoracic epidural analgesia techniques have become largely unnecessary.
 
Putting aside the use of regional analgesia, the second method to reduce perioperative opioid use is to use a combination of non-opioid agents with volatile agents or propofol to achieve analgesia and haemodynamic control.3 A point to note here is that as acute tolerance to the analgesic effects of opioids can rapidly develop (such as after 90 minutes of remifentanil infusion),9 any attempts to reduce postoperative opioid requirement must include an effort to either eliminate or reduce the use of intra-operative opioids. These techniques are now often described as opioid-free anaesthesia or non-opioid techniques.
 
Paracetamol, NSAIDs or COX-2 inhibitors, gabapentinoids, ketamine, and alpha-2 agonists, when used individually, have all been shown to reduce postoperative opioid requirement and improve pain relief.10 11 12 13 14 Different combinations of these agents, together with local anaesthetic infiltration of the wounds, have been reported for bariatric surgery, as discussed below.
 
Development of the study protocol based on previous studies
In 2003, Feld et al15 described a technique using sevoflurane combined with ketorolac, clonidine, ketamine, lignocaine, and magnesium for patients undergoing open gastric bypass. Compared with a control group in which sevoflurane was used with fentanyl, the non-opioid group was less sedated and used less morphine in the PACU, although total morphine use at 16 hours was not significantly different between groups.
 
In 2006, Feld et al16 described using desflurane combined with a dexmedetomidine infusion, compared with a control group given desflurane and fentanyl, for patients undergoing open gastric bypass. The dexmedetomidine group had lower pain scores and used less morphine in the PACU.
 
In 2005, Hofer et al17 reported a super-obese patient weighing 433 kg who underwent open gastric bypass. No opioids were used; they were replaced by a high-dose dexmedetomidine infusion together with isoflurane.
 
As laparoscopic techniques have become more common in bariatric surgery, more studies of non-opioid anaesthetic techniques for laparoscopic bariatric surgery have been carried out. Tufanogullari et al18 described a technique in which either fentanyl or varying doses of dexmedetomidine were used with desflurane for laparoscopic bariatric surgery. All patients were also given celecoxib. Postoperatively, patients were given fentanyl boluses in the PACU, then intravenous morphine via a patient-controlled analgesia system. The only statistically significant difference was decreased PACU fentanyl use in the dexmedetomidine groups.
 
Ziemann-Gimmel et al19 looked at 181 patients undergoing laparoscopic gastric bypass. In the treatment group, volatile anaesthetics were used together with intravenous paracetamol and ketorolac. Postoperatively, patients were given regular paracetamol and ketorolac; for breakthrough pain, intermittent oral oxycodone or intravenous hydromorphone was given. A small number of patients in this treatment group (3/89) remained opioid-free throughout, and 15 patients did not require opioid medications after their return to the ward.
 
In another study in which the primary outcome was the incidence of PONV, Ziemann-Gimmel et al20 evaluated 119 patients undergoing laparoscopic bariatric surgery. The treatment group was managed with propofol infusion, dexmedetomidine infusion, paracetamol, ketorolac, and ketamine; the other group was managed with volatile anaesthetic and opioids. The postoperative analgesia regimen was the same as in their previous study.19 They reported a large reduction in PONV in the treatment group.
 
While most studies reported a decreased requirement for postoperative opioid analgesia in their non-opioid groups, very few achieved zero postoperative opioid use. Only Ziemann-Gimmel et al19 achieved total opioid sparing, in a small proportion (3 out of 92 patients) of the treatment group, by using intra-operative and postoperative intravenous paracetamol and ketorolac.
 
Most of these earlier studies used a combination of only a few of the available non-opioid adjuncts, with dexmedetomidine a mainstay in most. We therefore proposed a wider mix of non-opioid adjuncts: a combination of paracetamol, a COX-2 inhibitor, pregabalin, ketamine, dexmedetomidine, and local anaesthetic infiltration. In contrast to the earlier studies, we were able to achieve zero postoperative opioid use in a significant percentage of patients (46.7%).
 
In our protocol, the only opioids given during anaesthesia were fentanyl 100 µg for intubation and tramadol 100 mg, a weak opioid, shortly after induction. All other opioid analgesics, if required, were given after the patient was awake. This avoided blind administration of long-acting opioids during anaesthesia and allowed better titration, with small boluses given each time while the patient was awake.
 
Dexmedetomidine
Dexmedetomidine was a useful agent in our protocol; before its addition, total opioid sparing was very difficult to achieve. Dexmedetomidine is a highly selective alpha-2 adrenoceptor agonist with analgesic and sedative properties.21 A previous study of its use in bariatric anaesthesia failed to show any reduction in opioid requirement.18 In our protocol, we used more non-opioid adjuncts, and because we calculated the infusion dose using LBW instead of total body weight (TBW), we administered a much lower overall dose of dexmedetomidine.
 
Infusion of dexmedetomidine may cause initial hypertension with reflex bradycardia (especially during a loading dose infusion), followed by hypotension and bradycardia. In our study, no loading dose was given. Of the 30 patients, 11 (36.7%) developed transient hypotension despite intravenous fluid loading and required either intravenous ephedrine or phenylephrine. This transient hypotension was also aggravated by placing the patient in a steep reverse Trendelenburg position to facilitate surgical exposure, which decreases venous return. When using dexmedetomidine in bariatric surgery, care must be taken to ensure the patient is euvolaemic.
 
Ketamine
Ketamine was another useful adjunct in our protocol. Ketamine is an N-methyl-D-aspartate receptor antagonist with strong analgesic properties when given at subanaesthetic doses.22 Its use has advantages in morbidly obese patients because it causes little respiratory depression compared with opioids. In our protocol, we used LBW to calculate the ketamine dose and used relatively low doses (0.3 mg/kg bolus followed by 0.2-0.3 mg/kg/h as intermittent boluses). This resulted in a low total ketamine dose, with a mean of 31 mg per patient (range, 25-50 mg). Midazolam 1 to 2 mg was also given at induction to prevent psychotomimetic reactions caused by ketamine. No patient developed hallucinations or dysphoria, and no delay in emergence was noted.
 
The use of lean body weight as dosing scalar
In our protocol, we chose to use LBW to calculate the doses of dexmedetomidine and ketamine. The classic teaching is that, in obese patients, anaesthetic drugs should be dosed according to TBW, ideal body weight, or LBW depending on their lipid solubility. Lipophilic drugs are better dosed according to actual body weight because of an increased volume of distribution, whereas hydrophilic drugs are better dosed according to LBW or ideal body weight.23 Lean body weight is significantly correlated with cardiac output, and drug clearance increases proportionately with LBW.6
 
There is insufficient information regarding the pharmacokinetics and pharmacodynamics of dexmedetomidine and ketamine in the morbidly obese patient. In the few previous studies of dexmedetomidine in bariatric anaesthesia, TBW was used as the dosing scalar. For example, Feld et al16 used a 0.5 µg/kg TBW loading dose followed by a 0.4 µg/kg/h infusion in their series of 10 patients undergoing open gastric bypass. Ziemann-Gimmel et al20 used a 0.5 µg/kg TBW loading dose followed by a 0.1 to 0.3 µg/kg/h infusion in their group of 60 patients undergoing a variety of bariatric procedures. Tufanogullari et al18 gave no loading dose and infused 0 to 0.8 µg/kg/h in their series of 80 patients undergoing laparoscopic banding or bypass. There are few data regarding ketamine dosing in bariatric surgery. We chose to dose these two drugs using LBW to see how our results would differ from the other published studies.
 
Limitations of the study
Our study has several limitations. It was a prospective observational study with a relatively small number of cases. We do not have data to compare this protocol with our previous protocols, nor do we have data in the form of a randomised controlled trial to look at the isolated effect of any of the drugs used.
 
The opioid we used for rescue analgesia was pethidine, given intravenously in the recovery room by the anaesthetist or intramuscularly on the ward by nurses according to a standing order. One can argue that the mean opioid dose per patient was not accurate because some patients were given small intravenous boluses and others fixed-dose intramuscular injections. In theory, to assess postoperative parenteral opioid requirements accurately, all patients should be given a patient-controlled analgesia system to deliver boluses of parenteral opioids as required. This, however, is neither practical nor necessary, given that two thirds of our patients did not require any opioids after returning to the ward. It would also waste drugs whenever a whole cassette remained unused.
 
We were able to demonstrate that a significant proportion of patients did not require any opioids, but we do not have data to demonstrate a reduction in respiratory complications or an improvement in time to ambulation or discharge. This could be the basis for further studies.
 
Declaration
All authors have disclosed no conflicts of interest.
 
References
1. Ahmad S, Nagle A, McCarthy RJ, et al. Postoperative hypoxaemia in morbidly obese patients with and without obstructive sleep apnea undergoing laparoscopic bariatric surgery. Anesth Analg 2008;107:138-43.
2. Taylor S, Kirton OC, Staff I, Kozol RA. Postoperative day one: a high risk period for respiratory events. Am J Surg 2005;190:752-6.
3. Mulier JP. Perioperative opioids aggravate obstructive breathing in sleep apnea syndrome: mechanisms and alternative anaesthesia strategies. Curr Opin Anaesthesiol 2016;29:129-33.
4. Alvarez A, Singh PM, Sinha AC. Postoperative analgesia in morbid obesity. Obes Surg 2014;24:652-9.
5. Hume R. Prediction of lean body mass from height and weight. J Clin Pathol 1966;19:389-91.
6. Ingrande J, Lemmens HJ. Dose adjustment of anaesthetics in the morbidly obese. Br J Anaesth 2010;105 Suppl 1:i16-23.
7. Lopez PP, Stefan B, Schulman CI, Byers PM. Prevalence of sleep apnea in morbidly obese patients who presented for weight loss surgery evaluation: more evidence for routine screening for obstructive sleep apnea before weight loss surgery. Am Surg 2008;74:834-8.
8. American Society of Anesthesiologists Task Force on Perioperative Management of Patients with Obstructive Sleep Apnea. Practice guidelines for the perioperative management of patients with obstructive sleep apnea: an updated report by the American Society of Anesthesiologists Task Force on Perioperative Management of Patients with Obstructive Sleep Apnea. Anesthesiology 2014;120:268-86.
9. Vinik HR, Kissin I. Rapid development of tolerance to analgesia during remifentanil infusion in humans. Anesth Analg 1998;86:1307-11.
10. Dahl JB, Nielsen RV, Wetterslev J, et al. Post-operative analgesic effects of paracetamol, NSAIDs, glucocorticoids, gabapentinoids and their combinations: a topical review. Acta Anaesthesiol Scand 2014;58:1165-81.
11. Blaudszun G, Lysakowski C, Elia N, Tramèr MR. Effect of perioperative systemic α2 agonists on postoperative morphine consumption and pain intensity: systematic review and meta-analysis of randomized controlled trials. Anesthesiology 2012;116:1312-22.
12. Cabrera Schulmeyer MC, de la Maza J, Ovalle C, Farias C, Vives I. Analgesic effects of a single preoperative dose of pregabalin after laparoscopic sleeve gastrectomy. Obes Surg 2010;20:1678-81.
13. Weinbroum AA. Non-opioid IV adjuvants in the perioperative period: pharmacological and clinical aspects of ketamine and gabapentinoids. Pharmacol Res 2012;65:411-29.
14. Alimian M, Imani F, Faiz SH, Pournajafian A, Navadegi SF, Safari S. Effect of oral pregabalin premedication on post-operative pain in laparoscopic gastric bypass surgery. Anesth Pain Med 2012;2:12-6.
15. Feld JM, Laurito CE, Beckerman M, Vincent J, Hoffman WE. Non-opioid analgesia improves pain relief and decreases sedation after gastric bypass surgery. Can J Anaesth 2003;50:336-41.
16. Feld JM, Hoffman WE, Stechert MM, Hoffman IW, Ananda RC. Fentanyl or dexmedetomidine combined with desflurane for bariatric surgery. J Clin Anesth 2006;18:24-8.
17. Hofer RE, Sprung J, Sarr MG, Wedel DJ. Anesthesia for a patient with morbid obesity using dexmedetomidine without narcotics. Can J Anaesth 2005;52:176-80.
18. Tufanogullari B, White PF, Peixoto MP, et al. Dexmedetomidine infusion during laparoscopic bariatric surgery: the effect on recovery outcome variables. Anesth Analg 2008;106:1741-8.
19. Ziemann-Gimmel P, Hensel P, Koppman J, Marema R. Multimodal analgesia reduces narcotic requirements and antiemetic rescue medication in laparoscopic Roux-en-Y gastric bypass surgery. Surg Obes Relat Dis 2013;9:975-80.
20. Ziemann-Gimmel P, Goldfarb AA, Koppman J, Marema RT. Opioid-free total intravenous anaesthesia reduces postoperative nausea and vomiting in bariatric surgery beyond triple prophylaxis. Br J Anaesth 2014;112:906-11.
21. Carollo DS, Nossaman BD, Ramadhyani U. Dexmedetomidine: a review of clinical applications. Curr Opin Anaesthesiol 2008;21:457-61.
22. Gammon D, Bankhead B. Perioperative pain adjuncts. In: Johnson KB, editor. Clinical pharmacology for anesthesiology. McGraw-Hill Education; 2014: 157-78.
23. Sinha AC, Eckmann DM. Anesthesia for bariatric surgery. In: Miller RD, Eriksson LI, Fleisher LA, Wiener-Kronish JP, Young WL, editors. Miller’s anesthesia. 7th ed. Philadelphia: Churchill Livingstone; 2015: 2089-104.

Seatbelt use by pregnant women: a survey of knowledge and practice in Hong Kong

Hong Kong Med J 2016 Oct;22(5):420–7 | Epub 19 Aug 2016
DOI: 10.12809/hkmj164853
© Hong Kong Academy of Medicine. CC BY-NC-ND 4.0
 
ORIGINAL ARTICLE
Seatbelt use by pregnant women: a survey of knowledge and practice in Hong Kong
WC Lam, MPH (CUHK), FHKAM (Obstetrics and Gynaecology)1; William WK To, MD, FHKAM (Obstetrics and Gynaecology)1; Edmond SK Ma, MD, FHKAM (Community Medicine)2
1 Department of Obstetrics and Gynaecology, United Christian Hospital, Kwun Tong, Hong Kong
2 The Jockey Club School of Public Health and Primary Care, The Chinese University of Hong Kong, Shatin, Hong Kong
 
Corresponding author: Dr WC Lam (lamwc2@ha.org.hk)
 
 
Abstract
Introduction: The use of motor vehicles is common during pregnancy. Correct seatbelt use during pregnancy has been shown to protect both the pregnant woman and her fetus. This survey aimed to evaluate the practices, beliefs, and knowledge of Hong Kong pregnant women about correct seatbelt use, and to identify factors leading to reduced compliance and inadequate knowledge.
 
Methods: A self-administered survey was completed by postpartum women in the postnatal ward at the United Christian Hospital, Hong Kong, from January to April 2015. Eligible surveys were available from 495 women. The primary outcome was the proportion of pregnant women who maintained or reduced seatbelt use during pregnancy. Secondary outcomes included knowledge of correct seatbelt use and factors contributing to non-compliance and inadequate knowledge.
 
Results: Compliance with seatbelt use decreased during pregnancy, and the decrease became more pronounced with increasing gestation. Pregnant women’s knowledge about seatbelt use was inadequate and only a minority had received relevant information. Women who held a driving licence or had a higher education level were more likely to wear a seatbelt before and during pregnancy. Women with tertiary education or above knew more about seatbelt use.
 
Conclusions: Public health education about road safety for pregnant women in Hong Kong is advisable; targeting groups with lower compliance may be more effective and successful.
 
 
New knowledge added by this study
  • There was decreased compliance with seatbelt use by pregnant women in Hong Kong. The decrease in compliance became more pronounced as gestation increased. This may be related to lack of relevant information and misconceptions.
Implications for clinical practice or policy
  • As a form of public health and road traffic safety promotion, information about seatbelt use during pregnancy should be provided to pregnant women, health care workers, and all road traffic users.
 
 
Introduction
Road traffic safety is an important public health issue. Health care professionals are usually involved in treating road traffic accident victims rather than in preventing accidents or minimising the severity of injury. Education about and promotion of road traffic safety are important for all; pregnant women are no exception. Safety issues relate to both the mother and her fetus, and different information and/or a different approach may be required. With any kind of intervention during pregnancy, an emphasis on the safety of the fetus may improve compliance.
 
The number of pregnant drivers in Hong Kong is unknown, but the use of motor vehicles including private car, taxi, and public light bus is common during pregnancy. To promote maternal seatbelt use among the local pregnant population, information about their beliefs is essential.
 
Correct seatbelt use during pregnancy has been shown to protect both the pregnant woman and her fetus. There is evidence that pregnant women who do not wear a seatbelt and who are involved in a motor vehicle accident are more likely to experience excessive bleeding and fetal death.1 2 3 Compliance and proper use of the seatbelt are crucial. Incorrect placement of the seatbelt and a subsequent accident may result in fetal death due to abruptio placentae.4 The three-point restraint (ie shoulder harness in addition to a lap belt) provides more protection for the fetus than a lap belt alone. Previous studies have revealed incorrect positioning of the seatbelt in 40% to 50% of pregnant women.5 6 Various other studies have shown reduced seatbelt compliance during pregnancy.7 The proportion of seatbelt use has been reported to be around 70% to 80% before pregnancy, but reduced by half at 20 weeks or more of gestation.5 7 There is also evidence that pregnant women lack information about the proper use of a seatbelt and its role in preventing injury: only 14% to 37% of pregnant women received advice from health care professionals.5 6 7 8 The common reasons for not using a seatbelt have been reported to include discomfort, inconvenience, forgetfulness, and fear of harming the fetus.9
 
In this study, the current practice and knowledge of Hong Kong pregnant women about seatbelt use were surveyed, and any determining factors were identified. The results will enable public health education and promotion to be targeted at at-risk groups to improve road traffic safety among local pregnant women.
 
Methods
Study design
This was a cross-sectional survey using convenience sampling, carried out from January to April 2015. A self-administered questionnaire was distributed to postpartum women in the postnatal ward of United Christian Hospital (UCH) in Hong Kong. Participation in the survey was entirely voluntary.
 
Questionnaires were analysed if at least 50% of questions were answered, including the main outcomes. Those from women who did not understand the content or who did not understand Chinese or English were excluded.
 
Questionnaire
The questionnaire was based on a pilot study, with questions revised after review. It was available in English and Chinese (traditional and simplified) versions and was divided into four parts. The first part included demographic and pregnancy information and driving experience. The second part focused on the practice of seatbelt use before and during pregnancy, any change in habit with progression of pregnancy, and the reason(s) for non-use of a seatbelt. The third part related to awareness and knowledge of the Road Traffic Ordinance on seatbelt use and the correct use of both lap and shoulder belts. Text descriptions and diagrams of different restraint positions were provided; the correct way is to place the lap belt below the abdomen and the shoulder belt diagonally across the chest. The diagrams of restraint positions were adapted, with permission, from the leaflet “Protect your unborn child in a car” by the Transport Department of Hong Kong.10 The final part asked whether the postpartum woman had received any advice about seatbelt use during pregnancy, the source of the information, and whether she thought such information was useful and/or relevant.
 
Statistical analysis
Sample size calculation
Using the results of overseas studies as reference, the sample size was calculated on the assumption that around 80% of Hong Kong pregnant women use a seatbelt. A previous questionnaire survey among postpartum women at a local hospital indicated that a response rate of approximately 80% could be expected.11 We accepted a margin of error of 4% with a confidence level of 95%. Using the formula n = z² × p × (1 − p) / d² (where p = proportion wearing a seatbelt [0.8]; d = margin of error [0.04]; and z value = 1.96), the crude sample size was 384.2, which, after adjustment for the expected 80% response rate, gave a target sample size of 481.
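The calculation can be reproduced in a few lines. The crude estimate is 384.2; the published target of 481 follows if the crude figure is inflated for the anticipated 80% response rate, which is our reading of the “adjusted” step rather than an explicit statement in the paper.

```python
from math import ceil

z, p, d = 1.96, 0.80, 0.04              # 95% confidence, assumed proportion, margin of error
n_crude = z**2 * p * (1 - p) / d**2     # = 384.16
n_adjusted = ceil(n_crude / 0.80)       # inflate for the expected 80% response rate
print(round(n_crude, 1), n_adjusted)    # 384.2 481
```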
 
All statistical analyses were performed using PASW Statistics 18 (Release Version 18.0.0; SPSS Inc, Chicago [IL], US). For categorical data, the Chi squared test was used to compare knowledge about seatbelt use between wearers and non-wearers. For continuous data with a highly skewed distribution, non-parametric tests (Mann-Whitney U test for two groups and Kruskal-Wallis H test for more than two groups) were used to compare knowledge of correct seatbelt use. The knowledge score was calculated from the answers to questions about the Road Traffic Ordinance on seatbelt use and the proper way to use both the lap and shoulder belts, with one point given for each correct answer. The critical level of statistical significance was set at 0.05.
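As an illustration of these tests, the snippet below applies SciPy’s implementations to invented counts and scores; none of the numbers are from the study.

```python
from scipy.stats import chi2_contingency, mannwhitneyu, kruskal

# Hypothetical 2x2 table: correct vs incorrect answers,
# seatbelt wearers vs never-wearers (illustrative counts only).
chi2, p_chi, dof, _ = chi2_contingency([[250, 155], [40, 50]])

# Hypothetical knowledge scores (0-4) for two groups (Mann-Whitney U)
# and three education levels (Kruskal-Wallis H).
u_stat, p_mwu = mannwhitneyu([4, 3, 3, 2, 4], [1, 2, 0, 2, 1])
h_stat, p_kw = kruskal([1, 2, 1], [2, 3, 2], [3, 4, 4])
print(p_chi, p_mwu, p_kw)
```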
 
The relative effects of factors (age, marital status, education level, resident status, husband’s occupation, family monthly income, respondent’s and husband’s driving licence status, frequency of public transport use, and stage of pregnancy) that might influence seatbelt use during pregnancy were estimated using a generalised estimating equation (GEE). The outcome variables were correlated dichotomous responses (eg use of a seatbelt at different gestations) that could not be assumed to be independent; the GEE approach corrected the assessment of statistical significance for this lack of independence.
 
To account for the interdependence of observations, we used robust (GEE) estimates of variance, treating each respondent’s repeated observations as a cluster. Because responses regarding seatbelt use before pregnancy and during each trimester were correlated over time, a GEE model with a working correlation matrix was adopted.12
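A GEE of this kind can be specified, for example, with the statsmodels package. The sketch below is a generic specification under assumed variable names (respondent identifier, binary seatbelt use, licence, education, observation period) and an exchangeable working correlation structure, which we assume because the paper specifies only “a working correlation matrix”; it is not the authors’ actual code.

```python
import numpy as np
import pandas as pd
import statsmodels.api as sm
import statsmodels.formula.api as smf

# Hypothetical long-format data: one row per woman per period
# (pre-pregnancy plus each trimester); all values are simulated.
rng = np.random.default_rng(0)
df = pd.DataFrame({
    "respondent": np.repeat(np.arange(100), 4),
    "period": np.tile(np.arange(4), 100),
    "licence": np.repeat(rng.binomial(1, 0.15, 100), 4),
    "tertiary": np.repeat(rng.binomial(1, 0.38, 100), 4),
})
df["seatbelt"] = rng.binomial(1, 0.7, len(df))

# Binomial GEE clustering repeated observations within each respondent.
model = smf.gee("seatbelt ~ licence + tertiary + period",
                groups="respondent", data=df,
                family=sm.families.Binomial(),
                cov_struct=sm.cov_struct.Exchangeable())
result = model.fit()
print(np.exp(result.params))   # exponentiated coefficients = odds ratios
```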
 
Results
Demographic data
There were 769 postpartum women in the postnatal ward during the study period. A total of 550 questionnaires were distributed by convenience sampling; 501 were returned, giving a response rate of 91%. The remaining women (n=49, 9%) either refused to participate or did not return the questionnaire. Among the returned questionnaires, six were excluded because information on the main outcomes was missing or the questionnaire was <50% complete. At the end of the recruitment period, 495 (90%) questionnaires were valid for analysis.
 
The majority (93.5%) of respondents were aged between 21 and 40 years. Only 10 (2%) were English speakers; the others (98%) spoke Cantonese or Mandarin as their first language and completed the Chinese questionnaire. With regard to education level, 188 (38%) women had received tertiary education or above, 290 (58.6%) secondary education, and 14 (2.8%) primary education. There was no existing information about any association between pregnant women’s or spousal occupation and compliance with or knowledge about seatbelt use, so we investigated whether occupation, for example, driver or health care worker, was a relevant factor. Just under half (n=216, 43.6%) of the women were housewives, 57 (11.6%) were professionals, and 14 (2.8%) were medical health care workers. Among spouses, 32 (6.5%) were drivers, two (0.4%) were medical health care workers, and 122 (24.6%) were professionals. Other occupations were unrelated to transportation or health care, including clerk, construction site worker, restaurant waiter, and chef. Overall, 439 (88.7%) women were Hong Kong residents; the others were new immigrants or double-entry permit holders from Mainland China. Of the respondents, 477 (96.4%) women had attended regular antenatal check-ups, and 215 (43.4%) were first-time mothers.
 
Driving experience and mode of transport
Around half of the spouses (49.1%) but only 71 (14.3%) of the women held a Hong Kong driving licence. Among women with a driving licence, only 16 (22.5%) drove daily and seven (9.9%) only at weekends. Public transport was used daily by 300 (60.6%) women. Among the different means of public transport, buses (53.7%) were the most commonly used, although not all bus seats have seatbelts. In public light buses and taxis, use of a seatbelt, if available, is mandatory: 38.6% and 15.2% of respondents used public light buses and taxis, respectively.
 
Use of a seatbelt before and during pregnancy
Of the respondents, 379 (76.6%) reported using a seatbelt in the 6 months before pregnancy, but compliance declined as pregnancy progressed: to 73.5% in the first trimester, 70.5% in the second trimester, and 67.1% in the third trimester (Table 1). There were 26 women who changed from not wearing a seatbelt before pregnancy to wearing one after they became pregnant; the total number of ever seatbelt users was therefore 405. Analysis of the knowledge score excluding these 26 women showed similar findings and statistical significance.
 

Table 1. Use of seatbelt before and during pregnancy
 
Reasons for not using a seatbelt during pregnancy
With regard to the reasons for not using a seatbelt at any time during pregnancy, 156 (89.1%) of 175 women stated that the seatbelt caused discomfort, 22 (12.6%) thought seatbelts were not useful, and 79 (45.1%) worried that they would cause harm to the fetus (Table 2). Apart from the three stated options in the questionnaire, several respondents stated that the travelling distance was usually short on public light buses and the time taken to buckle up and unfasten the seatbelt may delay other passengers. Other women admitted to being lazy or forgetful, or were just not in the habit of using a seatbelt. They also found seatbelts inconvenient because those on public transport were “not user-friendly”, “too short”, or were “dirty” (Table 2).
 

Table 2. Reasons for not using a seatbelt
 
Knowledge of seatbelt use during pregnancy
Of the respondents, 216 (43.6%) correctly answered that pregnant women are not exempt from seatbelt use under the Road Traffic Ordinance; the remaining 56.4% either answered incorrectly or did not know the answer. Approximately 52.7% of women correctly indicated that appropriate use of a seatbelt will not harm the fetus. Although around half of the women wrongly believed that pregnant women are exempt from seatbelt legislation or that use of a seatbelt will harm the fetus, 358 (72.3%) stated that pregnant women should wear a seatbelt. When three-point seatbelts were shown in the diagrams, 403 (81.4%) women could identify the correct way of wearing the seatbelt, with the lap strap placed below the bump, not over it (Table 3).
 

Table 3. Knowledge of seatbelt use (seatbelt users vs non-users)
 
Among all the respondents, 90 (18.2%) women never wore a seatbelt, and the other 405 (81.8%) were seatbelt users either before or during pregnancy. Comparison of responses revealed that never-wearers had significantly poorer knowledge in three of the four questions about seatbelt use during pregnancy (P<0.05) [Table 3].
 
Information about seatbelt use during pregnancy
Information about seatbelt use had been received by only 32 (6.5%) women. Among them, 13 (40.6%) had obtained the information from the internet; others cited staff of government and private clinics, magazines, and Transport Department publications. Seven (21.9%) received information from friends or family members; one had a car accident during pregnancy and was given relevant information by health care workers at the Accident and Emergency Department. Most (n=426, 86%) women expressed the view that information about seatbelt use during pregnancy was useful and necessary.
 
Factors influencing use of seatbelt during pregnancy
Among all potential factors, women who held a driving licence (odds ratio [OR]=3.28; P=0.004) or had a higher level of education (OR=2.13; P<0.001) were more likely to use a seatbelt. Considering time as another variable, as pregnancy progressed women were significantly less likely to use a seatbelt (OR=0.84; P<0.001) [Table 4].
 

Table 4. Determining factors influencing use of a seatbelt before and during pregnancy
 
Factors influencing knowledge about correct seatbelt use
Women with a lower education level (P<0.001) were less aware of the Road Traffic Ordinance on seatbelt use, the protective effects of a seatbelt during pregnancy, and the correct way to position both the lap and shoulder belts (Table 5).
 

Table 5. Determining factors influencing knowledge score for correct seatbelt use
 
Discussion
Main findings
In this study, 76.6% of Hong Kong pregnant women were consistent seatbelt wearers before pregnancy, similar to overseas studies that reported 70% to 80%.5 7 Compliance was reduced in all trimesters and decreased further as gestation progressed. Only 26 women changed from non-users to users after becoming pregnant. The survey also demonstrated misconceptions about the effects of seatbelt use on pregnancy and the fetus. Pregnant women’s knowledge about seatbelt use was inadequate and only a minority had received relevant information. Women who held a driving licence or had a higher education level were more likely to wear a seatbelt before and during pregnancy. Women with tertiary education or above were more knowledgeable about seatbelt use.
 
Strengths and limitations
As far as we know, this is the first survey in Hong Kong of pregnant women’s knowledge about seatbelt use and their associated practice, and it achieved a reasonably high response rate. One limitation of the study was that the questionnaire was not validated and there were overlapping categories for numerical variables. The results and experience from this study can serve to revise the questions for a future study with improved validity and reliability. During the study period, 769 postpartum women stayed in the postnatal ward and 495 (64%) completed questionnaires were collected. Although this proportion was relatively high, convenience sampling may still have affected the representativeness of the sampled subjects. Moreover, this was a single-centre survey in the obstetric unit of a district hospital. The UCH provides obstetric services to the population of the Kowloon East region, and the geographical location of a clinic may influence the mode of transport used to attend antenatal hospital appointments. Although taxis and public light buses are the usual modes of transport, some women may have taken the bus or Mass Transit Railway, and these do not require use of a seatbelt. Furthermore, deliveries at UCH accounted for less than 10% of all deliveries in Hong Kong, so the results may not be applicable to other clusters with patients of different education levels, driving experience, and transportation habits.
 
In addition, women who were unable to read or understand Chinese or English were excluded. These were usually illiterate women or non–Hong Kong residents, and may represent the group with the lowest compliance and poorest knowledge about seatbelt use. There were 49 women who refused to participate and six who did not complete the questionnaire; this 10% also introduced inaccuracy and bias into our data. Reporting bias is another concern: discrepancies between observed and self-reported seatbelt use were found in a previous study.13 The anonymity of the questionnaires might have minimised reporting bias. Although all demographic variables included in the questionnaire were analysed, there were other potential confounders that might have affected the knowledge score and the use of a seatbelt during pregnancy, for example, prior traffic accidents involving the respondents or their family members, and risk-taking behaviours such as smoking, alcohol drinking, and drug use; these were not investigated and hence not adequately adjusted for in the knowledge score analysis or the GEE model. Finally, multivariate rather than univariate analysis of the factors affecting knowledge score could have been performed to investigate relationships among the different variables.
 
Interpretation
Prevention plays a major role in ensuring maternal and fetal survival in road traffic accidents. Motor vehicle crashes are responsible for severe maternal injury and fetal loss. Despite existing knowledge about the protective effects of wearing a seatbelt, pregnant women remain poorly compliant. This was confirmed in this local survey and in overseas studies.14 15
 
In the Report on Confidential Enquiries into Maternal Deaths in the United Kingdom 1994-1996 published by the Royal College of Obstetricians and Gynaecologists, 13 pregnant women died as a result of road traffic accidents. One of the victims did not use a seatbelt and was forcibly ejected from the vehicle.16 Ten years later, in a more recent Report on Confidential Enquiries into Maternal Deaths in the United Kingdom 2006-2008,17 there were 17 pregnant women who died as a result of road traffic accidents. A specific recommendation was made in the report: “All women should be advised to wear a 3-point seat belt throughout pregnancy, with the lap strap placed as low as possible beneath the ‘bump’ lying across the thighs and the diagonal shoulder strap above the ‘bump’ lying between the breasts. The seat belt should be adjusted to fit as snugly and comfortably as possible, and if necessary the seat should be adjusted”.17
 
According to the Road Traffic Ordinance in Hong Kong, drivers and passengers must wear seatbelts where provided. The exceptions are when reversing a vehicle, making a three-point turn, manoeuvring in and out of a parking place, and those who have a medical certificate and have been granted an exemption on medical grounds by the Commissioner for Transport.18 According to a report of the Transport Department of Hong Kong, the total number of road traffic accidents was 14 436 in 2003. In 2013, the number rose to 16 089. The number of pregnant women involved or injured in road traffic accidents is unknown.19 The Hong Kong SAR Government revises seatbelt legislation regularly to enhance road safety. Since 1 January 2001, passengers have been required to wear a seatbelt, if available, in the rear of taxis as well as in the front. Since 1 August 2004, passengers on public light buses have also been required to wear a seatbelt where one is fitted.20 21 Stickers were put inside buses and taxis to remind passengers of their responsibility to wear a seatbelt and to give clear instructions on the correct way to wear it. Nonetheless, the requirement to use a seatbelt and its protective effects were not well recognised among the respondents in this survey. This may be due to the lack of information provision as only 6.5% of women had received information related to seatbelt use in pregnancy.
 
In this study, those with a lower education level had poorer knowledge about seatbelt use in pregnancy, and effective public education should target these women. Diagram-based instruction is simple and direct, so that women with a lower education level or who use public transport only occasionally can easily understand and follow the advice. In the past, leaflets or stickers about seatbelt use were widely seen, especially after introduction of the new legislation, but material specifically targeted at the pregnant population was not common. Maternal and child health centres and antenatal clinics of government hospitals are ideal places to distribute educational material. Television announcements may also convey the message effectively, not only to pregnant women but to all road traffic users. This is also a good opportunity to inform drivers and other passengers so that they can help pregnant women, as well as the elderly and disabled, who use public transport. Regular spot-checks on public transport and law enforcement may also encourage compliance with seatbelt use. The majority of doctors and midwives give advice about seatbelt use only if asked, and this survey demonstrated that the proportion of pregnant women who received seatbelt information was very small. It is recommended that written instructions and advice should be available from well-informed health care professionals, and pregnant women should always be encouraged to wear a correctly positioned seatbelt. Obstetricians, midwives, and general practitioners play an important role in disseminating information. A study in Ireland showed that 75% of general practitioners believed women should wear seatbelts in the third trimester, although only 30% provided regular advice and fewer than 50% indicated that they were aware of the correct advice to give.22
 
Conclusions
This study demonstrated reduced compliance with seatbelt use during pregnancy, with compliance declining further as pregnancy progressed. Women with a lower education level or without a driving licence were less likely to use a seatbelt during pregnancy. The former were also less aware of the Road Traffic Ordinance on seatbelt use and the correct way to position both the lap and shoulder belts. Only a minority of pregnant women had received information about seatbelt use. Future studies assessing the knowledge of Hong Kong health care workers about seatbelt use in pregnancy may enhance the awareness and involvement of medical professionals in educating pregnant women on this issue. Publicity and education about road safety by health care providers and the government are advised; targeting groups with lower compliance may be more effective and successful.
 
Acknowledgements
The authors gratefully acknowledge Mr Edward Choi for his valuable statistical advice, the staff in the postnatal ward of UCH for helping to collect the questionnaires, and the Transport Department of Hong Kong for permission to use the diagrams of restraint positions adapted from the leaflet “Protect your unborn child in a car” on the questionnaires.
 
Declaration
All authors have disclosed no conflicts of interest.
 
References
1. Hyde LK, Cook LJ, Olson LM, Weiss HB, Dean JM. Effect of motor vehicle crashes on adverse fetal outcomes. Obstet Gynecol 2003;102:279-86.
2. Wolf ME, Alexander BH, Rivara FP, Hickok DE, Maier RV, Starzyk PM. A retrospective cohort study of seatbelt use and pregnancy outcome after a motor vehicle crash. J Trauma 1993;34:116-9.
3. Klinich KD, Schneider LW, Moore JL, Pearlman MD. Injuries to pregnant occupants in automotive crashes. Annu Proc Assoc Adv Automot Med 1998;42:57-91.
4. Bunai Y, Nagai A, Nakamura I, Ohya I. Fetal death from abruptio placentae associated with incorrect use of a seatbelt. Am J Forensic Med Pathol 2000;21:207-9.
5. Jamjute P, Eedarapalli P, Jain S. Awareness of correct use of a seatbelt among pregnant women and health professionals: a multicentric survey. J Obstet Gynaecol 2005;25:550-3.
6. Johnson HC, Pring DW. Car seatbelts in pregnancy: the practice and knowledge of pregnant women remain causes for concern. BJOG 2000;107:644-7.
7. Ichikawa M, Nakahara S, Okubo T, Wakai S. Car seatbelt use during pregnancy in Japan: determinants and policy implications. Inj Prev 2003;9:169-72.
8. Taylor AJ, McGwin G Jr, Sharp CE, et al. Seatbelt use during pregnancy: a comparison of women in two prenatal care settings. Matern Child Health J 2005;9:173-9.
9. Weiss H, Sirin H, Levine JA, Sauber E. International survey of seat belt use exemptions. Inj Prev 2006;12:258-61.
10. Transport Department, The Government of the Hong Kong Special Administrative Region. Protect your unborn child in a car. Available from: http://www.td.gov.hk/filemanager/en/content_174/belt-e.pdf. Accessed Aug 2016.
11. Yu CH, Chan LW, Lam WC, To WK. Pregnant women’s knowledge and consumption of long-chain omega-3 polyunsaturated fatty acid supplements. Hong Kong J Gynaecol Obstet Midwifery 2014;14:57-63.
12. Liang KY, Zeger SL. Longitudinal data analysis using generalized linear models. Biometrika 1986;73:13-22.
13. Robertson LS. The validity of self-reported behavioral risk factors: seatbelt and alcohol use. J Trauma 1992;32:58-9.
14. Luley T, Fitzpatrick CB, Grotegut CA, Hocker MB, Myers ER, Brown HL. Perinatal implications of motor vehicle accident trauma during pregnancy: identifying populations at risk. Am J Obstet Gynecol 2013;208:466.e1-5.
15. Grossman NB. Blunt trauma in pregnancy. Am Fam Physician 2004;70:1303-10.
16. Chapter 13: Fortuitous deaths. In: Why mothers die: report on confidential enquiries into maternal deaths in the United Kingdom 1994-1996. London: Royal College of Obstetricians and Gynaecologists Press; 2001.
17. Cantwell R, Clutton-Brock T, Cooper G, et al. Saving Mothers’ Lives: Reviewing maternal deaths to make motherhood safer: 2006-2008. The Eighth Report of the Confidential Enquiries into Maternal Deaths in the United Kingdom. BJOG 2011;118 Suppl 1:1-203.
18. Transport Department, The Government of the Hong Kong Special Administrative Region. Be Smart, buckle up. Available from: http://www.td.gov.hk/filemanager/en/content_174/seatbelt_leaflet.pdf. Accessed Aug 2016.
19. Transport Department, The Government of the Hong Kong Special Administrative Region. Road Traffic Accident Statistics Year 2013. Available from: http://www.td.gov.hk/en/road_safety/road_traffic_accident_statistics/2013/index.html. Accessed Aug 2016.
20. Transport Department, The Government of the Hong Kong Special Administrative Region. Seat belt: safe motoring guides. Available from: http://www.td.gov.hk/en/road_safety/safe_motoring_guides/seat_belt/index.html. Accessed Aug 2016.
21. Transport Department, The Government of the Hong Kong Special Administrative Region. Road Safety Bulletin; March 2001. Available from: http://www.td.gov.hk/filemanager/en/content_182/rs_bulletin_04.pdf. Accessed Aug 2016.
22. Wallace C. General practitioners knowledge of and attitudes to the use of seat belts in pregnancy. Ir Med J 1997;90:63-4.

Primary ventriculoperitoneal shunting outcomes: a multicentre clinical audit for shunt infection and its risk factors

Hong Kong Med J 2016 Oct;22(5):410–9 | Epub 26 Aug 2016
DOI: 10.12809/hkmj154735
© Hong Kong Academy of Medicine. CC BY-NC-ND 4.0
 
ORIGINAL ARTICLE
Primary ventriculoperitoneal shunting outcomes: a multicentre clinical audit for shunt infection and its risk factors
Working Group on Neurosurgical Outcomes Monitoring; Peter YM Woo, MMedSc1; HT Wong, FRCSEd (SN)1; Jenny KS Pu, FRCSEd (SN)2; WK Wong, FRCSEd (SN)3; Larry YW Wong, FRCSEd (SN)4; Michael WY Lee, FRCSEd (SN)5; KY Yam, FRCSEd (SN)6; WM Lui, FRCSEd (SN)2; WS Poon, FRCSEd (SN)7
1 Department of Neurosurgery, Kwong Wah Hospital, Yaumatei, Hong Kong
2 Division of Neurosurgery, Department of Surgery, Queen Mary Hospital, Pokfulam, Hong Kong
3 Department of Neurosurgery, Princess Margaret Hospital, Laichikok, Hong Kong
4 Department of Neurosurgery, Queen Elizabeth Hospital, Jordan, Hong Kong
5 Department of Neurosurgery, Pamela Youde Nethersole Eastern Hospital, Chai Wan, Hong Kong
6 Department of Neurosurgery, Tuen Mun Hospital, Tuen Mun, Hong Kong
7 Division of Neurosurgery, Department of Surgery, Prince of Wales Hospital, Shatin, Hong Kong
 
Corresponding author: Dr Peter YM Woo (wym307@ha.org.hk)
 
This paper was presented at the 21st Annual Scientific Meeting of the Hong Kong Neurosurgical Society, 6 December 2014, Hong Kong.
 
 
Abstract
Objectives: To determine the frequency of primary ventriculoperitoneal shunt infection among patients treated at neurosurgical centres of the Hospital Authority and to identify underlying risk factors.
 
Methods: This multicentre historical cohort study included consecutive patients who underwent primary ventriculoperitoneal shunting at a Hospital Authority neurosurgery centre from 1 January 2009 to 31 December 2011. The primary endpoint was shunt infection, defined as: (1) the presence of cerebrospinal fluid or shunt hardware culture that yielded the pathogenic micro-organism with associated compatible symptoms and signs of central nervous system infection or shunt malfunction; or (2) surgical incision site infection requiring shunt reinsertion (even in the absence of positive culture); or (3) intraperitoneal pseudocyst formation (even in the absence of positive culture). Secondary endpoints were shunt malfunction, defined as unsatisfactory cerebrospinal fluid drainage that required shunt reinsertion, and 30-day mortality.
 
Results: A primary ventriculoperitoneal shunt was inserted in 538 patients during the study period. The mean age of patients was 48 years (range, 13-88 years) with a male-to-female ratio of 1:1. Aneurysmal subarachnoid haemorrhage was the most common aetiology (n=169, 31%) followed by intracranial tumour (n=164, 30%), central nervous system infection (n=42, 8%), and traumatic brain injury (n=27, 5%). The mean operating time was 75 (standard deviation, 29) minutes. Shunt reinsertion and infection rates were 16% (n=87) and 7% (n=36), respectively. The most common cause for shunt reinsertion was malfunction followed by shunt infection. Independent predictors for shunt infection were: traumatic brain injury (adjusted odds ratio=6.2; 95% confidence interval, 2.3-16.8), emergency shunting (2.3; 1.0-5.1), and prophylactic vancomycin as the sole antibiotic (3.4; 1.1-11.0). The 30-day all-cause mortality was 6% and none were directly procedure-related.
 
Conclusions: This is the first Hong Kong territory-wide review of infection in primary ventriculoperitoneal shunts. Although the ventriculoperitoneal shunt infection rate met international standards, there are areas for improvement, such as vancomycin administration and the avoidance of scheduling the procedure as an emergency.
 
 
New knowledge added by this study
  • The local rate of infection in ventriculoperitoneal (VP) shunts meets international standards.
  • Vancomycin antibiotic prophylaxis was identified as a risk factor for shunt infection, a novel finding.
  • Insertion of a VP shunt as an emergency procedure is an independent, and readily modifiable, risk factor for infection.
Implications for clinical practice or policy
  • There is a need to review prophylactic vancomycin administration in terms of timing, dosage, and the need for its combination with another antibiotic.
  • Emergency VP shunting is not recommended. Shunts should be implanted whenever possible as an elective procedure.
  • A comprehensive local shunt surgery protocol to reduce the risk of shunt infection is recommended.
 
 
Introduction
Ventriculoperitoneal (VP) shunting is one of the most common neurosurgical procedures performed to treat patients with hydrocephalus, a disorder related to an abnormal accumulation of cerebrospinal fluid (CSF) in the brain. The operation involves diverting CSF from the ventricles of the brain to the peritoneal cavity of the abdomen by catheter implantation. Despite being a well-established procedure, the shunt failure rate can be as high as 70% in the first year, with an annual occurrence rate of 5% thereafter.1 One of the main causes of failure is shunt infection, a potentially debilitating complication that more than doubles the risk of death and exposes affected patients to 3 times as many neurosurgical procedures as non-infected patients.2 The incidence of shunt infection varies, occurring in 3% to 17% of patients. Standard management involves intravenous antibiotic therapy, shunt removal, insertion of an external ventricular drain, and replacement with a new shunt once the patient's CSF is free of microbial infection.1 3 4 5 6 The economic impact of VP shunt infection can be considerable. In the US, the median cost per episode per patient has been reported as US$23 500, accounting for US$2.0 billion in annual hospital charges.7 8 Evidence suggests that the adoption of a strict institutional implantation protocol can significantly reduce the risk of this most challenging shunt complication.9 10 11 12 This retrospective study aimed to determine the frequency of primary VP shunt reinsertion and infection among patients treated in Hong Kong's public health system and to identify risk factors for shunt infection.
 
Methods
This was a multicentre retrospective study of patients who underwent VP shunt implantation at all seven Hong Kong Hospital Authority neurosurgical units. The Hospital Authority is a public service highly subsidised by the Hong Kong Special Administrative Region Government, and responsible for delivering health care for 90% of inpatient bed days in the city.13 Clinical research ethics committee approval was obtained from the participating centres. Patients who underwent primary VP shunting from 1 January 2009 to 31 December 2011 were included in this study. Those who underwent alternative CSF diversion procedures or those with a history of VP shunt implantation were excluded from this review. Data from clinical records, operation notes, medication-dispensing records, CSF biochemistry, cell counts, and microbiological cultures were collected. The primary endpoint for this study was primary VP shunt infection. The criteria for shunt infection were: (1) CSF or shunt hardware culture that yielded the pathogenic micro-organism with associated compatible symptoms and signs of central nervous system (CNS) infection or shunt malfunction5 14 15; or (2) surgical incision site infection, as defined by the National Nosocomial Infection Surveillance System, requiring shunt reinsertion (even in the absence of a positive culture)16; or (3) intraperitoneal pseudocyst formation (even in the absence of a positive culture). Secondary endpoints were shunt malfunction, defined as unsatisfactory CSF drainage that required shunt reinsertion, and 30-day mortality. Potential risk factors for shunt infection were classified as patient-, disease-, or surgical-related factors. All subjects were followed up for at least 30 days from the operation date or until death.
 
Statistical analysis was carried out using Pearson's chi-squared test, Fisher's exact test, and binary logistic regression to identify risk factors for shunt infection. The Kaplan-Meier method (with log-rank test) and Cox proportional hazards models were employed for survival analysis. Patient, disease, and surgical factors were used as covariates and a stepwise regression strategy was adopted (Table 1). P values of <0.05 were considered statistically significant. All tests were performed using the Statistical Package for the Social Sciences (Windows version 16.0.1; SPSS Inc, Chicago [IL], US).
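For readers who wish to replicate this type of analysis outside SPSS, the sketch below shows the same general workflow (multivariable binary logistic regression yielding adjusted odds ratios) in Python. It is illustrative only: the column names and synthetic data are hypothetical, not drawn from the study records.

```python
# Illustrative sketch of the regression workflow; the study itself used
# SPSS. All data below are simulated and all column names hypothetical.
import numpy as np
import pandas as pd
import statsmodels.api as sm

rng = np.random.default_rng(0)
n = 538  # cohort size taken from the paper; everything else is simulated
df = pd.DataFrame({
    "tbi": rng.binomial(1, 0.05, n),
    "emergency": rng.binomial(1, 0.58, n),
    "vancomycin": rng.binomial(1, 0.07, n),
})
# Simulate infection with log-odds loosely echoing the reported effects
logit_p = -3.2 + 1.8*df["tbi"] + 0.8*df["emergency"] + 1.2*df["vancomycin"]
df["infection"] = rng.binomial(1, 1 / (1 + np.exp(-logit_p)))

# Multivariable binary logistic regression for shunt infection
X = sm.add_constant(df[["tbi", "emergency", "vancomycin"]])
fit = sm.Logit(df["infection"], X).fit(disp=False)

# Adjusted odds ratios with Wald 95% confidence intervals
table = pd.concat([np.exp(fit.params), np.exp(fit.conf_int())], axis=1)
table.columns = ["adjusted OR", "2.5%", "97.5%"]
print(table)
```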
 

Table 1. Clinical characteristics of patients with primary ventriculoperitoneal shunt and univariate logistic regression for shunt infection
 
Results
During the 3-year period, 538 patients underwent primary VP shunt implantation and 87% (n=470) had complete clinical follow-up with a median duration of 37 months (range, 3 days to 76 months). Seven (1%) patients were transferred to other hospitals within 30 days of the procedure. The median duration of hospitalisation was 42 days (range, 3 days to 36 months) and the median length of time from admission to shunting was 18 days (range, <1 day to 21 months). The clinical features and surgical variables are presented in Table 1. The mean (± standard deviation) age of patients was 48 ± 13 years (range, 13-88 years) and the male-to-female ratio was 1:1. In the study group, 80 (15%) were paediatric patients and 48 (9%) were infants. Overall, primary VP shunting was performed for post-aneurysmal subarachnoid haemorrhage communicating hydrocephalus in 169 (31%) patients, for CNS neoplasms in 164 (30%) patients, and for spontaneous intracerebral or intraventricular haemorrhage in 64 (12%) patients. For patients who had preoperative CSF sampling performed, the mean red blood cell count was 1900/µL, white cell count was 17/µL, total protein level was 0.78 g/L, and glucose level was 3.6 mmol/L.
 
Over one quarter of patients (n=155, 29%) had never undergone prior cranial neurosurgery and approaching half had undergone either one (n=141, 26%) or two (n=115, 21%) previous procedures. Antiseptic skin preparation was with povidone-iodine 10% combined with another antiseptic in 422 (78%) patients and with povidone-iodine alone in the remainder. The mean operating time for VP shunting was 75 ± 29 minutes. All patients received antibiotic prophylaxis, of whom 328 (61%) were prescribed a third-generation cephalosporin and 40 (7%) vancomycin. Twelve (2%) patients had a rifampicin-clindamycin antibiotic-impregnated ventricular catheter as part of the shunt system. The majority of operations were performed in an emergency setting (n=312, 58%) and shunt implantation was the sole procedure performed (n=514, 96%). The burr hole was most frequently positioned at the parietal location, in 320 (59%) patients, and 135 (25%) had a frontal burr hole. New burr holes were fashioned for shunt placement 95% of the time. The median number of surgeons was two, with a third of shunts performed by higher neurosurgical trainees (n=174, 32%) and the remainder by a neurosurgical specialist. Almost three quarters of VP shunts had a fixed-pressure valve (n=390, 72%) and the predominant design was the Integra Pudenz flushing valve (Integra LifeSciences Corporation, Plainsboro [NJ], US), used in 324 (60%) patients.
 
The rate of VP shunt reinsertion was 16% (n=87) and that of infection was 7% (n=36). The main cause for reinsertion was malfunction (9%), followed by infection. The annual proportion of shunts that required reoperation or were infected was comparable (P=0.87) [Fig 1]. The median time from shunt implantation to shunt removal for infection was 64 days (range, 2 days to 10 months). The cumulative risk of infection was 3% within the first 30 days, 6% at 6 months, and 7% at 1 year. Although 68 (13%) patients were lost to follow-up, attrition analysis revealed that this did not affect infection rates. The mean follow-up duration in this subgroup was comparable between those with infection and those without, at 526 days and 554 days, respectively (P=0.43). In addition, the incidence of shunt infection in patients with incomplete follow-up (5%) was similar to that in those with complete follow-up (7%) [P=0.42].
 

Figure 1. Comparison of the total number of primary ventriculoperitoneal (VP) shunts performed from 2009 to 2011 with the number of shunt reoperations and infected shunts
 
Most infections manifested as meningitis or ventriculitis (n=19, 53%), followed by wound breakdown (n=15, 42%) and peritonitis (n=2, 6%). The most common causative bacteria were coagulase-negative staphylococci (CoNS) [n=25, 69%] of which methicillin resistance was detected in 19 (76%) patients (Table 2). All CoNS species were sensitive to vancomycin with a quarter of methicillin-resistant (MR) species susceptible to aminoglycosides such as gentamicin or amikacin. The second most common infective agent affecting four (11%) patients was MR Staphylococcus aureus (MRSA). Polymicrobial infection was evident in six (17%) patients. One patient with peritonitis had mixed Gram-positive and -negative micro-organisms from CSF cultures.
 

Table 2. Micro-organisms cultured from cerebrospinal fluid or shunt hardware from the 36 infected cases with antibiotic sensitivity distribution
 
The only patient-related risk factor for shunt infection was sex (Table 1). Male patients had a greater than two-fold increase in the odds of infection (odds ratio [OR]=2.2; 95% confidence interval [CI], 1.1-4.5). Traumatic brain injury (TBI) was the only disease-related risk factor (OR=7.8; 95% CI, 2.9-18.1). Surgical factors included the use of vancomycin as the sole prophylactic antibiotic (OR=3.7; 95% CI, 1.3-10.5) and shunts implanted as an emergency procedure (OR=2.2; 95% CI, 1.0-4.7). After adjustment for confounding factors, the independent risk factors for primary VP shunt infection were TBI (adjusted OR=6.2; 95% CI, 2.3-16.8), the use of vancomycin (adjusted OR=3.4; 95% CI, 1.1-11.0), and emergency shunting (adjusted OR=2.3; 95% CI, 1.0-5.1) [Table 3].
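The univariate odds ratios above can be reproduced from simple 2×2 tables. A worked sketch follows, with hypothetical cell counts chosen only to be of similar magnitude to the TBI result (the true counts are in Table 1):

```python
# Unadjusted odds ratio with a Wald 95% CI from a 2x2 table.
# Cell counts are hypothetical, not the study's actual figures.
import math

a, b = 9, 18     # exposed (eg TBI): infected / not infected
c, d = 27, 484   # unexposed: infected / not infected

or_ = (a * d) / (b * c)
se_log_or = math.sqrt(1/a + 1/b + 1/c + 1/d)
lo = math.exp(math.log(or_) - 1.96 * se_log_or)
hi = math.exp(math.log(or_) + 1.96 * se_log_or)
print(f"OR = {or_:.1f} (95% CI, {lo:.1f}-{hi:.1f})")
```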
 

Table 3. Independent predictors for primary ventriculoperitoneal shunt infection
 
With respect to shunt infection, there was a significant difference in the duration of shunt survival for patients with the aforementioned risk factors, as demonstrated by the separation of the Kaplan-Meier survival curves (Fig 2) and confirmed by Cox regression analysis (Table 3). Median shunt survival was 35 days for trauma patients (vs 154 days in non-trauma patients), 32 days for patients who received vancomycin (vs 124 days for alternative antibiotics), and 65 days for emergency operations (vs 208 days for elective operations).
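The survival comparison can likewise be sketched with the lifelines library in Python; the event times below are simulated for illustration and are not the study data.

```python
# Kaplan-Meier shunt survival with a log-rank comparison (lifelines).
# Synthetic event times; outputs will only roughly echo the paper's.
import numpy as np
from lifelines import KaplanMeierFitter
from lifelines.statistics import logrank_test

rng = np.random.default_rng(1)
t_emergency = rng.exponential(95, 312)    # days from implantation to infection
t_elective = rng.exponential(300, 226)
e_emergency = np.ones_like(t_emergency)   # 1 = event observed (no censoring)
e_elective = np.ones_like(t_elective)

kmf = KaplanMeierFitter()
kmf.fit(t_emergency, e_emergency, label="emergency")
print("Median shunt survival (emergency):", kmf.median_survival_time_)

res = logrank_test(t_emergency, t_elective, e_emergency, e_elective)
print("Log-rank P value:", res.p_value)
```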
 

Figure 2. Kaplan-Meier shunt survival analysis
Patients with traumatic brain injury, those administered vancomycin as the sole prophylactic antibiotic, and those whose shunt was implanted as an emergency procedure were more likely to experience shunt infection (P<0.05, log-rank test)
 
In this study, 30-day all-cause mortality was 6% (n=32), but none was directly procedure-related. Almost half of these patients (n=15, 47%) had an underlying malignant CNS tumour; the majority being brain metastases (n=12, 80%). After accounting for patient age, sex, disease aetiology, shunt reinsertion and infection, a diagnosis of malignant brain tumour was the only significant independent predictor for 30-day mortality with an adjusted OR of 5.6 (95% CI, 2.6-11.7).
 
Discussion
Ventriculoperitoneal CSF shunting has considerably reduced the morbidity and mortality of patients with hydrocephalus since its first description in 1908.17 More than a century later, the operation remains the mainstay of treatment for this condition. Despite the introduction of antibiotics and improvements in shunt materials and surgical techniques, VP shunt complications remain common. Long-term epidemiological studies have indicated that more than half of all patients with CSF shunts will require a surgical revision in their lifetime.4 18 Shunt infection is a serious complication with potentially devastating consequences. Observational studies have recorded infection rates ranging between 3% and 17%, but more consistent estimates from larger patient cohorts cite rates of 6% to 8%.1 3 4 5 Our local shunt infection rate of 7% is lower than many previously published figures and in keeping with the results from other developed countries.
 
The wide range of shunt infection rates quoted in the literature is due in part to the diverse definitions adopted for shunt infection and the patient populations studied. Many studies have defined infection as a positive CSF microbial culture or the presence of CSF pleocytosis or low CSF glucose levels with clinical features of CNS infection.3 11 19 Due to the study design, a more pragmatic definition was adopted whereby infection was determined retrospectively by either a positive CSF or shunt hardware microbial culture in the presence of shunt malfunction.5 14 15 Nonetheless, it is acknowledged that the true incidence of shunt infection may be overestimated by false-positive cultures from skin flora. The reasons for selecting this interpretation for shunt infection were three-fold. First, it allowed for micro-organism identification and consequent epidemiological analysis; second, infection may not be clinically apparent with malfunctioning shunts; and third, CSF cultures alone cannot exclude infection in cases of shunt malfunction.15
 
The only disease-related risk factor independently associated with VP shunt infection was TBI. There may be two reasons for this. First, delayed post-traumatic hydrocephalus commonly occurs in patients with severe TBI and develops in over a third of those subjected to decompressive craniectomy.20 Such patients often have a prolonged hospital stay and undergo multiple operations before a shunt is eventually implanted. In this cohort, TBI patients had a mean duration of hospitalisation of 89 days, 2 weeks more than the mean stay of 74 days for hydrocephalic patients with other neurosurgical conditions. Protracted hospitalisation may lead to skin colonisation with drug-resistant organisms that can evade single-agent conventional antibiotic prophylaxis.21 This is supported by evidence from this study: the causative bacteria of shunt infection were resistant to the prescribed prophylactic antibiotic in 80% of TBI patients. Second, these patients had undergone a mean of three prior cranial procedures before shunting, compared with two operations in patients with non-traumatic hydrocephalus. Previous surgery is well known to be a main cause of CSF leak in shunted patients and contributes to an increased risk of infection.5 22 Although the number of prior cranial procedures per se did not impart greater risk in this patient series, detailed clinical data regarding CSF leak in TBI patients were not collected and the influence of TBI on infection can therefore only be inferred.
 
An unexpected finding was that patient age was not a risk factor for shunt infection. This is in contrast to several larger studies that identified paediatric patients, especially infants (younger than 1 year), as being particularly at risk.11 23 Infants have less-developed humoral and cellular immune systems and immature skin, rendering them more vulnerable to shunt infection. A likely reason for this observation is the small number of paediatric patients (n=80, 15%) in this cohort, with only 9% (n=48) being infants. A larger sample size might delineate more distinct differences among age-groups.
 
There is little doubt that systemic antibiotic prophylaxis can prevent shunt infection.24 Interestingly, however, we identified the sole use of vancomycin as a risk factor for shunt infection, a novel observation that has not been previously reported in the literature. The antibiotic was generally reserved for patients allergic to penicillin-group antibiotics or for those with documented penicillin-resistant microbial infection or colonisation. This finding seems counter-intuitive, especially when all causative CoNS identified in this cohort were sensitive to vancomycin. The issue may lie with the timing of its administration before the procedure and its dosage. With regard to timing, systemic vancomycin requires slow intravenous infusion to reduce the risk of a hypersensitivity reaction that manifests as either red man syndrome or anaphylaxis and occurs in 3.7% to 47% of patients.25 To illustrate the incidence of these reactions, the first randomised controlled trial investigating the drug's efficacy in shunt procedures was prematurely discontinued because of these adverse effects.26 Most hospital protocols require infusion over a minimum of 1 hour, but clinical trials have demonstrated that lengthier 2-hour infusions can further reduce the frequency and severity of these reactions.25 27 Furthermore, the efficacy of vancomycin in treating CoNS and MRSA infections has been questioned because of observations of slower bactericidal activity, compared with nafcillin, than was previously recognised.28 To address this issue, we suggest that rigid guidelines be adhered to with respect to adequate timing of vancomycin infusion before skin incision. Should more rapid infusion be required, for example in the emergency setting, pre-administration of intravenous diphenhydramine before vancomycin infusion can prevent the development of red man syndrome.25
 
Limited data are available about the pharmacokinetics and CSF concentrations of vancomycin in neurosurgical patients. In a study reviewing intra-operative serum and CSF vancomycin concentrations in paediatric patients undergoing shunt implantation, the authors noted that CSF penetration was negligible in patients with non-inflamed meninges despite presumed adequate loading doses.29 This was echoed in a later study determining that among non-meningitic patients, vancomycin CNS penetration was poor, with a CSF-to-serum ratio of only 18%.30 Its increasing use over the course of decades has also led to a corresponding rise in the minimum bactericidal concentrations of CoNS.28 These findings should prompt an extensive review of prophylactic vancomycin use as there is currently no consensus on a recommended loading dose for neurosurgical procedures. In the meantime, researchers have attempted to improve the bactericidal activity in patients treated with vancomycin, with varying degrees of success. Vancomycin in combination with gentamicin results in more rapid bactericidal rates in animal models28 and has been proven to be as effective as third-generation cephalosporins in preventing surgical site infection in neurosurgical procedures in a randomised trial.31 Others have proposed intra-operative combined vancomycin-aminoglycoside administration, given intraventricularly, used as an antibiotic bath for shunt hardware immersion prior to implantation, or applied in powder form within the subgaleal space of the wound, with positive results.11 32 33 34
 
Antibiotic-impregnated (with rifampicin plus either clindamycin or minocycline) and silver-coated ventricular catheters offer the greatest promise in preventing shunt infection.35 36 A growing body of evidence supports antibiotic-impregnated ventricular catheters, and they are gradually replacing conventional plain silicone catheters in daily practice, with considerable cost savings.37 38 39 40 There are, however, accompanying concerns about the development of antibiotic-resistant micro-organisms, and a recent meta-analysis has highlighted a higher risk of Gram-negative and MRSA shunt infections.38 In this series, only 12 patients received antibiotic-impregnated catheters during the study period, so it is difficult to draw any conclusion about their effectiveness.
 
Emergency VP shunting was another surgical-related risk factor for infection and was performed in more than half of patients who underwent the procedure. Because it was so prevalent, it is arguably the factor with the greatest population impact of the three identified, and it is possibly the most amenable to change in current practice. The clinical condition of most patients with hydrocephalus who require primary VP shunting does not warrant emergency surgery, although a few indications exist, for example, obstructive pineal region or cerebellar tumours that may present with acute symptoms. More than two thirds of patients (70%) in this study had conditions that permitted delayed shunting once the primary disease had been treated and the patient stabilised. This phenomenon most likely prevails because of the limited availability of operating theatres and other related resources, and the general practice of delegating VP shunting to more junior members of the surgical team. Several reasons explain why 'emergency' primary shunting should be discouraged. It has long been established by several protocol-driven trials that shunting should be performed as the first procedure of the operative day to minimise the risk of contamination.9 10 11 To illustrate, a surgical incision time after 10 am has been observed to be a predictor of infection.5 In elective procedures, the neurosurgeon in-charge and other responsible operating theatre staff are likely to be more experienced than personnel involved in emergencies. In particular, it has been shown that individual surgical experience is an important factor for infection, with researchers reporting a higher incidence among neurosurgical trainees or in surgeons who performed fewer than 147 shunts within a decade.4 41 Nonetheless, using the former stratification of trainee versus specialist, this was not evident in our cohort. Another argument against 'emergency' VP shunting is the location where the procedure is performed. For a variety of resource allocation reasons, shunting scheduled as an emergency procedure is often not performed in neurosurgery-designated operating theatres. A study investigating the distribution of bacteria in the operating room environment and its relationship with ventricular shunt infections concluded that positive environmental cultures were more likely in a theatre not devoted to neurosurgery.42 Although procedure timing and location were not explored in this audit, they are believed to be the principal explanations why shunts implanted as an emergency were more likely to become infected.
 
The time interval from shunt implantation to revision for infection in our study, with a median shunt survival of 64 days, is longer than in most published data.5 34 Our data show that 92% (n=33) of shunt infections occurred within 6 months, which is compatible with the commonly held belief that infection begins intra-operatively with the inoculation of skin flora, either from the patient or surgeon, into the surgical wound.42 43 44 This is further substantiated by the predominance of CoNS and S aureus in 81% of bacterial cultures in this patient series, and similarly in several previous reports.4 5 9 10 11 12 32 35 36 43 Coupled with positive research findings that theatre discipline during surgery reduces infection risk, it seems reasonable to conclude that institutional shunt implantation protocols should be established.9 10 11 12 32
 
Even though the performance of our neurosurgical community with regard to primary VP shunt infection meets international standards, there is room for improvement. The implementation of a standardised shunt surgery protocol that covers preoperative preparation as well as intra-operative and postoperative management has consistently been proven effective in reducing infection.9 10 11 12 32 The landmark study by Choux et al10 first demonstrated that meticulous measures, such as adopting a no-touch technique for shunt handling and limiting the length of shunt exposure time and the number of people in the operating room, can dramatically decrease shunt infection rates from 16% to less than 1%. It is our belief that a similarly comprehensive protocol should be developed based on the findings of this preliminary study.
 
This study has several limitations. Data collection was retrospective, so key clinical information such as the presence of CSF leak and patient co-morbidities was missing. This may have led to inadequate control for confounding factors. An additional limitation inherent in studies of this nature is the potential for observational bias, as data were collected without blinding after outcomes were known. Follow-up was incomplete, with 68 (13%) patients defaulting from clinical review over the course of 3 years. Follow-up duration was also inadequate in some cases; seven (1%) patients were transferred to other hospitals within 30 days of the procedure and this might have influenced the 30-day all-cause mortality findings. Finally, our definition of shunt infection did not include abnormal CSF biochemistry criteria that could have confirmed or refuted positive culture results of specimens that might have been contaminated during collection.
 
Conclusions
This is the first territory-wide review of infection in primary VP shunts conducted in Hong Kong's public health care setting. This study is also one of the largest in the literature examining shunt infection among a predominantly Chinese population. Shunt infection was the second most common cause for reinsertion, occurring in 7% of patients. Significant independent predictors of shunt infection were TBI, vancomycin administration for prophylaxis, and procedures performed in an emergency setting. Although the VP shunt infection rate meets international standards, there are areas for improvement that can be readily addressed, such as the timing and dosage of vancomycin and the avoidance of performing the procedure as an emergency. The best approach to reducing shunt infection may be the design and adoption of a standardised shunt surgery protocol customised to local practice.
 
Acknowledgements
We would like to thank members of the Hospital Authority Head Office (HAHO) Co-ordinating Committee (Neurosurgery), the Clinical Effectiveness & Technology Management Department, the Division of Quality and Safety, and the Clinical Data Analysis and Reporting System Team, HAHO IT Service for their administrative advice and data collection. We also wish to thank Drs Chris YW Liu, Alphon HY Ip, and Claudia Law for their contributions in data collection and entry. This study was supported by the Tung Wah Group of Hospitals Neuroscience Research Fund.
 
Declaration
All authors have disclosed no conflicts of interest.
 
References
1. Wong JM, Ziewacz JE, Ho AL, et al. Patterns in neurosurgical adverse events: cerebrospinal fluid shunt surgery. Neurosurg Focus 2012;33:E13.
2. Schoenbaum SC, Gardner P, Shillito J. Infections of cerebrospinal fluid shunts: epidemiology, clinical manifestations, and therapy. J Infect Dis 1975;131:543-52.
3. Birjandi A, Zare E, Hushmandi F. Ventriculoperitoneal shunt infection: a review of treatment. Neurosurg Q 2012;22:145-8.
4. Borgbjerg BM, Gjerris F, Albeck MJ, Børgesen SE. Risk of infection after cerebrospinal fluid shunt: an analysis of 884 first-time shunts. Acta Neurochir (Wien) 1995;136:1-7.
5. Korinek AM, Fulla-Oller L, Boch AL, Golmard JL, Hadiji B, Puybasset L. Morbidity of ventricular cerebrospinal fluid shunt surgery in adults: an 8-year study. Neurosurgery 2011;68:985-94; discussion 994-5.
6. Patwardhan RV, Nanda A. Implanted ventricular shunts in the United States: the billion-dollar-a-year cost of hydrocephalus treatment. Neurosurgery 2005;56:139-44; discussion 144-5.
7. Simon TD, Riva-Cambrin J, Srivastava R, et al. Hospital care for children with hydrocephalus in the United States: utilization, charges, comorbidities, and deaths. J Neurosurg Pediatr 2008;1:131-7.
8. Shannon CN, Simon TD, Reed GT, et al. The economic impact of ventriculoperitoneal shunt failure. J Neurosurg Pediatr 2011;8:593-9.
9. Faillace WJ. A no-touch technique protocol to diminish cerebrospinal fluid shunt infection. Surg Neurol 1995;43:344-50.
10. Choux M, Genitori L, Lang D, Lena G. Shunt implantation: reducing the incidence of shunt infection. J Neurosurg 1992;77:875-80.
11. Kestle JR, Riva-Cambrin J, Wellons JC 3rd, et al. A standardized protocol to reduce cerebrospinal fluid shunt infection: the Hydrocephalus Clinical Research Network Quality Improvement Initiative. J Neurosurg Pediatr 2011;8:22-9.
12. Pirotte BJ, Lubansu A, Bruneau M, Loqa C, Van Cutsem N, Brotchi J. Sterile surgical technique for shunt placement reduces the shunt infection rate in children: preliminary analysis of a prospective protocol in 115 consecutive procedures. Childs Nerv Syst 2007;23:1251-61.
13. World Health Organization and Department of Health, Hong Kong. Hong Kong (China) health service delivery profile, 2012. Available from: http://www.wpro.who.int/health_services/service_delivery_profile_hong_kong_(china).pdf. Accessed Mar 2016.
14. Overturf GD. Defining bacterial meningitis and other infections of the central nervous system. Pediatr Crit Care Med 2005;6 Suppl:S14-8.
15. Vanaclocha V, Sáiz-Sapena N, Leiva J. Shunt malfunction in relation to shunt infection. Acta Neurochir (Wien) 1996;138:829-34.
16. Horan TC, Gaynes RP, Martone WJ, Jarvis WR, Emori TG. CDC definitions of nosocomial surgical site infections, 1992: a modification of CDC definitions of surgical wound infections. Infect Control Hosp Epidemiol 1992;13:606-8.
17. Kausch W. Die Behandlung des Hydrocephalus der kleinen Kinder [The treatment of hydrocephalus in small children]. Arch Klin Chir 1908;87:709-96.
18. Tuli S, Drake J, Lawless J, Wigg M, Lamberti-Pasculli M. Risk factors for repeated cerebrospinal shunt failures in pediatric patients with hydrocephalus. J Neurosurg 2000;92:31-8.
19. Odio C, McCracken GH Jr, Nelson JD. CSF shunt infections in pediatrics. A seven-year experience. Am J Dis Child 1984;138:1103-8.
20. Honeybul S, Ho KM. Incidence and risk factors for post-traumatic hydrocephalus following decompressive craniectomy for intractable intracranial hypertension and evacuation of mass lesions. J Neurotrauma 2012;29:1872-8.
21. Wang KW, Chang WN, Shih TY, et al. Infection of cerebrospinal fluid shunts: causative pathogens, clinical features, and outcomes. Jpn J Infect Dis 2004;57:44-8.
22. Jeelani NU, Kulkarni AV, Desilva P, Thompson DN, Hayward RD. Postoperative cerebrospinal fluid wound leakage as a predictor of shunt infection: a prospective analysis of 205 cases. Clinical article. J Neurosurg Pediatr 2009;4:166-9.
23. Davis SE, Levy ML, McComb JG, Masri-Lavine L. Does age or other factors influence the incidence of ventriculoperitoneal shunt infections? Pediatr Neurosurg 1999;30:253-7.
24. Ratilal B, Costa J, Sampaio C. Antibiotic prophylaxis for surgical introduction of intracranial ventricular shunts. Cochrane Database Syst Rev 2006;(3):CD005365.
25. Sivagnanam S, Deleu D. Red man syndrome. Crit Care 2003;7:119-20.
26. Odio C, Mohs E, Sklar FH, Nelson JD, McCracken GH Jr. Adverse reactions to vancomycin used as prophylaxis for CSF shunt procedures. Am J Dis Child 1984;138:17-9.
27. Healy DP, Sahai JV, Fuller SH, Polk RE. Vancomycin-induced histamine release and “red man syndrome”: comparison of 1- and 2-hour infusions. Antimicrob Agents Chemother 1990;34:550-4.
28. Stevens DL. The role of vancomycin in the treatment paradigm. Clin Infect Dis 2006;42 Suppl 1:S51-7.
29. Fan-Havard P, Nahata MC, Bartkowski MH, Barson WJ, Kosnik EJ. Pharmacokinetics and cerebrospinal fluid (CSF) concentrations of vancomycin in pediatric patients undergoing CSF shunt placement. Chemotherapy 1990;36:103-8.
30. Albanèse J, Léone M, Bruguerolle B, Ayem ML, Lacarelle B, Martin C. Cerebrospinal fluid penetration and pharmacokinetics of vancomycin administered by continuous infusion to mechanically ventilated patients in an intensive care unit. Antimicrob Agents Chemother 2000;44:1356-8.
31. Pons VG, Denlinger SL, Guglielmo BJ, et al. Ceftizoxime versus vancomycin and gentamicin in neurosurgical prophylaxis: a randomized, prospective, blinded clinical study. Neurosurgery 1993;33:416-22; discussion 422-3.
32. Choksey MS, Malik IA. Zero tolerance to shunt infections: can it be achieved? J Neurol Neurosurg Psychiatry 2004;75:87-91.
33. Abdullah KG, Attiah MA, Olsen AS, Richardson A, Lucas TH. Reducing surgical site infections following craniotomy: examination of the use of topical vancomycin. J Neurosurg 2015;123:1600-4.
34. Ragel BT, Browd SR, Schmidt RH. Surgical shunt infection: significant reduction when using intraventricular and systemic antibiotic agents. J Neurosurg 2006;105:242-7.
35. Keong NC, Bulters DO, Richards HK, et al. The SILVER (Silver Impregnated Line Versus EVD Randomized trial): a double-blind, prospective, randomized, controlled trial of an intervention to reduce the rate of external ventricular drain infection. Neurosurgery 2012;71:394-403; discussion 403-4.
36. Sciubba DM, Stuart RM, McGirt MJ, et al. Effect of antibiotic-impregnated shunt catheters in decreasing the incidence of shunt infection in the treatment of hydrocephalus. J Neurosurg 2005;103 Suppl:131-6.
37. Thomas R, Lee S, Patole S, Rao S. Antibiotic-impregnated catheters for the prevention of CSF shunt infections: a systematic review and meta-analysis. Br J Neurosurg 2012;26:175-84.
38. Konstantelias AA, Vardakas KZ, Polyzos KA, Tansarli GS, Falagas ME. Antimicrobial-impregnated and -coated shunt catheters for prevention of infections in patients with hydrocephalus: a systematic review and meta-analysis. J Neurosurg 2015;122:1096-112.
39. Parker SL, Farber SH, Adogwa O, Rigamonti D, McGirt MJ. Comparison of hospital cost and resource use associated with antibiotic-impregnated versus standard shunt catheters. Clin Neurosurg 2011;58:122-5.
40. Parker SL, McGirt MJ, Murphy JA, Megerian JT, Stout M, Engelhart L. Cost savings associated with antibiotic-impregnated shunt catheters in the treatment of adult and pediatric hydrocephalus. World Neurosurg 2015;83:382-6.
41. Cochrane DD, Kestle JR. The influence of surgical operative experience on the duration of first ventriculoperitoneal shunt function and infection. Pediatr Neurosurg 2003;38:295-301.
42. Duhaime AC, Bonner K, McGowan KL, Schut L, Sutton LN, Plotkin S. Distribution of bacteria in the operating room environment and its relation to ventricular shunt infections: a prospective study. Childs Nerv Syst 1991;7:211-4.
43. Bayston R, Lari J. A study of the sources of infection in colonised shunts. Dev Med Child Neurol 1974;16 Suppl 32:16-22.
44. Tulipan N, Cleves MA. Effect of an intraoperative double-gloving strategy on the incidence of cerebrospinal fluid shunt infection. J Neurosurg 2006;104 Suppl:5-8.

Chronic peritoneal dialysis in Chinese infants and children younger than two years

Hong Kong Med J 2016 Aug;22(4):365–71 | Epub 17 Jun 2016
DOI: 10.12809/hkmj154781
© Hong Kong Academy of Medicine. CC BY-NC-ND 4.0
 
ORIGINAL ARTICLE
Chronic peritoneal dialysis in Chinese infants and children younger than two years
YH Chan, FHKCPaed, FHKAM (Paediatrics); Alison LT Ma, FHKCPaed, FHKAM (Paediatrics); PC Tong, FHKCPaed, FHKAM (Paediatrics); WM Lai, FHKCPaed, FHKAM (Paediatrics); Niko KC Tse, FHKCPaed, FHKAM (Paediatrics)
Paediatric Nephrology Centre, Department of Paediatric and Adolescent Medicine, Princess Margaret Hospital, Laichikok, Hong Kong
 
Corresponding author: Dr YH Chan (genegene.chan@gmail.com)
 
Abstract
Objective: To review the outcome for Chinese infants and young children on chronic peritoneal dialysis.
 
Methods: The Paediatric Nephrology Centre of Princess Margaret Hospital is the designated site offering chronic dialysis to children in Hong Kong. Medical records of children who started chronic peritoneal dialysis before the age of 2 years, from 1 July 1995 to 31 December 2013, were retrieved and retrospectively reviewed.
 
Results: Nine Chinese patients (3 male, 6 female) were identified. They were commenced on automated peritoneal dialysis at a median age of 4.7 (interquartile range, 1.1-13.3) months. The median duration of chronic peritoneal dialysis was 40.9 (interquartile range, 22.9-76.2) months. The underlying aetiologies were renal dysplasia (n=3), pneumococcal-associated haemolytic uraemic syndrome (n=3), ischaemic nephropathy (n=2), and primary hyperoxaluria I (n=1). The peritonitis and exit-site infection rates were 1 episode per 46.5 patient-months and 1 episode per 28.6 patient-months, respectively. Dialysis adequacy (Kt/Vurea >1.8) was achieved in 87.5% of patients. Weight gain was achieved in our patients, although three required gastrostomy feeding. Four patients had developmental delay. All patients survived except one with primary hyperoxaluria I who died of acute portal vein thrombosis following liver transplantation. One patient with pneumococcal-associated haemolytic uraemic syndrome regained sufficient renal function to be weaned off dialysis. Four patients received deceased donor renal transplantation after a mean waiting time of 76.7 months. Three patients remained on chronic peritoneal dialysis at the end of the study.
 
Conclusions: Chronic peritoneal dialysis is technically difficult in infants. Nonetheless, a low peritonitis rate, a low exit-site infection rate, and zero chronic peritoneal dialysis–related mortality can be achieved. Chronic peritoneal dialysis offers a promising bridge to renal transplantation.
 
New knowledge added by this study
  • Literature on infant chronic peritoneal dialysis (CPD) is scarce. This is the first report about long-term outcome of Chinese infants on CPD.
  • The local catheter-related infection rate is low compared with western countries.
Implications for clinical practice or policy
  • CPD in infancy is a feasible modality as a bridge to transplantation with low infection and mortality rate. A shared decision-making process between parents and paediatric nephrologists is necessary to provide an optimal care plan for this group of patients, considering the predicted outcome, associated co-morbidities, and family burden.
 
 
Introduction
End-stage renal disease (ESRD) is a rare disease with high mortality in infants and young children under 2 years of age. In the past, the decision to initiate infant dialysis was not easy due to technical difficulties and poor clinical outcome, as evidenced by a 1990 survey showing that only 50% of paediatric nephrologists would offer dialysis to ESRD children younger than 1 year, and only 40% would offer dialysis to neonates.1 With technological advances and improving outcomes for children on dialysis in terms of physical growth, development, and quality of life,2 3 most paediatric nephrologists will now consider peritoneal dialysis (PD) as a bridge to renal transplantation. Data from the North American Pediatric Renal Trials and Collaborative Studies (NAPRTCS) 2011 report indicated that 92% of ESRD children younger than 2 years were on chronic PD (CPD).4
 
Literature in this area is scarce,2 3 5 6 especially on the long-term outcome of these infants in the Chinese population. As the only tertiary referral paediatric nephrology centre in Hong Kong, we retrospectively reviewed our experience of the epidemiology, dialysis prescription, complications, and outcome in this group of patients.
 
Methods
The Paediatric Nephrology Centre of Princess Margaret Hospital is the designated site offering renal replacement therapy to children in Hong Kong. Medical records of children who started CPD before the age of 2 years, from 1 July 1995 to 31 December 2013, were retrieved and reviewed. Information regarding their primary renal diagnosis, co-morbidities, growth profile, infectious and non-infectious complications, dialysis prescription, dialysis adequacy, peritoneal membrane transport status, relevant laboratory investigations, and final outcome was reviewed. Data collected were recorded on data entry forms. Patients who underwent CPD for less than 6 months were excluded. The study was approved by the ethics committee of Princess Margaret Hospital.
 
In our centre, CPD was the preferred dialysis modality in young children; PD was performed by automated cycler in the modes of nocturnal intermittent peritoneal dialysis (NIPD), continuous cyclic peritoneal dialysis (CCPD), continuous optimal peritoneal dialysis, and tidal peritoneal dialysis. A peritoneal equilibration test was performed annually, with membrane transport status classified as high, high-average, low-average, or low transporter.7 8 Dialysis adequacy was monitored by both clinical and biochemical parameters. Owing to limited information on residual renal function, solute clearance referred to the contribution of CPD only and was expressed in terms of Kt/Vurea.
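As background for readers unfamiliar with the adequacy index, weekly Kt/Vurea on peritoneal dialysis is conventionally computed as

$$ \text{weekly } Kt/V_{\text{urea}} = \frac{7 \times (D/P)_{\text{urea}} \times V_{\text{drain}}}{V} $$

where $(D/P)_{\text{urea}}$ is the dialysate-to-plasma urea ratio, $V_{\text{drain}}$ the 24-hour drained dialysate volume, and $V$ the urea distribution volume (approximated by total body water). This is the standard textbook formulation; the centre's exact calculation protocol is not described here.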
 
Peritonitis was defined as cloudy peritoneal effluent with a dialysate white cell count of >100/mm3 and at least 50% polymorphonuclear leukocytes.9 Clinical symptoms of fever with or without abdominal pain were also taken into account. Exit-site infection (ESI) was diagnosed in the presence of peri-catheter swelling, redness, tenderness, and discharge at the exit site.9 Developmental delay was defined as receipt of special education or failure to reach normal developmental milestones in two or more developmental domains (eg gross motor, cognition).
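Because the peritonitis rule is purely threshold-based, it can be stated compactly in code. A minimal sketch, with hypothetical function and field names, of the criteria quoted above:

```python
# Minimal encoding of the peritonitis criteria used in this study.
# Function and parameter names are hypothetical illustrations.
def meets_peritonitis_criteria(effluent_cloudy: bool,
                               dialysate_wbc_per_mm3: float,
                               pmn_fraction: float) -> bool:
    """Cloudy effluent with dialysate WBC >100/mm3 and >=50% polymorphs."""
    return (effluent_cloudy
            and dialysate_wbc_per_mm3 > 100
            and pmn_fraction >= 0.5)

print(meets_peritonitis_criteria(True, 250, 0.80))  # True
print(meets_peritonitis_criteria(True, 80, 0.90))   # False: WBC too low
```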
 
Chronic kidney disease–mineral bone disease (CKD-MBD) was defined as a systemic disorder of mineral bone metabolism due to renal failure, manifesting as biochemical abnormalities (of calcium, phosphate, parathyroid hormone [PTH], or vitamin D metabolism), abnormal bone turnover, or vascular calcification.10 Renal osteodystrophy, the skeletal component of CKD-MBD, was defined as alteration of bone morphology in patients with ESRD.10 The target PTH range was 11 to 33 pmol/L (100-300 pg/mL) in children on CPD, supported by recent data from the International Pediatric Peritoneal Dialysis Network (IPDN).11
 
Statistical analysis
Data collection and analysis were performed with Microsoft Excel 2010. The demographic data and biochemical parameters were expressed as mean ± standard deviation, range, median, interquartile range (IQR), number, or percentage as appropriate. Height and weight were expressed as standard deviation scores (SDSs), calculated according to a local study on growth of children.12
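For clarity, a standard deviation score expresses a measurement relative to the age- and sex-specific reference distribution:

$$ \text{SDS} = \frac{x - \mu_{\text{age,sex}}}{\sigma_{\text{age,sex}}} $$

where $x$ is the child's height or weight and $\mu$ and $\sigma$ are the mean and standard deviation from the local growth reference.12 A child whose height equals the reference mean therefore has an htSDS of 0, and one 2 standard deviations below it an htSDS of -2.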
 
Results
Patient characteristics
From 1995 to 2013, nine Chinese children under 2 years of age (3 boys and 6 girls) receiving CPD were identified. The mean estimated glomerular filtration rate immediately prior to dialysis, calculated using the Schwartz formula, was 6.9 ± 3.8 (range, 3.9-15) mL/min/1.73 m2. The median age at initiation of CPD was 4.7 (IQR, 1.1-13.3) months. The median duration of CPD was 40.9 (IQR, 22.9-76.2) months. The most common causes of ESRD were renal dysplasia (n=3, 33%) and pneumococcal-associated haemolytic uraemic syndrome (pHUS) [n=3, 33%], followed by ischaemic nephropathy due to severe perinatal asphyxia (n=2, 22%) and primary hyperoxaluria I (PH1) [n=1, 11%] (Tables 1 and 2). All three patients with pHUS presented with pneumococcal pneumonia, microangiopathic haemolytic anaemia, and acute kidney injury. In each case, a positive direct Coombs test or T-antigen test supported the diagnosis of pHUS.
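The Schwartz estimate referred to above has the general form

$$ \text{eGFR (mL/min/1.73 m}^2\text{)} = \frac{k \times \text{height (cm)}}{S_{\text{Cr}}} $$

where $S_{\text{Cr}}$ is the serum creatinine concentration and $k$ an empirical, age-dependent constant (eg 0.45 for term infants when creatinine is expressed in mg/dL). The specific $k$ values applied by the centre are not stated here, so this should be read only as the formula's standard shape.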
 

Table 1. Summary of children younger than 2 years who were on chronic peritoneal dialysis during July 1995 to December 2013
 

Table 2. Clinical characteristics of nine Chinese children started on chronic peritoneal dialysis before 2 years old during July 1995 to December 2013
 
Peritoneal dialysis prescription, transporter status, and peritoneal dialysis adequacy
All patients were put on automated peritoneal dialysis (APD). Initially, eight children were on NIPD and only one was on CCPD. Over the course of CPD, five (56%) patients changed to CCPD, and one (11%) patient changed to tidal PD because of drainage pain. Three (33%) patients remained on NIPD. Decreasing residual renal function and inadequate dialysis were the most common reasons for changing modes of CPD.
 
Peritoneal equilibration test and dialysis adequacy assessment were performed in eight patients. Four patients were high transporters, while two patients were high-average transporters and two patients were low-average transporters (Table 3). Seven (87.5%) patients achieved a dialysis adequacy (Kt/Vurea) of >1.8. The mean Kt/Vurea was 2.5 ± 0.6 (range, 1.5-3.4). The mean weekly creatinine clearance was 38.3 ± 6.2 (range, 25.6-47.1) L/week/1.73 m2.
 

Table 3. Transporter status in children younger than 2 years on chronic peritoneal dialysis
 
Catheter survival
During the study period, 23 Tenckhoff catheter insertions were performed in these nine patients. The median catheter survival was 260 (IQR, 19-569) days. Only one patient did not require any catheter change. Fourteen catheter changes were performed in eight patients: catheters were replaced once in four patients, twice in three patients, and 4 times in one patient. The most common reason for catheter change was catheter blockage due to omental wrap (n=7, 50%), followed by chronic ESI or refractory peritonitis (n=4, 29%), migration or malposition (n=2, 14%), and cuff extrusion (n=1, 7%). While omentectomy was not routinely performed, 44% of patients eventually required partial omentectomy due to omental wrap.
 
Peritonitis, exit-site infection, and surgical complications
Five patients experienced a total of eight episodes of peritonitis; four patients had none. The peritonitis rate was 0.26 episode per patient-year, or 1 episode per 46.5 patient-months. Two (25%) episodes were caused by Staphylococcus aureus, one of which was methicillin-resistant. One (12.5%) episode was caused by coagulase-negative Staphylococcus (CoNS) and one (12.5%) by Mycobacterium chelonae. The remaining four (50%) episodes were culture-negative (Fig 1). Altogether 13 episodes of ESI occurred in five patients, of which patient 3 contributed seven episodes. The rate of ESI was 0.42 episode per patient-year, or 1 episode per 28.6 patient-months. The most common organisms were Pseudomonas aeruginosa (n=7, 54%) and methicillin-sensitive S aureus (n=2, 15%). Other causative pathogens included CoNS, diphtheroids, Serratia, and M chelonae, each of which caused one ESI (Fig 2).
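The quoted infection rates follow directly from the episode counts and total follow-up time. A quick consistency check (the ~372 total patient-months is back-calculated from the quoted "1 episode per 46.5 patient-months", not taken from the records):

```python
# Consistency check of the quoted peritonitis and ESI rates.
# patient_months is inferred from the quoted figures, not source data.
peritonitis_episodes = 8
esi_episodes = 13
patient_months = 372

print(patient_months / peritonitis_episodes)         # ~46.5 pt-months/episode
print(peritonitis_episodes / (patient_months / 12))  # ~0.26 episode/pt-year
print(patient_months / esi_episodes)                 # ~28.6 pt-months/episode
print(esi_episodes / (patient_months / 12))          # ~0.42 episode/pt-year
```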
 

Figure 1. Causative organisms in eight peritonitis episodes (five patients) in the study population
 

Figure 2. Causative organisms in 13 exit-site infections (five patients) in the study population
 
One patient required surgical correction of patent processus vaginalis that led to hydrocoele. One patient required repair of bilateral inguinal hernia. One patient had cuff extrusion and required replacement of PD catheter. No catheters developed leakage.
 
Growth and nutrition
Weight gain was observed after initiation of CPD. At the start of dialysis, 12 months and 24 months post-dialysis, the mean weight SDS (wtSDS) was -1.32, -1.44, and -1.27, while height SDS (htSDS) was -0.75, -0.92, and -1.45, respectively (Table 4). Three (33%) patients were commenced on nasogastric (NG) enteral feeding and eventually were switched to gastrostomy feeding. Six (67%) patients were fed on demand, one of whom was awaiting gastrostomy insertion at the end of the study. Three (33%) patients were prescribed growth hormone therapy before the age of 2 years.
 

Table 4. Growth outcome in children younger than 2 years on chronic peritoneal dialysis
 
Development
Four (44%) children were delayed in development or received special education. Two of them had severe perinatal asphyxia associated with hypoxic ischaemic encephalopathy, one was born prematurely at 32 weeks of gestation and the other had PH1, all of which could account for the developmental delay.
 
Anaemia, chronic kidney disease–mineral bone disease, and hypertension
During the first 2 years of CPD, all patients received an erythropoiesis-stimulating agent, except patient 5 who later became dialysis-free. The mean maximum dose of recombinant human erythropoietin (rHuEPO) was 169 ± 91 (range, 65-300) units/kg/week. Three patients received rHuEPO at a dose exceeding 200 units/kg/week. Seven patients were put on oral iron supplements, one of whom was subsequently switched to intravenous iron replacement due to functional iron deficiency. The mean haemoglobin level was 109 ± 8 g/L; only two patients (patients 1 and 7) failed to achieve a mean haemoglobin level of ≥100 g/L.
 
All patients showed some degree of CKD-MBD, as evidenced by raised PTH level and the need for activated vitamin D and phosphate binder. Five patients had severe renal osteodystrophy with clinical or radiological manifestations (Table 2). All of them had markedly elevated mean PTH (90-111 pmol/L) outside the recommended target. Of note, two patients (patients 6 and 7) had pathological fractures. Two patients (patients 7 and 9) received a calcimimetic (cinacalcet) for tertiary hyperparathyroidism. Five (56%) patients had hypertension and were on antihypertensive medications with satisfactory control.
 
Outcome
All patients survived except patient 6 with PH1 who died of acute portal vein thrombosis following liver transplantation at the age of 5 years. Patient 5 with pHUS became dialysis-free after 8.6 months of CPD. Four patients underwent deceased donor renal transplantation (DDRT) with a mean waiting time of 76.7 (range, 54-90) months, of whom two were switched to chronic haemodialysis before transplantation because of inadequate dialysis. Three patients remained on PD at the end of the study.
 
Discussion
End-stage renal disease is rare in infants and young children. The reported incidence is variable but remains low around the globe: up to 16 cases per million age-related population per year have been reported in the UK.13 In the NAPRTCS 2011 report, 13.2% of children on dialysis were under 2 years old.4 In Hong Kong, recent data from the Hong Kong Renal Registry showed that the incidence and prevalence of ESRD in those <20 years old were around 5 and 28 per million children, respectively.14
 
The most common aetiology of ESRD in this age-group is congenital anomalies of the kidney and urinary tract, including renal dysplasia and obstructive uropathy.15 Nonetheless, pHUS constituted an important cause of ESRD in Hong Kong. A potential explanation is the later introduction of a universal pneumococcal vaccination programme in Hong Kong (2009) compared with the US (2000).
 
In our study, all patients started with CPD. Difficult vascular access for haemodialysis and a high volume of daily milk intake make CPD the more favourable choice of renal replacement therapy in young infants. While local mean DDRT waiting time in children younger than 18 years was 4.4 ± 2.4 years,16 the waiting time in our young patients was much longer (mean, 6.4 years). This is because patients have to weigh more than 15 kg before DDRT can be carried out due to technical difficulties. Therefore, CPD acts as a bridge to transplantation and reserves vascular access for future use.15
 
Ethical considerations and infection, together with growth and nutrition, are the most challenging aspects of infant CPD.
 
Ethical considerations
The decision to initiate or withhold dialysis remains one of the most challenging aspects of infant ESRD. Recent data, which showed improvements in mortality and developmental outcome, support initiation of dialysis. Shroff et al6 reported a survival rate of 77% at 5 years in children commenced on chronic dialysis before the age of 1 year. Our unpublished data revealed that 91 patients were put on APD from 1996 to 2013, with an overall survival rate of 90%. In the present series, the survival rate in young infants was similar and there was no CPD-related mortality. The only death resulted from surgical complications after liver transplantation.
 
Warady et al17 reported that 79% of infants who started CPD had normal developmental scores at 1 and 4 years of age, and that 94% of school-aged children attended school. In our series, 44% of patients had developmental delay, all of which could be accounted for by co-morbidities or the underlying aetiology of ESRD.
 
Nonetheless, unpredictable outcomes, psychosocial burdens, and cost continually fuel the ethical dilemma.13 15 18 The family burden is tremendous: since CPD is a home-based treatment, caregivers must perform dialysis daily. Up to 55% of paediatric nephrologists felt that a parental decision to refuse dialysis should be respected for neonates, and 26% felt the same for children aged 1 to 12 months.13 In two surveys, serious co-existing co-morbidities and predicted morbidity were the most important factors when a physician considered withholding dialysis.1 19 While serious non-renal co-morbidities such as pulmonary hypoplasia are strongly associated with a poor prognosis,20 patients with isolated renal disease should be considered separately as their prognosis is generally better.18 The decision should be a shared one between parents and paediatric nephrologists, made after detailed counselling on the potential burdens and after considering co-morbidities, expected quality of life, and available resources and expertise.18 21 Designated nurses, clinical psychologists, and medical social workers are crucial in supporting patients and parents.
 
Peritoneal dialysis–related infection
Infants and young children are at risk of PD-related infectious complications. In the US, the annualised rate of peritonitis in children younger than 2 years was 0.79 episode per patient-year, compared with 0.57 episode per patient-year in adolescents aged over 12 years.4 In our series, the annual peritonitis rate was 0.26 episode per patient-year, lower than the US figure. As previously reported, the overall annual peritonitis rate among all our paediatric patients on APD was low at 0.22.22 Low infection rates have similarly been reported in several Asian countries.22
 
There are a few possible explanations. First, all our patients were on APD, which is associated with a reduced risk of infection, as shown in a systematic review by Rabindranath et al23 and in earlier NAPRTCS data from 2005.24 Second, we strictly complied with guidelines and recommendations on the prevention of PD-related infection.4 9 21 25 26 Measures included the use of double-cuffed Tenckhoff catheters, downward- or laterally-pointing exit sites away from the diaper area and ostomies, antibiotic prophylaxis at catheter insertion, post-insertion immobilisation of the catheter, nasal screening for methicillin-resistant S aureus with mupirocin decolonisation, and selective use of prophylactic topical antibiotics for patients with a history of ESI. Third, all patients and their carers completed an intensive PD training programme before commencing home APD; training was conducted by a senior renal nurse, with regular reviews and telephone follow-up. The high culture-negative peritonitis rate in our series nonetheless highlights the need for proper specimen collection and handling.9
 
Growth and nutrition
Growth in infancy is important because one third of postnatal height is achieved during the first 2 years of life.27 Growth during this period relies largely on nutritional intake rather than growth hormone. Growth in ESRD is often impaired because of poor appetite, increased circulating leptin, nutritional loss through peritoneal dialysate, and repeated vomiting due to dysmotility, gastro-oesophageal reflux, and raised intraperitoneal pressure.15 27 Infants can lose more than 2 htSDS, and such loss can be irreversible.15 Importantly, infancy is also a period of catch-up growth; NAPRTCS reported improvement in both htSDS and wtSDS in children who started dialysis before the age of 2 years: htSDS improved from –2.59 at baseline to –2.15 at 24 months post-dialysis, while wtSDS improved from –2.26 to –1.05.4
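
For reference, the htSDS and wtSDS values quoted throughout are standard anthropometric z-scores relative to age- and sex-specific growth references, such as the local standards in reference 12. A minimal statement of the definition is:

    \[ \mathrm{SDS} = \frac{x - \mu_{\text{age,sex}}}{\sigma_{\text{age,sex}}} \]

where x is the measured height or weight and μ and σ are the reference mean and standard deviation for age and sex. On this scale, the NAPRTCS improvement from –2.59 to –2.15 means that height moved 0.44 reference SDs closer to the mean for age and sex.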
 
In our cohort, weight gain was observed but htSDS declined. The IPDN recently analysed growth in 153 very young children on CPD.27 Interestingly, htSDS decreased further in the first 6 to 12 months of CPD and then stabilised. Although catch-up in height was noted in the NAPRTCS report, such improvement was observed only in children with a worse baseline height deficit, defined as htSDS ≤ –1.88; children with htSDS > –1.88 instead had declines in htSDS of 0.11 and 0.2 at 12 and 24 months, respectively.4 Only two of our patients had a worse baseline height deficit (htSDS ≤ –1.88), both with an htSDS of –2 at CPD initiation. Consistent with the IPDN and NAPRTCS findings, catch-up growth in height was observed in these two patients: at 12 months post-dialysis, their htSDS improved to –0.94 and –1.6, respectively.
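
To put these z-score cut-offs into percentile terms, a brief sketch follows; it uses SciPy's standard normal distribution purely for illustration (SciPy was not part of this study) and shows that an htSDS of –1.88 corresponds to roughly the 3rd percentile.

    # Convert htSDS (z-score) values quoted above to approximate percentiles,
    # assuming heights are normally distributed in the reference population.
    from scipy.stats import norm

    for z in (-1.88, -2.59, -2.15):
        print(f"htSDS {z:+.2f} ~ {norm.cdf(z) * 100:.1f}th percentile")
    # htSDS -1.88 ~ 3.0th percentile; -2.59 ~ 0.5th; -2.15 ~ 1.6th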
 
Oral intake is often unsatisfactory and enteral feeding, by either NG or gastrostomy tube, is required; this allows overnight feeding and reduces vomiting. In the recent IPDN study on growth, 37% of young children were fed on demand, 39% by NG tube, and 7% by gastrostomy tube, while 17% switched from NG to gastrostomy feeding.27 Both NG and gastrostomy feeding led to a significant increase in body mass index SDS, although regional variation was observed. Gastrostomy but not NG feeding was associated with improved linear growth, although this effect was no longer significant after adjusting for baseline length. Gastrostomy feeding thus appeared superior to NG feeding for growth promotion, possibly because of less vomiting.27
 
Over the years, the use of a gastrostomy to enhance nutritional supplementation has been promoted in our centre, with intensified collaboration with a paediatric renal dietitian. In our series, only one (20%) of the patients who commenced CPD before 2008 received enteral feeding, owing to low parental acceptance. Of the remaining four patients, who started CPD after 2008, two had gastrostomies, one was awaiting gastrostomy insertion, and one thrived satisfactorily without enteral feeding. This suggests improved nutritional management and parental acceptance. Extra effort should also be made to optimise factors such as acidosis, anaemia, and metabolic bone disease.27 In addition, KDOQI (Kidney Disease Outcomes Quality Initiative) suggests consideration of growth hormone when children have htSDS and height velocity SDS of ≤ –1.88 despite optimised nutrition and corrected metabolic abnormalities.28
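
As a literal encoding of the KDOQI threshold just quoted (an illustration only, not a clinical decision tool; the function and parameter names are ours, not the guideline's):

    # Literal encoding of the quoted KDOQI rule; names are ours, not KDOQI's.
    def consider_growth_hormone(ht_sds: float,
                                height_velocity_sds: float,
                                nutrition_optimised: bool,
                                metabolic_abnormalities_corrected: bool) -> bool:
        # Growth hormone is considered only after nutrition and metabolic
        # abnormalities have been optimised, per the guideline as quoted.
        return (nutrition_optimised
                and metabolic_abnormalities_corrected
                and ht_sds <= -1.88
                and height_velocity_sds <= -1.88)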
 
There are a few limitations to this study. First, the retrospective design introduced potential recall and information bias: some information could not be retrieved from medical records, especially for children who presented in the late 1990s and early 2000s. Second, the total case number was small because patients were recruited from a single nephrology centre. Last, the practice of infant dialysis has changed considerably over the past two decades, which might in turn have affected patient outcomes.
 
Conclusions
End-stage renal disease in very young children is uncommon. Chronic PD is feasible and outcomes are improving. Vigilant adoption of guidelines, universal use of APD, and a well-structured PD training programme were crucial to achieving the low peritonitis and ESI rates, with no CPD-related mortality, in our centre. Optimisation of dialysis, nutritional support, and developmental training are important, while successful renal transplantation remains the ultimate goal for these infants.
 
Declaration
All authors have disclosed no conflicts of interest.
 
References
1. Geary DF. Attitudes of pediatric nephrologists to management of end-stage renal disease in infants. J Pediatr 1998;133:154-6.
2. Kari JA, Gonzalez C, Ledermann SE, Shaw V, Rees L. Outcome and growth of infants with severe chronic renal failure. Kidney Int 2000;57:1681-7.
3. Ledermann SE, Scanes ME, Fernando ON, Duffy PG, Madden SJ, Trompeter RS. Long-term outcome of peritoneal dialysis in infants. J Pediatr 2000;136:24-9.
4. North American Pediatric Renal Trials and Collaborative Studies (NAPRTCS). 2011 Annual dialysis report. Available from: https://web.emmes.com/study/ped/annlrept/annualrept2011.pdf. Accessed Nov 2015.
5. Vidal E, Edefonti A, Murer L, et al. Peritoneal dialysis in infants: the experience of the Italian Registry of Paediatric Chronic Dialysis. Nephrol Dial Transplant 2012;27:388-95.
6. Shroff R, Rees L, Trompeter R, Hutchinson C, Ledermann S. Long-term outcome of chronic dialysis in children. Pediatr Nephrol 2006;21:257-64.
7. Warady BA, Alexander SR, Hossli S, et al. Peritoneal membrane transport function in children receiving long-term dialysis. J Am Soc Nephrol 1996;7:2385-91.
8. Warady BA, Alexander S, Hossli S, Vonesh E, Geary D, Kohaut E. The relationship between intraperitoneal volume and solute transport in pediatric patients. Pediatric Peritoneal Dialysis Study Consortium. J Am Soc Nephrol 1995;5:1935-9.
9. Warady BA, Bakkaloglu S, Newland J, et al. Consensus guidelines for the prevention and treatment of catheter-related infections and peritonitis in pediatric patients receiving peritoneal dialysis: 2012 update. Perit Dial Int 2012;32 Suppl 2:S32-86.
10. Kidney Disease: Improving Global Outcomes (KDIGO) CKD-MBD Work Group. KDIGO clinical practice guideline for the diagnosis, evaluation, prevention, and treatment of Chronic Kidney Disease-Mineral and Bone Disorder (CKD-MBD). Kidney Int Suppl 2009;113:S1-130.
11. Borzych D, Rees L, Ha IS, et al. The bone and mineral disorder of children undergoing chronic peritoneal dialysis. Kidney Int 2010;78:1295-304.
12. Leung SS, Tse LY, Wong GW, et al. Standards for anthropometric assessment of nutritional status of Hong Kong children. Hong Kong J Paediatr 1995;12:5-15.
13. Rees L. Paediatrics: Infant dialysis—what makes it special? Nat Rev Nephrol 2013;9:15-7.
14. Yap HK, Bagga A, Chiu MC. Pediatric nephrology in Asia. In: Avner ED, Harmon WE, Niaudet P, Yoshikawa N, Emma F, Goldstein SL, editors. Pediatric nephrology. 6th ed. Springer; 2010: 1981-90.
15. Zaritsky J, Warady BA. Peritoneal dialysis in infants and young children. Semin Nephrol 2011;31:213-24.
16. Chiu MC. An update overview on paediatric renal transplantation. Hong Kong J Paediatr 2004;9:74-7.
17. Warady BA, Belden B, Kohaut E. Neurodevelopmental outcome of children initiating peritoneal dialysis in early infancy. Pediatr Nephrol 1999;13:759-65.
18. Lantos JD, Warady BA. The evolving ethics of infant dialysis. Pediatr Nephrol 2013;28:1943-7.
19. Teh JC, Frieling ML, Sienna JL, Geary DF. Attitudes of caregivers to management of end-stage renal disease in infants. Perit Dial Int 2011;31:459-65.
20. Wood EG, Hand M, Briscoe DM, et al. Risk factors for mortality in infants and young children on dialysis. Am J Kidney Dis 2001;37:573-9.
21. Zurowska AM, Fischbach M, Watson AR, Edefonti A, Stefanidis CJ, European Paediatric Dialysis Working Group. Clinical practice recommendations for the care of infants with stage 5 chronic kidney disease (CKD5). Pediatr Nephrol 2013;28:1739-48.
22. Chiu MC, Tong PC, Lai WM, Lau SC. Peritonitis and exit-site infection in pediatric automated peritoneal dialysis. Perit Dial Int 2008;28 Suppl 3:S179-82.
23. Rabindranath KS, Adams J, Ali TZ, Daly C, Vale L, MacLeod AM. Automated vs continuous ambulatory peritoneal dialysis: a systematic review of randomized controlled trials. Nephrol Dial Transplant 2007;22:2991-8.
24. North American Pediatric Renal Transplant Cooperative Study (NAPRTCS). 2005 Annual report. Available from: https://web.emmes.com/study/ped/annlrept/annlrept2005.pdf. Accessed Nov 2015.
25. Piraino B, Bailie GR, Bernardini J, et al. Peritoneal dialysis-related infections recommendations: 2005 update. Perit Dial Int 2005;25:107-31.
26. Auron A, Simon S, Andrews W, et al. Prevention of peritonitis in children receiving peritoneal dialysis. Pediatr Nephrol 2007;22:578-85.
27. Rees L, Azocar M, Borzych D, et al. Growth in very young children undergoing chronic peritoneal dialysis. J Am Soc Nephrol 2011;22:2303-12.
28. KDOQI Work Group. KDOQI Clinical Practice Guideline for Nutrition in Children with CKD: 2008 update. Executive summary. Am J Kidney Dis 2009;53:S11-104.
