Quality of life of Chinese urologists: a cross-sectional study using WHOQOL-BREF

Hong Kong Med J 2015 Jun;21(3):232–6 | Epub 13 Feb 2015
DOI: 10.12809/hkmj144297
© Hong Kong Academy of Medicine. CC BY-NC-ND 4.0
 
ORIGINAL ARTICLE
Quality of life of Chinese urologists: a cross-sectional study using WHOQOL-BREF
YB Wei, MD1,2; Z Yin, MD1; YL Gao, MD1; B Yan, MD1; Z Wang, MD1; YR Yang, PhD1
1 Department of Urology, The Second Xiangya Hospital, Central South University, Changsha 410011, China
2 Department of Urology, Fujian Provincial Hospital, The Teaching Hospital of Fujian Medical University, Fuzhou 350001, China
Corresponding author: Dr JR Yang (yjinrui2012@163.com)
Abstract
Objectives: In recent years, Chinese hospitals have been subject to violent threats. The exact status of the quality of life of Chinese doctors under these adverse conditions remains obscure. The aim of this study was to assess the quality of life of Chinese urologists and to analyse potential factors affecting it.
 
Design: Cross-sectional survey.
 
Setting: Beijing, China.
 
Participants: A total of 1000 participants from more than 30 areas of China who attended the 20th National Urology Conference in Beijing in 2013 were surveyed. The Chinese version of the brief World Health Organization Quality of Life instrument (WHOQOL-BREF) was used to assess quality of life among these urologists. The relationship between quality of life and potential affecting factors was analysed.
 
Results: Of the 1000 questionnaires, 856 were completed and returned, and 708 were valid for analysis. Approximately 46% of the respondents came from provincial capitals; 54.2% felt stress from the medical environment, 76.0% from research work, and 85.3% from promotion. Cronbach’s α coefficient of the instrument was 0.825, the Kaiser-Meyer-Olkin measure was 0.841, and the P value of Bartlett’s test of sphericity was <0.001. Binary logistic regression indicated that gender, work years, and medical environment were potential affecting factors of quality of life, but each influenced only one domain. In contrast, research work and promotion each influenced three domains of the WHOQOL-BREF.
 
Conclusions: The study indicated that the WHOQOL-BREF may be a reliable and valid tool to assess the quality of life of Chinese urologists. Previous studies confirm that the deteriorating medical environment in China negatively affects medical practice, and policies to improve the situation are recommended. Nevertheless, undue pessimism is unwarranted: in the current context, research work and promotion may be the most pervasive and significant factors affecting doctors’ quality of life.
 
 
New knowledge added by this study
  •  The brief version of the World Health Organization Quality of Life instrument (Chinese version) may be a reliable and valid tool for assessing the quality of life of Chinese urologists.
Implications for clinical practice or policy
  •  Policy-makers should pay attention to Chinese urologists’ quality of life.
 
 
Introduction
In recent years, violent threats against Chinese hospitals have been reported and have attracted worldwide attention.1 2 These adverse events have affected doctors and medical students in China.3 4 Violence against medical staff is not limited to China but is a worldwide issue.5 6 7 Nevertheless, the extent to which this violence has escalated in China is striking, and the threat has even influenced the medical education of the next generation of Chinese doctors. As trust in the doctor-patient relationship in China deteriorates, many doctors are discontented and concerned about their safety during daily work. The circumstances of Chinese doctors in this period deserve attention. To date, the exact status of Chinese doctors’ quality of life (QOL) under these conditions remains obscure.
 
The brief version of the World Health Organization Quality of Life instrument (WHOQOL-BREF), derived from the WHOQOL-100, is one of the best known and most widely accepted instruments available. It was developed for cross-cultural comparison of QOL and is available in more than 40 languages. Its validity has been confirmed in the assessment of the subjective QOL of patients and the general public. The Chinese version of the WHOQOL-BREF has also proven to be reliable and valid in the assessment of QOL in Chinese individuals.8 9
 
The aim of this study was to assess the QOL of Chinese urologists through a nationwide survey,10 to explore possible factors influencing QOL, and to draw public attention to the QOL of the current medical community.
 
Methods
Ethics statement
Approval for this study was obtained from the Institutional Review Board of the Second Xiangya Hospital, Central South University, China. The survey was anonymous and questionnaires did not contain information that could identify individual respondents. The administrator retained all returned questionnaires, and data drawn from the survey remained confidential.
 
Subjects
The survey was carried out at the 20th National Urology Conference held in Beijing between 19 and 21 December 2013.10 The conference was organised by the Chinese Urological Association at the China National Convention Center in Beijing. More than 2000 members from over 30 areas registered for the conference, including participants from different provinces, cities, and autonomous regions of China. This cross-sectional study was conducted on 19 December 2013. A total of 1000 questionnaires were distributed to delegates, and none reported completing the survey more than once. Four well-trained investigators distributed and administered the survey simultaneously, and each survey was limited to less than 10 minutes per participant. Participants could ask the investigators for help at any time during the process. Questionnaires were excluded if: (1) more than 20% of items (ie, >5 items) in the WHOQOL-BREF were unanswered; or (2) more than two items in the general information section (excluding WHOQOL-BREF items) were unanswered.
 
Survey instrument
The questionnaire comprised two sections: (a) general information about respondents, including gender, professional qualifications (titles), work years, hospital location, and sources of stress, namely medical environment (referring to working environment and workplace safety), clinical work, research work, and promotion; and (b) the Chinese version of the WHOQOL-BREF, which consisted of 26 items in four domains. The four domains of the brief version of the WHOQOL-100 were physical health (PHYS), psychological health (PSYCH), social relationships (SOCIAL), and environment (ENVIR). Each of the 26 items was scored from 1 to 5. The score for each domain was transformed onto a linear scale from 0 to 100, with higher scores reflecting better QOL.
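To illustrate this scoring step, the sketch below applies a simple linear transformation in which the mean of a domain’s 1-to-5 item scores is rescaled to 0-100; this mirrors the transformation commonly used for the WHOQOL-BREF, but the function and example data are illustrative only, not the authors’ scoring syntax, and reverse-scored items are assumed to have been recoded beforehand.

# Illustrative sketch, not the study's scoring code; reverse-scored items
# are assumed to have been recoded already.
def domain_score(item_scores):
    """Rescale the mean of a domain's 1-5 item scores to a 0-100 scale."""
    if not item_scores:
        raise ValueError("domain has no answered items")
    mean_raw = sum(item_scores) / len(item_scores)  # mean item score, range 1-5
    return (mean_raw - 1.0) * 25.0                  # linear rescale to 0-100

# Example: a domain with seven answered items
print(domain_score([3, 4, 2, 5, 3, 4, 3]))  # about 60.7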
 
Statistical analyses
EpiData version 3.1 (The EpiData Association, Odense, Denmark) was used to establish the database. Data were entered in duplicate and cross-checked by two well-trained researchers until the two entries matched exactly. After exclusion of invalid questionnaires, missing values in the general information section of the remaining questionnaires were replaced with the (rounded) median, and missing WHOQOL-BREF items were replaced with the series mean.
 
All statistical analyses were performed with the Statistical Package for the Social Sciences (SPSS; Windows version 16.0; SPSS Inc, Chicago [IL], US). Cronbach’s α was used to measure internal consistency (reliability), while the Kaiser-Meyer-Olkin (KMO) measure and Bartlett’s test were used to assess the validity of the instrument. Scores in each WHOQOL-BREF domain were dichotomised at the domain mean. Binary logistic regression was carried out to identify affecting factors. A P value of <0.05 was considered statistically significant.
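The sketch below outlines, in Python rather than SPSS, two of the analysis steps described above: Cronbach’s α for a set of items, and a binary logistic model fitted to a domain score dichotomised at its mean. The data frame and column names are hypothetical placeholders, not the study dataset, and predictors are assumed to be numerically coded.

# Illustrative sketch only; the study used SPSS, and all column names here
# are hypothetical. Predictors are assumed to be numerically coded (eg 0/1).
import pandas as pd
import statsmodels.api as sm

def cronbach_alpha(items: pd.DataFrame) -> float:
    """Cronbach's alpha (internal consistency) for a set of item columns."""
    k = items.shape[1]
    item_variance_sum = items.var(axis=0, ddof=1).sum()
    total_variance = items.sum(axis=1).var(ddof=1)
    return k / (k - 1) * (1 - item_variance_sum / total_variance)

def fit_domain_model(df: pd.DataFrame, domain_col: str, predictors: list):
    """Dichotomise a domain score at its mean and fit a binary logistic model."""
    y = (df[domain_col] > df[domain_col].mean()).astype(int)  # above mean = 1
    X = sm.add_constant(df[predictors].astype(float))
    return sm.Logit(y, X).fit(disp=False)

# Example call (hypothetical columns):
# result = fit_domain_model(df, "PHYS", ["gender", "work_years", "promotion"])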
 
Results
Sample characteristics
Of the 1000 questionnaires distributed, 856 were completed and returned. Approximately 17% (148/856) were excluded according to the exclusion criteria. Among the remaining 708 questionnaires, which were considered valid, missing data in the general information section and the WHOQOL-BREF section amounted to about 2.1% (15/708) and 3.0% (21/708), respectively. Of the 708 respondents, 597 (84.3%) were male and 111 (15.7%) were female. Work years were grouped as <10, 10-19, 20-29, and ≥30 years, comprising 35.9%, 35.5%, 18.4%, and 10.2% of respondents, respectively. Approximately 46% of respondents came from provincial capitals, such as Guangzhou in Guangdong province, and municipalities directly under the central government, such as Shanghai. Professional qualifications (titles) were subdivided into three categories: junior, intermediate, and senior. In China, doctors typically work approximately 5 years at each title level before promotion to the next. In this survey, almost half the respondents held senior professional titles. Each of the four sources of stress offered two options, ‘Yes’ and ‘No’; for example, choosing ‘Yes’ for medical environment meant that the respondent felt stress from the medical environment, whereas choosing ‘No’ meant no stress from this aspect. Each respondent could select one or more sources of stress. Overall, 54.2% (384) felt stress from the medical environment, 45.1% (319) from clinical work, 76.0% (538) from research work, and 85.3% (604) from promotion.
 
Reliability and validity
Reliability and validity analyses were performed with SPSS. Cronbach’s α, the most common measure of reliability, was used to assess internal consistency. The overall Cronbach’s α coefficient of the instrument was 0.825, indicating good questionnaire quality. Exploratory factor analysis is a mature and effective method for uncovering the underlying structure of a relatively large set of variables. The KMO measure was 0.841 and the P value of Bartlett’s test of sphericity was <0.001, indicating that the data gathered in the study were suitable for factor analysis.
 
Quality of life according to affecting factors
We then analysed the factors affecting each domain using binary logistic regression. Gender, titles, work years, hospital location, and the four sources of stress were entered as independent factors into the regression model using the Enter method. The Hosmer-Lemeshow test results for the four regression equations were: PHYS (P=0.198), PSYCH (P=0.863), SOCIAL (P=0.246), and ENVIR (P=0.959), indicating good model fit. Affecting factors are presented in detail in Table 1, and their relative risks and 95% confidence intervals are listed in Table 2. Three factors affected the PHYS domain: gender, research work, and promotion. Research work and promotion were also the two affecting factors in the PSYCH domain. In the SOCIAL domain, only research work proved to be an affecting factor. In the ENVIR domain, three affecting factors were found: work years, medical environment, and promotion. All of the above affecting factors were significant (P<0.05). These results suggested that gender, work years, and medical environment were potential affecting factors of QOL but each influenced only one domain. In contrast, research work and promotion each influenced three domains of the WHOQOL-BREF. Title, hospital location, and clinical work were not affecting factors in any of the four domains (all P>0.05).
 

Table 1. Results of binary logistic regression on affecting factors of quality of life of Chinese urologists
 

Table 2. Relative risks (RRs) and 95% confidence intervals (CIs) of binary logistic regression showing affecting factors of quality of life of Chinese urologists
 
Discussion
The reliability and validity of the WHOQOL-BREF in a specialised Chinese population were analysed. The WHOQOL-BREF is used worldwide to assess the QOL of different populations. Our results suggest that the instrument is feasible for assessing the QOL of Chinese medical professionals such as urologists. The QOL of Chinese medical students8 and urban community residents9 has been successfully assessed using the WHOQOL-BREF, and these studies have also demonstrated the reliability and validity of the Chinese version of the instrument.8 9
 
The QOL of patients and geriatric populations is usually monitored with care. Less attention, however, has been paid to the QOL of health care practitioners, including physicians themselves. It is necessary to emphasise the QOL of health care providers when catastrophic events befall medical staff. The QOL of health care providers (including physicians, nurses, and technicians) was studied after the 2010 Haiti earthquake, and the results suggested that health care providers were dissatisfied with their environment.11 In recent years, violence in health care settings in China has become increasingly fierce. More and more physicians and nurses have encountered physical attacks, minor injuries resulting in psychological problems, or severe harm leading to death or disability. According to previous studies, Chinese medical staff live with considerable tension and fear, and their work environment and personal lives are severely affected.2 12 In these circumstances, the QOL of Chinese physicians needs to be assessed. This study aimed to evaluate the QOL of Chinese urologists across the country.
 
In our study, 856 questionnaires were returned, a response rate of about 86%, which is reasonable; a total of 708 copies were used for the final assessment. Males comprised the majority (84.3%), which may partly reflect the characteristics of urology as a specialty, since fewer women choose to become surgeons, let alone urologists. It is known that the work environment and personal lives of Chinese medical staff are severely affected by violence occurring in hospitals.1 2 Of the four sources of stress in our study, only 54.2% of respondents felt stress from the medical environment, whereas 76.0% and 85.3% felt stress from research work and promotion, respectively. On binary logistic regression analysis (Tables 1 and 2), titles, hospital location, and clinical work were not affecting factors in any of the four domains of the WHOQOL-BREF (all P>0.05), indicating that these three variables may have very limited impact on doctors’ QOL today. Taking hospital location as an example, this finding may provide a useful reference for recently graduated medical students in their job search. In recent years, most medical students have tended to seek work in big cities,13 yet our results indicated that the QOL of doctors in provincial capitals and municipalities directly under the central government may be no better than that of doctors in the other two city types, even though better opportunities for further study, a better life, and convenience drive the choice to work in big cities.
 
As in medical students,8 gender, work years, and medical environment proved to be potential affecting factors of QOL but each influenced only one domain of the WHOQOL-BREF. In contrast, research work and promotion influenced three domains. These results suggest that research work and promotion might be the two most considerable sources of stress for Chinese doctors. With China’s economic and technological take-off, and especially the huge advances in modern medicine, Chinese doctors must seize the opportunity and redouble their efforts to meet the challenges. Besides daily clinical work, they usually have to handle extensive research work, and only then will they be promoted and paid well. Taken together with the proportion of delegates choosing the medical environment as a source of stress, it seems that although the medical environment has worsened and negatively affects Chinese medical practice, its influence on urologists’ QOL is not as powerful or as deep as the wider and more subtle impact of research work and promotion. Nevertheless, the adverse effects of the deteriorating medical environment on doctors should not be ignored, as it negatively affects the QOL of medical staff and medical education, even in the next generation.2 3 Policy-makers in China should pay more attention to protecting medical staff from violent threats during medical practice, and policies to improve the situation are recommended.
 
This study has some limitations. First, selection and response bias might exist because the survey used convenience sampling, was self-reported, investigated only urologists, and made no comparisons over time or with other medical specialties. Second, other factors that might affect QOL, or that might be associated with the factors examined, were not included in the analysis.
 
Conclusions
The study indicated that the WHOQOL-BREF may be a reliable and valid tool for assessing the QOL of Chinese urologists. Previous studies confirm that the deteriorating medical environment negatively affects medical practice in China, and policies to improve the situation are recommended. We should not, however, be too pessimistic, because in the current context research work and promotion may be the most pervasive and significant factors affecting doctors’ QOL.
 
Acknowledgements
The study was supported by the Fundamental Research Funds for the Central Universities of Central South University in 2013 (no. 2013zzts095). The authors are grateful to the participants in the conference who took time to complete the survey.
 
Declaration
No conflicts of interest were declared by authors.
 
References
1. Huang J, Yan L, Zeng Y. Facing up to the threat in China. Lancet 2010;376:1823.
2. Sun S, Wang W. Violence against Chinese health-care workers. Lancet 2011;377:1747.
3. Jie L. New generations of Chinese doctors face crisis. Lancet 2012;379:1878.
4. Zeng J, Zeng XX, Tu Q. A gloomy future for medical students in China. Lancet 2013;382:1878.
5. Fernandes CM, Bouthillette F, Raboud JM, et al. Violence in the emergency department: a survey of health care workers. CMAJ 1999;161:1245-8.
6. Friedrich MJ. Human rights report details violence against health care workers in Bahrain. JAMA 2011;306:475-6.
7. Kowalenko T, Hauff SR, Morden PC, Smith B. Development of a data collection instrument for violent patient encounters against healthcare workers. West J Emerg Med 2012;13:429-33.
8. Zhang Y, Qu B, Lun S, Wang D, Guo Y, Liu J. Quality of life of medical students in China: a study using the WHOQOL-BREF. PLoS One 2012;7:e49714.
9. Xia P, Li N, Hau KT, Liu C, Lu Y. Quality of life of Chinese urban community residents: a psychometric study of the mainland Chinese version of the WHOQOL-BREF. BMC Med Res Methodol 2012;12:37.
10. Special report one: the Twentieth National Urology Conference in 2013 opens today. 2013. Available from: http://www.ynurol.com/ct_show.asp?id=557. Accessed 6 Jan 2015.
11. Haar RJ, Naderi S, Acerra JR, Mathias M, Alagappan K. The livelihoods of Haitian health-care providers after the January 2010 earthquake: a pilot study of the economic and quality-of-life impact of emergency relief. Int J Emerg Med 2012;5:13.
12. Huang SL, Ding XY. Violence against Chinese health-care workers. Lancet 2011;377:1747.
13. Xinhua: High salary and housing cannot attract medical students; candidate students prefer big cities [in Chinese]. 2009. Available from: http://news.xinhuanet.com/edu/2009-11/24/content_12528605.htm. Accessed 23 Jan 2014.

Mechanism and epidemiology of paediatric finger injuries at Prince of Wales Hospital in Hong Kong

Hong Kong Med J 2015 Jun;21(3):237–42 | Epub 8 May 2015
DOI: 10.12809/hkmj144344
© Hong Kong Academy of Medicine. CC BY-NC-ND 4.0
 
ORIGINAL ARTICLE
Mechanism and epidemiology of paediatric finger injuries at Prince of Wales Hospital in Hong Kong
WH Liu, MB, BS; Johann Lok, MB, ChB; MS Lau, MB, ChB; YW Hung, FHKCOS, FHKAM (Orthopaedic Surgery); Clara WY Wong, FHKCOS, FHKAM (Orthopaedic Surgery); WL Tse, FHKCOS, FHKAM (Orthopaedic Surgery); PC Ho, FHKCOS, FHKAM (Orthopaedic Surgery)
Department of Orthopaedics and Traumatology, Prince of Wales Hospital, The Chinese University of Hong Kong, Shatin, Hong Kong
Corresponding author: Dr WH Liu (liuwinghong@yahoo.com.hk)
 
This paper was presented at the 27th Annual Congress of the Hong Kong Society for Surgery of the Hand, 15-16 March 2014, Hong Kong.
 
Abstract
Objectives: To determine the mechanism and epidemiology of paediatric finger injuries in Hong Kong during 2003-2005 and 2010-2012.
 
Design: Comparison of two case series.
 
Setting: University-affiliated teaching hospital, Hong Kong.
 
Patients: This was a retrospective study of two cohorts of children (age, 0 to 16 years) admitted to Prince of Wales Hospital with finger injuries during two 3-year periods. Comparisons were made between the two groups for age, involved finger(s), mechanism of injury, treatment, and outcome. Telephone interviews were conducted for parents of children who sustained a crushing injury of finger(s) by door.
 
Results: A total of 137 children (group A) were admitted from 1 January 2003 to 31 December 2005, and 109 children (group B) were admitted from 1 January 2010 to 31 December 2012. Overall, the mechanisms and epidemiology of paediatric finger injuries were similar between groups A and B. Most finger injuries occurred in children younger than 5 years (group A, 56%; group B, 76%) and in their home (group A, 67%; group B, 69%). The most common mechanism was crushing injury of finger by door (group A, 33%; group B, 41%) on the hinge side (group A, 63%; group B, 64%). The right hand was most commonly involved. The door was often closed by another child (group A, 37%; group B, 23%) and the injury often occurred in the presence of adults (group A, 60%; group B, 56%). Nailbed injury was the commonest type of injury (group A, 31%; group B, 39%). Fractures occurred in 24% and 23% in groups A and B, respectively. Traumatic finger amputation requiring replantation or revascularisation occurred in 12% and 10% in groups A and B, respectively.
 
Conclusions: Crushing injury of finger by door is the most common mechanism of injury among younger children and accounts for a large number of hospital admissions. Serious injuries, such as amputations leading to considerable morbidity, can result. Crushing injury of finger by door occurs even in the presence of adults. There has been no significant decrease in the number of crushing injuries of finger by door in the 5 years between the two studies despite easily available and affordable preventive measures. It is the authors’ view that measures aimed at promoting public awareness and education, and safety precautions are needed.
 
New knowledge added by this study
  •  Similar to other countries, crushing injury of finger by door was the most common cause of paediatric finger injuries in Hong Kong.
  •  Although many preventive measures are available and easily accessible at low cost, there were no significant differences in injury mechanism and epidemiology between 2003-2005 and 2010-2012.
Implications for clinical practice or policy
  •  Paediatric crushing injury of finger by door can occur even in the presence of adults. Reinforcement of public education on the use of safety measures, including door modification and precautions in the home, should be conducted to prevent such injuries.
 
 
Introduction
Injuries to the hand and fingers are extremely common in children and can have a significant impact on a child’s growth and development. Fingers are used to explore surroundings and to perform daily activities such as playing, eating, and homework. Restricting children from these activities because of injury can have immediate and long-term detrimental effects on hand function, psychological wellbeing, and quality of life. A 10-year review of the psychological impact on children and adolescents with finger or hand injuries noted that “Hand injuries are common and loss of a dominant hand or opposition is most important [sic]. Self-esteem and skill are associated with hand sensation, appearance, and functions.”1
 
Studies by Al-Anazi2 and Doraiswamy3 have identified crushing injury of finger(s) by door as the main cause of finger injuries in children. However, there has been no local study to identify the main cause of finger injuries in Hong Kong. In 2007, Lau and Ho presented data on the epidemiology of childhood finger injuries (unpublished data; Lau M, Ho PC. 20th Annual Congress of the Hong Kong Society for Surgery of the Hand, Hong Kong, 2007) that supported the findings in other cities. Similar to Al-Anazi2 and Doraiswamy,3 Lau and Ho found that crushing injury of finger by door was the most common cause of paediatric finger injuries from 2003 to 2005, and recommended various preventive measures.
 
The present study aimed to compare the previous set of data from 2003 to 2005 reported by Lau and Ho with more recent data obtained from 2010 to 2012. By comparing the epidemiology and mechanisms of finger injuries among local Hong Kong children, we aimed to determine whether there have been any significant changes over the past 5 years.
 
Methods
Data of patients admitted to Prince of Wales Hospital from 1 January 2003 to 31 December 2005 (group A) and from 1 January 2010 to 31 December 2012 (group B) were retrieved using the Clinical Data Analysis and Reporting System (CDARS) of the Hospital Authority’s Clinical Management System. Children aged 0 to 16 years, and with at least one of the International Classification of Diseases, 9th Revision, Clinical Modification (ICD-9-CM) codes listed in the Box among the top three diagnoses were included in the analysis.
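To make this case-selection step concrete, the sketch below shows one way such a filter could be written in Python. The column names and the two ICD-9-CM codes are placeholders for illustration only; the actual case-finding list is the one given in the Box, and the real CDARS export format is not shown here.

# Illustrative sketch only: hypothetical column names and placeholder codes.
import pandas as pd

TARGET_CODES = {"883.0", "886.0"}  # placeholders; the real list is in the Box

def select_cases(records: pd.DataFrame) -> pd.DataFrame:
    """Keep children aged 0-16 with a target code among the top three diagnoses."""
    age_ok = records["age"].between(0, 16)
    diagnosis_columns = ["dx1", "dx2", "dx3"]  # top three discharge diagnoses
    code_ok = records[diagnosis_columns].isin(TARGET_CODES).any(axis=1)
    return records[age_ok & code_ok]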
 

Box. CDARS case finding list of Prince of Wales Hospital
 
Discharge summaries of all patients were reviewed to identify the mechanism of finger injury. For children in whom the mechanism was not immediately discernible from the discharge summary, further clarification was obtained by telephone interview with the child’s parents, conducted in 2006 for group A and in 2013 for group B. For children in whom the crushing injury of finger(s) was caused by a closing door, additional data were collected by telephone interview with their parents using a specifically designed questionnaire (Fig 1).
 

Figure 1. Questionnaire on crushing injury of finger by door
 
Results
Group A consisted of 140 children who presented with finger injury to Prince of Wales Hospital from 1 January 2003 to 31 December 2005. Three children from this group were excluded due to coding error. Group B comprised 109 children who presented from 1 January 2010 to 31 December 2012. No children from this group were excluded.
 
In both groups, crushing injury of finger by door was the most common cause of injury—45 (33%) in group A and 45 (41%) in group B—followed by sports injury, cut, and slip and fall (Table 1). Among children with crushing injury of finger by door, younger children were the most commonly injured (Fig 2a). The male-to-female ratio was 1:1.25 in group A and 1.37:1 in group B.
 

Table 1. Mechanism of injury
 

Figure 2. (a) Distribution of children with crushing injury of finger by door by age, and (b) localisation of injury
 
In the telephone interviews conducted with the parents of the 45 children who had crushing injury of finger by door, parents of two children in group A and six children in group B could not be contacted. Overall, all the parameters measured were similar between the periods 2003-2005 and 2010-2012. In both groups, most of the fingers involved were from the right hand, with the middle, ring, and little fingers being more commonly affected than the other fingers (Fig 2b).
 
Most of the injuries occurred at home—29 (67%) in group A and 27 (69%) in group B. At home, fingers were most frequently crushed at the hinge side of the door—27 (63%) in group A and 25 (64%) in group B—followed by the lock side and the middle of a double door. The doors were frequently closed by another child—16 (37%) in group A and 9 (23%) in group B and, in more than half of the cases, occurred even in the presence of adults—26 (60%) in group A and 22 (56%) in group B (Table 2).
 

Table 2. Telephone interviews of parents of children who had crushing injury of finger by door
 
The types of injury and their relative frequencies were compared between the groups (Table 3). Among the more common injuries were: nailbed injury—22 (31%) in group A and 22 (39%) in group B; fracture—17 (24%) in group A and 13 (23%) in group B; and laceration—17 (24%) in group A and 11 (20%) in group B. Most of the children in both group A (31 [72%]) and group B (30 [77%]) required operation. Among the 31 operations in group A, 21 (68%) were performed under general anaesthesia. By contrast, only 12 (40%) of the 30 operations in group B involved general anaesthesia.
 

Table 3. Types of injury, surgical intervention, and mode of anaesthesia of children having crushing injury of finger by door
 
Clinical outcomes were assessed at the time of the telephone interviews, ie, in 2006 for group A and in 2013 for group B. Most children recovered well following treatment. Overall, 42 (98%) children in group A and 37 (94%) in group B reported no pain. Most children had only minor cosmetic problems, with 34 (79%) in group A and 35 (89%) in group B rating their current level of cosmesis over 7 out of 10 (score 10 = no cosmetic problem). The injuries had minimal adverse effects for most children, with 42 (98%) in group A and 39 (100%) in group B rating their daily activities with a score over 8 (score 10 = no problem) [Fig 3].
 

Figure 3. Clinical outcomes assessed by telephone interview: (a) pain, (b) cosmesis, and (c) daily activities
 
However, five (12%) children in group A and four (10%) children in group B had crushing injury of finger by door resulting in finger amputation. Altogether seven children in groups A and B received replantation or revascularisation, one child underwent open reduction and fixation only, and one had a failed replantation due to failure to locate the arteries intra-operatively.
 
In group A, one child developed thrombosis following replantation of the right ring finger, requiring a revascularisation procedure 3 days later. This was complicated by hook-nail deformity 1 year after replantation, which was subsequently treated by further reconstructive procedures. The levels of satisfaction with appearance and daily function at final follow-up were rated 5 and 3, respectively (out of 10, with 10 meaning no cosmetic problem and no problem in daily activities).
 
Discussion
Crushing injury of finger by door is common. The true incidence of this type of injury is likely to be higher than reported, as our data were limited to public hospitals and relied on the correct entry of ICD-9-CM codes into the CDARS. Data from accident and emergency departments and private practitioners were not analysed. Furthermore, many minor injuries might have been managed at home and never reported.
 
Crushing injury of finger by door is not just a local problem. Studies from Saudi Arabia and Glasgow showed that this type of injury accounted for most childhood fingertip injuries in these areas.2 3 These injuries consistently occurred at home, with the involved finger being frequently crushed at the hinge side of doors. Younger children were mostly affected. The similarity in epidemiology between the overseas data and our local data can help with recommendations for suitable door safety devices.
 
In this study, we identified that crushing injury by door was the major cause of paediatric finger injuries leading to hospital admission in both 2003-2005 and 2010-2012. Although most children were satisfied with the level of pain, cosmesis, and daily function of the injured digit at their final follow-up after treatment, serious injuries involving fractures and amputations occurred in a minority of patients. In addition to the surgical intervention and long-term hospitalisation required, these injuries could further lead to detrimental effects on the children’s growth and development.
 
Our study showed the presence of adults did not reduce the rate of these accidents, since most occurred even in the presence of an adult. This highlights the need for other preventative measures.
 
Many types of safety devices are easily available and affordable in Hong Kong. As the hinge side of doors is the most common side for fingers to be crushed, finger guard devices can be installed to prevent fingers being trapped in the opposing surfaces. Triangular-shaped rubbers, plastic or wooden stoppers can be inserted at the bottom of a door to prevent spontaneous closure. Magnets applied to the back of a door and its opposing wall surface present another equally effective and simple method of preventing unintended door closures. Dampers can be set up to reduce the speed of closing doors, thereby decreasing the force exerted on trapped fingers. The use of automatic doors should be avoided.
 
Yet, despite the easy availability and accessibility of these safety devices, there has been no significant change or improvement in terms of incidence and morbidity of children with crushing injury of fingers by door admitted to Prince of Wales Hospital in the 5-year period between 2003-2005 and 2010-2012. Thus, we should promote public awareness about this type of injury and provide more educational programmes on safety precautions in order to reduce the incidence of crushing injury of finger by door.
 
Conclusions
 
Crushing injury of finger by door is the most common cause of paediatric finger injury requiring hospitalisation in Hong Kong. These injuries frequently result in hospital admission and surgical intervention, with considerable morbidity and high treatment cost. Crushing injury of finger by door occurs even in the presence of adults. Despite the easily available and affordable preventive measures in Hong Kong, our comparison revealed no significant difference in the incidence, nature, and severity of these domestic injuries between 2003-2005 and 2010-2012. Thus, it is our view that more effort should be invested in raising public awareness and education about these preventable injuries and in promoting preventive measures.
 
References
1. Stoddard F, Saxe G. Ten-year research review of physical injuries. J Am Acad Child Adolesc Psychiatry 2001;40:1128-45.
2. Al-Anazi AF. Fingertip injuries in paediatric patients—experiences at an emergency centre in Saudi Arabia. J Pak Med Assoc 2013;63:675-9.
3. Doraiswamy NV. Childhood finger injuries and safeguards. Inj Prev 1999;5:298-300.

Double balloon catheter for induction of labour in Chinese women with previous caesarean section: one-year experience and literature review

Hong Kong Med J 2015 Jun;21(3):243–50 | Epub 22 May 2015
DOI: 10.12809/hkmj144404
© Hong Kong Academy of Medicine. CC BY-NC-ND 4.0
 
ORIGINAL ARTICLE
Double balloon catheter for induction of labour in Chinese women with previous caesarean section: one-year experience and literature review
Queenie KY Cheuk, MB, ChB, FHKAM (Obstetrics and Gynaecology)1;  TK Lo, MB, BS, FHKAM (Obstetrics and Gynaecology)2;  CP Lee, FRCOG, FHKAM (Obstetrics and Gynaecology)2;  Anita PC Yeung, FRCOG, FHKAM (Obstetrics and Gynaecology)1
1 Department of Obstetrics and Gynaecology, Pamela Youde Nethersole Eastern Hospital, Chai Wan, Hong Kong
2 Department of Obstetrics and Gynaecology, Queen Mary Hospital, The University of Hong Kong, Pokfulam, Hong Kong
Corresponding author: Dr Queenie KY Cheuk (cheuky3@ha.org.hk)
Abstract
Objectives: To evaluate the efficacy and safety of double balloon catheter for induction of labour in Chinese women with one previous caesarean section and unfavourable cervix at term.
 
Design: Retrospective cohort study.
 
Setting: A regional hospital in Hong Kong.
 
Patients: Women with previous caesarean delivery requiring induction of labour at term and with an unfavourable cervix from May 2013 to April 2014.
 
Major outcome measures: The primary outcome was the rate of successful vaginal delivery (spontaneous or instrument-assisted) using the double balloon catheter. Secondary outcomes were the double balloon catheter induction-to-delivery and removal-to-delivery intervals; cervical score improvement; oxytocin augmentation; maternal or fetal complications during cervical ripening and the intrapartum and postpartum periods; and risk factors associated with unsuccessful induction.
 
Results: All 24 Chinese women tolerated the double balloon catheter well. After double balloon catheter expulsion or removal, the cervix had ripened successfully in 18 (75%) cases. The median improvement in Bishop score was 3 (interquartile range, 2-4), which was statistically significant (P<0.001). Overall, 18 (75%) women delivered vaginally. The median insertion-to-delivery and removal-to-delivery intervals were 19 (interquartile range, 13.4-23.0) hours and 6.9 (interquartile range, 4.1-10.8) hours, respectively. Compared with cases without, the interval to delivery was significantly shorter in those with spontaneous balloon expulsion or spontaneous membrane rupture during ripening (7.8 vs 3.0 hours; P=0.025). There were no major maternal or neonatal complications. The only factor significantly associated with failed vaginal birth after caesarean was previous caesarean section for failure to progress (P<0.001).
 
Conclusions: This is the first study using double balloon catheter for induction of labour in Asian Chinese women with previous caesarean section. Using double balloon catheter, we achieved a vaginal birth after caesarean rate of 75% without major complications.
 
New knowledge added by this study
  •  This is the first report from Asian Chinese women on the use of double balloon catheter (DBC) for induction of labour in the presence of a caesarean scar. Using DBC, a vaginal birth after caesarean (VBAC) rate of 75% was achieved without major complications.
  •  During cervical ripening with DBC, cases with spontaneous balloon expulsion or spontaneous membrane rupture had a more favourable outcome with shorter interval to delivery.
  •  Previous caesarean section for failure to progress was significantly associated with failed VBAC.
Implications for clinical practice or policy
  •  Our anecdotal experience with DBC was favourable and its application may reduce repeated caesarean section rates. Further research exploring this potential is warranted and large randomised controlled trials are needed to confirm its efficacy.
 
 
Introduction
There is widespread public and professional concern about the increasing rate of caesarean section (CS). In the UK and North America, around 25% and 32% of births, respectively, were by CS.1 2 In Hong Kong, according to the 2009 territory-wide O&G audit report, the CS rate was around 42.1%.3 Previous CS has been the most common indication for caesarean delivery.3 In subsequent pregnancies, CS can be associated with serious maternal morbidity.4 To reduce the CS rate and related morbidity, vaginal birth after caesarean (VBAC) is an alternative advocated in most developed countries.2 5 6 According to UK and North American guidelines, induction of labour (IOL) can be offered, after discussion, to women with medical or obstetric indications who opt for VBAC.2 5
 
An unfavourable cervix is a common obstetric problem that can be addressed using pharmacological or mechanical methods of cervical ripening. In Hong Kong, pharmacological methods are more commonly used for IOL. In women with previous CS, the increased risk of uterine rupture is a major concern during IOL.2 5 6 7 Mechanical methods apply pressure on the internal cervical os, stretch the lower uterine segment, and increase local production of prostaglandin. There is no compelling evidence that they increase the risk of uterine rupture; in addition, mechanical devices can be readily removed when needed and are stable at room temperature. Compared with the conventional Foley catheter, the double balloon catheter (DBC) has an additional cervicovaginal balloon, allowing greater compression of the cervical os and avoiding the need for traction (Fig 1). Nevertheless, reports on experience with the DBC are limited. Regarding which IOL method is suitable in women with prior CS, a recent Cochrane review concluded that there was insufficient information available to determine the optimal method.7
 

Figure 1. Double balloon catheter
 
Since May 2013, our unit has offered the option of DBC (Cook Cervical Ripening Balloon; Cook Medical, Bloomington [IN], US) for IOL in women with one previous CS. We therefore conducted a study of Chinese women to evaluate the efficacy and safety of the DBC for IOL in women with one previous CS and an unfavourable cervix at term. Another objective was to identify risk factors associated with unsuccessful VBAC. This is one of the first studies to report the use of DBC for this indication in an Asian Chinese population.
 
Methods
This retrospective study was conducted in the obstetrics unit of Pamela Youde Nethersole Eastern Hospital in Hong Kong. The unit provides tertiary care and conducts over 3000 deliveries per year. Prior to the introduction of the DBC, the background CS and VBAC rates in our unit were approximately 30% and 1.9%, respectively, and the overall VBAC success rate was more than 80%. We identified VBAC cases in which the DBC was used for IOL between 1 May 2013 and 30 April 2014 through the departmental database. Clinical details were reviewed from the case notes and hospital electronic systems.
 
Inclusion and exclusion criteria
Inclusion criteria were women with one lower transverse caesarean scar and no contra-indication to VBAC who were given the option of either repeat elective CS or VBAC. VBAC cases requiring medically or obstetrically indicated IOL were offered the DBC if the cervix was unfavourable (modified Bishop score <6) and the membranes were intact.
 
The exclusion criteria for using DBC were: women with two or more previous CS, classical CS scar, inverted T or J or low vertical incision in previous CS; previous uterine scar for gynaecological conditions, eg myomectomy, hysterotomy; congenital uterine abnormality; twin pregnancy, non-cephalic presentations, intra-uterine death, suspected fetal distress; uterine fibroids which may obstruct labour, placenta praevia, antepartum haemorrhage, leaking, clinical chorioamnionitis, suspected macrosomia (ultrasound estimated fetal weight ≥4000 g), polyhydramnios (amniotic fluid index ≥25 cm or single deepest pocket ≥8 cm), congenital fetal abnormalities; and maternal diseases or maternal infection which would contra-indicate vaginal delivery or warrant prompt delivery. Ethical approval for this study was obtained from the local institutional human research ethics committee.
 
Induction-of-labour protocol
Eligible patients were admitted to hospital in the evening and an initial Bishop score was obtained. Cardiotocography for 60 minutes and ultrasound scanning to assess estimated fetal weight, liquor volume, fetal wellbeing by umbilical artery Doppler, and placental location were performed. After informed consent, the DBC was inserted according to the manufacturer’s instructions. If DBC insertion failed, CS was offered the next morning. DBC insertion in all patients was performed by one investigator (KY Cheuk). The uterine and vaginal balloons were inflated in stages with normal saline to 40-50 mL and 60 mL, respectively. After insertion, vaginal examination was performed to confirm correct placement, and the catheter was taped to the woman’s inner thigh without tension. Continuous fetal heart monitoring (CFHM) was then performed for 60 minutes. The catheter was kept in place for 12 hours if spontaneous expulsion did not occur, or removed earlier if there was spontaneous rupture of membranes, excessive vaginal bleeding, fetal distress, scar tenderness, or patient intolerance. Immediately after balloon expulsion or removal, the Bishop score was reassessed, followed by an attempt at artificial rupture of membranes (ARM) regardless of Bishop score. To reduce potential inter-observer bias, the same investigator (KY Cheuk) assessed the Bishop score before DBC insertion and immediately after DBC expulsion or removal. If uterine contractions remained suboptimal after ARM, oxytocin infusion (Syntocinon; Sandoz Pharmaceuticals, East Hanover [NJ], US) was commenced at 1 mU/min, and the infusion rate was doubled every 30 minutes until contractions were regular at 3-minute intervals, to a maximum of 8 mU/min. Oxytocin was not started before membrane rupture or while the DBC was still in place. CFHM was continued from ARM until delivery. Labour was managed by the attending obstetrician and midwives. Assessment of labour progress and administration of analgesia followed departmental protocols. Group B streptococcus prophylaxis was given according to departmental protocol; for carriers, it was commenced after DBC insertion and continued until delivery.
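Purely as an arithmetic illustration of the titration schedule just described (start at 1 mU/min, double every 30 minutes, cap at 8 mU/min), the short sketch below prints the resulting dose steps. It is not clinical guidance and is not part of the study protocol.

# Arithmetic illustration only, not clinical guidance.
def oxytocin_schedule(start=1.0, cap=8.0, step_minutes=30, steps=6):
    """Return (minutes, dose in mU/min) pairs for a dose-doubling titration."""
    schedule, dose, t = [], start, 0
    while len(schedule) < steps:
        schedule.append((t, dose))
        dose = min(dose * 2, cap)  # double each step, never exceeding the cap
        t += step_minutes
    return schedule

# [(0, 1.0), (30, 2.0), (60, 4.0), (90, 8.0), (120, 8.0), (150, 8.0)]
print(oxytocin_schedule())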
 
Outcome measures
The primary outcome was successful vaginal delivery (spontaneous or instrument-assisted). Secondary outcomes were: induction-to-delivery interval; device removal-to-delivery interval; cervical score improvement; oxytocin augmentation; maternal or fetal complications during cervical ripening and the intrapartum and postpartum periods, including failed device insertion, inability to void during insertion, intolerance of the device necessitating early removal, uterine hyperstimulation, uterine rupture, fetal distress, abruption, antepartum haemorrhage, cord prolapse, malpresentation, meconium-stained liquor, intrapartum and postpartum infection, postpartum haemorrhage, readmission during the puerperium, delivery of a neonate with an Apgar score of <7 at 5 minutes, cord blood pH of <7.2, admission to the neonatal intensive care unit, neonatal sepsis, respiratory distress syndrome, and neonatal death; and risk factors associated with unsuccessful induction.
 
Uterine hyperstimulation was defined as either the occurrence of five or more contractions in 10 minutes for two consecutive 10-minute periods, or a contraction lasting at least 2 minutes, with or without changes in the fetal heart rate pattern. Uterine rupture was defined as disruption of the uterine muscle extending to and involving the uterine serosa, or disruption of the uterine muscle with extension to the bladder or broad ligament.5 Uterine dehiscence was defined as disruption of the uterine muscle with intact uterine serosa.5 Intrapartum infection was defined as maternal fever of ≥38°C during labour. Failed IOL was defined as failed ARM after catheter removal, or cervical dilatation of <3 cm after at least 8 hours of optimal uterine contractions.
 
Literature review
We also conducted a literature search of PubMed, Ovid MEDLINE, EMBASE, the Cochrane Library database of systematic reviews, and open library sources using the keywords “double balloon catheter”, “Atad balloon”, “double balloon device”, “Foley catheter”, “induction”, “previous caesarean section”, and “previous scarred uterus”. Bibliographies of all relevant articles identified were searched manually to locate additional studies. We excluded non-English publications and papers for which the original full text was not available from sources such as PubMed, local hospital or university library systems, or the internet.
 
Statistical analyses
Statistical analysis was performed with PASW Statistics 18, Release Version 18.0.0 (SPSS Inc, 2009, Chicago [IL], US). Fisher’s exact test was used for categorical data; the independent t test was used for normally distributed continuous data and the non-parametric Mann-Whitney U test for highly skewed data. Univariate analysis was used to assess the risk factors associated with unsuccessful VBAC. The Wilcoxon signed rank test was used to compare cervical Bishop scores before and after DBC application. The critical level of statistical significance was set at P<0.05.
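As a concrete illustration of the tests named above, the sketch below runs each of them in Python with small made-up numbers; the data and variable names are hypothetical and do not come from the study.

# Illustrative sketch only; the study used PASW/SPSS and these numbers are invented.
from scipy import stats

# Fisher's exact test on a hypothetical 2x2 table (eg indication vs VBAC outcome)
odds_ratio, p_fisher = stats.fisher_exact([[10, 2], [3, 9]])

# Mann-Whitney U test for a skewed continuous variable between two groups
hours_with_expulsion = [2.5, 3.0, 3.4, 4.1, 2.8]
hours_without = [6.9, 7.8, 9.2, 5.5, 10.8]
u_stat, p_mwu = stats.mannwhitneyu(hours_with_expulsion, hours_without)

# Wilcoxon signed rank test for paired Bishop scores before and after DBC
bishop_before = [3, 2, 4, 3, 5, 2, 3, 4]
bishop_after = [6, 5, 8, 7, 9, 4, 8, 6]
w_stat, p_wilcoxon = stats.wilcoxon(bishop_before, bishop_after)

print(p_fisher, p_mwu, p_wilcoxon)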
 
Results
Twenty-five cases were identified during the 1-year study period, and one non-Chinese woman’s data were excluded. The remaining 24 cases were included for analysis. Table 1 summarises the baseline characteristics of study patients.
 

Table 1. Baseline characteristics of women with previous caesarean section induced with double balloon catheter
 
Figure 2 depicts the induction process and outcomes of the 24 cases; the DBC was well tolerated in all cases (Table 2). There was no case of failed insertion. After DBC expulsion or removal, the cervix became favourable (Bishop score ≥6) in 18 (75%) cases. The median improvement in Bishop score was 3 (interquartile range [IQR], 2-4), which was statistically significant (P<0.001). Artificial rupture of membranes was successful in all 22 cases with intact membranes, regardless of cervical favourability. Oxytocin augmentation was required in 18 (75%) cases. Overall, 75% of cases were delivered vaginally; among them, the median insertion-to-delivery and removal-to-delivery intervals were 19 (IQR, 13.4-23.0) hours and 6.9 (IQR, 4.1-10.8) hours, respectively. All four women with previous vaginal deliveries had successful VBAC. Compared with cases without, the balloon expulsion-to-delivery or removal-to-delivery interval was shorter in those with spontaneous balloon expulsion or early balloon removal due to spontaneous membrane rupture during ripening (7.8 vs 3.0 hours; P=0.025). All neonates had good outcomes, with cord blood pH of >7.25 and a 5-minute Apgar score of 10, and none required neonatal intensive care unit admission (Table 3). One woman reported severe scar pain during oxytocin augmentation; scar dehiscence was suspected and emergency CS was performed, but dehiscence was not substantiated intra-operatively and the baby was born in good condition. Apart from a few maternal complications (eg, postpartum infection and postpartum haemorrhage), there was no case of uterine rupture and no adverse neonatal complications.
 

Figure 2. Labour induction with double balloon catheter
 

Table 2. Labour and delivery outcomes after induction of labour with DBC
 

Table 3. Maternal complications and neonatal outcomes after induction of labour with double balloon catheter
 
To study the risk factors associated with unsuccessful VBAC, univariate analysis was performed using maternal age, height, body mass index, cervical Bishop score, cervical favourability after DBC removal or dislodgement, gender and birth weight of baby, gestational diabetes mellitus, history of vaginal delivery, history of successful VBAC, inter-pregnancy interval, the indication for previous CS, and the indication for IOL in the current pregnancy as variables. Previous CS for failure to progress was the only factor significantly associated with unsuccessful VBAC (P<0.001).
 
Discussion
Few studies have investigated the use of the DBC for IOL in patients with a previous caesarean scar. Table 4 summarises the findings of some of these studies8 9 10 11 together with the current study. Most studies showed significant improvement in Bishop score after use of the device, and a favourable cervix was achieved in 75% to 85% of cases. The overall vaginal delivery rate was 60.2% (71/118). There was one (0.85%) case of symptomatic scar dehiscence and no adverse neonatal complications. The cervical ripening success rate in our study was comparable to that in other studies, and our study achieved a higher vaginal delivery rate. One explanation may be differences in the IOL protocols: some authors offered CS directly if the cervix failed to ripen after the DBC,9 whereas our practice was to continue induction with ARM and oxytocin even if the cervix remained unfavourable, and 83.3% (5/6) of women in this group delivered vaginally with continued induction. A second potential reason is differences in inclusion criteria: some studies excluded women with previous vaginal delivery,10 a factor known to be associated with successful VBAC. A third reason could be differences in ethnicity, which has been shown to affect VBAC success rates.12
 

Table 4. Analysis of studies using DBC and Foley catheter for IOL in women with previous caesarean section8 9 11 14 20 21 22 23 24
 
With increasing rates of CS worldwide, it is estimated that 10% of women requiring IOL have a history of CS, yet the optimal induction method for this high-risk group is unknown. For an unfavourable cervix with intact membranes, prostaglandins and mechanical methods such as the Foley catheter or DBC have been used. Prostaglandins appear to be associated with a higher risk of uterine rupture.13 14 Ravasia et al14 found that the relative risk of uterine rupture with prostaglandins versus spontaneous labour was 6.41, whereas the risk with the Foley catheter was comparable to that of spontaneous labour. Although infective morbidity associated with mechanical induction is a concern, the evidence is contradictory.15 16 A systematic review by Heinemann et al15 of studies using the Foley catheter for IOL showed that mechanical devices were associated with a significant increase in infectious maternal morbidity compared with pharmacological agents. On the other hand, a recent Cochrane review showed no increase in serious maternal morbidity with the Foley catheter.7 Further support comes from the recent open-label randomised controlled trial PROBAAT,16 which compared the Foley catheter with vaginal prostaglandin in 824 women without previous CS and showed similar CS rates, less uterine hyperstimulation, fewer maternal and fetal morbidities, and no increase in infectious morbidity with the Foley catheter. Although the Foley catheter featured in all these studies, the DBC potentially has additional utility for an unripe cervix because it applies pressure on both the external and internal os, avoiding the need for traction and reducing the associated patient discomfort. The DBC also has a larger inflated volume than the Foley catheter (80 mL vs 30 mL), and IOL with a bigger balloon volume may shorten the duration of labour with better cervical dilatation.17 Nevertheless, clinical data comparing the DBC with the Foley catheter in the presence of a caesarean scar are lacking, and those in women with an unscarred uterus are scarce and inconclusive.18 19
 
Table 4 also summarises the results of studies using the Foley catheter for IOL in the presence of a caesarean scar.14 20 21 22 23 24 The DBC achieved a comparable vaginal delivery rate (60.2% vs 58.0%) and a similar uterine rupture/dehiscence rate (0.85% vs 0.65%). No neonatal death was reported in studies using the DBC, whereas two were reported with the Foley catheter: one due to uterine rupture, and one due to rupture of vasa praevia, which was independent of the method of induction.23 Further research comparing the efficacy and safety of the two devices for IOL in women with previous CS is warranted. Although uterine rupture and infectious morbidity appeared rare with the DBC in women with previous CS, the number of women studied was too small to allow a solid conclusion on its safety.
 
The complication rate for VBAC attempt was highest in those who failed to achieve VBAC in the end.25 Knowledge of the factors associated with successful VBAC would therefore enable better counselling on the choice of mode of delivery. Landon et al12 in a large cohort of 14 529 women showed that previous vaginal delivery and previous successful VBAC were the best predictors of successful VBAC; the success rates were 86.6% and 89.6%, respectively. In our study, all four cases with previous vaginal delivery (including one with previous VBAC) had successful VBAC. In the study by Landon et al,12 factors associated with unsuccessful VBAC included obesity, previous CS for dystocia, IOL, birth weight of <4000 g, advanced maternal age, short stature, more than 2 years from previous caesarean, gestational age of ≥41 weeks, and previous preterm CS. Despite our small sample size, we concurred that previous CS for failure to progress was a significant factor associated with unsuccessful VBAC.
 
Conclusions
This is the first report from East Asia on the use of DBC for IOL in the presence of a caesarean scar. A VBAC success rate of 75% was achieved in Chinese women with a caesarean scar and an unfavourable cervix. The DBC procedure was well tolerated, and no major complications were observed. Our favourable experience with DBC in Asian Chinese women lends support to further research exploring the potential of this promising modality in averting the rising CS rates in this part of the world.
 
References
1. Caesarean section. NICE Clinical Guidelines. National Collaborating Centre for Women’s and Children’s Health (UK). London: RCOG Press; November 2011.
2. Vaginal birth after previous Caesarean delivery. ACOG Practice Bulletin. No. 115. American College of Obstetricians and Gynecologists; August 2010.
3. HKCOG Territory-wide O&G Audit Report: Caesarean section. Hong Kong: Hong Kong College of Obstetricians and Gynaecologists; 2009.
4. Bates GW Jr, Shomento S. Adhesion prevention in patients with multiple cesarean deliveries. Am J Obstet Gynecol 2011;205(6 Suppl):S19-24. Crossref
5. Birth after previous Caesarean birth. RCOG Green-top Guideline No. 45. Royal College of Obstetricians and Gynaecologists; February 2007.
6. Society of Obstetricians and Gynaecologists of Canada. SOGC clinical practice guidelines. Guidelines for vaginal birth after previous caesarean birth. Number 155 (Replaces guideline Number 147), February 2005. Int J Gynaecol Obstet 2005;89:319-31.
7. Jozwiak M, Dodd JM. Methods of term labour induction for women with a previous caesarean section. Cochrane Database Syst Rev 2013;(3):CD009792. Crossref
8. Khotaba S, Volfson M, Tarazova L, et al. Induction of labor in women with previous cesarean section using the double balloon device. Acta Obstet Gynecol Scand 2001;80:1041-2. Crossref
9. Miller TD, Davis G. Use of the Atad catheter for the induction of labour in women who have had a previous Caesarean section—a case series. Aust N Z J Obstet Gynaecol 2005;45:325-7. Crossref
10. Ferradas E, Alvarado I, Gabilondo M, Diez-Itza I, García-Adanez J. Double balloon device compared to oxytocin for induction of labour after previous caesarean section. Open J Obstet Gynecol 2013;3:212-6. Crossref
11. Ebeid E, Nassif N. Induction of labor using double balloon cervical device in women with previous cesarean section: experience and review. Open J Obstet Gynecol 2013;3:301-5. Crossref
12. Landon MB, Leindecker S, Spong CY, et al. The MFMU Cesarean Registry: factors affecting the success of trial of labor after previous cesarean delivery. Am J Obstet Gynecol 2005;193:1016-23. Crossref
13. Lydon-Rochelle M, Holt VL, Easterling TR, Martin DP. Risk of uterine rupture during labor among women with a prior cesarean delivery. N Engl J Med 2001;345:3-8. Crossref
14. Ravasia DJ, Wood SL, Pollard JK. Uterine rupture during induced trial of labor among women with previous cesarean delivery. Am J Obstet Gynecol 2000;183:1176-9. Crossref
15. Heinemann J, Gillen G, Sanchez-Ramos L, Kaunitz AM. Do mechanical methods of cervical ripening increase infectious morbidity? A systematic review. Am J Obstet Gynecol 2008;199:177-87. Crossref
16. Jozwiak M, Oude Rengerink K, Benthem M, et al. Foley catheter versus vaginal prostaglandin E2 gel for induction of labour at term (PROBAAT trial): an open-label, randomized controlled trial. Lancet 2011;378:2095-103. Crossref
17. Levy R, Kanengiser B, Furman B, Ben Arie A, Brown D, Hagay ZJ. A randomized trial comparing a 30-mL and an 80-mL Foley catheter balloon for preinduction cervical ripening. Am J Obstet Gynecol 2004;191:1632-6. Crossref
18. Salim R, Zafran N, Nachum Z, Garmi G, Kraiem N, Shalev E. Single-balloon compared with double-balloon catheters for induction of labor: a randomized controlled trial. Obstet Gynecol 2011;118:79-86. Crossref
19. Mei-Dan E, Walfisch A, Valencia C, Hallak M. Making cervical ripening EASI: a prospective controlled comparison of single versus double balloon catheters. J Matern Fetal Neonatal Med 2014;27:1765-70. Crossref
20. Ben-Aroya Z, Hallak M, Segal D, Friger M, Katz M, Mazor M. Ripening of the uterine cervix in a post-cesarean parturient: prostaglandin E2 versus Foley catheter. J Matern Fetal Neonatal Med 2002;12:42-5. Crossref
21. Bujold E, Blackwell SC, Gauthier RJ. Cervical ripening with transcervical Foley catheter and the risk of uterine rupture. Obstet Gynecol 2004;103:18-23. Crossref
22. Ziyauddin F, Hakim S, Beriwal S. The transcervical Foley catheter versus the vaginal prostaglandin E2 gel in the induction of labour in a previous one caesarean section—a clinical study. J Clin Diagn Res 2013;7:140-3. Crossref
23. Jozwiak M, van de Lest HA, Burger NB, Dijksterhuis MG, De Leeuw JW. Cervical ripening with Foley catheter for induction of labor after cesarean section: a cohort study. Acta Obstet Gynecol Scand 2014;93:296-301. Crossref
24. Sananès N, Rodriguez M, Stora C, et al. Efficacy and safety of labour induction in patients with a single previous Caesarean section: a proposal for a clinical protocol. Arch Gynecol Obstet 2014;290:669-76. Crossref
25. Landon MB, Hauth JC, Leveno KJ, et al. Maternal and perinatal outcomes associated with a trial of labor after prior cesarean delivery. N Engl J Med 2004;351:2581-9. Crossref

Impact of nuchal cord on fetal outcomes, mode of delivery, and management: a questionnaire survey of pregnant women

Hong Kong Med J 2015 Apr;21(2):143–8 | Epub 10 Mar 2015
DOI: 10.12809/hkmj144349
© Hong Kong Academy of Medicine. CC BY-NC-ND 4.0
 
ORIGINAL ARTICLE
Impact of nuchal cord on fetal outcomes, mode of delivery, and management: a questionnaire survey of pregnant women
CW Kong, FHKAM (Obstetrics and Gynaecology); Diana HY Lee, MB, BS; LW Chan, FRCOG; William WK To, MD, FRCOG
Department of Obstetrics and Gynaecology, United Christian Hospital, Kwun Tong, Hong Kong
Corresponding author: Dr CW Kong (melizakong@gmail.com)
 Full paper in PDF
Abstract
Objectives: To explore pregnant women’s views on the impact of nuchal cord on fetal outcomes, mode of delivery, and management.
 
Design: Questionnaire survey.
 
Setting: Antenatal clinic of two regional hospitals in Hong Kong.
 
Participants: A questionnaire survey of all pregnant women at their first visit to the antenatal clinic of United Christian Hospital and Tseung Kwan O Hospital in Hong Kong was conducted between August and October 2012.
 
Results: Most participants (71.8%) were worried about nuchal cord, and 78.3% and 87.7% of them thought that nuchal cord could cause intrauterine death and fetal death during labour, respectively. Approximately 87.5% of participants thought that nuchal cord would reduce the chance of successful vaginal delivery and 56.4% thought that it would increase the chance of assisted vaginal delivery. Most (94.1%) participants thought that it was necessary to have an ultrasound scan at term to detect nuchal cord. In addition, 68.8% thought that it was necessary to deliver the fetus early and 72.8% thought that caesarean section must be performed in the presence of nuchal cord. Participants born in Mainland China were significantly more worried about the presence of nuchal cord than those born in Hong Kong. However, there was no difference between participants with different levels of education.
 
Conclusion: Most participants were worried about the presence of nuchal cord. Many thought that nuchal cord would lead to adverse fetal outcomes, affect the mode of delivery, and require special management. These misconceptions should be addressed and proper education of women is needed.
 
 
New knowledge added by this study
  •  Most women were worried about the presence of nuchal cord.
  •  Many women thought that nuchal cord would lead to adverse fetal outcomes, affect the mode of delivery, and require special management.
Implications for clinical practice or policy
  •  Routine ultrasound scans for nuchal cord should be avoided in order to reduce needless maternal anxiety and unnecessary caesarean sections on women's request.
  •  The correct concept that nuchal cord would not normally lead to adverse fetal outcomes and that its presence should not affect the mode of delivery should be publicised widely in Hong Kong.
 
 
Introduction
In daily clinical practice, pregnant women regularly request antenatal ultrasound scans to look for nuchal cord around the time of delivery or request that the presence of nuchal cord is specifically checked for when they undergo ultrasound scans for other obstetric reasons. Many women have requested elective caesarean sections because nuchal cord has been detected on ultrasound scan. In order to explore women’s views on the impact of nuchal cord on fetal outcomes, mode of delivery and management, we conducted a questionnaire survey to evaluate their true concerns and beliefs.
 
Methods
A questionnaire evaluating the perceived impact of nuchal cord on fetal outcomes and mode of delivery was distributed to all pregnant women at their first antenatal visit to the out-patient clinic of United Christian Hospital and Tseung Kwan O Hospital from August to October 2012. The questionnaire was available in three versions (traditional Chinese, simplified Chinese, and English) according to the participant's preference (Appendices 1 to 3). Participants who were not able to understand Chinese or English were excluded from the study. The questionnaires were collected by the nursing staff immediately after completion. Assuming that 50% of the women would express concern about the presence of nuchal cord, a sample size of 357 women would allow for random errors of up to 5%. Assuming a response rate of around 80%, distribution of around 450 questionnaires would be sufficient.
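As a worked illustration of the sample size quoted above, the sketch below applies the standard margin-of-error formula for a single proportion. The finite-population correction and the assumed source population of roughly 5000 deliveries are our own assumptions for illustration; they are not figures reported by the authors.

import math

def sample_size_proportion(p=0.5, margin=0.05, z=1.96, population=None):
    # Minimum sample size to estimate a proportion p within a given
    # margin of error at approximately 95% confidence (z = 1.96).
    n = (z ** 2) * p * (1 - p) / margin ** 2      # about 385 without correction
    if population is not None:                    # optional finite-population correction
        n = n / (1 + (n - 1) / population)
    return math.ceil(n)

print(sample_size_proportion())                   # 385: uncorrected estimate
print(sample_size_proportion(population=5000))    # 357: assuming a source population of ~5000
print(math.ceil(357 / 0.8))                       # 447: questionnaires needed at an 80% response rate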
 
The Statistical Package for the Social Sciences (Windows version 20.0; SPSS Inc, Chicago [IL], US) was used for statistical analysis. The Chi squared test and Fisher's exact test were used as appropriate. A P value of <0.05 was considered statistically significant.
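As a minimal sketch of the kind of 2×2 comparison described above (for example, place of birth versus worry about nuchal cord), the code below uses SciPy; the counts are purely illustrative and are not the study data.

import numpy as np
from scipy.stats import chi2_contingency, fisher_exact

# Hypothetical 2x2 table: rows = born in Hong Kong / born in Mainland China,
# columns = worried / not worried about nuchal cord (illustrative counts only).
table = np.array([[300, 150],
                  [310,  90]])

chi2, p_chi2, dof, expected = chi2_contingency(table)
if (expected < 5).any():              # fall back to Fisher's exact test for small expected counts
    _, p = fisher_exact(table)
else:
    p = p_chi2

print(f"P = {p:.4f}; significant at the 0.05 level: {p < 0.05}")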
 
Results
Of 950 questionnaires distributed, a total of 869 (91.5%) were returned. The demographic data of the participants are shown in Table 1. Around 72% of participants expressed worry about nuchal cord. The demographic parameters of participants who expressed worry about nuchal cord were compared (Table 2). Participants born in Mainland China were more worried about nuchal cord than those born in Hong Kong. Advanced maternal age, nulliparity, and lower education level were not associated with greater maternal anxiety about nuchal cord.
 

Table 1. Demographic data of the participants (n=869)
 

Table 2. Comparison of the demographic data of participants who were concerned about nuchal cord
 
The perceived incidence of nuchal cord was assessed by a linear scale from 0% to 100%. Excluding the 50 participants who did not reply to this question, 37.9% thought that the incidence of nuchal cord was less than 20%. The perceived sonographic accuracy for nuchal cord was similarly assessed. Around one third (31.2%) of participants thought that the accuracy was less than 70% while 35 participants did not answer this question.
 
The perceived impact of nuchal cord on fetal outcomes, mode of delivery, and management is shown in Table 3. Overall, 78.3% and 87.7% of participants thought that nuchal cord could cause intrauterine death and fetal death during labour, respectively, while 87.5% thought that it would reduce the chance of successful vaginal delivery and 56.4% thought that it would increase the chance of assisted vaginal delivery. In addition, 94.1% of participants thought that it was necessary to have an ultrasound scan to detect nuchal cord at term, while 68.8% thought that it was necessary to deliver the fetus early and 72.8% thought that caesarean section must be performed in the presence of nuchal cord.
 

Table 3. Participants’ views of nuchal cord on fetal outcomes, mode of delivery, and management of nuchal cord (n=869)
 
Women's experiences of nuchal cord from their previous pregnancies or from their relatives' or friends' deliveries were also explored. We asked about the mode of delivery for pregnancies with nuchal cord and whether or not the babies were healthy. Only 32 (8.8%) participants had had nuchal cord in their previous pregnancies; one participant had nuchal cord in both of her previous two pregnancies. Among these nuchal cord pregnancies, 48.5% ended in normal vaginal delivery, 15.2% in instrumental delivery, and 36.4% in caesarean section. None of these babies was reported to be unhealthy. A total of 142 (16.6%) participants had relatives or friends who had had nuchal cord in their pregnancies, and some had more than one such relative or friend; the total number of these deliveries with nuchal cord was 155. Among these pregnancies, 31.6% ended in normal vaginal delivery, 9.0% in instrumental delivery, and 59.4% in caesarean section. Approximately 6.5% of the babies were claimed by the participants to be unhealthy, and such replies were evenly distributed across the normal vaginal delivery, instrumental delivery, and caesarean section groups (Table 4).
 

Table 4. Participants’ experiences of nuchal cord
 
Table 5 shows the comparison of views between participants with different places of birth and education levels. Those born in Mainland China were more likely than those born in Hong Kong to believe that nuchal cord led to assisted instrumental delivery (63.2% vs 50.7%). In contrast, they were less likely to believe that nuchal cord led to fetal death during labour (84.1% vs 90.9%) or necessitated earlier delivery (64.9% vs 71.8%). There were no significant differences between the two groups in their views on intrauterine death, the chance of successful vaginal delivery, or whether caesarean section was needed.
 

Table 5. Comparison of the participants' views of nuchal cord on fetal outcomes, mode of delivery, and management of nuchal cord between participants born in Hong Kong and those born in Mainland China, and between participants with non-tertiary education and those with tertiary education
 
Regarding education level, there was no significant difference in worry about the presence of nuchal cord. Those who had received tertiary education were less likely to believe that nuchal cord led to intrauterine death (71.9% vs 81.8%). However, more of this group thought that nuchal cord decreased the chance of successful normal vaginal delivery (91.8% vs 85.0%).
 
Discussion
This questionnaire survey revealed that many of our participants were worried about nuchal cord. The percentage (71.8%) was much higher than anticipated, implying that this issue should be given greater attention in the antenatal education of pregnant women. Our local audit showed that the incidence of nuchal cord was 27% among all singleton deliveries (n=5166) in 2010 (unpublished data). Therefore, more than one third of the participants underestimated the incidence of nuchal cord.
 
It is common for nuchal cord to be an indication for caesarean section in China, accounting for 16.1% to 25.4% of indications in a teaching hospital and some regional hospitals there.1 2 As many participants were immigrants from Mainland China, their views on nuchal cord were compared with those of participants born in Hong Kong. Although this survey showed that participants born in Mainland China were more worried about nuchal cord than those born in Hong Kong, most participants in both groups believed that nuchal cord could cause intrauterine death (>77%) and would reduce the chance of successful vaginal delivery (>85%). Moreover, regardless of education level, most participants believed that nuchal cord would cause fetal death during labour (>87%) and more than 70% thought that caesarean section was needed in the presence of nuchal cord. It was therefore apparent that misconceptions about the clinical implications of nuchal cord were widespread across all groups.
 
In our survey, only 8.8% of the participants claimed to have had nuchal cord in their previous pregnancies, and none of them reported adverse fetal outcomes. However, a significant proportion of the participants' experiences and impressions of nuchal cord were derived from their relatives and friends. In this survey, the caesarean section rate among participants' relatives or friends with nuchal cord was high. This may be one of the reasons why so many participants thought that caesarean section must be performed for nuchal cord.
 
Women were worried about nuchal cord because of the belief that it could lead to adverse fetal outcomes. Although some studies showed that nuchal cord was associated with an increased prevalence of variable fetal heart rate decelerations during labour, umbilical artery acidaemia, lower 1-minute Apgar scores, and meconium-stained liquor,3 4 these findings may not translate into clinically significant effects on fetal wellbeing. Furthermore, most available studies showed that nuchal cord was not associated with lower 5-minute Apgar scores, nor with increases in caesarean sections, neonatal intensive care unit admissions, or perinatal mortality.5 6 7 8 9 Such reassuring evidence on the benign nature of nuchal cord and the absence of a true adverse clinical impact on fetal outcomes should be publicised widely to the general population to reduce misconceptions and anxiety.
 
Although 94.1% of participants thought that it was necessary to have an ultrasound scan to detect nuchal cord at term, this is not usually necessary. As almost all women in Hong Kong now have continuous fetal heart rate monitoring during labour, any variable fetal heart rate decelerations caused by a nuchal cord will be detected on the cardiotocogram, and appropriate actions such as fetal blood sampling or assisted delivery can be taken when needed. Avoiding routine ultrasound scans for nuchal cord should reduce needless maternal anxiety and unnecessary caesarean sections on maternal request, given that 68.8% of participants thought that it was necessary to deliver the fetus early and 72.8% thought that caesarean section must be performed for nuchal cord.
 
Conclusion
Many pregnant women are worried about nuchal cord due to misconceptions on its effect on fetal outcomes and mode of delivery. Proper education is necessary to reduce maternal anxiety. The correct concept that nuchal cord would not normally lead to adverse fetal outcomes and that its presence should not affect the mode of delivery should be publicised widely in Hong Kong.
 
Appendices

Additional material related to this article can be found on the HKMJ website; please search for the article.
 
References
1. Gao Y, Xue Q, Chen G, Stone P, Zhao M, Chen Q. An analysis of the indications for cesarean section in a teaching hospital in China. Eur J Obstet Gynecol Reprod Biol 2013;170:414-8. Crossref
2. Qin C, Zhou M, Callaghan WM, et al. Clinical indications and determinants of the rise of cesarean section in three hospitals in rural China. Matern Child Health J 2012;16:1484-90. Crossref
3. Hankins GD, Snyder RR, Hauth JC, Gilstrap LC 3rd, Hammond T. Nuchal cords and neonatal outcome. Obstet Gynecol 1987;70:687-91.
4. Singh G, Sidhu K. Nuchal cord: a retrospective analysis. Medical Journal Armed Forces India 2008;64:237-40. Crossref
5. Sheiner E, Abramowicz JS, Levy A, Silberstein T, Mazor M, Hershkovitz R. Nuchal cord is not associated with adverse perinatal outcome. Arch Gynecol Obstet 2006;274:81-3. Crossref
6. Shrestha NS, Singh N. Nuchal cord and perinatal outcome. Kathmandu Univ Med J (KUMJ) 2007;5:360-3.
7. Schäffer L, Burkhardt T, Zimmermann R, Kurmanavicius J. Nuchal cords in term and postterm deliveries—do we need to know? Obstet Gynecol 2005;106:23-8. Crossref
8. González-Quintero VH, Tolaymat L, Muller AC, Izquierdo L, O’Sullivan MJ, Martin D. Outcomes of pregnancies with sonographically detected nuchal cords remote from delivery. J Ultrasound Med 2004;23:43-7.
9. Peregrine E, O’Brien P, Jauniaux E. Ultrasound detection of nuchal cord prior to labor induction and the risk of Cesarean section. Ultrasound Obstet Gynecol 2005;25:160-4. Crossref

Implementation of secondary stroke prevention protocol for ischaemic stroke patients in primary care

Hong Kong Med J 2015 Apr;21(2):136–42 | Epub 16 Jan 2015
DOI: 10.12809/hkmj144236
© Hong Kong Academy of Medicine. CC BY-NC-ND 4.0
 
ORIGINAL ARTICLE
Implementation of secondary stroke prevention protocol for ischaemic stroke patients in primary care
YK Choi, FHKCFP, FHKAM (Family Medicine)1; JH Han, MD, PhD1; Richard Li, FHKCP, FHKAM (Medicine)2; Kenny Kung, FHKCFP, FHKAM (Family Medicine)3; Augustine Lam, FHKCFP, FHKAM (Family Medicine)4
1 Lek Yuen General Out-patient Clinic, Department of Family Medicine, New Territories East Cluster, Hong Kong
2 Department of Medicine, Pamela Youde Nethersole Eastern Hospital, Hong Kong
3 Department of Family Medicine and Primary Care, The University of Hong Kong, Hong Kong
4 Department of Family Medicine, New Territories East Cluster, Hong Kong
Corresponding author: Dr YK Choi (yuekwan@hotmail.com)
 Full paper in PDF
Abstract
Objective: To investigate the effectiveness of a secondary stroke prevention protocol in the general out-patient clinic.
 
Design: Cohort study with pre- and post-intervention comparisons.
 
Setting: Two general out-patient clinics in Hong Kong.
 
Patients: Ischaemic stroke patients who had long-term follow-up in two clinics were recruited. The patients of one clinic received the intervention (intervention group) and the patients of the second clinic did not receive the intervention (control group). The recruitment period lasted for 6 months from 1 September 2008 to 28 February 2009. The pre-intervention phase data collection started within this 6-month period. The protocol implementation started at the intervention clinic on 1 April 2009. The post-intervention phase data collection started 9 months after the protocol implementation, and ran for 6 months from 1 January 2010 to 30 June 2010.
 
Main outcome measures: Clinical data before and after the intervention, including blood pressure, glycated haemoglobin level, low-density lipoprotein level and prescription pattern, were compared between the two groups to see whether there was enhancement of secondary stroke management.
 
Results: A total of 328 patients were recruited into the intervention group and 249 into the control group; data of 256 and 210 patients from these groups were analysed, respectively. After intervention, there were significant reductions in mean (± standard deviation) systolic blood pressure (135.2 ± 17.5 mm Hg to 127.7 ± 12.2 mm Hg), glycated haemoglobin level (7.2 ± 1.0% to 6.5 ± 0.8%), and low-density lipoprotein level (3.4 ± 0.8 mmol/L to 2.8 ± 1.3 mmol/L) in the intervention group (all P<0.01). There were no significant reductions in mean systolic blood pressure, glycated haemoglobin level, or low-density lipoprotein level in the control group. There was a significant increase in statin use (P<0.01) in both clinics.
 
Conclusion: Through implementation of a clinic protocol, the standard of care of secondary stroke prevention for ischaemic stroke patients could be improved in a general out-patient clinic.
 
 
New knowledge added by this study
  •  A standard secondary stroke prevention protocol can significantly improve the control of cardiovascular risk factors in ischaemic stroke patients.
  •  Implementation of such a programme is effective and feasible in local primary care.
Implications for clinical practice or policy
  •  This study supports more widespread use of a secondary stroke prevention programme in the setting of a general out-patient clinic.
 
 
Introduction
Stroke is the second commonest cause of death worldwide1 and the fourth leading cause of death in Hong Kong.2 Stroke is also the commonest cause of permanent disability in adults. Patients with stroke are at high risk of recurrent stroke and other major vascular events. As populations in most developed countries continue to age, stroke will remain a major burden to patients' families and carers, the health care system, and the community. According to local data from Hong Kong, cerebrovascular disease was the principal diagnosis for about 26 500 in-patient discharges and deaths in all hospitals and accounted for 7.5% of all deaths in 2012.3 The mortality rate was significantly higher in patients with stroke recurrence than in those without.4 5 Prevention of recurrent stroke therefore offers great potential for reducing the burden of this disease.
 
Over 80% of all strokes are ischaemic stroke. There are effective strategies for secondary prevention of ischaemic stroke, which are summarised as follows6:
(1) Modification of lifestyle risk factors (smoking, alcohol consumption, obesity, physical inactivity).
(2) Modification of vascular risk factors (hypertension, hypercholesterolaemia, diabetes).
(3) Antiplatelet therapy for non-cardioembolic ischaemic stroke.
(4) Anticoagulation for cardioembolic stroke.
(5) Intervention for symptomatic carotid stenosis.
 
As stroke patients need lifelong monitoring and control of risk factors, family physicians play the most important role in providing secondary stroke prevention care. However, despite the availability of evidence-based guidelines, studies show that adherence to these preventive strategies by physicians is poor.7 8 9 10 11 Local Hong Kong data about secondary stroke prevention in primary care are largely lacking. This study aimed to review the clinical effectiveness of a secondary stroke prevention programme in a general out-patient clinic (GOPC).
 
Methods
This was a cohort study comparing pre- and post-intervention outcomes between patients who did and did not receive the intervention, to ascertain the effect of a secondary stroke prevention programme on clinical outcomes.
 
Clinic setting
The Lek Yuen GOPC was selected as the intervention site where the secondary stroke prevention programme was implemented. Another clinic, the Ma On Shan GOPC, was selected as the control site, where usual care was provided. Both clinics are large public primary care clinics under the management of the Department of Family Medicine of the New Territories East Cluster of the Hospital Authority. Both clinics are accredited Family Medicine Training Centres with similar service throughput annually, covering a population of around 600 000 and providing approximately 30 000 attendances monthly.
 
Most of the stroke patients in the clinics are referred from the public hospitals. The patients usually have a history of minor stroke with good functional recovery and are clinically stable.
 
Clinic protocol development and implementation
A protocol of secondary stroke prevention (Box) was developed with reference to evidence-based guidelines (mainly according to the American Heart Association and American Stroke Association stroke guidelines).12 13 14
 
Study design
The target population of the study was all ischaemic stroke patients with long-term follow-up in the two clinics. The recruitment period lasted for 6 months from 1 September 2008 to 28 February 2009. As the usual follow-up interval for long-term patients is about 3 to 4 months, the 6-month recruitment period captured all stroke patients who had regular follow-up. The pre-intervention phase data collection started within this 6-month period. The protocol implementation started at the intervention clinic on 1 April 2009. The post-intervention phase data collection started 9 months after the protocol implementation and ran for 6 months, from 1 January 2010 to 30 June 2010.
 
Sampling
The clinical data were collected by reviewing the medical records of all patients assigned the International Classification of Primary Care coding of K90 (stroke/cerebrovascular accident) or K91 (cerebrovascular disease). Only patients diagnosed with ischaemic stroke who had at least two consecutive follow-up visits within the recruitment period were included. Patients with a history of haemorrhagic stroke were excluded. To exclude patients with sporadic follow-up, only those with two consecutive follow-up visits in the post-intervention phase were regarded as eligible for data collection; those without were classified as dropouts.
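The record selection described above can be summarised as a simple filter. The sketch below uses a hypothetical pandas table of visit records (the column names are ours, not the clinics' actual data structure) and approximates "at least two consecutive follow-up visits" as at least two visits within the relevant window.

import pandas as pd

def select_cohort(visits: pd.DataFrame, start: str, end: str) -> pd.Index:
    # `visits` is a hypothetical visit-level table with datetime column visit_date
    # and columns patient_id, icpc_code, ischaemic_stroke, haemorrhagic_history.
    window = visits["visit_date"].between(pd.Timestamp(start), pd.Timestamp(end))
    eligible = visits[
        visits["icpc_code"].isin(["K90", "K91"])   # coded stroke/cerebrovascular disease
        & visits["ischaemic_stroke"]               # confirmed ischaemic stroke
        & ~visits["haemorrhagic_history"]          # no history of haemorrhagic stroke
        & window
    ]
    visit_counts = eligible.groupby("patient_id").size()
    return visit_counts[visit_counts >= 2].index   # at least two visits in the window

# Example: included = select_cohort(visits, "2008-09-01", "2009-02-28")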
 
Protocol implementation
One month before initiation of the protocol, two 1-hour training sessions were arranged for medical officers and nurses in the intervention clinic. During the training sessions, the treatment goals for secondary ischaemic stroke prevention and the relevant clinical evidence were presented. The workflow and applicability of the protocol were also discussed. Medical officers were required to have good documentation of all the lifestyle and cardiovascular risk factors of the ischaemic stroke patients and provide care according to the protocol. Nurses were trained to be familiar with the treatment goals and provide patient education and lifestyle modification interventions in line with the doctors’ referrals. Allied health services such as a dietitian, smoking cessation clinic, diabetes complication screening programme, and patient empowerment programmes for diabetic and hypertensive patients were available in both the intervention and control clinics. Doctors in the intervention clinic were encouraged to refer appropriate patients to these services. There was no additional consultation time allocated to these patients. In order to monitor progress, the electronic consultation notes were reviewed monthly for each patient to assess compliance with the protocol. If suboptimal care was noted, an electronic reminder with appropriate management advice was issued to the patient’s electronic medical record. The consulting doctor would then be able to provide the appropriate management at the next follow-up. Throughout the protocol implementation period, interim clinic meetings were held quarterly to present the data for protocol compliance with the medical and nursing staff of the intervention clinic.
 
In the control clinic, no specific protocol was applied. Medication prescription and adjustment were based solely on the physicians' discretion. The drug formulary was the same in both clinics. Statins were introduced to both clinic formularies in July 2009. There were no training sessions for doctors and nursing staff in the control clinic, and no electronic reminders or interim meetings for progress monitoring.
 
Data collection
Baseline characteristics on sex, age, chronic illness status, chronic drug use, laboratory results, and blood pressure (BP) values were extracted from the Clinical Data Analysis and Reporting System. The latest laboratory results and BP values within the data collection period were taken as the study data. Individual case records were also reviewed for the following lifestyle parameters: smoking status, alcohol consumption, body mass index (BMI), and exercise and diet history.
 
Statistical analysis
All statistical analysis was performed using the Statistical Package for the Social Sciences (Windows version 20.0; SPSS Inc, Chicago [IL], US). Continuous variables were expressed as mean and standard deviation. Baseline comparisons were made with the Student’s t test or the Chi squared test as appropriate. The mean BP, glycated haemoglobin (HbA1c) level, and low-density lipoprotein (LDL) level before and after intervention were compared by paired-samples t test in both the intervention and control clinics.
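As a minimal sketch of the paired pre/post comparison described above, the code below runs a paired-samples t test with SciPy; the systolic blood pressure values are invented for illustration and are not the study data.

import numpy as np
from scipy.stats import ttest_rel

# Hypothetical paired systolic BP readings (mm Hg) for the same patients
# before and after the intervention -- illustrative values only.
pre_sbp  = np.array([138, 142, 150, 133, 146, 140, 137, 152])
post_sbp = np.array([130, 128, 141, 129, 134, 131, 126, 140])

t_stat, p_value = ttest_rel(pre_sbp, post_sbp)
print(f"mean change = {np.mean(post_sbp - pre_sbp):.1f} mm Hg, P = {p_value:.4f}")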
 
Results
In the intervention clinic, 328 patients were recruited to the intervention group and 72 dropped out. In the control clinic, 249 were recruited to the control group and 39 dropped out. The reasons for dropping out are shown in Table 1. More patients in the intervention group than in the control group dropped out because of restroke (9 vs 2) and death (22 vs 11), but these differences were not statistically significant owing to the small numbers. In both the intervention and control groups, most of the patients who died had had no medication changes during the intervention period (Table 1).
 

Table 1. Reasons for dropping out of the study
 
A total of 256 patients in the intervention group and 210 in the control group were included in the data analysis. At baseline, there were no significant differences in the demographic and cardiovascular risk factor profiles between the two groups, except that patients in the intervention group had a higher mean LDL level and a lower mean diastolic BP (Table 2).
 

Table 2. Baseline demographic data
 
After the intervention period, significant improvements in systolic BP, HbA1c and LDL levels were observed in the intervention group (Table 3). There were significant improvements in all lifestyle modification parameters (alcohol and smoking status, obtaining exercise and diet history, and BMI measurement) in the intervention group (P<0.01), and the control group had improvements in smoking status (P<0.01) and BMI measurement (P<0.05) [Table 3].
 

Table 3. Changes in clinical parameters and lifestyle modifications before and after the intervention
 
There was no significant increase in the number of antihypertensive drugs prescribed in either group (Table 4). Approximately 96% of patients in both clinics were taking an antiplatelet after the intervention period (Table 4), and the antiplatelet prescribed was invariably aspirin. The proportion of patients prescribed statins increased significantly in both groups after the introduction of simvastatin to the GOPC formulary in 2009. However, the overall proportion of statin use remained below 50%. Statins were less frequently prescribed to patients older than 80 years (Table 5).
 

Table 4. Comparison of medication use between the intervention and control groups
 

Table 5. Statin usage stratified according to age-group
 
Statins were stopped for 1.6% of patients in the intervention group and 4.3% in the control group (Table 4). Statins were discontinued because of dyspepsia for all patients in the intervention group. The reasons for stopping statins for the control group were dyspepsia, myalgia, mild liver function derangement, hypotension, hypoglycaemia, and drug-induced hepatitis. Only two patients in the control group required emergency admission for hypoglycaemia during the intervention period. There was no restroke in the intervention group and four restrokes in the control group during the intervention period.
 
Discussion
 
This study showed that the implementation of a secondary stroke prevention programme in GOPCs could improve the control of cardiovascular risk factors, including BP, HbA1c and LDL levels, among ischaemic stroke patients. We observed an improvement in BP control in the intervention group, although there was no significant increase in the number of antihypertensives used. Similarly, after simvastatin was introduced into the GOPC drug formulary in 2009, the use of statins increased in both the control and intervention clinics, yet a reduction in LDL level was observed only in the intervention group. These results imply that the improvement in outcomes in the intervention group was due to more than just the effects of medications; lifestyle modifications may have provided additional benefits.
 
Although the BP and HbA1c levels in the intervention group were comparable with recent recommendations (BP <140/90 mm Hg for patients without diabetes; BP <130/80 mm Hg and HbA1c level of <7% for patients with diabetes), the mean LDL level remained well above the recommended target of 1.9 mmol/L. Only about half of the patients were taking statins. It has been suggested that doctors may not prescribe or maximise statin therapy because treatment may be considered futile, especially among older people whose life expectancy is limited.15 This trend was observed in both the control and intervention sites in this study (Table 4). The percentage of patients taking statins was relatively low in this study, possibly because some doctors had concerns about potential side-effects. However, no severe adverse effects of statins were noted in the intervention group despite the more aggressive treatment approach.
 
The implementation of the secondary stroke prevention protocol raised doctors' awareness of lifestyle modification for patients with ischaemic stroke. This was reflected by the significant increase in lifestyle modification measures in the intervention group. We encouraged doctors to provide appropriate advice on lifestyle modification when lifestyle risk factors were identified during the consultation. However, owing to heavy patient loads in the GOPC, no additional time could be allocated for medical consultations. During clinic meetings, our staff expressed difficulty in providing quality lifestyle education because of limited consultation time. The lack of additional resources for lifestyle education was a main shortcoming of this programme.
 
Certain subgroups of ischaemic stroke patients are not well represented by this study, for example, those with atrial fibrillation. Atrial fibrillation is one of the major risk factors for recurrent stroke.16 However, as warfarin was not available in the drug formulary of the GOPCs during the study period, most patients with atrial fibrillation were not referred to these clinics. Only a few patients with atrial fibrillation were identified in our study and all of them had a contra-indication for warfarin. At the time of writing, warfarin has become available in the GOPCs and several novel anticoagulants have been introduced as self-finance items. The use of anticoagulants in GOPCs is an important aspect of secondary stroke prevention that warrants further investigation.
 
Implementation of evidence-based guidelines in routine clinical practice is complicated.17 18 Physicians usually have concerns about the applicability of new trial data to individual patients, and it takes time for them to change their practice. Apart from considering the best available evidence, we also need to take into account the practical barriers in the clinical practice setting. The heavy workload in the clinic, shortage of consultation time, and limited scope of the drug formulary may make it difficult to introduce an evidence-based protocol to local GOPCs.
 
From the experience of this study, a dedicated training session for clinic staff is necessary before the implementation of any new protocol. Additional review sessions are needed to audit clinicians’ compliance with the protocol. Review of the GOPC drug formulary, for example, to include greater choices of statins and antiplatelets, may be helpful to improve the care of stroke patients. Lifestyle modification is an important aspect for secondary stroke prevention, but time constraints in busy GOPCs are always an issue. A designated nurse clinic for patient education and annual risk factor monitoring should be introduced. For better utilisation of resources, it is beneficial to recruit community partners from allied health services to provide a structured secondary stroke prevention programme for patient empowerment and engagement.
 
In our study, approximately 5% to 6% of patients were lost to other GOPCs and medical clinics (Table 1), which may have introduced some bias. In addition, differences between the two clinics, such as staff composition, doctors' qualifications, and the socio-economic profiles of the patients, are possible confounders that might introduce bias. The intervention group had a higher rate of dropout due to death and restroke, although this was not statistically significant. As most of these patients had no change in medications during the intervention period (Table 1), the higher death and restroke rates were unlikely to be related to any adverse effects of the protocol implementation. However, we do not have data on the rates of stroke recurrence, adverse events, and mortality over a longer period, which are the most important outcomes for effective secondary stroke prevention. Furthermore, we may need to take into account the Hawthorne effect when assessing the effectiveness of the protocol implementation, in that physicians may perform better simply because they are aware that they are in a study rather than because of the nature of the protocol.19 20 This is an unavoidable bias in clinical research.
 
Conclusion
This study demonstrates that through implementation of a standardised treatment protocol, the standard of care of secondary stroke prevention for ischaemic stroke patients could be improved in local GOPCs. However, due to the relatively small sample size in this study, this preliminary result should be interpreted with caution and further studies involving more primary care clinics are required to test its clinical value.
 
References
1. The top ten causes of death. (Fact sheet No 310/July 2013). Geneva: World Health Organization; 2013.
2. Centre for Health Protection. Vital statistics: death rates by leading causes of death, 2001-2012. Hong Kong: HKSAR Government.
3. Centre for Health Protection. Health topics: non communicable diseases and risk factors: cerebrovascular disease. Available from: http://www.chp.gov.hk/en/content/9/25/58.html. Accessed 31 Mar 2014.
4. Cheung CM, Tsoi TH, Hon SF, et al. Outcomes after first-ever stroke. Hong Kong Med J 2007;13:95-9.
5. Tsoi TH, Huang CY, Hon SF, et al. Trends in stroke types and mortality in Chinese. Stroke 2004;35:e256.
6. Wong HC, Mok CT. Update on secondary stroke prevention. Hong Kong Pract 2007;29:271-6.
7. Wang Y, Wu D, Wang Y, Ma R, Wang C, Zhao W. A survey on adherence to secondary ischemic stroke prevention. Neurol Res 2006;28:16-20. Crossref
8. Whitford DL, Hickey A, Horgan F, O’Sullivan B, McGee H, O’Neill D. Is primary care a neglected piece of the jigsaw in ensuring optimal stroke care? Results of a national study. BMC Fam Pract 2009;10:27. Crossref
9. Ovbiagele B, Drogan O, Koroshetz WJ, Fayad P, Saver JL. Outpatient practice patterns after stroke hospitalization among neurologists. Stroke 2008;39:1850-4. Crossref
10. Rudd AG, Lowe D, Hoffman A, Irwin P, Pearson M. Secondary prevention for stroke in the United Kingdom: results from the National Sentinel Audit of Stroke. Age Ageing 2004;33:280-6. Crossref
11. Xu G, Liu X, Wu W, Zhang R, Yin Q. Recurrence after ischemic stroke in Chinese patients: impact of uncontrolled modifiable risk factors. Cerebrovasc Dis 2007;23(2-3):117-20. Crossref
12. Sacco RL, Adams R, Albers G, et al. Guidelines for prevention of stroke in patients with ischemic stroke or transient ischemic attack: a statement for healthcare professionals from the American Heart Association/American Stroke Association Council on Stroke: co-sponsored by the Council on Cardiovascular Radiology and Intervention: the American Academy of Neurology affirms the value of this guideline. Stroke 2006;37:577-617. Crossref
13. Adams RJ, Albers G, Alberts MJ, et al. Update to the AHA/ASA recommendations for the prevention of stroke in patients with stroke and transient ischemic attack. Stroke 2008;39:1647-52. Crossref
14. Furie KL, Kasner SE, Adams RJ, et al. Guidelines for the prevention of stroke in patients with stroke or transient ischemic attack: a guideline for healthcare professionals from the American Heart Association/American Stroke Association. Stroke 2011;42:227-76. Crossref
15. Walker DB, Jacobson TA. Initiating statins in the elderly: the evolving challenge. Curr Opin Endocrinol Diabetes Obes 2008;15:182-7. Crossref
16. Li R, Cheng S, Mok M, et al. Atrial fibrillation is an independent risk factor of poor stroke outcome and mortality in Chinese ischaemic stroke patients. Cerebrovasc Dis 2013;36(Supp 1):74.
17. Cranney M, Warren E, Barton S, Gardner K, Walley T. Why do GPs not implement evidence-based guidelines? A descriptive study. Fam Pract 2001;18:359-63. Crossref
18. Foy R, Eccles M, Grimshaw J. Why does primary care need more implementation research? Fam Pract 2001;18:353-5. Crossref
19. Wickström G, Bendix T. The “Hawthorne effect”—what did the original Hawthorne studies actually show? Scand J Work Environ Health 2000;26:363-7. Crossref
20. McCarney R, Warner J, Iliffe S, van Haselen R, Griffin M, Fisher P. The Hawthorne effect: a randomised, controlled trial. BMC Med Res Methodol 2007;7:30. Crossref

Anterior cruciate ligament tear in Hong Kong Chinese patients

Hong Kong Med J 2015 Apr;21(2):131–5 | Epub 19 Dec 2014
DOI: 10.12809/hkmj134124
© Hong Kong Academy of Medicine. CC BY-NC-ND 4.0
 
ORIGINAL ARTICLE
Anterior cruciate ligament tear in Hong Kong Chinese patients
August WM Fok, FHKCOS, FHKAM (Orthopaedic Surgery); WP Yau, FHKCOS, FHKAM (Orthopaedic Surgery)
Division of Sports and Arthroscopic Surgery, Department of Orthopaedics and Traumatology, Queen Mary Hospital, The University of Hong Kong, Hong Kong
 
Corresponding author: Dr August WM Fok (augustfok@hotmail.com)
 Full paper in PDF
Abstract
Objective: To investigate the associations of patient sex, age, and cause of injury with the frequency of meniscus and articular cartilage lesions seen at the time of anterior cruciate ligament reconstruction.
 
Design: Case series.
 
Setting: University affiliated hospital, Hong Kong.
 
Patients: Medical notes and operating records of 672 Chinese patients who had received anterior cruciate ligament reconstruction between January 1997 and December 2010 were reviewed. Data concerning all knee cartilage and meniscus injuries documented at the time of surgery were analysed.
 
Results: Of the 593 patients included in the analysis, meniscus injuries were identified in 315 (53.1%). Patients older than 30 years were more likely to suffer a meniscal injury than those younger than 30 years (60% vs 51%, P=0.043). A longer surgical delay was observed in patients with meniscal lesions than in those without (median, 12.3 months vs 9.1 months, P=0.021). Overall, 139 cartilage lesions were identified in 109 (18.4%) patients. Patients with cartilage lesions were significantly older than those without (mean, 27.6 years vs 25.1 years, P=0.034). Male patients were more likely to have chondral injuries than female patients (20.1% vs 10.9%, P=0.028). The risk of cartilage lesions was increased by nearly 3 times in the presence of a meniscal tear (P<0.0001; odds ratio=2.7; 95% confidence interval, 1.7-4.2).
 
Conclusions: Increased age and surgical delay increased the risk of meniscal tears in patients with anterior cruciate ligament tear. Increased age, male sex, and presence of meniscal tear were associated with an increased frequency of articular lesions after an anterior cruciate ligament tear.
 
New knowledge added by this study
  •  This study served to identify the risk factors for meniscal and cartilage injuries in patients with anterior cruciate ligament (ACL) tear.
Implications for clinical practice or policy
  •  Patients with ACL deficiency should be informed about the increased risk of meniscus injuries associated with surgical delay.
 
 
Introduction
Anterior cruciate ligament (ACL) tear is one of the commonest sports injuries seen in clinical practice, and such injury is often associated with meniscal and chondral lesions. It is widely believed that early surgery can prevent such lesions in ACL-deficient patients, and probably helps to avoid the most dreaded complication of early osteoarthritis of the knee.1 Despite multiple studies evaluating the relationship between intra-articular injuries and ACL tear, such associations among Asians, especially Chinese, have not been extensively studied. Data show that females are more susceptible to ACL injury than their male counterparts,2 3 4 but a lower risk of other intra-articular injuries in females has been observed in some studies.5 Furthermore, one study showed that the incidence of meniscus tear was associated with the mechanism of ACL injury6; however, other studies were not able to show a significant relationship between the type of sport causing injury and the incidence of meniscal and chondral lesions.7 The objective of this study was two-fold. Our first aim was to report the meniscal and chondral lesions that accompany ACL tears in a large Chinese population. Our second aim was to test for relationships between the aforementioned lesions and patient sex, age, surgical delay, and cause of ACL injury.
 
Methods
A database that recorded all patients who had received ACL reconstruction in our hospital since 1997 was reviewed. Overall, 672 Chinese patients who had received the surgery between January 1997 and December 2010 were identified. Their medical notes and operating records were reviewed. Data concerning the patient sex, age, causes of injury, elapsed time from injury to surgery, and all knee cartilage and meniscus injuries documented at the time of surgery were analysed.
 
Exclusion criteria were: radiological evidence of osteoarthritis (Kellgren-Lawrence grade 3 or 4); a concomitant grade III medial collateral ligament, lateral collateral ligament, or posterior cruciate ligament deficiency (evaluated and recorded by means of examination with the patient under anaesthesia at the time of surgery); any revision procedure involving the ACL; or knee dislocation.
 
The time of the initial ACL injury was determined from the patient's history. This included a definite incident of a single twisting injury, with the knee giving way with a 'pop' sound, gross knee swelling, and inability to resume the sport or walking. The nature of this injury was further verified with the hospital medical notes, or records of the primary attending physician, when available. Patients were considered potential candidates for ACL reconstruction if any two of the following criteria were satisfied: (1) instability during pivoting movements; (2) signs of ACL deficiency, including a positive Lachman test, anterior drawer test, or pivot shift test; and (3) evidence of an ACL tear on magnetic resonance imaging (MRI).
 
The presence of cartilage injuries and meniscal lesions was confirmed in the operating room by means of knee arthroscopy. Several independent variables were studied: patient sex, age at the time of surgery, surgical delay (defined as the duration in months between the index ACL injury and reconstruction), and causes of ACL injury.
 
Statistical analyses
Data analysis was performed using the Statistical Package for the Social Sciences (Windows version 15.0; SPSS Inc, Chicago [IL], US). Student's t test was used to compare mean age between groups. The Mann-Whitney U test was used to compare the length of surgical delay. Fisher's exact test was used to evaluate categorical variables. Binary logistic regression was used to estimate the independent effects of individual factors. A P value of <0.05 was considered statistically significant.
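As a sketch of how the binary logistic regression described above might be run outside SPSS, the example below uses statsmodels on a simulated data frame; the variable names and data are ours, for illustration only, and do not reproduce the study's analysis.

import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

# Simulated patient-level data -- variable names and values are illustrative only.
rng = np.random.default_rng(0)
n = 200
df = pd.DataFrame({
    "female":        rng.integers(0, 2, n),
    "age":           rng.integers(15, 50, n),
    "meniscal_tear": rng.integers(0, 2, n),
})
# Generate a binary cartilage-lesion outcome with some dependence on the predictors.
linear = -3 + 0.03 * df["age"] + 1.0 * df["meniscal_tear"] - 0.7 * df["female"]
df["cartilage_lesion"] = rng.binomial(1, 1 / (1 + np.exp(-linear)))

model = smf.logit("cartilage_lesion ~ female + age + meniscal_tear", data=df).fit(disp=0)
print(np.exp(model.params))      # odds ratios for each factor
print(np.exp(model.conf_int()))  # 95% confidence intervals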
 
Results
Of 672 patients who received ACL reconstruction, 79 were excluded (7 with concomitant high-grade ligament deficiency, and 72 with revision ACL surgery) and 593 patients were considered for analysis. These included 483 (81%) males and 110 (19%) females. There were 297 (50%) right and 296 (50%) left knees. Their mean age at the time of surgery was 26 years (range, 13-51 years), and their median length of surgical delay was 10.5 months (range, 0.4-241.8 months).
 
Most patients (89.5%) sustained their injuries during sports activities, with soccer (n=226, 42.6%) and basketball (n=163, 30.7%) being the two most common sports (Tables 1 and 2). The age distribution of patients with meniscal and cartilage injuries is shown in Table 3. The incidence of intra-articular lesions in the different sports activities leading to injury is shown in Table 4.
 

Table 1. Causes of injury
 

Table 2. Type of sports activity causing anterior cruciate ligament tear
 

Table 3. Age distribution of patients who had meniscal and cartilage injuries
 

Table 4. The incidence of intra-articular lesions in different sports activities leading to injury
 
Meniscus injuries were identified in 315 (53.1%) patients. There were 146 (24.6%) isolated lateral tears, 123 (20.7%) isolated medial tears, and 46 (7.8%) bilateral tears.
 
Patients older than 30 years were more likely to suffer a meniscal injury than those younger than 30 years (60% vs 51%; P=0.043 by Fisher's exact test). A longer surgical delay was observed in patients with meniscal lesions than in those without (median, 12.3 months vs 9.1 months; P=0.021 by Mann-Whitney U test). Also, patients with a medial meniscal tear had a longer surgical delay than those with a lateral meniscal tear (median, 16.7 months vs 9.0 months; P<0.001 by Mann-Whitney U test). However, no significant associations were observed between sex, cause of injury, or type of sport and the presence of meniscal lesions.
 
Overall, 139 cartilage lesions were identified in 109 (18.4%) patients. There were 16 (11.5%) patellar lesions, 92 (66.2%) femoral condyle lesions, and 31 (22.3%) tibial plateau lesions. Patients with cartilage lesions were significantly older than those without (mean, 27.6 years vs 25.1 years; P=0.034 by Student's t test). Female patients were less likely to suffer chondral injuries than male patients (10.9% vs 20.1%; P=0.028 by Fisher's exact test). Female sex was independently associated with the incidence of cartilage injury in binary logistic regression (P=0.029; odds ratio [OR]=0.475; 95% confidence interval [CI], 0.243-0.929) [Table 5]. The presence of a meniscal tear was associated with an almost 3-fold increased risk of cartilage lesions (P<0.001 by Fisher's exact test; OR=2.7; 95% CI, 1.7-4.2).
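The crude odds ratio and confidence interval quoted above can be derived from a 2×2 table using Woolf's (log) method; a sketch follows. The counts used are hypothetical, chosen only to be consistent with the marginal totals reported in this section, since the actual cross-tabulation is not given.

import math

def odds_ratio_ci(a, b, c, d, z=1.96):
    # Crude odds ratio and 95% CI (Woolf's log method) from a 2x2 table:
    # a = exposed with outcome, b = exposed without outcome,
    # c = unexposed with outcome, d = unexposed without outcome.
    or_ = (a * d) / (b * c)
    se = math.sqrt(1 / a + 1 / b + 1 / c + 1 / d)
    lower = math.exp(math.log(or_) - z * se)
    upper = math.exp(math.log(or_) + z * se)
    return or_, lower, upper

# Hypothetical split of the 315 meniscal-tear and 278 no-tear patients by cartilage lesion.
print(odds_ratio_ci(a=79, b=236, c=30, d=248))   # approximately (2.8, 1.8, 4.4)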
 

Table 5. Binary logistic regression for the factors associated with risk of cartilage injury
 
No significant association, however, was found between surgical delay, causes of injury, type of sports, and cartilage lesions.
 
Discussion
Our study showed that a longer surgical delay was present in patients with meniscal lesions, a finding that concurs with other published data. Although Slauterbeck et al,5 Piasecki et al,8 and O'Connor et al9 reported that female patients had a lower rate of meniscus injury than male patients, such an association was not observed in our study, which recruited a lower proportion of female patients; a similar observation was made in the study by Murrell et al.10
 
It is postulated that in acute ACL injury, excessive anterolateral rotation of the tibia on the femur traps the lateral meniscus between the posterolateral aspect of the tibial plateau and the central portion of the lateral femoral condyle, and the lateral meniscus is susceptible to tearing when the tibia reduces. The scenario is different in patients with chronic ACL deficiency: recurrent anterior translation of the tibia on the femur results in increased stress on the medial meniscus, which is more stably fixed by the coronary ligaments, leading to a subsequent medial meniscal tear.11 Our study found that ACL-deficient patients with a medial meniscus tear had a surgical delay that was on average 9 months longer than that of patients with a lateral meniscus tear. Mitsou and Vallianatos12 reported that the incidence of medial meniscal tears increased from 17% in patients who underwent ACL reconstruction within 3 weeks of injury to 48% in those who had surgery more than 6 months after injury; such a trend was not observed for lateral meniscus tears. O'Connor et al9 found that patients who underwent ACL reconstruction more than 2 years after injury had only a 1.5-fold increased risk of lateral meniscus injuries, but a 2.2-fold increased risk of medial meniscus injuries.
 
In our study, males were found to have a higher incidence of cartilage defects than females, but there was no significant sex difference in meniscal lesions. Slauterbeck et al5 found that male sex was associated with an increased risk of meniscal and chondral lesions in ACL-deficient patients. In the study by Piasecki et al,8 female high-school athletes had fewer meniscal tears while playing soccer and fewer intra-articular injuries to the medial femoral condyle while playing basketball, but such associations were not observed among amateur athletes. So far, there has been little research on sex differences in articular cartilage injuries accompanying ACL tears. Granan et al13 reported that cartilage lesions were nearly twice as frequent when there was a meniscal tear, and a similar observation was made in our study.
 
The association of age with meniscus tears and cartilage injuries in knees with an intact ACL is less extensively studied. In a cross-sectional MRI study of nearly 1000 individuals aged 50 to 90 years from the general population, 31% of knees were found to have a meniscal tear and the incidence increased with age: 21% of subjects aged 50 to 59 years had a meniscal tear, compared with 46% of those aged 70 to 90 years.14 In several large-scale retrospective studies reviewing articular cartilage defects found during knee arthroscopy, the incidence of isolated chondral lesions without associated intra- and extra-articular knee lesions ranged from 30% to 36.6%.15 16 17 18 No significant statistical associations, however, were found between age and the cartilage lesions.
 
Studies have shown that individuals who participate in vigorous physical activities are more disabled by an ACL injury than those who are relatively sedentary. Paul et al6 reported an association between the mechanism of an ACL injury (jumping versus non-jumping) and the incidence of concomitant meniscus injuries, but other authors failed to show such associations. In our study, since more than half of the patients were injured while playing soccer or basketball, an analysis was performed to evaluate whether soccer and basketball players sustained lesions different from those caused by other mechanisms or other sports activities. However, type of sports was not associated with any of the parameters we studied. A larger sample including patients with other causes of injury would be needed to determine whether there are differences among other sports activities.
 
Another limitation of this study was that patients receiving conservative treatment for their ACL injury were not recruited. This could lead to potential bias as their risks of meniscal and articular injuries could not be estimated. We are also aware that more sophisticated systems to evaluate meniscal and chondral lesions, such as the Cooper classification19 and the ICRS (International Cartilage Repair Society) classification system,20 could be used to map the lesions and provide a more precise anatomical description of them.
 
Compared with other studies, which report surgical delays ranging from 1.2 to 13 months,5 6 7 9 10 11 patients in our series had a longer surgical delay. Patients may have had prolonged waiting times for surgery or for imaging, including MRI. It was unclear whether repeated knee injuries, or the activities in which patients were involved before surgery, had any effect on the findings of our study.
 
Currently, there is intense debate concerning the optimal timing for ACL reconstruction.21 22 Different surgeons have different preferences: some favour early surgery while others prefer a period of rehabilitation before considering surgery. Frobell et al23 concluded in their randomised controlled trial that “In young, active adults with acute ACL tears, a strategy of rehabilitation plus early ACL reconstruction was not superior to a strategy of rehabilitation plus optional delayed ACL reconstruction.” According to Richmond et al,22 however, Frobell’s conclusion is flawed; they believe that prompt operative intervention reduces long-term osteoarthritis after ACL tear of the knee. Whichever approach surgeons prefer, patients with an ACL tear should be well informed about the risks and benefits of conservative management versus surgical reconstruction, so that they can make the best decision with the information at hand.
 
Conclusions
Increased age and surgical delay were associated with meniscal tear in patients with ACL tear, and a longer surgical delay was observed in patients with medial meniscal tear. Increased age, male sex, and the presence of a meniscal tear were all associated with chondral lesions after an ACL tear. Neither the cause of injury nor the type of sports activity leading to the ACL injury was associated with intra-articular lesions.
 
References
1. Lohmander LS, Englund PM, Dahl LL, Roos EM. The long-term consequence of anterior cruciate ligament and meniscus injuries: osteoarthritis. Am J Sports Med 2007;35:1756-69. Crossref
2. Arendt E, Dick R. Knee injury patterns among men and women in collegiate basketball and soccer. NCAA data and review of literature. Am J Sports Med 1995;23:694-701. Crossref
3. Bjordal JM, Arnly F, Hannestad B, Strand T. Epidemiology of anterior cruciate ligament injuries in soccer. Am J Sports Med 1997;25:341-5. Crossref
4. Messina DF, Farney WC, DeLee JC. The incidence of injury in Texas high school basketball. A prospective study among male and female athletes. Am J Sports Med 1999;27:294-9.
5. Slauterbeck JR, Kousa P, Clifton BC, et al. Geographic mapping of meniscus and cartilage lesions associated with anterior cruciate ligament injuries. J Bone Joint Surg Am 2009;91:2094-103. Crossref
6. Paul JJ, Spindler KP, Andrish JT, Parker RD, Secic M, Bergfeld JA. Jumping versus nonjumping anterior cruciate ligament injuries: a comparison of pathology. Clin J Sport Med 2003;13:1-5. Crossref
7. Tandogan RN, Taşer O, Kayaalp A, et al. Analysis of meniscal and chondral lesions accompanying anterior cruciate ligament tears: relationship with age, time from injury, and level of sport. Knee Surg Sports Traumatol Arthrosc 2004;12:262-70. Crossref
8. Piasecki DP, Spindler KP, Warren TA, Andrish JT, Parker RD. Intraarticular injuries associated with anterior cruciate ligament tear: findings at ligament reconstruction in high school and recreational athletes. An analysis of sex-based differences. Am J Sports Med 2003;31:601-5.
9. O’Connor DP, Laughlin MS, Woods GW. Factors related to additional knee injuries after anterior cruciate ligament injury. Arthroscopy 2005;21:431-8. Crossref
10. Murrell GA, Maddali S, Horovitz L, Oakley SP, Warren RF. The effects of time course after anterior cruciate ligament injury in correlation with meniscal and cartilage loss. Am J Sports Med 2001;29:9-14.
11. Duncan JB, Hunter R, Purnell M, Freeman J. Meniscal injuries associated with acute anterior cruciate ligament tears in alpine skiers. Am J Sports Med 1995;23:170-2. Crossref
12. Mitsou A, Vallianatos P. Meniscal injuries associated with rupture of the anterior cruciate ligament: a retrospective study. Injury 1988;19:429-31. Crossref
13. Granan LP, Bahr R, Lie SA, Engebretsen L. Timing of anterior cruciate ligament reconstructive surgery and risk of cartilage lesions and meniscal tears: a cohort study based on the Norwegian National Knee Ligament Registry. Am J Sports Med 2009;37:955-61. Crossref
14. Englund M, Guermazi A, Gale D, et al. Incidental meniscal findings on knee MRI in middle-aged and elderly persons. N Engl J Med 2008;359:1108-15. Crossref
15. Arøen A, Løken S, Heir S, et al. Articular cartilage lesions in 993 consecutive knee arthroscopies. Am J Sports Med 2004;32:211-5. Crossref
16. Curl WW, Krome J, Gordon ES, Rushing J, Smith BP, Poehling GG. Cartilage injuries: a review of 31,516 knee arthroscopies. Arthroscopy 1997;13:456-60. Crossref
17. Hjelle K, Solheim E, Strand T, Muri R, Brittberg M. Articular cartilage defects in 1,000 knee arthroscopies. Arthroscopy 2002;18:730-4. Crossref
18. Widuchowski W, Widuchowski J, Trzaska T. Articular cartilage defects: study of 25,124 knee arthroscopies. Knee 2007;14:177-82. Crossref
19. Cooper DE, Arnoczky SP, Warren RF. Meniscal repair. Clin Sports Med 1991;10:529-48.
20. Brittberg M, Winalski CS. Evaluation of cartilage injuries and repair. J Bone Joint Surg Am 2003;85-A Suppl 2:58-69.
21. Bernstein J. Early versus delayed reconstruction of the anterior cruciate ligament: a decision analysis approach. J Bone Joint Surg Am 2011;93:e48. Crossref
22. Richmond JC, Lubowitz JH, Poehling GG. Prompt operative intervention reduces long-term osteoarthritis after knee anterior cruciate ligament tear. Arthroscopy 2011;27:149-52. Crossref
23. Frobell RB, Roos EM, Roos HP, Ranstam J, Lohmander LS. A randomized trial of treatment for acute anterior cruciate ligament tears. N Engl J Med 2010;363:331-42. Crossref

Comparison of efficacy and tolerance of short-duration open-ended ureteral catheter drainage and tamsulosin administration to indwelling double J stents following ureteroscopic removal of stones

Hong Kong Med J 2015 Apr;21(2):124–30 | Epub 10 Mar 2015
DOI: 10.12809/hkmj144292
© Hong Kong Academy of Medicine. CC BY-NC-ND 4.0
 
ORIGINAL ARTICLE
Comparison of efficacy and tolerance of short-duration open-ended ureteral catheter drainage and tamsulosin administration to indwelling double J stents following ureteroscopic removal of stones
Vikram S Chauhan, MB, BS, MS (Surgery)1; Rajeev Bansal, MB, BS, MS (Surgery)1; Mayuri Ahuja, MB, BS, DGO2
1 School of Medical Sciences & Research, Sharda University, Greater Noida (U.P.) 201306, India
2 Kokila Dhirubhai Ambani Hospital & Medical Research Institute, Andheri West, Mumbai 400053, India
Corresponding author: Dr Vikram S Chauhan (vsing73@rediffmail.com)
 Full paper in PDF
Abstract
Objectives: To evaluate the efficacy of short-duration, open-ended ureteral catheter drainage as a replacement for an indwelling stent, and to study the effect of tamsulosin on stent-induced pain and storage symptoms following uncomplicated ureteroscopic removal of stones.
 
Design: Prospective randomised study.
 
Setting: School of Medical Sciences and Research, Sharda University, Greater Noida, India.
 
Patients: Patients who underwent ureteroscopic removal of stones for lower ureteral stones between November 2011 and January 2014 were randomly assigned into three groups. Patients in group 1 (n=33) were stented with 5-French double J stent for 2 weeks. Patients in group 2 (n=35) were administered tablet tamsulosin 0.4 mg once daily for 2 weeks in addition to stenting, and those in group 3 (n=31) underwent 5-French open-ended ureteral catheter drainage for 48 hours.
 
Main outcome measures: All patients were evaluated for flank pain using visual analogue scale scores at days 1, 2, 7, and 14, and for storage (irritative) bladder symptoms using International Prostate Symptom Score on days 7 and 14, and for quality-of-life score (using International Prostate Symptom Score) on day 14.
 
Results: Among the 99 patients, visual analogue scale scores were significantly lower in groups 2 and 3 than in group 1 (P<0.0001). The International Prostate Symptom Scores for all parameters were lower in patients from groups 2 and 3 compared with group 1 on both days 7 and 14 (P<0.0001). Analgesic requirements were similar in all three groups.
 
Conclusion: Open-ended ureteral catheter drainage is as effective as, and better tolerated than, routine stenting following uncomplicated ureteroscopic removal of stones. Tamsulosin reduces storage symptoms and improves quality of life after ureteral stenting.
 
 
New knowledge added by this study
  •  This study shows that short-duration (up to 48 hours) ureteral drainage following ureteroscopic removal of stones (URS) is as effective as, and better tolerated than, indwelling stent placement with respect to the need for postoperative drainage. Hence, it can replace double J stenting.
  • Routine tamsulosin administration in patients with indwelling stents following URS has beneficial effects not only on irritative bladder symptoms but also on flank pain (both persistent and voiding).
Implications for clinical practice or policy
  •  Replacement of stents with short-duration open-ended ureteral catheter drainage allows earlier rehabilitation of patients following URS. This is a viable option because there is no need for follow-up for stent-related symptoms or for maintaining records to plan stent removal (no lost or retained stents).
  • It avoids a second invasive endoscopic procedure of stent removal, thereby reducing the medical and financial burden on the patient (especially important in developing countries). Patients are more likely to undergo URS again if required in the future (with stone recurrence) than opt for less effective or expensive choices like medical management, shock wave lithotripsy, or alternative forms of medicine.
  • In stented patients, tamsulosin administration improves the overall quality of life and makes the period with the stent in situ more bearable and less symptomatic.
 
 
Introduction
Ureteroscopic removal of stones (URS) is the standard endoscopic method for the treatment of lower ureteric calculi. In recent times, the procedure has not required routine dilatation of the ureteric orifice, owing to the availability of small-calibre rigid ureteroscopes that can be easily manipulated into the ureter in most cases.
 
Once the stones are removed, an indwelling ureteral double J stent is placed, which remains in situ postoperatively for 2 to 4 weeks. The duration depends on a variety of factors such as the difficulty of stone removal, any mucosal injury, and associated stricture of the ureter or its meatus. Finney1 first described the use of double J stents in 1978.2 The use of stents has proved beneficial in various studies, because they prevent or reduce the occurrence of ureteric oedema, clot colic, and subsequent development of secondary ureteric stricture in cases with mucosal injury or difficult stones.3 4 5 However, the use of ureteral stents is not without complications. Patients may develop flank pain, haematuria, clot retention, dysuria, frequency, and other irritative bladder symptoms following stent placement in the postoperative period. Hence, many authors have questioned the need for routine stent placement or have advocated early removal.6 Recently, researchers have proposed that the irritative and other symptoms due to stents can be reduced or overcome by the use of alpha blockers.7 With this background, we conducted a prospective randomised study to assess the efficacy of oral tamsulosin for 14 days following stenting, and of an open-ended ureteral catheter for 48 hours instead of a stent, as viable options in patients who underwent uncomplicated URS for lower ureteric stones.
 
Methods
This study was conducted at the School of Medical Sciences and Research, Sharda University, Greater Noida, India, after clearance from the ethics committee. Patients were recruited from November 2011 to January 2014; a total of 99 patients who underwent URS for lower ureteric stones were included.
 
The inclusion criterion was a lower ureteric stone, defined as a stone imaged below the lower border of the sacroiliac joint and up to 10 mm in diameter on computed tomography. Stones larger than 10 mm in diameter, presence of ipsilateral kidney stones, cases with lower ureteric or meatal stricture requiring dilatation, and cases with significant mucosal injury (flap formation) per-operatively were excluded.
 
All patients underwent URS under spinal anaesthesia using an 8-French rigid ureteroscope; stones requiring fragmentation were broken with a pneumatic lithoclast and the fragments were retrieved with forceps. One surgeon performed all the interventional procedures during the study period.
 
The patients were randomly assigned to three groups using a random number table. We chose an arbitrary starting point on the table and then read towards the right from that number. A digit from 1 to 3 assigned a case to group 1, a digit from 4 to 6 to group 2, and a digit from 7 to 9 to group 3 (a value of 0 was ignored). A duty doctor prepared 120 serially numbered slips of paper (indicating the enrolment number) following this randomisation protocol and wrote on each slip the group to which a new case was to be assigned. The slips were folded, stapled, stacked in a box, and stored in the operating theatre. After completion of URS, the floor nurse opened the slip to reveal the appropriate enrolment number and the group (1, 2, or 3) to which the patient was assigned, thereby determining the further intervention.
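As an illustration only, the group-assignment rule described above can be expressed in a short script. The digit-to-group mapping follows the text; the use of Python, the function names, and the simulated source of digits are our own assumptions for this sketch.

```python
import random

def assign_group(digit):
    """Map one digit read from a random number table to a study group:
    1-3 -> group 1, 4-6 -> group 2, 7-9 -> group 3; 0 is skipped."""
    if 1 <= digit <= 3:
        return 1
    if 4 <= digit <= 6:
        return 2
    if 7 <= digit <= 9:
        return 3
    return None  # a value of 0 is ignored and the next digit is read

def prepare_slips(n_slips=120):
    """Prepare serially numbered slips (enrolment number, assigned group)."""
    slips, enrolment = [], 1
    while len(slips) < n_slips:
        group = assign_group(random.randint(0, 9))  # stands in for reading the table
        if group is not None:
            slips.append((enrolment, group))
            enrolment += 1
    return slips

print(prepare_slips()[:5])  # e.g. [(1, 2), (2, 3), (3, 1), (4, 3), (5, 2)]
```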
 
Patients in group 1 underwent double J stent placement following URS for a period of 2 weeks. Patients in group 2 were administered tablet tamsulosin 0.4 mg once daily for 2 weeks in addition to double J stent. Patients in group 3 underwent placement of an open-ended 5-French ureteral catheter following the URS procedure, the distal end of which was introduced into the lumen of Foley catheter. Both the ureteric and Foley catheter were removed on the second postoperative day in group 3 patients.
 
A 5-French 25-cm double J stent was used for stenting and the duration of surgery was recorded as time from the introduction of ureteroscope to the placement of Foley catheter. Postoperatively, patients were assessed for flank pain (persistent or voiding) by asking them to report the pain on a visual analogue scale (VAS) of 0 to 10 (0 being no pain and 10 pain as severe as it could be) on postoperative days 1, 2, 7, and 14. Patients were also asked to report storage symptoms using the International Prostate Symptom Score (IPSS) at 1 and 2 weeks postoperatively to assess irritative bladder symptoms, while the IPSS quality-of-life index was assessed at 2 weeks postoperatively. All stented patients were discharged with tablet levofloxacin 250 mg orally once daily for 2 weeks as suppressive prophylaxis for infection.
 
Patients who had an indwelling double J stent underwent stent removal after 2 weeks by cystoscopy under local anaesthesia using 2% lidocaine jelly, supplemented with intravenous injection of pentazocine 30 mg on a patient-need basis, and were asked to report the pain experienced during stent removal on a VAS. For in-patients (wards), administration and reporting of VAS scores were done by the floor manager (administrative personnel) with assistance from the nurse on duty; for out-patients on follow-up, this was done by an intern and the nurse on duty, in the local language (Hindi). All staff assessing VAS were blinded and had no direct influence or active role in the treatment or assessment protocol.
 
On completion of 2 weeks after surgery, all patients were asked, “Would you opt for the same procedure again as treatment if you develop ureteral stones in the future?” Patients complaining of pain postoperatively were given injection tramadol 50 mg intravenously if needed. If pain persisted, patients were given intravenous injection of pentazocine 30 mg. All patients underwent intravenous urography 1 month after the procedure to document stone clearance and any development of ureteral stricture. Patients were asked to report to the out-patient department if any other complications occurred following discharge.
 
The sample size was estimated with the following logic. We assumed a margin of error of 5%, a confidence level of 90%, and a population size of 45 (cases admitted with flank pain and requiring URS for stones; in our institution the number of cases undergoing URS in a typical year is roughly 45 to 50). Assuming a response distribution of 50%, the sample size calculated with the following formula was 39:
Sample size n and margin of error E are given by
x = Z(c/100)² r(100 - r)
n = Nx / [(N - 1)E² + x]
E = √[(N - n)x / n(N - 1)]
where N is the population size, r is the fraction of responses that we are interested in, and Z(c/100) is the critical value for the confidence level c.
 
This calculation is based on the normal distribution, and assumes that there are more than 30 samples and a power of 80%. Hence, we chose to recruit approximately 35 patients in each arm of the study.
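For readers who wish to check the arithmetic, the following is a minimal sketch of the finite-population formula above, assuming the usual two-sided normal critical value for a 90% confidence level; with N=45, r=50%, and E=5% it returns approximately 39, matching the figure quoted.

```python
from scipy.stats import norm

def sample_size(N, r, E, confidence):
    """Finite-population sample size using the formula in the text.
    N: population size; r: response distribution (%); E: margin of error (%);
    confidence: confidence level (%)."""
    z = norm.ppf(1 - (1 - confidence / 100) / 2)  # two-sided critical value Z(c/100)
    x = z ** 2 * r * (100 - r)
    return N * x / ((N - 1) * E ** 2 + x)

print(round(sample_size(N=45, r=50, E=5, confidence=90)))  # -> 39
```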
 
Statistical analyses
After collation of data, Student’s t test and the Pearson Chi squared test were used to compare the three groups for age, sex, stone size, and operating time. We also compared the severity of flank pain on postoperative days 1 and 2 and at weeks 1 and 2, the IPSS storage symptoms and total IPSS for each group at postoperative weeks 1 and 2, and the quality-of-life index at 2 weeks. Results from groups 2 and 3 were compared with group 1 to draw conclusions. Fisher’s exact test and the Pearson Chi squared test were used to compare the number of patients who needed intravenous analgesics because of severe postoperative pain and to examine the response to our question, “Would you opt for the same procedure again as treatment if you develop ureteral stones in the future?”
 
Results
There was no significant variation among the three groups in variables such as age, sex, stone size, and operating time (Table 1). The VAS score for flank pain, however, showed significant differences among the three groups. On postoperative day 1, the mean (± standard deviation) VAS scores in groups 1, 2, and 3 were 2.73 ± 1.14, 2.34 ± 1.12, and 2.35 ± 0.86 respectively, but the differences were not statistically significant (groups 1 and 2, P=0.17; groups 1 and 3, P=0.15). On day 7, the mean VAS scores for groups 2 and 3 were 0.97 ± 0.77 and 1.00 ± 0.72 respectively, which were significantly lower than the group 1 score of 2.85 ± 1.52 (P<0.0001). On day 14, the mean VAS scores for groups 1, 2, and 3 were 2.48 ± 1.40, 0.66 ± 0.67, and 0.55 ± 0.56 respectively (P<0.0001). This amounted to significantly greater pain in group 1 patients compared with those in groups 2 and 3 (for groups 1-2 and 1-3, P<0.0001; Fig 1). Among those stented, the mean VAS score for stent removal using 2% lidocaine jelly was 3.76 ± 1.55; by sex (male:female = 36:32), the mean scores were 4.97 ± 0.80 and 2.41 ± 0.96 respectively, a statistically significant difference (P<0.0001).
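As a rough check of comparisons such as the day-14 VAS scores, a two-sample t test can be computed directly from the reported summary statistics. The sketch below is illustrative only: the group sizes are taken from the Methods, and the exact P value depends on whether equal variances are assumed.

```python
from scipy.stats import ttest_ind_from_stats

# Day-14 VAS: group 1 (stent only, n=33) vs group 2 (stent + tamsulosin, n=35)
t_stat, p_value = ttest_ind_from_stats(mean1=2.48, std1=1.40, nobs1=33,
                                        mean2=0.66, std2=0.67, nobs2=35,
                                        equal_var=False)  # Welch's variant
print(f"t = {t_stat:.2f}, P = {p_value:.1e}")  # P far below 0.0001, consistent with the text
```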
 

Table 1. General characteristics of study patients
 

Figure 1. Visual analogue scale (VAS) scores on postoperative day 14
 
Analyses of IPSS on both postoperative days 7 and 14 for bladder sensation, frequency, urgency, and nocturia, as well as the total IPSS, showed a significant decrease in group 2 compared with group 1 for all parameters (P<0.0001). Group 3 patients had minimal mean IPSS scores to begin with (Table 2). The mean quality-of-life scores for groups 1, 2, and 3 were 4.00 ± 0.92, 1.37 ± 0.86, and 0.52 ± 0.50 respectively, and were significantly better for groups 2 and 3 compared with group 1 (P<0.00001; Fig 2 and Table 3).
 

Table 2. Mean (± standard deviation) International Prostate Symptom Scores (IPSS) according to groups on postoperative day 7
 

Figure 2. International Prostate Symptom Score (IPSS) and quality-of-life score on postoperative day 14
 

Table 3. Pain requiring analgesia, quality of life, and willingness to opt for the same procedure again with stone recurrence among the groups
 
Nine patients in group 1, 11 in group 2, and seven in group 3 complained of pain requiring injection of tramadol 50 mg (Table 3). Only one patient (in the stent-only group) further required intravenous injection of pentazocine 30 mg because of persistent pain. No patient in any group required intravenous analgesia after day 2, making analgesic requirements similar in all groups. One patient who was stented and had not received tamsulosin reported gross haematuria on the sixth day, which required readmission and catheterisation with bladder wash; the haematuria responded to conservative treatment. Beyond the 2-week period, no patient reported any other complication during the 2-month follow-up.
 
In this study, 20 patients in group 1, 29 in group 2, and all 31 in group 3 expressed willingness to undergo the same procedure again in future if needed. A higher percentage of patients in groups 2 and 3 than in group 1 were willing to undergo repeat surgery (if needed), and the differences were statistically significant (for groups 1-2, P=0.04, and for groups 1-3, P=0.0003; Table 3). Two patients from the open drainage group were lost to follow-up after 7 days. There was no crossover from one group to the other once assigned.
 
Discussion
Indwelling double J stents are routinely placed following URS to prevent flank pain and secondary ureteral strictures.4 8 9 However, duration-dependent symptoms due to ureteral stents have been well documented. Pollard and Macfarlane10 reported stent-related symptoms in 18 (90%) out of 20 patients who had indwelling ureteral stents following URS. Bregg and Riehle11 reported that symptoms such as gross haematuria (42%), dysuria (26%), and flank pain (30%) appeared in stented patients prior to being taken up for shock wave lithotripsy. Stoller et al8 documented ureteral stent–related symptoms, like flank pain, frequency, urgency, and dysuria, in at least 50% of patients who had an indwelling ureteral stent. In a series by Han et al,12 haematuria was reported as the most common symptom (69%) followed by dysuria (45.8%), frequency (42.2%), lower abdominal pain during voiding (32.2%), and flank pain (25.4%). Most studies report that apart from urgency and dysuria (which improve with time), there is no relief in other symptoms till the stent is removed.
 
Wang et al7 showed that administration of an α-blocker (tamsulosin) in stented patients improves flank pain and IPSS storage symptoms, along with an overall improvement in quality of life. They reported mean scores for frequency, urgency, and nocturia of 3.7, 3.82, and 2.01 respectively in stented patients, versus 1.55, 1.43, and 0.65 in those who also received tamsulosin for 2 weeks. The mean IPSS quality-of-life score was 4.21 in the stented group and 1.6 in the stented plus tamsulosin group. Moon et al13 reported that, compared with stenting, all the storage categories of the IPSS were significantly lower in the 1-day ureteral stent group (P<0.01). Although the VAS scores were not significantly different on postoperative day 1, they were significantly lower in the 1-day ureteral catheter group on postoperative days 7 and 14 (P<0.01).13
 
In our study, the mean total IPSS at 2 weeks postoperatively was 9.64, 1.71, and 0.13 for groups 1, 2, and 3 respectively (Fig 2). We also found that the mean VAS scores for flank pain and the mean IPSS scores for bladder sensation, frequency, urgency, and nocturia were significantly higher in group 1 than in groups 2 and 3 (Figs 1 and 2). These findings suggest that the indwelling double J stent causes time-dependent pain and storage symptoms due to persistent bladder irritation, and that administration of tamsulosin significantly decreased these symptoms. Our patients who received tamsulosin also fared much better on the quality-of-life index at both 1 and 2 weeks postoperatively than the group with stent placement only (mean score, 1.37 and 4.00 respectively), while those who underwent open-ended catheter drainage showed minimal irritative symptoms (Table 2).
 
In addition, removal of an indwelling stent constitutes an extra procedure, which is not only a physical but also a financial burden to the patient, especially in a developing country like India. Kim et al14 evaluated pain that occurred on cystoscopy following an intramuscular injection of diclofenac 90 mg. The mean VAS score during the procedure was 7.8 ± 0.7, which indicated severe pain. In addition, only 22.5% of patients responded “yes” to a questionnaire about their willingness to submit to the same procedure again.14 Moon et al13 reported a mean VAS score of 4.96 ± 1.29 for stent removal using lidocaine gel. Although the mean VAS score for stent removal under local anaesthesia in our series was 3.76, the mean for males and females was 4.97 and 2.41, respectively. This amounts to moderately severe pain in males which, in association with irritative bladder symptoms, could influence a patient’s willingness to undergo a repeat procedure in future if required. Moreover, manipulation during stent removal under local anaesthesia, especially in males, could lead to urethral or bladder injuries, a drawback that Hollenbeck et al15 have observed.
 
Many have questioned the need for ureteral stenting following URS. Denstedt et al,16 in a series of 58 patients who underwent URS (29 stented and 29 non-stented), reported no significant difference in complications or success rates between stented and non-stented cases. However, Djaladat et al17 reported that when ureteroscopy was performed without catheterisation, flank pain and renal colic could result from early ureteral oedema, implying that some postoperative drainage is better than no drainage at all. This formed the premise for using the open-ended ureteral catheter in the immediate postoperative period in our series, and the significantly lower VAS scores suggest that its placement can be as effective as stenting, with minimal irritative symptoms.17 Nabi et al18 concluded that there was no significant difference between stented and non-stented patients in postoperative requirements for analgesia, urinary tract infection, the stone-free rate, or ureteric stricture formation after uncomplicated URS. There was no significant difference in analgesic requirement among the three groups in our study: 9, 11, and 7 patients in groups 1, 2, and 3 respectively required intravenous tramadol on postoperative days 1 and 2, and only one patient in group 1 needed further analgesia. No patient needed analgesics beyond the second postoperative day, which is comparable to the series by Moon et al,13 who reported that the proportion of patients who needed intravenous analgesics because of severe postoperative flank pain was not significantly different between the stented and open-drainage groups.
 
In our study, 20 of 33 patients in group 1, 29 of 35 in group 2, and all 31 in group 3 responded affirmatively when asked, “Would you opt for the same procedure again as treatment if you develop ureteral stones in the future?” The P values for willingness to undergo a repeat procedure were 0.04 and 0.0003 when comparing groups 1-2 and 1-3 respectively, which is in line with another study (willingness P=0.02 in favour of open-ended drainage).13 These results show that patients in groups 2 and 3 (tamsulosin and open-catheter drainage) were significantly more likely to accept a repeat procedure if needed. Hence, it can be inferred that administration of tamsulosin following stenting, or placement of an open-ended catheter (removed on day 2), was better tolerated by patients than an indwelling stent–only approach.
 
The relatively small sample size and the unblinded design, with a possible placebo effect in the tamsulosin group, were the most obvious limitations of our study. We believe that, since patients in the stented groups were given tablet levofloxacin 250 mg as suppressive prophylaxis after discharge, any relief in lower urinary tract symptoms could not therefore be attributed to tamsulosin alone as a placebo effect. Assessment of VAS was done by personnel who were blinded and had no direct influence on the treatment or assessment protocol; this ruled out surgeons’ bias and their involvement in influencing the patients’ reporting of VAS scores. The degree of difficulty, complexity, and duration of the procedure could be construed as confounding factors in the study. However, the relatively simple inclusion and exclusion criteria, which excluded only cases with absolute indications for stenting, mitigate this concern, and the results indicate that open-ended short-duration ureteral drainage can replace stenting in all other scenarios.
 
Conclusion
Accepting the limitation of a small sample size, open-ended catheter drainage for 2 days is better tolerated in terms of flank pain and irritative bladder symptoms than an indwelling double J stent for 2 weeks, without any significant difference in complications or efficacy. We recommend this approach as a viable replacement for routine stenting following URS. In those patients who do undergo stenting following URS, administration of tamsulosin significantly reduces stent-related flank pain and irritative symptoms and enhances the overall quality of life. In view of the possible placebo effect in group 2, larger and more exhaustive multicentre randomised controlled trials are needed to assess the role of tamsulosin in countering post-URS stenting symptoms, given its wide acceptance for pain relief and stone passage in the treatment of lower ureteral stones.
 
Declaration
No conflicts of interest were declared by authors.
 
References
1. Finney RP. Experience with new double J ureteral catheter stent. J Urol 1978;120:678-81.
2. Hepperlen TW, Mardis HK, Kammandel H. Self-retained internal ureteral stents: a new approach. J Urol 1978;119:731-4.
3. Lee JH, Woo SH, Kim ET, Kim DK, Park J. Comparison of patient satisfaction with treatment outcomes between ureteroscopy and shock wave lithotripsy for proximal ureteral stones. Korean J Urol 2010;51:788-93. Crossref
4. Harmon WJ, Sershon PD, Blute ML, Patterson DE, Segura JW. Ureteroscopy: current practice and long-term complications. J Urol 1997;157:28-32. Crossref
5. Boddy SA, Nimmon CC, Jones S, et al. Acute ureteric dilatation for ureteroscopy. An experimental study. Br J Urol 1988;61:27-31. Crossref
6. Hosking DH, McColm SE, Smith WE. Is stenting following ureteroscopy for removal of distal ureteral calculi necessary? J Urol 1999;161:48-50. Crossref
7. Wang CJ, Huang SW, Chang CH. Effects of tamsulosin on lower urinary tract symptoms due to double-J stent: a prospective study. Urol Int 2009;83:66-9. Crossref
8. Stoller ML, Wolf JS Jr, Hofmann R, Marc B. Ureteroscopy without routine balloon dilation: an outcome assessment. J Urol 1992;147:1238-42.
9. Netto Júnior NR, Claro Jde A, Esteves SC, Andrade EF. Ureteroscopic stone removal in the distal ureter. Why change? J Urol 1997;157:2081-3. Crossref
10. Pollard SG, Macfarlane R. Symptoms arising from Double-J ureteral stents. J Urol 1988;139:37-8.
11. Bregg K, Riehle RA Jr. Morbidity associated with indwelling internal stents after shock wave lithotripsy. J Urol 1989;141:510-2.
12. Han CH, Ha US, Park DJ, Kim SH, Lee YS, Kang SH. Change of symptom characteristics with time in patients with indwelling double-J ureteral stents. Korean J Urol 2005;46:1137-40.
13. Moon KT, Cho HJ, Cho JM, et al. Comparison of an indwelling period following ureteroscopic removal of stones between Double-J stents and open-ended catheters: a prospective, pilot, randomized, multicenter study. Korean J Urol 2011;52:698-702. Crossref
14. Kim KS, Kim JS, Park SW. Study on the effects and safety of propofol anaesthesia during cystoscopy. Korean J Urol 2006;47:1230-5. Crossref
15. Hollenbeck BK, Schuster TG, Faerber GJ, Wolf JS Jr. Routine placement of ureteral stents is unnecessary after ureteroscopy for urinary calculi. Urology 2001;57:639-43. Crossref
16. Denstedt JD, Wollin TA, Sofer M, Nott L, Weir M, D’A Honey RJ. A prospective randomized controlled trial comparing nonstented versus stented ureteroscopic lithotripsy. J Urol 2001;165:1419-22. Crossref
17. Djaladat H, Tajik P, Payandemehr P, Alehashemi S. Ureteral catheterization in uncomplicated ureterolithotripsy: a randomized, controlled trial. Eur Urol 2007;52:836-41. Crossref
18. Nabi G, Cook J, N’Dow J, McClinton S. Outcomes of stenting after uncomplicated ureteroscopy: systematic review and meta-analysis. BMJ 2007;334:572. Crossref

Surveillance of emerging drugs of abuse in Hong Kong: validation of an analytical tool

Hong Kong Med J 2015 Apr;21(2):114–23 | Epub 10 Mar 2015
DOI: 10.12809/hkmj144398
© Hong Kong Academy of Medicine. CC BY-NC-ND 4.0
 
ORIGINAL ARTICLE
Surveillance of emerging drugs of abuse in Hong Kong: validation of an analytical tool
Magdalene HY Tang, PhD1; CK Ching, FRCPA, FHKAM (Pathology)1; ML Tse, FHKCEM, FHKAM (Emergency Medicine)2; Carol Ng, BSW, MA3; Caroline Lee, MSc1; YK Chong, MB, BS1; Watson Wong, MSc1; Tony WL Mak, FRCPath, FHKAM (Pathology)1; Emerging Drugs of Abuse Surveillance Study Group
1 Toxicology Reference Laboratory, Hospital Authority, Hong Kong
2 Hong Kong Poison Information Centre, Hospital Authority, Hong Kong
3 Hong Kong Lutheran Social Service, the Lutheran Church – Hong Kong Synod, Homantin, Hong Kong
Corresponding author: Dr Tony WL Mak (makwl@ha.org.hk)
 Full paper in PDF
Abstract
Objective: To validate a locally developed chromatography-based method to monitor emerging drugs of abuse whilst performing regular drug testing in abusers.
 
Design: Cross-sectional study.
 
Setting: Eleven regional hospitals, seven social service units, and a tertiary level clinical toxicology laboratory in Hong Kong.
 
Participants: A total of 972 drug abusers and high-risk individuals were recruited from acute, rehabilitation, and high-risk settings between 1 November 2011 and 31 July 2013. A subset of the participants was of South Asian ethnicity. In total, 2000 urine or hair specimens were collected.
 
Main outcome measures: Proof of concept that surveillance of emerging drugs of abuse can be performed whilst conducting routine drug of abuse testing in patients.
 
Results: The method was successfully applied to 2000 samples with three emerging drugs of abuse detected in five samples: PMMA (paramethoxymethamphetamine), TFMPP [1-(3-trifluoromethylphenyl)piperazine], and methcathinone. The method also detected conventional drugs of abuse, with codeine, methadone, heroin, methamphetamine, and ketamine being the most frequently detected drugs. Other findings included the observation that South Asians had significantly higher rates of using opiates such as heroin, methadone, and codeine; and that ketamine and cocaine had significantly higher detection rates in acute subjects compared with the rehabilitation population.
 
Conclusions: This locally developed analytical method is a valid tool for simultaneous surveillance of emerging drugs of abuse and routine drug monitoring of patients at minimal additional cost and effort. Continued, proactive surveillance and early identification of emerging drugs will facilitate prompt clinical, social, and legislative management.
 
 
New knowledge added by this study
  •  A locally developed method is a valid tool for monitoring the penetrance of emerging drugs of abuse into our society whilst performing regular drugs of abuse testing.
Implications for clinical practice or policy
  •  Implementation of the analytical method in the routine drug monitoring of drug abusers will enable simultaneous surveillance of novel drugs of abuse at minimal extra cost and effort.
  •  Continued and proactive surveillance of emerging drugs of abuse in the population will facilitate prompt measures in the clinical, social, and legislative management of these constantly changing and potentially dangerous drugs.
 
 
Introduction
Despite continuous efforts, drug abuse remains a major social and medical problem in today’s society. In particular, there has been a rapid and continued growth of ‘emerging’ drugs of abuse (DOA) on a global scale.1 2 Emerging DOA, also called designer drugs or novel psychoactive substances, bear a chemical and/or pharmacological resemblance to conventional DOA and pose a threat to public health, but are often (initially) not controlled by law. They are easily accessible from street dealers or through the internet, and are often presumed to be safer than conventional DOA owing to their ‘legal’ or ‘herbal’ nature.1 3 In Hong Kong, the drug scene has also been penetrated in recent years by such substances as the piperazine derivative TFMPP [1-(3-trifluoromethylphenyl)piperazine],4 the synthetic cannabinoids,5 the methamphetamine derivative PMMA (paramethoxymethamphetamine),6 and the NBOMe (N-methoxybenzyl derivatives of phenethylamine).7 Some of these novel drugs pose a significant health threat and numerous fatalities have been reported worldwide.8 9 10 In particular, PMMA and the NBOMe drugs have been associated with severe clinical toxicity and fatalities in Hong Kong.6 7
 
Effective diagnosis and treatment of emerging DOA intoxication rely on the timely and accurate detection of these substances. Whilst immunoassay and drug screening methods are well-established for conventional DOA, laboratory analysis of novel drugs is not so readily available. This inevitably leads to the delayed discovery of emerging drugs and consequently early medical and social intervention is compromised. Recently, a liquid chromatography–tandem mass spectrometry (LC-MS/MS)–based method has been established locally that allows the simultaneous detection of 47 commonly abused drugs in addition to over 45 emerging DOA and their metabolites in urine11 and hair (the latter manuscript in preparation). The aim of the current study was to validate this analytical method as a tool to monitor emerging DOA whilst performing regular DOA testing by applying the method to 2000 urine and hair specimens collected from drug abusers as well as high-risk individuals.
 
Methods
Sample collection
Between 1 November 2011 and 31 July 2013, 964 urine and 1036 hair specimens (n=2000 in total) were collected for analysis. Subjects included in the study were patients/clients of the units listed below who were suspected of actively using DOA and who agreed to participate: (i) substance abuse clinics within the Hospital Authority (Castle Peak Hospital, Kowloon Hospital, Kwai Chung Hospital, Pamela Youde Nethersole Eastern Hospital, Prince of Wales Hospital, Queen Mary Hospital); (ii) accident and emergency (A&E) departments within the Hospital Authority (Pamela Youde Nethersole Eastern Hospital, Pok Oi Hospital, Princess Margaret Hospital, Queen Mary Hospital, Tuen Mun Hospital, United Christian Hospital, Yan Chai Hospital); (iii) the Hong Kong Poison Information Centre (HKPIC) toxicology clinic; (iv) counselling centres for psychotropic substance abusers (CCPSA; Evergreen Lutheran Centre, Rainbow Lutheran Centre, Cheer Lutheran Centre); (v) various rehabilitation centres including the Society of Rehabilitation and Crime Prevention (SRACP), Operation Dawn, and Caritas Wong Yiu Nam Centre; and (vi) Youth Outreach. Pregnant women and individuals aged under 18 years were excluded from the study. The majority of the participants were Chinese, although those recruited from SRACP were exclusively South Asians.
 
The study was approved by the institutional ethics review boards (Kowloon West Cluster: KW/FR-11-011 (41-05); Kowloon Central/Kowloon East Cluster: KC/KE-11-0170/ER-2; Hong Kong West Cluster: UW 11-398; Hong Kong East Cluster: HKEC-2011-068; New Territories West Cluster: NTWC/CREC/989/11; New Territories East Cluster: CRE-2011.427). Subjects donated samples on a voluntary basis and informed consent was obtained. Each subject donated either urine or hair, or both, at each donation episode. Some gave repeated sample(s): donations were at least 8 weeks apart. Urine was collected in a plain plastic bottle and frozen until analysis. For hair, a lock of hair was collected from the back of the head for analysis. The root end was identified to facilitate segmental analysis.
 
Sample analysis
The methodology for urine analysis has been detailed in a separate publication.11 In brief, the urine sample was subjected to an initial glucuronidase digestion, followed by solid phase extraction and sample concentration. The hair sample (first 3-cm segment) was first decontaminated and subsequently subjected to simultaneous micro-pulverisation and extraction in solvent. The final filtrates were analysed by LC-MS/MS performed on an Agilent 6430 triple-quadrupole mass spectrometer (Agilent Technologies, Singapore) coupled with Agilent 1290 Infinity liquid chromatography system. The 47 conventional and 47 emerging DOA identified for analysis are listed in Table 1. The analytical method had previously been validated according to international guidelines.12
 

Table 1. The conventional and emerging drugs of abuse being analysed for
 
Statistical analysis
Statistical analysis was performed using Fisher’s exact test, with a P value of less than 0.05 considered statistically significant. Comparison of the drug detection rates was made between (i) different ethnic groups and (ii) samples collected in the rehabilitation and acute settings.
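As an illustration of how such rate comparisons can be carried out, the sketch below applies Fisher's exact test to a 2x2 table of positive and negative counts in two groups; the counts used in the example are hypothetical placeholders, not the study data.

```python
from scipy.stats import fisher_exact

def compare_detection_rates(pos_a, n_a, pos_b, n_b):
    """Two-sided Fisher's exact test comparing a drug's detection rate
    between group A and group B; returns the odds ratio and P value."""
    table = [[pos_a, n_a - pos_a],
             [pos_b, n_b - pos_b]]
    return fisher_exact(table, alternative="two-sided")

# Hypothetical counts: 60/200 positive in one group vs 30/248 in the other
odds_ratio, p = compare_detection_rates(60, 200, 30, 248)
print(f"odds ratio = {odds_ratio:.2f}, P = {p:.3g}")
```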
 
Results
Subject demographics
In total, 972 individuals took part in the study (720 males, 252 females). Their mean and median ages were 35 and 33 years respectively (range, 18-74 years). Of the 972 subjects, 815 were single-time donors and 157 donated repeated samples (between 2 and 6 donations each). There were 1224 donation episodes (815 from single-time donors; 409 from repeated donors) and 2000 specimens collected in total, of which 964 were urine and 1036 were hair (Fig 1). Of the 1224 donation episodes, the subjects were recruited from: substance abuse clinics (n=822), drug rehabilitation and counselling centres (n=320), a youth hangout centre (n=41), the HKPIC toxicology clinic (n=28), and A&E departments (n=13).
 

Figure 1. Subject demographics
 
Emerging drugs of abuse
Of the 2000 specimens analysed, five were found to contain three emerging DOA: PMMA, TFMPP, and methcathinone. The methamphetamine derivative PMMA was detected in three hair specimens (cases 1-3, Table 2). All three hair samples were also found to contain cocaine and ketamine. Nonetheless, PMMA was not detected in the subjects’ concurrent urine samples.
 

Table 2. Emerging drugs of abuse detected in the study
 
A piperazine derivative, TFMPP, was detected in one urine specimen (case 4, Table 2), together with cocaine and ketamine. Nonetheless TFMPP was not detected in the parallel hair sample.
 
Methcathinone, also known as ephedrone, is a cathinone (beta-keto amphetamine) analogue. It was detected in combination with amphetamine, methamphetamine, and cocaine metabolite in one urine specimen (case 5, Table 2). No parallel hair specimen was available from this subject.
 
Conventional drugs of abuse
Analysis of the 964 urine samples revealed the presence of 19 types of conventional DOA (Fig 2a). Codeine was the most common, being detected in 47% of the urine samples, followed by methadone (35%), heroin (22%), methamphetamine (21%), ketamine (20%), zopiclone (20%), amphetamine (17%), midazolam (17%), and dextromethorphan (14%). Cocaine and cannabis were detected in 6% and 3% of urine samples, respectively.
 

Figure 2. Conventional drugs of abuse detected in (a) urine and (b) hair samples as a percentage of the total number of samples collected (964 urine and 1036 hair samples)
 
In hair specimens (1036 in total), 14 types of conventional DOA were detected (Fig 2b). Codeine (36%) and methadone (35%) were the most prevalent, followed by ketamine (34%), heroin (33%), methamphetamine (29%), dextromethorphan (28%), and zopiclone (26%). Cocaine and zolpidem were detected in 12% and 7% of the samples, respectively.
 
Ethnic minority
A subset of participants (n=130) were of South Asian ethnicity. These subjects donated 248 specimens in 130 episodes. Their drug use pattern was significantly different from that of the Chinese participants. Comparison of urinalysis results revealed that South Asians had significantly higher rates of opiate use (heroin, methadone, and codeine; P<0.001) as well as of dextromethorphan use (P<0.05; Fig 3a). In contrast, ketamine, zopiclone, and diazepam (P<0.001) as well as cocaine and amphetamine (P<0.05) were detected at significantly higher rates in Chinese than in South Asian subjects. Analysis of hair specimens showed a largely similar pattern of discrepancy between the two ethnicities (Fig 3b).
 

Figure 3. Comparison of the drugs detected in (a) urine and (b) hair samples collected from Chinese (white bars) and South Asians (dark bars)
 
Collection site setting
The urine samples in the current study were collected from different settings: 38 samples from acute setting (A&E departments and HKPIC toxicology clinic); 885 samples from drug rehabilitation setting (substance abuse clinics, CCPSA and other rehabilitation centres); and 41 from a high-risk population (youth hangout). A comparison of drugs detected between the acute and rehabilitation settings revealed a significantly higher detection rate of ketamine and cocaine (P<0.001) in the former (Fig 4). Drugs such as codeine, methadone, heroin, zopiclone, and dextromethorphan were detected at higher rates in samples collected in a rehabilitation setting.
 

Figure 4. Comparison of the drugs detected in urine samples collected under the rehabilitation (white bars) and acute (dark bars) settings
 
Discussion
Emerging DOA are constantly being monitored worldwide by agencies such as the European Monitoring Centre for Drugs and Drug Addiction (EMCDDA). In 2008, 13 emerging DOA were reported for the first time to EMCDDA; by 2012, 73 new drugs had been reported within a year.1 Recent years have also seen the emergence of such designer drugs in Hong Kong, some of which have caused severe morbidity and fatalities.4 5 7 The early identification of emerging drugs enables prompt counteractive measures in terms of their clinical and social management, and the surveillance of emerging drugs in the population is increasingly being adopted globally as a proactive approach to combat drug abuse.13 14 15 In view of this, the present study was conducted to validate a locally developed LC-MS/MS method to screen for emerging DOA in the local population whilst simultaneously monitoring routine DOA. The study was conducted over a 21-month period. Multiple clinical and social service units from across the city collaborated in the study for a wider geographical coverage and more representative results. In 2013, approximately 10 069 drug abusers were reported in Hong Kong.16 This study population (972 subjects) was estimated to represent 9.7% of the total potential subjects. Regarding the response or participation rate, due to practical concerns and limited manpower, it was not possible for every collaborating unit to document fully the number of subjects approached or the number who refused consent.
 
The current results revealed the presence of three emerging drugs (PMMA, TFMPP, and methcathinone) in five specimens. This low prevalence is an expected finding due to the intrinsic nature of ‘emerging’ rather than ‘established’ drugs. Nevertheless, PMMA is a highly toxic methamphetamine derivative that has been sold on the drug market as MDMA (3,4-methylenedioxy-methamphetamine) substitute.8 The drug has been reported to have caused up to 90 fatalities worldwide over the years, including eight fatalities in Taiwan.8 17 In particular, PMMA-associated fatalities have also been reported recently in Hong Kong.6
 
On the other hand, TFMPP is a piperazine derivative with mild hallucinogenic effects and, when taken with another piperazine derivative benzylpiperazine (BZP), causes ecstasy-like effects.18 Piperazine derivatives are known to cause dissociative and sympathomimetic toxicity.19 The drug TFMPP was first reported in Hong Kong in 20104 and has been identified as an emerging drug in Ireland in recent years.15
 
Another emerging DOA detected in the study, methcathinone, gained popularity from the 1970s to 1990s, and was recently reported as a ‘re-emerging’ DOA in Sweden.14 It is an amphetamine-like stimulant and is among a group of synthetic cathinone compounds, commonly known as “bath salts”, that have been associated with numerous fatalities worldwide.20 Other highly toxic cathinone derivatives include mephedrone and MDPV (methylenedioxypyrovalerone),9 10 both of which are also covered in the analytical method but were not detected in the current study.
 
Of the conventional DOA, the opiates, methamphetamine, and ketamine were among the most frequently detected in this study. This is consistent with the data on reported drug abusers that was published by the local Central Registry of Drug Abuse.21 Since this manuscript focuses on screening for emerging DOA, detailed analysis of conventional drug use such as gender and age differences was not performed. However, an interesting finding was the observation that significantly higher proportions of South Asian drug abusers used opiates such as heroin, methadone, and codeine compared with Chinese; Chinese drug abusers were much more likely to use ketamine, cocaine, zopiclone, and diazepam. This highlights the ethnic differences in drug use and indicates that alternative approaches may be required for the clinical and social management of ethnic minorities in Hong Kong.
 
It is of interest to note the particularly high percentage of ketamine and cocaine detected in urine samples collected at A&E departments and toxicology clinic compared with the other collection sites. This may indicate that these drugs carry a more acute and severe toxicity profile relative to the other drugs with a consequent need of hospitalisation. A previous study on drug driving in Hong Kong also reported ketamine as the most prevalent drug detected in driver casualties who presented to the A&E department.22 Comparison of hair analysis results was not made here, since the main focus was on the difference between acute and non-acute cases; hair specimens would be less helpful since this biological matrix does not reflect recent exposure to drugs (see below for further discussion).
 
The present study showed a broadly similar pattern of conventional DOA detected in the urine and hair matrices. Cocaine, dextromethorphan, and zolpidem were detected at higher rates in hair than in urine, which may indicate the relatively high deposition efficiency of these drugs in the hair matrix. It should be noted, however, that the metabolites of zolpidem were not included in the current assay, which may decrease the sensitivity of its detection in urine. Urine and hair specimens have different ‘detection windows’, that is, they reflect different time frames of drug intake. Detection in urine indicates recent intake (within hours/days); thus this matrix is useful for the management of acute toxicity and drug overdose. The detection window of hair is much longer (weeks/months), enabling this matrix to be used for monitoring long-term drug use or abstinence.
 
When interpreting the results of the current study, it should be noted that some drugs may have been taken for therapeutic reasons, for example codeine, methadone, phentermine, or the tranquilizers/benzodiazepines. It was not possible in this study to differentiate medical use from abuse. It should also be noted that some drugs may be present as metabolites of others, for example temazepam and oxazepam (both of which are diazepam metabolites) and the emerging drug mCPP (metabolite of the antidepressant trazodone). Morphine is also the metabolite of codeine and heroin; it was only reported here as a drug in the absence of either codeine or heroin in the same sample.
 
Effective control of novel drugs depends on their early identification. A number of means to monitor emerging DOA have been proposed, such as conducting population surveys, analysing online test purchases, or wastewater analysis.3 Population surveys suffer the potential drawback of obtaining inaccurate data, since the actual identity of the drugs may differ from the claimed ingredients, for example, BZP being sold as ‘MDMA’ tablets.23 Analysing drug items purchased online is a costly approach due to the vast number of products available. Wastewater analysis may be used for monitoring conventional DOA, but the approach may not be easily adapted to the surveillance of emerging drugs due to the anticipated minute levels (ng/L range) in wastewater.24 All the above approaches require a considerable amount of financial and manpower resources. We propose the integration of emerging DOA surveillance into the routine drug monitoring of patients using the established analytical method. This surveillance approach is accurate, readily attainable, and is also achieved with minimal extra cost and effort since it is a convenient by-product of the routine drug monitoring of patients. Additionally, its applicability in A&E department patients allows the early identification of highly toxic novel drugs.
 
The proposed analytical method is LC-MS/MS–based, and offers several advantages over traditional DOA testing by immunoassay methods. First, development of an immunoassay is a lengthy process (in terms of years) involving the generation of antibodies. Immunoassay analysis also depends solely on the availability of commercial kits. These features do not favour early detection of new compounds given the protean nature of emerging drugs. In contrast, LC-MS/MS–based methods are much more versatile, permitting in-house enhancement of the method to allow detection of new compounds as soon as they enter the market. Second, although immunoassay methods require minimal capital investment, their running costs are high due to the generation of antibodies. On the other hand, LC-MS/MS methods require a high initial investment in analysers, but the running cost is lower in the long term as the reagents involved are relatively inexpensive. Lastly, unlike immunoassay methods that are only preliminary in nature and require further confirmatory testing, mass spectrometry analysis is already confirmatory with accurate and definitive results.
 
In addition to laboratory analysis, the emerging DOA surveillance team requires the expertise of medical doctors to keep a close watch on emerging drugs on the market, especially those with high clinical toxicity. Based on this ‘toxico-intelligence’, scientists should then enhance the analytical method to include such emerging substances. Hence, the effective control of emerging drugs will require a team of trained medical doctors and scientists, as well as versatile technology that enables the continual expansion of analytical coverage. In view of the resource requirements, specialised toxicology centres may be better suited for the purpose.
 
The present study has provided proof of concept that a locally developed analytical method is a valid tool to monitor emerging DOA whilst simultaneously performing regular DOA testing in patients. Implementation of the method in the routine drug monitoring of abusers will enable the continued and proactive surveillance of novel drugs in the population with minimal extra cost and effort. This surveillance gathers important information so that society can be prepared in terms of legislation, as well as the social and clinical management of these potentially dangerous drugs. Further expansion of the analytical coverage will help keep abreast of the rapid and constant change in the designer drug scene.
 
Acknowledgements
This study was financially supported by the Beat Drugs Fund (Narcotics Division, Security Bureau of the HKSAR Government), project reference BDF101021. The authors are also grateful to all participants and for the generous assistance received from participating clinical divisions within the Hospital Authority and social service units.
 
Declaration
No conflicts of interest were declared by the authors.
 
Appendix
Members of the Emerging Drugs of Abuse Surveillance Study Group:
YH Lam, MPhil1; WH Cheung, FHKCPsych, FHKAM (Psychiatry)2; Eva Dunn, FHKCPsych, FHKAM (Psychiatry)3; CK Wong, FHKCPsych, FHKAM (Psychiatry)3; YC Lo, MSc, FIBMS4; M Lam, FHKCPsych, FHKAM (Psychiatry)5; Michael Lee, MSc6; Angus Lau, MSW7; Albert KK Chung, FHKCPsych, FHKAM (Psychiatry)8; Sidney Tam, FHKCPath, FHKAM (Pathology)9; Ted Tam, BSW10; Vincent Lam, BA(Hon)11; Hezon Tang, MSW12; Katy Wan, BSocSc, MA13; Mamre Lilian Yeh, BA, MSc14; MT Wong, FHKCPsych, FHKAM (Psychiatry)15; CC Shek, FHKCPath, FHKAM (Pathology)16; WK Tang, MD, FHKAM (Psychiatry)17; Michael Chan, FRCPA, FHKAM (Pathology)18; Jeffrey Fung, FRCSEd, FHKAM (Emergency Medicine)19; SH Tsui, FRCP (Edin), FHKAM (Emergency Medicine)20; Albert Lit, FCEM, FHKAM (Emergency Medicine)21; Joe Leung, FHKCEM, FHKAM (Emergency Medicine)22
 
1 Toxicology Reference Laboratory, Hospital Authority, Hong Kong
2 Substance Abuse Assessment Unit, Kwai Chung Hospital, Hong Kong
3 Department of Psychiatry, Pamela Youde Nethersole Eastern Hospital, Hong Kong
4 Department of Pathology, Pamela Youde Nethersole Eastern Hospital, Hong Kong
5 Department of General Adult Psychiatry, Castle Peak Hospital, Hong Kong
6 Department of Clinical Pathology, Tuen Mun Hospital, Hong Kong
7 The Society of Rehabilitation and Crime Prevention, Hong Kong
8 Department of Psychiatry, Queen Mary Hospital, Hong Kong
9 Department of Pathology and Clinical Biochemistry, Queen Mary Hospital, Hong Kong
10 Youth Outreach, Hong Kong
11 Evergreen Lutheran Centre, Hong Kong Lutheran Social Service, the Lutheran Church — Hong Kong Synod
12 Cheer Lutheran Centre, Hong Kong Lutheran Social Service, the Lutheran Church — Hong Kong Synod
13 Rainbow Lutheran Centre, Hong Kong Lutheran Social Service, the Lutheran Church — Hong Kong Synod
14 Operation Dawn Ltd (Gospel Drug Rehab Centre), Hong Kong
15 Department of Psychiatry, Kowloon Hospital, Hong Kong
16 Department of Pathology, Queen Elizabeth Hospital, Hong Kong
17 Department of Psychiatry, the Chinese University of Hong Kong, Hong Kong
18 Department of Chemical Pathology, Prince of Wales Hospital, Hong Kong
19 Accident and Emergency Department, Tuen Mun Hospital, Hong Kong
20 Accident and Emergency Department, Queen Mary Hospital, Hong Kong
21 Accident and Emergency Department, Princess Margaret Hospital, Hong Kong
22 Accident and Emergency Department, Pamela Youde Nethersole Eastern Hospital, Hong Kong
 
References
1. European Monitoring Centre for Drugs and Drug Addiction (EMCDDA), Europol. New drugs in Europe, 2012. EMCDDA-Europol 2012 Annual Report on the implementation of Council Decision 2005/387/JHA; 2012.
2. Nelson ME, Bryant SM, Aks SE. Emerging drugs of abuse. Emerg Med Clin North Am 2014;32:1-28. Crossref
3. Brandt SD, King LA, Evans-Brown M. The new drug phenomenon. Drug Test Anal 2014;6:587-97. Crossref
4. Poon WT, Lai CF, Lui MC, Chan AY, Mak TW. Piperazines: a new class of drug of abuse has landed in Hong Kong. Hong Kong Med J 2010;16:76-7.
5. Tung CK, Chiang TP, Lam M. Acute mental disturbance caused by synthetic cannabinoid: a potential emerging substance of abuse in Hong Kong. East Asian Arch Psychiatry 2012;22:31-3.
6. The first mortality case of PMMA in Hong Kong. 本港首次發現服用毒品PMMA後死亡個案. RTHK. 2014 Feb 4. http://m.rthk.hk/news/20140204/982353.htm.
7. Tang MH, Ching CK, Tsui MS, Chu FK, Mak TW. Two cases of severe intoxication associated with analytically confirmed use of the novel psychoactive substances 25B-NBOMe and 25C-NBOMe. Clin Toxicol (Phila) 2014;52:561-5. Crossref
8. Lin DL, Liu HC, Yin HL. Recent paramethoxymethamphetamine (PMMA) deaths in Taiwan. J Anal Toxicol 2007;31:109-13. Crossref
9. Maskell PD, De Paoli G, Seneviratne C, Pounder DJ. Mephedrone (4-methylmethcathinone)-related deaths. J Anal Toxicol 2011;35:188-91. Crossref
10. Durham M. Ivory wave: the next mephedrone? Emerg Med J 2011;28:1059-60. Crossref
11. Tang MH, Ching CK, Lee CY, Lam YH, Mak TW. Simultaneous detection of 93 conventional and emerging drugs of abuse and their metabolites in urine by UHPLC-MS/MS. J Chromatogr B Analyt Technol Biomed Life Sci 2014;969:272-84. Crossref
12. The fitness for purpose of analytical methods: A laboratory guide to method validation and related topics. EURACHEM Working Group; 1998.
13. Archer JR, Dargan PI, Hudson S, Wood DM. Analysis of anonymous pooled urine from portable urinals in central London confirms the significant use of novel psychoactive substances. QJM 2013;106:147-52. Crossref
14. Helander A, Beck O, Hägerkvist R, Hultén P. Identification of novel psychoactive drug use in Sweden based on laboratory analysis—initial experiences from the STRIDA project. Scand J Clin Lab Invest 2013;73:400-6. Crossref
15. O’Byrne PM, Kavanagh PV, McNamara SM, Stokes SM. Screening of stimulants including designer drugs in urine using a liquid chromatography tandem mass spectrometry system. J Anal Toxicol 2013;37:64-73. Crossref
16. Newly/previously reported drug abusers by sex. Central Registry of Drug Abuse. Available from: http://www.nd.gov.hk/statistics_list/doc/en/t11.pdf. Accessed 3 Feb 2015.
17. European Monitoring Centre for Drugs and Drug Addiction (EMCDDA). EMCDDA risk assessments: Report on the risk assessment of PMMA in the framework of the joint action on new synthetic drugs; 2003.
18. Arbo MD, Bastos ML, Carmo HF. Piperazine compounds as drugs of abuse. Drug Alcohol Depend 2012;122:174-85. Crossref
19. Wood DM, Button J, Lidder S, Ramsey J, Holt DW, Dargan PI. Dissociative and sympathomimetic toxicity associated with recreational use of 1-(3-trifluoromethylphenyl) piperazine (TFMPP) and 1-benzylpiperzine (BZP). J Med Toxicol 2008;4:254-7. Crossref
20. Zawilska JB, Wojcieszak J. Designer cathinones—an emerging class of novel recreational drugs. Forensic Sci Int 2013;231:42-53. Crossref
21. Reported drug abusers by sex by common type of drugs abused. Central Registry of Drug Abuse. Available from: http://www.nd.gov.hk/statistics_list/doc/en/t15.pdf. Accessed 11 Jul 2014.
22. Wong OF, Tsui KL, Lam TS, et al. Prevalence of drugged drivers among non-fatal driver casualties presenting to a trauma centre in Hong Kong. Hong Kong Med J 2010;16:246-51.
23. Wood DM, Dargan PI, Button J, et al. Collapse, reported seizure—and an unexpected pill. Lancet 2007;369:1490. Crossref
24. van Nuijs AL, Gheorghe A, Jorens PG, Maudens K, Neels H, Covaci A. Optimization, validation, and the application of liquid chromatography-tandem mass spectrometry for the analysis of new drugs of abuse in wastewater. Drug Test Anal 2014;6:861-7. Crossref

Duplex sonography for detection of deep vein thrombosis of upper extremities: a 13-year experience

Hong Kong Med J 2015 Apr;21(2):107–13 | Epub 27 Feb 2015
DOI: 10.12809/hkmj144389
© Hong Kong Academy of Medicine. CC BY-NC-ND 4.0
 
ORIGINAL ARTICLE
Duplex sonography for detection of deep vein thrombosis of upper extremities: a 13-year experience
Amy SY Chung, MSc, MHKCRRT1; WH Luk, FRCR, FHKAM (Radiology)2; Adrian XN Lo, FRCR, FHKAM (Radiology)3; CF Lo, PDDR1
1Department of Radiology, United Christian Hospital, Kwun Tong, Hong Kong
2Department of Radiology, Princess Margaret Hospital, Laichikok, Hong Kong
3Department of Radiology, Hong Kong Adventist Hospital, 40 Stubbs Road, Hong Kong
Corresponding author: Dr Amy SY Chung (chungsya@gmail.com)
 Full paper in PDF
Abstract
Objectives: To determine the prevalence and characteristics of sonographically evident upper-extremity deep vein thrombosis in symptomatic Chinese patients and identify its associated risk factors.
 
Design: Case series.
 
Setting: Regional hospital, Hong Kong.
 
Patients: Data on patients undergoing upper-extremity venous sonography examinations during a 13-year period from November 1999 to October 2012 were retrieved. Variables including age, sex, history of smoking, history of lower-extremity deep vein thrombosis, major surgery within 30 days, immobilisation within 30 days, cancer (history of malignancy), associated central venous or indwelling catheter, hypertension, diabetes mellitus, sepsis within 30 days, and stroke within 30 days were tested using binary logistic regression to understand the risk factors for upper-extremity deep vein thrombosis.
 
Main outcome measures: The presence of upper-extremity deep vein thrombosis identified.
 
Results: Overall, 213 patients with upper-extremity sonography were identified. Of these patients, 29 (13.6%) had upper-extremity deep vein thrombosis. The proportion of upper-extremity deep vein thrombosis using initial ultrasound was 0.26% of all deep vein thrombosis ultrasound requests. Upper limb swelling was the most common presentation seen in a total of 206 (96.7%) patients. Smoking (37.9%), history of cancer (65.5%), and hypertension (27.6%) were the more prevalent conditions among patients in the upper-extremity deep vein thrombosis–positive group. No statistically significant predictor of upper-extremity deep vein thrombosis was noted if all variables were included. After backward stepwise logistic regression, the final model was left with only age (P=0.119), female gender (P=0.114), and history of malignancy (P=0.024) as independent variables. History of malignancy remained predictive of upper-extremity deep vein thrombosis.
 
Conclusions: Upper-extremity deep vein thrombosis is uncommon among symptomatic Chinese population. The most common sign is swelling and the major risk factor for upper-extremity deep vein thrombosis identified in this study is malignancy.
 
 
New knowledge added by this study
  •  Data suggest that upper-extremity deep vein thrombosis among ethnic Chinese differs from that in western populations.
Implications for clinical practice or policy
  •  Patients with a history of malignancy should be given priority for ultrasound screening of upper-extremity deep vein thrombosis.
 
 
Introduction
It has long been held at United Christian Hospital in Hong Kong that requests for upper-extremity vein sonography to screen for deep vein thrombosis (DVT) are rare. This may be because upper-extremity deep vein thrombosis (UEDVT) was considered a benign and non-urgent condition. However, UEDVT carries risks such as pulmonary embolism (PE), and leads to morbidity and mortality. Understanding its associated risk factors would therefore help improve the ability to predict and prevent PE.
 
In the past decade, most research focused on the identification and management of lower-extremity deep vein thrombosis (LEDVT), because UEDVT was believed to be clinically insignificant and quite rare, representing less than 2% of DVT.1 More recent studies have challenged this belief.3 4 5 A study by Baarslag et al2 in 2004, for example, reported that around half of their patients with UEDVT died during the follow-up period. In 2004, Chan et al6 compared Chinese and Caucasian patients and showed that the prevalence of LEDVT differed between the two populations (9.1% proximal LEDVT without prophylaxis for Chinese vs 16% proximal LEDVT with prophylaxis for Caucasians). These findings suggested that a study to assess the prevalence of UEDVT in the Chinese population was needed.
 
There are many imaging strategies to aid the diagnosis of UEDVT. Contrast venography and computed tomography (CT) venography require the injection of contrast agents and involve radiation. Magnetic resonance venography involves no radiation and can be performed without contrast injection, but its use is limited by high cost and the inconvenience associated with the procedure. Colour duplex sonography, on the other hand, is relatively cheap and more easily available, and provides excellent sensitivity and specificity: in a study by Köksoy et al,7 these were 94% and 96%, respectively. According to these authors, the downside is that this technique cannot completely exclude the presence of thrombus in the axillary, subclavian, superior vena cava, or brachiocephalic vessels.7 The presence of UEDVT in these segments may only be inferred from secondary signs such as absence of respiratory variation and cardiac pulsatility.8 In view of its safety and cost-effectiveness, duplex sonography is usually preferred as the first-line imaging technique in the evaluation of UEDVT.
 
The aims of this study were to determine the prevalence and characteristics of sonographically evident UEDVT in symptomatic Chinese patients and identify the associated risk factors.
 
Methods
Methodology
A retrospective study was conducted in a regional hospital in a district where the socio-economic status was similar to that of the rest of the population in Hong Kong.9 The study sample comprised patients undergoing an initial duplex sonography of the upper extremity for suspected UEDVT during the period November 1999 to October 2012. The study began with a search of the computerised Radiology Information System of the Hong Kong Hospital Authority to identify patients who underwent duplex sonography of upper- or lower-extremity veins. From the radiology reports, positive cases of DVT (both UEDVT and LEDVT) were identified using the key words “incomplete compressibility”, “non-compressible”, “incompressible”, “not compressible”, or “compressibility: (no)”. The search was further narrowed to retrieve the reports and images of all upper-extremity vein sonography examinations using key words such as “upper extremity vein” or “upper limb vein”.
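
To make the report-filtering step concrete, the following is a minimal Python sketch of keyword-based screening of radiology report text. The record structure, field names, and sample reports are hypothetical illustrations and are not drawn from the actual Radiology Information System.

POSITIVE_KEYWORDS = [
    "incomplete compressibility", "non-compressible", "incompressible",
    "not compressible", "compressibility: (no)",
]
UPPER_LIMB_KEYWORDS = ["upper extremity vein", "upper limb vein"]

def is_positive_dvt(report_text):
    # Flag a report as a positive DVT case if any positivity keyword appears in it
    text = report_text.lower()
    return any(keyword in text for keyword in POSITIVE_KEYWORDS)

def is_upper_limb_study(report_text):
    # Retain a report as an upper-extremity study if it mentions upper limb veins
    text = report_text.lower()
    return any(keyword in text for keyword in UPPER_LIMB_KEYWORDS)

# Hypothetical report records for illustration only
reports = [
    {"id": "A001", "text": "Upper limb vein study: non-compressible segment in the axillary vein."},
    {"id": "A002", "text": "Lower limb veins fully compressible; no evidence of DVT."},
]

uedvt_positive = [r["id"] for r in reports
                  if is_upper_limb_study(r["text"]) and is_positive_dvt(r["text"])]
print(uedvt_positive)  # ['A001']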
 
Since the demographic profile of Hong Kong is mainly ethnic Chinese, our study included only Chinese patients who underwent initial upper-extremity sonography for the detection of UEDVT within the defined period. Studies that were incomplete for any reason and patients who had a positive finding of UEDVT from a previous scan were excluded. Medical record search was performed for the selected patients through the electronic Patient Record System.
 
Data collection and analysis
The medical records were reviewed and data on patient demographic characteristics, possible risk factors, and co-morbidities were collected. All confidential patient data were de-identified and each patient was assigned a study number before analysis. Standardised data collection charts were used to gather information, and details of information recorded are shown in Table 1.
 

Table 1. Summary of data recorded in data collection chart
 
The radiology reports and images were reviewed by two qualified radiologists, each with more than 10 years of experience. The diagnosis of UEDVT was primarily based on incomplete compressibility of the veins on sonography.3 When Doppler evaluation was used, absence of flow, lack of respiratory variation, or lack of cardiac pulsatility were used as secondary diagnostic criteria.3 Central lines were considered to be present if mentioned in the sonography report or the medical record, or if documented on chest radiography, venography, CT, or another imaging modality within 4 weeks prior to sonography. Catheter size and material were not considered or correlated, as such information was not readily available retrospectively. UEDVT in patients with a history of vigorous exercise within the preceding 4 weeks was classified as effort-related.10 Conversely, when no forceful activity of the limb or predisposing factor was observed before the onset of symptoms, UEDVT was classified as idiopathic or spontaneous.9 Any discrepancies in the report or findings were resolved by consensus between the two reviewing radiologists.
 
Preliminary data analysis was performed using descriptive statistics. The mean age of patients and the frequency distribution by gender were calculated for the UEDVT-negative and UEDVT-positive groups. The t test was used to examine the difference in age between the two groups, and P<0.05 was regarded as significant. The frequency distributions of signs and symptoms including swelling, extremity discomfort, erythema, dyspnoea, chest pain, and cough were compared between the two groups, and the frequency proportions of the variables in the two groups were calculated. Variables including age, sex, history of smoking and LEDVT, major surgery within 30 days, immobilisation within 30 days, cancer (history of malignancy), associated CVC (central venous or indwelling catheter), hypertension, diabetes mellitus, sepsis within 30 days, and stroke within 30 days were tested using binary logistic regression. Using backward stepwise logistic regression, the variables with the highest P values were eliminated one by one until all the remaining variables had P≤0.2; P<0.05 was considered significant. The most prevalent risk factor in the UEDVT-positive group was identified and compared with data from Caucasian populations. All statistical analyses were performed using the Statistical Package for the Social Sciences (Windows version 19.0; SPSS Inc, Chicago [IL], US).
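
As an illustration of the backward stepwise procedure described above, the following Python sketch (using statsmodels) repeatedly drops the predictor with the largest P value until all remaining P values are ≤0.2. The data frame, variable names, and values are synthetic placeholders, not the study data, and the sketch is not the authors' actual analysis code.

import numpy as np
import pandas as pd
import statsmodels.api as sm

rng = np.random.default_rng(0)
n = 213
# Synthetic predictors loosely mimicking the study variables (illustration only)
X = pd.DataFrame({
    "age": rng.normal(65, 15, n),
    "female": rng.integers(0, 2, n),
    "malignancy": rng.integers(0, 2, n),
    "smoking": rng.integers(0, 2, n),
    "catheter": rng.integers(0, 2, n),
})
y = pd.Series(rng.integers(0, 2, n), name="uedvt")  # 1 = UEDVT positive

def backward_stepwise_logit(X, y, p_keep=0.2):
    """Drop the predictor with the largest P value until all remaining P values are <= p_keep."""
    cols = list(X.columns)
    while cols:
        model = sm.Logit(y, sm.add_constant(X[cols])).fit(disp=0)
        pvals = model.pvalues.drop("const")
        if pvals.max() <= p_keep:
            return model
        cols.remove(pvals.idxmax())   # eliminate the least significant variable
    return None

final_model = backward_stepwise_logit(X, y)
if final_model is not None:
    print(final_model.summary())
    print(np.exp(final_model.params))  # odds ratios (the intercept term can be ignored)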
 
Results
Between November 1999 and October 2012, 11 019 patients had undergone upper- or lower-extremity vein ultrasound examinations in the hospital. The majority of requests (10 783 patients, 97.9%) were for lower-extremity vein ultrasound. During that period, DVT (UEDVT or LEDVT) was diagnosed by ultrasound in 822 (7.6%) patients, of whom 34 (4.1%) had UEDVT and 788 (95.9%) had LEDVT.
 
Overall, there were 236 upper-extremity vein ultrasound requests, of which 23 patients (5 of whom had UEDVT) were excluded because the examination was not an initial upper-extremity vein sonography. A total of 213 patients were included in the study sample; UEDVT was diagnosed in 29 (13.6%) of them (Fig). Therefore, the proportion of UEDVT diagnosed by initial ultrasound was only 0.26% (29/11 019) of all DVT (upper and lower extremity) ultrasound requests. The demographic characteristics of patients in the UEDVT-negative and UEDVT-positive groups are shown in Table 2.
 

Figure. Ultrasound images of (a) a patient diagnosed with breast carcinoma: it shows lack of colour signals inside the vein (thrombus formation); and (b) a patient with colon carcinoma in bed-bound palliative care: it shows large thrombus inside the vein lumen
 

Table 2. Age and sex distribution of patients
 
When the age distribution of the two groups was compared using the t test, the difference was not significant (P=0.06). In the UEDVT-negative group, 74 (40.2%) patients were male and 110 (59.8%) were female, with no significant difference in age distribution between the two genders (P=0.394). In the UEDVT-positive group, 15 (51.7%) patients were male and 14 (48.3%) were female; the age distribution between the two genders in this group was also not significantly different on t test (P=0.257).
 
The frequency distributions of the signs and symptoms in the two groups are summarised in Table 3. Upper limb swelling was the most common presentation in the UEDVT-negative group, seen in 178 (96.7%) patients. It was also the most common sign in the UEDVT-positive group, present in 28 (96.6%) patients.
 

Table 3. Frequency distribution of signs and symptoms in both UEDVT-negative and -positive groups
 
Statistical analysis and frequency proportions of variables in the two groups are summarised in Table 4. In the UEDVT-negative group, history of cancer, hypertension, and diabetes mellitus appeared to be the more prevalent variables, seen in 82 (44.6%), 81 (44.0%), and 47 (25.5%) patients, respectively. On the other hand, among the 29 patients in the UEDVT-positive group, history of smoking, history of cancer, and hypertension were the prevalent risk factors, seen in 11 (37.9%), 19 (65.5%), and 8 (27.6%) patients, respectively.
 

Table 4. Statistical analysis and frequency proportion of variables in the UEDVT-negative and -positive groups
 
Binary logistic regression was used to test the variables (Table 4). There were no statistically significant predictors of UEDVT when all variables were included. There was a trend towards a higher risk of UEDVT in patients with a history of malignancy (odds ratio [OR]=2.250, P=0.071), but this was not statistically significant. Backward stepwise regression was performed to eliminate the independent variables with the highest P values until all remaining variables had P≤0.2. The final regression model retained only age, sex, and history of malignancy as independent variables, as the other variables persistently showed high P values (Table 5).
 

Table 5. Analysis of risk factors for UEDVT (remaining variables after backward stepwise regression)
 
In this study, the remaining variables in the model were age (P=0.119), female gender (P=0.114), and history of malignancy (P=0.024). History of malignancy remained predictive of UEDVT, and positive history of malignancy had an OR of 2.664 (95% confidence interval, 1.140-6.211) for the presence of UEDVT. In the UEDVT-positive group, there was no obvious predisposing cause observed in three patients. Therefore, these three (10.3%) patients were classified as having primary UEDVT, while the remaining 26 (89.7%) patients were classified as secondary UEDVT.
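
For readers less familiar with logistic regression output, the odds ratio and its confidence interval are derived from the regression coefficient in the standard way; the following is a generic worked illustration of that algebra, not part of the authors' analysis.

\[
\mathrm{OR}=e^{\hat{\beta}}, \qquad 95\%\ \mathrm{CI}=e^{\hat{\beta}\pm 1.96\,\mathrm{SE}(\hat{\beta})}
\]

For the malignancy variable, \(\hat{\beta}=\ln(2.664)\approx 0.98\); the reported interval of 1.140 to 6.211 corresponds to \(\mathrm{SE}(\hat{\beta})\approx 0.43\), giving a Wald statistic \(z\approx 0.98/0.43\approx 2.3\) and a two-sided P value of approximately 0.02, consistent with the reported P=0.024.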
 
Discussion
In our study, the number of UEDVT cases diagnosed during the 13-year period using initial sonography was about 2.2 patients per year. As stated earlier, it has been a long-held perspective that UEDVT screening was a rare request in our hospital, and this is clearly evident from this study. Requests for UEDVT sonography constituted only 2.1% (236/11 019) of all extremity (upper and lower) vein ultrasound requests. The proportion of UEDVT diagnosed by initial ultrasound was only 0.26% of all DVT (upper and lower extremity) ultrasound requests, and therefore very rare.
 
Among the 29 patients with UEDVT in our study, three had no obvious predisposing cause. One healthy 32-year-old man reported developing symptoms after exercise, so this case was classified as primary effort-related thrombosis. Effort-related UEDVT typically affects young, healthy individuals, with a male-to-female ratio of approximately 2:1.11 The incidence is higher in males; similar findings were observed in this study, in which males were also younger than females. Pain and swelling are commonly present in patients with UEDVT, as shown in a study by Mustafa et al.4 Similarly, swelling was the most prevalent sign of UEDVT in our study, seen in 96.6% of patients.
 
In our study, the prevalence of UEDVT among those undergoing ultrasound examination for suspected UEDVT was 13.6%, which is lower than that reported in studies conducted in Caucasian populations (18%,12 40%,13 25%,14 and 40%5). We also observed that fewer patients in our study sample had indwelling catheters compared with other studies (10.3% vs 11.6%,13 12%,12 23%,14 and 57%5). Earlier reports by Joffe et al3 suggested that an indwelling catheter was the strongest predictor of UEDVT, and this may explain the lower prevalence in our study compared with other studies.
 
Overall, history of smoking (37.9%), malignancy (65.5%), and hypertension (27.6%) were the common risk factors in the UEDVT-positive group of our study (Table 4). Statistical analysis showed that a history of malignancy remained predictive of UEDVT. Malignancy was thus a major risk factor for UEDVT in our study, similar to studies conducted in Caucasian populations.1 3 4 Indeed, the frequency of cancer (65.5%) in our study was even higher than that in Caucasian populations in other studies, at 43%,15 30%,16 38%,17 and 45%.4
 
Similar studies of Chinese populations have been published. Chen et al18 investigated differences in limb, age, and sex among Chinese patients with LEDVT. Abdullah et al19 studied the incidence of UEDVT associated with peripherally inserted central catheters. Liu et al20 estimated the incidence of venous thromboembolism, rather than UEDVT specifically, in a Hong Kong regional hospital. However, no study has compared the prevalence of UEDVT between Chinese and western populations. The present study highlights malignancy as the major risk factor for UEDVT. In a resource-limited health care system, patients with a history of malignancy should be prioritised in the triage of symptomatic patients referred for UEDVT screening, because malignancy is a major predictor of UEDVT and carries a risk of PE. Such prioritisation would benefit patients with UEDVT, who could then be identified and treated early.
 
Limitations
This was a retrospective study, and data were collected only from what was available in the medical records. The frequency of UEDVT reported might therefore grossly underestimate the true number, because signs and symptoms of UEDVT are usually non-specific and, as reported in prospective studies, many patients with UEDVT may remain completely asymptomatic.21
 
In our study, the diagnosis of UEDVT was made solely by ultrasound. Studies have shown that ultrasound imaging has excellent sensitivity and specificity for LEDVT,22 23 with sensitivity reaching 97% to 100% and specificity of 98% to 99%.18 However, previous studies have reported lower sensitivity and specificity for upper-extremity ultrasound, at 78% to 100% and 82% to 100%, respectively.18 19 There are several possible reasons why sensitivity and specificity for detecting UEDVT are lower than those for LEDVT. The main reason is anatomical: the sternum and clavicle create acoustic shadowing or artefact on ultrasound imaging, limiting visualisation of the proximal upper-extremity veins.3 It is also difficult to visualise the centrally situated veins, such as the medial segment of the subclavian vein, the brachiocephalic vein, and their confluence with the superior vena cava.24 Moreover, the presence of a catheter might not only alter venous tone but also affect venous flow, making the Doppler findings more difficult to interpret. Furthermore, differentiation between a normal vein and a large collateral in a patient with chronic venous thrombosis might sometimes be difficult.20 Another limitation of our study was the relatively small sample size, especially of catheter-related cases; such small numbers might preclude subgroup analysis and lower the statistical power for identifying risk factors.
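
For reference, the sensitivity and specificity figures quoted above follow the standard definitions based on true and false positives and negatives against a reference standard such as contrast venography; these are generic definitions, not study-specific calculations.

\[
\text{sensitivity}=\frac{TP}{TP+FN}, \qquad \text{specificity}=\frac{TN}{TN+FP}
\]

where TP, FN, TN, and FP denote true positives, false negatives, true negatives, and false positives, respectively.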
 
Conclusions
The major risk factor for UEDVT identified from this study is malignancy. Therefore, patients with a history of malignancy should be prioritised in the triage of symptomatic patients referred for UEDVT screening because malignancy is a major predictor of UEDVT and carries risk for PE.
 
References
1. Tilney ML, Griffiths HJ, Edwards EA. Natural history of major venous thrombosis of the upper extremity. Arch Surg 1970;101:792-6. Crossref
2. Baarslag HJ, Koopman MM, Hutten BA, et al. Long-term follow-up of patients with suspected deep vein thrombosis of the upper extremity: survival, risk factors and post-thrombotic syndrome. Eur J Intern Med 2004;15:503-7. Crossref
3. Joffe HV, Kucher N, Tapson VF, Goldhaber SZ; Deep Vein Thrombosis (DVT) FREE Steering Committee. Upper-extremity deep vein thrombosis: a prospective registry of 592 patients. Circulation 2004;110:1605-11. Crossref
4. Mustafa S, Stein PD, Patel KC, Otten TR, Holmes R, Silbergleit A. Upper extremity deep venous thrombosis. Chest 2003;123:1953-6. Crossref
5. Giess CS, Thaler H, Bach AM, Hann LE. Clinical experience with upper extremity sonography in a high-risk cancer population. J Ultrasound Med 2002;21:1365-70.
6. Chan YK, Chiu KY, Cheng SW, Ho P. The incidence of deep vein thrombosis in elderly Chinese suffering hip fracture is low without prophylaxis: a prospective study using serial duplex ultrasound. J Orthop Surg (Hong Kong) 2004;12:178-83.
7. Köksoy C, Kuzu A, Kutlay J, Erden I, Ozcan H, Ergîn K. The diagnostic value of colour Doppler ultrasound in central venous catheter related thrombosis. Clin Radiol 1995;50:687-9. Crossref
8. Marshall PS, Cain H. Upper extremity deep vein thrombosis. Clin Chest Med 2010;31:783-97. Crossref
9. Statistical tables of the 2006 population by-census. Available from: http://www.bycensus2006.gov.hk/en/data/data3/statistical_tables/index.htm#A2. Accessed 9 Dec 2014.
10. Joffe HV, Goldhaber SZ. Upper-extremity deep vein thrombosis. Circulation 2002;106:1874-80. Crossref
11. Illig KA, Doyle AJ. A comprehensive review of Paget-Schroetter syndrome. J Vasc Surg 2010;51:1538-47. Crossref
12. Kerr TM, Lutter KS, Moeller DM, et al. Upper extremity venous thrombosis diagnosed by duplex scanning. Am J Surg 1990;160:202-6. Crossref
13. Kröger K, Schelo C, Gocke C, Rudofsky G. Colour Doppler sonographic diagnosis of upper limb venous thromboses. Clin Sci (Lond) 1998;94:657-61.
14. Lee JA, Zierler BK, Zierler RE. The risk factors and clinical outcomes of upper extremity deep vein thrombosis. Vasc Endovascular Surg 2012;46:139-44. Crossref
15. Marinella MA, Kathula SL, Markert RJ. Spectrum of upper-extremity deep venous thrombosis in a community teaching hospital. Heart Lung 2000;29:113-7. Crossref
16. Isma N, Svensson PJ, Gottsäter A, Lindblad B. Upper extremity deep venous thrombosis in the population-based Malmö thrombophilia study (MATS). Epidemiology, risk factors, recurrence risk, and mortality. Thromb Res 2010;125:335-8. Crossref
17. Muñoz FJ, Mismetti P, Poggio R, et al. Clinical outcome of patients with upper-extremity deep vein thrombosis: results from the RIETE Registry. Chest 2008;133:143-8. Crossref
18. Chen F, Xiong JX, Zhou WM. Differences in limb, age and sex of Chinese deep vein thrombosis patients. Phlebology 2014 Feb 14. Epub ahead of print. Crossref
19. Abdullah BJ, Mohammad N, Sangkar JV, et al. Incidence of upper limb venous thrombosis associated with peripherally inserted central catheters (PICC). Br J Radiol 2005;78:596-600. Crossref
20. Liu HS, Kho BC, Chan JC, et al. Venous thromboembolism in the Chinese population—experience in a regional hospital in Hong Kong. Hong Kong Med J 2002;8:400-5.
21. Luciani A, Clement O, Halimi P, et al. Catheter-related upper extremity deep venous thrombosis in cancer patients: a prospective study based on Doppler US. Radiology 2001;220:655-60. Crossref
22. Prandoni P, Polistena P, Bernardi E, et al. Upper-extremity deep vein thrombosis. Risk factors, diagnosis, and complications. Arch Intern Med 1997;157:57-62. Crossref
23. Baarslag HJ, van Beek EJ, Koopman MM, Reekers JA. Prospective study of color duplex ultrasonography compared with contrast venography in patients suspected of having deep venous thrombosis of the upper extremities. Ann Intern Med 2002;136:865-72. Crossref
24. Chin EE, Zimmerman PT, Grant EG. Sonographic evaluation of upper extremity deep venous thrombosis. J Ultrasound Med 2005;24:829-38; quiz 839-40.

Prospective study on the effects of orthotic treatment for medial knee osteoarthritis in Chinese patients: clinical outcome and gait analysis

Hong Kong Med J 2015 Apr;21(2):98–106 | Epub 10 Mar 2015
DOI: 10.12809/hkmj144311
© Hong Kong Academy of Medicine. CC BY-NC-ND 4.0
 
ORIGINAL ARTICLE
Prospective study on the effects of orthotic treatment for medial knee osteoarthritis in Chinese patients: clinical outcome and gait analysis
Henry CH Fu, MB, BS, MMedSc1; Chester WH Lie, FRCS (Edin), FHKAM (Orthopaedic Surgery)2; TP Ng, FRCS (Edin), FHKAM (Orthopaedic Surgery)3; KW Chen, BSc4; CY Tse, BSc4; WH Wong, Diploma in Prosthetics and Orthotics4
1 Department of Orthopaedics and Traumatology, Queen Mary Hospital, Pokfulam, Hong Kong
2 Department of Orthopaedics and Traumatology, Kwong Wah Hospital, Yaumatei, Hong Kong
3 Private Practice, Hong Kong
4 Department of Prosthetics and Orthotics, Queen Mary Hospital, Pokfulam, Hong Kong
Corresponding author: Dr Chester WH Lie (chesterliewh@gmail.com)
 Full paper in PDF
Abstract
Objective: To evaluate the effectiveness of various orthotic treatments for patients with isolated medial compartment osteoarthritis.
 
Design: Prospective cohort study with sequential interventions.
 
Setting: University-affiliated hospital, Hong Kong.
 
Patients: From December 2010 to November 2011, 10 patients with medial knee osteoarthritis were referred by orthopaedic surgeons for orthotic treatment. All patients were sequentially treated with flat insole, lateral-wedged insole, lateral-wedged insole with subtalar strap, lateral-wedged insole with arch support, valgus knee brace, and valgus knee brace with lateral-wedged insole with arch support for 4 weeks with no treatment break. Three-dimensional gait analysis and questionnaires were completed after each orthotic treatment.
 
Main outcome measures: The Western Ontario and McMaster Universities Arthritis Index (WOMAC), visual analogue scale scores, and peak and mean knee adduction moments.
 
Results: Compared with pretreatment, the lateral-wedged insole, lateral-wedged insole with arch support, and valgus knee brace groups demonstrated significant reductions in WOMAC pain score (19.1%, P=0.04; 18.2%, P=0.04; and 20.4%, P=0.02, respectively). The lateral-wedged insole with arch support group showed the greatest reduction in visual analogue scale score compared with pretreatment at 24.1% (P=0.004). Addition of a subtalar strap to lateral-wedged insoles (lateral-wedged insole with subtalar strap) did not produce significant benefit when compared with the lateral-wedged insole alone. The valgus knee brace with lateral-wedged insole with arch support group demonstrated an additive effect with a statistically significant reduction in WOMAC total score (-26.7%, P=0.01). Compliance with treatment was over 90% in all the isolated insole groups, but only around 50% in the valgus knee brace–associated groups. Gait analysis indicated statistically significant reductions in peak and mean knee adduction moments in all orthotic groups when compared with a flat insole.
 
Conclusions: These results support the use of orthotic treatment for early medial compartment knee osteoarthritis.
 
 
New knowledge added by this study
  •  Our data support the use of the lateral-wedged insole with arch support and valgus knee brace in the management of medial compartment osteoarthritis of the knee; however, compliance with the valgus knee brace is fair. Gait analysis showed that both supports can reduce the knee adduction moment during walking.
Implications for clinical practice or policy
  •  Lateral-wedged insoles with arch support and valgus knee brace can be considered for patients with medial compartment osteoarthritis of the knee.
 
 
Introduction
Osteoarthritis of the knee is the commonest type of arthritis affecting the geriatric population. Conservative treatment with physiotherapy and analgesics provides only temporary relief of symptoms, while surgical intervention such as high tibial osteotomy, unicompartmental knee replacement, or total knee replacement is a major undertaking and not without risk.1 2 The medial compartment is more commonly affected than the lateral compartment in osteoarthritis (67% and 17%, respectively).3 Varus alignment of the lower limbs increases the risk of incident knee osteoarthritis and also increases the risk of disease progression in patients with osteoarthritis.4 Apart from static lower limb alignment, dynamic varus thrust during the gait cycle is also independently associated with progression of knee osteoarthritis.5 Knee adduction moment (KAM) is an indirect means of assessing varus thrust during the gait cycle, and previous studies have demonstrated its validity for predicting clinical and radiological osteoarthritis progression.6
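
A common simplification used in the gait literature (not a formula given in the cited studies) expresses the external KAM in the frontal plane as the product of the ground reaction force and its lever arm about the knee joint centre:

\[
\mathrm{KAM}\approx F_{\mathrm{GRF}}\times d,
\]

where \(F_{\mathrm{GRF}}\) is the ground reaction force and \(d\) is its frontal-plane moment arm relative to the knee joint centre. Lateral wedges, for example, shift the centre of pressure laterally and thereby shorten \(d\), whereas a valgus brace applies an external abduction moment directly to the knee.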
 
Orthotic treatment can alter loading to the knee in the hope of reducing symptoms and disease progression. Biomechanical studies have demonstrated a small effect size in reduction of KAM with a valgus knee brace7 8 9 10 and lateral-wedged insoles.11 12 13 14 This study is the first to sequentially evaluate the clinical outcomes and gait analyses of different orthotic treatments in Chinese patients with medial compartment osteoarthritis.
 
Methods
Patients
From December 2010 to November 2011, 18 patients with isolated medial osteoarthritis of the knee were referred by orthopaedic surgeons to the Department of Prosthetics and Orthotics at Queen Mary Hospital for orthotic treatment.
 
The inclusion criteria were age older than 50 years and a diagnosis of osteoarthritis according to the American College of Rheumatology criteria.15 The predominant symptom needed to be medial knee pain. Radiographical features needed to include varus knee alignment and osteoarthritis of Kellgren-Lawrence grade 2 or above over the medial compartment.16
 
Our study population comprised patients with isolated medial compartment osteoarthritis, while patients with predominant lateral compartment or patellofemoral joint symptoms or those with radiographical features of osteoarthritis of Kellgren-Lawrence grade 2 or above over the lateral compartment or patellofemoral joint were excluded.
 
Patients who had undergone previous knee surgery, had a fixed flexion deformity of >10°, had hip or ankle pathology, required a walking aid, or had morbid obesity (body mass index >40 kg/m2), a dermatological condition, or peripheral vascular disease were also excluded.
 
This was a non-randomised prospective cohort study with a cross-over design. All 10 patients were sequentially treated with a flat insole (FI), lateral-wedged insole (LW), lateral-wedged insole with subtalar strap (LW+SS), lateral-wedged insole with arch support (LWAS), valgus knee brace (VKB), and valgus knee brace with lateral-wedged insole with arch support (VKB+LWAS). The FI group acted as a control during gait analysis to mimic normal walking. The designs of the orthotics are shown in Figure 1. The insoles were custom-made in the Department of Prosthetics and Orthotics at Queen Mary Hospital, while the Unloader valgus knee braces (Össur hf, Reykjavik, Iceland) were ordered for each patient after measurement. Each of the orthotic treatments was prescribed for 4 weeks and each patient underwent 24 weeks of treatment to use all six orthotics.
 

Figure 1. Various orthotic treatments: (a) valgus knee brace, (b and c) lateral-wedged insole with subtalar strap, (d) lateral-wedged insole, and (e and f) lateral-wedge with arch support
 
For subjective clinical outcomes, pain scores using the visual analogue scale (VAS) and version 3.1 of the Chinese-validated Western Ontario and McMaster Universities Arthritis Index (WOMAC) were measured. The VAS, with a scale from 0 to 10, was used purely for pain severity. The WOMAC score was ascertained by a self-administered questionnaire consisting of 24 items and subdivided into three categories: pain (5 items), stiffness (2 items), and difficulty performing daily activities (17 items). Analgesic use (number of times required per week) was also compared. Pretreatment and interval assessments were completed after each orthotic treatment. Paired t test was used for analysis.
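
As a minimal sketch of how such a pretreatment versus post-treatment comparison can be run, the following Python snippet applies a paired t test; the score values are synthetic and purely illustrative, not the study data.

import numpy as np
from scipy import stats

rng = np.random.default_rng(1)
# Hypothetical WOMAC pain scores for 19 knees before and after one orthotic treatment
pre = rng.uniform(4, 12, 19)
post = pre - rng.normal(1.0, 1.0, 19)   # simulated post-treatment improvement

t_stat, p_value = stats.ttest_rel(pre, post)              # paired t test
pct_change = 100 * (post.mean() - pre.mean()) / pre.mean()  # mean percentage change
print(f"t = {t_stat:.2f}, P = {p_value:.3f}, mean change = {pct_change:.1f}%")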
 
Gait analysis
Three-dimensional gait analyses were performed for each patient both before and during use of each orthotic treatment at the gait laboratory at the Duchess of Kent Children’s Hospital, Hong Kong, which is an affiliated hospital within the same cluster as Queen Mary Hospital.
 
Fifteen retro-reflective markers were placed according to the Plug in Gait model (Vicon Industries, Inc, Edgewood [NY], US) as shown in Figure 2. The markers were placed at the bilateral anterior superior iliac spines, midway between the posterior superior iliac spines, lateral epicondyle of the knee, lateral lower third of the thigh, lateral malleolus, lower third of the shin, second metatarsal head, and calcaneus at the level of the second metatarsal head. Three-dimensional positions of the markers and kinematic data were collected by six cameras using the 370 motion analysis system (Vicon Industries, Inc) at a sampling frequency of 60 Hz. Kinetic data were collected using the 370 motion analysis system synchronised with a multicomponent force platform (Kistler, Winterthur, Switzerland) at 60 Hz.
 

Figure 2. Placement of retro-reflective markers (arrows) for gait analysis
 
Peak and mean KAMs during the stance phase of the gait cycle were measured. Mechanical alignment throughout the gait cycle was derived from the hip, knee, and ankle centres determined from the retro-reflective markers. After data collection in the gait analysis laboratory, data were analysed jointly by orthopaedic surgeons and prosthetic and orthotic specialists with a background in biomedical engineering. Gait analysis comparisons were made with the FI group and with baseline control data, on the assumption that a flat insole would not alter knee kinematics. The control data from the gait laboratory consisted of 47 age-matched healthy participants with a normal gait pattern.
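
The following Python sketch illustrates one simple way in which a frontal-plane alignment angle can be derived from the hip, knee, and ankle centres. It is a simplified illustration under assumed laboratory axes (x medio-lateral, z vertical) and does not reproduce the Plug in Gait model's own conventions; the joint-centre coordinates are hypothetical.

import numpy as np

def frontal_plane_deviation(hip, knee, ankle):
    """Deviation (degrees) of the hip-knee-ankle line from straight,
    projected onto the frontal plane; 0 means perfect alignment.
    Whether the deviation is varus or valgus depends on limb side,
    which is omitted here for brevity."""
    def frontal(p):                        # keep medio-lateral (x) and vertical (z) components
        return np.array([p[0], p[2]], dtype=float)
    femur = frontal(hip) - frontal(knee)    # knee -> hip
    tibia = frontal(ankle) - frontal(knee)  # knee -> ankle
    cos_angle = np.dot(femur, tibia) / (np.linalg.norm(femur) * np.linalg.norm(tibia))
    inter_segment = np.degrees(np.arccos(np.clip(cos_angle, -1.0, 1.0)))
    return 180.0 - inter_segment

# Hypothetical joint centres (metres) in laboratory coordinates
hip, knee, ankle = (0.10, 0.00, 0.90), (0.13, 0.00, 0.50), (0.10, 0.00, 0.10)
print(round(frontal_plane_deviation(hip, knee, ankle), 1))  # about 8.6 degrees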
 
Paired t tests were used for comparison of different gait parameters between the orthotic type and baseline measurement.
 
Results
Eighteen patients (36 knees) were initially recruited into our study. Nineteen knees of 10 patients completed the study, and the remaining eight patients withdrew for personal reasons. Of the 10 patients, nine had bilateral disease and one had unilateral disease. Ten knees were right knees and nine were left knees. There were six women and four men. The mean age of the patients was 56 years (range, 51-65 years). The pretreatment motion arc ranged from 65° to 140° (mean, 122°).
 
The changes in mean WOMAC and VAS scores for various orthotic treatments and their comparison with pretreatment scores are shown in Table 1. The results of mean and peak KAMs throughout the gait cycle with different orthotics are shown in Figure 3a. The mean and peak KAMs for each orthotic are shown in Tables 2 and 3, respectively. Figure 3b shows the knee mechanical alignment derived from the hip centre, knee centre, and ankle centre. The initial 65% of the gait cycle represents the stance phase and the later 35% is the swing phase. Compliance with the orthotic treatments is shown in Figure 4.
 

Table 1. Comparison of subjective pretreatment and post-treatment scores for various orthotics
 

Table 2. Mean knee adduction moment for various orthotic treatments
 

Table 3. Peak knee adduction moment for various orthotic treatments
 

Figure 3. (a) Comparison of knee adduction moments with different orthotic treatments throughout the gait cycle. (b) Comparison of mean mechanical alignment with different orthotic treatments
 

Figure 4. Compliance with orthotic treatments expressed as percentage of total walking time
 
The LW group demonstrated a significant reduction of 19.1% in the WOMAC pain score (P=0.04). Reductions in total and other WOMAC subscale scores, VAS score, and analgesic requirement were observed, but none were statistically significant. Mean and peak KAMs were reduced by 18.1% and 13.1% (P<0.05), respectively, when compared with the FI group. The compliance rate was 94.7% of total walking time.
 
With the addition of subtalar strapping in the hope of increasing the effectiveness of the lateral wedge, the LW+SS group demonstrated a greater reduction of peak KAM (18.8%), but a smaller degree of reduction in mean KAM (17.6%) [P<0.05]. The net effect of LW+SS did not confer any statistically significant reduction in VAS score, WOMAC score, or analgesic requirement when compared with the pretreatment scores. The compliance rate for the LW+SS group was 94.7% of total walking time.
 
The LWAS group demonstrated statistically significant reduction in VAS score of 24.1% (P=0.004) and WOMAC pain score of 18.2% (P=0.04). Mean and peak KAMs were also significantly reduced by 9.7% and 13.7%, respectively (P<0.05). The degree of reduction in VAS score was greatest in the LWAS group when compared with the LW and LW+SS groups. Score of VAS may be a more reliable predictor of actual symptom improvement than the WOMAC pain score. The compliance rate was also greatest for the LWAS group at 97.4% of total walking time. No significant difference in analgesic requirement was observed.
 
With respect to mean mechanical alignment, as shown in Figure 3b, all the insole groups (LW, LW+SS, and LWAS) showed lower varus angle throughout the stance phase. The stance phase is the symptomatic phase when the knee is under loading.
 
The VKB group showed statistically significant reductions in VAS score and WOMAC pain score of 15.5% (P=0.04) and 20.4% (P=0.02), respectively. The WOMAC total score and other subscale scores showed some reductions, but these were not statistically significant. The analgesic requirement was also significantly reduced from 1.5 days/week pretreatment to 0.5 days/week post-treatment (P=0.04). Mean and peak KAMs were reduced by 15.5% and 18.9%, respectively (P<0.05). Mechanical alignment, as seen in Figure 3b, showed reduced varus angulation during the early stance phase. The interval between 15% and 20% of the gait cycle, representing the heel strike to mid-stance phase, showed a reduced varus angle compared with baseline. The varus angle then remained constant throughout the stance phase, which was related to restricted motion of the knee inside the brace. Compliance was significantly lower than that for any of the insole groups, at 54.5% of the total walking time. The low compliance rate was likely due to the bulky valgus knee brace causing skin discomfort, especially in the hot and humid climate of this region.
 
The LWAS appeared to be the best insole treatment for pain relief and improvement in VAS score, so we further evaluated the combined effect of the VKB and LWAS treatments. Additive effects were observed with combined treatment. The VKB+LWAS group showed significant reductions in VAS score, as well as in WOMAC total and all subscale scores. The VAS score was reduced by 22.4% (P=0.004), WOMAC pain score by 29.4% (P=0.001), WOMAC stiffness score by 18.8% (P=0.02), WOMAC activities of daily living score by 26.4% (P=0.002), and WOMAC total score by 26.7% (P=0.001). The extent of reduction in the WOMAC total and subscale scores for this group was the greatest among the treatment groups. The analgesic requirement was also significantly reduced from 1.5 days/week pretreatment to 0.6 days/week post-treatment (P=0.04). Peak KAM showed the greatest reduction of all the orthotic groups, at 21.0%, while mean KAM showed a moderate reduction of 16.3% (P<0.05). With regard to mechanical alignment, a reduction in varus angle was observed in the early stance phase, as in the isolated VKB group. Compliance, as expected, was the lowest among all the treatment arms, at only 49.1% of total walking time.
 
Discussion
The current literature recommendations for orthotic treatment for medial compartment knee osteoarthritis are still varied. In a guideline by the Osteoarthritis Research Society International (OARSI), insoles were concluded to be of benefit to reduce pain and improve ambulation in knee osteoarthritis.17 However, in another guideline by the American Academy of Orthopaedic Surgeons (AAOS), it was concluded that lateral-wedged insoles could not be suggested for patients with symptomatic osteoarthritis.18 Lateral-wedged insoles have been shown to correct the femorotibial angle19 and reduce the peak external KAM.12 20 Toda et al21 were able to demonstrate a dose-response correction of the femorotibial angle using insoles with different elevations. The effect on subjective scores showed significant improvements in some,22 but not all studies.23 24 Two randomised controlled trials by Maillefert et al23 and Baker et al24 did not show statistically significant changes in WOMAC scores with lateral-wedged insoles, although there was a significant reduction in non-steroidal anti-inflammatory drug intake in the insole group.
 
Our results showed a reduction in WOMAC pain score with LW and LWAS and, more importantly, a greater percentage reduction in VAS score with LWAS. Addition of subtalar strapping to lateral-wedged insoles was shown in other studies to improve VAS scores and to decrease the femorotibial angle25 and peak KAM26 when compared with a lateral-wedged insole alone. The potential drawbacks of subtalar strapping include increased sole pain.27 The results of our study did not demonstrate an additional benefit from subtalar strapping in terms of WOMAC score or mean KAM. With a significantly greater reduction in VAS with LWAS than with LW (24.1% vs 10.3%) and a high compliance rate, we believe LWAS is the insole of choice and can be offered to patients with early isolated medial compartment knee osteoarthritis.
 
Knee bracing acts by inducing a valgus force through the three-point bending principle. The OARSI guideline suggests that knee bracing can reduce pain, improve stability, and reduce the risk of falls in patients with mild-to-moderate osteoarthritis or valgus instability.17 However, the guideline from the AAOS could not conclude for or against the use of valgus-directed bracing.18 Advantages of knee bracing include avoidance of surgery and its potential complications, while the disadvantages include compliance and the cost of manufacturing the brace.28 A randomised controlled trial by Brouwer et al29 compared three treatment groups: valgus knee brace plus medical treatment, insole plus medical treatment, and medical treatment alone. The brace plus medical treatment was shown to have borderline benefit compared with medical treatment alone in terms of pain score and function.29 These findings concur with our results of improved WOMAC pain subscale score and reduced analgesic requirement with the valgus knee brace when compared with pretreatment scores. From the kinetics perspective, Pollo et al7 demonstrated a reduction in net external KAM of 13%. Our gait analysis model reproduced comparable reductions in KAM (15.5% in mean and 18.9% in peak KAM). Despite the potential benefits of the valgus knee brace, compliance remains a major drawback. With a compliance rate of 54.5%, many of our patients reported that they did not wear the braces outdoors because of skin discomfort in the hot and humid climate. Our evidence would suggest that the valgus knee brace is suitable for selected patients with mild knee osteoarthritis, with consideration of the problems of fitting and compliance.
 
Our study is among the few to evaluate the effects of combination orthotic treatment with a valgus knee brace and a lateral-wedged insole with arch support. The VKB+LWAS group was the only one to demonstrate significant reductions in WOMAC total and all subscale scores, analgesic use, and KAM when compared with pretreatment. These results reinforce the dose-response relationship whereby greater reduction in KAM is accompanied by greater improvement in knee scores. Despite these findings, the poor compliance rate makes this combined orthotic treatment less advisable.
 
Limitations
Limitations of our study included the small sample size, selection bias, self-selection bias, and a short follow-up period. Nonetheless, sample sizes of fewer than 20 patients are common in gait analysis studies.30 31 32 A larger sample size would provide greater power to detect statistical significance in more of the evaluated parameters. Compliance with orthotic treatment, in particular with the valgus knee brace, was another concern. Confounding factors in our study included the frequency of weight-bearing activities, which could be difficult to quantify.
 
This was a cross-over study, with all patients having to be treated sequentially with all six orthotic combinations. The advantages are an economy of sample size without the need to account for heterogeneity of the patient groups. The disadvantages of the design include lack of a treatment break and lack of randomisation in the treatment sequence. Scores of VAS reported by elderly people may also be inaccurate.
 
Conclusions
Knee osteoarthritis continues to pose a significant burden to our community with its ageing population and increasing incidence of obesity. While operative treatment is not without risk, orthotic treatment also has its advantages and disadvantages. Our study was able to demonstrate, from subjective scores and gait analysis, that orthotic treatment can alter knee loading and alleviate symptoms. The lateral-wedged insole with arch support appears to be the insole of choice, while the valgus knee brace is similarly effective but with only fair compliance. Further studies with a larger sample size are required to evaluate long-term effectiveness.
 
References
1. Knutson K, Lindstrand A, Lidgren L. Survival of knee arthroplasties. A nation-wide multicentre investigation of 8000 cases. J Bone Joint Surg Br 1986;68:795-803.
2. Swedish Knee Arthroplasty Registry. SKAR Annual Report; 2011.
3. Ledingham J, Regan M, Jones A, Doherty M. Radiographic patterns and associations of osteoarthritis of the knee in patients referred to hospital. Ann Rheum Dis 1993;52:520-6. Crossref
4. Sharma L, Song J, Dunlop D, et al. Varus and valgus alignment and incident and progressive knee osteoarthritis. Ann Rheum Dis 2010;69:1940-5. Crossref
5. Chang A, Hayes K, Dunlop D, et al. Thrust during ambulation and the progression of knee osteoarthritis. Arthritis Rheum 2004;50:3897-903. Crossref
6. Birmingham TB, Hunt MA, Jones IC, Jenkyn TR, Giffin JR. Test-retest reliability of the peak knee adduction moment during walking in patients with medial compartment knee osteoarthritis. Arthritis Rheum 2007;57:1012-7. Crossref
7. Pollo FE, Otis JC, Backus SI, Warren RF, Wickiewicz TL. Reduction of medial compartment loads with valgus bracing of the osteoarthritic knee. Am J Sports Med 2002;30:414-21.
8. Lindenfeld TN, Hewett TE, Andriacchi TP. Joint loading with valgus bracing in patients with varus gonarthrosis. Clin Orthop Relat Res 1997;344:290-7. Crossref
9. Pagani CH, Böhle C, Potthast W, Brüggemann GP. Short-term effects of a dedicated knee orthosis on knee adduction moment, pain, and function in patients with osteoarthritis. Arch Phys Med Rehabil 2010;91:1936-41. Crossref
10. Toriyama M, Deie M, Shimada N, et al. Effects of unloading bracing on knee and hip joints for patients with medial compartment knee osteoarthritis. Clin Biomech (Bristol, Avon) 2011;26:497-503. Crossref
11. Hinman RS, Bowles KA, Bennell KL. Laterally wedged insoles in knee osteoarthritis: do biomechanical effects decline after one month of wear? BMC Musculoskelet Disord 2009;10:146. Crossref
12. Fantini Pagani CH, Hinrichs M, Brüggemann GP. Kinetic and kinematic changes with the use of valgus knee brace and lateral wedge insoles in patients with medial knee osteoarthritis. J Orthop Res 2012;30:1125-32. Crossref
13. Butler RJ, Marchesi S, Royer T, Davis IS. The effect of a subject-specific amount of lateral wedge on knee mechanics in patients with medial knee osteoarthritis. J Orthop Res 2007;25:1121-7. Crossref
14. Kakihana W, Akai M, Nakazawa K, Naito K, Torii S. Inconsistent knee varus moment reduction caused by a lateral wedge in knee osteoarthritis. Am J Phys Med Rehabil 2007;86:446-54. Crossref
15. Belo JN, Berger MY, Koes BW, Bierma-Zeinstra SM. The prognostic value of the clinical ACR classification criteria of knee osteoarthritis for persisting knee complaints and increase of disability in general practice. Osteoarthritis Cartilage 2009;17:1288-92. Crossref
16. Kellgren JH, Lawrence JS. Radiological assessment of osteo-arthrosis. Ann Rheum Dis 1957;16:494-502. Crossref
17. Zhang W, Moskowitz RW, Nuki G, et al. OARSI recommendations for the management of hip and knee osteoarthritis, Part II: OARSI evidence-based, expert consensus guidelines. Osteoarthritis Cartilage 2008;16:137-62. Crossref
18. Jevsevar DS, Brown GA, Jones DL, et al. The American Academy of Orthopaedic Surgeons evidence-based guideline on: treatment of osteoarthritis of the knee, 2nd edition. J Bone Joint Surg Am 2013;95:1885-6.
19. Yasuda K, Sasaki T. The mechanics of treatment of the osteoarthritic knee with a wedged insole. Clin Orthop Relat Res 1987;215:162-72.
20. Shimada S, Kobayashi S, Wada M, et al. Effects of disease severity on response to lateral wedged shoe insole for medial compartment knee osteoarthritis. Arch Phys Med Rehabil 2006;87:1436-41. Crossref
21. Toda Y, Tsukimura N, Kato A. The effects of different elevations of laterally wedged insoles with subtalar strapping on medial compartment osteoarthritis of the knee. Arch Phys Med Rehabil 2004;85:673-7. Crossref
22. Sasaki T, Yasuda K. Clinical evaluation of the treatment of osteoarthritic knees using a newly designed wedged insole. Clin Orthop Relat Res 1987;221:181-7.
23. Maillefert JF, Hudry C, Baron G, et al. Laterally elevated wedged insoles in the treatment of medial knee osteoarthritis: a prospective randomized controlled study. Osteoarthritis Cartilage 2001;9:738-45. Crossref
24. Baker K, Goggins J, Xie H, et al. A randomized crossover trial of a wedged insole for treatment of knee osteoarthritis. Arthritis Rheum 2007;56:1198-203. Crossref
25. Toda Y, Tsukimura N. A six-month followup of a randomized trial comparing the efficacy of a lateral-wedge insole with subtalar strapping and an in-shoe lateral-wedge insole in patients with varus deformity osteoarthritis of the knee. Arthritis Rheum 2004;50:3129-36. Crossref
26. Kuroyanagi Y, Nagura T, Matsumoto H, et al. The lateral wedged insole with subtalar strapping significantly reduces dynamic knee load in the medial compartment gait analysis on patients with medial knee osteoarthritis. Osteoarthritis Cartilage 2007;15:932-6. Crossref
27. Brouwer RW, Jakma TS, Verhagen AP, Verhaar JA, Bierma-Zeinstra SM. Braces and orthoses for treating osteoarthritis of the knee. Cochrane Database Syst Rev 2005;(1):CD004020.
28. Hanypsiak BT, Shaffer BS. Nonoperative treatment of unicompartmental arthritis of the knee. Orthop Clin North Am 2005;36:401-11. Crossref
29. Brouwer RW, van Raaij TM, Verhaar JA, Coene LN, Bierma-Zeinstra SM. Brace treatment for osteoarthritis of the knee: a prospective randomized multi-centre trial. Osteoarthritis Cartilage 2006;14:777-83. Crossref
30. Foroughi N, Smith RM, Lange AK, Baker MK, Fiatarone Singh MA, Vanwanseele B. Dynamic alignment and its association with knee adduction moment in medial knee osteoarthritis. Knee 2010;17:210-6. Crossref
31. van den Noort JJ, van der Esch M, Steultjens MP, et al. Ambulatory measurement of the knee adduction moment in patients with osteoarthritis of the knee. J Biomech 2013;46:43-9. Crossref
32. Kutzner I, Trepczynski A, Heller MO, Bergmann G. Knee adduction moment and medial contact force—facts about their correlation during gait. PLoS One 2013;8:e81036. Crossref
