Convex-probe EBUS-TBNA has been a major development in respiratory medicine. Over the last decade, numerous articles have supported its high diagnostic accuracy in the diagnosis of lung cancer, staging of lung cancer, and diagnosis of extra-thoracic malignancies and benign conditions (e.g., TB and sarcoidosis)1. Lung cancer remains a common cause of cancer death worldwide2, and various guidelines (including NICE) have found this procedure safe and recommend it for the staging of lung cancer. The patients included in this study reflect the real-life referrals that respiratory physicians see in daily practice, and they show that EBUS-TBNA is increasingly being performed for non-cancer indications. In the last 10 years, many district general hospitals in the UK have started this service, delivered mainly by respiratory physicians.
This has provided a specialist service for patients in their local area, which has reduced travelling and waiting times.
Setting & Methods
In the district general hospital under discussion, the EBUS service was set up in 2018 under the supervision of a tertiary care centre. We carried out 82 procedures during the first year of the service, and all of these cases were reviewed for this article. Data were recorded on an Excel spreadsheet (number of cases, age, gender, lymph node stations sampled, complications, and pathology and microbiology results of EBUS-TBNA). A minimum of four passes was made at each lymph node station. Where EBUS was performed for diagnostic purposes, the stations to be sampled were at the discretion of the operator. Samples obtained via EBUS-TBNA were flushed into CytoLyt (a methanol-water solution). Procedures were carried out in the absence of rapid on-site evaluation (ROSE). Where cancer was suspected but EBUS-TBNA showed normal findings, samples were obtained via other modalities (e.g., CT-guided biopsy) and FDG PET was carried out as well (if not done already). In cases of isolated mediastinal and hilar lymphadenopathy (IMHL) where EBUS-TBNA did not reveal any pathology, interval surveillance CTs were carried out for monitoring; where the lymphadenopathy did not resolve, surveillance scans were continued for a year. The outcomes of these surveillance CTs and PET-CTs were also reviewed for this study. A diagnosis of reactive lymphadenopathy was made if the EBUS-TBNA sample did not reveal any pathology, repeat CT showed no change (or showed reduction/resolution of the lymphadenopathy), and the clinician did not consider the patient to have another diagnosis. An EBUS-TBNA was labelled false negative if the pathology result was negative but the node was PET positive (in suspected cancer patients).
Results
Of the 82 patients who underwent EBUS-TBNA, 55 (about 67%) were male and 27 (about 33%) were female (Figure 1).
The age range at the time of the procedure was 28 to 88 years, and the majority of patients (80%) were aged 52-88 years (Figure 2).
The 82 EBUS-TBNA procedures were carried out for the following main reasons (Figure 3):
A. 42 procedures for cancer (51% of the total):
a. diagnosis of lung cancer (38 procedures)
b. diagnosis of suspected extra-thoracic cancer (3 cases)
c. staging of lung cancer (1 case)
B. 40 procedures for IMHL (49%)
The final diagnoses in the 38 procedures carried out for diagnosis of lung cancer were as follows:
1. 25 patients were diagnosed with lung cancer (12 squamous cell cancers, 7 adenocarcinomas, 4 small cell cancers, 1 undifferentiated lung cancer and 1 neuroendocrine tumour).
2. The final diagnosis in 9 cases was reactive lymphadenopathy (repeat CT showed resolution of lymph nodes in 3 cases, reduction in size in 1 case and stable nodes in 5 cases).
3. Extra-thoracic malignancies were diagnosed in 2 cases (1 metastatic prostate cancer and 1 metastatic disease from a primary parotid gland tumour).
4. There were false negative results in 2 cases (1 patient was subsequently diagnosed with small cell lung cancer on CT-guided biopsy and 1 with adenocarcinoma on ultrasound-guided biopsy).
Thus, in 11 cases where the clinician's initial suspicion was lung cancer, the final diagnosis was reactive lymphadenopathy (9 cases) or an extra-thoracic malignancy (2 cases).
Some of these patients also had lung nodules (along with mediastinal and hilar lymphadenopathy); these nodules either resolved or remained stable. In the case of metastatic prostate cancer, prior MRI had shown prostate-confined disease, and the clinician suspected the size-significant lymphadenopathy to be due to a lung primary. In the case of the metastatic parotid tumour, the initial diagnosis of parotid cancer had been made many years earlier and metastatic disease was not expected.
The final diagnoses in the 3 patients who had EBUS-TBNA for suspected extra-thoracic malignancies were as follows:
1. Prostate cancer (pelvic MRI showed locally advanced disease)
2. Colon cancer (known colon cancer)
3. Ovarian cancer (the patient had an ovarian mass and abdominal/pelvic lymphadenopathy)
As most surgical patients go directly from this hospital to tertiary care centres, we did not have many patients for staging purposes during the first year of the service; there was only one staging EBUS-TBNA during this time. In this case, stations 4L, 7 and 12L were sampled. Only station 12L was PET positive, and it was also positive on the EBUS-TBNA sample; stations 4L and 7 were negative on both PET and EBUS-TBNA. There were no size-significant nodes in any other area on the staging CT, only the 12L node was PET avid, and no size-significant lymphadenopathy was identified at any other station on EBUS. Sensitivity in this staging EBUS was 100%.
In these 42 diagnostic and staging procedures (carried out for cancers or suspected cancers), the pathological diagnoses from lymph node aspirates were as follows:
1. Squamous cell carcinoma of lung: 13 (approximately 31%)
2. Adenocarcinoma of lung origin: 7 (approximately 17%)
3. Small cell lung cancer: 4 (approximately 9.5%)
4. Neuroendocrine tumour of lung origin: 1 (approximately 2.3%)
5. Undifferentiated lung cancer: 1 (approximately 2.3%)
6. Metastatic prostate cancer: 2 (approximately 4.75%)
7. Metastatic parotid gland cancer: 1 (approximately 2.3%)
8. Metastatic ovarian cancer: 1 (approximately 2.3%)
9. Metastatic colon cancer: 1 (approximately 2.3%)
10. False negative: 2 (approximately 4.75%)
11. Reactive lymphadenopathy: 9 (approximately 21.5%)
Of the 40 procedures for IMHL, an adequate sample could not be obtained in 1 case; this patient underwent repeat EBUS-TBNA, which showed granulomas, consistent with the clinical diagnosis of sarcoidosis. The final diagnoses in these 40 cases were as follows:
1. Metastatic adenocarcinoma of pancreaticobiliary origin: 1 (2.5%)
2. Bronchogenic cyst: 1 (2.5%)
3. Insufficient sample: 1 (2.5%)
4. Tuberculosis: 3 (7.5%)
5. Granulomas: 16 (40%)
6. Reactive lymphadenopathy: 18 (45%)
Serious diagnoses were made in 10% of the IMHL cases (4 out of 40). One patient had metastatic adenocarcinoma of pancreaticobiliary origin without any abdominal symptoms or abnormalities in the abdomen on CT. Three patients were diagnosed with, and later treated for, active tuberculosis; only one of these had features of active disease (but was sputum negative), while the other two had only mediastinal lymphadenopathy, with no lung infiltrates and no sputum production.
A total of 122 lymph nodes were sampled. Details are as follows (Figure 4):

Lymph node station | Times sampled | %
Station 7 | 65 | 53.3
4R | 18 | 14.8
11R | 15 | 12.3
11L | 11 | 9
10R | 4 | 3.2
4L | 3 | 2.5
2R | 2 | 1.6
10L | 2 | 1.6
12R | 2 | 1.6
2L | 0 | 0
12L | 0 | 0
The most commonly sampled station was station 7, which is consistent with the international literature published on EBUS-TBNA.
There were no complications from the procedures performed. None of our patients experienced significant airway bleeding (requiring admission or blood transfusion), mediastinal infection, pneumothorax, pneumo-mediastinum, haemo-mediastinum or airway lacerations.
Discussion
EBUS-TBNA is a minimally invasive method of accessing and sampling the mediastinal and hilar lymph nodes. Several invasive, minimally invasive and non-invasive techniques are available to diagnose and stage lung cancer; the choice depends upon the extent of the disease. About 50% of lung cancer patients have evidence of metastatic disease at presentation3, and patients with intrathoracic disease undergo several investigations. It is now recognised that EBUS-TBNA should be considered the initial investigation for patients with suspected early-stage lung cancer4. Published research has shown that EBUS-TBNA has a sensitivity of 90%5, and a recent national BTS audit on bronchoscopy and EBUS showed a national diagnostic sensitivity of 90% for staging EBUS-TBNA; the BTS quality standards statement sets a target of 88% sensitivity for staging EBUS-TBNA6. For diagnostic EBUS-TBNA, we had 2 false negative results out of 41 procedures (4.9%), giving a sensitivity of 95.1%.
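For clarity, the quoted figure is a per-procedure calculation (a stricter definition of sensitivity would restrict the denominator to disease-positive cases):

\[ \text{sensitivity} = 1 - \frac{\text{false negatives}}{\text{diagnostic procedures}} = 1 - \frac{2}{41} \approx 95.1\% \]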
There is significant evidence that ROSE does not increase the diagnostic yield even of conventional TBNA7: Trisolini et al demonstrated in a randomised controlled trial that ROSE gave no significant diagnostic advantage and did not affect the percentage of adequate specimens. Studies have also shown that ROSE does not reduce EBUS-TBNA procedure time8. The use of immunohistochemistry on EBUS-TBNA samples reduces the rate of unclassified non-small cell lung cancer compared with cytological diagnosis alone9, and EBUS-TBNA samples are sufficient to allow immunohistochemical and molecular analysis; at our centre we were able to obtain ALK, EGFR and PDL1 testing on EBUS-TBNA samples where indicated. The presence of a cytopathologist or cytotechnologist during the procedure for ROSE can increase costs significantly, which can have a major impact on starting the service at district general hospital level. Another issue requiring clarification is the number of passes needed before material is declared inadequate when using ROSE; studies have shown that a significant number of samples deemed inadequate on ROSE were still able to yield a diagnosis with the help of immunohistochemical analysis.
In our series, 40 EBUS-TBNA procedures were carried out for IMHL. In this group, one patient was unfortunately diagnosed with an unexpected malignancy, metastatic adenocarcinoma of pancreaticobiliary origin; the remaining cases had benign diagnoses. About 45% of the IMHL cases were diagnosed with reactive lymphadenopathy, and out of the total of 82 cases, about 33% were diagnosed with reactive lymphadenopathy. We made this diagnosis in patients whose EBUS samples showed normal lymphocytes; these patients also had surveillance CTs and clinical follow-up, and the clinicians' impressions and surveillance scans were reviewed for the purpose of this diagnosis. In the IMHL group, 40% of cases were diagnosed with sarcoidosis; in these cases, in addition to the clinicians' impressions, we reviewed the cytology, microbiology and surveillance CT reports. The specimen processing method affects the yield for granulomas: cell block preparation, as carried out in this hospital, has been shown to give a higher yield for granulomas10.
During the first year of the EBUS service at this centre, there was no suspected or diagnosed lymphoma patient who underwent this procedure. International data suggests, for the diagnosis of lymphoma, EBUS-TBNA aspirates should be sent for cytopathology, immunohistochemistry, flow cytometry, cytogenetics and molecular studies 11,12,13.
Conclusion
EBUS-TBNA is a safe and minimally invasive procedure. It is a first-line investigation for lung cancer staging. EBUS-TBNA has been effective in diagnosing extra-pulmonary malignancies14, and in the last decade its utility in diagnosing benign conditions such as sarcoidosis and TB has also increased significantly.
We feel that operator training is also very important in achieving excellent results. Mastering the complexity of this procedure is time-consuming. Standardised training is mandatory to achieve high skill levels15, and we hope there will be a standardised approach to this in future.
Lichtenstein tension-free mesh repair has been the standard practice in open inguinal hernia repair for many years. The procedure involves suture fixation of the mesh via an anterior approach to the inguinal canal. It is hypothesised that this invasive fixation contributes to the development of chronic postoperative inguinal pain (CPIP), a condition which can cause significant morbidity.
A sound repair should restore the groin anatomy whilst minimising recurrence and not adversely affecting the patient's quality of life. Considering the large number of these operations performed each year, reducing complications such as chronic postoperative pain would have a significant impact on healthcare resources.
The introduction of anatomical self-adhesive meshes such as Parietex ProGrip™ addresses this concern in theory by obviating the need for mesh fixation. This is a macroporous polyester mesh that uses polylactic acid (PLA) microgrips to allow placement within 60 seconds1 without the need for additional fixation. The manufacturer does suggest, however, that additional fixation is left to the discretion of the operating surgeon.
We conducted a review of the literature to evaluate the reported outcomes of using this mesh in open inguinal hernia repair.
Methods
We conducted a PUBMED/MEDLINE search using the search terms “Self-adhesive mesh”, “Lichtenstein repair”, “Open inguinal hernia repair” and “Self-gripping mesh”. We looked primarily at the outcomes of postoperative pain and recurrence. The search highlighted five well-structured meta-analyses and several RCTs and retrospective reviews.
Results
In a retrospective review of 211 patients who underwent open inguinal hernia repair with self-adhesive mesh, Tarchi P et al reported a recurrence rate of 0.5% at 1 year and 2.4% at 2 years. The incidence of chronic pain was less than 3%. There were no cases of seroma, testicular complications or mesh infection at 1-, 2- and 3-year follow-up. The report highlighted the shorter operative duration with no effect on recurrence rates as a point in favour of self-adhesive mesh. The authors acknowledged the limitations of the study design and the need for randomised trials to address the issue.8 A few other small non-randomised trials draw similar conclusions.9
A randomised blinded trial from the Danish Multicentre DANGRIP Study Group allocated 163 and 171 patients to self-adhesive mesh and suture fixation respectively. There were no significant differences between the groups in postoperative complications (33.7 versus 40.4%; P = 0.215), rate of recurrent hernia within 1 year (1.2% in both groups) or quality of life. The 12-month prevalence of moderate or severe symptoms was 17.4 and 20.2% respectively (P = 0.573).
The study concluded that the avoidance of suture fixation using a self-gripping mesh was not accompanied by a reduction in chronic symptoms after inguinal hernia repair. 5
The FinnMesh trial is a randomised multicentre trial from Finland that compared glue fixation, self-gripping mesh and suture fixation of the mesh. A total of 625 patients were randomised to cyanoacrylate glue (Histoacryl, n = 216), self-gripping mesh (Parietex ProGrip, n = 202) or conventional non-absorbable sutures (Prolene 2-0, n = 207). There were no significant differences postoperatively in pain response or need for analgesics between the study groups at 1-year follow-up. The mean operative duration was lower in the self-adhesive mesh group.6
The HIPPO trial is a randomised double-blinded trial of 165 patients. The reported hernia recurrence rate after 24 months was 2.4% for the ProGrip mesh and 1.8% for the sutured mesh (P = 0.213). The incidence of CPIP was 7.3% at 3 months, declining to 4.6% at 24 months, and did not differ between the groups.7 The mean duration of surgery was significantly shorter with the ProGrip mesh (44 vs 53 minutes, P < 0.001).
In a systematic review of 7 studies comparing self-gripping versus sutured mesh for inguinal hernia repair, totalling 1353 patients, Zhang C et al found no difference in recurrence (risk difference −0.02 [95% confidence interval −0.07 to 0.03], P = 0.40) or chronic pain (risk difference −0.00 [95% confidence interval −0.01 to 0.01], P = 0.57).2 The review also found no difference in wound infection, haematoma or seroma formation. Self-adhesive mesh was again associated with a shorter mean operative duration. The authors concluded that both mesh types are comparable in outcome but that further long-term analysis may be needed.
Pandanaboyana S published a meta-analysis of 5 RCTs and 1170 patients, which also found no significant difference in recurrence or chronic pain. Wound infection was lower in the self-gripping mesh group compared with sutured mesh, but this was not statistically significant (risk ratio (RR) 0.57, 95% confidence interval 0.30-1.06, P = 0.08). The duration of operation was significantly shorter with self-gripping mesh than with sutured mesh (mean difference −5.48 min, 95% confidence interval −9.31 to −1.64; Z = 2.80, P = 0.005).3
In another meta-analysis, Li J et al included 5 RCTs, 2 prospective comparative studies and 1353 patients. There was no statistical difference in the incidence of chronic pain (odds ratio = 0.74, 95% confidence interval (CI) 0.51-1.08), acute postoperative pain (odds ratio = 1.32, 95% CI 0.68-2.55), haematoma or seroma (odds ratio = 0.89, 95% CI 0.56-1.41), wound infection (risk difference = −0.01, 95% CI −0.02 to 0.01) or recurrence (risk difference = 0.00, 95% CI −0.01 to 0.01). The self-gripping mesh group was associated with a shorter operating time (by 1-9 minutes).10
In Ismail A et al's meta-analysis of 12 randomised controlled trials and 5 cohort studies, 3722 patients were included in the final analysis. The two groups, self-gripping mesh and sutured mesh fixation, did not differ significantly in recurrence rate (odds ratio = 0.66, 95% confidence interval 0.18-2.44; P = 0.54) or postoperative chronic groin pain (odds ratio = 0.75, 95% confidence interval 0.54-1.05; P = 0.09). Operative time was shorter in the self-gripping mesh group (mean difference = −7.85 min, 95% confidence interval −9.94 to −5.76; P < 0.0001). Risks were comparable between the groups in terms of postoperative infection (odds ratio = 0.81, 95% confidence interval 0.53-1.23; P = 0.32), postoperative haematoma (odds ratio = 0.97, 95% confidence interval 0.7-1.36; P = 0.9) and urinary retention (odds ratio = 0.66, 95% confidence interval 0.18-2.44; P = 0.54).11
A more recent meta-analysis including 10 RCTs and 2541 patients draws similar conclusions, with no significant difference in the incidence of chronic pain (odds ratio = 0.93; 95% confidence interval, 0.74-1.18), recurrence (odds ratio = 1.34; 95% confidence interval, 0.82-2.19) or foreign body sensation (odds ratio = 0.82; 95% confidence interval, 0.65-1.03).4 The mean operating time was significantly shorter (mean difference = −7.58; 95% confidence interval, −9.58 to −5.58) in the self-gripping mesh group, which is consistent with the reported literature.
Discussion
Open inguinal hernia repair is a routinely performed operation and chronic postoperative inguinal pain is a significant cause of morbidity that can impact negatively on patients’ quality of life. Eliminating the need for suture fixation seems theoretically a step in the right direction.
The published literature consistently arrives at similar conclusions. Whilst the use of self-adhesive mesh results in a shorter operative duration and does not otherwise appear to affect outcomes negatively, there is no evidence that it reduces chronic postoperative pain, and it should therefore not be advocated on that basis. A shorter operative time coupled with non-inferior outcomes is a more reasonable evidence-based argument for its proponents.
The decision of which mesh fixation technique to use can be left to the discretion of the operating surgeon. Further long-term follow-up data are required to arrive at more definitive conclusions, as the mean follow-up duration in the reviewed studies ranged from 4 months to 3 years. The cost implications of the choice of mesh should also be taken into account in future studies.
Meningiomas are common intracranial neoplasms with a wide range of histopathological appearances. The WHO classification of tumours of the central nervous system recognises 15 subtypes of meningioma, of which the meningothelial, fibrous and transitional subtypes are the most common. Lymphoplasmacyte-rich meningioma (LPM) is a rare subtype belonging to WHO Grade I,1 with an estimated incidence of less than 1% of all meningiomas.2 LPM usually occurs in young and middle-aged patients, the most common locations being the cerebral convexities, skull base, parasagittal area within the superior sagittal sinus, cervical canal, optic nerve and tentorium.3 Histopathological examination shows extensive infiltrates of lymphocytes and plasma cells, often obscuring the meningothelial component.
Case report
A 21-year-old man presented with a 4-month history of headache. It was a dull pain, not associated with vomiting, seizures or visual symptoms, and there were no features suggestive of cranial nerve involvement. Physical examination was unremarkable except for the presence of papilloedema. Non-contrast CT scan showed a large isodense lesion with perilesional oedema and an eccentric enhancing nodular component in the right fronto-parietal region (Figure 1). A radiological diagnosis of glioma with mass effect and midline shift to the left was made. A right frontoparietal free bone flap craniotomy was performed. Intraoperatively, a well-encapsulated tumour, probably arising from the dura mater, was found. Gross total removal of the tumour was achieved, and the excised tumour was sent for histopathological examination with a provisional clinical diagnosis of meningioma.
Histopathological examination revealed a tumour arranged as sheets and whorls of meningothelial cells without any mitoses or atypia. A dense infiltrate of lymphocytes and plasma cells was seen in large areas of the tumour (Figure 2).
On immunohistochemistry, tumour cells were positive for epithelial membrane antigen (EMA) (Figure 3) and vimentin. The lymphoplasmacytic infiltrate contained a mixture of CD3- and CD20-positive lymphocytes. A diagnosis of lymphoplasmacyte-rich meningioma was made.
Figure 1. Non-contrast CT scan showing a large isodense cystic lesion with perilesional oedema and eccentric enhancing nodular component in the right frontoparietal region
Figure 2: Tumour arranged as sheets and whorls of meningothelial cells without any mitoses or atypia. A dense infiltrate of lymphocytes and plasma cells seen in large areas of the tumour (H & E x 100)
Meningiomas are common neoplasms accounting for 24-30% of all primary intracranial tumours. They arise from arachnoidal cells and are typically attached to the inner surface of the dura mater.1 Most meningiomas are benign, corresponding to WHO grade I, and are associated with a favourable clinical outcome. LPM is a rare low-grade histopathological subtype of meningioma, usually seen in younger patients, with a mean age of onset of 34 years.4,5 Patients with LPM have variable clinical manifestations according to the location of the tumour; common presentations include headache, hemiparesis, seizure, vomiting, dizziness, visual disturbance, dyscalculia, dysgraphia and slurred speech.3 Although the natural history of LPM often extends over more than one year, a few cases may present after a short duration because of inflammatory cell infiltration and oedema.6 Systemic haematological abnormalities such as hyperglobulinaemia and iron-refractory anaemia have been documented in some patients with LPM, believed by some to be due to the plasma cell infiltrate.3,6,7
Radiologically, LPMs are usually globular, highly vascular, contrast-enhancing, dural-based tumours. The typical features of LPM on MRI are isointensity on T1-weighted images and hyperintensity on T2-weighted images, with strong homogeneous enhancement after the administration of gadolinium, obvious peritumoural brain oedema and a dural tail sign.3 Sometimes a cystic component and heterogeneous enhancement may also be encountered, making pre-operative diagnosis difficult, as in our case.8
On microscopic examination, this tumour is characterised by a conspicuous infiltrate of lymphocytes and plasma cells, sometimes completely obscuring the tumour cells. This massive infiltration of lymphocytes and plasma cells has been postulated to play a central role in the development of the brain oedema associated with LPM. The origin of this tumour (neoplastic or inflammatory) is unclear, and some therefore consider it closer to intracranial inflammatory masses than to typical meningiomas.7
The differential diagnoses include collision tumour of meningioma and plasmacytoma, inflammatory pseudotumour, idiopathic hypertrophic pachymeningitis (IHP) and intracranial plasma cell granuloma.3,7 Staining for EMA and vimentin is useful in indicating the meningothelial origin of the tumour and differentiates LPM from other intracranial lesions.9
The pathological findings of IHP usually include thickened, fibrotic dura mater with marked infiltration of lymphocytes and plasma cells, occasionally accompanied by small islands of meningothelial proliferation mimicking those of LPM. A localised nodular lesion can sometimes rule out this diagnosis, in that IHP usually shows diffuse lamellar thickening or plaque-like features.4
Chordoid meningiomas often contain regions that are histologically similar to chordoma, with cords or trabeculae of eosinophilic, vacuolated cells in a background of abundant mucoid matrix.3 Detailed histological study can aid the differential diagnosis. The plasma cell component of LPM is not neoplastic, and thus plasmacytoma with reactive meningothelial hyperplasia and a collision tumour involving meningioma and plasmacytoma can both be excluded.10
The knowledge of this rare entity is important to avoid its underdiagnosis as an inflammatory pseudotumour or plasma cell granuloma and overdiagnosis as a plasmacytoma.
The non-vitamin K antagonist oral anticoagulants have demonstrated favourable benefit–risk profiles in large phase III trials, and these findings have been supported by real-world studies involving unselected patients representative of those encountered in routine clinical practice and including those deemed ‘challenging-to-treat’
Accurate detection of atrial fibrillation and assessment of stroke and bleeding risk is crucial in identifying patients who should receive anticoagulation
Elderly populations represent a significant proportion of patients seen in general practice, and advanced age should not be regarded as a contraindication to treatment; acetylsalicylic acid is not considered an effective option to reduce the risk of stroke in patients with non-valvular atrial fibrillation (except for those declining oral anticoagulation), particularly in the fragile elderly patients for whom this drug was historically prescribed
The frequency of follow-up visits, in particular to check compliance, should be tailored according to patients’ clinical characteristics and needs, but there is no requirement for routine coagulation monitoring, unlike vitamin K antagonists
Atrial fibrillation: a clinical and economic burden to society
Atrial fibrillation (AF) is the most frequently encountered sustained cardiac arrhythmia, with a prevalence of about 1.5–2% in the general population1,2. Its incidence is predicted to rise sharply over the coming years as a consequence of the ageing population and increased life expectancy in those with ischaemic and other structural heart disease2. In addition to being associated with significantly increased rates of mortality3, AF is also associated with significantly increased rates of heart failure, which is both a common cause and consequence of AF and greatly worsens the prognosis4. However, it is stroke that is the most devastating consequence of AF, with an average fivefold increased risk5.
AF-related strokes are often more severe than other strokes6,7 because the clots that embolise from the left atrium or left atrial appendage are often much larger8 than those from other sources of emboli. These clots usually lodge in large cerebral vessels, commonly the middle cerebral artery, resulting in major neurological and functional deficits and increased mortality compared with other stroke types. Moreover, the strokes suffered by patients with AF are more likely to lead to extended hospital care than strokes in patients without AF, thus impacting on patients’ quality of life7.
Current evidence suggests that, in the UK, AF has a causative role in almost 20% of all strokes9. This is likely to represent a significant underestimate given that long term electrocardiogram (ECG) monitoring in patients who would previously have been diagnosed as having cryptogenic stroke has demonstrated a significant AF burden in these patients10.
With improved AF detection and stroke prevention, it is estimated that approximately 8000 strokes could be avoided and 2100 lives saved every year in the UK, resulting in substantial healthcare savings of £96 million11,12.
A key objective of this short review is to provide primary care clinicians with the confidence to manage patients with AF in need of anticoagulation, including the safe and appropriate use of the non-vitamin K antagonist oral anticoagulants (NOACs) apixaban, dabigatran and rivaroxaban (approved in the EU, US and several other countries worldwide) and edoxaban (approved in the EU, US and Japan)13-20. The focus will be on how to accurately identify, risk-stratify and counsel patients on the risks and benefits associated with the different treatment options.
Who to treat. Accurate detection and assessment of stroke and bleeding risk
Many patients with AF are asymptomatic, particularly the elderly, less active patients who may not notice the reduction in cardiac performance associated with AF. Unfortunately, it remains the case that AF is undetected in up to 45% of patients21, and stroke is very often the first presentation of AF.
Both the National Institute for Health and Care Excellence (NICE) and the European Society of Cardiology (ESC) guidelines recommend opportunistic screening in patients aged ≥65 years by manual pulse palpation followed by ECG in patients found to have an irregular pulse1,22. Opportunistic screening (manual pulse palpation) was shown to be as effective as systematic screening (ECG) in detecting new cases23, and this simple strategy should be used to screen at-risk patient groups as often as possible. Hypertension and increasing age are the two leading risk factors for developing AF, but other high-risk groups include patients with obstructive sleep apnoea, morbid obesity or a history of ischaemic heart disease24-26. In the context of proactive AF detection, many initiatives have been launched worldwide to encourage primary care clinicians to integrate manual pulse checks into their routine practice. The Know Your Pulse campaign was launched by the AF Association and Arrhythmia Alliance during Heart Rhythm Week in 2009 and was quickly endorsed by the Department of Health in the UK and by many other countries. This initiative has assisted in diminishing some of the gaps in AF detection21.
The most frequently used tools to evaluate stroke risk in patients with non-valvular AF (AF that is not associated with rheumatic valvular disease or prosthetic heart valves) are the CHADS2 score27 and the CHA2DS2-VASc score28, with recent guidelines favouring the latter and emphasising the need to effectively identify ‘truly low-risk’ patients1. The CHA2DS2-VASc score is superior to CHADS2 in identifying these truly low-risk patients, who should not routinely be offered anticoagulation1. Patients with any form of AF (i.e. paroxysmal, persistent or permanent), and regardless of whether they are symptomatic, should be risk-stratified in this way. The risk of stroke should also be assessed using CHA2DS2-VASc in patients with atrial flutter, and probably in the majority of patients who have been successfully cardioverted in the past22. Unless the initial underlying cause has been removed (e.g. corrected hyperthyroidism) and there is no significant underlying structural heart disease, the risk of a recurrence of AF following ‘successful’ cardioversion remains high29. The ESC guidelines recommend that anticoagulation should be offered to patients with a CHA2DS2-VASc score ≥1, based on assessment of the risk of bleeding complications and the patient’s clinical features and preferences1.
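Since the score components are well defined, the calculation itself is mechanical. The following is a minimal illustrative sketch (ours, not taken from any guideline software) of how the standard CHA2DS2-VASc components combine; the function and parameter names are our own:

```python
# Illustrative sketch: computing a CHA2DS2-VASc score.
# Component weights follow the standard published score.

def cha2ds2_vasc(chf: bool, hypertension: bool, age: int, diabetes: bool,
                 prior_stroke_tia_te: bool, vascular_disease: bool,
                 female: bool) -> int:
    """Return the CHA2DS2-VASc stroke-risk score (0-9)."""
    score = 0
    score += 1 if chf else 0                   # Congestive heart failure / LV dysfunction
    score += 1 if hypertension else 0          # Hypertension
    score += 2 if age >= 75 else (1 if age >= 65 else 0)  # Age >=75 (2 points), 65-74 (1)
    score += 1 if diabetes else 0              # Diabetes mellitus
    score += 2 if prior_stroke_tia_te else 0   # Prior stroke/TIA/thromboembolism (doubled)
    score += 1 if vascular_disease else 0      # Vascular disease (e.g. prior MI, PAD)
    score += 1 if female else 0                # Sex category (female)
    return score

# Example: a 70-year-old woman with hypertension scores 3
# (age 65-74 = 1, hypertension = 1, female sex = 1).
print(cha2ds2_vasc(False, True, 70, False, False, False, True))  # -> 3
```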
The new Quality and Outcomes Framework (QOF) for 2015–2016 now recommends the use of CHA2DS2-VASc for risk stratification and no longer recommends antiplatelet agents as a therapeutic option for stroke prevention in patients with non-valvular AF30; this should result in significantly more patients receiving anticoagulation for this indication. The changes to QOF 2015–2016 compared with 2014–2015 are summarised in Table 130.
Table 1. Summary of changes to the UK Quality and Outcomes Framework (QOF) 2015-201630

NICE indicator ID | Changes | 2014-2015 points | 2015-2016 points
NM45: Patients with AF and CHADS2 = 1 currently treated with anticoagulant therapy or antiplatelet therapy | Retired | 6 | –
NM46: Patients with AF and a latest record of a CHADS2 ≥1 currently treated with anticoagulant therapy | Replaced by NM82 | 6 | –
NM82: Patients with AF and CHA2DS2-VASc ≥2 currently treated with anticoagulant therapy | Replacement | – | 12
NM81: Patients with AF in whom stroke risk has been assessed using the CHA2DS2-VASc risk-stratification scoring system in the preceding 12 months (excluding those with a previous CHADS2 or CHA2DS2-VASc ≥2) | New indicator | – | 12

Key: AF = atrial fibrillation; CHADS2 = Congestive heart failure, Hypertension, Age ≥75 years, Diabetes, Stroke (doubled); CHA2DS2-VASc = Congestive heart failure or left ventricular dysfunction, Hypertension, Age ≥75 years (doubled), Diabetes, Stroke (doubled), Vascular disease, Age 65-74 years, Sex category (female); NICE = National Institute for Health and Care Excellence
The Guidance on Risk Assessment and Stroke Prevention in Atrial Fibrillation (GRASP-AF) clinical audit tool is now very widely used in primary care to improve clinical outcomes in the AF population by identifying patients likely to benefit from anticoagulation. GRASP-AF systematically scans general practice software systems and calculates CHADS2 and CHA2DS2-VASc scores in patients coded as having AF, enabling physicians to identify high-risk patients who are not adequately treated for stroke prevention31. Identification of AF patients who are poorly controlled on warfarin (defined as having a time in therapeutic range [TTR] <65% or a labile international normalised ratio [INR], e.g. one INR value >8 or two INR values <1.5 or >5 within the past 6 months)22 is crucial because these patients are more likely to experience major bleeding or stroke. These patients should be reviewed and, if possible, the cause of the poor warfarin control identified. The Warfarin Patient Safety Audit tool is another software tool developed to help identify patients with poor warfarin control32.
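As an illustration, the quoted poor-control criteria can be expressed as a simple rule. This hypothetical helper (names are ours; it is not part of GRASP-AF or the Warfarin Patient Safety Audit tool) mirrors the definition above:

```python
# Hypothetical helper mirroring the quoted 'poor warfarin control' criteria:
# TTR < 65%, any INR > 8, or two INRs < 1.5 or > 5 within the past 6 months.

def poor_warfarin_control(ttr_percent: float, recent_inrs: list[float]) -> bool:
    """Flag patients whose warfarin control meets the quoted 'labile INR' criteria."""
    if ttr_percent < 65:
        return True
    if any(inr > 8 for inr in recent_inrs):
        return True
    if sum(1 for inr in recent_inrs if inr < 1.5 or inr > 5) >= 2:
        return True
    return False

print(poor_warfarin_control(70, [2.1, 5.6, 1.3]))  # -> True (two out-of-range INRs)
```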
Primary care clinicians are being urged to objectively assess the bleeding risk of AF patients who are receiving, or about to receive, anticoagulation1,22,32. HAS-BLED is the bleeding assessment scheme advocated by both NICE and the ESC1,22; it has been validated in several independent cohorts and shown to correlate well with the risk of major bleeding, in particular intracranial bleeding1. The key aspect of HAS-BLED is that, unlike CHADS2 and CHA2DS2-VASc, it largely consists of risk factors that are modifiable. It should therefore not be used to decide whether to anticoagulate, but rather to identify ways to reduce the risk of bleeding in patients receiving an anticoagulant; for example, optimising blood pressure control, stopping unnecessary antiplatelet or anti-inflammatory agents and reducing alcohol consumption can all significantly reduce HAS-BLED scores and bleeding risk1. In addition, it should be emphasised that the absolute number of patients with AF experiencing a serious bleeding event while receiving anticoagulant therapy is low (~2-3%/year in the XANTUS, PMSS and Dresden NOAC Registry real-world studies), with prospective real-world studies indicating that most bleeding events can be managed conservatively33-35. Whilst concerns have been raised about the lack of a reversal agent to counter the anticoagulant action of NOACs in patients who experience serious bleeding, the low incidence of major bleeding in real-world and phase III studies, and its conservative management in most cases, suggest that such agents would not be required routinely. Nevertheless, reversal agents have been developed, have successfully completed phase III studies and have been approved in some markets, including idarucizumab in the UK36,37. Notably, high-risk patients with AF were shown to be more willing to endure bleeding events in order to avoid a stroke and its consequences38, reinforcing the message that “we can replace blood but we cannot replace brain tissue”.
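A brief sketch of this idea, assuming the standard one-point-per-item HAS-BLED components; the field names and example values below are ours, and the point is simply that the modifiable items can be flagged for intervention:

```python
# Illustrative HAS-BLED sketch, flagging the modifiable components the text
# highlights (blood pressure, labile INR, antiplatelet/NSAID use, alcohol).
# Each present item scores 1 point, per the standard published score.

HAS_BLED_ITEMS = {
    "uncontrolled_hypertension": True,   # H - modifiable
    "abnormal_renal_function": False,    # A
    "abnormal_liver_function": False,    # A
    "stroke": False,                     # S
    "bleeding_history": False,           # B
    "labile_inr": True,                  # L - modifiable
    "elderly_over_65": True,             # E
    "antiplatelet_or_nsaid": True,       # D - modifiable
    "alcohol_excess": False,             # D - modifiable
}
MODIFIABLE = {"uncontrolled_hypertension", "labile_inr",
              "antiplatelet_or_nsaid", "alcohol_excess"}

score = sum(HAS_BLED_ITEMS.values())
targets = [item for item, present in HAS_BLED_ITEMS.items()
           if present and item in MODIFIABLE]
print(score, targets)
# -> 4 ['uncontrolled_hypertension', 'labile_inr', 'antiplatelet_or_nsaid']
```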
Adequate anticoagulation therapy should follow appropriate patient identification
Identifying the right treatment option for patients with AF is likely to improve clinical outcomes. Involving patients in the decision-making process and its rationale, and ensuring they understand the net benefit-risk of the treatment options, is likely to lead to better compliance and improved clinical outcomes. The ESC guidelines consider patients with valvular AF (AF in the presence of either rheumatic mitral stenosis [now very rare in the UK] or prosthetic heart valves) to be at high risk, and these patients should be anticoagulated with a VKA regardless of the presence of any other risk factors1. Warfarin is very effective at reducing the risk of stroke compared with acetylsalicylic acid (ASA)39,40, but an unpredictable dose-response relationship and multiple drug and food interactions can be problematic for some patients, and many patients remain sub-optimally treated41. ASA is likewise not considered an effective option to reduce the risk of stroke in patients with non-valvular AF, especially in the frail, elderly patients in whom it was historically prescribed. The GARFIELD-AF registry (10,614 patients enrolled in the first cohort) revealed that real-world anticoagulant prescribing in AF populations deviates substantially from guideline recommendations: 40.7% of patients with a CHA2DS2-VASc score ≥2 did not receive anticoagulant therapy, whereas a further 38.7% with a score of 0 received anticoagulant therapy. At diagnosis, 55.8% of patients overall were given a VKA, just over one quarter (25.3%) received an antiplatelet drug alone, and ~4.5% received a NOAC24. Inappropriate prescribing was further confirmed by data from UK general practices (n=1857, representing a practice population of 13.1 million registered patients) using the GRASP-AF tool: only 55% of patients with high-risk AF (CHADS2 ≥2) were receiving oral anticoagulation (OAC) therapy, and a further 34% of patients with no known contraindication were not receiving OAC therapy42.
The NOACs have altered the landscape of stroke prevention by increasing the options available to patients. These agents exhibit some important practical advantages over traditional therapy (e.g. no requirement for routine anticoagulation monitoring, simple fixed-dose oral regimens, fast onset of action, fewer drug interactions and no food interactions), leading to their increased uptake in primary care.
Key patient groups likely to benefit from the NOACs include those poorly controlled on VKAs, those predicted to require medications that interact with VKAs (e.g. patients who need frequent courses of antibiotics), those without severe renal impairment, and those who have had an ischaemic stroke while receiving a VKA with an adequate INR. These agents could also be a good choice for commuters and for patients living a considerable distance from their local hospital or surgery. The NICE guidelines state that primary care clinicians should consider clinical features and patient preference before deciding on the most appropriate option for patients22. In addition, cost may be important in some settings: all of the NOACs have demonstrated cost-effectiveness versus warfarin, and although cost models vary by country, there is little doubt that these agents are cost-effective largely through the number of adverse events avoided and their associated costs43.
Choice of anticoagulant: which to choose?
The demonstration of a favourable benefit-risk profile (stroke prevention vs bleeding events) in large phase III studies involving over 70,000 patients has resulted in the regulatory approval of apixaban, dabigatran, edoxaban and rivaroxaban44-47 for the prevention of stroke and systemic embolism in patients with non-valvular AF and one or more risk factors.
Overall, the NOACs have demonstrated an improved benefit compared with warfarin, with lower rates of intracranial haemorrhage (for all NOACs) and similar or superior efficacy for stroke prevention44-48. Statistically significant relative risk reductions (RRRs) in the incidence of fatal bleeding events were seen with low-dose dabigatran (110 mg twice daily [bd]; RRR=42%), both tested doses of edoxaban (30 mg once daily [od] and 60 mg od; RRR=65% and 45%, respectively) and rivaroxaban (20 mg od; RRR=50%)46,47,49; rates of fatal bleeding were also lower in patients treated with apixaban than with warfarin (34 vs 55 patients)44. These data are promising, especially considering the current lack of a specific antidote for any of the NOACs, and it is likely that the relatively short half-lives of these drugs play an important role in mitigating the bleeding risk.
Owing to a lack of head-to-head comparisons between the NOACs in phase III clinical trials, patient characteristics, drug compliance, tolerability issues and cost may be important considerations1. In addition, subanalyses of phase III trial data for rivaroxaban, apixaban and dabigatran indicate that the challenging-to-treat patient groups often encountered by primary care clinicians can be treated effectively and safely with the NOACs (Table 2). A recent meta-analysis showed a similar treatment effect for almost all subgroups encountered in clinical practice; NOACs appeared to be at least as effective as VKAs in reducing the risk of stroke and systemic embolism and no more hazardous in relation to the risk of major bleeding events, irrespective of patient co-morbidities50.
Table 2. Novel oral anticoagulants studied in key patient subgroups*

Subgroup analysis | Rivaroxaban (ROCKET AF) | Dabigatran (RE-LY) | Apixaban (ARISTOTLE)
Factors related to disease
Heart failure | ✓59 | ✓60 | ✓61
Renal impairment | ✓62 | ✓63 | ✓64
Prior stroke | ✓65 | ✓66 | ✓67
VKA-naïve | ✓68 | ✓69 | ✓70
Prior MI or CAD | ✓ (prior MI)71 | ✓ (CAD or prior MI)72 | ✓ (CAD)73
PAD | ✓74 | – | –
PK/PD | ✓75 | ✓76 | –
East Asian patients | ✓77 | ✓78 | ✓79
Elderly | ✓80 | ✓49 | ✓81
Major bleeding predictors | ✓82 | – | –
Obesity | – | – | –
Diabetes | ✓83 | ✓84 | ✓85
Valvular heart disease | ✓86 | – | ✓87
Paroxysmal versus persistent AF | ✓88 | ✓89 | ✓90

*No subgroup analyses have been presented for edoxaban. Key: AF = atrial fibrillation; ARISTOTLE = Apixaban for Reduction In STroke and Other ThromboemboLic Events in atrial fibrillation; CAD = coronary artery disease; CHADS2 = Congestive heart failure, Hypertension, Age ≥75 years, Diabetes, Stroke (doubled); MI = myocardial infarction; PAD = peripheral artery disease; PK/PD = pharmacokinetics/pharmacodynamics; RE-LY = Randomized Evaluation of Long-term anticoagulation therapy; ROCKET AF = Rivaroxaban Once daily, oral, direct factor Xa inhibition Compared with vitamin K antagonism for prevention of stroke and Embolism Trial in Atrial Fibrillation; VKA = vitamin K antagonist
Because patient selection in clinical trials is based on strict inclusion/exclusion criteria, patient populations in such studies are not always representative of patients routinely seen in real-world practice. In addition, bleeding events may be managed differently in clinical trials versus routine clinical practice. Real-world data are, therefore, needed to help validate drug safety and effectiveness in unselected patient populations. Following phase III clinical trials and the widespread approval of the NOACs in stroke prevention in patients with non-valvular AF, real-world experience has been steadily accumulating. The current real-world data for rivaroxaban, apixaban and dabigatran have been very reassuring and bridge the evidence gap between clinical studies and real-world experience33-35,51-57.
The lack of routine coagulation monitoring with NOACs does not remove the necessity for regular follow-up; rather, the frequency of visits can be tailored according to patients’ clinical characteristics and needs. The NOACs are all partially eliminated by the kidneys, so regular monitoring of renal function is important to guide the use of a lower recommended dose or avoidance of these drugs. For example, renal function should be monitored every 6 months in patients who have stage III chronic kidney disease (creatinine clearance [CrCl] 30-60 ml/min)58. Apixaban, rivaroxaban and edoxaban are not recommended in patients with CrCl <15 ml/min, and dabigatran is contraindicated in patients with CrCl <30 ml/min13,15,17,19. Reduced-dose regimens of NOACs are recommended for patients at higher risk of bleeding events, including those with reduced renal function. For example, a reduced apixaban dose of 2.5 mg bd is indicated in patients with at least two of the following characteristics: age ≥80 years, body weight ≤60 kg or serum creatinine ≥1.5 mg/dl (133 μmol/l); a reduced rivaroxaban dose of 15 mg od is indicated in patients with CrCl 15-49 ml/min58; edoxaban is recommended at a reduced dose of 30 mg od in patients with CrCl 15-50 ml/min and contraindicated in patients with CrCl >95 ml/min58; and a reduced dabigatran dose of 110 mg bd should be considered in patients with CrCl 30-50 ml/min who are at high risk of bleeding58. Follow-up visits should also systematically document patient compliance, thromboembolic and bleeding events, side-effects, co-medications and blood test results58.
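To illustrate the renal dose-reduction logic summarised above (ref 58), here is a hedged sketch; it encodes only the rules quoted in this paragraph (plus the standard full doses), the function names are ours, and real prescribing decisions must of course follow the current product information:

```python
# Hedged sketch of the renal dose-reduction rules quoted above. CrCl in ml/min.

def rivaroxaban_dose(crcl: float) -> str:
    if crcl < 15:
        return "not recommended (CrCl < 15)"
    if crcl <= 49:
        return "15 mg od (CrCl 15-49)"
    return "20 mg od"

def dabigatran_dose(crcl: float, high_bleeding_risk: bool) -> str:
    if crcl < 30:
        return "contraindicated (CrCl < 30)"
    if crcl <= 50 and high_bleeding_risk:
        return "consider 110 mg bd (CrCl 30-50, high bleeding risk)"
    return "150 mg bd"

def apixaban_dose(age: int, weight_kg: float, creatinine_mg_dl: float) -> str:
    # Reduced dose with at least two of: age >= 80 years, weight <= 60 kg,
    # serum creatinine >= 1.5 mg/dl (133 umol/l).
    criteria_met = sum([age >= 80, weight_kg <= 60, creatinine_mg_dl >= 1.5])
    return "2.5 mg bd" if criteria_met >= 2 else "5 mg bd"

print(rivaroxaban_dose(42))        # -> 15 mg od (CrCl 15-49)
print(dabigatran_dose(45, True))   # -> consider 110 mg bd (CrCl 30-50, high bleeding risk)
print(apixaban_dose(82, 58, 1.1))  # -> 2.5 mg bd (two criteria met)
```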
Conclusions
The NOACs have demonstrated favourable benefit–risk profiles in large phase III trials, and these findings have been supported by real-world studies involving unselected patients, including those deemed challenging to treat. The NOACs also address many of the limitations associated with VKA use, thus assisting with their integration into clinical practice for stroke prevention in patients with non-valvular AF. In addition, the results from subgroup analyses should provide primary care clinicians with the confidence to manage stroke-prevention strategies in a wide variety of patients with AF.
A 38-year-old female (BMI 20.2, ASA 2) underwent elective robotic-assisted laparoscopic extirpation of endometriosis and dissection of endometriomas. Her medical history included hypertension, migraine, atopic dermatitis, sciatica, cervical spine spondylosis and dysplastic spondylolisthesis of L4/5. Of note, the patient had allergies to Aspirin (causing angioedema), Morphine and Tramadol (both causing generalised rash).
An 18-gauge IV cannula was inserted into the cephalic vein at the left wrist and connected to a bag of Hartmann’s solution. The patient was induced with Propofol 100mg, Rocuronium 30mg and a Remifentanil infusion running at Ce 1ng/mL. Cefazolin 2g and Dexamethasone 4mg were also administered post-intubation. No rashes were noted on the patient’s skin, and her arms were subsequently enclosed with green towels by her sides for the duration of the surgery. During the procedure, the patient was maintained in a steep Trendelenburg position, with her face and eyes checked periodically; no rashes were noted on any exposed skin. Peri-operatively, she was maintained with O2/air/Desflurane, top-up doses of Rocuronium and titration of the Remifentanil infusion. At the end of the surgery, the patient was given Ondansetron 4mg and Pethidine 50mg (in 2mL), and was reversed with Neostigmine 2.5mg and Glycopyrrolate 0.4mg. When the patient’s arms were exposed in preparation for transfer, it was noted that she had developed severe erythema and inflammation in specific tributaries of the cannulated vein (Figure 1). The patient was extubated uneventfully five minutes later and did not complain of any systemic symptoms or symptoms pertaining to the cord inflammation. She was monitored in recovery for three hours post-op; the inflammation subsided significantly by 90 minutes post-op (Figure 2) and completely by 150 minutes post-op (Figure 3).
There have been few reports of such a reaction in the published literature, and we take this opportunity to provide further pictorial evidence of the possible sequelae of IV administration of a high-concentration Pethidine solution. The differences in analgesic effectiveness and potential side effects between Morphine and Pethidine are negligible2. Given that Pethidine is commonly used for analgesia on our wards and in the peri- and immediate post-operative periods when other classes of drugs are contraindicated, we hope to provide further pictorial support of such an extraordinary reaction for other interested clinicians. It is also interesting to note that in both cases the patient was female, around 40 years old, of thin body structure and with an atopic tendency, and the concentration of the injected solution was higher than 10mg/mL; these are factors believed to increase reaction severity3,4. We acknowledge that three other drugs were administered at approximately the same time as the Pethidine, and any of the four medications could therefore have been the culprit; however, this is unlikely, as our patient had been given those medications in previous procedures without issue.
Figure 1: Post-op, Figure 2: 90 mins post-op, Figure 3: 150 mins post-op
Routine pulse palpation is the recommended screening method to detect asymptomatic atrial fibrillation (AF) in clinical practice¹. Since this is part of the blood pressure (BP) measurement technique when using the Riva-Rocci (mercury) device or an aneroid device, most patients are evaluated for rhythm irregularity while having their BP checked; if the pulse is not palpated, heart rhythm can be evaluated through auscultation of the Korotkoff sounds. Under European Community law (2007/51/EC, 27 September 2007), mercury sphygmomanometers may no longer be sold, so aneroid or automatic devices will replace them within a few years. Recently, new devices with embedded algorithms to detect an irregular heart beat and possible AF have been commercialised. Whether the switch from the Riva-Rocci or aneroid sphygmomanometer to such a device will affect the detection of AF in usual care is unknown. We explored this issue using a retrospective, naturalistic observation of a group of GPs who abandoned the “old” Riva-Rocci or aneroid sphygmomanometer and adopted this new device.
Methods
In September 2011, the members of the Italian College of General Practitioners based in Bologna (a medium-sized city in central Italy) decided to standardise their office BP measurements. They received an unconditional grant for 30 automatic upper-arm blood pressure monitors (Microlife AFIB®) to be used in the office by the GPs themselves. This device embeds an algorithm that calculates an irregularity index (the standard deviation of the interval times between heartbeats divided by their mean); if the irregularity index is above a certain threshold value, atrial fibrillation is likely to be present and an atrial fibrillation icon is displayed on the screen. The 30 general practitioners who received the device agreed to a later proposal to examine their databases to evaluate the detection of new AF patients. They all had the same professional software (Millewin®) and used an automatic extraction. All patients with a recorded diagnosis of hypertension were identified, and BP recordings and AF diagnoses were extracted before (the 365 days preceding the use of the Microlife device) and after (the 4 months from starting its use) the adoption of the automatic devices. The proposal to examine AF detection was made four months after the GPs received the devices, so they were unaware of this study during their usual professional activity; the study was also neither planned nor known by Microlife. Sixteen other GPs, who were using the traditional device, volunteered to provide the same data extraction from their personal databases.
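For illustration, the irregularity index described above is simply the coefficient of variation of the inter-beat intervals. The sketch below (ours) uses a placeholder threshold, since the value embedded in the device is not given here:

```python
# Sketch of the irregularity index: the standard deviation of the
# beat-to-beat (RR) intervals divided by their mean.
from statistics import mean, stdev

def irregularity_index(rr_intervals_ms: list[float]) -> float:
    """Coefficient of variation of inter-beat (RR) intervals."""
    return stdev(rr_intervals_ms) / mean(rr_intervals_ms)

THRESHOLD = 0.06  # placeholder only - not the manufacturer's value

regular = [810, 800, 795, 805, 800, 798]     # sinus rhythm: low variation
irregular = [640, 910, 720, 1050, 580, 860]  # AF-like: highly variable

for rr in (regular, irregular):
    idx = irregularity_index(rr)
    print(f"index = {idx:.3f} -> {'possible AF' if idx > THRESHOLD else 'regular'}")
```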
Results
The 30 participating GPs cared for 48,184 individuals, 12,294 (25.5%) of whom had hypertension (mean age 69.9±13.4 years). The 16 control GPs cared for 23,218 patients, 5,757 (24.8%) of whom had hypertension (mean age 69.7±13.6 years). The four-monthly AF detection rates for the original group and the control group are reported in Table 1. All newly detected AF cases were confirmed on ECG. Statistical analysis was performed with the chi-square (χ²) test.
Table 1: Four-monthly AF detection rate in the original GP group and in the control group

N° GPs (n° hypertensive patients) | Detected AF % (n° pts) Oct 2010-Jan 2011 | Detected AF % (n° pts) Feb 2011-May 2011 | Detected AF % (n° pts) Jun 2011-Sep 2011 | Detected AF % (n° pts) Oct 2011-Jan 2012
30 (12,294) - original group | 0.37% (46)* | 0.30% (39)* | 0.37% (45)* | 0.63% (77)**
16 (5,757) - controls | 0.35% (20)‡ | 0.45% (26)‡ | 0.56% (32)‡ | 0.33% (19)‡‡

* ‡ Use of the traditional device; original group vs controls: p NS (χ² = 3.0421, df 1). ** Use of the automatic device (the traditional device was used in all other quarters). ** ‡‡ Original group, use of the automatic device vs the traditional device in AF detection: p < 0.005 (χ² = 9.487, df 1).
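As a worked illustration of the chi-square testing reported in Table 1, the sketch below (ours, using scipy) compares two detection counts from the original group; since the exact quarters pooled for each reported comparison are not fully specified, the counts chosen are illustrative only and will not reproduce the quoted statistics exactly:

```python
# Worked example of a 2x2 chi-square test on Table 1 counts, assuming a
# simple [AF detected, not detected] layout per period. Illustrative only.
from scipy.stats import chi2_contingency

n = 12294  # hypertensive patients cared for by the original 30 GPs

table = [
    [45, n - 45],  # Jun-Sep 2011, traditional device: 45 new AF diagnoses
    [77, n - 77],  # Oct 2011-Jan 2012, automatic device: 77 new AF diagnoses
]
chi2, p, dof, _ = chi2_contingency(table, correction=False)
print(f"chi2 = {chi2:.2f}, df = {dof}, p = {p:.4f}")
```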
Discussion
Atrial fibrillation can be difficult to diagnose as it is often asymptomatic and intermittent (paroxysmal). The irregularity of the heart rhythm can be detected by palpation of the pulse; it may therefore be detected in patients who present with symptoms such as palpitations, dizziness, blackouts and breathlessness, but it may also be an incidental finding in asymptomatic patients during routine examination. The diagnosis must be confirmed with an ECG, which should be performed in all patients, whether symptomatic or not, in whom atrial fibrillation is suspected because of the detection of an irregular pulse. Heart rhythm can be evaluated while measuring BP with traditional sphygmomanometers, whereas this information may be lost with automatic devices; the use of automatic devices with algorithms that can detect possible AF is therefore an appealing option. The hypothesis that these devices are equal or superior to systematic pulse palpation is currently under investigation by NICE². At present, the consequences of switching from the classical Riva-Rocci devices to these new ones in usual care are not known. Opportunistic AF screening in people aged >65 years yields a detection rate of 1.63%, while usual care has a detection rate of 1.04%, very similar to that observed in our hypertensive population (1.13%)³. Our data show that, at least in the short term, switching from the usual device to an automatic device with an algorithm for irregular beat detection increases the identification rate of previously unknown AF in the hypertensive population. While waiting for a formal appraisal, GPs who wish to give up, or must give up, their “old” Riva-Rocci devices can use this device while maintaining their “usual care” performance.
Mycobacterium tuberculosis was first isolated on 24th March 1882 by the German physician Robert Koch, who received a Nobel Prize for this discovery in 19051. Tuberculosis is one of the oldest diseases in the history of mankind, with evidence of tubercular decay found in some Egyptian mummies from 3000-2400 BC2. The study of tuberculosis was also known as phthisiatry, from phthisis, the Greek term for tuberculosis. Hippocrates identified phthisis as the most widespread disease of his time; it involved the coughing up of blood and fever, and was almost always fatal3. Avicenna first identified that pulmonary TB was an infectious disease and developed the method of quarantine to limit the spread of the disease4,5. The disease was given the name tuberculosis in 1839 by J.L. Schönlein6.
Burden Of Disease
Tuberculosis (TB) is an infectious disease caused by various strains of mycobacteria, of which the commonest is Mycobacterium tuberculosis 7. The disease can affect any part of the human body but commonly attacks the lungs. One third of the world's current population has been infected by Mycobacterium tuberculosis, and new infections occur at a rate of one per second 8. About 5-10% of these infections lead to active disease which, if left untreated, kills about 50% of its victims. TB affects approximately 8 million people worldwide and about 2 million people die of this disease annually. In the 19th century pandemic, tuberculosis killed about a quarter of the adult population of Europe 9. Nevertheless, these figures may be only the tip of the iceberg. Tuberculosis is again on the rise, and the main cause of this resurgence is immunodeficiency as a result of HIV co-infection or, less commonly, immunosuppressive treatment such as chemotherapy or corticosteroids.
Introduction To Mycobacteria
Mycobacteria are aerobic and non-motile bacteria (with the exception of Mycobacterium marinum, which is motile within macrophages) which are characteristically acid-alcohol fast 10. They are widely present in the environment, in water and various food sources. They are usually considered to be Gram-positive bacteria, but they do not generally retain the crystal violet stain and are thus called Gram-positive acid-fast bacteria. These acid-fast bacilli (AFB) are straight or slightly curved rods 0.2-0.6 µm wide and 1-10 µm long. Mycobacteria are classified on the basis of growth and their ability to produce pigment.
On the basis of growth:
Rapid growing: Mycobacteria that form colonies clearly visible to the naked eye within 7 days on sub-culture
Slowly growing: Mycobacteria that do not form colonies clearly visible to the naked eye within 7 days on sub-culture
On the basis of pigmentation mycobacteria are divided into 3 groups:
Photochromogens (Group I): Produce non-pigmented colonies in the dark and pigmented colonies when exposed to light and re-incubated, e.g., M. kansasii, M. marinum etc
Scotochromogens (Group II): Produce deep yellow to orange colonies when grown in the presence of either light or darkness e.g., M. scrofulaceum, M. xenopi etc
Non-chromogens (Groups III & IV): Non-pigmented in both light and dark, or with only a pale yellow, buff or tan pigment that does not intensify after exposure to light, e.g., M. tuberculosis, M. avium-intracellulare, M. ulcerans etc
For clinical purposes, mycobacteria are divided into 3 main classes:
Mycobacterium tuberculosis complex: These are the mycobacteria which can cause TB and include M. tuberculosis, M. bovis, M. pinnipedii, M. africanum, M. microti and M. canetti.
Mycobacterium leprae causes leprosy, also known as Hansen’s disease.
Non-tuberculous mycobacteria (NTM), also known as environmental mycobacteria, atypical mycobacteria or mycobacteria other than tuberculosis (MOTT). These include all other mycobacteria, which can cause pulmonary disease resembling tuberculosis, lymphadenitis, skin disease or disseminated disease. Examples include Mycobacterium avium complex, Mycobacterium abscessus, Mycobacterium fortuitum and M. kansasii, some of which can cause tuberculosis-like disease in other mammals.
Spread Of Tuberculosis
Today we know that TB is an airborne and highly infectious disease. A person becomes infected when he or she inhales Mycobacterium tuberculosis suspended in the air as micro-droplets. Patients suffering from pulmonary TB who have detectable Mycobacterium tuberculosis in their sputum are known as smear-positive cases of pulmonary TB. The bacterial load in sputum can be as high as 10,000,000 bacilli/mL. When such smear-positive patients cough, sneeze or expectorate, they produce micro-droplets of phlegm containing Mycobacterium tuberculosis (MTB). These micro-droplets vary from 0.5 to 5 µm in diameter and can remain suspended in the air for up to 8 hours or even longer (depending upon droplet size and environmental conditions, including air flow). A single sneeze can produce up to 40,000 of these droplets 11. MTB cannot invade the mucous membranes of the respiratory tree and must reach the alveoli, where it replicates. An MTB-containing micro-droplet must be <1 µm in size to be carried to the end of the bronchial tree; otherwise it is deposited on the walls of the bronchial tree and cleared away by mucociliary action. Current knowledge asserts that even fewer than 10 bacilli may cause pulmonary infection 12 & 13. A sputum smear-positive TB patient, if left untreated, can infect 10-15 new people each year.
Definition of TB contacts: People exposed to someone with infectious TB, generally including family members, roommates or housemates, close friends, coworkers, classmates, and others. They are a high priority group for latent-TB infection (LTBI) treatment as they are at high risk of being infected with TB.
Definition of close TB contacts: A person who had prolonged, frequent, or intense contact (i.e. >8 hours/day) with a person with sputum-positive TB while he or she was infectious. Such contacts are more likely to become infected with TB than contacts who see the patient less often.
Pathogenesis
Once in the distal bronchial tree, MTB is engulfed by a macrophage, within which it starts to replicate. Depending upon genetic factors, these macrophages provide a variable environment for the replication of MTB. If this primary infection starts with a single mycobacterium and the initial host response is incapable of halting the process, within weeks or months there will be millions of tubercle bacilli in the body. From this primary site, MTB spreads first to the hilar-mediastinal lymph nodes. When seen on the X-ray, the primary focus of pulmonary infection is called a Ghon focus. It is generally located in the upper lobe or the apical segment of the lower lobe 7. The Ghon focus plus an enlarged hilar-mediastinal node is called a Ghon complex. Tubercle bacilli enter the thoracic duct from the hilar-mediastinal lymph nodes and then, passing via the subclavian vein and right atrium, gain access to the pulmonary and systemic circulations. As a result MTB can access, and subsequently infect, any organ of the body. Immunocompetent hosts normally generate an effective immune response within 3-8 weeks, which tackles the primary Ghon focus and can cause involution of the lesions throughout the body. This immune response is a delayed-type hypersensitivity reaction to the cell wall protein of the bacilli; it is also responsible for the positive tuberculin skin test, which appears 4-12 weeks after infection. The primary immune response is not, however, sufficient to sterilise the tissues, and MTB can remain dormant in these foci. Latent foci may persist in the lungs or other organs of the body and are capable of producing disease reactivation, which may be pulmonary or extra-pulmonary. In some cases where the initial host response is not capable of causing involution of the primary disease (such as infancy or an immunocompromised state), the infection proliferates and spreads, causing so-called "progressive primary disease".
Mycobacterium bovis is a mycobacterium that causes tuberculosis in cattle but which can also infect humans. It can be transmitted from cattle to humans by ingestion of infected milk and, very rarely, by inhalation of animal aerosol micro-droplets or by eating infected raw meat. The process of pasteurisation kills M. bovis and other bacteria in milk, meaning that infections in humans are rare 14.
When To Suspect Tuberculosis
Primary Tuberculosis: Tuberculosis caused by infection with tubercle bacilli and characterised by the formation of a primary complex in the lungs, consisting of a small peripheral pulmonary focus and hilar or para-tracheal lymph node involvement; it may cavitate and heal with scarring, or progress. It is mainly seen in children, but 10% of adults suffering from pulmonary TB have primary infection.
Reactivation Tuberculosis: Also known as chronic TB, post-primary disease, recrudescent TB, endogenous reinfection, and adult type progressive TB. It represents 90% of adult cases (in a non-HIV population), and is due to reactivation of dormant AFBs which are seeded at the time of the primary infection. The apical and posterior segments of the upper lobe and superior segment of the lower lobe of the lung are frequently involved.
Clinical Features: Symptoms and signs vary greatly, as do radiological findings. A literature review showed that the common signs and symptoms seen in TB infection were 15, 16, 17, 18:
Cough, which can be either productive or non-productive; it is often initially a dry cough which can later become productive.
Fever, seen in about 70% of cases; generally it is low grade but can be as high as 39°C, lasting for 14 to 21 days, and in 98% of cases it resolves completely by 10 weeks.
Night sweats, seen in about 50% of cases
Weight loss
Pleural effusion: 50% of the patients with pleuritic chest pain had pleural effusion
Chest pain: mainly pleuritic with some patients describing retrosternal and inter-scapular dull pain occasionally worsened by swallowing. This pain is believed to be due to enlarged bronchial/ mediastinal lymph nodes
Dyspnoea can be present in 33% of cases
Haemoptysis can be seen in 25% of cases
Fatigue
Arthralgia
Pharyngitis
Common radiological findings were as follows:
Hilar lymphadenopathy: can be seen as early as 1 week after tuberculin skin test conversion, and in almost all cases within 2 months. It can be associated with right middle lobe collapse
Pleural effusion: typically within the first 3-4 months but can be seen as late as one year
Pulmonary infiltrates mainly in the upper zones and peri-hilar areas
How To Investigate 19
HIV testing should be done in all patients presenting with clinical features of tuberculosis
Active Pulmonary TB
CXR: Perform a PA chest X-ray. If the appearance is suggestive of active tuberculosis, perform further investigations
Sputum smear & culture for AFB: send at least 3 sputum samples for AFB smear and culture, including at least one early-morning sample. Ideally this should be done before starting treatment, or within 7 days of starting treatment.
If clinical features and CXR are suggestive of active TB, do not wait for culture and sensitivity results; start the patient on the 4-drug initial treatment. This can be modified according to culture results later on.
Active Non-Respiratory TB
A tissue sample should be taken from the suspected non-respiratory site and sent for histological analysis, AFB smear and culture analysis. Common examples of non-respiratory tuberculosis are tuberculous lymphadenopathy, tuberculous meningitis and disseminated tuberculosis.
Physicians should think about CNS tuberculosis such as TB meningitis if a patient with risk factors (i.e., immigrants from endemic areas, positive history of close contact etc) presents with signs and symptoms such as headache, low grade fever, photophobia and/ or focal neurological signs. Lumbar puncture (LP) after a CT brain to rule out any contra-indication for LP may yield the diagnosis in these scenarios. An MRI brain is also very sensitive for picking up tuberculomas in such cases.
Latent TB
Offer Mantoux testing to household contacts and close contacts (aged 5 and older) of the person with active TB. If the Mantoux is positive, or if results are unreliable, as can be the case in BCG-vaccinated persons, consider interferon-gamma testing (T-SPOT.TB test). If the Mantoux is inconclusive, the patient should be referred to a TB specialist. A similar approach should be used for new-entrant TB screening.
QuantiFERON-TB Gold (QFT-G) Test & QuantiFERON-TB Gold in Tube (QFT-GIT) Test
Both of these tests have replaced the QuantiFERON-TB (QFT) test. Each is an interferon gamma release assay (IGRA) and measures a component of cell-mediated immune reactivity to Mycobacterium tuberculosis. In the QFT-G test a blood sample is mixed with antigens (two Mycobacterium tuberculosis proteins) and a control. The mixtures are incubated for 16 to 24 hours and the amount of interferon gamma is then measured. If the patient is infected with Mycobacterium tuberculosis, their white blood cells will release interferon gamma when they come into contact with the TB antigens. Clinical features, chest X-ray and sputum/tissue smear and culture for AFB are needed to differentiate between active and latent TB.
Its advantages over tuberculous skin testing are:
This test requires a single patient visit to draw a sample
Results are available within 24 hours
Results are not dependent on reader
It is not affected by prior BCG vaccination
Its limitations/ disadvantages include:
The blood sample must be processed within 12 hours of collection (while white cells are still viable)
There is limited data for use of QFT-G in immune-compromised patients, children under 17 years of age and persons recently exposed to MTB
False positive results may occur with Mycobacterium szulgai, M. kansasii and M. marinum infection
QFT-GIT is a modification of the QFT-G test. It consists of 3 blood collection tubes containing: 1) no antigen, 2) TB antigen, and 3) mitogen. These tubes must be transferred to an incubator within 16 hours of blood collection, and interferon gamma detection is then carried out via ELISA. Its specificity varies from 96-99%, and its sensitivity is as high as 92% in individuals with active disease.
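To illustrate what such operating characteristics mean in practice, the sketch below converts the quoted sensitivity and specificity into predictive values. It is our own illustration, not from the source: the 10% pre-test prevalence is an assumed figure, and the function name is ours.

def predictive_values(sens, spec, prevalence):
    # Positive and negative predictive values from sensitivity,
    # specificity and pre-test prevalence (all as fractions)
    tp = sens * prevalence
    fp = (1 - spec) * (1 - prevalence)
    fn = (1 - sens) * prevalence
    tn = spec * (1 - prevalence)
    return tp / (tp + fp), tn / (tn + fn)

# QFT-GIT figures from the text (sensitivity 92%, specificity ~97%),
# with a 10% prevalence assumed purely for illustration
ppv, npv = predictive_values(0.92, 0.97, 0.10)
print(f"PPV = {ppv:.1%}, NPV = {npv:.1%}")

With these assumptions the positive predictive value is about 77% and the negative predictive value about 99%; at lower pre-test prevalences the positive predictive value falls considerably.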
T-Spot TB Test
It is a type of ELISPOT assay, developed by researchers at the University of Oxford in England. It counts the number of effector T-cells in the blood that produce interferon gamma, and so gives an overall measure of the antigen load on the immune system. As it does not depend upon production of antibody or a recoverable pathogen, it can be used to detect latent TB, and it is much faster. In one study its sensitivity was found to be 97.2% 20.
Treatment Of Tuberculosis (Caused By Mycobacterium Tuberculosis)
Active TB, if left untreated, will kill 2 of every 3 people affected, and disseminated TB is 100% fatal if untreated. For the treatment of TB, drugs are used in combination and never singly. Patients require regular supervision during treatment to monitor compliance and medication side effects. Treatment of atypical mycobacterial infections needs special care and complicated drug regimens, and should be under the care of specialised units. Drugs for the treatment of TB are divided into 3 categories:
1st Line Drugs: 1st line anti-TB drugs are very effective against TB. There are 5 first line drugs, each with a standard 3-letter and 1-letter abbreviation.
Rifampicin is RMP or R
Isoniazid is INH or H
Ethambutol is EMB or E
Pyrazinamide is PZA or Z
Streptomycin is STM or S
Using a single drug usually results in treatment failure and drug-resistant strains 21. The frequency with which Mycobacterium tuberculosis develops spontaneous mutations conferring resistance to an individual drug is well known: 1 in 10⁷ for EMB, 1 in 10⁸ for STM & INH, 1 in 10¹⁰ for RMP 22. A patient with extensive pulmonary TB usually has about 10¹² bacteria in the body and hence will have about 10⁵ EMB-resistant bacteria, 10⁴ STM-resistant bacteria, 10⁴ INH-resistant bacteria and 10² RMP-resistant bacteria. Drug-resistant tuberculosis occurs when drug-resistant bacilli outgrow drug-susceptible bacilli. Mutations can produce bacilli resistant to any of the anti-tuberculosis drugs, although they occur more frequently for some drugs than others. The average mutation rate in M. tuberculosis for resistance to isoniazid is 2.56 × 10⁻⁸ mutations per bacterium per generation; for rifampicin, 2.25 × 10⁻¹⁰; for ethambutol, 1.0 × 10⁻⁷; and for streptomycin, 2.95 × 10⁻⁸. The mutation rate for resistance to more than one drug is calculated by multiplying the rates for the individual drugs. For example, the mutation rate for resistance to both isoniazid and rifampicin is approximately 2.56 × 10⁻⁸ times 2.25 × 10⁻¹⁰, or 5.76 × 10⁻¹⁸. The expected ratio of resistant bacilli to susceptible bacilli in an unselected population of M. tuberculosis is about 1:10⁶ each for isoniazid and streptomycin and 1:10⁸ for rifampicin. Mutants resistant to both isoniazid and rifampicin should occur less than once in a population of 10¹⁴ bacilli. Pulmonary cavities contain about 10⁷ to 10⁹ bacilli; thus, they are likely to contain a small number of bacilli resistant to each of the anti-tuberculosis drugs but unlikely to contain bacilli resistant to two drugs simultaneously 23.
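The resistance arithmetic above can be reproduced directly. The following sketch is ours, using only the mutation rates quoted in the text; it shows why a cavity containing 10⁷-10⁹ bacilli will harbour single-drug-resistant mutants but essentially never dual-resistant ones.

# Single-drug mutation rates quoted in the text
rates = {
    "isoniazid": 2.56e-8,
    "rifampicin": 2.25e-10,
    "ethambutol": 1.0e-7,
    "streptomycin": 2.95e-8,
}

# Resistance to two drugs requires independent mutations, so the
# combined rate is the product of the single-drug rates
both = rates["isoniazid"] * rates["rifampicin"]
print(f"INH + RMP combined mutation rate: {both:.2e}")  # ~5.76e-18

# Expected dual-resistant mutants in a large cavity (assumed 1e9 bacilli)
print(f"Per cavity of 1e9 bacilli: {1e9 * both:.1e} mutants")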
There are different regimens available for the treatment of TB. The initial 2 months of treatment (usually rifampicin-based) is called the Initial Phase or Intensive Phase, which is followed by the Continuation Phase. The initial intensive phase is designed to kill actively growing bacteria. Drugs are listed using their single-letter abbreviations; a prefix denotes the number of months a treatment is given, and a subscript denotes intermittent dosing. For example, 2RHEZ/4RH3 = 2 months of initial-phase treatment with rifampicin, isoniazid, ethambutol and pyrazinamide, followed by 4 months of continuation-phase treatment with rifampicin and isoniazid given 3 times per week. If there is no subscript, the drugs are given daily (a short parsing sketch follows the list of regimens below).
Usual anti-TB regimens are:
2RHEZ/4RH3 (in less endemic areas)
2RHEZ/4RH (mostly practised, especially in non-endemic areas including UK); standard recommended regimen 24
2RHEZ/7RH (in most endemic areas)
2RHEZ/10RHE (in cases of disseminated, bone and CNS tuberculosis)
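As an illustration of the shorthand described above, the sketch below parses regimen strings such as 2RHEZ/4RH3 into phases. It is our own sketch: the function name and drug dictionary are not from the source, and it covers only the five first-line single-letter abbreviations.

import re

DRUGS = {"R": "rifampicin", "H": "isoniazid", "E": "ethambutol",
         "Z": "pyrazinamide", "S": "streptomycin"}

def parse_regimen(regimen):
    # Each phase is <months><drug letters><optional doses-per-week>
    phases = []
    for phase in regimen.split("/"):
        months, letters, freq = re.fullmatch(
            r"(\d+)([RHEZS]+)(\d?)", phase).groups()
        phases.append((int(months),
                       [DRUGS[c] for c in letters],
                       f"{freq}x per week" if freq else "daily"))
    return phases

for phase in parse_regimen("2RHEZ/4RH3"):
    print(phase)
# (2, ['rifampicin', 'isoniazid', 'ethambutol', 'pyrazinamide'], 'daily')
# (4, ['rifampicin', 'isoniazid'], '3x per week')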
2nd Line Drugs 25 & 26: These are less effective than 1st line drugs, have more toxic side effects, and are usually not available in most developing countries. There are 6 classes of 2nd line anti-TB drugs.
3rd Line Drugs: These are drugs which may be useful but are not on the WHO list of second line drugs, as they are not as effective. 3rd line drugs include:
Rifabutin (an effective drug, but too expensive for developing countries, so it is not included in the WHO list). It can occasionally be used for patients who are intolerant of rifampicin or whose organisms are resistant to it.
Macrolides: Clarithromycin (CLR), Azithromycin
Linezolid: (LZD) not of proven efficacy
Thioacetazone (T)
Thioridazine
Arginine
Vitamin D
R207910: efficacy not proven
Indications of Steroids in the treatment of TB
Steroids should be used along with anti-TB drugs in the following situations:
CNS TB (proven benefit)
TB pericarditis (proven benefit)
TB involving eye (definitely beneficial)
TB pleuritis (beneficial – 20-40mg tapered over 4-8 weeks)
Extremely advanced TB (beneficial)
TB in children (may be beneficial)
Miliary TB (beneficial)
Genitourinary TB (beneficial)
Laryngeal TB (may be beneficial – scanty evidence)
TB peritonitis (may be beneficial – scanty evidence)
Important Definitions / Terms 25, 27, 28, 29
New Case: A patient diagnosed as having TB who has never had anti-TB treatment before or had taken anti-TB treatment for less than 4 weeks.
Sputum Smear Positive Case of Pulmonary TB: A patient who has 2 out of 3 consecutive sputum samples positive for AFB.
Sputum Smear Negative Case of Pulmonary TB: A patient clinically and radiologically suspected to have pulmonary TB, whose 3 consecutive sputum samples are negative for AFB and who is also culture negative for AFB.
Culture Positive Case of Pulmonary TB: A patient with 3 consecutive sputum smear samples which are negative for AFB but with at least 1 specimen positive for AFB in culture.
Short Course Therapy for TB: The short course therapy for the treatment of TB comprises 2RHEZ/4RH and is also known as the standard regimen. If PZA is not included in the regimen, the course should be extended from 6 months to 9 months. If rifampicin is not included in the treatment regimen, then the total length of the course should be 18 months.
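The duration rules in this definition can be summarised in a few lines. The sketch below is a simplified encoding of the stated rules only (the function name is ours), not a prescribing tool.

def total_course_months(has_pza, has_rmp):
    # Rules as stated above: the standard short course is 6 months
    if not has_rmp:
        return 18  # no rifampicin: 18 months in total
    if not has_pza:
        return 9   # no pyrazinamide: extend from 6 to 9 months
    return 6       # standard 2RHEZ/4RH

print(total_course_months(has_pza=True, has_rmp=True))   # 6
print(total_course_months(has_pza=False, has_rmp=True))  # 9
print(total_course_months(has_pza=True, has_rmp=False))  # 18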
Treatment Failure: A TB patient is said to have treatment failure if they remain smear or culture positive at the 5th month while on treatment, or if they were initially smear positive, became negative, but then reverted to positive by the end of 5 months of treatment. Another scenario is that of a patient who was initially smear negative but becomes smear positive after 2 months of treatment. Important things to note are:
Never add a single drug to a failing anti-TB regimen
Most cases are due to non-compliance
There is a high chance of Mycobacterium developing resistance to anti-TB drugs
Relapse of TB: A patient is said to have a relapse of TB if they were treated and declared cured but are again smear or culture positive with the same organism. If the patient is infected with a new strain of MTB, they are deemed to be a new case. Because genetic analysis of the infecting MTB is required to determine whether re-infection is with the same organism or a new one, it is difficult to diagnose TB relapse accurately.
TB Default Case: A TB patient who completed 1 month of anti-TB treatment, stopped the treatment, and then returns for TB treatment more than 2 months after treatment was first initiated. If the patient returns within 2 months of initial treatment, his/her initial regimen should be continued.
Re-treatment Regimen: A patient should be given a re-treatment regimen when they relapse or are a TB default case. In areas highly endemic for TB, most authorities prefer an initial intensive phase with 5 drugs for 3 months (2 months of RHEZS and 1 month of RHEZ).
Chronic Case of TB: A patient who remains sputum smear positive after one re-treatment course is said to be a chronic case of TB. Such patients invariably have drug-resistant TB.
Extra-pulmonary TB: TB involving organs other than lungs is called extra-pulmonary TB. For the purpose of treatment and understanding, TB of the central nervous system is excluded from this classification.
Pulmonary TB: Tuberculosis involving lungs is called pulmonary TB.
CNS Tuberculosis: TB can involve the meninges, brain and spinal cord, causing TB meningitis, cerebritis and myelitis respectively. Standard treatment lasts 12 months and steroids are mandatory. INH & PZA achieve 100% penetration into the CSF.
Miliary Tuberculosis: This is a complication of 1-3% of all TB cases. Tuberculosis involving 2 or more organs/systems of the body is called disseminated or miliary TB. It is also called tuberculosis cutis acuta generalisata and tuberculosis cutis disseminata. It is a form of tuberculosis characterised by wide dissemination and by the tiny size of the lesions (1-5 mm). Its name comes from the distinctive pattern seen on chest X-ray of many tiny spots distributed throughout the lung fields, similar in appearance to millet seeds; hence the term "miliary" tuberculosis. Miliary TB may involve any number of organs, including the lungs, liver and spleen.
MDR-TB: Multi-drug Resistant TB (MDR-TB) is defined as TB caused by Mycobacterium tuberculosis resistant to isoniazid and rifampicin. The diagnosis and appropriate treatment of MDR-TB remain a major challenge.
XDR-TB: Extensively-drug Resistant TB (XDR-TB) is defined as TB caused by Mycobacterium tuberculosis resistant to isoniazid, rifampicin, quinolones and any 1 of the 3 injectables: kanamycin, capreomycin or amikacin.
Treatment Categories of TB Patients:
There are four treatment categories of TB patients; for details see Table 1.
Table 1
Treatment Category | Type of TB Patient
Category I | New sputum smear +ve pulmonary TB; smear -ve pulmonary TB cases with extensive parenchymal involvement; new severe extra-pulmonary TB cases
DOTS (Directly Observed Treatment, Short-course):
In this programme a trained person observes the patient swallowing the tablets, preferably for the whole course of treatment or at least for the initial 2 months. Daily or thrice-weekly dosing is recommended, but twice-weekly dosing is not, because omitting one dose (by mistake or by chance) would leave a once-weekly dose, which is not acceptable. WHO recommends the DOTS strategy in an attempt to control tuberculosis. There are 5 main points of action:
Government commitment to control TB
Diagnosis based on sputum smear microscopy tests done on patients who actively report TB symptoms
Direct observation short course chemotherapy treatment
Definite supply of drugs
Standardized reporting and recording of cases and treatment outcomes
DOTS-Plus:
WHO extended the DOTS programme in 1998 to include the treatment of MDR-TB; this is called DOTS-Plus. It requires laboratory capacity for mycobacterial identification and drug susceptibility testing, and the provision of 2nd line anti-TB drugs.
Latent TB Infection (LTBI):
A patient is said to have LTBI when he or she is infected with MTB but has no symptoms or signs suggestive of active TB and has a normal chest X-ray. Such patients are non-infectious, but 10% of them go on to develop active TB at a later stage in life. They have a positive tuberculin skin test and positive Interferon Gamma Release Assay (IGRA) tests (e.g. T-SPOT.TB, QuantiFERON-TB Gold & QuantiFERON-TB Gold-in tube tests). There are different regimens for the treatment of LTBI; those commonly used are the following:
9H; 9 months INH (gold standard – only practised in USA)
6H; 6 months INH
3RH; 3 months INH + RMP (recommended in UK)
Common Causes Of Rising Burden Of Tuberculosis
The following are a few causes of the rising global burden of TB:
Non-compliance with medication
Presence of drug resistant strains of mycobacteria
Faulty regimens
Un-diagnosed cases
Under-diagnosed cases
Lack of newer, more effective anti-TB medication.
Role Of PCR In The Diagnosis Of Tuberculosis
There have been a number of studies on the role of PCR in the diagnosis of TB. They show that it has high sensitivity and specificity, but the gold standard is still tissue smear and culture for AFB. In certain scenarios, PCR of different samples (pulmonary or extra-pulmonary tissue, urine, CSF, sputum and blood) can be useful, and can also detect mycobacterial rifampicin resistance.
Role Of Physicians In Prevention & Control Of Tuberculosis In Relation To Air Travel 30
Inform all patients with infectious TB that they must not travel by air on a flight exceeding 8 hours until they have completed at least 2 weeks of adequate therapy.
Inform all patients with MDR-TB and XDR-TB that they must not travel by air until they are culture-negative.
Advise patients with TB who undertake unavoidable air travel of less than 8 hours’ duration to wear a surgical mask or to otherwise keep the nose and mouth covered when speaking or coughing during the flight. This recommendation should be applied on a case-by-case basis and only with the agreement of the airline(s) involved and the public health authorities at departure and arrival.
Inform relevant health authorities of the intention of a patient with infectious TB to travel against medical advice.
Inform relevant health authorities when a patient with infectious TB has a recent history of air travel (travel within 3 months).
Side Effects Of Medications Used For Treatment Of Tuberculosis 31, 32, 33, 34
Patients on treatment for TB should be monitored regularly for any signs of medication toxicity. This may include blood tests in addition to clinical examination. Common side effects of the 4 routinely used anti-TB medications (INH, rifampicin, ethambutol & PZA) are as follows:
Hepatotoxicity: INH, PZA and rifampicin are known to cause liver toxicity; ethambutol is safer in patients with known liver problems. INH is contraindicated in patients with active hepatitis and end-stage liver disease. Up to 20% of patients can have an asymptomatic rise in AST concentration in the first 3 months of therapy. Symptoms of liver toxicity include anorexia, nausea, vomiting, dark urine, jaundice, fever, persistent fatigue and abdominal pain, especially in the right upper quadrant. Routine baseline LFTs are recommended prior to starting treatment. They should then be repeated at least once a month, and more frequently in those at risk of developing hepatotoxicity. Patients at increased risk of hepatotoxicity include:
HIV positive
Pregnant or post-partum (3 months after delivery)
History of or at risk of chronic liver disease (daily use of alcohol, IV drug users, hepatitis, liver cirrhosis)
Patients taking any other medication which have potential hepatotoxic side effects
The risk of hepatotoxicity increases with age (> 35 years old)
Suspect drug-induced liver injury if there is an AST/ALT rise of more than 3 times baseline with symptoms, or more than 5 times baseline in the absence of symptoms, or a disproportionate rise in ALP and total bilirubin (a short sketch encoding this threshold rule follows the action list below). In such a situation:
Stop hepatotoxic anti-TB medications (INH, rifampicin and PZA) immediately
Admit the patient to hospital
Carry out serological tests for Hepatitis A, B, and C (particularly in those who are at risk for hepatitis)
Look for other causes (hepatotoxic medications, high alcohol consumption)
In acutely ill smear- or culture-positive patients, start liver-friendly medications, i.e. ethambutol, quinolones and streptomycin, until the cause of the hepatotoxicity is identified.
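The threshold rule above lends itself to a simple encoding. The sketch below is illustrative only (the names are ours) and deliberately omits the ALP/bilirubin branch, which requires clinical judgement.

def suspect_dili(transaminase_ratio_to_baseline, symptomatic):
    # Suspect drug-induced liver injury if the AST/ALT rise is more
    # than 3x baseline with symptoms, or more than 5x without symptoms
    if symptomatic:
        return transaminase_ratio_to_baseline > 3
    return transaminase_ratio_to_baseline > 5

print(suspect_dili(4.0, symptomatic=True))   # True: >3x with symptoms
print(suspect_dili(4.0, symptomatic=False))  # False: needs >5x if asymptomatic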
Re-challenge: Once LFTs are normal (or less than twice the upper limit of normal), start with ethambutol and add INH first. If LFTs do not rise after 1 week, add rifampicin. Next, add PZA if there is no rise in LFTs after 1 week of rifampicin. If at any point LFTs increase or symptoms recur, stop the last drug added, as this is the culprit drug.
Gastro-intestinal (GI) upset: GI upset is quite common with anti-TB medications and usually occurs in the first few weeks of therapy. Symptoms are usually nausea, vomiting, anorexia and abdominal pain. In such cases recommend good hydration, change the timing of medication (advise taking it with a light snack and at bedtime), and check LFTs for possible hepatitis. Aluminium salt-containing antacids can reduce the bioavailability of INH, so avoid them from 1 hour before to 2 hours after INH administration.
Rash: All anti-TB medications can cause a skin rash. Management is based on severity:
Mild rash or itching: administer antihistamines 30 minutes prior to the anti-TB medications and continue therapy. If there is no improvement, add prednisolone 40mg/day and taper gradually once the rash clears.
Petechial rash: red pinpoint-sized dots under the skin due to leakage from capillaries; suspect rifampicin hypersensitivity. Monitor LFTs and the full blood count. If the platelet count falls below baseline, stop rifampicin and do not restart it.
Erythematous rash with fever and/or mucous membrane involvement: stop all anti-TB medications immediately and hospitalise the patient. Rule out anaphylaxis (angio-oedema, swollen tongue or throat, stridor, wheezing, flushed face, hypotension) and Stevens-Johnson syndrome (systemic shedding of mucous membranes and fever). If the situation does not permit stopping TB medication, then try 3 new drugs, i.e. an aminoglycoside and 2 oral agents from the second line. Once the rash has settled, first-line TB medications can be re-introduced one by one every 2-3 days: first rifampicin, then INH, then PZA and then ethambutol. During re-introduction, monitor for signs and symptoms of rash; if the rash recurs at any point, remove the last agent added.
Peripheral neuropathy: signs and symptoms include numbness and tingling in the feet and hands, increased sensitivity to touch, and stabbing pain. INH can cause peripheral neuropathy. It is more common in malnourished people, diabetes, HIV, renal failure, alcoholism, pregnancy and breastfeeding women. Prevention is the key: prophylaxis with pyridoxine (vitamin B6), 10mg per 100mg of INH (normally 25-50mg) per week, is used in high-risk patients.
Optic neuritis: the main agent responsible is ethambutol. It is dose-related and intensifies if treatment is continued. Signs and symptoms include difficulty reading road signs, decreased red-green colour discrimination, blurring of vision, and colour blindness; these can be unilateral or bilateral. Ethambutol is not recommended in children under 5 years of age, as visual changes are difficult to monitor. Visual acuity and colour vision tests are recommended at baseline and monthly thereafter. A fluctuation of 1 or 2 lines on the Snellen chart is significant and ethambutol must be stopped. Visual loss of more than 10% is also considered significant.
Fatigue: INH can cause fatigue; in such situations patients should take the medication at bedtime. If it continues, check LFTs to look for hepatotoxicity.
Flu-like symptoms/joint aches and pains: These are usually seen with Rifampicin and treatment is symptomatic.
Drug-induced lupus: This is seen with INH, and blood tests should be done to differentiate it from SLE. It can be managed with steroids while the patient continues to take INH.
It seems that psychiatry is gradually losing its allure for future doctors. All around, one can detect an air of pessimism from colleagues about the creeping 'socialisation' of this important field of medicine. There is no longer the breadth of interest in the subject and each sub-branch, for want of a better expression, has its followers and adherents. Proponents of one particular facet of treatment are zealous in the pursuit of their own interests. Psychotherapy is pitched against the neurobiological, rehabilitation and social psychiatry against the pharmacologists, all trying to mark out their own piece of territory, with some yearning for a place in the history books, or at the very least, an acronym. Some psychiatrists do not believe in diagnoses; others ridicule the concept of personality disorder, autistic spectrum disorder, or drug treatment; some believe psychiatric illness is the fault of governments, and there are probably a few who do not believe in psychiatry at all! 'Research' studies are cherry-picked by all sides to illustrate the ineffectiveness of 'alternative' treatments. The full picture or perspective of ill health is blurred and narrowed by a minority who believe they alone know what is right for patients, and psychiatry is 'intellectualised' by others to give it an air of authority and profundity it does not possess. Morale and training are suffering and, if this state of chaos and insanity continues, the discipline itself will implode and cease to be of interest to anyone, save the warring factions in the profession itself.
Once upon a time it was considered that a reasonably broad mixture of community and hospital services would provide benefit for patients with mental illness. Staff involved in their care, who have the rather cumbersome and oxymoronic description of being called 'mental' health professionals, would also widen their experience because of the continuity of care provided. It was hoped patients who clearly did not need to be in hospital (for example, waiting for appropriate accommodation) could be discharged. Clinical need would determine those who required further rehabilitation/treatment in hospital, and would not be swayed by pressures, often financial, to discharge. Now, with the setting up of Home Treatment and other teams the situation has ironically worsened, because there is an implicit opinion in this arrangement that hospital admission, even for the seriously ill or indeed violent patients, is the least desirable option and something to avoid at all cost, even when care in the community is not immediately available or adequate. Care provision for the elderly is a separate concern and is not under discussion here.
In the domain of general adult psychiatry those patients who are in need of care, be it medical or social, are languishing at home, desperate for help, being offered assessment after assessment by disparate teams. There are not enough care professionals to cope with the demand. Home Treatment Teams in particular, are under considerable sustained pressure and stress to ensure further reduction in beds. Rehabilitation beds are being closed. 'It is cheaper to keep patients in the community', we are told. Or, if that does not suit, the liberal stance might be, 'What does hospital admission achieve'? That's fine if the problem is not on your doorstep. Psychiatrists who oversee inpatient care are also pressurised to discharge patients as soon as possible, so the very old notion of 'incarceration' (that worn-out cliché from the antipsychiatry lobby) seems facile, to say the least. On the contrary, doctors now have the added worry of prematurely discharging partially treated ('we need the beds') as well as more vulnerable patients who cannot cope. Most patients who take up psychiatric hospital beds do not want to be in hospital in the first place as they often, rightly or wrongly, do not see themselves as ill. Many hospital beds are now occupied by 'Section patients', and conversely, many very ill patients are left to go it alone because they refuse hospital admission and do not want community team involvement, yet are not 'sectionable'. The inference seems to be, 'If not sectionable or under CPA (Care Programme Approach) it is not our concern.'
Where there are sufficient provisions for outpatient care, some of the damage may be mitigated. Overworked staff including community psychiatric nurses (CPNs), support time recovery workers (STRs) and occupational therapists (OTs) often have the thankless task and enormous responsibility of seeing patients at home, some of whom are threatening and potentially dangerous, others erratic in their outpatient clinic attendance, not always through deliberate evasion but often as a result of the very condition causing the problem, for example, lack of insight. Other patients do not engage, either through hostility or loss of motivation induced by the underlying problem, say, drug and alcohol misuse. Chronic patients are not ill enough to be on CPA, and diagnostic 'conundrums' are left to others to sort out. With the introduction of the New Ways of Working,1 the traditional outpatient clinics are being abolished and replaced with community clinics ('short-term' outpatients really). Ideally a community clinic should be run by CPNs as they usually understand the medical, psychiatric, psychological, and social needs of patients. In the authors' opinion the clinics should nevertheless be Consultant-led, because despite the tendency to classify everyone as 'clinicians', many staff feel uncomfortable with this role as it implies a degree of clinical responsibility for which they are not qualified. Psychiatric nurses (especially those with a general nursing background) are ideally placed to carry out this function by virtue of their wide experience; they are also aware of when to seek medical help. Often they are more informed about patients than the primary physician or indeed the psychiatrist because of more frequent contact, either via liaison with the hospital wards or through home visits in their role as CPNs. Nurses and other staff (for example, social workers) are involved in patients' discharge from hospital (usually determined at pre-discharge meetings) and are therefore an essential link in the continuity of patient care before patients are eventually seen in the 'community clinics'. Requests for domiciliary visits from general practitioners (GPs) to physicians themselves have become a thing of the past, with the exception of those psychiatrists working with Home Treatment Teams and Assertive Outreach Teams. Nowadays it is not uncommon for patients to be waiting months on end (more assessments) before being deemed 'appropriate' to see a Consultant Psychiatrist.
Certainly there are patients who do not need to continue seeing a Consultant Psychiatrist for years on end and should be discharged back to the GP, to reduce unnecessary costs and to avoid a dependency culture, in the same way that a patient with mild arthritis does not need to see a rheumatologist or a patient with anaemia does not always need the expertise of a haematologist, to use simple analogies. However, GPs are sometimes unwilling to reciprocate, or feel so out of their depth with 'psychiatry', that this is not always possible. The chronicity of many psychiatric disorders perhaps harnesses the belief that new treatments may emerge which only a psychiatrist, with his/her specialised knowledge, can implement. This type of scenario is seen with many other illnesses in all fields of medicine (chronic psoriasis, rheumatoid arthritis, multiple sclerosis), yet no one is suggesting that GPs should be left to manage these conditions alone. It seems the clinical risk to patient care has not been thought through, and this will no doubt lead to serious repercussions later. In our estimation, physical and mental illnesses are so often intertwined that their management should be shared equally by physicians and psychiatrists.2
Swings and Roundabouts
Such is the pressure from management (under the thumb of civil servants) and 'those in the know', reverentially referred to as 'Commissioners', that health professionals in psychiatry have to defend their clinical judgment and carry out numerous risk assessments (defensive medicine) of patients who are to be discharged from the outpatient clinic back to the GP in any event. Patients may be fortunate enough to receive a few last appointments with the Community Clinic (when they are up and running: some are at the time of writing) before they are shown the door and sent back to the GP, all to save money. Packages of care will not disguise the fact that vulnerable patients are being left to fend for themselves, just as they were in the past when the large institutions closed down without any forward planning as to how and where patients would survive. Yet 'management training' and 'mandatory courses' continue inexorably, often provided by 'expert' outside speakers, costing Trusts considerable numbers of lost hours, let alone the expense, instead of employing more nursing staff to cope with the ever-increasing workload. We are led to believe that reducing 'outpatient numbers' will lead to less pressurised work for staff, which really does not fit. What will happen is that 'outpatients' will fill to the brim with CPA patients (read 'psychoses') instead of the good case-mix of patients required for general experience and training. It seems to be forgotten that there are patients who feel very unwell and are unable to cope, yet are not suffering from major psychiatric disorders.
The next scenario will be the revolving door ‘GP - Access/Assessment Team - possible Consultant Psychiatrist advice and at most two follow-up appointments (if one is really ill) - Community Clinic - discharge to GP system’, to replace the premature hospital discharge-readmission system which failed miserably in the past. When the patient relapses (or rather, when the illness remains static) the GP refers back into the system and the whole process begins again. In this way the Trusts receive money by reaching their targets (discharging patients) and are paid a second time when GPs 'purchase' more care. Those patients with 'minor problems' (not in their GP's estimation) will whittle away and remain unhappy. 'They can always see a counsellor' is the unspoken passive riposte. Furthermore, there will be less clinical variety for doctors and students, as their work will amount to prescribing 'powerful drugs' (we are told by the antipsychiatrists), monitoring serum lithium (and other drug) levels or checking blood results and clozapine dosages, because the Talking Therapies will be curing all and sundry. If only. We are reverting to the bad old days of pseudomedicine and pseudoscience.
Academicians and those who sit on government advisory bodies with grandiose names would have us believe there are far more effective ways to support people at home, or if they have no home, a crisis house will do. Meaningless, empty statements such as 'randomised controlled trials' (given the complexity of the issues under study) often with some reference to National Institute for Health and Clinical Excellence (NICE) guidelines, are used to support questionable findings. Despite all the 'new ways of working' national stress levels are at their highest because of rising unemployment, unexpected redundancies, increasing debt through credit card borrowing, and suicide rates are going up. New ways of Working is not working and any 'ad hoc survey' (note we did not say 'research') will reveal the depth of disillusionment all professionals in the discipline of psychiatry are experiencing, and not just the hallowed psychiatrists. Rudderless multidisciplinary teams are not the answer: teams require management. The term 'leadership' is becoming redundant (one only has to look at successive governments) and is often merely a spur for making money out of meaningless and time-wasting leadership courses which seem to be sprouting everywhere. Among the many qualities 'leadership' embraces are a sense of humour, assertiveness, fairness, creativity, openness, integrity and dedication, all to be found in one individual; presumably! Hierarchical structures may work, contrary to the sweeping statements of some,3 because people who are experienced in medical, academic and management matters (with perhaps a sense of humour) tend to command respect from team members. It is not enough to be an expert in cognitive behavioural therapy (CBT).
No place like home
How does one establish trust and rapport with patients when there will be less opportunity to do so because their care and progress are determined by market forces? Instead of decreasing outpatient volume or confining this aspect of care to CPA patients only, outpatient departments should cater for the mounting levels of stress in the community (poverty, debts, redundancies, threatened job losses) through increased staffing levels and the training/supervision of more social workers, CPNs and occupational health workers. Where possible such staff should attend as many clinics as possible (not just CPAs) to offer a more holistic approach to patient care. If anything, policy makers, clinicians, managers, carers and user groups need to collaborate and clamour for a more integrative mental health service, not fracture the already fragile set-up. Community clinics are seen as a stepping stone to discharge from the mental health services (those who set them up don't like this analogy), which in theory is a good idea. The problem lies in the precipitous nature of transfer from outpatient to community clinics. Some very ill patients with chronic conditions are ironically not a burden to the system, in that they do not need to be seen frequently, nor do they require repeated admissions to hospital; yet if left to their own devices and discharged back to Primary Care they would soon find life unmanageable, as they rely on the expertise of health professionals to remain reasonably stable. Many patients have physical problems, some partly the result of the very treatments given to alleviate their underlying condition (obesity, hypertension, ECG disturbances, Type 2 diabetes and so forth), and need careful monitoring and supervision which is best provided by CPNs and other staff, in the same way a Health Visitor, Practice Nurse, or Diabetes Nurse Specialist might offer his/her expertise to a GP practice.
There will always be patients who need to be seen in the outpatient department, with the emotional security and staff support this provides. We are aware that some 20% of patients miss their mental health appointments, but then people miss appointments for other interviews, and not always because they are unwell.4 Some people miss appointments because they feel better. This is surely not a reason for abandoning the outpatient system, which serves the remainder of the patient population quite well. We have experienced an unprecedented expression of worry and disappointment by patients who have been told they are not ill enough to be followed up at the outpatient department. Now mental health professionals are also frustrated, because they perceive their remit is to refer back to the GP as swiftly as possible, without having thoroughly assessed a patient over a period of time. First on the target list will be those patients who have not been seen by a psychiatrist for several months ('We don't see them very often, therefore what is the point?'), yet many chronically unwell patients may not want to attend outpatients, or have sufficient insight to realise they need to attend, for reasons outlined above. Will Outreach Teams in every Trust be abandoned to save money? Was it not their role in the first place to help those reluctant to receive treatment? What message are we giving to patients, other than that they are 'just a number, a hospital statistic'? Those who have had the 'luxury' of a hospital admission usually comprise the very psychotic and the personality disordered, and of the latter some consider they should not be in hospital anyway. The gains that have been made over the past decade in early intervention and engagement with patients by Assertive Outreach Teams will be lost. Yet there is a continuing demand from patients and their carers to be seen by doctors.5
Here is how the 'new' system works. New Ways of Working, set up some years ago 1 and imposed on us, was meant to be an innovative approach to consultants' contracts by encouraging multidisciplinary teamwork ('When did consultants ever not consult their fellow professionals'?), reviewing the continued necessity for outpatient clinics, advocating more scheduled time for carers (colleagues we have spoken to cannot ever recall not seeing relatives or carers!) and more prominent roles for all team members, encouraging further education and training. Unfortunately we have gone to the other extreme and are being bombarded by all sorts of courses to the extent that much time is lost not seeing patients. Team members may and should undertake postgraduate studies. For doctors, continued professional development is mandatory. We are the only profession that requires revalidation every five years. Nothing can substitute for the medical training doctors undergo and it is a shame that the expertise of psychiatrists is diluted and devalued by their current roles as medication gatekeepers. It is a curious state of affairs or perhaps conveniently forgotten that when Trusts or 'Health Care Reformers' talk nowadays about working in teams and 'shared responsibility', the Consultant-led team concept is dismissed. Where there are Consultants who do not feel up to the role of leading a team, or are uncomfortable making assertive decisions and would rather take a back-seat thus avoiding the responsibility of being in charge of a team, then a Specialist Registrar nearing the end of training could fill this position. Multidisciplinary means 'several' not 'equal' disciplines of learning, ideally each discipline contributing a part to the whole. The medical member of the team is nowadays confronted with the added indignity of having his/her patients described in management-speak as customers, consumers, clients, service users, in fact any title that does not describe the ill person as a patient. It also reflects a creeping normalisation of 'political correctness' thrust upon us by the social engineers and should be resisted. We want patients to be treated with respect not as 'service users’, waiting for the next bus or train. Trusts are now seen to promote a business approach to health care, thereby gaining the approval of their masters, the civil servants and politicians.6 Lots of tick boxes and targets, with subtle threats of redundancies or talks about 'natural wastage'. Meanwhile, the College sits idly by.
Another concern is the training of future psychiatrists, which is slipshod and bureaucratic (lots of forms and assessments). There is hardly any room to accommodate medical students. Junior doctors who practise psychiatry are not receiving the continuity of supervision which existed years ago. The 'junior doctor' is less visible because of European working time directives, on-call commitments with days off in lieu, study leave, annual leave, and the inevitable sick leave. Passing the Membership of the Royal College of Psychiatrists (MRCPsych) exams nowadays does not necessarily equate with clinical experience. Even the nomenclature is confusing - not just to doctors and management ('CT1', 'ST1' and so forth) but also to staff - and reduces the profession to an anomalous set of categories no outsider understands, not to mention the loss of identity it creates in the individual doctor. What was wrong with Senior House Officer (SHO), Registrar, Senior Registrar, and Consultant? Unfortunately, we believe it is now too late to reverse this shambles, born out of the chaotic modernisation of medical careers. 7
The future is bleak and many doctors (and indeed nurses) are becoming disenchanted by psychiatry, feeling let down by a Royal College which seems to accommodate every new social trend rather than concentrating on improving the status of a once fascinating field of medicine. Lots of wake-up calls, but no-one is getting out of bed.8 Strange having a 'trade union' that ignores its members! Could someone inform the College that nowadays most General Adult Psychiatrists are almost reduced to measuring lithium levels, advising on clozapine doses, and attending meetings. No wonder the numbers of potential psychiatrists are falling. How would this dilution of responsibility work in a surgical unit? Would the team members decide how an operation is to be carried out because one of them is trained in resuscitation? Contrary to reports3 consultants are not happy with the present set-up, though it is unlikely our Royal College hierarchy will do anything about it. Many psychiatrists nowadays have an extensive academic knowledge of medicine, psychology, sociology, and neuropsychiatry, and no longer want to be minor players in the game, or undermined by a system that encourages power without responsibility.
Fragmentation breeds disinterest
What is the answer? The previous system, though not perfect, worked well. It had its shortcomings too (oversized catchment areas, Consultants in charge of many wards, and so forth)7, but the continuity of care was there. Patients discharged from hospital were seen by the same team. GPs could refer directly to Consultants (as is the case in other medical specialties) and patients were then seen in the outpatient clinic. However, often the patient would attend such clinics for years because GPs were reluctant to resume care. Nowadays the training and education of GPs is exemplary and most are more than capable, and indeed willing, to continue to provide support for their patients provided there is a back-up plan. The academic training of psychiatrists has never been better, but their clinical skills are suspect. Therefore there needs to be an overhaul of the examination system as well. Actors are not patients. Simulated psychiatry is not the same as simulated surgery. Simulation is a technique not a technology, we are told. It is not a substitute for doctors examining real patients in real contexts. The same applies to nurses. All nurses (CPNs) could easily be trained to do ECGs, act as phlebotomists, and arrange routine tests. Many already do. Give back to nurses the skills they enjoy in other fields of medicine. For psychiatrists there are numerous courses one can attend to broaden their medical knowledge. Most GPs take an interest in a holistic approach to their patients (social, psychological, physical). As matters stand GPs now refer to a borough 'Access and Allocation Team' with no one held accountable, and even though requested by the GP, a Consultant Psychiatrist's opinion is not always provided. Responsibility is the province of senior doctors and management and should not be diluted by putting pressure on the Team as a whole, whose individual members' experience varies considerably. Doctors (and nursing staff) should have mandatory training in psychological therapies (cognitive and behavioural therapies specifically). A fixed number of sessions in addition to their usual duties could be part of the job plan for those doctors interested in the psychotherapies per se, or, put another way, in a holistic approach to patient care, which is what most doctors practise in any event. Patients would then have the benefit of medical and psychological input simultaneously (let's call it a cognitive-medical model). Waiting lists would be dramatically reduced at a stroke, and Trusts would no longer have the responsibility of finding and employing unqualified (in medicine or psychology) 'talking therapists'. People who are generally physically well and who do not have serious psychosocial problems or psychiatric illnesses could receive treatment elsewhere through their GP, counsellors or other psychotherapists (those with no medical or psychology degrees) of their own volition. There is no need to clog up the system with 'customers'. We are not a supermarket!
Complaints will inevitably follow when patient dissatisfaction begins to emerge, and that is only a matter of time. More serious incidents will be a consequence of too many bed closures and staff shortages. Dilution of responsibility means that no one person is accountable when things go wrong and patients are left stranded (read the Francis Report9). Already GPs are frustrated by the lack of informal contact with psychiatrists, who are once again seen to be retreating to their ivory towers, having been overwhelmed by lots of courses, lots of training, lots of meetings, lots of empty rhetoric. Too much emphasis nowadays is placed on the sociological/psychological aspects of patients' illness, and so serious conditions are missed. GPs should be able to refer directly to their colleagues where there are immediate concerns and not have to wait for triage meetings which delay this process. After all, GPs know their patients best. Community clinics could take the bulk of moderate conditions (which are causing undue stress) and see patients for as long as necessary (not a predetermined number of appointments) before deciding the GP can resume responsibility. 'Packages of Care' and other outdated expressions should be confined to the dustbin. Patients are not fooled by promises of cardboard boxes with little pink ribbons. Continuity of patient care requires a flexible approach which encompasses easy access to information and a direct pathway to services and medical care when needed.
Knowledge in the making
Psychiatrists should concentrate on more difficult and complicated cases (as was the case in the past) as well as routine moderate conditions, enabling them to use their broad skills more efficiently and effectively. Some psychiatrists see too few patients and this should change. Perhaps there is a case for psychiatrists rotating through specialties, say every five years, for example between Rehabilitation and General Adult Psychiatry. There are many patients who are not on mood stabilisers or clozapine who require intensive input and combined medical expertise, and rotating between posts would offer valuable experience. A more varied approach is thus needed, but do we really need all those subspecialties? Whatever happened to the general psychiatrist with a special interest? In our view at least one year of neurology training should be mandatory for psychiatrists during their training. No formal examinations, just certificates to prove the courses have been completed; otherwise the system grinds to a halt. Under this system a doctor could still theoretically become a consultant after nine years of postgraduate training (three years in foundation training and neurology, and six years in Psychiatry, to include neurology, psychology and sociology), which is not unreasonable. Equal emphasis on neuromedical, sociological and psychological factors causing health problems would foster a healthier and friendlier relationship between the disciplines which deal with mental illness and primary care providers. As it stands, with the fragmented role of general adult psychiatric services and the emphasis on e-learning and internet training for junior doctors (no hands-on clinical experience), we are facing yet another era of overemphasis on social psychiatry (or rather reverting to ancient belief systems) with its 'neutral', politically correct, denigrating sound bites (customers, clients, service users). All will be well if we can just sort out the social problems! The simplistic notion that problems will disappear if we do not smoke, drink, take illicit drugs, keep our weight down, and have a home to go to, is the stuff of social engineering by the 'experts in living,' and alas by doctors who have lost touch with medicine.
Doctors need reminding that psychiatry is that branch of medicine concerned with the study, treatment, and prevention of mental illness, using medical and psychological therapies as well as paying special attention to social hardship and isolation where present. It is not philosophy or social science. It is to medicine what metaphysics is to philosophy. Psychiatrists need to broaden their horizons and take their heads out of the therapy books to witness the advances in neuroscientific techniques and genetics that have already transformed the nature of medicine. To develop their psychological skills they need to take on board that patients want more than drugs to alleviate distress. Practical techniques such as CBT or DBT (dialectical behaviour therapy) will therefore further heighten their expertise as physicians. Many doctors are already familiar with applying CBT and other therapies. However, doctors should also be aware of the limitations of psychotherapies in general, recognizing and acknowledging that such therapies do not always work either, and indeed in some instances may be harmful. Psychiatrists should be part of separate Wellbeing Clinics (perhaps one session per week) to become better acquainted and proficient again with physical examinations, investigations, routine procedures (ECGs for example) and interpretation of results (not just to screen, but to intervene). This overseeing of the physical health of patients is not always possible in a busy outpatient clinic. Many potentially serious conditions would be revealed and information passed to the GP or tertiary services immediately. Psychiatrists are not 'stuck in a medical model' any more than a physician believes all myocardial infarcts are caused by psychosocial factors or lifestyle. But to ignore the medical advances in molecular biology and neuroscientific diagnostic techniques portrays a profound ignorance of biological psychiatry and is insulting to those scientists who work tirelessly, often without much recognition, to further our understanding of 'brain disorders'. It is all very well to talk about art, philosophy, social sciences and literature as having a great bearing on our interest in psychiatry and to congratulate ourselves as 'lateral thinkers', but an understanding of the philosophy of, say, Bertrand Russell, or indeed the school of Zen Buddhism, will not eliminate mental disorder. Romantic as it might sound in retrospect, Vincent van Gogh did not enjoy cutting his ear off, nor did Robert Schumann feel ecstatic when jumping into the Rhine before being carted off to the asylum.
If we do not embrace a holistic view of mental ill-health we risk not only throwing the baby out with the bath water but the bath itself, thereby causing further dissatisfaction and low morale among doctors with an inevitable negative impact on patient care. Psychiatrists are not bemoaning their loss of hegemony - a favourite word and another myth propagated by the antipsychiatry lobby; rather, it is only too obvious to them (as qualified medical doctors) that patients will suffer in the long term by not being referred appropriately to those who have the expertise to recognize and distinguish between human difficulties and illness. There is also a need to re-examine the impact of psychological therapies and not succumb to the popular and naive notion that they are all evidence-based in scientific terms. In the meantime the 'worried well' can indulge themselves with all the peripheral talking therapies and current fads they desire. Likewise, performance management, outcome measures and payment by results have become relentless tick-box exercises creating unnecessary stress among health care professionals (threats of job losses) who 'must meet targets at all costs', all for a slice of the Commissioners' cake. What a way to run a health service! Patients become meaningless statistics in the meantime. No! The wake-up call should be aimed at those who are intent on destroying the good will and values of the very same people they purport to support, through their social engineering and outdated attitudes.
Fifteen percent of elderly individuals report clinically significant depression, for a variety of reasons. Osteoporosis is a disorder of bone metabolism which can be caused by multiple factors, and the elderly population has multiple risk factors for the development of low Bone Mineral Density (BMD). Data suggest that SSRI use contributes to low BMD. There are numerous mediating processes, factors and causes that may contribute to the relationship between depression and low BMD; it has therefore been suggested that depression may be an unrecognized risk factor for the development of osteoporosis in this patient population.
Low BMD is a common condition among the elderly population, and the prevalence of osteopenia and osteoporosis is expected to increase as the population ages. Low BMD is associated with an increased risk of debilitating fractures, particularly of the hip, vertebrae and distal forearm. There is a growing body of evidence that depression affects fracture risk in the older population.
Most studies support an association between depression and increased risk of both low BMD and fractures. There are many risk factors for low BMD, but some are unalterable. It is therefore crucial to identify modifiable risk factors in order to reduce the public health burden of osteopenia, osteoporosis, fractures, and the complications associated with them.
Objective:
A literature review was performed to extract and evaluate the evidence on the risk of osteoporosis in depression.
Educational Objectives:
At the conclusion of this article, the reader will be able to understand:
The risk of development of osteoporosis,
Need for close monitoring and early assessment of risk,
Need for prophylactic treatment to avoid complications due to development of osteoporosis.
Method:
PubMed.gov was searched using pre-determined key words.
Key words:
“Depression AND Osteoporosis”
Background:
Osteoporosis was first recognized as a disorder of bone metabolism in 1947 by Albright. It is the most common degenerative disease in developed countries; it is characterized by low bone mineral density (BMD), causing bone fragility and increased fracture incidence. Over the past quarter century it has emerged as a major public health problem in the Western world, and the prevalence of osteopenia and osteoporosis is expected to increase dramatically over the next 50 years as the population pyramid shifts toward old age. In the United States alone, approximately 10 million individuals over the age of 50 have osteoporosis. In addition, 33.6 million Americans in this age group have osteopenia (i.e. a decrease in bone mineral density that precedes osteoporosis and its potential complications later in life). The estimated annual fracture rate due to underlying bone disease is 1.5 million. These fractures lead to pain, skeletal deformity, disability, loss of independence and increased mortality.1
Low BMD has been shown to be a major risk factor for debilitating bone fractures, particularly of the hip, vertebrae and distal forearm.2 The established risk factors for osteoporosis include increasing age, female sex, oestrogen deficiency, glucocorticoid therapy and other medications, smoking, alcohol use, inactivity, and low calcium intake.3 Many prominent risk factors are unalterable; it is therefore crucial to identify modifiable risk factors in order to reduce the public health burden of osteopenia, osteoporosis and the fractures associated with them. In the USA, depression is a common disorder that affects 5 to 9% of women and 1 to 2% of men.4 It ranks second only to hypertension as the most common chronic illness encountered in general medical practice.5 The disorder carries a considerable risk of death and is associated with a two- to three-fold increase in all-cause non-suicide-related mortality.6 Fifteen percent of elderly individuals report clinically significant depression.
Definition of Osteopenia and Osteoporosis:
Osteopenia is a condition in which bone mineral density is lower than normal; more specifically, it is defined as a BMD T-score between -1.0 and -2.5. It is considered a precursor to osteoporosis, although not every person diagnosed with osteopenia will develop osteoporosis. Osteoporosis causes bones to become weak and brittle – so brittle that a fall or even mild stresses such as bending over or coughing can cause a fracture.
Osteoporosis-related fractures most commonly occur in the hip, wrist or spine. Bone is a living tissue, constantly being resorbed and replaced; osteoporosis occurs when the creation of new bone does not keep up with the removal of old bone. Osteoporosis affects men and women of all races, but White and Asian women, especially those who are past menopause, are at highest risk. Medications, dietary supplements and weight-bearing exercise can help strengthen bones.
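These cut-offs lend themselves to a simple worked illustration. The short Python sketch below is a minimal, illustrative implementation of the WHO T-score bands (normal T ≥ -1.0; osteopenia between -1.0 and -2.5; osteoporosis ≤ -2.5); the function names and the example BMD values are hypothetical and are not drawn from any of the studies cited here.

def t_score(bmd, young_adult_mean, young_adult_sd):
    # T-score: the patient's BMD expressed in standard deviations
    # from the young-adult reference mean.
    return (bmd - young_adult_mean) / young_adult_sd

def classify_bmd(t):
    # WHO bands: normal T >= -1.0; osteopenia -2.5 < T < -1.0;
    # osteoporosis T <= -2.5.
    if t >= -1.0:
        return "normal"
    if t > -2.5:
        return "osteopenia"
    return "osteoporosis"

# Hypothetical example: femoral neck BMD of 0.70 g/cm2 against a
# young-adult reference mean of 0.85 g/cm2 (SD 0.12 g/cm2).
t = t_score(0.70, 0.85, 0.12)   # t = -1.25
print(classify_bmd(t))          # -> "osteopenia"

On this hypothetical reading, a T-score of -1.25 falls in the osteopenic band, consistent with the definition given above.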
Literature evidence:
Current evidence supports a bidirectional link between major depressive disorder (MDD), several other mood disorders, and various medical conditions such as osteoporosis and cardiovascular disease.7 A significant association has been found between BMD and depressive symptoms after adjustment for osteoporosis risk factors; in Caucasians, depressive symptoms were associated with both osteoporotic and osteopenic levels of BMD.8 A meta-analysis reported that BMD is lower in depressed than in non-depressed subjects, that the association between depression and BMD is stronger in women than in men and in premenopausal than in postmenopausal women, and that only women psychiatrically diagnosed with MDD display significantly low BMD; women diagnosed by self-rating questionnaires do not.9 Depression is a significant risk factor for fracture in older women.14 Numerous studies have examined the association between antidepressant use (both SSRIs and TCAs) and fracture risk; the majority have found that use of these medications, regardless of class, is associated with an increased risk of fracture.10 Animal studies have also indicated that serotonin may influence bone mass, particularly during stages of bone growth.11,12
Daily SSRI use (Table 1) in adults 50 years and older remained associated with a 2-fold increased risk of clinical fragility fracture after adjustment for potential covariates (Figure 1). Depression and fragility fractures are common in this age group, and the elevated risk attributed to daily SSRI use may have important public health consequences.15 SSRIs may increase fracture risk through their effects on bone physiology and on the risk of falling. Functional serotonin receptors and transporters have been localized to bone, and the administration of SSRIs decreases bone mass and strength in growing mice. SSRIs act by inhibiting the serotonin transporter; the presence of functional serotonin transporters in osteoblasts, osteoclasts and osteocytes raises the possibility that serotonin transporters play a role in bone metabolism and that medications affecting this transporter system may also affect bone metabolism. In one cohort of older women, use of an SSRI was associated with an increased rate of bone loss at the hip; use of a TCA was not similarly associated with increased rates of hip bone loss.16 In men, BMD was lower among those reporting current SSRI use, but not among users of other antidepressants.17 A meta-analysis found that MDD is associated with low BMD and should therefore be considered a risk factor for osteoporosis: BMD in subjects with MDD was 4.7% lower at the AP spine, 3.5% lower at the total femur, and 7.3% lower at the femoral neck compared with healthy controls.18 An NIH meta-analysis concluded that MDD was associated with lower BMD at the AP spine, femoral neck and total femur, and that the deficits in BMD in subjects with depression are of clinical significance and likely to increase fracture risk over the lifetime of these subjects.18
Table 1: List of SSRIs (Selective Serotonin Reuptake Inhibitors), with brand names and dose ranges:
Citalopram (Celexa): 10 to 40 mg
Escitalopram (Lexapro): 10 to 20 mg
Fluoxetine (Prozac): 20 to 80 mg
Fluvoxamine (Luvox): 50 to 300 mg
Paroxetine (Paxil): 10 to 40 mg
Sertraline (Zoloft): 50 to 200 mg
Figure 1: Fracture-free survival by Selective Serotonin Reuptake Inhibitors (SSRI) use
Potential mechanisms of bone loss in depression:
Depression is associated with alterations of the hypothalamic-pituitary-adrenal (HPA) axis at multiple levels, including altered secretion of hypothalamic corticotrophin-releasing hormone (CRH), as indicated by CRH levels in the cerebrospinal fluid, and a change in the set-point threshold for negative feedback; these changes generally result in hypercortisolism.
Pro-inflammatory cytokines are increased in depression, and IL-6 is a potent activator of the osteoclast. Oestrogen deficiency in women and androgen deficiency in men may affect bone mass, and there is at least theoretical evidence for decreased sex hormones in both genders during the acute phases of depression. Serotonin transporters and receptors are present on the osteoblast, and use of antidepressants has been associated with more fractures. Commonly accepted lifestyle risk factors for osteoporosis include smoking, inadequate calcium intake, excessive alcohol intake and physical inactivity.
There are three pathophysiologic pathways leading to low BMD13 (Figure 2):
Inadequate acquisition of bone mass early in life
Elevated resorption of bone mass later in life, and
Inefficient bone formation during continuous bone remodelling
These pathways are interdependent and the relative importance of each mechanism changes over development and varies by sex.
Figure 2: Pathways linking depression, low bone mineral density and fracture.13
Bottom line:
Currently available evidence supports an increased risk of developing osteoporosis in depression, attributable to various factors and pathways and to the medications used in its treatment.
Conclusion:
Major depressive disorder is an important but under-recognized risk factor for osteoporosis. Depression is associated with low BMD, with a substantially greater BMD decrease in depressed women and in cases of clinical depression. These patients need close monitoring, early assessment of risk, and preventive measures to avoid complications. Premenopausal women with major depression should undergo DXA screening; a similar recommendation may be made for postmenopausal women with depression, especially in the presence of one or more known risk factors for osteoporosis.
Once a diagnosis of osteoporosis is made in subjects with major depression, DXA measurements should be repeated at a frequency based on the current WHO algorithm; this model takes into account the presence of other risk factors and the age of the subject.
Clinical Point:
Periodic BMD measurements and anti-osteoporotic prophylactic and curative measures are strongly advocated for these patients.
People with intellectual disabilities are a heterogeneous group, who can pose a challenge to services in terms of meeting a wide range of needs. Following the closure of large institutions, the optimum means of service provision for people with intellectual disabilities with additional mental illness and challenging behaviour has been a matter of debate.
Challenging behaviour can be defined as a ‘culturally abnormal behaviour of such an intensity, frequency or duration that the physical safety of the person or others is likely to be placed in serious jeopardy, or behaviour which is likely to seriously limit use of, or result in the person being denied access to, ordinary community facilities’ – Emerson, 19951. Examples of challenging behaviours include self-injury, aggressive outbursts, destruction of property and socially inappropriate behaviour.
The credit crunch of recent years has led to increased use of private sector services delivering care to NHS-funded patients. The Winterbourne scandal unearthed by BBC Panorama in June 2011 (an investigation into the physical and psychological abuse suffered by people with learning disabilities and challenging behaviour at this private hospital in South Gloucestershire) highlighted that, whilst this may be an economically viable option, fundamental questions remain about whether private sector safeguards and monitoring protocols are as robust as those of the NHS in protecting vulnerable patients. It also reawakened longstanding disputes around the way people with complex needs are cared for in residential settings. The discussions centred on ‘institutional’ versus ‘community’ care styles; specialist intellectual disability services versus generic adult psychiatric services; local versus specialist expertise congregated around a single unit; and financial questions regarding how best to meet the needs of this population at a time of austerity. Opinions vary widely, and at times are even polarised, as a result of several factors including position within this competitive and complex system, personal and cultural politics, and personal experience. As a result of the government review subsequent to the Winterbourne investigation, a number of recommendations have been made which will affect the future care of this vulnerable group of patients. These include: “by June 2013, all current placements will be reviewed, everyone in hospital inappropriately will move to community-based support as quickly as possible, and no later than June 2014… as a consequence, there will be a dramatic reduction in hospital placements for this group of people”2
The Department of Health policy, Valuing People3, set out an ‘ambitious and challenging programme of action for improving services’, based on four key principles – civil rights, independence, choice and inclusion. Government policy as detailed in both Valuing People and the Mansell Report3,4 recognises that NHS specialist inpatient services are indeed necessary on a short-term basis for some people with intellectual disabilities and complex mental health needs. Inpatient facilities for people with intellectual disability have been described as highly specialised services that are a valuable, but also expensive, component of mental health services5. The Enfield specialist inpatient unit - the Seacole Centre - is one such service.
The Seacole Centre consists of two inpatient units, with a total of 12 inpatient beds, for people with intellectual disabilities with acute mental illness and/or challenging behaviour. It is located within Chase Farm Hospital in Enfield, Greater London. The Seacole Centre has a multidisciplinary team consisting of nurses, psychologists, psychiatrists, a resident GP, occupational therapists, intensive support team staff, physiotherapists, speech and language therapists, a physical exercise coach and administrative staff. Patients are admitted from a variety of sources, including general psychiatric wards, general medical wards and community intellectual disability teams. Since patients are often referred from other boroughs, in addition to this multidisciplinary team, each patient has their own community and social care team based within their own borough. The use of out-of-area units faces similar challenges to out-of-area placements, use of which has been increasing in the UK, and it is important to explore ways in which service users, out-of-area, can be supported effectively6.
In 2002, a review of admissions to the unit was completed to describe the management of mental illness and challenging behaviour. Since then there have been several service reconfigurations within the trust to accommodate national, political and financial recommendations. Despite these changes, it was observed clinically that certain problems, including delayed discharges, continued to occur. We decided to complete a similar review to describe current admission trends in further detail, to identify areas of improvement, and to ascertain the nature and severity of ongoing problems in order to focus future recommendations.
METHOD:
A retrospective review of the case records of all inpatient admissions to the Seacole Centre was completed over a three-year period – from 1st January 1999 to 31st December 2001.
Data collected included age on admission, gender, borough, diagnosis, psychotropic medication on discharge, dates of admission and discharge, length of stay, legal status on admission, delays in discharge and reasons for delay, and living arrangements prior to and after discharge.
A successful outcome of admission was discharge from hospital to community care. We used the following definition of delayed discharge (a worked illustration follows the quoted criteria):
"A delayed transfer occurs when a patient is ready for transfer from a general and acute hospital bed but is still occupying such a bed. A patient is ready for transfer when:
a clinical decision has been made that the patient is ready for transfer
a multi-disciplinary team decision has been made that the patient is ready for transfer
the patient is safe to discharge/transfer."7
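To make the application of this definition concrete, the minimal Python sketch below flags an admission as a delayed discharge only when all three readiness criteria are met while the patient still occupies a bed. It is an illustrative sketch only; the record field names are hypothetical and do not correspond to the fields of the actual dataset.

def is_delayed_discharge(record):
    # All three readiness criteria from the definition above must hold.
    ready = (record["clinical_decision_ready"]
             and record["mdt_decision_ready"]
             and record["safe_to_transfer"])
    # The transfer is delayed only if the patient is ready
    # but still occupying a bed.
    return ready and record["still_occupying_bed"]

# Hypothetical admission record:
admission = {
    "clinical_decision_ready": True,
    "mdt_decision_ready": True,
    "safe_to_transfer": True,
    "still_occupying_bed": True,
}
print(is_delayed_discharge(admission))   # -> True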
The review was repeated for a further three-year period, from 1st January 2009 to 31st December 2011.
RESULTS:
Characteristics of 1999-2001 cohort, and comparison with 2009-2011
The basic demographic details can be seen in Table 1.
Table 1 - Demographic details (1999-2001 vs 2009-2011)
Number of admissions: 60 vs 41
Number of patients: 46 vs 40
Average (mean) age / years: 29.58 vs 36.16
Age range / years: 14-63 vs 19-72
M:F ratio: 1.4:1 vs 3.1:1
Total number of boroughs from which patients were admitted: 10 vs 7
Trends in Admission Rates
As seen in Tables 1 and 2, there has been a reduction in the total number of admissions between the studies. There has also been a marked reduction in re-admissions. The average length of stay has increased, and although the number of delayed discharges has slightly decreased, it can be seen that this is still a factor in a significant proportion of the admissions.
Table 2 - Trends in admission (1999-2001 vs 2009-2011)
Total number of admissions: 60 vs 41
Average (mean) length of stay / days: 198.6 vs 244.6
Number of readmissions: 16 vs 1
Number of delayed discharges: 40 (67%) vs 24 (59%)
Reason for admission
The trends in reason for admission are shown in Figure 1.
Figure 1 – Trends in Reason for Admission, 1999-2001 compared to 2009-2011
In both time periods, the most frequent reason for admission was challenging behaviour (62%, n=37 in 1999-2001; 63%, n=29 in 2009-2011), followed by psychosis (22%, n=13 in 1999-2001; 11%, n=5 in 2009-2011). Social admissions were the third most common reason for admission in the recent study (0% in 1999-2001; 4%, n=2 in 2009-2011). The range of psychiatric presentations was widest during the original time period.
Patterns on discharge
As shown in Figure 2, most patients in the original study were discharged either to the same residential home or back to the family home, whereas in the latter time period patients were most frequently discharged either to a different residential home or to supported living. Figure 3 summarises this effect, demonstrating the shift towards discharging the majority of patients to a different place of residence.
Figure 2 – A graph to show the place of discharge, 1999-2001 compared to 2009-2011
Figure 3 – A Graph to Demonstrate Trends in Place of Discharge – comparing 1999-2001 and 2009-2011
Delayed discharges
The primary cause of delay in both studies was difficulty finding an appropriate placement, although this was more marked in the recent cohort: lack of an identified suitable placement was a major contributing factor in 69% of delayed discharges in 2009-2011 and 44% in 1999-2001, alongside apparent delays in the role played by social services (Table 2).
DISCUSSION:
Throughout this study spanning 10 years, challenging behaviour, followed by psychotic disorder, remained the most common cause for admission. Interestingly, by 2009-2011 the third most common cause for admission was social reasons (4%); there were no admissions for this reason in the original study. Between 1999 and 2001 there was a wider range of reasons for admission across the mental illness spectrum compared with 10 years on. In previous studies, the largest diagnostic group for all admissions was schizophrenia spectrum disorders7,8. However, between 2009 and 2011, more than a quarter of patients admitted to the Seacole Centre did not have any psychiatric diagnosis on admission. It is important to keep in mind that individuals with intellectual disabilities accessing specialist inpatient services are more likely to present with complex clusters of symptoms and behavioural problems that may span several diagnostic categories.
The most significant improvement between the original review and the re-review is that the number of re-admissions significantly reduced, from 24% (14 patients) to 2% (1 patient). It is of interest that during 1999-2001 a large proportion of patients were discharged to their original place of accommodation (often the family home), whereas in 2009-2011 it was more common for patients to be discharged to a new place of living, better suited to managing increasingly complex needs and behaviours. This may account for some of the reduction in re-admission rates.
The average length of stay over the 10-year period increased slightly, from 198.6 days to 244.6 days, demonstrating that admissions are considerably longer than in more generic medical settings. The findings are in keeping with a number of other studies in which patients with intellectual disability admitted to a specialist unit remained inpatients for significantly longer periods. One study showed a mean length of stay of 23.2 weeks in a specialist unit versus 11.1 weeks in generic settings8. Another study in South London revealed similar findings of 19.3 weeks compared with generic unit stays of 5.5 weeks9. An exploratory national survey of intellectual disability inpatient services in England showed that 25% of residents had been in the units for more than two years; only 40% of residents had a discharge plan, and only 20% had both a discharge plan and the type of placement considered ideal for them in their home area10. The reasons for length of stay are not fully understood in any of these studies. They may include fear of taking risks, a lack of local safe or competent amenities, a lack of experience or authority amongst those charged with sourcing bespoke services for complex people with challenging needs, and a potential lack of resource in terms of time available to see people, read reports, meet with stakeholders and find the right services. The results of another retrospective study, comparing the generic and specialist models in two districts in the UK by Alexander et al11, suggested that, within the same district, patients do stay longer in the specialist unit but are less likely to be discharged to an out-of-area placement.
There is no evidence to suggest that comprehensive care for people with intellectual disabilities can be provided by community services alone. Likewise, there is no clear evidence to suggest that a balanced system of mental health care can be provided without acute beds12. There is, however, clear evidence that services created by the private sector are used very widely and are at times seen as an economically viable option in the current financial climate.
The models of inpatient service provision that have been suggested range from mainstream adult mental health services, through an integrated inpatient scheme whereby people with intellectual disabilities and additional mental illness or severe challenging behaviour are admitted to adult mental health beds with provision for extra support from a multidisciplinary learning disability team, to specialist assessment and treatment units13,14.
Inpatient care is known to consume most of the mental health budget15, and specialist inpatient units are an expensive component of these services. Cost containment and cost minimisation of inpatient beds during the current economic recession present a real challenge for those charged with providing high-quality, effective, specialist care for adults with intellectual disability. Such cost reduction could be approached in a number of ways: reducing length of stay, optimising drug budgets, reducing rates of re-admission, and establishing projects in association with the voluntary and statutory sectors to facilitate prompt and safe discharge.
Reducing the average length of stay where possible can reduce the cost, and the resources and budget freed up in this way could be used for other service components15. However, this single agenda can lead to problems of pressured early discharge to unsuitable placements. It is known that resource consumption is most intense during the early stages of admission. As such, we observe a position whereby reducing length of stay requires proactive planning throughout the whole process of care, as well as active discharge planning, with a need for clearly defined pathways of care.
A crucial aspect of the patient's transition through inpatient placement to life in the community is efficient and regular communication between the relevant professionals and teams who provide continuity of ongoing care back in the community. This can at times be particularly challenging owing to differences in values and perceptions about patients' needs and problems, and to varying pressures. Understanding and resolving problems for individuals with complex and severe challenging behaviour or mental illness that requires a period of containment in a specialist service also requires specialist ongoing work and risk management, to ensure that once the problems are contained and understood they remain so on discharge and thereafter, for as long as the individual remains vulnerable enough to require care. Many people from the general population who develop a serious mental illness requiring hospitalisation have the capacity, once well, to make decisions for themselves and articulate a need (or otherwise) for specific care or intervention. This is rarely completely the case for people with intellectual disabilities. Collaborative approaches with those involved in community care are crucial to getting the right care at the right financial cost for this relatively small but very complex and vulnerable group of individuals.
Assessment and evaluation are the foundations of learning; the former is concerned with how students perform and the latter, how successful the teaching was in reaching its objectives. Case based discussions (CBDs) are structured, non-judgmental reviews of decision-making and clinical reasoning1. They are mapped directly to the surgical curriculum and “assess what doctors actually do in practice” 1. Patient involvement is thought to enhance the effectiveness of the assessment process, as it incorporates key adult learning principles: it is meaningful, relevant to work, allows active involvement and involves three domains of learning2:
Clinical (knowledge, decisions, skills)
Professionalism (ethics, teamwork)
Communication (with patients, families and staff)
The ability of work based assessments to test performance is not well established. The purpose of this critical review is to assess if CBDs are effective as an assessment tool.
Validity of Assessment
Validity concerns the accuracy of an assessment, what this means in practical terms, and how to avoid drawing unwarranted conclusions or decisions from the results. Validity can be explored in five ways: face, content, concurrent, construct and criterion-related/predictive.
CBDs have high face validity as they focus on the role doctors perform and are, in essence, an evolution of ‘bedside oral examinations’3. The key elements of this assessment are learnt in medical school; thus the purpose of a CBD is easy for both trainees and assessors to validate1. In terms of content validity, CBDs are unique in assessing a student’s decision-making, which is key to how doctors perform in practice. However, as only six CBDs are required a year, they are unlikely to be representative of the whole curriculum. Thus CBDs may have limited content validity overall, especially if students focus on one type of condition for all assessments.
Determining the concurrent validity of CBDs is difficult as they assess the pinnacle of Miller’s triangle – what a trainee ‘does’ in clinical practice (Figure 1)4. CBDs are unique in this aspect, but there may be some overlap with other work based assessments, particularly in task-specific skills and knowledge. Simulation may give some concurrent validity to the assessment of judgment. The professional aspect of assessment can be validated by a 360-degree appraisal, as this requests feedback about a doctor’s professionalism from other healthcare professionals1.
Figure 1: Miller’s triangle4
CBDs have high construct validity, as the assessment is consistent with practice and appropriate for the working environment. The clinical skills being assessed will improve with expertise, and thus there should be ‘expert-novice’ differences in marking3. However, the standard of assessment (i.e. the ‘pass mark’) increases with expertise, as students are always assessed against a mark of competency for their level. A novice can therefore score the same ‘mark’ as an expert despite a difference in ability.
In terms of predictive validity, performance-based assessments are simulations, and examinees do not behave in the same way as they would in real life3. Thus, CBDs are an assessment of competence (‘shows how’) but not of true clinical performance (‘does’), and one could perhaps deduce that they do not assess the attitude of the trainee, which completes the cycle along with knowledge and skills4. CBDs permit inferences to be drawn concerning the skills of examinees that extend beyond the particular cases included in the assessment3. However, the quality of performance in one assessment can be a poor predictor of performance in another context, and both the limited number and the lack of generalizability of these assessments have a negative influence on predictive validity3.
Reliability of Assessment
Reliability can be defined as “the degree to which test scores are free from errors of measurement”. Feldt and Brennan describe the ‘essence’ of reliability as the “quantification of the consistency and inconsistency in examinee performance”5. Moss states that less standardized forms of assessment, such as CBDs, present serious problems for reliability6. These types of assessment permit both students and assessors substantial latitude in interpreting and responding to situations, and are heavily reliant on the assessor’s ability. The reliability of CBDs is influenced by the quality of the rater’s training, the uniformity of the assessment, and the degree of standardization between examinees.
Rating scales are also known to hugely affect reliability; an understanding of how to use these scales must be achieved by all trainee assessors in order to achieve marking consistency. In CBD assessments, trainees should be rated against a level of completion at the end of the current stage of training (i.e. core or higher training)1. While accurate ratings are critical to the success of any WBA, there may be latitude in the interpretation of these rating scales between different assessors. Assessors who have not received formal WBA training tend to score trainees more generously than trained assessors7-8. Improved assessor training in the use of CBDs and spreading assessments throughout the student’s placement (i.e. a CBD every two months) may improve the reliability and effectiveness of the tool1.
Practicality of Assessment
CBDs are a one-to-one assessment and are not efficient; they are labour intensive and only cover a limited amount of the curriculum per assessment. The time taken to complete CBDs has been thought to negatively impact on training opportunities7. Formalized assessment time could relieve the pressure of arranging ad hoc assessments and may improve the negative perceptions of students regarding CBDs.
The practical advantages of CBDs are that they allow assessments to occur within the workplace and that they assess both judgment and professionalism – two subjects on the curriculum which are otherwise difficult to assess1. CBDs can be very successful in promoting autonomy and self-directed learning, which improves the efficiency of this teaching method9. Moreover, CBDs can be immensely successful in improving the abilities of trainees and can change clinical practice – a feature that is not replicated by other forms of assessment8.
One method for ensuring the equality of assessments across all trainees is by providing clear information about what CBDs are, the format they take and the relevance they have to the curriculum. The information and guidance provided for the assessment should be clear, accurate and accessible to all trainees, assessors, and external assessors. This minimizes the potential for inconsistency of marking practice and perceived lack of fairness7-10. However, the lack of standardization of this assessment mechanism combined with the variation in training and interpretation of the rating scales between assessors may result in inequality.
Formative Assessment
Formative assessments modify and enhance both learning and understanding through the provision of feedback11. The primary function of the rating scale of a CBD is to inform the trainee and trainer about what needs to be learnt1. Marks per se provide no learning improvement; students gain the most learning value from assessment that is provided without marks or grades12. CBDs have feedback built into the process, so it can be given immediately and orally. Verbal feedback has a significantly greater effect on future performance than grades or marks, as the assessor can check comprehension and encourage the student to act upon the advice given1,11-12. Feedback should be specific and related to need; detailed feedback should only be given to help the student work through misconceptions or other weaknesses in performance12. Veloski et al suggest that systematic feedback delivered from a credible source can change clinical performance8.
For trainees to improve, they must have the capacity to monitor the quality of their own work during their learning by undertaking self-assessment12. Moreover, trainees must accept that their work can be improved and identify the important aspects of their work that they wish to improve. Trainees’ learning can be improved by providing high-quality feedback, and the three main elements crucial to this process are12:
Helping students recognise their desired goal
Providing students with evidence about how well their work matches that goal
Explaining how to close the gap between current performance and desired goal
The challenge for an effective CBD is to have an open relationship between student and assessor in which the trainee is able to give an honest account of their abilities and identify any areas of weakness. This relationship currently does not exist in most CBDs: studies by Veloski et al8 and Norcini and Burch9 revealed that only limited numbers of trainees anticipated changing their practice in response to feedback data. An unwillingness among surgical trainees to engage in formal self-reflection, and a reluctance to voice any weaknesses, may impair their ability to develop and lead to resistance in the assessment process. Improved training of assessors and removing the scoring of the CBD form may allow more accurate and honest feedback to be given to improve the student’s future performance. An alternative method to improve performance is to ‘feed forward’ (as opposed to feedback), focusing on what students should concentrate on in future tasks10.
Summative Assessment
Summative assessments are intended to identify how much the student has learnt. CBDs have a strong summative feel: a minimum number of assessments are required and a satisfactory standard must be reached to allow progression of a trainee to the next level of training1. Summative assessment affects students in a number of different ways; it guides their judgment of what is important to learn, affects their motivation and self-perceptions of competence, structures their approaches to and timing of personal study, consolidates learning, and affects the development of enduring learning strategies and skills12-13. Resnick and Resnick summarize this as “what is not assessed tends to disappear from the curriculum” 13. Accurate recording of CBDs is vital, as the assessment process is transient, and allows external validation and moderation.
Evaluation of any teaching is fundamental to ensure that the curriculum is reaching its objectives14. Student evaluation allows the curriculum to develop and can result in benefits to both students and patients. Kirkpatrick suggested four levels on which to focus evaluation14:
Level 1 – Learner’s reactions
Level 2a – Modification of attitudes and perceptions
Level 2b – Acquisition of knowledge and skills
Level 3 – Change in behaviour
Level 4a – Change in organizational practice
Level 4b – Benefits to patients
At present there is little opportunity within the Intercollegiate Surgical Curriculum Project (ISCP) for students to provide feedback. Thus a typical ‘evaluation cycle’ for course development (Figure 2) cannot take place15. Given the widespread nature of subjects covered by CBDs, the variations in marking standards by assessors, and concerns with validity and reliability, an overall evaluation of the curriculum may not be possible. However, regular evaluation of the learning process can improve the curriculum and may lead to better student engagement with the assessment process14. Ideally the evaluation process should be reliable, valid and inexpensive15. A number of evaluation methods exist, but all should allow for ongoing monitoring, review and further enquiries to be undertaken.
Figure 2: Evaluation cycle used to improve a teaching course15
Conclusion
CBDs, like all assessments, have limitations, but we feel that they play a vital role in the development of trainees. Unfortunately, Pereira and Dean suggest that trainees view CBDs with suspicion7. As a result, students do not engage fully with the assessment and evaluation process, and CBDs are not being used to their full potential. The main problems with CBDs relate to the lack of formal assessor training in the use of the WBA and the lack of evaluation of the assessment process. Adequate training of assessors will improve feedback and standardize the assessment process nationally. Evaluation of CBDs should improve the validity of the learning tool, enhancing the training curriculum and encouraging engagement of trainees.
If used appropriately, CBDs are valid, reliable and provide excellent feedback which is effective and efficient in changing practice. However, a combination of assessment modalities should be utilized to ensure that surgical trainees are facilitated in their development across the whole spectrum of the curriculum.
Malaria is caused by obligate intra-erythrocytic protozoa of the genus Plasmodium. Humans can be infected with one (or more) of the following five species: P. falciparum, P. vivax, P. ovale, P. malariae and P. knowlesi. Plasmodia are transmitted by the bite of an infected female Anopheles mosquito, and infected patients commonly present with fever, headache, fatigue and musculoskeletal symptoms.
Diagnosis is made by demonstration of the parasite on a peripheral blood smear: thick and thin smears are prepared for detection of the malarial parasite and identification of the species, respectively. Rapid diagnosis of malaria can be achieved by fluorescence microscopy using a light microscope with an interference filter, or by polymerase chain reaction.
We report a complicated case of P. ovale malaria without fever associated with Hepatitis B virus infection, pre-excitation (WPW pattern), and secondary adrenal insufficiency.
Case Report:
A 23-year-old African American man presented to the emergency department with headache and dizziness for one week. He described an 8/10 throbbing headache associated with dizziness, nausea and ringing in the ears; he also complained of sweating but denied any fever. He had had loose, watery bowel movements 3 times a day for a few days and had vomited once 5 days previously. He had no significant past medical or family history. He was a chronic smoker (one pack per day for 8 years) and denied alcohol or drug use. He had travelled to Africa 9 months before presentation, staying in Senegal for 1 month, though he did not have any illnesses during or after returning from Africa.
On examination: T: 97.6°F, HR: 115/min, BP: 105/50, no orthostasis, SpO2: 100% on room air, RR: 18/min. Head, neck and throat examinations were normal, and respiratory and cardiovascular examinations were unremarkable except for tachycardia. Abdominal examination revealed no organomegaly and CNS examination was unremarkable.
Laboratory examination revealed: WBC: 6.4, Hb: 14.4 and Hct: 41.3, Platelets: 43, N: 83.2, L: 7.4, M: 9.3, B: 0.1. His serum chemistry was normal except for a creatinine of 1.3 (BUN 14) and albumin of 2.6 (total protein 5.7). A pre-excitation (WPW Pattern) was seen on ECG and head CT and Chest X-ray were normal.
He was admitted to the telemetry unit to monitor for arrhythmia. Peripheral blood smear (PBS) was sent because of thrombocytopenia and mild renal failure and revealed malarial parasites later identified as P. ovale (Pic. 1 and 2).
He was treated with Malarone; yet after 2 days of treatment he was still complaining of headache, nausea and dizziness. There were no meningeal signs. His blood pressure readings were low (95/53) and he was orthostatic. His ECG showed sinus tachycardia and did not reveal any arrhythmias or QTc prolongation. His morning serum cortisol was 6.20, and a subsequent cosyntropin stimulation test revealed a serum cortisol of 13.40 at one hour after injection. His baseline ACTH was <1.1, suggesting secondary adrenal insufficiency. His IGF-1, TSH, FT4, FSH and LH were all within normal limits. His bleeding and coagulation parameters were normal, CD4 count was 634 (CD4/CD8: 1.46) and a rapid oral test for HIV was negative. His hepatitis B profile was as follows: HBsAg: positive, HBV core IgM: negative, HBV core IgG: positive, HBeAg: negative, HBeAb: positive, HBV DNA: 1000 copies/ml (i.e. log10 HBV DNA: 3.0, since log10 of 1,000 = 3).
His blood cultures were negative, his G6PD level and hemoglobin electrophoresis were normal, haptoglobin was <15 and LDH was 326. MRI of the brain was unremarkable. An abdominal sonogram revealed a normal echo pattern of the liver and spleen, with a spleen size of 12 cm. The secondary adrenal insufficiency was treated with dexamethasone, resulting in gradual improvement of his nausea, vomiting and headache; furthermore, the platelet count improved to 309. Primaquine was prescribed to complete the course of malaria treatment and he was discharged home after 8 days of hospitalization. Unfortunately he did not return for follow-up.
Discussion:
Malaria continues to be a major health problem worldwide. In 2007 the CDC received reports of 1,505 cases of malaria among persons in the United States; 326 cases were reported from New York, all but one of which were acquired outside the United States1.
While Plasmodia are primarily transmitted through the bite of an infected female Anopheles mosquito, infection can also occur through exposure to infected blood products (transfusion malaria) or by congenital transmission. In industrialized countries most cases of malaria occur among travellers, immigrants, or military personnel returning from areas endemic for malaria (imported malaria); exceptionally, local transmission through mosquitoes occurs (indigenous malaria). For non-falciparum malaria the incubation period is usually longer (median 15-16 days), and both P. vivax and P. ovale malaria may relapse months or years after exposure owing to the presence of hypnozoites in the liver, the longest reported incubation period for P. vivax being 30 years2.
Malaria without fever has been reported in cases of Plasmodium falciparum malaria in non-immune people3. Hepatitis B infection associated with asymptomatic malaria has been reported in the Brazilian Amazon4; that study was done in persons infected with P. falciparum and P. vivax with HBV co-infection, though not in a P. ovale group. HBV infection leads to increased IFN-gamma levels5,6, which are important for Plasmodium clearance in the liver7, in addition to their early importance for malarial clinical immunity8. High levels of IFN-gamma, IL-6 and TNF-alpha are detectable in the blood of malaria patients and in the spleen and liver in rodent models of malaria9,10. These inflammatory cytokines are known to suppress HBV replication in HBV transgenic mice9. This might explain the low level of HBV viraemia in our patient, although human studies are required to confirm this finding.
Suppression of the hypothalamic-pituitary-adrenocortical axis, with primary and secondary adrenal insufficiency, has been reported in severe falciparum malaria10. In our case, the patient did not have any features of severe malaria, and parasitaemia was <5%. Further, the MRI did not reveal any secondary cause for adrenal insufficiency. This might indicate that patients with malaria are prone to hypothalamo-pituitary-adrenocortical axis dysregulation, although further studies are required to confirm this phenomenon in patients without severe malaria.
Cardiac complications of malaria have rarely been reported. In our patient, the pre-excitation on ECG disappeared after antimalarial treatment was started. Whether the WPW pattern and its subsequent disappearance were incidental, or caused by the malarial infection and improved with treatment, could not be determined. Lengthening of the QTc and severe cardiac arrhythmia have been observed, particularly after treatment with halofantrine for chloroquine-resistant Plasmodium falciparum malaria11. Post-infectious myocarditis can be associated with cardiac events, especially in combination with viral infections12. A case of likely acute coronary syndrome and possible myocarditis was reported after experimental human malaria infection13. To date, except for cardiac arrhythmias that developed after treatment with halofantrine and quinolines, no other arrhythmias have been reported in patients with malaria before treatment.
Transient thrombocytopenia is very common in uncomplicated malaria in semi-immune adults14. A person with a platelet count <150 × 109/l is 4 times more likely to have asymptomatic malarial infection than one with a count ≥150 × 109/l15. In an observational study of 131 patients, those with involvement of more than one organ system were found to have a lower mean platelet count than those with single-organ involvement16.
Conclusions:
Our case highlights the need for further studies to understand multi-organ involvement in patients without severe malaria, and the importance of early recognition of potential complications to prevent mortality and morbidity in this subgroup of patients.
Infection of a prosthetic total knee joint is a serious complication1 which should be diagnosed promptly2 and treated aggressively. We present a case of MRSA infection of a primary total knee replacement following an IV cannula-site infection that led to bacteraemia and subsequent infection of the knee prosthesis, complicated by Stevens-Johnson syndrome.
The many challenging issues raised, including those of diagnosis and management, are outlined.
Case Report
A 63-year-old lady had an elective total knee arthroplasty for severe osteoarthritis of the knee. She had a background history of well-controlled type 2 diabetes mellitus and was on warfarin for a previous pulmonary embolism. As per hospital protocol her warfarin was stopped before surgery until her INR was <1.5 and she was heparinised, with a view to re-warfarinisation after the surgery. She had an uneventful knee arthroplasty, but unfortunately one of her IV cannula sites became cellulitic. She was empirically started on oral flucloxacillin after blood cultures were taken and the cannula tip was sent for microscopy, culture and sensitivity (as is routinely done under hospital protocol for infected cannula sites).
Surprisingly, the cannula tip grew MRSA and blood cultures confirmed MRSA bacteraemia. She became systemically unwell and septic, and was treated aggressively with parenteral vancomycin for the MRSA bacteraemia. She had a transoesophageal echocardiogram to rule out cardiac vegetations. She gradually improved but then developed typical papular rashes over her palms, the dorsum of the hands, the extensor surfaces of the arms and forearms, the trunk and the buccal mucosa (Figs 1 and 2).
Fig 1: Rash over the dorsum of the hands
Fig 2: Rash over the extensor aspects of forearm
She had a severe allergic reaction to vancomycin, and skin biopsy of the lesions confirmed that she had developed Stevens-Johnson syndrome. An alternative antibiotic was started following discussion with the specialist bone infection unit. She gradually improved over the next few weeks without any problem in her replaced knee. At about 6 weeks post-operatively she developed severe pain and a hot swelling of her replaced knee, with decreased range of motion. Her inflammatory markers were markedly raised and the knee aspirate confirmed MRSA infection of the total knee replacement. She was referred to a specialist bone infection unit due to the complexity of the case, where she successfully underwent a two-stage revision.
Discussion
Infection of a knee replacement is a serious complication that requires significant hospital-based resources for successful management3. The rate of infection of a primary knee replacement varies from 0.5-12%1. Rheumatoid arthritis, previous surgery and diabetes mellitus are all associated with an increased risk of infection4. Although there is no absolute diagnostic test for peri-prosthetic infection2, a high index of clinical suspicion is essential. There has been a case report of an MRSA cervical epidural abscess following IV cannulation5, but to the best of our knowledge there has been no previous report of an MRSA-infected knee arthroplasty following complications of IV cannulation. Stevens-Johnson syndrome is a rare but severe cutaneous adverse reaction related to a variety of medications, including antibiotics6. Parenteral vancomycin is the first-line treatment for MRSA bacteraemia. Vancomycin is recognised to be implicated in inducing Stevens-Johnson syndrome, with a reported mortality of 30-100%7. It is vital that Stevens-Johnson syndrome is recognised early so that offending agents are stopped and supportive treatment commenced. Early dermatological consultation, skin biopsy and direct immunofluorescence7 are essential to confirm the diagnosis so that effective treatment can be instituted. The diagnosis and management of this serious complication is complex and requires considerable resource allocation by the patient, the hospital, the infectious disease specialist and the orthopaedic surgeon1,5.
We present a case of a 48-year-old lady with a history of bony metastatic breast carcinoma who presented with abdominal pain, diarrhoea and bleeding per rectum. She had finished a course of chemotherapy 2 weeks earlier.
On examination, she was febrile with a temperature of 38.4°C. Her blood pressure was 84/54mmHg and pulse rate was 130/min. She had lower abdominal tenderness with bowel sounds present and a small perineal haematoma. Per rectal examination revealed a small amount of fresh blood, but no surrounding crepitus or induration. Rectoscopic examination was not performed.
Initial haematological investigations revealed a haemoglobin of 11 g/dl, white cell count 0.3 × 10⁹/litre, neutrophil count 0.05 × 10⁹/litre and a C-reactive protein of 171 mg/L. A provisional diagnosis of neutropenic sepsis was made. She was managed with analgesia, intravenous fluids and broad-spectrum intravenous antibiotics (piperacillin and tazobactam 4.5 g 3 times per day). An urgent CT of the abdomen and pelvis was arranged for that morning. It showed rectal wall thickening with air in the pelvis, but no tumour or diverticula (see figure 1).
Fig 1: CT scan of abdomen and pelvis showing free air around rectum
Explanation:
Stercoral perforation of the colon is caused by progressive ischaemic necrosis of the bowel wall by a faecal mass. It is the least likely diagnosis here, as it usually occurs on the antimesenteric border of the sigmoid colon and is usually associated with a history of chronic constipation and megacolon.
Typhlitis is a potentially life-threatening inflammatory bowel process that is a recognised complication of systemic chemotherapy. It can progress to bowel necrosis and perforation, but is usually characterised by involvement of the caecum or ascending colon; the rectum is rarely involved.
Clostridial gas gangrene infection occurs with tissue inoculation in a low oxygen tension environment. Approximately 80% of patients without trauma have a malignancy, of which 40% are haematological; however, the vast majority of cases are preceded by trauma, of which there was no history in this case1.
Perineal necrotizing fasciitis is a rare condition, with an estimated 500 cases each year in the UK2. It can affect healthy individuals of any age, but carcinoma and immunosuppression are known to increase susceptibility3. The initial lack of obvious skin findings makes this condition difficult to diagnose, but exquisite pain, especially pain disproportionate to what would be expected from the clinical findings, is seen2,4. Where concurrent signs of sepsis exist, a high index of suspicion is required.
As the disease progresses, the skin may begin to appear smooth, shiny and swollen. Blistering and serous bullae may develop, and haemorrhage into the bullae may occur, giving the appearance of a haematoma, as in our case. Crepitus, induration and foul-smelling watery discharge secondary to liquefactive necrosis can also become apparent2. On CT scans, fascial thickening, fat stranding and gas tracking may be seen in nearly 80% of cases2,5, as was seen in this case.
Discussion
Necrotizing fasciitis is a lethal soft tissue infection characterised by rapidly progressive inflammation and necrosis of the subcutaneous fascial tissues. The adjacent skin and muscle are relatively spared until late in the course of the disease. Treatment with surgical debridement must be instigated without delay or the patient inevitably succumbs to sepsis and multi-organ failure2.
A 24-hour delay in treatment has been shown to increase mortality by 18%, and further surgery is usually indicated, with an average of 3.8 debridements needed overall5,6. Surgical treatment should be instigated in conjunction with broad-spectrum intravenous antibiotics and intensive care. The antibiotics selected should be effective against gram-positive, gram-negative and anaerobic organisms. Adjuvant therapies such as hyperbaric oxygen, intravenous immunoglobulin and activated protein C are of uncertain value.
Following surgery the patient is invariably left with a large tissue defect. Perineal wounds are particularly complex and present multiple challenges, including the risk of infection from faecal contamination. Thus diverting colostomies are advised, and a Vacuum Assisted Closure (VAC) system may facilitate wound healing5.
Gastro-oesophageal reflux (GOR) is the passage of gastric contents into the oesophagus. In most infants with GOR the outcome is benign & self-limiting. (1)
Incidence/Prevalence
Peak incidence of GOR is around 4 months of age, and it resolves spontaneously by 1-2 years of age in most patients. (2)
Regurgitation (possetting or spitting up) is the most common presentation in infants with GOR. Regurgitation of at least one episode a day is seen in:
50% of infants 0-3 months
67% of infants at 4 months
5% at 10 to 12 months of age (3)
It is important to note that in infants (younger than 1 year of age) who are otherwise well, regurgitation may be considered entirely normal. (4)
Causes/Risks
GOR occurs due to the transient, inappropriate relaxation of the lower oesophageal sphincter, which allows the stomach contents to pass into the oesophagus.
GOR can be physiological or pathological:
Physiological GOR – when the infant has normal weight gain and experiences no complications and is generally well.
Pathological GOR – also known as gastro-oesophageal reflux disease (GORD) – is when reflux is associated with other symptoms such as failure to thrive or weight loss, feeding or sleeping problems, chronic respiratory disorders, oesophagitis, haematemesis etc. (3)
Several anatomical and physiological conditions make infants (younger than 1 year of age) more prone to GORD than older children and adults:
Short, narrow oesophagus
Delayed gastric emptying
Shorter lower oesophageal sphincter that is slightly above, rather than below, the diaphragm
Liquid diet and high calorie requirements, putting a strain on gastric capacity
Larger ratio of gastric volume to oesophageal volume (4)
Most children have no specific risk factors for GORD. Children with the following conditions are at increased risk for developing GORD and for progressing to severe GORD:
Severe neurological impairment
Prematurity
Cystic fibrosis
Gastro-oesophageal abnormalities (even after surgical repair), e.g. Oesophageal atresia, diaphragmatic hernia, pyloric stenosis
Bronchopulmonary dysplasia (preterm infants with lung disease)
Hiatus hernia
Oesophageal sphincter disorders
Raised intra-abdominal pressure (5)
Symptoms
GORD in infants and children can present with a variety of symptoms, many of which are relatively non-specific. Equally, other pathologies may lead to the development of reflux. Symptoms in the early years tend to be based on parental observations, while older, more vocal children describe symptoms more akin to adult presentations.
As such, the history/symptoms will be broadly divided into those expected for infants (<1yr), young children (1-5yrs) and older children (>5yrs).
Infants(6-10)
Excessive possetting/regurgitation
Possetting is a normal phenomenon in infants
Frequent episodes, together with vomiting may indicate underlying GORD
Projectile vomiting may indicate an obstructive pathology
Difficult/rapid cessation of feeds
There may be difficulty initiating feeds and latching
Early cessation may be precipitated by the onset of reflux
Failure to thrive
Weight loss is not always present
Weight loss crossing centiles on the growth chart must be addressed urgently
Sleep disturbance
Particularly after an evening feed
This is often associated with irritability and inconsolable crying
Irritability and inconsolable crying
One of the commonest presentations to the GP
This may occur during feeds or shortly afterwards
Apnoeic episodes
A witnessed pausing in respiratory effort
Occurring at night, it can mimic obstructive sleep apnoea
This may indicate a more serious underlying pathology and requires urgent assessment
It is likely to be more prevalent in this age group
Young Children(6-10)
Regurgitation/vomiting
Beating/rubbing the chest may be an early sign of this pathology
Reflux symptoms can be typical of those in adults
Failure to thrive
Refusing food
Similar to the infant; however, the young child can be more vocal in their refusal
Abdominal/chest pain
With increasing age, children may demonstrate gastric irritation with abdominal pain
Acid reflux producing oesophagitis may present as chest discomfort
Both are similar to symptoms adults experience
Irritability
Persistent/nocturnal cough/wheezing
There may be a dry, non-productive cough
Secondary to pharyngeal irritation
There may be no co-morbidities or underlying pathologies
Symptoms can be mistaken for asthma by parents
Older Children (9)
Dyspepsia/vomiting
These symptoms in older children are thought to have a similar diagnostic reliability to that in adults
Dysphagia/odynophagia
As children become more articulate they may be able to describe these symptoms in relation to meals
Particularly with chronic GORD and the development of a Barrett’s Oesophagus
Abdominal/chest pains
Persistent/nocturnal coughing/wheezing
Other Symptoms
Symptoms which can be identified but which may be considered less life-threatening include:
Dental erosions
Hiccups
Halitosis
Those deserving urgent investigation and intervention include:
Forceful/Bilious vomiting
Suggesting a possible obstructive pathology
This requires urgent surgical referral
Force of vomiting may not always indicate the severity of the problem
Upper gastrointestinal bleeding/haematemesis
This may be a consequence of increased pressure from vomiting
Similar to a Mallory-Weiss pathology
An urgent review by local Paediatric Gastroenterologists is warranted
Profuse diarrhoea or constipation
Failure to thrive/weight loss
Lethargy
Apnoeic episodes
Physical Signs
As with the previous section, physical signs will be considered for each age range as above: infants (<1yr), young children (1-5yrs) and older children (>5yrs).
Infants(9)
Irritability when lying flat
Particularly following feeds
Especially when supine
Weight loss
Regular monitoring with repeat measurements
A single weight measurement cannot demonstrate loss
This is usually a late sign
Arching of the back
Secondary to oesophageal irritation
Can be associated with increased tone and crying
Dehydration
Loss of fluid through vomiting
Look for
Dry mouth
Sunken fontanelle
Prolonged capillary refill time
Reduced skin turgor
Reduced urine output
Crying without tears
Apnoeas
Periods of reduced respiratory effort
Noted by parents as pauses in breathing
Young Children(9)
Weight loss
Dehydration
Anaemia
Associated with chronic symptoms and gradual loss of iron
Look for pallor/pale conjunctivae, glossitis, angular stomatitis and pica
Dysphagia/choking with food
Particularly with prolonged GOR and development of stricturing
Difficulty in breathing/wheezing/lower respiratory tract infection (LRTI)
Similar to asthma on examination
Signs of LRTI on auscultation
Possibly stridor
Older Children(9)
Weight loss
Dehydration
Anaemia
Dysphagia/Choking with food
Difficulty in breathing/Wheezing/LRTI
Persistent sinusitis
Signs requiring urgent intervention include (9):
Hematochezia
Unaltered blood in stool
Stools take on a red appearance
Onset of vomiting after 6 months of life
Fever
Uncommon with GOR
Indicating an infective pathology
Hepatosplenomegaly
An underlying condition other than GOR is likely
Important pathologies must not be missed
Bulging fontanelle
Indicating increased intracranial pressure and an alternative pathology underlying the reflux
Macro/microcephaly
Suggestive of hydrocephalus or a congenital malformation
Seizures
Related to a number of other problems
Metabolic pathologies should figure highly in any differential diagnosis
Abdominal distension with reduced bowel sounds
Tinkling bowel sounds and pain may suggest bowel obstruction
Differential diagnoses
Common differential diagnoses are noted in Table 1; however, this is by no means a definitive list of conditions or presentations. It should be taken as an indication of the diverse presentations that can mimic or precipitate GOR (adapted from (9) and (10)).
Table 1: Differential diagnoses of GOR
Pyloric stenosis – History/Symptoms: sudden-onset vomiting; constantly hungry baby; usually males; first 4-6 weeks of life. Signs: non-bilious projectile vomiting; visible peristalsis; positive test feed.
Malrotation – History/Symptoms: sudden-onset pain in volvulus; reduced bowel movement; vomiting. Signs: bilious vomiting; abdominal distension; pulling up legs with pain onset.
Cow's milk allergy – History/Symptoms: vomiting and diarrhoea; eczema; relationship to feeds; failure to thrive. Signs: urticaria; watery stool; weight loss crossing centiles.
Constipation – History/Symptoms: infrequent stools; straining; blood in nappy. Signs: palpable stool on examination; irritable baby.
Urinary tract infections – History/Symptoms: vomiting; fever (can be without focus); poor feeding. Signs: lethargy; reduced urinary output; abdominal pain.
Viral gastroenteritis – History/Symptoms: vomiting; diarrhoea; fever; lethargy. Signs: dehydration; viral rash.
Hypocalcaemia – History/Symptoms: poor feeding; lethargy; tetany; seizures. Signs: seizures; apnoeas; tremor; abdominal distension.
Hydrocephalus – History/Symptoms: vomiting; lethargy; confusion; visual changes. Signs: increased head size; gait change; altered consciousness.
Meningitis – History/Symptoms: fever; lethargy; vomiting; confusion. Signs: neck stiffness; photophobia; rash (late onset).
Drugs/toxins – History/Symptoms: vomiting; lethargy; ingestion history. Signs: dependent upon drug ingested.
Investigations and management of infants (<1 yr old)
Complicated cases of GORD (not gaining weight/faltering growth, or non-GI symptoms, e.g. cough) should be referred to a Paediatrician while causes are investigated and simple management instituted.
Simple investigations to do in primary care:
Abdominal examination for hernias/pyloric stenosis (test feed)
Urine dip to rule out UTI
Blood tests for electrolyte abnormalities, coeliac screen (if weaned)
Referral to a Paediatrician will result in imaging investigations, such as an abdominal x-ray and an upper GI contrast study to rule out malrotation/hiatus hernia/achalasia (in older children); sometimes GORD can be seen on contrast studies. The Paediatrician may go on to arrange a pH/impedance study, upper GI endoscopy or allergy testing.
Management
Calculate feed requirements, as parents may be over-feeding; the approximate fluid requirement is 100-120 ml/kg/day, given every 3-6 hours (depending on age and whether the infant is weaned on to solids)
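As a hypothetical worked example (using the upper figure of 120 ml/kg/day): a 5 kg infant would require approximately 5 × 120 = 600 ml over 24 hours, i.e. about 100 ml per feed if fed 4-hourly (six feeds). Reported intake consistently well above this would suggest over-feeding as a contributor to regurgitation.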
In thriving infants there is no evidence that pharmacological therapy will make a significant difference to symptoms.
Therefore the mainstay of management is reassurance. Simple pharmacological intervention can be tried with a feed thickener (in formula-fed babies) or alginates, e.g. Gaviscon (which can be mixed with water for breast-fed babies)
If there are continued concerns, refer to a Paediatrician for ongoing investigations and management.
Recent evidence shows that some infants may have cow's milk protein intolerance (9). Therefore, for breast-fed babies the mother could try cutting out dairy from her diet (it is important to have supervision from a dietician regarding nutritional requirements while breast feeding). Formula-fed babies can have a 2-week trial of a hydrolysed/amino acid-based formula, e.g. Pregestimil, Nutramigen or Neocate.
Reviews from ESPGHAN (9) and DTB (11) suggest that H2RAs (H2 receptor antagonists, e.g. ranitidine) may help, though there is little evidence; these could be commenced while waiting for an appointment with the Paediatrician.
(Currently there is no role for Domperidone. The next medication a Paediatrician may try is Omeprazole ± omission of cow’s milk protein) (11)
Investigation and management of older children (>18 months)
As before, complicated cases of GORD (not gaining weight/faltering growth or non-GI symptoms e.g. cough), should be referred to a Paediatrician while investigating for causes and instituting simple management.
Investigations
Urine dip, if there are symptoms of vomiting
Stool H. pylori antigen test
Blood tests including inflammatory markers, H. pylori antigen and coeliac screen
Management
If the main symptom is heartburn with no evidence of H. pylori:
Reassurance and lifestyle changes (weight loss, dietary changes, timing of meals), plus up to a 4-week trial of a PPI (proton pump inhibitor, e.g. lansoprazole or omeprazole).
If symptoms improve, continue the PPI for up to 6 months, then wean off over 4 weeks (there is evidence that patients may get rebound symptoms if the PPI is stopped suddenly) (10).
If the PPI does not help, or symptoms recur after stopping it, then refer to a Paediatrician.
The Paediatrician may investigate with more blood tests e.g. Autoimmune screen, allergy testing, imaging, pH/impedance study, endoscopy.
Problem based learning (PBL) has been an important development in health professions education in the latter part of the twentieth century. Since its inception at McMaster University1 (Canada), it has gradually evolved into an educational methodology employed by many medical schools across the globe2,3. PBL represents a paradigm shift in medical education, with a move away from a ‘teacher-centered’ to a ‘student-centered’ educational focus. The assumptive differences between a pedagogy learner and an andragogy learner (Table 1) were summarised by Knowles4, and the andragogy approach underpins PBL. This shift has redefined the role of a teacher in the PBL era, from teacher to facilitator.
Table 1: Differences between the Andragogy and Pedagogy learner (Knowles)
Characteristic – Pedagogy – Andragogy
Concept of the learner – Dependent personality – Self-directed
Readiness to learn – Uniform by age-level & curriculum – Develops from life tasks & problems
Orientation to learning – Subject-centered – Task- or problem-centered
Motivation – By external rewards and punishment – By internal incentives, e.g. curiosity
It is well known that implementing PBL as an educational methodology requires additional resources compared with a traditional lecture-based curriculum5. In addition, there was a need to recruit and train a large number of tutors to facilitate the PBL process6. Training PBL tutors is an important component of a successful curriculum change, and is a continuous process. Training workshops and role plays were employed to train conventional teachers, but challenges were faced in developing them into effective PBL tutors5.
The aim of this paper is to evaluate the literature for any evidence supporting the theory that a student from a PBL background may develop into an effective PBL tutor. The Medline, EMBASE and CINAHL databases were searched for any pre-existing literature or research supporting this theory.
Results:
To the best of my knowledge, there is no reported evidence supporting this theory. Given the limited literature, this paper aims to identify common ground between a PBL student and a PBL tutor, and to consider whether being a PBL student may contribute to overall development as a PBL tutor. The discussion revolves around the following domains:
1. Teaching Styles:
The ideal teaching style of a PBL tutor is a facilitative-collaborative style, which augments and supplements the PBL process. The teaching style inventory developed by Leung et al7 hypothesised four domains of teaching styles: the assertive, suggestive, collaborative and facilitative styles. Though a PBL tutor may assume that he possesses the facilitative style, this does not necessarily match the students’ perceptions, as reported by Kassab et al8.
Some of the characteristics of being a PBL student may foster the development of a collaborative teaching style. As a student, one is expected to be a collaborative learner, which is critical for achieving and improving group performance9. Initial years as a PBL student may contribute to developing the attributes required for this preferred teaching style.
2. Facilitating critical thinking:
PBL is grounded in cognitive psychology and sets out to stimulate curiosity and build durable understanding. One of the roles of the tutor is to foster critical thinking and enhance the group’s ability to analyse and synthesise the given information. This attribute stems from the tutor’s ability to facilitate, rather than teach. Irby10 opined that clinical teachers tend to teach as they themselves were taught, using traditional approaches, which may affect the process of stimulating critical thinking among students.
A tutor from a PBL background would have the ability to think critically, developed through the process of forming a thoughtful and well-structured approach to guide their choices11. Tiwari et al12 showed that PBL students demonstrated significantly greater improvement in critical thinking than students on traditional courses. Hence, prior exposure to a certain learning style can create a cognitive foundation that contributes to tutor development.
3. Group dynamics:
One of the prime roles of a PBL tutor is to facilitate the PBL process by keeping the group focused on tasks and guiding them to achieve their goals. Tutors who are skilled in group dynamics are evaluated more highly than those who are not11,13. Tutors need to develop a sound appreciation of group dynamics, failing which they may foster uncertainty within the group. Bowman et al13 commented on the lack of consideration given to the emotional demands placed on prospective PBL tutors when tutoring small groups, especially the skills required to balance short-term anxieties against potentially serious problems. This imbalance, which usually manifests as unconscious incompetence, may affect group dynamics.
PBL students would have experience of group dynamics and the pressures of working within a group. They would have developed a model of working with members with varying attributes. Bligh et al14 showed that students from a PBL curriculum rated themselves better in team working and motivation than their conventional-course peers. This highlights the fact that an apprenticeship model may be necessary in developing the right skills to be an effective tutor.
The characteristics of a student that may foster ideal attributes in a PBL tutor are briefly summarised in Table 2, which has evolved from the work of Samy Azer9,11.
Table 2: Common ground
Ideal PBL student – Ideals of a PBL tutor
Knows his role within a group – Would help in identifying different roles students may play
Knows to ask empowering questions – Would help in guiding groups to achieve their learning objectives
Monitors his own progress by self-evaluation and motivation – Would help in monitoring individual progress and motivating the group
Bonds with other members to achieve goals – Would help in building trust and encouraging bonding of group members
Develops a thoughtful and well-structured approach to guide choices – Would help in facilitating critical thinking
Fosters collaboration with other group members to create a climate of trust – Would facilitate a collaborative teaching style
4. Tutor training:
Considerable resources are expended in teaching new tutors the art of facilitating a PBL group6, and the usual cohort is teachers from a conventionally taught background. The shift from didactic expertise to facilitated learning is difficult for those tutors who feel more secure in their expert role. Finucane et al5 showed that only a minority of staff volunteered to be PBL tutors, possibly reflecting their lack of prior exposure to the PBL style of learning. Despite tutor training workshops, only 73% of tutors were retained at the end of two years.
Prior exposure as a student may help negate much of the stigma associated with PBL. Such tutors would have observed and learnt from their own PBL tutors, and would have analysed their contribution to the PBL process. They could reflect on their experience and evolve into ideal PBL tutors. This would help minimise resource expenditure and contribute towards retention of staff.
5. Tutor comfort zones:
PBL contextualises learning to practical situations, with integration across disciplinary boundaries. Dornan et al15 reported that some teachers felt PBL to be a frustrating drain on time, as it did not fit their educational style and was a distraction from clinical teaching, demonstrating the ‘conditioning effect’ of prior experiences. This further fuels the debate between content and process expertise, but prior knowledge of the process would benefit both the students and the PBL process.
6. Role modeling:
Role models have long been regarded as important for inculcating the correct attitudes and behaviours in medical students. Being an ideal role model is considered one of the prime requisites of a teacher. In a recent study, McLean et al16 showed that PBL students tended to have a higher percentage of role models than students from a traditional programme (73% vs. 64%). In an ideal setting, a “content and process expert” would be the perfect role model for PBL students, but this may not be realised in all settings.
Paice et al17 commented on the resistance to change within the medical profession, and highlighted the need for training to emphasise the values and attitudes required. This puts an added emphasis on the tutor to demonstrate tenacity and virtues to be an effective role model, avoiding ‘cognitive clouding’ from previous experiences.
As PBL students, they would have been exposed to a variety of PBL tutors. They would have incorporated the good points of an effective PBL tutor, and would have reflected on the negative aspects. Reflective practice enables them to develop the right attributes. Though these attributes may be difficult to develop through training workshops, having a background of PBL education may help mould the tutor’s characteristics.
Conclusion:
As PBL continues to be employed across different specialties, there will be increased pressure on medical schools to match the resources needed to implement it. There is an argument for developing an apprenticeship model or recruiting tutors from a PBL background, which would help reduce the cost of training new tutors and nullify the negative influences a new tutor may bring. The biggest limitation in the present setting is finding a cohort of tutors from a PBL background, but an apprenticeship model may benefit teachers from a conventional background. A prospective research study exploring the attributes of tutors, successful and less successful, from traditional, PBL and hybrid curricula, and of those who have crossed the Rubicon from traditional to PBL, could effectively answer this question.