An Open Letter to the College
As the new-format FACEM exit exam is in the process of being marked, there is now an opportunity to draw to the attention of the College, its officers and examiners significant issues which have already adversely affected a considerable number of trainees and will no doubt continue to do so as the marks are released. The opinions put forward in this letter have the broad support and agreement of many of my colleagues who have been preparing for the 2015.1 exam, many of whom expressed marked reservations about voicing their concerns, fearful of reprisals from the College. I believe such fears are unfounded and trust that the College will be genuinely interested in the experience of candidates and will seek to resolve the issues raised herein before exposing candidates to the second exam in the new format.
There is a phrase, originally coined by the British army, that has since made its way into civilian use and will, no doubt, be recognised by many non-military personnel. The more polite version of this phrase is Proper Planning and Preparation Prevents Pretty Poor Performance, often shortened to “The Seven P’s”, but I much prefer the version which describes Piss Poor Performance – so much more evocative, don’t you think! The use of this particular military phrase has its roots among the litany of disastrous campaigns and planning errors that haunt our past and directly contributed to heavy loss of life. It is intended as an aide-mémoire, in the hope of avoiding future catastrophes by prompting planners of all things to stop, think and actively avoid the avoidable. With respect to the new iteration of the FACEM exit exam, it would appear that the Australasian College for Emergency Medicine has disregarded this wise military adage, much to the disadvantage of its own trainees, who will form the future of the College.
Before proceeding, I would like to clarify that I fully support the efforts of the College to improve the structure of the FACEM examination, and I anticipate that the new format will continue to improve the quality of training. The change itself is not the cause of my complaint. Where my argument with the College begins and ends is the shoddy and amateurish implementation process that we, as candidates, have been exposed to in recent months; it is the implementation of the current format that is unfair, unjust and unreasonable.
The build-up to the exam was, quite frankly, a debacle of misinformation and misdirection, with little in the way of effective engagement with candidates. I am struggling to express the immense frustration experienced as a succession of well-meaning and honourably intentioned specialists attempted to advise on an optimum approach to the exam, only to acknowledge that they did not really know what was actually expected and were offering only their best educated guess.
That poor state of affairs was compounded by the often contradictory advice given by other specialists. Effectively, the College appeared to be keeping its very own specialists in the dark about the new exam format. How can a professional college conducting a specialist examination process possibly justify such a level of disregard for its future specialists?
The promised release of example questions never happened – a couple of MCQs, a smattering of EMQs and a handful of SCQs with minimal marking criteria, provided on the College website, was a pathetic effort and, to be honest, a complete waste of bandwidth to download. The mantle of secrecy that the College maintained over the nature of the new exam was reminiscent of playground tittle-tattle and actually compromises the professional integrity of the College.
The approach to how the exam would be assessed and marked seemed to be constantly changing, creating a feeling of uncertainty amongst candidates. Changes to the exam process were still occurring literally in the final few days before the exam was held, and this led to confusion, anxiety and uncertainty, all of which had a direct and negative impact on candidates’ preparation.
The MCQ exam in the morning was conducted on computers, a very reasonable and sensible approach, but the start of the exam was the very first time any of the candidates had seen the software. In addition to the anxiety and mental energy of answering the actual questions, a proportion of that energy would have been spent working out how to use the software. This approach was inconsiderate to the candidates, unacceptable and entirely avoidable. How hard would it have been for the College to allow candidates access to a trial version of the MCQ software in order to become familiar with it? The answer, of course, is that it would have been very easy, if only some thought and consideration had been applied. Such forethought and consideration was starkly absent from the College’s preparation for this exam.
The pinnacle of the College’s Piss Poor Performance was the SCQ booklet, which gave the appearance of having been knocked up on a $50 printer from Aldi and photocopied too many times. The images looked as if they were low-quality JPEGs downloaded from the internet; the ECGs in particular were of such poor quality that the lines on the ECG paper were indistinguishable, and two wrist x-rays were reproduced at a size smaller than your average smartphone screen, yet so pixelated that the images had more in common with Minecraft than with the professional exam this was supposed to be. Minor problems, you may argue, but with the pressure of answering a question in six minutes, is it reasonable to expect the candidate to squint at an ECG, trying to calculate those essential characteristics, or to peer at a tiny x-ray image?
The booklet itself was bound with staples in such a way that, particularly towards the middle of the booklet, it was actually impossible to make full use of the space provided for the answer – hardly the hallmark of a professionally executed exam. Was this deliberate cost-cutting, or was it just another aspect of the College’s woeful planning?
The time-critical nature of the final exam had been hammered into us trainees for years under the old exam format, and this emphasis has continued, yet the actual exam was almost impossible to pace. Where we have been told by our trainers that it should be a strict six minutes per question, this was impossible to maintain, since some of the questions were four or five pages long, compared to others which were much shorter. The questions were not equal at all and certainly did not require the same time to answer. Why was this vital information not conveyed to the trainers and trainees? Why was a bank of sample questions in the new format, along with marking schemes, not made available to the candidates? The provision of a handful of questions with scant comments does not constitute an acceptable body of questions from which to practise.
The College must, somehow, be held accountable for this debacle. But it won’t be; it will be the candidates who shoulder the consequences of the College’s poor planning and preparation; it will be the candidates who have to shell out another few thousand dollars for another exam attempt; it will be the candidates who miss the opportunity of a specialist appointment; it will be the candidates and their families who are compromised. I am sure the College will absolve itself of all accountability by smugly declaring that the candidates did not prepare themselves properly, but how can we be expected to prepare ourselves properly when the College itself was not sure what was happening even in the final few weeks before the exam? The College was not prepared for this exam and should have been honest and transparent with its trainees from the outset.
Dr Chris Cheeseman NSW trainee
Thanks for this – I particularly agree with the final paragraph. Since the exam I’ve felt so utterly helpless that the progression of my career is dependent on a group of people who obviously either don’t care or are incapable of assessing my skill or knowledge in my chosen field of medicine. I feel like there should be a higher power I can appeal to in order to have the exam reviewed independently – but there is none. No ombudsman or anything – just me and a college that will get away with whatever it wants, at the cost of my time, effort and mental wellbeing.
The timer issue and the staples-made-it-hard-to-write issue are both bollocks, but everything else you said rings very true. Your final thoughts are the same as mine.
This exam was not ready for deployment. The college should have had some test runs amongst the DEMTs, or recently qualified FACEMs (with fresh/recent exam experience/memories/PTSD) or simply bailed, admitted they weren’t sufficiently prepared, and had only one exam in 2015, in August/October.
Yes, couldn’t agree more.
speaking for myself (and maybe for many more): All faith and trust in the college has gone.
Pretty easy to ruin a reputation if you think about it.
Having the nerve to “reassure” us in last Friday’s email that everything will be done to provide us with a fair process unfortunately comes across more like “stinging irony”, causing more hurt rather than real reassurance… The feeling of being BETRAYED by my own colleagues (namely the college) has not left me since coming out of the examination room after the SAQ.
Day after day I am asking myself: “When are “they” finally pulling the plug?” There has to be someone out there really who takes leadership and responsibility to state the obvious:
“This SAQ did NOT test what it ought to be testing!” This is because it had MAJOR design faults! Hence, it is not valid!
One doesn’t really need to be a specialist in the field to come to that conclusion. Common sense and an unclouded, unbiased look at the SAQ will not leave a lot of room for debating it.
Indeed, at the end of my training and after nine months of intensive preparation – including having done the eight trial SAQs available to me (AFEM, APEM, GCUH, Monash, Adelaide, Prince Charles, Princess Alexandra, NSW) to time twice (some of them three times) in order to get my timing right and to work on my (poor) handwriting – I find it pretty disheartening that the best the actual SAQ did was assess “my writing speed”, “how well my brain functions under high pressure over 3 hours without food” and “my ability to write neatly under these conditions”. If this is what the college wanted to test, maybe they have a valid exam!
I’d even suggest someone does a 4.10 on these roughly 260 papers written by us, looking at performance decline over time (from question 1 to question 30)…
The reading-time to writing-time ratio alone was so bad (considering we had 6 minutes for both reading and writing) that on that facet alone the SAQ has no leg to stand on… leave alone all the other mentioned shortcomings… (and the mantra of our teachers still echoes in my ear: “Read the question, answer the question!” I so would have loved to do that, if only I had been given reasonable time to do so!)
I think it would be in the college’s own best interest to stop the nonsense and pain soon (and definitely BEFORE giving out results based on an unfair and invalid test) in order to prevent further landslides of “lost confidence”…
All those who were responsible for this calamity should resign without any hesitation, accepting that they are incompetent and not fit to test a candidate to be a consultant. They should apologise openly for the misery caused to the hundreds of candidates who took the exam, and to the hundreds more in whom they have instilled fear. This is worse than a mass experimentation by dictators in history.
“this is worse than a mass experimentation by dictators in history”
Well, no, it isn’t.
Hi there, I’m preparing for the exam and just wondering what I can do to perform better.
– From what I have heard, the computer system for MCQs sounds very similar to what we used for the primaries; that was the first time I had ever used it, and it worked fine in the exam. It would be nice to have a trial to see what it looks like before the actual thing, but it isn’t necessary.
– Can someone give an example of a question that they were not able to answer in 6 minutes? I keep hearing about SAQs being unreasonable, but if, for example, a question details a patient with an arrhythmia, then shows an ECG with changes of hypokalaemia asking for interpretation of the changes, then asks for causes of hypokalaemia and management issues etc. – is that not fair for 6 minutes?
OR are they just questions that are too detailed to even read, and hard to come up with an answer to in the first place?
The exam report is now out! It has become apparent that time was an issue for many candidates, but the pass mark was very high. We suggest pacing per hour rather than per question (e.g. be at question 10 by the one-hour mark, and so on).
Hi
After the Written, the OSCE felt like a bad practical joke. Everyone knew what was being asked in the exams. The College, I must say, was never ready for the transition but still went ahead with it. I feel sorry for all the candidates, but more so for groups 1 and 2 of the OSCE on Days 1 and 2, because they were the only ones who did not know what they were going to be asked. The purpose of this email is not to doubt the credibility of the candidates but merely to point out that the College should have done some homework before going ahead with this format. Once again, let me say this does not question the credibility of the candidates, but the process of the examination.
If this is how you feel, I suggest you read my blog on Edward Reje’s forum.
Good Luck!
Recycling the OSCE stations clearly disadvantaged the candidates who had no idea what they were going to be asked, especially the first few groups of the first two days. Later candidates could prepare and practise the best answers, which must have pushed the pass mark for each question higher, producing an artificially skewed curve that failed average candidates unfairly. Isn’t this exam flawed? How is the College going to justify this if someone challenges the credibility of the OSCE examination process?