PROJECT WUCAA - WAKE UP CAA! SOUTH AFRICA’S INSTRUMENT RATING DEBACLE

South Africa’s Instrument Rating Debacle

Professional, or even just competent, pilots should be able to fly in instrument conditions. Yet the South African Instrument Rating has become effectively unobtainable for many new commercial and private pilots who need to be able to fly safely in undemanding instrument conditions. This is making flying more dangerous for these pilots.

 

South African pilots and flight schools have been calling for an accessible instrument rating for years. In the United States more than half of all pilots have instrument ratings. In South Africa, however, the number of private pilots who have obtained instrument ratings in the past two years is negligible – and flight schools are producing commercial pilots without instrument ratings. The CAA, to its credit, made an effort to roll out the stand-alone, and therefore more accessible, instrument rating in July 2016, yet two years later it is less accessible than ever.

There are numerous examples of students being unable to pass the instrument rating (IR) exam. One flight school reports that a student who had passed all the other Commercial Pilot Licence (CPL) exams needed no fewer than 13 attempts to pass the IR exam. And once you have failed an exam three times, the CAA makes it progressively more time-consuming to get another exam date. Even more remarkable is that a number of students have gone on to pass all the Airline Transport Pilot Licence (ATPL) exams and yet are still unable to pass the IR exam. If the IR exam is at the right standard, are all the other exams (CPL and ATPL) too easy? One undesirable consequence is that students are going overseas, typically to the USA, to get their instrument ratings.

A revised stand-alone instrument rating was first proposed at the CAA’s CARCom meeting as far back as August 2013. The impetus came from private pilots who wanted an instrument rating but argued that having to write all but one of the CPL theory exams was impractical and prohibitive. This was widely considered a good idea as it would improve pilot skills and thus safety. So in January 2014 the CAA proposed a new rating, which would consist of just one exam with an emphasis on practical training. To allow flight schools to get up to speed with the new course material, it was agreed that it would be implemented after an 18-month gestation period.

This date was later pushed out to 1 July 2016, giving flight schools two and a half years to prepare. The bottom line is that the new IR should not have come as a surprise to any flight school. Yet loud protests ensued as pilots struggled to get to grips with the IR exam and, worryingly, the failure rate was greater than 95%.

The big question is: does the problem lie with the CAA, or with the flying schools? It is probably the fault of both sides. Students who are struggling to pass the IR exam naturally complain that it is unfair (and their instructors claim that some questions have wrong answers), while those at the CAA, who spent years working on the exam and are proud of it, say that training, knowledge and understanding are not at the required standard.

The CAA is also guarding its new database of exam questions in an attempt to stop the private sector passing the questions and answers to students. This would defeat the intention of the IR exam, which is to test understanding. However, the exam has already been rewritten so many times by pilots trying to get their instrument rating that the questions are already out there.

Designated Flight Examiners (DFEs) acknowledge that they’ve been asked by the CAA for input on the exam, but the DFEs running large flight schools aren’t allowed to see it. However, some independent DFEs (not affiliated to a flight school) have had a look at the exam and given constructive feedback, some of which has been very helpful. The CAA has said that if it receives a formal request from DFEs to see the exam, it will consider it on a case-by-case basis.

There is, however, something of a breakdown in trust between DFEs and the CAA – possibly as a result of the problems with DFE oversight described last month. The CAA tried to involve the DFEs during 2015-2016, inviting every South African DFE to participate in the quality and/or testing phase of the examinations, yet it received very few responses. For their part, the DFEs’ trust in the CAA seems to have broken down because they feel the CAA lacks the necessary experience and expertise to oversee pilot training.

It takes a mature mindset to continue reaching out to an industry that often isn’t willing to engage constructively with you. In further efforts to engage with industry, the CAA speaks to students about the exam, whether they have passed or failed, discussing where they did well, where they went wrong and why they performed as they did. The CAA also has an email address dedicated to student interaction (examfeedback@caa.co.za), but receives little feedback.

The consensus at flight schools that have worked constructively with the CAA appears to be that, in the first months after the new exam was implemented, pass rates were dismal. But as the schools jacked up their ground school, pass rates improved. This is what the CAA was aiming for – students gaining practical experience rather than learning in isolation. Before we pop the champagne, however, the findings must be tempered with the fact that, as time passes, question databases are being continuously updated and distributed. It is therefore safe to assume the improved pass rate is a combination of well-structured ground schools and the proliferation of past exam questions, whose answers students memorise rather than understanding the subject matter.

An undesirable consequence of this more challenging exam is that flight schools are now producing more VFR-only commercial pilots, and therefore even fewer instrument-rated pilots. This is the opposite of what was intended. Perhaps even worse, flight schools report that they are losing students to American flight schools, as students feel they can more easily obtain an instrument rating there.

The appeal of the FAA instrument rating is that the theory aspect is relatively simple; the emphasis is on practical training, and here the FAA is tougher than our CAA. It makes sense – flying is a practical exercise, so focussing on practical skills is important. It is also argued that the FAA has a far deeper pool of training and skills-testing experience than South Africa, and so has the expertise to focus on practical training to develop the necessary skillset.

South Africa’s tough written exam is more closely aligned with the EASA syllabus, and the recommended study material is predominantly the Jeppesen EASA ATPL manuals. A complaint from a well-regarded flight school is that, while mirroring the EASA syllabus is a good idea, South Africa has instead “tried to re-invent the wheel when we don’t have the skillset and experience to do it properly.”

The CAA’s intentions were good – test for understanding and application, and turn seven exams into one. The trouble is that a lack of experience, coupled with poor DFE input, means implementation hasn’t gone as planned. Successfully designing such a broad exam is an undeniable challenge, and taking it is no joke either. Having to answer questions that jump from Met to Air Law to Human Factors, Instruments and Radio Aids, some of which require combining knowledge from different topics to arrive at the correct answer, and to do so fairly quickly, can derail students – you need to know the material well!

 

SOLUTIONS

When students fail, the majority are within striking distance of a pass: just five or so more correct answers, out of 100, would be enough. This implies that students are struggling with application, as close on 60% of the exam is purely theory, with the remaining 40% being application questions that combine knowledge from across the subjects and ask you to apply it to a situation. These are certainly more difficult to answer, but this type of thinking is exactly what is required in actual IFR flight and decision making. In such cases rote learning isn’t enough.

The solution has two aspects. Firstly, students must drop the mindset of completing a tedious checklist of exams. They must learn to apply themselves, take the exam seriously, put in the necessary study time and improve their exam technique, all of which can be achieved by attending a structured ground school. Secondly, the CAA needs to accept that re-inventing the EASA wheel hasn’t worked, and should rather realign the syllabus with one compiled by aviation authorities that are far more experienced than South Africa’s.

 

THE PRACTICAL SIDE

The practical aspect of the instrument rating, although requiring flight schools to rewrite their instrument training syllabus, has achieved far more industry support, and is generally viewed as an improvement. It now consists of six phases, the last two of which focus on Line Operational Flight Training, Simulation and Evaluation (LOFT, LOS and LOE). This brings flight training in line with real world situations and is far more valuable than merely practising intercepts and holds around a VOR – or even an NDB.

How the practical training relates to the exam has raised some concerns, in that students have been restricted to just Phases 1 and 2 of the practical training – ground briefing and basic IF flight, similar to what is done for a night rating – until they have passed the theory exam. Because the theory exam is intended to test practical understanding, this could be putting the cart before the horse, as students would likely be far better prepared for the exam if they had some practical IFR exposure first.

 

A RESTRICTED INSTRUMENT RATING?

The main reason private pilots want an instrument rating is so they can cross the often clouded escarpment and fly standard departures and cloud-break procedures to get in and out of airfields in IMC. They don’t want to fly precision approaches down to minimums. They therefore argue for a restricted instrument rating, similar to the UK’s Instrument Rating (Restricted), or IR(R).

Briefly, the IR(R) allows you to fly IFR in, above and below cloud, including instrument approaches. For takeoffs and landings, the runway visibility must be at least 1,800 metres, and the cloud base must be at least 600 ft for takeoff and 500 ft for instrument approaches. EASA is planning to implement its own version of the IR(R), which it calls the Basic Instrument Rating (BIR). It will be tailored for private single- and multi-engine rated pilots, who will likely be limited to higher approach minima.

Why can’t we follow suit in South Africa? According to one of SA’s DFEs and flight school owners, it again comes down to experience. He says that the UK and Europe have a massive depth of experienced instructors coming from the airlines and air force who are able to offer the requisite training and pass on their knowledge – something which South Africa lacks. Furthermore, he says that a restricted instrument rating in the hands of a private pilot who isn’t regularly flying IFR in IMC isn’t safe. This is, however, debatable, as other very senior instructors, who examine part-time pilots with instrument ratings, argue that the insights gained from having an instrument rating make pilots safer, in that they consciously adopt safe personal limits – typically, cloud bases not lower than 1,000 ft.


CONCLUSION

The problem of not being able to produce pilots with an instrument rating is real, and the concerns from both industry and the CAA are valid. The IR pass-rate debacle is the result of inexperience on both sides, poor engagement, and a culture amongst student pilots that exams are an inconvenience to be ticked off en route to getting stuck into flying. Students must learn to ABC – apply bum to chair. Working through a question database is helpful for checking knowledge, but does little to improve understanding, and an instrument rating must be taken seriously. Tackling such a broad range of subjects in one exam is, however, difficult.

Experience will take time to build, yet three years have already passed. In the meantime, the gap can be bridged by constructive engagement between industry and the CAA.

The CAA has said that a significant upgrade, and hopefully improvement, of the exam is in the pipeline, and that it hopes to release an AIC on the subject in the near future.
