Banks are vital hubs in an economy. They provide credit, underpin payments, facilitate trade and manage risk. That pivotal role is, of course, why they are highly regulated. It is also why regulators—although they have differing priorities—generally want institutions to be able to demonstrate that staff know what they are doing.
Getting staff to dust off old certificates and fax over a copy will not meet those regulatory requirements. It is also unlikely to help the bank itself gain much insight into its business. But finding out exactly what staff do know about what they should know—for example, how to ensure that a credit really does have adequate cover—is not easy. It becomes particularly tricky when you need to gather that information without either disrupting the business or alienating employees. Doing that even for just a few teams—say, relationship managers or credit officers—can create significant headaches both for compliance and for human resources. What to do?
A lot of employees in big firms go through skill assessments that are literal “tick-box” exercises. Such exercises rarely generate enthusiasm. Arguably, they also rarely generate useful data. Until recently, however, many banks made do. Post-crisis, they found themselves fire-fighting a whole range of interconnected legacy issues. The overall requirements were to cut costs, boost the return on equity, meet more onerous regulatory and compliance requirements and respond to increasing competition from start-ups. Down on the shop floor, that meant setting mandatory tests with pass/fail guillotines—much like school exams at 16 or 18.
Pass/fail sounds clear-cut, fair and objective, but it raises a number of thorny social and psychological challenges. At the most fundamental level, no one likes to be told that they do not know their job—particularly if they feel they have a lot of knowledge and experience. The sense of being judged—and potentially found wanting—can generate a lot of stress in a firm. That stress, together with the work that goes into preparing for an exam, distracts from day-to-day business. Then, there is the question of what, exactly, is being tested and what the test is designed to do. Is it there to bring everyone to a certain level? Should it support professional development? Is it fair, or sensible, to demand that everyone—whatever their role—should meet the same bar? And wouldn’t it make more sense to find out what they do not know about their jobs—so that the lack can be remedied? Applying a pass/fail test can be like fixing a wall by smoothing on some plaster. People learn to pass the test—to hit a certain level—but actual strengths and weaknesses, and information that can be vital to the functioning of the firm, remain hidden.
This is where the hard-pressed compliance officer and the human resources team might ask how else a firm can go about ensuring that staff are properly trained—and demonstrating that to the regulator. Time and resources are limited. One way is to use diagnostics: take a careful look at the actual holes in staff knowledge, then set about closing them.
“Diagnostics give people a chance to demonstrate what they know and understand. It is not about testing short-term retention of ‘facts’, but about examining deep knowledge—the sort of knowledge used in business,” says Mark Jones, director at Unicorn Training Group. Jones argues that, for the employees being examined, it is a bit like deciding whether to spend five hours cutting down a tree or one hour preparing and sharpening the axe. If they are diligent about taking the diagnostic test, they can show their knowledge and find the gaps, which can save a lot of time and effort on unnecessary revision.
Credit where it’s due
One of the most important roles in any commercial bank is that of loan officer. Loan officers are gatekeepers. Post-crisis, many commercial banks found that those gatekeepers had not always done an optimal job. Loan books had large numbers of credits that were non-performing, even when seen in the flattering light of a benign credit environment.
To do the job properly, a credit officer needs two main skills. The first is to be able to analyse the potential borrower: to know what the numbers actually say and whether they can be relied on. The second is to be able to assess the capabilities of the management and the realistic prospects of the business. Because being a loan officer is both a numbers job and a “people job”, assessing and fixing poor performance can be hard—something that also applies in other financial-services roles.
“People may not want to admit that they do not understand complex material. They need to be encouraged in a way that will speak to them as individuals,” says Jones. What helps firms, Jones adds, is that—when it comes to professional qualifications—people are usually motivated to fill any knowledge gaps.
“The difficulty is in making complex and challenging material engaging and immersive,” Jones explains. “Too many companies undertaking staff training try to ‘sheep dip’ and do not have respect for differences in learning styles and motivation, nor do they consider differences in the complexity and volatility of subject matter.”
A further difficulty is that, in the work environment, the psychological and emotional stakes are particularly high. Staff told that their skills and knowledge are going to be assessed with a view to improvement can react in a number of ways—many of which will be defensive. Older staff, in particular, are unlikely to welcome the process. Firms also have entrenched cultures that staff will protect—often unconsciously and very vigorously—against change.
Individual fear and firm-wide resistance to cultural change can sink a whole training project. Staff in hard-pressed institutions—who know cost cuts are necessary and that outside competition is increasing—are unlikely to believe that skill tests are anything other than a prelude to redundancies. People under such pressure can put a lot of energy into demonstrating why the whole exercise is flawed and should be abandoned.
Senior managers can help by pointing out that, if getting rid of those staff were the goal, they would hardly spend money on identifying individual training needs and addressing them. Town hall meetings, at which staff can ask questions and hear the rationale behind the assessment process, the support on offer and examples of what to expect, can do a great deal to ease anxiety. They can show that a diagnostic test, done right, is a benefit to staff.
What do you do with the diagnosis?
It is relatively easy to find gaps in technical knowledge, but how do you fix them? Most people who go through a diagnostic find they enjoy developing a more holistic understanding of how their firm really works—and seeing how they help to make it thrive. Just as importantly, when they can see that their performance is being benchmarked against international standards, they gain in confidence. Their institution might not have fared well—and might have some problems to overcome—but things can be put right. This works best when the tests are easy for staff to take and for examiners to mark, while also providing real insight.
“The diagnostics are designed to be taken online,” says Jones. “To avoid a ‘multiple guess’ outcome, the incorrect answers—the ‘distracters’—are very carefully worded to really test people’s knowledge.”
That means that, when it comes to a role like credit officer, candidates sit a timed multiple-choice test with questions that examine, say, whether they can calculate a ratio or distinguish between loan products. This is not a memory test; it looks to examine their awareness and understanding of formulae and how they are applied. It addresses the fact that the route to professionalism in banking has changed over the past few decades. There is no longer a core qualification—such as the ACIB (Associate of the Chartered Institute of Bankers)—that all credit officers must have.
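By way of illustration only (this is not LIBF test content, and the figures are hypothetical), the sort of ratio such a question might ask a candidate to calculate can be sketched in a few lines of Python: a borrower's current ratio, which measures liquidity, and debt-to-equity, which measures leverage.

```python
# Illustrative sketch: simple balance-sheet ratios of the kind a
# credit diagnostic might ask a candidate to work out by hand.
# All figures are hypothetical.

def current_ratio(current_assets: float, current_liabilities: float) -> float:
    """Liquidity: can the borrower cover its short-term obligations?"""
    return current_assets / current_liabilities

def debt_to_equity(total_debt: float, total_equity: float) -> float:
    """Leverage: how reliant is the borrower on borrowed funds?"""
    return total_debt / total_equity

# A hypothetical borrower's balance sheet (GBP thousands)
cr = current_ratio(current_assets=1_200, current_liabilities=800)
de = debt_to_equity(total_debt=2_500, total_equity=2_000)

print(f"Current ratio:  {cr:.2f}")   # 1.50
print(f"Debt to equity: {de:.2f}")   # 1.25
```

The point, as the diagnostic format suggests, is not recalling the formula by rote but knowing which ratio answers which question about the borrower.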
The aspects of the credit-officer role that demand strong interpersonal skills are harder to pin down, but mini case studies can look at applied knowledge. Here, again, the format is multiple choice or short answer so that any holes can be clearly identified. The final part of the assessment is a full-blown case study. This is closely related to the sort of analysis that credit officers would carry out in their day-to-day work at a bank. However, it is as much a learning tool as a test or a diagnostic. The participants are given a clear structure on which to base their work.
At the end of the process, the participant bankers should be able to see where they are already adding value to their businesses and where they could usefully learn more—whether that is in technical know-how or in interpersonal skills. For example, should they turn down a loan request because an EBITDA (earnings before interest, taxes, depreciation, and amortization) number does not sit right with their models, or give the business some support because it meets important sustainability goals?
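To make that EBITDA judgement concrete, here is a minimal, hypothetical Python sketch of an interest-cover check of the kind a credit officer's model might run. The 3.0x threshold and all figures are illustrative assumptions, not rules from any bank's credit policy; as the example above notes, a mechanical threshold would in practice be weighed against qualitative factors such as sustainability goals.

```python
# Hypothetical sketch: does EBITDA give adequate cover for the interest bill?
# The 3.0x threshold and all figures are illustrative; real policies vary.

def ebitda(operating_profit: float, depreciation: float, amortisation: float) -> float:
    """EBITDA reconstructed from income-statement lines."""
    return operating_profit + depreciation + amortisation

def interest_cover(ebitda_value: float, interest_expense: float) -> float:
    """How many times over can earnings pay the interest bill?"""
    return ebitda_value / interest_expense

# A hypothetical loan applicant (GBP thousands)
e = ebitda(operating_profit=900, depreciation=150, amortisation=50)
cover = interest_cover(e, interest_expense=400)

MIN_COVER = 3.0  # illustrative policy threshold
decision = "approve" if cover >= MIN_COVER else "refer for review"
print(f"EBITDA: {e:,.0f}; interest cover: {cover:.2f}x -> {decision}")
```

Here cover works out at 2.75x, below the illustrative threshold, so the case is referred rather than approved outright, which is exactly the kind of borderline judgement the case-study stage is designed to probe.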
The London Institute of Banking & Finance (LIBF) has nearly 140 years’ experience of setting and examining professional standards. Founded by bank workers themselves in 1879, it understands that what is needed is a developmental measure, not a pass/fail judgement. It carefully benchmarks its assessment programmes against professional banking qualifications. The people who take its assessments should know that the results will be objective and give them a clear view of how their skills measure up to those of other professionals in their field. People can take pride in performing well in the assessment—and be encouraged to try to do better if they see there are gaps. It is designed to help improve overall professional knowledge and skills—not to be an excuse for a cull.