How and why to A/B test in digitized mortgage lending.
In any automated process, the ability to adapt and change is critical. An automated system that cannot respond to shifting consumer behaviours is a liability: if your business is not continually looking for ways to improve, it will stagnate, leading to declining customer experiences and marketing campaigns that quickly become outdated.
Instead, the key to any great system is constant optimisation. In the banking and lending industry, this means automated lending software must be constantly tested and improved to ensure captivating customer experiences, high-performing conversion rates, and consistently fair and accurate decisions for the long haul. How can we make changes that we know will add value? One of the best and most versatile ways of doing this is through A/B testing (alternatively known as split testing).
What is A/B testing in digitized mortgage lending?
A/B testing is a valuable tool for improving the efficiency of digital systems. It works by comparing alternative versions of a system or process, usually changing one small thing at a time, to determine which performs better. Even a minor alteration can have a pronounced impact on performance, so you miss out on significant gains if you are not testing alternatives. Tech giants such as Google routinely A/B test to optimise all aspects of their products.
So what is A/B testing, and how does it apply in digital mortgage lending? To use a simple example, think about the colour scheme on an online loan application form. Suppose you are deciding between two colours for your “submit application” button, the essential last step for converting a potential customer into a paying one. Which colour is better, green or red? We can use A/B testing to systematically determine which is more effective. Two versions of the online application form are created, A (green) and B (red), and the results are analysed. The two options are tested simultaneously with customers, without any other changes to the page – visitor one sees option A, visitor two is shown B, visitor three gets shown A, and so on.
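The alternating assignment described above can be sketched in a few lines. This is a minimal illustration rather than a production implementation, and the visitor IDs and variant names are assumptions for the example.

```python
# Minimal sketch of alternating A/B assignment for an application-form variant.
VARIANTS = ("A", "B")  # A = green button, B = red button

def assign_variant(visitor_id: int) -> str:
    """Bucket visitors deterministically: 1 -> A, 2 -> B, 3 -> A, and so on."""
    return VARIANTS[(visitor_id - 1) % 2]

# No other changes are made to the page; only the button colour differs.
for visitor in (1, 2, 3, 4):
    print(f"visitor {visitor} sees option {assign_variant(visitor)}")
```

In practice the assignment is usually random rather than strictly alternating, but the key property is the same: each visitor is consistently shown exactly one variant.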
Over time, we may find that users are less likely to complete an online bank application with a red colour scheme. It could be that red tends to be associated with boldness, which is not a feeling we want to evoke when trying to build trust. Trust is essential for lenders who require customers to share sensitive financial information. Thus we may see lower conversion rates for option B over time as a result.
The above is just a simple example but illustrates how to test different hypotheses to make automated lending systems more rigorous. A/B testing in this way provides the building blocks to create a truly robust and well-tested system. Banks can test if text reminders impact half-completed loan applications, how different parameter weightings affect outcomes, or evaluate the ease of onboarding processes. If one application process has an 80% successful completion rate, while the other has a rate of 36%, what can this tell us?
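To make the 80% versus 36% comparison concrete, a simple two-proportion z-test can show whether such a gap is statistically meaningful or could plausibly be noise. The applicant counts below are hypothetical, chosen only to match the completion rates in the example.

```python
from math import sqrt

# Hypothetical applicant counts behind the completion rates mentioned above.
completed_a, total_a = 400, 500   # process A: 80% completion
completed_b, total_b = 180, 500   # process B: 36% completion

def two_proportion_z(x1: int, n1: int, x2: int, n2: int) -> float:
    """Two-proportion z-statistic: how unlikely is this gap under 'no difference'?"""
    p1, p2 = x1 / n1, x2 / n2
    p_pool = (x1 + x2) / (n1 + n2)              # pooled completion rate
    se = sqrt(p_pool * (1 - p_pool) * (1 / n1 + 1 / n2))
    return (p1 - p2) / se

z = two_proportion_z(completed_a, total_a, completed_b, total_b)
print(f"z = {z:.2f}")  # |z| > 1.96 means significant at the 5% level
```

With a gap this large the z-statistic is far above 1.96, so the difference is very unlikely to be chance; with smaller gaps, this kind of test is what tells you whether to trust the result.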
A/B testing in digitized mortgage lending for credit scoring.
In credit scoring, A/B testing can enhance digital onboarding processes and improve a scoring model’s overall performance in determining which loans should be approved and which rejected. For example, we may want to find the ideal credit score cut-off point for loan applications. We can use A/B testing to assess the impact of different parameters and make an informed decision. We can also use the results to ensure our alternative credit scoring model strikes a balance between expected approval and rejection rates compared to traditional credit scoring methods.
In practice, A/B testing of credit models means each customer is randomly assigned a scoring method, where each method differs by one small parameter. We can then critically analyse the results to answer questions such as: which method gives the fairest result, and how do the results compare against each other? Over time, we come to understand the impact of varying parameters and can decide which model is more accurate based on default rates and comparisons against traditional scoring methods.
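As an illustrative sketch, randomly assigning applicants to two scoring configurations that differ by a single parameter (here, the score cut-off) might look like the following. All model names, cut-offs and scores are assumptions for the example, not real lending parameters.

```python
import random

# Two scoring configurations that differ by exactly one parameter.
MODELS = {
    "A": {"cutoff": 620},
    "B": {"cutoff": 640},
}

def assign_model(rng: random.Random) -> str:
    """Randomly assign an applicant to one of the competing configurations."""
    return rng.choice(list(MODELS))

def decide(score: int, model: str) -> str:
    """Approve or reject based on the assigned model's cut-off."""
    return "approve" if score >= MODELS[model]["cutoff"] else "reject"

rng = random.Random(42)  # fixed seed so the experiment is reproducible
results = {"A": [], "B": []}
for score in (590, 615, 625, 635, 650, 700):
    model = assign_model(rng)
    results[model].append(decide(score, model))
# Over many applicants, compare approval rates (and later default rates)
# between the two groups to judge which cut-off performs better.
```

The point of the single-parameter difference is attribution: if group B defaults less, the cut-off change is the only candidate explanation.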
With the proper infrastructure, it is easy for businesses to adapt and improve decision-making processes. It’s not limited to credit scoring either – A/B testing has enormous potential within mortgage lending, automated lending systems and most areas of digital banking. Even one small change can significantly improve results and go a long way towards optimising processes and systems.
Smarter technology in digitized mortgage lending means smarter decisions.
We’re excited to embrace A/B testing in our automated lending solutions and have added new A/B testing features to our digitized lending systems. We know this will provide a useful and powerful tool for our clients to make processes more accurate and lending decisions easier for both individuals and financial institutions.
We’ve also recently added campaign support to our digital lending system so that banks can tie an online application to a loan. We do this by collecting data during the loan origination stage and then tracking these metrics from start to end, helping banks and other lenders determine how successful their campaigns are and which channels matter most.
Each of these features, A/B testing and campaign support for mortgages, can go a long way towards ensuring your automated lending systems perform at their best. To learn more about how A/B testing can help your business, or to book a demo, get in touch.
A/B testing is a powerful tool to test parameters in a system by changing one parameter at a time. It is a great way to ensure robust and accurate processes and prevent technology from stagnating. Banks and lenders can use it in the mortgage lending industry to systematically test different scoring methods.
Get insights into the latest developments in a fast-evolving industry with Näktergal’s newsletter.