Most financial institutions say they believe in data-driven marketing, but very few are actually testing their website content in meaningful ways. In this episode, Meredith Olmstead and Kristin Mock break down a real A/B test on a home equity loan page and share what the data reveals about rates, benefits, user behavior, and smarter content placement.
Key Takeaways:
Test one specific variable at a time. True A/B testing requires precision. If you change multiple elements at once, you cannot identify what influenced performance. In this case, the only variable tested was whether benefits were placed above or below rates. That clarity makes the insights actionable.
Rates drive scroll behavior more than we think. The data confirmed that users consistently scroll until they reach the rates module, regardless of its position on the page. Drop-off happens immediately after rates. This reinforces that rate visibility is critical, but it also opens the door to strategically placing important messaging just above it.
Assumptions are not strategy. Data is. Lead capture forms, content hierarchy, and product positioning often face internal pushback. But when tested properly, the data often disproves common fears. For example, short lead capture forms did not reduce completed applications. Testing replaces debate with clarity.
Transcription:
Meredith Olmstead:
Hi there. I'm Meredith Olmstead, CEO and founder of FI GROW Solutions. We're a digital marketing and sales consulting agency. We work exclusively with banks and credit unions, and I'm here with Kristin Mock. She is one of our senior inbound marketing strategists. Say hi, Kristin.
Kristin Mock:
Hello.
Meredith Olmstead:
Kristin and I were just talking about how to structure your website content based on data. It's something we talk a lot about, because so many institutions really just pay lip service to data-driven marketing. So we were talking about structuring website content, and I said, let's hit record and talk about this, because I think it's a really nice little nugget of information we can share with our audience, with some really meaningful takeaways.
Kristin has been running a really cool A/B test with a client for about two and a half months now; she started late last year, and we're at the end of January. The client is a fairly large credit union, not huge, but a little over $500 million in assets, out on the West Coast. So I wanted to give you an opportunity to talk about this A/B test, which is structured around where you place a certain kind of content on the page. You've been testing where it sits and how that impacts the user experience, time on page, and what people are doing. So let's talk about what you're doing and what you're finding.
Kristin Mock:
Sure. So I like that you used the word specifically, because when you're A/B testing anything, you want it to be very specific. You want to test exactly one very small change at a time; otherwise you don't know what impacted your results. So right now, the test that we've been running for about two and a half months is the placement of the benefits section on a page.
So we're only looking at home equity loans right now, and we're testing whether the benefits section, the advantages this specific credit union offers versus its competitors, goes above or below the rates. The rates are featured on the product page, which we highly recommend for all financial institutions. So the only question is whether those benefits are higher or lower on the page than the rates. That's all we're testing right now.
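For context on the mechanics of a split like this, here's a minimal sketch of how a deterministic 50/50 variant assignment could work. This is an assumption for illustration only; a platform like HubSpot normally handles variant assignment for you, and the `assignVariant` helper is hypothetical.

```typescript
// Minimal sketch of a deterministic 50/50 split (illustrative only;
// HubSpot's built-in A/B tooling would normally handle this).
// Hashing a stable visitor ID means the same visitor always sees the
// same variant across visits.
function assignVariant(visitorId: string): "A" | "B" {
  let hash = 0;
  for (const ch of visitorId) {
    hash = (hash * 31 + ch.charCodeAt(0)) | 0; // simple string hash
  }
  // Use a higher bit rather than raw parity for a slightly better mix.
  // Variant A: benefits above rates; variant B: benefits below rates.
  return ((hash >>> 8) & 1) === 0 ? "A" : "B";
}
```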
Meredith Olmstead:
Okay. And so do you test to see if people are clicking on those? If they view them? How do you know if people are seeing those benefits?
Kristin Mock:
So there are a number of tools. Depending on where your website is hosted, you can get different types of analytics. HubSpot specifically doesn't automatically show you how far down people scroll, but if you use HubSpot CTAs, you can see how many people viewed each CTA and how many people clicked each CTA.
So on both version A and version B of our page, we have separate CTAs created. They all look the same, they all say "Apply Now," and they all go to the same place. So to the user, they're doing the same thing, but the data we're collecting shows exactly how far down the page everybody scrolls and exactly which Apply Now CTA they're clicking. So even for the people who don't apply, we know how far down they went. We know what they were looking for.
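For teams not on HubSpot, here's one way the same view-versus-click tracking could be approximated in the browser. This is a hedged sketch, not HubSpot's implementation: the `data-cta-id` attribute, the `/analytics` endpoint, and the `trackEvent` helper are all assumptions made for illustration.

```typescript
// Stand-in for per-CTA view/click counts: fire a "view" event the first
// time each Apply Now button scrolls into the viewport, and a "click"
// event when it's clicked.
function trackEvent(name: string, ctaId: string): void {
  // Replace with your own analytics call; sendBeacon survives page exits.
  navigator.sendBeacon("/analytics", JSON.stringify({ name, ctaId }));
}

const seen = new Set<string>();
const observer = new IntersectionObserver((entries) => {
  for (const entry of entries) {
    const id = (entry.target as HTMLElement).dataset.ctaId;
    if (entry.isIntersecting && id && !seen.has(id)) {
      seen.add(id);               // count each CTA view once per visit
      trackEvent("cta_view", id); // "views" = how far down users scrolled
    }
  }
});

document.querySelectorAll<HTMLElement>("[data-cta-id]").forEach((cta) => {
  observer.observe(cta);
  cta.addEventListener("click", () => trackEvent("cta_click", cta.dataset.ctaId!));
});
```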
Meredith Olmstead:
So you can see the views of that call to action even if they didn't actually click on that call to action?
Kristin Mock:
Yes.
Meredith Olmstead:
Okay, cool. So you're basically seeing how many people scroll down and saw a button and then potentially clicked on it. Okay. And so then the rates are on both pages, but one basically has a rate and then you scroll down to the benefits and one has the benefits and then you scroll down to the rates. Is that the way it's going?
Kristin Mock:
Right.
Meredith Olmstead:
Okay, cool. What are you finding out?
Kristin Mock:
So we are finding that as far as people clicking to apply, it doesn't make a difference. We're not seeing a drop-off in the number of people applying. But what we are finding is that views always drop off, and I say always because in this test, for two and a half months on this specific page, they have always dropped off immediately after the rates module. So we're testing the benefits module, but the conclusions we're drawing have to do with the rates module. If you've got a hundred people who scroll down to the rates module, maybe 30 of them scroll to the next one. And no matter where that module is on the page, whether it's the second module or the eighth thing on the page, people will scroll to the rates, because that's what they're there to find.
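To make the drop-off math concrete: the drop-off between two modules is just the share of viewers lost from one to the next. A quick sketch with invented numbers (not the client's actual data) that mirrors the roughly 70% drop described above:

```typescript
// Hypothetical per-module CTA view counts from one page variant,
// ordered top to bottom. Names and numbers are illustrative.
const views: Array<[string, number]> = [
  ["hero", 1000],
  ["benefits", 620],
  ["rates", 450],
  ["calculator", 135], // the steep drop comes right after the rates module
];

for (let i = 1; i < views.length; i++) {
  const [name, count] = views[i];
  const prev = views[i - 1][1];
  const dropOff = ((prev - count) / prev) * 100;
  console.log(`${name}: ${count} views, ${dropOff.toFixed(1)}% drop-off from previous`);
}
```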
Meredith Olmstead:
Interesting.
Kristin Mock:
And it's almost confirmation bias, because we build websites and we recommend putting rates on each product page, typically first or second, because we say people are rate shoppers; they're coming to this product page to see the rates. We know this behavior. But actually testing it and seeing the data raises the question: how far down the page can we put the rates and still get people to scroll that far before they give up? Can I put something really important above the rates module and make sure everybody sees it, or do I need to put that rates module first with the user experience in mind?
Meredith Olmstead:
Gotcha. So what you're seeing is that you can slip something else in there. You can put another module, another piece of content or two, above the rates table and still see the same results. People will still scroll down, find the rate information they were looking for in the first place, maybe pick up a little more information along their buying journey, and click that apply button, and that extra content might even help seal the deal on the decision. Okay, cool. That's very interesting. So rates are key in this journey, and you can definitely keep people moving down a page with that carrot. Did you figure out how far you could get them down the page, or are you still testing?
Kristin Mock:
Right now, rates are, I think, the fourth thing on the page, and that's counting the very top image and H1, that hero module up there. I think it would be a pretty risky test to keep pushing it down; I'd be worried about losing too many people. I do think higher on the page is better for the user experience, but yeah, fourth is probably the riskiest I'll get.
Meredith Olmstead:
So implementing your findings from tests like this is great, because then you can probably make some of those generalizations across other pages. If you're running a big special in a certain product category, you could definitely move rates down a spot or two and slot a special promotion above them for a quarter or two, if you really wanted to promote a certain kind of option or a new version of a product above the rate table.
What other kinds of A/B tests have we run? Obviously A/B testing is really important, and we're trying to do more and more of it with our clients. I know we've tested lead capture forms before applications. We get lots of pushback on those, most often from lending teams who say nobody's going to want to fill out a lead capture form before they get to the application, and who will push back forever and want us to take them down. Sometimes they win that argument and we do take them down. But what have we found on those split tests, with a lead capture form versus without one? What do we see?
Kristin Mock:
We don't see any drop-off. If you've got a first name, last name, and email address lead capture form versus getting them straight into the application, we don't see any significant drop-off from people going, ugh, that's too much information to fill out before I even get to the application. They're about to give you a lot of very personal financial information, their Social Security number, the exact vehicle they want, whatever the loan requires, so asking for a few things up front has not impacted that at all.
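When judging whether a difference like this is "significant," one common yardstick is a two-proportion z-test. The sketch below uses invented counts and is not the analysis the team ran; it just shows one way to check whether two completion rates differ beyond noise.

```typescript
// Two-proportion z-test: did adding a short lead capture form change
// the application completion rate? All counts below are invented.
function twoProportionZ(conv1: number, n1: number, conv2: number, n2: number): number {
  const p1 = conv1 / n1;
  const p2 = conv2 / n2;
  const pooled = (conv1 + conv2) / (n1 + n2);                       // pooled rate
  const se = Math.sqrt(pooled * (1 - pooled) * (1 / n1 + 1 / n2));  // standard error
  return (p1 - p2) / se;
}

// Variant A (no form): 180 completed of 1,200 starts.
// Variant B (short form first): 172 completed of 1,190 starts.
const z = twoProportionZ(180, 1200, 172, 1190);
console.log(`z = ${z.toFixed(2)}`); // |z| < 1.96 → no significant difference at 95%
```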
Meredith Olmstead:
Yeah, okay.
Kristin Mock:
But to your point, A/B testing is super important, and I think we should be doing it constantly. Everybody should be testing different things at different times. Like I said, one specific thing at a time, but always testing something. So test where rates are and see whether having them at the top is a better user experience for you, or whether you can push them down and put something higher priority above them. Test lead capture forms and see if you can capture a few leads who were going to abandon your application anyway; now you've got a lead on top of it, and you're not going to lose any of those folks just by having that form.
Meredith Olmstead:
Yeah. The other thing we really want to start testing a little more is how we position products: focusing on the product versus focusing on the problem you're solving for somebody. So instead of, okay, do you want a personal loan? it's, hey, do you want to get out of debt? Instead of, do you want a high-yield savings account? it's, do you want to save for your dream vacation or your dream retirement? It's thinking a little outside the box about how you position your products and services, and those small statement changes could be a really nice thing to A/B test on a page.
Awesome. Well, thank you so much, Kristin. I love this A/B test. It was a really great thing to test for one client, and now being able to take those findings and implement them across all of our clients has been great.
Hopefully you all got some great takeaways from this. If you're interested in learning more about marketing for your bank or credit union, please come visit us at figrow.com. We have lots of other great podcasts, blogs, and case studies, and we'd love to get in touch with you. So come follow us there, and otherwise, let's all get out there and make it happen.