
Granular benchmarking data revealed this digital banking onboarding hurdle and its solution

Robust marketing drove prospects to a digital checking account application, but a high abandonment rate perplexed the bank.

Apr 24, 2025 / Consumer Banking

A version of this article first appeared in the February BAI Executive Report: How banks best measure performance. Read more within to discover how data, especially peer-based benchmarking, increasingly helps a dynamic financial services industry understand the marketplace and act.

Growing new customer acquisition through digital checking accounts required the best combined efforts of a super-regional bank's product and digital channel teams, while internal marketing and its agency partners invested heavily in attracting target prospects.

The success of the acquisition campaign, and anything short of its goals, would be shared across these departments.

Early signs pointed up. The bank knew it had strong reach through targeted digital marketing and was tracking quality traffic to the new checking account application page. But too many prospects were bailing out. Why?

Good question. The teams had a hunch about where the most sensitive point along the application trail might be. But they faced a bigger question: Was it worth investing to find out more, or should they essentially give up?

Restructuring the application process was a big project and new budget ask. And the request faced potential pushback.

The campaign to attract prospects was effective, at least to a degree: it flooded the top of the application funnel. But this marketing blitz attracted future customers only to throw up a barrier soon after. Stoking consumer frustration could prove costly for the product's outlook and damaging to the bank's reputation.

Justifying the front-end marketing spending, as well as the new resources required for a process redesign, meant these strategists needed an iron-clad case. They could pitch reformulated questions and show how a smoother process might look. More importantly, they needed real evidence to quantify the projected improvement in new-account volume and the higher average balances that would come with those new openings. That's where the granularity and analysis that peer benchmarking delivers made a difference.

At a high level, we can look closer at what this bank’s benchmarking example revealed for its digital onboarding experience compared to its peers:

  • A high volume of quality traffic to the digital checking account application pages was logged.
  • Prospects were taking at least one more step: the start rate, or the ratio of application starts to page visits, was above average.
  • Start rates indicated the bank was attracting quality prospects interested in opening new checking accounts. Marketing was not the issue.
  • The abandonment rate, however, was 50% higher than peers', resulting in a substantially lower share of completed applications: the client bank lost 60 of every 100 started applications vs. 40 of 100 for peers.
  • Customers appeared to remain confident they were a good match for the product. The bank's approval rates on completed applications ran somewhat higher than peers'. To the benchmarking researchers, this was a sign that prospects who did not abandon the application were also of high quality.
  • The funding rate of approved applications was also higher than peers', as were the balances on new checking accounts.
  • The emerging pattern: applicants willing to navigate the barrier at a certain point in the process had increased odds of completing the application and fully funding their accounts.
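The funnel math behind these bullets can be sketched in a few lines. The only figures taken from the article are the abandonment rates (60 of 100 started applications for the bank vs. 40 of 100 for peers); the traffic volumes, approval rate, and funding rate below are hypothetical placeholders for illustration.

```python
# Illustrative digital-application funnel metrics. Abandonment figures
# (60% bank vs. 40% peers) come from the benchmarking example above;
# every other number here is an invented placeholder.

def funnel(visits, starts, abandons, approval_rate, funding_rate):
    """Compute key application-funnel metrics from raw counts."""
    completes = starts - abandons
    approved = completes * approval_rate
    funded = approved * funding_rate
    return {
        "start_rate": starts / visits,          # application starts per page visit
        "abandonment_rate": abandons / starts,  # share of started apps abandoned
        "completion_rate": completes / starts,  # share of started apps completed
        "funded_accounts": funded,              # fully funded new accounts
    }

# Bank: 60% abandonment (from the article); peers: 40%.
bank = funnel(visits=10_000, starts=5_000, abandons=3_000,
              approval_rate=0.80, funding_rate=0.75)
peer = funnel(visits=10_000, starts=5_000, abandons=2_000,
              approval_rate=0.75, funding_rate=0.70)

print(f"Bank abandonment: {bank['abandonment_rate']:.0%}")  # 60%
print(f"Peer abandonment: {peer['abandonment_rate']:.0%}")  # 40%
```

Note how the 60% vs. 40% gap compounds: even with the bank's higher approval and funding rates on completed applications, losing 20 more of every 100 starts leaves it with fewer funded accounts than a peer with weaker downstream conversion.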

The bank was, by select measures, matching or exceeding peers along the application funnel. So the product and digital teams had to ask again: Why are we losing so much applicant volume?

Supplied with benchmarking that gave a clear snapshot of its digital peers, and bolstered by the quality of the prospects trickling through, the bank focused on the account application process. Cross-referencing internal data, it identified two specific Know Your Customer (KYC) and credit-check access questions in the application where prospects went cold: there was a noticeable drop-off in data input on applications right at that spot. Some applicants paused for a considerable time but still pushed through; many abandoned altogether.

Importantly, benchmarking doesn't provide the entire solution. KYC and credit questions, for instance, are must-ask application points from a risk perspective, and these lines of inquiry would need to remain in the process. The fixable portion turned out to be a form design that made answering those questions too cumbersome and analog for applicants who were already thinking digital-first. After all, these future customers were on their way to signing up for digital checking. Still, given the quality of applicants and the ability to pinpoint the problem, the fresh resource allocation could be justified.

Using the benchmarking analysis, the team built a solid business case to support the application redesign. They could more accurately model the upside in both new checking accounts and new balances. The resulting growth and profits were substantially higher than the cost of the application redesign project, and the strength of the benchmarking analysis gave the team more confidence in the business case.
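The core arithmetic of such a business case is simple: if the redesign moved abandonment from the bank's 60% toward the peer benchmark of 40% (the only two figures below taken from the article), completed applications would rise by half. The monthly volume, approval rate, funding rate, and average balance used here are invented purely to illustrate the shape of the model.

```python
# Hypothetical business-case model for the redesign. Only the 60% and 40%
# abandonment rates come from the article; all other inputs are invented.

STARTS_PER_MONTH = 5_000   # hypothetical application starts per month
APPROVAL_RATE = 0.80       # hypothetical share of completed apps approved
FUNDING_RATE = 0.75        # hypothetical share of approved apps funded
AVG_NEW_BALANCE = 2_500    # hypothetical average new-account balance, dollars

def monthly_funded_balances(abandonment_rate):
    """Dollars of new funded balances per month at a given abandonment rate."""
    completed = STARTS_PER_MONTH * (1 - abandonment_rate)
    funded = completed * APPROVAL_RATE * FUNDING_RATE
    return funded * AVG_NEW_BALANCE

before = monthly_funded_balances(0.60)  # bank's current abandonment
after = monthly_funded_balances(0.40)   # peer-benchmark abandonment

print(f"Projected monthly balance lift: ${after - before:,.0f}")
```

Against a one-time redesign cost, a recurring monthly lift of this kind is what let the team show that projected growth would substantially exceed the project's expense.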

Think of benchmarking as a robust strategy tool for confirming (or refuting) assumptions and working hypotheses. It lets bank and credit union leaders start from a data-driven mindset and engage colleagues in committed feedback exchanges. These leaders must keep an open mind for results that confirm or challenge expectations in order to make smarter, fact-based decisions that typically lead to better business outcomes.

John Rountree is Head of Client Engagement at BAI.