Thursday, October 4, 2012

Deconstructed Check Processing?



I just got back from an excellent Remote Deposit Capture Summit organized by RDC.com. It is fast shaping up to be the place to gain knowledge and network with key players in the check processing industry. Not so long ago, there were several events a year devoted to, or with a major emphasis on, check processing. Most have fallen by the wayside- perhaps driven by the assumption that check processing is mature and uninteresting. That this is far from true was attested to by many at the Summit, who presented business and consumer adoption figures showing a vast, yet untapped, market.

Whether adoption has been glacial or spectacular depends on perspective. Celent’s Bob Meara had done a study a while back showing that check imaging has had a faster uptake than most other technology introductions in the banking space. Yet, the impatience to drive distributed capture into every office, shop and home is understandable. There was continued debate at the Summit between risk perception and the need to drive customer convenience. It is my take that while the risk folks ruled the roost in the early years, the convenience advocates are beginning to scale the ramparts. 

Yes, mobile RDC is cool and growing, and check scanners keep getting smaller and more accurate, but what caught my eye at the Summit was a possible shift in industry tectonics. Remember the early days of distributed capture? The focus back then was on branch back counter and teller capture. The technology players in the marketplace were those with a pedigree in traditional check processing- the people who could make documents fly down the track on a 3890 high speed sorter. You had to earn your spurs in centralized, sorter-based processing to get a seat at the distributed capture table.

The push to capture checks from outside a financial institution’s infrastructure through merchant capture, and the slow evolution of the X9 and image quality standards, began the break from the “sorter-on-a-rope” paradigm, as an industry wag uncharitably put it. New players entered the field, and some leading names disappeared from the industry. Fast forward (yes, an obsolete term in the MP3 era) to 2012, and we have technology suppliers specializing in particular channels of capture. Clearing and settlement? Oh well, that’s something that someone does with the X9 file we give them. Now, toss in the downloadable app, and we are perhaps set for more deconstruction of the check processing chain.

I am tempted to draw a parallel with the evolution of card processing. During the paper draft days, there were “paper factories” processing card transactions. Electronic Draft Capture (EDC), and the introduction of specialized terminals by Verifone, Hypercom and others, changed the nature of the industry. Today, there are those that specialize in the acquiring front end and others who drive scale through the switch. Substitute “capture” for “acquire” and “clear” for “switch” and you have a template for where check processing may go.

I know past is not always prologue, and checks are very different from cards. Draw the picture out nevertheless and contemplate the shape of the check processing industry in five years. Fascinating, right?

Tuesday, March 20, 2012

Analytics- The Ignition Key

The challenge for those attempting to launch new payment alternatives has always been the need for an "ignition strategy". Simply put, it means making sure that there is a critical mass of payers and payees at the same time for the new payment vehicle. Attempts at achieving a payments "big bang" have often fizzled due to a basic circular conundrum. Users are reluctant to embrace a payment method, unless they are sure there is a critical mass of entities that will accept it. At the other end of the telescope, potential acceptors need to be convinced that the new alternative will have a large number of users. Many ideas have fallen between this chicken and egg.

Financial institutions are best positioned to break this deadlock. They have access to enormous amounts of data on the purchasing habits of their customers. Nevertheless, institutions have been slow, if not reluctant, to convert that data into useful insight through Analytics. The barrier is not technological. Powerful models exist that can sift through vast amounts of information to predict behavior, while keeping false positives to a minimum. Technology is available to bridge data silos through transaction hubs. The impediment to greater use of Analytics by financial institutions is driven, for the most part, by concerns about consumer privacy.

While it is important to safeguard the trust equation that consumers have with banks, there is low-hanging fruit that can be picked. Take P2P (person-to-person or peer-to-peer) payments, for example. PayPal has been the runaway leader in this category, but banks are in a position to ignite a P2P revolution. Most P2P payments replace cash or check transactions- paying friends, children, relatives, gardeners, babysitters and the like. Many of these are recurring payments that take place on or about the same date for similar amounts. In many cases, payers use their online bill pay systems to schedule payments, which are sent as paper checks by financial institutions because the receiving entities are not bona fide billers. In other cases, payers write checks and put them in the mail. In both cases, financial institutions bear the cost of paper handling. Checks also get lost in the mail, leaving the payer scrambling to get the money to someone in urgent need- I can attest to the latter, having had it happen repeatedly through "automated" bill pay.

It is possible to reduce cost and improve customer retention or "stickiness" with a little bit of analytical deftness. Technology exists to recognize payee names from check images. It is not difficult to analyze consumer payment histories to identify recurring payments to individuals. Similarly, an examination of online bill pay behavior can identify those consumers who are using checks to bridge the "last mile". These are consumers who may be receptive to P2P. Once target consumers are identified, financial institutions need to put incentive marketing in place to get them to enroll their payees in the P2P program. The payees are likely to react positively to overtures from someone they know. Handled properly, the targeting and enrollment of both payers and payees can be effected without raising privacy concerns.
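To make the idea concrete, here is a minimal sketch of the kind of recurring-payment screen described above. The field names, thresholds and sample data are my own illustrative assumptions, not any institution's actual data model or rules:

```python
# Hypothetical sketch: flag recurring person-to-person payments in a consumer's
# payment history. Field names, thresholds and sample data are illustrative
# assumptions, not any institution's actual data model or rules.
from collections import defaultdict
from datetime import date
from statistics import mean, pstdev

payments = [
    {"payee": "J. Smith (babysitter)", "amount": 120.00, "date": date(2012, 1, 14)},
    {"payee": "J. Smith (babysitter)", "amount": 120.00, "date": date(2012, 2, 15)},
    {"payee": "J. Smith (babysitter)", "amount": 125.00, "date": date(2012, 3, 14)},
    {"payee": "Acme Utility Co.",      "amount": 83.10,  "date": date(2012, 2, 1)},
]

def recurring_payees(history, min_count=3, amount_tolerance=0.15):
    """Return payees paid repeatedly, for similar amounts, on a roughly monthly cadence."""
    by_payee = defaultdict(list)
    for p in history:
        by_payee[p["payee"]].append(p)

    candidates = []
    for payee, items in by_payee.items():
        if len(items) < min_count:
            continue
        amounts = [p["amount"] for p in items]
        # Similar amounts: spread is small relative to the average payment.
        if pstdev(amounts) / mean(amounts) > amount_tolerance:
            continue
        items.sort(key=lambda p: p["date"])
        gaps = [(later["date"] - earlier["date"]).days
                for earlier, later in zip(items, items[1:])]
        # Roughly monthly cadence: every gap within a week of 30 days.
        if all(abs(gap - 30) <= 7 for gap in gaps):
            candidates.append(payee)
    return candidates

print(recurring_payees(payments))  # -> ['J. Smith (babysitter)']
```

In practice, the same grouping would be fed by payee names recognized from check images and by online bill pay records, and the output would drive the incentive marketing campaign rather than a print statement.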

This is but one simple example of using analytics as an ignition key to start emerging payments engines. There is more gold in them thar data mountains. What it requires, though, is a paradigm shift by financial institutions.

Thursday, October 20, 2011

Through Thick and Thin- Branch and Teller Deposit Automation Redux?

Just when you thought branch and teller deposit automation was done and dusted, an interesting new development is slowly making waves. Early forms of both types of deposit automation were thick client applications that resided on PCs at the branch. In previous posts, I have touched upon the headed and headless versions of teller deposit automation. And there the technology sat until merchant capture hit the scene (I say capture deliberately, because that is what most of them were- remote capture with the rest of the work-flow centralized upstream).

The question was asked, "If a merchant can capture and send relatively large deposits using thin client technology, why can't the same thing be done from the branch?" Why not, indeed? The thin client approach minimizes the challenge of initially deploying, and then updating, applications across a branch network. Do it once centrally, and the entire network is current.

Not so fast, say some. What about the network latency in performing the entire capture-correct-balance continuum using thin clients? Will that not slow things down, and affect branch efficiency? After all, a merchant does not have to worry about people waiting in queue to make the next deposit. What if connectivity goes down? What are the offline recovery alternatives? How do you deal with exceptions- missing checks and data?

While the battle is being fought, here are some perspectives on the issue. I think the cases for automating deposits at the teller and at the branch back counter are driven by different imperatives. As discussed in previous posts, teller deposit automation (TDA) is increasingly getting bundled with teller systems. Teller systems are headed in a thin client direction. Therefore, I think it will be difficult for thick client TDA systems to survive when the mother ship is headed elsewhere. Deposits made at teller stations are typically small, and it is likely that network latency will not significantly affect queue length at the teller. This, of course, makes sense for brand new teller deployments. Keep in mind, nevertheless, that there are many thick client teller desktops out there that can be retrofitted with traditional thick client TDA.

The branch back counter is a different kettle of fish. As observed previously, the landscape ranges from very large institutions that capture at the branch and perform data perfection and balancing centrally, to smaller banks and credit unions that perform the entire work-flow at the branch level. The deposits dealt with at the branch back counter tend to be larger, and therein lie the seeds of an interesting trade-off. If the paradigm is to only capture at the branch, and defer correction and balancing to centralized operations, a thin client front end for capture may make sense. If, on the other hand, the entire process of item correction, amount entry and balancing is to be done at the branch level, network latency and offline recovery considerations may be an issue in a thin client environment. We may perhaps see a hybrid work-flow evolve, a la merchant capture, where items are captured using thin clients, transactions are nominally balanced against a control total from a teller tape, and the work is sent to central operations for downstream processing. Yes, this will add work at the branch, but it may reduce workload centrally- particularly in smaller institutions that do not have complex balancing rules.
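For what it is worth, here is a small sketch of the nominal balancing step in that hybrid work-flow: the thin client sums the captured amounts and compares them to the control total from the teller tape before releasing the work. The function and field names are hypothetical:

```python
# Illustrative sketch of the nominal balancing step in the hybrid work-flow:
# sum the amounts captured by the thin client and compare against the control
# total from the teller tape before releasing the work downstream.
# Function and field names are hypothetical.

def nominally_balanced(captured_items, teller_tape_total, tolerance=0.005):
    """Compare the sum of captured item amounts to the teller tape control total."""
    captured_total = round(sum(item["amount"] for item in captured_items), 2)
    return abs(captured_total - teller_tape_total) <= tolerance, captured_total

items = [{"amount": 250.00}, {"amount": 74.35}, {"amount": 1200.00}]
balanced, total = nominally_balanced(items, teller_tape_total=1524.35)
if balanced:
    print(f"Deposit of {total:.2f} balances; release to central operations")
else:
    print(f"Out of balance by {total - 1524.35:.2f}; hold at the branch for correction")
```

A production system would of course use exact decimal arithmetic and institution-specific tolerances; the point is simply that the branch only confirms the control total, while correction and full balancing stay centralized.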

Interesting how technologies considered ready for benign neglect have a habit of asserting themselves when you least expect it. Thoughts?

Wednesday, June 29, 2011

Teller Capture- Myths and Realities: Part III

Teller Deposit Automation (TDA) is where retail banking meets payment processing. The imperatives that drive these worlds are profoundly different. The previous installments dealt with some of the business case aspects of TDA. This post touches on systems integration and related business implications.

Retail banking and payment processing- shotgun wedding or a bridge too far? Let us see.

A common solution, seen mostly in the credit union world, is a check capture system interfaced with a receipt printer. That’s right, I did say “capture” not “deposit automation”, because that is exactly what this approach offers. Check images are captured after the teller has processed and posted the deposit. The benefits of keystroke reduction, and automated proofing and balancing at entry are absent. The approach adds the capture of check images on top of the original teller workflow- whatever that happens to be. Many solution providers offer this option, because of the challenges- both technical and political- associated with integrating deposit automation modules with teller platforms. So, if you opt for this solution, be aware that it is limited to image capture for archival and does not bring workflow efficiencies to the table.

Integrating deposit automation with teller platforms sparked a battle royale in the early days of Check 21. Providers of teller platforms saw anything that had to do with teller operations as their turf, while check imaging solution vendors viewed payment processing as their unique expertise. Often, this was reflected in political tension between the retail banking and payment operations groups within financial institutions. Some of the business case arguments I shared in the previous posts had their genesis in this departmental face-off.

There are two general alternatives for integration between teller platforms and deposit automation systems. An early approach, still widely in use, is a “toggle hand-off” between the teller host and the deposit automation system. When the teller is ready to accept deposits, she clicks on a button that brings up the deposit automation screens. All operations thereafter are conducted within the deposit automation system. After the deposit is proofed and balanced, the teller clicks another button that yields control back to the teller platform. As you can imagine, this sparked a firestorm of turf battles, as nothing seems to elicit emotion more than ownership of the user interface.

An alternative, partly spurred by the politics, is the “headless API”. This approach leaves the user interface firmly in the hands of the teller system, while the heavy lifting of payment processing is performed, “behind the scenes”, by the deposit automation system. As teller platform providers have come up to speed on the nuances of check deposit processing, this alternative has gained more traction.
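As a rough illustration of the difference, here is a sketch of the headless pattern: the teller platform keeps every screen and simply calls the deposit automation engine behind the scenes. The class and method names are invented for the example and do not correspond to any vendor's actual API:

```python
# A hypothetical sketch of the "headless API" pattern: the teller platform keeps
# the user interface and calls a deposit automation engine behind the scenes.
# Class and method names are invented for the example, not any vendor's API.

class DepositAutomationEngine:
    """Back end: image capture, amount recognition, proofing and balancing."""

    def capture_items(self, scanner_id):
        # A real engine would drive the check scanner and run MICR/amount recognition.
        return [{"micr": "0123456789", "amount": 245.10},
                {"micr": "9876543210", "amount": 54.90}]

    def balance(self, items, declared_total):
        return abs(sum(i["amount"] for i in items) - declared_total) < 0.01


class TellerPlatform:
    """Front end: owns every screen, delegates payment processing to the engine."""

    def __init__(self, engine):
        self.engine = engine

    def accept_deposit(self, scanner_id, declared_total):
        items = self.engine.capture_items(scanner_id)  # heavy lifting, behind the scenes
        if self.engine.balance(items, declared_total):
            return "Deposit proofed and balanced; ready to post"
        return "Out of balance; prompt the teller to review items"


teller = TellerPlatform(DepositAutomationEngine())
print(teller.accept_deposit(scanner_id="TLR-07", declared_total=300.00))
```

In the toggle hand-off, by contrast, the equivalent of accept_deposit would surrender the screen to the deposit automation system's own user interface and only regain control once the deposit was balanced.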

Both approaches work well, and are transparent to the depositing customer. There are marginal benefits in teller training with the headless approach. Once trained, however, I have not seen a marked difference in teller efficiency between the alternatives. It comes down to your institution’s operational philosophy. That said, my take is that the owner of the dominant platform will eventually control the interface. For example, I believe mobile banking applications will eventually subsume separate mobile remote deposit solutions…but more on that another day.

An important trend in the TDA world is the acquisition of check imaging companies or talent by teller and core system providers. Witness the Fidelity-Metavante (AFS, Bankware), ProfitStars-Alogent and Fiserv-CheckFree (Carreker) acquisitions, to name a few. ARGO Data, which dominates the very large bank teller space, has preferred to hire talent over buying a check imaging company. The upshot is that deposit automation is fast becoming a module that comes already integrated with the teller platform. Nevertheless, there are independents who offer to integrate deposit automation with your teller system of choice, in lieu of the bundled alternative.

There are a few issues to be considered when looking at bundled versus separately integrated options. If you are considering a separately integrated deposit automation module, look carefully at the application programming interface (API) between the TDA system and the teller platform. Whose API is it- the TDA provider’s or the teller vendor’s? How well defined, and how flexible, is the API? What is the strategy for version control as each system proceeds on its release schedule? In many cases, the interface is built without the overt cooperation of the teller vendor (only natural, as they would prefer to sell you their bundled version). In these situations, there is often proprietary “glueware” that sits between the TDA and teller systems. Who owns the glueware? What is the strategy to ensure that this essential piece keeps pace with the evolution of the two systems it is sandwiched between?

If you are considering bundled alternatives, ask the same questions as above. In many companies, divisions do not cooperate with each other as well as one might expect, and sometimes behave like separate companies. Thus, “bundled” may not be as tightly integrated as advertised. There is also the issue of sort patterns to consider. In large institutions, the sort patterns that govern the business rules of deposit processing can be complex. These are traditionally owned and maintained by the operations departments, and are resident on centralized check clearing platforms. The TDA paradigm, however, requires that these sort patterns be applied at the point of entry- the teller. In other words, the TDA system needs to have access to, and be capable of administering, complex sort patterns in real time at the teller station. Does the bundled TDA system have the capability to administer complex sort patterns? If the provider of the back end check clearing system is different, what is the strategy for accessing and maintaining the patterns within the TDA system?
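To illustrate what "applying sort patterns at the point of entry" means in practice, here is a deliberately simplified sketch. Real sort patterns on clearing platforms encode far richer routing, endpoint and timing rules; the rule names, numbers and dispositions below are made up for the example:

```python
# A deliberately simplified illustration of applying "sort patterns" at the point
# of entry. Real sort patterns on clearing platforms encode far richer routing and
# timing rules; the rule names, numbers and dispositions below are made up.

SORT_PATTERN = [
    # (rule name, predicate, disposition)
    ("on_us",       lambda item: item["routing"] == "111000111",  "post_internally"),
    ("high_value",  lambda item: item["amount"] >= 25_000,        "large_item_review"),
    ("image_clear", lambda item: item["image_quality"] >= 0.90,   "image_exchange"),
]
DEFAULT_DISPOSITION = "paper_clearing"

def disposition(item):
    """Return the first matching rule and its disposition for a captured item."""
    for name, predicate, outcome in SORT_PATTERN:
        if predicate(item):
            return name, outcome
    return "default", DEFAULT_DISPOSITION

item = {"routing": "222000222", "amount": 1250.00, "image_quality": 0.95}
print(disposition(item))  # -> ('image_clear', 'image_exchange')
```

The hard questions in the paragraph above are really about who maintains a table like this, where it lives, and how the teller-facing copy stays synchronized with the clearing platform's version.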

The early days of deposit automation saw different systems for TDA and back counter deposit automation. The back counter of the branch was used for bulk deposits and typically drove bulkier, higher speed scanners. TDA and back counter systems were not integrated, and lived in silo’d worlds. Today’s small footprint scanners are capable of variable speeds, including high speed capture. Thus, the rationale for separate TDA and back counter systems no longer exists. What is needed instead is a comprehensive Branch Deposit Automation (BDA) system that can be configured to work at the teller station or at the back counter of the branch. Why deal with silos if you don’t have to?

As you can imagine, a blog post only touches the tip of the iceberg. There are many more nuances that need to be considered. Nevertheless, I trust these points help frame the decision process. In my next post, I’ll wrap up the series with a checklist that might be of help.

Wednesday, May 18, 2011

Teller Capture Myths and Realities- Part II

The business case for Teller Deposit Automation (TDA) rests on the hard benefits of central proof elimination, reduced transportation and float, and the softer advantages of improved customer service, error elimination and early risk detection. Let’s take a closer look.

If you have a ready-to-post transaction at the teller station, clearly there is no need for any further proofing and balancing. This was a no-brainer benefit in the early days of Check 21 when branch alternatives were compared to centralized processing centers. The headcount reduction in central proofing alone could underwrite the business case for either teller or back counter deposit automation (BDA). In a strange way, the case is more complex now because trade-offs are made, not against central processing, but between TDA and BDA. An added twist in the tale is the financial institution’s workflow preference. The US is made up of institutions that prefer to proof and balance at the branch, and those that like to proof at the branch, but balance centrally. Not surprisingly, most small institutions exhibit a branch balancing preference, while larger ones opt for centralized balancing.

The actual business case analysis involves running scenarios that are unique to the institution. Nevertheless, it is relevant to make a few broad observations. Those who object that TDA actually increases teller workload and queue length have a point when it comes to large commercial deposits. Thus TDA is not right for all deposits. Many institutions limit TDA to deposits containing, say, ten items or fewer. The rest are either dealt with through a separate BDA system, or set aside for tellers to process during slow periods.

Does TDA reduce keystrokes, or does it actually imply more work for the teller? I am one of those strange people who actually counts keystrokes whenever I go to make a deposit at a branch (yes, I still visit branches). I have accounts at institutions that use TDA, and at those that do not. My experience is that TDA drastically cuts down keystrokes. Contrary to the claim that TDA makes tellers into proof operators, I find similar deposits taking about half the time in TDA institutions. I have deliberately made deposits with addition errors on the deposit slip, and I find that TDA institutions catch them faster. The key is that with improved recognition software and vastly better MICR readers, technology is doing the heavy lifting, leaving little as a teller burden. The keystroke reduction is borne out by statistics, including one from a large institution that observed a reduction from 75 to 5 keystrokes (what were they doing previously with 75 strokes??). Now, the theory is that reduced keystrokes free up more teller “face time” for cross selling. Not once in my informal mystery shopping did I experience the “selling teller”. The institutions I visited seem to use the reduction in keystrokes to drive efficiency over cross-selling.

While the back counter versus teller trade-off requires more space than a blog post, there are key (no pun intended) considerations for BDA in larger institutions. Typically, these are environments where items are scanned at the branch but corrected and balanced at the center. If an item needs to be rescanned, or an operator needs access to the original paper, the fact that the check is at a branch many miles away presents challenges. There are systems with instant messaging capability back to the branch, but the DRIFT principle (Do It Right the First Time) that TDA offers is compelling.

Transportation savings are trickier. While check truncation does eliminate the need to move paper, there is still a lot more paper that needs to be moved daily from the branch- not the least of which is cash, the only paper that cannot be truncated. There is an emerging move towards “the paperless branch”- an interesting convergence between check truncation and document management. It will be interesting to watch how this scenario pans out in the absence of a legislative catalyst like Check 21. Nevertheless, with careful analysis it is possible to quantify transportation savings due to TDA.

An aspect that is often overlooked is the potential to identify risky deposits at the very outset through TDA. We have accelerated the movement of money to the speed of light. Both inadvertent errors and outright fraud have kept pace. Yet, our industry still chooses to address risk on a batch basis on “Day Two”- a holdover from the old centralized mechanical capture days. It is my take that TDA and Day Zero Risk Management make perfect bedfellows. More on this in another post.

I know no one looks at float anymore because we are about as close to “free money” as one can get. While I don’t have a crystal ball, it doesn’t take rocket science to know that the sizeable national debt will push inflation and drive rates northwards in the not too distant future.

Lastly, the single biggest barrier to TDA is the difficulty in integrating deposit automation with teller platforms. With many core vendors having acquired check imaging companies, this should become easier. This, however, is not as straightforward as it might appear. Not all integrations are equal and, like everything else, there are trade-offs here as well. More on that in Part III.

Friday, April 22, 2011

Teller Capture- Myths and Realities- Part I

While the U.S. check payments industry has seen a dramatic transformation from being awash in oceans of paper, to an almost all-image environment in less than a decade, there is debate about where images are best captured. The alternatives are many- teller stations, branch back offices, ATMs, central processing centers, offices, stores, homes and mobile phones, to name a few.

Capture points of entry can be broadly divided into two categories:

1. Interior points within a bank’s infrastructure, like tellers, branches and ATMs, where the driving imperative is one of cost reduction and efficiency.

2. Exterior points, like offices, stores, homes and mobile phones, where the drivers are combinations of revenue uplift, customer convenience and efficiency.

This is the first of a series of posts on the pluses and minuses of various capture strategies. We begin with Teller Capture.

Let me begin by saying that I have never liked the term “Capture”. It is a holdover from the times when MICR (and later image) data were read and “captured” on electro-mechanical reader/sorters. While sorters have been relegated to museums and the odd eBay page, the term lingers. I find the term limiting because it tries to describe a workflow that is far more comprehensive than the mere capture of information. I submit that the capture-correct-balance continuum typical of the many “capture” processes in use today is better referred to as Deposit Automation.

Now that I have my pet peeve out of the way, let us look at Teller Deposit Automation (TDA). The quick take on TDA is to have a proofed, balanced, and ready-to-post deposit, before the customer making the deposit has left the teller station.

That last statement sometimes lets loose a flurry of concerns:

  • I don’t want to make sorter operators out of my tellers
  • Error rates will go up because tellers aren’t trained to be proof and balancing operators
  • Queue length will go through the roof because each deposit is going to take much longer
  • Tellers (and perhaps customers) will not accept this new and different process
  • A scanner, computer and software at each station will be tough to justify

To be honest, there are also issues of a political nature that can rise to the top. TDA lies in that No-Man’s-Land between Retail Banking and Operations. In some institutions, particularly larger ones, TDA can be a lightning rod for turf battles. Nevertheless, let us look at the other end of the telescope and examine the benefits touted by proponents of TDA.

The hard benefits that drive the business case are:

  • Truncation and reduction of transportation
  • Central proofing and balancing elimination
  • Float gains through early capture (yes, I know…but interest rates will not always remain subterranean)

The soft benefits that supplement and sometimes drive the decision (depending on the institution’s strategic priorities) are:

  • Keystroke reduction freeing up more teller time for customer service
  • Error reduction through lower keyboard data entry
  • Potential risk reduction through integration of TDA with risk management systems
  • Potential for enhanced service through integration of TDA with Customer Relationship Management (CRM) systems

The business case battles are fought on multiple fronts with hard and soft benefits challenged, defended and examined from many angles. My next post will take you through some of the battlefields (I’ll admit I have a few scars from these skirmishes).

Thursday, March 17, 2011

Payment Hub Realities

At BAI's Payments Connect conference in Phoenix last week, I moderated a panel discussion on "Getting the Technology War Elephant to Dance". One of the avenues explored was to "extend" the reach of legacy systems through Payment Hubs.

The panelists were Taylor Vaughan, Director Treasury Services at First Tennessee Bank, Dave Shipka, Senior Vice President Enterprise Payments at Comerica Bank, and Elizabeth Cronenweth, Product Line Manager at Sterling Commerce (now a part of IBM). There were interesting perspectives from two bankers who were in the midst of implementing hubs, and a technology solution provider with a handle on industry trends. Here are some takeaways:

  • Financial institutions are being buffeted by strong headwinds in the form of potential lost revenue (aka the Durbinator), heightened compliance regimes (one bank- not represented on the panel- stated at the conference that they project 30% of their IT budget to be spent on compliance), polarized demographics with Boomers and technology-savvy Gen Nexters demanding very different services, an explosion of channels and payment alternatives, increasing non-bank competition and globalization.
  • Some of the imperatives driving technology infrastructure planning are:
    1. Comprehensive customer view across all relationships. Most systems in place are transaction centric and don't offer a customer view- let alone a 360 degree view across relationships.
    2. Multi-channel and multi-payment capability as opposed to the silo'd legacy environment.
    3. Real-time operations to enhance customer service and reduce risk.
    4. High up-time availability and ubiquity- anytime, anywhere.
    5. Nimble operating environment lending itself to agile change management.
  • Existing infrastructures present barriers to the imperatives through silo'd architectures and organizations, batch operating environments, old and poorly documented code, hard coded interfaces and long lead-times for change management.
  • There are two broad approaches to deal with the challenge- "extend" the reach of legacy systems, or replace them altogether.
  • It is very early in the evolution of Payment Hubs and there is considerable debate as to what it is, and is not.
  • A vision of a Payment Hub- a single platform that operates across all customer relationships, channels of interaction and payment alternatives.
  • Payment Hubs can be data centric, where data and business rules reside at the hub, or message centric, where the hub acts as a traffic cop (a minimal sketch of the message centric idea appears after this list). The reality is that evolving hubs include combinations of both approaches.
  • The main driver for hubs is the "spaghetti" environment in most institutions, with one-to-one paths from every channel to multiple legacy systems.
  • A challenge that should not be overlooked is political pushback from the owners of various legacy turfdoms. Some see the implementation of a hub as a direct threat to their jobs.
  • It is absolutely essential to have an executive sponsor who will stay the course, as the hub can and will touch many parts of an institution.
  • Both revenue and cost perspectives should be carefully looked at. Clearly, the cost of an increasingly expensive and unwieldy status quo needs to be compared with the expense of implementing hubs. Often, the cost of the status quo can be untenable. What is needed then, is the will to take on the risk of transformation through enabling technologies like hubs.
  • Another perspective is to "sell" the hub on the back of one or two revenue opportunities. This view suggests that it is difficult to get consensus on implementing a hub on a cost reduction play alone. Integrated Payables can be one such revenue opportunity. Eliminating silo'd payables and multiple files not only enhances customer service, but presents an opportunity for value-added pricing. A follow on can be the other side of the mirror- Integrated Receivables. This view recommends building the case based on the revenue opportunity, and having the transformational foundation for the enterprise pulled along by a growing topline.
  • The option to replace legacy systems as opposed to the "extend" paradigm was examined, but discarded due to the complexity of "legacy spaghetti". An interesting observation shared was that if much of the intelligence ended up in the hub, there was no need for a legacy system, except for settlement. Can a hub be a Trojan Horse that eventually eliminates legacy infrastructure? Intriguing concept indeed!
  • A perspective on batch versus real-time was that both capabilities were needed in the hub as the batch based legacy systems were not going away overnight. This implies a careful definition of the Target Operating Model and a well defined Change Management Program to get there.
  • Lastly, a challenging note to the hub concept was raised, in that an attempt to over-centralize can end up in a potential single point of failure. Clearly, thought needs to be given at the design and architecture stage to safeguard against the hub bringing down the whole ship.
  • It was a fascinating dialog, and one that I enjoyed moderating. My take is that we will see more hub implementations in larger institutions in the U.S. and other older economies that do not have the benefit of leap-frogging from manual or poorly automated environments to the latest and greatest.
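As promised above, here is a minimal sketch of the message centric, traffic-cop notion of a hub: the hub inspects each incoming payment message and routes it to whichever legacy system settles that payment type. The payment types, channels and endpoint names are purely illustrative:

```python
# A minimal sketch of a message centric "traffic cop" hub: inspect each incoming
# payment message and route it to the legacy system that settles that payment type.
# Payment types, channels and endpoint names are purely illustrative.

LEGACY_ROUTES = {
    "ach":   "legacy_ach_engine",
    "wire":  "wire_transfer_system",
    "check": "item_processing_platform",
    "card":  "card_switch",
}

def route_payment(message):
    """Hand the message to the legacy endpoint responsible for its payment type."""
    endpoint = LEGACY_ROUTES.get(message["payment_type"])
    if endpoint is None:
        raise ValueError(f"No route for payment type {message['payment_type']!r}")
    return {"deliver_to": endpoint, "channel": message["channel"], "payload": message}

msg = {"payment_type": "ach", "channel": "online_banking", "amount": 500.00}
print(route_payment(msg)["deliver_to"])  # -> legacy_ach_engine
```

A data centric hub would go further, holding the business rules and the customer and transaction data itself rather than merely routing messages; real implementations, as the panel noted, tend to blend the two.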
Watch this space as we evolve into a post-Great Recession society!