Payment Tokens and Standards, Again

The last couple of months have seen a more focused and public discussion between merchants and banks about how the standards that will underlie payment tokens should be crafted. An oversimplified summary of positions: merchants want an ISO-based standards development process, which would allow more inclusive participation and greater confidence in a truly open payment token ecosystem; banks argue that, given the rapid growth of card not present (CNP) fraud and the expectation of geometric increases as EMV security measures reduce fraud opportunities at the physical point of sale, there isn’t time for the notoriously slow process that hamstrings almost all ISO-related work.

What’s interesting to me is that this debate raises – once again – the question of why standards development can be so contentious and whether there is in fact an ideal time for it. These are not new questions. In fact, there is useful work that defines the “ideal” time for standards development as a function of the timing relationship between technical interest on one hand and “political” interest, fueled by economic concerns of the type we’re seeing now in both the banking and merchant communities, on the other. This framework provides useful context for the current situation, in which payment token stakeholders are at odds over the mechanics of the standards process, and at a high level it might suggest an approach to make that process more palatable.

In 1990 James Gosling, the Sun Microsystems engineer who would later develop Java, wrote what he described as a “moderately sarcastic” note about Phase Relationships in the Standardization Process (see Appendix). He begins with Toshi Doi’s diagram, which describes the ideal window for standards development (“Ws” in the diagrams below) as the time when technical interest is declining because the technology is well understood, but the “political” (economic) environment has not yet become so contested that constructive negotiation between stakeholders is impossible.

He goes on to suggest that both technology interest and political interest activities have consequences (or results) which can be seen as integrals of the activity curves (see figure, below, and the Appendix for a full discussion). In this figure, Gosling says “the integral of Ta is K (knowledge) and the integral of Pa is C (calcification – revealing a strong personal cynicism). Ss, the sensibility of standardization, is just K-C. The optimum time for standardizing a technology is when Ss is at a maximum, which will be in a region where knowledge is high, but calcification has not yet set in.”

After walking through a small series of progressively more gloomy scenarios, Gosling sums up the fundamental issue as he saw it in 1990:

    The sad truth about the computer industry these days is that it is this last case that is dominating a broad range of standards activities. Standards are regularly created and adopted before anyone has performed the experiments necessary to determine if they are sensible. Even worse, standards are getting accepted before they are even written, which is a truly ridiculous situation.

    How this arises is clear: standards are increasingly being viewed as competitive weapons rather than as technological stabilizers. Companies use standards as a way to inhibit their competition from developing advantageous technology. As soon as technical activity is observed by political/economic forces, their interest rises dramatically because they see a possible threat that must be countered before it gains strength.

    The result of this is a tremendous disservice to both users and consumers of technology. Users get poor quality technology, and because of the standards process, they’re stuck with it.

This discussion is relevant to our current payment token circumstances.

MasterCard, Visa, and the other EMVCo owners are careful to say that EMVCo develops specifications, not standards. But the difference between specifications and standards may be illusory in an environment where companies (or groups of companies) develop them to put a damper on technology development by others and to create market advantage for themselves.

Let’s look at the current situation in terms of Toshi Doi and James Gosling’s notion of “political” interest. What is EMVCo’s motivation for creating a payment token specification at what one of its senior staffers describes as “lightning speed?”

If we take EMVCo and bank statements at face value, I’d suggest the motivation may be fundamentally different from the quest for competitive advantage Gosling sees in his admittedly “moderately sarcastic” perspective on standards development. What’s different here is that speed to market is motivated by a desire to head off geometric increases in card not present fraud as the ongoing chip rollout better secures the physical point of sale and the United States moves closer to the October 2015 liability shift date. Effective fraud mitigation will be good for all stakeholders, and, like it or not, the EMVCo process is the most likely vehicle to yield an actionable payment token product in time to put a dent in the CNP fraud shift that will follow the implementation of chip at the point of sale.

So what to do in the current environment, where merchants, the card brands, and banks are so at odds? One approach to test true motivation would be to secure a commitment from EMVCo to implement the specification in ways flexible enough to encourage maximum stakeholder participation in the ecosystem: for example, to establish market mechanisms that ensure a wide variety of players (including merchants) have the opportunity to operate token vaults, and to ensure that functionality that may not appear practical at launch, such as mass deployment of single-use tokens, is not precluded in the future, when processing overhead will no longer make its use difficult at scale.

Stakeholders have more common economic interest in short-term payment token development than the emotion of the current discussion suggests. Those common interests should lead, in relatively short order, to a more productive tone in the industry’s token standards discussion.

APPENDIX

Phase Relationships in the Standardization Process
James Gosling
August, 1990

This is a moderately sarcastic note on the phases that the standardization process goes through, and the relationship between the level of technical and political interest in a topic. It is purely a personal view.

Diagram 1

Toshi Doi of Sony describes the standardization process in terms of Diagram 1. The i axis describes level of interest and the t axis describes time. Ti describes technical interest, and Pi describes political interest. As time passes, technical interest declines as the technology becomes understood. Similarly, generally fueled by economic pressures, political interest in a technology increases over some period.

For a standard to be usefully formed, the technology needs to be understood: technological interest needs to be waning. But if political interest in a standard becomes too large, the various parties have too much at stake in their own vested interest to be flexible enough to accommodate the unified view that a standard requires.

In this model, Ws is the “window of standardization,” where technical interest is waning (i.e. the technology has become understood), but the political situation hasn’t become too hotly contested for constructive negotiating.

Diagram 2

This model has many interesting insights, but there is more complexity in the situation than it captures. In the original model, the T and P curves are open ended. The situation is more like Diagram 2. These curves, Ta and Pa, represent technical activity and political activity. In general, technical activity precedes political activity. Both types of activity go through phases of different intensity. As these activities proceed, they produce results. The result curves are the integrals of the activity curves.

Diagram 3

The integrals of these two curves are drawn in Diagram 3. The integral of Ta is K (knowledge) and the integral of Pa is C (calcification – revealing a strong personal cynicism). Ss, the sensibility of standardization, is just K-C. The optimum time for standardizing a technology is when Ss is at a maximum, which will be in a region where knowledge is high, but calcification has not yet set in.
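The Ss = K-C construction lends itself to a quick numerical sketch. The snippet below is illustrative only: the bell-shaped activity curves, their peak positions, and their widths are my own assumptions, not values from Gosling's note. It accumulates K and C as running sums of Ta and Pa and locates the time where Ss peaks:

```python
import math

def activity(t, peak, width=2.0):
    # Bell-shaped stand-in for a hand-drawn activity curve (Ta or Pa).
    return math.exp(-((t - peak) ** 2) / (2 * width ** 2))

def sensibility_curve(ta_peak=5.0, pa_peak=11.0, t_max=20.0, steps=2000):
    # Ss(t) = K(t) - C(t), with K and C as running Riemann sums of Ta and Pa.
    dt = t_max / steps
    K = C = 0.0
    curve = []
    for i in range(steps):
        t = i * dt
        K += activity(t, ta_peak) * dt  # knowledge accumulates from technical activity
        C += activity(t, pa_peak) * dt  # calcification accumulates from political activity
        curve.append((t, K - C))
    return curve

t_best, ss_best = max(sensibility_curve(), key=lambda p: p[1])
print(f"Ss peaks at t = {t_best:.1f}")  # midway between the Ta and Pa peaks
```

With Pa peaking well after Ta, Ss rises to a broad positive plateau around the midpoint of the two peaks, which is exactly the wide standardization window the model predicts.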

A very interesting quantity to observe is the phase relationship between Ta and Pa. When the maximum point on Pa follows the maximum point on Ta by a sufficient distance, there is a wide Ss window. A sensible standard can be fairly easily set since the political activity which leads to the standard has the necessary technical knowledge in hand when needed. If Pa lags Ta sufficiently, Ss will have a long high flat top, which forms a convenient table on which to work.

Diagram 4

Consider moving Pa left, closer to Ta. When it is close to Ta, Ss will have a shallow, flat region where the upward slope of Ta approximately matches Pa. This region is the time of chaos. Before calcification builds up, there isn’t enough knowledge to do anything sensible; by the time there is enough knowledge, there’s too much calcification to allow a sensible compromise to be reached. In between, the region is flat enough that there isn’t a clearly defined optimum moment for developing a standard, so there is instead a drawn-out period of chaotic bargaining and soul searching.

Diagram 5

Consider moving Pa even farther left, until it is to the left of Ta. This is the worst case: Ss is always negative. The long flat minimum region is the time of panic, where the political/economic process has decided that a technology needs to be standardized, but no one understands it. Standards get set by making random guesses that are not grounded in any technical reality, but are instead grounded totally on political expedience.
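The effect of the Ta/Pa phase relationship in Diagrams 3 through 5 can be sketched with the same kind of toy model (again, the Gaussian curve shapes and peak positions are illustrative assumptions on my part, not Gosling's numbers). Shrinking, then inverting, the lag between the activity peaks shows the window collapse:

```python
import math

def activity(t, peak, width=2.0):
    # Bell-shaped stand-in for an activity curve (Ta or Pa).
    return math.exp(-((t - peak) ** 2) / (2 * width ** 2))

def max_sensibility(ta_peak, pa_peak, t_max=24.0, steps=2400):
    # Largest value reached by Ss(t) = K(t) - C(t) over the whole run.
    dt = t_max / steps
    K = C = 0.0
    best = float("-inf")
    for i in range(steps):
        t = i * dt
        K += activity(t, ta_peak) * dt
        C += activity(t, pa_peak) * dt
        best = max(best, K - C)
    return best

print(max_sensibility(8.0, 16.0))  # wide lag: high positive window (Diagram 3)
print(max_sensibility(8.0, 10.0))  # small lag: shallow window, the time of chaos (Diagram 4)
print(max_sensibility(16.0, 8.0))  # Pa leads Ta: Ss never meaningfully positive (Diagram 5)
```

The three cases reproduce the qualitative story: a comfortable lag yields a high Ss plateau, a small lag a shallow one, and when political activity leads technical activity, calcification always outruns knowledge.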

Diagram 6

The case described in the previous diagram cannot actually occur as drawn: the very act of setting a standard inhibits technical activity, reducing the Ta curve and sharply flattening the K curve. Ss never rises to a positive level of sensibility.

The sad truth about the computer industry these days is that it is this last case that is dominating a broad range of standards activities. Standards are regularly created and adopted before anyone has performed the experiments necessary to determine if they are sensible. Even worse, standards are getting accepted before they are even written, which is a truly ridiculous situation.

How this arises is clear: standards are increasingly being viewed as competitive weapons rather than as technological stabilizers. Companies use standards as a way to inhibit their competition from developing advantageous technology. As soon as technical activity is observed by political/economic forces, their interest rises dramatically because they see a possible threat that must be countered before it gains strength.

The result of this is a tremendous disservice to both users and consumers of technology. Users get poor quality technology, and because of the standards process, they’re stuck with it.

For more than 35 years, Santa Fe Group Senior Advisor Gary Roboff has contributed his outstanding talents to the financial services industry, and in particular to financial services payment systems. Gary has focused on issues such as privacy and information utilization, business frameworks, changes in the payments and settlement systems, and standards for emerging e-commerce applications. He has chaired the Electronic Funds Transfer Association (EFTA) Board of Directors and was a founder of the International Security Trust and Privacy Alliance (ISTPA), serving as Vice Chair of its Board.