Which Fedcoin?

David Andolfatto, Vice President of the St Louis Fed, published a most interesting post yesterday, Fedcoin: On the Desirability of a Government Cryptocurrency. Andolfatto’s post is itself in reference to JP Koning’s Fedcoin piece of last October. Back then, I wrote a bit about it on a private email list that is usually devoted to topics relating to blockchain protocol design. I thought the Fedcoin thought experiment was interesting fodder for our monetary intuitions. Still do. So here goes.

A central bank backed blockchain payments network for a national currency like the USD is a neat idea. It would, first of all, put digital cash on the map for good. And digital cash that trades at parity with an economy’s well-established unit of account is a far more useful medium of exchange than a volatile cryptocurrency like bitcoin. Andolfatto:

And so, here is where the idea of Fedcoin comes in. Imagine that the Fed, as the core developer, makes available an open-source Bitcoin-like protocol (suitably modified) called Fedcoin. The key point is this: the Fed is in the unique position to credibly fix the exchange rate between Fedcoin and the USD (the exchange rate could be anything, but let’s assume par).

So the idea is that the supply of Fedcoin expands and contracts perfectly with changes in Fedcoin demand, as the Fed would issue and redeem Fedcoins for USD deposits at parity.

Exactly what sort of blockchain protocol would be appropriate for this scheme is an open question. It certainly cannot be a proof-of-work protocol. Maybe it wouldn’t be a blockchain protocol at all. This is what I wrote about the idea on that email list:

forget nakamoto-consensus for a moment and assume Fedcoin is just a 1990’s implementation of digital cash, Fed-run servers, chaumian blinding, etc. Assume it’s executed well, so we’ve got something with all the properties of cash with the added benefit of cheap electronic payments. That’s actually a pretty evocative idea on its own, even though we had everything in place to do it 20 years ago and it has nothing to do with blockchains.

So what’s innovative about Fedcoin–whatever its technical implementation may be–isn’t blockchain tech. It’s rather the monetary implications of central bank sponsored digital cash. And those implications are IMO more profound than what both Koning and Andolfatto suggest. Andolfatto says:

Of course, just because Fedcoin is feasible does not mean it is desirable. First, from the perspective of the Fed, because Fedcoin can be viewed as just another denomination of currency, its existence in no way inhibits the conduct of monetary policy (which is concerned with managing the total supply of money and not its composition). In fact, Fedcoin gives the Fed an added tool: the ability to conveniently pay interest on currency.

In his theoretical work, Andolfatto has advocated the interest-bearing money concept as a way of increasing the efficiency of money holdings: the economic efficiency of the Friedman rule without the deflationary implications. So I can see why Andolfatto is interested in digital cash.

Indeed, Fedcoin could pay interest (at the IOER rate?). In fact, if Fedcoin were to displace the use of greenbacks, this could remove the last remaining impediment to negative nominal interest rates, so perhaps that is one aspect of Fedcoin that would actually expand rather than inhibit the conduct of monetary policy.

Just another dollar denomination?

But I think that Andolfatto and Koning are seriously underestimating the implications of Fedcoin. This is what I wrote:

But I’m not interested in blockchains here; it’s the economic implications, which are radical: it would cause the demise of fractional reserve banking. A central bank that went down this path would effectively bring about something dubbed the “Chicago Plan”, an early 20th century proposal that banks hold 100% reserves and the CB compensate for the destruction of privately-created “endogenous money” with a dramatic expansion of base money (monetising in a non-inflationary way much of the national debt as a side-effect).

So the problem (or opportunity, depending upon your perspective) with Fedcoin is that it will compete with bank deposits in a big way. Unlike your bank deposit, which is an unsecured loan to a highly leveraged deposit-taking institution, Fedcoin is central bank money. It cannot default, by definition. Fedcoin would be better credit than US Treasury Bills. Why would anyone use bank depo (and its creaky array of payment systems like ACH and SWIFT) given such an alternative?

The only reason why we are accustomed to thinking that cash and bank deposits are the same thing, exchangeable 1-for-1, is because for 80-odd years bank depo has been buttressed by the central bank’s Lender-of-Last-Resort (LLR) facilities, government-backed deposit insurance, and a bank debt credit market built around expectations that banks are Too-Big-To-Fail (TBTF).

Fedcoin would be immensely popular. Not just among individuals, but institutions, which could finally own large balances of the unit-of-account without having to assume the credit risk of a >30x leveraged balance sheet with a big duration mismatch between its assets and liabilities.

In no way inhibits the conduct of monetary policy?

Paper cash is central bank money too, but without an electronic payment rail its usefulness is capped. Digital cash with a central bank issuer, though… that’s useful for everything except black market trade. It’s hard to see how the Fed could both maintain the parity peg and not see the “Fedcoin” line-item on its balance sheet swell. Banks would have to offer depositors a credit spread over the “Fedcoin rate” to prevent a run.

So, the introduction of Fedcoin would place the Fed in a dilemma. If it rations the supply of Fedcoin, Fedcoin will trade at a premium to bank depo and the peg breaks on a spot basis. But if you make banks compete with the Fedcoin rate for bank deposits and enforce parity (you can deposit/withdraw Fedcoin at your bank 1:1), then the Fedcoin rate is going to be determined more by the demand for Fedcoin relative to bank depo than by macroeconomic considerations and, anyway, the peg breaks on a forward basis.

Why don’t you have an account with the Fed?

When you think about it, it’s rather odd: why can’t you have an account at the Fed? Why must you assume the credit risk of a bank just in order to transfer dollars electronically? (That’s a question that really should be asked more frequently.) The reason why Fedcoin is so radical is that, for the first time, central bank money would be available to everyone in electronic form. Electronic payments would finally be divorced from bank deposit.

Who said payment systems were boring! The whole edifice of fractional reserve banking is held up by the union of electronic payments and bank deposit (along with LLR, depo insurance, etc). Break that union and, I conjecture, the union of fiat money and fractional reserve breaks too.

Which may be no bad thing. Why must fiat money be inextricably linked up with credit? Without fractional reserve banking, the locus of credit origination could be what it should be: the issuance of a debt instrument with a market price, rather than a bank loan financed with a privileged, publicly-subsidised debt instrument (bank deposit) that doubles as the electronic medium of exchange.

The other Friedman rule

So, if you accept my thesis that Fedcoin will undermine fractional reserve banking, it makes a lot of sense to wonder what sort of monetary policy the Fed should conduct in a world where the money supply is equal to the monetary base. I’m going to step out on a limb here and say that discretion goes out the window and policy is run by rules.

And rules can be programmed. Milton Friedman famously once said that the FOMC could be replaced by a computer. I would like to go further and say that it should be replaced by a distributed computer.

So my kinda Fedcoin wouldn’t be a fixed exchange rate regime with Fedcoins exchangeable with Fed deposits at parity, leaving the existing monetary policy instruments and discretionary policy framework intact. Instead, the liabilities of the Fed’s balance sheet would be entirely denominated in Fedcoin. Monetary policy would be an algorithm embodied in a DAO, and the FOMC could only change the algorithm infrequently, if at all.

Whether the monetary policy algorithm is Friedman’s k-rule, a Taylor Rule, Fisher-like dollar stabilisation, or whatever, the idea is that of a monetary policy represented by rules, the execution of which not even the FOMC could manipulate outside of the “meta rules” encoded in the FOMC DAO.
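To make “rules can be programmed” concrete, here is a minimal sketch of two candidate policy rules written as pure functions, the kind of thing a FOMC DAO could execute mechanically. The parameter values and function names are illustrative assumptions of mine, not a proposal:

```python
# Hypothetical sketch only: two policy rules as pure functions a DAO could execute.
# All parameter values (neutral rate, weights, targets, k) are illustrative.

def taylor_rule(inflation, output_gap, r_neutral=2.0, pi_target=2.0,
                w_pi=0.5, w_gap=0.5):
    """Nominal policy rate (%) implied by a standard Taylor-style rule."""
    return r_neutral + inflation + w_pi * (inflation - pi_target) + w_gap * output_gap

def friedman_k_rule(money_supply, k=0.03):
    """Friedman's k-rule: grow the money supply by a fixed k percent per period."""
    return money_supply * (1 + k)

if __name__ == "__main__":
    print(taylor_rule(inflation=1.5, output_gap=-1.0))  # -> 2.75
    print(friedman_k_rule(money_supply=4.0e12))         # -> 4.12e12
```

The point is not the particular rule but that its execution is deterministic: given the data feeds, anyone can verify that the DAO applied the encoded rule.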

Cypherpunk monetarism

And Fedcoin along these lines is intriguingly close to the budding research around stable cryptocurrency. There are differences. The stable coin ideal is still very much a cryptocurrency vision, built around permissionless p2p networks autonomous of any off-chain institutional governance. My seigniorage shares model, for example, attempts to bootstrap the functionality of a central bank balance sheet using an on-chain digital asset that is distinct from the coin used as medium-of-exchange.

But an institutional model like Fedcoin would have an easier time of it. So my kinda Fedcoin: a stablecoin blockchain, combined with an off-chain balance sheet and some policing of the consensus protocol and data feeds. Not so different from “cypherpunk monetarism”.

Some Crypto Quibbles with Threadneedle Street

Last week the Bank of England published its Quarterly Bulletin, which contained two detailed papers on digital currencies. The Bank deserves credit for writing such a thoughtful review of this space, which was clearly the product of thorough and open-minded research.

One of the two papers titled Innovations in payment technologies and the emergence of digital currencies is noteworthy for pointing out the potential applications of decentralised crypto ledger systems for financial services. Given that I’m co-founder of a new company devoted to such applications, I’m delighted to see that a G10 central bank has the foresight to see this. There will be many points of intersection between private sector innovation here and the regulatory mandate of a central bank.

But you, my readers, are interested in the cutting edge of cryptonomics thinking, so I want to instead discuss the Bank’s second paper, The economics of digital currencies, because I take issue with parts of its analysis. In brief, I believe that the authors have incorrectly analysed the cost structure of digital currency systems and, as a result, incorrectly generalise some problems faced by digital currencies like Bitcoin to digital currencies in general.

The Costs of Mining and Transaction Fees

Ok, first a quick review of mining. We’ll assume the Bitcoin protocol as our template. The microeconomics of mining are actually quite simple. To win the mining award (currently, a 25 bitcoin “coinbase” award + the block’s TX fees), you have to solve a hash-based proof-of-work problem, which involves using a machine to compute the double SHA256 hash of a block of TX over and over until you hit a value below a certain target, which is defined by the protocol’s current difficulty. Difficulty resets every 2016 blocks, increasing if the average duration between solved blocks is below 10 minutes, decreasing if the average duration is above 10 minutes. The scheme ensures that the average time between blocks approximates 10 minutes.

Now, the probability that a given hash “solves” the problem is precisely defined by difficulty. So, mining profitability is a function of these four variables:

  • Current difficulty
  • The efficiency of mining (converting electricity into hashes)
  • The price of bitcoin (market value of mining award)
  • The price of electricity

The efficiency of mining really boils down to this simple ratio: GHs/kWh. How many gigahashes can your hardware compute per kilowatt-hour of electricity (including the electricity consumed in cooling the machines, etc)?
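Here is a back-of-envelope sketch of that profitability calculation, putting the four variables together. The numbers in the example call are placeholders, not market data; the only Bitcoin-specific fact used is that the expected number of hashes needed to solve a block is difficulty × 2^32:

```python
# Rough sketch of mining profitability from the four variables listed above.
# Example figures are placeholders; difficulty * 2**32 is the expected hashes per block.

def expected_btc_per_hash(difficulty, block_award=25.0):
    """Probability-weighted BTC earned per hash (TX fees ignored for simplicity)."""
    expected_hashes_per_block = difficulty * 2**32
    return block_award / expected_hashes_per_block

def profit_per_kwh(difficulty, btc_price_usd, ghs_per_kwh, usd_per_kwh):
    """Expected USD profit from spending one kWh of electricity on hashing."""
    hashes = ghs_per_kwh * 1e9                       # gigahashes -> hashes
    revenue = hashes * expected_btc_per_hash(difficulty) * btc_price_usd
    return revenue - usd_per_kwh

# Placeholder inputs: difficulty, USD/BTC, GHs per kWh, USD per kWh.
print(profit_per_kwh(4.6e10, 650.0, 2.0e6, 0.10))    # positive -> keep mining
```

If that number is positive you keep mining (and new entrants join); if it is negative you drop out. The difficulty reset is what pushes it toward zero.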

And the dynamic reset of difficulty basically ensures that only those running machines with the highest GHs/kWh ratio and paying the lowest cost per kWh will mine in the long run, as everyone else will be mining unprofitably and drop out of the network. It’s almost a textbook model of perfect competition.

Now, what I think often gets missed here is that the costs of mining bitcoin are entirely a function of the price of bitcoin. If the price of bitcoin goes up, mining becomes profitable and more nodes join the network, which drives up difficulty until mining is no longer especially profitable. If the price goes down, mining becomes unprofitable at the current difficulty, nodes drop off the network, and difficulty falls until mining is profitable again.

This dynamic is obscured by the fact that investments in mining hardware have driven GHs/kWh up relentlessly over the last few years, so difficulty rarely declines with bitcoin price, but the dynamic still applies: the price of bitcoin determines the cost of mining.
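To make the equilibrium claim concrete, here is a toy calculation (all figures invented) showing that, for a given hardware efficiency and electricity price, the hashpower the network can sustain scales one-for-one with the market value of the mining award:

```python
# Toy illustration: in equilibrium the award finances the hashing, so sustainable
# hashpower scales with the award's market value. All figures are invented.

def equilibrium_hashpower_ghs(btc_price_usd, block_award=25.0,
                              usd_per_kwh=0.10, ghs_per_kwh=2.0e6):
    """Network hashrate (GH/s) at which hashing one block costs its market value."""
    award_usd = block_award * btc_price_usd
    kwh_per_block = award_usd / usd_per_kwh        # electricity the award can finance
    gh_per_block = kwh_per_block * ghs_per_kwh     # total gigahashes at that spend
    return gh_per_block / 600.0                    # spread over a 10-minute block

for price in (200, 400, 800):
    print(price, round(equilibrium_hashpower_ghs(price)))   # doubles as price doubles
```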

Failure to appreciate this fact leads to arguments like this one in the BoE paper:

Moreover, to the extent that miners’ expected marginal revenue exceeds their expected marginal costs, miners’ costs are likely to increase over time. This should occur even if no additional people start to mine and independently from any increase in the number of transactions per block. This is because distributed systems involve a negative externality that causes overinvestment in computer hardware. The negative externality emerges because the expected marginal revenue of individual miners is increasing in the amount of computing power they personally deploy, but the difficulty of the problem they must each solve (and hence their marginal cost) is increasing in the total amount of computing power across the entire network. Individual miners do not take into account the negative effect on other miners of their investment in computing resources. Economic theory would therefore suggest that in equilibrium, all miners inefficiently overinvest in hardware but receive the same revenue as they would have without the extra investment.

Can you spot the error? It’s right there in the first sentence: “to the extent that miners’ expected marginal revenue exceeds their expected marginal costs, miners’ costs are likely to increase over time.” This is a fallacy of composition.

What we really have here is the familiar pattern of Knightian uncertainty faced by entrepreneurs. The miner must make a capital outlay in advance for his mining equipment (to get an edge over the competition with new kit delivering a higher GHs/kWh ratio), but he doesn’t know what the price of bitcoin will be once he starts winning blocks, nor does he know what the difficulty will be, which will be a function of Bitcoin price and the capital investment of his competitors.

Boom, bust… it’s all the same: the fixed costs of mining hardware are internalised by the miner; those costs are not an externality as the authors argue. The cost of mining will be dictated by the price of bitcoin, that is, the market value of the mining award, and the difficulty reset enforces the long-run equilibrium condition whereby hashing costs = market value of the mining award. Even changes in the variable costs of mining (the price of a kWh, basically) don’t change the long-run costs of mining, as an increase in electricity prices ceteris paribus should cause difficulty to decline and a decrease should cause difficulty to increase. The cost of mining a block will converge to the market value of the mining award.

I’m not an economist, but back in January I suspected that the economics profession would have trouble with this implication of the Bitcoin protocol, when I wrote:

So the exchange value of the mining award determines the marginal costs rather than the other way round. An economist might find that pretty weird, but that is how it works.

And that is how it works. Economists are used to thinking in terms of prices (ephemeral market stuff) being a function of costs (stuff that is “material” and “real”, like a production function). But the way the Bitcoin protocol works, the hashing costs of the network are a function of the mining award’s market value. I’m not saying it’s a nice feature of the protocol. But it is what it is.

And that’s why all that investment in mining equipment is not a negative externality, at least from the perspective of mining costs.

But there is a different way in which capital investment in mining equipment creates an externality, a way that the authors did not address.

Mining Centralisation

If there is a negative externality to the relentless quest to make mining a positive expected value lottery by investing in new gear that increases GHs/kWh, it’s that the number of mining nodes decreases as a result of this process. Mining becomes concentrated in fewer and fewer hands. To run a profitable mining operation, one must run machines with an above average GHs/kWh and below average electricity cost. This is specialised hardware with limited production runs and requiring a non-trivial capital outlay.

A network made up of a few large mining nodes is basically a centralised system with none of the benefits centralisation might bring. What Satoshi envisioned was a one-cpu-one-vote distributed system, people mining on commodity CPU or GPU hardware. That’s not what has evolved, and this is a serious problem for Bitcoin and other digital currencies with protocols designed along a similar pattern.

It might be tempting to think that the amortised cost of that hardware somehow gets baked into the network’s mining costs, as the authors of the BoE paper do, but those costs are only faced by the miner. It may turn out that the ROI of the latest 28nm mining rigs is negative at current prices. Too bad for the miner who purchased one, but he’ll still mine if the variable costs (the electric bill) are less than the expected mining award. The market value of the hardware itself will decline to the point where the ROI is no longer negative.

And this Knightian boom/bust dynamic raises some questions about the future of mining investment. R&D in specialised mining gear can really go one of two ways. The first scenario is that there will continue to be an edge in capital investment in improving GHs/kWh, in which case capital investment in mining will continue to concentrate mining in few hands, a “bad” outcome.

The other scenario is that the gains from further optimisations reach a stage where they are too costly to be worth it and R&D switches to commoditising the currently most efficient designs, in which case the centralising effects of mining investment go into reverse, a “good” outcome. Whichever way it goes, it is still the case that the mining costs of the network are determined by the market value of the mining award. That’s equilibrium.

So, if there is a negative externality inherent in Bitcoin mining, it is the negative externality of centralisation not of costs.

On the sustainability of low transaction fees

I want to focus on another dimension to this mining cost story. So far we have focused on the role that miners play in hashing blocks. But miners actually do two things: in addition to hashing blocks, they also perform transaction verification. The authors of the BoE paper seem to conflate the two processes:

Low transaction fees for digital currency payments are largely driven by a subsidy that is paid to transaction verifiers (miners) in the form of new currency. The size of this subsidy depends not only on the current price of the digital currency, but also on miners’ beliefs about the future price of the digital currency. Together with the greater competition between miners than exists within centralised payment systems, this extra revenue allows miners to accept transaction fees that are considerably below the expected marginal cost of successfully verifying a block of transactions.

It’s that last sentence I take issue with. The “marginal cost of successfully verifying a block of transactions” is the cost of running the scripts on each TX in the block and verifying the digital signatures. The computational costs here are tiny compared to the cost of hashing the block, which plays no role in TX verification whatsoever. Hashing is there to raise the cost of a Sybil attack, nothing more.

What’s confusing about cryptocurrency is that there are these two different costs, hashing and verification, and two different sources of paying for them: seigniorage (the coinbase award) and TX fees. How the pair of costs and revenues match up is a protocol design consideration.

Costs                                  Revenue
Proof-of-Work (SHA256 hash problem)    Coinbase (25 bitcoins)
Transaction verification               Transaction fees

In the case of Bitcoin, a miner has no control over the size of the coinbase award, but he does control which TXs go into the block he’s currently hashing. So basic economic theory dictates that a miner will include a transaction if and only if the expected value of the TX’s fee is greater than his marginal cost of verifying that transaction. The costs due to proof-of-work do not come into the decision at all.
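That inclusion rule fits in a couple of lines. The per-byte cost figure below is a placeholder, there only to show the shape of the decision:

```python
# Sketch of the miner's inclusion rule: take a TX iff its fee exceeds the miner's
# own marginal verification cost. Proof-of-work costs play no part in the decision.

def select_txs(mempool, marginal_cost_per_byte=1e-8):
    """mempool: list of (fee_btc, size_bytes) pairs. Returns the TXs worth including."""
    return [(fee, size) for (fee, size) in mempool
            if fee > size * marginal_cost_per_byte]
```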

It makes sense to think of proof-of-work and TX verification as two separate subsystems with their own respective sources of financing: proof-of-work financed by coinbase and transaction verification financed by transaction fees. A digital currency protocol could follow this pattern and some do (Ethereum, for example). Bitcoin, however, is different. Its protocol dictates that the coinbase award halves roughly every four years (every 210,000 blocks), with cumulative issuance never exceeding a total of 21m coins, which means that at some point both hashing costs and verification costs must be paid out of TX fees alone.
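Both the four-year halving interval and the 21m cap fall straight out of that schedule; a quick check, assuming 10-minute blocks:

```python
# The 21m cap implied by the halving schedule: 50 BTC per block at launch,
# halving every 210,000 blocks, awards below one satoshi rounded away.

def total_supply():
    supply, award = 0.0, 50.0
    while award >= 1e-8:              # 1 satoshi is the smallest unit
        supply += 210_000 * award
        award /= 2
    return supply

print(total_supply())                 # just under 21,000,000 BTC
print(210_000 * 10 / (60 * 24 * 365)) # ~4.0 years between halvings
```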

I’ve pointed out before that this aspect of Bitcoin’s protocol design is self-defeating in the long run. The market for media of exchange will gravitate towards those systems with the lowest transaction costs, and in the case of proof-of-work digital currencies, that means those protocols that forever subsidise hashing costs with the coin’s seigniorage (no supply cap). And even if that were not the case and Bitcoin remained the dominant digital currency, the protocol will need to change to incorporate a mandatory minimum fee that is sufficiently large to incentivise enough hashing to secure the network. I say “mandatory” because there is a collective action problem here in that an individual miner has no incentive to exclude a transaction whose fee exceeds his marginal verification costs, even if the aggregate effect of this rational behaviour is that the total TX fees are insufficient to support a hashing difficulty that secures the network.

Which brings me to the authors’ bleak conclusion:

The eventual supply of digital currencies is typically fixed, however, so that in the long run it will not be possible to sustain a subsidy to miners. Digital currencies with an ultimately fixed supply will then be forced to compete with other payment systems on the basis of costs. With their higher marginal costs, digital currencies will struggle to compete with centralised systems unless the number of miners falls, allowing the remaining miners to realise economies of scale. A significant risk to digital currencies’ sustained use as payment systems is therefore that they will not be able to compete on cost without degenerating — in the limiting case — to a monopoly miner, thereby defeating their original design goals and exposing them to risk of system-wide fraud.

This is only partly right, and partly right for the wrong reasons. First of all, it’s not digital currencies that face this problem, but a subset of them that, like Bitcoin, eventually require TX fees to shoulder the entire burden of incentivising proof-of-work. But that is an accidental rather than essential feature of digital currencies. So, the conclusion is only partly right because it does not apply to protocols that finance hashing costs with a perpetual coinbase award.

And right for the wrong reasons… this sentence “With their higher marginal costs, digital currencies will struggle to compete with centralised systems unless the number of miners falls, allowing the remaining miners to realise economies of scale” is wrong because the authors have conflated hashing and verification costs.

I’m not sure I get the “economies of scale” thing in transaction processing systems, but perhaps the authors are thinking of the extreme redundancy that distributed systems require. Transaction verification in a distributed system is redundantly performed by every node, so if there are 5,000 verifying nodes on the system, every TX is verified 5,000 times. Compared to a centralised system that only needs to verify a TX once, it would seem that there is a simple economy of scale linear in the number of nodes in the system.

But a centralised system must do much more than verify TX; it must do lots of things that nodes on a distributed system do not have to worry about. The centralised system must protect the server(s) against error and attack, as a centralised system is by definition a system with a single point of failure. You don’t have to be a network security expert to appreciate that this is hostile and difficult technical territory. I can’t offer estimates of what these additional costs are, but I do know that they are a large multiple of transaction verification costs, and involve far more complicated processes. TX verification–parsing the blockchain and doing a bunch of ECDSA signature verifications–is easy and cheap by comparison.

So it is by no means obvious that the total costs of TX verification are lower in a centralised system than in a decentralised or distributed one, and it may in fact be the other way round. But either way, we can say two things with confidence:

  • The costs of distributed TX verification are a small fraction of the fees charged by legacy payment systems. Unlike hashing, this is not a costly computation even when multiplied by a large number of verifying nodes.
  • The costs of distributed TX verification will decline over time with improvements in computational efficiency, bandwidth, etc.

But the one thing that distributed systems must do that centralised systems do not have to worry about is a mechanism for achieving consensus on the authoritative state of the ledger. For Bitcoin and many other digital currencies, this mechanism is hash-based proof-of-work, and it is crucial to appreciate the fact that verification and proof-of-work hashing are separate processes with independent cost functions.

And it may turn out that the proof-of-work blockchain isn’t the best mechanism for achieving consensus anyway. There are other decentralised consensus algorithms used in projects like Ripple, Stellar, and Hyperledger that do not rely on energy intensive hashing problems to achieve consensus.

Proof-of-Work as “manufactured scarcity”

Now that proof-of-work is liberated from the misconception that it is somehow behind TX verification we can bring some really interesting economic properties of proof-of-work into relief.

As long as there is long-term growth in demand for the coin and the coinbase award is perpetual, seigniorage should be more than sufficient to cover the costs of proof-of-work. That idea alone is, I think, really interesting. Here’s what I mean.

There is a long-standing objection to private fiat money schemes advocated by Hayek and others that goes something like this. Media of exchange are near-substitutes. (This may be a false assumption, but let’s go with it and set aside the economics of network effects, etc.) And the marginal costs of producing the media are almost zero, so if a privately produced fiat money is a success, the seigniorage that accrues to the issuer will be substantial. This will invite more and more competition producing more and more media of exchange. Invoke that near-substitutes assumption and, bingo, privately produced money gets driven down to the marginal cost of its production, which is basically zero. Privately produced money is impossible because of free market competition and the near-zero marginal cost of producing it.

In my opinion, the most important innovation of hash-based proof-of-work isn’t its solution to the problem of distributed consensus, for which there are arguably better solutions. Rather, the real innovation is the way in which this energy intensive defence against the Sybil attack makes the marginal cost of proof-of-work fiat money meaningfully non-zero, refuting the argument above. The scheme’s seigniorage doesn’t really accrue to anyone. Instead, it gets burned up in hashing blocks, where the marginal cost of producing a new set of coins equals the cost of solving the hash problem on the block that brings the new coins into existence. There is no coin “issuer”, scarcity comes into existence ex nihilo.

And this “seigniorage burning” isn’t a complete waste, as my metaphor might suggest and as an economist might be quick to label “inefficient”, for it has the side-effect of bootstrapping a solution to the distributed consensus problem and thereby creating a distributed payment system on which the value can be transferred (a coin can’t be scarce if it can be double spent). After all, shouldn’t seigniorage be spent on a public good? I think that this is conceptually beautiful, and it deserves to be a chapter in the micro foundations of money economics, whatever its ultimate fate ends up being. It is the first credible scheme for rationing the supply of privately produced fiat currency.

The dominant narrative to-date has been that digital currencies like Bitcoin have value because of the utility of the distributed payments system combined with an (eventually) fixed coin supply. I think that the latter belief is unfounded. It’s not the fixed supply of a coin that makes it scarce, but rather the marginal cost of producing the coin that makes it so.

There can be demand for coin because of the expectation that it will be demanded more in future and therefore increase in price (speculative demand), and there can be demand for coin because you want to hold a coin balance to facilitate transactions (transactional demand). A coin may embody demand from both sources, but the former must ultimately be grounded in the latter, or else the coin’s value rests on some sort of “greater fool” phenomenon.

The entire history of monetary thinking can probably be told from the perspective of the tension between these two sources of demand, the tension created by a single object embodying the properties of both store-of-value and medium-of-exchange. The pursuit of purchasing power stability in this object isn’t some hubristic policy ideal like the taming of the business cycle or full employment. It is intrinsic to the very idea of money.

A volatile medium-of-exchange is a poor medium-of-exchange, and it is almost inconceivable that a free market would ever converge on a unit of account where the numeraire of all exchange was among the most volatile of assets. Just consider the cost of extracting relative prices in that scenario! We’d have to develop an alternative unit-of-account, which is another way of saying the market would never select such a coin as the unit-of-account in the first place. Trade requires a reliable measuring stick.

This is my favourite paragraph from the BoE paper:

In order to address a need to respond to variation in demand, a more flexible rule would be required. For example, the growth rate of the currency supply could be adjusted to respond to transaction volumes in (close to) real time. Alternatively, a decentralised voting system could be developed. Finally, variant schemes could embrace existing monetary systems by seeking to match official broad money data or to target a fixed exchange rate, although this would require the abandonment of part of the schemes’ original ideology.

A more flexible money supply rule behind digital currencies is required. After all, even commodity money has a somewhat elastic supply function. If the price of gold hovers above the marginal cost of pulling gold out of the ground, more gold supply will hit the market. A digital currency with a deterministic money supply function is not a feature but a limitation of early designs. And a capped supply function like Bitcoin’s is a bug on microeconomic grounds alone, as we discussed above.

But there’s no reason to jump to the conclusion that the “original ideology” must be abandoned in order to implement certain stability schemes. For example, the coinbase award could be made a function of difficulty deflated by the change in GHs/kWh (improvements in hardware efficiency). Such a scheme would keep the coin price of a kWh roughly constant. A fixed exchange rate without abandoning the “original ideology”. Ok, it’s not a complete proposal, but you get the point. We’ve barely scratched the surface of this technology.
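As a rough sketch of that rule (the update formula and variable names are my own illustration, not a worked-out proposal): scale the coinbase by the growth in difficulty deflated by the growth in hardware efficiency, so that the coins awarded per block track the energy actually expended on it.

```python
# Illustrative sketch: coinbase as a function of difficulty deflated by the change
# in GHs/kWh, so that the coin price of a kWh of hashing stays roughly constant.

def next_coinbase(prev_coinbase, difficulty, prev_difficulty,
                  ghs_per_kwh, prev_ghs_per_kwh):
    difficulty_growth = difficulty / prev_difficulty
    efficiency_growth = ghs_per_kwh / prev_ghs_per_kwh
    # More hashes per kWh should not, by itself, inflate the award; only the
    # efficiency-deflated ("real") rise in difficulty does.
    return prev_coinbase * difficulty_growth / efficiency_growth
```

With the award per block proportional to difficulty divided by GHs/kWh, i.e. to the kWh a block consumes, the coin-per-kWh rate is pinned, which is one reading of the “fixed exchange rate” gestured at above.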

One of the (many) ways in which fiat money is weird and counter-intuitive is how it has value in the first place. The stock and bond markets have value because of the NPV of expected future income flows. But the aggregate value of the money stock is like value created out of nothing. It’s the value of pure liquidity.

So I want to offer a variation on the trust-less theme here: nobody can be trusted with the seigniorage generated from this value.

Why I’m bullish on AppCoins

In a post this week Vitalik Buterin challenges some of the objections made against alt-coins. The whole post is worth reading, but I want to highlight what Vitalik says about appcoins:

One of the main attractions of cryptocurrency 2.0 is the idea of “appcoins” – protocols with a currency or token system built in, where the token system generates the emergent value to fund the development of the protocol. Having every protocol add its own currency for funding purposes may seem ugly, but the quantity of potential monetization per unit ugliness is vastly higher than existing solutions, such as making a protocol that is proprietary, charging license fees and excluding users who cannot afford to pay, and releasing “crippleware” apps in order to facilitate monetization or advertising. In the future, using emergent network assets (including non-fungible assets such as namespaces) as a funding mechanism may become the dominant business model for decentralized applications. If Bitcoin was the only game in town, none of this would be possible.

I agree. In a post last month I wrote:

There are non-obvious implications of the AppCoin crowdfunding model, both economic and legal. The current VC/Angel funding model of startups is based on the familiar NPV of expected future earnings. Crowdfunding via an AppCoin will be based on the seigniorage of a literally monetised network. In a future post we’ll discuss how to approach the valuation of such things. We will also discuss some of the favourable legal aspects of crowd funding in a manner that clearly falls outside the scope of securities law.

I promise I’ll address the subtle valuation questions raised by a funding model based on appcoin seigniorage rather than equity investment, but I just quickly want to mention a few other aspects of the appcoin idea.

Appcoins have demand that is endogenous to the services of the network protocol on which the appcoin is based (eg, anonymous browsing, in the hypothetical case of a TorCoin network). Those who are net providers of a scarce resource (say, bandwidth devoted to running a Tor exit node) will have another avenue for acquiring cryptocurrency besides running a mining rig or purchasing coin on an exchange. Those who are net consumers of a useful network resource will need the coin to use the network. Compare this to the demand to hold something like Bitcoin, which is primarily a speculative play on the view that in future a meaningful proportion of conventional payments volume will be done in Bitcoin.

I’m really not convinced of the near-term value proposition of cryptocurrency as a medium-of-exchange for physical goods, Bitcoin as the rival to PayPal and all that Silicon Valley VC jazz (is it the weather over there?). I think it will happen eventually, but this is not the space where cryptocurrency will flourish first IMO.

Cryptocurrency will flourish first in value exchange that does not involve physical delivery: Tor, Bittorrent, gambling, (crypto) cash settled derivatives like CFDs, prediction markets, etc. In these use cases, both sides of a transaction can be handled on-blockchain with limited or no trust. In the case of appcoins, the two-way transaction is endogenous to the network itself.

Ethereum is the most ambitious appcoin project to-date, if we can describe it as an appcoin. It might be better to describe Ethereum as a meta-appcoin, where the service is a Turing complete scripting language in which smart contracts can be written. It will be interesting to see how much of the future appcoin space can be built on top of Ethereum, and how much will live independently of it. The obvious cases of Tor and BitTorrent are likely to be best implemented with their own blockchains (assuming hash-based proof-of-work is the cryptocurrency model that works best there). Use cases like settlement of financial contracts will live on Ethereum, among much else.

If the metric of success is the velocity of trade rather than the RoR of an exchange rate, then the success of cryptocurrency will be won next in the appcoin space.

AppCoins: embedded cryptocurrencies

Naval Ravikant writes:

In economics, the artificially scarce token used to allocate scarce resources is called “money.” So Bitcoin is crowdfunded OSS to run an Economic network. Now, a new generation of Appcoins can be created as open source software, crowdfunded into existence, and go public on day one. They can run networks where Bitcoin may not work, or where separate funding and compensation is needed.

The idea is to embed an application-specific cryptocurrency into a useful network technology to regulate its usage and remunerate its creators:

The Tor network is slow because it relies on volunteers to relay traffic. Anytime we see a line, the product in question is underpriced. Let’s crowdfund a Torcoin – users of relays will pay in Torcoins and operators of relays will get paid in TorCoins. Founding developers collect equity when TorCoins are first mined and sold. Non-founding developers and network operators are paid revenues from newly mined coins and transaction fees.

A P2P technology like Tor is an obvious candidate for AppCoin integration. It suffers from free-rider economics, and network performance would improve if users were incentivised to relay traffic and run exit nodes.

Bittorrent is another technology ripe for AppCoin integration. There is a project underway by the developers of one client to integrate Bitcoin, but I suspect that an application-specific coin would be more appropriate.

I’ve long thought that a grid computing platform like BOINC would be more widely used if it swapped its credit system for an AppCoin.

One of the advantages of an AppCoin is that tight integration with the target technology’s protocol allows for seamless transaction settlement. It’s important to remember that Bitcoin and its variations handle only the payer side of a transaction. Payee performance must be monitored via trust, third-party escrow, 2-of-3 signature transactions, or some other mechanism. For informational goods such as Tor, Bittorrent, grid work units, etc, the overhead of integrating a generic cryptocurrency seems sub-optimal; much better to have a generic mechanism for exchange between different AppCoins, with something like Bitcoin as the reserve/intermediary currency.

Another advantage of the AppCoin is that it will likely get the monetary economics right. In a one-good economy, where the AppCoin buys a digital resource like traffic on a p2p network, it will be obvious to the creators that an AppCoin whose coin supply expands at a rate proportional to the usage of the resource is much better than a supply rule (like Bitcoin’s) that aims for long-term appreciation of the coin’s value.
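A minimal sketch of such a usage-responsive supply rule, purely illustrative (how the issuance is distributed among resource providers, miners, or developers is a separate design question):

```python
# Illustrative sketch: expand the AppCoin supply in proportion to growth in
# measured network usage (e.g. relayed bytes), rather than on a fixed schedule.

def new_coins(current_supply, usage_this_period, usage_last_period):
    """Coins to issue this period so that supply growth tracks usage growth."""
    if usage_last_period <= 0:
        return 0.0
    usage_growth = usage_this_period / usage_last_period - 1.0
    return max(0.0, current_supply * usage_growth)
```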

AppCoins will also provide an array of avenues for the widespread distribution of crypto coins. Unless you have something to sell to those who hold cryptocurrency balances, the only way to acquire cryptocurrency is to mine (economically unfeasible unless you invest in ASIC hardware) or to buy coins on an exchange (counterparty risk, KYC/AML hassles). AppCoins will provide a way for anyone to barter scarce but ubiquitous resources like bandwidth and disk storage for coin.

One interesting, if difficult, dimension to the AppCoin idea is the prospect of moving proof-of-work beyond the current model of cryptographic hash functions. GridCoin tries to integrate hash-based with grid-computation-based proof-of-work. Making these ideas work in a robust way is hard, but I suspect that progress will be made beyond these early experiments, and it will come from the AppCoin space.

There are non-obvious implications of the AppCoin crowdfunding model, both economic and legal. The current VC/Angel funding model of startups is based on the familiar NPV of expected future earnings. Crowdfunding via an AppCoin will be based on the seigniorage of a literally monetised network. In a future post we’ll discuss how to approach the valuation of such things. We will also discuss some of the favourable legal aspects of crowd funding in a manner that clearly falls outside the scope of securities law.

Can we value Bitcoin?

In a post a few weeks ago I wrote:

A more sophisticated defence of Bitcoin’s valuation goes like this. Bitcoin is great SOV not just because of its limited supply and those hashing cost network effects. It’s a great SOV because in future more and more people will use it as a medium-of-exchange (MOE). As the volume of bitcoin transactions increases, so will the demand to hold bitcoin balances for the purpose of making transactions in goods and services. But a total of only 21 million bitcoins will ever be produced, so the price of a bitcoin must reflect the ratio of expected future MOE money demand to 21 million. The price of Bitcoin, one might argue, is the market’s prediction of the long-term growth rate of bitcoin transaction demand.

Now, on Twitter today Marc Andreessen links to a Fortune article citing Stanford economist Susan Athey, who apparently makes an argument virtually identical to the one above:

An anonymous viral email circulating among bitcoin watchers and partisans lays out a few simple hypothetical usage and adoption scenarios, and their consequences for bitcoin’s price. If Amazon.com adopted bitcoin for all payments, its volume of $38 billion, divided by a supply of (at the time of the email’s writing) about 7 million bitcoin, would make each bitcoin worth $5,400. If $300 billion in international remittance was conducted in bitcoin, that volume alone would push the price to $42,000. Adding these, along with online poker and gas station transactions, would lead to a total transaction volume of $602 billion – and a bitcoin, even at today’s expanded supply of 12 million coins, worth $50,000.

“Those numbers are good ones to start with. In some sense, that’s like a maximum,” says Susan Athey, a professor of economics at the Stanford Graduate School of Business who has been studying bitcoin. Few would realistically argue that bitcoin will service 100% of even these silos in the near term, but the volume/supply ratio is the starting point for understanding bitcoin price – as more consumers or organizations choose to use bitcoin, increased volume will drive the price up.

Building from that basic formula, Athey adds a variety of variables to build an analytic framework. The first is velocity – how frequently a bitcoin can be spent. Because bitcoin, unlike paper money, is very low-friction, there’s the possibility of a very high-velocity bitcoin, if, for example, vendors or traders only held bitcoin very briefly, cashing it in and out to government currencies on either end of transfers. That, Athey says, would allow a small volume of bitcoin to process a large volume of payments, keeping the price of bitcoin relatively low.

I’m not privy to this exchange, so I don’t know how much of this argument is attributed to Athey and how much is the Fortune journalist’s own thinking. A bit of Googling turned up this interview with Athey in November of last year:

What do you think about the bitcoin price increases recently? Well, if you expect the volume of transactions to grow a lot, then the exchange rate from dollars to bitcoins has to grow too, because each bitcoin can only be used so many times per day. The market value of all bitcoins has to be enough to support transaction volume. You could interpret the price increases as reflecting increased optimism about the future volume of transactions, driven by China implicitly signaling that it will allow bitcoins to be used for commerce there.

As I pointed out in my previous post, this is a more sophisticated rationalisation of Bitcoin’s valuation than one usually reads. As a cryptocurrency pays no income, the only way to value it fundamentally is in terms of expected future cryptomoney demand (uncertain) in relation to its future supply (deterministic and completely predictable in Bitcoin). By “cryptomoney demand” we mean: crypto coin balances held for the purpose of facilitating transactions in that coin.

Money demand is proportional to the level of transaction volume if velocity–the number of times the coin supply changes hands over the period–is stable. So, if we can make that assumption of stable velocity, the price of Bitcoin today should reflect expectations of future bitcoin transaction volume. Let t be some future time when the growth rate of transaction volume TX(t) levels out, let V(t) be the velocity at time t, and let S(t) be the supply of bitcoin:

price(BTC) = \frac{TX(t)}{V(t)} \times \frac{1}{S(t)}

The calculation cited above arriving at BTC = $50,000 implicitly assumes a money velocity of 1, which goes against the Silicon Valley vision of a Bitcoin-as-payments unit that people swap in and out of via intermediaries like Coinbase and BitPay. In that scenario, velocity will be very high.

Here is a back-of-envelope valuation. Let’s say that t represents year 2024 and that bitcoin transaction growth levels out in about 10yrs time. Now, let’s fix an assumption of velocity at that time. The money velocity of USD M1 is about 7, so I would guess that Bitcoin velocity will be rather higher than that. Let’s just say, arbitrarily, that Bitcoin velocity will be 10X USD money velocity. So BTC = $650, V(t) = 70, and S(t) = 20 million, making TX(t) = $910 billion, almost 6% of the US economy.
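Plugging the numbers in (and checking the velocity-of-1 assumption implicit in the $50,000 figure cited earlier):

```python
# Reproducing the back-of-envelope valuation above. The inputs are the post's
# stated assumptions, not forecasts.

btc_price   = 650.0          # USD per BTC today
velocity    = 70.0           # assumed: 10x the USD M1 velocity of ~7
supply_2024 = 20_000_000     # coins outstanding around 2024

implied_tx_volume = btc_price * velocity * supply_2024
print(implied_tx_volume / 1e9)      # ~910 (billion USD of annual transaction volume)

# The $50,000-per-coin scenario: $602bn of volume over 12m coins assumes velocity = 1.
print(602e9 / 12_000_000)           # ~50,167 USD per coin
```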

The result isn’t totally crazy. Here are the blockchain transaction volume figures for the last four years, converted into USD values at the time of transaction (as calculated by blockchain.info):

year         volume  growth
2010        985,887      0%
2011    418,050,216 42,303%
2012    601,415,369     44%
2013 15,216,615,077  2,430%

Much of that volume will be FX settlements and payments between addresses controlled by a single entity, and those volumes shouldn’t be included in the analysis. How much is difficult to estimate (something that we’ll look into in a future post), but let’s say that half of that volume should be excluded, so the current base is $7.6 billion per year. Annual volume of $910 billion in a decade’s time is a bit over 60% compounded growth per year. In light of recent history, the result is conservative!
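Checking that growth figure:

```python
# Compound annual growth implied by going from $7.6bn to $910bn over ten years.
base, target, years = 7.6e9, 910e9, 10
cagr = (target / base) ** (1 / years) - 1
print(round(cagr * 100, 1))          # ~61.4% per year
```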

The problem with this sort of valuation analysis is that the inputs TX(t) and V(t) are entirely speculative. Assumptions, assumptions, assumptions. You can plug in anything you like. It’s like valuing a new business… but worse. In Bitcoin, S(t) is basically fixed at all future horizons (never more than 21 million), so any change in the market’s assumptions translates into changes in the exchange rate. Bitcoin translates that uncertainty about its future prospects into present exchange rate volatility. And that exchange rate volatility dampens demand today for using bitcoin as a medium-of-exchange, undermining the very assumptions behind its current valuation. To me Bitcoin–not cryptocurrency in general, but Bitcoin–is like one of those M.C. Escher drawings, where the impossible looks deceptively plausible.

The Bitcoin-as-payments people will reply that the volatility doesn’t matter, that I’m wrong in saying that the volatility undermines the transactional demand for bitcoin. Here’s a recent claim by Andreessen:

The criticism that merchants will not accept Bitcoin because of its volatility is also incorrect. Bitcoin can be used entirely as a payment system; merchants do not need to hold any Bitcoin currency or be exposed to Bitcoin volatility at any time. Any consumer or merchant can trade in and out of Bitcoin and other currencies any time they want.

Athey qualifies this position a little (from the same interview above):

What about the extreme volatility? Volatility is bad because it increases frictions—if I just want to send you $100, the exchange rate might change between when I buy the bitcoins and send them to you, and when you receive and cash them out. That creates risk and frictions. But the level of the exchange rate is irrelevant for the efficiency of the payment rail—if I knew it would be $1000/bitcoin all day long, or $100/bitcoin, either way I can buy bitcoins, send them to you, and you can sell them, while avoiding paying exorbitant bank fees. You still incur some fees when getting money in and out, but those are relatively low and should fall over time with competition.

But the irreducible component of the costs facing those merchants Andreessen speaks of is those very “risks and frictions”: they make up the price of offloading that volatility onto someone else. Competition will not reduce those costs any more than competition among options dealers will reduce the price of a put on the S&P 500.

Is the 1% fee that Coinbase charges for the service of offloading exchange rate volatility sufficient to cover their cost of hedging a coin whose USD volatility is more than 7 times that of the stock market? There is a smart bunch of people behind that company, so I’m reluctant to second-guess the business model. But I feel I must… and may do so in detail in a future post.

The implicit assumption behind the comments of Andreessen and Athey is that Bitcoin’s money velocity can be arbitrarily large, a hot potato that gets passed around so quickly that the volatility of the coin can be made negligible to the party using it as medium-of-exchange. But the Bitcoin protocol itself places a lower limit on confirmation times, and therefore an upper limit on Bitcoin’s velocity. Whatever that velocity turns out to be, the interval between the time coin is received and the time it is paid out will impose an irreducible risk on the party who wishes to use Bitcoin to make payments. A risk that is costly to lay off to someone else.

But I am a believer in cryptocurrency; I would just prefer to back a cryptocurrency whose supply was more responsive to its demand, where \Delta S(t) is a function of \Delta TX(t), or a function of the exchange rate itself. This can be done in an entirely trustless way, and such a coin is likely to have a much more stable exchange rate and be a better medium-of-exchange.

Bitcoin, Ethereum and Pigou: the economics of transaction fees

The economics of transaction fees in cryptocurrencies are poorly understood. In a previous post I raised some questions about how using tx fees to compensate for hashing costs (as Bitcoin’s declining coinbase award increasingly does over time) can be incentive-compatible with transaction demand for cryptocurrency. There, I was concerned about the distribution of seigniorage between existing coin holders and hashing costs, and what this implies for tx fees.

A new post on the Ethereum blog focuses on another aspect of transaction fee economics: a tragedy of the transaction verification commons.

The essence of the problem is this. In Bitcoin, tx fees are effectively set by which txs miners choose to include in their blocks. The creator of a tx can pay any fee he chooses, but miners are free to ignore a tx, so a payer who pays a relatively large fee is more likely to have a faster-than-average confirmation time. On the surface, this looks like a market mechanism. But it isn’t. The miner gets the tx fees of every tx included in a block that the miner solves. But every node on the network pays the costs of verifying a transaction; tx must be verified before relaying and before building on top of a solved block. Therefore, a miner will include any tx with a fee in excess of his own computational costs of verifying it (and reassembling the Merkle tree of his block), not the network’s computational costs of verifying it.
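The gap between the two is easy to put in numbers. The per-node cost and node count below are illustrative, in the spirit of the figures in the Ethereum post quoted further down:

```python
# The fee a miner rationally requires covers only his own verification cost,
# while the whole network bears that cost once per node. Figures illustrative.

per_node_cost = 0.00001      # USD for one node to verify and store one tx
n_nodes       = 5000         # verifying nodes on the network

miner_breakeven_fee = per_node_cost
network_wide_cost   = per_node_cost * n_nodes
print(miner_breakeven_fee, network_wide_cost)   # 1e-05 vs 0.05
```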

A single, very large block containing many transactions with many inputs/outputs can bog down the network. To deal with this, the Bitcoin protocol imposes a 1MB upper limit on the size of a block. This isn’t a great solution. Not only does it put an upper limit on the number of tx Bitcoin can process per unit of time, it does nothing to rationalise tx fees to tx verification costs.

It’s like an airline that puts a 1,000 suitcase limit (irrespective of size/weight) on luggage per flight, and deals with the problem of >1,000 suitcases by prioritising those passengers who volunteered to pay a fee. Those who pay the lowest fees (or none) have their bags kicked off the flight and placed in a queue for inclusion on subsequent flights (which employ the same 1,000 suitcase limit). What will eventually happen is that those with big, heavy bags will pay the highest fees and have their bags included on the flight, as those fees will still be lower than the actual cost of shipping the luggage. Those with small, light bags will get kicked off, unless the passenger is willing to pay more than the marginal cost of shipping his bag. If airlines are a competitive market, those passengers will eventually just choose to travel on a different airline that doesn’t ask them to subsidise pack rats.

From Ethereum’s post:

The question is, is this kind of market the right model for Bitcoin transactions? To answer this question, let us try to put all of the players into roles. The resource is the service of transaction processing, and the people benefiting from the resource, the transaction senders, are also the buyers paying transaction fees. So far, so good. The sellers are obviously the miners. But who is incurring the costs? Here, things get tricky. For each individual transaction that a miner includes, the costs are borne not just by that miner, but by every single node in the entire network. The cost per transaction is tiny; a miner can process a transaction and include it in a block for less than $0.00001 worth of electricity and data storage. The reason why transaction fees need to be high is because that $0.00001 is being paid by thousands of nodes all around the world.

It gets worse. Suppose that the net cost to the network of processing a transaction is close to $0.05. In theory, even if the costs are not borne by exactly the same people who set the prices, as long as the transaction fee is close to $0.05 the system would still be in balance. But what is the equilibrium transaction fee going to be? Right now, fees are around $0.09 simply because miners are too lazy to switch. But then, in the future, what happens once fees become a larger share of a miner’s revenue and miners have a large incentive to try to maximise their take? The obvious answer is, for a solo miner the equilibrium transaction fee is $0.00001. If a transaction with a fee of $0.00002 comes in, and the miner adds it, the miner will have earned a profit of $0.00001, and the remaining $0.04999 worth of costs will be paid by the rest of the network together – a cryptographic tragedy of the commons.

The Ethereum guys have defined the problem clearly. And I’m not encouraged by what (I think?) is the current thinking of the Bitcoin developers in dealing with this problem. From the Bitcoin Foundation’s blog:

I’ve been working on teaching the wallet code to estimate how low a fee (or priority) a transaction needs, at the moment it is sent, to be accepted by miners and included in the next block or three. The estimates are based on watching transactions as they are broadcast on the network and keeping track of which of those transactions are accepted into blocks.

The danger with estimating transaction fees is miners have an incentive to try to game the estimate to make transaction fees higher. For example, if the estimate was based on the average transaction fee for all transactions in the last N blocks, miners could add very-high-fee pay-to-self transactions to the blocks that they mine to drive up the average. However, by only considering fees for transactions that have been broadcast on the network that threat is eliminated– miners could broadcast very-high-fee pay-to-self transactions, but would end up paying those high transaction fees to other miners. The transaction estimation code also uses median transaction fees, not averages, to make it much harder for a minority of transactions to influence transaction fees.
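
For what it's worth, the estimator described there amounts to something like the sketch below (not the actual wallet code; "observed" is assumed to be a data frame that a node builds by watching transactions relayed on the network and noting which of them later appear in blocks):

# Sketch of the estimator described above: record the fee rate of every tx seen
# relayed on the network, keep those that end up in recent blocks, and take a
# median so that a handful of outlier fees cannot drag the estimate around.
# "observed" is assumed to have columns: fee_per_kb, seen_on_network, block.
estimate_fee <- function(observed, last_n_blocks = 6) {
  confirmed <- observed[observed$seen_on_network & !is.na(observed$block), ]
  recent    <- confirmed[confirmed$block > max(confirmed$block) - last_n_blocks, ]
  median(recent$fee_per_kb)
}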

But this won’t work in the end, for even a perfect estimate, uncontaminated by strategic behaviour from miners, will still be an estimate of the marginal cost of tx verification faced by a single miner, not by the network as a whole.

It’s not surprising that the Ethereum developers have cut to the core of this problem. In Bitcoin, you can at least be sure that the execution of scriptSig and scriptPubKey will halt after time proportional to tx size. Not so with Ethereum’s Turing-complete scripting language. For Ethereum, the problem of rationing network resources over tx verification and contract computation is acute. The project simply will not work without an economically equilibrating solution to this problem.

Their current thinking is that tx fees should be destroyed (no recipient) and calculated along the lines of a Pigovian tax via some mechanism of miner or ether-holder consensus. I’m not convinced that this will work, but this post gives me confidence that the guys behind Ethereum are taking the economics of crypto seriously. Let’s all pitch in and help them solve this problem.

Ethereum: Turing-complete, programmable money

Ethereum is a new cryptocurrency project started by Vitalik Buterin, Charles Hoskinson, and others. Part of that project is a new currency, called “ether”, but this is NOT another alt-coin. In my opinion, it’s the most interesting project in the crypto space since the introduction of Bitcoin itself. Assuming that it works, that is. The testnet was just released.

So what is Ethereum? In some respects, its design is similar to Bitcoin. Miners hash blocks of transactions and are rewarded in newly-created ether coins. It uses a new proof-of-work hashing algorithm called “Dagger”, which is, like Scrypt (the hashing algo used by Litecoin and most alt-coins), designed to be Memory-Hard. The developers are also experimenting with a new proof-of-stake mechanism called “slasher”, but for proof-of-work the intention seems to be to promote a research effort to create a Memory-Hard algorithm that will be resistant to dedicated hardware like ASICs. The blockchain protocol is also different; Ethereum will use a variant of the new GHOST protocol, which should allow for a much shorter time interval between blocks.

So far, that just sounds like a state-of-the-art alt-coin. The real innovation is Ethereum’s Turing-complete scripting language. This is very cool, as it implements a new entity on the network, a programmable contract. From their whitepaper:

A contract is essentially an automated agent that lives on the Ethereum network, has an Ethereum address and balance, and can send and receive transactions. A contract is “activated” every time someone sends a transaction to it, at which point it runs its code, perhaps modifying its internal state or even sending some transactions, and then shuts down.

In Bitcoin (and the alt-coins), tx are generated and received by addresses. In Ethereum, contracts too can generate and receive tx. This creates endless possibilities. For example, in Ethereum, one could create a CFD (contract for difference). From an article by Buterin:

Each Ethereum contract has its own internal scripting code, and the scripting code is activated every time a transaction is sent to it. The scripting language has access to the transaction’s value, sender and optional data fields, as well as some block data and its own internal memory, as inputs, and can send transactions. To make a CFD, Alice would create a contract and seed it with $1000 worth of cryptocurrency, and then wait for Bob to accept the contract by sending a transaction containing $1000 as well. The contract would then be programmed to start a timer, and after 30 days Alice or Bob would be able to send a small transaction to the contract to activate it again and release the funds.

In the CFD example, I think the idea is something like this. Alice wants to bet on the change in next quarter’s US GDP. She creates a contract that includes a formula like PAYOUT = (ALICE_PREDICTION - (GDP1Q2014 / GDP4Q2013 - 1)) * GEARING and funds it with 10,000 ether. This is like a limit order. The script in the contract specifies that anyone who sends 10,000 ether to this contract will take the other side of this trade. The script also contains the public key of an “oracle”, e.g. a trusted website that publishes economic stats for the purpose of authoritatively fixing the settlement value of CFDs. After X days the script consults the oracle, pays it a small fee, and gets a signed value for GDP1Q2014, which the script checks against the oracle’s public key. The script then computes the formula and sends Alice max(10000 + PAYOUT, 0) ether and Bob max(10000 - PAYOUT, 0) ether.
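
In ordinary code rather than Ethereum script, my reading of the settlement logic looks roughly like this (a sketch; the clamp simply recognises that neither side can be paid more than the 20,000 ether sitting in the contract):

# Settlement logic of the hypothetical GDP CFD, as I read it.
settle_cfd <- function(alice_prediction, gdp_new, gdp_old, gearing, stake = 10000) {
  realised <- gdp_new / gdp_old - 1                      # realised GDP growth
  payout   <- (alice_prediction - realised) * gearing    # positive if growth came in below Alice's number
  payout   <- max(min(payout, stake), -stake)            # neither side can lose more than their 10,000 ether stake
  c(alice = stake + payout, bob = stake - payout)        # ether released to each party
}

# e.g. Alice predicted 1% growth, realised growth was 0.5%, gearing of 100,000:
settle_cfd(alice_prediction = 0.01, gdp_new = 100.5, gdp_old = 100, gearing = 1e5)
# alice gets 10500 ether, bob gets 9500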

Some other contract types that have been suggested:

  • Multisignature escrows
  • Savings accounts
  • Peer-to-peer gambling
  • New currencies within Ethereum

As Buterin says:

This is the advantage of Ethereum code: because the scripting language is designed to have no restrictions except for a fee system, essentially any kind of rules can be encoded inside of it. One can even have an entire company manage its savings on the blockchain, with a contract saying that, for example, 60% of the current shareholders of a company are needed to agree to move any funds (and perhaps 30% can move a maximum of 1% per day). Other, less traditionally capitalistic, structures are also possible; one idea is for a democratic organisation with the only rule being that two thirds of the existing members of a group must agree to invite another member.

Interesting stuff. Bitcoin’s advocates have always emphasised that Bitcoin is a decentralised payments system as well as a currency, and have gone to great lengths to build richer forms of financial exchange on top of it, so that assets other than bitcoin can be traded on the blockchain. Coloured coins and protocols like Mastercoin are the best examples of these efforts.

The Ethereum guys share the goals of these projects, but have a very different view about what the underlying technology needs to be to make them happen.

…as far as being an effective low-level protocol is concerned, Bitcoin is less effective; rather than being like a TCP on top of which one can build HTTP, Bitcoin is like SMTP: a protocol that is good at its intended task (in SMTP’s case email, in Bitcoin’s case money), but not particularly good as a foundation for anything else.

What makes Ethereum more like TCP and Bitcoin more like SMTP is that the former contains a Turing-complete scripting system whilst the latter does not. Bitcoin’s scripting system was deliberately made not Turing-complete in order to protect the network’s peers from malicious and buggy code. Instead of restricting the scripting language to deal with this problem, Ethereum uses an economic solution: tx and contract fees.

Ethereum will be like a giant distributed computer that automates all sorts of useful financial processes, as well as hashing tx blocks to define a distributed ledger, as Bitcoin’s network does. But on Ethereum, contracts will have to pay fees to have their computations done, to compensate peers for the resources consumed in running the contracts, and to make error and malice costly. My guess is that much rides on how effective this solution turns out to be. Bitcoin is very robust. It is also much less complex because it does not have Turing-complete scripting.
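
Very roughly, the fee mechanism works like a meter running against the contract's balance. A cartoon of the idea (the flat per-step price and the halting rule here are my simplification, not Ethereum's actual fee schedule):

# Metered execution: every computational step costs the contract a fee, and
# execution halts the moment the contract can no longer pay for the next step.
run_contract <- function(steps_requested, balance, step_price) {
  steps_run <- 0
  while (steps_run < steps_requested && balance >= step_price) {
    balance   <- balance - step_price    # fee per step, compensating the peers doing the work
    steps_run <- steps_run + 1
  }
  list(steps_run = steps_run, balance_left = balance,
       halted_early = steps_run < steps_requested)
}

run_contract(steps_requested = 1000, balance = 50, step_price = 1)
# runs 50 steps and halts: an infinite loop simply runs out of money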

I will blog more on Ethereum as I learn more about it. I wish this project much success. The concept is brilliant if it actually works.

Thomas Schelling and Dr. Strangelove

An interview with Thomas Schelling about his role in the creation of one of my favorite movies, “Dr. Strangelove”. HKS reports,

Kubrick travelled to Cambridge to meet with Schelling and George. The three spent an afternoon wrestling with a considerable plot hole: when “Red Alert” was written in 1958, inter-continental ballistic missiles were not much of a consideration in a potential U.S.-Soviet showdown. But by 1962, ICBMs had made much of the book’s plot points impossible. The speed at which a missile strike could occur would offer no time for the plot to unfold. “We had a hard time getting a war started,” said Schelling.

Hence, the B-52s?

Is Bitcoin volatility really in decline?

Eli Dourado has a great blog that covers a lot of issues concerning cryptocurrency; you should follow it if you don’t already. In a new post he reports that Bitcoin volatility has been trending down.

I calculated Bitcoin’s historical volatility using price data from Mt. Gox (downloaded from Blockchain.info), which is the only consistent source of pricing data over a long period. There is a clear trend of falling volatility over time, albeit with some aberrations in recent months. The trend is statistically significant: a univariate OLS regression yields a t-score on the date variable of 15.

But the claim that “there is a clear trend of falling volatility over time” isn’t defensible at all. Before I explain why I don’t agree with Eli, let me first replicate his analysis.
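
Roughly, the replication goes like this (a sketch: btc is assumed to be a data frame of daily Mt. Gox closing prices from Blockchain.info with date and price columns, and the 30-day rolling standard deviation of daily returns is my assumption for the volatility measure):

# Daily returns and a 30-day rolling standard deviation as the volatility series.
btc$ror <- c(NA, diff(btc$price) / head(btc$price, -1))
roll_sd <- function(x, n = 30) {
  sapply(seq_along(x), function(i) if (i < n) NA else sd(x[(i - n + 1):i]))
}
btc$vol <- roll_sd(btc$ror)

# Volatility regressed on a simple time index.
fit <- lm(vol ~ seq(1, length(date)), data = btc)
summary(fit)$coefficients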

bitcoin-vol1.png

My OLS regression calibrates with Eli’s, so we’re on the same page:

                          Estimate   Std. Error   t value      Pr(>|t|)
(Intercept)           1.203775e-01 3.351178e-03  35.92095 3.489853e-194
seq(1, length(date)) -8.056651e-05 4.666856e-06 -17.26355  5.281210e-60

Putting that into English, the coefficient of the regression line is saying that volatility declines by about .00008 a day, or about 3 percentage points annually. Interpret that however you want.

And what is the daily volatility of BTC/USD?

    Min.  1st Qu.   Median     Mean  3rd Qu.     Max. 
0.007301 0.027890 0.048980 0.070220 0.082930 0.355300

About 5% per day. That’s pretty wild stuff, considering that the volatility of the S&P 500 is about 0.7% per day. But patience, one might say: the trend line predicts that BTC/USD volatility is in decline.

I don’t like using trend lines in analysing financial timeseries. Let me show you why. Here is a plot of the coefficient of the same regression, but on a rolling 2-year window.

bitcoin-vol2.png
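
For reference, the rolling coefficient plotted above can be computed along these lines (a sketch reusing the btc frame from the earlier snippet, with 730 daily rows standing in for two years):

# Slope of the vol-on-time regression, re-estimated over a rolling 2-year window.
window <- 730   # roughly two years of daily observations
rolling_slope <- sapply(window:nrow(btc), function(i) {
  w <- btc[(i - window + 1):i, ]
  coef(lm(vol ~ seq_len(nrow(w)), data = w))[2]
})
plot(btc$date[window:nrow(btc)], rolling_slope, type = "l",
     xlab = "window end date", ylab = "slope of volatility trend")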

This tells a very different story. The slope of that regression line flattens out and eventually changes sign, as the early months of BTC/USD trading fall out of the sample period. Here’s how the chart looks running that regression on the last two years of data.

bitcoin-vol3.png

The trend reverses once the early stuff falls out of the sample. And there is good reason to exclude those early months from our analysis. Look at this chart of daily USD trade volume for those BTC/USD rates.

bitcoin-vol4.png

The price and volume series start around mid-August 2010, but the volumes are really tiny for the first 8 months. And I mean tiny: the median daily volume is about $3,300. Volumes get into the 5 and 6 digits after 13 April 2011, when BTC/USD broke parity.

And before you say “that’s what we would expect, volatility declining as volumes pick up”, look at those previous two charts. Volatility has been increasing as volume increases, if you exclude the rinky-dink period with sub-5-digit trading volumes.

Anyway, timeseries on thinly traded assets are notoriously unreliable. Those skyscraper patterns in the first chart are a good hint that there’s some dodgy data in there. For example, look at row 30:

                  date   price     volume         ror        vol
28 2010-09-13 19:15:05 0.06201   92.76696 -0.04598532 0.02973015
29 2010-09-14 19:15:05 0.06410 1293.53800  0.03370424 0.03019624
30 2010-09-15 19:15:05 0.17500 1035.82500  1.73010920 0.32375538
31 2010-09-16 19:15:05 0.06190   51.31510 -0.64628571 0.34284369
32 2010-09-17 19:15:05 0.06090  252.73500 -0.01615509 0.34271839

On September 15, 2010 we see a 173% daily return, followed by a -65% return the following day, when the price basically returned to the levels it was trading at on the 14th. Bad data point? Probably, but with these tiny volumes, does the question even matter? This part of the series is junk.

One way of handling these issues is to prefer a more robust estimator of volatility, like Mean Absolute Deviation (this is a common practice in trading systems research). So let’s re-run the OLS we started off with–including the rinky-dink period–but this time using 30-day rolling MAD instead of SD.
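
Concretely, something like this (the same sketch as before, with mean absolute deviation swapped in for the standard deviation):

# Mean absolute deviation of returns over a 30-day rolling window, in place of the rolling SD.
roll_mad <- function(x, n = 30) {
  sapply(seq_along(x), function(i)
    if (i < n) NA else mean(abs(x[(i - n + 1):i] - mean(x[(i - n + 1):i]))))
}
btc$mad <- roll_mad(btc$ror)
summary(lm(mad ~ seq(1, length(date)), data = btc))$coefficients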

bitcoin-vol5.png

Bummer, the trend disappears. Let’s look at it another way. A plot of daily returns is always a good visual check. (I stripped out those two dodgy data points we looked at above.)

bitcoin-vol6.png

You can clearly see that the two largest one-day declines happened within the last 12 months. In fact, 4 of the 5 largest one-day losses happened in 2013, and those were multi-million-dollar volume days.

           date   price   volume      ror      vol      mad
968  2013-04-12  83.664 34740413 -0.47654 0.136859 0.075185
967  2013-04-11 159.830 38009457 -0.32722 0.097551 0.069511
973  2013-04-17  79.942 25542665 -0.27551 0.154853 0.086190
1201 2013-12-07 767.777 83625810 -0.26372 0.108006 0.091285
84   2010-11-08   0.370    34758 -0.26000 0.227176 0.064088

And the 5 largest gains? All over two and a half years ago.

          date price    volume     ror     vol      mad
169 2011-02-01  0.95   70422.8 0.90000 0.17375 0.041508
69  2010-10-24  0.19    2612.9 0.74296 0.17274 0.018748
83  2010-11-07  0.50   44081.4 0.72413 0.22559 0.067911
296 2011-06-08 31.91 3238531.0 0.67418 0.17172 0.073743
257 2011-04-30  4.15  349701.7 0.53702 0.13500 0.048596
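
(Both tables are just the returns series sorted by ror; from the same btc frame, something like:)

head(btc[order(btc$ror), ], 5)    # the five largest one-day losses
head(btc[order(-btc$ror), ], 5)   # the five largest one-day gains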

Now, I wonder, what charts the FX guys at Coinbase are looking at…

A quick look at Bitcoin transaction volume

A common rationale for owning Bitcoin is that its logarithmic money supply makes it a good store-of-value (SOV). Like precious metals or rare paintings stored in Swiss vaults, the scarcity of those coins will ensure that they at least keep their value.

By itself, this argument is hopelessly naive, as there is nothing scarce about a cryptocurrency with a fixed terminal money supply; anyone can (and a great many have) fork Bitcoin and create another such currency, so the total supply of such coins is potentially unlimited. But it could be replied that there are powerful network effects here, that the demand for digital SOV will coordinate around just one or two “crypto gold” stocks. In a previous post I argued that the continuous hashing costs required to make the p2p network secure would indeed imply such a network effect, but that the inability of a log coin supply to finance these hashing costs out of seigniorage after the money supply stops growing casts doubt on the sustainability of this spontaneous digital gold enterprise.

A more sophisticated defence of Bitcoin’s valuation goes like this. Bitcoin is a great SOV not just because of its limited supply and those hashing-cost network effects. It’s a great SOV because in future more and more people will use it as a medium-of-exchange (MOE). As the volume of bitcoin transactions increases, so will the demand to hold bitcoin balances for the purpose of making transactions in goods and services. But a total of only 21 million bitcoins will ever be produced, so the price of a bitcoin must reflect the ratio of expected future MOE money demand to 21 million. The price of Bitcoin, one might argue, is the market’s prediction of the long-term growth rate of bitcoin transaction demand.
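
To see the arithmetic behind that thesis, with a purely illustrative number for future MOE demand (an assumption for the sake of the example, not a forecast):

expected_moe_demand <- 50e9   # hypothetical: $50bn of bitcoin balances held for transactional purposes
terminal_supply     <- 21e6   # the total number of bitcoins that will ever exist
expected_moe_demand / terminal_supply   # implied price: roughly $2,381 per bitcoin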

So let’s set aside the theoretical objection to this thesis and look at it empirically. Is there evidence of transaction growth to date that would rationalise Bitcoin’s valuation if we extrapolate recent tx growth?

Here are the daily transaction volumes and BTC/USD fx volumes aggregated from the main exchanges.

bitcoin-tx-volume.png

Just eyeballing this chart, it looks to me like there is very little transaction growth except for the periods at the end of the first and fourth quarters, when there were dramatic revaluations in the exchange rate. And the explanation that leaps to my mind for those spikes in tx volume is that they come from the settlement of fx trades for buy-and-hold positions in bitcoin, plus a good deal of Chinese evasion of capital controls via CNY -> BTC -> USD, GBP, EUR…

But a more bullish story could be told. The revaluation of Bitcoin might have had a large wealth effect, with early Bitcoin adopters spending some of their increasingly dear hoard on weed and alpaca socks, and the revaluation was itself due in large part to newcomers buying bitcoin for the purpose of buying stuff with it.

btc-usd-2013.png

Transactions on the blockchain that are settling an fx trade should be excluded from our calculation of bitcoin transaction growth. For every buy, there is a sell, so these transactions cannot represent new transaction demand by definition.

The data series used in these charts come from blockchain.info, and unfortunately, it only has fx volume for BTC/USD. Ideally, we’d want the volume figures for BTC vs EUR, GBP, CNY, JPY, and others so that we could add them all up and subtract total fx volume from the transaction series to get a truer picture of underlying transaction growth. If anyone can point me to where I can get those data easily, I’ll run the analysis.

Until then, here are month-on-month growth figures (in USD) for total Bitcoin transaction volume and BTC/USD trade volume.

 yearmm     tx     fx
 201301  0.554  1.105
 201302  0.620  0.673
 201303  1.304  2.188
 201304  1.765  3.349
 201305 -0.388 -0.580
 201306 -0.318 -0.521
 201307  0.265 -0.183
 201308 -0.161 -0.323
 201309  0.112 -0.109
 201310  0.681  1.196
 201311  3.847  4.150
 201312 -0.015  0.207

Average Monthly Growth (2013)
   tx    fx 
0.413 0.418
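
For the record, the table above can be reproduced along these lines (a sketch, assuming a data frame daily with a date column and daily USD volumes tx_usd and fx_usd taken from blockchain.info):

# Aggregate the daily USD volumes to calendar months and compute month-on-month growth.
daily$yearmm <- format(daily$date, "%Y%m")
monthly <- aggregate(cbind(tx_usd, fx_usd) ~ yearmm, data = daily, FUN = sum)

mom_growth <- function(x) c(NA, diff(x) / head(x, -1))
monthly$tx <- mom_growth(monthly$tx_usd)
monthly$fx <- mom_growth(monthly$fx_usd)

y2013 <- monthly[substr(monthly$yearmm, 1, 4) == "2013", ]
round(colMeans(y2013[, c("tx", "fx")], na.rm = TRUE), 3)   # average monthly growth over 2013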

These data are not conclusive. You could argue that the roughly 40% monthly tx growth is impressive evidence of underlying transaction growth. Or you could interpret the roughly identical growth rates of tx and fx volume, and their high monthly correlation, as evidence that most of the tx growth is due to fx settlements. We need a complete fx volume series to disambiguate the data. When we do that, my bet is that monthly tx growth is under 20%.