AppCoins: embedded cryptocurrencies

Naval Ravikant writes:

In economics, the artificially scarce token used to allocate scarce resources is called “money.” So Bitcoin is crowdfunded OSS to run an Economic network. Now, a new generation of Appcoins can be created as open source software, crowdfunded into existence, and go public on day one. They can run networks where Bitcoin may not work, or where separate funding and compensation is needed.

The idea is to embed an application-specific cryptocurrency into a useful network technology to regulate its usage and remunerate its creators:

The Tor network is slow because it relies on volunteers to relay traffic. Anytime we see a line, the product in question is underpriced. Let’s crowdfund a Torcoin – users of relays will pay in Torcoins and operators of relays will get paid in TorCoins. Founding developers collect equity when TorCoins are first mined and sold. Non-founding developers and network operators are paid revenues from newly mined coins and transaction fees.

A P2P technology like Tor is an obvious candidate for AppCoin integration. It suffers from free-rider economics, and network performance would improve if users were incentivised to relay traffic and run exit nodes.

Bittorrent is another technology ripe for AppCoin integration. There is a project underway by the developers of one client to integrate Bitcoin, but I suspect that an application-specific coin would be more appropriate.

I’ve long thought that a grid computing platform like BOINC would be more widely used if it swapped its credit system for an AppCoin.

One of the advantages of an AppCoin is that tight integration with the target technology’s protocol allows for seamless transaction settlement. It’s important to remember that Bitcoin and its variations handle only the payer side of a transaction. Payee performance must be monitored via trust, third-party escrow, 2-of-3 signature transactions, or some other mechanism. For informational goods such as Tor, Bittorrent, grid work units, etc, the overhead of integrating a generic cryptocurrency seems sub-optimal; much better to have a generic mechanism for exchange between different AppCoins, with something like Bitcoin as the reserve/intermediary currency.

Another advantage of the AppCoin is that it will likely get the monetary economics right. In a one-good economy, where the AppCoin buys a digital resource like traffic on a p2p network, it will be obvious to the creators that an AppCoin whose coin supply expands at a rate proportional to usage of the resource is much better than a supply rule (like Bitcoin’s) that aims for long-term appreciation of the coin’s value.
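One stylised way to write such a supply rule (the notation is mine, not that of any existing AppCoin; k is an arbitrary issuance parameter and Q(t) is the metered resource usage, e.g. bytes relayed, in period t):

\Delta S(t) = k \cdot Q(t)

so the coin supply grows in step with real demand for the underlying resource, rather than on a fixed schedule.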

AppCoins will also provide an array of avenues for the widespread distribution of crypto coins. Unless you have something to sell to those who hold cryptocurrency balances, the only way to acquire cryptocurrency is to mine it (economically infeasible unless you invest in ASIC hardware) or buy it on an exchange (counterparty risk, KYC/AML hassles). AppCoins will provide a way for anyone to barter scarce but ubiquitous resources like bandwidth and disk storage for coin.

One interesting, if difficult, dimension to the AppCoin idea is the prospect of moving proof-of-work beyond the current model of cryptographic hash functions. GridCoin tries to integrate hash-based with grid-computation-based proof-of-work. Making these ideas work in a robust way is hard, but I suspect that progress will be made beyond these early experiments, and that it will come from the AppCoin space.

There are non-obvious implications of the AppCoin crowdfunding model, both economic and legal. The current VC/Angel funding model of startups is based on the familiar NPV of expected future earnings. Crowdfunding via an AppCoin will be based on the seigniorage of a literally monetised network. In a future post we’ll discuss how to approach the valuation of such things. We will also discuss some of the favourable legal aspects of crowdfunding in a manner that clearly falls outside the scope of securities law.

Can we value Bitcoin?

In a post a few weeks ago I wrote:

A more sophisticated defence of Bitcoin’s valuation goes like this. Bitcoin is great SOV not just because of its limited supply and those hashing cost network effects. It’s a great SOV because in future more and more people will use it as a medium-of-exchange (MOE). As the volume of bitcoin transactions increases, so will the demand to hold bitcoin balances for the purpose of making transactions in goods and services. But a total of only 21 million bitcoins will ever be produced, so the price of a bitcoin must reflect the ratio of expected future MOE money demand to 21 million. The price of Bitcoin, one might argue, is the market’s prediction of the long-term growth rate of bitcoin transaction demand.

Now, on Twitter today Marc Andreessen links to a Fortune article citing Stanford economist Susan Athey, who apparently makes an argument virtually identical to the one above:

An anonymous viral email circulating among bitcoin watchers and partisans lays out a few simple hypothetical usage and adoption scenarios, and their consequences for bitcoin’s price. If Amazon.com adopted bitcoin for all payments, its volume of $38 billion, divided by a supply of (at the time of the email’s writing) about 7 million bitcoin, would make each bitcoin worth $5,400. If $300 billion in international remittance was conducted in bitcoin, that volume alone would push the price to $42,000. Adding these, along with online poker and gas station transactions, would lead to a total transaction volume of $602 billion – and a bitcoin, even at today’s expanded supply of 12 million coins, worth $50,000.

“Those numbers are good ones to start with. In some sense, that’s like a maximum,” says Susan Athey, a professor of economics at the Stanford Graduate School of Business who has been studying bitcoin. Few would realistically argue that bitcoin will service 100% of even these silos in the near term, but the volume/supply ratio is the starting point for understanding bitcoin price – as more consumers or organizations choose to use bitcoin, increased volume will drive the price up.

Building from that basic formula, Athey adds a variety of variables to build an analytic framework. The first is velocity – how frequently a bitcoin can be spent. Because bitcoin, unlike paper money, is very low-friction, there’s the possibility of a very high-velocity bitcoin, if, for example, vendors or traders only held bitcoin very briefly, cashing it in and out to government currencies on either end of transfers. That, Athey says, would allow a small volume of bitcoin to process a large volume of payments, keeping the price of bitcoin relatively low.

I’m not privy to this exchange, so I don’t know how much of this argument should be attributed to Athey and how much is the Fortune journalist’s own thinking. A bit of Googling turned up this interview with Athey from November of last year:

What do you think about the bitcoin price increases recently? Well, if you expect the volume of transactions to grow a lot, then the exchange rate from dollars to bitcoins has to grow too, because each bitcoin can only be used so many times per day. The market value of all bitcoins has to be enough to support transaction volume. You could interpret the price increases as reflecting increased optimism about the future volume of transactions, driven by China implicitly signaling that it will allow bitcoins to be used for commerce there.

As I pointed out in my previous post, this is a more sophisticated rationalisation of Bitcoin’s valuation than one usually reads. As a cryptocurrency pays no income, the only way to value it fundamentally is in terms of expected future cryptomoney demand (uncertain) in relation to its future supply (deterministic and completely predictable in Bitcoin). By “cryptomoney demand” we mean: crypto coin balances held for the purpose of facilitating transactions in that coin.

Money demand is proportional to the level of transaction volume if velocity (the number of times the coin supply changes hands over the period) is stable. So, if we can make that assumption of stable velocity, the price of Bitcoin today should reflect expectations of future bitcoin transaction volume. Let t be some future time when the growth rate of transaction volume TX(t) levels out, let V(t) be the velocity at time t, and let S(t) be the supply of bitcoin:

price(BTC) = \frac{TX(t)}{V(t)} \times \frac{1}{S(t)}

The calculation cited above that arrives at BTC = $50,000 implicitly assumes a money velocity of 1, which goes against the Silicon Valley vision of a Bitcoin-as-payments unit that people swap in and out of via intermediaries like Coinbase and BitPay. In that scenario, velocity will be very high.

Here is a back-of-envelope valuation. Let’s say that t represents the year 2024 and that bitcoin transaction growth levels out in about 10 years’ time. Now, let’s fix an assumption of velocity at that time. The money velocity of USD M1 is about 7, so I would guess that Bitcoin velocity will be rather higher than that. Let’s just say, arbitrarily, that Bitcoin velocity will be 10X USD money velocity. So BTC = $650, V(t) = 70, and S(t) = 20 million, making TX(t) = $910 billion, almost 6% of the US economy.
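A minimal Python sketch of that arithmetic (the inputs are just the assumptions stated above, not data):

# Back-of-envelope: invert price(BTC) = TX(t) / (V(t) * S(t)) to get the implied
# transaction volume that today's price is "pricing in".
price_btc = 650.0        # USD per BTC today
velocity = 70.0          # assumed velocity in 2024 (10x USD M1 velocity of ~7)
supply = 20_000_000      # assumed coins outstanding in 2024

implied_tx_volume = price_btc * velocity * supply
print(f"Implied annual transaction volume: ${implied_tx_volume / 1e9:.0f} billion")
# -> roughly $910 billion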

The result isn’t totally crazy. Here are the blockchain transaction volume figures for the last four years, converted into USD values at the time of transaction (as calculated by blockchain.info):

year         volume  growth
2010        985,887      0%
2011    418,050,216 42,303%
2012    601,415,369     44%
2013 15,216,615,077  2,430%

Much of that volume will be FX settlements and payments between addresses controlled by a single entity, and those volumes shouldn’t be included in the analysis. How much is difficult to estimate (something that we’ll look into in a future post), but let’s say that half of that volume should be excluded, so the current base is 7.6 billion per year. Annual volume of $910 billion in a decade’s time implies a bit over 60% compounded growth per year. In light of recent history, the result is conservative!
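The implied growth rate, worked out explicitly (same assumptions as above):

base = 7.6e9      # today's "real" annual tx volume, after excluding half as fx/self-payments
target = 910e9    # the assumed 2024 volume
years = 10
cagr = (target / base) ** (1 / years) - 1
print(f"{cagr:.1%} per year")   # roughly 61% compounded annual growth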

The problem with this sort of valuation analysis is that the inputs TX(t) and V(t) are entirely speculative. Assumptions, assumptions, assumptions. You can plug in anything you like. It’s like valuing a new business.. but worse. In Bitcoin, S(t) is basically fixed at all future horizons (never more than 21 million), so any change in the market’s assumptions translates into changes in the exchange rate. Bitcoin translates that uncertainty about its future prospects into present exchange rate volatility. And that exchange rate volatility dampens demand today for using bitcoin as a medium-of-exchange, undermining the very assumptions behind its current valuation. To me Bitcoin–not cryptocurrency in general, but Bitcoin–is like one of those M.C. Escher drawings, where the impossible looks deceptively plausible.

The Bitcoin-as-payments people will reply that the volatility doesn’t matter, that I’m wrong in saying that the volatility undermines the transactional demand for bitcoin. Here’s a recent claim by Andreessen:

The criticism that merchants will not accept Bitcoin because of its volatility is also incorrect. Bitcoin can be used entirely as a payment system; merchants do not need to hold any Bitcoin currency or be exposed to Bitcoin volatility at any time. Any consumer or merchant can trade in and out of Bitcoin and other currencies any time they want.

Athey qualifies this position a little (from the same interview above):

What about the extreme volatility? Volatility is bad because it increases frictions—if I just want to send you $100, the exchange rate might change between when I buy the bitcoins and send them to you, and when you receive and cash them out. That creates risk and frictions. But the level of the exchange rate is irrelevant for the efficiency of the payment rail—if I knew it would be $1000/bitcoin all day long, or $100/bitcoin, either way I can buy bitcoins, send them to you, and you can sell them, while avoiding paying exorbitant bank fees. You still incur some fees when getting money in and out, but those are relatively low and should fall over time with competition.

But the irreducible component of the costs facing those merchants Andreessen speaks of is precisely those “risks and frictions”; they are the price of offloading that volatility onto someone else. Competition will not reduce those costs any more than competition among options dealers will reduce the price of a put on the S&P 500.

Is the 1% fee that Coinbase charges for the service of offloading exchange rate volatility sufficient to cover its cost of hedging a coin whose USD volatility is more than 7 times that of the stock market? There is a smart bunch of people behind that company, so I’m reluctant to second-guess the business model. But I feel I must.. and may do so in detail in a future post.

The implicit assumption behind the comments of Andreessen and Athey is that Bitcoin’s money velocity can be arbitrarily large, a hot potato that gets passed around so quickly that the volatility of the coin can be made negligible to the party using it as a medium-of-exchange. But the Bitcoin protocol itself places a lower limit on transaction confirmation times, and therefore an upper limit on Bitcoin’s velocity. Whatever that velocity turns out to be, the interval between the time coin is received and the time it is spent imposes an irreducible risk on the party who wishes to use Bitcoin to make payments. A risk that is costly to lay off to someone else.

But I am a believer in cryptocurrency; I would just prefer to back a coin whose supply is more responsive to its demand, where \Delta S(t) is a function of \Delta TX(t), or of the exchange rate itself. This can be done in an entirely trustless way, and such a coin is likely to have a much more stable exchange rate and be a better medium-of-exchange.

Bitcoin, Ethereum and Pigou: the economics of transaction fees

The economics of transaction fees in cryptocurrencies are poorly understood. In a previous post I raised some questions about how using tx fees to compensate for hashing costs (as Bitcoin increasingly must do over time as its coinbase award declines) can be incentive-compatible with transaction demand for cryptocurrency. There, I was concerned about the distribution of seigniorage between existing coin holders and hashing costs, and what this implies for tx fees.

A new post on the Ethereum blog focuses on another aspect of transaction fee economics: a tragedy of the transaction verification commons.

The essence of the problem is this. In Bitcoin, tx fees are effectively set by which tx miners choose to include in their blocks. The creator of a tx can pay any fee he chooses, but miners are free to ignore a tx, so a payer who pays a relatively large fee is more likely to have a faster-than-average confirmation time. On the surface, this looks like a market mechanism. But it isn’t. The miner gets the tx fees of every tx included in a block that the miner solves. But every node on the network pays the costs of verifying a transaction; a tx must be verified before it is relayed and before anyone builds on top of a solved block containing it. Therefore, a miner will include any tx with a fee in excess of his own computational costs of verifying it (and reassembling the Merkle tree of his block), not the network’s computational costs of verifying it.

A single, very large block containing many transactions with many inputs/outputs can bog down the network. To deal with this, the Bitcoin protocol imposes a 1MB upper limit on the size of a block. This isn’t a great solution. Not only does it put an upper limit on the number of tx Bitcoin can process per unit of time, it does nothing to rationalise tx fees to tx verification costs.

It’s like an airline that puts a 1,000 suitcase limit (irrespective of size/weight) on luggage per flight, and deals with the problem of >1,000 suitcases by prioritising those passengers who volunteered to pay a fee. Those who pay the lowest or no fees have their bags kicked off the flight and placed in a queue for inclusion in subsequent flights (which employ the same 1,000 suitcase limit). What will eventually happen is that those with big, heavy bags will pay the highest fees and have their bags included in the flight, as those fees will still be lower than the actual cost of shipping the luggage. Those with small, light bags will get kicked off, unless the passenger is willing to pay more than the marginal cost of shipping his bag. If airlines are a competitive market, those passengers will eventually just choose to travel on a different airline that doesn’t ask them to subsidise pack rats.

From Ethereum’s post:

The question is, is this kind of market the right model for Bitcoin transactions? To answer this question, let us try to put all of the players into roles. The resource is the service of transaction processing, and the people benefiting from the resource, the transaction senders, are also the buyers paying transaction fees. So far, so good. The sellers are obviously the miners. But who is incurring the costs? Here, things get tricky. For each individual transaction that a miner includes, the costs are borne not just by that miner, but by every single node in the entire network. The cost per transaction is tiny; a miner can process a transaction and include it in a block for less than $0.00001 worth of electricity and data storage. The reason why transaction fees need to be high is because that $0.00001 is being paid by thousands of nodes all around the world.

It gets worse. Suppose that the net cost to the network of processing a transaction is close to $0.05. In theory, even if the costs are not borne by exactly the same people who set the prices, as long as the transaction fee is close to $0.05 the system would still be in balance. But what is the equilibrium transaction fee going to be? Right now, fees are around $0.09 simply because miners are too lazy to switch. But then, in the future, what happens once fees become a larger share of a miner’s revenue and miners have a large incentive to try to maximise their take? The obvious answer is, for a solo miner the equilibrium transaction fee is $0.00001. If a transaction with a fee of $0.00002 comes in, and the miner adds it, the miner will have earned a profit of $0.00001, and the remaining $0.04999 worth of costs will be paid by the rest of the network together – a cryptographic tragedy of the commons.

The Ethereum guys have defined the problem clearly. And I’m not encouraged by what (I think?) is the current thinking of the Bitcoin developers in dealing with this problem. From the Bitcoin Foundation’s blog:

I’ve been working on teaching the wallet code to estimate how low a fee (or priority) a transaction needs, at the moment it is sent, to be accepted by miners and included in the next block or three. The estimates are based on watching transactions as they are broadcast on the network and keeping track of which of those transactions are accepted into blocks.

The danger with estimating transaction fees is miners have an incentive to try to game the estimate to make transaction fees higher. For example, if the estimate was based on the average transaction fee for all transactions in the last N blocks, miners could add very-high-fee pay-to-self transactions to the blocks that they mine to drive up the average. However, by only considering fees for transactions that have been broadcast on the network that threat is eliminated – miners could broadcast very-high-fee pay-to-self transactions, but would end up paying those high transaction fees to other miners. The transaction estimation code also uses median transaction fees, not averages, to make it much harder for a minority of transactions to influence transaction fees.

But this won’t work in the end, for even a perfect estimate that is not contaminated by strategic actions by miners will still be an estimate of the marginal cost of tx verification faced by a single miner, not the network as a whole.
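To make the externality concrete, here is a toy calculation in Python using the hypothetical figures from the Ethereum post (the node count is an arbitrary assumption, purely for illustration):

cost_per_node = 0.00001   # cost for one node to verify, relay and store a tx
n_nodes = 5_000           # assumed number of full nodes doing that work

network_cost = cost_per_node * n_nodes        # cost borne by the network as a whole
print(f"network-wide cost per tx: ${network_cost:.2f}")   # $0.05

# A profit-maximising solo miner only compares the fee against his own cost:
fee = 0.00002
print(fee > cost_per_node)   # True: the tx gets included even though fee << network cost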

It’s not surprising that the Ethereum developers have cut to the core of this problem. In Bitcoin, you can at least be sure that the execution of scriptSig and scriptPubKey will halt after time proportional to tx size. Not so with Ethereum’s Turing-complete scripting language. For Ethereum, the problem of rationing network resources over tx verification and contract computation is acute. The project simply will not work without an economically equilibrating solution to this problem.

Their current thinking is that tx fees should be destroyed (no recipient) and calculated along the lines of a Pigovian tax via some mechanism of miner or ether holder consensus. I’m not convinced that this will work, but this post gives me confidence that the guys behind Ethereum are taking the economics of crypto seriously. Let’s all pitch in and help them solve this problem.

Ethereum: Turing-complete, programmable money

Ethereum is a new cryptocurrency project started by Vitalik Buterin, Charles Hoskinson and others. Part of that project is a new currency, called “ether”, but this is NOT another alt-coin. In my opinion, it’s the most interesting project in the crypto space since the introduction of Bitcoin itself. Assuming that it works, that is. The testnet was just released.

So what is Ethereum? In some respects, its design is similar to Bitcoin’s. Miners hash blocks of transactions and are rewarded in newly-created ether coins. It uses a new proof-of-work hashing algorithm called “Dagger”, which, like Scrypt (the hashing algo used by Litecoin and most alt-coins), is designed to be memory-hard. The developers are also experimenting with a new proof-of-stake mechanism called “Slasher”, but the intention seems to be to promote a research effort to create a memory-hard algorithm that will be resistant to dedicated hardware like ASICs. The blockchain protocol is also different; Ethereum will use a variant of the new GHOST protocol, which should allow for a much shorter time interval between blocks.

So far, that just sounds like a state-of-the-art alt-coin. The real innovation is Ethereum’s Turing-complete scripting language. This is very cool, as it implements a new entity on the network, a programmable contract. From their whitepaper:

A contract is essentially an automated agent that lives on the Ethereum network, has an Ethereum address and balance, and can send and receive transactions. A contract is “activated” every time someone sends a transaction to it, at which point it runs its code, perhaps modifying its internal state or even sending some transactions, and then shuts down.

In Bitcoin (and the alt-coins), tx are generated and received by addresses. In Ethereum, contracts too can generate and receive tx. This creates endless possibilities. For example, in Ethereum one could create a CFD (contract for difference). From an article by Buterin:

Each Ethereum contract has its own internal scripting code, and the scripting code is activated every time a transaction is sent to it. The scripting language has access to the transaction’s value, sender and optional data fields, as well as some block data and its own internal memory, as inputs, and can send transactions. To make a CFD, Alice would create a contract and seed it with $1000 worth of cryptocurrency, and then wait for Bob to accept the contract by sending a transaction containing $1000 as well. The contract would then be programmed to start a timer, and after 30 days Alice or Bob would be able to send a small transaction to the contract to activate it again and release the funds.

In the CFD example, I think the idea is something like this. Alice wants to bet on the change in next quarter’s US GDP. She creates a contract that includes a formula like PAYOUT = (ALICE_PREDICTION - (GDP1Q2014 / GDP4Q2013 - 1)) * GEARING and funds it with 10,000 ether. This is like a limit order. The script in the contract specifies that anyone who sends 10,000 ether to this contract will take the other side of the trade. The script also contains the public key of an “oracle”, e.g. a trusted website that publishes economic stats for the purpose of authoritatively fixing the settlement value of CFDs. After X days the script consults the oracle, pays it a small fee, and gets a signed value for GDP1Q2014, which the script checks against the oracle’s public key. The script then computes the formula and sends Alice max(10000 + PAYOUT, 0) ether and Bob max(10000 - PAYOUT, 0) ether.
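A rough Python sketch of that settlement logic (this is not Ethereum’s scripting language, and the names and numbers are illustrative only):

def settle_cfd(alice_prediction, gdp_1q2014, gdp_4q2013, gearing, stake=10_000.0):
    """Return (alice_payout, bob_payout) in ether for the CFD described above."""
    actual_growth = gdp_1q2014 / gdp_4q2013 - 1
    payout = (alice_prediction - actual_growth) * gearing
    return max(stake + payout, 0), max(stake - payout, 0)

# Alice predicted 3% growth, actual growth came in at 1%, gearing of 100,000:
print(settle_cfd(0.03, 1.01, 1.00, 100_000))   # -> (12000.0, 8000.0)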

Some other contract types that have been suggested:

  • Multisignature escrows
  • Savings accounts
  • Peer-to-peer gambling
  • New currencies within Ethereum

As Buterin says:

This is the advantage of Ethereum code: because the scripting language is designed to have no restrictions except for a fee system, essentially any kind of rules can be encoded inside of it. One can even have an entire company manage its savings on the blockchain, with a contract saying that, for example, 60% of the current shareholders of a company are needed to agree to move any funds (and perhaps 30% can move a maximum of 1% per day). Other, less traditionally capitalistic, structures are also possible; one idea is for a democratic organisation with the only rule being that two thirds of the existing members of a group must agree to invite another member.

Interesting stuff. Bitcoin’s advocates have always emphasised that Bitcoin is a decentralised payments system as well as a currency, and have gone to great lengths to build on top of it richer forms of financial exchange, so that assets other than bitcoin can be traded on the blockchain. Coloured coins and protocols like Mastercoin are the best examples of these efforts.

The Ethereum guys share the goals of these projects, but have a very different view about what the underlying technology needs to be to make them happen.

..as far as being an effective low-level protocol is concerned, Bitcoin is less effective; rather than being like a TCP on top of which one can build HTTP, Bitcoin is like SMTP: a protocol that is good at its intended task (in SMTP’s case email, in Bitcoin’s case money), but not particularly good as a foundation for anything else.

What makes Ethereum more like TCP and Bitcoin more like SMTP is that the former contains a Turing-complete scripting system whilst the latter does not. Bitcoin’s scripting system was deliberately made not Turing-complete to protect the network’s peers from malicious and buggy code. Instead of restricting the scripting language to deal with this problem, Ethereum uses an economic solution: tx and contract fees.

Ethereum will be like a giant distributed computer that automates all sorts of useful financial processes, as well as hashing tx blocks to maintain a distributed ledger, as Bitcoin’s network does. But on Ethereum, contracts will have to pay fees to have their computations done, to compensate peers for the resources consumed in running the contracts, and to make error and malice costly. My guess is that much rides on how effective this solution turns out to be. Bitcoin is very robust. It is also much less complex by not having Turing-complete scripting.

I will blog more on Ethereum as I learn more about it. I wish this project much success. The concept is brilliant if it actually works.

Is Bitcoin volatility really in decline?

Eli Dourado has a great blog that covers a lot of issues concerning cryptocurrency; you should follow it if you don’t already. In a new post he reports that Bitcoin volatility has been trending down.

I calculated Bitcoin’s historical volatility using price data from Mt. Gox (downloaded from Blockchain.info), which is the only consistent source of pricing data over a long period. There is a clear trend of falling volatility over time, albeit with some aberrations in recent months. The trend is statistically significant: a univariate OLS regression yields a t-score on the date variable of 15.

But the claim that “there is a clear trend of falling volatility over time” isn’t defensible at all. Before I explain why I don’t agree with Eli, let me first replicate his analysis.

bitcoin-vol1.png

My OLS regression agrees with Eli’s, so we’re on the same page:

                          Estimate   Std. Error   t value      Pr(>|t|)
(Intercept)           1.203775e-01 3.351178e-03  35.92095 3.489853e-194
seq(1, length(date)) -8.056651e-05 4.666856e-06 -17.26355  5.281210e-60

Putting that into English, the coefficient of the regression line says that volatility declines by about 0.00008 a day, or about 3 percentage points annually. Interpret that however you want.

And what is the daily volatility of BTC/USD?

    Min.  1st Qu.   Median     Mean  3rd Qu.     Max. 
0.007301 0.027890 0.048980 0.070220 0.082930 0.355300

About 5% per day. That’s pretty wild stuff, considering that the volatility of the S&P 500 is about 0.7% per day. But patience, one might say, for the trend line predicts that BTC/USD volatility is in decline.

I don’t like using trend lines in analysing financial timeseries. Let me show you why. Here is a plot of the coefficient of the same regression, but on a rolling 2-year window.
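(For the curious, a rough Python equivalent of that rolling-slope calculation; the original analysis appears to have been done in R, and the placeholder series below just stands in for the Mt. Gox-derived volatility series.)

import numpy as np
import pandas as pd

def rolling_trend_slope(vol, window_days=2 * 365):
    """OLS slope of volatility on a time index, recomputed over a rolling window."""
    def slope(w):
        return np.polyfit(np.arange(len(w)), w, 1)[0]
    return vol.rolling(window_days).apply(slope, raw=True)

# Placeholder data; substitute the 30-day rolling volatility of daily BTC/USD returns.
dates = pd.date_range("2010-08-01", "2014-01-01", freq="D")
vol = pd.Series(np.abs(np.random.randn(len(dates))) * 0.05, index=dates)
print(rolling_trend_slope(vol).dropna().tail())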

bitcoin-vol2.png

This tells a very different story. The slope of that regression line flattens out and eventually changes sign, as the early months of BTC/USD trading fall out of the sample period. Here’s how the chart looks running that regression on the last two years of data.

bitcoin-vol3.png

The trend reverses once the early stuff falls out of the sample. And there is good reason to exclude those early months from our analysis. Look at this chart of daily USD trade volume for those BTC/USD rates.

bitcoin-vol4.png

The price and volume series start around mid August 2010, but the volumes are really tiny for the first 8 months. And I mean tiny.. the median daily volume is about $3,300. Volumes get into the 5 and 6 digits after 13 April, 2011, when BTC/USD broke parity.

And before you say “that’s what we would expect, volatility to decline as volumes pick up”, look at those previous two charts. Volatility has been increasing as volume increases if you exclude the rinky dink period with sub-5-digit trading volumes.

Anyway, timeseries on thinly traded assets are notoriously unreliable. Those skyscraper patterns in the first chart are a good hint that there’s some dodgy data in there. For example, look at row 30:

                  date   price     volume         ror        vol
28 2010-09-13 19:15:05 0.06201   92.76696 -0.04598532 0.02973015
29 2010-09-14 19:15:05 0.06410 1293.53800  0.03370424 0.03019624
30 2010-09-15 19:15:05 0.17500 1035.82500  1.73010920 0.32375538
31 2010-09-16 19:15:05 0.06190   51.31510 -0.64628571 0.34284369
32 2010-09-17 19:15:05 0.06090  252.73500 -0.01615509 0.34271839

On September 15, 2010 we see a 173% daily return, followed by a -65% return the following day when the price basically returned to levels it was trading at on the 14th. Bad data point? Probably, but with these tiny volumes, does the question even matter.. this part of the series is junk.

One way of handling these issues is to prefer a more robust estimator of volatility, like Mean Absolute Deviation (this is a common practice in trading systems research). So let’s re-run the OLS we started off with–including the rinky dink period–but this time using 30-day rolling MAD instead of SD.
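(Again, a Python sketch of the estimator, assuming a series of daily returns; the original was presumably computed in R.)

import numpy as np
import pandas as pd

def rolling_mad(returns, window=30):
    """30-day rolling mean absolute deviation of daily returns."""
    return returns.rolling(window).apply(lambda w: np.mean(np.abs(w - np.mean(w))), raw=True)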

bitcoin-vol5.png

Bummer, the trend disappears. Let’s look at it another way. A plot of daily returns is always a good visual check. (I stripped out those two dodgy data points we looked at above.)

bitcoin-vol6.png

You can clearly see that the two largest one-day declines happened within the last 12 months. In fact, 4 of the 5 largest one-day losses happened in 2013, and those were multi-million dollar volume days.

           date   price   volume      ror      vol      mad
968  2013-04-12  83.664 34740413 -0.47654 0.136859 0.075185
967  2013-04-11 159.830 38009457 -0.32722 0.097551 0.069511
973  2013-04-17  79.942 25542665 -0.27551 0.154853 0.086190
1201 2013-12-07 767.777 83625810 -0.26372 0.108006 0.091285
84   2010-11-08   0.370    34758 -0.26000 0.227176 0.064088

And the 5 largest gains? All over two and a half years ago.

          date price    volume     ror     vol      mad
169 2011-02-01  0.95   70422.8 0.90000 0.17375 0.041508
69  2010-10-24  0.19    2612.9 0.74296 0.17274 0.018748
83  2010-11-07  0.50   44081.4 0.72413 0.22559 0.067911
296 2011-06-08 31.91 3238531.0 0.67418 0.17172 0.073743
257 2011-04-30  4.15  349701.7 0.53702 0.13500 0.048596

Now, I wonder, what charts the FX guys at Coinbase are looking at…

A quick look at Bitcoin transaction volume

A common rationale for owning Bitcoin is that its logarithmic money supply makes it a good store-of-value (SOV). Like precious metals or rare paintings stored in Swiss vaults, the scarcity of those coins will ensure that they at least keep their value.

By itself, this argument is hopelessly naive, as there is nothing scarce about a cryptocurrency with a fixed terminal money supply; anyone can (and a great many have) fork Bitcoin and create another such currency, so the total supply of such coins is potentially unlimited. But it could be replied that there are powerful network effects here, that the demand for digital SOV will coordinate around just one or two “crypto gold” stocks. In a previous post I’ve argued that the continuous hashing costs required to make the p2p network secure would indeed imply such a network effect, but that the inability of log coin supply to finance these hashing costs out of seigniorage after the money supply stops growing casts doubt on the sustainability of this spontaneous digital gold enterprise.

A more sophisticated defence of Bitcoin’s valuation goes like this. Bitcoin is great SOV not just because of its limited supply and those hashing cost network effects. It’s a great SOV because in future more and more people will use it as a medium-of-exchange (MOE). As the volume of bitcoin transactions increases, so will the demand to hold bitcoin balances for the purpose of making transactions in goods and services. But a total of only 21 million bitcoins will ever be produced, so the price of a bitcoin must reflect the ratio of expected future MOE money demand to 21 million. The price of Bitcoin, one might argue, is the market’s prediction of the long-term growth rate of bitcoin transaction demand.

So let’s set aside the theoretical objection to this thesis and look at it empirically. Is there evidence of transaction growth to date that would rationalise Bitcoin’s valuation if we extrapolate recent tx growth?

Here are the daily transaction volumes and BTC/USD fx volumes aggregated from the main exchanges.

bitcoin-tx-volume.png

Just eyeballing this chart, it looks to me like there is very little transaction growth except for the period at the end of the first and fourth quarters, when there were dramatic revaluations in the exchange rate. And the explanation that leaps to my mind for those spikes in tx volume is that it’s coming from the settlement of fx trades for buy-and-hold positions in bitcoin, and a good deal of Chinese evasion of capital controls via CNY –> BTC –> USD, GBP, EUR..

But a more bullish story could be told. The revaluation of Bitcoin might have had a large wealth effect, with early Bitcoin adopters spending some of their increasingly dear hoard on weed and alpaca socks, and the revaluation was itself due in large part to newcomers buying bitcoin for the purpose of buying stuff with it.

btc-usd-2013.png

Transactions on the blockchain that are settling an fx trade should be excluded from our calculation of bitcoin transaction growth. For every buy, there is a sell, so these transactions cannot represent new transaction demand by definition.

The data series used in these charts come from blockchain.info, which unfortunately only has fx volume for BTC/USD. Ideally, we’d want the volume figures for BTC vs EUR, GBP, CNY, JPY, and others so that we could add them all up and subtract total fx volume from the transaction series to get a truer picture of underlying transaction growth. If anyone can point me to where I can get those data easily, I’ll run the analysis.

Until then, here are monthly Bitcoin total transaction and BTC/USD trade month-on-month volume growth figures (in USD).

 yearmm     tx     fx
 201301  0.554  1.105
 201302  0.620  0.673
 201303  1.304  2.188
 201304  1.765  3.349
 201305 -0.388 -0.580
 201306 -0.318 -0.521
 201307  0.265 -0.183
 201308 -0.161 -0.323
 201309  0.112 -0.109
 201310  0.681  1.196
 201311  3.847  4.150
 201312 -0.015  0.207

Average Monthly Growth (2013)
   tx    fx 
0.413 0.418

These data are not conclusive. You could argue that the roughly 40% monthly tx growth is impressive evidence of underlying transaction growth. Or, you could interpret the roughly identical growth rates of tx and fx volume, and their high monthly correlation, as evidence that most of the tx growth is due to fx settlements. We need a complete fx volume series to disambiguate the data. When we do that, my bet is that monthly tx growth is under 20%.

The Marginal Cost of Cryptocurrency

I would count myself in the camp who believe that cryptocurrencies could do to finance what TCP/IP did to communications. Yet I also believe that Bitcoin and most of its current variations suffer from a fatal economic design flaw that will not survive the evolution of cryptocurrencies. That flaw is logarithmic money supply growth, and in this post I will explain why. My argument is a microeconomic analysis of cryptocurrency and has nothing to do with the much debated “deflationary bias”. As far as I am aware, the argument in this post has not been made before.

In a recent post Tyler Cowen adapts an old chestnut of the money literature to cryptocurrencies. Cowen’s argument is based on some false assumptions, but it has the virtue of starting from the right microeconomic principles, so it’s an excellent point of departure.

Once the market becomes contestable, it seems the price of the dominant cryptocurrency is set at about $50, or the marketing costs faced by its potential competitors. And so are the available rents on the supply-side exhausted.

There is thus a new theorem: the value of WitCoin should, in equilibrium, be equal to the marketing costs of its potential competitors.

In defence of the dominant cryptocurrency, Bitcoin, one might accept this argument yet object to its pessimistic conclusions by pointing out that Cowen is ignoring powerful network externalities. After all, coins residing on different blockchains are not fungible with one another. Cowen seems to treat these things as if they’re near substitutes, but maybe it’s a Visa/Mastercard sort-of-thing.

I want to dismiss this objection right away. We should be sceptical of ambitious claims about the network externalities of any one cryptocurrency. The network externalities of established fiat currencies are, of course, enormous, but this is largely due to their having a medium-of-account (MOA) function (to say nothing of legal tender laws and taxation), as well as a medium-of-exchange (MOE) function. Transactions settled in a cryptocurrency consult the exchange rate at the time of settlement and therefore piggy-back off the numeraire of an established fiat currency. Cryptocurrencies are not MOA; they are MOE only.

And given the near-frictionless fx between cryptocurrencies themselves, it’s not difficult to imagine a payment front-end for routine payees like on-line retailers that accepts a wide range of currencies as MOE. And multi-coin wallet software for payers is a no-brainer.

So, for the sake of argument, I’m going to assume that the network externalities of any given cryptocurrency are close to zero. On Cowen’s analysis, this would imply that the marginal cost of cryptocurrency is near-zero. And this means:

Marginal cost of supply for the market as a whole is perhaps the (mostly) fixed cost of setting up a new cryptocurrency-generating firm, which issues blocks of cryptocurrency, and that we can think of as roughly constant as the total supply of cryptocurrency expands through further entry. In any case this issue deserves further consideration.

This is a long-standing objection to the workability of competitive, privately issued fiat currencies: the cost structure of their production cannot be rationalised with their value. A market of competing fiat currencies with “stable” purchasing power will generate too much seigniorage for their issuers, inviting more competition until the purchasing power of these media is brought into line with their cost of production.

If we can’t lean on the economics of network externalities, what’s wrong with this argument?

The marginal cost of new coins is the cost of hashing a block

First of all, Cowen speaks of a “cryptocurrency-generating firm” that issues “blocks of cryptocurrency”. The idea here seems to be that the marginal costs of creating a crypto coin are close to zero (it’s just data after all), most costs being the fixed costs of setting up the cryptocurrency system.

But this has things the wrong way round. Creating a new cryptocurrency is as easy as forking the Bitcoin source code, hacking it, and throwing the fork up on a code repo. Fixed costs are practically zero. Marginal costs, however, equal the electricity costs (and amortised hardware costs) of solving a new block of transactions, as each new block contains a mining award for the peer whose hashing finds a solution to the system’s hash problem. This is how new coins are created.

Mining in equilibrium

To compensate a peer for doing this costly hashing work, he is allowed to pay himself a certain number of new coins in a special coinbase tx each time he solves the hash problem on a block. But the protocol ensures that the expected value of this mining award is offset by the cost of the requisite kilowatt hours needed to do the hashing. There are no issuers here “collecting rents”; it’s as if the seigniorage is sacrificed to the entropy gods.

Miners (the peers who choose to do the hashing) will work on new blocks only when the expected value of the mining award exceeds the cost of electricity required to run the hashing hardware. There are no restrictions on entry to mining, and the equilibrating mechanism is the protocol’s hashing difficulty. If the coin’s exchange value increases, making mining profitable at current difficulty, more miners will join the hashing effort, and because of this, after 2016 blocks the protocol will adjust the difficulty upward, making the expected value of mining equal to the cost of mining again. The same process works in reverse when the exchange value decreases. In the creation of crypto coins, MC = MP.
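A stylised way to write this zero-profit condition (the notation is mine, not the protocol’s): let P be the coin’s exchange value, R the coinbase award per block, c the all-in cost per hash, and H the number of hashes the network performs per block. Free entry and the difficulty adjustment push the system towards

R \times P \approx c \times H

so a higher exchange value, or cheaper hashing, simply draws in more hashing power until the equality is restored at a higher difficulty.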

(It should be noted that this is a stochastic rather than deterministic equilibrium, as the difficulty resets approximately every two weeks. Furthermore, the miner is paying for electricity today for an award he will get at some point in the future, so it’s really more a case of MC = E[MP]. But these details are not relevant to the conclusions we want to draw in this post, so I’ll continue to speak as if the marginal cost of making new coins equals the exchange value of coin at any given point in time.)

Why isn’t it obvious that MC = MP?

There are two properties of hash-based proof-of-work that obscure these microeconomics. The first is the multi-factored economics of mining difficulty. Improvements in specialised hashing hardware increase mining difficulty but do not increase its cost. (These improvements should eventually converge to a Moore’s Law function of time when the mining rig manufacturers exhaust all of the low-hanging fruit and run into the same photolithography constraints faced by Intel, etc.) The efficiencies merely result in a higher hashing difficulty, a sort of digital Red Queen Effect.

Similarly, increases (decreases) in the price of electricity will decrease (increase) the difficulty without changing the costs of mining. (It should also be noted that mining will gravitate towards regions like Iceland where it is cold and electricity is relatively cheap.) The only variable that does change the cost of mining is the exchange value of the currency itself.

And this is the other barrier to realising that MC = MP. In Bitcoin and most of the alt-coins, money supply is a logarithmic function of time. As money supply growth is deterministic, changes in money demand are reflected in the exchange value of the coin, raising or lowering the cost of producing the next coinbase as the protocol adjusts the difficulty up or down in response to the entry or exit of hashing power. So the exchange value of the mining award determines the marginal costs rather than the other way round. An economist might find that pretty weird, but that is how it works.

Network security and the crypto money demand function

It costs nothing to fork Bitcoin, hack the source, and create your very own alt-coin. But by itself, such a system is broken and has no bearing whatsoever on the economics of working cryptocurrencies. To make your alt-coin a working system, a sufficiently diverse group of miners must burn some costly kilowatt hours mining each block of transactions. Someone has gotta spend some capital to keep the lights on.

And the more kilowatt hours burned, the better, as the demand for a given cryptocurrency is a function of that system’s hashing costs (among other things, of course). The reason this is so has to do with the integrity of the most recent blocks on the distributed tx ledger, the blockchain. The amount of capital collectively burned hashing fixes the capital outlay required of an attacker to obtain enough hashing power to have a meaningful chance of orchestrating a successful double-spend attack on the system.

A double-spend is an event whereby the payee sees his payment N blocks deep and decides to deliver the goods or services to the payer, only to have this transaction subsequently not acknowledged by the network. Payments in cryptocurrency are irreversible, but double-spends are possible, and in economic terms they have the same effect that fraudulent chargebacks have in conventional payment systems like Visa or Paypal. The mitigation of this risk is valuable, and the more capital burned up hashing a crypto currency’s network, the lower the expected frequency of successful double-spend attacks.

Given that such events undermine confidence in the currency and drive its exchange value down (harming all holders, not just the victims of a double-spend), it should be axiomatic that a cryptocurrency’s hash rate is an argument of its money demand function.

This is also why it doesn’t make sense to speak of new cryptocurrencies expanding the aggregate crypto money supply without limit (or limited only by the fixed costs of creating one). What matters is how the aggregate hashing power, which is scarce, gets distributed over the set of extant cryptocurrencies. The above reasoning predicts that hashing power will not spread itself arbitrarily thinly, keeping MC well above 0. (The distribution currently looks more like a power law.)

Who pays to keep the lights on?

From the perspective of mitigating double-spend risk, the more capital that is burned hashing the better because the frequency of double-spend attacks is inversely related to the amount of capital burned. But the marginal benefits of hashing are at some point diminishing and the cost of hashing is linear, so for the end-user of a cryptocurrency, there is some level of hashing that is optimal.

In our argument above for why MC = MP, we made a simplification in saying that the mining award consisted entirely of coinbase. In fact, it consists of coinbase plus tx fees. In a protocol like Bitcoin’s where money growth is logarithmic, most of the early hashing costs are paid for out of new money supply, but as time goes on, tx fees become a greater and greater proportion of the mining award (currently, tx fees are about 1/3rd of Bitcoin’s mining award).

Now here we do see a genuine network externality. Imagine that all hashing costs are paid out of tx fees (as will eventually be the case with Bitcoin). There will be a natural tendency for demand for crypto MOE to gravitate towards the system with a higher tx volume, as it will have lower fees per-transaction for a given level of hashing.

Now imagine that we have a range of cryptocurrencies along a spectrum. On one end of the spectrum is the logarithmic money supply protocol–we’ll call these “log coins”. On the other end of the spectrum is a protocol with perfectly elastic money supply–we’ll call these “growth coins”. Growth coins have a non-deterministic money growth rule, an algorithm that enlarges the coinbase just enough to offset any increase in money demand, so the exchange value is roughly stable as long as money demand is not in decline. (In a future post, we will outline a protocol that can actually implement something approximating this.)

Where can we expect demand for MOE to gravitate along this spectrum of cryptocurrencies? This is where the logarithmic money growth rule hits the skids. At the margin, seigniorage for the log coins is eaten up by hashing costs, but as money demand outpaces the (rapidly declining) growth rate of money supply, the exchange value of the currency increases and existing coin holders are the recipients of the seigniorage on all of the existing, revalued coin.

Growth coins, by contrast, generate most of the seigniorage in the form of a larger coinbase rather than revalued coin, meaning that most of the seigniorage is spent on hashing. The result is lower tx fees for those who use growth coins as a MOE.

Given that tx fees will be shared between payer and payee, it’s hard to see how magic network economics will maintain the dominance of the log coins in the long run. Money demand coming from the transaction motive will gravitate towards the MOE with the lowest tx costs.

Free-riding not gold buying

The scenario gets worse when we relax the monetarist assumptions (latent in the above analysis) of stable money velocity and demand proportional to tx growth. You don’t have to be a Keynesian to see that a large quantity of Bitcoin balances is held for speculative reasons. The high level of coin dormancy in the Bitcoin blockchain is as conclusive empirical evidence of this as there can be.

Bitcoin, therefore, has a free rider problem, whereby speculative coin balances, which benefit from the system’s costly hashing rate, are effectively subsidised by those who use bitcoins primarily as a MOE. These speculative balances repay the favour by adding a toxic amount of exchange rate volatility, providing yet another reason for the transaction motive to run away from log coin MOE. As time goes on and the coinbase declines, this inequitable arrangement only gets worse.

Optimal cryptocurrency

As long as the growth rate of a growth coin’s money demand is sufficient to generate enough seigniorage in coinbase to cover the hashing rate demanded by its MOE users, transactions in growth coin are basically free. Some negligible fee will likely be required to deter DoS attacks (which has the interesting consequence of putting the goals of Adam Back’s Hashcash back into the cryptomoney that appropriated its designs), and it’s hard to see how one who wishes to hold crypto coin balances for the purpose of actually making transactions would prefer a log coin over a growth coin.

So maybe here is a new theorem: the value of a cryptocurrency will converge to its optimal level of hashing costs?

Fiat money via hash-based proof-of-work breaks new ground and we need to give the concept the attention and analysis it deserves. After all, we can dispense with the barbarous relic of logarithmic money supply and keep the good bits.

The velocity and dormancy of bitcoin

Dorit Ron and Adi Shamir (R&S) of The Weizmann Institute of Science wrote a paper, Quantitative Analysis of the Full Bitcoin Transaction Graph, that has received a lot of attention in the Bitcoin community and some press coverage. One of the paper’s main claims is that the vast majority of bitcoins are not “in circulation”.

Here is our first surprising discovery, which is related to the question of whether most bitcoins are stored or spent. The total number of BTC’s in the system is linear in the number of blocks. Each block is associated with the generation of 50 new BTC’s and thus there are 9,000,050 BTC’s in our address graph (generated from the 180,001 blocks between block number zero and block number 180,000). If we sum up the amounts accumulated at the 609,270 addresses which only receive and never send any BTC’s, we see that they contain 7,019,100 BTC’s, which are almost 78% of all existing BTC’s.

By itself, this is uninteresting. It is part of the Bitcoin protocol that 100% of the input to a tx must be assigned to its output, so when the former is not commensurable with the latter, the spender generates a new address to which the remainder is paid (i.e., he pays himself the “change”). Also, it is recommended practice that you generate a new address for every tx where you are the payee. For both of these reasons, at any given point in time most bitcoin will sit in addresses that have never sent a payment. In fact, if the recommended practice were universally followed, 100% of coin would be in such addresses.

However, 76.5% of these 78% (i.e., 59.7% of all the coins in the system) are “old coins”, defined as bitcoins received at some address more than three months before the cut off date (May 13th 2012), which were not followed by any outgoing transactions from that address after they were received… This is strong evidence that the majority of bitcoins are not circulating in the system… Note that the total number of bitcoins participating in all the transactions since the establishment of the system (except for the actual minting operations) is 423,287,950 BTC’s, and thus each coin which is in circulation had to be moved a large number of times to account for this total flow.

Now this is more interesting. That about 60% of bitcoins have been dormant for at least the three months prior to the study’s cut off date is consistent with the thesis that the majority of coin is not held for the purpose of conducting transactions but rather as a store-of-value. But let’s put the theoretical preconceptions aside for the moment. In this post I want to help tighten up some concepts so that we can actually start testing some monetary theories on Bitcoin.

What is bitcoin velocity?

The velocity of a currency is basically the number of times a currency unit changes hands over a given interval of time. Conventionally, this interval is taken to be a calendar quarter, because economists estimate money velocity as quarterly GDP divided by the average money supply over the quarter. Their calculation is indirect, because there is no centralised record of all fiat transactions in a given currency, but that transaction history is implied in the GDP stats. Bitcoin has the opposite problem: there is no GDP calculation for Bitcoin (yet!), but we do have a complete transaction log in the form of the blockchain. So calculating Bitcoin velocity should be straightforward.

We can make a back-of-envelope calculation right now. We’ll estimate the average (quarterly) Bitcoin velocity over the same time window studied in the paper, Jan 2009 to mid-May 2012 (13.5 quarters).

According to R&S, the sum of all transactions (excluding minted coins) for the period is 423,287,950 BTC. As money growth over this period is linear, starting at 50 BTC with the first block and ending at about 9 million, the average money supply is 4,500,000. Divide the former by the latter, multiply by \frac{1}{13.5}, and you get a quarterly money velocity for bitcoin of just under 7.
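The same arithmetic in a couple of lines of Python:

total_tx_volume = 423_287_950      # BTC moved in all transactions, Jan 2009 to mid-May 2012
average_money_supply = 4_500_000   # BTC: supply grew roughly linearly from ~0 to ~9 million
quarters = 13.5

velocity = total_tx_volume / average_money_supply / quarters
print(round(velocity, 2))   # ~6.97, i.e. just under 7 per quarter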

Is that high or low? As a benchmark, look at US M1 money velocity, which we can get from the St. Louis Fed. The average quarterly US money velocity over the same period was about 8 (it’s currently about 6.9), and this has been on a downward trend since 2008.

We should really work these numbers into a timeseries, but the average is at least in line with USD velocity numbers, which in itself should cast some doubt on the level of economic activity that gets done in bitcoins. But we should also note that the numerator in our rough calculation includes change, which should be subtracted out; paying yourself doesn’t exactly count as coin “changing hands”. Devising an estimator for this is a task for a rainy day, but suffice it to say that our velocity estimate of \approx 7 is biased upwards.

The “Shadow” Bitcoin system

There is, however, an offsetting factor that may even bias velocity estimates downward: the Shadow Bitcoin system. Exchanges like MtGox, on-line wallets, and some other bitcoin services allow transfers of coin between their users. The service holds coin in many addresses that are in effect “client” accounts, and transfers between such accounts are recorded only by the third party’s servers, not the bitcoin blockchain.

Even though such a transfer of bitcoin takes place within a trusted third party rather than on the blockchain, it should presumably still be included in the velocity figure, but we have no way of knowing directly what these volumes are. I am going to set this question aside in this post.

How should we measure “dormant” coins?

Velocity is a basic concept in monetary economics and is easy to calculate. But the key statistic in the R&S paper is the percentage of “old coins”. This is a related but different concept.

If every address spends its entire balance 7 times over the quarter, velocity is 7. But if two addresses ping 1 BTC between each other 64 million times over the quarter whilst the remaining 8,999,999 coins aren’t spent at all, velocity is still 7. Yet in the first case there are no “old coins”, while in the latter case all but one are “old coins”. Let’s call this concept dormancy.

Dormancy is related to velocity. If bitcoin money velocity is 7, that means that on average a coin sits inside an account for about 13 days before it is spent. If dormancy is not commensurate with velocity, then the distribution of dormancy across the money supply is going to be very wide. For example, an “old coin” is defined by R&S as one that hasn’t been spent in more than 90 days. So if about 60% of bitcoins are old coins, then the remaining 40% of coins have a velocity of at least 17.5, so on average each of those coins is dormant for no more than 5 days.

One of the problems with defining “dormant coin” as coin in an address that has not spent or received any coin in the last three months is that a single tx at an address–no matter how small–will put the entire balance of the address outside the set of dormant coins. This identification rule seems to be a lower limit estimate of dormant coin rather than a definition of it.

And anyway, “dormant coin” is a binary attribute and rests on some arbitrary cut off based on duration, when what we are really interested in is the duration itself. So rather than measuring the percentage of dormant coins, we should instead measure a coin’s dormancy: the time passed since the coin was last spent.

How do we measure the dormancy of a coin? Strictly speaking, this is nonsense, as coin input into a transaction is fungible. So dormancy is actually a property of a bitcoin address rather than a bitcoin (or some fraction thereof). We can define it as the weighted average of time since coin was paid into the address.

For example, if a new address A is created and 10BTC is paid into it at noon on Monday, the dormancy of A is 0. At noon on Tuesday, the dormancy of A is 1 (taking a day as the unit of time), by Wednesday it is 2. But suppose another 20BTC is paid into A on Wednesday. Dormancy goes down to 2/3 (the coins with dormancy=2 are now only 1/3rd of the address balance and the other 2/3 coin have zero dormancy). By noon on Thursday, dormancy is now 1 2/3.

In other words, an account’s dormancy increases by 1 every 24 hours when there is no activity in the address. Whenever R coins are paid into an address, dormancy is scaled by the factor 1 - \frac{R}{B}, where B is the address balance after the coins are paid in. Whenever coins are spent by an address, its dormancy is unchanged (dormancy is a property of the remaining coins). But spends reduce the address balance, so subsequent coins received will reduce dormancy even more.
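A small Python sketch of this bookkeeping (the class and method names are mine, purely for illustration):

class AddressDormancy:
    def __init__(self):
        self.balance = 0.0
        self.dormancy = 0.0   # weighted-average days since the coins in the address were received

    def tick(self, days=1.0):
        """No activity: dormancy simply ages with the elapsed time."""
        if self.balance > 0:
            self.dormancy += days

    def receive(self, amount):
        """Coins paid in: dormancy is scaled by 1 - R/B, with B the post-receipt balance."""
        self.balance += amount
        self.dormancy *= 1 - amount / self.balance

    def spend(self, amount):
        """Coins paid out: the balance falls, the dormancy of the remaining coins is unchanged."""
        self.balance -= amount

# The worked example above: 10 BTC on Monday, two idle days, then 20 BTC on Wednesday.
a = AddressDormancy()
a.receive(10); a.tick(); a.tick()   # dormancy = 2 by Wednesday noon
a.receive(20)                       # scaled by 1 - 20/30 -> 2/3
a.tick()                            # Thursday noon
print(a.dormancy)                   # 1.666..., i.e. 1 2/3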

So what this definition gives us is a distribution of dormancies over every address in the blockchain at a given point in time. The dormancy of the Bitcoin network at a given point in time is simply the weighted-average of the account dormancies, where the weight for an address is its balance.