Token Engineering Pt III: Analysis of Bitcoin, Design of Ocean Protocol
Trent McConaghy


In previous articles, I described why we need to get incentives right when we build tokenized ecosystems; and introduced ideas towards a practice of token engineering.

We can use these tools to help analyze existing tokenized ecosystems, and design new ones. This article does exactly that with case studies in (1) analysis of Bitcoin, and (2) design of Ocean Protocol. Let’s get started!

1. Case Study: Analysis of Bitcoin
We’ve discussed how best practices from optimization can apply to token design. Let’s put this into practice and frame Bitcoin through the lens of optimization design. In particular, let’s focus on Bitcoin’s objective function, which is: maximize the security of its network.

It then defines “security” as compute power (hash rate), which makes it expensive to roll back changes to the transaction log. Its block reward function manifests the objective, by giving block reward tokens (BTC) to people who improve the network’s compute power.

We can write the formula for the objective function (block reward function) as shown below. On the left side is the amount of token reward R that actor i can expect in a block interval. The right side of the equation is proportional (∝) to the left, and is the product of the compute power (hash rate) of actor i and the number of tokens dispensed every block, T.

The latter value is currently 12.5 BTC every ten minutes. Every four years that value halves.

[Figure: Bitcoin’s block reward function, E(R_i) ∝ h_i · T, where h_i is the hash rate contributed by actor i and T is the tokens dispensed per block.]
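To make this concrete, here’s a minimal Python sketch of the expected-reward formula (function names are our own, and it simplifies by ignoring transaction fees):

```python
def block_subsidy(height: int) -> float:
    """BTC minted per block: 50 at launch, halving every 210,000 blocks (~4 years)."""
    return 50.0 / (2 ** (height // 210_000))

def expected_reward(my_hashrate: float, network_hashrate: float, subsidy: float) -> float:
    """E(R_i): expected BTC per block for actor i, proportional to hash-rate share."""
    return (my_hashrate / network_hashrate) * subsidy

# Example: a miner with 1% of the network hash rate during the 12.5 BTC era.
print(expected_reward(1.0, 100.0, block_subsidy(500_000)))  # -> 0.125
```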


Aside: Trading Variance for Efficiency

Notice that the reward is in terms of expected value, E(). This means that each user doesn’t necessarily receive a block reward every interval. Rather, in Bitcoin, it’s quite lumpy: just a single user is rewarded in each block interval. But since their chance of winning the reward is proportional to the hash rate they’ve contributed, their expected value is indeed proportional to the amount contributed.

The Orchid team calls this probabilistic micro-payments.

Why would Bitcoin have this lumpiness (high variance), rather than awarding every player at every interval (low variance)? Here are some benefits:

- It doesn’t need to track how much each user contributed. Therefore lower compute, and lower bandwidth.
- It doesn’t need to send BTC to each user at each interval. Therefore far fewer transactions, and lower bandwidth. An efficiency tweak!
- In not needing the first two, the system can be far simpler, which minimizes the attack surface. Therefore simpler, and more secure.
These are significant benefits. The biggest negative is the higher variance: to have any real chance of winning anything at all you need significant hash rate, though if you do win, you win big. However, this higher variance is mitigated at a higher level simply by mining pools, which have the direct effect of reducing variance. This is cool because it means that Bitcoin doesn’t even need to do that directly. As usual, we keep learning from Satoshi:)
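A quick toy simulation (made-up numbers, not Bitcoin’s actual payout code) shows the trade-off: lottery-style and pro-rata payouts have the same mean, but very different variance.

```python
import random

def simulate(share=0.01, subsidy=12.5, blocks=10_000, seed=0):
    """Compare lottery-style payout (one winner per block, as in Bitcoin)
    with paying every miner pro rata each block, at equal expected value."""
    rng = random.Random(seed)
    lottery = [subsidy if rng.random() < share else 0.0 for _ in range(blocks)]
    pro_rata = [share * subsidy] * blocks
    for name, xs in (("lottery", lottery), ("pro-rata", pro_rata)):
        m = sum(xs) / len(xs)
        v = sum((x - m) ** 2 for x in xs) / len(xs)
        print(f"{name:9s} mean={m:.4f}  variance={v:.4f}")

simulate()  # means match (~0.125 per block); lottery variance ~1.5, pro-rata variance 0
```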

1.1 Success of Bitcoin’s incentives?
How well does Bitcoin do towards its objective function of maximizing security? The answer: incredibly well! From this simple function, Bitcoin has incentivized people to spend hundreds of millions of dollars to design custom hashing ASICs and to build ASIC mining farms.

Others are creating mining pools with thousands of participants. Now the hash rate is greater than that of all supercomputers combined. Electricity usage is greater than that of most small countries, and on track to overtake that of the USA by July 2019.

All in pursuit of Bitcoin token block rewards! (Not all of it is good, obviously.)

Besides the ASIC farms and mining pools, we’ve also seen a whole ecosystem emerge around Bitcoin. Software wallets, hardware wallets, core developers, app developers, countless Reddit threads, conferences, and more.

Driving much of it are BTC token holders incentivized to spread the word about their token. What’s driven all of this is the block rewards that manifest the objective function. That’s the power of incentives. You called it, Charlie:)

2. Case Study: Design of Ocean Protocol


2.1 Introduction

When we first started doing serious token design for Ocean Protocol in May 2017, we found ourselves struggling. We hadn’t formulated the goals (objectives and constraints) and instead were simply looking at plug-and-play patterns like decentralized marketplaces.

But then we asked: how did this help the data commons? It didn’t. Did it need its own token? It didn’t. And there were other issues.

So, we took a step back and gave ourselves the goal of writing proper objectives and constraints. Then things started to go more smoothly. With those goals written down, we tried other plug-and-play patterns (solvers).

We found new issues that the goals didn’t reflect, so we updated the goals. We kept looping in this iterative process. It didn’t take long before we’d exhausted existing plug-and-play patterns, so we had to design our own; and we iterated on those.

After doing this for a while, we realized that we had been applying the optimizer design approach to token design!

That is: formulate the problem, try using existing patterns; and if needed then develop your own. So while this blog post lays out the token design process as a fait accompli, in reality we discovered it as we were doing it. We’ve actually used this methodology for other token designs since, to help out friends in their projects.

2.2 Ocean Problem Formulation
Recall that the objective function is about getting people to do stuff. So, we must first decide who those people are. We must define the possible stakeholders or system agents. The following table outlines the key ones for Ocean token dynamics.


[Table: Key stakeholders and system agents in Ocean token dynamics]


Objective function. After the iterations described above, we arrived at an objective function of: maximize the supply of relevant AI data & services. This means to incentivize supply of not only high-quality priced data, but also high-quality commons data; and compute services around this (e.g. for privacy).

Constraints. In the iterations described above, we used this checklist when considering various designs. Roughly speaking, we can think of these as constraints.

- For priced data, is there incentive for supplying more? Referring? Good spam prevention?
- For commons (free) data, is there incentive for supplying more? Referring? Good spam prevention?
- Does the token give higher marginal value to users of the network versus external investors?
Besides these questions, we continually polled others about possible attacks, added each new concern to the list of constraints to solve for (including a memorable name), and updated the design to handle it. New constraints included: “Data Escapes”, “Curation Clones”, “Elsa & Anna Attack”, and more.

The FAQs section of the Ocean whitepaper documents these, and how we addressed them.

2.3 Exploring the Design Space
We tried a variety of designs that combined token patterns in various ways; and tested each design (in thought experiments) against the constraints listed above. Some that we tried:

- Just a decentralized marketplace. Fail: doesn’t incentivize commons data.
- Just a TCR for actors (like adChain). Fail: can’t handle spam data.
- Just a TCR for data/services. Fail: can’t handle Data Escapes.
- A TCR for actors and a TCR for data/services. Fail: can’t distinguish non-spam data/services from relevant ones.
- A TCR for actors and a Curation Market (CM) for data/services. Fail: no incentives to make data/services available.
And more, such as various riffs on governance and reputation systems. Finally, we arrived at one that met our goals: TCR for actors, and Curated Proofs Market (CPM) for data/services. The next section elaborates.

2.4 A new token pattern for Ocean: Curated Proofs Markets
Ocean’s objective function is to maximize the supply of relevant AI data & services. To manifest this, we must acknowledge that we can’t objectively measure what is “high quality”.

To solve this problem, Ocean leaves curation to the crowd: users must “put their money where their mouth is” by betting on what they believe will be the most popular datasets, using a Curation Market setting.

Then we needed to reconcile signals for quality data with making data available. We resolved that by binding the two together: predicted popularity versus actual (proven) popularity. A user is awarded tokens only if both of the following hold:

- They have predicted a dataset’s popularity in a Curation Market setting. This is the Predicted Popularity.
- They have provably made the dataset/service available when requested. By definition, the more popular it is, the more requests there are. This is the Proofed Popularity.
Together, these form what we call a Curated Proofs Market (CPM). In a CPM, the curation market and the proof are tightly bound: the proof gives teeth to the curation, to make curation more action-oriented; in turn, the curation gives signals for quality to the proof.
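As an illustration only (the names below are our own hypothetical ones, not Ocean’s actual interfaces), the CPM’s gating logic can be sketched as:

```python
def cpm_eligible(stake: float, requests: int, proofs_served: int) -> bool:
    """Toy CPM gate: an actor earns rewards for a dataset/service only if they
    (1) staked on it in the curation market (Predicted Popularity), and
    (2) provably served it whenever requested (Proofed Popularity)."""
    predicted = stake > 0
    proved = requests > 0 and proofs_served >= requests
    return predicted and proved

# Staking without serving earns nothing, and vice versa.
assert not cpm_eligible(stake=10.0, requests=5, proofs_served=0)
assert not cpm_eligible(stake=0.0, requests=5, proofs_served=5)
assert cpm_eligible(stake=10.0, requests=5, proofs_served=5)
```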

CPMs are a new addition to our growing list of token design building blocks:)

The following equation describes Ocean’s token rewards function.


[Figure: Ocean’s token reward function, E(R_ij) ∝ S_ij · D_j · T · R_i]


The first term on the right-hand side, S_ij, reflects actor i’s belief in the popularity of dataset/service j (Predicted Popularity). The second term, D_j, reflects the actual popularity of the dataset/service (Proofed Popularity). The third term, T, is the number of tokens doled out during that interval.

The fourth term, R_i, mitigates one particular attack vector. The expected reward function E() is implemented similarly to Bitcoin’s. The Ocean whitepaper elaborates on how this reward function works.
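As a rough sketch only (the whitepaper has the authoritative definition; names and the omitted normalization constant are ours), the four terms simply multiply:

```python
def expected_ocean_reward(S_ij: float, D_j: float, T: float, R_i: float = 1.0) -> float:
    """E(R_ij) ∝ S_ij * D_j * T * R_i, up to a normalization constant.

    S_ij: actor i's curation-market stake in dataset/service j (Predicted Popularity)
    D_j:  measured popularity of dataset/service j (Proofed Popularity)
    T:    number of tokens dispensed during the interval
    R_i:  attack-mitigation factor for actor i (1.0 = no adjustment)
    """
    return S_ij * D_j * T * R_i
```

Because the terms multiply, a zero in either popularity signal drives the expected reward to zero, which enforces the binding between curation and availability described above.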

3. Conclusion
This article gave case studies on using token engineering tools to analyze Bitcoin and to design Ocean Protocol.
