Monthly Archives: June 2020
    ETH authorization
    Blockchain public chain

    What is “authorization” in Ethereum contract interaction?

    June 30, 2020

    Many new Ethereum users are confused by the “authorization” (approve) operation the first time they interact with a smart contract. What is authorization? Why does it require sending a transaction at all? And why does that transaction carry no assets, yet still cost a miner’s fee?

    In this article, we explain the nature of the “authorization” operation from a technical perspective.

    When users want a smart contract to handle their ERC20 token assets, they must first approve it. So why is this authorization needed?

    quotation for nest oracle

    Let’s take a miner’s quotation on the NEST oracle as an example:

    • Bob is a NEST price maker. To submit an ETH/USDT quote to the oracle, he must transfer both ETH and USDT into the quotation contract according to his quote data. Assume here that he quotes with 10 ETH and 1,600 USDT.
    • Before quoting, Bob must first authorize the NEST quotation contract to spend his USDT, so that the contract has the authority to operate the USDT in Bob’s wallet, and the USDT transfer logic can execute smoothly when a verifier takes his order within the quotation’s life cycle.
    • The authorization here is essentially an on-chain transaction, so the user pays a miner’s fee (gas fee). Its purpose is to tell the USDT token contract: target smart contract A has the right to control up to X USDT in my wallet. Later, when target contract A needs to conduct a USDT transaction, it actively pulls no more than X USDT from the user’s balance via the USDT token contract.

    However, the above case raises another question: why do ERC20 tokens such as USDT need to be authorized, while ETH does not?

    Technical analysis: ETH is the native asset of the Ethereum network. When ETH is sent to a smart contract, the underlying protocol requires the target contract to have a receiving method, and the transaction itself can carry ETH directly into the target contract. An ERC20 transfer, by contrast, only changes the ledger inside the ERC20 token contract; the target contract receives no notification at all.

    Therefore, ETH does not need an authorization step the way ERC20 tokens do when interacting with smart contracts.

    To be precise, the authorization flow consists of two steps:

    Step 1: the approval transaction itself. It tells an ERC20 token contract that, in the future, target smart contract A may withdraw up to X tokens from my wallet account.

    Step 2: the actual transfer. When the logic executing inside target contract A requires a token transaction, contract A actively triggers the ERC20 transfer and takes up to X tokens. Conversely, if no token transaction is involved, no assets actually move even though the approval exists.

    In short, an approval does not by itself execute any token transaction; it merely reserves that authority over funds for target contract A.
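    The two-step flow above can be sketched with a minimal ERC20-style ledger in Python. This is an illustrative model only (the class and account names are hypothetical); a real token implements this logic in a Solidity contract on-chain:

```python
# Minimal sketch of ERC20-style approve / transferFrom bookkeeping.
# Hypothetical Python model for illustration; a real token contract
# runs this logic on-chain in Solidity.

class TokenLedger:
    def __init__(self, balances):
        self.balances = dict(balances)   # owner -> token balance
        self.allowances = {}             # (owner, spender) -> remaining limit

    def approve(self, owner, spender, amount):
        # Step 1: the owner tells the token contract that `spender`
        # may later withdraw up to `amount` tokens.  No tokens move.
        self.allowances[(owner, spender)] = amount

    def transfer_from(self, spender, owner, to, amount):
        # Step 2: the target contract (spender) actively pulls tokens,
        # limited by both the owner's balance and the allowance.
        allowed = self.allowances.get((owner, spender), 0)
        if amount > allowed or amount > self.balances.get(owner, 0):
            raise ValueError("insufficient allowance or balance")
        self.allowances[(owner, spender)] = allowed - amount
        self.balances[owner] -= amount
        self.balances[to] = self.balances.get(to, 0) + amount

usdt = TokenLedger({"bob": 2000})
usdt.approve("bob", "quote_contract", 1600)   # authorization only; 2000 USDT stay put
# ... later, when a verifier takes Bob's order, the contract pulls the funds:
usdt.transfer_from("quote_contract", "bob", "quote_contract", 1600)
print(usdt.balances["bob"])                   # 400
```

    Note that after `approve`, Bob’s balance is untouched; only the `transfer_from` in step 2 actually moves tokens, and only up to the approved amount.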

    To spare users repeated authorization operations, many smart contract developers default to approving the maximum possible number of tokens to the target contract. This obviously carries risk: if the smart contract has a vulnerability, or the contract administrator acts maliciously, the user’s token assets can be lost. This is the problem of “over-authorization”.

    We often encounter this problem in both the NEST DAPP and the imToken wallet.

    over authorization

    To address the problem of “over-authorization”, the NEST DAPP provides an authorization-management page. A miner who does not expect to participate in NEST oracle quotation for a while can “cancel authorization” and eliminate the security risk posed by the outstanding allowance. The imToken wallet takes similar measures: it shows clear authorization information for each approval, and ships a dedicated DAPP for authorization management so that users can freely manage their existing allowances.
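    Concretely, the “unlimited” allowance behind over-authorization is conventionally the maximum value a uint256 can hold, and “cancel authorization” is simply another approve call with an amount of 0 (which, being an on-chain transaction, also costs gas). A rough sketch with illustrative names:

```python
# Over-authorization and revocation, sketched in plain Python.
# An "unlimited" approval is conventionally the largest uint256 value;
# revoking it is just approving 0 for the same spender.
# Account and dApp names are made up for illustration.

MAX_UINT256 = 2**256 - 1        # the usual "infinite" allowance

allowances = {}                 # (owner, spender) -> approved limit

def approve(owner, spender, amount):
    allowances[(owner, spender)] = amount

approve("alice", "some_dapp", MAX_UINT256)   # convenient, but risky if the dApp is buggy
# ... user later cleans up via an authorization-management page:
approve("alice", "some_dapp", 0)             # "cancel authorization"
print(allowances[("alice", "some_dapp")])    # 0
```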

    A feasible way to skip the authorization step: implement special transfer logic in the ERC20 token contract itself, i.e. a method that calls into the target contract while transferring (in the spirit of transferAndCall-style extensions). However, to keep the token contract pure, mainstream ERC20 tokens do not implement this function.

    Logic and law
    DeFi

    Lessons from the March 12 crypto market crash: what is DeFi missing?

    June 29, 2020

    The crash of March 12 cost many people in the industry their confidence, especially in the field of DeFi.

    Take the stablecoin project MakerDAO as an example: because of the 312 crash, some collateralized debt positions became undercollateralized (i.e. insolvent), which triggered the MKR auction.

    Throughout the process, ridicule flew everywhere. Some said collateral was seized at zero cost, some tallied up the losses, some declared the project finished, some wanted to sue, and the foundation even felt enough pressure to hand governance over to the community.

    Self-Sustaining MakerDAO

    It seemed the entire crypto community was beginning to doubt whether DeFi can really work.

    This doubt led to collective reflection across the industry. Many people wrote post-mortems of 312, full of strange phenomena and even stranger explanations. MakerDAO also made many adjustments, to liquidation, auctions, and so on, becoming more and more complicated, all with a single purpose: to prevent a recurrence of the 312 event the next time such a situation arises. The intentions are good; everyone hopes DeFi can form a virtuous cycle, develop sustainably, and truly enter the public’s view. But these summaries, including MakerDAO’s recent adjustments, feel like changing the broth without changing the medicine; they do not reach the core of the problem.

    The well-known DeFi protocols today, whether MakerDAO or Compound, all follow the so-called over-collateralization plus liquidation-threshold model. On the surface, over-collateralization and watermark liquidation look impeccable. But in essence, each of these designs is an option! And an option necessarily involves three important variables: price, the risk-free rate of return, and asset volatility.
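    This claim can be made concrete with the textbook Black-Scholes formula, in which all three variables appear explicitly. The sketch below is the generic textbook pricing of a European call, not any protocol’s actual code, and the parameter values are illustrative:

```python
# Black-Scholes price of a European call: the three variables the article
# names (price S, risk-free rate r, volatility sigma) all appear explicitly.
# Generic textbook formula; illustrative parameters, not a protocol's code.
from math import log, sqrt, exp, erf

def norm_cdf(x):
    # Standard normal CDF via the error function.
    return 0.5 * (1.0 + erf(x / sqrt(2.0)))

def bs_call(S, K, r, sigma, T):
    d1 = (log(S / K) + (r + 0.5 * sigma**2) * T) / (sigma * sqrt(T))
    d2 = d1 - sigma * sqrt(T)
    return S * norm_cdf(d1) - K * exp(-r * T) * norm_cdf(d2)

# Same spot price and strike; only the volatility differs:
calm  = bs_call(S=100, K=100, r=0.02, sigma=0.5, T=0.25)   # quiet market
panic = bs_call(S=100, K=100, r=0.02, sigma=2.0, T=0.25)   # 312-style panic
print(round(calm, 2), round(panic, 2))   # the option value jumps with sigma
```

    Holding price and rate fixed, a surge in volatility alone multiplies the option value, which is exactly the sensitivity the fixed-parameter DeFi designs ignore.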

    None of these is information natively on the chain. Because the impact of the price variable is the most significant, someone designed a scheme called the oracle, which tries to relay the required price onto the chain from the centralized exchanges where price discovery happens.

    Both MakerDAO and Compound use price oracles. The risk-free rate and the volatility, however, are implied in the pricing formula and not easy to notice. As a result, no DeFi protocol currently on the market accounts for drastic fluctuations of these two variables in its design, and none performs in-depth sensitivity analysis on them.

    Let us set the risk-free rate aside, since its impact factor is small and it rarely changes significantly. Here we focus on volatility, a core variable that is generally missing from DeFi.

    Volatility

    Volatility measures how much the return on an asset fluctuates. In finance, it is usually measured by the standard deviation of the asset’s returns over a given period (this is historical volatility, which is also used to estimate implied volatility).
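    Historical volatility as defined here can be computed directly: take the standard deviation of periodic (e.g. daily log) returns, then annualize by the square root of the number of periods per year. A sketch with made-up prices, where the last observation is a 312-style crash:

```python
# Historical volatility: standard deviation of daily log returns,
# annualized by sqrt(365) since crypto trades every day.
# Prices are made-up illustrative data, not real market data.
from math import log, sqrt

prices = [220, 228, 224, 240, 233, 245, 238, 170]   # last move: a crash

rets = [log(b / a) for a, b in zip(prices, prices[1:])]
mean = sum(rets) / len(rets)
var = sum((r - mean) ** 2 for r in rets) / (len(rets) - 1)  # sample variance
daily_vol = sqrt(var)
annual_vol = daily_vol * sqrt(365)
print(round(annual_vol, 2))   # one crash day dominates the estimate
```

    A single crash day dominates the estimate, which is precisely why a protocol with no feedback on volatility is blindsided by such days.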

    In option pricing, volatility is an indispensable and extremely sensitive variable, yet in current DeFi this variable is hard-coded or adjusted only through governance.

    Taking MakerDAO as an example, two important parameters ensure the stability of DAI: the collateralization ratio and the position-closing line (the liquidation line). But these two parameters are not adjusted dynamically; they are written into the contract at the start, for example, collateral discounted by 50% and positions closed at 20%. To adjust these indicators you need so-called on-chain governance; the system does not change them automatically.

    The same is true when we borrow from Compound: the collateralization ratio and the liquidation line are determined in advance and are not automatically adjusted.
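    The fixed-parameter model these protocols share can be sketched abstractly. The thresholds below are illustrative values in the spirit of the article’s description, not MakerDAO’s or Compound’s actual settings; the point is that nothing in the check responds to volatility:

```python
# Fixed-parameter liquidation check, in the style the article describes.
# Threshold values are illustrative only, not any protocol's real settings.
# Note: the thresholds are constants; nothing here reacts to volatility.

MIN_COLLATERAL_RATIO = 1.5   # must over-collateralize to open a position
LIQUIDATION_RATIO = 1.2      # the fixed "position closing line"

def collateral_ratio(eth_locked, eth_price, debt_dai):
    return (eth_locked * eth_price) / debt_dai

def must_liquidate(eth_locked, eth_price, debt_dai):
    return collateral_ratio(eth_locked, eth_price, debt_dai) < LIQUIDATION_RATIO

# 10 ETH locked against 1,000 DAI of debt:
print(must_liquidate(10, 170, 1000))   # False: ratio 1.7, still safe
print(must_liquidate(10, 110, 1000))   # True: a sharp drop breaks the line
```

    When volatility surges, the price can gap straight through the liquidation line before keepers react, leaving the position insolvent, which is exactly what happened on 312.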

    Under normal circumstances, the design of these two variables is reasonable. But when volatility rises sharply, a very real challenge appears: positions blow through the liquidation line. Once positions are underwater, guaranteeing the stability of DAI becomes impossible, and the stablecoin function gradually fails. Because changes in volatility are less visible than changes in price, and volatility is usually relatively stable, these DeFi protocols run smoothly and self-balance most of the time. Even as Ethereum fell from $1,400, MakerDAO kept circulating normally, and the collateral scale even kept growing; everyone felt this reflected the greatness of decentralization.

    But however good the luck, MakerDAO’s mechanism is ultimately missing one variable: it has no feedback on volatility indicators.

    Volatility surged

    The same problem also exists in the AMM schemes we will discuss later.

    At a more abstract level, if the system lacks even one basic dimension, participants can only play the game with constrained information. Even if an equilibrium forms, it is an equilibrium under incomplete information, and most such equilibria are unstable: once some disturbance knocks the system off its track, it may never return, leading some participants to withdraw and the market to end.

    On 312, for example, MakerDAO was unable to handle the sudden surge in volatility. The original liquidation settings were overwhelmed by price swings in a short period, leaving some debt positions insolvent. Because of the risk-sharing mechanism, the intrinsic value of all DAI was diluted. To preserve the stablecoin property, an MKR auction was used to restart the system manually, which is far from the spirit of DeFi. The MakerDAO foundation later realized this problem; unwilling and unable to take responsibility, it handed the right of governance to the community.

    In fact, a true DeFi protocol needs neither fees nor governance rights; like the ERC20 standard, it is just a standard. The premise is that the protocol must comprehensively reflect every variable of the service it provides, rather than confronting a complicated economic world with a model that is missing a dimension.

    Logic and law

    Although we cite only the example of MakerDAO, the problem it reveals is universal: the absence of volatility afflicts every collateralized-lending protocol the way it afflicts MakerDAO. This has nothing to do with luck. In finance, we believe only in logic, not luck. If you live by luck, sooner or later you pay it back; today’s good fortune lays the groundwork for tomorrow’s pitfall.

    In the end, only logic and law can save you.

    Ethereum block
    Blockchain public chain

    How is the size of an Ethereum block determined?

    June 28, 2020

    As the blockchain industry develops, the ecosystem built on the Ethereum network keeps growing. Besides packaging transfers of the many assets issued on Ethereum, the network must also package a wide variety of smart-contract transactions, such as those of DeFi protocols (Uniswap), oracle protocols (NEST Protocol), and games (Decentraland). Ethereum therefore has to process more and more transactions, which is why we often see network congestion and rising miners’ fees whenever the market moves sharply.

    EthGasStation
    EthGasStation: ranking of gas consumption on the Ethereum network

    At the natural growth rate of the current Ethereum blockchain state, the network will face problems before long. As Ethereum block data keeps growing, the threshold for running a full Ethereum node rises higher and higher, which pushes the network toward centralization. (Currently, a full Ethereum node needs about 220 GB of storage.)

    As network latency increases, propagation may get slower and slower; with the emergence of “state expansion” (state bloat), block verification may become more difficult. Finally, as transaction TPS reaches its ceiling and client improvements become harder to achieve, both end users and core developers of Ethereum will suffer, threatening the sustainable development of the Ethereum ecosystem.

    daily transactions to be packed
    Etherscan: number of daily pending transactions on the Ethereum network

    At the macro level, the problem facing the Ethereum 1.0 network is that the Ethereum blockchain keeps getting bigger. Broken down, the variables that sharpen the problem are data storage, transaction state, and block size. Today we analyze the core factor of “Ethereum block size” and work out how the block size of the Ethereum network is determined.

    Unlike the Bitcoin network, Ethereum does not explicitly cap each block at a fixed number of bytes; instead it bounds the size of each block via the block GasLimit.

    Ethereum’s block GasLimit setting effectively caps the number of transactions that can be packaged into a block. The GasLimit parameter is determined collectively by Ethereum miners, who dynamically raise or lower its value by voting. The most recent vote was in the second half of 2019: miners voted to raise the block GasLimit from 8 million to 10 million gas units, making each block about 25% larger than before and, in theory, improving the TPS of the Ethereum network.
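    A back-of-the-envelope calculation shows how the gas limit translates into a throughput ceiling. A plain ETH transfer costs a fixed 21,000 gas; contract calls cost more, which is why real TPS sits well below this upper bound:

```python
# Back-of-the-envelope: how the block GasLimit caps throughput.
# 21,000 gas is the fixed cost of a plain ETH transfer; contract calls
# cost more, so real TPS (roughly 8-14) is far below this ceiling.

BLOCK_GAS_LIMIT = 10_000_000     # after the 2019 miner vote (was 8,000,000)
SIMPLE_TRANSFER_GAS = 21_000     # intrinsic gas cost of a plain transfer
AVG_BLOCK_TIME_S = 13            # rough average block interval at the time

txs_per_block = BLOCK_GAS_LIMIT // SIMPLE_TRANSFER_GAS
max_tps = txs_per_block / AVG_BLOCK_TIME_S
print(txs_per_block, round(max_tps, 1))   # 476 transfers/block, ~36.6 TPS ceiling
```

    Since most real transactions are contract calls consuming far more than 21,000 gas each, the observed 8–14 TPS is consistent with this ceiling.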

    Ethereum network TPS
    BTC.com data: Ethereum network TPS is roughly 8–14

    Raising the block GasLimit means a single block packages more transaction data and gets bigger, which slows information propagation and raises the probability that blocks end up as uncle blocks. Although uncle blocks still earn some ETH mining reward, miners prefer to avoid them. There is therefore a balance to strike between the per-block GasLimit ceiling and the interests of the miner group. (Note: in Ethereum, if multiple miners mine new blocks at the same height, one of them becomes the block on the longest chain, while the others, if referenced by subsequent blocks, are called uncle blocks.)

    Therefore, the GasLimit of a single Ethereum block cannot be raised significantly in a short time; it should instead be adjusted dynamically with the development of the Ethereum network, balancing the efficiency the ecosystem needs against the interests of the miners.

    So far there is no scientific conclusion about a “safe” upper bound on block size, but it is generally believed that the TPS gain from raising the per-block GasLimit is not enough to carry Ethereum’s development over the next three years, and raising it drags in many other related problems.

    Finally, we look forward to the early arrival of ETH 2.0!

    agency risk
    DeFi

    A dazzling moment in financial history: blockchain technology will completely eliminate agency risk!

    June 27, 2020

    People who have just encountered blockchain are often at a loss for what economic language to use to describe its significance.

    Some say it is about cost, yet no industry’s costs have visibly changed because of blockchain; others say it is about trust, yet no one gives a quantifiable description of trust under blockchain. So many people settle on immutability as the value of blockchain.

    These are very superficial understandings. In the language of economics, what blockchain essentially changes is the agency risk in the principal-agent relationship.

    Agency risk is everywhere.

    When you entrust someone with a task and they fail to carry it out; when you leave something in someone’s care and they fail to look after it; when a service you paid for goes wrong: all of this is agency risk. What is really worrying is entrusting your assets to others and suffering huge losses, as with investment funds or P2P lending.

    Agency risk is the risk that an agent lacks the ability, or simply fails, to perform the principal-agent relationship according to its rules.

    Code is the law

    Everything above seems commonplace and unrelated to blockchain, but this is exactly where blockchain’s real meaning lies: through a public ledger and decentralized consensus, agency risk is made algorithmic. The risk is embedded in code, and since the code is open source and fixed from the start, the agency risk is completely known. This is the real value of blockchain. “Trustlessness” refers to removing agency risk in the general sense, the trust risk toward people and institutions; “cost reduction” means avoiding all kinds of adverse selection, and hence lowering social costs.

    Therefore, the essence of blockchain is a principal-agent problem. Technological change has brought a principal-agent revolution: we no longer place the core risk with a third-party institution. Code is law, and code is the main risk.

    Here are a few examples:

    Case 1: Bitcoin

    Through Bitcoin, in the ideal case, we rely on no third party to store or transfer assets. This is what many people mean by using cryptography to guarantee the “inviolability of private property”.

    BTC

    This is not risk-free. The code may contain mistakes (ten years of testing do not make errors impossible). But the code has been open source from the start, so for everyone this risk is fully disclosed and cannot be altered (except by forking). More importantly, no single individual or institution can interfere with a BTC transfer, so we can reliably complete peer-to-peer payment.

    Bitcoin, as a major technological innovation, completely changed the old economic model and brought us into a new era of trusting algorithms rather than individuals.

    Case 2: USDT

    USDT is a digital dollar issued on Ethereum. Although USDT lives on the blockchain, the system that guarantees its value is off-chain: the issuer, Tether, promises that each USDT equals one dollar.

    Although Tether has done much to back this promise, such as custodial bank accounts and audits, we must trust Tether, the auditors, the custodial banks, and so on for the whole process to truly work. This is quite different from BTC’s complete trust in the algorithm. Blockchain is used, but the value carries huge agency risk: if Tether and its partners fail to honor the promise, USDT becomes a string of code rather than a dollar.

    Case 3: Platform currency

    The blockchain industry has a special class of asset called platform tokens, which embody an exchange platform’s fee discounts, transaction pricing, profit sharing, and other value. Such an asset, for example BNB as the token of the Binance platform, contains a huge agency risk just like USDT, regardless of whether blockchain technology is used: the Binance exchange can change, revoke, or reinterpret the value the token represents. What can we do? Only trust Binance.

    Case 4: DAI

    There is also another special kind of asset, such as MakerDAO’s stablecoin DAI. In essence it is a stabilized option: DAI generated by collateralizing ETH is an option on ETH with a strict pricing formula, in line with a design that carries no agency risk, only algorithmic risk. But DAI has a problem too. The price variable that determines its risk value is fed in by humans; there is no good verification mechanism for this price, which is input irregularly by a few internal nodes, and liquidations are based on the prices these so-called nodes submit. Clearly this risk is not algorithmic: we must believe these nodes make no mistakes and do no evil. Maker does have a rollback mechanism, but that introduces the trust risk of rollback itself: who decides that a rollback is needed? And why trust them?

    the degree of decentralization

    Based on the above analysis, we can summarize:

    In the blockchain world, in any complete value-interaction process, as long as a single link carries agency risk, the system is fundamentally different from BTC: we have reintroduced human risk rather than code risk.

    Therefore, in the blockchain world, the degree of decentralization and the magnitude of agency risk are two sides of one coin, two expressions of the same thing: the former a perceptual description, the latter a rational definition. It is entirely appropriate to measure the degree of decentralization by agency risk.

    With it we can not only judge how decentralized a system is, but also identify which areas of agency risk blockchain can solve, and thereby truly enter the era of blockchain applications.

    Trust algorithm

    In a world full of agency risk, we have built incentive systems based on law, institutions, and so on to keep the principal-agent structure effective and the world running normally. But the cost is enormous: think of all the corporate scandals and regulatory corruption, the unavoidable risks of the old model. However well we design our incentive mechanisms, this fundamental risk cannot be eliminated. One could say the evolution of human society is the story of endlessly changing models to cope with agency risk.

    The change brought by blockchain technology, however, is the most thorough solution to agency risk yet. No longer do we need to trust any third party, whether authority, hero, or sage; we need only trust the code! Trust the algorithm!

    This is a dazzling moment in financial history, worth exploring deeply and comprehensively. Yet most people have not come around; they treat it as just a small experiment and soon fall back into the traditional model, which makes one sigh.

    Product thinking
    DeFi

    When we are “consuming” Bitcoin, what exactly are we consuming?

    June 26, 2020

    Next, we will talk about the value and risk of consensus.

    The cost of consensus, not its applications, determines the value of a public chain; applications only affect the risk to that value. This is a quite subversive view.

    For this reason, we need to further explore:

    Imagine we have a pile of code, such as the Bitcoin code. The moment we ask what it is for, we slip into product thinking.

    Product thinking fits our intuition: if something has no use value, it does not deserve our attention. But product thinking has a flaw. The products we know are all provided by companies or individuals and have a strong central orientation, so when we use a product, we seem to interact only with the product itself and the service provider behind it. The use value lies mainly in the product or its provider, and sometimes good product development alone can determine the number of users.

    But Bitcoin is different.

    When we use Bitcoin, it is not the Bitcoin code or some organization that serves us, but the tens of thousands of miners behind the code, which is quite different from centralized products.

    First, the code itself is not very valuable. (Note: the code is open source; anyone could copy it and offer the service for free. Would you switch to the copy?)

    Second, a single person running this code so that your transfer goes through smoothly is not very valuable either.

    Only when the miner base is large enough and dispersed enough does the transfer gain “value”.

    Consensus provided by miners

    It is a bit like the difference between a weather app (Moji Weather) and WeChat. As long as Moji Weather’s service is running, anyone can open the app and get what they want. WeChat is different: if your friends do not use it, then even though Tencent has a million servers, it is worthless to you. This mutual dependence among users is similar to the way consensus depends on its participants.

    Therefore, when we “consume” Bitcoin, we are not consuming the Bitcoin code, which anyone can copy and which has little value of its own; we are consuming the consensus provided by tens of thousands of miners.

    This consensus includes the consistency, authenticity, and immutability of on-chain data, the recognition of Bitcoin’s intrinsic value, and various expectations about future use scenarios, all of which differ from the thinking behind products.

    Consensus does not necessarily point to a particular application, nor does any particular application determine the value of a blockchain.

    What is blockchain for? That is not preset from the beginning.

    Although most blockchain systems have initial functional definitions, the function eventually moves toward abstraction: the applications may exceed the designer’s imagination, matching the preset only at the most abstract level.

    BTC value storage

    Bitcoin, for example, was intended for payment, but lacking the conditions to become payment, it turned instead to value storage; both are consistent at the level of transfer. Ethereum is another example: the applications advocated in its white paper never flourished, while various forms of decentralized fundraising and decentralized finance developed instead, yet all of these applications use the smart-contract function.

    Those who keep expecting a specific application to drive the development of a decentralized system are chasing the branches while abandoning the root.

    If the consensus itself has not been built, that is, if the embodied consensus cost has not yet risen, it is difficult for applications to develop on top of a bare software function; even if subsidies prop things up, everything eventually returns to its original state. In the end it all depends on the construction of consensus. Using applications to pull the system along (“using the back end to pull the front end”) is crude, and its chance of success is very small.

    Consensus building

    Let us focus on the construction of consensus, whose contents each blockchain system defines for itself. For example, some reach consensus on on-chain transaction data, some include consensus on smart contracts, some include consensus on market prices, and so on.

    Every consensus has intrinsic value and requires paying a cost. Moreover, consensus does not eliminate the friction of the world and let water flow downhill like ordinary products; on the contrary, consensus forms new value and makes water flow uphill. Therefore, the higher the consensus cost, the greater the value, which is the special feature of decentralized blockchain systems.

    Value and risk of consensus
    Blockchain public chain

    Value and risk of consensus

    June 24, 2020

    The consensus mechanism is the most distinctive part of blockchain, and it differs from consensus in the real world. Based on an algorithmic program, a blockchain reaches agreement on on-chain transaction data and thereby creates value. Values in our lives are also based on consensus, but apart from exchange prices, most consensus lacks a rigorous process or binding force and is hard to quantify. What makes blockchain interesting is that this consensus can be measured: BTC by its hash-rate index, POS by holdings, DPOS by votes, and so on. Do these differences in consensus represent differences in value and risk?

    We believe they do: the cost of consensus determines the value of the blockchain. Under the current BTC model, this cost is computing power. Many people focus on the applications of blockchain and think that as long as a chain has enough applications its value is maximized; that is typical product thinking and does not apply to blockchain. A public chain whose consensus cost is high enough, that is, whose computing power is large enough, is indeed more valuable than one with low computing power. Many will object: how can computing power maintained by a central organization represent value? That objection confuses value with risk. Computing power determines value; the composition and application of the computing power determine risk.

    If computing power is provided by a single organization that cannot effectively share its cost, the expenditure is unsustainable. When it is provided by the market, it is the result of thousands of rational individuals each calculating, each trying to close their own economic loop; as long as they keep supplying computing power, they are achieving a net positive input-output. These providers differ from one another, which avoids the all-prosper-or-all-perish situation: however the chain’s incentives fluctuate, some enter and some exit, and there is never a simultaneous collective exit. This decentralized structure reduces the risk of the consensus.

    If the public chain or its token has usage scenarios, the consensus risk is further reduced indirectly. On one hand, token consumption is settled; on the other, the secondary market forms stable expectations that strengthen purchasing power, giving certainty to the providers of computing power. Both effects regulate the consensus risk of a public chain, but the consensus cost still determines the value.

    consensus mechanism

    Under the BTC design, when the computing power is large enough, the cost of a 51% attack becomes prohibitive. Even if such an attack succeeds, miners can fork away from the attacked blocks and re-establish the original consensus, leaving the attacker with a blockchain system that has no consensus, which is meaningless. Therefore, once a public chain is maintained by decentralized consensus, it becomes very powerful. The cost of this consensus, though seemingly unrelated to actual production, is the foundation of value; productive activity only reduces the risk to the consensus. To exaggerate: even if BTC could not be used anywhere, or nobody wanted to use it, its consensus cost would remain at tens of billions per year, and it would still hold its current value.

    Ethereum 2.0 will change the consensus mechanism from POW to POS, which has a great impact on Ethereum. This impact is mainly reflected in the evaluation of consensus cost. At present, the industry is not fully aware of this problem, but simply analyzes the impact of consensus change on application and development. I think this is putting the cart before the horse, because how much application is for a blockchain system is only a risk issue, not a value issue.

    The risk of maintaining value is low for the public chain with application, and potential risk may exist for the public chain without application under the same consensus cost. We need to spend more time thinking about the comparative analysis between POW and POS, and how to compare the cost of both in a framework. In fact, many people know that for a long time, even now, the number of applications on EOS is no less than ETH from participants, but the market value is less than 20% of ETH. Some people think that it is because heavy asset projects like usdt or DeFi are not released on EOS, which is incorrect. The root cause is that the consensus cost of DPOS is far lower than that of POW. No matter how many applications there are, it can only appear that the risk is low under the given consensus cost, which cannot represent the high value. But for the cost comparison between POS and POW, there is no good framework, which needs more people to study and improve.

    Blockchain-credit
    DeFi

    Credit of a block

    June 24, 2020

    The anonymity of blockchain makes the concept of “credit” nearly redundant: all loans and leverage on chain must be fully collateralized. Even if someone tries to find another way and establish a weak form of credit through some “identity” asset, that is still collateral in disguise (the word “asset” already gives it away). In the blockchain world, if you cannot guarantee performance, you cannot have credit, and in general the only way to guarantee performance is collateral. The various decentralized lending and stablecoin schemes on chain all follow this logic.

    Recently, however, the bZx arbitrage event demonstrated a credit solution. Through the flash loan interface, the lender guarantees two outcomes: if the contract call succeeds, more assets are returned than were lent; if the call fails, the lender’s assets are unchanged. Within that single transaction, the borrower may control the borrowed assets freely, with no restriction on their use, as long as they are repaid before the transaction ends — a degree of freedom similar to working-capital financing in the real world (as opposed to dedicated funds). This structure realizes credit within one transaction. Since a transaction completes within a single block, we can call this credit “one-block credit” (see Note 1 for the flash loan flow). Some may think that “one block of credit” has no advantage in duration and therefore little value. Indeed, set against the countless credit assets of the real world, one-block credit looks unattractive. But there is one situation where it becomes very important: arbitrage!

    one-block-credit

    Different from the traditional commodity economy, in the financial world, once a risk-free arbitrage opportunity appears, demand for it is no longer a downward-sloping curve but a vertical line above the arbitrage yield on the horizontal axis — that is, demand is effectively infinite. Any participant with funds, or the ability to finance, will try their best to capture the opportunity. That is the ideal case. In the real world, when a genuine arbitrage opportunity appears, there is often an important constraint, “lending restrictions”: the person who spots the opportunity has neither the funds nor the financing to complete the arbitrage, so the opportunity is captured by only a few people, or even goes unused for a long time. Such inefficient resource allocation is intolerable to most financial economists, so criticism of lending restrictions has never stopped. Yet in the real world it is hard to judge whether an arbitrage opportunity is truly risk-free and whether borrowed liquidity can be returned in time. Therefore lending restrictions, and the partial “market inefficiency” they cause, have never completely disappeared.

    However, inter-contract transfers and flash loans on the blockchain may make the lending restrictions of traditional finance, and the resulting “market inefficiency”, disappear completely! On chain, the flash loan design solves both of the problems above at the same time: 1. whether the arbitrage opportunity really exists; 2. whether repayment can be made in time. For example, suppose a contract A exposes an arbitrage opportunity. Any fund pool O in the blockchain world that supports flash loans, together with any participant S who spots the opportunity, can deploy an arbitrage contract B for it: B borrows asset X from O and interacts with contract A. If the arbitrage succeeds, X is returned to O; if it fails and less than X would be returned to O, the whole call to B is cancelled. Under this flow, the authenticity of the arbitrage opportunity is verified by whether the borrowed assets can actually be returned, so conditions 1 and 2 are satisfied simultaneously. Note that O and S can be any pool and any individual, which means the whole process has no obstacles or selectivity, fully reflecting the homogeneity of participants and of financial demand, and in principle eliminating the inefficiency of resource allocation.
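The flow just described — borrow X from O, interact with A, repay or revert — can be sketched as a simplified Python simulation. This is an illustration, not EVM code; `Pool`, the balances, and the strategy functions are all hypothetical:

```python
class RevertError(Exception):
    """Raised when the loan is not repaid; the whole 'transaction' is undone."""

class Pool:
    """Fund pool O: lends asset X only if the same call repays at least X."""
    def __init__(self, balance):
        self.balance = balance

    def flash_loan(self, amount, strategy):
        before = self.balance             # snapshot the pre-transfer balance
        self.balance -= amount            # lend X to the borrower
        self.balance += strategy(amount)  # borrower's arbitrary logic runs, then repays
        if self.balance < before:         # repayment check
            self.balance = before         # roll back, like an EVM revert
            raise RevertError("loan not repaid; transaction cancelled")

# Contract B's strategy: it can only repay in full if the opportunity is real.
def real_opportunity(x):
    return x       # hypothetical: trading against contract A covers the loan

def false_opportunity(x):
    return x - 5   # the 'opportunity' loses money; repayment falls short

pool = Pool(balance=100)
pool.flash_loan(50, real_opportunity)
assert pool.balance == 100   # repaid in full, state consistent

try:
    pool.flash_loan(50, false_opportunity)
except RevertError:
    pass
assert pool.balance == 100   # failed arbitrage leaves no trace on the pool
```

The point of the sketch is that the repayment check itself verifies whether the opportunity was real: a false opportunity cannot repay, the rollback restores the pool, and the loan is therefore risk-free for O.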

    Of course, the perfect implementation of flash loans rests on certain basic conditions: a large number of asset pools supporting the flash loan interface, and contracts being allowed to call one another. If these two conditions are met, a highly efficient decentralized financial system can be built, far exceeding traditional finance — not only because of code execution, but because it solves both arbitrage risk and the credit that arbitrage requires. It is worth noting, however, that if contract calls are used to create new arbitrage opportunities, that is not the intended behavior of promoting market efficiency; it is a new kind of financial attack. A distinction must be made around condition 1 above: exploiting the transactional properties of contracts to restructure calls externally into new composite transactions does not arbitrage away a real market inefficiency — it undermines the effectiveness of the market. We will analyze contracts and contract calls in depth in later articles. In any case, because flash loans create a new kind of credit that helps overcome “borrowing constraints”, they are an important discovery in the blockchain field and will greatly assist subsequent development.

    What kind of changes will a block’s credit bring to the world? Let’s see!

    Note 1: A flash loan uses the ACID properties of a transaction (atomicity, consistency, isolation, durability) together with a callback design in the lending model, as follows:

    1) Besides the loan amount, the loan function entry also takes the address of a receiving contract
    2) The receiving contract must implement the callback interface specified by the lender
    3) The loan function records the pre-transfer balance, then transfers the assets and calls the specified callback on the receiving contract
    4) Inside the callback, the receiving contract can use the borrowed assets to execute custom logic, including calls to other contract functions
    5) When the callback returns, execution resumes in the loan function, which compares the current balance with the pre-transfer balance recorded in step 3; if repayment has not been received, an exception is raised and the whole transaction is cancelled

    In this way, the callback design plus the atomicity of transactions completes a supervised “credit loan process”.
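The five steps can be mirrored in a short Python sketch of the callback pattern. This is an illustration, not a real lending API; all the names (`LendingPool`, `on_flash_loan`, `Receiver`) are invented:

```python
class Revert(Exception):
    """Simulates cancelling the whole transaction."""

class LendingPool:
    def __init__(self, balance):
        self.balance = balance

    def deposit(self, amount):                # how a borrower repays the pool
        self.balance += amount

    def flash_loan(self, amount, receiver):
        # 1) the entry takes the loan amount and a receiving contract
        before = self.balance                 # 3) record the pre-transfer balance
        self.balance -= amount                # 3) transfer the assets...
        receiver.on_flash_loan(self, amount)  # 3) ...and call the specified callback
        if self.balance < before:             # 5) check repayment after the callback
            self.balance = before
            raise Revert("repayment not received; transaction cancelled")

class Receiver:
    # 2) the receiving contract implements the callback the lender requires
    def on_flash_loan(self, pool, amount):
        # 4) custom logic with the borrowed assets would run here,
        #    possibly calling other contract functions
        pool.deposit(amount)                  # repay within the same 'transaction'

pool = LendingPool(1_000)
pool.flash_loan(400, Receiver())
assert pool.balance == 1_000                  # supervised credit loan completed
```

A receiver that fails to call `deposit` inside its callback would trip the balance check in step 5, and the rollback would leave the pool exactly as it started.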

    DeFi

    DeFi insurance design: always start from general equilibrium and reduce systematic arbitrage

    June 23, 2020

    On March 1, 2020, Odaily (Star Daily), a professional blockchain media platform, held an in-depth dialogue with Nine Chapter Tianwen, a NEST enthusiast (NEST is a decentralized price oracle), on the development of the DeFi insurance industry: what are the design ideas behind DeFi insurance products, and where is DeFi insurance heading? The contents of the dialogue are shared below:

    Odaily: what do you think is the significance and role of DeFi insurance in the whole DeFi ecosystem?

    Nest enthusiast: DeFi insurance is a kind of security protection for DeFi products, mainly compensation against two risks: the development risk of DeFi and the fluctuation risk of the assets inside DeFi. These two risks are different. For development risk (code vulnerabilities, backdoors, algorithms open to arbitrage, etc.), the value of DeFi insurance is high, and it is the core direction of the industry’s development. Insurance against asset fluctuation, by contrast, is essentially a swap or an option — more a derivative structure on asset prices than insurance in the strict sense. Such products will eventually be classified as derivatives; their value is also very large, but they are not strictly insurance products.

    Odaily: in your opinion, where is the ceiling of the DeFi insurance segment? How does it relate to DeFi’s total value locked?

    Nest enthusiast: The ceiling of DeFi insurance is ultimately the market value of ETH, because every on-chain risk can be offset by insurance except the code risk of ETH itself — your insurance is developed on that same chain. If the market value of ETH measures ETH’s code risk and consensus risk, then, without changing ETH’s current (attack-resistant) structure, the ceiling of insurance is the market value of ETH. In terms of actual supply and demand, though, it may sit far below that level. It is related to DeFi’s total value locked, but not entirely: if the asset-fluctuation insurance from the first question is included, the market is larger than the locked value, and that second type of demand is hard to estimate. If type-2 demand is excluded, the market really is bounded by the locked value of the insured DeFi products.

    Odaily: due to the high interoperability and composability of DeFi products, a loss of funds in DeFi is rarely an independent probability event — when one product loses funds, other products are often affected as well. Do current insurance products have good strategies for coping with this problem?

    Nest enthusiast: it’s a good thing, not a bad thing — it reflects the effectiveness of the financial market. The solution is not to provide more insurance; any additional insurance would simply be incorporated into the arbitragers’ overall arbitrage system. The core is to improve the overall view of DeFi product developers: do not design financial products with wishful thinking or partial-equilibrium thinking, but from the perspective of general equilibrium. That is, your product must account for all the variables in the whole market, and you should constantly try to attack and arbitrage existing products with your own design. Only development built on that starting point can be correct.

    The problem with traditional Internet-style development is that it takes demand as its starting point — user experience, efficiency, and so on. But in finance, and especially in contract products, there is only one demand: arbitrage (making money). The demand of all participants is homogeneous, so you must consider the behavior of the whole market: will your product reduce arbitrage in the market, or increase it? If it reduces arbitrage, your product can stand; if it increases it, you will be arbitraged and the product is meaningless. So it’s not about insurance — it’s about a shift in mindset for all DeFi developers.

    DeFi

    The NEST system is built on exactly this assumption: from the oracle to the trading market, the interest-rate market, the derivatives market, and the insurance market, every module is judged by whether it reduces arbitrage opportunities in the whole market. We need to understand that as long as arbitrage increases, a group of people holds a dominant strategy, and that can destroy the whole market! Designing and developing DeFi this way is a major undertaking, and experimenters will have many pitfalls to step into, from centralization risk to the security of contract code.

    Odaily: a few days ago, Nexus Mutual was exposed to a governance loophole, which puts a big question mark over DeFi insurance platforms: who underwrites the DeFi insurer itself? From this point of view, does it mean that a fully decentralized DeFi insurance platform cannot guarantee the security of DeFi?

    Nest enthusiast: it’s all about product design. As I argued in the article “The first principle of decentralization”, so-called on-chain governance that can still default on its promises is essentially centralized and will be eliminated. Such products and phenomena should not even be included in the discussion; they should be criticized on a large scale.

    Odaily: apart from DeFi insurance, what do you think is a more feasible technical means of preventing the risks caused by DeFi’s composability?

    Nest enthusiast: as explained in the answer to question 3, composability itself is a good thing. The problem lies not in the idea of combination but in the thinking of the combiners, who are following the wrong product-development ideas and will eventually be eliminated. The new generation of DeFi will take a global perspective — the general-equilibrium thinking mentioned in question 3: every product is developed to reduce arbitrage opportunities in the market. Some will ask: don’t you think about demand? Obviously, unmet demand is itself an arbitrage opportunity (note that “demand” here does not mean demand for something fast and good-looking, but demand arising from risk and return not being reasonably allocated and priced). We need to adopt this new development mindset for DeFi and enter the era of efficient finance.

    Odaily: which of the existing DeFi insurance products and platforms do you like best? (Etherisc, CDx, Nexus Mutual, Opyn, VouchForMe, KeeperDAO, etc.)

    DeFi-platform

    Nest enthusiast: as mentioned earlier, I think they are all partial-equilibrium products — be careful of being arbitraged. Someone should analyze this class of products systematically. If the thinking does not change, partial-equilibrium products will be eliminated by the market over time.

    Odaily: can you talk about the design philosophy of DeFi insurance products, and perhaps offer some suggestions for improving existing products?

    Nest enthusiast: always start from general equilibrium, reduce arbitrage across the whole system, and keep improving the pricing efficiency of the whole system. What is insurance? Insurance is a negative security. Why do we need negative securities? To complete the market. Why complete the market? Because only then can pricing be effective. What does effective pricing mean? An efficient market. And what does an efficient market represent? Financial resources allocated most efficiently.

    Odaily: finally, to sum up: will DeFi insurance become a trend in the future? What do you think are the necessary conditions for decentralized insurance to become a trend?

    Nest enthusiast: I think the main trend is DeFi itself, and DeFi insurance is just an important part of it. As I said before, the oracle, the trading market, the interest-rate market, the insurance market, and the derivatives market are all very important and develop together — because the blockchain is a super-rational world, code-driven arbitrage completes in an instant, and no link can afford to lag behind. I expect all-round development. The first necessary condition is the oracle; with that in place, the rest will flourish. I hope you will pay more attention to the NEST price oracle network, a better solution to the oracle problem.

    Blockchain public chain

    From the perspective of consensus cost: why is the market value of EOS less than 20% of ETH’s?

    June 21, 2020

    The consensus mechanism is the most distinctive part of blockchain, and it differs from consensus in the real world: based on an algorithmic protocol, a blockchain reaches agreement on the transaction data on the chain, and this creates value. Value in everyday life is also based on consensus, but apart from exchange prices, most of that consensus has no rigorous process or enforcement behind it and is hard to quantify. What is interesting about blockchain is that its consensus can be measured — BTC by hash rate, PoS by stake held, DPoS by votes, and so on. Do these differences in consensus represent differences in value and risk?

    We believe they do: the cost of consensus determines the value of the blockchain. Under the current BTC model, this cost is computing power. Many people focus on blockchain applications instead, believing that a chain with enough applications has the greatest value — typical product thinking, which does not apply to blockchain. A public chain whose consensus cost is high enough, that is, whose computing power is large enough, really is more valuable than a public chain with low computing power. Many will object: how can computing power maintained by a centralized organization represent value? That objection confuses value with risk. Computing power determines value; the composition and usage of that computing power determine risk.

    Computing power determines value

    If computing power is provided by a single organization that cannot effectively recover its cost, the consumption is unsustainable. When computing power is provided by the market, it is the result of thousands of rational individual calculations: each participant tries to close their own economic loop, and as long as they keep providing computing power, it shows that their input-output is net positive. These providers are heterogeneous, which effectively avoids a situation where all prosper or all fail together.

    No matter how the public chain’s incentives fluctuate, some participants enter while others exit, and there will never be a simultaneous collective exit. This decentralized structure reduces the risk of the consensus.

    If the public chain or its token has usage scenarios, consensus risk is also reduced indirectly: on the one hand, token consumption gains an outlet; on the other, the token’s secondary market forms stable expectations, which strengthens purchasing power and gives certainty to computing-power providers. These two aspects modulate the consensus risk of a public chain, but consensus cost always determines the value.

    Consensus risk of public chain

    By the BTC design, when computing power is large enough, the cost of completing a 51% attack becomes exorbitant. Even if a hash-power attack succeeds, miners can fork away from the attacked chain and maintain the original consensus again, leaving the attacker with a blockchain that carries no consensus — which is meaningless. Therefore, once a public chain is maintained by a decentralized consensus, it becomes very robust. The cost of this consensus, seemingly unrelated to actual production, is the foundation of value; production activity only reduces the risk of the consensus.

    To put it in exaggerated terms: even if BTC could not be used anywhere, or no one wanted to use it, its consensus cost would remain in the tens of billions every year, and it would still hold its current value.
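As a rough, order-of-magnitude illustration of how attack cost scales with hash rate — every number below is a hypothetical placeholder, not a measured network figure:

```python
# Back-of-envelope sketch: the electricity cost alone of out-hashing the
# honest majority for one day. All inputs are hypothetical placeholders.
network_hashrate = 100e18      # honest hashes per second (hypothetical)
joules_per_hash = 30e-12       # attacker hardware efficiency (hypothetical)
usd_per_joule = 0.05 / 3.6e6   # $0.05 per kWh, converted to dollars per joule
seconds = 86_400               # one day of sustained attack

# To control 51% of the total hash rate, the attacker must contribute
# 0.51/0.49 of the honest rate on top of it.
attack_hashrate = network_hashrate * 0.51 / 0.49
energy = attack_hashrate * joules_per_hash * seconds   # joules consumed
cost = energy * usd_per_joule
print(f"electricity alone, one day: ${cost:,.0f}")     # millions of dollars
```

Hardware capital cost, which is not included here, typically dwarfs the electricity figure — which is the article’s point: the higher the honest hash rate, the higher the total cost of an attack.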

    Ethereum 2.0 will change the consensus mechanism from PoW to PoS, which has a great impact on Ethereum — mainly on how its consensus cost is evaluated. The industry is not yet fully aware of this problem and mostly analyzes the impact of the change on applications and development. I think that is putting the cart before the horse: how many applications a blockchain system has is only a risk issue, not a value issue.

    Transfer of consensus mechanism

    For a public chain with applications, the risk of maintaining its value is low; for one without applications at the same consensus cost, potential risk may exist. We need to spend more time on the comparative analysis of PoW and PoS, and on how to compare the cost of both within a single framework. In fact, many people know that for a long time — even now — the number of applications on EOS, measured by participants, has been no less than on ETH, yet its market value is less than 20% of ETH’s. Some attribute this to heavy-asset projects like USDT or DeFi not launching on EOS; that is incorrect. The root cause is that the consensus cost of DPoS is far lower than that of PoW. However many applications there are, they can only show that risk is low at a given consensus cost — they cannot represent high value. There is still no good framework for comparing the costs of PoS and PoW; more people need to study and improve this.