Blockchain storage Arweave: the Turing machine's tape and a new paradigm for trusted computing

outprog · 2021-04-06 19:18:58


The essence of digital consensus is storage consensus. The storage-based computing paradigm is a blockchain trusted-computing paradigm with ultimate scalability.

Since the beginning of 2020, DeFi on Ethereum has grown explosively. Projects such as COMP and YFI caused severe congestion on Ethereum; at one point gas prices rose to 500 Gwei, and a single DeFi transaction cost hundreds of dollars. With Ethereum this congested, a scalable smart-contract solution is urgently needed.

In October, Vitalik published the rollup-centric roadmap for Ethereum, which set off a wave of layer-2 networks: ZK Rollups and Optimistic Rollups appeared one after another. ZK Rollups and OP Rollups adopt two completely different proof modes: validity proofs and fraud proofs.

A ZK rollup uses validity proofs: a batch of transactions and the corresponding integrity proof are submitted to a smart contract on Ethereum for verification. If the proof verifies, the batch is accepted by the contract; if verification fails, the batch is rejected. An OP rollup uses fraud proofs: the operator submits the data along with a bond, and for a period of time anyone can publish a non-interactive fraud proof. If no one successfully proves an error within this window, the state becomes final; conversely, a successful fraud proof causes the operator's bond to be forfeited.
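To make the fraud-proof flow concrete, here is a minimal sketch of optimistic-rollup finalization in TypeScript. The type names, the 7-day window, and the slashing rule are illustrative assumptions, not any specific rollup's contract interface.

```typescript
// Illustrative sketch of an optimistic-rollup finalization check (assumed types, not a real API).

interface StateCommitment {
  batchId: number;
  stateRoot: string;     // post-state root claimed by the operator for this batch
  operatorBond: bigint;  // bond posted along with the commitment
  submittedAt: number;   // submission time, seconds since epoch
  challenged: boolean;   // set once a valid fraud proof lands
}

const CHALLENGE_WINDOW_SECS = 7 * 24 * 3600; // assumed 7-day challenge window

// A valid fraud proof reverts the commitment and forfeits the operator's bond.
function applyFraudProof(c: StateCommitment, proofIsValid: boolean): void {
  if (!proofIsValid) return;       // bad proof: commitment is unaffected
  c.challenged = true;
  c.operatorBond = 0n;             // bond slashed
}

// With no successful challenge before the window closes, the state is final.
function isFinal(c: StateCommitment, now: number): boolean {
  return !c.challenged && now >= c.submittedAt + CHALLENGE_WINDOW_SECS;
}
```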

The difference between ZK and OP lies in two different ways of settling the game. I compare ZK to PoW: it verifies the validity of the result with cryptographic algorithms. OP is more like PoS: it relies on the operator's bonded funds (much like staking) and a series of "governance" measures to reach finality. ZK, constrained by cryptographic theory and engineering, has an "instruction set" that is too narrow to handle complex business logic; OP's problem lies in governance, since funds must be locked up and liquidity is restricted.

Whether ZK or OP, Ethereum's scaling rests on data compression and off-chain computation. As shown in the figure below, to lighten the layer-1 load, data is heavily compressed and computation is moved to layer 2. The data stored on layer 1 shrinks, and layer 1 only needs to verify state, which greatly reduces its load.

But reducing the load does not mean unlimited capacity. Block size and gas still have upper limits, and plenty of gas is still consumed on layer 1 by DeFi and composed applications, so layer 2 competes with layer 1 for resources, and the same contention arises among multiple layer-2 networks. In the end, will some layer-2s be unable to post to layer 1 or package transactions under limited resources, and will a layer 2 shut down because of it? Even if it does not shut down, if trading continues on layer 2 without restriction, then when assets need to move back to layer 1, will there be too much data for layer 1 to verify, until verification becomes impossible? In addition, composability between layer-2s is also a serious problem.

Ethereum is a "world computer": its original design intent is to complete both computation and storage on the blockchain. On-chain computation means every node in the network must process the computation, so its cost cannot be low. Even with two-layer scaling, data and computation are merely compressed; the computation (verification) still has to be processed on layer 1.

This article introduces a new computing paradigm. Unlike Ethereum's model of computing on-chain, this paradigm moves computation entirely off-chain and uses the chain only for storage; the blockchain then only needs to guarantee the availability and determinism of that storage. Suppose the input parameters of a computation are deterministic; then its output must be deterministic too. Take the program x + y = z: if the values of x and y are recorded on-chain, say x = 1 and y = 2, then the result z computed anywhere is always 3. As long as the program's parameters are deterministically recorded on-chain, the program does not need to run on the chain (anyone, anywhere, can run it off-chain) and its results are still trustworthy. Because the computation process is completely decoupled from the chain and the computation's parameters depend entirely on deterministic storage, we call this the storage-based computing paradigm.
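A minimal sketch of this determinism claim, assuming the two values have already been read from on-chain storage (no real chain SDK is used here):

```typescript
// If the inputs are fixed by on-chain storage, every honest executor computes the same output.

interface StoredInput { x: number; y: number }

// Pretend these values were read from a block at a known height,
// so every reader sees exactly the same bytes.
const onChainInput: StoredInput = { x: 1, y: 2 };

// A pure function of the stored inputs: no clock, no randomness, no hidden local state.
function evaluate(input: StoredInput): number {
  return input.x + input.y;
}

console.log(evaluate(onChainInput)); // 3, on any machine, at any time
```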

The Turing machine: back to the origin of the storage-based computing paradigm

We know that whether a computer follows the von Neumann or the Harvard architecture, it is in essence a universal Turing machine.

A Turing machine is an imaginary machine consisting of an infinite tape and a read/write head with a state register. The head moves back and forth along the tape and writes new symbols onto it. Such a hypothetical machine can perform arbitrarily complex computation (it is Turing complete).
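For readers who want the tape/head/state picture made concrete, here is a toy Turing-machine step loop in TypeScript; the transition table is an arbitrary example of my own, not anything from the figures.

```typescript
// A toy Turing machine: an (extendable) tape, a head position, and a state register.
type Sym = "0" | "1" | "_";
type Rule = { write: Sym; move: -1 | 1; next: string };

// Transition table for a machine that flips bits until it reads a blank, then halts.
const rules: Record<string, Partial<Record<Sym, Rule>>> = {
  flip: {
    "0": { write: "1", move: 1, next: "flip" },
    "1": { write: "0", move: 1, next: "flip" },
    "_": { write: "_", move: 1, next: "halt" },
  },
};

function run(tape: Sym[], state = "flip", head = 0): Sym[] {
  while (state !== "halt") {
    const rule = rules[state]?.[tape[head] ?? "_"];
    if (!rule) break;          // no applicable rule: halt
    tape[head] = rule.write;   // write a new symbol onto the tape
    head += rule.move;         // move the head
    state = rule.next;         // update the state register
  }
  return tape;
}

console.log(run(["1", "0", "1"])); // [ "0", "1", "0", "_" ]
```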

With blockchain technology, we can replace the Turing tape with a blockchain and obtain the new computation model shown in the figure below (the storage-based computing paradigm):

You write a business program and upload its code to the blockchain in advance (block height 102 in the figure above). Anyone can download the program from the trusted blockchain and run it, and both the program's input and its output are the blockchain (the tape). Because the blockchain is traceable and unforgeable, the inputs and outputs of the off-chain program are also deterministic. The program loads deterministic parameters from the blockchain, so the state it finally produces is deterministic as well.

The storage-based computing paradigm stores a program's source code, inputs, and outputs on the blockchain. To run the program, you load the trusted source code from the chain and compute over the trusted on-chain parameters off-chain; the output state is necessarily consistent. Anyone can run the program, and everyone who runs it gets the same result, which achieves the goal of trusted computing.
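Below is a minimal sketch of the "blockchain as tape" pattern: replay an ordered interaction log through a deterministic state-transition function. All names are illustrative assumptions; this is the general pattern behind tools such as SmartWeave, not their actual API.

```typescript
// Replay a stored interaction log through a deterministic transition function.

interface Interaction { caller: string; input: { op: "add" | "sub"; amount: number } }
interface State { total: number }

// The contract source would itself be fetched from the chain; here it is inlined.
function transition(state: State, ix: Interaction): State {
  if (ix.input.op === "add") return { total: state.total + ix.input.amount };
  return { total: state.total - ix.input.amount };
}

// The tape: an ordered, immutable log of interactions as recorded on-chain.
const tape: Interaction[] = [
  { caller: "alice", input: { op: "add", amount: 10 } },
  { caller: "bob",   input: { op: "sub", amount: 3 } },
];

// Any node that replays the same tape from the same initial state reaches the same result.
const finalState = tape.reduce(transition, { total: 0 });
console.log(finalState); // { total: 7 } everywhere
```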

The model is theoretically feasible, but Ethereum is still too expensive to serve as deterministic storage; application costs would surge, and using Ethereum's limited block space as storage would also greatly reduce an application's scalability. With the help of Arweave, a blockchain aimed at permanent storage, we can put the storage-based computing paradigm into practice.

An introduction to how Arweave works

Arweave is a blockchain-based file storage protocol with pay-once, store-forever semantics. It implements a simple set of economic incentive rules that motivate miners to store data for the long term.

Permanent storage is AR's core function, so first we need to understand the cost of permanent storage. Statistics show that storage costs fall at a remarkable rate each year: the average cost per GB of storage drops by 30.57% per year. Summing these declining costs, the total converges to a constant, which yields a price for permanent storage. AR uses this converged permanent cost as the baseline fee for storing data. As shown below, storing 1 GB of data consumes 2.45 AR, roughly $9.8 (because the AR price fluctuates, the storage cost fluctuates as well).
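A simplified version of the convergence argument (the actual Arweave endowment calculation is more involved than this): if storing the data costs C0 this year and that cost falls by a fraction r every year, the lifetime cost is a convergent geometric series.

$$\sum_{n=0}^{\infty} C_0\,(1-r)^n = \frac{C_0}{r}, \qquad r = 0.3057 \;\Rightarrow\; \frac{C_0}{r} \approx 3.3\,C_0$$

So under this assumption, keeping data forever costs only a small, fixed multiple of this year's storage cost, which is what makes a one-time payment workable.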

With the fee base in place, how do we get miners to keep this data forever? AR introduces a new mining mechanism: when mining a new block, the protocol randomly links it to a previous "recall block" and requires the miner to prove that it can access that block's data before the new block is valid. This encourages miners to store as many historical blocks as possible. The algorithm also rewards miners for storing "rare" blocks, because when a rare block is chosen as the recall block, the miners who hold it face less competition for the new block. Together these incentives achieve permanent data retention.
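A hedged sketch of the recall-block idea in TypeScript: derive a pseudo-random historical block index from the previous block, so that only a miner that actually holds that block's data can produce a valid candidate. The hashing and indexing details of the real Arweave protocol differ from this toy version.

```typescript
// Toy recall-block selection: a deterministic but unpredictable index into the chain's history.
import { createHash } from "node:crypto";

function recallIndex(prevBlockHash: string, height: number, chainLength: number): number {
  const digest = createHash("sha256")
    .update(prevBlockHash)
    .update(String(height))
    .digest();
  // Interpret the first 6 bytes as an integer and reduce it modulo the chain length.
  return digest.readUIntBE(0, 6) % chainLength;
}

// A miner that has not stored block `recallIndex(...)` cannot include its data in the proof,
// so storing more blocks (and rarer ones) raises the chance of being able to mine the next block.
console.log(recallIndex("0xabc123", 101, 100)); // some index in [0, 100)
```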

AR keeps data permanently, with stable pricing and low cost. Thanks to blockchain technology, the data stored on AR is also verifiable and traceable, making it a very suitable "Turing tape" for trusted computing.

Summary

The keys to making the storage-based computing paradigm work are data permanence and a fixed (cheap) cost. Only permanence keeps the data always "available", and only with that data can off-chain computation reproduce a consistent state. A fixed or low cost keeps an application's consensus cost within a stable range, unlike Ethereum, where congested blocks trigger fierce competition for resources; a stable cost makes the application more available.

Combining the storage-based computing paradigm with AR, we may have obtained a perfect deterministic Turing machine: a new trusted-computing model that can actually be applied in practice.

The storage-based computing paradigm has at least the following advantages:

  1. Computation of arbitrary complexity. Computing power depends only on the performance of the off-chain machine;
  2. Lower consensus cost. The consensus cost covers only storage, not computation; computation is provided by the application's operator (the off-chain executor);
  3. Composability and excellent "sharding". An application only needs to load the data it cares about from the chain, and when several applications are composed, only those applications' data is loaded. Operators no longer need to download everything (the way running a dapp requires a full geth node);
  4. Extreme scalability. First, the reduced consensus cost improves scalability; second, data is "sharded" not only on download but also on upload, so the performance bottleneck is just network bandwidth;
  5. No restriction on programming languages. Simply store the target program on the blockchain ahead of time and serialize all of its input parameters.

After studying rollups and ETH 2.0 in depth, we see that all of these efforts move computation off the chain, and the end point may be fully off-chain computation. From this exploration we can conclude: when a program itself is unambiguous, as long as the storage of its inputs and outputs is deterministic, the results it computes must be deterministic as well.

The storage-based computing paradigm is something entirely different from previous blockchain computing models, and it may take a long time for the public to understand and accept it. But it is surely an excellent trusted-computing paradigm that is closer to the essence (the Turing machine).

Thanks to Mr. A Jian, an Ethereum enthusiast, for his in-depth interpretation of rollups and ETH 2.0; to my uncle for his introduction to SmartWeave; and to Li Pei of the Xinghuo mining pool for his in-depth research on Arweave and LazyLedger.

Link to the original text :https://www.jianshu.com/p/338d70b76361

Copyright notice: this article was written by [outprog]. Please include a link to the original when reposting. Thank you. https://netfreeman.com/2021/04/20210406191552646a.html