Progress in blockchain and cryptocurrency over the past five years (Vitalik)

Jiedao jdon 2021-04-30 19:50:08


In 2014, I published an article and gave a talk listing hard problems in mathematics, computer science, and economics that I considered important for the cryptocurrency space (as I later came to call it) to be able to mature. In the past five years, much has changed. In this article, I will go through the 16 problems from 2014 one by one and review where we are on each of them. At the end, I will introduce my choices of new problems for 2019.

These problems fall into three categories: (i) cryptographic, and hence expected to be solvable with purely mathematical techniques if they are solvable at all; (ii) consensus theory, largely improvements to proof of work and proof of stake; and (iii) economic, and hence about creating structures involving incentives given to different participants, often involving the application layer more than the protocol layer. We have seen significant progress in all categories, though some more than others.

Scalability

Scalability is one technical problem on which we have seen huge theoretical progress. Five years ago, few people were thinking much about it; now, sharded designs are commonplace. Besides Ethereum 2.0, we have OmniLedger, LazyLedger, Zilliqa, and research papers seemingly published every month (e.g. https://medium.com/@giottodf/zilliqa-a-novel-approach-to-sharding-d79249347a1f).

Fundamentally, we already have a number of techniques that allow groups of validators to securely come to consensus on much more data than a single validator could process, and that allow clients to indirectly verify the full validity and availability of blocks even under 51% attack conditions; the most important of these techniques include random sampling of committees, fraud proofs, and data availability checks.

That said, none of this is running in a live sharded blockchain yet. In theory, the main remaining disagreements are over details, along with challenges around the stability of sharded networking, developer experience, and mitigating the risks of centralization; basic technical feasibility no longer seems to be in doubt.
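As an aside on the light-client side of this, below is a minimal, hypothetical sketch (in Python, with a plain SHA-256 Merkle tree and no erasure coding) of the sampling idea: instead of downloading a whole block, a client checks a few randomly chosen chunks against the block's Merkle root. Real data-availability schemes layer erasure coding and fraud proofs on top of this.

```python
import hashlib
import random

def h(data: bytes) -> bytes:
    return hashlib.sha256(data).digest()

def merkle_tree(chunks):
    """Build a Merkle tree over the chunks; returns (root, list of levels, leaves first)."""
    level = [h(c) for c in chunks]
    levels = [level]
    while len(level) > 1:
        if len(level) % 2:
            level = level + [level[-1]]          # duplicate last node on odd levels
        level = [h(level[i] + level[i + 1]) for i in range(0, len(level), 2)]
        levels.append(level)
    return level[0], levels

def merkle_proof(levels, index):
    """Sibling hashes from the leaf at `index` up to (but not including) the root."""
    proof = []
    for level in levels[:-1]:
        if len(level) % 2:
            level = level + [level[-1]]
        proof.append(level[index ^ 1])
        index //= 2
    return proof

def verify_chunk(root, chunk, index, proof):
    node = h(chunk)
    for sibling in proof:
        node = h(node + sibling) if index % 2 == 0 else h(sibling + node)
        index //= 2
    return node == root

# A light client samples a handful of random chunks and checks them against the
# block root instead of downloading the whole block.
chunks = [f"tx-{i}".encode() for i in range(64)]
root, levels = merkle_tree(chunks)
for i in random.sample(range(len(chunks)), 8):
    assert verify_chunk(root, chunks[i], i, merkle_proof(levels, i))
print("sampled chunks verified against the block root")
```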

Timestamping

Problem: create a distributed, incentive-compatible system, whether an overlay on top of a blockchain or its own blockchain, that maintains the current time to high accuracy. All legitimate users have clocks normally distributed around the "real" time with a standard deviation of 20 seconds ... no two nodes are more than 20 seconds apart.

Ethereum has actually gotten by just fine with a 13-second block time and no particularly advanced timestamping technology; it uses a simple technique in which a client does not accept a block whose stated timestamp is later than the client's local time. That said, this has not been tested under serious attack.
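As a concrete illustration of how simple that rule is, here is a hedged sketch of the check a client might perform; the 15-second drift allowance and the parent-timestamp rule are illustrative assumptions, not Ethereum's exact parameters.

```python
import time

MAX_FUTURE_DRIFT = 15   # seconds of tolerated clock skew (illustrative assumption)

def accept_block_timestamp(block_timestamp: int, parent_timestamp: int) -> bool:
    """Toy version of the rule: reject blocks whose stated timestamp is in the
    future (beyond a small drift allowance) or not after their parent's."""
    now = int(time.time())
    if block_timestamp > now + MAX_FUTURE_DRIFT:
        return False        # block claims to come from the future
    if block_timestamp <= parent_timestamp:
        return False        # timestamps must increase along the chain
    return True

print(accept_block_timestamp(int(time.time()), int(time.time()) - 13))    # accepted
print(accept_block_timestamp(int(time.time()) + 3600, int(time.time())))  # rejected: an hour in the future
```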

That said, timestamping is not at the forefront of current research challenges; perhaps that will change once proof-of-stake chains (including Ethereum 2.0, among others) come online as real, live systems and we see where the problems are.

Arbitrary proof of computation

Basically, this means building a SNARK (or STARK, or SHARK, or ...). SNARKs are now much more widely understood, and are already used in multiple blockchains today (including tornado.cash on Ethereum). SNARKs are very useful both as a privacy technology (see Zcash and tornado.cash) and as a scalability technology (see ZK Rollups, STARKDEX, and STARKed erasure-coded data roots).

There are still challenges with efficiency; making arithmetic-friendly hash functions is one big problem, and efficiently proving random memory accesses is another. Furthermore, there is the unresolved question of whether the O(n * log(n)) blowup in prover time is a fundamental limit, or whether there is some way to make a succinct proof with only linear overhead, as in Bulletproofs (which unfortunately take linear time to verify). There are also ever-present risks that the existing schemes have bugs. In general, the problems are in the details rather than the fundamentals.

Code obfuscation

For any given program P, an obfuscator can produce a second program O(P) = Q such that P and Q return the same outputs on the same inputs and, importantly, Q reveals nothing about the internals of P. One can hide inside Q a password or a secret encryption key, or simply use Q to hide the proprietary workings of the algorithm itself.

In plain English, the problem is that we want a way to "encrypt" a program so that the encrypted program still gives the same outputs for the same inputs, but its "internals" are hidden. An example use case for obfuscation is a program containing a private key where the program only allows that key to sign certain messages.
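To make the use case concrete, here is a toy sketch of the functionality one would want to obfuscate: a program embedding a secret key that only authenticates messages matching a policy. This is emphatically not obfuscation (the key sits in plain sight, and HMAC stands in for a real signature scheme); producing an equivalent program from which the key cannot be extracted is exactly the open problem.

```python
import hashlib
import hmac

# Toy illustration of the *functionality* one would want to obfuscate. This is
# NOT obfuscation: the key is visible in the source; the open problem is producing
# an equivalent program from which the key and policy internals cannot be extracted.
SECRET_KEY = b"example key that an obfuscator would hide"   # hypothetical key for illustration

def restricted_signer(message: bytes):
    """Return an authentication tag only for messages the embedded policy allows."""
    if not message.startswith(b"invoice:"):                 # the embedded policy
        return None
    return hmac.new(SECRET_KEY, message, hashlib.sha256).hexdigest()

print(restricted_signer(b"invoice: pay 5 ETH to alice"))    # allowed: returns a tag
print(restricted_signer(b"withdraw everything"))            # rejected: returns None
```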

A solution to code obfuscation would be very useful for blockchain protocols.

Unfortunately, it continues to be a hard problem. There is ongoing work attacking it: on one hand, constructions (e.g. this) that try to reduce the number of assumptions on mathematical objects that we do not know practically exist (e.g. general cryptographic multilinear maps), and on the other hand, attempts to make practical implementations of the desired mathematical objects. However, all of these paths are still quite far from producing something viable and known to be secure. See https://eprint.iacr.org/2019/463.pdf for a more general overview of the problem.

Hash-based cryptography

Problem: create a signature algorithm that relies on no security assumption other than the random-oracle property of hashes, maintains 160-bit security against classical computers, and has optimal size and other properties.

Since 2014, there have been two major advances here. SPHINCS, a "stateless" signature scheme (meaning that using it multiple times does not require remembering information such as a nonce), was released soon after the "hard problems" list was published; it provides a purely hash-based signature scheme with a size of about 41 kB.

In addition, STARKs have been developed, and signatures of similar size can be created based on them. That not just signatures but general-purpose zero-knowledge proofs are possible with nothing but hashes was something I did not expect five years ago.

The main unsolved problem with hash-based cryptography is aggregate signatures, similar to what BLS aggregation makes possible. It is known that we can just make a STARK over many Lamport signatures, but this is inefficient; a more efficient scheme would be welcome.
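For readers unfamiliar with the building blocks, here is a minimal Lamport one-time signature sketch in Python: purely hash-based, but one-time use only, with multi-kilobyte signatures and no aggregation, which is exactly why schemes like SPHINCS and STARK-based aggregation matter.

```python
import hashlib
import os

def H(x: bytes) -> bytes:
    return hashlib.sha256(x).digest()

def keygen():
    """256 pairs of random preimages; the public key is their hashes."""
    sk = [(os.urandom(32), os.urandom(32)) for _ in range(256)]
    pk = [(H(a), H(b)) for a, b in sk]
    return sk, pk

def _bits(message: bytes):
    digest = H(message)
    return [(digest[i // 8] >> (7 - i % 8)) & 1 for i in range(256)]

def sign(sk, message: bytes):
    """Reveal one preimage per message-hash bit. Reusing the key leaks security."""
    return [sk[i][b] for i, b in enumerate(_bits(message))]

def verify(pk, message: bytes, sig) -> bool:
    return all(H(sig[i]) == pk[i][b] for i, b in enumerate(_bits(message)))

sk, pk = keygen()
sig = sign(sk, b"hello")
assert verify(pk, b"hello", sig)
assert not verify(pk, b"goodbye", sig)
print("one-time signature verified; signature size:", sum(len(s) for s in sig), "bytes")
```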

ASIC-resistant proof of work

One approach to solving this problem is to create a proof-of-work algorithm based on a type of computation that is very difficult to specialize for.

About six months after the "hard problems" list was released, Ethereum settled on its ASIC-resistant proof-of-work algorithm: Ethash. Ethash is known as a memory-hard algorithm.

Ethash achieves ASIC resistance by making memory access the dominant part of running the PoW computation. Ethash was not the first memory-hard algorithm, but it did add one innovation: it uses pseudorandom lookups over a two-level DAG, allowing two ways of evaluating the function. First, one can compute it quickly if one has the entire (~2 GB) DAG; this is the memory-hard "fast path". Second, one can compute it much more slowly (though still fast enough to check a single provided solution) if one has only the top level of the DAG; this is used for block verification.
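The following is a toy memory-hard proof of work in the same spirit (pseudo-random lookups into a large dataset); it is a sketch for illustration only and is not Ethash: the dataset here is tiny, there is no two-level DAG, and all parameters are arbitrary.

```python
import hashlib

def h(x: bytes) -> bytes:
    return hashlib.sha256(x).digest()

# Toy dataset standing in for Ethash's DAG (the real DAG is gigabytes, regenerated per epoch).
DATASET_SIZE = 1 << 16
dataset = [h(i.to_bytes(8, "big")) for i in range(DATASET_SIZE)]

def toy_pow(header: bytes, nonce: int, rounds: int = 64) -> bytes:
    """Mix in pseudo-randomly chosen dataset entries, so evaluating quickly
    requires keeping the whole dataset in memory."""
    mix = h(header + nonce.to_bytes(8, "big"))
    for _ in range(rounds):
        index = int.from_bytes(mix[:8], "big") % DATASET_SIZE
        mix = h(mix + dataset[index])
    return mix

def mine(header: bytes, target: int) -> int:
    nonce = 0
    while int.from_bytes(toy_pow(header, nonce), "big") >= target:
        nonce += 1
    return nonce

nonce = mine(b"toy-block-header", target=2**245)   # roughly 2**11 attempts expected
print("found nonce:", nonce)
```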

Ethash has proven remarkably successful at ASIC resistance; after three years and billions of dollars of block rewards, ASICs do exist, but they are at best 2-5 times more powerful and cost-effective than GPUs. ProgPoW has been proposed as an alternative, but there is a growing consensus that ASIC-resistant algorithms will inevitably have a limited lifespan, and that ASIC resistance has downsides because it makes 51% attacks cheaper.

I believe PoW algorithms that provide a medium level of ASIC resistance can be created, but such resistance is limited, and both ASIC-friendly and ASIC-resistant PoW have disadvantages; in the long run, the better choice for blockchain consensus is proof of stake.

Useful proof of work

The challenge with useful proof of work is that a proof-of-work algorithm requires many properties:

  • Hard to compute
  • Easy to verify
  • Does not depend on large amounts of external data
  • Can be computed efficiently in small, "bite-sized" chunks

Unfortunately, there are not many useful computations that preserve all of these properties, and most computations that do have all of those properties and are "useful" are only "useful" for far too short a time to build a cryptocurrency around them.

However, there is one possible exception: zero-knowledge proof generation. Zero-knowledge proofs of aspects of blockchain validity (data availability roots being a simple example) are hard to compute and easy to verify. Furthermore, they are durably hard to compute; if proofs of "highly structured" computation become too easy, one can simply switch to verifying the blockchain's entire state transition, which becomes extremely expensive due to the need to model the virtual machine and random memory accesses.

Zero-knowledge proofs of blockchain validity provide great value to users of the blockchain, as they can substitute for the need to verify the chain directly; Coda is already doing this, albeit with a simplified blockchain design that is heavily optimized for provability. Such proofs can significantly improve a blockchain's safety and scalability. That said, the total amount of computation that actually needs to be done is still much less than the amount of work currently done by proof-of-work miners, so this would at best be an add-on for proof-of-stake blockchains, not a full-on consensus algorithm.

Proof of stake

Another approach to solving the mining-centralization problem is to abolish mining entirely, and move to some other mechanism for counting the weight of each node in consensus. By far the most popular alternative under discussion is "proof of stake" - that is, instead of treating the consensus model as "one unit of CPU power, one vote", it becomes "one unit of currency, one vote".

Status: great theoretical progress, though more real-world evaluation is still needed.

The most interesting consensus algorithms today are fundamentally similar to PBFT, but replace the fixed set of validators with a dynamic list that anyone can join by locking up tokens.

As of today we have, among many other algorithms, Casper FFG, Tendermint, HotStuff, and Casper CBC.
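The "one unit of currency, one vote" idea reduces, at its simplest, to choosing block proposers with probability proportional to stake from a shared random seed. A minimal sketch (hypothetical validator names and stake amounts, no slashing or committee logic):

```python
import hashlib
from bisect import bisect_right
from itertools import accumulate

# Hypothetical validator set: name -> staked amount (arbitrary units).
stakes = {"validator_a": 3200, "validator_b": 800, "validator_c": 12000, "validator_d": 4000}

def select_proposer(stakes, seed: bytes) -> str:
    """Pick a proposer with probability proportional to stake, deterministically
    from a shared random seed, so every honest node computes the same result."""
    names = list(stakes)
    cumulative = list(accumulate(stakes[n] for n in names))
    draw = int.from_bytes(hashlib.sha256(seed).digest(), "big") % cumulative[-1]
    return names[bisect_right(cumulative, draw)]

print(select_proposer(stakes, seed=b"epoch-42-shared-randomness"))
```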

Proof of storage

A third approach to the problem is to use a scarce computational resource other than computing power or currency. The two main alternatives that have been proposed in this regard are storage and bandwidth. There is no way, in principle, to provide an after-the-fact cryptographic proof that bandwidth was given or used, so proofs of bandwidth should most accurately be considered a subset of social proof (discussed in later problems), but proofs of storage are something that certainly can be done computationally. An advantage of proof of storage is that it is completely ASIC-resistant; the kind of storage that we have in hard drives is already close to optimal.

Status: a lot of theoretical progress, though still a lot to go, as well as more real-world evaluation.

There are a number of blockchains planning to use proof-of-storage protocols, including Chia and Filecoin (https://filecoin.io/filecoin.pdf). That said, these algorithms have not been tested in the wild. My own main concern is centralization: will these algorithms actually end up being dominated by smaller users with spare storage capacity, or by large mining farms?
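The core idea behind proofs of storage can be illustrated with a very naive spot-check scheme: before outsourcing a file, the verifier precomputes a few keyed challenges that can only be answered by touching the actual data. This sketch is not Filecoin's or Chia's construction (those use publicly verifiable, repeatable proofs); it is only the basic intuition.

```python
import hashlib
import os
import random

def H(x: bytes) -> bytes:
    return hashlib.sha256(x).digest()

def precompute_challenges(chunks, n_challenges=16):
    """Before deleting its own copy, the verifier keeps a few (index, nonce, expected tag)
    triples; each can be used once to spot-check that the prover still holds chunk i."""
    challenges = []
    for _ in range(n_challenges):
        i = random.randrange(len(chunks))
        nonce = os.urandom(16)
        challenges.append((i, nonce, H(nonce + chunks[i])))
    return challenges

def respond(chunks, i, nonce):
    """The storage provider must read the actual chunk to answer the challenge."""
    return H(nonce + chunks[i])

file_chunks = [os.urandom(256) for _ in range(1024)]   # toy "file"
challenges = precompute_challenges(file_chunks)

i, nonce, expected = challenges.pop()
assert respond(file_chunks, i, nonce) == expected
print("provider answered the spot-check correctly")
```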

Stable-value cryptoassets

One of the main problems with Bitcoin is price volatility. Problem: construct a cryptoasset with a stable price.

Status: some progress.

MakerDAO is now live and has been stable for nearly two years. It has survived a 93% drop in the value of its underlying collateral asset (ETH), and there is now more than $100 million of DAI issued. It has become a mainstay of the Ethereum ecosystem, and many Ethereum projects have integrated or are integrating with it. Other synthetic token projects, such as UMA, are rapidly gaining steam as well.
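The core mechanic keeping DAI near $1 is overcollateralization: a position must stay above a liquidation ratio or its collateral is sold off. A minimal sketch (the 150% ratio and the prices are illustrative assumptions, not Maker's live parameters):

```python
# Toy collateralized-debt-position check in the spirit of MakerDAO-style systems.
LIQUIDATION_RATIO = 1.5   # collateral must be worth at least 150% of the debt (illustrative)

def is_safe(collateral_eth: float, eth_price_usd: float, dai_debt: float) -> bool:
    """A position stays open only while its collateral value covers the debt
    with the required safety margin; otherwise it is liquidated."""
    return collateral_eth * eth_price_usd >= dai_debt * LIQUIDATION_RATIO

print(is_safe(collateral_eth=10, eth_price_usd=300, dai_debt=1500))  # True: 200% collateralized
print(is_safe(collateral_eth=10, eth_price_usd=200, dai_debt=1500))  # False: the price drop triggers liquidation
```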

However, while the MakerDAO system has survived the tough economic conditions of 2019, those were by no means the toughest conditions that could happen.

Another major, and arguably larger, challenge is that the stability of MakerDAO-like systems depends on some underlying oracle scheme. Various attempts at oracle systems do exist (see the later problem), but the jury is still out on how well they hold up under large amounts of economic stress.

Decentralized public goods incentives

In general, one of the challenges in economics is the problem of "public goods". For example, suppose that there is a scientific research project that will cost $1 million to complete, and it is known that if it is completed, the resulting research will save one million people $5 each. In total, the social benefit is clear... [but] from the point of view of each individual person, contributing does not make sense... So far, most approaches to public goods have involved centralization. Additional assumptions and requirements: a fully trustworthy oracle exists for determining whether or not a certain public-good task has been completed (in reality this is false, but this is the domain of another problem).

Status: some progress.

It is generally understood that the problem of funding public goods splits into two problems: the funding problem (where to get funding for public goods from) and the preference-aggregation problem (how to determine what counts as a genuine public good, rather than some individual's pet project). This problem focuses on the former, assuming the latter is solved (see the "decentralized contribution metrics" section below for work on that problem).

In general, there have not been huge new breakthroughs here. There are two major categories of solutions. First, we can try to elicit individual contributions, giving people social rewards for doing so. My own proposal for charity through marginal price discrimination is one example; another is the anti-malaria donation badges on Peepeth. Second, we can collect funds from applications that have network effects. Within blockchain land there are several options for doing this:

  • Issuing coins
  • Charging a portion of transaction fees at the protocol level (e.g. through EIP 1559)
  • Charging a portion of transaction fees from some layer-2 application (e.g. Uniswap, or some scaling solution, or even state rent in an Ethereum 2.0 execution environment)
  • Charging some other kind of fee (e.g. ENS registration)

Outside of blockchain land, this is just the age-old question of how to collect taxes if you are a government, and how to charge fees if you are a business or other organization.

Reputation systems

There has not really been much work on reputation systems since 2014. Perhaps the best approach is to use token-curated registries to create curated lists of trustworthy entities or objects.

Proof of excellence

One interesting and largely unexplored approach is a "proof of proof" coin that rewards players for coming up with mathematical proofs of certain theorems.

Status: no progress; the problem is largely forgotten.

The main alternative approach to token distribution that has gained popularity is airdrops; typically, a token is distributed at launch either in proportion to existing holdings of some other token, or based on some other metric (as in the Handshake airdrop). Verifying human creativity directly has not really been attempted, and with recent progress in AI, the problem of creating a task that only humans can do but computers can verify may well be too difficult.

Anti-Sybil systems

A problem somewhat related to reputation systems is the challenge of creating a "unique identity system" - a system for generating tokens that prove that an identity is not part of a Sybil attack ... However, we would like a system with better and more egalitarian properties than "one dollar, one vote"; arguably, one person, one vote would be ideal.

Status: some progress.

There have been quite a few attempts at solving the unique-human problem; attempts that come to mind include (an incomplete list!) HumanityDAO, pseudonym parties, proof-of-attendance protocols, and social-graph-based systems such as BrightID.

With quadratic voting and quadratic funding becoming more popular, the need for some kind of human-based anti-Sybil system continues to grow. Hopefully, ongoing development of these techniques, and new ones, can rise to meet it.

Decentralized contribution metrics

More recent work on determining the value of public-good contributions does not try to separate determining the task from determining the quality of completion; the reason is that, in practice, the two are difficult to separate. Work done by specific teams tends to be non-fungible and subjective, so the most reasonable approach is to look at the relevance of the task and the quality of its performance as a single package and evaluate both with the same techniques.

Quadratic funding is a mechanism in which individuals can make donations to projects, and then, based on the number of people who donated and how much they donated, a formula is used to calculate how much they would have donated if they had been perfectly coordinated with each other (i.e. had taken each other's interests into account and not fallen prey to the tragedy of the commons). For any given project, the difference between the amount that would have been donated and the amount actually donated is given to that project as a subsidy from some central pool (see problem 11 for where the central pool's funding could come from). Note that this mechanism focuses on satisfying the values of some community, rather than on satisfying some given goal regardless of whether or not anyone cares about it; because of the complexity-of-value problem, this approach is likely to be much more robust to unknown unknowns.
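The standard quadratic funding formula can be sketched in a few lines: a project's "coordinated" total is the square of the sum of the square roots of its individual contributions, and the matching pool tops up the gap (scaled down if the pool is too small). The project names and numbers below are purely illustrative.

```python
from math import sqrt

def quadratic_funding_subsidies(contributions, matching_pool):
    """For each project, the 'coordinated' amount is (sum of sqrt of contributions)**2;
    the gap between that and the amount actually raised is paid from the matching
    pool, scaled down proportionally if the pool cannot cover every gap."""
    raised = {p: sum(c) for p, c in contributions.items()}
    ideal = {p: sum(sqrt(x) for x in c) ** 2 for p, c in contributions.items()}
    gaps = {p: ideal[p] - raised[p] for p in contributions}
    total_gap = sum(gaps.values())
    scale = min(1.0, matching_pool / total_gap) if total_gap else 0.0
    return {p: gap * scale for p, gap in gaps.items()}

contributions = {
    "broad_project": [1.0] * 100,   # 100 people give $1 each
    "narrow_project": [100.0],      # one person gives $100
}
# Broad support attracts nearly the whole pool; the single large donation gets no match.
print(quadratic_funding_subsidies(contributions, matching_pool=5000))
```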

Quadratic funding has even been tried in real life, in the recent Gitcoin quadratic funding rounds, to considerable success. Some incremental progress has also been made on improving quadratic funding and similar mechanisms, in particular pairwise-bounded quadratic funding to mitigate collusion. There has also been work on standardizing and implementing bribe-resistant voting technology, which prevents users from proving to third parties how they voted; this blocks many kinds of collusion and bribery attacks.

Decentralized success metrics

Problem: come up with and implement a decentralized method for measuring numerical real-world variables ... the system should be able to measure anything that humans can currently reach a rough consensus on (e.g. the price of an asset, the temperature, the global CO2 concentration).

Status: some progress.

This is now usually referred to as "the oracle problem". The largest known instance of a decentralized oracle in operation is Augur, which has processed outcomes for millions of dollars of bets. Token-curated registries such as the Kleros TCR for tokens are another example. However, these systems have still not seen a real-world test of their forking mechanism (search for "subjectivocracy" here), either due to a highly contentious question or due to an attempted 51% attack. There is also research on the oracle problem happening outside the blockchain space, in the form of the "peer prediction" literature; see here for the latest advances in that field.
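At its most naive, oracle aggregation is just robust statistics over independent reports: take the median so that a minority of dishonest reporters cannot move the answer arbitrarily. The sketch below is only that naive core; real systems such as Augur add staking, dispute rounds, and ultimately forking on top.

```python
from statistics import median

# Independent price reports for some real-world quantity; one reporter lies badly.
reports = {
    "reporter_a": 1843.2,
    "reporter_b": 1841.7,
    "reporter_c": 1842.5,
    "reporter_d": 9999.0,   # dishonest outlier
}

# The median ignores the outlier as long as a majority of reporters are honest.
print("accepted value:", median(reports.values()))
```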

Another looming challenge is that people want to rely on these systems to guide transfers of quantities of assets greater than the economic value of the system's native token; in those conditions, token holders in theory have an incentive to collude to give wrong answers.


Copyright notice: this article was written by [Jiedao jdon]; please include the original link when reposting. https://netfreeman.com/2021/04/20210430194658675m.html