
How DAO Governance Experiments Can Scale Coordination – pt.3

Examples of successful DAO governance models

While it might be arguable what “success” means for DAO governance, this chapter will highlight three DAOs that have scaled well beyond their initial size and retained lively participation and a robust degree of satisfaction among users and contributors.

MakerDAO

After the Maker Foundation turned over the reins to a decentralized governance model in 2021, some decisions floundered and didn't get executed due to low voter participation. The introduction of a delegate system, one of the first in the space, and delegate compensation helped to overcome this particular challenge.

The DAO grew to more than 140 full-time contributors during the 2021-22 bull market and started to lose money as operating costs skyrocketed. Token holders rallied around the “Endgame Plan” by co-founder Rune Christensen which radically revamped the governance process and introduced sweeping cost-cutting measures. The new system is designed to make insider dealings and bureaucratic scope creep much harder. It has led to Maker becoming hugely profitable in 2023 and doubled the governance token’s price.

Arbitrum

After a successful airdrop of $ARB tokens at the end of March 2023, Arbitrum decided to run multiple governance experiments in parallel. Most notable were the Arbitrum Short-Term Incentive Program and the DAO's creative use of jokeraces to get token holders to align on what to do.

Arbitrum also invited multiple teams like RnDAO to run grant programs for the DAO, leading to viral word-of-mouth marketing and high-quality contributions.

Optimism Retroactive Public Goods Funding

A special case in many ways, RetroPGF is structured by the Optimism Foundation team but involves so-called badge holders who decide on who gets awarded the substantial and coveted retroactive grants.

Round three concluded last November and attracted more than 700 nominations, leading to a huge challenge in structuring the workload so that badge holders – who don’t receive compensation – are able to vote meaningfully.

RetroPGF is a showcase for fast iteration of governance experiments and a consistent source of knowledge about what works and what doesn’t in this context.


3. Experimenting with Governance Solutions

Introduction to innovative governance experiments in scaling coordination.

The landscape of public governance has witnessed a shift towards innovative experiments in scaling coordination. Traditional hierarchical structures and rigid frameworks are gradually making way for more dynamic and agile methodologies that allow organizations to adapt to challenges faster.

Even public sector organizations are exploring decentralized leadership models that empower teams to make decisions from the edges. This shift is fueled by a recognition of the limitations of top-down approaches and a growing understanding of the need for adaptability in complex, rapidly changing environments with rising degrees of uncertainty.

Some experiments involve the adoption of agile methodologies in public institutions. Originally designed for software development, agile methodologies like Scrum or SAFe have proven effective in enhancing organizational responsiveness and increasing focus on the “customer”. By fostering iterative and customer-centric approaches, public entities increase their ability to navigate uncertainties and deliver value efficiently.

Another area of exploration is the implementation of decentralized leadership, distributing decision-making authority throughout the organization. This approach fosters a culture of accountability and taps into the collective intelligence of teams, promoting innovation and adaptability. These approaches are usually very hard to implement for organizations with a long-standing hierarchy, which is often tied to pay grades. Nevertheless, even the military has successfully pushed the responsibility to where the rubber hits the proverbial road. See this TED talk by a nuclear submarine commander on how he successfully turned the military hierarchy on its head.

Voting mechanisms and consensus protocols

The most common voting mechanisms found in governance are:

  • Simple majority – the option with the most support wins. Very easy to understand, but it could mean that the majority of voters don't support the winning option. (Say three options A, B, and C exist and get 40%, 35%, and 25% support. 60% have not selected A, yet it is the winner of the vote.)
  • Instant Runoff Voting – Voters rank the options; the option with the fewest first-choice votes is eliminated and its ballots are redistributed to their next-ranked choices, repeating until one option holds a majority. Variants can be designed to select the Condorcet winner of a vote.
  • Approval Voting – Multiple options can be selected, and the option with the most votes wins. No elimination takes place. This reduces strategic voting and has been shown to be more inclusive of fringe voices.
  • Quadratic Voting – Each voter's effective voting power is the square root of the votes (or tokens) they commit, which lets a person express the strength of their preference by putting multiple votes behind an option. Quadratic Voting empowers smaller token holders in DAOs because it flattens the power-law curve of token distribution. It is notoriously susceptible to Sybil attacks, however. Gitcoin has gone to great lengths to reduce Sybil attacks through its decentralized identification passport system. (A small tally sketch follows this list.)
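To make the differences concrete, here is a minimal tally sketch in Python. It is illustrative only: the ballot format, option names, and token balances are made up, and real DAO tooling (Snapshot-style off-chain votes or on-chain governor contracts) implements these rules with far more safeguards.

    import math
    from collections import Counter

    # Hypothetical voters: each lists options from most to least preferred and holds
    # a made-up token balance (only used for the quadratic example).
    voters = [
        {"tokens": 400,  "ranking": ["A", "B", "C"]},
        {"tokens": 1600, "ranking": ["B", "A", "C"]},
        {"tokens": 900,  "ranking": ["A", "C", "B"]},
        {"tokens": 25,   "ranking": ["C", "A", "B"]},
    ]

    def simple_majority(voters):
        """Plurality: count first choices only, one vote per voter."""
        return Counter(v["ranking"][0] for v in voters).most_common()

    def approval(voters, approvals=2):
        """Approval voting: here we assume each voter approves their top two choices."""
        tally = Counter()
        for v in voters:
            tally.update(v["ranking"][:approvals])
        return tally.most_common()

    def instant_runoff(voters):
        """Eliminate the weakest option and redistribute its ballots until one option has a majority."""
        ballots = [list(v["ranking"]) for v in voters]
        while True:
            counts = Counter(b[0] for b in ballots if b)
            winner, top = counts.most_common(1)[0]
            if 2 * top > len(ballots) or len(counts) == 1:
                return winner
            loser = min(counts, key=counts.get)
            ballots = [[c for c in b if c != loser] for b in ballots]

    def quadratic(voters):
        """Token-weighted quadratic voting: each voter's power is the square root of their tokens."""
        tally = Counter()
        for v in voters:
            tally[v["ranking"][0]] += math.sqrt(v["tokens"])
        return tally.most_common()

    print(simple_majority(voters))  # A leads on first choices, but without a strict majority
    print(approval(voters))         # A is approved by every voter
    print(instant_runoff(voters))   # B is eliminated first and its ballot moves to A
    print(quadratic(voters))        # sqrt() dampens the 1600-token whale: A 50.0 vs B 40.0

Swapping in real ballots and balances is all it takes to see how the same electorate can produce different winners under different rules.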

Which polls and voting mechanisms a governance process uses ultimately also depends on the type of decision being made. Simple majority polls serve binary decisions (Yes/No) very well. Others might need IRV or approval voting to make the field more equitable for fringe options and to reduce tactical voting.

For further reading on this subject, please refer to this article:

Voting Systems | Simple Majority, Ranked Choice & Approval Voting. Plus, a guy called Condorcet. | by Raphael Spannocchi

The use of smart contracts in DAO governance

Smart contracts play a pivotal role in the governance of decentralized autonomous organizations (DAOs), providing a transparent and automated framework for execution.

Smart contracts execute predefined rules encoded into the system when specific conditions are met. In DAO governance, these self-executing contracts enable the allocation of resources and can trigger other actions without the need for intermediaries. The inherent transparency and immutability of smart contracts enhance the integrity of the governance process, reducing the risk of fraud or manipulation.

DAOs aim for automation and ossification to achieve robust, self-sustaining systems. Automation ensures that predefined rules are executed seamlessly, minimizing the need for manual intervention and reducing the surface for governance attacks.

This not only increases efficiency but also enhances trust. Ossification, referring to the stabilization of code and rules, contributes to the long-term stability of DAOs. Striving for a state of ossification means establishing a governance framework that resists unnecessary changes, promoting a more predictable and secure environment. As DAOs continue to mature, the strategic use of smart contracts aligns with the broader goal of creating decentralized, automated systems that can stay true to their explicit intent and deliver what they promised to their token holders with increasing efficiency.
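To make the execution side tangible, here is a toy Python model of the typical flow – propose, vote, queue behind a timelock, execute. It is a sketch under invented parameters (thresholds, delay, action format), not any specific DAO's contract; production systems add quorums, balance snapshots, and access control on top.

    import time

    class ToyGovernor:
        """Minimal governance flow: propose -> vote -> timelock -> automatic execution."""

        def __init__(self, pass_threshold=0.5, timelock_seconds=2):
            self.pass_threshold = pass_threshold      # fraction of cast votes needed to pass
            self.timelock_seconds = timelock_seconds  # delay between passing and execution
            self.proposals = {}

        def propose(self, pid, action):
            self.proposals[pid] = {"action": action, "for": 0.0, "against": 0.0, "passed_at": None}

        def vote(self, pid, weight, support):
            self.proposals[pid]["for" if support else "against"] += weight

        def queue(self, pid):
            """Mark a passing proposal as queued and start the timelock."""
            p = self.proposals[pid]
            total = p["for"] + p["against"]
            if total == 0 or p["for"] / total <= self.pass_threshold:
                raise ValueError("proposal did not pass")
            p["passed_at"] = time.time()

        def execute(self, pid):
            """Only executable after the timelock, which gives members time to react or exit."""
            p = self.proposals[pid]
            if p["passed_at"] is None or time.time() - p["passed_at"] < self.timelock_seconds:
                raise ValueError("timelock not elapsed")
            return p["action"]()  # the predefined rule runs with no intermediary

    gov = ToyGovernor()
    gov.propose("fund-grants", lambda: "transfer 10,000 DAI to the grants multisig")
    gov.vote("fund-grants", weight=600, support=True)
    gov.vote("fund-grants", weight=250, support=False)
    gov.queue("fund-grants")
    time.sleep(2.1)
    print(gov.execute("fund-grants"))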


How DAO Governance Experiments Can Scale Coordination – pt.2

If you haven’t read part one, please go ahead and read it here.

2. Challenges in Scaling Coordination

Inherent challenges in coordinating large-scale DAOs / Complexity of decision-making in decentralized organizations

The complexity of decision-making in decentralized organizations lies at the intersection of autonomy and interconnectedness. In these dynamic structures, decision-making is distributed among various nodes, fostering adaptability and resilience.

The main challenge arises from harmonizing diverse perspectives and ensuring alignment with overarching goals. Achieving consensus becomes a delicate dance, requiring mechanisms that balance individual autonomy with collective coherence. The distributed nature of decision-making demands robust communication channels, transparent processes, and a shared understanding of organizational values.

As nodes operate in interdependence, the intricate web of interactions introduces a layer of unpredictability. Decentralized organizations thrive on innovation and agility, but the complex interplay of decisions means that each action influences the larger picture, sometimes in non-linear fashion. Like the butterfly flapping its wings, a stray comment can sometimes tip the scales of a significant decision.

Navigating this complexity is not merely a theoretical exercise; it becomes the lifeline for individuals whose professional destinies hinge on the outcomes of these intricate organizational decisions.

Potential conflicts and coordination issues

The complexity of decision-making in decentralized organizations can give rise to various potential conflicts and coordination issues.

One significant challenge is the potential for divergent interests among decentralized nodes, leading to conflicts over priorities, resource allocation, and strategic direction. The autonomy granted to different units may result in inconsistent decision-making, hindering a cohesive organizational strategy.

Coordinating efforts across decentralized teams becomes an intricate process, and misalignments in goals or communication breakdowns can impede progress. Moreover, the lack of a centralized authority might create challenges in enforcing standardized policies or ensuring adherence to shared values, potentially leading to cultural clashes. Ongoing discussions about foundational values and a consistent process of alignment are paramount. I think Rune Christensen got that part exactly right in his Endgame vision for MakerDAO.

Rapid decision-making increases the risk of information asymmetry, where some nodes may be unaware of crucial developments, which can create unpleasant surprises or ill-informed decisions.

Striking the right balance between autonomy and alignment presents a perpetual challenge, as too much control stifles innovation, while too much anarchical autonomy risks fragmentation. Navigating these conflicts and coordination issues requires a sophisticated understanding of the organizational dynamics and the implementation of effective communication and decision-making frameworks.




How DAO Governance Experiments Can Scale Coordination

Introduction

Definition of DAO (Decentralized Autonomous Organization)

  • A DAO, or Decentralized Autonomous Organization, is an organization represented by rules encoded as a transparent computer program, controlled by the organization's members and not influenced by a central government. Essentially, it's a form of smart contract on a blockchain that operates without a central authority, relying on code and consensus among its members to make decisions. DAOs are often used for managing and governing decentralized projects, funds, or communities.

Importance of coordination in DAOs

  • In Social Choice Theory, coordination is paramount for the effective functioning of DAOs. DAOs often involve multiple decision-makers with diverse preferences and opinions. Without proper coordination mechanisms, decision-making can become chaotic and inefficient. Social Choice Theory teaches us that aggregating individual preferences into a collective decision is not straightforward. In the context of DAOs, where decisions are often made through consensus or voting mechanisms, coordination helps align the diverse interests of participants toward a common goal. Effective coordination in DAOs ensures that decision-making processes are coherent, transparent, and reflective of the collective will. It helps prevent conflicts, reduces the risk of manipulation, and fosters community and collaboration among DAO members. In essence, the study of Social Choice Theory emphasizes that the design and implementation of coordination mechanisms within DAOs are crucial for achieving collective decision-making that is not only fair but also robust and efficient.

Overview of the blog post content

  • We’re going to explore governance mechanisms that are currently in use, outline the challenges of scaling these coordination tools, and then come away with key takeaways that can inform a wider discussion on the subject.

1. Understanding DAO Governance

Explanation of DAO governance mechanisms

  • First, let’s delve into these fundamental DAO governance mechanisms:
    1. One Token – One Vote (1T1V):
      • In this system, voting power is directly proportional to the number of tokens a participant holds in the DAO. The more tokens you have, the more influence you wield in decision-making.
      • This straightforward mechanism aligns with the principle of financial stake determining voting power. However, critics argue it can lead to plutocracy, where those with more wealth have disproportionate control.
    2. One Person – One Vote (1P1V):
      • In contrast to token-weighted voting, 1P1V assigns equal voting power to each participant, regardless of the number of tokens they hold. This approach aims to ensure democratic and egalitarian decision-making.
      • It’s a conceptually simple system promoting equality, but it might not be ideal for decentralized networks where financial contributions often correlate with commitment and expertise.
    3. Quadratic Voting:
      • Quadratic Voting introduces a more nuanced approach. Participants receive a certain number of voting credits and can distribute them across different proposals. However, the cost of each additional vote for a single proposal increases quadratically (e.g., 1 vote costs 1 credit, 2 votes cost 4 credits).
      • This system allows participants to express the intensity of their preferences while avoiding the concentration of power in the hands of a few wealthy voters. It’s a balance between one token – one vote and one person – one vote.
    4. Delegation:
      • Delegation involves participants assigning voting power to another member or a representative entity. This can be particularly useful when participants lack the time or expertise to vote on every issue.
      • Delegation can enhance efficiency, as knowledgeable or trusted individuals can represent the interests of a broader group. However, the challenge lies in ensuring that delegates act in the best interest of their constituents.
    Each of these mechanisms comes with its advantages and challenges. The choice often depends on the DAO’s goals, values, and the level of decentralization and inclusivity it aims to achieve. A short sketch below puts numbers on the quadratic cost curve and on delegation.
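As a quick numerical illustration of points 3 and 4, here is that sketch – a few lines of Python with made-up names and balances, not any particular DAO's implementation:

    # Quadratic voting: casting n votes on one proposal costs n**2 credits,
    # so expressing a very strong preference gets expensive quickly.
    def quadratic_cost(votes: int) -> int:
        return votes ** 2

    for n in range(1, 6):
        print(f"{n} vote(s) cost {quadratic_cost(n)} credits")  # 1, 4, 9, 16, 25

    # Delegation: holders hand their voting power to a delegate they trust.
    # Names and balances below are hypothetical.
    balances = {"alice": 120, "bob": 75, "carol": 3000, "delegate_dana": 10}
    delegations = {"alice": "delegate_dana", "bob": "delegate_dana"}  # carol votes for herself

    def effective_power(balances, delegations):
        power = {holder: 0 for holder in balances}
        for holder, tokens in balances.items():
            power[delegations.get(holder, holder)] += tokens
        return power

    print(effective_power(balances, delegations))
    # {'alice': 0, 'bob': 0, 'carol': 3000, 'delegate_dana': 205}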

The importance of effective governance for successful coordination

  • Effective governance is crucial for successful coordination because it directly impacts the outcomes of collective decision-making processes. Social Choice Theory deals with aggregating individual preferences into a collective choice. Here’s why practical governance matters:
    1. Fair Representation:
      • Effective governance ensures that the preferences of all participants are adequately represented in decision-making. This is essential for fairness, preventing the dominance of specific interests, and ensuring inclusivity.
    2. Consensus Building:
      • Successful coordination often requires reaching a consensus among diverse stakeholders. Effective governance mechanisms facilitate the negotiation and compromise needed to achieve collective decisions acceptable to the majority.
    3. Avoidance of Manipulation:
      • Governance mechanisms should be designed to prevent manipulation or strategic behavior by individuals or groups. Social Choice Theory emphasizes the need for robust systems that resist manipulation and strategic voting tactics.
    4. Transparency and Accountability:
      • Transparent governance processes foster trust among participants. When decision-making procedures are transparent and accountable, individuals are more likely to adhere to the outcomes, reducing the risk of disputes and conflicts.
    5. Efficiency and Timeliness:
      • Well-designed governance structures streamline decision-making processes, making them more efficient and timely. This is particularly important in dynamic environments where quick responses to changing circumstances are necessary.
    6. Adaptability and Flexibility:
      • Social Choice Theory acknowledges that preferences and circumstances can evolve. Effective governance mechanisms should be adaptable and flexible, allowing for adjustments to the decision-making process as the needs of the community or organization change.
    7. Incentive Alignment:
      • Governance should align participants’ incentives with the organization’s or community’s overall goals. This encourages cooperative behavior and discourages actions that might harm the collective.
    8. Accounting for Diversity:
      • Successful coordination often involves dealing with diverse preferences, values, and perspectives. Governance structures should be designed to account for and respect this diversity, ensuring that minority views are not marginalized.
    In summary, effective governance aligned with Social Choice Theory principles is essential for creating a robust framework that fosters cooperation, fairness, and efficient coordination within a group or organization.

Exponential Growth? I have thoughts.

Ever since the start of what became known as the SARS-CoV2 Pandemic the word “exponential growth” has been all over the headlines. And it’s bad. Real scary.

“Just look at the curve, man! Pretty soon everyone will be…”, you know the drill.

After that, exponential growth was used to justify military aid to Ukraine ("Stop the exponential aggression of Russia") and, more recently, the growing concerns about AI ("God Mode AI").

Exponential graphs are scary because not only is there growth, but the speed of growth is constantly accelerating. The old fable of the king, the con-artist and the chess board is frequently mentioned to whip those in line who aren’t scared yet.

To recap: A king grants a con-artist a simple favor. The con-artist asks the king to take out a chessboard and put one grain of rice on the first square, two on the second, four on the third, and so forth. The king, calculus never being one of his strong suits, complies only to find out that all of his kingdom isn’t worth enough to buy that much rice.

BOOM! Told ya so! See, exponential growth is SCARY!

Well, it is! But it’s also exceedingly rare. Sustained exponential growth is impossible in the physical world, but it can and does occur in mathematical worlds, and some of those have massive real-world impact.

What do I mean by that? Let’s stick to the example of the king and the chess board. Let’s say the king really commits to the con-artist and doesn’t just chop their head off for being insufferable smart asses after square number 8 or 9.

Each square holds twice as many grains of rice as the one before it: one grain on the first square, two on the second, four on the third, and so on – 2^(n-1) grains on square n.

We’re still assuming the king is absolutely committed to fulfilling his offer, and time is no issue, because counting those damn grains is gonna take a loooong while.

We’re converting the grains to tons (yes 1000kg tons) of rice now, and assume each grain of rice weighs approx 0.05g on average, a number we got from the internet.

So at square 26 the king has to fork over 1.68 tons of rice already. It starts to get expensive. But again, time is no issue, the logistics are all sorted. The wonders of modern agriculture managed to produce 756,743,722 metric tons of rice in 2022. As you can see, at square #55 the poor king would already have to buy more than a full year of rice production worldwide. Ouch.
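For anyone who wants to check the arithmetic, a few lines of Python reproduce it (the 0.05 g grain weight and the 2022 harvest figure are simply the assumptions quoted above):

    GRAIN_WEIGHT_G = 0.05                # assumed average weight of one grain of rice
    WORLD_HARVEST_T_2022 = 756_743_722   # metric tons of rice produced in 2022 (figure quoted above)

    def grains_on_square(n: int) -> int:
        return 2 ** (n - 1)              # 1, 2, 4, 8, ... doubling on every square

    def tons_on_square(n: int) -> float:
        return grains_on_square(n) * GRAIN_WEIGHT_G / 1_000_000  # grams -> metric tons

    print(f"square 26: {tons_on_square(26):,.2f} t")  # ~1.68 t
    print(f"square 55: {tons_on_square(55):,.0f} t")  # ~900 million t, more than the whole 2022 harvest
    print(f"square 64: {tons_on_square(64):,.0f} t")  # ~461 billion t on the final square alone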

And here, I want to end the experiment. We can clearly see that the “exponential growth” would, even with the best of intentions, have likely ended somewhere around square #30, or at least slowed down considerably.

This dynamic is exactly what we always see in the natural world. Growth has limits! And that leads to an opposing force dampening growth acceleration at first and finally limiting it. The resulting curve is called a sigmoid function (the Gompertz function is one well-known variant).

A sigmoid has an upper limit: growth is somewhat close to exponential at the very beginning, then roughly linear, and finally logarithmic before coming to a standstill.

Understanding the dynamics of the sigmoid allows us to calibrate and predict with much higher accuracy than staying trapped in exponential scaremongering.

  • Is growth still accelerating? -> Okay, we’re right at the start of this.
  • Is growth linear? -> Somewhere in the middle; we might be able to make some educated guesses about where the limit will be.
  • Is growth already slowing down? -> Top signal. (The sketch after this list puts all three phases on one toy curve.)
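Here is that toy curve – a logistic (sigmoid) function in Python. The carrying capacity, growth rate, and midpoint are arbitrary numbers, not a model of any of the examples below:

    import math

    def logistic(t, capacity=1000.0, rate=0.5, midpoint=20.0):
        """Classic S-curve: near-exponential early, roughly linear in the middle, flat near the cap."""
        return capacity / (1 + math.exp(-rate * (t - midpoint)))

    for t in range(1, 41, 5):
        value = logistic(t)
        step_growth = value - logistic(t - 1)
        # growth first accelerates, peaks around the midpoint, then fades toward the limit
        print(f"t={t:2d}  value={value:7.1f}  growth in last step={step_growth:6.1f}")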

We started out with a couple of popular scares. Let’s briefly look into them.

  • Viral contagion -> Textbook sigmoid, the question becomes how many are susceptible to infection at all.
  • Military aggression -> Sigmoid with a low threshold. There’s a clear limit to how many tanks and soldiers any nation can have.
  • AI power -> Sigmoid. Processing capability is limited by chip supply. Neural nets have truly exponential costs of compute as they grow…

In my humble opinion, the poor exponential curve is the new scary ghost story. It mostly doesn’t exist if you shine a flashlight on it.

Where it does exist and has massive power is in finance. When growing a portfolio, the difference between a 2% and a 3% CAGR is massive over the years. Warren Buffett compared it to a snowball. The stickier the snow, the faster the growth.
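A quick check of that claim, over an arbitrary 30-year horizon:

    years = 30
    for cagr in (0.02, 0.03):
        print(f"{cagr:.0%} CAGR over {years} years: x{(1 + cagr) ** years:.2f}")
    # 2% compounds to about x1.81, 3% to about x2.43 - roughly a third more ending capital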

Or inflation. In the early 1920s, inflation in the Weimar Republic grew exponentially. Even then there was a soft dampening factor, because it was a cash-only economy and the printing presses took some time. In the current cash-less economy, this could happen at a much quicker pace. Now that’s something to worry about as governments worldwide push trillions upon trillions of newly created money supply into the markets.

Don’t believe the hype, as Public Enemy shouted from the roof-tops.

And embrace the Sigmoid!


Regulatory capture: the $181mm $BEAN hack brings hacking to DAO governance.

The heist

On 16 April 2022 hackers used a flash loan to purchase governance tokens for Beanstalk Farms. They used these tokens to vote with a supermajority (> 66%) on a proposal that granted them access to all the protocol’s funds. Shortly afterwards they completely drained Beanstalk’s smart contracts and managed to escape with $75mm in wrapped ETH.

The heist was a first of its kind. It didn’t exploit a technical vulnerability in a smart contract but focused on a weakness in the governance construct. I had seen similar attacks with smaller scope a short while before and expect to see others follow shortly. “Whatever can be thought once can be thought again”, as F. Dürrenmatt wrote in his seminal play “The Physicists”.

In this piece, I want to break down what happened step-by-step and identify key learnings for DAO governance. I will follow the trail of transactions, so this is a bit more technical. This piece was created as part of buildspace’s IRL hackathon in Amsterdam.


I’m working off the amazing tweets by Igor Igamberdiev, Xohn and PeckShield, so a huge shout out to them. Let’s dive in.

Getting technical

It continues to amaze me that even a completely public, transparent, and immutable ledger doesn’t keep criminal activity at bay. But it makes forensics easier and telling the story a lot more exciting because we have all the details available.

I’ll keep referring to the hackers in the plural, even though it’s completely possible that this was the work of a single person. First, I think this might just as well have been teamwork and second, it makes gender a non-issue.

The attackers used a flash loan to borrow $1bn in stablecoins (DAI, USDC and USDT) which were used to buy some $BEAN and $LUSD. If you want to see the transaction on the blockchain you can find it here on Etherscan.

Beanstalk heist transaction log.

Flash loans are unique to blockchains, or to smart contracts to be exact. A flash loan is when a borrower takes out a loan with no collateral and pays it back within the same transaction. If the loan is not successfully repaid, the transaction is reverted and no funds are lost. This minimizes the risk of defaults or illiquidity, making flash loans extremely cheap. Usually, no interest is charged, and a small fee (~0.09%) is paid instead.

The loan only has to be repaid by the end of the same transaction, and a single transaction (executed within one block, ~13 seconds on Ethereum at the time) can bundle many contract calls, so fairly complex operations are possible – which is exactly what the hackers took advantage of.
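A toy way to picture that atomicity in Python – this is not real lending-pool code (Aave's actual flash loans are Solidity callbacks with a fee check), just the "repay in the same transaction or everything reverts" rule:

    class RevertError(Exception):
        """Stands in for an EVM revert: the whole transaction is rolled back."""

    def flash_loan(pool_balance, amount, fee_rate, borrower):
        """Lend with no collateral; unless principal plus fee comes back, revert everything."""
        repaid = borrower(amount)                 # borrower does arbitrary work, then repays
        if repaid < amount * (1 + fee_rate):
            raise RevertError("not repaid within the transaction - state rolled back, nothing lost")
        return pool_balance + (repaid - amount)   # the pool keeps the fee

    # A borrower that repays principal + 0.09% fee (keeping any profit off-model): pool grows by the fee.
    print(flash_loan(1_000_000, 500_000, 0.0009, borrower=lambda x: x * 1.0009))

    # A borrower that cannot repay: the "transaction" reverts and the pool is untouched.
    try:
        flash_loan(1_000_000, 500_000, 0.0009, borrower=lambda x: x * 0.5)
    except RevertError as err:
        print("reverted:", err)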

Let’s get into the details of what happened.

Detailed log of the Beanstalk hack transactions. Thanks to PeckShield for laying it all out <3

The attackers created the Beanstalk Flashloan Exploiter smart contract. They took out flashloans of $350mm in DAI, $500mm in USDC and $150mm in USDT from Aave, as well as 32mm $BEAN.

They deposited the stablecoins into Curve’s 3Pool to get 980mm 3CRV, 15mm of which they exchanged for LUSD on SushiSwap.

965mm of 3CRV were converted to 795mm BEAN3CRV-f. The 32mm $BEAN were pooled with the $15mm in LUSD to get BEANLUSD-f. Whenever a deposit is made to one of these silos (pools) on Beanstalk, the depositor is credited with stalk and seed rewards and is allowed to participate in the governance system of the protocol. The system itself implements a lock mechanism so that the same units cannot be voted with more than once. The attackers deposited their LP tokens into Beanstalk and now had a supermajority of 70% of the voting rights, because of the sheer size of their holdings. A supermajority is needed to pass core protocol changes and trigger the emergencyCommit() function.

Now the good part

After this feat of number crunching came the good part, the artistic flourish: Beanstalk supports protocol upgrades via its Beanstalk-Improvement-Proposal (BIP) governance mechanism, making it possible for an upgrade to perform arbitrary code execution. The attackers used this to retrieve their locked funds as part of their malicious update.

The voting system of Beanstalk permitted votes to be cast retroactively on any active BIPs, thus allowing newly generated voting power to apply to BIPs proposed earlier. The attackers submitted two smart contracts as proposals, BIP-18 and BIP-19, ahead of time to satisfy the emergency commit time threshold of 24 hours. BIP-18 proposed to donate $250k in USDC to the Ukraine Crypto Donation. It also contained a call to an init() function on an address that had no code at that time. The second BIP (BIP-19) was the exploit, with a function labelled “InitBip18” to create a false trail.

The bad guys waited for the time threshold to pass and monitored whether anyone noticed the malicious contracts they had deployed. As soon as the time had passed, they took out the flashloans and acquired 70% of the voting power. Now they could trigger the emergencyCommit() function on the fake BIP-18 (really BIP-19), mint 36mm $BEAN, and drain all the wETH from the contract.

Next, they removed liquidity from Curve, repaid their loans on Aave, and sent the wETH to TornadoCash and the 250k USDC to the Ukraine Crypto Donation, before closing the transaction. Boom! In one fell swoop, Beanstalk was dead, the attackers made off with a huge 8-figure sum of ETH, and a comparatively small donation went to Ukraine.

Donation to Ukraine from the Beanstalk hacker.

What went wrong from a governance POV?

Beanstalk’s smart contracts were audited and no technical vulnerability was exploited. Instead, the attackers used a flaw in the governance logic of the protocol.

BEAN3CRV-f and BEANLUSD-f had just recently been added to the Beanstalk protocol with governance rewards. Uniswap checks if a transaction on an LP is already closed and doesn’t reflect the weight of holdings until they are fully committed. But Curve does not. This way the Curve LP tokens presented a flashloan vulnerability that Beanstalk was not responsible for, but could have checked.

Intimate knowledge of the way Beanstalk checks asset prices, as well as of how DEXs report holdings, was required for this sophisticated attack. Gigabrains and gigaballs, as crypto Twitter has written.

Gigabrain meme courtesy of NooNe0x.eth

Basic flashloan protection is simple to implement in Solidity. There are exactly 0 (in words: zero, lol) confirmations on an active flashloan.

// the voter's last action must predate the current block's timestamp
require(block.timestamp.sub(_lastActionTimestamp) > 0, "Flashloan");

is all that’s needed to prevent an unfinished transaction from participating in votes.

Flash loans allow attackers to get almost unlimited numbers of tokens, and when these can be converted into voting power, malicious proposals can be rammed through. Just ask Beanstalk.

Another big mistake was that retroactive voting was possible. This is not an easy fix, because voting power would have to be checked with snapshots at the time of each proposal, which is possible but requires larger amounts of processing and storage, both of which are very expensive on Ethereum. Off-chain solutions incur provider lock-in, and sometimes closed-source software usage.
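To see why snapshots matter, here is a small Python comparison with purely illustrative balances (real implementations checkpoint balances on-chain, e.g. per block, which is exactly what makes them expensive):

    # Hypothetical token balances at proposal time and inside the attack transaction.
    balances_at_proposal   = {"community": 70_000_000, "attacker": 1_000_000}
    balances_during_attack = {"community": 70_000_000, "attacker": 170_000_000}  # after the flash loan

    def vote_share(balances, voter):
        return balances[voter] / sum(balances.values())

    # Voting on live balances: flash-loaned tokens count and a supermajority is reached.
    print(f"live balances:     attacker controls {vote_share(balances_during_attack, 'attacker'):.0%}")

    # Snapshot voting: power is frozen at proposal time, before the flash loan existed.
    print(f"snapshot balances: attacker controls {vote_share(balances_at_proposal, 'attacker'):.0%}")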

Conclusion

Governance needs to figure out technical, game-theoretical, and political attacks before malicious actors do.

Since DAOs need to codify their governance mechanisms, these insights have to be reflected in the production code as well. Technical audits alone are not enough, but they should still be performed on a regular basis.

I’m not aware of anyone offering game-theoretical and political audits for governance. That’s a massive opportunity for builders. LFG!!

The Beanstalk hack was not the first attempt to use flashloans to buy voting power but the one with the biggest loot.

I fully expect attackers to search for other protocols with this vulnerability and exploit them in short order. If you, dear reader, work in governance, the onus is firmly on you to make sure you don’t suffer Beanstalk’s fate.

If you’re interested in thinking through game-theoretical aspects of your DAO or protocol please reach out. Always happy to help.

BUIDL!

What will the next crypto bear market look like?

  • The 2017 bear market was a multi-year market-wide downturn
  • Forward to 2021 and fundamentals are strong with some undervalued asset groups 
  • Investors should expect short market-wide bear-phases and brutal group-wide downturns 
  • The crypto market is no longer one cohesive entity. Invest accordingly

Context: the 2017 bear market

Those who have been in crypto for a few years still shudder when they think about the aftermath of the ICO craze. After an incredible run in the second half of November 2017, bitcoin’s price lost 80% within a year. The flash flood of retail money that went in on a get-rich-quick-spree dried up almost as fast as it had arrived. 

Many ICO projects with lofty claims, promising to “Disrupt the X trillion $ ABC industry” lost 90% in the first month and then another 90%. More than a few never made it back from the abyss, as investors pulled the plug and called it quits.

Builders with genuine intentions and good projects faced difficulty with lining up funding rounds. Community-operated cryptocurrencies like Ethereum or bitcoin saw miners leave in droves and network security tank.

The gold-digger mentality of the past year left a disgusting aftertaste and made investors and developers give a wide berth around anything labelled crypto. 

The market stayed depressed for more than a year and only started to pick up again at the end of March 2019. It would take until November 2020 until bitcoin’s price would surpass the peaks of its 2017 bull run.


Bitcoin price during the 2017 bull market and its aftermath. Source: tradingview.com

Why the sad faces?

The main reason for the 2018 crypto market depression is that most projects were vaporware. ICOs raised hundreds of millions with a whitepaper or just a website full of ludicrous claims.

Retail investors were happy to keep pumping bags as long as the number went up but got cold feet as they realised that many teams consisted of a web developer, five marketers, and little else. Distrust and disappointment fed upon each other as cryptocurrencies became notorious for being scams. A look at the total market capitalisation of the crypto market excluding bitcoin shows how severe the 2018 downturn was.

Total crypto market capitalisation excluding bitcoin. Source: coinmarketcap.com

$75bn in October 2017 turned into $50bn by mid-2018. The market cap before the ICO boom was reached again only in September 2020. Retail investors had bought heavily into an already frothy, overpriced market and got slaughtered as rational expectations returned and market participants saw that the king was without clothes.

Without a crystal ball to predict the future, it is best to look at fundamentals to gauge the stage of the market cycle. While no one can predict the movement of a market, there are clear indicators for caution that investors need to heed.

Fundamental difference

A fundamental approach to valuing the crypto market is complex because cryptocurrencies are more like emerging currencies than growth stocks. Numbrs research has an excellent piece on this.

On-chain analysts like Willy Woo or David Puell use a combination of indicators to get a clearer outlook. 

Bitcoin’s MVRV z-score, developed by Nic Carter et al., was a very accurate indicator for market tops in the past. Currently, at 2.5, this metric is far from market bottoms and nowhere near tops of 6 or above, last seen before the May 2021 sell-off, when bitcoin lost 50% of its value. It is also a far cry from the pre-2018-crash peaks of 9.

Bitcoin market value to realised value per standard deviation of market capitalisation. Source: lookintobitcoin.com
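For the curious, the z-score itself is simple to compute once you have market-cap and realized-cap series. The sketch below uses made-up numbers and the commonly used definition, with the standard deviation taken over the whole market-cap history:

    from statistics import pstdev

    # Made-up daily series in USD; real data would come from an on-chain data provider.
    market_cap   = [400e9, 520e9, 610e9, 760e9, 900e9, 1_050e9]
    realized_cap = [310e9, 330e9, 355e9, 380e9, 405e9, 430e9]

    def mvrv_z_score(market_cap, realized_cap):
        """(market value - realized value) / standard deviation of market value."""
        return (market_cap[-1] - realized_cap[-1]) / pstdev(market_cap)

    print(round(mvrv_z_score(market_cap, realized_cap), 2))  # ~2.8 with these toy numbers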

Another intriguing piece of data is the amount of funding for blockchain projects. CBInsights reports a 384% increase when comparing the first nine months of 2021 to 2020. Not retail investors, but VCs and other professionals with a longer time horizon fund most projects.

Blockchain funding at record highs in 2021. Source: Twitter

Compared to 2017 ICOs, today’s crypto projects deliver. Decentralised finance now has more than $300bn under management, and projects like SushiSwap or Abracadabra.money have price-to-sales ratios of ten or lower.

Meanwhile, play-to-earn games like Axie Infinity open up whole new economies for thousands of gamers through secondary markets for in-game items and characters.

The fundamental difference between 2017 and 2021 is:

  1. Bitcoin’s price is high but not frothy
  2. Decentralised finance is still vacuuming up assets.
  3. Play-to-earn seems to be just getting started
  4. VC interest in funding blockchain companies is growing

Total market capitalisation has increased from $750bn at the peak of 2017 to $2.8tn in November 2021. The size of the market points to a different scenario for the “next crypto winter”.

Every man for himself

A closer look at the dynamics of the last year reveals that the “crypto market” doesn’t behave like a singular entity anymore. Compared to the price of ETH, DeFi tokens got pummeled, with $SUSHI down 80%, while play-to-earn token $AXS skyrocketed +4242% since the beginning of 2021.

The crypto market has started to differentiate. While tail risks like a Tether implosion or a regulatory crackdown would affect the market as a whole, investors should begin to approach crypto investments like macro or stock investing. Careful research and evaluation of individual projects is likely to yield better results than blanket purchases of the whole market.

Bundling assets into groups and purchasing 3-5 top projects is a good way to diversify. If exposure to play to earn games is desired, investors could buy $AXS, $YGG and $GALA. A conviction that DeFi tokens are undervalued could lead to purchasing $SUSHI, $SPELL and $OHM. Specific category indexes like the DeFi Pulse Index or the Metaverse index by IndexCoop allow purchasing of managed groups.

Or take the recent rise of Metaverse coins after Facebook rebranded itself as Meta. Tokens like $SAND outperformed the entire market measured in YTD gains within weeks.

Metaverse Index 3M performance. Source: indexcoop.com

Conclusion – This time it’s different

A 2017-style wipeout of all crypto assets across the board is highly unlikely this time. “History doesn’t repeat itself, but it rhymes,” as the saying attributed to Mark Twain goes.

Instead of multi-year downtrends, we will see some group-agnostic bear phases of the whole crypto market with quick recoveries, as investors see that the grass isn’t greener anywhere else. And we’re almost destined to see brutal downdrafts of asset groups as crypto investors take profit and move elsewhere. 

Dips are opportunities to enter assets or sectors that would otherwise look overvalued, as this beautiful research by Galaxy Digital shows.

6M and 12M returns following bitcoin downturns. Source: Galaxy Digital

Buying the dip is not just a slogan in crypto; it makes good investing sense, primarily for drawdowns of -15% to -40%. To conclude: Onwards, to more dips!


What is Optimistic Ethereum?

So what is Optimistic Ethereum?

Does it mean blocks are considered “half-full” when at 50% capacity?

Optimism tries to solve one of Ethereum’s biggest challenges: the speed and cost of transactions.

Currently Ethereum can process fewer than 30 transactions per second, compared to 50,000+ for Visa.

Space in each block is auctioned off, causing urgent moves to send tx fees upwards of $1,000 in rare cases, and routinely above $20.

Optimism builds on top of the current Ethereum chain (called Layer 1 or L1). So it’s not a hard fork or an alternate cryptocurrency.

Instead it employs so-called validators. Validators group and process multiple transactions and publish a result on the Ethereum chain. So they are transaction speed multipliers (and fee dividers 😉 )

But how can we be sure that the validators perform what they say they do and don’t cheat?

Other validators check the output, and if a validator misbehaves it gets excluded from further validation. Correct validation, in turn, awards a fee to the validator, creating an economic incentive.
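A toy model of that publish-then-check game in Python – nothing like the real rollup machinery, just the incentive structure described above, with invented numbers:

    def apply_batch(state, batch):
        """Stand-in for executing a batch of transactions: here, just summing balance deltas."""
        return state + sum(batch)

    def settle(state, batch, claimed_result, validator_stake, fee=1):
        """Accept the claimed result unless another validator's recomputation shows it is wrong."""
        recomputed = apply_batch(state, batch)        # any other validator can recompute the batch
        if recomputed != claimed_result:
            return recomputed, 0                      # cheater loses its stake / is excluded
        return claimed_result, validator_stake + fee  # honest work earns the fee

    print(settle(state=100, batch=[5, -2, 7], claimed_result=110, validator_stake=50))  # (110, 51)
    print(settle(state=100, batch=[5, -2, 7], claimed_result=999, validator_stake=50))  # (110, 0)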

The reason this is called Optimism is that published results are optimistically assumed to be valid unless someone challenges them. The residual risk is that large-scale collusion between bad-faith validators could pull the roof down (over their own heads, but it has happened before).

For more detailed info check out this article that explains the tech well: https://tinyurl.com/kj6uczb6.


Avalanche – should Ethereum brace for impact?

TL;DR

Avalanche is a Proof of Stake (PoS) blockchain network developed by Ava.Labs, under CEO Emin Gün Sirer. Sirer is an associate professor of computer science at Cornell University and co-director of IC3, the Initiative for CryptoCurrencies and Contracts.

Avalanche employs a unique consensus algorithm that allows rapid finalization of transactions.

The Avalanche network can now process more transactions per second than Visa and is highly decentralized, with more than 4000 validators spread around the globe.

It hosts a very active developer community, with more than 300 projects currently being built on top of the platform. Avalanche can bridge multiple blockchains by allowing other tokens to be wrapped and processed on its native ledger – an approach similar to what Polkadot achieves with parachains, although the technical implementation is vastly different.

The Avalanche community seems to be made up of unusually level-headed individuals who rarely engage in maximalism and bashing. While this attracts teams that want to build valuable projects, mostly in the De-Fi space, it also generates fewer headlines and keeps the project a dark horse and possible runner-up.

The technical foundation is exciting and an active community is certainly worth a lot.

We will see in the next two years whether Avalanche is truly moving mountains or is just a happening in a remote valley.

What’s special about Avalanche?

At the heart of every blockchain is the mechanism that governs which transactions to confirm and which to reject, known as the consensus algorithm. Bitcoin uses what is now known as Nakamoto consensus, where miners are incentivized to inspect transactions carefully and reject hostile or faulty additions to a block. If they do include these, the resulting block will not be added to the main blockchain, and their block reward is rendered worthless.

Avalanche takes another approach. They call it Snowflake – because a lot of snowflakes can make up an Avalanche (duh!).

Snowflake works like this: A validator receives a transaction and polls 5-10 other validators from a list at random: Should the transaction be approved or not?

The validator then takes the side of the majority of poll results. On and on it goes, as the snowflake expands its branches and connects to other snowflakes.

This leads to a high probability of a finalized transaction within just a few rounds and near-certain finality after about 13 rounds of polling, usually in less than one second.
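A rough Monte-Carlo sketch of that repeated-sampling idea in Python; the sample size, validator count, and round count are invented for illustration and are not the real Avalanche parameters or its safety analysis:

    import random

    def poll(network_prefs, sample_size=7):
        """One validator polls a few random peers and adopts the majority answer."""
        sample = random.sample(network_prefs, sample_size)
        return sum(sample) > sample_size / 2   # True = approve, False = reject

    def simulate(n_validators=1000, initial_approve=0.55, rounds=13):
        prefs = [i < n_validators * initial_approve for i in range(n_validators)]
        for r in range(rounds):
            # every validator re-polls and flips to whatever its sample's majority says
            prefs = [poll(prefs) for _ in prefs]
            share = sum(prefs) / n_validators
            print(f"round {r + 1:2d}: {share:.1%} approve")
            if share in (0.0, 1.0):
                break

    random.seed(7)
    simulate()  # a slight initial majority snowballs to unanimity within a handful of rounds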

The algorithm’s biggest drawback is that double spends can remain undecided. This is bad for the double spender, since the transaction doesn’t clear. Instead, the funds remain locked and are ultimately lost to the double spender.

Here’s a link to Gün Sirer explaining it in detail.

From Snowflake to Avalanche – Emin Gün Sirer

Avalanche is a Proof of Stake blockchain. This means no specialized mining hardware burning terawatt-hours of energy is required. In contrast to some other PoS chains, Ava.Labs emphasizes that validating can be done on inexpensive, off-the-shelf hardware – even smartphones. This means Avalanche is as green as blockchains get, and as easy to keep democratized and decentralized.

At the same time, the protocol can scale to more than 10 million nodes and has been shown to offer capacity for more than 11,000 transactions per second, engaging the more than 4,000 validators online now – on par with Visa and Mastercard.

The next interesting technical innovation is that Avalanche allows tokens of other chains, and even smart contracts to run on Avalanche. The detailed technical background can be found here: The Avalanche-Ethereum Bridge: What You Need to Know | by Avalanche | Avalanche | Medium

Avalanche also supports multiple so-called virtual machines. This means dApps relying on other blockchains can be migrated to Avalanche and profit from the speed of the platform while maintaining desired functionality.

Developers can launch their own unique blockchains, with specific feature sets, as part of Avalanche, profiting from the security and scale of a global distributed ledger while being able to implement custom features. This can include permissioned chains, where only some users are allowed to inspect certain transactions, a frequent prerequisite for enterprise adoption.

The ecosystem

Emin Gün Sirer became somewhat famous during the DAO hack that nearly took down Ethereum in 2016. One of Sirer’s students was the first to point out the flaw in The DAO’s smart contract, now known as a reentrancy attack. Sirer looked at the code and convinced the student that while the problem existed, it was unlikely to be exploited, a mistake that ultimately led to Ethereum’s split into ETH and ETC.

Rattled, Sirer doubled down on his research and decided to head the development of what is now Avalanche. He and his team, with the impact of The DAO’s demise still prominent in their memory, refused to launch their token, Avax, during the heyday of the 2017/18 crypto boom. They wanted to get it right this time.

Ultimately Avax launched in a crowd sale in June 2020, and was quickly sold out. It now trades on multiple international exchanges like Binance and Gate.io.

Thanks to clear, no-bullshit communication, thorough documentation, and a low barrier to entry, Avalanche has attracted developers since its initial release in 2018, long before the token launch. Currently more than 300 projects build on the platform, and a quick glance at the Discord chat reveals intense technical discussion and a vibrant exchange often missing in other projects.

The token

The Avalanche network’s native token is called Avax. It has a maximum supply of 720 million tokens. Half of the tokens were minted in the genesis block, and the other half is released according to a distribution curve specified in the white paper here.

Validators need to stake 2,000 Avax to be able to vote on transactions. This compares to the 32 ETH needed to run a staking node on Ethereum 2.0. Validators in Avalanche are not slashed when they misbehave, so funds are never at risk. Staking rewards are around 11%.

Further, some Avax is burnt for transaction fees, minting fees, and the creation of subnets and subchains. This means that Avax is actually deflationary. Burning also creates another incentive for staking, as the stake becomes more and more valuable, in addition to growing through staking rewards.

Avax was sold for 0.50 USD in the pre-sale, when agreeing to a lock-in of 6 months. The price experienced a wild ride up to almost 60 USD around February ‘21 and, as of this writing at the beginning of July ‘21, is hovering around 12 USD.

So where’s the catch?

As you can see, Avalanche offers unique technology: fast, cheap transactions and massive scalability, low energy use, and innovations such as interoperability with other chains, which is a great asset for De-Fi developers.

Ava.Labs managed to activate and engage a lot of development effort, especially in the De-Fi and enterprise space.

However, Avalanche never grabbed much of the spotlight in comparison to projects like Polygon/MATIC or Polkadot. This is partly due to the quiet, no-bullshit attitude the whole organisation projects. While this is a good thing, it could also be an impediment to further adoption. Without the buzzy projects that Polygon and especially Ethereum offer, the question remains how Ava.Labs will drive widespread adoption.

This is not necessarily a reason to pass on Avalanche. Like with the tortoise and the hare, continuous, steady execution with good results can win over flashy competitors.

And Avalanche has its share of great projects, like Pangolin swap or the Snowball yield farm. I’m cautiously optimistic about Avalanche’s future, especially if the team manages to engage end users better and create a more profound momentum for adoption.


Moving Bitcoin from a paper wallet to an exchange

Right now the Bitcoin (BTC) price is skyrocketing! Amazing if you own only a piece of a coin. Maybe you were smart and stored the coins in cold storage, the least expensive and one of the most secure options being a paper wallet. I recommend the ones from Bitcoin Paper Wallet. The guy just nails it: super comfortable to set up, easy to print, lots of no-fluff info, and he even responded within a day when I thanked him profusely.

So, maybe now you’re thinking it’s time to move those coins to an exchange to profit from the boom, before the inevitable bust, and then buy more once the bust is over. So you want to move the BTC to an exchange, to cash those lovely Bitcoins out at a prime rate, right?

I personally like and use Kraken. Fantastic exchange, just rock-solid. I even invested a tiny bit in their crowd-funding 2 years ago.

To get the Bitcoins from your paper wallet to an exchange you have to sweep the paper wallet. While you could type the private key, I recommend using an app like Coinomi that uses your phone’s camera to scan the QR code your paper wallet probably has.

Install Coinomi on your phone, create a wallet, and really write down and verify the recovery phrase!! Please!! Then select Bitcoin, as well as any Bitcoin hard forks you might hold. Then go to the menu on the top left and select your Bitcoin wallet. In the upper right corner select the 3 dots and then choose sweep wallet. Coinomi will ask to access your camera if you just installed the app. Select OK and scan the QR code of your paper wallet.

Great!! Now Coinomi has access to the BTC stored at your address on the paper wallet. Transfer the part of your coins you plan on cashing out to a deposit address on the exchange of your choice. To store the rest please create a new paper wallet. Just for safety.

Good luck and lots of profit!


SEO optimization for Shopify stores [Beginners]

I’ve been busy building janori lately, a cool webshop where you can buy vegan chocolate, organic coffee, and delicious snacks produced by regional heroes. The shop has been up and running since November 11th, and so far we’re doing fine. Just recently I noticed that we aren’t ranking on Google – at all!

Woe is me! I thought, but then went off on an interwebz truffle hunt to find good info on how to optimize my Shopify store. I diligently watched Neil Patel’s course and some of Brian Dean’s output. Both are great, explain the process of SEO well, and gave me a good foundation in what keyword research is and how it’s done.

But how do I get my actual products, and my site to rank?

I found two videos that really help. This one is the best: Mr. Kanase really breaks it down into totally relatable and actionable steps that made me feel I could start straight away.

Then I found another one, nearly identical in scope but with some extra tips, that also made me see that SEO isn’t an exact science and that “all roads lead to Rome”. Very good for beginners like me. This really helps not to get stuck in analysis paralysis.

The basic methodology is to use the Keywords Everywhere plugin in Google Chrome and look up your product’s name. Then find some keywords that have relatively high search volume with not too high competition (search difficulty), and use those in the title, especially the meta title, in the description, and in the alt text of images.

At the same time it’s important to write actually useful, ideally long, product descriptions. That’s a fun writing exercise, if I ever saw one!

I’m just starting to implement the process, and will let you know the progress I made with janori.