Jun 23, 2025

Future Forward: Key Themes from the Owl Explains Crypto Summit

By The Owl

What better place to explore the future than a setting steeped in the past? Against the backdrop of the Dorchester Hotel—an iconic London venue rich with history and elegance—the first Owl Explains Crypto Summit brought together a dynamic mix of policymakers, technologists, legal minds, and industry leaders to tackle some of the most forward-looking questions in crypto and digital markets. The turnout was strong, the energy high, and the conversations—both on and off stage—substantive. This wasn’t a day of soundbites or sales pitches!

So in between the delicious food (miniature vegan lemon meringue pie, yes please), picking up a complimentary professional headshot from Van Scoyoc Associates, and enjoying the contents of the OE swag tote, our owlet attendees took in a range of panels delving into topics including privacy, liquidity, global commerce, autonomous code, anti-money laundering, and tokenization.

Big Picture Perspectives

Sprinkled delightfully among our roundtables were keynotes from three speakers whose leadership continues to shape digital policy at the highest levels: Lord Holmes (UK House of Lords), Peter Kerstens (European Commission), and MEP Ondřej Kovařík (European Parliament). While occupying very different roles in the policy ecosystem, all three spoke to the power of blockchain and digital assets to enhance the global financial world of tomorrow.

Lord Chris Holmes emphasized the need for thoughtful regulation of emerging technologies and called for a cross-sectoral AI framework—highlighting both innovation and social inclusivity, especially for sensory-impaired communities. Peter Kerstens, “the father of MiCA”, used his keynote to underline Europe’s new crypto-assets framework and urged developers not to wait for prescriptive regulation, but to innovate, demonstrating in practice how the rules can be shaped and applied. MEP Ondřej Kovařík offered a forward-looking view on MiCA implementation and its broader implications for the European crypto ecosystem, emphasizing the importance of ensuring a smooth and coordinated rollout of the new framework.

With those big-picture perspectives anchoring the day, we can now zoom into the practical, the technical, and the sometimes provocative. Across six expert-led roundtable sessions, attendees had the chance to get stuck into the details: asking hard questions, sharing lived experience, and debating what’s really needed to take this industry from potential to practice.

Roundtable Session 1: Tokenizing It All

The summit’s first roundtable, Tokenizing It All, explored the implications of a fully tokenized world where stablecoins are commonplace, with panelists Helen Disney, Sean McElroy, Yuliya Guseva, Jannah Patchay, Varun Paul, Isadora Arredondo, and Kene Ezeji-Okoye. The discussion delved into the fundamentals: what it means to tokenize something, the practical challenges and opportunities of tokenizing various asset classes (including Sean’s apartment!), the role of regulation, and the potential impact on commerce and trading.

Roundtable Session 2: DeFi-ing Liquidity

The second session, DeFi-ing Liquidity, examined the dynamics between decentralized and centralized finance in providing market liquidity. Panelists Fahad Saleh, Lavan Thasarathakumar, Joey Garcia, Dan Gibbons, David Wells, Sara George, and Olta Andoni had an animated discussion, highlighting the benefits and risks associated with DeFi, the need for regulatory clarity, and the future of liquidity provision in a tokenized economy. There was even a sprinkling of friendly feather-ruffling as the question of definitional prowess between academics and lawyers came to a head!

Roundtable Session 3: Globalizing Commerce

If the audience were hungry, they weren’t letting it show. The high spirits continued into the final panel of the morning, which addressed the complexities of global commercial structures in the context of tokenized assets. Panelists Yesha Yadav, Erwin Voloder, Scott Mason, Sam Gandhi, Emma Pike, Dagmar Machova, Ari Pine, and Amanda Wick discussed jurisdictional challenges, the convergence of commerce and trading, and the legal implications of cross-border transactions in a blockchain-enabled world.

Roundtable Session 4: The Chase is On

The afternoon discussions kicked off with a great panel looking into enforcement, litigation, and anti-money laundering in the realm of tokenized and decentralized finance. Our expert panel featuring Justin Gunnell, Christopher Mackin, Sayuri Ganesarajah, Joanna F. Wasick, Laura Clatworthy, Isabella Chase, Joe Hall, and Jesse Overall shared insights on tracking illicit activities, the role of international cooperation, and the evolving legal landscape in digital finance. They also touched briefly on the rise of wrench attacks, which involve real-world violence targeted at individuals for their digital assets - reminding our audience that the digital and physical worlds are now inextricably linked.

Roundtable Session 5: When We Need Secrets

The fifth roundtable raised some interesting Nuggets (!) on privacy and identity in a fully tokenized and decentralized market. Speakers Seema Khinda Johnson, Dr. Agata Ferreira, Adam Jackson, Eugenio Reggianini, Adi Ben-Ari, Peter Freeman, and Chris Grieco debated the challenging balance between giving citizens control and privacy, and combating fraud. They discussed the development of digital identity solutions, and the ethical considerations of data protection in blockchain applications.

Roundtable Session 6: Code Running Solo

Last but by no means least, the final session, Code Running Solo, explored the intersection of cybersecurity, artificial intelligence, and autonomous code in tokenized markets. Panelists Lilya Tessler, Norma Krayem, Laura Navaratnam, Fabian Schär, Eva Wong, Joni Pirovich, and Caroline Malcolm examined the challenges of securing decentralized systems, the implications of AI-driven decision-making, and the role of regulation (on which opinions differed wildly!) within that.

Looking Ahead: A Community with Purpose

As Wee Ming Choon took to the stage to close out the first-ever Owl Explains Crypto Summit, the mood was buoyant, especially for a conference ending after 6pm! This wasn’t just a policy event—it was a community coming together to explore real questions about how our digital future is taking shape.

One topic that kept coming up was regulation. Are regulators getting it wrong because they have turned their backs on technology and competition? Or is that a mischaracterization of the role of regulation, with incentives in fact working against regulators and promoting continuity of the status quo? Our panels on liquidity and autonomous code in particular discussed this at length. While this is not a debate that can be settled easily, creating platforms where smart and articulate individuals with a range of views and experience can air them can only be a step toward answers.

Our Owl Explains parliament then retired to the drinks reception, brains fizzing with a heady mix of topics, from tokenization to AI, from privacy to liquidity—all conversations that didn’t shy away from complexity. And that’s exactly what made them so valuable. In the words of Owl Explains founder Lee Schneider, “I came away with a really positive sense that we will change the world.” It was a sentiment shared by many in the room: pride in what’s been built, and excitement for what comes next.

Articles

2026-04-21

The Importance of the Nature of the Asset Under Recent U.S. Regulatory Guidance

Over the past year, U.S. regulators have taken important steps toward clarifying how tokenized assets fit within existing legal and regulatory frameworks. While these efforts come from different agencies and address different use cases, a consistent theme has emerged: The nature of the asset, not the technology used to represent it, should determine its regulatory treatment. This principle sits at the core of the Avalanche Policy Coalition’s (APC) work on token classification. Encouragingly, recent guidance from the SEC, federal banking agencies, and the CFTC reflects a growing convergence toward this approach.

The Guiding Principle for Token Classification

At APC, we have consistently emphasized that effective policy and regulation begins with a simple but powerful idea: The nature of the asset matters. The functions and features of the asset, regardless of how it is represented, are paramount for determining utilization, valuation, risk profile, and legal classification. This is not just a regulatory concept—it reflects how markets, businesses, and individuals already understand and interact with assets in every other context. When we think about how an asset is used, what its value is, or what risks it presents, we naturally focus on the bundle of rights that make up the asset. Tokenized assets should be no different.

Our work has highlighted concrete examples that illustrate why classification must follow function and substance, not technology. For example:

- A tokenized share of stock, whether held through traditional infrastructure or represented on a blockchain, remains an equity security. It reflects ownership in a legal entity and should be regulated as such.
- A tokenized money market fund, such as those issued by major financial institutions and deployed on blockchain infrastructure, is still a fund product. Its risk profile, investor protections, and regulatory treatment should derive from that underlying reality, rather than from the fact that it settles on-chain.
- A concert or sporting event ticket represented as an NFT is fundamentally a consumer item, not a financial instrument. Treating it like a security simply because it exists on a blockchain is a category error.
- A loyalty reward or in-game asset, such as points or digital items used within a platform, functions as part of a closed ecosystem. These are not investments in a business enterprise or part of a company’s capital stack and should not be regulated as such.

We discuss these concepts in more detail here, here, and here.

This is exactly how markets already operate. The utilization of an asset depends on its nature—a stock conveys ownership rights, a ticket grants access, and a loyalty point provides benefits within a program. The same is true for valuation: investors value equities based on earnings and growth prospects, while collectibles or tickets derive value from scarcity, demand, or utility. When policymakers fail to make these distinctions, the result is overbroad definitions, regulatory confusion, and unintended consequences. When they get it right, the result is a framework that is easier to understand, easier to apply, and better aligned with actual risks. The common thread is simple: Tokenization does not change what an asset is. Recent regulatory developments suggest that U.S. agencies are increasingly adopting this same paradigm.

Recent Regulatory Guidance Applies the Nature of the Asset Principle

1. CFTC Guidance on Tokenized Collateral

In December 2025, the CFTC issued staff guidance on the use of tokenized assets as collateral in derivatives markets. The guidance addresses how futures commission merchants (FCMs), clearinghouses, and other market participants should evaluate tokenized collateral under existing margin and risk management rules. The CFTC emphasizes that tokenization is simply a technological wrapper: The use of distributed ledger technology does not, by itself, change the fundamental characteristics of an asset. As a result, the key considerations are familiar ones:

- Does the tokenized asset provide legal and economic rights equivalent to the underlying asset?
- Is the asset liquid and reliable enough to serve as collateral?
- Can the collateral be safely held, controlled, and transferred?

The guidance encourages firms to focus on assets that are already eligible as collateral in traditional form, such as high-quality government securities, and to evaluate tokenized versions within existing risk frameworks. It also highlights several operational and legal issues that must be addressed, including:

- Custody and segregation of tokenized collateral
- Legal enforceability of claims
- Cybersecurity and access controls
- Settlement timing and operational resilience

Finally, the CFTC notes that standard margin haircuts and risk assessments can be applied, with adjustments only where the tokenized format introduces new or different risks.

2. SEC Guidance on Tokenized Securities

In January 2026, the SEC’s Division of Corporation Finance issued a statement addressing how federal securities laws apply to tokenized securities—that is, traditional securities represented using distributed ledger technology. The statement begins by clarifying that tokenization can take multiple forms. In some cases, an issuer may issue a security directly on a blockchain, recording ownership and transfers on-chain. In others, a third party may create a tokenized representation of an existing security, such as a token backed by shares held in custody or a synthetic instrument linked to the value of the underlying asset. Despite these structural differences, the SEC’s core message is clear: Tokenization does not alter the legal nature of the underlying instrument. Whether a security is recorded in a traditional book-entry system or on a blockchain, it remains subject to the same securities-law framework. The analysis turns on the rights conveyed, the structure of the product, and the obligations of the parties involved, not on the technology used to implement it.

The statement also highlights important practical considerations, including:

- The need for clear disclosure around how tokenized products are structured
- The role of custodians and intermediaries in holding underlying assets
- The potential differences between issuer-sponsored tokenization and third-party tokenization models

In short, the SEC is signaling that tokenization is permissible—but must fit within existing securities-law principles.

3. Federal Banking Agencies’ Capital Treatment of Tokenized Securities

In March 2026, the Federal Reserve, OCC, and FDIC jointly issued guidance on how bank capital rules apply to tokenized securities. The guidance, framed as a set of FAQs, addresses how banks should treat tokenized securities for purposes of:

- Risk-based capital requirements
- Market risk capital rules
- Collateral eligibility

The agencies explicitly take a technology-neutral approach. They state that the use of distributed ledger technology to issue or represent a security does not, in itself, change how that asset should be treated under bank capital rules. Instead, the key question is whether the tokenized instrument:

- Confers legal rights identical (or substantively equivalent) to those of the traditional security, and
- Exposes the institution to the same underlying risks

Where those conditions are met, the agencies conclude that:

- Tokenized securities should generally receive the same capital treatment as their non-tokenized equivalents
- They may qualify as financial collateral under existing rules
- Standard haircuts and risk weights can be applied

The guidance also makes clear that banks must still consider:

- Operational risks, including settlement and custody mechanisms
- Legal enforceability of tokenized claims
- Liquidity characteristics of the tokenized asset

Importantly, the agencies do not distinguish between permissioned and permissionless blockchains, reinforcing the principle that the focus should be on risk and rights—not infrastructure design. The banking agencies agree with the SEC: tokenization does not alter the legal nature of the underlying instrument.

4. SEC Broker-Dealer Capital Treatment of Stablecoins

In February 2026, the SEC’s Division of Trading and Markets updated its FAQs addressing how broker-dealers may treat certain stablecoins under the net capital rule (Rule 15c3-1), specifically in FAQ 5. The guidance focuses on payment stablecoins—stablecoins designed for use as a medium of exchange and backed by high-quality reserves. Under the FAQ, the staff states that it will not object if a broker-dealer:

- Treats a proprietary position in a qualifying payment stablecoin as having a “ready market,” and
- Applies a 2% haircut, consistent with certain low-risk, liquid assets

However, this treatment is not automatic. It is conditioned on the stablecoin meeting specific criteria, including:

- Backing by high-quality, liquid reserves
- Clear and enforceable redemption rights at par
- Regular public disclosures and independent attestations
- Alignment, where applicable, with the statutory framework under the GENIUS Act

The guidance is intentionally narrow. It does not apply to all stablecoins, and it does not attempt to classify stablecoins more broadly under securities laws. Instead, it addresses a practical issue: how broker-dealers can incorporate certain stablecoins into their operations without facing disproportionate capital charges. At a broader level, the guidance reflects a willingness to integrate blockchain-based payment instruments into existing regulatory frameworks, provided their structure and risk profile support it.

Conclusion: Convergence Around a Workable Framework

Taken together, these four regulatory interpretations reveal a clear and consistent direction in U.S. regulatory thinking. First, they reinforce technology neutrality. Blockchain is not the regulatory trigger. Second, they embrace a “same asset, same treatment” approach. Equivalent rights and risks should lead to equivalent regulatory outcomes, regardless of the technology used. Third, because the nature of the asset is unchanged, they rely on existing regulatory frameworks rather than creating new regimes. Finally, they focus on real risks—legal, custody, liquidity, and operational—not labels or technological form.

These themes closely align with an APC core principle: The nature of the asset matters. By focusing on what an asset is—rather than how it is represented—regulators can promote clarity, support innovation, and ensure that regulation remains targeted and effective. They also avoid reregulation and confusion around how to evaluate an asset. All of this supports robust, competitive markets. APC is pleased to see regulators aligning around this principle. Grounding regulation in the nature of the asset lays the foundation for frameworks that reflect existing law, avoid unnecessary complexity, and are ultimately workable and durable.
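To make the capital arithmetic concrete, here is a minimal, hypothetical sketch (ours, not language from any FAQ) of how a percentage haircut like the 2% rate described above reduces the value a broker-dealer may count toward net capital. The function name and dollar figures are purely illustrative.

```python
# Illustrative sketch only: applying a net-capital haircut to a position.
# "haircut_value" and the figures below are our own invention, not terms
# from the SEC guidance; the 200 bps (2%) rate mirrors the FAQ's example.

def haircut_value(market_value: int, haircut_bps: int) -> float:
    """Value of a position after a haircut expressed in basis points."""
    if not 0 <= haircut_bps <= 10_000:
        raise ValueError("haircut must be between 0 and 10,000 bps")
    # Integer basis points keep the arithmetic exact for round figures.
    return market_value * (10_000 - haircut_bps) / 10_000

# A $1,000,000 proprietary position in a qualifying payment stablecoin,
# at a 2% (200 bps) haircut, contributes $980,000 toward net capital.
print(haircut_value(1_000_000, 200))  # 980000.0
```

The same function shows why the treatment matters: without the FAQ, a crypto asset with no "ready market" could face a 100% haircut (10,000 bps), contributing nothing toward net capital.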

By The Owl
2026-04-15

The SEC Confirms: The Nature of the Activity Matters

How the SEC’s User Interface Guidance Aligns with APC’s Framework

Recent guidance from the SEC’s Division of Trading and Markets on broker-dealer registration for user interfaces (the “Staff Statement”) marks an important step toward bringing clarity to digital asset regulation. While the statement focuses specifically on user interfaces interacting with crypto asset securities, its broader significance lies in the analytical framework it adopts. That framework closely aligns with the Avalanche Policy Coalition’s (APC) long-standing position: Regulation should turn on the nature of the activity, not the technology used to perform it. In our May 2025 submission to the SEC Crypto Task Force, we articulated this concept as the “nature of the activity test.” The Staff Statement demonstrates that this approach is increasingly reflected in regulatory practice.

The Core Question: When Does a Tool Become an Intermediary?

The SEC’s statement addresses a central issue in modern market structure: When does a software interface that enables transactions become a broker-dealer? Rather than creating a new category for “crypto interfaces” or focusing on the use of blockchain technology, the Staff applies a familiar inquiry rooted in existing law. The analysis turns on whether the provider is engaging in traditional intermediary activities, such as:

- Soliciting transactions
- Recommending securities
- Exercising discretion
- Receiving transaction-based compensation
- Custodying assets
- Acting as an intermediary between buyers and sellers

If these hallmarks are present, broker registration is required. If they are not, the provider should not be treated as a broker. This is a functional test—one that looks to what the entity does, not the means by which it is done.

APC’s “Nature of the Activity” Test

This approach closely mirrors the framework proposed in Ava Labs’ May 2025 submission to the Task Force. In that letter, APC articulated the nature of the activity test as a method for determining when infrastructure providers should be treated as securities intermediaries. The test asks a simple question: Are the activities ones performed by a broker, dealer, or investment adviser? If the answer is yes, existing regulatory obligations apply. If not, registration should not be required.

This framework is grounded in decades of securities law. As the submission explains, the SEC has long evaluated whether entities fall within the scope of broker, dealer, or adviser regulation based on factors such as:

- Engagement in the business of effecting transactions
- Providing investment advice
- Receipt of transaction-based compensation
- Active solicitation of trades
- Participation in negotiations
- Custody of customer funds or securities

Notably, none of these factors depend on the technology used. They were developed in an era of paper-based markets and continued to apply as markets digitized. We went on to say that the same logic should apply to blockchain-based systems, which represent the next iteration of digital market infrastructure.

Infrastructure vs. Intermediation

A central theme of the APC submission is the distinction between infrastructure providers and intermediaries. Infrastructure providers—such as validators, software developers, and communications providers—perform essential technical functions. They enable networks to operate but do not:

- Solicit transactions
- Provide advice
- Exercise discretion
- Control assets
- Know or influence the nature of specific transactions

As the submission explains, these actors are “invisible and indiscriminate in verifying, recording, and enabling transactions.” Their role is analogous to that of internet service providers, cloud service providers, API and RPC providers, and similar technical services. These functions have never been treated as regulated financial intermediation, even though they are essential to the operation of financial markets. Our recent blog post comparing the GENIUS Act’s exceptions for infrastructure with the exceptions for “ancillary infrastructure” in the EU’s Transfer of Funds Regulation reinforces this distinction.

SEC’s User Interface Guidance: A Practical Application

The Staff Statement reflects this same distinction, even if it uses different terminology. The statement identifies a category of providers—those offering interfaces assisting users in crypto asset securities transactions (“Covered User Interfaces”)—for which broker-dealer registration is not required, provided they satisfy certain conditions. These conditions effectively define what it means to operate as infrastructure rather than an intermediary. To remain outside broker-dealer status, an interface provider must:

- Allow users to set all transaction parameters
- Avoid recommendations or investment advice
- Refrain from soliciting trades
- Operate without discretion or control
- Present execution options using objective criteria
- Maintain neutral, non-conflicted compensation structures
- Provide clear disclosures

These requirements collectively describe a passive, neutral conduit—precisely the type of actor that has historically received no-action relief.

Continuity with SEC No-Action Precedent

The APC submission places heavy emphasis on the SEC’s long history of granting no-action relief to technology providers performing neutral functions. Examples include:

- Messaging systems connecting brokers
- Electronic bulletin boards posting trade information
- Matching platforms linking investors and issuers
- Data providers offering analytics and research

In each case, the SEC focused on whether the provider:

- Exercised control
- Participated in negotiations
- Provided advice or recommendations
- Handled funds or securities
- Earned transaction-based compensation

Where these elements were absent, the SEC consistently declined to require registration. The user interface guidance follows the same pattern. It does not create new rules; it applies existing principles to new technology. The Staff Statement even frames its conclusion in terms that closely resemble traditional no-action relief:

“In circumstances where a Covered User Interface Provider takes the measures discussed below relating to its creation, offering, and/or operation of a Covered User Interface, the Staff will not object to the Covered User Interface Provider creating, offering, and/or operating a Covered User Interface without registering as a broker-dealer pursuant to Section 15(b) of the Exchange Act.”

Conclusion

The convergence between APC’s framework and the SEC’s guidance has important implications. First, it confirms that existing law is sufficient when applied correctly. There is no need to create new categories for blockchain-based actors. Second, it reinforces the importance of functional analysis. Regulatory outcomes should depend on what an entity does—not on labels, technology, or proximity to financial activity. By focusing on the nature of the activities conducted, regulators can distinguish between true financial intermediaries and the infrastructure and tools that support modern markets. Third, it provides a path forward for innovation. By clarifying that neutral infrastructure and tools are not automatically subject to intermediary regulation, the SEC reduces uncertainty and enables development within a compliant framework.

APC is encouraged to see this clear alignment with its “nature of the activity” test. It demonstrates that longstanding principles of securities law remain vibrant and adaptable—even as markets evolve. The next step is to apply this same logic consistently across the digital asset ecosystem, ensuring that regulation remains targeted, coherent, and grounded in how these technologies actually operate. As our 2026 policy priorities make clear: Infrastructure providers are not intermediaries. Getting this distinction right is essential—not only for regulatory clarity, but for ensuring that robust, competitive markets can develop within a coherent and predictable framework.

By The Owl
2026-04-13

DeFi Governance Is a Question of Concentration, Not Decentralization

A recent European Central Bank working paper looks to analyze decentralization in DeFi protocols from the standpoint of governance. It finds concentration in governance and concludes that this undermines decentralization. This claim, however, rests on a conceptual error: it conflates system decentralization with governance concentration. Governance concentration that does not affect transaction finality or asset ownership is not relevant to whether a system is decentralized. The distinction matters and clarifies both the paper’s findings and their implications.

At the Avalanche Policy Coalition, we have consistently defined decentralization from a technical standpoint. A system or network is decentralized when there is no single source of truth, no single point of failure, and no authority with the ability or responsibility to change data, transactions, or balances. It is a definition focused on finality. It ensures that users can trust what they see regarding ownership of assets and the completion of transactions.

The working paper errs by reframing decentralization as a governance question rather than a matter of network finality. It compounds this error by trying to answer the question of who to regulate in DeFi by looking at concentration of governance power and participation across major DeFi protocols. What the paper actually demonstrates is not a failure of decentralization, but the presence of concentrated governance layered on top of decentralized infrastructure. Confusing governance concentration with decentralization risks pushing regulation toward infrastructure rather than actors—undermining the very properties that make these systems trustworthy.

Here is a summary of the paper’s empirical findings:

- Token ownership is heavily skewed, with the top 100 holders controlling more than 80% of supply across the studied protocols, and the top five holders often controlling a substantial fraction of that total.
- Governance systems rely extensively on delegation, whereby token holders assign voting power to intermediaries. As a result, a relatively small number of delegates exercise a disproportionate share of voting power, in some cases controlling the majority of delegated votes. Delegation thus operates as a structural amplifier of concentration.
- Concentration of governance power is further compounded by opacity. A substantial share of the most influential participants cannot be linked to identifiable individuals or institutions, making it difficult to determine whether governance power is independent or coordinated, whether incentives are aligned or conflicted, and whether influence is exercised by insiders, intermediaries, or diffuse communities.
- Governance processes themselves do little to redistribute power. Most proposals concern operational parameters—risk settings, asset listings, and similar adjustments—while very few address governance structure. As a result, the paper concludes, existing distributions of power tend to reproduce themselves over time.

The paper then concludes that decentralization is a property of governance. Under this view, a system is decentralized to the extent that decision-making authority is widely distributed, no small group can dominate outcomes, and the relevant actors are identifiable and accountable. If governance power is concentrated, the paper concludes that decentralization is incomplete or illusory.

This definition is viscerally appealing, particularly from a regulatory perspective. Regulators require identifiable points of control, and the paper emphasizes the difficulty of relying on governance token holders, developers, or exchanges as regulatory “anchor points” precisely because of opacity and fragmentation in governance structures. Yet this definition departs from the more established understanding of decentralization in distributed systems, where the concept refers not to governance dispersion but to system architecture: whether there is a single point of failure, a single source of truth, or a single authority capable of altering data or transactions. On this more technically precise definition of decentralization, the protocols studied in the paper—built on public blockchains—remain decentralized.

Framed in these terms, the paper’s findings are best understood as documenting concentrations of governance power, not undercutting decentralization. The paper does not show that any individual token holder, delegate, or developer can rewrite transaction history, override consensus, or unilaterally alter the state of the ledger. Nor does it show that the voting groups have this power. It also implicitly recognizes that where governance does not affect asset ownership or transaction finality, regulatory hooks are difficult to establish. Indeed, as noted above, the proposals on which votes are sought have nothing to do with transaction finality or asset ownership. At best, the paper can conclude that the infrastructure remains decentralized even if governance becomes concentrated.

This distinction suggests a more precise analytical framework. At the infrastructure layer, finality is distributed, consensus is collective, and no single point of failure exists. At the governance layer, ownership can become concentrated, voting power aggregated, and influence unevenly distributed. These are not contradictory observations but complementary ones. DeFi systems can be both decentralized and concentrated, depending on the layer of analysis.

Recognizing this layered structure clarifies the nature of the challenges identified in the paper. The difficulty regulators face is not that decentralization has failed, but that concentration exists without clear attribution. This produces a structural asymmetry. Governance actors can shape protocol outcomes—adjusting parameters, allocating resources, and influencing development trajectories—but they do so within systems whose core integrity cannot be compromised by that concentration. The result is a hybrid condition in which decentralized infrastructure coexists with concentrated influence over things that do not undercut decentralization.

Reframing the issue in terms of concentration rather than decentralization also shifts the focus of regulation. For regulators, the challenge is not identifying a centralized intermediary in the traditional sense (i.e., one that controls transactions or custodies assets), but understanding how concentrated influence operates within systems that lack formal control points in the areas of typical regulation. Addressing these issues will require regulatory approaches that focus on identifiable actors and activities, rather than attempting to impose control at the infrastructure layer where it does not exist.

The ECB paper makes a significant contribution by documenting the realities of DeFi governance. But its conceptual framing requires greater precision. Decentralization and concentration are not opposing descriptions of the same phenomenon; they operate at different levels of analysis. The systems studied in the paper are not failed attempts at decentralization. They are decentralized systems with concentrated governance structures. And where those structures do not affect transaction finality or asset ownership, the system remains decentralized. Recognizing this distinction provides a clearer understanding of both the risks and the possibilities inherent in DeFi. To hear more on this and related topics, please listen to this webinar from the Global Blockchain Business Council.
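As an aside, the headline statistic that this kind of governance analysis relies on, the share of token supply held by the top N holders, is straightforward to compute. A minimal illustrative sketch (our own, with toy numbers, not data from the ECB paper):

```python
# Illustrative sketch of a top-N holder concentration statistic, the
# kind of measure the ECB paper reports. The function name and the toy
# balance distribution below are our own, for demonstration only.

def top_n_share(balances: list[float], n: int) -> float:
    """Fraction of total supply held by the n largest balances."""
    total = sum(balances)
    if total <= 0:
        raise ValueError("total supply must be positive")
    largest = sorted(balances, reverse=True)[:n]
    return sum(largest) / total

# Toy distribution: one whale, a few mid-size holders, a long tail of
# 100 small holders. The top three alone hold 80% of the supply.
holders = [500.0, 200.0, 100.0] + [2.0] * 100
print(top_n_share(holders, 3))  # 0.8
```

Note that a high top-N share describes the governance layer only; on the framing above, it says nothing about whether the underlying ledger has a single point of failure.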

By The Owl