OCC Symposium Explores Tokenization of Real-World Assets and Liabilities

By The Owl

In February 2024, the U.S. Office of the Comptroller of the Currency (OCC) hosted its Symposium on the Tokenization of Real-World Assets and Liabilities. The OCC is one of three prudential banking regulators in the United States, overseeing national banks and federal savings associations. Given its role in ensuring the safety, soundness, and fairness of the banking system, it is imperative for the regulator to assess how the entities it supervises plan to leverage distributed ledger technology (DLT) to provide new products and services and enhance existing ones.

The tokenization of real-world assets and liabilities, such as commercial deposits, real estate, commodities, or art, involves converting the ownership rights in these assets into digital tokens that can be traced on DLT. This process has the potential to revolutionize the way assets are bought, sold, and managed, offering increased liquidity, transparency, and accessibility. However, it also raises new regulatory questions, particularly around ensuring compliance with existing financial regulations, safeguarding against money laundering and fraud, and protecting investor rights.
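To make the process concrete, the following is a minimal, hypothetical sketch of the core idea: a fixed supply of tokens represents fractional ownership of an asset, and transfers are recorded against a shared ledger. The class, names, and numbers are all illustrative assumptions, not any real tokenization system.

```python
from dataclasses import dataclass, field

@dataclass
class TokenizedAsset:
    """Toy ledger: fractional ownership of one asset as fungible tokens."""
    name: str                    # e.g. a property or commodity (illustrative)
    total_tokens: int            # fixed supply representing 100% ownership
    holdings: dict = field(default_factory=dict)  # owner -> token count

    def issue(self, owner: str) -> None:
        # Assign the entire initial supply to a single owner.
        self.holdings = {owner: self.total_tokens}

    def transfer(self, sender: str, receiver: str, amount: int) -> None:
        # Move tokens between owners, rejecting overdrafts.
        if self.holdings.get(sender, 0) < amount:
            raise ValueError("insufficient tokens")
        self.holdings[sender] -= amount
        self.holdings[receiver] = self.holdings.get(receiver, 0) + amount

asset = TokenizedAsset("Office Building A", total_tokens=1_000)
asset.issue("alice")
asset.transfer("alice", "bob", 250)   # bob now holds a 25% stake
print(asset.holdings)                 # {'alice': 750, 'bob': 250}
```

In a real DLT deployment, the `holdings` mapping would be replicated and updated by network consensus rather than held in one process, which is where the regulatory questions above arise.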

As the tokenization of real-world assets and liabilities becomes further integrated into the financial system, the OCC's role and regulations will likely influence how other regulatory bodies, both domestically and internationally, approach the oversight of tokenized assets.

Importantly, and excitingly, many of the themes discussed during the event fall under the five branches of the Tree of Web3 Wisdom.

The Tokenization Symposium began with remarks from Acting Comptroller Michael Hsu, who defined tokenization as the “process of digitally representing an asset’s liability, ownership, or both, on a programmable platform,” and called on attendees to understand the technology. He set a “north star” for the event: identifying problems and proposing solutions accordingly, as opposed to developing solutions in search of a problem.

Panel 1: Legal Foundations for Digital Asset Tokens brought together members of the drafting committee for the Uniform Commercial Code (UCC), a comprehensive set of laws governing commercial transactions in the United States (including sales, leases, negotiable instruments, and secured transactions), along with other supporters of the UCC. The panel argued that amending the UCC to cover digital assets benefits token holders because statutory protection is stronger than enforcing rights by suing over contract rights; this is particularly important in situations such as bankruptcy, where there is a defined legal process for asserting claims to recover funds. The panel noted that the United States now has the most advanced body of commercial law for digital assets, given the efforts to amend the UCC to recognize the use of DLT, whereas in other jurisdictions the common law is still developing. The panelists also stressed the importance of sensibly classifying tokens, comparing tokenization to the use of paper as a medium for recording rights and liabilities.

Panel 2: Academic Papers on Tokenization explored three academic papers: 1) how the acceptance and usage of digital payments leads to increased financial inclusion; 2) the use of payment stablecoins for real-time gross settlement; and 3) a study on the economics of NFTs. The panelists in their presentations discussed thinking globally with respect to how tokenization is occurring across the world and how it can facilitate cross-border payments and support financial inclusion objectives.  

Panel 3: Regulator Panel featured staff of the innovation offices of the OCC, the Federal Reserve (the Fed), the Federal Deposit Insurance Corporation (FDIC), the Commodity Futures Trading Commission (CFTC), and the Securities and Exchange Commission (SEC). Each office discussed how it is seeing tokenization of real-world assets and how tokenized assets interact with other aspects of DLT, such as smart contracts. The regulators discussed opportunities for tokenization within the banking sector, such as tokenized deposits and tokenized money market fund shares, and the benefits they can provide in areas such as correspondent banking, repo transactions, and post-trade processes. One opportunity they flagged is improving the accuracy of Bank Secrecy Act systems so that monitoring for money laundering and terrorist financing and sanctions screening can be done more efficiently. Interoperability is one challenge they are seeing with respect to tokenization. Throughout, the panelists emphasized that regulation of digital assets should be context-appropriate.

Panel 4: Tokenization Use Cases featured representatives from the Depository Trust & Clearing Corporation (DTCC), Mastercard, and the Massachusetts Institute of Technology (MIT). The panelists discussed exciting use cases that tokenization and DLT are enabling, such as T+1 settlement, tokenization for private markets, and multi-rail payments that support complex payment types, enabling increased coordination, reduced counterparty risk, and stronger fraud controls. The panelists also cautioned policymakers and innovators to beware of misconceptions when assessing the various use cases. Themes echoed from previous panels included challenges around interoperability, developing solutions based on need, and carefully tailoring regulations to the use cases.

Panel 5: Risk Management and Control Considerations also explored various tokenization use cases and areas where tokenization can make a big difference, such as freeing up capital and making markets more liquid. The panelists discussed the perspective regulators should take when approaching risk management and developing standards to minimize risk. They also discussed the role of intermediaries in tokenization and how industries have evolved and become more "dis-intermediated" over time. In their closing statements, the panelists called on regulators and policymakers to understand the technology and experiment more with it to better grasp its implications.

The Symposium ended with a keynote speech by Hyun Song Shin (Economic Advisor and Head of Research at the Bank for International Settlements) on how tokenization can propel innovations in the monetary system, much as money and paper ledgers did. He discussed various concepts involving tokenization, such as improved delivery versus payment, central bank digital currency, the "singleness of money" with respect to tokenized deposits and stablecoins, and the "tokenisation continuum" that maps out different use cases ranging from wholesale payments to land registries.

In conclusion, the OCC Symposium on the Tokenization of Real-World Assets and Liabilities underscored the need for careful consideration, collaboration, and continuous innovation. The diverse perspectives shared across legal foundations, academic research, regulatory insights, use cases, and risk management considerations have collectively woven a narrative of both promise and challenge. Moving forward, it is clear that embracing the digital evolution calls for a harmonious blend of regulatory adaptability, technological exploration, and a shared commitment to understanding the profound impact tokenization can have on the global financial ecosystem. 

Mar 27, 2024

Understanding and Classifying Blockchain Tokens

As seen in The International Journal of Blockchain Law (2024) by the GBBC.

Blockchain Analysis & Investigations

Definition: The process of inspecting, identifying, clustering, modeling, and visually representing data on a blockchain. Blockchain analytics can involve the use of software tools and open-source intelligence (OSINT) to analyze data on blockchain networks. These tools scrutinize transaction patterns, wallet addresses, and other data points on a blockchain to provide insights into the activities occurring on the network. Blockchain analysis is done for a variety of reasons, from market analysis to investigating illicit activity. Blockchain investigations are commonly conducted to uncover illicit activities such as money laundering, fraud, and the use of cryptocurrency in criminal enterprises. Investigations leverage analytics tools to track and identify this activity on-chain. The transparent nature of the blockchain allows investigators to follow the flow of funds on the public ledger.

How it Works:
- Data Aggregation: collecting, compiling, and summarizing information from various sources across blockchain networks
- Pattern Recognition: identifying and interpreting behaviors and trends within the aggregated data
- Forensic Analysis: systematically interpreting the aggregated data and recognized patterns to reach investigative conclusions

Purposes (not an exhaustive list):
- AML compliance and regulatory reporting
- Fraud detection
- Security analysis
- Market analysis
- Enhancing security and trust in blockchain networks
- Aiding law enforcement in catching 'bad actors'
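The aggregation and pattern-recognition steps can be sketched in a few lines. This is a hypothetical illustration, not a real analytics tool: it aggregates a list of transfers and flags addresses receiving many small deposits, a pattern sometimes associated with "structuring". The function name, thresholds, and sample addresses are all invented for the example.

```python
from collections import defaultdict

def flag_structuring(transactions, threshold=3, max_amount=1_000):
    """transactions: iterable of (sender, receiver, amount) tuples.

    Returns the set of receiving addresses with at least `threshold`
    inbound transfers below `max_amount`.
    """
    small_deposits = defaultdict(int)
    # Data aggregation: tally small inbound transfers per address.
    for sender, receiver, amount in transactions:
        if amount < max_amount:
            small_deposits[receiver] += 1
    # Pattern recognition: many small deposits concentrating at one address.
    return {addr for addr, count in small_deposits.items() if count >= threshold}

txs = [
    ("a1", "x9", 900), ("a2", "x9", 950), ("a3", "x9", 800),  # suspicious
    ("a4", "y2", 50_000),                                     # one large transfer
]
print(flag_structuring(txs))   # {'x9'}
```

Real investigations layer many such heuristics (clustering, peel-chain tracing, exchange attribution) on top of this kind of aggregation, and the forensic-analysis step interprets the flags in context rather than treating them as conclusions.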

Decentralization: A Matter of Computer Science, Not Evasion

Decentralization is an important concept in computer science that has gained significant attention in recent years. It is defined as a system or network with no single point of failure or "central orchestrator" required for proper operation. Decentralized systems require independent peers, with aligned incentives, to collaborate to achieve some greater goal. Most decentralized systems have no single entity or authority with the power or obligation to change or update data or transactions. For a blockchain network, the key point is that a distributed network of computers, not centrally controlled, must reach consensus about changes to state and future transitions of state (adding new blocks of transactions).

One of the primary virtues of decentralization lies in its ability to enhance fault tolerance and system resilience. In centralized systems, a single source of truth or point of failure can lead to negative consequences, such as when a hack or "fat finger" error occurs. Decentralized systems, on the other hand, distribute functions across multiple nodes, making them more capable of withstanding failures or attacks. Blockchains are robust because work is duplicated by every node: taking away one node, or a group of nodes, does not impact the network's continued functioning.

Furthermore, decentralization is a powerful tool against censorship and tyrannical control. In centralized systems, a single entity can wield disproportionate power to control or manipulate data or information flows. Decentralized systems distribute power among participants, making it much harder for any single entity to exert undue influence, control data, or censor (or alter) information. When it comes to fault tolerance, centralized systems often rely on trusted third parties, which can be vulnerable to errors, security breaches, or malicious activities.
In decentralized systems, as illustrated by blockchains, we trust incentives to keep validators honest, and we trust the correctness of the software. The computer code is implemented and managed by a distributed network that reaches consensus. By eliminating the need for intermediaries, these systems reduce the potential for bad behavior and foster a more secure environment. As such, distributed systems have the potential to scale more effectively than their centralized counterparts.

Decentralization also fosters a fertile ground for innovation and competition. It lowers barriers to entry, allowing a wider range of participants to contribute to the network and its associated ecosystem. This healthy competition can lead to the development of more diverse and specialized solutions, driving overall progress for the network and the applications built on it.

Contrary to some misconceptions, decentralization is not about evading laws or regulations. It is a network design element that seeks to ensure better information, greater user control and autonomy, and more access for builders. Dr. Emin Gün Sirer, founder of Ava Labs, put it this way in his testimony before the House Financial Services Committee: Let me be clear: this ability to leverage distributed or decentralized networks is a desirable goal for many reasons that have nothing to do with securities laws, financial services regulation, or the laws and rules governing other areas of commerce, recreation, and communications. Distributed networks are more resilient, secure, auditable, and available for builders. Blockchain builders did not set out to develop the technology to evade laws and rules. We set out to solve computer science problems. Decentralization is not about laws and regulations, but about unlocking the true potential of computer systems and associated technology.
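The fault-tolerance argument above can be sketched as a simple majority vote: when work is duplicated across nodes, the network's answer is whatever a majority reports, so a minority of failed or dishonest nodes cannot change the outcome. This is a hypothetical toy model, far simpler than any real consensus protocol; the function and sample values are invented for illustration.

```python
from collections import Counter

def network_consensus(node_answers):
    """Return the value reported by a strict majority of nodes.

    Raises if no value has majority support, i.e. the network
    cannot reach consensus on this question.
    """
    value, votes = Counter(node_answers).most_common(1)[0]
    if votes <= len(node_answers) // 2:
        raise RuntimeError("no majority: consensus not reached")
    return value

honest = ["block_42"] * 4        # four nodes agree on the next block
faulty = ["block_??"]            # one node reports a corrupted value
print(network_consensus(honest + faulty))   # block_42
```

Production blockchains replace this one-shot vote with protocols such as proof-of-work or Byzantine-fault-tolerant proof-of-stake, but the underlying property is the same: correctness does not depend on any single node.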
The internet itself is an example of a decentralized system: by distributing power, trust, and functions across a network of servers, it gains fault tolerance, censorship resistance, and scalability. Embracing decentralization in computer science is a strategic move toward building a more secure and inclusive internet. Sensible, workable regulation, in conjunction with innovation, will help guide the transformative power of decentralization and blockchain to empower individuals and drive economic inclusion. The Tree of Web3 Wisdom is a set of principles to help guide blockchain regulation worldwide and harness the power of decentralized systems.
