Unlocked by Nillion

    This research report has been funded by Nillion. By providing this disclosure, we aim to ensure that the research reported in this document is conducted with objectivity and transparency. Blockworks Research makes the following disclosures: 1) Research Funding: The research reported in this document has been funded by Nillion. The sponsor may have input on the content of the report, but Blockworks Research maintains editorial control over the final report to retain data accuracy and objectivity. All published reports by Blockworks Research are reviewed by internal independent parties to prevent bias. 2) Researchers submit financial conflict of interest (FCOI) disclosures on a monthly basis that are reviewed by appropriate internal parties. Readers are advised to conduct their own independent research and seek the advice of a qualified financial advisor before making investment decisions.

    Privacy: Crypto's Sleeping Giant

    Daniel Shapiro

    Key Takeaways

    • Privacy as a Catalyst for Adoption: Privacy is often a prerequisite for institutional adoption of decentralized technologies, and is essential to solve regulatory and compliance challenges while protecting users' freedoms.
    • Privacy-Enhancing Technologies (PETs): Multiple complementary approaches including Multi-Party Computation (MPC), Zero-Knowledge Proofs (ZKPs), Fully Homomorphic Encryption (FHE), and Trusted Execution Environments (TEEs) each offer distinct tradeoffs in security, scalability, and computational complexity. We see a market opportunity for an orchestration layer that can combine these techniques for optimal performance.
    • Foundational Protocols: Emerging networks like Nillion, Zama, and TACEO are building the infrastructure layer for private computation, offering different approaches from generalized computation networks to privacy-focused blockchains. These protocols can serve as the backbone for privacy-preserving applications.
    • Market Applications: Private computation unlocks massive opportunities across various markets including AI/ML, DePIN, DeFi, enterprise data sharing, healthcare, and more.

    Introduction

    Emerging technologies such as AI and blockchain are set to transform society; however, they rely heavily on data inputs that often contain sensitive information. For these technologies to realize their full potential, private data sharing and privacy-preserving computation are essential prerequisites.

    For instance, personalized AI agents will likely only be widely adopted if consumers know they are not exposing their information to centralized entities. Similarly, decentralized trading - which offers operational efficiency gains over traditional systems - will only be viable if users' trades are private. 

    Emerging Privacy-Enhancing Technologies (PETs) will enable secure collaboration without compromising privacy, making it possible for individuals and organizations to leverage data across partners, users, and systems. This capability will unlock vast opportunities across sectors including AI, Decentralized Finance (DeFi), Decentralized Physical Infrastructure Networks (DePIN), healthcare, digital identity, and more. Together, PETs, foundational privacy protocols, and user-facing applications will unlock new markets; these technologies form the Crypto Privacy Stack.

    A variety of protocols powered by new cryptographic primitives and traditional blockchain infrastructure alike are exploring a vast design space to solve these problems. While some, such as Nillion, are competing to provide the base layer for secure compute and storage, others are solving for specific use cases. This report aims to provide a comprehensive overview of the crypto privacy stack and the transformative impact it will have on global markets.

    Markets Enabled by Private Compute

    As privacy-enhancing technologies mature, they will help shape new markets and unlock value across a wide range of industries. Below, we will explore several key markets poised to benefit from the widespread adoption of private compute capabilities.

    AI

    Since the launch of ChatGPT, generative AI has begun reshaping industries, with its advancements seemingly accelerating at an increasing rate. However, questions are being raised around privacy and trust with frontier AI model companies. Leaders in the private compute space argue that privacy is the foundation for AI adoption because it contributes to data security, user trust, regulatory compliance, and ultimately collaboration - key requirements for real-world viability and adoption. By embedding privacy as a foundational feature of AI systems, developers can win user confidence and attract enterprises, ultimately leading to a better future for us all. Across each aspect of the AI supply chain - from data, to training, to compute - privacy and decentralization can be leveraged to enhance applications and protect sensitive data.

    Personalized Agents

    AI model inference is the process of using a trained AI model to make predictions or decisions on new, unseen data. At the moment, frontier models such as OpenAI’s ChatGPT, Anthropic’s Claude, or Google’s Gemini store user input data. Although this data is used to improve the model, it is now under the control of these centralized organizations. This creates privacy concerns around user data, and this problem is set to get worse. Currently, Sam Altman believes we are at Level 2 of AI progress: Reasoning models. The problem will be exacerbated at Level 3: Agents.

    Agents will be capable of prompting other AI models on their own. In turn, they can solve complex problems from a single human input - no longer does one need to constantly re-prompt and debug an issue oneself. Agentic frameworks are already being created, and will likely become widespread over the next 18 months. This technology will unlock personalized AI assistants, which introduces significant privacy concerns for users. 

    To achieve the maximum benefit of a personal AI agent, you would ideally give the model as much data about your life as possible. This way, your agent could plan meetings, send emails, help with work, plan life events, manage your finances and investments, and could even act as a personal doctor that analyzes biometric information and warns you when something is wrong.

    Without privacy-preserving technology, though, AI systems will have access to information including emails, message histories, spoken conversations, personal documents, knowledge of daily routines, and an understanding of one's personal thoughts and relationships. This information, in the hands of private corporations, introduces risks of data misuse or exploitation.

    As AI shifts from being a tool to potentially becoming an intermediary in all aspects of human life, PETs can ensure that the benefits of personalized AI agents are realized without compromising personal privacy and security.

    Private Model Training

    Private AI model training refers to techniques that enable AI/ML models to be trained, tested, and used for inference in a way that preserves the privacy of the data involved. Decentralization is also a critical component here to ensure reliability, security, and censorship resistance.

    Open machine learning networks like Bittensor aim to replace traditional siloed models by enabling AI models to scale more efficiently through incentivized global collaboration, compounding intelligence, and model composability. That said, any model - decentralized models included - is only as good as the quality of the data it is trained on. Private compute infrastructure will open the door for individual researchers and organizations to pool private data to train open models.

    At the moment, onchain investment-focused AI agents are growing rapidly. Although these are in their early stages, they signal a much larger trend. Users will demand institutional-grade AI agents trained on open and verifiable infrastructure to ensure that their objective function is to maximize the wealth of their users.

    Privacy-Enabled Compute DePINs

    Compute DePINs are a critical component to decentralized AI, as detailed in this Blockworks Research Report. Some estimates put the size of this market as high as $400B by 2027. These networks bootstrap peer-to-peer GPU marketplaces, enabling AI developers to access compute resources necessary to train models. 

    These networks often do not natively enable private compute, as they assume that both data and models are public. This makes them logical customers for generalized private compute protocols that can obfuscate sensitive data inputs and outputs during training.

    Data Sovereignty Solutions

    Today, users have effectively lost control over their personal information through a well-documented process where major tech platforms silo user data before it is harvested, monetized, and at times compromised, with little to no transparency or compensation to the individuals generating the data. Private computation and storage networks offer a compelling solution to these structural issues, while simultaneously enabling users to monetize their data. 

    Secure Digital Identity (D-ID)

    D-ID represents a significant opportunity in the Web3 infrastructure stack, with estimates suggesting the digital identity market could reach 3% of GDP by 2030. Current D-ID solutions suffer from poor data monetization mechanisms that exploit user information, fragment data access, and introduce significant security risks from centralized honeypots of personal data. Blockchain-based D-ID solutions, in particular those that also leverage privacy-preserving networks such as Nillion, address these challenges while unlocking use cases across a variety of sectors including DeFi, healthcare, voting, decentralized social networks, and more.

    Along with heightened security and UX improvements for users, the largest value proposition lies in the potential for users to better monetize their own data, while simultaneously maintaining control over how it is utilized. For example, a user could choose to sell their demographic and search history to advertisers that need it to create targeted ads. 

    Proof of Personhood

    As AI agents proliferate, it will become increasingly difficult to identify whether content was generated by a human or an AI, creating a growing need to verify human uniqueness and authenticity digitally. Worldcoin has created a product called World ID, which enables individuals to anonymously and securely verify that they are a real and unique human for online verification. The ID could be applied to activities such as voting or buying concert tickets. Worldcoin introduced AMPC (anonymized multi-party computation) to enhance the security and privacy of users’ stored biometric data: stored iris codes are split into multiple encrypted shares that are distributed to different parties for storage.
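
    To illustrate the underlying idea (a simplified sketch, not Worldcoin’s actual AMPC scheme), the Python example below splits a toy biometric template into Shamir secret shares so that no single custodian learns anything, while a threshold of custodians can reconstruct it. The field size and the 2-of-3 threshold are illustrative assumptions.

    # Toy Shamir secret sharing: split a value into n shares so that any t of them
    # reconstruct it, while fewer than t reveal nothing. Illustrative only - this is
    # not Worldcoin's production AMPC scheme, and the parameters are demo values.
    import random

    PRIME = 2**61 - 1  # field modulus (a Mersenne prime), large enough for this demo

    def split(secret: int, n: int, t: int) -> list[tuple[int, int]]:
        """Split `secret` into n points on a random degree-(t-1) polynomial."""
        coeffs = [secret] + [random.randrange(PRIME) for _ in range(t - 1)]
        shares = []
        for x in range(1, n + 1):
            y = sum(c * pow(x, i, PRIME) for i, c in enumerate(coeffs)) % PRIME
            shares.append((x, y))
        return shares

    def reconstruct(shares: list[tuple[int, int]]) -> int:
        """Lagrange interpolation at x = 0 recovers the secret from any t shares."""
        secret = 0
        for i, (xi, yi) in enumerate(shares):
            num, den = 1, 1
            for j, (xj, _) in enumerate(shares):
                if i != j:
                    num = (num * -xj) % PRIME
                    den = (den * (xi - xj)) % PRIME
            secret = (secret + yi * num * pow(den, -1, PRIME)) % PRIME
        return secret

    # Split a toy "iris code" (encoded as an integer) across 3 custodians; any 2 recover it.
    iris_code = 0xDEADBEEF
    shares = split(iris_code, n=3, t=2)
    assert reconstruct(shares[:2]) == iris_code   # two custodians suffice
    assert reconstruct(shares[1:]) == iris_code   # any threshold-sized subset works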

    Decentralized Finance (DeFi)

    Privacy in DeFi is fundamental to achieving the vision of a transparent, yet secure, financial system. Although blockchains provide some level of anonymity, with the rise of sophisticated onchain analytics platforms such as Chainalysis and Arkham, the pseudonymous nature of blockchain transactions is not sufficient for large financial market infrastructures, asset managers, or enterprises that deal with large amounts of capital. In turn, privacy-preserving technology has the potential to unlock trillions of dollars worth of capital that could flow into DeFi.

    The World Economic Forum stated in a report on the future of capital markets that $867 trillion of value could flow into DeFi over the next few decades. Although the exact market size is unclear, it is obvious that much of this value cannot move onchain unless privacy-preserving infrastructure is in place.

    By fully obfuscating transactions, large institutions can be confident that they can trade or conduct economic activity without their specific strategies or objectives being detectable by market competitors. Private transactions that can still be verified by authorized intermediaries, such as governments or regulatory bodies, will also enable DeFi products to be sold to investors of all sizes, as authorities will be able to verify KYC/AML data as well as Accredited Investor status.

    DePIN, Supply Chain, and IoT

    Many supply chain or infrastructure networks rely on IoT devices to function. Sensors which provide real-time data on the state of the network are crucial for network management and fault resolution. Supply chains are often highly complex, though, and could span hundreds of partners - especially in large OEM industries such as Aerospace or Automotive. Sensitive data - such as supplier details, production numbers, parts lists, and more - is often managed individually by each member of a supply chain on separate infrastructures.

    Secure, decentralized storage and computation networks can help reduce the cost and complexity of developing autonomous supply chains in various ways. For one, partners will be able to share private information without revealing the contents of the data. This means that a single, real-time database can be created containing the data from every member of the supply chain, drastically reducing the cost and complexity of building a secure, real-time supply chain management system. Such a system could be used to automate every aspect of the supply chain and create autonomous, dynamic responses to disruptions and faults. AI and machine learning can also be applied to these unified databases for optimization and real-time analytics. Supply chain experts refer to this future as Industry 4.0.

    Healthcare

    The need to share data for research and improved care often conflicts with privacy and ethical concerns, creating an ongoing struggle to balance patient privacy protections with advancing data-driven clinical research and care delivery. Balancing these opposing needs generally remains an unsolved issue. Meanwhile, the rapid digitization of health records and the growing capability to store and exchange information have surpassed the reach of existing legal frameworks. This means that the data exists; it is just not usable due to regulations. This presents a massive opportunity for private compute networks. The market for applying AI to healthcare data, for instance, was estimated to be worth $19.7B in 2023, and is set to grow several times over in the coming years.

    With privacy-enabled data storage and compute solutions, researchers can analyze patient data from multiple regions without moving or compromising sensitive information. This capability would enable large-scale research initiatives, such as global clinical trials or population health studies, which require diverse datasets. Additionally, privacy-preserving AI/ML could improve patient outcomes by automating processes like triage or safety alerts without exposing identifiable data. Moreover, AI-based diagnostic and therapeutic tools would be generally improved by the ability to train large-scale AI models on vast amounts of patient data. 

    Foundational Private Computation Protocols

    In this section, we will move one layer down the stack to the base-layer protocols needed for privacy-preserving applications. There are a variety of generalized private compute networks, as well as specialized frameworks, that enable the user-centric applications discussed above.

    Generalized Private Computation Networks

    The networks discussed below employ one or multiple PETs to provide private data storage or computation. Although they are not blockchains, they are synergistic with blockchain-based networks in that they can provide private inputs and outputs for blockchain transactions.

    Nillion

    Nillion is a secure computation network that decentralizes trust for high-value data in the same way that blockchains decentralize transactions. The difference, though, is that Nillion not only gains the positive externalities of decentralization - censorship resistance, fault tolerance, and reliability - but also provides privacy. Nillion accomplishes this by enabling secure storage and computation of high-value data (HVD). Key features of Nillion include:

    Secure Storage: Nillion introduces a novel approach to decentralized, secure data storage by leveraging multiple PETs. Unlike traditional encrypted storage solutions that require data to be retrieved and decrypted before it can be computed on, Nillion enables computations to be performed directly on the masked data through its dual-network architecture, consisting of a coordination layer (NilChain) and an orchestration layer (Petnet).

    Nillion’s Dual Network Architecture (Source)

    While protocols such as Filecoin and Arweave are great choices for storing large amounts of transparent data, Nillion is the superior choice when that data needs to be utilized beyond storage and retrieval.

    Private Computation: Private computation represents a paradigm shift in the sensitive data supply chain. Nillion has integrated multiple PETs into its Petnet testnet, including MPC, homomorphic encryption (HE), and linear secret sharing schemes (LSSS). Nillion’s network enables operations on encrypted data while maintaining complete confidentiality. As PETs improve over time, Nillion will allow developers to seamlessly combine new PET techniques into their applications. Nillion’s novel implementation of PETs inside its orchestration layer generates strong privacy and decentralization guarantees, which is what sets it apart from other private computation networks.

    By providing developers with a decentralized and private data storage and computation layer, Nillion unlocks new use cases in industries including healthcare, finance, and AI, while also enabling secure collaboration. 

    Zama

    Zama is an open-source cryptography company that builds novel FHE solutions for blockchain and AI. The team has created a variety of FHE libraries based on the TFHE scheme, which enables efficient computation over encrypted data. This allows sensitive information to be processed without ever being exposed, offering strong privacy and security.

    Zama’s TFHE-rs is a pure Rust implementation of TFHE that provides developers with an easy-to-use library of FHE operations. Zama’s Concrete is an open-source framework aimed at simplifying FHE deployment, abstracting away the complexity of FHE and making it accessible to developers.

    Zama aims to unlock markets including onchain identity, tokenization, confidential trading, preventative medicine, confidential credit scoring, biometrics, and more. To do so, the team has created Concrete ML and fhEVM, two specialized FHE libraries aimed at unlocking AI and confidential smart contracts, respectively. Although not yet released, Zama has multiple services in development built on top of these libraries, including the fhEVM Coprocessor for confidential smart contracts and TKMS, a fully fledged Threshold Key Management Service for secure FHE key generation and ciphertext decryption.

    TACEO

    Taceo is creating the Compute Layer Security (CLS) protocol to make blockchain computation encrypted by default, enabling applications to compute on private shared state. By leveraging both ZKPs and MPC in a technology called Collaborative SNARKs (coSNARKs), Taceo has created a scheme that overcomes the limitations that exist with traditional SNARKs: single-prover dependence and limited flexibility in multi-party environments. 

    Taceo’s coSNARKs achieve efficiency gains over single-prover systems, theoretically making privacy-preserving computation practical in resource-constrained environments such as blockchains. Applications for coSNARKs include AI, DeFi, gaming, healthcare, and data ownership. Taceo does not appear to have any systems live in production, but has begun collaborating with teams such as Worldcoin (World ID) to create privacy-preserving technologies.

    Other Application Specific Infrastructure

    Aside from private compute networks, there are a few more categories of infrastructure that could fit into the private compute stack: verticalized AI compute, privacy chains, and decentralized storage networks. That said, each has a limited scope compared to generalized private compute networks, making them useful only for specific applications.

    AI-Specific Compute

    Protocols such as Ritual and Nesa have partnered with Nillion to create decentralized, open AI infrastructure networks. Both protocols aim to decentralize the inference process and provide AI model developers with model verification, censorship resistance, accessibility, and data security tools.

    Other protocols, such as Sentient, are choosing to verticalize their entire stack. Sentient is developing its own open, decentralized infrastructure layer that aims to enable independent AI researchers and developers to collaborate, monetize their innovations, and participate as stakeholders in what they term the “Open AGI Economy”. The protocol employs a novel cryptographic primitive called OML (Open, Monetizable, and Loyal), which enables AI models to be distributed in a format that protects AI model developers’ IP while enabling authorized usage. OML utilizes a hybrid TEE and FHE approach.

    OML hybrid security and privacy approach (Source)

    Sentient is a very early stage project, with its core platform yet to be launched. With that said, the team recently released what will become a core component of its infrastructure: its AI fingerprinting library. Fingerprinting enables AI model developers to verifiably prove who owns an AI model. 

    Privacy Chains

    Privacy chains are blockchains that typically employ ZKPs to natively enable private transactions. Although these chains can make it easy for developers to build private applications out of the box, they are limited in the scope of applications that can be built on top of them due to a lack of flexibility in managing complex data interactions on shared private state. Privacy chains are best suited for applications centered around DeFi, but likely fall short for applications that require complex computations on large amounts of encrypted data, such as AI. Two examples are Aztec and Aleo.

    Aztec is a privacy-first Layer 2 zk-rollup built on Ethereum that aims to fill a critical gap by enabling private, programmable money. By combining ZKPs and ZK rollups, Aztec ensures that users’ transactions are encrypted while provably adhering to network rules. Its hybrid model offers both privacy and transparency, keeping user data private while allowing protocol states to remain public for compliance and usability. Aztec’s layer 2 solution is not fully live, but the project is making progress towards launch. As of August 2024, the team has launched a devnet, which is now live for experimentation. 

    Aleo is a privacy-focused layer 1 blockchain that leverages ZKPs to enable secure and scalable decentralized applications. Aleo launched its Mainnet in September 2024. The team expects builders to create secure applications and products including secure identity verification, compliant payments, trust-minimized oracle infrastructure, hidden information games, proof of useful work, and more. The team stated that over 350 teams are building on the network.

    Decentralized Storage Networks

    While blockchains provide a decentralized computation layer for applications, many still rely on centralized data storage that creates a single point of failure. To achieve end-to-end decentralization for specific applications, such as backup of enterprise data, scientific datasets, or media libraries, decentralized storage is necessary. However, these networks are limited in that once the data is stored, it can’t be computed on. In turn, they are good for static storage, but fall short for many use cases involving AI and DePIN. With that said, these networks could utilize a generalized private compute network, such as Nillion, to encrypt data inputs and outputs. Filecoin and Arweave are two of the largest players in the space. 

    Filecoin utilizes two techniques called Proof of Replication (PoRep) and Proof of Spacetime (PoSt) to ensure the integrity and reliability of its decentralized storage network. PoRep uses zk-SNARKs to prove that a storage provider has stored a unique replica of a client’s data, preventing the provider from claiming to store data that is actually replicated elsewhere. PoSt is Filecoin’s consensus mechanism, ensuring that storage providers continue to store customers’ data over time. Users can encrypt their data before storing it, although no computation can be performed on this data until it has been retrieved. That said, for use cases that don’t need real-time access to encrypted data for computations, Filecoin is a cheap and performant option.

    Arweave offers permanent storage for a one-time fee, aiming to store data indefinitely. The protocol uses a consensus mechanism called Proof of Access (PoA) that rewards miners with fees for storing data. Arweave also allows for encrypted data storage with the Advanced Encryption Standard (AES); however, similar to Filecoin, no computation can be performed on this data until it has been retrieved. Like Filecoin, Arweave shines for use cases where the stored data does not need to be computed on. Arweave’s decentralized compute network, AO, could also pair well with private compute networks for applications that require privacy.

    Privacy Enhancing Technologies

    Finally, it is important to have a general understanding of the cryptographic primitives and innovative hardware-based security solutions that form the lowest level of the stack. This section will give a high level overview of relevant PETs.

    Multi-Party Computation (MPC)

    MPC is a cryptographic technique that allows multiple parties to jointly compute a function over their inputs while keeping those inputs private. It can be implemented through various approaches including FHE, linear secret sharing, and garbled circuits. MPC’s strengths lie in its distributed trust model, which naturally supports active security, as no participant reveals its inputs to the other parties. MPC provides flexibility in network topology and can be optimized across a variety of performance characteristics based on the chosen implementation.

    Example of MPC for calculating average hourly wages (Source)

    MPC shares many use cases with FHE, as it also enables private computations. Due to its inherently strong security guarantees, MPC shines for use cases involving highly sensitive data, such as financial transactions or healthcare records.
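
    As a minimal sketch of the wage-averaging example above (a simplified, semi-honest illustration rather than a production MPC protocol), the Python snippet below uses additive secret sharing: each participant splits their wage into random shares, every party sums the shares it holds, and only the total - and hence the average - is ever reconstructed.

    # Toy additive-secret-sharing MPC: three employees learn their average hourly
    # wage without any party seeing another's individual wage.
    # Semi-honest illustration only; real MPC protocols add correctness and abort guarantees.
    import random

    MOD = 2**32  # all arithmetic is done modulo a public constant

    def share(value: int, n: int) -> list[int]:
        """Split `value` into n additive shares that sum to it modulo MOD."""
        shares = [random.randrange(MOD) for _ in range(n - 1)]
        shares.append((value - sum(shares)) % MOD)
        return shares

    wages = {"alice": 42, "bob": 55, "carol": 38}   # private inputs (hourly wages)
    n = len(wages)

    # Each party splits its wage and sends one share to every party (including itself).
    shares_held = {p: [] for p in wages}
    for owner, wage in wages.items():
        for holder, s in zip(wages, share(wage, n)):
            shares_held[holder].append(s)

    # Each party locally sums the shares it holds and publishes only that partial sum.
    partial_sums = {p: sum(s) % MOD for p, s in shares_held.items()}

    # The published partial sums reveal only the total, from which the average follows.
    total = sum(partial_sums.values()) % MOD
    print("average hourly wage:", total / n)   # 45.0, with no individual wage disclosed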

    Zero Knowledge Proofs (ZKPs)

    ZKPs allow one party to prove to another that a statement is true without revealing any information beyond the validity of the statement itself. This technique has become particularly valuable in blockchain applications, adding privacy to historically transparent systems.

    There are a variety of ZKP systems that can be implemented, including zk-SNARKs, zk-STARKs, PLONK, and Bulletproofs, each offering different trade-offs in terms of computational requirements, proof size, and verification time.

    ZKPs excel at applications such as decentralized identity, privacy-preserving transactions, secure layer-2 rollups, voting systems, internet of things (IoT), proving origin and authenticity of goods and materials, and more.
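
    As a concrete toy example of the prover-verifier interaction (demo-sized parameters, not a production proof system), the Python sketch below implements a Schnorr proof of knowledge of a discrete logarithm, made non-interactive with the Fiat-Shamir heuristic: the verifier learns that the prover knows the secret exponent behind a public value, and nothing else.

    # Toy non-interactive Schnorr proof of knowledge: prove you know x with y = g^x mod p
    # without revealing x. Fiat-Shamir replaces the verifier's random challenge with a hash.
    # Parameters are demo-sized and NOT secure; real deployments use standardized groups.
    import hashlib
    import secrets

    p = 2**127 - 1                 # demo modulus (a Mersenne prime); not for production use
    g = 3                          # demo generator
    x = secrets.randbelow(p - 1)   # prover's secret ("witness")
    y = pow(g, x, p)               # public statement: "I know the discrete log of y"

    def fiat_shamir_challenge(*vals: int) -> int:
        data = b"|".join(str(v).encode() for v in vals)
        return int.from_bytes(hashlib.sha256(data).digest(), "big") % (p - 1)

    # --- Prover ---
    r = secrets.randbelow(p - 1)        # fresh randomness
    t = pow(g, r, p)                    # commitment
    c = fiat_shamir_challenge(g, y, t)  # challenge derived by hashing (Fiat-Shamir)
    s = (r + c * x) % (p - 1)           # response; exponent arithmetic mod p - 1
    proof = (t, s)

    # --- Verifier ---
    t, s = proof
    c = fiat_shamir_challenge(g, y, t)
    assert pow(g, s, p) == (t * pow(y, c, p)) % p   # accepts iff the prover knew x
    print("proof verified: prover knows x, and x itself was never revealed")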

    Fully Homomorphic Encryption (FHE)

    FHE operates on a client-server model: a user's encrypted data is computed on by the FHE server before being returned to the user for decryption. The architecture of an FHE deployment is simple in that it requires no pre-processing phase and no complex coordination between multiple nodes. All computation can be performed on a single server, which minimizes communication overhead and decreases latency.

    FHE enables computations to be performed over encrypted data, in turn making it an exceptionally strong choice for any use case involving arbitrary computations over large datasets. Some examples include secure AI/ML, cloud computation, regulatory compliance around data regulations, supply chain security, and more.
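
    Full FHE libraries are complex, but the compute-on-ciphertext pattern can be illustrated with a toy additively homomorphic (Paillier-style) scheme in Python: the server adds two values it can never read. Note that Paillier supports only addition; genuine FHE schemes such as TFHE additionally support multiplication and therefore arbitrary circuits. The key sizes below are illustrative and insecure.

    # Toy Paillier cryptosystem: the "server" adds two encrypted salaries without ever
    # decrypting them, illustrating the compute-on-ciphertext pattern behind FHE.
    # Paillier is only additively homomorphic; real FHE schemes support arbitrary circuits.
    # The primes below are demo-sized and insecure.
    import math
    import secrets

    # --- Client: key generation ---
    p_, q_ = 2**31 - 1, 2**61 - 1              # demo primes; production keys use 2048+ bit moduli
    n = p_ * q_
    n2 = n * n
    g = n + 1                                  # standard simplified choice of generator
    lam = math.lcm(p_ - 1, q_ - 1)             # private key
    mu = pow(lam, -1, n)                       # valid because g = n + 1

    def encrypt(m: int) -> int:
        r = secrets.randbelow(n - 1) + 1       # fresh randomness per ciphertext
        return (pow(g, m, n2) * pow(r, n, n2)) % n2

    def decrypt(c: int) -> int:
        x = pow(c, lam, n2)
        return (((x - 1) // n) * mu) % n

    # --- Client encrypts its private inputs and ships only ciphertexts to the server ---
    c1, c2 = encrypt(40_000), encrypt(55_000)

    # --- Server: computes on ciphertexts it cannot read ---
    c_sum = (c1 * c2) % n2                     # multiplying ciphertexts adds the plaintexts

    # --- Client decrypts the result locally ---
    assert decrypt(c_sum) == 95_000
    print("server computed 40000 + 55000 = 95000 without seeing either input")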

    Trusted Execution Environments (TEEs)

    A TEE is a segregated area of memory and CPU that is protected from the rest of the system using encryption. Any data in the TEE can’t be read or tampered with by code outside that isolated environment. Code and data inside the TEE are processed in the clear, but are only visible in encrypted form when anything outside tries to access them.

    TEEs are a powerful technology that can provide privacy for use cases including AI/ML, cryptocurrency wallets, cloud computing, and more. TEEs can also be used within MPC frameworks.

    Tradeoffs, Orchestration, & Emerging Techniques

    Each PET has a variety of strengths and weaknesses, creating a complex landscape of tradeoffs across three axes: computational complexity, scalability, and security guarantees. 

    Zero-knowledge proofs are designed to validate information and consist of an interaction between two parties - a prover and a verifier. The focus is on a single proof producing a binary output, not on computing results. As a result, ZKPs are exceptionally well suited to tasks such as digital identity verification.

    While FHE offers minimal communication overhead and simple deployment, it demands significant computational resources. It also faces security challenges due to its single-server setup, which might require additional mechanisms - such as ZKPs or computational replication - to guarantee security, further increasing computation costs.

    MPC, on the other hand, distributes both data and computation across multiple nodes, providing inherently greater security guarantees than FHE at a reduced computation cost. With that said, MPC comes with its own challenges - it requires a pre-processing phase for randomness generation, needs careful coordination between nodes, and faces scalability challenges as the number of nodes increases. The cost of each operation grows with the number of participating nodes due to the communication requirements between them.

    Finally, TEEs offer several advantages over FHE and MPC - mainly ease of deployment, scalability, and performance. However, TEEs have hardware-level attack vectors and also require the user to trust the TEE manufacturer, potentially making them less secure.

    Ultimately, users won’t care what PET is utilized under the hood - they care about the trust assumptions, performance, and cost of the application in use. Because PETs exist in a constantly changing tradeoff space of many factors like performance, cost, maturity, and functionality, the best results can be attained from combining them.

    Thus, some protocols, such as Nillion, are exploring orchestration methods that enable builders to utilize a variety of PETs in their application. For instance, Nillion is implementing Blind Modules and various SDKs to enable developers to easily create applications that combine multiple PETs. This orchestration layer can exist at various levels of abstraction: (1) Black Box Beginnings, (2) Inter-Blind Module Orchestration, (3) Intra-Blind Module Orchestration, and (4) Inter-Cluster Orchestration. 

    At level 1, isolated PET protocols operate independently from each other and must be manually combined by developers. Level 2 introduces orchestration between different Blind Modules within SDKs, making it easy for developers to work with multiple PETs. Level 3 advances to implementing different PETs within a single Blind Module, allowing for customizable security-performance tradeoffs. Finally, level 4 enables Blind Modules to operate across multiple networks with different configurations, offering developers ultimate flexibility.
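
    To make the idea of an orchestration layer concrete, below is a hypothetical Python sketch (not Nillion’s SDK or any real library; all names and policy rules are illustrative assumptions) of what a level-2-style interface could look like: the developer declares the operation and its constraints, and the orchestrator routes it to an MPC, FHE, or TEE backend.

    # Hypothetical sketch of a PET orchestration interface (illustrative only; this is
    # not Nillion's SDK or any real library). The developer states what the workload
    # needs and the orchestrator picks a backend based on simple policy rules.
    from dataclasses import dataclass

    @dataclass
    class Requirements:
        sensitivity: str          # "high" | "medium" | "low"
        latency_budget_ms: int    # acceptable latency for the operation
        arbitrary_compute: bool   # does the workload need general circuits?

    def run_mpc(task: str) -> str: return f"MPC backend ran {task}"
    def run_fhe(task: str) -> str: return f"FHE backend ran {task}"
    def run_tee(task: str) -> str: return f"TEE backend ran {task}"

    def orchestrate(task: str, req: Requirements) -> str:
        """Toy policy: trade security guarantees off against latency and generality."""
        if req.sensitivity == "high" and not req.arbitrary_compute:
            return run_mpc(task)   # distributed trust, no single point of failure
        if req.arbitrary_compute and req.latency_budget_ms > 1000:
            return run_fhe(task)   # general circuits, tolerant of heavy computation
        return run_tee(task)       # fast and general, but adds hardware trust assumptions

    print(orchestrate("aggregate_salaries", Requirements("high", 500, False)))
    print(orchestrate("train_model_layer", Requirements("medium", 5000, True)))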

    The other large benefit of orchestration is that it future-proofs one’s architecture. As the core cryptographic primitives improve and new techniques are invented, builders will want an easy way to integrate them into their applications. For instance, techniques such as Garbled Circuits, Linear Secret Sharing, and Discrete Wavelet Transforms (DWTs) are being explored to enhance PETs and could find their way into production deployments as better hardware unlocks greater computational bandwidth.

    As these techniques mature, an orchestration layer could abstract away the underlying complexity needed to integrate new advancements, enabling developers to upgrade their applications with ease while staying on the cutting edge of performance and security guarantees. 

    Conclusion

    PETs represent a critical inflection point in the evolution of Web3, as privacy is a necessary precursor to mass adoption. With many protocols launching testnets and gearing up for mainnet deployments, it is clear that the trend of private compute is just beginning. This technology unlocks use cases in massive markets, and it is likely that the winners in this space could justify valuations similar to those of layer 1 or layer 2 blockchains.

    The broader implications of a robust Crypto Privacy Stack will ultimately affect our everyday lives, as it will facilitate the advancement of personalized AI and a ubiquitous globally connected DeFi ecosystem. Privacy will empower individuals and protect personal freedoms while at the same time enabling all of us to benefit the most from these transformative technologies.



    The information contained in this report and by Blockworks Inc. and related affiliates is for general informational purposes only and is not intended to provide legal, financial, or investment advice. The report should not be construed as an offer or solicitation to buy or sell any security, token, or financial instrument and does not represent any recommendation or endorsement of any investment or financial product or service. Blockworks Inc. and related affiliates are not registered as a securities broker-dealer or an investment advisor in any jurisdiction or country.