1. What is this standard?
IEEE P2895 establishes a comprehensive taxonomy describing the rules and categories of data rights in data contracts that govern the capture, use, sharing, and trade of data. The standard articulates the parameters for permitted use, restricted use, exceptions to usage, duration of use and/or storage, and geography of use and/or storage of human-generated data. It may be used to describe the parameters of data trade regardless of industry sector or file type.
The taxonomy provides a universal language for describing data exchange parameters across institutional boundaries—regardless of industry sector or file format—addressing permitted uses, jurisdictional processing requirements, retention periods, and distinctions between individual and aggregate data handling.
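As an illustration only, the parameter categories above could be encoded as a machine-readable record that travels with a data set between institutions. The sketch below is a hypothetical encoding, not the standard's actual schema (which is still in draft); every field name is an assumption made for this example:

```python
from dataclasses import dataclass, field, asdict
import json

@dataclass
class DataRightsDescriptor:
    """Hypothetical record attaching usage rules to a data set.

    The fields mirror the parameter categories P2895 names
    (permitted use, restricted use, exceptions, duration,
    geography); the names themselves are illustrative only.
    """
    dataset_id: str
    permitted_uses: list = field(default_factory=list)   # e.g. "clinical-research"
    restricted_uses: list = field(default_factory=list)  # e.g. "targeted-advertising"
    use_exceptions: list = field(default_factory=list)   # narrow carve-outs
    retention_days: int = 0                              # duration of use/storage
    storage_regions: list = field(default_factory=list)  # geography of use/storage

    def to_json(self) -> str:
        """Serialize so the rules can accompany the data across institutions."""
        return json.dumps(asdict(self), sort_keys=True)

# Example: the rules travel with the data set rather than living
# only in a side contract.
record = DataRightsDescriptor(
    dataset_id="wearables-2025-q1",
    permitted_uses=["clinical-research"],
    restricted_uses=["targeted-advertising"],
    retention_days=365,
    storage_regions=["EU"],
)
payload = record.to_json()  # handed to the receiving institution
```

The point of such a record is interoperability: a receiving institution can parse the same vocabulary the sending institution used, which is the common-language role the taxonomy is meant to play.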
This standard is also related to the work of the IEEE Digital Privacy Initiative and draws on the IEEE Digital Privacy Model.
2. Why is it important?
Recent developments such as OpenAI’s "OpenAI for Countries" initiative signal a shift toward rapid deployment of AI infrastructures by national governments. These initiatives underscore the urgency of having standardized, interoperable, and human-rights-centered data governance frameworks that ensure the responsible use of human-generated data from the outset. IEEE P2895 provides the taxonomy to support transparency, interoperability, and shared accountability in these emerging systems.
The World Economic Forum estimates that by 2025 almost five hundred exabytes of data will be generated by humans daily, the equivalent of roughly 200 million Digital Versatile Discs (DVDs) of human-generated data each day. Concurrently, Harvard Business Review expects institutions, including governments and for-profit and nonprofit organizations, to exponentially increase investments in becoming data-driven and/or analytics-anchored. Both aspirations require commercial capture, use, sharing, and trade of human-generated data. Adjacent to these two trends is an unprecedented increase in human data trade regulation: nearly one hundred countries are drafting regulations inspired by the European General Data Protection Regulation (GDPR) and/or the California Consumer Privacy Act (CCPA).
These three converging forces—exploding data production, institutional data hunger, and proliferating regulatory frameworks—have exposed a critical gap in current data management and technology infrastructure: existing platforms were simply not designed to accommodate the additional rules governing responsible data stewardship at scale.
This standard aims to address the gap in parameters defined by existing domain standards, regulations, and industry practices for data use. Today there is no scalable capability to attach the rules of capture, use, sharing, and trade to a data set, and no standard ensuring that descriptions of those rules can be passed from one institution to another and easily understood. A common vocabulary is therefore needed by the majority of institutions in the world that leverage human-generated data.
3. What is a real-world example or case study of how it might help?
By establishing a consistent vocabulary to describe the responsible consumption of human-generated data sets, this standard helps institutions that seek to capture, use, share, and trade such data sets to do so under consistent rules while reducing friction in legitimate data exchange. That exchange may serve personal benefit from the use of personal data, or it may reflect an agreement to share personal data for the benefit of society.
Specific examples of P2895 applications include:
- Healthcare and wearables: Defining ethical frameworks for sharing sensitive health metrics across treatment, research, and commercial contexts while preserving patient agency and dignity.
- Smart cities: Enabling trusted cross-agency data exchange for transportation, energy, and municipal services while protecting citizen privacy and preventing surveillance overreach.
- Education and humanitarian contexts: Protecting vulnerable populations such as refugees and children engaging with digital learning platforms through clear data boundaries and purpose limitations.
- Environmental monitoring: Clarifying privacy parameters and consent models when AI systems monitor household waste or energy consumption patterns.
- National AI programs: Supporting responsible data governance architectures as countries launch public-private partnerships for AI infrastructure development.
This consistency in defining capture, use, sharing, and exchange rules significantly reduces regulatory compliance costs while increasing the transparency and fluidity of responsible data flows both within and between organizations. Most importantly, it creates the foundation for data ecosystems that respect human dignity by design.
4. What stage is the standard at?
The standard was originally proposed as “Standard Taxonomy for Responsible Trading of Human-Generated Data”. The term “trading” was deemed too focused on potential commercial use of data sets and did not reflect the fundamental relationship to human privacy as a human right. Accordingly, “Trading” was changed to “Use and Privacy” to focus the standard on concerns about how data is appropriately shared and used.
The P2895 working group has been considering data governance challenges from a variety of sectors and domains, including telecommunications, health, finance, power and energy, affective/semiotic computing, transportation, legal, human rights, genomics, and human-centric applications.
The working group has started drafting a proposed standard and is looking for contributors to help continue its development and shape guidance on making privacy the centerpiece of the use of human-generated data.
5. What is the current geographical or disciplinary spread of your working members?
Our working group comprises a diverse coalition of privacy advocates, data scientists, ethicists, academics, end users, and consultancies. We strive to maintain global representation, with contributors from North America, Europe, Africa, Asia, and Australia, bringing together perspectives on data governance that reflect different cultural contexts, regulatory approaches, and implementation philosophies. This global representation ensures our standard addresses both universal principles and region-specific concerns.
6. What type of people might be interested or well-suited for this standards group?
The multifaceted nature of this standard makes it relevant to privacy specialists, data ethicists, database architects, AI developers, and compliance experts across industries. However, we recognize that control over human-generated data is emerging as a fundamental right touching every aspect of digital life—from smart city development to metaverse governance, social media moderation, and beyond.
We particularly seek contributors with expertise in regulatory compliance, technical data management implementation, human rights advocacy, and cross-sector integration. Our comprehensive scope (as reflected in our extensive Table of Contents) demands diverse perspectives from technology specialists, ethicists, legal experts, and policy architects who can help shape robust governance frameworks balancing innovation with human dignity.
Whether you're interested in contributing technical expertise, ethical perspectives, or end-user advocacy, your voice belongs in this conversation about how our data—and by extension, our digital selves—should be governed in the algorithmic age.
7. What triggered your own interest in this area?
What drew me to IEEE P2895 wasn't just the technical challenge—it was the moral and strategic imperative to bring integrity to how we define, share, and act on human-generated data. In systems design and governance, language is everything. When intent, definition, and execution don’t align, the cost isn’t just inefficiency—it’s harm.
P2895 creates a common framework that makes responsible data practices both actionable and interoperable. It doesn’t just describe what’s possible—it sets the boundaries for what’s appropriate. That distinction is critical.
Having contributed to IEEE 7010-2020 and 7014-2024, and worked closely with W3C on privacy frameworks, I’ve learned this: we can’t protect what we can’t clearly define. This standard helps close that gap. It gives institutions the clarity they need to embed accountability, and the tools to ensure digital systems reflect human values—not just machine logic.
I joined this effort because I believe the future of technology must be governed by consent, not confusion—by design, not default. P2895 is a chance to set that foundation while we can.
8. Call to Action
The P2895 working group invites contributors passionate about creating ethical data frameworks for our shared digital future. Whether you bring expertise in privacy, technical implementation, or ethical considerations, or simply care deeply about how your data is governed, your perspective is valuable. To learn more or get involved, visit our public website, or contact our IEEE SA program manager Christy Bahn at [email protected], our chair Angelo Ferraro, or me at [email protected] with specific inquiries.