Last Update 6:24 PM October 22, 2024 (UTC)

Organizations | Identosphere Blogcatcher

Brought to you by Identity Woman and Infominer.
Support this collaboration on Patreon!!

Tuesday, 22. October 2024

The Engine Room

Community Call: Learnings and reflections from our first ‘Deep Dive Week’


Join us on November 14 for our Community Call to reflect on our first Deep Dive Week

The post Community Call: Learnings and reflections from our first ‘Deep Dive Week’ appeared first on The Engine Room.


ResofWorld

India’s electric cab companies can’t find enough cars to put on the road

There are only a handful of electric taxi manufacturers in India today, slowing the country’s EV revolution.
Rahul Mathur has sworn by BluSmart since its early days. The electric-taxi rideshare company was his preferred choice for the daily commute as a startup founder in Delhi two years...

Elastos Foundation

Unlocking Staked ELA: Introducing Elastos’ BPoS NFT System


Staking cryptocurrency assets typically means locking them away for months, leaving users unable to take advantage of new opportunities—a common frustration in the crypto space. But what if the Elastos ecosystem allowed users to unlock staked Mainchain ELA without unstaking? With Elastos BPoS NFTs, users can convert their staked assets and APY rewards into NFT receipts on the Elastos Smart Chain's (ESC) open market.

The Bonded Proof of Stake (BPoS) NFT system offers a novel solution by combining Bitcoin-backed ELA staking security, ELA APY rewards, and tokenized liquidity through ERC-721 standard NFTs. This system allows users to stake ELA with validators and mint BPoS NFTs, representing their staked assets and accruing Mainchain rewards. These NFTs can be traded, collateralized, or transferred using Smart Contracts. Let’s explore how BPoS NFTs work, the Elastos consensus model, and the essential role these NFTs will play in the upcoming BeL2 Arbiter network. 

 

Understanding Elastos’ Dual Consensus Model

Elastos operates using a multi-layered consensus called Elastic Consensus, combining Auxiliary Proof of Work (AuxPoW) with Bonded Proof of Stake (BPoS) to secure the network and provide utility.

1. Auxiliary Proof of Work (AuxPoW): Bitcoin-Backed Security

Bitcoin’s Security: AuxPoW leverages Bitcoin’s mining infrastructure, allowing Bitcoin miners to secure both Bitcoin and Elastos simultaneously without extra energy costs. 293.69 EH/s of Bitcoin’s total 580.74 EH/s hash rate reinforces Elastos, giving it nearly 50% of Bitcoin’s security.

Benefits of AuxPoW:
- Energy Efficiency: Elastos inherits Bitcoin’s security without additional energy consumption.
- Network-Wide Trust: This shared security protects the Elastos Mainchain and sidechains, making the entire ecosystem highly reliable.

2. Bonded Proof of Stake (BPoS): Securing the Network with Validators

- Long-Term Staking: Users lock ELA on the Mainchain to secure the network and validate transactions.
- Incentivizing Participation: Users earn 2-3% APR, with higher rewards for longer lock periods. Validators must hold 80,000 staking rights to participate in block validation, sharing rewards with stakers.

This hybrid consensus model ensures that Elastos is anchored in Bitcoin’s security, while BPoS validators provide a second layer of decentralized governance and stability.
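As a quick sanity check on the figures quoted above, the merge-mined share works out to roughly half of Bitcoin’s hash rate. A minimal Python check:

```python
# Share of Bitcoin's total hash rate that merge-mines Elastos,
# using the EH/s figures quoted above.
elastos_ehs = 293.69
bitcoin_ehs = 580.74
print(f"{elastos_ehs / bitcoin_ehs:.1%}")  # -> 50.6%, i.e. about half
```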

 

How Elastos BPoS NFTs Unlock the Value of Staked ELA

Traditional staking locks assets, limiting liquidity. BPoS NFTs solve this problem for Elastos by converting Mainchain voting rights into ERC-721 standard NFTs that represent ownership of the underlying staked ELA and its accumulating rewards.

How BPoS NFTs Work

- Staking and Minting: Users stake ELA tokens on the Mainchain using the Essentials Wallet. Once staked, they can mint BPoS NFTs on the Elastos Smart Chain (ESC). The NFT represents both the staked ELA and the accumulating APY rewards.
- Trading and Transfer: BPoS NFTs can be freely traded or transferred on the Elastos Smart Chain. This allows users to unlock liquidity without ending their staking position or interrupting rewards.
- Burning and Claiming Rewards: NFT holders can burn NFTs anytime to claim APY rewards. The staked ELA remains locked until the lock period ends, at which point it can be withdrawn by the original staker.
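To make the lifecycle above concrete, here is a minimal, hypothetical sketch in Python. All names (`BPoSNFT`, `transfer`, `burn`) are illustrative only; the real system is implemented as ERC-721 contracts on the Elastos Smart Chain, and this is not Elastos code.

```python
# Hypothetical model of the BPoS NFT lifecycle described above.
# Illustration only -- not the actual Elastos contract interface.
from dataclasses import dataclass

@dataclass
class BPoSNFT:
    staked_ela: float       # principal locked on the Mainchain
    accrued_rewards: float  # APY rewards accumulated so far
    owner: str              # current holder on the ESC open market
    lock_ends: int          # block height when the stake unlocks

    def transfer(self, new_owner: str) -> None:
        # Trading the NFT moves the claim, not the underlying stake:
        # the ELA stays locked and keeps earning rewards throughout.
        self.owner = new_owner

    def burn(self) -> float:
        # Burning pays out accrued rewards at any time; the principal
        # itself is withdrawable only after the lock period ends.
        payout, self.accrued_rewards = self.accrued_rewards, 0.0
        return payout

nft = BPoSNFT(staked_ela=1000.0, accrued_rewards=25.0,
              owner="alice", lock_ends=1_500_000)
nft.transfer("bob")   # the claim changes hands; the stake stays locked
print(nft.burn())     # pays out 25.0 in accrued rewards
```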

 

BeL2 Arbiter Network: Unlocking New Financial Applications

In the upcoming BeL2 arbiter network, BPoS NFTs will act as collateral for BTC-based loans, stablecoin issuance, and dispute resolution services. Arbiter nodes using these NFTs as collateral will earn BTC-based and dApp rewards, in addition to Mainchain ELA rewards.

This decentralized financial infrastructure will allow users to participate in Bitcoin-native dApps without moving BTC off the mainnet, creating an easy integration between Bitcoin security and BeL2 DeFi applications, such as the upcoming “New Bretton Woods (NBW)” project, a Harvard student-led effort incubated by the Harvard Innovation Labs.

Collateralization in the BeL2 Arbiter Network

- Arbiter Entry: Users can stake ELA BPoS NFTs as collateral to participate as nodes in the BeL2 network.
- Earn BTC and dApp Rewards: BPoS NFT arbiter nodes earn percentage-based BTC and dApp rewards, alongside Mainchain ELA staking rewards, for supporting time-based transactions and dispute resolution services that reflect the value of their collateral.
- Decentralized Financial Services: The BeL2 arbiter network supports native BTC services: ensuring decentralized loans are completed, maintaining stablecoin pegs, liquidating assets based on market conditions, and providing dispute resolution, offering a secure, decentralized alternative to traditional financial systems.

 

So What are the Key Advantages of Elastos BPoS NFTs?

- Flexible Liquidity: Trade or transfer staked assets and their rewards as NFT receipts without waiting for the lock period to end.
- Bitcoin-Backed Security: 293.69 EH/s of Bitcoin’s hash rate secures the Elastos Mainchain, providing trust and reliability.
- New Financial Tools: Use BPoS NFTs as collateral for loans and arbitration services in the upcoming BeL2 network.
- Simple Wallet Management: Manage NFTs through the non-custodial Essentials Wallet, ensuring full control over staked assets.

Conclusion

The Elastos BPoS NFT system offers a groundbreaking solution by combining Bitcoin’s security with NFT liquidity. Users can mint, trade, or burn NFTs anytime, claiming APY rewards without interrupting their staking period. The ability to unlock staked ELA after the lock period ensures long-term rewards while maintaining flexibility.

With the upcoming BeL2 Arbiter Network, BPoS NFTs will serve as collateral for BTC-based loans and dispute resolution services, creating new earning opportunities. This innovative design reflects Elastos’ commitment to decentralization, offering secure and scalable financial tools for the future of blockchain-based finance.

With ELA’s fixed supply of 28.22 million, Bitcoin merge-mining security, and a 4-year halving cycle, participants benefit from both scarcity and sustainability. The Elastos BPoS NFT system sets a new standard for DeFi innovation by offering liquidity, security, and long-term value for users.

Want to mint a BPoS NFT? Tomorrow, we will release a step-by-step guide on how to mint your BPoS NFT. Did you enjoy this article? Follow Infinity for the latest updates here!

 


OpenID

Revisions to OpenID Process Document and IPR Policy Approved


A subgroup of OpenID Foundation board members and key staff have been working to update the “OpenID Process” document based on issues raised by some board members, to ensure the document aligns with how the Foundation currently works. This update addresses those original issues and also identified a significant number of mainly editorial issues and possible improvements. It also highlighted inconsistencies and other issues that required coordinating revisions with the “Intellectual Property Rights (IPR) Policy,” so that document was added to the scope of the proposed improvements. Full details, including the material changes made, can be referenced here.

The changes were unanimously approved by the board at the September 12, 2024 board meeting. Approving these changes also required a 21-day review and 14-day vote of the membership with a 30% quorum requirement.

I am pleased to announce that the updated Process Document and IPR Policy were approved by the membership this past Saturday, October 19, 2024, with 34% member participation, exceeding the 30% quorum requirement for the vote.

The voting results were:

Approve – 106 votes

Object – 1 vote

Abstain – 21 votes

Marie Jordan – OpenID Foundation Secretary

The post Revisions to OpenID Process Document and IPR Policy Approved first appeared on OpenID Foundation.

Monday, 21. October 2024

Hyperledger Foundation

Staff Corner: The importance of maintainers and contributors at LF Decentralized Trust


At LF Decentralized Trust, the driving force behind our projects is our maintainers and contributors. These dedicated individuals are the ones who roll up their sleeves and do the important planning, development, and governance work that not only sustains our projects but pushes the boundaries of innovation, collaboration, and community growth. Their collective contributions are instrumental to a vibrant and dynamic ecosystem.


ResofWorld

A local ride-hailing service can’t beat Uber and Bolt, so its drivers are beating up their rivals

Shesha was launched as a fair-minded local competitor to Western companies in South Africa. It has acquired an ugly reputation for intimidating drivers and passengers.
In July, Progress, a used-clothes trader, disembarked from his bus at Pretoria’s main station. The night was dark, and he knew that violent assault was a serious risk in the...

Digital Identity NZ

Shaping the Future of Open Banking in Aotearoa: DINZ Responds to Proposed Designation Regulations and Standards


21 October 2024

Digital Identity NZ (DINZ), through its Policy and Regulatory Subcommittee, has provided feedback to the Ministry of Business, Innovation and Employment (MBIE) for the proposed open banking regulations and standards under the Customer and Product Data Bill. This collaborative submission reflects insights from DINZ members across New Zealand’s digital identity sector, representing both large and small organisations.

DINZ fully supports the Bill’s goal to unlock the value of customer data, fostering competition and innovation. However, our submission highlights specific areas where the proposed rules could be enhanced to better achieve these objectives.

Empowering Customers through Open Banking

DINZ appreciates the Bill’s focus on giving customers control over their data, which can drive a more competitive and dynamic marketplace. However, we raised concerns around the prioritisation of the banking and electricity sectors, believing that a broader scope of competitive third-party providers is essential for success. Additionally, affordability is crucial, as the cost of third-party services could hinder widespread adoption.

Data Security and Privacy

Maintaining data security is a key focus for DINZ. While the Bill requires transparency from data holders and accredited requesters, we recommend aligning with the New Zealand Privacy Act 2020 to safeguard consumers’ data without unnecessarily disclosing sensitive information. This ensures robust protection and trust.

Learning from Australia’s Open Banking Journey

Reflecting on Australia’s slow uptake of open banking, DINZ cautions against over-reliance on the Digital Identity Services Trust Framework (DISTF) as a singular solution. A more holistic approach is needed to address identity, verification, and consent challenges in the context of open banking. Colin Wallis, Executive Director of Digital Identity NZ, says:

“DINZ supports the general direction indicated in the discussion paper, however it considers that not enough attention is being directed to the reasons behind the slow take-up in Australia. Additionally unintended consequences may arise from its seemingly over reliance on the DISTF as the magic bullet to resolve all the digital identity, verification, attribute exchange and consent – as much as we would all like that.”

DINZ is committed to working with MBIE to ensure a secure, efficient, and inclusive open banking framework that benefits all Kiwis.

You can read the full submission here: DINZ_Submission_on_CPD_Open_banking_designation_rules_10_Oct_2024_Final-Signed.pdf (digitalidentity.nz)

For media inquiries or further information, please contact:

Email: info@digitalidentity.nz
Phone: +64 9 394 9032

About Digital Identity NZ

Digital Identity NZ (DINZ) is a not-for-profit, membership-funded association with around 100 organisations from both the public and private sectors. Representing diverse industries and individuals, DINZ is the leading voice for digital identity in Aotearoa. As part of the New Zealand Tech Group (NZTech), we connect the digital identity community and actively influence policy and solutions. Our members play a crucial role in advancing digital identity across various sectors—from public-facing government services to open banking, account opening, and customer and product data. These initiatives rely on digital identity, working alongside AI, biometrics, and cloud technologies.

The post Shaping the Future of Open Banking in Aotearoa: DINZ Responds to Proposed Designation Regulations and Standards appeared first on Digital Identity New Zealand.

Friday, 18. October 2024

ResofWorld

Meet the mothers in small-town Hungary leading a fight against Chinese EV battery plants

Prime Minister Viktor Orbán’s goal to turn Hungary into a global EV battery hub is facing environmental backlash and legal challenges.
Eva Kozma watched as excavators raked up clouds of dust and flatbed trucks shuttled steel beams across a bustling construction site on the outskirts of a tranquil Hungarian village. A...

Thursday, 17. October 2024

Hyperledger Foundation

Hello Hiero! Building the Next Generation Open Source Distributed Ledger Technology Together


As the landscape of decentralized trust continues to evolve, open source technologies play a pivotal role in driving innovation and fostering collaboration. At the forefront of this transformation is Hedera, a fully open source public ledger that is rewriting the rules of blockchain governance and energy efficiency. Powered by the Hashgraph consensus algorithm, which is recognized as the most energy-efficient blockchain today, Hedera boasts a unique governance model that includes 31 recognized global leaders such as Google, IBM, Dell, Boeing, and Standard Bank.


ResofWorld

What are you wearing? Rest of World’s fashion quiz

Test your knowledge of e-commerce startups, 17th-century markets, and everything in between.
The internet has transformed the way the world shops, as fast fashion behemoths challenge traditional retailers and influencers supercharge new trends. Let’s see how well you know the global fashion...

The Catholic Church condemned Santa Muerte. TikTokers gave her a makeover

A growing number of Santa Muerte devotees are using social media to rebrand the controversial Mexican deity of death
On the eighth day of her journey across the Sonoran desert in the summer of 2004, Jessica Maribel Morales collapsed. She was trying to make it from Mexico to the...

Wednesday, 16. October 2024

ResofWorld

The most popular payment app in the Philippines has a side bet: online gambling

The GCash app is ubiquitous in the Philippines, and is being blamed for rising gambling addiction in the country, particularly among women.
The first time that Ana, a Filipino housewife, visited a gambling platform on her mobile phone, she was only looking to unwind for a little while. Soon, she was spending...

Blockchain Commons

Musings of a Trust Architect: Open & Fuzzy Cliques


Digital communities are collections of individual entities that are connected together. They can be modeled as graphs, with the individuals being nodes and their relationships being edges.

Traditionally, identity models have focused on the nodes, but in Musings of a Trust Architect: Edge Identifiers & Cliques, I suggested that both private keys and public-key identifiers could be based on the relational edges, and that when you combined a complete set of edges you could create a cryptographic clique, where the group was seen as an entity of its own, with the identities of any participants hidden through the use of a Schnorr-based signature.

My first look at cliques focused on the technical definition, which requires that cliques be “closed”, meaning that there’s a relationship between every pair in the group and that those pairwise edges form the clique identity among them.


However, creating closed graphs becomes increasingly difficult as the graph size grows. There are some alternatives which I discuss here: open cliques and fuzzy cliques. The entities forming a clique also don’t have to be people, as I discuss in cliques of devices.

Open Cliques

Cryptographic cliques don’t have to be fully closed. Open cliques are also possible. (In graph theory these technically are not called “cliques”, but I’m going to continue to use the term for cryptographic identifiers that are based on edges.)

While the concept of a fully connected clique provides clear value in graph theory, such structures can become computationally intensive, especially as the group size increases. Open cryptographic cliques, which are not completely interconnected, may then be used instead.
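The cost of closure grows quadratically: a closed clique of n members requires a relational edge (and an edge key) for every pair. A quick illustration, assuming one key per edge:

```python
# Pairwise edges needed for a fully connected (closed) clique of n
# members: n * (n - 1) / 2, each edge carrying its own relational key.
for n in (3, 10, 50):
    print(f"{n:>2} members -> {n * (n - 1) // 2:>4} edges")
# Output:
#  3 members ->    3 edges
# 10 members ->   45 edges
# 50 members -> 1225 edges
```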

Open cliques support different sorts of modeling, for groups where not everyone is connected and where the relationships are fluid. They also allow for easier growth: a clique can organically add a new member when a single participant creates a relationship with them, without the need to define the new member’s relationship to everyone in the clique (especially as most of those relationships would not exist).

For example, Bob might not actually have a close or independent relationship with his mother-in-law, Anna, while Mary’s best friend from college, Beth, might join the clique when she stays with the family, despite the fact that she only has a real relationship with Mary. (However, more relationships, and thus edges, might develop over time!)


While open cliques may lack the complete interconnectedness of their closed counterparts, they offer a realistic representation of the evolving nature of dynamic social relationships. One of the main questions regarding them is when and how to recognize new edges as an open clique evolves, and thus when and how to rotate the clique’s overall keys.

Fuzzy Cliques

As discussed in the appendix to this article, there are currently two major Schnorr-based MPC signature systems that could be used as the foundation of cliques: FROST and MuSig2. Each comes with its own advantages and limitations, but one of the advantages of using FROST is that it allows for the creation of fuzzy cliques, thanks to its ability to create threshold signatures (with m of n agreement required to sign where m≤n).

This allows group decisions or representations to be based on a subset (threshold) of members rather than requiring unanimity, as would be required when using MuSig2 in its native form. Using thresholds to define group interactions adds a degree of “fuzziness” or flexibility to the representation of those groups and their actions, at the price of higher latency and the fact that the theoretical implications are not as well studied.
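To see the m-of-n idea in miniature, here is a toy Shamir secret-sharing sketch in Python. This is emphatically not FROST — FROST shards a signing key and never reconstructs it during signing — but it shows the threshold mechanics that make a 2-of-3 group decision possible:

```python
# Toy Shamir secret sharing over a prime field, illustrating the
# m-of-n threshold idea. Illustration only; not FROST.
import random

P = 2**127 - 1  # a Mersenne prime, used as the field modulus

def make_shares(secret: int, m: int, n: int) -> list[tuple[int, int]]:
    """Split `secret` into n shares, any m of which recover it."""
    coeffs = [secret] + [random.randrange(P) for _ in range(m - 1)]
    def f(x):  # evaluate the degree-(m-1) polynomial at x
        return sum(c * pow(x, i, P) for i, c in enumerate(coeffs)) % P
    return [(x, f(x)) for x in range(1, n + 1)]

def recover(shares: list[tuple[int, int]]) -> int:
    """Lagrange-interpolate the polynomial at x = 0."""
    secret = 0
    for i, (xi, yi) in enumerate(shares):
        num, den = 1, 1
        for j, (xj, _) in enumerate(shares):
            if i != j:
                num = num * (-xj) % P
                den = den * (xi - xj) % P
        secret = (secret + yi * num * pow(den, -1, P)) % P
    return secret

shares = make_shares(secret=31337, m=2, n=3)    # a 2-of-3 threshold
assert recover(shares[:2]) == 31337             # any two shares suffice
assert recover([shares[0], shares[2]]) == 31337
```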

There’s one other catch: fuzzy cliques are the one situation where the Relationship Signature Paradigm can’t be used. Though we still create the relational edges, to allow any pair of participants in the clique to make joint decisions, the clique keys are created by the individual participants, not the edges, ensuring that we have thresholds of participants making decisions, not thresholds of edges (which would quickly become confusing!).

Even for a triadic clique, the privacy implications of using a threshold key to represent the clique are notable.


Imagine that the participants generated two FROST keys for the triadic clique, one with a 2-of-3 threshold and one with a 3-of-3 threshold. If everyone agreed, they could all sign with their share fragments of the 3-of-3 private key, and anyone could verify the result against the 3-of-3 public key and know that the group was in perfect consensus.

But what if you only required the consensus of two members of the group? After all, Joshua probably won’t be making a lot of decisions for a while. Theoretically, you could just sign with one of your relational edge keys, such as the Mary-Bob relational edge key. That demonstrates the consensus of two members of the clique and supports accountability: you know which two participants signed.

But, if you instead sign with the 2-of-3 threshold key for the clique you get to take advantage of the aggregatability that’s baked into Schnorr. With it, no one knows which two people signed (or indeed, if two or three people signed). They just know that at least the threshold of people within the group signed. It’s a powerful privacy enhancement that really shows off the power of fuzzy cliques.

Fuzzy cliques allow for real-world decision-making dynamics, where different sorts of decisions might require a single person’s agreement, a majority’s agreement, a super-majority’s agreement, or everyone’s agreement. This creates a model for fully decentralized decision-making that’s resilient and fault tolerant, all while supporting both individual privacy and group accountability (while still allowing for individual accountability using relational edges).

Cliques of Devices

Thus far, I’ve largely presumed that relational edges and cryptographic cliques are created by people. But, that doesn’t have to be the case: independent nodes in a graph can be entities of any type, including devices.

In my first article, I touched upon the idea that a clique could define not just a group, but also a singular person’s identity. This could be done using devices. Imagine that a person has a few devices that together form the basis of his digital identity: a hub of information that contains his credentials; a biometric ring that verifies his physical identity, primarily to unlock that hub; and a coordinator that allows the clique-identity to communicate with the network. The following diagram shows how our old friend Bob could be defined as an open clique including devices:


Using the clique-of-cliques model, this then might be the identity that’s linked in with Mary and Joshua to form their triadic nuclear-family clique:


Though these examples suggest a clique where devices and real people are mixed together, that’s not the only option. Another example might be a fuzzy clique made up of three automated factcheckers, which are all devices. Together, any two can issue a finding of “TRUE” or “FALSE”:


Again using the clique-of-cliques model, these fact checkers could then interact with other identities, such as Dan and Ty, who write together.


The Fact Checkers interact with the authors’ edge relationship (known by their joint pseudonym, “James”), to sign off on the validity of their work. Thanks to the aggregatability of Schnorr signatures, no one knows (or cares) that the Fact Checkers are three devices or the authors are two people!

Conclusion

Cliques offer a powerful new model for identity control (and more generally, for control of many sorts of digital assets). But, using closed cliques has drawbacks.

Two other models offer different utility:

- Open Cliques allow for the modeling of more realistic social situations while simultaneously reducing computational costs, but they create new questions for theoretical understanding and for figuring out how to maintain public and private keys for the clique.
- Fuzzy Cliques open up the possibility for authorizations, agreements, and other decisions to be made by portions of a group rather than the group as a whole, but they depend on either FROST or some other (theoretical) threshold signature system, and they disallow the creation of a clique using relational edges.

In addition, cliques don’t have to be made up only of people:

Cliques of Devices show how cliques could also include AIs, oracles, fact checkers, hardware wallets, biometric rings, and other computerized programs, and that they could interact either as parts of cliques or as separate entities!

These possibilities are just the beginning. I think that edge identifiers and cliques could be a powerful new tool for expanding the design of identities online.

How could you use them? How would you expand them? What would you like to see next?

Appendix: FROST & MuSig

There are currently two major Schnorr-based signature systems, FROST and MuSig2, both of which support Multi-Party Computation (MPC) signing.

FROST is a Schnorr-based multisig system that originated in a 2020 paper. As of 2024, it’s just coming into wide use thanks to projects such as ZF FROST and wallets such as Stack Wallet.

🟢 Possible efficiency improvements for larger cliques.
🟢 Supports thresholds (m of n).
🟢 Privacy for thresholds.
🛑 Limited accountability for thresholds.
🛑 Can’t build clique from edges if using thresholds.
🛑 More rounds for signing.
🟨 Allows Distributed Key Generation or Trusted Dealer Generation.

MuSig2 is a Schnorr-based multisig system that dates back to 2020 (when MuSig2 was introduced) and before that 2018 (when MuSig1 was introduced). It’s been well-studied and is detailed in BIP 328, BIP 390, and BIP 373, providing strong integration with Bitcoin, especially since its recent merge into libsecp256k1.

🛑 No thresholds (n of n).
🟨 But can mimic thresholds with Taproot trees.
🟢 Full accountability for signatures.
🟢 Fewer rounds for signing.
🟢 Can always build clique from edges.

Two of the features of Schnorr-based signature systems that best support edge identifiers and cryptographic cliques are aggregation and MPC.

- Aggregation. Schnorr signatures are aggregatable: they’re mathematically added together, producing a final multisig that’s the same size as an individual signature would be. As a result, signatures are indistinguishable: you don’t know how many people signed or who signed, simply that a signature is valid (or not).
- MPC. Multi-Party Computation means that each participant has a secret (here, a key share), which they can use together without revealing that secret. It’s what allows individuals to jointly create an edge-identifier key and then for edges to jointly create a clique key.
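To make the aggregation property concrete, here is a toy numeric sketch in Python over a tiny Schnorr group (not secp256k1). It is emphatically not MuSig2 or FROST: real schemes add nonce commitments and key-aggregation coefficients to prevent rogue-key attacks. It only shows why the combined signature is a single (R, s) pair that verifies against the aggregate key:

```python
# Toy 2-of-2 Schnorr aggregation in a tiny subgroup of Z_p* (p = 2q+1).
# Illustration only: no rogue-key protection, no nonce commitments.
import hashlib, random

p, q, g = 2039, 1019, 4   # g = 2^2 generates the order-q subgroup

def H(*parts) -> int:
    data = "|".join(str(x) for x in parts).encode()
    return int.from_bytes(hashlib.sha256(data).digest(), "big") % q

x1, x2 = random.randrange(1, q), random.randrange(1, q)  # private keys
X = pow(g, x1, p) * pow(g, x2, p) % p                    # aggregate pubkey

msg = "clique decision"
k1, k2 = random.randrange(1, q), random.randrange(1, q)  # nonces
R = pow(g, k1, p) * pow(g, k2, p) % p                    # aggregate nonce
c = H(R, X, msg)                                         # joint challenge
s = (k1 + c * x1 + k2 + c * x2) % q                      # summed responses

# One (R, s) pair, the same size as a single-signer signature; the
# verifier learns only that the aggregate key signed, not who or how many.
assert pow(g, s, p) == R * pow(X, c, p) % p
print("aggregate signature verified")
```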

For more on Schnorr, see my Layperson’s Intro to Schnorr.

Tuesday, 15. October 2024

EdgeSecure

Awards Presented at EdgeCon Autumn 2024 to Recognize Excellence


NEWARK, NJ, October 16, 2024 – Hosted in partnership with Kean University, EdgeCon Autumn on October 10 brought together attendees from universities and community colleges to explore how to achieve sustained success by uniting strategy and innovation through enterprise architecture. During the keynote panel, Blueprints for Success: Uniting Strategy and Innovation in Higher Education, Edge presented several awards to celebrate the incredible achievements and contributions that higher education institutions are making throughout the region. 

To recognize vision and leadership in the area of high performance computing, the New Jersey Institute of Technology (NJIT) was presented with the High Performance Computing Innovation Award. Over the past year, the technology and network teams at NJIT have built an exemplary HPC platform, including on-site and remote data center resources, that is secure, at scale, and meets the growing demands of their research community. 

At a time when higher education institutions are going through rapid change and facing unique challenges, Edge wanted to recognize the vision and leadership of the Metropolitan College of New York for seeking out the support and capabilities of the Edge network and community. To honor this commitment to collaboration, the College was presented with the Regional Network Partnership Award.

As attacks on higher education accelerate, and privacy regulations and compliance standards become increasingly complex, the cybersecurity burden on institutions has sharply increased. In acknowledgement of their dedication and diligent work in going above and beyond regulatory standards to keep their educational community safe, Middlesex College received the Community College Cyber-Preparedness Award.

Edge also recognized Jeremy Livingston, Chief Information Security Officer at Stevens Institute of Technology for his essential role in relaunching the Edge IT Security Community of Practice, which serves as a forum for collaboration and collective intelligence to fight cybersecurity threats. Livingston was presented with the Security Community Leadership Award for his outstanding commitment over the past year.

To help broaden the education community and support students in their learning journey, online learning programs are vital to the mission and success of modern higher education institutions. In recognition of their efforts to grow their online program rapidly, effectively, and without sacrificing quality, the award for Exemplary Online Program Leadership went to Rowan University.

Following EdgeCon Spring, which included a galvanizing keynote focused on artificial intelligence (AI), and the growing momentum and importance of the topic, the instructional team at Seton Hall University stepped up to engage the Edge community and partnered with Edge to host the first AI Teaching and Learning Symposium. To recognize the University’s continued leadership in the community and their participation in the initial cohort of the American Association of Colleges and Universities (AAC&U) Institute on AI, Pedagogy, and the Curriculum, Edge presented Seton Hall with the AI Education Leadership Award.

Edge also wanted to celebrate research that has had a significant scientific, societal, economic, or environmental impact. The Research Impact Award was presented to Stephen K. Burley, M.D., D. Phil. from Rutgers, The State University of New Jersey, for his Pioneering Work in Structural Biology: Transformative Contributions to Biomedical Research and Global Scientific Collaboration. Dr. Burley is an internationally recognized scholar and has published extensively in data science and bioinformatics, AI/machine learning, structural biology, and clinical oncology.

The Engaging Students in Collaborative Research Award recognizes research projects that involve significant collaboration between institutions or research teams that engage students in the research process. This honor was presented to Joseph Diaco, Professor, Camden County College, and Dr. Hieu Nguyen, Professor, Rowan University, who were principal investigators for the Precision Agriculture Using Drone/AI Technologies project, Blueberry Drone AI: Smart Farming of Blueberries using Artificial Intelligence and Autonomous Drones. The project aimed to equip students with hands-on experience in drone technology and AI, improve the accuracy of blueberry counting and health assessment through enhanced image recognition models, and achieve proof of concept for autonomous drone missions. 

EdgeCon Autumn 2024 not only served as a platform for meaningful dialogue on the future of higher education but also highlighted the remarkable efforts of institutions committed to excellence and innovation. Edge is excited to see how these inspiring achievements will continue to shape the landscape of higher education and empower students and institutions for generations to come.

About Edge: Edge serves as a member-owned, nonprofit provider of high performance optical fiber networking and internetworking, Internet2, and a vast array of best-in-class technology solutions for cybersecurity, educational technologies, cloud computing, and professional managed services. Edge provides these solutions to colleges and universities, K-12 school districts, government entities, hospital networks and nonprofit business entities as part of a membership-based consortium. Edge’s membership spans the northeast, along with a growing list of EdgeMarket participants nationwide. Edge’s common good mission ensures success by empowering members for digital transformation with affordable, reliable and thought-leading purpose-built, advanced connectivity, technologies and services.

The post Awards Presented at EdgeCon Autumn 2024 to Recognize Excellence appeared first on NJEdge Inc.


OpenID

An Outreach Workshop for Open Banking Chile


Mike Leszcz, OpenID Foundation Operations Director

This was a hybrid event with some CMF and ecosystem members participating in person in Santiago. OIDF was very fortunate to have founding member and long-time board member, John Bradley with Yubico, representing OIDF in person. The goal of the workshop was to introduce OIDF and OpenID specifications with a focus on FAPI 2.0 to the ecosystem as Chile will require FAPI 2.0 when the Chilean Open Finance System goes live.

OIDF Standards Overview

Victor Andrade, Senior Analyst with the CMF, opened the workshop, welcoming approximately 190 participants. Gail Hodges, OIDF Executive Director, kicked off the agenda with a brief introduction to OIDF, including how the Foundation operates and collaborates with other ecosystems, and then highlighted how to get involved.

Mark Haine, OIDF Technical Director, presented an overview of current OpenID specifications, including recommendations for new vs. existing ecosystems. This introduced a deeper dive into FAPI 2.0, delivered by Domingos Creado, who represents the OIDF certification team and is a valued FAPI Contributor. Domingos discussed key technical details of FAPI 2.0, including how it builds on FAPI 1.0 and is intended to be easier to implement. Domingos also confirmed that FAPI 2.0 is on track to be a Final Specification by the end of 2024.

At the request of the CMF, the workshop also included a high-level overview of the Shared Signals Framework (SSF) specification that improves API efficiency and security by providing privacy-protected, secure webhooks. It is in use by some of the largest cloud services to communicate security alerts and status changes of users, continuously and securely to prevent and mitigate security breaches. It is currently leveraged by two applications – the Continuous Access Evaluation Protocol (CAEP) and Risk Incident Sharing and Coordination (RISC) to achieve this result. Shared Signals WG co-chairs, Atul Tulshibagwale, CTO at SGNL, and Sean O’Dell, Senior Staff Security Engineer at Disney, provided this overview and addressed SSF questions.

Joseph Heenan, OIDF Specifications Specialist and Certification Director as well as a FAPI 2.0 Editor, provided an overview of the OpenID Certification Program. This included the value of certification, such as how ecosystems that mandate FAPI and FAPI certification achieve high security while enabling interoperability. He noted that FAPI 2.0 conformance tests and certifications are currently available, with a number of OP and RP certifications from the ConnectID private ecosystem in Australia. Joseph highlighted that a number of conformance test suites for other OpenID specifications are currently in development and will be made available for certification once in production.

Ecosystem Engagement

The workshop then turned to ecosystem engagement, facilitated by OIDF Operations Director, Mike Leszcz. Mike spoke about the ecosystems that OIDF has partnered with in recommending or mandating FAPI adoption and FAPI certifications. He noted that OIDF is also supporting some ecosystems that are in the process of going live with their open finance/open data ecosystems.

This overview introduced the strong partnership that OIDF has had with Open Finance Brazil (OFB) over the last several years, as OFB mandates FAPI adoption and certification, with annual recertification required. We were privileged to have Elcio Calefi, CIO at OFB and OIDF board member, present “Technology in Finance – Innovation, Security and Inclusion,” highlighting OFB’s journey from including FAPI in the Brazilian open finance regulation to operationalizing the mandate for FAPI adoption and certification.

Questions Answered

After a lunch break, OIDF presenters and workshop participants reconvened for a Q&A session that addressed hot topics such as the lifecycle of the standards, the use of mTLS, the implementation of refresh tokens, and the practical aspects of changing the scope of authorizations or grants, among others. Other topics during this session included:

- Certification costs and OIDF’s recommendations regarding the implementation of certification processes.
- Adaptations to the applicable profile(s) for Chile and OIDF’s position on possible deviations that a local implementation may have from the plain vanilla standard.
- OIDF recommendations regarding the use of RAR/PAR, especially in replay attack threat scenarios.
- Questions on the DCR single profile.
- Inclusion of data finality principles and their relationship to the FAPI standard.
- Questions on how OIDF has approached embedded finance for FAPI compliance, in particular where authorizations come from or are managed by third parties.

John Bradley, representing OIDF and an author on a number of the specifications being discussed, took the lead on many of the topics during the Q&A session with support from the workshop presenters. The Q&A session allowed additional time for the Chilean Open Finance System participants to dive deeper into the workshop topics.

OIDF thanks our colleagues at the CMF for their support and coordination of these two important events in support of the Chilean Open Finance System.

Links to the session recordings and workshop deck can be found on the OpenID Foundation’s Presentations and Media page.

About the OpenID Foundation

The OpenID Foundation (OIDF) is a global open standards body committed to helping people assert their identity wherever they choose. Founded in 2007, we are a community of technical experts leading the creation of open identity standards that are secure, interoperable, and privacy preserving. The Foundation’s OpenID Connect standard is now used by billions of people across millions of applications. In the last five years, the Financial Grade API has become the standard of choice for Open Banking and Open Data implementations, allowing people to access and share data across entities. Today, the OpenID Foundation’s standards are the connective tissue to enable people to assert their identity and access their data at scale, the scale of the internet, enabling “networks of networks” to interoperate globally. Individuals, companies, governments and non-profits are encouraged to join or participate. Find out more at openid.net.

The post An Outreach Workshop for Open Banking Chile first appeared on OpenID Foundation.


Announcing the IPSIE Working Group


The OpenID Foundation is delighted to announce the formation of the Interoperability Profiling for Secure Identity in the Enterprise (IPSIE) Working Group. This WG aims to tackle key challenges that underlie identity security in today’s enterprise environments. 

The Core Challenge

Identity and Access Management (IAM) within the enterprise is a multifaceted endeavor, as indicated by the growing Body of Knowledge maintained by IDPro. There is a broad range of specifications that are relevant to securing the many IAM functions that underpin operations. Some of these are OIDF standards – like OpenID Connect, FAPI, and Shared Signals – while others are maintained in different standards bodies. For example, IPSIE has already identified the IETF’s OAuth 2.0 and System for Cross-Domain Identity Management (SCIM) as relevant to their initial scope (below). But these specifications are written to support many contexts and use cases; they contain optionality that reduces the likelihood that independent implementations will interoperate. 

The IPSIE Working Group will develop secure-by-design profiles of these existing specifications with a primary goal of achieving interoperability across enterprise implementations.

Getting Involved

According to its Charter, the IPSIE WG will initially focus on standards that support:

- Single Sign-On
- User Lifecycle Management
- Entitlements
- Risk Signal Sharing
- Logout
- Token Revocation

As of this publication, the WG is meeting weekly on Tuesdays, though Contributors should always check the OpenID Calendar for any updates to the schedule. To stay up-to-date with the latest news, please join the IPSIE mailing list.

About the OpenID Foundation

The OpenID Foundation (OIDF) is a global open standards body committed to helping people assert their identity wherever they choose. Founded in 2007, we are a community of technical experts leading the creation of open identity standards that are secure, interoperable, and privacy preserving. The Foundation’s OpenID Connect standard is now used by billions of people across millions of applications. In the last five years, the Financial Grade API has become the standard of choice for Open Banking and Open Data implementations, allowing people to access and share data across entities. Today, the OpenID Foundation’s standards are the connective tissue to enable people to assert their identity and access their data at scale, the scale of the internet, enabling “networks of networks” to interoperate globally. Individuals, companies, governments and non-profits are encouraged to join or participate. Find out more at openid.net.

The post Announcing the IPSIE Working Group first appeared on OpenID Foundation.


Oasis Open Projects

Coalition for Secure AI Forms Technical Steering Committee to Advance AI Security Workstreams


J.R. Rao of IBM and Akila Srinivasan of Anthropic Elected to the OASIS Open Project's TSC Leadership

Boston, MA, USA, 15 October 2024 – The Coalition for Secure AI (CoSAI), an OASIS Open Project, announced the formation of its Technical Steering Committee (TSC), which is responsible for the overall technical health and direction of the project. The TSC will advise the Project Governing Board (PGB), oversee releases, and manage the efforts of the project’s three initial workstreams along with their respective chairs, contributors, and maintainers. The TSC will promote initiatives that align with CoSAI’s mission to promote secure-by-design AI systems.

J.R. Rao from IBM and Akila Srinivasan from Anthropic have been elected co-chairs of the TSC. They will play a central role in steering the direction of the workstreams to ensure that they contribute to the overall goals of CoSAI. J.R. and Akila bring a wealth of experience and leadership from their respective organizations and will be instrumental in driving CoSAI’s technical direction.

“Securing AI, openly and collaboratively, will be critical for inspiring trust and enabling its acceptance by consumers and enterprises alike. As TSC co-chair, I am committed to guiding CoSAI’s three workstreams to establish best practices and frameworks that enhance the security of AI systems,” said J.R. Rao, TSC co-chair, of IBM.

“As co-chair of the CoSAI TSC, I’m committed to developing frameworks and controls that help us attest to the trustworthiness and integrity of AI models,” said Akila Srinivasan of Anthropic. “By fostering transparency and control, we empower organizations to build secure and responsible AI systems that protect users and pave the way for a safe and innovative future.”

The TSC has launched three workstreams aimed at advancing the security of AI systems and will oversee their efforts to establish best practices, governance, and frameworks for AI security:

- Software Supply Chain Security for AI Systems: This workstream focuses on enhancing AI security by addressing the challenges of third-party model risks, provenance, and AI application security. It builds upon widely recognized security frameworks like the SSDF and SLSA, extending them for AI development.
- Preparing Defenders for a Changing Cybersecurity Landscape: Designed to equip defenders with a comprehensive framework, this workstream will focus on identifying necessary security investments to counter emerging AI-driven offensive capabilities.
- AI Risk Governance: This workstream will develop a comprehensive risk and controls taxonomy, checklist, and scorecard for assessing, managing, and monitoring the security of AI systems across industries.

The governance structure for these workstreams ensures community collaboration, transparency, and alignment with CoSAI’s long-term goals. For more details on the governance model, visit the TSC and Workstream Governance documentation in GitHub.

About CoSAI:

CoSAI is an open source ecosystem of AI and security experts from industry-leading organizations dedicated to sharing best practices for secure AI deployment and collaborating on AI security research and product development. CoSAI operates under OASIS Open, the international standards and open source consortium. 

Media inquiries: communications@oasis-open.org

The post Coalition for Secure AI Forms Technical Steering Committee to Advance AI Security Workstreams appeared first on OASIS Open.


ResofWorld

WhatsApp vigilantes in India are converting Christians by force

How far-right Hindu nationalists use WhatsApp to target Christian families when they’re most vulnerable — by preventing them from burying their dead.
It was the day of his mother’s funeral, but Jaldhar Kashyap knew the dozens of people descending on his home weren’t there to offer condolences. When his mother was diagnosed...

South Africa’s migrant delivery workers find safety in numbers

From sharing vital information to fundraising for medical bills, informal unions have become a lifeline for South Africa’s delivery workers.
South Africa has long been a hub for migrants and refugees, who come from neighboring countries seeking opportunities in Africa’s strongest economy. But the country is also notorious for high...

Blockchain Commons

2024 Q3 Blockchain Commons Report


Blockchain Commons’ work to create open, interoperable, and secure digital infrastructure continued in Q3 2024. Here were some of our main topics of interest:

- Gordian Envelope: Videos, TPAC, dCBOR & Unicode
- Seed Recovery: BIP-85, SSKR for Ledger
- FROST: FROST Implementers Meeting, FROST in Gordian, Stack Wallet
- Reference Upgrades: Gordian SeedTool for iOS 1.6.2, Swift 6 Stack Upgrade, More Envelope Signatures in Rust
- Developer Resources: Stack Organization, New Envelope Pages
- What’s Next?

Gordian Envelope

Gordian Envelope, Blockchain Commons’ privacy-preserving data-interchange format for data at rest and (using GSTP) data on the wire, remains one of our top priorities. This quarter, we worked to make it more accessible and explored new cases for its usage.

Videos. We produced a trio of videos to offer an introduction to Gordian Envelope: a teaser, an overview, and a look at extensions. They’re must-watch viewing if you’re interested in adopting a data-storage and data-interchange format that actually focuses on privacy.

(Videos: Envelope Teaser, Understanding Envelopes I, Understanding Envelopes II.)

Presentations at W3C TPAC (Technical Plenary and Advisory Committee): We’ve worked extensively on using Gordian Envelope to store digital assets such as seeds and SSKR shares. At TPAC 2024 this year, we presented some new thoughts on using various Envelope and CBOR alternatives in the rechartered DID group, where Christopher is an Invited Expert. We also discussed using Gordian Envelope for some specific DID use cases, which we hope to explore more. There’s more in the minutes and the slides.

dCBOR & Unicode. Gordian Envelope is built on dCBOR, our deterministic CBOR profile. In Q3, we updated our dCBOR Internet-Draft to v11. This was to incorporate Unicode Normalization Form C (NFC), to ensure that Unicode strings, used for all text in Gordian documents, will always be deterministic.
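A minimal Python illustration of the problem NFC solves: the same visible text can have two different code-point sequences, which would yield two different serializations without normalization:

```python
# Why dCBOR applies Unicode NFC: the same visible string can have two
# byte encodings, which would break determinism if serialized as-is.
import unicodedata

composed   = "\u00e9"    # 'é' as a single code point
decomposed = "e\u0301"   # 'e' + combining acute accent

assert composed != decomposed                      # different code points
assert unicodedata.normalize("NFC", decomposed) == composed
print("NFC makes both forms serialize identically")
```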

Seed Recovery

The safe storage and recovery of seeds has long been a focus at Blockchain Commons, because it’s the heart of #SmartCustody. Our August 7th Gordian Developers Meeting focused on the topic and gave community members the ability to talk about their own efforts.

BIP-85. Aneesh Karve presented on BIP-85. This is a methodology for deriving many secrets from a single seed.

SSKR for Ledger. SSKR has been one of Blockchain Commons’ most successful releases because it allows developers to safely use Shamir’s Secret Sharing. Aido has incorporated SSKR into Ledger Seed Tool, which now allows you to shard your Ledger secrets yourself (without depending on Ledger Recovery and Ledger’s privacy-busting KYC-compliant partners).

(Videos: Seed Recovery, BIP-85, SSKR for Ledger.)

Our Gordian Developer community is one of our most important resources to ensure that we’re doing work that meets the needs of wallet developers. Sign up for our Gordian Developer announcements to get the latest info on our upcoming meetings!

FROST

FROST is an up-and-coming multisig method that takes advantage of Schnorr-based signatures and Multi-Party Computations (MPCs) for key generation and signing. It’s an important new technology for creating keys that are more resilient and more secure. We’ve been supporting it for more than a year now.

FROST Implementers Meeting. Our second FROST Implementers Meeting occurred on September 18th. It gave people working on FROST specs, libraries, and cryptography the ability to talk about their most recent challenges. We’ve got a full record of the event, including videos, slides, summary, and transcript. It was great to bring the community together and plan for the future!

(Videos: ChillDKG, FROST Federation, secp256k1-zkp, Serai DEX, FROST UniFFI SDK, ZF FROST Updates.)

FROST in Gordian. We’ve been doing our own work with FROST! Our Rust and Swift Gordian stacks are switching to fully BIP-340 compliant Schnorr signatures. We’ve also been experimenting with FROST support, to allow the FROST signing method using the Trusted Dealer model. We’re waiting on an updated release of the secp256k1 Rust crate so that we can publish our own Rust crates and Envelope-CLI, but we hope to have our full reference implementation available within the month.

Stack Wallet. We did some light review of Stack Wallet this quarter; it’s the first wallet we know of that incorporates FROST. We’d love to see a security review of its FROST design, but from what we can see from usage, it not only implements FROST but also supports changing thresholds on the fly, which is one of FROST’s really amazing capabilities.

We will have a FROST Developer’s Meeting on December 4th that will provide advice & support for wallet developers who want to implement FROST. We’ve already scheduled Stack Wallet to give a presentation, since they’ve already done it!

Thanks to The Human Rights Foundation for their support of our FROST work in 2024.

Reference Upgrades

Our reference apps and libraries suggest best-practices and offer examples on the uses of our specifications.

Gordian SeedTool for iOS 1.6.2. We released a minor update of Gordian Seed Tool for iOS that makes our card entropy compatible with other sources, that allows the export of SSKR Envelopes in UR format, and that resolves a few other incompatibilities.

Swift 6 Stack Upgrade. We also upgraded our entire Swift stack to Swift 6. This allows us to take advantage of the Swift 6 concurrency model, remove unnecessary dependencies on forked libraries, and convert the tests of some modules to the new Swift Testing framework. This work can already be found in our Swift libraries, but we’re waiting to release a new Seedtool for iOS until we have other new features to deploy.

More Envelope Signatures in Rust. Fully BIP-340 compatible signatures are just one of our expansions to our Envelope Rust reference libraries. You can now also do Ed25519 signing (again, as soon as we’re able to release our new crates).

Developer Pages

Our Developer Pages are intended to help wallet developers to use our specifications (and other important standards like FROST). If there’s anything you’d like to see that isn’t on the pages, please let us know. This quarter, we made some major updates.

Stack Organization. Our biggest upgrade was a reorganization of the website to focus on the technology stacks that we offer. We have a core stack (which is our fundamental techs like dCBOR and Envelope), a user experience stack (which makes it easier for users to transmit and view data), and a crypto stack (which does the heavy lifting of things like sharding seeds). This is how it all fits together!

New Envelope Pages. Last quarter, we did work on the Gordian Sealed Transaction Protocol (GSTP). This quarter, we incorporated that into our developer pages, with new content for GSTP and Encrypted State Continuation, plus updates to our look at Collaborative Seed Recovery.

What’s Next?

Our most exciting work planned for Q4 may be our December 4th FROST Developers Meeting. If you are considering incorporating FROST into your own work, please be sure to sign up for our announcements-only Gordian Developers list to receive notifications about the meeting.

Or, our most exciting Q4 work may be our new work on cliques, which we think is an innovative new way to look at identity. We’ve released the first article on the topic, with a few more to come.

We’ll generally be talking with members of the identity and credentials community in Q4, including a presentation at the W3C Credentials Community Group, planned for October 22nd.

We’re also looking to roll out our work on FROST and Ed25519 signing, which just requires the official deployment of an updated secp256k1 Rust crate.

There are more projects under consideration! We’re thinking about producing a “Gordian Companion” to offer a reference for storing SSKR shares. We’re looking into more grants, as funding continues to be poor for many of our partners. (You can help by becoming a sponsor for us at any level!) And of course we’re looking forward to 2025!

Monday, 14. October 2024

Digital Identity NZ

DINZ Executive Council Elections & Annual Meeting 2024

Kia ora,

In December 2019, members elected the first Digital Identity NZ Executive Council. The Council is the governing group for the association, providing guidance and direction as we navigate the evolving world of digital identity in Aotearoa. Each Council member is elected for a two-year term, with elections held annually, and results notified at the Annual Meeting in December. As we approach the end of the year, it is time for nominations for the Council seats coming up for re-election.

Executive Council Nominations

There is now an opportunity to put yourself forward, or nominate someone else, for a role on the Digital Identity NZ Executive Council. This year we have vacancies for the following positions:

Corporate – Major (2 positions)
Corporate – Other (2 positions)
SME & Start-up (2 positions)

The nominees for the above positions must be from a Digital Identity NZ member organisation (including government agencies) and belong to the same Digital Identity NZ Membership Group they are to represent on the Executive Council. If you are unsure of your organisation’s membership category, please email elections@digitalidentity.nz.

All nominations must be entered into the online form by 5pm, Monday 4 November 2024.

Nomination Form

Digital Identity NZ Executive Council roles and responsibilities include:

Direct and oversee the business and affairs of Digital Identity NZ.
Attend monthly Executive Council meetings, usually two hours in duration (video conferencing is available).
Represent Digital Identity NZ at industry events and as part of delegations.
Assist in managing and securing members for Digital Identity NZ.
Participate in Digital Identity NZ working groups and projects.
Where agreed by the Executive Council, act as a spokesperson for Digital Identity NZ on issues related to working groups or projects.
Be a vocal advocate for Digital Identity NZ.

Online Voting

Voting will take place online in advance of the meeting, with the results announced at the Annual Meeting. Please refer to the Charter for an outline of Executive Council membership and the election process. Each organisation has one vote, which is allocated to the primary contact of the member organisation.

Annual Meeting 2024

The Annual Meeting is scheduled for 10:00am on Thursday, 5 December 2024, and will be held via Zoom.

REGISTER NOW

Notices and Remits

If you wish to propose any notices or motions to be considered at the Annual Meeting, please send them to elections@digitalidentity.nz by 5:00pm on Thursday, 14 November 2024.

Key Dates:

14 October: Call for nominations for Executive Council representatives issued to members
4 November: Deadline for nominations to be received
11 November: List of nominees issued to Digital Identity NZ voting members and electronic voting commences
14 November: Any proposed notices, motions, or remits to be advised to Digital Identity NZ
5 December: Annual Meeting, results of online voting announced

Background:

From the beginning, we have asked that you consider electing a diverse group of members who reflect the diversity of the community we seek to support. We ask that you do so again this year. The power of that diversity continues to shine through in the new working groups this year, particularly as we consider the importance of Te Tiriti, equity, and inclusion in a well-functioning digital identity ecosystem.

The Council has identified several areas where diversity, along with expertise in the digital identity space, could help us better serve the community. Nominations from organisations involved in kaupapa Māori, civil liberties, and the business and service sectors are particularly encouraged. We also encourage suggestions from young people within your organisations, as their viewpoint is extremely valuable and relevant to the work we perform. As an NZTech Association, Digital Identity NZ adopts its Board Diversity and Inclusion Policy, which you can read here.

The post DINZ Executive Council Elections & Annual Meeting 2024 appeared first on Digital Identity New Zealand.


Elastos Foundation

Content is King, Distribution is Queen. Digital Empowerment Means Controlling Both.

Every click, share, and like adds value to others. Every online interaction you make generates valuable data that fuels advertising, shapes consumer insights, and drives the growth of tech giants. Yet, despite being constant producers of digital capital, most of us are neither recognised nor compensated for our contributions. Your online activities are not trivial—they are digital assets with significant value. Content is king, and distribution is queen. Digital empowerment means controlling both. You are the creator; you must own and profit directly from the value you generate online. Let’s dive in.

Traditionally, human labor has been viewed as a liability—a cost to be minimised. As automation increasingly replaces routine jobs, many people will feel displaced. Companies, understandably, seek greater productivity to stay competitive, favoring machines that don’t require salaries or breaks. So, where do humans fit in the future? We must shift from seeing ourselves as physical liabilities to recognising that we are digital assets in the online world. To achieve this, we need to capture the value that is our birthright.

Silicon Valley would be nothing without your data. Machines lack the creativity, empathy, and originality that are inherently human—AI even trains on these traits, but where does the value flow back to you? Authenticity is your strength. Recognise your unique human value. You can thrive where machines cannot by turning your talents into digital assets. Technology is a tool that empowers you, while you remain the creator—shaping a digital future where your strengths and individuality truly thrive.

Distribution, at its core, is the process of delivering goods from owners to consumers. Traditionally online, this has been facilitated by third-party servers and platforms that act as intermediaries between you and your audience. We connect to external servers because they provide the infrastructure and reach needed to share our content and thoughts widely. However, this reliance often comes at the cost of dependence, loss of control, and unfair revenue sharing, as these intermediaries may impose strict terms and conditions and dictate how and where your data is shared.

Elacity, built on the Elastos SmartWeb, offers a global marketplace for selling, leasing, and sharing content. Smart contracts—self-executing blockchain agreements—automate digital rights management, access control, and royalty payments. Content is encrypted using decentralized identities, with licensing terms embedded as tokenized rights for trading. Every time your content is sold, smart contracts enforce your terms, ensuring access for the buyer and immediate payments. Let automation work for you.
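
For a concrete (and purely hypothetical) picture of the royalty logic such contracts automate, consider this Rust sketch. The split structure, payees, and percentages are invented for illustration and are not Elacity's actual contract code:

```rust
// Conceptual sketch of the royalty logic a rights-management contract might
// enforce on each sale. Illustrative only; not Elacity's contract code.

/// A stakeholder's share of sale proceeds, in basis points (1/100 of a percent).
struct RoyaltySplit {
    payee: String,
    basis_points: u32, // 10_000 bp = 100%
}

/// Distribute a sale amount according to the embedded licensing terms.
fn distribute(sale_amount: u64, splits: &[RoyaltySplit]) -> Vec<(String, u64)> {
    assert_eq!(
        splits.iter().map(|s| s.basis_points).sum::<u32>(),
        10_000,
        "shares must cover the full sale"
    );
    splits
        .iter()
        .map(|s| (s.payee.clone(), sale_amount * s.basis_points as u64 / 10_000))
        .collect()
}

fn main() {
    let terms = vec![
        RoyaltySplit { payee: "artist".into(), basis_points: 8_500 },
        RoyaltySplit { payee: "producer".into(), basis_points: 1_000 },
        RoyaltySplit { payee: "platform".into(), basis_points: 500 },
    ];
    // A sale of 1,000,000 base units pays every stakeholder instantly.
    for (payee, amount) in distribute(1_000_000, &terms) {
        println!("{payee}: {amount}");
    }
}
```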

Now let’s imagine owning fractional rights to an AI model, where every subscription instantly pays you. Picture buying a song directly from your favourite artist or renting a movie through a global marketplace. Envision robotics-as-a-service: self-driving cars, 3D printing, healthcare support, and smart locks—all leased directly from owners, with instant access and payments for everyone involved. This is digital participation. 

Now let's shift focus from smart contracts for rights management to data distribution. The encrypted data you own still needs to be stored and managed. Imagine building your own distribution hub—storing data locally while making it available globally, streaming directly from your home. You can create and control communities, with access, security, and payments enforced by smart contracts on your blockchain. This is 100% independent, decentralised distribution—where you own both the content and the distribution.

By hosting a NAS-like station at home, connected to the Elastos SmartWeb network, you can store, process, communicate, and stream your content globally as an independent “Channel” on a decentralised, interconnected network. Instead of relying on third-party cloud servers, you distribute your content directly from your own property, whether as an individual or an organisation. Blockchain technology ensures that access, decryption, and playback rights are validated first, guaranteeing ownership and security. Alternatively, you can monetise your resources by renting out available storage space and computational power to others, creating a service where everyone can support the network.

Imagine millions of nodes cross-communicating, each owning their narratives, building authentic followers, and, for the first time, truly owning their digital selves. With Elacity on Elastos, smart contracts govern access and payments for your encrypted digital assets, creating a fully decentralised and automated system for security, privacy, and monetisation. This ensures that rights are honored and payments are fulfilled. Royalties are on autopilot, distributed instantly to all stakeholders across millions of smart contracts each time anyone globally interacts to purchase access rights for diverse assets and channels. This is the Elastos SmartWeb: a new internet layer—more open, secure, and user-centric.

Distribution hubs or “Channels” can be public, privately owned by groups, or managed individually. You have the freedom to collaborate with others to expand your reach. Whether you join community hubs with shared profits and collective decision-making or operate your own hub, you set the rules and create specialised environments. This approach fosters a sense of community and shared success.

By offering decentralised rights management and distribution, Elastos allows you to focus on what you do best—creating. You can own, distribute, and trade your digital assets, directly influencing the broader economy. Elastos sets the stage for the future of distribution, while Elacity launches the SmartWeb party, turning your online presence into real value and empowering you to control your economic destiny.

We must advocate for fair and inclusive digital policies, setting a new standard for transparency, inclusivity, and opportunity in the digital economy. Everyone should benefit from their contributions and recognise the shifting landscape of automation. Elastos stands for pure integrity, while Elacity sets new standards for fairness, ensuring your voice matters in the evolving digital world. Own your rights. Control your distribution. Now is the time to transition from being undervalued to becoming an empowered digital asset owner, with the freedom to reach a global audience on your terms.

Own. Distribute. Prosper. Join Elacity today. Embrace direct distribution, automate your rights management, and unlock your full potential in the digital economy. Take back control of your digital life. Explore Elastos and Elacity today, and become part of the movement to restore the internet to its rightful owners—you. Did you enjoy this article? Follow Infinity for the latest updates here!


ResofWorld

TikTok wants to turn millions of Americans into paid shopping influencers

You only need around 1,000 followers to earn money eating noodles and selling toys on TikTok Shop.
Brandy Leigh, a 50-year-old mother of six in Indiana, was looking for a career that would allow her to work from home. After raising children for most of her adult...

FIDO Alliance

The FIDO Alliance Launches Comprehensive Web Resource to Accelerate Passkey Adoption

Passkey Central provides leaders with education about passkeys and steps to implement them for consumer sign-ins

October 14, 2024 — Carlsbad, CA —  The FIDO Alliance today announced Passkey Central, a new web resource where consumer service providers can learn more about why and how to implement passkeys for simpler and more secure sign-ins.

Passkeys, an easy-to-use and secure replacement for passwords, are already available for consumer services around the world including Adobe, Amazon, Apple, eBay, Google, Hyatt, Microsoft, Nintendo, NTT DOCOMO, PayPal, PlayStation, Shopify and TikTok. More than 13 billion user accounts can now leverage passkeys. Passkeys offer significant benefits to implementing organizations, including faster user sign-ins, higher sign-in success rates, reduced account takeovers, reduced costs associated with authentication, and lower cart abandonment. Passkey Central provides product leaders and architects with the information required to implement and realize similar benefits with passkeys.

Passkey Central provides visitors with actionable, data-driven content to discover, implement, and maintain passkeys for maximum benefits over time. The comprehensive resources on Passkey Central include:  

Introduction to passkeys
Business considerations and metrics
Internal and external communication materials
Implementation strategies & detailed roll-out guides
UX & Design guidelines
Troubleshooting
And more implementation resources, such as glossary, Figma kits, and accessibility guidance

Service providers should go to passkeycentral.org to get started with passkeys.

“Passkeys are the simplest and most secure way for consumers to access the global connected economy,” said Andrew Shikiar, CEO of FIDO Alliance. “The early adoption of passkeys has been remarkable and it is now time to help more service providers break their dependence on passwords. Passkey Central will accelerate the use of passkeys by providing product leads and architects with independent and authoritative guidance on why and how to implement passkeys for their own website and services.”

A research-backed public resource

The content for Passkey Central is based on several years of FIDO Alliance research, including subject matter expert interviews, focus groups and UX testing, to determine what guidance businesses need when implementing passkeys. Investment and participation from the following companies as Founding Underwriters enabled the underlying research, web and content development costs required to launch Passkey Central: Craig Newmark Philanthropies, Google, Trusona and Yubico.

“Our adversaries attack nations in cyberspace using techniques that are blocked by passkeys and related technologies. We need to do what we can to accelerate passkey adoption, and to help regular people understand that passkeys protect countries, and make their online lives a little easier.” – Craig Newmark, Founder and ISR, Craig Newmark Philanthropies

“Trusona is committed to revolutionizing the authentication experience for digital businesses, ensuring customers can sign up and sign in simply, swiftly, and securely. Passkey Central brings that mission to life with a new resource that will positively impact people’s digital lives today and in the future.” – Ori Eisen, CEO, Trusona

“Phishing attacks resulting from stolen login credentials is one of the greatest cybersecurity risks facing individuals and enterprises today. In order to achieve a phishing-resistant passwordless future, the solution is clear: prioritize education on passkey implementation and broad support for passkey authentication options globally. Passkey Central is a major step toward achieving this goal, and we look forward to working with the FIDO Alliance toward accelerating adoption of passkeys.” – Derek Hanson, VP, Standards and Alliances, Yubico

“The best way to accelerate passkey adoption is to give website owners and app owners the information they need to get oriented with the benefits of passkeys and guidance on how they can start deploying passkeys. FIDO’s Passkey Central will be a key resource that helps meet this need.” – Sam Srinivas, Product Management Director, Google and FIDO Board Rep for Google.

For more information about Passkey Central, visit passkeycentral.org.

About the FIDO Alliance

The FIDO (Fast IDentity Online) Alliance was formed in July 2012 to address the lack of interoperability among strong authentication technologies and remedy the problems users face with creating and remembering multiple usernames and passwords. The FIDO Alliance is changing the nature of authentication with standards for simpler, stronger authentication that define an open, scalable, interoperable set of mechanisms that reduce reliance on passwords. FIDO Authentication is stronger, private, and easier to use when authenticating to online services. For more information, visit www.fidoalliance.org.

Contact

press@fidoalliance.org


FIDO Alliance Publishes New Specifications to Promote User Choice and Enhanced UX for Passkeys

The FIDO Alliance has published a working draft of a new set of specifications for secure credential exchange that, when standardized and implemented by credential providers, will enable users to securely move passkeys and all other credentials across providers. The specifications are the result of commitment and collaboration amongst members of the FIDO Alliance’s Credential Provider Special Interest Group  including representatives from: 1Password, Apple, Bitwarden, Dashlane, Enpass, Google, Microsoft, NordPass, Okta, Samsung and SK Telecom.

Secure credential exchange is a focus for the FIDO Alliance because it can help further accelerate passkey adoption and enhance user experience. Today, more than 12 billion online accounts can be accessed with passkeys and the benefits are clear: sign-ins with passkeys reduce phishing and eliminate credential reuse while making sign-ins up to 75% faster, and 20% more successful than passwords or passwords plus a second factor like SMS OTP. 

With this rising momentum, the FIDO Alliance is committed to enabling an open ecosystem, promoting user choice and reducing any technical barriers around passkeys. It is critical that users can choose the credential management platform they prefer, and switch credential providers securely and without burden. Until now, there has been no standard for the secure movement of credentials, and often the movement of passwords or other credentials has been done in the clear.  

FIDO Alliance's draft specifications – Credential Exchange Protocol (CXP) and Credential Exchange Format (CXF) – define a standard format for transferring credentials held in a credential manager (passwords, passkeys, and more) to another provider in a manner that ensures transfers are not made in the clear and are secure by default.
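
The draft protocol's details are still in flux, but the underlying pattern (agree on a key with the receiving provider, then encrypt the export so it never travels in the clear) can be sketched generically. The Rust sketch below uses X25519 key agreement plus a ChaCha20-Poly1305 AEAD purely to illustrate that pattern; it is not the CXP wire format, and the crate choices (x25519-dalek, chacha20poly1305) are ours, not the specification's:

```rust
// Generic "never in the clear" transfer sketch: ECDH key agreement + AEAD.
// NOT the CXP protocol; assumed crates: x25519-dalek 2.x, chacha20poly1305,
// and rand_core for OsRng.
use chacha20poly1305::{aead::{Aead, KeyInit}, ChaCha20Poly1305, Nonce};
use rand_core::OsRng;
use x25519_dalek::{EphemeralSecret, PublicKey};

fn main() {
    // Importing provider publishes a public key for this transfer session.
    let importer_secret = EphemeralSecret::random_from_rng(OsRng);
    let importer_public = PublicKey::from(&importer_secret);

    // Exporting provider derives a shared key and encrypts the export blob.
    let exporter_secret = EphemeralSecret::random_from_rng(OsRng);
    let exporter_public = PublicKey::from(&exporter_secret);
    let shared = exporter_secret.diffie_hellman(&importer_public);

    let cipher = ChaCha20Poly1305::new(shared.as_bytes().into());
    let nonce = Nonce::from_slice(b"unique nonce"); // 12 bytes; never reuse per key
    let export = br#"{"items":[{"type":"passkey","rpId":"example.com"}]}"#;
    let ciphertext = cipher.encrypt(nonce, export.as_ref()).unwrap();

    // Importer derives the same key and recovers the payload.
    let shared2 = importer_secret.diffie_hellman(&exporter_public);
    let cipher2 = ChaCha20Poly1305::new(shared2.as_bytes().into());
    let plaintext = cipher2.decrypt(nonce, ciphertext.as_ref()).unwrap();
    assert_eq!(plaintext, export);
}
```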

Once standardized, these specifications will be open and available for credential providers to implement so their users can have a secure and easy experience when and if they choose to change providers. 

The working draft specifications are open to community review and feedback; they are not yet intended for implementation as the specifications may change. Those interested can read the working drafts here, and provide feedback on the Alliance’s GitHub repo. Drafts are expected to be updated and published for public review often until the specifications are approved for implementation.

The FIDO Alliance extends a special thank you to its members in the Credential Provider Special Interest Group and its leads for driving and contributing to this important specification.

Friday, 11. October 2024

FIDO Alliance

FIDO APAC Summit 2024: Unlocking a Secure Tomorrow by Accelerating the Future of Authentication in Asia-Pacific

Building on the success of last year’s summit in Vietnam, the FIDO APAC Summit 2024 in Kuala Lumpur, Malaysia, once again brought together thought leaders, policymakers, technology innovators, and industry experts from across the Asia-Pacific region. With over 350 attendees from 15 countries—including Australia, China, France, Hong Kong, India, Indonesia, Japan, Malaysia, the Philippines, Singapore, South Korea, Taiwan, Thailand, the USA, and Vietnam—this year’s event served as a powerful platform for sharing knowledge, inspiring collaboration, and exploring the evolution of secure and convenient authentication technologies.

Watch the Recap Video

Malaysian Government Endorses Phishing-Resistant FIDO Authentication

In his keynote speech, CyberSecurity Malaysia Chief Executive Officer Datuk Amirudin Abdul Wahab emphasized, “Passwordless methods, such as FIDO-based biometric authentication, offer robust alternatives that are harder to compromise than traditional credentials. They also reduce the burden on users to remember complex passwords and mitigate the risks associated with credential theft.” 

The National Agency of Cyber Security (NACSA) officially announced that it has become the first Malaysian government entity to adopt FIDO and passwordless technology. Local organizations classified as National Critical Information Infrastructure (NCII) are now using FIDO Security Keys for authentication and for safeguarding applications and sensitive data.

The summit also received extensive media coverage, with about 40 stories pre- and post-event featured in numerous esteemed publications. Some highlights include:

[The Edge] Over 80% of data breaches tied to weak passwords

[Business Today] Malaysian Businesses Should Ditch Passwords for Better Cybersecurity

[The Sun] Malaysia Advocates Passwordless Authentication to Enhance Cybersecurity

[BERNAMA TV] Malaysia Advocates Passwordless Authentication to Enhance Security

[Astro Awani] Malaysia Supports Passwordless Authentication to Enhance Cybersecurity

40 Speakers from Various Sectors Highlight Key Industry Trends

The Summit featured more than 40 speakers from sectors such as banking, government, telecom, enterprises, defense, eCommerce, solution vendors, online service providers, and manufacturers. Speakers represented leading organizations including Google, Lenovo, Samsung, ETDA Thailand, NTT Docomo, Mercari, Visa, SBI Bank, TikTok, iProov, Okta, TWCA, RSA, OneSpan, Thales, and VinCSS. One of the key themes of the 2024 Summit was the adoption of passkeys and the push towards achieving a passwordless experience across platforms. Here are some notable lessons shared:

Google: Demonstrated passkeys as the key to providing personalized experiences that users love. Cases from X, Amazon, Roblox, Kayak, WhatsApp, Zoho, and 1Password were shared. Roblox reported, “Passkeys are a significant security and usability upgrade for all of our users. In the six months since our launch, we have seen millions of users adopting passkeys to enjoy a simpler, faster, and more secure login experience.” Kayak noted a “50% reduction in average sign-in time with passkeys. With passkeys available on most devices, we’ve phased out traditional password logins and eliminated passwords from our servers.” 1Password highlighted that “in 2023, more than 1 million passkeys were created and saved by our users, and trial users who interact with passkey features are roughly 20% more likely to convert to paying customers.”

Samsung: Presented on passkeys on Galaxy mobile devices. Samsung launched the Passkey Provider Service at the end of 2023, providing a convenient user experience with the passkey as the default provider on Galaxy mobiles. Users can easily log in with fingerprint authentication and manage passkeys at a glance. Samsung ensures safe passkey synchronization across multiple devices logged into a Samsung account, including utilization with Samsung Knox Matrix. Statistics from the seven-month record of Samsung Passkey Provider include 7,672,861 cumulative registrations, 1,000,000 average new monthly registrations, and 850,000 average monthly authentications. Plans are in place to expand passkey usage for home appliance connectivity, such as TVs.

NTT Docomo: Highlighted the advantages of passkeys as an ideal authentication method—simple, frictionless user experience with biometric authentication, taking just 4-7 seconds compared to up to 30 seconds for SMS OTPs. They emphasized that passkeys are the only practical phishing-resistant authentication method.

Visa: Introduced Visa Payment Passkey for cardholder authentication in modern e-commerce. Traditional consumer authentication methods reduce fraud but often add friction, whereas biometric authentication with passkeys reduces both fraud and friction, leading to a 50% lower fraud rate.

TikTok: Reported success with passkeys, noting that over 100 million users registered within a year of implementation, with a 97% login success rate and a 17x faster login experience. There was also a 2% reduction in SMS OTP logins, as users who adopted passkeys chose them over other methods, improving app performance and reducing costs.

Workshops, Panel Discussions, and Networking Opportunities

This year’s Summit offered morning workshops on Passkeys and FDO (FIDO Device Onboard), allowing participants to delve deeper into implementing FIDO solutions. Attendees had the chance to work with FIDO experts to learn about integrating FIDO authentication into their services, understand technical specifications, and explore best practices. Experts also discussed the impact of emerging technologies like AI and post-quantum computing (PQC) on the authentication ecosystem while highlighting vulnerabilities related to human elements that can be addressed through implementing passkeys and FIDO’s efforts on future-proofing security.

Networking sessions, including a gala dinner, provided attendees a venue to relax and connect with peers from different parts of the world and sectors, fostering collaboration on developing solutions tailored to regional needs. Many participants enjoyed and respected the local culture while finding value in exchanging ideas and experiences about overcoming specific challenges in their respective sectors.

Celebrating Progress and Looking Forward

The FIDO APAC Summit 2024 showcased the significant progress towards convenient and secure FIDO-based passwordless authentication in the region. Through the collective efforts of governments, private sector leaders, and technology providers, the adoption of FIDO standards across the Asia-Pacific is accelerating, delivering stronger security and a seamless user experience.

The Asia-Pacific region is at the forefront of building a phishing-resistant, passwordless future, serving as an inspiration for other regions. The spirit of innovation and collaboration at the Summit reflects the dedication of all stakeholders to creating a secure and user-friendly digital landscape.

We extend our gratitude to all speakers, sponsors, participants, and members for making this year’s Summit a success. Together, we are shaping a more secure, passwordless future.

Proudly Sponsored by:


Velocity Network

Verifiable Credentials: Trust and Truth in an AI-enabled Talent Acquisition Market  

Issuer permissions are the mechanism that Velocity Network introduces to enable relying parties (and wallets) to determine whether an issuer is an authoritative source for a particular credential. After an organization requests the ability to issue on the Network, Velocity Network reviews the request to ensure that the issuing service parameters are within the remit of the organization's business activities.

Elastos Foundation

Unlocking Native Bitcoin DeFi for Developers: The BeL2 SDK Portal is Here

We are excited to announce the launch of the BeL2 SDK Portal, a powerful new toolkit that gives developers the ability to connect Bitcoin with the world of DeFi, all while maintaining Bitcoin’s renowned security and decentralisation. This marks an important step forward, opening up new possibilities in the decentralised finance ecosystem, allowing Bitcoin to play a key role in a space that was previously out of reach.


The BeL2 SDK: Bridging Bitcoin and DeFi

The BeL2 SDK serves as a bridge between Bitcoin and Ethereum-compatible chains, offering developers the ability to build secure, trustless applications that bring Bitcoin into DeFi platforms. This offers developers a new way to extend Bitcoin’s capabilities while keeping its core values of security and decentralisation intact. With the BeL2 SDK, developers can now unlock cross-chain operations that connect Bitcoin to the world of yield generation, lending, borrowing, and other DeFi opportunities — all while ensuring that Bitcoin stays on its native chain, maintaining its decentralisation and trustless structure.


Why This Matters for Developers

For the first time, developers have a reliable, developer-focused toolkit that simplifies complex cross-chain interactions and allows Bitcoin to interact with DeFi protocols on Ethereum-compatible chains. The BeL2 SDK removes the need for wrapped tokens, allowing Bitcoin to retain its core properties while gaining access to DeFi.

By using the BeL2 SDK, developers can:

Build secure and trustless applications that link Bitcoin with DeFi.
Enable cross-chain participation without compromising Bitcoin's security.
Apply Zero-Knowledge Proofs (ZKPs) to ensure the privacy and security of cross-chain transactions.
Open up new avenues for Bitcoin in DeFi lending, borrowing, and yield generation.

Unlocking Bitcoin’s Potential in DeFi

For years, Bitcoin has been limited to simple transactions or as a store of value, with restricted access to the growing world of DeFi. The BeL2 SDK changes that, unlocking Bitcoin’s full potential to participate in decentralised financial ecosystems, a new Bretton Woods monetary system. With this toolkit, developers can now build applications that bring Bitcoin into DeFi without bridging assets and giving up the values that make Bitcoin so important — its security, decentralisation, and trustless foundation.

The BeL2 SDK makes Bitcoin usable for:

DeFi lending and stablecoin issuance, allowing Bitcoin holders to access liquidity without selling their assets.
Yield generation, enabling Bitcoin to generate returns in DeFi markets.
Cross-chain transactions, preserving Bitcoin's security while connecting with other blockchains.

Emphasising Security and Trustlessness

At the heart of BeL2 is a commitment to maintaining Bitcoin’s security and trustless operations. Developers can be confident that Bitcoin remains on its own network, with zero-knowledge proofs ensuring the privacy and accuracy of all cross-chain activities. This directly addresses the concerns of Bitcoin enthusiasts who prioritise decentralisation and are reluctant to move their Bitcoin off-chain. The BeL2 SDK allows developers to confidently build DeFi applications while knowing that Bitcoin’s integrity remains fully intact.


A Toolkit for a Global Developer Community

The BeL2 SDK is designed specifically for developers. It provides all the necessary tools for creating Bitcoin-focused DeFi applications without introducing unnecessary risks or complications. Whether you are a DeFi developer looking to integrate Bitcoin into your protocol or a Bitcoin enthusiast aiming to expand the use of your assets, the BeL2 SDK makes it easy.

With the BeL2 SDK, developers now have access to:

Developer-friendly tools that make building cross-chain solutions straightforward.
The ability to bring Bitcoin into DeFi without relying on wrapped tokens or centralised custodians.
A toolkit that preserves Bitcoin's security and decentralisation while opening it up to DeFi.

The Future of Bitcoin in DeFi

The launch of the BeL2 SDK Portal marks a new chapter in the development of Bitcoin as a key player in decentralised finance. This toolkit provides the foundation for Bitcoin to interact with DeFi ecosystems while staying true to its role as a trustless, secure asset. By standing at the crossroads of Bitcoin’s security and DeFi’s financial opportunities, the BeL2 SDK is the gateway to a future where Bitcoin can fully engage in the decentralised economy.

We invite developers to explore the possibilities of the BeL2 SDK Portal and join us in shaping the future of Bitcoin in DeFi. You can explore the BeL2 SDK portal here! Did you enjoy this article? Follow Infinity for the latest updates here!


Thursday, 10. October 2024

The Rubric

Web Superpowers Activated! (did:webs, Part 1)

did:webs is an interoperable, more secure version of did:web, applying the principles of KERI to layer on additional security while retaining usability. Today on the show we talk with Markus Sabadello and Lance Byrd, contributors to, and implementers of, the did:webs specification.

References:
Danube Tech: https://danubetech.com/
Decentralized Identity Foundation (DIF): https://identity.foundation/
did:webs for Muggles: https://docs.google.com/presentation/d/1BC9y4YvLPwOJwnwpwl8puJYwJHUONLovLITxOCOK8FY/edit?usp=sharing
...

Web Superpowers Activated! (did:webs, Part 2)

did:webs is an interoperable, more secure version of did:web, applying the principles of KERI to layer on additional security while retaining usability. Today on the show we talk with Markus Sabadello and Lance Byrd, contributors to, and implementers of, the did:webs specification.

References:
Danube Tech: https://danubetech.com/
Decentralized Identity Foundation (DIF): https://identity.foundation/
did:webs for Muggles: https://docs.google.com/presentation/d/1BC9y4YvLPwOJwnwpwl8puJYwJHUONLovLITxOCOK8FY/edit?usp=sharing
...

Wednesday, 09. October 2024

Next Level Supply Chain Podcast with GS1

How EPCIS is Revolutionizing Supply Chains with Matt Andrews

As supply chains become increasingly complex and stringent regulations like DSCSA and FSMA 204 become more prevalent, understanding how to leverage EPCIS (Electronic Product Code Information Services) for granular visibility and efficient data management is more crucial than ever.

In this episode, hosts Reid Jackson and Liz Sertl are joined by Matt Andrews, Global Standards Director at GS1 US. Matt unpacks the fundamentals and applications of EPCIS, from its role in modeling supply chain processes to its transformative impact across industries like healthcare, food, retail, and logistics.

EPCIS can help your organization achieve unparalleled supply chain visibility, improve compliance, and drive competitive advantage.
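
To make the idea concrete, here's roughly what a single EPCIS 2.0-style "ObjectEvent" looks like when serialized as JSON, sketched with Rust's serde_json. The identifiers and values are invented for illustration; consult the EPCIS 2.0 and CBV standards for the normative field definitions:

```rust
// A minimal EPCIS 2.0-style ObjectEvent, built as JSON for illustration.
// Field names follow the EPCIS 2.0 JSON binding; all values are made up.
// Assumes the serde_json crate.
use serde_json::json;

fn main() {
    let event = json!({
        "type": "ObjectEvent",
        "eventTime": "2024-10-09T14:30:00.000Z",
        "eventTimeZoneOffset": "-05:00",
        "epcList": ["urn:epc:id:sgtin:0614141.107346.2018"], // serialized GTIN
        "action": "OBSERVE",
        "bizStep": "shipping",               // CBV business step
        "disposition": "in_transit",         // CBV disposition
        "readPoint": { "id": "urn:epc:id:sgln:0614141.00777.0" },
        "bizLocation": { "id": "urn:epc:id:sgln:0614141.00888.0" }
    });
    println!("{}", serde_json::to_string_pretty(&event).unwrap());
}
```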


In this episode, you’ll learn:

The intricacies of EPCIS (Electronic Product Code Information Services) and its universal application across industries for enhanced supply chain visibility, compliance, and efficiency.

How EPCIS can revolutionize inventory management with real-time data accuracy, from monitoring cycle counts to tracking product movement from back of house to point of sale.

How industries such as healthcare and food service leverage EPCIS to comply with regulations like DSCSA and FSMA 204, ensuring traceability down to the unique item level.


Jump into the Conversation:

(00:00) Introducing Next Level Supply Chain

(06:25) Benefits that organizations are seeing by leveraging EPCIS

(08:00) Full granular visibility, item-level tracking, inventory management

(13:54) How EPCIS can log events from manufacturing to sales

(17:03) Enhanced supply chain visibility through real-time EPCIS data

(18:28) Assessing claims compliance through advanced visibility


Connect with GS1 US:

Our website - www.gs1us.org

GS1 US on LinkedIn


Connect with the guests:

Matt Andrews on LinkedIn

Tuesday, 08. October 2024

Hyperledger Foundation

Introducing Linea ENS Support in Hyperledger Web3j

Ethereum addresses, with their 42 characters, have become a hallmark of the web3 ecosystem. However, memorizing these long strings is nearly impossible for most users. The Ethereum Name Service (ENS) addresses this issue by offering a decentralized naming protocol on the Ethereum blockchain. 
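
Under the hood, ENS lookups hash a human-readable name into a 32-byte node via the "namehash" algorithm, then query a resolver contract for that node. Libraries like Web3j wrap this for you; the sketch below reimplements just the namehash step in Rust (assuming the sha3 and hex crates) to show the mechanics, and assumes the name has already been UTS-46 normalized:

```rust
// ENS namehash: fold each label's keccak256 hash into a running node hash,
// starting from 32 zero bytes and walking the labels right to left.
use sha3::{Digest, Keccak256};

fn keccak(data: &[u8]) -> [u8; 32] {
    Keccak256::digest(data).into()
}

/// Compute the ENS namehash of a (normalized) name like "vitalik.eth".
fn namehash(name: &str) -> [u8; 32] {
    let mut node = [0u8; 32]; // the root node is 32 zero bytes
    if name.is_empty() {
        return node;
    }
    for label in name.split('.').rev() {
        let mut input = [0u8; 64];
        input[..32].copy_from_slice(&node);
        input[32..].copy_from_slice(&keccak(label.as_bytes()));
        node = keccak(&input);
    }
    node
}

fn main() {
    // Resolvers are then queried with this node to obtain the address.
    println!("0x{}", hex::encode(namehash("vitalik.eth")));
}
```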


ResofWorld

Philippine chipmakers are embracing automation — and leaving low-skilled workers behind

Hundreds of workers at the Philippine unit of Nexperia threatened to go on strike over layoff fears, a sign of growing discontent over automation in the industry.
The Philippines has long been a hub for semiconductor manufacturing services — largely assembly, testing, and packaging. Now, the government is keen to go up the value chain with increased...

DIF Blog

Build the Future of Education & Workforce with Verifiable Credentials at the DIF 2024 Hackathon

The Future of Education & Workforce track, sponsored by Jobs for the Future and the Digital Credentials Consortium, invites you to explore a future where education is accessible to all learners and acts as a true gateway to economic advancement.

Building on the work of the JFF Plugfest competitions, participants can use tooling and resources that give them a quick start, and the confidence that their submission will provide real value to learners and workers.

Jobs for the Future (JFF)

Jobs for the Future is a nonprofit organization committed to transforming the US workforce and education systems to achieve equitable economic advancement for all. JFF drives change by designing innovative solutions, scaling best practices, influencing public policies, and investing in the development of a skilled workforce. JFF Labs is the innovation arm of the organization, focused on building the infrastructure for a skills-based talent marketplace, supporting an ecosystem of open standards-based interoperability that enables credential portability, and empowering individuals to use their data to access opportunity.

Digital Credentials Consortium (DCC)

The Digital Credentials Consortium is a network of leading universities and institutions advancing the use and understanding of portable, verifiable digital academic credentials in higher education through open source technology development and leadership, research, and advocacy. Founded by MIT and partners worldwide, DCC encourages a learner-centered ecosystem where portable, verifiable digital credentials are universally recognized and easily shared. By fostering collaboration among academia, industry, and standards organizations, DCC is working towards a future where these credentials are more accessible, secure, and verifiable. 

The Challenges

Participants can choose from multiple challenges designed to push the boundaries of what's possible with VCs:

1. Verifiable Learner / Worker IDs and Records

Demonstrate the transformative potential of user-controlled data in learning and professional experiences. Use VCs such as Student IDs, Employee IDs, and Employment History to showcase compelling use cases such as:

Applying for new job opportunities using proof of employment history
Accessing platforms based on verified credentials
Demonstrating essential skills through verifiable records
Implementing selective disclosure principles to share only necessary information

2. Powerful New VC Tools

a. Multiple Language Support

Promote cross-border mobility by enabling educational credentials to be meaningfully used internationally. Build a tool that constructs VCs in any language, with a special emphasis on non-Latin scripts, using the renderMethod attribute.

b. Browser Integration

Enhance convenience and usability by developing a browser plugin for displaying and verifying VCs. This challenge also requires the use of the VC renderMethod attribute.

3. Feature Enhancement

a. Learner Credential Wallet

Add support for the Learner Credential Wallet to use the VC renderMethod attribute, enabling rich displays of credentials within the application.

b. VerifierPlus

Enhance VerifierPlus to support rich displays using the renderMethod attribute, including capabilities for PDF rendering.

4. Bonus Design Challenge: Establishing Credibility in Digital Credentials

Explore innovative ways organizations can integrate VCs into their processes to build trust among users. Design the equivalent of a "browser padlock" for Verifiable Credentials, helping users understand that verification checks are valid and trustworthy.

Prizes

This track offers a substantial prize pool totaling $15,000, distributed among top submissions that meet the challenge criteria and demonstrate exceptional innovation and impact.

Submission Requirements

All submissions must adhere to the following criteria:

Open Source Licensing: Projects must be open source under the MIT license to promote transparency and collaboration.
Technical Interoperability Standards: Submissions must comply with the technical standards used by the JFF Plugfest, including:
Credential Format: Open Badges 3.0 (using Verifiable Credential format)
Issuing Credentials: Utilize VC API with CHAPI or OpenID for Verifiable Credential Issuance
Exchanging Credentials: Use CHAPI, OpenID for Verifiable Presentations, or WACI-DIDComm Interop Profile

Participants are encouraged to build upon tools provided by the Plugfest, such as VC Playground, CHAPI, and the Digital Credentials Consortium Wallet.
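
For orientation, here is a rough skeleton of an Open Badges 3.0-style credential, including a renderMethod entry of the kind the display challenges call for, built with Rust's serde_json. All identifiers are invented, and the renderMethod type name follows current draft specifications that may still change; check the Open Badges 3.0 and VC render-method documentation before relying on any field:

```rust
// Illustrative Open Badges 3.0-style credential with a renderMethod entry.
// All URLs and IDs are hypothetical; field names follow the OB 3.0 spec and
// the draft VC render-method spec, which should be verified before use.
use serde_json::json;

fn main() {
    let credential = json!({
        "@context": [
            "https://www.w3.org/ns/credentials/v2",
            "https://purl.imsglobal.org/spec/ob/v3p0/context-3.0.3.json"
        ],
        "type": ["VerifiableCredential", "OpenBadgeCredential"],
        "issuer": {
            "id": "https://example.edu/issuers/1",
            "type": ["Profile"],
            "name": "Example University"
        },
        "validFrom": "2024-10-01T00:00:00Z",
        "credentialSubject": {
            "type": ["AchievementSubject"],
            "id": "did:example:learner123",
            "achievement": {
                "id": "https://example.edu/achievements/intro-crypto",
                "type": ["Achievement"],
                "name": "Introduction to Cryptography",
                "criteria": { "narrative": "Completed all required coursework." }
            }
        },
        // Hypothetical render hint for rich, localized display of the badge.
        "renderMethod": [{
            "id": "https://example.edu/templates/intro-crypto-ar.svg",
            "type": "SvgRenderingTemplate2023",
            "name": "Arabic SVG rendering"
        }]
    });
    println!("{}", serde_json::to_string_pretty(&credential).unwrap());
}
```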

Why Participate?

By joining this track, you have the opportunity to:

Contribute to solutions that can have a real-world impact on education and workforce development
Collaborate with leading organizations in the decentralized identity space
Showcase your innovative ideas and technical skills to a global audience
Be part of a movement that is shaping the future of learning and work

Sharon Leu, Executive in Residence at JFF Labs, highlights the importance of the challenges: “The challenges that we proposed are critical to the infrastructure that will help learners and jobseekers find meaningful opportunities at all stages of their learning and employment journey. We are excited for this community to work together to create the tools that will give people control of their data in wallets, data models that allow them to express their different identities as workers and learners, multi-language support for verifiable credentials, and a seamless verification experience for relying parties with minimal technology capacity.” 

“The Digital Credentials Consortium advocates for open source, open standards, and open community to foster transparency, collaboration, and innovation in the development of digital credentialing systems,” adds Kerri Lemoie, Director at MIT Digital Credentials Consortium.  “Hackathons foster creativity and collaboration, bringing together diverse minds to solve real-world problems in a short amount of time. Through experimentation, skill development, and community building we hope the participants are inspired to make tools and technologies that will enhance trust of portable, verifiable digital credentials that democratize access to educational achievements and skills verification.”

Kim Hamilton Duffy, Executive Director of DIF, emphasizes the transformative potential of this track: "Education and workforce development have the power to change lives. This challenge embodies the core reason I became involved in decentralized identity – to ensure people have control over credentials that are portable, verifiable, and meaningful across borders and contexts. I'm thrilled to see the innovative solutions our participants will create to address these critical issues."

Join Us in Revolutionizing Education and Workforce Development

Whether you're a seasoned developer in the decentralized identity space or new to the field, your participation can make a significant difference. Together, we can build the next generation of tools that will empower learners and workers worldwide.

Ready to Take on the Challenge?

Register for the DIF Hackathon 2024 and select the Future of Education & Workforce track. Let's collaborate to create a more accessible and verifiable future for education and career advancement.

Register now: https://difhackathon2024.devpost.com/
Join our informational session: https://www.eventbrite.com/e/education-and-workforce-track-overview-tickets-1029330524307
Read details about the challenges, prizes, and submission requirements: https://identity.foundation/hackathon-2024/docs/sponsors/edu/
Join the discussion on the DIF Hackathon Discord: https://discord.gg/WXPzWvBCjD

Join us in shaping the future of education and work through innovation and collaboration.


🚀 Don’t Miss These Exciting Challenges at the DIF Hackathon 2024! 🌍

The DIF Hackathon 2024 is in full swing, and we’ve got a fantastic lineup of challenges waiting for you! From reusable identity to revolutionizing digital identity in education, this is your chance to innovate, compete for amazing prizes, and help shape the future of decentralized identity. Below is the full lineup of sessions for the coming week!

🌟 ONT Login Challenge – Unlock Seamless Authentication!📅 Date: Tuesday, October 8 | 8 AM PST / 5 PM CEST Ontology is bringing you the ONT Login challenge! Learn how to integrate a decentralized universal authentication component for secure, reusable identity in Web2 and Web3 applications. Demonstrate how ONT Login can transform your app’s login experience while keeping user privacy intact.

💰 Prizes: 1st Place: $1000 USD | 2nd Place: $500 USD | 3rd Place: $300 USD🔗 Register Now

💥 tbDEX Challenge – Power Up Payments with Known Customer Credentials!📅 Date: Tuesday, October 8 | 9 AM PST / 6 PM CEST Get ready to dive into the payments world with the tbDEX challenge! As a business or developer, you’ll use the Web5 SDK to streamline KYC processes with Known Customer Credentials (KCC). Join this session to unlock a future where seamless decentralized identity enhances payments.

💰 Prizes: 1st Place: $2500 USD | 2nd Place: $1500 USD | 3rd Place: $1000 USD🔗 Register Now

🚀 How to Resolve DIDs and Verify VCs for Free with VIDOS📅 Date: Tuesday, October 8 | 10 AM PST / 7 PM CEST This session will unlock the power of DIDs and Verifiable Credentials in recruitment and reusable identity. Explore two dynamic challenges to develop solutions that make identity verification more secure and efficient for real-world applications.

💡 Challenge 1: Employer Portal Using DIDs and VCs (Education Track)Build a proof-of-concept that allows recruiters to verify and onboard candidates securely using verifiable credentials.

💡 Challenge 2: VC Interoperability (Reusable ID Track)Create a solution that demonstrates VC interoperability across scenarios, like using a passport for travel or age-gated entry.

💰 Prizes: Total prize pool of $4,500 USD🔗 Register Now

🚀 Join the Future of Education and Economic Advancement!📅 Date: Wednesday, October 9 | 9 AM PST / 6 PM CEST

This track invites innovators to develop solutions that make education and economic opportunities more accessible through decentralized identity. Dive into challenges that use Verifiable Credentials (VCs) for educational records, employment history, and more.

💡 Challenge C1: Verifiable Learner/Worker IDsBuild VCs representing Student IDs, Employee IDs, and Employment History. Show how they can be used for job applications, skill verification, and more.

💡 Challenge C2: Build Tools for Global UseDevelop tools that support VCs across borders, languages, and digital platforms, creating a more universal decentralized identity solution. 

💰 Prizes: Total prize pool of $15,000 USD🔗 Register Now

🔑 Crossmint's Reusable Identity Challenge!

📅 Date: Wednesday, October 9 | 10 AM PST / 7 PM CEST

Unlock the potential of reusable digital identities to simplify KYC, KYB, and age verification processes. Use Crossmint’s Verifiable Credentials API to build secure, scalable identity solutions for various platforms. Let's tackle identity verification and compliance with a focus on privacy and usability!

💰 Prizes:

1st Place: $800 USD + $2,000 in Crossmint credits

2nd Place: $500 USD + $1,000 in Crossmint credits

3rd Place: $200 USD + $500 in Crossmint credits

🔗 Register Now

🏨 Revolutionize Hotel Check-Ins with Verifiable Credentials (VC)!📅 Date: Thursday, October 10 | 9 AM PST / 6 PM CEST Imagine a world where hotel check-ins are seamless and secure. This challenge, led by Mateo Manfredi, Senior Full Stack Developer at Extrimian, invites you to build a privacy-focused check-in system using government-issued Verifiable Credentials. Let’s reimagine how hotels handle guest data and create a safe, smooth experience.

💰 Prizes: 1st Place: $1000 USD + $1800 in Extrimian Platform credits

🔗 Register Now

🤖 Harness the Power of Decentralized Identity for Verifiable AI📅 Date: Thursday, October 10 | 10 AM PST / 7 PM CEST In the age of AI, trust is more important than ever. This challenge, led by Ankur Banerjee, Co-founder and CTO of cheqd, invites you to create solutions that ensure AI-generated content is trustworthy and verifiable using decentralized identity and Verifiable Credentials.

💰 Prizes: Total prize pool of $7,500 USD in CHEQ tokens🔗 Register Now

Don’t miss your chance to innovate, compete, and win big at the DIF Hackathon 2024! Whether you're passionate about education, AI, payments, or hospitality, there’s a challenge for you. Let’s build the future of decentralized identity together.

Best regards,

The DIF Hackathon Team


Blockchain Commons

Musings of a Trust Architect: Edge Identifiers & Cliques

Since the mid-1990s, I’ve been advocating for the creation of secure digital infrastructures that protect human rights, civil liberties, and human dignity online. My mission has always been to decentralize power and give individuals control over their digital lives, from my early work co-authoring the TLS standard to my recent efforts supporting DIDs and Verifiable Credentials.

We now stand at another crossroads in digital identity. The current paradigm, where an individual’s private key is the cornerstone of their identity, has served us well but it also has significant limitations—especially as we move toward a more interconnected, collaborative digital world. Fortunately, advances in cryptography allow us to rethink single-key self-sovereign identity systems, suggesting the possibility of new options such as edge identifiers and cryptographic cliques.

The Single Signature Paradigm

Identity management has long centered on the use of single-signature cryptographic keys. Operating on a straightforward principle, this “Single Signature Paradigm” requires the possession of a unique private key for cryptographic signatures, allowing actions such as authentication, data encryption, and transaction validation.


The security of this model hinges on the confidentiality of the private key: a compromise of the key means a compromise of security. To reduce this threat, standards often require private keys be stored in specialized hardware, providing a fortified environment. This model is the cornerstone of security strategies endorsed and required by entities such as the National Institute of Standards and Technology (NIST), European Union government standards, and various international standards groups such as the Internet Engineering Task Force (IETF) and the World Wide Web Consortium (W3C).

There has been very limited success in strengthening this fundamental methodology through protocols such as key rotation. Meanwhile, the Single Signature Paradigm has many flaws, the most serious of which are Single Point of Compromise (where a key can be stolen) or Single Point of Failure (where a key can be lost). If anything, these problems are worsening, as demonstrated by recent side-channel attacks that can extract keys from older hardware. Other issues include scalability limitations, hardware dependency, operational inflexibility, and numerous legal, compliance, and regulatory issues.

There are fundamental limits to what can be achieved within the confines of a Single Signature Paradigm, making the need for evolution clear.

The Keys to Self-Sovereign Identity

The Single Signature Paradigm is problematic for many use cases surrounding digital assets, but particularly so for the management of digital identities, because identities are both central to our digital experience and largely irreplaceable. You can’t just create a new identity to replace a compromised one without losing credentials and connections alike.

When I first conceived of my ideas for the personal control of digital identity, known today as self-sovereign identity, I didn’t want to be limited by the Single Signature Paradigm. Instead, I modeled self-sovereign identity to be an identity that existed in a social context, not an isolated identity defined by singular keys. I wrote some on this in The Origins of Self-Sovereign Identity.

One of the key principles of living systems theory is the concept of the membrane. This is not just a physical barrier but a selective boundary that controls the exchange of energy, matter, and information between the system and its environment. The membrane allows certain things to pass through while restricting others, thereby maintaining the system’s integrity and autonomy. It’s a delicate balancing act: the system must allow enough interaction with the environment to sustain itself while ensuring that it isn’t overwhelmed by external forces.

Though I meant for it to be something that would protect the individual, self-sovereignty doesn’t mean that you are in complete control. It simply defines the borders within which you can make decisions and outside of which you negotiate with others as peers, not as a petitioner.

Implementing practical solutions that encapsulate this interconnectedness has historically been challenging due to the dominance of the Single Signature Paradigm. This has led to self-sovereign identity systems that actually adhere to the Single Signature Paradigm, which in turn causes them to overemphasize individualism, which was not my intent.

It’s not the only way.

Relational Edge Identity

Living systems theory suggests that identity isn’t just about oneself, but about one’s connections to the rest of society.

Consider the process of a child’s identity formation. They may be named “Joshua” upon birth, suggesting a unique, nodal form of identity. But, there are many Joshuas in the world. To truly define the child’s identity requires linked local names (or pet names) that define relationships. The father and mother say “my child”, attesting to the relationship between each of them and the child. A sibling says, “My brother’s child” and a grandparent says “my grandchild”.


Though unidirectional descriptors are useful to help identify someone, each link is actually bidirectional, creating an edge between two individual nodes of identity:


At this point we must ask: does the node really define identity or is it the edges? The most complete answer is probably that an identity is defined by an aggregation of edges sufficient to identify within the current graph context: “Joshua, who is filially linked with Mary, who is filially linked with Anna.”

Relational Edge Keys

We can model the interconnectedness of edge-based relationships in an identity system by using Schnorr-based aggregatable multisig systems that support Multi-Party Computing (MPC), such as MuSig2 or FROST (see the Appendix in the next article for more on the technology and the differences between the two systems). Schnorr-based systems are an excellent match for edge identity because their peer-based key construction technique matches the peer-based model of an identity graph: two users come together to create a joint private key.

To create a relational edge key, the two identities (nodes) connected by an edge each generate a private commitment. These commitments are combined in a cryptographic ceremony to form the edge’s private key. The associated public key then effectively becomes an identifier for this two-person group, indiscernible from a single user’s public key thanks to Schnorr.
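
As a rough sketch of that ceremony (same toy group as above; the naive key aggregation below stands in for MuSig2's coefficient-weighted aggregation, which real deployments need to prevent rogue-key attacks, and the names are purely illustrative):

```python
# Toy relational edge key: two nodes aggregate into one edge identifier.
import secrets

p, q, g = 179, 89, 4                      # same toy group as above

def keygen():
    x = secrets.randbelow(q - 1) + 1      # private commitment (scalar)
    return x, pow(g, x, p)

alice_priv, alice_pub = keygen()          # one node of the edge
bob_priv,   bob_pub   = keygen()          # the other node

# The edge's public identifier is the aggregate of the two public keys;
# to a verifier it is indistinguishable from a single user's key.
edge_pub = (alice_pub * bob_pub) % p

# The joint private key (x_a + x_b mod q) never needs to exist in one place:
# each party holds only their own share. The check below just confirms the
# algebra for the demo.
assert pow(g, (alice_priv + bob_priv) % q, p) == edge_pub
```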


Leveraging the Multi-Party Computation (MPC) of MuSig2 or FROST allows for the creation of a private key that doesn’t exist on a single device. It exists only in a distributed cryptographic construct, colloquially called a “fog”. Through unanimous consent, users can use this “fog” to sign collectively, allowing (even requiring) joint agreement for joint actions.
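
A simplified sketch of such a two-party signing round, under the same toy assumptions (and omitting the nonce-commitment rounds that MuSig2 and FROST require for security), might look like the following; the point is that each party contributes only a partial signature, and the aggregate verifies like any single-signer signature:

```python
# Toy two-party signing for an edge key: the full key exists only "in the fog".
import hashlib, secrets

p, q, g = 179, 89, 4

def H(*vals):
    data = b"|".join(str(v).encode() for v in vals)
    return int.from_bytes(hashlib.sha256(data).digest(), "big") % q

x1, x2 = 17, 42                           # each party's private share
P = pow(g, (x1 + x2) % q, p)              # edge public key (computed centrally
                                          # here only for brevity of the demo)
msg = b"joint action approved"
k1 = secrets.randbelow(q - 1) + 1         # each party picks a nonce...
k2 = secrets.randbelow(q - 1) + 1
R = (pow(g, k1, p) * pow(g, k2, p)) % p   # ...and commitments are aggregated
e = H(R, P, msg)                          # shared challenge

s1 = (k1 + e * x1) % q                    # partial signature from party 1
s2 = (k2 + e * x2) % q                    # partial signature from party 2
s = (s1 + s2) % q                         # aggregate signature (R, s)

# The result verifies exactly like an ordinary single-signer signature.
assert pow(g, s, p) == (R * pow(P, e, p)) % p
```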

This relational-edge identity model begins to resolve the issues with current self-sovereign identity models by recognizing identity as being about more than just a single self-sovereign person. It also offers substantial benefits including better security, trust, resilience, and verification due to full keys existing only in this distributed cryptographic “fog”. Finally, it allows relationships to dynamically grow and change over time through the addition or removal of edges in a graph.

Clique Identity

Edge identity is just the first step in creating a new model for identity that recognizes that personal digital identity is founded in relationships. The next step is to expand pairwise relationships by forming a clique, specifically a triadic clique.

A clique in graph theory is “a fully connected subgraph where every node is adjacent to every other node.” Thus, in a complete graph, no node remains isolated; each is an integral part of an interconnected network. This concept is core to understanding the transition from simple pairwise relationships to more complex, interconnected group dynamics.

In our example, there is an obvious triadic clique: the nuclear family of Mary, Bob, and Joshua.


Remember that the term “nuclear family” comes from the word “nucleus”. That’s a great metaphor for a tight, strongly connected group of this type. A triadic clique fosters strong social cohesion and supports a robust, tightly-knit network.

Cryptographically, we form a triadic clique by generating a relational edge key for each pair of participants in the group. This represents the pair’s joint decision-making capability. Once these pairwise connections are in place, the trio of edges participates in a cryptographic ceremony to create a shared private key for the whole group, which in turn creates a clique identifier: the public key. This identifier represents not just an individual or a pair but the collective identity of the entire triadic group (and, once more, their decision-making capability).
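
A toy sketch of that two-step ceremony follows (same illustrative group and naive aggregation as above; in a real deployment the edge keys would exist only as distributed MPC shares, never as plain scalars on one machine):

```python
# Toy triadic clique key, built from the three edge keys rather than the
# three people directly.
p, q, g = 179, 89, 4

mary, bob, joshua = 11, 23, 35            # per-person private shares (toy)

# Step 1: each pair forms a relational edge key.
edge_mb = (mary + bob)    % q             # Mary-Bob edge
edge_mj = (mary + joshua) % q             # Mary-Joshua edge
edge_bj = (bob + joshua)  % q             # Bob-Joshua edge

# Step 2: the three *edges* (not the people) aggregate into the clique key.
clique_priv = (edge_mb + edge_mj + edge_bj) % q
clique_pub  = pow(g, clique_priv, p)      # the clique identifier

# Each person enters the clique key only through the two edges they share.
assert clique_priv == (2 * (mary + bob + joshua)) % q
```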

Although my examples so far suggest that nodes in a clique are all people, that doesn’t have to be the case: I’ll talk about cliques of devices as one of three variations of this basic formula in my next article.

Why Cliques of Edges?

As noted, a clique is formed by the pairwise edges jointly creating a key, not by the original participants doing so. There are a number of advantages to this.

Most importantly, it builds on the concept of identity being formed by relationships. Call it the Relationship Signature Paradigm (or the Edge Signature Paradigm). We’re saying that a group is defined not by the individuals, but by the relationships between the individuals. This is a powerful new concept that has applicability at all levels of identity work.

Individually, we might use the Relationship Signature Paradigm to create an individual identity based on edge-based relationships. My relationship to my friends, my relationship to my company, my relationship to my coworkers, my verifiable credentials (which are themselves relationships between myself and other entities), and my relationship to my published works together define the “clique” that is me. Crucially, this identity is built upon the relationship with other participants, not the participants themselves.

At a higher-level, we can also use this paradigm to form a clique of cliques, where each member is not a participant or even an edge, but instead a clique itself! Because we already recognized cliques as being formed by relational groups when we defined a first-order clique as a collection of edges, we can similarly define a clique as a collection of cliques (or even a collection of edges and cliques), creating a fully recursive paradigm for identity.
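
Continuing the toy model, the recursion falls out almost for free: a clique's public key is just another key, so a higher-order clique can aggregate it like any other member. (For brevity this sketch aggregates the clique keys directly rather than first forming pairwise edges between the cliques, as the full paradigm would; the private exponents are arbitrary demo values.)

```python
# Toy clique of cliques: first-order clique keys aggregated into a
# second-order identifier.
p, q, g = 179, 89, 4

def aggregate(pubkeys):                   # naive key aggregation (toy)
    out = 1
    for pk in pubkeys:
        out = (out * pk) % p
    return out

family_pub  = pow(g, 54, p)               # identifier of one first-order clique
company_pub = pow(g, 61, p)               # ...another
club_pub    = pow(g, 72, p)               # ...and another

# The second-order clique treats whole cliques as its members.
clique_of_cliques = aggregate([family_pub, company_pub, club_pub])
assert clique_of_cliques == pow(g, (54 + 61 + 72) % q, p)
```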


There is one clique-based design where the Relationship Signature Paradigm can’t be used: fuzzy cliques, which is another variation of clique identity. But more on that in the next article.

Higher Order Graphs

There is no reason to limit cryptographic cliques to three edges. However, the larger the group is, the harder it is to close the graph: as the number of nodes (n) in a clique increases, the number of edges grows following the formula n(n-1)/2, which is the number of unique edges possible between n nodes.

A “4-Clique” (or K4), for example, is a complete graph comprising 4 nodes, where each node is interconnected with every other node, resulting in a total of (4*3)/2 = 6 edges.


This pattern continues with larger cliques:

K5 = (5*4)/2 = 10 edges; K6 = (6*5)/2 = 15 edges; K7 = (7*6)/2 = 21 edges; etc.
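
In code, this is the familiar binomial count; a one-line helper makes the scaling easy to check:

```python
# Number of edges in a complete graph K_n: n(n-1)/2.
def clique_edges(n: int) -> int:
    return n * (n - 1) // 2

# K3 through K7:
assert [clique_edges(n) for n in range(3, 8)] == [3, 6, 10, 15, 21]
```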

In practice, as the number of nodes in a clique increases, the complexity of forming and maintaining these fully connected networks also escalates: each additional connection requires its own key-creation ceremony with every existing member of the graph.

Complete graphs, or closed cliques, have valuable applications across various disciplines, from computer science to anthropology, but they aren’t the only solution for cryptographic cliques. I’ll talk more about the alternative of open cliques as another variation of the clique identity model in my follow-up article next week.

Conclusion

The Single Signature Paradigm has been at the heart of the digital world since the start. It’s always had its limitations, but those limitations are growing even more problematic with the rise of digital identity.

Relational edge keys and closed cliques offer a next step, modeling how identity is actually based on relationships and that many social decisions are made through the edges defined by those relationships.

Other advantages of using clique-based keys and identities include:

Decentralized Identity Management. Peer-based edge and clique identifiers are created collaboratively, bypassing third-party involvement, thus supporting self-sovereign control and improving anonymity.
Identity Validation. Peer-based identifiers help to authenticate social identities, creating trust.
Resilience Against Single Points of Failure. Distributing control among multiple parties in a clique guards against single points of failure.
Secure Group Decision Making. Relations or groups can securely and irrevocably make decisions together.
Enhanced Privacy in Group Interactions. Aggregatable Schnorr-based signatures keep the identities of the members of a relationship or a clique private.

Cliques can be quite useful for a number of specific fields:

Blockchains. The use of aggregatable signatures creates smaller transactions on blockchains.
Collaborative Projects. Collaborative projects and joint ventures can use clique keys to authenticate shared resource usage and other decisions.
Financial Fields. Dual-key control is often required in financial fields, and that’s an implicit element of relational edge keys.
Internet of Things (IoT) & Other Smart Networks. Relational edge keys can ensure secure and efficient communication among diverse devices that have paired together.
Medicine & Other Sensitive Data. When data is sensitive, cliques can ensure all parties have agreed to the data sharing terms, maintaining both security and collaboration integrity.

By leveraging cryptographic cliques for group identification and decision-making, we open a wide array of opportunities. These are just the beginning: open cliques, fuzzy cliques, and cliques of devices can offer even more opportunities, as I discuss in my next article in this series (which also talks a little bit about the cryptography behind this).

Monday, 07. October 2024

Oasis Open Projects

Invitation to Comment – Energy Interop (CTS) Version 1.0

Public review ends November 7th

OASIS and the Energy Interoperability TC are pleased to announce that Energy Interoperation Common Transactive Services (CTS) Version 1.0 is now available for public review and comment. 

Common Transactive Services (CTS) permits energy consumers and producers to interact through energy markets by simplifying actor interaction with any market. CTS is a streamlined and simplified profile of the OASIS Energy Interoperation (EI) specification, which describes an information and communication model to coordinate the exchange of energy between any two Parties that consume or supply energy, such as energy suppliers and customers, markets and service providers.

The documents and all related files are available here:

Energy Interoperation Common Transactive Services (CTS) Version 1.0
Committee Specification Draft 04
09 September 2024

Editable Source: https://docs.oasis-open.org/energyinterop/ei-cts/v1.0/csd04/ei-cts-v1.0-csd04.pdf (Authoritative) 

HTML: https://docs.oasis-open.org/energyinterop/ei-cts/v1.0/csd04/ei-cts-v1.0-csd04.html

DOCX: https://docs.oasis-open.org/energyinterop/ei-cts/v1.0/csd04/ei-cts-v1.0-csd04.docx

For your convenience, OASIS provides a complete package of the specification document and any related files in a ZIP distribution file. You can download the ZIP file at:  

https://docs.oasis-open.org/energyinterop/ei-cts/v1.0/csd04/ei-cts-v1.0-csd04.zip

How to Provide Feedback

OASIS and the Energy Interoperability TC value your feedback. We solicit input from developers, users and others, whether OASIS members or not, for the sake of improving the interoperability and quality of its technical work.

The public review starts October 7, 2024 at 00:00 UTC and ends November 7, 2024 at 23:59 UTC.

Comments from TC members should be sent directly to the TC’s mailing list. Comments may be submitted to the project by any other person through the use of the project’s Comment Facility: https://groups.oasis-open.org/communities/community-home?CommunityKey=70a647c6-d0e6-434c-8b30-018dce25fd35

Comments submitted for this work by non-members are publicly archived and can be viewed by using the link above and clicking the “Discussions” tab.

Please note, you must log in or create a free account to see the material. Please contact the TC Administrator (tc-admin@oasis-open.org) if you have any questions regarding how to submit a comment.

All comments submitted to OASIS are subject to the OASIS Feedback License, which ensures that the feedback you provide carries at least the same obligations as those of the TC members. In connection with this public review, we call your attention to the OASIS IPR Policy [1], applicable especially [2] to the work of this technical committee. All members of the TC should be familiar with this document, which may create obligations regarding the disclosure and availability of a member’s patent, copyright, trademark and license rights that read on an approved OASIS specification.

OASIS invites any persons who know of any such claims to disclose these if they may be essential to the implementation of the above specification, so that notice of them may be posted to the notice page for this TC’s work.

Additional information about the specification and the Energy Interoperability TC can be found at the TC’s public home page: https://www.oasis-open.org/committees/energyinterop/

Additional references:

[1] https://www.oasis-open.org/policies-guidelines/ipr/

[2] https://www.oasis-open.org/committees/energyinterop/ipr.php

The post Invitation to Comment – Energy Interop (CTS) Version 1.0 appeared first on OASIS Open.


OpenID

10 Years On: OpenID Connect Published as an ISO/IEC Spec

The OpenID Connect Final specification was launched on February 26, 2014 with a vision of increased security, privacy, and usability on the internet. Ten years after that publication, we are delighted to announce that 9 OpenID Connect specifications are now published as ISO/IEC standards.

ISO/IEC 26131:2024 — Information technology — OpenID connect — OpenID connect core 1.0 incorporating errata set 2
ISO/IEC 26132:2024 — Information technology — OpenID connect — OpenID connect discovery 1.0 incorporating errata set 2
ISO/IEC 26133:2024 — Information technology — OpenID connect — OpenID connect dynamic client registration 1.0 incorporating errata set 2
ISO/IEC 26134:2024 — Information technology — OpenID connect — OpenID connect RP-initiated logout 1.0
ISO/IEC 26135:2024 — Information technology — OpenID connect — OpenID connect session management 1.0
ISO/IEC 26136:2024 — Information technology — OpenID connect — OpenID connect front-channel logout 1.0
ISO/IEC 26137:2024 — Information technology — OpenID connect — OpenID connect back-channel logout 1.0 incorporating errata set 1
ISO/IEC 26138:2024 — Information technology — OpenID connect — OAuth 2.0 multiple response type encoding practices
ISO/IEC 26139:2024 — Information technology — OpenID connect — OAuth 2.0 form post response mode

We would like to thank the AB/Connect Working Group for their tireless efforts building and maintaining this family of specifications, including the process of applying errata corrections to the specifications, so that the ISO versions would have all known corrections incorporated. 

OpenID Connect has been used by millions of developers and deployed in billions of applications worldwide. Publication by ISO as Publicly Available Specifications (PAS) will enable even broader global adoption by enabling deployments within ecosystems and jurisdictions that require the use of specifications from standards bodies recognized by international treaties (such as ISO).

The OpenID Foundation remains committed to helping people assert their identities wherever they choose – and to do so by building identity standards that are secure, interoperable, and privacy-preserving. For the benefit of individual and ecosystem security all over the world, OIDF will soon follow this same process with other specification families. These include the FAPI 1.0 and eKYC-IDA specifications and, once they’re final, the FAPI 2.0 specifications.

Many thanks to all of OIDF spec authors, implementers, members, and contributors who have ensured the success of OpenID Connect over the last 10 years!

About the OpenID Foundation

The OpenID Foundation (OIDF) is a global open standards body committed to helping people assert their identity wherever they choose. Founded in 2007, we are a community of technical experts leading the creation of open identity standards that are secure, interoperable, and privacy preserving. The Foundation’s OpenID Connect standard is now used by billions of people across millions of applications. In the last five years, the Financial Grade API has become the standard of choice for Open Banking and Open Data implementations, allowing people to access and share data across entities. Today, the OpenID Foundation’s standards are the connective tissue to enable people to assert their identity and access their data at scale, the scale of the internet, enabling “networks of networks” to interoperate globally. Individuals, companies, governments and non-profits are encouraged to join or participate. Find out more at openid.net.

The post 10 Years On: OpenID Connect Published as an ISO/IEC Spec first appeared on OpenID Foundation.


Oasis Open Projects

OASIS Celebrates 20th Anniversary of Common Alerting Protocol, Global Standard for Alerts and Warnings

CAP Standard Has Transformed Emergency Communication and Continues to Save Lives

Boston, MA – 7 October 2024 – This month marks the 20th anniversary of the Common Alerting Protocol (CAP) being established as an OASIS Open Standard. CAP, part of the EDXL suite of standards, provides an open, non-proprietary message format for delivering all-hazard alerts and notifications. Over the past two decades, CAP has become a model of global collaboration and a fundamental component of emergency communications systems worldwide. Its use across multiple platforms has helped save countless lives through timely, reliable messaging. Today, 87% of the world’s population lives in a country with at least one national-level CAP news feed for emergency notifications. 

CAP enables a consistent message to be disseminated simultaneously over a variety of communication pathways, including radio, television, mobile phones, emails, and other media. This all-hazards, all-media format ensures that critical alerts (e.g., weather events, earthquakes, tsunami, volcanoes, public health crises, power outages, fires, child abductions, and more) reach the public swiftly and efficiently, regardless of the medium.

“As we celebrate 20 years of CAP, I’m incredibly proud that it has become the backbone of emergency communication worldwide, recognized by the UN as the standard for the Early Warnings for All program. The success of CAP is a testament to the dedication and collaboration of so many over the years, and I extend my sincere thanks to everyone who has played a part in making it the global standard it is today,” said Elysa Jones, chair of the OASIS Emergency Management Technical Committee (EMTC). “CAP’s ability to deliver consistent, interoperable alerts through multiple channels has made it indispensable for disaster management. We’ll continue to evolve CAP to ensure it serves communities in need.”

The CAP community will commemorate this significant anniversary milestone at the CAP Implementation Workshop from 22-24 October in Leuven, Belgium. OASIS is a co-sponsor of the event, which will focus on CAP and its consistent use throughout the world. OASIS and the EMTC will continue to work with nations and organizations to explore future advancements in global emergency alerting.

The fundamental need for CAP was identified by the Partnership for Public Warning (PPW) in response to the 9/11 attacks, when there was no consistent method for informing the nation. The 2004 Indian Ocean tsunami highlighted the urgent need for improved emergency alert communication across the globe. With the support of Eliot Christian, longtime CAP advocate and former chief architect of the World Meteorological Organization (WMO) Information System (WIS), and Elysa Jones, chair of the OASIS EMTC, along with EMTC members, CAP was officially adopted by the International Telecommunication Union (ITU) in 2007 as ITU-T Recommendation X.1303. Since then, many international organizations like the WMO, the International Federation of Red Cross and Red Crescent Societies (IFRC), and the United Nations Office for Disaster Risk Reduction (UNDRR) have embraced CAP as an essential standard for emergency alerting. In 2021, the Call to Action on Emergency Alerting set a goal to achieve 100% CAP implementation by 2025, an initiative that has since been integrated into the UN’s Early Warnings for All initiative.

OASIS and its partners are committed to increasing global CAP adoption. Participation in the EMTC is open to all through membership in OASIS, with interested parties encouraged to join and contribute to shaping the future of alerting. To get involved in the TC, visit www.oasis-open.org/join-a-tc

The post OASIS Celebrates 20th Anniversary of Common Alerting Protocol, Global Standard for Alerts and Warnings appeared first on OASIS Open.


DIF Blog

Vidos Challenges you to Transform the Recruiting Process and Streamline Access to Digital Services

DIF is proud to introduce Vidos as a Silver sponsor of the DIF 2024 Hackathon! Vidos Hackathon challenges tackle the real-world problems of transforming the recruitment process and streamlining access to digital services using Decentralized Identifiers (DIDs) and Verifiable Credentials (VCs).

About Vidos

Vidos’s mission is to empower a digital evolution that prioritizes user-centric control of identity, data, and communications. By providing tools for builders and services for enterprises, Vidos makes it easy for organizations across finance, education, legal, and travel & hospitality sectors (and beyond) to work with digital identity and verifiable credential services.

Two Challenges, Limitless Innovation

This year, Vidos is presenting two challenges that highlight the practical applications of DIDs and VCs in recruitment and reusable identity:

Challenge 1: Employer Portal Using DIDs and VCs: Imagine a world where recruiters can instantly verify candidate qualifications, automate onboarding processes, and even personalize training programs using verifiable credentials. This challenge invites you to build an "employer portal" that leverages DIDs and VCs to create a seamless and secure experience for both recruiters and candidates.

Example use cases include:

● Matching candidates to job openings based on verified skills and qualifications

● Automating the verification of educational and professional credentials

● Creating a marketplace for qualification issuers and recruitment agencies

Prizes for this challenge total $2,250 with additional social promotion opportunities!

Challenge 2: VC Interoperability: This challenge tackles the power of reusable identity. Build a solution that demonstrates how a single VC, such as a passport, can be used for various purposes, including age verification and travel authorization. By showcasing the interoperability of VCs across different issuers and scenarios, you'll be at the forefront of shaping a future where individuals have more control over their data and how it's used.

An example use case could be the use of a travel document, such as a passport or mobile driver’s license, for both travel and age-gated entry to an online service.

Prizes for this challenge total $2,250 with additional social promotion opportunities!

A Word from Vidos

”The DIF 2024 Hackathon is an exciting opportunity for builders to create practical, user-centric solutions with decentralized identity. At Vidos, we're sponsoring challenges focused on real-world adoption in recruitment, travel, and digital services. We believe DIDs and VCs transform how businesses can handle identity verification and data sharing, putting more control in the hands of individuals. Our goal is to inspire developers to build tools that organizations can implement today, driving adoption of decentralized identity while enhancing user privacy and control.” said Tim Boeckmann, CEO of Vidos.

Why Build with Vidos?

Real-World Impact: Vidos’s challenges are designed to address real-world problems faced by the recruitment and travel industries and by digital service providers.
Industry Exposure: As an active member of the decentralized identity space, Vidos is offering participants the chance to showcase their skills and gain valuable exposure to potential employers and partners.
Cutting-Edge Technology: Work with the Vidos tech stack and tools from our partner network to build innovative solutions that push the boundaries of decentralized identity.
Expert Mentorship: Participants will have access to Vidos’s team of mentors for guidance and support throughout the hackathon.

DIF’s Executive Director on Vidos’s Participation

Kim Duffy, Executive Director of DIF, shares her excitement about Vidos' participation:

"Vidos' challenges for the DIF 2024 Hackathon showcase how decentralized identity can tackle real-world issues and create a more equitable digital future. By focusing on revolutionizing recruitment and enabling reusable identity, we're opening doors to increased economic mobility and fairer access to opportunities. I'm excited to see how participants will use DIDs and VCs to transform identity verification and data sharing in professional settings, potentially making the job market more accessible and inclusive for all."

Ready to get started?

Explore the Vidos Digital Identity Hack Pack for resources, tools, and inspiration
Join the conversation on the DIF Hackathon Discord server
Register for Vidos’ information session
Read the details of Vidos’ challenges: Employer Portal Using DIDs and VCs and VC Interoperability
Visit the Vidos website to learn more about Vidos and their vision for the future of identity

Don't miss this opportunity to work with cutting-edge decentralized identity technology and make a real impact on the future of recruitment and digital identity!


Human Colossus Foundation

Human Colossus Foundation at the Global DPI Summit: Shaping the Future of Digital Public Infrastructure

The Human Colossus Foundation (HCF) was honored to participate in the Global Digital Public Infrastructure (DPI) Summit in Cairo, under the auspices of H.E. President Abdel Fattah El-Sisi, President of The Arab Republic of Egypt.

The Global Digital Public Infrastructure (DPI) Summit in Cairo was the world’s first summit dedicated to DPI, bringing together a diverse ecosystem of experts. The summit featured insightful keynotes, engaging discussions, and practical focus panels where participants shared real-world DPI implementation experiences. Success stories spanned national digital identity, payments, government services, and data exchange initiatives. However, discussions highlighted challenges, particularly in cross-governance data exchange and the interoperability layer, signaling a need for improved solutions to ensure a seamless DPI ecosystem across sectors and borders.

At the summit, discussions emphasized the need to start thinking about the implementation of DPI beyond service-oriented use cases such as finance, public services, and governance. These sectors have benefited from DPI adoption, but as noted during the summit, there remains much work to do in improving its application across industries.

One of the key discussions revolved around the evolution of DPI. The first iteration of DPI provided the initial frameworks for public digital infrastructure, focusing on secure and efficient digital services. However, the future version promises to shift the focus towards interoperability, with a higher emphasis on connecting different systems and ensuring they work together seamlessly. This is a critical development as governments and organizations look to build more integrated, accessible, and collaborative infrastructures. HCF welcomes this trend, as it goes in the direction the Foundation has been promoting since its creation in 2020: total interoperability within and across ecosystems.

Data exchange was a recurring theme during the event. In particular, the necessity of cross-border governance frameworks was raised and discussed. While there has been much progress within individual countries or regions, many experts admitted that the global community still lacks a clear pathway to enable effective cross-governance data exchange. The complexities of regulatory frameworks, governance structures, and varying legal standards pose significant challenges. At present, there seems to be no clear consensus on how to tackle this issue, highlighting the urgent need for collaborative innovation. HCF witnesses these complexities in the projects we are involved in (see the Governance Periscope blog post of Sep. 16). There is no single digital governance framework that will capture the world’s diversity.

HCF was pleased to see a strong emphasis on inclusion and the need for vendor-agnostic solutions, ensuring that digital public infrastructure is accessible to all, regardless of geography or socio-economic status. This aligns with HCF’s mission of building decentralized, scalable infrastructure that works for everyone, not just for those in advanced economies or within specific vendor ecosystems.

HCF’s vision for a digital infrastructure that scales horizontally was widely accepted at the summit. The need for a common infrastructure that can be applied across various industries and sectors was highlighted as critical for the next phase of DPI development. This closely aligns with HCF's work, which focuses on enabling cross-sector digital infrastructure that is decentralized, scalable, and interoperable.

The first DPI summit was a great success, setting the stage for the continued development of global digital public infrastructures. The next DPI Summit is scheduled for November 4-6, 2025, and it promises to build on the momentum from Cairo, with even more insights and innovations expected to emerge. 

HCF is excited to continue contributing to these important discussions, helping shape the global DPI ecosystem and ensuring that it meets the needs of people across all sectors and regions.

In conclusion, the Global DPI Summit in Cairo highlighted the critical role DPI will play in shaping the future digital economy, and HCF’s work in decentralized infrastructure aligns perfectly with this vision. We look forward to further collaborations and innovations in the years to come.


ResofWorld

Instagram helps Brazilians dodge a national sales ban on vapes

As Brazilian law clashes with Meta’s policies, illegal vapes flood social media and the streets.
On August 3, Love Disk, a tobacco and liquor shop, posted an ad of a saleswoman holding three vapes up to the camera on Instagram. “The little beloved ones have...

Elastos Foundation

Reclaim the Web: Bring Back Authenticity Before It’s Lost Forever.

The internet isn’t dying. It’s already dead, taken over by bots and AI. Let’s bring it back to life for real people. The internet in the early days was a boundless frontier teeming with human connection, creativity, and unfiltered authenticity. It was a place where communities blossomed around shared interests, where originality thrived, and every voice could find an audience.

Today, that vibrant space is less than 50% of the internet. We’ve drifted into an era where bots outnumber real users, corporate algorithms dictate our experiences, and genuine human interaction is suffocated under increasing layers of automation. The essence of the internet—that which made it a rebellious tool for unity and expression—is fading away.

The transformation didn’t happen overnight. Slowly but surely, the internet morphed into a sterile version of itself—a curated environment dominated by corporate interests and government oversight. The organic growth of communities gave way to controlled ecosystems where user engagement is meticulously engineered. Social media platforms, once the epicenters of human interaction, have become echo chambers filled with recycled content, inflammatory bias, and automated posts.

Recent studies suggest that bots and automated accounts now constitute 49.6% of overall web traffic, and 65% of these bots are malicious.  On platforms like Twitter, it’s estimated that bots make up around 15% of all accounts. These automated participants flood timelines with repetitive messages, manipulate trending topics, and distort public discourse. They create a dense fog that obscures the line between genuine human expression and automated noise. The fakery is gross and superficial.

The corporate control of the internet extends beyond social media. Search engines, content platforms, and even news outlets are influenced by algorithms designed to maximize profit and personal agendas, often at the expense of user experience. Personalized ads follow us relentlessly, and content is filtered through the lens of what will keep us clicking, not what informs or enriches us. This has led to a fundamental erosion of trust.

Users are either hypnotized or increasingly skeptical of the content they consume, unsure whether it’s genuine or manipulated. Fake news proliferates, and the ability to engage in open, honest dialogue diminishes. The internet, once a tool for enlightenment, has become a battlefield of misinformation. However, the core purpose of blockchain technology is to reinstate trust, and new layers are forming.

Elastos is a decentralized internet designed to restore digital sovereignty to individuals. Unlike the current internet, which relies on centralized servers, Elastos uses blockchain technology and peer-to-peer networks to let users own their data and digital identities.

With Elastos, you have a digital shield protecting your online activities. Holding your own “freedom key”, you control your privacy and security in a new safe zone. Your data isn’t stored on corporate servers; it’s securely encrypted and accessible only by you and who you let in, verified using blockchain technology. This marks a significant shift in how we interact with the digital world.

Elastos creates a new digital commons—a decentralized platform where communities form organically without centralized gatekeepers. Developers can build decentralized applications (dApps) that operate securely and privately. Content creators can share work directly with audiences, free from algorithmic suppression or monetization schemes favouring platforms over individuals.

By decentralizing the internet’s infrastructure, Elastos frees users from the monopolistic grip of large tech companies. Users become active participants in a network valuing privacy, security, and autonomy. This change reclaims the internet as a safe space for genuine human connection and empowerment—a true rebirth of the authentic Internet.

Building on Elastos’ decentralized infrastructure, Elacity provides access control systems which allow owners, creators, and communities to regain control over their digital environments. This platform fosters a rebirth of authenticity, allowing creators to interact directly, securely, and profitably with audiences.

Elacity is more than a marketplace; it’s a blockchain-governed platform where everyone’s rights are safeguarded using tokenized rights in decentralized wallets, and where genuine engagement is rewarded. In its upcoming v2 release, anyone can create an Elacity channel—a special economic environment for communication. This marks a platform for a creative explosion!

Users can be independent, set up business models like “Buy Now” or “Subscribe” to their property, or establish token-gated access—for example, “hold 28 ELA to enter.” Users gain access only if they meet owners’ specific terms enforced through smart contracts, bringing privacy and trust back online. This mechanism also produces a blockchain-powered AI bot filter for this new internet layer.
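
As a purely illustrative sketch of the gating rule described above (the 28-ELA threshold comes from the example in the text; the function name is hypothetical, and a real deployment would enforce this inside the smart contract rather than in client code):

```python
# Minimal token-gate check: admit a wallet only if it meets the owner's terms.
from decimal import Decimal

GATE_THRESHOLD_ELA = Decimal(28)          # example threshold from the text

def may_enter(wallet_balance_ela: Decimal) -> bool:
    """Grant access only if the wallet holds at least the gate threshold."""
    return wallet_balance_ela >= GATE_THRESHOLD_ELA

assert may_enter(Decimal("30.5"))         # 30.5 ELA clears the gate
assert not may_enter(Decimal("27.99"))    # 27.99 ELA does not
```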

Central to Elacity is non-custodial ownership. Creators retain full control over their digital assets, free to license, sell, or share without handing over ownership to intermediaries. Elacity functions as a world computer marketplace, a new economy for creativity powered by blockchain technology and smart contracts.

Elacity’s decentralized Digital Rights Management (dDRM) system lets you set terms for how your content is used and monetized. Transactions are secure, immediate, and transparent, eliminating delays typical of traditional platforms. This new era of creative empowerment extends beyond individual creators to community builders and social environments.

By removing intermediaries and implementing transparent systems, Elacity lays the foundation for a digital renaissance—an internet where creativity thrives and authenticity is celebrated, all whilst bots and corporate control are denied!

The internet is at a crossroads. We can either continue down a path of automation, corporate control, and detachment, or we can forge a new way forward—one in which individuals reclaim their digital sovereignty, creators flourish without intermediaries, and communities thrive through real, meaningful connections.

Take back control of your digital life. Explore Elastos and Elacity today, and become part of the movement to restore the internet to its rightful owners—you. Did you enjoy this article? Follow Infinity for the latest updates here!

 

Friday, 04. October 2024

ResofWorld

Dozens of police influencers are running for office in Brazil

A history of policemen using violent content to build their online personalities has policymakers on edge.
When he’s not chasing after criminals in São Paulo’s dangerous neighborhoods, police lieutenant Flavio Goncalves da Costa is busy discussing his risky operations on police-themed shows on YouTube. The shows...

EdgeSecure

Building Pathways to Equity and Economic Prosperity

New Jersey Community College Opportunity Summit

As a vital part of the higher education ecosystem, community colleges play a key role in responding to the economic, demographic, technological, social, and environmental shifts impacting New Jersey. To discuss these current challenges and the strategies for helping transform education to better support student success, the New Jersey Council of County Colleges (NJCCC) held their inaugural Opportunity Summit on June 11-13, 2024 in Atlantic City. Along with community college leaders, faculty, and staff, the Summit welcomed partners from high schools, colleges, universities, unions, community-based organizations, state and local government, and the public workforce system.

The three-day event focused on NJCCC’s Opportunity Agenda which is centered around equity, collaboration, opportunity, and innovation. “We’re in a time of rapid change and our colleges are working diligently to respond to these changes and keep on the cutting edge of helping more students earn degrees and credentials that will help them lead quality lives,” says Aaron R. Fichtner, Ph.D., President, New Jersey Council of County Colleges. “We felt it was important to bring colleges together to share insights and information, while also hearing from national thought leaders about the challenges many of us are confronting in higher education.”

Expanding Education and Workforce Partnerships
The first day of the Opportunity Summit focused on the Opportunity Agenda where state and national experts shared unique insights into navigating the evolving world of education. The second day was the NJ Pathways to Career Opportunities Summit and discussed ways to expand innovative education and workforce partnerships. In partnership with the New Jersey Business and Industry Association, the NJ Pathways to Career Opportunity Initiative has joined together government, industry, union, and education partners to build stackable education and training opportunities.

The third day highlighted the Community to Opportunity Initiative which provides holistic support to community college students and addresses food insecurity, childcare issues, and mental health and wellness. “This event was uniquely structured and offered the opportunity to attend one or two days or all three,” says Maria Heidkamp, Chief Innovation and Policy Officer, New Jersey Council of County Colleges. “We had five hundred attendees in total and many colleges had teams of five or six people representing different areas, including leadership, academic affairs, student services, data services, and workforce development. We heard through feedback that they appreciated being able to come to one conference together and participate as a team.”

“The lunch panel, Serving New Jersey’s Justice-Impacted Individuals, discussed the work that our colleges are doing to serve justice-impacted students. Governor McGreevey was a part of this session and the presentation had many great speakers and inspiring moments. It highlighted three amazing programs that are helping individuals who are either in prison or are on probation and giving them opportunities to succeed.”

— Aaron R. Fichtner, Ph.D.
President, New Jersey Council of County Colleges

Building Pathways to Equity and Economic Prosperity
New Jersey’s community colleges continue to help a diverse group of students achieve their academic goals and have worked within their organization and collectively with other institutions to expand instruction, build partnerships with high schools, four-year colleges and universities, nonprofits, and businesses; and improve student outcomes. To meet this core mission and effectively address equity challenges, community colleges within the state have committed to following four action pillars outlined in the Opportunity Agenda. Dr. Fichtner explored these pillars with Summit attendees and how these collective priorities can help promote equitable academic, social, and economic mobility for the greater community:

Pillar 1: Helping all high school students access pathways to postsecondary and career success. Strategies include ensuring all high school students have the opportunity to earn at least six college credits while in high school and understand their options for further education and a career.

Pillar 2: Fostering student success and completion in postsecondary education and training. To achieve this mission, there is a commitment to make community college tuition-free through an expansion of the Community College Opportunity Grant (CCOG) program. There is also a drive to invest in a statewide student success initiative targeting low-income and underrepresented students, student parents, justice-impacted students, students with disabilities, veterans, and others.

Pillar 3: Building transparent, seamless, and stackable pathways that respond to the changing economy. In collaboration with four-year institutions, state leaders, and technologists, New Jersey community colleges aim to revitalize general education and address the implications of AI for students, staff, and faculty. Strategies also include ensuring all students can complete paid internship work experience, embedding workforce credentials in community college programs, and building statewide and regional pathways connecting students to credentials, degrees, and lifelong learning.

Pillar 4: Helping adults attain the credentials they need for career mobility and labor market success. To achieve this goal, strategies include funding noncredit programs for low-income students and developing a consistent, statewide approach to Credit for Prior Learning (CPL)/Prior Learning Assessment (PLA).

“The Opportunity Agenda was released publicly in March, and we were able to do a larger public rollout at the Summit,” says Fichtner. “To get the buy-in and support of all eighteen community colleges in New Jersey, it took several months of discussion around which pillars to select,” adds Heidkamp. “We reached a consensus around these action pillars and the strategies that support them. The community recognizes that technology is advancing and what that means for the workforce. Colleges are also identifying that there are large equity gaps. The point of the Agenda is to build upon the current momentum happening within community colleges around holistic student support and workforce development and determine how we can capitalize on what is already underway to shape our focus going forward.”

Fichtner adds, “We’re making a series of bold efforts to help our eighteen community colleges continue to make strides in equity, access, and success. Each pillar is part of a broader strategy and we want to empower institutions as they innovate and evolve to meet the rapidly-changing world that we live in.”

In addition to sharing details of the Opportunity Agenda, day one of the Summit included the session, Amid AI Revolution: Opportunities for Community College Innovation, presented by Dave Cole, Chief Innovation Officer, State of New Jersey, and Developing Community College AI Programs: Lessons from an AI Pioneer on Equity, Academics, Industry Partnerships, and Degrees, presented by Dr. Madeline Burillo-Hopkins, Vice Chancellor Workforce Instruction and President Southwest College, Houston Community College.

Expanding Innovative Education and Workforce Partnerships
The second day kicked off with a breakout session, Reckoning with Relevance: 2024 State of the Sector, led by Dr. Tara Zirkel, Director, Strategic Research, EAB, followed by The Success of the New Jersey Pathways to Career Opportunities Initiative, presented by Dr. Michael McDonough, President, Raritan Valley Community College, and Catherine Starghill, Esq., Vice President, New Jersey Council of County Colleges and Executive Director, New Jersey Community College Consortium for Workforce and Economic Development. “Day one invited nationally known speakers who spoke on very timely and important topics,” says Heidkamp. “Day two was a mix of national speakers and presenters who highlighted some of the workforce projects going on at our colleges.”

Additional sessions included Education Pathways and The Future of Work presented by Charlotte Cahill, Associate Vice President, Education, Jobs for the Future, and Industry and Pathways: The Intersection of Education and Workforce Development, led by Amanda Winters, Program Director, Post-Secondary Education, National Governors Association; Keith Witham, Vice President of Education Philanthropy, Ascendium Education; and Paul Fain (Moderator), Journalist, Work Shift and The Job, and The Cusp Podcast. The afternoon session, New Jersey Pathways to Career Opportunities: Centers of Workforce Innovation Highlights, included representation from a variety of industries, including aseptic processing and biomanufacturing, film and television production, esports production, and robotics and automation.

“This event was uniquely structured and offered the opportunity to attend one or two days or all three. We had five hundred attendees in total and many colleges had teams of five or six people representing different areas, including leadership, academic affairs, student services, data services, and workforce development. We heard through feedback that they appreciated being able to come to one conference together and participate as a team.”

— Maria Heidkamp
Chief Innovation and Policy Officer, New Jersey Council of County Colleges.

Holistic Student Supports
Day three of the Opportunity Summit explored a variety of topics, including enhancing institutional access through online learning, transforming the local workforce ecosystem, and flexible work arrangements. Attendees could also learn more about AI in teaching, integrating grant offices into finance and procurement, and SNAP employment and training. “After sending out a request for presentations, sessions were added to the Summit agenda by a selection committee,” explains Linda Scherr, Chief Academic Officer, New Jersey Council of County Colleges. “Along with invited keynotes and plenary sessions, we wanted to give faculty and staff within the community an opportunity to share best practices and insights with their peers. It was a great blend of topics and presenters and led to unique collaborations that may not have been previously possible.”

Heidkamp adds, “Along with giving us an opportunity to officially kick off the Opportunity Agenda, this event allowed us to connect with several organizations we hope to partner with moving forward. Many are social justice groups, like the New Jersey Institute for Social Justice and the United Way. We also had business groups, like the Statewide Hispanic Chamber, the New Jersey Business and Industry Association, and the Council on Humanities. These are all partners that are reflected in the four pillars and we look forward to joining forces and creating a stakeholder group who can help move the needle in a positive direction.”

“We see students at many different stages of their lives. They may come to community college for one goal, and then come back later for further career development. We want to be their partner for life and be an anchor institution for our communities. A conference like the Opportunity Summit allows us to showcase this mission and identify how we can work together in a coordinated fashion to help students gain the knowledge and skills necessary to make an impact in the workforce.” 

— Linda Scherr
Chief Academic Officer, New Jersey Council of County Colleges

Memorable Event Highlights
Throughout the Summit and the Holistic Student Supports Convening, national and state thought leaders shared their perspectives on key issues facing higher education and how all organizations can work together to ensure students and workers are prepared to thrive in an innovative economy. “There were so many great moments throughout the three-day event,” reflects Fichtner. “I really enjoyed Dr. Chauncy Lennon’s presentation, Pathways: Equity and Access to High Quality Industry Credentials and Associate Degrees. He did an excellent job framing where we are in higher education and our society and what actions we will need to take going forward.”

“The lunch panel, Serving New Jersey’s Justice-Impacted Individuals, discussed the work that our colleges are doing to serve justice-impacted students,” continues Fichtner. “Governor McGreevey was a part of this session and the presentation had many great speakers and inspiring moments. It highlighted three amazing programs that are helping individuals who are either in prison or are on probation and giving them opportunities to succeed.”

The session, Developing Community College AI Programs: Lessons from an AI Pioneer on Equity, Academics, Industry Partnerships, and Degrees, led by Dr. Madeline Burillo-Hopkins, Vice Chancellor, Workforce Instruction and President Southwest College, Houston Community College, was among Heidkamp’s favorites of day one. “Dr. Burillo-Hopkins has helped lead the efforts at her college to develop AI associate degrees, as well as bachelor’s degrees in AI and robotics. She was very energetic and passionate about the subject and encouraged attendees to embrace this rapidly growing trend.”

Putting on such an event involves a great deal of planning and moving parts, but overall NJCCC feels it was a great success. “I think we were able to maximize the range of topics and engage people across many issues,” shares Heidkamp. “We continue to receive tremendously positive feedback and there is already momentum and encouragement to prepare for next year.” Fichtner adds, “From our conversations with college presidents and leaders, there was a real excitement about being together with colleagues from across the state. Many community colleges said they appreciated having an opportunity for their teams to spend time together off campus and listen to national experts and partners explore different topics that are pertinent to everyone in the education space.”

The Important Role of Community Colleges
Community college graduates play a vital role in the success of the state’s key industries, including manufacturing, technology, health care, education, and renewable energy. “NJCCC serves 240,000 students; half of all undergraduate students in public higher education and forty percent of all college students in New Jersey attend a community college,” says Fichtner. “Our colleges are comprehensive institutions that serve a wide variety of purposes and help people advance their lives and gain family-supporting careers. We want to make sure that everybody in our community has opportunities, which is where our relentless focus on equity comes into play. We also place priority on helping high school students earn college credit and become inspired to continue their journey after graduation.”

“We see students at many different stages of their lives,” adds Scherr. “They may come to community college for one goal, and then come back later for further career development. We want to be their partner for life and be an anchor institution for our communities. A conference like the Opportunity Summit allows us to showcase this mission and identify how we can work together in a coordinated fashion to help students gain the knowledge and skills necessary to make an impact in the workforce.”

Heidkamp says businesses are increasingly turning to community colleges to be their academic and workforce partners. “The value of community colleges continues to gain recognition and we’ve done a great deal in recent months to show the important role these institutions play in New Jersey. Historically, there was some stigma attached to community colleges, but thankfully, that viewpoint is changing and more people are recognizing the unique service they can provide in creating a skilled workforce, advancing the state’s innovation economy, and enriching the community for all.”

To learn more about the NJCCC’s vision, initiatives, and upcoming events, visit njcommunitycolleges.org.

View Article in View From The Edge Magazine »

The post Building Pathways to Equity and Economic Prosperity appeared first on NJEdge Inc.


Navigating the New Landscape of GLBA Compliance

Key Changes Required to Protect Your Federal Financial Aid

For higher education institutions offering financial aid to students, the Gramm-Leach-Bliley Act (GLBA) requires meeting compliance standards pertaining to the security and protection of financial information and providing transparency about how personal information is used and shared. Failure to meet these standards carries significant risk for institutions, including restrictions on or loss of eligibility for Title IV funding. In July 2024, Edge security, privacy, and compliance experts hosted a webinar focused on key changes to GLBA compliance requirements, how these revisions increase the compliance burden for many organizations, and key steps for meeting the new standard and maintaining compliance to receive federal financial aid support.

GLBA Compliance
In the last 10 years, GLBA has become increasingly integrated with higher education and higher education technology. This act regulates the protections that must be in place to safeguard student financial information at institutions that issue or handle student aid through Title IV programs or that use the Student Aid Internet Gateway system. Since 2018, the Office of Federal Student Aid (FSA) at the U.S. Department of Education has encouraged higher education institutions to work toward adopting NIST SP 800-171, and in 2020 indicated this may be included in future compliance requirements.

More recently, in May 2024, NIST updated the SP 800-171 standard to Revision 3, changing the specific controls required to achieve compliance. These changes mean that higher education institutions will need to re-assess compliance if they are currently implementing an earlier revision of NIST SP 800-171 to meet the upcoming GLBA requirement. “Member institutions using the EdgePro virtual CISO (vCISO) service are already aware of the GLBA requirements, and we have strategies in place to bring their organization onto the NIST 800-171 standard,” explains Dr. Dawn Dunkerley, Edge’s Principal vCISO. “The changes in Revision 3 will affect ongoing work and will require updates to policies and procedures that are already developed.”

“Other institutions that are currently compliant with GLBA may not yet be working towards compliance with NIST SP 800-171,” continues Dunkerley. “This can be a great opportunity to be proactive and receive an assessment to identify any gaps to NIST SP 800-171 Rev 3 compliance. The EdgePro vCISO can be another asset to help with documentation, make recommendations for technologies, and create processes for immediate adoption, implementation, and operational improvements within the organization.”

Withdrawn and Consolidated Controls
Even those institutions that are already tracking compliance with NIST SP 800-171 will need to make adjustments in order to stay compliant with the upcoming GLBA requirement. Revision 3 makes several changes that will impact institutions both administratively and in substantive control requirements. On the administrative side, updates will be required to domain and control numbering schemes, titles, and specific language. “Each control was renumbered, so if you have a system security plan, it will require touch labor on each specific control,” says Dunkerley. “Several controls have been withdrawn and some existing controls have been consolidated and combined with other existing controls. There are also three new domains added, including nine individual controls that focus more on depth and proving control implementation, versus simply defining activities that should be done. The framework now has a helpful supplemental document that is a guide for assessors and allows an institution to look at each control and ask questions such as: how would an assessor test me on this; what interview might they conduct; what is a possible technical test; and what artifacts would they be looking for from a documentation perspective?”

All of the controls listed in the new revision have been re-numbered, re-titled, and the descriptions of requirements have been expanded to make requirements clearer. Any documentation that specifically references the NIST SP 800-171 Revision 2 controls will need to be updated in kind, and updated control descriptions will need to be reviewed to ensure that the control implementation in practice matches the clarified descriptions. “The updated and clarified control language may help institutions implement controls more effectively and the assessment guide is helpful for understanding how each would be tested,” says Dunkerley. “Institutions that have already implemented these controls should decide whether to keep each control in place or remove the relevant requirements from policies and enforcement practices.”
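
To make the reference-updating work concrete, here is a minimal sketch of scanning policy text for Revision 2 control references against a crosswalk dictionary. The control IDs and mapping entries below are illustrative placeholders, not the official mapping; the authoritative Rev 2 to Rev 3 crosswalk should come from NIST's published transition materials.

```python
import re

# Illustrative-only crosswalk from Rev 2 control IDs to Rev 3 IDs.
# These entries are placeholders; the authoritative mapping comes from
# NIST's published Rev 2 -> Rev 3 transition documentation.
CONTROL_CROSSWALK = {
    "3.1.1": "03.01.01",
    "3.1.2": "03.01.02",
    "3.4.1": "03.04.01",
}

# Matches Rev 2-style references such as "3.1.1" in policy text.
REF_PATTERN = re.compile(r"\b3\.\d{1,2}\.\d{1,2}\b")

def flag_stale_references(policy_text):
    """Pair each Rev 2 reference found with its Rev 3 ID, or None when
    the control was withdrawn or consolidated and needs manual review."""
    return [(ref, CONTROL_CROSSWALK.get(ref))
            for ref in REF_PATTERN.findall(policy_text)]

if __name__ == "__main__":
    sample = "Access is restricted per 3.1.1 and reviewed per 3.1.10."
    for old, new in flag_stale_references(sample):
        print(f"{old} -> {new or 'no direct mapping; review manually'}")
```

A pass like this only flags where touch labor is needed; the substance of each updated control description still has to be reviewed by hand.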

“The new substantive control changes focus not only on performing activities to achieve compliance but on proving that you do. This may include documented processes, logs, interviews, or a technical analysis with an auditor. By following the supplemental document, an institution can get into an auditor’s mindset as they’re reviewing their policies and procedures. Navigating the new landscape of GLBA compliance can be challenging, but we are here to help institutions understand these requirements, develop strategies for safeguarding financial information, and ensure compliance every step of the way.”

— Dr. Dawn Dunkerley
Principal vCISO, Edge

New Domains
Three new domains have been added in Revision 3, which will require new controls to be implemented that were not tracked in previous versions of the NIST SP 800-171 standard. “One of the more exciting pieces in these updates is the addition of new domains,” shares Dunkerley. “They are planning, system and services acquisition, and supply chain risk management. There can be several policies and procedures that fall underneath each of these, but the two main artifacts specific to this particular framework are the system security plan (SSP) and the plan of action and milestones. The SSP is a living document that shows where an organization is in compliance with the control set. There is an SSP template that shows each control, and for each an institution is either compliant, non-compliant, or it doesn’t apply.”
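
As a rough illustration of the SSP’s role as a living compliance tracker, the sketch below models each control entry and rolls the three statuses up into a progress figure. The field names, status labels, and example entries are assumptions for illustration, not the official SSP template format.

```python
from collections import Counter
from dataclasses import dataclass

@dataclass
class SspEntry:
    control_id: str       # control identifier, e.g. from the Rev 3 catalog
    status: str           # "compliant", "non-compliant", or "not-applicable"
    poam_note: str = ""   # plan-of-action-and-milestones note for any gap

def compliance_progress(entries):
    """Percent of applicable controls currently marked compliant."""
    counts = Counter(e.status for e in entries)
    applicable = len(entries) - counts["not-applicable"]
    return 100.0 * counts["compliant"] / applicable if applicable else 100.0

# Invented example entries showing how the tracker would be used.
ssp = [
    SspEntry("03.01.01", "compliant"),
    SspEntry("03.04.01", "non-compliant", "MFA rollout due Q1"),
    SspEntry("03.17.01", "not-applicable"),
]
print(f"{compliance_progress(ssp):.0f}% of applicable controls compliant")
```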

An institution can use this document to track progress and determine which actions are required to become compliant. The new set of controls for system and services acquisition is associated with following proper practices and procedures for internal development. “We must ask ourselves: how do we engineer security into the things we are building?” says Dunkerley. “If we have a situation where something is not supported by the vendor, what are the practices that are required to minimize the risk to ourselves? How are we dictating to an external provider what security requirements they are required to follow and how will they be held accountable? Each institution must have vendor risk management procedures in place that include active monitoring of their critical vendors.”

Designed for higher education to measure vendor risk, the Higher Education Community Vendor Assessment Tool (HECVAT) is a questionnaire framework that helps assess the information, data, and cybersecurity policies that are in place to protect sensitive institutional information. “Edge likes to use this tool in conjunction with other automated tools before a vendor is brought into our ecosystem,” explains Dunkerley. “We look for a company to share their security practices and documentation, network diagrams, risk assessment results, and how they mitigate any vulnerabilities.”

Substantive Control Changes
In response to the latest trends in the cyber threat landscape, 11 new controls have been added to the existing domain structure and should be implemented to work towards compliance and improve institutional security. These controls may require new policies, documentation, and practices to achieve compliance. “The new substantive control changes focus not only on performing activities to achieve compliance but on proving that you do,” explains Dunkerley. “This may include documented processes, logs, interviews, or a technical analysis with an auditor. By following the supplemental document, an institution can get into an auditor’s mindset as they’re reviewing their policies and procedures. Navigating the new landscape of GLBA compliance can be challenging, but we are here to help institutions understand these requirements, develop strategies for safeguarding financial information, and ensure compliance every step of the way.”

The EdgePro vCISO provides independent and objective input to ensure that your security posture is on track, identifies areas of necessary improvement, and supports areas where you’re already in compliance. To learn more about this service or about conducting an assessment, visit njedge.net/solutions-overview/vciso.

View Article in View From The Edge Magazine

The post Navigating the New Landscape of GLBA Compliance appeared first on NJEdge Inc.


Online Program Success through Strategic Innovation


With enrollment still struggling to bounce back from a downward trend, and closures and consolidations in the news, the need for innovative strategies in higher education has never been greater. Edge’s AVP and Chief Digital Learning Officer, Joshua Gaul, recently led a webinar to discuss these recent trends and what actions can be taken to navigate the evolving educational landscape. Whether an organization is looking to launch new online initiatives or enhance existing ones, employing strategies that build digital learning capacity can help boost enrollment and institutional resiliency and success.

A Closer Look at Campus Closures
Between July 2004 and June 2020, nearly 12,000 campuses closed, leaving many students unsure about their path forward to finish their education. According to research conducted by the State Higher Education Executive Officers Association (SHEEO) and the National Student Clearinghouse Research Center, which analyzed 467 of these campus closures, 78% were for-profit two-year and four-year institutions, and over 20% were non-profit two-year and four-year institutions. “Even prior to 2020, many of these institutions had operating deficits or thin liquidity,” explains Gaul. “The arrival of the pandemic exacerbated these issues and many colleges were forced to merge or close. The majority of these schools were smaller campuses with fewer than a thousand students, but many had over two thousand students before the pandemic. Around 70% of the students experienced an abrupt closure and over half of them did not re-enroll.”

Affected by a $6 million deficit and a drop in enrollment, Cabrini University was among the institutions forced to close. Founded in 1957, the University began with 43 students and offered four major programs. Growing to over 1,700 students and 30 programs in 2021, the institution saw enrollment begin to decline in 2022. Cabrini recorded a decline of approximately 1,000 students, or half the student body, in less than a decade, forcing staff cuts and course changes. These issues, combined with the financial strain, led Cabrini to officially close its doors in June 2024.

The College of Saint Rose in Albany, New York closed in May 2024 after program cuts, a 37% drop in freshman enrollment, and a financial gap of over $11 million. Over the past three decades, the Board of Trustees at Saint Rose had tripled the size of the campus, spending more than $12 million to purchase 68 properties, and another $100 million to renovate and build new facilities on that land. As enrollment dropped, the private institution cut expensive programs, but the budget deficit continued to grow, and the college had to close earlier this year.

In the fall of 2012, Bloomfield College in New Jersey had 2,044 students enrolled, but that number had dropped to just over 1,000 students a decade later. The private college was recognized as New Jersey’s only four-year institution that is simultaneously a Predominantly Black Institution, Hispanic-Serving Institution, and Minority Serving Institution. As the pandemic continued and Black and Hispanic populations struggled with the fallout, enrollment numbers continued to drop. To keep the campus open, Bloomfield merged with Montclair State University in July 2023.

“Mergers and acquisitions in higher education are not uncommon,” says Gaul. “For example, Northeastern University in Boston absorbed Mills College in California. Additionally, schools like Otterbein University and Antioch University announced a partnership, but not a full merger, as part of an effort to create a national university system with various shared services. If we look at the most common factors contributing to these closures or mergers, we see declining enrollment, demographic shifts, and financial instability.”

Gaul continues, “Many colleges and universities were facing financial difficulties even before the pandemic, and about one third of institutions were operating at a deficit in 2019. The pandemic exacerbated these issues, leading to enrollment declines, as well as a questioning of the value proposition of higher education, which was probably the most damaging thing. An institution’s market position, how it’s perceived in its target markets, its financial position, and its ability to manage resources effectively are paramount in determining the institution’s future sustainability, and they help determine what types of investments and strategic decisions make sense for an institution.”

The Segmentation of Higher Education
According to journalist and author Jeffrey Selingo, there are six segments of higher education institutions, each facing different levels of risk following the pandemic:

Powerhouses (3%): Globally focused, prestigious public and private institutions that are primarily focused on research and knowledge creation. Their top concerns include research funding and attracting top faculty and students.
Elites (3%): Highly selective, well-resourced liberal arts colleges that maintain selectivity around undergraduate education. They are less focused on tuition revenue and student demand and more on their positioning within the job-focused world to prepare students with the training and advanced skills they need to succeed.
Strivers (10%): Private and public institutions looking to reach Elite or Powerhouse status but more dependent on tuition revenue or state funding to make that transition.
Regionals (30%): Private colleges that are generally small, tuition-dependent institutions heavily reliant upon a regional, if not highly localized, student body.
Compass Institutions (14%): Regional public universities that are dependent on local student markets and state funding.
Community Colleges (40%): Focused locally and on the adult student market, and increasingly offering four-year degrees. Well positioned to take advantage of a workforce-focused approach to higher education.

With the higher education landscape changing immensely in the last decade, institutions are tasked with creating strategies for building sustainability. “Today’s institutions will need to focus on three key areas moving forward,” says Gaul. “This includes improving organizational culture, enhancing digital infrastructure, and developing new educational and economic models.”

Driving Cultural Change
Organizational culture comprises the values and behaviors that shape performance, and institutional leaders must identify problematic aspects of the current culture and understand what changes are needed. “Internal audits can help initiate discussions about cultural strengths and weaknesses,” says Gaul. “Self-evaluations can go beyond what HR requires every year to evaluate staff and explore in more depth how an organization handles different aspects of its culture. Leaders play a crucial role in transforming unhealthy academic cultures and must clearly articulate the organization’s purpose to motivate engagement in the cultural change process. Changing culture requires acknowledging the need for change, identifying specific behaviors and values that need to shift, and building buy-in.”

“You can’t force a culture shift; it happens through changes in underlying values and behaviors,” continues Gaul. “Creating a positive and productive environment is challenging, but it’s essential for unit performance and satisfaction. Oftentimes, getting an outside perspective can tell a story that you’re not used to reading. Through a service like Edge’s Enterprise 360 (E360) Assessment, an institution can gain a holistic look at the entire technology infrastructure and identify which processes, procedures, and policies may be standing in the way of scalability and survival. Through business process modeling and conversations with an entire team, we can tell a story that allows us to make informed suggestions on ways to improve your operation and digital transformation readiness.”

The E360 Assessment looks at every moving part of an institution’s technology ecosystem and pinpoints which medium and long-term changes could have the biggest impact along the digital transformation journey. “Edge can help the transition by providing the support of thought leaders and a research-driven approach to change management that puts people at the center of the process.  While a change is a thing you do, a transition is a psychological step that everyone involved makes together, and we can help guide an institution throughout this evolution.”

“Today’s institutions will need to focus on three key areas moving forward. This includes improving organizational culture, enhancing digital infrastructure, and developing new educational and economic models.”

— Joshua Gaul
AVP and Chief Digital Learning Officer, Edge

The Need for Digital Infrastructure
In the 21st century, sustainability relies heavily upon technology as robust digital infrastructure enables higher education institutions to adapt to evolving challenges, enhance learning experiences, and maintain operational continuity. “Steps taken outside the walls of an organization’s data center can lead to greater strength and sustainability on the academic side as well,” explains Gaul. “To start, it’s important to monitor and measure digital system performance and track those key metrics like conversion rates, customer satisfaction scores, student surveys, and lead times to identify areas for improvement in your digital transformation journey. Each institution should explore how their tools are supporting them and evaluate current processes and technologies to determine which changes will have the biggest impact on their business operations.”

During an evaluation, institutions often find the tools and technologies they are using either overlap or are not being used effectively. “Oftentimes, schools find they have three different tools doing the same thing or are only using 30% of a tool’s capabilities,” says Gaul. “We want to help institutions invest in essential tools and systems and implement necessary tools like an enterprise resource planning (ERP) system, customer relationship management (CRM) system, or a project management system that can help them compete in a digital age. In driving digital transformation at your organization, you will need to train employees on these new systems and tools and develop a strong organizational change management strategy that includes various training methods to ensure employee adoption and productivity.”

“Building a service management framework is key to providing system support and identifying and managing incidents effectively. In addition, an institution should implement robust security measures and prioritize data security by implementing proper protocols and authentication measures to protect against cyber risks and data breaches. Digital transformation is an ongoing process that requires continuous improvement, investment, and adaptation, and Edge can provide support every step of the way.”

Adopting New Education Models
In the past, the college campus was a physical place where a student entered after graduating high school and left when a degree was completed. Over the years, the higher education experience has evolved and the parameters have changed. “The fiscal and demographic realities of today call for the development of pathways and services for learners throughout their lifetime, beyond just those four years,” says Gaul. “This doesn’t mean institutions must discard all the elements of their legacy model, but they will need to add new solutions and innovations. We look at a future where higher education has more flexible pathways and an immersive hybrid option may become more common. By mixing and matching in-person and online courses, students have more flexibility to pursue work, internships, co-ops, or undergraduate research off campus.”

“An immersive hybrid approach can turn the standard model of the student life cycle into something more progressive and innovative,” continues Gaul. “We will begin to see new credentials alongside legacy degrees: industry-recognized certificates that give students on-ramps into jobs before they earn a degree. By asking for input from industry leaders about the key credentials and skill sets they most desire, colleges and universities can help prepare their students more effectively.”

“Some schools are also offering a shortened track to a degree. For example, the University of Minnesota Rochester offers a year-round, two-and-a-half-year bachelor’s degree program in health sciences in partnership with Google Cloud. Every student is assigned a coach, as well as a mentor from the Mayo Clinic, and gains research experience, a paid internship, and a digital portfolio to track their learning. This model can help cut down costs for students, while offering the same number of courses and credits in a condensed period of time, helping them get out into the workforce sooner with real-world experience and expertise.”

Another growing trend seen in the higher education community is partnerships between colleges and universities and their local community colleges. Together they build programs where students explore academic options and complete core courses at the community college, and then transfer to the state school for their junior year. “Lifelong learning is redefining higher education as a platform for continual learning, and new approaches to education, like a seamless transfer program, give students greater flexibility and provide institutions with creative ways to improve their retention,” says Gaul. “To maintain relevancy and a competitive advantage, more institutions need to be open to building quality competency-based education courses and providing students with ongoing opportunities to gain business-ready skills.”

Gaul reminds institutions that no one education model will be the magic bullet for solving the higher education fiscal crisis, but diversifying revenue streams and academic models can begin to shift the legacy economic model for many campuses. “Higher education was already changing long before the pandemic and how students viewed education was evolving too. Institutions that were already prepared to innovate survived and are flourishing, while those that did not shift their model likely closed.”

“The faculty and students are an integral part of the transition; they are the lifeblood of your institution,” adds Gaul. “You must take their perspectives into account when pushing these large-scale changes. Academics should always lead technology; technology should not drive the academic mission. Edge’s team of experts is here to help you drive digital transformation and provide insight into the complicated nature of the business behind higher education so your organization not only survives, but can stand out and thrive for many years to come.”

Looking to enable digital transformation and streamline efficiencies within your institution? Learn how the E360 Assessment can provide a holistic view of your current-state technology program and key steps for evolving toward a more integrated and agile future. Visit njedge.net/solutions-overview/digital-transformation.

View Article in View From The Edge Magazine

The post Online Program Success through Strategic Innovation appeared first on NJEdge Inc.

Thursday, 03. October 2024

EdgeSecure

Leveraging Higher Education Reference Models to Align IT Strategies with Institutional Goals


Designed to create a structured approach to understanding and improving various aspects of higher education institutions, the Higher Education Reference Models (HERM) provide standardized business and data architectures in areas such as administration, academic processes, and technology integration. These models share best practices that can help improve the value and efficiency of an organization’s architecture teams and facilitate collaboration and the exchange of architectural knowledge within the sector. Originally launched in 2016, the models are curated and managed by the Council of Australasian University Directors of Information Technology (CAUDIT) HERM Working Group.

The Origin of the Higher Education Reference Models
A peer of international associations such as EDUCAUSE, UCISA, and EUNIS, CAUDIT brings together universities, further education, and research institutions from across Australasia and the South Pacific and helps these organizations access services that foster leadership, collaboration, and best practice. “The origin of the CAUDIT Enterprise Architecture Community of Practice began with the strategy and architecture teams at Charles Sturt University, who received a grant to develop an enterprise architecture from what was then DEETYA, the Australian Government’s Department of Employment, Education, Training & Youth Affairs,” explains Nigel Foxwell, Manager Enterprise Architecture, Strategy, and Delivery, Technology Solutions, James Cook University. “The grant was extended on the condition that the outcomes of the work were shared back to the higher education community and CAUDIT was nominated as the overarching body to receive and coordinate that sharing.”

This nomination led to the inaugural CAUDIT Enterprise Architecture in Higher Education Symposium, held at Charles Sturt University’s Bathurst campus in November 2006. The event seeded the CAUDIT Enterprise Architecture Community of Practice, with the Symposium becoming the established annual community meeting, held in-person through 2019, and held online since 2020. “Several CAUDIT member universities had undertaken consultancy engagements with a leading consultancy known as enterprisearchitects.com, now named FromHereOn (FHO),” says Foxwell. “Over the years, FHO attended and contributed to the annual CAUDIT Enterprise Architecture in Higher Education Symposium events. Over the course of a decade, FHO began to see consistency emerging in the anchor models they were creating through their work with many different university clients in Australia and internationally. Working with Anne Kealley, then CEO of CAUDIT, these models were shared with CAUDIT in 2016, to be prepared for release as the first version of the HERM.  We are grateful that Greg Sawyer, current CEO of CAUDIT, continues to provide amazing leadership support for the HERM.”

Receiving the FHO models catalyzed the formalization of the CAUDIT Enterprise Architecture Community of Practice as a formal group under the wider CAUDIT structure and the first meetings of the Higher Education Reference Models Working Group (HERM-WG). The HERM-WG is co-chaired by Nigel Foxwell and jeff kennedy, Enterprise Architecture Manager, Organisational Performance & Improvement, The University of Auckland (yes, this is the proper appearance of jeff’s name). “The HERM was then delivered into the care and custodianship of the HERM-WG under the CAUDIT Enterprise Architecture Community of Practice, and work began in earnest, in partnership between FHO and the HERM-WG, to prepare the initial models for release as Version 1 of the HERM,” explains Foxwell. “The first-release HERM comprised the Business Capability Model (BCM) and the Data Reference Model (DRM), both supplied as posters supported by spartan descriptive catalogs. The ongoing work of the HERM-WG, which meets weekly for ninety-minute workshops, has seen the coverage, completeness, and internationalization of the HERM mature and progress over five version releases. Today, the HERM is being used by more than a thousand institutions throughout the world.”

Aligning IT Strategies with Institutional Goals
The HERM serves as a guide for institutions to benchmark their practices, implement improvements, and align with industry standards and emerging trends in higher education. These resources can act as a foundation to fast-track the development of business and data architecture, as a benchmark to identify similarities and unique features within the institution, and as a tool to effectively communicate and engage with stakeholders. “The adoption of these models can provide a common language that bridges and unites different stakeholders and perspectives within and between institutions and jurisdictions,” says kennedy. “The HERM can provide a ready-to-use framework that aids sensemaking in the complex environment of higher education and can connect strategy with execution, particularly through business capabilities serving as the missing link.”

Although the HERM is cared for and extended by groups of enterprise architects, and Enterprise Architecture practices are typically located within IT, the HERM is a fully featured enterprise reference model that can be applied throughout an organization—it’s not just an “IT Thing.” For example, the introduction of the Business Model Canvas provided tools that support scenario planning and strategic modeling. “Mapping traditional IT services and initiatives to business outcomes and broader institutional goals can be challenging,” says Foxwell. “The HERM can assist with achieving those mappings and helping to better understand and communicate the value of IT in the higher-education context.”

“From an IT-specific perspective, business capabilities play a crucial translation role in helping to understand the connections between investment in technology services, the digital products those services enable, the business capabilities making use of those digital products, and the resulting quality of customer experience,” continues Foxwell. “The HERM represents a consistent view of an institution that is highly stable over a long period of time, withstanding organizational restructuring, continual process improvement, and technological change. Other IT-with-the-business connections can be established by techniques such as placing storytelling overlays on the business capability model. This could include suitability for robotic process automation, technology fitness and business value of application portfolio segments, and readiness to benefit from artificial intelligence (AI).”

“From an IT-specific perspective, business capabilities play a crucial translation role in helping to understand the connections between investment in technology services, the digital products those services enable, the business capabilities making use of those digital products, and the resulting quality of customer experience. The HERM represents a consistent view of an institution that is highly stable over a long period of time, withstanding organizational restructuring, continual process improvement, and technological change. Other IT-with-the-business connections can be established by techniques such as placing storytelling overlays on the business capability model. This could include suitability for robotic process automation, technology fitness and business value of application portfolio segments, and readiness to benefit from artificial intelligence.”

— Nigel Foxwell
Manager Enterprise Architecture, Strategy, and Delivery, Technology Solutions, James Cook University

Standardizing Business Architectures
Through a business continuity lens, HERM can provide structured frameworks and best practices that ensure the resilience and stability of institutional operations. By defining and standardizing key processes and functions, using these reference models can create consistency, making it easier to prioritize critical functions and resources, manage changes, and recover from disruptions. “Using the HERM as a guidepost can help an institution understand the footprint of significant business-change initiatives, and project that footprint upon the DRM to understand the manifest of data integrations required and any attendant data-quality issues that might need to be addressed,” explains kennedy. “Referencing this framework can also help understand and communicate stories as diverse as strategic objectives mapping, solving business process issues, directing CapEx investments, and assisting with public records management.”

The value of the HERM continues to be enriched by the global higher education enterprise architecture community and special interest groups including CAUDIT, UCISA, European University Information Systems (EUNIS), and EDUCAUSE members through the Enterprise Business and Technical Architects (Itana) community group. “We have seen the HERM in action and how these frameworks can help drive strategic success,” says Foxwell. “For example, a research-intensive university used a health-assessment overlay on the HERM business capability model to inform their strategic decision-making and investment planning for their future research ecosystem. For another university setting out on a Robotic Process Automation journey, the evaluation of the workload types in each HERM business capability underpinned the areas of interest analysis and implementation-governance framework.”

“Mapping the HERM business capability footprint of a transformation program to reimagine the shape and delivery of its curriculum and signature pedagogies enabled another university to focus its investment in areas that contributed the greatest benefit to achieving the desired outcomes,” adds kennedy. “Numerous institutions have utilized the DRM to conduct a rapid startup of data governance and data management initiatives and used it as the foundation for Business Intelligence initiatives. These models have also been used to undertake enterprise-wide assessments of data assets to provide valuable insights to new senior IT leadership. In another instance, an institution used the HERM business capability model to analyze the likely effects of uplifting the digital skills of their people and enabled them to pinpoint where the effects of the necessary investment would show up and articulate the resulting benefits in terms of customer experience, stronger learning outcomes, and more-impactful research.”

Foxwell says the new Application Reference Model was partly developed and tested in a live scenario where a collaborating university was building a multi-year application roadmap for their research division. “This is a great example of how the models can be deployed to support and guide strategic investment in specific areas. The university has since extended this approach through combining application and business capability models to identify strategic investments in other areas such as human resources, core curriculum delivery, and student support functions.”
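
One lightweight way to picture the overlay technique running through these examples is to roll application fitness scores up to the business capabilities they support and flag where investment is needed. The sketch below is a minimal illustration only; the capability names, applications, and scores are invented and are not HERM content.

```python
from statistics import mean

# Invented data: application -> (supported capability, fitness score 0-10).
APP_HEALTH = {
    "Grants System": ("Research Management", 3),
    "Lab Booking": ("Research Management", 6),
    "LMS": ("Curriculum Delivery", 8),
}

def capability_heatmap(app_health):
    """Average application fitness per business capability."""
    by_capability = {}
    for capability, score in app_health.values():
        by_capability.setdefault(capability, []).append(score)
    return {cap: mean(scores) for cap, scores in by_capability.items()}

if __name__ == "__main__":
    heatmap = capability_heatmap(APP_HEALTH)
    for cap, avg in sorted(heatmap.items(), key=lambda kv: kv[1]):
        flag = "invest" if avg < 5 else "maintain"
        print(f"{cap}: {avg:.1f} ({flag})")
```

In practice such overlays are built inside an EA tool against the full capability model, but the rollup logic is essentially this simple.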

Promoting Global Collaboration
The HERM continues to evolve through the commitment of a growing network of constructive and enduring international collaborations dedicated to the ongoing development and use of the models. “The earliest of these collaborations can be traced back to 2013 and the connections established between the CAUDIT community and the EDUCAUSE Itana group,” explains Foxwell. “Later, the global pandemic forced the EUNIS Congress 2020 to move online, affording the opportunity for us to participate in that event and the privilege to meet for the first time with colleagues in the UCISA and EUNIS communities. The relationships formed there with the leadership of the UCISA Enterprise Architecture Special Interest Group have proven to be valuable and enduring, with the CAUDIT and UCISA teams having now held more than fifty online meetings and workshops together. CAUDIT and UCISA also partnered to co-host the global launch of the HERM alongside its Version 2.6 release in November 2021.”

CAUDIT has established joint statements of collaborative intent to further the HERM together with EDUCAUSE, UCISA, EUNIS, and Higher Education Information Technology South Africa (HEITSA). “The heartbeat of the international collaborations is provided by each association’s own local Enterprise Architecture community, and the approaches differ for each group,” says kennedy. “The HERM-WG partners with EDUCAUSE on weekly Itana core and Itana Business Architecture Working Group sessions. The working group also connects with UCISA through monthly and on-demand meetings and by sharing access to our collaboration environments. In addition, the HERM-WG regularly meets with EUNIS and holds workshops on topics of mutual interest. Every suggestion and every piece of feedback we receive is logged and triaged through the HERM-WG assessment and evaluation process, and this ensures global perspectives and diverse viewpoints are held up and considered carefully against the HERM.”

Foxwell says international collaboration has improved and strengthened the HERM immeasurably and has led to a richer exchange of best practices, enhanced innovation, and more effective solutions tailored to diverse educational environments. “The collective expertise, curiosity, and generosity of the many people engaging with the HERM and contributing their suggestions, provocations, and thoughtful scrutiny about its applied use and potential futures is nothing short of amazing. Weaving together diverse viewpoints has helped foster a truly global and widely road-tested HERM that is ready for use and able to facilitate day-to-day enterprise architecture activities just as readily as it can serve as the foundation for generational digital transformation. The HERM has therefore been tested and developed across many different regions of the world and has proven to be useful and applicable for all.”

“Through the coordinated efforts of our EUNIS colleagues, the HERM has been translated into several languages other than English, including Catalan, Croatian, Finnish, French, German, Norwegian, and Latvian,” continues Foxwell. “The effort and academic energy required to translate the reference models into another language have also provided substantial benefits to the clarity and global consumability of the HERM. In addition, joining forces has made the HERM distribution available for each higher-education association to provide direct natural syndicated access to its members. We’ve also seen HERM-inspired papers published at The Open Group, including Spanish Higher Education Enterprise Architecture Initiative and Capability Map and the School Reference Model.”

Adapting to Meet Strategic Initiatives
Since the HERM models are comprehensive, contemporary, and curated for use as a reference architecture, the framework does not require continual updating in response to technological change or strategic-direction shifts. “Even through changes in the higher education sector and various curriculum delivery models, the HERM has not needed major revisions to cope,” shares kennedy. “This includes different learning modes like massive open online courses (MOOCs), multi-modal teaching, fully online, and HyFlex, as well as the emergence of consumer-facing artificial intelligence. The HERM is adaptable to various university strategic options, from hyper-personalized to large-scale inclusivity. For example, the HERM Business Capability Model has been used to represent the signature differences in many important business capabilities for each of the Gartner 4U models of universities in the future. The Business Capability Models are relatively stable devices, and if somebody from the University of Auckland in 1883 or The University of Glasgow in 1451, for example, encountered the models back in their own time, then this lens would likely be perfectly relevant!”

“The Recipe Cards introduced with Version 3.0.0 of the HERM provided the ability to express institution-specific motivations, business outcomes, and future institutional goals using powerful enterprise-storytelling techniques,” continues kennedy. “The selection of recipe cards included with the HERM distribution features Diversity, Equity, Inclusion, & Belonging and Student Success, each of which will be expressed and achieved in quite different ways from one institution to another. The Business Model Canvas is more playful, adaptable, and flexible than other elements of the reference model and is designed to support scenario-based planning and the evaluation of what-if scenarios. It can be adapted to represent the stakeholders, resources, and drivers that surround different types of institutions.”

Later this year, a Technology Reference Model (TRM) will be introduced with the release of HERM Version 3.1.0. “The TRM is probably the most volatile domain across the architecture stack,” explains Foxwell. “The TRM is broadly positioned as a structure of technology services, rather than of devices, things, or specific implementations, and it is expected to adjust over time in response to market shifts, technology convergence, and the ongoing dynamic and rapid shifts in new-and-emerging technologies.”

Meeting Diverse Needs
To ensure the models remain relevant and effective for institutions with diverse needs, their usage is continually tested in the field and questions about their content, structure, or application are funneled through the HERM working group. “We are able to review any feedback to identify necessary changes and improvements, but also ratify the models where changes are not necessarily in line with the global nature of the reference models,” explains Foxwell. “The working group is diligent in ensuring that any feedback we take on as change proposals is supported by relevant references to provide context for its scope and content. It is also important to ensure that, as we improve the models, we maintain the best level of detail for our content and provide a consistent vocabulary within and across the models. The level of detail is important as increasing specificity can rapidly lead to a decrease in usefulness, and if our vocabulary is inconsistent, the value of our definitions also decreases rapidly.”

“The Recipe Cards allow different lenses to be applied over and above the Business Model Canvas and can communicate motivations or outcomes of importance to individual institutions,” continues Foxwell. “The HERM is designed to ensure consistency in how its structures are leveled, so that completeness and usability stay in optimum balance with theoretical perfection. The HERM-WG maintains detailed supporting references for all change proposals as part of its responsibility for handling all feedback and suggestions.”

“The adoption of these models can provide a common language that bridges and unites different stakeholders and perspectives within and between institutions and jurisdictions. The HERM can provide a ready-to-use framework that aids sensemaking in the complex environment of higher education and can connect strategy with execution, particularly through business capabilities serving as the missing link.”

— jeff kennedy
Enterprise Architecture Manager, Organisational Performance & Improvement, The University of Auckland

Imagining the Future
In thinking about the future, kennedy says there are many new features being considered for future releases. “We are looking forward to the imminent release of V3.1, which will include the Technology Reference Model (TRM). This is an important companion to the Application Reference Model, which is also getting important improvements following feedback. Our next major release, Version 4.0.0, will introduce additional new features and will take the next steps in migrating the HERM distribution to an open-access platform that ensures fully open and equitable access to the HERM. Our community feedback also tells us that the concept of value streams is gaining prominence in strategic discussions. By providing a view on value streams in the context of the HERM, we hope to demonstrate another way the business capabilities can be organized to address specific areas of focus for HE institutions.”

“The concept of a service reference model is also gaining significant traction,” continues kennedy. “Services have been a focus for some Australian, New Zealand, and UK institutions as they look to define their value to students and faculty. It is a tricky area, with a long history of service management terminology tied very closely to IT services, but there is a broader context around the growing interest in Enterprise Service Management that may be closely aligned to the HERM.”

Foxwell adds, “Lastly, with the explosion of AI everywhere and the ease of access to these tools, the working group can imagine a future where the next most valuable accelerator to higher education is a tool that can be interrogated using the reference models as its base model. It would be amazing for architects of all kinds, amongst others, to be able to provide an institution’s list of applications and have the HERMbot allocate them accurately to the appropriate application and technology HERM items. Or perhaps provide a list of processes and their brief descriptions while identifying the range of associated business capabilities. The opportunities are extensive, and we may already be behind some existing EA tool vendors’ thinking in this area. There is much promise in applying artificial intelligence to routine, everyday Enterprise Architecture activities.”
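
Purely as a thought experiment on how such a HERMbot might be wired up, the sketch below builds a classification prompt from a capability catalog and guards against answers outside it. Everything here is hypothetical: call_llm is a stand-in for whatever model API an institution would actually use (not a real library call), and the capability names are invented.

```python
from typing import Callable

def build_prompt(capabilities: list[str], app_description: str) -> str:
    """Assemble a prompt asking a model to pick one HERM capability."""
    catalog = "\n".join(f"- {c}" for c in capabilities)
    return (
        "Given these HERM business capabilities:\n"
        f"{catalog}\n\n"
        "Name the single capability that best fits this application:\n"
        f"{app_description}\n"
    )

def allocate(app_description: str,
             capabilities: list[str],
             call_llm: Callable[[str], str]) -> str:
    """Classify one application; route off-catalog answers to human review."""
    answer = call_llm(build_prompt(capabilities, app_description)).strip()
    return answer if answer in capabilities else "needs manual review"

if __name__ == "__main__":
    caps = ["Curriculum Management", "Research Management"]  # invented names
    stub_llm = lambda prompt: "Curriculum Management"        # stand-in model
    print(allocate("Timetabling system for courses", caps, stub_llm))
```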

“We continue to see great uptake of the HERM in Europe and there are many active users there. The EUNIS EA community has been particularly active and supportive and continues to educate and advocate around the models. They will also demonstrate their collaboration, advocacy, and advanced use of the HERM at their second EA Week workshop event, held this year at The University of Southern Denmark. Building awareness is key and we look forward to continuing to connect with new institutions and share our excitement about the opportunities the HERM can help unlock.”

How to Get Started
Edge will continue to promote and employ the HERM and work with members to increase awareness and adoption, and show how to leverage the frameworks in strategic and operational analysis, planning, and decision-making. “We are very excited about the opportunities that will arise from Edge’s support of the HERM and helping extend our reach to new institutions,” shares kennedy. “Promoting the HERM through EdgeMarket is terrific and we hope this will support institutions of all sizes to leverage EA thinking in their strategic and operational objectives.”

For institutions interested in using the HERM, getting started could be as simple as mapping a strategy document to the BCM or using the DRM as a map for data-governance responsibilities. “Full implementation is not necessary to take advantage of these models,” says kennedy. “For those using an EA tool, the models should be able to be imported very quickly. Otherwise, you can start by creating a set of spreadsheets to capture basic information using the catalogs as starting points. For example, the open-source Essential Architecture Services has a launchpad offering that contains the HERM as a ready-to-use, out-of-the-box set of artifacts. Adopting the HERM as part of an EA Tools implementation will reduce the time-to-value and can provide immediate, cost-effective benefits.”
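
If you go the spreadsheet route, a few lines of scripting can turn exported catalogs into a working capability-to-application index. This is a minimal sketch under assumed file layouts: the file names and columns (capabilities.csv with capability_id and capability_name; applications.csv with app_name and capability_id) are illustrative, since the actual HERM catalogs define their own fields.

```python
import csv
from collections import defaultdict

def load_capability_index(path):
    """Map capability_id -> capability_name from an exported catalog."""
    with open(path, newline="", encoding="utf-8") as f:
        return {row["capability_id"]: row["capability_name"]
                for row in csv.DictReader(f)}

def applications_by_capability(path):
    """Group institutional applications under the capability they support."""
    mapping = defaultdict(list)
    with open(path, newline="", encoding="utf-8") as f:
        for row in csv.DictReader(f):
            mapping[row["capability_id"]].append(row["app_name"])
    return mapping

if __name__ == "__main__":
    caps = load_capability_index("capabilities.csv")
    apps = applications_by_capability("applications.csv")
    for cap_id, name in caps.items():
        mapped = ", ".join(apps.get(cap_id, [])) or "(no applications mapped)"
        print(f"{cap_id} {name}: {mapped}")
```

Even this simple index surfaces capabilities with no supporting applications, which is often the first useful finding from a HERM mapping exercise.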

“If you want to learn more about the frameworks, we encourage you to get hold of the HERM and explore what’s in the box,” continues kennedy. “You can also connect with or form a local Enterprise Architecture in Higher Education (EAinHE) community of practice. We commend the Enterprise Architecture communities that are established and thriving within UCISA, EUNIS, EDUCAUSE/Itana, and CAUDIT. The HERM Working Group welcomes and invites thoughts, feedback, and suggestions about any changes that will make the HERM more valuable to the institutions that use these resources to help standardize quality, enhance curriculum development, and improve student outcomes through best practices and strategic alignment.”

To learn more about the Higher Education Reference Models and how they provide standardized business and data architectures, visit https://library.educause.edu/resources/2021/9/the-higher-education-reference-models.

To contact the HERM Working Group, email herm-feedback@googlegroups.com.

View Article in View From The Edge Magazine

The post Leveraging Higher Education Reference Models to Align IT Strategies with Institutional Goals appeared first on NJEdge Inc.


Support & Services for Launching Your Nonprofit Venture


Give your institution’s new nonprofit entity, foundation, or small business a successful start with expert guidance from Edge’s full scope of incubator services and hosting.

Launching a new venture is both exciting and demanding.
Where does one begin?

There’s a tremendous amount of ‘heavy lifting’ to ensure your new venture gets off to a proper start. Perhaps you need help drafting your initial bylaws or requesting a Taxpayer Identification Number from the IRS and the State of New Jersey. You may also need assistance drafting, reviewing, and submitting IRS form 1023, or seeking recognition of nonprofit status and issuing payroll.

Edge is ready to help with a full scope of incubator services and hosting, outlined in the sections that follow.

Services are available a la carte, packaged within specific specialty areas, or as a full-service engagement. We invite you to discover the variety of services available to you on the following pages.

“Edge’s expert team has launched the ventures of numerous start-up entities throughout the region. From creating academic and workforce opportunities within the FinTech industry to Esports and beyond, you can have the confidence knowing your venture will succeed with Edge’s support and guidance helping you every step of the way.”

— Caitlin Kaplan
Vice President of Finance, Administration, Legal and Chief Financial Officer, Edge

New Entity Formation
Draft initial bylaws
Draft Certificate of Incorporation and submit final documents
Request Taxpayer Identification Number from the IRS as well as the State of NJ
Draft, review, and submit IRS Form 1023 to seek recognition of nonprofit status
Draft and review business plan and pro forma financials as part of the Form 1023
File for Business Registration Certificate with the New Jersey Department of the Treasury
Register with the New Jersey Division of Consumer Affairs (DCA)
File forms required for a New Jersey Certificate of Employee Information Report (CEIR)
Obtain State of New Jersey sales tax exemption
Establish banking relationship with Edge’s current banking partner or another bank the entity prefers

Human Resources Assistance
Hiring activities: writing job descriptions, posting job descriptions, conducting background checks, organizing interviews
Issue payroll to employees
File employment-related taxes
Issue W-2s

“NJ Transfer is not just a place to find course equivalencies, but is truly an academic planning tool that allows students to maximize their transfer credits, keeps them on track to achieve their dreams, and helps them make the most of their higher education experience.”

— Thea Olsen
Executive Director, NJ Transfer

Venture Opportunity
To help make the college transfer process a success, NJ Transfer helps students who are currently enrolled or plan to enroll in a community college before transferring to a New Jersey four-year college or university. This online resource provides information about transferring academic credits, choosing courses for each degree program, accessing application requirements, and finding transfer contacts. NJ Transfer was looking to drive this initiative forward by rebranding the website and print materials and improving the user experience.

Partnering with Edge
In April 2020, Edge signed a memorandum of understanding with the New Jersey Presidents’ Council to assume management and operation of NJ Transfer. As part of this agreement, Edge provides support, leadership, business services, and strategy development and planning that helps advance the value proposition of the NJ Transfer organization. Now a convenient one-stop resource, NJ Transfer lets students begin to cultivate a support network of staff at both the two-year and four-year level and make more informed decisions about their future.

Venture Opportunity
With a goal of creating new academic and workforce development opportunities within the esports industry and establishing New Jersey as a prominent hub for esports innovation, Stockton University created the Esports Innovation Center (EIC) with support from the New Jersey Economic Development Authority (NJEDA). Dedicated to helping shape the future of the esports landscape and attract top talent to the state, Stockton University was seeking support to help provide students with unique competitive gaming opportunities, enhance academic offerings, and foster connections to private sector businesses within the local esports ecosystem.

Partnering with Edge
To help the EIC in their mission, Stockton University partnered with Edge for strategic and business planning guidance, startup services for establishment of the non-profit (501c3), and back-office support for the EIC that includes legal, human resources, and financial services. In addition, the EIC uses esports connectivity via EdgeNet to help deliver a superior esports experience. Edge’s high-performance optical fiber network offers high bandwidth, low latency, and highly available connections, with direct peering connections to all major gaming platforms.

Operations Guidance
- Track all revenue and expenses in accounting software
- Provide monthly financial statements to the ED of the entity
- Draft and review standard membership and donation forms
- Help with customer service and collections activities
- Process vendor bills and payments
- Issue IRS Form 1099s to applicable vendors
- Help find external auditors to perform year-end audits when required
- Provide technical support for domain name registrations, creating and maintaining a website using WordPress, and hosting of email and other productivity tools
- Assist with writing of grant proposals

Provide Benefits
- Issue payroll to employees
- File employment-related taxes
- Issue W-2s
- Medical, dental, and vision
- Ancillary benefits: long-term disability, life insurance, short-term disability
- Paid time off
- 401(k) plan

“In the past, we had to do RFPs through other member institutions, and it was always second hand. As a part of Edge, we can do that through them. The Edge relationship allows VALE to do more things and we have greater potential; including thinking about going statewide sooner and expanding beyond just purchasing.”

— Joe Toth, J.D.
Director of Library Services, Chair, VALE Executive Committee, Stockton University

 

Venture Opportunity
To advance New Jersey’s leadership in the online gaming and financial technology (FinTech) industries, New Jersey City University (NJCU) created the Fintech & Sports Wagering (FTSW) Innovation Center in partnership with the New Jersey Economic Development Authority (NJEDA). The goal was to bring together a community of small, medium, and large companies, startups, entrepreneurs, venture capital and other investors, university and high school students, faculty, and government.

Partnering with Edge
In 2021, Edge signed a memorandum of understanding with NJCU to provide support, leadership, human resources, business services, and strategy development to help the FTSW Innovation Center take shape. Now host to a variety of activities, the Innovation Center fosters connections, provides learning and workforce development opportunities, connects employers to talent, and helps launch new businesses in these spaces.

Venture Opportunity
The Virtual Academic Library Environment (VALE), a consortium of 50 New Jersey college and university libraries, is dedicated to furthering excellence in learning and research through innovative and collaborative approaches to information resources and services. In particular, VALE aims to serve New Jersey institutions as a facilitator for information resources and collections and by providing seamless access to open educational resources (OER). With a goal of becoming a clearinghouse for shared electronic academic information, VALE needed help as the organization grew to meet the needs of faculty and students.

Partnering with Edge
In January 2015, VALE connected with Edge during this growth process, allowing the organization to pursue projects it legally couldn’t in the past. Through this partnership, Edge helped VALE obtain their 501(c)(3) status, and currently employs their program director, reviews contracts, provides payroll and benefits, bills vendors, and invoices their members.

View Article in View From The Edge Magazine

Contact us for expert assistance getting your institution’s new nonprofit entity, foundation, or small business up and running.

MICHELLE FERRARO
michelle.ferraro@njedge.net
732.740.5092

ERIN BRINK
erin.brink@njedge.net
973.943.8088

The post Support & Services for Launching Your Nonprofit Venture appeared first on NJEdge Inc.


Smart Building Showcased at Edge’s Executive Networking Reception

The post Smart Building Showcased at Edge’s Executive Networking Reception appeared first on NJEdge Inc.


Leaders from institutions throughout New York City and beyond joined together on June 20, 2024 to enjoy refreshments and community building at Edge’s popular Higher Education Executive Networking Reception. Partnering with Cisco and CBTS to host this year’s event, Edge welcomed attendees to engage in consortium networking and experience new technologies, while giving them an opportunity to share their top priorities, challenges, and aspirations. Cisco gave participants a tour of their state-of-the-art PENN 1 location in NYC and provided an in-person look at the advantages of incorporating advanced technologies into modern building systems.

Creating Innovative Hybrid Work Environments

With a goal of reimagining in-office facilities, Cisco created a Smart Building solution that supports a hybrid work environment, streamlines building management, and reduces operational costs. With technology incorporated into the design instead of an add-on, the workspace promotes collaboration and an enhanced user experience, while catering to diverse work and learning styles. Employees can check meeting room availability, adjust room temperature, lights, and blinds, and reduce outside noise with ambient sound.

Created by Cisco, the software-based system for room control allows users to adapt, monitor, and maintain conference rooms and classrooms to better fit their needs. IoT sensors, AI-driven analytics, and virtual collaboration tools all allow seamless transitions between in-person and remote interactions and showcase what is possible when creating innovative hybrid work and learning environments. For member institutions looking to enhance digital learning and collaboration, Cisco’s hybrid learning and collaboration technology is available through EdgeMarket.

Forming New Connections
This year’s Higher Education Executive Networking event not only allowed community members and leaders to network with each other, but gave Edge and historical members from New Jersey a chance to hear how they can better support the individual missions of each organization. Edge also formed new connections with attendees from New York and was able to speak in person to a new Edge member, the Metropolitan College of New York (MCNY). As a user of EdgeNet, MCNY shared positive feedback regarding the network and having connectivity in their area.

The network node addition occurred in 2019, as Edge continued its regional network expansion. The high performance networking node was established at the historic 32 Avenue of the Americas (32 AoA) facility in New York City, a worldwide hub for high performance optical fiber networking. This location allows Edge to deliver a higher quality of service to a wider regional member base and to continue advancing research and education networking capabilities for member institutions.

As the need for digital learning and collaboration solutions continues to grow within higher education, Edge plans to organize more opportunities for thought leaders to see firsthand how user-centric technology can be integrated successfully into workspaces and classrooms. If you are interested in learning more about future executive networking events and tours, please contact Michelle Ferraro at michelle.ferraro@njedge.net.

About Cisco’s PENN 1 Facility
Cisco’s PENN 1 office, located in Midtown Manhattan, New York City, is one of Cisco’s newly reimagined office spaces, designed to provide the ultimate hybrid work experience for employees and visitors while leveraging smart building technology to help reach Cisco’s sustainability goals and give real estate and facilities teams the operational insights needed to understand how the space is being used. The PENN 1 office was the first of its kind at Cisco and, as such, was a learning experience.

The PENN 1 office was initially created as a proof of concept, and as such, many of the technologies used had no pre-existing IT standards for their deployment. Additionally, there was a desire to reimagine the purpose of the office, which required rethinking how space was allocated. As a result, a virtual team of subject matter experts from Cisco’s IT, sales, and workplace resources organizations was formed and made responsible for designing and deploying the office, then taking the learnings to create new IT standards that could be leveraged for future offices. This effort took place through 2021 and early 2022, so design decisions made here should be viewed through the lens of the state of the art at that time. Supply chain issues also influenced some of the product choices. Cisco’s design documentation points out alternate product selections that have been used in newer offices implemented since the opening of PENN 1.

View Article in View From The Edge Magazine

The post Smart Building Showcased at Edge’s Executive Networking Reception appeared first on NJEdge Inc.


Edge’s vCISO Services to the Rescue!

The post Edge’s vCISO Services to the Rescue! appeared first on NJEdge Inc.

EdgePro solutions aim to deliver support and provide value to our members as they meet the challenges of a world driven by information and an unprecedented pace of technological change. From professional services, to staff augmentation, to end-user support applications, EdgePro is designed to help members grow, thrive, and rapidly adapt to change.

The Important Role of a vCISO
The vCISO service is designed to create actionable information security strategies and define optimum information security direction for your organization. The vCISO will provide independent and objective input to ensure that your security posture is on track, identifying areas of necessary improvement and continuing to support areas where you’re already in compliance.

vCISO services can be engaged for a few hours, on a per-project basis, or as a full-time staff augmentation. The outcome of an engagement with the vCISO would involve executive-level strategy, policy development, and process creation for immediate adoption, implementation, and operation of improvements within the organization.

Contact Edge Today!

Michelle Ferraro
Member Engagement Manager
michelle.ferraro@njedge.net
732.740.5092

Erin Brink
Member Engagement Manager
erin.brink@njedge.net
973.943.8088

An EdgePro vCISO is able to assist in any of the following areas

- Organizational Leadership
- Cybersecurity Team Development
- Direction of InfoSec Team
- InfoSec Program Management
- Ownership of Security Policy
- Employee Security Awareness Programs
- Security Framework Certifications
- Technical Contract Review
- 3rd Party & Vendor Risk Management
- Liaise with Law Enforcement & Government
- Vulnerability Management Programs
- Business Risk Management & Assessment
- IT Configuration Assessment & Audit
- Establish & Improve Security Policy, Process, & Procedure
- Establish & Improve Roles, Responsibilities, & Organization
- Establish & Improve Human Resources Security Controls
- Establish & Improve Asset & Data Management Controls
- Establish & Improve Access & Cryptographic Controls
- Establish & Improve Physical & Environmental Controls
- Establish & Improve Operations, Communications, and Incident Management Controls
- Business Continuity & Disaster Recovery Planning
- Incident Response Process Development & Management
- Board Presentations & Leadership Committee Participation
- …And a Variety of Additional InfoSec Problem Areas

View Article in View From The Edge Magazine

The post Edge’s vCISO Services to the Rescue! appeared first on NJEdge Inc.


The Power of the CIO and CISO Partnership

The post The Power of the CIO and CISO Partnership appeared first on NJEdge Inc.

Joining Stevens Institute of Technology in 2020, Tej Patel brought over fifteen years of corporate information technology and higher education experience to his new leadership role as the University’s Information Technology and Chief Information Officer (CIO). Along with leading the IT strategy for Stevens, Patel also advises cabinet members and presidents on technology investments and digital capabilities that can enhance the faculty, staff, and student experience. Prior to coming to Stevens, Patel held several leadership roles at the University of Pennsylvania, further deepening his passion for building and leading IT organizations. “When you spend eighteen years growing up at an institution, you become a part of that community, and that community becomes part of your DNA,” shares Patel. “I owe everything to Penn and the great people who supported my journey and shaped me into the leader I am today.” (Editorial Note: At the time of the publication of this magazine, Tej accepted a new role at Villanova University. Congratulations, Tej!)

Upon joining Penn in 2003, Patel managed classroom technology and then later became IT director of systems and infrastructure service at the Annenberg School for Communications. “I led their IT operations and increased the portfolio to include research, web security, and data center operations,” explains Patel. “In 2015, I became the Penn Nursing Chief Information Officer and co-chaired the roundtable for the entire university. Throughout my experience, I made a lot of friends, went through failures, learned a great deal, and formed many good relationships—all of which made the path to my role at Stevens more successful.”

Delivering High Quality Services
Aligning with Patel’s vision as CIO, the Division of Information Technology at Stevens, known as One IT, aims to empower the institution’s community by providing robust digital technology experiences and reliable, high-quality services, while establishing trusted partnerships across the campus. “My goal is to find alignment within Stevens’ strategic plan, which is Inspired by Humanity, Powered by Technology,” says Patel. “There are specific goals that also focus on research, innovation, and scholarship, as well as promoting diversity, equity, and inclusion. We take these objectives very seriously and we include institutional level priorities in our IT strategies every year.”

“Cybersecurity and data privacy have always been a top priority, and most recently, we have shifted our focus from infrastructure modernization to data and artificial intelligence,” continues Patel. “We are currently implementing certain digital capabilities to enable the business growth that Stevens is looking for over the next several years. If you ask me to summarize our mission in a few words, it would be customer centricity and delivering high quality services. And how do we measure the success of these initiatives and manage IT investment? While there are metrics that we monitor from a security, infrastructure, or budget perspective, what it really comes down to is, are we able to provide best-in-class IT services to faculty, staff, students, and alumni? If the answer is yes, then we hit our metrics, and the service levels are fully maintained at a top level.”

“Your community must have trust in you as a CIO or a CISO. There must be a culture of transparency instilled at an institution and our roles have to be held accountable for certain projects and tasks occurring on campus. We must be able to communicate openly about these activities. It’s not about ensuring that a lot of people have a seat at the table, but what can be done when someone has that seat? It’s deciding how to educate the board and cabinet members and connect the dots so we can continuously provide support and digital capabilities and business leaders can do what they do best. When you are invited to these meetings, you must be prepared to discuss what’s working well, but more importantly, what opportunities are available.”

— Tej Patel
Former Information Technology and Chief Information Officer, Stevens Institute of Technology

Promoting Alignment and Partnership
Soon after joining Stevens, Patel hired Jeremy Livingston as Chief Information Security Officer (CISO). The two had already been working together developing the Protect Stevens security program and wished to continue the forward trajectory to keep pace with the ever-changing technology on today’s campuses. “Cybersecurity is a team sport and the CIO and CISO roles have shared objectives and goals of leveraging technology to enhance the institution’s mission while ensuring robust security measures,” says Patel. “We want to make sure the security posture is implemented in such a way that we are not getting in front of the business, but are truly managing that risk. In order to do this, it’s critical to have a solid trust and understanding of both environments and to give the CISO the freedom that they require.”

“One of the biggest changes that was implemented when I joined Stevens is that Jeremy was adopted into the Audit and Risk Committee on our board,” continues Patel. “We’re able to go to the board together and discuss IT and cybersecurity as an aligned front. Having a strong CIO and CISO partnership helps make an institution more effective and successful from a security standpoint and an organization overall.”

Along with the time spent working together, Livingston says building the CIO and CISO relationship takes an alignment of their objectives. “We both have the same goal, because we know classrooms are unable to operate and instructors are unable to teach if systems are offline because of a cyberattack. At the same time, we must prevent security controls from being too onerous or in the way of those objectives. We have a shared understanding of the risk tolerance working at an institution and we try to take innovative approaches to minimizing the risks that could disrupt the organization’s business objectives.”

As partners, Patel and Livingston have a collaborative decision-making process that strongly supports honest discussions. “Having open communication has helped us build trust and we both want to land on the best idea or solution,” says Livingston. “Tej has also done a great job on resource allocation and the balancing of these resources, including money, time, effort, and talent. This has been a tremendous strength of ours and keeps us on a successful path forward.”

“Your community must have trust in you as a CIO or a CISO,” adds Patel. “There must be a culture of transparency instilled at an institution and our roles have to be held accountable for certain projects and tasks occurring on campus. We must be able to communicate openly about these activities. It’s not about ensuring that a lot of people have a seat at the table, but what can be done when someone has that seat? It’s deciding how to educate the board and cabinet members and connect the dots so we can continuously provide support and digital capabilities and business leaders can do what they do best. When you are invited to these meetings, you must be prepared to discuss what’s working well, but more importantly, what opportunities are available.”

Encouraging Open Conversations
To be the most effective CIO and CISO, Patel says you must spend time broadening your end-to-end understanding of several aspects within the organization. “First, it’s crucial to be educated about the business of the institution and the overarching objectives. Second, you must be able to identify your own risk appetite, and third, you need to understand where leadership is coming from and how those external factors impact your own cyber strategy. With this comprehensive approach, you can sit at the table and contribute to the success of that institution.”

Livingston adds, “One of our strengths here at Stevens is we are open and honest about not only our successes, but the things we could do better; identifying the areas of opportunity. That integrity goes all the way to the board and president. They hold us accountable, and we have independent audits and penetration testing. Our team receives validation that what we’re doing is working and meeting expectations.”

Last year, Stevens launched the State of Cybersecurity event, which invites faculty, staff, and students to hear directly from Patel and Livingston and their teams about security operations. “We provide an intimate level of access to some of the metrics that Jeremy and his team manage and monitor, and we talk openly about the areas we are excelling in and the areas where we could use their help,” shares Patel. “Along with a top-down approach, we must also take a bottom-up approach, since the end user and their experience are of high importance.”

“We both have the same goal, because we know classrooms are unable to operate and instructors are unable to teach if systems are offline because of a cyberattack. At the same time, we must prevent security controls from being too onerous or in the way of those objectives. We have a shared understanding of the risk tolerance working at an institution and we try to take innovative approaches to minimizing the risks that could disrupt the organization’s business objectives.”

— Jeremy Livingston
Chief Information Security Officer, Stevens Institute of Technology

Improving Cybersecurity Outcomes
At Stevens, there is a strong drive to ensure the institution maintains a good security posture and has a strong security program. “In the months before I joined, Tej had started to develop the Protect Stevens program and we’ve continued to build out this program to have ten key programmatic areas,” explains Livingston. “This includes independent risk assessment, security engineering, user protection, threat intelligence, identity and access management, user education, security operations, governance, frameworks and standards, and professional development. Plus, each of these ten areas has a dozen subparts of the program.”

“The entire community here at Stevens is focused on Protect Stevens, from leadership down to faculty, staff, and students,” adds Patel. “We keep a pulse on what’s happening and the success of this program is largely due to having a customer-savvy CISO who combines empathy and compassion with the security posture controls that need to be implemented. We look at what is working, what we can do better, and how we can help maintain the security compliance that is needed. Information flows in all directions and I feel very fortunate to have a partner like Jeremy who has taken Protect Stevens to the next level.”

Going forward, Patel and Livingston are looking at identity and access management (IAM) as they continue to evolve in a rapidly changing environment. “As we look to improve our cybersecurity outcomes, we must determine how to ensure that the right users have the appropriate access to technology resources,” says Patel. “We must also find balance between user behavior and providing effective training. There is a rise in deepfakes and people are leveraging generative AI, so Jeremy and I will focus more on improving in those areas under Protect Stevens in 2025.”

Keeping Pace with AI
With AI rapidly integrating into our day-to-day lives, Stevens continues to look at AI-powered tools and how this technology fits within their organization. “Being a STEM institution, we want to make sure our faculty, staff, and students have access to these tools, but we also want to ensure our data fabric is solid before we grant this access,” explains Patel. “We have three verticals: AI for academic education, AI for employees, and AI for research, and there are small pilots underway in these areas. For example, we have a pilot with 300 faculty, staff, and students who are leveraging Microsoft Copilot functionality to improve their productivity and quality of work. In the research vertical, we have the Stevens Institute of Artificial Intelligence, which is doing exciting, cutting-edge research. To support this, we recently built a new high-performance computing cluster that has a little over 200 teraflops of compute power. Ultimately, we want to improve and support the undergrad and graduate student experience, expand our research enterprise, empower our faculty and staff, and make sure our infrastructure is reliable and secure.”

Connecting within the Edge Community
As a longtime Edge member, Stevens looks to the consortium for a variety of benefits. “Our campus internet connectivity is through Edge and is used by all on a day-to-day basis,” says Patel. “We’re also able to leverage cutting-edge technology through Edge’s access to vendors. I have attended several Edge events and I’ve found that these meetings have created a solid networking platform that connects several institutions within the region, including presidents, provosts, and executives. Edge creates an environment where we can openly discuss challenges and opportunities, which inspires meaningful conversations and partnerships across the state.”

“I too enjoy Edge events and hearing from speakers who come from different places and bring unique experiences and perspectives,” adds Livingston. “I also help chair Edge’s Security Community of Practice and I like how it brings together security practitioners from the whole Edge community. This collaborative group talks about issues, security events, and we share indicators of compromise or things to look for to help block IP addresses. Together we are bringing attention to the importance of cybersecurity and risk management and how a proactive security strategy is essential at each and every organization.”

Need help identifying and implementing actionable information security strategies? The EdgePro virtual CISO (vCISO) service provides independent and objective input to ensure your security posture is on track. Learn more at njedge.net/solutions-overview/vciso/.

View Article in View From The Edge Magazine

The post The Power of the CIO and CISO Partnership appeared first on NJEdge Inc.


Taking a Collaborative Approach to AI and Education

The post Taking a Collaborative Approach to AI and Education appeared first on NJEdge Inc.

Instrumental in bringing Edge’s recent AI Teaching & Learning Symposium to Seton Hall, Renee Cicchino, Director of Instructional Design & Training at Seton Hall University’s Teaching, Learning and Technology Center, was interested in how AI was impacting teaching, learning, and student experiences at local universities. “I saw in real time the ongoing challenges instructors faced as they tried to keep pace with this ever-changing technology. I also knew of several institutions conducting research in AI to address these issues,” says Cicchino. “Attending an Edge event sparked my interest in a potential partnership to co-host a small AI symposium focused on exploring these growing pains, sharing policies and best practices, and fostering meaningful discussions. No matter where an educational institution is on its AI journey, we all share the collective goal of supporting and equipping students with the skills they need for success after graduation.”

The AI Teaching & Learning Symposium was a huge success with attendees, presenters, and sponsors coming from around the region and beyond. “The overwhelming interest and diverse participation in this event highlight the growing excitement and necessity for AI integration in education,” says Cicchino. “I was taken aback by Dr. Robbie Melton’s proposal from Tennessee State University. She presented purposeful AI integration in education and it was fantastic! The variety of proposals we received—whether AI played a minor or major role—was eye-opening and demonstrated AI’s potential for positive impact while understanding its limitations. Most importantly, the symposium introduced AI in a digestible way and raised many questions that will help inspire ongoing conversations.”

“The overwhelming interest and diverse participation in this event highlight the growing excitement and necessity for AI integration in education. I was taken aback by Dr. Robbie Melton’s proposal from Tennessee State University. She presented purposeful AI integration in education and it was fantastic! The variety of proposals we received—whether AI played a minor or major role—was eye-opening and demonstrated AI’s potential for positive impact while understanding its limitations. Most importantly, the symposium introduced AI in a digestible way and raised many questions that will help inspire ongoing conversations.”

— Renee Cicchino
Director of Instructional Design & Training, Seton Hall University

Sharing Experiences and Insights
The symposium began with a student panel, Experiencing Generative AI Insights from the Legal Foundations of Business, Disruption, Technology & Law, and Advanced Topics, which gave Seton Hall students an opportunity to discuss their experiences engaging with generative AI (GenAI) tools in their coursework. “We heard from students about their needs, their perspectives on how AI is changing how they learn, and how these tools enhance their educational experiences,” says Cicchino. “The panel highlighted the positive aspects of AI use, including how personalized and interactive feedback can guide the development of students’ writing rather than simply doing the work for them. This session was extremely valuable for educators and decision-makers to understand students’ needs and how institutions can enhance student success during college and as they transition into the workforce.”

Impressed and excited by the variety of schools that attended the symposium, Cicchino said the event exceeded her expectations. “I was thrilled to see participants from other states and local colleges with whom we don’t have the opportunity to connect as often, as well as our long-time colleagues from Lackawanna College, King University, New York University, and Montclair State University. Everyone was excited to learn and collaborate, and I can’t say enough about the Edge team for working tirelessly to organize this successful event.”

The symposium underscored the vast potential of AI in education, and Seton Hall is committed to continuing its exploration of AI’s ethical use, data security, and best practices across disciplines. “This summer, we are researching best practices and AI use statements for faculty to include in their syllabi,” says Cicchino. “We aim to explore how AI is leveraged in various disciplines and explore the ethical and data security concerns surrounding AI. We hope to harness this technology’s power responsibly and ethically.”

In reflecting on the attendance and sold-out event, Cicchino says people want to participate because they want answers. “Understanding how other institutions tackle changes in teaching and learning is invaluable. Learning from their experiences through collaboration helps us grow in higher education. With AI rapidly evolving, it’s crucial to integrate it sensibly and purposefully into our curriculum and daily lives. We are dedicated to ensuring our students are well-prepared to succeed in an AI-driven future.”

View Article in View From The Edge Magazine

The post Taking a Collaborative Approach to AI and Education appeared first on NJEdge Inc.


Oasis Open Projects

Advancing Cybersecurity in Space at OASIS

As space operations become increasingly complex, the need for effective threat intelligence sharing is more crucial than ever. The increase in data transmission across space networks brings both opportunities and heightened risks, as cyber threats increasingly target critical space infrastructure. Protecting these assets demands a coordinated and proactive approach to threat intelligence sharing.

By Erin Miller, Hector Falcon, and Joel Francis, Space ISAC

As space operations become increasingly complex, the need for effective threat intelligence sharing is more crucial than ever. The increase in data transmission across space networks brings both opportunities and heightened risks, as cyber threats increasingly target critical space infrastructure. Protecting these assets demands a coordinated and proactive approach to threat intelligence sharing. To address this, the OASIS global standards body is working with Space ISAC to form the Space Automated Threat Intelligence Sharing (SATIS) Technical Committee (TC). The group will formally launch on Oct 9, but initial members include NSA, Northrop Grumman, Cyware, MITRE, Peraton, and Carnegie Mellon University. SATIS will build on existing frameworks like Structured Threat Information Expression (STIX) and Trusted Automated eXchange of Intelligence Information (TAXII) to help secure space operations against evolving threats…
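To make the data layer concrete, here is a minimal sketch of the kind of object STIX 2.1 standardizes. The indicator name and IP address below are hypothetical, and the object is built as a plain dictionary rather than with any particular SDK; a TAXII server’s role is then to distribute objects like this one to members of a sharing community.

```python
import json
import uuid
from datetime import datetime, timezone

# Minimal STIX 2.1 Indicator, expressed as a plain dict (no SDK dependency).
# The name and IP below are hypothetical, for illustration only.
now = datetime.now(timezone.utc).strftime("%Y-%m-%dT%H:%M:%S.000Z")

indicator = {
    "type": "indicator",
    "spec_version": "2.1",
    "id": f"indicator--{uuid.uuid4()}",  # STIX IDs are "type--UUID"
    "created": now,
    "modified": now,
    "name": "Suspicious traffic toward ground-station segment (example)",
    "pattern": "[ipv4-addr:value = '203.0.113.42']",
    "pattern_type": "stix",
    "valid_from": now,
}

# A TAXII 2.1 server would serve JSON like this to community members.
print(json.dumps(indicator, indent=2))
```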

Read more here.

The post Advancing Cybersecurity in Space at OASIS appeared first on OASIS Open.


OpenID

Three OpenID Connect for Identity Assurance Final Specifications Approved

The OpenID Foundation membership has approved the following three OpenID Connect for Identity Assurance specifications as an OpenID Final Specifications:   OpenID Identity Assurance Schema Definition 1.0 – https://openid.net/specs/openid-ida-verified-claims-1_0-final.html OpenID Connect for Identity Assurance Claims Registration 1.0 – https://openid.net/specs/openid-connect-4-ida-cl
The OpenID Foundation membership has approved the following three OpenID Connect for Identity Assurance specifications as OpenID Final Specifications:

- OpenID Identity Assurance Schema Definition 1.0 – https://openid.net/specs/openid-ida-verified-claims-1_0-final.html
- OpenID Connect for Identity Assurance Claims Registration 1.0 – https://openid.net/specs/openid-connect-4-ida-claims-1_0-final.html
- OpenID Connect for Identity Assurance 1.0 – https://openid.net/specs/openid-connect-4-identity-assurance-1_0-final.html

A Final Specification provides intellectual property protections to implementers of the specification and is not subject to further revision. These Final Specifications are products of the eKYC & IDA Working Group.

The voting results were:

- Approve – 89 votes
- Object – 0 votes
- Abstain – 18 votes

Total votes: 107 (out of 402 members = 26% > 20% quorum requirement)

Marie Jordan – OpenID Foundation Secretary

The post Three OpenID Connect for Identity Assurance Final Specifications Approved first appeared on OpenID Foundation.


Elastos Foundation

The Global Debt Avalanche: A Bitcoin-Backed Stablecoin To Rescue the World’s Economy

Imagine standing at the base of a mountain, watching an unstoppable avalanche of debt cascading towards you. This is the precarious situation the global economy faces today. National debts are accumulating at unprecedented rates. In the United States, the national debt crossed $35 trillion in July 2024, a staggering figure that took 200 years to […]

Imagine standing at the base of a mountain, watching an unstoppable avalanche of debt cascading towards you. This is the precarious situation the global economy faces today. National debts are accumulating at unprecedented rates. In the United States, the national debt crossed $35 trillion in July 2024; it took roughly 200 years for the debt to reach its first $1 trillion, yet it now grows by $1 trillion every three months. Unchecked inflation acts as the silent thief, eroding the value of our hard-earned money. The dollar has lost at least 25% of its value in the past four years due to inflation and interest rate hikes.

Families worldwide are grappling with rising living costs as their savings lose value, and the dream of financial security slips further out of reach. The combined market capitalization of major tech companies like Apple, Microsoft, NVIDIA, Google, Meta, and Tesla stands at $14 trillion—less than 10% of the $175.3 trillion owed by the U.S. government when including entitlements like Social Security and Medicare. Influential figures like Elon Musk and Ray Dalio foresee a sovereign debt crisis worse than the 2008 financial meltdown; however, this time it’s a melt-up, as currencies are destroyed to prop up a failing system and wealth is wiped out globally.

The urgency of the situation cannot be overstated. The escalating debt isn’t just an economic threat; it’s a precursor to conflict. History has shown that severe economic instability often leads to social unrest, political polarization, and even geopolitical conflicts. The Federal Reserve made more emergency loans in 2023 than during the 2008 financial crisis, indicating the severity of the current financial stress. The world is witnessing increasing finger-pointing and blame, creating fertile ground for division and discord.

The debt crisis isn’t just about numbers; it’s deep generational theft, burdening our youth with debts they didn’t create. No political election or administration can pay off the world’s $305 trillion debt; only extensive use of the printing press can temporarily address it. Young people are entering a world where opportunities are scarce, unable to buy homes or start families—a loss of future prospects and generational equality. This unfair burden diminishes their hopes for a stable and prosperous life.

This atmosphere has given rise to a populism trap, where charismatic leaders offer simplistic solutions to complex problems. They capitalize on public discontent, exacerbating divisions rather than healing them. Interest payments on national debts have become the largest government expense, surpassing costs like defence and social security. Instead of promoting unity and constructive dialogue, this finger-pointing deepens societal rifts and distracts from finding real solutions.

Is there a way out of this looming catastrophe? Yes, there is. A Native Bitcoin Stablecoin is a beacon of hope in a sea of financial turmoil. This novel solution proposes a transition from Fiat-Based Instability to a new Bitcoin-Backed Stable Solution, positioning Bitcoin as a way out of an inherently flawed system. Bitcoin, often hailed as digital gold, has a fixed supply that cannot be manipulated—no government can print Bitcoin to cheat the system—making it a reliable store of value.

But what if holders could unlock liquidity without selling their bitcoins? By collateralizing their Bitcoin holdings, they can issue stablecoins backed by Bitcoin itself. This stablecoin is secured algorithmically by blockchain miners, ensuring that its peg cannot be broken—unlike the old Bretton Woods system, where in 1971, President Nixon severed the US dollar’s tie to gold due to mounting debt, unleashing a wave of money printing and manipulation that led us to the crises we face today.
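The arithmetic of over-collateralization is easy to illustrate. The sketch below is a toy model, not the actual BeL2 or NBW design: the 150% collateral ratio and 110% liquidation threshold are assumed parameters chosen purely for illustration, since the article does not specify real ones.

```python
# Toy model of over-collateralized stablecoin issuance. The ratios are
# illustrative assumptions, not parameters of any real protocol.
COLLATERAL_RATIO = 1.50    # $1.50 of BTC must back every $1.00 issued
LIQUIDATION_RATIO = 1.10   # below this coverage, the position is at risk

def max_issuable(btc_amount: float, btc_price_usd: float) -> float:
    """Largest stablecoin debt a given BTC deposit can back."""
    return (btc_amount * btc_price_usd) / COLLATERAL_RATIO

def is_safe(btc_amount: float, btc_price_usd: float, debt_usd: float) -> bool:
    """True while collateral coverage stays above the liquidation threshold."""
    return btc_amount * btc_price_usd >= debt_usd * LIQUIDATION_RATIO

# 1 BTC at $60,000 can back up to $40,000 of stablecoins; the position
# stays safe until BTC falls below $44,000 (40,000 * 1.10).
print(max_issuable(1.0, 60_000))      # 40000.0
print(is_safe(1.0, 45_000, 40_000))   # True
print(is_safe(1.0, 43_000, 40_000))   # False
```

The point of requiring more collateral than debt is that the peg can be defended by liquidating positions before the outstanding stablecoins are ever under-backed.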

The Bitcoin Standard metaphor suggests that Bitcoin, much like gold in the original Bretton Woods Agreement, can anchor a new, trustworthy financial system free from manipulation by centralized institutions. With China, the largest foreign holder of U.S. Treasury bonds, rapidly selling off its holdings, and BRICS countries increasingly buying gold instead of U.S. debt, the global financial landscape is shifting. The Bitcoin-backed stablecoin offers the best of both worlds: stability through a pegged value and the strength of Bitcoin as a store of value. It acts as a volatility shield, protecting users from the wild price swings commonly associated with cryptocurrencies while providing a secure and trustworthy alternative to fiat currencies.

As more people flock to Bitcoin as a store of value, the issuance of stablecoins opens the door to the world of decentralized finance and rights management. Through smart contracts, entire economies can enter into a new world which operates under blockchain protocols, providing transparency and trust in an environment where trust is scarce. This represents a new paradigm for smart economies—a means to replace the failing debt-based system through creative destruction. This shift lays the foundation for a new global financial order, one based on those who accumulate Bitcoin early and offers society a system everyone can verify and trust. By embracing this Bitcoin-Backed Stable Solution, we can transition from Fiat-Based Instability to a future where financial sovereignty is in the hands of the many, not the few.

So where are we on this journey? In an exciting development, a team of Harvard students and alumni has launched the New Bretton Woods Project (NBW) to tackle the global debt crisis head-on. Incubated at Harvard Innovation Labs, NBW is developing a Native Bitcoin stablecoin using the BeL2 infrastructure. This initiative reframes Bitcoin not just as a store of value but as the foundation of a decentralized financial system. Jacob, the Lead Member of NBW at Harvard University, stated: “Our goal is to create a ‘New Bretton Woods’ system anchored in Bitcoin, bringing stability through the utility of a stablecoin. This stablecoin lets users avoid Bitcoin’s price swings while keeping the benefits of holding Bitcoin.”

The NBW project aims to reshape global finance: by building a Bitcoin-backed stablecoin, NBW provides stability while preserving Bitcoin’s decentralization and security. This stablecoin allows users to bypass Bitcoin’s price volatility while retaining the potential for long-term gain, making it practical for daily use. What’s more, it will be built on BeL2, an interoperability protocol that uses $ELA arbiters to allow Bitcoin and EVM networks to talk to each other without bridging assets, avoiding security concerns related to wrapped coins like WBTC. Sasha Mitchell, Head of Operations at BeL2, added: “Financial empowerment comes from both freedom and stability. By offering a stablecoin backed by Bitcoin on the BeL2 platform, NBW gives people a way to protect their wealth and access new financial opportunities, especially in times of economic volatility.”
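To see how this differs from a wrapped-asset bridge, here is a conceptual lock–attest–mint sketch. The names, data structures, and stubbed checks are hypothetical and are not the BeL2 API; they only show the shape of a design in which the BTC never leaves Bitcoin and no wrapped token is created.

```python
from dataclasses import dataclass

@dataclass
class BtcLockProof:
    """Evidence that native BTC sits locked in a script on Bitcoin itself."""
    txid: str
    amount_sats: int
    locked_until_block: int

def arbiter_attest(proof: BtcLockProof) -> bool:
    """An arbiter (ELA-staked, per BeL2's description) checks the lock.
    A real implementation would verify against Bitcoin consensus data;
    here the check is stubbed for illustration."""
    return proof.amount_sats > 0

def mint_on_evm(proof: BtcLockProof, btc_price_usd: float) -> float:
    """EVM-side contract mints stablecoins against the attested lock.
    The BTC never moves, and no wrapped token (e.g. WBTC) is issued."""
    if not arbiter_attest(proof):
        raise ValueError("lock not attested")
    collateral_usd = (proof.amount_sats / 1e8) * btc_price_usd
    return collateral_usd / 1.5  # same assumed 150% ratio as above

proof = BtcLockProof(txid="<txid>", amount_sats=100_000_000,
                     locked_until_block=900_000)
print(mint_on_evm(proof, 60_000))  # 40000.0
```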

The implications of adopting a Bitcoin-backed stablecoin extend far beyond individual benefits. This shift could transform national economies, ushering in an era of sound money, transparency, and fairness. Imagine a new bitcoin-based Bretton Woods, where Bitcoin serves as the foundation for a stable, decentralized global currency. The global debt crisis is an existential challenge that demands bold and creative solutions. It’s time to move beyond the failing fiat system and embrace the potential of a Bitcoin-backed stablecoin. Projects like the New Bretton Woods offer a tangible path toward a more equitable and transparent economic future. This is the next big step for Bitcoin—a stable, unmanipulated currency for everyday spending, backed by the strongest asset of our era.

Did you enjoy this article? Follow Infinity for the latest updates here!

Wednesday, 02. October 2024

MyData

Empowering a Human-Centric Digital Society: From Ethical Personalisation to Domain Super Apps

In the MyData Matters blog series, MyData members introduce innovative solutions and practical use cases that leverage personal data in line with MyData values. Author: StJohn Deakins, CEO at DataSapien […]

DIF Blog

ArcBlock Opens Decentralized Identity Development to All at the DIF 2024 Hackathon

Are you ready to transform the way we interact with digital identities? ArcBlock invites you to participate in a challenge that demonstrates Decentralized Identifiers (DIDs) and Verifiable Credentials (VCs) as practical tools for solving real-world problems. As a Silver sponsor of the DIF 2024 Hackathon, ArcBlock is calling on developers,

Are you ready to transform the way we interact with digital identities? ArcBlock invites you to participate in a challenge that demonstrates Decentralized Identifiers (DIDs) and Verifiable Credentials (VCs) as practical tools for solving real-world problems. As a Silver sponsor of the DIF 2024 Hackathon, ArcBlock is calling on developers, innovators, and decentralization enthusiasts to create applications that have a tangible impact on everyday life.

The Challenge: Real-World Applications of DIDs and VCs

Your mission is to develop practical applications using DIDs and VCs that address genuine needs and simplify everyday tasks. Whether it's enhancing privacy, streamlining processes, or empowering users with control over their data, show how you can leverage decentralized identity technologies to make a real difference.
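For participants new to the data formats involved, here is a minimal, unsigned credential following the W3C Verifiable Credentials data model. The issuer, subject, and claim values are hypothetical placeholders; a real credential would additionally carry a cryptographic proof added at issuance.

```python
import json

# Minimal, unsigned W3C Verifiable Credential. All identifiers and claim
# values are hypothetical placeholders for illustration.
credential = {
    "@context": ["https://www.w3.org/2018/credentials/v1"],
    "type": ["VerifiableCredential", "LibraryCardCredential"],
    "issuer": "did:example:city-library",
    "issuanceDate": "2024-10-01T12:00:00Z",
    "credentialSubject": {
        "id": "did:example:alice",   # the holder's DID
        "memberSince": "2024-10-01",
        "branch": "Main Street",
    },
}

print(json.dumps(credential, indent=2))
```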

Tools and Resources

- ArcBlock’s Comprehensive Blockchain Platform: Access user-friendly interfaces and scalable solutions to build your application.
- Developer Support: Utilize ArcBlock’s documentation and resources to accelerate your development process.
- Community Engagement: Join discussions and collaborate with peers on the DIF Hackathon Discord server.

Rewards Worth Your Effort

Innovation deserves recognition, and this challenge offers a total of $7,500 in prize money, as follows:

- 1st Place: $3,000
- 2nd Place: $1,500
- 3rd Place: $1,000
- Honorable Mentions: 10 prizes of $200 each

Bonus: All participants will receive a DID/VC "Proof of Participation" badge as a token of appreciation for your efforts.

Why Participate?

- Accessible to All: Whether you're a seasoned developer or new to coding, ArcBlock's tools make it possible to contribute.
- Make an Impact: Develop solutions that could improve daily life for people around the world.
- Enhance Your Skills: Gain hands-on experience with cutting-edge blockchain technologies.
- Get Recognized: Winners may have the opportunity to feature their projects in ArcBlock's Blocklet Store.
- Network: Connect with industry leaders and like-minded innovators in the decentralized identity space.

Kim Duffy of DIF notes, “ArcBlock's challenge at the DIF 2024 Hackathon is particularly exciting because it opens the doors for a wider range of participants, including those with low-code or no-code experience.”

By providing accessible tools and resources, ArcBlock empowers more people to contribute to the decentralized identity ecosystem. “This inclusivity is crucial for driving innovation and adoption of DIDs and VCs. We look forward to seeing how participants will leverage these user-friendly resources to build practical solutions that make a real difference in everyday life,” she adds.

About ArcBlock

ArcBlock simplifies decentralized application development, empowering innovators to build, deploy, and manage with ease. ArcBlock’s goal is to make decentralized technology accessible, enabling you to focus on creating solutions that can change the world.

Robert Mao, CEO of ArcBlock, notes "At ArcBlock, we are committed to empowering developers by providing a comprehensive framework and tools that make building Decentralized Identity (DID) applications easier than ever.” 

He adds, “Even more exciting, our AI No-code apps engine, AIGNE, allows anyone—even without coding experience—to leverage DID and Verifiable Credentials (VC) to create powerful applications. We're thrilled to support the DIF Hackathon and inspire innovation at every level of technical expertise." 

Ready to Innovate?

- Register for ArcBlock’s informational sessions:
  - ArcBlock and DID: A Suite of Frameworks and Tools for Building Applications — Register now (Thursday 10/3/2024, 10:00 am PT)
  - Building Decentralized Identifier (DID) Applications: Demo and Quick Start — Register now (Friday 10/4/2024, 10:00 am PT)
- Explore ArcBlock’s Resources: Visit our website to learn more about our platform and tools. https://arcblock.io
- Dive into the Documentation: Access detailed guides and tutorials to help you get started. https://www.arcblock.io/docs/hackathon/en/dif-hackathon-2024
- Join the Conversation: Engage with the community on the DIF Hackathon Discord server or ArcBlock’s dedicated discussion board: https://community.arcblock.io/discussions/boards/hackathon-support

Your project could be the breakthrough that brings decentralized identity solutions into everyday use. We can't wait to see what you'll create!

Tuesday, 01. October 2024

DIF Blog

DIF Newsletter #44

October 2024 DIF Website | DIF Mailing Lists | Meeting Recording Archive Table of contents Decentralized Identity Foundation News; 2. Working Group Updates; 3. Announcements at DIF; 4. DIF Members; 5. Get involved! Join DIF 🚀 Decentralized Identity Foundation News The DIF Hackathon 2024 is Now Live! We're excited to

October 2024

DIF Website | DIF Mailing Lists | Meeting Recording Archive

Table of contents

1. Decentralized Identity Foundation News
2. Working Group Updates
3. Announcements at DIF
4. DIF Members
5. Get involved! Join DIF

🚀 Decentralized Identity Foundation News

The DIF Hackathon 2024 is Now Live!

We're excited to announce that the Decentralized Identity Foundation (DIF) Hackathon 2024 is officially underway! This highly anticipated event brings together developers, innovators, and enthusiasts to explore groundbreaking solutions in decentralized identity.

Key Details:

- Hacking Period: October 1 – November 4, 2024
- Tracks: Education, Reusable Identity, Travel, and ZKPs
- Prize Pool: ~$70,000 USD

Whether you're a seasoned decentralized identity developer or new to the field, this hackathon offers a unique opportunity to push the boundaries of innovation. With challenges spanning education, reusable identity, travel, and ZKPs, there's something for everyone interested in shaping the future of digital identity.

How to Participate:

- Register on DevPost: https://difhackathon2024.devpost.com/
- Explore the challenge details: https://identity.foundation/hackathon-2024/
- Sign up for educational sessions: https://www.eventbrite.com/o/decentralized-identity-foundation-26691849135
- Join the Discord community: https://discord.gg/WXPzWvBCjD

Don't miss out on this chance to collaborate, learn, and potentially win prizes while contributing to the advancement of decentralized identity technology. Register now and be part of the next wave of innovation. Happy hacking!

DIF Africa SIG Launch

DIF announced the launch of the DIF Africa Special Interest Group (SIG). This initiative aims to promote and advance decentralized identity technologies and standards across the African continent. Led by Chairs Gideon Lombard from DIDx and Jack Scott-King from VERA, the DIF Africa SIG will focus on addressing Africa's unique requirements and use cases in the decentralized identity space.

The SIG's primary objectives include raising awareness, fostering collaboration among stakeholders, contributing to standards development, and advocating for the adoption of decentralized identity solutions in Africa. Gideon and Jack invite all interested organizations, institutions, and individuals to join the inaugural meeting on October 16th, 2024, from 1:00 to 2:00 PM South African Standard Time. This marks an exciting step forward in ensuring that decentralized identity technologies are developed and implemented with Africa's specific needs in mind.

Read the full article: Launch of the DIF Africa Special Interest Group (Decentralized Identity Foundation Blog)

DID Traits and Trust DID Web: Significant Work Items Added to ID & Discovery Working Group

DIF is thrilled to announce the launch of two significant new work items within our Identifiers & Discovery Working Group. DID Traits and Trust DID Web are set to enhance the functionality and security of Decentralized Identifiers (DIDs), bringing the next level of trust, interoperability, and ease of use to the ecosystem.

Markus Sabadello, Founder & CEO of Danube Tech and Co-Chair of the ID & Discovery WG, highlights the significance of these initiatives: "Identifiers are the foundation of any digital identity system. The new work items, DID Traits and Trust DID Web, are vital steps forward in building robust identifier systems that other technologies and protocols can rely on."
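For readers newer to the identifier layer these work items build on, here is a minimal DID document in the W3C DID Core format. The did:example method and key material are placeholders; it illustrates the kind of structure whose method-level guarantees work items like DID Traits aim to make comparable.

```python
import json

# Minimal DID document (W3C DID Core). The method and key material are
# placeholders; real methods (e.g. did:tdw) add their own rules on top.
did_document = {
    "@context": "https://www.w3.org/ns/did/v1",
    "id": "did:example:123456789abcdefghi",
    "verificationMethod": [{
        "id": "did:example:123456789abcdefghi#key-1",
        "type": "Ed25519VerificationKey2020",
        "controller": "did:example:123456789abcdefghi",
        "publicKeyMultibase": "z6Mk...placeholder",  # placeholder key
    }],
    "authentication": ["did:example:123456789abcdefghi#key-1"],
}

print(json.dumps(did_document, indent=2))
```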

Read the full article: DIF Announces Two New Work Items in Identifiers & Discovery Working Group (Decentralized Identity Foundation Blog)

DID Method Standardization Initiative

DIF held the kickoff meeting for DID Method Standardization efforts. Here's a quick rundown of our latest developments:

- Held the kickoff meeting on 22 September with around 50 participants
- Regular schedule to be announced in early October
- Presented the collaboration to W3C Technical Plenary / Advisory Committee meetings
- Published a comprehensive update on our progress and next steps
- Launched a central hub for collaboration and information sharing, featuring an overview of mission and goals, pointers to ongoing efforts, and general discussions

Your participation is crucial in shaping the future of interoperable DID Methods. Whether you're a seasoned expert or new to DIDs, we welcome your input and involvement.

🛠️ Working Group Updates

💡 Identifiers and Discovery Work Group

The Identifiers & Discovery WG launched DID Traits and did:tdw work items. Read more in the featured section above

Identifiers and Discovery meets bi-weekly at 11am PT/ 2pmET/ 8pm CET Mondays

🪪 Claims & Credentials Working Group

Reminder that the Claims & Credentials WG is accepting input on the Basic Person schema, relevant for reusable identity claims.

The Credential Schemas work item meets bi-weekly at 10am PT / 1pm ET / 7pm CET Tuesdays

🔐 Applied Crypto WG

The Applied Crypto WG released BBS v07!

The DIF Crypto - BBS work item meets weekly at 11am PT/2pm ET /8pm CET Mondays

📦 Secure Data Storage

Additional improvements to the specification and implementation continue.

DIF/CCG Secure Data Storage WG - DWN Task Force meets bi-weekly at 9am PT/12pm ET/6pm CET Wednesdays

If you are interested in participating in any of the Working Groups highlighted above, or any of DIF's other Working Groups, please click join DIF.

📢 Announcements at DIF

Internet Identity Workshop (IIW) #39

The Fall IIW is taking place in Mountain View, California from 29 - 31 October. Book your ticket here.

🗓️ DIF Members

DIF Member Spotlight: Moises Jaramillo of Dentity

We sat with Moises Jaramillo, Principal Engineer at Dentity and veteran software developer, to discuss his journey in decentralized identity and his current work pushing the boundaries of digital identity with Web3/Web5 and Decentralized Web Nodes.

Moises also shares insights into Dentity's groundbreaking partnership with ENS Labs and offers valuable advice for participants in the upcoming DIF hackathon.

Read more from Moises.

DIF Member Spotlight: Nick Lambert, CEO of Dock

We interviewed Nick Lambert, CEO of Dock, who has been at the forefront of empowering individuals with control over their digital identity for over a decade.

In our interview, Nick shares valuable insights on:

- The evolution of decentralized identity solutions and the power of industry-wide collaboration
- Dock's approach to verifiable credentials and their recent blockchain merger with Cheqd
- The potential of reusable KYC and Customer Identity Access Management (CIAM) as key growth areas
- A unique use case involving anonymous cyber incident reporting for the University of Arkansas

This spotlight offers a glimpse into the future of digital identity and the power of cooperation in solving complex challenges in our field. Read the full interview to dive deeper into Nick's perspectives on the evolving landscape of decentralized identity.

TBD Features DWNs in Hacktoberfest

TBD has added their DWN SDK to the annual Hacktoberfest! Check out the GitHub repository to participate!

GitHub - TBD54566975/dwn-sdk-js: Decentralized Web Node (DWN) Reference Implementation

DIF Members in the News

Spruce announced their partnership with the California DMV on the Mobile Driver's License

SpruceID Partners with California DMV on the Mobile Driver’s License: SpruceID has partnered with the State of California Department of Motor Vehicles (DMV) to bring mobile driver’s licenses to residents of California.

Trinsic released their reusable ID SDK

Trinsic introduces SDK to ease reusable digital ID integration | Biometric Update: Allows businesses to perform identity verification for their customers 10 times faster than before by accepting digital IDs within their existing IDV flow.

👉Are you a DIF member with news to share? Email us at communication@identity.foundation with details.

New Member Orientations

If you are new to DIF, join us for our upcoming new member orientations. Please subscribe to DIF’s Eventbrite for notifications on upcoming orientations and events, find more information on DIF’s Slack, or contact us at community@identity.foundation.

🆔 Join DIF!

If you would like to get in touch with us or become a member of the DIF community, please visit our website or follow our channels:

Follow us on Twitter/X

Join us on GitHub

Subscribe on YouTube

🔍 Read the DIF blog

Enable Miko's Journey with Truvity at the DIF 2024 Hackathon

Truvity brings the future of global digital identity management to life with its innovative challenges at the DIF 2024 Hackathon. A leader in user-centric digital identity systems, Truvity is dedicated to making the benefits of self-sovereign identity seamless and accessible for individuals and businesses alike. Their two innovative challenges aim to revolutionize digital identity management for our increasingly global and mobile world, leveraging the power of decentralized identity and verifiable credentials.

Alexander Mikhailov, Product Manager at Truvity shares his thoughts: “We’re excited to see Truvity’s SDK come to life through the creativity and innovation of the DIF 2024 Hackathon participants. Our goal is to make it easy for developers to build with technology like verifiable credentials, so they can have a real impact on how we exchange information and move away from physical documents to a digital future of credentials.”

Streamlining relocation-related digital identity complexities with SSI

Truvity invites participants to explore how decentralized identity and verifiable credentials can streamline complex eKYC processes and digital identity management using self-sovereign identity solutions.

At the heart of these challenges is Miko, a talented backend developer embarking on an international relocation. Her journey from outside Europe to Amsterdam serves as the backdrop for exploring innovative applications of digital wallets, smart to-do lists, and interlinked verifiable credentials.

The Challenges

Challenge 1: Miko’s Journey to Amsterdam

In the first challenge, you will create a Digital Identity Wallet with an embedded to-do list that guides Miko as she relocates to Amsterdam and navigates the complexities of settling in a new country. The wallet should:

Manage and submit Verifiable Credentials (VCs) for various steps of the relocation process
Handle interlinked VCs to maintain data integrity across different procedures
Simplify tasks from obtaining employment contracts to securing housing

Challenge 2: eKYC Compliance Officer Panel

Building on Miko's journey, design a user-friendly Compliance Officer Panel for financial institutions. This challenge focuses on streamlining the verification process when Miko opens a bank account. Key aspects include:

Reviewing and approving interlinked Verifiable Credentials
Providing an efficient interface for compliance officers to manage digital identity documents
Ensuring the integrity and completeness of submitted credentials

Both challenges encourage participants to leverage Truvity's SDK and explore innovative ways to make digital identity management more accessible and secure in our increasingly mobile world.
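To make the notion of interlinked credentials concrete, here is a minimal TypeScript sketch of two W3C-style Verifiable Credentials where a housing credential references an employment credential by ID. The field names follow the W3C VC Data Model, but the credential types, DIDs, and the `basedOnCredential` link are hypothetical illustrations, not Truvity SDK calls.

```typescript
// Minimal sketch of two interlinked W3C Verifiable Credentials for Miko's
// relocation journey. Shapes follow the W3C VC Data Model; the credential
// types and IDs are hypothetical illustrations, not Truvity SDK calls.

interface VerifiableCredential {
  "@context": string[];
  id: string;
  type: string[];
  issuer: string;
  issuanceDate: string;
  credentialSubject: Record<string, unknown>;
}

// Step 1: an employer issues an employment-contract credential.
const employmentVC: VerifiableCredential = {
  "@context": ["https://www.w3.org/2018/credentials/v1"],
  id: "urn:uuid:employment-contract-001", // hypothetical ID
  type: ["VerifiableCredential", "EmploymentContractCredential"],
  issuer: "did:example:employer",
  issuanceDate: new Date().toISOString(),
  credentialSubject: { id: "did:example:miko", role: "Backend Developer" },
};

// Step 2: a housing credential links back to the employment credential,
// so a verifier can check the chain of evidence behind Miko's lease.
const housingVC: VerifiableCredential = {
  "@context": ["https://www.w3.org/2018/credentials/v1"],
  id: "urn:uuid:housing-contract-001",
  type: ["VerifiableCredential", "HousingContractCredential"],
  issuer: "did:example:landlord",
  issuanceDate: new Date().toISOString(),
  credentialSubject: {
    id: "did:example:miko",
    // The link that makes the credentials "interlinked":
    basedOnCredential: employmentVC.id,
  },
};

console.log(housingVC.credentialSubject.basedOnCredential); // urn:uuid:employment-contract-001
```

A real issuer would add a cryptographic `proof` section to each credential; the sketch only shows how one credential can anchor another so that data integrity holds across relocation steps.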

Prize Pool and Participation

Truvity is offering a total prize pool of $5,000, with each challenge awarding:

1st Place: $1,500
2nd Place: $700
3rd Place: $300

To participate, leverage the Truvity SDK (available in TypeScript and Java) and ensure your solutions use W3C-compliant verifiable credentials.

Why Join Truvity's Challenge?

Tackle real-world identity management issues faced by global citizens and financial institutions
Work with cutting-edge technology in the decentralized identity space
Gain visibility among leaders in fintech and digital identity sectors
Contribute to making digital interactions simpler, more secure, and user-centric
Have your entries featured on the Truvity blog and across their social channels, with over 15,000 followers and readers

Kim Hamilton Duffy, DIF's Executive Director, expresses enthusiasm for Truvity's choice to anchor the challenges in a real user journey: "By following Miko's relocation story, participants can focus on creating solutions that genuinely simplify people's lives. Concentrating on practical, human-centered use cases is the key to setting a new standard for convenience and agency in daily interactions.”

Ready to Innovate?

Register for Truvity’s information session “Building SSI Solutions: An Introduction to Truvity SDK” on Wednesday, October 3, 2024 at noon ET (18:00 CEST).

We can't wait to see how you'll leverage decentralized identity and verifiable credentials to create a more connected, efficient, and user-friendly digital world!


Digital ID for Canadians

Spotlight on VoxMind

1. What is the mission and vision of VoxMind?

At VoxMind, our mission is to revolutionize digital security by providing cutting-edge voice biometrics solutions that protect identities and ensure secure authentication. Our vision is to create a world where identity verification is effortless, secure, and universally trusted—one where your voice is your most secure digital asset. We aim to set the gold standard in voice biometrics, delivering scalable and innovative solutions that address the evolving security needs of individuals and organizations worldwide.

2. Why is trustworthy digital identity critical for existing and emerging markets?

In today’s increasingly digital world, a trustworthy digital identity is crucial for secure transactions, both for established industries and emerging markets. As the global economy becomes more interconnected, consumers and businesses demand frictionless and secure authentication processes. Without trustworthy digital identities, fraud and identity theft risks increase, eroding user confidence. By incorporating secure and scalable biometric solutions like voice authentication, businesses can protect against these threats while delivering seamless customer experiences.

3. How will digital identity transform the Canadian and global economy? How does your organization address challenges associated with this transformation?

Digital identity will enable a secure, efficient, and inclusive global economy. By ensuring secure access to services, whether financial, healthcare, or government, it can streamline operations, reduce fraud, and increase user trust. At VoxMind, we address challenges like identity fraud, AI-driven threats like deepfakes, and the need for easy-to-use solutions. Our voice biometrics technology offers a future-proof solution that can adapt across industries, safeguarding users while simplifying the digital verification process.

4. What role does Canada have to play as a leader in this space?

Canada, through organizations like DIACC, plays a pivotal role in shaping global standards for secure digital identity. With its commitment to innovation and inclusivity, Canada is well-positioned to lead in developing scalable, privacy-preserving solutions that can be adopted globally. By collaborating with global partners, Canada can help set the benchmark for interoperable and secure digital ecosystems that benefit both individuals and businesses.

5. Why did your organization join the DIACC?

VoxMind joined DIACC to be part of a visionary network shaping the future of digital identity. By collaborating with DIACC and its members, we aim to contribute to the creation of secure and interoperable identity standards. DIACC’s mandate aligns with our commitment to protecting individual identities in a scalable, secure, and privacy-preserving manner. As a Sustaining Member, we look forward to sharing our voice biometrics expertise and helping build a secure digital identity infrastructure for Canada and beyond.

6. What else should we know about your organization?

VoxMind is pioneering voice biometrics as a secure, convenient, and adaptive identity verification solution. We address modern security threats such as deepfakes and voice cloning while ensuring seamless user experiences across various industries, including finance, healthcare, and IoT. Our technology is designed to be language-agnostic, scalable, and adaptable to evolving security challenges. As we continue to innovate, we are committed to building partnerships that enhance global security and trust in digital identities.

Do not hesitate to contact us for more information at contact@voxmind.ai


Energy Web

Energy Web Launches AutoGreenCharge Beta App to Decarbonize EV Charging, Secured by Polkadot

Energy Web’s innovative app enables EV owners to decarbonize charging sessions with renewable energy

Zug, Switzerland — October 1, 2024 — Energy Web is proud to announce the beta launch of AutoGreenCharge, a mobile app designed to decarbonize electric vehicle (EV) charging. With AutoGreenCharge, users can ensure that every EV charging session is powered by renewable energy. The app is accessible to owners of popular electric vehicles, including Tesla, BMW, Mercedes, and others, bringing the promise of green charging to a worldwide, mainstream audience.

Powered by the decentralized technology of Energy Web’s EnergywebX and secured by the Polkadot blockchain, AutoGreenCharge offers a simple, secure, and verifiable solution to ensure EV charging is not just electric, but 100% renewable. By integrating renewable energy certificates (RECs), the app will automatically match EV charging sessions with clean energy, providing verifiable green charging in real time. While in the beta phase, users can familiarize themselves with the app’s core features and experience the future of EV charging firsthand.

AutoGreenCharge allows EV owners to easily connect their vehicles through a partnership with Smart Car. Once connected, every charging session is automatically tracked, giving users detailed insights into their energy consumption and environmental impact. As the app evolves toward full production, users will be able to retire real renewable energy certificates with each charging session, ensuring their cars are powered by clean, sustainable energy sources. Additionally, they will have the option to specify preferences for the type and location of renewable energy, offering personalized access to solar, wind, and other clean energy sources from around the globe.

Mani Hagh Sefat, CTO of Energy Web, shared, “AutoGreenCharge represents a major step forward in the electrification and decarbonization of transportation. By providing EV owners with a seamless way to ensure their cars are charged with renewable energy, we’re empowering drivers to make more sustainable choices and actively contribute to the global energy transition.”

AutoGreenCharge’s integration with the Polkadot blockchain ensures that every transaction and certificate retirement is securely recorded and verifiable, enhancing transparency and trust in the system. This cutting-edge app is a key development in the broader mission to build a more resilient, efficient, and sustainable energy system.

With the beta version now available, EV owners are encouraged to download the AutoGreenCharge app and start participating in this transformative initiative. The beta is available via TestFlight for Apple devices and on the Google Play Store. As the app moves towards its full production release, users will play a crucial role in refining its features and improving the future of green charging.

For more information, visit Energyweb.org

Energy Web Launches AutoGreenCharge Beta App to Decarbonize EV Charging, Secured by Polkadot was originally published in Energy Web on Medium, where people are continuing the conversation by highlighting and responding to this story.


DIF Blog

Redefine Digital Privacy: PSE's Identity Innovation Challenge at DIF 2024

Privacy & Scaling Explorations (PSE) brings cutting-edge cryptography challenges to the DIF 2024 Hackathon as a Gold sponsor. PSE invites you to develop applications that enhance privacy, security, and interoperability in Self-Sovereign Identity (SSI) systems using advanced cryptographic techniques like Zero-Knowledge Proofs (ZKPs), Multi-Party Computation (MPC), and Fully Homomorphic Encryption (FHE).

The Track: Pushing the Boundaries of Privacy in SSI

This track is designed to foster innovation and collaboration among developers, researchers, and industry experts to advance the use of programmable cryptography in digital identity solutions. You’ll design and build solutions that use advanced cryptographic methods to empower users with greater control over their digital identities.

Objectives:

1. Innovate with ZKPs, MPC, FHE: Develop creative applications using programmable cryptography to enhance privacy, security, and interoperability in SSI systems.

2. Collaborate and Learn: Engage with ZKP experts to learn about best practices, existing tools, and solutions.

Potential Project Ideas:

Solidity verifier for BBS+: Build a Solidity verifier for VCs issued with the BBS+ algorithm (a minimal selective-disclosure sketch follows this list).
GPC tooling: Build a GPC builder + visualizer.
MPC-Based Social Recovery: Build a privacy-preserving social recovery system for identity wallets using Multi-Party Computation (MPC).
MPC-Based Credential Issuance: Develop a VC issuance system where credentials are computed via Multi-Party Computation (MPC).
Decentralized PKI infra: Build a decentralized PKI for DIDs.
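As a taste of the selective disclosure these ideas build on, here is a minimal sketch using the open-source @mattrglobal/bbs-signatures package: an issuer signs three claims with one BBS+ signature, and the holder derives a proof that reveals only one of them. The function names follow that package's documented API, but treat the snippet as an illustrative sketch rather than a reference implementation.

```typescript
// Sketch of BBS+ selective disclosure: sign several claims, reveal only one.
// Function names follow the documented @mattrglobal/bbs-signatures API, but
// this is an illustrative sketch, not an authoritative implementation.
import {
  generateBls12381G2KeyPair,
  blsSign,
  blsCreateProof,
  blsVerifyProof,
} from "@mattrglobal/bbs-signatures";

const utf8 = (s: string) => new TextEncoder().encode(s);

async function main() {
  const keyPair = await generateBls12381G2KeyPair();

  // Each credential claim becomes one signed message.
  const messages = [
    utf8("name: Miko"),
    utf8("birthdate: 1995-01-01"),
    utf8("nationality: JP"),
  ];

  // Issuer signs all claims with a single BBS+ signature.
  const signature = await blsSign({ keyPair, messages });

  // Holder derives a proof that reveals only message 0 (the name),
  // bound to a verifier-supplied nonce to prevent replay.
  const nonce = utf8("verifier-nonce");
  const proof = await blsCreateProof({
    signature,
    publicKey: keyPair.publicKey,
    messages,
    nonce,
    revealed: [0],
  });

  // Verifier checks the proof against just the revealed message; the
  // birthdate and nationality stay hidden.
  const result = await blsVerifyProof({
    proof,
    publicKey: keyPair.publicKey,
    messages: [messages[0]],
    nonce,
  });
  console.log("proof verified:", result.verified);
}

main().catch(console.error);
```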

The Prize Pool

Prizes in this track total $10,000 USD:

🥇 1st Place: $5,000
🥈 2nd Place: $3,000
🥉 3rd Place: $2,000

Why Participate?

Collaborate with Experts: Work alongside PSE's team of cryptography specialists.
Innovate at the Cutting Edge: Utilize advanced cryptographic techniques to solve real-world privacy challenges.
Make a Real Impact: Your solutions could redefine digital privacy and identity management.
Network with Leaders: Connect with industry pioneers in ZKPs and SSI.

About PSE: Leaders in Privacy and Scaling Technologies

Backed by the Ethereum Foundation, PSE is dedicated to advancing scaling solutions and programmable cryptography for privacy-enhancing technologies. Their focus on ZKPs, MPC, and FHE positions them at the forefront of cryptographic innovation.

DIF Recognizes the Critical Role of ZKP in Advancing SSI

ZKPs and other advanced cryptographic techniques are essential for achieving the privacy and security goals of SSI. While numerous efforts are underway within the ZKP and SSI communities, they are often fragmented.

According to DIF’s Executive Director, Kim Duffy: "DIF is intensifying its focus on applying privacy-enhancing cryptographic techniques to SSI. We're keen to unify efforts and drive standardization in this crucial area, and PSE's sponsorship accelerates these initiatives. We're especially excited to see participants use advanced cryptography to create unified, privacy-preserving identity solutions that empower users and set new standards for digital trust."

Get Involved

Ready to make a difference in the world of digital privacy? Join PSE’s informative webinar session to learn more. Register today

Monday, 30. September 2024

DIF Blog

Pinata Challenges Developers to Innovate with File-Based Identity Solutions

DIF is thrilled to announce Pinata as a Gold sponsor of the DIF 2024 Hackathon! Pinata is bringing challenges that use decentralized file storage in digital identity solutions.

About Pinata

Pinata, the Internet’s File API, provides simple-to-use decentralized storage solutions for enterprises and individuals. Building on the secure foundation of IPFS (InterPlanetary File System), which enables content authenticity by design, their tools abstract away the complexity typically associated with decentralized storage and management. Pinata also offers robust features for scalable, responsive applications, including a global CDN (content delivery network) to boost load times and simple access control options.

Pinata's technology is widely used in decentralized applications, and its architecture makes it a perfect foundation for decentralized identity applications and solutions.

The Challenges

Pinata presents three challenges that highlight the role of decentralized storage in enabling decentralized identity solutions:

1. Verifiable File Storage

Associate files with users via verifiable credentials. Both files and verifiable credential metadata can be stored publicly or privately using Pinata's immutable Files API. This challenge invites you to demonstrate your creativity in using immutability and verifiable content hashes to solve real-world problems.
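One way to approach this challenge, sketched below under stated assumptions: hash the file, store it, and bind both the content hash and the storage CID into an (unsigned) credential body. `uploadToPinata` is a hypothetical stand-in for whichever Pinata Files API client you use; only the hashing and the credential shape are concrete here.

```typescript
// Minimal sketch of "verifiable file storage": hash a file, then embed the
// hash (and a storage CID) in a credential body so anyone can check that
// the stored file is the one the credential attests to.
import { createHash } from "crypto";
import { readFileSync } from "fs";

// Hypothetical stand-in for a Pinata Files API client; a real solution
// would call Pinata's upload endpoint here and return the resulting CID.
async function uploadToPinata(bytes: Buffer): Promise<{ cid: string }> {
  return { cid: "bafy-example-cid" }; // placeholder value
}

async function issueFileCredential(path: string, subjectDid: string) {
  const bytes = readFileSync(path);
  const sha256 = createHash("sha256").update(bytes).digest("hex");
  const { cid } = await uploadToPinata(bytes);

  // Unsigned credential body; a real issuer would add a `proof` section.
  return {
    "@context": ["https://www.w3.org/2018/credentials/v1"],
    type: ["VerifiableCredential", "FileAttestationCredential"], // hypothetical type
    issuer: "did:example:issuer",
    issuanceDate: new Date().toISOString(),
    credentialSubject: {
      id: subjectDid,
      fileCid: cid,       // where the file lives
      fileSha256: sha256, // what the file must hash to
    },
  };
}

// Verification is just re-hashing the retrieved file and comparing:
export function fileMatches(credentialSha256: string, retrieved: Buffer): boolean {
  return createHash("sha256").update(retrieved).digest("hex") === credentialSha256;
}
```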

2. Proof of Personhood Credentials

In this open-to-interpretation challenge, we want to see creative solutions to personhood credentials that leverage immutable file storage. Participants can use private file storage through Pinata's Files API or public file storage through Pinata's IPFS pinning service.

3. Identity-Based Access Controls for Private Files

Build an identity-based access control system for retrieving files stored privately on Pinata's Files API. This challenge should focus on private file storage through the Files API, not IPFS, as IPFS is a public network.

Prizes:

Prizes total $10,000, broken down as follows:

Grand Prize: $5,000
Per-Challenge Prize: $1,500 (one for each of the 3 challenges above)
Honorable Mentions: 5 x $100

Why Participate in Pinata's Challenges?

Innovative Technology: Work with Pinata's cutting-edge file storage solutions and explore their applications in digital identity.
Real-World Applications: Develop solutions that address practical challenges in file management and access control.
Skill Development: Enhance your expertise in decentralized storage, identity management, and access control systems.
Industry Recognition: Showcase your creativity and technical skills to leaders in the decentralized storage and identity spaces.

Why this Matters

The application of decentralized storage to decentralized identity presents exciting opportunities for innovation. Pinata and DIF share their thoughts on the potential impact of these challenges:

Justin Hunter, Vice President of Product at Pinata, emphasizes the practical applications: "Decentralized identity has shown that it can solve complex real world problems, and we’re excited to help facilitate continued progress in this area through complementary file storage. Pinata’s been building file storage solutions since 2018, and we’ve expanded to solve both decentralized storage and private file storage. Combined with decentralized identity, we think there can be some incredible applications."

Kim Duffy of DIF highlights the broader impact: "Pinata's challenges for DIF's Hackathon address a crucial aspect of decentralized identity adoption: making security, privacy, and scalability easily integrable into products by default. Their approach to simplifying decentralized storage is exactly what enterprises need to embrace these technologies at scale." 

She further adds, "I'm personally excited that Pinata is supporting innovation in the Proof of Personhood Credentials challenge. As AI advancements render traditional methods like CAPTCHAs ineffective, we need innovative, privacy-preserving ways to differentiate human users without over-collecting personal data."

Ready to Revolutionize File-Based Identity Solutions?

Attend the sessions:
Register for the Hackathon opening session: https://www.eventbrite.com/e/opening-session-tickets-1027651562487
Register for Pinata's educational session with Steven Simkins, Pinata's Head of Developer Relations: https://www.eventbrite.com/myevent?eid=1029330564427

See Pinata’s challenges and developer resources:
Hackathon site: https://docs.pinata.cloud/events/dif
Docs: https://docs.pinata.cloud
App/Signup: https://app.pinata.cloud
Website: https://pinata.cloud

Join the DIF Hackathon Discord server to connect with other participants and Pinata mentors:
DIF’s Hackathon discord: https://discord.gg/WXPzWvBCjD
Channel: #pinata

Don't miss this chance to work with state-of-the-art decentralized storage technology and contribute to the future of file-based identity solutions!


FIDO Alliance

Webinar: NIST SP 800-63 Digital Identity Standard: Updates & What it Means for Passkeys

The fourth revision of the draft NIST SP 800-63-4 Digital Identity Guidelines is now open for public comment.

The FIDO Alliance hosted a webinar on September 24, 2024, with top digital identity experts to discuss the latest updates to the standard and what they mean for passkeys.

Megan Shamas, CMO of the FIDO Alliance, was joined by guests Ryan Galluzzo, Digital Identity Program Lead at the NIST NCCoE, and Teresa Wu, co-chair of the FIDO Alliance Government Deployment Working Group and VP of Smart Credentials at IDEMIA. The panel unpacked the latest changes to the draft and shared what they mean for passkeys.

Webinar attendees also had an opportunity to get questions answered before the public comment submission deadline next month. NIST requests that all comments be submitted by 11:59 pm Eastern Time on October 7, 2024.

View the webinar slides below.

Webinar slides: “NIST SP 800-63 Digital Identity Standard: Updates & What it Means for Passkeys” from the FIDO Alliance

Kantara Initiative

Dr Carol Buttle joins Kantara Initiative as Chief Technology Officer (CTO)

Virginia, US – 30 September 2024 – the Board of Directors of Kantara Initiative has announced that Dr Carol Buttle is to join the organization in the newly created role of Chief Technology Officer (CTO). 

Carol joins from the UK government’s Department for Science, Innovation and Technology (DSIT), where she was Head of Certification and Assurance. She brings with her unrivalled expertise in designing trust frameworks and identifying the implications of specific regulations and standards designed to support personal data privacy and security. Carol has contributed regularly to the development of identity trust frameworks across the globe. As demand for identity certification extends into new markets and territories, she will ensure that Kantara maintains its role as a leader in identity assurance, offering challenge and guidance to support all our members, clients, and partners.

“Standards are everywhere,” said Carol. “We all seek assurance that the products we use, the medicines we consume – even the restaurants we visit – meet the appropriate standards and will not cause us harm. It is no different with identity. I see my role not just about certification and approvals. It is about how I can steer the wider industry to improve for the good of individual citizens, and particularly the most vulnerable.”

We asked Kantara Executive Director, Kay Chopard, about what she sees as the greatest impact of the new role. “Our investment in a Chief Technology Officer demonstrates significant commitment to the future of the industry as a whole, particularly with regard to the potential for international growth and greater interoperability across sectors and territories. Carol’s appointment follows on from the recent arrival of UK-based auditors James Keenan and David Nutbrown as our Head of Certification Delivery and Head of Regulatory Compliance, respectively. Their expertise greatly strengthens our existing operations and will secure confidence in the future direction of certification and identity assurance.”

Commenting on Carol’s appointment, Kantara Board Chair Andrew Hughes stated: “Carol’s appointment brings a real depth of regulatory and operational expertise to our leadership. It underpins the valuable contribution we already make through our Work Groups and certification programs. Carol brings with her thorough knowledge and expertise of UK requirements and how they might apply in the US. This will benefit those Kantara members who are engaged with US Federal Agencies or those wishing to become certified under the UK Digital Identity & Attributes Trust Framework (DIATF).”

Click here to understand more about our US assurance program approval process

The post Dr Carol Buttle joins Kantara Initiative as Chief Technology Officer (CTO) appeared first on Kantara Initiative.

Friday, 27. September 2024

OpenID

Announcing the Death and the Digital Estate Community Group

By Dean H. Saxe

I am happy to announce the formation of the Death and the Digital Estate Community Group (DADE CG).  DADE CG has been created as a space for the OpenID Foundation and identity community to develop an understanding of how individuals can manage their digital estate in the event of disablement or death.

Before we dive deep into DADE, the problem space, and the outputs of DADE – all of these things will come in the next few weeks – I want to share some history of how I found myself working on this passion project.  In November 2010, a close friend of mine passed away unexpectedly.  His passing rocked my world. We were close in age – both under 40 – with young children and plans for the future.  Our plans to climb Mt. Rainier together never came to pass.    

In the following months and years, I spent a lot of time thinking about death and how I, and others, could manage our credentials for online services in order to allow our friends and family to gracefully manage our digital footprint after death.  Initially, I spent time considering a dead man’s switch – a mechanism that would release access to a set of credentials to a defined individual or individuals after a fixed period of time where the owner didn’t check in to the service. The solution seemed unwieldy and impractical.  

I let this idea sit on the back burner for years.  In those intervening years my career moved into the realm of identity.  

As I saw the rise of stronger authentication factors, and a move toward digital identity documents such as mDLs, the issue became more pressing. In 2022, recognizing the inevitability of passkeys and the impact they would have on the ability of family and loved ones to gain access to a deceased loved one’s accounts, it became clear we had to resolve this gap.

Then, at EIC 2022 in Berlin, Germany, came a late-night conversation with Andrew Shikiar, Khaled Zaky, Tim Cappalli, and Vittorio Bertocci. I shared my crazy idea; we discussed the issue earnestly and agreed that action was needed. Tim – in a way that only Tim could – named it “Dean’s Death Service”. Life happened and, yet again, this idea remained on the back burner.

Nearly 18 months later at Authenticate 2023 and Internet Identity Workshop XXXVII, with the identity community still mourning Vittorio Bertocci’s passing, I leaned on the community to help move the idea of DADE into real action.  People shared with me stories of their friends and loved ones’ difficulties managing the deceased’s digital estate.  Others shared their own methods for trying to preserve their digital estates in advance of their passing.  

I’ll be honest – I’m not comfortable with talking about death.  I’m not sure most of us are.  As identity practitioners, it is up to us to define the use cases, the risks, the privacy implications, and – most importantly – the benefits of standardized mechanisms to allow individuals to choose how their digital estate is managed during disablement or after death.  The work we do enables the systems that power modern identity.  We must consider how that identity persists in a digital form after death.

Finally, I must extend my appreciation and gratitude to the people who helped make this a reality: Ian Glazer, Mike Kiser, Kaliya Identity Woman Young, Arynn Crow, Nishant Kaushik, George Fletcher, Gail Hodges, Teresa Wu, Pamela Dingle, Tim Cappalli, Andrew Shikiar, Khaled Zaky, Eve Maler, Andi Hindle, and Jeremy Grant. Thank you all for the discussions, stories, and feedback that helped get DADE CG off the ground.

Welcome to the Death & the Digital Estate Community Group.  Over the next few weeks I’ll take care of background administrivia to get the group bootstrapped.  Please join the DADE CG mailing list to keep up with the latest information about DADE CG.  I look forward to our shared work!

About the OpenID Foundation

The OpenID Foundation (OIDF) is a global open standards body committed to helping people assert their identity wherever they choose. Founded in 2007, we are a community of technical experts leading the creation of open identity standards that are secure, interoperable, and privacy preserving. The Foundation’s OpenID Connect standard is now used by billions of people across millions of applications. In the last five years, the Financial Grade API has become the standard of choice for Open Banking and Open Data implementations, allowing people to access and share data across entities. Today, the OpenID Foundation’s standards are the connective tissue to enable people to assert their identity and access their data at scale, the scale of the internet, enabling “networks of networks” to interoperate globally. Individuals, companies, governments and non-profits are encouraged to join or participate. Find out more at openid.net.

The post Announcing the Death and the Digital Estate Community Group first appeared on OpenID Foundation.


Me2B Alliance

Identity Resolution and the Big Dogs

In our recently published research on the worldwide web of commercial surveillance, we took a close look at the global infrastructure connecting and correlating personal information across platforms, devices, and even from physical-world sources like point-of-sale systems. The connectivity is, in a word, staggering. At some point, however, there is a first-party relationship with a data subject. From that starting point, personal information is systematically being shared with countless entities, including data brokers. In such a hyper-interconnected infrastructure, how can a single publisher make promises about where customer data is going? Moreover, how could a user possibly consent to the sharing of their data with thousands of recipient organizations?

But the complexity and unknowability of system behavior isn’t limited to these hyper-interconnected marketing networks. As we touched on in a recent podcast with Zach Edwards, very large platforms (like Google, Facebook, and X) are just as complex and opaque as the identity resolution and customer data platform networks. Software is increasingly a leaky, hyper-connected, unpredictable sieve of personal data sharing.

In this blogpost, we take a closer look at the opacity and leakiness of the “big dogs” – large online platforms with hundreds of millions and billions of users.

1. Types of Commercial Identity Resolution 

I’ve been digging into this more since the publication of our research on identity resolution and customer data platforms and have revised my framing of identity resolution. To wit, I observe three co-existing types of commercial identity resolution architectures or systems happening in the world: 

The first one I call distributed by design. This is the LiveRamps, The Trade Desks, mParticles, etc. of the world. These systems enjoy the power of massive data aggregation with [too] little of the risk and responsibility, as they are designed to be third parties relative to the data subject. These platforms are architected to ingest and process (resolve) personal information from a disparate array of services and devices.

The second one I call company-centric. This is the “big dog” platforms with millions or billions of users; the universes unto themselves. A company-centric identity resolution system can also be distributed by design in the sense that it provides numerous small pieces of functionality which can be embedded as third-party resources into other companies’ apps and websites, allowing the big dogs to collect data on external users despite not necessarily having direct relationships with them. Microsoft is a good example of this. It’s also true that company-centric identification schemes can be and are ingested by distributed systems like LiveRamp. The lines implied by these two categories are fuzzy.

The third one I call standardized. This is the hiding-in-plain-sight, globally coordinated efforts in Unified ID 2.0 and European Unified ID. Note that these efforts are championed primarily by distributed-by-design identity resolution and customer data platforms. Scanning the partners of just the Unified ID 2.0 standard is enough to give one pause: these are the platforms that want to know who you are and what you’re doing at all times. Notably absent are the big dogs.

A brief word about national/governmental identification schemes, like India’s Aadhaar and the US Internal Revenue Service’s ID.me: these systems operate somewhat like a big dog company-centric identification system, orchestrating personal information across their own services, with the exception that we don’t expect these systems to be either ingesting or sharing data with external, commercial platforms [1].

At Internet Safety Labs (ISL), we rate “big dog” platforms as critical risk “data aggregators” [2]. We do so for the following two reasons:

These corporate entities monetize personal information, either through ownership of advertising platforms, the selling of audience information, or other monetizing behaviors, and
These entities run multiple consumer products and services with inadequate transparency of how personal information flows across product lines.

The remainder of this post takes a closer look at Google and Facebook (Meta) personal data strategies and why they’re so risky. 

1.1 GAIA and Narnia: Google’s Universal Identification and Cross-Product Personal Data Aggregation Grand Plan 

In the wake of the recent Google search antitrust case in the US, Jason Kint published a long thread on a recently unsealed 325-page Google strategy document. The document titled “Display, Video Ads, Analytics and Apps” contains a coordinated and synthesized set of business strategies describing how Google can: 

More effectively coordinate the extraction of user information,
Better leverage user data across all of their AdTech, and
In general, increase ad revenues across its entire portfolio of products and services: “make it easier to add monetization to other Google O&O [owned and operated] properties.” [3]

The document also covers how Google doesn’t make as much money from sites it doesn’t own and would like to assert its control to make them more like sites it does own, thereby increasing revenues.

Nearly every product line’s strategy contained in the document mentions the use of “GAIA signals” or “GAIA data”. GAIA is Google’s proprietary “universal ID” [4]. The plan clearly outlines how Google can better utilize the massive trove of personal information joined by their GAIA “universal IDs”, amassed across their various owned and operated (O&O) properties, like Gmail and Chrome to name two of the largest. A highlight from page 126 (the section on “Smart Campaigns”) makes clear Google’s intention to share user information across all its properties to enrich their advertising services (projects Narnia and Narnia2).

But it’s not enough to join user data across Google properties; the document also indicates an intention to join external data sources, such as streaming and TV ad networks (pg 150), describing the ingesting of external customer data and resolving that data (i.e. identity resolution) to Google’s GAIA IDs.

Overall, the document describes an organization-wide, orchestrated plan to amass and unify user data (via GAIA IDs) to better leverage Google ads (Narnia 2.0) for both internal and external properties. How can Google users understand – never mind consent to – the use of their personal information in this wide-reaching way?

1.2 Facebook Admits Unknowability of User Data Processing 

One of my favorite references for explaining why the world needs software safety labels is this story about two Facebook architects explaining how it’s virtually impossible for Facebook to know where user data is going. The complexity and dynamism of software is making it so it’s not a bounded system—and it’s never the same river twice.   The story came out two years ago and I recently read the discovery document written in April 2021 and it is really good. This excerpt outlines the fundamental problem of the unknowability of Facebook software’s behavior:  And this: The discovery document contains fascinating information on what Facebook must do to track personal data usage within its system [implement Curated Data Sources and a Purpose Policy Framework], and it’s a massive undertaking: 450-750 engineer years over three calendar years. And even that’s not enough. It also requires “heavy Ads refactoring and adoption work in warehouse.”  Let’s go back to that “closed form system” described by the Facebook engineers. It comes from mathematics’ “closed-form expression”, describing an equation comprised of “constants, variables and a finite set of basic functions connected by arithmatic operations and function composition.”5 If we look at realtime bidding as one example of a programmed system, we see that it is necessarily dynamic and unbounded. The participants (buyers) in the realtime bidding network are dynamic; also the ad inventory itself is dynamic. Realtime bidding is, by design, never the same river twice. The system is not a closed form system.   Machine learning (ML) is another example: virtually all of the ML technologies generating much recent hype are also not closed form systems by design. They are constantly changing based on the training set, based on ongoing learning, and based on dynamic rule-making.  

2. Have We Agreed to Be Always Known and Tracked Online? 

To summarize the situation: industry has developed techniques (distributed by design and company-centric) to interconnect and aggregate personal information such that we are always known and tracked online. As noted in the earlier mentioned research paper, there are at least $9T (as in trillion) worth of industries that want to know who we are and what we’re doing at all times. It’s unlikely that we can stop this financially motivated juggernaut of universal identification. So what’s to be done?

2.1 To Do List

Consent is dead. It’s impossible, and the more we pretend it’s possible to have informed consent when it comes to the unbounded nature of software, the more we are lying to ourselves.

Privacy policies protect companies but not the people who use technology. Know how you’ve consented into the worldwide web of commercial surveillance? It’s through this phrase found in many privacy policies: “…and we [may] share your data with our marketing partners.”

We need more exposure of actual measured software behavior (ala ISL’s App Microscope: https://appmicroscope.org/app/1579/). One day, it will be possible for systems to generate machine-readable records of processing activities – a kind of passport stamp showing how your data was processed (used by the first party, shared with and used by third parties). This will be a landmark moment in empowering people through transparency of actual system behavior. (A hypothetical sketch of such a record follows this list.)

Data broker regulation is inadequate. If a platform has your data, it should de facto have a first-party relationship with you, and as such, you are entitled to all the proactive data governance rights allowed to you. In other words, nothing about me, without me. Data brokers aren’t and never have been just third parties. Note that these data rights are unexercisable if people don’t know that they’re actually in a relationship with a particular platform. Thus, there also needs to be a requirement for these platforms to proactively notify all data subjects for which they hold information.

Is the selling of personal information safe for humans and humankind? We’ve agreed as a society that certain things are sacrosanct and the selling of which unacceptably degrades and devalues them (such as votes, organs, children). We need to have a much deeper think about whether or not personal information should fall into that category.

Are data broker laws effective in their current form? It seems clear to ISL that not all actual data brokers are currently registered in the states requiring registration.

Privacy and safety experts – and perhaps regulatory experts – need to get more aware of and involved in the two universal commercial identification standards (Unified ID 2.0 and European Unified ID) pronto.

Identity resolution platforms and customer data platforms demand substantially more regulatory attention. Minimally, the massive troves of personal information are ripe for data breaches. Maximally, the public needs assurances that platforms amassing this data are held accountable.
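For illustration only, here is one hypothetical shape such a “passport stamp” record might take. Every field name below is an assumption for the sake of the example, not a proposed standard.

```typescript
// Hypothetical sketch of a machine-readable "record of processing activities"
// as described above: a passport-stamp-like entry showing how one piece of
// personal data was used. All field names are illustrative assumptions.

interface ProcessingRecord {
  dataSubject: string;        // pseudonymous reference to the person
  dataCategory: string;       // e.g. "email-address", "location"
  processor: string;          // entity that touched the data
  processorRole: "first-party" | "third-party";
  purpose: string;            // declared purpose of the processing
  sharedWith: string[];       // downstream recipients, if any
  timestamp: string;          // ISO 8601 time of the processing event
}

// One "stamp": a first party sharing an email address with a marketing partner.
const example: ProcessingRecord = {
  dataSubject: "subject:4f2a-example",
  dataCategory: "email-address",
  processor: "example-publisher.com",
  processorRole: "first-party",
  purpose: "ad-measurement",
  sharedWith: ["marketing-partner.example"],
  timestamp: new Date().toISOString(),
};

console.log(JSON.stringify(example, null, 2));
```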

 

Footnotes:

[1] Note that ISL has not confirmed this.
[2] List of ISL designated data aggregators at the time of this writing: Adobe, Amazon, Apple, Google, Meta, Microsoft, and X.
[3] https://storage.courtlistener.com/recap/gov.uscourts.vaed.533508/gov.uscourts.vaed.533508.1132.2_1.pdf, page 7.
[4] See ISL paper on Identity Resolution and Customer Data Platforms for more information on universal identification schema.
[5] https://en.wikipedia.org/wiki/Closed-form_expression

The post Identity Resolution and the Big Dogs appeared first on Internet Safety Labs.

Thursday, 26. September 2024

FIDO Alliance

Webinar: NIST SP 800-63 Digital Identity Standard: Updates & What it Means for Passkeys

The fourth revision of the draft NIST SP 800-63-4 Digital Identity Guidelines is now open for public comment.

The FIDO Alliance hosted a webinar on September 24, 2024, with top digital identity experts to discuss the latest updates to the standard and what they mean for passkeys.

Megan Shamas, CMO of the FIDO Alliance, was joined by guests Ryan Galluzzo, Digital Identity Program Lead at the NIST NCCoE, and Teresa Wu, co-chair of the FIDO Alliance Government Deployment Working Group and VP of Smart Credentials at IDEMIA. The panel unpacked the latest changes to the draft and shared what they mean for passkeys.

Webinar attendees also had an opportunity to get questions answered before the public comment submission deadline next month. NIST requests that all comments be submitted by 11:59 pm Eastern Time on October 7, 2024.

Watch the presentation below.


MyData

Welcome to the new board of MyData Global!

Author: Christopher Wilson, Executive Director at MyData Global. I am very happy to welcome the new members of the board of MyData Global. Their appointment marks a new era for […]

Digital Identity NZ

Spring Clean | September Newsletter

Kia ora,

Recent weather extremes in Aotearoa remind us of life’s delicate balance. As I gaze at our last daffodils and watch the remaining lamb playing in the paddock – having sadly lost two to the cold earlier in the month – I’m reminded of the circle of life. This theme resonates with the recent closure of the Open Identity Exchange (OIX) at the end of August, a significant player in the digital identity industry, which also faced a challenging financial climate.

Fortunately, Digital Identity NZ is thriving, with more organisations joining our mission. We’re excited to welcome 3Plus Consulting, Arrowhead, BeingAI, PaymentsNZ, Qubit Cybertech, Happy and Voco as new members. A heartfelt thank you to all our members who contribute to our mahi.

In recognition of International ID Day, individual member Vica Papp shared a blog post highlighting its significance. We continue to make progress, as seen in the PaymentsNZ – DINZ Digital Identity May 2024 sprint report, with more updates on collaborations to come.

I had the honour of being a panellist alongside the mighty Holly Rennie, Ralph Bragg and Adrian Smith at FSC24 on September 4. We explored how global innovation is shaping New Zealand’s future in ‘FinTech Innovation and Open Banking’. Don’t forget to take advantage of our 10% discount offer for DINZ news readers to attend The Point 2024 – thank you Payments NZ! More information below.

Still basking in the glow of Digital Trust Hui Taumata 2024, we’re excited to share the wealth of content including the Opening Keynote on Trust Frameworks and an Innovation Spotlight on identity-centric solutions for enterprise security. Attendees received links to the presentations, and our Coffee Chat attendees engaged with Slido questions posed to our speakers and panellists – sparking great discussions! Look out for more insights from the Hui coming soon.

Recently, DINZ members AWS, Worldline and Xebo, along with myself, met with Minister Collins to share our observations on the landscape. We left the meeting with a clear understanding of her expectations, reflecting a very positive engagement.

Looking ahead, we’re entering DINZ’s annual election cycle for the Executive Council. As we begin this process, we want to take a moment to sincerely thank and recognise our outgoing Councillors. Their contributions have been invaluable in shaping the direction and growth of our community.

We eagerly look forward to welcoming the next cohort of passionate members, ready to step into these important roles. This is the essence of spring – embracing new opportunities and growth while acknowledging what has brought us to this point.

Ngā mihi

Colin Wallis
Executive Director, Digital Identity NZ

Read the full news here: Spring Clean | September Newsletter

SUBSCRIBE FOR MORE

The post Spring Clean | September Newsletter appeared first on Digital Identity New Zealand.

Wednesday, 25. September 2024

DIF Blog

DIF Hackathon 2024: Sponsors and Challenges Revealed

DIF is excited to announce our impressive lineup of 2024 Hackathon sponsors and provide a preview of our challenges. This year’s hackathon promises to push the boundaries of decentralized identity innovation with challenges focused on education, reusable identity, frictionless travel, and more!

Meet Our Sponsors

We are honored to have the following organizations supporting this year’s event, contributing challenges, prize pools, and guidance for participants.

Gold Tier Sponsors

Jobs for the Future Foundation & Digital Credentials Consortium: Driving the Future of Education & Workforce track, they invite you to explore a future where access to education is available to any learner and where education opens the door to economic advancement.
Pinata: Specializing in file-based identity solutions, Pinata offers challenges that use verifiable credentials and immutable content to create secure and innovative access control systems.
Privacy + Scaling Explorations: Focusing on advanced cryptographic techniques like Zero-Knowledge Proofs (ZKPs), Privacy + Scaling Explorations aims to improve privacy and security in SSI solutions.

Silver Tier Sponsors

ArcBlock invites you to use its developer-friendly framework and tools (including the No-code AI Application Studio) to build Decentralized Identifiers (DIDs) and VCs, creating privacy-preserving, secure identity solutions. Standout projects will have the opportunity to be showcased in ArcBlock's Blocklet Store.
Truvity offers challenges focused on streamlining digital identity management, helping users navigate complex processes like eKYC with seamless integration of digital wallets and VCs. The challenges focus on building innovative applications that simplify the journey of individuals like Miko, an expat moving to Amsterdam, by leveraging digital wallets, to-do lists, and interlinked VCs.
Vidos offers two challenges - one focuses on decentralized identity solutions for recruitment and employee onboarding using secure, verifiable credentials. The other explores reusable identities and credentials for multiple scenarios, e.g. a single passport credential used for both travel and age verification.

Bronze Tier Sponsors

Anonyome focuses on privacy-preserving Personhood Credentials (PHCs), allowing users to interact anonymously while maintaining control over their data.
Cheqd's challenge, "Harnessing Decentralized Identity for Verifiable AI," invites participants to build solutions that address societal and technical challenges arising from the rise of generative AI.
Crossmint challenges participants to create reusable identity solutions for KYC/KYB, ensuring compliance and security with minimal friction across platforms.
NetSys aims to streamline frictionless travel experiences, using verifiable credentials to make traveling more secure and seamless.
Ontology focuses on building decentralized authentication systems, using DIDs and VCs to create reusable identity solutions for secure logins and verification.
TBD/Block offers challenges in reusable identity and decentralized storage, simplifying KYC processes through verifiable credentials and decentralized web nodes.

Tooling Sponsors

We’re excited to announce that Trinsic is providing its global document and identity verification tooling for free to hackathon participants. Participants who want IDV as part of their wallet onboarding, reusable identity creation, etc. can optionally integrate this into their solutions.

What You’ll Be Building

This year’s hackathon offers challenges that align with DIF’s mission of advancing privacy, security, and interoperability in decentralized identity systems. 

We previously announced our original hackathon tracks, including Frictionless Travel, Future of Education & Workforce, and Reusable Identity.

We are thrilled to add three more challenge focuses thanks to our generous sponsors: File-based Identity Solutions, Personhood Credentials, and Zero-Knowledge Proofs in Self-Sovereign Identity.

Full challenge details and requirements will be available on our DevPost site, coming October 1 at 9am ET. Highlights are described below.

File-Based Identity Solutions

Create file-based identity solutions using VCs to associate files with users, protect data, and build secure access control systems. Your solutions will explore the use of immutable file storage to ensure data integrity and privacy, with the flexibility of public or private storage options.

Sponsors: Pinata

Frictionless Travel

Build seamless travel systems using decentralized identity technologies that allow travelers to share information securely across booking and transport platforms. Leverage VCs to create frictionless, portable identities for a hassle-free travel experience.

Sponsors: NetSys

Future of Education & Workforce

This track focuses on building solutions that empower individuals to control their education and employment data. Participants will showcase transformational experiences enabled by verifiable learner/worker IDs and records, develop tools for multilingual credentials and browser integrations, and enhance existing tools like Learner Credential Wallet and VerifierPlus. The challenge emphasizes cross-border recognition and skills verification to open new opportunities for learners and workers globally.

Sponsors: Digital Credentials Consortium, Jobs for the Future, and Vidos

Personhood Credentials (PHCs) and Verifiable AI

These challenges aim to enhance AI safety and trustworthiness using decentralized identity. You'll focus on attestations of AI- and/or human-initiated interactions and content, and combinations such as proof of authorized AI agents. These challenges draw from ideas in the Personhood Credentials (PHCs) paper. PHCs enable anonymous interactions, ensuring user privacy by making digital activity untraceable by issuers and unlinkable across service providers, even in cases of collusion, effectively countering scalable deception.
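To illustrate one building block behind this kind of unlinkability, the sketch below derives a distinct, stable pseudonym per service provider from a single holder-held secret using an HMAC, so two services cannot correlate the same user. This is a generic illustration, not the specific construction from the PHC paper.

```typescript
// One building block behind unlinkable credentials: deriving a distinct,
// stable pseudonym per service provider from a single master secret, so two
// services cannot correlate the same user. Generic HMAC-based sketch only;
// not the construction from the Personhood Credentials paper.
import { createHmac, randomBytes } from "crypto";

const masterSecret = randomBytes(32); // held only by the credential holder

function pseudonymFor(serviceId: string): string {
  return createHmac("sha256", masterSecret).update(serviceId).digest("hex");
}

// The same user looks different to each service, but stable over time:
console.log(pseudonymFor("service-a.example")); // some hex value
console.log(pseudonymFor("service-b.example")); // an unrelated hex value
console.log(pseudonymFor("service-a.example")); // same as the first call
```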

Sponsors: Anonyome, Cheqd, and Pinata
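
One building block behind the unlinkability described above is deriving a different pseudonym per service provider, so that even colluding verifiers cannot correlate a holder’s activity. The sketch below illustrates only that single property (the PHC paper’s full construction also keeps the issuer blind); the key handling and names are illustrative assumptions.

```python
import hashlib
import hmac

def pairwise_pseudonym(holder_secret: bytes, verifier_id: str) -> str:
    # Deterministic per-verifier identifier; without the secret, the values
    # derived for different verifiers look unrelated.
    return hmac.new(holder_secret, verifier_id.encode(), hashlib.sha256).hexdigest()

secret = b"holder-master-secret"  # hypothetical long-term holder secret
id_for_a = pairwise_pseudonym(secret, "https://service-a.example")
id_for_b = pairwise_pseudonym(secret, "https://service-b.example")
assert id_for_a != id_for_b  # colluding services cannot link the two profiles
```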

Reusable Identity 

Develop reusable identity solutions that enable credentials issued for one use (like KYC) to be repurposed for others, such as age verification or employment history. Highlight the benefits of interoperable identity systems across platforms.

Sponsors: Anonyome, ArcBlock, Crossmint, TBD, Truvity, and Vidos
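
As a minimal sketch of the reuse pattern, assume an issuer signs a KYC credential once and two unrelated verifiers check it for different purposes. This uses Ed25519 from the third-party `cryptography` package; the claim names and trust setup are illustrative assumptions rather than any sponsor’s API.

```python
import json
from datetime import date
from cryptography.hazmat.primitives.asymmetric.ed25519 import Ed25519PrivateKey

# Issuer signs the KYC claims once.
issuer_key = Ed25519PrivateKey.generate()
claims = {"subject": "did:example:alice", "birth_year": 1990, "kyc_passed": True}
payload = json.dumps(claims, sort_keys=True).encode()
signature = issuer_key.sign(payload)
issuer_public = issuer_key.public_key()

def verified_claims(payload: bytes, signature: bytes) -> dict:
    issuer_public.verify(signature, payload)  # raises InvalidSignature if tampered
    return json.loads(payload)

# Verifier 1 reuses the credential for an age check...
assert date.today().year - verified_claims(payload, signature)["birth_year"] >= 18
# ...while verifier 2 reuses the same credential for onboarding compliance.
assert verified_claims(payload, signature)["kyc_passed"]
```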

Zero-Knowledge Proofs in SSI 

Build applications that use Zero-Knowledge Proofs to enhance privacy and security in digital identity. You’ll work with advanced cryptographic tools to create privacy-preserving SSI systems. This is a chance to apply cutting-edge technology to real-world identity challenges and set new standards in data privacy and interoperability.

Sponsors: Privacy + Scaling Explorations
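
For orientation, the sketch below shows the classic Schnorr identification protocol made non-interactive with the Fiat–Shamir heuristic: the prover convinces a verifier that they know a secret x with y = g^x mod p without revealing x. The parameters are toy values for illustration only; production systems use vetted groups or curves and reduce the response modulo the group order.

```python
import hashlib
import secrets

p = 2**255 - 19  # a prime; toy choice for illustration only
g = 2

x = secrets.randbelow(p - 1)   # prover's secret
y = pow(g, x, p)               # public value y = g^x mod p

# Prover: commit to a nonce, then derive the challenge by hashing the transcript.
r = secrets.randbelow(p - 1)
t = pow(g, r, p)
c = int.from_bytes(hashlib.sha256(f"{g}:{y}:{t}".encode()).digest(), "big")
s = r + c * x                  # response (left unreduced to keep the sketch simple)

# Verifier: recompute the challenge and check g^s == t * y^c (mod p).
c_check = int.from_bytes(hashlib.sha256(f"{g}:{y}:{t}".encode()).digest(), "big")
assert pow(g, s, p) == (t * pow(y, c_check, p)) % p  # convinced, yet x stays hidden
```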

Key Dates

Hacking Period: October 1, 2024 (6:00 am PT) – November 4, 2024 (3:00 pm PT)
Educational Sessions: October 1, 2024 – October 10, 2024
Sponsor-Run Office Hours: October 14, 2024 – November 1, 2024
Judging Period: November 7, 2024 – November 17, 2024
Winners Announced: November 20, 2024 (9:00 am PT)

How to Get Involved

Pre-registration as a participant is open now! Don’t miss your chance to be part of this exciting event. Register here to secure your spot as a participant.
The DevPost site goes live on October 1, 2024, where you’ll learn the details of the challenges, submit your projects, and track the competition.
Educational sessions start October 1, with sponsors leading workshops to provide technical guidance. Register to attend the opening ceremony.
Office hours will run from October 14 to November 1, offering participants the chance to ask questions directly via DIF’s Discord channel.
Join the DIF Hackathon community via the DIF-Hackathon Discord.

Prizes and Recognition

In addition to cash prizes totaling approximately $70,000, participants will have the chance to network with other decentralized identity builders and experts, participate in educational sessions, and receive public recognition.

Join Us in Shaping the Future of Decentralized Identity

This hackathon is your opportunity to innovate and build solutions that can change the world of identity as we know it. Whether you’re tackling privacy challenges, building reusable identity systems, or reimagining the future of education and workforce, the DIF Hackathon is the place to be.

Pre-register now and be part of the next wave of decentralized identity innovation!


Next Level Supply Chain Podcast with GS1

How Modern Barcodes Keep Supply Chains in Check with Rich Eicher

Imagine if barcodes not only speed up your grocery checkout but also transform logistics, healthcare, and the overall efficiency of global supply chains.

In this episode, hosts Reid Jackson and Liz Sertl are joined by Rich Eicher, Director of codeREADr. With his extensive experience in barcode innovation, Rich shares insights into how modern camera-based barcode readers surpass traditional laser readers and why dedicated barcode scanning devices are preferred in specific environments.

Rich explains barcodes' critical role in various business applications, from facilitating accurate inventory management to preventing costly supply chain errors. He also elaborates on the industry's adaptation to consumer demands, the significant challenges of barcode inaccuracies and their impact on delivery services, and how advancements in AI and ChatGPT are poised to revolutionize data capture and processing across industries.

In this episode, you’ll learn:

The differences between laser and camera-based barcode readers commonly used in grocery stores.

The importance of barcodes in various business applications and the issues caused by barcode discrepancies in the supply chain.

The upcoming GS1 Sunrise 2027 initiative and its transition to QR codes for enhanced data capture.

Jump into the Conversation:

[00:00] Introducing Next Level Supply Chain

[01:01] Who Rich Eicher is, and what he does

[03:08] The different barcodes and their significance

[10:13] All about laser reading barcodes

[13:12] The importance of using barcodes and why companies are shifting to using them more

[16:22] The problems that come along with not using barcodes

[22:16] Other trends happening outside of barcodes

[27:22] Rich’s favorite technology he is using right now

[29:51] Closing

Connect with GS1 US:

Our website - www.gs1us.org

GS1 US on LinkedIn

Connect with the guest:

Rich Eicher on LinkedIn


Digital Identity NZ

DINZ Hui Keynote: Juliana Cafik on Digital Identity Trust Frameworks

At the Digital Trust Hui Taumata in August, Juliana Cafik, Microsoft’s Identity Standards Architect, delivered a thought-provoking keynote on global digital identity frameworks. With over 27 years of experience, Juliana provided valuable insights into how trust frameworks enable secure, interoperable digital transactions worldwide.

What are Digital Trust Frameworks?

Trust frameworks establish rules for:

Ensuring secure digital interactions
Verifying identities in both public and private sectors
Applying region-specific levels of assurance for various transactions

Global Examples of Trust Frameworks

eIDAS (European Union)

Goal: Enable cross-border identity verification for 80% of EU citizens by 2030
Focus: Electronic signatures, digital wallets, and consistent identity proofing across member states.

DIACC (Canada)

Focus: Economic growth through the adoption of digital identity
Approach: Public-private collaboration to ensure compliance and usability

New Zealand’s Unique Position

Juliana praised New Zealand’s approach, emphasising these key elements:

Focus on Safety: New Zealand uniquely integrates safety into its trust framework
Key Framework Pillars: Identity management, privacy, security, data management, and facilitation
Collaborative Potential: Strong public-private partnerships can enhance adoption

Challenges and Opportunities

Global Challenges: Trust frameworks are complex, evolving alongside technology and regulations.
Opportunity for New Zealand: Learning from other countries’ implementations, New Zealand can lead in trust framework innovation.

Conclusion

Juliana encouraged New Zealand to embrace its potential by fostering collaboration across sectors to build a robust, trusted framework that supports digital identity verification and secure interactions.

The post DINZ Hui Keynote: Juliana Cafik on Digital Identity Trust Frameworks appeared first on Digital Identity New Zealand.


DINZ Hui Innovation Spotlight: Why Identity-Centric Security is Crucial for Enterprise Protection

In today’s digital landscape, data breaches and cyber threats are rapidly evolving. At a recent conference, Marc Airo-Farulla, Regional Sales Director of Entrust, discussed the importance of identity-centric security solutions as the cornerstone for enterprise data protection. He highlighted how identity management is reshaping cybersecurity strategies and shared insights on how organisations can safeguard their data effectively.

Shifting Threat Landscape

Traditional security methods are no longer enough: Perimeter defences like firewalls fail to address evolving threats.
Phishing remains a top concern: 85% of security breaches stem from phishing attacks, often exploiting employee vulnerabilities.

The Role of Identity Access Management (IAM)

Identity is now the frontline defence: Managing user access, including employees and contractors, is critical to mitigating risk.
Fragmentation challenges: As more systems and technologies are introduced, IAM becomes harder to manage and protect, leading to vulnerabilities.

Zero Trust Frameworks

Adopt a zero-trust mindset: Zero trust means no user or device is trusted automatically, even within the organisation’s network.
Data security first: Protecting sensitive data is key, ensuring minimal access and maintaining strict oversight of who has access to what.

Real-World Breach Examples

Marc shared a cautionary tale of an Australian e-subscription company that collapsed after a cyberattack. Within two months, the business was wiped out, emphasising the dire consequences of inadequate cybersecurity.

The Future of Security with Entrust

Investing in live identification solutions: Entrust has been at the forefront of developing future-proof security systems, such as its partnership with Onfido for live identification.
Securing digital assets: Tools like hardware security modules (HSMs) can safeguard critical business data.

Best Practices for Enterprises

Educate and train employees: From C-suite executives to the newest team members, everyone needs to understand the importance of security measures.
Limit data collection: Only store what is necessary, minimising the risk in case of a breach.

Identity-centric security solutions are the future of enterprise protection. By shifting the focus to identity management and implementing zero-trust frameworks, organisations can better protect their digital assets and reduce the risk of devastating breaches.

Thanks to Entrust for this insightful talk, and for supporting the Digital Trust Hui Taumata in 2024.

The post DINZ Hui Innovation Spotlight: Why Identity-Centric Security is Crucial for Enterprise Protection appeared first on Digital Identity New Zealand.


International Identity Day 2024: Why Legal Identity Matters for Everyone

Observed annually on 16 September, International Identity Day aligns with the United Nations’ Sustainable Development Goal (SDG) 16.9, which aims to provide legal identity, including birth registration, to all people by 2030. While most of us can easily prove our identity, for millions around the world, the lack of legal identity remains a significant barrier to accessing even the most basic services.

If you’re reading this, the chances are that you have a legal identity and can prove it. But without one, life is vastly different. Without legal documentation, children may miss out on vaccinations and education, adults are unable to secure formal employment, access healthcare or welfare, vote in elections, start a business, use banking services, travel abroad, register their children’s births, or even claim inheritance or pensions. In effect, you don’t officially “exist.”

The Global Picture: Millions Left Behind

In 2021, the World Bank’s ID4D Global Dataset reported that over 850 million people worldwide had no way to prove their identity. Around 540 million of these individuals were in Africa, and half of all women in low-income countries lacked identification. Progress has likely been made since then, but its extent will be difficult to gauge until the next dataset is published.

Globally, civil registration programmes are being increasingly integrated with healthcare systems to ensure children are enrolled early. National birth registration initiatives have been launched or accelerated in countries like Cameroon, Zimbabwe, Nigeria, and Papua New Guinea. This issue is not confined to low-income nations, as high-income countries, including Australia, are also making strides in streamlining identity processes. A recent pilot in New South Wales, for instance, allowed parents to register the birth of their baby across federal and state government agencies using a single account. This “tell us once” approach eliminates the need for parents to interact with multiple government agencies—reducing up to seven separate interactions.

The Challenges Closer to Home

Whilst Aotearoa New Zealand may seem far removed from these global statistics, we are not without our own identity challenges. Certain groups in our population struggle to prove who they are, including rural communities, blind and deaf citizens, former refugees, unhoused people, those who have escaped domestic abuse, and individuals recently released from prison. These groups often remain legally invisible. This raises the question: can Aotearoa help address this before 2030, or will we run out of time to meet the goal?

Take Action and Learn More

International Identity Day serves as a reminder that the right to identity is fundamental. Without it, people are denied basic human rights. For a deeper understanding of this global issue and the progress being made, we recommend listening to the ID16.9 podcast, which offers valuable insights into the state of identity inclusion around the world. The podcast is available on Spotify, Apple, Google, or at ID16.9 podcast.

As we move towards 2030, it is clear that achieving universal legal identity is not just about meeting a target set by the UN. It’s about unlocking access to opportunities, dignity, and human rights for everyone. Let’s continue to push for meaningful change, both here in Aotearoa and globally.

By Vica Papp

The post International Identity Day 2024: Why Legal Identity Matters for Everyone appeared first on Digital Identity New Zealand.

Tuesday, 24. September 2024

Hyperledger Foundation

Sunsetting Tessera and Simplifying Besu

Many great use cases are served by Besu in the blockchain space. We are proud of this engagement with public Ethereum, private networks, enterprise, L2s, and more. However, the Besu code base has become a monolithic Swiss Army knife.


Elastos Foundation

Elastos Incorporates BukProtocol; Decentralized Commerce for the Travel Sector Now Direct in Bitcoin

White label service to enable travel agencies to bring the benefits of decentralization to travel suppliers and consumers
Agreements traded via Bitcoin-denominated Smart Contracts facilitated through Elastos’ BeL2 protocol
Partnership highlights opportunity for Real World Assets (RWA) to eliminate friction and inefficiencies in the travel sector

Singapore: September 24, 2024 – Elastos today announced a partnership with the Real World Asset (RWA) application (dApp), BukProtocol, to extend decentralization to the travel sector, direct in native Bitcoin.  

BukProtocol converts travel bookings and other agreements into tokenized, fully transferable ‘Dynamic Assets’ which can subsequently be monitored, exchanged or traded in the event of cancellation or itinerary change. Tokenization covers all aspects of the travel experience, from transport to accommodation, and can potentially extend to other travel-related services such as hospitality, guides or local attractions.

Through Elastos’s BeL2 protocol, the tokenization process will be completed directly in native Bitcoin, to maximize the integrity, security and liquidity of the resulting Smart Contracts. Travel is notoriously unpredictable, with itineraries, schedules and routes subject to change at the last minute, often due to unforeseen incidents whose consequences can range from unused accommodation to multiple reimbursement claims.
Tokenization – and the transparency offered by decentralization – has the potential to mitigate much of the resulting friction faced by travelers and suppliers alike.

To date, over 2.2 million properties have been on-boarded onto the BukProtocol system, including rooms from brands such as The Hilton Group, Marriott and Wyndham Hotels & Resorts. BukProtocol is available as a white label service for travel suppliers and agencies to better manage their bookings and inventories, which can subsequently be traded across Web2 or Web3 marketplaces. 

Arul Prakesh, Founder and CEO of BukProtocol, explains that the Elastos partnership is about much more than reaching new audiences for its dApp.

“While a presence within the Elastos ecosystem will certainly boost our visibility and reach, what’s really compelling is BeL2’s potential to complete the entire tokenization process directly in Bitcoin, a token that most of our audiences are familiar and comfortable with. Bitcoin denomination also maximizes the liquidity – and ‘tradability’ – of resulting assets; a crucial consideration for users,” he says. 

Elastos’s BeL2 Protocol enables the tokenization of any travel-related experience – from a journey to accommodation – based on terms defined in a Bitcoin-assured Smart Contract. Thanks to the protocol, this process can be completed without bridging, wrapping or otherwise interfering with the Bitcoin layer; this both assures the integrity of the currency and avoids network congestion and additional fees that would otherwise result.  
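
BeL2’s actual contract interfaces are not spelled out in this announcement, so the following Python sketch is purely conceptual: it illustrates the underlying idea of a booking as a transferable record whose agreed terms are committed to by hash, so any party can later verify them without an intermediary. All names and fields are hypothetical.

```python
import hashlib
import json

class BookingToken:
    """Conceptual transferable booking record; not BeL2's real interface."""

    def __init__(self, owner: str, terms: dict):
        self.owner = owner
        # Commit to the terms so they cannot be quietly altered later.
        self.terms_hash = hashlib.sha256(
            json.dumps(terms, sort_keys=True).encode()
        ).hexdigest()

    def transfer(self, current_owner: str, new_owner: str) -> None:
        if current_owner != self.owner:
            raise PermissionError("only the current owner may transfer")
        self.owner = new_owner

    def terms_match(self, claimed_terms: dict) -> bool:
        digest = hashlib.sha256(
            json.dumps(claimed_terms, sort_keys=True).encode()
        ).hexdigest()
        return digest == self.terms_hash

token = BookingToken("alice", {"hotel": "Example Inn", "refundable_until": "2024-12-01"})
token.transfer("alice", "bob")  # itinerary changed; the booking moves to a new holder
assert token.terms_match({"hotel": "Example Inn", "refundable_until": "2024-12-01"})
```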

Jonathan Hargreaves, Elastos’ Global Head of Growth, describes the partnership as the perfect intersection between Bitcoin tokenization and Real World Assets (RWA). 

“BeL2’s unique ZK-proof process – ensuring complete interoperability while leaving the Bitcoin layer entirely untouched – means that Bitcoin’s integrity is fully leveraged throughout the tokenization process.  In practice, this means that members of the travel community can trade assets (Smart Contracts) directly with each other, completely eliminating the need for intermediaries and the inefficiencies that inevitably result.  The BUK Protocol is really demonstrating the practical – and exciting – opportunities that can emerge from decentralization; within a sector that’s characterized by unpredictability and sub-optimized inventories,” says Jonathan.

About BukProtocol

BukProtocol is a solution stack to create secondary markets for Dynamic Assets. We enable flexibility and liquidity for RWAs, allowing users to buy event tickets, hotel room bookings, airline bookings, and other assets and services that carry an expiration date.

Discover more: https://bukprotocol.io/

About Elastos

Elastos is a public blockchain project that integrates blockchain technology with a suite of reimagined platform components to produce a modern Internet infrastructure that provides intrinsic protection for privacy and digital asset ownership. The mission is to build accessible, open-source services for the world, so developers can build an internet where individuals own and control their data.

The Elastos SmartWeb platform enables organizations to recalibrate how the Internet works for them to better control their own data.

Media Contact

Roger Darashah

roger.darashah@elastoselavation.org 


Blockchain Commons

Results of Second FROST Round Table Published

On September 18, 2024, Blockchain Commons held its second Round Table on FROST. Almost twenty expert cryptographers, designers, and developers came together to share the challenges and successes they’ve had with FROST over the last year as well as the advances of their differing approaches.

A full log of the meeting is now available, including video, rough transcript, rough summary, and PDFs of all of the presentations.

Our next FROST meeting will be a FROST Developers meeting, focused on helping wallet developers to implement FROST (and why they might want to). It’s scheduled for December 4th. Sign up for our Gordian Developers mailing list or Signal channel to receive an invite.

Thank you to HRF for supporting Blockchain Commons’ FROST work in 2024.

Monday, 23. September 2024

DIF Blog

Launch of the DIF Africa Special Interest Group

Dear DIF Community,

We are excited to announce the launch of the Decentralised Identity Foundation (DIF) Africa Special Interest Group (SIG). This initiative represents a significant milestone in our collective efforts to promote, advance, and support the development and adoption of decentralised identity technologies and standards across Africa.

About the DIF Africa SIG:

The DIF Africa SIG has been established to foster collaboration among African organisations, institutions, and individuals who are passionate about decentralised identity technologies. Operating under the Decentralised Identity Foundation (DIF), the SIG aims to create an open ecosystem for decentralised identity, focusing on Africa’s unique requirements and use cases.

Purpose and Scope:

The primary goals of the DIF Africa SIG include:

Promoting awareness, understanding, and education about decentralised identity within the African community.
Encouraging collaboration among stakeholders to address specific issues, opportunities, and use cases relevant to the African market.
Contributing to the development of decentralised identity standards with an emphasis on Africa’s needs.
Advocating for the adoption of decentralised identity solutions among African enterprises, government bodies, and individuals.

The SIG will engage in various activities, including technical collaboration with other DIF working groups, outreach and education initiatives, and advocacy efforts to drive the adoption of decentralised identity technologies across the continent. We are committed to ensuring that the solutions developed are interoperable and tailored to the African ecosystem.

Membership and Leadership:

Membership in the DIF Africa SIG is open to any organisation, institution, or individual interested in supporting the mission and objectives of the SIG. We encourage broad participation from African entities and experts, regardless of DIF membership status.

We are pleased to announce the leadership of the DIF Africa SIG, with Gideon Lombard from DIDx serving as the Chair, and Jack Scott-King, representing VERA, as the Co-Chair. Together, they will lead the group’s activities, represent the SIG within DIF, and ensure alignment with the charter.

Inaugural Meeting:

We invite you to join us for the inaugural meeting of the DIF Africa SIG on 16 October from 1:00 to 2:00 PM South African Standard Time. This meeting will be a great opportunity to discuss our goals, outline our roadmap, and explore ways to get involved.

Link: https://calendar.app.google/8QVaRaFP9U1YQ9Ms8

Meetings and Communications:

The SIG will conduct its meetings primarily through teleconferences, email lists, and online collaboration tools, holding regular sessions to discuss progress and share updates. We will adhere to the DIF Code of Conduct in all our activities.

We are eager to begin this journey and invite you to participate in the DIF Africa SIG. We look forward to your engagement and insights as we explore how best to collaborate. 

Thank you for your support and enthusiasm for advancing decentralised identity technologies in Africa!

Best regards,

The DIF Africa SIG Team.


The Engine Room

Join our October online event series: strengthening information ecosystems 

Join us for a series of online conversations about the work of strengthening information ecosystems in these regions.

The post Join our October online event series: strengthening information ecosystems  appeared first on The Engine Room.


Digital ID for Canadians

Spotlight on Docusign

1. What is the mission and vision of Docusign?

Docusign’s mission is to bring agreements to life by accelerating the process of doing business and simplifying people’s lives. With its Docusign IAM platform, Docusign unleashes business-critical data that is trapped inside of documents and disconnected from business systems of record, costing businesses time, money, and opportunity. Using Docusign IAM, companies can create, commit, and manage agreements easily. Focusing on the ‘commit’ capability, where identity verification is most relevant, Docusign’s extensive portfolio of identity verification solutions makes it simpler for stakeholders to commit to agreements through advanced, AI-enabled identity verification and multiple levels of authentication. With support for capabilities such as phone authentication, ID verification, biometric detection, and FINTRAC-compliant workflows, not only can signers easily confirm their identities, but senders (i.e. businesses) can also securely capture and store the identity information provided during the agreement completion process. This ensures that all parties are who they claim to be, and agreements are enforceable.

2. Why is trustworthy digital identity critical for existing and emerging markets?

Over the last few years we’ve seen the highest volumes of fraudulent cases ever on record (Cifas, Fraudscape 2024). It is therefore understandable that we’re seeing drastically higher levels of regulatory scrutiny, with more requirements imposed on businesses to introduce strong identity verification methods for digital interactions. This scrutiny isn’t limited to mature markets: in emerging ones, the rapid adoption of new technologies that accompanies fast-paced growth is necessitating urgent regulatory oversight. Trustworthy digital identity is therefore essential for secure, efficient, and inclusive digital economies in both existing and emerging markets. For emerging economies specifically, widely available, easy-to-use identity verification tools can help promote secure, sustainable and long-term economic growth while ensuring equitable access to increasingly digital services.

3. How will digital identity transform the Canadian and global economy? How does your organization address challenges associated with this transformation?

The local and international benefits of digital identities are transformative, in the sense that they can help enhance the security of increasingly digital interactions, improve the efficiency of these interactions, and also make them more accessible across various sectors. For example, the adoption of digital identities can further enhance trust in online interactions, making it easier for consumers to engage in e-commerce transactions. This will likely lead to the expansion of the digital economy in Canada, where secure and convenient online shopping experiences will become safer and, therefore, more adopted. Globally, this could drive the growth of e-commerce, particularly in developing economies where digital identities can securely bridge the gap between offline and online markets. That’s where Docusign’s portfolio of identity verification solutions comes in. Our extensive portfolio offers enhanced signer identification and authentication capabilities built into any agreement workflow, enabling organizations to transact a full range of agreements with increased trust and ease of use.

4. What role does Canada have to play as a leader in this space?

Canada has introduced a series of anti-fraud initiatives that have made a significant impact on combating various forms of fraud across the country (public awareness campaigns, industry collaboration (‘Canadian Bankers Association’s Fraud Prevention Month’), strong legislative frameworks (FINTRAC), etc.). These initiatives have made significant progress in reducing fraud, increasing awareness, and improving recovery efforts locally. By advocating for similar initiatives internationally, Canada can influence the global development of digital identity systems to ensure other countries can reap similar benefits. Through innovation, collaboration, and advocacy, Canada can help ensure that digital identity becomes a force for good in the global economy.

5. Why did your organization join the DIACC?

Focused on securing the online agreement space for everyone, Docusign joined the DIACC to help shape the future of digital identity in Canada and contribute towards developing a more secure, inclusive and beneficial digital agreement ecosystem for its Canadian customers. Being able to collaborate with like-minded industry leaders and drive innovation across the country makes DIACC membership a valuable investment for Docusign.

6. What else should we know about your organization?

Agreements are based on intention and identity: organizations need to be able to trust that signers are who they say they are. The standard practice of verifying a signer’s identity is to send a link to the signer’s email address. But agreement value, sensitivity, business risk, regulation, or legal requirements can drive the need for enhanced identification. The challenge is to deliver stronger verification while keeping the overall experience user-friendly. That’s where Docusign Identify comes in. Identify provides a portfolio of enhanced signer identification and authentication capabilities built into the agreement workflow, enabling organizations to transact a full range of agreements with increased trust. These solutions include:

ID Verification: FINTRAC-compliant digital identity proofing of participants in agreement workflows via biometric checks such as AI-enabled liveness detection and verification of passports, driver licenses, or permanent resident cards

Phone Authentication: multi-factor authentication via text message or phone call

ID solutions for digital signatures: meet requirements for UK and EU electronic identification, authentication and trust services (eIDAS) compliant Advanced (AES) and Qualified Electronic Signatures (QES)

Network of trust service solutions: easy access to our tightly integrated global network of trust service providers for region-specific compliance

To learn more, visit www.docusign.com/en-ca/products/identify


Identity At The Center - Podcast

Join us on the latest episode of the Identity at the Center

Join us on the latest episode of the Identity at the Center podcast as we explore the critical components of a successful IAM program. We break down the key elements required to build a solid foundation for your IAM program and set you up for success.

Watch at https://www.youtube.com/watch?v=5-kRe187AG0 or listen in your podcast app.

#iam #podcast #idac


Elastos Foundation

6 Reasons You Can’t Ignore about Elastos, the Decentralised Internet, Secured by Bitcoin

The world is increasingly dominated by privacy breaches, with the annual cost of cybercrime predicted to reach $10.2 trillion by 2025, not to mention the centralised control of digital platforms and user manipulation. Elastos offers a solution—a decentralised internet, secured by the very network that powers the world’s most trusted digital currency, Bitcoin. Just as Bitcoin provides financial freedom and sovereignty, Elastos brings those same principles to the world of data, identity, and digital ownership. By merge-mining with Bitcoin, Elastos is protected by Bitcoin’s miners, gaining an added layer of security and trust. This unique integration allows Elastos to inherit Bitcoin’s decentralisation, security, and freedom while adding its own distinct value to the future of the internet. No, Elastos isn’t dying, and here are 6 core reasons why Elastos is the next-generation decentralised internet, secured by Bitcoin!

1. Reclaim Your Digital Sovereignty

Take back control of your digital life with Elastos, where you own your data, identity, and assets—free from corporate control. At its core, Elastos taps into the desire for freedom—the same emotional appeal that made Bitcoin a global movement. As we become increasingly trapped by big tech companies that control our data and content, Elastos emerges as the platform that offers true digital sovereignty. We already trust Bitcoin to secure our financial transactions, and now Elastos builds a decentralised internet with that same security and trust, giving us full control over our data and digital assets.

2. Simple, Secure, Decentralised

Elastos is secured by Bitcoin miners, meaning its reliability rests on Bitcoin-backed security. The platform protects your data and content with Bitcoin-level security, ensuring your digital assets are as safe as your Bitcoin. While the technology behind Elastos is complex, its value is simple: a new, decentralised internet that lets you control your data and identity as securely as you control your Bitcoin.

3. The Next Digital Revolution

Elastos is where Bitcoin’s security meets the decentralised internet, unlocking the full potential of blockchain for digital freedom. Elastos is part of the next digital revolution, one that mirrors the transformation Bitcoin brought to the world of finance. Together, Elastos and Bitcoin form the ultimate decentralised duo—Bitcoin secures your finances, while Elastos secures your digital life. Merge mining with Bitcoin gives Elastos a unique advantage, ensuring it is powered by the world’s most trusted decentralised network.

4. Empowering People, One Story at a Time

Elastos gives you the tools to own your digital life, powered by the same miners securing Bitcoin. Elacity, a project on Elastos building the next generation of digital ownership through a decentralised marketplace for Digital Capsules, recently saw a creator earn $5,600 in less than 24 hours monetising a video on Elastos. This is just the start: Elastos has the potential to empower millions of people who want to reclaim control over their digital identities, content, and data. Much like Bitcoin has given financial freedom to millions, Elastos provides the tools to escape the centralised control of big tech, build direct relationships with your audiences, and realise your true digital worth.

5. The Future of Digital Security

Elastos is built to be unbreakable—secured by Bitcoin’s miners, it is among the safest decentralised platforms for the digital future. Elastos is the new internet, offering a future where you have total control over your data, identity, and digital assets. Because Bitcoin miners secure the network, Elastos is one of the most secure decentralised networks in existence, and that Bitcoin-level security gives it a decisive advantage over platforms without it.

6. Evolving with the Digital World

As the digital world evolves, Elastos remains a trusted, secure decentralised platform—powered by Bitcoin miners. As Elastos grows and integrates new features like the Bitcoin Elastos Layer2 protocol (BeL2), Elacity dDRM, and beyond, it not only adapts to the evolving digital landscape but leads it. Elastos is a solution to today’s privacy crises, offering users a way to secure their digital lives against threats like data breaches and surveillance. With its Bitcoin-backed security, Elastos represents freedom from centralised risks—ensuring it stays relevant in a world that increasingly values privacy and control.

So there we have it: 6 reasons you can’t ignore Elastos, the next-gen internet secured by Bitcoin. Want to learn more about Elastos and how to own a piece of its network? Learn here! Did you enjoy this article? Follow Infinity for the latest updates here!

Friday, 20. September 2024

Human Colossus Foundation

HCF Presence at DaKM 2024 in Copenhagen

We are delighted to announce that Paul Knowles, Head of the Advisory Council at the Human Colossus Foundation (HCF) and co-founder of the Foundation, will attend the 9th International Conference on Data Mining & Knowledge Management (DaKM 2024) in Copenhagen, Denmark, on the 21st-22nd September 2024.

DaKM 2024, September 21-22, 2024, Copenhagen, Denmark

Paper Presentation

In addition to his active role at HCF, Paul will present his paper titled "Data-Centric Design: Introducing an Informatics Domain Model and Core Data Ontology for Computational Systems."

This paper marks a significant leap forward in redefining system architectures through the Informatics Domain Model and Core Data Ontology (CDO), promoting a shift from traditional node-centric designs to a data-centric paradigm. These models enhance data security, semantic interoperability, and scalability across distributed data ecosystems with their quadrimodal domain structure: objects, events, concepts, and actions.

You can find further details and the abstract of the paper here.
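
As a very rough, unofficial illustration of that quadrimodal structure (the paper’s formal definitions will differ; these class names and fields are our own shorthand, sketched in Python):

```python
from dataclasses import dataclass, field
from datetime import datetime

@dataclass
class DataObject:        # objects: the data entities themselves
    object_id: str
    attributes: dict = field(default_factory=dict)

@dataclass
class Event:             # events: timestamped facts involving objects
    object_id: str
    description: str
    occurred_at: datetime

@dataclass
class Concept:           # concepts: shared semantics the data refers to
    term: str
    definition: str

@dataclass
class Action:            # actions: operations performed in the ecosystem
    actor: str
    verb: str
    target_object_id: str
```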

Session 3 Chair

As part of his contribution to DaKM 2024, Paul has been invited to chair Session 3. The session will cover various topics, from AI-powered assistive technologies to virtual reality and intelligent community-driven platforms. It promises to explore cutting-edge solutions with potential applications in HCF's ongoing initiatives around distributed data ecosystems and AI development.

Paul will oversee discussions on the following topics during Session 3 at DaKM 2024:

Topic 1: An Immersion Sailing Experience and Simulation Feedback System for Disabled People using Artificial Intelligence and Virtual Reality – Presented by HoiNi Yeung and Ang Li, this talk will showcase a virtual reality sailing simulator designed to help individuals with disabilities practise sailing in a realistic environment using AI and VR technology.

Topic 2: An Intelligent Robot Arm used to Automate Chores to Eliminate Time Waste using Computer Vision – Presented by Yifei Zhang and Jonathan Sahagun, this presentation will cover the use of computer vision and AI to automate household tasks, improving adaptability and efficiency in daily chores.

Topic 3: Enhancing Indoor Environments through Augmented Reality and Artificial Intelligence for Personalised Plant Integration – Presented by Yingqi Wang and Marisabel Chang, discover how AR and AI are used in PlantAR to enhance indoor environments by providing personalised plant recommendations, promoting better air quality and well-being.

Topic 4: A Smart Community-Driven Tutoring Mobile Platform using Artificial Intelligence and Machine Learning – Presented by Haoyun Yang and Yu Cao, this platform leverages AI for personalised quizzes, encouraging peer-to-peer learning and technological innovation in education.

Topic 5: An Intelligent System to Help Individuals with Mobility Issues Crack Eggs using an App and a Bluetooth-Connected Mechanical Device – Presented by Alexander Xu and Jonathan Sahagun, a Bluetooth-enabled device designed to help individuals with mobility issues by automating egg-cracking using machine learning.

Topic 6: Medifact: A Reliable Mobile Application for Combating Medical Misinformation using Verified Data Sources – Presented by Annabel Shen Tu and Andrew Park, this mobile app tackles the spread of medical misinformation through verified health data and AI-driven validation processes.

Topic 7: An Intelligent Mobile Platform to Recognise and Translate Sign Language using Advanced Language Models and Machine Learning – Presented by Arlene Chang and Jonathan Sahagun, this platform translates American Sign Language (ASL) into English and vice versa, bridging communication gaps between Deaf and hearing individuals using AI.

Topic 8: A Smart Medicine Mobile Platform for Injury Diagnosis and Mental Stress Management using Artificial Intelligence and Machine Learning – Presented by Zelin Jason Hu and Garret Washburn, this mobile app provides AI-generated injury diagnoses and mental stress management solutions, improving accessibility to healthcare.

Topic 9: A Policy Report Evaluating the National Assessment Program for Literacy and Numeracy (NAPLAN) Reform in Australia – Presented by Wenya Zhang, a critical evaluation of the NAPLAN reform, focusing on its impact on students and proposing policy improvements for standardised testing in Australia.

For more details about the programme schedule, visit the Programme schedule.

If any of these topics align with your work in distributed data ecosystems or DDE-related issues, don't hesitate to contact Paul Knowles or the HCF advisory team to explore potential synergies. Email: Ac@humancolossus.org  

About Paul Knowles 

Paul Knowles is a leading figure in decentralised semantics and co-founder of the Human Colossus Foundation. He chairs the Decentralised Semantics Working Group and has over 25 years of experience in pharmaceutical biometrics, having worked with companies such as Roche, Novartis, GlaxoSmithKline, Amgen, and Pfizer. His contributions include the Overlays Capture Architecture (OCA) for semantic interoperability. He also holds advisory roles at Secours.ai and Global Privacy Rights at 0PN Governance Architecture.


About the Human Colossus Foundation 

At the Human Colossus Foundation, we envision a Dynamic Data Economy (DDE) where data is harmonised, secure, and framed by robust governance principles. Our mission is to empower businesses and individuals with the tools and frameworks they need to make better-informed decisions through real-time, accurate data. The DDE bridges existing standards while embracing new data-centric structures that respect human and jurisdictional differences.


Me2B Alliance

Webinar: The Worldwide Web of Commercial Surveillance Identity Resolution & Customer Data Platforms

Are you concerned about data brokers and commercial surveillance, but have never heard of identity resolution platforms? This webinar is for you! It provides an overview and explanation of the infrastructure that powers the worldwide web of commercial surveillance—a data-aggregating force powering data brokers and a lack of online anonymity. In this webinar, we look deeply at:

Identity resolution and customer data platforms
How they work
Why they’re risky
What you can do to protect yourself

The target audience for this webinar is privacy professionals (lawyers, regulators, and industry) and concerned users of technology.

Geekiness Level: Medium

Open PDF

The post Webinar: The Worldwide Web of Commercial Surveillance Identity Resolution & Customer Data Platforms appeared first on Internet Safety Labs.

Thursday, 19. September 2024

Elastos Foundation

Elastos Creates World’s First Live Music Inscription on Bitcoin at Token 2049

Partnering with YeloPlay and BeatFarm, Elastos shows the potential of the New Age of Bitcoin
Creators and artists can eliminate intermediaries, reduce fees, retain ownership and drive immediate revenue

Singapore: September 20, 2024 – Elastos, a SmartWeb ecosystem provider, today created the world’s first live music inscription at its Asia OnChain Real World Assets (RWA) event at Token 2049. Elastos, part of the Group supported by the Singapore-headquartered Elastos Foundation, used its eScription technology to create a recording that was not only inscribed live to Bitcoin and the Elastos Smart Chain (ESC) but now remains available to artists and fans for all time as a minted collectible of the occasion. This is a significant example of the New Age of Bitcoin, where artists and creators can eliminate intermediaries, reduce fees and drive immediate revenues, while retaining an immutable record of their work.

This first recording was supported by Elastos partners YeloPlay, a Web3 platform with a catalog of Indian and Sri Lankan soundtracks, film scores, and hits, and BeatFarm, a transformative music industry platform focused on the development of new inscription technology and blockchain-based music consumption models. Today’s demonstration is a significant moment in showing the potential for Bitcoin and the Bitcoin infrastructure to deliver a new economic model. The song inscribed was “Back to the start,” composed and performed by Mike De Zilva, Kat Sonata and Charth Fernando.

BeatFarm enables this inscription by inscribing an Ordinal on Bitcoin and an eScription on ESC, which is attractive to Bitcoin holders as it maintains the benefits of the currency’s security and integrity to protect their investment. For creators and artists, it means Bitcoin owners can easily use their currency without bridging, wrapping or otherwise interfering with the Bitcoin layer. The inclusion of ESC is important, as it avoids network congestion and additional fees that would otherwise result, allowing users to establish Smart Contracts on their own terms, ultimately becoming an attractive alternative way to commercialize content.
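
Neither BeatFarm’s nor Elastos’s inscription APIs are documented in this announcement, so the Python sketch below is purely conceptual: the recording is content-addressed once, and the same digest is anchored on both chains through stand-in clients. Everything here is a hypothetical illustration.

```python
import hashlib

class StubChainClient:
    """Stand-in for a real chain client; a real one signs and broadcasts txs."""

    def __init__(self, chain: str):
        self.chain = chain

    def inscribe(self, payload: dict) -> str:
        return f"{self.chain}:{payload['sha256'][:16]}"  # fake inscription reference

recording = b"...raw audio bytes of the live performance..."
payload = {"title": "Back to the start",
           "sha256": hashlib.sha256(recording).hexdigest()}

# The same content digest is anchored on both chains, so either record
# can later prove what was performed and when it was inscribed.
btc_ref = StubChainClient("bitcoin-ordinal").inscribe(payload)
esc_ref = StubChainClient("elastos-smart-chain").inscribe(payload)
```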

Jonathan Hargreaves, Global Head of Growth, Elastos explains: “This is a key moment for Web3 models and the music industry as for the first time an instantaneous record of a performance can be created and monetized using Bitcoin and crypto technologies. This gives artists ownership rights and control, as well as a direct relationship with their fans cutting out intermediaries and their fees. This approach has the potential to transform how music is made, consumed and stored, empowering everyone to enjoy and cherish the creative product.”

One interesting aspect of the Elastos BIT Index is that BRIC nations and the Global South are moving more quickly to embrace Bitcoin. For example, 24% of tech-savvy Indian consumers and 26% of UAE respondents use Bitcoin every day as their currency of choice compared to the global average of 18%, while only 11% of Germans, 13% of UK respondents, 14% of South Koreans and 15% of US tech-savvy consumers are prepared to do the same. Furthermore, 91% of Nigerians and 90% of Indians see a time when Bitcoin could become a type of ‘default’ currency, compared to only 70% in Germany, 73% in the UK and South Korea, and 75% in the US. It is therefore perhaps no surprise that YeloPlay, whose back catalogue includes material from some of Sri Lanka’s most famous artists such as Roshan de Silva, Mariazel Goonathilaka, Nalin Perera, Mike De Zilva, Shane Vincent, and Joel Fernando, has participated in this project.

Jonathan Hargreaves continues: “Although today’s first live inscription signals both a technological and creative revolution, it also enables cultural renaissance where artists from anywhere in the world, such as these Sri Lankan and Indian artists supported by YeloPlay, can experience new artistic and economic opportunities.  It means every community can promote and share a unique cultural record by preserving it on the Bitcoin infrastructure forever.”

BeatFarm’s Co-Founder, Alex Panos, shared: “Crucial to this revolution, BeatFarm is providing artists the tools and resources to control their content, collaborate, and take advantage of new ways to connect with their super fans in ways which we haven’t seen before, supporting communities and social media movements who want to share musical culture.”

Mike de Zilva, Founder and CEO of YeloPlay, describes completion of the live eScription as a milestone for the creative and music industries:

“The World’s first live eScription, literally, signals the moment when artists regained control over their creations – it’s massive. And it’s not limited to content, Bitcoin-backed eScriptions will enable artists to develop and define completely new relationships with their audiences, incentivising super fans, for instance, with exclusive content, access and experiences. All on the artist’s terms,” says Mike. 

About BeatFarm

BeatFarm is a Web3 music distribution platform that enables artists to mint and inscribe their music onto multiple blockchains simultaneously, while also connecting directly with their superfans. BeatFarm aims to democratize music distribution by leveraging blockchain technology to provide artists with unprecedented control over their work and direct access to their fans.

Media Contact

Roger Darashah, roger.darashah@elastoselavation.org


OpenID

Accelerating mDL Adoption in the United States

The OpenID Foundation is delighted to announce that it is one of 15 parties to a Collaborative Research and Development Agreement (CRADA) with the National Cybersecurity Center of Excellence (NCCoE). This project will accelerate digital identity adoption in the United States. Its first use case supports financial institutions in meeting their Know-Your-Customer requirements using Mobile Drivers License (mDL) technologies and standards.

The OpenID Foundation and its members are passionate about the role that standards play in catalyzing ecosystems. We are grateful to NIST for the opportunity to engage in this project and for its continuing contributions to industry standards, including their active participation in OpenID Foundation Work Groups.

In this case, we are excited to test the OpenID for Verifiable Credentials family of specifications, especially OpenID for Verifiable Presentations, to support financial use cases. This effort directly complements our work with the California DMV, which employs two upcoming Hackathons to test a variety of public and private sector mDL use cases.

We are looking forward to further collaboration with the other 14 parties, among whom are many valued members of OIDF, including Idemia, Spruce ID, Mattr, the California DMV, and Sustaining Board Members Microsoft and Block/TBD.

We encourage members of the Financial Services community who are interested in this work to complete this survey, powered by our friends at the Secure Technology Alliance, which aims to understand FinServ awareness and attitudes towards Mobile Drivers Licenses.

About the OpenID Foundation

The OpenID Foundation (OIDF) is a global open standards body committed to helping people assert their identity wherever they choose. Founded in 2007, we are a community of technical experts leading the creation of open identity standards that are secure, interoperable, and privacy preserving. The Foundation’s OpenID Connect standard is now used by billions of people across millions of applications. In the last five years, the Financial Grade API has become the standard of choice for Open Banking and Open Data implementations, allowing people to access and share data across entities. Today, the OpenID Foundation’s standards are the connective tissue to enable people to assert their identity and access their data at scale, the scale of the internet, enabling “networks of networks” to interoperate globally. Individuals, companies, governments and non-profits are encouraged to join or participate. Find out more at openid.net.

The post Accelerating mDL Adoption in the United States first appeared on OpenID Foundation.


Oasis Open Projects

OASIS Coalition for Secure AI Welcomes EY, Protect AI, Trend Micro, and Zscaler as Newest Premier Sponsors

Blinder, Cranium, Cyware, Dell Technologies, Fr0ntierX, Harvey, HiddenLayer, Invariant Labs, Lasso Security, Legit Security, Logitech, Mozilla, Styrk AI, Thomson Reuters, TrojAI, and VE3 Join a Growing Roster of Organizations Committed to Advancing AI Security

Boston, MA – 19 September 2024 – The Coalition for Secure AI (CoSAI), an OASIS Open Project that launched on 18 July 2024, is announcing the addition of EY, Protect AI, Trend Micro, and Zscaler as its newest Premier Sponsors. These industry leaders join CoSAI’s expanding alliance of organizations, which now includes more than 30 partners dedicated to advancing the security of artificial intelligence (AI). Together, they support CoSAI’s mission to develop and share open-source methodologies, standardized frameworks, and tools for secure AI development and deployment.

CoSAI is a collaborative open-source initiative designed to give all practitioners and developers the guidance and tools they need to create Secure-by-Design AI systems. Three strategic workstreams have been established within CoSAI, with plans to add more over time: software supply chain security for AI systems, preparing defenders for a changing cybersecurity landscape, and AI risk governance.

In addition to welcoming new Premier Sponsors, CoSAI is pleased to introduce its latest General Sponsors: Blinder, Cranium, Cyware, Dell Technologies, Fr0ntierX, Harvey, HiddenLayer, Invariant Labs, Lasso Security, Legit Security, Logitech, Mozilla, Styrk AI, Thomson Reuters, TrojAI, and VE3. These organizations further diversify and strengthen CoSAI’s community of stakeholders committed to advancing AI security.

“Joining CoSAI underscores the EY organization’s dedication to fostering innovation while at the same time enhancing the security and integrity of AI technologies,” said Yang Shim, EY Americas Technology Consulting Leader. “By working alongside other industry leaders, we aim to contribute to the development of robust frameworks that will empower enterprises and individuals to shape the future with confidence through the secure integration and deployment of AI,” added Kapish Vanvaria, EY Americas Risk Leader.

“At Protect AI we are on a mission to create a safer AI-powered world. As the prevalence of AI within organizations grows, so must the ability to secure it,” said Ian Swanson, CEO and Co-founder, Protect AI. “We are proud to join CoSAI as a Premier Sponsor. Through this collaboration, we aim to help shape the development of frameworks and standardized MLSecOps processes that enhance the security, safety, and trust for AI applications across industries.”

Eva Chen, CEO at Trend Micro, said, “We are dedicated to leading the charge in securing AI deployment, ensuring that security is seamlessly embedded from the ground up. Our collaboration with CoSAI reflects our commitment to pioneering efforts that not only protect organizations but also leverage AI to enhance security and uphold the trust of consumers. By bringing together industry leaders, we aspire to set new standards for the integrity and safety of AI systems, driving positive change across both the industry and broader society.”

“Zscaler is proud to join CoSAI to collaborate with industry leaders. Our collective aim is to establish best practices that ensure AI technologies are not only innovative but also trustworthy,” said Deepen Desai, Chief Security Officer, Zscaler. “This partnership will enable Zscaler to leverage the power of AI in order to deliver the most advanced security solutions for our customers. Through this collaboration, we’re striving to set a new standard for AI-driven security that prioritizes transparency, reliability, and excellence.”

These Premier and General Sponsors will join forces with CoSAI’s founding Premier Sponsors – Google, IBM, Intel, Microsoft, NVIDIA, and PayPal – and founding General Sponsors, including Amazon, Anthropic, Cisco, Chainguard, Cohere, GenLab, OpenAI, and Wiz. With the support of these industry leaders and experts, CoSAI is poised to make significant strides in establishing standardized practices that enhance AI security and build trust among stakeholders globally.

Participation 

Everyone is welcome to contribute technically as part of the CoSAI open-source community. OASIS welcomes additional sponsorship support from companies involved in this space. Contact join@oasis-open.org for more information.  

Support for CoSAI 

Blinder:
“AI is the most transformative technology of our generation. As attorneys and corporate legal departments adopt AI, data and IP security are at the forefront of their priorities. Blinder is proud to join CoSAI and further the mission of accelerating secure AI development. The open source OASIS model aligns with our focus on fair use IP, and democratizing AI security.”
— Nils Tracy, CEO & Founder, Blinder

Cranium:
“Cranium is proud to join CoSAI to advance AI security. As the leading enterprise AI security and trust software firm, we believe that by sharing our insights and best practices with other industry leaders we can collectively and securely develop and deploy AI. Only through collaboration can we truly strengthen AI security to build trust in each organization’s and third-party AI.”
— Felix Knoll, COO & CRO, Cranium AI, Inc.

Cyware:
“AI is transforming cybersecurity, enabling speed and scale at unprecedented levels. However, the opportunity AI presents is only matched by the risk it introduces. We are committed to developing secure, ethical AI to not only protect our systems but also to build trust with our clients and the broader community. Joining CoSAI was a natural decision, aligned with our mission to drive innovation while ensuring that safety and integrity are at the core of everything we do.”
— Sachin Jade, Cyware Head of Product

Dell Technologies:
“We share an unwavering commitment to collaboration and innovation within the AI ecosystem which includes empowering organizations globally to adopt AI safely and securely. By working alongside industry leaders in the Coalition, we aim to help establish necessary industry standards and contribute to the development of secure open-source solutions.”
— John Roese, Global Chief Technology Officer and Chief AI Officer, Dell Technologies

Fr0ntierX:
“AI is reshaping the world, and security must evolve with it. By joining the incredible lineup at CoSAI, we’re excited to be part of a global effort to ensure AI continues to push boundaries safely and responsibly. We look forward to driving innovation in AI and building systems people can rely on every day without compromising their security.”
— Jonathan Begg, CEO, Fr0ntierX

Harvey:
“Trust is the most important factor to the future success of AI. From Day 1, Harvey has built its platform with the highest security standards to become a reliable AI partner for its high-performing legal clients. We are thrilled to join CoSAI to share our experience and contribute to AI security standards for everyone.”
— Winston Weinberg, CEO and Co-Founder, Harvey

HiddenLayer:
“AI has never been easier to develop, use, and implement within organizations. As deployment continues to surge, so does the need to adopt common security standards and best practices in AI security. HiddenLayer is proud to join the CoSAI in our shared mission to support the widespread adoption of AI security principles.”
— Malcolm Harkins, Chief Security & Trust Officer, HiddenLayer

Invariant Labs:
“As AI systems and agents rapidly become integral parts of any organization, addressing their security and reliability is one of the key challenges blocking wider adoption. At Invariant Labs, we are proud contributors to open-source AI security, and we are excited to join the CoSAI ecosystem and democratize secure AI together.”
— Mislav Balunovic, Co-Founder and CTO, Invariant Labs

Lasso Security:
“LLM and GenAI technologies are now non-negotiable assets for businesses striving to maintain a competitive advantage. However, as GenAI deployment accelerates, organizations must prioritize security standards and best practices to ensure safe and responsible use. At Lasso Security, we are proud to lead the way in securing GenAI deployments and to join CoSAI in our shared goal of protecting organizations from existing and emerging threats.”
— Elad Schulman, CEO & Co-Founder, Lasso Security

Legit Security:
“As AI grows more integral to how we build and deploy software, ensuring the security and integrity of AI systems throughout the software development lifecycle is more urgent than ever. Legit Security is proud to join CoSAI in advancing the security standards for AI systems and infrastructure, enabling organizations to innovate with confidence, safeguarded against emerging AI threats. Together, we will drive forward a secure future for AI in software development.”
— Liav Caspi, Co-Founder and CTO, Legit Security

Logitech:
“Logitech is proud to join CoSAI in its mission to enhance AI security. As a leader in developing human-centric technologies, we recognize the importance of ensuring that AI is developed and deployed responsibly. We are committed to collaborating with CoSAI and industry partners to create a safer and more secure AI ecosystem.”
— Nabil Hamzi, Head of Product Security, Logitech

Mozilla:
“Mozilla has contributed for decades in security and privacy and that is evident within the standards and protocols that we all use today. We’re proud to support OASIS and CoSAI’s mission in making AI safe and secure. We can’t wait to collaborate in building new and innovative technologies in making AI trustworthy in an open and transparent way.”
— Saoud Khalifah, Director, Mozilla

Styrk AI:
“The partnership with CoSAI allows Styrk to further its mission of enabling safe and secure usage of AI for all. CoSAI’s community focused efforts both in standardization and research in the critical area of AI security fills a very timely need and Styrk will now be able to leverage the platform to contribute back to the community while democratizing AI adoption.”
— Vilayannur Sitaraman, CTO of Styrk AI

TrojAI: 
“TrojAI is committed to developing comprehensive security solutions to safeguard AI applications and models from evolving threats. We are thrilled to join CoSAI in their mission of advancing the security and trustworthiness of AI systems to ensure the responsible and secure deployment of AI. Together, we will set new standards for AI integrity and security.”
— Lee Weiner, CEO, TrojAI

VE3:
“VE3 is proud to support CoSAI’s mission to establish a global framework for AI safety and security. As a pioneer in AI development, we recognize the paramount importance of advanced AI security and governance. Joining CoSAI represents our commitment to advancing security in AI development and implementation.”
— Manish Garg, Managing Director, VE3

About CoSAI:

CoSAI is an open ecosystem of AI and security experts from industry-leading organizations dedicated to sharing best practices for secure AI deployment and collaborating on AI security research and product development. CoSAI operates under OASIS Open, the international standards and open source consortium.

Media inquiries: communications@oasis-open.org

The post OASIS Coalition for Secure AI Welcomes EY, Protect AI, Trend Micro, and Zscaler as Newest Premier Sponsors appeared first on OASIS Open.


Identity At The Center - Podcast

We’ve got another sponsor spotlight episode of the Identity at the Center podcast


We’ve got another sponsor spotlight episode of the Identity at the Center podcast for you this week. We talked to Marta Nappo of Panini about their role in the identity verification space and how they are addressing that need for their customers. You can learn more by watching the episode at https://www.youtube.com/watch?v=Ekak4H6ccss or listening in your podcast app. Learn more about Panini at panini.com

#iam #podcast #idac


We Are Open co-op

How to build a co-design event

Development vs Engagement in a Collaborative Setting

Our new Mozilla Foundation-funded Friends of the Earth ‘Green Screen’ project has the express aim of developing a set of AI principles that the climate movement can use. The project involves desk research and a gathering of experts to influence and contribute to these principles, creating a co-designed starter for ten that others can build upon. We will then take what we’ve learned and report it on the Friends of the Earth policy site.

Part of this involves setting up an online roundtable to gather insights from a diverse range of experts. In our project kickoff call last week, we realised that clarifying the ambitions and aims of such an event is something we do instinctively.

We’re big fans of community calls, but the roundtable we will be putting together is something slightly different. In this post, we’re going to give you a few things to think about when you’re gathering people together to co-design a policy or set of best practices — or when you’re more on the development side of the continuum.

cc-by-nd Bryan Mathers for WAO

The Development-Engagement continuum

First off, there’s tons of value in getting people together and working collaboratively towards something. There is also a lot of nuance in such an endeavour, so it’s best to understand what your long-term goal for the project might be. We like to determine where on a continuum between ‘development’ and ‘engagement’ a particular project might sit.

Development is the side of the continuum that focuses on the final output of the project. This could be, for example, a report, article, or set of recommendations. Engagement, on the other hand, can serve as a launchpad for building community. While there may be outputs along the way, the main goal is to find and engage with people who are interested in a particular topic. As it happens, we’ve written extensively about how to build a community of practice in this short (and free!) resource.

If you’re mostly focusing on development, as we are with our Friends of the Earth (FoE) roundtable, you will need a different kind of preparation and facilitation than if you’re focused on a longer-term community-building initiative.

Of course, many projects have an eye on both the short term and the longer term, and so are looking to do development and engagement. However, it’s important to understand that community building requires designated moderation, facilitation and a place to interact. If there is no one to actively manage and engage the community, it can become stagnant!

Co-designing for Development

It’s important to note that not every collaborative effort needs to lead to a fully fledged community. For example, with our FoE Green Screen project, the focus is very much on the set of principles that other organisations can build upon.

If you find yourself looking to engage a group of people around a particular project, like policy development or a set of principles, you should have a think about how your co-design event fits into the “project management triangle”.

Let’s take each point in turn.

PM triangle cc-by Laura Hilliger for WAO

1. Funding

How much funding is available for the co-design piece of your project? Can you afford to pay people to participate?

If you can pay people for their input, you absolutely should. Even a small portion of your overall budget can work: offering people a one-time fee or goodwill payment shows them that you value their contribution. However, we know that sometimes remuneration is not possible.

If there isn’t enough funding to pay people for their time, make sure you plan to spotlight their contribution in other ways. You can issue badges to contributors, ensure all contributors are named in final outputs and publicly thank people when you share the final outputs.

cc-by-nd Bryan Mathers for WAO

2. Scope

Based on the funding available, what is the scope of the collaboration? How can you ensure that you are building collaboratively?

Will you host a single event, or a series of three or five? Will the project run on a six-month timeline? Be clear about what you’re asking of people as you invite them to participate. Are you inviting them to the first of a series of events or a one-off event? Is there prep work or ‘homework’ involved? How will their contributions be attributed?

Together with other contributors, you’ll need to establish procedures to receive feedback and inputs, as well as how you will process them to create iterations. As always, documentation goes a long way and writing openly about the scope and decisions made along the way will help contributors understand the plan.

3. Time

How much time and effort can you ethically ask others to put in? How much time are you putting in?

Depending on your budget, you’ll need to figure out how much time you need to complete your policy, principles, best practices or whatever the output actually is. You’ll also want to think about how much time you require from the people you’d like to invite.

If you are looking for open contributions to your project, it’s best to try to minimise the amount of time other people will need to participate. Help contributors give good insight quickly by asking specific questions and giving people the opportunity to give feedback via email or a voice text.

Bringing things together

The three sides of the project management triangle play together to shape your co-design event. Depending on your parameters, it might make sense to do some of the development work on your own and then ask for input and feedback. Alternatively, you might want to get everyone involved from the beginning and co-design the entire project through a series of events. Be adaptable and flexible as you begin to work with others and refer back to the 5 principles of open to remind yourself of what it means to work openly.

Knowing where you sit on the ‘development/engagement’ continuum and mapping out your funding, scope and time will help you plan a co-design event that leads to great outputs and strengthened relationships.

Need help figuring out how to design your co-design initiative? Get in touch!

How to build a co-design event was originally published in We Are Open Co-op on Medium, where people are continuing the conversation by highlighting and responding to this story.


Elastos Foundation

Harvard Students and Alumni Launch Groundbreaking Native Bitcoin Blockchain Project at Harvard Innovation Labs to Tackle Global Debt Crisis

Launch of “New Bretton Woods Project” (NBW): A Harvard student-led initiative, soon to be incubated at Harvard Innovation Labs, tackling the global debt crisis through decentralized finance (DeFi) solutions.
Native Bitcoin Stablecoin: NBW is developing a Bitcoin-backed stablecoin via BeL2 infrastructure, offering stability while preserving Bitcoin’s decentralization and security.
Economic Disruption and Resilience: The project aims to reshape global finance by leveraging Bitcoin and DeFi to promote economic stability and empower users in the face of rising global debt.

Cambridge, MA, September 19th 2024 — In a bold step to transform the global financial landscape, the Digital Economy Research Initiative, led by a dynamic group of Harvard students and alumni, has officially launched the “New Bretton Woods Project” (NBW). This pioneering blockchain-based initiative has secured membership in Harvard’s prestigious Innovation Labs and is set to begin incubation there in the coming weeks. NBW aims to tackle the escalating global debt crisis by offering innovative, technology-driven solutions. 

At the heart of the project is the development of a native Bitcoin stablecoin, leveraging the transformative potential of decentralized finance (DeFi). Built on the innovative BeL2 infrastructure, NBW aims to reshape global financial systems, unlocking new possibilities for debt management and financial stability across nations. 

With the power of DeFi and blockchain, NBW stands poised to disrupt the status quo, offering a bold new path toward economic resilience in the face of one of the greatest challenges of our time. This initiative signals not just a step, but a leap toward a decentralized, stable and secure economic future. 

The project reframes Bitcoin as not just a store of value but the foundation of a decentralized financial system. Using BeL2—Bitcoin’s second-layer solution—the NBW project enables smart contracts for Bitcoin-backed stablecoin construction, empowering users to engage in decentralized finance while preserving Bitcoin’s core principles of decentralization and security. 

“Harvard Innovation Labs will help turn our vision into reality,” said Jacob, Lead Member of New Bretton Woods (NBW) at Harvard University. “Our goal is to create a ‘New Bretton Woods’ system anchored in Bitcoin, bringing stability through the utility of a stablecoin. This stablecoin allows users to bypass Bitcoin’s price volatility while retaining the potential for long-term gain, making the product practical for daily use.” 

The native Bitcoin stablecoin will be fully backed by Bitcoin, enabling users to experience the stability of fiat currency without liquidating their Bitcoin holdings. This offers a balance of algorithmic security using Bitcoin miners and the opportunity for long-term growth. 

The BeL2 infrastructure allows for decentralized finance applications, where Bitcoin remains securely on the main network. Bitcoin can be used as collateral for Layer 2 applications such as decentralized exchanges, loans, and stablecoin issuance. The NBW team ensures that all Bitcoin-related settlements occur on the Bitcoin main network for maximum security. Instead of transferring assets across chains, messages are sent to Ethereum-compatible networks to issue stablecoin, uniting technologies and supporting a robust decentralized economy. 
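To make this message-passing design concrete, here is a minimal sketch, assuming hypothetical names and shapes (these are illustrative stand-ins, not the actual BeL2 interfaces): a proof of a Bitcoin mainnet event, rather than the BTC itself, is what crosses to the EVM-compatible side.

// Hypothetical TypeScript sketch of message-based issuance (not the real BeL2 API).
interface BtcCollateralProof {
  txid: string;        // Bitcoin mainnet transaction locking the collateral
  amountSats: number;  // amount locked, in satoshis
  proof: Uint8Array;   // e.g., a ZK or SPV proof that the lock transaction is confirmed
}

// Placeholders for on-chain proof verification and token issuance.
declare function verifyBitcoinProof(proof: Uint8Array, txid: string): Promise<boolean>;
declare function mintAgainstCollateral(txid: string, amountSats: number): Promise<string>;

// EVM-side entry point: verifies the proof and mints stablecoins against
// collateral that never leaves the Bitcoin main network.
async function issueStablecoin(msg: BtcCollateralProof): Promise<string> {
  if (!(await verifyBitcoinProof(msg.proof, msg.txid))) {
    throw new Error("invalid collateral proof");
  }
  return mintAgainstCollateral(msg.txid, msg.amountSats); // returns the mint tx hash
}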

“Financial empowerment comes from both freedom and stability,” added Sasha Mitchell, Head of Operations at BeL2. “By offering a stablecoin backed by Bitcoin on the BeL2 platform, NBW is giving people a way to protect their wealth and access new financial opportunities, especially in times of economic volatility.” Sasha presented the New Bretton Woods Project on stage at Token2049, signalling to the industry a significant step forward for the technology and its institutional support.

This initiative comes at a crucial time as global debt reaches record levels. By combining Bitcoin’s decentralized structure with the stability of a pegged currency, the project offers a financial system that mitigates the risks of traditional economies, highlighting the real-world benefits of financial security and sovereignty.

“Our stablecoin isn’t just another digital currency; it’s a tool for global financial stability,” said Jacob, Lead Member of the NBW project. “We believe that offering a decentralized and stable currency helps individuals and communities navigate the growing challenges posed by the global debt crisis.” 

The NBW team invites those who share their vision for a decentralized and secure financial future to explore how they can contribute. Whether you’re a developer, an investor, or a policymaker interested in sustainable financial solutions, this project offers a unique chance to shape a future focused on security, accessibility, and freedom. 


About New Bretton Woods Project (NBW)
The project is led by the Digital Economy Research Initiative, a team of Harvard students and alumni. NBW is set to be incubated at Harvard Innovation Labs in the coming weeks. Focused on bridging the gap between traditional finance and decentralized systems, the team is committed to advancing financial inclusivity and economic stability.

About Harvard Innovation Labs
Harvard Innovation Labs is a collaborative ecosystem that supports entrepreneurship across Harvard University. It provides resources, mentorship, and funding to students, faculty, and alumni as they develop practical solutions in fields like technology and finance. 

About BeL2
BeL2 is Bitcoin’s second-layer solution that enables decentralized finance (DeFi) while keeping Bitcoin secure on its main chain. By providing users with the ability to lend, borrow, and trade without intermediaries, BeL2 ensures financial freedom while preserving Bitcoin’s core principles of decentralization. With BeL2, users retain full control over their Bitcoin while accessing new financial opportunities. 

Wednesday, 18. September 2024

DIF Blog

Interview with Nick Lambert of Dock


Nick Lambert, CEO of Dock, has been a driving force in the digital identity space for over a decade, consistently working towards empowering individuals with control over their personal data. His decentralized identity journey began at MaidSafe, where as COO he spearheaded the SAFE Network, an ambitious project aimed at creating an autonomous data network that prioritizes user privacy and data sovereignty. At Dock, Nick continues his mission to revolutionize digital identity, focusing on solutions that streamline processes and reduce friction for both individuals and organizations.

Nick joined us to share his insights on the evolving landscape of decentralized identity. He discusses the recently announced blockchain merger with Cheqd, Dock's unique approach in the market, and key sectors for growth in the coming years. Nick emphasizes that the collaborative approach in decentralized identity, with shared standards and toolsets, accelerates innovation towards user empowerment, with industry players "all pointed in roughly the same direction." 

Your journey from MaidSafe to founding Dock reflects a long-standing interest in digital identity and personal data control. How did these experiences shape your vision for Dock?

It started back around 2012. I was a technology marketer and wasn't very familiar with decentralized technology. We were a startup looking to decentralize web services: enabling users to store their own data and provide it back to social media networks or other service providers. This introduced me to relevant decentralized technology, and also the blockchain and cryptocurrency industry.

It made me realize how important it is that we control our own data, whether that's documents we store, music, or academic files, and to be able to control it on our own device. You also don't want companies removing your access to it, which they can do and have done, even by mistake. 

When I left MaidSafe and saw Dock, it made sense that the most important type of data you have is your identity data. Identity takes many forms, but it seemed like a critical area with work still to be done. Dock was an opportunity to take a narrower view (than MaidSafe) and ensure we could provide people with the ability to control their own identity. And for the last twelve years, that's what's really got me out of bed in the morning.

Your path perfectly captures how long-term builders and leaders in the space become focused on giving people control over their identity data. Can you elaborate on that?

Absolutely. It's such a big problem, but what's encouraging about the identity space is that it's a more focused challenge compared to something like decentralized storage. With identity, there's a much larger group of people working on the problem with, for the most part, an agreed-upon set of tools and standards. It's almost like a movement of companies and organizations solving it together.

We might have minor disagreements about implementation details, but we're all pointed in roughly the same direction. That's what makes it feel solvable. We're not all trying to work in isolation - we're collaborating, building on each other's work, and moving towards a common goal.

This approach is crucial because giving people control over their identity data is too big and important a problem for any one company to solve alone. By working together, sharing standards, and building interoperable tools, we're making real progress. 

How does Dock's approach to verifiable credentials and digital identity differ from other solutions in the market?

We have a big focus on the business aspect. It's easy to fall in love with the technology, but it's the problems it solves that ensure technology is adopted and builders can make their solutions operate effectively as a business. We focus on tools that can enable business models.

An essential part is an online trust registry. Just because something's in a system or blockchain doesn't mean it's true. You need a linkage between real-world verification of a company or individual and their representation within a system. Trust registries (and the appropriate business partners) provide this, and having them programmatically exist inside your system is key.
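As a minimal sketch of what a programmatic trust registry can look like (the shapes below are hypothetical illustrations, not Dock's actual API), a verifier might check an issuer like this:

// Hypothetical TypeScript sketch of a trust registry lookup.
interface TrustRegistryEntry {
  issuerDid: string;                   // decentralized identifier of the issuer
  verifiedLegalName: string;           // linkage to real-world verification
  authorizedCredentialTypes: string[]; // what this issuer is authoritative for
}

// A credential is only trustworthy if its issuer is an authoritative source
// for that credential type, not merely because it exists on a blockchain.
function isAuthorizedIssuer(
  registry: TrustRegistryEntry[],
  issuerDid: string,
  credentialType: string,
): boolean {
  return registry.some(
    (entry) =>
      entry.issuerDid === issuerDid &&
      entry.authorizedCredentialTypes.includes(credentialType),
  );
}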

Once you have that, it enables businesses to think about how they wish to monetize the credentials. This is important because it's very hard to get businesses to change unless you can point them to a specific business model. We're enabling credential issuers to turn what today is a cost center into a revenue source. When you combine that with our ecosystem enablement and the rest of our stack, it makes us differentiated in the market.

You've been at the forefront of communicating the business value of decentralized identity, including popularizing the term "reusable identity." Tell us your perspective on the evolution of narrative over time.

At the start, many focus their communications on the tooling -- that it's decentralized, uses certain types of cryptography, etc. You put that all over your website, thinking people will care. But the only people who care are others building those solutions.

The companies that will use and pay for these solutions are looking for solutions to problems. They don't care whether it's decentralized or what type of cryptography you're using - they just want a problem fixed.

We've made a conscious effort with our communications to focus more on the benefits we can provide. We try to understand the specific problem a group of prospects and customers are having and how we can fix it. Then we build the narrative around that. This has been the biggest shift we've had, and we're starting to see some success from it.

When discussing decentralized identity with investors, what aspects do you focus on to convey its potential value? 

Investors are typically looking for returns within a defined time period. They're looking for things like size of opportunity and potential market growth. They also want to see something that doesn't completely disrupt existing businesses. Evolutionary changes that complement existing businesses in a relatively easy-to-integrate way are often more interesting to investors.

We also talk about market drivers pushing adoption, like eIDAS regulations, mobile driver's license initiatives in the US, World Bank investments in digital identity, LinkedIn's use of Microsoft Entra ID for credentials, and Google's recent release of the Credentials API. These developments indicate the level of adoption we're likely to see, which excites investors.

What are some common questions or misconceptions about decentralized identity that you encounter, and how do you address them?

One common misconception is that identity only refers to government-backed identity, like a driver's license. In reality, we're talking about identity in a much broader sense. It can be anything from a website's understanding of who you are to your access to their system. Fundamentally, it's about control over your data.

Another misconception about decentralized identity is that every single aspect of it must be decentralized. What we're really talking about is the decentralized consumption of data - the presentation and verification. This typically happens in a wallet where the individual has control. They're presenting a credential given to them by a third party to another entity, who can verify it without knowing who they're verifying or what they're verifying. That's the important part of decentralization. Of course, you'll still have centralized entities issuing those credentials.

The recent merger between Dock and cheqd is significant news. What was the vision behind this decision?

Keep in mind it's a blockchain merge rather than a full company merger; the legal entities are remaining separate. We're migrating all of our chain-related elements (like revocation registries, DIDs, testnets) over to Cheqd.

The vision behind this is to allow Dock to focus on what differentiates us in the market: our issuing and verification platform, API, and our various platform features. Not all of these innovations take place at the blockchain layer but above it. As a small team, managing the entire stack was becoming a challenge.

We feel our strengths lie in the areas above the chain, and Cheqd is highly competent in the blockchain area. This allows us to focus on what really interests us and where we think we can make the most difference for our customers.

Additionally, being part of the Cheqd ecosystem with other great projects feels like a big step towards interoperability with a number of other providers. This addresses a concern I had about Dock potentially isolating itself on its own chain.

Your partnership with the University of Arkansas involves anonymous cyber incident reporting. How does this project demonstrate the capabilities of verifiable credentials?

The University of Arkansas, Little Rock is involved with some Department of Defense and energy-related projects. They're building what they call ET-ISAC (Emerging Threat Information Sharing and Analysis Center). They needed a way for people to report things anonymously, like a whistleblowing service, without fear of reprisal.

Using Dock's technology, they've created a system where someone can become a member and use that membership credential to access the reporting tool in a way where the system knows they're a member but doesn't know who they are. This is achieved using zero-knowledge proof technology.

A member of ET-ISAC receives a credential, which they can use to create a QR code in a wallet on their phone. They can then scan that QR code to gain access to the site and report an incident anonymously.
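A rough sketch of this flow, assuming a hypothetical wallet API (Dock's actual ZKP tooling differs), shows how nothing identifying is disclosed:

// Hypothetical TypeScript sketch: prove membership without revealing identity.
interface AnonymousProofRequest {
  credentialType: "ET-ISAC-Membership";
  revealedAttributes: string[]; // left empty: no identifying attributes disclosed
  nonce: string;                // verifier-supplied freshness value, prevents replay
}

// Placeholder for the wallet's zero-knowledge proof generation.
declare function createMembershipProof(
  request: AnonymousProofRequest,
): Promise<{ proof: Uint8Array }>;

async function buildQrPayload(nonce: string): Promise<Uint8Array> {
  // The proof is encoded into a QR code; the reporting site verifies it
  // against the issuer's public key and learns only "this holder is a
  // member", never which member.
  const { proof } = await createMembershipProof({
    credentialType: "ET-ISAC-Membership",
    revealedAttributes: [],
    nonce,
  });
  return proof;
}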

It's a unique use case that demonstrates the versatility of verifiable credentials and zero-knowledge proofs in solving real-world problems.

What use cases for decentralized identity have you found to be particularly impactful or promising?

Our current focus is on KYC (Know Your Customer) and biometrics companies. We see this as a market that's ready for innovation, especially with regulations like eIDAS 2.0 and mDL coming into play. We can make their verification checks portable, which is what reusable KYC is all about.

There's a good value transfer throughout the KYC process:

Companies can make their verifications portable and generate new revenue streams.
Individuals get more control over their data and can short-circuit painful onboarding processes.
Relying parties (verifiers) don't have to start from scratch and can benefit from previous verification checks.

Beyond that, I'm excited about Customer Identity Access Management (CIAM). Many large corporations, especially in the US, have different divisions that often act as separate entities. Regulations sometimes prevent them from sharing information about individuals across these divisions. This leads to tremendous pain for both the company and the individual.

Imagine if they could give individuals control over their information, allowing them to consent and decide what they want to release to another business unit within the same corporation. This could solve many problems for large companies and enable individuals to manage their own information seamlessly.

The challenge with CIAM is that it's a slow-moving, risk-averse industry. They're just getting used to passkeys, so introducing verifiable credentials and different applications for managing those credentials might take some time. However, the potential is enormous, and I'm excited to see what happens in the next year or two.


Velocity Network

Sertifier Joins Velocity Network Trust Framework  

Issuer permissions are the mechanism that Velocity Network introduces to enable relying parties (and wallets) to determine if an issuer is an authoritative source for a particular credential. When an organization requests the ability to issue on the Network, Velocity Network reviews the request to ensure that the issuing service parameters are within the remit of the organization’s business activities.

DIF Blog

DID Method Standardization Initiative: Progress Update and Next Steps


We're happy to share progress in the DID Method Standardization initiative. This effort aims to drive Decentralized Identifier (DID) adoption through standardization maturity and ecosystem engagement. Here's what's new:

DIF Working Group Kickoff Meeting

DIF's kickoff meeting for the DID Method Standardization working group is happening this Friday:

📅 20 September
🕘 9:00am PT / 12:00 pm ET / 18:00 CET
🔗 Add to calendar

This meeting will set the stage for our ongoing efforts. The recurring meeting schedule will be set based on participant feedback.

Coordination Hub Launched

We've created a neutral coordination hub to facilitate collaboration and information sharing:

https://github.com/did-method-standardization

This hub will serve as the go-to resource for all things related to DID Method standardization. Here you can:

Join the Discussion: Engage with the community, share your thoughts, and contribute to the conversation in our GitHub Discussions.
Hear the Latest Announcements: Learn about activities and upcoming events.
Read our Mission & Goals: Get acquainted with our objectives and the vision driving this initiative.

We're just getting started, so check back for updates.

In case you're not familiar with GitHub: the coordination hub is hosted there, so create a GitHub account and then "follow" the did-method-standardization GitHub organization (detailed instructions)
W3C TPAC Discussion

The DID Method Standardization initiative will be a topic of discussion at the upcoming W3C Technical Plenary / Advisory Committee (TPAC) meetings. This presents an excellent opportunity to align our efforts with the broader web standards community.

Action Items for the Community

Attend the First Meeting: Join us this Friday to help shape the future of DID Methods. Add the event to your calendar and come prepared with your ideas and questions.
Explore the Coordination Hub: Visit our GitHub organization, read through the Mission & Goals, and familiarize yourself with our objectives.
Participate in Discussions: Engage with the community in our GitHub Discussions area. Share your perspectives, ask questions, and collaborate with others passionate about DIDs.
Spread the Word: Help us expand our community by sharing this update with colleagues and on social media. The more diverse voices we have, the stronger our effort will be.

Your input shapes the future of interoperable DID Methods, enabling secure digital interactions based on a foundation of trust.

Stay tuned, and we look forward to seeing you at our first meeting!

Tuesday, 17. September 2024

OpenID

Public Review Period for Proposed AuthZEN Authorization API 1.0 Implementer’s Draft

The OpenID AuthZEN Working Group recommends approval of the Authorization API 1.0 specification as an OpenID Implementer’s Draft: Authorization API 1.0 Implementer’s Draft (other formats: XML, MD).

An Implementer’s Draft is a stable version of a specification providing intellectual property protections to implementers of the specification. This note starts the 45-day public review period for the specification drafts in accordance with the OpenID Foundation IPR policies and procedures. Unless issues are identified during the review that the Working Group believes must be addressed by revising the drafts, this review period will be followed by a seven-day voting period during which OpenID Foundation members will vote on whether to approve these drafts as OpenID Implementer’s Drafts. For the convenience of members who have completed their reviews by then, voting will actually begin a week before the start of the official voting period. The relevant dates are:

Implementer’s Draft public review period: Wednesday, September 17, 2024 to Friday, November 1, 2024 (45 days)
Implementer’s Draft vote announcement: Saturday, October 19, 2024
Implementer’s Draft early voting opens: Saturday, October 26, 2024*
Implementer’s Draft voting period: Saturday, November 2, 2024 to Saturday, November 9, 2024 (7 days)*

* Note: Early voting before the start of the formal voting period will be allowed.

The AuthZEN work group page is https://openid.net/wg/authzen/. Information on joining the OpenID Foundation can be found at https://openid.net/foundation/benefits-members/. If you’re not a current OpenID Foundation member, please consider joining to participate in the approval vote.

You can send feedback on the specifications in a way that enables the working group to act upon it by (1) signing the Contribution Agreement at https://openid.net/intellectual-property/ to join the working group (please specify that you are joining the “AuthZEN” work group on your contribution agreement), (2) joining the working group mailing list at openid-specs-authzen@lists.openid.net, and (3) sending your feedback to the mailing list. 

Marie Jordan
OpenID Foundation Board Secretary

The post Public Review Period for Proposed AuthZEN Authorization API 1.0 Implementer’s Draft first appeared on OpenID Foundation.


Elastos Foundation

Elastos Partners with BitLen and BQ Labs to Drive BitFi Growth

BitLen developing peer-to-peer Bitcoin loan application based on Elastos’ BeL2 protocol
BQ Labs to build the World’s first decentralized insurance protocol for the Bitcoin ecosystem
Increasing variety of financial services products being developed in the World’s most popular decentralized currency signals the emergence of a ‘new Bretton Woods’ based on Bitcoin (Bitcoin finance (‘BitFi’))


Singapore: September 18, 2024 – Elastos today announced two more partnerships to develop and deliver financial services products in native Bitcoin.  The developments – being undertaken with BitLen and BQ Labs, respectively – confirm the momentum building around Bitcoin-denominated decentralized finance (‘BitFi’).

BitLen is developing a peer-to-peer Bitcoin-denominated loan offering based around Elastos’ Bitcoin Elastos Layer2 protocol, codenamed ‘BeL2’: a Layer 2 solution for Bitcoin that enables multiple functionalities, such as staking and smart contracts, to be denominated directly in the World’s most popular digital currency.

The resulting loan mechanism will enable Bitcoin users to collateralize up to 80% of their assets in return for L2 credit (stablecoins, for instance) based on terms defined in a Bitcoin-assured smart contract. Thanks to the BeL2 protocol, this process can be completed without bridging, wrapping or otherwise interfering with the Bitcoin layer; this both assures the integrity of the currency and avoids the network congestion, additional fees, and security compromises that would otherwise result.
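To make the 80% figure concrete, here is a minimal sketch of the loan-to-value arithmetic (illustrative only; actual terms are defined per loan in the smart contract):

// Illustrative TypeScript sketch of the 80% collateralization cap.
const MAX_LTV = 0.8;

function maxL2Credit(btcCollateral: number, btcPriceUsd: number): number {
  // Maximum credit (stablecoins, for instance) available against BTC
  // locked under a Bitcoin-assured smart contract.
  return btcCollateral * btcPriceUsd * MAX_LTV;
}

// Example: 0.5 BTC at $60,000 supports up to $24,000 of L2 credit.
console.log(maxL2Credit(0.5, 60_000));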

“The BeL2 protocol perfectly reflects what BitLen is all about; increasing Bitcoin liquidity without compromising its integrity.  The result will be a genuinely peer-to-peer loan product; it’s not simply Bitcoin-based credit, but completely disintermediated and anonymous,” says Leon Jiang, CEO of BitLen. 


BQ Labs is a Bitcoin-focused platform that provides decentralized insurance and risk management solutions for Bitcoin Layer 2 protocols and validators. Through the partnership, BQ Labs’ Risk Engine product will be incorporated into the Elastos ecosystem and dApps, to help users plan and mitigate operational and compliance risk when transacting in Bitcoin.  

“Our mission is to facilitate and secure the 21 million BTC currently onboarding to DeFi, across all stages of the ecosystem from hodlers to validators, providing them with a set of tools and protocols to manage and mitigate their risk. Our partnership with Elastos reflects our shared vision of Bitcoin-denominated DeFi with all the benefits of true decentralization,” says Akshay Agrawal, CBO of BQ Labs.  

Jonathan Hargreaves, Elastos’ Global Head of Growth, describes the partnerships as ‘irrefutable proof’ of the momentum now gathering around Bitcoin-based financial instruments.

“Both projects demonstrate the benefits that native Bitcoin can bring to decentralized finance in terms of integrity, security and validation, as long as the token itself is not compromised in the process.  Our BitLen partnership, for instance, represents the perfect example of the potential of what can be achieved with Bitcoin-based smart contracts that – thanks to BeL2’s unique ZK-proof process – ensure complete interoperability while leaving the Bitcoin layer entirely untouched,” he says. 

”With BQ Labs, it’s about incorporating tools and protocols that make it easier and safer to transact in Bitcoin; specifically to help Bitcoin-denominated L2 projects build trust and credibility while simultaneously reducing their operational and regulatory headaches,” Jonathan adds. 

“Both these partnerships demonstrate Elastos’ commitment and role in supporting a DeFi ecosystem with Bitcoin at its heart.  And this is where the big opportunity lies – the establishment of what we call a ‘new Bretton Woods’; a reserve currency whose supply is fixed and guaranteed (just like the Gold Standard), complete with integrity and security, but which – unlike traditional reserves such as gold bullion – can be subdivided, stored and transferred at zero cost.  Native Bitcoin embodies all the qualities of an alternative reserve currency, and – as these partnerships demonstrate – the era of ‘BitFi’ is now upon us,” he concludes.  

BitLen will be joining executives from Elastos at the Token 2049 side event, Asia OnChain – RWA, DePIN & DAO, 19 September ‘24. 


About BitLen 

BitLen aims to be a community-driven, cross-chain liquidity-boosting platform offering stability through its Bitcoin-pegged HEX token, diverse yield strategies via lending, and restaking capabilities within the Bitcoin ecosystem.

About BQLabs 

BQLabs is transforming risk management in the Bitcoin ecosystem with its innovative decentralized insurance protocol. At the heart of this transformation is the BQ Protocol, which provides a comprehensive technical, operational, and legal infrastructure to facilitate underwriting, risk trading, and insurance cover purchases. The BQ Protocol is designed to address the unique challenges of the Bitcoin network, offering a secure and user-centric approach to managing risks.

About Elastos

Elastos is a public blockchain project that integrates blockchain technology with a suite of reimagined platform components to produce a modern Internet infrastructure that provides intrinsic protection for privacy and digital asset ownership. The mission is to build accessible, open-source services for the world, so developers can build an internet where individuals own and control their data. The Elastos SmartWeb platform enables organizations to recalibrate how the Internet works for them to better control their own data.


FIDO Alliance

White Paper: Displace Password + OTP Authentication with Passkeys

Editors

Husnan Bajwa, Beyond Identity
Josh Cigna, Yubico
Jing Gu, Beyond Identity

Abstract

For enterprises that have implemented a second factor, such as SMS OTP, to mitigate the risk of credential stuffing, this paper will provide guidance on displacing passwords + OTP with passkeys.

Audience

This white paper is intended for ISRM and IT staff tasked with deploying and maintaining multi-factor authentication (MFA) sign-in processes.

1. Introduction

Many enterprises aiming to secure their workforces have deployed SMS and application-based one-time passcodes (OTPs) as an additional authentication factor alongside passwords. This whitepaper is aimed at organizations that are now considering a move to FIDO authentication with passkeys. While this whitepaper focuses on OTPs specifically, the discussion and recommendations can be generalized to any system using phishable second factors, including application-based OTP and push notifications.

This whitepaper compares OTPs, used as an additional authentication factor alongside passwords, with passkeys in terms of security, user experience, and ease of deployment. It also provides general guidance about migrating from OTPs to passkeys in order to improve user experience while strengthening the organization’s overall security posture. The guidance within this paper will cover key steps for moving towards passkeys, including planning, use case identification and documentation, pilot considerations, as well as deployment and training guidance. This document targets low-assurance use cases, including internal authentication, external authentication, and third-party and B2B authentication strategies. Given that organizations typically implement OTPs as the second factor to passwords, for this document all references to OTPs should be assumed as being used as a second factor to passwords.

This document will not cover specific vendor technologies or implementations. For guidance on moderate or high assurance use cases please refer to additional whitepapers published by the FIDO Alliance [1]. As this document is a part of a series of whitepapers, it is recommended that the reader start with the introductory document [2].

2. OTP MFA vs Passkeys: Why Choose Passkeys

Passkeys offer several benefits to security, user experience, and ease of deployment when compared to OTPs.

2.1 Security

OTP-based MFA has been widely adopted to mitigate the risk of credential stuffing and reuse. SMS and authenticator application-based OTP are the most commonly deployed solutions due to their relatively low cost and ease of deployment across a broad set of users and use cases. The relative simplicity of this type of MFA, however, leaves it vulnerable to social engineering and many MFA bypass toolkits, because no bidirectional communication exists between the secrets generator and the validating identity provider (IDP), meaning that an OTP can be intercepted and used by a third party without the knowledge of the end user or IDP.

Additionally, OTP-based MFA requires trust in a device that an organization may not manage and whose security posture it may have no visibility into. This means that organizations are relying on end users to maintain the security of their own devices and their ability to discern phishing attempts. While user training can help to address some of these attacks, historic guidance, like checking for secure connections and familiar URLs, still relies on an ever-vigilant user base.

Passkeys provide phishing-resistant, replay-resistant sign-ins that reduce the cognitive load on users and strengthen organizations’ overall security posture. This is achieved because passkeys implement a cryptographic challenge-response protocol scoped to the relying party’s domain. The authenticators then rely on secure behaviors, like biometric proofs or PINs, to unlock the credentials on the authenticator while retaining a user-friendly experience. With passkeys, an organization can have a strong first factor of authentication for passwordless scenarios OR a strong second factor for traditional MFA workflows.
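As a minimal browser-side sketch of that challenge-response ceremony (the domain and parameter values below are illustrative assumptions), note that the credential is scoped to the relying party's domain via rpId, which is what defeats lookalike phishing sites:

// Minimal WebAuthn authentication sketch (browser side); values are placeholders.
async function signInWithPasskey(challengeFromServer: Uint8Array) {
  const assertion = await navigator.credentials.get({
    publicKey: {
      challenge: challengeFromServer, // server-generated, single use
      rpId: "example.com",            // credential only works for this domain
      userVerification: "preferred",  // biometric or PIN unlocks the authenticator
      allowCredentials: [],           // empty: let the user pick a discoverable passkey
    },
  });
  // The signed assertion goes back to the server, which verifies the
  // signature against the public key registered for this user.
  return assertion;
}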

2.2 User Experience

Passkeys improve the user experience over passwords and OTPs in several ways, including:

Passkeys work even when there is poor cell coverage, whereas SMS OTPs require mobile network connectivity. For example, a user can have wireless access on an airplane but is not permitted to use SMS; in this instance, the SMS OTP cannot be delivered, whereas a passkey can still be used to authenticate.
AutoFill UI enables seamless integration within browsers on mobile devices and on desktops (see the sketch after this list).
Up to four times faster login, with no need to wait for code delivery [3].
Protection against mis-keyed passwords and codes.
Passkeys build on common behaviors for authentication, like biometric proofs (face or fingerprint).
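The AutoFill item above corresponds to WebAuthn's conditional mediation; a minimal sketch follows (domain and challenge are placeholder assumptions), which also assumes the page contains an input with autocomplete="username webauthn":

// Minimal sketch of passkey AutoFill (conditional UI); values are placeholders.
async function enablePasskeyAutofill(challenge: Uint8Array) {
  if (await PublicKeyCredential.isConditionalMediationAvailable?.()) {
    // Resolves once the user picks a passkey from the autofill dropdown.
    return navigator.credentials.get({
      mediation: "conditional",
      publicKey: { challenge, rpId: "example.com" },
    });
  }
  return null; // fall back to a modal passkey prompt or another flow
}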

2.3 Ease of Deployment

For some micro, small, and medium-sized businesses without large, dedicated support staff, end-user deployment of dedicated authentication hardware tokens can create roadblocks. This includes both OTP hardware tokens and FIDO security keys. Historically, the ease of deployment of SMS/app-based OTPs made them a more favorable option. Procurement, logistics, and configuration are a constant battle fought by operations and IT teams. Updates to the FIDO2 ecosystem have expanded the authenticator landscape, alleviating this problem by allowing many different devices to serve as passkey stores.

All of this comes together to mean that the deployment of passkeys is much easier and less costly compared to SMS OTP for a few reasons:

There is no SMS integration required. Enterprises will not need to configure or maintain interfaces with mobile carriers or third-party SMS vendors, which reduces deployment complexity. Enterprises will not have to pay per-transaction fees associated with SMS OTP, therefore reducing the total cost of ownership for authentication.
FIDO authentication uses passkeys. Passkeys are simple to implement across a range of devices and applications. SMS OTPs rely on carrier-specific APIs or third-party vendor APIs that are not standardized, which increases the risk of vendor lock-in and lack of interoperability.
No time synchronization is needed. Passkeys avoid the time-synchronization requirements of time-based OTPs (TOTPs). Codes don’t need to be entered within a short time window, and deliverability issues with SMS do not result in login failures.

FIDO authentication with passkeys has been embraced by operating system (OS) and browser providers. This existing OS support from most major vendors means that, in most cases, existing hardware in the enterprise, such as laptops, phones, and tablets, can be leveraged to deploy FIDO with passkeys without costly updates and replacements.

In some cases, enterprises use shared, single-user devices such as iPads. For these use cases, a passkey stored in the integrated platform authenticator may not be appropriate, since any user of the device has access to the credential. In these cases, organizations should use roaming authenticators (hardware security keys) to generate and store passkeys for use on the shared device, which offers the same ease of use and convenience. Keep in mind that there may be an additional cost to purchase and manage these hardware keys for users, and organizations may also need to issue users a second hardware key as a backup to reduce the risk of users being locked out of their account(s).

3. Deployment Strategy for Migration from OTP to Passkeys

3.1 Identifying the Deployment Model

Planning for a successful passkey deployment requires organizations to consider the needs of the user and the computing devices they use in their role to maximize the utility of passkeys for staff. At a minimum, organizations should consider the following questions when planning a passkey deployment in order to make passkeys accessible to the broadest audience possible:

What kind of computing devices are used? Are your users working on laptops or desktop computers? Mobile devices? Tablets?

Are the devices single-user devices or multi-user (e.g., shared) devices?

Are the devices provisioned by the company, or are users using their own personal devices at work?

Are there limitations on using USB ports, Bluetooth, or NFC?

Are there limitations on access to the internet?

Are your users commonly wearing gloves or masks that limit the use of on-device biometrics?

Based on the answers to the previous questions, organizations can choose one of a few types of authenticators to store users’ passkeys. The flexibility of passkeys means that organizations can mix and match as their security posture and UX needs dictate. Other documents in this series go into more detail on each type of authenticator.

3.2 Deployment Testing

After determining the deployment model and deploying a FIDO server with applications integrated, it is recommended that organizations use pilot groups to test registration, authentication, and recovery processes (see below) with users. Then use the feedback from the pilot to improve processes and address issues raised by the pilot population before embarking on a broad rollout of passkeys.

3.3 User Experience

3.3.1 Registration

Enterprises should implement a reliable registration process to ensure that users are correctly and securely associated with their passkeys, as stated in earlier FIDO whitepapers. The registration experience is critical to consider because it is a user’s first interaction with passkeys. Here are a few elements to consider when it comes to designing the registration experience:

Identity proofing – Having the user prove their identity, physically or remotely, at the start of the registration process is recommended to ensure a strong, abuse-resistant process. This may involve an SMS OTP for the final time.

Self-service registration – Users use their existing credentials to bootstrap a new passkey (a minimal sketch follows this list).

Supervised registration – Users work with IT/helpdesk for registration. This reduces the risk associated with self-service models, which are vulnerable to phishing of the original credentials.

Pre-provisioned credentials – High effort and high assurance, but a mechanism is needed to get the credential into the user's hands.

Remote users – Self-service or pre-provisioned, but a mechanism is needed to provide the PIN to the user to unlock the device the first time.
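To make the self-service model concrete, below is a minimal browser-side sketch of passkey registration using the WebAuthn API. The /webauthn/register/* endpoints and the JSON shape of the server's response are hypothetical stand-ins for a FIDO server; a real deployment must also verify the attestation response server-side.

```typescript
// Minimal sketch of browser-side passkey registration (WebAuthn).
// Endpoints and option shapes are illustrative assumptions, not a real API.

// WebAuthn expects ArrayBuffers, but servers typically send base64url strings.
function b64urlToBuffer(s: string): ArrayBuffer {
  const pad = "=".repeat((4 - (s.length % 4)) % 4);
  const bin = atob(s.replace(/-/g, "+").replace(/_/g, "/") + pad);
  return Uint8Array.from(bin, (c) => c.charCodeAt(0)).buffer;
}

async function registerPasskey(): Promise<void> {
  // 1. Fetch server-generated creation options (challenge, RP info, user info).
  const opts = await (await fetch("/webauthn/register/options", { method: "POST" })).json();

  // 2. Decode the base64url-encoded fields into ArrayBuffers.
  const publicKey: PublicKeyCredentialCreationOptions = {
    ...opts,
    challenge: b64urlToBuffer(opts.challenge),
    user: { ...opts.user, id: b64urlToBuffer(opts.user.id) },
  };

  // 3. Ask the authenticator (platform or roaming) to create the passkey.
  //    The browser prompts for a user gesture such as a biometric or PIN.
  const cred = (await navigator.credentials.create({ publicKey })) as PublicKeyCredential;

  // 4. Send the new credential back for server-side verification and storage.
  await fetch("/webauthn/register/verify", {
    method: "POST",
    headers: { "Content-Type": "application/json" },
    body: JSON.stringify({ id: cred.id, type: cred.type }),
  });
}
```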

3.3.2 Sign-In

The first step in designing a FIDO deployment with passkeys is to understand the user base, common work paradigms, and available devices – phones, tablets, laptops, desktops. This step is critical because enabling user-friendly workflows that work with the user’s existing devices is core to developing a successful rollout.

Common user environments and corresponding suggestions include the following (a minimal sign-in sketch follows the list):

Environments where users primarily operate on mobile devices or tablets – Look into built-in unlock capabilities.

Mixed-device environments, or environments that rely on a variety of SaaS tools – Leverage SSO provided by an IdP, or build flexible login workflows.

Shared accounts – FIDO RPs can be configured to allow more than one credential to be associated with a login. Investigate cross-device hybrid authentication or roaming authenticators.
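As a companion to the registration sketch above, here is a minimal browser-side sign-in sketch using the WebAuthn API; it reuses the b64urlToBuffer helper. The /webauthn/login/* endpoints are again hypothetical, and the server is responsible for verifying the signed assertion against the registered public key.

```typescript
// Minimal sketch of browser-side passkey sign-in (WebAuthn).
// Endpoints are illustrative assumptions; reuses b64urlToBuffer from above.
async function signInWithPasskey(): Promise<boolean> {
  // 1. Fetch server-generated request options (challenge, allowed credentials).
  const opts = await (await fetch("/webauthn/login/options", { method: "POST" })).json();

  const publicKey: PublicKeyCredentialRequestOptions = {
    ...opts,
    challenge: b64urlToBuffer(opts.challenge),
  };

  // 2. The browser prompts for a user gesture (biometric, PIN, or security key);
  //    with the AutoFill UI this prompt can appear inline in a login form.
  const assertion = (await navigator.credentials.get({ publicKey })) as PublicKeyCredential;

  // 3. The server verifies the signed challenge against the stored public key.
  const resp = await fetch("/webauthn/login/verify", {
    method: "POST",
    headers: { "Content-Type": "application/json" },
    body: JSON.stringify({ id: assertion.id, type: assertion.type }),
  });
  return resp.ok;
}
```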

3.3.3 Recovery

Any authentication process is only as strong as its weakest point, which is why recovery processes have often been abused by attackers to compromise systems. Synced passkeys are durable credentials that can survive device loss and unintentional wiping by restoring from a passkey provider, reducing the need to perform account recovery to establish new credentials for the user. With passkeys, users are expected to lose their credentials less frequently. However, there may still be cases where passkeys, or access to them, are lost, thus requiring account recovery.

For passkey implementations utilizing device-bound passkeys that cannot be backed up or recovered, account recovery should be performed using the backup authentication method, such as using a backup authenticator, to bootstrap enrollment of a new authenticator. In the event that a backup authentication mechanism is not available, organizations must provide alternative pathways to recovery. Organizations should take a risk-based approach to designing and implementing account recovery systems. The specific implementation details will vary widely depending upon organizational requirements. In general, recovery mechanisms should never depend on weaker factors than the credential that the user is trying to recover. In the case where passkeys need to be re-registered, organizations should design mechanisms, either automated or manual, to prevent the use of passkeys no longer registered to that user.
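To illustrate one such mechanism, here is a minimal, hypothetical sketch of server-side passkey revocation; the CredentialStore interface is an assumption for illustration, not part of any FIDO specification. Because assertion verification looks credentials up in this store, a deleted credential can no longer complete authentication.

```typescript
// Minimal sketch: revoke a passkey server-side so it can no longer be used,
// e.g., after re-registration or when a device is reported lost or stolen.
// The CredentialStore interface is a hypothetical storage abstraction.
interface CredentialStore {
  listCredentialIds(userId: string): Promise<string[]>;
  deleteCredential(userId: string, credentialId: string): Promise<void>;
}

async function revokePasskey(
  store: CredentialStore,
  userId: string,
  credentialId: string,
): Promise<void> {
  const known = await store.listCredentialIds(userId);
  if (!known.includes(credentialId)) {
    throw new Error("Credential not registered to this user");
  }
  await store.deleteCredential(userId, credentialId);
  // Verification flows resolve credential IDs through the store, so the
  // deleted credential fails lookup and authentication is refused.
}
```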

For passkey implementations where synchronized passkeys are used, be sure to document the bootstrapping/enrollment process for new devices, and build a risk-averse process (including identity proofing) for full provider account recovery or replacement. While these catastrophic events should be rare, it may still be necessary to have users go through this process. Knowing the proper process ahead of time will insulate organizations against manipulation and stop-work events.

For additional considerations around account recovery, please see the FIDO Alliance’s Recommended Account Recovery Practices for FIDO Relying Parties. [5]

3.4 Admin Considerations

Monitoring of implementation and adoption metrics is critical to ensuring the success of the deployment and ensuring that the security benefits of FIDO authentication with passkeys are realized. Below are recommendations for metrics and processes that are indicative of the success of enterprise passkey migrations.

3.4.1 Monitoring and Visibility into Utilization

Admins are strongly encouraged to use groups or other segmentation structures to allow graceful transition of subsets of users and applications to passkeys. Pilot populations should be carefully constructed and should be composed of a variety of end user types and levels in the organization. Monitoring the usage of items below, both before and after the migration, will provide critical insights into the effectiveness of the program and guide important adjustments.

Device enrollment: How long did it take the user to enroll their first device?

Security events: Where was the device at time of onboarding? What, if any, identity proofing approaches were used to ensure that the correct user was onboarded? If manager, IT support, or peer approval workflows were used, who provided attestation? Are there any time-of-day or device-location anomalies that did not previously exist?

User authentication: Was the user able to successfully authenticate? Are there any observable changes in their daily authentication patterns that would suggest problems or added friction? Does analysis of day-of-week and time-of-day patterns suggest any issues?

Key management: Are keys being used as expected and only from known devices? Some authenticators support device attestation, which provides key provenance and assurance of the identity of the authenticator; if the source of the passkey is an important security control for your implementation, verify whether your chosen authenticator solution supports this kind of attestation. How many keys are associated with an individual's account? Normal guidance is to expect the number of passkeys associated with a user's account to be close to the number of devices that the user leverages. For example, if your users use Android phones and Windows laptops, you should expect to see two to three passkeys associated with a user's account: one stored on each platform authenticator, and possibly one backup on a security key. In this scenario, if an account had five to six passkeys registered, it would be time to investigate and potentially remove excessive keys. Every organization's definition of excessive may vary and should be defined based on observations from its environment. Additionally, depending on your deployment, consider the number of applications that you have enabled for passkey authentication: if you deployed passkeys as credentials for an SSO integration, your users may have only one passkey per device; if you deployed passkeys on an application-specific basis, there may be one passkey per device per application. Organizations are recommended to monitor the number of keys associated with each user and use this data as context for informing passkey management (a minimal sketch for flagging outliers follows this list).

Administrative accounts: Whose keys are associated with administrative/service/break-glass accounts? In the same way that it is best practice to segregate administrative access from normal user access, generating a separate set of passkeys for administrative accounts is also recommended. If they are shared, be sure to include rotation, monitoring, and good cleanup practices.

Passkey removal: How will passkeys be removed? If an employee leaves the company or moves into a different role, their accounts should be disabled or deleted, or access should be evaluated and vetted. In situations where this is not reasonable due to legal requirements, passkeys should be promptly removed as part of the disablement process to prevent unauthorized account access. Similarly, if a user reports a device missing or stolen, any passkeys associated with that device should also be removed.

Compatibility assurance: Do any combinations of applications and endpoint platforms show unusual changes or a decline in authentication events? Are all invocation methods for passkey authentication continuously functioning, including after upgrades?
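As context for the key-management guidance above, here is a minimal sketch for flagging accounts whose registered passkey count exceeds an expected ceiling. The RegisteredPasskey shape and the default threshold are illustrative assumptions; tune the threshold to what you observe in your own environment.

```typescript
// Minimal sketch: flag accounts with more registered passkeys than expected.
// The data shape and threshold are assumptions for illustration only.
interface RegisteredPasskey {
  userId: string;
  credentialId: string;
  createdAt: Date;
}

function findAccountsWithExcessivePasskeys(
  passkeys: RegisteredPasskey[],
  maxExpected = 4, // e.g., one per platform authenticator plus a backup key
): Map<string, number> {
  // Count passkeys per account.
  const counts = new Map<string, number>();
  for (const pk of passkeys) {
    counts.set(pk.userId, (counts.get(pk.userId) ?? 0) + 1);
  }
  // Keep only accounts above the ceiling for follow-up review.
  return new Map([...counts].filter(([, n]) => n > maxExpected));
}
```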
4. Next Steps: Get Started Today

Enterprise organizations should consider migrating to FIDO authentication where possible, using FIDO standards. Think about what your relying parties support as well as your own enterprise security requirements. Passkeys are far more secure than traditional OTP mechanisms, and far more secure than passwords. Look for the passkey icon on websites and applications that support passkeys.

For more information about passkeys, check out the FIDO Alliance passkeys resource page [6] and the FIDO Alliance knowledge base [7].

5. Acknowledgments

The authors acknowledge the following people (in alphabetic order) for their valuable feedback and comments:

Dean H. Saxe, Amazon Web Services, Co-Chair FIDO Enterprise Deployment Working Group
John Fontana, Yubico, Co-Chair FIDO Enterprise Deployment Working Group

FIDO Enterprise Deployment Working Group Members:

Dirk Balfanz, Google
Jerome Becquart, Axiad
Vittorio Bertocci, Okta
Greg Brown, Axiad
Tim Cappalli, Microsoft
Matthew Estes, Amazon Web Services
Rew Islam, Dashlane
Jeff Kraemer, Axiad
Karen Larson, Axiad
Sean Miller, RSA
Tom Sheffield, Target Corporation
Johannes Stockmann, Okta
Shane Weeden, IBM
Monty Wiseman, Beyond Identity
Khaled Zaky, Amazon Web Services

6. Glossary of Terms

Please consult the FIDO Technical Glossary for definitions of these terms.

7. References

[1] FIDO Alliance Enterprise Deployment Whitepapers – https://fidoalliance.org/fido-in-the-enterprise/
[2] FIDO Alliance Enterprise Deployment Introduction Whitepaper –
https://media.fidoalliance.org/wp-content/uploads/2023/06/June-26-FIDO-EDWG-Spring-2023_Paper-1_Introduction-FINAL.docx.pdf
[3] Forrester Report of Total Economic Impact of YubiKeys –
https://resources.yubico.com/53ZDUYE6/at/6r45gck4rfvbrspjxwrmcsr/Forrester_Report_Total_Economic_Impact_of_Yubico_YubiKeys.pdf?format=pdf
[4] High Assurance Enterprise FIDO Authentication –
https://media.fidoalliance.org/wp-content/uploads/2023/06/FIDO-EDWG-Spring-2023_Paper-5_High-Assurance-Enterprise-FINAL5.docx-1.pdf
[5] Recommended Account Recovery Practices for FIDO Relying Parties –
https://media.fidoalliance.org/wp-content/uploads/2019/02/FIDO_Account_Recovery_Best_Practices-1.pdf
[6] Passkeys (Passkey Authentication) – https://fidoalliance.org/passkeys/
[7] FIDO Alliance Knowledge Base – https://fidoalliance.org/knowledge-base/

Monday, 16. September 2024

Trust over IP

Trust Over IP Members to Participate in Bhutan Innovation Forum


By Eric Drury

If you’re in the digital trust space, like me you might be encouraged by the number and variety of inspiring developments—appearing on an almost weekly basis—which illustrate the momentum that responsible technology is gaining in the drive towards a more trustworthy and sustainable future.

One of the latest and more interesting initiatives comes—yet again—from the Kingdom of Bhutan in the form of Gelephu Mindfulness City (GMC). GMC is the Kingdom’s unique version of a smart city. They call it a Special Administrative Region, integrating economic growth with mindfulness, holistic living, and sustainability.

How does digital trust fit into GMC?

In October of last year, Bhutan launched the world’s first self-sovereign national digital identity system, Bhutan NDI. And as a pillar of digital public infrastructure (DPI), a digital identity system forms the basis for trusted interactions between individuals, government, and business.

The NDI system, which Bhutan is currently rolling out nation-wide, will be part of the core infrastructure that provides cutting-edge digital connectivity for GMC. As such, this is an exciting opportunity to further cement the viability of emerging SSI and other trust technologies for implementation at population scale.

Trust Over IP is proud to have contributed to Bhutan’s NDI system both formally and informally.

Drummond Reed, one of ToIP's founding board members, provided inspiration, and a road map of sorts, for the development of SSI systems via the book he co-authored, 'Self-Sovereign Identity – Decentralized Digital Identity and Verifiable Credentials'. Drummond and other ToIP members also provided expertise and input for Bhutan's National Digital Identity Act, the NDI Act of Bhutan, relying heavily on the ToIP governance metamodel developed by ToIP's Governance Stack Working Group, led by Scott Perry.

For an in-depth case study on the Bhutan NDI program, refer to Bhutan NDI, Digital Trust Ecosystem, written by DHI’s Pallavi Sharma and Eric Drury, co-chair of ToIP’s Ecosystem Foundry Working Group.

Trust Over IP and Gelephu Mindfulness City further align in that both espouse the intertwining of governance and technology as a core enabler of digital trust.

From a governance perspective, GMC blends robust policies that ‘build trust and accountability with mindful incentives designed to empower both residents and businesses alike to reach their fullest potential’. From a technology perspective, Gelephu Mindfulness City will be built on a foundation of ‘world-class infrastructure seamlessly integrating state-of-the-art technology with sustainable practices’.

In support of GMC, the Kingdom is convening the first Bhutan Innovation Forum (BIF), a global initiative uniting international leaders, innovators, and entrepreneurs to support Bhutan’s vision of a Mindfulness City.

The BIF takes place in less than two weeks, October 1-3, 2024. ToIP members Drummond Reed and Eric Drury are honored to be invited to attend, representing the ToIP community and continuing the engagement with Bhutan’s digital trust community.

Drummond and Eric will appear on panels to promote the principles of digital trust and will share their experience and expertise on all things digital trust, including the emergence of cross-border digital trust networks.

ToIP is thrilled to once again walk the path with Bhutan towards a more sustainable, trustworthy, and equitable digital future. We look forward to sharing our learnings after the Forum so that the entire digital trust community benefits.

The post Trust Over IP Members to Participate in Bhutan Innovation Forum appeared first on Trust Over IP.


Human Colossus Foundation

News from HCF Digital Governance Periscope


Geneva, September 16 2024

As we enter the second half of 2024, a time full of global digital governance initiatives, we're excited to share our progress at the Human Colossus Foundation from the governance perspective. This update encapsulates the recent strides of the Swiss E-ID, G20 Digital Public Infrastructure, UN Development Program, and the inaugural event for European Digital Independence.

Distributed Governance Progress update

In the summer of 2023, the Human Colossus Foundation published Part 1 of the HCF Distributed Governance Model[1]. The model, though abstract in concept, tackles governance through the lens of the principal-agent problem. The core idea is to view information systems (i.e. technology) as agents serving users (i.e. humans), whether individuals, businesses, or any sovereign organisation with decision power.

In today's hyper-connected digital society, we lose control over information. It has become challenging to know what happens to the data we share, and even more difficult to assess the accuracy of the data we use. Part 1 addressed this need for control and accuracy by introducing a governance framework integrating digital technology with existing (non-digital) frameworks.

While Part 1 lays the conceptual foundation, Part 2 turns theory into action. By collaborating directly with key players in digital transformation across various sectors, we are building the technology stack that will demonstrate the crucial role of data authenticity and integrity in shaping a digital governance model. This practical approach is a prerequisite for our future work on, for example, AI governance.

Therefore, Part 2 develops a concept into a reality we shape through multiple projects. We are well underway and excited to share the progress here.


Switzerland’s E-ID - a Sovereign Digital Legal Identity

Swiss High Chamber adopts the E-ID

Sep. 10, Bern: The Swiss parliament's upper chamber (Council of States) agreed to the design principles of the Swiss E-ID by a clear 43-to-1 vote[2]. Together with the lower chamber (National Council), it will iron out the remaining differences and pave the way for parliamentary approval in 2024. This agreement further confirms the readiness of the legislative basis for introducing the Swiss E-ID[3].

HCF Contribution: The E-ID project, led by the Federal Department of Justice, has selected HCF's decentralised semantic architecture, Overlays Capture Architecture (OCA)[4], as a core technology for the first E-ID implementation.

As highlighted during the September 5th E-ID Participation Meeting, the Swiss E-ID leverages OCA to securely present verifiable credentials in digital wallets. We are proud to be part of this significant project; more information is available in the E-ID project's official GitHub repository[5].

Europe Moves Toward EU Digital Independence

Sep. 24, Brussels: Kick-off event.

A diverse group of EU Parliament members from different parties, together with leading European and international experts, will gather to discuss the critical building blocks for a secure, accountable digital public infrastructure. The ultimate goal is to firmly establish the objective of European Digital Independence in the next EU Commission agenda. The Human Colossus Foundation has accepted an invitation to participate in person in this effort.

Horizon Europe Grant No101093126

HCF Contribution: Digital independence requires technological independence. The Human Colossus Foundation develops an open-source distributed technology stack for implementing applications based on a distributed governance model. Creating these technologies and making them accessible to everyone requires developing an ecosystem of tools to harmonise data across multiple stakeholders (possibly millions) and ensure interoperability across different jurisdictions. The Foundation is creating some of these tools as part of the digital healthcare NextGen EU Horizon project[6], with funding from the EU, Switzerland, and the UK, to integrate sensitive health data (including genomics) into personalised medicine for cardiology. The press release of the European Society of Cardiology provides more information[7].

International: Momentum behind Digital Public Infrastructure

Oct. 1–3, Cairo: Global Summit on DPI (Digital Public Infrastructure)

The Human Colossus Foundation will be present at this convening of stakeholders in the Digital Public Infrastructure (DPI) ecosystem. This event will present how the HCF Dynamic Data Economy governance model and technology stack can support DPI implementation strategies for sustainable horizontal scaling. Our approach enables effective data exchange for economic development.

HCF Contribution: In 2023, the UN Development Programme (UNDP) published a model governance framework for digital legal identity systems[8]. Within that framework, UNDP develops the model governance assessment for data exchanges that respect countries' (i.e. sovereign entities') existing governance and fundamental human rights principles. The Human Colossus Foundation is part of a UNDP advisory board that provides its expertise in developing this work.

Current Work at the Foundation

The above initiatives provide input for the continuous development of the core HCF technologies. They help to demonstrate the essential relevance of accurate data for digital governance. From a governance perspective, we can summarise them as follows:

The 'Ambient Infrastructure' is a reputation-based authentication system built upon the decentralised key management infrastructure KERI (Key Events Log Receipts Infrastructure). Launched in 2023 as part of the EU Horizon 2020 eSSIF-Lab project[9], the Ambient Infrastructure is now advancing in NextGen, an EU Horizon project focused on personalised cardiovascular medicine.

Version 2.0 of the OCA specification[4], supporting the OCA ecosystem. Decentralised semantic architectures are significant developments for digital governance, ensuring that different sovereign digital governance frameworks are respected.

OCA Ecosystem v1.0: community-based solutions and tooling, including extensions (i.e. overlays not part of the core specification). The Human Colossus Research and Technology Councils have opened a dedicated focus group to collect community requirements.

This ecosystem, featuring a suite of tools and protocols, ensures consistent and interoperable data flows across multiple stakeholders and jurisdictions.

Conclusion

The Human Colossus Foundation has a busy autumn ahead. If you would like to help map a distributed governance framework into real-world applications, the HCF Research Council invites you to join its new Focus Group.

Joining this group will provide you with the opportunity to work closely with our team, share your expertise, and help shape the future of digital governance. We are also expanding our network of subject matter experts. Please get in touch with rc@humancolossus.org to learn more and express your interest.

References

[1] “Distributed Governance: a Principal-Agent Approach to Data Governance -- Part 1 Background & Core Definitions”, P.Page, P.Knowles, R.Mitwicki, arXiv:2308.07280v2 [cs.CY] , Aug. 15 2023

[2] “PARLAMENT IST SICH ÜBER AUSGESTALTUNG DER E-ID IM GRUNDSATZ EINIG”, SDA
KEYSTONE-SDA-ATS AG, September 10 2024

[3] “Parliament gets closer to finalising new digital ID scheme”, Swiss Info.ch, September 10 2024

[4] “Overlays Capture Architecture: Official Resources”, ColoSSI Network website, August 2023

[5] “Specification de Design pour les preuves électroniques”, Swiss E-ID Participation-Meeting, September 9 2024

[6] “Next Generation Tools for Genome-Centric Multimodal Data Integration in Personalised Cardiovascular Medicine”, EU Horizon Grant Number 101136962, funded by the EU, the Swiss State Secretariat for Education, and UK Research & Innovation.

[7] “Heart patients set to receive treatment tailored to their genetic and health information”, European Society of Cardiology Press Release, February 12 2024
[8] “UNDP Model Governance Framework for Digital Legal Identity Systems“, United Nations Development Programme & Norwegian Ministry of Foreign Affairs. Link as of September 16 2024

[9] “Decentralized Key Management Infrastructure for SSI by The Human Colossus Foundation“, NGI eSSIF-Lab project July 2021


Identity At The Center - Podcast


The Identity at the Center podcast was on the scene in Washington DC attending the Identity Week America conference. We had the opportunity to sit down with Ryan Galluzzo from NIST to talk about the process of updating NIST standards, assurance levels, and existential questions from Ryan’s son.

You can watch the episode here https://www.youtube.com/watch?v=NtWRrmoltQQ or listen to it in your favorite podcast app.

#iam #podcast #idac


We Are Open co-op

An Introduction to Systems Thinking

Part 3: Identifying leverage points

This is the third in a series of posts about Systems Thinking, an approach that helps us make sense of complex situations by considering the whole system rather than just the individual pieces from which it is constituted.

This series is made up of:

Part 1: Three Key Principles
Part 2: Understanding Feedback Loops
Part 3: Identifying Leverage Points (this post)

In the first post, we laid the groundwork by exploring the foundational principles of Systems Thinking. We then explored feedback loops in the second post, examining how they drive system behaviour and contribute to stability or change.

In this concluding post, we turn our attention to Leverage Points — those critical areas within a system where a small shift can lead to profound changes. By understanding and identifying these points, you can uncover powerful opportunities for intervention, allowing you to drive meaningful and lasting change within complex systems.

1. What are leverage points?

“When we must deal with problems, we instinctively refuse to try the way that leads through darkness and obscurity. We wish to hear only of unequivocal results, and completely forget that these results can only be brought about when we have ventured into and emerged again from the darkness.” — Carl Jung

Leverage points are specific areas within a system where a small change can produce a significant impact on the entire system. These points often require careful analysis and a deep understanding of the system’s structure and behaviour, as they are not always immediately obvious. Identifying leverage points can be challenging because they are often hidden beneath the surface of the system’s visible components.

For example, in the world of technology, offering short, simple, clear (e.g. no legalese) ‘Terms and Conditions’ might be a leverage point. By helping users understand exactly how your organisation uses their data, you might improve customer satisfaction, increase trust, and reduce misunderstandings — all by making it easy for people to understand your company’s policy on privacy.

Leverage points, therefore, offer a way to create meaningful change by focusing on areas where even minimal effort can lead to substantial results. Recognising these points requires stepping back and viewing the system holistically, understanding not just the individual parts but how they interact and influence each other.

2. The role of paradigms in leverage points

“Inquiry is the controlled or directed transformation of an indeterminate situation into one that is so determinate in its constituent distinctions and relations as to convert the elements of the original situation into a unified whole.” — John Dewey

Paradigms are the underlying beliefs and assumptions that shape how we understand and interact with a system. These paradigms represent some of the most powerful leverage points, as changing a paradigm can fundamentally alter the way a system operates. Shifting these beliefs requires questioning what is often taken for granted and being open to new perspectives.

To go back to our Terms and Conditions example, a paradigm shift from prioritising user convenience to emphasising data privacy can serve as a transformative leverage point. Traditionally, many tech companies have focused on creating products that are easy to use, often at the expense of data privacy. However, as the paradigm shifts towards valuing privacy, we see significant changes in how technologies are designed and deployed.

This shift has been significantly driven by policy changes, such as the implementation of the European General Data Protection Regulation (GDPR). The GDPR has forced companies to rethink how they collect, store, and manage user data, prioritising privacy by design. As a result, this new perspective has encouraged the development of more secure products, the adoption of stricter data management practices, and an overall increase in consumer trust. The influence of GDPR illustrates how policy can drive paradigm shifts, leading to widespread changes not just in product development but also in corporate strategies, legal frameworks, and consumer expectations. This example demonstrates the profound impact that changing a paradigm, supported by regulatory measures, can have on an entire system.

By addressing paradigms, especially through supportive policies, you are working at the root of the system’s behaviour. Changing the underlying beliefs that drive a system can lead to widespread and lasting change, making paradigm shifts one of the most effective leverage points in Systems Thinking.

3. Small, incremental changes as leverage points

“Of any stopping place in life, it is good to ask whether it will be a good place from which to go on as well as a good place to remain.” — Mary Catherine Bateson

Small, well-placed actions can serve as powerful leverage points within a system. These might involve slight adjustments in policy, processes, or resource allocation. The key is to identify where these small changes can have a disproportionately large impact, leading to significant shifts in the system without the need for extensive interventions.

For example, in the context of data privacy, implementing a small but strategic change, such as requiring explicit user consent for data sharing, can have a profound impact. A great example of this is the introduction of GDPR-compliant consent forms across digital platforms in the EU. This seemingly minor policy change has led to a significant increase in user awareness and control over their personal data, contributing to greater trust in online services. The increased transparency and control have demonstrated how a small, well-placed policy change can lead to substantial benefits for both users and organisations, highlighting the power of incremental changes at the right leverage points.

The success of such small interventions lies in their ability to trigger widespread behavioural changes without the need for drastic overhauls. By carefully identifying and implementing these changes, we can achieve meaningful and lasting impacts across various systems.

4. Communication as a leverage point

“We are changed not only by being talked to but also by hearing ourselves talk to others, which is one way of talking to ourselves. More exactly, we are changed by making explicit what we suppose to have been awaiting expression a moment before.” — Geoffrey Vickers

Communication within a system can serve as a crucial leverage point. The way information is shared and understood can significantly influence the system’s behaviour, impacting everything from decision-making processes to overall system efficiency. Improving communication channels, ensuring transparency, and making sure all stakeholders are heard can lead to more effective decision-making and positive outcomes.

For instance, in our work at We Are Open Co-op, practicing open communication and transparency is central to our project management approach. By sharing progress, challenges, and decision-making processes openly with all stakeholders, we foster a culture of trust and collaboration. This openness extends to how we handle privacy; by being transparent about how we collect, use, and protect data, we build confidence among our partners and clients. Implementing tools that enable open documentation and communication, such as public repositories or shared workspaces, not only keeps everyone informed but also reinforces accountability and integrity in our practices. This small adjustment in how information is shared and protected can lead to significant improvements in project outcomes, ensuring that tasks are completed effectively while respecting privacy concerns.

The power of communication as a leverage point lies in its ability to unify and align the various components of a system. By enhancing the flow of information and ensuring that all voices are heard, you can create a more cohesive, responsive, and effective system.

5. Feedback Loops as Leverage Points

“The major problems in the world are the result of the difference between how nature works and the way people think.” — Gregory Bateson

Feedback loops, as discussed in the second post, can also act as leverage points within a system. By altering how these loops operate, you can influence the system’s overall behaviour. Identifying and adjusting these loops — whether reinforcing (positive) or balancing (negative) — can be a powerful way to steer the system toward desired outcomes.

For example, in the context of data privacy, feedback loops between user behaviour and data management practices can be effectively leveraged to enhance privacy protection. By providing users with real-time feedback on how their data is being used and giving them the ability to adjust their privacy settings, you create a positive feedback loop. As users become more aware of how their data is handled, they are often motivated to take greater control over their privacy settings. This increased control leads to further trust in the platform, encouraging users to engage more actively with privacy tools. Over time, this feedback loop can significantly strengthen overall data protection practices, leading to better compliance with regulations and higher user satisfaction.

This example highlights how adjusting feedback loops can lead to substantial changes in system behaviour with relatively simple interventions. By strategically modifying these loops, you can achieve desired outcomes more effectively and sustainably.

Conclusion

Leverage points offer powerful opportunities to influence complex systems with relatively small, strategic actions. By identifying where these points lie and understanding how to use them effectively, you can create meaningful and lasting change within any system. Whether you are looking to improve organisational processes, drive social change, or enhance personal decision-making, the principles of Systems Thinking can help you navigate complexity with greater clarity and purpose.

This post, along with the previous two in this series, provides an introduction to the foundational principles of Systems Thinking and their practical applications. These insights are drawn from my studies toward an MSc in Systems Thinking in Practice through the Open University, and they are just the beginning of what Systems Thinking can offer.

If you’re interested in applying these principles to your work, We Are Open Co-op is here to help you implement effective systemic interventions tailored to your unique challenges. Thank you for following this series — your journey into Systems Thinking doesn’t have to end here. Continue to explore, apply, and refine these concepts, and watch how they can transform the way you approach complex problems.

References

Ackoff, R.L. (1974). Redesigning the Future: A Systems Approach to Societal Problems. New York: Wiley.
Bateson, G. (1972). Steps to an Ecology of Mind. San Francisco: Chandler Publishing Company.
Bateson, M.C. (1994). Peripheral Visions: Learning Along the Way. New York: HarperCollins.
Beer, S. (1972). Brain of the Firm. New York: Herder and Herder.
Checkland, P. (1981). Systems Thinking, Systems Practice. Chichester: John Wiley & Sons.
Dewey, J. (1938). Logic: The Theory of Inquiry. New York: Holt, Rinehart, and Winston.
Jung, C.G. (1957). The Undiscovered Self. New York: Little, Brown, and Co.
Meadows, D.H. (2008). Thinking in Systems: A Primer. White River Junction: Chelsea Green Publishing.
Vickers, G. (1965). The Art of Judgement: A Study of Policy Making. London: Chapman & Hall.

An Introduction to Systems Thinking was originally published in We Are Open Co-op on Medium, where people are continuing the conversation by highlighting and responding to this story.


Hyperledger Foundation

Welcome to Linux Foundation Decentralized Trust!


Today marks a significant moment for the Linux Foundation and the entire decentralized technology landscape. I’m excited to announce the official launch of LF Decentralized Trust, a new umbrella organization dedicated to fostering innovation and collaboration across decentralized technologies. Built on 8+ years of open development and community building, Linux Foundation Decentralized Trust (LF Decentralized Trust) is the new home for a growing ecosystem of blockchain, ledger, identity, interoperability, cryptographic, and related technologies. We are launching with more than 100 founding members and 200 local, regional, and industry groups that convene tens of thousands of participants globally. We have 17 projects, well over 50 labs, and a mature standards project with Trust over IP. And we are just getting started.


Introducing Hiero: Bringing Hedera’s Core Network Software to Linux Foundation Decentralized Trust


In the ever-evolving landscape of decentralized technologies, few innovations stand out like the Hedera network. Offering unparalleled speed, security, and scalability, Hedera's Hashgraph consensus algorithm has demonstrated its ability to power enterprise-grade distributed ledger solutions. Today, we are excited to announce the next chapter in the Hedera network's journey—its core software will be contributed to Linux Foundation Decentralized Trust under the new name Hiero.


Lockness: A new home for trusted key cryptography


We are thrilled to introduce Lockness, a new open-source ecosystem focused on key management and digital signature protocols, including technologies like Multi-Party Computation (MPC), Threshold Signature Schemes (TSS) and other state-of-the-art protocols in key cryptography. Our company, Dfns, has proudly contributed its open-source libraries to Lockness, which is a project of the just launched LF Decentralized Trust.


Hyperledger Fabric v3: Delivering Smart Byzantine Fault Tolerant consensus

Introducing Byzantine Fault Tolerant for Hyperledger Fabric

Friday, 13. September 2024

OpenID

Proposed Revisions to OpenID Process Document and IPR Policy

Dear OpenID Foundation Members,

A subgroup of OpenID Foundation board members and key staff have been working to update the “OpenID Process” document based on issues raised by some board members, to ensure the document aligns with how the Foundation currently works. This update addresses those original issues; the work also identified a significant number of mainly editorial issues and possible improvements. It also highlighted inconsistencies and other issues that required coordinating revisions with the “Intellectual Property Rights (IPR) Policy,” so that document was added to the scope and improvements proposed. The changes were unanimously approved by the board at the September 12, 2024 board meeting. Approving these changes also requires a 21-day review and a 14-day vote of the membership.
Process Document Material Changes

Throughout the document, additions of the intended “, and Final Specifications Incorporating Errata Corrections” in the various clauses

Addition of 4 new defined terms for clarity: “Non-Core Decision”, “Core Decision”, “Quorum Requirement”, “Substantive Change”

Updated definition of “Specifications Council” and edit of Section 2 to address Specs Council membership requirements

Moved the descriptions of elements of Sections 3.1 and 3.2 into “Table 1 – Work Group Decision points”

Table 2 – Re-drafted for clarity the decision options for Inter-WG Core decisions and updated the quorum requirement to be more realistic: “20% of the Active Contributors to the applicable WG or 20 Contributors to the applicable WG, whichever is lower.”

Sections 4.4, 4.7, 4.10 replace “will promptly be submitted to a vote of the OIDF membership” with “will promptly be submitted for a Supermajority confirmation vote of the Board”

Section 4.8 adds “Work Group Repositories” to places where “work of a WG will be conducted”

Section 4.8 updates “WG documents” to “Implementers Drafts, Final Specifications, and Final Specifications Incorporating Errata Corrections”

Section 4.10 (and the corresponding part of Table 2) defines WG meeting quorum and Intra-WG non-core quorum as at least six Contributors

Section 5.1 is redrafted to reflect the real stages of the Specification approval process and to make it clearer
IPR Policy Material Changes

Throughout the document, additions of the intended “, and Final Specifications Incorporating Errata Corrections” in the various clauses

Addition of “Table 1: Overview of IPR applicability” – intended to help the reader comprehend the rest of the document and see to which document types the rights and obligations apply

Addition of the defined term “Errata Corrections”

Addition of the defined term “Final Specification Incorporating Errata Corrections”

Significant re-draft of the definition of “Specification”

Addition of the defined term “Work Group Draft”

Re-organisation of section “V. Copyrights” to improve clarity and to be much clearer about the two artifact types and which rights and obligations apply to each

Addition of a new paragraph entitled “(b) License”, which explains the license that the OIDF grants to others
Links

Links to current documents:

Current OpenID Process Document
Current OIDF Intellectual Property Rights (IPR) Policy

Links to redlined versions of the documents:

Redlined OpenID Process Document
Redlined OIDF Intellectual Property Rights (IPR) Policy
Key Milestones

Friday, September 13, 2024 – 21-day member review begins
Friday, October 4, 2024 – 21-day member review concludes
Saturday, October 5, 2024 – 14-day member voting opens
Saturday, October 19, 2024 – member voting concludes

The vote begins on Saturday, October 5, 2024 and takes place at https://openid.net/foundation/members/polls/340.

Marie Jordan
OpenID Foundation Secretary

The post Proposed Revisions to OpenID Process Document and IPR Policy first appeared on OpenID Foundation.


Oasis Open Projects

Invitation to comment on two KMIP specifications


Public review ends October 14th

OASIS and the KMIP TC are pleased to announce that KMIP Version 3.0 and KMIP Profiles Version 3.0 are now available for public review and comment. 

The OASIS KMIP TC works to define a single, comprehensive protocol for communication between encryption systems and a broad range of new and legacy enterprise applications, including email, databases, and storage devices. By removing redundant, incompatible key management processes, KMIP will provide better data security while at the same time reducing expenditures on multiple products.

The documents and all related files are available here:

Key Management Interoperability Protocol (KMIP) Version 3.0
Committee Specification Draft 01
23 August 2024

Editable Source: https://docs.oasis-open.org/kmip/kmip-spec/v3.0/csd01/kmip-spec-v3.0-csd01.docx (Authoritative)

HTML: https://docs.oasis-open.org/kmip/kmip-spec/v3.0/csd01/kmip-spec-v3.0-csd01.html

PDF: https://docs.oasis-open.org/kmip/kmip-spec/v3.0/csd01/kmip-spec-v3.0-csd01.pdf

For your convenience, OASIS provides a complete package of the specification document and any related files in a ZIP distribution file. You can download the ZIP file at:  https://docs.oasis-open.org/kmip/kmip-spec/v3.0/csd01/kmip-spec-v3.0-csd01.zip

Key Management Interoperability Protocol (KMIP) Profiles Version 3.0
Committee Specification Draft 01
30 November 2024

Editable Source: https://docs.oasis-open.org/kmip/kmip-profiles/v3.0/csd01/kmip-profiles-v3.0-csd01.docx (Authoritative)

HTML: https://docs.oasis-open.org/kmip/kmip-profiles/v3.0/csd01/kmip-profiles-v3.0-csd01.html

PDF: https://docs.oasis-open.org/kmip/kmip-profiles/v3.0/csd01/kmip-profiles-v3.0-csd01.pdf

Test Cases: https://docs.oasis-open.org/kmip/kmip-profiles/v3.0/csd01/test-cases/

For your convenience, OASIS provides a complete package of the specification document and any related files in a ZIP distribution file. You can download the ZIP file at: https://docs.oasis-open.org/kmip/kmip-profiles/v3.0/csd01/kmip-profiles-v3.0-csd01.zip

How to Provide Feedback

OASIS and the KMIP TC value your feedback. We solicit input from developers, users and others, whether OASIS members or not, for the sake of improving the interoperability and quality of its technical work.

The public review starts September 13, 2024 at 00:00 UTC and ends October 14, 2024 at 23:59 UTC.

Comments from TC members should be sent directly to the TC’s mailing list. Comments may be submitted to the project by any other person through the use of the project’s Comment Facility: https://groups.oasis-open.org/communities/community-home?CommunityKey=2b5e5c66-cc41-4aa5-92ee-018f5aa7dfc4

Comments submitted for this work by non-members are publicly archived and can be viewed by using the link above and clicking the “Discussions” tab.

Please note, you must log in or create a free account to see the material. Please contact the TC Administrator (tc-admin@oasis-open.org) if you have any questions regarding how to submit a comment.

All comments submitted to OASIS are subject to the OASIS Feedback License, which ensures that the feedback you provide carries at least the same obligations as those of the TC members. In connection with this public review, we call your attention to the OASIS IPR Policy [1], especially as applicable [2] to the work of this technical committee. All members of the TC should be familiar with this document, which may create obligations regarding the disclosure and availability of a member's patent, copyright, trademark, and license rights that read on an approved OASIS specification.

OASIS invites any persons who know of any such claims to disclose these if they may be essential to the implementation of the above specification, so that notice of them may be posted to the notice page for this TC’s work.

Additional information about the specification and the KMIP TC can be found at the TC's public home page: https://www.oasis-open.org/committees/kmip/

Additional references:

[1] https://www.oasis-open.org/policies-guidelines/ipr/

[2] http://www.oasis-open.org/committees/kmip/ipr.php


The post Invitation to comment on two KMIP specifications appeared first on OASIS Open.

Wednesday, 11. September 2024

Digital ID for Canadians

Spotlight on IndyKite


1. What is the mission and vision of IndyKite?

Backed by leading venture firms and based in San Francisco, IndyKite is building a new category of data management and digital identity services by capturing, connecting and controlling data across the enterprise and surrounding ecosystem. With an identity-centric approach to data, IndyKite enables companies to achieve higher trust in their data products, AI and applications with enhanced visibility, data governance and granular access controls. Leveraging knowledge graph technology and machine learning, IndyKite delivers a powerful operational data layer to enable developers with flexible APIs through a growing open-source ecosystem. Learn more at www.indykite.com.

2. Why is trustworthy digital identity critical for existing and emerging markets?

Digital identity is a core enabler of applications and services and will only become more important in the future. Digital identity not only applies to humans, but also to all devices, applications, systems, AI, digital products and even individual data points. Securing these identities is paramount, but even more important, is understanding how they drive and enable user experience, functionality and data mobility across the organization. At IndyKite, we see digital identity as the starting point for enabling businesses to build modern solutions, deliver incredible customer experiences and ensure trustworthy AI.

3. How will digital identity transform the Canadian and global economy? How does your organization address challenges associated with this transformation?

Modern organizations around the world are undermined by siloed data and disconnected identities. The advent of AI tools is increasing pressure for leaders to address data challenges in the organization to ensure future viability. IndyKite enables organizations to capture, connect and control their data in a flexible and dynamic way, to drive better decisions, security, machine learning and AI and solve challenges.

4. What role does Canada have to play as a leader in this space?

Canada holds a place of global influence as a leading voice in many sectors. By pioneering a secure and accessible digital identity framework, Canada has ensured the sustainability of its modern economy into the future and created a blueprint for other nations to follow.

5. Why did your organization join the DIACC?

As digital identity is an essential part of the future, it needs modern approaches and frameworks that enable innovation without being restricted by legacy thinking. DIACC is an ideal forum for public and private sector leaders to discuss, design, and accelerate these approaches to ensure digital trust into the future.

6. What else should we know about your organization?

Powered by graph technology, the IndyKite platform increases visibility, trust and control of your data. This enables data pipelines to get the right data, to the right place and in the right context to drive enhanced product development and new revenue channels. It also enables the secure sharing of data beyond the bounds of your organization, and better customer journeys with native identity workflows. More details can be found at www.indykite.com


Hyperledger Foundation

Introducing Signare, a Hyperledger Lab


Contributors from Adhara and ioBuilders are teaming up to launch Signare, a Hyperledger lab, which is an enterprise-grade digital signing solution for Ethereum clients.


Next Level Supply Chain Podcast with GS1

How U.S. Customs and Border Protection Is Connecting Trade and Sustainability with Lea-Ann Bigelow


Imagine a world where combating climate change and protecting the environment is integrated into every step of global trade. 

In this episode, hosts Reid Jackson and Liz Sertl are joined by Lea-Ann Bigelow, Director of Green Trade at U.S. Customs and Border Protection (CBP). With a wealth of experience in environmental regulation and sustainability, Lea-Ann shares how U.S. Customs is evolving to meet the challenges of climate change through innovative trade practices.

Lea-Ann discusses how CBP's efforts are not just about regulating imports but about leading the charge in reducing emissions, enhancing traceability, and fighting environmental crimes. By integrating sustainability into the global supply chain, these initiatives are paving the way for a cleaner, safer world.


In this episode, you’ll learn:

How U.S. Customs and Border Protection is pioneering green trade practices to reduce emissions and enhance sustainability across global supply chains.

The implementation of advanced traceability systems to combat environmental crime and ensure compliance in international trade.

The collaborative strategies between government and industry that are shaping a more resilient and environmentally responsible future for global trade.


Jump into the Conversation:

[00:00] Introducing Next Level Supply Chain

[00:40] Lea-Ann Bigelow Discusses Green Trade Initiatives at CBP

[01:23] The Dark Side of Trade: Environmental Crime and Its Ties to Other Offenses

[01:52] Why GS1 Connect Is Crucial for Environmental Compliance

[02:55] The Role of GS1 Standards in Enhancing Global Trade Compliance

[04:41] Developing the Green Standard for Global Trade

[04:50] International Collaboration on Environmental Regulations

[05:31] Navigating Complex Global Environmental Regulations

[06:02] Closing Thoughts: The Future of Green Trade


Connect with GS1 US:

Our website - www.gs1us.org

GS1 US on LinkedIn

 

Connect with the guests:

Lea-Ann Bigelow on LinkedIn

Tuesday, 10. September 2024

DIF Blog

DIF Announces Two New Work Items in Identifiers & Discovery Working Group


The Decentralized Identity Foundation is at the forefront of innovating standards and technologies for decentralized digital identity. Today, we're excited to announce the launch of two new work items within our Identifiers & Discovery Working Group, aimed at improving the functionality and security of Decentralized Identifiers (DIDs).

Markus Sabadello, Founder & CEO of DanubeTech, DIF Steering Committee member, and Co-Chair of the ID & Discovery WG, emphasizes the importance of these new initiatives:

"The I&D WG recognizes that identifiers will always be the foundation of any digital identity system. For this reason, I am excited about the two latest work items in our working group - DID Traits and Trust DID Web. Both will be extraordinarily useful in our continued quest to build strong and useful identifier systems that other technologies and protocols can rely on."
DID Traits

JC Ebersbach, CEO of identinet GmbH and newly-added Co-Chair of the ID & Discovery WG, initiated the DID Traits work item. DID Traits focuses on helping implementers choose the most appropriate DID method for their specific use cases.

Key aspects of the DID Traits work item include:

Identifying and defining key traits of identifiers that have proven significant

Providing a JSON schema for DID method authors to present their method's characteristics in a structured, machine-readable format

Enabling third-party systems, such as DID resolvers, to utilize this structured information effectively

The DID Traits work builds upon existing literature and specifications in the field, aiming to streamline the process of DID method selection and implementation.
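
To make the idea concrete, here is a minimal sketch in TypeScript of what a machine-readable traits document and a trait-based filter might look like. The interface, the trait names, and the helper function are illustrative assumptions, not the working group's actual schema.

// Hypothetical machine-readable traits document for an imaginary method
// "did:example". Trait names are illustrative assumptions; the normative
// schema is what the DID Traits work item will define.
interface DidMethodTraits {
  method: string;          // DID method name, e.g. "example"
  updateable: boolean;     // can DID Documents be updated after creation?
  deactivatable: boolean;  // can the DID be deactivated?
  selfCertifying: boolean; // is the identifier derived from initial key material?
  keyPreRotation: boolean; // does the method support key pre-rotation?
  feeRequired: boolean;    // does creating or updating the DID cost anything?
}

const exampleTraits: DidMethodTraits = {
  method: "example",
  updateable: true,
  deactivatable: true,
  selfCertifying: false,
  keyPreRotation: false,
  feeRequired: false,
};

// A resolver or registry could filter candidate methods by required traits.
function supports(traits: DidMethodTraits, required: Partial<DidMethodTraits>): boolean {
  return Object.entries(required).every(
    ([key, value]) => traits[key as keyof DidMethodTraits] === value,
  );
}

console.log(supports(exampleTraits, { updateable: true, deactivatable: true })); // true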

JC shares his motivation behind the DID Traits initiative:

"The introduction of Trust DID Web (did:tdw) at IIW 39 inspired me to initiate the DID Traits work item. I’m thrilled that both initiatives have been adopted by the ID & Discovery working group. My aim is to provide implementers with simple, practical tools that will lead to more widespread integration of Self-Sovereign Identity technologies. With DID Traits, we aim to simplify the process of selecting the most appropriate DID method for specific use cases, making adoption easier and more effective."
Trust DID Web

Stephen Curran, Founder of Cloud Compass Computing and Chair of the Sovrin Foundation Board of Trustees, leads the Trust DID Web work item. This initiative introduces the did:tdw method specification as an enhancement to the existing did:web method.

Key features of Trust DID Web DIDs include:

Ongoing publication of all DID Document versions

A verifiable chain of updates from genesis to deactivation

Self-certifying identifiers for DIDs, enabling portability

Signed proofs for DID Document updates

Optional mechanisms for pre-rotation keys and collaborating witnesses

Standardized DID URL path handling

A "/whois" path for returning a verifiable presentation about the DID

Trust DID Web aims to provide greater trust and security while maintaining the simplicity of did:web, fostering a more trusted and secure web environment for decentralized digital identities.
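
As a rough illustration of the "verifiable chain of updates" idea, the following TypeScript sketch hash-links log entries from genesis onward and checks each link. The entry shape and the injected signature check are simplifying assumptions; the actual log format and proofs are defined by the did:tdw specification itself.

import { createHash } from "node:crypto";

// Minimal sketch of a hash-linked version log, in the spirit of did:tdw.
interface LogEntry {
  versionId: string;    // e.g. "3-<entryHash>" (format assumed for illustration)
  didDocument: unknown; // the DID Document as of this version
  previousHash: string; // hash of the preceding entry ("" for the genesis entry)
  proof: string;        // signature over the entry by an authorized key
}

function hashEntry(entry: LogEntry): string {
  const material = JSON.stringify([entry.versionId, entry.didDocument, entry.previousHash]);
  return createHash("sha256").update(material).digest("hex");
}

// Walk the log from genesis to the latest version, checking that every entry
// links to its predecessor and carries a valid proof (verification is stubbed).
function verifyChain(log: LogEntry[], verifyProof: (e: LogEntry) => boolean): boolean {
  let prev = "";
  for (const entry of log) {
    if (entry.previousHash !== prev) return false; // broken hash link
    if (!verifyProof(entry)) return false;         // invalid signature
    prev = hashEntry(entry);
  }
  return true;
}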

Stephen Curran explains the goals of the Trust DID Web initiative:

"I’m happy to lead the new Trust DID Web (did:tdw) initiative at DIF. Our goal is to enhance did:web with important security and verifiability features from the world of ledger-based DIDs, while keeping the simplicity that makes web-based DIDs so accessible. These improvements are designed to boost trust and security without adding complexity. We think that did:tdw provides a solid foundation for decentralized digital identities that are secure and easy to deploy for a variety of applications."
Join Us

We invite everyone to join us in these exciting new initiatives. As Markus Sabadello notes, "We welcome anyone to our working group who would like to contribute to these new work items, or propose any other identifier-related topic."

Both meetings are bi-weekly:

DID Traits on Tuesdays at 8:00 Pacific / 17:00 Central Europe

Trust DID Web on Thursdays at 9:00 Pacific / 18:00 Central Europe

To participate in these and other DIF work items:

👉 Join DIF
📆 Subscribe to the DIF Calendar
📧 Contact membership@identity.foundation with questions

Stay tuned for updates as we progress with these important work items that will shape the future of decentralized identity.


Oasis Open Projects

Invitation to comment on LEXIDMA DMLex Version 1.0 – CSD04


Public Review Ends October 11th

OASIS and the LEXIDMA TC are pleased to announce that DMLex V1.0 CSD04 is now available for public review and comment. 

DMLex is a data model for modelling dictionaries (here called lexicographic resources) in computer applications such as dictionary writing systems. DMLex is a data model, not an encoding format. DMLex is abstract, independent of any markup language or formalism. At the same time, DMLex has been designed to be easily and straightforwardly implementable in XML, JSON, NVH, as a relational database, and as a Semantic Web triplestore.
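
As a purely illustrative, non-normative example of the kind of data DMLex models, a single entry with one headword and two senses might serialize along these lines; consult the CSD04 schemas linked below for the actual property names and structure.

// Illustrative only: one DMLex-style entry ("bank", noun, two senses) modeled
// as a TypeScript object. Property names are assumptions for readability.
const entry = {
  headword: "bank",
  partOfSpeech: "noun",
  senses: [
    { definition: "A financial institution that accepts deposits and makes loans." },
    { definition: "The sloping land alongside a river or lake." },
  ],
};

console.log(JSON.stringify(entry, null, 2));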

The documents and all related files are available here:

LEXIDMA Data Model for Lexicography (DMLex) V1.0
Committee Specification Draft 04
06 September 2024

Editable Source (Authoritative):
https://docs.oasis-open.org/lexidma/dmlex/v1.0/csd04/dmlex-v1.0-csd04.pdf
HTML:
https://docs.oasis-open.org/lexidma/dmlex/v1.0/csd04/dmlex-v1.0-csd04.html

Schemas:

XML: https://docs.oasis-open.org/lexidma/dmlex/v1.0/csd04/schemas/XML/

JSON: https://docs.oasis-open.org/lexidma/dmlex/v1.0/csd04/schemas/JSON/

RDF: https://docs.oasis-open.org/lexidma/dmlex/v1.0/csd04/schemas/RDF/

Informative copies of third party schemas are provided:

https://docs.oasis-open.org/lexidma/dmlex/v1.0/csd04/schemas/informativeCopiesOf3rdPartySchemas/

For your convenience, OASIS provides a complete package of the specification document and any related files in a ZIP distribution file, which you can download from the CSD04 directory at: https://docs.oasis-open.org/lexidma/dmlex/v1.0/csd04/

How to Provide Feedback

OASIS and the LEXIDMA TC value your feedback. We solicit input from developers, users and others, whether OASIS members or not, for the sake of improving the interoperability and quality of its technical work.

The public review starts September 10, 2024 at 00:00 UTC and ends October 11, 2024 at 23:59 UTC.

Comments may be submitted to the project by any person through the use of the project’s Comment Facility: https://groups.oasis-open.org/communities/community-home?CommunityKey=b7061122-77c2-424a-8859-018dce26037f

Comments submitted for this work by non-members are publicly archived and can be viewed by using the link above and clicking the “Discussions” tab.

Please note, you must log in or create a free account to see the material. Please contact the TC Administrator (tc-admin@oasis-open.org) if you have any questions regarding how to submit a comment.

All comments submitted to OASIS are subject to the OASIS Feedback License, which ensures that the feedback you provide carries at least the same obligations as those of the TC members. In connection with this public review, we call your attention to the OASIS IPR Policy [1], especially [2], applicable to the work of this technical committee. All members of the TC should be familiar with this document, which may create obligations regarding the disclosure and availability of a member’s patent, copyright, trademark and license rights that read on an approved OASIS specification.

OASIS invites any persons who know of any such claims to disclose these if they may be essential to the implementation of the above specification, so that notice of them may be posted to the notice page for this TC’s work.

Additional information about the specification and the LEXIDMA TC can be found at the TC’s public home page: https://www.oasis-open.org/committees/lexidma/

Additional references:

[1] https://www.oasis-open.org/policies-guidelines/ipr/

[2] http://www.oasis-open.org/committees/lexidma/ipr.php

Intellectual Property Rights (IPR) Policy

The post Invitation to comment on LEXIDMA DMLex Version 1.0 – CSD04 appeared first on OASIS Open.


DIF Blog

Pre-registration Now Open for DIF’s 2024 Hackathon!


Are you ready to be at the forefront of decentralized identity innovation? Pre-registration is now open for the Decentralized Identity Foundation's highly anticipated hackathon!

This October, coinciding with Hacktoberfest, we're bringing together both seasoned decentralized identity developers and newcomers to explore its transformative potential. The hackathon will spotlight groundbreaking solutions in Education, Reusable Identity, and Travel.

With a stellar lineup of sponsors (more details coming soon!) and prize pools fueled by our community, this hackathon promises to be a landmark event for everyone involved.

Event Details:

Hacking Period: October 1st - November 4th, 2024

Tracks: Education, Reusable Identity, Travel

Join us as we shape the future of decentralized identity!

Be the first to secure your spot—pre-register now and get ready to drive the next wave of innovation!

Pre-register at this link.


We Are Open co-op

Staying on Track

What to Do When Your User Research Project Starts to Veer Off Course

User research is a journey, and like any journey, it doesn’t always go exactly as planned. We’ve written previously about starting your own user research journey and questions to ask that can help get you off on the right foot.

Sometimes though, despite all best intentions and diligent planning with your chosen selection of pre-mortems, checklists, Gantt charts, or Kanban boards, a user research project can still go off course.

There are many reasons why this might happen, and many may seem well beyond your control. In this post, we’ll share some strategies we’ve found helpful in regaining momentum and getting a project back on track.

Common Issues in User Research

cc-by-nd Bryan Mathers for WAO

Before we explore solutions, it’s important to recognise some of the issues that can derail a user research project.

Unclear Objectives
Without a clear understanding of what you want to find out, your project can quickly lose focus. For example, if you’re researching a digital tool’s usability but haven’t defined whether you’re focusing on ease of navigation, visual design, or performance, you may end up collecting feedback that’s too broad or irrelevant. This can lead to scattered efforts and data that doesn’t fully answer the questions you initially set out to explore. Clarity in your objectives also helps you stay within scope — after all, projects that become too complex or overly broad can be difficult to manage.

Recruitment Challenges

Finding the right participants is crucial, but sometimes, despite your best efforts, recruitment doesn’t go as planned. You may find that a particular community or demographic is underrepresented or face difficulty finding enough respondents due to limited access or poor engagement. This can delay your project and result in a participant pool that’s not as diverse or representative as it needs to be.

External Dependencies

Often, research projects rely on external factors like stakeholder input, third-party data, or collaboration with other teams. For example, you could be relying on receiving data from a partner organisation or feedback from a legal team before being able to proceed. When these dependencies fall through, your project can come to a standstill.

Strategies to Get Back on Track

cc-by-nd Bryan Mathers for WAO

Now that we’ve identified some common issues, let’s look at how you can regain momentum and steer your project back on course.

Revisit and Refine Objectives
If your project is losing direction, it might be time to revisit your initial objectives. What is the overarching research question you’re trying to answer? Are your objectives still relevant? Do they need to be refined based on what you’ve learned so far? Narrowing or reaffirming your focus can help streamline your efforts and ensure that the data you collect is meaningful and actionable.

Simplify Your Scope

If you’ve taken on more than you can manage, don’t hesitate to simplify your project’s scope. Focus on the most critical questions and the areas that will have the greatest impact. It’s better to complete a smaller, focused project successfully than to stretch resources too thin across a larger, unfocused one.

Adapt Your Recruitment Approach

When recruitment isn’t going as planned, consider adapting your approach. If your initial outreach didn’t yield the desired results, explore alternative channels or adjust your criteria. Techniques like reaching out through community organisations or leveraging existing networks can help you connect with a broader range of participants while maintaining diversity and inclusivity.

Adjust Your Methods

If certain methods aren’t yielding results, be flexible and try different approaches. For example, if individual interviews aren’t providing the depth of insight you need, consider adding focus groups. The ability to adjust and adapt is a strength in user research.

Communicate and Realign with Stakeholders

If external dependencies are causing delays, clear communication is key. Communicate with stakeholders regularly to identify any roadblocks and realign expectations. Perhaps your pre-mortem foresaw some of the issues that have emerged, and you already have a mitigation plan in place. Sometimes, simply clarifying timelines or adjusting deadlines can help get things moving again.

Refresh the Team

If a project is dragging on and fatigue is setting in, prioritise tasks based on their impact and feasibility. Reassign team members or bring in extra help if possible. Sometimes, a fresh perspective or additional hands can help push through bottlenecks.

Staying Resilient: Embrace the Learning Process

cc-by-nd Bryan Mathers for WAO

Remember, setbacks and challenges in user research are not uncommon, especially when working to include a wide range of voices and experiences. Each obstacle offers an opportunity to learn, grow, and improve your approach. By staying resilient, being flexible, and maintaining clear and empathetic communication, you can navigate difficulties and ensure that your research is both inclusive and impactful.

Ultimately, the ability to adapt and respond thoughtfully to challenges is what strengthens your user research and makes it more effective. Every project, even those that face hurdles, contributes valuable lessons that can be applied to future work. Embrace these lessons, and use them to enhance your research practices, ensuring that all voices are heard and valued.

Need help? We do this a lot, get in touch!

Staying on Track was originally published in We Are Open Co-op on Medium, where people are continuing the conversation by highlighting and responding to this story.


OpenID

Notice of Vote for Proposed Final OpenID Connect for Identity Assurance Specifications


The official voting period will be between Monday, September 23, 2024 and Monday, September 30, 2024 (12:00pm PT), once the 60-day review of the specification has been completed. For the convenience of members who have completed their reviews by then, voting will actually begin on Monday, September 16, 2024.

The eKYC & IDA work group page is https://openid.net/wg/ekyc-ida/. If you’re not already a member, or if your membership has expired, please consider joining to participate in the approval vote. Information on joining the OpenID Foundation can be found at https://openid.net/foundation/members/registration.

The vote will be conducted at https://openid.net/foundation/members/polls/337

Marie Jordan – OpenID Foundation Secretary

The post Notice of Vote for Proposed Final OpenID Connect for Identity Assurance Specifications first appeared on OpenID Foundation.

Monday, 09. September 2024

Oasis Open Projects

Invitation to comment on XLIFF v2.2 CSD01


Public review ends October 10th

OASIS and the XLIFF TC are pleased to announce that XLIFF v2.2 CSD01 Parts 1 & 2 are now available for public review and comment.

This multi-part specification defines Version 2.2 of the XML Localisation Interchange File Format (XLIFF). The purpose of this vocabulary is to store localizable data and carry it from one step of the localization process to the next, while allowing interoperability between and among tools.

The documents and all related files are available here:

XLIFF Version 2.2 Part 1: Core
Committee Specification Draft 01
18 July 2024

Editable Source:
https://docs.oasis-open.org/xliff/xliff-core/v2.2/csd01/xliff-core-v2.2-csd01-part1.xml
HTML (Authoritative):
https://docs.oasis-open.org/xliff/xliff-core/v2.2/csd01/xliff-core-v2.2-csd01-part1.html
PDF:
https://docs.oasis-open.org/xliff/xliff-core/v2.2/csd01/xliff-core-v2.2-csd01-part1.pdf

Schemas: https://docs.oasis-open.org/xliff/xliff-core/v2.2/csd01/Schemas/

XLIFF Version 2.2 Part 2: Extended
Committee Specification Draft 01
18 July 2024

Editable Source:
https://docs.oasis-open.org/xliff/xliff-core/v2.2/csd01/xliff-extended-v2.2-csd01-part2.xml
HTML (Authoritative):
https://docs.oasis-open.org/xliff/xliff-core/v2.2/csd01/xliff-extended-v2.2-csd01-part2.html
PDF:
https://docs.oasis-open.org/xliff/xliff-core/v2.2/csd01/xliff-extended-v2.2-csd01-part2.pdf

Schemas: https://docs.oasis-open.org/xliff/xliff-core/v2.2/csd01/Schemas/

For your convenience, OASIS provides a complete package of the specification documents and any related files in a ZIP distribution file, which you can download from the CSD01 directory at: https://docs.oasis-open.org/xliff/xliff-core/v2.2/csd01/

How to Provide Feedback

OASIS and the XLIFF TC value your feedback. We solicit input from developers, users and others, whether OASIS members or not, for the sake of improving the interoperability and quality of its technical work.

The public review starts September 9, 2024 at 00:00 UTC and ends October 10, 2024 at 23:59 UTC.

Comments may be submitted to the project by any person through the use of the project’s Comment Facility: https://groups.oasis-open.org/communities/community-home?CommunityKey=f7b70a54-5dd7-4ea9-9d6f-018dce262ff9

Comments submitted for this work by non-members are publicly archived and can be viewed by using the link above and clicking the “Discussions” tab.

Please note, you must log in or create a free account to see the material. Please contact the TC Administrator (tc-admin@oasis-open.org) if you have any questions regarding how to submit a comment.

All comments submitted to OASIS are subject to the OASIS Feedback License, which ensures that the feedback you provide carries at least the same obligations as those of the TC members. In connection with this public review, we call your attention to the OASIS IPR Policy [1], especially [2], applicable to the work of this technical committee. All members of the TC should be familiar with this document, which may create obligations regarding the disclosure and availability of a member’s patent, copyright, trademark and license rights that read on an approved OASIS specification.

OASIS invites any persons who know of any such claims to disclose these if they may be essential to the implementation of the above specification, so that notice of them may be posted to the notice page for this TC’s work.

Additional information about the specification and the XLIFF TC can be found at the TC’s public home page: https://www.oasis-open.org/committees/xliff/

Additional references:

[1] https://www.oasis-open.org/policies-guidelines/ipr/

[2] http://www.oasis-open.org/committees/xliff/ipr.php

Intellectual Property Rights (IPR) Policy

The post Invitation to comment on XLIFF v2.2 CSD01 appeared first on OASIS Open.


FIDO Alliance

Passkeys Hackathon Tokyo: A Showcase of Innovation and Excellence


By Atsuhiro Tsuchiya, APAC Market Development Sr. Manager

In June, Google and the FIDO Alliance hosted a highly successful event in Tokyo that brought together innovative minds from various universities and companies. The event was marked by a high level of participation and competition, showcasing the latest advancements in authentication technologies.

Event Highlights:

High Participation: The event saw an impressive turnout, with around 200 participants from 25 different universities and companies.

Cutting-Edge Innovations: Participants showcased groundbreaking solutions aimed at enhancing security and convenience in authentication processes.

Technical Workshops: Engaging workshops provided a platform for sharing practical knowledge and experiences.

Key Features:

Real-World Implementations: A notable aspect of this event was the participation of teams focused on implementing their solutions in real-world services. This added a layer of practicality and relevance, making the event highly impactful.

High-Level Competition: The level of competition was exceptionally high, reflecting the advanced state of current research and development in the field. In particular, the university teams all demonstrated a high level of implementation.

Awards and Recognition:

Grand Winner: Keio University SFC-RG pkLock team (Keio University)

The team developed an innovative authentication system for IoT space that combines security and user convenience, making it a standout solution in the competition.

FIDO Award 1: SKKN (Waseda University)
This team was recognized for their advanced authentication technology that promises to enhance security in various applications.

FIDO Award 2: TOKYU ID (Tokyu)
Their solution focused on integrating authentication technologies into everyday services, demonstrating practical and scalable applications.

Google Award: Team Nulab (Nulab)
The team impressed with their user-friendly authentication app that combines multiple security features to provide a seamless user experience.

What We Learned from the Event:

Collaboration is Key: The event underscored the importance of collaboration between academia and industry. By working together, we can accelerate the development and implementation of innovative authentication solutions.

Focus on User Experience: Many of the successful solutions emphasized the need for a seamless and user-friendly experience. Security should not come at the expense of convenience.

Scalability and Practicality: Solutions that can be easily integrated into existing systems and scaled to meet the needs of various applications are crucial for widespread adoption.

Continuous Innovation: The rapid advancements in authentication technologies highlight the need for continuous innovation and adaptation to stay ahead of emerging threats.

Acknowledgements: We would like to extend our heartfelt thanks to the tutors from the Japan Working Group for their invaluable support and guidance throughout the event. Their expertise and dedication were instrumental in making this event a success.

These award-winning solutions highlight the diverse approaches and innovative thinking that are driving the future of authentication technologies. Each team demonstrated a unique blend of creativity, technical expertise, and practical application, making this event a true showcase of excellence in the field.
The FIDO Alliance is excited to share the outcomes of this event and looks forward to continuing to support and foster innovation in authentication technologies. To learn more about the background and details, please read the full Passkeys Hackathon Tokyo event report.


DIF Blog

Member Spotlight: Moises Jaramillo of Dentity


Moises Jaramillo is a veteran in the software development world with nearly three decades of experience. A Principal Engineer at Dentity, Moises is at the forefront of decentralized identity technology with expertise in Web3/Web5, Decentralized Web Nodes, distributed architectures, and in general a broad range of SSI tech. Moises is also a previous winner of the DID:Hack Contest and TBD’s Hackathon. 

Moises’ work at Dentity is pushing the boundaries of digital identity, as evidenced by Dentity’s recent groundbreaking partnership with ENS Labs. His expertise makes him an ideal guide to help us understand the future of digital identity across the Web2, Web3, and Web5 spaces.

Your career spans a broad range of software development. What specifically drew you to decentralized identity, and how has it shaped your professional journey?

I've been in software development for almost 30 years, mostly on the Microsoft stack. Around the mid-2010s, while working for a digital audience measurement company, I realized how data was being commoditized, often without people's awareness. This commoditization was rewarding the wrong parties - data brokers and middlemen.

Reading Jaron Lanier's book "Who Owns the Future?" cemented my thinking that this model was broken. Then in 2017, I discovered Ethereum and its programmable nature. I participated in a hackathon sponsored by Consensys, addressing the opioid epidemic. Our team won second place for a project tokenizing painkiller prescriptions on the blockchain.

The idea was to represent prescriptions as entries in a smart contract, allowing pharmacists to verify the authenticity of the prescription and the doctor's signature. This would prevent the abuse of paper prescriptions across state lines, which was a significant problem, especially in areas like D.C., Maryland, and Virginia.

This experience led me to explore blockchain in healthcare, which is how I discovered self-sovereign identity (SSI). I started working with Hyperledger Indy and eventually joined Lumedic, a pioneering SSI company. We worked on issuing COVID vaccine credentials during the pandemic.

After Lumedic was acquired, I continued with SSI, joining DIF as an independent contributor. I worked on the secure storage group, where I learned everything about decentralized web nodes, and I think they’re amazing. All these experiences ultimately led me to my current role at Dentity.

You were a winner in the DID:Hack contest and TBD’s Hackathon. Could you share a key insight from that experience, and perhaps offer advice for participants in DIF's upcoming hackathon?

For the DID:Hack contest, I focused on exploring Decentralized Web Nodes (DWNs), which were in their early stages. I saw the hackathon as an opportunity to prove whether this technology actually worked in a controlled environment. I learned that DWNs were more capable than many realized - they were pretty darn good, and almost ready for production. Winning the contest was exciting, but the real value was in the learning experience.

My advice for future hackathon participants:

Go wild with your ideas! There's no fear of failure in a hackathon. Worst case is you learn a lot. So push yourself beyond your comfort zone.

Treating deadlines as mission-critical helps maintain discipline and simulate real-life situations.

Have fun! Hackathons are designed to be enjoyable learning experiences.

Leverage the resources provided by the organizers – they're there to help you succeed.

What brought you to Dentity, and how does your role there align with your vision for decentralized identity?

I met Dentity through TBD contacts. What drew me was their approach to solving real-world problems, particularly in identity verification. During my time at Lumedic, I realized the critical need for identity verification services, especially in healthcare where compliance with regulations like HIPAA is crucial.

Throughout my SSI career, I consistently encountered the challenge of ensuring that the person we're dealing with is who they claim to be. Dentity offers a valuable service that addresses this problem directly, so joining was a no-brainer.

What I love about Dentity is the energy and pragmatism in bringing real solutions to market. Earlier in my SSI career, I was more dogmatic about fully decentralized technologies. Now, I've become more pragmatic, recognizing the need to balance ideals with practical implementation.

This perspective aligns perfectly with Dentity's approach. We focus on solving immediate problems while abstracting away the nuances and complexities of SSI and decentralized technologies. This strategy is especially effective when working with traditional Web2.0 businesses.

The Dentity-ENS Labs partnership aims to bring real-world identities on chain. Could you explain the significance of this for both everyday users and the broader decentralized ecosystem?

ENS offers an interesting service in the Web3 world. The Web3 community values privacy and anonymity, so a public profile that broadcasts information feels like it's against that ethos. At the same time, establishing trust in transactions is essential. By integrating Dentity's identity verification with ENS profiles, we're giving users a higher degree of certainty about who they're transacting with.

I’ll give you a practical example. There was a major crypto scam where hackers created a deepfake video of Elon Musk promising to send crypto to anyone who sent ether to an address. That scam cost people millions of dollars in crypto.

On the other hand, if users had the ability to check addresses against a verified identity, they would have known it wasn't really Elon's address. In general, the idea is that this could reduce the number of similar scams in the future.

How does the Dentity-ENS integration empower users to control their digital identity? Could you walk us through a practical scenario that illustrates this?

Users need to own an ENS profile and authenticate with their wallet against ENS. They're then given the choice to verify their social accounts. A button routes them to our Dentity profile, where we've integrated several workflows.

One optional workflow is our full IDV process, which issues a proof of personhood credential to their public ENS profile. Currently, we only support full KYC, but we're planning to enable a biometric check without government documents in the future. This aligns with the Web3 ethos, as many users distrust sharing official documents.

Once the workflow is complete, the user chooses how much information to share with ENS. This creates a "long-lived token" that resides on-chain. The user pays for gas, and ENS has modified their code to leverage this token. When someone views the ENS profile, it invokes our API to retrieve only the information the user consented to share publicly.

Dentity utilizes self-sovereign identity standards like W3C Verifiable Credentials and DIDs. For our more technical readers, can you offer us a deep-dive into how the ENS integration works? And more generally, what tangible benefits does this offer to end-users?

Dentity adopts standards from W3C, DIF, and OpenID Foundation, including DIDs, verifiable credentials, and OpenID Connect for verifiable presentations. We approached ENS integration similarly to a typical relying party or verifier, modifying our protocol to enable integration.

We couldn't store the verifiable presentation on-chain due to PII concerns and gas prices. Instead, we store a small string on-chain as a reference. This approach offers the best of both worlds: users can disclose PII if they choose, while we guard that information from being stored directly on-chain.
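
Here is a hedged sketch of what this read path could look like from a viewer's perspective, using ethers.js to fetch the on-chain reference and exchange it for the consented profile data. The text-record key, the API endpoint, and the RPC URL are hypothetical placeholders, not Dentity's actual integration details.

import { ethers } from "ethers";

async function getVerifiedProfile(ensName: string) {
  const provider = new ethers.JsonRpcProvider("https://rpc.example.org"); // any mainnet RPC endpoint
  const resolver = await provider.getResolver(ensName);
  if (!resolver) throw new Error(`No ENS resolver found for ${ensName}`);

  // The on-chain value is only a small reference string, never the PII itself.
  const token = await resolver.getText("dentity.verification"); // hypothetical record key
  if (!token) return null; // this name carries no verification reference

  // Exchange the reference for the claims the user consented to share publicly.
  const res = await fetch(`https://api.example.org/ens-profile/${token}`); // hypothetical endpoint
  return res.json();
}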

The benefits to end users include enhanced trust and counterparty verification. For example, in transactions through ENS, we enable trust because users can verify who they're dealing with. It provides full consent and control over shared information, essentially creating counterparty trust.

This partnership has potential to expand blockchain use cases beyond cryptocurrency. What are some compelling non-financial applications of decentralized identity that you envision this enabling?

While DeFi is obviously a major beneficiary, we're seeing interesting applications emerge in other areas:

DAO Governance: There's growing interest in using this technology for voting in Decentralized Autonomous Organizations. It can help ensure they're not dealing with Sybil attacks or bots trying to manipulate DAO governance.

Content Authenticity and Provenance: This could be leveraged for verifying the authenticity and origin of content, particularly on social media platforms. If you're distributing content on Twitter, whether it's politically charged or aimed at combating scams, you can check for the provenance of these tweets. This helps ensure the content really belongs to a person and not a disinformation bot trying to skew elections or change opinions.

In your view, what are the most significant challenges in achieving widespread adoption of decentralized identity solutions, and how is Dentity positioning itself to address these?

To paraphrase Jeffrey [Schwartz, CEO of Dentity], we have the technology, sound protocols, and emerging trust registries. The challenge now is being pragmatic and choosing the right problems to solve. Not everything is solvable through SSI, so we must carefully assess whether this technology fits a particular problem.

Dentity addresses this by targeting traditional Web2.0 businesses, which hold the largest market share. We use familiar technologies like OpenID Connect for verifiable presentations, making it easier for these businesses to integrate our solutions. We abstract SSI's complexity to focus on practical solutions to real-world problems.

In my previous SSI experiences, convincing enterprises to adopt was difficult because it required understanding new protocols, mediators, and agents. It wasn't practical to address their problems with these technologies then. At Dentity, we've scaled back, allowing for gradual adoption. We don't lead with "we do SSI," but rather with solving problems without the SSI complexity.

Looking ahead, what emerging trends or technologies in decentralized identity do you find most promising, and why?

While there are exciting technological developments like proof of personhood, fully homomorphic encryption, and zero-knowledge proofs, I'm most excited about the regulatory landscape. Regulators are increasingly forcing companies to comply with data privacy acts, which is driving innovation in the SSI space.

For example, the European eIDAS (electronic IDentification, Authentication and trust Services) regulation is spurring competition among SSI companies to offer the best solutions. In the U.S., although moving at a slower pace, traditional big data companies are beginning to realize that the massive amounts of data they hold are a liability. This realization is pushing them to explore what the SSI community has to offer.

These regulatory pressures are the engine that's going to force us to create standards and companies to comply with those standards. It's about more than just technology - whether you love it or hate it, we need regulations to drive adoption and interoperability in the decentralized identity space.


Identity At The Center - Podcast

It’s time for another episode of the Identity at the Center podcast


It’s time for another episode of the Identity at the Center podcast! This week, Jim McDonald sat down with Deneen DeFiore, Chief Information Security Officer at United Airlines. They discuss her career journey in identity and access management, with a focus on her past experiences at General Electric and her current initiatives at United Airlines. The conversation emphasizes the importance of integrating customer identity with enhanced trust and personalized experiences, the significance of team building and professional growth, and balancing business and technical expertise in leadership.

Watch the episode here: https://www.youtube.com/watch?v=KVmYRDv7jHM

#iam #podcast #idac

Saturday, 07. September 2024

Oasis Open Projects

Invitation to comment on OpenC2 Actuator Profile for Packet Filtering Version 1.0


Public review ends October 8th


OASIS and the OpenC2 TC are pleased to announce that OpenC2 Actuator Profile for Packet Filtering Version 1.0 is now available for public review and comment. 


OpenC2 is a concise and extensible language to enable machine-to-machine communications for purposes of command and control of cyber defense components, subsystems, and systems in a manner that is agnostic of the underlying products, technologies, transport mechanisms, or other aspects of the implementation. This specification defines an Actuator profile for Packet Filtering (PF). Packet filtering is a cyber defense mechanism that denies or allows traffic based on static or dynamic properties. The Actuator profile collects Actions, Targets, Arguments, and Specifiers along with conformance clauses to enable the operation of OpenC2 Producers and Consumers in the context of PF.

The documents and all related files are available here:

OpenC2 Actuator Profile for Packet Filtering Version 1.0
Committee Specification Draft 02
08 August 2024

Editable Source (Authoritative):
https://docs.oasis-open.org/openc2/ap-pf/v1.0/csd02/ap-pf-v1.0-csd02.md
HTML:
https://docs.oasis-open.org/openc2/ap-pf/v1.0/csd02/ap-pf-v1.0-csd02.html
PDF:
https://docs.oasis-open.org/openc2/ap-pf/v1.0/csd02/ap-pf-v1.0-csd02.pdf

Schemas: https://docs.oasis-open.org/openc2/ap-pf/v1.0/csd02/Schemas/

For your convenience, OASIS provides a complete package of the specification document and any related files in a ZIP distribution file, which you can download at: https://docs.oasis-open.org/openc2/ap-pf/v1.0/csd02/ap-pf-v1.0-csd02.zip
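
For readers new to OpenC2, here is a loose illustration of the kind of command a Producer might send to a packet-filtering Consumer, expressed as a TypeScript object. Top-level fields follow the OpenC2 Language Specification; the exact PF-profile targets, arguments, and specifiers are defined in the draft linked above, so treat the details here as assumptions.

// Deny traffic from a given IPv4 network.
const denyCommand = {
  action: "deny",
  target: {
    ipv4_net: "198.51.100.0/24", // documentation-range source network to block
  },
  args: {
    response_requested: "ack", // ask the Consumer to acknowledge receipt
    duration: 3600000,         // OpenC2 durations are in milliseconds (1 hour)
  },
};

console.log(JSON.stringify(denyCommand, null, 2));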

How to Provide Feedback
OASIS and the OpenC2 TC value your feedback. We solicit input from developers, users and others, whether OASIS members or not, for the sake of improving the interoperability and quality of its technical work.


The public review starts September 7, 2024 at 00:00 UTC and ends October 8, 2024 at 23:59 UTC.


Comments from TC members should be sent directly to the TC’s mailing list. Comments may be submitted to the project by any other person through the use of the project’s Comment Facility: https://groups.oasis-open.org/communities/community-home?CommunityKey=9ae0f0f9-24b5-44ea-9fe7-018dce260e09


Comments submitted for this work by non-members are publicly archived and can be viewed by using the link above and clicking the “Discussions” tab.
Please note, you must log in or create a free account to see the material. Please contact the TC Administrator (tc-admin@oasis-open.org) if you have any questions regarding how to submit a comment.


All comments submitted to OASIS are subject to the OASIS Feedback License, which ensures that the feedback you provide carries at least the same obligations as those of the TC members. In connection with this public review, we call your attention to the OASIS IPR Policy [1], especially [2], applicable to the work of this technical committee. All members of the TC should be familiar with this document, which may create obligations regarding the disclosure and availability of a member’s patent, copyright, trademark and license rights that read on an approved OASIS specification.
OASIS invites any persons who know of any such claims to disclose these if they may be essential to the implementation of the above specification, so that notice of them may be posted to the notice page for this TC’s work.


Additional information about the specification and the OpenC2 TC can be found at the TC’s public home page: https://www.oasis-open.org/committees/openc2/


Additional references:
[1] https://www.oasis-open.org/policies-guidelines/ipr/
[2] http://www.oasis-open.org/committees/openc2/ipr.php
https://www.oasis-open.org/policies-guidelines/ipr#Non-Assertion-Mode

The post Invitation to comment on OpenC2 Actuator Profile for Packet Filtering Version 1.0 appeared first on OASIS Open.

Friday, 06. September 2024

Velocity Network

The SLAP – Requirements for Practical Verifiable Credentials   

Issuer permissions are the mechanism that Velocity Network introduces to enable relying parties (and wallets) to determine if an issuer is an authoritative source for a particular credential. After an organization requests the ability to issue on the Network, the request is reviewed by Velocity Network to ensure that the issuing service parameters are within the remit of the organization’s business activities.

Thursday, 05. September 2024

Identity At The Center - Podcast

We’ve got another episode of The Identity at the Center podcast


We’ve got another episode of The Identity at the Center podcast for you this week! This one is sponsored by Zilla Security. We spoke with Nitin Sonawane of Zilla Security about disrupting the identity security and governance space with innovative solutions such as Zilla Universal Sync (ZUS) and how AI and ML can streamline and enhance access reviews and compliance.

Watch the episode here: https://www.youtube.com/watch?v=QLiSUgYyZwU

You can learn more about Zilla Security at https://zillasecurity.com/

#iam #podcast #idac


GS1

Maintenance release 2.11


The GS1 GDM SMG voted to implement the 2.11 standard into production in August 2024.

Key Milestones:

See GS1 GDM Release Schedule

As content for this release is developed, it will be posted to this webpage, followed by an announcement to the community to ensure visibility.
GDSN Data Pools should contact the GS1 GDSN Data Pool Helpdesk to understand the plan for the update. Trading Partners should work with their Data Pools (if using GDSN) and/or Member Organisations on understanding the release and any impacts to business processes.

GDM 2.11 contains updated reference material aligned with ADB 2.5 and GDSN 3.1.28.

 

Updated For Maintenance Release 2.11

GDM Standard 2.11 (Aug 2024)

Local Layers For Maintenance Release 2.11

China - GSMP RATIFIED (April 2022)

France - GSMP RATIFIED (November 2023)

Germany - GSMP RATIFIED (November 2023)

Poland - GSMP RATIFIED (November 2023)

Romania - GSMP RATIFIED (December 2021)

USA - GSMP RATIFIED (February 2023)

Finland - GSMP RATIFIED (November 2023)

Netherlands - GSMP RATIFIED (May 2024)

Italy - GSMP RATIFIED (May 2024)

 

Release Guidance

GDM Market Stages Guideline (June 2023)

GDM Attribute Implementation Guideline (February 2024)

GPC Bricks to GDM (Sub-) Category Mapping for GDM 2.10 and 2.11 (April 2024)

Attribute Definitions for Business (Aug 2024)

GDM (Sub-) Categories (October 2021)

GDM Regions and Countries (17 December 2021)

GDSN Release 3.1.28 (Aug 2024)

Tools

GDM Navigator on the Web 

GS1 GDM Attribute Analysis Tool (May 2024)

GDM Local Layer Submission Template (May 2024)

Training

E-Learning Course

Future Release Documentation

GPC Bricks to GDM (Sub-) Category Mapping for GDM 2.12 (Aug 2024)

Any questions

We can help you get started using the GS1 standards

Contact your local office

Monday, 02. September 2024

Identity At The Center - Podcast

We had a great conversation with Andrew Shikiar from the FIDO Alliance


We had a great conversation with Andrew Shikiar from the FIDO Alliance on the latest episode of the Identity at the Center podcast. We dove into the world of authentication, covering everything from different use cases to the importance of passkeys and regional adoption trends. We also got the inside scoop on Authenticate 2024, a can't-miss event for anyone in the identity space. Andrew also announces a new discount code for the FIDO Alliance Shop, just for IDAC fans, so you can get yourself some swag!

Check out the episode and let us know what you think:

https://youtu.be/quY-pEDa_5Y?si=RN9wfYnJBD1kc98e

Don’t forget about our discounts!

Authenticate Conference - Use code **IDAC15** for 15% off: https://authenticatecon.com/event/authenticate-2024-conference/

FIDO Alliance Shop - https://shop.fidoalliance.org/ - Use code **IDAC10** for a discount on your purchase!

#iam #podcast #idac


MyData

Effective Data Solidarity requires symmetry in human digital agency and a social license

We believe that three pillars are required in Healthcare to support a FAIR data economy: Data Quality, Data Interoperability and Data Solidarity. Two previous blogs describe the first two pillars. […]

Friday, 30. August 2024

FIDO Alliance

Bias in Biometrics: How Organizations Can Launch Remote Identity Verification Confidently


Most of us today are accustomed to unlocking our smartphones with a simple glance or touch. In the blink of the tech industry’s eye, biometric authentication has quickly become a normal part of our daily lives.

Consumers love the convenience and security of biometrics, which has helped propel its growth and mainstream adoption. In the FIDO Alliance’s last global barometer survey, biometrics ranked top as the most secure and the preferred way to log in by consumers.

But for biometrics to continue its success, there is a reputation issue and ‘elephant in the room’ that is holding back consumers, governments, and other implementers alike from full trust and confidence: bias.

Are biometric technologies biased?

Concerns have been circulating for some time about the accuracy of biometric systems in processing diverse demographics. In the UK in 2021, for example, Uber drivers from diverse ethnic backgrounds took legal action over claims that the company's software had illegally terminated their contracts because it was unable to recognize them.

In the FIDO Alliance’s recent study, Remote ID Verification – Bringing Confidence to Biometric Systems Consumer Insights 2024, consumers made clear that they are concerned about bias in biometric facial verification systems.

While over half of respondents indicated they believe face biometrics can accurately identify individuals (56%), others in the survey report a different experience. 

A quarter of respondents felt they had been discriminated against by biometric face verification systems (25%).

Organizations like NIST have been closely monitoring the disparities in bias performance for some time – with NIST’s most recent evaluation of solutions across different demographics released this year. The headline is: Not all biometric systems are created equal.

As face verification has become adopted globally, the accuracy in identifying diverse demographics has gone from weakness to strength, with most leading solutions today operating with extremely small margins of error. However, less sophisticated solutions do exist and are perpetuating a far bigger reputational and adoption challenge.

Inclusivity and accessibility in remote identity

Inclusivity is just one part of the problem. Bias impacts the entire user experience and erodes faith in the technology overall. Half of American and British consumers in the survey said they would lose trust in a brand or institution if it were found to have a biased biometric system, and 22% would stop using the service entirely.

Remote identity solutions unlock huge benefits for governments, organizations, and consumers alike. Consider how many more scenarios there are today where we are asked to prove who we are virtually – starting a new job, opening a bank account, signing legal documents. And, as outlined earlier, we know consumers already love using biometrics – 48% of those we surveyed preferred biometrics to enroll and verify themselves remotely.

However, the excitement of more remote identity solutions is understandably mixed with these bias concerns, causing some organizations to delay or reconsider implementation. We’re in an age where digital inclusivity is highly scrutinized, especially for public services, and governments are increasingly calling for a way to demonstrate equity.

Equitable biometrics systems are both a practical and a moral imperative. So how do we get there? 

Addressing bias in biometric systems

The FIDO Alliance has launched its Face Verification Certification program, with mitigating bias as a key priority. It assesses a face verification system’s performance across different demographics, including skin tone, age, and gender, in addition to far more wide-reaching security and performance tests.

Why is independent certification for biometrics important?

Currently, testing levels are completed on a case-by-case basis, per organization. This means it’s expensive and time-consuming, and what ‘good’ looks like varies widely. The FIDO Alliance’s program is based on proven ISO standards and has been developed by a diverse, international panel of industry, government, and subject matter experts. This means it is unrivaled in its ability to set equitable performance benchmarks.

More broadly, certification and independent global testing catalyze innovation and technological adoption. Whether launching an identity verification solution or including it in related regulations, open standards and certification set a clear performance benchmark. It removes considerable duplicated efforts, improves the confidence of all stakeholders, and ultimately drives up the performance of all solutions on the market.

How is bias evaluated?

At this time, the FIDO Alliance program evaluates bias using false reject rate (FRR) methodology, measured at the transaction level across skin tone, age, and gender. ISO 19795-10 offers multiple options for measuring differential performance. One option is described in Section 7.4.2, “Reporting differential performance against a benchmark.” In this approach, testers seek to compare the performance of one or more demographic groups to a specific benchmark. FIDO has chosen this approach given the small sample size of the individual groups (50+ per group). For skin tone, groups are defined and distributed across three brackets based on the Monk Scale. For gender, groups are defined and distributed across male, female, and other. For age, groups are defined and evenly distributed across four age brackets.

The benchmarks are set at 6% (95% confidence interval), based on bootstrapping simulations. These simulations covered a spectrum of scenarios, population sizes, and correlations between attempts. The benchmark chosen reduces the probability that a group will be considered different when it actually is not, i.e., finding a difference by chance (<5%).
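
To illustrate the benchmark comparison described above, here is a simplified TypeScript sketch that bootstraps a group's false reject rate and compares an upper quantile against the 6% benchmark. The resampling scheme and the sample data are illustrative assumptions; FIDO's actual simulations also model correlation between a subject's attempts.

// Bootstrap a demographic group's FRR and return an upper quantile estimate.
function bootstrapFrrQuantile(outcomes: number[], iterations = 10_000, quantile = 0.95): number {
  const n = outcomes.length;
  const estimates: number[] = [];
  for (let i = 0; i < iterations; i++) {
    let rejects = 0;
    for (let j = 0; j < n; j++) {
      rejects += outcomes[Math.floor(Math.random() * n)]; // resample with replacement
    }
    estimates.push(rejects / n);
  }
  estimates.sort((a, b) => a - b);
  return estimates[Math.floor(quantile * (iterations - 1))];
}

// Example: one group of 50 transactions with 2 false rejects (1 = false reject).
const groupOutcomes = [...Array(48).fill(0), 1, 1];
const BENCHMARK = 0.06;
console.log(
  bootstrapFrrQuantile(groupOutcomes) <= BENCHMARK
    ? "group within benchmark"
    : "group exceeds benchmark",
);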

What is the value of certification for Biometric Vendors?

Independent validation of biometric performance

Opportunity to understand gaps in product performance to then improve and align with market demands

Demonstrate product performance to potential customers

Improve market adoption by holding an industry-trusted certification

Leverage one certification for many customers/relying parties

Benefit from FIDO delta and derivative certifications for minor updates and extendability to vendor customers

Reduce need to repeatedly participate in vendor bake-offs

What is the value of certification for Relying Parties?

One-of-a-kind, independent, third-party validation of biometric performance assessing accuracy, fairness and robustness against spoofing attacks

Provides a consistent, independent comparison of vendor products – eliminating the burden of maintaining own program for evaluating biometric products

Accelerates FIDO adoption to password-less

Commitment to ensure quality products for customers of the relying parties

Requirements developed by a diverse, international group of stakeholders from industry, government, and subject matter experts

Conforms to ISO FIDO Annex published in ISO standards

What is the value of certification with FIDO accredited laboratories?

FIDO Accredited Laboratories are available worldwide and follow a common set of requirements and rigorous evaluation processes, defined by the FIDO Alliance Biometrics Working Group (BWG) and follow all relevant ISO standards. These laboratories are audited and trained by the FIDO Biometric Secretariat to ensure lab testing methodologies are compliant and utilize governance mechanisms per FIDO requirements. Laboratories perform biometric evaluations in alignment with audited FIDO accreditation processes. In contrast, bespoke, single laboratory biometric evaluations may not garner sufficient trust from relying parties for authentication and remote identity verification use cases.

What are the other ISO Standards that FIDO certification conforms to?

In addition to ISO/IEC 19795-10, vendors and their accredited labs adhere to the following ISO standards:

Terminology
- ISO/IEC 2382-37:2022 Information technology — Vocabulary — Part 37: Biometrics

Presentation Attack Detection
- ISO/IEC 30107-3:2023 Information technology — Biometric presentation attack detection — Part 3: Testing and reporting
- ISO/IEC 30107-4:2020 Information technology — Biometric presentation attack detection — Part 4: Profile for testing of mobile devices
- FIDO Annex, published 2024

Performance (e.g., FRR, FAR)
- ISO/IEC 19795-1:2021 Information technology — Biometric performance testing and reporting — Part 1: Principles and framework
- ISO/IEC 19795-9:2019 Information technology — Biometric performance testing and reporting — Part 9: Testing on mobile devices
- FIDO Annex, published 2019

Bias (differentials due to demographics)
- ISO/IEC 19795-10:2024 Information technology — Biometric performance testing and reporting — Part 10: Quantifying biometric system performance variation across demographic groups
- FIDO Annex, under development

Laboratory
- ISO/IEC 17025:2017 General requirements for the competence of testing and calibration laboratories

Enhancing Confidence in the Biometrics of Identity Verification

The FIDO Alliance continues to champion the cause of combating bias and enhancing security measures in remote biometric identity verification technologies through its Identity Verification and Biometric Component certifications. The FIDO Certification Programs offer reliability, security, and standardization to certify biometric solutions for remote identity verification, and have specifically set benchmarks for face verification technologies to test for bias.

In addition to the Face Verification program, the FIDO Alliance emphasizes the importance of rigorous testing and certification processes in ensuring that identity verification solutions are trustworthy and secure, including the Document Authenticity (DocAuth) Certification. These programs offer solution providers the opportunity to differentiate themselves in the market by leveraging FIDO’s independent, accredited test laboratories and industry-recognized brand.

Learn More about FIDO Biometric Certifications

As digital identity verification landscapes evolve, the demand for independently verified and unbiased biometric systems becomes increasingly vital. The introduction of the FIDO Alliance’s Face Verification Certification Program reinforces the commitment of solution providers to proactively address trust, security, and inclusivity in biometric identity verification technologies.

To learn more, download the in-depth consumer research on remote ID verification here, and discover the certified providers backed by FIDO certification to stay ahead with secure and trustworthy biometric identity verification technologies.


We Are Open co-op

An Introduction to Systems Thinking

Part 2: Understanding Feedback Loops

This is the second post in a series exploring the fundamentals of Systems Thinking. This is an approach that helps us make sense of complex situations by considering the whole system rather than just its individual parts.

This series is made up of:

- Part 1: Three Key Principles
- Part 2: Understanding Feedback Loops (this post)
- Part 3: Identifying Leverage Points

In the first post, we explored the foundational principles of Systems Thinking: Drawing a Boundary, Multiple Perspectives, and Holistic Thinking. Now, we’ll build on that foundation by examining Feedback Loops, another crucial concept in Systems Thinking.

Feedback loops are essential for understanding how systems behave, adapt, and evolve over time. By mastering the dynamics of feedback loops, you can identify how small changes can ripple through a system, leading to significant outcomes, whether intended or unintended.

1. What are feedback loops?

“You think that because you understand ‘one’ that you must therefore understand ‘two’ because one and one makes two. But you forget that you must also understand ‘and.’” — Donella Meadows

Feedback loops are the mechanisms by which systems regulate themselves. They can either reinforce a particular behaviour (positive feedback) or counteract it to maintain stability (negative feedback). Understanding these loops helps us see how systems respond to internal and external changes, leading to growth, adaptation, or stability.

For instance, a classic example of a negative feedback loop is a thermostat regulating room temperature. When the temperature rises above the set point, the thermostat turns off the heating. When the temperature drops below the set point, the thermostat turns the heating back on. This loop maintains the room’s temperature within a desired range, counteracting any deviations. The thermostat example clearly demonstrates how negative feedback loops work to maintain stability within a system, bringing it back to a set equilibrium when it deviates.
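As a quick illustration of the balancing dynamic described above, here is a toy simulation of the thermostat loop. The set point, heating and cooling rates, and time steps are arbitrary numbers chosen for the sketch.

```typescript
// Toy thermostat: a balancing (negative) feedback loop resisting deviation.
const setPoint = 20;   // target temperature, °C (arbitrary)
let temperature = 15;  // starting temperature, °C
let heatingOn = false;

for (let minute = 0; minute < 60; minute++) {
  heatingOn = temperature < setPoint;    // the feedback decision
  temperature += heatingOn ? 0.5 : -0.3; // heating warms the room, losses cool it
}
// The temperature climbs to the set point, then oscillates in a narrow band
// around it: the loop counteracts deviations in either direction.
```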

On the other hand, social media platforms often rely on positive feedback loops to increase user engagement. When a user interacts with content, the platform’s algorithm shows more similar content, increasing the likelihood of further engagement. This creates a reinforcing cycle where user behaviour drives more of the same behaviour, leading to increased overall engagement. This example illustrates how positive feedback loops can amplify behaviours, sometimes leading to rapid growth or creating echo chambers where only certain types of content are repeatedly reinforced.

Feedback loops are crucial for identifying how systems behave over time, which ties back to the principles we discussed in the first post, particularly the importance of considering Holistic Thinking.

2. Negative feedback loops

“The major problems in the world are the result of the difference between how nature works and the way people think.” — Gregory Bateson

Negative feedback loops work to stabilise a system by counteracting changes. These loops are vital for maintaining equilibrium and preventing a system from spiralling out of control.

For example, in economics, the supply and demand mechanism operates as a negative feedback loop. When the supply of a product exceeds demand, prices tend to fall, which in turn reduces supply as producers cut back on production. Conversely, when demand exceeds supply, prices rise, encouraging increased production. This self-regulating process helps to stabilise markets by bringing supply and demand into equilibrium, preventing extreme fluctuations in prices and availability.

It’s important to understand that in the context of Systems Thinking, “negative” doesn’t imply something bad or undesirable. Instead, it refers to the type of feedback that counteracts changes, helping to stabilise a system and maintain equilibrium. Negative feedback loops are essential for preventing systems from spiralling out of control, ensuring they remain balanced and functional.

3. Positive feedback loops

“The only thing harder than starting something new is stopping something old.” — Russell Ackoff

Positive feedback loops amplify changes in a system, often leading to exponential growth or decline. These loops are self-reinforcing, meaning that as a particular change occurs, it triggers more of the same change, leading to potentially dramatic effects.

For example, in the context of climate change, positive feedback loops play a significant role in accelerating global warming. One such loop involves the melting of polar ice. As the ice melts, less sunlight is reflected back into space, and more heat is absorbed by the Earth’s surface, which in turn leads to further ice melt. This self-reinforcing loop accelerates the warming of the planet, contributing to increasingly severe climate impacts.
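The amplifying character of such a loop is easy to see in a toy model. The sketch below is not a climate model; the coefficients are invented solely to show how each pass through the loop feeds the next.

```typescript
// Toy ice-albedo loop: a reinforcing (positive) feedback amplifying change.
let iceCover = 1.0; // fraction of surface covered by reflective ice
let warming = 0.01; // per-step warming (made-up units)

for (let step = 0; step < 50; step++) {
  iceCover = Math.max(0, iceCover - warming); // warming melts ice
  const absorbed = 1 - iceCover;              // less ice reflects less sunlight
  warming = 0.01 * (1 + absorbed);            // more absorbed heat speeds warming
}
// Each step's warming increases the next step's warming until the ice is gone:
// the loop amplifies rather than dampens change.
```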

Again, it’s important to note that a “positive” feedback loop doesn’t necessarily have morally beneficial consequences — rather, it refers to the amplifying nature of the loop itself, which can drive both beneficial and harmful changes.

4. The interaction of Feedback Loops in Systems

“The purpose of a system is what it does.” — Stafford Beer

In most systems, positive and negative feedback loops do not operate in isolation; they interact with each other, creating complex and dynamic behaviours. This interaction can make systems resilient, allowing them to adapt to changes and maintain stability, or it can make them vulnerable, leading to oscillations, chaotic behaviour, or even collapse if not properly managed. Understanding how these loops interplay is crucial for effective system management and intervention.

In product management, the balance between feature usage and product complexity is shaped by the interaction of feedback loops. When a feature is widely used and appreciated, customer satisfaction increases, leading the product team to enhance or expand the feature (positive feedback). As the feature becomes more complex, it may introduce usability issues, decreasing customer satisfaction (negative feedback). This interaction helps maintain a balance within the product, ensuring it continues to meet customer needs without becoming overly complicated.

A similar interplay of feedback loops occurs when managing product adjustments. Increasing customer satisfaction through positive feedback can prompt further enhancements, which might initially appear beneficial. Yet, if these enhancements lead to excessive complexity, they may cause usability issues, frustrating users and resulting in negative feedback. This process highlights the delicate balance product managers must maintain to keep their product both innovative and user-friendly, illustrating the challenges of managing feedback loops in product development.
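One way to build intuition for interacting loops is to couple a reinforcing and a balancing loop in a toy model, as in the product-management example above. Every coefficient here is invented for illustration; the point is the shape of the behaviour, not the numbers.

```typescript
// Toy coupling of a reinforcing loop (satisfaction drives feature work)
// and a balancing loop (complexity erodes satisfaction).
let satisfaction = 0.5; // 0..1, customer satisfaction
let complexity = 0.1;   // 0..1, accumulated product complexity

for (let quarter = 0; quarter < 20; quarter++) {
  const featureWork = 0.2 * satisfaction; // reinforcing: success funds features
  complexity = Math.min(1, complexity + 0.5 * featureWork);
  satisfaction = Math.max(
    0,
    Math.min(1, satisfaction + featureWork - 0.3 * complexity), // balancing drag
  );
}
// Satisfaction rises at first on the reinforcing loop, then accumulated
// complexity engages the balancing loop and pulls it back down.
```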

By recognising and analysing these interactions, systems thinkers can identify points of leverage where interventions can either reinforce desirable behaviours or counteract negative trends, leading to more stable and resilient systems.

5. Recognising feedback loops in your work

“Taking action to improve a problematical situation will of course itself change that situation, so that the learning cycle could in principle begin again.” — Peter Checkland

Identifying and understanding feedback loops in your work or organisational context can be a game-changer. These loops are often the invisible threads that weave through processes, projects, and relationships, dictating outcomes and driving behaviours. By recognising these loops, you can gain a deeper understanding of the forces at play and make more informed decisions that lead to sustainable success.

Get started identifying feedback loops:

- Think of an area in your work where specific actions lead to similar outcomes
- Identify the underlying feedback mechanism — is it positive or negative, or are multiple loops at play?
- Make adjustments to one part of the system and observe how the system changes

Here’s an example:

- When our team meets a deadline, team morale improves, and I've noticed that we often meet subsequent deadlines with the same enthusiasm
- This is a positive feedback loop — success breeds further success
- Figure out what works, and further incentivise it

Conversely:

- When our team misses a deadline, we tend to become stressed, leading to further delays
- This, too, is a reinforcing (positive) feedback loop, just one with harmful effects: the more stressed we are, the more deadlines we miss
- We need a win: let's deprioritise X and focus on Y because it's an easier task, then come back to X when team morale has improved

Recognising these loops allows you to take proactive steps. If you identify a positive feedback loop that is driving beneficial outcomes, consider how you might reinforce or expand this loop. Perhaps you could introduce additional incentives or recognition programs to further enhance the team’s motivation and productivity. On the other hand, if you uncover a negative feedback loop that is leading to undesirable results, it’s crucial to intervene early. For example, introducing stress management resources or adjusting timelines could help break the cycle and restore balance.

Feedback loops are not limited to project management; they are pervasive in all areas of work, from customer service to organisational culture. In customer service, a positive feedback loop might occur when excellent service leads to customer satisfaction, which in turn results in positive reviews and repeat business. Recognising this loop could lead to further investments in training and support for customer service teams. In organisational culture, a feedback loop could exist where open communication leads to trust, which fosters more open communication, creating a virtuous cycle of collaboration and innovation.

By consciously identifying and managing these loops, you can transform potential challenges into opportunities for growth and continuous improvement. This approach not only helps in achieving immediate goals but also contributes to the long-term resilience and adaptability of your organisation.

Conclusion

Feedback loops are a fundamental aspect of Systems Thinking, providing crucial insights into how systems evolve, adapt, and maintain stability. By identifying and analysing these loops, we gain the ability to predict the outcomes of our actions more accurately and craft interventions that are not only effective but sustainable.

At We Are Open Co-op, we specialise in applying Systems Thinking principles, including the detailed analysis of feedback loops, to help organisations navigate and resolve complex challenges. Whether you’re looking to enhance organisational processes, improve project outcomes, or foster better relationships within your team, understanding feedback loops can be transformative.

If you’re curious about how feedback loops influence your work or how Systems Thinking can be integrated into your approach, we’re here to help. Reach out to us to explore the possibilities. And stay tuned for the final post in this series, where we’ll dive into identifying and leveraging key points within systems to drive meaningful and lasting change.

References

- Ackoff, R.L. (1974). Redesigning the Future: A Systems Approach to Societal Problems. New York: Wiley.
- Bateson, G. (1972). Steps to an Ecology of Mind. San Francisco: Chandler Publishing Company.
- Beer, S. (1972). Brain of the Firm. New York: Herder and Herder.
- Checkland, P. (1981). Systems Thinking, Systems Practice. Chichester: John Wiley & Sons.
- Meadows, D.H. (2008). Thinking in Systems: A Primer. White River Junction: Chelsea Green Publishing.
- Schön, D.A. (1983). The Reflective Practitioner: How Professionals Think in Action. New York: Basic Books.
- Vickers, G. (1965). The Art of Judgement: A Study of Policy Making. London: Chapman & Hall.

An Introduction to Systems Thinking was originally published in We Are Open Co-op on Medium, where people are continuing the conversation by highlighting and responding to this story.


Origin Trail

Growing the Buz Economy: Announcing the Social Intelligence Paranet Launch


By LunarCrush and OriginTrail

In the rapidly evolving world of tech and finance, the demand for innovation and adaptability is higher than ever, driven by a quest for transparency for internet users. LunarCrush has been at the forefront of Social Intelligence, converting human-driven insights into actionable information for both retail and institutional stakeholders. Originally focusing on the crypto industry, LunarCrush’s Social Intelligence now extends across diverse sectors such as technology, politics, travel, music, and more. Recognizing the convergence of crypto, the Internet, and Artificial Intelligence (AI), LunarCrush is making a significant leap forward in their transparency efforts through social intelligence. By launching the Social Intelligence Paranet on the OriginTrail Decentralized Knowledge Graph (DKG), LunarCrush aims to enhance content collection through incentivized crowdsourcing and enable the creation of AI-powered services on this trusted knowledge base.

The Decentralized Knowledge Graph and the Social Intelligence Paranet

The Social Intelligence Paranet will operate on the OriginTrail DKG, a permissionless peer-to-peer network that ensures all social content published to the Paranet is discoverable, verifiable, and attributed to its owners. This setup allows AI services leveraging this knowledge base to avoid challenges like hallucinations, managed bias, and intellectual property violations. For an in-depth understanding of the technical design of paranets, DKG, and decentralized Retrieval-Augmented Generation (dRAG), we recommend reviewing the OriginTrail Whitepaper.

The Social Intelligence Paranet Initiative

Aligned with LunarCrush’s growth trajectory, the Social Intelligence Paranet will initially target the crypto sector, attracting high-quality content creators and community members from various crypto projects. LunarCrush will also mine knowledge tied to their social insights, such as Alt Rank, Top Creators, and Sentiment analysis. Beyond knowledge mining, the Social Intelligence Paranet will feature the first AI-powered tool to interact with top knowledge assets on the Paranet, supported by LunarCrush. This AI-powered tool will be accessible to users paying with BUZ tokens. All BUZ tokens spent by users will be recycled as additional rewards for knowledge mining.

In the upcoming weeks, a comprehensive proposal for the Social Intelligence Paranet will be submitted to the NeuroWeb community for approval. The proposal will include:

- Knowledge Assets created from LunarCrush APIs

- An incentives model for knowledge miners targeting the first category of knowledge

- A demo of the LunarCrush AI tool

Advancing the Wisdom of the Crowds

The traditional wisdom of the crowds concept eliminates idiosyncratic noise associated with individual judgment by averaging a large number of responses. Social Intelligence takes this concept further by unlocking actionable information through high-quality, curated knowledge enhanced with specific domain expertise. The rise of AI introduces the potential for another leap forward in extracting wisdom from a vast body of knowledge. Incentivized crowdsourcing to collect superior social content provides an ideal foundation for AI services to uncover wisdom that is not immediately apparent. While a conversational tool is the initial step, subsequent developments will include AI agents performing comprehensive tasks such as market analysis and prediction market suggestions. As the Social Intelligence Paranet expands beyond the crypto field, it promises to support enhanced decision-making powered by the wisdom of the crowds across various topics.

Growing the Buz Economy: Announcing the Social Intelligence Paranet Launch was originally published in OriginTrail on Medium, where people are continuing the conversation by highlighting and responding to this story.

Thursday, 29. August 2024

FIDO Alliance

White Paper: High Assurance Enterprise FIDO Authentication

Editors

Sean Miller, RSA

Abstract

Enterprises should consider using passkeys, especially if they are currently relying on passwords. By replacing these credentials with passkeys, enterprises will immediately reduce the risk of phishing and eliminate credential reuse, improving authentication service security. Different types of FIDO authenticators may be used to meet users’ needs with a balance between convenience and security. For enterprises that require high levels of identity assurance, internal security policies, or regulatory requirements, additional scrutiny is needed to determine the appropriate type of passkey. It is important to look at both the enterprise as a whole, as well as parts of the organization because high assurance requirements may not apply to the entire enterprise.

For many high assurance scenarios, attested device-bound passkeys may be more desirable. Relying parties with high assurance requirements will need to decide whether to accept all types of authenticators and adapt their authentication flow based on the attestation characteristics or reject registrations from unattested or unacceptable authenticators at the risk of a poor user experience.

Audience

This white paper is intended for IT administrators and enterprise security architects who are considering deploying FIDO authentication across their enterprises and defining life cycle management policies. This paper provides an overview of the different use cases for multi-factor authentication (MFA) and the FIDO Authenticator choices available to administrators. The intent is to help guide administrators in choosing the right authenticator types for their specific environment. Companies requiring higher levels of security, such as healthcare organizations, government agencies, or financial institutions with hard requirements around control of the credential, should read this white paper in particular.

It is assumed that the reader has an understanding of FIDO architecture, relying parties, protocols, and has read “FIDO EDWG 2023 Papers – Introduction” that introduces key concepts used in this white paper.

1. Introduction

This document focuses on deploying passkeys for enterprise users in high assurance environments.
Readers can find an introduction to the series of papers here. The introductory whitepaper provides additional descriptions and links to all papers in the series, which cover an array of use cases from low to high assurance. Most enterprises will likely have use cases that span more than one of these papers, and readers are encouraged to review the white papers relevant to their deployment environment.

This white paper examines what it means to be in a high assurance environment and how that may influence how FIDO is used. More specifically, the document addresses the challenges with password-only authentication and proposes passkeys as a stronger, phishing-resistant alternative to using passwords to authenticate users. Additionally, the document provides some adoption considerations for IT and security professionals to consider to ensure compliance with regulatory and security requirements for high assurance authentication scenarios. This white paper examines the use cases of registering a device, using a registered device, and dealing with recovering a lost device.

A key part of deciding whether a passkey should be allowed in an environment is attestation. Attestations can be provided for credentials as part of the registration process, and relying parties can trust them as evidence of the provenance of the authenticator being used. For high assurance enterprise scenarios, attestations should always be requested. What can be discovered from the attestation associated with the credential, or the absence of any attestation, can help drive policy decisions about whether to accept the registration. Without any attestation, it may be difficult for the relying party to decide whether the credential should be allowed. The relying party may reject the registration outright, making for a poor user experience, or the enterprise may choose to employ additional, conditional multi-factor authentication (MFA) along with FIDO authentication to meet the high assurance requirements. With an attestation, the enterprise has assurances about the provenance, manufacturer, certifications, and features of the authenticator, and can often rely on such authenticators as MFA devices, since they provide multiple factors, such as the credential itself and a PIN that unlocks the authenticator.
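For reference, requesting an attestation happens at registration time through the standard WebAuthn API. The sketch below shows the client-side call with `attestation: "direct"`; the RP identifiers, user details, and in-browser challenge are placeholders, since a real relying party issues the challenge server-side and verifies the attestation statement there.

```typescript
// Client-side WebAuthn registration requesting a full attestation statement.
async function registerWithAttestation(): Promise<Credential | null> {
  return navigator.credentials.create({
    publicKey: {
      // Demo only: in production the challenge comes from the server.
      challenge: crypto.getRandomValues(new Uint8Array(32)),
      rp: { id: "example.com", name: "Example Corp" }, // placeholder RP
      user: {
        id: new TextEncoder().encode("user-1234"), // placeholder user handle
        name: "jdoe@example.com",
        displayName: "J. Doe",
      },
      pubKeyCredParams: [{ type: "public-key", alg: -7 }], // ES256
      attestation: "direct", // ask the authenticator to attest to its provenance
      authenticatorSelection: { userVerification: "required" },
    },
  });
}
```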

Synced passkeys work well in many use cases and can still work for some high assurance scenarios, depending on the security or regulatory requirements of the enterprise. Synced passkeys are attractive because of their recoverability and ease of use; however, they also change where credentials reside and who controls them. Given this external control of the credentials, some additional MFA may be desired for synced passkeys where the enterprise has control of the lifecycle management of the MFA method.

The remainder of this white paper will examine enterprises or organizations that have high assurance requirements based on Authenticator Assurance Levels [7] and FIDO Certified Authenticator Levels [8] to operate.

Download the white paper

2. Passkey Use Cases

This section will focus on use cases around passkeys in an enterprise or an organization. There are many use cases for enterprises where synced passkeys work very well for ease and convenience in registering devices, using devices, and recovering lost devices since the credentials are available on other devices. It is highly recommended that organizations look at all the benefits of synced passkeys to determine if they are appropriate for the organization. However, the use of synced passkeys, while convenient, may not meet all the security requirements for an enterprise or organization needing high assurance (e.g., AAL3 requirements). AAL3 level has several requirements with the most significant being the use of a hardware-based authenticator. Please refer to NIST for more detail on the different levels of Authenticator Assurance Levels (AAL) [7]. Quite often, AAL3 applies to companies and organizations requiring higher levels of security, such as those involved in healthcare, government, or finance, which have a hard requirement around the control of the credential, specifically, that it is device-bound and never copied.

2.1. Registration
The enterprise or organization should first consider what device(s) they will support in their environment and how they will manage the provisioning of devices. For example, an organization may support an environment where users can bring their own device (e.g., mobile phone), or an organization may have very strict requirements around issued devices that meet specific security requirements such as PIN length, particular user presence features, or even specific hardware models. Finally, organizations need to consider whether they will allow passkeys to reside on multiple devices or just a single device. This has both security and recovery implications that need to be considered.

Organizations may have use cases that require credentials to be device-bound and not copyable at all, in which case synced passkeys are not recommended. Organizations may choose to allow synced passkeys alongside traditional MFA mechanisms, replacing the password with a passkey. However, if the organization has strict requirements for where the credentials can reside, they should look closely at restricting use to device-bound passkeys. These factors will decide how organizations manage registration. All these cases put some added burden on the relying party if types of passkeys need to be restricted.

The relying party may need to check whether certain requirements are met during the registration process, such as requiring an authenticator that meets or exceeds the FIDO L1+ certification [8]. To assess the authenticator's compliance with these requirements, the authenticator must provide an attestation that can be validated and examined. If an authenticator does not meet the requirements of L1+, then the relying party may be forced to reject the registration, since nothing can be proven about the provenance of the credential, or the party may consider an implementation with additional MFA to meet the requirements of high assurance.

If an attestation is provided, the relying party can check what type of device it is and if it meets the requirements of the enterprise or organization. The relying party may also want to restrict based on the unique identifier for the authenticator, provided an attestation is available. The unique identifier, known as an Authenticator Attestation Globally Unique Identifier (AAGUID), can be used to look up the details against the FIDO Alliance Metadata Service [2] to understand what type of device is being registered, the certification level, and what features it provides.
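As an illustration of an AAGUID-based policy check, the server-side sketch below extracts the AAGUID from the authenticator data and compares it to an allowlist that would, in practice, be curated from the FIDO Alliance Metadata Service. The byte offsets follow the WebAuthn authenticator-data layout; the allowlist entry is a fabricated placeholder, not a real vendor AAGUID.

```typescript
// Hypothetical allowlist, in practice derived from FIDO MDS entries.
const ALLOWED_AAGUIDS = new Set([
  "01020304-0506-0708-090a-0b0c0d0e0f10", // placeholder, not a real device
]);

function aaguidFromAuthData(authData: Uint8Array): string {
  // Layout: rpIdHash (32) + flags (1) + signCount (4), so AAGUID is bytes 37..52.
  const raw = authData.slice(37, 53);
  const hex = [...raw].map((b) => b.toString(16).padStart(2, "0")).join("");
  return [hex.slice(0, 8), hex.slice(8, 12), hex.slice(12, 16),
          hex.slice(16, 20), hex.slice(20)].join("-");
}

function registrationAllowed(authData: Uint8Array): boolean {
  const attestedDataPresent = (authData[32] & 0x40) !== 0; // AT flag, bit 6
  return attestedDataPresent && ALLOWED_AAGUIDS.has(aaguidFromAuthData(authData));
}
```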

Enterprise Attestation is another form of attestation that can be leveraged during registration. This is implemented by some authenticator vendors to add additional information that is unique to the organization. Including this additional information as part of the attestation and narrowing allowed authenticators can be used to further enhance the registration experience.

Similarly, there may be flags about whether the credential is eligible for backup and/or if it has been backed up. These flags cannot be trusted, however, without some attestation that the device is certified. A relying party might decide to allow or deny the registration based on this information as well as other information provided at runtime.
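The backup-related flags live in the same authenticator-data flags byte; per the WebAuthn Level 3 specification, bit 3 is BE (backup eligible) and bit 4 is BS (currently backed up). A minimal check might look like the following, bearing in mind the caveat above that the flags are only as trustworthy as the attestation behind them.

```typescript
// Read the WebAuthn backup flags from the authenticator-data flags byte.
function backupFlags(authData: Uint8Array): {
  backupEligible: boolean;
  backedUp: boolean;
} {
  const flags = authData[32]; // flags byte follows the 32-byte rpIdHash
  return {
    backupEligible: (flags & 0x08) !== 0, // BE, bit 3
    backedUp: (flags & 0x10) !== 0,       // BS, bit 4
  };
}
```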

Unfortunately, if the relying party fails the registration of a credential, the user is forced to restart the registration process from step one with a different authenticator. Although WebAuthn [5] does not support a preflight mechanism to identify suitable authenticators, relying parties may provide feedback to the user before registration to identify acceptable authenticators. Additional guidance can be provided after a failed registration to guide the user's choice of authenticator. This guidance should be explicit: it should identify why the authenticator was rejected during registration, which authenticators meet the RP's requirements, and how to manage browser-mandated optionality in communicating attestations.

Relying parties should be able to be more prescriptive in describing requirements of authenticators, allowing for a much better user experience where the end user can only select authenticators that meet the requirements and remove this burden from relying parties. These changes have been proposed to WebAuthn, but they have not yet gathered the support of platform vendors.

Another approach for enterprises might be not to offer any registration use case exposed to the end user. Instead, the enterprise would manage the lifecycle of registering the devices before they are provisioned to users. Similarly, the enterprise might provide some form of supervised registration experience to ensure only authorized authenticators are provisioned and registered. This avoids a number of pitfalls with the user experience mentioned above but puts more lifecycle management burden on the enterprise.

2.2. Sign In
Once a credential has been registered, FIDO credentials can be accessed when needed at authentication. The application(s) will leverage the WebAuthn browser API or platform passkey APIs to perform a FIDO authentication using a registered device. Depending on the type of registered device, there will be multiple factors involved in the authentication, like the entering of a PIN or a user presence challenge. The requirement for these interactions is there is a high level of assurance that the user is who they say they are, and they are not impersonating any user. These requirements need to be enforced during the registration process to ensure devices are allowed to meet the requirements of the enterprise or organization.
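For illustration, a minimal client-side sign-in that enforces a second factor might look like the sketch below. The challenge and stored credential ID are placeholders: a real relying party issues the challenge server-side and verifies the returned assertion there.

```typescript
// Placeholder: the credential ID the RP stored at registration time.
declare const registeredCredentialId: ArrayBuffer;

async function signIn(): Promise<Credential | null> {
  return navigator.credentials.get({
    publicKey: {
      // Demo only: in production the challenge comes from the server.
      challenge: crypto.getRandomValues(new Uint8Array(32)),
      rpId: "example.com", // placeholder RP
      allowCredentials: [{ type: "public-key", id: registeredCredentialId }],
      userVerification: "required", // require PIN/biometric, not just presence
    },
  });
}
```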

The only difference in this use case between synced passkeys and device-bound passkeys is what needs to be authenticated. For device-bound passkeys, the original hardware device used during the registration process is needed. Synced passkeys may be accessed from multiple devices that have access to an account hosted by a passkey provider. Furthermore, some synced passkeys may be shared after registration. Relying parties do not have a mechanism for identifying shared credentials in the current specifications, making it harder to understand and manage the lifecycle of synced passkeys.

There are several enterprise use cases covered in the white paper on “Choosing FIDO Authenticators for Enterprise Use Cases” [4]. Organizations should review these to evaluate how FIDO is leveraged. In particular, an organization planning to rely on FIDO as a first factor (passwordless) or a second factor is a key decision, and the white paper may help organizations understand what truly requires high assurance. For example, there may be a specific project, or a use case may apply to an entire industry driven by government or regulatory requirements. Employees might be allowed to use a synced passkey to access a laptop for example, but then need to use a device-bound passkey to sign in to a specific application restricted to certain employees with a particular clearance level.

2.3. Recovery/Lost Device
Recovery is where a synced passkey shines. If a user loses a FIDO device that holds a credential, they can simply access the credential from a different device that shares the same platform account. This is convenient, but it also means that a passkey is only as secure as the platform account with which it is associated. Enterprises should examine vendor solutions to understand how secure they are before relying on a service external to the organization. For example, does it provide end-to-end encryption with keys that are not known to the vendor? What additional measures, like MFA, are used to secure the user's account? What process is used for account recovery? End users may not be concerned about such matters, but these details may represent a security concern for the organization's security administrators. The organization's security requirements need to be examined to determine whether an external party can store and manage credentials. Furthermore, without requiring attestations, the relying party has no idea who or what issued a credential, whether it be the platform, a roaming authenticator, a browser plug-in, or something else. As a result, the relying party cannot provide any guidance on how to recover access to the credentials while providing high assurance. An alternative form of account recovery, external to recovering the FIDO credential, would be needed to verify the identity of the user and issue a new device and credentials. Finally, with synced passkeys, recovery of a passkey from the provider is invisible to the relying party. This represents a potential attack of which the enterprise is unaware.

For device-bound passkeys, the recovery process is more involved and will likely require the involvement of a help desk [6] to issue a new device and possibly revoke access for the old device. This is a security-first approach over convenience that allows an enterprise or organization to control who has devices. It does mean there are additional steps needed for the end user before they can regain access. However, this gives enterprises more control over the lifecycle of the credentials, allowing enterprises to revoke or expire authenticators at any point and be able to guarantee that credentials are not copied or do not exist outside enterprise controls. Some enterprises have solved this by provisioning multiple devices so users can self-recover. Ultimately, there is a business decision to be made regarding recovery models. In some cases, it may be appropriate to block access until the user can receive a new device, taking loss of productivity over a lower security model. The extra burden highlighted in the registration step if an enterprise chooses to manage the registration experience has a direct impact on the recovery/replacement experience.

2.4. Unregistering
At some point an employee will either leave a project or the enterprise overall. The enterprise will want to be sure they have control over credentials and unregister their use so access is no longer possible. This is a bigger consideration when it comes to synced passkeys where the enterprise does not have full control of the lifecycle and management of the credentials. If synced passkeys require additional MFA, the enterprise can control the MFA aspect, expiring the factors involved so authentications no longer are allowed. Device-bound passkey environments have much more control over unregistering devices, either by physically handing in a device and knowing no copies were made, or invalidating/expiring the device so subsequent authentication attempts fail.

The credential lifecycle requires the ability to disable or remove a credential, whether due to a change in status of an employee, such as a leave of absence or separation from the organization, or due to the potential loss or compromise of a credential. Passkeys differ from passwords in these instances since the user may have multiple passkeys registered with the relying party, as opposed to passwords, where the user is expected to only have one password per relying party. In the case of a permanent separation between the user and enterprise, disabling the user account and/or rotating the credential in the service is standard practice to ensure the user is no longer able to authenticate. If the separation is temporary, such as for leave of absence, enterprises may choose to rotate all the user’s credentials or disable the user account until the user returns.

In the case of credential loss, the next steps are dependent upon the deployment scenario. Users with device-bound passkeys who lose their security key should have the credential revoked by the service. Synced passkeys create additional challenges. If the device has been compromised, all credentials resident on the device, including those resident in different passkey providers, should be treated as compromised and revoked by the RP. If the user’s passkey provider account has been compromised, the impacted credential(s) stored with the provider must be revoked. To facilitate revocation in these scenarios, RPs should allow credentials to be named or otherwise identified by the user during registration to facilitate the revocation of specific credentials where possible. Administrative controls must narrow their focus on eliminating credentials from the RP rather than removing the credential private key material from either hardware security keys or a passkey provider’s sync fabric, which may not be possible.

3. Deployment Strategy

In a high assurance environment, the enterprise is likely going to want to manage the distribution and retirement of all authenticators. Device-bound passkeys would be managed by IT and provisioned to individuals. Relying parties would need to check for attestations and only allow the registration of authenticators that are managed by the enterprise or organization. If attestations are absent or do not meet the security requirements, the registration should fail. Processes should be established to manage the pool of authenticators to ensure they are retired when individuals leave or no longer require high-level access. Lastly, the organization or enterprise should define what the process looks like for recovering lost/stolen devices. Depending on how critical the access is to the continuity of the business, multiple hardware devices might be issued for a given individual to ensure they always have access.

4. Conclusion

There is no argument that passkeys are a strong phishing-resistant alternative option to traditional passwords. In an enterprise environment, it is important to look at security and regulatory requirements to determine if synced passkeys work, or if there are stricter constraints such as internal security policies, regulatory, or compliance requirements that require the use of device-bound passkeys. With either approach, enterprises should spend the time to understand how registration, management, and recovery of FIDO credentials will be managed. This includes important use cases like storage of credentials (external), recovery of lost credentials, and unregistering devices when employees leave. Based on the requirements of the enterprise, passkeys may work without any customizations, or enterprises may need to invest to ensure their authentication experience is more managed and filtered to specific devices.

5. Next Steps: Get Started Today

Use FIDO standards. Think about what your relying parties are supporting and consider your enterprise security requirements. Passkeys are far more secure than passwords. Look for the passkey icon on websites and applications that support it.

For more information about passkeys, visit the FIDO Alliance site [3].

6. References

[1] FIDO Deploying Passkeys in the Enterprise – Introduction
[2] FIDO Alliance Metadata Service – https://fidoalliance.org/metadata/
[3] Passkeys (Passkey Authentication) –
https://fidoalliance.org/passkeys/#:~:text=Can%20FIDO%20Security%20Keys%20support,discoverable%20credentials%20with%20user%20verification.
[4] FIDO Alliance White Paper: Choosing FIDO Authenticators for Enterprise Use Cases –
https://fidoalliance.org/white-paper-choosing-fido-authenticators-for-enterprise-use-cases/
[5] WebAuthn – https://fidoalliance.org/fido2-2/fido2-web-authentication-webauthn/
[6] FIDO account recovery best practices –
https://media.fidoalliance.org/wp-content/uploads/2019/02/FIDO_Account_Recovery_Best_Practices-1.pdf
[7] NIST Authenticator Assurance Levels – https://pages.nist.gov/800-63-3-Implementation-Resources/63B/AAL/
[8] FIDO Certified Authenticator Levels – https://fidoalliance.org/certification/authenticator-certification-levels/

7. Acknowledgements

We would like to thank all FIDO Alliance members who participated in the group discussions or took the time to review this paper and provide input, specifically:

- Matthew Estes, Amazon Web Services
- John Fontana, Yubico
- Rew Islam, Dashlane
- Dean H. Saxe, Amazon Web Services, Co-Chair FIDO Enterprise Deployment Working Group
- Johannes Stockmann, Okta
- Shane Weeden, IBM
- Khaled Zaky, Amazon Web Services
- FIDO Enterprise Deployment Group members

White Paper: FIDO Authentication for Moderate Assurance Use Cases

Editors

Jerome Becquart, Axiad
Greg Brown, Axiad
Matt Estes, Amazon Web Services

Abstract

The intent of this whitepaper is to provide guidance for organizations as they analyze the abilities and features of both device-bound passkeys and synced passkeys to determine how both credential types can be utilized in a moderate assurance environment. In this paper, the term "moderate assurance" refers to an environment or organization whose legal, regulatory, and security requirements are flexible enough to allow for the use of both types of credentials: synced passkeys to replace passwords and multi-factor authentication (MFA) for standard user accounts, and device-bound passkeys for user accounts that require the highest level of protection and assurance. The paper compares the features and requirements supported by device-bound passkeys and synced passkeys, providing a vision of how both types of credentials can be utilized together in an organization that has moderate assurance needs.

Audience

This white paper is one in a series of white papers intended for anyone who is considering deploying FIDO Authentication across their organization, including IT administrators, enterprise security architects, and executives.

Readers can find an introduction to the series of papers here. The introductory white paper provides additional descriptions and links to all papers in the series, covering an array of use cases from low to high assurance. We expect that most enterprises will have use cases that span more than one of these papers and encourage readers to review the white papers that are relevant to their deployment requirements.

The white paper assumes that the reader has a foundational understanding of FIDO2 credentials and the role they play in the authentication process; introductory information on FIDO2 can be found here: FIDO2 – FIDO Alliance.

1. Introduction

The initial implementations of FIDO2 credentials were created as device-bound passkeys on either a roaming authenticator or a platform authenticator, where the private key of the credential is stored on the device's authenticator and cannot be exported, copied, backed up, or synchronized from the authenticator. This configuration presents a very secure, phishing-resistant solution for authentication that gives relying parties (e.g., websites or service providers) a very high level of confidence that the user and the device are legitimate users of the system. With this high level of assurance, however, come some challenges, primarily regarding usability and account recovery. For example, because there is no way to get the private key off the authenticator, if the device storing the private key becomes lost or damaged, access to the resources that key authenticated would be lost. With device-bound passkeys, the solution is to register a second device-bound passkey with every relying party. This creates a more difficult user experience, as the user is required to register both authenticators. This burden is somewhat reduced for organizations that have consolidated their authentication flow by using an identity provider (IdP) to federate access to their applications, as the relying party is then the IdP itself.

To solve these challenges, in May 2022 Apple, Google, and Microsoft announced their intent to support synced passkeys in their operating systems. Synced passkeys have many of the same characteristics of device-bound passkeys, including the continued use of private and public key pairs. One significant difference, however, is that synced passkeys allow for the private key of the credential to be synchronized to other devices the user owns that exist in the same vendor’s synchronization fabric ecosystem (e.g., iCloud in the Apple ecosystem). Synced passkeys also allow for the creation of a more streamlined and user-friendly experience. All passkeys share several common security properties, are highly phishing resistant, and use unique key pairs to enable strong authentication. However, it is also important to note the difference between synced and device-bound passkeys. For example, synced passkeys introduce new security considerations when analyzed against a device-bound passkey. Conversely, synced passkeys can more easily address account recovery challenges.

As organizations work to evaluate how and where both credential types can be utilized in their environment, they will need to review and understand their organization’s legal, regulatory, and security requirements. When organizations evaluate these requirements, they will many times refer to the combination of these requirements as an authentication assurance level (AAL) and will reference documentation from the National Institute of Standards and Technology (NIST), which provides guidance and recommendation for different assurance levels. While there is currently work underway by NIST to update these assurance levels to better incorporate synced passkeys, the current standards can be helpful when evaluating the implementation of device-bound passkeys and synced passkeys into an organization. More information regarding NIST and AALs can be found here: Authenticator Assurance Levels (nist.gov).

In terms of this white paper, a moderate assurance environment is an organization that has several different authentication use case scenarios that can be met by a combination of AAL1 and/or AAL2 as well as AAL3 levels of assurance. This white paper dives deeper into the advantages and disadvantages of both device-bound passkeys and synced passkeys, providing a comparison that an organization can use, along with its own legal, regulatory, and security requirements, to determine how and where to implement each credential type in its moderate assurance environment. The goal is to let organizations take advantage of the secure, phishing-resistant, and user-friendly authentication that FIDO2 credentials provide in all parts of the organization.

Download the white paper

2. FIDO Credential Adoption Considerations

When organizations are evaluating the use of both device-bound passkeys and synced passkeys to support the AAL1, AAL2, and AAL3 requirements of their organization, there are several factors that they should consider. These factors are described below and are intended to provide the organization with the information they need to help analyze both types of credentials and determine where they can be used in their enterprise.

2.1. User Experience
In terms of user experience, the goal of using FIDO credentials to authenticate to a system has always been to provide an easy-to-use and effortless process for the user. The original FIDO implementations provided a streamlined sign-in experience, but still presented some user experience challenges.

Passkeys introduce several enhancements that help improve the user experience, including a new feature called "passkey Autofill UI" that gives users easier access to passkey creation and provides an autofill-like experience: users simply pick the credential they want to use when authenticating and no longer type in a username or password. This experience is easy to use and very similar to the experience most users already like and are comfortable with from solutions such as password managers. Creating a passkey user experience that users prefer over their current password experience removes the adoption hurdle seen with previous passkey implementations.
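For orientation, the autofill experience is driven by WebAuthn's conditional mediation: an input tagged with autocomplete="username webauthn" plus a pending conditional request lets the browser offer saved passkeys in the autofill dropdown. A hedged client-side sketch, with a placeholder RP and a demo-only in-browser challenge, might look like this:

```typescript
// HTML side (for context): <input type="text" autocomplete="username webauthn" />

async function offerPasskeyAutofill(): Promise<void> {
  // Feature-detect conditional mediation before requesting it.
  if (!(await PublicKeyCredential.isConditionalMediationAvailable?.())) return;

  // Resolves when the user picks a passkey from the autofill dropdown.
  const assertion = await navigator.credentials.get({
    mediation: "conditional",
    publicKey: {
      // Demo only: in production the challenge comes from the server.
      challenge: crypto.getRandomValues(new Uint8Array(32)),
      rpId: "example.com", // placeholder RP
    },
  });
  console.log("selected credential:", assertion?.id);
}
```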

2.1.1 Backup, Lost Devices, and Recovery
With device-bound passkeys, the private key is stored on and not allowed to leave the authenticator. This creates a very secure solution but does create challenges for users and enterprises regarding backup of the key data, loss of the authenticator, and addition of new authenticators for the user. While there are recommended recovery practices for device-bound passkeys (FIDO_Account_Recovery_Best_Practices-1.pdf (fidoalliance.org)), synced passkeys work to resolve these challenges in a more user friendly manner. With the implementation of a synced passkey solution, the user no longer must register multiple authenticators with a relying party to ensure continued access in the event of a lost authenticator. If an authenticator is lost, a user can recover their passkey by using the recovery process provided by the passkey provider. Additionally, synced passkeys make for a better user experience as a user does not have to register unique credentials per device or maintain multiple device-bound passkeys to minimize the risk of credential loss. Once configured, synced passkeys are available across all devices synced with the passkey provider.

Synced passkeys do, however, create a dependency on the passkey provider and their synchronization fabric. Each provider implements their own synchronization fabric, which includes their own security controls and mechanisms to protect credentials from being misused. Organizations with specific security or compliance requirements should assess which provider(s) or hardware security keys meet their requirements.

Synced passkeys have a lower security posture as they allow the private key on the authenticator to be synchronized to authenticators of other devices the user has in the same vendor’s ecosystem. Organizations should also be aware that there currently are no standards or systems that allow them to keep track of what devices these credentials have been created and stored on, nor mechanisms to identify when the credential has been shared with another person. For use cases in an organization that require a high level of assurance, the fact that this information cannot be determined or obtained means that synced passkeys would not be a good solution for those specific organizational use cases, and they should look to device-bound passkeys to support those use cases.

2.3 Attestation and Enforcement of Credential Type
Attestation is a feature that is designed to enhance the security of the registration process. Attestation mechanisms are defined by the specifications as an optional feature, though most hardware security keys implement it. Attestation is the ability of the authenticator to provide metadata about itself back to the relying party so that the relying party can make an informed decision on whether to allow the authenticator to interact with it. This metadata includes items such as an Authenticator Attestation Globally Unique Identifier (AAGUID), which is a unique ID that represents the vendor and model of the authenticator, the type of encryption that the authenticator uses, and the PIN and biometric capabilities of the authenticator. Some authenticator vendors also support a feature called Enterprise Attestation that allows an organization to add additional uniquely identifying information in an attestation that is included with an authenticator registration request, with the intent to use this additional information to support a controlled deployment within the enterprise where the organization wants to allow the registration of only a specific set of authenticators. Additional information about Enterprise Attestation can be found in this white paper: FIDO-White-Paper-Choosing-FIDO-Authenticators-for-Enterprise-Use-Cases-RD10-2022.03.01.pdf (fidoalliance.org).

At the time of publication, synced passkeys do not implement attestation, which means they are not an appropriate solution for scenarios with highly privileged users that require higher levels of assurance or for organizations that want to implement Enterprise Attestation. To support these highly privileged users, relying parties and organizations have historically looked to, and will need to continue to look to, device-bound passkeys and authenticators from vendors that support and include attestation in their solutions. For organizations that have regulatory, legal, or security requirements that require all users to be treated as high privilege users or have a need to implement Enterprise Attestation, it is recommended that only device-bound passkeys be implemented in their environment. A companion white paper, “High Assurance Enterprise Authentication,” provides details on this scenario and can be found here: https://media.fidoalliance.org/wp-content/uploads/2023/06/FIDO-EDWG-Spring-2023_Paper-5_High-Assurance-EnterpriseFINAL5.docx-1.pdf. Moderate assurance organizations can support all their users by implementing synced passkeys for their standard users to replace passwords and MFA with a more secure solution and then use device-bound passkeys for highly privileged users and their access to resources that require the highest level of assurance.

Implementing both types of passkeys in the same authentication domain does, however, create an additional challenge: organizations must take extra steps to ensure that the correct type of passkey is used when accessing resources, for example, that a highly privileged user is using a device-bound passkey and not a synced passkey when accessing a resource that requires a high level of assurance. Organizations can leverage the user risk evaluation and policy engine framework of their Identity Provider to solve this challenge. Another option is to watermark the user's session with an identifier representing the AAL (or other properties of their choosing) for use in downstream authorization decisions. In federated authentication environments, this may be communicated using standards such as the Authentication Method Reference (amr, RFC 8176) claim standardized by OpenID Connect.
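As an illustration of the federated case, a downstream service might gate high-assurance resources on the amr claim of a validated ID token. The claim values and policy mapping below are illustrative assumptions; RFC 8176 defines "hwk" as proof-of-possession of a hardware-secured key, which an IdP might assert for a device-bound passkey.

```typescript
// Claims from an already-validated OIDC ID token (validation not shown).
interface IdTokenClaims {
  sub: string;
  amr?: string[]; // e.g., ["hwk", "user", "pin"] per RFC 8176
}

// Illustrative policy: only hardware-key-backed sessions reach
// high-assurance resources.
function meetsHighAssurance(claims: IdTokenClaims): boolean {
  return claims.amr?.includes("hwk") ?? false;
}
```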

3. Conclusion

In moderate assurance environments, both device-bound passkeys and synced passkeys may be implemented together to provide a more secure authentication solution for all use cases of the organization. The more user-friendly synced passkeys can be implemented to replace passwords and MFA for users with standard assurance level requirements, giving them a more secure authentication method that is also easier to use. For highly privileged users in the organization that require the highest level of security, device-bound passkeys can be issued that provide an even higher level of security and an additional level of trust in the authentication process. The white paper provides information comparing synced passkeys, with their better user experience, against device-bound passkeys, with their enhanced security features. Using this information, organizations can evaluate device-bound passkeys and synced passkeys to determine how both can be leveraged in their organization to provide easy-to-use and secure authentication methods that meet and exceed the requirements of their moderate assurance environment.

4. Next Steps

The next step for organizations is to start evaluating FIDO2 credentials so that they can move away from passwords, which are susceptible to phishing and are well documented to be a significant weakness in their overall security posture. Organizations that have a moderate assurance need and will implement both device-bound passkeys and synced passkeys should determine which credential type will provide the best return on investment, work towards implementing that credential type first, and then follow up by completing the deployment of the other credential type when possible. Implementing either type of FIDO2 credential is a large step forward in moving to a passwordless environment and significantly increases the overall security posture of the organization.

5. Acknowledgements

We would like to thank all FIDO Alliance members who participated in the group discussions or took the time to review this paper and provide input, specifically:

Karen Larson, Axiad
Jeff Kraemer, Axiad
Dean H. Saxe, Amazon Web Services, Co-Chair FIDO Alliance Enterprise Deployment Working Group
Tom Sheffield, Target Corporation
FIDO Enterprise Deployment Working Group Members

White Paper: Replacing Password-Only Authentication with Passkeys in the Enterprise

Editors

Khaled Zaky, Amazon Web Services

Abstract

This white paper describes the need for a more secure and convenient solution for authentication. Passwords have long been the standard for authentication, but the risks inherent to passwords reduce their efficacy as an authentication mechanism. Multi-factor authentication (MFA) solutions have been on the market for some time, but their widespread adoption has been slow due to various barriers. Passkeys are an authentication solution that reduces the adoption barriers of traditional MFA mechanisms while offering improved security, ease of use, and scalability over passwords and classic MFA solutions. Passkeys utilize on-device biometrics or PINs for authentication and provide a seamless user experience. This white paper outlines the benefits of passkeys, the user experience, and adoption considerations for enterprises.

1. Introduction

Passwords have long been the standard for authentication, but their inherent security flaws make them exploitable. Many passwords can be easily guessed or obtained through data breaches, and the reuse of passwords across multiple accounts only exacerbates the problem. This vulnerability makes them susceptible to credential stuffing attacks, which use leaked or commonly used passwords to gain unauthorized access to user accounts. In fact, passwords are the root cause of over 80% of data breaches, with up to 51% of passwords being reused. Despite these security concerns, many consumers and organizations continue to rely solely on passwords for authentication. According to recent research by the FIDO Alliance, 59% of consumers use only a password for their work computer or account.

Traditional multi-factor authentication (MFA) mechanisms, such as one-time passwords (OTPs) delivered via SMS, email, or an authenticator app, are used by organizations to reduce the risk associated with a single-factor, password-based authentication system. Organizations using single-factor authentication with passwords, or those that have deployed OTPs to reduce phishing and credential stuffing, can implement passkeys as a password replacement to provide an improved user experience, less authentication friction, and improved security properties using devices that users already use—laptops, desktops, and mobile devices. For an introduction to passkeys and the terminology, please see the FIDO Alliance's passkeys resource page. In the following pages, we will focus on migrating existing password-only use cases to passkeys. For additional use cases, please see here.

2. Why Are Passkeys Better than Passwords?

Passkeys are a superior alternative to passwords for authentication purposes and offer improved usability over traditional MFA methods. They offer several benefits such as better user experience, reduced cost of lost credentials, phishing resistance, and protection against credential compromise.

Synced passkeys offer a consistent authentication experience for users across multiple devices. This is made possible by leveraging the operating system platform (or a third party synchronization fabric such as that from password managers) to synchronize cryptographic keys for FIDO credentials. This allows for quick and easy sign-in using biometrics or a device PIN. Synced passkeys also improve scalability and credential recovery. With synced passkeys users do not have to enroll a new FIDO credential on every device they own, ensuring that they always have access to their passkeys, regardless of whether they replace their device.

Device-bound passkeys, such as those on hardware security keys, can also be used across multiple devices, allowing for cross-device portability. Unlike synced passkeys, which are accessible on any synchronized device, device-bound passkeys are tied to the specific physical security key.

In terms of security, passkeys are built on the FIDO authentication standards, providing strong resistance against the threats of phishing and credential stuffing. Additionally, passkeys rely on existing on-device security capabilities, making it easier for small and medium enterprises to adopt stronger authentication methods.

Finally, passkeys offer a comprehensive solution for secure and efficient authentication that is better than passwords and traditional MFA authentication methods. With a seamless user experience, improved scalability, and enhanced security, passkeys are a valuable solution for organizations of all sizes.

3. Passkeys User Experience

3.1 Create a passkey visual UX/UI

Note: This section provides an overview of the passkey registration and sign-in process using examples. The FIDO Alliance User Experience Working Group has developed UX guidelines for passkeys that are available here.

1. In the passkey registration flow, users are first prompted to provide an email or username along with their password to authenticate.

2. Then, users simply follow the prompts to provide their on-device biometric or PIN authentication.
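To make the flow concrete, here is a minimal browser-side sketch of the registration step using the WebAuthn API. The /webauthn/register/* endpoints are hypothetical stand-ins for a relying party's backend.

```ts
// Minimal sketch of passkey registration in the browser (assumed endpoints).
async function registerPasskey(username: string): Promise<void> {
  // 1. Fetch PublicKeyCredentialCreationOptions from the relying party.
  const resp = await fetch('/webauthn/register/options', {
    method: 'POST',
    headers: { 'Content-Type': 'application/json' },
    body: JSON.stringify({ username }),
  });
  const options: PublicKeyCredentialCreationOptions = await resp.json();
  // (In practice, challenge and user.id arrive base64url-encoded and must be
  // decoded to ArrayBuffers before this call.)

  // 2. The browser prompts the user for their on-device biometric or PIN.
  const credential = await navigator.credentials.create({ publicKey: options });

  // 3. Return the new public-key credential for server-side verification.
  //    (Real code must base64url-encode the credential's binary fields.)
  await fetch('/webauthn/register/verify', {
    method: 'POST',
    headers: { 'Content-Type': 'application/json' },
    body: JSON.stringify(credential),
  });
}
```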

3.2 Sign in with a passkey visual UX/UI

To sign in with a passkey, the user simply selects their email or username; available passkeys are then shown in the passkey autofill user interface.
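A corresponding sign-in sketch, again with a hypothetical backend endpoint, shows how the autofill experience is requested via “conditional” mediation:

```ts
// Minimal sketch of passkey sign-in with the browser's autofill UI.
// "conditional" mediation surfaces available passkeys in the autofill UI of
// an <input autocomplete="username webauthn"> field.
async function signInWithPasskey(): Promise<Credential | null> {
  const resp = await fetch('/webauthn/login/options'); // assumed endpoint
  const options: PublicKeyCredentialRequestOptions = await resp.json();
  // (As with registration, challenge bytes arrive base64url-encoded in
  // practice and must be decoded before this call.)
  return navigator.credentials.get({
    publicKey: options,
    mediation: 'conditional',
  } as CredentialRequestOptions);
}
```

4. Adoption Considerations for Enterprises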

Within businesses large and small, there are systems and services dependent upon single factor authentication using passwords. We collectively refer to these use cases as “low assurance use cases.” For low assurance use cases, technology leaders can displace password-only authentication mechanisms with passkeys, dramatically reducing the risk of phishing, and eliminating password reuse and credential stuffing. However, even for low assurance use cases, businesses must consider factors that will influence their choice of technology and implementation, which we outline below.

As a prerequisite to deploying passkeys in the enterprise, leaders must clearly define the set of use cases, users, and the suitability of passkeys for this set.

4.1 Does the relying party (RP) support passkeys?
At the time of writing (Q2 2023), passkeys are a relatively new technology, and as such, broad-based support is not guaranteed. As organizations review their systems to identify candidates for migration to passkeys, leaders must start by identifying where passkeys are supported within their ecosystem.

First, for in-house developed/managed applications, how can passkey support be added to the application(s)? If a single sign-on (SSO) mechanism is used to federate multiple applications and services, adding passkey support to the Identity Provider (IdP) can propagate support for passkeys to numerous federated applications, creating a rich ecosystem of services supporting passkeys with engineering efforts focused on the SSO IdP. Conversely, if the environment uses multiple independent applications, each of which uses password-based authentication, organizations will have to prioritize FIDO implementation across their suite of applications to leverage passkeys, or consider migrating to a federated authentication model where the IdP supports passkeys.

Second, third-party developed or hosted applications may or may not support passkeys. If an organization's service provider does not support passkeys today, inquire when support is expected. Alternatively, if the organization is pursuing a federated identity model, does the service provider support inbound federation? If so, end users can authenticate to the IdP with a passkey before federating to the service provider's systems.

4.2 Which devices are used to create, manage, and authenticate with passkeys?
After identifying a set of targeted applications or IdPs, identify the users of the applications and the devices they use to access them. Generally speaking, users on modern operating systems, browsers, and hardware will have broad support for passkeys registered on a platform device, using a credential manager, or with a hardware security key. There are tradeoffs with each mechanism.

Today, passkey providers allow users to register passkeys that are synchronized to all of the devices the user has registered with the sync fabric. Passkey providers may be part of the operating system, browser, or a credential manager that stores and manages passkeys on behalf of the user. If the user loses or replaces their device, the passkeys can be synchronized to a new device, minimizing the impact on users. Typically, this is a good solution for users who use a small number of devices on a regular basis.

Conversely, hardware security keys create device-bound passkeys; they never leave the device. If a user loses their hardware key, they must have a backup or perform account recovery for all credentials stored on the device. Passkeys may be shared with other users if they are not hardware bound.

Hardware security keys require connectivity to the user's computing device through USB, Bluetooth, or NFC, whereas synced passkey providers are always available on the user's devices once bootstrapped. Platform credentials may also be used to authenticate on nearby devices using the FIDO Cross-Device Authentication (hybrid) flow. Enterprises should consider whether users who move between a number of shared devices should synchronize passkeys across all the shared devices, use hardware keys, or use the hybrid flow to best support their work style.

When users operate on shared devices using a single account (or profile), passkeys registered to the platform or credential managers are not a good fit. Device-bound passkeys on a hardware key are recommended for this scenario. If the user carries a mobile device, consider registering a passkey on that device and using the cross-device authentication flow to authenticate users.

Unlike passwords, all of the passkey solutions reviewed above provide strong phishing resistance, prevent credential reuse, and eliminate credential theft from the RP.

4.3 Registration & Recovery
If there are no restrictions on which device(s) or platform(s) the user can register their passkeys, users may self-provision passkeys by bootstrapping a new credential from their existing password using the device(s) of the user’s choice. If using hardware security keys, organizations should provide two per user to allow for a backup credential.

As long as a password remains active on the user account, the user can recover from credential loss following the self-provisioning described above. This step is only required if the user is unable to restore their credentials from their passkey provider.

5. Conclusion

Passkeys offer a significant improvement in security compared to traditional passwords, but it is important to carefully evaluate and understand the adoption considerations before proceeding with an implementation. Organizations should ensure their technical requirements, security, and management preferences align with the passkey solution. Not all use cases are suitable for a passkey-only implementation. For additional deployment patterns, see the other white papers in this series here.

6. Next Steps: Get Started Today

Organizations should upgrade their authentication method and take advantage of the stronger security that passkeys provide. Based on the FIDO authentication standards, passkeys offer a robust solution to the growing threat of phishing attacks. Look for the passkey icon on websites and applications that support it, and take the first step towards a more secure future. Don’t wait. Make the switch to passkeys today!

For more information about passkeys, visit the FIDO Alliance site.

7. Acknowledgements

We would like to thank all FIDO Alliance members who participated in the group discussions or took the time to review this paper and provide input, specifically (in alphabetic order):

Jerome Becquart, Axiad
Vittorio Bertocci, Okta
Greg Brown, Axiad
Tim Cappalli, Microsoft
Matthew Estes, Amazon Web Services
John Fontana, Yubico, Co-Chair FIDO Enterprise Deployment Working Group
Rew Islam, Dashlane
Jeff Kraemer, Axiad
Karen Larson, Axiad
Sean Miller, RSA
Dean H. Saxe, Amazon Web Services, Co-Chair FIDO Enterprise Deployment Working Group
Tom Sheffield, Target Corporation
Johannes Stockmann, Okta
Shane Weeden, IBM
Monty Wiseman, Beyond Identity
FIDO Enterprise Deployment Working Group Members

White Paper: FIDO Deploying Passkeys in the Enterprise – Introduction

Editors

Dean H. Saxe, Amazon Web Services, Co-Chair FIDO Enterprise Deployment Working Group

1. Introduction

Last year FIDO Alliance, Apple, Google, and Microsoft announced their intentions to support passkeys— FIDO credentials that may be backed up and made available across devices that are registered to the same passkey provider. Since then, we have seen the support for passkeys and beta implementations by multiple platforms and password managers. Enterprises have expressed interest in passkeys but do not know where to start, what type of passkeys work in their environment, or how passkeys fit in their authentication strategy.

It is important to note that the FIDO Alliance has embraced the term “passkey” to describe any passwordless FIDO credential. This includes synced passkeys (consistent with the original announcement and intent) as well as device-bound passkeys – FIDO authentication credentials that cannot leave the device to which they are issued (e.g., a FIDO Security Key).

In the following series of papers, the FIDO Enterprise Deployment Working Group (EDWG) will provide guidance to leaders and practitioners on deploying FIDO solutions scaling from SMBs to large enterprises. With recognition that there are a variety of different use cases for FIDO credentials, from synced passkeys to device-bound passkeys, this series will identify key decision points for identifying which solution(s) are a good fit across different enterprise use cases. Enterprises are likely to find there are multiple FIDO-based solutions required to meet their different use cases.

As organizations evaluate how to use passkeys in their environment, they will need to determine the legal, regulatory, and security requirements of their organization and evaluate how both synced passkeys and device-bound passkeys can meet these requirements.

We assume that the reader has a high-level understanding of the FIDO protocols; if not, please consult https://passkeys.dev/.

2. Why Choose Passkeys?

Passwords are the root cause of over 80% of data breaches, and up to 51% of passwords are reused, making them subject to credential stuffing attacks. FIDO credentials are inherently more secure than passwords due to their design. These credentials are unique cryptographic key pairs scoped to a specific origin (e.g., https://fidoalliance.org/) to prevent discovery by unrelated services. Unlike passwords, FIDO credentials are highly phishing resistant, and the credential—a private key—cannot be stolen from the relying party (RP) servers.

FIDO credentials can be utilized across a variety of use cases—from low to high assurance, balancing user experience, convenience, and security. Authenticators—ranging from hardware security keys to biometric hardware in phones, tablets, and laptops to password managers—enable enterprises to choose the right tools for their unique environments.

While all FIDO credentials are based on cryptographic key pairs, they do not exhibit the same security characteristics, nor are they all suitable for all use cases. For example, hardware security keys may be FIPS-certified devices with device-bound passkeys. RPs can identify these credentials based upon the attestation statements provided at registration. On the other hand, synced passkey implementations synchronize key material through a cloud-based service. The export and management of credentials in a third-party service introduces additional considerations and may not meet every organization's security requirements. The table below summarizes the use cases and properties of device-bound and synced passkeys.

As you read the series you may encounter terminology that is unique to the FIDO ecosystem. Please consult the FIDO Technical Glossary for definitions of these terms.

We expect that most enterprises will have use cases that span more than one of these papers. Wherever organizations find themselves on this journey, they can start using FIDO credentials today to reduce credential reuse, phishing, and credential stuffing.

In the first paper, we examine how organizations can deploy passkeys to their users who are using passwords as their only authentication factor. By deploying passkeys, companies can immediately reduce the risk of phishing or credential stuffing for their staff while using corporate or personal devices for authentication. https://fidoalliance.org/fido-in-the-enterprise/.

There are many organizations that have deployed classic second factor authentication solutions such as SMS OTP, TOTP, and HOTP. In many cases, these deployments were tactical responses to reduce the success of phishing attacks. However, none of these mechanisms are immune to phishing. In the second paper of the series, we examine how passkeys can displace less phishing resistant mechanisms while improving the authentication user experience. https://fidoalliance.org/fido-in-the-enterprise/.

Enterprises in regulated industries may be obligated to utilize higher assurance authentication for some, or all, of their staff. These companies (or other companies with stringent security requirements) may be able to deploy synced passkeys, device-bound passkeys, or both to meet their authentication requirements. The third paper in the series provides guidance on deciding which FIDO-based solution(s) can meet these requirements. https://fidoalliance.org/fido-in-the-enterprise/.

The final paper describes using device-bound passkeys where functional or regulatory requirements demand high assurance authentication. These scenarios use attestation data to securely validate the hardware devices used to generate and manage passkeys. This attestation data can be used to ensure compliance with regulatory and security requirements for regulated enterprises and use cases. https://fidoalliance.org/fido-in-the-enterprise/.

Device-Bound Passkeys vs. Synced Passkeys

Low Assurance
- Device-Bound Passkeys: Sufficient
- Synced Passkeys: Sufficient

Moderate Assurance
- Device-Bound Passkeys: Sufficient
- Synced Passkeys: May Be Sufficient

High Assurance
- Device-Bound Passkeys: May Be Sufficient; dependent upon the authenticator and regulatory/compliance requirements (e.g., FIPS 140)
- Synced Passkeys: Insufficient

Portability
- Device-Bound Passkeys: May be portable between devices and ecosystems (e.g., hardware security keys); limited by available connectivity options (USB, NFC, BLE)
- Synced Passkeys: Portable within the Passkey Provider ecosystem

Shareable / Copyable
- Device-Bound Passkeys: No – device-bound credentials cannot be exported
- Synced Passkeys: May be supported; dependent upon the passkey provider

Account Recovery
- Device-Bound Passkeys: Minimize credential loss scenarios by registering multiple device-bound passkeys; account recovery via enterprise RP defined mechanisms
- Synced Passkeys: Credential recovery via Passkey Provider defined mechanisms to bootstrap a new device; account recovery via enterprise RP defined mechanisms

Cost
- Device-Bound Passkeys: Potential additional cost to obtain and provision hardware security keys if device-bound keys are unavailable in the platform ecosystem
- Synced Passkeys: Built in to existing platforms; possible additional cost for third-party/non-platform passkey providers

3. Acknowledgements

Vittorio Bertocci, Okta
Greg Brown, Axiad
Jerome Becquart, Axiad
Tim Cappalli, Microsoft
Matthew Estes, Amazon Web Services
John Fontana, Yubico, Co-Chair FIDO Enterprise Deployment Working Group
Rew Islam, Dashlane
Sue Koomen, American Express
Jeff Kraemer, Axiad
Karen Larson, Axiad
Sean Miller, RSA
Tom Sheffield, Target Corporation
Johannes Stockmann, Okta
Shane Weeden, IBM
Monty Wiseman, Beyond Identity
Khaled Zaky, Amazon Web Services
FIDO Enterprise Deployment Working Group Members

White Paper: FIDO Attestation: Enhancing Trust, Privacy, and Interoperability in Passwordless Authentication

Editors

Khaled Zaky, Amazon Web Services
Monty Wiseman, Beyond Identity
Sean Miller, RSA Security 
Eric Le Saint, Visa

Abstract

This document intends to provide a comprehensive understanding of attestation's role in enhancing and advancing the digital security landscape, specifically with respect to authentication. It focuses on the core function of attestation: verifying the origin and integrity of user devices and their authentication materials. FIDO credentials are discussed with a focus on how they offer more secure alternatives to traditional password-based systems and how FIDO attestation enhances authentication security for both Relying Parties (RPs) and end-users. In this document, RPs are those entities that provide websites, applications, and online services that require secure user access by confirming the identity of users or other entities. FIDO Alliance's historical journey is presented with practical analogies for understanding FIDO attestation, its enterprise-specific technical solutions, and the privacy aspects involved in the attestation process.

Audience

Targeted for CISOs, security engineers, architects, and identity engineers, this white paper serves as a guide for professionals considering the adoption of FIDO within their enterprise ecosystem. Readers should possess a baseline understanding of FIDO technologies, the meaning of attestation, and have a desire to understand why and how to implement attestation.

1. Introduction

While authentication is widely understood, attestation may be less familiar to many practitioners in the information technology field. Attestation, as understood within the FIDO protocols, confirms a set of properties or characteristics of the authenticator. In the physical world, we can rely on examining an object to inspect its properties and verify its authenticity. In the interconnected digital world, physical inspection is not practical. Devices used for FIDO authentication should be carefully checked before use, especially if their source or contents are uncertain. Certain transactions, especially those related to government, healthcare, or financial institutions, demand higher assurance, and it is vital that the Relying Party (RP) confirms the authenticator's legitimacy in these cases. To ensure that high-assurance transactions are legitimate, RPs can employ attestation to verify the authenticity and properties of the authenticator.

A note on terminology: The terms “key” and “key pair” apply to several types of keys described in this paper. To avoid confusion, the term “passkey” will always be used when referring to a key used to authenticate a user. Other uses of the term “key” will be made specific by either the context or a modifier, such as Attestation Key.

In traditional password-based systems, it may be assumed that users and RPs keep passwords confidential. Because this assumption is not consistently enforced, breaches can occur. Using passkeys instead of passwords is a significant improvement, but some RPs may need more stringent policies to verify the authenticity of the authenticator and its properties.

Unlike passwords, passkeys use securely generated key material to allow access to websites and apps. Users and RPs rely on the authenticator for storage and management of this key material and therefore share the responsibility for secure handling of passkeys. All actors and components of the FIDO solution, including the authenticator, RP, and the passkey provider (when applicable), together ensure a robust security framework. This is in contrast to passwords, where the secure handling of passwords depends primarily on the user's memory, behavior, the RP, and password managers (if used). RPs can leverage attestations to verify that passkeys are securely handled within properly implemented FIDO Certified devices.

Attestation provides RPs with information about the authenticator protecting the user’s passkeys. This provides a means for the RP to enforce security policies for FIDO authentication. In the following sections, we delve deeper into the concept of attestation, its purpose, real-life scenario comparisons, and the problems attestation solves.

1.1 Real-World Analogies for FIDO Attestation

Drawing parallels with everyday security protocols offers significant insights. Both digital and physical environments demand rigorous checks and balances to validate identities and fortify trust. FIDO Attestation reflects the trust and verification processes familiar in the physical world.

To understand the pivotal role of FIDO attestation, consider its application in real-world identification and verification practices. These analogies underscore its integral function and efficacy:

- Identity Document Verification: Just as individuals may produce official documents such as passports or driver's licenses to authenticate identity, the verifier (e.g., an immigration official) wants proof of the document's authenticity and therefore checks for the relevant seals and marks. FIDO attestation provides proof of the authenticity of a user's authenticator, offers statements for examination, and provides cryptographic signatures for verifying the authenticity of the authenticator and the statements.
- Gaining Trust Through Authentication: Think of moments where trust is contingent on proof of identity or authority. For example, accessing a secure facility where a guard authenticates you based on your identity documents, authorizing access to the facility. FIDO attestation fosters trust in digital environments when used to confirm the authenticator provenance and authenticity during online registration.
- Countering Threats and Weaknesses: In real-world scenarios, ID checks exist to counteract impersonation, forgery, and fraud. FIDO attestation identifies the origins of authenticators and assists RPs to detect registrations from devices with known vulnerabilities, thereby enabling them to ensure that users employ only secure devices.

2. Practical Implications and Use-Cases of FIDO Attestation

2.1 From the Perspective of a Relying Party 

Delving deeper into FIDO attestation provides invaluable insights into critical roles fortifying authentication systems:

- Assured Authenticator Security and Compliance: For RPs operating in sensitive sectors, for example, finance or the public domain, there's a heightened need to ascertain that authentication devices are secure and meet specific standards. FIDO attestation helps ensure that authenticators accessing services are not only secure, but also adhere to specific standards and regulations.
- Authenticator Model Specificity and Trust in FIDO Authenticator Models: FIDO attestation is tailored to distinct authenticator models, ensuring that cryptographic proofs during registrations validate said authenticator model authenticity. Beyond general trust in the attestation process, this specificity allows the RP to confirm that the passkey used in the registration request originates from a particular FIDO authenticator model. Such granularity is paramount for RPs where the details of authenticator models are crucial due to regulatory or security reasons.
- Verification Through Attestation Signature: As a user sets up a new account, the onboarding RP can authenticate that the “attestation signature” linked to the freshly generated passkey is indeed from a genuine authenticator model.
- Incident Handling and Response: If a vulnerability is discovered in an authenticator, RPs checking attestations have the ability to discover which authenticators may be affected and require additional authentication factors or registration of a new credential for impacted users.

2.2 From the Perspective of the End-User

Although end users may not be aware of the technical details, FIDO attestation can enhance their online security:

- Enhanced Trust in Services: When using services, particularly in high-assurance sectors such as banking or government portals, users can experience increased confidence. They understand that the RP isn't just authenticating but is also ensuring that authenticators accessing the platform adhere to specific standards.
- Authenticator Compliance: FIDO attestation assures RPs of authenticator compliance and security, giving users the benefit of reliable functionality of their authentication devices paired with desired RP-related services.
- Transparent Registration and Onboarding: The registration process is designed for seamlessness, but includes an additional step when an RP requests attestation of a FIDO authenticator. At this step, users must provide their consent to share the attestation metadata with the RP. This ensures that while backend verifications related to attestations, certification path validations, and authenticator compliance are streamlined, the user is aware of and has approved the process.

3. FIDO Attestation Explained

In this section we describe FIDO attestation and FIDO attestation types.

3.1 What is FIDO Attestation?

Within the FIDO authentication framework, attestation is a process for verifying the authenticity of a user's authenticator during the authentication process. The attestation can be used in conjunction with the FIDO Alliance's Metadata Service [1] to get more information about the authenticator, including the model and certification level. An optional level of attestation, known as enterprise attestation, allows for further verification of specific authenticators; see section 4.5.

Note that the term “attestation” might have different meanings outside of the context of FIDO. This paper discusses attestation only within the scope of the FIDO Alliance.

In FIDO registration, a key step is the creation of a user authentication passkey, which occurs regardless of whether attestation is involved. During this process, the user’s authenticator—such as a smartphone—generates a unique cryptographic key pair for each RP. The private key is securely stored within the authenticator, while the public key is shared with the RP, establishing a secure authentication framework. Additionally, during registration, the authenticator may provide an attestation, offering further assurance about the authenticator’s integrity.

In addition to generating the user's authentication passkey, the FIDO authentication framework includes an optional attestation process. When attestation is requested, the authenticator may provide one (synced passkeys do not currently provide attestations) by using an Attestation Key to sign the AAGUID (Authenticator Attestation Globally Unique ID) along with the passkey public key. This signed evidence establishes a trust anchor that lets the RP validate, via the FIDO Alliance's Metadata Service (MDS) [1] (see section 3.3 for additional information), that the authenticator's properties meet the RP's conditions. If the authenticator cannot provide an attestation, the RP can still authenticate the user with the passkey and may obtain authenticator information (e.g., the AAGUID), but it does not obtain verifiable evidence that the required authenticator properties are present.
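As a concrete illustration (a sketch, not a definitive implementation), a relying party opts in to attestation through the attestation conveyance preference in the WebAuthn creation options; all field values below are placeholders.

```ts
// Sketch: requesting an attestation statement at registration by setting
// the WebAuthn attestation conveyance preference to "direct".
const creationOptions: PublicKeyCredentialCreationOptions = {
  challenge: crypto.getRandomValues(new Uint8Array(32)), // issued by the RP server in practice
  rp: { id: 'example.com', name: 'Example RP' },
  user: {
    id: crypto.getRandomValues(new Uint8Array(16)),
    name: 'alice@example.com',
    displayName: 'Alice',
  },
  pubKeyCredParams: [{ type: 'public-key', alg: -7 }], // -7 = ES256
  attestation: 'direct', // ask the authenticator to attest the new passkey
};

// The returned credential's attestationObject (CBOR-encoded) carries the
// attestation statement and the AAGUID used for metadata lookup.
const cred = await navigator.credentials.create({ publicKey: creationOptions });
```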

This attestation process helps protect against supply chain attacks, such as the introduction of substitute or counterfeit authenticators. By verifying the authenticity of the authenticator, the RP understands the properties of the authenticator and assesses whether it meets the expected security standards, particularly during the registration phase, to ensure the device’s legitimacy.

FIDO attestation is thus a key component of the broader security and privacy objectives of the framework. It minimizes reliance on passwords, fosters strong device authentication based on public-key cryptography, and aims to offer a standardized and interoperable approach to authentication across different platforms and devices.

3.2 Types of FIDO Attestation

There are several types of FIDO attestation which differ in how the attestation statement is signed. Note that none of these attestation types except Enterprise Attestation provide information about the specific authenticator. This is to preserve user privacy.

- Self-attestation: The attestation statement is signed by the user's passkey. This provides integrity protection for the attestation statement and provides no other assurances.
- Basic attestation: The attestation statement is signed by a key created by the authenticator's manufacturer and embedded into the authenticator. This provides integrity protection of the attestation statement and proof of the authenticator's manufacturer. For privacy purposes, this key must be duplicated across many instances of the same authenticator model (the current FIDO Alliance requirement is >100,000 devices); it is not unique to a specific authenticator instance.
- Attestation CA (AttCA) or Anonymization CA (AnonCA): This is similar to basic attestation, except the attestation statement is signed by a TPM Attestation Key. In this case, the TPM, a hardware-based module where cryptographic operations occur and secrets are stored securely without leaving the module, has its Attestation Key's certificate signed by a trusted authority managing the authenticator.
- Enterprise attestation: This is discussed in section 4.5.

It should be noted that the FIDO2 specifications work together with the WebAuthn specification [2]. The type of attestation used is determined by examining fields within the attestation object, which are defined in the WebAuthn specification. The WebAuthn specification further defines a number of different attestation statement formats, for example packed, TPM, and Android key, and supports custom formats if needed.

3.3 Using AAGUID

The Authenticator Attestation GUID, or simply AAGUID, uniquely identifies the authenticator's make (manufacturer) and model. It does not uniquely identify the specific authenticator. The AAGUID is returned by the authenticator when attestation is requested by the RP, and the RP may use it to determine whether the authenticator's make and model meets its policies. Among other uses, the AAGUID is the lookup value within the FIDO Metadata Service (MDS) [1], providing the RP detailed information about the authenticator.

The authenticator’s conveyance of the AAGUID provides no proof of its integrity or authenticity. The RP must trust the specific authenticator to provide truthful information.

This point is important to emphasize:

- The AAGUID without attestation is “informational” only and does not provide any assurance of its authenticity.
- Attestation provides a signature offering a level of assurance (depending on the type of attestation) of the authenticator's identity.

4. Technical Solutions

This section describes the sequence of events and involved components that make up FIDO attestation.

4.1 Authentication vs. Attestation Keys 

The keys and methods FIDO uses for user authentication have been introduced in previous documents, but the keys and methods used for attestation may be less familiar.

- User Authentication: This is the process where the user demonstrates possession of the correct system credentials, utilizing a passkey instead of the traditional password; this is a common application of FIDO technology.
- Attestation: This is the process of the authenticator using a key that is not assigned to a user, but instead assigned to the authenticator, to digitally sign a message providing proof of the message's authenticity. The message involved is called the “attestation statement” and contains information about the authenticator. When the attestation statement is digitally signed by the authenticator's attestation key, the RP can verify the validity of the attestation statement.

In summary:

- A passkey authenticates the user to an RP.
- An attestation key signs an attestation statement to authenticate its origin.

As stated in section 3.3, an RP may obtain the authenticator's make and model by simply checking the authenticator's AAGUID against the Metadata Service. Without the information being digitally signed by a key trusted by the RP, however, the RP has no proof that it is authentic or associated with the authenticator being queried.
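To make the lookup concrete, here is a hedged sketch of finding an authenticator's metadata entry by AAGUID in a parsed copy of the MDS BLOB; fetching the BLOB and verifying its JWT signature and certificate chain are deliberately omitted, and the type shown is a simplification of the real MDS schema.

```ts
// Illustrative sketch: AAGUID lookup against a parsed FIDO MDS BLOB.
interface MetadataEntry {
  aaguid?: string;
  metadataStatement?: { description?: string };
}

function findByAaguid(
  entries: MetadataEntry[],
  aaguid: string,
): MetadataEntry | undefined {
  return entries.find((entry) => entry.aaguid === aaguid);
}

// Example usage with the AAGUID from this paper's appendix sample:
// const entry = findByAaguid(mdsEntries, '7e3f3d30-3557-4442-bdae-139312178b39');
// if (entry) console.log(entry.metadataStatement?.description);
```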

Note: As discussed in section 3.2, there are several attestation types. One of these, “self-attestation”, uses the User Authentication key to sign the attestation statement. This is not technically a contradiction, but a simplification provided to allow integrity protection, not authenticity, of the attestation statement.

4.2 Trust in the Attestation Key – Trust Chain

Fundamental to attestation is the RP's trust in the Attestation Key. The Attestation Key must be generated by a trusted source and protected by the authenticator. The trusted source is typically the authenticator's manufacturer; however, in the case of Attestation CA (AttCA) or Anonymization CA (AnonCA), a trusted agent or Certification Authority (CA) asserts the authenticity of the authenticator. The public part of the Attestation Key is obtained by the RP using a trusted channel, typically the FIDO MDS [1] mentioned previously.

4.3 FIDO Attestation Sequence

Attestation uses a key pair associated with an authenticator, not a user. It is important that all authenticators of the same make and model return the same attestation statement. The format of the attestation is examined later in this section, but it is important to understand that, at a high level, the attestation provides information about the type of authenticator, and it is not specific to a single device.

The following steps (1.a or 1.b then 2.) summarize a FIDO authenticator’s attestation lifecycle:   

1. Authenticator Manufacturing: There are two models for provisioning the Attestation Key: case “a” for roaming authenticators, such as smartphones or USB security keys used across multiple platforms, and case “b” for platform authenticators, which are built-in authentication mechanisms within devices like laptops or smartphones.

Note: This two-model distinction is not architecturally required by the FIDO specifications, but it reflects the practical implementations known today and simplifies the explanation for the purposes of this paper. The descriptions are generalizations; manufacturers may deploy methods different from those described here.

- Roaming Authenticator: The authenticator manufacturer generates an Attestation Keypair (AK) for a specific authenticator model and creates a certificate with the AK's public key. The AK Certificate is commonly put into the MDS, which allows an RP to retrieve the AK Certificate from a trusted source when an AAGUID is provided. The AK Certificate itself is usually signed with the manufacturer's issuer key, creating a verifiable cryptographic chain from the authenticator back to its manufacturer.
- Platform Authenticator: The authenticator is not shipped from its manufacturer with an attestation key that can be used for FIDO attestation. Instead, it relies on persistent keys within the platform authenticator. These keys are crucial cryptographic elements that the attestation service uses to generate a FIDO Attestation Key. The attestation service is trusted by the Relying Party to provide assurance in the platform authenticator's integrity and compliance; it creates an attestation key that is used to sign an attestation object asserting the properties of the authenticator. The RP must trust the attestation service in the same way it trusts a roaming authenticator's manufacturer.

2. User Provisioning with Attestation: During registration (setting up the new account), a new User Credential (a passkey) is created with a unique cryptographic key pair, and the public key is sent to the RP. The RP may optionally require an attestation. Note that the user or the authenticator may ignore the requirement for attestation. If the authenticator possesses an attestation key and the user allows it, the user's public passkey, along with the attestation statement, will be sent to the RP signed with the attestation private key. This allows the RP to verify the attestation statement, which includes the newly created user's public passkey, providing confidence that the user's private passkey originated from a specific authenticator with known properties.

4.4 A General Description of the Attestation Lifecycle

The attestation key generally has an associated attestation certificate, which links to a trusted root certificate of the manufacturer. Once the RP has determined the authenticity of the signed attestation statement, the RP can use the attestation statement along with the MDS to learn more about the authenticator. For example, the RP may want to understand what level of encryption is used and what types of activation secrets are leveraged (e.g., biometrics) with a certain level of accuracy. In order to get details about the authenticator, an AAGUID value identifying the authenticator model is sent to the RP along with the newly created public passkey. Since the AAGUID represents a specific group of authenticator instances, such as a specific product release with a specific characteristic, a specific form factor, or enterprise branding, an RP can use this AAGUID to look up more information about the authenticator from the MDS.

As shown in the diagram, the attestation object, if provided, will indicate the format of the attestation statement, and then include some data the RP can examine. The attestation object includes a statement that typically contains a signature as well as a certificate or similar data providing provenance information for the attestation public key.  Detail of the attestation object is provided in section 9.1 of the Appendix.

RPs should first verify the signature of the attestation statement and, once it is verified, examine the attestation statement. Once the RP has identified the attestation statement's format and type, the RP then reviews the contents and compares them against its policy.

An example attestation response resulting from a direct request to the authenticator by an RP is provided in section 9.2 of the Appendix. The AAGUID provided in the attestation response can be used to obtain additional details about the authenticator from the FIDO Metadata Service.
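For orientation, the following sketch verifies a registration response server-side with the open-source @simplewebauthn/server library (one of several options, not the method prescribed by this paper); the variables holding the client payload and expected challenge are assumed to exist.

```ts
import { verifyRegistrationResponse } from '@simplewebauthn/server';

// registrationResponseFromClient and expectedChallenge are assumed to come
// from the client request and the RP's session store, respectively.
const verification = await verifyRegistrationResponse({
  response: registrationResponseFromClient,
  expectedChallenge,
  expectedOrigin: 'https://example.com',
  expectedRPID: 'example.com',
});

if (verification.verified && verification.registrationInfo) {
  // registrationInfo includes the AAGUID and credential public key; compare
  // the AAGUID against policy (e.g., via the FIDO MDS) before storing.
}
```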

4.5 Enterprise Attestation

By default, FIDO allows an authenticator to provide only product information using the AAGUID and high-level information about its type and capabilities, explicitly prohibiting an authenticator from providing uniquely identifying information.  However, Enterprise attestation removes that limitation, as it binds a unique authenticator key pair to a serial number or equivalent unique identifier.

4.5.1 Use Cases

Enterprises actively manage authenticators, which are essential for securing high-value assets, for various purposes:

- While employees may select their own authenticators, enterprises may limit authenticators per employee and revoke them upon departure or loss, as they oversee the entire process from purchase to collection.
- Enterprises may prioritize manageability and traceability to safeguard resources. Upon a threat incident, forensic investigations may need to trace activities related to a particular authenticator and correlate the authenticator's usage activity patterns in order to discover anomalies or the source of the threat. Tight management enhances their ability to ensure non-repudiation for transactions.
- High-risk users may be assigned dedicated authenticators from the enterprise for access to restricted sensitive information or services. These authenticators are assigned specific PINs and are acquired through trusted supply chains.

Certain enterprise deployments require the use of FIDO authenticators with enterprise attestation in order to identify specific device identities (e.g. device serial numbers). Enterprise Attestation validation must also be supported by the organization’s specific Relying Parties. These practices actively address enterprise-specific needs for improved control over device provisioning and lifecycle management.

4.5.2 Process

4.5.2.1 Provisioning 

Provisioning for enterprise attestation is modified from the process described in section 4.3 in two ways: the attestation statement includes authenticator-unique information, and the specific RPs permitted to receive this unique information are permanently “burned” into the authenticator by the authenticator's manufacturer. The authenticator performs enterprise attestation only for those RPs provisioned to the authenticator. Other RPs may still perform any other type of attestation, which excludes the unique identifier.

Authenticators that have enterprise attestation burned into them must not be sold on the open market and may only be supplied directly from the authenticator's manufacturer to the RP. An RP wanting enterprise-attestation-enabled authenticators orders them directly from the authenticator's manufacturer, providing a list of RP IDs (RPIDs); these specific RPIDs are the ones permanently burned/written to the authenticator.

4.5.2.2 User Registration with Enterprise Attestation

During a FIDO user registration as described in section 4.3, the RP may indicate the need for enterprise attestation. This uniquely associates the user with the specific authenticator by providing proof of the authenticator's unique identifier. During user registration, the authenticator verifies that the requesting RP (using its RPID) is among those in the permanently provisioned list of RPIDs permitted to perform enterprise attestation. If approved, this unique identifier is added to the attestation object and signed by the Attestation Key. The RP should validate the attestation object and, optionally, the certificate chain used to sign it. The RP can then verify, at user registration time, that the unique identifier was indeed purchased by the enterprise and may include that verification in its records.
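On the client side, WebAuthn exposes this through the “enterprise” attestation conveyance preference. The sketch below is illustrative only; the browser or authenticator honors the request solely for RP IDs allowed by the provisioned list or platform policy, and creationOptions stands in for base options like those sketched in section 3.1.

```ts
// Sketch: requesting enterprise attestation during registration. If this
// RP ID is not in the authenticator's provisioned list (or allowed by
// platform policy), a non-identifying attestation is returned instead.
const enterpriseCred = await navigator.credentials.create({
  publicKey: {
    ...creationOptions,        // base creation options (assumed defined)
    attestation: 'enterprise', // request uniquely identifying attestation
  },
});
```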

The implementation used by an RP to authenticate the uniquely identifying information varies by authenticator. Some authenticators use vendor-facilitated methods, where the enterprise provides a list of RP IDs to the manufacturer and those are imprinted into the authenticators. Alternatively, some enterprise-managed platforms, such as an enterprise-managed browser, maintain a policy: rather than imprinting the list of allowed RPs into the authenticator, the platform determines, based on the policy, whether the enterprise attestation is provided to the RP.

5. Privacy Implications and Considerations

While attestation provides a valuable assertion of trust for authenticators, privacy concerns can arise from the information shared during attestation. Some privacy considerations include:

- While the attestation properties described in this paper include a broad set of privacy controls, implementers should consider these capabilities against regional and local privacy policies.
- Attestation enables sharing information, such as the authenticator's make and model, firmware version, or manufacturer details, with the RP. Concerns may arise regarding the potential exposure of sensitive authenticator-specific data and the subsequent tracking or profiling of users based on this information. For this very reason, an attestation batch of at least 100,000 devices is recommended, so the pool from which devices could be identified is not small.
- Non-enterprise attestation prevents the association of multiple passkeys within an authenticator with different RPs, thus safeguarding user privacy. For example, a person using a single authenticator may create a User Authentication passkey (passkey1) for RP 1 (RP1), then create a new User Authentication passkey (passkey2) for RP 2 (RP2). Even though the person is using the same physical authenticator for both RPs and using attestation, RP1 and RP2 cannot determine, even if they collaborate, that passkey1 and passkey2 are from the same authenticator; therefore, they cannot determine that the transactions are from the same person.
- Enterprise attestation adds uniquely identifying information (e.g., a device serial number), allowing an authorized RP to track the use of a specific authenticator across several pre-provisioned RPs within the enterprise. It is expected that users in this environment understand this property and the value it adds to the enterprise.

6. Adoption and Deployment Considerations

RPs can determine the registration requirements for a FIDO authenticator, as reflected in their preference for attestation conveyance. Some RPs may not require attestations to decide if registration is allowed. Other RPs may have security requirements that require an attestation object in order to make risk decisions. Security requirements may be based on characteristics of the authenticator (e.g., whether it requires a PIN) or could be as specific as the model of authenticator(s) allowed. Finally, in more protected environments, some RPs may require additional enterprise attestations to ensure an authenticator is known, controlled, and trusted by the enterprise.

7. Conclusion

FIDO attestation, a component of the FIDO and WebAuthn standards, validates the authenticity of a user's authenticator. This process provides a defense against various threats such as supply chain attacks, counterfeit authenticators, and substitution attacks. For RPs requiring higher authentication assurance, attestation is a FIDO-centric mechanism to obtain that assurance. For RPs that need to ensure the authenticity of specific authenticators, attestation provides assurance that they are dealing with a known and trusted device.

By generating unique key pairs for each RP that a user registers with, FIDO underscores its commitment to user security, eliminating potential cross-service vulnerabilities. The enterprise attestation feature provides organizations with better management of authenticators used by their personnel and is vital to environments that prioritize precise device management.

FIDO attestation brings certain privacy considerations. Disclosing authenticator-specific information, user device fingerprinting, and the potential for user tracking all highlight the importance of a privacy-aware approach. All stakeholders, including RPs, manufacturers, and users, must navigate the path between enhancing security and preserving user privacy.

FIDO attestation is adaptable.  RPs have the discretion to request their desired level of attestation, ensuring a tailored approach suitable for both specialized services and large enterprises.

In summary, FIDO attestation augments online authentication. With a focus on public-key cryptography, unique key pairs, and specific attestation processes, its efficacy is maximized through careful deployment, thorough understanding of its capabilities, and a consistent commitment to user privacy.

8. Acknowledgments

The authors acknowledge the following people (in alphabetic order) for their valuable feedback and comments:

FIDO Enterprise Deployment Working Group Members
Dean H. Saxe, Amazon, Co-Chair Enterprise Deployment Working Group
Jerome Becquart, Axiad IDS, Inc.
Johannes Stockmann, Okta Inc.
Tom De Wasch, OneSpan North America Inc.
Tom Sheffield, Target Corporation
John Fontana, Yubico

9. Appendix

9.1 Attestation Object

Appendix Figure 1 – Attestation object layout, illustrating the included authenticator data (containing attested credential data) and the attestation statement.

9.2 Example Attestation Object

attestationObject: {
  "fmt": "packed",
  "attStmt": {
    "alg": -7,
    "sig": "3045022100da2710ff0b5f5e5d72cda8c1e650f0b696e304942e55138672aa87a5e370a92d02205fd1a48bbda4757aac21252c7064f21130aba083151ab8ae75a26a356b675495",
    "x5c": [
      "3082026f30820213a003020102020404ae6da1300c06082a8648ce3d04030205003077310b3009060355040613025553310b3009060355040813024d413110300e06035504071307426564666f726431193017060355040a1310525341205365637572697479204c4c4331133011060355040b130a4f7065726174696f6e733119301706035504031310525341204649444f20434120526f6f743020170d3232303632333034323132315a180f32303532303632323034323132315a30818c310b3009060355040613025553310b3009060355040813024d413110300e06035504071307426564666f726431193017060355040a1310525341205365637572697479204c4c4331223020060355040b131941757468656e74696361746f72204174746573746174696f6e311f301d06035504031316525341204453313030204649444f20426174636820343059301306072a8648ce3d020106082a8648ce3d0301070342000465f2b3189a6dd2f7df9de784c1c8fd00ae804ac8de7bea042d00563dcd5d7a40948ae59d9dcf8722d8b6025ba98fbb80e6698bbe5003e4db4d80c4a50a3348e4a37330713021060b2b0601040182e51c010104041204107e3f3d3035574442bdae139312178b39301f0603551d23041830168014b851a38b84da69c9fd5b467c1f8e374ac0433419300c0603551d130101ff04023000301d0603551d0e041604142806df6c60b1656a78f97a28e168e5ec8d2937b4300c06082a8648ce3d0403020500034800304502210088122ea59cca8480ed57a0a60a2e203302b4d93713f837be7acc3a2c895c6251022010f67d709ea2dc04ca63aec8d341dc9e562909dcea3f2a4abee2bdfd21dd162d"
    ]
  },
  "authData": {
    "rpIdHash": "f95bc73828ee210f9fd3bbe72d97908013b0a3759e9aea3d0ae318766cd2e1ad",
    "flags": {
      "userPresent": true,
      "reserved1": false,
      "userVerified": true,
      "backupEligibility": false,
      "backupState": false,
      "reserved2": false,
      "attestedCredentialData": true,
      "extensionDataIncluded": false
    },
    "signCount": 4,
    "attestedCredentialData": {
      "aaguid": "7e3f3d30-3557-4442-bdae-139312178b39",
      "credentialId": "c0a3eb62197b77edd0cd1c73bffeb068dcc2595cfdf2e4dc01478bddc9cefcf52282f95bc73828ee210f9fd3bbe72d97908013b0a3759e9aea3d0ae318766cd2e1ad04000000",
      "credentialPublicKey": {
        "kty": "EC",
        "alg": "ECDSA_w_SHA256",
        "crv": "P-256",
        "x": "D3Ki/INLfrmlNogo8d1lK7kBT4Fh3wPyVt/kusDAMKY=",
        "y": "M11KJSPXRiBn1ZtAo1eynxvaUXqipZJYV0AT0gC2czo="
      }
    }
  }
}

Appendix Figure 2 – Example Attestation object

10. References

[1] FIDO Alliance Metadata Service – https://fidoalliance.org/metadata/
[2] WebAuthn Specification – Attestation Section – https://www.w3.org/TR/webauthn-3/#sctn-attestation


Hyperledger Foundation

Hyperledger Web3j in 2024: Updates for Blockchain Developers

Since its introduction in February 2024, maintainers have focused on significantly enhancing Hyperledger Web3j, a crucial tool for Java and Android developers working within the Ethereum ecosystem. Web3j has been instrumental in making blockchain development accessible to a broader range of developers, particularly those who work in environments where Java is the primary language.



MyData

fairsfair: Data governance and privacy pop up in mobility

In the MyData Matters blog series, MyData members introduce innovative solutions and practical use cases that leverage personal data in line with MyData values. What is fairsfair? Coming fresh into […]

Project VRM

On Intentcasting


The cover page of the Weekend Review section of The Wall Street Journal, July 20, 2012

On July 9, 2012, not long after The Intention Economy came out, I got word from Gary Rosen of The Wall Street Journal that the paper’s publisher, Robert Thomson, loved the book and wanted “an excerpt/adaptation” from the book for the cover story of  the WSJ’s Weekend Review section. The image above is the whole cover of that section, which appeared later that month.

In the article I described a new way to shop:

An “intentcast” goes out to the marketplace, revealing only what’s required to attract offers. No personal information is revealed, except to vendors with whom you already have a trusted relationship.

I also said that this form of shopping—

…can be made possible only by the full empowerment of individuals—that is, by making them both independent of controlling organizations and better able to engage with them. Work toward these goals is going on today, inside a new field called VRM, for vendor relationship management. VRM works on the demand side of the marketplace: for you, the customer, rather than for sellers and third parties on the supply side.

The scenario I described was set ten years out: in 2022, a future now two years in the past. In the meantime, many approaches to intentcasting have come and gone. The ones that have stayed are Craigslist, Facebook Marketplace, Instacart, TaskRabbit, Thumbtack, and a few others. (Thumbtack participated in the early days of ProjectVRM.) We include them in our list of intentcasting services because they model at least some of what we’d like intentcasting to be. What they don’t model is the full empowerment of individuals as independent actors: ones whose intentions can scale across whole markets and many sellers:

Scale gives the customer single ways to deal with many companies. For example, she should be able to change her address or last name with every company she deals with in one move—or to send an intention-to-buy “intentcast” to a whole market.

Should we call the sum of it “i-commerce”? Just a thought.

Back to the Wall Street Journal article. It is clear to me now that The Customer as a God would have been a much better title for my book than The Intention Economy, which needs explaining and sounds too much like The Attention Economy, which was the title of the book that came out ten years earlier. (I’ve met people who have read that one and thought it was mine—or worse, called my book “The Attention Economy” and sent readers to the wrong one.)

Of course, calling customers gods is hyperbole: exaggeration for effect.  VRM has always been about customers coming to companies as equals. The “revolution in personal empowerment” in the subhead of “The Customer as a God” is about equality, not supremacy. For more on that, see the eleven posts before this one that mention the R-button:

That symbol (or pair of symbols) is about two parties who attract each other (like two magnets) and engage as equals. It’s a symbol that only makes full sense in open markets where free customers prove more valuable than captive ones. Not markets where customers are mere “targets” to “acquire,” “capture,” “manage,” “control” or “lock in” as if they were slaves or cattle.

The stage of Internet growth called Web 2.0 was all about those forms of capture, control, and coerced dependency. We’re still in it. (What’s being called Web3, while “decentralized” (note: not distributed), is still based on tokens and blockchain.) Investment in customer independence rounds to nil.

And that’s probably the biggest reason intentcasting as we imagined it in the first place has not taken off. It is very hard, inside industrial-age business norms (which we still have), to see customers as equals, or as human beings who should be equipped to lead in the dance between buyers and sellers, or demand and supply, in truly open marketplaces. It’s still easier to see us as mere consumers (which Jerry Michalski calls “gullets with wallets and eyeballs”).

So, where is there hope?

How about AI? It’s at the late end of its craze stage, but still here to stay, and hot as ever:

Can AI provide the “revolution in personal empowerment” we’ve been looking for here since 2006? Can it prove our thesis—that free customers are more valuable than captive ones—to themselves and to the marketplace?

Only if it’s personal.

If it is, then the market is a greenfield.

Some of us here are working at putting AI on both sides of intentcasting ceremonies. If you have, or know about, one or more of those approaches (or any intentcasting approaches), please share what you know, or what you’ve got, in the comments below. And come to VRM Day on October 28. I’ll be putting up the invite for that shortly.


Wednesday, 28. August 2024

FIDO Alliance

Webinar: Misconceptions about passkeys


In the years since passkeys were first announced, a lot has changed in their availability to consumers, nomenclature across platforms, and even implementation requirements. However, one thing that has yet to change is the need for more awareness on what passkeys are, how they work, and their benefits.

In this webinar we debunk common misconceptions associated with passkeys, which we’ve heard from customers, FIDO members and industry participants, and see pop up across social networks. By doing so, we’re confident we can help drive our industry towards a passwordless world.


Passkeys Webinar: Achieving End-to-End Passwordless


Authentication is a complicated problem with ever-creeping scope. Passkeys provide phishing-resistance at the point of authentication, but you need protection at enrollment and during the authenticated session thereafter, too, to truly fortify the authentication process against evolving threats. 

In this discussion, authentication experts walk through all of the components of a user authentication workflow, highlighting areas of innovation and future steps for securing enrollment, authentication, and sessions.  


Identity At The Center - Podcast

We have a bonus Sponsor Spotlight episode of the Identity at


We have a bonus Sponsor Spotlight episode of the Identity at the Center podcast for you this week sponsored by Semperis.

Jim McDonald hosts Eric Woodruff, Senior Security Researcher at Semperis, to discuss the company's approach to identity security. They delve into Semperis' tools like Purple Knight and Forest Druid, focusing on their capabilities in detecting and mitigating Active Directory and Entra ID vulnerabilities. The conversation covers the critical role of prevention and response in ITDR, the impact of ransomware on Enterprise ID infrastructures, and the importance of ensuring a trusted state in Active Directory.

You can watch it on YouTube at https://youtu.be/UwIP0hQmv00?si=BBYvcVbO9cZqET-Q

More at idacpodcast.com

#iam #podcast #idac

Monday, 26. August 2024

Identity At The Center - Podcast

Today marks 300 episodes of the Identity at the Center podca


Today marks 300 episodes of the Identity at the Center podcast. We celebrated by doing what we do best - talking about IAM! We took the opportunity to answer a couple of listener questions including “what is identity at the center” and whether using SSN to validate caller identities is a good idea (it’s not).

You can watch it here: https://www.youtube.com/watch?v=VXxBIG2UI8s

The website: idacpodcast.com

#iam #podcast #idac

Friday, 23. August 2024

We Are Open co-op

An Introduction to Systems Thinking

Part 1: Three Key Principles

This is the first post in a series exploring the fundamentals of Systems Thinking. This is an approach that helps us make sense of complex situations by considering the whole system rather than just its individual parts.

Why is Systems Thinking important? Whether you’re navigating challenges in your professional life or making decisions in your personal life, Systems Thinking offers a series of powerful lenses through which to view the interconnectedness of various elements. By understanding how different parts of a system influence one another, you can identify more effective solutions, anticipate unintended consequences, and make better-informed decisions. This holistic approach is particularly valuable in today’s complex world, where problems are rarely isolated and simple fixes can often lead to new issues!

This series is made up of:

Part 1: Three Key Principles (this post)
Part 2: Understanding Feedback Loops
Part 3: Identifying Leverage Points

In this first post, we’ll explore three key principles that are foundational to a Systems Thinking approach: Drawing a Boundary, Multiple Perspectives, and Holistic Thinking. These principles provide the groundwork for seeing beyond surface-level problems and understanding the deeper, often hidden, relationships and patterns within any system. By applying these principles, you can begin to approach challenges with a mindset that seeks to comprehend the whole rather than merely addressing symptoms, leading to more sustainable and impactful solutions.

1. Drawing a boundary

“The only problems that have simple solutions are simple problems. Complex problems have complex solutions.” — Russell Ackoff

In Systems Thinking, one of the first steps is to draw a boundary around the system you’re examining. This boundary defines what is included within the system and what is considered external. It’s crucial because the way you set this boundary determines the scope of your analysis and influences the insights you gain.

A boundary can be drawn in various ways. It might cut cross-functionally across an organisation, considering multiple departments and their interactions. Alternatively, it could include several organisations, looking at an entire ecosystem rather than a single entity. These choices lead to different conclusions and strategies. Boundaries help manage complexity by allowing you to concentrate on specific parts of the system without getting overwhelmed by the whole.

For example, when we started working with the Digital Credentials Consortium (DCC) based at MIT, we had to make a decision about where to draw the boundary. We could have drawn it very narrowly to focus just on the DCC team, which would have focused mainly on operational issues and decision-making. Our brief, however, was to help with a communications strategy, which meant drawing a wider one. This included organisations who were aware of the DCC, but not directly involved with them.

Draw a boundary too wide, and it becomes difficult to identify ways to create meaningful change. For example, if we included all of Higher Education, it would have become an impossible project to manage. In the end, we identified key stakeholder groups (e.g. registrars) and included them within the boundary. That meant purposely excluding other stakeholders (e.g. vendors) so that we could work on communications that would resonate and create change.

2. Multiple perspectives

“We live in a world of problems, not puzzles. Problems are tangled with uncertainty, ambiguity, and value conflicts.” — Geoffrey Vickers

Every system is viewed differently depending on the perspective of the observer. Recognising and incorporating multiple perspectives is a cornerstone of Systems Thinking. Each stakeholder sees the system from their own unique angle, and by considering these diverse viewpoints, we can uncover hidden dynamics and potential conflicts.

For instance, with the DCC project, we needed to gain the perspectives of different members of the team, the Leadership Council, and key stakeholder groups. Each had different opinions and concerns, which helped us create an effective and holistic communications plan, balancing the needs of all stakeholders and providing a way forward.

3. Holistic thinking

“Insight, I believe, refers to the process by which a shift in perspective reveals a coherent pattern that had previously been obscured from view.” — Mary Catherine Bateson

Holistic thinking involves seeing the system as a whole rather than just focusing on individual parts. It contrasts with reductionist approaches, which break down a system into its components. While reductionism can be useful, it often misses the interactions and relationships that make the system function.

In the DCC communications strategy project, it was important not to take a reductionist approach focused on individual aspects, but rather to think more holistically. We needed an approach that considered how these elements interact: for example, creating a cadence for communications that re-uses content from one platform to another and helps scaffold stakeholder understanding. By considering the Verifiable Credentials ecosystem as an interconnected whole, we were able to think about how a communications strategy could encourage adoption without burning out DCC staff.

Conclusion

In this post, we’ve explored the three foundational pillars of Systems Thinking: Drawing a Boundary, Multiple Perspectives, and Holistic Thinking. These principles offer a powerful framework for tackling complex problems, allowing you to see the bigger picture, understand diverse viewpoints, and recognise the intricate connections that drive systems. By applying these concepts, you can develop more effective strategies, avoid unintended consequences, and create sustainable solutions.

At We Are Open Co-op, we specialise in helping organisations integrate these Systems Thinking principles into their everyday practices. Whether you’re navigating organisational change, developing strategies, or addressing societal challenges, our approach can help you uncover deeper insights and achieve lasting impact.

If you’re interested in how Systems Thinking can transform your organisation, we’d love to collaborate with you. Stay tuned for the next post in this series, where we’ll delve into practical applications of these principles, providing you with tools and examples to put Systems Thinking into action.

References

Ackoff, R.L. (1974). Redesigning the Future: A Systems Approach to Societal Problems. New York: Wiley.
Bateson, M.C. (1994). Peripheral Visions: Learning Along the Way. New York: HarperCollins.
Beer, S. (1972). Brain of the Firm. New York: Herder and Herder.
Friston, K. (2010). The free-energy principle: a unified brain theory? Nature Reviews Neuroscience, 11(2), pp. 127–138.
Vickers, G. (1965). The Art of Judgement: A Study of Policy Making. London: Chapman & Hall.

An Introduction to Systems Thinking was originally published in We Are Open Co-op on Medium, where people are continuing the conversation by highlighting and responding to this story.

Thursday, 22. August 2024

Energy Web

Clean EV Charging at Your Fingertips

AutoGreenCharge mobile app coming soon to app stores near you

We’re thrilled to announce that Energy Web’s AutoGreenCharge mobile app has completed development and will soon be released on the Apple App Store and Google Play Store, enabling electric vehicle owners to charge with 100% renewable electricity.

AutoGreenCharge is a mobile app that provides unprecedented transparency and traceability for EV charging. It works by:

Connecting your EV: Our partnership with Smartcar makes it easy for data to be shared with the AutoGreenCharge app from a wide range of EV models.
Tracking Your Charging Sessions: Every time you charge your car, AutoGreenCharge collects data about your charging session.
Matching Renewable Energy: The app matches your charging session with renewable energy certificates from markets around the world and creates a publicly verifiable proof that your electricity comes from clean sources (see the sketch after this list).
Tracking Your Green Proofs: You can easily view and verify your green proofs for each charging session within the app.
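To illustrate the matching step mentioned above, here is a hypothetical sketch of allocating renewable energy certificates to cover a charging session's energy. The greedy allocation strategy and all names are assumptions for illustration, not AutoGreenCharge's actual matching logic or data model.

from dataclasses import dataclass

@dataclass
class Certificate:
    cert_id: str
    kwh_available: float

def match_session(session_kwh: float, inventory: list[Certificate]) -> dict[str, float]:
    """Allocate certificates to cover a charging session's energy use."""
    allocation: dict[str, float] = {}
    remaining = session_kwh
    for cert in inventory:
        if remaining <= 0:
            break
        take = min(cert.kwh_available, remaining)
        if take > 0:
            allocation[cert.cert_id] = take      # record the claimed kWh
            cert.kwh_available -= take
            remaining -= take
    if remaining > 0:
        raise ValueError("not enough certificates to cover the session")
    return allocation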

In addition to the mobile app, AutoGreenCharge is also available for enterprise customers. The solution can serve as a powerful tool for EV fleets, charge point operators, and automakers looking to decarbonize charging and offer new sustainability solutions to their customers.

We’re excited to bring AutoGreenCharge to EV owners everywhere. Stay tuned for the official launch and get ready to experience the future of electric vehicle charging.

There’s still time to sign up for the beta launch on https://www.autogreencharge.com/.

About Energy Web

Energy Web is a global non-profit organization accelerating the energy transition by developing and deploying open-source decentralized technologies. Our solutions leverage blockchain to enable new market mechanisms and decentralized applications that empower energy companies, grid operators, and customers to take control of their energy futures.

Clean EV Charging at Your Fingertips was originally published in Energy Web on Medium, where people are continuing the conversation by highlighting and responding to this story.


MyData

Call to action for data portability

The EU’s Digital Markets Act requires six big tech “gatekeepers” (Alphabet, Amazon, Apple, ByteDance, Meta, Microsoft) to provide continuous and real-time data portability mechanisms to users. We’re calling on the […]

Wednesday, 21. August 2024

Next Level Supply Chain Podcast with GS1

Revolutionizing Patient Care with Smart Inventory Management with Chris Anderson


Imagine a world where ensuring patient safety and improving healthcare outcomes begins with something as simple as smart inventory management.

In this episode, hosts Reid Jackson and Liz Sertl are joined by Chris Anderson, Director of Technical Program Management at VUEMED. Chris, with nearly a decade of experience in data management and analytics, shares the intricate world of inventory management solutions for hospitals—focusing on implantable medical devices. 

Chris also discusses how a unified system not only enhances the tracking of medical devices but also bolsters patient safety through more effective recall management and improved patient outcomes.

In this episode, you’ll learn:

How unique device identification (UDI) standardization is transforming hospital inventory management, enabling more precise tracking and significantly improving patient safety outcomes.

Insights into the seamless integration of GS1 standards within healthcare supply chains and learn practical approaches to overcoming compliance pitfalls and maximizing data utility.

The emerging trends and legislative updates that are set to impact future supply chain regulations in healthcare, providing a strategic edge to stay ahead in a rapidly evolving landscape.


Connect with GS1 US:

Our website - www.gs1us.org

GS1 US on LinkedIn


Connect with the guests:

Chris Anderson on LinkedIn

Tuesday, 20. August 2024

Digital Identity NZ

‘Simply the Best’ | August Newsletter


Kia ora,

If Tina had been in the Oceania Room at Te Papa Tongarewa last Tuesday, and for the two Digital Trust Hui Taumata that preceded it, like some of you were, I think this would have been her observation. It certainly was for another singer. The buzz in the room was palpable from the very start. Compared to the previous two years, this event saw increased attendance (214 registered, 202 in the room), more international and local speakers, and an increase in panels and roundtable discussions. It’s tangible evidence that Digital Trust is entering the consciousness of more people, accompanied by a desire to ‘get stuff done’. To that end – amongst several other great presentations, panels and exhibits of things currently in progress for those that wish to opt-in – NZTA showcased its NZTA vehicle status app, DINZ member Worldline walked us through its Digital Identity Acceptance Network at POS terminals, and DINZ member Xebo demonstrated the application of its information assurance platform for a document (quote) using verifiable credentials in the exhibition area. We are hugely grateful to the Hui’s partners, speakers and panellists, without whom this fantastic event would not have been possible.

Awareness and education play the most critical role in people’s adoption of these new services. In their absence, misinformation and disinformation fill the vacuum. A case in point is the jaw-dropping statistic from DINZ’s research published last year regarding the extent to which organisations are trusted to protect identity and use personal data responsibly. Government agencies only scored 51%. And yet, OPC’s Approved Information Sharing list is much more limited and evidentially contradicts the surveillance conspiracies that swirl around the internet.  

Similarly with the opt-in Digital Identity Services Trust Framework, where it’s not widely known that it’s primarily targeted at service providers to help keep their clients’ (you and me) information private and secure from fraudsters by introducing best practices, such as adopting protected reusable verifiable credentials that you decide who gets to see – so you are not forced to hand over documents to all and sundry for copying, which carry the risk of targeted data theft.    

In my closing remarks at the Hui, I asked each attendee to consider what single action they can undertake right now to improve Digital Trust in Aotearoa. And I’m asking again here. There is a private-sector-initiated awareness and education module all set to go. Will your organisation step up to its corporate social responsibility in this domain and help sponsor it? Reach out to me if you’re interested to know more.  

Lastly, please take some time to listen to the first of DINZ’s new podcast series, Digital Identity in Focus, here. And if you’re interested in collaborating on a brief submission on the CPD Bill, please contact me here.

Ngā mihi Colin Wallis
Executive Director, Digital Identity NZ

Read full news here: ‘Simply the Best’ | August Newsletter

SUBSCRIBE FOR MORE

The post ‘Simply the Best’ | August Newsletter appeared first on Digital Identity New Zealand.


Human Colossus Foundation

Human Colossus Foundation Announces the Publication of "Decentralised Semantics: A Semantic Engine User Perspective"


The Human Colossus Foundation is excited to announce the publication of a groundbreaking new paper titled "Decentralised Semantics: A Semantic Engine User Perspective," authored by Carly M. Huitema, Paul Knowles, Philippe Page, and A. Michelle Edwards. This paper marks a significant advancement in how researchers search for information through advanced semantic data management. The Semantic Engine, developed in the agri-food sector, leverages Overlays Capture Architecture (OCA) as a basis for semantic harmonisation and information discovery.

Citation

Huitema, C.M., Knowles, P., Page, P. and Edwards, A.M. (2024)
Decentralised Semantics: A Semantic Engine User Perspective.
Data Science Journal, 23: 42, pp. 1–5.
DOI: https://doi.org/10.5334/dsj-2024-042

Addressing the Challenges of FAIR Data Implementation

The paper addresses a critical issue in implementing the Findable, Accessible, Interoperable, and Reusable (FAIR) data principles. While many research groups strive to make their data FAIR, they often encounter challenges documenting the context in which data was collected, processed, and analysed. This lack of machine-actionable, contextual metadata frequently renders data less reusable and visible outside the immediate research team.

To overcome these challenges, the authors present the first version of the Semantic Engine, a tool designed to facilitate the creation of decentralised, machine-actionable metadata schemas. The tool is particularly handy when data is collected across multiple projects and institutions, as is the case with Agri-Food Data Canada.

Leveraging Overlays Capture Architecture (OCA)

The Semantic Engine is built upon the Overlays Capture Architecture (OCA), a flexible and extensible standard hosted by the Human Colossus Foundation. OCA supports decentralised collaboration and reproducibility by allowing multiple contributors to work on different aspects of a data schema without compromising the integrity of the core data structure. This approach is particularly beneficial in the agri-food sector, where data heterogeneity and decentralised research efforts are expected.

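As a rough illustration of this layered design (a simplified sketch, not the full OCA specification), a capture base defines the core attributes while each overlay references that base by digest, so contributors can add context without altering the core structure. The attribute names and the shortened digest below are illustrative assumptions.

import hashlib, json

def digest(obj: dict) -> str:
    """Stand-in for OCA's self-addressing identifiers (SAIDs)."""
    return hashlib.sha256(json.dumps(obj, sort_keys=True).encode()).hexdigest()[:16]

# Core schema: the stable structure every contributor agrees on.
capture_base = {
    "type": "spec/capture_base/1.0",
    "attributes": {"field_id": "Text", "yield_kg": "Numeric"},
}

# A contributor-supplied overlay: bound to the base by digest, so it can be
# added or replaced without touching the capture base itself.
label_overlay = {
    "type": "spec/overlays/label/1.0",
    "capture_base": digest(capture_base),
    "language": "en",
    "attribute_labels": {"field_id": "Field ID", "yield_kg": "Yield (kg)"},
}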

Applications and Future Implications

The Semantic Engine, freely accessible at semanticengine.org, allows researchers to create, edit, and manage OCA-based schemas. It has been thoroughly tested by researchers at the University of Guelph and is designed to be user-friendly for the broader research community.

The potential applications of OCA and the Semantic Engine extend beyond the agri-food sector. The paper highlights ongoing projects in Canada and Switzerland and the EU Horizon project 'NextGen,' which uses OCA to harmonise semantic data in cardiovascular personalized medicine.

The release of the "Decentralised Semantics: A Semantic Engine User Perspective" paper represents a significant step forward in making research data more FAIR and usable. By leveraging the Semantic Engine and OCA, researchers can ensure that their data is well-documented, reproducible, and accessible to a broader audience. The Human Colossus Foundation is proud to support this critical work and looks forward to its continued impact on the research community.

You can access the full paper here and explore the Semantic Engine at semanticengine.org for more information.

Monday, 19. August 2024

Trust over IP

ToIP Welcomes GLEIF to our Steering Committee


GLEIF is pleased to have broadened its engagement and participation in Trust Over IP Foundation (ToIP) by becoming a member of the ToIP Steering Committee in March 2024, recognizing the importance of well-functioning governance to the ongoing success of the foundation. GLEIF has been a member of ToIP, as a Founding Contributor member, since May 2020.

With the verifiable Legal Entity Identifier (vLEI), GLEIF has pioneered a new form of digitized organizational identity to meet the global need for automated identification, authentication and verification of legal entities across a range of industries. By creating the vLEI, GLEIF is now answering this urgent and unmet need, pioneering a multi-stakeholder effort to create a new global ecosystem for organizational digital identity.

The verifiable Legal Entity Identifier (vLEI) concept is simple: it is the secure digital counterpart of a conventional Legal Entity Identifier (LEI). In other words, it is a digitally trustworthy version of the 20-digit LEI code which can be verified automatically, without the need for human intervention. The vLEI concept is also very much in line with ToIP Technical and Governance Frameworks, as detailed below.

The vLEI Trust Chain demonstrates the ability to chain the issuance of vLEI credentials and provides the foundation for the automated verification of vLEIs back to GLEIF, enabling cryptographic verification of an organization’s identity back to its validated LEI.

vLEIs go further though in being able to cryptographically tie persons to organizations in the roles in which the persons are representing or engaging with these organizations. vLEI Role Credentials combine three concepts – (1) the organization’s identity, represented by the LEI, (2) a person’s identity and (3) the role that the person plays for the organization. 
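As a purely illustrative aid, the sketch below shows the chain-walking idea: each credential's issuer must itself hold a credential, all the way back to the root of trust. The types and names are hypothetical; actual vLEI verification rests on KERI/ACDC cryptographic primitives rather than this shortcut.

from dataclasses import dataclass
from typing import Optional

@dataclass
class Credential:
    holder: str                      # the person or legal entity credentialed
    issuer: str                      # who issued this credential
    parent: Optional["Credential"]   # credential that authorises the issuer

GLEIF_ROOT = "GLEIF"

def chains_to_root(cred: Credential) -> bool:
    """Follow issuer links until the trusted root, or a break, is reached."""
    while cred is not None:
        if cred.issuer == GLEIF_ROOT:
            return True                          # reached the root of trust
        if cred.parent is None or cred.parent.holder != cred.issuer:
            return False                         # issuer is not itself credentialed
        cred = cred.parent
    return False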

GLEIF works to advance digital trust standards in the neutral ToIP forum through participation in the Ecosystem Foundry Working Group, the Issuer Requirements Task Force of the Governance Stack Working Group and as a co-chair of both the ACDC/KERI Task Force and Technical Stack Working Group. It is here that the technical specifications of the KERI Suite have been drafted and have begun the process of approval to become published ToIP standards. The KERI Suite of specifications is made up of three documents: the Key Event Receipt Infrastructure (KERI) specification, the Authentic Chained Data Containers (ACDC) specification and the Composable Event Streaming Representation (CESR) specification.

GLEIF also contributed to the development of the ToIP Ecosystem Governance Metamodel and companion guide. The verifiable LEI (vLEI) Ecosystem Governance Framework is based on the ToIP Governance Metamodel.

The post ToIP Welcomes GLEIF to our Steering Committee appeared first on Trust Over IP.


Identity At The Center - Podcast

It’s time for another new episode of The Identity at the Cen


It’s time for another new episode of The Identity at the Center Podcast! We talked with Microsoft Product Manager Merill Fernando about the current state and future plans for Entra ID and the importance of DevOps and governance in identity management.

Watch it here: https://www.youtube.com/watch?v=szPgsyQUpQU

More info: idacpodcast.com

#iam #podcast #idac

Friday, 16. August 2024

We Are Open co-op

Demystifying User Research

A step-by-step guide from WAO

Image CC BY-NC Visual Thinkery for WAO

In a recent post we outlined the early stages of our user research and evaluation project with Jobs for the Future (JFF) and the International Rescue Committee (IRC). Building on that, we also shared a post on the principles that drive our approach to user research.

Now, we offer a comprehensive guide for those who are embarking on user research for the first time, or those looking to refine their current practices.*

1. Define the Scope 🔍

Every successful project starts with a clear definition of its scope. This involves establishing the “Who, What, When, Where, Why, and How” before taking any further steps. By doing so, we create a shared understanding with our clients, which is vital for ensuring that everyone is on the same page. This clarity enables us to guide the project effectively, making informed decisions at every stage. Without a well-defined scope, projects can easily lose focus, leading to wasted time and resources.

2. Identify Stakeholder Groups 👫

Identifying the relevant stakeholders is the next crucial step. Stakeholders are those who have an interest in the outcome of the project or who may be affected by it. For our JFF/IRC project, we concentrated on engaging employers, IRC staff, and, where possible, IRC clients. Each group brought unique insights that enriched our understanding of the Job Readiness Credential’s effectiveness. Engaging a broad spectrum of stakeholders ensures that the research reflects a wide range of experiences and perspectives, making the findings more robust and actionable.

3. Create Surveys (Quantitative Data) 📊

Once stakeholder groups are identified, we design surveys tailored to each group to gather quantitative data. This data is essential for identifying trends and patterns that might not be immediately obvious. For example, in our work on this project, we developed a survey for employers that asked them to rate various aspects of the Job Readiness Credential, including its design and usability. The data collected provided a solid foundation for further analysis, allowing us to make evidence-based recommendations. Surveys are a powerful tool for collecting data at scale, providing a broad overview that can guide more detailed exploration.

4. Develop a User Research Guide (Qualitative Data) 📝

Quantitative data provides a broad picture, but to gain a deeper understanding, we complement it with qualitative research. This involves developing detailed guides for our interviews, which help us stay focused while remaining open to unexpected insights. The qualitative data gathered through interviews adds depth and context to our findings. For instance, while a survey might tell us that a particular feature is unpopular, an interview can reveal the underlying reasons why. By combining quantitative and qualitative data, we create a more comprehensive picture of the user experience.

5. Create Transcripts 🎤

Conducting interviews is only the beginning. To extract meaningful insights, we transcribe the conversations using tools like Sonix.ai. These transcripts are then reviewed and edited to ensure clarity and accuracy. This step is crucial because it allows us to focus on the most relevant insights without losing any nuance. By thoroughly reviewing the transcripts, we ensure that the final analysis accurately reflects the participants’ perspectives. This meticulous approach to data processing is what allows us to provide our clients with reliable and actionable insights.

6. Apply Snowball Sampling ❄️

In some cases, the initial set of stakeholders might not be sufficient to capture the full range of perspectives. This is where snowball sampling comes in. By asking participants to recommend others who might have relevant insights, we can expand our research to include a wider range of voices. For example, in the JFF/IRC project, we explored the possibility of engaging a representative from Indeed, the HRTech platform. This approach allows us to gather additional perspectives that might otherwise be overlooked, ensuring that our research is as comprehensive as possible.

7. Use AI to Gain Initial Insights 🤖

With the increasing availability of AI tools, we’ve integrated them into our process to generate early insights from the data we collect. Tools like GPT-4o and Claude help us quickly identify key themes and patterns in the data. These AI-generated summaries provide a useful starting point, allowing us to focus our analysis on the most promising areas. However, AI insights are not the final word; we carefully review and refine them, combining them with our own expertise to ensure a balanced and nuanced analysis. This approach allows us to work more efficiently without sacrificing quality.
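For illustration only, a first-pass theme extraction along these lines might look like the sketch below. The model name and prompt wording are assumptions, and the output is a starting point for human review rather than a finished analysis.

from openai import OpenAI

client = OpenAI()  # reads OPENAI_API_KEY from the environment

def draft_themes(transcript: str) -> str:
    """Ask the model for first-pass themes; a researcher refines these."""
    response = client.chat.completions.create(
        model="gpt-4o",
        messages=[
            {"role": "system",
             "content": "You are a research assistant. List the key themes "
                        "in this interview transcript, with one supporting "
                        "quote per theme."},
            {"role": "user", "content": transcript},
        ],
    )
    return response.choices[0].message.content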

8. Synthesise Findings 📚

After collecting and analysing both quantitative and qualitative data, we synthesise the findings into a comprehensive report. This report brings together the survey results, interview insights, AI-generated perspectives, and our own analysis. The goal is to provide our clients with a clear and actionable summary of our findings. We often create both a visual summary and a more detailed report. The visual summary is designed for quick reference, while the detailed report offers a deeper dive into the data, providing clients with the insights they need to make informed decisions. By presenting our findings in this way, we ensure that our research is not only informative but also practical and easy to use.

Conclusion

At We Are Open Co-op, user research is all about helping people make better decisions with insights they can trust. Our experience with the Job Readiness Credential project shows how dedicated we are to this work.

We’re here to support you in your own research efforts, whether you’re just starting out or looking to refine your approach. As our work progresses, we’ll continue to share insights that can help you make the most of your user research journey.

* Note that user research and user testing can get very involved and scientific. Going into more detail and depth is definitely important if you are doing things at scale. This post is aimed at encouraging those who may not have in-house capacity to get started for the first time!

Demystifying User Research was originally published in We Are Open Co-op on Medium, where people are continuing the conversation by highlighting and responding to this story.

Wednesday, 14. August 2024

FIDO Alliance

Authenticate Update: 2024 Agenda Released


Carlsbad, Calif, August 14, 2024 – The FIDO Alliance has announced its agenda today for Authenticate 2024, held October 14-16, 2024, at the Omni La Costa Resort and Spa in Carlsbad, California.

Now in its fifth year, Authenticate is the only industry conference dedicated to all aspects of user authentication, and has become a ‘must attend’ cybersecurity event. This year’s event includes over 100 sessions and 125 speakers from across the globe, offering the latest innovations, expertise, and essential discussions for the digital identity industry, with an emphasis on passwordless authentication using passkeys.

Check out the Authenticate 2024 Agenda and register at https://authenticatecon.com/event/authenticate-2024-conference/.

Authenticate is perfect for CISOs, security strategists, enterprise architects, UX leaders, and product and business executives at any stage of their passwordless journey. Attendees will dive into practical content on authentication and identity security. The topics explored include FIDO technology basics, achieving business results, best practices for implementation in various use cases, UX factors, and case studies from the real world — all hosted in a resort environment that fosters collaboration, networking, and community building.

The 2024 keynote speakers have extensive experience implementing passwordless solutions for workforces and consumers and represent renowned organizations such as Amazon, FIDO Alliance, Google, Microsoft, Sony, Visa, and Yubico. The conference offers four stages with dedicated content tracks tailored to match attendees’ levels of expertise, interests, and implementation stages. Additionally, attendees will be able to get to know FIDO solution providers and join networking events to connect with peers and industry experts.

The Authenticate 2024 agenda features the following 11 content-rich tracks:

Business Case and ROI for Passkeys
Technical Fundamentals and Features of Passkeys
IAM Fundamentals
UX Fundamentals of Passkeys
Identity Verification Fundamentals
Passkeys for Consumers
Passkeys in the Enterprise
Passkeys for Government Use Cases and Policy Making
Passkeys for Payments
The Passwordless Vision and the Future of Passkeys
Complementary Technologies and Standards

Sponsoring Authenticate 2024

Authenticate 2024 is accepting sponsorship applications for companies to showcase their solutions to key decision-makers and connect with potential customers. To learn more about the available on-site and virtual sponsorship options for the 2024 event, visit the Authenticate Sponsors page here. Due to the limited opportunities remaining, interested parties are encouraged to reach out to the Authenticate team soon at authenticate@fidoalliance.org.

About Authenticate

Authenticate 2024 is the leading conference dedicated to all aspects of user authentication, with a focus on FIDO standards. Celebrating its 5th year, the event will take place October 14-16, 2024 at the Omni La Costa Resort and Spa, offering both in-person and virtual attendance options. The conference gathers global leaders working to advance stronger, phishing-resistant authentication, and provides the latest educational content, technical insights, tools, and deployment best practices.

Authenticate 2024 is hosted by the FIDO Alliance, the cross-industry consortium that provides standards, certifications, and market adoption programs to accelerate the utilization of simpler, stronger authentication innovations like passkeys. The signature sponsors for the 2024 Authenticate conference include industry leaders Cisco, Google, Microsoft, and Yubico. 

Visit the Authenticate 2024 website to register now and use the early bird discount (through September 9, 2024). Follow @AuthenticateCon on X for the latest updates. 

Authenticate Contact

authenticate@fidoalliance.org

PR Contact

press@fidoalliance.org


Me2B Alliance

Introducing Our Newest ISL Advisor: Dr. Liad Wagman


We are pleased to announce that Liad Wagman has joined Internet Safety Labs as our newest Advisor. Liad is currently serving as the Dean and Professor of Economics at the Rensselaer Polytechnic Institute’s Lally School of Management.   

Liad brings a wealth of experience from his previous tenure at the Illinois Institute of Technology, where he served as the Dean and Professor of Economics and a key figure in spearheading innovative STEM and lifelong learning programs at the Stuart School of Business.  

Now at Rensselaer, Liad continues to influence the academic and business landscapes, emphasizing the importance of ethical practices within economics and technology. His decision to join ISL as an Advisor is driven by a shared commitment to enhancing product safety within the tech industry.  

“Solutions that align incentives for ethical behavior can benefit society by enabling stakeholders to make more informed decisions, reducing uncertainty, and fostering trust,” Wagman commented, highlighting his vision for his role at ISL.  

At ISL, we are excited about the perspectives and insights Liad will bring to our mission. His extensive background and forward-thinking approach will be invaluable as we continue our work to make the internet a safer and more transparent space.  We extend our deepest gratitude to Liad and all our advisors, whose expertise helps propel our mission forward.  

Please join us in warmly welcoming Liad Wagman to Internet Safety Labs!  

The post Introducing Our Newest ISL Advisor: Dr. Liad Wagman appeared first on Internet Safety Labs.


Human Colossus Foundation

Developing Sustainable Approaches to Shaping a Secure and Inclusive European Health Data Space


Disentis, July 28 to August 9

Nestled amidst the majestic Swiss Alps and the picturesque Disentis Monastery, the 2024 Summer Academy of the German Studienstiftung des deutschen Volkes and Max Weber Programm was an inspiring setting for over 70 passionate students. Among them, six vibrant working groups explored pressing societal matters, ranging from sustainability to digital health. One such group took on the challenge of envisioning a secure and inclusive European Health Data Space (EHDS).

A vision for the future of health data.

With the recent legislative work of the European Parliament and Council, the European Union is taking a visionary step towards establishing the EHDS. Beyond its benefits for patients and research, the EHDS has the potential to leverage digital technologies to significantly enhance the resilience and long-term sustainability of Europe's universal healthcare systems, provide a unique economic advantage, and set global standards in privacy, individual protection, and data governance.

For a European health data space that can be adopted by everyone (including patients!), individuals must rely on a system that embeds information security by design. A working group approached the question by leveraging the participants' diverse perspectives as stakeholders in a European Health Data Space.

Under the leadership of Philippe Page from the Human Colossus Foundation's Research Council, an international team of eleven students from diverse disciplines embarked on the mission to address key aspects of the EHDS. Their goal was to create a safe, accessible, and economically viable solution that would benefit patients and researchers alike.

Three guiding questions anchored their deliberations:

How might EHDS revolutionize medical research pathways via expanded data accessibility?

What measures could ensure the EHDS contributes to the EU economy without compromising health data privacy from commercial exploitation?

Which components constitute a secure, scalable infrastructure that meets the EHDS expectations and security demands? Would such an implementation prove sustainable?

The working group organized itself into three distinct focus groups, each addressing specific themes related to the broader topic. These subgroups operated autonomously throughout the sessions, diving into their subjects. Daily, the working group would collaborate in a collective session to share insights on primary/secondary data usage, discuss findings, and harmonize perspectives to manage risks in commercial usage. Through these collaborative efforts, the participants crafted a first draft of a position paper encompassing essential questions about the future development of the EHDS.

In conclusion, the Human Colossus Foundation thanks the organisers for creating space for bringing new ideas forward in a manner that respects everyone’s perspective. The vision initiated during the retreat in Disentis Monastery is just the beginning. With plans to reconvene in 2025, the group aims to build upon its foundational ideas: creating a safer, more inclusive European Health Data Space that sets global benchmarks in privacy, individual protection, data use in research, and data governance.

Subscribe to our newsletter

Tuesday, 13. August 2024

Ceramic Network

OrbisDB is a Practical Upgrade for Databases on Ceramic

Databases on Ceramic

OrbisDB has emerged as a premier database solution for the Ceramic Network. Building on the foundation laid by ComposeDB, OrbisDB brings significant advancements in functionality, performance, and user experience. This blog post will elaborate on the connection between Ceramic and OrbisDB, highlight OrbisDB's new features, and showcase its value to developers.

ComposeDB: The Original Building Block

ComposeDB was the first database built on Ceramic and has become an integral technology for many decentralized applications, such as Passport.xyz, Zuzalu City, CharmVerse, and Lateral DeSci.

ComposeDB has been instrumental in dapp development on Ceramic because it introduces a robust, scalable, and user-friendly approach to data management. It supports structured data models, advanced queries, and the integration of decentralized identities, all while leveraging Ceramic's fast performance and high transaction capacity.

OrbisDB: A Practical Evolution

3Box Labs designed Ceramic as an open network upon which an ecosystem of data-handling solutions could emerge. We launched ComposeDB in 2023 as the first database service on the network.

While ComposeDB represented the first database service offered on Ceramic and introduced many advancements for interacting with Ceramic, the need for simple onboarding, hosted nodes, SQL, and easy integrations with other services led the Orbis team to create OrbisDB.

Built initially as the Ceramic-based infrastructure for Orbis Social, OrbisDB evolved from a template implementation used by leading crypto projects such as Iggy Social, CoinEasy, Autonolas, and Gitcoin Schelling Point into a slick set of interface services for data on Ceramic, including a UI for no-code deployment, integrated hosting, support for additional languages, and a blue sea of possibilities made possible by plugins.

Key Upgrades with OrbisDB

Simplified Ceramic Developer Experience:

Rapid Ceramic Onboarding: OrbisDB offers a web app and SDK for storing and managing datasets on Ceramic, whether no-code or via CLI.
Hosted nodes: OrbisDB makes Ceramic DevOps easy with an in-built hosted node service.
Accelerated Customization: Extend the functionality of your database with plugins, or build plugins for other developers.

Database Language Choice:

SQL Queries: Using PostgreSQL as its indexing database, OrbisDB offers scalable performance and the benefits of traditional scaling methods.
GraphQL: GraphQL support (already available on ComposeDB) and vector embeddings are both in development.

Plugin Ecosystem:

Optional and Versatile: Developers can easily add plugins to OrbisDB. These plugins are optional and designed to perform operations beyond the core's scope, providing additional functionality and connections to other blockchain services. Recently released plugins for Dune (link) and Base (link) make data visualization and importing on-chain data from any Base smart contract code-free and straightforward.
Open source: Plugins are open source. Users can build and share plugins with other developers in the ecosystem.
Do anything with plugins: Combine on-chain transactions from Base or other EVMs with verifiable data on Ceramic (e.g., enable mutable and verifiable metadata); provide sybil-resistance and instant reputation scores for all user-generated data using Passport.xyz or Verax attestations; easily token-gate your applications via pre-defined indexing logic; resolve ENS domain names directly from any dataset in one click; and enable a single query from multiple data sources (API, on-chain, Ceramic data, etc.).
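Because OrbisDB indexes stream data into PostgreSQL, indexed models can be queried with plain SQL. The sketch below is hypothetical: the connection string, table, and column names are assumptions for illustration, not OrbisDB's actual schema.

import psycopg2

def recent_posts(dsn: str, limit: int = 10) -> list[tuple]:
    """Fetch the most recent rows from a hypothetical indexed model table."""
    with psycopg2.connect(dsn) as conn:
        with conn.cursor() as cur:
            cur.execute(
                "SELECT stream_id, controller, body "
                "FROM posts ORDER BY indexed_at DESC LIMIT %s",
                (limit,),
            )
            return cur.fetchall()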

Get Early Access to OrbisDB Studio

OrbisDB represents a practical evolution of databases on Ceramic, building upon the foundations laid by ComposeDB and significantly improving experience, languages, and interoperability.

Projects have already started building on OrbisDB in beta, including Index Network, Plurality, and Flashcards, for various use cases, including a blockchain event listener and storing encrypted user data and educational content.

We're excited to work with Orbis to support the future of decentralized data management. OrbisDB Studio, accessible later this year, will offer the developer experience improvements discussed above. Sign up here to get on the waitlist for early access.

Learn more about OrbisDB at useorbis.com


Me2B Alliance

In Defense of Cyber Product Safety for Civilians (or Something)  


It’s cybersecurity season in Las Vegas and I’m inspired to write an overdue post on why I hate the phrase “cyber civil defense”. Actually, I don’t hate the phrase, I disagree with its usage. Being the literal sort I am, I have to of course start with a long look at what the three words seem to mean.  

Cyber: aka technology; though perhaps a more detailed definition would include “software driven” and “internet connected” as necessary attributes.  

Civil: in this context, I think it means “citizens” or just “people”.  

Defense: The catch with this term is that it’s unclear what or who is being defended and by what or whom. For instance, this innocuous three-word phrase could mean any number of things: 

Civilians defending “cyber” [tech] from other civilians.  Civilians defending other people from civilians.  Tech defending civilians from other civilians.  Tech defending civilians from tech.   Tech defending tech from civilians. (ew.) 

I could go on but my head hurts. 

Civilian defense seems to imply a kind of volunteer force to defend people from cyber threats (what kinds of threats?).  

Here’s where the wheels fall off this phrase: what about when the call’s coming from inside the house? Meaning, what about when the technology—as designed and with perfect integrity—is itself harmful to people? I’m not fine with using the phrase in the context or implication of defending people from risks from commercial technology itself because doing so: 

– Reinforces that it’s acceptable for commercial technology (i.e. commercial products) to be a thing that civilians need to protect themselves from, 

– Gaslights people into thinking that it’s somehow their responsibility to protect themselves against commercial technology that is evolving faster than the governance around it, bolstered by staggering amounts of financial resources, and whose risks are admittedly poorly understood by the makers themselves, but with just a little more elbow grease, you, dear user, can maybe be marginally less at risk.  

– Smacks a bit of a military operation. I don’t want to join an army, I just want to have reasonably safe technology products.  

– Also it’s a smidge paternalistic. (I can almost hear the “little lady” in there…) 

The good news is we already have a phrase to describe risks of commercial products on humans. It’s called Product Safety.  

But product safety is an abject failure when it comes to commercial software and software-driven technology. In the US we have a dedicated product safety commission, but its scope hasn’t been updated since 2008, and it was hamstrung by budgetary contractions in the Consolidated Appropriations Act of 2019. Other agencies pick up pieces of product safety in the style of the blind men and the elephant, using their granted powers to maximum effect. The failure, however, is with the lawmakers. We have not updated ideas of “products” and “product safety” to keep pace with the internet age, and citizens pay the price every day.

Sadly, from my research, it usually takes around 50 years after the launch of a new commercial product for US product safety laws to emerge, so we’re depressingly on time. For example, seatbelts became mandatory on January 1, 1968, sixty years after the commercial launch of the Ford Model T.

The EU recognized this gap in 2023 with their updated product safety law. As we in the US still wait for a federal privacy law, perhaps we can leapfrog ahead to a reimagined federal product safety law. Good news: we at ISL have tons of data, know-how, and tools to support this; it doesn’t have to start from scratch. But it would take extraordinary intestinal fortitude on the part of lawmakers to create something that meaningfully throttles the myriad risks technology foists upon us today. It would take precise regulation and a financially backed commitment to enforcement.

I won’t be holding my breath, but we are absolutely here for that moment if and when it comes. Meanwhile, in the likely event the US government continues to ignore product safety for technology, ISL will continue to champion the safety of all tech users through our maturing safety labels and research.  

The post In Defense of Cyber Product Safety for Civilians (or Something)   appeared first on Internet Safety Labs.


We Are Open co-op

Key Questions for User Research Design

Our approach to the nexus of user research and user testing

WAO often starts projects with a bit of user research: asking your stakeholders and audience, users and even friends to give you some insight is always time well spent. After all, we all bring our own biases and experience, and user research can help surface some of those to better serve other people.

While there are, of course, specialised organisations and more or less complex user research projects, user research is something that you and your organisation can do yourselves! Even talking to five people about your project will give you a wealth of ideas and perspectives on how to improve things for your stakeholders.

In this post, we’re going to look at a few important considerations to make as you start the design of your user research. Read on to find out how you can bootstrap your first user research project.

Are you doing user research or user testing?
cc-by-nd Bryan Mathers for WAO

We use the term ‘user research’ when we’re trying to figure out how to serve users better, whereas ‘user testing’ is when you have a prototype and you want to observe how people respond to it. We often do research that sits at the nexus of these two things — we ask questions as well as showing prototypes and ideas. However, rather than simply observing how people use them, which is what you would traditionally do in “user testing”, we talk to people about what they are experiencing. This allows us to both test ideas while also deepening our understanding of our users’ needs.

Who are ‘users’?
cc-by-nd Bryan Mathers for WAO

The word ‘user’ can be confusing in a research context. Sometimes we are, indeed, talking to users of a platform or product, but more often than not, we’re talking to people who are involved in a programme or project, but not necessarily as ‘users’. If you are having trouble figuring out who to talk to, you may want to reframe the term ‘user’ as ‘participant’ and reconceptualise ‘user research’ as simply ‘talking to people’. Having a structured conversation with the people you are trying to serve is how you can create systems, processes and, yes, products that are relevant and valuable.

The goal of this kind of research is to understand what people need and want, and the behaviours they have. This can help make programme, project or product design better. It can also help your communication and marketing.

The first thing you’ll want to do when designing a user research project is to think about the types of people you are designing for and see if you can talk to them. Our Learn with WAO site has some activities to help you think about who these people are. Perhaps start with Stakeholder Mapping and then see if you can apply your learnings in the Persona Mapping activity.

Can you be more inclusive?
cc-by-nd Bryan Mathers for WAO

While you need to be targeted in the specific types of people you talk to, it’s also important to ensure that you have a diverse range of participants for your user research. Being inclusive means not overlooking people or groups that you might not immediately think of as ‘users’. Once you’ve figured out the stakeholders and/or spectrum of personas you want to talk to, make sure you have diversity in your pool.

Diversity and inclusion require intentional action, so if you realise that you do not have a racially, geographically, socio-economically and gender diverse group of participants, you’ll need to do more outreach. It is perfectly acceptable to contact people within your network to ask for help finding diverse participants. You can also use social media and hashtags.

Sometimes, we use what’s called “snowball sampling” to help us find more participants. We simply ask people who have already volunteered to participate if they know someone who might also be willing to participate. This works well but beware: snowball sampling can actually lead to less diverse groups because of the way network effects work. It can also lead to ‘scope creep’, where your project ends up being much larger than initially defined. So watch out for that!

What kind of data should you collect?
cc-by-nd Bryan Mathers for WAO

Effective user research combines both quantitative and qualitative methods. While quantitative data provides measurable insights, qualitative research helps to understand the context and motivations behind user behaviours, leading to deeper insights.

One of the tactics we like is to use surveys first. This allows us to collect quantitative data as well as to expand our recruitment pool for user research interviews. We ask “May we talk to you more about your answers?” And if the survey respondent answers yes, we ask for their contact information.

Once you have people to talk to, you need a set of questions to ask. A user research guide helps you ensure some rigour and can be used to steer interviews. You need to avoid leading questions, i.e. questions that have the answer built in (e.g. “Isn’t this prototype beautiful and easy to use?”). Use open-ended questions and neutral language, and avoid assumptions. It can help to test your questions on friends or colleagues!

Another point to note is that, as you carry out the user research interviews, you’ll learn things about the project and about the needs and desires of participants. You will need to balance asking similar questions (for fairness and reliability), with perhaps slightly different ones as the landscape becomes clearer.

Next steps
cc-by-nd Bryan Mathers for WAO

In the end, user research is all about getting to know what your users need and want from your project, programme, product or service. The best way to get that information is to find a diverse group of people and talk to them in any way you can! Then, you can gather your data and present it in a way that tells a story about what your next steps should be.

Need help? We do this a lot, get in touch!

Key Questions for User Research Design was originally published in We Are Open Co-op on Medium, where people are continuing the conversation by highlighting and responding to this story.

Monday, 12. August 2024

IDunion

Summary of the work of the IDunion research project

To mark the entry into force of the eIDAS Regulation 2.0 in May 2024, the Blockchain Berlin news platform has published a comprehensive review of the IDunion research project’s work over the past three years.

While the original aim of the project was to develop a decentralised identity management system based on blockchain, the project was subsequently refocused on federated data storage using the OpenID4VC protocol family developed by the consortium partners. The integration of these protocols into the eIDAS 2.0 Architecture Reference Framework (ARF) by the EU confirms IDunion’s expertise and its important contribution to digitalisation in Europe.

The article presents various possible use cases for digital credentials, such as the digital student ID card of the TU Berlin or the digital product passport for seamless identification and transparency of product sustainability data.

Click here to read the article


Identity At The Center - Podcast

On this week’s episode of The Identity at the Center podcast

On this week’s episode of The Identity at the Center podcast, we take an entire episode to answer a listener question about IAM RFPs and how to get the maximum value out of that process.

You can watch it here: https://youtu.be/rwn3CTRlPP0

Visit idacpodcast.com for more.

#iam #podcast #idac


Energy Web

CIRPASS-2 Launches to Advance Digital Product Passports

New Initiative to Drive Sustainability, Circular Economy, and Data Interoperability Across Europe

In response to the sustainability and circularity data sharing needs highlighted in the EU Green Deal and the new Circular Economy Action Plan (CEAP), the European Commission adopted the Ecodesign for Sustainable Products Regulation (ESPR) in March 2022. This regulation establishes the Digital Product Passport (DPP) system, designed to provide essential information on a need-to-know basis to support sustainable production and enable the transition to a circular economy. The DPP aims to boost material and energy efficiency, extend product lifetimes, and optimize product design, manufacturing, use, and end-of-life handling.

Beyond facilitating sustainability and circularity-related data sharing among economic operators along value chains, the DPP is intended to create new business opportunities through digital and circular value retention and optimization, such as product-as-a-service, repair, reuse, remanufacturing, and recycling. It also helps consumers make sustainable choices and allows authorities to verify compliance with legal obligations.
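
To make the idea concrete, a Digital Product Passport is essentially a structured data record linked to a physical product. The sketch below shows one plausible shape for such a record; the field names are illustrative guesses for this article, not the ESPR or CIRPASS-2 data model.

```typescript
// Illustrative shape of a Digital Product Passport record.
// All field names are hypothetical, not the ESPR/CIRPASS-2 data model.
interface DigitalProductPassport {
  productId: string; // globally unique, resolvable identifier for the product
  manufacturer: string;
  materials: Array<{ name: string; recycledContentPct: number }>;
  carbonFootprintKgCO2e: number; // e.g. a cradle-to-gate footprint
  repairInstructionsUrl: string;
  endOfLife: { disassemblyGuideUrl: string; recyclable: boolean };
}

// A need-to-know view: a recycler might only be shown material and
// end-of-life data, while a consumer sees repair and footprint data.
const examplePassport: DigitalProductPassport = {
  productId: "https://id.example.com/product/12345", // hypothetical resolvable ID
  manufacturer: "Example Textiles Ltd.",
  materials: [{ name: "cotton", recycledContentPct: 40 }],
  carbonFootprintKgCO2e: 12.5,
  repairInstructionsUrl: "https://example.com/repair",
  endOfLife: { disassemblyGuideUrl: "https://example.com/disassembly", recyclable: true },
};
```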

Currently, there are numerous initiatives for such data sharing systems at the industry sector, platform, or individual company level. However, there is a lack of standardization in data definition, data format, and IT infrastructure, which hinders cross-sectoral interoperability. To address this, the EU Commission initiated the CIRPASS Coordination and Support Action (CSA) in October 2022, with the project set to conclude in March 2024. The focus is on batteries, electronics, and textile sectors. By the project’s end, CIRPASS will propose a clear cross-sectoral definition and description of the DPP system, identify key data for circularity, and define requirements for product identification and data exchange protocols to support further legislative and standardization developments.

To ensure inclusivity, an open call for pilot proposals was widely disseminated, forming the CIRPASS-2 project consortium, which will run from May 2024 until April 2027. Alongside 17 partners, CIRPASS-2 will demonstrate functioning Digital Product Passports in real-world settings through circular pilot deployments and use cases in textiles, electrical and electronic equipment, tires, and construction value chains.

CIRPASS-2 will show that the DPP, as a digital transformation initiative, helps strengthen the Union’s resilience and data sovereignty. It will develop a DPP data space fully aligned with Europe’s ongoing efforts in this field (such as SIMPL and DSSC) and emphasize interoperability based on harmonized standards. The focus on open source and architecture will also foster the EU data economy while ensuring compliance with data regulations. By taking the conceptualized DPP and pilot solutions and deploying them at scale, CIRPASS-2 aims to bridge the gap between digital technology research and market deployment. The project will contribute directly to Digital Europe Programme (DEP) objectives through its results and will provide extensive policy and business recommendations to further these goals.

CIRPASS-2 is an Innovation Action project funded by the European Commission’s Digital Europe Programme, running from May 2024 until April 2027. With 13 lighthouse pilots, the project will demonstrate functioning DPPs in real settings and at scale across four target value chains: textiles, electrical and electronic equipment, tires, and construction materials. Additionally, CIRPASS-2 will create a broad community of DPP stakeholders to facilitate the deployment of DPPs in various product sectors across Europe and beyond.

About Energy Web

Energy Web is a global non-profit organization accelerating the energy transition by developing and deploying open-source decentralized technologies. Our solutions leverage blockchain to enable new market mechanisms and decentralized applications that empower energy companies, grid operators, and customers to take control of their energy futures.

CIRPASS-2 Launches to Advance Digital Product Passports was originally published in Energy Web on Medium, where people are continuing the conversation by highlighting and responding to this story.

Thursday, 08. August 2024

FIDO Alliance

New CISA Guide Calls for Phishing-Resistant Forms of Authentication and Passkeys by Default

Andrew Shikiar, FIDO Alliance Executive Director & CEO

In a significant move to bolster software security, the Cybersecurity and Infrastructure Security Agency (CISA) and the Federal Bureau of Investigation have released new guidance that organizations can use to demand better security from their software vendors.

The Secure by Demand Guide: How Software Customers Can Drive a Secure Technology Ecosystem underscores the pivotal role that software customers play in digital supply chain security. The guide outlines high-priority security requirements from the earliest stages of software development, a principle central to creating “secure by design” products.

Among the items highlighted are phishing-resistant authentication methods, such as passkeys, as a default feature in software products. Announced on Tuesday, August 6, 2024, at Black Hat USA, this new guidance represents a vital step forward in securing the digital supply chain in the United States and worldwide.

Secure by Demand, Secure by Design

This new guidance complements CISA’s recent Secure by Design Guide aimed at technology manufacturers to improve their software product security. By focusing on the procurement aspect in the supply chain, the new guidance advises software buyers to demand modern security features from technology manufacturers, such as phishing-resistant authentication and passkeys. By doing so, customers can drive demand for security as a baseline feature and compel technology manufacturers to adhere to secure design practices.

The guidance also includes an assessment that buyers can use to evaluate software security and to include security requirements in contracts. It encourages a proactive procurement approach, where a buyer can assess a manufacturer’s security capabilities to reduce vulnerabilities and strengthen resilience. The guide establishes best practices for secure software procurement and highlights the product security features that bolster supply chain security and interoperability.

Passkeys Take Center Stage

CISA’s guidance aligns with the recent guidance from the National Institute of Standards and Technology (NIST) in their digital identity guidelines on authentication and lifecycle management. In supplemental guidance, NIST SP 800-63Bsup1, NIST affirmed that synced passkeys meet Authenticator Assurance Level 2 (AAL2) requirements and device-bound passkeys satisfy Authenticator Assurance Level 3 (AAL3). The two guidance documents emphasize the importance of security, including digital identity and authentication best practices, across the digital supply chain.
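
For readers wondering what “passkeys by default” looks like to a developer, the snippet below sketches the browser side of passkey registration using the standard WebAuthn API. It is a minimal illustration rather than anything from the CISA or NIST documents, and the challenge, user ID bytes, and relying-party details are placeholders that would come from your own backend.

```typescript
// Minimal sketch of passkey registration via the standard WebAuthn browser API.
// challengeFromServer and userIdBytes are placeholders supplied by your backend.
async function registerPasskey(challengeFromServer: Uint8Array, userIdBytes: Uint8Array) {
  const credential = await navigator.credentials.create({
    publicKey: {
      challenge: challengeFromServer, // binds the ceremony to a server session
      rp: { name: "Example Service", id: "example.com" }, // hypothetical relying party
      user: { id: userIdBytes, name: "alice@example.com", displayName: "Alice" },
      pubKeyCredParams: [
        { type: "public-key", alg: -7 },   // ES256
        { type: "public-key", alg: -257 }, // RS256
      ],
      authenticatorSelection: {
        residentKey: "required",      // discoverable credential, i.e. a passkey
        userVerification: "required", // require a biometric or PIN check
      },
    },
  });
  return credential; // sent to the server for verification and storage
}
```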

The Secure by Demand guidance empowers IT buyers, who can drive market demand for secure software features, such as passkeys and FIDO authentication. Given that weak or stolen passwords account for 80% of hacking-related breaches and credential phishing has skyrocketed by 967% since 2022, buyers can use the guide’s security assessment to evaluate software security, including passkey capabilities, and improve security risk management in the supply chain. With this guidance, CISA aims to increase awareness and drive market demand for secure software.

Key Recommendations for Software Manufacturers

CISA’s Secure by Demand guide outlines several critical requirements that customers should evaluate when procuring software, and includes questions to assess a software manufacturer’s security capabilities in the following areas:

- Authentication: Manufacturers should support secure, standards-based Single Sign-On (SSO) and implement phishing-resistant multi-factor authentication (MFA) or passkeys — by default, and at no extra cost.
- Eliminating Vulnerabilities: Systematic efforts should be made to address and prevent classes of software defects, such as SQL injection and cross-site scripting vulnerabilities.
- Secure Defaults: Security logs should be provided to customers without additional charges, ensuring transparency and accountability in software security.
- Supply Chain Security: Ensuring the provenance of third-party dependencies via Software Bill of Materials (SBOM) and robust processes for integrating open-source components are vital.
- Vulnerability Disclosure: Transparency and timely reporting of vulnerabilities, including authorization for security testing by the public, is crucial for maintaining trust and improving security outcomes.

A Call to Action for Security Leaders

The message for those manufacturing or procuring software across the software supply chain is clear: passkeys strengthen third-party supply chains and raise security standards in software procurement and development. By integrating passkeys into authentication processes, organizations can strengthen end-to-end digital identity lifecycle management and significantly reduce the risks of phishing and social engineering attacks.

To learn more about CISA’s Secure by Demand guidance, visit https://www.cisa.gov/resources-tools/resources/secure-demand-guide.
Ready to go passwordless? Learn how to implement passkeys or find a passkey deployment partner using the FIDO Certified Directory and FIDO Certified Member Showcase.


Origin Trail

Trend is Your Friend: Knowledge Graphs at the Heart of Gartner’s Impact Radar — Here is How the…

Trend is Your Friend: Knowledge Graphs at the Heart of Gartner’s Impact Radar — Here is How the Decentralized Knowledge Graph (DKG) Enhances Reliable AI

Gartner puts Knowledge Graphs at the epicenter of their 2024 Impact Radar, right next to Generative Artificial Intelligence (GenAI). Here is WHY knowledge graphs (KGs) are so important for Artificial Intelligence (AI) and how the Decentralized Knowledge Graph (DKG) helps power trust at Internet scale.

Knowledge Graphs in support of GenAI reliability

KGs act as intelligent maps for AI: they help it understand connections, explain decisions, and enhance learning through Retrieval-Augmented Generation (RAG), reducing hallucination and bias in AI.

Initially developed by Meta, RAG is a type of AI system that combines two main tasks: retrieving information and generating answers. Think of it like a smart assistant that not only looks up facts but also puts them together in a way that makes sense. By using KGs, RAG systems become smarter and more reliable.

They can understand questions better, provide accurate answers, connect different ideas, search quickly, show their sources, and bring together knowledge from various topics. This helps users find the information they need and helps drive more reliable AI based on the transparent use of knowledge sources.

Decentralized Knowledge Graph (DKG): Trust the Source

“We live in a time of abundant connectivity and alas abundant misinformation. The OriginTrail Decentralized Knowledge Graph (DKG) is an evolving tool for finding the truth in knowledge. In particular, we see knowledge graphs improving the fidelity of artificial intelligence.”

Dr. Bob Metcalfe, Internet pioneer and Ethernet creator

Driving data interconnectivity, interoperability, and integrity, the Decentralized Knowledge Graph (DKG) is an important advance on knowledge graphs (KGs). It addresses the challenges of data ownership, AI hallucinations, and bias with Decentralized Retrieval-Augmented Generation (dRAG), which enhances RAG by organizing external sources beyond a single organization in a DKG for AI models to use — from a single source to networks of sources.

Most importantly, the DKG allows users to go beyond the limitations of the siloed data of a single organization to achieve integrated, decentralized knowledge access from multiple sources, all while preserving the lineage, or provenance, of it all. This then drives the reliability and accuracy of AI well beyond a single source. The DKG also provides a connective layer (or middleware) that allows centrally operated knowledge graphs to interoperate with other information sources.
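
As a rough mental model of how this differs from single-source RAG, the sketch below retrieves candidate assertions from several knowledge sources, keeps their provenance attached, and only then hands them to a generator. Every name here is hypothetical pseudocode for the dRAG pattern, not the OriginTrail SDK.

```typescript
// Hypothetical sketch of a dRAG-style flow: retrieve from many sources,
// keep provenance, then generate. These names are not from any DKG SDK.
interface Assertion {
  content: string;    // the retrieved knowledge itself
  source: string;     // who published it
  provenance: string; // e.g. a verifiable assertion ID or proof reference
}

async function answerWithProvenance(
  question: string,
  retrievers: Array<(q: string) => Promise<Assertion[]>>, // one per knowledge source
  generate: (q: string, context: Assertion[]) => Promise<string>
): Promise<{ answer: string; citations: string[] }> {
  // 1. Retrieve candidate assertions from every source, not just one silo.
  const results = await Promise.all(retrievers.map((retrieve) => retrieve(question)));
  const context = results.flat();

  // 2. Generate an answer grounded in the retrieved assertions.
  const answer = await generate(question, context);

  // 3. Surface provenance so the answer can be checked against its sources.
  return { answer, citations: context.map((a) => `${a.source}: ${a.provenance}`) };
}
```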

Are you building an enterprise-grade AI solution that requires reliability and trust beyond a single source?

Go check out the solutions used by world-class organizations, built on the OriginTrail DKG, and schedule a demo.

Trend is Your Friend: Knowledge Graphs at the Heart of Gartner’s Impact Radar — Here is How the… was originally published in OriginTrail on Medium, where people are continuing the conversation by highlighting and responding to this story.

Wednesday, 07. August 2024

The Engine Room

22 AUG: Join us to launch our new Organizational Strategy

Join us to talk about the ideas and approaches behind our organisational strategy for 2024 - 2025

The post 22 AUG: Join us to launch our new Organizational Strategy appeared first on The Engine Room.


Velocity Network

LERs and Blockchain: Why, Where and How?  

Issuer permissions are the mechanism that Velocity Network introduces to enable relying parties (and wallets) to determine if an issuer is an authoritative source for a particular credential. After requesting the ability to issue on the Network, the request is reviewed by Velocity Network to ensure that the issuing service parameters are within the remit of the organization’s business activities.

Next Level Supply Chain Podcast with GS1

Technology and Modern Food Safety with Darin Detwiler

Food safety is intricate, and for some, it can be emotional. But as food safety evolves, technology and innovation are more crucial now than ever.

Reid Jackson and Darin Detwiler, Founder and CEO of Detwiler Consulting Group, discuss the intricacies of food safety. Darin, an expert with over 31 years of experience, opens up about his journey, which began with the tragic loss of his son to E. coli and evolved into a lifelong commitment to improving food safety standards.

They discuss the roles that new technologies and procedures play in fortifying food safety systems, the emotional and operational challenges faced by professionals in the field, and the essential human elements—like courage—that drive meaningful change. They also address the complexities of balancing short-term and long-term goals in food safety investments, emphasizing the importance of preparing and integrating cutting-edge solutions amid ongoing production challenges.

In this episode, you’ll learn:

How integrating advanced digital solutions like data analytics and blockchain can simultaneously address long-term and short-term goals within the food safety sector, ensuring a sustainable and secure supply chain.

The importance of cultivating courage within food safety leadership roles, emphasizing the critical need for a proactive approach in preventing and mitigating safety crises.

The transformative role of social media in empowering consumers as active stakeholders in food safety, contributing real-time data, and enhancing industry transparency and responsiveness to potential risks.

Connect with GS1 US:

Our website - www.gs1us.org

GS1 US on LinkedIn

Connect with the guests:

Darin Detwiler on LinkedIn

More about Detwiler Consulting Group  - https://www.herculeaneffort.net/


Digital Identity NZ

DINZ Supports NZTech’s Biometrics Concerns

In response to Graeme Muller’s open letter from NZTech to New Zealand Government officials, DINZ and its Biometrics Special Interest Group wish to emphasise our support for the concerns raised. The letter, published yesterday, highlights significant concerns about the OPC’s proposed Code of Practice for biometrics, which could negatively impact businesses, innovation, and the economy. 

“DINZ is fully committed to supporting the Office of the Privacy Commissioner (OPC). We believe that privacy-enhancing technologies introduced by our industry can complement the OPC’s role in protecting privacy. Collaboration between the OPC and businesses should not be seen as a trade-off but as a partnership with mutual benefits.” says Steven Graham, Head of Biometrics (ANZ) & Innovation, NEC New Zealand Limited and Chair of DINZ Biometrics Special Interest Group.

Following the publication of this letter, the Privacy Commissioner reached out to NZTech CEO Graeme Muller to assure NZTech members that the process is far from finished: they are still reviewing the feedback, and there will be further opportunities for stakeholders to engage. Given how important biometrics are for privacy, productivity and economic growth, DINZ and NZTech look forward to the OPC publishing their proposed next steps and engaging in real consultation with experts in biometrics.

VIEW FULL LETTER

The post DINZ Supports NZTech’s Biometrics Concerns appeared first on Digital Identity New Zealand.

Tuesday, 06. August 2024

Digital ID for Canadians

The Digital Identification and Authentication Council of Canada (DIACC) Written Submission for the 2025 Pre-Budget Consultations

Submitted by: Joni Brennan, President

List of recommendations

Recommendation 1: That the government prioritize digital trust in four areas critical to Canada’s leadership and the privacy, security and protection of our people and industries, including: Digital Trust in Citizen Services; Digital Trust in Finance and Regulatory; Digital Trust in Public Safety; and Digital Trust in Business and Industry.

Recommendation 2: That the government recognize the necessity of embracing and prioritizing privacy-protecting verification and authentication tools as part of its Artificial Intelligence (AI) strategy.

Recommendation 3: That the government allocate the funding needed to support the adoption of digital trust tools to the benefit of government, businesses, and citizens alike.

Introduction

The spread of misinformation is evolving around the world at a concerning pace. Bad actors are finding new battlegrounds and frontiers every day, and information and images generated by AI are being used to push political agendas and false narratives, scam and steal money and identities, and, even worse, lure online. 

Similarly, AI is also evolving rapidly, with risks as significant as the benefits. Further, now that AI is generative, users can manipulate images and information at unprecedented speed and scale, and vast amounts of inaccurate and malicious information make it difficult for people and organizations to verify information authenticity.

In today’s era of information warfare, authenticity and verification must be prioritized — particularly given the role of digital trust and identity verification in the delivery of government and business services.

Our submission and recommendations reflect the deep experience and expertise of DIACC’s member organizations, and our collective commitment to working with leaders in both the public and private sectors to secure verifiable information authenticity to the benefit of government, industry, and citizens alike by prioritizing inclusive and accessible privacy-protecting digital trust and verification capabilities.

About DIACC

The Digital Identification and Authentication Council of Canada (DIACC) was created following the federal government’s Task Force for the Payments System Review, with a goal to bring together public and private sector partners in developing a safe and secure digital ecosystem.

DIACC is guided by a belief that our public safety, civic engagement, and economic prosperity depend on leveraging trusted solutions and using well-established risk mitigation and certification tools. DIACC is committed to accelerating digital trust adoption and reducing information authenticity uncertainty by certifying services against its Pan-Canadian Trust Framework — a risk mitigation and assurance framework developed collaboratively by public and private sector experts that signals trustworthy design rooted in security, privacy, inclusivity, accessibility, and accountability.

Recommendations

Recommendation 1: That the government prioritize digital trust in four areas critical to Canada’s leadership and the privacy, security and protection of our people and industries, including:

Digital Trust in Citizen Services; Digital Trust in Finance and Regulatory; Digital Trust in Public Safety; and Digital Trust in Business and Industry.

Digital Trust in Citizen Services

DIACC advocates for digital trust in citizen services, emphasizing the importance of secure, privacy-respecting, and user-centric solutions through collaboration between government, private sector, and civil society. Leveraging our collaborative partnerships, we developed the Pan-Canadian Trust Framework (PCTF) – a risk mitigation and assurance framework that extends standards and open source code to help service providers ensure risk mitigation and user care.

As public services continue to move online, digital trust and verification services will be critical for ensuring that services are secure and accessible. From online healthcare consultations to digital government services, these technologies provide the necessary security infrastructure to protect public interactions and data.

Through partnerships with organizations such as DIACC, the government is encouraged to prioritize innovation in digital trust technologies through pilot projects, research, and education. Collaboration with various sectors will ensure the development and implementation of secure, efficient, inclusive, and accessible digital trust solutions, fostering a reliable digital ecosystem for accessing healthcare, banking, and government services.

Digital Trust in Finance and Regulatory

Canadians benefit from being a highly banked jurisdiction with broad inclusivity and accessibility – according to the Canadian Bankers Association, approximately 99 per cent of Canadian adults have a bank account.

Existing financial regulations provide powerful and internationally recognized tools that act as a solid foundation to fight fraud and foster a more verified, authentic, and trustworthy ecosystem that supports the needs of people, governments, and businesses alike. However, the government is encouraged to build on the existing regulatory framework and develop new regulations to facilitate secure digital transactions, including compliance with anti-money laundering (AML) and know-your-customer (KYC) regulations.

Further, digital trust and verification services will be critical as the government moves forward with its commitments to open banking, with interoperability also being paramount as the federal framework and existing provincial frameworks work together.

Similarly, the government has committed to reducing incidents of mortgage fraud and strengthening proof of borrower and title insurance, and digital trust and verification services can and should play a critical role in making that commitment a reality.

Digital Trust in Public Safety

DIACC believes that implementing digital trust and verification services are essential for enhancing public safety. Digital verification can play a crucial role in protecting vulnerable populations, including children, the elderly, and individuals with disabilities.

By ensuring that only authorized personnel have access to sensitive areas such as schools, healthcare facilities, and care homes, digital trust services can help safeguard these groups from potential harm. Further, digital trust and verification services enable secure and reliable cross-border identity verification, facilitating international collaboration in law enforcement, disaster response, and public health.

By prioritizing advanced authentication methods that ensure individuals and organizations are who they claim to be, we can help prevent unauthorized access to sensitive information and critical infrastructure, minimize financial scams and misuse of personal data, and enhance public safety for all Canadians.

Digital Trust in Business and Industry

Enhanced security is a primary benefit of digital trust and verification services for businesses and industries. These services provide robust security measures that protect businesses from fraud, identity theft, and cyber threats. By ensuring that only authorized individuals have access to sensitive information and resources, they help maintain the integrity of business operations.

By prioritizing digital trust in business and industry and implementing authentication and verification tools, the government can help drive the following benefits:

- streamlined business processes by automating identity verification and reducing the need for manual checks;
- faster, more efficient operations and reduced administrative costs, allowing businesses to allocate resources more effectively;
- data minimization and the secure handling of personal information, increasing customer confidence;
- a competitive advantage for Canadian businesses by helping them innovate and offer their customers new, secure digital services; and
- a reduction in incidents of fraud, resulting in significant cost savings for businesses. These savings can be reinvested into other business areas, driving growth and innovation and improving overall business performance.

Recommendation 2: That the government recognize the necessity of embracing and prioritizing verification and authentication tools as part of its AI strategy.

In today’s world, where AI is becoming smarter every day, and information can be generated and manipulated at unprecedented speed and scale, ensuring the accuracy and trustworthiness of information is critical. It is vital to maximize the benefits of an AI and Artificial General Intelligence (AGI)-fueled data ecosystem for Canada while also fostering citizen trust and protecting their safety.

To effectively address the challenges we’re facing while realizing the benefits of AI, the federal government should prioritize verification and authentication tools as part of its broader AI strategy. Prioritization must include funding, collaboration, and urgent action to support the development, adoption and certification of tools that verify information authenticity while protecting privacy and empowering Canadians. Governments, banks, telcos, tech companies, media organizations, and civil society must work together to deploy open, standards-based solutions and services to verify the authenticity of information.

The economic imperative of investing in these capabilities is clear. According to a study by Deloitte, the Canadian economy could unlock an additional 7 per cent (CAD $7 trillion) in economic value through AI and AGI technologies. People and organizations can only realize this potential for the good of society by investing in tools, processes, and policies that support verifying the authenticity of the information generated and processed by AI and AGI technologies.

Recommendation 3: That the government allocate the funding needed to support the adoption of digital trust tools to the benefit of government, businesses, and citizens alike.

Today, solutions can signal verified trust by getting certified against a technology-neutral risk and assurance framework like DIACC’s Pan-Canadian Trust Framework, developed collaboratively by public and private sector experts.

Verifiable information authenticity relies on critical principles, including provenance and traceability: provenance establishes the origin and history of information, ensuring it comes from a reliable source, while traceability allows for auditability of the flow of information, enabling people, businesses, and governments to verify its accuracy and authenticity. These principles are essential in combating the spread of misinformation and disinformation, which can have far-reaching consequences in an AI-fueled world.

Provenance and traceability are potent information authenticity tools that can help:

- businesses and professionals reduce liabilities and meet obligations to verify information about their clients and their operations;
- citizens and residents interact securely and efficiently with governments;
- customers and clients transact with privacy and security anywhere, anytime;
- industries manage decision-making and secure supply chains using trusted data;
- producers verify essential data related to environmental, safety, and operational goals; and
- creators track intellectual property to ensure fair payment and cultural protection.

Conclusion

Our public safety, civic engagement, and economic prosperity depend on leveraging trusted solutions, well-established risk mitigation and certification tools, and powerful collaboration to ensure regulations set informed guardrails that put people’s benefits, protections and agency to control data at the center of the design. The evolving AI-fueled information landscape presents unprecedented challenges and opportunities for innovation and progress. By prioritizing verifiable information authenticity and inclusive and accessible solutions, and by investing in digital trust, we can ensure that people and organizations realize the benefits of AI and AGI while mitigating its risks.

Thank you once again for the opportunity to provide our input in advance of Budget 2025 and as we collectively move forward on the path to a digitally and economically prosperous Canada.


The Engine Room

27 AUG – Report launch event: Information ecosystems in Latin America and the Caribbean

Join us for the launch of our new report sharing learnings from our research into information ecosystems in Latin America and the Caribbean.

The post 27 AUG – Report launch event: Information ecosystems in Latin America and the Caribbean appeared first on The Engine Room.


Digital ID for Canadians

DIACC Unveils New Board of Directors to Champion Digital Trust and Verification in Canada and the Global Digital Economy

Toronto, Ontario – August 6, 2024: DIACC is thrilled to announce the appointment of its Board of Directors following the recent election at DIACC’s Annual General Meeting (AGM) on June 27, 2024, where renowned leaders and visionaries from various sectors converged.

“On behalf of the DIACC Board, I am thrilled to welcome our newly elected and re-elected board members,” said Dave Nikolejsin, Chair of the DIACC Board. “Their expertise and dedication are invaluable as we advance digital trust in the global digital economy. We will continue to work together to advance a secure, efficient, privacy-respecting, and inclusive digital ecosystem.”

The new and returning Directors bring fresh perspectives and experience-based commitment to DIACC’s mission. Their leadership will help the council ensure its initiatives align with and influence standards and practices that mitigate security and privacy risks. DIACC’s leadership works collaboratively to support a more inclusive, secure, and efficient global digital economy that benefits people and organizations of all sizes.

This diverse group of leaders joins DIACC’s esteemed roster of Directors, bringing together a wealth of expertise and collective experience crucial in guiding and shaping the future of digital trust, verification, and privacy protection.

DIACC Board of Directors:

Chair: Dave Nikolejsin, Independent, Strategic Advisor with McCarthy Tetrault
Vice-Chair: Jonathan Cipryk, Vice President & Head of Technology Functions, Manulife *
Treasurer: Andre Boysen, Independent
Manish Agarwal, Chief Information Officer, Government of Ontario *
Neil Butters, Vice President & Head of Product Architecture, Interac Corp *
Mike Cook, CEO, Identos
Balraj Dhillon, General Manager of Product Platforms and Channels, Canada Post
Giselle D’Paiva, Digital Identity Leader, Government and Public Sector, Deloitte
Erin Hardy, General Counsel & Chief Privacy Officer, Service New Brunswick *
Hesham Fahmy, Chief Information Officer, TELUS
Marie Jordan, Senior Director Global Standards Management, VISA
Jonathan Kelly, Assistant Deputy Minister for Government Digital Transformation, Province of Quebec
Karan Puri, Associate Vice President, TD Bank *
CJ Ritchie, Associate Deputy Minister and Government Chief Information Officer, Province of BC
Pierre Roberge, Independent

* Indicates newly appointed.

The DIACC Board of Directors works closely with public and private sectors, academia, and civil society stakeholders to foster collaboration, reduce uncertainty, and accelerate the adoption of trustworthy services in the digital services ecosystem.

DIACC is confident that its Board of Directors’ collective insights and strategic direction will drive significant progress in the digital trust and verification space. Their dedication to fostering innovation and trust in digital services is invaluable as we work towards a future where secure and reliable digital identities are accessible to everyone.

“Being re-elected to the DIACC Board of Directors and serving as Vice-Chair is a tremendous honour. It allows me to support identity trust in Canada during these times of rapid technological advancements,” said Jonathan Cipryk, Vice President & Head of Technology Functions at Manulife. “I will use my expertise in technology and security to foster collaboration and drive programs that benefit our community. Together, we can build a future where identity trust and privacy empower individuals and strengthen our digital economy.” 

About the DIACC:

Established in 2012, the DIACC is a non-profit coalition of public and private sector organizations committed to advancing digital trust adoption through initiatives that inform and validate private sector services, enable privacy-protecting trusted exchanges between private and public sector authorities, and foster a robust ecosystem. DIACC enhances global economic prosperity by promoting digital trust, tools and services that verify information about individuals and organizations while protecting privacy.


For inquiries, please contact: communications@diacc.ca


GS1

Tomas Tluchor

Tomas Tluchor, Data Services Director, GS1 Czech Republic (Member excellence)

Shannon Fuller

Shannon Fuller, Lead Data Governance, Ahold Delhaize USA (Member excellence)

Maintenance release 3.1.29

GS1 GDSN accepted the recommendation by the Operations and Technology Advisory Group (OTAG) to implement the 3.1.29 standard into the network in November 2024.

Key Milestones:

See GS1 GDSN Release Schedule

As content for this release is developed, it will be posted to this webpage, followed by an announcement to the community to ensure visibility.

Data Pools should contact the GS1 GDSN Data Pool Helpdesk to understand the plan for the update. Trading Partners should work with their Data Pools on understanding the release and any impacts to business processes.

Business Message Standards including Message Schemas Updated For Maintenance Release 3.1.29

Trade Item Modules Library 3.1.29 

GS1 GDSN Code List Document (Sept 2024)

Delta for release 3.1.29 

Delta ECL for release 3.1.29 (Aug 2024)

Validation Rules (Sept 2024)

Delta for Validation Rules (Sept 2024)

Unchanged for 3.1.29

Approved Fast Track Attributes (Dec 2022)

BMS Documents Carried Over From Previous Release

BMS Shared Common Library (Dec 2023)

BMS Catalogue Item Synchronisation (Dec 2023)

BMS Basic Party Synchronisation

BMS Price Synchronisation 

BMS Trade Item Authorisation

Schemas

Catalogue Item Synchronisation Schema including modules 3.1.29 

Changed Schemas for 3.1.29 

Party Synchronisation Schema

Price Synchronisation Schema

Trade Item Authorisation Schema

Release Guidance

GS1 GDSN Attributes with BMS ID and xPath 

Packaging Label Guide 

Approved WRs for release (Aug 2024)

GPC to Context Mapping 3.1.29 (Sept 2024), May GPC publication

Delta GPC to Context Mapping 3.1.29 (Aug 2024), May GPC publication

Migration Document (Oct 2024) Updated 

GS1 GDSN Unit of Measure per Category (Aug 2024)

Unchanged for 3.1.29

Local Code List (LCL) Page

Deployed LCLs 

GS1 GDSN Module by context 

Warning Messages Presentation (Mar 2024)

Flex Extension for Price commentary (Dec 2018)

Any questions

We can help you get started using the GS1 standards

Contact your local office


Velocity Network

Velocity Network Foundation – MidYear Updates

The post Velocity Network Foundation – MidYear Updates appeared first on Velocity.

Monday, 05. August 2024

Identity At The Center - Podcast

Welcome to August 2024 and a new episode of The Identity at

Welcome to August 2024 and a new episode of The Identity at the Center podcast! This week we are joined by Chris Power from Sallie Mae for a wide-ranging conversation about non-human identities, RBAC vs. PBAC, how IAM organizations should be designed, and, of course, Robert Downey, Jr’s return to the Marvel cinematic universe as Dr. Doom.

Watch the episode at https://youtu.be/CNzUZ6JXIOA?si=flD-2a6q0AyaJUuq and don’t forget to give us a like and subscribe!

#iam #podcast #idac

Friday, 02. August 2024

Oasis Open Projects

Cybersecurity in the Age of GenAI

Blog post by the Baseline Technical Steering Committee

Cybersecurity in the Age of GenAI: Coping with the Exponential Web of Change

In today’s rapidly evolving digital landscape, businesses face a paradoxical challenge: while digital transformation fuels revenue growth, it also exposes them to escalating cyber threats. On the one hand, more than half of global GDP is now driven by digitally transformed organizations (IDC, 2020). On the other hand, this digital revolution, while fostering economic growth and innovation, also ushers in an era of unprecedented cybersecurity risks. Cybercrime damages are expected to reach a staggering $10.5 trillion annually by 2025, roughly 10% of the projected global GDP, with a concerning annual growth rate of 15%. The complexity and scale of these threats highlight the inadequacy of traditional cybersecurity approaches and the urgent need for a paradigm shift.

As we embrace the power of Generative AI to transform most aspects of society, its exponential growth and capabilities bring about new risks and challenges.

Read more

The post Cybersecurity in the Age of GenAI appeared first on OASIS Open.


Project VRM

Up Starting

Not finishing up, or starting up, but up starting.

Hell, we’ve been up and starting for one month short of eighteen years. Across that whole time, we’ve been pushing the idea that free customers are more valuable—to themselves, to sellers, to the whole marketplace—than captive ones.

And I’m more optimistic than ever that we’ll prove that idea in the next few years.

Toward that ambition, here are some links in tabs I’m closing:

Customer Commons (ProjectVRM’s nonprofit spinoff) has a renewed website. There is still much shaking down to do, but big thanks to Justin Byrd of Machi-Systems for doing the heavy lifting on the project. The Future, Present, and Past of News—and Why Archives Anchor It All is a talk I’ll be leading on Thursday, 8 August at DWeb Camp. The VRooMy side of it is leadership news needs from its consumers (who pay nothing) and customers (who do). More context at the News Commons series running on my blog. The Personal Stack, 2024 ‘AI Powered’ Version … what needs to be built on the individual side to enable balanced, trustworthy relationships with supply organisations is one among many pure-VRM posts in Iain Henderson’s Substack newsletter.. Jamie Smith’s Customer Futures is another one. Don Marti’s blog has too much good stuff for me to list it all. One especially worth pointing out, for Mac and iPhone users, is turn off advertising measurement in Apple Safari. After giving instructions (which I just followed, surprised that I hadn’t turned this shit off), he explains, “The deeper they hide stuff like this, the more it shows they understand that it’s not in your best interest to have it on. The Apple billboards are all about protecting you from tracking. I haven’t seen one yet that was more like Connect and share with brands you love! (please let me know if you see any Apple billboards like this) Information has value in a market. When your browser passes information about you—even in a form that is supposed to prevent individual tracking—you’re rewarding risky and problematic advertising practices along with the legit ones. Some advertising has value, but putting legit sites and malvertising on an equal basis for data collection is not helping.” Bonus link concerning Apple’s new AI push. And here are two more bonus links from when Apple first went on its privacy kick: Apple vs (or plus) Adtech, Part I Apple vs (or plus) Adtech, Part II “Okay, whatever”: An Evaluation of Cookie Consent Interfaces. From 2022, but more relevant than ever. $700bn delusion: Does using data to target specific audiences make advertising more effective? Latest studies suggest not, by Jon Bradshaw in Mi3. Apps Apple threatens or kills with its new gear and OS generations. Google Is the Only Search Engine That Works on Reddit Now Thanks to AI Deal, by Emanuel Maiberg in 404. This is about more silo-ing. Dave Winer‘s Podcasto is cool. He got the podcast ball rolling, both on tech and in pods of his own. This features much of his early stuff. Recommendations from the High-Level Group on Access to Data for Effective Law Enforcement , which EDRi says was “first published by Netzpolitik and now also made public by the European Commission, was drafted by the “High-Level Group (HLG) on access to data for effective law enforcement,” which was convened following a proposal by the Swedish Presidency of the Council last spring.” It continues, “Building upon previous proposals drafted by police and security officials from Europe and North America, the plan contains 42 separate recommendations, amongst which are calls for the re-introduction of mass telecommunications surveillance (“data retention”) and the undermining of encrypted communication systems.” Bold red type in the Recommendations says, “The opinions expressed are those of the experts only and should not be considered as representative of the European Commission’s official position.” This is good, because the whole idea is awful. 
Ted Gioia’s A 2000-Year-Old Argument Over the Flute Is the Most Important Thing in Our Culture Right Now: This bitter debate from ancient times helps us understand today’s crisis in music and other creative fields unpacks what the head and subhead say. Ted’s is one of the best Substack newsletters.
Metaphor, Morality, and Politics, Or, Why Conservatives Have Left Liberals In the Dust is a rough outline of George Lakoff‘s landmark 1995 book, Moral Politics: What Conservatives Know that Liberals Don’t, which in its later editions became Moral Politics: How Liberals and Conservatives Think. If you want to know a big reason why political movements on the right and left succeed, George’s stuff is required reading. Bonus link. On the separate and more current matter of “wokeness,” here’s Lessig.
Toward Personal AI: LM Studio, Ben Evans, MemGPT, Pi.ai, TheRundown, SITUATIONAL AWARENESS, Genie Out of the Bottle, What exactly is an AI agent?, On a web of heterogeneous agents for collaborative intelligence, Semi-Autonomous Agents at Web Scale: a software architecture approach, Responsible AI for a Better World, Internet Computer Protocol (ICP). A blockchain thing. See what you think.
Rewilding the Web: Europe’s Path to Digital Sovereignty.
Is Personal Data Going Dutch? by Arno Otto. I’m sourced in it. Somewhere is stuff I said at Solid World in a video. Can’t find it right now, though. The Dutch Data Vault Foundation is a VRM play.
COSMOPlat “introduces users’ participation in the whole manufacturing process.”
FEMA’s National Risk Map. Not especially VRooMy, but interesting. Where I am now, Monroe County, Indiana, is “relatively low.”
George Tannenbaum bails from the advertising industry.
Internet Kessler Syndrome: Are We Witnessing The Beginning Of The End Of The Open Internet? The risk: “an internet so clogged with ‘debris’ that it loses everything that once made it useful.” A good and depressing read.
Augustine Fou on how adtech fails.
A bill to protect people from deepfakes.
“The Foundation For Open Source Ecosystem Technology (FOSET) is centered on Open Source development to better serve the public sector, academic institutions, and non-profit organizations. FOSET acts as a home for the technology, its associated documentation and governance, and the community of individuals and organizations that support it.”
How Does Your Mobile Phone Track You (Even When Off)? by Danka Delić in ProPrivacy.
The Hacking of Culture and the Creation of Socio-Technical Debt, by Kim Córdova and Bruce Schneier. Bonus link from Bruce.
Network Neutrality, Search Neutrality, and the Never-ending Conflict between Efficiency and Fairness in Markets, by Andrew Odlyzko. An oldie but goodie.

That’s it for now.

Thursday, 01. August 2024

MyData

The MyData Awards 2024 – 2025 are now open for nominations! 

Nominate any individuals or organisations now at https://go.mydata.org/nominate ! Self-nominations are allowed and encouraged. You may make as many nominations as you like. Hello to all members, interested people and the […]

Tuesday, 30. July 2024

The Engine Room

[CLOSED] Apply now for intensive, strategic tech and data support in Sub-Saharan Africa


We are currently accepting applications from organisations based in Sub-Saharan Africa, for Matchbox intensive support partnerships beginning in October 2024.

The post [CLOSED] Apply now for intensive, strategic tech and data support in Sub-Saharan Africa appeared first on The Engine Room.

Monday, 29. July 2024

Energy Web

World-leading sustainable aviation fuel certificate registry now live on Energy Web X

Registry enables tracing and tracking of sustainable aviation fuel with the help of Energy Web and Polkadot technologies

The Sustainable Aviation Fuel certificate (SAFc) registry was recently launched to support major consumers of air travel and transport in their efforts to help directly decarbonize flight. The registry’s design, development, and testing was spearheaded by the nonprofit organizations RMI and the Environmental Defense Fund, which work through the Sustainable Aviation Buyers Alliance (SABA), an organization whose members include major airlines, fuel providers, and sustainable aviation buyers, such as Novo Nordisk, looking to reduce the climate impacts of air transport.

The SAFc Registry connects corporate consumers, airlines, freight forwarders, and clean fuel providers in one universally accessible platform that will spur the use of sustainable aviation fuel (SAF) by ensuring that SAF certificates purchased outside of the platform are delivered consistently, verifiably, transparently, and credibly. The registry builds on industry best practices and was subject to public consultation to ensure SAF certificates exchanged realize their intended environmental impact and can be claimed towards emissions reduction goals.

SAF is made from renewable or waste materials and can decrease the lifecycle emissions of flight by more than 80%. However, the fuel is not yet widely available and can be significantly more expensive than conventional jet fuel. The Registry helps address this market gap by connecting corporate demand for emissions reductions to SAF producers through an auditable ledger for certificates, which decouples the emissions benefits of SAF use from the physical fuel supply. It is the final piece of market infrastructure — in addition to robust standards, book & claim accounting guidelines, and structured RFPs — needed to create a mature, de-risked market for SAFc investment.
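As a rough illustration of the book & claim model described above (hypothetical types and function names, not the registry’s actual API), the emissions benefit travels as a certificate on a ledger, entirely separate from the physical fuel:

// Hypothetical sketch of book & claim accounting in TypeScript.
type SAFCertificate = {
  id: string;
  tonnesCO2eAvoided: number; // lifecycle emissions avoided vs. conventional jet fuel
  producer: string;          // who delivered the SAF into the fuel supply
  claimedBy?: string;        // buyer who retired the certificate, if any
};

const ledger = new Map<string, SAFCertificate>();

// "Book": a producer registers fuel delivered into the supply network.
function book(cert: SAFCertificate): void {
  ledger.set(cert.id, cert);
}

// "Claim": a buyer retires the certificate toward its emissions goals.
// A certificate can be claimed exactly once, which prevents double counting.
function claim(certId: string, buyer: string): void {
  const cert = ledger.get(certId);
  if (!cert) throw new Error(`unknown certificate ${certId}`);
  if (cert.claimedBy) throw new Error(`certificate ${certId} already claimed`);
  cert.claimedBy = buyer;
}

The auditable part is the ledger itself: because every booking and claim is recorded in one place, auditors can confirm that no emissions benefit is sold twice.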

Major customers have already banded together via SABA to collectively procure SAF certificates. Several of these deals will use the SAFc Registry as the delivery mechanism. Future procurement efforts inside and outside of SABA can use the SAFc Registry to seamlessly ensure the delivery of SAF certificates and sustainability assurances to buyers. Q8Aviation, an international jet fuel supplier delivering sustainable aviation fuel to the aviation market, recently enrolled in the registry, saying, “We look forward to leveraging the SAFc registry to expand accessibility and support broader adoption of SAF.”

Additionally, climate tech company CHOOOSE is complementing the registry with features to streamline the flow of SAF transactions across the value chain. “At CHOOOSE, we build and operate software that supports airlines, freight forwarders, clean fuel providers, and corporate buyers in scaling their sustainable aviation fuel programs. Recognizing the vital role SAF plays in helping to decarbonize the aviation industry, these programs contribute meaningfully to SAF’s adoption. By leveraging systems like book and claim, the SAFc Registry ensures the secure transfer of certificates — and when coupled with CHOOOSE technology, SAF transactions become more automated, streamlined, and accessible. Through this partnership, we’re able to accelerate the use of SAF across the sector,” said Gaute Gamst, Chief Technology Officer at CHOOOSE.

Web3 technologies play an important role in the operation of the SAFc Registry, providing deep transparency to end users. With the support of worker nodes anchored to and supported by the Energy Web X blockchain on Polkadot, actions on the SAFc Registry are checked, validated, and digitally recorded in a zero-trust, tamper-proof way. For registry users, the public, and auditors, this amounts to a continuous audit of registry operations.

For more information, please visit the SAFc registry website.

Notes to Editors:

About Energy Web Foundation:

Energy Web is a global non-profit accelerating the clean energy transition by developing open-source technology solutions for energy systems. Our enterprise-grade solutions improve coordination across complex energy markets, unlocking the full potential of clean, distributed energy resources for businesses, grid operators, and customers. Our solutions for enterprise asset management, data exchange, and Green Proofs, our tool for registering and tracking low-carbon products, are underpinned by the Energy Web Chain, the world’s first public blockchain tailored to the energy sector. The Energy Web ecosystem comprises leading utilities, renewable energy developers, grid operators, corporate energy buyers, automotive, IoT, telecommunications leaders, and more. More information on Energy Web can be found at www.energyweb.org or follow us on Twitter @EnergyWebX

About SABA:

The Sustainable Aviation Buyers Alliance (SABA) is accelerating the path to net-zero aviation by driving investment in, and adoption of, high-integrity sustainable aviation fuel (SAF) and supporting companies, airlines and freight customers in achieving their climate goals. SABA Members work in collaboration with EDF and RMI to develop a rigorous, transparent system that expands opportunities to invest in high-integrity SAF to all businesses and organizations interested in reducing the climate impacts of flying. SABA’s founding companies included Bank of America, Boeing, Boston Consulting Group, Deloitte, JPMorgan Chase, McKinsey & Company, Meta, Microsoft, Netflix and Salesforce.

World-leading sustainable aviation fuel certificate registry now live on Energy Web X was originally published in Energy Web on Medium, where people are continuing the conversation by highlighting and responding to this story.


Identity At The Center - Podcast

Who is on your IAM 'Mount Rushmore'?


Who is on your IAM 'Mount Rushmore'?

This is one of the questions we asked in our latest episode of The Identity at the Center Podcast. Jim McDonald and I sat down with Allan Foster from the Digital Identity Advancement Foundation (DIAF). He shared a wealth of knowledge from his journeys in the IAM space, including helping start ForgeRock in a London pub, his work with the DIAF, a round of overrated/underrated, and more. The episode culminates with Allan's take on the IAM 'Mount Rushmore', giving us four names for consideration.

Watch it at https://youtu.be/Ue0CGysg10Q?si=gVorujoGhpM-ZD5w

What do you think of his selections? Post your thoughts below!

#iam #podcast #idac

Friday, 26. July 2024

Origin Trail

Championing European Gymnastics with Borderless Knowledge enabled by Artificial Intelligence and…

Championing European Gymnastics with Borderless Knowledge enabled by Artificial Intelligence and OriginTrail

European Gymnastics is a sports organisation comprising 50 national member federations and reaching beyond the borders of political Europe. It nevertheless bears the idea of a united gymnastics nation. As guarantor of the interests of its around 8,500,000 gymnasts, European Gymnastics represents many different facets: from high-level competitive sport in four Olympic and three non-Olympic disciplines to leisure sport in gymnastics for all, with offers for all age groups, from toddlers to senior citizens. European Gymnastics transmits an understanding of being together beyond borders and sets an example in community.

Now, European Gymnastics is launching its own Artificial Intelligence (AI) assistant powered by OriginTrail to drive borderless knowledge, furthering its mission to promote, develop, and support synergy among the community and to make gymnastics and gymnasts at all levels shine. The friendly mascot Luigi, whom you can meet at all major European Gymnastics events, is now receiving a digital twin. Powered by AI, digital Luigi allows anyone to learn about and keep in touch with the European Gymnastics community. From finding information about the next competition to learning about important events in European Gymnastics history or understanding which elements are important for scoring a routine on parallel bars — all can be discovered with the help of the AI-powered Luigi.

The uniqueness of our digital Luigi is that his responses always include sources of information, allowing the user to explore any particular source further. This capability is unlocked by OriginTrail’s Decentralized Knowledge Graph, which promises an even more powerful Luigi assistant over time, as it allows the initial knowledge base to continuously expand, not only with European Gymnastics’ own inputs but also with contributions from the national federations, gymnasts, and fans. Because OriginTrail is based on blockchain, all such contributions will also be protected against tampering — extending European Gymnastics’ commitment to integrity from the sport halls to managing data.

Today’s launch of Luigi coincides with the start of the biggest sporting event in the world — the Olympic Games. To help you navigate all the performances by European gymnasts in Paris, Luigi is already equipped with knowledge about the schedule and will receive daily updates about results.

“European Gymnastics is excited to keep pushing the innovation in our sport. After being the first continental gymnastics federation to launch a digital cup competition this year, we are now making first steps into adopting Artificial Intelligence and blockchain to improve the ease of interaction with what is sometimes considered a complex world of Gymnastics. This is an important step in our newly adopted Strategy 2030, embracing top technology which has a lot to offer,” said Dr. Farid Gayibov, European Gymnastics President.

You can find Luigi’s digital twin on the European Gymnastics website.

Championing European Gymnastics with Borderless Knowledge enabled by Artificial Intelligence and… was originally published in OriginTrail on Medium, where people are continuing the conversation by highlighting and responding to this story.

Thursday, 25. July 2024

Ceramic Network

Unlocking Privacy: A Step-by-Step Guide to Ceramic's Private Data Proof-of-Concept

Learn how the Ceramic team is rolling out a plan to support private data in the network starting with a minimal proof-of-concept.

About a month ago, one of our founders, Joel Thorstensson, released an initial overview of our plans for how private data capabilities could be natively offered to developers using the Ceramic Network. The forum post details how this would be rolled out in two phases in the form of a proof-of-concept, and is worth reading before diving into this article.

It's also important to mention the motivation behind this effort. To start, Ceramic doesn't currently offer any native privacy features, which means that all data on Ceramic is public by default. At the same time, over the past few years we've recognized a strong need for access control across several applications and use cases: a rough estimate is that almost half of all apps built on Ceramic have access-control needs in one form or another.

In thinking about a solution, we aligned on the premise that physical access control (where data lives and who can sync it, as opposed to encryption) resonates most directly with the uniqueness of Ceramic's event-based architecture and our desire to align with edge privacy principles.

As such, our next step was to define a scope around a minimalist build to showcase how physical data privacy could be implemented in Ceramic.

Phase 1: API Read Access Control

If you've read Joel's forum post (linked above), you already know the details of the concept we've designed for this first phase. However, below are several key takeaways:

Builds on Ceramic-one (implementation of the Ceramic protocol, written in Rust)
Leverages the Feed API on Ceramic-one nodes, thus allowing nodes to filter the feed of events based on which permissions a user has
Designed to showcase how two users could share private data from the same node by leveraging object-capabilities (OCAP)
Shows how an object-capability generated by user 1 references a stream containing the data they want to share (as well as the DID corresponding to user 2, the person they want to share their data with) and allows user 2 to access data that would otherwise be physically inaccessible to query and obtain without the OCAP; the sketch below gives a rough idea of what such a capability carries
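To make the object-capability idea concrete, here is a rough sketch of the kind of payload such a capability might carry. The field names are illustrative assumptions, not Ceramic's actual wire format; see Joel's forum post for the real design.

// Illustrative only: a simplified object-capability for delegated read access.
type ReadCapability = {
  issuer: string;     // DID of the data owner (user 1)
  audience: string;   // DID being granted read access (user 2)
  resource: string;   // the Ceramic stream ID being shared
  expiresAt?: string; // optional expiry, ISO 8601
  signature: string;  // issuer's signature over the fields above
};

Because the node only serves the stream to requests presenting a valid capability, the data stays physically inaccessible to everyone else.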

This article will walk through how to run the PoC locally.

If you prefer video, please view the YouTube version of our Private Data Playground:

Setting Up Your Environment

This walk-through requires you to generate clones of two repositories, one of which is the PoC itself, while the other is the Rust-Ceramic codebase.

Rust-Ceramic Set-Up

First, install protobuf:

# You can alternatively use `brew install protobuf`
PROTOC_VERSION=3.20.1
PROTOC_ZIP=protoc-$PROTOC_VERSION-linux-x86_64.zip
curl --retry 3 --retry-max-time 90 -OL https://github.com/protocolbuffers/protobuf/releases/download/v$PROTOC_VERSION/$PROTOC_ZIP \
  && unzip -o $PROTOC_ZIP -d /usr/local bin/protoc \
  && unzip -o $PROTOC_ZIP -d /usr/local 'include/*' \
  && rm -f $PROTOC_ZIP

Next, clone the Rust-Ceramic codebase:

# we will need a special branch from the repo
git clone https://github.com/ceramicnetwork/rust-ceramic \
  && cd rust-ceramic \
  && git fetch

We need to set up our Rust-Ceramic node from a specific branch. Enter the branch relevant to this PoC, build, and run the daemon:

# enter the special branch
git checkout feat/private-data
# build and run
cargo run -p ceramic-one -- daemon

If your terminal starts populating with startup logs, you've successfully started your node!

You now have an active Ceramic node running in the background! Next, we'll walk through the configuration for the private data playground web app.

Private Data Playground Web App Setup

First, clone the Private Data Playground repository:

git clone https://github.com/ceramicstudio/private-data-playground

Go ahead and open the private-data-playground repo in your text editor of choice. Once open, we will need to create a copy of the example environment file and rename it:

cp .env.example .env

Our first step is to supply a value for the NEXT_PUBLIC_PROJECT_ID variable by setting up a Project ID with WalletConnect. You can set one up for free (if you don't already have one) by following the simple steps in our WalletConnect tutorial (under "Obtain a WalletConnect Project ID"). We need this because the application's Wagmi hooks rely on a contextual wrapper that makes those hooks available to all child components and is also used by Web3Modal.

Once obtained, paste this into your new environment file next to the variable name referenced above.
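Your .env should end up containing a single line like this (the value shown is a placeholder):

NEXT_PUBLIC_PROJECT_ID=your_walletconnect_project_id

For orientation, the contextual wrapper mentioned above usually looks something like the sketch below. This is modeled on Web3Modal v3's Wagmi integration; exact package names and signatures vary by version, so treat it as an assumption rather than the playground's actual code.

// Hedged sketch: Wagmi + Web3Modal contextual wrapper for a Next.js app.
import type { ReactNode } from 'react';
import { WagmiConfig } from 'wagmi';
import { mainnet } from 'wagmi/chains';
import { createWeb3Modal, defaultWagmiConfig } from '@web3modal/wagmi/react';

const projectId = process.env.NEXT_PUBLIC_PROJECT_ID!; // from your .env file
const chains = [mainnet];
const wagmiConfig = defaultWagmiConfig({
  chains,
  projectId,
  metadata: { name: 'Private Data Playground', description: '', url: '', icons: [] },
});
createWeb3Modal({ wagmiConfig, projectId, chains });

// Wagmi hooks (useAccount, useSignMessage, ...) now work in any child component.
export default function Providers({ children }: { children: ReactNode }) {
  return <WagmiConfig config={wagmiConfig}>{children}</WagmiConfig>;
}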

Next, install your dependencies:

npm install

We're now ready to run the PoC!

Running the Application

Start up your application from within the private-data-playground repository in developer mode to initiate the UI:

npm run dev

You should now be able to access the UI by navigating to http://localhost:3000 in your browser!

Creating a Stream and a Capability

Our first section will focus on generating a stream (containing a simple message) and an OCAP.

To begin, self-authenticate by clicking "Connect Wallet."

You should see a secondary signature request appear after selecting an account. Approving this request creates a browser session (specific to your DID, stemming from your Eth address) that the application will use to sign and submit data on your behalf as you create messages.
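Under the hood, Ceramic apps typically create this kind of session with the did-session library. A minimal sketch, assuming the standard did-session and @didtools/pkh-ethereum APIs (which may differ from the playground's exact code):

// Hedged sketch: deriving a Ceramic session DID from an Ethereum wallet.
import { DIDSession } from 'did-session';
import { EthereumWebAuth, getAccountId } from '@didtools/pkh-ethereum';

async function createSession(ethProvider: any, address: string): Promise<DIDSession> {
  const accountId = await getAccountId(ethProvider, address);
  const authMethod = await EthereumWebAuth.getAuthMethod(ethProvider, accountId);
  // The wallet signature you approve authorizes this temporary session key,
  // which then signs Ceramic writes on your behalf.
  return DIDSession.get(accountId, authMethod, { resources: ['ceramic://*'] });
}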

There are two views contained in this simple PoC: one for writing data, and one for reading. Make sure you're in the "Write" view by clicking the toggle under your address.

Enter a simple message of your choosing ("I love Ceramic!" would be an obvious choice) and click the "Create" button. This action initiates a process that builds a new Ceramic stream and constructs your message into a payload that the Rust-Ceramic Feed API will accept.

You should now see the resulting identifier under "Stream ID".

Go ahead and copy this value. Save it somewhere as it's needed later (a simple text document will suffice).

Finally, select another Eth address you control (make sure to remember which one) and enter it into the text input under "Delegate read access to". When ready, click "Create Capability". If you've followed all the steps correctly, both a Stream ID and a Capability value should now be displayed.

Copy the capability value somewhere you can reference for the next section.

Congrats! You've successfully created both a Ceramic stream and a capability object! The next section will show how to use these to access otherwise private data.

Using the OCAP to Access Private Data

Go ahead and disconnect your current authenticated account from the web app. Next, go through the sign-in flow using the address you selected for the "Delegate read access to" input from the prior section.

Once authenticated, navigate to the "Read" toggle in the web app.

Enter the Stream ID and the Capability generated and saved from the prior section.

If you've copied over the values correctly, you should now be able to view the original message.

Congratulations - you've successfully used a capability to access otherwise private data on Ceramic.

You can also run through the "Read" process again, but this time make an arbitrary edit to the OCAP (thus invalidating it). With the Stream ID value kept the same, you'll notice that you can no longer access the resulting message.
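That failure is exactly what a signature check buys you: any byte-level edit to the capability invalidates the issuer's signature, so the node refuses to serve the stream. A generic illustration using Node's built-in crypto with an Ed25519 key (not Ceramic's actual verification code):

// Illustrative only: why a tampered capability fails verification.
import { verify, type KeyObject } from 'node:crypto';

function capabilityIsValid(
  payload: Buffer,      // the serialized capability fields
  signature: Buffer,    // issuer's signature over those exact bytes
  issuerKey: KeyObject, // issuer's public key
): boolean {
  // For Ed25519, Node expects `null` as the algorithm argument.
  return verify(null, payload, issuerKey, signature);
}
// Flip a single byte of `payload` and verify() returns false,
// so the read request is rejected.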

Next Steps

This minimal PoC is only the beginning of our plans for rolling out private data on Ceramic, with phase 2 coming soon (showcasing data privacy in the form of nodes and their ability to sync data between each other based on signed capabilities).

Is private data relevant to what you're building? Have feedback, questions, or concerns about our current thinking around private data? We'd love to hear from you! Fill out our community contact form, or email us at partners@3box.io.

Happy buidling!

Wednesday, 24. July 2024

Me2B Alliance

The Worldwide Web of Commercial Surveillance: Identity Resolution & Customer Data Platforms 


Today, we are excited to announce our latest research exposing the massively networked personal information sharing happening between and across identity resolution and customer data platforms, which has been hiding in plain sight for over 10 years. These industries are the plumbing backbone for synthesizing personal data from hundreds of data sources, across services and devices, spanning the digital and physical worlds.  

In February 2024, Cracked Labs published “Pervasive identity surveillance for marketing purposes”, an in-depth analysis of LiveRamp’s RampID identity graph. One of the most superficial yet most powerful functions of this excellent report was to guide attention towards industries responsible for pervasive consumer surveillance. The timing was excellent as I’d already committed to present “The Hidden Identity Infrastructure” at Identiverse (May 2024) and prompted by the report, I dug in to better understand the two industries underpinning hidden identity infrastructure, namely, Identity Resolution (ID Res) and Customer Data Platforms (CDPs).  

There are nearly $9T worth of industries worldwide that rely on persistent, hidden identification of people. Naturally, demand of this magnitude fueled the now-mature industries that perform pervasive, universal identification of people and their personal information. ISL identified over 350 companies providing identity resolution platforms, customer data platforms, or both.  

This paper explores the magnitude and reach of these two industries, how they came to be, and most importantly, why, from a human well-being perspective, it’s crucial that these kinds of platforms be held to higher regulatory standards of scrutiny, transparency, and accountability. One identity resolution company alone, out of 93 such companies worldwide, boasts the collection of 5,000 data elements for [each of] 700 million consumers in 2021. To put this in perspective, the number of user accounts breached worldwide in 2023 was about 300 million. Is there an appreciable difference between stolen user data and undisclosed “legitimate” personally identifiable information sharing? Moreover, nearly 40% of the 93 companies that provide identity resolution platforms are registered data brokers.   

Indeed, after reviewing the research, we must ask ourselves: is this the kind of world we want to live in, a world where everything about us is always known by industry, a world where the ongoing surveillance of people is deemed necessary in the name of capitalism? Is this the kind of world in which humans and societies will flourish, or self-destruct? Are humans more than capitalistic consumers? Are we more than our purchasing potential?  

A Call to Action 

ISL conducted this research to help illuminate the sizable risk of hidden identification and the worldwide web of user surveillance. ISL believes naming and exposure are crucial to effecting change. Identity resolution and customer data platforms have been hiding in plain sight for more than a decade, and yet even the “identerati” are largely unfamiliar with these industries. How can we expect everyday people to know?   

This paper is a rallying call for privacy advocates to come together to demand greater regulatory scrutiny, transparency and oversight for these industries, in conjunction with more meaningful data broker regulation.  

Additionally, this is a rallying call to acknowledge the catastrophic failure of notice and consent as a valid permissioning mechanism for highly complex and interconnected digital services. It’s inconceivable that people understand the magnitude of data sharing that consenting to sharing “your data with our marketing” entails.  

We must ask ourselves if this is the kind of world we want for ourselves and our children, where our preferences, practices, relationships, behaviors, and beliefs are all up for sale and broadly shared without our awareness. Are we ourselves in fact being sold?  

The technologies fueling these capabilities have received billions of dollars in investment; consumers don’t stand a chance in the face of this voracious hunger to identify, know, and manipulate them. We hope that this research shines a much-needed light on the forces enabling the worldwide web of human surveillance so that they may be held accountable for their troves of data on nearly all internet users. 

 P.S. Also check out our latest podcast with guest Zach Edwards where we discuss this worldwide web of human surveillance live.

Open Report PDF

Identity Resolution and Customer Data Platforms found in 2022 EdTech Benchmark Network Traffic

Identity Resolution and Customer Data Platform Companies

The post The Worldwide Web of Commercial Surveillance: Identity Resolution & Customer Data Platforms  appeared first on Internet Safety Labs.


Identity Resolution and Customer Data Platforms found in 2022 EdTech Benchmark Network Traffic


This file provides the list of all the apps found in the ISL 2022 EdTech safety benchmark found to be sending data to either one or more identity resolution or customer data platform companies.

ISL provides this data as an informational tool reflecting research at this point in time. Please contact us at contact@internetsafetylabs.org if you have questions or corrections.

This work is licensed under the Creative Commons Attribution-NonCommercial 4.0 International License.

Identity Resolution and Customer Data Platforms found in 2022 EdTech Benchmark Network Traffic

 

The post Identity Resolution and Customer Data Platforms found in 2022 EdTech Benchmark Network Traffic appeared first on Internet Safety Labs.


Identity Resolution and Customer Data Platform Companies


This file provides the list of all known companies that provide identity resolution or customer data platforms (or both), worldwide.

ISL provides this data as an informational tool reflecting research at this point in time. Please contact us at contact@internetsafetylabs.org if you have questions or corrections.

This work is licensed under the Creative Commons Attribution-NonCommercial 4.0 International License.

Identity Resolution and Customer Data Platform Companies

The post Identity Resolution and Customer Data Platform Companies appeared first on Internet Safety Labs.


The Worldwide Web of Human Surveillance: Identity Resolution & Customer Data Platforms 


This paper explores the magnitude and reach of the Identity Resolution and Customer Data Platform industries, how they came to be, and most importantly, why, from a human well-being perspective, it’s crucial that these kinds of platforms be held to higher regulatory standards of scrutiny, transparency, and accountability. One identity resolution company alone, out of 93 such companies worldwide, boasts the collection of 5,000 data elements for [each of] 700 million consumers in 2021. To put this in perspective, the number of user accounts breached worldwide in 2023 was about 300 million. Is there an appreciable difference between stolen user data and undisclosed “legitimate” personally identifiable information sharing? Moreover, nearly 40% of the 93 companies that provide identity resolution platforms are registered data brokers.

Open Report PDF

The post The Worldwide Web of Human Surveillance: Identity Resolution & Customer Data Platforms  appeared first on Internet Safety Labs.


Next Level Supply Chain Podcast with GS1

Digital Twins & Their Supply Chain Wins with Elyse Tosi


In the supply chain, technical requirements are the cornerstone for creating scalable and interoperable systems that ensure a seamless flow of information and enhance the accountability and traceability of materials and products throughout their lifecycle.

 

Liz and Reid got to talk about this with Elyse Tosi, the Vice President of Accounts and Implementation at EON, an innovator in product digitization. Elyse shares her extensive knowledge and experience in supply chain management, touching on her work with brands like Victoria's Secret and Eileen Fisher, to discuss the transformative impact of technology and standards on global supply chains.

 

They discuss enhancing value chain efficiency through interoperability, the significance of the EPCIS standard in scaling and achieving interoperability, and how EON, chosen by the EU to pilot digital product passports, is influencing legislation and standards adoption—an initiative critical for compliance, brand protection, and product authentication. They also explore emerging trends like digital twins, QR codes, digital links, and their game-changing potential for retail and customer engagement.

 

In this episode, you’ll learn:

How EPCIS standards ensure interoperability and scalability for digital product passports, enabling seamless data exchange and lifecycle management in supply chains.

The transformative impact of digital twins, QR codes, and digital links on retail experiences, customer engagement, and product data connectivity, driving new commerce channels and incremental revenue opportunities.

How EON leverages compliance with EU legislation to provide commercial benefits such as brand protection and product authentication, reinforcing the importance of scalable and cost-effective blockchain applications.

 

Connect with GS1 US:

Our website - www.gs1us.org

GS1 US on LinkedIn

 

Connect with the guests:

Elyse Tosi on LinkedIn

More about EON - https://www.eon.xyz/

 


Digital Identity NZ

Digital Trust Framework: Launch & Future | July Newsletter

The Digital Identity Services Trust Framework (DISTF) Act took effect on the first day of the month, and included the establishment and implementation of the Regulator, the Trust Framework Authority. The launch was rather low-key, with the only discernible signal from the Department of Internal Affairs being updates to their digital government web pages to reflect this milestone. 


It was a different story in industry, however, where the occasion was covered by DINZ, nationally by RNZ, internationally by Biometric Update, and in social media posts, including my own and those from DINZ.

The quote that really stuck was this one from Victoria University’s Professor of Informatics Markus Luczak-Roesch: “There’s a huge risk of doing nothing. Which is why it’s good that we’re doing something.” He’s absolutely right. It has taken over seven years of policy work at the DIA to reach this point, which I described as ‘the end of the beginning’. While it has been a challenging journey, Aotearoa can build from here with those that want to opt in.

Next month’s Digital Trust Hui Taumata ahead of Net Hui and The Point 2024, will kick off with a keynote by Microsoft’s global identity standards lead and past DIACC TFEC member Juliana Cafik. The panel that follows will discuss NZ’s Digital Identity Trust Framework, representing organisations that could be potential Relying Parties/Verifiers in Aotearoa under the DISTF regulation. The Trust Framework market model would see such parties seek out Digital Identity Service Provider/Issuers to deliver privacy-aware, cryptographically secured verified credentials, a topic that I blog about here. Publicly, it’s known that MSD and HNZ are piloting DIA’s platform, with RealMe as a notional issuer.

Additionally, the event will cover Digital Public Infrastructure, AI, biometrics, digital acceptance networks, digital drivers’ licences, the Metaverse, passkeys, digital cash, next generation payments, and the challenges of delegated administration across communities and much more. It’s all there, along with a panel of four experts who will review the sessions from a Te Ao Māori perspective.

In short, this year’s Digital Trust Hui Taumata will be like no other. The wait is over, and the rubber is hitting the road for the DISTF. What matters now is scale – will they come?

Lastly, I’m very excited to tell you that the DINZ podcast series is almost ready for launch so do keep an eye out for the first episode dropping very soon.

Ngā mihi
Colin Wallis
Executive Director, Digital Identity NZ

Read the full news here: Digital Trust Framework: Launch & Future | July Newsletter

SUBSCRIBE FOR MORE

The post Digital Trust Framework: Launch & Future | July Newsletter appeared first on Digital Identity New Zealand.

Tuesday, 23. July 2024

Digital Identity NZ

Will the Digital Trust Hui Taumata 2024 move the dial?


Deep thought has gone into building the agenda for next month’s Digital Trust Hui Taumata, ahead of Net Hui and The Point 2024, so that conversations live on and build out later in the year and into subsequent years. 

Significant attention will be devoted to Trust Frameworks given the Digital Identity Services Trust Framework (DISTF) regulation coming into play on 1 July. Immediately following Minister Collins’ opening remarks, Microsoft’s global identity standards lead and past DIACC TFEC member Juliana Cafik will deliver an intensely interesting first keynote – The international landscape for Digital Identity Trust Frameworks and how NZ compares. Trust frameworks already exist and we use them daily – for example using your bank card to withdraw cash from another bank’s ATM. The panel that follows, representing organisations that could be potential Relying Parties (RPs)/Verifiers under the DISTF, discuss how they see Trust Frameworks playing out. To be relieved of the burden and to minimise risk, these parties notionally look for accredited Digital Identity Service Provider/Issuers to deliver privacy-aware, cryptographically secured verified credentials.  

Two of these three panellists come from regulated industries while the other is a key government agency, where in all cases the failure to verify parties correctly could have devastating consequences. Other regulated industries and government agencies that need similar verification processes include estate agents, rental companies, law firms, financial services, insurance companies, pharmacies, doctors’ surgeries, the Police Vetting Service, driver licensing, firearms licensing, the box store where you take out a loan for your new appliance, registering for a loyalty scheme – and the list goes on. Representatives from the Regulator, the DISTF Trust Framework Authority, will lead a Roundtable discussion after lunch where delegates can pose their questions.   

There are multiple paths to achieve this nirvana of privacy-aware, cryptographically secured verified credentials available to all people under the auspices of a Trust Framework, which is why, straight after the Trust Frameworks panel, Worldline’s Conrad Morgan will keynote a complementary path – ‘Turning transactions into interactions – building New Zealand’s first digital identity acceptance network’. 

Trust Frameworks are increasingly supported by biometrics and AI – both of which need demystifying for the public to gain confidence in them – along with Digital Public Infrastructure, the Metaverse, passkeys, digital cash, digital driver’s licences, next generation payments, the critical need for digital inclusion, and the challenges of delegated administration across communities. The agenda comprises local and international speakers covering these topics, all reviewed by a panel of four experts from a Te Ao Māori perspective. 

The richness of the content to be presented at this year’s event is incomparable with previous years. So do not be surprised when the 2024 Digital Trust Hui Taumata is dropped into conversations in years to come.    

Colin Wallis, Executive Director, DINZ

The post Will the Digital Trust Hui Taumata 2024 move the dial? appeared first on Digital Identity New Zealand.


Energy Web

Energy Web Announces Strategic Partnership with Acurast to Advance Sustainability and Innovation in…

Energy Web Announces Strategic Partnership with Acurast to Advance Sustainability and Innovation in Energy Sector

Integration of Decentralized Compute Networks to Enhance Efficiency and Sustainability in Global Energy Landscape

July 23, 2024 — ZUG, Switzerland — Energy Web, a pioneer in developing open-source technology solutions for the energy sector, is thrilled to announce a strategic partnership with Acurast, an innovative leader in decentralized computing. This collaboration marks a significant step forward in enhancing the capabilities of both platforms while driving sustainability and technological innovation across the global energy landscape.

The partnership aims to seamlessly integrate Energy Web worker node networks with Acurast’s Decentralized Compute network. This integration will enable Energy Web users to host Energy Web workers on Acurast’s secure and widely distributed compute protocol. The primary goal is to facilitate a more efficient and scalable deployment of digital energy solutions.

In a move to expand its digital footprint, Energy Web will leverage the Acurast SDK to roll out a new mobile application. This collaboration will not only enhance mobile accessibility but also significantly improve functionality, providing users with robust tools for managing their energy resources efficiently.

Both Acurast and Energy Web Foundation are committed to sustainability. Acurast’s approach to upcycling smartphones, giving them a second life as compute units in its decentralized network, dramatically reduces electronic waste and promotes efficient resource use. Similarly, Energy Web Foundation is dedicated to accelerating the clean energy transition through its development of cutting-edge, open-source technologies for energy systems.

By combining their unique resources and expertise, Acurast and Energy Web Foundation aim to foster significant innovation, efficiency, and sustainability in the energy sector. This partnership underscores their shared vision of a more sustainable and decentralized future, driving positive change across communities worldwide.

About Energy Web
Energy Web is a global non-profit organization accelerating the energy transition by developing and deploying open-source decentralized technologies. Our solutions leverage blockchain to enable new market mechanisms and decentralized applications that empower energy companies, grid operators, and customers to take control of their energy futures.

Energy Web Announces Strategic Partnership with Acurast to Advance Sustainability and Innovation in… was originally published in Energy Web on Medium, where people are continuing the conversation by highlighting and responding to this story.


Kantara Initiative

US Multiple Award Schedule requires CSPs to be NIST 800-63 compliant  

In May 2024, the US government’s General Services Administration updated its Multiple Award Schedule (MAS) Contract with a new Special Item Number (SIN 541519CSP, Credential Service Providers) under the IT Large Category. SIN 541519CSP is designed to help federal agencies ensure that any IT services procured meet the requirements of “National Institute of Standards and Technology (NIST) Special Publication (SP) 800-63 requirements and digital identity compliant services”. To provide credential services under the new SIN, companies must meet specific instructions and requirements.

SIN 541519CSP was created to meet the increasing need for robust, trustworthy credential service providers. The new SIN will help government agencies quickly identify credential service providers that have been vetted against the government’s standard requirements. If your company offers credential services and meets the requirements, obtaining SIN 541519CSP will place your company in a better position to capture bids as agencies look to acquire NIST 800-63 compliant services.

In order to be included on the Schedule, Credential Service Providers must either be listed on the Kantara Trust Status List or provide a letter of approval from Kantara Initiative, or another GSA-approved third party that can assure conformance to NIST SP 800-63. To begin the process, you’ll need to complete forms that can be found on idmanagement.gov.

Since both state and federal government agencies are permitted to use the vendors on this schedule for credential services, this considerably extends opportunities for Kantara certified companies.

Read the full instructions on how to be included on MAS, including the Technical Evaluation Criteria.

The post US Multiple Award Schedule requires CSPs to be NIST 800-63 compliant   appeared first on Kantara Initiative.


Blockchain Commons

2024 Q2 Blockchain Commons Report


Blockchain Commons is a not-for-profit organization that advocates for the creation of open, interoperable, secure, and compassionate digital infrastructure. Our goal is to enable people to control their own digital destiny and to maintain their human dignity online. We do this through the creation of interoperable specifications and reference software that demonstrate how to create and manage digital assets in ways that are private, independent, resilient, and open.

In Q2 of 2024, we advanced these principles through the following work:

Gordian Envelope Updates
  Expanded Developer Pages
  Request/Response Presentation
  Graph Representation
Gordian Meetings
  FROST Presentation
  PayJoin Presentation
  All the Rest
Seedtool-Rust Release
  seedtool-cli-rust
  Seedtool Manual
dCBOR Adoption
  cbor.me
  cbor2
  QCBOR
  IANA Assignment of Tag 201
GSTP Improvements
SSH Research
  ssh-envelope Experiment in Python
  SSH Key Support for envelope-cli
  More to Come
Architectural Articles
  Minimum Viable Architecture
  Authentication Patterns
DID Futures
  W3C DID 1.1 WG
  RWOT 13
Grants/Funding
What’s Next?

Gordian Envelope Updates

Gordian Envelope, Blockchain Commons’ “Smart Document” system, continues to be a major focus. Here’s what that meant in Q2.

Expanded Developer Pages. The developer pages were updated with a new executive summary and feature list to clarify the capabilities and advantages of using Envelope. (More executive summaries of our technology to follow!)

Request/Response Presentation. Our May Gordian Developers Meeting included a presentation on Request/Response, which is an interoperable communication methodology using Gordian Envelope. Why use it? It can make complex digital-asset procedures more accessible by using automation to dramatically reduce the amount of human interaction needed, yet it also preserves security by ensuring that human choices are required whenever data is transmitted from one device to another. (But watch the presentation for more!)

Graph Representation. Blockchain Commons has a new research paper out on Representing Graphs with Envelope, which presents a proposed architecture for representing many types of graphs, enabling the use of Envelope for a variety of graph-based structures and algorithms.

Gordian Meetings

Gordian Developer Meetings are how we bring the wallet community together to talk about our interoperable specifications. We’ve been thrilled to expand that in the last quarter with some feature presentations from experts in the field.

FROST Presentation. April saw a special presentation on FROST by Jesse Posner that not only talked about his work to date, but also some of the emerging capabilities of FROST, such as the ability to regenerate shares or even change thresholds without changing the underlying secret! We’ve long thought FROST was a great next-generation resilience solution for digital assets, and so appreciate Jesse talking to our community about why it’s so exciting. See the complete video of our April meeting for more.

PayJoin Presentation. Privacy is one of our fundamental principles for Gordian design. It’s also a principle that will be better supported in Bitcoin with a new version of PayJoin. Dan Gould was kind enough to give a full presentation on the updates he’s working on at our May meeting. We’ve got a video of just his PayJoin presentation.

All the Rest. Both meetings of course also included details on Blockchain Commons’ own work (much of which is detailed in this report). The Gordian Developer meetings continue on the first Wednesday of every month. We’ve also already scheduled a few feature presentations for the rest of the year. On August 7th, we’ll have a special presentation on BIP-85, then on December 4th, we’ll have another FROST presentation for wallet developers. If you’d like to make a special presentation in September, October, or November on a topic of interest to wallet developers, let us know!

Also, if you’re a cryptographer, spec designer, or library developer who is working to implement FROST, please be sure to sign up for our FROST implementers announcements-only list so that you can receive invites for our second FROST Implementers Round Table, which will be on September 18 thanks to support from the Human Rights Foundation (HRF).

Seedtool-Rust Release

Blockchain Commons’ newest reference application is seedtool-cli for Rust.

seedtool-cli-rust. Seedtool is a domain-specific application that allows the creation, reconstruction, translation, and backup of cryptographic seeds. Blockchain Commons’ new Rust-based Seedtool replaces our older C++-based CLI and provides broader support for Gordian Envelope, including offering Gordian Envelopes of SSKR shares that can back up a seed using Shamir’s Secret Sharing. Seedtool’s Gordian Envelopes can then be piped into envelope-cli-rust for compression, encryption, or the addition of further metadata.

Seedtool Manual. For more on seedtool-cli-rust, check out the full user manual, which explains how to use all of its functionality and why it’s important.

dCBOR Adoption

dCBOR is one of the foundations of Envelope, as it allows for the deterministic ordering of data, which is crucial for a hashed data system like Envelope. The IETF dCBOR Internet-Draft was updated from v8 to v10 over Q2, with most of those changes due to expanding support for the spec. We’re still hoping to see the Internet-Draft finalized soon!

cbor.me. The CBOR Playground is Carsten Bormann’s foundational diagnostic site for CBOR. It now supports dCBOR thanks to a new Ruby Gem that Carsten authored.

cbor2. Joe Hildebrand’s cbor2 library for Typescript has also been expanded to support dCBOR.

QCBOR. Laurence Lundblade’s QCBOR library (which is written in C) now supports dCBOR in its development branch.

IANA Assignment of Tag 201. Finally, 201 is now officially the “enclosed dCBOR” tag for CBOR. This is also critical for Gordian Envelope, which uses this tag to wrap dCBOR in each of an envelope’s “leaf” nodes.

GSTP Improvements

Gordian Sealed Transaction Protocol (GSTP) is a Gordian Envelope extension. It allows for Envelope Requests and Responses to be sent in a secure way and is a critical element of Blockchain Commons’ Collaborative Seed Recovery system, which enables the storage of SSKR shares in a Gordian Depository.

GSTP Advances. Thanks to support from our Research Sponsor, Foundation Devices, Blockchain Commons was able to devote considerable engineering effort to GSTP in the last quarter, resulting in more fluent API patterns for building GSTP requests and responses. In addition, GSTP now supports bidirectional self-encrypted state with a unique and powerful new feature that we are calling Encrypted State Continuations (ESC). Overall, GSTP is a system that is secure, distributed, and transportation-agnostic. In a world where we could be sending digital-asset info by NFC, Bluetooth, or QR codes, it’s a critical security measure. See our presentation from the most recent Gordian Developers Meeting for more!

SSH Research

SSH has long been used as an authentication system, primarily for accessing UNIX computers. However, it has recently seen increasing use as a signing system as well, primarily thanks to extensions in Git. That has led Blockchain Commons to experiment with the integration of SSH keys into Envelope. (This has also demonstrated the flexibility of Envelope through the addition of these signing methodologies.) We’ve now got some first results.

ssh-envelope Experiment in Python. Early in the quarter, we produced ssh-envelope, an experimental Python program that worked with both ssh-keygen and envelope-cli. But, thanks to some very rapid development, we’ve already moved beyond that.

SSH Key Support for envelope-cli. We’ve since integrated SSH key support throughout our Rust stack, primarily affecting our bc-components and bc-envelope Rust crates. This allowed us to bring our SSH key support fully into the Rust envelope-cli, which you can now use for SSH signing.

More to Come. We’re still working on processes that will allow for the safe, secure, and reliable signing of software releases, something that we talked about extensively in our software use cases. You can see some more of our work-in-progress in a discussion of SSH Key Best Practices. We hope to have more on using SSH to enable resilient & secure software releases later in the year.

Architectural Articles

Blockchain Commons expresses a lot of its more architectural thoughts as articles. There were two major articles in Q2.

Minimum Viable Architecture. Our first major article for the quarter focused on the methodology of Minimum Viable Architecture (MVA). Many companies still focus on Minimum Viable Products. Our article advocates instead looking at the big picture (with lots of discussion on why that’s important).

Authentication Patterns. Design patterns are a crucial element in architectural design. Much as with the adversaries found in #SmartCustody, design patterns allow you to put together a larger system piece by piece. As part of a guide to the strength of heterogeneity in architectural design, Blockchain Commons penned a set of authentication design patterns. We’d like to do more to fill out the space, but for now feel like this is a good first cut that shows the value of the design style.

DID Futures

The Blockchain Commons principals have been involved with DIDs since Christopher Allen founded Rebooting Web of Trust in 2015.

W3C DID 1.1 WG. After a hiatus, the W3C DID working group has been rechartered through 2026. Christopher Allen continues as an Invited Expert, focused on a variety of privacy issues, including elision, DID registration, and DID resolver issues.

RWOT 13. Meanwhile, Rebooting the Web of Trust continues to be on the frontline for DID advancements, with Christopher still the chair of the organization and Shannon Appelcline the editor-in-chief. RWOT13 is finally back in the USA, with the early bird deadline for advance-reading papers at the start of August.

Grants/Funding

As we’ve written elsewhere, funding has become more difficult in the last year because of large-scale financial factors such as inflation and the resultant increase in interest rates. Blockchain Commons has responded by working more closely with some of our partners on topics of special interest to them and by seeking out grants.

Thanks to Human Rights Foundation for their grant enabling our continued support of FROST work.

Thanks to Foundation Devices for their support of GSTP work.

Thanks to Digital Contract Design for their support of our advocacy over the last year.

Please consider becoming a personal or corporate sponsor of Blockchain Commons so that our work can continue. Or, if you want support to integrate or expand one of Blockchain Commons’ existing projects (such as SSKR, Envelope, or the Gordian Depositories) in an open manner, to meet your company’s needs, contact us directly about becoming a Research Sponsor.

Also, please let us know of any grants or awards that you think would be closely aligned with our work at Blockchain Commons, so that we can apply.

What’s Next?

Coming up:

More work on Envelope & GSTP. More reveals of our SSH work. New musings on cryptographic “cliques”.

We’re looking forward to Q3!

Monday, 22. July 2024

FIDO Alliance

Strengthening Authentication with Passkeys in Automotive and Beyond

On July 16th, 2024, the FIDO Alliance held a seminar focused on the fit for FIDO authentication and device onboarding within the automotive industry. Co-hosted with Swissbit, the event had […]

On July 16th, 2024, the FIDO Alliance held a seminar focused on the fit for FIDO authentication and device onboarding within the automotive industry. Co-hosted with Swissbit, the event had over 100 attendees who heard from various stakeholders on the need and opportunity for standards-based approaches to securing the automotive workforce and manufacturing process. Themes included how passkeys and FIDO-certified biometrics can help transform the future of in-vehicle experiences, especially with in-car payments, smart cars, and IoT.

FIDO Momentum in the Automotive Industry

Like just about every market sector, the automotive industry is plagued by risks and ramifications associated with decades of relying on passwords – and is also uniquely poised to improve the user experience by embracing passkeys for user authentication.

With smart cars having embedded technology to connect to digital experiences, there are several innovations primed for take-off in the automotive industry. With nearly 100 million vehicles expected to be making payments by 2026, up from just 2.3 million in 2021, passkeys will be crucial to simplifying the in-vehicle user experience. At the same time, manufacturers have the opportunity to improve IoT and secure embedded devices to enhance customer experiences on and off the road.

Manufacturing and Smart Car Case Studies

On the workforce front, the event featured a case study from MTRIX and considerations on how to deploy FIDO security keys to a manufacturer’s workforce – contemplating the many types and locations of workers for today’s global manufacturers. This case study reinforced the factors called out in a presentation by Infineon on the regulatory-driven push and pull with FIDO authentication.

VinCSS described how FIDO Device Onboard is being used today to secure the smart car ecosystem both at point of manufacturing as well as for after-market use cases.

Using Passkeys for In-Vehicle Payments

The final block of sessions looked more closely at our in-vehicle future – including an overview of current trends for in-vehicle payments. Visa and Starfish then presented a blueprint and demo respectively for a standards-based approach for in-vehicle payments before Qualcomm wrapped things up with their vision for a digital chassis as the foundation for a software-defined vehicle that contemplates the need for secure identity, payments and driver/passenger personalization.

Driving FIDO in the Automotive Industry – Next Steps

Interested in this seminar’s content? Find these presentations and more on the Munich Seminar event page.

The FIDO Alliance welcomes input from the public and the identity security community on FIDO’s future in the automotive industry. Comments are welcome via our contact us page. For in-person connections, we encourage identity security and authentication professionals to join us at our conference, Authenticate, where there will be several automotive and passkey related sessions, content, and peer networking. This year’s event will be held Oct. 14-16, 2024, in sunny southern California at the La Costa Omni Resort in Carlsbad, CA.


FIDO Munich Seminar: Strengthening Authentication with Passkeys in Automotive and Beyond

The FIDO Alliance recently held a seminar in Munich for a comprehensive dive into FIDO authentication and passkeys. The seminar, co-hosted by Swissbit, provided an exploration of the current state […]

The FIDO Alliance recently held a seminar in Munich for a comprehensive dive into FIDO authentication and passkeys. The seminar, co-hosted by Swissbit, provided an exploration of the current state of passwordless technology, detailed discussions on how passkeys work, their benefits, case studies, and practical implementation strategies. Attendees learned about current and emerging elements of the FIDO Certified program and how they pertain across sectors, including a focus on automotive and payments use cases. 

Attendees also had the opportunity to engage directly with those who are currently implementing FIDO technology through open Q&A and networking – plus the opportunity to see demos and meet the experts that can help move FIDO deployments forward.

View the seminar slides below.

FIDO Munich Seminar Introduction to FIDO.pptx from FIDO Alliance

FIDO Munich Seminar Blueprint for In-Vehicle Payment Standard.pptx from FIDO Alliance

FIDO Munich Seminar FIDO Automotive Apps.pptx from FIDO Alliance

FIDO Munich Seminar: Biometrics and Passkeys for In-Vehicle Apps.pptx from FIDO Alliance

FIDO Munich Seminar: Strong Workforce Authn Push & Pull Factors.pptx from FIDO Alliance

FIDO Munich Seminar: Securing Smart Car.pptx from FIDO Alliance

FIDO Munich Seminar In-Vehicle Payment Trends.pptx from FIDO Alliance

FIDO Munich Seminar Workforce Authentication Case Study.pptx from FIDO Alliance

FIDO Munich Seminar: FIDO Tech Principles.pptx from FIDO Alliance

FIDO Munich Seminar Considerations for Workforce Authentication from FIDO Alliance

Energy Web

ECS4DRES: Shaping the Future of Renewable Energy Systems

A New Horizon Europe Project to Enhance Reliability and Resilience in Distributed Renewable Energy Across Europe We are excited to announce our new EU project, Electronic Components and Systems for Flexible, Coordinated, and Resilient Distributed Renewable Energy Systems (ECS4DRES). This groundbreaking initiative is co-funded by Horizon Europe and the Federal Government. In collaboration wi
A New Horizon Europe Project to Enhance Reliability and Resilience in Distributed Renewable Energy Across Europe

We are excited to announce our new EU project, Electronic Components and Systems for Flexible, Coordinated, and Resilient Distributed Renewable Energy Systems (ECS4DRES). This groundbreaking initiative is co-funded by Horizon Europe and the Federal Government.

In collaboration with 33 partners across 6 European countries, ECS4DRES aims to revolutionize the reliability, safety, and resilience of Distributed Renewable Energy Systems (DRES). By developing advanced monitoring and control technologies, the project will incorporate integrated sensors with energy harvesting functions, capable of various types of detection for safety and monitoring of energy transfers. Additionally, ECS4DRES will achieve interoperable and low-latency communication systems, along with sophisticated algorithms, AI tools, and methods. These innovations will enable the widespread interconnection, monitoring, and management of numerous DRES, subsystems, and components, optimizing energy management between sources, loads, and storages, enhancing power quality, and ensuring resilient system operation.

ECS4DRES is committed to thorough validation of these technologies through a series of five relevant use cases and demonstrators. The project’s results will generate a wide range of scientific, technological, economic, environmental, and societal impacts on a global scale, meeting the needs of Original Equipment Manufacturers (OEMs), Distribution System Operators (DSOs), grid operators, EV charging station aggregators, energy communities, end customers, and academia.

By providing interoperable and tailored solutions in electronic control systems, sensor technology, and smart systems integration, ECS4DRES will facilitate the deployment and efficient, resilient operation of DRES, including the integration of hydrogen equipment and components.

As we embark on this ambitious project, we are reminded of the words of renowned futurist Alvin Toffler: “The great growling engine of change — technology.” ECS4DRES represents a significant leap forward in the technological advancement of renewable energy systems, driving us toward a more sustainable and resilient future.

About Energy Web
Energy Web is a global non-profit organization accelerating the energy transition by developing and deploying open-source decentralized technologies. Our solutions leverage blockchain to enable new market mechanisms and decentralized applications that empower energy companies, grid operators, and customers to take control of their energy futures.

ECS4DRES: Shaping the Future of Renewable Energy Systems was originally published in Energy Web on Medium, where people are continuing the conversation by highlighting and responding to this story.


Identity At The Center - Podcast

Join us on The Identity at the Center Podcast as we sit down

Join us on The Identity at the Center Podcast as we sit down with Joseph Carson, Chief Security Scientist and Advisory CISO at Delinea. In this new episode, we explore Joseph's fascinating journey in identity and access management, cybersecurity, and his firsthand experiences in Estonia's digital identity ecosystem. We delve into the challenges and triumphs of digital identity, the emerging field

Join us on The Identity at the Center Podcast as we sit down with Joseph Carson, Chief Security Scientist and Advisory CISO at Delinea.

In this new episode, we explore Joseph's fascinating journey in identity and access management, cybersecurity, and his firsthand experiences in Estonia's digital identity ecosystem. We delve into the challenges and triumphs of digital identity, the emerging field of ITDR, and the intersection of digital identity, authentication, and AI in cybersecurity.

Watch the episode: https://www.youtube.com/watch?v=klBxFLvUC78

More Info: idacpodcast.com

#iam #podcast #idac

Thursday, 18. July 2024

FIDO Alliance

Battling Deepfakes with Certified Identity Verification

The digital transformation and the proliferation of e-identity schemes have escalated the need for secure and reliable online identity verification methods, especially in light of the alarming trend of AI-generated […]

The digital transformation and the proliferation of e-identity schemes have escalated the need for secure and reliable online identity verification methods, especially in light of the alarming trend of AI-generated “deepfakes.” As internet users have learned about the growing threat of deepfakes, they have become increasingly concerned about their identities being spoofed online, according to a new study conducted by the FIDO Alliance. As a result, awareness of deepfakes, and concern about the risks associated with them, has steadily increased.

Amidst this landscape, the FIDO Alliance released its newest research in the eBook, Remote ID Verification – Bringing Confidence to Biometric Systems Consumer Insights 2024, which reveals insights from an independent study surveying 2,000 respondents in the U.S. and the U.K. on consumer perceptions of remote identity verification, online security, and biometrics. While the data showed consumer awareness and adoption of biometrics are increasing, consumers also expressed concerns about the rise of AI-generated deepfakes – reinforcing the need for preventative strategies and technologies focused on secure remote identity verification.

What is a “deepfake”?

According to the Center for Internet Security, a deepfake consists of convincingly fabricated audio and video content designed to mislead audiences into believing that fabricated events or statements are real. These manipulations can create realistic yet entirely false representations of individuals through synthetic images or complete video footage. This manipulated audio/video content is dangerously effective at spreading false information. In cybersecurity, deepfakes are increasingly being used to spoof identities to fraudulently open accounts or take control of existing accounts.

With the advent of AI and the increasing use of face biometrics for remote identity verification, the deepfake risks to remote identity proofing (RIDP) methods have become a reality. Security researchers have been closely evaluating the identity verification risks associated with deepfakes to increase awareness of the rapidly changing threat landscape and to support stronger countermeasures that enhance the trustworthiness and reliability of RIDP methods. In the European Union Agency for Cybersecurity’s (ENISA) latest remote ID report, researchers observed that deepfake injection attacks are increasing and becoming more difficult to mitigate.

Users Express Concerns about Deepfakes and ID Verification

With the rise of generative AI and deepfake videos in the news, there has been heightened consumer unease about the security of biometrics for online verification. In the FIDO Alliance’s study, the deepfake trend has not escaped the attention of consumers, who are increasingly using face biometrics to authenticate their identities online and are concerned about identity security.

On one hand, the study reinforced consumer preference for using biometrics in remote identity verification, with nearly half of the respondents indicating a preference to use face biometrics, especially for sensitive transactions, like financial services (48%). 

On the other hand, just over half of respondents revealed they are concerned about deepfakes when verifying identities online (52%).

Building Consumer Trust in Face Biometrics

As the concerns around deepfake security threats gain prominence, the industry has taken a significant step forward with the FIDO Alliance’s newly introduced Identity Verification certification program for Face Verification. This industry-first testing certification program, based on ISO standards, with requirements developed by the FIDO Alliance, aims to measure accuracy, liveness (including deepfake detection), and bias (including skin tone, age, and gender) in remote biometric identity verification technologies. By providing a framework for testing biometric performance and a network of accredited laboratories worldwide, this certification program standardizes and evaluates the performance of face verification systems while mitigating the impact of bias and security threats, like deepfakes.

Certifying Identity Verification with the FIDO Alliance

The Identity Verification certifications that the FIDO Alliance provides offer industry providers the ability to demonstrate commitment to addressing bias and security threats in remote biometric identity verification technologies. With a focus on standardizing and enhancing the performance of face verification technologies, the Alliance released its new FIDO Certification Program to elevate the performance, security, and equity of biometric solutions for remote identity verification. Combined with its Document Authenticity (DocAuth) Certification Program, these two certifications work together to ensure identity verification solution providers can leverage FIDO’s independent testing and accredited laboratories as a market differentiator. 

What is the value for IDV Biometric Vendors?

Independent validation of biometric performance
Opportunity to understand gaps in product performance to then improve and align with market demands
Demonstrate product performance to potential customers
Improve market adoption by holding an industry-trusted certification
Leverage one certification for many customers/relying parties
Benefit from FIDO delta and derivative certifications for minor updates and extendability to vendor customers
Reduce need to repeatedly participate in vendor bake-offs

What is the value for Relying Parties?

One-of-a-kind, independent, third-party validation of biometric performance assessing accuracy, fairness and robustness against spoofing attacks
Provides a consistent, independent comparison of vendor products – eliminating the burden of maintaining own program for evaluating biometric products
Accelerates FIDO adoption to password-less
Commitment to ensure quality products for customers of the relying parties
Requirements developed by a diverse, international group of stakeholders from industry, government, and subject matter experts
Conforms to ISO FIDO Annex published in ISO standards

What is the value of accredited laboratories?

FIDO Accredited Laboratories are available worldwide and follow a common set of requirements and rigorous evaluation processes defined by the FIDO Alliance Biometrics Working Group (BWG), along with all relevant ISO standards. These laboratories are audited and trained by the FIDO Biometric Secretariat to ensure lab testing methodologies are compliant and utilize governance mechanisms per FIDO requirements. Laboratories perform biometric evaluations in alignment with audited FIDO accreditation processes. In contrast, bespoke, single-laboratory biometric evaluations may not garner sufficient trust from relying parties for authentication and remote identity verification use cases.

What are the ISO Standards that FIDO certification conforms to?

When a vendor invests in FIDO’s Face Verification Certification, they and their accredited lab are adhering to the following ISO standards:

Terminology
ISO/IEC 2382-37:2022 Information technology — Vocabulary — Part 37: Biometrics

Presentation Attack Detection
ISO/IEC 30107-3:2023 Information technology — Biometric presentation attack detection — Part 3: Testing and reporting
ISO/IEC 30107-4:2020 Information technology — Biometric presentation attack detection — Part 4: Profile for testing of mobile devices
FIDO Annex, published 2024

Performance (e.g., FRR, FAR)
ISO/IEC 19795-1:2021 Information technology — Biometric performance testing and reporting — Part 1: Principles and framework
ISO/IEC 19795-9:2019 Information technology — Biometric performance testing and reporting — Part 9: Testing on mobile devices
FIDO Annex, published 2019

Bias (differentials due to demographics)
ISO/IEC 19795-10:2024 Information technology — Biometric performance testing and reporting — Part 10: Quantifying biometric system performance variation across demographic groups
FIDO Annex, under development

Laboratory
ISO/IEC 17025:2017, General requirements for the competence of testing and calibration laboratories

Learn More about FIDO IDV Certification

As organizations and policymakers navigate the evolving landscape of digital identity verification, these consumer insights serve as a testament to the pressing need for independently tested and accurate biometric systems. The FIDO Alliance’s new Face Verification Certification Program offers solution providers the opportunity to demonstrate deepfake prevention to relying parties and end users by testing for security, accuracy, and liveness.

Download the Remote ID Verification eBook here today, and discover the world-class offerings from FIDO’s certified providers that have invested in independent, accredited lab testing with FIDO certification.


Energy Web

Green Proofs by Energy Web Now Available as a Service

Enables energy companies to rapidly construct digital registries for green commodities July 18, 2024 | Zug , Switzerland — Energy Web, a leading technology provider for the energy sector, is excited to announce the launch of Green Proofs as a Service, an advanced, cloud-based version of their acclaimed Green Proofs solution. This new offering enables businesses and organizations to rapidly c
Enables energy companies to rapidly construct digital registries for green commodities

July 18, 2024 | Zug , Switzerland — Energy Web, a leading technology provider for the energy sector, is excited to announce the launch of Green Proofs as a Service, an advanced, cloud-based version of their acclaimed Green Proofs solution. This new offering enables businesses and organizations to rapidly construct digital registries for tracking, tracing, and exchanging digital certificates representing any green commodity with unprecedented flexibility and control.

Green Proofs as a Service includes the following key features:

Customized Data Formats and Schema: Users can tailor data formats and schema specific to different green commodities, enabling any green commodity and associated data format to be supported.
Configurable Business Logic and Rules: Administrators can define and adjust business logic and rules for the creation, transfer, issuance, and retirement of certificates, providing full control over the certification process.
Comprehensive Registry Administration: The service includes all functionalities expected of a registry administrator, such as the ability to add and remove users from individual companies or multiple companies, enhancing security and user management.

Green Proofs has already demonstrated its efficacy and reliability in supporting multiple enterprise solutions. Notable implementations include the RMI / EDF Sustainable Aviation Fuel Certificate Registry, a low-carbon shipping registry, and multiple 24/7 renewable energy matching solutions. These use cases highlight the versatility and robustness of Green Proofs in real-world applications.

“Green Proofs as a Service marks a significant milestone for Energy Web and our commitment to driving innovation in the energy sector,” said Mani Hagh Sefat, CTO of Energy Web. “By offering Green Proofs via an as-a-service model, we help our clients innovate much faster by quickly putting a digital registry into their hands for experimentation and rapid prototyping.”

Green Proofs as a Service is now available to businesses and organizations worldwide that are interested in using digital registries to support any green commodity supply chain. For more information or to schedule a demo, please visit www.energyweb.org or contact hello@energyweb.org.

About Energy Web
Energy Web is a global non-profit organization accelerating the energy transition by developing and deploying open-source decentralized technologies. Our solutions leverage blockchain to enable new market mechanisms and decentralized applications that empower energy companies, grid operators, and customers to take control of their energy futures.

Green Proofs by Energy Web Now Available as a Service was originally published in Energy Web on Medium, where people are continuing the conversation by highlighting and responding to this story.


Origin Trail

From Barcodes to Digital Links: Supercharging Trillions of Products for the Next 50 Years

Celebrating 50 Years of the GS1 Barcode June 26 marked the 50th anniversary of the GS1 barcode, commemorating the first-ever product scan at a cash register checkout. Over the decades, billions of products worldwide have been equipped with barcodes, streamlining and standardizing supply chain processes and adhering to GS1 standards. As consumer demand for product information grew, regulator
Celebrating 50 Years of the GS1 Barcode

June 26 marked the 50th anniversary of the GS1 barcode, commemorating the first-ever product scan at a cash register checkout. Over the decades, billions of products worldwide have been equipped with barcodes, streamlining and standardizing supply chain processes and adhering to GS1 standards.

As consumer demand for product information grew, regulatory requirements became stricter, and supply chain optimization pressures increased, the need for an updated barcode became evident. Enter the GS1 Digital Link, the barcode upgrade designed to provide dynamic access to comprehensive product information. Now, with leading retail and consumer goods companies actively supporting the transition to Digital Link QR codes, the stage is set for the traditional barcode to retire gracefully.
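For a concrete (illustrative) sense of the format: a Digital Link URI encodes GS1 Application Identifiers as path segments, so a URI like https://id.gs1.org/01/09506000134352 identifies a product by its GTIN (AI 01), and further segments can carry lot or serial data. The GTIN below is an example value, and the sketch simply pairs up path segments; real implementations follow the full GS1 Digital Link syntax.

// Rough sketch of pulling AI/value pairs out of a Digital Link URI.
fn parse_digital_link(uri: &str) -> Vec<(String, String)> {
    // The path alternates GS1 Application Identifier / value:
    // https://id.gs1.org/01/09506000134352/10/LOT42
    uri.split('/')
        .skip(3) // scheme, empty segment, host
        .collect::<Vec<_>>()
        .chunks(2)
        .filter(|c| c.len() == 2)
        .map(|c| (c[0].to_string(), c[1].to_string()))
        .collect()
}

fn main() {
    let pairs = parse_digital_link("https://id.gs1.org/01/09506000134352/10/LOT42");
    for (ai, value) in pairs {
        // "01" = GTIN, "10" = batch/lot
        println!("AI {} -> {}", ai, value);
    }
}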

Setting a Strong Foundation for Digital Link with OriginTrail

For products and brands to fully benefit from the GS1 Digital Link transition, a robust, connected, and verifiable data foundation is crucial. Product data is often split across various supply chain partners, including manufacturers, logistics providers, wholesalers, retailers, and others. To connect billions of products to the internet in a meaningful way that provides genuine insights and business value, this scattered product data needs to be interconnected.

Scanning a Digital Link on a product and seeing the manufacturer’s information, such as production date, description, ingredients, and brand details is good. Scanning the same code and accessing comprehensive information about the product’s journey through the supply chain — including whether the ingredients were ecologically produced, if the product was stored at proper temperatures during transport, and how long it was in the supply chain — is much better. This is the true potential of the Digital Link.

Beyond consumer engagement, consider a business operating a rail or plane network being able to access details on a component’s manufacture, testing, and maintenance by scanning a Digital Link code. That would surely have been invaluable in light of the recent Boeing aircraft incidents.

This is where the OriginTrail Decentralized Knowledge Graph (DKG) and GS1 Digital Link are a match made in heaven. The DKG provides a verifiable and interconnected knowledge base encompassing product data, supply chain events, certifications, locations, and more — across organizations and data sources. With the new DKG V8, OriginTrail introduces the scalability needed to bring billions of products equipped with Digital Link into a world of standards-based, connected, and verifiable data. And the new DKG Edge Node concept empowers organizations and business networks to exchange product and other supply chain data with just a few clicks while maintaining data privacy, verifiability, and connectivity.

Supply chain data from multiple sources connected in a verifiable Decentralized Knowledge Graph.

As a longstanding GS1 partner, OriginTrail designed the DKG to natively support GS1 standards, including EPCIS, Core Business Vocabulary (CBV), Global Data Model (GDM), and Digital Link. This integration means that consumers, regulators, brands, and other stakeholders can access richer, more comprehensive, and trusted product data. The challenge now is to make this user experience seamless and simple, and there’s a technology perfect for the job — Artificial Intelligence (AI).

OriginTrail, Digital Link, and AI: A Consumer Engagement Power Throuple

Incorporating AI into the mix creates an incredibly powerful technology trio, enabling brands to enhance consumer engagement, based on connected and verifiable data spanning organizations, in unprecedented ways. And with the DKG Edge Node, AI capabilities come natively. Brands can thus offer personalized and tailored experiences by allowing customers to scan a product with a Digital Link QR code and ask anything — from brand details to product origins, sustainability, and environmental impact, all based on verifiable data from OriginTrail DKG.

This combination not only benefits consumers but also provides brands with valuable insights into customer preferences, allowing them to refine their business strategies. As billions of products transition from barcodes to Digital Link, the potential of this technology trio becomes evident. In fact, AI-powered product discovery, based on OriginTrail and Digital Link, is no longer a future concept but a current reality:

Some additional examples to check out:

Check the origin » Perutnina Ptuj Church of Oak Whiskey Distillery

Simultaneously, organizations can leverage AI to better understand and enhance their supply chains, ensuring they receive accurate and verifiable responses rooted in data from across their business network. By simply scanning a Digital Link QR code on a product, pallet, or shipping container, users are immediately empowered to ask questions and get verifiable answers — from basic queries like “Where was this product manufactured?” to more complex ones such as “Was the temperature in this shipping container in line with expectations?” and “Give me a list of all train wagons that are likely to experience issues with their wheels in the next month.” Exciting stuff indeed.

Where do we go from here?

As billions of products transition from traditional barcodes to Digital Link QR codes, establishing a robust foundation of connected and verifiable data becomes paramount. OriginTrail is at the forefront of this transformation, with the new DKG V8 offering the scalability and simplicity necessary to realize its full potential. When combined with AI, this technology trio unlocks immense opportunities for brands to engage with their customers in a trusted and meaningful way.

But consumer engagement is just one area set to benefit significantly from this transition. Regulatory bodies will gain streamlined access to verifiable product data, and supply chain management will become more proactive and efficient. The coming months and years promise exciting advancements and opportunities, making this a pivotal moment in the evolution of product information and consumer engagement.

We are excited to see OriginTrail at the epicenter of it all, as we — Trace Labs, the core developers of OriginTrail — along with our ecosystem partners get ready to unveil the Digital Link support via the new DKG V8 at the GS1 Industry & Standards Event. Over 1,000 business leaders from 80+ countries will come together virtually to solve today’s greatest business challenges through the development and adoption of GS1 global standards.

For the GS1 Industry & Standards Event, register at: https://standards-event.gs1.org/

From Barcodes to Digital Links: Supercharging Trillions of Products for the Next 50 Years was originally published in OriginTrail on Medium, where people are continuing the conversation by highlighting and responding to this story.

Wednesday, 17. July 2024

Ceramic Network

Optimizing Ceramic: How Pairwise Synchronization Enhances Decentralized Data Management

In the past months we have replaced the algorithm at the heart of the Ceramic database. This post explains why we made the change from multicast to pairwise synchronization, but first let’s review the design motivations of Ceramic. “Information also wants to be expensive. Information Wants To Be Free.

In the past months we have replaced the algorithm at the heart of the Ceramic database. This post explains why we made the change from multicast to pairwise synchronization, but first let’s review the design motivations of Ceramic.

“Information also wants to be expensive. Information Wants To Be Free. ...That tension will not go away.” Stewart Brand. There is tension because data storage is a competitive market, but data retrieval can only be done by the service that has your data. At 3Box Labs, we want to catalyze a data ecosystem by making community-driven data distribution not only possible but available out of the box. Ceramic is a decentralized storage solution for apps dealing with multi-party data, one that is more scalable, faster, and cheaper than the blockchain.

Data vendor lock-in

Many organizations and individuals have data that they want to publish, and Ceramic lets them do so without instant data vendor lock-in for storing their own data. In the Web2 era, data often becomes ensnared within exclusive services, restricting its accessibility and durability. Access to this data requires obtaining permission from the service provider. Numerous platforms, like GeoCities, Friendster, and Orkut, have vanished over the years, resulting in significant data loss. Even within still-existing companies like Google, numerous lost data products are documented. See Killed by Google.

We can break free from this risk by creating data-centric applications that multihome the data. Ceramic is the way to have many distinct data controllers publishing into shared tables in a trustless environment. Each reader can know who published what content and when they did, without relying on trusting the storage service to keep accurate audit logs. Since each event is a JSON Document, signed by a controller, timestamped by Ethereum, and in a documented schema, it can be preserved by any interested party, with or without permission from the storage vendor.

Multihome the data

In Ceramic we separate the roles of data controllers from the data servers. By allowing data to live on any preferred server, the data remains durable as long as any server is interested in preserving it. This allows data to outlive a particular data server, pairing the durability of data living in multiple places with the speed and reliability of operating on local data.

Document the schema

Throughout the history of the internet, we have witnessed numerous data services going away and taking their users’ data with them. While multihoming helps preserve data, it’s useless without the ability to interpret it.

Ceramic preserves data formats in two ways. The first is that the data lives in JSON Documents. This format allows us to reverse engineer and examine the data. The second is that the model schema gets published. The model schema contains both a json-schema and a human-language description that the original developer can use to give machine and human context to the data. This enables the preservation of both the data and the schema, so the data can be understood and new apps can be made to interact with the preserved data.

{ "data":{ "accountRelation":{ "type":"list" }, "description":"A blessing", "name":"Blessing", "relations":{ }, "schema":{ "$defs":{ "GraphQLDID":{ "maxLength":100, "pattern":"^did:[a-zA-Z0-9.!#$%&'*+\\/= ?^_`{|}~-]+:[a-zA-Z0-9.!#$%&'*+\\/=?^_`{|}~-]*:?[a-zA-Z0-9.!#$%&'*+\\/=?^_`{|}~- ]*:?[a-zA-Z0-9.!#$%&'*+\\/=?^_`{|}~-]*$", "title":"GraphQLDID", "type":"string" } }, "$schema":"https://json-schema.org/draft/2020-12/schema", "additionalProperties":false, "properties":{ "text":{ "maxLength":240, "type":"string" }, "to":{ "$ref":"#/$defs/GraphQLDID" } }, "required":[ "to" ], "type":"object" }, "version":"1.0", "views":{ "author":{ "type":"documentAccount" } } }, "Header":{ "controllers":[ "did:key:z6MkgSV3tAuw7gUWqKCUY7ae6uWNxqYgdwPhUJbJhF9EFXm9" ], "model":{ "/":{ "bytes":"zgEEAXFxCwAJaG1vZGVsLXYx" } }, "sep":"model" } }

Example schema document

Information retrieval 

The key to multihoming data is being able to retrieve the data from a server that has it.

How do we move the data from the servers that have the data to the servers that are interested in storing it? When we first made Ceramic we used two multicast methods: The first was to do a gratuitous announcement of new data. Send the data to EVERY node in the network so that they can store it if they are interested in it. Second, if a node did not know about a stream then when requested by a user it would multicast a request to the whole network and take the latest version to come back as a response.

This worked but had several drawbacks. The first is that requests for streams that a node did not know about used WAN traffic and had unpredictable latencies. This meant that all applications needed to design for slow, unpredictable retrieval times. The second drawback was that a node had no way to retrieve a complete set of the streams that matched its interests. It could only listen to the multicast channel and fetch any stream it happened to hear about. Any stream that it missed, either because it was published before the node came online or during downtime, could be missed forever. Third, there is a performance cost to sending requests to nodes that have no mutual interest with your node. A node that did 100 events a year could not scale down, since it would need to keep up with filtering announcements from nodes doing 100 events a second. If we wanted to support both very large and very small data-centric applications, we needed a new strategy. We even saw cases where a slow node that could not keep up on the multicast channel harmed the performance of larger, more powerful nodes.

To solve these problems of performance, completeness, and scalability, we switched to a pairwise synchronization model. Each node advertises the ranges of streams it is interested in. Each node only synchronizes the streams that are of mutual interest, and the nodes synchronize pairwise.

Scalability

Since the nodes synchronize pairwise, no slow node can harm the ability of two healthy nodes to complete a synchronization. If two nodes have no intersection in their interests then the conversation is done. A range of streams that has 100s of events per second that your node is not interested in will not create work for your node. A node only needs to scale to the speed of events in the ranges it is interested in and the scale of any model you are not interested in costs you nothing. This solved our scale up / scale down objective.

Completeness

If the two nodes do have an intersection of their interests, they will continue the synchronization until both nodes have ALL the events that the other node had when the synchronization began. A node no longer needs to be highly available, online either when a stream’s event was originally published or when some node queried for that stream. If the event is stored by either of the nodes, both nodes will have it at the end of the pairwise synchronization. Once a node has pairwise synchronized with each of the nodes that are advertising an interest range, that node has all of the events in that range as of the time of the synchronization. This solves the completeness objective.

More interestingly, this local completeness means that we can build local indexes over the events and run more complex queries, entirely locally, over the events in the ranges nodes are interested in.

Performance

Lastly, since we have a complete set of events for our interests, we can serve queries about the events from the local node with no need for WAN traffic. This solves the performance objective of predictable fetch latencies.

Pairwise Synchronization in Logarithmic rounds

In the multicast model, Ceramic sends messages to all other Ceramic nodes. One of the most notable differences with synchronization is that nodes do pairwise synchronization, one peer at a time. The two peers each send the other their interests. Both nodes filter the events that they have to find the set of events of mutual interest between the two nodes. Once this intersection is found, we synchronize the set with a Range-Based Set Reconciliation protocol we call Recon.

We can report progress in a Recon synchronization by reporting the percentage of events in the `in sync` vs. `syncing` ranges. Alternatively, we could render a bar, as in the diagram, showing which ranges are in which states.

This is a divide and conquer protocol. We start with the full intersection as a single range. We pull a range off the work list and send the (hash, count) of all events in the range to the other side. They compare their own (hash, count) and respond accordingly.

We have | They have | Action
hash_A | hash_A | Done. `in sync`
0 | hash_A | Send a request for the events. `Don’t have`
hash_A | 0 | Send the events. `in sync`
hash_A | hash_B | Split the range. Push sub-ranges from the split onto the work list. Each range `syncing`

The range splits are handled differently on the Initiator than on the Responder. The Initiator maintains the work list and pushes all of the subranges onto the work list. The Responder just sends a message back with multiple ranges and hashes for each range. This keeps the synchronization state on the Initiator and reduces the burden on the Responder to a stateless call and response. This fits Recon into the HTTP client-server request-response paradigm.
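To make the divide-and-conquer concrete, here is a deliberately simplified, self-contained Rust sketch of range reconciliation. It is our own illustration of the idea, not the actual Recon wire protocol or the ceramic-one implementation: real Recon runs over the network, and keeps the work-list state only on the Initiator.

use std::collections::BTreeSet;

// Digest of a half-open key range: (hash, count). The hash is the XOR of
// per-event hashes, so equal sets in a range always produce equal digests.
fn digest(set: &BTreeSet<u64>, lo: u64, hi: u64) -> (u64, usize) {
    let mut hash = 0u64;
    let mut count = 0;
    for &id in set.range(lo..hi) {
        // FNV-1a style mix of the event ID.
        let mut h = 0xcbf29ce484222325u64;
        for b in id.to_le_bytes() {
            h = (h ^ b as u64).wrapping_mul(0x100000001b3);
        }
        hash ^= h;
        count += 1;
    }
    (hash, count)
}

// Pairwise reconciliation: both sides end with the union of their events.
fn reconcile(a: &mut BTreeSet<u64>, b: &mut BTreeSet<u64>, lo: u64, hi: u64) {
    let mut work = vec![(lo, hi)]; // the initiator's work list
    while let Some((lo, hi)) = work.pop() {
        let (ha, ca) = digest(a, lo, hi);
        let (hb, cb) = digest(b, lo, hi);
        if ha == hb {
            continue; // same digest: range is `in sync`
        } else if ca == 0 {
            let missing: Vec<u64> = b.range(lo..hi).copied().collect();
            a.extend(missing); // `Don't have`: request the events
        } else if cb == 0 {
            let missing: Vec<u64> = a.range(lo..hi).copied().collect();
            b.extend(missing); // they have none: send the events
        } else if hi - lo > 1 {
            let mid = lo + (hi - lo) / 2; // split; each half is `syncing`
            work.push((lo, mid));
            work.push((mid, hi));
        }
        // A one-key range with both counts > 0 always has equal digests,
        // so the splitting cannot recurse forever.
    }
}

fn main() {
    let mut a = BTreeSet::from([1u64, 5, 9, 42]);
    let mut b = BTreeSet::from([5u64, 7, 42, 99]);
    reconcile(&mut a, &mut b, 0, 128);
    assert_eq!(a, b); // both now hold {1, 5, 7, 9, 42, 99}
    println!("synced: {a:?}");
}

Because each mismatching range is halved, the number of round trips grows with the logarithm of the keyspace rather than with the number of events, which is what makes the comparison cheap when the two sides are mostly in sync.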

Exponential distribution

Now that we have replaced a multicast message to all nodes in the network with pairwise sync, it is reasonable to ask whether we have broken the exponential distribution we got from multicast trees.

How fast can data spread through the network? Now that we have replaced the multicast channel with pairwise connections, how do we match the exponential distribution of the multicast channel? 

We get this property because each node cycles through connecting to all other nodes that advertise overlapping interests. When a node first receives an event from a client, there is 1 copy on the network. After the first sync there are 2. Then both of those nodes sync with new nodes, giving 4. This grows exponentially until almost all interested nodes have the data. At that point, the odds that any node with the event calls a node without it are small, but the odds that a node without the event calls a node with it are large. By using synchronization we get the benefits of both push and pull gossip protocols: push, which is fast when knowledge of the event is rare, and pull, which is fast when knowledge of the event is common.
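To make the doubling argument concrete, here is a tiny self-contained simulation (our own sketch, not ceramic-one code): each round, every node syncs with one uniformly chosen peer, and an event held by either side ends up on both. For 1,024 nodes this converges in roughly log2(1024) = 10 rounds, plus a few extra to reach the last stragglers.

// Toy push-pull gossip round simulation (illustrative only).
fn main() {
    const N: usize = 1024;
    let mut has = vec![false; N];
    has[0] = true; // the node that first received the event from a client

    // Tiny deterministic xorshift PRNG so the sketch needs no crates.
    let mut seed: u64 = 0x2545F4914F6CDD1D;
    let mut next = move |bound: usize| {
        seed ^= seed << 13;
        seed ^= seed >> 7;
        seed ^= seed << 17;
        (seed as usize) % bound
    };

    let mut rounds = 0;
    while has.iter().any(|&h| !h) {
        rounds += 1;
        let before = has.clone(); // who held the event at the start of the round
        for i in 0..N {
            let peer = next(N);
            // Pairwise sync: both ends hold the event afterward
            // if either one held it at the start of the round.
            if before[i] || before[peer] {
                has[i] = true;
                has[peer] = true;
            }
        }
    }
    println!("all {N} nodes had the event after {rounds} rounds");
}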

Summary

By using set reconciliation to perform pairwise synchronization of nodes’ overlapping interests, we achieve performance, completeness, and scalability: the predictable performance of querying local data on your node, the completeness of synchronizing all of the events of interest preemptively, and the scalability of not synchronizing the events that lie outside of a node’s interests. Pairwise synchronization also prevents slow nodes from slowing down the rest of the network. It is now possible to scale up or down without performance and completeness problems. This enables developers to build data-intensive applications without data vendor lock-in from either the storage-providing service or the application that originally read the schema.

Tuesday, 16. July 2024

FIDO Alliance

Case Study: Wedding Park Deploys Company-Wide Passwordless Authentication for Internal Cloud Service Logins

Corporate overview: Wedding Park Co., Ltd. was founded in 2004 with the management philosophy of “Making marriage happier.” Celebrating its 20th anniversary in 2024, it started as a wedding review […]

Corporate overview:

Wedding Park Co., Ltd. was founded in 2004 with the management philosophy of “Making marriage happier.” Celebrating its 20th anniversary in 2024, it started as a wedding review information site and has since expanded its operations. Utilizing a wealth of information, it operates several wedding-specialized media, including the wedding preparation review site Wedding Park. In addition, it runs various businesses in the realm of weddings combined with digital technology, such as internet advertising agency services, digital transformation (DX) support, and educational ventures.

Background and challenges leading to deployment

Wedding Park was faced with the challenges of strengthening the security of multiple cloud services that were being used for internal operations and the complexity of password management. As a way to address these issues, the company introduced an ID management service and consolidated them into a cloud service entrance with a single sign-on function.

The impetus for deploying FIDO authentication came from the fact that Salesforce, which is used for authentication for customer management, order and supply systems, and time and attendance management, announced that multi-factor authentication (MFA) would become mandatory. However, if MFA were applied only to Salesforce while other cloud services continued to operate with password authentication, not only would usability deteriorate for users, but the work of the IT management department would also become more complicated. In addition, given the vulnerability of password-only authentication, the company decided to apply MFA to all cloud services, including Salesforce, in accordance with its policy, adopted in February 2020, to promote zero-trust security.

Selection and verification of an authenticator

As an authentication method for MFA, the company considered one-time password authentication (OTP) and biometric authentication using smartphone applications, but ultimately decided to deploy passwordless authentication using FIDO for its unique ability to improve both security and user convenience.

In order to realize passwordless authentication using FIDO, a terminal equipped with a FIDO-compatible biometric authentication device is required. The majority of devices currently on the market support FIDO authentication, and adoption was aided by the fact that, with the exception of a few employees’ machines, all in-house devices were already equipped with Windows Hello or Touch ID. For the employees using devices without biometric features, separate external authenticators were installed.

A step-by-step changeover for each department

After examining the authenticators, the policy to deploy passwordless authentication company-wide in January 2022 was officially launched. The transition took place from February to March of the same year, and the smooth implementation in the short period of one month was made possible by a department-by-department rollout and the generous support provided by the IT management department. For this implementation, the company requested the support of CloudGate UNO, an identity management platform by International System Research Corporation (ISR) that the company has been using since 2012, because it supports passwordless authentication using FIDO2 and biometric authentication using a smartphone app.

The rollout within the company began with the development department and gradually progressed to departments with larger numbers of employees. First, at regular meetings for each department, the company communicated why the system was being introduced and the benefit that daily authentication would become more convenient, gaining understanding across the company. Introducing the system on a departmental basis had the advantage of not only limiting the number of people the IT management department had to deal with at one time, but also allowing Q&A to accumulate as test cases and manuals to be maintained smoothly, since the rollout started with the development department, which had high IT skills.

As a result of close follow-up by the IT management department, which not only prepared materials but also checked the progress status on the administrator website as needed and individually approached employees who had not yet registered their authenticators, the company was able to implement the system company-wide within the targeted time frame.

Effects of introduction

The number of login errors due to mistyping of passwords, which used to occur about 200 times a month, has been reduced to zero since the deployment of FIDO authentication. Many employees commented that the system has become very convenient, eliminating authentication failures due to forgotten passwords or typing errors. In addition, the number of periodic password reset requests has decreased, resulting in a reduction in man-hours for the administrator.

Passwordless authentication is smooth, and although the authentication status retention period was shortened to further enhance security, the system has continued to operate without problems since then.

Wedding Park’s future vision is to link all cloud services used within the company to “CloudGate UNO” and centrally manage them, including authentication.

Akira Nishi, General Manager of the Corporate IT Office, who spoke with us about this case study, made the following comments.

“For those who are considering the deployment of a new authentication method, there is inevitably a concern that a change in authentication method will cause a large-scale login failure. In our case, in the early stages of the project, we held explanatory meetings for each department and repeatedly refined the explanatory materials and procedures, which was effective in minimizing confusion and anxiety within the company.

“After the switchover, we continued to check on the progress of the implementation and followed up with each department individually, but once the use of passkey (device-bound passkey) became standardized within the company, we felt that the scope of use, including various security measures, was expanding dramatically.”

Download the case study

Ceramic Network

New Ceramic release: ceramic-one with new Ceramic Recon protocol

The Ceramic protocol has undergone a series of updates over the past few months, all focused on improving performance and scalability, enabling developers to build applications that work better and faster. Today, the core Ceramic team is excited to share these updates with the community by announcing the release of

The Ceramic protocol has undergone a series of updates over the past few months, all focused on improving performance and scalability, enabling developers to build applications that work better and faster. Today, the core Ceramic team is excited to share these updates with the community by announcing the release of ceramic-one.

About the release

The new release of Ceramic includes a data synchronization protocol called Recon, implemented in Rust. This new implementation of the Ceramic protocol enables data sharing between nodes and allows developers to run multiple nodes that stay in sync and are load balanced. All this facilitates highly available Ceramic deployments and reliable data synchronization.

To utilize the Recon protocol for their applications, developers are provided with a binary called ceramic-one.

This new implementation of the Ceramic protocol offers significant performance and stability improvements. Additionally, this release marks a significant shift in making the Ceramic architecture more robust, allowing the team to iterate on and build new protocols in the future.

The new Recon protocol

Recon is a new data synchronization protocol used for synchronizing stream events in the Ceramic network, implemented on top of libp2p. Stream sets bundle multiple streams together, allowing nodes with a common interest in certain streams to synchronize efficiently.

Before Recon, Ceramic nodes broadcast updates to streams to every node in the network using a simple libp2p pubsub topic. Due to the single channel, nodes would receive stream event announcements they were not interested in, imposing a significant overhead on every node. Additionally, the network's throughput was limited by bandwidth, which led to either prioritizing high-bandwidth nodes or greatly limiting the network throughput to support low-bandwidth nodes.

Recon provides low to no overhead for nodes with no overlap in interest, while retaining a high probability of receiving the latest events from a stream shortly after any node has the events, without any need for remote connections at query time. By shifting updates from the pubsub channel to a stream set, interested nodes can synchronize without burdening uninterested ones. Stream sets also enable sharding across multiple nodes, allowing synchronization of only sub-ranges, which distributes the storage, indexing, and retrieval workload.

Additionally, nodes need to discover peers with similar interests for synchronization. Recon achieves this through nodes gossiping their interests and maintaining a list of peers' interests, ensuring synchronization with minimal bandwidth. Nodes also avoid sending event announcements to uninterested peers.
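As a back-of-the-envelope illustration of that peer filtering (our own simplified sketch, not ceramic-one’s actual interest representation), deciding whether a sync is even worthwhile reduces to an interval-overlap check over advertised ranges:

// Sketch: should two nodes start a Recon sync at all?
#[derive(Clone, Copy)]
struct Interest { start: u64, end: u64 } // half-open range of stream keys

fn overlaps(a: Interest, b: Interest) -> bool {
    a.start < b.end && b.start < a.end
}

fn should_sync(mine: &[Interest], theirs: &[Interest]) -> bool {
    mine.iter().any(|m| theirs.iter().any(|t| overlaps(*m, *t)))
}

fn main() {
    let mine = [Interest { start: 0, end: 100 }];
    let theirs = [Interest { start: 90, end: 200 }];
    assert!(should_sync(&mine, &theirs)); // overlap: keys 90..100
    println!("peers share an interest range; start a Recon sync");
}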

Performance and robustness improvements

This release, along with the recent Ceramic Anchor Service (CAS) updates, marks significant scalability improvements. Currently, Ceramic provides a throughput of 250 TPS (transactions per second), more than double the previous throughput of up to 100 TPS before the Recon implementation. This increase in throughput is especially important for applications that handle large amounts of user data and require fast transaction times.

These numbers were measured between two nodes that share the same interest. It’s worth noting that nodes without overlapping interests do not affect each other's throughput. This means that, in theory, the throughput of a ceramic-one node scales horizontally. However, there is still one component that puts an upper limit on this: the CAS, which is operated by 3Box Labs. This service is currently a centralized bottleneck in the protocol, which is why the team’s next goal is Self-Anchoring, allowing any Ceramic-One node to operate completely independently.

This release of Ceramic is also a significant step towards making the Ceramic architecture more robust, enabling the team to iterate on it and build new protocol implementations more easily and quickly.

Getting started with ceramic-one

All new Ceramic developers are recommended to use ceramic-one to start building on Ceramic. Check out the setup guides in the Ceramic documentation to get started.

Developers who have been building on Ceramic for a while are encouraged to migrate their applications to the ceramic-one-based implementation. Check out this migration guide to follow the migration steps.

Share your feedback with us!

We would like to get your feedback on building on Ceramic. Do you have any suggestions or ideas about how the core Ceramic team can improve the implementation of Ceramic? Do you have questions or trouble using the new release or migrating your existing application? Share your thoughts and ideas with us by posting on the Ceramic Community Forum.


FIDO Alliance

UX Webinar Series: Essentials for Adopting Passkeys for your Consumer Authentication Strategy

In part one of this four-part webinar series, attendees learned why major service providers are adopting passkeys as the foundation of their consumer authentication strategy. This webinar is for a […]

In part one of this four-part webinar series, attendees learned why major service providers are adopting passkeys as the foundation of their consumer authentication strategy. This webinar is for a nontechnical audience. It is intended to help you investigate the nuances of passkey roll-out strategies and end user experiences (UX) for consumers.

Join this webinar to:

Learn best practices to meet end-user needs with passkeys
Learn how to reduce costs with passkeys
Learn how passkeys create a long-term authentication strategy built on standards

This webinar is for:

Product managers
IT managers / leaders
Security Analysts
Data Analysts

UX Webinar Series: Essentials for Adopting Passkeys as the Foundation of your Consumer Authentication Strategy from FIDO Alliance

UX Webinar Series: Aligning Authentication Experiences with Business Goals

In the second of a four-part webinar series, attendees learned how to adapt your authentication experiences to better solve key metrics for consumer authentication. This webinar is for a nontechnical […]

In the second of a four-part webinar series, attendees learned how to adapt your authentication experiences to better solve key metrics for consumer authentication. This webinar is for a nontechnical audience seeking user interface and workflow guidance for consumer authentication.

View the webinar slides to:

Learn how to execute a passkey strategy that solves business goals and end-user needs
Learn how to use the FIDO Design Guidelines to jump-start your concepts and socialize them to win stakeholder alignment within your organization
Watch real users using passkeys for the first time and learn how to use passkey usability research findings to demystify passkey experiences and align requirements amongst your teams

This webinar is for:

Developers
Designers
Content Strategists

UX Webinar Series: Aligning Authentication Experiences with Business Goals from FIDO Alliance

UX Webinar Series: Drive Revenue and Decrease Costs with Passkeys for Consumer Authentication


In the third of a four-part webinar series, attendees learned how to drive revenue and decrease costs with passkeys for consumer authentication. This webinar is for a nontechnical audience seeking to make sound business decisions for new consumer authentication strategies.

View the webinar slides to:

Learn how to significantly increase first-try consumer sign-in success and speed to sign in
Learn how to align your teams around user experience patterns proven to be easy for consumers
Learn how to mitigate threats of phishing, credential stuffing, and other remote attacks
Learn how to offer passkeys without needing passwords as an alternative sign-in or account recovery method

This webinar is for:

Authentication product leaders
Chief Technology Officers (CTO)
Chief Marketing Officers (CMO)
Senior Vice Presidents

UX Webinar Series: Passkeys Design Guidelines AMA ask me anything!


In the final edition of a four-part webinar series, attendees had the opportunity to ask FIDO Alliance subject matter experts anything in an “Ask Me Anything” format!

Speakers answered audience questions for the full hour to provide actionable guidance for the use of passkeys for consumer authentication.

Phase 1: Identity needs and the “password problem”
Phase 2: Research and Screen Ideas
Phase 3: Concept and Prototype
Phase 4: Build and Test
Phase 5: Release and Optimize

This webinar is for:

Authentication product leaders
Chief Technology Officers (CTO)
Chief Marketing Officers (CMO)
Senior Vice Presidents
Designers
Content Strategists
Product managers
IT managers / leaders
Security Analysts
Data Analysts
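
The series above is aimed at a nontechnical audience, but for readers who want to see the standard underneath, below is a minimal, hedged sketch of creating a passkey in the browser via the W3C WebAuthn API. All values (rp, user, challenge) are illustrative assumptions; in a real deployment the challenge and the user handle are issued by your server.

```typescript
// Minimal sketch: creating a passkey (a discoverable WebAuthn credential)
// in the browser. The rp, user, and challenge values are illustrative.
async function createPasskey(): Promise<Credential | null> {
  const credential = await navigator.credentials.create({
    publicKey: {
      challenge: crypto.getRandomValues(new Uint8Array(32)), // server-issued in practice
      rp: { id: "example.com", name: "Example" },
      user: {
        id: new TextEncoder().encode("user-123"), // opaque user handle
        name: "jane@example.com",
        displayName: "Jane Doe",
      },
      pubKeyCredParams: [
        { type: "public-key", alg: -7 },   // ES256
        { type: "public-key", alg: -257 }, // RS256
      ],
      authenticatorSelection: {
        residentKey: "required",       // discoverable credential, i.e. a passkey
        userVerification: "preferred",
      },
    },
  });
  // The credential is then sent to the server to be verified and registered.
  return credential;
}
```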

Monday, 15. July 2024

We Are Open co-op

Building and Sustaining Engagement with the Digital Credentials Consortium

Developing communications for your organisation

This summer WAO ties a bow around a body of work we’ve been doing together with the Digital Credentials Consortium (DCC). This initiative is hosted at MIT and has member universities from around the world.

The Digital Credentials Consortium is advancing the use and understanding of portable, verifiable digital credentials in higher education through open source technology development and leadership, research, and advocacy.

The DCC plays a pivotal role in the definition and establishment of the W3C Verifiable Credentials Standard. Standards are often invisible, but they are massively important!

In this post, we’ll use our work with the DCC to help you systematically review your communication initiatives and give you a bit of a playbook on how to develop reusable communication assets and resources.

Understanding your audience
An audience map WAO created with the DCC
Research

When crafting communications strategies, most organisations miss a crucial step: audience research. Implementing lessons from outdated research or making assumptions about your audience only to find out those assumptions were wrong are two mistakes that you can avoid!

Before we started creating communication messaging and assets for the DCC, we did two rounds of interviews. In both rounds, we spoke one-to-one with people deeply involved in the DCC’s work. In the first round, we spoke with staff members, W3C task group members and people already implementing the Verifiable Credentials standard. In the second round, we talked to members of the DCC and with the Leadership Board. We asked the same questions for both rounds, but allowed for organic conversation to emerge.

cc-by-nd Visual Thinkery for WAO

These interviews not only provided us with a wealth of onboarding and understanding of the DCC’s wide-ranging work, but also helped us identify, specifically, what stakeholders need and want from the DCC.

Segmentation

Once you have collected insights from your audience, you can begin to reflect those insights back in ways that help others understand who your audience is. Segmentation is a way to find overlapping interests and topics. We like to visualise segmentation and have done so in multiple ways: from our Audience Ikigai to Defining the Cast and Persona Spectrums, we use a couple of different tools to find audience overlaps. Figuring out a visual way to explain your audience and their unique needs and insights is a great way to help people feel connected to your organisation.

Crafting your communications
First slide of a deck implementing suggested design constraints
Being specific

Understanding your audience will help you tailor your messages and customise content to specific segments of your audience. Through research, you are also creating relationships with your audience and can encourage people to feel open to giving you feedback.

Our research and subsequent analysis helped us see trends and patterns to pay attention to as we began to craft communications for the DCC. We also identified some quick intervention points, allowing us to immediately implement small changes and quick wins. For example, before we ran our final interview, we implemented a new README for the DCC’s GitHub organisation. Small wins can have a big impact!

Our onboarding and research activities helped us see where there were misunderstandings, so that we could deal with them as quickly as possible.

Design guidelines

It can be helpful to put what we call “Design Constraints” in place when we’re building communication strategies and initiatives. Design Constraints are simply rules you and your colleagues use to create consistency in both visual and written language. For example, we helped the DCC select a colour palette, fonts and an illustration library for their future communications.

A brand guide is an example of visual design constraints. A “key messaging and wording” section in your communications strategy is another. It helps create consistency, so that your audiences know how you wish to communicate your organisational goals.

Growing your audience
cc-by-nd Visual Thinkery for WAO
Engagement

You want to engage with people strategically so that you can work sustainably and your communications are aligned with your current initiatives and goals. We use several tools to help us figure out the best way to engage with a specific audience or community. We’ve written often about the Architecture of Participation, our go-to framework for creating participatory communities.

We also like to build Contributor Pathways, which help show how different stakeholders engage with a project. These pathways can outline the steps different audiences take and where you might be able to engage with them more effectively.

There are four stages to the engagement model we like to use:

Awareness — The first stage invites you to think about how your particular group hears about you or your project for the first time. The questions to ask are: how do they hear about us, and how would we like them to hear about us?
First Engagement — Stage two identifies the first interaction a person or a group has with you or your project. What is the first action that they take, and what action would you like them to take?
Build Relationship — Stage three is about your interaction. How do you build relationships with people or groups, and what value can you bring?
Deepen Engagement — As people deepen their engagement with your organisation or project, you’ll want to show them that they’re valued. So how can you ensure consistent engagement with your most engaged audiences?

We think about each of these stages in reference to each specific audience group, as some audiences might be more or less engaged than others.

Advocacy

WAO tends to work with groups and organisations that are trying to create a better world. Advocacy is an integral part of our work. There are a variety of advocacy and collaboration strategies, as well as best practices, that you can use to ensure you are able to promote your messages in a way that leads to action.

In this post on campaigning for the right things, we take a deep dive into using an advocacy framework to figure out where we might focus efforts. You can reapply this framework to your own initiatives!

Building and sustaining engagement
cc-by-nd Visual Thinkery for WAO
Cadence

If you’ve truly understood your audiences through research and analysis and you’ve determined the messages and design constraints you need to utilise for maximum communication effectiveness, your audience will begin to grow. Yay! You are building engagement!

It’s time to find sustainable ways to keep your engagement going. Probably the most effective strategy we have for sustaining engagement is cadence and consistency.

an example month of DCC events and associated comms

You need to establish a cadence to your engagement efforts, both so that your growing audience knows what to expect and so that you and your team can stay sane. It’s simple, but a communication schedule will help you be consistent so that people stay engaged. Check out our how to be a great moderator post too; it has good tips on building consistency into your workflow.

Commitment

Last, but not least, commitment to your goals, team and community is essential. However you are trying to have an impact on the world, it is a marathon, not a sprint. We believe that open, flexible strategies with reusable and adaptable assets are a great way to help you stay committed.

🔥 Do you need help with communications and engagement? Get in touch!

Building and Sustaining Engagement with the Digital Credentials Consortium was originally published in We Are Open Co-op on Medium, where people are continuing the conversation by highlighting and responding to this story.


Identity At The Center - Podcast

It’s time for another episode of The Identity at the Center


It’s time for another episode of The Identity at the Center podcast! Michiel Stoop joins us to discuss privileged access management including how to navigate and obtain support at your organization to invest in these processes and technologies.

You can watch the episode on YouTube here: https://www.youtube.com/watch?v=1e9dpwttuZU

Visit our website for more: idacpodcast.com

#iam #podcast #idac


GS1

Introducing GS1 standards to the clinical trial supply chain at Creapharm, a Myonex company

In the clinical trial industry, drug identification and traceability are essential to ensuring patient safety.

However, up until recently, most stakeholders used their own internal tools and proprietary identifiers for tracing investigational products and their locations, as well as for data interchange in clinical trials.

As a result, participants had to configure their IT systems to adapt to each solution implemented by each specific Investigational Medicinal Product (IMP) manufacturer.
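
As a small illustration of what standardized GS1 identification buys, GS1 keys such as the GTIN carry a mod-10 check digit that any party can verify without a proprietary system. The sketch below computes it; the example number is made up and has no connection to Creapharm or Myonex.

```typescript
// Sketch: the mod-10 check digit used by GS1 identification keys (GTIN).
// Weights alternate 3, 1, 3, ... starting from the rightmost digit of the
// body; the example number is invented.
function gs1CheckDigit(body: string): number {
  const sum = [...body]
    .reverse()
    .reduce((acc, d, i) => acc + Number(d) * (i % 2 === 0 ? 3 : 1), 0);
  return (10 - (sum % 10)) % 10;
}

const body = "0361414141414";            // 13-digit body of a GTIN-14
console.log(body + gs1CheckDigit(body)); // appending the digit completes the key
```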

Business goal
GS1 Healthcare Case Studies 2023-2024: gs1_healthcare_cases_studies_2024_france_v_final_.pdf

Thursday, 11. July 2024

Digital ID for Canadians

The DIACC releases its Pan-Canadian Trust Framework (PCTF) Authentication Final Recommendation V1.2


Canada’s digital trust leader, the DIACC, releases its Pan-Canadian Trust Framework (PCTF) Authentication Final Recommendation V1.2, signalling it is ready for inclusion in its Certification Program.

Why is the PCTF Authentication component important?

The Authentication component helps assure the ongoing integrity of login and authentication processes by certifying, through a process of assessment, that they comply with standardized Conformance Criteria. The Conformance Criteria for this component may be used to provide assurance that Trusted Processes result in the representation of a unique Subject, at a given Level of Assurance, and that it is the same Subject with each successful login to an Authentication Service Provider. They also provide assurance concerning the predictability and continuity of the login processes that participants offer or on which they depend.

What problems does the PCTF Authentication component solve?

The Authentication component helps establish a standardized way for individuals and organizations to verify their identities when accessing digital services. This reduces the risk of unauthorized access and potential breaches. Additionally, by providing a reliable method for authentication, the PCTF fosters trust and confidence among users, service providers, and stakeholders. This is crucial for the widespread adoption of digital services.

Who does the PCTF Authentication component help?

All participants will benefit from login and authentication processes that are repeatable and consistent (whether they offer these processes, depend on them, or both). It can help lay the foundation to provide assurances that identified Users can engage in authorized interactions with remote systems. When combined with considerations from the PCTF Wallet Component, participants may have an enhanced user experience through the reuse of credentials across multiple Relying Parties.

Relying Parties can benefit from the ability to build on the assurance that Authentication Trusted Processes uniquely identify, at an acceptable level of risk, a Subject in their application or program space.

Find the PCTF Authentication component here.


The Engine Room

Launching our UXD support services!


Starting this month, The Engine Room will be service providers in OTF’s User Experience & Discovery (UXD) Lab.

The post Launching our UXD support services! appeared first on The Engine Room.


Berkman Klein Center

Fellows Spotlight: Johanna Wild, Investigative Journalist


An interview on risks, trends, and tools in OSINT digital research

Photo by Emily Morter on Unsplash

When Johanna Wild entered the Berkman Klein Center at Harvard as a joint Nieman Foundation innovation fellow, I was intrigued. Wild works for the award-winning international open source (OS) investigative journalism collective Bellingcat. She is an expert on the creative deployment of technical approaches to support a more diverse cohort of public interest reporters and investigators, blending automated approaches with human-centered research methodology.

As someone who has supported expert networks in both disinformation and conflict documentation, I wanted Wild’s first-hand perspective on the benefits and risks of using novel open source intelligence (OSINT) tools to enable a broader, more transparent global knowledge base. We conducted this interview over email between Amsterdam and New York City.

Sam Hinds: Do you encounter specific types of people or professional backgrounds in the work of investigations and OSINT tool development?

Johanna Wild: The great thing about the field of open source research is that it consists of people from various backgrounds. Open source researchers spend a lot of time online. They find pieces of information on social media platforms, in online forums, and databases, and they compare features that they identify in user-generated online videos and photos with locations that can be seen on satellite imagery. This process, called geolocation, is used to verify online images. The nature of open source research allows everyone with an internet connection to do this type of work.

The open source researcher community is therefore a mix of people who do open source research as part of their job and volunteers who are passionate about contributing to important research in their free time. My surveys and user interviews with our Bellingcat community showed that our community consists of people working for human rights organizations, stay-at-home parents who use their limited time to do something mentally challenging and useful, cybersecurity specialists, job seekers who want to learn new skills, lawyers, data scientists, people who are retired, and many more. When I ask volunteers about their motivation, they often say that they want to contribute to research that reveals issues in the regions where they live. In these times, characterized by various conflicts around the world and global challenges like climate change, they do not want to just passively sit around but to actively contribute to something that creates new knowledge about those issues. Another motivation is to become part of a community with similar interests and to improve their open source research skills.

Of course there are also many journalists who are part of this community. Nowadays, more and more newsrooms are setting up teams focusing on open source research. However, journalists were more of the late adopters in this field. Most of them only discovered in the last few years how useful this type of research can be, especially if it is combined with traditional journalistic skills and methods. Newsrooms even started hiring skilled open source researchers who are completely self-taught and who have no journalism degree, which is something that is still rather unusual in the news industry.

Volunteers with a technical background contribute by building tools. These are often simple command line tools that do one very specific task, for instance scraping posts from a specific social media platform or checking whether an online account has been created on a platform using a specific phone number. Those tools do not usually turn into big commercial products; they are built by people from within the open source software community who focus on writing code that is publicly accessible to anyone. Several years ago, I clearly saw that the open source researcher and open source software communities are a very good match for each other; we just needed to bring them together. This is one of the things that we now do at Bellingcat. We organize hackathons, actively invite software developers into our volunteer community, and support them in building their own tools or contributing to tools built by the Bellingcat team. This group of volunteers includes, for example, people who have a full-time job in a software company but want to do something meaningful in their free time, job seekers who want to create their own portfolio of tools, and academics who are already deep into a technical topic but would like to test its practical application.

Although the open source researcher and tech communities are very diverse in terms of their professional and personal backgrounds, they are currently still dominated by volunteers and professionals from Western countries, mainly from the US and Europe. The technical tool builder community is also, to date, still male-dominated. This lack of representation raises serious questions about who defines the future of our field and who has the power to research topics in regions all around the world. With people in many other regions still excluded from participating in this type of research, they mainly become the subjects of Western researchers.

“While AI tools can be powerful, we should not expect to automate the whole open source research process. Doing open source research is a combination of specific research methods, the use of tools, a good dose of logical thinking and also creativity!”

SH: Have you seen novel trends emerge in the type of information researchers want today?

JW: I definitely observe that researchers, and especially journalists, have become more aware of how useful it is to be able to work with large datasets, to know how to scrape information from websites or to have the skills to build small tools that can speed up some of their research tasks.

Currently, everyone is of course interested in AI. Less experienced researchers are hoping for a tool that lets them input any picture or video and then spits out the exact location of where it was taken. While AI tools can be powerful, we should not expect to automate the whole open source research process. Doing open source research is a combination of specific research methods, the use of tools, a good dose of logical thinking and also creativity! Creativity is needed to spot topics that are worth investigating. When deciding where to look next in the vast amount of online information that is out there, creativity helps to connect multiple, often tiny, pieces of verified information which allow researchers to draw conclusions on a certain topic.

Another trend is the use of facial recognition tools. Open source researchers often find pictures that show individuals who have a connection to a certain research case but whose identity they don’t know. In the last few years, several easy to use facial recognition tools have emerged. Researchers can upload a picture of a person and the tool compares this picture with collections of photos from social media platforms. Sometimes, this can reveal the identity of a person, for instance by providing the person’s LinkedIn profile. It is obvious how useful this can be to identify individuals who were involved in serious crimes that require journalistic reporting.

However, facial recognition tools are a double-edged sword. We all know that they can provide wrong results. Two people might just look very similar, and an uninvolved person might be misidentified as someone who is involved in illegal activities. It is therefore important that open source researchers do not use those tools as the only way of identifying someone. On top of that, the use of such tools raises various ethical questions, ranging from the risk of stalking random people online to questions about the data sources on which facial recognition tools rely. At Bellingcat, we reflected on how we can ensure responsible use of facial recognition technologies and concluded that we will refrain from using these tools extensively, and never as a core element of an investigation. We have also never used products from companies like Clearview AI. A good example of how we sometimes use a facial recognition tool as a starting point for further research can be found in our article on how “Cartel King Kinahan’s Google Reviews Expose Travel Partners”.

SH: Are there any overlooked tools that you like to highlight in your trainings?

JW: The best type of tool really depends on the research topic. Often a combination of several small tools can lead to the best results. For instance, our Name Variant Search Tool is basically an enhanced search engine for finding information about people. Open source researchers often start with a name and try to find out as much as possible about the person’s online presence. However, the name might be written differently on different sites. “Jane Doe” might also show up as “J. Doe” or “Doe, Jane”. The tool suggests different possible variations of a name and provides search results for all those variations. It is also possible to instruct the tool to search for a name specifically on LinkedIn or Facebook.
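
As a hypothetical sketch of the idea behind such a tool (not Bellingcat's actual implementation), generating variants and per-site queries might look like this:

```typescript
// Hypothetical sketch: generate common permutations of a name and turn
// them into per-site search queries. Not the actual tool's code.
function nameVariants(first: string, last: string): string[] {
  return [
    `${first} ${last}`,     // Jane Doe
    `${last}, ${first}`,    // Doe, Jane
    `${first[0]}. ${last}`, // J. Doe
    `${last} ${first}`,     // Doe Jane
  ];
}

// Restrict results to a single platform with a site: operator.
const queries = nameVariants("Jane", "Doe").map((v) => `"${v}" site:linkedin.com`);
console.log(queries);
```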

Example: Name Variant Search results for different variants of the name “Jane Doe”

Our OpenStreetMap search tool, on the other hand, supports the geolocation process. A core task of many open source researchers is to find out where a photo or video that they found online has been taken. To do that, they try to identify specific features and compare those with what is visible on satellite imagery or maps. If researchers already have a rough idea in which region a photo might have been taken, they can input a list of features that are visible in the photo (for instance, a residential street, a school and a supermarket) into our tool, which will try to list all locations in a pre-defined region in which those features show up together. This can really help narrow down possible locations.
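
A hedged sketch of the general approach (again, not the tool's actual implementation): OpenStreetMap data can be queried through the public Overpass API for places where such features co-occur. The area name and search radius below are illustrative assumptions.

```typescript
// Ask the public Overpass API for OpenStreetMap schools within ~300 m of a
// supermarket inside a named region. Area name and radius are illustrative.
const query = `
  [out:json][timeout:60];
  area[name="Utrecht"]->.region;
  node[shop=supermarket](area.region)->.shops;
  node[amenity=school](around.shops:300);
  out center;
`;

async function findCandidateLocations() {
  const res = await fetch("https://overpass-api.de/api/interpreter", {
    method: "POST",
    body: query,
  });
  return res.json(); // candidate spots to compare against the photo manually
}

findCandidateLocations().then((r) => console.log(r.elements.length, "candidates"));
```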

SH: What’s an example of an unusual story or insight one can find from OS tools?

JW: If open source researchers have no idea where a picture might have been taken but they know at which time it was captured and the photo shows objects that cast clearly visible shadows, they can try our ShadowFinder tool which is able to calculate at which locations around the world shadow lengths correspond with what can be seen in the photo at a specific point in time. This helps open source researchers concentrate their geolocation efforts to the areas suggested by the tool instead of searching across the whole world.
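
The geometry behind this is compact enough to show directly. An object of height h casting a shadow of length s implies a solar elevation angle e with tan(e) = h / s; combined with the capture time, only a band of locations on Earth sees the sun at that elevation at that moment. A minimal sketch:

```typescript
// The core relation: an object of height h casts a shadow of length s when
// the sun's elevation angle e satisfies tan(e) = h / s.
function sunElevationDeg(objectHeight: number, shadowLength: number): number {
  return (Math.atan(objectHeight / shadowLength) * 180) / Math.PI;
}

// e.g. a 2 m pole with a 3 m shadow puts the sun at roughly 33.7 degrees;
// only locations where the sun stood that high at the capture time qualify.
console.log(sunElevationDeg(2, 3).toFixed(1));
```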

Example of a ShadowFinder tool result: Possible locations are shown by the yellow circle.

Another tool that has gained popularity within the open source researcher community is PeakVisor, a tool that was originally targeted at helping mountaineers orient themselves but which can also be used for geolocation tasks. For instance, we used it to research the location of the killing of Colombian journalist Abelardo Liz. This example in particular shows that a combination of research skills and the use of tools can go a long way.

SH: What frustrations or barriers do you see as a trainer, and how could the field democratize knowledge of command line tools?

JW: First of all: Teaching open source research is great. People who are interested in learning these methods come from so many different backgrounds, which allows everyone to learn new things from each other, including the trainers! The topic is also quite accessible, meaning that everyone can start doing open source research with very simple methods, like using search engines in creative ways. Sometimes, this can lead to surprising results: for instance, just by googling, my colleague Foeke Postma revealed how US soldiers exposed nuclear weapons secrets via flashcard apps.

Of course not all methods are as simple, and one of the things people struggle with the most is research tools. During my Nieman-Berkman Klein fellowship, my research assistant Cooper-Morgan Bryant and I interviewed forty open source researchers about their use of tools. Their answers confirmed my previous findings on this topic: open source researchers who are either beginners or who are looking at a topic that is new to them find it really difficult to figure out what tool they should use at what stage of the research process and how those tools work. With such a wide variety of online tools, some more useful and some easier to find than others, many researchers feel overwhelmed by the task of finding their way through the landscape of available tools spread across various platforms.

In addition, the majority of open source researchers are not able to use command line tools since this requires a certain degree of technical skills. However, those are exactly the type of small tools that the open source software community is building most frequently. There is a clear divide between those who are building tools for open source researchers and the researcher community itself, for whom those tools often turn out not to be accessible.

“Open source researchers want complex tools that are easy to use and that are stable and well-developed but such tools need funders and teams who build them, and these conditions are not always easily met in the open source research and journalism space.”

On the other side, open source researchers are often not aware of the resources that are required to build mature tools that have an easy-to-use interface. It is getting easier now, but tool builders need to invest a lot more time to build such tools and this is difficult for people who do this task in their free time and without any funding. Open source researchers want complex tools that are easy to use, stable, and well-developed, but such tools need funders and teams who build them. These conditions are not always easily met in the open source research and journalism space. I hope that researchers will become a little bit more open to learn some basic technical skills, and even more importantly that they understand that not every tool that is useful for their research has to function like a fully built commercial tool.

At Bellingcat, we focus on bridging this gap between tool builders and open source researchers. We work with tech communities — often through programs like hackathons or fellowships — and make them aware of how important good user guides are, even for seemingly easy-to-use tools. On the other hand, we teach open source researchers how to use command line tools. We also launched a video series with the goal of helping researchers take their first steps towards the more technical side of research tools.

SH: Tools take a lot of resources to build. Do any OSINT tools have a complicated provenance in terms of private sector origin or geopolitics?

JW: It is definitely problematic that researchers and journalists can be so dependent on tools provided by big tech companies. Meta’s social monitoring platform CrowdTangle will be shut down in August, which has caused a lot of discontent amongst journalists, in particular those covering elections. Similarly, many of the platforms and tools open source researchers use are provided by Google, like Google Search, Google Maps and Google Earth Pro. We are often at the mercy of the decisions that big tech companies take regarding the use of their tools.

However, their tools are usually provided for free, which is not the case for other commercial tools. Open source researchers definitely need to look into the companies from which they are buying tools. One risk is that tool providers might be able to see what keywords people are typing in or what topic someone is working on. Researchers and journalists need to be sure that their sensitive research topics are safe from being monitored by tool providers.

At Bellingcat we focus on mostly small open source tools, but those tools come with their own set of challenges. For instance, it is often not clear who is behind a tool that is offered on code-sharing platforms like Github, which can raise security-related questions.

“I would love to see universities getting more involved in building and maintaining tools for open source researchers and journalists…since both sides have the common goal of advancing research in the public interest”

This is why I really hope we can build a different tool ecosystem for open source researchers in the future. I would love to see universities getting more involved in building and maintaining tools for open source researchers and journalists. I think that such collaborations could work well since both sides have the common goal of advancing research in the public interest, and many of the tools that are used by open source researchers are equally useful for academic researchers. I also see opportunities to research security-related aspects of widely used tools together, as journalists and open source researchers could definitely use some help in assessing the risks that some of the tools they are using might be posing. If anyone who reads this would like to discuss these topics with me: Feel free to get in touch!

SH: Misinformation, disinformation, conspiratorial thinking: What are some of the uses and abuses of “research” you see in these contexts?

JW: What is most common — especially during conflicts and wars — is that people share photos or videos from a different conflict, or old imagery, and make people believe that they are related to current events. In the context of the Israel-Gaza conflict since October 2023, this phenomenon has reached a new scale, with countless examples circulating online. For instance, Bellingcat found videos shared with claims that one showed rockets fired at Israel by Hamas and another showed recent Israeli strikes on Hamas; both turned out to be recycled videos that had been uploaded to YouTube several years prior.

“People who post such pictures might sometimes think they are doing ‘research’ and that they are sharing relevant information about an ongoing conflict, without realizing that they are actually sharing incorrect information.”

What is dangerous is that some of those posts go viral and are able to reach significant numbers of people who will never know that they fell for misinformation. People who post such pictures might sometimes think they are doing “research” and that they are sharing relevant information about an ongoing conflict, not realizing the information is incorrect. Others, however, will do it on purpose to evoke emotions either in favor or against one of the conflict parties. Users of online platforms cannot really do much to prevent being confronted with such posts. This is another reason it is essential that we all learn to question what we see online and to invest some time in learning basic verification skills.

What we have also been seeing is that supporters of conspiracy ideologies are increasingly using open source research tools and presenting the information as journalistic findings. For example, QAnon supporters in German-speaking countries started using flight-tracking sites to search for flights which they falsely believed were circling above “deep underground military bases” in which children were hidden and mistreated. This is problematic since people who are not aware of the methods and standards of open source research might not be able to differentiate between serious research and this distorted version of it.

SH: What are some of your favorite guidelines or best practices for journalists who aim to cover (and fact-check) broad conspiratorial thinking enabled by OS information?

JW: Looking at their business models can often be a very promising approach. More often than not, conspiracy-minded communities have business-savvy people amongst them who manage to benefit financially from those communities’ beliefs. When I was researching QAnon online communities in Germany, big platforms like Amazon and eBay had started implementing measures to ban QAnon products from their platforms. However, this seemed to have created new opportunities for QAnon influencers who were offering merchandise via their own small online shops. On top of that, customers in Germany were able to buy QAnon products from abroad, for instance from Chinese or British companies who offered products targeted specifically at German-speaking customers. It was interesting but also concerning to see how international today’s conspiracy merchandise markets are.

When researching online shops, it is always worth looking into which payment options those shops are using and into their potential use of cryptocurrencies. It is also important to take some time to learn the terminology a certain group is using. If you are looking into the far right, for instance, it is crucial to learn how to interpret the symbols they use.

”Open source researchers are often portrayed as some type of ‘nerdy hero’ who spends time on his laptop to research ‘the bad guys’ and is celebrated once he succeeds. The idea of one hero figure who solves all the research challenges is really the exact opposite of how open source research works best…”

SH: How might international organizations build stronger support for women, femme-identified, and gender-nonconforming media and research professionals?

JW: In the field of open source research, there are definitely tendencies that I would like to see changed in the future. It is well established that women and gender-nonconforming people have traditionally had a much harder time entering and succeeding in investigative journalism. Those issues are far from being overcome, but the journalism world has started to talk more openly about them, and the fact that academic researchers have published work on this topic has also been helpful.

My impression is that as open source researchers, we have not yet put enough effort into reflecting on what is happening in our own field. Maybe we thought that since it is relatively new, those issues would not appear as strongly. Unfortunately, however, they do, and it’s time to recognize this.

There are definitely many contributing factors, but one that has had a strong effect on me is that open source researchers are often portrayed as some type of “nerdy hero” who spends time on his laptop to research “the bad guys” and is celebrated once he succeeds. The idea of one lone wolf who solves all the research challenges on their own is really the exact opposite of how open source research works best, which is by nature collaborative and often requires the efforts of many to put together various small pieces of verified online sources for a specific research case. For those of us who do not want to, and are also not able to, fit into this commonly portrayed male hero picture, this field might not necessarily feel like a good fit.

However, since more and more traditional newsrooms are setting up open source research units right now, I see more women entering the field, and hopefully this will also change how we publicly talk about open source research over time. To everyone who organizes a public event on open source research, I recommend not only approaching the few already well-known voices in the field but also making the effort to find and invite speakers who can contribute new perspectives and who have done research on topics that are not always in the spotlight.

SH: What were the most meaningful conversations you had during your time at the Berkman Klein Center? Do you plan to use any of your connections or insights from the fellowship in your future work?

JW: I am very grateful that I was able to be a Berkman Klein Fellow this year. It was a great opportunity to be part of a community of people who all reflect on how we integrate new technologies in our lives but from various different angles. Each fellow and community hour provided me with insights into a different technology-related topic and I liked the “surprise” effect of being able to learn new things about topics I usually don’t have the time to think about. This has definitely had an impact on how I approached my own projects with Bellingcat. I feel that being immersed in such a knowledgeable and collaborative community has unlocked my creativity and I am looking forward to continuing to learn from everyone in the Berkman Klein sphere in the future.

Johanna Wild was a joint 2023–2024 Nieman-Berkman Fellow in Journalism Innovation, a joint fellowship administered between the Nieman Foundation for Journalism and the Berkman Klein Center for Internet & Society at Harvard University. Wild is currently Investigative Tech Team Lead at Bellingcat.

Fellows Spotlight: Johanna Wild, Investigative Journalist was originally published in Berkman Klein Center Collection on Medium, where people are continuing the conversation by highlighting and responding to this story.


Kantara Initiative

Kantara awards IAL3 certification to NextGenID Component Services

World's first Trust Mark award for Component Services at IAL3 will continue to build confidence in the identity industry

We are delighted to announce that NextGenID has successfully obtained IAL3 certification for its component services. This effectively makes it the first organization to achieve IAL3 in the Identity Credentialing and Access Management (ICAM) space. This sets a new industry standard for security, accessibility and reliability.

NextGenID’s Trusted Services Solution (TSS) provides Supervised Remote Identity Proofing (SRIP) identity stations. Operators use SRIP stations to collect, review, validate, prove and package IAL3 identity evidence and enrollment data. This means that credential service providers (CSPs) that use the NextGenID TSS will offer an enhanced level of assurance.

Speaking of the award, Kantara Exec Director, Kay Chopard, said: “Achieving Kantara certification is a significant endeavor, reflecting a rigorous commitment to excellence in identity and access management. By developing frameworks and ensuring conformance to robust standards, we provide guidance that ensures security, privacy and interoperability in digital transactions. This is critical for organizations looking to adopt identity solutions that not only comply with current regulations but also anticipate future challenges in digital identity verification. We congratulate the NextGenID team on being the first to achieve IAL3 certification for Component Services.”

Are you ready for identity assurance certification?  Visit our Approval Process page for full details of what is involved and the criteria we use when evaluating applications.


The post Kantara awards IAL3 certification to NextGenID Component Services appeared first on Kantara Initiative.

Wednesday, 10. July 2024

MOBI

First Web3 Global Battery Passport Implementation for Current and Future Regulatory Compliance


First Web3 Global Battery Passport Implementation for Current and Future Regulatory Compliance

DENSO, Honda, Mazda, and Nissan among MOBI members who have completed Stage 1 of the Cross-Industry Interoperable Minimum Viable Product, a three-year initiative towards a decentralized Global Battery Passport built for data privacy and selective disclosure

Los Angeles, 10 July 2024 — MOBI, a global nonprofit Web3 consortium, is thrilled to announce a significant milestone in the development of the Web3 Global Battery Passport (GBP) Minimum Viable Product (MVP). In a historic first, the MVP has successfully demonstrated battery identity/data validation and exchange between nine organizations using open standards — the MOBI Battery Birth Certificate and the World Wide Web Consortium (W3C) Self-Sovereign Identity (SSI) framework — an achievement that carries exciting implications for stakeholders of the battery value chain and lays critical groundwork for the Web3 Economy. Organizations worldwide are invited to join and collaborate in this trailblazing effort.

Participants of MOBI Circular Economy and the GBP Working Group, constituting close to USD 1 Trillion in annual revenue and representing diverse functions within the global battery ecosystem, have successfully completed Stage 1 of the three-year MVP and are set to begin Stage 2. Implementers of the decentralized GBP include Anritsu, DENSO, HIOKI, Honda, Mazda, Nissan, and TradeLog Inc.

As the world increasingly turns to batteries for sustainable energy solutions, global battery value chains are making continuous improvements to enhance operational efficiency, circularity, and cross-border compliance. Forward-looking global policies like the US Treasury’s Section 30D Guidance on EV Tax Credits and the EU Battery Regulation mandate digital recordkeeping to track battery life cycles, underscoring the need for a global battery passport — a digital credential containing key information about the battery’s composition, state of health, history, and more.

Creating a scalable GBP will require cross-industry interoperability, such that entities across the value chain can securely coordinate and selectively share relevant data without the need to pay for costly one-off integrations or abandon their existing systems. MOBI and its members believe Web3 can become a key enabler for cross-industry interoperability at scale. SSI in particular introduces a promising approach to unlocking powerful synergies, enabling entities to securely control and share credentials across different web applications and platforms without the need for centralized intermediaries. To this end, MOBI and its members are building a decentralized Web3 marketplace ecosystem with standardized communication protocols for Self-Sovereign Data and SSI, designed such that the federated infrastructure and data therein are not controlled or managed by one organization. While the MOBI MVP initiative demonstrates a specific use case for implementing the GBP, the benefits of the MOBI Web3 implementation extend to almost any use case that involves multiparty transactions.
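
For readers unfamiliar with the W3C building blocks mentioned here, the sketch below shows the general shape of a verifiable credential as defined by the W3C VC data model. The battery-specific fields and DIDs are hypothetical illustrations, not MOBI's actual Battery Birth Certificate schema.

```typescript
// Illustrative only: the outer structure follows the W3C Verifiable
// Credentials data model; the battery-specific fields and DIDs are
// hypothetical, and a real credential also carries a signed `proof` block.
const batteryBirthCertificate = {
  "@context": ["https://www.w3.org/2018/credentials/v1"],
  type: ["VerifiableCredential", "BatteryBirthCertificate"], // second type is made up
  issuer: "did:example:battery-manufacturer",
  issuanceDate: "2024-07-10T00:00:00Z",
  credentialSubject: {
    id: "did:example:battery-pack-42",
    chemistry: "NMC",     // hypothetical attribute
    ratedCapacityKWh: 75, // hypothetical attribute
  },
};
console.log(JSON.stringify(batteryBirthCertificate, null, 2));
```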

In Stage 1 of the initiative, implementers demonstrated the Integrated Trust Network’s (ITN) identity services for one-to-one cross-validation of battery identity and data. The ITN serves as a federated (member-built and operated) registry for W3C Decentralized Identifiers, offering SSI management for connected entities such as batteries and their value chain participants. The ITN is the first enterprise network to support multiple blockchains at the same time. These features are unique to the ITN, which is built for high resilience by ensuring that the network’s functionality and sustainability do not rely on a single organization or blockchain.

Stage 2 will demonstrate Citopia decentralized marketplace services through the creation of the cross-industry interoperable, privacy-preserving GBP. ITN services are one-to-one, whereas Citopia services are one-to-many (and many-to-one). Through Web3 implementation, the availability and selective disclosure of trusted data and identities throughout the battery value chain will enable digital services and applications such as enhanced battery and carbon credits management, vehicle-to-grid communications and transactions, risk-based insurance calculations, and data-driven used electric vehicle pricing.

“Today’s global battery value chain is complex and it’s difficult to simultaneously ensure efficiency, scalability, safety, circularity, and regulatory compliance. To balance these priorities, we need to enhance battery lifecycle management through the creation of a shared ecosystem with SSI framework for secure coordination and selective disclosure of sensitive data,” said MOBI CEO and Co-founder, Tram Vo. “Driving innovation at this scale requires cross-industry collaboration. We invite public and private organizations worldwide to join us in this critical pursuit.”

“Web3 is an interesting technology which may facilitate a more scalable approach to exchanging battery data in a peer-to-peer fashion between organizations,” said Christian Köbel, Senior Engineer at Honda Motor Co., Ltd.

“It is confirmed how Web3-based self-sovereign data management works throughout Stage 1 of the MVP,” said Yusuke Zushi, Senior Manager at Nissan.

Said Mazda Motor Corporation, “Through participation in the GBP Working Group, we not only acquired the technical knowledge of Web3 but also understood a vision of an ecosystem that realizes the exchange of reliable information. We appreciate MOBI for giving us this invaluable opportunity.”

“The Stage 1 MVP experiment was instrumental in deepening our understanding of a system that enables sovereign data management. In the current era of data, the integrity and safety of information flow are paramount. As an experienced manufacturer of measuring instruments, HIOKI has always placed a premium on the reliability and accuracy of data,” said Kenneth Soh, Executive Officer at HIOKI. “We believe that the insights gained from this MVP are vital for the future progress of measurement instrument manufacturers and the industries we serve, from the perspective of innovation and societal contribution driven by mechanisms that offer both security and precision in data distribution.”

“The successful implementation of the Web3 GBP MVP is a significant step towards a more transparent and sustainable battery ecosystem. We are honored to contribute to the realization of the GBP as participants in this important initiative,” said Hisashi Matsumoto, Senior Manager at Anritsu.

“We, TradeLog, Inc., proudly support the decentralized Global Battery Passport project, driven by the efforts of MOBI and its dedicated Implementers,” said Alvin Ishiguro, Project Coordinator at TradeLog, Inc. “Going forward, we will continue to deliver our customers new experiences through blockchain technologies in the energy sector.”

About MOBI

MOBI is a global nonprofit Web3 consortium. We are creating standards for trusted Self-Sovereign Data and Identities (e.g. vehicles, people, businesses, things), verifiable credentials, and cross-industry interoperability. Our goal is to make the digital economy more efficient, equitable, decentralized, and circular while preserving data privacy for users and providers alike. For additional information about joining MOBI, please visit www.dlt.mobi.

Media Contact: Grace Pulliam, MOBI Communications Manager

Email: grace@dlt.mobi | Twitter: twitter.com/dltmobi | Linkedin: MOBI

###

The post First Web3 Global Battery Passport Implementation for Current and Future Regulatory Compliance first appeared on MOBI | The New Economy of Movement.


Next Level Supply Chain Podcast with GS1

Replay: Ways to Build an Enduring Brand on Amazon with Shannon Roddy


Today, the speed of change in the market and on Amazon is rapid, making it difficult for brands to keep up and see continued success. But never fear: Shannon Roddy of Avenue7Media is here to give us insights into the brand-building strategies you need to succeed on Amazon, and beyond!

Key takeaways:

Building a defensible brand is crucial for long-term success. Invest in building a brand that is recognizable, trustworthy, and unique to differentiate yourself from your competitors.

Amazon holds over 50% of the online market and can significantly impact the success or failure of a brand. Harnessing Amazon's data and feedback is crucial for identifying trends, understanding demographics, and developing new products.

Leveraging Amazon's platform and customer data can give you a competitive edge, but you need to adapt to changing customer preferences and market demands.


Connect with GS1 US:

Our website - www.gs1us.org

GS1US on LinkedIn


Connect with our guest:

Follow Shannon Roddy on LinkedIn

More on Avenue7Media


Jump into the Conversation:

[1:42] Can you share a little bit of your background and what you’ve been working on in the last couple of years?

[4:31] The Amazon space is constantly evolving, are there some major trends or changes that have happened recently?

[10:33] You gave us some examples when we talked before of how things can go wrong for brands on Amazon, so how can you help them make things go right?

[14:45] What are some other tips and tricks that you can offer?

[18:37] Does that also mean discontinuing product one and two while you expand out, or is that what you learn from the data?

[27:14] What do you see is next from Amazon’s perspective?

[29:06] What trends are you seeing that are blowing your mind?

[32:31] What’s your favorite technology?


Digital Identity NZ

Government bringing in new digital trust framework


The government has quietly ushered in the beginnings of what it hopes will be the answer to people’s experiences of fraud and lack of trust online. Its new digital trust framework has gone live in recent days.

Digital Identity New Zealand Executive Director Colin Wallis spoke to Radio New Zealand this morning, “The intent is that you’ll have a safer digital playing field as a baseline to build other services on top of. It’s just going to take some time for the ripple through where we are now for it to become seismic.”

Listen to the full recording

You can learn more about the DISTF and the Digital Public Infrastructure on Tuesday 13 August at The Digital Trust Hui at Te Papa, Te Whanganui a Tara.

The post Government bringing in new digital trust framework appeared first on Digital Identity New Zealand.

Tuesday, 09. July 2024

We Are Open co-op

Behind the Scenes of Our New Project on Job Readiness Credentials

A step-by-step guide to our project kickoff with Jobs for the Future and International Rescue Committee

Context

We Are Open Co-op (WAO) is kicking off some work this week, collaborating with Jobs for the Future (JFF) to assess the Job Readiness Credential provided by the International Rescue Committee (IRC). WAO is managing the project, developing a user research strategy, preparing necessary materials, and conducting interviews with employers, IRC staff, and, if possible, IRC clients.

Our broad key question relates to how the visual design and metadata contained in a digital badge impact employer perceptions and interactions. We want to help JFF and the IRC have the most impact possible with the Job Readiness Credential because that impact means changing the lives of real people.

How we approach this kind of work

At the start of any project, it’s important to know the absolute basics. In fact, it’s a good time to get the Busytown Mysteries theme tune in your head as an earworm! The 5W’s and an H shown above help make sure we know all of the things necessary to set the project up for success. Ideally, we’d know most of this before even signing the contract, but anything missing we can pick up in the client kick-off meeting.

Before the client kick-off meeting, we have an internal project kick-off where we talk about everything from timelines and responsibilities, to setting up the digital environments in which we’ll do the work. If we need to purchase any new equipment or subscriptions, we’ll identify those in this meeting. Our guidelines for this can be found on the WAO wiki.

Communications and cadence

Early days of the JFF/IRC Trello board. It’s the usual kanban format with the addition of the self-explanatory ‘Feedback Needed’ column, along with ‘Undead’. The latter is for cards that would otherwise get stuck somewhere but that we don’t want to delete/archive, just in case they come back to bite us!

Getting into the right rhythm with clients is an art rather than a science. While it’s easy to put an hour in the calendar each week for a catch-up call, this is sub-optimal for anything other than the very short term because, in our experience, these kinds of calls quickly devolve into status update meetings.

Much better is to work as openly as possible. Sometimes that means entirely publicly with etherpads, public Trello boards, and the like. Other times, it’s working transparently with tools that provide either real-time or summary updates. Often this means that the number and frequency of meetings can be reduced. With our recent work with the DCC, for example, we met every other week, aiming for 45 minutes. Between meetings, we sent Loom videos and other sorts of outputs to make sure our collaborators knew how thinking had evolved.

While it’s important that there is a project lead from both sides, it’s also crucial that their inboxes do not become information silos. Larger organisations might use CRM systems, but for us information is best in context. So, for example, a Google Doc for ongoing meta-level important info, and everything else on the relevant Trello card (or equivalent).

Documentation is not putting a message in a Slack channel or mentioning something during a meeting. Documentation is writing something down in an uncontroversial way that makes sense to everybody involved in the project. This is important because humans can only hold so much information in our heads at one time, and our memories can be faulty.

Everything is a work in progress

CC BY-ND Visual Thinkery for WAO

‘Perpetual beta’ is another name for saying that everything is a work in progress. What’s true of software is true of documentation and everything involved in a project. Conclusions are provisional and based on the data and knowledge we had at the time.

To account for this, we usually version our work, starting at v0.1 rather than 1.0. The reason for this is to show the client (and ourselves) that we’re working towards our first fully-formed opinions and outputs. It’s all part of our attempt to work openly and show our work.

With this work that we are starting with JFF and IRC, we’ll be talking to stakeholders in a couple of different places. Our human brains want to take shortcuts and jump to conclusions quickly so that we can take action. However, we’ve learned to “sit in ambiguity” for long enough to allow thoughts and reflections to percolate. This slower kind of thinking allows us to spot things that might have been missed by our ‘System 1’ mode of thought.

Conclusion

We’re greatly looking forward to getting started with this work. We haven’t gone into how we perform user research, which is perhaps the topic for a future post. There’s a lot to cover from that point of view in terms of ethics, data, and different kinds of methodologies.

What we hope that we have shown in this post is our commitment to working openly, holistically, and thoroughly so that the outputs we generate are trusted, interesting, and actionable. We’ll share more on the project as it progresses.

Behind the Scenes of Our New Project on Job Readiness Credentials was originally published in We Are Open Co-op on Medium, where people are continuing the conversation by highlighting and responding to this story.

Monday, 08. July 2024

Identity At The Center - Podcast

The Identity at the Center Podcast episode this week dives into passkey insights and challenges with Martin Sandren from IKEA


The Identity at the Center Podcast episode this week dives into passkey insights and challenges with none other than Martin Sandren from IKEA. We discussed everything from the future of passkeys to AI's role in cybersecurity. This episode is packed with valuable insights and practical advice for passkey adoption in the real world.

Watch it at https://www.youtube.com/watch?v=R94eG1gTcN8 or listen in your podcast app. Visit idacpodcast.com for more info.

#iam #podcast #idac

Friday, 05. July 2024

Ceramic Network

Ceramic Nodes in Production: Example Costs + Scenarios

Running a Ceramic node involves several key services. Learn about what production costs to expect across example hypothetical scenarios.

Running a Ceramic node in a production environment involves several key components. This article provides an overview of the necessary resources and cost estimates for deploying a Ceramic node in the cloud. While we showcase only two specific providers for the required services (DigitalOcean and QuickNode), we hope the cost examples in the hypothetical scenarios below will give you a general idea of what to expect.

Components Required for a Ceramic Node

There are several sub-services to consider when running a Ceramic node in production, each serving different functions. As such, you will need:

1. JS-Ceramic: Tracks and stores the latest tips for pinned streams, caches stream state, provides the HTTP API service for connected clients to read, and communicates with the Ceramic Anchor Service (CAS) for blockchain anchoring and timestamping.

2. Ceramic-One: These nodes store the actual data and coordinate with network participants.

3. Postgres Database: Required for indexing data.

4. Ethereum RPC Node API Access: Required to validate CAS anchors.

5. Ceramic Anchor Service (CAS) Access: Anchors Ceramic protocol proofs to the blockchain. This service is currently funded by 3Box Labs; eventually, this function will be provided by node operators, with some expected cost.

Baseline Recommended Resources

Given the services above, the Ceramic team has tested and organized a set of “baseline” configuration settings we recommend when setting up your node. Since these are baseline (average) figures, you may need to increase resourcing based on your actual usage:

JS-Ceramic
- 2 vCPU
- 4 GB memory
- 10 GB disk for state

Ceramic-One
- 4 vCPU
- 4 GB memory
- 100 GB disk for storage

Postgres Database
- 2 vCPU
- 4 GB memory
- 10 GB disk for indexing

High Traffic Recommended Resources

For heavier workloads, the Ceramic team has also tested and organized a set of “High Traffic” configuration settings. As with the baseline, treat these as a starting point and adjust resourcing based on your actual usage:

JS-Ceramic
- 2 vCPU
- 4 GB memory
- 10 GB disk for state
- 10,000 IOPs

Ceramic-One
- 6 vCPU
- 8 GB memory
- 500 GB disk for storage
- 15,000 IOPs

Postgres Database
- 2 vCPU
- 4 GB memory
- 10 GB disk for indexing

High Availability Configuration

For high availability, an additional node can be configured to sync data and handle dynamic read/write tasks, thus doubling the cost of a single-node setup.

Ethereum RPC Node Endpoint Costs

We’ve also chosen QuickNode to provide several RPC cost examples:

- QuickNode Base Plan: $10/month (100 million API credits, 2 endpoints, 550 credits/second)
- QuickNode Middle Plan: $49/month (500 million API credits, 10 endpoints, 2,500 credits/second)
- QuickNode Premium Plan: $299/month (3 billion API credits, 20 endpoints, 6,000 credits/second)

Hypothetical Scenarios and Cost Estimates

Let’s walk through two hypothetical need scenarios and use them to help estimate our cost structure:

Application A: Small User Base
- User Base: 10,000 monthly active users
- Query Behavior: 30% writes, 70% reads
- Availability: Low-priority
- Configuration: Baseline resources
- Cost Estimate:
  - Node: $96/month
  - Ethereum RPC: $10/month
  - Total: $106/month

Application B: Write-Heavy Mid-Sized Application
- User Base: 100,000-500,000 monthly active users
- Query Behavior: 70% writes, 30% reads
- Availability: High priority (2-node setup)
- Configuration: High Traffic
- Cost Estimate:
  - Nodes (2x): $918/month (2x $459)
  - Ethereum RPC: $49/month
  - Total: $967/month
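The totals above are simple sums of the per-service prices. Here is a minimal sketch of that arithmetic in TypeScript, with this article's example figures hard-coded as assumptions; substitute your own provider quotes:

type Scenario = {
  name: string;
  nodeCostPerMonth: number; // one node at the chosen configuration
  nodeCount: number;        // 2+ for high-availability setups
  rpcPlanPerMonth: number;  // e.g. QuickNode Base = 10, Middle = 49
};

function totalMonthlyCost(s: Scenario): number {
  return s.nodeCostPerMonth * s.nodeCount + s.rpcPlanPerMonth;
}

const scenarios: Scenario[] = [
  { name: "Application A", nodeCostPerMonth: 96, nodeCount: 1, rpcPlanPerMonth: 10 },
  { name: "Application B", nodeCostPerMonth: 459, nodeCount: 2, rpcPlanPerMonth: 49 },
];

for (const s of scenarios) {
  console.log(`${s.name}: $${totalMonthlyCost(s)}/month`); // A: $106, B: $967
}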

Example GCP budget

Other Considerations

Additional cloud costs must be considered for networking - these costs will vary based on traffic patterns. Most cloud providers offer free traffic ingress to the nodes but will charge for egress, or data leaving the nodes.

Running a Ceramic node in production involves various components and resources, each contributing to the overall cost. By understanding the necessary configurations and associated costs, developers can make informed decisions tailored to their application's needs and user base. High availability setups and resource over-provisioning can significantly impact costs, especially for mid-sized applications with high traffic and write volumes.

Thursday, 04. July 2024

Digital ID for Canadians

OIX and DIACC join forces to move digital trust and verification interoperability forward

Open Identity Exchange (OIX) and DIACC commit to finding alignment for global policies on digital trust and verification.

UK, June 2024 – The global non-profit Open Identity Exchange (OIX) and the Canadian non-profit Digital ID Authentication Council of Canada (DIACC) have committed to working together to advance global digital interoperability – a crucial element for trusted, successful international trade in a rapidly advancing digital global economy.

OIX is an influential global community for all those involved in the ID sector to connect and collaborate, developing the thought leadership and guidance needed to enable interoperable, trusted identities carried seamlessly from place to place in ‘roaming wallets’ for everyone. DIACC is an equally influential community of public and private sector leaders committed to securing the benefits of an inclusive digital economy by promoting user-centric design principles and by verifying private sector services against the Pan-Canadian Trust Framework (PCTF), supporting a secure ecosystem of services that enables user-directed information verification between public and private sector data authorities.

The two organisations will explore how different country-based policies related to identity management, verification, security, data privacy innovation and approaches to digital identity assurance can be compared and analysed so that more rapid progress can be made towards global digital ID interoperability through alignment of policy or acceptance of policy differences.

The collaboration will focus on advancing methods for participants in one framework to accept identity verification and digital credentials verified through another trust framework, based on a mixture of policy acceptance and technology adaptation. DIACC and OIX will explore equivalency and interoperability processes; identify potential alignments, new standards required, and gaps that may need to be addressed; and highlight use cases that can be facilitated through interoperability across digital ecosystems. Within this work, they will explore methods to describe common features of jurisdictional and sectoral trust frameworks, and share the resulting insights widely as a resource.

The exchange and transfer of knowledge and expertise will be at the heart of this collaboration. OIX and DIACC will work together to create ‘intellectual capital’ to shape debate and bring about actions, moving identity management, data privacy, and security forward at pace.

Nick Mothershaw, Chief Identity Strategist at OIX, said: “The benefits of the digital global economy will be vast, but there is still some way to go before everyone can confidently access them. Our collaboration with DIACC will play a critical role. The fantastic progress DIACC has already made across Canada is an exemplar for global interoperability and will provide much needed insight, tools and guidance to pave a much clearer way forward globally.

“Our plans are to share our work with other trust frameworks across the globe, by publishing the criteria and values, and in the short term creating an interim tool for trust frameworks to use for policy areas. We also want to secure their input on what they want to see in a Trust Framework Comparison tool, as well as to start demonstrating how a roaming wallet will work.”

Joni Brennan, DIACC President, said: “We’re thrilled to collaborate with the Open Identity Exchange. The formalization of our liaison demonstrates progress in supporting our shared values to advance secure, user-centric digital identity solutions globally. Our collaboration will leverage each organization’s expertise to explore opportunities to foster innovation, enhance interoperability, and build public trust in digital services by identifying the alignments and gaps between jurisdictional and sectoral trust frameworks.”

For more information, please contact Serj Hallam at communications@openidentityexchange.org 

About The Open Identity Exchange (OIX)

The OIX is a non-profit trade organisation on a mission to create a world where everyone can prove their identity and eligibility anywhere through a universally trusted ID. OIX is a community for all those involved in the ID sector to connect and collaborate, developing the guidance needed for inter-operable, trusted identities. Through our definition of, and education on Trust Frameworks, we create the rules, tools and confidence that will allow every individual a trusted, universally accepted, identity.

About The Digital ID and Authentication Council of Canada (DIACC)

The Digital ID and Authentication Council of Canada (DIACC) is a not-for-profit corporation of Canada that benefits from membership of public and private sector leaders committed to developing a trust framework to enable Canada’s full and secure participation in the global digital economy. DIACC’s objective is to unlock economic opportunities for consumers and businesses by providing the framework to develop a robust, secure, scalable and privacy-enhancing digital identification and authentication ecosystem that will decrease costs for governments, consumers, and businesses while improving service delivery and driving GDP growth.


Origin Trail

DKG V8: Scaling Verifiable Internet for AI to Any Device, for Anyone, on Any Chain


Driving data interconnectivity, interoperability, and integrity, the Decentralized Knowledge Graph (DKG), now in its 6th iteration, delivers significant advancements that have benefited world-class organizations and shaped standards for industrial information exchange. Through partnerships with entities such as the British Standards Institution¹²³, GS1⁴⁵, European Blockchain Sandbox⁶, and various government-funded initiatives, the DKG has also played a crucial role in informing public policies.

DKG uniquely and effectively addresses the challenges of data ownership, AI hallucinations, and bias⁷ with the Decentralized Retrieval-Augmented Generation (dRAG)⁸ framework. dRAG drives a vast advancement of the RAG model initially developed by Meta⁹, by organizing external sources in a DKG while introducing incentives to grow a global, crowdsourced network of knowledge made available for AI models to use.

Through a prototype, DKG V8 has demonstrated the unprecedented scale at which the Verifiable Internet for AI can drive value for anyone, on any device, and on any chain. Addressing sensitive data concerns, scalability, and AI challenges concurrently has brought encouraging results that shape the expected V8 release timeline.

DKG V8 — for Anyone, on Any Device, on Any Chain at Internet Scale

OriginTrail DKG has been battle-tested in real-world applications and is increasingly used by an ecosystem of organizations and government-supported initiatives. To date, no decentralized system has scaled in a production environment the way the V6 DKG has. However, the current capacity of the DKG has reached its limits in supporting growing usage requirements, prompting a transition to V8, which has evolved to tackle the scale at which AI is consumed in any environment.

Data has been growing exponentially for decades, with AI accelerating that growth further — according to the latest estimates, 402.74 million terabytes of data are created each day¹⁰. This trend is increasingly visible in the rising demand for additional capacity in the DKG, driven by data-intensive industry deployments in aerospace, manufacturing, railways, consumer goods, and construction.

Version 8 of the DKG has therefore been designed with major scalability improvements at multiple levels, with a prototyped implementation tested in collaboration with partners from the sectors mentioned above.

3 key products of OriginTrail DKG V8

The major advancement that DKG V8 is making is in expanding the OriginTrail ecosystem’s product suite to 3 key products:

- DKG Core Node V8 — highly scalable network nodes forming the network core, persisting the public replicated DKG
- DKG Edge Node V8 — user-friendly node applications tailored to edge devices (phones, laptops, etc.)
- ChatDKG V8 — the launchpad for creating AI solutions using decentralized Retrieval-Augmented Generation (dRAG)

DKG Edge Node — enabling the largest, internet-scale decentralized physical infrastructure network (DePIN)

The newcomer in the product suite is the DKG Edge Node — a new type of DKG node enabling the OriginTrail ecosystem to tackle the global challenges described above. As the name suggests, DKG Edge Nodes can operate on internet edge devices. Devices such as personal computers, mobile phones, wearables, and IoT devices, as well as enterprise and government systems, hold huge volumes of important data activity; DKG Edge Nodes will enable that data to enter the AI age in a safe and privacy-preserving way. The DKG Edge Node will enable such sensitive data to remain protected on the device, giving owners full control over how their data is shared.

Together with being protected on the device, edge-node data becomes a part of the global DKG with precise access management permissions controlled by the data owner. In this way, AI applications that the owner allows data access to will be able to use it together with the public data in the DKG via decentralized Retrieval Augmented Generation (dRAG).

Since such AI applications can equally be run locally on devices directly, this enables fully privacy-preserving AI solutions aimed at the ever-growing number of devices on the network edge that can at the same time use both public and private DKG data. The introduction of the DKG edge node enables the DKG to quickly expand to be the largest, internet-scale decentralized physical infrastructure network (DePIN).

New features of the DKG Edge Node

To unlock these powerful capabilities, the DKG Edge Node will include new features that have previously not been available on DKG nodes but were elements of other proprietary or open-source products.

- To enable seamless creation of knowledge, DKG nodes will inherit the proven knowledge publishing pipelines from the Network Operating System (nOS).
- The data protection techniques for private and sensitive data will be based on the NGI-funded OpenPKG project outcomes.
- The DKG Edge Node will support all major standards such as GS1 Digital Link, EPCIS, Verifiable Credentials, and Decentralized Identifiers.
- To support the growing field of knowledge graph implementations globally, it will enable seamless integrations with major knowledge graph providers such as Ontotext, Oracle, Snowflake, Neo4j, Amazon Neptune, and others.

DKG Edge Node V8 Prototype — Oura Ring integration with demonstrated 400 Knowledge Assets published in 10 seconds

DKG V8 Timeline

The V8 DKG launch sequence consists of 4 stages, aligned with the wider OriginTrail ecosystem roadmap, with a forkless upgrade to V8.

Stage 1: V8 multi-chain infrastructure deployment

- Paranet deployment and first IPOs launched
- Base blockchain integration
- Cross-chain knowledge mining support

Stage 2: DKG core internet-scale V8 testnet launch

- Asynchronous backing
- Knowledge Assets V2: Batch minting (in prototype)
- DKG Core: Random sampling (in prototype)

Stage 3: DKG edge nodes on V8 testnet

- Edge node beta launch
- Knowledge Assets V2: Batch minting & native vector support
- DKG Core: Random sampling deployment

Stage 4: V8 mainnet upgrade deployment (October 2024)

To stay on trac(k) with updates on DKG V8 as it nears the deployment phase, make sure to join our Telegram or Discord channels!

¹https://v1.bsigroup.com/en-GB/insights-and-media/media-centre/press-releases/2023/july/new-solution-developed-for-cross-border-food-transfers/

²https://page.bsigroup.com/BSI-Academy-Blockchain-Solution

³https://www.bsigroup.com/globalassets/localfiles/en-th/innovation/blockchain-white-paper-th.pdf

⁴https://www.gs1.org/sites/default/files/bridgingblockchains.pdf

⁵https://www.gs1si.org/novice/novica/origintrail-resuje-izziv-ponarejenega-viskija

⁶https://ec.europa.eu/digital-building-blocks/sites/display/EBSISANDCOLLAB/European+Blockchain+Sandbox+announces+the+selected+projects+for+the+second+cohort

⁷https://origintrail.io/documents/Verifiable_Internet_for_Artificial_Intelligence_whitepaper_v3_pre_publication.pdf

⁸https://origintrail.io/blog/decentralized-rag-with-origintrail-dkg-and-nvidia-build-ecosystem

⁹https://ai.meta.com/blog/retrieval-augmented-generation-streamlining-the-creation-of-intelligent-natural-language-processing-models/

DKG V8: Scaling Verifiable Internet for AI to Any Device, for Anyone, on Any Chain was originally published in OriginTrail on Medium, where people are continuing the conversation by highlighting and responding to this story.


Energy Web

Energy Web Launches Full RPC Node for the Energy Web Chain

Robust, Reliable Full Node RPC Now Available with Multiple Deployment Options

Energy Web, a leader in blockchain technology solutions for the energy sector, is proud to announce the launch of its new Full RPC Node for the Energy Web Chain (EWC). This state-of-the-art product is designed to provide a robust and reliable full node EWC RPC offering, ensuring seamless and efficient operations for energy sector enterprises and application developers.

The new Full RPC Node is available in two flexible deployment options: fully managed or Bring Your Own Cloud (BYOC). Clients can choose to deploy their node on leading cloud platforms including AWS, GCP, and Digital Ocean. This flexibility ensures that organizations can select the deployment model that best fits their operational needs and technical environments.

Key features of the Energy Web Full RPC Node include:

- Fully Dedicated Node: Each client receives a dedicated node, eliminating rate limiting and ensuring optimal performance and security for their blockchain applications.
- Comprehensive Security: Nodes are properly secured, providing peace of mind that organizational data and transactions are protected.
- Embedded Analytics Dashboards: Integrated analytics dashboards offer deep insights and real-time monitoring, enabling clients to make informed decisions based on accurate data.

The introduction of the Full RPC Node further expands Energy Web’s infrastructure offerings, reinforcing the company’s commitment to providing cutting-edge solutions that meet the evolving needs of the energy sector.

“With the launch of our Full RPC Node, we’re offering a powerful tool for organizations that require robust access to the Energy Web Chain,” said Jesse Morris, Senior Fellow of Energy Web. “This product ensures that our clients can operate their applications smoothly and securely, with the flexibility to choose a deployment option that best suits their needs.”
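Because the EWC is EVM-compatible, a dedicated endpoint like this is queried with standard Ethereum JSON-RPC. A minimal sketch, assuming a hypothetical endpoint URL in place of the one provisioned for your deployment:

// Query the latest block number from a dedicated EWC RPC endpoint using
// standard Ethereum JSON-RPC. The URL below is a hypothetical placeholder.
const EWC_RPC_URL = "https://my-dedicated-node.example.com/rpc";

async function getLatestBlockNumber(): Promise<number> {
  const response = await fetch(EWC_RPC_URL, {
    method: "POST",
    headers: { "Content-Type": "application/json" },
    body: JSON.stringify({ jsonrpc: "2.0", id: 1, method: "eth_blockNumber", params: [] }),
  });
  const { result } = (await response.json()) as { result: string };
  return parseInt(result, 16); // the node returns a hex-encoded block number
}

getLatestBlockNumber().then((n) => console.log("Latest EWC block:", n));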

For more information about the Energy Web Full RPC Node and how it can benefit your organization, please visit www.smartflow.org

About Energy Web

Energy Web is a global non-profit organization accelerating the energy transition by developing and deploying open-source decentralized technologies. Our solutions leverage blockchain to enable new market mechanisms and decentralized applications that empower energy companies, grid operators, and customers to take control of their energy futures.

Energy Web Launches Full RPC Node for the Energy Web Chain was originally published in Energy Web on Medium, where people are continuing the conversation by highlighting and responding to this story.

Tuesday, 02. July 2024

FIDO Alliance

What is a passkey? Why Apple is betting on password-free tech


The digital realm has long struggled with the vulnerabilities inherent in password-based authentication systems. With iOS 18 launching in September, Apple introduces a groundbreaking API for developers to implement passkeys, transforming how users secure their online accounts. This innovation is set to create a password-less future, significantly enhancing user data protection.

What Are Passkeys?

Passkeys are a sophisticated, passwordless login option for apps and websites developed by the FIDO Alliance. They consist of a “private key” stored on the user’s device and a “public key” residing with the service. This dual-key system undergoes an encrypted verification process, ensuring that access is granted only when the user’s biometrics or device PIN confirm their identity. This system effectively eliminates the need for passwords and multi-factor authentication codes, creating a seamless and secure user experience.
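For developers, the flow described above maps onto the W3C WebAuthn API that passkeys build on. A minimal registration sketch in the browser, with placeholder relying-party and user values (in production the challenge comes from your server):

async function registerPasskey(): Promise<void> {
  const credential = await navigator.credentials.create({
    publicKey: {
      challenge: crypto.getRandomValues(new Uint8Array(32)), // server-issued in production
      rp: { name: "Example App", id: "example.com" }, // hypothetical relying party
      user: {
        id: crypto.getRandomValues(new Uint8Array(16)),
        name: "alice@example.com",
        displayName: "Alice",
      },
      pubKeyCredParams: [
        { type: "public-key", alg: -7 },   // ES256
        { type: "public-key", alg: -257 }, // RS256
      ],
      authenticatorSelection: {
        residentKey: "required",      // a discoverable credential, i.e. a passkey
        userVerification: "required", // biometrics or device PIN
      },
    },
  });
  // The resulting public key is registered with the service; the private key
  // never leaves the user's device (or its synced keychain).
  console.log("New passkey credential:", credential);
}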

The Benefits of Passkeys

Traditional logins rely on passwords, which users often reuse across multiple sites, posing substantial security risks. Passkeys, however, are tied to the user’s unique device and biometric data, rendering them immune to phishing and brute-force attacks. If a passkey is stolen, it becomes useless without the rightful owner’s biometric verification. This intrinsic link between the user and the device significantly mitigates the threat landscape.
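Sign-in is the mirror image: the device proves possession of the private key by signing a server challenge after local biometric or PIN verification. A hedged sketch, again with placeholder values:

async function signInWithPasskey(): Promise<void> {
  const assertion = await navigator.credentials.get({
    publicKey: {
      challenge: crypto.getRandomValues(new Uint8Array(32)), // server-issued in production
      rpId: "example.com", // hypothetical relying party
      userVerification: "required", // triggers biometrics or the device PIN
    },
  });
  // The signed assertion goes back to the service, which verifies it against
  // the stored public key; no reusable secret crosses the wire, so a stolen
  // credential is useless without the owner's verification.
  console.log("Assertion:", assertion);
}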

Banks and Passkey Adoption

While the advantages of passkeys are clear, some industries have been slow to adopt, including banks. Andrew Shikiar, CEO and Executive Director of the FIDO Alliance, explains, “Banks and financial institutions operate in a highly regulated industry, so they are vigilant when it comes to ensuring that user authentication complies with relevant regulations. Synced passkeys introduce a new customer assurance model that compliance leads within banks are still adjusting to.”

However, Shikiar noted that “we are now seeing regulatory and other government bodies begin to give formal guidance on how industry should contemplate passkeys,” including an April 2024 missive from the U.S. Department of Commerce’s National Institute of Standards and Technology (NIST) offering guidance about implementation.

But Shikiar says that “banks are hypersensitive to customer experience,” too, and thus more cautious about changing how customers log in—even if passkeys are quicker and more secure. New login methods require educating customers—and that takes time.

Despite these bottlenecks, Shikiar says that banks are slowly moving away from strictly password-based logins because they “inherently understand that using a passkey as a primary factor is far superior to a password.”

The Collaborative Future of Passwordless Authentication

Apple’s implementation of passkeys underlines a collective effort by tech giants within the FIDO Alliance, including Microsoft and Google, to enhance internet security. The Alliance has pioneered developments in authentication standards, striving to eliminate the vulnerabilities of password-based systems. Users can visit the FIDO Alliance to learn more about the ongoing efforts and advancements in passkey technology and the latest in passkey implementation.

As passkeys gain traction, the internet moves closer to a future where security does not come at the expense of user convenience. The collaborative efforts of industry leaders within the FIDO Alliance signal a transformative shift towards more secure, passwordless authentication methods, promising a safer digital experience for all.


Ceramic Network

CeramicWorld 05


The 5th edition of the CeramicWorld is finally here! Here’s a quick recap of what has been happening in the Ceramic Ecosystem in the past few weeks:

- Orbis has launched a new plugin for Gitcoin Passport 🔑
- Index Network announces a Farcaster integration 💬
- Index Network and Ceramic are calling developers to build for the Base Onchain Summer! 🏖️
- Proof of Data is coming to EthCC ✈️
- Ceramic’s new Recon Protocol is almost here! 🔥

Supercharge your crypto database with OrbisDB plugin for Gitcoin Passport! 🔥

The OrbisDB team has just announced their new plugin for Gitcoin Passport.

OrbisDB is a decentralized database built on Ceramic for onchain builders, providing a practical, scalable solution for storing and managing open data. Gitcoin Passport lets users collect verifiable credentials to prove their identity and trustworthiness without revealing personal information, providing apps with a safeguard against sybil attacks and bad actors.

The new plugin allows developers to simply integrate the no-code Gitcoin Passport plugin with the OrbisDB instance to automatically generate reputation scores and filter out malicious actors from being indexed.

Check out this video to learn more and see the new plugin in action:

If you’d like to become a Beta tester for Orbis plugins, shoot the team a DM!

Index Network adds Farcaster integration

Index Network has recently added a Farcaster integration to their decentralized semantic index.

Index is a composable discovery protocol built on Ceramic, allowing the creation of truly personalized and autonomous discovery experiences across the web.

This integration allows for seamless interaction with decentralized graphs, including user-owned knowledge graphs on Ceramic and social discourse on Farcaster. Paired with autonomous agents, which can be used to subscribe to specific contexts, this new integration pushes the limits of what’s possible with semantic search. Check out the demo below:

Learn more about Index Network

Build on Index Network for the Base Onchain Summer

Index Network has teamed up with the Ceramic team to call developers to build on Index Network for this year’s Base Onchain Summer! Base Onchain Summer is a multi-week celebration of onchain art, gaming, music, and more, powered by Base.

Devs are invited to build composable search use cases between Base and other projects participating in Base Onchain Summer. For example, those use cases can include:

- Composability with commerce (Shopify)
- Composability with social graphs (Farcaster)
- Composability with on-chain projects (Zora, Nouns)

TIP: Consider developing agents to facilitate user interactions with Index, such as notification agents, context subscription agents, or multi-agent scenarios that enable conversational participation.

And of course, there are prizes! A total prize pool of 2,250 USDC will be distributed across the top 3 applications!

Check out the bounty details on Bountycaster and reach out to Ceramic and Index teams on Farcaster if you have any additional questions.

Start building today!

Proof of Data is coming to EthCC!

The third edition of the Proof of Data event series is coming to Brussels! Join the Ceramic and Textile (creators of Tableland and Basin) teams for an inspiring afternoon, expanding on the essential discussions from EthCC. This event will unite pioneers and practitioners in the decentralized data realm. Engage in dynamic panel discussions and networking opportunities, ideal for developers and innovators eager to push the boundaries of decentralized technology.

Featured presenters from IoTeX, DIMO, WeatherXM, and Filecoin will share the latest advancements and projects, sparking engaging conversations with all attendees. A moderator will guide these discussions, ensuring critical themes in crypto, web3, and beyond are covered.

Don’t miss this chance to connect, collaborate, and contribute to the future of decentralized technology. Be part of the conversation driving the next wave of technological innovation!

RSVP today and join our Data Room Telegram channel.

Index Network & CrewAI Integration

Index now supports an integration with CrewAI, which brings an intuitive way to design multi-agent systems, with Index offering composable vector database functionality. Now, autonomous agents can synthesize data from multiple sources seamlessly.

Learn more!

Ceramic’s new Recon Protocol is almost here!

The core Ceramic team is getting ready for the public release of Ceramic’s new Recon Protocol. This new Ceramic networking protocol improves network scalability and data syncing efficiency. It unlocks data sharing between nodes and enables users to run multiple nodes that stay in sync and load balanced. This will enable highly available Ceramic deployments.

Ceramic’s Recon Protocol is in the last testing stages, with some key partners already building on it. It will be launched as part of the next Ceramic release, which will unlock the document migration process from js-ceramic + Kubo to js-ceramic + rust-ceramic.

The next Ceramic public release is scheduled in a few weeks' time. Keep an eye on the Ceramic public roadmap and Ceramic blog for updates regarding the release!
Ceramic Community Content

- BOUNTY: Build composable search applications on Index Network
- TRENDING DISCUSSION: Ceramic without Anchoring
- TRENDING DISCUSSION: Private Data Architecture
- TUTORIAL: Save OpenAI Chats to OrbisDB on Ceramic
- VIDEO: How data logs are defined to be easily discoverable in an open network by Charles from the Orbis team
- VIDEO: OrbisDB lifecycle by Charles from the Orbis team
- VIDEO TUTORIALS: Check out the latest video tutorials shared on the Ceramic YouTube channel

Events

Meet the Ceramic team at EthCC and side events:

- July 9, Proof of Data
- July 9, Data on Tap: Data & AI Cocktail Hour with Ceramic & Tableland
- July 10, Builders Brunch
- July 11, Ceramic ecosystem developers call

Contact Us

Want to get in touch with the Ceramic core team? Fill out this form (1m). Otherwise, drop us a note in the Forum.



Until next time! 🔥

Monday, 01. July 2024

We Are Open co-op

Finding your activist friends

Solidarity, common ground, and intersectionality in the climate movement

This post looks at ways we can channel our activist energy in ways that address multiple issues and find belonging in adjacent communities.

Recently at a week long event that brought together energy transition activists from around the Mediterranean, I was pleased to meet a variety of people with intersectional understandings of the climate crisis. Together, we explored what intersectionality looks like in the climate movement and how we can tell stories that lead to action.

cc-by-nd Bryan Mathers

Expanding our activist energies

Although many of us care about a variety of struggles, we don’t have the time or the energy to get involved in every single thing. We focus our energies, we have to. The problem, of course, is that each issue and cause needs the visibility a group of activists coming together can provide. So how can we focus ourselves and find energy to do more?

For just a couple of hours last week, I worked with a small group of rabble-rousers to create campaign ideas for the challenge:

The intersection and complexities of our structural problems makes people feel powerless.

The structural problems we are facing are complex and co-exist within a matrix of other challenges. We are dealing with environmental crisis, racist societies, and social inequalities left, right, and centre. No matter how positive your personality may be, it’s hard to stay optimistic. No matter how cognisant you are about other struggles, it’s hard to pay attention to everything.

When we are overwhelmed and feeling powerless, we tend to recede. Our group had the insight that feeling overwhelmed or powerless is lonely. Loneliness is a cascading psychological phenomenon that halts action and feeds despair.

We started to think about how we might address loneliness in activist movements by telling stories that help people who feel like they belong to one group (e.g. environmentalists) to understand their connection to other groups.

Our theory of change is that finding belonging amongst your activist friends can provide you with solidarity and a source of energy. We wanted to push for intersectionality in the climate movement.

Intersections in audiences, the Audience Ikigai, cc-by WAO

Choose three: intersectionality in practice

Everybody cares about something, whether that be sports, the environment, or even status. If you can identify one thing you care about, you can surely identify three others. Using arbitrary design constraints, like “choosing three” is a good way to move any idea forward, including ideas around your own activism or civic participation.

We know that climate change disproportionately affects already marginalised communities, which can exacerbate existing social inequalities. With this in mind, we chose to look at the intersectionality of climate with three human rights issues:

Refugee and migrant justice Women’s rights LGBTQIA+ rights

Easy, right? Choosing three issues to put your energy into is a lot less than “everything”. Three is also enough to give variability and provide access to different communities. Different communities come with different energies and that is something you can tap into when needed.

We often consider the thematic intersections of our own work. See how we work in the overlaps together with thoughtful, ethical organisations in Practical utopias and rewilding work.

Find connections and leverage points

Intersectionality is about understanding the points of interconnection between two issues. Seeing the overlaps means that you can connect issues together in new and novel ways. Novelty is just one storytelling tactic in calling attention to a particular issue. Once you’ve determined places to focus, you can further narrow your focus by looking for leverage points that lead to connection.

- Refugee and migrant justice — From climate-induced displacement to the fact that people who are forced to migrate, whatever the reasons, can face challenges in accessing their basic rights, refugee and migrant justice ties heavily to other environmental and human rights issues.
- Women’s rights — Women’s societal roles as caregivers and food producers make them more vulnerable to the effects of climate change. It’s now widely understood that educating girls is a catalyst towards climate action.
- LGBTQIA+ rights — Again, marginalised communities are disproportionately affected by the climate crisis. LGBTQIA+ people are often members of other marginalised communities, such as racial minorities, and they are more likely to live in poverty.

Human rights and environmental justice are big and complex areas of focus. Thinking about how the complexities of these issues overlap can help narrow down the impact you want to have.

cc-by Iris Maertens with Dancing Fox

Have some fun

Yes, structural problems are serious and complicated. It is essential to be aware of your own privileges (whatever they may be) and to think deeply about the issues and communities you are working with and within. It’s also important to know that joy is a common human emotional experience. Inciting joy is a way to truly help people. It can help build psychological characteristics that help people deal with whatever life throws at them. Joy can also open people up to a better tomorrow.

At the event I attended last week, as we explored the intersectionality of environmentalism with human rights, we considered how we might inspire people to be joyfully curious about learning more about an issue they might not have much involvement with.

We developed a few posters, designed to be displayed on a metro, to inspire this curiosity.

Our poster ideas, drawn by the incredible Iris Maertens

Solidarity with others

The complexity of our global problems can be overwhelming, but we cannot solve one complex issue without tackling the intertwining structural issues. Finding ways to relate what you care about to what others care about is a way to build solidarity and, therefore, momentum. It’s not always easy, but the more you can participate in cross-cutting social and environmental communities, the bigger our collective power becomes.

I worked with inspiring people from these organisations:

- “La Casa dels Futurs is both an ongoing project dedicated to supporting intersectional organizing between social and ecological movements, and a campaign to create a permanent Climate Justice Center and Movement School…”
- “Rinascimento Green…aims to bring together various pieces of civil society to promote, through a path of popular participation, a bottom-up Green Deal.”
- “WeSmellGas is a collective of organisers, researchers and film-makers based in Northern Europe. Climate justice can only be realised by dismantling capitalism and the imperial processes that reinforce it, including our current extractivist energy system.”

🔥 Do you need help with storytelling and advocacy? Check out WAO’s free, email-based Feminism is for Everybody course, or get in touch!

Finding your activist friends was originally published in We Are Open Co-op on Medium, where people are continuing the conversation by highlighting and responding to this story.


Identity At The Center - Podcast

Happy birthday to the Identity at the Center podcast!


Happy birthday to the Identity at the Center podcast! Our latest episode is particularly special as we celebrate the milestone of five incredible years of Identity at the Center. In this special episode, we celebrate the podcast’s fifth birthday by revisiting our very first episode to update and explain the process we use to develop IAM strategies and roadmaps. Thank you to our amazing listeners for your continued support!

Watch it here https://youtu.be/OUHTB1ncLME?si=fxeS8bNtzmKaW2kp or listen in your podcast app.

More info at idacpodcast.com

#iam #podcast #idac

Friday, 28. June 2024

Ceramic Network

Save OpenAI Chats to OrbisDB on Ceramic (Tutorial)

Build an AI-powered chatbot using OrbisDB for storage and the OpenAI API.

Last year we partnered with Learn Web 3 (a free educational platform for Web3 developers) to publish a tutorial on Saving OpenAI Chats to ComposeDB on Ceramic to showcase an easy-to-understand design architecture and how ComposeDB could be leveraged for storage. In that example we showed how to configure and deploy a local ComposeDB/Ceramic node, walking through data model design, server configurations, model deployment, and runtime definition generation, all of which are necessary steps a developer must undergo before running the application locally.

But what if developers could bypass local node configuration altogether and start testing their database design and application logic immediately? What if they could do so with the assurances of no lock-ins, and the open option to move to a self-hosted configuration in the future? And finally, what if they could benefit from all of these things while enjoying a seamless developer experience that makes storage setup easy?

That's where OrbisDB comes in.

What is OrbisDB?

OrbisDB is an advanced decentralized database built on the Ceramic Network and offers an ORM-like interface that developers can leverage when integrating the OrbisDB SDK. Developers who have worked with Prisma or Drizzle with a Postgres instance will find this experience familiar and exceedingly easy to work with.

As for developer experience, what sets OrbisDB apart are the following:

- A user interface (UI) developers can run either locally or using a hosted Studio instance, bypassing the need to define and deploy data models by hand (which is still an option if using the SDK). The UI also includes data visualization (you can view data relevant to your applications in table format), as well as other views for configuring add-ons like plugins (described below).
- The option to leverage a growing list of plugins to enrich the data capabilities developers can incorporate into their application logic. Some example plugins offer gating ability, automatic resolution of ENS domains, sybil resistance, and more. Anyone can also build plugins and incorporate them immediately in the event they're running a standalone instance.
- The OrbisDB SDK, which wraps user authentication, client creation, schema creation (if developers prefer not to use the UI), and querying all under one roof, simplifying the list of dependencies developers need to worry about.
- The option to run an instance locally (similar to the ComposeDB tutorial mentioned above) or on a shared (hosted) instance. This is a significant feature for development and testing velocity, as it lets developers start writing and reading data right away without having to worry about node configuration. Once developers are ready to take their application to production after testing on the shared instance, setting up a self-hosted (standalone) instance is straightforward.

For this tutorial, we will be leveraging the hosted Studio instance to both define our data models and utilize a shared OrbisDB instance.

Let's Get Started!

Before we get started, you will need the following dependencies:

- MetaMask Chrome Extension (or a similar browser wallet for authentication)
- Node v20
- An OpenAI API key
- A project ID from WalletConnect
- A free OrbisDB Studio account - first, log in with your browser wallet. We will use this later to define our data models and obtain a context and environment ID

Initial Setup

First, clone the repository and install the dependencies:

git clone https://github.com/ceramicstudio/orbisdb-chatbot && cd orbisdb-chatbot
npm install

Next, create a copy of the example env file in your root directory:

cp .env.example .env

Visit the OpenAI signup page to create an account if you don't yet have one, and generate an API key. OpenAI offers a free API trial with $5 worth of credit (which can take you a LONG way). Go ahead and assign your new API key to OPENAI_API_KEY in your new .env file.

Navigate to WalletConnect and create a free account and a new project (with a name of your choosing and the App type selected). Copy the resulting project ID and assign it as the value for NEXT_PUBLIC_PROJECT_ID.
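As an aside, the chatbot logic will ultimately call OpenAI's chat completions endpoint with the key you configured above. The repository's exact calls aren't shown in this excerpt, but a minimal sketch using the standard openai Node SDK pattern looks like this (model choice and prompt handling are illustrative):

import OpenAI from "openai";

// Sketch only: not necessarily what the repository does.
const openai = new OpenAI({ apiKey: process.env.OPENAI_API_KEY });

async function askBot(userMessage: string): Promise<string | null> {
  const completion = await openai.chat.completions.create({
    model: "gpt-3.5-turbo",
    messages: [{ role: "user", content: userMessage }],
  });
  return completion.choices[0].message.content;
}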

OrbisDB Setup

If you're logged into your OrbisDB Studio account, we can start connecting our application to a shared OrbisDB instance.

First, you will need to define a new context. Contexts are a Ceramic-native feature exposed in all data management methods, and make it easy for developers to organize data across different applications or projects (there is also the option to leverage sub-contexts, but we'll save this for a future tutorial).

Go ahead and click "+ Add context" within your root studio view - feel free to give your new context a name and description of your choosing:

If you click into your new context you can view its corresponding ID:

Go ahead and copy this value and assign it to NEXT_PUBLIC_CONTEXT_ID in your .env file.

On the right-hand side, you should also see details about your setup:

Copy the value found under "Environment ID" and assign it to NEXT_PUBLIC_ENV_ID in your .env file. This ID is required to identify you when using the shared OrbisDB instance.

You will also see the endpoints for the shared Ceramic and OrbisDB instances in the same section. No need to copy these values as they are already hard-coded into the repository.

Defining Data Models

We will also use the Studio UI to define the data models our application needs. This demo application utilizes two simple data models found within the tables file in our repository:

- posts - this will contain each message within our conversation exchange. The "body" field will house the message itself, while the "tag" field will keep track of who the message came from (user vs. bot). This model will use the "List" account relation, which means an authenticated account can have an unbounded number of instance documents that fall under this definition.
- profiles - this model will allow us to assign additional data to ourselves and our chatbot, including a name, username, and fun emoji. The "actor" subfield will be used to differentiate between the user (using the value "human") and the chatbot (using the value "robot"). In contrast to posts, this model will use the "Set" account relation based on the "actor" subfield, which means an account can have exactly one instance document per value assigned to "actor". For example, this ensures that our application won't allow us to accidentally create more than one document with an "actor" subfield matching "human".

To start creating the models, navigate to "Model builder" from the Studio navigation. You can start by defining your "posts" table. After clicking "Create Model" you will be able to view the model ID:

Copy this value and assign it to NEXT_PUBLIC_POST_ID in your .env file.

Go through the same steps for your "profiles" table. However, be sure to select the "Set" option under "Account relation". Copy the resulting model ID and assign it to NEXT_PUBLIC_PROFILE_ID in your .env file.
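At this point you have collected every environment variable this walkthrough references. As a sanity check, the finished .env should look roughly like the sketch below (the keys are the ones named above; the values are placeholders):

OPENAI_API_KEY="sk-..."
NEXT_PUBLIC_PROJECT_ID="<your WalletConnect project ID>"
NEXT_PUBLIC_CONTEXT_ID="<your OrbisDB context ID>"
NEXT_PUBLIC_ENV_ID="<your OrbisDB environment ID>"
NEXT_PUBLIC_POST_ID="<your posts model ID>"
NEXT_PUBLIC_PROFILE_ID="<your profiles model ID>"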

Application Architecture

As mentioned above, the OrbisDB SDK makes it easy to instantiate clients, authenticate users, and run queries using the same library. As you'll note in the application repository, there are various components that need to be able to access the state of the authenticated user. While we're wrapping all components of our application within a WagmiConfig contextual wrapper (which will allow us to leverage Wagmi's hooks to see if a user's wallet is connected - learn more about this in our WalletConnect Tutorial), we also need a way to know if the user has an active OrbisDB session.

While there are multiple ways to facilitate this, our application uses Zustand for state management to circumvent the need for contextual wrappers or prop drilling.

If you take a look at the store file you can see how we've set up four state variables (two of which are methods) and incorporated the OrbisDB SDK to authenticate users and alter the state of orbisSession:

type Store = {
  orbis: OrbisDB;
  orbisSession?: OrbisConnectResult | undefined;
  // setAuth returns a promise
  setAuth: (
    wallet: GetWalletClientResult | undefined
  ) => Promise<OrbisConnectResult | undefined>;
  setOrbisSession: (session: OrbisConnectResult | undefined) => void;
};

const StartOrbisAuth = async (
  walletClient: GetWalletClientResult,
  orbis: OrbisDB
): Promise<OrbisConnectResult | undefined> => {
  if (walletClient) {
    const auth = new OrbisEVMAuth(window.ethereum!);
    // This option authenticates and persists the session in local storage
    const authResult: OrbisConnectResult = await orbis.connectUser({
      auth,
    });
    if (authResult.session) {
      console.log("Orbis Auth'd:", authResult.session);
      return authResult;
    }
  }
  return undefined;
};

const useStore = create<Store>((set) => ({
  orbis: new OrbisDB({
    ceramic: {
      gateway: "https://ceramic-orbisdb-mainnet-direct.hirenodes.io/",
    },
    nodes: [
      {
        gateway: "https://studio.useorbis.com",
        env: ENV_ID,
      },
    ],
  }),
  orbisSession: undefined,
  setAuth: async (wallet) => {
    if (wallet) {
      try {
        const auth = await StartOrbisAuth(wallet, useStore.getState().orbis);
        set((state: Store) => ({
          ...state,
          orbisSession: auth,
        }));
        return auth;
      } catch (err) {
        console.error(err);
      }
    } else {
      set((state: Store) => ({
        ...state,
        orbisSession: undefined,
      }));
    }
  },
  setOrbisSession: (session) =>
    set((state: Store) => ({
      ...state,
      orbisSession: session,
    })),
}));

As you can see, we've hard-coded the Ceramic and OrbisDB gateways, whereas we've imported our environment ID that we previously assigned as an environment variable.

Our navbar component sits at or above the level of all of our child components and includes our Web3Modal widget. You can see how we're using a useEffect hook to check whether our session is active and set our "loggedIn" state variable to true or false accordingly. This result determines whether we generate a new session for the user by leveraging the setAuth method from our Zustand store, or simply set our orbisSession to the value of the existing active session, as sketched below.
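The check might look roughly like this inside the navbar component (an illustrative sketch, not the repo's exact code; the "loggedIn" state and hook wiring are assumptions based on the description above):

const { orbisSession, setAuth } = useStore();
const { data: walletClient } = useWalletClient();
const [loggedIn, setLoggedIn] = useState(false);

useEffect(() => {
  const checkSession = async () => {
    if (orbisSession) {
      // an active session already exists (persisted in local storage), so reuse it
      setLoggedIn(true);
    } else if (walletClient) {
      // no active session: generate one via the Zustand store's setAuth,
      // which triggers the signature prompt shown later in this tutorial
      const fresh = await setAuth(walletClient);
      setLoggedIn(Boolean(fresh));
    } else {
      setLoggedIn(false);
    }
  };
  checkSession();
}, [orbisSession, walletClient]);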

Back in the home page component you can see how we're conditionally rendering our MessageList child component based on whether we have both an active orbis session AND the user's wallet is connected (allowing us to access their address).

Reading Data

The message list and userform component files are responsible for performing the majority of reads and writes to OrbisDB. If you navigate to the message list component, for example, take a look at how we've imported our client-side environment variables to identify our post and profile models, as well as our context ID. When this component is rendered, the useEffect hook first invokes the "getProfile" method:

const getProfile = async (): Promise<void> => {
  try {
    const profile = orbis
      .select("controller", "name", "username", "emoji", "actor")
      .from(PROFILE_ID)
      .where({ actor: ["human"] })
      .context(CONTEXT_ID);
    const profileResult = await profile.run();
    if (profileResult.rows.length) {
      console.log(profileResult.rows[0]);
      setProfile(profileResult.rows[0] as Profile);
    } else {
      // take the user to the profile page if no profile is found
      window.location.href = "/profile";
      return;
    }
    await getRobotProfile(profileResult.rows[0] as Profile);
  } catch (error) {
    console.error(error);
    return undefined;
  }
};

Notice how we've constructed a .select query on our OrbisDB instance (provided by our Zustand store), asking for values from the five columns we want data for.

Next, we need to indicate which data model we want our query to reference, which is where we use .from with our profile model ID as the value.

We also only want the records where the profile belongs to the human user, which is handled by the .where filter on the following line.

Finally, we use the context ID that corresponds to this project as the final value that's appended to the query.

If a corresponding profile exists, we then invoke the getRobotProfile method to obtain our chatbot's information, as sketched below. If it does not exist, we take the user to the profile page so they can create one.
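getRobotProfile follows the same query pattern with the actor filter flipped. A minimal sketch, assuming a setRobotProfile state setter (the repo's version may differ):

const getRobotProfile = async (_humanProfile: Profile): Promise<void> => {
  const robotQuery = orbis
    .select("controller", "name", "username", "emoji", "actor")
    .from(PROFILE_ID)
    .where({ actor: ["robot"] }) // the chatbot's profile rather than ours
    .context(CONTEXT_ID);
  const result = await robotQuery.run();
  if (result.rows.length) {
    setRobotProfile(result.rows[0] as Profile);
  } else {
    // no chatbot profile yet - send the user to the profile page to create one
    window.location.href = "/profile";
  }
};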

Writing Data

Let's take a quick look at an example of data mutations. Within the same message list component you will find a method called createPost which is invoked each time the user creates a new message:

const createPost = async (
  thisPost: string
): Promise<PostProps | undefined> => {
  try {
    await orbis.getConnectedUser();
    const query = await orbis
      .insert(POST_ID)
      .value({
        body: thisPost,
        created: new Date().toISOString(),
        tag: "user",
        edited: new Date().toISOString(),
      })
      .context(CONTEXT_ID)
      .run();
    if (query.content && profile) {
      const createdPost: PostProps = {
        id: query.id,
        body: query.content.body as string,
        profile,
        tag: query.content.tag as string,
        created: query.content.created as string,
        authorId: query.controller,
      };
      return createdPost;
    }
  } catch (error) {
    console.error(error);
    return undefined;
  }
};

While this looks similar to the syntax we use to read data, there are a few differences.

First, take a look at the first line under the "try" statement - we're calling getConnectedUser() on our OrbisDB instance to ensure that our active session is applied. This step is required for running mutation queries, whereas it's not necessary for reading data.

You can also see that we've swapped out the .select and .from statements for .insert which references the model ID we want to use, thus creating a new row in the corresponding table.

Finally, we're referencing the user's message value for the body while ensuring we tag the message as coming from the "user" before running the query and checking on its success status.
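The chatbot's replies can be written back the same way, only tagged differently. A minimal sketch, assuming a helper along these lines exists for the bot's side of the conversation:

const createBotPost = async (reply: string): Promise<void> => {
  await orbis.getConnectedUser(); // required for mutations, as above
  await orbis
    .insert(POST_ID)
    .value({
      body: reply,
      created: new Date().toISOString(),
      tag: "bot", // mark the row as a chatbot message
      edited: new Date().toISOString(),
    })
    .context(CONTEXT_ID)
    .run();
};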

Running the Application in Developer Mode

We're now ready to boot up our application!

In your terminal, go ahead and start the application in developer mode:

nvm use 20
npm run dev

Navigate to http://localhost:3000/ in your browser. You should see the following:

Go ahead and click on "Connect Wallet." You should see a secondary authentication message appear after you connect your wallet:

Signing this message creates an authenticated session (using orbis.connectUser() from our Zustand store). You can check the value of this session by navigating to the "Application" tab in your browser and looking for the orbis:session key pair:

Given that you have not yet created any messages, the application should automatically direct you to the /profile page where you can assign identifiers to yourself and your chatbot:

Finally, navigate back to the homepage to begin exchanging messages with your chatbot. Notice how the values from your corresponding profiles appear next to the messages:

How Could this Application be Improved?

Since our message history is written and queried based on static values (for example, assigning messages to the "user" tag), you'll notice that the same conversation history appears when authenticating with a different wallet address and creating a new session.

As a challenge, consider a few ways the application design could be improved to address this:

- Tagging the profiles and messages with values that align with actual authenticated accounts instead of static ones (see the sketch below)
- Altering our message data model and application to accommodate different chat contexts, allowing a user to have different conversation histories
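For the first idea, one hypothetical direction is to filter posts by the authenticated account's DID (the controller column) instead of a static tag. A sketch, assuming the viewer's DID is available from the session:

const getMyPosts = async (did: string) => {
  const query = orbis
    .select("controller", "body", "tag", "created")
    .from(POST_ID)
    .where({ controller: [did] }) // only rows created by this account
    .context(CONTEXT_ID);
  const result = await query.run();
  return result.rows;
};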

Next Steps

We hope you've enjoyed this tutorial and learned something new about how to configure and incorporate OrbisDB into your application! While this concludes our walk-through, there are other possibilities Ceramic has to offer:

Join the Ceramic Discord

Follow Ceramic on X

Follow Orbis on X

Start Building with Ceramic


GS1

Jeju SamDaSoo mineral water aiming for “top” levels of efficiency and sustainability

By putting a QR Code powered by GS1 on every bottle cap, Korean water bottler JPDC is going label-less

Recent regulations are pushing Korean beverage companies to remove labels from their bottles as part of an initiative to use less plastic and make recycling easier.

Information that was previously on the labels of Jeju SamDaSoo mineral water is now available simply by scanning the QR Code with GS1 Digital Link on the bottle cap.

Beyond being compliant with national laws, the company is seeing improved engagement with consumers, better inventory management and more.

case-study-gs1-korea-jpdc.pdf

Wednesday, 26. June 2024

FIDO Alliance

FIDO APAC Summit 2024 Announces Keynotes, Speakers, and Sponsors


The FIDO Alliance is thrilled to announce the lineup for its highly anticipated second FIDO APAC Summit, set to take place at the JW Marriott Kuala Lumpur on September 10-11, 2024. Co-hosted by SecureMetric Technology and supported by Malaysia Digital Economy Corporation (MDEC) and CyberSecurity Malaysia, this premier event is dedicated to advancing phishing-resistant FIDO authentication across the region under the theme, “Unlocking a Secure Tomorrow.”

The summit will feature keynote addresses by notable leaders such as Gobind Singh Deo, Malaysia’s Minister of Digital; Dato’ Dr. Amirudin Abdul Wahab, CEO of CyberSecurity Malaysia; TS. Mohamed Kheirulnaim Mohamed Danial, Senior Assistant Director of National Cyber Coordination and Command Centre (NC4) & National Cyber Security Agency (NACSA); Andrew Shikiar, CEO & Executive Director of FIDO Alliance; and Edward Law, CEO of Securemetric. 

They will be joined by a distinguished roster of speakers including Christiaan Brand, Product Manager: Identity and Security at Google; Eiji Kitamura, Developer Advocate at Google; Henry (Haixin) Chai, CEO of GMRZ Technology / Lenovo; Hyung Chul Jung, Head of Security Engineering Group at Samsung Electronics; Khanit Phatong, Senior Management Officer at Thailand Electronic Transactions Development Agency; Masao Kubo, Manager of Product Design Department at NTT DOCOMO; Naohisa Ichihara, CISO at Mercari; Niharika Arora, Developer Relations Engineer at Google; Sea Chong Seak, CTO at SecureMetric; Simon Trac Do, CEO & Founder of VinCSS; Takashi Hosono, General Manager at SBI Sumishin Net Bank; Yan Cao, Engineering Manager at TikTok; and Hao-Yuan Ting, Senior Systems Analyst at Taiwan Ministry of Digital Affairs.

The updated list of speakers can be found here.

Among the speakers, Tin Nguyen, a former U.S. Marine and FBI Special Agent, now a cybersecurity expert, will discuss the benefits of passwordless authentication and how it enhances organizational defenses against cyber threats. “Cybercriminals continuously search for vulnerabilities to take advantage of. Therefore, it is imperative for organizations to implement strong cybersecurity measures to safeguard their users,” says Nguyen. “Implementing FIDO-based passkeys provides an extra layer of security, mitigating potential threats without compromising user experience.”

The event promises to attract hundreds of attendees and will feature keynote addresses, panel discussions, technical workshops, and an expo hall showcasing the latest innovations from leading technology companies such as Securemetric, VinCSS, OneSpan, iProov, Thales, AirCuve, Zimperium, RSA, Yubico, Identiv, Utimaco, FETIAN, and many more. Attendees will have the opportunity to explore the latest trends in cybersecurity, network with top industry minds, and gain invaluable knowledge on implementing FIDO standards for enhanced security.

“The FIDO Alliance is thrilled to host its second FIDO APAC Summit 2024 in Malaysia, featuring presentations from some of the brightest minds in authentication from the APAC region and beyond,” said Andrew Shikiar, Executive Director and CEO of the FIDO Alliance. “With the continuous rise in the volume and sophistication of cyber-attacks, it is crucial for organizations to move past passwords and adopt passkeys, a user-friendly alternative based on FIDO standards.”

Registrations are now open to the public. For more information and to register, please visit www.fidoapacsummit.com. For sponsorship opportunities, please contact events@fidoalliance.org.

About the FIDO Alliance 

The FIDO (Fast IDentity Online) Alliance, www.fidoalliance.org, was formed in July 2012 to address the lack of interoperability among strong authentication technologies and remedy the problems users face with creating and remembering multiple usernames and passwords. The FIDO Alliance is changing the nature of authentication with standards for simpler, stronger authentication that define an open, scalable, interoperable set of mechanisms that reduce reliance on passwords. FIDO Authentication is stronger, private, and easier to use when authenticating to online services.

PR Contact 

press@fidoalliance.org


Me2B Alliance

Do SDKs Represent Actual Network Traffic in EdTech Apps?

1. Background 

In 2022, Internet Safety Labs (ISL) conducted an extensive benchmark of EdTech apps used in schools across the United States. We sampled 13 schools in each state and the District of Columbia and identified 1,722 unique apps in use in K-12 schools. During the benchmark, the apps were evaluated and scored on their behaviors related to safety. As part of the safety evaluation, SDKs in each app were identified, and researchers collected network traffic for 1,357 apps. In total, there were 275 unique SDKs in the apps, and the network traffic contained 8,168 unique subdomains across 3,211 unique domains. 

A key research question in conducting the 2022 EdTech benchmark was to determine how accurate SDKs were as a proxy for actual third-party data sharing, since network traffic data collection is somewhat labor-intensive. This report shares the results of the analysis. 

2. Analysis 

The basis of the analysis was to compare the "expected" third parties, based on the companies that own the SDKs, with the companies observed in the network traffic. This required identifying the owner companies for both the SDKs and all the subdomains observed in the aggregate network traffic.1 

Researchers first identified which SDKs were in use in apps by using AppFigures as a resource. In total, 275 unique SDKs were found in use across all apps. Next, researchers identified the companies who published these SDKs. For each app, the number of unique companies owning the SDKs found in that app is referred to as the "expected" number of companies to receive data. 

Next, researchers performed a similar analysis on the subdomains observed in the network traffic (1,175 total apps). Subdomains were identified from HTTP POST/GET requests captured in the network traffic, and each subdomain was resolved to an "owner" company. 

We then performed two quantitative analyses: (1) we examined the network traffic of apps with at least one SDK (n=1,083 apps), and (2) we examined the network traffic of apps with no SDKs (n=92 apps).  
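In code terms, the per-app comparison amounts to a set intersection between the two company lists. The following TypeScript sketch is purely illustrative of the logic described above, not ISL's actual tooling:

type AppRecord = {
  sdkCompanies: Set<string>;     // owners of the SDKs found in the app
  trafficCompanies: Set<string>; // owners resolved from observed subdomains
};

function compareCompanies(app: AppRecord) {
  const expected = app.sdkCompanies.size;
  // "expected" companies actually seen in the traffic
  const expectedSeen = [...app.sdkCompanies].filter((c) =>
    app.trafficCompanies.has(c)
  ).length;
  // companies seen in the traffic with no corresponding SDK
  const unexpectedSeen = [...app.trafficCompanies].filter(
    (c) => !app.sdkCompanies.has(c)
  ).length;
  return { expected, expectedSeen, unexpectedSeen };
}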

2.1   Apps With at Least One SDK 

Apps with at least one SDK communicated with an average of 10.1 companies based on observed network traffic (Table 1).  

2.1.1   “Expected” Companies in Network Traffic 

In apps with at least one SDK, there were an average of 4.7 unique companies represented by the SDKs–thus, 4.7 “expected” companies to receive data. However, on average, only 1.7 (or 36.2%) of the “expected” companies were seen in the network traffic of apps with at least one SDK (Table 1). 

Note that there are several contributing factors that could account for this, including: 

- The manual testing performed by the researchers was unstructured and therefore had inconsistencies across researchers.
- The manual testing didn't exercise all functions in the app. For instance, the testers did not make any optional purchases or upgrade to a premium version.

Table 1: Apps containing at least one SDK (n=1,083)

| | Average Expected Companies | Average Expected Companies Seen | Average # Unexpected Companies Seen | Average Total # of Companies Seen |
| --- | --- | --- | --- | --- |
| Webview – With (n=609) | 5.0 | 1.9 | 12.6 | 14.5 |
| Webview – Without (n=474) | 4.3 | 1.4 | 2.6 | 4.0 |
| Advertisements – With (n=189) | 5.6 | 2.1 | 24.0 | 26.1 |
| Advertisements – Without (n=894) | 4.5 | 1.6 | 5.0 | 6.6 |
| Behavioral Advertisements – With (n=105) | 5.4 | 2.1 | 33.7 | 35.8 |
| Behavioral Advertisements – Without (n=978) | 4.6 | 1.6 | 5.5 | 7.1 |
| ALL Tested Apps With 1+ SDK (n=1083) | 4.7 | 1.7 | 8.4 | 10.1 |
2.1.2   “Unexpected” Companies in Network Traffic 

Additionally, as seen in Table 1, these apps communicated with an average of 8.4 unexpected companies.  

As expected, apps that used Webview2, included advertisements, or included behavioral ads all had even higher average numbers of unexpected companies, with apps with behavioral ads having the highest at 33.7 unexpected companies on average3. The ISL app score rubric regards the use of Webview and the inclusion of advertising as very high risks for K-12 students, and the data in Table 1 reinforces that rubric. 

- Apps with at least one SDK that use Webview had 2.6 times as many third parties as apps with at least one SDK that don't use Webview.
- Apps with at least one SDK that include ads had 3.0 times as many third parties as apps with at least one SDK that don't include ads.
- Apps with at least one SDK that include behavioral ads had 4.0 times as many third parties as apps with at least one SDK that don't include behavioral ads.

2.2   Apps with No SDKs

There were 92 apps in the data set that had no SDKs and for which we had network traffic. Since these apps had no SDKs, there were no “expected” companies to receive data from the app [other than the app developer, of course].  

Apps with no SDKs averaged 4.6 companies observed in network traffic—negligibly less than the average for apps with at least one SDK. However, for apps that use Webview, or include advertising or behavioral advertising, the average observed companies is markedly lower (Table 2).  

- Apps with no SDKs that use Webview had 44.1% fewer observed companies.
- Apps with no SDKs that include advertising had 40.6% fewer observed companies.
- Apps with no SDKs that include behavioral advertising had 21.0% fewer observed companies.

Table 2: Apps with no SDKs

| | Average # of Companies Seen |
| --- | --- |
| Webview – With (n=43) | 8.1 |
| Webview – Without (n=49) | 1.6 |
| Advertising – With (n=11) | 15.5 |
| Advertising – Without (n=81) | 3.2 |
| Behavioral Ads – With (n=4) | 28.3 |
| Behavioral Ads – Without (n=88) | 3.6 |
| All Tested Apps Without SDKs (n=92) | 4.6 |
3. Conclusion
3.1   SDKs as a Proxy for Third Party Sharing

As the data shows, SDKs aren’t a useful proxy for the actual number of third parties receiving data from the app. Moreover, apps that include ads or that use Webview will likely have significantly more third parties than apps without.  

This means that viable measurement of third parties receiving data from apps requires testing and observation of network traffic. ISL used mostly manual methods for the collection of this data but automated methods would be extremely beneficial for ongoing and pervasive measuring of app third party sharing.  

SDKs do provide value in identifying potential omissions in the manual testing process. Can we account for the specific SDKs that don’t appear in the network traffic? Did we miss a particular functional branch of the app that we should go back and test? Or might it be an indication of an error in the SDK database? So while SDKs don’t serve as a perfect indication of the third parties communicating with the app, they still provide valuable information, and as such, they will remain in our app safety labels (see https://appmicroscope.org/).  

3.2   Validation of ISL App Scoring Rubric 

As shown in section 2, use of Webview and the inclusion of advertising substantially increase user exposure to data sharing with more third parties. This finding reinforces the ISL app scoring rubric wherein the use of Webview and presence of advertising are indicators for very high risk. 

4. Helpful Links 

App Microscope 

SDK Risk Dictionary 

Domain Risk Dictionary 

Company Risk Dictionary

 

Footnotes:
1. See the SDK Risk Dictionary and the Subdomain Risk Dictionary for details.
2. Note: researchers determined the use of Webview manually, by observing third-party pages opening within the app. Thus, the presence of Webview as tagged in ISL's AppMicroscope.org may not accurately assess Webview use for first-party web pages.
3. It would be interesting to study how many apps have behavioral ads and don't use Webview.

The post Do SDKs Represent Actual Network Traffic in EdTech Apps? appeared first on Internet Safety Labs.


Next Level Supply Chain Podcast with GS1

50 Years of Confidence, Supply Chain Success, and the Next Dimension in Barcodes


Celebrating 50 years of the barcode, hosts Reid Jackson and Liz Sertl speak to an impressive lineup of industry experts, direct from Orlando at GS1 US's yearly conference, Connect. They chat with:

Dave DeLaus, CIO at Wegmans, dissects the complexities of integrating new technologies to enhance consumer experience and shares how Wegmans is tackling the challenges of implementing 2D barcodes for better product traceability.

Sean Murphy from Cencora demystifies the Drug Supply Chain Security Act and emphasizes the necessity of unique serial numbers and digital backpacks for pharmaceutical products to ensure safety and compliance in the healthcare industry.

Andrew Meadows, founder and CEO of BL.INK, introduces the intriguing world of 2D barcodes and digital resolvers. Learn how BL.INK's platform, BL.INK CXP, revolutionizes consumer engagement by providing personalized experiences and enhancing data privacy.

JW Franz from Barcoding Inc. emphasizes the importance of supply chain automation innovation and the future of barcoding, including RFID and computer vision technologies.

They all speak on the gradual implementation of new technologies, the strategic importance of 2D barcodes, and the transformative potential of computer vision in inventory management. The episode also covers the crucial role of standardization and regulatory compliance in healthcare and explores the exciting advancements paving the way for smarter, safer, and more efficient supply chains.

 

Key takeaways:

Discover how the integration of 2D barcodes and QR codes, paired with advancements in computer vision, is revolutionizing retail and supply chain management for enhanced consumer experiences and operational efficiency.

Explore the significant impact of the Drug Supply Chain Security Act and the digital backpack concept on pharmaceutical traceability, with insights from Sean Murphy of Cencora on how serialization ensures compliance and safety.

Learn about BL.INK’s innovative 2D barcode technology and digital resolvers, with Andrew Meadows explaining how these tools enable personalized consumer interactions and secure data privacy, driving a more direct and meaningful brand engagement strategy.

 

Jump into the Conversation:

 

[00:00] Welcome to Next Level Supply Chain

[00:48] Coming to you from GS1 Connect 2024 in Orlando

[02:45] Introducing Dave DeLaus, CIO at Wegmans

[03:42] Hot Topics with Wegmans

[04:47] Some insights on use of the 2D barcode at Wegmans

[06:01] How you can interact with the 2D barcode differently for your customer

[10:17] Introducing Sean Murphy with Cencora

[12:14] Cencora’s use of EPCIS or Electronic Product Code Information Service

[14:04] Leveraging RFID technology

[14:53] Focusing on DSCSA to create a smart, safe, and sustainable supply chain

[16:48] 2D barcodes in the pharmaceutical and healthcare industry

[18:49] Introducing Andy Meadows, founder and CEO of BL.INK

[19:25] BL.INK platform and digital resolvers

[24:22] Advising product manufacturers about BL.INK

[25:37] Andy’s thoughts on the future of 2D barcodes

[27:53] Introducing JW Franz from Barcoding Inc.

[29:07] JW's biggest takeaway from attending Connect

[29:48] Barcoding Inc’s current focus

[30:28] JW’s thoughts on the future of RFID and 2D barcodes

 

Connect with GS1 US:

Our website - www.gs1us.org

GS1 US on LinkedIn

 

Connect with the guests:

Dave DeLaus - CIO, Wegmans 

Sean Murphy - Senior Manager of Manufacturing Operations, Cencora

Andrew Meadows - Founder & CEO, BL.INK

JW Franz - IoT Automation Solution Director, Barcoding Inc.


Digital Identity NZ

Postcard from Berlin | June Newsletter


Kia ora,

Earlier this month, I was lucky enough to be personally invited by Joerg Resch to attend Europe’s flagship digital ID conference in Berlin, EIC 2024. For someone who regularly spoke on this circuit for over 10 years, it was a blast to be back, revelling in the richness of the presentations and discussion, highlighted in this post by my predecessor at the Kantara Initiative. The brain is fully engaged for hours, absorbing expert insight, experience, and innovation that is found here at EIC in Germany and Identiverse in the US, the world’s two biggest conferences in this space. I’m looking forward to a tiny fraction of the ground being covered and contextualised locally at our Digital Trust Hui Taumata in just six weeks’ time.

The pre-conference SIDIHub made significant progress toward the challenging goal of achieving cross-border digital identity interoperability. Participants were interested in learning about Aotearoa becoming the first common law country to implement a regulated digital identity trust framework on July 1st. This framework regulates stakeholders that opt in for accreditation, thereby increasing customer trust in their security and privacy settings. The US, UK, Canada, and Australia have digital ID trust frameworks in operation or piloted, but not yet nationally legislated. Credential authentication and verification continue to evolve in both policy and technology at different speeds, making this a complex issue. 

At our recent DIA-hosted public/private sector working group meeting on digital identification standards, I commented that there is still much work needed globally before we can adopt comprehensive policies, protocols and standards for decentralised digital ID and its container, wallets. This is a plane we will continue to build as we fly it. 

The high interest in digital ID led us to host/co-host two events in June: the capacity-filled ‘Digital Identity, Higher Education, Aotearoa’ sponsored by Middleware and featuring the University of Auckland, and a Town Hall-styled session on digital cash with the Reserve Bank of New Zealand Te Pūtea Matua (RBNZ), in partnership with FinTechNZ. Both events showed how important broad digital trust is for ensuring cybersecurity and protecting against deepfakes, scams and hacking threats. Awareness and education are essential, so we thank our members for supporting DINZ initiatives, just as DINZ supports members’ initiatives like the upcoming series from NEC.

And finally, the DIA has released a new schedule of Identification Masterclasses through to August.

To register for any of the Zoom sessions, please email identity@dia.govt.nz with the G or HD reference number and a Zoom link will be supplied.

Ngā mihi

Colin Wallis
Executive Director, Digital Identity NZ

Read full news here: Postcard from Berlin | June Newsletter

SUBSCRIBE FOR MORE

The post Postcard from Berlin | June Newsletter appeared first on Digital Identity New Zealand.

Monday, 24. June 2024

GS1

Coca-Cola’s reusable, refillable bottles benefit from innovative QR Codes powered by GS1


To reach their goal of 40% refillable bottles by 2030, Coca-Cola Latin America needed a way to know how many times a given bottle had been through the refill cycle.

By laser engraving a unique identifier onto every bottle, Coca-Cola can know how many filling cycles the bottle has gone through, and whether it should be refilled or recycled.

Beyond the positive sustainability impact, the initiative provides a valuable set of data about each bottle’s journeys through the market across its lifecycle.

case-study-gs1-brazil-coca-cola.pdf

Ceramic Network

Calling all devs: Build composable search applications for the Base Onchain Summer


Ceramic is partnering with Index Network to challenge developers to build composable search use-cases between Base and other projects participating in Base’s Onchain Summer. For example, those use-cases can include:

- Composability with commerce (Shopify)
- Composability with social graphs (Farcaster)
- Composability with on-chain (Zora, Nouns)

The bounty is officially hosted on bountycaster.

About the Index Network

Index is a discovery protocol, built on Ceramic, that eliminates the need for intermediaries when finding knowledge, products, and like-minded people through direct, composable discovery across the web. By leveraging Web3 and AI, Index offers an open layer for discovery as the first decentralized semantic index. It functions as a composable vector database with a user-centric perspective, enabling interaction with decentralized graphs like Ceramic Network for user-owned knowledge graphs and Farcaster for social discourse.

About the bounty

For this bounty, developers have access to the Base search engine created on Index Network. They can utilize this index and integrate it with other projects and tools participating in Base’s on-chain summer to innovate and enhance information discovery experiences. Additionally, using Farcaster Channel indexes as a data source can help create personalized applications.

TIP: Consider developing agents to facilitate user interactions with Index, such as notification agents, context subscription agents, or multi-agent scenarios that enable conversational participation.

Prizes

A total prize pool of 2250 USDC will be distributed across the top three applications:

- First place: 1000 USDC
- Second place: 750 USDC
- Third place: 500 USDC

Useful links

Below you can find all of the tools and links available for you to build for this bounty.

Bounty:

Official link to the bounty on bountycaster

Indexes:

- Create new indexes (for bounty builders only)
- Base documentation index
- Index Network Profile on Index
- Farcaster Channels Profile on Index

Documentation, tutorials and support:

- Index.Network documentation
- Video tutorial for Farcaster contextual subscription
- Getting Started with Index Network SDK
- Video tutorial: Creating the Base documentation index
- GitHub
- Discord and forum for technical support

FIDO Alliance

The Register: AWS is pushing ahead with MFA for privileged accounts. What that means for you.


AWS is making multi-factor authentication (MFA) mandatory for privileged users, specifically management account root users and standalone account root users. Customers must enable MFA within a 30-day grace period to maintain account access.


IT Brew: FIDO Alliance announces identity-proofing certification


FIDO’s Face Verification Certification tests for security, liveness, and bias in remote identity verification technology through FIDO-accredited laboratories, and ISO and industry standards. Andrew Shikiar, Executive Director and CEO of the FIDO Alliance, highlights that this certification technology “gives licensing companies added assurance that a vendor is performing well.”


Find Biometrics: ID Talk: Passkeys, Standards, and Selfie Certification with FIDO’s Andrew Shikiar


Andrew Shikiar, FIDO’s Executive Director and CEO, discusses key topics in authentication and identity security on the ID Talk podcast (produced by Find Biometrics), including passkeys, phishing threats, deepfakes, FIDO’s vendor accreditation, and the new Face Verification Certification program.


Identity At The Center - Podcast

In our newest episode of the Identity at the Center podcast,


In our newest episode of the Identity at the Center podcast, we discuss the concept of identity bubbles with the brilliant Justin Richer, founder of Bespoke Engineering. Join us as we explore what they are and how they can be designed to revolutionize identity management in disconnected environments.

You can watch the episode at https://youtu.be/E-GtiJ2HvnA?si=pWrmQgYXO9kk4jTO

Visit our website for more: idacpodcast.com

#iam #podcast #idac

Thursday, 20. June 2024

Me2B Alliance

We Need to Talk About Product Labels


This week, US Surgeon General Dr. Vivek Murthy has called for warning labels for social media platforms. If you know anything about Internet Safety Labs (ISL) you’ll know that our primary mission is the development of free and accurate safety labels for technology. So naturally, we heartily agree with Dr. Murthy that technology needs labels—but perhaps not warning labels and definitely not just for social media platforms. 

Various experts have written thoughtful responses1 to this week’s call for warning labels, and their concerns underscore the fact that a warning label may be inappropriate.  

But of course we need safety labels on technology. 

History of Product Labels 

There are several types of product labels: ingredient labels (food), test result labels (automobile crash test ratings), warning labels from the surgeon general (cigarettes) and from other entities (OSHA’s Hazard Communication System for chemicals).  

Safety labels have a long-standing history in the US as a core component of product safety and product liability. The purpose of safety labels is to illuminate innate—i.e. unavoidable—risks in a product whether it is food, vehicles, cleaning solvents, toys for children, or the technology that we use with increasing reliance for all facets of living our lives.  

Safety labels almost always lag commercial product introduction, and in at least a few cases, product safety can lag by decades. For instance, for cars, product safety awareness and measures (like seatbelts) emerged 50-plus years after their mass availability. Consumer computing has been around for about 40 years now, and it will likely be another 10 years before we see product safety in full swing for software-driven technologies.  

According to InComplianceMag.com, US and Canadian tort-based law makes manufacturers’ product safety obligations clear (emphasis is mine): 

” Manufacturers have an obligation to provide safe products and to warn people about any hazards related to the product. Those requirements have risen up out of the original product safety/liability cases, some of which happened in the same timeframe as the Chicago World’s Fair, the middle to late 19th century, with many more to follow.

 

The assumption in U.S. liability law, and also typically if a case is brought in Canada, is that the manufacturer of the product is guilty and has to prove that they did everything necessary to provide a safe product. That includes warnings, user instructions, and other elements. Today, that continues to be the basic concept in product liability, that the burden lies on the manufacturer to prove that they did everything possible to make their product safe.” 

 

https://incompliancemag.com/product-safety-and-liability-a-historical-overview/


If tech were food we would have never stood for the absence of product information for as long as we have. Never. We use tech with little to no visibility or awareness of what it’s actually doing. That simply must change.  

We need a science of product safety for software and software driven technology. And that’s exactly what we’ve been building for five years at ISL. The current attitude of placing the onus on consumers to somehow gird themselves against invisible risks that not even vendors fully understand is absurd. Of course we need labels.  

And the good news is we’ve got them started on over 1,300 EdTech related apps. Here’s an example https://appmicroscope.org/app/1614/. The image below shows just the label header and the safety facts summary.   

Labels for Technology 

What type of label is appropriate for technology? A warning label is appropriate when the science is irrefutable. Are we there with the physical and mental health risks due to the use of technology? Maybe. Depends on who you ask. But maybe a label more like chemical warning labels is appropriate. Or perhaps just a test results label.  

In our work at Internet Safety Labs, our intention since day one was to expose invisible or difficult to recognize facts about risky behaviors of technology. As can be seen from the design of our app safety labels, we chose to emulate food nutrition labels that report measured findings. This approach of reporting measured findings works very well for this early stage of the science of product safety for technology.  

For instance, in our safety labels, you can see the category averages for most of the measures in the label. Why did we do that? Because there is no concrete threshold that distinguishes safe from unsafe ranges. There’s no industry standard that says, “more than ten SDKs is bad” for example. Moreover, technology norms vary by industry, such that personal information collection in fintech and medical apps is quite different than personal information collection in retail (at least one hopes). Thus, the category averages displayed in our labels don’t necessarily mean “safe”, they just provide context as we continue to measure and quantify technology behavior. An example of the shortcomings of this approach is when, for instance, the category average number of data brokers is greater than zero for apps typically used by children. (We advocate for no data brokers in technology used by children.) But we need to start with understanding the norms. We can’t change what we can’t see.  

The Devil is in the Details 

The call for a congressional mandate for something (not necessarily a warning label) is a step in the right direction. Why? Because it treats software as a product and tacitly places product safety requirements on it. This is an advancement in our eyes.  

Moreover, product safety is almost always the domain of government (or insurance). In the absence of a government mandate for product safety for technology, we see fragmented efforts with the FTC boldly championing privacy risks in technology, and the FCC advocating for a different type of label. So indeed, it’s encouraging that we’re starting to talk about technology in product safety terms.  

But the devil is in the details of any labeling program. In the words of Shoshana Zuboff, “who decides and who decides who decides?” As in, who decides what goes on the labels? Also who oversees the integrity of the labels? The US government is a customer of data obtained by surveillance capitalism2. When it comes to technology can the government be trusted to keep people safe? (When it comes to food can the government be trusted to keep people safe? When you dig into it, the track record is spotty.)  

Product safety exists in natural opposition to the industry status quo and any kind of regulation is already facing and will continue to face strong opposition3. In the early 1900s, when chemist Dr. Harvey W. Wiley began a crusade for the labeling of ingredients and identifying toxic elements in food, industries who relied on the opacity of ingredients (snake oil salesmen) or who simply didn’t want to incur the cost of change (whiskey distillers) opposed such a mandate.  

“Strenuous opposition to Wiley’s campaign for a federal food and drug law came from whiskey distillers and the patent medicine firms, who were then the largest advertisers in the country. Many of these men thought they would be put out of business by federal regulation. In any case, it was argued, the federal government had no business policing what people ate, drank, or used for medicine. On the other side were strong agricultural organizations, many food packers, state food and drug officials, and the health professions. But the tide was turned, according to historians and Dr. Wiley himself, when the activist club women of the country rallied to the pure food cause.”

 

https://www.fda.gov/media/116890/download  

Product safety challenges the status quo and creates necessary growing pains for industry. But industry always survives. And more often than not, new industries emerge, such as the ongoing development of safety features for vehicles.  

Let’s return to the challenge of deciding what goes in the labels. We at ISL know quite a lot about what it takes to develop safety labels in a space where the naming and measurement of risk isn’t fully baked (or worse, non-existent). Determining what goes into a previously uncharted, unmeasured safety label is extraordinarily challenging. It’s even more challenging if the measurement tools don’t exist. But our situation is even worse than that: we don’t even have agreement on what the risky behaviors in technology are. AND, we are talking about behaviors here—which is not language we typically associate with products. Products don’t typically behave. From our several years in development, these are the highly iterative steps that must occur to reach consensus on labels for technology: 

1. Consensus on naming the hazards/harms in technology.4
2. Consensus on assessing and quantifying the risks.
3. Identify the behaviors that embody the risks.
4. Figure out a way to measure the behaviors that embody the risks.
5. Assess the measurements.5
6. Consensus on presentation of the measurements/information. 

As far as presentation of the data, in our case, we decided to aggregate the data into clusters based on riskiness, and we also ultimately decided to provide a single app score. This was done with some reluctance, and it will no doubt be a much-evolving scoring rubric for the next few years.  

For now, we believe the best thing the labels can do is objectively report the invisible (or poorly understood) behaviors of the products until such time as definitive harm thresholds can be derived.  

There’s a final vital detail regarding the establishment of any labels, and that’s having what I would characterize as exceptional diversity of participants in establishing safety standards. This isn’t lip service. A few years ago, when I started to better see how what was risky for me was very different than what was risky for people who are different from me such as a person of color, or a person with a disability, or an incarcerated person, I woke up one night from a deep sleep with the awareness that any attempt at standardizing or consensus is doomed if it doesn’t have full diversity involved6 . Why this is so is a long and complicated matter. On the one hand, everything ever done should endeavor to have exceptional inclusion of a massively diverse set of participants.  

But it also has to do with the fact that software and software-driven tech is "alive" and interactive in a way that other products in our lives aren't. We have a special duty when it comes to product safety for software-animated products. We may even need to reconsider what a "product" is. We have seen evidence of the hazards of animated technology built without an adequate understanding of the diversity of users, from the embodiment of human bias in automated decision-making to hand dryers that don't activate for people of color. The point is that technology acts on and with us in a different (and constantly changing) way than other products. So labeling is both harder and matters more than ever. 

Conclusion 

Overall, I remain optimistic that the lens is happily starting to focus on product safety, implicit though it may be. People will be thinking more about labels for technology. And they will see that ISL is already providing labels covering privacy risks. We can call out the presence of infinite scroll, like buttons, and other user interface patterns widely recognized as addictive in today's labels. 

As I mentioned above, confusion stems from Dr. Murthy's call for a "warning label" instead of a safety or ingredients label. Technology is cigarettes7. We use the metaphor all the time. Technology today is cigarettes in the 1940s/1950s, when just about everybody chain smoked and the harms were likely all anecdotal and pooh-poohed. It took decades to assemble causal evidence. But tech is also much more complicated than cigarettes, and a warning label is premature. That, however, is no argument that we don't deserve accurate information on tech's risky behaviors. As it is right now, we don't even have an ingredient label for technology. We are flying (tech-ing?) blind. 

Of course we need labels. Industry would do well to proactively embrace label enablers like software bills of material, consent receipts, and machine-readable record of processing activities (ROPAs). Because there can be no doubt that labels are imminent.  

Earlier, I said that we’ve “started”. I say that because our labels only include privacy risks at present. Our labels are deliberately modular and we’ve scoped additional sections: 

- Risky UI Patterns – like deliberately addictive UI patterns of the sort Dr. Murthy is calling for exposing. Our Safe Software Specification for Websites and Mobile Apps already describes measurement of these kinds of risks.
- Automated Decision-Making Risks
- Security [client side only] Risks
- Differences between observed tech behavior and privacy policy and/or terms of service. 

All of these are on our roadmap. We know exactly how to add these sections to the label; it's strictly a resource and funding issue. If they sound good to you, please consider supporting our mission.

Because of course we need labels. 

 

Footnotes:
1. https://www.wsj.com/us-news/u-s-surgeon-general-calls-for-warning-labels-on-social-media-platforms-473db8a8?st=gmnjmhotka7febm&reflink=desktopwebshare_permalink
   https://technosapiens.substack.com/p/should-social-media-have-a-warning
2. https://arstechnica.com/tech-policy/2024/01/nsa-finally-admits-to-spying-on-americans-by-purchasing-sensitive-data/
   https://www.nbcnews.com/tech/security/us-government-buys-data-americans-little-oversight-report-finds-rcna89035
   https://www.vice.com/en/article/jgqm5x/us-military-location-data-xmode-locate-x
3. https://www.politico.com/news/2023/08/16/tech-lobbyists-state-privacy-laws-00111363
4. We have ongoing work with our Digital Harms Dictionary.
5. They will be wrong, and you will have to find a different measure.
6. We welcome everyone, whether you are technical or not, to participate in our open Software Safety Standards Panel where we define the content of the safety labels, and name hazards and harms.
7. Tech may actually be worse than cigarettes because it has the capability of inflicting every kind of harm people can experience, either directly or indirectly, in a multitude of increasingly creative ways: financial, reputational, social, emotional/psychological, and even physical. 

The post We Need to Talk About Product Labels appeared first on Internet Safety Labs.

Wednesday, 19. June 2024

Origin Trail

Trust Thy AI: Artificial Intelligence Base-d with OriginTrail


With tens of billions invested in AI last year and leading players such as OpenAI looking for trillions more, the tech industry is racing to grow large generative AI models. The goal is to steadily demonstrate better performance and, in doing so, close the gap between what humans can do and what can be accomplished with AI.

There is however another gap that has become strikingly apparent — the AI trust gap. As challenges such as AI hallucinations, bias, and intellectual property slurping continually cause damage, we look into how the base of the current Web could be effectively transformed to support the Verifiable Internet for AI.

The announced Apple and OpenAI integration signals the trust gap is widening, with Apple users’ data becoming the next frontier for training ChatGPT and questionable transparency on how it is used. This data is so valuable that it reportedly makes up for charges Apple would pay for using costly ChatGPT AI models. The Verifiable Internet for AI shifts this paradigm, making such data transactions transparent on chain with ownership of data taken back by users, who ultimately get to monetize it.

Decentralized AI: Intersection of Crypto and AI

Having employed the fundamentals of crypto, AI, and knowledge graphs successfully within a plethora of sectors, where trust, transparency, and accuracy are of paramount importance, OriginTrail now integrates Base blockchain with OriginTrail Decentralized Knowledge Graph (DKG), to help drive trust and transparency with neuro-symbolic AI. Instilling information provenance, ownership, and graph structure through blockchains and knowledge graphs together can effectively address the aforementioned problems of AI, as detailed in the most recent White Paper 3.0.

Your Body of Knowledge, Your Choice of AI

The opportunity of graph algorithms as a foundation for reputation in the age of AI was also highlighted by Brian Armstrong, CEO of Coinbase, in a recent podcast:

“Another piece of a puzzle that I feel could be missing, is something around reputation that’s on chain. You can imagine a version of this that’s like using the graph structure of the chains. To sort of say, okay if I trust this node, and they sent money to this node, that sort of implies some amount of trust. Kind of like a Google Page Rank had an algorithm, something like that could be built on chain.” — Brian Armstrong, CEO of Coinbase

The recently introduced OriginTrail Paranets (user-controlled on-chain knowledge graphs), enable users total control over their data, connecting it into the DKG decentralized physical infrastructure (DePIN), while keeping it safely stored on their devices. Users are then able to choose from a growing selection of open-source AI systems integrated with OriginTrail via ChatDKG.ai, a launchpad for user-controlled AI.

Knowledge graphs with paranets enable transparent on-chain reputation, relevance scoring with PageRank, recommendation engines, graph neural networks, and other AI reasoning applications.

The first of such knowledge graphs to launch on Base is the DeSci paranet for autonomous scientific research by ID Theory, crowdsourcing knowledge assets utilizing on-chain reputation via the OriginTrail DKG.

"#DeSci has great potential. Those who have a working product will have the power to forever improve science and the scientific process." — Brian Armstrong, CEO of Coinbase

One of the first dapps deployed on the DeSci paranet on Base will be the DeSci AI agent, which will include a knowledge mining interface through which scientific knowledge will be minted on chain, with publishers receiving token incentives.

“We’re creating a user-friendly hub to coordinate scientific knowledge creation onchain — a co-owned substrate to crowdsource AI and supercharge research and discovery as we know it through autonomous science. The first iteration will focus on neuroscience as it’s very close to our hearts, but the future is boundless. Who knows, crypto might actually cure cancer.” — ID Theory

DeSci AI Agent in action built on OriginTrail

AI and Crypto, converging together in the OriginTrail DKG can tackle some of the largest challenges while providing users with an inclusive, unbiased, and verifiable way of making mission-critical decisions. As we bring this technology to more data-intensive sectors such as science, the trust layer — blockchain underpinning the neuro-symbolic AI approach made possible by the DKG — needs to fulfill both the scalability and user experience requirements.

This is where Base can help Trust Thy AI — in a scalable, inclusive, and user-friendly way.

Make sure to subscribe and follow the next steps as we make AI Base-d.

Trust Thy AI: Artificial Intelligence Base-d with OriginTrail was originally published in OriginTrail on Medium, where people are continuing the conversation by highlighting and responding to this story.


Blockchain Commons

Musings of a Trust Architect: Minimum Viable Architecture (MVA)


ABSTRACT: Minimum Viable Architecture (MVA) is an alternative to the Minimum Viable Product (MVP) approach, emphasizing the importance of a robust, scalable, and expandable architecture. The MVA methodology mitigates risks associated with reputation, competitiveness, and architectural deficiencies, and fosters collaboration among competitors. Real-world examples, such as SSL/TLS and the Gordian system, illustrate the successful implementation of MVA in software development.

A business methodology focused on producing a Minimum Viable Product blossomed in the 21st century. Unfortunately, it can set businesses up for future failure because it doesn’t properly define the larger architecture that is needed to evolve a product past its earliest, minimal state.

The Old Methodology: Minimum Viable Product

A Minimum Viable Product (MVP) is a business methodology that advocates creating the simplest possible version of a product as a first release, to see if the market responds positively, or else to understand why it doesn’t1. If an MVP is successful, possibly through iterations of the initial work, the product can then be grown and ultimately find large-scale success in the market.

Twitter has long been used as an example of an MVP that did great, with Dropbox and Facebook being other examples of MVPs2 (to various extents). By the criteria of these companies, MVP would seem to be a win-win methodology.

However, they’re not the full story.

MVP Biases

Unfortunately, current literature about Minimum Viable Products suffers from Survivorship Bias. We hear about the success of companies that used MVPs, but we don’t know that their doing so actually led to success. In fact, the successes that we see might just be a false signal.

How many hundreds or even thousands of companies pursuing MVPs failed for each Twitter or Facebook that succeeded? How many companies found that they couldn’t scale their MVP, realized that they couldn’t take commercial advantage of an otherwise successful MVP, or simply were beaten by competitors with even more viable products?

We can’t measure the success of the MVP methodology by the anecdotal success of a few individual companies.

Survivorship bias image by Martin Grandjean (vector), McGeddon (picture), Cameron Moll (concept). Released under CC BY-SA 4.0.

MVP Dangers

The MVP system also has other dangers.

Some of these are related to company brand. Though a new company doesn't have a reputation to damage, a series of unsuccessful MVPs could nonetheless curtail its future opportunities. Meanwhile, a more mature company might find its existing reputation blemished by a poor MVP. This is especially true today, as companies are increasingly saying that MVPs can be poor-quality releases.[3] That didn't work out well for Cyberpunk 2077, one of the highest-profile and most controversial computer game releases of recent years, which (like many modern-day computer games) was released neither quite as an MVP nor in a fully complete form.[4]

There are also competitive dangers. Within a developmental niche, MVPs only work if everyone pursues them; otherwise, an MVP built on solid ideas could easily be out-competed by a firm that produced a slightly more mature prototype. Similarly, a company with more resources might be able to scoop up the ideas in an MVP and replicate them to its own advantage.[5]

However, the biggest dangers of MVPs are probably architectural. By defining an MVP, a company can easily ignore the larger architectural issues that would once have been considered before starting work on any serious release. This can cause problems with scaling, with missing features that can't easily be added, and with locked-in decisions that become part of the final product.

Twitter (“X”), for example, didn't finalize its network architecture design until 2010,[6] four years after its advent. It would have been easy for a better-architected social-media system to get in there first; the fact that no one did is one of the pieces of luck that led to Twitter's ultimate success. In fact, one of the developers at Twitter has noted this, saying: “In the end, Twitter barely made it, and product progress was slow for years due to post facto infrastructure investment.”[7]

The biases and dangers implicit in MVPs suggest the need to at least experiment with other methodologies for product releases. The huge problems implicit in the potential lack of architecture in an MVP also suggest what that alternative methodology should be: a Minimum Viable Architecture.

A New Methodology: Minimum Viable Architecture

Minimum Viable Architecture (MVA)[8] is a methodology that has been discussed in somewhat different forms over the last several years. It doesn't focus on the simplest product that can be released to consumers, but instead on the simplest architecture that can support future development within a product's technological ecosystem.

The goal of an MVA is still to create a product that doesn't strain the resources of a company and that doesn't create a situation where a company's ultimate success or failure depends on that singular release. However, that simple product must be created with the understanding of a larger architecture that has enough flexibility[9] that designers can fill in gaps in that architecture in the future. It's just that the decisions for filling in those gaps are delayed as much as possible.[10] It's a melding of agile methodologies with architectural concerns.

Though an MVA could be created with a full understanding of future expansions that may or may not ultimately be accommodated, it's more powerful to create an MVA that is modular and expandable — one that doesn't depend on the architect thinking of everything, but instead future-proofs itself so that the architecture can include unthought-of elements in the future. As Jorge Lebrato says: “The architecture remains cohesive and each piece cooperates with the others, despite having had different rhythms.” The best MVA is a compromise between entirely ignoring the architecture (as is likely in an MVP) and designing it entirely (which would likely result in time cost and waste).[11]

MVA Examples

The following examples contain some real-world usages of MVA instead of MVP.

SSL/TLS

When I co-authored the TLS spec in the ’90s, I did my best to future-proof it by simultaneously constraining the design and giving it enough flexibility to be expanded in the future. This is an example of a Minimum Viable Architecture whose usefulness has proven itself: TLS is now the most deployed security system on the internet, at the heart of almost every shopping, financial, or banking transaction.

This future-proofing was thanks in part to our architecting elements that we suspected would be required in the future but couldn't yet be deployed, primarily due to CPU limitations. Perfect forward secrecy[12] is an example: users were able to simply turn it on when it became viable on standard hardware platforms.

However, our more notable work in creating an MVA came from our inclusion of ciphersuites. These are the powerful encryption and decryption rules that do the actual cryptographic work of TLS. By defining them as modular plug-ins, we supported the future innovation of TLS, even in ways that we could not envision. And there was considerable innovation: TLS 1.2 had 37 ciphersuites, though that dropped back to five with TLS 1.3.[13]
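To make the plug-in idea concrete, here is a minimal sketch in Python. Everything in it (the registry, the negotiation function, the toy suite) is a hypothetical illustration of modular ciphersuites, not actual TLS code:

```python
# Hypothetical sketch of ciphersuites as modular plug-ins. Each suite
# registers itself against an ID; negotiation picks the first suite both
# peers support, so new suites can be added later without ever touching
# the core handshake logic.

from typing import Callable, Dict

# Registry mapping ciphersuite IDs to their cryptographic operations.
CIPHERSUITES: Dict[str, Dict[str, Callable]] = {}

def register_suite(suite_id: str, encrypt: Callable, decrypt: Callable) -> None:
    """Plug a new ciphersuite into the registry without changing core code."""
    CIPHERSUITES[suite_id] = {"encrypt": encrypt, "decrypt": decrypt}

def negotiate(client_suites: list, server_suites: list) -> str:
    """Pick the first mutually supported suite, in the client's preference order."""
    for suite_id in client_suites:
        if suite_id in server_suites and suite_id in CIPHERSUITES:
            return suite_id
    raise ValueError("no ciphersuite in common")

# A toy suite for demonstration; a real one would wrap an AEAD cipher.
register_suite(
    "TOY_XOR_SUITE",
    encrypt=lambda key, data: bytes(b ^ key for b in data),
    decrypt=lambda key, data: bytes(b ^ key for b in data),
)

print(negotiate(["TOY_XOR_SUITE"], ["TOY_XOR_SUITE", "OTHER"]))  # TOY_XOR_SUITE
```

The point of the pattern is that suites can be added or retired over time while the negotiation machinery stays fixed, which is exactly the kind of future-proofing described above.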

The Gordian System

One of my most recent endeavors is Blockchain Commons' Gordian system,[14] a layered architecture for protecting digital assets and identity. It has seen early successes in protecting seeds with systems like SSKR[15] and CSR,[16] and it focuses on the Gordian Principles of independence, resilience, privacy, and openness.

Blockchain Commons’ Mission: Advocating for the creation of open, interoperable, secure & compassionate digital infrastructure to enable people to control their own digital destiny and to maintain their human dignity online

In order to create an MVA that future-proofs the Gordian products, the Gordian architecture identifies points of potential interoperability and breaks the architecture into discrete components across those interoperable interfaces, thus allowing individual elements to be replaced. This was done both at the large-scale application level and at the small-scale programmatic level. It's important everywhere.

At the large-scale application level, the Gordian system achieves interoperability through the careful architecting of both discrete applications and the ways that they can interact. Airgaps are a traditional methodology for introducing security into a digital asset system,[17] but the Gordian system has expanded that to include Torgaps,[18] a way of making transactions between connected applications both secure and non-correlatable. This modular approach is one way to enable future-proofing, and it's only strengthened by systems such as airgaps and torgaps that tightly constrain communications between the modules.
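As a rough illustration of how an airgap or torgap constrains inter-module communication, consider this minimal Python sketch; the function names and payload format are invented for illustration and are not taken from the Gordian codebase:

```python
# Hypothetical sketch of the idea behind airgaps and torgaps: modules never
# share state or call each other directly. The only channel between them is
# a single, narrow transfer of an opaque payload, which is easy to audit,
# constrain, or route over a QR scan or a Tor circuit.

def sign_on_offline_device(request_payload: bytes) -> bytes:
    """Runs on the airgapped side; sees only the bytes it is handed."""
    # A real signer would parse the request and return a signed transaction.
    return b"signed:" + request_payload

def narrow_channel(payload: bytes, receiver) -> bytes:
    """The sole crossing point between modules (QR scan, Tor circuit, etc.)."""
    assert isinstance(payload, bytes)  # nothing but plain bytes may cross
    return receiver(payload)

# The online wallet builds a request and gets back a response; it has no
# other access to the offline signer, so communication stays constrained.
response = narrow_channel(b"tx-request", sign_on_offline_device)
print(response)  # b'signed:tx-request'
```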


At the small-scale programmatic level, the Gordian system introduces a layered stack of specifications that together enable the private and secure transmission of sensitive data. This stack includes dCBOR,[19] Bytewords,[20] URs,[21] Animated QRs,[22] Envelope,[23] the Gordian Transport Protocol,[24] and the Gordian Sealed Transaction Protocol.[25] Together these specifications allow for the deterministic storage of binary data (dCBOR), the alphabetic representation of binary data (Bytewords), the tagged display of that representation with functionality to support multipart data (URs), the QR display of multipart data (Animated QRs), the structured & smart storage of content (Envelope), the communication of Envelopes (GTP), and the secure communication of Envelopes (GSTP). But we didn't know what all the layers would be when we got started: this is another example of future-proofing, and one that easily arises from carefully layered specifications.
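To show how such a layered stack composes, here is a simplified Python sketch. The encoders below are toy stand-ins for dCBOR, Bytewords, and URs (the real specifications are far more involved), but they illustrate how each layer consumes only the layer beneath it:

```python
# Toy stand-ins for a dCBOR -> Bytewords -> UR pipeline. Because each layer
# depends only on the layer below, any layer can be replaced or extended
# (e.g., adding animated QRs on top) without disturbing the others.

import json

WORDLIST = ["able", "acid", "also", "apex"]  # toy 2-bit "Bytewords" alphabet

def deterministic_encode(obj) -> bytes:
    """Stand-in for dCBOR: a canonical, deterministic byte serialization."""
    return json.dumps(obj, sort_keys=True, separators=(",", ":")).encode()

def to_words(data: bytes) -> str:
    """Stand-in for Bytewords: render bytes as words from a fixed alphabet."""
    words = []
    for byte in data:
        for shift in (6, 4, 2, 0):          # two bits at a time
            words.append(WORDLIST[(byte >> shift) & 0b11])
    return " ".join(words)

def to_ur(type_tag: str, words: str) -> str:
    """Stand-in for a UR: tag the payload so receivers know how to parse it."""
    return f"ur:{type_tag}/{words.replace(' ', '-')}"

payload = {"seed": "example"}
print(to_ur("envelope", to_words(deterministic_encode(payload))))
```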

Similarly, when we create our progressive use cases at Blockchain Commons, we focus first on the requirements, without needing to know the technology. The technological specifics can be filled in by ourselves or by individual vendors in the future.

By abstracting and separating architectural elements—whether they be large-scale components, layered specifications, or additional requirements found in progressive use cases—the Gordian system will be able to incorporate options that we are not even considering. The ultimate goal of all of these designs is to ensure that our MVA architecture does not limit itself, but instead remains flexible for the future.

Other MVA Examples

This type of MVA thinking is a pattern that can be widely successful and that doesn't create some of the limitations that appear in MVP thinking. For example, when I was supporting the creation of the earliest specifications for Decentralized Identifiers (DIDs),[26] I was pleased to see us arrive at a compromise where core DID specifications were separated from specific DID methods and from signature suites. It's an architecture that allows for a lot of future expansion.
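A minimal sketch of why that separation matters: if a resolver dispatches to per-method plug-ins, new DID methods can be added without touching the core. The registry and function names below are illustrative, not taken from any real DID library:

```python
# Hypothetical DID resolver with pluggable methods. The core resolution
# logic only knows the 'did:<method>:<id>' shape; everything method-specific
# lives behind the registry, so the architecture stays open to expansion.

from typing import Callable, Dict

METHOD_RESOLVERS: Dict[str, Callable[[str], dict]] = {}

def register_method(name: str, resolver: Callable[[str], dict]) -> None:
    """Add support for a new DID method without changing core code."""
    METHOD_RESOLVERS[name] = resolver

def resolve(did: str) -> dict:
    """Core resolution: parse 'did:<method>:<id>' and dispatch to the method."""
    scheme, method, identifier = did.split(":", 2)
    if scheme != "did" or method not in METHOD_RESOLVERS:
        raise ValueError(f"cannot resolve {did}")
    return METHOD_RESOLVERS[method](identifier)

# A toy method; a real one might consult a ledger, the web, or key derivation.
register_method("example", lambda ident: {"id": f"did:example:{ident}",
                                          "verificationMethod": []})

print(resolve("did:example:123456"))
```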

Similarly, some of my earliest Blockchain Commons work was with a company that was adapting the Gordian architecture. Even though they weren't planning to initially implement multi-sigs, I ensured that they didn't make decisions that would lock them out of multi-sig usage in the future, because I was thinking of an MVA that went beyond the MVP they were focused on.
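One hypothetical way to keep that door open, sketched in Python: model every spending policy as m-of-n from the start, so single-sig is merely the 1-of-1 special case. This illustrates the principle only; it is not the company's actual design:

```python
# Sketch of multi-sig future-proofing: if every policy is m-of-n from day
# one, adding multi-sig later requires no schema or API change, because
# single-sig was never a hard-coded special path.

from dataclasses import dataclass

@dataclass
class SpendingPolicy:
    threshold: int    # m: signatures required
    signers: list     # n: authorized public keys (hex strings here)

    def is_satisfied(self, signatures_from: set) -> bool:
        """Check whether enough distinct authorized signers have signed."""
        return len(signatures_from & set(self.signers)) >= self.threshold

# Today: plain single-sig, expressed as 1-of-1 ...
single = SpendingPolicy(threshold=1, signers=["key_a"])
# ... and later, 2-of-3 multi-sig slots into the same structure.
multisig = SpendingPolicy(threshold=2, signers=["key_a", "key_b", "key_c"])

print(single.is_satisfied({"key_a"}))             # True
print(multisig.is_satisfied({"key_a", "key_c"}))  # True
```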

Coda: The Benefits of Coopetition

It can be quite hard for a single company to figure out an MVA. Thus, it’s great to work with other companies in your technology space.

This is particularly true if your industry supports coopetition, where business competitors can work together for a mutually beneficial good. If an industry supports interoperability, or one company adding services to another company’s products, then it’s a great candidate for coopetition—and thus MVAs are even more likely to be successful.

Blockchain Commons has been able to take advantage of this. A variety of companies have participated in the Gordian Developer Community,[27] each contributing their own ideas and requirements for the Gordian architecture. In turn, they've then gone off and created open-source libraries that adapt the architecture,[28] before beginning work on their own wallets that use the MVA that we cooperatively designed. A not-for-profit organization can be a great support for MVA work of this type; that's what Blockchain Commons does.

Conclusion

Hollowing out spaces in architectures for future development and creating flexibility for the future through modular designs are two of the most successful methods for turning an MVP into an MVA. They give you something that supports minimal investment and agile development, while simultaneously maximizing the ability to scale and expand in the future.

We don’t always know the right solutions. We can’t predict what will work best. So the best we can do is create architectures that won’t lock us in to specific decisions about the future. By doing so, especially by working in coopetition to do so, we also ensure that no one company will lock us or our users into futures that we don’t agree with.

This article was originally drafted in 2021, and then back-burnered for various reasons. It's been great to see a real explosion in discussion of MVA in the years since by authors such as Ekaterina Novoseltseva,[9] Jorge Lebrato,[11] and Murat Erder and Pierre Pureur,[10] much of which reflects my own thoughts on MVA. Hopefully that means we're moving in this direction!

1. Various. Retrieved 2021. “Minimum Viable Product”. Wikipedia. https://en.wikipedia.org/wiki/Minimum_viable_product.
2. Michael Sweeney. 2015, 2020. “5 Successful Startups That Began With an MVP”. Clearcode. https://clearcode.cc/blog/successful-startups-minimum-viable-product/.
3. Allan Kelly. 2020. “The MVP is broken: It’s time to restore the minimum viable product”. TechBeacon. https://techbeacon.com/app-dev-testing/mvp-broken-its-time-restore-minimum-viable-product.
4. Allegra Frank. 2020. “How one of the biggest games of 2020 became one of the most controversial”. Vox. https://www.vox.com/culture/22187377/cyberpunk-2077-criticism-ps4-xbox-one-bugs-glitches-refunds.
5. Andrea Contigiani. 2018. “The Downside of Applying Lean Startup Principles”. Knowledge at Wharton.
6. Mazdak Hashemi. 2017. “The Infrastructure behind Twitter: Scale”. Twitter blog.
7. Evan Weaver quoted by James Governor. 2017. “Minimum Viable Architecture – good enough is good enough in an enterprise”. James Governor’s Microchips. https://redmonk.com/jgovernor/2017/06/13/minimum-viable-architecture-good-enough-is-good-enough-in-an-enterprise/.
8. Deepak Karanth. 2016. “How to Create a Minimum Viable Architecture”. Dzone. https://dzone.com/articles/minimum-viable-architecture.
9. Ekaterina Novoseltseva. 2022. “Minimum Viable Architecture”. Apiumhub. https://apiumhub.com/tech-blog-barcelona/minimum-viable-architecture/#.
10. Pierre Pureur. 2021. “Minimum Viable Architecture: How To Continuously Evolve an Architectural Design over Time”. Continuous Architecture in Practice. https://continuousarchitecture.com/2021/12/21/minimum-viable-architecture-how-to-continuously-evolve-an-architectural-design-over-time/.
11. Jorge Lebrato. 2022. “What is a Minimum Viable Architecture (MVA) and why an iPaaS such as Anypoint Platform can help you achieve it”. Medium: Another Integration Blog. https://medium.com/another-integration-blog/what-is-a-minimum-viable-architecture-mva-and-why-an-ipaas-such-as-anypoint-platform-can-help-you-f54c9791f6c3.
12. Various. Retrieved 2021. “Forward Secrecy”. Wikipedia. https://en.wikipedia.org/wiki/Forward_secrecy.
13. Uncredited. 2020. “Cipher Suites and TLS Protocols”. SSLs.com Blog. https://www.ssls.com/blog/cipher-suites-and-tls-protocols/.
14. Various. Retrieved 2024. “Blockchain Commons Developer pages”. Blockchain Commons website. https://developer.blockchaincommons.com/.
15. Various. Retrieved 2024. “SSKR: Sharded Secret Key Reconstruction”. Blockchain Commons website. https://developer.blockchaincommons.com/sskr/.
16. Various. Retrieved 2024. “CSR: Collaborative Seed Recovery”. Blockchain Commons website. https://developer.blockchaincommons.com/csr/.
17. Various. Retrieved 2024. “Air Gaps”. Blockchain Commons website. https://developer.blockchaincommons.com/airgap/.
18. Various. Retrieved 2024. “Torgaps”. Blockchain Commons website. https://developer.blockchaincommons.com/torgap/.
19. Various. Retrieved 2024. “Deterministic CBOR (dCBOR)”. Blockchain Commons website. https://developer.blockchaincommons.com/dcbor/.
20. Various. Retrieved 2024. “Bytewords”. Blockchain Commons website. https://developer.blockchaincommons.com/bytewords/.
21. Various. Retrieved 2024. “Uniform Resources (URs)”. Blockchain Commons website. https://developer.blockchaincommons.com/ur/.
22. Various. Retrieved 2024. “Animated QRs”. Blockchain Commons website. https://developer.blockchaincommons.com/animated-qrs/.
23. Various. Retrieved 2024. “Gordian Envelope”. Blockchain Commons website. https://developer.blockchaincommons.com/envelope/.
24. Shannon Appelcline, Wolf McNally & Christopher Allen. 2024. “Gordian Transport Protocol / Envelope Request & Response Implementation Guide 📖”. GitHub. https://github.com/BlockchainCommons/Research/blob/master/papers/bcr-2024-004-request.md.
25. Wolf McNally & Christopher Allen. 2023. “Gordian Sealed Transaction Protocol (GSTP)”. GitHub. https://github.com/BlockchainCommons/Research/blob/master/papers/bcr-2023-014-gstp.md.
26. Drummond Reed, Manu Sporny, Dave Longley, Christopher Allen, Ryan Grant, and Markus Sabadello. 2021. “Decentralized Identifiers (DIDs) v1.0”. W3C. https://www.w3.org/TR/did-core/.
27. Various. Retrieved 2024. “Gordian Developer Community”. GitHub. https://github.com/BlockchainCommons/Gordian-Developer-Community.
28. Various. Retrieved 2024. “Blockchain Commons Libraries”. Blockchain Commons website. https://developer.blockchaincommons.com/libraries/.

Tuesday, 18. June 2024

Velocity Network

Authoritative Sources for Verifiable Credentials – Part 3

Issuer permissions are the mechanism that Velocity Network introduces to enable relying parties (and wallets) to determine if an issuer is an authoritative source for a particular credential. After an organization requests the ability to issue on the Network, the request is reviewed by Velocity Network to ensure that the issuing service parameters are within the remit of the organization's business activities.

FIDO Alliance

AWS Expands MFA Requirements, Boosting Security and Usability with Passkeys

AWS has announced the introduction of FIDO passkeys for multi-factor authentication (MFA) to further secure customer accounts. This move aligns with AWS’s objective to offer a secure cloud environment by incorporating secure-by-design and safe-by-default principles. FIDO passkeys offer a strong and easy MFA option, leveraging public key cryptography to resist phishing attempts and enhance overall account protection.


Velocity Network

Velocity Network’s Architecture for Issuer Trust – Part 2

Velocity Network aims to migrate career data to a three-party data exchange model with reliable data. This architecture is key to the revolution in career data that is waiting to happen. Without the three-party model, relying parties must create API integrations with a source of trusted digitized data from a single, often monopolistic, trusted issuer.

Monday, 17. June 2024

FIDO Alliance

ID Talk Podcast: Passkeys, Standards, and Selfie Certification with FIDO’s Andrew Shikiar

The FIDO Alliance, founded in 2012, stands as a pivotal organization in the identity technology sector, advocating for strong passwordless authentication mechanisms. The Alliance has been instrumental in establishing influential industry standards, promoting the adoption of biometrics, and enhancing digital security through two-factor and multi-factor authentication technologies.

This week, Andrew Shikiar, FIDO's Executive Director and CEO, joins the ID Talk podcast to discuss critical issues in authentication and identity security. The conversation covers topics such as the intricacies of passkeys, the dangers of phishing and deepfakes, and the comprehensive testing that FIDO-certified products undergo with independent, accredited labs to gain certification. Additionally, Shikiar introduces FIDO's new Face Verification Certification program, aimed at standardizing selfie-based identity verification technologies across various sectors. Gain valuable insights from Andrew Shikiar by tuning into the podcast, available on SoundCloud, Spotify, Apple Podcasts, or using the link below.


Identity At The Center - Podcast

Dive into the world of digital identity with our latest episode of The Identity at the Center podcast! We discussed the future of digital wallets, authentication, and the importance of trust frameworks with Joni Brennan from the DIACC.

Watch the episode at https://youtu.be/phQtu14jlJU?si=u8N_zXgjuK-8uqD1 or listen in your podcast app.

#iam #podcast #idac

Sunday, 16. June 2024

Velocity Network

Empowering Self-Sovereign Identity With Trusted Credentials: Exploring Velocity Network Checks – Part 1

Self-sovereign identity centers on placing data control squarely in the hands of individuals. The goal is to rectify a mistake that has grown exponentially since the late 90s—the dominance of certain companies over personal information. The current model has data providers sending data to data consumers directly, for the most part, with little to no consent from the data subjects themselves […]

Friday, 14. June 2024

MyData

Lessons from the City of Helsinki: Three Paradigm Shifts in Smart Cities

Author: Mikko Rusama, Managing Partner at Nexus Transform. Finland is now the happiest country in the world for seven years in a row, according to the United Nations’ World Happiness […]