Last Update 6:18 PM March 28, 2024 (UTC)

Organizations | Identosphere Blogcatcher

Brought to you by Identity Woman and Infominer.
Support this collaboration on Patreon!!

Thursday, 28. March 2024

OriginTrail

The Initial Paranet Offerings (IPOs) to supercharge the Verifiable Internet for AI

Access to shared open knowledge constructed in a collaborative way is mission-critical for the future of AI, especially since non-AI-generated content is expected to be surpassed in size by synthetic, AI-generated content in the coming period. Its importance has also been highlighted by Yann LeCun, Turing Award winner and deep learning pioneer:

“The way you train that (AI) system will have to be crowdsourced … if you want it to be a repository of all human knowledge, all humans need to contribute to it.” Yann LeCun

To achieve that, Whitepaper 3.0 introduced AI para-networks, or paranets: autonomously operated collections of Knowledge Assets owned by their communities and residing on the OriginTrail Decentralized Knowledge Graph (DKG).

Initial Paranet Offerings (IPOs) are now introduced as the means of publicly launching a paranet, with a collection of Knowledge Assets and an accompanying incentivization structure proposed and voted upon via the NeuroWeb governance mechanism. Each IPO is structured as an initial proposal and an initial set of published Knowledge Assets, along with an incentivization structure set forth by an IPO operator that proposes how the incentives will be split across three groups:

IPO operator
Knowledge miners
NEURO holders that participated in supporting the creation of an IPO and approved the requested allocation of NEURO utility tokens for an IPO's knowledge mining.

The success of an IPO largely depends on the IPO operator's ability to propose the incentive structure wisely, taking into consideration the following factors, among others:

The IPO operator autonomously selects the AI services used to drive the value of the knowledge base, and must take an economically and commercially viable approach to both the creation and the maintenance of the paranet.

The IPO operator is expected to propose an operator fee that makes the launch of the paranet economically viable (earning a share of the allocated emissions), while also setting up a fee structure for both knowledge miners and NEURO holders that take part in voting.

Since mining Knowledge Assets on the DKG carries a cost in TRAC utility tokens, knowledge miners are central not only to the success of an IPO proposal, but even more so as the entities that trigger NEURO incentives: only when new Knowledge Assets are mined are the allocated NEURO emissions distributed across the three groups. When launching an IPO, the paranet operator defines the ratio of NEURO earned per TRAC spent to mine each Knowledge Asset. An IPO operator may set the ratio autonomously to target a desired profitability before the proposal is submitted to voting, yet attempts at price gouging might not receive support from NEURO holders (a simplified numerical sketch follows below).

NEURO holders that support an IPO via governance voting lock up tokens for the duration of the NEURO emissions allocated to the IPO. Though the share of emissions allocated to an IPO is an important factor in NEURO holders' decision, the duration of the "lock period" can also play an important role. The paranet operator also defines what portion of the paranet incentives will be shared with the NEURO holders supporting the proposal.

The ecosystem incentivizing the Verifiable Internet for AI
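
The incentive mechanics above lend themselves to a simple worked example. The sketch below is illustrative only: the NEURO-per-TRAC ratio, the operator/miner/voter split, and the 100 TRAC spend are assumptions for demonstration, not values from the OriginTrail or NeuroWeb protocols.

```python
# Illustrative sketch of IPO incentive accounting (all parameters are assumptions,
# not actual OriginTrail/NeuroWeb values).

NEURO_PER_TRAC = 2.5          # ratio set by the IPO operator: NEURO emitted per TRAC spent
OPERATOR_SHARE = 0.10         # share of emissions kept by the IPO operator
VOTER_SHARE = 0.20            # share reserved for NEURO holders who supported the IPO
MINER_SHARE = 1.0 - OPERATOR_SHARE - VOTER_SHARE  # remainder goes to knowledge miners


def emissions_for_mining(trac_spent: float) -> dict:
    """Split the NEURO emitted for newly mined Knowledge Assets across the three groups.

    Emissions are only triggered when new Knowledge Assets are mined, mirroring
    the rule that allocated NEURO is distributed per mined asset.
    """
    total_neuro = trac_spent * NEURO_PER_TRAC
    return {
        "operator": total_neuro * OPERATOR_SHARE,
        "knowledge_miners": total_neuro * MINER_SHARE,
        "neuro_voters": total_neuro * VOTER_SHARE,
        "total": total_neuro,
    }


if __name__ == "__main__":
    # A knowledge miner spends 100 TRAC publishing Knowledge Assets to the paranet.
    print(emissions_for_mining(100.0))
    # -> {'operator': 25.0, 'knowledge_miners': 175.0, 'neuro_voters': 50.0, 'total': 250.0}
```

In practice these parameters would be fixed in the IPO proposal itself and only take effect once NEURO holders approve the allocation through governance voting.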

Several institutional entities and builders have already pre-registered their interest in launching the first IPOs, and the inaugural batch is nearing the announcement stage. If you are interested in launching a paranet and in knowledge mining, hop into the community discussion on Discord and share your ideas.

The Initial Paranet Offerings (IPOs) to supercharge the Verifiable Internet for AI was originally published in OriginTrail on Medium, where people are continuing the conversation by highlighting and responding to this story.


ResofWorld

ShopeeFood drops non-compete clause after Rest of World reporting

Drivers for the Vietnam delivery company had used workarounds but feared detection. After Rest of World’s reporting, the clause was dropped.
ShopeeFood, one of Southeast Asia’s biggest food delivery platforms, has dropped a controversial non-compete clause that forbade its workers from working for rivals. The stipulation had forced delivery drivers to...

Brazil banned these killer kites. Influencers are keeping the black market alive

By advertising on Instagram and TikTok, and selling through marketplaces, vendors are finding their way around the ban.
Above the treetops and shabby rooftops of Brazil’s favelas, hordes of colorful kites soar across the sky on any given day. In these densely populated and poverty-stricken neighborhoods, the centuries-old...

Meta’s long-standing problem with “shaheed”

An overdue ruling from the Oversight Board.
On Tuesday, the Meta-established Oversight Board released a new ruling on how Facebook moderates the Arabic word “shaheed,” which translates roughly to “martyr.” Meta had been automatically flagging the word...

Scale AI’s Remotasks platform is dropping whole countries without explanation

Workers in Kenya, Nigeria, and Pakistan were booted off the AI-training service earlier this month.
Grace Mumo started working with Remotasks after she lost her job in 2020. To the single mother caring for three children, the remote clickwork service offered a way to make...

Wednesday, 27. March 2024

DIF Blog

Effective governance now with DIF Credential Trust Establishment

In the digital identity space, the Trust Establishment (TE) and Credential Trust Establishment (CTE) specifications play crucial roles in defining how trust is established and managed. CTE, in particular, is gaining traction as we approach the Internet Identity Workshop (IIW), with a plan to advance it to formal V1 status. This article focuses on the CTE, shedding light on its key features that make it a game-changer in building trust within digital credentials.

Core Aspects of CTE

CTE builds upon TE by enabling ecosystems to express their trust in the issuers of decentralized identifiers (DIDs) and credentials. Credential validation steps of checking the integrity and revocation status are well known and understood, but there are not yet commonly-agreed-upon standards for evaluating the trustworthiness of the issuer’s claims. 

Existing approaches have fallen short in one or more of the following areas: 

Ensuring the approach is sufficiently adaptable
Ability to express authorization for a given purpose (not just authorization)
Allows good performance and minimal resources, even eligible for offline use
Low-cost to implement, deploy, and use

This is where CTE comes in: enabling ecosystems to express the credibility of participants, but in a way that meets the above needs. By doing so, it helps avoid “rent-seeking” behavior, in which an ecosystem participant tries to position themselves to collect transaction fees or similar.

Authority in the Ecosystem

CTE is non-prescriptive in its stance on defining who is an authority. It operates on the principle that authority is determined by an ecosystem’s existing trust structure, informing the acceptance and recognition of the credentials. This flexibility allows for wide adoption and adaptation, making it a practical solution for managing trust.

Governance and Flexibility

CTE introduces a practical governance model that is lightweight and adaptable. It specifies roles such as credential issuance and verification, and allows grouping by schemas, or type of credential. This allows CTE to adapt well to a wide variety of use cases and simplifies the process of determining who is authorized to issue or verify credentials.

Trust on Demand

CTE includes flexible dials for cases where more fluidity is required. For example, instead of being statically included in the registry, an individual can hold a credential that assigns them a specific role, where the root authority of that credential corresponds to an entry/role in the registry. This method is not only efficient for offline use but also broadens compatibility with different protocols, enhancing the flexibility and utility of the trust establishment process.

Impact

CTE is designed to counter rent-seeking behaviors and establish a solid trust foundation in digital credentials. It enables organizations and individuals to easily verify the legitimacy of credentials, providing a clear pathway for recognizing valuable credentials for professional development, for example. The specification’s governance model is straightforward and requires minimal technical investment, making it accessible and implementable across various industries.

How it can be used

In the wild, CTE files would be used by software representing companies and people. Companies and people will have a collection of governance files they use for different industries and purposes. In general, companies will be interested in software providing an immediate yes or no answer informing whether to accept or reject a credential. For individuals, however, software can use CTE files to advise on whether a credential is recognized by different parties. By indexing different CTE files, software can help individuals decide which credentials are most valuable for them.
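
As a rough illustration of that yes/no check, the sketch below models a governance file as a plain Python structure and asks whether a given issuer DID is authorized for a given credential schema. The field names, schema URL, and DIDs are hypothetical and deliberately simplified; they do not reproduce the actual CTE JSON format.

```python
# Minimal sketch of checking an issuer against a CTE-style governance file.
# The structure and names here are simplified illustrations, not the CTE schema itself.

governance_file = {
    "name": "Example Trade Association Governance",   # hypothetical ecosystem
    "schemas": {
        "https://example.org/schemas/ProfessionalCertification": {
            "issue": ["did:example:association-training-arm"],   # who may issue
            "verify": ["did:example:member-employer"],           # who is expected to verify
        }
    },
}


def issuer_accepted(governance: dict, schema_id: str, issuer_did: str) -> bool:
    """Return True if the governance file lists issuer_did as an authorized issuer
    for the given credential schema."""
    roles = governance.get("schemas", {}).get(schema_id, {})
    return issuer_did in roles.get("issue", [])


if __name__ == "__main__":
    schema = "https://example.org/schemas/ProfessionalCertification"
    print(issuer_accepted(governance_file, schema, "did:example:association-training-arm"))
    # True  -> accept the credential
    print(issuer_accepted(governance_file, schema, "did:example:unknown-issuer"))
    # False -> reject, or consult other governance files the relying party trusts
```

Software acting for an individual could run the same lookup across several indexed governance files to report which ecosystems recognize a given credential.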

Future Directions

As CTE heads towards v1, its potential to streamline the verification process and enhance the credibility of digital credentials is becoming increasingly apparent. DIF invites you to learn more about how CTE can revolutionize the digital identity field in providing a scalable, flexible, and trustworthy framework for managing digital credentials.

Learn more at:

Internet Identity Workshop
DIF virtual event (details coming soon)

In summary, CTE is not just about establishing trust; it's about making the process more accessible, adaptable, and reliable for everyone involved in the digital identity ecosystem. Its forward-thinking approach to governance, authority, and risk mitigation positions it as a cornerstone specification in the evolving landscape of digital credentials.


GS1

Maintenance release 2.9

GS1 GDM SMG voted to implement the 2.9 standard into production in February 2024.

Key Milestones:

See GS1 GDM Release Schedule

As content for this release is developed, it will be posted to this webpage, followed by an announcement to the community to ensure visibility.
GDSN Data Pools should contact the GS1 GDSN Data Pool Helpdesk to understand the plan for the update. Trading Partners should work with their Data Pools (if using GDSN) and/or Member Organisations on understanding the release and any impacts to business processes.

GDM 2.9 contains updated reference material aligned with ADB 2.3 and GDSN 3.1.26.

 

Updated For Maintenance Release 2.9

GDM Standard 2.9 (February 2024)

Local Layers For Maintenance Release 2.9

China - GSMP RATIFIED (April 2022)

France - GSMP RATIFIED (November 2023)

Germany - GSMP RATIFIED (November 2023)

Poland - GSMP RATIFIED (November 2023)

Romania - GSMP RATIFIED (December 2021)

USA - GSMP RATIFIED (February 2023)

Finland - GSMP RATIFIED (November 2023)

 

Release Guidance

GDM Market Stages Guideline (June 2023)

GDM Attribute Implementation Guideline (February 2024)

GPC Bricks To GDM (Sub-) Category Mapping (March 2024)

Attribute Definitions for Business (February 2024)

GDM (Sub-) Categories (October 2021)

GDM Regions and Countries (17 December 2021)

GDSN Release 3.1.26 (February 2024)

Tools

GDM Navigator on the Web 

GS1 GDM Attribute Analysis Tool (Nov 2023)

GDM Local Layer Submission Template (May 2023)

Training

E-Learning Course

Any questions?

We can help you get started using GS1 standards.

Contact your local office


EdgeSecure

Edge Partners with FABRIC, Princeton University, and Rutgers, The State University of New Jersey, on High Performance Network Infrastructure

NEWARK, NJ, March 27, 2024 – Edge recently partnered with FABRIC, Rutgers, The State University of New Jersey, and Princeton University to provide high performance network infrastructure connecting university researchers and their local compute clusters and scientific instruments to the larger FABRIC infrastructure.

Notes Dr. Forough Ghahramani, Assistant Vice President for Research, Innovation, and Sponsored Programs, Edge, “The partnership with the FABRIC team and researchers at Princeton University and Rutgers will create opportunities to explore innovative solutions not previously possible for a large variety of high-end science applications and provide a platform on which to educate and train the next generation of researchers on future advanced distributed system designs.”

FABRIC is an international infrastructure that enables cutting-edge experimentation and research at-scale in the areas of networking, cybersecurity, distributed computing, storage, virtual reality, 5G, machine learning, and science applications. Funded by the National Science Foundation’s (NSF’s) Mid-Scale Research Infrastructure program, FABRIC enables computer science and networking researchers to develop and test innovative architectures that could yield a faster, more secure Internet. 

“EdgeNet is uniquely well-positioned to provide infrastructure support to these types of research networking initiatives,” explains Bruce Tyrrell, Associate Vice President, Programs & Services, Edge. Continues Tyrrell, “As a backbone and external services provider to both Rutgers and Princeton University, Edge has the capacity and capability to meet the high bandwidth research needs of our partner institutions. Our extensive optical backbone enables Edge to efficiently and economically deploy 100Gb transport services to all of our members.”    

The FABRIC team is led by researchers from University of North Carolina at Chapel Hill, University of Kentucky, Clemson University, University of Illinois, and the Department of Energy’s ESnet (Energy Sciences Network). The team also includes researchers from many other universities, including Rutgers and Princeton University, to help test the design of the facility and integrate their computing facilities, testbeds, and instruments into FABRIC.

“FABRIC aims to be an infrastructure to explore impactful new ideas that are impossible or impractical with the current Internet. It provides an experimental sandbox that is connected to the globally distributed testbeds, scientific instruments, computing centers, data, and campuses that researchers rely on everyday,” said Paul Ruth, FABRIC Lead PI. “Edge enables us to support research across many facilities including the COSMOS wireless testbed, Princeton’s experimental P4 testbed, and remotely controlled instruments such as a CyroEM microscope at Rutgers.”

“The integration of FABRIC with COSMOS, both pivotal national testbeds, opens unparalleled avenues for experimentation that blend wired and wireless networking with edge computing, supported by Edge’s provision of connectivity between these testbeds as well as to other national and international networks in NYC and Philadelphia carrier hotels. This synergy not only enhances our research capabilities but also paves the way for groundbreaking advancements in network infrastructure and distributed systems,” notes Ivan Seskar, Chief Technologist at WINLAB, Rutgers, emphasizing the importance of collaborative efforts in pushing the boundaries of networking and computing research.

Princeton University Provost and Gordon Y.S. Wu Professor in Engineering and Computer Science, Dr. Jennifer Rexford, was an early supporter of bringing FABRIC to Princeton, serving as a founding member of the project’s steering committee. Shares Rexford, “Linking into FABRIC allows Princeton to support science on a global scale, across multiple domains and enables researchers to reinvent the internet by experimenting with novel networking ideas in a realistic setting — at tremendous speed, scope and scale.” Further elaborates Jack Brassil, Ph.D., Senior Director of Advanced CyberInfrastructure, Office of the Vice President for Information Technology, and Senior Research Scholar, Department of Computer Science, Princeton University, “FABRIC enables the Princeton University campus to usher in a new generation of terabit per second networking applications. By connecting our faculty to experimental testbeds, scientific instruments, and research collaborators at other higher education institutions, FABRIC will provide a fast path to scientific discovery.”

To learn more about FABRIC capabilities, visit https://whatisfabric.net/. Contact Forough Ghahramani (research@njedge.net) for additional information.

About Edge

Edge serves as a member-owned, nonprofit provider of high performance optical fiber networking and internetworking, Internet2, and a vast array of best-in-class technology solutions for cybersecurity, educational technologies, cloud computing, and professional managed services. Edge provides these solutions to colleges and universities, K-12 school districts, government entities, hospital networks and nonprofit business entities as part of a membership-based consortium. Edge’s membership spans the northeast, along with a growing list of EdgeMarket participants nationwide. Edge’s common good mission ensures success by empowering members for digital transformation with affordable, reliable and thought-leading purpose-built, advanced connectivity, technologies and services.

The post Edge Partners with FABRIC, Princeton University, and Rutgers, The State University of New Jersey, on High Performance Network Infrastructure appeared first on NJEdge Inc.


We Are Open co-op

Towards a manifesto for Open Recognition

Advocating for a more diverse future for the recognition of talents, skills, and aspirations. Image CC BY-ND Visual Thinkery for WAO

Back in 2016, the Open Recognition Alliance created the Bologna Open Recognition Declaration (BORD). This has helped the community organise around principles relating to the concept of Open Recognition for all. It emphasises the importance of building out technologies and infrastructure to enable Open Recognition, as well as advocating for policies which foster its development.

Eight years later, the Open Recognition is for Everybody (ORE) community has started work on a manifesto for Open Recognition. This will be part of the Open Recognition Toolkit and extends the BORD to help people envision and advocate for a future where Open Recognition is commonplace.

Unpacking Open Recognition

Let’s begin with defining terms:

Open Recognition is the awareness and appreciation of talents, skills and aspirations in ways that go beyond credentialing. This includes recognising the rights of individuals, communities, and territories to apply their own labels and definitions. Their frameworks may be emergent and/or implicit.
(What is Open Recognition, anyway?)

We want to help people understand that traditional approaches to credentialing, while important for unlocking opportunities, are just one part of a wider recognition landscape.

Image CC BY-ND Visual Thinkery for WAO

For example, you could think of traditional credentialing, with its courses, modules, and diplomas, as a greenhouse where growth conditions are carefully controlled. Only certain plants thrive in this environment, and they are pre-selected to do so.

Open Recognition, on the other hand, is more like the garden that surrounds the greenhouse where a diverse array of plants grow naturally, adapt to their environment, and flourish in unique ways. Not only that, but there are many different gardens with different types of soil and varying atmospheric conditions.

Getting started with a manifesto

A manifesto is a call to action. It’s a way of allowing people to sign up to implement specific principles in order to work towards a better future.

To get started on that road, in a recent ORE community call we asked two questions:

What sucks that we want to do the opposite of? What doesn’t exist that we want to bring into being?

While these are only our first steps towards a manifesto with a subset of the community, we’re keen to share what we’ve discussed so far.

What sucks?

Simplifying complex systems — our digital landscape is cluttered with overly complex technologies and terminology. We aim to streamline these technologies, making open recognition accessible to everyone, not just the tech-savvy.

Clearing confusion and enhancing communication — there’s a tendency to overlook past contributions in the field, creating a cycle where new initiatives ignore the groundwork laid by predecessors. We want to provide clear, accurate information about Open Recognition to varied audiences.

Dismantling exclusivity — some forms of recognition and credentials are guarded as if they’re an exclusive membership available only to a select few. It’s important that we break down these barriers to create a more inclusive environment where everyone’s achievements are acknowledged.

What doesn’t exist?

Streamlined badge creation — we want to make creating badges for Open Recognition as easy as filling out a social media profile. This would encourage wider adoption and creativity in badge design/issuing.

Stories of success — examples and case studies help guide and inspire others. This could be part of the Open Recognition Toolkit, allowing stories to be shared and help provide practical and conceptual guidance to others.

Bridging spheres of learning — different forms of learning, for example formal and informal, tend to be siloed. As we know valuable skills can be acquired outside of traditional educational settings, we want to build a bridge to recognise the worth of both formal training and self-taught expertise.

Next steps

Creating a manifesto for Open Recognition involves creating something that resonates with a broad audience. It needs to be informative and upbeat, and have an ideological stance which advocates for a better future world.

Our next community call will continue the work we started this week, helping us work towards a plausible utopia for Open Recognition. If this is something which resonates with you, and you’d like to get involved, join us!

Related posts

How badges can change the world — Part 1: The Two Loops Model for Open Recognition advocacy
How badges can change the world — Part 2: Why we need to transition
Advocating for learner-centric badge systems: Some thoughts on campaigning for the right things

Towards a manifesto for Open Recognition was originally published in We Are Open Co-op on Medium, where people are continuing the conversation by highlighting and responding to this story.


ResofWorld

ByteDance shuts down its WhatsApp clone in Africa

TikTok’s owner launched LetsChat in 2021, aimed at young Africans and boosted by heavy marketing, but it couldn’t crack WhatsApp’s dominance.
ByteDance has shut down LetsChat, an app that was once considered a rival to WhatsApp and Telegram in Africa. The Chinese tech giant pulled the plug on LetsChat on March...

The changing face of protest

Mass protests used to offer a degree of safety in numbers. Facial recognition technology changes the equation.
Intro An end to privacy On March 13, 2022, 34-year-old English teacher Yulia Zhivtsova left her Moscow apartment to meet her friends at the mall. Bundled up against the freezing...

Blockchain Commons

Foremembrance Day 2024 Presentation

For Foremembrance Day 2024, Christopher Allen gave a Twitter Livestream discussing the tragedy of overidentification in The Netherlands in WWII, how France offered a different path, and how we must continue to be wary about what identity information we collect and distribute today.

For more see the slides of this presentation, the original article “Echoes from History” and a discussion of modern threats in “The Dangers of eIDAS”.

Tuesday, 26. March 2024

Energy Web

Web 3 as-a-service solution for enterprise now live on Energy Web X and Polkadot

Global energy majors at the forefront of Web 3 enterprise adoption on Energy Web X and Polkadot

Energy Web, a global ecosystem of energy companies focused on developing and deploying Web 3 technologies to accelerate decarbonization of the global energy system, recently released Smartflow, a new as-a-service product that makes it simple and easy for enterprise customers to launch Web 3 solutions built on Energy Web X, a parachain powered by the Polkadot blockchain and Substrate technology.

Smartflow enables enterprise customers to configure and deploy custom business logic using decentralized networks of “worker nodes”. Worker nodes ingest data from individual enterprises or consortia of corporate customers, perform application-specific computational work on the data, and publish the results for enterprises and the public to verify. Worker nodes create business value because of the zero trust relationships they unlock: with worker nodes, no central entity has access to underlying data and, more importantly, work conducted by the nodes can be independently verified without needing to trust a single centralized entity or server. Worker nodes are connected to and secured by the Energy Web X parachain, a new blockchain on the Polkadot network.
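
Smartflow's actual interfaces are not shown in this post, so the sketch below is only a rough illustration of that ingest/compute/publish loop: a toy computation (matching metered consumption against renewable production) plus a plain SHA-256 commitment over the inputs so anyone holding the same data can recheck the node's output. Every function name and field in it is an assumption, not a worker node API.

```python
import hashlib
import json

# Toy sketch of a "worker node" cycle: ingest data, compute, publish a verifiable result.
# The computation and the commitment scheme are illustrative assumptions, not Smartflow APIs.


def compute_result(readings: list[dict]) -> dict:
    """Application-specific work: here, total consumption matched against renewable supply."""
    consumed = sum(r["kwh_consumed"] for r in readings)
    renewable = sum(r["kwh_renewable"] for r in readings)
    return {"kwh_consumed": consumed, "kwh_renewable": renewable,
            "fully_matched": renewable >= consumed}


def publish(readings: list[dict], result: dict) -> dict:
    """Publish the result plus a hash of the inputs, so any party holding the same
    input data can independently recompute and verify the node's work."""
    digest = hashlib.sha256(json.dumps(readings, sort_keys=True).encode()).hexdigest()
    return {"result": result, "input_commitment": digest}


if __name__ == "__main__":
    data = [
        {"meter": "building-a", "kwh_consumed": 120.0, "kwh_renewable": 90.0},
        {"meter": "building-b", "kwh_consumed": 60.0, "kwh_renewable": 110.0},
    ]
    print(publish(data, compute_result(data)))
```

The point of the hash commitment is the zero-trust property described above: the published result can be re-derived and checked without any central party vouching for it.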

Given the regulated nature of the energy industry and growing calls for more transparency around sustainability data, worker nodes are uniquely positioned to create value for energy companies. Over the past seven years, corporates in the Energy Web ecosystem have uncovered a number of use cases for Smartflow and worker nodes, including:

Matching granular energy consumption from buildings with energy produced by individual renewable energy power plants
Balancing the grid using fleets of distributed solar systems and batteries
Verifying the carbon intensity of sustainable aviation fuel
“As long-time Energy Web supporters, we are excited by the decentralized approach to multi-party computation provided by SmartFlow. It is a digital tool that provides a new kind of trust layer in business relations based on shared information. We are eager to discover how it can be integrated into our processes, and what kind of value it can create,” said Etienne Gehain, New Digital Solutions Director at Engie.

Smartflow enables enterprises to build custom worker node workflows and integrate them with existing data sources and APIs in minutes using a no-code infrastructure. Though currently configured to support Energy Web’s ecosystem of member energy companies, the technology can create value for enterprises in any industry where exchange and processing of sensitive data between companies is required to create business value.

Please visit the smartflow website to create your account and begin building today.

About Energy Web Foundation: Energy Web is a global non-profit accelerating the clean energy transition by developing open-source technology solutions for energy systems. Our enterprise-grade solutions improve coordination across complex energy markets, unlocking the full potential of clean, distributed energy resources for businesses, grid operators, and customers. Our solutions for enterprise asset management, Digital Spine, and Green Proofs, our tool for registering and tracking low-carbon products, are underpinned by the Energy Web Chain, the world’s first public blockchain tailored to the energy sector. The Energy Web ecosystem comprises leading utilities, renewable energy developers, grid operators, corporate energy buyers, automotive, IoT, and telecommunications leaders, and more. More information on Energy Web can be found at www.energyweb.org or follow us on Twitter @EnergyWebX.

Web 3 as-a-service solution for enterprise now live on Energy Web X and Polkadot was originally published in Energy Web on Medium, where people are continuing the conversation by highlighting and responding to this story.


ResofWorld

Reliance and Disney team up to crush Netflix and Prime Video in India

Mukesh Ambani’s Reliance and The Walt Disney Company have joined forces, creating a hard-to-beat streaming entity in the country.
At the end of February, as several global celebrities made their way to his son’s pre-wedding celebrations, India’s richest man, Mukesh Ambani, made a move that is likely to win...

FIDO Alliance

Recap: Virtual Summit: Demystifying Passkey Implementations

By: FIDO staff

Passkeys hold the promise of enabling simpler, strong authentication. But first organizations, governments and individuals will have to adopt the technology – and some of them have questions.

At the Authenticate Virtual Summit: Demystifying Passkey Implementation on March 13, speakers from the FIDO Alliance, Intercede, IDEMIA, Yubico, Dashlane and 1Password as well as implementers including Amazon and Target, presented on their experiences implementing and working with passkeys. The virtual summit covered the technical perspective on passkeys from the FIDO Alliance, as well as use cases for passkeys in the enterprise, consumer authentication, and the U.S. government. Along the way, attendees asked lots of questions and got lots of insightful answers.

Fundamentally, a key theme that resonated throughout the virtual summit was that passkeys are a password replacement – and it’s a replacement that can’t come soon enough.

“Passwords are still the primary way for logging on and they are still easily phished through social engineering and they tend to be very difficult to use and to maintain,” David Turner, senior director of standards development at the FIDO Alliance said. “The consequences are real and the impact is real to the world at large.”

Passkeys 101

During his session, Turner provided a high-level overview on what passkeys are and how they work.

Passkeys build upon existing FIDO authentication protocols and simplify the user experience. 

Passkeys can now be synchronized across devices through the use of passkey providers, removing the need for separate credentials on each device. Passkeys also enable new capabilities like cross-device authentication. Turner demonstrated how a QR code scanned on one device can securely connect to credentials stored on another nearby device. 

In addition to synced passkeys, there are also device-bound passkeys, which rely on technologies like a security key to provide the required credentials.

The State of Passkeys

The current and future state of passkey adoption was the topic tackled by Andrew Shikiar, executive director and CEO of the FIDO Alliance.

As of 2024, hundreds of services support passkeys, including the major platform vendors Microsoft, Apple, and Google, collectively representing billions of users.

“If you are a service provider and you wish to deploy passkeys, you can do so with high confidence that your consumers will be able to leverage them,” he said.

The FIDO Alliance aims to drive passkey support over the coming years, in part by sharing best practices and success stories, which is a core part of what the virtual summit was all about.

Usability was emphasized as a key factor for widespread adoption. 

“Usability is paramount. It must be front and center in what you do,” said Shikiar. 

The FIDO Alliance has released user experience guidelines and a design system to help companies implement passkeys in a user-friendly way. Future guidelines will address additional use cases.

Shikiar stressed that passkeys are not just a new addition to improve the security of passwords. His expectation is that passkeys will be seen as a true password replacement rather than an attempt at bolstering existing authentication methods. The fundamental problem is passwords, he argued, and the goal should be replacing them, not just adding extra security layers on top of them. Shikiar wants people to stop thinking about multi-factor authentication factors and instead think about enabling phishing-resistant identities.

Passkeys are on Target at Target

Passkeys are already in use at retail giant Target, helping to improve security and optimize authentication for its employees. 

Tom Sheffield, senior director of cybersecurity at Target, said that the company has been leveraging FIDO for workforce authentication since 2018 and adopted it as a primary authenticator in 2021.

One of the ways that Target has been able to more easily enable passkey support across its platforms is via Single Sign On (SSO). 

“We have a very robust SSO environment across our web application suite,” Sheffield said. “So for us, that made it very easy to integrate FIDO into the SSO platform, and then therefore every application behind SSO automatically got the benefit of it.”

In terms of how Target was able to get its users to adopt passkeys quickly, Sheffield said that the option was communicated to users in the login flow, rather than trying to explain to users what they should do in an email.

Overall, Sheffield emphasized that if an organization is using OTPs (one-time passwords) today for multi-factor authentication (MFA), any form of FIDO will provide significantly better user experience and security.

“There have not been many security programs that I’ve been part of in my 25-year career in this space that offer you security and user experience simultaneously,” he said. “So if you’re using anything other than FIDO you’ve got a great opportunity to up your game and provide a great experience for users which should make you a hero.”

Authenticating a Billion Customers with Passkeys at Amazon

Among the biggest consumer-facing websites that supports passkeys today is online giant Amazon.

Yash Patodia, senior manager of product management at Amazon, detailed how passkeys were rolled out to hundreds of millions of consumers worldwide. Patodia explained Amazon’s motivation noting that passwords are relatively easy for a bad actor to crack. He noted that passkeys help customers to authenticate more easily than other methods with a better user experience. 

Amazon implemented passkeys using different APIs for web, iOS, and Android platforms. Now available across devices, Amazon’s goal is to drive awareness and increase passkey adoption among its customer base over the next year. In his view, passkeys are well suited for mass adoption and early indications from Amazon’s user base are very encouraging.

“If you’re a consumer facing company who has a big customer base, definitely explore this option,” he said.

Considerations for FIDO and Passkeys in the US Government 

The U.S. Government is no stranger to the world of strong authentication, with many staffers already using PIV (Personal Identity Verification) smart card credentials. 

Teresa Wu from IDEMIA and Joe Scalone from Yubico, who both serve on the FIDO Alliance’s Government Deployment Working Group (GDWG), provided an overview of how passkeys can complement PIV credentials and support a zero trust security model. 

As government agencies work to implement phishing-resistant multi-factor authentication, passkeys are an option that could provide a more seamless user experience than one-time passwords or hardware tokens. 

“We are not here to replace PIV, we are here to supplement and use FIDO where PIV is not covered,” said Wu. 

One area they see opportunities for FIDO is for federal contractors and employees who are not eligible for a PIV card due to their job functions. Currently these individuals rely on passwords for system access.

State of Passkey Portability Set to Improve

A critical aspect of user experience is the ability to change passkey providers and move from one provider to another, if that’s what the user wants to do.

With existing password managers and legacy passwords, the process of moving credentials isn’t particularly efficient or secure, according to Rew Islam from Dashlane and Nick Steele from 1Password. It’s a situation that the Credential Provider Special Interest Group within the FIDO Alliance is looking to solve with a new standard for securely porting passwords between different password/passkey management applications.

The group is developing a new Credential Exchange Protocol that will use hybrid public key encryption to securely transfer credentials; the effort also includes the development of a standardized data format for credential information.

“By having the standard credential format, it will allow for interoperability of sharing credentials between two different providers in different organizations,” Steele said.
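
The Credential Exchange Protocol itself is still being drafted, but the hybrid public key encryption idea behind it can be sketched with generic primitives. The example below uses an X25519 key agreement plus an AEAD cipher purely to illustrate the pattern (encrypt the exported credential payload to the importing provider's public key); the field names and the payload are made up, and this is not the CEP wire format.

```python
import os
from cryptography.hazmat.primitives.asymmetric.x25519 import X25519PrivateKey
from cryptography.hazmat.primitives.kdf.hkdf import HKDF
from cryptography.hazmat.primitives import hashes
from cryptography.hazmat.primitives.ciphers.aead import ChaCha20Poly1305

# Illustrative hybrid encryption of a credential export, in the spirit of HPKE.
# This is a generic sketch, not the FIDO Credential Exchange Protocol format.

# The importing provider generates a key pair and shares the public key.
importer_private = X25519PrivateKey.generate()
importer_public = importer_private.public_key()

# The exporting provider derives a shared secret with an ephemeral key...
ephemeral_private = X25519PrivateKey.generate()
shared = ephemeral_private.exchange(importer_public)
key = HKDF(algorithm=hashes.SHA256(), length=32, salt=None,
           info=b"credential-export-demo").derive(shared)

# ...and encrypts the serialized credential payload (contents are placeholders).
nonce = os.urandom(12)
payload = b'{"rpId": "example.com", "userHandle": "alice", "secret": "placeholder"}'
ciphertext = ChaCha20Poly1305(key).encrypt(nonce, payload, None)

# The importer repeats the key agreement with the ephemeral public key and decrypts.
key2 = HKDF(algorithm=hashes.SHA256(), length=32, salt=None,
            info=b"credential-export-demo").derive(
    importer_private.exchange(ephemeral_private.public_key()))
assert ChaCha20Poly1305(key2).decrypt(nonce, ciphertext, None) == payload
print("credential payload transferred and decrypted by the importing provider")
```

The key design point, which the real protocol aims to standardize, is that credentials are never exposed in plaintext between providers: only the holder of the importer's private key can recover them.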

A proof of concept demo for the credential exchange is currently set for May, during the FIDO Member Plenary in Osaka, Japan. Islam noted that the effort represents a real triumph for the power of FIDO to bring different competitive vendors together for a common purpose.

Common Questions about Passkeys 

The virtual summit concluded with an ‘Ask Me Anything’ (AMA) session where attendees asked their most pressing questions on passkeys.

Among the big questions asked:

How should organizations consider choosing synced passkeys or device-bound passkeys from a security and usability perspective?

Turner answered that the first thing to make really clear is that synced passkeys are probably the right answer for the majority of use cases. That said, he noted that FIDO recognizes that there are some areas where people have a much higher risk profile, and in those cases device-bound passkeys can provide an extra level of trust.

Can passkeys play a role in transaction signing?

Pedro Martinez from Thales responded that yes, passkeys can be used to sign transactions. He explained that the beauty of the FIDO protocol is that it is based on the signature of a challenge. As such, it’s possible to adjust the challenge in order to contain data related to a transaction that needs to be digitally signed.
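
A rough sketch of that idea, stripped of all WebAuthn framing: bind the transaction details into the challenge that gets signed, then verify both the signature and the embedded transaction on the server. The key handling, challenge layout, and field names below are simplified assumptions for illustration; real passkey flows use the WebAuthn/CTAP message formats and keep the private key inside an authenticator.

```python
import hashlib
import json
import os
from cryptography.hazmat.primitives.asymmetric import ec
from cryptography.hazmat.primitives import hashes

# Simplified illustration of transaction confirmation via a signed challenge.
# A local P-256 key stands in for the authenticator that would normally hold the passkey.

authenticator_key = ec.generate_private_key(ec.SECP256R1())
server_known_public_key = authenticator_key.public_key()   # registered at enrollment

# Server side: fold the transaction details into the challenge to be signed.
transaction = {"payee": "ACME GmbH", "amount": "129.00", "currency": "EUR"}
challenge = os.urandom(32) + hashlib.sha256(
    json.dumps(transaction, sort_keys=True).encode()).digest()

# Authenticator side: the user confirms the transaction and the key signs the challenge.
signature = authenticator_key.sign(challenge, ec.ECDSA(hashes.SHA256()))

# Server side: verifying the signature also proves the user approved this exact
# transaction, because the transaction hash is part of the signed challenge.
server_known_public_key.verify(signature, challenge, ec.ECDSA(hashes.SHA256()))
print("transaction approved:", transaction)
```

If the transaction details were altered in transit, the recomputed hash would no longer match the signed challenge and verification would fail, which is the "what you see is what you sign" property Martinez was describing.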

When will passkeys be the default mode of authentication? 

Shikiar said that he doesn’t think that all passwords will go away, but he is hopeful for a passwordless future.

“Sophisticated risk engines and anomaly detectors don’t really think twice about accepting a password,” he said. “But as passkeys become more prevalent and become the default, all of a sudden using a password will be anomalous in and of itself. And I think that’s when we’ll be in the fabulous future when using a password is rightfully seen as a high-risk and anomalous action.”


ResofWorld

The tech executive who believes AI needs more voice-based learning systems

Bayo Adekanmbi explains why voice-based learning is the best way to increase AI adoption across Africa.
Bayo Adekanmbi is the founder and CEO of Data Scientists Network (DSNai), an organization that has trained over 500,000 people in artificial intelligence and data science.  This interview has been...

Monday, 25. March 2024

Identity At The Center - Podcast

It’s time for a public conversation about privacy on the latest episode of the Identity at the Center Podcast

It’s time for a public conversation about privacy on the latest episode of the Identity at the Center Podcast. We had an open conversation with Hannah Sutor, a Principal Product Manager at GitLab and IDPro Board Member, about privacy. We delved into the nuances of privacy as a human right, the expectations of privacy in our roles as employees and consumers, and much more.

Check out this episode at idacpodcast.com or in your favorite podcast app.

#iam #podcast #idac


ResofWorld

Uber drivers are melting amid Nigeria’s historic heat wave and record fuel prices

As temperatures soar and fuel prices triple, drivers are having to choose between their health and livelihood.
Each time a new booking pops up on his Uber app, Johnson Idahosa promptly switches on the air conditioner in his car. To get a high rating, he needs to...

China’s EV price war is killing brands and infuriating consumers

Fast depreciation in an already slow market has panicked car buyers, many of whom are venting their frustrations on social media.
Mandy Pan thought she couldn’t go wrong with her first major purchase out of university: a silver plug-in hybrid from China’s leading electric vehicle maker, BYD. The compact sedan, called...

Friday, 22. March 2024

World Identity Network

World Identity Network Releases “Shadows in the Dark” Documentary on Amazon

WASHINGTON, March 22, 2024 /PRNewswire/ — World Identity Network (WIN), the leading nonprofit organization advocating for universal identity rights, has released its groundbreaking documentary, Shadows in the Dark: Our Global Identity Crisis, exclusively on Amazon.

“Releasing this film to the public is a moment of great triumph for our organization,” says WIN Founder and CEO, Dr. Mariana Dahan. “We spent years interviewing undocumented persons and refugees. Telling their stories with the utmost care, precision, and nuance was a tremendous responsibility, and we could not be happier with the final result.”

Shadows in the Dark is a sprawling saga following the stories of undocumented individuals across the United States, the Middle East, and refugee camps in Europe and beyond. The documentary shines a light on those born in the shadows of the formal economy, at the margins of society, lacking common identity documents, such as birth certificates and passports.

The movie highlights the work that Dr. Mariana Dahan has conducted at The World Bank as the initiator and first global coordinator of the Identification for Development (ID4D) agenda, which celebrates its 10th anniversary this year. Shadows in the Dark offers a compelling analysis of the successes and the risks associated with this multi-billion-dollar program.

The Emmy Award-winning film crew interviewed decision-makers, technologists, and human rights activists advocating for universal identification and the responsible use of digital technologies, such as biometrics, facial recognition, and AI.

“Identity is at the heart of many of today’s global challenges,” says Shadows in the Dark Co-director, Brad Kremer. “It is the common thread in immigration and many of the conflict zones existing throughout the world. When Dr. Mariana Dahan approached me to do this film together, I knew it would be a journey of immense meaning. But directing this narration, and telling the stories of everyone this issue impacts, has exceeded all our expectations.”

Produced in partnership with the United Nations, the Human Rights Foundation and Singularity University, Shadows in the Dark features extensive interviews with displaced Ukrainian and Syrian refugees recounting their experiences with the asylum process, along with leading officials at the World Bank and the United Nations, and the founders building new digital identity solutions. The film likewise explores nuances surrounding surveillance, authoritarian regimes, and biometric systems, as well as a dialogue with a group of far-right border advocates in the United States.

“In many ways, this film is a culmination of my life’s work,” continues Dr. Dahan. “Having been born without a birth certificate in Soviet-era Moldova, at the border with Ukraine, I know firsthand how crucial identity is to the preservation of human rights. I encourage everyone to watch the film and learn more about this global issue impacting millions. Identity is the cornerstone of human civilization”.

To learn more about Shadows in the Dark, go to www.shadowsinthedark.movie

The post World Identity Network Releases “Shadows in the Dark” Documentary on Amazon appeared first on World Identity Network.


FIDO Alliance

Identity Week: HID’s 2024 report highlights mobile IDs, MFA, and sustainability in security trends

With over 83% of organisations currently using MFA, the shift away from password-dependency is clear. However, the report indicates a slower but growing implementation of Zero Trust architectures, currently in place in up to 16% of larger organisations. The development of standards like FIDO heralds a move toward more secure authentication options.


Neowin: Proton Pass gets passkey support for both free and paid users

Proton has announced passkey support in its Proton Pass password manager, which now offers enhanced security and usability for both free and paid users across all platforms.


Biometric Update: FIDO’s influence expands with new security key and board member

Cisco has further solidified its commitment to passkeys by joining the FIDO Alliance’s board of member representatives. Andrew Shikiar, executive director and CEO of the FIDO Alliance welcomes Cisco’s expanded involvement, noting their historical contributions through Duo Security and now as an official member.


Elastos Foundation

Bitcoin Layer 2 Evolution: Unveiling BeL2’s BTC Oracle with Elastos

The launch of BeL2’s BTC Oracle marks a critical juncture: a paradigm shift in how Bitcoin interacts with the broader ecosystem of decentralised applications (DApps) and Ethereum Virtual Machine (EVM) compatible blockchains.

Bitcoin, as the first cryptocurrency, has long been critiqued for its limitations in scalability and flexibility, particularly in the context of smart contracts and DApps. The introduction of BeL2 and its BTC Oracle addresses these critiques head-on by generating zero-knowledge proofs (ZKPs) to enable secure, private, and efficient communication between Bitcoin and EVM blockchains. This development is crucial because it expands Bitcoin’s utility beyond being a mere store of value to a foundational layer upon which complex decentralised applications can be built and managed directly.

The Core

The core of this innovation lies in BeL2’s BTC Oracle. The BTC Oracle generates ZKPs to feed real-time Bitcoin transaction data into EVM smart contracts without compromising the privacy or security of the transactions. This functionality is revolutionary, as it allows for the creation of Bitcoin-denominated smart contracts across any EVM-compatible blockchain, vastly expanding the potential use cases and applications for Bitcoin in the decentralised finance (DeFi) space.

BeL2, or Bitcoin Layer 2, further extends this capability by providing a framework for developing and managing Bitcoin-native smart contracts. It represents the culmination of efforts to integrate Bitcoin more deeply into the ecosystem of decentralised applications, enabling novel financial products and services such as BTC lending, algorithmic stablecoin issuance, and more.

The Mechanism

BeL2’s technology stack comprises a BTC Oracle that inputs Bitcoin-related data into EVM contracts, an upcoming ELA-powered relay network to decentralise and secure the data transmission, and the application layer where the actual development of Bitcoin-native smart contracts takes place.

This approach minimises reliance on intermediaries, reduces points of failure, and enhances the system’s overall resilience and efficiency. BeL2’s BTC Oracle is centred around enhancing Bitcoin’s utility and accessibility, involving innovative cryptographic techniques like ZKPs to deliver a comprehensive solution for Bitcoin and EVM blockchain interoperability.
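
BeL2's Oracle relies on zero-knowledge proofs, whose construction is out of scope here. As a loose stand-in for "verifiable Bitcoin transaction evidence", the sketch below shows a simpler SPV-style check: proving that a transaction id is included in a block's Merkle root using Bitcoin's double SHA-256. It illustrates the kind of fact an oracle relays to an EVM contract, not BeL2's actual proof system, and the two-leaf example data is made up.

```python
import hashlib

# Simplified stand-in for the kind of check an oracle performs before relaying
# Bitcoin transaction data: prove that a txid is included in a block's Merkle root.
# BeL2's BTC Oracle uses zero-knowledge proofs rather than raw SPV proofs.


def dsha256(data: bytes) -> bytes:
    """Bitcoin's double SHA-256."""
    return hashlib.sha256(hashlib.sha256(data).digest()).digest()


def verify_inclusion(txid: bytes, merkle_path: list[bytes], index: int, root: bytes) -> bool:
    """Walk the Merkle path from the txid up to the root.

    The bits of `index` say whether the running hash is a left (0) or right (1)
    child at each level of the tree.
    """
    node = txid
    for sibling in merkle_path:
        if index & 1:                       # node is the right child
            node = dsha256(sibling + node)
        else:                               # node is the left child
            node = dsha256(node + sibling)
        index >>= 1
    return node == root


if __name__ == "__main__":
    # Tiny two-leaf example: the root is the double hash of the two txids.
    tx_a, tx_b = dsha256(b"tx-a"), dsha256(b"tx-b")
    root = dsha256(tx_a + tx_b)
    print(verify_inclusion(tx_a, [tx_b], 0, root))   # True
    print(verify_inclusion(tx_b, [tx_a], 1, root))   # True
```

In BeL2 the analogous evidence is wrapped in a zero-knowledge proof, so the EVM contract can accept the conclusion without re-executing the verification over raw Bitcoin data or trusting a single relayer.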

The Impact

By enabling direct development on Bitcoin Layer 2, Elastos is not just augmenting Bitcoin’s functionality; it is redefining the possibilities of the blockchain space. The ability for any EVM blockchain to leverage Bitcoin in smart contracts opens up new avenues for innovation, potentially increasing the market for Bitcoin-based applications sevenfold.

This development aligns with the broader trend of seeking solutions that respect the foundational principles of blockchain technology—decentralisation, security, and user sovereignty—while pushing the boundaries of what’s possible. It embodies a non-consensus, forward-thinking approach that challenges conventional limitations and opens up new opportunities for the entire crypto ecosystem.

In conclusion, the launch of Elastos’ BTC Oracle and the BeL2 platform represents a significant milestone in the evolution of Bitcoin and blockchain technology. By addressing fundamental challenges of interoperability and functionality, it demonstrates that Bitcoin’s value lies not just in its scarcity and security but in its utility and integration into the decentralised web.

Try the BeL2 demo here!

The post Bitcoin Layer 2 Evolution: Unveiling BeL2’s BTC Oracle with Elastos appeared first on Elastos.


DIDAS

Parallel Signatures – a relevant input to the Technology Discussion

To enhance the Swiss e-ID framework with selective disclosure while ensuring unlinkability, it’s imperative to incorporate advanced digital signature technologies such as BBS+ signatures. These technologies not only fortify the security of digital credentials but also significantly enhance user privacy. Such capabilities are crucial in minimizing the risk of personal data exposure and ensuring that users retain control over their information. It’s essential to continuously align our Trust Infrastructure with international cryptographic standards while remaining adaptable to emerging norms. This approach will facilitate interoperability across borders and sectors, ensuring that e-ID systems are both secure and universally recognized.

The parallel signatures model involves attaching multiple digital signatures to a single document or payload, with each signature providing different security or privacy features. This approach allows for a flexible and robust security framework, accommodating various cryptographic standards and privacy needs without compromising the integrity of the original document. It’s particularly useful in environments requiring adherence to diverse regulatory standards, or in scenarios where resilience and both high security and privacy are paramount.

Cryptographic layering supports adaptiveness by incorporating multiple layers of cryptographic techniques within a system. This approach allows for the seamless integration and removal of cryptographic methods as needed by the Trust Ecosystem governance, enabling the system to adapt to evolving security threats and advancements in cryptographic research. It ensures long-term resilience and flexibility, allowing systems to maintain security without complete overhauls.

Applying cryptographic schemes always mandates careful handling of private keys. Preventing their exposure is vital, even more so when using advanced schemes supporting derivative keys, as is possible with BBS+. This underscores the need for strict security measures to prevent unauthorized access and ensure the system’s integrity.
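
The parallel signatures idea can be sketched in a few lines: one payload, several independent signatures, each verifiable on its own and removable without touching the others. The example below attaches an Ed25519 and an ECDSA P-256 signature as the two "parallel" schemes; in the e-ID context one slot could instead carry a BBS+ signature supporting selective disclosure. No BBS+ library is invoked here, and the envelope structure is purely an illustrative assumption, not a DIDAS or Swiss e-ID format.

```python
from cryptography.hazmat.primitives.asymmetric import ec
from cryptography.hazmat.primitives.asymmetric.ed25519 import Ed25519PrivateKey
from cryptography.hazmat.primitives import hashes

# Structural sketch of parallel signatures: the same payload carries several
# independent signatures, each produced by a different scheme. A BBS+ signature
# (for selective disclosure) would occupy another slot in the same list; it is
# only referenced here, not implemented.

payload = b'{"credential": "example e-ID attributes"}'

ed_key = Ed25519PrivateKey.generate()
ec_key = ec.generate_private_key(ec.SECP256R1())

envelope = {
    "payload": payload,
    "signatures": [
        {"scheme": "Ed25519", "sig": ed_key.sign(payload)},
        {"scheme": "ECDSA-P256", "sig": ec_key.sign(payload, ec.ECDSA(hashes.SHA256()))},
        # {"scheme": "BBS+", "sig": ...}  # selective-disclosure scheme, attached the same way
    ],
}

# A verifier checks only the signature(s) its governance framework requires;
# other signatures can be ignored or later removed without invalidating the rest.
ed_key.public_key().verify(envelope["signatures"][0]["sig"], payload)
ec_key.public_key().verify(envelope["signatures"][1]["sig"], payload,
                           ec.ECDSA(hashes.SHA256()))
print("both parallel signatures verify independently")
```

Because each signature stands alone, the Trust Ecosystem governance can phase schemes in or out (the cryptographic layering described above) without re-issuing the underlying payload.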

Public-Private Partnerships (PPPs) represent a proven strategic model to operationalize digital trust and -identity solutions, combining public oversight with private sector efficiency and innovation. Such partnerships should be structured to encourage shared investment and risk, with a clear focus on public interest, global standards and local governance, protection of digital sovereignty and value-based adoption. These initiatives should be complemented by ongoing research into cryptographic innovations, preparing the ground for future advancements in e-ID security and privacy.

To address the challenges comprehensively and to build a continuously improving framework that is not only secure and compliant but also resilient and forward-looking, we must evaluate investing in an independent body that accompanies further progress in technology and governance and supports public and private sector adoption – to benefit from the opportunities of a trusted digital economy in the long term.

Thank you DIDAS Technology Working Group and Manu Sporny of Digital Bazaar for the dialogue!


MyData

A Recorded Delivery Network… for Data

In the MyData Matters blog series, MyData members introduce innovative solutions and practical use cases that leverage personal data in line with MyData values. Since the 1980s, personal data has been managed in essentially the same way. Organisations aggregate customer information in vast data warehouses, with the assumption that more data is always better to […]

Thursday, 21. March 2024

Digital ID for Canadians

DIACC Women in Identity: Marli Lichtman


DIACC women in identity spotlights showcase outstanding DIACC member women in identity. If you are a DIACC member woman in identity and would like us to feature you in the spotlight, contact us!

Marli Lichtman is Managing Director and Head, Digital Strategy and Controls at BMO Financial Group (BMO).

Follow Marli on LinkedIn

What has your career journey looked like?

Let me work backwards, starting with my current role as Head of Digital Strategy and Controls at BMO. In this role, I lead two teams accountable for: (1) Strategy: defining and executing BMO’s “Digital First” agenda and (2) Controls: working in partnership with the Financial Crimes Unit to build and enhance digital controls to protect our customers against fraud.

I initially joined BMO’s Corporate Strategy Team in 2013 and since then have worked in progressively senior roles across Finance, Risk, Transformation and Business Operations.

Before joining BMO, I was a consultant in Oliver Wyman’s Finance and Risk Practice and prior to that, I worked in wealth management and earned my CFA (Chartered Financial Analyst) designation. My first job out of school was at a boutique investment advisory firm. I graduated from Ivey at Western University with an Honours Business Administration (HBA) degree.

When you were 20 years old, did you visualize a dream job and if so, why?

I didn’t really know what I wanted to do when I was 20! I focused the early days of my career on finding opportunities where I could be challenged, learn as much as possible, maintain optionality to transition to other industries or career paths, and work with great people who would champion my career.

Have you encountered significant barriers in your career as a woman in leadership, and if so, what were they?

I have experienced many of the usual challenges you hear about concerning women in the workplace. However, my biggest barrier has been getting into my own head and thinking that I don’t deserve the positions I’ve been given (I mean, earned 😊). Through executive coaching, mentors, sponsors, and simply the experience of failing and rebounding, I’ve been able to overcome this (although I would be lying if I said I don’t experience imposter syndrome from time to time!).

How do you balance work and life responsibilities?

It’s a constant juggling act, but I try to focus on 5 things:

1. Regular calendar reviews to “optimize” my time (e.g., which calls can I take from the car on my way to / from the office?)
2. Learning to say “no” and setting clear boundaries (applies to both work and personal life).
3. Finding time for self-care.
4. Working as a team with my partner who is also balancing a demanding schedule.
5. Living my values and knowing what’s important in life.

How can more women be encouraged to pursue digital trust and identity careers?

We need to start with education – What is Digital ID? What skillsets do you need to enter the space? Why is diversity so important? Who are female trailblazers in the space, and what has their career path looked like? Early exposure, encouragement, and mentorship are key to increasing female representation in this space.

What are some strategies you have learned to help women achieve a more prominent role in their organizations?

Build meaningful relationships. Earn the trust of your colleagues. Network within and outside of your industry. Ensure you have a mentor and a sponsor at your organization. Most importantly, stay true to yourself.

What will be the biggest challenge for the generation of women behind you?

While women have made considerable progress over the past decade, there is still more work to do. The next generation will continue to face the same challenges (e.g., gender bias, pay inequality, balancing personal life) but will benefit from increased female representation and sponsorship at Senior levels.

What advice would you give to young women entering the field?

Be confident – you are in the field for a reason! Trust your instincts, and don’t be too hard on yourself.


Ceramic Network

Toward the first decentralized points system: Oamo becomes the first points provider on Ceramic


We're thrilled to announce that Oamo is partnering with Ceramic as a data provider on the platform. Oamo will issue tens of millions of publicly available credentials based on wallets’ on-chain activity and holdings. This is the first step in a broader initiative to develop and standardize the first decentralized point system, powered by Ceramic and Oamo’s credential models.

Oamo has been a big supporter of the Ceramic ecosystem from day one. By harnessing Ceramic's innovative DID (Decentralized Identifier) infrastructure and ComposeDB for zero-party data storage, they’re setting the foundation for a future where user data is private by design, perishable at will, and accessible only with explicit permission. Oamo and Ceramic are crafting a path toward a consensual and rewarding digital ecosystem.

The partnership so far

Since launching on Ceramic in Q3 2023, Oamo has witnessed remarkable results – over 65,000 Oamo Profiles have been created, with more than 200,000 Ceramic documents generated. Additionally, Oamo has distributed over 400,000 credentials spanning Web2 and on-chain behaviors, enriching the digital identity and access privileges of Oamo Profile users across various platforms.

Oamo credentials cover:

On-chain activity across DeFi, NFTs, staking and gaming;
Wallet holdings including major ERC-20s and NFT collections; and
Social activity across Web2 platforms like Discord, Youtube and X.

Supercharging the Ceramic ecosystem

With this partnership and announcement, Oamo aims to enhance digital identity and engagement through:

Credential Distribution
Oamo has indexed millions of EVM wallets’ behaviors and holdings, and will be distributing tens of millions of publicly available credentials to enrich user identities across platforms, ensuring the security and verification of online activities. Credentials issued will be maintained and updated monthly to include time decay and ensure they always represent the latest behaviors of the indexed wallets. Feedback from the community is welcome to develop new credentials that track the most relevant on-chain behaviors for builders in the ecosystem. These credentials can then be used to:

Compile specific wallet lists for airdrops.
Establish reputation frameworks based on behavioral data points.
Launch strategic user acquisition campaigns by identifying wallets in a specific target audience and contacting them via XMTP, for example.

Decentralized Point System
Oamo will leverage its credential models to develop the first standardized decentralized point system on Ceramic, with each indexed wallet receiving its own scorecard based on its on-chain activity and holdings. Builders in the ecosystem will be able to leverage these scorecards and customize their own points system with their own credentials and Oamo’s.

Credential & Points Management SDK
Oamo will release an SDK to allow any builder to search and leverage Oamo’s credentials and points system easily. This middleware will also allow builders to issue their own credentials and points based on their own models and app activity.

What’s in it for users

Anyone creating their Decentralized Identifier (DID) on the Ceramic Network (by creating an Oamo Profile, for instance) will be able to claim their credentials and scorecards seamlessly. This open and inclusive approach democratizes access to digital credentials, ensuring users from all backgrounds and levels of onchain experience can benefit from Ceramic’s ecosystem of builders.

What’s in it for developers

Oamo's vision includes diverse use cases, transforming how developers interact with consumers. The Oamo platform offers endless opportunities for various types of protocols and apps:

DeFi Protocols
Easily find wallets matching their target audience, such as active liquidity providers on leading AMMs or active traders on DEXes across major EVM chains.

NFT Projects
Identify potential collectors based on their NFT holdings and distribute collections to the right user base.

Wallet Providers
Identify and reach whales holding specific token amounts across multiple chains.

Liquid Staking Projects
Identify wallets holding significant ETH amounts and generating yield via lending protocols as high-value acquisition targets.

Game Developers
Find gamers in Web3 that hold specific NFTs or have engaged with similar on-chain games.

While the Oamo app provides a hub for user acquisition and relationship development, this publicly available tooling and data will allow anyone to craft their own strategies.

Builders on the Ceramic Network will have the capability to query, consume, and customize issued credentials and points to power new data-rich use cases, such as targeted airdrops, credential-gated experiences, loyalty programs, and more. To streamline integrations, Oamo will be launching an SDK, making it easier for developers to incorporate these capabilities into their own projects.

Join the Ceramic Discord and Oamo’s Telegram channel for builders to contribute or be notified about updates and releases.

About Ceramic

Ceramic is a decentralized data network for managing verifiable data at scale, combining the trust and composability of a blockchain with the flexibility of an event-driven architecture to help organizations get more value from their data. Thousands of developers use it to manage reputation data, store attestations, log user activity, and build novel data infrastructure. Ceramic frees entrepreneurs from the constraints of traditional siloed infrastructure, letting them tap into a vibrant data ecosystem to bring their unique vision to life faster.

About Oamo

Oamo allows consumers to discover and match with their favorite brands based on their online activity. Brands can define their ideal user persona based on their online behaviors, optionally incentivize data sharing via token rewards, and design personalized conversion and retention campaigns to acquire power users. Zero-party data guarantees an optimal match between interested consumers and brands through rich behavioral alignment, leading to higher conversion rates and LTV.


Origin Trail

Decentralized RAG with OriginTrail DKG and NVIDIA Build ecosystem

Introduction

Generative Artificial Intelligence (AI) is already seeing meaningful adoption across multiple fields; however, some of its limitations significantly hold back mainstream adoption and the improvements it could deliver across those fields. For GenAI to be production-ready at such a scale of impact, we need to limit hallucinations, manage bias, and reject intellectual property (or data ownership) infringements. The promise of the Verifiable Internet for AI is to address these shortfalls by providing information provenance in model outputs, ensuring verifiability of presented information, respecting data ownership, and incentivizing new knowledge creation.

Below we’re showcasing an implementation framework called Decentralized Retrieval-Augmented Generation (dRAG) on the NVIDIA Build ecosystem, which offers a wide range of powerful models across industries and model types. dRAG advances the Retrieval-Augmented Generation (RAG) framework proposed by Patrick Lewis, which aims to increase the accuracy and reliability of GenAI models with facts fetched from external sources. The RAG framework has gained prominence among both AI developers and leaders of major tech companies, such as NVIDIA’s CEO Jensen Huang.

The dRAG advances the RAG system by leveraging the Decentralized Knowledge Graph (DKG), a permissionless network of Knowledge Assets. Each Knowledge Asset contains Graph data and/or Vector embeddings, immutability proofs, a Decentralized Identifier (DID), and the ownership NFT. When connected in one permission-less DKG, the following capabilities are enabled:

Knowledge Graphs — structural knowledge in knowledge graphs allows a hybrid of neural and symbolic AI methodologies, enhancing the GenAI models with deterministic inputs.
Ownership — dRAG uses input from Knowledge Assets that have an owner that can manage access to the data contained in the Knowledge Asset.
Verifiability — every piece of knowledge on the DKG has cryptographic proofs published, ensuring that no tampering has occurred since it was published.
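For orientation only, the building blocks listed above can be sketched as a plain Python dictionary; this is not an object returned by the dkg.py client, and the hash, UAL, and wallet address are simply the example values that appear in the outputs later in this tutorial:

# Conceptual anatomy of a Knowledge Asset (illustrative only, not a dkg.py data structure).
knowledge_asset = {
    "assertion": {"@type": "Product", "name": "ai-weather-forecasting"},  # graph data (JSON-LD) and/or vector embeddings
    "assertionId": "0x09d8d7c5b82bd09bc3f51770f575e15f1157c6292652d977afbe453932e270ef",  # immutability proof anchored on chain
    "ual": "did:dkg:otp:20430/0x1a061136ed9f5ed69395f18961a0a535ef4b3e5f/2620264",  # Decentralized Identifier (UAL)
    "owner": "0xD988B6fd921CFab980a7f2F60B9aC9F7918D7F71",  # wallet holding the ownership NFT
}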

In this tutorial, you will learn how to query the OriginTrail DKG and retrieve verified Knowledge Assets on the DKG.

Prerequisites

An NVIDIA Build platform account and API key.
A DKG node. Please visit the official docs to learn how to set one up.
A Python project with a virtual environment set up.

Step 1 — Installing packages and setting up dkg.py

In this step, you’ll install the necessary packages using pip and set up the credentials for dkg.py.

Navigate to your Python project’s environment and run the following command to install the packages:

pip install openai dkg python-dotenv annoy

The OpenAI client is going to act as an intermediary for interacting with the NVIDIA API. You’ll store the environment variables in a file called .env. Create and open it for editing in your favorite editor:

nano .env

Add the following lines:

OT_NODE_HOSTNAME="your_ot_node_hostname"
PRIVATE_KEY="your_private_key"
NVIDIA_API_TOKEN="your_nvidia_api_token"

Replace the values with your own: the node hostname can be found in the configuration file of your OT Node, and the private key belongs to the wallet you’ll use to perform the Knowledge Asset create operation, which needs to be funded with TRAC tokens (more information is available in the OriginTrail documentation). Keep in mind that this information, especially your wallet’s private key, must be kept private. When you’re done, save and close the file.

Then, create a Python file where you’ll store the code for connecting to the DKG:

nano dkg_version.py

Add the following code to the file:

from dkg import DKG
from dkg.providers import BlockchainProvider, NodeHTTPProvider
from dotenv import load_dotenv
import os
import json

dotenv_path = './.env' # Replace with the path to your .env file
load_dotenv(dotenv_path)
ot_node_hostname = os.getenv('OT_NODE_HOSTNAME')
private_key = os.getenv('PRIVATE_KEY')

node_provider = NodeHTTPProvider(ot_node_hostname)
blockchain_provider = BlockchainProvider("testnet", "otp:20430", private_key=private_key)

dkg = DKG(node_provider, blockchain_provider)
print(dkg.node.info)

Here, you first import the required classes and packages. Then, you load the values from .env and instantiate a NodeHTTPProvider and BlockchainProvider with those values, which you pass in to the DKG constructor, creating the dkg object for communicating with the graph.

If all credentials and values are correct, the output will show you the version that your OT Node is running on:

{'version': '6.2.3'}

That’s all you have to do to be connected to the DKG!

Step 2 — Instructing the LLM to create Knowledge Assets on the DKG

In this step, you’ll connect to the NVIDIA API using the OpenAI Python library. Then, you’ll instruct the model to generate a Knowledge Asset and publish it to the DKG.

First, you need to initialize the OpenAI client, passing in the NVIDIA API endpoint as the base_url along with your API key. The OpenAI client acts as an intermediary to the NVIDIA API here and can call multiple LLMs, such as Google’s Gemma and Meta’s Llama, which are used in this tutorial.

from openai import OpenAI

client = OpenAI(
    base_url="https://integrate.api.nvidia.com/v1",
    api_key=os.getenv('NVIDIA_API_TOKEN')
)

Then, you define the instructions, telling the model what to do:

instruction_message = '''
Your task is the following:

Construct a JSON object following the Product JSON-LD schema based on the provided information by the user.
The user will provide the name, description, tags, category and deployer of the product, as well as the URL which you will use as the '@id'.

Here's an example of an Product that corresponds to the mentioned JSON-LD schema.:
{
"@context": "http://schema.org",
"@type": "Product",
"@id": "https://build.nvidia.com/nvidia/ai-weather-forecasting",
"name": "ai-weather-forecasting",
"description": "AI-based weather prediction pipeline with global models and downscaling models.",
"tags": [
"ai weather prediction",
"climate science"
],
"category": "Industrial",
"deployer": "nvidia"
}

Follow the provided JSON-LD schema, using the provided properties and DO NOT add or remove any one of them.
Output the JSON as a string, between ```json and ```.
'''

chat_history = [{"role":"system","content":instruction_message}]

As part of the instructions, you provide the model with an example Product definition, according to which a new one should be generated. We want to create a Knowledge Asset which will represent the ‘rerank-qa-mistral-4b’ model from the NVIDIA Build platform. You add the contents of that message to chat_history with a system role, meaning that it instructs the model before the user comes in with actionable prompts.

Then, you define an example user_instruction for testing the model:

user_instruction = '''I want to create a product (model) with name 'rerank-qa-mistral-4b', which is a GPU-accelerated model optimized for providing a probability score
that a given passage contains the information to answer a question. It's in category Retrieval and deployed by nvidia.
It's used for ranking and retrieval augmented generation. You can reach it at https://build.nvidia.com/nvidia/rerank-qa-mistral-4b. Give me the schema JSON LD object.'''

This user prompt wants the LLM to output a Product with the given name and gives information as to where that model can be found.

Finally, you can ask the LLM to compute the output and print it:

completion = client.chat.completions.create(
    model="google/gemma-7b",
    messages=chat_history + [{"role": "user", "content": user_instruction}],
    temperature=0,
    top_p=1,
    max_tokens=1024,
)

generated_json = completion.choices[0].message.content
print(generated_json)

The output will look like this:

```json
{
"@context": "http://schema.org",
"@type": "Product",
"@id": "https://build.nvidia.com/nvidia/rerank-qa-mistral-4b",
"name": "rerank-qa-mistral-4b",
"description": "GPU-accelerated model optimized for providing a probability score that a given passage contains the information to answer a question.",
"tags": [
"rerank-qa-mistral-4b",
"information retrieval",
"retrieval augmentation"
],
"category": "Retrieval",
"deployer": "nvidia"
}
```

The LLM has returned a JSON-LD structure that can be added to the DKG.

def clean_json_string(input_string):
    # Strip the Markdown code fences that the LLM wraps around its JSON output.
    if input_string.startswith("```json") and input_string.endswith("```"):
        cleaned_query = input_string[7:-3].strip()
        return cleaned_query
    elif input_string.startswith("```") and input_string.endswith("```"):
        cleaned_query = input_string[3:-3].strip()
        return cleaned_query
    else:
        return input_string

product = json.loads(clean_json_string(generated_json))

content = {"public": product}
create_asset_result = dkg.asset.create(content, 2)
print('Asset created!')
print(json.dumps(create_asset_result, indent=4))
print(create_asset_result["UAL"])

Here you first define a function (clean_json_string) that will clean up the JSON string and remove the Markdown code markup. Then, you load the product by deserializing the JSON and add it to the DKG by calling dkg.asset.create().

The output will look like this:

Asset created!
{
"publicAssertionId": "0x09d8d7c5b82bd09bc3f51770f575e15f1157c6292652d977afbe453932e270ef",
"operation": {
"mintKnowledgeAsset": {
"transactionHash": "0x6fb8a6039f97cf3c0d8cb8b1a221be405d4a7cbdeab7f27240ae9848322cad98",
"transactionIndex": 0,
"blockHash": "0x7f73351931b89c4c75d3bf66206df69fc55cb3a0f6e126545d82cfe3e81c0d6a",
"from": "0xD988B6fd921CFab980a7f2F60B9aC9F7918D7F71",
"to": "0xB25D47412721f681f1EaffD1b67ff0638C06f2B7",
"blockNumber": 3674556,
"cumulativeGasUsed": 397582,
"gasUsed": 397582,
"contractAddress": null,
"logs": [
{
"address": "0x1A061136Ed9f5eD69395f18961a0a535EF4B3E5f",
"topics": [
"0xddf252ad1be2c89b69c2b068fc378daa952ba7f163c4a11628f55a4df523b3ef",
"0x0000000000000000000000000000000000000000000000000000000000000000",
"0x000000000000000000000000d988b6fd921cfab980a7f2f60b9ac9f7918d7f71",
"0x000000000000000000000000000000000000000000000000000000000027fb68"
],
"data": "0x",
"blockHash": "0x7f73351931b89c4c75d3bf66206df69fc55cb3a0f6e126545d82cfe3e81c0d6a",
"blockNumber": 3674556,
"transactionHash": "0x6fb8a6039f97cf3c0d8cb8b1a221be405d4a7cbdeab7f27240ae9848322cad98",
"transactionIndex": 0,
"logIndex": 0,
"transactionLogIndex": "0x0",
"removed": false
},
{
"address": "0xf305D2d97C7201Cea2A54A2B074baC2EdfCE7E45",
"topics": [
"0x6228bc6c1a8f028a2e3476a455a34f5fa23b4387611f3c147a965e375ebd17ba",
"0x09d8d7c5b82bd09bc3f51770f575e15f1157c6292652d977afbe453932e270ef"
],
"data": "0x00000000000000000000000000000000000000000000000000000000000003e700000000000000000000000000000000000000000000000000000000000000080000000000000000000000000000000000000000000000000000000000000008",
"blockHash": "0x7f73351931b89c4c75d3bf66206df69fc55cb3a0f6e126545d82cfe3e81c0d6a",
"blockNumber": 3674556,
"transactionHash": "0x6fb8a6039f97cf3c0d8cb8b1a221be405d4a7cbdeab7f27240ae9848322cad98",
"transactionIndex": 0,
"logIndex": 1,
"transactionLogIndex": "0x1",
"removed": false
},
{
"address": "0xFfFFFFff00000000000000000000000000000001",
"topics": [
"0xddf252ad1be2c89b69c2b068fc378daa952ba7f163c4a11628f55a4df523b3ef",
"0x000000000000000000000000d988b6fd921cfab980a7f2f60b9ac9f7918d7f71",
"0x000000000000000000000000f43b6a63f3f6479c8f972d95858a1684d5f129f5"
],
"data": "0x0000000000000000000000000000000000000000000000000000000000000006",
"blockHash": "0x7f73351931b89c4c75d3bf66206df69fc55cb3a0f6e126545d82cfe3e81c0d6a",
"blockNumber": 3674556,
"transactionHash": "0x6fb8a6039f97cf3c0d8cb8b1a221be405d4a7cbdeab7f27240ae9848322cad98",
"transactionIndex": 0,
"logIndex": 2,
"transactionLogIndex": "0x2",
"removed": false
},
{
"address": "0x082AC991000F6e8aF99679f5A2F46cB2Be4E101B",
"topics": [
"0x4b81188c3c973dd634ec0dae5b7e72f92bb03834c830739d63935923950d6f64",
"0x0000000000000000000000001a061136ed9f5ed69395f18961a0a535ef4b3e5f",
"0x000000000000000000000000000000000000000000000000000000000027fb68"
],
"data": "0x00000000000000000000000000000000000000000000000000000000000000c000000000000000000000000000000000000000000000000000000000000000010000000000000000000000000000000000000000000000000000000065fc48a00000000000000000000000000000000000000000000000000000000000000002000000000000000000000000000000000000000000000000000000000076a700000000000000000000000000000000000000000000000000000000000000000600000000000000000000000000000000000000000000000000000000000000341a061136ed9f5ed69395f18961a0a535ef4b3e5f09d8d7c5b82bd09bc3f51770f575e15f1157c6292652d977afbe453932e270ef000000000000000000000000",
"blockHash": "0x7f73351931b89c4c75d3bf66206df69fc55cb3a0f6e126545d82cfe3e81c0d6a",
"blockNumber": 3674556,
"transactionHash": "0x6fb8a6039f97cf3c0d8cb8b1a221be405d4a7cbdeab7f27240ae9848322cad98",
"transactionIndex": 0,
"logIndex": 3,
"transactionLogIndex": "0x3",
"removed": false
},
{
"address": "0xB25D47412721f681f1EaffD1b67ff0638C06f2B7",
"topics": [
"0x60e45db7c8cb9f55f92f3de18053b0b426eb919a763a1daca0ea9ad20961e878",
"0x0000000000000000000000001a061136ed9f5ed69395f18961a0a535ef4b3e5f",
"0x000000000000000000000000000000000000000000000000000000000027fb68",
"0x09d8d7c5b82bd09bc3f51770f575e15f1157c6292652d977afbe453932e270ef"
],
"data": "0x",
"blockHash": "0x7f73351931b89c4c75d3bf66206df69fc55cb3a0f6e126545d82cfe3e81c0d6a",
"blockNumber": 3674556,
"transactionHash": "0x6fb8a6039f97cf3c0d8cb8b1a221be405d4a7cbdeab7f27240ae9848322cad98",
"transactionIndex": 0,
"logIndex": 4,
"transactionLogIndex": "0x4",
"removed": false
}
],
"logsBloom": "0x00000100400000000000800000000000000000000000000000000000000000000000010020000000000000000000000000000000000010800000000000001000000040000000400040000008002400000080000000004000000000000000000000040000020000000000000000000a00000000008000020000000010000210015000000000000000000080000000001000000000000000000000000200000000040000001020002002000000000000000000000000000000000000000000000000000002000000000000000000008004000000000000010000000000000020000000000000002800000000000000000000000000000000100000000000010000",
"status": 1,
"effectiveGasPrice": 40,
"type": 0
},
"publish": {
"operationId": "1bb622c7-8fa1-4414-b39e-0aaf3f5465f9",
"status": "COMPLETED"
}
},
"UAL": "did:dkg:otp:20430/0x1a061136ed9f5ed69395f18961a0a535ef4b3e5f/2620264"
}

Here we can see a lot of useful information, such as the Knowledge Asset issuer, transaction IDs from the blockchain, and the status of the operation, which was completed. The UAL returned is the Uniform Asset Locator, a decentralized identifier connected to each Knowledge Asset on the DKG.
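For illustration only (this helper is not part of dkg.py), a UAL of the form shown above can be split into its blockchain, contract address, and token ID components with a few lines of Python:

# Illustrative helper: split a UAL of the form did:dkg:<blockchain>/<contract>/<token_id>.
def parse_ual(ual):
    prefix, contract_address, token_id = ual.split("/")
    blockchain = prefix.replace("did:dkg:", "")  # e.g. "otp:20430"
    return {"blockchain": blockchain, "contract": contract_address, "token_id": int(token_id)}

print(parse_ual("did:dkg:otp:20430/0x1a061136ed9f5ed69395f18961a0a535ef4b3e5f/2620264"))
# {'blockchain': 'otp:20430', 'contract': '0x1a061136ed9f5ed69395f18961a0a535ef4b3e5f', 'token_id': 2620264}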

Then, you can retrieve the same product from the DKG by passing the UAL to dkg.asset.get():

get_asset_result = dkg.asset.get(create_asset_result["UAL"])
print(json.dumps(get_asset_result, indent=4))

The output will be:

did:dkg:otp:20430/0x1a061136ed9f5ed69395f18961a0a535ef4b3e5f/2620264
{
"operation": {
"publicGet": {
"operationId": "c138515a-d82c-45a8-bef9-82c7edf2ef6b",
"status": "COMPLETED"
}
},
"public": {
"assertion": "<https://build.nvidia.com/nvidia/rerank-qa-mistral-4b> <http://schema.org/category> \"Retrieval\" .\n<https://build.nvidia.com/nvidia/rerank-qa-mistral-4b> <http://schema.org/deployer> \"nvidia\" .\n<https://build.nvidia.com/nvidia/rerank-qa-mistral-4b> <http://schema.org/description> \"GPU-accelerated model optimized for providing a probability score that a given passage contains the information to answer a question.\" .\n<https://build.nvidia.com/nvidia/rerank-qa-mistral-4b> <http://schema.org/name> \"rerank-qa-mistral-4b\" .\n<https://build.nvidia.com/nvidia/rerank-qa-mistral-4b> <http://schema.org/tags> \"information retrieval\" .\n<https://build.nvidia.com/nvidia/rerank-qa-mistral-4b> <http://schema.org/tags> \"rerank-qa-mistral-4b\" .\n<https://build.nvidia.com/nvidia/rerank-qa-mistral-4b> <http://schema.org/tags> \"text retrieval\" .\n<https://build.nvidia.com/nvidia/rerank-qa-mistral-4b> <http://www.w3.org/1999/02/22-rdf-syntax-ns#type> <http://schema.org/Product> .",
"assertionId": "0x09d8d7c5b82bd09bc3f51770f575e15f1157c6292652d977afbe453932e270ef"
}
}

In this step, you’ve seen how to instruct the NVIDIA LLM to generate Product entities according to user prompts, and how to insert them into the DKG. You’ll now learn how to generate SPARQL queries for products using the LLM.

Step 3 — Generating SPARQL with the AI model

In this step, you’ll use the NVIDIA LLM to generate a SPARQL query for retrieving results from the DKG. The data that we’ll be querying consists of Knowledge Assets that represent each of the models from the NVIDIA Build platform — with the same properties as the one created in Step 2.

SPARQL is a query language for graphs and is very similar to SQL. Just like SQL, it has a SELECT and a WHERE clause, so as long as you’re familiar with SQL you should be able to understand the structure of the queries pretty well.

The data that you’ll be querying is related to Products, stored in the DKG as Knowledge Assets.

Similarly to before, you’ll need to instruct the LLM on what to do:

all_categories = ["Biology", "Gaming", "Visual Design", "Industrial", "Reasoning", "Retrieval", "Speech"];
all_tags = ["3d-generation", "automatic speech recognition", "chat", "digital humans", "docking", "drug discovery", "embeddings", "gaming", "healthcare", "image generation", "image modification", "image understanding", "language generation", "molecule generation", "nvidia nim", "protein folding", "ranking", "retrieval augmented generation", "route optimization", "text-to-3d", "advanced reasoning", "ai weather prediction", "climate science"];

instruction_message = '''
You have access to data connected to the new NVIDIA Build platform and the products available there.
You have a schema in JSON-LD format that outlines the structure and relationships of the data you are dealing with.
Based on this schema, you need to construct a SPARQL query to retrieve specific information from the NVIDIA products dataset that follows this schema.

The schema is focused on AI products and includes various properties such as name, description, category, deployer, URL and tags related to the product.
My goal with the SPARQL queries is to retrieve data from the graph about the products, based on the natural language question that the user posed.

Here's an example of a query to find products from category "AI Weather Prediction":
```sparql
PREFIX schema: <http://schema.org/>

SELECT ?product ?name ?description ?ual

WHERE { ?product a schema:Product ;
GRAPH ?g
{ ?product schema:tags "ai weather prediction" ; schema:name ?name ; schema:description ?description }

?ual schema:assertion ?g
FILTER(CONTAINS(str(?ual), "20430")) }```

Pay attention to retrieving the UAL, this is a mandatory step of all your queries. After getting the product with '?product a schema:Product ;' you should wrap the next conditions around GRAPH ?g { }, and later use the graph retrieved (g) to get the UAL like in the example above.

Make sure you ALWAYS retrieve the UAL no matter what the user asks for and filter whether it contains "2043".
Make sure you always retrieve the NAME and the DESCRIPTION of the products.

Only return the SPARQL query wrapped in ```sparql ``` and DO NOT return anything extra.
'''


The instruction_message prompt contains the instructions in natural language. You provide the model with a schema of a Product object (in JSON-LD notation) and an example SPARQL query in the appropriate format for the DKG. You also order it to pay attention to the examples and to return nothing else except the SPARQL query.

You can now define the chat history and pass in a user prompt to get the resulting code:

limitations_instruction = '''\nThe existing categories are: {}. The existing tags are: {}'''.format(all_categories, all_tags)
user_instruction = '''Give me all NVIDIA tools which I can use for use cases related to biology.'''

chat_history = [{"role":"system","content":instruction_message + limitations_instruction}, {"role":"user","content":user_instruction}]

completion = client.chat.completions.create(
    model="meta/llama2-70b",  # NVIDIA lets you choose any LLM from the platform
    messages=chat_history,
    temperature=0,
    top_p=1,
    max_tokens=1024,
)

answer = completion.choices[0].message.content
print(answer)

The output will look similar to this:

```sparql
PREFIX schema: <http://schema.org/>

SELECT ?product ?name ?description

WHERE { ?product a schema:Product ;
GRAPH ?g
{ ?product schema:category "Biology" ;
?product schema:name ?name ;
?product schema:description ?description }

?ual schema:assertion ?g
FILTER(CONTAINS(str(?ual), "20430")) }
```

This SPARQL query retrieves all products that have the category "Biology" and returns their names and descriptions. The `GRAPH ?g` clause is used to retrieve the graph that contains the product information, and the `FILTER` clause is used to filter the results to only include products that have a UAL that contains "20430".

You can employ a similar strategy to clean the result from the Markdown code formatting:

def clean_sparql_query(input_string):
    start_index = input_string.find("```sparql")
    end_index = input_string.find("```", start_index + 1)
    if start_index != -1 and end_index != -1:
        cleaned_query = input_string[start_index + 9:end_index].strip()
        return cleaned_query
    else:
        return input_string

query = clean_sparql_query(answer)
print(query)

The output will now be clean SPARQL:

PREFIX schema: <http://schema.org/>

SELECT ?product ?name ?description

WHERE { ?product a schema:Product ;
GRAPH ?g
{ ?product schema:category "Biology" ;
?product schema:name ?name ;
?product schema:description ?description }

?ual schema:assertion ?g
FILTER(CONTAINS(str(?ual), "20430")) }
Step 4 — Querying the OriginTrail DKG

Querying the DKG is very easy with SPARQL. You only need to specify the query and the repository to search:

query_result = dkg.graph.query(query, "privateCurrent")
print(query_result)

The privateCurrent option ensures that the SPARQL query retrieves the latest state of Knowledge Assets in the DKG, as it includes the private and public data of the latest finalized state of the Graph.

An example result for the above query looks like this:

[
    {
        'product': 'https://build.nvidia.com/nvidia/molmim-generate',
        'description': '"MolMIM performs controlled generation, finding molecules with the right properties."',
        'name': '"molmim-generate"',
        'ual': 'did:dkg:otp:20430/0x1a061136ed9f5ed69395f18961a0a535ef4b3e5f/2619549'
    },
    {
        'product': 'https://build.nvidia.com/meta/esmfold',
        'description': '"Predicts the 3D structure of a protein from its amino acid sequence."',
        'name': '"esmfold"',
        'ual': 'did:dkg:otp:20430/0x1a061136ed9f5ed69395f18961a0a535ef4b3e5f/2619597'
    },
    {
        'product': 'https://build.nvidia.com/mit/diffdock',
        'description': '"Predicts the 3D structure of how a molecule interacts with a protein."',
        'name': '"diffdock"',
        'ual': 'did:dkg:otp:20430/0x1a061136ed9f5ed69395f18961a0a535ef4b3e5f/2619643'
    }
]

You’ll now be able to use the DKG to reduce the runtime cost of the LLM, as well as have it rely on trusted data stored in Knowledge Assets.

Step 5 — Vector search with NVIDIA embed-qa-4 model and the DKG

In this step, you’ll build an in-memory vector DB based on the verified data queried from the DKG and invoke the NVIDIA model with it to generate more accurate results for the end user. Sometimes, using SPARQL queries may not be enough to answer a question, and you can use a vector database to extract specific Knowledge Assets by semantic similarity.

First, you initialize the NVIDIA embed-qa-4 model that you’ll use to generate the vector embeddings:

import requests

invoke_url = "https://ai.api.nvidia.com/v1/retrieval/nvidia/embeddings"

headers = {
    "Authorization": f"Bearer {os.getenv('NVIDIA_API_TOKEN')}",
    "Accept": "application/json",
}

def get_embeddings(input):
    payload = {
        "input": input,
        "input_type": "query",
        "model": "NV-Embed-QA"
    }

    session = requests.Session()

    response = session.post(invoke_url, headers=headers, json=payload)

    response.raise_for_status()
    response_body = response.json()
    return response_body["data"][0]["embedding"]
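As a quick sanity check (illustrative only; the exact dimensionality depends on the NVIDIA embedding model), you can call the helper directly and inspect the returned vector:

# The embedding is a plain list of floats; its length is the model's embedding dimension.
vector = get_embeddings(["Predicts the 3D structure of a protein from its amino acid sequence."])
print(len(vector), vector[:5])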

Then, you build the vector DB in-memory by making embeddings based on the Product description:

from annoy import AnnoyIndex

def build_embeddings_index(embeddings, n_trees=10):
    dim = len(embeddings[0])
    index = AnnoyIndex(dim, 'angular')  # Using angular distance

    for i, vector in enumerate(embeddings):
        index.add_item(i, vector)

    index.build(n_trees)
    return index

def add_text_embeddings(products):
    for product in products:
        product["embedding"] = get_embeddings([product["description"]])

products = query_result  # the list of product dicts returned by the SPARQL query in Step 4
add_text_embeddings(products)

Then, you can retrieve the Product that is semantically nearest to the user prompt in order to answer their question:

index = build_embeddings_index([product["embedding"] for product in products])
question = "I would like a model which will help me find the molecules with the chosen properties."

nearest_neighbors = index.get_nns_by_vector(get_embeddings(question), 1, include_distances=True)
index_of_nearest_neighbor = nearest_neighbors[0][0]

print(f"Vector search result: {products[index_of_nearest_neighbor]['description']}")
print(f"Product name: {products[index_of_nearest_neighbor]['name']}")
print(f"https://dkg.origintrail.io/explore?ual={products[index_of_nearest_neighbor]['ual']}")

The output will be similar to this:

Vector search result: Predicts the 3D structure of how a molecule interacts with a protein.
Product name: diffdock
https://dkg-testnet.origintrail.io/explore?ual=did:dkg:otp:20430/0x1a061136ed9f5ed69395f18961a0a535ef4b3e5f/2619643

Conclusion

You have now created a Python project which uses tools from the NVIDIA Build platform to help create and query verifiable Knowledge Assets on OriginTrail DKG. You’ve seen how to instruct it to generate SPARQL queries from Natural Language inputs and query the DKG with the resulting code, as well as how to create embeddings and use vector similarity search to find the right Knowledge Assets.

Additionally, you’ve explored the capabilities of the NVIDIA Build platform and how to use it with the DKG, offering versatile options for both structured data querying with SPARQL and semantic similarity search with vectors. With these tools at your disposal, you’re well-equipped to tackle a wide range of tasks requiring knowledge discovery and retrieval by using the decentralized RAG (dRAG).

Decentralized RAG with OriginTrail DKG and NVIDIA Build ecosystem was originally published in OriginTrail on Medium, where people are continuing the conversation by highlighting and responding to this story.


Oasis Open Projects

Building Trust in AI with Open Standards


By Francis Beland, Executive Director, OASIS Open

Open standards in artificial intelligence (AI) are important for a number of reasons:

Interoperability: Open standards allow different AI systems to work together seamlessly, regardless of who developed them or what platform they run on. This means that data and services can be shared across different systems, increasing efficiency and reducing costs.

Innovation: Open standards encourage innovation by providing a common framework for developers to work within. This can lead to the development of new AI tools and techniques that can benefit a wide range of users.

Transparency: Open standards can help increase the transparency of AI systems, making it easier for users to understand how they work and how they make decisions. This is particularly important in applications such as healthcare, finance, and legal, where transparency and accountability are critical.

Accessibility: Open standards can help make AI more accessible to a wider range of users, including those who may not have the resources to develop their own systems. This can help democratize access to AI technology and promote inclusivity.

Trust: Open standards can help build trust in AI by establishing a common set of ethical principles and technical standards that developers can adhere to. This can help address concerns around bias, privacy, and security, and promote responsible AI development and deployment.

The post Building Trust in AI with Open Standards appeared first on OASIS Open.


Hyperledger Foundation

Blockchain Pioneers: Hyperledger Burrow


As we laid out in our Helping a Community Grow by Pruning Inactive Projects post, there is an important life cycle to well governed open source projects. Since our launch in 2015, Hyperledger Foundation has hosted a number of now retired projects that helped drive innovation and advanced the development of enterprise-grade blockchain technologies. This series will look back at the impact of these pioneering projects.


Trust over IP

Authentic Chained Data Containers (ACDC) Task Force Announces Public Review


The Authentic Chained Data Containers (ACDC) Task Force at the Trust Over IP Foundation is pleased to request public review of the following deliverables:

Key Event Receipt Infrastructure (KERI) specification
Authentic Chained Data Containers specification
Composable Event Streaming Representation specification

Together, this suite of specifications provides a blueprint for creating truly decentralized, authentic, and verifiable ecosystems of identifiers, “credentials” [see footnote], and attestations.

The specifications describe a series of unique, innovative features:

Pre-rotation of keys, enabling truly unbounded term identifiers;
Cryptographic root-of-trust;
Chained “credentials” [see footnote] with fully verifiable proof of ownership and proof of authorship;
A serialization format that is optimized for both text and binary representations equally, with unique properties that support lookahead streaming for uncompromised scalability.

This suite of specifications contains additional sub-specifications including Out-Of-Band Introductions, Self-Addressing Identifiers and a revolutionary “path signature” approach for signed containers required to provide a comprehensive solution for Organizational Identity.

With the launch of the vLEI Root of Official Trust this suite of specifications saw its first production deployment through the Python reference implementation in 2022.

The Task Force expects feedback to be provided by April 20, 2024 via GitHub issues on the following repositories using the ToIP Public Review Process:

https://github.com/trustoverip/tswg-keri-specification/issues
https://github.com/trustoverip/tswg-acdc-specification/issues
https://github.com/trustoverip/tswg-cesr-specification/issues

Licensing Information:

The Trust Over IP Foundation Technology Stack Working group deliverables are published under the following licenses:

Copyright mode: OWFa 1.0 (available at https://www.openwebfoundation.org/the-agreements/the-owf-1-0-agreements-granted-claims/owfa-1-0)
Patent mode: OWFa 1.0 (available at https://www.openwebfoundation.org/the-agreements/the-owf-1-0-agreements-granted-claims/owfa-1-0)
Source code: Apache 2.0 (available at http://www.apache.org/licenses/LICENSE-2.0.html)

Note: The Task Force considers “credentials” or “verifiable credentials”, as termed by the W3C, to be only one use and a subset of ACDCs.

The post Authentic Chained Data Containers (ACDC) Task Force Announces Public Review appeared first on Trust Over IP.


ResofWorld

You can’t play on both sides of the Great Firewall

What the TikTok ban really means for U.S. companies.
The U.S. is once again on the cusp of banning TikTok. After last week’s House vote, TikTok is just a Senate vote away from being barred from operating in the...

How Prime Video failed so spectacularly in Africa

Filmmakers, critics, and users blame the platform’s struggle in Africa on poor user experience, a lack of local content, and mediocre publicity.
Richard first heard about Amazon’s plans to lay off “several hundred” employees across its Prime Video and MGM Studios divisions in South Africa on January 10. A film development executive...

Wednesday, 20. March 2024

Project VRM

Personal AI at VRM Day and IIW


Prompt: A woman uses personal AI to know, get control of, and put to better use all available data about her property, health, finances, contacts, calendar, subscriptions, shopping, travel, and work. Via Microsoft Copilot Designer, with spelling corrections by the author.

Most AI news is about what the giants (OpenAI/Microsoft, Meta, Google/Apple, Amazon, Adobe, Nvidia) are doing (seven $trillion, anyone?), or what AI is doing for business (all of Forbes’ AI 50). Against all that, personal AI appears to be about where personal computing was in 1974: no longer an oxymoron but discussed more than delivered.

For evidence, look up “personal AI.” All the results will be about business (see here and here) or “assistants” that are just suction cups on the tentacles of giants (Siri, Google Assistant, Alexa, Bixby), or wannabes that do the same kind of thing (Lindy, Hound, DataBot).

There may be others, but three exceptions I know are Kin, Personal AI and Pi.

Personal AI is finding its most promoted early uses on the side of business more than the side of customers. Zapier, for example, explains that Personal AI “can be used as a productivity or business tool.”

Kin and Pi are personal assistants that help you with your life by surveilling your activities for your own benefit. I’ve signed up for both, but have only experienced Pi. It is happy to chat, give advice, or “just vent,” but when I ask it to help me with the stuff outlined in (and under) the AI-generated image above, it wants to hook me up with a bunch of siloed platforms that cost money, or to do geeky things (PostgreSQL, MongoDB, Python) on my own computer. Provisional conclusion: Pi means well, but the tools aren’t there yet. [Later… Looks like it’s going to morph into some kind of B2B thing, or be abandoned outright, now that Inflection AI’s CEO, Mustafa Suleyman, has gone to Microsoft. Hmm… will Microsoft do what we’d like in this space?]

Open source approaches are out there: OpenDAN, Khoj, Kwaai, and Llama are four, and I know at least one will be at VRM Day and IIW.

So, since personal AI may finally be what pushes VRM into becoming a Real Thing, we’ll make it the focus of our next VRM Day.

As always, VRM Day will precede IIW in the same location: the Boole Room of the Computer History Museum in Mountain View, just off Highway 101 in the heart of Silicon Valley. It’ll be on Monday, 15 April, and start at 9am. There’s a Starbucks across the street and ample parking because the museum is officially closed on Mondays, but the door is open. We lunch outdoors (it’s always clear) at the sports bar on the other corner.

Registration is open now at this Eventbrite link:

https://vrmday2024a.eventbrite.com

You can also just show up, but registering gives us a rough headcount, which is helpful for bringing in the right number of chairs and stuff like that.

See you there!

 


DIF Blog

DIF's work on Interoperability Profiles


The challenge 

Interoperability is a basic requirement for secure identity management and seamless communication between identity systems and services.

However, in a world of multiple digital identity standards and protocols, interoperability doesn’t just happen ‘out of the box’. 

Identity standards and protocols tend to be flexible by design, entailing a range of decisions about how they should be implemented. 

Differences in business priorities, local regulations and how these are interpreted drive divergent implementations, making interoperability hard to achieve in practice.

This means that standards are a necessary, but not sufficient part of interoperability.

Interop Profiles: reducing optionality to enable interoperability

Interop profiles describe a set of specifications and other design choices to establish interoperability. These profiles specify items like

Data models and supported formats
Protocols to transfer Verifiable Credentials (VCs)
Which Decentralized Identifier (DID) methods must be supported
Supported revocation mechanism
Supported signature suites

They also specify what’s out of scope, further reducing optionality and easing implementation. 
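As a purely hypothetical sketch (the field names and values below are illustrative and are not quoted from any published DIF profile), a profile can be thought of as a small, machine-readable set of choices that every participant agrees to support:

# Hypothetical illustration of the kinds of choices an interop profile pins down.
example_profile = {
    "credential_format": "jwt_vc",                 # data model / supported format
    "issuance_protocol": "OpenID4VCI",             # how VCs move from issuer to wallet
    "presentation_protocol": "OpenID4VP",          # how VCs move from wallet to verifier
    "did_methods": ["did:web", "did:jwk"],         # DID methods that must be supported
    "revocation": "StatusList2021",                # supported revocation mechanism
    "signature_suites": ["ES256"],                 # supported signature suites
    "out_of_scope": ["ZKP-based selective disclosure"],  # explicitly excluded
}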

Profiles can be developed to achieve interoperability for a variety of needs in order to establish a trusted ecosystem.

Interop Profiles and Decentralized Identity

There is growing support for interoperability profiles that enable real-world applications of decentralized identity standards and technologies. 

For example, the US Department of Homeland Security (DHS) leads the Silicon Valley Innovation Program, which focuses (among other things) on digitization of trade documentation using Decentralized Identifiers and Verifiable Credentials. To prove interoperability, and help build confidence that the solution doesn’t result in vendor lockin, participants have developed profiles and interoperability test suites to ensure they are able to exchange and verify trade credentials. 

The International Air Transport Association (IATA) plays a similar role in ensuring interoperability within the travel supply chain (for example, when using verifiable credentials to onboard travel agents and intermediaries to an airline's agency portal). 

The Jobs for the Future Foundation has hosted a series of interoperability events (called “JFF Plug Fests”) to select profiles and develop test harnesses demonstrating that individuals can receive and share their credentials using their choice of conformant wallets, and that the flows work across conformant issuers and relying parties.

How DIF is working to make life easier for implementers 

The interoperability challenges highlighted in this article matter for our members. 

For one thing, it’s hard to build workable products, or viable ecosystems, on top of standards and protocols with divergent implementations.

There’s also a growing need for specific approaches to decentralized identity within different industries, regions, and use cases (such as the trade, travel and employment cases mentioned above). 

Interoperability is a core part of the Decentralized Identity Foundation (DIF)’s mission.

Which is why DIF has hosted collaborative work to develop robust interoperability profiles for a number of years. 

Examples include the JWT VC Issuance Profile, which describes the technical protocols, data formats, and other requirements to enable interoperable issuance of VCs from Issuers to Wallets (see https://github.com/decentralized-identity/jwt-vc-issuance-profile ), and the JWT VC Presentation Profile, which describes the technical protocols, data formats, and other technical requirements to enable interoperable exchange of VCs presentations between Wallets and Verifiers (see https://github.com/decentralized-identity/jwt-vc-presentation-profile ). 

Taking a closer look at these examples, the VC Data Model v1.1 defines the data model of Verifiable Credentials (VCs) but does not prescribe standards for transport protocol, key management, authentication, query language, et cetera. The same is true for DIDs.

A range of specifications are available, providing options for how these things (transport, key management, etc) are achieved, but if implementers have to support all possible specifications (and combinations), it would be a lot of work.

So a profile is a way to make choices and even restrictions for a certain use case, allowing all participants to establish interoperability.

Summary

Collaboration on interoperability is an essential part of the process of establishing a viable digital trust ecosystem. 

Interop profiles define specific requirements that must be followed by identity providers, relying parties, and other stakeholders.

DIF provides a neutral venue to collaborate on interop profile development. 

Together with our working group tools, best practices and IPR protection, and our members’ subject matter expertise in decentralized identity technologies, DIF is the destination of choice to host this work. 

Got a question? Email us - we’ll be happy to discuss your requirements. 


Velocity Network

Jen Berres & Mike Andrus on why HCA Healthcare is adopting verifiable digital credentials

On Mar. 8, 2024, HCA Healthcare’s Senior Vice President and Chief Human Resources Officer, Jen Berres, and Vice President of Operations and Technology, Mike Andrus, joined Velocity’s Co-founder and Head of Ecosystem, Etan Bernstein, to discuss the verifiable digital credential movement, the value to healthcare organizations in particular, and the opportunity to work together to solve for an HR challenge.

Elastos Foundation

Elastos Announces Partnership with IoTeX to Deliver Security and Access to DePIN Infrastructure


Elastos today announced a partnership with IoTeX, to deliver ID verification and validation services across the Decentralized Physical Infrastructure Networks (DePINs) specialists’ portfolio including DePINscan, DePINasset and W3bstream.

DePINs lie at the intersection between crowd-sourced participation, funding and governance models and so-called Real World Assets (RWA) – tangible infrastructure such as buildings, equipment or other capital-intensive assets. They offer a mechanism to recruit and reward participants to maintain these assets through the blockchain. When the latter is combined with a physical interface such as IoT, the contributions of these so-called physical ‘node managers’ can be tracked and, in turn, rewarded with tokens whose value itself increases with the development and use of the asset.

DePINscan provides a ready-to-use dashboard with essential visualizations for IoT projects and roll outs; while W3bstream is a decentralized protocol that connects data generated in the physical world to the Blockchain world.  IoTeX harnesses its innovative Roll-Delegated Proof of Stake (Roll-DPoS) consensus mechanism, designed to optimize speed and scalability for the seamless integration of IoT devices, while ensuring integrity throughout the entire process. Stakeholders cast their votes to elect specific block producers; block producers receive rewards for their contributions, which they subsequently share with the stakeholders who endorsed them.

Jonathan Hargreaves, Elastos’ Global Head of Business Development & ESG, describes the partnership as Web3’s ‘next frontier’.

“Extending the benefits of the SmartWeb in terms of disintermediation, transparency and privacy into the physical domain is a logical but nonetheless exciting next step.  Our partnership with IoTeX means that entrepreneurs and businesses of any size will now have access to infrastructure that would otherwise be off limits to them, direct and on their terms.  This epitomizes Web3’s promise to level the playing field, thanks to its unique ability to ensure irrefutable identity proof which actually requires neither party to relinquish control of the same,” he says.

Raullen Chai, IoTeX’s co-founder and CEO, explains that DePINs permit an entirely new generation of businesses and entrepreneurs to access and monetize global infrastructure – from buildings to cabling, for instance – that otherwise would be prohibitively expensive or inaccessible.    

“Our partnership with Elastos represents an important milestone.  Extending our offering to the Elastos Smart Chain (ESC) offers some compelling advantages, including direct integration with ‘Layer 2’ Bitcoin, meaning that agreements can be embedded and reconciled direct in the World’s most popular and trusted digital currency.  This is an essential capability as DePINs become more mainstream,” he says. 

Interested in staying up to date? Follow Elastos here and join our live telegram chat.

The post Elastos Announces Partnership with IoTeX to Deliver Security and Access to DePIN Infrastructure appeared first on Elastos.


Identity At The Center - Podcast

A new episode of the Identity at the Center podcast is now available


A new episode of the Identity at the Center podcast is now available. This is a special Sponsor Spotlight episode, made in collaboration with our sponsor, Zilla Security. We had a great conversation with Deepak Taneja, CEO & Co-founder of Zilla Security, discussing a range of topics from how Zilla differentiates itself in the crowded IAM market to the role of Robotic Process Automation (RPA) in the identity lifecycle.

You can listen to this episode on our website, idacpodcast.com, or in your favorite podcast app. Don't miss it!

#iam #podcast #idac


Next Level Supply Chain Podcast with GS1

Future-Proofing Retail with RFID and 2D Barcodes with Sarah Jones Fairchild


Radio frequency identification (RFID) and 2D barcodes are transforming how we handle the supply chain. 

Sarah Jones Fairchild, Vice President of Sales Operations at SWIM USA, talks 2D barcode applications for customer safety, efficiency in retail checkout, inventory management, and the broader implications for companies as they prepare for the technological demands of the future. Sarah explains the importance of high-quality data and the impact of incorrect data on consumers. She also touches on the potential for these technologies to address industry-specific needs and regulatory requirements. 

Sarah highlights her personal experience with tech at home and work, specifically how it helps align information for everyone. The discussion emphasizes the importance of GS1 standards for ensuring compatibility in the supply chain and the necessity of proper data management to fully leverage RFID and 2D barcode capabilities. The conversation also covers supply chain tracking information for business owners of all types and why RFID can take a few years to implement.
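
One concrete way a 2D barcode can carry more than a bare product number is by encoding a GS1 Digital Link URI, in which the GTIN and attributes such as batch and expiry ride along in a resolvable web address. The sketch below builds such a URI; the GTIN, lot, and expiry values are made up for illustration.

```python
# Build a GS1 Digital Link style URI: the GTIN (AI 01) and batch/lot (AI 10) go in
# the path, and the expiry date (AI 17, YYMMDD) is carried as a query parameter.
# Product data below is invented for the example.
def gs1_digital_link(gtin: str, lot: str, expiry_yymmdd: str,
                     domain: str = "https://id.gs1.org") -> str:
    return f"{domain}/01/{gtin}/10/{lot}?17={expiry_yymmdd}"

uri = gs1_digital_link("09506000134352", "ABC123", "270630")
print(uri)  # https://id.gs1.org/01/09506000134352/10/ABC123?17=270630
```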

 

Key takeaways: 

Integrating RFID and 2D barcode technologies in supply chain operations is essential for improving accuracy and efficiency.

Data quality and management are challenging across industries, particularly with the need for high compatibility and usability standards.

Companies must embrace technologies such as RFID and 2D barcodes for the future.

 

Resources: 

What Is RFID Technology, and How Does It Work?

2D Barcodes: Changing the way you eat, shop, and live

Sunrise 2027: The Next Dimension in Barcodes

Enhance Your Supply Chain Visibility

 

Connect with GS1 US:

Our website - www.gs1us.org

GS1 US on LinkedIn

 

Connect with guest:

Sarah Jones Fairchild on LinkedIn

Check out SWIM USA


Digital Identity NZ

Hello Autumn: Let’s Dive into Serious Work Together


Kia ora,

If the first two months of 2024 for Digital Identity NZ are anything to go by, this year is certainly turning out to be every bit as busy as 2023. It is a different kind of busy, with more collaboration and partnership engagement needed to ‘get things done’ in the digital identity domain, against a backdrop of regulation and economic headwinds.

A couple of weeks ago, the year’s first Coffee Chat saw good attendance, as did last month’s Air New Zealand and Authsignal sponsored webinar on passkeys. This exclusive member-only event shared how DINZ members Authsignal and Air New Zealand worked together to deliver a world-class implementation of passkeys to secure Air New Zealand’s customers’ accounts. Speaking of Authsignal, founder and DINZ Executive Council Member Justin Soong wrote this exceptional thought piece on AI, published in Forbes last month. And DINZ member SSS – IT Security Specialists received this accolade!

Next week, members will receive a personal email from me seeking expressions of interest, particularly from digital ID service and attribute providers, to participate in an investigative sprint early next month run by DINZ member PaymentsNZ. The aim is to surface the digital identity-related issues that people encounter in the payments industry and develop best practice requirements to overcome them as part of PaymentsNZ’s Next Generation Payments programme. Stay tuned.

We kick off April with a lunchtime fireside chat, Digital Health Identity: History, current state and the future, with two Te Whatu Ora specialists. There’s so much happening in this space. You can find out more and register here.

If you’re getting the impression that April is the month for digital identity, you’re correct! Tuesday 9 April is World Identity Management Day! Although it was co-founded by the Identity Defined Security Alliance and the National Cybersecurity Alliance in the US in 2021, the day is recognised in many countries globally. Now in its fourth year, the 2024 Virtual Conference brings together identity and security leaders and practitioners from all over the world to learn and engage.

April is also a favourite time of year to publish research that helps to level-set our own strategies and plans, as DINZ did last year. This Australian research, forwarded by a public sector member, would probably show similar results in NZ, as reflected in DINZ member InternetNZ’s insights research. And the EU digital wallet is taking shape as it aims to showcase a robust and interoperable platform for digital identification, authentication and electronic signatures based on common standards across the European Union. We hope to continue our research and additional initiatives for 2024, and we’re continually looking for support in the way of sponsorship from our members. Click here to find out how you can support DINZ’s research and future ambitions.

Ngā mihi

Colin Wallis
Executive Director, Digital Identity NZ

Read the full news here: Hello Autumn: Let’s Dive into Serious Work Together


The post Hello Autumn: Let’s Dive into Serious Work Together appeared first on Digital Identity New Zealand.

Tuesday, 19. March 2024

Hyperledger Foundation

Why Hyperledger Besu is a Top Choice for Financial Use Cases

Hyperledger Besu has emerged as a preferred runtime for EVM-based financial initiatives worldwide. For projects like tokenization, settlements, CBDCs (Central Bank Digital Currencies), and trade finance, Besu stands out for its robust security features, versatility in network construction, performance, pluggability, and enterprise-friendly licensing and programming language.


Monday, 18. March 2024

FIDO Alliance

Tech Telegraph: Best PC and laptop security accessories 2024


If you haven’t had the pleasure of using biometrics on a device for authentication through Windows Hello, you’re missing out. It’s much faster and easier than having to type in your password.


Android Headlines: X Android App Beta Gets Password-less Passkeys Authentication Support


Passkeys enhance security by eliminating traditional passwords and relying on the interaction between Private and Public keys for user authentication, reducing the instance of phishing attacks and data breaches. Passkeys are gaining traction among various platforms, including websites, gaming platforms, and Windows 11 apps.
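
The private/public key interaction mentioned above boils down to a challenge-response signature. The sketch below shows only that core step, using Ed25519 from the Python cryptography package for brevity; real WebAuthn/passkey flows add origin binding, attestation, user verification, and signature counters.

```python
# Minimal sketch of the key interaction behind passkeys: the private key stays on the
# user's device and signs a server-issued challenge; the server verifies the signature
# with the public key it stored at registration. Not a full WebAuthn implementation.
import os
from cryptography.exceptions import InvalidSignature
from cryptography.hazmat.primitives.asymmetric.ed25519 import Ed25519PrivateKey

# Registration: the key pair is created on the device; the server keeps the public key.
device_private_key = Ed25519PrivateKey.generate()
server_stored_public_key = device_private_key.public_key()

# Authentication: the server sends a fresh challenge, the device signs it.
challenge = os.urandom(32)
signature = device_private_key.sign(challenge)

try:
    server_stored_public_key.verify(signature, challenge)
    print("login accepted")
except InvalidSignature:
    print("login rejected")
```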


The New Stack: 3 Steps to Make Logins with Passkeys Reliable


Passkeys offer modern and secure authentication by enabling cryptography-backed user authentication with a frictionless user experience. With users becoming more accustomed to passkeys, 2024 is the year to ditch passwords and upgrade to passkeys with these considerations in mind.


Identity At The Center - Podcast

It’s time for the latest episode of the Identity at the Center Podcast!


It’s time for the latest episode of the Identity at the Center Podcast! We had the pleasure of welcoming back Andi Hindle, the Conference Chair for Identiverse, for an in-depth discussion about the planning and unique aspects of the Identiverse conference. We explore whether Identiverse is a Digital Identity conference or an IAM conference. Looking forward to an enlightening conversation? Listen to the full episode at idacpodcast.com or in your favorite podcast app.

#iam #podcast #idac

Friday, 15. March 2024

Identity At The Center - Podcast

Join us for a special Friday episode of The Identity at the Center Podcast


Join us for a special Friday episode of The Identity at the Center Podcast. We discussed the rapidly evolving world of Privileged Access Management with our guest Paul Mezzera. We talked about the driving forces behind these changes and what the future might hold. Listen to our conversation at idacpodcast.com or in your favorite podcast app.

#iam #podcast #idac

Thursday, 14. March 2024

Berkman Klein Center

Accuracy, Incentives, Honesty: Insights from COVID-19 Exposure Notification Apps


The next pandemic response must respect user preferences or risk low adoption

By Elissa M. Redmiles and Oshrat Ayalon

Photo by Mika Baumeister on Unsplash

Four years after COVID-19 was first declared a pandemic, policy makers, companies and citizens alike have moved on. The CDC no longer offers separate guidance for COVID-19. Apple and Google have shut down their exposure notification infrastructure, which was used heavily in the US and Europe. As COVID-19 spread, technologists were called to serve by building and deploying exposure notification apps to scale parts of the contact tracing process. These apps allowed users to report when they tested positive for COVID-19 and to notify other users when they had been in the vicinity of an infected user. But getting people to use exposure notification apps during the pandemic proved challenging.

More than three million lives have been lost to COVID-19 over the past four years. Any hope of losing fewer lives during the next pandemic rests on reflection: what did we do, what can we learn from it, and what can we do better next time? Here, we offer five key lessons-learned from research on COVID-19 apps in the US and Europe that can help us prepare for the next pandemic.

Privacy is important, but accuracy also matters

Privacy was the primary focus in early exposure notification apps, and rightfully so. The apps all trace their users’ medical information and movements in various ways, and may store some or all of that information in a central database in order to inform other users of potential infection. The misuse of this information could easily result in unintentional, or even intentional, harm.

However, research into whether (and how) people used exposure notification apps during the pandemic showed that privacy might not be the most important factor. People care about accuracy, or an app’s rate of incorrect reports of COVID-19 exposure (both false positives and false negatives), which may have also influenced rates of public app adoption. Yet, we still know little about how effective the deployed exposure notification apps were. Future apps will need to have measurement tools and methods designed into them before they are released to accurately track their usefulness.
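
As a toy illustration of the two error rates the authors refer to (the numbers below are invented, not study data):

```python
# False positives: users alerted who were never actually exposed.
# False negatives: exposed users the app failed to alert.
alerts_sent = 1_000
correct_alerts = 700
missed_exposures = 150

false_positives = alerts_sent - correct_alerts                      # 300
false_positive_share = false_positives / alerts_sent                # 0.30
false_negative_rate = missed_exposures / (correct_alerts + missed_exposures)

print(f"{false_positive_share:.0%} of alerts were spurious")        # 30%
print(f"{false_negative_rate:.0%} of real exposures were missed")   # 18%
```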

We need to better understand the role of incentives

Researchers discovered that using direct incentives, such as monetary compensation, to get people to install exposure notification apps worked at first, but had little effect in the long term. In fact, one field study found that people who received money were less likely to still be using the app eight months later than those who didn’t. Paying people to download a contact tracing app is even less effective when the app is perceived to be bad quality or inaccurate. However, monetary incentives may be able to “compensate” when the app is perceived to be costly in other ways, such as eating up mobile data.

Given the ethical problems and lack of success with direct incentives, focusing on indirect incentives, such as functionality, may be key to increasing adoption. Exposure notification apps have the potential to serve a greater purpose during pandemics than merely exposure notification. Our research found that people using exposure notification apps wanted them to serve as a “one-stop-shop” for quick receipt of test results, information on the state of public health in their region, and assistance finding testing centers.

Future app design needs to examine user wants and expectations to ensure widespread adoption. This is hardly a new concept — every successful “fun” app begins with this user-centered model. Apps that provide these extra benefits to users will not only be better adopted, they will also see more frequent and prolonged use.

…Over a third of the Coronalert app users we interviewed believed that it tracked their location, despite repeated communications over the course of a year that it used proximity rather than location to detect possible exposures.

Honesty is the most effective communication strategy

Exposure notification apps are often framed to the public as having inherent individual benefits: if you use this app, you’ll be able to tell when you’ve been exposed to a disease. In reality, exposure notification apps have a stronger collective benefit of preventing the overall spread of disease in communities. Being honest with potential users about the true benefits is more effective than playing up the less significant individual benefit. When examining how to best advertise Louisiana’s exposure notification app, we found that people were most receptive to the app when its collectivistic benefits were centered.

Honesty and openness in privacy is also essential, especially when it comes to data collection and storage. Despite this transparency, however, people may still make assumptions based on false preconceptions or logic. For example, over a third of the Coronalert app users we interviewed believed that it tracked their location, despite repeated communications over the course of a year that it used proximity rather than location to detect possible exposures.

Integration with existing health systems is essential

There was a disconnect between COVID-19 exposure notification apps and public healthcare systems, even in countries with universal healthcare and government-supported apps. Belgium’s Coronalert app, for example, allowed users to receive their test results faster by linking their test to their app using a unique code. But testing center staff were not trained on the app and failed to prompt users for that code. Not only was receiving test results a primary motivator in getting people to use the app, but failing to link positive results to specific app users also reduced the app’s efficacy.

This disconnect may be far greater in countries without universal healthcare or where exposure notification apps are privately created. In order for these apps to be effective, developers must collaborate with public health workers to develop a shared understanding of how testing centers operate, determine the information needed to provide accurate tracking, and decide on the best way to follow up on potential infections.

Resourcing technical capacity is critical

A wide range of exposure notification apps were developed to combat COVID-19 by many different organizations. In the absence of immediate government action, many of the earliest efforts were led by universities or volunteers. Academics developed the DP3T proximity tracing protocol, which guided Google and Apple’s development of exposure notification infrastructure for Android and iOS phones.
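
For readers curious how proximity-based notification works without sharing location, here is a greatly simplified sketch in the spirit of DP3T and the Google-Apple design: phones broadcast short-lived identifiers derived from a daily key, and only the daily keys of users who test positive are published, so other phones can check for matches locally. The key schedule below is an illustration, not the actual specified construction.

```python
# Simplified rolling-identifier sketch (illustrative only, not the DP3T/GAEN spec).
import hashlib
import hmac

def daily_key(previous_key: bytes) -> bytes:
    """Rotate the per-day secret by hashing the previous one."""
    return hashlib.sha256(previous_key).digest()

def ephemeral_id(day_key: bytes, interval: int) -> bytes:
    """Derive a short identifier broadcast during one interval of the day."""
    return hmac.new(day_key, interval.to_bytes(4, "big"), hashlib.sha256).digest()[:16]

# Phone A broadcasts ephemeral IDs; Phone B records the ones it heard nearby.
key_day_1 = daily_key(b"phone-a-seed-secret")
heard_by_b = {ephemeral_id(key_day_1, i) for i in (3, 7, 12)}

# A tests positive and publishes key_day_1; B re-derives all IDs and checks locally.
exposed = any(ephemeral_id(key_day_1, i) in heard_by_b for i in range(144))
print("possible exposure" if exposed else "no match")  # possible exposure
```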

However, privatization of exposure notification infrastructure created an enormous potential for private medical and other information to fall into the hands of corporations who are in the business of big data. It also subjected exposure notification technology to private companies’ rules (and whims).

Google and Apple released exposure notification infrastructure in April 2020 but did not release direct-to-user exposure notification functionality until later in the pandemic. This decision left the development of exposure notification apps to public health agencies that lacked the resources and technical capacity to do so. Volunteers stepped in to fill this void. For example, the PathCheck foundation developed exposure notification apps for 7 states and countries on top of the Google-Apple Exposure Notification infrastructure.

“…We need to eliminate these scattered responses, align incentives, and integrate the strengths and perspectives of public, private, and academic bodies to develop protocols, models, and best practices.”

While it is natural for universities to support the public good, and encouraging that private citizens volunteered so much of their time and resources to do so, they should not have to in the next pandemic. To respond to future pandemics, we need to eliminate these scattered responses, align incentives, and integrate the strengths and perspectives of public, private, and academic bodies to develop protocols, models, and best practices.

Applying the lessons learned

Building tech responsibly means not just considering privacy, but providing technology that respects user preferences. When people give up their data, they expect a benefit — be that a collective benefit, such as fighting a pandemic or helping cancer research, or an individual one. They likewise expect utility: apps that are accurate, achieve their goals, and provide a holistic set of features.

If we continue to build tech based on our assumptions of what users want, we risk low adoption of these technologies. And during times of crisis, such as this still-ongoing COVID-19 pandemic, the consequences of low adoption are dire.

Elissa M. Redmiles is a computer scientist specializing in security and privacy for marginalized & vulnerable groups at Georgetown University and Harvard’s Berkman Klein Center.

Oshrat Ayalon is a human-computer interaction researcher focusing on privacy and security at the University of Haifa.

Accuracy, Incentives, Honesty: Insights from COVID-19 Exposure Notification Apps was originally published in Berkman Klein Center Collection on Medium, where people are continuing the conversation by highlighting and responding to this story.


Elastos Foundation

Elastos Launches Grant Program to Accelerate Deployment of “Smart Bitcoin” Applications


Visit Destiny Calls Website Page!

Today Elastos, a pioneer in blockchain technology announced the launch of its Destiny Calls Program. Elastos is the creator of BeL2, the first Bitcoin Layer 2 applying zero-knowledge technology to enable the direct development and management of  ‘Bitcoin-native’ smart contracts. 

The new program is now welcoming applications from the digital entertainment, gaming and leisure sector utilising Elastos’ decentralised infrastructure, including BeL2, to deliver Bitcoin-denominated services and experiences. The initial cohort of 6 to 8 projects will be backed by up to 100,000 ELA in funding, equivalent to approximately $378,000 USD, to kick-start a new and non-invasive approach to Layer 2 solutions. The program is a key part of Elastos’ ongoing mission to accelerate the development of the user-controlled SmartWeb.

“With the recent launch of Elastos’ BeL2, innovators and entrepreneurs now have access to the functionality of layer 2 blockchains backed by the unparalleled security of Bitcoin,” said Jonathan Hargreaves, Global Head of Business Development & ESG at Elastos. “Bitcoin Layer 2 promises to unlock various applications that will underpin the SmartWeb and has fundamentally addressed some of the capacity and functionality restrictions that have hindered the mainstream adoption of the Bitcoin ecosystem. Destiny Calls will provide crucial initial funding for teams exploring the potential of BeL2 and Elastos’ other SmartWeb infrastructure, and will accelerate the transformation of the internet into a user-driven and interactive ecosystem.”

Projects will be selected by the Destiny Calls board and reviewed with support by QuestBook, the on-chain grant funding review and administration platform. The initial cohort will be focused on three sectors: digital entertainment, gaming and leisure. In addition to funding, as part of the program Elastos will provide marketing and technical support, along with mentorships to support grantees in reaching their program milestones. Interested applicants are encouraged to visit the Destiny Calls page here.

 

Elastos’ Bitcoin Layer2, BeL2

The launch of Destiny Calls follows the recent launch of Elastos’ Bitcoin layer 2, BeL2. BeL2 is the first Bitcoin layer 2 to facilitate the creation, recognition, management and exchange of any Bitcoin-denominated smart contract directly between concerned parties, without workarounds like intermediaries, side chains or additional applications. BeL2 promises to unlock the SmartWeb by providing unprecedented functionality to Bitcoin and is part of growing industry excitement and focus on unlocking layer 2 functionality on Bitcoin after the significant growth of L2s in the Ethereum ecosystem.

 

Pilot Recipients Announced

As part of the launch, Elastos is confirming that Beatfarm will join Destiny Calls as an inaugural member, having successfully completed pilot projects with Elastos. Beatfarm is a decentralised platform that gives artists direct access to potential collaborators, promoters, producers and industry professionals on their own terms. In collaboration with Elastos, Beatfarm is working to enable artists to establish Smart Contracts on their own terms, with the resulting contracts – eScriptions – secured and assured through Bitcoin and tradeable through a decentralised marketplace.

“Beatfarm’s success as a pilot project perfectly illustrates the potential of BeL2 to create sustainable business models for decentralised Web3 experiences,” adds Jonathan. “Beatfarm exemplifies our goal of supporting innovative ideas in digital entertainment, gaming and leisure through Destiny Calls.”

For more information, please visit the Destiny Calls website page.

The post Elastos Launches Grant Program to Accelerate Deployment of “Smart Bitcoin” Applications appeared first on Elastos.


MOBI

MOBI Joins eSTART Coalition to Help Modernize Automotive Title and Registration Processes


MOBI Joins eSTART Coalition to Help Modernize Automotive Title and Registration Processes

Coalition Advocates for Improvements to Streamline Auto Transactions

Los Angeles — 14 March 2024. MOBI, a global nonprofit Web3 consortium, is excited to announce its participation in the Electronic Secure Title and Registration Transformation (eSTART) Coalition as a founding member. eSTART is a group of leading auto industry organizations united in advocating for modern solutions to replace the paper-based processes that currently dominate state and local DMV operations.

The eSTART Coalition focuses on three key areas of vehicle transactions:

- Permitting electronic signatures on all title and registration documents;
- Adopting tools for electronic submission and processing of title and registration; and
- Enabling electronic vehicle records transfers.

Modernizing these processes will result in significant cost and time savings for consumers, state and local DMV operations, and industry participants.

Across the U.S., countless titling/registration service providers maintain unique databases and processes for vehicle registration and titling. While some of these jurisdictions have begun digitizing certain processes, many rely entirely on paper-based and manual workflows. This fragmented approach presents several pain points for Motor Vehicle Authorities (MVAs), private sector participants, and consumers, including:

- Lack of standardized processes leading to inconsistencies in data management and accessibility.
- Incurrence of substantial costs associated with paper-based systems, including storage, processing, and handling.
- Prolonged processing times and increased risk of errors due to manual verification processes.
- Missed opportunities for cost savings, efficiency gains, and enhanced customer experiences.

Addressing these pain points requires a solution that can be easily adopted across all jurisdictions rather than a solution that functions at a state, county or municipal jurisdiction level. MOBI and its members are collaborating on a Web3-enabled standardized solution to enhance efficiency and cross-border regulatory compliance in MVA operations with an interoperability framework rooted in self-sovereign data and identities. This unified framework serves as a common language, enabling organizations with diverse business processes and legacy systems to efficiently coordinate in a standardized manner without having to build and maintain new infrastructure.

The implementation of a standardized Web3 ecosystem offers a promising solution to streamline operations, increase efficiency, reduce costs, and greatly improve permissioned-only access to data. The ability to verify identities and transactions in a decentralized way can reduce odometer and titling fraud, eliminate the need for manual identity verification, improve insurance products, and enable more seamless remote transactions (e.g. online sales and road usage charging).

“We’re excited to be part of a coalition that not only shares our vision for a more streamlined and modern automotive industry but is actively working towards making it a reality,” said Tram Vo, MOBI CEO and Co-Founder. “MOBI and its members are proud to bring a unique Web3 standardized approach to this groundbreaking endeavor. Together, we’re setting the stage for a more efficient, interoperable ecosystem that empowers stakeholders through enhanced trust and data privacy for all.”

Other transportation industry organizations, including government agencies, industry partners, and associations, are encouraged to join the eSTART Coalition to advocate for these important changes. For more information about eSTART, please visit www.estartcoalition.org or contact info@estartcoalition.org.

About MOBI

MOBI is a global nonprofit Web3 consortium. We are creating standards for trusted self-sovereign data and identities (e.g. vehicles, people, businesses, things), verifiable credentials, and cross-industry interoperability. Our goal is to make the digital economy more efficient, equitable, decentralized, and sustainable while preserving data privacy for users and providers alike. For additional information about joining MOBI, please visit www.dlt.mobi.

About eSTART Coalition

The Electronic Secure Title and Registration Transformation (eSTART) Coalition is a united group of leading automotive organizations committed to modernizing and streamlining automotive title and registration processes. eSTART focuses on advocating for the implementation of efficient technology solutions to replace the paper-dependent systems currently used by DMVs. Through collective advocacy and action at the local and national levels, the coalition aims to drive significant improvement in automotive industry processes in ways that benefit all customers, DMVs and industry participants.

For more information, please visit www.estartcoalition.org.

Media Contact: Grace Pulliam, MOBI Communications Manager

Email: grace@dlt.mobi | Twitter: twitter.com/dltmobi | Linkedin: MOBI

The post MOBI Joins eSTART Coalition to Help Modernize Automotive Title and Registration Processes first appeared on MOBI | The New Economy of Movement.

Wednesday, 13. March 2024

Elastos Foundation

ELA: The Queen of Bitcoin


Bitcoin transformed finance by deploying blockchain technology, a decentralised system that replaces central authority with cryptographic trust. At its heart lies the Proof of Work (PoW) consensus algorithm, where miners expend computational energy to compete and solve complex mathematical problems, securing the network and validating transactions for BTC rewards.

This model reflects the natural competition for survival, akin to trees vying for sunlight, businesses vying for market dominance, individuals competing for a mate or the dynamics between predators and prey —each process governed by the relentless pursuit of energy and dominance.

Bitcoin’s hashrate represents its own competitive edge in the digital realm. This hashrate, a staggering 595.79 EH/s, signifies a computational battle much like those found in nature, but on a scale that dwarfs the combined power of the world’s supercomputers, underscoring the network’s unmatched security and the near-impossibility of overpowering it.

PoW is more than a simple mechanism, integrating nature’s laws into the digital domain to fortify Bitcoin’s network through electricity, a tangible, physical cost. Bitcoin, becoming the unchallenged cornerstone of digital finance, offers a decentralised alternative that empowers individuals with financial sovereignty and freedom from central authority. It provides a secure, transparent, and accessible financial system for everyone, regardless of location or status.

 

Satoshi’s Vision for Merged Mining

 

 

Merged mining, or Auxiliary Proof of Work (AuxPoW), allows two different blockchains to use the same consensus mechanism. Miners can mine blocks on both chains simultaneously, submitting proof of their work to both networks. The key is that the ‘child’ blockchain, while independent in transactions and storage, relies on the ‘parent’ blockchain’s PoW for its security.
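
A highly simplified sketch of that mechanic, under the assumption of a toy header format: the miner commits the child chain’s block hash inside the parent block it is already mining, and the child chain accepts the parent’s proof of work as long as it meets the child’s own (typically easier) target. Real AuxPoW additionally carries a coinbase transaction and a Merkle branch, which are omitted here.

```python
# Toy AuxPoW illustration: mine one parent header that embeds the child block hash,
# then let the child chain validate that header against its own difficulty target.
import hashlib

def sha256d(data: bytes) -> bytes:
    return hashlib.sha256(hashlib.sha256(data).digest()).digest()

def mine_parent_block(child_block_hash: bytes, parent_target: int) -> bytes:
    """Grind a nonce until the parent header's hash falls below the parent target."""
    nonce = 0
    while True:
        header = b"parent-header" + child_block_hash + nonce.to_bytes(8, "big")
        if int.from_bytes(sha256d(header), "big") < parent_target:
            return header
        nonce += 1

def child_chain_accepts(parent_header: bytes, child_block_hash: bytes,
                        child_target: int) -> bool:
    commits_to_child = child_block_hash in parent_header
    meets_child_difficulty = int.from_bytes(sha256d(parent_header), "big") < child_target
    return commits_to_child and meets_child_difficulty

child_hash = sha256d(b"child-block")
header = mine_parent_block(child_hash, parent_target=2 ** 240)   # toy difficulty
print(child_chain_accepts(header, child_hash, child_target=2 ** 244))  # True
```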

The concept of merged mining was introduced in a Bitcoin forum post by Satoshi Nakamoto in 2010, discussing the possibility of a new service called BitDNS to be mined simultaneously with Bitcoin. Satoshi proposed that by allowing miners to work on both chains at once, without extra effort or splitting the mining community, both networks could benefit from increased security and efficiency. The benefits include:

- Economic Assurance of Security: Merged mining with Bitcoin means a ‘child’ blockchain’s security is underwritten by the considerable economic cost of Bitcoin mining. This straightforwardly leverages the existing, well-established energy expenditure of Bitcoin for maximum security with no additional complexity.
- Resource Optimisation and Environmental Consideration: Utilising Bitcoin’s existing mining infrastructure, merged mining does not require extra energy, making it an efficient and environmentally considerate approach to securing a blockchain.
- Scalability through Proven Infrastructure: By tapping into Bitcoin’s vast network of miners, merged mining scales a ‘child’ blockchain’s security with the growth of Bitcoin’s network.

Merged mining showcases efficiency and symbiosis, much like the natural cooperation in mycorrhizal networks, bees’ cross-species pollination, and mutualistic relationships between birds and mammals. It mirrors human ingenuity in leveraging established resources, such as start-ups utilising corporate infrastructures and solar panels or trees harnessing the sun’s energy, emphasising the smart utilisation of existing networks to bolster security and growth without additional expenditure.

Notably, Namecoin, one of the first to adopt this with Bitcoin, aims at decentralising domain-name registration. Dogecoin, known for being merge mined, actually pairs with Litecoin due to the shared Scrypt algorithm, not Bitcoin. Myriadcoin’s unique approach supports multiple algorithms, including SHA-256, making it compatible with Bitcoin. Syscoin and Elastos also leverage Bitcoin’s hash power for enhanced security through merge mining.

 

Elastos and Bitcoin Merged Mining

Elastos, which began with the vision of creating a secure, decentralised internet, incorporated merged mining with Bitcoin in 2018. BTC.com helped mine its first block, and today its network and currency ELA benefit from over 50% of Bitcoin’s mining security. So, what does this mean?

- Elastos Utilises the Strongest Proof of Work Security Model in Existence: By merged mining with Bitcoin, Elastos capitalises on the most extensive PoW network, inheriting Bitcoin’s unparalleled security attributes. This symbiotic relationship means Elastos’ blockchain integrity is as robust as Bitcoin’s, mitigating risks without directly vying for Bitcoin’s mining resources.
- Elastos Has Achieved an Energy-Efficient Design Without Compromising Security: Energy efficiency is a major concern in cryptocurrency mining. Elastos adds transaction and block validation on its network by piggybacking on the work done by Bitcoin miners, thus maintaining high security with no additional energy requirements. This model serves as a case study in eco-conscious blockchain design.
- Elastos Offers a Unique Combination of a Decentralised Operating System with Bitcoin-Level Security: Unlike conventional blockchains, Elastos is a fully-fledged operating system for decentralised applications, secured by a blockchain layer. By integrating Bitcoin’s hash power through merged mining, it ensures a fortified environment for running dApps, differentiating itself significantly from competitors.
- Elastos Is Pioneering the True Decentralised Internet Backed by the Robustness of Bitcoin’s Network: Elastos’ aim to revamp the internet structure into a truly decentralised form is ambitious. By aligning its consensus mechanism with that of Bitcoin, it anchors its network to the tried-and-tested resilience of Bitcoin’s mining power, driving forward a new paradigm for digital communication and interaction.
- Elastos’s Ecosystem Is Designed to be Self-Sustaining and Independent, Yet Benefits Directly from Bitcoin’s Continued Growth: The design of Elastos’s ecosystem ensures it remains autonomous. As Bitcoin’s network expands and becomes more secure, Elastos indirectly benefits from these enhancements, bolstering its own proposition without the need for additional investment in security.
- Elastos May Be the Most Direct Implementation of Satoshi Nakamoto’s Vision for Merged Mining: Elastos’s use of merged mining is arguably a direct reflection of Satoshi’s initial musings on the subject. Its broad strategic outlook that includes an operating system, a carrier network, and SDKs for developers, all secured by the hash rate of Bitcoin, makes it a comprehensive and multidimensional implementation of the concept.

 

BTC’s Queen

Elastos, by merge mining with Bitcoin, can be likened to a queen in the chess game of digital finance, where Bitcoin holds the position of king. Just as a queen’s versatility and power are essential for protecting the king and dominating the board, Elastos’ integration with Bitcoin’s security framework amplifies the ecosystem’s resilience and innovation and gives its own ecosystem a plethora of utility. This includes:

- Transaction Fees: ELA powers Elastos by covering transaction fees, including smart contracts and asset registrations, ensuring network security and efficiency.
- Digital Asset Exchange: ELA fuels a decentralised economy in Elastos, enabling direct trade of digital assets and services, cutting out middlemen.
- Incentive Mechanism: ELA rewards participants, including miners who secure the network via merge mining with Bitcoin, enhancing security and sustainability.
- Governance: Holding ELA grants governance rights, allowing stakeholders to vote on network decisions through the Cyber Republic, promoting community-driven development.
- Decentralised Applications (DApps): ELA is essential for using DApps on Elastos, providing access to a broad range of services and expanding the ecosystem’s functionality.

Together, Bitcoin and Elastos form a formidable duo, combining the steadfast security of the king with the dynamic reach and versatility of the queen, setting the stage for a future where digital finance is both secure and boundlessly innovative. What’s more, Elastos is developing BeL2, the Bitcoin Elastos Layer 2 protocol, allowing EVM smart contracts to run directly on top of Bitcoin, a scalable BitVM innovation. What if such services enable anyone with their decentralised wallet to generate their own Bitcoin-backed algorithmic stablecoins, free from censorship? If Bitcoin introduces the concept of “Be Your Own Bank,” what if Elastos can expand the idea to “Be Your Own Central Bank,” both secured by PoW? This could drastically disrupt finance as we know it.

Interested in staying up to date? Follow Elastos here and join our live telegram.

The post ELA: The Queen of Bitcoin appeared first on Elastos.


Hyperledger Foundation

Hyperledger Mentorship Spotlight: Aries-vcx based message mediator



The world of technology has seen significant developments over the past few decades, largely driven by advancements in cryptography. These advancements have led to innovations including secure internet traffic through HTTPS and WireGuard; protected data storage via BitLocker, LUKS, and fscrypt; decentralized consensus records using Bitcoin and Ethereum; and privacy-focused messaging protocols like Signal and MLS (Messaging Layer Security).

However, despite these advances, our online identities remain controlled by third parties, whether we sign in to apps using Google or Facebook OpenID or manage "verified" accounts on platforms such as Twitter or Instagram. An emerging movement seeks to change this status quo by harnessing the transformative power of cryptography. Governments are also starting to recognize the value of self-sovereign identity (SSI)—a system in which individuals retain full control of their own digital identities.


MyData

Open position: Legal and policy specialist/ ecosystems specialist

Job title: Legal and policy specialist / ecosystems specialist
Employment type: Fixed contract
Contract duration: March 2024 through 31 March 2026, with opportunity for renewal.
Location: Remote, based in the EU and with a preference for Oslo, or Helsinki.
Reports to: Executive Director

Role description: The ecosystems specialist is responsible for advancing MyData’s work to facilitate the emergence of […]

Tuesday, 12. March 2024

MOBI

Standardized Web3 Solution for Vehicle Registration, Titling, and Liens


Standardized Web3 Solution for Vehicle Registration, Titling, and Liens

Stay tuned for updates!

About Our Web3 Cross-Industry Interoperability Pilots

Alongside our global community, we’ve demonstrated several potential use cases for Citopia and Integrated Trust Network (ITN) services through various pilot projects. Together, Citopia and the ITN provide the necessary infrastructure for node operators to build out secure, seamless, globally compliant web services and applications. MOBI membership is required to operate a node on Citopia and/or the ITN. Contact us to learn more about becoming a node operator

Overview of the Pilot and the Problem It Solves

Across the United States, there is a diverse array of jurisdictions (numbering in the thousands across states, counties, and municipalities) and titling/registration service providers, each maintaining unique databases and processes for vehicle registration and titling. Many states (AZ, DE, GA, FL, LA, MA, MD, NC, SC, PA, VA, and WI) currently mandate the use of electronic lien and title (ELT) systems. Other states have planned ELT mandates in 2024, or more generally are developing a digital approach to electronic vehicle titling. For example, New York and Idaho have or are developing processes for dealer reassignments electronically.

Each of these jurisdictions will maintain their own systems for these varied processes. The challenge lies in achieving interoperability between those systems through standardized communications and data reporting/exchange across jurisdictional, platform, and organizational lines while enabling each jurisdiction to maintain control over its processes. For example, today, each vehicle manufacturer or lender can have hundreds of unique identifiers assigned to them by different jurisdictions, creating confusion, mismanagement, and inefficiency.

Currently, secure digital authentication and communication rely on identifiers issued by centralized platforms to prove their credentials. However, in addition to being vulnerable to fraud, identity theft, and data leaks, centralized approaches to identity management fail to address the trust problems created by the rise of decentralized services, IoT, and Generative AI. As digitization advances, it will become increasingly challenging — and costly — to verify data authenticity, secure digital perimeters, and ensure cross-border regulation compliance. This is critical for state agencies like MVAs as well as dealers and lenders, who are responsible for executing the bulk of the registration/titling process.

Stakeholders: Vehicle Manufacturers (OEMs); Financial Institutions (FIs)/Lenders; Servicers; Dealerships; Motor Vehicle Authorities (MVAs)/Third-party Registration/Titling Providers (RTPs); State Authorized Inspectors; Third-Party Data Consolidators; Fleet Operators; Trade Associations; Vehicle Auctions; and Consumers.

Our Innovative Solution

Overcoming these challenges calls for a new solution. The White House’s Federal Zero Trust Strategy (2022) mandates that federal agencies and any organization that works with the federal government adopt a Zero Trust framework by the end of FY 2024. Zero Trust requires every entity to authenticate and validate every other entity for every single digital communication at all times. Since this is not possible at scale through Web2/centralized means, Web3 technologies and principles must be leveraged.

MOBI and its members have developed platform-agnostic standardized “universal translators” that work with any existing legacy system or web service to enable cross-industry interoperable communication through the World Wide Web Consortium (W3C) decentralized identity and verifiable credential framework, called Citopia Passports (Web3 Plug-and-Play). Citopia Passports ensure organizations’ and customers’ data privacy, which is key for complying with the comprehensive data privacy laws being passed by many states (e.g., CA, CT, OR, TX, UT, VA).

Explore Cross-Industry Interoperability Requirements

Interested in learning more? Dive deeper on our Web3 Infrastructure Page!

Zero Trust Authentication: Cross-industry interoperability requires claims and identities to be verified for each transaction to ensure maximum security. Read the Federal Zero Trust Strategy

Infosec & Selective Disclosure: Participants must be able to selectively disclose information for transactions at the edge. Verification must be done at the moment of transaction to eliminate the need for PII storage.

Scalability and Extensibility: Cross-industry interoperability requires a shared standards-based framework to enable the creation of globally scalable multiparty applications.

Data Privacy Compliance: Cross-industry interoperability requires (1) compliance with existing global data privacy regulations and (2) the flexibility to comply with future directives.

Global Standards: Cross-industry interoperability requires a standardized system for frictionless data exchange and collaboration while allowing stakeholders to retain their legacy systems.

Decentralization: Cross-industry interoperability requires a community-owned and -operated infrastructure to (1) prevent monopolization and (2) enable consensus-based trust.

Web3 Plug-and-Play

Citopia Passports utilize W3C Decentralized Identifiers (DIDs) and Verifiable Credentials (VCs) standards. This creates an interoperability framework that provides:

Explore Citopia Passports

Decentralized, trusted identities

Digital credential issuance/verification

Interoperable communication between each stakeholder’s centralized databases

A bridge between jurisdictions, organizations, and platforms allowing each stakeholder to keep their legacy systems
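
For a sense of what such a credential looks like on the wire, here is the rough shape of a W3C Verifiable Credential of the kind a Citopia Passport could carry. All identifiers, the credential type, and the claim values below are invented for illustration; only the general structure follows the W3C VC Data Model.

```python
# Illustrative shape of a W3C Verifiable Credential (made-up DIDs, type, and claims).
example_credential = {
    "@context": ["https://www.w3.org/2018/credentials/v1"],
    "type": ["VerifiableCredential", "LienReleaseCredential"],   # hypothetical type
    "issuer": "did:example:lender-123",
    "issuanceDate": "2024-03-12T00:00:00Z",
    "credentialSubject": {
        "id": "did:example:vehicle-owner-456",
        "vin": "1HGCM82633A004352",
        "lienStatus": "released",
    },
    "proof": {
        "type": "Ed25519Signature2020",
        "verificationMethod": "did:example:lender-123#key-1",
        "proofValue": "...",   # signature omitted in this sketch
    },
}

# A relying party (e.g., an MVA) would resolve the issuer DID, fetch the key named in
# verificationMethod, and verify proofValue over the credential contents.
```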

The result is the reduction of errors, streamlined operations, increased efficiency, and reduced costs, as well as greatly improved permissioned access to data. More generally, this cross-industry, platform-agnostic, universal interoperability is part of what has motivated government interest worldwide in implementing and adopting standards-based digital identity and credential systems (e.g., the Department of Homeland Security (DHS) in the US; European Union Agency for Cybersecurity (ENISA) and European Self-Sovereign Identity Framework (ESSIF) in the EU).

Proposed Stakeholder Meeting

MOBI is proposing a two-part meeting in the first half of 2024: part one being a meeting between the association stakeholders (e.g. AAMVA, NADA, ATAEs, NIADA, AFSA, MOBI) and their representative members, and part two being a meeting including the titling service providers. The goals of the meeting are:

- to bring together the key stakeholders to assess the pain points, needs/requirements, and path forward to achieve interoperability between the numerous centralized systems for registration/titling
- to jointly address the opportunity to develop standardized communication between each stakeholder to achieve interoperability for registration/titling processes
- to discuss how secure, verifiable digital identifiers and claims (using open-standard Web3 technologies) can address fundamental problems, such as each lender having hundreds of different identifiers assigned to them by different jurisdictions
- to finalize the scope and scale of the Standardized Web3 Solution for Titling/Registration Pilot

Pilot Planning

In Phase 1 of the Pilot, the FSSC WG will demonstrate privacy-preserving cross-industry interoperability for Titling/Registration via standardized universal identifiers and communication/claims without the need to build new infrastructure. This will involve working with MVAs, lenders, dealers, OEMs, and service providers to demonstrate interoperability across different legacy systems and jurisdictions. At the end of Phase 1, stakeholders will have successfully created Citopia Passports and be able to use their Citopia Passport to easily authenticate each other’s identifiers and claims (such as lien release, odometer disclosures, insurance validation, etc.). Stakeholders will be able to examine the code and outputs to verify that all transactions/communications are private and only visible to the intended recipient.

In Phase 2 of the Pilot, each stakeholder will have the opportunity to run nodes, conduct research and development for their own applications, and actively participate in the pilot for a duration of 6-12 months. The FSSC WG will determine the final scope of Phase 2 after the conclusion of Phase 1.

MOBI WEB3 INFRASTRUCTURE

Explore the Future of
Cross-Industry Interoperability

Together, Citopia and the Integrated Trust Network (ITN) form our federated Web3 infrastructure for verifiable identity, location, and business automation. Learn more

JOIN MOBI

Learn How Your Organization Can Get Involved

Join our community to help shape the future of interoperability, accelerate the adoption of cutting-edge tech, and define a new era of digital trust! Submit an inquiry

Dive Deeper

Interested in learning more about MOBI, our community-owned and operated Web3 Infrastructure, and our interoperability pilots? Contact us at connect@dlt.mobi to get in touch with the team!

Get Involved

The post Standardized Web3 Solution for Vehicle Registration, Titling, and Liens first appeared on MOBI | The New Economy of Movement.

Monday, 11. March 2024

OpenID

Notice of Vote for Proposed Implementer’s Draft of OpenID for Verifiable Credential Issuance


The official voting period will be between Monday, March 25, 2024 and Monday, April 1, 2024, once the 45 day review of the specification has been completed. For the convenience of members who have completed their reviews by then, voting will actually begin on Monday, March 18, 2024.

The OpenID Connect Working Group page is https://openid.net/wg/connect/. If you’re not already a member, or if your membership has expired, please consider joining to participate in the approval vote. Information on joining the OpenID Foundation can be found at https://openid.net/foundation/members/registration.

The vote will be conducted at https://openid.net/foundation/members/polls/328.

The post Notice of Vote for Proposed Implementer’s Draft of OpenID for Verifiable Credential Issuance first appeared on OpenID Foundation.


Identity At The Center - Podcast

In the latest episode of the Identity at the Center Podcast,


In the latest episode of the Identity at the Center Podcast, we had the pleasure of speaking with Nick Mothershaw, Chief Identity Strategist at the Open Identity Exchange (OIX). We discussed the concept and functionality of digital wallets, the role of governments in issuing these wallets, and the future of smart and roaming wallets. This was a truly fascinating conversation, and I'm sure you'll find it as insightful as we did. If you're interested in the evolving landscape of identity security, this is one episode you don't want to miss!

You can listen to the episode at idacpodcast.com or on your favorite podcast app.

#iam #podcast #idac

Saturday, 09. March 2024

Elastos Foundation

Elastos Bi-Weekly Update – March 9th, 2024


In the latest Elastos Bi-Weekly Update, significant progress has been made across different areas of Elastos. Let’s take a look at some of the BeL2 and Elacity innovations!

BeL2

Implementation of Consensus Circuit for BTC Transactions: A significant milestone has been achieved with the implementation of a consensus circuit based on Cairo 0. This circuit is designed to perform basic checks on Bitcoin (BTC) Legacy address transactions. It includes validation of elliptic curve signatures and unspent Transaction Outputs (UTXO) checks, among other crucial verification steps.

Zero-Knowledge Proof Verification Contract: A verification contract for zero-knowledge proof of Bitcoin transactions has been successfully implemented. This contract enables the demonstration that a given transaction has passed through the consensus circuit, thereby completing the technical feasibility verification phase.

Schnorr Signature Verification Circuit: With the introduction of the Schnorr signature verification circuit, based on Cairo 1, the groundwork has been laid for supporting advanced BTC transactions, including those involving Taproot addresses and Ordinals. This is a foundational step toward enhancing transaction security and efficiency on the blockchain.

BTC Oracle Development: The objective is to create a BTC Oracle capable of generating Zero-Knowledge Proofs for all types of BTC transactions. These proofs can then be submitted to the Ethereum Virtual Machine (EVM) smart contracts for verification. The development team has successfully implemented zero-knowledge proof for legacy address transactions using Cairo version 0. This achievement marks a significant step towards building a comprehensive BTC Oracle framework that will eventually support all BTC OP Codes, Segwit transactions, Schnorr signatures, and Taproot transactions.

Smart Contract and Proof Verification: The development includes smart contracts and tools for verifying BTC transactions in a decentralized manner. This includes the creation and validation of Merkle proofs for BTC transactions (see the sketch after this list), enabling the secure and efficient handling of BTC assets within the Elastos ecosystem.

Infrastructure Enhancements and Tools: The deployment and improvement of various infrastructure components and tools have been noted. This includes the development of contracts for asset exchange, order management, and fraud proof submission. These components are essential for the robust operation of the Elastos infrastructure, ensuring a secure, efficient, and decentralized environment for asset exchange and transaction verification.

Elacity

Player Update for Flexible Media Streams: The player’s capability has been enhanced to accommodate a broader range of media stream combinations. It now supports playing audio-only or video-only streams, handling multiple streams by selecting the first one available. This update addresses the previous limitation where the player would break if the media was not formatted with one audio and one video stream. This refactoring ensures a more flexible and robust playback experience, catering to diverse media types. A unified signature notification system has been implemented, enhancing the user experience across the platform.

Adaptive Streaming Support: Significant work on adaptive streaming support has been completed, ensuring that video playback can dynamically adjust to various internet speeds and device capabilities, optimizing the viewing experience.

Android Connection Flow: Enhancements in the connection flow on Android devices have been made to improve usability and performance.

ABR Selection Flow: An adaptive bitrate (ABR) selection flow has been developed to further enhance the streaming quality based on the user’s current network conditions.

NFT Marketplace Updates: Updates to the filter in the mobile view for the NFT marketplace have been implemented, alongside adjustments to how NFTs opened from search are viewed or routed. Efforts have been made to address sync issues with NFTs, ensuring that collection displays and NFT minting processes are seamless and intuitive.

Quality Assurance and Final Preparations: Pre-release testing and quality assurance checks have been conducted, including code reviews and fixes for specific transaction failures and playback issues. Preparations for the release include addressing feedback on collection cover image changes, and ensuring that the mobile filter pop-up experience is consistent across all collection pages. Work on the backend includes fixing RPC call errors, addressing DRM playback issues on iOS, and researching efficient deployment strategies for IPFS nodes.
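The update above does not include code, so as a rough, non-authoritative illustration of what validating a Bitcoin Merkle proof involves (the mechanism mentioned under "Smart Contract and Proof Verification"), here is a minimal TypeScript sketch; the function and variable names are ours and are not taken from the Elastos codebase:

import { createHash } from "crypto";

// Bitcoin hashes with double SHA-256.
const sha256d = (data: Buffer): Buffer =>
  createHash("sha256").update(createHash("sha256").update(data).digest()).digest();

// Verify that a txid is included in a block by folding the Merkle branch
// (sibling hashes from leaf to root) into the expected Merkle root.
// Hashes are hex strings in display (big-endian) order, as block explorers show them.
export function verifyMerkleProof(
  txidHex: string,
  branchHex: string[], // sibling hashes, leaf level first
  index: number,       // position of the tx in the block
  merkleRootHex: string
): boolean {
  // internally Bitcoin concatenates hashes in little-endian byte order
  let hash = Buffer.from(txidHex, "hex").reverse();
  for (const siblingHex of branchHex) {
    const sibling = Buffer.from(siblingHex, "hex").reverse();
    // even index: current hash is the left node; odd index: it is the right node
    hash = index % 2 === 0
      ? sha256d(Buffer.concat([hash, sibling]))
      : sha256d(Buffer.concat([sibling, hash]));
    index = Math.floor(index / 2);
  }
  return hash.reverse().toString("hex") === merkleRootHex;
}

A proof of this shape lets a verifier confirm a transaction's inclusion in a block using only the block header's Merkle root, which is the general property decentralized BTC verification relies on without running a full node.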

Interested in staying up to date? Follow Elastos here and join our live telegram.

The post Elastos Bi-Weekly Update – March 9th, 2024 appeared first on Elastos.

Friday, 08. March 2024

FIDO Alliance

TeleMedia Online: Should All Mobile Business Apps Scrap Passwords and Integrate Biometrics?


Now that all the most advanced mobile devices on the market offer biometric authentication, it’s a good opportunity for apps to align with this and integrate it. FIDO Alliance reported that around 80 percent of data leaks are linked to passwords, so it would be useful for a better alternative to become more widespread.


Security Magazine: Cyber Insights 2024: A Dire Year for CISOs?


“CISOs are too often overlooked or low on resources, funding and/or business support to properly implement change,” adds Andrew Shikiar, executive director at FIDO. “Resting the legal liability on one individual is overlooking the vacuum of responsibility and engagement at the top of organizations that is preventing meaningful change and true cyber resilience.”


Biometric Update: FIDO Alliance ensures long-term value of its specifications in post quantum era


The FIDO Alliance is actively involved in integrating PQC into its standards to ensure long-term efficacy and security, forming working groups to understand the implications and develop migration strategies. With the addition of Prove Identity to its Board of Directors, the coalition continues its mission of shaping future standards for identity authentication.


Engadget: 1Password adds passkey support for Android


Passkey adoption is on the rise, showcased by 1Password’s support of passkeys for Android devices to provide a more secure alternative to traditional passwords through the use of public and private keys.


Human Colossus Foundation

Securing Your Digital Future: A Three-Part Series on Enhanced Privacy through Data Protection - Part 1

In 'Securing Your Digital Future,' Part 1 of this three-part series unveils the pivotal role of the Blinding Identity Taxonomy (BIT) and its Supplementary Document in fortifying data privacy. Emphasizing the critical need to protect sensitive personal data, we explore the foundation of data semantics—bolstered by the BIT framework crafted by the Human Colossus Foundation and backed by Kantara
Part 1: Understanding the Semantic Foundation of Privacy: The Critical Role of BIT and Its Supplementary Document in Data Protection

In the rapidly evolving digital landscape, the significance of data protection has never been more pronounced. Recent developments, such as the presidential order issued by the White House on February 28th, 2024, to prevent access to sensitive personal data by overseas 'bad actors,' underscore the urgency of safeguarding personal information from exploitation. This context sets the stage for a pivotal conversation on protecting sensitive data from a data semantics perspective—the cornerstone of understanding and interpreting data correctly across diverse systems and stakeholders.

Data semantics supports data interpretability, clarity, and consistency in the digital realm. It includes utilizing data models, vocabularies, taxonomies, ontologies, and knowledge representation to accurately recognize and interpret Personally Identifiable Information (PII) and sensitive data, ensuring that digital entities comprehend the sensitivity of this information, irrespective of their domain. The Blinding Identity Taxonomy (BIT) emerges as a beacon of guidance in data protection, supporting the fight against intrusive surveillance, scams, blackmail, and other privacy violations.

Celebrating the BIT and Its Evolution

Developed by the Human Colossus Foundation (HCF) and supported by Kantara Initiative, the BIT provides a robust framework for identifying and flagging sensitive information within data sets. Its purpose is not just to adhere to privacy laws such as GDPR and CCPA but to fortify the semantic understanding of what constitutes 'sensitive data.' The BIT involves a nuanced comprehension of data attributes that, if mishandled, could lead to privacy breaches or misuse.

With notable contributions from Paul Knowles, Chair of the HCF Decentralised Semantics WG, the BIT Supplementary Document significantly enhances the comprehension of the taxonomy. As an active contributor to the Dynamic Data Economy (DDE), HCF transferred the intellectual property rights of the newly released BIT Supplementary Document on December 13th, 2023, to Kantara Initiative, a global community focused on improving the trustworthy use of identity and personal data. Although not yet incorporated into regulations like GDPR, CCPA, or similar national regulations as an official appendix, the BIT Supplementary Document's publication as an official Kantara Initiative report on March 5th, 2024, significantly enhances the BIT's utility by offering detailed insights into the BIT categories.

The release of the BIT Supplementary Document marks a significant advancement in this journey. Offering detailed insights into the 49 BIT categories, it serves as an indispensable manual for practitioners aiming to navigate the complexities of data protection. It not only enumerates what constitutes sensitive information but also elaborates on how to interpret and handle this data, ensuring semantic integrity across systems. The BIT is the world's most comprehensive taxonomy for preventing re-identification attacks, with the Supplementary Document adding further depth and clarity.

Flagging Sensitive Attributes: A Semantic Safeguard

As the BIT report recommends, flagging sensitive attributes in a schema capture base is a practice rooted in semantic precision. This approach enables data protection officers and schema issuers to identify elements that demand cryptographic encoding, thereby minimizing the risk of re-identifying a data principal. Flagging acts as a semantic annotation, marking data with an additional layer of meaning about its sensitivity or risk level, which aids compliance with data protection regulations and enhances the semantic coherence of data handling practices.

By utilizing the BIT and its Supplementary Document, practitioners have a common guideline for determining which attributes to flag. This standard practice ensures that sensitive data is understood and interpreted consistently, avoiding ambiguities that could lead to data breaches. The BIT framework empowers practitioners to embed data protection principles directly into their semantic models, making privacy a foundational aspect of data interpretation.
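As a minimal, hypothetical sketch of what flagging can look like in practice (the attribute names, types, and flagged list below are illustrative only and are not taken from the BIT or any specific schema tooling):

// Illustrative only: a simple schema "capture base" whose sensitive attributes
// are flagged so downstream processors know they require cryptographic encoding.
interface CaptureBase {
  attributes: Record<string, "Text" | "DateTime" | "Numeric">;
  flagged: string[]; // attribute names that fall under a BIT category
}

const patientIntake: CaptureBase = {
  attributes: {
    fullName: "Text",        // names
    dateOfBirth: "DateTime", // dates directly related to an individual
    postcode: "Text",        // location data
    visitReason: "Text",     // not flagged
  },
  // A data protection officer or schema issuer flags these after consulting
  // the BIT and its Supplementary Document.
  flagged: ["fullName", "dateOfBirth", "postcode"],
};

// Downstream, any attribute in `flagged` would be encrypted, hashed, or blinded
// before the record leaves a trusted boundary.
const requiresBlinding = (attr: string) => patientIntake.flagged.includes(attr);

The point of the sketch is simply that sensitivity is recorded alongside the schema itself, so every consumer of the data interprets the same flags in the same way.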

Conclusion: The Semantic Imperative for Data Protection

In a digitally interconnected world, we cannot overstate the importance of data semantics as we navigate the complexities of data protection. The BIT and its Supplementary Document offer a comprehensive framework for understanding and protecting sensitive data, grounding data protection in semantic precision. As we move forward, we encourage individuals, organizations, and ecosystems to embrace these tools, ensuring that sensitive information is flagged, protected, and interpreted carefully.

BIT Supplementary Document

The BIT and its Supplementary Document enrich our toolkit for privacy preservation. The BIT is accessible in PDF and HTML formats, catering to diverse user preferences. Those seeking deeper insights can download the BIT Supplementary Document in PDF format from Kantara Initiative's Reports & Recommendations page. This invaluable resource resides under the 'Kantara Initiative Reports' section, clearly labeled as "Supplementary Report to Blinding Identity Taxonomy Report," ensuring straightforward access for all interested parties.

Stay tuned for Part 2 of this three-part series, where we will delve into the crucial aspect of data governance. We will explore how to implement BIT guidelines for protecting sensitive personal information from a data administration vantage point. Our discussion will navigate the governance frameworks and practices that ensure these recommendations are not just theoretical ideals but are effectively integrated into the operational fabric of organizations and distributed data ecosystems, safeguarding privacy at every turn.


OpenID

OpenID Foundation Certification Program Recruiting a Java Developer


The OpenID Foundation is pleased to announce that it is looking to add a Java developer to the successful OpenID certification program team. The OpenID Foundation enables deployments of OpenID specifications to be certified to specific conformance profiles to promote interoperability among implementations. The certification process utilizes self-certification and conformance test suites developed by the Foundation.

The Foundation is seeking a consultant (contractor) to join the team on a part- to full-time basis based on availability. This team member will provide development, maintenance, and support services to the program that include but are not limited to implementing new tests, addressing conformance suite bugs, and updating existing conformance test suites.

SKILLS:

Strong and documented experience with Java or a similar language
Some knowledge of OAuth 2 / OpenID Connect / OpenID for Verifiable Credentials / SIOPv2 / FAPI / JWTs (with an interest in becoming more proficient in these standards)
An interest in security & interoperability
Experience participating in relevant standards working groups (e.g. IETF OAuth, OpenID Connect, OIDF Digital Credentials Protocols, and/or FAPI) is a bonus
Experience with one or more of the OpenID Certification conformance suites is a bonus


TASKS:

Development tasks include:
Developing new test modules
Updating existing conformance tests when changes to the specs are approved
Extending the conformance tests to work against servers in new ecosystems, including adding additional security / interoperability checks
Undertaking more extensive development tasks, including developing conformance tests for previously untested specifications
Reviewing code changes done by other team members
Pushing new versions to production as/when necessary & writing release notes
Investigating / fixing reported bugs in the conformance suite
Providing guidance to ecosystems that adopt OpenID Foundation specifications
Attending OIDF working group calls as/when necessary
Attending a 1-hour virtual team call every 2 weeks
Attending the annual team meeting that is usually adjacent to an industry event


If this opportunity is of interest, please send your resume and cover letter to director@oidf.org with the subject, “OIDF Certification Program Java Developer Opportunity”. Please include in your cover letter how your skills and experience align to the requirements outlined above, your available hours per month, including when you are available to start, and your hourly rate.

The post OpenID Foundation Certification Program Recruiting a Java Developer first appeared on OpenID Foundation.

Thursday, 07. March 2024

FIDO Alliance

Mercari’s Passkey Authentication Speeds Up Sign-in 3.9 Times


Mercari, Inc. is a Japanese e-commerce company, offering marketplace services as well as online and mobile payment solutions. With Mercari users can sell items on the marketplace, and make purchases in physical stores. In 2023, they implemented passkeys. This article will explain the motivation behind their decision and the results they achieved.

Motivation

Previously, Mercari used passwords and, faced with real-time phishing attacks, added SMS OTPs as an authentication method to protect its users. While this improved security, it did not completely eliminate real-time phishing attacks. Sending a high volume of SMS OTPs was also both expensive and not very user-friendly.

Mercari also had a new service, Mercoin, a platform for buying and selling Bitcoin with a user’s available Mercari balance. Mercoin had strong security requirements, and passkeys met them.

Because passkeys are bound to a website or app’s identity, they’re safe from phishing attacks. The browser and operating system ensure that a passkey can only be used with the website or app that created it. This frees users from being responsible for signing in only to the genuine website or app.

Requiring users to use extra authentication methods and perform additional actions is an obstacle when what users actually want is to accomplish something else in the app.

Adding passkey authentication removes that additional step of SMS OTP and improves user experience while also providing better protection for users from real-time phishing attacks and reducing the cost associated with SMS OTPs.
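The case study does not include code, but for context, a passkey is created in the browser with the WebAuthn API. The sketch below is a generic, minimal example, not Mercari's implementation; the relying-party values are placeholders, and in a real deployment the challenge and user identifiers come from the server:

// Minimal sketch of creating a passkey in the browser with the WebAuthn API.
// The challenge and user fields must be issued by the relying party's server.
async function registerPasskey(challenge: Uint8Array, userId: Uint8Array) {
  const credential = await navigator.credentials.create({
    publicKey: {
      challenge, // server-generated, single-use
      rp: { name: "Example Marketplace", id: "example.com" }, // placeholder relying party
      user: { id: userId, name: "user@example.com", displayName: "Example User" },
      pubKeyCredParams: [
        { type: "public-key", alg: -7 },   // ES256
        { type: "public-key", alg: -257 }, // RS256
      ],
      authenticatorSelection: {
        residentKey: "required",      // discoverable credential (passkey)
        userVerification: "required", // biometric or device PIN
      },
    },
  });
  // The attestation response is then sent to the server for verification and storage.
  return credential as PublicKeyCredential;
}

Because the resulting credential is scoped to the relying party's domain, a phishing site on another domain cannot request or use it, which is the property that removes the need for the SMS OTP step.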

Results

900,000 Mercari accounts have registered passkeys and the success rate of signing in with them is 82.5% compared to 67.7% success rate for signing in with SMS OTP.

Signing in with passkeys has also proved to be 3.9 times faster than signing in with SMS OTP–Mercari users on average take 4.4 seconds to sign in with passkeys, while it takes them 17 seconds to do the same with SMS OTP.

The higher the success rate of authentication and the shorter the authentication time, the better the user experience, and Mercari has seen great success with implementing passkeys.

Learn more about Mercari’s implementation of passkeys

To learn more about how Mercari solved the challenges of making a phishing resistant environment with passkeys, read their blog on Mercari’s passkey adoption.

Download Case Study

We Are Open co-op

The Power of Community Knowledge Management

Celebrating Open Education Week 2024

A couple of days ago we ran our fourth Community Conversations session. This one was timed to coincide with Open Education Week, an initiative of OE Global created as “an annual celebration [and] opportunity for actively sharing and learning about the latest achievements in Open Education worldwide”.

Our focus was on managing knowledge in communities. The version in the video above is a shortened version of the session, which we recorded without the activities. This blog post contains most of the information in the recording.

What is Knowledge?

Community is key to open education, with an often-overlooked aspect of community management and evolution being how knowledge is stewarded within such networks.

Image by gapingvoid

Let’s start with the above image, showing the difference between terms and concepts that are sometimes used interchangeably, but actually mean different things.

When we talk about community knowledge we’re talking about connecting the dots between information being shared between members. This can turn into insight through a process of reflection, and wisdom by connecting together different insights.

In practice, nothing is ever as simple as the process shown in the above diagram. However, it’s a convenient way to tease apart some of the subtleties.

A Simple, Homely Example

I went on holiday with my family recently. We ‘favourited’ some places on Google Maps as part of our planning, to help us navigate while we were there, and to be able to share what we enjoyed with others afterwards.

Screenshot of Google Maps showing ‘favourited’ and ‘bookmarked’ places in Faro, Portugal

What’s represented on the above screenshot is a bunch of data arranged on a map. When you click on each point, there is further information about each place. If I put these together into an itinerary, this could be considered a form of knowledge.

This is a form of community knowledge management on a very small scale: the community represented by my nuclear family, my extended family and friends, and potentially those people who might in future ask for recommendations on what to do in Faro, Portugal.

Other proprietary tools that might be used to store data and information with others include Trello and Pinterest. You are curating these things as individuals for a particular purpose, but there is not necessarily an effort to connect together the dots in any meaningful way.

Community Knowledge Management

So, what’s the difference between what we’ve discussed so far and managing knowledge within communities?

In this case, we’re specifically talking about Communities of Practice, which we discuss in the first three Community Conversations workshops. Briefly put, they can be defined in the following way:

“Communities of Practice are groups of people who share a concern or a passion for something they do and learn how to do it better as they interact regularly.” (Etienne Wenger)

Harold Jarche has a very clear diagram that he uses regularly in his work around Personal Knowledge Management (PKM) to explore the differences between the spaces in which we interact:

Image via jarche.com/pkm

We’re interested in the middle oval in this diagram, with Communities of Practice (CoPs) overlapping with ‘Work Teams’ and ‘Social Networks’. While we might build knowledge within the walls of our organisations, and share things online with strangers, CoPs are intentional spaces for us to build knowledge between organisations with people we get to know better over time.

Rosie Sherry defines Community Knowledge Management in the following way:

Community Knowledge Management is a process for collaboratively collecting information, insights, stories and perspectives with the goal of supporting a community and the ecosystem with their own learning and growth. (Rosie Sherry)

Although she doesn’t mention it explicitly, the implication of “collecting information, insights, stories, and perspectives” is that we not only share knowledge, but also co-create it.

Tools for Community Knowledge Management

The new version of the Participate platform, to which we are migrating the ORE community, is organised around three types of ‘thing’: Badges, Events, and Docs.

This is useful for keeping communities organised. But what if you’ve got a lot of information, almost books’ worth, that you need to organise? In this case, it’s worth looking at another tool that augments your community’s ‘home’ and provides some more specialised features.

As you would expect from an organisation entitled We Are Open Co-op, we’re interested in working openly, using openly-licensed resources, open source tools, and cooperating with others. That means we’re going to point towards Open Source software in this section that we know, have used, and trust.

Here are three examples of the types of platforms which can host knowledge created in CoPs:

Wikis — everyone knows Wikipedia, but any organisation or community can have a wiki! You can use the same software, called MediaWiki, or one of many alternatives (we use wiki.js).
Forums — these are easily searchable, so they can be used to capture useful information as part of conversations. We’re big fans of Discourse and have used it for several client projects.
Learning Management Systems (LMS) — these can be used to capture information, especially if your community is based around educational resources. Our go-to for this is Moodle.

For the sake of brevity, and to point to our own example, we’re going to show our use of MediaWiki to form Badge Wiki. This has been around for over six years at this point, and serves as a knowledge base for the Open Badges and wider digital credentials community.

Community Knowledge Contribution

There are behaviours around this knowledge repository that overlap with those inside the main community platform. But there are also others, specific to it. For example:

Community Calls specifically focused on discussing and planning elements of Badge Wiki.
Barn raisings which focus on co-creation of pages to help establish the knowledge base.
Asynchronous discussions to talk about strategy, and to catch up between synchronous events such as the previous two.
Pro-social behaviours are encouraged and recognised through the use of badges.

To dig into the last of these, we know that there are all kinds of reasons why people contribute to Open Source and community projects. We just want to give them a reason to keep doing so.

Image taken from work WAO did with Greenpeace. See more in this post.

We created a range of badges specifically focused on the community knowledge base. There are attendance badges, for example for the barn raising (and for attending multiple times), but also badges for particular actions such as authoring pages, tidying up existing content, and making it look better!

Images CC BY-ND Visual Thinkery for WAO

Once you’ve got a knowledge base, you can run projects on top of it. So when an ORE community member mentioned that it would be useful to have a ‘toolkit’ for helping people understand Open Recognition… Badge Wiki was the obvious place for it to live!

We launched v0.1 of the Open Recognition Toolkit at ePIC 2023 in Vienna. As it’s a wiki, this can be easily iterated over time with multiple authors — who can contribute as little or as much as they want.

There’s so much more we could say, but there’s no substitute for practice! Whether you’re planning to start a new community, in the midst of setting one up, or stewarding an existing one, it’s important to think about good practices around Community Knowledge Management.

Being intentional and inclusive about what kind of knowledge is captured and shared within communities is crucial. It’s powerful to pool resources and to help generate insights; it helps to provide impact. It also helps fulfil the needs of different members of the community and helps increase the diversity and richness of who gets involved — and how.

If you would like a thought partner for this kind of work, why not get in touch and have a chat with the friendly people at WAO? The first 30 min call is free of charge, and we’ll do our best to help, or point you towards someone who can!

CC BY-ND Visual Thinkery for WAO

The Power of Community Knowledge Management was originally published in We Are Open Co-op on Medium, where people are continuing the conversation by highlighting and responding to this story.

Wednesday, 06. March 2024

Ceramic Network

Building Points on Ceramic - an Example and Learnings

We built a Web3 points application on Ceramic to explore the design considerations a successful system requires.

We've made the case in a recent blog post that web3 point systems align the incentives of platforms and their users, acting as reputation systems that allow participants to draw inferences between who's creating value and who's likely to receive rewards for their actions. More importantly, these systems help participants understand what user interactions matter to applications using points. And while points often manifest as objects referred to by different names (badges and attestations, for example), there's a commonality across these implementations relevant to their verifiability.

Why Points and Ceramic?

Points data requires properties allowing consumers of points (traditionally the same applications issuing them) to trust their provenance and lineage. This is unsurprisingly why most Web3 points systems today are built on centralized rails - not only is a simple Postgres instance easy to spin up, but the only data corruption vulnerability would result from poor code or security practices.

For readers familiar with Ceramic's composability value proposition, it's likely obvious why we view web3 point systems (and reputation systems more broadly) as ideal Ceramic use cases. Not only does Ceramic offer rich query capabilities, data provenance and verifiability promises, and performance-related guarantees, but both end users and applications benefit from portable activity. We foresee an ecosystem where end users can leverage the identity they've aggregated from one application across many others. In turn, applications can start building on user reputation data from day one.

To put this into practice, we built a scavenger hunt application for EthDenver '24 that allowed participants to collect points based on in-person event attendance.

A Scavenger Hunt for Points

Ceramic was officially involved in 8 or so in-person engagements this year at EthDenver, some of which were cosponsored events (such as Proof of Data and Open Data Day), while others were cross-collaborations between Ceramic and our partners (for example, driving participants to check in at official partner booths at the official EthDenver conference location). The idea was simple - participants would collect points for checking in at these events, and based on different thresholds or interpretations of participant point data (for example, participants with the most event check-ins) would be eligible for prizes.

To make this happen, we ideated on various patterns of data control and schema design that presented the best balance of trade-offs for this use case. In simple terms, we needed to:

Track event attendance by creating or updating abstractions of that activity in Ceramic
Provide a crypto-native means for participants to self-identify to leverage Ceramic-native scalar types
Secure the application against potential spoofing attempts
Collect enough information necessary to perform creative computation on verifiable point data

We were also presented with several considerations. For example, should we go through the effort to emulate a user-centric data control design whereby we implement a pattern that requires additional server-side verification and signed data to allow the end user to control their Ceramic point data? Or what's the right balance of data we should collect to enable interesting interpretations (or PointMaterializations) to be made as a result of computing over points?

Architecting Document Control

Before we jump in, reading our blog post on Data Control Patterns in Decentralized Storage would help provide useful context. As for the problem at hand, two options stand out as the most obvious ways to build a verifiable points system on open data rails:

1. Reconstruct the approach that would be taken on traditional rails (the application is the author and controller of all points data they generate). This makes the data easy to verify externally based on the Ceramic document controller (which will always be the same), and data consumers wouldn't have to worry about end users attempting to modify stream data in their favor.
2. Allow the end users to control their points data on Ceramic. In this environment, we'd need a flow that would be able to validate the existing data had been "approved" by us by verifying a signed payload, then update the data and sign it again before having the user save the update to their Ceramic document, thus ensuring the data is tamper-evident.

You might've guessed that the second option is higher-touch. At the same time, a future iteration of this system might want to involve a data marketplace that allows users to sell their points data, requiring users to control their data and its access control conditions. For this reason and many others, we went with #2. We'll discuss how we executed this in the sections below.

What Data Models Did We Use?

When we first started building the scavenger hunt application the SET accountRelation schema option had not yet been released in ComposeDB (important to note due to the high likelihood we would've used it). Keep that in mind as we overview some of the APIs we built to check if existing model instances had been created (later in this article).

In discussing internally how points data manifests, we decided to mirror a flow that looked like trigger -> point issuance -> point materialization. This means that attending an event triggers issuing point data related to that action. In response, that issuance event might materialize as an interpretation of the weight and context of those points (which could be created by both the application that issued the points and any other entity listening in on a user's point activity).

As a result, our ComposeDB schemas ended up like this:

type PointClaims
  @createModel(accountRelation: LIST, description: "A point claim model")
  @createIndex(fields: [{ path: ["issuer"] }]) {
  holder: DID! @documentAccount
  issuer: DID! @accountReference
  issuer_verification: String! @string(maxLength: 100000)
  data: [Data!]! @list(maxLength: 100000)
}

type Data {
  value: Int!
  timestamp: DateTime!
  context: String @string(maxLength: 1000000)
  refId: StreamID
}

type PointMaterializations
  @createModel(accountRelation: LIST, description: "A point materialization model")
  @createIndex(fields: [{ path: ["recipient"] }]) {
  issuer: DID! @documentAccount
  recipient: DID! @accountReference
  context: String @string(maxLength: 1000000)
  value: Int!
  pointClaimsId: StreamID! @documentReference(model: "PointClaims")
  pointClaim: PointClaims! @relationDocument(property: "pointClaimsId")
}

To provide more context, we built the application to create a new PointClaims instance if one did not already exist for that user, and update the existing PointClaims instance if one already existed (and, in doing so, append an instance of Data to the "data" field). I mentioned above that the SET accountRelation option would've likely come in handy. Since we were hoping to maintain a unique list of PointClaims that only had 1 instance for each user (where the issuer represents the DID of our application), SET would've likely been the preferred way to go to make our lives easier.

You'll also notice that an optional field called "refId" that takes in a StreamID value exists in the Data embedded type. The idea here was that issuing points might be in response to the creation of a Ceramic document, in which case we might want to store a reference pointer to that document. For our scavenger hunt example, this was the case - points were issued in recognition of event attendance represented as individual Ceramic documents:

type EthDenverAttendance
  @createModel(accountRelation: LIST, description: "An attendance claim at an EthDenver event")
  @createIndex(fields: [{ path: ["recipient"] }])
  @createIndex(fields: [{ path: ["event"] }])
  @createIndex(fields: [{ path: ["latitude"] }])
  @createIndex(fields: [{ path: ["longitude"] }])
  @createIndex(fields: [{ path: ["timestamp"] }])
  @createIndex(fields: [{ path: ["issuer"] }]) {
  controller: DID! @documentAccount
  issuer: DID! @accountReference
  recipient: String! @string(minLength: 42, maxLength: 42)
  event: String! @string(maxLength: 100)
  latitude: Float
  longitude: Float
  timestamp: DateTime!
  jwt: String! @string(maxLength: 100000)
}

Finally, take a look at the "issuer_verification" field in PointClaims and "jwt" field in EthDenverAttendance. Both fields were allocated to store the data our application verified + signed, represented as a base64-encoded string of a JSON web signature. For PointClaims, this entailed just the values within the "data" array (involving a verification, updating, and resigning process each time new point data needed to be appended).

Issuing Points - Data Flow

For the remainder of the article, feel free to follow along in the following public code:

https://github.com/ceramicstudio/fluence-demo

You'll notice two environment variables (SECRET_KEY and STRING) scoped only for server-side access, the first of which is meant to contain our secret 64-character seed from which we'll instantiate our application's DID (to be used for filtering PointClaims instances for documents where our application's DID is the issuer, as well as for verifying and signing our tamper-evident fields). To explain STRING, it might be helpful at this point if I dive a bit deeper into what we built to support the user flow.

Private PostgreSQL Instance (for Whitelisted Codes)

You'll notice that a findEvent method is called first in the useEffect lifecycle hook within the main component rendered on our post-login screen, which subsequently calls a /api/find route (which uses our STRING environment variable to connect to our PostgreSQL client). For this application, we needed to quickly build a pattern where we were able to both issue and verify codes corresponding to each in-person event that had been generated beforehand. This ties back to our planned in-person flow:

Participant scans a QR code or taps an NFC disc that contains the URL of our application + a parameterized whitelisted code that hasn't yet been used
The application checks the database to ensure the code hasn't yet been used

While in theory this part could've been built on Ceramic with an added layer of encryption, it was easier to stand this up quickly with a private Postgres instance.

Determining Participant Eligibility

If the call to /api/find determines that the code has not been used, findEvent then calls a createEligibility method, passing in the name of the event as the input variable. Notice that the first thing we do is call a getDID method, which calls a /api/checkdid server route that uses our SECRET_KEY variable to instantiate a DID and send us back the did:key identifier.

This is the second check our application performs to prevent cheating, whereby we query ComposeDB for EthDenverAttendance instances, filtering for documents where the signed-in user is the controller, where the event is the string passed into createEligibility, and where our application is the issuer (as evidenced by the DID).

Finally, if no matching document exists, we determine that the participant is eligible to create a badge.
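In ComposeDB terms, that eligibility check looks roughly like the query below (userDid, eventName, and appDid are placeholder variables, and composeClient is the same client instance used elsewhere in this post; the exact query and filter fields are in the linked repository):

// Sketch: has this user already claimed a badge for this event from our issuer DID?
const checkEligibility = async (userDid: string, eventName: string, appDid: string) => {
  const existing = await composeClient.executeQuery(`
    query CheckBadge {
      node(id: "${userDid}") {
        ... on CeramicAccount {
          ethDenverAttendanceList(
            filters: { where: { event: { equalTo: "${eventName}" }, issuer: { equalTo: "${appDid}" } } },
            first: 1
          ) {
            edges { node { id } }
          }
        }
      }
    }
  `);
  // No matching document means the participant is eligible to create a badge.
  return existing;
};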

Generating Points Data

While there's plenty to discuss related to generating and validating badge data, given that the pattern is quite similar when issuing points, I'll focus on that flow. The important thing to know here is that within both our createBadge and createFinal methods found in the same component mentioned above call an issuePoint method if a badge was successfully created by the user, passing in the corresponding value, context, and name of the event corresponding to that issuance.

What happens next is a result of our decision to allow the end user to control their points-related data, such that we:

Call an API route to access our application's DID

Call yet another /api/issue route, where we:

Query PointClaims to see if one already exists or not for the end user where our application is also the issuer:

const authenticateDID = async (seed: string) => {
  const key = fromString(seed, "base16");
  const provider = new Ed25519Provider(key);
  const staticDid = new DID({
    resolver: KeyResolver.getResolver(),
    provider,
  });
  await staticDid.authenticate();
  ceramic.did = staticDid;
  return staticDid;
};

// we'll use this both for our query's filter and for signing/verifying data
const did = await authenticateDID(SECRET_KEY);

const exists = await composeClient.executeQuery<{
  node: {
    pointClaimsList: {
      edges: {
        node: {
          id: string;
          data: {
            value: number;
            refId: string;
            timestamp: string;
            context: string;
          }[];
          issuer: {
            id: string;
          };
          holder: {
            id: string;
          };
          issuer_verification: string;
        };
      }[];
    };
  } | null;
}>(`
  query CheckPointClaims {
    node(id: "${`did:pkh:eip155:${chainId}:${address.toLowerCase()}`}") {
      ... on CeramicAccount {
        pointClaimsList(filters: { where: { issuer: { equalTo: "${did.id}" } } }, first: 1) {
          edges {
            node {
              id
              data {
                value
                refId
                timestamp
                context
              }
              issuer {
                id
              }
              holder {
                id
              }
              issuer_verification
            }
          }
        }
      }
    }
  }
`);

Use the data passed into the API's request body to sign and encode the values with our application's DID (if no PointClaims instance exists)

Decode and verify the existing values of "issuer_verification" against our application's DID before appending the new data, resigning, and re-encoding it with our application's DID (if a PointClaims instance does exist):

if (!exists?.data?.node?.pointClaimsList?.edges.length) {
  const dataToAppend = [{
    value: parseInt(value),
    timestamp: new Date().toISOString(),
    context: context,
    refId: refId ?? undefined,
  }];
  if (!refId) {
    delete dataToAppend[0]?.refId;
  }
  const jws = await did.createJWS(dataToAppend);
  const jwsJsonStr = JSON.stringify(jws);
  const jwsJsonB64 = Buffer.from(jwsJsonStr).toString("base64");
  const completePoint = {
    dataToAppend,
    issuer_verification: jwsJsonB64,
    streamId: "",
  };
  return res.json({ completePoint });
} else {
  const dataToVerify = exists?.data?.node?.pointClaimsList?.edges[0]?.node?.issuer_verification;
  const json = Buffer.from(dataToVerify!, "base64").toString();
  const parsed = JSON.parse(json) as DagJWS;
  const newDid = new DID({ resolver: KeyResolver.getResolver() });
  const result = parsed.payload ? await newDid.verifyJWS(parsed) : undefined;
  const didFromJwt = result?.payload
    ? result?.didResolutionResult.didDocument?.id
    : undefined;
  if (didFromJwt === did.id) {
    const existingData = result?.payload;
    const dataToAppend = [{
      value: parseInt(value),
      timestamp: new Date().toISOString(),
      context: context,
      refId: refId ?? undefined,
    }];
    if (!refId) {
      delete dataToAppend[0]?.refId;
    }
    existingData?.forEach((data: { value: number; timestamp: string; context: string; refId: string; }) => {
      dataToAppend.push({
        value: data.value,
        timestamp: data.timestamp,
        context: data.context,
        refId: data.refId,
      });
    });
    const jws = await did.createJWS(dataToAppend);
    const jwsJsonStr = JSON.stringify(jws);
    const jwsJsonB64 = Buffer.from(jwsJsonStr).toString("base64");
    const completePoint = {
      dataToAppend,
      issuer_verification: jwsJsonB64,
      streamId: exists?.data?.node?.pointClaimsList?.edges[0]?.node?.id,
    };
    return res.json({ completePoint });
  } else {
    return res.json({
      err: "Invalid issuer",
    });
  }
}

Send the result back client-side

Use our client-side ComposeDB context (on which our end user is already authenticated) to either create or update a PointClaims instance, using the results of our API call as inputs to our mutation:

// if the instance doesn't exist yet
if (finalPoint.completePoint.dataToAppend.length === 1) {
  data = await compose.executeQuery(`
    mutation {
      createPointClaims(input: {
        content: {
          issuer: "${did}"
          data: ${JSON.stringify(finalPoint.completePoint.dataToAppend).replace(/"([^"]+)":/g, '$1:')}
          issuer_verification: "${finalPoint.completePoint.issuer_verification}"
        }
      }) {
        document {
          id
          holder {
            id
          }
          issuer {
            id
          }
          issuer_verification
          data {
            value
            refId
            timestamp
            context
          }
        }
      }
    }
  `);
}

Does this sound a bit tedious? This is the same pattern we're using for issuing and verifying badges as well. And yes, it is verbose compared to what our code would've looked like had we decided not to go through the trouble of allowing our participants to control their Ceramic data.

Creating Manifestations

As mentioned above, PointMaterializations represent how points manifest in a platform's reward structures (like a new badge, an aggregation for a leaderboard, or gating an airdrop). Most importantly, the PointMaterializations collection is a new dataset built from our composable PointClaims data.

To create PointMaterializations, we use an event-driven architecture, leveraging our MVP EventStream feature. When PointClaims instances are written to Ceramic, we will receive a notification in another application, in this case, a Fluence compute function.

Our compute function works like this:

Determine that the notification is for the model (PointClaims) and that the issuer is the DID of our application.
Extract the PointClaims content from the notification.
Verify that the issuer_verification is valid for the data field in PointClaims.
If the subject of the PointClaims (the document owner) has an existing PointMaterializations, retrieve it; otherwise create a new one.
For the context of the PointMaterializations, calculate a new value:
unique-events: tally all the unique context entries in the data field
all-events: tally all the entries in the data field
first-all-events: similar to all-events, we check all unique context entries in the data field. If the user has attended all the events, we record the latest first event check-in as the value, so that we can rank users by that time.

If you want to view the Rust code that implements the sequence above, please check out the compute repository.
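The production implementation is the Rust code linked above; purely as an illustration of the three tallies described in the steps, here is a simplified TypeScript sketch with assumed data shapes:

// Simplified illustration of the three materialization tallies over a PointClaims "data" array.
interface PointData {
  value: number;
  context: string;   // event name
  timestamp: string; // ISO timestamp of the check-in
}

const uniqueEvents = (data: PointData[]) =>
  new Set(data.map((d) => d.context)).size;

const allEvents = (data: PointData[]) => data.length;

// If the holder checked in at every event, record their latest "first check-in"
// so participants can be ranked by when they completed the full set.
const firstAllEvents = (data: PointData[], totalEvents: number): string | null => {
  const firstByEvent = new Map<string, string>();
  for (const d of data) {
    const prev = firstByEvent.get(d.context);
    if (!prev || d.timestamp < prev) firstByEvent.set(d.context, d.timestamp);
  }
  if (firstByEvent.size < totalEvents) return null; // not all events attended
  return [...firstByEvent.values()].sort().pop() ?? null; // latest of the first check-ins
};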

At the time of writing, the EventStream MVP does not include checkpointing or reusability, so we have set up a checkpointing server to save our state and then use a Fluence cron job, or spell, to periodically run our compute function. In the future, we hope to trigger Fluence compute functions from new events on the EventStream.

What We Learned

This exercise left our team with a multitude of valuable learnings, some of which were more surprising than others:

Wallet Safety and Aversion to Wallet Authentication

We optimized much of the flow and the UI for mobile devices, given that the expected flow required scanning a code/tapping a disc as the entry point to interact with the application. However, throughout EthDenver and the various events we tried to facilitate issuing points, we overwhelmingly noticed a combination of:

Participants intentionally do not have a MetaMask/wallet app installed on their phones (for safety reasons)
If a participant has such a wallet app on their phone, they are VERY averse to connecting it to our scavenger hunt application (particularly if they haven't heard of Ceramic previously)

This presents several problems. First, given that our flow required a scanning/tapping action from the user, this almost entirely rules out using anything other than a phone or tablet. In a busy conference setting, it's unreasonable to expect the user to pull out their laptop, hence why those devices were not prioritized in our design.

Second, the end user must connect their wallet to sign an authentication message from Ceramic to write data to the network (thus aligning with our user-centric data design). There's no other way around this.

Finally, our scavenger hunt application stood ironically in contrast with the dozens of POAP NFC stands scattered throughout the conference (which did not require end users to connect their wallets, and instead allowed them to input their ENS or ETH addresses to receive POAPs). We could've quite easily architected our application to do the same, though we'd sacrifice our user-centric data design.

SET Account Relation will be Useful in Future Iterations

As explained above, the PointsClaims model presents an ideal opportunity to use the SET accountRelation configuration in ComposeDB (given how we update an existing model if it exists).

Data Verifiability in User-Centric Data Design Entails More Work

Not a huge shocker here, and this point is certainly relevant for other teams building with Verifiable Credentials or EAS Off-Chain Attestations on Ceramic. While there are plenty of considerations to go around, we figured that our simple use of an encoded JWT was sufficient for our need to validate both the originating DID and the payload. It was hard to imagine how we would benefit from the additional baggage involved in saving point-related VCs to ComposeDB.

Interested in Building Points on Ceramic?

If your team is looking to jam on some points, or you have ideas for how we can improve this implementation, feel free to contact me directly at mzk@3box.io, or start a conversation on the Ceramic Forum. We look forward to hearing from you!


Elastos Foundation

Elacity Enables ERC404 Standard for Revolutionary NFT Functionality


Elacity, the pioneering NFT Marketplace built on Elastos, today announces its support for the trading of ERC404 standard NFTs. This technical development enables the buying and selling of fractional NFTs, like Elawings, seamlessly aligning with current token trading standards.

ERC404 addresses the limitations of existing NFT trading processes. Designed from the ground up to integrate the characteristics of ERC-20 and ERC-721 tokens into a single, more flexible model, ERC404 standard NFTs let customers buy and sell portions of NFTs, where previous methods only allowed purchasing whole NFTs. This makes it possible to create liquidity pools for NFTs, creating better markets for NFT trading. It also unlocks new use cases for NFT platforms, for example the buying and selling of fractional royalties for any form of digital content and assets, including music, artwork, books, and the like.

Sasha Mitchell, the CEO and Founder of Elacity, says about the development, “The adoption of ERC404 is a massive step forward in the digital rights and NFT space as a whole, providing creators unprecedented ownership over the rights to their content while also allowing users to engage with creators of their choice on a never-before-seen level. Meanwhile, adopting ERC404 is a unique opportunity to enhance trading for NFT markets which can offer utility through access or royalties to services.”

“The addition of fractional NFT ownership will significantly increase flexibility and choice for both buyers and sellers of exclusive content, potentially creating further secondary markets and other forms of value addition,” he says.

“It’s difficult to overstate the technical challenges that have been overcome to deliver genuine interoperability and conformity with multiple standards.  But the result will mean more control for creators, and more choice for their audiences.”

This milestone aligns with Elacity’s main vision of becoming a Decentralized Digital Rights Marketplace (DDRM), where creators and users alike can reap the benefits of fractional ownership and royalty generation. DDRM is an extension of existing Digital Rights Management (DRM) technology, a familiar technology currently used by industry players to protect creators’ content from unauthorized use and distribution. In essence, DRM systems employ encryption techniques, software licenses, and other security measures to control access to digital content and limit who can use it.

Elacity stands as an innovative online decentralized content marketplace, revolutionizing the way users engage in the creation, purchase, and sale of online content through cutting-edge blockchain technology. Elacity’s parent company, Elastos is a public blockchain project that integrates blockchain technology with a suite of reimagined platform components to produce a modern Internet infrastructure that provides intrinsic protection for privacy and digital asset ownership. 

Join Us on the Journey

As we continue to build the SmartWeb, we invite you to learn more about Elastos and join us in shaping a future where digital sovereignty is a reality. Discover how we’re making this vision come to life at Elastos.info and connect with us on X and LinkedIn.

The post Elacity Enables ERC404 Standard for Revolutionary NFT Functionality appeared first on Elastos.


Identity At The Center - Podcast

We have another Sponsor Spotlight episode of the Identity at


We have another Sponsor Spotlight episode of the Identity at the Center podcast for you this week. We were joined by Rich Dandliker, Chief Strategist at Veza.

We had an insightful discussion about Veza's unique approach to identity security, their 'anti-convergence' strategy, the significance of a reputable customer base, and the importance of a data-first approach to identity management.

Don't miss out on this episode for a comprehensive understanding of Veza's innovative solutions in the IAM market. You can listen to the episode at idacpodcast.com or in your favorite podcast app.

#iam #podcast #idac


Next Level Supply Chain Podcast with GS1

Risk, Resilience, and AI in the Supply Chain with Yossi Sheffi


The COVID-19 pandemic threatened to derail supply chain management completely. Or did it?

Yossi Sheffi, distinguished MIT professor and an expert with 49 years in supply chain management, breaks down supply chain resilience into five levels and argues that supply chain managers were unsung heroes during the pandemic. Yossi also touches on balancing resilience with sustainability, pointing out that while essential, both can introduce short-term costs and competitive imbalances. He underscores the delicate balance companies must strike between cost management and maintaining multiple suppliers for risk mitigation.

He expounds on the role of AI in supply chains, emphasizing the importance of leveraging artificial intelligence for identifying alternative suppliers and predictive analysis. The conversation also delves into the roles of machine learning, large language models, and robotics in evolving supply chains. Despite skepticism about fully autonomous applications like pilotless planes, Yossi highlights ongoing experiments with AI as potential co-pilots. The episode concludes with reflections on the rapid technological evolution impacting the professional landscape and the fabric of daily life.

 

Key takeaways: 

Resilience in supply chains is crucial for navigating disruptions and maintaining operational continuity.

Artificial intelligence (AI) technology is vital for supply chain management despite potential challenges.

Supply chain resilience and sustainability are critical concerns, as are the investments in these areas.

 

Connect with GS1 US:

Our website - www.gs1us.org

GS1 US on LinkedIn

 

Connect with guest:

Yossi Sheffi on LinkedIn

Check out Yossi’s book - The Magic Conveyor Belt: Supply Chains, A.I., and the Future of Work

 


FIDO Alliance

Tech Game World: Passkeys are arriving on PlayStation: how the smart alternative to the password works

The advantages are many. Let’s start by saying that passkeys are more secure than traditional passwords. They are fraud resistant and follow the Fast Identity Online (FIDO) standards established by the FIDO Alliance, a global organization of which the Sony Group (owner of PlayStation) is a member. The FIDO Alliance is responsible for defining and promoting the most advanced authentication standards for a wide range of devices and platforms. The goal is to reduce dependence on passwords, which are now considered an obsolete method. These standards are supported by leading companies and institutions in the technology sector, with which PlayStation itself has collaborated to offer an optimal access experience.


PCMag: No More Passwords: Sony Adopts Passkeys for PlayStation 4, PS5

Sony has introduced passkey support for PlayStation, eliminating the need for traditional passwords. Users can now opt for a more secure and convenient sign-in method by setting up a passkey stored on their phone or laptop. Passkeys use unique cryptographic keys that remain on the device, making them phishing resistant, and they can be accessed through other devices in case of loss.
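For readers curious about the mechanics, the sketch below shows roughly how a passkey is created in the browser with the WebAuthn API. It is a minimal, illustrative TypeScript example: the relying party, user handle, and challenge values are placeholders (in a real deployment the server issues the challenge and stores the returned public key), and it is not a description of Sony’s actual implementation.

async function createPasskey(): Promise<Credential | null> {
  const publicKey: PublicKeyCredentialCreationOptions = {
    // In practice the challenge is a fresh random value issued by the server.
    challenge: crypto.getRandomValues(new Uint8Array(32)),
    rp: { id: "example.com", name: "Example Service" },      // hypothetical relying party
    user: {
      id: crypto.getRandomValues(new Uint8Array(16)),        // opaque user handle, not the email
      name: "player@example.com",
      displayName: "Example Player",
    },
    pubKeyCredParams: [
      { type: "public-key", alg: -7 },   // ES256
      { type: "public-key", alg: -257 }, // RS256
    ],
    authenticatorSelection: {
      residentKey: "required",           // a discoverable credential, i.e. a passkey
      userVerification: "required",      // biometric or device PIN
    },
  };
  // The private key never leaves the authenticator; only the public key and an
  // attestation are returned for the server to associate with the account.
  return navigator.credentials.create({ publicKey });
}

Signing in later uses navigator.credentials.get with a fresh server-issued challenge, and the authenticator proves possession of the private key without ever revealing it, which is what makes the scheme phishing resistant.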

Tuesday, 05. March 2024

Hyperledger Foundation

Hyperledger Collaborative Learning Spotlight: BiniBFT - An Optimized BFT on Fabric

WHAT WE WORKED ON:

Monday, 04. March 2024

Project VRM

On Customer Constituency

A customer looks at a market where choice rules and nobody owns anybody. Source: Microsoft Copilot | Designer

I’m in a discussion of business constituencies. On the list (sourced from the writings of Doug Shapiro) are investors, employees, suppliers, customers, and regulators.

The first three are aware of their membership, but the last two? Not so sure.

Since ProjectVRM works for customers, let’s spin the question around. Do customers have a business constituency? If so, businesses are members by the customer’s grace. She can favor, ignore, or more deeply engage with any of those businesses at her pleasure. She does not “belong” to any of them, even though any or all of them may refer to her, or their many other customers, with possessive pronouns.

Take membership (e.g. Costco, Sam’s Club) and loyalty (CVS, Kroger) programs off the table. Membership systems are private markets, and loyalty programs are misnomered. (For more about that, read the “Dysloyalty” chapter of The Intention Economy.)

Let’s look instead at businesses that customers engage as a matter of course: contractors, medical doctors, auto mechanics, retail stores, restaurants, clubs, farmers’ markets, whatever. Some may be on speed dial, but most are not. What matters in all cases is that these businesses are responsible to their customers. “The real and effectual discipline which is exercised over a workman is that of his customers,” Adam Smith writes. “It is the fear of losing their employment which restrains his frauds and corrects his negligence.” That’s what it means to be a customer’s constituent.

An early promise of the Internet was supporting that “effectual discipline.” For the most part, that hasn’t happened. The “one clue” in The Cluetrain Manifesto said “we are not seats or eyeballs or end users or consumers. we are human beings and our reach exceeds your grasp. deal with it.” Thanks to ubiquitous surveillance and capture by corporate giants and unavoidable platforms, corporate grasp far outreaches customer agency.

That’s one reason ProjectVRM has been working against corporate grasp since 2006, and just as long for customer reach. Our case from the start has been that customer independence and agency are good for business. We just need to prove it.


Oasis Open Projects

OASIS Board Member Spotlight Series: Q&A with Jautau “Jay” White, Ph.D.

The OASIS Board of Directors is integral to the organization's success. Read our Q&A to gain a better sense of who the directors are and why they serve the OASIS community.

Meet Jautau “Jay” White, Ph.D., an accomplished leader with a strong focus on people and teamwork. With two decades of experience, he specializes in building top-notch teams and programs that enhance information security and cybersecurity while reducing risks and ensuring compliance. His expertise spans AI/ML vulnerabilities, supply chain security, data privacy, cybersecurity, and more.

What can you tell us about your current role?
At Microsoft, my role involves supply chain security and open source strategy work. My main function is to be the subject matter expert on cybersecurity and information security matters, and take that knowledge and use it to communicate internally to extrapolate ideas, initiatives, and strategies that can be worked on in a collaborative environment such as open source. 

A large part of my job is going out into the open source ecosystem to see what communities are already in place and to help build communities around work that’s for the betterment of mankind. I seek out opportunities that align with Microsoft’s ongoing projects, identifying areas where Microsoft wants to invest its efforts and finding where those efforts are already underway. We initiate projects within Microsoft and leverage open source collaboration to crowdsource innovative solutions from open source communities. I bring those insights back to Microsoft, advocating for the adoption of these solutions, saying “This is already being done, why don’t we use this?” or “Why don’t we get involved with that?” That’s a large part of my job. I love what I do mainly because it takes everything I’ve learned throughout my entire career to do it.

What inspired you to join the OASIS Board?
I love standards, specs, and policies. Having had a hand in writing standards and then using them throughout my entire career, joining the OASIS Board was an excellent opportunity. One of the things I think that I liked most was the fact that I had to run for the board seat. I campaigned and talked to community members and staff; I really put myself out there and I enjoyed that immensely.

I love what OASIS does in terms of the international community. I love its recognition. There are so many specs and technologies that are being used today that people don’t even know originated in OASIS and I just love that I get a chance to be part of it.

Prior to serving on the OASIS Board, were you involved in open source or open standards? 
For the past few years, I’ve been involved with the Linux Foundation, especially their Open Source Security Foundation (OpenSSF) project. I currently sit on OpenSSF’s Technical Advisory Council (TAC) and I lead a few working groups and special interest groups there as well. Getting involved with OASIS was the next evolution. OASIS does such an amazing job bringing standards and specs to market. I’ve always felt that I want to be involved in this part, because the regulatory part is where I thrive.

What skills and expertise do you bring to the OASIS Board and how do you hope to make an impact?
I bring extensive cyber and security knowledge. Unlike many individuals who specialize in one area for the entirety of their careers, I’ve navigated through many roles inside of cyber and information systems. I’ve been a network engineer, a systems admin, a desktop support engineer, and a penetration tester. Also, I’ve done physical security assessments, admin security assessments, and I’ve installed firewalls. I have a software engineering degree, so I’ve written programs. There are so many different places that I’ve touched throughout my entire career across government, healthcare, finance, and professional services sectors. My experiences have enabled me to approach situations from different vantage points and engage meaningfully in conversations. I’m excited to learn about emerging standards and specs from diverse industries.

Why are you passionate about the OASIS mission to advance technology through global collaboration?
Global collaboration is key. I spent my last few years working in open source, and it’s so important to work collaboratively. I coined the phrase, “strategically competing through partnership and collaboration.” A lot of these major companies are competitors in nature, but there’s so much out there right now that is affecting every single one of our businesses at the same time, that we have to come together to build these standards, technologies, controls, and safeguards so that our joint customer base remains safe. Trust is huge and our customers have to trust each and every one of us equally.

What sets OASIS apart from other organizations that you’ve worked with in the past? 
The way OASIS is constructed around Technical Committees and Open Projects is still relatively new to me. I think where OASIS shines is how standards get created and brought to market. That’s the niche.

What would you say to companies that want to bring their projects to OASIS?
It would totally be dependent on what that company wanted. If they want to create a spec or a standard around a tool that’s being created, I would definitely say go to OASIS.

Do you have an impact story about your work in open source or open standards?
I take great pride in establishing a Diversity Equity and Inclusion (DEI) working group in the OpenSSF where there wasn’t one before. Additionally, I’m proud of the AI work that I’ve been able to bring to Microsoft.

At OASIS, I’m excited to be one of the founding members of the OpenEoX Technical Committee alongside Omar Santos. I’m extremely excited about OpenEoX’s potential; I think it’s going to be huge in the industry because there isn’t a standard for end-of-life and end-of-support. There’s nothing out there that allows customers to understand when new releases are coming in, when they’re going out, and how things are deprecated. Having been a part of OpenEoX since its inception and participating in the initial meetings thus far has been incredibly fulfilling.

Can you tell me about any exciting changes or trends in open source and standards?
The AI space is extremely large and there’s so much room to play in it. I don’t want us to get consumed by one area over the other. There are so many different specs and standards that can be created and I want us to be open to all the possibilities and open to the entire knowledge space.

Where do you see standards going in the future?
I see standards becoming more prevalent with respect to these different government regulations coming in. We have more and more regulatory requirements coming out that are beginning to drive standards, for example the EO from the White House, the EU’s Cyber Resilience Act (CRA), and a policy that’s coming out in Germany. I can see that gap closing where you’ll have a standard that could even drive a regulatory requirement at some point which will be something weird to see.

What’s a fun fact about you?
I ride motorcycles and I like to work on cars and bikes. More than anything, I enjoy getting under the hood of a car or lifting the bike up and taking it apart and putting it back together.

The post OASIS Board Member Spotlight Series: Q&A with Jautau “Jay” White, Ph.D. appeared first on OASIS Open.


Identity At The Center - Podcast

It’s another brand-new episode of the Identity at the Center Podcast! This week, we had the pleasure of speaking with Laura Gomez-Martin from RSM. We dove into the role of government in protecting privacy, the complexity of privacy policies, and the balance between public and company expectations. Laura shared her unique insights on these topics and much more. You can listen to the episode at idacpodcast.com or in your favorite podcast app.

#iam #podcast #idac

Friday, 01. March 2024

Elastos Foundation

Open Questions After Elastos Crypto Class Action Settlement

The cryptocurrency world is new, exciting, and complex in relation to governing laws and jurisdictions around the world. The Elastos Foundation’s settlement of a class action lawsuit highlights the ongoing debate over digital assets and securities law. Bradley D. Simon, a seasoned legal expert with a background as both an assistant U.S. attorney and a trial attorney with the U.S. Department of Justice, provides an insightful analysis, “Open Questions After Elastos Crypto Class Action Settlement,” in a recent article on Law360. Read the full article here.

Core Points Simplified:

Litigation Resolution: The Elastos Foundation settled a major legal case without admitting fault, demonstrating its ability to navigate complex legal challenges effectively.

Security Classification Challenge: Elastos successfully argued that its ELA token is not a security, emphasizing the need for a refined understanding of cryptocurrencies in regulatory contexts.

International Jurisdiction: Elastos showcased the difficulty of applying U.S. securities laws to decentralized, international entities, highlighting the need for global legal perspectives.

Technology and Decentralization: Operating as a DAO and using decentralized blockchain technology, Elastos leads in innovating digital economies.

E-Discovery Challenges: The case exposed the inadequacy of current e-discovery tools for modern communication platforms, stressing the need for legal processes to evolve alongside technology.

Regulatory Dialogue: The settlement advances discussions on cryptocurrency regulation, advocating for more nuanced legal frameworks for digital assets.

Future Litigation Precedent: Elastos’s case offers insights for future crypto litigation, potentially shaping legal approaches to digital currencies.

The Elastos Foundation’s legal battle underscores the need for clarity in how cryptocurrencies are regulated as securities. Simon’s analysis not only sheds light on the intricacies of this particular case but also on the broader challenges facing the regulation of digital assets. As the legal landscape continues to evolve, this case serves as a crucial reference point for both legal practitioners and participants in the cryptocurrency market. Be sure to read here!

The post Open Questions After Elastos Crypto Class Action Settlement appeared first on Elastos.


DIF Blog

Guest blog: David Birch

David Birch is a keynote speaker, a published author on digital money and digital identity, fintech columnist, social media commentator and an international advisor on digital financial services. A recognised thought-leader in digital identity and digital money, he was named one of the global top 15 favorite sources of business information by Wired magazine. 

What was the career path that led to you becoming a thought leader on digital money and identity? 

It’s quite straightforward. I started out in the world of secure electronic communications. I was working primarily in defence and safety-critical communications when suddenly the financial sector needed to know about this stuff too, which took me into the world of finance and payments. 

I’d edited a book about digital identity and then I was encouraged by the economist Dame Diane Coyle to write Identity Is The New Money. She was the person who pushed  me to have a go at writing a book myself. The timing was good, as others were talking about similar ideas. 

The book helped people to rethink some of the problems, and I think it’s stood the test of time. 

It’s been ten years since you published Identity Is The New Money. Is this a reality today? 

On one level it is. Some time ago I started to realize that the big problems in payments were identity problems, not payment problems. It doesn't matter if it's push payment fraud or whatever, the problems all come from the lack of identity infrastructure. Why is Elon Musk spending so much on money transmitter licenses, KYC (Know Your Customer) and AML (Anti Money Laundering)? Is it because he wants to earn a tiny slice when you pay a friend for concert tickets, or is it because he wants to know who you are and what you’re into? The data about a payment is much more valuable than the transaction fee. 

But in the sense in which I meant ‘identity is the new money’, it still isn't, and that’s surprising.

What needs to change? 

The lack of privacy is one area. Digital payments are too intrusive, though a lot of people don't care. I get a lot of messages about how terrible it would be for CBDCs (Central Bank Digital Currencies) to include not-yet existent features such as the ability to block your spending on certain things, yet when it’s Visa or Twitter being able to see everything you buy, no-one seems bothered. 

Authentication is another area. It bugs me that 10 years later I'm still doing password resets. Recently I needed to book a hotel room, so I tried logging into a brand where I've got points. I got the password wrong and didn’t want to wait for the reset email. Instead I logged into a different brand and booked a room. My product choice was based on which password I remembered!  

Do you see a role for decentralized identity in fixing these issues? 

I like the underlying standards, Decentralized Identifiers and Verifiable Credentials. But the implementation isn’t there yet. From the history of Bitcoin we can see that people lack the competence to manage their keys. When I drop my phone in the canal, how do I get all my stuff back? In a centralized world it’s easy. I buy a new phone and Apple puts all the pictures back. I’ve got my 2FA (2-Factor Authentication) device, so I can easily log into my bank again. 

Otherwise, I'd have to put my secret key on a memory stick and bury it in a biscuit tin in the back garden. For 99 per cent of the population that will never work. 
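For context, the “underlying standards” Birch refers to are small, JSON-based data models. The sketch below, written as TypeScript object literals, shows the rough shape of a DID document and a Verifiable Credential; every identifier and key value here is an illustrative placeholder rather than any particular product’s format.

// Rough shape of a W3C DID document: it binds a decentralized identifier to
// public key material that others can use to verify the holder's signatures.
const didDocument = {
  "@context": "https://www.w3.org/ns/did/v1",
  id: "did:example:holder123",                        // hypothetical DID
  verificationMethod: [{
    id: "did:example:holder123#key-1",
    type: "JsonWebKey2020",
    controller: "did:example:holder123",
    publicKeyJwk: { kty: "EC", crv: "P-256", x: "<x>", y: "<y>" }, // public part only
  }],
  authentication: ["did:example:holder123#key-1"],
};

// Rough shape of a Verifiable Credential issued to that DID: the issuer signs
// a claim about the subject, who can present it to anyone who trusts the issuer.
const verifiableCredential = {
  "@context": ["https://www.w3.org/2018/credentials/v1"],
  type: ["VerifiableCredential"],
  issuer: "did:example:issuer456",                    // hypothetical issuer
  issuanceDate: "2024-03-01T00:00:00Z",
  credentialSubject: { id: "did:example:holder123", over18: true },
  // A real credential also carries a proof block (the issuer's signature) here.
};

The custodial question Birch raises is about who guards the private key behind key-1: in custodial SSI the holder still controls the identifier, while a regulated institution such as a bank stores, and can recover, the key on the holder’s behalf.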

How can we overcome these challenges? 

I believe the answer is custodial SSI (Self Sovereign Identity), whereby I generate the key on my app and my bank looks after it. That looks like a viable option to me, because banks know how to securely manage keys in distributed hardware, so I trust them not to lose the key. If they do, there’s recourse, as they’re regulated. 

Do I want to control my digital identity? Absolutely. Do I want to hold the keys? No, I want my bank to do it for me. 

What makes you believe people will trust their bank with the keys to their digital identity? 

There’s trust in the legal sense, and then there’s trust in the everyday sense: I trust that my personal data won’t be looted, that I won’t lose access if I lose my phone… I trust the system to regulate my bank and ensure they don’t do stupid things. In the mass market, that’s the kind of trust that matters — the belief that if something goes wrong, it will get fixed. 

What does a good digital identity experience look like, in your view? 

When I log in to the airline, it should ask “Which ID would you like to use?” If I want to use my Avios app, I should be able to. It might call my EU wallet in the background, but I don't see that, everything is embedded. Personally I'd like to never think about managing my identity again.

In June 2023 you stated that the lack of mass-market digital identity is a drag on the economy. Have you seen much progress since then? 

Lots of companies are experimenting. But is anything mainstream happening? We’re not there yet. For example, I can’t download my Barclays ID and use it to log into my HSBC account.  

We’re starting to see people storing mDLs (mobile driving licenses) in their Apple wallet, and the EU Digital Identity Wallet is on the horizon. Whether it gets traction or not, it’s driving forward research and development. Does that mean the EU wallet will be all-conquering? I don't know. 

You’ve talked about how machine customers or ‘custobots’ will revolutionize ecommerce. Can you expand on this a bit please? 

I think there’s a good chance this will happen, starting with financial services. A bot can probably do a better job of managing my finances than I can. On April 6 (the start of the UK tax year) I’ll be looking at what are the best ISAs (Individual Savings Accounts). I will spend hours faffing about, researching, applying, waving my phone in front of my face to prove it’s me, figuring out which account to transfer money from… It’s the kind of thing that could be done in nanoseconds by AI. 

I might choose the Martin Lewis bot or the Waitrose bot to do this for me. The idea that they could be regulated by the FCA (Financial Conduct Authority) and operate under relevant duty of care legislation, with the co-ordinated goal of delivering my financial health, is appealing. 

I’ve also proposed that companies will soon need to provide APIs to support the needs of custobots rather than people.

Where is digital identity headed, in your view? 

There’s energy arriving into the space from two unexpected quarters. One is CBDCs. There’s a need for identity coming from that side, and pressure to get it fixed. The other area is the metaverse. People looked at the lack of obvious progress following Meta’s early pronouncements and thought, it’s not going anywhere. That’s the wrong lesson to take away. For example Apple Vision Pro (Apple’s extended reality headset) is out and there will be no shortage of people wanting to buy it. 

Digital identity is fundamental to make the metaverse a safe space. Assuming this is done right and the metaverse has an identity layer built in from the start, it could become a safer, less expensive, and therefore more desirable, place to transact than the “real world”. 

Money in the Metaverse: Digital Assets, Online Identities, Spatial Computing and Why Virtual Worlds Mean Real Business will be published in late April 2024. To pre-order, click here.



Thursday, 29. February 2024

EdgeSecure

Empowering Campus Networks

Beginning his career as a student assistant in technology services, Michel Davidoff was responsible for pulling thin and thick ethernet at Sonoma State University. Upon leaving the University ten years later, Davidoff was in the role of Director of Networking and Telecommunication. “I was responsible for the campus network that we had grown from around three hundred devices to well over 10,000,” says Davidoff. “I left Sonoma State in 2002 and set my sights on the California State University (CSU) Chancellor’s Office. They had started a technology strategy program that included digital equity at its core. The Trustees of CSU were looking to implement the Information Technology Strategy (ITS) that was represented as the pyramid. IT infrastructure was at the base of the pyramid; the center, called middle world, included initiatives and projects including security and identity management; and the top was student success. CSU’s visionaries, including leadership and trustees, understood very early on the importance of technology to enable education. I was invited to participate in a system-wide initiative to determine the standards for the bottom of the pyramid. Following this meeting, I was eager to help further advance this initiative and joined CSU in a new and exciting role.”

Finding Cost-Effective Solutions
Tasked with helping create a consortium of 23 directors of networking at CSU, Davidoff began building working groups of experts. “Along with being a process facilitator for the consortium, I also created guidelines for writing RFPs, particularly outlining the functional requirements,” explains Davidoff. “A large part of my job at CSU was to provide accurate financial projections, both internal and external, and maintain connectivity for all 23 campuses. In 2006 as wireless technology became more prominent, I was tasked with integrating wireless into each campus. Without more money in the budget to do so, I had to get creative.”

“We began by creating a strategy for network rightsizing,” continues Davidoff. “Since I had the data of every single closet at CSU, I knew that more than 50 percent of the switches were not being used, but because of the fear of not having enough as they grow, the network had been built significantly bigger than necessary. I developed a process whereby if a port has not been used in 90 days, it will not be refreshed. That freed about 50 percent of the budget delegated for switching and routing. We were able to deploy wireless technology on the campuses, and through an RFP, develop the functional requirements. Later when we needed to enhance and standardize security, we went through a similar process and selected a firewall vendor, and became much more systematic and methodical about the deployment process.”
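The 90-day rule Davidoff describes is simple enough to state in code. The sketch below is a hypothetical illustration in TypeScript, assuming an inventory of switch ports with last-activity timestamps; it is not CSU’s actual tooling.

interface SwitchPort {
  switchId: string;
  port: number;
  lastUsed: Date | null; // null if the port has never been active
}

// Return the ports to exclude from the next hardware refresh: anything idle
// longer than the given number of days (90 in the process described above).
function portsToSkipOnRefresh(ports: SwitchPort[], idleDays = 90, now = new Date()): SwitchPort[] {
  const cutoff = now.getTime() - idleDays * 24 * 60 * 60 * 1000;
  return ports.filter((p) => p.lastUsed === null || p.lastUsed.getTime() < cutoff);
}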

Spanning over two decades, Davidoff’s career at CSU allowed him to become well versed in delivering scalable, ground-breaking strategies and solutions for large scale infrastructure deployments. “I am proud of our accomplishments at CSU, and I believe I was able to bring a few key things to the University,” shares Davidoff. “First, is that collaboration works, even if it might be a little slower. Working together and developing RFPs can offer tremendous cost savings, in fact, at the end of my career, most of the equipment we purchased was at discounts greater than 80 percent.  My job was to create the most efficient administrative process that requires the least amount of money, while providing an exceptional student learning experience.”


Encouraging Collaboration
Over the years, Davidoff was responsible for security, log management, and later in 2014, developing the cloud strategy for CSU. “For the next three to four years, I developed the CSU cloud strategy and I believe the biggest selling point for leadership was, unlike networking, the Cloud was a new technology,” explains Davidoff. “Instead of several experts from Amazon Cloud and several Azure experts at CSU, I suggested creating a small team that focused on cloud technology and how to make it the most efficient and automated. Throughout my career, I’ve seen the value of collaboration, especially when making important decisions on how the campus is going to run and ensuring systems are as efficient as possible. From a long-term strategic standpoint, I am a believer in the wisdom of the group, rather than the wisdom of the individual. If everyone feels they have a voice, a successful outcome is more likely. This approach was aligned with my approach that we don’t give a campus a budget, we give them a solution.”

A day after Davidoff was set to retire in March 2020, CSU shut down its physical campuses due to the pandemic. “Leadership knew CSU must prepare for remote learning and I began doing a lot of research, along with forming a working group,” explains Davidoff. “We selected a small company to help us teach online in case we would need to offer remote classes. Part of the contract included free licenses for half a million students for up to ten years, as well as licenses for every faculty and staff member. We trained everyone on the software and ensured we could operate online. When the pandemic hit, CSU was the first university system in the U.S. without any downtime because our processes and strategies were ready to go.”

Bringing Insights to a New Role
After retiring, Davidoff began thinking about where he could have an even larger impact on education and helping students. “It became clear that technology companies, especially in the networking domain, are a place where I could make my mark in creating efficient technology solutions,” shares Davidoff. “I learned of a new company, Nile, and I wanted to bring my knowledge and unique perspective of higher education infrastructure and my vast experience in over two hundred network deployments. I knew I could share how the customer thinks because I had been the customer for thirty years.”


Joining Nile as the Chief Strategist of Education early last year, Davidoff aligns the company’s strategy with an educational lens to ensure all technology and services deliver a superior connectivity experience. “I love the thought leadership part of my role at Nile and writing papers about rethinking networking and higher education,” says Davidoff. “I talk to a lot of students and gather valuable insights about today’s learning expectations. Nile modernizes IT operations through the delivery of a new wired and wireless enterprise network and as a leader in enterprise Network as a Service (NaaS), it allows institutions to stabilize their budgets. From a financial perspective, you’re able to buy a service that assures capacity, availability, and performance. Organizations can plan how much money is needed every year, instead of seeing a huge spike in the budget five years from now to replace the network or to replace routing. Plus, most importantly, using Nile services helps free up staff to focus on other initiatives, like classroom technology or digital transformation.”

“Normally, if an institution purchases a technology solution from a vendor, that system is at max performance on day one,” continues Davidoff. “Six months later, your firewall is upgraded, your core router is not at current code, and you added ten new features. Your capacity and features are now starting to degrade. Without the time to take care of all the maintenance that needs to happen, your investment keeps losing value over time.”

Davidoff says many organizations are not sufficiently leveraging automation in order to efficiently run and maintain the network while creating complexity that no human can solve. “We need to eliminate complexity to enable innovation. If we keep this complexity in our environment, every time someone wants to innovate, we need to change all these features and configurations. Many vendors or wireless controllers have hundreds of different features and it’s difficult to develop best practices. We need a deterministic network and not a non-deterministic network in order to predict the performance.”

Partnering with Edge
Recognizing the important role networking infrastructure plays in the evolution of IT, Edge recently released an RFP to prospective vendors who could provide a NaaS to member organizations. The goal was to provide Edge members with NaaS services that allow these institutions to focus on promoting capabilities and skills, while reducing costs, promoting efficiencies, and improving security. Davidoff led Nile’s response to the RFP and was recently awarded a master contract with Edge (CBTS was the other awardee). “The Edge team works hard to stay at the forefront of innovation in the marketplace,” says Dan Miller, Associate Vice President, EdgeMarket. “In the world of enterprise networking, Nile represents an entirely new approach that enables organizations to offload the often overwhelming complexities of network management while reaping the benefits of a financially predictable, highly adaptable, and supremely secure network environment. We’re proud to have Nile as a new awardee and partner in our fast-growing EdgeMarket co-op.”

Nile helps higher education institutions deliver an uninterrupted wired and wireless experience with a performance guarantee for coverage, availability, and capacity. “Nile can help free up capital and resources to focus on meeting the demands of modern education,” says Davidoff. “We want to help institutions deliver on their mission and provide the strategic value that leadership is looking to achieve. Nile aims to help organizations break free from the traditional constraints of legacy network infrastructures and use IT resources to strategically enhance learning in a digital era.”

To learn more about how Nile is helping institutions move beyond the networking status quo, visit nilesecure.com/enterprise-network/higher-education.

View Article in View From The Edge Magazine

The post Empowering Campus Networks appeared first on NJEdge Inc.


Reorganizing, Restructuring, and Revisioning Your IT Organization for Digital Transformation

As education institutions and public sector organizations continue to navigate through the critical process of adapting their IT organizations for the digital age, many look for innovative ways to align team members and streamline processes to help advance these objectives. To create an effective strategy, Christopher Markham, Executive Vice President and Chief Revenue Officer, Edge, says starting with a few basic questions can help frame the conversation in how to move forward. “An important question to begin with is how does your organization view information technology? Do you view IT more as an engineering operation or as a service operation? Leadership must also determine if IT is viewed as an art or science, because there are plenty of institutions where IT is expected to be the primary change agent or innovator, not just in the administrative side of the house, but in educational technologies.”

“Organizations should also explore their return on investment from IT, including technology assets and staff,” continues Markham. “Do you have a return on investment and a rate of return? In addition, leadership must explore if technology is informing the business process both on the administrative and academic side, or is technology being informed by those business processes.” Achieving alignment across an IT organization involves several core axioms, including:

Authority and accountability must match
Specialization and teamwork
Precise domains and clear boundaries
The basis of a substructure
Avoid conflicts of interest
Cluster by professional synergies
Business within a business

“The golden rule is that authority and accountability in an IT organization must match,” says Markham. “You want to define clear boundaries with no overlaps or gaps and divide a function into groups based upon its strengths. In addition, cluster groups under a common leader based on similar professions. Institutions must also view higher education and information technology as a business. Faculty, students, and staff are considered customers and every manager is an entrepreneur. An entrepreneur is anyone who brings together all the different pieces to ensure service delivery of IT and high-quality services and solutions.”


Achieving Digital Transformation Readiness
The first principle of aligning authority and accountability is of top importance and what Markham calls the golden rule in IT organizational design. “This alignment is essential to the success of every IT organization and institution that it is serving. In a particular case study, a CIO appointed a few process owners at the suggestion of process improvement consultants. Each was assigned a process that engaged people from various parts of the organization in producing a specific service. These process owners had authority over those processes, and while they were collaborative and involved stakeholders in designing and implementing the new processes, they were not process facilitators who served others by bringing teams to consensus on how they’ll work together. Process owners didn’t have matching accountability for the effectiveness of those processes and weren’t always the individuals accountable for delivering those services. The groups that were accountable for the delivery of services, in turn, didn’t have the power to determine the processes they used to do their jobs.”

“If these service delivery groups failed, there was no way to know whether it was due to their own poor performance or due to a bad process,” continues Markham. “Nonetheless, they took the blame. Process owners implemented detailed, rigorous business processes and succeeded at their mission, but the organization became bureaucratic, slow, and inflexible as a result. This structure violated the golden rule. In re-envisioning and restructuring an IT organization, the CIO needs to decide the rules of the game and create the organizational ecosystem, including the group structure, the resource governance process, and the culture.”

Increasing the Pace of Innovation
Once the right structure is in place, leaders can take the opportunity to adjust domains as needed, arbitrate any disputes, create a healthy environment for teamwork, and develop talent through recruiting, inspiring, and coaching efforts. “Leaders should manage performance including negotiating staff’s objectives, giving frequent feedback, measuring the results, deciding rewards, and managing performance problems,” says Markham. “CIOs can leverage performance programs and evaluations to restructure, reorganize and incentivize.  They must also manage commitments and resources which includes assigning work within the group and coordinating shared decisions, like common methods and tools and professional practices. In addition, the CIO must make decisions when consensus cannot be reached.”

Markham shares another case study where the CIO in a large insurance company was tasked with addressing complaints from the business executives regarding the IT department’s opacity, unresponsiveness, and poor understanding of their business strategies. “The leadership in this organization was frustrated that they couldn’t control IT’s priorities and did not understand why many of the requests were not being fulfilled. There was a trend toward decentralization and many business units had started their own small IT groups, which the CIO disparagingly called Shadow IT. These groups only existed because business units did not want to do business with corporate IT. In response, the CIO dedicated a group to each business unit and divided his engineering staff among them. Each group was relatively self-sufficient with all the skills needed to deliver.”

“The senior managers also served as the primary liaisons to those business units,” continues Markham. “The CIO felt this structure would appease the business units and stave off further decentralization, while holding senior managers accountable for business results and client satisfaction. Unfortunately, technical specialists were needed throughout the organization, and since technology subspeciality was scattered among the various client-dedicated groups, this limited their professional exchange. When the sales team, for example, ran into technical challenges, they may not have known that someone in another group already had encountered that issue and knew a solution. Their peers were busy with other priorities, costs rose, and response time slowed, and everyone was reinventing solutions to common problems. Meanwhile, there was little impetus for standards, and individual teams built systems that were optimal for their specific clients, not for the enterprise as a whole.”

Markham continues, “The pace of innovation also slowed, and the organization could not hire an expert in an emerging technology until demand grew across the whole enterprise. As a result, business opportunities to build customer loyalty were missed and the impacts extended beyond IT’s performance. Over time, the structure led to multiple general ledger systems and multiple records for the same customer. Synergies were lost as the company lost a single view of its customers, resources, and suppliers.”

Including productivity specialists can bring efficiency to an IT organization, which can translate into cost savings and a better return on investment. “Specialists have ready answers and don’t have to climb the learning curve with each new challenge,” says Markham. “Quality specialists know the latest methods and technologies in their field and understand how their products are more capable and have lower lifecycle costs. Competence and experience deliver results with fewer risks. Innovation specialists can keep up with the literature and be the first to learn about emerging technologies and techniques. As a result, the pace of innovation improves. Since they are confident in their abilities, specialists experience less stress, are more productive, and are more likely to excel in their career.”


Driving Organizational Change
Creating an IT strategy that optimizes processes and technology and fosters a culture of innovation includes several domains of enterprise architecture. “IT governance, funding and financial management, and enterprise data and data governance are among the top technology-related domains that impact digital transformation readiness,” says Markham. “Each of these domains represent specializations of the IT reference disciplines or olive branches from those IT reference disciplines, and the business architecture is an olive branch with each of the functional offices in both administration and academics. But without labeling these domains properly as a CIO, it’s very difficult to reorganize, restructure, or re-envision your organization. The cost of overlapping these domains and clustering by professional synergies is reduced specialization, redundant efforts, confusion, product disintegration, less innovation and teamwork, and lack of entrepreneurship.”

Edge’s E360 assessment is designed to provide a holistic, 360-degree view of an institution’s current-state technology program with a focus on the technology-related domains. Taking a diagnostic and prescriptive approach to evaluating the technology organization, Edge looks at four key areas. “We first identify any unreliable processes and if there is reduced specialization as a result of these gaps,” explains Markham. “We also look if that reduced specialization leads to conflicts of interest. The E360 assessment also focuses on the professional exchange between the domains, if there are domain overlaps, the level of coordination, and whether it is a whole business. Lastly, we explore the substructure and the results of reduced specialization, domain overlaps, and inappropriate biases. E360 produces a final report that not only includes outcomes and analysis, but a three-year roadmap for an IT organization to drive organizational change, improve their technology landscape, and achieve digital transformation goals successfully.”

Ready to achieve operational efficiency and digital transformation? Learn more at njedge.net/solutions-overview/digital-transformation

View Article in View From The Edge Magazine

The post Reorganizing, Restructuring, and Revisioning Your IT Organization for Digital Transformation appeared first on NJEdge Inc.


Edge Colocation Services

In an age where data collection and analysis continue to grow in importance in nearly every industry, many organizations seek innovative and affordable ways to store data and expand their networking capabilities. In the education community, not every institution is equipped with a large IT infrastructure or the space to host servers, networking equipment, and data storage. To help address this need, Edge offers affordable colocation services where member institutions can receive data center support and colocation space for disaster recovery and business continuity. “Colleges and universities have always had the responsibility to design, build, and run data centers on college campuses,” says Bruce Tyrrell, Associate Vice President Programs & Services, Edge. “Unfortunately, the physical infrastructure, including commercial power, backup generators, and environmental services are extremely expensive and complex to deploy, especially in a typical college campus environment that was not designed for these requirements. Our colocation services are an enhancement of our members’ existing connectivity to the Edge network. By leveraging their existing last mile connections, members have the ability to place hardware at one of several locations around the region.”

With Edge maintaining high availability colocation data centers throughout the Northeast region, several members are choosing to exit the owned data center space and move their hardware to an off-campus location. “Many institutions are relocating hardware to a purpose-built facility that has been professionally engineered and constructed with the desired features,” says Tyrrell. “Access to these features is included in the monthly recurring costs for space outsourcing and using a colocation provider can help reduce the need for additional staff to handle the physical management of those environments.”

Benefits of Colocation
From their optical network, Edge can build connections for members from their campuses directly into the colocation facilities. “Member institutions can choose to place hardware infrastructure at the enterprise grade colocation facility on the campus of Montclair State University at a significant discount over commercial space,” explains Tyrrell. “Colocation is available along our optical network and provides access to 165 Halsey Street in Newark; the New Jersey Fiber Exchange (NJFX) in Wall Township, adjacent to the Tata international cable landing station; and 401 N Broad Street in Philadelphia. Members can also access the Digital Realty colocation facility at 32 Avenue of the Americas in Manhattan. Edge is expanding our colocation capability by adding the colocation facility at Data Bank in Piscataway, New Jersey, a bespoke water-cooled facility designed with High Performance Computing in mind.”

Colocation data centers allow members to store their equipment in a secure location with a public IP address, bandwidth, and power availability. These locations also include backup power in the event of an outage. “An organization can use Edge colocation services to extend their internal infrastructure into a professional collocation space from an end user point of view,” says Tyrrell. “The Edge model is unique in that the bandwidth provided to our members is not shared with any other organization, and since this extension is transparent, students, faculty, and staff do not realize their data is traveling off campus and out to a data center and back—the data transfer only takes microseconds.”

With Edge as the provider of the bandwidth, both internally connected to the campus, as well as externally via their internet connections, these connections are designed to scale and burst. “Unlike a cloud environment, where there is an increased cost for bursting when an organization’s computing resources reach their max, a colocation environment offers costs that are fixed,” explains Tyrrell. “An organization rents a cabinet and purchases hardware to store in this cabinet. Edge fixes the cost of transport and internet capacity which can allow for greater budget predictability. This is different from the Cloud, where once an application is placed in the Cloud, upticks in utilization for those apps can have a direct impact on the monthly expense to operate those services. For some institutions, having a fixed monthly budget for colocation services is easier to operationalize from a financial perspective.”


Onboarding and Support
When an institution selects colocation services, Edge’s engineers help walk the member’s IT team through the ins and outs of the processes and can accompany them to colocation facilities to familiarize them with the data centers. “Edge acquires the space, coordinates the connectivity, and assists in providing remote and physically secured access to the cabinets or gauges,” says Tyrrell. “We also handle all the administrative pieces like billing and passing along clean invoices to the member. Since colocation facilities can often be complex and intimidating, Edge can visit the facilities with you during the onboarding process.”

“Colocation is a unique environment that can be complex from both an operational and an acquisition perspective,” continues Tyrrell. “Edge has decades of experience in operating these environments and we stand ready to assist our members with transitioning hardware and applications into these professionally maintained Tier III colocation facilities. Once the transition has been made, members are better positioned to weather the storms and unforeseen outage conditions that have been known to impact on-campus data centers. This resilient infrastructure can provide peace of mind and a cost-friendly way to optimize resources and meet the growing demands of today’s higher education community.”

To learn more about Edge’s colocation services and how to take advantage of the latest and greatest developments in networking technology, visit njedge.net/solutions-overview/network-connectivity-and-internet2.

View Article in View From The Edge Magazine

The post Edge Colocation Services appeared first on NJEdge Inc.


Navigating AI-Powered Education and the Future of Teaching and Learning

With the age of artificial intelligence (AI) well underway, how we work, learn, and conduct business continues to transform and open the door to new opportunities. In the classroom, AI can be a powerful teaching tool and support innovative and interactive learning techniques and critical thinking. Dr. C. Edward Watson, Associate Vice President for Curricular and Pedagogical Innovation with the American Association of Colleges and Universities (AAC&U) and formerly Director of the Center for Teaching and Learning at the University of Georgia, explores how AI is revolutionizing the future of learning and how educators can adapt to this new era of human thinking in his new book, Teaching with AI: A Practical Guide to a New Era of Human Learning (Johns Hopkins University Press).

“AI is a significant game changer and is presenting a new challenge that is going to be dramatically different from past disruptive innovations,” says Watson. “Goldman Sachs and other sources estimate that two-thirds of U.S. occupations will be impacted by AI.1 With a vastly accelerating expectation within the workforce that new graduates will be able to leverage AI for work, there is a growing pressure on institutions of higher education to ensure students become well-versed in AI techniques. This new learning outcome for higher education is being termed AI literacy.”

AI is also introducing a new academic integrity challenge including how to accurately determine if students are using AI to complete assignments. Along with Teaching with AI co-author, José Antonio Bowen, Watson explores crucial questions related to academic integrity, cheating, and other emerging issues in AI-powered education. “The real focus of the book is how to create assignments and practices that increase the probability that students will engage with the work rather than turn to AI, as well as ways to partner with AI and use these tools in meaningful and impactful ways. Instead of fearing AI and how students may misuse it, the education community must employ logical pedagogical practices within the classroom that encourage our students to become competent partners with AI, including building AI literacy skills that will help them on their future career paths.”

“I look forward to discussing the higher education landscape at EdgeCon and exploring suggestions for how we might move forward. We need to acknowledge that AI is going to be an important thread in the education and research industries. Disruption is not always a bad thing, especially in the workforce. AI can help improve efficiencies, reduce costs, increase productivity, and create new job opportunities. In the higher education setting, these tools have the potential to offer personalized learning experiences, strengthen retention, and resolve accessibility issues. Along with the potential challenges this type of technology may introduce, we must also look at the positive opportunities that will arise and how we can better prepare our students for the world that is already waiting for them.”

— Dr. C. Edward Watson
Associate Vice President for Curricular and Pedagogical Innovation,
American Association of Colleges and Universities (AAC&U)

AI in the Classroom and Beyond
With over twenty-five years of experience in faculty and instructional development, Watson is nationally recognized for his expertise in general education, active learning, classroom practice, course design, faculty development, student learning, and learning outcomes assessment. “I believe in the transformative opportunities that higher education can provide individuals, especially first-generation students like myself,” shares Watson. “When I entered a master’s program in English, I became increasingly interested in the puzzle of how learning works. I wanted to better understand how to make learning more meaningful for students, how to engage them, and how to ensure what I’m teaching is not just memorized for an exam, but will be remembered and utilized long after the course is completed. As I advanced in my career, I was able to take what I learned helping students in my own classroom to provide programming and opportunities that could benefit the breadth of higher education.”

Even though change can be slow within the education community, Watson says the dramatic, fast shifts happening in the industry are causing many institutions to take notice. “Unfortunately, as higher education begins to adapt, AI is creating new digital inequities. Many institutions are struggling to determine how to best serve their students given the new challenges and opportunities. Institutions will need leaders who continue to explore how advancements like AI are changing their world and the ways in which they can harness and manage AI as a powerful teaching tool.”

“To begin to understand AI and its capabilities, I recommend that faculty copy and paste a current assignment into two or three different AI tools to better understand the opportunities, restrictions, and surprises. This can provide insight into ways to improve the assignment and to make it better aligned with the way students might be expected to complete similar work in the real world post-graduation. I think going forward, we will see AI more deeply integrated within systems we already depend upon. For instance, within learning management systems (LMS), it’s foreseeable that when students submit assignments, the AI-assisted LMS will check for AI use and plagiarism, and may even grade and provide customized feedback using a faculty-designed rubric.”
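To make the idea of rubric-guided, AI-assisted feedback concrete, the sketch below shows what such a check might look like behind the scenes, assuming an OpenAI-style chat API. The model name, file name, rubric text, and prompts are placeholder assumptions for illustration, not features of any particular LMS or of the workflow Watson describes.

```python
# Hypothetical sketch of rubric-based AI feedback on a student submission.
# Assumes the openai Python package (v1+) and an OPENAI_API_KEY in the environment.
from openai import OpenAI

client = OpenAI()

rubric = (
    "Criteria, each scored 1-4: thesis clarity, use of evidence, "
    "organization, grammar and mechanics."
)

with open("student_essay.txt") as f:  # placeholder file name
    submission = f.read()

response = client.chat.completions.create(
    model="gpt-4o-mini",  # placeholder model choice
    messages=[
        {
            "role": "system",
            "content": "You are a teaching assistant. Score the essay against the "
                       "rubric and give two sentences of feedback per criterion.",
        },
        {"role": "user", "content": f"Rubric:\n{rubric}\n\nEssay:\n{submission}"},
    ],
)

print(response.choices[0].message.content)
```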

From a teaching perspective, AI can also be beneficial in helping instructors create rubrics and improve the quality of their course syllabi and assignments. “I hope more faculty look at AI as a toolbox, rather than something to fear,” says Watson. “Teachers are still the experts in their field, and AI can help them elevate their courses and find new ways to improve the learning experience. AI is not a search engine; it is more like a knowledgeable colleague. Using it is more about prompt engineering and having a conversation that fine-tunes the results. Faculty should see AI as an idea generator that could be leveraged and helpful with many aspects of the classroom and beyond.”

ChatGPT, a chatbot developed by OpenAI and launched in November 2022, is a common AI tool used to automate tasks, compose essays and emails, and have human-like conversations. According to a recent survey conducted by Study.com, 89 percent of students over the age of 18 have used ChatGPT to help with homework, while 48 percent confessed they had used it to complete an at-home test or quiz.2 “While many students are familiar with AI tools like ChatGPT, not all educators are aware of its prevalence, causing a disconnect,” says Watson. “Showing faculty how this tool can be useful is key, and encouraging them to have open and honest conversations with students about how AI can be used as a tool for learning, rather than a way to cheat on their schoolwork, is now an essential early-in-the-semester conversation. Instead of approaching AI in terms of how it is breaking your pedagogy, consider how AI is relevant for what you would like to accomplish in preparing your students for the future.”

“I hope more faculty look at AI as a toolbox, rather than something to fear. Teachers are still the experts in their field, and AI can help them elevate their courses and find new ways to improve the learning experience. AI is not a search engine; it is more like a knowledgeable colleague. Using it is more about prompt engineering and having a conversation that fine-tunes the results. Faculty should see AI as an idea generator that could be leveraged and helpful with many aspects of the classroom and beyond.”

— Dr. C. Edward Watson
Associate Vice President for Curricular and Pedagogical Innovation,
American Association of Colleges and Universities (AAC&U)

Adapting Higher Education in a New Era
With a theme of Excelling in a Digital Teaching and Learning Future, EdgeCon Spring 2024 will welcome Dr. Watson as a keynote speaker to explore how higher education is evolving and ways to overcome the challenges the industry is facing. “A recent Gallup survey shows a steep decline in how higher education is perceived in this country3,” says Watson. “Less than half of Americans have confidence in higher education. All of us within our industry should consider how we can positively impact this national perception of higher education, as there are ramifications. Not preparing students for what will certainly be an AI-enhanced career, or recklessly using AI detection tools in ways that might unjustly accuse significant numbers of students of cheating, is dangerous for higher education. Combine such practices with the ongoing student debt crisis and a politically polarized higher education dynamic, and more and more students will question whether higher education is still as important as it once was. Already many ask if higher education is still a cornerstone of the American Dream.”

“I look forward to discussing the higher education landscape at EdgeCon and exploring suggestions for how we might move forward,” continues Watson. “We need to acknowledge that AI is going to be an important thread in the education and research industries. Disruption is not always a bad thing, especially in the workforce. AI can help improve efficiencies, reduce costs, increase productivity, and create new job opportunities. In the higher education setting, these tools have the potential to offer personalized learning experiences, strengthen retention, and resolve accessibility issues. Along with the potential challenges this type of technology may introduce, we must also look at the positive opportunities that will arise and how we can better prepare our students for the world that is already waiting for them.”

View Article in View From The Edge Magazine

The post Navigating AI-Powered Education and the Future of Teaching and Learning appeared first on NJEdge Inc.


Maintaining Quality Online Learning Programs


Creating and sustaining quality online learning experiences has become a top priority across the higher education community and plays a key role in the appeal and competitiveness of an institution. As these online programs are developed and implemented, quality assurance frameworks and processes are essential to ensuring that these programs meet rigorous standards and continue to align with learning objectives. “Having standards that everyone from across an institution has to meet is of paramount importance in higher education,” says Joshua Gaul, Associate Vice President & Chief Digital Learning Officer, Edge. “The lack of standards in today’s higher education system is a top reason for the drop in retention and enrollment, especially among community colleges and small private schools. Every organization should ensure their course offerings and entire digital presence meet quality industry standards, including ADA compliance.”

Using Rubrics to Assess Course Quality
To help ensure learners are engaging with high-quality courses, Quality Matters (QM) is among the most well-known programs for creating a scalable process for quality assurance. “QM is a global organization leading quality assurance in online and digital teaching and learning and is used to impact the quality of teaching and learning at a state and national level,” says Gaul. “QM has eight general standards and 42 total standards. More than 1,500 colleges and universities have joined the Quality Matters community and they’ve certified thousands of online and hybrid courses, as well as trained over 60,000 education professionals, including myself, on online course design standards.”

The SUNY Online Course Quality Review Rubric (OSCQR) is another well-respected online design rubric, developed and used by SUNY Online in collaboration with campuses across the SUNY system. “With six general standards and 50 total standards, the OSCQR is openly licensed for anyone to use and adopt and aims to support continuous improvement and accessibility in online courses,” explains Gaul. “The rubric and the online course review and refresh process support large-scale online course design efforts systematically and consistently. The goal is to ensure that all online courses meet a quality instructional design and accessibility standard, and are regularly and systematically reviewed, refreshed, and improved to reflect campus guidelines and research-based online effective practices.”

“In addition to QM and OSCQR, there are many other rubrics being used to systematically check courses against,” continues Gaul. “No matter which rubric you are using, it’s important to have accountability and a knowledge sharing process about these standards across the entire institution.”

Implementing an Evaluation Cycle
Regardless of the program being used to conduct online course quality review, developing an evaluation cycle is essential to ensuring courses are meeting key standards. “The first step in implementing an evaluation cycle is gathering data and understanding the trends of your organization,” says Gaul. “What is the enrollment frequency? Which courses have high enrollment? How many students fail or drop out? In classes that have very low enrollment or high drop rates, what are the barriers to success? Institutions should review the disciplines and courses with the highest enrollment and determine which courses should be evaluated and revised on a more frequent basis. Looking at the data closely can provide valuable insight into the effectiveness and quality of each online course.”
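As a rough illustration of the kind of data review Gaul describes, the sketch below computes per-course enrollment counts and drop/failure rates from a hypothetical enrollment export. The file name, column names, and review thresholds are illustrative assumptions, not part of any institution’s actual reporting system.

```python
import csv
from collections import defaultdict

# Hypothetical export: one row per enrollment with columns
# course_id, term, and outcome (completed / dropped / failed).
totals = defaultdict(int)
not_completed = defaultdict(int)

with open("course_enrollments.csv", newline="") as f:
    for row in csv.DictReader(f):
        course = row["course_id"]
        totals[course] += 1
        if row["outcome"] in ("dropped", "failed"):
            not_completed[course] += 1

# Flag courses that may deserve an earlier slot in the revision cycle:
# very low enrollment or a high share of students not completing successfully.
for course, enrolled in sorted(totals.items()):
    rate = not_completed[course] / enrolled
    if enrolled < 10 or rate > 0.25:
        print(f"{course}: {enrolled} enrolled, {rate:.0%} dropped or failed -> review sooner")
```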

In between offerings, institutions should take stock of online courses as a whole and reflect on ways to enhance course content, engagement, and student outcomes. During this assessment, important questions to ask include:

Does the course learning environment welcome and include all students?
Is engagement encouraging?
Are there opportunities for self-reflection and discussion?
Do activities provide opportunities for realistic, relevant, and meaningful application of knowledge?
Are students achieving the goals of the course?
Is the workload reasonable for both students and the instructor?

Adopting a mission to review and update all courses to ensure the highest-quality content and experience can go a long way toward improving an institution’s brand and creating a student-centric learning environment that attracts positive attention. To successfully create an evaluation cycle, Gaul says each institution needs a defined project management process. “Each organization should map out a review process that defines individual roles and responsibilities. This should involve instructional designers, librarians, IT services, student support, and academic support. This process should not fall solely on the instructor. If you think of it like building a house, the faculty member is the homeowner, the instructional designer is the general contractor, and IT is your plumbing and electrical. Every person needs to be involved in the planning from day one to ensure a successful build.”

Building a Course Assessment System
Any time an institution begins assessing courses, whether at the system level or the individual course level, there are often barriers to overcome. “When technology is involved in instruction, there should be a collaborative effort to identify and overcome any hurdles,” says Gaul. “Technology should never lead academia; the teaching should lead the technology. We must remember that all students are cognitively different, and this is why Universal Design for Learning (UDL) leans toward accessibility, flexibility, and removing barriers to learning. These barriers can include inadequate support, where students do not know where to go for help, whether that’s technical, tutoring, writing style, etc. Access to support must be built into the course in order for students to feel supported and demonstrate emotional intelligence within the class.”

Other common barriers include a lack of a learning community and boredom. Without feeling connected to the instructor and other classmates, students can become isolated, and without interesting content and delivery, they can feel disengaged. “System barriers we regularly see with regard to course assessments involve implementation,” says Gaul. “Lack of commitment, poor preparation, and inconsistency can all affect the success of a course assessment. Unless there’s some sort of checks and balances, courses are going to be inconsistent, and students are going to have difficulty moving seamlessly between classes if they’re taking more than one online course. The purpose of building a course assessment system is to free up faculty and give them the proper support they need to be successful.”

“Whether a course is fully online, hybrid, HyFlex, or in-person, we can help make sure it meets all the standards of quality technology enhanced instruction. This can provide a level of risk management and quality control that can often get ignored when there’s too much focus on the tools, system recruitment, and retention. Member institutions can also count on web and educational technology support. Edge provides technology and web support service management frameworks and ticketing systems to help with website maintenance and web content management. Most importantly, we can help provide thought leadership in how to implement a systemwide course assessment and revision cycle.”

— Joshua Gaul
Associate Vice President & Chief Digital Learning Officer, Edge

Instructional Design Support
Designing and managing online courses can be a challenging task, especially without the resources and training to do so effectively. Well-versed in instructional design, the Edge team understands digitally enabled learning environments and how to evaluate online courses against standard industry rubrics. “Edge understands the methodologies, rubrics, and standards that go into the creation of a high-quality curriculum,” says Gaul. “We have worked with colleges and universities to conduct evaluations and identify trends we see in their courses. We can also build workshops to help train faculty and students and improve their understanding of why online instruction is different from traditional classroom learning. Specifically, we help prepare staff and students for the challenge of online education through engaging, student-centered experiences built to encourage online presence and active learning methodologies.”

Edge’s course and curriculum evaluation services are designed to help an institution deliver a top-quality product. “Whether a course is fully online, hybrid, HyFlex, or in-person, we can help make sure it meets all the standards of quality technology enhanced instruction,” says Gaul. “This can provide a level of risk management and quality control that can often get ignored when there’s too much focus on the tools, system recruitment, and retention. Member institutions can also count on web and educational technology support. Edge provides technology and web support service management frameworks and ticketing systems to help with website maintenance and web content management. Most importantly, we can help provide thought leadership in how to implement a systemwide course assessment and revision cycle.”

“Our team of experts can help an organization bridge the gap between technology and academia and lead a collaborative effort as opposed to two silos working in competition,” continues Gaul. “We can customize for smaller niche projects, support larger, longer-term initiatives, or become an extension of your team. Edge can provide documentation used in the project and whatever we produce will be owned by the institution, whether it’s a learning object or a series of training modules.”

Gaul says if online courses are not being reviewed and revised regularly, those learning experiences will not make an impact. “Revision cycles that are high quality, trust the data, and have accountability and responsibility are incredibly important to ensuring course content is engaging and impactful. Every institution should look at how their offices work together to create a course evaluation and revision cycle that is beneficial and supportive to the student. As you look for ways to improve your institution, Edge wants to help you transform your instruction, advance your online education, and find powerful ways to improve the way you do business.”

To learn more about optimizing courses for online learning and transforming the student experience, visit njedge.net/solutions-overview/digital-learning.

View Article in View From The Edge Magazine

The post Maintaining Quality Online Learning Programs appeared first on NJEdge Inc.


The Growing Demand for Instructional Designers


As the wave of digital transformation continues to change and shape higher education, the demand for highly skilled talent who understands instructional design is growing too. Especially over the last couple of years, when online learning skyrocketed, institutions had to quicken their pace in offering remote classes, while also creating new online courses, programs, and degrees as we entered a modern era of learning. Instructional designers or learning designers have become essential members of an organization, but not only are they difficult to find, many with these credentials also do not pursue roles in higher education. And for those who do work at colleges and universities, the increasing pressure to be experts in a multi-faceted profession, where institutions are investing in technology at an astounding rate, is causing many instructional designers to experience workplace burnout.

Many instructional designers find themselves responsible for designing courses, building learning materials, coding, project management, and ensuring the effective delivery of instructional materials and experiences. With such a high bar, it can be challenging for these individuals to keep up. “Many institutions look at their instructional designers as workhorses,” says Joshua Gaul, Associate Vice President & Chief Digital Learning Officer, Edge. “Oftentimes, faculty members bring the content to the instructional designer and they then organize the content and build the course from a technical standpoint.”

“The biggest benefit of instructional design is not just knowing how to use the learning management system (LMS) or how to repurpose your content and put a discussion board together,” continues Gaul. “These experts work with faculty and leadership to bounce off ideas and integrate learning theory and pedagogy. Instead of shouldering faculty members with learning design on top of teaching and working to elevate the curriculum, instructional designers can help lighten this load and bring an expert perspective that can be hugely valuable to an institution.”

“The biggest benefit of instructional design is not just knowing how to use the learning management system (LMS) or how to repurpose your content and put a discussion board together. These experts work with faculty and leadership to bounce off ideas and integrate learning theory and pedagogy. Instead of shouldering faculty members with learning design on top of teaching and working to elevate the curriculum, instructional designers can help lighten this load and bring an expert perspective that can be hugely valuable to an institution.”

— Josh Gaul
Associate Vice President & Chief Digital Learning Officer
Edge

Quality Assurance in Online Learning
The approach to instructional design and how the field is regarded varies across the higher education community, and even between departments within an organization. “Institutions view instructional designers differently, and it’s often tied to their current digital learning path,” shares Gaul. “The schools that were already forward thinking during the pandemic didn’t have as large a shift in their business processes. The organizations seeing the most change in these processes are the ones that embraced the change but had to adjust on the fly. For the schools that do have instructional designers on staff, there is not always a unified approach to instructional design. The school of biology has different-looking courses than the school of journalism, for example, but there need to be instructional design standards that ensure quality and compliance and offer a clear model for all courses to follow.”

“Rubrics like OSCQR and Quality Matters (QM) establish an instructional design support framework,” continues Gaul. “Some organizations fear a centralized online learning approach will make courses too uniform, when in actuality, standardization gives faculty more academic freedom to customize their courses without worrying about accessibility. For instance, think about a textbook. Each has a table of contents, an index, and is broken into chapters. While the subject matter may be vastly different, it still follows this basic format and when a student opens the book, they know how to use it to get the information that they need.”

Aligning Business Goals with Instructional Design
Instructional design not only encompasses online learning, but extends to in-person instruction and hybrid learning as well. Many in this profession report long hours, lack of support, tight deadlines, and unrealistic expectations, all of which can lead to frustration and fatigue. “Some smaller institutions have a centralized instructional design office, but it’s often poorly staffed and leans more toward instructional technology training,” says Gaul. “This team will train faculty on how to use the LMS or the educational technology tools.”

Gaul continues, “Not every organization is going to have the budget to employ an instructional designer, especially someone who has an advanced education and necessary skill set. This is where Edge can be of value and offer instructional design support. With a team of over twenty seasoned instructional designers, we have consultants across the country, all with at least a master’s degree and several years of experience. Our instructional designers will work with an organization’s subject matter experts to analyze, design, develop, implement, and evaluate instructional materials and programs for an institution.”

As a longtime partner of Edge, Rowan University has a full team of instructional designers, but wanted to free them up to focus on faculty support and taking learning to the next level. “Rowan was able to move the course development and term-to-term updates to Edge and give their instructional design team the ability to work more closely with faculty members to elevate their courses and create more engaging, student-centric content that meets quality standards. However, we understand many schools do not have the budget for a large instructional design team, and that is why EdgeLearn can be a valuable solution for institutions. Edge can provide the expert support, strategy, and tools needed to enhance teaching, learning, and student engagement, without breaking the bank.”

One of the most important factors in successfully implementing digital learning programs and systems is ensuring business processes and goals align with instructional design initiatives. “Technology doesn’t drive the mission, but the technology can be informed by and follow the academic mission of the institution,” says Gaul. “As long as you have the right tools in place and the people who understand those tools and processes, you can accomplish amazing things with a small group of instructional designers. Edge can supplement an institution’s existing team or provide expert assistance in developing student-focused curriculum.”

“Edge can come in and hit the ground running because our team knows every tool,” continues Gaul. “We understand how to use data to glean important insights about an organization’s instructional design or ways artificial intelligence (AI) can open doors to new opportunities for digital learning. No matter what LMS an institution is using, or what rubric they follow, we can offer the specialized support needed to help fill any gaps and create superior teaching and learning experiences—anytime, anywhere.”

Ready to discover how Edge’s Digital Learning, Strategy, and Instructional Excellence experts can help your organization?
Visit njedge.net/solutions-overview/digital-learning/.

View Article in View From The Edge Magazine

The post The Growing Demand for Instructional Designers appeared first on NJEdge Inc.


The Vital Role of Academic Libraries in the 21st Century


With a longtime love of reading since young adulthood, Ann D. Hoang began working in a library as a student employee at Rutgers University. Her experience there helped direct her career path and she knew right away that she wanted to stay in academia and academic librarianship. “Being able to meet other students and having the ability to work with faculty and support students was very fulfilling,” says Hoang, University Librarian at New Jersey Institute of Technology (NJIT); Chair, VALE Executive Committee. “My first full-time job was at Rutgers as a Reference & Materials Delivery Coordinator. This role was within the IT department and gave me the opportunity to learn more about computing. This included using the iMac when it was first released and loading CD-ROMs onto servers. I realized I could combine my love of reading and supporting students and faculty with an interest in expanding my knowledge of technology.”

Gaining valuable experience in project management, Hoang quickly moved up the ranks, and in 2006, was recruited by NJIT to join their team as the Assistant University Librarian. In this role she served as the chief information officer for library technology overseeing administrative, library information technology, and collections services. “During the eleven years in this position, I was able to further hone my project management capabilities and technology knowledge,” shares Hoang. “When the head of the library position opened at NJIT, I applied and following a rigorous nationwide search, was awarded the position, and I’ve now been in this role for seven years. Even though I’m in a strategic leadership position, I still love meeting with students and helping them find the resources they need.”

Navigating the road to her current role at NJIT wasn’t easy. Reflecting back on her early, formative years, Hoang describes, “My family left Vietnam in 1978 as political refugees. We had to flee the country by boat. We were known as the “Boat People” because of how we left Vietnam. After many months of wandering the ocean, we finally arrived in Hong Kong. We spent two years in refugee camps waiting to settle in the United States. We arrived in New Jersey on a cold September evening in 1980 without speaking English.” She continues, “I remember it was so dark, thinking about how I would live here when everything was big and dark. Reading kept me focused while learning English. I spent many hours and days in ESL (English as a second language) classes. It took several years to understand the language with support from teachers and students of similar backgrounds. This part of my life helped me appreciate why engaging and supporting students is essential.”

“Many times, students are writing research papers and need to choose the right keywords to use when searching for resources,” continues Hoang. “I like being able to break down the process and instruct them on the steps needed to find information, as well as explain the difference between peer-reviewed content and block quotes off the Web. My team continually gathers feedback on how we can better support students and faculty. A large part of this is providing off-campus access to resources, anytime, anywhere. The library has evolved far past a quiet space full of books; the underlying infrastructure is now about technology and facilitating seamless access to much-needed content.”

“Professional development will give library staff new skill sets and make sure we’re on par with understanding our faculty’s needs and how to integrate technology successfully. This includes using certain tools like artificial intelligence (AI) and virtual reality and how to provide data management services. To support students, we also have an opportunity to partner with faculty to develop freely available content that is implemented within their curriculum. Since textbooks are often not affordable for many students, this solution could give free access to materials and help students be more successful in their coursework.”

— Ann D. Hoang
University Librarian New Jersey Institute of Technology (NJIT);
Chair, VALE Executive Committee

The New Age of Academic Libraries
In the past, traditional libraries focused on collecting physical print items, but as we entered the digital age, open access to resources became paramount. “The biggest challenge for us is providing instant access to our collection,” explains Hoang. “The only way to address this need is converting print to electronic, which is very time-consuming. And while vendors have designed a way to create digitally native content, this service can be costly. We can create a vicious cycle where faculty write grants to conduct research and pay to publish the research, followed by the libraries paying to acquire access to that research. We end up double or triple paying for the same content. To remain valuable to the institution, we must determine how to continuously provide access to that digital collection, including special collections.”

Digitizing a library’s special collections is necessary, but a labor-intensive process. “Much of the older content is not available digitally and we must determine how content that was created in the 1700s and 1800s, for example, can be converted into a primary source for users to access digitally,” says Hoang. “This is an important challenge and institutions need staff with the technical skills to complete these tasks, along with the technology infrastructure. Partnering with our IT department is crucial to understanding our limitations and ensuring our content is protected behind the firewall, so our institution’s network is not compromised. Libraries are being tasked with understanding technology and how to use these tools to facilitate and simplify our work.”

In addition to harnessing technology in useful ways, many libraries also seek to gain insights about their users and how their resources are being used. “Traditionally, libraries do not collect data, but it would be helpful to understand who is using each resource, and particularly, the number of students and faculty,” says Hoang. “This information could help determine if the resource is worth being part of our collection. We must learn how to collect data, as well as help our faculty collect that data as they conduct their research and build large data sets. Additionally, we must determine how the faculty can collect, store, and share their data sets. The library is in a perfect position to develop a data management service to help our faculty follow best practices for storing data and putting it in a format that allows other researchers to access it easily.”

For library staff to support data management, training will be an important part of achieving this goal. “Professional development will give library staff new skill sets and make sure we’re on par with understanding our faculty’s needs and how to integrate technology successfully,” explains Hoang. “This includes using certain tools like artificial intelligence (AI) and virtual reality and how to provide data management services. To support students, we also have an opportunity to partner with faculty to develop freely available content that is implemented within their curriculum. Since textbooks are often not affordable for many students, this solution could give free access to materials and help students be more successful in their coursework.”

“VALE serves as an incredible resource for higher education libraries across New Jersey, providing thought leadership, consortium buying of subscription databases, professional development events, and more. Edge is sincerely pleased and honored to work with Ann and the VALE organization to support their business organization needs. Moreover, Ann’s leadership of VALE and her role as University Librarian at NJIT, a rapidly evolving Research 1 institution, uniquely position her to represent the achievements of institutional libraries across the State.”

— Sam S. Conn, Ph.D.
President and Chief Executive Officer, Edge

Expanding Information Literacy
As AI continues to transform the higher education landscape, the use of generative AI tools is on the rise. According to Inside Higher Education, 9 percent of faculty members had used AI writing tools in the spring of 2023, but that rate had risen to 22 percent by the fall. The adoption rate was even higher among students, where the rate jumped from 27 percent in the spring to 49 percent in the fall.1 “The emergence of AI presents both challenges and opportunities,” says Hoang. “Using AI positively can lead to many innovations; however, using it negatively can be a detriment to society, so educating students and faculty on the ethical use of AI will be essential. We also need to explore how AI can enhance our data collection and decision-making. This technology can help us synthesize large sets of data and augment the decision-making process, allowing us to solve problems more creatively.”

“Navigating the ethical issues surrounding AI is critical and we must ease the fear that robots are going to take over our jobs,” continues Hoang. “Because in theory, if we use AI properly, this technology can help us synthesize information and increase our skills, allowing us to be one step ahead of the AI tool. Therefore, we become a valuable component of our institutions and can use advanced tools to make informed decisions. We must continue to expand our knowledge of AI and its capabilities to not only be prepared to seize important opportunities, but also to identify solutions that can combat the misuse of this technology.”

To introduce students to available AI tools, librarians at NJIT have regular information sessions and conduct demonstrations to show how AI can answer different questions. “We will compare the information provided by AI to the information we look up manually,” explains Hoang. “Sometimes AI can give the wrong answer, so if we just accept that information, we could potentially create something that is false or a societal detriment. Students must understand how to validate information and avoid plagiarizing someone else’s work. The library’s role is to educate students on the use of AI, while also helping them stay versed in finding information manually, using the search engine, and using our subscription to library resources to find what they need. We want to help them understand how to find the government regulations and rules set forth by an institution surrounding AI and ways to avoid violation of regulatory guidelines.”

A key part of the librarian’s role is helping expand students’ information literacy and their ability to locate and evaluate specific information. “The need for information literacy has exploded with the use of AI,” says Hoang. “Librarians must continue to be at the forefront of changing trends and technologies to teach students the correct way to look up information and verify that the information is accurate before continuing. That component seems to be the missing link with the newer generations because many students feel the Internet is the ultimate resource. Our faculty members partner with librarians and allow us to share this knowledge in the classroom and explain the rationale behind verifying information.”

Supporting Faculty Research
Research continues to be a top priority in NJIT’s strategic plan. In 2022, the institution was reaffirmed as an elite research university and retained its R1 status by Carnegie Classification. “Our faculty is committed to expanding our research initiatives and pursuing grant funding to drive further innovation and discovery,” says Hoang. “To help in this endeavor, the library must ensure we have the infrastructure to support the research output of our faculty. As more research is conducted, the higher the need to store and preserve data. I am currently recruiting an open access and scholarly communications librarian with the skills to support open access and data management services. I also look to expand our team with individuals who have computer science technology, data science, and information technology backgrounds and can support our faculty in their research projects.”

“With librarians involved in supporting faculty research, we can help ensure information is documented properly and they are compliant with copyright laws,” continues Hoang. “We also can help identify possible collaborative partnerships and conduct background research on which journals would be the best match for publishing their work. We can identify which authors have the highest rating or citation impact on their research and connect our faculty with other institutions on a national and global scale. The library is poised to provide valuable insight into how our faculty can expand the reach of their research and find collaborators around the world.”

Facilitating Collaboration
The Virtual Academic Library Environment (VALE) is a consortium of fifty New Jersey college and university libraries, the New Jersey State Library, and LibraryLinkNJ (LLNJ). “The primary focus of VALE is offering a cooperative purchasing agent, similar to EdgeMarket,” says Hoang. “They negotiate a licensing contract with different vendors so libraries can purchase through the consortium without having to negotiate back and forth with the vendor. I was on VALE’s website committee early in my career to help develop websites that provided researchers with easy access to necessary resources, both virtually and in person.”

Currently serving as VALE Chair, Hoang and the Executive Committee discuss recommendations for purchasing databases, hardware, and software, and how libraries can meet the evolving needs of the academic community. “We are currently exploring how we can continue to integrate technology tools and ensure our librarians are equipped with necessary skills,” shares Hoang. “The Committee is looking at digital transformation, open access, publications, scholarships, and how we can house special collections that can be shared across academic institutions in New Jersey. We are exploring how we can work collectively to create a centralized, statewide repository that houses this content and can be accessed free of cost, anytime, anywhere.”

VALE hosts a conference on the first Friday of January to bring academic librarians from throughout the region together to discuss current topics, technology, and strategies for supporting student success. Along with John Kromer, Matthew Brown, Jill Lagerstrom, and Lisa Weissbard, the NJIT librarians presented a breakout session at VALE’s 2024 New Jersey Academic Libraries Conference on January 5, 2024. Entitled Open Textbook Publishing from Conception through Completion: A Proof of Concept, the session explored their multi-year journey of publishing their first open textbook, An Animated Introduction to Digital Logic Design, by NJIT professor John Carpinelli. They shared details about securing and apportioning funding, collaborating with a faculty author, compiling and editing the book, determining requisite technology, publishing in an open repository, tracking usage, and plans for the future.

Events like VALE’s annual conference provide valuable networking opportunities and a chance for librarians and the VALE team to prepare for their presentations at national and international conferences. “The VALE conference acts as a starting point or test where presenters are able to gather feedback on how to improve their presentations and be better prepared for a wider audience,” says Hoang. “Through our partnership with Edge, we are now soliciting sponsorships to supplement the conference to help us continue to provide the event to librarians free of charge.”

A Vision for the Future
As we look toward the future of libraries, Hoang says the biggest challenge will be to keep pace with advanced technology like AI. “We’re just starting to scratch the surface of understanding the impact AI will have on our day-to-day lives. If we fail to keep up with learning more about these technologies and developing programming around AI, we risk missing out on key opportunities. We must also embrace digital transformation and determine how to provide seamless access to resources; otherwise, we lose our value, and our students and faculty will turn to someone else for that information. In addition, libraries need collaborative flexible spaces for our students to learn and we must find innovative ways to support different learning styles, whether that is quiet study or spaces for louder, more active collaboration.”

“Libraries play a vital role in enhancing students’ learning capabilities and will need to provide interactive technologies and tools that students can use to apply what they learn in the classroom,” continues Hoang. “All these goals require thoughtful planning, strategic thinking, network infrastructure, and professional development. The library is integral to institutional success and we must raise awareness about its importance. Traditionally, librarian positions are greatly underpaid. Yet in today’s world, these individuals are expected to have advanced education and skills equal to someone in IT or computer science. I feel that by contributing to VALE, I am supporting my profession and community and helping my institution excel in the 21st century.”

View Article in View From The Edge Magazine

The post The Vital Role of Academic Libraries in the 21st Century appeared first on NJEdge Inc.


Steering La Salle University into the Future of Learning


Beginning his career in 1988 at St. Joseph’s University as a media technician, P. David Lees, Ed.D., Director of Distributed Learning & Educational Technology, La Salle University, has had a front-row seat to the evolution of technology in higher education. “This field is constantly changing, and it has been a very interesting journey so far,” says Lees. “I started out delivering overhead projectors to classrooms and later facilitated state-of-the-art multimedia classrooms. Toward the end of the 1990s, the institution began exploring distance learning through video conferencing systems, chat rooms, and online instruction. Our first learning management system (LMS) was CourseInfo, a precursor to Blackboard. Being a pioneer that experimented with different technologies has kept my career extremely interesting.” In the late 1990s, the University received a federal grant to create the Early Responders Distance Learning Center (ERDLC), which worked with government and other agencies to create online training for different first responders. The University was far ahead of its time in implementing distance learning.

Later, in 2012, with government funding ending, the ERDLC was searching for a place at the University. “We had been actively growing our online programs,” explains Lees. “We decided to combine a part of my old department with the ERDLC and created the Department of Academic Technology and Distributed Learning. During my time at St. Joseph’s, we started a Technology Innovation Fund where we would fund technology initiatives to help meet faculty’s learning objectives. In early 2010, there was a grant to provide iPads to faculty members. I later heard from a faculty member how transformative it was to receive that technology and how it helped advance the teaching and learning experience.”

“The partnership with Edge has been terrific and they have helped with our course development. Our OPM does provide some assistance with instructional design and Quality Matters (QM) reviews, but we did not have anyone internal to work with these organizations and provide daily Canvas support. Enter Joshua Gaul, Associate Vice President and Chief Digital Learning Officer, Edge, who comes in and saves the day a lot of the time. We currently have a six-month contract with Edge while we look to hire two instructional technologists, and may extend that based on our needs. Josh is also helping us with the interviewing process and finding suitable candidates.”

— P. David Lees, Ed.D.
Director of Distributed Learning & Educational Technology, La Salle University

Filling Skills Gaps as Online Learning Grows
Joining La Salle University in 2017 as the Director of Online Hybrid Learning, Lees says the institution had just partnered with an online program management (OPM) provider. “For most of my career, I had been on the faculty development side of online learning. With this new role, I had to broaden my scope and dive further into the business end of online learning, including marketing, enrollment, and retention. I have continually championed the OPM because we’ve been able to grow our programs. In the fall of 2017, La Salle launched an online Registered Nurse-Bachelor of Science in Nursing (RN-BSN) and Master of Business Administration (MBA). In 2019, we launched two Nurse Practitioner tracks with the OPM, which brought our online offerings to nearly 20 programs.”

However, Lees did keep his ties with faculty and course development: “When I joined La Salle, the instructional design team did not report to me, but we worked closely together. After a couple of years at the University, there was a reorganization, and the ID team began reporting to me.” Hence, the Distributed Learning and Educational Technology department was born.

Lees earned his master’s degree and doctorate at a distance in the early stages of the online revolution. “From my experience as an online student, in developing online courses, and as an online instructor, I gained valuable insight into remote education.” Last year, after members of the ID team resigned, Lees was left to learn Canvas very quickly and cover the tasks of several people. “Thankfully, I had just signed up for an Edge conference and learned about the support Edge could provide, so I got in touch with the Member Engagement Manager, Erin Brink,” says Lees. “The University incorporated new technologies during the pandemic, which the ID team had to manage, and we still had programs to launch in the summer of 2023. The support from Edge allowed us to keep moving forward.”

“The partnership with Edge has been terrific and they have helped with our course development,” Lees continues. “Our OPM does provide some assistance with instructional design and Quality Matters (QM) reviews, but we did not have anyone internal to work with these organizations and provide daily Canvas support. Enter Joshua Gaul, Associate Vice President and Chief Digital Learning Officer, Edge, who comes in and saves the day a lot of the time. We currently have a six-month contract with Edge while we look to hire two instructional technologists, and may extend that based on our needs. Josh is also helping us with the interviewing process and finding suitable candidates.”

As at many institutions in higher education, Lees says finding and retaining instructional designers has been challenging. “Instructional designers and instructional technologists are in high demand, but have tended to stay only short term, oftentimes just a year. We’ve tried to remedy this with competitive salaries and professional development offerings, but many times these individuals move to larger institutions or corporate organizations. While instructional design tasks can regularly be performed remotely, we like our designers to be on campus, but we’re trying to offer more flexibility to help attract talent to La Salle.”

Lees says having Edge’s support has also helped the University launch four master’s programs and six certificates this past summer. “Many of these courses had been built but needed revisions. We’re also launching stackable certificates in the business school in January. While we receive support from our OPM, Josh has taken on some of the heavy lifting and handled many of the daily tasks that I do not have time to do; allowing me to focus on other important initiatives.”

Embracing Technology to Enhance the Learning Experience
As new technologies continue to evolve, artificial intelligence (AI) appears to be front and center and is driving many thought leaders to explore the potential opportunities and capabilities of this advanced technology. “There is a lot of hype around AI, but I think this technology will have a fundamental change on how we’re doing things across many industries, including higher education,” says Lees. “In many of our discussions, we’re seeing how AI can transform teaching, change how students learn, and offer tutoring and deeper critical thinking opportunities. While we’re in the Wild West phase of AI, it will take ongoing discussions and creativity to determine how to harness this technology in the most beneficial and impactful way. Just like video, iPads, and SMARTboards help transform education, technology like AI will take us into a whole new era of learning.”

View Article in View From The Edge Magazine

The post Steering La Salle University into the Future of Learning appeared first on NJEdge Inc.


Understanding State-of-the-Art Cyber Threats and Optimizing Your Budget for Cyber Defense


Cyber attacks are becoming increasingly sophisticated and are targeting not only corporate entities, but also educational institutions with valuable data. Reports show that educational institutions fall among the top targets for cyber attacks, where incidents in this sector have increased by 44 percent from the prior year.1 “We have definitely seen an uptick in cyber attacks in the last few years and education and local governments are getting hit pretty hard,” says Dr. Dawn Dunkerley, Virtual Chief Information Security Officer, Edge. “Especially in the higher education space where government-funded research projects are being conducted, we’re seeing an increase in threats where people would like to gain access to this information. Institutions must gain an understanding of the current threat landscape and take proactive measures to fortify their defenses.”

The Emergence of New Age Cyber Threats
Cyber attacks are popping up in every industry and across every region as hackers continue to find new ways to take advantage of vulnerabilities. With the rise of artificial intelligence (AI) and its growing integration into our day-to-day lives, the landscape is changing rapidly and a new age of cyber threats is emerging. “Cybercriminals use AI-powered phishing attacks and create intelligent malware to infiltrate networks or devices,” explains Dunkerley. “While AI tools are providing us with many opportunities to streamline and automate processes, this technology can also open the door to cyber crime. In previous years, phishing attacks were common but easier to spot; you might receive a poorly worded email with misspellings that gave it away.”

“Cybercriminals are now using ChatGPT, for example, to create more sophisticated attacks,” continues Dunkerley. “This AI-powered language model can generate human-like text that is more difficult to identify as malicious and can contain code and malware designed to attack an organization’s network.” Another trend on the rise is deepfake AI, in which convincing images, audio, and video of real people are generated by machine learning. “Deepfakes present new challenges with fraud and misinformation, identity theft, and political manipulation,” says Dunkerley. “We have started to see deepfake voice issues where people are creating synthetic voices to attempt to gain access to organizations. So, while AI can lead to incredible opportunities, there are also threats we need to be aware of and ensure protective measures are put in place.”

“Organizations must recognize the importance of cyber defense and that cybersecurity is not where you should cut corners. If you lose the ability to process financial aid, for example, the consequences could be disastrous. To create a strategic plan for success, you want to clearly identify your budgetary needs and the cybersecurity resources that will help enhance your defense.”

— Dr. Dawn Dunkerley
Virtual Chief Information Security Officer
Edge

The MOVEit Attack
One of the most prominent cyber attacks of 2023 is the MOVEit breach, which compromised confidential data across a wide range of entities. “The latest count of organizations compromised by the MOVEit breach is over 1,000, and more than 60 million individuals have been affected,” shares Dunkerley. “MOVEit is a piece of software that was created as a file-sharing tool. Current investigations suggest its source code was compromised by a ransomware actor, who was then able to gain access to organizations that were using the client-based version of the MOVEit software.”

The education sector was among the industries affected by this cyber attack; reports say nearly 900 colleges experienced a data breach during the mass hack. Organizations affected by the breach include the National Student Clearinghouse, Teachers Insurance and Annuity Association of America-College Retirement Equities Fund (TIAA-CREF), and Pension Benefit Information (PBI) Research Services. “For the National Student Clearinghouse, thousands of student enrollment and other records were involved in the breach,” explains Dunkerley. “This attack has two tiers of compromise, where the lower tier is basic information, including student name and non-sensitive data, and the second tier includes more personal information like Social Security numbers. We are working diligently with our member institutions that had information exposed to help provide the steps needed to protect themselves.”

TIAA is well known in the education space for helping provide retirement tools for people in academic, government, medical, cultural, and other nonprofit fields. The organization confirmed that one of its third-party vendors had been exposed in the MOVEit breach, and as a result, a class action lawsuit has been filed alleging a breach of nearly 2.4 million personal records. PBI Research Services, a third-party vendor, had to inform its customers when it became aware of the MOVEit data breach. “This incident shows us how strongly we can be affected by third parties that are working on our behalf,” says Dunkerley. “It is becoming increasingly clear that we must make a strong push to understand vendor risk and manage the impact of such events on the education sector.”

“At Edge, we actively monitor our third-party risk and urge institutions to understand who their critical vendors are, who has access to the network, what is their critical software, student information systems, learning management systems, and critical infrastructure. Your organization can also benefit from investing in advanced tools. Technology continues to emerge that uses AI for continuous threat monitoring and detection. We want institutions to be equipped with the tools they need to create a holistic security approach that is both effective and affordable and allows you to make proactive, responsible decisions to improve cybersecurity within your organization.”

— Dr. Dawn Dunkerley
Virtual Chief Information Security Officer
Edge

Investing in Cyber Defense
The financial impact of a cyber attack is not just in the moment; it also extends into rebuilding IT systems and restoring data. “There are significant costs, including people, process, and technology, that are associated with response and recovery,” says Dunkerley. “You may have to stand up call centers to receive phone calls, or stuff and send out envelopes with notification letters. Educational institutions may also face significant legal fees and penalties following a cyber attack, and can experience reputational damage, impacting future enrollment and funding.”

Cyber attacks can cause significant disruption to learning, with systems and networks often being taken offline for extended periods of time. This can result in lost revenue and enrollment for educational institutions. “Along with the cost of recovery efforts, a data breach can have a huge impact on learning,” says Dunkerley. “You may see a disruption to classes, assignments, and educational resources. Cyber attacks can cause systems and networks to be taken offline, which can disrupt classes and assignments, and result in students falling behind in their coursework.”

Identity theft and other privacy violations are also a major concern when sensitive student data, such as personal information and grades, are compromised. “Cyber attacks can also disrupt research projects at an institution and can impact funding and progress in the field of education,” says Dunkerley.

Optimizing your Cyber Defense
Understanding the evolving threat landscape is an important step in creating a strong cyber defense strategy that prioritizes and optimizes a cybersecurity budget, addresses the most critical functions, and enhances the defense against emerging threats. “Organizations must recognize the importance of cyber defense and that cybersecurity is not where you should cut corners,” explains Dunkerley. “If you lose the ability to process financial aid, for example, the consequences could be disastrous. To create a strategic plan for success, you want to clearly identify your budgetary needs and the cybersecurity resources that will help enhance your defense.”

To optimize a cyber defense budget and prioritize spending, institutions can benefit from directing spending toward critical functions like firewall enhancements, user education, and intrusion detection systems. “Training and education are also critical parts of a cybersecurity strategy, as are creating an incident response team and a tested incident response plan,” says Dunkerley. “You want to be able to answer the question, how can we improve general user behaviors? When this behavior or something else fails, how do we identify that something has happened? Knowing these answers can then inform how to react and recover appropriately.”

Creating a Culture of Cyber Awareness
Under the Safeguards Rule of the Gramm-Leach-Bliley Act (GLBA), institutions are required to protect private data in accordance with a written information security plan created by the institution. To be compliant, organizations must use administrative, technical, or physical safeguards to access, collect, distribute, process, protect, store, use, and dispose of customer information. Requirements include using proper software, testing and monitoring vulnerabilities, and providing employee training and education.

To enhance your institution’s cyber defense, Dunkerley says conducting a comprehensive risk and vulnerability assessment is a good place to start. “It’s important to understand what your top risks are from a confidentiality, integrity, and availability perspective and the impact of each. You also want to understand your overall vulnerability, including physical security. This goes beyond just a scan; you want to gain a comprehensive view where you can identify possible vulnerabilities before they become a threat.”

The GLBA requires institutions to develop an incident response plan and test it regularly. “We definitely recommend reviewing your response strategy at least every year, if not every six months,” says Dunkerley. “The response team also needs to include team members beyond just the IT staff. In the event of a breach, there are potentially internal communications that will go out to the faculty, staff, and students, as well as external communication to other organizations. There will also be a legal aspect. The incident response team is multi-faceted and will require a cooperative effort during response and recovery.”

Edge encourages institutions to understand third-party vendor risk and recommends a vendor risk management program. “At Edge, we actively monitor our third-party risk and urge institutions to understand who their critical vendors are, who has access to the network, what is their critical software, student information systems, learning management systems, and critical infrastructure,” explains Dunkerley. “Your organization can also benefit from investing in advanced tools. Technology continues to emerge that uses AI for continuous threat monitoring and detection. We want institutions to be equipped with the tools they need to create a holistic security approach that is both effective and affordable and allows you to make proactive, responsible decisions to improve cybersecurity within your organization.”

To learn more about building a proactive cyber defense strategy and investing wisely in your cybersecurity infrastructure, visit njedge.net/solutions-overview/cybersecurity/.

1. Check Point’s Mid-Year Report for 2022. August 2022.

View Article in View From The Edge Magazine

The post Understanding State-of-the-Art Cyber Threats and Optimizing Your Budget for Cyber Defense appeared first on NJEdge Inc.


Hyperledger Foundation

Meet Aries Agent Controller, a New Hyperledger Lab

A code base developed and contributed by Superlogic that facilitates deploying Hyperledger Aries agents in cloud environments is the latest Hyperledger lab. The new lab, Aries Agent Controller, is now officially a part of the Hyperledger ecosystem, and we are excited to work with the broader community to grow it.  


Identity At The Center - Podcast

We are thrilled to announce a new Sponsor Spotlight on the Identity at the Center podcast!

We are thrilled to announce a new Sponsor Spotlight on the Identity at the Center podcast! We had the pleasure of hosting Marco Venuti, Director of IAM Business Acceleration for Thales, and Jason Keenaghan, Director of IAM Product Management for Thales.

In this episode, we explore the Thales Cloud Security OneWelcome Identity Platform and its comprehensive solution for managing digital identities. We dive deep into the world of B2B IAM and discuss its differences from B2C and B2E IAM.

You can listen to the episode on IDACPodcast.com or in your favorite podcast app. Don't miss out on the insights and expert perspectives straight from the source!

A big thank you to Marco and Jason for joining us and sharing their valuable knowledge.

#iam #podcast #idac

Wednesday, 28. February 2024

Elastos Foundation

Beatfarm Digital and Elastos Collaborate on Music-focused Web3 Platform

Beatfarm Digital (“Beatfarm”) and Elastos today unveiled a collaboration to deliver ‘positive disruption’ to the music business based on new inscription technology and Blockchain-based music consumption models.

The music creation and performance industry is notoriously inefficient when it comes to matching artists with potential collaborators and – even more so – when remunerating and protecting the rights of creators themselves. Research from industry research firm MIDIA suggests that 1% of artists make a staggering 77% of revenue related to recorded music sales, a trend that is actually becoming more regressive over time. According to research published in 2019 in The Journal of Business Research, in 1982, 5% of the top-earning artists accounted for 62% of concert revenues globally; by 2003 that proportion had risen to 84%.

It is a shift that has only been exacerbated by the emergence of new formats and technology: by the turn of the century, while the top 1% of musicians accounted for 75% of revenue from ‘traditional’ formats such as CDs, they earned an even higher proportion – 79% – of subscription and streaming revenue. Elastos’ partnership with Beatfarm represents a welcome alternative to this trend, with technology helping to put musicians and artists in control of their work, who they work with, and how the resulting work is monetized.

Through the collaboration, artists will have direct and secure access to all aspects of the music ecosystem – from composition and production to merchandising, promotion, and genuine ‘superfans’ – complete with a direct transaction mechanism based on Elastos’ recently launched BeL2 technology, enabling them to establish Smart Contracts on their own terms and be remunerated directly in Bitcoin. The resulting contracts – eScriptions – are secured and assured through Bitcoin and can themselves be traded through a decentralized marketplace.

“The Elastos chain is an ideal platform for providing artists the tools and resources to control the monetization of their content and develop groundbreaking ways to connect with their fans in ways which the industry hasn’t seen before”, said Beatfarm’s Co-Founder, Alex Panos.

“Our collaboration with Beatfarm reflects everything that Elastos is about and what BeL2 can deliver. Now artists and creators will not only have direct access to unlimited collaborators and resources, they’ll be able to partner with them on their terms, retaining full control and ownership of their work. This is the very promise of the SmartWeb in action,” says Jonathan Hargreaves, Elastos’ Global Head of Business Development & ESG.

About Beatfarm

Beatfarm is a Web3 platform focused on the music industry whose mission is to provide artists the tools and resources to control the monetization of their content and develop new sources of revenue through direct collaboration with fans.  

Developed by music industry veterans, Beatfarm aims to become the priority destination for direct artist monetization and enhanced artist-to-fan engagement. Follow @beatfarm_io on X

Join Us on the Journey

As we continue to build the SmartWeb, we invite you to learn more about Elastos and join us in shaping a future where digital sovereignty is a reality. Discover how we’re making this vision come to life at Elastos.info and connect with us on X and LinkedIn.

 

The post Beatfarm Digital and Elastos Collaborate on Music-focused Web3 Platform appeared first on Elastos.


Next Level Supply Chain Podcast with GS1

Behind the Barcode: Mastering 2D Barcodes with GS1 US's Gena Morgan

Keeping track of product information and inventory with multiple barcode types can be tricky for businesses. 

Gena Morgan, who leads the standards team at GS1 US, shares valuable insights into the world of barcodes, specifically focusing on the transition from traditional 1D barcodes to 2D barcodes and the importance of GS1 standards in driving industry adoption. Gena explains the technical differences between traditional linear barcodes and 2D barcodes, such as QR codes and GS1 DataMatrix, highlighting the increased data capacity and smaller footprint of 2D barcodes. 

She elaborates on the potential consumer and business benefits, emphasizing the ability of 2D barcodes to provide more accurate and direct information to consumers, streamline supply chain processes for brands and retailers, and enable functionalities such as product recalls and promotions. The discussion delves into the challenges and opportunities presented by the transition to 2D barcodes, as well as the support and resources available for brands looking to embark on this journey. Gena's expertise on the subject makes for an enlightening and informative conversation, encouraging businesses to consider the advantages of 2D barcodes and GS1 standards in their operations.
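
To make this concrete, here is a minimal TypeScript sketch of how a GS1 Digital Link URI, the kind of identifier a 2D barcode can encode, might be assembled from a product's GS1 keys. The application identifiers used (01 for GTIN, 10 for batch/lot, 21 for serial) are standard GS1 AIs, but the resolver domain, function, and field names are illustrative assumptions rather than anything taken from a GS1 US toolkit.

// Minimal sketch: building a GS1 Digital Link URI that a QR code or GS1 DataMatrix could encode.
// AI 01 = GTIN, AI 10 = batch/lot, AI 21 = serial number; the resolver host is illustrative.
interface ProductIdentification {
  gtin: string;      // 14-digit Global Trade Item Number
  lot?: string;      // optional batch/lot number
  serial?: string;   // optional serial number
}

function buildDigitalLink(p: ProductIdentification, resolver = "https://id.gs1.org"): string {
  let path = `/01/${encodeURIComponent(p.gtin)}`;
  if (p.lot) path += `/10/${encodeURIComponent(p.lot)}`;
  if (p.serial) path += `/21/${encodeURIComponent(p.serial)}`;
  return `${resolver}${path}`;
}

// One URI of this form can drive both consumer-facing content and supply chain processes.
console.log(buildDigitalLink({ gtin: "09506000134352", lot: "ABC123" }));
// -> https://id.gs1.org/01/09506000134352/10/ABC123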

 

Key takeaways: 

 The transition from traditional barcodes to 2D barcodes allows brands to provide information to consumers and tailor experiences. 

The adoption of 2D barcodes in the industry allows products to carry more data in a smaller footprint.

GS1 US supports brands transitioning to 2D barcodes and GS1 digital link standards with pilot programs and toolkits. 

 

Connect with GS1 US:

Our website - www.gs1us.org

GS1 US on LinkedIn

 

Connect with guest:

Gena Morgan on LinkedIn

 

Resources:

Learn More About 2D Barcodes

Resources for the Transition from 1D to 2D Barcodes

Fresenius Kabi Infuses Safety from Production to Patient with Unit-of-Use 2D Barcodes

 

Tuesday, 27. February 2024

Hyperledger Foundation

Building Better Together: Insights from the Governing Board to Mark Hyperledger Foundation’s 8th Anniversary

As a follow-up to Hyperledger 8: A Celebration of Building Better Together, Daniela Barbosa asked our Governing Board Representatives for their take on the success and value of Hyperledger Foundation as well as the technical priorities they see for the community. 


Oasis Open Projects

OASIS Members to Advance Global Standard for Computing Ecosystem Supply Chain Data Exchange

Cisco, Hewlett Packard Enterprise, Intel, Micron, Microsoft, U.S. NIST, SAP, and Others to Develop Use Cases, Standards, and APIs that Enable End-to-End Visibility for Supply Chains

Boston, MA – 27 February 2024 – Members of OASIS Open, the international open source and standards consortium, have formed the Computing Ecosystem Supply Chain Technical Committee (CES-TC). Leaders in the computing and semiconductor industries established the TC with aims to revolutionize global supply chain dynamics through standardized data exchange. With digital transformation rapidly reshaping industries and systems worldwide, the imperative for seamless data exchange has never been more pronounced.

This collaborative endeavor highlights the consensus in the computing ecosystem that digital transformation requires standardized data exchange among member companies over a network. The TC will focus on developing use cases, data schemas and ontologies, and APIs that enable end-to-end visibility for supply chains. The TC’s work will facilitate building resilient capacity, trusted hardware and software, secure systems, and sustainable practices to benefit all customers and end-users.

“Standardization plays a pivotal role in establishing secure and sustainable systems, which are crucial for the evolving digital landscape,” noted Joaquin Sufuentes, CES-TC co-chair, of Intel. “As the CES-TC sets its course, it signifies the collective dedication of OASIS members to lead the charge in technological advancement that directly enriches industries and end-users. The TC’s work will extend to smart contracts that drive logic functions, process automation, and role-based entitlements within the blockchain context.”

“TC contributions will focus on the data schemas and ontologies that define the attributes and entities and a REST API model for putting the data into and getting the data from blockchain or other distributed infrastructure,” said Tom Dodson, CES-TC co-chair, of Intel. “Through standardized approaches, we are empowering industries with the tools necessary to navigate the complexities of the digital age.”
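
As an illustration of that approach (the CES-TC has not yet published its schemas or APIs, so every type, field, and endpoint name below is hypothetical), a TypeScript sketch of a record type plus a small REST client shows the general shape of putting data into, and getting data from, a shared supply chain store.

// Hypothetical sketch only; the CES-TC's real schemas, ontologies, and APIs are still to be defined.
interface ComponentShipmentRecord {
  shipmentId: string;     // unique identifier for the shipment
  supplierId: string;     // entity that produced the components
  partNumbers: string[];  // components included in the shipment
  attestations: string[]; // references to hardware/software provenance attestations
  timestamp: string;      // ISO 8601 time the record was created
}

// A minimal REST-style client for a blockchain-backed or otherwise distributed data store.
class SupplyChainClient {
  constructor(private baseUrl: string) {}

  async putRecord(record: ComponentShipmentRecord): Promise<void> {
    await fetch(`${this.baseUrl}/shipments`, {
      method: "POST",
      headers: { "Content-Type": "application/json" },
      body: JSON.stringify(record),
    });
  }

  async getRecord(shipmentId: string): Promise<ComponentShipmentRecord> {
    const res = await fetch(`${this.baseUrl}/shipments/${shipmentId}`);
    return res.json();
  }
}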

Participation in the OASIS CES-TC is open to all through membership in OASIS. The profile for the types of contributors to the CES-TC include business stakeholders responsible for product delivery, technical experts managing integrations, supply chain professionals, data specialists focusing on ontologies, government representatives concerned with traceability, and industry professionals driving digital transformations.

Support for the CES-TC
Cisco
“The OASIS CES-TC represents a great advancement in standardizing and securing the supply chain of the digital age. By focusing on the development of universally accepted data schemas, APIs, and smart contract specifications, this effort is laying the groundwork for transparency, efficiency, and security in supply chain management. I fully support CES-TC’s efforts to create a more resilient and trustworthy digital ecosystem.”
– Omar Santos, Distinguished Engineer, Cisco | OASIS Board of Directors

Intel
“Working as an ecosystem for the benefit of customers and end users of our computing products requires that we operationalize how we collaborate with data in real time to build more efficient operations and new revenue services. We want to standardize and scale the ability to share the right data and signals.”
– Paul Dumke, Senior Director, Ecosystem Strategy & Operations, Intel Corporation

Micron
“The storage and memory business is complex and competition is fierce. Micron’s success depends on our ability to innovate, and with more than 50,000 lifetime patents, we take innovation very seriously. The value chain ecosystem is no exception. Ecosystem innovation is the next frontier and Micron is thrilled to be on this journey with our fellow CES-TC members.”
– Matt Draper, Senior Director of Micron Supply Chain Optimization

Additional Information
CES Project Charter

The post OASIS Members to Advance Global Standard for Computing Ecosystem Supply Chain Data Exchange appeared first on OASIS Open.


Origin Trail

The ON TRAC(k) podcast returns! Episode 2 on Delegated Staking, AI Agents, & More

We’re excited to announce that the ON TRAC(k) podcast will return on February 29th at 16:00 CET with a brand new episode on delegated staking, AI agents, and more.

Hosted by Jonathan DeYoung (who you may already know as co-host of Cointelegraph’s The Agenda) and recorded live, the second episode of the ON TRAC(k) podcast features a special guest — Martin Köppelmann, co-founder and CEO of Gnosis! Martin will join the three co-founders of OriginTrail, Žiga Drev, Branimir Rakić, and Tomaž Levak, to discuss a Verifiable Internet for AI and more.

Take this opportunity to tune in to a live conversation between industry pioneers and thought leaders here.

In case you missed it

Last time around, Jonathan DeYoung spoke with the OriginTrail co-founders about the significance of OriginTrail’s V8 Foundation, explored its robust partnerships, and shed light on the ecosystem’s key initiatives, including knowledge mining and staking.

If you missed out on watching this episode live, you can watch it back on the OriginTrail YouTube channel or listen wherever you consume your favourite shows.

And, if you’re curious about OriginTrail’s V8 Foundation, you can read more here.

Future Episodes

The On TRAC(k) podcast will continue to bring you the latest and most innovative ideas and advancements both in the OriginTrail ecosystem and beyond. We’re giving our listeners exclusive insights into the world of blockchain and Web3 as we develop the technology that empowers brands and builders alike with verifiable, decentralized knowledge through AI and DKG technology.

Here at the On TRAC(k) podcast, we’re lucky to have such a vibrant, curious community of listeners, and we want to give you a listening experience that matches the cutting-edge ideas and excitement in our community. That’s why we’re making you a part of the podcast. Ahead of each episode, you’ll have the chance to submit questions that delve deeper into the things you want to learn more about.

We’re excited to reveal our upcoming guests and topics to you further down the line. To keep up to date with all announcements and upcoming episodes, don’t forget to follow OriginTrail on X and, of course, subscribe to On TRAC(k) wherever you get your podcasts.

Climb aboard and welcome to the OriginTrail community. Together, let’s explore, learn, and shape the future.

About OriginTrail

OriginTrail is an ecosystem building a decentralized knowledge infrastructure for artificial intelligence (AI). With the mission of tackling misinformation, which is exacerbated by AI adoption, OriginTrail enables verifiable tracking of the origins of information, discoverability, and knowledge integrity to enable trusted AI. It has various applications in the domains of real-world assets (RWAs), search and recommendation engines, question-answering systems, and knowledge-dependent applications generally (such as AI systems).

OriginTrail’s initial adoption was in global supply chains, serving as a trusted hub for supply chain data sharing, allowing customers to authenticate and track products and keep these operations secure. In recent years, the rise of AI has not only created unprecedented opportunities for progress but also amplified the challenge of misinformation. OriginTrail also addresses this by functioning as an ecosystem focused on building a trusted knowledge infrastructure for AI in two ways — driving discoverability of the world’s most important knowledge and enabling the verifiable origin of the information. The adoption of OriginTrail in various enterprise solutions underscores the technology’s growing relevance and impact across diverse industries including real-world asset tokenization (RWAs), the construction industry, supply chains, healthcare, metaverse, and others.

OriginTrail is creating a Verifiable Web for decentralized AI by empowering world-class brands and builders. It utilizes its unique Decentralized Knowledge Graph and OriginTrail Parachain to deliver AI-powered search and solutions for enterprises and individuals worldwide.

OriginTrail has gained support and partnerships with world-class organizations such as British Standards Institution, SCAN, Polkadot, Parity, Walmart, the World Federation of Hemophilia, Oracle, and the EU Commission’s Next Generation Internet. These partnerships contribute to advancing OriginTrail’s trusted knowledge foundation and its applicability in trillion-dollar industries while providing a verifiable web of knowledge important in particular to drive the economies of RWAs.

Web | On TRAC(k) Podcasts | X | Facebook | Telegram | LinkedIn | GitHub | Discord

The ON TRAC(k) podcast returns! Episode 2 on Delegated Staking, AI Agents, & More was originally published in OriginTrail on Medium, where people are continuing the conversation by highlighting and responding to this story.


Digital ID for Canadians

First DIACC PCTF-Certified Service Provider Trustmark Granted

Confirming ATB Ventures’ Oliu service PCTF Privacy Component conformance

Feb. 27, 2024 – Vancouver – We are thrilled to announce that ATB Ventures’ Oliu has been certified against the Pan-Canadian Trust Framework (PCTF) Privacy Component. Established in 2012, DIACC is Canada’s largest and most diverse multistakeholder organization, fostering confidence and consistency in the digital trust and identity services market through its internationally recognized PCTF and standardized third-party conformity assessment program.

Being the first DIACC PCTF-certified service provider is a significant milestone and a unique leadership opportunity.  DIACC PCTF certification provides an assurance signal to the market, indicating that a service fulfills specified requirements. 

The PCTF comprises a set of rules that offers a versatile code of practice and risk assessment approach that organizations agree to follow, which includes best practices, policies, technical specifications, guidance, regulations, and standards, prioritizing interoperability, privacy, security, and trustworthy use of digital identity and personal data. 

ATB’s Oliu, an identity verification and authentication platform, has been subject to certification for the PCTF, including a point-in-time audit conducted by DIACC Accredited Auditor KUMA and an independent committee review for quality assurance. Oliu demonstrated conformity to the PCTF Privacy conformance criteria, meeting the applicable requirements. Based on the conformity assessment process results, DIACC has issued a three-year cycle Trustmark subject to annual surveillance audits and added ATB Oliu to the DIACC Trusted List – an authoritative trust registry of DIACC PCTF-certified service providers.

“This certification begins an exciting journey in providing certainty to the market through trusted services subject to DIACC’s certification program, designed around ISO/IEC 17065,” said DIACC President Joni Brennan. “For Oliu, achieving the certification demonstrates its commitment to providing trustworthy and reliable digital identity verification services and advancing secure and interoperable digital trust and identity services in Canada.”

About DIACC

Established in 2012, DIACC is a non-profit organization of public and private sector members committed to advancing full and beneficial participation in the global digital economy by promoting PCTF adoption and conformity assessment. DIACC prioritizes personal data control, privacy, security, accountability, and inclusive people-centered design.

To learn more about DIACC, please visit https://diacc.ca/ 

ABOUT OLIU™

Oliu is a blockchain-identity management solution that makes it easy for businesses to issue, manage, and verify digital credentials. Built on open (W3C) standards, Oliu leverages identity frameworks such as the Pan-Canadian Trust Framework (PCTF) and National Trust and Identity Fundamentals to make mobility and interoperability between identity systems possible.

To learn more about Oliu, please visit https://oliu.id/ 

About ATB Ventures™

ATB Ventures is the research and innovation arm of ATB Financial, a leading Alberta-based financial institution. Driving growth at the edges and exploring opportunities beyond financial services, ATB Ventures focuses on helping companies bridge the gap between consumers’ increasing concerns about privacy and security, and their desire for more advanced personalized experiences. 

To learn more about ATB Ventures, please visit https://atbventures.com/ 

Monday, 26. February 2024

FIDO Alliance

EMVCo and FIDO Alliance Provide Essential Guidance on Use of FIDO with EMV 3DS

As leaders in the authentication and payments spaces respectively, the FIDO Alliance and EMVCo have collaborated to provide guidance on how FIDO authentication can be incorporated into payment use cases, allowing merchants, acquirers/PSPs, and issuers to have a consistent way to submit and process FIDO authentication data.

EMVCo released a white paper with FIDO Alliance’s inputs, “EMV® 3-D Secure White Paper – Use of FIDO® Data in 3-D Secure Messages,” which explains how the use of FIDO authentication data in EMV 3DS messages can streamline e-commerce checkout while reducing friction for consumers. 

Authentication flows are evolving, and merchants are increasingly building seamless experiences based on FIDO standards for device-based authentication, where a trusted device is bound to a payment credential to ensure the credential is being used by the verified cardholder. Consequently, it has become apparent that in some scenarios the issuer may require more data to assess risk and validate the authentication cryptographically. 

This paper addresses these scenarios by providing a data structure that allows a chain of trust to be established between cardholder authentication, FIDO enrolments, and FIDO authentication, giving issuers increased control and insight into the authentication process as well as the ability to validate the authentication.
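
For readers unfamiliar with what FIDO authentication data typically contains, the TypeScript sketch below lists WebAuthn-style fields an issuer might receive and verify. These are placeholder names for illustration only, not the official fields defined in the EMVCo/FIDO white paper.

// Illustrative shape of FIDO authentication data carried alongside a 3DS message.
// Field names are WebAuthn-style placeholders, not the EMV 3DS message fields themselves.
interface FidoAuthenticationData {
  relyingPartyId: string;      // domain the FIDO credential is scoped to
  credentialId: string;        // base64url-encoded credential identifier
  authenticatorData: string;   // base64url-encoded authenticator data
  clientDataJSON: string;      // base64url-encoded client data, including the challenge
  signature: string;           // assertion signature the issuer can validate cryptographically
  enrolmentReference?: string; // link back to the original FIDO enrolment event
}

With data of this kind attached to the 3DS message, the issuer can check the signature against the enrolled credential rather than relying on a risk signal alone.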

In the EU, where payment authentication is required under PSD2 SCA, this industry-wide guidance can help enable more device-based authentication in a standardized way, using globally known authentication standards such as FIDO over widely accepted authentication rails such as those defined by EMVCo.

Read the full white paper on the EMVCo website to learn more.


Oasis Open Projects

The Importance of Open Standards for Data Interoperability

By Francis Beland, Executive Director, OASIS Open

The use of open standards in data interoperability is crucial for enhancing governance not only in the European Union but globally. Open standards determine the format, storage, and exchange of data and enable different organizations and systems to communicate seamlessly. This is especially vital for the EU, with its diverse member states and institutions, where open standards ensure free and secure data flow across borders, enabling better coordination and cooperation in implementing healthcare, trade, environmental protection, and security policies.

Furthermore, open standards uphold the principles of transparency and democracy, enabling citizens’ access to governmental data and enhancing public accountability, thereby promoting civic engagement. From an economic standpoint, open standards foster innovation, facilitate cross-border business operations and drive economic growth. Moreover, they help address global challenges such as climate change and pandemics, allowing effective data sharing and collaboration among nations.

OASIS Open interoperability standards are pivotal in ensuring data protection, privacy, and security while harmonizing technological infrastructures. Our standards are vital for the EU and other governments to fully leverage data interoperability’s benefits in an increasingly interconnected world.

The post The Importance of Open Standards for Data Interoperability appeared first on OASIS Open.


Identity At The Center - Podcast

We’ve got another great episode of the Identity at the Center podcast for you!

We’ve got another great episode of the Identity at the Center podcast for you! We caught up with Eve Maler of Venn Factory to answer a few listener voicemail questions and to see if her thoughts on the difference between digital identity and identity and access management have changed since we last asked her almost two years ago.

Episode #262 is available now at idacpodcast.com and in your favorite podcast app.

#iam #podcast #idac


The Engine Room

Welcoming Dalia Othman as Co-Executive Director

Dalia Othman has been selected by our Board as The Engine Room’s other Co-Executive Director, to lead the organisation alongside Paola Mosso from mid-March.

The post Welcoming Dalia Othman as Co-Executive Director appeared first on The Engine Room.

Wednesday, 04. October 2023

decentralized-id.com

Ecosystem Overview

This page includes a breakdown of the Web Standards, Protocols, Open Source Projects, Organizations, Companies, Regions, Government and Policy surrounding Verifiable Credentials and Self Sovereign Identity.

Note to reader: This is a Work in Progress and should not be taken as authoritative or comprehensive. Internal links are in italic.

Open Standards: Decentralized Identifiers Explainer Literature DID Methods Supporting Tech DIDAuth Critique Verifiable Credentials Explainer Comparisons Varieties Data Integrity JSON-LD LD-Proof (w3c) JSON-LD ZKP BBS+ (w3c) JOSE / COSE JSON SD-JWT (ietf) JWP (ietf) ZKP-CL (Hyperledger) Related JSON-LD (W3C) JSON (IETF) BBS (SIAM 1986)
Exchange Protocols: DIDComm (DIF) CHAPI (DIF) OIDC4VC (OpenID) mDL (ISO/IEC) WACI-Pex (DIF) VC-HTTP-API (CCG)
Authorization Protocols: zCap (w3c) UCAN (Fission, Bluesky, Protocol Labs) GNAP (IETF) OAuth (IETF)
ISO Standards: mDL (ISO/IEC 18013-5) JTC 1/SC 17/WG 3 - Travel Documents (ISO/IEC) ISO 27001
Data Stores: Encrypted Data Vaults - EDV (DIF) Decentralized Web Node - DWN (DIF)
Trust Frameworks: 800-63-3 (NIST) PCTF (DIACC)
Non SSI Identity Standards: OpenID (OpenID) FIDO (FIDO) OAuth (IETF) SCIM (IETF) SAML (OASIS) KMIP (OASIS) WebAuthN (W3C) Secure QR Code (OASIS)
Blockchain Standards: ISOTC 307 (ISO) CEN/CLC/JTC 19 (CEN/EENTLIC) ERC-EIP (Ethereum)
Code-Bases: Open Source Projects Universal Resolver (DIF) KERI (DIF) Other Tools & Libraries (DIF) ESSIF-Lab (ESSIF-Lab) Aries (Hyperledger) Indy (Hyperledger) Ursa (Hyperledger) Other Tools & Libraries (Hyperledger) Blockcerts (Hyland) Company Code Walt.id Verite SpruceID
Organizations: International Standard Development Organizations [SDO] W3C IETF OASIS ITU-T ISO/IEC National Government/Standard Setting Bodies NIST The Standards Council of Canada BSI - The Federal Office for Information Security, Germany Community Organizations W3C - CCG DIF ToIP ADIA Kantara MyData DIACC ID2020 OpenID Foundation Internet Safety Labs GLEIF Hyperledger Foundation FIDO Alliance OASIS
SSI Networks: DizmeID Sovrin BedRock ONT Velocity GlobalID Dock ITN, Mobi
Companies: Microsoft - Azure / Entra
EU SSI Startups: MyDex MeeCo ValidatedID Bloqzone Procivis Gataca
US SSI Startups: Dock Anonyome GlobalID Hyland Magic IDRamp Indicio Verified Inc (formerly UNUMID) Animo Mattr Liquid Avatar Hedera IOTA Trinsic Transmute Spruce Disco.xyz
Asia SSI Startups: Affinidi ZADA Dhiway Ayanworks NewLogic
Africa SSI Startups: FlexID Diwala
Acquisitions: Avast-Evernym-SecureKey
Analyst Firms: KuppingerCole Forrester Gartner
Consulting Firms: Deloitte Accenture McKinsey BCG
IAM Industry: Ping (TomaBravo rollup) Okta Auth0 ForgeRock (TomaBravo rollup) IDENTOS SailPoint (TomaBravo rollup)
Policies/Regulations (by region): FATF Europe Data Governance Act GDPR eIDAS1 eIDAS2 UK Data Protection Asia USA COPPA Privacy Act California SB786 India Canada Pan Canadian Trust Framework (PCTF)
Government Initiatives: US SVIP National Cybersecurity Strategy Germany IDUnion UK Scotland UK Digital Strategy EU eIDAS2 Large Scale Pilots Architecture and Reference Framework EBSI ESSIF-Lab Catalonia Switzerland APAC New Zealand Australia Singapore South Korea Canada BCGov Alberta Ontario LatAm LACCHAIN
Real-World Implementation: Government Issued ID Passport eMRTD/DTC (ICAO) Immigration (USCIS) mDL (US AAMVA) [not SSI standards conformant] IDCard (IATA / Switzerland)
Trust Registries & Directories: TRAIN (ToIP) Regi-Trust (UNDP) OrgBook BC (BCGov)
SupplyChain/Trade: GS1 GLEIF
Banking: Bonifi
COVID: NYState VCI CCI DTCC DIVOC
Enterprise Healthcare
Learning/Career/Education: Jobs for the Future Velocity Network Learning Economy Foundation TLN - Trusted Learner Network
KYC Real Estate Rental Travel Humanitarian Energy IoT Guardianship
Wallets: Types (by type+topic)
Research Papers/Academic Literature: Turing Institute Research: Privacy & Trust
Events: IIW RWoT
Topics: Biometrics Privacy Human Rights User Experience Business Critiques Future
Web3, DWeb, & Other Tech (by focus): Web3 Web3 and SSI DAO Decentralization Metaverse NFT SBT DeFi Organizations Ethereum Enterprise Alliance* Fission Protocol Labs DWeb Secure Scuttlebutt Bluesky Web5 Handshake
Blockchain Ecosystems: Bitcoin Ethereum

Friday, 23. February 2024

FIDO Alliance

Cybersecurity Policy Forum: Identity, Authentication and the Road Ahead

2023 demonstrated that we still have a lot of work to do when it comes to protecting Americans from identity theft and identity-related cybercrime. The GAO and FinCEN together documented more than $300 billion in identity-related cybercrime, DHS’ Cyber Safety Review Board (CSRB) outlined how weaknesses in legacy authentication tools enabled adversaries to launch a wave of high-profile attacks, and millions of Americans struggled to recover from identity theft. Meanwhile, the introduction of new tools powered by biometrics and AI to help block attacks also raised concerns about equity and bias, and in the physical world, many Americans still struggle to get foundational credentials that they need to prove who they are. As 2024 kicks off, these issues will all continue to be front and center.  

On Thursday, January 25th in Washington DC, the Better Identity Coalition, FIDO Alliance, and the Identity Theft Resource Center (ITRC) joined forces to present a full-day policy forum looking at “Identity, Authentication and the Road Ahead.”


Security Journal: Fingerprints agrees distribution partnership with Ansal Component

Fingerprints’ biometric access solution is designed for physical and logical access devices and applications such as smart locks, FIDO tokens, crypto wallets and more.


FinExtra: Mitigating fraud risk: effective strategies for small financial institutions

Passwords are one of the most common targets for fraudsters. Strengthening password security demands robust authentication methods, risk-based measures and behavioural analysis to detect anomalies. Active exploration of innovations like Passwordless Login, based on the robust Fast Identity Online 2 (FIDO2) standards developed by the FIDO Alliance, is essential to bolster online security and authentication. 
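
As a rough idea of what FIDO2-based passwordless login looks like in the browser, here is a minimal TypeScript sketch using the standard WebAuthn API (navigator.credentials.get). The challenge handling and the server-side verification of the returned assertion are assumed to live elsewhere.

// Minimal browser-side sketch of a FIDO2 (WebAuthn) passwordless login.
// The challenge must be a fresh, server-issued value; the options here are illustrative.
async function passwordlessLogin(challenge: Uint8Array): Promise<Credential | null> {
  return navigator.credentials.get({
    publicKey: {
      challenge,                    // server-issued, single-use challenge
      timeout: 60_000,              // give the user a minute to respond
      userVerification: "required", // require PIN or biometric on the authenticator
      // allowCredentials can be omitted when using discoverable credentials (passkeys)
    },
  });
}

// Usage: send the returned assertion to the server for signature verification.
// passwordlessLogin(crypto.getRandomValues(new Uint8Array(32))).then(assertion => { /* verify server-side */ });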


Engadget: PlayStation now supports passkey sign-ins

Sony Interactive Entertainment (SIE) introduces passkey support for PlayStation accounts, allowing users to log in via their mobile device or computer’s screen unlocking method like PIN, fingerprint, or facial recognition. Passkeys enhance security by preventing reuse or sharing, reducing vulnerability to phishing and data breaches.


The Verge: Now you can sign into your PlayStation account without a password

Sony PlayStation has introduced passkey support for account logins, enabling users to authenticate without passwords. Similar to Nintendo’s implementation, users can now use authentication methods like iOS Face ID or Android fingerprint sensors for account access.


Ceramic Network

Points: How Reputation & Tokens Collide

Points are here and they signal how networks and apps will evolve next with verifiable data.

Points have taken Web3 by storm in the last six months, catalyzed by projects like Blur and EigenLayer rewarding users with points on the way to seizing the NFT market and amassing $7 billion TVL, respectively. More than 115 billion points have been given out by Web3 projects so far, according to Tim Copeland at The Block.

There are two ways to look at points:

1. As a precursor to an airdrop. Projects use points ahead of a token to generate interest, signal what they care about and will reward, more effectively target engagement, and navigate legal risks associated with tokens.

2. As a measure of quantifiable reputation. Points ascribe a value to user activity, just like many reputation systems have before: traditional loyalty programs, Reddit karma, check-ins, credentials. They can signal legitimacy in pseudo-anonymous systems and — because they’re more quantitative than, for example, verifiable credentials — standing within the community.

Both of these are right. Points align the incentives of the platform and the user base, like all reputation systems. And they forecast who is creating value and is likely to be rewarded. By understanding how these two intersect, we can forecast where Web3 will go far beyond today’s points craze.

Points are quantifiable like money, enduring like reputation

Tokens were the first major innovation of Web3 and the primary incentive. They offer fully quantifiable value, are transactional, and require no additional context. They work as “one time games.” Reputation is how social systems achieve repeat game use cases, rewarding ‘good behavior’ of an actor (e.g. following contracts and policies, not cheating counterparties) with access and benefits over the long term. Reputations are non-fungible — for them to establish trust, it has to be hard to buy reputation.

Points are proof of activity that act as a building block for reputation (and in Web3, often carry the suggestion of future value). All reputations come with some benefit. Usually, they’re more subtle than financial rewards. Reputations might gain access to a service (credit score) or club (referral), earn discounts (loyalty programs) or introductions (dating), convince counterparties to transact (Uber rating, credit lending), and build trust with customers (influencers, corporate brands). Reputations are less measurable than financial assets, but often more valuable.

For Web3 to grow into social and other non-financial use cases, more robust reputations are needed. Points are not an isolated mechanism to forecast token earnings — they are one point on a broad spectrum of token (financial) and reputational rewards that Web3 will keep innovating on.

Points as part of the evolution of Web3 reputation

Reputation naturally starts in the most discrete, high-value places and evolves to more broad ones.

1. B2C “Badges”

The earliest forms of reputation helped networks solve discrete pressing problems: anti-sybil and KYC. This involved a business or service issuing badges to users for achieving important milestones. For example, Gitcoin Passport stamps prevent sybil attacks in Grants rounds for Ethereum and other ecosystems. The meaning of the badge is objective and clearly denominated.

2. Attestations

After discrete badges, platforms needed reputation for a wider variety of activities and credentials: history of contribution, trust as a delegate, skills in a DAO, proof of activity. Attestations are still clearly denominated, but rather than signifying a clear milestone like badges they’re more continuous.

This also started B2C (for example, participating in an event, completing a certain step in a user flow, etc.). EAS has emerged as a standard for issuing these, used natively in the Optimism stack and widely in Ethereum. Increasingly, attestations are community-driven as well. Open platforms let users create and verify claims. For example, Metamask is working on a review and reputation system for Snaps.

3. Points: scored activity

On-chain transactions, B2C badges, and attestations are all cryptographically verifiable actions. Users do something that has provable provenance and time, whether that’s on-chain or off-chain signed data.

Point systems specify which of these activities have value in their system (and how much), tabulate them over time for each identity, and ascribe a numeric ‘point’ value to them. Points create a quantifiable reputation that continuously aggregates the previous forms.
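
A minimal sketch of that tabulation step might look like the following. The event types, weights, and DID format are invented for illustration, and a real system would also verify each event's signature and provenance before counting it.

// Sketch: scoring verifiable events into per-identity point totals.
interface VerifiableEvent {
  did: string;        // identity that performed the action
  type: string;       // e.g. "forum_post", "code_contribution", "event_attendance"
  timestamp: string;  // ISO 8601 time of the event
}

// Hypothetical weights; each point system defines which activities count and how much.
const POINT_WEIGHTS: Record<string, number> = {
  forum_post: 1,
  event_attendance: 5,
  code_contribution: 10,
};

function tabulatePoints(events: VerifiableEvent[]): Map<string, number> {
  const totals = new Map<string, number>();
  for (const e of events) {
    const weight = POINT_WEIGHTS[e.type] ?? 0; // unknown event types score nothing
    totals.set(e.did, (totals.get(e.did) ?? 0) + weight);
  }
  return totals;
}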

4. Events: all activity

There’s no reason to believe the evolution will stop with points. At scale, all of a user’s activities in Web3 apps and platforms will be recorded as cryptographically verifiable events. This might be likes on a cast, messages on a forum, contributions to a codebase, visits to a Blackbird restaurant, etc.

These are all events, and they all have value — but it’s not always known up-front what that value is. Some might have value in driving community engagement, others in improved analytics for products or targeting for ads, some as inputs into reputation systems. Because all will be cryptographically recorded, they can be referenced any time in the future.

Points will dominate for now, but before long we’ll see a huge increase in retroactive airdrops, activations, rewards, access, and other forms of value awarded to users based on a much broader history of their events — not just those that are made explicit up-front via point systems.

Infrastructure for points, events, and trust

All of these forms of reputation serve to reward users. Web2 used points exhaustively, but Web3 can uniquely do it openly and with composability. By making every event driving points both transparent and verifiable, events and points can be leveraged across platforms and have trust built in. This trust can reinforce future rewards, encourage more activity, and enable cross-platform point innovation.

Unfortunately, while points are proliferating, to date most haven’t tapped into this unique Web3 possibility — most have been tabulated on centralized databases, putting rewards at risk.

Data ledgers vs. asset ledgers

Financial blockchains were built to be asset ledgers, not point or event ledgers. They’re designed for scarcity; e.g., they must protect against double-spend as a core principle. Points are not bought, sold or traded like assets — they’re earned. They’re best served — fast, cheaply, scalably — on a data ledger.

Data ledgers, for data that is not scarce, operate with different principles. They must still offer strong verifiability and composability; but they don’t have to protect against double spend, and they must scale many orders of magnitude beyond asset ledgers. There are exponentially more data transactions than asset transactions in any web service.

Ceramic is the first decentralized data ledger with the scale, composability, and trust guarantees required to be a system of record for points and events. It’s built to enable the scaling of Web3 beyond financial transactions to richer experiences, including those powered by attestations, points, and the billions of events that are coming to enable a data-rich Web3.

Building with Points

If you’re thinking about a point system for your product, or how to advance point-enabled experiences, please reach out to partners@3box.io.

If you are interested in emerging standards for points, reach out to us on the Ceramic discord to learn more about our working group.

If you’ll be at EthDenver next week, come talk points, reputation and verifiable data with us at Proof of Data.

Thursday, 22. February 2024

The Engine Room

Community call: Dreams of a collective infrastructure for information ecosystems in Latin America 

Join our next community call to talk about the kinds of infrastructures we need to collectively create a better flow of creation, distribution and reception of information in Latin America. 

The post Community call: Dreams of a collective infrastructure for information ecosystems in Latin America  appeared first on The Engine Room.


Ceramic Network

ETHDenver 2024: Where to find Ceramic in Denver

The core Ceramic team is coming to ETHDenver 2024! Check out all of the events where you can meet the team.

ETHDenver 2024 is kicking off, and we are very excited to meet you all there. This year, you will find the Ceramic team at a list of side events, talks, workshops, and, most importantly, a Proof of Data event co-organized by Ceramic and Tableland that you don’t want to miss.

Collect attendance points at ETHDenver 2024!

Are you ready for an exciting scavenger hunt at ETHDenver 2024?

Ceramic is partnering with Fluence, a decentralized computing marketplace, to create a fun and interactive game for collecting attendance badges at the majority of the events listed below. Those badges will allow you to collect points throughout ETHDenver 2024.

You can find us at each event and tap a disc to participate! With each attendance, you will claim points represented as documents on Ceramic. Fluence will be consuming the new Ceramic Data Feed API to enable compute over incoming points.

Rumor has it that the first participants to collect all the necessary points will be rewarded with some really cool prizes! So make sure to participate; we can’t wait to see you at all of the ETHDenver events listed below!

Sunday, February 25th - Saturday, March 2nd
Silk ETHDenver hackerhouse

Ceramic is partnering with Silk and other ecosystem partners to invite hackers to work together on building better scientific tooling, web account UX, governance forums, and much more.

🚀 Calling all hackers! Unveiling the Silk ETH Denver Hacker House – where innovation meets decentralized tech! 🏡 Join our quest to revolutionize scientific tooling, web account UX, governance forums, and much more!

Are you ready? Save the dates: Feb 25th - March 2nd 🤍🧵

— Silk (@silkysignon) January 26, 2024
Tuesday, February 27th
DePIN Day

Join us and our friends at Fluence for a day filled with talks, workshops, and discussions on all things #DePIN.

Location:
Green Spaces
2950 Walnut St.
Denver, CO

Time:
13:00 - 17:00 MST

Wednesday, February 28th
Open Data Day

Our co-founder, Danny Zuckerman, will deliver a keynote at Open Data Day hosted by Chainbase. Come hear more about Ceramic and what we have coming up on the roadmap.

Location:
1261 Delaware St,
Denver, CO

Time:
13:00 - 17:00 MST

SciOS

Don’t miss out on a workshop led by Radek, our Developer Advocate, at SciOS. The workshop will be focused on discussing the barriers and workarounds for enabling developers to build interoperable DeSci solutions.

Location:
2601 Walnut St 80205,
Denver, CO

Time:
13:00 - 16:00 MST

Thursday, February 29th
libp2p day

Come listen to our core engineers discussing the implementation of Recon, a new Ceramic networking protocol that improves network scalability and data syncing efficiency.

Location:
The Slate Denver,
Denver, CO

Time:
13:30 - 18:00 MST

Friday, March 1st
Proof of Data

Join Ceramic x Tableland on March 1 in Denver, or via livestream, for the Proof of Data Summit, a full-day community gathering on reputation, identity, DePIN, decentralized AI, and decentralized data computing. Featuring lightning talks, technical discussions, and panels with industry visionaries, this will be a can’t-miss event! RSVP now to secure your spot in person or via livestream.

Location:
Denver Art Museum
Denver, CO

Time:
9:00 - 16:00 MST

ETHDenver - Triton Stage

Ceramic engineer Golda Velez will lead a talk about decentralized trust and AI on the Triton Stage at the Spork Castle!

Location:
Spork Castle
Denver, CO

Time:
14:45 MST

Keep an eye on the updates and our Twitter as we get closer to the events. We can't wait to see you there!


Elastos Foundation

Elastos Announces Decentralised Digital ID (DID) Partnership with ARGOS Identity

Elastos today is announcing a partnership with global digital identity verification specialists, ARGOS Identity, to deliver decentralised Digital ID (DID) verification to individuals and businesses. ARGOS Identity provides digital identification verification services for needs ranging from Know Your Customer (KYC) and Know Your Business (KYB) to Anti-Money Laundering (AML) screening.

Jonathan Hargreaves, Elastos’ Global Head of Business Development & ESG, says that this partnership goes to the very heart of Elastos’ mission and purpose.

“Through ARGOS Identity we are addressing head on the Web’s principal challenge; how to verify who or what you are dealing with, while – simultaneously – retaining full control and discretion over your own identity. Uniquely, Web3 offers a resolution to this apparent contradiction; irrefutable identity proof which actually requires neither party to relinquish control of the same,” he says.

“This is the promise of the SmartWeb, one to which both parties are wholly committed. Decentralised IDs are the future of digital identification and central to Elastos’ vision for a SmartWeb; one where users engage each other without intermediaries, on their terms,” adds Jonathan.

ARGOS Identity’s CEO Wonkyu Lee said:

“ID verification is a necessary and beneficial aspect of online relationships; Web3 now offers us the opportunity to deliver the same directly between the two parties, without the intervention of any intermediary. In essence, both parties can verify each other, while retaining full control of their own credentials. Our Elastos partnership epitomises what both organisations are all about.”

About ARGOS

Launched in 2018, ARGOS Identity is an ID verification solution which helps to build a truly liberated Web3 environment based on a contactless ID verification service that streamlines and secures the onboarding process for global users. The company was named ARGOS Identity after Argos, the hundred-eyed giant of Greek mythology, to safeguard against identity theft, fraud, and money laundering. Today, ARGOS serves customers from 14 countries including the US. https://argosidentity.com/

Join Us on the Journey

As we continue to build the SmartWeb, we invite you to learn more about Elastos and join us in shaping a future where digital sovereignty is a reality. Discover how we’re making this vision come to life at Elastos.info and connect with us on X and LinkedIn.

The post Elastos Announces Decentralised Digital ID (DID) Partnership with ARGOS Identity appeared first on Elastos.


DIF Blog

Guest blog: Tim Boeckmann, Mailchain

Mailchain, founded in 2021, aims to revolutionize decentralized identity and communication with its services, including Vidos and the Mailchain Communication Protocol. These offerings streamline management of, and interaction with, decentralized identifiers (DIDs), ensuring secure, efficient, and compliant operations across various industries, simplifying the integration and adoption of decentralized identity technologies.

What is Mailchain, and how did it come into existence? 

I always had a passion for startups. In 2016 I joined the team at AWS (Amazon Web Services) that helps startups with technology and go-to-market strategy, for example by introducing them to other AWS customers. 

The blockchain landscape was evolving at the time and my soon-to-be co-founders (who I met at AWS) and I started tracking the space closely. We noticed that it wasn’t possible to communicate privately between blockchain addresses without providing an additional piece of information, like an email address. So we sent some encrypted messages with the transaction data as an experiment. 

This grew into a side project. It was open source and had quite a few contributors, but we realized we needed something more scalable that wasn't dependent on blockchain protocols, with the associated gas fees and speed constraints. 

So, in 2021 we set out to build Mailchain, a protocol that enables people to communicate privately using any blockchain address. 

With our SDK, developers can easily add web3 email to their own projects and applications, allowing them to engage with their users in a truly web3 way.

It’s an interesting strategy to focus on upgrading email with web3 capability. Why did you choose this route? 

There are over 3.9 billion active email users today. Each user’s inbox paints a rich picture of who they are. It stores their online actions, communication habits, spending behavior, even their thoughts, ideas, and feelings.  And everybody wants to keep that information private.

Web3 on the other hand is underpinned by the principles of decentralization, privacy and digital property rights, using wallets and blockchain addresses as identities. But there’s no native way to communicate privately using these addresses. The workaround is to use another communication channel, whether that’s email, instant messaging or social media.

With Mailchain, users enjoy the privacy and other benefits of a digital identity wallet without needing to leave their email inbox. For instance, people can authenticate with a Web3 application by clicking a link in their inbox. Upon clicking the link, the system creates a self-signed Verifiable Credential (VC). The app knows who should be signing it, and is able to verify the user. 
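As a rough sketch of that link-click flow from the application’s side, the example below models a self-signed credential as a nonce signed by the user’s key, which the app then verifies against the key it already expects. The types and helpers here (SelfSignedVc, buildVc, verifyVc) are hypothetical illustrations, not the Mailchain SDK or the full W3C VC data model.

```typescript
// Illustrative only: a minimal model of "click a link, present a self-signed VC" auth.
// SelfSignedVc, buildVc and verifyVc are hypothetical names, not Mailchain/Vidos APIs.
import { generateKeyPairSync, sign, verify, KeyObject } from "node:crypto";

interface SelfSignedVc {
  issuer: string;                                   // the user's identifier (e.g. an address or DID)
  credentialSubject: { nonce: string; audience: string };
  signature: string;                                // Ed25519 signature over the payload, base64
}

function buildVc(issuer: string, nonce: string, audience: string, privateKey: KeyObject): SelfSignedVc {
  const payload = JSON.stringify({ issuer, credentialSubject: { nonce, audience } });
  const signature = sign(null, Buffer.from(payload), privateKey).toString("base64");
  return { issuer, credentialSubject: { nonce, audience }, signature };
}

// The app already knows which key *should* have signed (it emailed the link to that user),
// so verification boils down to: does the signature check out for the expected key and nonce?
function verifyVc(vc: SelfSignedVc, expectedNonce: string, publicKey: KeyObject): boolean {
  const payload = JSON.stringify({ issuer: vc.issuer, credentialSubject: vc.credentialSubject });
  const ok = verify(null, Buffer.from(payload), publicKey, Buffer.from(vc.signature, "base64"));
  return ok && vc.credentialSubject.nonce === expectedNonce;
}

// Demo: the "user" signs the nonce embedded in the emailed link; the app verifies it.
const { publicKey, privateKey } = generateKeyPairSync("ed25519");
const vc = buildVc("0xUserAddress", "nonce-from-link", "https://app.example", privateKey);
console.log("verified:", verifyVc(vc, "nonce-from-link", publicKey)); // verified: true
```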

This use case came from a customer who needed to prevent Zoom-bombing (unauthorized intrusion into a private video conference call). Another use-case is universities selling remote courses. They don’t want people who are not enrolled joining the sessions, or others joining on behalf of those who are enrolled — particularly when it comes to exams. 

How did decentralized identity become part of the Mailchain story? 

We wanted to enable the community, so we open-sourced as much of the codebase as we could. 

We started to see people using Mailchain for authentication, and realized identity was vital to what they were trying to achieve. These developers needed tools to manage user identities. It was early in the adoption cycle and there were a lot of gaps. 

We also started hearing people talking about DIDs (Decentralized Identifiers) and VCs (Verifiable Credentials).  We saw a pattern between VCs and our previous work with NFTs. So, we went deep into the W3C standards and looked at how they were being used in the real world. 

At the time, we didn’t know if we wanted to put people’s Mailchain IDs on-chain. We were looking for a standard way to construct the IDs and convey related attributes, such as the authorized senders for a blockchain address.  

Over time, we saw an opportunity to converge on a standardized approach. We also wanted to extend what we built to help other developers in the ecosystem, so we created Vidos, a suite of managed services to help people building with DIDs and VC related applications. 

Tell us more about the tools you’re building, and how they promote adoption, and interoperability, of decentralized identities

Our first service is the Vidos Universal Resolver. DID resolution forms a core part of any interaction with VCs and needs to be reliable and performant. It’s also something that developers and ops teams shouldn’t need to spend time deploying and managing. The service takes away this burden so deploying a resolver is simple and just requires adding the API details to an application.

The service comes with uptime guarantees and offers granular policy and permissions features, helping organizations meet their own availability and compliance requirements. 

This helps organizations who are not just issuers (of credentials such as course certificates and educational qualifications). They may also need to verify other credentials (such as proof of identity, age, etc.), which potentially involves resolving DIDs on multiple networks and services. 
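As an illustration of what that integration might look like, the sketch below resolves a DID over HTTP against a hosted resolver endpoint. The base URL, the universal-resolver-style path, and the bearer-token header are assumptions made for the example, not documented Vidos API details.

```typescript
// Illustrative sketch: resolving a DID through a hosted resolver endpoint.
// The base URL, path and auth scheme are assumptions, not the actual Vidos API.
const RESOLVER_BASE = "https://resolver.example.com/1.0/identifiers/";

interface DidResolutionResult {
  didDocument: Record<string, unknown> | null;
  didResolutionMetadata: { error?: string; contentType?: string };
  didDocumentMetadata: Record<string, unknown>;
}

async function resolveDid(did: string, apiKey: string): Promise<DidResolutionResult> {
  const res = await fetch(RESOLVER_BASE + encodeURIComponent(did), {
    headers: { Accept: "application/json", Authorization: `Bearer ${apiKey}` },
  });
  if (!res.ok) throw new Error(`Resolution failed: HTTP ${res.status}`);
  return (await res.json()) as DidResolutionResult;
}

// Usage: resolve an issuer DID before verifying one of its credentials.
resolveDid("did:web:issuer.example", process.env.RESOLVER_API_KEY ?? "")
  .then((result) => console.log(result.didDocument?.["verificationMethod"]))
  .catch(console.error);
```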

We also have other services coming later in the year that will facilitate credential verification with similar compliance and logging features. 

You mentioned go-to-market strategy as an area of personal interest. Can you tell us a bit about your own strategy? 

The DID resolution and Mailchain audiences are different. For Vidos, we’re working with enterprises and closing some gaps where technology is not available today. Mailchain is largely feature complete. 

Vidos is a good fit with Mailchain because there’s strong interest in enabling Web3 communication, whether that’s machine-to-machine messages triggered by blockchain transactions or certain types of business communication. 

We need to ground this in the real world, so developing SaaS (Software as a Service) products to move the entire ecosystem forward is what we think is most important right now. 

I’d like to think that building on W3C standards ensures we don’t get ruled out of any geographic  markets. The DID resolver is intended to be multi-region. Customers can already deploy into the UK and EU. We will stand up services elsewhere, as needed. 

What market signals do you see? 

The market never moves fast enough for an entrepreneur! But we’re seeing strong signs. It’s becoming a priority for enterprises to see how they can move beyond identity federation.  Regulatory change and fraud are also encouraging supply chain actors and financial institutions to look at how they can use decentralized identity. 

We’re seeing this pop up in different places, for example it’s good to see LinkedIn verifying humans on the platform. There are certainly tail winds. 

What is the value of DIF membership to Mailchain? 

We’re hoping to collaborate with industry participants, to make sure what we build is right for the use cases we’re targeting, starting with the Vidos Universal Resolver for DIDs, as well as to learn from others building in the space. 

We also want to contribute back to what’s a very useful and sensible set of standards, whether that’s ideas in the working groups and/or contributing packages or libraries.

It’s a great time to be involved in DIF. The standards are reaching a stage where they are mature enough. The opportunity is now!


MyData

MyData4Children ZINE 2024: A challenge for MyData Community – Design a Robot School

Introducing MyData4Children Zine 2024: Numerous studies and real-life events have shown us that emergent technologies affect children, for good and bad. However, the dominant narrative is framed with an individualistic focus, putting a single child or a person in a child’s circle of trust on the spot, leaving many of us feeling defeated, nervous, and […]

Wednesday, 21. February 2024

OpenID

OpenID Summit Tokyo 2024 and Celebrating 10 Years of OpenID Connect


OpenID Foundation Japan (OIDF-J) hosted the OpenID Summit Tokyo 2024 in Shibuya Tokyo on Friday, January 19, 2024 with over 250 in attendance. The OpenID Foundation (OIDF) was thrilled to be a part of the Summit that included contributors from Japan and abroad presenting on current digital identity, security, and digital wallet topics.

Gail Hodges, OIDF Executive Director, kicked the Summit off by presenting OIDF’s strategic outlook for 2024 as well as a detailed briefing on the Sustainable Interoperable Digital Identity (SIDI) Summit held in Paris in November 2023.

A highlight of the Summit was a panel discussion celebrating ten years of OpenID Connect. This panel was coordinated and moderated by longtime OIDF board member and OpenID Connect editor, Mike Jones. Panelists included OIDF Chairman, Nat Sakimura, longtime Connect contributor and evangelist, Nov Matake, and Ryo Ito, OIDF-J Evangelist. As Mike Jones noted in his blog, the panelists shared their experiences on what led to OpenID Connect, why it’s been successful, and lessons learned along the way. This was the first of three planned OpenID Connect celebrations in 2024 with the other two taking place at Identiverse in May and the European Identity and Cloud Conference in June.

Nat Sakimura concluded the OpenID Summit Tokyo 2024 by delivering the closing keynote.

The post OpenID Summit Tokyo 2024 and Celebrating 10 Years of OpenID Connect first appeared on OpenID Foundation.


Identity At The Center - Podcast

In our latest episode of the Identity at the Center podcast,


In our latest episode of the Identity at the Center podcast, we had the pleasure of welcoming Sara King and Raul Cepeda from rf IDEAS for a Sponsor Spotlight discussion.

This episode, generously sponsored by rf IDEAS, dives deep into the realms of physical security and identity, highlighting the innovative solutions rf IDEAS brings to the table. We explored their unique market positioning, their impactful presence in sectors like healthcare and manufacturing, and how they're leading the charge towards passwordless environments. Our conversation also touched on current industry trends, including the move to secure mobile credentials and the future of biometrics, capped off with insights into rf IDEAS' Reader Remote Management capabilities.

Tune in to this engaging episode to discover how rf IDEAS is bridging the gap between physical and logical security for a seamless authentication experience. It's an insightful discussion on the latest advancements in the field that you won't want to miss.

#iam #podcast #idac


DIDAS

DIDAS Statement for E-ID Technology Discussion Paper

In this latest contribution to the ongoing dialogue surrounding Switzerland’s E-ID initiative, DIDAS has released a comprehensive document that critically evaluates the current technological proposals for the Swiss trust infrastructure. This document underscores DIDAS’s commitment to a principle-based, collaborative methodology in developing a secure, adaptive E-ID ecosystem, echoing the necessity for an approach that is both inclusive and forward-thinking.

It focuses on the existing scenarios’ technological shortcomings and proposes an ‘A+’ scenario that better aligns with EU standards, addresses aspects of privacy (specifically unlinkability and correlation) and fosters iterative development. This approach not only champions secure cryptographic practices but also advocates for the coexistence of various credential types, ensuring a flexible, future-proof infrastructure.

A further aspect is the imperative of cryptographically safe owner binding, a cornerstone for qualified digital identities. The document elucidates the necessity for cryptographic primitives embedded directly within the secure elements of devices, particularly for high levels of assurance. This technical requirement is not merely a suggestion but a mandatory prerequisite to prevent potential misuse or impersonation attempts. Confining private keys to a device’s silicon is highlighted as a critical measure to prevent their unauthorized replication, ensuring that the sanctity of digital identities remains inviolable.

Furthermore, the document highlights the urgency of action, urging stakeholders to lead the way in establishing a continuously evolving, privacy-centric E-ID framework. It is also aimed at striking a balance between Swiss-specific requirements and EU interoperability, setting a precedent for digital identity management.

DIDAS’s insights into governance structures and the collaborative design of the trust infrastructure serve as a high-level guide for policymakers, technologists, and industry stakeholders, emphasizing the collective responsibility in shaping a digital identity ecosystem that is secure, user-centric, adaptable by private sector businesses and aligned with broader societal values and international standards.

Download here: 2024-02 DIDAS E-ID Technology Discussion Paper Response Final

Next Level Supply Chain Podcast with GS1

Tackling Inventory Headaches in the E-commerce Universe


Tracking and managing inventory from end-to-end is challenging for business merchants dealing with perishable food items.

Lichen Zhang, co-founder of Freshly Commerce, is changing how merchants handle the complexities of tracking bundles and managing perishable inventory, and underscores the significance of complying with regulations such as FSMA 204. The company’s initial success with its founding, growth, and innovative solutions for inventory and order fulfillment in the e-commerce industry spurred Freshly’s evolution into a suite of tools helping e-commerce merchants manage their inventory and order fulfillment.

The episode provides valuable insights into the evolution of e-commerce and the vital role of innovative solutions like Freshly Commerce in meeting the changing needs of the industry. Explore how Freshly Commerce addresses challenges faced by merchants, the importance of data sharing and collaboration, and the positive impacts of technological advancements on adapting to new conditions. Lichen also emphasizes the role of education and customer-centric strategies in Freshly Commerce’s ongoing development, and explains how the company started by identifying a need in the market, which led the team to participate in a Shopify app challenge where they secured third place.

 

Key takeaways: 

Accurate and timely inventory management in e-commerce is complex but necessary.

Addressing food safety and compliance with regulations helps prevent food waste and maximize profits.

Embracing technological advancements such as AI-driven tools can positively change some aspects of business operations.

 

Connect with GS1 US:

Our website - www.gs1us.org

GS1 US on LinkedIn

 

Connect with guest:

Lichen Zhang on LinkedIn

Check out Freshly Commerce

 


Digital Identity NZ

Biometrics… ‘Perfect is the enemy of good?’ | February 2024 Newsletter


Kia ora e te whānau

Biometrics hit the news again earlier this month. TV1’s 7 Sharp and 1News, together with RNZ news online and several print media, carried the story of Foodstuffs North Island’s trial of Facial Recognition in 25 of its stores to see if it reduces retail crime. In addition, Māori, Pasifika and people of colour have concerns about bias. Naturally the Office of the Privacy Commissioner (OPC) is closely monitoring the trial. I have no special insight, but from the links above I deduce that the trial stores run their CCTV feed through facial image matching software set at a high 90% threshold, matching it against that particular store’s database of known and convicted offenders. If a possible match is made, specially trained ‘super recognisers’ visually inspect both the enrolled and detected images, which in itself should eliminate racial bias, while the rest of the feed is deleted.

Permanent deletion not being straightforward, and the ‘no sharing’ rule between stores, are matters that the OPC likely monitors along with the trial’s effectiveness at reducing retail crime. While emerging anecdotal evidence overseas suggests its effectiveness, direct comparative research is needed.

CCTV and facial recognition are already widely used for crime detection in public places, and we all use facial recognition every day: on our phones, when we cross the border using Smart Gate, or when we use a browser on our PC. So you might ask, why all the fuss?

There are large notices in-store, it’s private property and people can choose to shop elsewhere. The additional use of image-matching software in stores improves matching processes traditionally done by humans, which are prone to human error. FR software and camera quality continuously improve, while human-based matching has limitations. Perfection is challenging, but by combining human and technological efforts we can improve outcomes.

Foodstuffs North Island’s adherence to its rules raises the question of whether striving for perfection impedes progress. DINZ’s Biometrics Special Interest Group reflects on differing community views, agrees with the Deputy Police Commissioner on the need for an open discussion and emphasises the need for education on the technology’s workings and potential benefits when implemented correctly.

Help us provide much needed education and understanding in this domain.

Ngā mihi nui

Colin Wallis

DINZ Executive Director

Read the full news here: Biometrics… ‘Perfect is the enemy of good?’ | February 2024 Newsletter

SUBSCRIBE FOR MORE

The post Biometrics… ‘Perfect is the enemy of good?’ | February 2024 Newsletter appeared first on Digital Identity New Zealand.


FIDO Alliance

FIDO Alliance Announces Call for Speakers and Sponsors for FIDO APAC Summit 2024


February 21, 2024

The FIDO Alliance is excited to announce the return of the FIDO APAC Summit for its second year, building on the success of the 2023 event in Vietnam. Scheduled to take place at the JW Marriott Kuala Lumpur, Malaysia, from September 10th to 11th, this premier event in the APAC region is dedicated to advancing phishing-resistant FIDO authentication – focusing on FIDO-based sign-ins with passkeys, and addressing IoT security and edge computing challenges with FIDO Device Onboarding (FDO).

Last year’s conference in Vietnam welcomed over 300 attendees and featured more than 20 sessions with engaging content alongside a sold-out exhibit area with over 20 industry-leading exhibitors and sponsors. The 2024 summit aims to build upon last year’s momentum with detailed case studies, technical tutorials, expert panels, and hands-on workshops. Sessions are designed to educate attendees on business drivers, technical considerations, and best practices for deploying modern authentication systems across web, enterprise and government applications. Additionally, attendees will benefit from a dynamic expo hall and engaging networking opportunities, set against the backdrop of downtown Kuala Lumpur’s natural beauty.

FIDO APAC Summit 2024 Call for Speakers

The FIDO Alliance invites thought leaders, industry experts, entrepreneurs, and academic professionals to submit speaking proposals to enrich the diverse FIDO APAC Summit 2024 program. Speakers with innovative ideas, implementation strategies, and successes in authentication and/or edge computing, from case studies to transformative projects, can submit proposals here. Selected speakers will join the ranks of top cybersecurity minds, influencing the community and promoting phishing-resistant authentication methods. Submit a proposal for an opportunity to shape cybersecurity’s future in the APAC region. Deadline for submissions is May 31, 2024. 

Sponsorship Opportunities at FIDO APAC Summit 2024

Join sponsors such as Samsung Electronics, SecureMetric, RSA, Thales, VinCSS, iProov, AirCuve, Zimperium, SmartDisplayer, and Utimaco and elevate your brand in the digital security landscape by sponsoring the FIDO APAC Summit 2024. This key event draws the cybersecurity community, offering sponsors a chance to interact with over 30 VIPs, speakers, and 300+ delegates, providing unparalleled brand visibility and thought leadership opportunities in the Asia-Pacific tech ecosystem. The summit is an ideal platform for sponsors eager to connect with an audience passionate about advanced passkeys and phishing-resistant authentication methods. Sponsoring this event places your brand at the forefront, engaging directly with professionals and policymakers driving the future of secure digital identities. Demonstrate your commitment to innovation and the development of secure, user-friendly digital ecosystems and influence the benchmark for authentication technologies by becoming a sponsor.

To become a sponsor, view the prospectus and complete the Sponsorship Request Form.

About FIDO Alliance

Formed in July 2012, the FIDO (Fast IDentity Online) Alliance aims to address the lack of interoperability among strong authentication technologies and the difficulties users face with creating and remembering multiple usernames and passwords. The FIDO Alliance is revolutionizing authentication with standards for simpler, stronger methods that reduce reliance on passwords. FIDO Authentication offers stronger, private, and easier use when authenticating to online services. For more information, visit www.fidoalliance.org.

Tuesday, 20. February 2024

DIF Blog

Full steam ahead for the Veramo User Group

The Veramo User Group has kicked into action with a well-attended and productive first meeting.  The meeting on 15th February provided context for the origins of the popular Javascript framework and its subsequent donation to DIF, and surfaced a range of questions, ideas, use cases and feedback from current

The Veramo User Group has kicked into action with a well-attended and productive first meeting. 

The meeting on 15th February provided context for the origins of the popular Javascript framework and its subsequent donation to DIF, and surfaced a range of questions, ideas, use cases and feedback from current and prospective users, plus success stories including an Enterprise solution built on top of Veramo, now in full production. 

After an initial round of introductions, the Veramo Labs team provided an overview of the project’s history and current status, and the goals of the User Group. 

Ideas shared by participants included a registry of plugins that is actively maintained by the community, additional development languages and a 12-month roadmap of planned new features. 

“The team’s goal has been from the beginning to grow the project by growing the community,” Veramo co-founder and group co-chair Mircea Nistor commented during the meeting. 

“Prior to donating Veramo to DIF, we had previously donated other libraries, and we saw more adoption and contributions happening on these libraries. We’d love to see the same kinds of activities happening within the Veramo User Group,” he added.

“To make this into a community project, it needs the original team to be no longer exclusively in charge. The Veramo framework and plugins already contain enough functionality to get people started in this space. We hope to see many new GitHub issues coming from the User Group, and for those with bandwidth to take them on,” said Senior Engineer at Consensys Identity and group co-chair Nick Reynolds.

The plan is to start reviewing GitHub issues and Pull Requests (PRs) at next week’s meeting, which is at 09:00 PST / Noon EST / 18:00 CET on Thursday, 22 February. The User Group is open to all, the meeting link can be found in the DIF calendar.

In the meantime, interested parties are welcome to join the community on Discord. The new Veramo User Group channel on DIF’s Discord server is recommended for meeting follow-ups and agenda items, and the Veramo Labs Discord server is the place to head for specific technical questions.

The Veramo Labs team is hoping for the community to participate in leading the User Group within the next six months. 


OpenID

Registration Open for OpenID Foundation Hybrid Workshop at Google on Monday, April 15, 2024


Workshop Overview

OpenID Foundation Workshops provide technical insight and influence on current digital identity standards while offering a collaborative platform to openly address current trends and market opportunities. This OpenID Foundation Workshop includes a number of presentations focused on 2024 Foundation strategic initiatives as well as updates on active working groups.


Workshop Details

Thank you kindly to Google for hosting this after-lunch, hybrid workshop on Monday, April 15, 2024, 12:30-4pm PT. The workshop will be at the Google Sunnyvale Campus, with address and room details to be shared once confirmed.

Please note that registration is required and open until Friday, April 5, 2024 12pm PT: https://www.eventbrite.com/e/openid-foundation-workshop-at-google-monday-april-15-2024-tickets-846262954277?aff=oddtdtcreator 

All registered participants will receive a link to participate virtually prior to the workshop. This is an after-lunch workshop with beverages and snacks provided to those attending in person. The Foundation’s Note Well Statement can be found here and is used to govern workshops.


Agenda

The workshop agenda will be published and promoted as soon as confirmed.

The post Registration Open for OpenID Foundation Hybrid Workshop at Google on Monday, April 15, 2024 first appeared on OpenID Foundation.


Content Authenticity Initiative

February 2024 | This Month in Generative AI: Election Season

From AI-resurrected dictators to AI-powered interactive chatbots, political campaigns around the world are deploying the technology to expand their audience and win over voters. This month, Hany Farid, UC Berkeley Professor, CAI Advisor, looks at examples of how it is becoming increasingly easy to combine fake audio with video, its clear effect on the electorate, and existing solutions for authenticating digital media.


by Hany Farid, UC Berkeley Professor, CAI Advisor

News and trends shaping our understanding of generative AI technology and its applications.

In May of 2019, a manipulated video of House Speaker Nancy Pelosi purportedly slurring her words in a public speech racked up over 2.5 million views on Facebook. Although the video was widely reported to be a deepfake, it was what we would today call a “cheap fake.” The original video of Speaker Pelosi was simply slowed down to make her sound inebriated — no AI needed. The cheap fake was, however, a harbinger.

Around 2 billion citizens will vote this year in some 70 elections around the globe. At the same time, generative AI has emerged as a powerful technology that can entertain, defraud, and deceive.

Today, nearly anyone can use generative AI to create hyper-realistic images from only a text prompt, clone a person's voice from a 30-second recording, or modify a video to make the speaker say things they never did or would say. Perhaps not surprisingly, generative AI is finding its way into everything from local to national and international politics. Some of these applications are used to bolster a candidate, but many are designed to be harmful to a candidate or party, and all applications raise new and complex questions.

Trying to help

In October of last year, New York City Mayor Eric Adams used generative AI to make robocalls in which he spoke Mandarin and Yiddish. (Adams only speaks English.) The calls did not disclose that the voice was AI-generated, and at least some New Yorkers believe that Adams is multilingual: "People stop me on the street all the time and say, ‘I didn’t know you speak Mandarin,’" Adams said. While the content of the calls was not deceptive, some claimed that the calls themselves were deceptive and an unethical use of AI.

Not to be outdone, earlier this year Representative Dean Phillips deployed a full-blown OpenAI-powered interactive chatbot to bolster his long-shot bid for the Democratic nomination in the upcoming presidential primary. The chatbot disclosed that it was an AI-bot and allowed voters to ask questions and hear an AI-generated response in an AI-generated version of Phillips's voice. Because this bot violated OpenAI's terms of service, it was eventually taken offline.

Trying to harm

In October of last year, Slovakia — a country that shares part of its eastern border with Ukraine — saw a last-minute and dramatic shift in its parliamentary election. Just 48 hours before election day, the pro-NATO and Western-aligned candidate Michal Šimečka was leading in the polls by some four points. A fake audio clip of Šimečka seeming to claim that he was going to rig the election spread quickly online, and two days later the party of the pro-Moscow candidate Robert Fico won the election by five points. It is impossible to say exactly how much the audio impacted the election outcome, but this incident raised concerns about the use of AI in campaigns.

Fast-forward to January of this year, when the state of New Hampshire was holding the nation’s first primary for the 2024 US presidential election. On the eve of the primary, more than 20,000 New Hampshire residents received robocalls impersonating President Biden. The call urged voters not to vote in the primary and to "save your vote for the November election." It took two weeks before New Hampshire’s Attorney General announced that his office had identified two businesses behind these robocalls.

The past few months have also seen an increasing number of viral images making the rounds on social media. These range from faked images of Trump with convicted child sex trafficker Jeffrey Epstein and a young girl, to faked images of Biden in military fatigues on the verge of authorizing military strikes. 

On the video front, it is becoming increasingly easy to combine fake audio with video to make people say and do things they never did. For example, a speech originally given by Vice President Harris on April 25, 2023, at Howard University was digitally altered to replace the voice track with a seemingly inebriated and rambling Harris.

And these are just a few examples of the politically motivated deepfakes that we have already started to see as the US national election heats up. In the coming months, I'll be keeping track of these examples as they continue to emerge.

Something in between

In the lead-up to Indonesia’s election earlier in February, a once-feared army general, who ruled the country with an iron fist for more than three decades, was AI-resurrected with a message for voters. And, in India, the former leader of the Dravida Munnetra Kazhagam party, deceased since 2018, was AI-resurrected with an endorsement for his son, the sitting head of the state of Tamil Nadu. I expect this type of virtual endorsement will become an (ethically complex) trend.

Looking ahead

There are two primary approaches to authenticating digital media. Reactive techniques analyze various aspects of an image or video for traces of implausible or inconsistent properties. Learn more about these photo forensics techniques in my series for the CAI. Proactive techniques, on the other hand, operate at the source of content creation, embedding into or extracting from an image or video an identifying digital watermark or signature. 
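As a rough sketch of the proactive idea, the following toy example signs media bytes at the source and verifies them later. Real provenance systems such as Content Credentials sign a structured manifest with certificate chains, so treat this only as an illustration of the underlying signature primitive, with made-up stand-in data.

```typescript
// Toy sketch of the "proactive" approach: sign media bytes at creation, verify later.
// Real systems sign a richer manifest; this shows only the core signature check.
import { createHash, generateKeyPairSync, sign, verify } from "node:crypto";

const { publicKey, privateKey } = generateKeyPairSync("ed25519");

// At capture/creation time: hash the media bytes and sign the digest with the creator's key.
function signMedia(bytes: Buffer): Buffer {
  const digest = createHash("sha256").update(bytes).digest();
  return sign(null, digest, privateKey);
}

// Later: anyone holding the creator's public key can check integrity and origin.
function verifyMedia(bytes: Buffer, signature: Buffer): boolean {
  const digest = createHash("sha256").update(bytes).digest();
  return verify(null, digest, publicKey, signature);
}

const original = Buffer.from("original media bytes");  // stand-ins for real file contents
const tampered = Buffer.from("edited media bytes");

const sig = signMedia(original);
console.log(verifyMedia(original, sig));  // true
console.log(verifyMedia(tampered, sig));  // false: any edit invalidates the signature
```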

Although not perfect, these combined reactive and proactive technologies will make it harder (but not impossible) to create a compelling fake and easier to verify the integrity of real content. The creation and detection of manipulated media, however, is inherently adversarial. Both sides will continually adapt, making distinguishing the real from the fake an ongoing challenge.

While it is relatively straightforward to regulate AI-powered non-consensual sexual imagery, child abuse imagery, and content designed to defraud, regulating political speech is more fraught. We, of course, want to give a wide berth for political discourse, but there should be limits on activities like those we saw in New Hampshire, where bad actors attempt to interfere with our voting rights. 

As a first step, following the New Hampshire AI-powered robocalls, the Federal Communications Commission quickly announced a ban on AI-powered robocalls. While the ruling is fairly narrow and doesn't address the wider issue of AI-powered election interference or non-AI-powered interference, it is a reasonable precaution as we all try to sort out this brave new world where anybody's voice or likeness can be manipulated.

As we continue to wrestle with these complex questions, we as consumers have to be particularly vigilant as we enter what is sure to be a highly contentious election season. We should be vigilant not to fall for disinformation just because it conforms to our personal views, we should be vigilant not to be part of the problem by spreading disinformation, and we should be vigilant to protect our and others' rights (even if we disagree with them) to participate in our democracy.

Subscribe to the CAI newsletter to receive ecosystem news.

Stay connected and consider joining the movement to restore trust and transparency online.

Author bio: Professor Hany Farid is a world-renowned expert in the field of misinformation, disinformation, and digital forensics. He joined the Content Authenticity Initiative (CAI) as an advisor in June 2023. The CAI is an Adobe-led community of media and tech companies, NGOs, academics, and others working to promote adoption of the open industry standard for content authenticity and provenance.

Professor Farid teaches at the University of California, Berkeley, with a joint appointment in electrical engineering and computer sciences at the School of Information. He’s also a member of the Berkeley Artificial Intelligence Lab, Berkeley Institute for Data Science, Center for Innovation in Vision and Optics, Development Engineering Program, and Vision Science Program, and he’s a senior faculty advisor for the Center for Long-Term Cybersecurity. His research focuses on digital forensics, forensic science, misinformation, image analysis, and human perception.

He received his undergraduate degree in computer science and applied mathematics from the University of Rochester in 1989, his M.S. in computer science from SUNY Albany, and his Ph.D. in computer science from the University of Pennsylvania in 1997. Following a two-year post-doctoral fellowship in brain and cognitive sciences at MIT, he joined the faculty at Dartmouth College in 1999 where he remained until 2019.

Professor Farid is the recipient of an Alfred P. Sloan Fellowship and a John Simon Guggenheim Fellowship, and he’s a fellow of the National Academy of Inventors.


FIDO Alliance

Intelligent Health.Tech: Site security: Passwordless fingerprint authentication


Thales has announced the SafeNet IDPrime FIDO Bio Smart Card – a security key that enables strong multi-factor authentication (MFA) for the enterprise. This new contactless smart card allows users to access enterprise devices, applications and cloud services using a fingerprint instead of a password. 


StateTech Magazine: How Passwordless Authentication Supports Zero Trust


Utilizing FIDO passkeys addresses security risks associated with password-based systems which often lead to account takeovers, data breaches and even stolen identities. While password managers and legacy forms of two-factor authentication offer incremental improvements, there has been industry-wide collaboration to create passkey sign-in technology that is more convenient and more secure.


TechTarget: How passwordless helps guard against AI-enhanced attacks


In the age of generative AI, phishing scams (which already account for 90% of data breaches according to CISA) are becoming increasingly persuasive and humanlike. To mitigate these evolving threats, organizations should prioritize transitioning to passkeys, a phishing-resistant alternative backed by industry giants like Google, Apple, Amazon, and Microsoft, to enhance both security and usability.


The Wall Street Journal: Forget Passwords and Badges: Your Body Is Your Next Security Key


Andrew Shikiar, executive director of the FIDO Alliance, emphasizes the importance of biometric scans as hacking attempts and other cyber threats have become more sophisticated.


Hyperledger Foundation

Hyperledger Mentorship Spotlight: Iroha 2 Blockchain Explorer


Engaging in the Iroha 2 Blockchain Explorer project through the Hyperledger Mentorship project has been an exhilarating journey marked by technical challenges, continuous learning, and a profound sense of contributing to the broader technical community. As a mentee in this project, I immersed myself in various facets that not only enhanced my technical skills but also offered valuable insights on becoming a more effective contributor to the open-source realm.


Velocity Network

Live event with Randstad and Rabobank

On March 19th, discover how the Dutch Banking industry is using verifiable credentials to accelerate the shift to a skills-based economy.

The post Live event with Randstad and Rabobank appeared first on Velocity.


MyData

Open Position: Finance and admin officer (50% FTE)

Job title: Finance and admin officer
Employment type: 50% employment contract
Contract duration: Permanent with a 6-month trial period
Salary range: 1,100 € – 1,250 € (2,200 € – 2,500 € FTE)
Location: Finland, with a preference for Helsinki
Reports to: Executive Director
Role description: The Finance and Administration Officer is responsible for monitoring and implementing financial operations, setting up and […]

Monday, 19. February 2024

Elastos Foundation

Elastos Rebranding: Embracing Bitcoin and Cyber Republic Governance


Elastos has initiated a critical shift towards refining its identity and reinforcing its core commitments towards delivering a Bitcoin-secured SmartWeb. Central to this shift is the introduction of a new logo alongside a forthcoming overhaul of the Elastos.info website. These updates are more than aesthetic changes; they symbolise Elastos’ strategic realignment, especially highlighting its integration with Bitcoin as a Layer 2 solution and its governance through the Cyber Republic.

 

Why?

The move to redesign the logo stems from a fundamental need to better represent Elastos’ synergy with Bitcoin—a relationship established through merged mining since 2018. Merged mining allows Elastos to enhance its network security by simultaneously mining with Bitcoin without additional computational power, effectively utilising Bitcoin’s extensive mining infrastructure to safeguard against attacks and improve reliability.

The redesigned logo encapsulates Elastos’ strategic alliance with Bitcoin and first emerged for BeL2, Elastos’ Bitcoin Layer 2 protocol that allows smart contracts to run on Bitcoin. After receiving positive feedback, it was suggested that Elastos could adopt a dark blue background and BeL2 a light grey, showcasing a family brand not too dissimilar to the likes of Amazon or Amazon Web Services.

The new logo visually harmonises with Bitcoin’s color scheme, creating an instinctive link to the cryptocurrency leader, yet it preserves Elastos’ unique identity. This balance between affiliation and individuality is a deliberate step towards embodying Elastos’ evolution and its foundational pillars in its visual representation. But to be accepted, the decision had to go through the Cyber Republic.

 

The Cyber Republic: A Model of Decentralised Governance

Elastos’ innovative governance system, the Cyber Republic (CR), highlights its commitment to decentralised, community-led growth. The CR, encompassing everyone from $ELA token holders to the Elastos Foundation and partners, is grounded in the Cyber Republic Consensus (CRC). This system, unlike traditional Proof of Work (PoW) or Byzantine Proof of Stake (BPoS), is designed for direct community involvement in governance, allowing for the election of council members, proposal voting, and incentivising active participation using mainchain ELA, which is merge-mined by Bitcoin. This ensures governance is transparent, equitable, and true to the decentralised nature of blockchain.

The CR Council’s unanimous 12/12 decision to approve the logo change reflects a unified commitment to this governance model and to the broader vision of Elastos. The forthcoming rebrand of the Elastos.info website extends this vision. It aims to be a digital manifestation of Elastos’ refreshed identity and strategic direction, enhancing community interaction, learning, and cooperation within the ecosystem. This step is seen as a continuation of aligning Elastos with its core values and the dynamic requirements of its community.

 

Paving the Way for a Brighter Future

Elastos’ journey through strategic rebranding, rooted in the innovative practices of merged mining with Bitcoin and the decentralised governance of the Cyber Republic, not only enhances Elastos’ position within the blockchain ecosystem but also sets the stage for a more interconnected, secure, and participatory digital world. What do you think about the refreshed logo? Let us know and stay up to date by following us on X!

The post Elastos Rebranding: Embracing Bitcoin and Cyber Republic Governance appeared first on Elastos.


Identity At The Center - Podcast

In our latest episode of The Identity at the Center Podcast,


In our latest episode of The Identity at the Center Podcast, we dive into a conversation with Daniel Grube about TikTok's adoption of FIDO technology to enhance security. Daniel shares insights into the seamless integration of this technology for both enterprise and user benefits, emphasizing the importance of user education and phased technology rollouts. We also explore the lighter side with a debate on airplane seating preferences. Listen to this enlightening discussion at idacpodcast.com or wherever you download your podcasts.

#iam #podcast #idac


GS1

e-CMR in GS1 Belgium

In April 2022, GS1 Belgium & Luxembourg launched a pilot project on e-CMR together with 7 companies, amongst which AB InBev.

The goal of this pilot project is to optimise the digitalisation of transport with e-CMR and to define standards that everyone can use. One year later, we already have some great insights and asked Andreea Calin from AB InBev to share her findings on e-CMR and our pilot project.

See more on GS1 Belgilux's article


Paperless – GS1 Poland (in Polish)

paperless_logistyka_bez_papieru_taniej_szybciej_bezpieczniej.pdf

E-CMR in Colian Logistic – GS1 Poland (in Polish)

bc_ecmr.pdf

Friday, 16. February 2024

Oasis Open Projects

Approved Errata for Common Security Advisory Framework v2.0 published

Update to the definitive reference for the CSAF language now available.

CSAF Aggregator schema updated

OASIS and the OASIS Common Security Advisory Framework (CSAF) TC [1] are pleased to announce the approval and publication of Common Security Advisory Framework Version 2.0 Errata 01.

This document lists the approved errata for the OASIS Standard “Common Security Advisory Framework Version 2.0.” The specific changes are listed in section 1.1, at https://docs.oasis-open.org/csaf/csaf/v2.0/errata01/os/csaf-v2.0-errata01-os.html#11-description-of-changes.

The Common Security Advisory Framework (CSAF) Version 2.0 is the definitive reference for the CSAF language which supports creation, update, and interoperable exchange of security advisories as structured information on products, vulnerabilities and the status of impact and remediation among interested parties.

The OASIS CSAF Technical Committee is chartered to make a major revision to the widely-adopted Common Vulnerability Reporting Framework (CVRF) specification, originally developed by the Industry Consortium for Advancement of Security on the Internet (ICASI). ICASI has contributed CVRF to the CSAF TC. The revision is being developed under the name Common Security Advisory Framework (CSAF). TC deliverables are designed to standardize existing practice in structured machine-readable vulnerability-related advisories and further refine those standards over time.

The documents and related files are available here:

Common Security Advisory Framework Version 2.0 Errata 01
OASIS Approved Errata
26 January 2024

Editable source (Authoritative):
https://docs.oasis-open.org/csaf/csaf/v2.0/errata01/os/csaf-v2.0-errata01-os.md

HTML:
https://docs.oasis-open.org/csaf/csaf/v2.0/errata01/os/csaf-v2.0-errata01-os.html

PDF:
https://docs.oasis-open.org/csaf/csaf/v2.0/errata01/os/csaf-v2.0-errata01-os.pdf

JSON schemas:
Aggregator JSON schema:
https://docs.oasis-open.org/csaf/csaf/v2.0/errata01/os/schemas/aggregator_json_schema.json
CSAF JSON schema:
https://docs.oasis-open.org/csaf/csaf/v2.0/errata01/os/schemas/csaf_json_schema.json
Provider JSON schema:
https://docs.oasis-open.org/csaf/csaf/v2.0/errata01/os/schemas/provider_json_schema.json

For your convenience, OASIS provides a complete package of the specification document and any related files in ZIP distribution files. You can download the ZIP file at:
https://docs.oasis-open.org/csaf/csaf/v2.0/errata01/os/csaf-v2.0-errata01-os.zip
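As a rough illustration of how the published schemas can be used, the sketch below validates a heavily truncated, purely illustrative advisory object against the CSAF 2.0 JSON schema linked above, using the Ajv validator (the CSAF 2.0 schema targets JSON Schema draft 2020-12). The validator choice and options are assumptions for the example, not part of the OASIS deliverables.

```typescript
// Sketch: check a CSAF 2.0 advisory against the published JSON schema with Ajv.
// Requires the "ajv" and "ajv-formats" packages; the sample document below is
// illustrative and far from a complete, valid advisory.
import Ajv2020 from "ajv/dist/2020";
import addFormats from "ajv-formats";

const SCHEMA_URL =
  "https://docs.oasis-open.org/csaf/csaf/v2.0/errata01/os/schemas/csaf_json_schema.json";

async function validateAdvisory(advisory: unknown): Promise<void> {
  const schema = await (await fetch(SCHEMA_URL)).json();   // fetch the published schema
  const ajv = new Ajv2020({ strict: false });               // tolerate vendor keywords
  addFormats(ajv);                                          // date-time, uri, etc.
  const validate = ajv.compile(schema);
  if (validate(advisory)) {
    console.log("Advisory is schema-valid");
  } else {
    console.error("Schema violations:", validate.errors);
  }
}

validateAdvisory({
  document: { category: "csaf_base", csaf_version: "2.0" /* ...remaining fields elided... */ },
}).catch(console.error);
```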

Members of the CSAF TC [1] approved the publication of these Errata by Full Majority Vote [2]. The Errata had been released for public review as required by the TC Process [3]. The Approved Errata are now available online in the OASIS Library as referenced above.

Our congratulations to the CSAF TC on achieving this milestone.

========== Additional references:
[1] OASIS Common Security Advisory Framework (CSAF) TC
https://www.oasis-open.org/committees/csaf/

[2] https://lists.oasis-open.org/archives/csaf/202402/msg00001.html

[3] Public review:
– 15-day public review, 20 December 2023: https://lists.oasis-open.org/archives/members/202312/msg00005.html
– Comment resolution log: https://docs.oasis-open.org/csaf/csaf/v2.0/errata01/csd01/csaf-v2.0-errata01-csd01-comment-resolution-log.txt

The post Approved Errata for Common Security Advisory Framework v2.0 published appeared first on OASIS Open.


FIDO Alliance

White Paper: Addressing FIDO Alliance’s Technologies in Post Quantum World


There has been considerable press, a number of papers, and several formal initiatives concerned with quantum computing’s impact on cryptographic algorithms and protocols. Most standards development organizations are addressing concerns about the impact on the security of the currently deployed cryptographic algorithms and protocols. This paper presents FIDO Alliance initiatives that address the impact of quantum computing on the Alliance’s specifications and how the FIDO Alliance is working to retain the long-term value provided by products and services based on the FIDO Alliance specifications. 

This paper is directed to those who have or are considering FIDO-enabled products and solutions but have concerns about the impact of Quantum Computing on their business. This paper will focus, from a high-level approach, on the FIDO Alliance’s acknowledgment of issues related to Quantum Computing and explain how the FIDO Alliance is taking appropriate steps to provide a seamless transition from the current cryptographic algorithms and protocols to new PQC (or quantum-safe) algorithms in a timely manner.

For any questions or comments, please contact feedback@fidoalliance.org.

Wednesday, 14. February 2024

FIDO Alliance

Webinar: Next-Gen Authentication: Implementing Passkeys for your Digital Services


Despite their shortcomings, passwords have been a necessary evil; an unavoidable reality. No wonder online services have struggled to get rid of passwords for nearly three decades. Not anymore! Passkeys have emerged as a modern form of authentication, offering a superior user experience and higher security. With over 8 billion accounts already protected by passkeys, the question for service providers isn’t “if” they should adopt passkeys, but rather “when” and “how”.

During the webinar attendees were able to: 

Learn why this standards-based approach is gaining such rapid traction
Understand how ready your end users are for adopting passkeys
Get actionable guidance to roll out passkeys for both low and high assurance authentication
Understand why you should introduce passkeys for your digital services right away

Tuesday, 13. February 2024

Oasis Open Projects

The DocBook Schema Version 5.2 OASIS Standard published

DocBook Version 5.2 continues the evolution of the DocBook XML schema.

DocBook continues its evolution - over 25 years since origin

OASIS is pleased to announce the publication of its newest OASIS Standard, approved by the members on 06 February 2024:

The DocBook Schema Version 5.2
OASIS Standard
06 February 2024

Overview:

Almost all computer hardware and software developed around the world needs some documentation. For the most part, this documentation has a similar structure and a large core of common idioms. The community benefits from having a standard, open, interchangeable vocabulary in which to write this documentation. DocBook has been, and will continue to be, designed to satisfy this requirement. For more than 25 years, DocBook has provided a structured markup vocabulary for just this purpose. DocBook Version 5.2 continues the evolution of the DocBook XML schema.

The prose specifications and related files are available here:

The DocBook Schema Version 5.2

Editable source (Authoritative):
https://docs.oasis-open.org/docbook/docbook/v5.2/os/docbook-v5.2-os.docx

HTML:
https://docs.oasis-open.org/docbook/docbook/v5.2/os/docbook-v5.2-os.html

PDF:
https://docs.oasis-open.org/docbook/docbook/v5.2/os/docbook-v5.2-os.pdf

Schemas:
Relax NG schemas: https://docs.oasis-open.org/docbook/docbook/v5.2/os/rng/
Schematron schemas: https://docs.oasis-open.org/docbook/docbook/v5.2/os/sch/
XML catalog: https://docs.oasis-open.org/docbook/docbook/v5.2/os/catalog.xml
NVDL schemas: https://docs.oasis-open.org/docbook/docbook/v5.2/os/

Distribution ZIP file

For your convenience, OASIS provides a complete package of the prose specification and related files in a ZIP distribution file. You can download the ZIP file here:

https://docs.oasis-open.org/docbook/docbook/v5.2/os/docbook-v5.2-os.zip

Our congratulations to the members of the OASIS DocBook Technical Committee on achieving this milestone.

The post The DocBook Schema Version 5.2 OASIS Standard published appeared first on OASIS Open.

Monday, 12. February 2024

Hyperledger Foundation

Introducing Hyperledger Web3j, the Ethereum Integration Library for Enterprises


We are very excited to announce the newest Hyperledger project, Hyperledger Web3j. Contributed by Web3 Labs to Hyperledger Foundation in January 2024, Web3j is a well-established open source project with an active community that plans to further thrive within the Hyperledger ecosystem.


Identity At The Center - Podcast

It’s time for the next exciting episode of the Identity at the Center Podcast!

It’s time for the next exciting episode of the Identity at the Center Podcast! We dive into the fascinating world of Security Operations Centers (SOCs) and their crucial role in identity security.

In this episode, we had the privilege of hosting two experts from RSM's Managed Security Practice, Steve Kane and Todd Willoughby. Their insights and expertise shed light on the role of SOCs in identity security, evolving threats, and the importance of identity data within SOCs. We also explore the decision-making process between building your own SOC or outsourcing.

Listen to the full episode on idacpodcast.com or in your favorite podcast app and gain valuable insights into the anatomy of a breach, actions taken by SOCs to prevent attacks, and the tactics and techniques used by threat actors to avoid detection.

#iam #podcast #idac

Friday, 09. February 2024

Oasis Open Projects

Safeguarding Democracy with Open Standards

By Francis Beland, Executive Director, OASIS Open

I have been writing a lot lately on the power of open standards but never has it been better displayed than in protecting democracy. Let me know what you think…

Open standards can play a critical role in protecting democracy by promoting transparency, interoperability, and accessibility in the digital sphere. Here are a few ways in which open standards can help safeguard democratic values:

Promoting transparency: Open standards ensure that data and information are openly available and can be accessed and verified by anyone. This helps to prevent the spread of disinformation, propaganda, and fake news, which can undermine democratic processes.

Encouraging interoperability: Open standards ensure that different systems can work together seamlessly, which is crucial for democratic institutions that need to share data and collaborate with each other. Interoperability also promotes competition and innovation, which can help to prevent monopolies and maintain a level playing field for all.

Enhancing accessibility: Open standards can help to ensure that everyone has access to digital tools and services, regardless of their socioeconomic status or location. This can help to empower marginalized communities and ensure that everyone has a voice in the democratic process.

Providing security: Open standards can help to ensure that digital systems and data are secure and resistant to hacking, data breaches, and other forms of cyber attacks. This is crucial for protecting democratic institutions and preventing outside interference in the democratic process.

Overall, open standards can help to promote democratic values by ensuring that digital systems are transparent, interoperable, accessible, and secure. By adopting open standards, democratic institutions can help to build trust with citizens and promote a more inclusive and equitable society.

The post Safeguarding Democracy with Open Standards appeared first on OASIS Open.


FIDO Alliance

WIRED: I Stopped Using Passwords. It’s Great—and a Total Mess

More than 8 billion online accounts can set up passkeys right now, says Andrew Shikiar, the chief executive of the FIDO Alliance, an industry body that has developed the passkey over the past decade. So, I decided to kill my passwords.


TechRound: Top 10 UK Business Cybersecurity Providers

Intercede’s solutions offer maximum protection against data breaches, focusing on:

Digital Identity Management for citizens, the workforce, and supply chains
Compliance adherence
Technological solutions such as FIDO, Digital ID Registration, Mobile Authentication, and PKI for robust identity and credential management

International Security Journal: The role of MFA in the fight against phishing

Based on FIDO Alliance and W3C standards, passkeys replace passwords with cryptographic key pairs. This requires the user to further authenticate themselves off-site using either software- or hardware-bound solutions.



Gear Patrol: Want a Faster, More Secure Way of Logging into X on Your iPhone? Use a Passkey

X (formerly Twitter) has introduced passkeys for iPhone users as an alternative to traditional passwords. Passkeys offer heightened security through their inherent two-step authentication system and are generated by the device along with the X account, making them less vulnerable to phishing and unauthorized access.


Velocity Network

NSC’s Chris Goodson joins Velocity’s board

We're delighted that National Student Clearinghouse's Chris Goodson has been voted onto the Velocity Network Foundation Board of Directors. The post NSC’s Chris Goodson joins Velocity’s board appeared first on Velocity.

Thursday, 08. February 2024

OpenID

Public Review Period for Proposed Implementer’s Draft of OpenID for Verifiable Credential Issuance

The OpenID Connect Working Group recommends approval of the following specification as an OpenID Implementer’s Draft:

OpenID for Verifiable Credential Issuance 1.0

This would be the first Implementer’s Draft of this specification.

An Implementer’s Draft is a stable version of a specification providing intellectual property protections to implementers of the specification. This note starts the 45-day public review period for the specification draft in accordance with the OpenID Foundation IPR policies and procedures. Unless issues are identified during the review that the working group believes must be addressed by revising the draft, this review period will be followed by a seven-day voting period during which OpenID Foundation members will vote on whether to approve this draft as an OpenID Implementer’s Draft. For the convenience of members who have completed their reviews by then, voting will actually begin a week before the start of the official voting period.

The relevant dates are:

Implementer’s Draft public review period: Thursday, February 8, 2024 to Sunday, March 24, 2024 (45 days)
Implementer’s Draft vote announcement: Monday, March 11, 2024
Implementer’s Draft early voting opens: Monday, March 18, 2024*
Implementer’s Draft official voting period: Monday, March 25, 2024 to Monday, April 1, 2024 (7 days)*

* Note: Early voting before the start of the formal voting period will be allowed.

The OpenID Connect working group page is https://openid.net/wg/connect/. Information on joining the OpenID Foundation can be found at https://openid.net/foundation/members/registration. If you’re not a current OpenID Foundation member, please consider joining to participate in the approval vote.

You can send feedback on the specification in a way that enables the working group to act upon it by (1) signing the contribution agreement at https://openid.net/intellectual-property/ to join the working group (please specify that you are joining the “AB/Connect” working group on your contribution agreement), (2) joining the working group mailing list at https://lists.openid.net/mailman/listinfo/openid-specs-ab, and (3) sending your feedback to the list.

— Michael B. Jones

The post Public Review Period for Proposed Implementer’s Draft of OpenID for Verifiable Credential Issuance first appeared on OpenID Foundation.


Oasis Open Projects

Two XACML Committee Specifications approved – “Related and Nested Entities” and “Separation of Duties”

This specification defines a method for supporting separation of duties within XACML policies. The post Two XACML Committee Specifications approved – “Related and Nested Entities” and “Separation of Duties” appeared first on OASIS Open.

Two XACML specifications ready for testing and implementation

OASIS is pleased to announce the approval and publication of two new Committee Specifications by the members of the eXtensible Access Control Markup Language (XACML) TC [1]:
– XACML v3.0 Related and Nested Entities Profile Version 1.0 Committee Specification 03
– XACML v3.0 Separation of Duties Version 1.0 Committee Specification 01

These two Committee Specifications are OASIS deliverables, completed and approved by the TC and fully ready for testing and implementation.

XACML v3.0 Related and Nested Entities Profile Version 1.0
Committee Specification 03
30 January 2024

Overview:

It is not unusual for access control policy to be dependent on attributes that are not naturally properties of the access subject or resource, but rather are properties of entities that are related to the access subject or resource. This profile defines the means to reference such attributes from within XACML policies for processing by a policy decision point.

The prose specifications and related files are available here:

Editable source (Authoritative):
https://docs.oasis-open.org/xacml/xacml-3.0-related-entities/v1.0/cs03/xacml-3.0-related-entities-v1.0-cs03.docx

HTML:
https://docs.oasis-open.org/xacml/xacml-3.0-related-entities/v1.0/cs03/xacml-3.0-related-entities-v1.0-cs03.html

PDF:
https://docs.oasis-open.org/xacml/xacml-3.0-related-entities/v1.0/cs03/xacml-3.0-related-entities-v1.0-cs03.pdf

XML schemas:
https://docs.oasis-open.org/xacml/xacml-3.0-related-entities/v1.0/cs03/schemas/

Distribution ZIP file

For your convenience, OASIS provides a complete package of the prose specification and related files in a ZIP distribution file:
https://docs.oasis-open.org/xacml/xacml-3.0-related-entities/v1.0/cs03/xacml-3.0-related-entities-v1.0-cs03.zip

******

XACML v3.0 Separation of Duties Version 1.0
Committee Specification 01
30 January 2024

Overview:

This specification defines a method for supporting separation of duties within XACML policies using obligations and allowing the full generality of attribute-based access control. In particular, duties are not required to be associated with subject roles.

The prose specifications and related files are available here:

Editable source (Authoritative):
https://docs.oasis-open.org/xacml/xacml-3.0-duties/v1.0/cs01/xacml-3.0-duties-v1.0-cs01.docx

HTML:
https://docs.oasis-open.org/xacml/xacml-3.0-duties/v1.0/cs01/xacml-3.0-duties-v1.0-cs01.html

PDF:
https://docs.oasis-open.org/xacml/xacml-3.0-duties/v1.0/cs01/xacml-3.0-duties-v1.0-cs01.pdf

Distribution ZIP file:
https://docs.oasis-open.org/xacml/xacml-3.0-duties/v1.0/cs01/xacml-3.0-duties-v1.0-cs01.zip

Members of the eXtensible Access Control Markup Language (XACML) TC [1] approved these two specifications by Special Majority Vote. The specifications had been released for public review as required by the TC Process [2]. The vote to approve as Committee Specifications passed [3], and the documents are now available online in the OASIS Library as referenced above.

Our congratulations to the TC on achieving these milestones and our thanks to the reviewers who provided feedback on the specification drafts to help improve the quality of the work.

========== Additional references:

[1] eXtensible Access Control Markup Language (XACML) TC
https://www.oasis-open.org/committees/xacml/

[2] Details of public reviews:
https://docs.oasis-open.org/xacml/xacml-3.0-related-entities/v1.0/csd03/xacml-3.0-related-entities-v1.0-csd03-public-review-metadata.html
https://docs.oasis-open.org/xacml/xacml-3.0-duties/v1.0/csd01/xacml-3.0-duties-v1.0-csd01-public-review-metadata.html

[3] Approval ballot:
https://www.oasis-open.org/committees/ballot.php?id=3812

The post Two XACML Committee Specifications approved – “Related and Nested Entities” and “Separation of Duties” appeared first on OASIS Open.


We Are Open co-op

Pathways to Change

How to run an impactful workshop using our free template

Recently, we ran a Theory of Change workshop for the team at the Digital Credentials Consortium, which is hosted by MIT. We’ve found that organisations and projects that are looking to create big impact can benefit from this way of seeing into the future.

Theory of Change (ToC) is a methodology or a criterion for planning, participation, adaptive management, and evaluation that is used in companies, philanthropy, not-for-profit, international development, research, and government sectors to promote social change. (Wikipedia)

In this post we’ll outline what this kind of session aims to achieve, share a template which you can re-use, and explain how to make best use of it.

We’ve become pretty good at running these kinds of workshops for all kinds of clients, large and small, and find them particularly useful in charting a course for collaborative working. Thanks goes to Outlandish for introducing this approach to us!

ToC workshop template by WAO available under a Creative Commons Attribution 4.0 International license

Note: around seven people is ideal for this kind of workshop. We run this workshop remotely, but there’s no reason why it couldn’t be done in person.

🥝 At the Core

The template has several sections to it, but at the core is the triangle of Final goal, Outcomes, and Activities. You work through these in turn, first defining the goal, moving onto outcomes to support that goal, and then activities which lead to the outcomes.

The core of the ToC approach

One of the first things to figure out as a team is the timeframe for the work you are doing together. In terms of the final goal, is it to be achieved in six months? A year? 18 months? Three years?

Write that down on the sticky note at the top left-hand corner just to remind everyone.

⛏️ Breaking down the goal

The final goal can be difficult to write, so we’ve broken it down into three sections to make it easier for participants:

Before asking people to contribute ideas, we run through some examples from our own experience. The first row relates to work over around six months, the second over about 18 months, and the third over about three years.

Next, we use the section to the right-hand side, where each individual participant can take some time to write down what they think the organisation does (or should do) to influence their stakeholders and have the desired impact in the world.

They can approach these boxes in any order — for example, some people find it easier to go straight to the impact and then work backwards.

Once everyone has written something in all of the boxes, we go around and ask everyone in turn to explain what they’ve written. This adds some context.

Then, we go around again, and ask everyone to point to things that other people have written that they definitely agree with. This sets the scene for combining ideas into a collaborative final goal.

✅ Good enough for now, safe enough to try

After a quick break, participants are ready to collaborate on a combined final goal. We ask if anyone would like to have a go at filling in one of the boxes. They can do this directly themselves, or we can screenshare and fill it in for them.

After some discussion and iteration of what’s been written, we move onto the other boxes. It’s worth mentioning that the most important thing here is facilitated discussion, which means timeboxing in a way that doesn’t feel rushed.

The phrase to bear in mind is “good enough for now, safe enough to try”, which is a slightly different way of saying “perfect is the enemy of good”.

🔍 Identifying the Outcomes

Getting the goal agreed on by the team is 80% of the work in this session. In our experience, it’s entirely normal for this to take an entire 90-minute session, or even longer.

Moving on to the outcomes, these are statements which support the goal. They are changes or achievements that need to happen for the goal to be achieved; they should be written in a way that makes it possible to say “yes, that has happened” or “no, it has not”.

For example, “the world is a better place” is not an example of a well-written outcome, but “more people agree that the city is a safer place to live” would work.

Other examples of decent outcomes from different kinds of work might be:

Local biodiversity is enhanced and pollution is reduced.
Parents demonstrate improved understanding of internet safety and digital citizenship.
Economic diversity within neighbourhoods is increased.

There are several ways we’ve run this part of the workshop, from full-anarchism mode where people just ‘have at it’ through to taking it in turns in an orderly way to add (and then discuss) an outcome.

🚣 Getting to the Activities

People new to ToC workshops often conflate Outcomes and Activities. The easiest way to tell the difference is to ask whether it’s something we’re working towards, or whether it’s something we’re doing.

So, for example, if we take the outcome “Local biodiversity is enhanced and pollution is reduced” some supporting activities might be:

Introduce incentives for creating wildlife-friendly spaces, such as green roofs and community gardens.
Run regular river and park clean-up operations to remove pollutants and litter.
Enforce stricter regulations on industrial emissions and waste management.
Offer subsidies for businesses that implement green practices that reduce pollution and enhance biodiversity.
Promote the use of environmentally friendly pesticides and fertilisers in local farming and gardening.

Again, we’ve run workshops where we’ve just had a free-for-all, others where it’s been more orderly, and then others where teams have gone away and come up with the activities outside the session.

Some, in fact, have taken the existing activities they’re engaged with and tried mapping those onto the outcomes. It’s an interesting conversation when those activities don’t map!

💡 Final thoughts

A ToC workshop is a powerful way to chart a course together. It’s a collaborative endeavour for a small group to spend time on. What’s important is strong facilitation, as without it, participants can spend too much time (or not enough!) sharing their thoughts.

If you would like to explore WAO running a ToC workshop for your organisation, get in touch! We also have other approaches and openly-licensed templates that you may want to use and peruse at our Learn with WAO site.

Pathways to Change was originally published in We Are Open Co-op on Medium, where people are continuing the conversation by highlighting and responding to this story.

Wednesday, 07. February 2024

Ceramic Network

CeramicWorld 02

A digest of everything happening across the Ceramic ecosystem for February 2024: ETHDenver, new Orbis, DataFeed API, points, mainnet launches and more!

Welcome to the second edition of CeramicWorld, the monthly Ceramic ecosystem newsletter. Let's dive right in!

🏔️ Attend Proof of Data Summit!

Join Ceramic x Tableland on March 1 in Denver for Proof of Data Summit, a full-day community gathering on reputation, identity, DePIN, decentralized AI, and decentralized data computing. Featuring lightning talks, technical discussions, and panels with industry visionaries, this is going to be a can't-miss event! RSVP now to secure your spot in person or via livestream.

RSVP to Proof of Data Summit

👀 Loading the all new Orbis...

Orbis is expanding beyond social. Driven by developer feedback and a new role as core developers in the Ceramic ecosystem, Orbis’ mission is evolving to offer a simple and efficient gateway for storing and managing open data on Ceramic.

The all new Orbis will provide a developer-friendly SQL interface to explore and query data on Ceramic as well as a user interface and plugin store to save development time on crypto-specific features – from data migration and token gating mechanisms to automated blockchain interactions.

Orbis is built on Ceramic's new Data Feed API, making it fully compatible with ComposeDB. With the new Orbis, developing your project on Ceramic is easier than ever. If you want to learn more or join as an alpha tester, get in touch with the Orbis team on Discord or Twitter.

Learn more about OrbisDB

🔎 Index Network connects LLMs to Ceramic data

Index Network is a composable discovery protocol that enables personalized and autonomous discovery experiences across the web. Index is currently focused on enabling AI agents to query and interact with Ceramic data in a reactive, event-driven manner. Index has also partnered with a number of ecosystem teams like Intuition, Veramo, and more to enable users to create claims and attestations with natural language. Keep an eye out for updates from this team – mainnet launch seems imminent – and check out their documentation.

Learn more about Index

📈 The Ceramic ecosystem is growing

We found it difficult to keep track of all the new projects and initiatives sprouting up throughout the Ceramic ecosystem, so we made this ecosystem map. Let us know if we missed anyone. Enjoy! :)

Ceramic's Data Feed API opens for alpha testing
The Data Feed API is a set of new Ceramic APIs that enable developers to subscribe to the node's data change feed, allowing them to build Ceramic-powered databases and indexing solutions. A number of ecosystem partners are already in testing as we gear up for an early release before EthDenver!

ComposeDB now supports interfaces
Interfaces enable standardized data models for interoperability. By defining essential fields that must be shared across models, interfaces facilitate data integration and querying across different models. This is vital for ensuring data consistency, especially in decentralized systems like verifiable credentials. For a detailed overview, see Intro to Interfaces.

Get started quickly with create-ceramic-app
The create-ceramic-app CLI tool simplifies the process of starting with the Ceramic network by allowing you to quickly set up a ComposeDB example app. If you're familiar with create-react-app or create-next-app, you should be right at home. If you want to quickly test a Ceramic app locally on your system, simply run npx @ceramicnetwork/create-ceramic-app. This command will guide you through creating a new Ceramic-powered social app in under a minute.

Collect attendance badges at ETHDenver 2024!
Ceramic is partnering with Fluence, a decentralized computing marketplace, to put forward a demo that will be in play at each of the events above. You will be able to find us at each event and tap a disc to participate! With each attendance you will claim badges represented as documents on Ceramic. Fluence will be consuming the new Ceramic Data Feed API to enable compute over incoming badges.

Deprecation of IDX, Self.ID, Glaze, DID DataStore, 3ID, TileDocuments, Caip10Link
3Box Labs announced the deprecation of a suite of outdated Ceramic development tools including Self.ID, Glaze, DID DataStore, 3ID, 3id-datastore, TileDocuments, and Caip10Link. Due to the improvements in ComposeDB and other Ceramic databases over the last 2 years, these tools saw waning demand, creating significant maintenance overhead while failing to meet our strict UX and security standards. If you're using any of these tools, read this announcement for next steps.

Ceramic Community Content
FORUM Ceramic protocol minimization?
WORKING GROUP Ceramic Points Working Group consisting of 10+ teams formed as a result of this forum post and tweet
PODCAST Why to Store ID Data Decentralized with Ceramic
TWITTER Oamo launches many new data pools in January
BLOG Charmverse x Ceramic: Empowering User Data Ownership in the Blockchain Era
TUTORIAL WalletConnect: Create User Sessions with Web3Modal
BLOG How Rust delivers speed and security for Ceramic
FORUM Ceramic x Farcaster Frames
FORUM Making ComposeDB’s composites sharable
WORKING GROUP Ceramic Core Devs Notes: 2024-01-02

Upcoming Events
Feb 15 Ceramic Core Devs Call
Feb 25 - Mar 2 Ceramic x Silk Hacker House (EthDenver): Calling all hackers excited about decentralized tech! Apply to join the Silk EthDenver Hacker House from Feb 25th - March 2nd and take part in revolutionizing scientific tooling, web account UX, governance forums, and more! Participants are encouraged to utilize Ceramic as a decentralized data layer alongside other ecosystem tools like Orbis, EAS and more. (Very limited spots)
Feb 25 - Mar 2 DeSci Denver (EthDenver)
Feb 27 DePin Day (EthDenver)
Feb 28 Open Data Day (EthDenver)
Mar 1 Proof of Data Summit (EthDenver)

Work on Ceramic
JOBS Head of Growth, 3Box Labs (Remote)
JOBS Engineering Manager, 3Box Labs (Remote)
BOUNTY Build a Ceramic x Scaffold-Eth Module

Contact Us

Want to get in touch with the Ceramic core team? Fill out this form (1m). Otherwise, drop us a note in the Forum.


Next Level Supply Chain Podcast with GS1

The Future of Connectivity with Digital Twins, AI, and the Global Supply Chain

Real-time data monitoring is revolutionizing maintenance and efficiency in industries such as aviation and automotive through digital twin technology.

Richard Donaldson, host of the Supply Chain Next podcast, is a visionary in supply chain management and circular economy advocate. His  insights on moving from linear to circular supply chains highlight the potential for substantial environmental benefits and the importance of embracing reuse, especially in the context of his work with startups promoting circularity.

The dialogue extends beyond the digital twin to the broader digital transformation of global supply chains, drawing comparisons to the quick adoption of airplane wifi as an example of rapid technological progress. It explores the role of artificial intelligence in supply chain automation and predictive maintenance, touching upon the divide between machine learning and self-actualized thought. The conversation resonates with historical references and Richard's personal entrepreneurial experiences, including his tenure at eBay, his podcast Supply Chain Next, and his perspective on learning from failure. This episode offers a thought-provoking reflection on the future of supply chains and the role of technology in sustainable business practices.


Key takeaways: 

The early days of the Internet continue to influence current work in digitizing supply chains.

The global supply chain still lacks full digitization and transparency, particularly in older, established processes.

There is a strong advocacy for shifting towards circular supply chains that are environmentally mindful and focused on sustainability.


Connect with GS1 US:

Our website - www.gs1us.org

GS1US on LinkedIn


Connect with guest:

Richard Donaldson on LinkedIn


Monday, 05. February 2024

Hyperledger Foundation

Governance Update: A Q+A with the Leaders of the 2024 Hyperledger Foundation Technical Oversight Committee

Governance is a cornerstone for open source community management and, ultimately, code. For Hyperledger Foundation, technical governance is the job of the Technical Oversight Committee (TOC), which consists of 11 members of the technical contributor community elected annually by maintainers and Governing Board members. (See the Foundation charter for more on TOC elections). As a group, they steer the technical direction for the Foundation. 


FIDO Alliance

ITPro: The end of passwords – and how businesses will embrace it

Big tech firms including Microsoft, Apple and Google have been moving towards a passwordless future for several years, with solutions such as security keys and more recently, passkeys, starting to take off as part of multi-factor authentication (MFA) setups.

The FIDO Alliance – which most big tech players are members of – is pushing hard for the demise of the password. But what exactly does “the end of the password” mean, in practical terms?


GovTech: Forum Questions Future of Digital Identity, Path Forward

At the recent ID policy forum, the FIDO Alliance, The Identity Theft Resource Center, and other cybersecurity experts discussed the need for new identity verification methods as data breaches reached record levels in 2023. Panelists argued that relying solely on knowledge-based methods like passwords and Social Security numbers is no longer secure and highlighted the importance of multifactor authentication, passkeys, and biometric checks.


PCMag: Passkeys Are Here: We Just Have to Convince People to Use Them

In a recent identity and authentication conference, Andrew Shikiar, Executive Director of the FIDO Alliance, declared 2023 as the “year of the passkey,” citing 8 billion user accounts with passkey access. Shikiar also emphasized the importance of passkeys in enhancing security, streamlining customer experiences, and gradually eliminating the reliance on traditional passwords, while acknowledging ongoing challenges and gaps in support across different industries and platforms.


Content Authenticity Initiative

January 2024 | This Month in Generative AI: Frauds and Scams

News and trends shaping our understanding of generative AI technology and its applications.

by Hany Farid, UC Berkeley Professor, CAI Advisor

Advances in generative AI continue to stun and amaze. It seems like every month we see rapid progression in the power and realism of AI-generated images, audio, and video. At the same time, it also seems like we are seeing rapid advances in how the resulting content is being weaponized against individuals, societies, and democracies. In this post, I will discuss trends that have emerged in the new year.

First it was Instagram ads of Tom Hanks promoting dental plans. Then it was TV personality Gayle King hawking a sketchy weight-loss plan. Next, Elon Musk was shilling for the latest crypto scam, and, most recently, Taylor Swift was announcing a giveaway of Le Creuset cookware. All ads, of course, were fake. 

How it works

Each of these financial scams was powered by a so-called lip-sync deepfake, itself powered by two separate technologies. First, a celebrity's voice is cloned from authentic recordings. Where it used to take hours of audio to convincingly clone a person's voice, today it takes only 60 to 90 seconds of authentic recording. Once the voice is cloned, an audio file is generated from a simple text prompt in a process called text-to-speech. 

In a variant of this voice cloning, a scammer creates a fake audio file by modifying an existing audio file to sound like someone else. This process is called speech-to-speech. This latter fake is a bit more convincing because with a human voice driving the fake, intonation and cadence tend to be more realistic.

Once the voice has been created, an original video is modified to make the celebrity’s mouth region move consistently with the new audio. Tools for both the voice cloning and video generation are now readily available online for free or for a nominal cost.

Although the resulting fakes are not (yet) perfect, they are reasonably convincing, particularly when being viewed on a small mobile screen. The genius — if you can call it that — of these types of fakes is that they can fail 99% of the time and still be highly lucrative for scam artists. More than any other nefarious use of generative AI, it is these types of frauds and scams that seem to have gained the most traction over the past few months. 

Protecting consumers from AI-powered scams

These scams have not escaped the attention of the US government. In March of last year, the Federal Trade Commission (FTC) warned citizens about AI-enhanced scams. And more recently, the FTC announced a voice cloning challenge designed to encourage "the development of multidisciplinary approaches — from products to policies to procedures — aimed at protecting consumers from AI-enabled voice cloning harms, such as fraud and the broader misuse of biometric data and creative content. The goal of the challenge is to foster breakthrough ideas on preventing, monitoring, and evaluating malicious voice cloning."

The US Congress is paying attention, too. A bipartisan bill, the NO FAKES Act, would "prevent a person from producing or distributing an unauthorized AI-generated replica of an individual to perform in an audiovisual or sound recording without the consent of the individual being replicated." 

Acknowledging that there may be legitimate uses of AI-powered impersonations, the Act has carve-outs for protected speech: "Exclusions are provided for the representation of an individual in works that are protected by the First Amendment, such as sports broadcasts, documentaries, biographical works, or for purposes of comment, criticism, or parody, among others." While the NO FAKES Act focuses on consent, Adobe’s proposed Federal Anti-Impersonation Right (the FAIR Act) provides a new mechanism for artists to protect their livelihoods while also protecting the evolution of creative style.

Looking ahead

Voice scams will come in many forms, from celebrity-powered scams on social media to highly personalized scams on your phone. The conventional wisdom of "If it seems too good to be true, it probably is" will go a long way toward protecting you online. In addition, for now at least, the videos often have telltale signs of AI-generation because there are typically several places where the audio and video appear de-synchronized, like a badly dubbed movie. Recognizing these flaws just requires slowing down and being a little more thoughtful before clicking, sharing, and liking.

Efforts are underway to add digital provenance or verifiable Content Credentials to audio. Respeecher, a voice-cloning marketplace gaining traction among creators and Hollywood studios, is adding Content Credentials to files generated with its tool.

For the more personalized attacks that will reach you on your phone in the form of a loved one saying they are in trouble and in need of cash, you and your family should agree on an easy-to-remember secret code word that can easily distinguish an authentic call from a scam.

Subscribe to the CAI newsletter to receive ecosystem news.

Stay connected and consider joining the movement to restore trust and transparency online.

Author bio: Professor Hany Farid is a world-renowned expert in the field of misinformation, disinformation, and digital forensics. He joined the Content Authenticity Initiative (CAI) as an advisor in June 2023. The CAI is an Adobe-led community of media and tech companies, NGOs, academics, and others working to promote adoption of the open industry standard for content authenticity and provenance.

Professor Farid teaches at the University of California, Berkeley, with a joint appointment in electrical engineering and computer sciences at the School of Information. He’s also a member of the Berkeley Artificial Intelligence Lab, Berkeley Institute for Data Science, Center for Innovation in Vision and Optics, Development Engineering Program, and Vision Science Program, and he’s a senior faculty advisor for the Center for Long-Term Cybersecurity. His research focuses on digital forensics, forensic science, misinformation, image analysis, and human perception.

He received his undergraduate degree in computer science and applied mathematics from the University of Rochester in 1989, his M.S. in computer science from SUNY Albany, and his Ph.D. in computer science from the University of Pennsylvania in 1997. Following a two-year post-doctoral fellowship in brain and cognitive sciences at MIT, he joined the faculty at Dartmouth College in 1999 where he remained until 2019.

Professor Farid is the recipient of an Alfred P. Sloan Fellowship and a John Simon Guggenheim Fellowship, and he’s a fellow of the National Academy of Inventors.


Identity At The Center - Podcast

Announcing another episode of The Identity at the Center Podcast!

Announcing another episode of The Identity at the Center Podcast! Join us as we dive into answering voicemail questions from our listeners. In this episode, we discuss topics such as the barrier of entry to IAM for entry-level roles, the role of IAM architects, influential roles in IAM with the rise of AI, and the choice between using Microsoft Enterprise Identity Protection or a dedicated third-party ITDR (Identity Threat Detection and Response) solution.

Congrats to listeners Andrew, Alex, Tim, Pedro, and Chris for sending in their questions and winning a digital copy of the book “Learning Digital Identity” by and courtesy of Phil Windley.

You can listen to this episode and catch up on all our previous episodes at idacpodcast.com or on your favorite podcast app.

#iam #podcast #idac

Friday, 02. February 2024

Ceramic Network

CharmVerse X Ceramic: Empowering User Data Ownership in the Blockchain Era

Discover how CharmVerse integrated Ceramic to implement a credentialing and rewards system that supports user-sovereign data.

CharmVerse, a pioneering web3 community engagement and onboarding platform, recently integrated ComposeDB on Ceramic to store user attestations for grants and rewards. CharmVerse’s decision to build on Ceramic was driven by the need to store credentials in a user-owned, decentralized manner, without relying on traditional databases.

A who’s who of well-known web3 projects leverage CharmVerse to help manage their community and grants programs. Optimism, Game7, Mantle, Safe, Green Pill, Purple DAO, Orange DAO, Taiko (and the list goes on) have all experienced the need for a unique, web3-centric platform to interact with and empower their ecosystems.

What Objectives Does the Integration Address?

The work of vetting developer teams and distributing grants demands a significant investment of time and focus to ensure responsible treasury deployment. This need-driven use case is a wonderful fit for Ceramic’s capabilities.

CharmVerse identified an opportunity to enhance grants/community managers’ capabilities by implementing a credentialing and rewards system that supports user-sovereign data. This system allows grants managers to better understand their applicants, scale the number of teams they can work with, and issue attestations representing skills and participation in grants and other community programs, creating a verifiable record of participation. However, this solution came with technical challenges in maintaining user data privacy and ownership while ensuring decentralization, as this data represents significant insight into the historical activity and capabilities of individuals and teams.

Why did CharmVerse Choose Ceramic?

CharmVerse considered various options but ultimately chose Ceramic due to its unique capability to support decentralized credentials and store attestations in a way that aligned with CharmVerse's vision. Alex Poon, CEO & co-founder of CharmVerse, shared:

“Ceramic's unique approach to data decentralization has been a game changer for us, allowing us to truly empower our users while respecting their privacy, allowing users the choice to keep their data private or publish it on-chain. This integration aligns perfectly with CharmVerse's success metrics, centering on community empowerment and data sovereignty.”

How did CharmVerse Integrate Ceramic?

CharmVerse's integration utilizes Ceramic's ability to store user attestations and leverages Ceramic’s work with the Ethereum Attestation Service (EAS) as the underlying model for supporting decentralized credentials. The integration was not only a technical milestone for CharmVerse but also achieved the strategic goal of appealing to an audience concerned with data privacy and ownership.

More specifically, CharmVerse issues off-chain signed attestations in recognition of important grant program milestones (designed to award these credentials both when users create proposals, and when their grants are accepted). Given Ceramic’s open-access design, we expect to see other teams utilize these credentials issued by CharmVerse as a strong indication of applicant reputation, track record, and ability to deliver.

How to See CharmVerse in Action

This collaboration illustrates the power of innovative solutions in advancing blockchain usability, value, and adoption while maintaining the values of the early cypherpunk vision of decentralization. If you would like to check out this integration and use the tool to manage your own community programs, visit app.charmverse.io and follow the CharmVerse X account for more updates!


FIDO Alliance

Recap: 2024 Identity, Authentication and the Road Ahead Policy Forum

What’s the state of identity and authentication in 2024?

That was the primary topic addressed in a day full of insightful speaker sessions and panels at the annual Identity, Authentication and the Road Ahead Policy Forum held on January 25 in Washington D.C. The event was sponsored by the Better Identity Coalition, the FIDO Alliance, and the ID Theft Resource Center (ITRC). 

Topics covered included the latest data on identity theft, financial crimes involving compromised identities and the overall ongoing challenges of identity and authentication. The opportunities for phishing-resistant authentication standards and passkeys resonated throughout the event as well. In his opening remarks, Jeremy Grant of the Better Identity Coalition framed identity as both a cause and potential solution to security problems. 

White House advances strong authentication agenda

In the opening keynote, Caitlin Clarke,  Senior Director, White House National Security Council, detailed some of the steps the Biden-Harris administration is taking to improve digital identity and combat rising cybercrime.

“Money is fuelling the ecosystem of crime, but we often see that identity is either the target or the culprit of the cyber incidents that we are seeing every day,” Clarke said. 

In a bid to help improve the state of identity and authentication, the administration is implementing multi-factor authentication (MFA) for all federal government systems. Clarke also highlighted that the administration strongly believes in implementing phishing-resistant MFA.

“We need to make it harder for threat actors to gain access into systems by requiring and ensuring that a person is who they say they are beyond the username and password,” she said. “That is why authentication is also at the heart of the work we are doing to improve the cybersecurity of critical infrastructure, upon which we all rely.”

The role of biometrics

Biometrics have a role to play in the authentication and identity landscape according to a panel of experts.

The panel included Arun Vemury, Biometrics Expert and ITRC Advisory Board Member; James Lee, COO of the Identity Theft Resource Center; Dr. Stephanie Schuckers, Director, Center for Identification Technology Research (CITeR), Clarkson University; and John Breyault VP, Public Policy, Telecom and Fraud, at National Consumers League.

Panelists generally agreed that properly implemented biometrics combined with other security practices could help devalue stolen identity data and strengthen security overall. 

“Biometrics has the potential to affect fraud numbers,” Breyault said. “It’s not a silver bullet, it’s not going to stop everyone and, it may not be useful in every context, but it is something different than what we’re doing now.”

Better Identity at 5 years

Five years ago, the Better Identity Coalition published Better Identity in America: A Blueprint for Policymakers in response to significant questions from both government and industry about the future of how the United States should address challenges in remote identity proofing and other key issues impacting identity and authentication.

Jeremy Grant, Coordinator at the Better Identity Coalition, detailed the progress made in the past five years and presented new guidance for 2024.

The report assessed that while some progress has been made in certain areas like promoting strong authentication, overall the government receives poor grades for failing to prioritize the development of modern remote identity proofing systems or establish a national digital identity strategy. 

The revised blueprint outlines 21 new recommendations and action items for policymakers to help close gaps in America’s digital identity infrastructure and get ahead of growing security and privacy challenges posed by issues like synthetic identity fraud and deep fakes.

“Our message today is the same as it was back in 2018, which is that if you take this as a package, if this policy blueprint is enacted and funded by government, it’s going to address some very critical challenges in digital identity and as the name of our coalition would suggest, make things better,” Grant said.

The year of passkeys

While there is much to lament about the state of identity and authentication, there is also cause for optimism.

Andrew Shikiar, executive director of the FIDO Alliance, detailed the progress that has been made in the past year with the rollout and adoption of passkey deployments.

“Passkeys are simpler, stronger authentication, they are a password replacement,” he said. 

Shikiar noted that there are now hundreds of companies enabling consumers to use passkeys, which is helping to dramatically improve the overall authentication landscape. Not only is a passkey more secure, he emphasized, it is also easier for organizations to use than traditional passwords and MFA approaches.

“If you’re in the business of selling things, or providing content, or anything like that you want people to get on your site as quickly as possible –  passkeys are doing that,” he said.

Shikiar noted that the FIDO Alliance understands that user authentication is just one piece of the identity value chain. To that end, the FIDO Alliance has multiple efforts beyond passkeys, including certification programs for biometrics and document authenticity, among others.

Don’t want to get breached? Use strong, phishing-resistant authentication

The primary importance of strong authentication was highlighted by Chris DeRusha, Federal Chief Information Security Officer in the Office of Management and Budget (OMB), who detailed a recent report on the Lapsus$ cybercrime group released by the Cyber Safety Review Board.

DeRusha noted that Lapsus hackers were able to beat MFA prompts using a variety of techniques, including social engineering and even just mass spamming employees with prompts to get someone to act.

A key recommendation from the report is to move away from phishable forms of MFA, such as SMS, and instead embrace FIDO-based authentication with passkeys.

The view from FinCEN

The U.S. Treasury’s Financial Crimes Enforcement Network, more commonly known by the acronym FinCEN, is a critical element of the U.S financial system.

FinCEN Director Andrea Gacki spoke at the event about the agency’s recent progress on beneficial ownership reporting and the FinCEN Identity Project. The FinCEN Identity Project refers to FinCEN’s ongoing work related to analyzing how criminals exploit identity-related processes to perpetrate financial crimes. As part of this, FinCEN published a financial trends analysis earlier this month that looked at 2021 Bank Secrecy Act data to quantify how bad actors take advantage of identity processes during account openings, access, and transactions.

“Robust customer identity processes are the foundation of a secure and trusted U.S. financial system and are fundamental to the effectiveness of every financial institution,” Gacki said.

Sean Evans, lead cyber analyst at FinCEN, noted that the recent report examined over 3.8 million suspicious activity reports filed in 2021 and found that approximately 1.6 million reports, representing $212 billion in activity, involved some form of identity exploitation. Evans explained that cybercriminals are finding ways to circumvent or exploit weaknesses in identity validation, verification, and authentication processes to conduct illicit activities like fraud.

Kay Turner, chief digital identity adviser at FinCEN, emphasized that strengthening identity verification is critical for security. 

“We have to get identity right, it is vital to building trust in the system,” Turner stated.

CISA praises the push towards passkeys

Closing out the event was a keynote from Eric Goldstein, Executive Assistant Director for Cybersecurity, Cybersecurity and Infrastructure Security Agency, (CISA), Department of Homeland Security (DHS).

Goldstein emphasized that it’s important to note that while there are challenges, there has also been progress. Passkeys are now used by consumers every day, and increasing numbers of enterprises are moving toward passwordless deployments.

“It’s worth starting out just with some reflection on how far we have come in moving towards a passwordless future,” Goldstein said. “We are seeing more and more enterprises moving to passwordless for their enterprise privileges, their admin, their employee authentication solutions and that’s a remarkable shift.”


GS1

Maintenance release 2.8


GS1 GDM SMG voted to implement the 2.8 standard into production in November 2023.

Key Milestones:

See GS1 GDM Release Schedule

As content for this release is developed, it will be posted to this webpage, followed by an announcement to the community to ensure visibility.
GDSN Data Pools should contact the GS1 GDSN Data Pool Helpdesk to understand the plan for the update. Trading Partners should work with their Data Pools (if using GDSN) and/or Member Organisations on understanding the release and any impacts to business processes.

GDM 2.8 contains updates for two work requests and includes reference material aligned with ADB 2.2 and GDSN 3.1.25.


Updated For Maintenance Release 2.8

GDM Standard 2.8 (November 2023)

Local Layers For Maintenance Release 2.8

China - GSMP RATIFIED (April 2022)

France - GSMP RATIFIED (November 2023)

Germany - GSMP RATIFIED (November 2023)

Poland - GSMP RATIFIED (November 2023)

Romania - GSMP RATIFIED (17 December 2021)

USA - GSMP RATIFIED (UPDATED February 2023)

Finland - GSMP RATIFIED (November 2023)


Release Guidance

GDM Market Stages Guideline (June 2023)

GDM Attribute Implementation Guideline (Nov 2023)

GPC Bricks To GDM (Sub-) Category Mapping (March 2024)

Attribute Definitions for Business (November 2023)

GDM (Sub-) Categories (October 2021)

GDM Regions and Countries (17 December 2021)

GDSN Release 3.1.25 (August 2023)

Tools

GDM Navigator on the Web 

GS1 GDM Attribute Analysis Tool (Nov 2023)

GDM Local Layer Submission Template (May 2023)

Training

E-Learning Course

Any questions?

We can help you get started using GS1 standards.

Contact your local office

Thursday, 01. February 2024

Ceramic Network

WalletConnect Tutorial: Create User Sessions with Web3Modal

Learn how to use WalletConnect's Web3Modal toolset to create Ceramic user sessions in this interactive technical tutorial.

WalletConnect offers Web3 developers powerful tools to make building secure, interactive, and delightful decentralized applications easier. This tooling incorporates best-in-class UX and UI with a modular approach to a suite of SDKs and APIs. For many teams looking to accelerate their development cadence without sacrificing security or quality, WalletConnect's various SDKs are an obvious choice.

One of our favorites is Web3Modal - a toolset that provides an intuitive interface for dApps to authenticate users and request actions such as signing transactions. Web3Modal supports multiple browser wallets (such as MetaMask and Trust Wallet) and offers thorough instruction in their documentation to help developers get up and running across multiple frameworks (React, Next, Vue, etc). For this tutorial, we will show how to use WalletConnect's Web3Modal for user authentication and the creation of user sessions.

Ready? Awesome! Let's get started

What Will We Build?

For this tutorial, we will build an application to track event attendance. The use case here is somewhat basic - imagine a conference that wants to keep track of which participants went to which event. They might allow participants to scan a QR code that takes them to this application where they can sign in (with their wallet), optionally opt into sharing their location, and generate a badge showing that they attended.

Here's a simple visual of the user flow:

Based on the summary above, it might be obvious where Web3Modal fits in. That's right - we will be using this SDK to authenticate users and keep track of who attended what event based on their wallet address.
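
To make that concrete, here is a minimal, hypothetical sketch of how a React component in an app like this might read the connected wallet address once Web3Modal is set up. It assumes the useWeb3Modal hook from @web3modal/wagmi/react and the useAccount hook from wagmi; the component itself is ours for illustration, not something taken from the tutorial repository:

    import { useWeb3Modal } from '@web3modal/wagmi/react'
    import { useAccount } from 'wagmi'

    // Hypothetical component: show a connect button until a wallet is connected,
    // then display the address we would associate with the attendance badge.
    export default function AttendeeStatus() {
      const { open } = useWeb3Modal()               // opens the Web3Modal connect dialog
      const { address, isConnected } = useAccount() // current wallet connection state

      if (!isConnected) {
        return <button onClick={() => open()}>Connect wallet</button>
      }
      return <p>Checked in as {address}</p>
    }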

We've made up two imaginary events to align with this use case:

Encryption Event
Wallet Event

Below is a sneak peek at our app's UI:

What's Included in Our Technical Stack?

To power this simple application, we will need a few things:

A frontend framework that runs in the attendee's browser and a backend to handle any internal API calls we'll need - we will use NextJS
Wallet tooling so we don't have to build authentication logic from scratch - Web3Modal
React hooks that work with our browser wallet so we don't have to build these either - we'll use Wagmi
Decentralized data storage - we'll use ComposeDB (graph database built on Ceramic)

Why ComposeDB?

If dealing with potentially thousands (or more) attendees to these imaginary events (as is often the case with large conferences), storing these records on-chain would be both costly and inefficient. Each record would incur gas fees, and querying the blockchain across tens of thousands of records would be arduous.

Nonetheless, we want our application to give data control to the users who attend the events. And, in our imaginary use case, other conferences must have access to this data (not just our application) so they can determine who should receive admission priority. We will therefore require some sort of decentralized data network.

In Ceramic (which is what ComposeDB is built on), user data is organized into verifiable event streams that are controlled exclusively by the user account that created each stream. Since Ceramic is a permissionless open data network, any application can easily join and access preexisting user data (which meets one of the requirements listed above).

Applications that build on Ceramic/ComposeDB authenticate users (using sign-in with Ethereum), creating tightly-scoped permission for the application to write data to the network on the user's behalf. This is important for us because our application's server will need to cryptographically sign the badge (to prove the badge was indeed generated through our application) before saving the output in Ceramic on the user's behalf.

Finally, ComposeDB adds a graph database interface on top of Ceramic, making it easy to query, filter, order, and more (using GraphQL) across high document volumes - an ideal fit for any teams who want to consume these badges and perform computation over them in an efficient manner.

We will go into more detail throughout this tutorial.

Getting Started

We have set up a special repository to guide you through this tutorial - keep in mind that you will need to add to it using the steps below for it to work.

Start by cloning the demo application repository and install your dependencies:

git clone https://github.com/ceramicstudio/walletconnect-tutorial
cd walletconnect-tutorial
npm install

Go ahead and open the directory in your code editor of choice. If you take a look at your package.json file, you'll see the @web3modal/wagmi and wagmi packages mentioned above, as well as several @ceramicnetwork and @composedb packages to meet our storage needs.

Obtain a WalletConnect Project ID

While your dependencies are downloading, you can create a WalletConnect project ID (which we'll need to configure our Web3Modal - more information in their docs). You can do so for free by visiting the WalletConnect Cloud site, creating a new project (with the "App" type selected), and giving it a name of your choosing:

After you click "Create" you will be directed to the settings page for the project you just set up. Go ahead and copy the alphanumeric value you see next to "Project ID."

Back in your text editor, navigate to your /src/pages/_app.tsx file and enter the ID you just copied into the blank field next to the projectId constant. Notice how we use this ID and a mainnet chain setting when defining our wagmiConfig (later used to create our Web3Modal). Just as the Web3Modal docs instructed, we are setting up these functions outside our React components, and wrapping all child components with our WagmiConfig wrapper:

const projectId = '<your project ID>'
const chains = [mainnet]
const wagmiConfig = defaultWagmiConfig({ chains, projectId })

createWeb3Modal({ wagmiConfig, projectId, chains })

const MyApp = ({ Component, pageProps }: AppProps) => {
  return (
    <WagmiConfig config={wagmiConfig}>
      <ComposeDB>
        <Component {...pageProps} ceramic />
      </ComposeDB>
    </WagmiConfig>
  );
}

export default MyApp

We can now make our Web3Modal button accessible to child components of our application to allow our users to sign in. If you take a look at /src/components/nav.tsx, you'll see that we placed our <w3m-button /> component directly into our navigation to allow users to sign in/out on any page of our application (at the moment our application only has 1 page).

Notice how we make use of the size and balance properties - these are two of several settings developers can use to further customize the modal's appearance. These two in particular are fairly simple to understand - one alters the size of the button, while the other hides the user's balance when the user is authenticated.
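For reference, here is a minimal sketch of how that button might sit inside a nav component; the prop values are illustrative and the actual markup in /src/components/nav.tsx may differ (see the Web3Modal docs for the full list of supported attributes):

// Simplified sketch of a nav component using the Web3Modal button web component.
// In a TypeScript project the <w3m-button /> custom element also needs a JSX
// IntrinsicElements declaration (omitted here for brevity).
export const Nav = () => {
  return (
    <nav>
      {/* "size" shrinks the button; "balance" hides the connected account's balance */}
      <w3m-button size="sm" balance="hide" />
    </nav>
  );
};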

Finally, you probably noticed in your /src/pages/_app.tsx file that we're also utilizing a <ComposeDB> context wrapper. This is what we will explain next.

Create a ComposeDB Configuration

Now that we've created our Wagmi configuration, we will need to set up our ComposeDB data storage. There are several steps involved (all of which have been taken care of for you). These include:

Designing the data models our application will need
Creating a local node/server configuration for this demo (a production setup would differ - more on this below)
Deploying our data models onto our node
Defining the logic our application will use to read from + write to our ComposeDB node

Data Models

If you take a look at your /composites folder, you'll see an /attendance.graphql file where we've already defined the models our application will use. In ComposeDB, data models are GraphQL schemas that contain the requirements for a single piece of data (a social post, for example), in addition to its relations to other models and accounts. Since Ceramic is an open data network, developers can build on preexisting data models (you can explore tools like S3 to observe existing schemas), or define brand new ones for their app.

In our case, our application will leverage a general event interface that our two event types will implement:

interface GeneralAttendance @createModel(description: "An interface to query general attendance") {
  controller: DID! @documentAccount
  recipient: String! @string(minLength: 42, maxLength: 42)
  latitude: Float
  longitude: Float
  timestamp: DateTime!
  jwt: String! @string(maxLength: 100000)
}

type EncryptionEvent implements GeneralAttendance
  @createModel(accountRelation: SINGLE, description: "An encryption event attendance") {
  controller: DID! @documentAccount
  recipient: String! @string(minLength: 42, maxLength: 42)
  latitude: Float
  longitude: Float
  timestamp: DateTime!
  jwt: String! @string(maxLength: 100000)
}

type WalletEvent implements GeneralAttendance
  @createModel(accountRelation: SINGLE, description: "A wallet event attendance") {
  controller: DID! @documentAccount
  recipient: String! @string(minLength: 42, maxLength: 42)
  latitude: Float
  longitude: Float
  timestamp: DateTime!
  jwt: String! @string(maxLength: 100000)
}

Notice how we've set the accountRelation field for both types to "SINGLE" - what this means is that 1 user can only ever have 1 model instance of that type, thus creating a 1:1 account relationship. This is contrasted with "LIST" accountRelation which would indicate a 1:many relationship.

You'll also notice that our latitude and longitude fields do not use a ! next to their scalar definition - what this means is that they are optional, so a model instance can be created with or without these fields defined.

Finally, we will use our jwt field to record the signed badge payload our server will create for the user. Since the user will ultimately be in control of their data, a potentially deceptive user could try to change the values of their model instance outside the confines of our application. Seeing as our architecture requires a way for both our application and other conferences to read and verify this data, the jwt field creates tamper-evident proof over the values by tying the cryptographic signature of our application's DID to the data.

Create a Local Server Configuration

Seeing as this is just a demo application and we don't have a cloud-hosted node endpoint to access, we will define a server configuration to run locally on our computer. While there are multiple server settings an application can leverage, the key items to know for this demo are the following:

Our app will run on the inmemory network, whereas a production application would use mainnet for its network setting
Our server will define sqlite as our SQL index, whereas a production application would use PostgreSQL
Our IPFS will run in bundled mode (ideal for early prototyping), whereas a production application would run it in remote mode

Finally, each Ceramic node is configured with an admin DID used to authenticate with the node and perform tasks like deploying models. This is different from the DIDs end users will use when authenticating themselves using their wallet and writing data to the network.

Fortunately for you, we've taken care of this for you by creating a command. Simply run the following in your terminal once your dependencies are installed:

npm run generate

If you take a look at your admin_seed.txt file you will see the admin seed your Ceramic node will use. Your composedb.config.json file is where you'll find the server configuration you just created.

Deploying the Models onto Our Node

Seeing as we're not using a preexisting node endpoint that's already set up to index the data models we care about, we'll need a way to deploy our definitions onto our node. If you look at /scripts/composites.mjs you'll find a writeComposite method we've created for you that reads from our GraphQL file, creates an encoded runtime definition and deploys the composite onto our local node running on port 7007.

The important thing to take note of here is how the writeEncodedCompositeRuntime method generates a definition in our definition.js file. We will explain in the next step how this is used by our client-side library to allow our application to interact with these data models and our Ceramic node.
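To make that flow more concrete, below is a rough sketch of what such a script can look like, assuming the @composedb/devtools-node helpers and an admin DID derived from admin_seed.txt; the file names, paths, and exact calls in the repo's /scripts/composites.mjs may differ:

// Illustrative sketch of a ComposeDB deployment script (not the exact repo contents)
import { readFileSync } from "fs";
import { CeramicClient } from "@ceramicnetwork/http-client";
import {
  createComposite,
  writeEncodedComposite,
  writeEncodedCompositeRuntime,
} from "@composedb/devtools-node";
import { DID } from "dids";
import { Ed25519Provider } from "key-did-provider-ed25519";
import { getResolver } from "key-did-resolver";
import { fromString } from "uint8arrays";

const ceramic = new CeramicClient("http://localhost:7007");

export const writeComposite = async () => {
  // Authenticate the node's admin DID from the seed generated by `npm run generate`
  const seed = fromString(readFileSync("admin_seed.txt", "utf8").trim(), "base16");
  const did = new DID({ provider: new Ed25519Provider(seed), resolver: getResolver() });
  await did.authenticate();
  ceramic.did = did;

  // Turn the GraphQL models into a composite and deploy it to the local node
  const composite = await createComposite(ceramic, "./composites/attendance.graphql");
  await writeEncodedComposite(composite, "./src/__generated__/definition.json");

  // Start indexing the models and write the runtime definition used by the client
  await composite.startIndexingOn(ceramic);
  await writeEncodedCompositeRuntime(
    ceramic,
    "./src/__generated__/definition.json",
    "./src/__generated__/definition.js",
  );
};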

Don't take any action yet - we will explain how to use this script in the coming steps.

Integrating ComposeDB with Our Application

Finally, as mentioned above, we will need a way for our application to read from and write to our ComposeDB node. We will also need a way to combine our Web3Modal authentication logic with the need to authenticate users onto our node.

If you take a look at /src/fragments/index.tsx you'll find a ComposeDB component that allows us to utilize React's createContext hook and create a wrapper of our own. Since we know Web3Modal will make use of our wallet client, we can leverage the wallet client to request a Ceramic user session authentication from our user.

Observe the following:

const CERAMIC_URL = process.env.URL ?? "http://localhost:7007";

/**
 * Configure ceramic Client & create context.
 */
const ceramic = new CeramicClient(CERAMIC_URL);

const compose = new ComposeClient({
  ceramic,
  definition: definition as RuntimeCompositeDefinition,
});

let isAuthenticated = false;

const Context = createContext({ compose, isAuthenticated });

export const ComposeDB = ({ children }: ComposeDBProps) => {
  function StartAuth() {
    const { data: walletClient } = useWalletClient();
    const [isAuth, setAuth] = useState(false);

    useEffect(() => {
      async function authenticate(
        walletClient: GetWalletClientResult | undefined,
      ) {
        if (walletClient) {
          const accountId = await getAccountId(
            walletClient,
            walletClient.account.address,
          );
          const authMethod = await EthereumWebAuth.getAuthMethod(
            walletClient,
            accountId,
          );
          const session = await DIDSession.get(accountId, authMethod, {
            resources: compose.resources,
          });
          await ceramic.setDID(session.did as unknown as DID);
          console.log("Auth'd:", session.did.parent);
          localStorage.setItem("did", session.did.parent);
          setAuth(true);
        }
      }
      void authenticate(walletClient);
    }, [walletClient]);
    return isAuth;
  }

  if (!isAuthenticated) {
    isAuthenticated = StartAuth();
  }

  return (
    <Context.Provider value={{ compose, isAuthenticated }}>
      {children}
    </Context.Provider>
  );
};

Notice how we're using the wallet client's account address to initiate a DID session that asks for specific resources from compose. If you track deeper, you'll see that compose was instantiated using the definition imported from the file our deployment script wrote into. This allows us to access a limited scope to write data on the user's behalf specifically for the data models our application uses (these sessions auto-expire after 24 hours).

Finally, to bring this full circle, back to our /src/pages/_app.tsx file, you should now understand how we're able to use ComposeDB as a contextual wrapper, enabling us to access both the ComposeDB client libraries and our model definitions from within any child component. For example, if you take a look at /src/components/index.tsx you'll see how we're now able to utilize our useComposeDB hook that allows us to run queries against our node's client.
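As a quick illustration, consuming that context from a child component could look roughly like this; the hook name mirrors the one used in this tutorial, but the component and query below are illustrative rather than copied from the repo:

import { useContext } from "react";

// The hook simply exposes the context created in /src/fragments/index.tsx
export const useComposeDB = () => useContext(Context);

// Example usage inside a child component
const WhoAmI = () => {
  const { compose, isAuthenticated } = useComposeDB();

  const logViewer = async () => {
    if (!isAuthenticated) return;
    // executeQuery accepts any GraphQL operation against the deployed models
    const result = await compose.executeQuery(`query { viewer { id } }`);
    console.log(result.data);
  };

  return <button onClick={logViewer}>Who am I?</button>;
};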

Create a Seed for our Application's Server DID

We mentioned above that we'll want our application to sign each badge payload before handing document control back to the end user. While this flow will not always be the case (read this blog on common data control patterns in Ceramic for more), we'll want to implement this to ensure the verifiability of the data.

In /src/pages/api/create.ts we've created an API route our application's server will expose that does exactly this - it takes in the data relevant to the event, uses a SECRET_KEY environment variable to instantiate a static DID, and returns a Base64 string-encoded JSON web signature containing the signed data.

We will therefore need to create a separate static seed to store in a .env file that we'll create:

touch .env

For this tutorial, enter the following key-value pair into your new file:

SECRET_KEY="11b574d316903ced6cc3f4787bbcc3047d9c72d1da4d83e36fe714ef785d10c1"

When you use the above seed to instantiate a DID, this will yield the following predictable did:

did:key:z6MkqusKQfvJm7CPiSRkPsGkdrVhTy8EVcQ65uB5H2wWzMMQ
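To make the signing step concrete, here is a hedged sketch of how a static did:key can be derived from that seed and used to sign a badge payload; the actual /src/pages/api/create.ts may structure this differently, and the payload fields below are illustrative:

import { DID } from "dids";
import { Ed25519Provider } from "key-did-provider-ed25519";
import { getResolver } from "key-did-resolver";
import { fromString, toString } from "uint8arrays";

// Derive the application's static DID from the hex seed stored in .env
const getAppDid = async () => {
  const seed = fromString(process.env.SECRET_KEY!, "base16");
  const did = new DID({ provider: new Ed25519Provider(seed), resolver: getResolver() });
  await did.authenticate(); // resolves to the did:key shown above
  return did;
};

// Sign an illustrative badge payload and return it as a Base64-encoded string
export const signBadge = async (recipient: string, event: string) => {
  const did = await getAppDid();
  const jws = await did.createJWS({
    recipient,
    event,
    timestamp: new Date().toISOString(),
  });
  return toString(fromString(JSON.stringify(jws), "utf8"), "base64");
};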

If you look back into /src/components/index.tsx you'll see how our lengthy getParams method performs a quick check against any existing EncryptionEvent or WalletEvent badges the user already holds to test whether the jwt value was indeed signed by our application (a more thorough version of this could include verifying that the signed data matches the values of the other fields, but we'll leave that up to you to add).
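A hedged sketch of what such a signature check could look like is below; the repo's actual getParams logic may differ, and the decode/compare steps here are illustrative:

import { DID } from "dids";
import { getResolver } from "key-did-resolver";
import { fromString, toString } from "uint8arrays";

// The did:key our server derives from SECRET_KEY (shown earlier)
const APP_DID = "did:key:z6MkqusKQfvJm7CPiSRkPsGkdrVhTy8EVcQ65uB5H2wWzMMQ";

export const badgeSignedByOurApp = async (encodedJwt: string) => {
  // Reverse the Base64 encoding performed by the /api/create route
  const jws = JSON.parse(toString(fromString(encodedJwt, "base64"), "utf8"));

  // Any DID instance with a resolver can verify a JWS and report the signing key
  const verifier = new DID({ resolver: getResolver() });
  const result = await verifier.verifyJWS(jws);

  // The returned kid is a verification method on the signer's DID
  return result.kid.startsWith(APP_DID);
};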

That's it! We are finally ready to run our application!

Running the Application

Now that we've set up everything we need for our app to run locally, we can start it up in developer mode. Be sure to select the correct node version first:

nvm use 20
npm run dev

Once you see the following in your terminal, your application is ready to view in your browser:

In your browser, navigate to http://localhost:3000 - you should see the following:

Signing in with Web3Modal

As mentioned above, we've made our Web3Modal accessible from our navigation which is where our "Connect Wallet" button is coming from. Go ahead and give this button a click and select your wallet of choice.

During the sign-in cadence, you will notice an additional authorization message appear in your wallet that looks something like this:

If you recall what we covered in the "Integrating ComposeDB with Our Application" section above, you'll remember that we discussed how we created a DIDSession by requesting authorization over the specific resources (data models) our application will be using. These are the 3 items listed under the "Resources" section of the sign-in request you should see.

Finally, after you've signed in, your Web3Modal will now show a truncated version of your address:

Creating Badges

As you can see, our application does not allow the user to input which event they have attended - this will be determined based on the URL the QR code sends the user to, which has the following format:

http://localhost:3000/?event={event id}

Take a look at your browser console - you should see logs that look similar to this:

We've preset these logs for you by reading from our runtime composite definition that we've imported into the /src/components/index.tsx component. Go ahead and copy one of those fields and construct your URL to look something like this:

http://localhost:3000/?event=kjzl6hvfrbw6c8njv24a3g4e3w2jsm5dojwpayf4pobuasbpvskv21vwztal9l2

If you've copied the stream ID corresponding to the EncryptionEvent model, your UI should now look something like this:

You can optionally select to share your coordinates. Finally, go ahead and create a badge for whichever event you entered into your URL:

If you navigate back to your /src/components/index.tsx file you can observe what's happening in createBadge. After calling our /api/create route (which uses our application server's static DID to sign the event data), we're performing a mutation query that creates an instance of whichever event aligns with the identifier you used in your URL parameter. Since our user is the account currently authenticated on our node (from the creation of our DID session), the resulting document is placed into the control of the end user (with our tamper-evident signed data entered into the jwt field).

If you take a look at our getParams method in our /src/components/index.tsx file, you'll notice that we've created a query against our ComposeDB node that runs both within our useEffect React hook as well as after every badge creation event. Notice how we're querying based on the user's did:pkh: did:pkh:eip155:${chainId}:${address?.toLowerCase()}
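For illustration, a getParams-style lookup along those lines might be shaped roughly like this; the account field names are assumptions derived from the model names, so check the generated runtime definition for the exact fields your composite exposes:

// Hedged sketch of querying badges held by a user's did:pkh account
const fetchBadges = async (chainId: number, address: string) => {
  const did = `did:pkh:eip155:${chainId}:${address.toLowerCase()}`;
  return await compose.executeQuery(`
    query {
      node(id: "${did}") {
        ... on CeramicAccount {
          encryptionEvent {
            recipient
            timestamp
            jwt
          }
          walletEvent {
            recipient
            timestamp
            jwt
          }
        }
      }
    }
  `);
};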

If you take a look at our chainId and address assignments, you'll realize these are coming from our Wagmi hooks we mentioned we'd need (specifically useAccount and useChainId).

What's Next?

We hope you've enjoyed this fairly straightforward walk-through of how to use WalletConnect's Web3Modal toolkit for authenticating users, creating user sessions in Ceramic, and querying ComposeDB based on the authenticated user! While that's all for this tutorial, we encourage you to explore the other possibilities and journeys Ceramic has to offer. Below are a few we'd recommend:

Test Queries on a Live Node in the ComposeDB Sandbox

Build an AI Chatbot on ComposeDB

Create EAS Attestations + Ceramic Storage

Finally, we'd love for you to join our community:

Join the Ceramic Discord

Follow Ceramic on Twitter


MyData

How to build a human-centric data space: introducing the Blueprint for the European Data Space for Skills & Employment

The DS4Skills project, funded by the European Commission under the Digital Europe Programme and led by DIGITALEUROPE, has launched its Blueprint for a European Data Space for Skills & Employment (https://skillsdataspace-blueprint.eu/), a comprehensive guide for creating and managing data spaces that collect, store, and share skills & employment data.  A human-centric Data Space for Skills […]

Identity At The Center - Podcast

It’s time for another Sponsor Spotlight episode of The Identity at the Center Podcast!

In these fully sponsored episodes, we take a deeper dive into solutions in the digital identity space. It's a slight departure from our usual vendor-neutral format, allowing us to explore the viewpoints and innovations straight from the source. These bonus shows drop from time to time mid-week, in addition to our normal weekly conversations that release on Mondays.

In this Sponsor Spotlight, we had a great conversation with Alex Bovee, CEO of ConductorOne, about modern Identity Governance & Administration (IGA) and how ConductorOne approaches this competitive space. We also talked about some exciting new capabilities leveraging AI that ConductorOne is launching today. Our conversation can be heard in episode #257 which is available now at idacpodcast.com and in your podcast app.

Tune in to get a better understanding of how ConductorOne helps organizations secure their workforce through modern access controls and identity governance.

#iam #podcast #idac #sponsorspotlight #conductorone


DIDAS

Embracing Standardization in Employee Verification

In today’s fast-moving business world, it’s important for different systems to work well together, and having standard data formats helps a lot with this. This article looks at how the EmployeeID, a key part of Self-Sovereign Identity (SSI), is used. The adoption group from DIDAS has worked on making sure this EmployeeID follows a set standard. Thanks to their work, we can now see how the EmployeeID can be used in real situations and think about how it can be used more in the future.  

Demonstrating Real-World Applications 

The practicality of the EmployeeID VC was tested in real-world scenarios by the organizations SBB, AXA, Orell Füssli, and Swisscom. Together they showed the power of an interoperable ecosystem based on the “EmployeeID” credential. These demos showcased varied applications: 

SBB: Used the VC for external partners to access internal IT systems, replacing traditional username/password methods.
AXA: Employed the VC to verify employment status for online insurance offerings.
Orell Füssli & Swisscom: Utilized the VC to offer employee discounts in online shopping.

More information here: Successfully testing the potential of digital evidence from the private sector  

Collaboration: The Key to Adoption 

One of the primary goals was to ensure that the EmployeeID could seamlessly integrate into various ecosystems. This required a standardized approach that would allow different systems and organizations to interact without compatibility issues.  

This process was not an isolated endeavor. It involved a collaborative effort from experts and stakeholders across different industries. By pooling their knowledge and insights, the group could identify and agree on the most relevant and sustainable attributes. This collaborative approach was instrumental in developing a schema that was not only effective for the present but also robust enough to stand the test of time and technological evolution. 

The outcome of these pre-implementation discussions was a well-thought-out, standardized EmployeeID schema. This schema serves as a testament to the importance of foresight and collaboration in the digital age. By addressing the need for standardization and futureproofing at the initial stages, the EmployeeID schema was positioned to be an enduring and versatile tool, capable of adapting to the ever-changing business and technological landscapes. 

The EmployeeID Schema 

The EmployeeID schema incorporates various attributes, such as: 

Employee Core: A blend of personal and employer data.
Employment Contract: Key contractual information like contract type and working hours.
Role: The employee’s role and organizational unit.
Office Address: The physical office location.
Authorization: Specific access rights issued as separate VCs.

While many attributes are initially represented as free text in the logical schema, they should ideally be derived from predefined lists or tables for specific implementations. This becomes especially critical when exchanging information with other organizations, necessitating mutually agreed-upon value sets. 

Detailed Schema: Detailed Schema of EmployeeID  

Conclusion 

For organizations exploring use cases with SSI or sandbox environments, it’s crucial to engage with the DIDAS adoption group. This collaboration will help define a standard for your verifiable credentials, ensuring a unified approach and compatibility across different ecosystems. 

In conclusion, the adoption of a standardized EmployeeID schema, as illustrated by DIDAS adoption group, is not just a technical necessity but a strategic move towards interoperability and efficiency in the digital age.  


Velocity Network

Neeyamo trailblazes the use of the Internet of Careers® in India

Neeyamo trailblazes the use of the blockchain-based Internet of Careers® in India to accelerate background screening.

Wednesday, 31. January 2024

DIF Blog

Veramo User Group

DIF is thrilled to announce the donation of the Veramo project and the formation of the Veramo User Group.

Veramo is a broadly-used SDK that is a popular choice for SSI implementation, providing core functionality such as

Decentralized Identifier (DID) and Verifiable Credential (VC) management
cryptographic key management
secure peer-to-peer communications through DIDComm

Veramo is a powerful toolkit and base layer that provides the scaffolding to build applications leveraging decentralized identity. Its flexible plugin model and extensible and accessible APIs make it easy to integrate into, and update your stack as the technology evolves, helping you keep up with new protocols, standards and implementations.

“The donation of the Veramo codebase, as well as the formation of the User Group, are significant milestones for Veramo and for DIF. This generous donation will allow the DIF community to contribute to the future of Veramo and provide valued governance” said DIF’s Executive Director, Kim Hamilton Duffy.

Housing Veramo within DIF allows the project to harness unparalleled expertise in decentralized identity, promoting collaboration and driving innovation, while enabling Veramo to solidify its role as a productivity accelerator and reference to ensure interoperability. 

“By providing the DIF’s platform and contributions of the leading decentralized identity experts, while remaining open to all, regardless of DIF membership, Veramo’s new home at DIF will ensure it continues to evolve into the leading decentralized identity toolkit, enabling builders of future identity solutions to move faster,” Kim added. 

"We were proud to donate Veramo to DIF and reaffirm our commitment to public goods development in this space. Now, with the formation of the User Group, we're excited to work even more closely with the DIF community and ensure we can all build a framework that meets everyone's needs" said Head of Identity at Consensys Mesh R&D, Nick Reynolds.

How you can get involved

1. Join the Veramo User Group. The first meeting takes place at 15.00 CET on Thursday 15th January. Meetings will take place weekly, alternating between Noon EST / 18.00 CET and 09.00 EST / 15.00 CET. Click here for more details
2. Use the code: https://github.com/decentralized-identity/veramo
3. Contribute to the code

More details on #3:

If you are looking to become more active in decentralized identity, or open source generally, Veramo is a great way to get started and join this vibrant community. We are looking for help requiring a wide range of expertise:

Build and/or integrate support for additional standards and protocols, such as Presentation Exchange, SD-JWT, and Aries. In some cases, the implementations exist and just need to be integrated into the Veramo framework.
Support for new cryptographic key types
Process/build improvements: helping with formatting and linting commit hooks, developing a test harness for improved test automation

Finally, consider joining DIF. The Decentralized Identity Foundation opens the door to shaping the future of identity standards and SDKs, offering unmatched opportunities to influence and drive the evolution of digital identity at a global scale. Membership not only places you at the forefront of cutting-edge developments but also embeds you within a community of innovators and thought leaders dedicated to redefining the landscape of digital identity. Find out more here.


DIF Newsletter #37

January 2024

DIF Website | DIF Mailing Lists | Meeting Recording Archive

Table of contents

1. Decentralized Identity Foundation News
2. Working Group Updates
3. Open Groups
4. Announcements at DIF
5. Community Events
6. DIF Members
7. Get involved! Join DIF

🚀 Decentralized Identity Foundation News

Veramo User Group

DIF is thrilled to announce the donation of the Veramo project and the formation of the Veramo User Group.

Veramo is a broadly-used SDK that is a popular choice for SSI implementation, providing core functionality such as

Decentralized Identifier (DID) and Verifiable Credential (VC) management
cryptographic key management
secure peer-to-peer communications through DIDComm

Veramo is a powerful toolkit and base layer that provides the scaffolding to build applications leveraging decentralized identity. Its flexible plugin model and extensible and accessible APIs make it easy to integrate into, and update your stack as the technology evolves, helping you keep up with new protocols, standards and implementations.

“The donation of the Veramo codebase, as well as the formation of the User Group, are significant milestones for Veramo and for DIF. This generous donation will allow the DIF community to contribute to the future of Veramo and provide valued governance” said DIF’s Executive Director, Kim Hamilton Duffy.

Housing Veramo within DIF allows the project to harness unparalleled expertise in decentralized identity, promoting collaboration and driving innovation, while enabling Veramo to solidify its role as a productivity accelerator and reference to ensure interoperability. 

“By providing the DIF’s platform and contributions of the leading decentralized identity experts, while remaining open to all, regardless of DIF membership, Veramo’s new home at DIF will ensure it continues to evolve into the leading decentralized identity toolkit, enabling builders of future identity solutions to move faster,” Kim added. 

"We were proud to donate Veramo to DIF and reaffirm our commitment to public goods development in this space. Now, with the formation of the User Group, we're excited to work even more closely with the DIF community and ensure we can all build a framework that meets everyone's needs" said Head of Identity at Consensys Mesh R&D, Nick Reynolds.

How you can get involved

1. Join the Veramo User Group. The first meeting takes place at 15.00 CET on Thursday 15th January. Meetings will take place weekly, alternating between Noon EST / 18.00 CET and 09.00 EST / 15.00 CET. Click here for more details
2. Use the code: https://github.com/decentralized-identity/veramo
3. Contribute to the code

More details on #3:

If you are looking to become more active in decentralized identity, or open source generally, Veramo is a great way to get started and join this vibrant community. We are looking for help requiring a wide range of expertise:

Build and/or integrate support for additional standards and protocols, such as Presentation Exchange, SD-JWT, and Aries. In some cases, the implementations exist and just need to be integrated into the Veramo framework.
Support for new cryptographic key types
Process/build improvements: helping with formatting and linting commit hooks, developing a test harness for improved test automation

First DIF-sponsored Hackathon inspires participants to discover the power of decentralized identity.

The first ever DIF-sponsored hackathon wrapped up with a Meet The Winners Twitter Space earlier this month which highlighted how the event has boosted skills development, community engagement and participation in DIF. 

422 developers registered and 52 projects were submitted, surpassing expectations. Here's a quick overview of the winners of the DIF Main Prize Pool:

First Prize: Decentralinked
Second Prize: Anonymous Door Unlocking (Meet The Winners blog here)
Third Prize: HealthX Protocol (Meet The Winners blog here)
Honorable Mention: TrustBox  (Meet The Winners blog here)
Honorable Mention: Mail5

Watch out for more Meet The Winners blog posts, and for future events! As DIF hackathons grow, we’ll need more volunteer developer advocates and online event organizers. If you’re interested in volunteering at DIF please reach out to us at membership@identity.foundation.

More on the DIF Hackathon here.

SIDI Hub 2024 roadmap unveiled

The Sustainable & Interoperable Digital Identity (SIDI) Hub's 2024 work plan was announced earlier today.

Dozens of digital identity schemes have been launched or are underway around the world. Yet to date, there is no known scheme considered truly interoperable across borders. SIDI Hub was conceived as a community to accelerate the path to cross-border interoperability.

SIDI Hub held its first summit at TRUSTECH 2023. DIF was one of the event organizers, alongside our liaison partners including the Open ID Foundation, Open Identity Exchange, FIDO and Open Wallet Foundation.

120+ digital identity experts from governments, multilaterals, standards organizations, and non-profits representing 22 countries attended. Over 90% of participants agreed that the work started at the summit must continue in 2024. In response, the SIDI Hub community has defined the following work plan:

Identifying champion use cases for cross-border interoperability that serve as baseline for all workstreams
Defining minimum interoperability requirements for priority use cases
Mapping trust frameworks across jurisdictions
Defining metrics of success

The group is organizing a series of virtual and in-person meetings this year to progress the roadmap, and invites all organizations involved in the development, adoption and implementation of digital identity solutions to add their voice to this important work. Join the community on LinkedIn, visit the website and sign up for the SIDI Hub newsletter to learn more and stay up to date on the latest news and events.

🛠️ Working Group Updates

💡Identifiers and Discovery Work Group

The Identifiers and Discovery Work Group hosted a presentation and discussion of the did:dht method, as well as a discussion of DID Rotation, and integrating support for it in the Universal Resolver and Universal Registrar.

The work item "Linked Verifiable Presentations" is progressing with dedicated monthly calls. See https://identity.foundation/linked-vp/

The Work Group seeks input on recent activity for this work item. Please visit our GitHub page and review the issues and PRs!
https://github.com/decentralized-identity/linked-vp

Recordings and notes can be found here: https://github.com/decentralized-identity/identifiers-discovery/blob/main/agenda.md

Identifiers and Discovery Work Group meets bi-weekly at 11am PT / 2pm ET / 8pm CET Mondays

🔐 Applied Cryptography WG

The BBS Signature Scheme continues on its path towards becoming an official web standard after Draft 05 of the specification was published by the Internet Engineering Task Force (IETF) last month.

The BBS Signature Scheme is a secure, multi-message digital signature protocol that supports proving knowledge of a signature while selectively disclosing any subset of the signed messages. Being zero-knowledge, the BBS proofs do not reveal any information about the undisclosed messages or the signature itself, while at the same time guaranteeing the authenticity and integrity of the disclosed messages.

The latest update to the specification follows ongoing work on the spec by DIF's Applied Cryptography Work Group.

Draft 05 of the specification can be viewed here.

The DIF Crypto - BBS work item meets weekly at 11am PT / 2pm ET / 8pm CET Mondays

📦 Secure Data Storage

Decentralized Web Node (DWN) Task Force
The DWN beta release is out! Check out the reference implementation and the Web SDK, which provides easy-to-use methods for client apps, here

The DWN Task Force still needs help with updating the draft specification to match the reference implementation, and warmly invites all DIF members to join us and listen in on progress.

TBD, a division of Block, has also been working on open-source tooling and onboarding to DWNs, in addition to the core sample implementation. Also a reminder that the DWN companion guide is available.

DIF/CCG Secure Data Storage WG - DWN Task Force meets bi-weekly at 9am PT/12pm ET/6pm CET Wednesdays

Claims & Credentials Working Group

Work on Trust Establishment continues, led by Sam Curren. The calls take place weekly on Monday at 10am PT.

If you are interested in participating in any of the Working Groups highlighted above, or any of DIF's other Working Groups, please click here.

📖 Open Groups at DIF

Korea Special Interest Group

Since the Blockchain Grand Week event in November 2023, awareness of DIF has increased considerably in Korea, reports DIF Korea SIG chairman Kyoungchul Park.

"Since the Financial Services Commission has established association standards for the financial sector, we will look at those standards together, including DIF, W3C, ITU-T and ISO.

"We are also interested in exploring Digital Product Passports, and how they can be applied to high-value products to highlight their unique value and resolve consumer concerns about counterfeit goods," Kyoungchul added.

The SIG has selected discussion on standards related to e-wallets as its main focus for 2024, and will change the current bimonthly meeting to a monthly meeting to facilitate this.

Planned activities include IITP / NIPA consultation and promotion under the Ministry of Science and ICT, KISA seminars under the Ministry of Public Administration and Security and participation in the Information Protection Society and Association's seminar and promotion schedule.

The new meeting time and joining details for the Korea SIG will be announced soon.

DIF Interoperability Group

The Interoperability Open Group is focusing its Q1 2024 efforts towards building an interoperability map comprised of methods/standards/SDKs relevant to interoperability (vendor agnostic). The map aims to serve all DIF working groups and members currently working on solving interoperability challenges. The Interop group's chairs are meeting with other DIF working group chairs and attending DIF meetings this quarter to learn what interoperability challenges the groups are facing. These will be the initial focus areas represented on the interoperability map.

The Interoperability Group meets bi-weekly at 8am PT/11am ET/5pm CET Wednesdays

📡 DIDComm User Group

A reminder that the DIDComm Demo is available. This developer-focused tool allows you to connect to another person, a computer, a phone or simply another window in a different browser tab so you can see DIDComm messages traverse back and forth after the messages have been decrypted. 

The app was developed at Indicio, with the goal of allowing people to see how DIDComm works without needing to sift through or learn a substantial stack like Hyperledger Aries Cloud Agent Python (ACA-Py).

Why not try it out for yourself? You’ll see a Help button there with a tutorial, plus a link to the GitHub repo. 

The DIDComm user group meets weekly at 12pm PT/3pm ET/ 9pm CET Mondays

🌏APAC / ASEAN Discussion Group

Group participant Finema has developed a comprehensive overview of GLEIF’s vLEI Ecosystem Governance Framework.

The overview, published in an article on Medium, provides a visual guide for different types of stakeholders participating in the ecosystem, the collective identifiers and key management for vLEI, and the 6 variations of vLEI Credentials.

Co-author Yanisa Sunanchaiyakarn will present the overview to the Discussion Group on the February / March call.

We invite everyone in the APAC region to join our monthly calls and contribute to the discussion. You will be able to find the minutes of the latest meeting here.

The DIF APAC call takes place monthly on the 4th Thursday of the month. Please see the DIF calendar for updated timing.

📢 Announcements at DIF

DIF China SIG: pre-launch event
The DIF China Special Interest Group (SIG) is holding a pre-launch event at 9:30am Beijing time on Friday 2 February / 8.30pm EST on Thursday 1 Feb.

The SIG will be addressed by DIF's Executive Director, Kim Hamilton-Duffy and SIG chairman Xie Jiagui, Chief Engineer at China's Institute for Industrial Internet & Internet of Things, who will introduce the SIG and its future plans, followed by short presentations by SIG members, who will share their work and their thoughts about decentralized identity.

Participants will have an opportunity to ask Kim & Mr Xie questions about DIF, the China SIG and related topics.

Please note: the pre-launch event is taking place on Tencent, due to issues accessing Zoom within China. To join the event, download VooV, log in using your Google account, and join the meeting using the following credentials:

#Tencent Meeting Number:540-592-920

#Tencent Meeting Password:227973

Meeting URL if needed: https://meeting.tencent.com/dm/6LSxx0fkDvlc

Here's a screenshot of the login screen:

You can read more about the SIG's goals and related projects here, or follow the SIG's Chinese WeChat channel here.

MOSIP Connect 2024

DIF Steering Committee member Yodahe Zemichael, who serves as Executive Director of National ID Ethiopia, will represent DIF at MOSIP Connect in Addis Ababa in March.

MOSIP is a university-incubated not-for-profit that offers countries modular and open-source technology to build and own their national identity systems. The project was established in 2018 at the International Institute for Information Technology in Bangalore. Today, over 100 million citizens in 11 countries are registered on MOSIP-based systems.

The inaugural MOSIP Connect promises to bring together technologists, policy makers and civil society groups for in-depth discussions and collaboration on inclusive digital identity systems. Participants will learn how digital identity empowers citizens and promotes inclusion, explore how MOSIP addresses concerns around data protection, privacy, and consent, and benefit from networking opportunities.

🗓️ ️Community Events

Digital Switzerland

DIF’s Executive Director, Kim Hamilton Duffy, participated in a panel discussion on the importance of digital identity wallets alongside Beat Jans, the new Head of Switzerland’s Federal Ministry of Justice, who is responsible for the country’s digital identity policy; Stephan Wolf, the outgoing CEO of GLEIF; and Daniel Goldscheider, founder of DIF's liaison partner, the Open Wallet Foundation.

The panel was convened after Mr Jans decided, at short notice, to visit the host event, Digital Switzerland, which took place recently in Davos.

A Swiss E-ID was proposed to voters but rejected in 2021 due to fears that citizens could be tracked during verification, and that private companies would collect and store their data in centralized databases. Subsequent revisions focused on addressing these issues, with Self Sovereign Identity (SSI) principles taking center stage. 

During the panel discussion, Kim pointed out that much of the unnecessary personal data sharing that typically accompanies customer onboarding is due to the lack of a strong digital identity. Conversely, digital identity based on open standards and SSI principles avoids data oversharing by design. It also offers powerful new economic opportunities, by providing the means to share a wide range of identity data in a privacy-preserving way. 

Kim highlighted the opportunity for “reusable identity”, tying it to improved onboarding enabled by decentralized identity standards, and described how Verifiable Credentials (VCs) provide an ‘envelope’ that can store messages ranging from government-backed Identity data to competencies or skills certifications. 

“Once relying parties trust that the envelope securely wraps and conveys one type of message, it opens the door for others, which offer some of the most exciting, potentially transformative signifiers of trust," she said after the event.

“Risks of AI were also an urgent topic of discussion. Indeed the waves of disinformation are getting bigger and coming faster as it becomes easier and cheaper to exploit the possibilities of Large Language Models (LLMs) and deepfakes. Decentralized identity architectures enable harnessing the benefits of AI but with a stronger, more trustworthy foundation. The standards were designed to apply to Non Person Entities (NPEs) as well as natural persons. So for example, Decentralized Identifiers (DIDs) and Verifiable Credentials (VC) can establish a chain of trust demonstrating an AI agent is acting on behalf of a natural person.”

DIF Hackathon Winners share their insights and experiences

DIF caught up with Ken Watanabe, whose team's submission for the DIF Hackathon scooped first place in the TBD (Block) and Trinsic sponsor challenges, as well as winning second place in the main DIF prize pool.

You can read about the team's innovative use of Decentralized Identifiers (DIDs), Verifiable Credentials (VCs) and Decentralized Web Nodes (DWNs) to enable a real-world use case within their university premises here.

We also spoke to Harsh Tyagi, whose team developed HealthX Protocol, winning third place in the DIF Hackathon main prize pool, as well as third place in the TBD (Block) sponsored challenge.

Read the interview with Harsh here.

Finally, Edward Curran told us about his path to the Hackathon, how he found developing TrustBox using Veramo and his experience of participating in the DIF community. Check it out here.

🗣️ DIF Member Announcements

Multiparty Computation (MPC) is an exciting branch of cryptography with an important role to play in the future of cybersecurity.

DIF caught up with Jay Prakash of DIF member Silence Laboratories, who explained how customers are using their MPC protocol to improve wallet security and enable privacy-preserving data collaboration.

Check out Jay's guest blog here.

New Member Orientation

Our Senior Director of Community Engagement, Limari Navarrete, led a New Member Orientation today.

Subscribe to DIF’s eventbrite for upcoming notifications on future orientations and events, here.

🆔 Join DIF!

If you would like to get in touch with us or become a member of the DIF community, please visit our website.

Can't get enough of DIF?
| Follow us on Twitter
| Join us on GitHub
| subscribe on YouTube
| read our DIF blog
| read the archives



Oasis Open Projects

Invitation to comment on Data Model for Lexicography v1.0

Data Model for Lexicography v1.0 describes and defines standard serialization independent interchange objects as part of an open standards based framework for internationally interoperable lexicographic work.

Second public review ends February 29th

OASIS and the OASIS Lexicographic Infrastructure Data Model and API (LEXIDMA) TC are pleased to announce that Data Model for Lexicography Version 1.0 is now available for public review and comment. This 30-day review is the second public review for this specification.

About the specification draft:

The LEXIDMA TC’s high-level purpose is to create an open standards-based framework for internationally interoperable lexicographic work. Data Model for Lexicography v1.0 describes and defines standard serialization-independent interchange objects based predominantly on the state of the art in the lexicographic industry. The TC aims to develop the lexicographic infrastructure as part of a broader ecosystem of standards employed in Natural Language Processing (NLP), language services, and the Semantic Web.

This document defines the first version of a data model in support of these technical goals, including:
– A serialization-independent Data Model for Lexicography (DMLex)
– An XML serialization of DMLex
– A JSON serialization of DMLex
– A relational database serialization of DMLex
– An RDF serialization of DMLex
– An informative NVH serialization of DMLex

The documents and related files are available here:

Data Model for Lexicography (DMLex) Version 1.0
Committee Specification Draft 02
12 January 2024

PDF (Authoritative):
https://docs.oasis-open.org/lexidma/dmlex/v1.0/csd02/dmlex-v1.0-csd02.pdf
HTML:
https://docs.oasis-open.org/lexidma/dmlex/v1.0/csd02/dmlex-v1.0-csd02.html
PDF marked with changes since previous public review:
https://docs.oasis-open.org/lexidma/dmlex/v1.0/csd02/dmlex-v1.0-csd02-DIFF.pdf

For your convenience, OASIS provides a complete package of the specification document and any related files in ZIP distribution files. You can download the ZIP file at:
https://docs.oasis-open.org/lexidma/dmlex/v1.0/csd02/dmlex-v1.0-csd02.zip

How to Provide Feedback

OASIS and the LEXIDMA TC value your feedback. We solicit input from developers, users and others, whether OASIS members or not, for the sake of improving the interoperability and quality of our technical work.

The public review starts 31 January 2024 at 00:00 UTC and ends 29 February 2024 at 23:59 UTC.

Comments may be submitted to the TC by any person through the use of the OASIS TC Comment Facility, which can be used by following the instructions on the TC’s “Send A Comment” page (https://www.oasis-open.org/committees/comments/index.php?wg_abbrev=lexidma).

Comments submitted by TC non-members for this work and for other work of this TC are publicly archived and can be viewed at:
https://lists.oasis-open.org/archives/lexidma-comment/

All comments submitted to OASIS are subject to the OASIS Feedback License, which ensures that the feedback you provide carries at least the same obligations as those of the TC members. In connection with this public review, we call your attention to the OASIS IPR Policy [1], applicable especially [2] to the work of this technical committee. All members of the TC should be familiar with this document, which may create obligations regarding the disclosure and availability of a member’s patent, copyright, trademark and license rights that read on an approved OASIS specification.

OASIS invites any persons who know of any such claims to disclose these if they may be essential to the implementation of the above specification, so that notice of them may be posted to the notice page for this TC’s work.

Additional information about the specification and the LEXIDMA TC can be found at the TC’s public home page:
https://www.oasis-open.org/committees/lexidma/

Additional information related to this public review, including a complete publication and review history, can be found in the public review metadata document [3].

========== Additional references:

[1] https://www.oasis-open.org/policies-guidelines/ipr/

[2] https://www.oasis-open.org/committees/lexidma/ipr.php
https://www.oasis-open.org/policies-guidelines/ipr/#Non-Assertion-Mode
Non-Assertion Mode

[3] Public review metadata document:
https://docs.oasis-open.org/lexidma/dmlex/v1.0/csd02/dmlex-v1.0-csd02-public-review-metadata.html



Energy Web

DENA and Energy Web Unite to Launch DIVE to Transform the Energy System with Digital Identities

Innovating Energy: Digital Identities at the Core of Transformation

ZUG, Switzerland — January 31, 2024. The German Energy Agency, Deutsche Energie-Agentur (DENA), has announced a partnership with Energy Web, Forschungsstelle für Energiewirtschaft e.V., Oli Systems, KILT Protocol, Fieldfisher, and Fraunhofer FIT to launch DIVE — “Digital Identities as Trust Anchors in the Energy System.” DIVE is set to redefine the energy sector in Germany, leveraging the potential of digital identities to integrate and manage renewable energies in a decentralized landscape.

The Energy System Transformation
As part of the urgent effort to achieve climate neutrality, Germany is embracing a massive expansion of renewable energies. This shift is leading to a decentralized energy system with an increasing number of small-scale renewable energy facilities proliferating, including wind turbines, solar parks, batteries, electric vehicles, and distributed rooftop solar installations. In this new landscape, consumer choice and seamless integration with various use cases (such as tracing renewable energy, using customer-owned assets for grid services, or selecting a different supplier) is vital for efficient energy management.

Introducing Digital Identities for Trust and Efficiency
At the heart of the DIVE project lies a critical objective — establishing secure and reliable digital identities for devices and systems within the energy sector. These digital identities act as trust anchors, verifying the existence and capabilities of each system in real time. By automating verification processes, DIVE allows device owners to choose and change the electricity use cases their devices participate in, in almost no time and with little effort, ensuring grid stability and cost savings for energy consumers.

Energy Web plays a central role in the DIVE project, contributing expertise and technology to two working packages:

Digital Applications in Proof Management

Energy Web is taking the lead in developing and implementing use cases related to electricity labeling and verification. This includes the use of digital identities in conjunction with the registry and exploring energy industry use cases and their links to the identity register. Additionally, Energy Web’s contribution involves extending its existing open-source Green Proofs solution to connect to the digital trust anchors of DIVE and leveraging the substrate-based Energy Web X chain (EWX).

Energy Web’s Green Proofs solution enables trustworthy device identities to connect with different registers for guarantees of origin. This empowers devices to freely choose and switch between standards and marketplaces without encountering compliance issues like double spending. Energy Web aims to support existing standards and platforms, such as EnergyTag, Energy Track&Trace (an Elia Group platform in collaboration with energinet and Elering), GO, REC, I-REC, and the German guarantees of origin register (Herkunftsnachweisregister — HKNR).

Furthermore, Energy Web continues to advance the development of its “Fast Change of Supplier for EV Charging Stations” solution (ReBeam). Initially tested with Elia Group & 50Hertz Transmission in Berlin during the summer of 2022, the project now integrates with DIVE to enhance security and confidence in the entire process. This integration ultimately allows the consumption of self-generated PV power at public charging stations.

Smart Contract Library
In addition, Energy Web will develop standardized representation and description forms for smart contracts under DIVE. This includes classification within the energy industry context, ensuring implementation-independent descriptions of inputs, outputs, conditions, and logic of smart contracts. The establishment of a “Smart Contract Register” as an “App Store” for decentralized applications and logic devices, along with the provision of smart contracts under free licenses, will set the groundwork for an independent technology library.
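DIVE’s actual description format is not published in this announcement, so the following is only a sketch of the kind of implementation-independent entry such a “Smart Contract Register” might hold. All field names and the example values are assumptions for illustration, not the project’s specification.

```typescript
// Purely illustrative: DIVE's real description format is not specified here.
// This sketch only shows the kind of implementation-independent metadata a
// "Smart Contract Register" entry might carry.
interface SmartContractEntry {
  id: string;                      // registry identifier
  name: string;
  classification: string[];        // energy-industry context, e.g. ["certification"]
  inputs: { name: string; type: string; description: string }[];
  outputs: { name: string; type: string; description: string }[];
  conditions: string[];            // human-readable preconditions for execution
  logicSummary: string;            // implementation-independent description of the logic
  license: string;                 // e.g. a free/open-source license identifier
  implementations?: { chain: string; address: string }[]; // optional concrete deployments
}

// Hypothetical example entry.
const example: SmartContractEntry = {
  id: "scr-0001",
  name: "Guarantee-of-origin claim check",
  classification: ["certification", "proof-management"],
  inputs: [{ name: "assetDid", type: "string", description: "DID of the generating asset" }],
  outputs: [{ name: "claimValid", type: "boolean", description: "whether the claim is covered" }],
  conditions: ["asset identity resolvable via the DIVE trust anchor"],
  logicSummary: "Checks a production claim against the linked registry entry.",
  license: "Apache-2.0",
};
```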

About Energy Web
Energy Web is a global non-profit accelerating the clean energy transition by developing open-source technology solutions for energy systems. Our enterprise-grade solutions improve coordination across complex energy markets, unlocking the full potential of clean, distributed energy resources for businesses, grid operators, and customers. The Energy Web ecosystem comprises leading utilities, renewable energy developers, grid operators, corporate energy buyers, automotive, internet-of-things, telecommunications leaders, and more. More information on Energy Web can be found at www.energyweb.org or follow us on Twitter @EnergyWebX

DENA and Energy Web Unite to Launch DIVE to Transform the Energy System with Digital Identities was originally published in Energy Web on Medium, where people are continuing the conversation by highlighting and responding to this story.


DIF Blog

DIF Hackathon Meet The Winners: Edward Curran


What led you to start working with Decentralized Identity? 

I studied computer science at university and wrote my dissertation on Bitcoin, as well as developing using Hyperledger Fabric, so I’ve been in the Web3 world for a while.

After uni I got involved in a research project looking into how the mortgage application process could be improved, as part of a knowledge transfer partnership between a university and a bank.

Applying for a mortgage is slow and painful, so streamlining it should make it better for customers and cheaper for the bank. The number of parties — including buyer and seller, lenders, conveyancers, estate agents, credit reference agencies and the Land Registry — made us think “This seems like a problem blockchain can help to solve”. 

However when we spoke to the parties involved, it became clear the core challenge is a data problem. There’s a need to securely exchange data about the property and the participants. That’s not easy, as a lot of the information is private.

All the guidance was, “Don’t put personal information on the blockchain”. So we searched around and learned about SSI (Self Sovereign Identity) and Verifiable Credentials (VCs). We visited the Rebuilding the Web of Trust (RWOT) conference and got excited. We thought, “This seems like the right way to do it”. 

At this point I realised I wanted to work in decentralized identity, so I moved to Berlin and worked at Jolocom. 

Tell us about TrustBox, please  

I’m not the only one who’s seen these problems with the property buying process. After I returned to the UK I got involved with the Property Data Trust Framework, which is a group of financial institutions, conveyancers and other industry participants working together to standardise data schemas. 

I came in to try to find a way to standardise the data exchange using VCs. To do that, you need to know who’s allowed to issue and verify certain things, so I started looking into the DIF Trust Establishment specification, which is when I saw the publicity for the DIF Hackathon. 

I thought “I can build something using Trust Establishment”. Initially I was looking at the organizational side. If each organization has a DID, they can recognize each other. I was less clear how it would work for buyers and sellers. DIDs are quite an abstract concept for people to get their heads around. 

People do most of their research on the web, and I liked the idea of mimicking the SSL padlock (the icon displayed by browsers when visiting a secure website) so it verifies a site is part of the Property Data Trust Framework. So we developed a browser extension called TrustSight, building on existing work around DID configurations (used to cryptographically link a DID to a domain).
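To make the domain-linkage idea concrete, here is a minimal sketch of the check a TrustSight-style extension might perform. The resource shape follows DIF’s Well Known DID Configuration convention of serving a JSON document at /.well-known/did-configuration.json, but the helper functions and the verification logic below are illustrative assumptions, not the project’s actual code.

```typescript
// Illustrative sketch only: the resource shape follows DIF's Well Known DID
// Configuration spec, but the verify helper below is a hypothetical placeholder.
interface DidConfigurationResource {
  "@context": string;   // e.g. "https://identity.foundation/.well-known/did-configuration/v1"
  linked_dids: string[]; // Domain Linkage Credentials, typically encoded as JWTs
}

// Hypothetical check an extension might run against the current tab's origin.
async function isDomainLinkedToDid(origin: string): Promise<boolean> {
  const res = await fetch(`${origin}/.well-known/did-configuration.json`);
  if (!res.ok) return false;
  const config = (await res.json()) as DidConfigurationResource;

  // Each entry should be a credential whose issuer DID also claims this origin.
  for (const jwt of config.linked_dids) {
    if (await verifyDomainLinkageJwt(jwt, origin)) return true;
  }
  return false;
}

// Placeholder so the sketch runs; a real implementation would resolve the DID,
// check the signature, and compare the credential's origin claim.
async function verifyDomainLinkageJwt(jwt: string, origin: string): Promise<boolean> {
  return jwt.length > 0 && origin.startsWith("https://");
}
```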

We also built a tool for deploying trust frameworks, and a tool to visualize trust relationships.

Why did you choose to develop using Veramo? 

It’s tricky maintaining all these DIDs and DID configurations. There's a package by Sphereon around DID configurations, but it doesn’t provide any tooling around issuing or verifying credentials.

I was struggling to get the browser extension to work with JSON LD libraries, so I decided to use Veramo for DID and VC related operations and connect to the Sphereon resolver. 

It’s made my life much easier, particularly if I want to use a new DID method or different VC formats.

What user benefits are you targeting? 

As a buyer, ideally you’d apply to a hundred different lenders in order to get the best deal. However, it’s currently too expensive for lenders to do the checks. As a result, each buyer can really only apply for one mortgage, so people are very conservative about what they are looking for, to ensure they secure a mortgage.

We’re aiming to make the mortgage application process quicker and easier, and to ensure people get the right product for them. 

How was your experience of participating in the Hackathon? 

It was very well-organised. The criteria and timelines were all clear. I really enjoyed the Discord server. Some people were very active there, which made it feel like a community. I also liked the talks that were put on.

It was great to get to the end and see everyone’s submissions, and to feel connected and part of the community. 

What next for TrustBox? 

There is some tooling on the Trust Framework side that doesn’t yet exist, so I’d like to work with DIF on that. 

Downloading a browser extension is still quite a big piece of user friction, ultimately you want to get into the browser itself. That’s one to discuss with the browser companies!


OpenID

SIDI Hub Announces Roadmap to Drive Cross-Border Interoperability for Digital Identity


The OpenID Foundation is proud to be a founding member of the Sustainable and Interoperable Digital Identity (SIDI) Hub.

Interoperability is crucial for a fair and inclusive digital society. By coordinating the digital identity activities already underway, and defining a governance structure for digital identity credentials, the SIDI Hub is helping accelerate the path to cross-border interoperability.  

Our first summit at TRUSTECH 2023 was a success, with 90% of participants agreeing that our work must continue in 2024.

In response, the SIDI Hub has defined its workstreams and roadmap for 2024:

Identifying champion use cases for cross-border interoperability that serve as a baseline for all workstreams

Defining minimum interoperability requirements for priority use cases

Mapping trust frameworks across jurisdictions

Defining metrics of success 

Read our official announcement to learn more: https://sidi-hub.community/2024/01/30/digital-identity-community-unites-to-drive-cross-border-interoperability/ 

We are hosting a series of in-person meetings this year to progress the roadmap and invite all organizations involved in the development of digital identity solutions to get involved.

Visit the SIDI Hub website: https://lnkd.in/erWmZTCj
Join the official LinkedIn Group: https://lnkd.in/ecqtM5ry
Sign up to the newsletter: https://bit.ly/47Ul29j

 


OpenID Foundation

The OpenID Foundation (OIDF) is a global open standards body committed to helping people assert their identity wherever they choose. Founded in 2007, we are a community of technical experts leading the creation of open identity standards that are secure, interoperable, and privacy preserving. The Foundation’s OpenID Connect standard is now used by billions of people across millions of applications. In the last five years, the Financial Grade API has become the standard of choice for Open Banking and Open Data implementations, allowing people to access and share data across entities. Today, the OpenID Foundation’s standards are the connective tissue to enable people to assert their identity and access their data at scale, the scale of the internet, enabling “networks of networks” to interoperate globally. Individuals, companies, governments and non-profits are encouraged to join or participate.

Find out more at openid.net.

The post SIDI Hub Announces Roadmap to Drive Cross-Border Interoperability for Digital Identity first appeared on OpenID Foundation.


Human Colossus Foundation

Unraveling the Path to Genuine Semantic Interoperability across Digital Systems - Part 2

Delving into the integration of stemmatic traceability with directed acyclic graphs (DAGs) and kinematical mechanics reveals advanced strategies for enhancing data integrity and fortifying the foundations of semantic interoperability in the digital era.
Part 2: Stemmatic Traceability Further Explorations

Building on the foundational exploration of semantic interoperability in Part 1, we delve deeper into the innovative fusion of traditional methodologies with modern computational models. Previously, we discussed the pivotal roles of decentralized semantics and Overlays Capture Architecture (OCA) in enabling semantic interoperability between data models and data representation formats across varied environments. We also explored how morphological and epistemological semantics enhance our understanding of data, setting the stage for preserving meaning and context across digital platforms.

In this installment, we introduce "stemmatic traceability," marrying the ancient discipline of Stemmatics, focusing on tracing textual variations and origins, to contemporary event provenance models like Directed Acyclic Graphs (DAGs). This synergy enhances data integrity mechanisms and seamlessly integrates classical textual analysis with cutting-edge event provenance practices. Through this exploration, we aim to demonstrate how the principles of Stemmatics, previously confined to textual criticism, enhance semantic interoperability by offering innovative ways to track the evolution of digital objects.

The Convergence of Stemmatics and Directed Acyclic Graphs (DAGs) for Enhanced Data Integrity

In the dynamic landscape of digital content and data object evolution, 'Stemma' [1] emerges as a versatile umbrella term, encompassing diverse tree structures representing the evolution of digital objects. Traditionally linked to textual criticism, 'Stemma' transcends its origins, mirroring the characteristics of Directed Acyclic Graphs (DAGs), a robust model for version control systems, and more. It is a unifying genealogical tree encompassing the trackable events that depict the evolution of digital content.

The convergence of advanced data structures and traditional textual analysis methodologies in data management is both profound and strategic. One exemplary intersection is the alignment between DAGs and the time-honored principles of Stemmatics. This synergy unveils a dynamic landscape where event provenance models enable tracing the evolution of systemic objects, enhancing data integrity and reliability across diverse computational ecosystems.

"Stemmatics" is a discipline within textual criticism that involves studying and analyzing the relationships among various copies of a text to reconstruct the original or an earlier form of that text. It seeks to trace and depict the transmission history and ancestral relationships of different versions of a manuscript or text.

Figure 1. A Stemma example for 'De nuptiis Philologiae et Mercurii' by Martianus Capella, as proposed by Danuta Shanzer [2].

The parallel between the ordered, hierarchical structuring of texts in Stemmatics and the nodal representation of data within DAGs yields a more precise understanding of data lineage and textual variation, and demonstrates how these causal models encapsulate complex, multifaceted data.

Mirroring 'Stemmatics,' which focuses on tracing texts back to their original form or archetype by unraveling their complex, layered evolutions, DAGs enable a similar journey for data objects. Every node, representing a distinct event or data state, is a stepping stone that leads back to the root (i.e., the source node) – the initial event. All other nodes (events) are causally or sequentially linked, directly or indirectly, to the source node without cycles. Each edge in the DAG signifies a direct influence or connection between events, tracing back to the initial event as the origin. Traceable tree-like structures bring transparency to the fields of data evolution and data integrity. In a digital world where data is as fluid as it is expansive, such a structured approach is instrumental in mitigating data corruption, loss, or misinterpretation.
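To make the structure concrete, here is a minimal sketch of the node-and-edge model described above: each event node points to its parent events, and any node can be traced back to the source (root) event. The types and the trace helper are illustrative only, not part of OCA or of any particular DAG implementation.

```typescript
// Minimal illustration of the tracing idea described above, not a product API.
// Each event node points to its parent events, so every node can be traced
// back to the single source (root) event, the archetype.
interface EventNode {
  id: string;
  description: string;
  parents: string[];   // ids of directly preceding events; empty for the root
}

type Dag = Map<string, EventNode>;

// Walk from a node back to the root, collecting the lineage.
// (For simplicity this follows the first parent at each step.)
function traceToRoot(dag: Dag, startId: string): EventNode[] {
  const lineage: EventNode[] = [];
  let current = dag.get(startId);
  while (current) {
    lineage.push(current);
    current = current.parents.length ? dag.get(current.parents[0]) : undefined;
  }
  return lineage.reverse(); // root (archetype) first
}

// Example: record creation, then a correction, then a merge of two branches.
const dag: Dag = new Map<string, EventNode>([
  ["e1", { id: "e1", description: "initial record (archetype)", parents: [] }],
  ["e2", { id: "e2", description: "correction", parents: ["e1"] }],
  ["e3", { id: "e3", description: "merge with lab result", parents: ["e2", "e1"] }],
]);
console.log(traceToRoot(dag, "e3").map(n => n.description));
// -> [ "initial record (archetype)", "correction", "merge with lab result" ]
```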

Integrating Kinematical Mechanics and Data Stemmatics

Marking the convergence of classical and modern computational paradigms, structures like DAGs, imbued with the essence of principles like Stemmatics, serve as pillars of data integrity, ensuring that as data traverses the complex pathways of digital systems, its essence, authenticity, and assurance remain intact and enriched, offering a foundation for causal representation.

Introducing "kinematical mechanics" [3] into our discussion, we delve into how this discipline optimizes event pathways and interactions within digital environments. Kinematical mechanics contributes to authentic data provenance and system performance by employing motion and event sequencing concepts, enhancing workflow optimization and data processing. Integrating morphological semantics and kinematical mechanics lays the groundwork for data stemmatics, offering a comprehensive framework for representing and understanding the sequence and patterns in data evolution.

Kinematical Mechanics: In data-centric design, kinematical mechanics analyzes and optimizes event pathways and interactions within digital environments. This discipline employs the study of motion sequences and patterns to enhance the understanding and organization of event sequences, crucially contributing to authentic data provenance and improved system performance. Its importance in workflow optimization enables computational task scheduling and data processing pipelines. Understanding the sequences of events and their causality is fundamental to achieving system efficiency and optimal performance.

Example: In data analysis, 'Kinematical Mechanics' investigates the sequence and patterns of specific events, such as data updates or user interactions, and their impact on the system's behavior within a defined framework.

Morphological semantics and kinematical mechanics form the basis for data stemmatics, offering traceable genealogical structures to represent causal relationships between tangible 'objects' and recorded 'events,' providing a comprehensive understanding of data evolution.

Figure 2. Visualizing the Intersection of Objects and Events in Data Stemmatics.

Data Stemmatics: Data stemmatics explores the causal relationships behind data evolution, utilizing traceable graph structures with root archetypes to depict genealogical hypotheses about data relationships driven by content and historical context. It identifies the causes of data changes and offers insights into data evolution across domains. Data stemmatics is concerned with objects and events, delving into the cause of data modifications.

Data stemmatics clarifies the lineage of data changes and provides deeper insights into data evolution across various domains, thus enhancing our ability to achieve genuine semantic interoperability.

OCA and DAGs: A Synergetic Combination for Stemmatic Traceability

Stemmatic traceability, rooted in textual criticism and historical data analysis, is crucial in tracing data origins, transformations, and evolutionary paths. This method goes beyond mere nodal relationships to offer a nuanced understanding of data's evolutionary journey, thereby significantly improving our capacity for semantic interoperability.

Enhancements offered by the integration of OCA and DAGs include:

Precision in Data Lineage: The structural organization provided by OCA, coupled with the causal pathways rendered by DAGs, ensures the integrity of data structures and facilitates a transparent, unambiguous tracing of their historical evolution and transformations.

Enhanced Data Interpretability: Leveraging DAGs within the OCA framework transforms each data object's trajectory into a narrative that is both coherent and intuitively understandable. This clarity proves invaluable in scenarios where deciphering the evolution and provenance of data is critical.

Robustness Against Data Corruption: DAGs' acyclic nature inherently safeguards against data corruption and cyclic errors. Combined with OCA's structured framework, this resilience constructs a formidable defense mechanism for maintaining data integrity.

Scalability and Flexibility: Engineered with scalability at their core, OCA and DAGs adeptly navigate the complexities of expanding data landscapes. This synergistic blend ensures data integrity and traceability maintenance without compromising performance or adaptability.

Example: Consider a healthcare data ecosystem where patient records evolve. OCA organizes and structures this data while DAGs meticulously track every alteration, from initial diagnosis to treatment outcomes, ensuring a transparent, error-free historical record.

Conclusion

Data Integrity and Traceability: When DAG technology is used to integrate Stemmatics principles, DAGs emerge as a natural outcome of OCA's design rather than a standalone feature. They are one of a few tools OCA uses to ensure data integrity, contributing to better data transparency and traceability and highlighting OCA's adaptability and versatility in handling data.

The integration of OCA with DAGs represents a synergistic relationship in data science, advancing stemmatic traceability by combining OCA's structural framework with the traceable precision of DAGs. This marriage ensures the traceable causality of data lineage and reinforces data integrity, making it a cornerstone of modern data management and the evolution of data records.

As we navigate the complexities of modern distributed data ecosystems, OCA and DAG technologies underscore a promising horizon for semantic interoperability and data integrity. The exploration into stemmatic traceability and its integration with contemporary technological frameworks marks a significant milestone in the modern digital landscape and the ongoing journey towards a future where data is not only abundant and accessible but also enriched with a clear, traceable lineage.

That concludes Part 2 of this two-part series on genuine semantic interoperability across digital systems, where we have explored the advanced concepts of stemmatic traceability and its integration with contemporary computational models. Our exploration delved into how the ancient discipline of Stemmatics, complemented by Directed Acyclic Graphs (DAGs) and kinematical mechanics, significantly enhances our understanding of data evolution and data integrity, thereby reflecting our ongoing commitment to deepening the dialogue around semantic interoperability.

For those who found these insights enlightening and wish to explore the foundational aspects of this topic, we highly recommend revisiting Part 1 of the series on "Semantic Interoperability," where we examined the crucial roles of decentralized semantics and Overlays Capture Architecture (OCA) in facilitating data harmonization across diverse platforms. We delved into the intricate dynamics of morphological and epistemological semantics and their critical contributions to the semantic interoperability framework. Part 1 sets the stage for understanding how to achieve seamless data exchange, challenging traditional notions, and offering innovative solutions for ensuring data models carry consistent and meaningful interpretations across varied systems and platforms.

Revisiting Part 1 will provide a comprehensive backdrop to the advanced discussions presented here, offering a holistic view of achieving genuine semantic interoperability in our increasingly interconnected digital world.

Link to Genuine Semantic Interoperability across Digital Systems - Part 1: Semantic Interoperability

Stay tuned for more insightful discussions as we continue to unravel the complexities and innovations in data science and interoperability.

References

[1] Parvum Lexicon Stemmatologicum. Stemma (Stemmatology). Department of Greek and Latin Philology, University of Zurich (UZH). Retrieved from https://www.sglp.uzh.ch/static/MLS/stemmatology/Stemma_229149940.html

[2] Shanzer, D. (1986). Review Article: Felix Capella: Minus sensus Quam Nominis Pecudalis [Review of Martianus Capella: “De Nuptiis Philologiae et Mercurii,” by J. Willis]. Classical Philology, 81(1), 62–81. http://www.jstor.org/stable/269880 

[3] Zhijiang Du, Wenlong Yang, Wei Dong, Kinematics modeling and performance optimization of a kinematic-mechanics coupled continuum manipulator, Mechatronics, Volume 31, 2015, Pages 196-204, ISSN 0957-4158, https://doi.org/10.1016/j.mechatronics.2015.09.001


DIF Blog

DIF Hackathon Meet The Winners: Harsh Tyagi


Please tell us about yourself and how you got involved in the Hackathon

I’m currently in my final year at university. I’ve been building apps for the past two years and have recently been learning about cryptography, but I hadn’t previously built anything with Decentralized Identifiers (DIDs).

I heard about DIF through an email from Devpost with information about the Hackathon. I started reading recent DIF announcements and looking into the specifications. 

A lot of the concepts were new to me. It made me rethink how the internet can be delivered! 

What motivated you to build something in the area of personal health data? 

Another team member, who has health problems in her family, came up with the idea for HealthX Protocol. Her family has to take a big bunch of files with them to every medical appointment. There’s no way for healthcare providers to filter or compute on the data. Everything is manual and takes a lot of time and resources to manage. 

Privacy is obviously really important when it comes to personal health data, but it’s not enough. Given the current state of cyber attacks, you have no idea who has your data, whether it’s in the cloud or whatever. So, the ability to own your health data was a critical requirement for us. 

The other part is, you don’t only need to store the data, you also need to be able to share it with those who need it.

What did you learn during the hackathon, and how did you use it to meet these needs? 

The hackathon introduced me to DWNs and Web5. Before, I was into other stacks and protocols: Ethereum, zero knowledge and DeFi (Decentralized Finance). I wasn’t actively building identity into my applications. 

These past two months, I’ve been fully immersed in decentralized identity. 

With Decentralized Web Nodes (DWNs), owning your data is simple. You spin up a basic server, which can be in the cloud, on your smartphone, even in your browser. You can store the data there and send it to another DID, which can see or edit the full data, or a certain portion of it.

You encrypt your data with your DID. It’s not like Google has the keys. Even if it’s in the cloud, it’s yours. 
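For readers curious what this looks like in practice, here is a minimal sketch based on TBD’s Web5 JS SDK (@web5/api) as it stood around the time of the hackathon. The package and method names are assumptions drawn from its public documentation and may have changed since, so treat this as an illustration rather than a reference implementation.

```typescript
// Sketch based on TBD's Web5 JS SDK (@web5/api) circa early 2024; method
// names are assumptions from its docs and may differ in later versions.
import { Web5 } from "@web5/api";

async function storeHealthRecord() {
  // Creates (or reconnects to) the user's DID and their Decentralized Web Node.
  const { web5, did: myDid } = await Web5.connect();

  // Write a record to the user's own DWN.
  const { record } = await web5.dwn.records.create({
    data: { type: "BloodPressureReading", systolic: 120, diastolic: 80 },
    message: { dataFormat: "application/json" },
  });

  if (record) {
    // Push the record to the DWN endpoints registered for this DID,
    // so it is available beyond the local node.
    await record.send(myDid);
    console.log("Stored record", record.id, await record.data.json());
  }
}

storeHealthRecord().catch(console.error);
```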

Somehow everything fell into place. I thought “this can be a great use case”. 

How have decentralized identity and Web5 changed how you think about app development? 

The most important thing is being able to store, own and share data. You have self-custody with web3, but it’s just the private key. This enables decentralized money, but what if I need to build an application that generates a large amount of data? I can’t put the file on Ethereum. Even with systems like Filecoin and IPFS (InterPlanetary File System), ownership is still a big issue. 

Something that connects your identifier to your storage and enables you to give access to other identifiers opens the door to a lot of new applications. Ownership of data is lacking in web3, and web5 fills the gap. 

Another consideration is that in web3, identity is anonymous. To build identity into an app, you need to write a smart contract. Proving your identity becomes really easy when you’re using DIDs and Verifiable Credentials (VCs). You can choose what information to share using Selective Disclosure. It’s all there out of the box.

VCs can also be used for access control to a DWN. For example, if you have a conference, anyone with a pass can get access to the materials. The old way of access control is using a username and password. The new way is you can give access to anyone with a VC.

What next for HealthX Protocol? 

The application is currently just a prototype. We want to build something much more polished. 

At this point, generating the DIDs is a little difficult. To address that, I’m planning to integrate a digital identity wallet, perhaps even build one. Then we can do a production application. 

I’d also like to build other applications and get more involved in the decentralized identity community. 

I have a lot of ideas and I’m just getting started in this space!

Tuesday, 30. January 2024

DIF Blog

DIF Hackathon Meet The Winners: Ken Watanabe



Please introduce yourself and tell us the background to your project, "Anonymous Door Unlocking with Anonymity Revocation"

My name is Ken Watanabe. I’m studying cryptography at Waseda University under Kazue Sako. My current research focus is on use cases for Verifiable Credentials (VCs), as well as signature schemes such as BBS+. 

The first time I used VCs was during a national project for the Japanese government, where we used VCs to authenticate UAVs (drones) delivering packages from A to B. 

For the hackathon, we were looking for a simple use case we could implement in our day-to-day environment. We work in a lab, so I decided to make a physical door unlocking system using 3D printers, that enables us to unlock the door to the lab using VCs. 

Please can you describe your solution? 

In our solution the university is the issuer, students are the holders and the door unlocking application is the verifier. 

We also introduced a new role in the ecosystem, ‘Opener’. Only the Opener can revoke a holder’s anonymity. The holder and verifier agree who will be the Opener during the setup process. 

The key technical components are Decentralized Identifiers (DIDs), W3C JSON-LD Verifiable Credentials and Decentralized Web Nodes (DWNs). 

We chose DWNs as they offer both storage and messaging. We realized we could put our VC into a DWN and deliver it to other entities like doors using the TBD messaging libraries. This met our requirements and made it simpler to develop. 

We made a wallet application that connects to the DWN, shows the list of VCs and presents the needed information to verifiers through a QR code. 

We used Dock network (which supports BBS+ signatures) and Arkworks to implement the crypto libraries, which I developed myself. 

Why was anonymity revocation an important feature?

We think Selective Disclosure is very important for many use cases, including this one. The holder shouldn’t have to share unnecessary attributes and the verifier shouldn’t need to hold sensitive data. 

But sometimes, data breaches, a theft or a physical accident might occur and the incident needs to be investigated. 

The Verifiable Presentations (VPs) generated by the application are stored on the lab’s Slack channel. This enables the lab manager to see when the door is opened in real time, without seeing who opened it. If something bad happens, you can pick the presentation from Slack and send it to the Opener to open. 

Please can you explain how the system preserves users’ privacy? 

The holder can choose which attributes to share and generate a Verifiable Presentation (VP) with just this information. They don’t need to share their name, only their faculty membership. 

We used BBS+ signatures because of the unlinkability feature, which means you can’t link multiple transactions to a single user, for enhanced privacy. BBS uses Zero Knowledge Proofs (ZKPs) to hide attributes. In this project we added another ZKP to the BBS signature, which we call “verifiable encryption”. In our system, the holder encrypts his identifier using the Opener’s public key. The extra ZKP means the verifier can verify this has happened. 
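The flow described here can be sketched roughly as follows. Every helper below is a hypothetical placeholder for BBS+ and verifiable-encryption primitives (for example, those in Dock’s crypto libraries); none of it is a real library API, and the attribute names are invented for illustration.

```typescript
// Conceptual sketch of the door-unlocking check described above. The helpers
// are placeholders for BBS+ / verifiable-encryption primitives, not real APIs.

interface Presentation {
  revealed: Record<string, string>;  // e.g. only { faculty: "..." }, never the name
  bbsProof: Uint8Array;              // ZK proof derived from the BBS+ signature
  encryptedHolderId: Uint8Array;     // holder identifier encrypted to the Opener
  encryptionProof: Uint8Array;       // ZK proof that the ciphertext contains the
                                     // same identifier the issuer signed
}

// Placeholder verifiers: a real implementation would run BBS+ proof
// verification and the verifiable-encryption check here.
function verifyBbsProof(p: Presentation, issuerPublicKey: Uint8Array): boolean {
  return p.bbsProof.length > 0 && issuerPublicKey.length > 0;
}
function verifyEncryptionProof(p: Presentation, openerPublicKey: Uint8Array): boolean {
  return p.encryptionProof.length > 0 && openerPublicKey.length > 0;
}

// The door (verifier) never learns who the holder is; it only checks that
// (1) the required attribute is disclosed, (2) the credential is genuine, and
// (3) the Opener could de-anonymize the holder if an incident occurs.
function doorShouldUnlock(
  p: Presentation,
  issuerPublicKey: Uint8Array,
  openerPublicKey: Uint8Array
): boolean {
  return (
    p.revealed["faculty"] !== undefined &&
    verifyBbsProof(p, issuerPublicKey) &&
    verifyEncryptionProof(p, openerPublicKey)
  );
}
```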

We think it’s our main contribution to these kinds of door unlocking systems. 

What’s next for the application? 

Only four faculty members use the system currently. I want others to be able to use it. To do this we need to issue VCs to more students. I also want to introduce the system to other doors within the university. But I know it will take time to get it into production, as we need to verify it works well. 

Only one credential can be used in the application currently, but in future we want to make VPs from multiple VCs. I also want to introduce the ability for the holder and verifier to negotiate the Opener dynamically.

What other use cases do you envisage? 

This is a research project but we think the system can apply to other scenarios. One is ride hailing apps such as Uber and Lyft. The customer can order a taxi anonymously but if an accident happens, their anonymity can be revoked. Another is anonymous social networks. Users can chat anonymously, but if there’s a message that’s violent or abusive the author’s anonymity could be revoked. 

Apart from VCs, I’m a cryptographer, so I also want to explore completely different scenarios. I want to keep using it.

Last week I presented the application at SCIS2024 in Nagasaki, the biggest cryptography conference in Japan. Many people from government, academia and industry were there. I hope it generated a lot of interest. 

How did you find the experience of participating in the hackathon? Do you envisage participating in DIF going forwards? 

I really enjoyed the hackathon and am very honored to receive this prize. It was my first time using Decentralized Web Nodes. I found the DWNs GitHub readme page and API documentation very easy to read. I just followed the intro and found I could implement it easily. It was straightforward to store the VCs using DWNs. 

For me, BBS Signatures is an interesting area to explore further. I’d also like to use DWNs in other projects and, if possible, I would like to add some features to it. 


Velocity Network

Gianna Pfifner joins Velocity’s board

Congratulations to Gianna Pfiffner, who has been voted on to the Velocity Network Foundation Board. The post Gianna Pfifner joins Velocity’s board appeared first on Velocity.

Monday, 29. January 2024

FIDO Alliance

SRF: Change your Password Day: New standard passkey: Are passwords soon a thing of the past?


Passwords are as old as computers – but are still considered insecure and cumbersome. But now there is hope: “Passkey” is the name of a new procedure in which you don’t have to remember passwords or type in codes – and it’s still more secure. Behind it is FIDO, an alliance of large IT companies. SRF digital editor Peter Buchmann explains what it’s about.


IdentityWeek: Mastercard: 80% of data breaches linked to passwords


Mastercard is modernising digital interactions underpinned by biometric and AI-powered tools which provide less friction than endless password authentication. The company has joined the FIDO Alliance, which calls for encrypted passkey solutions and specifications to end passwords. The Mastercard Biometric Authentication Service provides a FIDO certified passwordless authentication that allows users to verify their identity using biometrics.


PCMag: X Now Supports Passkey Login on iOS


X (previously known as Twitter) will now let its users login with a passkey instead of a password – but only on iOS devices. X announced its intentions to adopt the passwordless technology a while back, and now it has launched the feature for iPhone users. It allows for a quicker way to login, only requiring users to authenticate with whatever they use to lock their device, such as their fingerprint, FaceID, or PIN. 


TechCrunch: X adds support for passkeys on iOS after removing SMS 2FA support last year


X, formerly known as Twitter, has introduced support for passkeys, a secure login method for U.S. users on iOS devices. This implementation follows the removal of SMS 2FA support for non-paying users last year, a move that was criticized for reducing overall security.


Identity At The Center - Podcast


We close January of 2024 with a bang! The latest episode of the Identity at the Center Podcast features Ryan Galluzzo, the Identity Management Program Lead at NIST, to find out if identity is at the center for NIST. We delved into the creation of NIST 800-63 r3, its upcoming revision, its significance for identity practitioners, and more.

To listen to this insightful conversation and gain valuable industry insights, tune in to the episode on IDACPodcast.com or your favorite podcast app.

#iam #podcast #idac

Sunday, 28. January 2024

Project VRM

An Approach to Paying for Everything That’s Free


Prompt: “A public marketplace for digital goods where people pay whatever they please for everything they consume.” Via Microsoft Image Creator

Now that we’ve hit peak subscription, and paywalls are showing up in front of formerly free digital goods (requiring, of course, more subscriptions), perhaps the world is ready for EmanciPay, an idea that has been biding its time on our wiki since 2009.

So, rather than leave it buried there, we’ll surface it here. Dig:::

Overview

Simply put, EmanciPay makes it easy for anybody to pay (or offer to pay) —

as much as they like
however they like
for whatever they like
on their own terms

— or at least to start with that full set of options, and to work out differences with sellers easily and with minimal friction.

EmanciPay turns consumers (aka users) into customers by giving them a pricing gun (something which in the past only sellers used) and their own means to make offers, to pay outright, and to escrow the intention to pay when price and other requirements are met. It also lets them do this at scale across all sellers, much as cash, browsers, credit cards, and email clients do. Payments themselves can also be escrowed.

In slightly more technical terms, EmanciPay is a payment framework for customers operating with full agency in the open marketplace, and at scale. It operates on open protocols and standards, so it can be used by any buyer, seller or intermediary.

It was conceived as a way to pay for music, journalism, or what any artist brings into the world. But it can apply to anything. For example, subscriptions have become a giant fecosystem in which every seller has separate and non-substitutable scale across all subscribers, while subscribers have zero scale across all sellers, with the highly conditional exceptions of silo’d commercial intermediaries. As Customer Commons puts it,

There’s also not much help coming from the subscription management services we have on our side: Truebill, Bobby, Money Dashboard, Mint, Subscript Me, BillTracker Pro, Trim, Subby, Card Due, Sift, SubMan, and Subscript Me. Nor from the subscription management systems offered by Paypal, Amazon, Apple or Google (e.g. with Google Sheets and Google Doc templates). All of them are too narrow, too closed and exclusive, too exposed to the surveillance imperatives of corporate giants, and too vested in the status quo.

That status quo sucks (see here, or just look up “subscription hell”), and it’s way past time to unscrew it. But how?

The better question is where?

The answer to that is on our side: the customer’s side.

While EmanciPay was first conceived by ProjectVRM as a way to make live payments to nonprofits and to provide a new monetization method for publishers, it also works as a counterpart to sellers’ subscription systems in what Zuora (a supplier of subscription management systems to the publishing industry, including The Guardian and Financial Times) calls the “subscription economy“, which it says “is built on ever-changing relationships with your customers”. Since relationships are two-way by nature, EmanciPay is one way that customers can manage their end, while publisher-side systems such as Zuora’s manage the other.

Emancipay economic case

EmanciPay provides a new form of economic signaling not available to individuals, either on the Net or before the Net became available as a communications medium. EmanciPay will use open standards and be comprised of open-source code. While any commercial fourth parties can use EmanciPay (or its principles, or any parts of it they like), EmanciPay’s open and standard framework will support fourth parties by making them substitutable, much as the open standards of email (SMTP, POP3, IMAP) make email systems substitutable. (Each has what Joe Andrieu calls service endpoint portability.)

EmanciPay is an instrument of customer independence from all of the billion (or so) commercial entities on the Net, each with its own arcane and siloed systems for engaging and managing customer relations, as well as receipt, acknowledgment, and accounting for payments from customers.

Use Case Background

EmanciPay was conceived originally as a way to provide customers with the means to signal interest and the ability to pay for media and creative works (most of which are freely available on the Web, if not always free of charge). Through EmanciPay, demand and supply can relate, converse, and transact business on mutually beneficial terms, rather than only on terms provided by the countless different siloed systems we have today, each serving to hold the customer captive, and causing much inconvenience and friction in the process.

Media goods were chosen for five reasons:
1) most are available for free, even if they cost money or sit behind paywalls;
2) paywalls, which are cookie-based, cannot relate to individuals as anything other than submissive and dependent parties (and each browser a user employs carries a different set of cookies);
3) both media companies and non-profits are constantly looking for new sources of revenue;
4) the subscription model, while it creates steady income and other conveniences for sellers, is often a bad deal for customers, and is now so overused (see Subscriptification) that the world is approaching a peak subscription crisis, and unscrewing it can only happen from the customer’s side (because the business is incapable of unscrewing the problem itself);
5) all methods of intermediating payment choices are either siloed by the seller or siloed by intermediators, discouraging participation by individuals.

What the marketplace requires are new business and social contracts that ease payment and stigmatize non-payment for creative goods. The friction involved in voluntary payment is still high, even on the Web, where one must go through complex ceremonies even to make simple payments. There is no common and easy way to keep track of what media (free or otherwise) we use (see Media Logging), to determine what it might be worth, or to pay for it easily and in standard ways — to many different suppliers. (Again, each supplier has its own system for accepting payments.)

EmanciPay differs from other payment models (subscriptions, newsstands, tip jars) by providing customers with the ability to choose what they wish to pay and how they’ll pay it, with minimum friction — and with full choice about what they disclose about themselves.

EmanciPay will also support credit for referrals, requests for service, feedback, and other relationship support mechanisms, all at the control of the user. For example, EmanciPay can provide quick and easy ways for listeners to pay for public radio broadcasts or podcasts, for readers to pay for otherwise “free” papers or blogs, for listeners to pay to hear music and support artists, for users to issue promises of payment for stories or programs — all without requiring the individual to disclose unnecessary private information or to become a “member” — although these options are kept open.

This will scaffold genuine relationships between buyers and sellers in the media marketplace. It will also give deeper meaning to “membership” in non-profits. (Under the current system, “membership” generally means putting one’s name on a pitch list for future contributions, and not much more than that.)

EmanciPay will also connect the sellers’ CRM (Customer Relationship Management) systems with customers’ VRM (Vendor Relationship Management) systems, supporting rich and participatory two-way relationships. In fact, EmanciPay will by definition be a VRM system.

Micro-accounting and Macro-distribution

The idea of “micro-payments” for goods on the Net has been around for a long time and is often brought up as a potential business model for journalism. For example in this article by Walter Isaacson in Time Magazine. It hasn’t happened, at least not globally, because it’s too complicated, and in prototype only works inside private silos.

What ProjectVRM suggests instead is something we don’t yet have, but very much need:

micro-accounting for actual uses. Think of this simply as “keeping track of” the news, podcasts, newsletters, or music we consume.
macro-distribution of payments for accumulated use (that’s no longer “micro”).

Much — maybe most — of the digital goods we consume are both free for the taking and worth more than $zero. How much more? We need to be able to say. In economic terms, demand needs to have a much wider range of signals it can give to supply. And give to each other, to better gauge what we should be willing to pay for free stuff that has real value but not a hard price.

As currently planned, EmanciPay would –

Provide a single and easy way for consumers of “content” to become customers of it. In the current system — which isn’t one — every artist, every musical group, and every public radio and TV station has his, her, or its own way of taking in contributions from those who appreciate the work. This can be arduous and time-consuming for everybody involved. (Imagine trying to pay separately every musical artist you like, for all your enjoyment of each artist’s work.) What EmanciPay proposes, however, is not a replacement for existing systems, but a new system that can supplement existing fund-raising systems — one that can soak up much of today’s MLOTT: Money Left On The Table.

Provide ways for individuals to look back through their media usage histories, inform themselves about what they have been enjoying, and determine how much it is worth to them. The Copyright Arbitration Royalty Panel (CARP), and later the Copyright Royalty Board (CRB), both came up with “rates and terms that would have been negotiated in the marketplace between a willing buyer and a willing seller.” This almost absurd language first appeared in the 1995 Digital Performance Royalty Act (DPRA) and was tweaked in 1998 by the Digital Millennium Copyright Act (DMCA), under which both the CARP and the CRB operated. The rates they came up with peaked at $.0001 per “performance” (a song or recording), per listener. EmanciPay creates the “willing buyer” that the DPRA thought wouldn’t exist.

Stigmatize non-payment for worthwhile media goods. This is where “social” will finally come to be something more than yet another tech buzzmodifier.

All these require micro-accounting, not micro-payments. Micro-accounting can inform ordinary payments that can be made in clever new ways that should satisfy everybody with an interest in seeing artists compensated fairly for their work. An individual listener, for example, can say “I want to pay 1¢ for every song I hear,” and “I’ll send SoundExchange a lump sum of all the pennies I wish to pay for songs I have heard over a year, along with an accounting of what artists and songs I’ve listened to” — and leave dispersal of those totaled pennies up to the kind of agency that likes, and can be trusted, to do that kind of thing. That’s the macro-distribution part of the system.
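The micro-accounting half of this can be sketched in a few lines: log each use, then roll the log up into one annual payment plus an accounting of who should receive what. The names, types, and the 1¢ rate below are purely illustrative, assuming a listener-controlled log rather than any existing payment service.

```typescript
// Illustrative sketch of micro-accounting plus macro-distribution: log each
// use locally, then roll the log up into one payment with an accounting of
// who should receive what. All names and rates are examples only.

interface ListenEvent {
  artist: string;
  track: string;
  playedAt: Date;
}

const CENTS_PER_PLAY = 1; // "I want to pay 1 cent for every song I hear"

function macroDistribution(log: ListenEvent[]): { total: number; perArtist: Map<string, number> } {
  const perArtist = new Map<string, number>();
  for (const e of log) {
    perArtist.set(e.artist, (perArtist.get(e.artist) ?? 0) + CENTS_PER_PLAY);
  }
  const total = log.length * CENTS_PER_PLAY;
  return { total, perArtist }; // send `total` as a lump sum, plus the accounting
}

// Example year of listening, rolled up into a single payment.
const yearLog: ListenEvent[] = [
  { artist: "Artist A", track: "Song 1", playedAt: new Date("2024-01-03") },
  { artist: "Artist A", track: "Song 2", playedAt: new Date("2024-02-11") },
  { artist: "Artist B", track: "Song 3", playedAt: new Date("2024-03-20") },
];
const { total, perArtist } = macroDistribution(yearLog);
console.log(`Pay ${total} cents total`, Object.fromEntries(perArtist));
// -> Pay 3 cents total { "Artist A": 2, "Artist B": 1 }
```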

Similar systems can also be put in place for readers of newspapers, blogs, and other journals. What’s important is that the control is in the hands of the individual and that the accounting and dispersal systems work the same way for everybody.

Friday, 26. January 2024

LionsGate Digital

The Declaration Of Digital Independence

Declaration Of Digital Independence Authored by Larry Sanger, Co-founder of Wikipedia

We declare that we have unalienable digital rights, rights that define how information that we individually own may or may not be treated by others, and that among these rights are free speech, privacy, and security. Since the proprietary, centralized architecture of the Internet at present has induced most of us to abandon these rights, however reluctantly or cynically, we ought to demand a new system that respects them properly.

The difficulty and divisiveness of wholesale reform means that this task is not to be undertaken lightly. For years we have approved of and even celebrated enterprise as it has profited from our communication and labor without compensation to us. But it has become abundantly clear more recently that a callous, secretive, controlling, and exploitative animus guides the centralized networks of the Internet and the corporations behind them.

The long train of abuses we have suffered makes it our right, even our duty, to replace the old networks. To show what train of abuses we have suffered at the hands of these giant corporations, let these facts be submitted to a candid world.

They have practiced in-house moderation in keeping with their executives’ notions of what will maximize profit, rather than allowing moderation to be performed more democratically and by random members of the community.

They have banned, shadow-banned, throttled, and demonetized both users and content based on political considerations, exercising their enormous corporate power to influence elections globally.

They have adopted algorithms for user feeds that highlight the most controversial content, making civic discussion more emotional and irrational and making it possible for foreign powers to exercise an unmerited influence on elections globally.

They have required agreement to terms of service that are impossible for ordinary users to understand, and which are objectionably vague in ways that permit them to legally defend their exploitative practices.

They have marketed private data to advertisers in ways that no one would specifically assent to.

They have failed to provide clear ways to opt out of such marketing schemes.

They have subjected users to such terms and surveillance even when users pay them for products and services.

They have data-mined user content and behavior in sophisticated and disturbing ways, learning sometimes more about their users than their users know about themselves; they have profited from this hidden but personal information.

They have avoided using strong, end-to-end encryption when users have a right to expect total privacy, in order to retain access to user data.

They have amassed stunning quantities of user data while failing to follow sound information security practices, such as encryption; they have inadvertently or deliberately opened that data to both illegal attacks and government surveillance.

They have unfairly blocked accounts, posts, and means of funding on political or religious grounds, preferring the loyalty of some users over others.

They have sometimes been too ready to cooperate with despotic governments that both control information and surveil their people.

They have failed to provide adequate and desirable options that users may use to guide their own experience of their services, preferring to manipulate users for profit.

They have failed to provide users adequate tools for searching their own content, forcing users rather to employ interfaces insultingly inadequate for the purpose.

They have exploited users and volunteers who freely contribute data to their sites, by making such data available to others only via paid application program interfaces and privacy-violating terms of service, failing to make such freely-contributed data free and open source, and disallowing users to anonymize their data and opt out easily.

They have failed to provide adequate tools, and sometimes any tools, to export user data in a common data standard.

They have created artificial silos for their own profit; they have failed to provide means to incorporate similar content, served from elsewhere, as part of their interface, forcing users to stay within their networks and cutting them off from family, friends, and associates who use other networks.

They have profited from the content and activity of users, often without sharing any of these profits with the users.

They have treated users arrogantly as a fungible resource to be exploited and controlled rather than being treated respectfully, as free, independent, and diverse partners.

We have begged and pleaded, complained, and resorted to the law. The executives of the corporations must be familiar with these common complaints; but they acknowledge them publicly only rarely and grudgingly. The ill treatment continues, showing that most of such executives are not fit stewards of the public trust.

The most reliable guarantee of our privacy, security, and free speech is not in the form of any enterprise, organization, or government, but instead in the free agreement among free individuals to use common standards and protocols. The vast power wielded by social networks of the early 21st century, putting our digital rights in serious jeopardy, demonstrates that we must engineer new—but old-fashioned—decentralized networks that make such clearly dangerous concentrations of power impossible.

Therefore, we declare our support of the following principles.

Principles of Decentralized Social Networks

We free individuals should be able to publish our data freely, without having to answer to any corporation.

We declare that we legally own our own data; we possess both legal and moral rights to control our own data.

Posts that appear on social networks should be able to be served, like email and blogs, from many independent services that we individually control, rather than from databases that corporations exclusively control or from any central repository.

Just as no one has the right to eavesdrop on private conversations in homes without extraordinarily good reasons, so also the privacy rights of users must be preserved against criminal, corporate, and governmental monitoring; therefore, for private content, the protocols must support strong, end-to-end encryption and other good privacy practices.

As is the case with the Internet domain name system, lists of available user feeds should be restricted by technical standards and protocols only, never according to user identity or content.

Social media applications should make available data input by the user, at the user’s sole discretion, to be distributed by all other publishers according to common, global standards and protocols, just as are email and blogs, with no publisher being privileged by the network above another. Applications with idiosyncratic standards violate their users’ digital rights.

Accordingly, social media applications should aggregate posts from multiple, independent data sources as determined by the user, and in an order determined by the user’s preferences.

No corporation, or small group of corporations, should control the standards and protocols of decentralized networks, nor should there be a single brand, owner, proprietary software, or Internet location associated with them, as that would constitute centralization.

Users should expect to be able to participate in the new networks, and to enjoy the rights above enumerated, without special technical skills. They should have very easy-to-use control over privacy, both fine- and coarse-grained, with the most private messages encrypted automatically, and using tools for controlling feeds and search results that are easy for non-technical people to use.

We hold that to embrace these principles is to return to the sounder and better practices of the earlier Internet and which were, after all, the foundation for the brilliant rise of the Internet. Anyone who opposes these principles opposes the Internet itself. Thus we pledge to code, design, and participate in newer and better networks that follow these principles, and to eschew the older, controlling, and soon to be outmoded networks.

We, therefore, the undersigned people of the Internet, do solemnly publish and declare that we will do all we can to create decentralized social networks; that as many of us as possible should distribute, discuss, and sign their names to this document; that we endorse the preceding statement of principles of decentralization; that we will judge social media companies by these principles; that we will demonstrate our solidarity to the cause by abandoning abusive networks if necessary; and that we, both users and developers, will advance the cause of a more decentralized Internet.

Sign the Petition at Change.org

The post The Declaration Of Digital Independence appeared first on Lions Gate Digital.

Thursday, 25. January 2024

Oasis Open Projects

Invitation to comment on CACAO Layout Extension v1.0

First public review - ends February 23rd

OASIS and the OASIS Collaborative Automated Course of Action Operations (CACAO) for Cyber Security TC are pleased to announce that CACAO Layout Extension v1.0 is now available for public review and comment. This 30-day review is the first public review for this specification.

About the specification draft:

Collaborative Automated Course of Action Operations (CACAO) is a schema and taxonomy for cyber security playbooks. The CACAO specification describes how these playbooks can be created, documented, and shared in a structured and standardized way across organizational boundaries and technological solutions.

This specification defines the CACAO Layout Extension for the purpose of visually representing CACAO playbooks accurately and consistently across implementations.

The documents and related files are available here:

CACAO Layout Extension Version 1.0
Committee Specification Draft 01
16 January 2024

Editable source (Authoritative):
https://docs.oasis-open.org/cacao/layout-extension/v1.0/csd01/layout-extension-v1.0-csd01.docx
HTML:
https://docs.oasis-open.org/cacao/layout-extension/v1.0/csd01/layout-extension-v1.0-csd01.html
PDF:
https://docs.oasis-open.org/cacao/layout-extension/v1.0/csd01/layout-extension-v1.0-csd01.pdf

For your convenience, OASIS provides a complete package of the specification document and any related files in ZIP distribution files. You can download the ZIP file at:
https://docs.oasis-open.org/cacao/layout-extension/v1.0/csd01/layout-extension-v1.0-csd01.zip

How to Provide Feedback

OASIS and the CACAO TC value your feedback. We solicit input from developers, users and others, whether OASIS members or not, for the sake of improving the interoperability and quality of our technical work.

The public review starts 25 January 2024 at 00:00 UTC and ends 23 February 2024 at 23:59 UTC.

Comments may be submitted to the TC by any person through the use of the OASIS TC Comment Facility, which can be used by following the instructions on the TC’s “Send A Comment” page (https://www.oasis-open.org/committees/comments/index.php?wg_abbrev=cacao).

Comments submitted by TC non-members for this work and for other work of this TC are publicly archived and can be viewed at:
https://lists.oasis-open.org/archives/cacao-comment/

All comments submitted to OASIS are subject to the OASIS Feedback License, which ensures that the feedback you provide carries at least the same obligations as those of the TC members. In connection with this public review, we call your attention to the OASIS IPR Policy [1], and especially its application [2] to the work of this technical committee. All members of the TC should be familiar with this document, which may create obligations regarding the disclosure and availability of a member’s patent, copyright, trademark and license rights that read on an approved OASIS specification.

OASIS invites any persons who know of any such claims to disclose these if they may be essential to the implementation of the above specification, so that notice of them may be posted to the notice page for this TC’s work.

Additional information about the specification and the CACAO TC can be found at the TC’s public home page:
https://www.oasis-open.org/committees/cacao/

Additional information related to this public review, including a complete publication and review history, can be found in the public review metadata document [3].

Additional references:

[1] https://www.oasis-open.org/policies-guidelines/ipr/

[2] https://www.oasis-open.org/committees/cacao/ipr.php
https://www.oasis-open.org/policies-guidelines/ipr/#Non-Assertion-Mode
Non-Assertion Mode

[3] Public review metadata document:
https://docs.oasis-open.org/cacao/layout-extension/v1.0/csd01/layout-extension-v1.0-csd01-public-review-metadata.html

The post Invitation to comment on CACAO Layout Extension v1.0 appeared first on OASIS Open.


Digital ID for Canadians

The Crucial Link Between Accessibility and Digital Identity

Author: Marie Jordan from VISA. Additional contributions made by members of DIACC’s Adoption Expert Committee.

In the rapidly evolving landscape of the digital age, the concept of identity has transcended the physical realm and taken root in the digital world. This shift towards digital identities brings about numerous conveniences and efficiencies, but it also presents challenges: ensuring accessibility and equity for all. From online banking to social media profiles, our digital identity is an intricate tapestry that weaves together various facets of our lives. It’s crucial to note that when discussing inclusion, equity, and accessibility in this context, the focus is primarily on individuals who experience physical or cognitive disabilities that may impair their use of technology from the outset.

The importance of accessibility in creating digital identity solutions cannot be overstated. To achieve true inclusivity for this specific group, both the public and private sectors must prioritize accessibility and consider specific principles to safeguard the rights and privacy of individuals with disabilities. In this article, we’ll delve into the significance of accessibility for digital identity and the protection of marginalized communities, outlining key principles for both public and private sectors to consider.

Part 1: The Significance of Accessibility in Developing Digital Identity

Digital identity solutions are central to our modern lives, facilitating everything from accessing healthcare records to participating in online communities. However, these advantages are only fully realized when these systems are accessible to everyone, regardless of their physical or cognitive abilities, including accounting for aging populations. An initial product release that lacks accessibility and proves difficult to use, even if it functions as intended, can erode trust and create negative perceptions.

Universal design: A foundational principle for digital identity solutions is creating systems usable by all individuals, regardless of disability. A universally designed digital identity solution should accommodate a wide range of abilities, modalities of interaction, and preferences, ensuring that everyone can participate in the digital world on equal terms.

Inclusivity in development: Involving individuals with disabilities in the design and testing phases ensures that the final product is genuinely accessible. By including diverse perspectives, developers can identify and rectify accessibility issues early in the development cycle.

Adherence to standards: To ensure accessibility, digital identity solutions must adhere to globally recognized accessibility standards, such as W3C’s Web Content Accessibility Guidelines. These provide a clear set of guidelines for making digital content and applications accessible. Compliance with these standards is crucial for ensuring that digital identities are available and usable for all.

User-centric approach: Developers must seek to understand how individuals with disabilities interact with their application or technology, offering customization options that empower users to adapt the system to their unique needs and requirements. This might include adjustable font sizes, alternative input methods, and compatibility with assistive technologies. They should also be adaptive in their design.

Privacy and security: Privacy and security are paramount in digital identity solutions, and individuals with disabilities may be particularly vulnerable to privacy breaches and identity theft. Implementing robust security measures while maintaining respect for user privacy is essential. This can be achieved through encryption, robust authentication methods, and clear privacy policies. Regular audits and assessments can address the security and privacy practices of digital identity solutions as technology shifts, including vulnerability testing and compliance checks to ensure the highest standards of privacy and security are maintained.

Part 2: Safeguarding the Privacy and Trust of Individuals with Disabilities

To ensure that the privacy and trust of all citizens are safeguarded appropriately, accessible solutions must be designed and delivered with intent. Realizing accessibility also requires a high level of understanding and education, so that individuals can use their identity in digital channels without fear of misuse or exploitation.

Informed consent: Individuals with disabilities should have access to clear and understandable information about how their digital identity data will be used. Obtaining informed consent ensures that users are aware of the risks and benefits of participating in digital identity systems.

Minimal data collection: Users should understand that only the data that is absolutely necessary for the functioning of the digital identity system is being collected. Minimizing data collection reduces the risk of privacy breaches and limits the potential for misuse of personal information.

Transparency in data practices: Transparency should be maintained in data practices. All users must have access to their data and understand how it is being used and processed. Transparency, particularly to historically marginalized communities, builds trust and empowers individuals to make informed decisions about the use of their digital identities.

Accessible privacy settings and controls: Accessible privacy settings and controls that are easy for individuals with disabilities to use must be available. These controls must allow users to manage their data and privacy preferences effectively.

In conclusion, it’s important to recognize that accessibility, inclusion, and equity are multifaceted challenges. While this article focuses on individuals experiencing physical or cognitive disabilities, it’s crucial to acknowledge that there are various barriers to equitable access, including socio-economic factors, digital literacy, and language barriers. By addressing these challenges collectively, we can work towards creating a more inclusive digital world for everyone.


Velocity Network

Marc Jansen joins Velocity’s board

We're pleased to announce that Velocity Network Foundation members have voted Marc Jansen onto the Board of Directors. The post Marc Jansen joins Velocity’s board appeared first on Velocity.


DIF Blog

Guest blog: Jay Prakash, Silence Laboratories

Founded in 2022, Silence Laboratories is a cybersecurity startup enabling adoption of privacy preserving threshold signatures and secure computations through its developer-focused cryptographic stack. The company also organizes Decompute, a conference focused on decentralized security with multiparty computation. 

We spoke to CEO and co-founder Jay Prakash. 

Please introduce yourself, and explain how you developed the idea for Silence Labs 

I did my PhD in Usable Security, that is, security an average user is able to handle, with all the math and complexity hidden. My PhD Supervisor and I found multiple vulnerabilities in existing Two-Factor Authentication (2FA) solutions, which we published and described at various conferences. We thought, “Why not build a company to do this better?”

During this period I spent time in both Singapore and the US. In the process of talking to prospective customers, we realized there was a bit of a mismatch between our original idea and the market need. However, we saw clear demand for decentralized authentication. 

We began meeting with crypto wallet providers. Many were talking about exposure of private keys, which is a common problem. That’s how we landed on Multiparty Computation (MPC) as an area with a lot of commercial potential. 

What are you building? 

We have developed an interactive protocol which allows a group of parties to do mathematical calculations on private data.

For example, the data could be keyshares held by isolated computing nodes trying to calculate the signature for a transaction. The requirement is to produce a valid signature from a predetermined proportion of the nodes in the network, known as t out of n secret sharing. 

It’s a hot problem that we latched onto and started to develop around. 

We expose the protocol in our SDKs and libraries, which customers can use to distribute the signing process and overcome the problem of key exposure. 
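For readers new to the underlying idea, the sketch below illustrates plain t-out-of-n (threshold) secret sharing using Shamir's scheme. It is not Silence Laboratories' protocol (their MPC signing never reconstructs the key in one place), and the names and numbers are purely illustrative; it only shows why any t of n shares suffice while fewer reveal nothing.

```python
# Minimal sketch of t-out-of-n (threshold) secret sharing, the idea behind
# distributed signing: no single node holds the full secret, yet any t of
# the n shares are enough to reconstruct it. Illustrative only; real
# threshold-signature protocols never reassemble the key in one place.
import random

PRIME = 2**127 - 1  # large prime defining the finite field


def split_secret(secret: int, t: int, n: int) -> list:
    """Split `secret` into n shares; any t of them can recover it."""
    coeffs = [secret] + [random.randrange(PRIME) for _ in range(t - 1)]
    # Share i is the degree-(t-1) polynomial evaluated at x = i (mod PRIME).
    return [(x, sum(c * pow(x, k, PRIME) for k, c in enumerate(coeffs)) % PRIME)
            for x in range(1, n + 1)]


def recover_secret(shares: list) -> int:
    """Lagrange-interpolate the polynomial at x = 0 to recover the secret."""
    secret = 0
    for i, (xi, yi) in enumerate(shares):
        num, den = 1, 1
        for j, (xj, _) in enumerate(shares):
            if i != j:
                num = (num * -xj) % PRIME
                den = (den * (xi - xj)) % PRIME
        secret = (secret + yi * num * pow(den, -1, PRIME)) % PRIME
    return secret


if __name__ == "__main__":
    key = 123456789
    shares = split_secret(key, t=3, n=5)
    assert recover_secret(shares[:3]) == key    # any 3 of 5 shares suffice
    assert recover_secret(shares[1:4]) == key
```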

How is your solution being used today? 

Our solution provides a good amount of freedom regarding what policies are set and how keysharing takes place. We provide the tool, but don't dictate how it should be used. 

There are a couple of ways our partners are using it.

One is browser plugin wallets that split the private key between the user’s browser and their phone. 

Another design is to create a network which manages the keys on your behalf. You provide your ID, then the network runs a protocol (for example, requiring 5 out of 10 nodes to sign) to get a valid signature. 

A third design is to do one keyshare from the phone and one from the wallet provider. If our customer is a custodian holding a large volume of assets, they can also split their key between multiple directors and/or employees. 

Are you targeting other market segments, in addition to crypto wallets?  

MPC is a powerful tool that can be used for many purposes.  We’ve been doing a lot of research and development around using it for privacy guarantees. For example, a number of financial institutions hold your financial data. If you now want to take out a loan, the lender needs access to your credit score. Traditionally, credit agencies scrape your data without you knowing and return a score. Your data passes through lots of hands, you have no control over what’s happening to it, it’s aggregated and vulnerable to attack. MPC can radically improve how this is done. 

Another use case is Reg Tech (regulatory technology) including Anti Money Laundering (AML) compliance. To uncover money laundering, you need to collaborate with lots of partners. For example, if I’m a telco and you’re a bank, we can both reduce our risk by computing on the customers’ combined telco and banking history. Reg Tech providers currently can’t share private data with each other, but with privacy guarantees, these protocols can comply fully with GDPR and other applicable regulations. 
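As a toy illustration of that principle (not Silence Laboratories' actual protocol, and with made-up numbers), two institutions can combine additive secret shares so that only the agreed joint result is ever revealed:

```python
# Toy illustration of the general MPC principle: two parties learn an agreed
# function of their combined data without revealing their individual inputs,
# using additive secret sharing. Values and field size are illustrative.
import random

MOD = 2**61 - 1  # arithmetic is done modulo a large prime


def share(value: int) -> tuple:
    """Split a private value into two random-looking additive shares."""
    r = random.randrange(MOD)
    return r, (value - r) % MOD


# Each institution holds a private risk indicator it cannot disclose.
telco_score, bank_score = 42, 17

# Each party keeps one share of its own value and sends the other share to
# its counterpart; neither share alone reveals anything about the input.
t1, t2 = share(telco_score)
b1, b2 = share(bank_score)

# Each party sums the shares it holds locally...
partial_a = (t1 + b1) % MOD
partial_b = (t2 + b2) % MOD

# ...and only the combined result (here, the joint score) is ever revealed.
joint_score = (partial_a + partial_b) % MOD
assert joint_score == telco_score + bank_score
```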

We want to position this like Two-Factor Authentication, which is already well understood by consumers. The intention is that the user experience will be exactly the same. To deliver that, it has to work fast. Right now we have the fastest multiparty signing library in production, around 5 to 10x faster than other solutions. 

Can you unpack the concept of Privacy Guarantees a bit please? 

There’s a big misunderstanding around consent. Typically a service provider creates a super-long consent form and you tick to say you agree. What we are trying to champion is: One, the user interface should be clearer and Two, consent should not be one-time or one-directional. If I want to pull a piece of private data I previously provided, it should be removed from the entire ecosystem. 

To make consent programmable, you need something like Multiparty Computation. MPC allows you to build more powerful and user-centric applications by guaranteeing decentralization of the computation. 

In short, wherever multiple institutions have your data and want to collaborate without exposing your data to each other, that’s where our solution can help. 

What do you see as the value of participating in DIF? 

I heard about the hackathon through someone at DIF. I’ve been quite active on DIF’s Slack channel and hope to engage more formally soon. 

I see two opportunities for Silence Labs. One is collaboration with others focused on similar topics, for example through a DIF working group. The other is about driving awareness. There’s little inherent ‘pull’ for privacy from companies, as they believe it’s just about compliance. But multiple surveys show there are business benefits too. For example, one survey showed that banks offering privacy guarantees can provide twenty percent more loans with less overall risk. 

Last year we organized a conference, Decompute, where DIF was one of the partners. The event is happening again this year (in Singapore on 17 September) and we’re also interested in running an event in London. We see this as an opportunity to drive much more engagement from the decentralized identity community, as well as wider awareness beyond it. 

Wednesday, 24. January 2024

The Rubric

DIDs for Any Crypto (did:pkh, Part 2)

did:pkh is the minimalist multi-blockchain DID method, designed to work with any blockchain with minimal fuss. Today we talk with two of the authors–and implementers–of did:pkh, Wayne Chang and Joel Thorstensson.    References 3Box Labs https://3boxlabs.com/   Ceramic Network https://ceramic.network/  Chain Agnostic Improvement Proposals (CAIP) https://github.com/ChainAgnostic/CAIPs  Chain Agnostic Standards Alliance (CASA) https://github.com/ChainAgnostic/CASA   DID Directory https://diddirectory.com/  did:ens...

DIDs for Any Crypto (did:pkh, Part 1)

did:pkh is the minimalist multi-blockchain DID method, designed to work with any blockchain with minimal fuss. Today we talk with two of the authors–and implementers–of did:pkh, Wayne Chang and Joel Thorstensson.    References 3Box Labs https://3boxlabs.com/   Ceramic Network https://ceramic.network/  Chain Agnostic Improvement Proposals (CAIP) https://github.com/ChainAgnostic/CAIPs  Chain Agnostic Standards Alliance (CASA) https://github.com/ChainAgnostic/CASA   DID Directory https://diddirectory.com/  did:ens...
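As background on the method itself, a did:pkh identifier simply wraps a CAIP-10 blockchain account identifier (namespace:reference:address) in a DID. A minimal sketch, with illustrative account values:

```python
# Minimal sketch of how a did:pkh identifier is formed: it prefixes a CAIP-10
# account id (namespace:reference:address) with "did:pkh:". The addresses
# below are illustrative examples, not real accounts.

def did_pkh(namespace: str, chain_reference: str, address: str) -> str:
    """Build a did:pkh DID from the parts of a CAIP-10 account identifier."""
    return f"did:pkh:{namespace}:{chain_reference}:{address}"


# An Ethereum mainnet account (CAIP-2 chain id "eip155:1")...
print(did_pkh("eip155", "1", "0xb9c5714089478a327f09197987f16f9e5d936e8a"))
# ...and a Bitcoin mainnet account follow the same pattern with a different namespace.
print(did_pkh("bip122", "000000000019d6689c085ae165831e93", "128Lkh3S7CkDTBZ8W7BbpsN3YYizJMp8p6"))
```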

Velocity Network

Velocity’s Etan Bernstein features in Polygon webinar

Velocity's co-founder and Head of Ecosystem Etan Bernstein joins a panel hosted by Polygon ID on “The Future of Digital Identity: Identity Ecosystem”. The post Velocity’s Etan Bernstein features in Polygon webinar appeared first on Velocity.

FIDO Alliance

FIDO Alliance Announces Call for Speakers for Authenticate 2024

Carlsbad, Calif., January 24, 2024 – The FIDO Alliance is pleased to announce the return of Authenticate, the only industry conference dedicated to all aspects of user authentication – including a focus on FIDO-based sign-ins with passkeys. 

Authenticate 2024, featuring signature sponsors Google, Microsoft, and Yubico, will be held October 14-16, 2024 at the Omni La Costa Resort & Spa in Carlsbad, CA, just north of San Diego. Information on submitting a speaking proposal is available on the event website.

Aimed at CISOs, security strategists, enterprise architects, and product and business leaders, this is the fifth consecutive year that the FIDO Alliance is hosting the public conference. The annual event is specifically designed to share education, tools, and best practices for modern authentication across web, enterprise, and government applications. 

Last year’s conference welcomed over 850 total attendees in Carlsbad and remotely. The event featured more than 100 sessions with highly engaging content, plus a sold-out exhibit area with 50+ industry-leading exhibitors and sponsors.

Authenticate 2024 will build upon this momentum and feature detailed case studies, technical tutorials, expert panels, and hands-on lab sessions aimed at helping educate attendees on business drivers, technical considerations, and overall best practices for deploying modern authentication systems. Attendees also benefit from a dynamic expo hall and engaging networking opportunities that tap into the natural beauty of Carlsbad and the La Costa Resort. 

Authenticate 2024 Call For Speakers

With today’s announcement, the Authenticate 2024 program committee has opened its call for speakers. Authenticate provides speakers with an opportunity to increase their industry reach and visibility by educating attendees on in-market approaches for deploying modern authentication solutions.

The committee is looking for vendor-neutral, educational presentations that focus on authentication implementations and best practices for specific steps of the passwordless journey from the service provider perspective for consumer and workforce rollouts across regulated and non-regulated industries. 

Submissions can span all aspects of authentication implementations from initial research and business case development through piloting to rollout and beyond. Perspectives on global trends and considerations for user authentication and topics closely related to user authentication and account lifecycle management will also be considered. 

The committee is looking for a variety of session types and formats including main stage market perspectives, detailed case studies, technical tutorials, hands-on labs, and thought provoking panels. Experienced and new speakers alike are encouraged to submit proposals.

Submissions that are unique, expertise-driven, and reflect diversity in speakers are most likely to be accepted. Product and sales pitches will not be accepted.

The Authenticate Call for Speakers closes on March 4, 2024. To submit an application, please visit https://authenticatecon.com/authenticate-2024-call-for-speakers/

Sponsorship Opportunities at Authenticate 2024 

Authenticate 2024 offers sponsors a wide range of opportunities to provide broader brand exposure, lead-generation capabilities, and a variety of other benefits for both on-site and remote attendees. Authenticate is currently accepting applications for sponsorship from FIDO Alliance members and will open to the industry at large on February 2, 2024. Sign up for the Authenticate newsletter to receive sponsorship information when it becomes publicly available.

Sponsorship requests will be filled on a first-come, first-served basis; requests for sponsorship should be sent to authenticate@fidoalliance.org.

Signature sponsors for the 2024 event are Google, Microsoft, and Yubico.

About Authenticate

Hosted by the FIDO Alliance, Authenticate is the industry’s only conference dedicated to all aspects of user authentication – including a focus on FIDO-based sign-ins with passkeys. It is the place for CISOs, business leaders, product managers, security strategists and identity architects to get all of the education, tools and best practices to roll out modern authentication across web, enterprise and government applications.

Authenticate 2024 will be held October 14-16, 2024 and will be co-located with the FIDO Alliance’s member plenary (running October 14-17) at the Omni La Costa Resort in Carlsbad, CA, just north of San Diego. The conference will feature ample space for a rapidly growing audience, a variety of session types to appeal to all levels, and its most dynamic expo hall yet for companies bringing passwordless to fruition – as well as added networking opportunities. 

Whether you are new to FIDO, in the midst of deployment or somewhere in between, Authenticate 2024 will have the right content – and community – for you. 

Visit www.authenticatecon.com for more information and follow @AuthenticateCon on Twitter. To receive updates about Authenticate events, speaking and sponsorship opportunities, sign up for the newsletter.

Authenticate Contact

authenticate@fidoalliance.org   

PR Contact 

press@fidoalliance.org


Next Level Supply Chain Podcast with GS1

Enhancing Consumer Experiences in the Digital Age

Anne Bernier from Topco and Rishi Banerjee from the Consumer Brands Association bring a wealth of knowledge from the frontlines of industry innovation. They join Reid and Liz on this episode to delve into the world of QR codes, the groundbreaking Smart Label program, and new technologies such as Flash Food and Too Good to Go, two platforms looking to reduce food waste. 

Anne shares how Topco is revolutionizing the grocery space with smart label initiatives championing transparency and sustainability. Discover how Topco aids its members in competing in the marketplace by providing top-quality products, ensuring reliable supply, and offering competitive pricing—all while maintaining a steadfast commitment to environmental stewardship.

Rishi Banerjee discusses the Smart Label program, its expansion beyond food products, and how this initiative is setting the industry standard for sharing product data and enhancing customer safety in a standardized format. Listen in to hear all about these new technologies and how their teams are enhancing the consumer experience in a digital age. 

 

Key takeaways: 

Discover how Smart Label technology is revolutionizing consumer transparency in the CPG industry, providing comprehensive product information through QR codes, and how companies like Topco integrate this to strengthen consumer trust and drive sustainability efforts.

Learn from industry leaders about the intersection of sustainability and technology, including innovative apps combatting food waste and the critical advancements in renewable energy sectors, positioning these insights as pivotal for supply chain professionals to adapt and lead in the evolving market landscape.

Gain strategic perspectives on the significance of omnichannel messaging and the compelling storytelling for private label products from Topco and Consumer Brands Association experts, underlining the necessity for retail professionals to leverage digital platforms for enhanced consumer engagement.

 

Connect with GS1 US:

Our website - www.gs1us.org

GS1US on LinkedIn

 

Connect with the guest:

Follow Rishi Banerjee on LinkedIn

Follow Anne Bernier on LinkedIn

Check out Consumer Brands Association

Check out Topco

 


Human Colossus Foundation

Unraveling the Path to Genuine Semantic Interoperability across Digital Systems - Part 1

Exploring decentralized morphological semantics to address the complexities and solutions for achieving semantic interoperability in data exchange.
Part 1: Semantic Interoperability Introduction

Imagine you're a data scientist working in a multi-disciplinary team. Your day-to-day involves exchanging complex datasets and intricate models with partners and collaborators globally. However, every exchange feels like a hurdle - inconsistencies, inefficiencies, and a lack of synergy often arise when people need to interpret data. You ask yourself, "Isn't there a more seamless way to ensure that data models carry the same meaning for everyone involved, regardless of the system or platform they're using?"

"Data semantics" refers to the interpretation and meaning of data, and "semantic interoperability" is the ability of computer systems to exchange data with unambiguous, shared meaning. Semantic interoperability is a common requirement to enable machine computable logic, inferencing, knowledge discovery, and data federation between information systems.

Achieving genuine semantic interoperability is a central yet challenging goal in the complex data management and exchange landscape. This post explores decentralized semantics, a vital solution aiming to transcend the limitations of traditional methods. As we delve into data semantics, we touch upon Overlays Capture Architecture (OCA), a decentralized metadata framework for defining the semantic composition of digital objects, enabling the creation of interoperable schemas, and facilitating data harmonization. We aim to showcase how decentralized semantic structures enhance data exchange and ensure the transmission of intrinsic meaning and interpretability, paving the way for more robust and meaningful interactions between digital systems and challenging the prevailing notion that semantic alignment between ontologies and knowledge graphs is the solution to semantic interoperability.

Data Semantics Demystified

Data semantics includes "morphological semantics" [1] and "epistemological semantics" [2]. Imagine morphological semantics as the blueprint of a dataset. It pertains to the textual attributes that outline the concrete structure of data, akin to how architectural plans delineate the formation of physical structures. This branch of semantics deals with the 'objects' - tangible elements like data structures, systemic variables, functions, or methods. Think of it as the compositional elements that give structures shape and functionality.

In contrast, epistemological semantics focuses on 'concepts,' analyzing contextual terms and relationships to build knowledge. It mirrors an abstract idea's cultural or communal interpretation, where traditions and norms provide life and meaning to physical structures.

Here's a breakdown:

Morphological Semantics: Picture a complex machine. Every gear, lever, and pulley (the "Objects") has a specific role and structure, defined by its physical and functional attributes. Morphological semantics is the study of these tangible elements and their interconnected parts.

Example: In a sentence, words and their arrangements (the 'objects') carry specific meanings based on their structure and syntax.

Epistemological Semantics: Imagine the operator's manual explaining how to use the machine effectively or the operational training a worker undergoes. "Concepts" encompass this contextual enhancement, deriving understanding from contextual and subjective knowledge.

Example: In understanding a sentence, the meaning isn't just in the words but in the context, tone, and relationship between those words and the broader conversation or text.

These two intertwining branches constitute data semantics, offering a comprehensive semantic representation of tangible 'objects' and intangible 'concepts.'

Figure 1. Visualizing the Intersection of Objects and Concepts in Data Semantics.

Data Semantics: Semantics is the branch of linguistics and logic concerned with meaning, and in the context of data, it encompasses objects and concepts. Data semantics involves creating a systematic plan or arrangement for data and defining how it should be structured and interpreted to achieve specific objectives. It establishes a blueprint for understanding data, ensuring it is used effectively and aligned with the intended goals. Data semantics provides the guiding framework for data, focusing on purposeful representation and how it’s organized.

Morphological Semantics Demystified

Morphological semantics stands at the forefront of driving semantic interoperability, which is crucial to ensuring seamless and meaningful communication across diverse systems. Unlike its epistemological counterpart, decentralized morphological semantics focuses on 'objects,' representing them as layered structures composed of interoperable overlays. This nuanced representation is pivotal, transforming static structures into dynamic schemas that can seamlessly traverse distributed data ecosystems.

The harmonic interaction between disparate systems in data exchange is not just about transferring data but preserving its innate, objectual meaning. Decentralized semantics facilitates this by encapsulating that meaning within meticulously crafted overlays, each layer adding depth and richness to the semantic structure, ensuring the data's integrity remains unscathed during transfer. Every piece of textual information enhances the interpretability of the object. This process transforms primary data exchange into more profound, more meaningful communication. It aligns with the data's core nature and enables semantic interoperability at scale.

Figure 2. Visualizing an Object as a layered structure of interoperable overlays.

Overlays Capture Architecture (OCA) offers a decentralized metadata framework for defining the semantic composition of digital objects, enabling the creation of interoperable schemas and facilitating data harmonization.
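As a rough illustration of that layered structure (a simplified sketch, not the exact OCA serialization; the attribute names and overlay contents are invented for the example), an object's schema can be pictured as a stable capture base plus task-specific overlays:

```python
# Simplified sketch (not the exact OCA serialization) of the layered idea:
# a stable "capture base" defines the object's attributes, and task-specific
# overlays each add one kind of meaning on top of it. Attribute names and
# overlay contents below are invented for illustration.
capture_base = {
    "type": "capture_base",
    "attributes": {
        "patient_id": "Text",
        "birth_date": "DateTime",
        "weight_kg": "Numeric",
    },
}

overlays = [
    {   # human-readable labels, one overlay per language
        "type": "label_overlay",
        "language": "en",
        "attribute_labels": {"patient_id": "Patient ID",
                             "birth_date": "Date of birth",
                             "weight_kg": "Weight (kg)"},
    },
    {   # expected formats, so systems can validate and render consistently
        "type": "format_overlay",
        "attribute_formats": {"birth_date": "YYYY-MM-DD", "weight_kg": "0.0"},
    },
    {   # flags attributes that may hold personally identifiable information
        "type": "sensitive_overlay",
        "flagged_attributes": ["patient_id", "birth_date"],
    },
]

# The schema exchanged between systems is the capture base plus whichever
# overlays a given task needs; the base stays untouched as overlays evolve.
schema_bundle = {"capture_base": capture_base, "overlays": overlays}
```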

The Epistemic Shortfall in Achieving Semantic Interoperability

In knowledge representation, "epistemic" pertains to knowledge acquisition and understanding. It is derived from the study of epistemology and involves the logic and concepts associated with how knowledge is processed and understood. An "epistemic shortfall" in semantic interoperability indicates a gap where traditional methods fail to represent and share complex knowledge across various computational systems effectively. This shortfall highlights a challenge in preserving the depth and breadth of knowledge when it is exchanged or transferred among diverse platforms.

Although epistemological semantics does not directly contribute to interoperability in a technical sense, it is essential for ensuring that the data’s meaning is accurately interpreted and utilized. This aspect is crucial in complex domains where data interpretation is as critical as data exchange. Even though uniform nodal relationships enhance clarity for generic comprehension, the inherent nature of knowledge representation does not drive semantic interoperability. While knowledge graphs reveal the connections between related entities, they obscure the data's depth and evolutionary journey.

Within knowledge graphs, nodal relationships must remain generic to aid comprehension and ensure uniform understanding of the subjective concepts they express. OCA enables dynamic semantic interoperability by distinctly separating the relationships of textual elements within a schema. Epistemological semantics enriches context via the intricate relationships delineated in ontologies and knowledge graphs. However, morphological semantics facilitates semantic interoperability through the inherent properties of the objects rather than the abstract concepts that house them.

OCA: The Beacon of Interoperability

As a scalable architecture, OCA streamlines the creation of each overlay and the sum of their connected parts, ensuring rich and comprehensive object representation. This foundational step enhances object utilization within assigned knowledge graphs. It's not merely about establishing relationships but understanding each nodal entity's genesis, evolution, and contextual relevance. OCA elevates semantic interoperability, making it a stable architecture for those pursuing authentic, meaningful, and robust data integration and interpretation.

Conclusion

Enhanced Data Interpretation: OCA ensures a richer data interpretation, capturing the detailed nuances often missed by traditional epistemic models.

OCA, grounded in “Decentralized Semantics,” addresses the limitations of epistemic models that often miss the intricate details of the data they represent. OCA is a robust solution that achieves semantic interoperability by providing a metadata framework for separating structural tasks into task-specific objects that, when combined, provide a digital representation of a complex object while ensuring that those components remain intact during data exchange.

The integration of OCA at the point of data capture promises a future where data is abundant, meaningful, and context-rich. This enhanced capability is pivotal for the evolution of AI and cross-industry data communication, ensuring that collected data retains its intrinsic value and meaning.

That concludes Part 1 of this series. In Part 2, we'll examine the dynamic union of stemmatics and Directed Acyclic Graphs (DAGs) and their role in bolstering data integrity. We'll see how merging these time-honored methods with cutting-edge technology creates a strong foundation that verifies and preserves the authenticity of data. This innovative approach safeguards the evolutionary aspects of data integrity across digital platforms and enhances true semantic interoperability.

Link to Genuine Semantic Interoperability across Digital Systems - Part 2: Stemmatic Traceability

References

[1] Acquaviva, P. (2016). Morphological Semantics. In A. Hippisley & G. Stump (Eds.), The Cambridge Handbook of Morphology (Cambridge Handbooks in Language and Linguistics, pp. 117-148). Cambridge: Cambridge University Press. doi:10.1017/9781139814720.006

[2] Rattan, Gurpreet, and Journal of Philosophy Inc. “Epistemological Semantics beyond Irrationality and Conceptual Change.” Journal of Philosophy 111, no. 12 (2014): 667–88. doi:10.5840/JPHIL20141111244.


Digital Identity NZ

Will 2024 move the dial? | January Newsletter

Kia ora e te whānau

Ngā mihi o te Tau Hou! And I wish you all a trustful and prosperous one.

Will 2024 move the dial on Digital Trust in Aotearoa? This may seem provocative, but it’s not intended to be. It’s a reasonable question, given our economic and budgetary challenges coupled with Canadian evidence of a decrease in internet trust likely reflected globally. This means Aotearoa has to put in a mighty team effort with slim resources to reverse the trend, and to do so in the year that the Digital Identity Trust Framework regulation comes into force and, hopefully, portable verifiable credentials that form part of it become the norm in our daily digital lives. We will only move the dial if public sector agencies and industry genuinely rally around our shared issues and collectively inform and educate people on why these issues matter. Alongside these efforts, it is important to maintain non-digital channels for those who can’t, don’t feel confident with, or choose not to use digital.

While I’ve drawn on the above examples, there are other industry sectors with their requisite frameworks, concepts, regulations and codes of practice. They may not be termed ‘Trust Frameworks’ but in most practical senses they are. For instance, consider the Know Your Customer (KYC) component of the Anti-Money Laundering and Countering Financing of Terrorism (AML/CFT) regulations applying to financial institutions and our friends at PaymentsNZ leveraging its API centre to support adoption of open banking. Or consider farm sourced product supply chain provenance promulgated through verifiable credentials issued to a digital farm wallet by our friends at Trust Alliance NZ. These span private and public sectors, industries and across international boundaries. They are ecosystems and networks performing critical purposes that – if the vision for a frictionless, safe, privacy-aware, trusted digital future that most want to see is to be realised – must work seamlessly with each other and be trusted by people, institutions and governments globally.

The fact is our existing centralised and federated digital identity systems with their requisite username/password processes that we started our digital lives with, are increasingly less fit for purpose. They are inconvenient to use, restrict our control over data about us, and provide opportunities for bad actors to steal our data, scam us, take over our accounts and exploit us as ‘products’, causing pain, stress and distrust.

Of the new, portable, reusable, more secure and privacy-aware emerging systems, components such as verifiable credentials and passkeys that eliminate the use of passwords, which DINZ member Authsignal has just implemented for DINZ member Air New Zealand, are mature and usable today. DINZ itself has preliminary ideas to introduce its members to Decentralised Identifiers (DIDs), a key component of verifiable credentials, as explained in this whitepaper by our member Microsoft. But to use them, digital systems operated by all sectors need to be able to accept, and in many cases issue, them. Australia’s Steve Wilson, industry commentator and speaker at our 2022 Digital Trust Hui Taumata, discusses the digital wallet metaphor and its relationship to the other components here.

While humans and enterprises are naturally resistant to change due to fear, uncertainty, doubt, effort, focus and money, Aotearoa has a great record of embracing change. Look how we took to EFTPOS and more recently Apple Pay, Google Pay and our smartphone banking apps. Look how we’ve learned to use QR codes to access buildings or even collect green fees at golf courses! These ‘new era’ tools can hide technical complexity to deliver the convenient, secure, consented and privacy-respecting digital experiences we are entitled to expect when we are asked to confirm the minimal set of ‘identiful’ things needed to establish trust before we transact. It’s not just about you personally and it’s not just one-way trust; it is trusting each other’s devices, organisations and websites as we go about our daily lives. Sponsoring aspects of Digital Identity NZ’s mahi will help raise public awareness and knowledge to ultimately deliver tangible products and services in support.

However, moving the dial on digital trust in 2024 will take more than ‘a coalition of the willing’ while the majority sit back. It will take a concerted, orchestrated effort from the coalition of five million. No person or entity is immune to the potential threats and no-one should be limited in taking the opportunities digitalisation brings.

To make progress, we need to step out of our comfort zone. Organisations should use new tools for their services, and people need to be willing to try them.

Ngā mihi nui
Colin Wallis
DINZ Executive Director

PS:  If you have member news to share, please tag us online or email us. Please consider supporting our mahi. Read more.

Read full news here: Will 2024 move the dial? | January Newsletter

SUBSCRIBE FOR MORE

The post Will 2024 move the dial? | January Newsletter appeared first on Digital Identity New Zealand.

Tuesday, 23. January 2024

Velocity Network

Michelle Ainsworth joins Velocity’s board

We're delighted that Michelle Ainsworth has been voted on to Velocity Network Foundation's Board of Directors. The post Michelle Ainsworth joins Velocity’s board appeared first on Velocity.

MOBI

State of Web3 in 2024

State of Web3 in 2024

Insights into the past, present, and future of Web3

When the term “Web3” was first coined, it symbolized the dawn of a new era: a decentralized, user-owned, privacy-first digital economy. It promised smarter, more seamless communication and business practices, and the unlocking of new revenue streams through trusted data. Now, in 2024, we find that the limitations of the old Web2 paradigm — characterized by data silos, interoperability issues, and escalating trust costs — are more apparent than ever. These challenges underscore the need for a new approach that can cater to our increasingly complex global ecosystem.

Surge in digital identity fraud is a major problem for financial services, research reveals | IBS Intelligence
JPMorgan suffers wave of cyber attacks as fraudsters get ‘more devious’ | Financial Times

Understanding Web3

What is Web3? Is it blockchain? Cryptocurrency? The Metaverse? In truth, ‘Web3’ is more or less a blanket term that represents a broad vision, rather than pinpointing a specific technology or application. This vision extends beyond the boundaries of single entities, encompassing entire value chains to foster a unified digital ecosystem. As a result, the use of blockchain or crypto alone is not enough to constitute Web3 in a meaningful sense. At its core, Web3 is about trust and cross-industry interoperability — seamless interconnection and cooperation among diverse systems, organizations, and industries, with trusted identities and verifiable transactions. Web3’s essence lies in the ability to create a decentralized network where data ownership and control are democratized, unlike the centralized structures of traditional blockchain and cryptocurrency systems.

Central to this evolution is the World Wide Web Consortium (W3C) Decentralized Identifiers (DIDs) and Verifiable Credentials (VCs) standards. These technologies provide the foundation for a standardized method of verifying identities and claims, crucial in establishing digital trust and ensuring seamless interoperability across various enterprise environments. Web3, bolstered by the capabilities of DIDs and VCs, is paving the way for a future where digital interactions are more secure, transparent, and user-driven. Its successful adoption and scaling in the enterprise domain hinge on embracing this shared approach to break down silos and foster a unified digital ecosystem. Learn more about how DIDs and VCs serve as key enablers for Web3
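To make those building blocks concrete, the sketch below shows the general shape of a W3C Verifiable Credential whose issuer and subject are identified by DIDs. All identifiers and the credential type are placeholders, and the proof is abbreviated:

```python
# General shape of a W3C Verifiable Credential in which both issuer and
# subject are identified by DIDs. All identifiers and values below are
# placeholders; the proof is abbreviated rather than a real signature.
verifiable_credential = {
    "@context": ["https://www.w3.org/2018/credentials/v1"],
    "type": ["VerifiableCredential", "BatteryPassportCredential"],
    "issuer": "did:example:manufacturer123",
    "issuanceDate": "2024-01-23T00:00:00Z",
    "credentialSubject": {
        "id": "did:example:battery456",           # the entity the claims are about
        "stateOfHealth": "97%",
        "countryOfOrigin": "DE",
    },
    "proof": {
        "type": "DataIntegrityProof",
        "verificationMethod": "did:example:manufacturer123#key-1",
        "proofValue": "...",                      # signature omitted in this sketch
    },
}

# A verifier resolves the issuer's DID to obtain its public key, checks the
# proof, and can then trust the claims without contacting the issuer directly.
```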

Perspectives on the Adoption of Verifiable Credentials | Digital ID & Authentication Council of Canada
Digital Identity: Leveraging the SSI Concept to Build Trust | ENISA

Solving Challenges in Driving Enterprise Web3 Adoption

The path to widespread enterprise adoption of Web3 has been slow, with efforts often limited to proofs-of-concept. This hesitance can be attributed to several factors:

Lack of Understanding: For many, the concept of Web3 remains shrouded in confusion, with terms like blockchain, NFT, and cryptocurrency often used, mistakenly, as synonyms for Web3. This conflation not only obscures the true potential of Web3 but also impedes its integration into mainstream applications.

Lack of Standardization: Lack of standardization in Web3 leads to a ‘wild west’ scenario, discouraging many companies from attempting implementation. Those who do venture into Web3 face fragmentation, as the use of varied protocols and blockchain platforms essentially results in the same Web2 siloing issues that Web3 aims to address.

Scalability Concerns: Challenges with network congestion and transaction throughput have made scalability a critical issue.

MOBI’s Web3 Infrastructure: Trusted Identities and Verifiable Transactions for Cross-Industry Interoperability and Commerce

MOBI is working alongside our members to build a federated Web3 Infrastructure — Citopia and the Integrated Trust Network (ITN) — to create a privacy-preserving decentralized marketplace for digital commerce. Leveraging standards from MOBI, W3C, ISO, IEEE, SAE, and more, Citopia and the ITN together offer standardized communication protocols for cross-industry interoperability. This unified framework serves as a common language, enabling organizations with diverse business processes and legacy systems to efficiently coordinate and communicate without having to build and maintain new infrastructure. Organizations can securely share data and verify claims throughout the entire value chain, overcoming the barriers posed by traditional, centralized Web2 systems.

In our commitment to demonstrating the scalability and effectiveness of this infrastructure, MOBI and its members have conducted several successful pilots, including:

A vehicle emissions monitoring test with the European Commission in 2021.
A Transit IDEA-funded Citopia MaaS pilot in 2022-23.

These pilots have yielded outstanding results, affirming Citopia and the ITN’s capacity to handle large-scale operational demands with robustness and efficiency.

To accelerate Web3 understanding and adoption, MOBI is also providing comprehensive educational courses through MOBI Academy, aimed at demystifying Web3 concepts and their practical applications. Explore MOBI Academy

Looking Ahead: 2024 and Beyond

As we forge ahead into 2024, our focus at MOBI remains steadfast on enhancing Citopia and the ITN for more effective enterprise adoption and scalability. We are actively exploring the transformative potential of this technology across various industry verticals. Collaboration is at the core of our strategy, as we work hand-in-hand with our partners and other consortia — including the Global Battery Alliance, Catena-X, Battery Pass, NADA, and AAMVA — to address critical pain points and refine our solutions.

In our commitment to fostering a dynamic and collaborative community, next month marks a significant milestone. We are excited to host MoCoTokyo (19 Feb 2024) in partnership with Amazon Web Services (AWS). This event will bring together industry leaders and innovators to delve deeper into the initiatives of our Circular Economy and the Global Battery Passport (CE-GBP) Working Group. Additionally, we’re hosting a members-only workshop in Tokyo (20 Feb 2024) in partnership with DENSO to facilitate vital discussions surrounding the Global Battery Passport, including laying out a roadmap for the implementation of a cross-industry Minimum Viable Product. Register for MoCoTokyo and our Members-Only Workshop

The post State of Web3 in 2024 first appeared on MOBI | The New Economy of Movement.


DIF Blog

DIF Hackathon drives skills, community engagement and participation

The first DIF-sponsored hackathon wrapped up with a Meet The Winners Twitter Space earlier this month which highlighted how the event has supported skills development, engagement with the decentralized identity community and participation in DIF’s work.  

Five winners shared the main DIF-funded $10,000 prize pool, with additional prizes scooped by winners of challenges sponsored by DIF members TBD (Block), Trinsic, Polygon ID and Ontology. 

Participants included people both new to and experienced with decentralized identity. 

“The number of submissions and the breadth of ideas, projects and levels of experience on display exceeded our expectations. We are particularly pleased that so many projects used DIF work items. A number of people from the web3 space used Decentralized Identifiers and Verifiable Credentials for the first time, thanks to some interesting challenges from our sponsors, and many participants explored the exciting possibilities of Decentralized Web Nodes” said Limari Navarette, DIF’s Senior Director of Community Engagement.

Several winners of the main DIF prize pool are currently studying at university, and submitted research projects that combined DIF work items and sponsors’ SDKs to create sophisticated solutions to real-world problems. 

“We are thrilled to have helped so many talented young developers upskill around decentralized identity, which also expands the talent pool available to our members” Limari added. 

Participants, including those with significant experience working with decentralized identity technology, said the hackathon has helped them feel more connected to the community, with the technical masterclasses, Office Hours support and Discord server all singled out for praise.  

The hackathon has also sparked new participation in DIF, including two proposed new working groups, plus new work items for established working groups. 

Here’s a flavor of the insights shared by DIF Hackathon winners during the Twitter Space on 10th January:

“This was my first time working with DIDs and DWNs. I fell in love with how revolutionary this is, and the possibilities for applications” Harsh Tyagi, HealthXProtocol - Third Place, DIF Main Prize Pool & TBD Prize Pool.

“This project is inspired by an effort I'm involved in to build a trust framework for the home buying and selling industry in the UK. It aims to make it safe and easy for individuals and organizations to share data during the home buying and selling process”. Edward Curran, TrustBox - Honorable Mention, DIF Main Prize Pool.

“I’ve been in the automotive space for a while, removing friction for businesses. Otto told me about the hackathon and I deployed my team just a week before submissions closed. I came up with a use case and challenged the team with it, because I wanted them to learn and develop around primitives such as VCs, since we’d like to use them in our own product”. Aaron Wuchner, Titleblox - Third Place, Trinsic Prize Pool. 

“Pop up events at big conferences present a challenging use case. All current solutions are “Prove this at home with your private key, then go”. But if you're at a conference and an event pops up, how do you prove ownership of an NFT? I didn't want our members walking around the conference with the private keys to their NFTs in order to sign a transaction to get in the door. I came to the hackathon specifically to learn Polygon ID and to use its privacy features to disconnect “Am I a member of the DAO?” from “What's my crypto wallet address” and my potential crypto net worth at a public event.” Bob Black, ATX DAO - First place, Polygon ID Prize Pool. 

DIF Main Prize Pool winners

First Prize: Decentralinked

Second Prize: Anonymous Door Unlocking

Third Prize: HealthX Protocol

Honorable Mention: TrustBox 

Honorable Mention: Mail5

Click here for a full list of prize winners. 

Monday, 22. January 2024

The Engine Room

Working to strengthen the information ecosystem in Latin America? Get free tech and data support 

We're especially offering support for groups who are advocating for access to information for their community, working to build healthier info systems, or monitoring the impacts of info disorder.

The post Working to strengthen the information ecosystem in Latin America? Get free tech and data support  appeared first on The Engine Room.


Content Authenticity Initiative

How a voice cloning marketplace is using Content Credentials to fight misuse

It has become easier for people to create and deploy deepfakes at scale. Learn how Respeecher has embraced transparency in AI by integrating Content Credentials into their audio marketplace.

By Coleen Jose, Head of Community, CAI

Taylor Swift isn’t selling Le Creuset cookware. However, deepfake advertisements recently appeared that used video clips of the pop star and synthesized versions of her voice to look and sound as if she were doing a giveaway offer. Unsuspecting fans and others who trusted Swift’s brand clicked the ads and entered their personal and credit card information for a chance to win.

Public figures are prime targets for scams that are being created and scaled with help from voice cloning and video deepfake tools.

In July 2023, a video of Lord of the Rings star Elijah Wood appeared on social media. Originally created with Cameo, an app that lets people purchase personalized video messages from celebrities, the clip featured Wood sharing words of encouragement with someone named Vladimir who was struggling with substance abuse. “I hope you get the help that you need,” the actor said.

Soon after, it emerged that the heavily edited video was part of a Russian disinformation campaign that sought to insinuate that Ukrainian President Volodymyr Zelensky had a drug and alcohol problem. At least six other celebrities, including boxer Mike Tyson and Scrubs actor John McGinley, unwittingly became part of the scheme. Despite clumsy editing, the videos were shared widely on social platforms.


Now, imagine if the level of effort required to create these videos dropped to zero. Instead of commissioning content on Cameo, one could choose a celebrity or obtain enough video content of an individual and use AI to make them say or do anything.

This challenge is intensifying with the spread of AI-enabled audio and voice cloning technology that can produce convincing deepfakes. Governments have started taking steps to protect the public; for example, the US Federal Trade Commission issued a consumer warning about scams enabled by voice cloning and announced a prize for a solution to address the threat. Both the Biden administration and the European Union have called for clear labeling of AI-generated content. 

These scams and non-consensual imagery, which range from fake product endorsements to geopolitical campaigns, are increasingly targeting people who aren’t famous singers or politicians.

A commitment to verifiable AI-generated audio

Dmytro Bielievstov knows that top-down regulation isn’t enough. In 2017, along with Alex Serdiuk and Grant Reaber, the Ukraine-based software engineer co-founded Respeecher, a tool that allows anyone to speak in another person's voice using AI. “If someone uses our platform to pretend to be someone they are not, such as a fake news channel with synthetic characters, it might be challenging for the audience to catch that because of how realistic our synthetic voices are,” he said.

Keenly aware of the potential for misuse, the Respeecher team has embraced the use of Content Credentials through the Content Authenticity Initiative’s (CAI) open-source tools. Content Credentials are based on the C2PA open standard to verify the source and history of digital content. With an estimated 4 billion people around the world heading to the polls in over 50 elections this year, there’s an urgent need to ensure clarity about the origins and history of the visual stories and media we experience online. And there’s momentum for C2PA adoption across industries and disciplines.

We recently spoke with Dmytro about how Respeecher implemented Content Credentials, his tips for getting started, and why provenance is critical for building trust into the digital ecosystem.

This interview has been edited for length and clarity.

How would you describe Respeecher?

Respeecher lets one person perform in the voice of another. At the heart of our product is a collection of high-quality voice conversion algorithms. We don't use text as input; instead, we utilize speech. This enables individuals to not just speak but also perform with various vocal styles, including nonverbal and emotional expressions. For example, we enable actors to embody different vocal masks, similar to how they can put on makeup. 

Who are you looking to reach with the Voice Marketplace?

Initially, our focus was on the film and TV industry. We’ve applied our technology in big studio projects, such as Lucasfilm’s limited series “Obi-Wan Kenobi,” where we synthesized James Earl Jones’ voice for Darth Vader. 

Now, we’re launching our Voice Marketplace, which democratizes the technology we’ve used in Hollywood for everyone. The platform allows individuals to speak using any voice from our library. This will enable small creators to develop games, streams, and other content, including amateur productions. While we’ll maintain oversight of content moderation and voice ownership, we’ll now have less control over the content’s destination and usage. 

What motivated you to join the Content Authenticity Initiative?

First, the CAI aids in preventing misinformation. We do not allow user-defined voices (say President Biden’s), as controlling that would be quite difficult. Still, users should know that the content they’re consuming involves synthetic speech. As a leader in this space, we want to ensure that all generated content from our marketplace has Content Credentials. In the future, as all browsers and content distribution platforms support data provenance, it will be easy for consumers to verify how the audio in a video was produced. If something lacks cryptographic Content Credentials, it will automatically raise suspicion about authenticity. 

Secondly, the initiative addresses the needs of content creators. By embedding credentials in the audio they produce, they can make it clear that the work is their own artistic creation. 

How do Content Credentials work in Respeecher?

We have integrated CAI's open-source C2PA tool into our marketplace. Whenever synthetic audio is rendered on our servers, it is automatically cryptographically signed as being a product of the Respeecher marketplace. When clients download the audio, it contains metadata with Content Credentials stating that it was converted into a different voice by Respeecher. GlobalSign, our third-party certificate authority, signs our credentials with its cryptographic key so that anyone who receives our content can automatically verify that it was signed by us and not an impersonator.

The metadata isn't a watermark but rather a side component of the file. If the metadata is removed, consumers are alerted that the file's source is uncertain. If the file is modified after it’s downloaded (say someone changes the name of the target voice), the cryptographic signature won't match the file's contents. So it’s impossible to manipulate the metadata without invalidating Respeecher’s signature. 

All voice-cloned audio files downloaded from the Respeecher Marketplace include Content Credentials for AI transparency.
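
To make the tamper-evidence idea above concrete, here is a minimal, illustrative Rust sketch. It is ours, not Respeecher's code, and it is not the C2PA manifest format: the sha2 crate and the helper names are assumptions for illustration only. The idea is that a digest of the rendered audio is recorded in the signed metadata, and a verifier recomputes it; in C2PA the manifest itself is additionally signed with a certificate from a CA such as GlobalSign.

// Conceptual sketch only: a real C2PA manifest carries more structure and is
// signed with a CA-issued certificate. This just shows why any edit to the
// downloaded bytes is detectable once a digest has been recorded.
use sha2::{Digest, Sha256};

// Digest recorded in the signed metadata at render time (hypothetical helper).
fn record_digest(audio: &[u8]) -> Vec<u8> {
    let mut hasher = Sha256::new();
    hasher.update(audio);
    hasher.finalize().to_vec()
}

// Verification: recompute the digest and compare against the recorded one.
fn is_untampered(audio: &[u8], recorded: &[u8]) -> bool {
    record_digest(audio) == recorded
}

fn main() {
    let original = b"rendered synthetic audio bytes".to_vec();
    let recorded = record_digest(&original);

    let mut edited = original.clone();
    edited[0] ^= 0xFF; // simulate a post-download modification

    assert!(is_untampered(&original, &recorded));
    assert!(!is_untampered(&edited, &recorded)); // tampering breaks the match
}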

What challenges did you encounter while implementing Content Credentials technology?

Surprisingly, the biggest challenge was obtaining keys from our certificate authority. We had to prove that we were a legitimate organization, which took several weeks and some back and forth with GlobalSign.

Another challenge was that cryptography, particularly public key infrastructure (PKI), can be challenging to grasp for someone who isn’t an expert. Our team had to understand the specifics of C2PA, figure out appropriate configurations, and determine whether we needed a third-party certificate authority. These nuances required time and effort, especially since we don’t have a cryptographic expert on our team. However, the CAI team and community were incredibly helpful in assisting us with these challenges.

What advice do you have for other developers getting started with Content Credentials and the CAI’s open-source tools?

Start with CAI’s Getting Started guide, but then dedicate time to read the C2PA specification document. Although it is somewhat long and intimidating, it’s surprisingly comprehensible for non-experts. 

Also, utilize ChatGPT to help explain complex concepts to you. Even though ChatGPT doesn’t know the technical details of C2PA (because its current version has limited access to information beyond certain dates), it still does a great job explaining concepts such as PKI and cryptographic signatures.

What's next for you when it comes to advances in AI safety and Content Credentials?

We’re planning to give users of our marketplace the option to add their own authorship credentials to the Content Credentials. Currently, the metadata indicates that the audio was modified by Respeecher, but it doesn't attribute the audio to its creator. Some users may prefer to remain anonymous, but others will choose to include their credentials. 

Respeecher will continue to be at the forefront of initiatives to adopt data provenance standards across the AI synthetic media industry. It’s essential for companies to create a unified authentication layer for audio and video content, and content distribution platforms like YouTube and news websites have a crucial role to play in embracing this technology. Just as a lack of HTTPS warns users of potential security issues, a similar mechanism could alert users to the source of an audio file, enhancing transparency and authenticity.

We are also closely watching how the concept of cloud-hosted metadata develops. Embedding hard-to-remove watermarks in audio and video signals without making them obvious remains a largely unsolved problem. By storing metadata in the cloud and enabling checks for pre-signed content, we can potentially simplify authentication and the synthetic media detection problem.

The Respeecher team, pictured in their office in Kyiv, Ukraine, has won awards for their projects with global brands, including an Emmy and a Webby. Photo courtesy: Respeecher

Stay connected on LinkedIn and consider joining the movement to restore trust and transparency online.

Subscribe to the CAI newsletter to receive ecosystem news.

For implementers, explore the CAI’s open-source tools to integrate verifiable Content Credentials into services, websites, and apps.


Hyperledger Foundation

Developer Showcase Series: Fernando Paris, Co-CEO and CTO, ioBuilders

Back to our Developer Showcase Series to learn what developers in the real world are doing with Hyperledger technologies. Next up is Fernando Paris, Co-CEO and CTO, ioBuilders. 



Identity At The Center - Podcast

It’s time for episode #255 of the Identity at the Center podcast


It’s time for episode #255 of the Identity at the Center podcast!

In this episode, we had the privilege of interviewing two incredible guests, Atul Tulshibagwale, CTO of SGNL and Co-chair of the Shared Signals Working Group at the OpenID Foundation, and Sean O'Dell, Senior Staff Security Engineer at Disney.

Atul and Sean bring us up to speed on Continuous Access Evaluation Profile (CAEP), Identity Threat Detection and Response (ITDR), and the Shared Signals Framework (SSF). It was an insightful conversation that shed light on the importance of these concepts for IAM practitioners.

You can listen to the episode on idacpodcast.com or find it on your favorite podcast app.

#iam #podcast #idac

Friday, 19. January 2024

FIDO Alliance

Computer Weekly: Thanks to AI tools, attackers also have an easy time of it


Instead of fighting AI with AI, it’s time for companies to rewrite the rules and get to the root of the problem – and that’s traditional login methods. A byline from Rolf Lindemann.


Professional Security Magazine: Mitigating AI security risks


Major tech companies and members of the FIDO Alliance such as Google, Apple, and Microsoft have been aiming to eliminate passwords completely. The use of alternative, modern authentication methods such as passkeys and security keys based on the FIDO protocol, are phishing-resistant and cannot be circumvented by AI.


Retail Banker International: Banks don’t have to compromise on security to prioritise consumer convenience

Biometric authentication powered by the global FIDO standard is the way forward to meet the needs of customers, but also the diverse requirements of banks and other financial institutions.



The Wall Street Journal: 11 Technology Trends To Watch This Year


Major companies like Google, Apple, and Amazon have adopted passkeys as their secure login method. With over eight billion passkey-enabled accounts and an anticipated 20 billion by the end of 2024, passkeys offer enhanced protection against hacking attempts and minimize the risk of security breaches.


Techopedia: Secure Authentication: Will Passkeys Kill the Password?


Passkeys represent a revolutionary shift in online authentication, which could mean a potential end to the traditional password methods. Supported by Apple, Google, and Microsoft, FIDO passkeys utilize biometrics (or device PINs) to offer a more streamlined, user-friendly, and secure method of protecting online identities.


GS1

Don’t miss the sessions on construction during the GS1 Global Forum

19 January 2024
Join us at the GS1 Global Forum for the latest updates on construction

It’s that time of year again! The GS1 Global Forum is a great opportunity to learn about GS1 standards, solutions and services as well as exchange innovative ideas and expand your network. This annual event aligns GS1’s global strategy, fostering inspiration and uniting GS1 professionals worldwide. The Centre of Excellence Construction team is set to share captivating construction stories across various sessions.

Don’t miss the chance to join us at the GS1 Global Forum and participate in sessions highlighting relevant use cases that illustrate the pivotal role of GS1 standards in construction. In addition to our dedicated session on Thursday, 22 February, the GS1 Centre of Excellence Construction team will this year take the stage at the European Regional session with an exclusive slot dedicated to technical industries. Be part of this showcase, where we’ll highlight the best use cases, demonstrating the vast potential of GS1 standards in the construction sector.

Where can you meet us?
GS1 in EU Regional Forum - 21 February 2024, 8:30-11:45 (CET)
Blueprint for the future: building up the modern construction landscape with the help of standards (2D featured) – dedicated construction session – 22 February 2024, 10:45-12:00 (CET)
Location: Square Brussels Meeting Centre in Brussels, Belgium, or online

Register today and join live or re-watch later! Industry stakeholders are also welcome!

Register now!

Origin Trail

Get #OnTrac(k) and subscribe for the first-ever On TRAC(k) podcast


Exciting news is on the horizon as we proudly announce the launch of our brand new podcast, On TRAC(k), aimed at keeping our community on the pulse of the latest advancements within the OriginTrail ecosystem and the evolving landscape working towards empowering world-class brands and builders with verifiable Web for Decentralized Artificial Intelligence (AI).

As we embark on this thrilling journey, we look forward to bringing you closer to the heart of our mission, showcasing the power of transparency, traceability, and decentralization in the age of AI.

A Vibrant Ecosystem of Pioneers

To elevate your listening experience, we’ve partnered with seasoned host, Jonathan DeYoung, who brings a wealth of knowledge and a passion for blockchain technology. We deeply value the thoughts and questions coming from our dynamic community. With Jonathan at the helm, expect engaging discussions with industry experts and innovators each month.

On TRAC(k) spotlights the ways we can empower world-class brands and builders with verifiable Web for Decentralized AI by establishing a reliable knowledge base, cultivating trust and transparency across various sectors, including supply chains, construction, life sciences, healthcare, metaverses, and beyond.

Episode 1: The V8 Foundation

On January 18, 2024, the inaugural episode of On TRAC(k) was launched. In this live session, Jonathan was joined by OriginTrail’s founders, Žiga Drev, Branimir Rakić, and Tomaž Levak. Together, they delved into the significance of the V8 Foundation, explored robust partnerships, and shed light on the upcoming knowledge mining and staking initiatives.

Read more on the V8 Foundation here.

Future Episodes: Your Topics, Your Voices

Beyond the airwaves, we are excited to foster community engagement through various initiatives — stay tuned for live Q&A sessions at the end of each episode. We believe in the power of our community, and while we have an exciting lineup of topics and guests, we are equally excited to hear from you.

We want to hear your voices, ideas, and stories. Your input is invaluable, and we want this podcast to be a collaborative experience.

To ensure you don’t miss a beat, get on board and subscribe to On TRAC(k) on your favorite podcast platform. Visit ontrack.origintrail.io

Thank you for being part of the OriginTrail community. Together, let’s explore, learn, and shape the future of the global economy.

About OriginTrail

OriginTrail is an ecosystem-building decentralized knowledge infrastructure for artificial intelligence (AI). With the mission of tackling misinformation, which is exacerbated by AI adoption, OriginTrail enables verifiable tracking of the origins of information, along with discoverability and integrity of knowledge, to enable trusted AI. It has various applications in the domains of real-world assets (RWAs), search and recommendation engines, question-answering systems, and generally knowledge-dependent applications (such as AI systems).

OriginTrail’s initial adoption was in global supply chains, serving as a trusted hub for supply chain data sharing, allowing customers to authenticate and track products and keep these operations secure. In recent years, the rise of AI has not only created unprecedented opportunities for progress but also amplified the challenge of misinformation. OriginTrail also addresses this by functioning as an ecosystem focused on building a trusted knowledge infrastructure for AI in two ways — driving discoverability of the world’s most important knowledge and enabling the verifiable origin of the information. The adoption of OriginTrail in various enterprise solutions underscores the technology’s growing relevance and impact across diverse industries including real-world asset tokenization (RWAs), the construction industry, supply chains, healthcare, metaverse, and others.

OriginTrail is creating a Verifiable Web for decentralized AI by empowering world-class brands and builders. It utilizes its unique Decentralized Knowledge Graph and OriginTrail Parachain to deliver AI-powered search and solutions for enterprises and individuals worldwide.

OriginTrail has gained support and partnerships with world-class organizations such as British Standards Institution, SCAN, Polkadot, Parity, Walmart, the World Federation of Hemophilia, Oracle, and the EU Commission’s Next Generation Internet. These partnerships contribute to advancing OriginTrail’s trusted knowledge foundation and its applicability in trillion-dollar industries while providing a verifiable web of knowledge important in particular to drive the economies of RWAs.

Web | On TRAC(k) Podcasts | Twitter | Facebook | Telegram | LinkedIn | GitHub | Discord

Get #OnTrac(k) and subscribe for the first-ever On TRAC(k) podcast was originally published in OriginTrail on Medium, where people are continuing the conversation by highlighting and responding to this story.


OpenID

Formal Security Analysis of OpenID for Verifiable Credentials


The first in-depth security analysis of OpenID for Verifiable Credentials has been completed, with the goal of increasing confidence in the security of these specifications. The formal security analysis includes the protocols OpenID for Verifiable Credential Issuance (OID4VCI) and OpenID for Verifiable Presentations (OID4VP), both part of the OpenID for Verifiable Credentials family.

The formal security analysis uses the Web Infrastructure Model (WIM), a detailed formal model of the web, which has been developed by the University of Stuttgart and used to complete formal analysis of other protocols including the OpenID Foundation standards OpenID Connect, FAPI 1.0 and FAPI 2.0, and the foundational IETF standard OAuth 2.0 (RFC6749). In this instance, the WIM is used to model the interaction of OID4VCI and OID4VP in an ecosystem. 

This work has been carried out as part of a master’s thesis at University of Stuttgart co-supervised by Verimi GmbH. The goal of the thesis was to prove that both protocols are secure with respect to the definition of security under certain assumptions and modeling decisions. The security definition used in the analysis covers several important properties around credential issuance and presentation; in particular, that an attacker must not be able to impersonate an honest user, initiate a login flow on a user’s device, or force a user to be logged in under an attacker-chosen identity.

For cases where certain preconditions are not met, the analysis revealed some potential attacks. In particular, the pre-authorized code flow in OID4VCI and the cross-device flow in OID4VP may be vulnerable to phishing attacks and require user attention to be secure. These attacks are not surprising, as cross-device flow vulnerabilities are a well-known class of attacks, affecting most cross-device protocols. However, the analysis also confirmed the security of the same-device flows of OpenID for Verifiable Credentials. The OpenID Foundation’s Decentralized Credentials Protocols WG has taken the feedback from this master’s thesis into account in the current versions of the specifications. 

Please refer to the analysis, a master’s thesis, for details on the assumptions, modeling decisions, security properties, and the formal mathematical proof.

Our thanks to Fabian Hauck for his master’s thesis, conducted under the supervision of Dr. Daniel Fett and Pedram Hosseyni M.Sc with Prof. Ralf Küsters as the Examiner at the Institute of Information Security, University of Stuttgart. The thesis was generously supported by Verimi GmbH through the IDunion project. Please note that this analysis is preliminary and has not yet been peer-reviewed. Additionally, it may not reflect recent changes in the current specifications as the analysis is based on the May 2023 versions of OpenID for Verifiable Presentations and OpenID for Verifiable Credential Issuance.


OpenID Foundation

The OpenID Foundation (OIDF) is a global open standards body committed to helping people assert their identity wherever they choose. Founded in 2007, we are a community of technical experts leading the creation of open identity standards that are secure, interoperable, and privacy preserving. The Foundation’s OpenID Connect standard is now used by billions of people across millions of applications. In the last five years, the Financial Grade API has become the standard of choice for Open Banking and Open Data implementations, allowing people to access and share data across entities. Today, the OpenID Foundation’s standards are the connective tissue to enable people to assert their identity and access their data at scale, the scale of the internet, enabling “networks of networks” to interoperate globally. Individuals, companies, governments and non-profits are encouraged to join or participate.

Find out more at openid.net.

The post Formal Security Analysis of OpenID for Verifiable Credentials first appeared on OpenID Foundation.

Thursday, 18. January 2024

Digital Identity NZ

Sponsorship Opportunities for a Thriving Future


Digital Identity NZ offers terrific opportunities to promote and associate your brand and organisation with the very best mahi.

As a sponsor you can input into the design and content of these assets too!

In 2024 we have the following ideas concepted, ready for a sponsor to take them forward:

Annual consumer and business attitudinal research (also market size research) with the associated reports of the results
Annual identity products and people Awards
Jobs Board
Product Directory
Weekly international roundup of digital identity news (2 options) already collated by 3rd parties
Digital Trust Hui Taumata (Wellington, 12-13 August 2024) – the only event of its kind in Australasia.

In addition, we have a range of podcasts, webinars, studies/analysis reports and whitepapers, submissions and briefings that are concepted and now looking for sponsor input to bring them to life.

Why not have DINZ host and manage your product or organisation milestone launch? Leverage DINZ’s resources, membership and community networks to bring in the crowds alongside your own customers, networks and colleagues.

Email info@digitalidentity.nz to discuss.

The post Sponsorship Opportunities for a Thriving Future appeared first on Digital Identity New Zealand.

Wednesday, 17. January 2024

OpenID

2024 OpenID Foundation Board of Directors Election Results


I want to personally thank all OpenID Foundation members who voted in the 2024 elections for representatives to the OpenID Foundation board of directors.

Please note that the OpenID Foundation board of directors unanimously approved updated Bylaws at a board meeting on November 16, 2023, with full details, including a summary of updates, found here. Per the updated Bylaws and as of December 1, 2023, there were two Community Representative seats (of four total) and two Corporate Representative seats up for election in 2024.

Each year, Corporate members of the Foundation elect members to represent them on the board, with all Corporate members in good standing eligible to nominate and vote for candidates. Thank you to the following Corporate members who nominated themselves for the 2024 election:

Ralph Bragg – Co-Founder & CTO at Raidiam
Chris Michael – Co-Founder & Co-CEO at Ozone API
Atul Tulshibagwale – CTO at SGNL
Mark Verstege – Lead Architect for Information Security and Banking for the Consumer Data Right (AU)

I am very pleased to welcome Atul and Mark to the board of directors as the 2024 Corporate Representatives. I want to kindly thank Ralph and Chris for their continued support and contributions to the Foundation. I look forward to continuing to work with all four valued members in 2024 to achieve the Foundation’s vision and mission.

Per the Foundation’s updated Bylaws, four individual members represent the membership and the community at large on the board. Chairman Nat Sakimura and John Bradley have one year remaining on their two-year terms and I look forward to their continued leadership in 2024.

George Fletcher was re-elected, and Mike Jones elected to two-year terms as Community member representatives. Their long-time support and service to the Foundation is sincerely appreciated and valued. Thank you kindly to Sebastian Rohr for nominating himself and for his continued contributions to the Foundation.

Please join me in thanking Atul, Mark, George, and Mike, as well as all of the board of directors, for their service to the Foundation and the community at large. And thank you to all Foundation members for your continued investment of time and membership that drives and supports the Foundation.

Gail Hodges
Executive Director
OpenID Foundation

 

The post 2024 OpenID Foundation Board of Directors Election Results first appeared on OpenID Foundation.


Blockchain Commons

Blockchain Commons 2023 Overview

Welcome to Blockchain Commons’ yearly report on our status and projects.


What is Blockchain Commons?

We’ve revisited Our Vision in past annual reports for Blockchain Commons, but this year we have a great overview of Blockchain Commons that we want to share. It was penned by Simon Ratner, one of our long-time patrons at Proxy, and now Head of Engineering at ŌURA. He said:

“Blockchain Commons isn’t a traditional industry association that only organizes, where all the work is contributed by the member companies. They’re producing their own architectures and reference libraries and sample applications. So in a way they are co-developing their standards, remaining independent while working among industry companies.”

“They provide specifications that actually exist and have been tested and vetted. This is especially important for a startup, since we’re benefiting the most from having standardized and interoperable work, as well as from having access to work of an additional engineering team.”

We’re appreciative of Simon for his recommendation, but also for his clear statement of what Blockchain Commons is and what we do. Blockchain Commons develops specifications that create interoperability. Doing so does indeed give smaller companies a leg up. While working with those companies, we’re also offering our own expertise. We’re doing so to benefit users and companies alike by creating a shared infrastructure that embeds our Gordian Principles of independence, privacy, resilience, and openness.

The whole recommendation can be found at LinkedIn. Thanks, Simon!

Our Standardizations

Blockchain Commons has been working on specifications for years. Our Animated QRs, which are built on URs, were our first specification to receive strong uptake in the digital-asset wallet community. We released a video this year to memorialize that. Meanwhile, the Gordian Envelope data format has been one of our major initiatives over the last year and a half or so.

But specifications are just the first step. We ultimately need to work with standards bodies to ensure that our specifications are reliable and accessible in the long-term. In 2023, our specifications reached a sufficient level of maturity that we were able to begin that work.

Much of our new push for standardization focused on dCBOR, or deterministic CBOR. This is a variant of the CBOR data format that allows for consistent representation of data, something that is vital for a variety of use cases including our own Envelope. We’ve just published v07 of the spec, with CBOR expert Carsten Bormann now on board as a co-author, so we’re quite hopeful this will become an actual standard soon.
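
To give a rough sense of the problem deterministic encoding solves, here is a small std-only Rust illustration of our own; it is not the dCBOR wire format and the function names are assumptions. If encoders are free to emit map entries in any order, two honest encoders can produce different bytes for the same data, which breaks hashing and signing over the encoding; a deterministic profile removes that freedom, for example by fixing a canonical key order.

// Std-only illustration of the determinism issue, not CBOR itself.
// HashMap iteration order is unspecified, so "encode by iterating" can yield
// different bytes for equal data; a canonical encoder fixes one key order.
use std::collections::{BTreeMap, HashMap};

fn encode_unordered(map: &HashMap<&str, u32>) -> String {
    map.iter().map(|(k, v)| format!("{k}={v};")).collect()
}

fn encode_canonical(map: &HashMap<&str, u32>) -> String {
    let ordered: BTreeMap<_, _> = map.iter().collect();
    ordered.iter().map(|(k, v)| format!("{k}={v};")).collect()
}

fn main() {
    let mut doc = HashMap::new();
    doc.insert("subject", 1);
    doc.insert("assertion", 2);
    doc.insert("predicate", 3);

    // Order may vary between runs, so the bytes (and any hash of them) can differ:
    println!("unordered: {}", encode_unordered(&doc));
    // Always "assertion=2;predicate=3;subject=1;", so hashes and signatures stay stable:
    println!("canonical: {}", encode_canonical(&doc));
}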

We’ve also been working on an Envelope standard for the IETF. We hope that there is interest in the format, but ultimately we’ll be happy if we can get the larger community focused on hash-based elision as a privacy measure, whether that’s with Envelope or not.

Our Envelope standardization work also expanded to IANA, who registers CBOR tags. Envelope is now registered with tag #200, allowing its widespread recognition whether it becomes an IETF standard or not. Other tags, which are used for URs, are also officially registered in the #40000+ range.

Overall, it was a great year for this new step forward in our specification work. It gives us strong faith that as we finalize the development of specifications with our community, we can move them into standards for use by the larger ecosystem.

Our Community

Work with the IETF helped Blockchain Commons to dramatically expand its community in 2023 through our meetings with that standards body. However, the Gordian Development community of wallet designers, chip manufacturers, and others continues to be our core constituency. As Simon wrote, part of Blockchain Commons’ mission is to act as an industry group that turns the needs of our patrons and partners into reality. That work is done with the Gordian community.

Blockchain Commons held a full dozen meetings with community members in 2023, including eight regular Gordian Developer meetings, one meeting focused on signing with Bitcoin keys, two Silicon Salons, and one FROST Round Table. We’re very appreciative of everyone who took the time to join us, and we’ve taken some real pointers from the community this year, resulting in the publication of our Gordian output descriptor v3 and a new system for detailing the status of our specifications.

We expect community members to benefit from our meetings too. Most recently, we heard of a tape-out, where the design of a circuit is sent to a manufacturing facility, following the work at our Silicon Salons where we brought together hardware-wallet manufacturers with semiconductor manufacturers interested in supporting them.

Our Depository

Standards such as Envelope and dCBOR focus on openness, but Blockchain Commons’ attention to resilience is at least as important. Openness ensures that a user will be able to control their digital assets through the interoperable tools of a variety of developers, while resilience helps them to be safe while doing so.

Since late 2022, our major resilience project has been Collaborative Seed Recovery (CSR). It attacks one of the biggest problems with independent control of your digital assets: it’s too easy to lose your keys (and the seeds they’re constructed from). Shamir’s Secret Sharing, which is part of our security-reviewed SSKR library, offers one solution: you shard a secret and then only need to recover some of those shares to reconstruct your seed. But, the shares can be subject to loss just like the seed itself!
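
As a hedged, minimal sketch of the splitting idea (ours, not the SSKR algorithm): the simplest possible scheme XORs the seed with a random pad so that neither share alone reveals anything and both are needed for recovery. SSKR itself uses Shamir's Secret Sharing over GF(256) so that any k of n shares suffice, and wraps the shares in URs; none of that is shown here, and the function names below are illustrative.

// Toy 2-of-2 split: each share alone is statistically independent of the seed;
// XOR-ing both shares recovers it. SSKR generalizes this to k-of-n thresholds.
fn split(seed: &[u8], random_pad: &[u8]) -> (Vec<u8>, Vec<u8>) {
    assert_eq!(seed.len(), random_pad.len());
    let share_a = random_pad.to_vec();
    let share_b: Vec<u8> = seed.iter().zip(random_pad).map(|(s, r)| s ^ r).collect();
    (share_a, share_b)
}

fn recover(share_a: &[u8], share_b: &[u8]) -> Vec<u8> {
    share_a.iter().zip(share_b).map(|(a, b)| a ^ b).collect()
}

fn main() {
    let seed = b"example master seed".to_vec();
    // In practice the pad must come from a CSPRNG; it is fixed here for brevity.
    let pad = vec![0xA5u8; seed.len()];
    let (a, b) = split(&seed, &pad);
    assert_eq!(recover(&a, &b), seed);
}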

Enter CSR. It envisions a variety of automated ways to recover shares from remote servers using diverse authentication systems to maximize security. Throughout 2023, Blockchain Commons advanced Envelope as a methodology for storing not just seeds, but the metadata critical for fully restoring both a seed and its assets. Toward the end of the year we also released our first version of Gordian Depository, a CSR server that can remotely store those Envelopes (or other data).

The next step is for Blockchain Commons to publicly deploy its Depository, which we hope will be the first of a network of share servers, provided by a variety of organizations. We hope these organizations will include not just traditional Bitcoin wallet companies, but also organizations such as modern-art museums, who have a mission to not only display but also preserve artwork — and for NFTs that means protecting Ethereum and Tezos keys. A varied and diverse ecosystem like this will give users real choices for whom they want to store their seed shares with.

We hope to see this ecosystem develop over the course of 2024, but the strong foundation is now there.

FROST is Coming

Though we mainly focus on the big picture in our annual report, we wanted to mention our increasing focus on FROST, a quorum signature system that uses Schnorr. We’ve been watching Schnorr signatures for a long time and think they’re both important and powerful because of abilities such as threshold, blind, and adapter signatures, as Christopher wrote in “A Layperson’s Intro to Schnorr”.

FROST became a lot more accessible in 2023 thanks to the Zcash Foundation’s release of ZF FROST. We hosted a first round table to give FROST implementers the chance to talk with each other about their work and expect to continue supporting this exciting new technology in the year ahead.

Our Websites

Of course, no specification, standard, or other work is worthwhile unless people can access it. As a result, Blockchain Commons put real work in 2023 into expanding our presence on the web.

Most importantly, we launched a new Developer Website that collects together all of our tutorials, test vectors, and other info for developers, demonstrating how to implement URs, SSKR, and the rest. (Links throughout this yearly report connect back to developer pages.)

We also created a new Advocacy Website, which focuses primarily on our support for (or criticism of) upcoming digital-asset laws. This remains an important topic: most recently, we talked about the dangers of EIDAS in Europe and looked back at the tragedies created by overidentification in the past.

We’ve also updated our Main Website and our SmartCustody Website to fit the same format as our new sites and link them all into a cohesive whole.

Related, Christopher updated his own Life with Alacrity blog pages, which offer many of his thoughts on collaboration and identity going back two decades. His new Musings of a Trust Architect articles appear there as well (but also on the Blockchain Commons blog).

We of course continue to do much of our work in GitHub, which is the internet home of our library and apps — as well as our actual specifications, in the form of our Research Papers repo. We reorganized things there as well, with a new “Research Status” system that reveals the adoption level and thus stability of each of our specifications.

Looking Forward

The Blockchain Commons mission statement highlights four objectives: creating a Commons, Architecture, and Demand and establishing relationships with Peers. We expect to continue to work on all of these objectives in 2024.

A lot of our work is part of creating a Commons. We’re thrilled to constantly add reference libraries and apps to the Commons, including Gordian Seed Tool, which should see a 1.6 release early in 2024. We think that our new advocacy toward Open Development will be an important next step in creating this Commons; we expect to discuss that more widely in the coming year.

We talked about our Architecture underpinnings much more in 2023 with Christopher’s Musings of a Trust Architect articles. We expect more of that in 2024, with the topics of edge identifiers and cliques currently in our queue for both discussion and experimentation. However, we’re also working on the opposite side of the spectrum, which involves making our architectural ideas into standardized reality. We’ll be continuing to work with IETF, with plans to release a Problem Statement on the needs for hash-based elision for data storage, and we’re also planning to move some of our specifications to BIPs, beginning with our animated QR work.

We also expect to further investigate the usage of NFCs in a secure digital-asset architecture. This will include an examination of commodity NFC smartcards with the intent of creating references and best practices for securing a variety of data, from seeds and shares to backed-up keys for developers. This is of a piece with our work on Depo servers: the ultimate goal is to give users a choice as to how they store their digital-asset seeds & keys.

Some of our architectural work ties directly to our work on Demand. We want to increase the call for privacy-first solutions to better support human rights, protect against coercion, and generally allow people to decide what information they release to the world. Demonstrating the need for hash-based elision is one way to do so.

Finally, we will continue to support our Peers, especially through meetings. Besides our regular Gordian meetings, we also hope to organize special-purpose events for emerging hot topics. This may include more FROST round tables, additional Silicon Salons, or others as needs arise within the digital-asset space.

However, we need your help to ensure this future. The financial climate for web3 work became very poor in 2023 due to investments drying up for the whole sector. This has resulted in some of our long-term patrons being forced to step back or even merge their businesses, which has impacted their patronage. If your company benefits from our work, please become a sponsor to ensure it continues. We are particularly looking for companies able to become sponsors at the $1,000 a month or better level. Please contact Christopher Allen to learn more of the benefits of doing so.

Tuesday, 16. January 2024

Velocity Network

Trellis Strategies joins Velocity

We’re delighted to welcome Velocity Network Foundation’s newest member, Trellis Strategies, a leading strategic research and consulting firm focused on advancing postsecondary education and strengthening the workforce. The post Trellis Strategies joins Velocity appeared first on Velocity.



Digital ID for Canadians

Request for Comment & IPR Review: PCTF Authentication Final Recommendation V1.1


Notice of Intent: DIACC is collaborating to develop and publish the Authentication component of the Pan-Canadian Trust Framework (PCTF) to set a baseline of public and private sector interoperability of identity services and solutions. During this public review period, DIACC is looking for community feedback to ensure that the conformance criteria is clear and auditable.

To learn more about the Pan-Canadian vision and benefits-for-all value proposition please review the Pan-Canadian Trust Framework Overview.

Document Status: These review documents have been developed by members of the DIACC’s Trust Framework Expert Committee (TFEC) who operate under the DIACC controlling policies and consist of representatives from both the private and public sectors. These documents have been approved by the TFEC as Final Recommendations V1.1.

Summary:

The PCTF Authentication Component defines:

1. A set of processes that enable access to digital systems.

2. A set of Conformance Criteria for each process that, when a process is shown to be compliant, enable the process to be trusted.

Invitation:

All interested parties are invited to comment.

Period:

Opens: January 16, 2024 at 23:59 PT | Closes: February 15, 2024 at 23:59 PT

When reviewing the component’s Conformance Criteria, please consider the following, and note that responses to this question are non-binding and serve to improve the PCTF.

Would you consider the Conformance Criteria as auditable or not? That is, could you objectively evaluate if an organization was compliant with that criteria and what evidence would be used to justify that?

Review Documents: Authentication

Conformance Profile Final Recommendation V1.1
Component Overview Final Recommendation V1.1
DIACC Comment Submission Spreadsheet

Intellectual Property Rights:

Comments must be received within the 30-day comment period noted above. All comments are subject to the DIACC contributor agreement; by submitting a comment you agree to be bound by the terms and conditions therein. DIACC Members are also subject to the Intellectual Property Rights Policy. Any notice of an intent not to license under either the Contributor Agreement and/or the Intellectual Property Rights Policy with respect to the review documents or any comments must be made at the Contributor’s and/or Member’s earliest opportunity, and in any event, within the 30-day comment period. IPR claims may be sent to review@diacc.ca. Please include “IPR Claim” as the subject.

Process:

All comments are subject to the DIACC contributor agreement. Submit comments using the provided DIACC Comment Submission Spreadsheet. Reference the draft and corresponding line number for each comment submitted. Email completed DIACC Comment Submission Spreadsheet to review@diacc.ca. Questions may be sent to review@diacc.ca.

Value to Canadians:

The purpose of the PCTF Authentication Component is to assure the on-going integrity of login and authentication processes by certifying, through a process of assessment, that they comply with standardized Conformance Criteria. The Conformance Criteria for this component may be used to provide assurances:

·  That Trusted Processes result in the representation of a unique Subject at a Level of Assurance that it is the same Subject with each successful login to an Authentication Service Provider.

·  Concerning the predictability and continuity in the login processes that they offer or on which they depend.

All participants will benefit from:

·  Login and authentication processes that are repeatable and consistent (whether they offer these processes, depend on them, or both).

·  Assurance that identified Users can engage in authorized interactions with remote systems.

Relying Parties benefit from:

·  The ability to build on the assurance that Authentication Trusted Processes uniquely identify, at an acceptable level of risk, a Subject in their application or program space.

Context:

The purpose of this review is to ensure transparency in the development and diversity of a truly Pan-Canadian, and international, input. In alignment with our Principles for an Identity Ecosystem, processes to respect and enhance privacy are being prioritized through every step of the PCTF development process.

DIACC expects to modify and improve these Draft Recommendations based upon public comments. Comments made during the review will be considered for incorporation into the next iteration and DIACC will prepare a Disposition of Comments to provide transparency with regard to how each comment was handled.


Ceramic Network

How Rust Delivers Speed and Security for Off-Chain Storage

We're introducing more code written in Rust to Ceramic's codebase.
Introduction

At 3Box Labs, the core team building Ceramic, we are continually striving to produce the best solution for off-chain storage. This requires a combination of simplifying the user experience, while also delivering on the demanding requirements of users when it comes to performance and safety. To accomplish this, we’ve started to introduce more code written in Rust to our codebase.

For those not familiar with it, Rust is a language that focuses on three areas:

Performance
Reliability
Productivity

We will dive into what Rust offers for performance and reliability, and how that allows us to produce fast, reliable, and secure off-chain storage on Ceramic. We will not talk about productivity in this blog post, as that is somewhat subjective and leads to language debates (even if we do love the productivity).

1. Performance With Efficient Memory Usage and Systems Programming

One of the main reasons that Rust is able to achieve its performance is that it does not use a virtual machine or garbage collector. When Rust code is built, it produces machine code and binaries at a systems level, like C or C++. To further improve on this, Rust also has zero-cost abstractions. Zero-cost abstractions allow developers to focus on writing clean, performant, and maintainable code, as the compiler will remove the abstractions used and produce optimized machine code.
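
As a small illustration of what zero-cost abstraction means in practice (our sketch, not code from Ceramic's codebase), the iterator pipeline and the hand-written loop below express the same computation, and with optimizations enabled rustc typically lowers both to essentially the same machine code, so the higher-level style costs nothing at runtime.

// High-level iterator chain: keep even values, square them, sum the results.
fn sum_even_squares_iter(values: &[u64]) -> u64 {
    values.iter().filter(|&&v| v % 2 == 0).map(|v| v * v).sum()
}

// Equivalent hand-written loop; the abstraction above is "zero-cost" because
// the compiler optimizes both functions to comparable machine code.
fn sum_even_squares_loop(values: &[u64]) -> u64 {
    let mut total = 0;
    for v in values {
        if v % 2 == 0 {
            total += v * v;
        }
    }
    total
}

fn main() {
    let data = [1u64, 2, 3, 4, 5, 6];
    assert_eq!(sum_even_squares_iter(&data), sum_even_squares_loop(&data));
    println!("sum of even squares: {}", sum_even_squares_iter(&data));
}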

Source: TechEmpower Round 22 Benchmarks

Rust regularly performs similarly to C and C++ in benchmarks, while often having some measurable performance improvement over Go. In the above image, you will see a Lng column indicating the language a webserver is written in. Rust webservers, denoted with rs and purple in color, outperform C (C - red), C++ (C++ - red), and Go (Go - green).

Go (Purple) vs Rust (Blue) Comparison. Source.

Additionally, Rust regularly produces better 75th- and 99th-percentile response times due to better memory management. Discord switched from Go to Rust largely because these latency spikes wreak havoc on systems and user experience. They would see @mention spikes of 2 seconds or higher with Go, while Rust would consistently stay in the sub-second range.

2. Reliability: Safety and Uptime

The decentralized nature of Web3 applications demands an unyielding commitment to safety. Additionally, our users demand availability of their data, so uptime is of primary importance. When dealing with systems languages like C and C++, most people are familiar with the multitude of common vulnerabilities and exposures (CVEs) that are filed, with a large number due to issues around memory management.

As mentioned above, Rust's performance is a result of its close-to-the-metal code and memory management without a garbage collector. To ensure safety, Rust has introduced a memory ownership system, enforced at compile time rather than at runtime. This ownership system determines who is using data and when it is safe to release that data, and, once the data is freed, ensures that no one else can improperly use it.

fn main() {
    {
        let x = 42;
        println!("x: {}", x);
    }
    println!("x: {}", x); // ERROR: x not in scope
}
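
The snippet above shows scope-based lifetimes. A second short sketch (ours, not from Ceramic's codebase) shows the move and borrow rules the compiler enforces on top of that, which is what rules out use-after-free and double-free bugs without a garbage collector.

fn main() {
    let original = String::from("off-chain event data");

    // Ownership of the heap allocation moves into `moved`; `original` may no
    // longer be used, and the compiler rejects any attempt at compile time.
    let moved = original;
    // println!("{}", original); // ERROR: borrow of moved value `original`

    // Borrowing grants temporary read access without transferring ownership;
    // the reference is checked so it cannot outlive the owner.
    let length = byte_len(&moved);
    println!("{} is {} bytes long", moved, length);
}

// A borrowed &str is enough here, so the caller keeps ownership.
fn byte_len(s: &str) -> usize {
    s.len()
}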

This additional security has led a large number of companies that need secure-by-default software to start implementing code in Rust. Rust was created by Mozilla for use in Firefox to improve performance and security. Google and Microsoft have both been introducing more Rust into systems such as Windows and Android.

3. Performance and Safety: Binaries and Containerization

One final benefit of Rust is our ability to deliver binaries for systems, providing a verifiable supply chain for your running program and allowing you to choose whether you want to run our software natively on your system or through a container. For those of you choosing containers, we are also able to deliver much smaller images, usually 100-200 MB or less, compared to our JavaScript images, which are much closer to 1 GB in size.

What’s Next

As we release more software in Rust, we will let you know and share upgrade paths for any software that is a replacement on our blog. Additionally, we will be looking to build more functionality in WASM, so users that require in-browser functionality with performance and safety can leverage libraries that 3Box Labs produces.

We look forward to continuing to provide you with the fastest, most reliable off-chain storage for your Web3 needs.

If you'd like to tell us more about your project or get in touch with the core team please fill out this form.


We Are Open co-op

How to find the right mental models for your audience

Demystifying technology through common language

Here at the co-op we talk a lot about mental models and metaphors. As we’ve written in the past, metaphors are powerful things, helping people understand complex ideas like how to build Open Recognition programmes. They can shape community culture. They mean people can spend less time understanding a thing and more time working to make that thing better.

cc-by WAO

We’ve started working with the Digital Credentials Consortium (DCC), an applied research lab working on Verifiable Credentials (VCs). Hosted at MIT with a membership portfolio that currently includes 12 other universities, the DCC is leading the development of technical standards that underpin VCs. They are building demos and betas to help university registrars and IT departments, employers, vendors and developers understand the power of this technology.

When we talk about this work to people who don’t work in technology, we have to lean heavily on real-world metaphors to help people understand the underlying concepts of how technology works. Educators understand marking achievement with a credential much differently to technologists. As both educators and technologists, we need to find ways to explain the new and various worlds that technology opens up for us.

The DCC seeks to empower learners by enabling them to have ownership and control over their digital credentials. As an applied research lab, the DCC can show that the technology works, paving the way for much wider adoption. However, technology alone is never enough. Most people do not care about a technical standard per se; they care about what that standard can do for them. As Marshall McLuhan famously pointed out, people tend to understand new things using existing mental models.

In this post, therefore, we want to play with a few examples, metaphors and mental models to help us explain what the DCC does.

Examples of standards

There are a couple of ways to understand a ‘standard’.

A standard can be just the usual way of doing something. A standard can also be a reference to make sure things can be understood in the same way, no matter where or what is referencing the standard.

For example a kilogram in France is the same as kilogram in Germany, and a kilogram of feathers weighs the same as a kilogram of bricks. The kilogram is the standard, but where it is applied or what it is used for is up to whoever is using it.

Here’s some examples of standards you’ve probably heard about:

Metric System: 100 centimetres is a metre and a foot is 12 inches. All our rulers and measuring tapes follow one of these systems, at least in the western industrialised world.
AM/FM Radio: “Amplitude modulation” or “Frequency modulation”, the way radios work is standardised.
Timezones, chemical symbols, maritime signal flags — our world has plenty of standards.
CC BY-ND Bryan Mathers

A technical standard is similar to these everyday standards, and how we get things such as the entire internet. We have technical standards that allow us to communicate and collaborate on the World Wide Web. However, because people need to understand what they are doing at any given point, we use homely metaphors — such as an envelope to signal ‘email’ (i.e. a digital message sent between servers).

Open technical standards are important when it comes to digital credentials. We don’t want a world in which the digital credentials we’ve earned at University X work differently to those at University Y and aren’t understood by Organisation Z. We need to make sure that our symbols of achievement make sense to whoever is looking at them.

Towards new metaphors for VCs

The DCC is active in helping shape the open technical standard for Verifiable Credentials and building ‘reference implementations’ on top of it. That’s a lot to take in if all you need to know is “Can we use DCC’s openly licensed work?” or “Is this person qualified?”

the DCC logo

In technology, standards are a way to ensure that solutions built with different tech stacks work together, a property described as interoperability. There are common examples of technical standards in action, such as:

Email standard: Did you ever think about how you can send an email to any email address no matter what domain the email is pointing to? (e.g. laura@example.com can send an email to doug@mycooladdress.edu)

HTTP: You will have noticed the “http://“ or “https://“ in your browser’s address bar. HTTP, the application-layer protocol of the World Wide Web, is what’s called a “standard protocol”.

URLs: Come to think of it, “Uniform Resource Locators” are also standardised, which is why you can just type in blog.weareopen.coop and end up reading this post!

The DCC excels at promoting interoperability. It creates and maintains open-source tools that facilitate issuing, verifying, and managing digital credentials in different contexts. The DCC contributes to the development of open standards necessary for Verifiable Credentials to gain global acceptance, like email has.

Mental models, metaphors and symbolism

Helping people understand examples of a standard is one thing, finding a metaphor for a specific standard is quite another.

A mental model is the container within which you can include a metaphor. A metaphor is, further, a container within which you can use symbolism and iconography.

If you have the mental model of chatting with your friends, the metaphor of “speech clouds” — this is a metaphor because language is not visible — helps you understand why most chat apps use speech clouds as logos and icons. This symbolism and iconography helps users understand that a piece of software can help them communicate with other people.

These mental shortcuts pop up all across technology: a folder icon to signify a new organising container for a body of work, a telephone to indicate making a call, a camera to designate a photo application.

Most people have a mental model of what “credentials” are, so how can we use other containers to explain Verifiable Credentials? The secret is in the name — “Verifiable” denotes “trust”, but it’s not actual trust. To use a metaphor the DCC often uses, VCs guarantee that what is inside the envelope hasn’t been tampered with, but they don’t say anything about the letter inside it.
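To make the envelope metaphor a little more concrete, here is a minimal sketch of what a Verifiable Credential can look like as data. The field names follow the W3C Verifiable Credentials data model, but the issuer, subject and signature values below are made up for illustration:

// A minimal, hypothetical Verifiable Credential (TypeScript).
// The credentialSubject is the "letter": the claim being made.
// The proof is the sealed "envelope": a signature showing the contents
// have not been tampered with since the issuer signed them.
const exampleCredential = {
  "@context": ["https://www.w3.org/2018/credentials/v1"],
  type: ["VerifiableCredential"],
  issuer: "did:example:university-x",           // who sealed the envelope
  issuanceDate: "2024-01-15T00:00:00Z",
  credentialSubject: {
    id: "did:example:laura",                    // who the claim is about
    degree: "BSc Computer Science",             // the claim itself
  },
  proof: {
    type: "Ed25519Signature2020",               // how the envelope was sealed
    verificationMethod: "did:example:university-x#key-1",
    proofValue: "z3FXQ...(truncated, made up)", // the digital signature
  },
};

// Checking the proof tells you the data is intact and who issued it;
// it does not tell you whether the issuer itself deserves your trust.
console.log(exampleCredential.credentialSubject);

Verification, in other words, handles the envelope; deciding whether to trust the issuer is still a human and institutional question.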

We trust other people’s trust. For example, if someone has a driver’s license, we trust that they know how to drive. It’s not just because they have the license, but rather we trust the agency behind the credential.

Often for trust, safety or verification, we use symbols like padlocks and vaults. For credentials, we often use seals and certification icons. Might using these kinds of iconography help users and implementers understand what is involved with cryptographically-signed credentials such as VCs?

Mental models for VCs

If we abstract away from “credentials” as the mental model, we might find other ways to explain this work. Another mental model that could help people understand what Verifiable Credentials are is the World Wide Web. Anyone can create a webpage, but some webpages are created by people or organisations we trust more than others. For example, when you want to know the weather, you likely go to the webpage of an organisation that employs meteorologists and uses weather prediction models.

the DCC’s work as a tree

Verifiable Credentials could also be thought of as state-issued identity documents, be it a birth certificate, passport or driver’s license, a marriage certificate or a school transcript. Even before the internet, these forms of identity documents were requested and verified by people other than the person who owns them (e.g. Laura’s birth certificate or Laura’s school transcript). These kinds of credentials are more than just the paper the credential is printed on. For example, a school transcript is a ‘product’ which is produced by a ‘school system’ that is predicated on national standards.

There are a variety of organisations that issue identity documents. They are issued in different countries and with different watermarks. Somewhere underneath it all though, there are standards and processes, procedures and protocols that make identity documentation for many people* interoperable. No matter who issued your birth certificate, you can use it to verify your identity. The big flaw in this mental model is the fact that ANYONE, not just states or governments or elite organisations, can issue Verifiable Credentials.

The DCC doesn’t just work in standards. They also create infrastructure and lightweight products to help people visualise why these standards matter. This is part of the reason we wish to experiment with different mental models.

*Unfortunately not all people have even this amount of privilege. Indigenous people, refugees and people from certain countries do not always have the ability to verify their own identities. Transitioning to a world in which we use VCs for identity could help.

Finding the right language

We don’t have to choose a single mental model to help us talk about complex ideas. Instead, we can play with multiple models, metaphors and symbols to find things that resonate with the audiences we’re trying to communicate with.

In future posts, we’ll pick apart examples and metaphors for things like infrastructure, products, interoperability, data security, efficiency and any other terms or jargon that can help people better understand what the Digital Credentials Consortium is up to.

How to find the right mental models for your audience was originally published in We Are Open Co-op on Medium, where people are continuing the conversation by highlighting and responding to this story.

Monday, 15. January 2024

DIF Blog

DIF welcomes our new Executive Director


DIF is thrilled to welcome Kim Hamilton-Duffy as our new Executive Director. 

Kim is a well-known figure at the heart of the decentralized identity technical community. She has been instrumental in pioneering early open source projects in the space, and has held leadership roles in technical standards and interoperability groups including the World Economic Forum, W3C, Decentralized Identity Foundation, and US Chamber of Commerce Foundation.

Kim brings over two decades of experience in software engineering and distributed systems to the ED role, together with a wealth of market insights and personal contacts. She takes over operational leadership of DIF from our outgoing ED Clare Nelson, effective today. 

Welcome Kim!! 

DIF thanks Clare Nelson

DIF says a big Thank You to our wonderful outgoing ED, Clare Nelson, who has gone on sabbatical at the end of her one-year contract, having handed over the reins of operational leadership to Kim.

Clare has made a big impact in a short period of time and leaves DIF in great shape, having helped build an effective organization that supports our members to innovate, implement and scale up.

Clare brought clarity and focus to DIF’s operational activities, sharpened how we communicate the value of DIF membership, built strong relationships with our liaison partners and stakeholders and helped forge productive relationships with new partners like ID4Africa and SIDI Hub. 

She used her extensive experience and contacts within the identity industry to build bridges, awareness and understanding of decentralized identity technologies in the wider technology sector, while always advocating powerfully for our members, and the close-knit team she built.   

She leaves DIF with a clear focus, stronger operational capabilities and an enhanced reputation. 

Thank you Clare!


Identity At The Center - Podcast

It’s time for a new episode of The Identity at the Center Podcast

It’s time for a new episode of The Identity at the Center Podcast! In this episode, we had the pleasure of hosting Phil Windley, Senior Software Development Manager at Amazon Web Services (AWS) Identity and co-founder/organizer of the Internet Identity Workshop. We discussed Phil's IAM journey, his involvement in the Internet Identity Workshop, and his incredible book "Learning Digital Identity."

It’s time for a new episode of The Identity at the Center Podcast! In this episode, we had the pleasure of hosting Phil Windley, Senior Software Development Manager at Amazon Web Services (AWS) Identity and co-founder/organizer of the Internet Identity Workshop. We discussed Phil's IAM journey, his involvement in the Internet Identity Workshop, and his incredible book "Learning Digital Identity." Tune in now at idacpodcast.com or find us on your favorite podcast app.

Want to win a free digital copy of Phil's book? Leave us a voicemail on our website and we will choose five of them for our February 5th episode. Those "callers" will receive a free electronic copy!

It's easy to enter: go to idacpodcast.com and click on the "Talk to Us" banner on the screen. Record a question, enter your name and email, and click send. That's it! We'll notify the winners via email.

Sunday, 14. January 2024

LionsGate Digital

A New Era in Online Identity Security: SPF, DKIM, and DMARC Protocols Enforced on Google and Yahoo!


The digital landscape is on the cusp of a monumental transformation. As of February 1, 2024, Google and Yahoo! are enforcing Sender Policy Framework (SPF), DomainKeys Identified Mail (DKIM), and Domain-based Message Authentication, Reporting, and Conformance (DMARC) protocols. This pivotal moment signifies the beginning of the end for online identity fraud, ushering us into a new era of digital identity security.

Understanding SPF, DKIM, and DMARC: The Trinity of Email Security

Before delving into the implications of this enforcement, let’s understand what SPF, DKIM, and DMARC entail:

SPF (Sender Policy Framework): This protocol verifies the sender’s IP address against the list of authorized sending IPs for a domain. It prevents email spoofing, ensuring that emails originate from legitimate sources.

DKIM (DomainKeys Identified Mail): DKIM adds a digital signature to each email, which helps the receiving server verify that the email hasn’t been tampered with and is from a legitimate domain.

DMARC (Domain-based Message Authentication, Reporting, and Conformance): This protocol uses SPF and DKIM to provide a holistic approach to email validation. It allows domain owners to decide how an email should be treated if it fails SPF or DKIM checks.

The Impact on Online Identity Fraud

The enforcement of these protocols by internet giants like Google and Yahoo! is a game-changer in combating online identity fraud. By ensuring that emails are authenticated, it becomes increasingly difficult for fraudsters to execute phishing attacks and impersonate others. This move is a critical step towards a future where digital identity is secure and trusted.

The Path to Mandatory Digital ID

This development is not just about improving email security; it’s a significant stride towards the broader goal of establishing a mandatory digital ID. The enforcement of SPF, DKIM, and DMARC protocols lays the foundation for a more secure digital identity ecosystem. It sets a precedent for other online platforms and service providers to follow suit, gradually building a more robust and fraud-resistant digital world.

What This Means for You

For tech-savvy individuals and businesses, this change is a call to action. It’s essential to ensure that your domains are compliant with these protocols. Not only will this protect your digital identity, but it will also ensure that your communications are trusted and reach their intended recipients.
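If you want to see where a domain currently stands, the policies are just DNS TXT records that anyone can look up. Here is a minimal sketch in TypeScript (Node.js, built-in dns module) that checks for published SPF and DMARC records; the domain and the example record values in the comments are placeholders, and DKIM is omitted because checking it requires knowing the selector your mail provider uses (e.g. selector._domainkey.yourdomain.com):

import { resolveTxt } from "node:dns/promises";

// SPF is published as a TXT record on the domain itself;
// DMARC is published as a TXT record on the _dmarc subdomain.
async function checkEmailAuth(domain: string): Promise<void> {
  try {
    const spf = (await resolveTxt(domain))
      .map((chunks) => chunks.join(""))
      .filter((txt) => txt.startsWith("v=spf1"));
    console.log("SPF:", spf.length ? spf : "none found");
    // e.g. "v=spf1 include:_spf.example-mailer.com ~all"
  } catch {
    console.log("SPF: no TXT records found");
  }

  try {
    const dmarc = (await resolveTxt(`_dmarc.${domain}`))
      .map((chunks) => chunks.join(""))
      .filter((txt) => txt.startsWith("v=DMARC1"));
    console.log("DMARC:", dmarc.length ? dmarc : "none found");
    // e.g. "v=DMARC1; p=quarantine; rua=mailto:reports@example.com"
  } catch {
    console.log("DMARC: no _dmarc record published");
  }
}

checkEmailAuth("example.com").catch(console.error);

If either lookup comes back empty, that is the gap to close with your DNS provider before the February 2024 enforcement affects your deliverability.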

Embracing the Future

The enforcement of SPF, DKIM, and DMARC protocols marks a significant milestone in our journey towards a secure digital future. It’s a crucial step in the fight against online identity fraud, paving the way for the inevitable integration of mandatory digital IDs. As we embrace these changes, we move closer to a world where our digital identities are as secure and authentic as our physical ones.

Stay informed about online identity security, stay secure, and join us in welcoming this new era of digital identity protection.

Lions Gate Digital – Leading the Charge in Digital Security

The post A New Era in Online Identity Security: SPF, DKIM, and DMARC Protocols Enforced on Google and Yahoo! appeared first on Lions Gate Digital.

Friday, 12. January 2024

Ceramic Network

Interfaces in Ceramic Protocol: Intro Guide

Interfaces, a new feature of ComposeDB, offer standardization, flexibility and efficient data queries.

The Ceramic protocol has expanded ComposeDB's capabilities with a new feature called Interfaces (introduced in ComposeDB v0.6.0). Interfaces are important for setting standards in data models.

Interfaces also constitute another step toward real data composability, which is the ability to easily combine and recombine data from various sources to create new datasets or insights.

This article explains Interfaces, why they're useful, and how they're implemented in ComposeDB.

What are Interfaces?

Interfaces are like a blueprint for data models. They define a set of fields that any type or model must include to be compliant with the Interface. This approach ensures that different models, though unique, maintain a level of standardization and interoperability.

In ComposeDB, Interfaces help create a common ground for diverse data models.

For a simplified example, you could have an Interface Vehicle that represents any kind of vehicle in a transportation system.

interface Vehicle {
  id: ID!
  model: String!
  operators: [Person]
  usedIn: [Route]!
}

This means that any type that implements Vehicle must have these specific fields.

Here are some types that might implement Vehicle:

type Car implements Vehicle {
  id: ID!
  model: String!
  operators: [Person]
  usedIn: [Route]!
  passengerCapacity: Int
  vehicleType: String
}

type Airplane implements Vehicle {
  id: ID!
  model: String!
  operators: [Person]
  usedIn: [Route]!
  wingspan: Float
  airline: String
}

You'll notice that both of these types possess all the fields from the Vehicle Interface, but they also feature additional fields (passengerCapacity and vehicleType for Car; wingspan and airline for Airplane) that are unique to that specific kind of vehicle.

Interfaces are particularly useful when you wish to return an object or a collection of objects, and these objects could represent several different types.
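To make that concrete, here is a hedged sketch of how a single query against the Vehicle interface might look from a ComposeDB client. The ComposeClient and executeQuery calls follow the ComposeDB JavaScript client, but the node URL, the generated definition import and the vehicleIndex query name are assumptions that depend on how your own composite is compiled and deployed:

import { ComposeClient } from "@composedb/client";
// Hypothetical path to your compiled composite's runtime definition.
import { definition } from "./__generated__/definition.js";

const compose = new ComposeClient({
  ceramic: "http://localhost:7007", // assumed local Ceramic node
  definition,
});

// One query against the interface returns Cars and Airplanes alike;
// inline fragments pull in the fields specific to each implementing type.
async function listVehicles(): Promise<void> {
  const result = await compose.executeQuery(`
    query {
      vehicleIndex(first: 10) {
        edges {
          node {
            id
            model
            ... on Car { passengerCapacity }
            ... on Airplane { wingspan airline }
          }
        }
      }
    }
  `);
  console.log(JSON.stringify(result.data, null, 2));
}

listVehicles().catch(console.error);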

Benefits of Interfaces in ComposeDB

Interfaces in ComposeDB offer several advantages:

Standardization: They create a common standard for different models to follow, which is essential for consistency.

Flexibility: Interfaces allow for the expansion of models by adding new fields, without breaking the established standards. (Note: Interfaces do not extend the existing models; they create new models that implement the same fields and constraints.)

Efficient data queries: With Interfaces, it's easier to query data across various models when these models share the same interface. You don't need to query all the models separately; you can query by a single interface that these models share. This makes data retrieval more efficient.

Interface Implementation Example

An excellent example of Interfaces in action within Ceramic is the verifiable credentials Interface. This Interface example standardizes the structure for decentralized identity systems.

It establishes a standard set of fields for all credential models, ensuring consistency across applications. At the same time, it allows you to add whatever additional fields you need for your own application.

## Define the overarching VC interface that acts agnostic of our proof type
interface VerifiableCredential @createModel(description: "A verifiable credential interface") {
  controller: DID! @documentAccount
  issuer: Issuer!
  context: [String!]! @string(maxLength: 1000) @list(maxLength: 100)
  type: [String!]! @string(maxLength: 1000) @list(maxLength: 100)
  credentialSchema: CredentialSchema!
  credentialStatus: CredentialStatus
  issuanceDate: DateTime!
  expirationDate: DateTime
}

## Use the main interface to create a more specific interface - this one is for EIP712
interface VCEIP712Proof implements VerifiableCredential @createModel(description: "A verifiable credential interface of type EIP712") {
  controller: DID! @documentAccount
  issuer: Issuer!
  context: [String!]! @string(maxLength: 1000) @list(maxLength: 100)
  type: [String!]! @string(maxLength: 1000) @list(maxLength: 100)
  credentialSchema: CredentialSchema!
  credentialStatus: CredentialStatus
  issuanceDate: DateTime!
  expirationDate: DateTime
  proof: ProofEIP712! # The new field that is not present in the original interface
}

## Define our EIP712 type that uses VerifiableCredential and VCEIP712Proof interfaces and, on top of that, adds credentialSubject specific to our use case
type VerifiableCredentialEIP712 implements VerifiableCredential & VCEIP712Proof
  @createModel(accountRelation: LIST, description: "A verifiable credential of type EIP712")
  @createIndex(fields: [{ path: "issuanceDate" }])
  @createIndex(fields: [{ path: "issuer" }]) {
  controller: DID! @documentAccount
  issuer: Issuer!
  context: [String!]! @string(maxLength: 1000) @list(maxLength: 100)
  type: [String!]! @string(maxLength: 1000) @list(maxLength: 100)
  credentialSchema: CredentialSchema!
  credentialStatus: CredentialStatus
  issuanceDate: DateTime!
  expirationDate: DateTime
  proof: ProofEIP712!
  credentialSubject: CredentialSubject! # The new field that is not present in any of the two implemented interfaces
}

This snippet shows how an Interface (VerifiableCredential) is defined and how a specific type (VerifiableCredentialEIP712) implements it. This snippet is from our Verifiable Credentials example app and you can see more on GitHub.

This example of the verifiable credentials Interface standardizes the basic structure for credentials in decentralized identity systems. By doing so, it facilitates various applications in creating, exchanging, and interpreting credentials in a consistent and reliable way.

Conclusion

Interfaces are crucial in the Ceramic protocol for helping decentralized applications work together. As managing data in a decentralized way becomes more common, knowing and using Interfaces will become more important.

Are you currently building a project on Ceramic or want to connect with the core team? Reach us here.


We Are Open co-op

Open Recognition — A feminist practice for more equal workplaces

Open Recognition — A feminist practice for more equal workplaces How can we use Open Recognition to create better workplaces?

This following video is the recording of a talk I originally planned to give at the ePIC 2023 conference in Vienna. Unfortunately I got covid the day I was supposed to give that talk. Which is why I recorded it for you.

In this talk I explore how feminist practices and open recognition are connected and can be used to create better and equal workplaces.

Here is a link to the slidedeck!

Do you want to know more about this? Do you have specific questions on how to transform you workplace? Or do you have comments for this video? Feel free to write me or comment this blogpost!

Open Recognition — A feminist practice for more equal workplaces was originally published in We Are Open Co-op on Medium, where people are continuing the conversation by highlighting and responding to this story.

Thursday, 11. January 2024

Origin Trail

The V8 Foundation


“Q. How do you propose to do this?

A. By saving the knowledge of the race. The sum of human knowing is beyond any one man; any thousand men. With the destruction of our social fabric, science will be broken into a million pieces. Individuals will know much of the exceedingly tiny facets of which there is to know. They will be helpless and useless by themselves. The bits of lore, meaningless, will not be passed on. They will be lost through the generations. But, if we now prepare a giant summary of all knowledge, it will never be lost. Coming generations will build on it, and will not have to rediscover it for themselves. One millennium will do the work of thirty thousand.”

Hari Seldon, Foundation series by Isaac Asimov (1951)

October’s DKGcon saw the launch of the Metcalfe Genesis phase of the OriginTrail roadmap. The Decentralized Knowledge Graph (DKG) has seen staggering growth since, reaching over 3MM Knowledge Assets published before the end of 2023. New partners have joined the OriginTrail ecosystem, bringing industry-leading real-world adoption and solving tangible problems of misinformation across sectors.

In 2024, we’re on the cusp of monumental progress as the Metcalfe Genesis phase continues, leading up to the much-anticipated launch of the OriginTrail DKG V8. On this path, we will see three impact bases get established, forming the V8 Foundation which is inspired by the legendary works of Isaac Asimov, symbolizing the bold steps into a future where knowledge and innovation converge.

Impact base: Trantor (established in Q1 2024)

One of the prominent features of Trantor was the Library of Trantor, in which librarians indexed the entirety of human knowledge by walking up to a different computer terminal every day and resuming where the previous librarian left off.

Catalyst 1: Beta Knowledge mining launch

Knowledge Mining is the process of producing high-quality, blockchain-tracked knowledge for artificial intelligence (AI), supported by the NeuroWeb (parachain on Polkadot, formerly known as OriginTrail Parachain). During the beta knowledge mining, an initial 1,000,000 NEURO token pool will be made available to be mined by publishing high-quality Knowledge Assets to the DKG. Miners will be able to apply for the beta program and start knowledge mining as soon as January. The beta Knowledge miners will be eligible to claim NEURO rewards by publishing Knowledge Assets using both NeuroWeb and Gnosis blockchains. The beta program will be gathering key learnings on the dynamics of knowledge mining as the system matures from beta towards its fully decentralized version on NeuroWeb, setting the stage for the future of knowledge creation across all of humanity’s knowledge fields. Details are available in OT-RFC-20 and more will be shared in the upcoming knowledge mining kit release.

Catalyst 2: Delegated staking launch

As NeuroWeb is loaded with growing enterprise adoption and is used in production environments, the incubation of the new DKG staking module will initially be validated on the Gnosis chain. Gnosis is the ideal environment for staking incubation as DKG V6 integration will already include the new staking module (no breaking change risk). Igniting network activity on the Gnosis chain, 1,000,000 TRAC will be used to kickstart knowledge creation. At the same time, the DKG on NeuroWeb is expected to grow faster with Polkadot’s asynchronous backing feature and other upcoming updates (increasing throughput by a significant factor). Details on delegated staking are available in OT-RFC-18.

Impact base: Terminus (established in Q2 2024)

The founding population of Terminus consisted of 100,000 especially healthy scientists, whose ostensible purpose was to publish an Encyclopedia Galactica in order to preserve science and technology. The lack of natural resources forced Terminians to develop extremely high-efficiency tech, as their knowledge due to their position as the inheritors of the Imperial Library allowed them to do so.

Catalyst 1: Random sampling module for 100x scalability increase in Knowledge Asset throughput capacity

The explosive growth of the amount of Knowledge Assets on the DKG V6 has enabled the network to proceed towards the next scalability upgrade. DKG nodes will benefit from the improved “random sampling” method, a significant optimization of blockchain usage resulting in over 100x scalability increase in the amount of Knowledge Assets supported by the DKG and the blockchain layer.

Catalyst 2: Multichain growth

The multichain approach will explore integrations ranging from the most robust blockchains (e.g. Bitcoin) to the fastest yet less robust chains with strongly established markets, in accordance with the ecosystem’s neutrality principle.

Impact base: Gaia (established in H2 2024)

The human beings on Gaia, under robotic guidance, not only evolved their ability to form an ongoing telepathic group consciousness but also extended this consciousness to the fauna and flora of the planet itself, even including inanimate matter. As a result, the entire planet became a super-organism.

Catalyst: AI-powered Autonomous Knowledge Mining

Synergizing the effects of Trantor and Terminus bases, Gaia drives the OriginTrail DKG native AI integrations with any model (e.g. Gemini, GPT), vector database (e.g. Milvus, Weaviate), and agent (e.g. AutoGPT). The AI-native capabilities enable a suite of autonomous activities — from discovering inferences in the DKG and mining new knowledge to validating existing data and autonomous agent-based solutions using trusted knowledge on the DKG. The Gaia impact base catalyzation leads to the Convergence in the Metcalfe roadmap phase.

Conclusion

The impact bases presented in this blog post serve as the V8 Foundation, describing core fields in which impact must be pursued to achieve an effective DKG V8 launch. The above does not focus on integrations with the world’s leading tech solution providers, growing governmental and institutional support, or other adoption updates. Each of the impact bases and their outcomes are therefore expected to be further augmented by the ongoing DKG adoption.

Stay #OnTrac(k)!

About OriginTrail

OriginTrail is an ecosystem-building decentralized knowledge infrastructure for artificial intelligence (AI). With the mission of tackling misinformation, which is exacerbated with AI adoption, OriginTrail enables verifiably tracking origins of information, discoverability, and integrity of knowledge to enable trusted AI. It has various applications in the domains of real-world assets (RWAs), search and recommendation engines, question-answering systems, and generally knowledge-dependent applications (such as AI systems).

OriginTrail’s initial adoption was in global supply chains, serving as a trusted hub for supply chain data sharing, allowing customers to authenticate and track products and keep these operations secure. In recent years, the rise of AI has not only created unprecedented opportunities for progress but also amplified the challenge of misinformation. OriginTrail also addresses this by functioning as an ecosystem focused on building a trusted knowledge infrastructure for AI in two ways — driving discoverability of the world’s most important knowledge and enabling the verifiable origin of the information. The adoption of OriginTrail in various enterprise solutions underscores the technology’s growing relevance and impact across diverse industries including real-world asset tokenization (RWAs), the construction industry, supply chains, healthcare, metaverse, and others.

OriginTrail is creating a Verifiable Web for decentralized AI by empowering world-class brands and builders. It utilizes its unique Decentralized Knowledge Graph and OriginTrail Parachain to deliver AI-powered search and solutions for enterprises and individuals worldwide.

OriginTrail has gained support and partnerships with world-class organizations such as British Standards Institution, SCAN, Polkadot, Parity, Walmart, the World Federation of Hemophilia, Oracle, and the EU Commission’s Next Generation Internet. These partnerships contribute to advancing OriginTrail’s trusted knowledge foundation and its applicability in trillion-dollar industries while providing a verifiable web of knowledge important in particular to drive the economies of RWAs.

Web | Twitter | Facebook | Telegram | LinkedIn | GitHub | Discord

The V8 Foundation was originally published in OriginTrail on Medium, where people are continuing the conversation by highlighting and responding to this story.


FIDO Alliance

2023 FIDO Seoul Public Seminar: Charting the Future of Online Authentication with Passkeys

The FIDO Seoul Public Seminar, “Passkeys – Online Authentication Paradigm Shift,” was hosted on December 5, 2023, at the SK Telecom Headquarters in Seoul by SK Telecom’s developer community, DEVOCEAN. […]

The FIDO Seoul Public Seminar, “Passkeys – Online Authentication Paradigm Shift,” was hosted on December 5, 2023, at the SK Telecom Headquarters in Seoul by SK Telecom’s developer community, DEVOCEAN. This marked yet another significant milestone in the journey towards simpler and stronger online authentication. The event drew an audience of over 200, showcasing a myriad of updates and advancements in the realm of passkeys.

[Pictures from 2023 FIDO Seoul Public Seminar]

The sessions, which included diverse case studies from Korea, USA, Vietnam, Malaysia, and other countries, offered attendees a comprehensive view of FIDO authentication and FIDO Device Onboard (FDO) implementations, presenting a unique blend of global insights and practical applications. We are excited to share highlights and recorded videos from some of the seminar’s pivotal sessions:

FIDO Alliance Update – Passkey Tipping Point: Andrew Shikiar, Executive Director & CMO of the FIDO Alliance, discussed the rapid adoption and market interest in passkeys, underlining their potential to revolutionize online authentication. (Watch Video)

KISIA-FIDO Alliance Collaboration Guide: Yeri Won, Department Head at KISIA, presented a new subsidized program for Korean small and medium enterprises that apply for FIDO certifications commencing in 2024. (Watch Video)

The Beginning of the End of Passwords: Christiaan Brand from Google provided insights into Google’s strategy for implementing passkeys as a primary sign-in method. (Watch Video)

Experience with Passkeys on Galaxy Devices: Samsung Electronics’ Passkey Task Force members, Jong Su Kim and Joon Suk Lee, discussed the integration of passkeys in Galaxy devices, along with Samsung Pass and Samsung Internet Web Browser. (Watch Video)

The Key to the Future: Understanding the Future Brought by Passkeys: Ki-Eun Shin, Principal Researcher at SK Telecom and leader of the FIDO Korea Working Group Technical Sub-Group, offered forward-looking perspectives on the application of passkeys in various Korean business sectors. (Watch Video)

This remarkable event garnered significant attention from various local media outlets, including ZDNet Korea, IT Daily, Data Net, Boan News, Daily Secu, and Byline Networks. Coverage by these media outlets highlighted the seminar’s forward-thinking approach, noting its potential to shape the future of online authentication. They emphasized a move away from knowledge-based authentication methods such as passwords and OTPs, which are easy targets for cyber-attacks like phishing and credential stuffing.

We extend our deepest gratitude to all the distinguished speakers for their invaluable insights and contributions. Special thanks to Seungwon Shin, Vice President of the Security Team at Samsung Electronics, Heungyeol Yeom, Professor at Soonchunhyang University, Jaebeom Kim from TTA, Leewon Ye of KISIA, Yoo Seok Han from AirCuve, Chong Seak Sea of SecureMetric Malaysia, Jaehyung Lee from Octatco, Simon Trac Do of VinCSS Vietnam, Christiaan Brand of Google, Sang Jun Park from Microsoft, Jong Su Kim and Joon Suk Lee of Samsung Electronics, and Kieun Shin of SK Telecom. For additional details about the seminar agenda, please visit the 2023 FIDO Seoul Public Seminar landing page.


We Are Open co-op

On the strategic uses of ambiguity

Introducing a continuum of ambiguity and reflecting on the importance of avoiding dead metaphors in the workplace

Image created by DALL-E 3

There are many things that mark individuals and organisations out as different. We can describe various characteristics and dispositions, but we can also point to different tolerances.

In this regard, perhaps the one we talk about the most is ‘risk’ tolerance, but in this post let’s explore the tolerance that individuals and organisations have to ambiguity. It’s key to understanding how change happens.

Let’s define terms

It might seem a little ironic to define a term like ‘ambiguity’, but it’s important to separate it from the idea that something is ‘vague’ or ‘unclear’.

The Oxford English Dictionary, for example, contains several definitions for ambiguity:

Originally and chiefly with reference to language: the fact or quality of having different possible meanings; capacity for being interpreted in more than one way; (also) lack of specificity or exactness.

An instance of ambiguity or uncertain meaning; a doubt; an uncertainty.

A word or phrase that can be interpreted in more than one way; an ambiguous expression.

A nuance which allows for an alternative reading of a piece of language; the fact or quality of having one or more such nuances.

Uncertainty about one’s course of action; doubt, hesitation. Also an instance of this: a feeling of uncertainty, a doubt.

The fact or quality of being difficult to categorise or identify, esp. due to changeable or contradictory qualities or characteristics. Also: something that is difficult to categorise or identify.

If some of these definitions seem like how work sometimes feels, then you’re not alone! Life is ambiguous, but in different ways. Some types of ambiguity are more useful than others, and having a tolerance for ambiguity can be helpful in avoiding ‘dead metaphors’.

Let’s dig deeper…

Introducing a continuum of ambiguity

The image below looks simple but comes from plenty of research which spans everything from literary criticism to philosophy.

CC BY Doug Belshaw via ambiguiti.es

Dead metaphors

One way of perhaps making this continuum immediately understandable is to say that something being ‘vague’ would be over to the left. The green bars represent different types of ambiguity, and then the phrase ‘dead metaphors’ represents terms and ways of speaking which no longer have any explanatory power.

Here are some examples of dead metaphors — words or phrases which were once novel, but have become so common that they are now considered ‘dead’ or clichéd:

Thinking outside the box — originally meant to encourage creative and unconventional thinking, it’s now a standard phrase for any form of innovation or different thinking.

Moving the goalposts — once a useful metaphor for changing criteria or rules unfairly, it’s now a routine way of describing shifting standards or expectations.

Game changer — initially used to describe something which radically changed the existing conditions or rules, now it’s often used to describe anything even slightly innovative.

So now we understand that we’re dealing with the majority of our working life: the stuff that happens between something just being really unclear or vague, on the one hand, and something being a dead metaphor, on the other.

Where does ambiguity come from?

While there are different types of ambiguity, they stem from the overlap between what is denoted by a statement and what it connotes.

CC BY Doug Belshaw via ambiguiti.es

The denotative aspect represents the literal, dictionary definition of a word, while the connotative aspect represents the associations, emotions, or additional meanings that the word carries beyond its literal definition.

There are many examples to help understand this:

Dove — a bird (denotative) and also a sign of peace (connotative)

Home — where someone lives (denotative) and also a feeling of comfort and familiarity (connotative)

Cold — low temperature (denotative) and also a way of interacting that is unfriendly or distant (connotative)

The overlap in the diagram, indicated by the arrow, represents words that carry both denotative and connotative meanings simultaneously. This can lead to ambiguity because the person who wrote or spoke the words cannot control their connotation in the mind of someone else. The ambiguity arises from the interplay between the precise, literal meaning and the variable, subjective interpretation of a word’s connotation.

Types of ambiguity

Given that we live in language and use it to understand the world around us, ambiguity is inevitable. However, as mentioned above, and as shown in the continuum image, there are different types of ambiguity, which are more or less useful for various purposes.

One way of thinking about these different types is as the state change of a substance from gas, to liquid, through to solid:

Generative Ambiguity — this is where ambiguity can give rise to new ideas or interpretations. These are not usually well-defined and often only make sense to an individual. For example, you may have a flash of insight as to how something you’ve been grappling with in your personal life can help you with your professional life. This would likely only make sense to you.

Creative Ambiguity — here, ambiguity is used to help other people understand an idea. It remains highly contextual, so would only be understood by someone who shares your context: for example, you use a new term to make sense of how some problems your sector has been having are related.

Productive Ambiguity — this represents ambiguity that is beneficial or intentional in generating positive outcomes because it helps people shift their view. For example, as part of a strategy where leaving things unsaid or open to interpretation can lead to more flexible and adaptable solutions.

So Generative Ambiguity is where a term or way of understanding something works for you. Creative Ambiguity is where it works for people like you (i.e. with similar domain knowledge). Productive Ambiguity is when it works for most people.

Using different types of ambiguity in your work

The first thing to say is that we should avoid vague, unclear statements as much as possible. Likewise, we need to stop talking in dead metaphors. Neither are particularly useful. Challenge both wherever you can.

Ambiguity, however, is useful, and sitting in it for a period of time can be creatively beneficial. Building up our tolerance of ambiguity involves recognising that not all ambiguity is equal — some types serve us better than others. Understanding how to use different types of ambiguity can be particularly useful in the workplace.

Generative Ambiguity

Generative ambiguity is a state where ideas are in flux, akin to a brainstorming session where every contribution is valued and nothing is off-limits.

To use this in your work:

Encourage free thinking: create an environment where team members feel comfortable sharing unpolished ideas. This could be through regular brainstorming sessions or a digital ‘ideas board’.

Embrace uncertainty: when presented with a new problem, resist the urge to find immediate clarity. Instead, allow ideas to percolate and evolve. (There’s a wonderful scene in I Capture the Castle where the character played by Bill Nighy points to scraps of paper pinned all around his room and says that his “ideas are percolating”)

Reflect individually: set aside time for personal reflection. This can help in connecting disparate ideas, leading to innovative solutions.

Creative Ambiguity

Creative ambiguity thrives on context and shared understanding. It’s about communicating new concepts in a way that resonates with a specific audience.

To encourage this:

Know your audience: tailor your communication to the background and experience of your team. After all, what makes sense in one context may not in another. This is particularly important if you work across countries and timezones.

Use metaphors wisely: use fresh metaphors that align with your team’s experiences to illustrate complex ideas. Metaphors are extremely powerful, so think about the benefits and potentially unexpected consequences of introducing a new one.

Test new terminology: introduce new terms or phrases that encapsulate broader concepts or strategies specific to your work environment. See what resonates.

Productive Ambiguity

Productive ambiguity can be strategic, helping build a culture which values flexible thinking without a one-size-fits-none ‘solution’.

To help this thrive:

Encourage adaptability: when setting project goals, be clear on the outcomes but flexible on the methods to achieve them, allowing room for innovative approaches.

Practice strategic communication: sometimes less is more. Give enough information to guide but not so much that it stifles alternative interpretations or solutions.

Enable collaboration: bring together diverse teams to work on projects. The intersection of different knowledge bases can lead to a fruitful ambiguity that encourages new ways of thinking.

Conclusion

Navigating the continuum of ambiguity requires an awareness of language’s power and its impact on thought and communication. By consciously employing different types of ambiguity, we can create environments full of innovation, creativity, and strategic thinking.

It’s about choosing the right tool for the job: sometimes a scalpel, sometimes a hammer, and sometimes a paintbrush. Each type of ambiguity can be your ally, if approached with intention and understanding.

Do you need some help with navigating and using ambiguity in your organisation? WAO has extensive experience in mapping and finding leverage points! Get in touch for a free initial chat 😊

Note: this post is based on Doug Belshaw’s work for his doctoral thesis and his blog, ambiguiti.es

On the strategic uses of ambiguity was originally published in We Are Open Co-op on Medium, where people are continuing the conversation by highlighting and responding to this story.

Wednesday, 10. January 2024

Ceramic Network

RariMe: Bringing Your Web3 Identity to Metamask with Ceramic

RariMe initially used MetaMask local storage to hold credentials, but switched to Ceramic as it allows for data to be synced between multiple devices

Rarimo protocol allows users to seamlessly port their identities across Web3. One of Rarimo’s use-cases is RariMe, an extension that allows users to store and manage their identity credentials inside of Metamask, generating privacy-protecting Zero-Knowledge proofs that can be submitted to dApps. So far, there have been over 6,000 RariMe downloads. 

RariMe initially used MetaMask local storage to hold credentials, but switched to Ceramic as it allows for data to be synced between multiple devices—both increasing the portability and improving the disaster recovery process for users. 

The Story of RariMe

Rarimo is an interoperability protocol solving identity fragmentation in Web3. It aggregates issuers and standards, allowing dApps to verify and interact with a broad range of identity artifacts. This in turn allows users to port their identities across Web3. All of this is accomplished with privacy-preserving Zero-Knowledge technology. Rarimo can be thought of as a Chainlink for identities. 

Prior use cases include: 

Launching the Proof-of-Humanity plug-in. This plug-in allows platforms to filter out bots by gating access to users who have proven their humanity. Through a single integration, dApps gain instant access to a range of four identity providers representing different methods of verification: Gitcoin Passport, Unstoppable Domains, Civic, and Worldcoin. This allows users to choose exactly how they prove their humanity.

Scaling PolygonID by making their Zero-Knowledge Proofs multi-chain.

Launching RariMe, a MetaMask Snap. RariMe allows users to store identity credentials inside their MetaMask and to generate Zero-Knowledge Proofs for all of their credentials, so users are able to store both their Web3 assets and identities in the same location.

Why RariMe Built With Ceramic

Multiple other solutions were considered including IPFS, FileCoin, and Arweave. In the end, Ceramic was selected because it is free for users and comes with a convenient high-level API provided by Compose DB.

RariMe snap uses ComposeDB as the primary encrypted storage of Verifiable Credentials. The keys for authorization and encryption are tied to the MetaMask mnemonic phrase. The safe getEntropy method is used to derive them deterministically without compromising the security of crypto accounts. It is enough for the user to have the MetaMask mnemonic to retrieve and decrypt all her RariMe credentials.
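For illustration, here is a rough TypeScript sketch of how entropy derived from the wallet mnemonic can seed an encryption key inside a MetaMask Snap. The snap_getEntropy call is the standard Snaps API; the salt value and the key-derivation step are assumptions for the sketch, not RariMe's actual implementation:

import { sha256 } from "@noble/hashes/sha256";
import { utf8ToBytes } from "@noble/hashes/utils";

// The `snap` global is provided by the MetaMask Snaps runtime.
declare const snap: {
  request(args: { method: string; params?: unknown }): Promise<unknown>;
};

async function deriveStorageKey(): Promise<Uint8Array> {
  // snap_getEntropy returns entropy derived deterministically from the
  // wallet mnemonic plus the snap ID and salt, so the same mnemonic yields
  // the same key on every device -- no extra password or backup needed.
  const entropy = (await snap.request({
    method: "snap_getEntropy",
    params: { version: 1, salt: "credential-storage" }, // salt is illustrative
  })) as string; // 0x-prefixed hex string

  // Stretch the entropy into a 32-byte symmetric key (sketch only; a real
  // implementation would use a proper KDF and domain separation).
  return sha256(utf8ToBytes(`${entropy}:encryption-key`));
}

The same key material can then authorize writes to ComposeDB and encrypt the credential payloads before they are stored.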

This solution has the following advantages:

Data is instantly synced between all devices that use the same MetaMask mnemonic.

There is no need for manual backups because the data in ComposeDB is durable.

Straightforward UX: no extra steps or passwords required.

Local storage capacity does not limit the maximum number of credentials; ComposeDB is scalable enough to handle millions of credentials.

Try it Out

Test out RariMe by downloading it and by generating a Rarimo Proof-of-Humanity credential, which you can store inside your RariMe and use to gain access to gated human-only spaces, including a VIP Rarimo discord channel. 

Download RariMe here: https://rarime.com/ 

Prove your humanity here: https://robotornot.rarimo.com/


Next Level Supply Chain Podcast with GS1

E-commerce Entrepreneurship: Navigating the Challenges and Triumphs of Product Innovation


Lisa Lane is the innovative mind behind Rinseroo, an e-comm shower darling to solve the pain point of washing big dogs in the shower. She joins Reid and Liz to discuss the intricacies of bringing a new product to market, including the importance of building and maintaining brand identity and the need for brand protection in the ever-evolving landscape of e-commerce. 

Her perspective offers practical advice for entrepreneurs seeking to establish and safeguard their brands in online marketplaces. She underscores the immense potential of platforms like Amazon and TikTok Shop for reaching new customers and driving sales and also delves into the challenges faced during product manufacturing. Tune in for her insight into bringing an idea to life in the marketplace.

Key takeaways: 

The significance of understanding the market, manufacturing, distribution, and advertising aspects of product development when bringing a new product to market.

The significance of having a viable product that is different from what's already out there, determining the market size and potential customer appeal, understanding profit margins, and making the right decisions about manufacturing and distribution.

The power of ecommerce platforms and technology, particularly Amazon and TikTok, in reaching a wide audience and scaling a business, which have enabled small businesses to thrive in the ecommerce landscape.

The importance of building a strong brand identity and obtaining trademarks to protect products from knockoffs, particularly in the ecommerce space, and the role of GS1 barcodes in ensuring legitimacy and brand protection.

 

Connect with GS1 US:

Our website - www.gs1us.org

GS1US on LinkedIn

 

Connect with the guest:

Follow Lisa Lane on LinkedIn

Check out Rinseroo

Tuesday, 09. January 2024

Ceramic Network

CeramicWorld 01


Hello and welcome to the first edition of CeramicWorld, a monthly roundup of everything happening across the Ceramic ecosystem.

Ceramic Community Signup!

Community is really important to us at Ceramic! We'd like to know more about you and what you're working on, so we can create better technology and experiences.

Click the link below to learn more about the tech, meet the team, become an official Ceramic community member, explore integrations and partnerships, or share your project.

Take me there!

Play with ComposeDB’s no-code sandbox

ComposeDB Sandbox is an awesome no-hassle way to try ComposeDB's GraphQL API without needing to install anything. While playing in the sandbox, you'll also notice that ComposeDB now supports query filtering and ordering, which allows developers to filter and/or order the results of their data queries.

Play with ComposeDB Sandbox

Watch Optimism, Base, MetaMask, Karma3 discuss decentralized reputation @ RepConnect Istanbul

Watch MetaMask, Optimism, Base, and Karma3 share learnings from building industry-leading reputation products, protocols, and systems. For even more content, catch the full RepConnect Summit Istanbul Recap.

We’re going to be hosting a sequel event at ETHDenver, so be sure to keep your eyes peeled for a signup form posted on our Twitter account.

Watch RepConnect talks and panels

PriceFeed dataset is on fire! 📈

As crypto markets pick up, so does interest in coin prices. Over the last month, we’ve seen the PriceFeed dataset emerge on Ceramic mainnet, rapidly growing from obscurity to more than 70K entries at time of writing. As an example, this document stores a “snapshot” of the price of Solana ($SOL) at a given moment, but there are also others for Ethereum ($ETH) and Bitcoin ($BTC).

PriceFeed is also notable in that it’s the first dataset on mainnet authored by a “data oracle,” responsible for bridging the gap between data that exists elsewhere and Ceramic. Just as oracles play an important role in blockchain networks, we expect data oracles to play a huge role in generating verifiable datasets in Ceramic.

Since the PriceFeed dataset exists on Ceramic mainnet and continues to grow every day, you can now query, compose, and remix this data in your applications.

See PriceFeed dataset on Cerscan

Recon, scalability protocol, enters final testing

Core devs are finalizing development on Recon, a set reconciliation protocol for syncing data between Ceramic nodes. Recon is a more scalable, purpose-built alternative to IPFS pubsub that enables improved efficiency and greater controls for how data is synced between Ceramic nodes. For an idea of performance gains, during early testing Recon was able to publish 1M streams to Ceramic in under an hour.

Explore Recon (CIP-124)

Devtools

Improved Ceramic developer portal

Ceramic documentation has gone through a major update. There’s now a landing page to welcome developers and serve as a jumping-off point on their Ceramic journey. The site also now includes the docs for ComposeDB and Decentralized Identifiers, so there’s only one site to worry about. Visit the new site to get started playing or building with verifiable data.

Cerscan and S3 Explorers display richer network data

The S3.xyz hub offers exciting ways for developers to view activity across the network, from ComposeDB model usage to application-level views, and more. Developers can also query an embedded GraphQL instance while inspecting ComposeDB schemas, allowing users to run pre-crafted queries or create custom queries of their own.

Cerscan gets its own set of improvements too!

HireNodes offers commercial-grade Ceramic node hosting

When you’re ready to move from testnet to mainnet, you may face new challenges of scalability, reliability, security, and cost-effectiveness. HireNodes has emerged as an early leader in Ceramic node hosting for projects looking for production-grade deployments.

Intuition, decentralized reputation protocol, enters private alpha

Intuition, an attestations and credentials protocol built on Ethereum and ComposeDB, is now live on mainnet for their private alpha.

Tutorial: Storing Ethereum Attestation Service (EAS) attestations on Ceramic

Ethereum Attestation Service (EAS) is a protocol for credentials used by some of the most popular dapps, DAOs, and Web3 communities. Now you can use Ceramic as a storage backend for EAS attestations. There's a great tutorial on the official EAS documentation site for how to store your EAS credentials on Ceramic.

Ceramic Protocol

Upgrade to Ceramic v3!

Ceramic v3 is out now with Node.js version 20 support. It’s a major update that allows developers to leverage the most recent features and performance updates of Node.js. Developers are strongly advised to upgrade their Ceramic nodes to v3, as older node versions will no longer receive active technical support and bug fixes. In addition to the Node.js version update, Ceramic v3 bids farewell to all legacy, non-standard DID methods like DID:ETH, DID:NFT, and DID:SAFE.

Firehose API development underway

3Box Labs is kicking off work on the Firehose API. Firehose is a new set of Ceramic APIs that will enable developers to build their own custom database and indexing solutions, such as a ComposeDB alternative, on top of Ceramic’s stream APIs. This project is still in the early stages of development. Keep an eye on Ceramic public roadmap for status updates.

Want to join Ceramic Core Devs?

Catch up on notes from past calls and add the Ceramic Calendar to join a future one.

In Case You Missed It

Data Control Patterns in Decentralized Storage

An Easy Way to Create Verifiable Claims on Ceramic

Attestations vs. Credentials: Making Claims Interoperable

FAQ: Ceramic Network & ComposeDB

Velocity Network

2024 Velocity for Startups Program

Our next Velocity for Startups cohort launches soon. The program's objective is to infuse greater innovation into the work of Velocity Network Foundation. The post 2024 Velocity for Startups Program appeared first on Velocity.

The post 2024 Velocity for Startups Program appeared first on Velocity.

Monday, 08. January 2024

FIDO Alliance

Webinar Recap: Passkey Technology Implementation and Application

On December 27, 2023, FIDO China Working Group successfully hosted a webinar titled “Passkeys Technology Implementation and Application.” Chaired by Henry Chai, who also serves as the co-chair of FIDO […]

On December 27, 2023, FIDO China Working Group successfully hosted a webinar titled “Passkeys Technology Implementation and Application.” Chaired by Henry Chai, who also serves as the co-chair of FIDO China Working Group, the event provided a highly professional platform for discussion. Yang Li (OPPO), Shaobo Han (Uni-ID), and Mengyang Lin (FIT2CLOUD) joined as guest speakers and shared their insights and practical experiences regarding the implementation and application of passkeys technology. Over 100 industry professionals attended the event and actively participated in post-sharing discussions.

This event centered on advancing the implementation and utilization of passkeys technology within China’s industrial sector. It served as a platform for professionals, scholars, entrepreneurs, and technology developers in the field of cybersecurity to engage in meaningful dialogue and collaboration. As technology continues to evolve and demand for applications grows, the FIDO Alliance anticipates that the implementation of passkeys will become more ingrained in society, serving as a vital tool in safeguarding information security. This development is expected to infuse fresh momentum into the journey of internet security and digitization, and propel the cybersecurity industry forward.

FIDO WeChat official account tweets:

Invitation: https://mp.weixin.qq.com/s/-96gh7WjQoIWAMy72GRRcg
Summary: https://mp.weixin.qq.com/s/7SzXM51FsFpMJUOESYkDlg

GS1

Using GS1 standards to trace surgical and dental instruments

Using GS1 standards to trace surgical and dental instruments Keeping track of many surgical instruments by visual inspection presented problems from a risk management and workload perspective. There are around 40,000 surgical instruments at Tokyo Medical and Dental University Hospital. Keeping track of these by visual in

There are around 40,000 surgical instruments at Tokyo Medical and Dental University Hospital. Keeping track of these by visual inspection presented problems from a risk management and workload perspective.

The hospital has acquired its own GS1 Company Prefix and directly marked a GS1 DataMatrix encoding GS1 Global Individual Asset Identifier (GIAI) on all 40,000 surgical instruments, including dental equipment such as handpieces. This has made it possible to track and trace items.
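For a rough illustration of what such an identifier looks like (the company prefix, asset reference, and resolver domain below are made up, not the hospital’s actual values), a GIAI is carried in GS1 Application Identifier (8004) and can also be expressed as a GS1 Digital Link style URI:

// Illustrative sketch only: builds the human-readable GS1 element string and a
// GS1 Digital Link URI for a Global Individual Asset Identifier (GIAI).
// AI (8004) is the GS1 Application Identifier for GIAI; in the actual DataMatrix
// the AI is signalled with FNC1 rather than printed parentheses.
function giaiElementString(companyPrefix: string, assetReference: string): string {
  const giai = `${companyPrefix}${assetReference}`; // GIAI = GS1 Company Prefix + asset reference
  return `(8004)${giai}`;
}

function giaiDigitalLink(resolverDomain: string, giai: string): string {
  return `https://${resolverDomain}/8004/${giai}`; // GS1 Digital Link form of the same identifier
}

// Hypothetical values for a single surgical instrument:
console.log(giaiElementString("9504000", "INSTR000123"));                  // (8004)9504000INSTR000123
console.log(giaiDigitalLink("id.example-hospital.jp", "9504000INSTR000123"));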

Business goal
GS1 Healthcare Reference Book 2023-2024 (reference_book_2023-2024_jpn_final_.pdf)

Identity At The Center - Podcast

The Identity at the Center Podcast is back from the holiday

The Identity at the Center Podcast is back from the holiday break! To kick off 2024, we had the pleasure of chatting with Gary Rowe, CEO & Founder of TechVision Research. We discussed the upcoming Chrysalis IV conference, and what attendees can expect to hear about the future of identity, security, governance, and privacy within large enterprises. As a special bonus, IDAC listeners can use th

The Identity at the Center Podcast is back from the holiday break! To kick off 2024, we had the pleasure of chatting with Gary Rowe, CEO & Founder of TechVision Research. We discussed the upcoming Chrysalis IV conference, and what attendees can expect to hear about the future of identity, security, governance, and privacy within large enterprises.

As a special bonus, IDAC listeners can use the discount code IDAC24 when registering for the Chrysalis IV conference at https://techvisionresearch.com/chrysalis-iv/ to save 💰$200!💰

Episode #253 is available now at idacpodcast.com and wherever you get your podcasts from.

#iam #podcast #idac

Saturday, 06. January 2024

OwnYourData

DID Rotation

The post DID Rotation appeared first on www.ownyourdata.eu.

Understanding DID Rotation: Transitioning between DID Methods

In the realm of digital identity, the concept of Decentralized Identifiers (DIDs) stands as a cornerstone. DIDs are a new type of identifier that enables verifiable, self-sovereign digital identities. This blog post explores possibilities to transition between different DID Methods and invites DID method implementors to join this journey towards more robust, interoperable, and secure digital identities.

What is DID Rotation?

DID Rotation refers to the process of updating or changing a Decentralized Identifier (DID) while maintaining the continuity and integrity of the digital identity it signifies. This procedure is essential for various reasons, including enhancing security, adhering to emerging standards, or transitioning to a more sophisticated infrastructure. To facilitate DID Rotation effectively, we have identified three fundamental requirements for a DID method:

DID Rotation Requirements

DID-ROT #1: The original DID must include a reference to the new DID. (Motivation: enable resolver to switch between DID methods)
DID-ROT #2: The new DID must include a reference to the original DID. (Motivation: provide provenance & legitimacy)
DID-ROT #3: There must be a proof that the original DID is deactivated upon rotation. (Motivation: prohibit forks)

An Example Implementation: did:oyd to did:ebsi

did:oyd was developed by OwnYourData and provides a self-sustained environment for managing decentralised identifiers. The did:oyd method cryptographically links the identifier to the DID Document and, through cryptographically linked provenance information in a public log, ensures resolution to the latest valid version of the DID Document. This contrasts with DID methods based on blockchain technology, which derive their trust anchor from the governance of the underlying ledger used for handling sensitive data.

did:ebsi is part of the European Union’s efforts to build a secure and interoperable blockchain infrastructure. It operates on a pan-European network of nodes, ensuring robustness and security. The technology encompasses APIs, smart contracts, and a decentralized ledger, which are used across various use cases to provide trusted information for business processes.

The did:oyd method offers a low entry barrier for generating a large number of DIDs, proving particularly useful in scenarios where blockchain access is impractical. However, there are instances where requirements and conditions evolve, necessitating that existing identifiers meet specific governance standards beyond the capabilities of this DID method.

Process

A process that meets the aforementioned requirements can be outlined in the following four steps, as illustrated below:

Start with an existing/original DID.
Create New DID: establish a new DID to replace the old one.
Update Original DID: link the old DID to the new one for resolvers to find the new DID.
Deactivate Original DID: deactivate the original DID to avoid forks.
Update New DID: finally, update the new DID with references to the original, ensuring a seamless transition.

Example

The following DID demonstrates resolving a did:oyd to a did:ebsi document:

did:oyd:zQmZ7wwgCxkExNeXHm9XLxAKs7Y7pubTKCHQLTxRrA3Fz51

And for reference, here are links to the respective DID Documents:

Original did:oyd
New did:ebsi

Resolution Process

The resolution process for DID Rotation involves extending implementations with a follow-alsoKnownAs=TRUE flag that instructs the resolver to correctly interpret the alsoKnownAs attribute in DID Documents. This flag ensures that when a DID is updated or rotated, the resolution process continues to recognize and track the previous DID, linking it to the new one. Essentially, by setting follow-alsoKnownAs=TRUE, the system maintains a connection between the old and new DIDs, thereby preserving the continuity and historical integrity of the digital identity throughout the rotation process.
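As a minimal sketch of this behaviour (the DID values, in-memory document store, and helper functions below are illustrative placeholders, not the OwnYourData or EBSI implementations), a resolver honoring follow-alsoKnownAs keeps following the alsoKnownAs reference of a deactivated document until it reaches the active successor DID:

// Illustrative sketch of a resolver extension for DID Rotation.
// The DID values and the in-memory "store" stand in for real did:oyd / did:ebsi
// resolution; only the alsoKnownAs and deactivation semantics matter here.
interface DidDocument {
  id: string;
  alsoKnownAs?: string[]; // DID-ROT #1 / #2: cross-references between original and new DID
  deactivated?: boolean;  // DID-ROT #3: the original DID is deactivated upon rotation
}

const store: Record<string, DidDocument> = {
  "did:oyd:example-original": {
    id: "did:oyd:example-original",
    alsoKnownAs: ["did:ebsi:example-new"],
    deactivated: true,
  },
  "did:ebsi:example-new": {
    id: "did:ebsi:example-new",
    alsoKnownAs: ["did:oyd:example-original"], // back-reference for provenance
  },
};

function resolveOnce(did: string): DidDocument {
  const doc = store[did];
  if (!doc) throw new Error(`unknown DID: ${did}`);
  return doc;
}

function resolve(did: string, followAlsoKnownAs = true): DidDocument {
  let doc = resolveOnce(did);
  const visited = new Set([did]);
  while (followAlsoKnownAs && doc.deactivated && doc.alsoKnownAs?.length) {
    const next = doc.alsoKnownAs[0];
    if (visited.has(next)) break; // guard against cycles
    visited.add(next);
    doc = resolveOnce(next);      // continue with the successor DID
  }
  return doc;
}

console.log(resolve("did:oyd:example-original").id); // "did:ebsi:example-new"

With the flag disabled, the same call would simply return the deactivated original document unchanged.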

Call to Action

The DID community is invited to provide more general support for DID Rotation. Whether you’re a developer, policy-maker, or just an enthusiast in the field of digital identities, your insights and contributions are vital! We encourage the community to:

Provide Feedback: Share your experiences, challenges, and suggestions. Your input is crucial for continuous improvement.
Support DID Rotation in your preferred DID method: Implementing DID Rotation is straightforward – it only requires the respective resolver to process information in the alsoKnownAs attribute.
Stay Informed and Educated: With the fast-evolving nature of DIDs, staying updated with the latest trends and advancements is essential.

DID Rotation is more than just a technical upgrade. It’s a step towards a future where digital identities are more secure, private, and user-centric. Let’s embrace this change together, contributing to a digital world that respects and empowers individual identity.

The work on DID Rotation has received funding from the European Union’s Horizon 2020 research and innovation program through the NGI TRUSTCHAIN program under cascade funding agreement No 101093274. For more information visit our project website here.

The post DID Rotation appeared first on www.ownyourdata.eu.

Friday, 05. January 2024

FIDO Alliance

IT Security Wire: Key Strategies for Enterprise Cybersecurity in 2024

In 2024, companies will see more adoption of passkeys and other MFA methods to access business assets. Passkey adoption, along with biometrics, hardware tokens, and public-key cryptography, will replace the […]

In 2024, companies will see more adoption of passkeys and other MFA methods to access business assets. Passkey adoption, along with biometrics, hardware tokens, and public-key cryptography, will replace the use of passwords. These security technologies will also help mitigate phishing and social engineering, which target credential theft. One way to reduce risk and boost security is the usage of proximity badges, physical tokens, or USB devices (FIDO2-compliant keys).


Geeky Gadgets: 2024 Cybersecurity trends with the evolution of artificial intelligence

The traditional password system is becoming obsolete, making way for more secure methods like FIDO standard passkeys. These new authentication tools, (which can be physical or digital), don’t require users […]

The traditional password system is becoming obsolete, making way for more secure methods like FIDO standard passkeys. These new authentication tools, (which can be physical or digital), don’t require users to remember complex passwords and are designed to reduce the risk of security breaches.


The Verge: Passkeys: All the news and updates around passwordless sign-on

By using authentication mechanisms built into a users own device offering heightened security, passkeys are expected to replace passwords in the near future. Backed by Apple, Google, and Microsoft, passkeys […]

By using authentication mechanisms built into a user’s own device, offering heightened security, passkeys are expected to replace passwords in the near future. Backed by Apple, Google, and Microsoft, passkeys are built on WebAuthn tech and stored directly on your device, making them more secure than passwords or PINs, which can easily be stolen.
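In browser terms, creating a passkey comes down to a single WebAuthn call. The sketch below uses made-up relying-party and user values; in a real deployment the challenge is issued by the server and the resulting credential is sent back to it for registration.

// Minimal browser-side sketch of creating a passkey with the WebAuthn API.
// The relying party, user details, and challenge handling are placeholders.
async function createPasskey(): Promise<Credential | null> {
  const publicKey: PublicKeyCredentialCreationOptions = {
    challenge: crypto.getRandomValues(new Uint8Array(32)), // normally issued by the server
    rp: { id: "example.com", name: "Example Service" },
    user: {
      id: new TextEncoder().encode("user-123"),
      name: "alice@example.com",
      displayName: "Alice",
    },
    pubKeyCredParams: [{ type: "public-key", alg: -7 }],   // ES256
    authenticatorSelection: {
      residentKey: "required",      // discoverable credential, i.e. a passkey
      userVerification: "required", // biometrics or device PIN
    },
  };
  // The private key never leaves the device; only the public key goes to the server.
  return navigator.credentials.create({ publicKey });
}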


Dark Reading: Getting Started With Passkeys, One Service at a Time

Executive Director of the FIDO Alliance, Andrew Shikiar, emphasized the potential for over 7 billion accounts enabled to use passkeys at the end of 2023. Supported by Apple, Google, and […]

Executive Director of the FIDO Alliance, Andrew Shikiar, emphasized the potential of passkeys, with over 7 billion accounts enabled to use them at the end of 2023. Supported by Apple, Google, and Microsoft, these tech giants worked with FIDO to establish a standardized passkey system using certificates compliant with WebAuthn.


Gemini Protects Users with FIDO Authentication

Gemini is a cryptocurrency exchange and custodian, founded by Tyler and Cameron Winklevoss in 2014. Gemini enables its users to transact both via a website as well as mobile apps […]

Gemini is a cryptocurrency exchange and custodian, founded by Tyler and Cameron Winklevoss in 2014. Gemini enables its users to transact via both a website and mobile apps to buy, sell, and store cryptocurrency assets.

The Challenge/ Use Case

As a financial services vendor in a space that is highly targeted by criminals, the need for strong authentication is paramount. 

Gemini’s security efforts are led by Chief Security Officer Dave Damato who is no stranger to the security industry and previously worked at security incident response firm Mandiant.

“So much of my career has been really focused on preventing and responding to incidents and strong two factor authentication is at the core preventing most of those attacks,” Damato said (in a session at the Authenticate Financial Services Summit). “It’s also why I’m so very enthusiastic about FIDO.”

How Gemini Uses FIDO To Secure Its Users

Gemini wanted to provide its users with the strongest level of security authentication to help minimize risk.

While using an SMS-based two-factor approach can be better than just a username and password, given the high value of a Gemini account, attackers might well go through the steps necessary to bypass SMS two-factor authentication. In 2019, Gemini began offering its customers the highest level of security possible by starting to support the FIDO2 authentication standard.

“FIDO2 is designed to overcome challenges and dramatically increase the cost for an attacker,” Damato said. “There’s no password that can be shared by our customers and that’s why FIDO2 is phishing resistant.”

Benefits

For Gemini, the use of FIDO2 provides a series of tangible risk mitigation benefits that helps to reduce the attack surface. Instead of needing to rely on a One-Time Password (OTP), SMS or backup codes, Gemini users can benefit from a more user-friendly FIDO2 powered experience.

Among the most common types of attack is credential stuffing, where an attacker makes use of passwords lost or stolen from one site, to re-use or ‘stuff’ into another. With FIDO, that risk is minimized for Gemini. Since FIDO strong authentication is based on cryptography and not a shared secret, even if a user reuses a password, the deployment of FIDO will minimize the risk significantly.

“The benefit to me as a company is that I don’t actually have to store, manage credentials or worry about other breaches, where credentials have been stolen,” Damato said.

Wednesday, 03. January 2024

OpenID

OpenID Foundation Comments on CFPB Rule 1033 Regarding Open Banking

Blog authored by Mark Haine. The OpenID Foundation submitted comments to the CFPB on the recent Open Banking rule 1033 on Friday, December 29, 2023. The cover note to the CFPB is provided in full below, and the detailed comments can be viewed here.  We are proud to support the CFPB in their due diligence […] The post OpenID Foundation Comments on CFPB Rule 1033 Regarding Open Banking first

Blog authored by Mark Haine.

The OpenID Foundation submitted comments to the CFPB on the recent Open Banking rule 1033 on Friday, December 29, 2023. The cover note to the CFPB is provided in full below, and the detailed comments can be viewed here.

We are proud to support the CFPB in their due diligence on this rule to date, and we look forward to continued support of the CFPB and the US Open Banking community in the months and years ahead.

Any questions on our comments can be directed to director@oidf.org

 

 

OpenID Foundation
5000 Executive Parkway Suite 302
San Ramon, CA 94583

Consumer Finance Protection Bureau
1700 G St NW
Washington, DC 20552

December 29, 2023

 

Dear Sir or Madam

My name is Nat Sakimura, and I am the Chairman of the OpenID Foundation (Foundation) [1] and am one of the Chairs of the FAPI Working Group (FAPI WG) [2]. The OpenID Foundation would like to thank the CFPB for the opportunity to provide feedback on the recent draft Open Banking 1033 Rulemaking, and provide ongoing support as may be useful in your due process.

Introduction

The OpenID Foundation is a non-profit organization whose mission is to lead the global community in creating open standards that are secure, interoperable and privacy-preserving. As part of that mission, the OpenID Foundation closely collaborates with several significant ecosystems internationally that have Open Banking and Consumer Financial data sharing systems requirements, and, in some cases, the OpenID Foundation provides services to help those ecosystems operate and scale their deployments.

Delivery of Open Banking on a national basis is a big challenge. The complexity of multi-party ecosystems that process millions of transactions should not be underestimated. In financial services, security and privacy are critical concerns for consumers and are important in maintaining the reputation of the ecosystem. The costs arising from meeting these requirements can be challenging to manage, and the burden of those costs should not fall unfairly on any entity type, as that would reduce participation and limit the overall success of the ecosystem. The use of open standards that have had input from hundreds of industry experts, have been through formal security assessment, can be delivered using off-the-shelf software components, have been demonstrated to successfully support Open Banking ecosystems elsewhere, and demonstrably achieve positive outcomes for millions of consumers can help minimize both cost and risk overall. The OpenID Foundation provides highly applicable open standards, years of hands-on experience, and supporting services to several Open Banking and Consumer Data Right ecosystems around the globe.

We have structured our feedback in two ways, one being along a number of themes, the second being by specifically addressing a number of the specific questions that the CFPB included in their consultation paperwork. These are attached as a spreadsheet.

[1] https://openid.net/
[2] https://openid.net/wg/fapi/

Key Themes

Standardized “Communication Protocol” & security
Conformance & certification
Software availability and cost
Lifecycle of standards
Digital identity attributes
Entity metadata and trust
How the OpenID Foundation can contribute

Standardized “Communication Protocol” & security

The most important item of feedback that the OpenID Foundation would like to offer is that use of an open industry standard, or “Standardized Communication Protocol”, should be required within the CFPB 1033 rulemaking. We define this “Standardized Communication Protocol” in the context of Open Banking as “the secure exchange of messages between the Consumer, Data Provider and the Third Party for the purpose of consumer authorization of access to data and services, and subsequent issuance of a Secure Access Token to the Third Party”. The Communication Protocol is critically important for maintaining the security of data being passed (security being a feature that is focussed on preventing unauthorized access or modification). A proven, rigorously tested, open-standards-based Standardized Communication Protocol protects all ecosystem participants and addresses the need to authorize access to data and services, orchestrate authentication, and provide Secure Access Tokens. These capabilities should be included by the CFPB as requirements of a Standardized Communication Protocol for Open Banking. Without a Standardized Communication Protocol there is a significant cost incurred by all parties, but most notably by third parties, where integration costs compound due to multiple different implementations provided by the Data Providers. The result of this is market dynamics that favor the largest entities over the smallest.

The OpenID Foundation provides one example of a Standardized Communication Protocol: the “FAPI profile of OpenID Connect.” This standard has been selected by many public and private Open Finance, Open Data, and Consumer Data Right ecosystems (e.g. the Australian Consumer Data Right, Australia’s ConnectID, UK Open Banking, Brazil Open Banking, Brazil Open Insurance, Saudi Arabia Open Banking, UAE Open Finance, USA FDX, Canada FDX, and HelseID Norway). These standards are proven at scale in many production deployments, both regulatory-driven and commercial. As a result, the OpenID Foundation has first-hand experience observing the pitfalls faced by ecosystems that lacked Standardized Communication Protocol mandates vs the benefits to those that had mandates.

In short, the OpenID Foundation strongly recommends the CFPB mandate use of a qualified industry Standardized Communication Protocol with a clear definition of its features in the final rule.

Conformance & certification

The OpenID Foundation’s experience of working with multiple Open Banking ecosystems shows that while standards are very important they need support from complementary tools and services, specifically conformance testing tools and certification services.

The delivery of a secure and interoperable Communication Protocol depends on Data Providers and third parties implementing it consistently and correctly. This can be tested empirically through the use of conformance testing tools. Tests significantly reduce the time to detect implementation errors and reduce the cost and complexity for all parties. Public certification of conformance significantly increases the chances of interoperability being delivered across an ecosystem and provides evidence that the security aspects of the protocol have been implemented correctly too, increasing all parties’ confidence in the ecosystem as a whole.

In ecosystems that went ahead without a requirement for conformance testing, there were issues with interoperability and security. Where conformance testing is performed there have been several instances where significant security issues have been identified earlier in the lifecycle and were privately shared back to the Data Provider or Third Party to remediate prior to production deployment of vulnerable services.

The OpenID Foundation provides open source conformance testing tools for various specifications (including all versions of FAPI). These tools are written in close collaboration with the standards authors and contributors. While CFPB or nominated technical support partners can write their own conformance testing tools, re-use of existing test tools further reduces cost, risk, and time-to-market for implementations that use these OpenID Foundation maintained Standardized Communication Protocols. There is also a possibility to collaborate on specific additional CFPB or Qualified Industry Standards body requirements.

It is the recommendation of the OpenID Foundation that the CFPB rules should state that all ecosystem participants have their production implementations certified as conformant to the Standardized Communications Protocol prior to launch and periodically thereafter (we recommend at least annually) in order that implementations mitigate risks around interoperability and security. The certification should be backed by successful use of empirical testing conformance tools. As seen in other ecosystems, if this is not a clear requirement there is a significant risk that implementation errors will persist for longer than necessary and adverse consumer outcomes arise.

Furthermore, the OpenID Foundation recommends that conformance testing results be published publicly to improve consumer trust and transparency.

Software availability and cost

When ecosystems and implementers elsewhere have considered how to deliver Open Banking, the question of cost has arisen, and it is worth stating that there is a less obvious benefit of having a Standardized Communication Protocol. Software vendors now offer “off-the-shelf” components that can deliver the existing Standardized Communication Protocols needed, which often reduces the cost of implementation. Using an existing, widely deployed Standardized Communication Protocol has the additional benefit of reducing the time to market for that ecosystem as well as capitalizing on existing skills and experience developed in the delivery of Open Banking elsewhere. The final significant cost reduction factor relating to a Standardized Communication Protocol is that Third Parties in particular can not only use off-the-shelf components but also re-use them across their integrations with multiple Data Providers.

 

Entity metadata and trust

One area that seems absent from the rulemaking is how Data Providers may make reasonable decisions about whether to trust a Third Party that connects to them. The way the rules are currently drafted, it seems that this is a decision left up to each Data Provider. This could become a significant source of risk and cost to both Data Providers and consumers as Data Providers individually take decisions about third-party access. It may even result in significant cost for Third Parties, as they may be required to go through bilateral due-diligence procedures with many Data Providers in order to sufficiently mitigate these risks. An alternative approach would be to allow some intermediary to define and apply supporting processes and technology to perform due diligence on third parties and provide a streamlined system for establishing trust between third parties and data providers. If this is done in concert with a qualified industry standard communication protocol, then the communication of that entity’s technical metadata will also allow much quicker technical integration once the due diligence is done.

The OpenID Foundation recommends adding additional rules to address how trust can be established in an interoperable and scalable fashion without bilateral, point-to-point trust requirements. This allows entities to onboard once and have a single source of truth for trust and configuration that uses a standard communication protocol. It is also recommended to have a qualified industry standard specific to the exchange of this trust information and the associated integration details. The OpenID Foundation develops and maintains standards for this purpose.

 

Lifecycle of standards

One topic that has been challenging and should be considered carefully in the rulemaking is striking a balance between the use of a small number of critical standards and the agility of the ecosystem when it comes to change over time. To future-proof the rulemaking, it would be appropriate for the rules to include a mechanism for on-going maintenance and versioning of qualified industry standards in a way that the benefits of standardization are realized, while mitigating the risk of inertia and cost of change to the ecosystem. To this end, we recommend the following processes are established:

Review proposed changes to the qualified industry standards at an ecosystem level and in partnership with relevant standards bodies, balancing the impact of change against the benefits that may be realized via the updates.
Process to ensure conformance to the updated standards.

The current practice is a cadence of regular standards review, followed by a notice period on standards and any changes to requirements, and then a period of conformance and certification to ensure implementations reflect the updated requirements.

Failure to account for the lifecycle of standards can lead to resistance by key ecosystem participants who seek to minimize conformance costs at the expense of security, operational and user experience benefits. From a national perspective, this inflexibility can also reduce US competitiveness, when global open banking and open data use cases emerge and ecosystems seek to interoperate. As an example, domestic implementations of FAPI 1 and other standards like India’s UPI are not currently interoperable, but the use cases and path to global interoperability are emerging as domestic deployments mature. This global landscape and path to global interoperability is discussed in the 2023 OIDF whitepaper “Open Banking, Open Data: Ready to Cross Borders?” [3].

The OpenID Foundation manages the lifecycle of its specifications in an open collaborative forum with a clearly documented process. Representatives of user communities around the world participate and their interests and needs are taken into account. The latest version of the OpenID Foundation’s Communications Protocol (FAPI 2.0 Security Profile) is currently going to a membership vote on “Final”, having successfully gone through previous stages of the process, including formal security analysis. As an example of good lifecycle management, having analyzed the benefits, many ecosystems are updating their roadmaps to migrate to FAPI 2.0.

[3] https://openid.net/2023/02/06/final-version-of-open-banking-and-open-data-ready-to-cross-borders-whitepaper-published/

Digital identity attributes

The current rule on Open Banking includes “Basic Account Information” in the scope. If this information remains in scope, then the CFPB should be able to answer the following questions affirmatively, as any answers to the contrary may have unintended consequences.

1. If Basic Account Information, e.g. “name, address, email address, and phone number”, is requested, is there the intent or even a requirement for this PII data to be requested separately, with separate and explicit informed consent by the user?
2. If Basic Account Information, e.g. “name, address, email address, and phone number”, is requested, are there security requirements for Third Parties holding this information and for how it can be used (e.g. a relevant purpose / use case), and are there requirements on Data Providers disclosing it (e.g. a higher authentication assurance level to share the PII vs. bank account information), as this information arguably has additional risks relating to identity theft?

The scope of the CFPB is clearly financial services, and attributes such as “name, address, email address, and phone number” are typically maintained by financial services organizations for communication with the customer and fraud mitigation purposes. There is a risk that including this information in the rule might allow a pseudo-assured identity market to emerge based on this basic account verification information. If this is the intent of the CFPB, then we recommend guidance for (1) and (2) to avert unintended outcomes, and to ensure the user’s data rights are respected and this sensitive data is suitably protected. If it is not the intent of the CFPB to enable identity assurance use cases (based on “AML-backed” digital identity data), we recommend the CFPB clarify the intent of the rulemaking and make more explicit the use cases that are in and out of scope.

Separately, the current requirement for Data Providers to “make available covered data when it receives information sufficient to: Authenticate the consumer’s identity…” is not very explicit about authentication and could be interpreted in various ways, some of which could have a significant risk of the incorrect consumer being identified and consequently consumer finance data being inadequately protected. A range of standards for consumer authentication that are sufficiently robust for use in financial services exist and all of these involve the consumer authenticating directly to each party rather than doing so via a third party. It is done this way mainly for enabling a variety of options for secure authentication and ensuring that consumer credentials do not need to be shared with a third party.

The OpenID Foundation does not address authentication directly other than requiring authentication to be done when the FAPI profile is used. This permits a range of choices for Authentication to be provided in the context of the Standardized Communication Protocol and Authentication of the consumer by the Data Provider directly.

 

The OpenID Foundation contribution

In accordance with our mission and vision, the OpenID Foundation will continue to actively engage with US market stakeholders (as we do with any market engaging in Open Data) to support local market development, support due-diligence checks of our specifications, and adapt our operational capabilities to serve local ecosystems. In the context of the current rules, the OpenID Foundation is keen to continue engaging and innovating along with key stakeholders in the US market, including the CFPB. The OIDF is willing to act as a Qualified Industry Standards Body, participate as part of a consortium that jointly acts as a Qualified Industry Standards Body, or play both roles as required by US market participants.

Should any OpenID Foundation standards be selected by any Qualified Industry Standards Body (either directly or as part of joint offering) there would be no charge for access to or use of OpenID Foundation specifications. Similarly, there is open source conformance testing software provided to test implementations of the key OpenID Foundation standards. These Open Source tools are made available to all for free. The OpenID Foundation also currently operates a cloud instance of the conformance testing tools that can be used by any party free of charge.

Self-certification of conformance to key standards can be asserted and published on the OpenID Foundation website for a small fee. To support ecosystems with deployment, we are also open to ecosystem-specific, strategic partnerships to deliver local stakeholder requirements.

Should any organization or individual wish to contribute to the OpenID Foundation’s ongoing development or maintenance of FAPI and related standards, the only requirement would be the signing of the Intellectual Property Rights agreement; membership and fees do not apply. That said, membership of the OpenID Foundation provides three key benefits: voting on the specifications at key milestones in their lifecycle, a significant discount on self-certification, and the opportunity to direct funds to projects of benefit to the OpenID Foundation community.

Please see attached our detailed comments on the CFPB rulemaking. We would be delighted to clarify any points in this cover letter or the attached documents, just contact us via director@oidf.org.

 

Sincerely,

Nat Sakimura
Chairman & Co-Chair FAPI WG
OpenID Foundation

 

The post OpenID Foundation Comments on CFPB Rule 1033 Regarding Open Banking first appeared on OpenID Foundation.

Friday, 22. December 2023

FIDO Alliance

Fintech Times: Cybersecurity trends for 2024 with TransUnion, Forter, WatchGuard, Vouched, FIDO Alliance, Fusion

FIDO Alliance’s Andrew Shikiar emphasises the need for enterprises to adapt cybersecurity strategies due to rising AI-driven social engineering threats and a push for greater cyber-transparency. Traditional methods like company-wide […]

FIDO Alliance’s Andrew Shikiar emphasises the need for enterprises to adapt cybersecurity strategies due to rising AI-driven social engineering threats and a push for greater cyber-transparency. Traditional methods like company-wide phishing training may become inadequate. Shikiar suggests reducing reliance on passwords and adopting passkeys, either synced or device-bound, as a more secure and user-friendly authentication approach.


Forbes: Forget passwords, this new tech is nearly hacker-proof, 1Password says

Since the adoption of passkeys in September, 1Password has reported that more than 700,000 passkeys have been created and saved by their users, which doubled the end-of-year expectations. Currently, 334,000 […]

Since the adoption of passkeys in September, 1Password has reported that more than 700,000 passkeys have been created and saved by their users, which doubled the end-of-year expectations. Currently, 334,000 1Password users are trying passkey technology, 79% of them being consumers and 21% of them being business customers.


Security Magazine: Top cybersecurity predictions in 2024

Aside from eliminating the need to remember passwords, integrating passkeys offers numerous advantages for enterprises by unlocking devices seamlessly and with ease. This gives users the ability to use the […]

Aside from eliminating the need to remember passwords, integrating passkeys offers numerous advantages for enterprises by unlocking devices seamlessly and with ease. This gives users the ability to use the same biometric verification method across multiple devices and accounts which not only enhances security but simplifies the login process.

Thursday, 21. December 2023

FIDO Alliance

Target Uses FIDO Authentication to Secure the Workforce

Target is a retailer with locations across the U.S as well as online ecommerce operations. Target also provides loyalty and credit card services to its customers. The Challenge/ Use Case […]

Target is a retailer with locations across the U.S. as well as online ecommerce operations. Target also provides loyalty and credit card services to its customers.

The Challenge/ Use Case

The initial use case for FIDO at Target was to help enable a secure login experience across applications at the company, as part of a broader platform modernization effort.

Target’s challenge was to provide a consistent and secure login experience across its applications, delivering a seamless experience to its users.

“We had to reduce friction, wherever possible, be it in the authentication flow by reducing the dependencies on passwords, or in the onboarding process by making it easier for applications and business owners to easily consume the enterprise authentication services,” explained Nataraj Rao, Principal Engineer for Security Solutions at Target.

How Target Uses FIDO To Secure Its Users

Target initially integrated a FIDO server with its Single Sign On (SSO) platform to provide multi-factor strong authentication capabilities.

“Support for a wide variety of authenticators makes it possible for team members to choose from a wide variety of authenticators and avoids a scenario where they are not able to move forward, just because they did not have a specific authenticator at that time,” Rao said.

With a solid understanding of how FIDO works and how it can be integrated with Target’s systems, there are multiple use cases where it can be deployed. Among them is providing additional verification in a multi-factor authentication flow. FIDO can also be used as the primary authenticator, completely eliminating passwords from the login equation, and for native authentication to mobile applications, providing a very intuitive login experience for Target’s mobile users.

Benefits

FIDO2 in particular has been useful for Target as it’s integrated into most modern web browsers without the need for users to install any third-party software or plugin on their devices or browsers.

With FIDO, Target is able to provide a better authentication experience for its users and is taking steps toward enabling a passwordless future.

“We all know that it’s not easy to get rid of passwords immediately,” Rao said. “But let’s all take a step towards it.”


SURF Uses FIDO2 to Protect Users in the Netherlands

SURF is the shared IT organization for research institutes and universities in the Netherlands. The organization helps to connect over 100 different institutions across the country.  The Challenge/ Use Case: […]

SURF is the shared IT organization for research institutes and universities in the Netherlands. The organization helps to connect over 100 different institutions across the country. 

The Challenge/ Use Case:

With lots of students and educators that need access, SURF faces multiple challenges. 

Since 2007, SURF has been developing and using a service it calls SURFconext, which provides a national identity federation for research and higher education. SURFconext is an identity federation that consists of over 180 different identity providers and provides a single sign-on (SSO) capability for SURF’s member institutions. SURFconext is based on the SAML 2.0 standard, makes use of OpenID Connect, and is used by 1.7 million people across the Netherlands.

Over the last decade, there have been increasingly sensitive workloads and growing security concerns with accessibility. Some member institutions were only enforcing access with basic password authentication and there was a need to introduce multi-factor strong authentication.

How SURF Uses FIDO To Secure Its Users

With multiple member organizations each using various technologies, SURF implemented an add-on service called SURFsecureID.

SURFsecureID is a hosted service that provides multi-factor authentication, with a step-up approach.

“The idea is that users authenticate at their home University using the password and before they are redirected to the service provider they are redirected to us where we require a second factor before sending them off to the service they initially requested,” explained Joost van Dijk, Technical Product Manager at SURF.

The step up authentication approach makes use of FIDO2 standards to help protect SURF’s users.
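A step-up check of this kind boils down to a WebAuthn assertion request in the browser. The sketch below is illustrative rather than SURFsecureID’s actual code: the challenge is assumed to come from the step-up service, and the resulting assertion would be returned to it for verification.

// Illustrative browser-side sketch of a FIDO2/WebAuthn second-factor step-up.
// Identifiers and challenge handling are placeholders for the real step-up service.
async function stepUpWithFido2(challenge: Uint8Array): Promise<Credential | null> {
  const publicKey: PublicKeyCredentialRequestOptions = {
    challenge,                     // single-use challenge issued by the step-up service
    rpId: "example.org",           // relying party the authenticator was registered with
    userVerification: "preferred",
    timeout: 60_000,
  };
  // The browser prompts for a registered FIDO2 authenticator (security key, platform biometrics).
  return navigator.credentials.get({ publicKey });
}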

Benefits

With FIDO, SURF is now able to provide strong authentication to users across the Netherlands in an approach that helps to improve resiliency and security.

One particular risk that FIDO helps SURF to minimize is that of phishing attacks, which have been a growing concern since at least the onset of the pandemic.

“Especially since the COVID crisis began, we’ve seen a lot of phishing campaigns launched against our users and we see FIDO2 as an excellent way to mitigate this threat,” commented Joost van Dijk, Technical Product Manager at SURF.


How CZ.Nic uses FIDO Authentication 

The Company:  CZ.nic is a domain registry organization in the Czech Republic that has been in operation since 1998. The organization manages over 1.3 million domains and is operated as […]

The Company: 

CZ.nic is a domain registry organization in the Czech Republic that has been in operation since 1998. The organization manages over 1.3 million domains and is operated as a not-for-profit entity.

In addition to the administration of domain names, CZ.nic is active in the development and deployment of internet technologies as well as identity services.

The Challenge/ Use Case:

One of the primary activities of the CZ.nic domain registry is verifying the identity of domain owners. CZ.nic has contact information for well over 800,000 domain owners and administrative contacts.

Verifying and authenticating the integrity of user identities is a key challenge facing CZ.nic. The European Union has a directive known as the Network and Information Security Directive 2 (NIS2) that recommends that top-level domain registries like CZ.nic have technology and policies in place to properly verify domain owners.

“There’s a common agreement that illegal content is usually linked to fake identities,” explained Jaromir Talir, technical fellow at CZ.NIC and member of eIDAS Technical subgroup. “In the case of domains, there is definitely the possibility to register fake identities as domain owners.”

To that end, CZ.nic developed the mojeID (my ID) service as a way to authenticate user identities. MojeID serves as a central identity service where an individual identity can be associated with a domain. 

MojeID also acts as an identity provider that ties into the European Union’s eIDAS (electronic identification and trust services) approach for an identity system that works across the EU.

How CZ.nic Uses FIDO To Secure Its Users

CZ.nic started out with just a username and password for authentication and realized over time that there was a clear need to have stronger authentication options for users.

In 2018, CZ.nic began evaluating the FIDO U2F specification as a solution for two factor authentication. In 2019, CZ.nic shifted its focus to using FIDO2/WebAuthn as it began to roll out the technology for production deployments. 

Benefits

The use of FIDO2 provides CZ.nic with an extensible framework that works across desktop and mobile operating systems and devices.

With FIDO, CZ.nic is able to provide its users with strong authentication for identity verification. FIDO2/WebAuthn is also a core element of the eIDAS enablement for MojeID, which requires the use of a FIDO authenticator, alongside username/password for access.

As of July 2021, CZ.nic had over 30,000 users with FIDO security keys.


How CVS Health Uses FIDO to Secure Its Users

CVS Health is a U.S. healthcare organization that includes multiple operating divisions including retail with CVS Pharmacy, which has nearly 10,000 locations across America. CVS Health also includes a large […]

CVS Health is a U.S. healthcare organization that includes multiple operating divisions including retail with CVS Pharmacy, which has nearly 10,000 locations across America. CVS Health also includes a large healthcare insurance business that integrates assets from Aetna.

As of Q2 2023, CVS Health is using passkeys for consumer logins to their mobile web service.

The Challenge/ Use Case

The key focus for CVS Health is to ensure integrity and confidentiality of customer data. The overall user experience also needs to be positive, to drive traffic to CVS’s digital assets. 

CVS Health is on a path to help make its consumer authentication experience not only secure, but easier to use. CVS Health is also on a path toward enabling password-less experiences for consumers wherever possible.

“For the external user, they would just simply walk away, if the user log in experience is cumbersome, in any way,” Cisa Kurian, senior security advisor at CVS Health commented. “Good security is always a balance between security and usability.”

How CVS Health Uses FIDO To Secure Its Users

CVS Health is building out an authentication platform to provide passwordless authentication capabilities in its web, mobile, IoT, and voice applications. Passwordless authentication is enabled with biometric authentication using FIDO standards.

“Our goal is to increase friction for a potential threat actor, while enabling ease of use for the legitimate user,” Kurian said.

Benefits

By adopting a FIDO based approach, CVS Health is able to provide an easier authentication experience for its users. Making the login experience more seamless also helps to improve the overall user experience as well.

“We chose FIDO because the standards are open, and allow for simpler and stronger authentication that is based on public key cryptography,” Kurian said. “In other words, it’s easy to use and more secure, at the same time.”


OpenID

OpenID Foundation Provides FAPI and Certification Program Overview for the CAMARA Identity & Consent Work Group

As recently announced, the OpenID Foundation has joined the Linux Foundation’s CAMARA project as an Associate Member. CAMARA is an open-source project within Linux Foundation that defines, develops and tests the APIs enabling seamless access to Telco network capabilities. Bjorn Hjelm, OIDF Liaison Officer to CAMARA and OIDF MODRNA WG Co-Chair, led an OIDF meeting […] The post OpenID Foundation P

As recently announced, the OpenID Foundation has joined the Linux Foundation’s CAMARA project as an Associate Member. CAMARA is an open-source project within Linux Foundation that defines, develops and tests the APIs enabling seamless access to Telco network capabilities.

Bjorn Hjelm, OIDF Liaison Officer to CAMARA and OIDF MODRNA WG Co-Chair, led an OIDF meeting with the CAMARA Identity & Consent Work Group on Wednesday, December 20, 2023. Nat Sakimura, OIDF Chairman and FAPI WG Co-Chair, provided a deep technical dive into the FAPI specification that included a roadmap for FAPI 2.0 becoming a final specification in 2024. Mike Leszcz, OIDF Program Manager, followed with an overview of FAPI ecosystem adoption and then provided an overview of the OIDF certification program highlighting the success of the program since launching in 2015.

The presentation was followed by an active Q&A session with the Identity & Consent Work Group.

The OIDF presentation for the CAMARA Identity & Consent Work Group can be found here.

The post OpenID Foundation Provides FAPI and Certification Program Overview for the CAMARA Identity & Consent Work Group first appeared on OpenID Foundation.


Origin Trail

DNA and DKG: The guardians of real-world asset integrity

This year has seen a booming expansion for the potential of counterfeits with the mainstream adoption of the great enabler — generative artificial intelligence (GenAI). The impact of GenAI spans across digital and physical worlds and is already causing very tangible losses. As a result, we’ve seen Association of Photographers representatives calling for the introduction of smarter laws while other

This year has seen a booming expansion in the potential for counterfeits with the mainstream adoption of the great enabler — generative artificial intelligence (GenAI). The impact of GenAI spans the digital and physical worlds and is already causing very tangible losses. As a result, we’ve seen Association of Photographers representatives calling for the introduction of smarter laws while others, like Sarah Silverman, found enough ground in the current legal framework to launch a lawsuit against OpenAI for their large language model implementation ChatGPT. The impact of GenAI will be immense, and it can significantly exacerbate counterfeit problems such as intellectual property (IP) theft and lowered transparency across markets. Adding that to the already booming real-world counterfeit problem (an estimated 10% of drugs globally are counterfeit, up to 50% of artworks are suspected to be forgeries, and there are $10–15 billion in annual damages in the food and beverages industry …), there is a clear need to bring more powerful tools to counteract these trends and protect the innovators, artists, creators, and other IP owners. We should not show up with a knife to a gunfight.

That is why Trace Labs — core developers of OriginTrail — and DATANA, the flagship project of BioSistemika, have inked a solution partnership in which DNA and Decentralized Knowledge Graph technologies are combined to deliver effective authenticity-focused solutions for many sectors, such as supply chains, real-world assets (RWAs), art, and many more.

Source: DATANA

Unparalleled real-world asset (RWA) integrity with DNA and DKG

The fusion of DNA Data Storage (DDS) technology with the Decentralized Knowledge Graph (DKG) offers a disruptive approach to real-world asset (RWA) authenticity verification, especially for high-value items. DDS is the process by which binary information is encoded into a sequence of nucleotides and synthesized to form synthetic data-carrying DNA molecules, which are then stored accordingly. Much like the unique DNA makeup of every person, consumer products can be outfitted with synthetically produced DNA tags unique to every unit, batch, or brand. These miniature DNA particles, or DNA tags, can be embedded in products or packaging, providing unique, tamper-proof identifiers and making counterfeiting extremely challenging. These tags can accompany and protect the authenticity of any physical RWA throughout its entire lifecycle. This can be achieved because DNA is very hard to destroy, can be very well hidden, and has incredible storage capabilities, so we need microscopically little of it (e.g. the entire world’s data would fit in a water bottle of DNA).

However, data in the DNA is not that easy to read, and it cannot be frequently updated, which might be required and desired for many products. This is where the DKG steps in. By putting a smart anchor in the DNA and applying the global GS1 Digital Link standard on the product, we create a physical-to-digital connection that extends the capabilities of the DNA tag. The DKG allows the product owner to add any additional information to it (about the production process, ingredients, quality parameters, or ownership experiences).

Source: DATANA

All the information kept in the DKG has an owner and cannot be tampered with, giving the required digital protection. This protection extends towards the GenAI tools as well, as the DKG offers known sources that the GenAI can use and reference (see the Trusted AI framework implementation here).

Looking towards the future, this technology could revolutionize sectors like luxury goods, where authenticity is paramount. In pharmaceuticals, it could be the backbone for verifying the legitimacy of drugs, combating counterfeit medicines that pose substantial risk to public health. In the food and beverage industry, it could ensure the origin and quality of organic or premium products, while in the art world, the powerful combination of technologies could authenticate artworks and historical artifacts.

DNA as a next-gen tool for authenticity verification

DNA-based tracking and authenticity verification offer a sophisticated and secure method for protecting real-world assets, leveraging the unique nature of DNA sequences. Products can be tagged using synthetic DNA markers modified to encode digital data, which are applied through various methods such as microscopic particles, direct surface application, integration into materials, or embedding in packaging. With its multiple and growing use cases, the DNA production industry is now advancing at an exponential rate.

Source: DATANA

Benefits of DNA tagging systems

Using DNA for authenticity verification is a cutting-edge approach that leverages the unique and tamper-proof nature of DNA sequences. Here is how it can be implemented:

DNA Tagging: Products can be tagged with synthetic DNA markers that are uniquely designed and nearly impossible to replicate. These markers can be applied to various products like luxury goods, pharmaceuticals, or agricultural products. The DNA sequences used are typically non-biological, meaning they do not come from living organisms but are artificially created.
Traceability: As products move through the supply chain, the DNA tag remains with them. This allows for the tracking of goods from the point of origin to the end consumer. At any stage of the supply chain, a sample of the DNA tag can be collected and analyzed.
Verification and Authentication: To verify the authenticity of a product, a small sample of the DNA tag is extracted and analyzed.
Simple and Covert Application: DNA tags can easily be applied to products in various covert forms (see next section), preventing counterfeiters from detecting, replacing, or removing them.
Enhanced Encoding: Advanced DNA tagging systems also allow encoding significantly more data (metadata, media, etc.) into the tag itself, compared to alternative tracking forms (e.g. QR codes, barcodes). For example, a typical QR code can store roughly 1 kB of data, whereas a DNA tag can store up to 1 MB of data (1000x more).
Regulatory Compliance and Quality Control: In industries like pharmaceuticals and food, DNA tagging can help ensure compliance with regulatory standards and improve quality control processes.

This method is particularly valuable for high-value or sensitive products where authenticity and origin are crucial. The technology is advancing rapidly, making it more accessible and cost-effective for various industries.

The application of DNA tags to products can vary depending on the type of product, the intended use of the tagging, and the durability required. Here are some common methods of applying DNA tags:

Integration into Product Materials: For some products, DNA tags can be integrated into the material itself during the manufacturing process. This could involve adding DNA to plastics, textiles, or paper. For example, high-value clothing or documents can have DNA integrated into their fabric or paper.
Incorporation in Ink or Printing: DNA can be mixed with inks used for printing on products or their packaging. This method is useful for products that are already undergoing a printing process, such as packaged goods, books, or official documents.
Microscopic Particles or Encapsulation: DNA can be encapsulated in microscopic particles that are then applied to the product. These particles can be mixed into paints, varnishes, or adhesives that are then applied to the product’s surface. This method is often used for high-value items like artworks, electronics, or luxury goods.
Direct Application to Product Surfaces: In some cases, the DNA solution can be directly applied to the surface of the product. This could be through spraying, brushing, or immersing the product in the DNA solution. It is commonly used for products where the appearance is not a primary concern, or where the solution can be applied in an inconspicuous area.
Embedding in Packaging or Labels: DNA tags can be incorporated into the product’s packaging or labels. This is a less invasive method and is particularly useful for products where direct application of DNA might be impractical or could affect the product’s quality or integrity, such as food products or pharmaceuticals.
Coating with DNA-Infused Films: Products can be coated with a thin film that contains the DNA tag. This method is useful for a wide range of products, including electronics, automotive parts, and various consumer goods.
Injection into Products: In some cases, especially for large or solid items, the DNA tag can be injected into the product. This method ensures that the DNA is present inside the product, making it difficult to remove or tamper with.
Edible DNA Tags: For food products, edible DNA tags have been developed. These are safe for consumption and can be applied directly to the food or mixed into the food product during processing.

Source: DATANA

Encoding DKG into DNA sequences: bridging digital and physical

Combining DNA tagging and OriginTrail technology can create a robust tool for enhanced supply chain tracking and authenticity verification for RWAs. Here is a brief overview of how these two technologies can be integrated:

Assembling the Relevant Identifiers: The OriginTrail DKG enables connecting various identifiers across the entire product lifecycle to create a digital twin. These can include decentralized identifiers (DIDs), Uniform Asset Locators or UALs (similar to URLs), and cryptographic proofs of the data (i.e. hashes) for immutability.

Encoding DKG Data into DNA Sequences: Depending on the product, tamper-proof data from the DKG (such as UALs) is encoded into short, synthetic DNA sequences, which are "stitched" together in DATANA's DNA writer (see the illustrative sketch after this list). The encoding process first translates the digital information into the biological sequence of DNA bases (A, T, C, G). The physical DNA molecule is then synthesized in a process that mixes thousands of microscopic droplets containing the DNA building blocks at extreme speeds; each mixture of droplets represents a unique biochemical reaction that elongates the data-coding DNA chain.

DNA Tagging: The synthetic DNA sequences, serving as trusted links to the DKG, are applied to the product as DNA tags, anchoring the connection between the physical product and its DKG digital twin. The application method (microscopic particles, direct surface application, integration into materials, or embedding in packaging) depends on the product's characteristics and the chosen implementation approach. These distinctive DNA sequences are intentionally crafted to be tamper-proof and exceptionally difficult to replicate.

Traceability and Verification: The DNA tag stays linked to the product throughout its supply chain journey, and at any stage a sample of the tag can be collected and analyzed to extract the encoded data. As products progress through the supply chain, the interlinked Knowledge Assets on the OriginTrail DKG build a thorough and accessible record of the product's journey, incorporating details about manufacturing, transportation, storage, and sale.

DKG Verification: Retrieving digital data from DNA involves sequencing the DNA and translating its nucleotide sequence back into the original binary code; DNA sequencing is a rapidly advancing technology routinely used in molecular biology and genome sequencing. In practice, a small DNA tag sample is collected from the product and analyzed using techniques such as Polymerase Chain Reaction (PCR) and sequencing. The extracted data is then compared with what is stored on the OriginTrail DKG; if the two align, the product is confirmed as authentic, affirming that the information in the DKG matches the physical product.

Privacy and Ownership: The OriginTrail DKG provides options for configuring privacy settings, guaranteeing secure management of sensitive data such as the placement of DNA tags and other product information. Verifiable transfer of ownership, encompassing the DNA-tagged identifiers, ensures transparency and accountability.

Ensuring Data Integrity: The integrity of the product information is maintained through cryptographic hashing on the blockchain within the OriginTrail DKG, ensuring that the latest state can be verified.

AI-powered Product Interaction: The DKG makes all product information AI-ready, easily connecting to constantly advancing AI tools. It allows users to interact with the information through the questions and prompts most relevant to them, learning more about the brand history, production process, ingredient quality, or the DNA protection on their products. All responses such an AI-powered system provides are based on inputs from the DKG, and AI-based semantic search solutions can be developed to simplify and enhance interaction with trusted product information.

Source: DATANA
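To make the encoding and verification steps above more concrete, here is a minimal, illustrative Python sketch. It is not DATANA's or OriginTrail's actual implementation: it assumes a simple two-bits-per-nucleotide mapping, a hypothetical UAL string, and a SHA-256 fingerprint standing in for the proof anchored on the DKG; real encodings add error correction, redundancy, and biochemical constraints (such as avoiding long homopolymer runs).

```python
import hashlib

# Illustrative 2-bit-per-base mapping (real schemes add error correction
# and avoid problematic sequences such as long homopolymer runs).
BITS_TO_BASE = {"00": "A", "01": "C", "10": "G", "11": "T"}
BASE_TO_BITS = {base: bits for bits, base in BITS_TO_BASE.items()}

def encode_to_dna(payload: bytes) -> str:
    """Translate binary data into a DNA base sequence (A, C, G, T)."""
    bits = "".join(f"{byte:08b}" for byte in payload)
    return "".join(BITS_TO_BASE[bits[i:i + 2]] for i in range(0, len(bits), 2))

def decode_from_dna(sequence: str) -> bytes:
    """Translate a sequenced DNA read back into the original bytes."""
    bits = "".join(BASE_TO_BITS[base] for base in sequence)
    return bytes(int(bits[i:i + 8], 2) for i in range(0, len(bits), 8))

# Hypothetical digital twin reference: a UAL pointing to a Knowledge Asset.
ual = "did:dkg:otp/0x5cac41237127f94c2be21a2feaa535a2/123456"

# Fingerprint of the payload, standing in for the proof anchored on the DKG.
anchored_hash = hashlib.sha256(ual.encode("utf-8")).hexdigest()

# Encode the reference into a DNA tag at production time...
dna_tag = encode_to_dna(ual.encode("utf-8"))

# ...and later in the supply chain: sequence the tag, decode it, and verify.
recovered = decode_from_dna(dna_tag).decode("utf-8")
is_authentic = hashlib.sha256(recovered.encode("utf-8")).hexdigest() == anchored_hash
print("Recovered UAL:", recovered)
print("Matches the record anchored on the DKG:", is_authentic)
```

In practice only a short reference such as a UAL needs to be carried in the tag itself; the full digital twin, its assertions, and their cryptographic proofs remain on the DKG, where the extracted reference can be resolved and compared.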

Vision of the future

The real-world use of DNA-based tracking and authenticity verification for protecting real-world assets is, however, not without limitations, ranging from high implementation costs and complexity to technological requirements and the potential stability issues of DNA tags under different environmental conditions.

Overcoming these limitations demands collaboration between leading ecosystem players, each leveraging its expertise for mutual and collective benefit. The partnership between OriginTrail and DATANA is exactly that. By bringing together DKG technology and DNA-based tracking, we are poised to usher in a new age of product authenticity, where DNA is used to store humanity's most important knowledge, enshrined as assets on the DKG.

OriginTrail and DATANA are already working on a live solution, and we are excited to showcase this at the GS1 Global Forum in Brussels in February 2024. In the meantime, if you share our vision of supercharging the protection of real-world assets, valuable products, or supply chains, feel free to get in touch!

About OriginTrail

OriginTrail is an ecosystem building decentralized knowledge infrastructure for artificial intelligence (AI). With the mission of tackling misinformation, which is exacerbated by AI adoption, OriginTrail enables verifiable tracking of the origins of information, as well as discoverability and integrity of knowledge, to support trusted AI. It has applications in domains such as real-world assets (RWAs), search and recommendation engines, question-answering systems, and knowledge-dependent applications in general (such as AI systems).

OriginTrail's initial adoption was in global supply chains, serving as a trusted hub for supply chain data sharing, allowing customers to authenticate and track products and keep these operations secure. In recent years, the rise of AI has not only created unprecedented opportunities for progress but also amplified the challenge of misinformation. OriginTrail addresses this by functioning as an ecosystem focused on building a trusted knowledge infrastructure for AI in two ways: driving discoverability of the world's most important knowledge and enabling verifiable origins of information. The adoption of OriginTrail in various enterprise solutions underscores the technology's growing relevance and impact across diverse industries, including real-world asset (RWA) tokenization, construction, supply chains, healthcare, the metaverse, and others.

OriginTrail is creating a Verifiable Web for decentralized AI by empowering world-class brands and builders. It utilizes its unique Decentralized Knowledge Graph and OriginTrail Parachain to deliver AI-powered search and solutions for enterprises and individuals worldwide.

OriginTrail has gained support from and partnerships with world-class organizations such as the British Standards Institution, SCAN, Polkadot, Parity, Walmart, the World Federation of Hemophilia, Oracle, and the EU Commission's Next Generation Internet. These partnerships contribute to advancing OriginTrail's trusted knowledge foundation and its applicability in trillion-dollar industries, while providing a verifiable web of knowledge that is particularly important for driving the economies of RWAs.

Web | Twitter | Facebook | Telegram | LinkedIn | GitHub | Discord

About DATANA

BioSistemika, the parent company of the DATANA project, is on a mission to revolutionize digital data storage with the DNA molecule as the next-generation data storage medium. The company is looking to introduce the first commercial solution for sustainable, cost-efficient, and rapid writing of digital data to DNA.

DNA data storage offers groundbreaking benefits in terms of storage density, data safety, stability, and replicability. Its storage density is unparalleled, with a gram capable of holding up to 215 petabytes of data, vastly surpassing traditional electronic storage. This makes DNA an efficient solution for space-constrained data centers.

In terms of data safety and stability, DNA is resistant to many vulnerabilities like electromagnetic interference and can retain information accurately for millennia, ideal for long-term archival and supply chain tracking purposes. This stability ensures data safety over extended periods, far exceeding the capabilities of current storage media.

Furthermore, DNA’s replicability is a significant advantage. It allows for the creation of numerous, identical copies of data at a molecular level, ensuring high-fidelity backups and easy dissemination. This feature guarantees that data can be preserved and accessed reliably over long periods, making DNA data storage a promising technology for managing the growing volume of digital information in our world.

DATANA has been recognized by the EU and the European Innovation Council, receiving funding through three major R&D grants, totaling over €5 million. Additionally, in January 2023, they successfully collaborated with Bitstamp, a pioneering cryptocurrency exchange, storing and retrieving private keys and seed phrases of a cryptocurrency wallet in and from DNA, showcasing the readiness and reliability of their technology for commercialization.

Their proprietary benchtop "DNA writer" is suitable for commercial production of synthetic data-coding DNA, including DNA tags. By leveraging this technology, they can store over 1 MB of data in product tags, substantially surpassing the current storage capacity of QR codes and other product markers.
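For a rough sense of scale behind this comparison, the short sketch below contrasts the stated 1 MB tag capacity with the maximum binary payload of a standard QR code (approximately 3 KB for the largest version at the lowest error-correction level); the figures are approximate and used purely for illustration.

```python
# Approximate, illustrative capacity comparison.
dna_tag_capacity_bytes = 1 * 1024 * 1024   # "over 1 MB" per DATANA's stated figure
qr_code_capacity_bytes = 2953              # max binary payload of a version 40-L QR code

ratio = dna_tag_capacity_bytes / qr_code_capacity_bytes
print(f"A 1 MB DNA tag holds roughly {ratio:.0f}x more data than the largest QR code.")
```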

DNA and DKG: The guardians of real-world asset integrity was originally published in OriginTrail on Medium, where people are continuing the conversation by highlighting and responding to this story.


FIDO Alliance

heise: Modern authentication with and without a password

The online workshop "Modern Authentication with and without a password" covers modern alternatives to passwords and the role that PKI certificates, FIDO, and passkeys play. Participants learn to manage certificates, use certificate-based authentication, and implement security concepts in real scenarios.


heise: Modern authentication with and without a password (Chinese-language edition)


BGR: 1Password launches sign-in for public test ahead of official release

As 1Password opens its passkey feature for public beta testing, users will no longer need their “Secret Keys”. Passkeys offer heightened security through facial recognition and fingerprints, along with recovery codes as an added security measure in case of device loss.


Security Info Watch: 4 cyber-attack prevention strategies your organization must implement

Phishing-resistant passkeys offer enhanced security compared to passwords and prevent attackers from bypassing security measures. Major companies like Microsoft, Apple, and Google are endorsing passkeys to bolster user security, with Google even making them the default option for personal accounts.


Authority Magazine – Medium: Jason Rebholz Of Corvus Insurance: How AI Is Disrupting Our Industry, and What We Can Do About It

In an interview discussing AI’s impact on the industry, Jason Rebholz from Corvus Insurance underscores the threat to user credentials targeted by attackers. He recommends securing identities using passkeys or FIDO2 technologies and adopting zero-trust principles to prevent these incidents.


GS1

Exchanging data via GDSN as a foundation for traceability

To increase the safety and efficiency of medicine supply chains, healthcare leaders in Zambia are keen to implement greater traceability of products.

With support from the USAID (United States Agency for International Development) Global Health Supply Chain Program–Procurement and Supply Management and from GS1 South Africa, work to implement a national product catalogue began in 2022. This product catalogue contains standardised master data about products, which is kept up to date by the product manufacturers via the Global Data Synchronisation Network (GDSN).
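To illustrate the kind of standardised master data such a catalogue synchronises, here is a small, hypothetical sketch of a GDSN-style trade item record; the attribute names and values are simplified for illustration and do not reproduce the exact GDSN schema.

```python
# Simplified, hypothetical GDSN-style master data for one trade item.
# Attribute names are illustrative; the real GDSN schema is far richer.
trade_item = {
    "gtin": "05012345678900",                   # Global Trade Item Number (example value)
    "informationProviderGln": "6009876543210",  # GLN of the manufacturer publishing the data
    "targetMarket": "ZM",                       # simplified country code for Zambia
    "tradeItemDescription": "Amoxicillin 500 mg capsules, 100-pack",
    "brandName": "ExampleBrand",
    "netContent": {"value": 100, "unitCode": "capsules"},
}

# A recipient such as the national product catalogue keeps this record in
# sync with updates the manufacturer publishes through the GDSN.
print(trade_item["tradeItemDescription"])
```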

Business goal: GS1 Healthcare Reference Book 2023-2024 (reference_book_2023-2024_zambia-gs1-south-africa.pdf)