Last Update 6:21 PM July 24, 2024 (UTC)

Organizations | Identosphere Blogcatcher

Brought to you by Identity Woman and Infominer.
Support this collaboration on Patreon!!

Wednesday, 24. July 2024

OpenID

Guidance to the CFPB regarding US Open Banking

Authors: Gail Hodges, Joseph Heenan, Dima Postnikov, Mark Haine, Mike Leszcz, Elizabeth Garber 

Following our May 16 open letter to the Consumer Financial Protection Bureau, the OpenID Foundation has been engaged in discussions about their rule-making on Personal Financial Data Rights. This post summarizes our guidance to the CFPB.

Why are we engaged?

The OpenID Foundation is committed to supporting Open Banking ecosystems worldwide – all ecosystems that rely on identity data. In particular, we develop and continuously iterate upon world-class identity standards that improve security and underpin interoperability. This creates conditions for competitive marketplaces that protect consumers. 

By offering this guidance to the CFPB, we are answering a call from those seeking to enhance the security of US digital infrastructure. In its March 2024 report, the United States Cyber Safety Review Board (CSRB) called on us to continue iterating on our standards to ensure they are fit for purpose in use cases requiring heightened security – and they called on Cloud Service Providers (CSPs) to adopt those standards. We believe that the US Open Banking ecosystem, to best protect consumer data, should follow suit. 

FAPI as a Secure Communications Protocol

We recommend that the CFPB, in its rule-making, ensure the use of a secure communications protocol for the exchange of identity data, and we propose the widely adopted FAPI family of specifications for that role.

The FAPI profile enhances the OAuth 2.0 framework for high-security use cases. It is based on an advanced attacker model and closes critical security gaps that OAuth 2.0 does not address. We provided the CFPB with the example of Client Authentication:

The accompanying slide (not reproduced here) also shows how each jurisdiction may develop its own local profile for any final configuration choices.
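To make the client-authentication example concrete, here is a minimal sketch of the private_key_jwt style of client authentication that FAPI profiles require in place of shared client secrets. The client ID, token endpoint URL, and the signing step are placeholders for illustration, not details from the CFPB guidance; a real deployment signs the assertion with the client's private key.

```python
# Sketch: the shape of a private_key_jwt client assertion -- the stronger
# client authentication FAPI requires instead of a shared client secret.
# Client ID and endpoint are invented; signing is left as a placeholder.
import base64
import json
import time
import uuid

def b64url(data: bytes) -> str:
    """Base64url-encode without padding, as JWTs require."""
    return base64.urlsafe_b64encode(data).rstrip(b"=").decode()

def build_client_assertion(client_id: str, token_endpoint: str) -> str:
    now = int(time.time())
    header = {"alg": "PS256", "typ": "JWT"}  # asymmetric signature, no shared secret
    payload = {
        "iss": client_id,         # the client asserts its own identity
        "sub": client_id,
        "aud": token_endpoint,    # bound to a single authorization server
        "jti": str(uuid.uuid4()), # unique ID prevents replay
        "iat": now,
        "exp": now + 60,          # short-lived assertion
    }
    signing_input = (
        b64url(json.dumps(header).encode()) + "." + b64url(json.dumps(payload).encode())
    )
    # In practice: signature = sign(signing_input, client_private_key)
    return signing_input + ".<signature>"

assertion = build_client_assertion("fintech-app-123", "https://bank.example/token")
```

Because the assertion is signed rather than shared, a bank can verify who is calling without ever holding a secret that could leak, one of the gaps in plain OAuth 2.0 that FAPI closes.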

Current global ecosystem adoption of FAPI — spanning ecosystems that have selected, mandated, or deployed it — includes:

- United Kingdom – Open Banking
- Australian Treasury & Data Standards Body
- Australian ConnectID
- Brazilian Open Finance
- Saudi Arabian Monetary Authority
- United Arab Emirates Government (2024 launch)
- Chilean Ministry of Finance
- Colombian Government (expected 2024)
- Norwegian HelseID (Health)
- German Verimi
- Canadian Open Banking (expected)
- US FDX (recommended)

The Benefits of Interoperability

By reducing the optionality inherent in the OAuth 2.0 framework, FAPI also promotes interoperability within and across ecosystems. We shared an example of one startup that sought to integrate with the US and other banks & open banking partners globally. They encountered a wide variety of:

- Cryptographic Methods, including less secure signing methods (covered by FAPI)
- Client Authentication, including less secure authentication methods (covered by FAPI)
- Data formats and payloads, each of which required interpretation
- Approaches to data minimization, including many cases of receiving more data than requested
- Security Culture & Practices, enabling the selection of less secure options (somewhat addressed by selecting FAPI)

This wide variety prevents interoperability and places heavy burdens on fintechs and new market entrants. Interoperability, on the other hand, ensures:

- A level playing field for new fintech entrants
- Less reliance on aggregators
- Opportunities for banks and fintechs to work with partners across borders (see “Open Banking and Open Data: Ready to Cross Borders?” and our contributions to the “Global Assured Identity Network” and “Sustainable Interoperable Digital Identity” movements)

Our original Open Letter provides more information about FAPI and its role in underpinning security and interoperability.

Other Relevant Standards: Federation and Shared Signals

While the conversation with the CFPB began as a strong recommendation to name a secure communications protocol, we would be remiss if we did not also refer to other OpenID Standards designed to improve the security and viability of open data ecosystems. In particular:

- OpenID Federation is designed to quickly establish trust between parties who have been onboarded to an ecosystem. This is how banks can ensure that data requests are coming from legitimate actors – and how legitimate actors can quickly gain access to an open banking ecosystem.
- Shared Signals and Events is an open API built upon a protocol suite that enables applications and service providers to communicate about security events to make dynamic access and authorization decisions. It acts as a signaling layer on a back channel that helps secure near real-time sessions. We wrote about its benefits in a read-out from a recent interoperability event here.

What’s Next?

The OpenID Foundation is engaged in ongoing discussions with the CFPB and is exploring the requirements for approved standard-setting bodies. Those interested in promoting a secure and thriving Open Banking ecosystem in the United States and around the world should stay tuned!

About the OpenID Foundation

The OpenID Foundation (OIDF) is a global open standards body committed to helping people assert their identity wherever they choose. Founded in 2007, we are a community of technical experts leading the creation of open identity standards that are secure, interoperable, and privacy-preserving. The Foundation’s OpenID Connect standard is now used by billions of people across millions of applications. In the last five years, the Financial Grade API has become the standard of choice for Open Banking and Open Data implementations, allowing people to access and share data across entities. Today, the OpenID Foundation’s standards are the connective tissue to enable people to assert their identity and access their data at scale — the scale of the internet — enabling “networks of networks” to interoperate globally. Individuals, companies, governments, and non-profits are encouraged to join or participate. Find out more at openid.net.

The post Guidance to the CFPB regarding US Open Banking first appeared on OpenID Foundation.


Me2B Alliance

Identity Resolution and Customer Data Platforms found in 2022 EdTech Benchmark Network Traffic


This file provides the list of all the apps found in the ISL 2022 EdTech safety benchmark found to be sending data to either one or more identity resolution or customer data platform companies.

ISL provides this data as an informational tool reflecting research at this point in time. Please contact us at contact@internetsafetylabs.org if you have questions or corrections.

This work is licensed under the Creative Commons Attribution-NonCommercial 4.0 International License.

Identity Resolution and Customer Data Platforms found in 2022 EdTech Benchmark Network Traffic

 

The post Identity Resolution and Customer Data Platforms found in 2022 EdTech Benchmark Network Traffic appeared first on Internet Safety Labs.


Identity Resolution and Customer Data Platform Companies


This file provides the list of all known companies that provide identity resolution or customer data platforms (or both), worldwide.

ISL provides this data as an informational tool reflecting research at this point in time. Please contact us at contact@internetsafetylabs.org if you have questions or corrections.

This work is licensed under the Creative Commons Attribution-NonCommercial 4.0 International License.

Identity Resolution and Customer Data Platform Companies

The post Identity Resolution and Customer Data Platform Companies appeared first on Internet Safety Labs.


The Worldwide Web of Human Surveillance: Identity Resolution & Customer Data Platforms 


Today, we are excited to announce our latest research exposing the massively networked personal information sharing happening between and across identity resolution and customer data platforms that has been hiding in plain sight for over 10 years. These industries are the plumbing backbone that synthesizes personal data from hundreds of data sources across services and devices, spanning the digital and physical worlds.

In February 2024, Cracked Labs published “Pervasive identity surveillance for marketing purposes”, an in-depth analysis of LiveRamp’s RampID identity graph. One of the most superficial yet most powerful functions of this excellent report was to guide attention towards industries responsible for pervasive consumer surveillance. The timing was excellent as I’d already committed to present “The Hidden Identity Infrastructure” at Identiverse (May 2024) and prompted by the report, I dug in to better understand the two industries underpinning hidden identity infrastructure, namely, Identity Resolution (ID Res) and Customer Data Platforms (CDPs).  

There are nearly $9T worth of industries worldwide that rely on persistent, hidden identification of people. Naturally, demand of this magnitude fueled the now mature industries that perform pervasive, universal identification of people and their personal information. ISL identified over 350 companies providing either identity resolution platforms, customer data platforms, or both.  

This paper explores the magnitude and reach of these two industries, how they came to be, and most importantly, why, from a human well-being perspective, it’s crucial that these kinds of platforms be held to higher regulatory standards of scrutiny, transparency, and accountability. One identity resolution company alone out of 93 such companies (worldwide) boasts the collection of 5,000 data elements for [each of] 700 million consumers in 2021. To put this in perspective, the number of user accounts breached worldwide in 2023 was about 300 million [1]. Is there an appreciable difference between stolen user data and undisclosed “legitimate” personally identifiable information sharing? Moreover, nearly 40% of the 93 companies that provide identity resolution platforms are registered data brokers.

Indeed, after reviewing the research, we must ask ourselves: is this the kind of world we want to live in? A world where everything about us is always known by industry; a world where the ongoing surveillance of people is deemed necessary in the name of capitalism? Is this the kind of world in which humans and societies will flourish or self-destruct? Are humans more than capitalistic consumers? Are we more than our purchasing potential?

A Call to Action 

ISL conducted this research to help illuminate the sizable risk of hidden identification and the worldwide web of user surveillance. ISL believes naming and exposure are crucial to effecting change. Identity resolution and customer data platforms have been hiding in plain sight for more than a decade, and yet even the “identerati” are largely unfamiliar with these industries. How can we expect everyday people to know?

This paper is a rallying call for privacy advocates to come together to demand greater regulatory scrutiny, transparency and oversight for these industries, in conjunction with more meaningful data broker regulation.  

Additionally, this is a rallying call to acknowledge the catastrophic failure of notice and consent as a valid permissioning mechanism for highly complex and interconnected digital services. It’s inconceivable that people understand the magnitude of data sharing that consenting to sharing “your data with our marketing” entails.  

We must ask ourselves if this is the kind of world we want for ourselves and our children, where our preferences, practices, relationships, behaviors, and beliefs are all up for sale and broadly shared without our awareness. Are we ourselves in fact being sold?  

The technologies fueling these capabilities have received billions of dollars; consumers don’t have a chance in the face of the industry’s voracious hunger to identify, know, and manipulate them. We hope that this research shines a much-needed light on the forces enabling the worldwide web of human surveillance so that they may be held accountable for their troves of data on nearly all internet users.

P.S. Also check out our latest podcast with guest Zach Edwards, where we discuss this worldwide web of human surveillance live.

Open Report PDF

Identity Resolution and Customer Data Platform Companies

Identity Resolution and Customer Data Platforms found in 2022 EdTech Benchmark Network Traffic

The post The Worldwide Web of Human Surveillance: Identity Resolution & Customer Data Platforms  appeared first on Internet Safety Labs.


ResofWorld

The AI job interviewer will see you now

AI interview services say they’re eliminating bias — but not everyone agrees.
When Floria Tan applied for an internship at China’s food delivery giant Meituan, her first video interview was not with a human being. The interviewer certainly looked real enough. It...

Next Level Supply Chain Podcast with GS1

Digital Twins & Their Supply Chain Wins with Elyse Tosi


In the supply chain, technical requirements are the cornerstone for creating scalable and interoperable systems that ensure a seamless flow of information and enhance the accountability and traceability of materials and products throughout their lifecycle.

 

Liz and Reid got to talk about this with Elyse Tosi, the Vice President of Accounts and Implementation at EON, an innovator in product digitization. Elyse shares her extensive knowledge and experience in supply chain management, touching on her work with brands like Victoria's Secret and Eileen Fisher, to discuss the transformative impact of technology and standards on global supply chains.

 

They discuss enhancing value chain efficiency through interoperability, the significance of the EPCIS standard in scaling and achieving interoperability, and how EON, chosen by the EU to pilot digital product passports, is influencing legislation and standards adoption—an initiative critical for compliance, brand protection, and product authentication. They also explore emerging trends like digital twins, QR codes, digital links, and their game-changing potential for retail and customer engagement.

 

In this episode, you’ll learn:

How EPCIS standards ensure interoperability and scalability for digital product passports, enabling seamless data exchange and lifecycle management in supply chains.

The transformative impact of digital twins, QR codes, and digital links on retail experiences, customer engagement, and product data connectivity, driving new commerce channels and incremental revenue opportunities.

How EON leverages compliance with EU legislation to provide commercial benefits such as brand protection and product authentication, reinforcing the importance of scalable and cost-effective blockchain applications.
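To give a feel for the EPCIS standard discussed in the episode, here is a rough sketch of a single EPCIS 2.0 event in its JSON form, the kind of standardized record that lets supply-chain partners exchange lifecycle data interoperably. All identifiers, timestamps, and the context URL are illustrative assumptions, not details from the podcast.

```python
# Sketch of an EPCIS 2.0 ObjectEvent wrapped in an EPCISDocument (JSON form).
# Field names follow our reading of the EPCIS 2.0 JSON binding; every
# identifier below is invented for illustration.
import json

event = {
    "type": "ObjectEvent",
    "eventTime": "2024-07-23T10:15:00Z",
    "eventTimeZoneOffset": "+00:00",
    "epcList": ["urn:epc:id:sgtin:0614141.107346.2018"],  # item(s) observed
    "action": "OBSERVE",
    "bizStep": "shipping",        # CBV business step
    "disposition": "in_transit",  # CBV disposition
    "readPoint": {"id": "urn:epc:id:sgln:0614141.07346.1234"},
}

document = {
    "@context": "https://ref.gs1.org/standards/epcis/2.0.0/epcis-context.jsonld",
    "type": "EPCISDocument",
    "schemaVersion": "2.0",
    "creationDate": "2024-07-23T10:16:00Z",
    "epcisBody": {"eventList": [event]},
}

serialized = json.dumps(document)
```

Because every partner emits events in this same shape, a digital product passport can be assembled from events captured by different systems at different lifecycle stages — the interoperability point made above.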

 

Connect with GS1 US:

Our website - www.gs1us.org

GS1 US on LinkedIn

 

Connect with the guests:

Elyse Tosi on LinkedIn

More about EON - https://www.eon.xyz/

 


DIF Blog

Extrimian HackAlong


We’re excited to announce the first ever DIF HackAlong, co-produced with DIF Associate member Extrimian!

These HackAlongs are dynamic educational sessions designed to equip both developers and business leaders with essential knowledge about decentralized identity.

Join us for the first-of-its-kind, beginner-friendly Spanish-language event. The series will consist of five one-hour sessions, starting on Thursday, August 8, and running through September 12.

Want to participate? Visit the Extrimian/DIF event page to learn more and secure your spot today.

From the organizers: "We will analyze a use case based on the Travel & Hospitality industry, and show you how the QuarkID protocol works. If you’re a fan of technology, a web3 enthusiast, developer, a decision-maker, or an IT student, don’t miss out on talks about decentralized, innovative technologies focusing on data security and privacy. Plus, create and develop your project using SSI. 

"By participating, you could be featured in the DIF’s project gallery and win many other prizes for participants! If you’re in Buenos Aires, Argentina, you can join us in person at the Innovation Park, where our team will be broadcasting the workshops!"


Digital Identity NZ

Digital Trust Framework: Launch & Future | July Newsletter


The Digital Identity Services Trust Framework (DISTF) Act took effect on the first day of the month, and included the establishment and implementation of the Regulator, the Trust Framework Authority. The launch was rather low-key, with the only discernible signal from the Department of Internal Affairs being updates to their digital government web pages to reflect this milestone. 

It was a different story in industry, however, where the occasion was covered nationally by DINZ, locally by RNZ, internationally by Biometric Update, and in social media posts, including my own and those from DINZ.

The quote that really stuck was this one from Victoria University’s Professor of Informatics Markus Luczak-Roesch: “There’s a huge risk of doing nothing. Which is why it’s good that we’re doing something.” He’s absolutely right. It’s been over seven years of policy work at the DIA to reach this point, which I described as ‘the end of the beginning’. While a challenging journey, Aotearoa can build from here with those that want to opt-in.

Next month’s Digital Trust Hui Taumata ahead of Net Hui and The Point 2024, will kick off with a keynote by Microsoft’s global identity standards lead and past DIACC TFEC member Juliana Cafik. The panel that follows will discuss NZ’s Digital Identity Trust Framework, representing organisations that could be potential Relying Parties/Verifiers in Aotearoa under the DISTF regulation. The Trust Framework market model would see such parties seek out Digital Identity Service Provider/Issuers to deliver privacy-aware, cryptographically secured verified credentials, a topic that I blog about here. Publicly, it’s known that MSD and HNZ are piloting DIA’s platform, with RealMe as a notional issuer.

Additionally, the event will cover Digital Public Infrastructure, AI, biometrics, digital acceptance networks, digital drivers’ licences, the Metaverse, passkeys, digital cash, next generation payments, and the challenges of delegated administration across communities and much more. It’s all there, along with a panel of four experts who will review the sessions from a Te Ao Māori perspective.

In short, this year’s Digital Trust Hui Taumata will be like no other. The wait is over, and the rubber is hitting the road for the DISTF. What matters now is scale – will they come?

Lastly, I’m very excited to tell you that the DINZ podcast series is almost ready for launch so do keep an eye out for the first episode dropping very soon.

Ngā mihi
Colin Wallis
Executive Director, Digital Identity NZ

Read the full news here: Digital Trust Framework: Launch & Future | July Newsletter

SUBSCRIBE FOR MORE

The post Digital Trust Framework: Launch & Future | July Newsletter appeared first on Digital Identity New Zealand.

Tuesday, 23. July 2024

Digital Identity NZ

Will the Digital Trust Hui Taumata 2024 move the dial?


Deep thought has gone into building the agenda for next month’s Digital Trust Hui Taumata, ahead of Net Hui and The Point 2024, so that conversations live on and build out later in the year and into subsequent years. 

Significant attention will be devoted to Trust Frameworks given the Digital Identity Services Trust Framework (DISTF) regulation coming into play on 1 July. Immediately following Minister Collins’ opening remarks, Microsoft’s global identity standards lead and past DIACC TFEC member Juliana Cafik will deliver an intensely interesting first keynote – The international landscape for Digital Identity Trust Frameworks and how NZ compares. Trust frameworks already exist and we use them daily – for example using your bank card to withdraw cash from another bank’s ATM. The panel that follows, representing organisations that could be potential Relying Parties (RPs)/Verifiers under the DISTF, discuss how they see Trust Frameworks playing out. To be relieved of the burden and to minimise risk, these parties notionally look for accredited Digital Identity Service Provider/Issuers to deliver privacy-aware, cryptographically secured verified credentials.  

Two of these three panellists come from regulated industries while the other is a key government agency; in all cases the failure to verify parties correctly could have devastating consequences. Other regulated industries and government agencies that need similar verification processes include estate agents, rental companies, law firms, financial services, insurance companies, pharmacies, doctors’ surgeries, the Police Vetting Service, driver licensing, firearms licensing, the box store where you take out a loan for your new appliance, loyalty scheme registration – and the list goes on. Representatives from the Regulator, the DISTF Trust Framework Authority, will lead a Roundtable discussion after lunch where delegates can pose their questions.

There are multiple paths to achieve this nirvana of privacy-aware, cryptographically secured verified credentials available to all people under the auspices of a Trust Framework, which is why, straight after the Trust Frameworks panel, Worldline’s Conrad Morgan will keynote a complementary path – ‘Turning transactions into interactions – building New Zealand’s first digital identity acceptance network’.

Supporting Trust Frameworks are, increasingly, biometrics and AI – both of which need demystifying for the public to gain confidence in them – along with Digital Public Infrastructure, the Metaverse, passkeys, digital cash, digital driver’s licences, next generation payments, the critical need for digital inclusion, and the challenges of delegated administration across communities. The agenda comprises local and international speakers covering these topics as well, with a panel of four experts reviewing the sessions from a Te Ao Māori perspective.

The richness of the content to be presented at this year’s event far exceeds previous years. So do not be surprised when the 2024 Digital Trust Hui Taumata is dropped into conversations in years to come.

Colin Wallis, Executive Director, DINZ

The post Will the Digital Trust Hui Taumata 2024 move the dial? appeared first on Digital Identity New Zealand.


Energy Web

Energy Web Announces Strategic Partnership with Acurast to Advance Sustainability and Innovation in…

Energy Web Announces Strategic Partnership with Acurast to Advance Sustainability and Innovation in Energy Sector

Integration of Decentralized Compute Networks to Enhance Efficiency and Sustainability in Global Energy Landscape

July 23, 2024 — ZUG, Switzerland — Energy Web, a pioneer in developing open-source technology solutions for the energy sector, is thrilled to announce a strategic partnership with Acurast, an innovative leader in decentralized computing. This collaboration marks a significant step forward in enhancing the capabilities of both platforms while driving sustainability and technological innovation across the global energy landscape.

The partnership aims to seamlessly integrate Energy Web worker node networks with Acurast’s Decentralized Compute network. This integration will enable Energy Web users to host Energy Web workers on Acurast’s secure and widely distributed compute protocol. The primary goal is to facilitate a more efficient and scalable deployment of digital energy solutions.

In a move to expand its digital footprint, Energy Web will leverage the Acurast SDK to roll out a new mobile application. This collaboration will not only enhance mobile accessibility but also significantly improve the functionality, providing users with robust tools for managing their energy resources efficiently.

Both Acurast and Energy Web Foundation are committed to sustainability. Acurast’s approach to upcycling smartphones, giving them a second life as compute units in its decentralized network, dramatically reduces electronic waste and promotes efficient resource use. Similarly, Energy Web Foundation is dedicated to accelerating the clean energy transition through its development of cutting-edge, open-source technologies for energy systems.

By combining their unique resources and expertise, Acurast and Energy Web Foundation aim to foster significant innovation, efficiency, and sustainability in the energy sector. This partnership underscores their shared vision of a more sustainable and decentralized future, driving positive change across communities worldwide.

About Energy Web
Energy Web is a global non-profit organization accelerating the energy transition by developing and deploying open-source decentralized technologies. Our solutions leverage blockchain to enable new market mechanisms and decentralized applications that empower energy companies, grid operators, and customers to take control of their energy futures.

Energy Web Announces Strategic Partnership with Acurast to Advance Sustainability and Innovation in… was originally published in Energy Web on Medium, where people are continuing the conversation by highlighting and responding to this story.


Kantara Initiative

US Multiple Award Schedule requires CSPs to be NIST 800-63 compliant  


In May 2024, the US government’s General Services Administration (GSA) updated its Multiple Award Schedule (MAS) Contract with a new Special Item Number (SIN 541519CSP, Credential Service Providers) under the IT Large Category. SIN 541519CSP is designed to help federal agencies ensure that any IT services procured meet the requirements of National Institute of Standards and Technology (NIST) Special Publication (SP) 800-63 and deliver digital-identity-compliant services.

To provide credential services under the new SIN, companies must meet specific instructions and requirements. SIN 541519CSP was created to meet the increasing need for robust, trustworthy credential service providers, and will help government agencies quickly identify credential service providers that have been vetted against the government’s standard requirements. If your company offers credential services and meets the requirements, obtaining SIN 541519CSP will place your company in a better position to capture bids as agencies look to acquire NIST 800-63 compliant services.

In order to be included on the Schedule, Credential Service Providers must either be listed on the Kantara Trust Status List or provide a letter of approval from Kantara Initiative, or another GSA-approved third party that can assure conformance to NIST SP 800-63. To begin the process, you’ll need to complete forms that can be found on idmanagement.gov. Since both state and federal government agencies are permitted to use the vendors on this schedule for credential services, this considerably extends opportunities for Kantara-certified companies.

Read the full instructions on how to be included on MAS, including the Technical Evaluation Criteria.

The post US Multiple Award Schedule requires CSPs to be NIST 800-63 compliant   appeared first on Kantara Initiative.


ResofWorld

Pressured to relocate, Microsoft’s AI engineers in China must choose between homeland and career

As geopolitical tensions grow, many employees have decided that a career with the Silicon Valley tech giant isn’t worth giving up the comforts of home.
Alan, a young engineer at Microsoft, has been living a comfortable life in Beijing working for the tech giant on cloud computing. He earns six times the average income in...

In Indonesia, social media is a “hunting ground” for religious minorities

Conservative Muslim influencers spread hate speech to their millions of followers on TikTok and YouTube, with little pushback from authorities or platforms.
For nearly two decades, hundreds of Ahmadiyya Muslims have lived in a cramped government shelter on the Indonesian island of Lombok, after they were attacked by a mob that accused them...

Blockchain Commons

2024 Q2 Blockchain Commons Report

Blockchain Commons is a not-for-profit organization that advocates for the creation of open, interoperable, secure, and compassionate digital infrastructure. Our goal is to enable people to control their own digital destiny and to maintain their human dignity online. We do this through the creation of interoperable specifications and reference software that demonstrate how to create and manage digi

Blockchain Commons is a not-for-profit organization that advocates for the creation of open, interoperable, secure, and compassionate digital infrastructure. Our goal is to enable people to control their own digital destiny and to maintain their human dignity online. We do this through the creation of interoperable specifications and reference software that demonstrate how to create and manage digital assets in ways that are private, independent, resilient, and open.

In Q2 of 2024, we advanced these principles through the following work:

Gordian Envelope Updates: Expanded Developer Pages; Request/Response Presentation; Graph Representation
Gordian Meetings: FROST Presentation; PayJoin Presentation; All the Rest
Seedtool-Rust Release: seedtool-cli-rust; Seedtool Manual
dCBOR Adoption: cbor.me; cbor2; QCBOR; IANA Assignment of Tag 201
GSTP Improvements
SSH Research: ssh-envelope Experiment in Python; SSH Key Support for envelope-cli; More to Come
Architectural Articles: Minimum Viable Architecture; Authentication Patterns
DID Futures: W3C DID 1.1 WG; RWOT 13
Grants/Funding
What’s Next?

Gordian Envelope Updates

Gordian Envelope, Blockchain Commons’ “Smart Document” system, continues to be a major focus. Here’s what that meant in Q2.

Expanded Developer Pages. The developer pages were updated with a new executive summary and feature list to clarify the capabilities and advantages of using Envelope. (More executive summaries of our technology to follow!)

Request/Response Presentation. Our May Gordian Developers Meeting included a presentation on Request/Response, which is an interoperable communication methodology using Gordian Envelope. Why use it? It can make complex digital-asset procedures more accessible by using automation to dramatically reduce the amount of human interaction needed, yet it also preserves security by ensuring that human choices are required whenever data is transmitted from one device to another. (But watch the presentation for more!)

Graph Representation. Blockchain Commons has a new research paper out on Representing Graphs with Envelope, which proposes an architecture for representing many types of graphs, enabling the use of Envelope for a variety of graph-based structures and algorithms.

Gordian Meetings

Gordian Developer Meetings are how we bring the wallet community together to talk about our interoperable specifications. We’ve been thrilled to expand that in the last quarter with some feature presentations from experts in the field.

FROST Presentation. April saw a special presentation on FROST by Jesse Posner that not only talked about his work to date, but also some of the emerging capabilities of FROST, such as the ability to regenerate shares or even change thresholds without changing the underlying secret! We’ve long thought FROST was a great next-generation resilience solution for digital assets, and so appreciate Jesse talking to our community about why it’s so exciting. See the complete video of our April meeting for more.

PayJoin Presentation. Privacy is one of our fundamental principles for Gordian design. It’s also a principle that will be better supported in Bitcoin with a new version of PayJoin. Dan Gould was kind enough to give a full presentation on the updates he’s working on at our May meeting. We’ve got a video of just his PayJoin presentation.

All the Rest. Both meetings of course also included details on Blockchain Commons’ own work (much of which is detailed in this report). The Gordian Developer meetings continue on the first Wednesday of every month. We’ve also already scheduled a few feature presentations for the rest of the year. On August 7th, we’ll have a special presentation on BIP-85, then on December 4th, we’ll have another FROST presentation for wallet developers. If you’d like to make a special presentation in September, October, or November on a topic of interest to wallet developers, let us know!

Also, if you’re a cryptographer, spec designer, or library developer who is working to implement FROST, please be sure to sign up for our FROST implementers announcements-only list so that you can receive invites for our second FROST Implementers Round Table, which will be on September 18 thanks to support from the Human Rights Foundation (HRF).

Seedtool-Rust Release

Blockchain Commons’ newest reference application is seedtool-cli for Rust.

seedtool-cli-rust. Seedtool is a domain-specific application for creating, reconstructing, translating, and backing up cryptographic seeds. Blockchain Commons’ new Rust-based Seedtool replaces our older C++-based CLI and provides broader support for Gordian Envelope, including the ability to output SSKR shares as Gordian Envelopes, which can back up a seed using Shamir’s Secret Sharing. Seedtool’s Gordian Envelopes can then be piped into envelope-cli-rust for compression, encryption, or the addition of further metadata.
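SSKR’s Shamir-based sharding is more involved than can be shown here, but the basic idea of secret splitting can be sketched with a toy 2-of-2 XOR split (this is not SSKR or Shamir’s scheme, just an intuition for how a seed can become shares that are individually useless):

```python
import secrets

def split_2_of_2(seed: bytes) -> tuple[bytes, bytes]:
    """Toy 2-of-2 split: XOR the seed with a one-time random pad.
    Either share alone is indistinguishable from random noise;
    both together recover the seed exactly."""
    pad = secrets.token_bytes(len(seed))
    share = bytes(a ^ b for a, b in zip(seed, pad))
    return pad, share

def combine(share1: bytes, share2: bytes) -> bytes:
    """XOR the shares back together to reconstruct the seed."""
    return bytes(a ^ b for a, b in zip(share1, share2))

seed = secrets.token_bytes(16)     # a 128-bit seed
s1, s2 = split_2_of_2(seed)
assert combine(s1, s2) == seed     # round-trips losslessly
```

Real Shamir sharing generalizes this to arbitrary m-of-n thresholds using polynomial interpolation over a finite field, which is what SSKR implements.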

Seedtool Manual. For more on seedtool-cli-rust, check out the full user manual, which explains how to use all of its functionality and why it’s important.

dCBOR Adoption

dCBOR is one of the foundations of Envelope, as it allows for the deterministic ordering of data, which is crucial for a hashed data system like Envelope. The IETF dCBOR Internet-Draft updated from v8 to v10 over Q2, with most of those changes due to expanding support for the spec. We’re still hoping to see the Internet-Draft finalized soon!
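As a rough illustration of why deterministic ordering matters for a hashed data system, here is a minimal stdlib-only Python sketch (not the dCBOR spec or any of the libraries below) of the map-key ordering rule: entries are sorted by the bytes of their encoded keys, so the same map always serializes, and therefore hashes, the same way:

```python
def encode_key(key) -> bytes:
    """Encode a tiny subset of CBOR keys: uints 0-23 and short text strings."""
    if isinstance(key, int) and 0 <= key <= 23:
        return bytes([key])                      # major type 0, value in the head
    if isinstance(key, str) and len(key.encode()) <= 23:
        data = key.encode()
        return bytes([0x60 | len(data)]) + data  # major type 3 (text string)
    raise ValueError("unsupported key for this sketch")

def deterministic_entries(mapping: dict) -> list:
    """Return map entries sorted by the encoded bytes of their keys."""
    return sorted(mapping.items(), key=lambda kv: encode_key(kv[0]))

# Two maps with the same content but different insertion orders yield
# identical entry order -- and thus identical encodings and hashes.
assert deterministic_entries({"b": 2, "a": 1}) == \
       deterministic_entries({"a": 1, "b": 2}) == [("a", 1), ("b", 2)]
```

Without a rule like this, semantically equal maps could serialize differently and produce different hashes, breaking Envelope’s hash-based structure.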

cbor.me. The CBOR Playground is Carsten Bormann’s foundational diagnostic site for CBOR. It now supports dCBOR thanks to a new Ruby Gem that Carsten authored.

cbor2. Joe Hildebrand’s cbor2 library for Typescript has also been expanded to support dCBOR.

QCBOR. Laurence Lundblade’s QCBOR library (which is written in C) now supports dCBOR in its development branch.

IANA Assignment of Tag 201. Finally, 201 is now officially the “enclosed dCBOR” tag for CBOR. This is also critical for Gordian Envelope, which uses this tag to wrap dCBOR in each of an envelope’s “leaf” nodes.

GSTP Improvements

Gordian Sealed Transaction Protocol (GSTP) is a Gordian Envelope extension. It allows for Envelope Requests and Responses to be sent in a secure way and is a critical element of Blockchain Commons’ Collaborative Seed Recovery system, which enables the storage of SSKR shares in a Gordian Depository.

GSTP Advances. Thanks to support from our Research Sponsor, Foundation Devices, Blockchain Commons was able to expend considerable engineering work on GSTP in the last quarter, resulting in more fluent API patterns for building GSTP requests and responses. In addition, GSTP now supports bidirectional self-encrypted state with a unique and powerful new feature that we are calling Encrypted State Continuations (ESC). Overall, GSTP is a system that is secure, distributed, and transportation-agnostic. In a world where we could be sending digital-asset info by NFC, Bluetooth, or QR codes, it’s a critical security measure. See our presentation from the most recent Gordian Developers Meeting for more!

SSH Research

SSH has long been used as an authentication system, primarily for accessing UNIX computers. More recently, however, it has seen increasing use as a signing system as well, primarily thanks to extensions in Git. That has led Blockchain Commons to experiment with integrating SSH keys into Envelope. (This has also demonstrated the flexibility of Envelope through the addition of these signing methodologies.) We’ve now got some first results.

ssh-envelope Experiment in Python. Early in the quarter, we produced ssh-envelope, an experimental Python program that worked with both ssh-keygen and envelope-cli. But, thanks to some very rapid development, we’ve already moved beyond that.

SSH Key Support for envelope-cli. We’ve since integrated SSH key support throughout our Rust stack, primarily affecting our bc-components and bc-envelope Rust crates. This allowed us to bring our SSH key support fully into the Rust envelope-cli, which you can now use for SSH signing.

More to Come. We’re still working on processes that will allow for the safe, secure, and reliable signing of software releases, something that we talked about extensively in our software use cases. You can see some more of our work-in-progress in a discussion of SSH Key Best Practices. We hope to have more on using SSH to enable resilient & secure software releases later in the year.
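For background, the plain OpenSSH signing workflow that this release-signing work builds on can be sketched as follows (key path, file names, and principal are all illustrative):

```shell
# Generate a throwaway Ed25519 key with no passphrase, for demonstration only.
ssh-keygen -t ed25519 -f ./demo_key -N "" -q

# Sign a file in the "file" namespace; this writes artifact.txt.sig.
echo "release artifact" > artifact.txt
ssh-keygen -Y sign -f ./demo_key -n file artifact.txt

# Verify against an allowed_signers list mapping a principal to the public key.
printf 'demo@example.com %s\n' "$(cat ./demo_key.pub)" > allowed_signers
ssh-keygen -Y verify -f allowed_signers -I demo@example.com -n file \
  -s artifact.txt.sig < artifact.txt
```

Wrapping such signatures in Envelope then allows metadata (timestamps, attestations) to be attached alongside them.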

Architectural Articles

Blockchain Commons expresses a lot of its more architectural thoughts as articles. There were two major articles in Q2.

Minimum Viable Architecture. Our first major article for the quarter focused on the methodology of Minimum Viable Architecture (MVA). Many companies still focus on Minimum Viable Products. Our article advocates instead looking at the big picture (with lots of discussion on why that’s important).

Authentication Patterns. Design patterns are a crucial element in architectural design. Much as with the adversaries found in #SmartCustody, design patterns allow you to put together a larger system piece by piece. As part of a guide to the strength of heterogeneity in architectural design, Blockchain Commons penned a set of authentication design patterns. We’d like to do more to fill out the space, but for now feel like this is a good first cut that shows the value of the design style.

DID Futures

The Blockchain Commons principals have been involved with DIDs since Christopher Allen founded Rebooting Web of Trust in 2015.

W3C DID 1.1 WG. After a hiatus, the W3C DID working group has been rechartered through 2026. Christopher Allen continues as an Invited Expert, focused on a variety of privacy issues, including elision, DID registration, and DID resolver issues.

RWOT 13. Meanwhile, Rebooting the Web of Trust continues to be on the frontline for DID advancements, with Christopher still the chair of the organization and Shannon Appelcline the editor-in-chief. RWOT13 is finally back in the USA, with the early bird deadline for advance-reading papers at the start of August.

Grants/Funding

As we’ve written elsewhere, funding has become more difficult in the last year because of large-scale financial factors such as inflation and the resultant increase in interest rates. Blockchain Commons has responded by working more closely with some of our partners on topics of special interest to them and by seeking out grants.

Thanks to Human Rights Foundation for their grant enabling our continued support of FROST work.

Thanks to Foundation Devices for their support of GSTP work.

Thanks to Digital Contract Design for their support of our advocacy over the last year.

Please consider becoming a personal or corporate sponsor of Blockchain Commons so that our work can continue. Or, if you want support to integrate or expand one of Blockchain Commons’ existing projects (such as SSKR, Envelope, or the Gordian Depositories) in an open manner, to meet your company’s needs, contact us directly about becoming a Research Sponsor.

Also, please let us know of any grants or awards that you think would be closely aligned with our work at Blockchain Commons, so that we can apply.

What’s Next?

Coming up:

More work on Envelope & GSTP. More reveals of our SSH work. New musings on cryptographic “cliques”.

We’re looking forward to Q3!

Monday, 22. July 2024

EdgeSecure

Edge Welcomes the American Association of Colleges and Universities (AAC&U) to the EdgeMarket Affiliate Partner Program

The post Edge Welcomes the American Association of Colleges and Universities (AAC&U) to the EdgeMarket Affiliate Partner Program appeared first on NJEdge Inc.

NEWARK, NJ, July 24, 2024 – Edge is pleased to announce that the American Association of Colleges and Universities (AAC&U) has joined the EdgeMarket cooperative as an Affiliate Partner. Through this partnership, AAC&U and its members can take advantage of thousands of services and solutions through EdgeMarket’s streamlined procurement process. Edge and AAC&U will mutually explore new ways to add value to the constituencies they serve.

AAC&U is a global membership organization dedicated to advancing the vitality and democratic purposes of undergraduate liberal education. Serving administrators, faculty, staff, and students at nearly 1,000 colleges and universities across the country and around the world, AAC&U serves as a catalyst and facilitator for innovations that improve teaching and learning and support student success. AAC&U offers special discounts to its members on select products and services from a range of higher education providers, and now as an EdgeMarket Affiliate Partner, that catalog has greatly expanded.

Designed to reduce the time, cost, and hassle of purchasing products and services, the EdgeMarket portal provides easy access to a variety of procured point solutions and services, including one of the most powerful procured IT and Services catalogs in the nation.

“This is much more than a transactional relationship; this is a partnership where we will seek ways to combine our strengths to create new opportunities for the higher education community. We are excited to learn more about what is important to AAC&U and its members and how Edge and EdgeMarket can add to AAC&U’s legacy of deep and positive impact.”

— Dan Miller
Associate Vice President, EdgeMarket and Solution Strategy
Edge

“The ethos of Edge aligns well with AAC&U’s vision of improving educational quality and equity, and EdgeMarket now serves as pathway to the essential technologies and services that support AAC&U’s members’ pursuit of that vision,” said Dan Miller, Associate Vice President, EdgeMarket and Solution Strategy. “This is much more than a transactional relationship; this is a partnership where we will seek ways to combine our strengths to create new opportunities for the higher education community. We are excited to learn more about what is important to AAC&U and its members and how Edge and EdgeMarket can add to AAC&U’s legacy of deep and positive impact.”

To learn more about the EdgeMarket Affiliate Partner Program visit us here.

About the American Association of Colleges and Universities: AAC&U is a global membership organization dedicated to advancing the democratic purposes of higher education by promoting equity, innovation, and excellence in liberal education. Through its programs and events, publications and research, public advocacy, and campus-based projects, AAC&U serves as a catalyst and facilitator for innovations that improve educational quality and equity and that support the success of all students. In addition to accredited public and private, two-year and four-year colleges and universities, and state higher education systems and agencies throughout the United States, AAC&U’s membership includes degree-granting higher education institutions around the world as well as other organizations and individuals. To learn more, visit www.aacu.org.

About Edge: Edge serves as a member-owned, nonprofit provider of high performance optical fiber networking and internetworking, Internet2, and a vast array of best-in-class technology solutions for cybersecurity, educational technologies, cloud computing, and professional managed services. Edge provides these solutions to colleges and universities, K-12 school districts, government entities, hospital networks and nonprofit business entities as part of a membership-based consortium. Edge’s membership spans the northeast, along with a growing list of EdgeMarket participants nationwide. Edge’s common good mission ensures success by empowering members for digital transformation with affordable, reliable and thought-leading purpose-built, advanced connectivity, technologies and services.

The post Edge Welcomes the American Association of Colleges and Universities (AAC&U) to the EdgeMarket Affiliate Partner Program appeared first on NJEdge Inc.


FIDO Alliance

Strengthening Authentication with Passkeys in Automotive and Beyond

On July 16th, 2024, the FIDO Alliance held a seminar focused on the fit for FIDO authentication and device onboarding within the automotive industry. Co-hosted with Swissbit, the event had […]

On July 16th, 2024, the FIDO Alliance held a seminar focused on the fit of FIDO authentication and device onboarding for the automotive industry. Co-hosted with Swissbit, the event had over 100 attendees who heard from various stakeholders on the need and opportunity for standards-based approaches to securing the automotive workforce and manufacturing process. Themes included how passkeys and FIDO-certified biometrics can help transform the future of in-vehicle experiences, especially with in-car payments, smart cars, and IoT.

FIDO Momentum in the Automotive Industry

Like just about every market sector, the automotive industry is plagued by risks and ramifications associated with decades of relying on passwords – and is also uniquely poised to improve the user experience by embracing passkeys for user authentication.

With smart cars having embedded technology that connects to digital experiences, several innovations are primed for take-off in the automotive industry. With nearly 100 million vehicles expected to be making payments by 2026, up from just 2.3 million in 2021, passkeys will be crucial to simplifying the in-vehicle user experience. At the same time, manufacturers have the opportunity to secure IoT and embedded devices to improve customer experiences on and off the road.

Manufacturing and Smart Car Case Studies

On the workforce front, the event featured a case study from MTRIX and considerations on how to deploy FIDO security keys to a manufacturer’s workforce – contemplating the many types and locations of workers for today’s global manufacturers. This case study reinforced the factors called out in a presentation by Infineon on the regulatory-driven push and pull with FIDO authentication.

VinCSS described how FIDO Device Onboard is being used today to secure the smart car ecosystem both at point of manufacturing as well as for after-market use cases.

Using Passkeys for In-Vehicle Payments

The final block of sessions looked more closely at our in-vehicle future – including an overview of current trends for in-vehicle payments. Visa and Starfish then presented a blueprint and demo respectively for a standards-based approach for in-vehicle payments before Qualcomm wrapped things up with their vision for a digital chassis as the foundation for a software-defined vehicle that contemplates the need for secure identity, payments and driver/passenger personalization.

Driving FIDO in the Automotive Industry – Next Steps

Interested in this seminar’s content? Find these presentations and more on the Munich Seminar event page.

The FIDO Alliance welcomes input from the public and the identity security community on FIDO’s future in the automotive industry. Comments are welcome via our contact us page. For in-person connections, we encourage identity security and authentication professionals to join us at our conference, Authenticate, where there will be several automotive and passkey-related sessions, content, and peer networking. This year’s event will be held Oct. 14-16, 2024, in sunny southern California at the La Costa Omni Resort in Carlsbad, CA.


FIDO Munich Seminar: Strengthening Authentication with Passkeys in Automotive and Beyond

The FIDO Alliance recently held a seminar in Munich for a comprehensive dive into FIDO authentication and passkeys. The seminar, co-hosted by Swissbit, provided an exploration of the current state […]

The FIDO Alliance recently held a seminar in Munich for a comprehensive dive into FIDO authentication and passkeys. The seminar, co-hosted by Swissbit, provided an exploration of the current state of passwordless technology, detailed discussions on how passkeys work, their benefits, case studies, and practical implementation strategies. Attendees learned about current and emerging elements of the FIDO Certified program and how they pertain across sectors, including a focus on automotive and payments use cases. 

Attendees also had the opportunity to engage directly with those who are currently implementing FIDO technology through open Q&A and networking – plus the opportunity to see demos and meet the experts that can help move FIDO deployments forward.

View the seminar slides below. More slides will be added.

FIDO Munich Seminar Introduction to FIDO.pptx from FIDO Alliance

FIDO Munich Seminar Blueprint for In-Vehicle Payment Standard.pptx from FIDO Alliance

FIDO Munich Seminar FIDO Automotive Apps.pptx from FIDO Alliance

FIDO Munich Seminar: Biometrics and Passkeys for In-Vehicle Apps.pptx from FIDO Alliance

FIDO Munich Seminar: Strong Workforce Authn Push & Pull Factors.pptx from FIDO Alliance

FIDO Munich Seminar: Securing Smart Car.pptx from FIDO Alliance

FIDO Munich Seminar In-Vehicle Payment Trends.pptx from FIDO Alliance

FIDO Munich Seminar Workforce Authentication Case Study.pptx from FIDO Alliance

FIDO Munich Seminar: FIDO Tech Principles.pptx from FIDO Alliance

Energy Web

ECS4DRES: Shaping the Future of Renewable Energy Systems

A New Horizon Europe Project to Enhance Reliability and Resilience in Distributed Renewable Energy Across Europe We are excited to announce our new EU project, Electronic Components and Systems for Flexible, Coordinated, and Resilient Distributed Renewable Energy Systems (ECS4DRES). This groundbreaking initiative is co-funded by Horizon Europe and the Federal Government. In collaboration wi
A New Horizon Europe Project to Enhance Reliability and Resilience in Distributed Renewable Energy Across Europe

We are excited to announce our new EU project, Electronic Components and Systems for Flexible, Coordinated, and Resilient Distributed Renewable Energy Systems (ECS4DRES). This groundbreaking initiative is co-funded by Horizon Europe and the Federal Government.

In collaboration with 33 partners across 6 European countries, ECS4DRES aims to revolutionize the reliability, safety, and resilience of Distributed Renewable Energy Systems (DRES). By developing advanced monitoring and control technologies, the project will incorporate integrated sensors with energy harvesting functions, capable of various types of detection for the safety and monitoring of energy transfers. Additionally, ECS4DRES will deliver interoperable, low-latency communication systems, along with sophisticated algorithms, AI tools, and methods. These innovations will enable the widespread interconnection, monitoring, and management of numerous DRES, subsystems, and components, optimizing energy management between sources, loads, and storage, enhancing power quality, and ensuring resilient system operation.

ECS4DRES is committed to thorough validation of these technologies through a series of five relevant use cases and demonstrators. The project’s results will generate a wide range of scientific, technological, economic, environmental, and societal impacts on a global scale, meeting the needs of Original Equipment Manufacturers (OEMs), Distribution System Operators (DSOs), grid operators, EV charging station aggregators, energy communities, end customers, and academia.

By providing interoperable and tailored solutions in electronic control systems, sensor technology, and smart systems integration, ECS4DRES will facilitate the deployment and efficient, resilient operation of DRES, including the integration of hydrogen equipment and components.

As we embark on this ambitious project, we are reminded of the words of renowned futurist Alvin Toffler: “The great growling engine of change — technology.” ECS4DRES represents a significant leap forward in the technological advancement of renewable energy systems, driving us toward a more sustainable and resilient future.

About Energy Web
Energy Web is a global non-profit organization accelerating the energy transition by developing and deploying open-source decentralized technologies. Our solutions leverage blockchain to enable new market mechanisms and decentralized applications that empower energy companies, grid operators, and customers to take control of their energy futures.

ECS4DRES: Shaping the Future of Renewable Energy Systems was originally published in Energy Web on Medium, where people are continuing the conversation by highlighting and responding to this story.


Identity At The Center - Podcast

Join us on The Identity at the Center Podcast as we sit down

Join us on The Identity at the Center Podcast as we sit down with Joseph Carson, Chief Security Scientist and Advisory CISO at Delinea. In this new episode, we explore Joseph's fascinating journey in identity and access management, cybersecurity, and his firsthand experiences in Estonia's digital identity ecosystem. We delve into the challenges and triumphs of digital identity, the emerging field

Join us on The Identity at the Center Podcast as we sit down with Joseph Carson, Chief Security Scientist and Advisory CISO at Delinea.

In this new episode, we explore Joseph's fascinating journey in identity and access management, cybersecurity, and his firsthand experiences in Estonia's digital identity ecosystem. We delve into the challenges and triumphs of digital identity, the emerging field of ITDR, and the intersection of digital identity, authentication, and AI in cybersecurity.

Watch the episode: https://www.youtube.com/watch?v=klBxFLvUC78

More Info: idacpodcast.com

#iam #podcast #idac


ResofWorld

Ethiopians are struggling to keep up with the new “EV or nothing” policy

Ethiopia became the first country in the world to ban the import of gas and diesel cars. But the country has only around 50 charging stations.
When Araya Belete’s employer asked him to purchase four new cars in Addis Ababa last year, the IT professional quickly settled on an electric model, manufactured by China’s Kas Auto....

Saturday, 20. July 2024

ResofWorld

Bangladesh’s internet blackout immobilizes its booming tech industry

The government and internet service providers blame each other for the blackout amid massive protests.
Bangladesh’s tech industry has come to a halt as the nationwide internet blackouts entered a third day, leaving thousands of companies with financial and reputational losses, and workers feeling helpless....

Friday, 19. July 2024

OpenID

Calling all Implementers: Shared Signals Interop at Gartner IAM Summit

Join us for a Shared Signals Interop Event at Gartner IAM in Grapevine, Texas (December 9-11) Momentum continues to build around the OpenID Shared Signals Framework, CAEP, and RISC standards. Leading companies have announced their support, and implementations are now in production. Building on the success of the first CAEP interoperability event, The OpenID Foundation […] The post Calling all Im

Join us for a Shared Signals Interop Event at Gartner IAM in Grapevine, Texas (December 9-11)

Momentum continues to build around the OpenID Shared Signals Framework, CAEP, and RISC standards. Leading companies have announced their support, and implementations are now in production.

Building on the success of the first CAEP interoperability event, The OpenID Foundation and the Shared Signals Work Group (SSWG) are delighted to announce that we are returning to Gartner’s IAM Summit for a second interop event, this time in Grapevine Texas, December 9-11 2024. 

During this event, we will demonstrate interoperability of implementations of:

The Shared Signals Framework (SSF)
Continuous Access Evaluation Profile (CAEP)
Risk and Incident Sharing and Collaboration (RISC)
SCIM Events

How to Get Involved

There is room for up to 10 implementers and you are invited to register your interest with Atul Tulshibagwale, co-chair of the SSWG, at atul@sgnl.ai by July 29th, 2024. Note that this is not a final commitment to participate, but an expression of interest. The final scope of the event will be determined between the SSWG and the implementers who register interest. You will have the opportunity to confirm participation once all interested parties have met to agree on the scope and format of the interoperability event.

The Gartner IAM Summit will feature a breakout session and two interoperability sessions where the implementations will be discussed.

In the breakout session, we will display the matrix of interoperability test results for all committed implementers. In the interoperability sessions, conference attendees will have the opportunity to interact with the implementers and watch the live implementations interoperate with each other.

OIDF will publicize the participants’ interop test results after the event takes place – including whether they did or did not achieve interoperability for specific use cases.

Note that we can enable remote participation for offline testing and inclusion in test results, but remote participation is not possible during the summit.


About the OpenID Foundation

The OpenID Foundation (OIDF) is a global open standards body committed to helping people assert their identity wherever they choose. Founded in 2007, we are a community of technical experts leading the creation of open identity standards that are secure, interoperable, and privacy preserving. The Foundation’s OpenID Connect standard is now used by billions of people across millions of applications. In the last five years, the Financial Grade API has become the standard of choice for Open Banking and Open Data implementations, allowing people to access and share data across entities. Today, the OpenID Foundation’s standards are the connective tissue to enable people to assert their identity and access their data at scale, the scale of the internet, enabling “networks of networks” to interoperate globally. Individuals, companies, governments and non-profits are encouraged to join or participate. Find out more at openid.net.

The post Calling all Implementers: Shared Signals Interop at Gartner IAM Summit first appeared on OpenID Foundation.

Thursday, 18. July 2024

FIDO Alliance

Battling Deepfakes with Certified Identity Verification

The digital transformation and the proliferation of e-identity schemes have escalated the need for secure and reliable online identity verification methods, especially in light of the alarming trend of AI-generated […]

The digital transformation and the proliferation of e-identity schemes have escalated the need for secure and reliable online identity verification methods, especially in light of the alarming trend of AI-generated “deepfakes.” As internet users have learned about deepfakes, they have grown increasingly concerned about their identities being spoofed online, according to a new study conducted by the FIDO Alliance. Awareness of deepfakes, and of the risks they pose, has risen steadily as a result.

Amidst this landscape, the FIDO Alliance released its newest research in the eBook, Remote ID Verification – Bringing Confidence to Biometric Systems Consumer Insights 2024, which reveals insights from an independent study surveying 2,000 respondents in the U.S. and the U.K. on consumer perceptions on remote identity verification, online security, and biometrics. While the data showed consumer awareness and adoption of biometrics is increasing, consumers also expressed concerns about the rise of AI-generated deepfakes – reinforcing the need for preventative strategies and technologies focused on secure remote identity verification. 

What is a “deepfake”?

According to the Center for Internet Security, a deepfake consists of convincingly fabricated audio and video content designed to mislead audiences into believing that fabricated events or statements are real. These manipulations can create realistic yet entirely false representations of individuals through synthetic images or complete video footage. This manipulated audio/video content is dangerously effective at spreading false information. In cybersecurity, deepfakes are increasingly being used to spoof identities to fraudulently open accounts or take control of existing accounts.

With the advent of AI and the increasing use of face biometrics for remote identity verification, the deepfake risks to remote identity proofing (RIDP) methods have become a reality. Security researchers have been closely evaluating the identity verification risks associated with deepfakes to raise awareness of the rapidly changing threat landscape and support stronger countermeasures that enhance the trustworthiness and reliability of RIDP methods. In the European Union Agency for Cybersecurity’s (ENISA) latest remote ID report, researchers observed that deepfake injection attacks are increasing and becoming more difficult to mitigate.

Users Express Concerns about Deepfakes and ID Verification

With the rise of generative AI and deepfake videos in the news, consumer unease about the security of biometrics for online verification has heightened. The FIDO Alliance’s study shows these trends have not escaped the attention of consumers, who are increasingly using face biometrics to verify their identities online and are concerned about identity security.

On one hand, the study reinforced consumer preference for using biometrics in remote identity verification, with nearly half of the respondents indicating a preference to use face biometrics, especially for sensitive transactions, like financial services (48%). 

On the other hand, just over half of respondents revealed they are concerned about deepfakes when verifying identities online (52%).

Building Consumer Trust in Face Biometrics

As the concerns around deepfake security threats gain prominence, the industry has taken a significant step forward with the FIDO Alliance’s newly introduced Identity Verification certification program for Face Verification. This industry-first testing certification program, based on ISO standards, with requirements developed by the FIDO Alliance, aims to measure accuracy, liveness (including deepfake detection), and bias (including skin tone, age, and gender) in remote biometric identity verification technologies. By providing a framework for testing biometric performance and a network of accredited laboratories worldwide, this certification program standardizes and evaluates the performance of face verification systems while mitigating the impact of bias and security threats, like deepfakes.
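To make these performance measurements concrete, here is a minimal sketch of how two standard biometric accuracy metrics, false accept rate (FAR) and false reject rate (FRR), are computed from comparison scores at a decision threshold. The scores and the 0.5 threshold below are illustrative values only, not FIDO requirements or real test data.

```python
# Illustrative FAR/FRR computation for a similarity-score face comparator.
# Scores and threshold are made-up example values, not certification data.

def far_frr(genuine_scores, impostor_scores, threshold):
    """Return (FAR, FRR) for a comparator at a given threshold.

    FAR: fraction of impostor comparisons wrongly accepted (score >= threshold).
    FRR: fraction of genuine comparisons wrongly rejected (score < threshold).
    """
    far = sum(s >= threshold for s in impostor_scores) / len(impostor_scores)
    frr = sum(s < threshold for s in genuine_scores) / len(genuine_scores)
    return far, frr

genuine = [0.91, 0.87, 0.78, 0.95, 0.42]   # same-person comparisons
impostor = [0.12, 0.33, 0.55, 0.08, 0.27]  # different-person comparisons

far, frr = far_frr(genuine, impostor, threshold=0.5)
print(f"FAR={far:.2f} FRR={frr:.2f}")  # FAR=0.20 FRR=0.20
```

Raising the threshold trades FAR for FRR, which is why certification programs test both together rather than either in isolation.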

Certifying Identity Verification with the FIDO Alliance

The Identity Verification certifications that the FIDO Alliance provides offer industry providers the ability to demonstrate commitment to addressing bias and security threats in remote biometric identity verification technologies. With a focus on standardizing and enhancing the performance of face verification technologies, the Alliance released its new FIDO Certification Program to elevate the performance, security, and equity of biometric solutions for remote identity verification. Combined with its Document Authenticity (DocAuth) Certification Program, these two certifications work together to ensure identity verification solution providers can leverage FIDO’s independent testing and accredited laboratories as a market differentiator. 

What is the value for IDV Biometric Vendors?

- Independent validation of biometric performance
- Opportunity to understand gaps in product performance to then improve and align with market demands
- Demonstrate product performance to potential customers
- Improve market adoption by holding an industry-trusted certification
- Leverage one certification for many customers/relying parties
- Benefit from FIDO delta and derivative certifications for minor updates and extendability to vendor customers
- Reduce need to repeatedly participate in vendor bake-offs

What is the value for Relying Parties?

- One-of-a-kind, independent, third-party validation of biometric performance assessing accuracy, fairness, and robustness against spoofing attacks
- Provides a consistent, independent comparison of vendor products – eliminating the burden of maintaining one’s own program for evaluating biometric products
- Accelerates FIDO adoption toward passwordless authentication
- Commitment to ensure quality products for customers of the relying parties
- Requirements developed by a diverse, international group of stakeholders from industry, government, and subject matter experts
- Conforms to the FIDO Annexes published in ISO standards

What is the value of accredited laboratories?

FIDO Accredited Laboratories are available worldwide and follow a common set of requirements and rigorous evaluation processes defined by the FIDO Alliance Biometrics Working Group (BWG), as well as all relevant ISO standards. These laboratories are audited and trained by the FIDO Biometric Secretariat to ensure lab testing methodologies are compliant and utilize governance mechanisms per FIDO requirements. Laboratories perform biometric evaluations in alignment with audited FIDO accreditation processes. In contrast, bespoke, single-laboratory biometric evaluations may not garner sufficient trust from relying parties for authentication and remote identity verification use cases.

What are the ISO Standards that FIDO certification conforms to?

When a vendor invests in FIDO’s Face Verification Certification, they and their accredited lab adhere to the following ISO standards:

Terminology
- ISO/IEC 2382-37:2022 Information technology — Vocabulary — Part 37: Biometrics

Presentation Attack Detection
- ISO/IEC 30107-3:2023 Information technology — Biometric presentation attack detection — Part 3: Testing and reporting
- ISO/IEC 30107-4:2020 Information technology — Biometric presentation attack detection — Part 4: Profile for testing of mobile devices
- FIDO Annex, published 2024

Performance (e.g., FRR, FAR)
- ISO/IEC 19795-1:2021 Information technology — Biometric performance testing and reporting — Part 1: Principles and framework
- ISO/IEC 19795-9:2019 Information technology — Biometric performance testing and reporting — Part 9: Testing on mobile devices
- FIDO Annex, published 2019

Bias (differentials due to demographics)
- ISO/IEC 19795-10:2024 Information technology — Biometric performance testing and reporting — Part 10: Quantifying biometric system performance variation across demographic groups
- FIDO Annex, under development

Laboratory
- ISO/IEC 17025:2017 General requirements for the competence of testing and calibration laboratories

Learn More about FIDO IDV Certification

As organizations and policymakers navigate the evolving landscape of digital identity verification, these consumer insights serve as a testament to the pressing need for independently tested and accurate biometric systems. The FIDO Alliance’s new Face Verification Certification Program offers solution providers the opportunity to demonstrate deepfake prevention to relying parties and end users by testing for security, accuracy, and liveness.

Download the Remote ID Verification eBook here today, and discover the world-class offerings from FIDO’s certified providers that have invested in independent, accredited lab testing with FIDO certification.


DIF Blog

DIF announces DWN Community Node

The Decentralized Identity Foundation (DIF) today announced the availability of the Decentralized Web Node (DWN) Community Instance, operated by DIF and powered by Google Cloud. 

Decentralized Web Nodes (DWNs - also referred to as DWeb Nodes) are personal data stores that eliminate the need for individuals to trust apps to responsibly use and protect their data. Instead, data is owned and controlled by the individual — offering developers a brand new way to create apps that request individuals’ permission to read and access their data, but don’t store it.
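The permission model described above can be pictured with a small, self-contained toy. This is a conceptual illustration only, not the real DWN protocol or the Web5 SDK: data lives in the individual’s own store, and an app can read a record only if the owner has granted it access.

```python
# Toy model of the DWN idea: the individual owns the store; apps need grants.
# All names (ToyDWN, DIDs, record ids) are invented for illustration.

class ToyDWN:
    """The individual's data store; apps must hold a grant to read."""

    def __init__(self, owner_did):
        self.owner_did = owner_did
        self._records = {}    # record_id -> data
        self._grants = set()  # (app_did, record_id) pairs allowed to read

    def write(self, record_id, data):
        self._records[record_id] = data

    def grant_read(self, app_did, record_id):
        self._grants.add((app_did, record_id))

    def read(self, app_did, record_id):
        # The owner always reads their own data; apps need an explicit grant.
        if app_did != self.owner_did and (app_did, record_id) not in self._grants:
            raise PermissionError(f"{app_did} has no read grant for {record_id}")
        return self._records[record_id]

dwn = ToyDWN("did:example:alice")
dwn.write("health-record", {"blood_type": "O+"})
dwn.grant_read("did:example:clinic-app", "health-record")
print(dwn.read("did:example:clinic-app", "health-record"))  # {'blood_type': 'O+'}
```

The inversion relative to today’s web is the point: the app asks the store for permission, rather than the user handing data to the app’s servers.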

The Managed DWN service or “community node” will allow existing and new Google Cloud customers to more easily build test applications using DWNs. Rather than having to run their own DWN or server infrastructure to store data, developers will be able to leverage a DWN on Google Cloud to build test applications. 

Developers can use the community DWN node at no cost, including up to 1GB of storage per DID.

The service was launched during a well-attended DIF community call earlier today (event highlights below). 

The launch marks an exciting milestone in DIF’s mission to make it easy for developers to build using decentralized identity, and follows the recent establishment of the Veramo User Group and an upgrade to DIF’s Universal Resolver infrastructure. 

SC members Daniel Buchner and Markus Sabadello will share the stage with leaders from Google Cloud and DIF member org TBD at We Are Developers World Congress in Berlin tomorrow to shine a spotlight on the new service, including a live demo of applications built using DWNs.

Today’s launch event in brief 

Following an introduction by DIF’s Executive Director, Kim Hamilton Duffy, and DIF Ambassador Jeffrey Schwartz, Founder and CEO of Dentity, several companies that are deploying DWNs in real-world use cases gave case study presentations. 

Daniel Buchner and Andor Kesselman, co-chairs of the DWN work item within DIF’s Secure Data Storage working group, described how DWeb Nodes work, and what developers can do with them. 

A hot topic that surfaced during the discussion and subsequent Q&A, led by DIF’s Senior Director of Community Engagement, Limari Navarrete, was how developers can define protocols that give users fine-grained control over the data in their DWN. 

Protocols “encode rules around how people interact with my data,” Andor said. “It’s very powerful!” he added. 

Celina Villanueva from Extrimian described a pre-authorisation protocol designed to enable first responders at an emergency to access critical information in an accident victim’s DWN. The protocol powers a new service that will be piloted in Buenos Aires later this year. Extrimian also envisages the protocol enabling customers to pre-authorise their bank to access personal information when needed. 

Michal Jarmolkowicz described how Swiss Safe is working with clients in the travel and healthcare sectors to build new services around DWNs, including a protocol enabling patients to take control of their health record and grant specific, time-limited access to healthcare professionals. 
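A grant like the one described above can be sketched as a simple expiring capability. This is a hypothetical illustration of time-limited access, not Swiss Safe’s actual protocol; the class and DID names are invented.

```python
# Hypothetical time-limited access grant: valid only for one grantee, one
# record, and a bounded window. Not an actual DWN protocol definition.
import time

class TimedGrant:
    def __init__(self, grantee_did, record_id, ttl_seconds):
        self.grantee_did = grantee_did
        self.record_id = record_id
        self.expires_at = time.time() + ttl_seconds

    def allows(self, app_did, record_id, now=None):
        """True only for the right grantee, the right record, before expiry."""
        now = time.time() if now is None else now
        return (app_did == self.grantee_did
                and record_id == self.record_id
                and now < self.expires_at)

grant = TimedGrant("did:example:doctor", "health-record", ttl_seconds=3600)
print(grant.allows("did:example:doctor", "health-record"))   # True while valid
print(grant.allows("did:example:insurer", "health-record"))  # False: wrong grantee
```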

Another recurring topic was how DWNs and Verifiable Credentials can interact. Insights included how DWNs enable backup and recovery of VCs, and the benefits of combining VCs with a “bring your own data” model. 

There was also discussion about the adoption status of DWNs, and how to accelerate this. Jeffrey Schwartz said Dentity clearly sees an adoption cycle underway, based on Proofs of Concept they are involved in or aware of. Michal Jarmolkowicz noted that searching out CIOs and CTOs with an active “technology radar” (a service that flags and tracks emerging technologies) has proved fruitful for Swiss Safe, as they are more likely to be aware of Decentralized Identity and the value it offers. He added that Data Protection Officers often respond enthusiastically to DWNs.

Several participants wanted to know which cloud platforms and developer tools DWNs can be used with today. Daniel Buchner said Google has already adapted DWNs for Cloud SQL and Blobstore, while others are currently working on adapters for AWS S3 and other services. He also shared the insight that cloud providers are incentivized to enable developers to add more DWN protocols, since this drives more utility from customers’ data, encouraging them to increase their service usage. 

Wrapping up, he urged participants to “get involved and see what you can build. Together, DIF’s Community Node and the Web5 SDK make it pretty simple. Try writing your own protocols and have fun with it!”

What are Decentralized Web Nodes?

DWNs are personal data stores and peer-to-peer communication nodes that serve as the foundation for decentralized apps and protocols. They live on devices and can be replicated on hosted instances in the cloud. These data stores are a realization of true serverless applications in which developers can store app data without using centralized servers or an account with a centralized service.

DWNs are a foundational component of Web5, an open source platform that provides a new, reliable identity and trust layer for the Web to enable decentralized applications and protocols that solve real problems for people. Built on open standards developed by the World Wide Web Consortium (W3C) and DIF, Web5 consists of three key pillars: Decentralized Identifiers (DIDs), Verifiable Credentials (VCs), and Decentralized Web Nodes.

Why use DIF’s Managed DWN Service?

The service empowers developers to build decentralized apps that give users full ownership and control over their data, without needing to ask each user to deploy a local DWN. With DIF’s Managed DWN Service, developers can build decentralized apps with data in DWNs hosted on Google Cloud, making it easier than ever before to empower individuals with ownership and control of their data.

How do I get started with Managed DWNs? 

To start building decentralized apps on Web5, developers can visit developers.tbd.website. 

To access the DIF community node, start here.


Energy Web

Green Proofs by Energy Web Now Available as a Service

Enables energy companies to rapidly construct digital registries for green commodities

July 18, 2024 | Zug, Switzerland — Energy Web, a leading technology provider for the energy sector, is excited to announce the launch of Green Proofs as a Service, an advanced, cloud-based version of its acclaimed Green Proofs solution. This new offering enables businesses and organizations to rapidly construct digital registries for tracking, tracing, and exchanging digital certificates representing any green commodity with unprecedented flexibility and control.

Green Proofs as a Service includes the following key features:

- Customized Data Formats and Schema: Users can tailor data formats and schema specific to different green commodities, enabling any green commodity and associated data format to be supported.
- Configurable Business Logic and Rules: Administrators can define and adjust business logic and rules for the creation, transfer, issuance, and retirement of certificates, providing full control over the certification process.
- Comprehensive Registry Administration: The service includes all functionalities expected of a registry administrator, such as the ability to add and remove users from individual companies or multiple companies, enhancing security and user management.
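As a rough illustration of the certificate lifecycle such a registry manages (issue, transfer, retire), here is a minimal sketch. The class, method, and field names are invented for illustration and are not Energy Web’s actual Green Proofs API.

```python
# Illustrative green-commodity certificate registry: issue -> transfer -> retire.
# A retired certificate's environmental claim is permanently consumed, so it
# can no longer change hands. Names and fields are hypothetical.

class CertificateRegistry:
    def __init__(self):
        self._certs = {}  # cert_id -> certificate attributes

    def issue(self, cert_id, owner, commodity, volume):
        self._certs[cert_id] = {"owner": owner, "commodity": commodity,
                                "volume": volume, "retired": False}

    def transfer(self, cert_id, new_owner):
        cert = self._certs[cert_id]
        if cert["retired"]:
            raise ValueError("retired certificates cannot be transferred")
        cert["owner"] = new_owner

    def retire(self, cert_id):
        # Retirement claims the certificate's benefit; it is irreversible here.
        self._certs[cert_id]["retired"] = True

reg = CertificateRegistry()
reg.issue("SAF-001", "AirlineCo", "sustainable aviation fuel", 120)
reg.transfer("SAF-001", "LogisticsCo")
reg.retire("SAF-001")
```

The configurable business logic described above would live in rules like the transfer check: administrators decide which state transitions are legal for their commodity.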

Green Proofs has already demonstrated its efficacy and reliability in supporting multiple enterprise solutions. Notable implementations include the RMI / EDF Sustainable Aviation Fuel Certificate Registry, a low-carbon shipping registry, and multiple 24/7 renewable energy matching solutions. These use cases highlight the versatility and robustness of Green Proofs in real-world applications.

“Green Proofs as a Service marks a significant milestone for Energy Web and our commitment to driving innovation in the energy sector,” said Mani Hagh Sefat, CTO of Energy Web. “By offering Green Proofs via an as-a-service model, we help our clients innovate much faster by quickly putting a digital registry into their hands for experimentation and rapid prototyping.”

Green Proofs as a Service is now available to businesses and organizations worldwide who are interested in using digital registries to support any green commodity supply chain. For more information or to schedule a demo, please visit www.energyweb.org or contact hello@energyweb.org

About Energy Web
Energy Web is a global non-profit organization accelerating the energy transition by developing and deploying open-source decentralized technologies. Our solutions leverage blockchain to enable new market mechanisms and decentralized applications that empower energy companies, grid operators, and customers to take control of their energy futures.

Green Proofs by Energy Web Now Available as a Service was originally published in Energy Web on Medium, where people are continuing the conversation by highlighting and responding to this story.


Origin Trail

From Barcodes to Digital Links: Supercharging Trillions of Products for the Next 50 Years

Celebrating 50 Years of the GS1 Barcode

June 26 marked the 50th anniversary of the GS1 barcode, commemorating the first-ever product scan at a cash registry checkout. Over the decades, billions of products worldwide have been equipped with barcodes, streamlining and standardizing supply chain processes and adhering to GS1 standards.

As consumer demand for product information grew, regulatory requirements became stricter, and supply chain optimization pressures increased, the need for an updated barcode became evident. Enter the GS1 Digital Link, the barcode upgrade designed to provide dynamic access to comprehensive product information. Now, with leading retail and consumer goods companies actively supporting the transition to Digital Link QR codes, the stage is set for the traditional barcode to retire gracefully.

Setting a Strong Foundation for Digital Link with OriginTrail

For products and brands to fully benefit from the GS1 Digital Link transition, a robust, connected, and verifiable data foundation is crucial. Product data is often split across various supply chain partners, including manufacturers, logistics providers, wholesalers, retailers, and others. To connect billions of products to the internet in a meaningful way that provides genuine insights and business value, this scattered product data needs to be interconnected.

Scanning a Digital Link on a product and seeing the manufacturer’s information, such as production date, description, ingredients, and brand details is good. Scanning the same code and accessing comprehensive information about the product’s journey through the supply chain — including whether the ingredients were ecologically produced, if the product was stored at proper temperatures during transport, and how long it was in the supply chain — is much better. This is the true potential of the Digital Link.
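Under the hood, a GS1 Digital Link encodes GS1 application identifiers (AIs) as URI path segments, for example 01 for the GTIN, 10 for the batch/lot, and 21 for the serial number. The sketch below builds and parses such a URI; the helper functions are our own, and the GTIN and values are example data, not a registered product.

```python
# Sketch of the GS1 Digital Link URI structure: AI/value pairs as path segments.
# build_digital_link and parse_digital_link are illustrative helpers, not a
# GS1 library; values below are example data.

def build_digital_link(gtin, lot=None, serial=None, domain="https://id.gs1.org"):
    parts = [domain, "01", gtin]       # AI 01 = GTIN (always present)
    if lot:
        parts += ["10", lot]           # AI 10 = batch/lot
    if serial:
        parts += ["21", serial]        # AI 21 = serial number
    return "/".join(parts)

def parse_digital_link(uri):
    """Return a dict of AI -> value from a Digital Link URI path."""
    segments = uri.split("/")[3:]      # drop scheme, empty segment, domain
    return dict(zip(segments[0::2], segments[1::2]))

uri = build_digital_link("09506000134352", lot="LOT42", serial="SN-7")
print(uri)                   # https://id.gs1.org/01/09506000134352/10/LOT42/21/SN-7
print(parse_digital_link(uri))
```

Because the identifier is a plain web URI, a resolver can route the same scanned code to manufacturer data, supply chain records, or consumer content depending on who asks.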

Beyond consumer engagement, consider a business operating a rail or plane network being able to access details on a component’s manufacture, testing, and maintenance by scanning a Digital Link code. Such visibility would surely have been invaluable during the recent Boeing aircraft incidents.

This is where the OriginTrail Decentralized Knowledge Graph (DKG) and GS1 Digital Link are a match made in heaven. The DKG provides a verifiable and interconnected knowledge base encompassing product data, supply chain events, certifications, locations, and more — across organizations and data sources. With the new DKG V8, OriginTrail introduces the scalability needed to bring billions of products equipped with Digital Link into a world of standards-based, connected, and verifiable data. And the new DKG Edge Node concept empowers organizations and business networks to exchange product and other supply chain data with just a few clicks while maintaining data privacy, verifiability, and connectivity.

Supply chain data from multiple sources connected in a verifiable Decentralized Knowledge Graph.

As a longstanding partner of GS1, OriginTrail DKG is designed to natively support GS1 standards, including EPCIS, Core Business Vocabulary (CBV), Global Data Model (GDM), and Digital Link. This integration means that consumers, regulators, brands, and other stakeholders can access richer, more comprehensive, and trusted product data. The challenge now is to make this user experience seamless and simple, and there’s a tech perfect for the job — Artificial Intelligence (AI).

OriginTrail, Digital Link, and AI: A Consumer Engagement Power Throuple

Incorporating AI into the mix creates an incredibly powerful technology trio, enabling brands to enhance consumer engagement, based on connected and verifiable data spanning organizations, in unprecedented ways. And with the DKG Edge Node, AI capabilities come natively. Brands can thus offer personalized and tailored experiences by allowing customers to scan a product with a Digital Link QR code and ask anything — from brand details to product origins, sustainability, and environmental impact, all based on verifiable data from OriginTrail DKG.

This combination not only benefits consumers but also provides brands with valuable insights into customer preferences, allowing them to refine their business strategies. As billions of products transition from barcodes to Digital Link, the potential of this technology trio becomes evident. In fact, AI-powered product discovery, based on OriginTrail and Digital Link, is no longer a future concept but a current reality:

Some additional examples to check out:

- Check the origin » Perutnina Ptuj
- Church of Oak Whiskey Distillery

Simultaneously, organizations can leverage AI to better understand and enhance their supply chains, ensuring they receive accurate and verifiable responses rooted in data from across their business network. By simply scanning a Digital Link QR code on a product, pallet, or shipping container, users are immediately empowered to ask questions and get verifiable answers — from basic queries like “Where was this product manufactured?” to more complex ones such as “Was the temperature in this shipping container in line with expectations?” and “Give me a list of all train wagons that are likely to experience issues with their wheels in the next month.” Exciting stuff indeed.

Where do we go from here?

As billions of products transition from traditional barcodes to Digital Link QR codes, establishing a robust foundation of connected and verifiable data becomes paramount. OriginTrail is at the forefront of this transformation, with the new DKG V8 offering the scalability and simplicity necessary to realize its full potential. When combined with AI, this technology trio unlocks immense opportunities for brands to engage with their customers in a trusted and meaningful way.

But consumer engagement is just one area set to benefit significantly from this transition. Regulatory bodies will gain streamlined access to verifiable product data, and supply chain management will become more proactive and efficient. The coming months and years promise exciting advancements and opportunities, making this a pivotal moment in the evolution of product information and consumer engagement.

We are excited to see OriginTrail at the epicenter of it all, as we — Trace Labs, the core developers of OriginTrail — along with our ecosystem partners get ready to unveil the Digital Link support via the new DKG V8 at the GS1 Industry & Standards Event. Over 1,000 business leaders from 80+ countries will come together virtually to solve today’s greatest business challenges through the development and adoption of GS1 global standards.

For the GS1 Industry & Standards Event, register at: https://standards-event.gs1.org/

From Barcodes to Digital Links: Supercharging Trillions of Products for the Next 50 Years was originally published in OriginTrail on Medium, where people are continuing the conversation by highlighting and responding to this story.


Oasis Open Projects

Introducing the Coalition for Secure AI, an OASIS Open Project

Boston, MA – 18 July 2024 – The Coalition for Secure AI (CoSAI) was announced today at the Aspen Security Forum. Hosted by the OASIS global standards body, CoSAI is an open-source initiative designed to give all practitioners and developers the guidance and tools they need to create Secure-by-Design AI systems. CoSAI will foster a collaborative ecosystem to share open-source methodologies, standardized frameworks, and tools. 

CoSAI brings together a diverse range of stakeholders, including industry leaders, academics, and other experts, to address the fragmented landscape of AI security.

CoSAI’s founding Premier Sponsors are Google, IBM, Intel, Microsoft, NVIDIA, and PayPal. Additional founding Sponsors include Amazon, Anthropic, Cisco, Chainguard, Cohere, GenLab, OpenAI, and Wiz. CoSAI is an initiative to enhance trust and security in AI use and deployment. Its scope includes securely building, integrating, deploying, and operating AI systems, focusing on mitigating risks such as model theft, data poisoning, prompt injection, scaled abuse, and inference attacks. The project aims to develop comprehensive security measures that address AI systems’ classical and unique risks. CoSAI is an open-source community led by a Project Governing Board, which advances and manages its overall technical agenda, and a Technical Steering Committee of AI experts from academia and industry who will oversee its workstreams.

The Need for CoSAI

Artificial intelligence (AI) is rapidly transforming our world and holds immense potential to solve complex problems. To ensure trust in AI and drive responsible development, it is critical to develop and share methodologies that keep security at the forefront, identify and mitigate potential vulnerabilities in AI systems, and lead to the creation of systems that are Secure-by-Design.

Currently, securing AI and AI applications and services is a fragmented endeavor. Developers grapple with a patchwork of guidelines and standards which are often inconsistent and siloed. Assessing and mitigating AI-specific and prevalent risks without clear best practices and standardized approaches is a significant challenge for even the most experienced organizations.

With the support of industry leaders and experts, CoSAI is poised to make significant strides in establishing standardized practices that enhance AI security and build trust among stakeholders globally.

“CoSAI’s establishment was rooted in the necessity of democratizing the knowledge and advancements essential for the secure integration and deployment of AI,” said David LaBianca, Google, CoSAI Governing Board co-chair. “With the help of OASIS Open, we’re looking forward to continuing this work and collaboration among leading companies, experts, and academia.”

“We are committed to collaborating with organizations at the forefront of responsible and secure AI technology. Our goal is to eliminate redundancy and amplify our collective impact through key partnerships that focus on critical topics,” said Omar Santos, Cisco, CoSAI Governing Board co-chair. “At CoSAI, we will harness our combined expertise and resources to fast-track the development of robust AI security standards and practices that will benefit the entire industry.”

Initial Work

To start, CoSAI will form three workstreams, with plans to add more over time:

- Software supply chain security for AI systems: enhancing composition and provenance tracking to secure AI applications.
- Preparing defenders for a changing cybersecurity landscape: addressing investments and integration challenges in AI and classical systems.
- AI security governance: developing best practices and risk assessment frameworks for AI security.

Participation 

Everyone is welcome to contribute technically as part of the CoSAI open-source community. OASIS welcomes additional sponsorship support from companies involved in this space. Contact join@oasis-open.org for more information.  

Additional Information
CoSAI charter

Support for CoSAI

Amazon
“At Amazon, our top priority is safeguarding the security and confidentiality of customer data. From day one, AWS AI infrastructure and the Amazon services built on top of it have had security and privacy features built-in that give customers strong isolation with flexible control over their systems and data. As a sponsor of CoSAI, we’re excited to collaborate with the industry on developing needed standards and practices that will strengthen AI security for everyone.”
– Paul Vixie, VP/Distinguished Engineer and Deputy CISO, Amazon Web Services

Anthropic
“As a safety-focused organization, building and deploying secure AI models has been core to our mission from the start. We’re proud to partner with other industry leaders to help foster a secure AI ecosystem and collaborate on a set of technical security best practices and standards. We look forward to the work ahead with the coalition to encourage safe AI development.”
– Jason Clinton, Chief Information Security Officer, Anthropic 

Cisco
“Cisco is very excited to join forces with other industry leaders in the Coalition for Secure AI (CoSAI). This effort underscores our commitment to advancing AI security, developing standardized best practices, and ensuring that AI technologies are secure-by-design. Together with our partners, we aim to drive innovation and build trust in AI systems across all sectors.”
– Omar Santos, Distinguished Engineer, Cisco

Chainguard
“As we witness AI workloads evolving beyond simple applications to more sensitive and critical functions, ensuring their security becomes paramount. The current landscape is fragmented, with developers navigating through inconsistent and siloed guidelines. At Chainguard, we are excited to join CoSAI and contribute our expertise in creating secure-by-design AI systems. Together, we can set new benchmarks for AI security, ensuring that innovation progresses on a foundation of safety and reliability.” 
– Kim Lewandowski, Co Founder and Chief Product Officer, Chainguard

Cohere
“Cohere is proud to join the Coalition for Secure AI (CoSAI) to further our commitment to building frontier enterprise AI solutions with security and data privacy at the core. AI will have a transformative impact on businesses and we look forward to working with the rest of the industry to develop comprehensive standards that enhance trust and security to encourage wider adoption of this technology.” 
– Prutha Parikh, Head of Security, Cohere

GenLab
“Security requires a community to support, integrate, and promote best practices globally to ensure the stability and safety of AI. That’s why we are excited about being a member of CoSAI and helping discover and promote these practices within its own companies and the broader global ecosystem.”
– Daniel Riedel, Founder, GenLab Venture Studio

Google
“We’ve been using AI for many years and see the ongoing potential for defenders, but also recognize its opportunities for adversaries. CoSAI will help organizations, big and small, securely and responsibly integrate AI – helping them leverage its benefits while mitigating risks.”
– Heather Adkins, Vice President and Cybersecurity Resilience Officer, Google

IBM
“IBM is excited to join the Coalition for Secure AI (CoSAI), a new initiative that brings together industry leaders, organizations, and technology experts to develop standardized approaches to address AI cybersecurity. By participating in CoSAI, we are committed to fostering collaboration, innovation, and education, so that AI systems are more secure-by-design. This initiative will empower developers with the best practices, tools, and methodologies needed to safeguard AI solutions.”
– Alessandro Curioni, IBM Fellow, Vice President Europe and Africa and Director IBM Research Zurich

Intel
“The speed of AI innovation must be matched by the security of its creations. Intel is committed to advancing secure AI practices and doing so will require collaboration across the ecosystem. The Coalition for Secure AI (CoSAI) will provide security practitioners and developers with accessible guidance, resources and tools to create secure AI systems. We are proud to participate in this effort as a founding member alongside our CoSAI partners.”
– Dhinesh Manoharan, Vice President and General Manager, Security for AI & Security Research, Intel

Microsoft
“Microsoft remains steadfast in its commitment that safety and security be at the heart of AI system development. As a Founding Member of the Coalition for Secure AI, Microsoft will partner with similarly committed organizations towards creating industry standards for ensuring that AI systems and the machine learning required to develop them are built with security by default and with safe and responsible use and practices in mind. Through membership and partnership within the Coalition for Secure AI, Microsoft continues its commitment to empower every person and every organization on the planet to do more…securely.” 
– Yonatan Zunger, CVP, AI Safety & Security, Microsoft

NVIDIA
“As AI adoption continues to grow across industries, it’s paramount to ensure proper guidance and security measures when building and deploying models. As a founding member of the Coalition for Secure AI, NVIDIA is committed to building a community dedicated to making secure and trustworthy AI accessible to all.”
– Daniel Rohrer, VP of Software Product Security, Architecture and Research at NVIDIA

OpenAI
“Developing and deploying AI technologies that are secure and trustworthy is central to OpenAI’s mission. We believe that developing robust standards and practices is essential for ensuring the safe and responsible use of AI and we’re committed to collaborating across the industry to do so. Through our participation in CoSAI, we aim to contribute our expertise and resources to help create a secure AI ecosystem that benefits everyone.”
– Nick Hamilton, Head of Governance, Risk, and Compliance, OpenAI

PayPal
“PayPal is proud to partner with CoSAI to help shape the industry’s guidelines and standards for secure AI development. We are at the forefront of the ever-evolving cybersecurity landscape as we power about a quarter of the world’s e-commerce transactions every year. Ensuring that every transaction is safe and secure is our top priority. We are excited to collaborate with the coalition to develop comprehensive standards and practices that ensure safe, secure AI for everyone.”
– Shaun Khalfan, Chief Information Security Officer, PayPal

Wiz
“Like the early days of cloud, AI adoption has skyrocketed while governance and security must play catch up. Wiz believes in enabling organizations to tap into the transformative power of AI while staying secure. That belief is driving our participation in CoSAI, and we can’t wait to partner alongside so many thought leaders who are equally committed to the cause. The future is bright.”
– Ryan Kazanciyan, Chief Information Security Officer, Wiz

Media Inquiries:
Carol Geyer, carol.geyer@oasis-open.org

The post Introducing the Coalition for Secure AI, an OASIS Open Project appeared first on OASIS Open.


ResofWorld

Thailand’s big market for small trucks goes electric

Toyota and Isuzu have long dominated Thailand’s pickup truck market. As they prepare to launch EV trucks, they face competition from Chinese firms.
In the Thai beach town of Pattaya, travelers disembarking at the Bali Hai pier can hail a taxi or cram into a songthaew, a modified pickup truck. On a recent...

How Apple’s India gamble paid off

India’s growing middle class is fueling a billion-dollar sales surge.
A quick programming note: Exporter is going to be taking a summer break, but look for us back in your inbox in September. In the meantime, you can keep up...

DIF Blog

Decentralizing Trust in Identity Systems

DIF's Credential Trust Establishment Working Group has released a new white paper titled "Decentralizing Trust in Identity Systems", describing how to achieve scalable trust relationships in decentralized identity networks. The problem of trust in a decentralized identity ecosystem comes down to the simple question of whether

DIF's Credential Trust Establishment Working Group has released a new white paper titled "Decentralizing Trust in Identity Systems", describing how to achieve scalable trust relationships in decentralized identity networks.

The problem of trust in a decentralized identity ecosystem comes down to the simple question of whether a credential verifier should trust the issuer of a credential. This problem becomes increasingly complex as networks expand to include many credential issuers and delegated trust relationships.

This paper explores different trust network architectures, comparing risks, benefits, and tradeoffs. It highlights the advantages of the Credential Trust Establishment Specification, offering practical recommendations for developing and managing trust networks.

Highlights

Introduction to Trust Networks: A discussion on Trust Networks, with examples from various industries such as credit card networks, telecommunications, and online marketplaces.
Relevance to Decentralized Identity: How Trust Networks facilitate trust in decentralized identity ecosystems.
Architecture Comparisons: A comparison of API-oriented and data-oriented architectures, highlighting the advantages and disadvantages.
Credential Trust Establishment: An overview of the Credential Trust Establishment specification and how to get started.

Read the White Paper

To read the full white paper, please see the Credential Trust Establishment White Paper.

Learn More

If you would like to get in touch with us or become a member of the DIF community, please visit our website.

Can't get enough of DIF?
| Follow us on Twitter
| Join us on GitHub
| Subscribe on YouTube
| Read our DIF blog
| Read the archives

Wednesday, 17. July 2024

Ceramic Network

Optimizing Ceramic: How Pairwise Synchronization Enhances Decentralized Data Management

In the past months we have replaced the algorithm at the heart of the Ceramic database. This post explains why we made the change from multicast to pairwise synchronization, but first let's review the design motivations of Ceramic. “Information also wants to be expensive. Information Wants To Be Free.

In the past months we have replaced the algorithm at the heart of the Ceramic database. This post explains why we made the change from multicast to pairwise synchronization, but first let's review the design motivations of Ceramic.

“Information also wants to be expensive. Information Wants To Be Free. ...That tension will not go away.” – Stewart Brand. There is tension because data storage is a competitive market, but data retrieval can only be done by the service that has your data. At 3Box Labs, we want to catalyze a data ecosystem by making community-driven data distribution not only possible but available out of the box. Ceramic is a decentralized storage solution for apps dealing with multi-party data, and it is more scalable, faster, and cheaper than a blockchain.

Data vendor lock-in

Many organizations and individuals have data that they want to publish, and Ceramic lets them do so without data vendor lock-in for storing their own data. In the Web2 era, data often becomes ensnared within exclusive services, restricting its accessibility and durability. Access to this data requires obtaining permission from the service provider. Numerous platforms have vanished over the years, resulting in significant data loss: GeoCities, Friendster, and Orkut, for example. Even within still-existing companies like Google, numerous lost data products are documented. See Killed by Google.

We can break free from this risk by creating data-centric applications that multihome the data. Ceramic is the way to have many distinct data controllers publishing into shared tables in a trustless environment. Each reader can know who published what content and when they did, without relying on trusting the storage service to keep accurate audit logs. Since each event is a JSON document, signed by a controller, timestamped by Ethereum, and in a documented schema, it can be preserved by any interested party, with or without permission from the storage vendor.

Multihome the data

In Ceramic we separate the roles of data controllers from the data servers. By allowing data to live on any preferred server, the data remains durable as long as any server is interested in preserving it. This lets data outlive a particular data server, pairing the durability of data living in multiple places with the speed and reliability of operating on local data.

Document the schema

Throughout the history of the internet, we have witnessed numerous data services going away and taking the users' data with them. While multihoming helps preserve data, it's useless without the ability to interpret it.

Ceramic preserves the data formats in two ways. The first is that the data lives in JSON documents. This format allows us to reverse engineer and examine the data. The second is that the model schema gets published. The model schema contains both a json-schema and a human-language description that the original developer can use to give machine and human context to the data. This enables the preservation of both the data and its schema, so the data can be understood and new apps can be made to interact with the preserved data.

{
  "data": {
    "accountRelation": { "type": "list" },
    "description": "A blessing",
    "name": "Blessing",
    "relations": {},
    "schema": {
      "$defs": {
        "GraphQLDID": {
          "maxLength": 100,
          "pattern": "^did:[a-zA-Z0-9.!#$%&'*+\\/=?^_`{|}~-]+:[a-zA-Z0-9.!#$%&'*+\\/=?^_`{|}~-]*:?[a-zA-Z0-9.!#$%&'*+\\/=?^_`{|}~-]*:?[a-zA-Z0-9.!#$%&'*+\\/=?^_`{|}~-]*$",
          "title": "GraphQLDID",
          "type": "string"
        }
      },
      "$schema": "https://json-schema.org/draft/2020-12/schema",
      "additionalProperties": false,
      "properties": {
        "text": { "maxLength": 240, "type": "string" },
        "to": { "$ref": "#/$defs/GraphQLDID" }
      },
      "required": [ "to" ],
      "type": "object"
    },
    "version": "1.0",
    "views": {
      "author": { "type": "documentAccount" }
    }
  },
  "Header": {
    "controllers": [ "did:key:z6MkgSV3tAuw7gUWqKCUY7ae6uWNxqYgdwPhUJbJhF9EFXm9" ],
    "model": { "/": { "bytes": "zgEEAXFxCwAJaG1vZGVsLXYx" } },
    "sep": "model"
  }
}

Example schema document
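Because the schema above is machine-readable, any party holding a preserved document can re-check it against the published constraints. The following is a minimal, stdlib-only Python sketch, not the Ceramic validator; the `validate` helper is hypothetical and the DID pattern is deliberately simplified from the one in the schema:

```python
import re

# Simplified restatement of two constraints from the published schema above.
# A real deployment would use a full JSON Schema (draft 2020-12) validator.
SCHEMA = {
    "required": ["to"],
    "text_max_length": 240,
    # Simplified DID pattern (assumption, not the exact published regex).
    "to_pattern": r"^did:[a-zA-Z0-9.!#$%&'*+/=?^_`{|}~-]+:.+$",
}

def validate(doc):
    """Return True if `doc` satisfies the required/maxLength/pattern checks."""
    for field in SCHEMA["required"]:
        if field not in doc:
            return False
    if len(doc.get("text", "")) > SCHEMA["text_max_length"]:
        return False
    return re.match(SCHEMA["to_pattern"], doc["to"]) is not None

print(validate({"to": "did:key:z6MkgSV3", "text": "bless you"}))  # True
print(validate({"text": "missing recipient"}))                    # False
```

Any interested party can run a check like this against preserved events without asking the original storage vendor.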

Information retrieval 

The key to multihomed data is being able to retrieve the data from a server that has it.

How do we move the data from the servers that have it to the servers that are interested in storing it? When we first made Ceramic we used two multicast methods. The first was a gratuitous announcement of new data: send the data to EVERY node in the network so that each can store it if interested. Second, if a node did not know about a stream, then when a user requested it, the node would multicast a request to the whole network and take the latest version that came back as the response.

This worked but had several drawbacks. First, requests for streams that a node did not know about used WAN traffic and had unpredictable latencies, so all applications needed to design for slow, unpredictable retrieval times. Second, a node had no way to retrieve a complete set of the streams that matched its interests. It could only listen to the multicast channel and fetch any stream it happened to hear about. Any stream it missed, whether announced before the node came online or during downtime, could be missed forever. Third, there was a performance cost to sending requests to nodes with no mutual interest. A node that did 100 events a year could not scale down, since it would need to keep up with filtering announcements from nodes doing 100 events a second. If we wanted to support both very large and very small data-centric applications, we needed a new strategy. We even saw cases where a slow node that could not keep up on the multicast channel harmed the performance of larger, more powerful nodes.

To solve these problems of performance, completeness, and scalability we switched to a pairwise synchronization model. Each node advertises the ranges of streams it is interested in, and nodes synchronize pairwise, covering only the streams of mutual interest.
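The mutual-interest filter can be pictured as a range intersection over the stream keyspace. This is an illustrative Python sketch, not the Ceramic implementation; the numeric stream keys and the `intersect` helper are hypothetical:

```python
# Each node advertises interest ranges over a stream-ID keyspace (modeled
# here as integers). Peers only synchronize streams that fall inside the
# intersection of their advertised ranges.

def intersect(ranges_a, ranges_b):
    """Return the overlapping (start, end) ranges of two interest lists."""
    out = []
    for a_start, a_end in ranges_a:
        for b_start, b_end in ranges_b:
            start, end = max(a_start, b_start), min(a_end, b_end)
            if start < end:  # keep only non-empty overlaps
                out.append((start, end))
    return out

# Node A cares about streams keyed 0x00-0x40, node B about 0x30-0x80;
# only the slice 0x30-0x40 is of mutual interest.
shared = intersect([(0x00, 0x40)], [(0x30, 0x80)])
print(shared)  # [(48, 64)]
```

If the intersection is empty, the two nodes exchange nothing, which is what lets small nodes ignore high-volume streams entirely.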

Scalability

Since nodes synchronize pairwise, no slow node can harm the ability of two healthy nodes to complete a synchronization. If two nodes have no intersection in their interests, the conversation is done. A range of streams with hundreds of events per second that your node is not interested in will not create work for your node. A node only needs to scale to the rate of events in the ranges it is interested in; the scale of any model you are not interested in costs you nothing. This solved our scale-up/scale-down objective.

Completeness

If the two nodes do have an intersection of their interests, they continue the synchronization until both nodes have ALL the events the other node had when the synchronization began. A node no longer needs to have been online when a stream's event was originally published, or when some node queried for that stream. If the event is stored by either of the nodes, both nodes will have it at the end of the pairwise synchronization. Once a node has pairwise synchronized with each of the nodes advertising an interest range, that node has all of the events in that range as of the time of the synchronization. This solves the completeness objective.

More interestingly, the local completeness means that we can build local indexes over the events and do more complex queries over the events in the ranges nodes are interested in entirely locally.

Performance

Lastly, since we have a complete set of events for our interests, we can serve queries about those events from the local node with no need for WAN traffic. This solves the performance objective of predictable fetch latencies.

Pairwise Synchronization in Logarithmic rounds

In the multicast model, Ceramic sent messages to all other Ceramic nodes. One of the most notable differences with synchronization is that nodes synchronize pairwise, one peer at a time. The two peers each send the other their interests. Both nodes filter the events they have to find the set of events of mutual interest. Once this intersection is found, we synchronize the set with a Range-Based Set Reconciliation protocol we call Recon.

We can report progress of a Recon synchronization as the percentage of events in `in sync` versus `syncing` ranges. Alternatively, we could render a bar, as in the diagram, showing which ranges are in which states.

This is a divide-and-conquer protocol. We start with the full intersection as a single range. We pull a range off the work list and send the (hash, count) of all events in the range to the other side. They compare it against their own (hash, count) and respond accordingly.

We have | They have | Action
hash_A | hash_A | Done. `in sync`
0 | hash_A | Send a request for the events. `Don’t have`
hash_A | 0 | Send the events. `in sync`
hash_A | hash_B | Split the range; push the sub-ranges onto the work list. Each range `syncing`

The range splits are handled differently on the Initiator than on the Responder. The Initiator maintains the work list and pushes all of the sub-ranges onto it. The Responder just sends back a message with multiple ranges and a hash for each range. This keeps the synchronization state on the Initiator and reduces the Responder's burden to a stateless call and response, which fits Recon into the HTTP client-server request-response paradigm.
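The call-and-response above can be sketched as a toy set reconciliation. This Python sketch is illustrative only, not the Recon wire protocol: both event sets live in one process, events are plain integers over a small key range, and the `digest` and `recon` helpers are hypothetical:

```python
import hashlib

def digest(events, lo, hi):
    """(hash, count) summary of the events a node holds in [lo, hi)."""
    keys = sorted(e for e in events if lo <= e < hi)
    return hashlib.sha256(repr(keys).encode()).hexdigest(), len(keys)

def recon(ours, theirs, lo=0, hi=1 << 16):
    """Divide-and-conquer reconciliation of two event sets over [lo, hi)."""
    work = [(lo, hi)]  # the initiator keeps the work list
    while work:
        a, b = work.pop()
        if digest(ours, a, b) == digest(theirs, a, b):
            continue  # hashes match: range is in sync
        if b - a == 1:  # range holds a single key: exchange directly
            ours.update(e for e in theirs if a <= e < b)
            theirs.update(e for e in ours if a <= e < b)
            continue
        _, our_count = digest(ours, a, b)
        _, their_count = digest(theirs, a, b)
        if our_count == 0:
            ours.update(e for e in theirs if a <= e < b)    # "don't have"
        elif their_count == 0:
            theirs.update(e for e in ours if a <= e < b)    # send the events
        else:
            mid = (a + b) // 2  # hashes differ: split, push sub-ranges
            work += [(a, mid), (mid, b)]

alice = {3, 17, 4000, 9001}
bob = {17, 256, 9001}
recon(alice, bob)
assert alice == bob == {3, 17, 256, 4000, 9001}
```

Matching ranges are skipped in one round trip, so the work done is proportional to the differences rather than to the total number of events.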

Exponential distribution

Now that we have replaced a multicast message to all nodes in the network with pairwise sync, it is reasonable to ask whether we have broken the exponential distribution we got from multicast trees.

How fast can data spread through the network? Now that we have replaced the multicast channel with pairwise connections, how do we match the exponential distribution of the multicast channel? 

We keep this property because each node cycles through connecting to all other nodes that advertise overlapping interests. When a node first receives an event from a client, there is 1 copy on the network. After the first sync there are 2. Then both of those nodes sync with new nodes, giving 4. This grows exponentially until almost all interested nodes have the data. At that point, the odds that any node with the event calls a node without it are small, but the odds that a node without the event calls a node with it are large. By using synchronization we get the benefits of both push and pull gossip protocols: push, which is fast when knowledge of the event is rare, and pull, which is fast when knowledge of the event is common.
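A toy simulation illustrates this push/pull spread. This is an assumption-laden sketch, not the Ceramic implementation: nodes are integers, each round every node syncs with one uniformly random peer, and a sync copies the event in either direction:

```python
import math
import random

def rounds_to_full_spread(n, seed=42):
    """Rounds until all n nodes hold an event that starts on one node.

    Each round, every node synchronizes with one random peer; because a
    sync is bidirectional, the event crosses the pair whichever side has
    it (push when rare, pull when common).
    """
    rng = random.Random(seed)  # fixed seed for reproducibility
    have = {0}                 # one node starts with the event
    rounds = 0
    while len(have) < n:
        rounds += 1
        for node in range(n):
            peer = rng.randrange(n)
            if node in have or peer in have:
                have.add(node)
                have.add(peer)
    return rounds

n = 1024
print(f"{n} nodes fully synced in {rounds_to_full_spread(n)} rounds "
      f"(log2(n) = {math.log2(n):.0f})")
```

The round count stays near-logarithmic in the number of nodes, matching the doubling intuition above.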

Summary

By using set reconciliation to perform pairwise synchronization of nodes' overlapping interests, we achieve performance, completeness, and scalability: the predictable performance of querying local data on your node, the completeness of synchronizing all of the events of interest preemptively, and the scalability of not synchronizing events that lie outside a node's interests. Pairwise synchronization also prevents slow nodes from slowing down the rest of the network. It is now possible to scale up or down without performance or completeness problems. This enables developers to build data-intensive applications without data vendor lock-in from either the storage-providing service or the application that originally read the schema.


ResofWorld

Bhutan’s first AI startup is seven college kids in a dorm

NoMindBhutan services prominent clients like the Bhutan National Bank and Drukair - Royal Bhutan Airlines.
When college students Ugyen Dendup and Jamphel Yigzin Samdrup launched their startup last year, they had yet to learn that they would spend most of their time servicing some of...

Tuesday, 16. July 2024

FIDO Alliance

Case Study: Wedding Park Deploys Company-Wide Passwordless Authentication for Internal Cloud Service Logins

Corporate overview: Wedding Park Co., Ltd. was founded in 2004 with the management philosophy of “Making marriage happier.” Celebrating its 20th anniversary in 2024, it started as a wedding review […]

Corporate overview:

Wedding Park Co., Ltd. was founded in 2004 with the management philosophy of “Making marriage happier.” Celebrating its 20th anniversary in 2024, it started as a wedding review information site and has since expanded its operations. Utilizing a wealth of information, it operates several wedding-specialized media, including the wedding preparation review site Wedding Park. In addition, it runs various businesses in the realm of weddings combined with digital technology, such as internet advertising agency services, digital transformation (DX) support, and educational ventures.

Background and challenges leading to deployment

Wedding Park faced two challenges: strengthening the security of the multiple cloud services used for internal operations, and the complexity of password management. To address these issues, the company introduced an ID management service and consolidated the cloud services behind a single sign-on entry point.

The impetus for deploying FIDO authentication came when Salesforce, which the company uses for customer management, order and supply systems, and time and attendance management, announced that multi-factor authentication (MFA) would become mandatory. However, if MFA were applied only to Salesforce while other cloud services continued to operate with password authentication, not only would usability deteriorate, but the IT management department's work would also become more complicated. Given the vulnerability of password-only authentication, the company decided to apply MFA to all cloud services, including Salesforce, in accordance with the policy to promote zero-trust security it adopted in February 2020.

Selection and verification of an authenticator

As an authentication method for MFA, the company considered one-time password (OTP) authentication and biometric authentication using smartphone applications, but ultimately decided to deploy passwordless authentication using FIDO for its unique ability to improve both security and user convenience.

In order to realize passwordless authentication using FIDO, a terminal equipped with a FIDO-compatible biometric authentication device is required. The majority of devices currently on the market support FIDO authentication, and adoption was eased by the fact that nearly all in-house devices were already equipped with Windows Hello or Touch ID. For the few employees whose devices lack biometric features, separate external authenticators were provided.

A step-by-step changeover for each department

After examining the authenticators, the company officially launched its policy of deploying passwordless authentication company-wide in January 2022. The transition took place from February to March of the same year; the smooth implementation in the short period of one month was made possible by a department-by-department rollout and generous support from the IT management department. For this implementation, the company relied on CloudGate UNO, an identity management platform from International System Research Corporation (ISR) that it has used since 2012, because it supports passwordless authentication using FIDO2 as well as biometric authentication via a smartphone app.

The rollout began with the development department and gradually progressed to departments with larger numbers of employees. First, at regular meetings for each department, the company communicated the purpose of the system and the benefit that daily authentication would become more convenient, gaining understanding across the company. Introducing the system department by department not only limited the number of people the IT management department had to support at one time, but also allowed Q&A to accumulate as test cases and manuals to be maintained smoothly, since the rollout started with the development department, which had strong IT skills.

Thanks to close follow-up by the IT management department, which not only prepared materials but also checked progress on the administrator website as needed and individually approached employees who had not yet registered their authenticators, the company implemented the system company-wide within the targeted time frame.

Effects of introduction

The number of login errors due to mistyping of passwords, which used to occur about 200 times a month, has been reduced to zero since the deployment of FIDO authentication. Many employees commented that the system has become very convenient, eliminating authentication failures due to forgotten passwords or typing errors. In addition, the number of periodic password reset requests has decreased, resulting in a reduction in man-hours for the administrator.

Passwordless authentication is smooth; the authentication status retention period was even shortened to further enhance security, and the system has continued to operate without problems since then.

Wedding Park’s future vision is to link all cloud services used within the company to “CloudGate UNO” and centrally manage them, including authentication, with “CloudGate UNO.”

Akira Nishi, General Manager of the Corporate IT Office, who spoke with us about this case study, made the following comments.

“For those considering the deployment of a new authentication method, there is inevitably a concern that the change will cause a large-scale login failure. In our case, in the early stages of the project, we held explanatory meetings for each department and repeatedly refined the explanatory materials and procedures, which was effective in minimizing confusion and anxiety within the company.

“After the switchover, we continued to check on the progress of the implementation and followed up with each department individually, but once the use of passkey (device-bound passkey) became standardized within the company, we felt that the scope of use, including various security measures, was expanding dramatically.”

download the case study

Ceramic Network

New Ceramic release: ceramic-one with new Ceramic Recon protocol

The Ceramic protocol has undergone a series of updates over the past few months, all focused on improving performance and scalability, enabling developers to build applications that work better and faster. Today, the core Ceramic team is excited to share these updates with the community by announcing the release of

The Ceramic protocol has undergone a series of updates over the past few months, all focused on improving performance and scalability, enabling developers to build applications that work better and faster. Today, the core Ceramic team is excited to share these updates with the community by announcing the release of ceramic-one.

About the release

The new release of Ceramic includes a data synchronization protocol called Recon, implemented in Rust. This new implementation of the Ceramic protocol enables data sharing between nodes and allows developers to run multiple nodes that stay in sync and are load balanced. All this facilitates highly available Ceramic deployments and reliable data synchronization.

To utilize the Recon protocol for their applications, developers are provided with a binary called ceramic-one.

This new implementation of the Ceramic protocol offers significant performance and stability improvements. Additionally, this release marks a significant shift in making the Ceramic architecture more robust, allowing the team to iterate on and build new protocols in the future.

The new Recon protocol

Recon is a new data synchronization protocol used for synchronizing stream events in the Ceramic network, implemented on top of libp2p. Stream sets bundle multiple streams together, allowing nodes with a common interest in certain streams to synchronize efficiently.

Before Recon, Ceramic nodes broadcasted updates to streams to every node in the network using a simple libp2p pubsub topic. Due to the single channel, nodes would receive stream event announcements they were not interested in, imposing a significant overhead on every node. Additionally, the network's throughput was limited by bandwidth, which led to either prioritizing high-bandwidth nodes or greatly limiting the network throughput to support low-bandwidth nodes.

Recon provides low to no overhead for nodes with no overlap in interest, while retaining a high probability of receiving the latest events from a stream shortly after any node has the events, without any need for remote connections at query time. By shifting updates from the pubsub channel to a stream set, interested nodes can synchronize without burdening uninterested ones. Stream sets also enable sharding across multiple nodes, allowing synchronization of only sub-ranges, which distributes the storage, indexing, and retrieval workload.

Additionally, nodes need to discover peers with similar interests for synchronization. Recon achieves this through nodes gossiping their interests and maintaining a list of peers' interests, ensuring synchronization with minimal bandwidth. Nodes also avoid sending event announcements to uninterested peers.

Performance and robustness improvements

This release, along with the recent Ceramic Anchor Service (CAS) updates, marks significant scalability improvements. Currently, Ceramic provides a throughput of 250 TPS (transactions per second), more than double the previous throughput of up to 100 TPS before the Recon implementation. This increase in throughput is especially important for applications that handle large amounts of user data and require fast transaction times.

These numbers were measured between two nodes that share the same interest. It’s worth noting that nodes without overlapping interests do not affect each other's throughput. This means that, in theory, the throughput of a ceramic-one node scales horizontally. However, there is still one component that puts an upper limit on this: the CAS, which is operated by 3Box Labs. This service is currently a centralized bottleneck in the protocol, which is why the team’s next goal is Self-Anchoring, allowing any ceramic-one node to operate completely independently.

This release of Ceramic is also a significant step towards making the Ceramic architecture more robust, enabling the team to iterate on it and build new protocol implementations more easily and quickly.

Getting started with ceramic-one

All new Ceramic developers are recommended to use ceramic-one to start building on Ceramic. Check out the setup guides on the Ceramic documentation to get started.

Developers who have been building on Ceramic for a while are encouraged to migrate their applications to the ceramic-one-based implementation. Check out this migration guide to follow the migration steps.

Share your feedback with us!

We would like to get your feedback on building on Ceramic. Do you have any suggestions or ideas of how the core Ceramic team can improve the implementation of Ceramic? Do you have questions or troubles using the new release or migrating your existing application? Share your thoughts and ideas with us by posting on the Ceramic Community Forum.


FIDO Alliance

UX Webinar Series: Essentials for Adopting Passkeys for your Consumer Authentication Strategy

In part one of this four-part webinar series, attendees learned why major service providers are adopting passkeys as the foundation of their consumer authentication strategy. This webinar is for a […]

In part one of this four-part webinar series, attendees learned why major service providers are adopting passkeys as the foundation of their consumer authentication strategy. This webinar is for a nontechnical audience. It is intended to help you investigate the nuances of passkey roll-out strategies and end user experiences (UX) for consumers.

Join this webinar to:

Learn best practices to meet end-user needs with passkeys
Learn how to reduce costs with passkeys
Learn how passkeys create a long-term authentication strategy built on standards

This webinar is for:

Product managers
IT managers / leaders
Security Analysts
Data Analysts

UX Webinar Series: Aligning Authentication Experiences with Business Goals

In the second of a four-part webinar series, attendees learned how to adapt your authentication experiences to better solve key metrics for consumer authentication. This webinar is for a nontechnical […]

In the second of a four-part webinar series, attendees learned how to adapt your authentication experiences to better solve key metrics for consumer authentication. This webinar is for a nontechnical audience seeking user interface and workflow guidance for consumer authentication.

View the webinar slides to:

Learn how to execute a passkey strategy that solves business goals and end-user needs
Learn how to use the FIDO Design Guidelines to jump-start your concepts and socialize them to win stakeholder alignment within your organization
Watch real users using passkeys for the first time and learn how to use passkey usability research findings to demystify passkey experiences and align requirements amongst your teams

This webinar is for:

Developers
Designers
Content Strategists

UX Webinar Series: Drive Revenue and Decrease Costs with Passkeys for Consumer Authentication

In the third of a four-part webinar series, attendees learned how to drive revenue and decrease costs with passkeys for consumer authentication. This webinar is for a nontechnical audience seeking to make sound business decisions for new consumer authentication strategies.

View the webinar slides to:

Learn how to significantly increase first-try consumer sign-in success and speed to sign-in
Learn how to align your teams around user experience patterns proven to be easy for consumers
Learn how to mitigate threats of phishing, credential stuffing, and other remote attacks
Learn how to offer passkeys without needing passwords as an alternative sign-in or account recovery method

This webinar is for:

Authentication product leaders
Chief Technology Officers (CTO)
Chief Marketing Officers (CMO)
Senior Vice Presidents

DIF Blog

Guest blog: Steve McCown

Anonyome Labs was founded in 2014 to give people control and freedom over their personal and private information. Based in California, Utah and Australia, the company has deep expertise in security, identity management, authentication and authorization, cloud, privacy, and cryptography, and equips businesses with cutting-edge privacy and cybersecurity solutions that seamlessly integrate with existing offerings and systems. 

Anonyome Labs Chief Architect and DIF Steering Committee member, Steve McCown, talked to us about how the company is using Decentralized Identity to drive interoperability and usability of their products, his involvement in DIF, and decentralized identity standards work. 

What does Anonyome offer, and who are your customers? 

We started 10 years ago, when we noticed that people’s personal and private information was being collected and used without their permission. There were certain contact points that industry was using to triangulate people’s behavior, such as phone numbers and email addresses. So we developed an app called MySudo that gives users the ability to create contact sets and associated pseudonymous profiles called ‘Sudos’, consisting of an email address, phone number and a one-time or reusable credit card number. 

People already had separate home and work contact details, so we took it further. With MySudo, you can create Sudos for your online purchases, for hobbies, for travel, and so on. Our mission was to create separate IDs, so when a hacker steals data from websites, or data brokers buy and sell your online activity data, they can’t correlate your home activities with your work activities, or your social activities with products you purchase. This increases your privacy by disrupting how your data can be correlated. 

Individuals use MySudo to privately communicate back and forth, share files, and so on — for all their usual digital activities. We also have an Enterprise-grade platform with all the same capabilities, which we license to organizations who incorporate these privacy-enhancing services into their own offerings. There are some very notable companies you would have heard of that are providing a white label version of our privacy-enhancing technologies to their customers. We’ve also been working with national governments to help preserve their citizens’ and commercial entities’ private data. 

What got Anonyome interested in Decentralized Identity? 

Anonyome is focused on creating strong privacy-enhancing and secure identity tools for everyday users.  The MySudo product is a great example of how internet users can control their privacy during everyday activities, such as email, texting, calling, and purchasing.  Meeting the critical privacy and security requirements for these activities is greatly aided by decentralized identity elements and protocols.

In a previous job, I worked for the US Department of Energy as a cybersecurity exploit researcher, so that's part of my mindset. How would someone take advantage of existing identity paradigms? It's not just about all the good that everyone is trying to accomplish; it's also about how an adversary might leverage identity technologies for illicit purposes. That concern is what led me to engage with Decentralized Identity and with keeping crypto keys, passwords, tokens, and the like in your own digital wallet. Because if you control the keys, you can better control access to your data assets.

The other thing that got Anonyome on this track is that strong crypto environments like Signal were emerging as closed ecosystems. If someone wants to use Signal, they also need to convince others to adopt it, too.  For a lot of users, that can be really hard. Decentralized Identity provides a way to create an open encryption environment and keep the strong cryptography while also bridging across ecosystems.

While we may have competitors in parallel spaces, it benefits everyone if we can increase secure communication between applications that users enjoy using — this secure cross talk between applications becomes a rising tide that floats all boats … customers, companies, everyone. 

Interoperable security is the next evolution that extends beyond closed secure ecosystems, and we’re working to be interoperable with lots of other decentralized identity providers and users. This is why Anonyome has tasked me to work in the standards orgs. If we can collectively solve security and privacy issues at the identity standards level, then we will have a way to realize our goals for secure and private interoperability between applications … and people. 

What is driving Decentralized Identity uptake, in your view? 

There’s tremendous interest in Decentralized Identity. I credit a lot of this to GDPR. The EU has assigned significant liability and penalties for data breaches. Faced with these very large fines, companies are trying to figure out what they need to do to protect users' privacy, since this now affects their bottom lines. While some are doing the bare minimum, others are doing a lot more. There’s also a lot happening in Europe with the Digital Identity Wallet. We want to be interoperable with these standards and the services that implement them.  So we've been reviewing the EU’s Architecture and Reference Framework (ARF) to find out what identity, credentials, proof types and so on we need to work with. 

Privacy at the legislative level is no longer just an EU thing. For example, there are new laws in Utah (US) that mandate that state government organizations can only collect the data they actually need, keep it only as long as authorized, and then destroy it unless retention regulations require otherwise. I serve on the Utah Privacy Commission and we're very strong proponents of this. We receive presentations from state agencies and give our feedback on what enhances privacy and what needs additional work. There is a real appetite among government officials for better privacy.

There’s also growing awareness of the potential problems associated with working in the cloud. For example, if you store your data in the cloud, it may be encrypted, but where do the keys go? If you don’t control access to the data access keys (or a provider does this as part of their service), then your information may be insecure without you really knowing it. Recently there have been some very large data breaches, seemingly about once a month. In response, people receive emails saying “your data has been stolen, so here’s 3 to 6 months of free credit monitoring”. This almost rises to the level of “security theater”: companies are doing something, but it’s not very useful and doesn’t solve the problem.

Decentralized Identity can augment many existing services.  For example, DI capabilities can be used to encrypt my files before they are sent to a cloud storage provider.  The cloud storage provider can then work their storage, retrieval, and replication magic without having access to the unencrypted files. As long as I control the encryption keys, my data can only be decrypted by me. 
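The file-storage example above can be sketched in a few lines. This is a toy illustration of the "encrypt before upload, keep the key yourself" model using a one-time pad; it is not Anonyome's implementation, the function names are hypothetical, and a production system would use an audited authenticated cipher such as AES-GCM rather than a pad.

```python
import os

def encrypt(plaintext: bytes) -> tuple[bytes, bytes]:
    # One-time pad: a fresh random key as long as the data, XORed in.
    key = os.urandom(len(plaintext))
    ciphertext = bytes(p ^ k for p, k in zip(plaintext, key))
    return ciphertext, key  # ciphertext goes to the cloud; key stays local

def decrypt(ciphertext: bytes, key: bytes) -> bytes:
    return bytes(c ^ k for c, k in zip(ciphertext, key))

ciphertext, key = encrypt(b"my private file")
# The storage provider sees only random-looking bytes without the key.
assert decrypt(ciphertext, key) == b"my private file"
```

The point of the sketch is the trust boundary, not the cipher: only `ciphertext` ever leaves the client, so the provider can replicate and serve the blob without being able to read it.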

That’s just what’s possible with file storage. Decentralized Identity is also delivering a wide range of interoperable privacy-enhancing capabilities for communications systems, identity control, access management, digital credentials, and so on. 

That’s what I see happening with DI. If implemented as designed, we’re going to put privacy and security control in the users’ hands. Then we can continue to enjoy many wonderful cloud services while we will be able to control access to our own data. 

What is the value of open standards for Anonyome and your customers? 

We’re huge fans of open standards. We try not to build systems that are proprietary from an interoperability perspective. We strive to ensure that all of our interoperability points are based on industry standards, so that we can communicate with other platforms and they can easily communicate with ours. Embracing DI standards means that if a user adopts a particular platform, they don’t have to convince others to adopt it before it becomes useful.

As a quick analogy, we’re aiming for the interoperability of email combined with the strong security and authentication of a secure communications application. Pick up any old email app and it will work with most any other, but the security isn’t there. If you work for a large enterprise, they may have put something like S/MIME in place, but when you or I get a regular gmail account, that’s not something we typically add. This is primarily because it’s too hard for users to manage the certificates and so forth. Today, this means that emails are typically transmitted in the clear and not end-to-end encrypted. DI facilitates privacy-enhancing and more secure interactions, which is a key reason why we’re working with these technologies. 

As someone who is involved with DIF, W3C (the World Wide Web Consortium) and the Trust over IP Foundation, what do you see as their respective roles, and the differences between them? 

We’re super excited about each of these organizations.  W3C has spent a great deal of time and effort to create DI’s main building blocks, namely, DIDs and Verifiable Credentials.  W3C is continuing to actively refine and extend these elements in order to facilitate many enhanced DI capabilities.
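As a concrete anchor for these building blocks, here is the rough shape of a DID document as defined by the W3C DID Core specification. The `did:example` identifiers and the key value are placeholder illustrations, not a real deployment.

```python
import json

# Illustrative DID document (shape per W3C DID Core; values are hypothetical)
did_doc = {
    "@context": "https://www.w3.org/ns/did/v1",
    "id": "did:example:123456789abcdefghi",
    "verificationMethod": [{
        "id": "did:example:123456789abcdefghi#key-1",
        "type": "Ed25519VerificationKey2020",
        "controller": "did:example:123456789abcdefghi",
        "publicKeyMultibase": "zExampleKeyValuePlaceholder",
    }],
    # References the key above as usable for authenticating as this DID
    "authentication": ["did:example:123456789abcdefghi#key-1"],
}

print(json.dumps(did_doc, indent=2))
```

Resolving the DID yields this document, so a verifier can look up the controller's public keys without any central identity provider.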

Trust over IP has created an excellent DI paradigm for illustrating how all of the DI components connect and interact.  These layers depict how DIDs are anchored in a Verifiable Data Registry (such as a decentralized ledger), how DID-based communication takes place, how Verifiable Credentials fit into the ecosystem, and finally how a top-layer governance model shows all participants in a system what the system’s rules are.

What brought me to DIF was getting involved in DIDComm (Decentralized Identifier Communications).  I see this as one of the main attractions in DI.  As I had been participating for a while, I volunteered to become a co-chair with Sam Curren.  This gave me new insights into the community standards-building processes and in particular key details of the DIDComm protocol.  Later, Sam nominated me as a candidate during the Steering Committee elections and I am honored to have been elected to the SC.

I see DIF as one of the leading development organizations where implementation happens. While other organizations focus primarily on creating a range of standards and documents, which is vital, DIF typically focuses on producing a variety of working software that is based on industry standards such as Decentralized Identifiers (DIDs) and Verifiable Credentials (VCs).  This makes DIF a key provider in the larger symbiotic DI industry.  

All of the various standards organizations are distinct and have different missions. DIF fills a critical void that traditional standards organizations don’t typically emphasize. The term incubator is a little outdated, but that’s part of the role that DIF performs. Everything in DI has started with a few people getting together and having a conversation about how to design, build, or enhance some particular technology element. This leads to standards being created.  At some point, usable code libraries are created using the standard base elements and then those are combined into larger protocols, services, environments, and so on. 

There’s a whole lot that needs to happen in this process, and a lot of that work happens at DIF. 

If your goal is implementation right now, that’s where DIF excels. It’s where companies come to pick up architectural designs and reusable code libraries they can use in their products today. 


FIDO Alliance

UX Webinar Series: Passkeys Design Guidelines AMA (Ask Me Anything)!

In the final edition of a four-part webinar series, attendees had the opportunity to ask FIDO Alliance subject matter experts anything in an “Ask Me Anything” format!

Speakers answered audience questions for the full hour to provide actionable guidance for the use of passkeys for consumer authentication.

Phase 1: Identity needs and the “password problem”
Phase 2: Research and Screen Ideas
Phase 3: Concept and Prototype
Phase 4: Build and Test
Phase 5: Release and Optimize

This webinar is for:

Authentication product leaders
Chief Technology Officers (CTO)
Chief Marketing Officers (CMO)
Senior Vice Presidents
Designers
Content Strategists
Product managers
IT managers / leaders
Security Analysts
Data Analysts

Elastos Foundation

Elastos BIT Index: Donald Trump is the “Bitcoin President” for US and Global Tech-Savvy Consumers as Bitcoin Goes Mainstream

US and global respondents perceive Donald Trump as the Most ‘Crypto-Aware’ and ‘Crypto-Ready’ Presidential Candidate, outpacing Joe Biden and Robert F. Kennedy

Singapore: July 16th, 2024 – Here at Elastos, the premier SmartWeb ecosystem, we are excited to reveal the latest findings from our BIT Index (Bitcoin, Innovation & Trust). This report underscores that Donald J. Trump is perceived as the leading figure among US tech-savvy consumers for his understanding and readiness to embrace Bitcoin.

Key Findings:

38% of respondents anticipate Bitcoin becoming mainstream within four years.
80% foresee Bitcoin evolving into a ‘default’ global currency.
Bitcoin adoption is progressing more rapidly in BRICS and Global South Nations compared to Western Nations.

Who is the Bitcoin President?

In the US, 50% of tech-savvy consumers recognize Donald Trump as the most ‘crypto-aware’ presidential candidate, demonstrating a profound understanding of Bitcoin’s intricacies and advantages, compared to Joe Biden (32%) and Robert F. Kennedy (19%).

Globally, the perception remains consistent:

Donald Trump: 51%
Joe Biden: 31%
Robert F. Kennedy: 19%

Demographic insights reveal that younger consumers (18-24) are slightly less inclined to view Trump as ‘crypto-aware’ (45%), compared to 25-34 year-olds (54%) and 35-44 year-olds (54%). Biden (34%) and Kennedy (21%) see a minor increase among the 18-24 demographic.

Trump is also viewed as the most ‘crypto-ready’ candidate in the US:

Donald Trump: 49%
Joe Biden: 30%
Robert F. Kennedy: 21%

Globally, the figures are:

Donald Trump: 51%
Joe Biden: 29%
Robert F. Kennedy: 20%

Again, younger demographics (18-24) show less support for Trump as ‘crypto-ready’ (47%) compared to 25-34 year-olds (51%) and 35-44 year-olds (54%). Kennedy receives a slight uplift from 18-24 year-olds (24%).

Trump is also seen by 42% of US respondents as the candidate most likely to promote the use and benefits of Bitcoin compared to Joe Biden (23%) and Robert F. Kennedy (14%).

Globally:

Trump’s support from younger voters (18-24) is lower (37%) compared to 25-34 year-olds (43%) and 35-44 year-olds (45%), while Kennedy sees a slight increase from 18-24 year-olds (17%).

Internationally, Nigerian respondents (59%), followed by the UK (56%) and Germany (54%), believe Trump is the most ‘crypto-ready’, compared to only 42% in India.

Bitcoin Going Mainstream

More than a third of tech-savvy consumers (38%) believe Bitcoin will become mainstream within four years. This belief is higher among 25-34 year-olds (41%) and 18-24 year-olds (40%).

A significant 80% foresee a future where Bitcoin becomes a ‘default’ currency for global transactions, including commodities, real estate, and company valuations.

BRICS Nations and Global South Leading in Bitcoin Adoption

24% of tech-savvy Indian consumers and 26% of UAE respondents use Bitcoin daily, compared to the global average of 18%. In contrast, only 11% of Germans, 13% of UK respondents, 14% of South Koreans, and 15% of US tech-savvy consumers use Bitcoin daily.
High acceptance is observed in the UAE and Brazil (49%) for Bitcoin going mainstream within four years, compared to 22% in Germany, 25% in South Korea, and 36% in the UK.
91% of Nigerians and 90% of Indians envision Bitcoin as a ‘default’ currency, compared to 70% in Germany, 73% in the UK and South Korea, and 75% in the US.

About Elastos

Elastos is a public blockchain project that integrates blockchain technology with a suite of reimagined platform components to produce a modern Internet infrastructure that provides intrinsic protection for privacy and digital asset ownership. The mission is to build accessible, open-source services for the world, so developers can build an internet where individuals own and control their data.

The Elastos SmartWeb platform enables organizations to recalibrate how the Internet works for them to better control their own data.

https://elastos.info

https://www.linkedin.com/company/elastosinfo/


ResofWorld

Kenya’s biggest protest in recent history played out on a walkie-talkie app

More than 40,000 Kenyans have downloaded Zello since protests began against the government’s plan to raise taxes.
Betty had never heard of the Zello app until June 18. But as she participated in Kenya’s “GenZ protests” that month — one of the biggest in the country’s history...

Monday, 15. July 2024

Elastos Foundation

Elastos at Bitcoin 2024 Conference in Nashville!

We are pleased to announce that Elastos, under the BeL2 initiative, will be attending the Bitcoin Conference in Nashville, taking place from July 25-27, 2024, at the Music City Center, less than two weeks away! This participation is the result of a collaborative effort supported equally by our Cyber Republic (our DAO) via Proposal 145 and the Elastos Foundation, spearheaded by Mark Blair, longstanding CR Council Member and Head of Strategy at BeL2. In this article, we outline why this is so important for us, the details of our participation and how you can get involved! Let’s dive right in!

Transforming Bitcoin: Elastos’ Vision and the Power of BeL2

Elastos is dedicated to transforming Bitcoin into a smart, versatile asset while preserving its core principle of decentralisation. BeL2 is a powerful new technology that leverages zero-knowledge proofs (ZKP) and a decentralised clearing network to enable advanced DeFi applications for native Bitcoin (NB). The mission is to maintain Bitcoin’s integrity as the ultimate decentralised system while unlocking its potential for complex financial transactions, staking, and smart contracts.

BeL2 achieves this through a unique model where information, not assets, is transmitted across chains. This ensures that Bitcoin remains on its main network, maintaining its security and decentralisation. By utilising light client verification and zkBTC full nodes, BeL2 supports a range of Layer 2 applications, from lending to stablecoin issuance, paving the way for a new decentralised financial system akin to the historical Bretton Woods system, which, rather than being backed by gold, is backed by the 21st-century digital gold equivalent, Bitcoin.

Prime Location: Visit Elastos at Booth 621

Our booth at the conference has been officially confirmed, and we have secured a prime location on the conference floor. Booth 621 is the first large booth on the left as you enter, conveniently situated near the entrance, main stage, and the “Nakamoto Stage.” It’s also adjacent to the “Zen Lounge,” ensuring high visibility and foot traffic. This strategic positioning allows us to showcase our innovative solutions to a large audience, expected to be over 35,000 attendees!

At our booth, visitors will find exciting merchandise, fun giveaway prizes, and interactive games, including a spin-the-wheel activity. We will showcase alongside Elastos partners and stakeholders and also distribute a brochure (attached in this article) that highlights Elastos and its significant relationship with Bitcoin, emphasising ELA’s status as a merged-mined coin with over 50% of Bitcoin’s hash power. Sasha Mitchell, the Head of BeL2 and founder of Elacity, will be speaking at the event, providing deeper insights into our technologies and future plans. The event will feature influential leaders such as Donald Trump, Robert F. Kennedy Jr., Cathie Wood, Michael Saylor, Russell Brand, and Edward Snowden. This year is shaping up to be a very interesting event, given the US elections in November and Web3 becoming political.

BTC Nashville represents an important moment for Elastos and the BeL2 project as we continue to push the boundaries of what Bitcoin can achieve. We invite everyone to visit our booth, engage with our team, and get excited for the future of decentralised finance with Elastos and BeL2. For more information about our participation and updates, stay tuned to our official channels. We look forward to seeing you in Nashville on the 25th! Excited to learn more? Follow Infinity for the latest updates; we will keep you posted throughout the whole week!


We Are Open co-op

Building and Sustaining Engagement with the Digital Credentials Consortium

Developing communications for your organisation

This summer WAO ties a bow around a body of work we’ve been doing together with the Digital Credentials Consortium (DCC). This initiative is hosted at MIT and has member universities from around the world.

The Digital Credentials Consortium is advancing the use and understanding of portable, verifiable digital credentials in higher education through open source technology development and leadership, research, and advocacy.

The DCC plays a pivotal role in the definition and establishment of the W3C Verifiable Credentials Standard. Standards are often invisible, but they are massively important!
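For readers unfamiliar with the standard, a Verifiable Credential is at heart a small signed JSON document. The sketch below shows a minimal shape following the W3C Verifiable Credentials Data Model; the issuer and subject identifiers are hypothetical, and the issuer's cryptographic `proof` block is omitted for brevity.

```python
import json

# Minimal Verifiable Credential shape (per the W3C VC Data Model);
# identifiers and degree values here are illustrative placeholders.
credential = {
    "@context": ["https://www.w3.org/2018/credentials/v1"],
    "type": ["VerifiableCredential", "UniversityDegreeCredential"],
    "issuer": "did:example:university",
    "issuanceDate": "2024-07-01T00:00:00Z",
    "credentialSubject": {
        "id": "did:example:student",
        "degree": {"type": "BachelorDegree", "name": "BSc Computer Science"},
    },
    # A real credential also carries a "proof" block with the issuer's signature,
    # which is what makes it verifiable and portable.
}

print(json.dumps(credential, indent=2))
```

Because the claim is signed by the issuer rather than looked up in the issuer's database, a graduate can present it to any verifier that trusts the issuer's keys.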

In this post, we’ll use our work with the DCC to help you systematically review your communication initiatives and give you a bit of a playbook on how to develop reusable communication assets and resources.

Understanding your audience

An audience map WAO created with the DCC

Research

When crafting communications strategies, most organisations miss a crucial step: audience research. Implementing lessons from outdated research, or making assumptions about your audience only to find out they were wrong, are two mistakes you can avoid!

Before we started creating communication messaging and assets for the DCC, we did two rounds of interviews. In both rounds, we spoke one-to-one with people deeply involved in the DCC’s work. In the first round, we spoke with staff members, W3C task group members and people already implementing the Verifiable Credentials standard. In the second round, we talked to members of the DCC and with the Leadership Board. We asked the same questions for both rounds, but allowed for organic conversation to emerge.

cc-by-nd Visual Thinkery for WAO

These interviews not only gave us a thorough onboarding into the DCC’s wide-ranging work, but also helped us identify, specifically, what stakeholders need and want from the DCC.

Segmentation

Once you have collected insights from your audience, you can begin to reflect those insights back in ways that help others understand who your audience is. Segmentation is a way to find overlapping interests and topics. We like to visualise segmentation and have done so in multiple ways: from our Audience Ikigai to Defining the Cast and Persona Spectrums, we use several different tools to find audience overlaps. Figuring out a visual way to explain your audience and their unique needs and insights is a great way to help people feel connected to your organisation.

Crafting your communications

First slide of a deck implementing suggested design constraints

Being specific

Understanding your audience will help you tailor your messages and customise content to specific segments of your audience. Through research, you are also creating relationships with your audience and can encourage people to feel open to giving you feedback.

Our research and subsequent analysis helped us see trends and patterns to pay attention to as we began to craft communications for the DCC. We also identified some quick intervention points, allowing us to immediately implement small changes and quick wins. For example, before we ran our final interview, we implemented a new README for the DCC’s GitHub organisation. Small wins can have big impact!

Our onboarding and research activities helped us see where there were misunderstandings, so that we could deal with them as quickly as possible.

Design guidelines

It can be helpful to put what we call “Design Constraints” in place when we’re building communication strategies and initiatives. Design Constraints are simply rules you and your colleagues use to create consistency in both visual and written language. For example, we helped the DCC select a colour palette, fonts and an illustration library for their future communications.

A brand guide is an example of visual design constraints. A “key messaging and wording” section in your communications strategy is another. It helps create consistency, so that your audiences know how you wish to communicate your organisational goals.

Growing your audience

cc-by-nd Visual Thinkery for WAO

Engagement

You want to engage with people strategically so that you can work sustainably and your communications are aligned with your current initiatives and goals. We use several tools to help us figure out the best way to engage with a specific audience or community. We’ve written often about the Architecture of Participation, our go-to framework for creating participatory communities.

We also like to build Contributor Pathways, which help show how different stakeholders engage with a project. These pathways can outline the steps different audiences take and where you might be able to engage with them more effectively.

There are four stages to the engagement model we like to use:

1. Awareness — The first stage invites you to think about how your particular group hears about you or your project for the first time. The questions to ask are: How do they hear about us, and how would we like them to hear about us?

2. First Engagement — Stage two identifies the first interaction a person or a group has with you or your project. What is the first action that they take, and what action would you like them to take?

3. Build Relationship — Stage three is about your interaction. How do you build relationships with people or groups, and what value can you bring?

4. Deepen Engagement — As people deepen their engagement with your organisation or project, you’ll want to show them that they’re valued. So how can you ensure consistent engagement with your most engaged audiences?

We think about each of these stages in reference to each specific audience group, as some audiences might be more or less engaged than others.

Advocacy

WAO tends to work with groups and organisations that are trying to create a better world. Advocacy is an integral part of our work. There are a variety of advocacy and collaboration strategies, as well as best practices, that you can use to ensure you are able to promote your messages in a way that leads to action.

In this post on campaigning for the right things, we take a deep dive into using an advocacy framework to figure out where we might focus efforts. You can reapply this framework to your own initiatives!

Building and sustaining engagement cc-by-nd Visual Thinkery for WAO Cadence

If you’ve truly understood your audiences through research and analysis and you’ve determined the messages and design constraints you need to utilise for maximum communication effectiveness, your audience will begin to grow. Yay! You are building engagement!

It’s time to find sustainable ways to keep your engagement going. Probably the most effective strategy we have for sustaining engagement is cadence and consistency.

an example month of DCC events and associated comms

You need to establish a cadence to your engagement efforts, both so that your growing audience knows what to expect and so that you and your team can stay sane. It’s simple, but a communication schedule will help you be consistent, so that people stay engaged. Check out our how to be a great moderator post too; it has good tips on building consistency into your workflow.

Commitment

Last, but not least, commitment to your goals, team and community is essential. However you are trying to have an impact on the world, it is a marathon, not a sprint. We believe that open, flexible strategies with reusable and adaptable assets are a great way to help you stay committed.

🔥 Do you need help with communications and engagement? Get in touch!

Building and Sustaining Engagement with the Digital Credentials Consortium was originally published in We Are Open Co-op on Medium, where people are continuing the conversation by highlighting and responding to this story.


Identity At The Center - Podcast

It’s time for another episode of The Identity at the Center podcast!


It’s time for another episode of The Identity at the Center podcast! Michiel Stoop joins us to discuss privileged access management including how to navigate and obtain support at your organization to invest in these processes and technologies.

You can watch the episode on YouTube here: https://www.youtube.com/watch?v=1e9dpwttuZU

Visit our website for more: idacpodcast.com

#iam #podcast #idac


GS1

Introducing GS1 standards to the clinical trial supply chain at Creapharm, a Myonex company

In the clinical trial industry, drug identification and traceability are essential to ensuring patient safety.

However, up until recently, most stakeholders used their own internal tools and proprietary identifiers for tracing investigational products and their locations, as well as for data interchange in clinical trials.

As a result, participants had to configure their IT systems to adapt to each solution implemented by each specific Investigational Medicinal Product (IMP) manufacturer.

Source: GS1 Healthcare Case Studies 2023-2024 (gs1_healthcare_cases_studies_2024_france_v_final_.pdf)

ResofWorld

Meta is training its AI with public Instagram posts. Artists in Latin America can’t opt out

Latin America lacks robust data protection laws that would allow Meta users in the region to prohibit the company from using their content.
On June 2, María Luque noticed several of her contacts on Instagram posting about a form she had never heard of. The form, Luque found out, had been sent to...

Friday, 12. July 2024

ResofWorld

The role of trust in building fintech for Africa

Yanmo Omorogbe of Nigerian wealth management firm Bamboo talks about navigating government regulations and the role of longevity.
Yanmo Omorogbe is the co-founder and chief operating officer of Bamboo, a Nigerian digital wealth management platform. The startup has raised funding from investors like Tiger Global, Gerloft, and Y...

Thursday, 11. July 2024

Digital ID for Canadians

The DIACC releases its Pan-Canadian Trust Framework (PCTF) Authentication Final Recommendation V1.2


Canada’s digital trust leader, the DIACC, releases its Pan-Canadian Trust Framework (PCTF) Authentication Final Recommendation V1.2, signalling it’s ready for inclusion in their Certification Program.

Why is the PCTF Authentication component important?

The Authentication component helps assure the ongoing integrity of login and authentication processes by certifying, through a process of assessment, that they comply with standardized Conformance Criteria. The Conformance Criteria for this component may be used to provide assurance that Trusted Processes represent a unique Subject, at a stated Level of Assurance, as the same Subject at each successful login to an Authentication Service Provider, while also providing assurances about the predictability and continuity of the login processes that participants offer or depend on.

What problems does the PCTF Authentication component solve?

The Authentication component helps establish a standardized way for individuals and organizations to verify their identities when accessing digital services. This reduces the risk of unauthorized access and potential breaches. Additionally, by providing a reliable method for authentication, this allows the PCTF to foster trust and confidence among users, service providers, and stakeholders. This is crucial for the widespread adoption of digital services.

Who does the PCTF Authentication component help?

All participants will benefit from login and authentication processes that are repeatable and consistent (whether they offer these processes, depend on them, or both). It can help lay the foundation to provide assurances that identified Users can engage in authorized interactions with remote systems. When combined with considerations from the PCTF Wallet Component, participants may have an enhanced user experience through the reuse of credentials across multiple Relying Parties.

Relying Parties can benefit from the ability to build on the assurance that Authentication Trusted Processes uniquely identify, at an acceptable level of risk, a Subject in their application or program space.

Find the PCTF Authentication component here.


The Engine Room

Launching our UXD support services!


Starting this month, The Engine Room will be service providers in OTF’s User Experience & Discovery (UXD) Lab.

The post Launching our UXD support services! appeared first on The Engine Room.


Berkman Klein Center

Fellows Spotlight: Johanna Wild, Investigative Journalist


An interview on risks, trends, and tools in OSINT digital research

Photo by Emily Morter on Unsplash

When Johanna Wild entered the Berkman Klein Center at Harvard as a joint Nieman Foundation innovation fellow, I was intrigued. Wild works for the award-winning international open source (OS) investigative journalism collective Bellingcat. She is an expert on the creative deployment of technical approaches to support a more diverse cohort of public interest reporters and investigators, blending automated approaches with human-centered research methodology.

As someone who has supported expert networks in both disinformation and conflict documentation, I wanted Wild’s first-hand perspective on the benefits and risks of using novel open source intelligence (OSINT) tools to enable a broader, more transparent global knowledge base. We conducted this interview over email between Amsterdam and New York City.

Sam Hinds: Do you encounter specific types of people or professional backgrounds in the work of investigations and OSINT tool development?

Johanna Wild: The great thing about the field of open source research is that it consists of people from various backgrounds. Open source researchers spend a lot of time online. They find pieces of information on social media platforms, in online forums, and in databases, and they compare features that they identify in user-generated online videos and photos with locations that can be seen on satellite imagery. This process, called geolocation, is used to verify online images. The nature of open source research allows everyone with an internet connection to do this type of work.

The open source researcher community is therefore a mix of people who do open source research as part of their job and volunteers who are passionate about contributing to important research in their free time. My surveys and user interviews with our Bellingcat community showed that our community consists of people working for human rights organizations, stay-at-home parents who use their limited time to do something mentally challenging and useful, cybersecurity specialists, job seekers who want to learn new skills, lawyers, data scientists, people who are retired and many more. When I ask volunteers about their motivation, they often say that they want to contribute to research that reveals issues in the regions where they live, and that in times characterized by various conflicts around the world and global challenges like climate change, they want to feel that they are not just passively sitting around but actively contributing to something that creates new knowledge about those issues. Another motivation is to become part of a community with similar interests and to improve their open source research skills.

Of course there are also many journalists who are part of this community. Nowadays, more and more newsrooms are setting up teams focusing on open source research. However, journalists were more of the late adopters in this field. Most of them only discovered in the last few years how useful this type of research can be, especially if it is combined with traditional journalistic skills and methods. Newsrooms even started hiring skilled open source researchers who are completely self-taught and who have no journalism degree, which is something that is still rather unusual in the news industry.

Volunteers with a technical background contribute by building tools. These are often simple command line tools that are able to do one very specific task, for instance to scrape posts from a specific social media platform or to check whether an online account has been created on a platform using a specific phone number. Those tools do not usually turn into big commercial products; they are built by people from within the open source software community who focus on writing code that is publicly accessible to anyone. Several years ago, I clearly saw that the open source researcher and the open source software community are a very good match for each other; we just needed to bring them together. This is one of the things that we now do at Bellingcat. We organize hackathons, actively invite software developers into our volunteer community, and support them in building their own tools or contributing to tools built by the Bellingcat team. This group of volunteers includes, for example, people who have a full-time job in a software company but want to do something meaningful in their free time, job seekers who want to create their own portfolio of tools, and academics who are already deep into a technical topic but would like to test its practical application.

Although the open source researcher and tech communities are very diverse in terms of their professional and personal backgrounds, they are currently still dominated by volunteers and professionals from Western countries, mainly from the US and Europe. The technical tool builder community is also, to date, still male dominated. This lack of representation raises serious questions in terms of who defines the future of our field and who has the power to research topics in regions all around the world. With people in many other regions still excluded from participating in this type of research, they mainly become the subject of Western researchers.

“While AI tools can be powerful, we should not expect to automate the whole open source research process. Doing open source research is a combination of specific research methods, the use of tools, a good dose of logical thinking and also creativity!”

SH: Have you seen novel trends emerge in the type of information researchers want today?

JW: I definitely observe that researchers, and especially journalists, have become more aware of how useful it is to be able to work with large datasets, to know how to scrape information from websites or to have the skills to build small tools that can speed up some of their research tasks.

Currently, everyone is of course interested in AI. Less experienced researchers are hoping for a tool that lets them input any picture or video and then spits out the exact location where it was taken. While AI tools can be powerful, we should not expect to automate the whole open source research process. Doing open source research is a combination of specific research methods, the use of tools, a good dose of logical thinking and also creativity! Creativity is needed to spot topics that are worth investigating. When deciding where to look next in the vast amount of online information that is out there, creativity helps to connect multiple, often tiny, pieces of verified information that allow researchers to draw conclusions on a certain topic.

Another trend is the use of facial recognition tools. Open source researchers often find pictures that show individuals who have a connection to a certain research case but whose identity they don’t know. In the last few years, several easy to use facial recognition tools have emerged. Researchers can upload a picture of a person and the tool compares this picture with collections of photos from social media platforms. Sometimes, this can reveal the identity of a person, for instance by providing the person’s LinkedIn profile. It is obvious how useful this can be to identify individuals who were involved in serious crimes that require journalistic reporting.

However, facial recognition tools are a double-edged sword. We all know that they can provide wrong results. Two people might just look very similar, and an uninvolved person might be misidentified as someone who is involved in illegal activities. It is therefore important that open source researchers do not use those tools as the only way of identifying someone. On top of that, the use of such tools raises various ethical questions, ranging from the risk of stalking random people online to questions about the data sources on which facial recognition tools rely. At Bellingcat, we reflected on how we can ensure a responsible use of facial recognition technologies and concluded that we will refrain from using these tools extensively, and never as a core element of an investigation. We have also never used products from companies like Clearview AI. A good example of how we sometimes use a facial recognition tool as a starting point for further research can be found in our article “Cartel King Kinahan’s Google Reviews Expose Travel Partners”.

SH: Are there any overlooked tools that you like to highlight in your trainings?

JW: The best type of tool really depends on the research topic. Often a combination of several small tools can lead to the best results. For instance, our Name Variant Search Tool is basically an enhanced search engine for finding information about people. Open source researchers often start with a name and try to find out as much as possible about the person’s online presence. However, the name might be written differently on different sites. “Jane Doe” might also show up as “J. Doe” or “Doe, Jane”. The tool suggests different possible variations of a name and provides search results for all those variations. It is also possible to instruct the tool to search for a name specifically on LinkedIn or Facebook.

Example: Name Variant Search results for different variants of the name “Jane Doe”
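The core idea behind such a tool can be sketched in a few lines. The code below is a hypothetical, simplified re-implementation of the concept; the variant list and function names are illustrative, not Bellingcat’s actual code:

```python
def name_variants(full_name: str) -> set[str]:
    """Generate plausible written variants of a person's name.

    A simplified sketch of the idea behind a name-variant search tool.
    """
    parts = full_name.split()
    first, last = parts[0], parts[-1]
    return {
        f"{first} {last}",      # Jane Doe
        f"{last}, {first}",     # Doe, Jane
        f"{first[0]}. {last}",  # J. Doe
        f"{last} {first}",      # Doe Jane
        f"{first[0]}{last}",    # JDoe (a common username pattern)
    }

def search_queries(full_name: str, site: str = "") -> list[str]:
    """Wrap each variant in quotes for exact-match searching; optionally
    restrict results to one site with the standard `site:` operator."""
    prefix = f"site:{site} " if site else ""
    return sorted(prefix + f'"{v}"' for v in name_variants(full_name))

for q in search_queries("Jane Doe", site="linkedin.com"):
    print(q)
```

A real tool would add transliterations, nicknames and middle-name handling, and submit the queries to one or more search engines; the sketch only shows the query-generation step.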

Our OpenStreetMap search tool, on the other hand, supports the geolocation process. A core task of many open source researchers is to find out where a photo or video that they found online has been taken. To do that, they try to identify specific features and compare those with what is visible on satellite imagery or maps. If researchers already have a rough idea in which region a photo might have been taken, they can input a list of features that are visible in the photo (for instance, a residential street, a school and a supermarket) into our tool, which will try to list all locations in a pre-defined region in which those features show up together. This can really help narrow down possible locations.
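The core of such a feature co-location search can be sketched without any mapping API: given candidate coordinates for each feature type (which a real tool would fetch from OpenStreetMap), keep only the spots where one instance of every feature falls within a chosen radius. A self-contained sketch with toy data; the function names and data are illustrative, not the actual tool’s interface:

```python
import math

def haversine_km(a, b):
    """Great-circle distance in km between two (lat, lon) points in degrees."""
    lat1, lon1, lat2, lon2 = map(math.radians, (*a, *b))
    h = (math.sin((lat2 - lat1) / 2) ** 2
         + math.cos(lat1) * math.cos(lat2) * math.sin((lon2 - lon1) / 2) ** 2)
    return 2 * 6371.0 * math.asin(math.sqrt(h))

def cooccurring(features, radius_km=1.0):
    """Given {feature_name: [(lat, lon), ...]} for a pre-defined region,
    return the instances of the first feature that have at least one
    instance of *every* other feature within `radius_km`."""
    names = list(features)
    hits = []
    for anchor in features[names[0]]:
        if all(any(haversine_km(anchor, p) <= radius_km for p in features[n])
               for n in names[1:]):
            hits.append(anchor)
    return hits

# Toy data: two schools, one supermarket; only one school is near it.
toy = {
    "school": [(52.500, 13.400), (52.600, 13.700)],
    "supermarket": [(52.503, 13.404)],
}
print(cooccurring(toy))  # only the first school qualifies
```

The real tool presumably queries OpenStreetMap (e.g. via the Overpass API) for the feature coordinates and uses spatial indexing rather than this brute-force loop, but the filtering logic is the same.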

SH: What’s an example of an unusual story or insight one can find from OS tools?

JW: If open source researchers have no idea where a picture might have been taken but they know at which time it was captured and the photo shows objects that cast clearly visible shadows, they can try our ShadowFinder tool which is able to calculate at which locations around the world shadow lengths correspond with what can be seen in the photo at a specific point in time. This helps open source researchers concentrate their geolocation efforts to the areas suggested by the tool instead of searching across the whole world.

Example of a ShadowFinder tool result: Possible locations are shown by the yellow circle.
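The underlying geometry is simple: an object of height h casts a shadow of length h / tan(sun elevation), so the measured shadow-to-height ratio constrains where the sun could have stood at that moment. The sketch below uses rough solar-position formulas (good to about a degree; the actual ShadowFinder tool presumably uses a more precise ephemeris, and the function names here are my own):

```python
import math

def solar_elevation(lat_deg, lon_deg, day_of_year, utc_hour):
    """Approximate solar elevation angle in degrees.

    Uses simplified declination and hour-angle formulas, accurate to
    roughly one degree -- enough to illustrate the principle.
    """
    decl = -23.44 * math.cos(math.radians(360 / 365 * (day_of_year + 10)))
    solar_time = utc_hour + lon_deg / 15.0        # crude longitude correction
    hour_angle = 15.0 * (solar_time - 12.0)
    lat, dec, ha = map(math.radians, (lat_deg, decl, hour_angle))
    sin_alt = (math.sin(lat) * math.sin(dec)
               + math.cos(lat) * math.cos(dec) * math.cos(ha))
    return math.degrees(math.asin(sin_alt))

def shadow_ratio(lat_deg, lon_deg, day_of_year, utc_hour):
    """Expected shadow-length / object-height ratio: 1 / tan(elevation).
    Returns None when the sun is below the horizon."""
    alt = solar_elevation(lat_deg, lon_deg, day_of_year, utc_hour)
    if alt <= 0:
        return None
    return 1.0 / math.tan(math.radians(alt))

# Example: predicted ratio in Berlin around 14:00 UTC near the June solstice.
r = shadow_ratio(52.5, 13.4, day_of_year=172, utc_hour=14.0)
print(f"shadow/height ratio: {r:.2f}")
```

Sweeping this calculation over a latitude/longitude grid and keeping the cells whose predicted ratio matches the one measured in the photo yields a band of candidate locations, like the yellow circle in the example above.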

Another tool that has gained popularity within the open source researcher community is PeakVisor, a tool that was originally targeted at helping mountaineers orient themselves but which can also be used for geolocation tasks. For instance, we used it to research the location of the killing of Colombian journalist Abelardo Liz. This example in particular shows that a combination of research skills and the use of tools can go a long way.

SH: What frustrations or barriers do you see as a trainer, and how could the field democratize knowledge of command line tools?

JW: First of all: Teaching open source research is great. People who are interested in learning these methods come from so many different backgrounds which allows everyone to learn new things from each other, including the trainers! The topic is also quite accessible, meaning that everyone can start doing open source research with very simple methods, like using search engines in creative ways. Sometimes, this can lead to surprising results: For instance, just by googling, my colleague Foeke Postma revealed how US soldiers exposed nuclear weapons secrets via flashcard apps.

Of course not all methods are as simple, and one of the things people struggle with the most is research tools. During my Nieman-Berkman Klein fellowship my research assistant Cooper-Morgan Bryant and I interviewed forty open source researchers about their use of tools. Their answers confirmed my previous findings on this topic: Open source researchers who are either beginners or looking at a topic that is new to them find it really difficult to figure out what tool they should use at what stage of the research process and how those tools work. With such a wide variety of online tools, some more useful and some easier to find than others, many researchers feel overwhelmed by the task of finding their way through the landscape of available tools spread across various platforms.

In addition, the majority of open source researchers are not able to use command line tools since this requires a certain degree of technical skills. However, those are exactly the type of small tools that the open source software community is building most frequently. There is a clear divide between those who are building tools for open source researchers and the researcher community itself, for whom those tools often turn out not to be accessible.

“Open source researchers want complex tools that are easy to use and that are stable and well-developed but such tools need funders and teams who build them, and these conditions are not always easily met in the open source research and journalism space.”

On the other side, open source researchers are often not aware of the resources that are required to build mature tools that have an easy-to-use interface. It is getting easier now, but tool builders need to invest a lot more time to build such tools and this is difficult for people who do this task in their free time and without any funding. Open source researchers want complex tools that are easy to use, stable, and well-developed, but such tools need funders and teams who build them. These conditions are not always easily met in the open source research and journalism space. I hope that researchers will become a little bit more open to learn some basic technical skills, and even more importantly that they understand that not every tool that is useful for their research has to function like a fully built commercial tool.

At Bellingcat, we focus on bridging this gap between tool builders and open source researchers. We work with tech communities —often through programs like hackathons or fellowships — and make them aware of how important good user guides are, even for seemingly easy-to-use tools. On the other hand, we teach open source researchers how to use command line tools. We also launched a video series with the goal to help researchers make their first steps towards the more technical side of research tools.

SH: Tools take a lot of resources to build. Do any OSINT tools have a complicated provenance in terms of private sector origin or geopolitics?

JW: It is definitely problematic that researchers and journalists can be so dependent on tools provided by big tech companies. For instance, many of the platforms and tools open source researchers use are provided by Google, like Google Search, Google Maps and Google Earth Pro. Meta’s social monitoring platform CrowdTangle will be shut down in August, and this has caused a lot of discontent amongst journalists, in particular amongst those who are covering elections. We are often at the mercy of the decisions that big tech companies take regarding use of their tools.

However, their tools are usually provided for free, which is not the case for other commercial tools. Open source researchers definitely need to look into the companies from which they are buying tools. One risk is that tool providers might be able to see what keywords people are typing in or what topic someone is working on. Researchers and journalists need to be sure that their sensitive research topics are safe from being monitored by tool providers.

At Bellingcat we focus on mostly small open source tools, but those tools come with their own set of challenges. For instance, it is often not clear who is behind a tool that is offered on code-sharing platforms like Github, which can raise security-related questions.

“I would love to see universities getting more involved in building and maintaining tools for open source researchers and journalists…since both sides have the common goal of advancing research in the public interest”

This is why I really hope we can build a different tool ecosystem for open source researchers in the future. I would love to see universities getting more involved in building and maintaining tools for open source researchers and journalists. I think that such collaborations could work well since both sides have the common goal of advancing research in the public interest, and many of the tools that are used by open source researchers are equally useful for academic researchers. I also see opportunities to research security-related aspects of widely used tools together, as journalists and open source researchers could definitely use some help in assessing the risks that some of the tools they are using might be posing. If anyone who reads this would like to discuss these topics with me: Feel free to get in touch!

SH: Misinformation, disinformation, conspiratorial thinking: What are some of the uses and abuses of “research” you see in these contexts?

JW: What is most common — especially during conflicts and wars — is that people share either photos or videos from a different conflict or old imagery and make people believe that they are related to current events. In the context of the Israel-Gaza conflict since October 2023, this phenomenon has reached a new scale with countless examples circulating online. For instance, Bellingcat found videos that were shared with the claims that one showed rockets that were fired at Israel by Hamas and another that claimed to show recent Israeli strikes on Hamas; both turned out to be recycled videos that had been uploaded to YouTube several years prior.

“People who post such pictures might sometimes think they are doing ‘research’ and that they are sharing relevant information about an ongoing conflict, without realizing that they are actually sharing incorrect information.”

What is dangerous is that some of those posts go viral and are able to reach significant numbers of people who will never know that they fell for misinformation. People who post such pictures might sometimes think they are doing “research” and that they are sharing relevant information about an ongoing conflict, not realizing the information is incorrect. Others, however, will do it on purpose to evoke emotions either in favor or against one of the conflict parties. Users of online platforms cannot really do much to prevent being confronted with such posts. This is another reason it is essential that we all learn to question what we see online and to invest some time in learning basic verification skills.

What we have also been seeing is that supporters of conspiracy ideologies are increasingly using open source research tools and presenting the information as journalistic findings. For example, Qanon supporters in German-speaking countries started using flight-tracking sites to search for flights which they falsely believed were circling above “deep underground military bases” in which children were hidden and mistreated. This is problematic since people who are not aware of the methods and standards of open source research might not be able to differentiate between serious research and the distorted version of it.

SH: What are some of your favorite guidelines or best practices for journalists who aim to cover (and fact-check) broad conspiratorial thinking enabled by OS information?

JW: Looking at their business models can often be a very promising approach. More often than not, conspiracy-minded communities have business-savvy people amongst them who manage to benefit financially from those communities’ beliefs. When I was researching QAnon online communities in Germany, big platforms like Amazon and eBay had started implementing measures to ban QAnon products from their platforms. However, this seemed to have created new opportunities for QAnon influencers who were offering merchandise via their own small online shops. On top of that, customers in Germany were able to buy QAnon products from abroad, for instance from Chinese or British companies who offered products targeted specifically at German-speaking customers. It was interesting but also concerning to see how international today’s conspiracy merchandise markets are.

To research online shops, it is always worth researching what payment options those shops are using and to look into their potential use of cryptocurrencies. It is also important to take some time to learn the terminology a certain group is using. If you are looking into the far-right, for instance, it is crucial to learn how to interpret the symbols they use.

”Open source researchers are often portrayed as some type of ‘nerdy hero’ who spends time on his laptop to research ‘the bad guys’ and is celebrated once he succeeds. The idea of one hero figure who solves all the research challenges is really the exact opposite of how open source research works best…”

SH: How might international organizations build stronger support for women, femme-identified, and gender-nonconforming media and research professionals?

JW: In the field of open source research, there are definitely tendencies that I would like to see changed in the future. It is well established that women and gender-nonconforming people have traditionally had a much harder time entering and succeeding in the space of investigative journalism. Those issues are far from being overcome, but the journalism world has started to talk more openly about them, and the fact that academic researchers have published work on this topic has also been helpful.

My impression is that as open source researchers, we have not yet put enough effort into reflecting on what is happening in our own field. Maybe we thought that since it is relatively new, those issues would not appear as strongly. Unfortunately, however, they do, and it’s time to recognize this.

There are definitely many contributing factors, but one that has had a strong effect on me is that open source researchers are often portrayed as some type of “nerdy hero” who spends time on his laptop to research “the bad guys” and is celebrated once he succeeds. The idea of one lone wolf who solves all the research challenges on their own is really the exact opposite of how open source research works best, which is by nature collaborative and often requires the efforts of many to put together various small pieces of verified online sources for a specific research case. For those of us who don’t want, and are also not able to fit into this commonly portrayed male hero picture, this field might not necessarily feel like a good fit.

However, since more and more traditional newsrooms are setting up open source research units right now, I see more women entering the field, and hopefully this will also change how we publicly talk about open source research over time. To everyone who organizes a public event on open source research, I recommend not only approaching the few already well-known voices in the field but also making the effort to find and invite speakers who can contribute new perspectives and who have done research on topics that are not always in the spotlight.

SH: What were the most meaningful conversations you had during your time at the Berkman Klein Center? Do you plan to use any of your connections or insights from the fellowship in your future work?

JW: I am very grateful that I was able to be a Berkman Klein Fellow this year. It was a great opportunity to be part of a community of people who all reflect on how we integrate new technologies in our lives but from various different angles. Each fellow and community hour provided me with insights into a different technology-related topic and I liked the “surprise” effect of being able to learn new things about topics I usually don’t have the time to think about. This has definitely had an impact on how I approached my own projects with Bellingcat. I feel that being immersed in such a knowledgeable and collaborative community has unlocked my creativity and I am looking forward to continuing to learn from everyone in the Berkman Klein sphere in the future.

Johanna Wild was a joint 2023–2024 Nieman-Berkman Fellow in Journalism Innovation, a joint fellowship administered between the Nieman Foundation for Journalism and the Berkman Klein Center for Internet & Society at Harvard University. Wild is currently Investigative Tech Team Lead at Bellingcat.

Fellows Spotlight: Johanna Wild, Investigative Journalist was originally published in Berkman Klein Center Collection on Medium, where people are continuing the conversation by highlighting and responding to this story.


Kantara Initiative

Kantara awards IAL3 certification to NextGenID Component Services

World's first Trust Mark award for Component Services at IAL3 will continue to build confidence in the identity industry.

We are delighted to announce that NextGenID has successfully obtained IAL3 certification for its component services. This effectively makes it the first organization to achieve IAL3 in the Identity Credentialing and Access Management (ICAM) space. This sets a new industry standard for security, accessibility and reliability.

NextGenID’s Trusted Services Solution (TSS) provides Supervised Remote Identity Proofing (SRIP) identity stations. Operators use SRIP stations to collect, review, validate, proof, and package IAL3 identity evidence and enrollment data. This means that CSPs that use the NextGenID TSS can offer an enhanced level of assurance.

Speaking of the award, Kantara Exec Director, Kay Chopard, said: “Achieving Kantara certification is a significant endeavor, reflecting a rigorous commitment to excellence in identity and access management. By developing frameworks and ensuring conformance to robust standards, we provide guidance that ensures security, privacy and interoperability in digital transactions. This is critical for organizations looking to adopt identity solutions that not only comply with current regulations but also anticipate future challenges in digital identity verification. We congratulate the NextGenID team on being the first to achieve IAL3 certification for Component Services.”

Are you ready for identity assurance certification?  Visit our Approval Process page for full details of what is involved and the criteria we use when evaluating applications.

 

The post Kantara awards IAL3 certification to NextGenID Component Services appeared first on Kantara Initiative.


ResofWorld

The Northeast Indian YouTubers challenging cultural stereotypes through mukbang

Creators from tribes in Northeast India hope these YouTube videos will help break down harmful stereotypes that have kept them isolated from the rest of the country.
In the video that made him famous, Apollos Kent is barefoot, shirtless, and scooping fistfuls of snails out from a muddy paddy field. He cooks the snails on a campfire...

Wednesday, 10. July 2024

OpenID

All Aboard the CAEP-Ability Hype Train!

Authors: Sean O’Dell (Disney), Atul Tulshibagwale (SGNL)

An Identiverse 2024 Panel Recap

The attendance for this panel, which featured all co-chairs of the Shared Signals Working Group (SSWG), was near capacity, and the engagement from the audience in the Q&A was resounding…because the hype is real with CAEP. The panel was moderated by IDAC podcast host Jeff Steadman. His questions ranged from provisioning use cases, to applicability in connected scenarios with other IAM domains (such as ITDR), to deeper dives into the CAEP specification and Shared Signals Framework. The “IAM CAEPable” T-shirts were also a hot commodity…and there might be another order coming soon.

The many questions from the audience made the discussion even more lively, allowing for open and real conversations to occur with the assembled panel of experts. The panelists felt the audience’s engagement as they saw people scribbling notes, typing on a laptop, or nodding their heads before raising their hands to elaborate or branch off into new areas. Sometimes the energetic Q&A led to a conversation between the audience and multiple panelists. This article covers the highlights.

Highlights & Key Points

Q: What are the practical use cases and applications of CAEP Events?

Apart from the immediate “session revoked” scenario, now implemented by platform providers like Apple, CAEP can be applied in numerous other scenarios. These include, for example, revoking a suspicious device’s session without impacting the end user, or informing an IdP of assurance-level changes – both informative and actionable signals.

A real-world scenario: an anomaly detection engine emits an event, which results in a CAEP event being transmitted so that you can revoke the specific session for the user and, if applicable, the device.
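CAEP events travel as Security Event Tokens (SETs). As a rough illustration of the scenario above — not a normative example; the event-type URI and field names should be checked against the current CAEP/SSF drafts, and the issuer, audience, and session identifiers here are hypothetical — a Python sketch of an unsigned "session revoked" payload might look like:

```python
import json
import time

# Hedged sketch: an unsigned CAEP "session revoked" SET payload, loosely
# following the SSF/CAEP drafts. Verify the event-type URI and claim
# names against the current specifications before relying on them.
CAEP_SESSION_REVOKED = "https://schemas.openid.net/secevent/caep/event-type/session-revoked"

def build_session_revoked_set(issuer, audience, session_id, reason):
    """Assemble the claims for a session-revoked event (pre-signing)."""
    return {
        "iss": issuer,
        "aud": audience,
        "iat": int(time.time()),
        "jti": "set-1234",  # unique token id; normally randomly generated
        "events": {
            CAEP_SESSION_REVOKED: {
                "subject": {"format": "opaque", "id": session_id},
                "event_timestamp": int(time.time()),
                "reason_admin": reason,
            }
        },
    }

payload = build_session_revoked_set(
    issuer="https://idp.example.com",       # hypothetical transmitter
    audience="https://rp.example.com",      # hypothetical receiver
    session_id="session-abc123",
    reason="Anomaly detection engine flagged the device",
)
print(json.dumps(payload, indent=2))
```

In practice this payload would be signed as a JWT and delivered over an SSF push or poll stream rather than printed.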

Q: Where do CAEP and ITDR intersect? Can you explain the significance of this intersection?

CAEP brings the “R” in ITDR (Identity Threat Detection and Response). Additionally, Shared Signals (SSF) can be leveraged to enhance ITDR by providing a way to communicate detected threats and trigger responses to security systems…using an open standard. Think of Shared Signals as the management framework and CAEP as, effectively, the events that sit on top of it. The new events introduced in the latest CAEP draft, “Session Established” and “Session Presented”, can also help detect usage anomalies like lateral movement across cloud resources.

Q: Can this be used in provisioning use cases? 

A new draft in the IETF called “SCIM Events” defines events that can be shared using the Shared Signals Framework (SSF). This can be used to communicate changes to accounts such as new account provisioning or account termination. 

Q: How can you link events to the same underlying action or reason? 

The latest draft of the Shared Signals Framework (SSF) includes guidelines on using the JWT “txn” claim to ensure that transmitters and receivers do not process multiple events for the same underlying cause or reason and to establish a lineage between cause or reason to the events transmitted for reconciliation or closing the loop.
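To make the txn idea concrete, here is a minimal sketch of receiver-side de-duplication keyed on a shared txn value. The grouping logic is our illustration of how a receiver might apply the claim, not behavior mandated by the SSF draft:

```python
# Illustrative receiver-side de-duplication keyed on the JWT "txn" claim.
# The SSF draft uses "txn" to tie multiple events to one underlying
# cause; this handling is an assumption for illustration only.

processed_txns = set()
actions_taken = []

def handle_event(set_payload):
    txn = set_payload.get("txn")
    if txn is not None and txn in processed_txns:
        # Already acted on this underlying cause; record lineage only.
        actions_taken.append(("skipped-duplicate", txn))
        return
    if txn is not None:
        processed_txns.add(txn)
    actions_taken.append(("revoke-session", txn))

# Two events emitted for the same root cause share one txn value.
handle_event({"jti": "evt-1", "txn": "cause-42"})
handle_event({"jti": "evt-2", "txn": "cause-42"})

print(actions_taken)
```

The first event triggers the response; the second is recognized as stemming from the same cause, which is exactly the reconciliation the txn guidelines aim to enable.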

New Features and Drafts Released

There have been some exciting new developments from the Shared Signals Working Group. New drafts have been released to the OpenID Foundation membership for review and voting. This stemmed from feedback at the Gartner Interoperability Summit, robust security analysis by the University of Stuttgart, the natural maturation of the specification, and Working Group feedback that brought more use cases to light.

Shared Signals Framework (SSF) – Draft 03

This draft adds clarifications and bug fixes. It also addresses security issues involving issuer and stream audience mix-ups and potential attacker subject insertion. New features include the use of the txn claim to prevent cascading chains from the same underlying event (and a means of using it for reconciliation), and the ability for transmitters to specify in their metadata which streams have no subjects by default, or “appropriate subjects”.

Continuous Access Evaluation Profile (CAEP) Draft 03

The big update here is the introduction of two new events: “Session Established” and “Session Presented”. Additionally, the draft’s examples have been updated with new formats and fields to match the new SSF draft.

CAEP Interoperability Profile – Draft 00

The first version of the CAEP Interoperability Profile, which defines how implementations can be fully interoperable for specific use cases such as session revocation and credential change, has also been released.

To learn more about the new drafts from the Shared Signals Working Group (SSWG) please click here.

About the OpenID Foundation

The OpenID Foundation (OIDF) is a global open standards body committed to helping people assert their identity wherever they choose. Founded in 2007, we are a community of technical experts leading the creation of open identity standards that are secure, interoperable, and privacy preserving. The Foundation’s OpenID Connect standard is now used by billions of people across millions of applications. In the last five years, the Financial Grade API has become the standard of choice for Open Banking and Open Data implementations, allowing people to access and share data across entities. Today, the OpenID Foundation’s standards are the connective tissue to enable people to assert their identity and access their data at scale, the scale of the internet, enabling “networks of networks” to interoperate globally. Individuals, companies, governments and non-profits are encouraged to join or participate. Find out more at openid.net.

The post All Aboard the CAEP-Ability Hype Train! first appeared on OpenID Foundation.


Elastos Foundation

Welcoming Elastos’ New International Airport: Chainge Finance

We are pleased to announce the recent integration between Elastos and Chainge Finance, enabling users to swap assets like ETH, USDC, and USDT from over 14 blockchain networks into ELA on the Elastos Smart Chain (ESC) or Ethereum (ETH) and back. This integration breaks down barriers and fosters a unified financial ecosystem in Web3. Use here now!

Elastos’ Friction Point

Elastos has long struggled with bridging issues, which has affected ecosystem growth. Users on exchanges like Coinbase have faced difficulties accessing Elastos, as only ELA on Ethereum was available. Previous solutions, such as Glide Finance’s high-fee shadow token bridge and ELK Finance‘s complex swaps, frustrated the community. Just as airports facilitate international travel and business, cross-chain interoperability is vital for a cohesive DeFi ecosystem, reducing costs, enhancing tourism, and promoting overall business.

Unlike centralised exchanges (CEXs), with their third-party control, stringent identity checks (KYC), and entry/exit restrictions, decentralised cross-chain exchanges (DEXs) like Chainge offer a permissionless alternative. They enable secure, seamless asset transfers across blockchains, supporting decentralised identities and eliminating centralisation risks, thus improving utility, security and efficiency for all Web3 stakeholders.

Interconnectivity

Stemming from CRC Proposal 151 and led by Sasha Mitchell, Elacity CEO and BeL2 Head of Operations, in collaboration with Chainge CEO DJ Qian, Elastos and BeL2 co-founder Sunny Feng Han, and the BunnyPunk team, Chainge’s cross-chain DEX has been integrated with Elastos. This integration merges liquidity pools on Ethereum (Uniswap) and ESC (Glide Finance) and adds 18,513 ELA and 41,655 USDC liquidity on Chainge’s Fusion blockchain, allowing users to swap assets into ELA from over 14 chains. These chains include Fusion, Ethereum, BNB Chain, Base, Avalanche C, Polygon, Aurora, Syscoin Rollux, X Layer Mainnet, CoreDAO, Syscoin NEVM, Arbitrum, Optimistic, Linea, zkSync and B2 (recent BeL2 partner).

 

In this screenshot, we show how USDC on Arbitrum was successfully used to purchase ELA on ESC in a single transaction using a decentralised wallet. This simplifies access to the Elastos ecosystem and its various Dapps, effectively opening up Elastos to the entire Web3 community with a new international airport.

 

Cyber Republic Proposal #294: Banking, Liquidity and Slippage

Next, working with Chainge, the goal is to soon connect to a fiat on/off ramp service, allowing users to buy ELA directly with credit cards or bank accounts and exchange ELA for US dollars into their bank accounts, enhancing accessibility and ease of use.

However, there is still a necessary challenge to tackle surrounding liquidity and slippage. Liquidity is the ease of converting an asset to cash without affecting its price, while slippage is the difference between the expected and actual trade price. High liquidity and low slippage ensure quick, predictable trades, enhancing user experience and driving adoption. Deepening liquidity and reducing slippage are crucial for an efficient financial ecosystem. Below we can see how, on the left, a 1,000 ELA order on Chainge has low fees, while on the right a 10,000 ELA order incurs drastic fees of above 15% due to low liquidity and high slippage.
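To see why slippage grows sharply with order size, here is a toy Python model using a constant-product (x·y = k) pool — an assumed pricing model for illustration, not necessarily how Chainge actually routes trades — seeded with the pool sizes quoted above:

```python
# Toy constant-product (x * y = k) AMM model showing how slippage grows
# with order size. This pricing model is an assumption for illustration
# only; Chainge's actual routing and pricing may differ. Pool sizes are
# the ones quoted in the article: 18,513 ELA and 41,655 USDC.
POOL_ELA = 18_513.0
POOL_USDC = 41_655.0

def slippage_pct(ela_out):
    """Percent difference between average fill price and spot price."""
    k = POOL_ELA * POOL_USDC
    spot = POOL_USDC / POOL_ELA
    usdc_in = k / (POOL_ELA - ela_out) - POOL_USDC
    avg_price = usdc_in / ela_out
    return (avg_price - spot) / spot * 100

print(f"1,000 ELA order:  {slippage_pct(1_000):.1f}% slippage")
print(f"10,000 ELA order: {slippage_pct(10_000):.1f}% slippage")
```

Under this model a 1,000 ELA order moves the price only a few percent, while a 10,000 ELA order — more than half the pool — moves it enormously, which is the dynamic the proposal’s added liquidity is meant to dampen.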

Sasha Mitchell proposes using the remaining 197,152 stablecoin assets from the G20 proposal to match with CRC ELA and add to Chainge’s liquidity pool, which currently holds 18,513 ELA and 41,655 USDC. This will drastically boost cross-chain and banking purchase liquidity, reducing slippage and transaction costs and setting Elastos up for the upcoming fiat on/off ramp service. This increased ELA liquidity will allow stakeholders to move assets between chains and banks more efficiently without high costs. Deep liquidity and low slippage help enable quick, stable, and predictable trades.

For more information and to participate in ongoing developments, visit the CRC proposal by Sasha and explore the live bridge on Chainge Finance via the web dapp, or download the mobile app from Google Play or the Apple App Store. Excited to learn more? Follow Infinity for the latest updates!

 


Next Level Supply Chain Podcast with GS1

Replay: Ways to Build an Enduring Brand on Amazon with Shannon Roddy

Today, the speed of change in the market and on Amazon is rapid, making it difficult for brands to keep up and see continued success. But never fear, Shannon Roddy, of Avenue7Media, is here to give us insights into the brand-building strategies you need to succeed on Amazon, and beyond! 

Key takeaways:

Building a defensible brand is crucial for long-term success. Invest in building a brand that is recognizable, trustworthy, and unique to differentiate yourself from your competitors.

Amazon holds over 50% of the online market and can significantly impact the success or failure of a brand. Harnessing Amazon's data and feedback is crucial for identifying trends, understanding demographics, and developing new products.

Leveraging Amazon's platform and customer data can give you a competitive edge, but you need to adapt to changing customer preferences and market demands.

 

Connect with GS1 US:

Our website - www.gs1us.org

GS1US on LinkedIn

 

Connect with our guest:

Follow Shannon Roddy on LinkedIn

More on Avenue7Media

 

Jump into the Conversation:

[1:42] Can you share a little bit of your background and what you’ve been working on in the last couple of years?

[4:31] The Amazon space is constantly evolving, are there some major trends or changes that have happened recently?

[10:33] You gave us some examples when we talked before of how things can go wrong for brands on Amazon, so how can you help them make things go right?

[14:45] What are some other tips and tricks that you can offer?

[18:37] Does that also mean discontinuing product one and two while you expand out, or is that what you learn from the data?

[27:14] What do you see is next from Amazon’s perspective?

[29:06] What trends are you seeing that are blowing your mind?

[32:31] What’s your favorite technology?


Digital Identity NZ

Government bringing in new digital trust framework

The government has quietly ushered in the beginnings of what it hopes will be the answer to people’s experiences of fraud and lack of trust online. Its new digital trust framework has gone live in recent days.

Digital Identity New Zealand Executive Director Colin Wallis spoke to Radio New Zealand this morning, “The intent is that you’ll have a safer digital playing field as a baseline to build other services on top of. It’s just going to take some time for the ripple through where we are now for it to become seismic.”

Listen to the full recording

You can learn more about the DISTF and the Digital Public Infrastructure on Tuesday 13 August at The Digital Trust Hui at Te Papa, Te Whanganui a Tara.

The post Government bringing in new digital trust framework appeared first on Digital Identity New Zealand.

Tuesday, 09. July 2024

We Are Open co-op

Behind the Scenes of Our New Project on Job Readiness Credentials

A step-by-step guide to our project kickoff with Jobs for the Future and International Rescue Committee

Context

We Are Open Co-op (WAO) is kicking off some work this week, collaborating with Jobs for the Future (JFF) to assess the Job Readiness Credential provided by the International Rescue Committee (IRC). WAO is managing the project, developing a user research strategy, preparing necessary materials, and conducting interviews with employers, IRC staff, and, if possible, IRC clients.

Our broad key question relates to how the visual design and metadata contained in a digital badge impact employer perceptions and interactions. We want to help JFF and the IRC have the most impact possible with the Job Readiness Credential because that impact means changing the lives of real people.

How we approach this kind of work

At the start of any project, it’s important to know the absolute basics. In fact, it’s a good time to get the Busytown Mysteries theme tune in your head as an earworm! The 5W’s and an H shown above help make sure we know all of the things necessary to set the project up for success. Ideally, we’d know most of this before even signing the contract, but anything missing we can pick up in the client kick-off meeting.

Before the client kick-off meeting, we have an internal project kick-off where we talk about everything from timelines and responsibilities, to setting up the digital environments in which we’ll do the work. If we need to purchase any new equipment or subscriptions, we’ll identify those in this meeting. Our guidelines for this can be found on the WAO wiki.

Communications and cadence

Early days of the JFF/IRC Trello board. It’s the usual kanban format with the addition of the self-explanatory ‘Feedback Needed’ column, along with ‘Undead’. The latter is for cards that would otherwise get stuck somewhere but that we don’t want to delete/archive just in case they come back to bite us!

Getting into the right rhythm with clients is an art rather than a science. While it’s easy to put an hour in the calendar each week for a catch-up call, this is sub-optimal for anything other than the very short term. This is because, in our experience, these kinds of calls quickly devolve into status update meetings.

Much better is to work as openly as possible. Sometimes that means entirely publicly with etherpads, public Trello boards, and the like. Other times, it’s working transparently with tools that provide either real-time or summary updates. Often this means that the number and frequency of meetings can be reduced. With our recent work with the DCC, for example, we met every other week, aiming for 45 minutes. Between meetings, we sent Loom videos and other sorts of outputs to make sure our collaborators knew how thinking had evolved.

While it’s important that there is a project lead from both sides, it’s also crucial that their inboxes do not become information silos. Larger organisations might use CRM systems, but for us information is best in context. So, for example, a Google Doc for ongoing meta-level important info, and everything else on the relevant Trello card (or equivalent).

Documentation is not putting a message in a Slack channel or mentioning something during a meeting. Documentation is writing something down in an uncontroversial way that makes sense to everybody involved in the project. This is important because humans can only hold so much information in our heads at one time, and our memories can be faulty.

Everything is a work in progress

CC BY-ND Visual Thinkery for WAO

‘Perpetual beta’ is another name for saying that everything is a work in progress. What’s true of software is true of documentation and everything involved in a project. Conclusions are provisional and based on the data and knowledge we had at the time.

To account for this, we usually version our work, starting at v0.1 rather than 1.0. The reason for this is to show the client (and ourselves) that we’re working towards our first fully-formed opinions and outputs. It’s all part of our attempt to work openly and show our work.

With this work that we are starting with JFF and IRC, we’ll be talking to stakeholders in a couple of different places. Our human brains want to take shortcuts and jump to conclusions quickly so that we can take action. However, we’ve learned to “sit in ambiguity” for long enough to allow thoughts and reflections to percolate. This slower kind of thinking allows us to spot things that might have been missed by our ‘System 1’ mode of thought.

Conclusion

We’re greatly looking forward to getting started with this work. We haven’t gone into how we perform user research, which is perhaps the topic for a future post. There’s a lot to cover from that point of view in terms of ethics, data, and different kinds of methodologies.

What we hope that we have shown in this post is our commitment to working openly, holistically, and thoroughly so that the outputs we generate are trusted, interesting, and actionable. We’ll share more on the project as it progresses.

Behind the Scenes of Our New Project on Job Readiness Credentials was originally published in We Are Open Co-op on Medium, where people are continuing the conversation by highlighting and responding to this story.

Monday, 08. July 2024

Hyperledger Foundation

LF Decentralized Trust: A Bigger Tent for Projects, Labs, Members, and Communities

In case you missed it, the Linux Foundation recently announced the intent to form LF Decentralized Trust, a new bigger umbrella organization where we will gather and grow an expanded community and portfolio of technologies to deliver the transparency, reliability, security, and efficiency needed to successfully upgrade critical systems worldwide.


DIF Blog

DIF welcomes JC Ebersbach as co-chair of the Identifiers & Discovery WG

DIF is delighted to have Jan Christoph (JC) Ebersbach join us as an Identifiers and Discovery Working Group co-chair, and as a new member of DIF Technical Steering Committee.

"JC is one of the rare individuals who combine strong technical expertise and creativity with community leadership abilities. Within a relatively short period of time, he has attracted a lot of interest in his work, which is already being adopted, such as Linked VPs and DID Traits. I'm excited that he has agreed to join as I&D WG co-chair, and I look forward to the collaboration!" said existing WG chair, Markus Sabadello.

"I've been following DIF's work since 2019. However, I didn't actively participate in the work items. With last year's first did:hack hackathon, my interest spiked due to the discovery of initial ideas that culminated in the Linked Verifiable Presentations specification, and its recent ratification as a DIF Deliverable", JC said.

"The collaboration and support I received from the working group, and from Markus in particular, inspired me to take on the role of co-chair.

"The Identifiers & Discovery WG is an invaluable resource for working with and building DIDs. I feel honored to serve as co-chair and I'm looking forward to advancing Decentralized Identifiers with our working group," he added.

“JC has shown tremendous leadership in the decentralized identity community, including through his work on Linked Verifiable Presentations, a recently ratified DIF specification, and DID Traits. We are delighted and honored he has accepted the role of Identifiers and Discovery WG co-chair, as well as a position on the DIF Technical Steering Committee. DIF will greatly benefit from his leadership,” commented DIF's Executive Director, Kim Hamilton Duffy.


Identity At The Center - Podcast

The Identity at the Center Podcast episode this week dives into passkey insights and challenges

The Identity at the Center Podcast episode this week dives into passkey insights and challenges with none other than Martin Sandren from IKEA. We discussed the future of passkeys to AI's role in cybersecurity. This episode is packed with valuable insights and practical advice for passkey adoption in the real world.

Watch it at https://www.youtube.com/watch?v=R94eG1gTcN8 or listen in your podcast app. Visit idacpodcast.com for more info.

#iam #podcast #idac

Friday, 05. July 2024

Ceramic Network

Ceramic Nodes in Production: Example Costs + Scenarios

Running a Ceramic node involves several key services. Learn about what production costs to expect across example hypothetical scenarios.

Running a Ceramic node in a production environment involves several key components. This article aims to provide an overview of the necessary resources and cost estimates for deploying a Ceramic node in the cloud. While we showcase only two specific providers for the required services (DigitalOcean and QuickNode), we hope the cost examples in the hypothetical scenarios below will give you a general idea of what to expect.

Components Required for a Ceramic Node

There are several sub-services to consider when running a Ceramic node in production, each serving different functions. As such, you will need:

1. Resources for JS-Ceramic: Tracks and stores the latest tips for pinned streams, caches stream state, provides the HTTP API service for connected clients to read, and communicates with the Ceramic Anchor Service (CAS) for blockchain anchoring and timestamping.
2. Resources for Ceramic-One: These nodes store the actual data and coordinate with network participants.
3. Resources for a Postgres database: Required for indexing data.
4. Ethereum RPC node API access: Required to validate CAS anchors.
5. Ceramic Anchor Service (CAS) access: Anchors Ceramic protocol proofs to the blockchain. This service is currently funded by 3Box Labs; eventually, this function will be provided by node operators, with some expected cost.

Baseline Recommended Resources

Given the services you’ll need above, the Ceramic team has tested and organized a set of “baseline” configuration settings we recommend when setting up your node. However, as these are baseline (average) figures, you may need to increase resourcing based on your actual usage:

JS-Ceramic: 2 vCPU, 4 GB memory, 10 GB disk for state
Ceramic-One: 4 vCPU, 4 GB memory, 100 GB disk for storage
Postgres database: 2 vCPU, 4 GB memory, 10 GB disk for indexing

High Traffic Recommended Resources

Given the services above, the Ceramic team has also tested and organized a set of “High Traffic” configuration settings for heavier usage. As with the baseline, you may need to increase resourcing further based on your actual usage:

JS-Ceramic: 2 vCPU, 4 GB memory, 10 GB disk for state, 10,000 IOPS
Ceramic-One: 6 vCPU, 8 GB memory, 500 GB disk for storage, 15,000 IOPS
Postgres database: 2 vCPU, 4 GB memory, 10 GB disk for indexing

High Availability Configuration

For high availability, an additional node can be configured to sync data and handle dynamic read/write tasks, thus doubling the cost of a single-node setup.

Ethereum RPC Node Endpoint Costs

We’ve also chosen QuickNode to provide several RPC cost examples:

QuickNode Base Plan: $10/month (100 million API credits, 2 endpoints, 550 credits/second)
QuickNode Middle Plan: $49/month (500 million API credits, 10 endpoints, 2,500 credits/second)
QuickNode Premium Plan: $299/month (3 billion API credits, 20 endpoints, 6,000 credits/second)

Hypothetical Scenarios and Cost Estimates

Let’s walk through three hypothetical need scenarios and use these to help estimate our cost structure:

Application A: Small User Base

- User Base: 10,000 monthly active users
- Query Behavior: 30% writes, 70% reads
- Availability: Low-priority
- Configuration: Baseline resources
- Cost Estimate:
  - Node: $96/month
  - Ethereum RPC: $10/month
  - Total: $106/month

Application B: Write-Heavy Mid-Sized Application

- User Base: 100,000-500,000 monthly active users
- Query Behavior: 70% writes, 30% reads
- Availability: High priority (2-node setup)
- Configuration: High Traffic
- Cost Estimate:
  - Nodes (2x): $918/month (2x $459)
  - Ethereum RPC: $49/month
  - Total: $967/month
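The scenario arithmetic above can be expressed as a quick calculator. This is a minimal, hypothetical sketch using the per-node and RPC-plan prices quoted in the scenarios; real cloud prices vary by provider and region, and the function name is ours, not part of any Ceramic tooling:

```python
# Hypothetical monthly cost calculator for a Ceramic node deployment.
# Per-node prices ($96 baseline, $459 high-traffic) come from the scenarios
# above; a 2-node high-availability setup simply doubles the node cost.

def monthly_cost(node_price: float, node_count: int, rpc_plan_price: float) -> float:
    """Total monthly cost: node resources (times node count) plus the RPC plan."""
    return node_price * node_count + rpc_plan_price

# Application A: baseline resources, single node, QuickNode Base plan.
app_a = monthly_cost(node_price=96, node_count=1, rpc_plan_price=10)

# Application B: high-traffic resources, 2-node HA setup, QuickNode Middle plan.
app_b = monthly_cost(node_price=459, node_count=2, rpc_plan_price=49)

print(app_a)  # 106
print(app_b)  # 967
```

Plugging in different node sizes or RPC plans gives a first-order budget before accounting for networking and storage growth.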

Example GCP budget

Other Considerations

Additional cloud costs must be considered for networking - these costs will vary based on traffic patterns. Most cloud providers offer free traffic ingress to the nodes but will charge for egress, or data leaving the nodes.
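As a rough illustration of the egress math, here is a minimal sketch; the $0.08/GB rate is an assumed placeholder, since actual egress pricing varies by provider, region, and tier:

```python
# Hypothetical egress cost estimate. The per-GB rate below is an assumption
# for illustration only, not any cloud provider's actual price.
EGRESS_PRICE_PER_GB = 0.08  # assumed USD per GB leaving the node

def egress_cost(gb_out: float, price_per_gb: float = EGRESS_PRICE_PER_GB) -> float:
    """Monthly egress cost: data leaving the node times the per-GB rate."""
    return gb_out * price_per_gb

print(egress_cost(500))  # 500 GB of outbound traffic at the assumed rate
```

Since ingress is typically free, read-heavy applications (lots of data served out) will see this line item grow fastest.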

Running a Ceramic node in production involves various components and resources, each contributing to the overall cost. By understanding the necessary configurations and associated costs, developers can make informed decisions tailored to their application's needs and user base. High availability setups and resource over-provisioning can significantly impact costs, especially for mid-sized applications with high traffic and write volumes.

Thursday, 04. July 2024

Digital ID for Canadians

OIX and DIACC join forces to move digital trust and verification interoperability forward

Open Identity Exchange (OIX) and DIACC commit to finding alignment for global policies on digital trust and verification.

UK, June 2024 – The global non-profit Open Identity Exchange (OIX) and the Canadian non-profit Digital ID Authentication Council of Canada (DIACC) have committed to working together to advance global digital interoperability – a crucial element for trusted, successful international trade in a rapidly advancing digital global economy.

OIX is an influential global community for all those involved in the ID sector to connect and collaborate, developing the thought leadership and guidance needed to enable interoperable, trusted identities carried seamlessly from place to place in ‘roaming wallets’ for everyone. DIACC is an equally influential community of public and private sector leaders committed to securing the benefits of an inclusive digital economy. It promotes user-centric design principles and verifies private sector services against the Pan-Canadian Trust Framework (PCTF), supporting a secure ecosystem of services that enables user-directed information verification between public and private sector data authorities.

The two organisations will explore how different country-based policies related to identity management, verification, security, data privacy, innovation, and approaches to digital identity assurance can be compared and analysed, so that more rapid progress can be made towards global digital ID interoperability through alignment of policy or acceptance of policy differences.

The collaboration will focus on advancing methods for participants in one framework to accept identity verification and digital credentials verified through another trust framework, based on a mixture of policy acceptance and technology adaptation. DIACC and OIX will explore equivalency and interoperability processes; identify potential alignments, new standards required, and gaps that may need to be addressed; and highlight use cases that can be facilitated through interoperability across digital ecosystems. Within this work, they will explore methods to describe common features of jurisdictional and sectoral trust frameworks, and make those insights widely available as a resource.

The exchange and transfer of knowledge and expertise will be at the heart of this collaboration. OIX and DIACC will work together to create ‘intellectual capital’ to shape debate and bring about actions, moving identity management, data privacy, and security forward at pace.

Nick Mothershaw, Chief Identity Strategist at OIX, said: “The benefits of the digital global economy will be vast, but there is still some way to go before everyone can confidently access them. Our collaboration with DIACC will play a critical role. The fantastic progress DIACC has already made across Canada is an exemplar for global interoperability and will provide much needed insight, tools and guidance to pave a much clearer way forward globally.

“Our plans are to share our work with other trust frameworks across the globe, by publishing the criteria and values, and in the short term creating an interim tool for trust frameworks to use for policy areas. We also want to secure their input on what they want to see in the Trust Framework Comparison tool, as well as to start demonstrating how a roaming wallet will work.”

Joni Brennan, DIACC President, said: “We’re thrilled to collaborate with the Open Identity Exchange. The formalization of our liaison demonstrates progress in supporting our shared values to advance secure, user-centric digital identity solutions globally. Our collaboration will leverage each organization’s expertise to explore opportunities to foster innovation, enhance interoperability, and build public trust in digital services by identifying the alignments and gaps between jurisdictional and sectoral trust frameworks.”

For more information, please contact Serj Hallam at communications@openidentityexchange.org 

About The Open Identity Exchange (OIX)

The OIX is a non-profit trade organisation on a mission to create a world where everyone can prove their identity and eligibility anywhere through a universally trusted ID. OIX is a community for all those involved in the ID sector to connect and collaborate, developing the guidance needed for interoperable, trusted identities. Through our definition of, and education on, Trust Frameworks, we create the rules, tools and confidence that will allow every individual a trusted, universally accepted, identity.

About The Digital ID and Authentication Council of Canada (DIACC)

The Digital ID and Authentication Council of Canada (DIACC) is a not-for-profit corporation of Canada that benefits from membership of public and private sector leaders committed to developing a trust framework to enable Canada’s full and secure participation in the global digital economy. DIACC’s objective is to unlock economic opportunities for consumers and businesses by providing the framework to develop a robust, secure, scalable and privacy-enhancing digital identification and authentication ecosystem that will decrease costs for governments, consumers, and businesses while improving service delivery and driving GDP growth.


Origin Trail

DKG V8: Scaling Verifiable Internet for AI to Any Device, for Anyone, on Any Chain


Driving data interconnectivity, interoperability, and integrity, the Decentralized Knowledge Graph (DKG), now in its 6th iteration, delivers significant advancements that have benefited world-class organizations and shaped standards for industrial information exchange. Through partnerships with entities such as the British Standards Institution¹²³, GS1⁴⁵, European Blockchain Sandbox⁶, and various government-funded initiatives, the DKG has also played a crucial role in informing public policies.

DKG uniquely and effectively addresses the challenges of data ownership, AI hallucinations, and bias⁷ with the Decentralized Retrieval-Augmented Generation (dRAG)⁸ framework. dRAG significantly advances the RAG model initially developed by Meta⁹ by organizing external sources in a DKG, while introducing incentives to grow a global, crowdsourced network of knowledge made available for AI models to use.

Through a prototype, DKG V8 has demonstrated the unprecedented scale at which the Verifiable Internet for AI can drive value for anyone, on any device, and on any chain. Addressing sensitive data concerns, scalability, and AI challenges concurrently has produced encouraging results that importantly shape the expected V8 release timeline.

DKG V8 — for Anyone, on Any Device, on Any Chain at Internet Scale

OriginTrail DKG has been battle-tested in real-world applications, increasingly used by an ecosystem of organizations and government-supported initiatives. To date, no decentralized system has scaled in a production environment the way the V6 DKG has. However, the current capacity of the DKG has reached its limits in supporting growing usage requirements, prompting a transition to V8, which has evolved to tackle the scale at which AI is consumed in any environment.

Data has been growing exponentially for decades, with AI accelerating that growth further — according to the latest estimates, 402.74 million terabytes of data are created each day¹⁰. This trend is increasingly visible in the rising demand for additional DKG capacity, driven by data-intensive industry deployments in aerospace, manufacturing, railways, consumer goods, and construction.

Version 8 of the DKG has therefore been designed with major scalability improvements at multiple levels, with a prototyped implementation tested in collaboration with partners from the sectors mentioned above.

3 key products of OriginTrail DKG V8

The major advancement that DKG V8 is making is in expanding the OriginTrail ecosystem’s product suite to 3 key products:

- DKG Core Node V8 — highly scalable network nodes forming the network core, persisting the public replicated DKG
- DKG Edge Node V8 — user-friendly node applications tailored to edge devices (phones, laptops, etc.)
- ChatDKG V8 — the launchpad for creating AI solutions using decentralized Retrieval-Augmented Generation (dRAG)

DKG Edge Node — enabling the largest, internet-scale decentralized physical infrastructure network (DePIN)

The newcomer in the product suite is the DKG Edge Node — a new type of DKG node enabling the OriginTrail ecosystem to tackle the global challenges described above. As the name suggests, DKG Edge Nodes can operate on Internet edge devices such as personal computers, mobile phones, wearables, and IoT devices, as well as enterprise and government systems. These devices host huge volumes of important data activity that DKG Edge Nodes will enable to enter the AI age in a safe and privacy-preserving way. The DKG Edge Node will allow such sensitive data to remain protected on the device, giving owners full control over how their data is shared.

Together with being protected on the device, edge-node data becomes a part of the global DKG with precise access management permissions controlled by the data owner. In this way, AI applications that the owner allows data access to will be able to use it together with the public data in the DKG via decentralized Retrieval Augmented Generation (dRAG).

Since such AI applications can equally be run locally on devices directly, this enables fully privacy-preserving AI solutions aimed at the ever-growing number of devices on the network edge that can at the same time use both public and private DKG data. The introduction of the DKG edge node enables the DKG to quickly expand to be the largest, internet-scale decentralized physical infrastructure network (DePIN).

New features of the DKG Edge Node

To unlock these powerful capabilities, the DKG Edge Node will include new features that have previously not been available on DKG nodes but were elements of other proprietary or open-source products.

To enable a seamless creation of knowledge, DKG nodes will inherit the proven knowledge publishing pipelines from the Network Operating System (nOS). The data protection techniques for private and sensitive data will be based on the NGI-funded OpenPKG project outcomes. The DKG Node will support all major standards such as GS1 Digital Link, EPCIS, Verifiable Credentials, and Decentralized Identities. To support the growing field of knowledge graph implementations globally, it will enable seamless knowledge graph integrations of major knowledge graph providers such as Ontotext, Oracle, Snowflake, Neo4j, Amazon Neptune, and others.

DKG Edge Node V8 Prototype — Oura Ring integration with demonstrated 400 Knowledge Assets published in 10 seconds

DKG V8 Timeline

The V8 DKG launch sequence consists of 4 stages, aligned with the wider OriginTrail ecosystem roadmap, with a forkless upgrade to V8.

Stage 1: V8 multi-chain infrastructure deployment

- Paranet deployment and first IPOs launched
- Base blockchain integration
- Cross-chain knowledge mining support

Stage 2: DKG core internet-scale V8 testnet launch

- Asynchronous backing
- Knowledge assets V2: Batch minting (in prototype)
- DKG Core: Random sampling (in prototype)

Stage 3: DKG edge nodes on V8 testnet

- Edge node beta launch
- Knowledge assets V2: Batch minting & native vector support
- DKG Core: Random sampling deployment

Stage 4: V8 mainnet upgrade deployment (October 2024)

To stay on trac(k) with updates on DKG V8 as it nears the deployment phase, make sure to join our Telegram or Discord channels!

¹https://v1.bsigroup.com/en-GB/insights-and-media/media-centre/press-releases/2023/july/new-solution-developed-for-cross-border-food-transfers/

²https://page.bsigroup.com/BSI-Academy-Blockchain-Solution

³https://www.bsigroup.com/globalassets/localfiles/en-th/innovation/blockchain-white-paper-th.pdf

⁴https://www.gs1.org/sites/default/files/bridgingblockchains.pdf

⁵https://www.gs1si.org/novice/novica/origintrail-resuje-izziv-ponarejenega-viskija

⁶https://ec.europa.eu/digital-building-blocks/sites/display/EBSISANDCOLLAB/European+Blockchain+Sandbox+announces+the+selected+projects+for+the+second+cohort

⁷https://origintrail.io/documents/Verifiable_Internet_for_Artificial_Intelligence_whitepaper_v3_pre_publication.pdf

⁸https://origintrail.io/blog/decentralized-rag-with-origintrail-dkg-and-nvidia-build-ecosystem

⁹https://ai.meta.com/blog/retrieval-augmented-generation-streamlining-the-creation-of-intelligent-natural-language-processing-models/

DKG V8: Scaling Verifiable Internet for AI to Any Device, for Anyone, on Any Chain was originally published in OriginTrail on Medium, where people are continuing the conversation by highlighting and responding to this story.


Energy Web

Energy Web Launches Full RPC Node for the Energy Web Chain

Robust, Reliable Full Node RPC Now Available with Multiple Deployment Options

Energy Web, a leader in blockchain technology solutions for the energy sector, is proud to announce the launch of its new Full RPC Node for the Energy Web Chain (EWC). This state-of-the-art product is designed to provide a robust and reliable full node EWC RPC offering, ensuring seamless and efficient operations for energy sector enterprises and application developers.

The new Full RPC Node is available in two flexible deployment options: fully managed or Bring Your Own Cloud (BYOC). Clients can choose to deploy their node on leading cloud platforms including AWS, GCP, and Digital Ocean. This flexibility ensures that organizations can select the deployment model that best fits their operational needs and technical environments.

Key features of the Energy Web Full RPC Node include:

- Fully Dedicated Node: Each client receives a dedicated node, eliminating rate limiting and ensuring optimal performance and security for their blockchain applications.
- Comprehensive Security: Nodes are properly secured, providing peace of mind that organizational data and transactions are protected.
- Embedded Analytics Dashboards: Integrated analytics dashboards offer deep insights and real-time monitoring, enabling clients to make informed decisions based on accurate data.

The introduction of the Full RPC Node further expands Energy Web’s infrastructure offerings, reinforcing the company’s commitment to providing cutting-edge solutions that meet the evolving needs of the energy sector.

“With the launch of our Full RPC Node, we’re offering a powerful tool for organizations that require robust access to the Energy Web Chain,” said Jesse Morris, Senior Fellow of Energy Web. “This product ensures that our clients can operate their applications smoothly and securely, with the flexibility to choose a deployment option that best suits their needs.”

For more information about the Energy Web Full RPC Node and how it can benefit your organization, please visit www.smartflow.org

About Energy Web

Energy Web is a global non-profit organization accelerating the energy transition by developing and deploying open-source decentralized technologies. Our solutions leverage blockchain to enable new market mechanisms and decentralized applications that empower energy companies, grid operators, and customers to take control of their energy futures.

Energy Web Launches Full RPC Node for the Energy Web Chain was originally published in Energy Web on Medium, where people are continuing the conversation by highlighting and responding to this story.

Tuesday, 02. July 2024

OpenID

Notice of Vote for Proposed Fourth Implementer’s Draft of OpenID Federation


The official voting period will be between Wednesday, July 17, 2024 and Wednesday, July 24, 2024 (11:59:59PM PT), once the 45 day review of the specification has been completed. For the convenience of members who have completed their reviews by then, voting will actually begin on Wednesday, July 10, 2024.

The OpenID Connect Working Group page is https://openid.net/wg/connect/. If you’re not already a member, or if your membership has expired, please consider joining to participate in the approval vote. Information on joining the OpenID Foundation can be found at https://openid.net/foundation/members/registration.

The vote will be conducted at https://openid.net/foundation/members/polls/331.

The post Notice of Vote for Proposed Fourth Implementer’s Draft of OpenID Federation first appeared on OpenID Foundation.


Oasis Open Projects

Invitation to comment on TOSCA Version 2.0


Public review - ends July 31st

OASIS and the OASIS Topology and Orchestration Specification for Cloud Applications (TOSCA) TC are pleased to announce that TOSCA Version 2.0 is now available for public review and comment. This 30-day review is the third public review for this specification.

About the specification draft:

The Topology and Orchestration Specification for Cloud Applications (TOSCA) provides a language for describing application components and their relationships by means of a service topology, and for specifying the lifecycle management procedures for creation or modification of services using orchestration processes. The combination of topology and orchestration enables not only the automation of deployment but also the automation of the complete service lifecycle management. The TOSCA specification promotes a model-driven approach, whereby information embedded in the model structure (the dependencies, connections, compositions) drives the automated processes.

The documents and related files are available here:

TOSCA Version 2.0
Committee Specification Draft 06
20 June 2024

https://docs.oasis-open.org/tosca/TOSCA/v2.0/csd06/TOSCA-v2.0-csd06.md
https://docs.oasis-open.org/tosca/TOSCA/v2.0/csd06/TOSCA-v2.0-csd06.html
https://docs.oasis-open.org/tosca/TOSCA/v2.0/csd06/TOSCA-v2.0-csd06.pdf

For your convenience, OASIS provides a complete package of the specification document and any related files in ZIP distribution files. You can download the ZIP file at:
https://docs.oasis-open.org/tosca/TOSCA/v2.0/csd06/TOSCA-v2.0-csd06.zip

How to Provide Feedback

OASIS and the TOSCA TC value your feedback. We solicit input from developers, users and others, whether OASIS members or not, for the sake of improving the interoperability and quality of our technical work.

The public review starts 02 July 2024 at 00:00 UTC and ends 31 July 2024 at 23:59 UTC.

Comments may be submitted to the TC by any person directly at:
Technical-Committee-Comments@oasis-open.org
Please use a subject line like “Comment on TOSCA”.

Comments submitted for this work and for other work of this TC are publicly archived and can be viewed at:
https://groups.google.com/a/oasis-open.org/g/technical-committee-comments/.
Previous comments on TOSCA works are archived at https://lists.oasis-open.org/archives/tosca-comment/

All comments submitted to OASIS are subject to the OASIS Feedback License, which ensures that the feedback you provide carries at least the same obligations as those of the TC members. In connection with this public review, we call your attention to the OASIS IPR Policy [1], applicable especially [2] to the work of this technical committee. All members of the TC should be familiar with this document, which may create obligations regarding the disclosure and availability of a member’s patent, copyright, trademark and license rights that read on an approved OASIS specification.

OASIS invites any persons who know of any such claims to disclose these if they may be essential to the implementation of the above specification, so that notice of them may be posted to the notice page for this TC’s work.

Additional information about the specification and the TOSCA TC can be found at the TC’s public home page:
https://groups.oasis-open.org/communities/tc-community-home2?CommunityKey=f9412cf3-297d-4642-8598-018dc7d3f409

Additional information related to this public review, including a complete publication and review history, can be found in the public review metadata document [3].

========== Additional references:

[1] https://www.oasis-open.org/policies-guidelines/ipr/

[2] https://groups.oasis-open.org/communities/tc-community-home2?CommunityKey=f9412cf3-297d-4642-8598-018dc7d3f409
https://www.oasis-open.org/policies-guidelines/ipr/#RD-Limited
“RF (Royalty Free) on Limited Terms”

[3] Public review metadata document:
https://docs.oasis-open.org/tosca/TOSCA/v2.0/csd06/TOSCA-v2.0-csd06-public-review-metadata.html

The post Invitation to comment on TOSCA Version 2.0 appeared first on OASIS Open.


FIDO Alliance

What is a passkey? Why Apple is betting on password-free tech


The digital realm has long struggled with the vulnerabilities inherent in password-based authentication systems. With iOS 18 launching in September, Apple introduces a groundbreaking API for developers to implement passkeys, transforming how users secure their online accounts. This innovation is set to create a password-less future, significantly enhancing user data protection.

What Are Passkeys?

Passkeys, developed by the FIDO Alliance, are a sophisticated passwordless login option for apps and websites. They consist of a “private key” stored on the user’s device and a “public key” residing with the service. This dual-key system undergoes an encrypted verification process, ensuring that access is granted only when the user’s biometrics or device PIN confirm their identity. This system effectively eliminates the need for passwords and multi-factor authentication codes, creating a seamless and secure user experience.
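The challenge-response flow described above can be sketched in a few lines. This is a simplified, hypothetical illustration only: real passkeys use asymmetric signatures (the service stores just the public key, as in WebAuthn), whereas this stdlib-only sketch substitutes an HMAC as a stand-in signature, and all function names are invented for the example:

```python
# Toy sketch of a passkey-style challenge-response login.
# Real passkeys sign the challenge with an asymmetric key (e.g. ECDSA);
# the HMAC here is only a stdlib stand-in to show the flow's shape.
import hashlib
import hmac
import os

def device_sign(private_key: bytes, challenge: bytes) -> bytes:
    # On a real device this would be an asymmetric signature, released only
    # after the user confirms with biometrics or the device PIN.
    return hmac.new(private_key, challenge, hashlib.sha256).digest()

def service_verify(registered_key: bytes, challenge: bytes, signature: bytes) -> bool:
    # A real service would verify with the stored *public* key; this toy
    # version recomputes the MAC with the shared stand-in key.
    expected = hmac.new(registered_key, challenge, hashlib.sha256).digest()
    return hmac.compare_digest(expected, signature)

key = os.urandom(32)        # established at passkey registration
challenge = os.urandom(16)  # fresh random challenge per login attempt

assert service_verify(key, challenge, device_sign(key, challenge))
# A signature over one challenge is useless for any other challenge,
# which is what defeats replay and phishing of captured responses.
assert not service_verify(key, os.urandom(16), device_sign(key, challenge))
```

Because the response is bound to a fresh per-login challenge, a captured response cannot be replayed — one reason passkeys resist phishing in a way passwords cannot.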

The Benefits of Passkeys

Traditional logins rely on passwords, which users often reuse across multiple sites, posing substantial security risks. Passkeys, however, are tied to the user’s unique device and biometric data, rendering them immune to phishing and brute-force attacks. If a passkey is stolen, it becomes useless without the rightful owner’s biometric verification. This intrinsic link between the user and the device significantly mitigates the threat landscape.

Banks and Passkey Adoption

While the advantages of passkeys are clear, some industries have been slow to adopt, including banks. Andrew Shikiar, CEO and Executive Director of the FIDO Alliance, explains, “Banks and financial institutions operate in a highly regulated industry, so they are vigilant when it comes to ensuring that user authentication complies with relevant regulations. Synced passkeys introduce a new customer assurance model that compliance leads within banks are still adjusting to.”

However, Shikiar noted that “we are now seeing regulatory and other government bodies begin to give formal guidance on how industry should contemplate passkeys,” including an April 2024 missive from the U.S. Department of Commerce’s National Institute of Standards and Technology (NIST) offering guidance about implementation.

But Shikiar says that “banks are hypersensitive to customer experience,” too, and thus more cautious about changing how customers log in—even if passkeys are quicker and more secure. New login methods require educating customers—and that takes time.

Despite these bottlenecks, Shikiar says that banks are slowly moving away from strictly password-based logins because they “inherently understand that using a passkey as a primary factor is far superior to a password.”

The Collaborative Future of Passwordless Authentication

Apple’s implementation of passkeys underlines a collective effort by tech giants within the FIDO Alliance, including Microsoft and Google, to enhance internet security. The Alliance has pioneered developments in authentication standards, striving to eliminate the vulnerabilities of password-based systems. Users can visit the FIDO Alliance website to learn more about ongoing advancements in passkey technology and the latest in passkey implementation.

As passkeys gain traction, the internet moves closer to a future where security does not come at the expense of user convenience. The collaborative efforts of industry leaders within the FIDO Alliance signal a transformative shift towards more secure, passwordless authentication methods, promising a safer digital experience for all.


Oasis Open Projects

Lead with Open Standards and Innovation


By Francis Beland, Executive Director, OASIS Open

Picture this: you’re not just in the game, you’re leading it, crafting the very rules that define success. This is the golden opportunity that unfolds when you, as a tech innovator, dive headfirst into the realm of establishing open standards.

Let’s talk about the magnetic pull of influence and the domineering stature in the tech world. Engaging actively in setting open standards is not just about putting your name out there. It’s about embedding your products, your technologies, and your visionary outlook into the very fabric of the industry’s future. You’re not just aiming to lead; you’re set to redefine leadership. Why settle for just competing when you can be the one crafting the arena? It’s about creating a scenario where your competition isn’t just trying to catch up; they’re scrambling to decode your rulebook.

You’re a tech innovator armed with resilience, foresight, and an unmatched zest for breakthroughs. It’s not about facing challenges; it’s about embracing them, dissecting them, and transforming them into stepping stones towards your ultimate victory. Envision a future where your influence reverberates across markets, shaping demands, dictating trends, and steering the technological evolution. Your efforts in spearheading open standards not only catapult your products and strategies to the forefront but also enshrine your status as an indomitable leader and a relentless innovator.

But here’s the deal – it demands more than just ingenuity and expertise. It calls for an unyielding spirit, an insatiable appetite for excellence, and an unwavering commitment to surpass the benchmarks you’ve set yourself. It’s a call to arms for those ready to lead, innovate, and inspire. Let’s not just participate in the evolution of technology. Let’s lead it. Together, we can redefine the boundaries, push the limits, and craft a legacy that echoes through the annals of tech history. The path to unparalleled success and influence is before you. The question is, are you ready to seize it?

Remember, in the grand chessboard of technological advancement, it’s not about the pieces you start with, but how you decide to play them. Write the rules, win the game. Let’s make history. Join us at OASIS Open!

The post Lead with Open Standards and Innovation appeared first on OASIS Open.


Ceramic Network

CeramicWorld 05


The 5th edition of the CeramicWorld is finally here! Here’s a quick recap of what has been happening in the Ceramic Ecosystem in the past few weeks:

- Orbis has launched a new plugin for Gitcoin Passport 🔑
- Index Network announces a Farcaster integration 💬
- Index Network and Ceramic are calling developers to build for the Base Onchain Summer! 🏖️
- Proof of Data is coming to EthCC ✈️
- Ceramic’s new Recon Protocol is almost here! 🔥

Supercharge your crypto database with OrbisDB plugin for Gitcoin Passport! 🔥

The OrbisDB team has just announced their new plugin for Gitcoin Passport.

OrbisDB is a decentralized database built on Ceramic for onchain builders, providing a practical, scalable solution for storing and managing open data. Gitcoin Passport lets users collect verifiable credentials to prove their identity and trustworthiness without revealing personal information, providing apps with a safeguard against sybil attacks and bad actors.

The new plugin allows developers to integrate the no-code Gitcoin Passport plugin with their OrbisDB instance to automatically generate reputation scores and prevent malicious actors from being indexed.
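Conceptually, such a reputation filter amounts to gating indexing on a score threshold. The sketch below is a hypothetical illustration: the field names, the 20-point threshold, and the function are invented for the example and are not the actual OrbisDB plugin API:

```python
# Hypothetical sketch of Passport-style reputation gating: documents whose
# authors fall below a score threshold are excluded from indexing.
# Field names and the threshold are illustrative assumptions.

MIN_SCORE = 20.0  # assumed minimum reputation score to be indexed

def filter_indexable(docs: list[dict], scores: dict[str, float]) -> list[dict]:
    """Keep only documents whose author's reputation score meets the threshold.

    Unknown authors default to a score of 0 and are filtered out.
    """
    return [d for d in docs if scores.get(d["author"], 0.0) >= MIN_SCORE]

docs = [
    {"author": "did:pkh:alice", "body": "gm"},
    {"author": "did:pkh:bot42", "body": "spam"},
]
scores = {"did:pkh:alice": 31.5, "did:pkh:bot42": 2.0}

print(filter_indexable(docs, scores))  # only alice's document survives the filter
```

Defaulting unknown authors to zero is the conservative choice for sybil resistance: an identity must first earn a score before its writes are indexed.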

Check out this video to learn more and see the new plugin in action:

If you’d like to become a Beta tester for Orbis plugins, shoot the team a DM!

Index Network adds Farcaster integration

Index Network has recently added a Farcaster integration to their decentralized semantic index.

Index is a composable discovery protocol built on Ceramic, allowing the creation of truly personalized and autonomous discovery experiences across the web.

This integration allows for seamless interaction with decentralized graphs, including user-owned knowledge graphs on Ceramic and social discourse on Farcaster. Paired with autonomous agents, which can be used to subscribe to specific contexts, this new integration pushes the limits of what’s possible with semantic search. Check out the demo below:

Learn more about Index Network

Build on Index Network for the Base Onchain Summer

Index Network has teamed up with the Ceramic team to call developers to build on Index Network for this year’s Base Onchain Summer! Base Onchain Summer is a multi-week celebration of onchain art, gaming, music, and more, powered by Base.

Devs are invited to build composable search use cases between Base and other projects participating in Base Onchain Summer. For example, those use cases can include:

- Composability with commerce (Shopify)
- Composability with social graphs (Farcaster)
- Composability with on-chain projects (Zora, Nouns)

TIP: Consider developing agents to facilitate user interactions with Index, such as notification agents, context subscription agents, or multi-agent scenarios that enable conversational participation.

And of course, there are prizes! A total prize pool of 2250 USDC will be distributed across the three best applications!

Check out the bounty details on Bountycaster and reach out to Ceramic and Index teams on Farcaster if you have any additional questions.

Start building today!

Proof of Data is coming to EthCC!

The third edition of the Proof of Data event series is coming to Brussels! Join the Ceramic and Textile (creators of Tableland and Basin) teams for an inspiring afternoon, expanding on the essential discussions from EthCC. This event will unite pioneers and practitioners in the decentralized data realm. Engage in dynamic panel discussions and networking opportunities, ideal for developers and innovators eager to push the boundaries of decentralized technology.

Featured presenters from IoTeX, DIMO, WeatherXM, and Filecoin will share the latest advancements and projects, sparking engaging conversations with all attendees. A moderator will guide these discussions, ensuring critical themes in crypto, web3, and beyond are covered.

Don’t miss this chance to connect, collaborate, and contribute to the future of decentralized technology. Be part of the conversation driving the next wave of technological innovation!

RSVP today and join our Data Room Telegram channel.

RSVP

Index Network & CrewAI Integration

Index now supports an integration with CrewAI, which brings an intuitive way to design multi-agent systems, with Index offering composable vector database functionality. Now, autonomous agents can synthesize data from multiple sources seamlessly.

Learn more!

Ceramic’s new Recon Protocol is almost here!

The core Ceramic team is getting ready for the public release of Ceramic’s new Recon Protocol. This new Ceramic networking protocol improves network scalability and data syncing efficiency. It unlocks data sharing between nodes and enables users to run multiple nodes that stay in sync and load balanced. This will enable highly available Ceramic deployments.

Ceramic’s Recon Protocol is in the last testing stages, with some key partners already building on it. It will be launched as part of the next Ceramic release, which will unlock the document migration process from js-ceramic + Kubo to js-ceramic + rust-ceramic.

The next Ceramic public release is scheduled in a few weeks' time. Keep an eye on the Ceramic public roadmap and Ceramic blog for updates regarding the release!
Ceramic Community Content

- BOUNTY: Build composable search applications on Index Network
- TRENDING DISCUSSION: Ceramic without Anchoring
- TRENDING DISCUSSION: Private Data Architecture
- TUTORIAL: Save OpenAI Chats to OrbisDB on Ceramic
- VIDEO: How data logs are defined to be easily discoverable in an open network by Charles from the Orbis team
- VIDEO: OrbisDB lifecycle by Charles from the Orbis team
- VIDEO TUTORIALS: Check out the latest video tutorials shared on the Ceramic YouTube channel

Events

Meet the Ceramic team at EthCC and side events:

- July 9, Proof of Data
- July 9, Data on Tap: Data & AI Cocktail Hour with Ceramic & Tableland
- July 10, Builders Brunch
- July 11, Ceramic ecosystem developers call

Contact Us

Want to get in touch with the Ceramic core team? Fill out this form (1m). Otherwise, drop us a note in the Forum.



Until next time! 🔥


Elastos Foundation

BeL2 Loan Demo App Updated to Version 0.3: Enhanced Features


We are excited to announce that the BeL2 Loan Demo App has been updated to version 0.3, available at https://lending.bel2.org/. This update brings significant enhancements and new features to improve the user experience and functionality of the app. Here’s a detailed look at what’s new:

- Added support for Taproot addresses
- Added the ability to manually request ZKP proofs
- Added support for USDC in addition to USDT
- Additional repayment/proof information in order details
- Implemented additional arbitration request use cases
- Fixed expired order use cases
- Other minor bug fixes

 

Technical Update Insights

Added Support for Taproot Addresses

Integrating Taproot addresses ensures compatibility with the latest Bitcoin advancements, enhancing privacy and scalability. Implemented in November 2021, Taproot makes all transaction outputs look the same, improving privacy whether for simple payments or complex smart contracts. It also boosts efficiency and reduces fees. Supporting Taproot allows our app users to benefit from these enhancements, enabling more discreet and cost-effective transactions. This aligns with our commitment to providing a secure and efficient platform, enhancing the overall user experience.

Added Ability to Manually Request ZKP Proofs

Allowing users to manually request Zero-Knowledge Proofs (ZKPs) enhances control and flexibility, enabling them to generate proofs as needed. ZKPs are cryptographic methods that verify a statement’s truth without revealing any additional information. In BeL2, ZKPs ensure transaction privacy and security without exposing underlying data. Previously, ZKP generation was automatic, which might not have suited all users. Now, users can decide when to generate ZKPs, providing greater autonomy and control over their transactions and improving the overall user experience.

Added Support for USDC in Addition to USDT

Supporting USDC alongside USDT provides users with more stablecoin options, enhancing liquidity and flexibility within the app. USDC is known for its regulatory compliance and widespread acceptance, making it a valuable addition to our platform. As a stablecoin pegged to the US dollar, USDC maintains a stable value, attracting users who seek to avoid cryptocurrency volatility. Integrating USDC, widely supported in decentralised applications (dApps) on the Elastos Smart Chain, such as Glide and Elacity, offers more financial activity options and leverages existing liquidity, making transactions more seamless and efficient.

Additional Repayment/Proofs Information in Order Details

Providing detailed repayment and proof information in order details enhances transparency and user trust. It allows users to clearly understand their transactions, repayments, and associated proofs, essential for effectively managing financial activities. This update adds comprehensive data on repayment schedules and proof generation directly within the order details. Users can now see all necessary information in one place, making it easier to track and manage loans. This transparency is crucial for building trust in decentralised finance (DeFi) applications, ensuring users have all the information they need to make informed decisions.

Implemented Additional Arbitration Request Use Cases

Enhancing the arbitration process is critical for ensuring fair and efficient dispute resolution within the platform. By implementing additional use cases for arbitration requests, we aim to provide a more robust mechanism for handling disputes, and maintaining user trust and satisfaction. Arbitration in BeL2 resolves disputes between parties in a decentralised manner, using ELA collateralised nodes, ensuring fairness without relying on centralised authorities. The new use cases expand the scenarios for arbitration, making the process more comprehensive and adaptable, reinforcing the platform’s reliability and fairness.

Fixed Order Expired Use Cases

Addressing issues related to order expiration is vital for ensuring smooth and reliable transaction processes. Fixing these use cases improves the overall user experience by preventing disruptions caused by expired orders. Order expiration issues occur when a transaction is not completed within a specified timeframe, leading to complications or the need for manual intervention. By fixing these issues, we enhance the predictability and reliability of the platform. This ensures that orders are processed as expected, reducing the likelihood of unexpected expirations and associated complications. These improvements help maintain a seamless transaction flow, enhancing user satisfaction.

Other Minor Bug Fixes

Continuous improvement through minor bug fixes is essential for maintaining the stability and performance of the platform. These fixes address small issues that, collectively, can significantly impact the user experience. Minor bug fixes involve resolving smaller issues that may not be immediately noticeable but contribute to the overall functionality and stability of the app. By regularly addressing and fixing minor bugs, we ensure that the app runs smoothly and efficiently, providing users with a reliable and enjoyable experience.

These updates mark a significant step forward in our mission to provide a secure, flexible, and user-friendly platform for decentralised finance built on Bitcoin. By continuously enhancing the BeL2 Loan Demo App, we aim to offer the best possible experience for our users, ensuring that they can leverage the full potential of their Bitcoin holdings in a secure and decentralised manner. Stay tuned for more updates and innovations as we continue to develop and expand the capabilities of the BeL2 ecosystem. Excited to learn more? Head over to the BeL2 website and follow Infinity for the latest updates!

 

Monday, 01. July 2024

OpenID

New Shared Signals Drafts


Authors / Shared Signals Co-Chairs: Atul Tulshibagwale, SGNL; Shayne Miel, Cisco; Sean O’Dell, Disney; and Tim Cappalli, Okta

The OpenID SSWG has released three new drafts for review by the OpenID Foundation membership. We would like to describe the salient features of these drafts here. At the end of the 45-day review period, members can vote on adopting these drafts as implementer’s drafts.

Shared Signals Framework – Draft 03

After the Shared Signals Framework Implementer’s Draft 02 was released, the OpenID Foundation contracted with the University of Stuttgart to perform a formal security review of the draft specification. The good news is that the findings from the preliminary report were minor, but the bad news is that addressing them required changes to the normative language in the draft. As a result, the SSWG decided to create a Draft 03, which would need to go through the OpenID review process in order to be adopted as a successor Implementer’s Draft. Since this change was necessary, we also decided to update some other aspects of the framework in a backwards-compatible way (i.e., anything that implements draft 02 will still be draft 03 compliant). The salient features added and the issues fixed in this draft are listed below:

Security issues addressed

- Issuer Mix Up: Draft-02 did not specify that a receiver must validate the issuer value in incoming events and API responses from the transmitter. This language has now been updated to specify that receivers must validate the iss value in events and API responses they receive from a transmitter.
- Stream Audience Mix Up and Attacker Stream Subject Insertion: Draft-02 did not specify that a transmitter must authenticate a receiver, which we have remedied in draft-03. The new language in draft-03 also requires that transmitters use TLS and recommends that receivers verify the trusted source of the transmitter URL and use HTTPS.

New features added

- Use of txn claim: Draft-03 now clarifies how to use the JWT txn claim in order to prevent cascading cyclic chains of SSF events caused by the same underlying event. By verifying that the txn claim in a newly received SSF event is the same as a previously received SSF event, the receiver can ignore subsequent events it receives. The txn claim can also be used for reconciliation or auditing purposes between a transmitter and receiver as part of “closing the loop” on security events and actions.
- SSF transmitters can now specify in their metadata whether streams they create have no subjects in them, or “all appropriate subjects” automatically added in them, immediately after the stream has been created.

Clarifications and Bug fixes

A number of minor bugs, mostly involving non-normative language such as examples, have been fixed in this draft. Some new examples have been added and existing examples have been updated to match formats that have changed since those examples were first introduced.
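The two receiver-side checks introduced in Draft 03 (validating the iss value and de-duplicating by the txn claim) can be sketched roughly as follows. The helper function and the expected-issuer value are illustrative only, not part of the specification; it assumes the SET has already been signature-verified and decoded into a dict.

```python
# Illustrative receiver-side checks per SSF Draft 03 guidance:
# validate the issuer of each incoming event, and de-duplicate by txn
# to avoid cascading cyclic chains of SSF events. The helper itself is
# a sketch, not spec-defined code.

EXPECTED_ISSUER = "https://transmitter.example.com"  # assumed value


def accept_event(decoded_set: dict, seen_txns: set) -> bool:
    """Return True if the decoded SSF event should be processed."""
    # Issuer mix-up protection: reject SETs whose iss does not match
    # the transmitter this stream was configured against.
    if decoded_set.get("iss") != EXPECTED_ISSUER:
        return False
    # txn-based de-duplication: ignore events already seen for the
    # same underlying transaction.
    txn = decoded_set.get("txn")
    if txn is not None:
        if txn in seen_txns:
            return False
        seen_txns.add(txn)
    return True


seen = set()
evt = {"iss": "https://transmitter.example.com", "txn": "abc123", "events": {}}
print(accept_event(evt, seen))  # first delivery -> True
print(accept_event(evt, seen))  # replay of the same txn -> False
```

A production receiver would also verify the SET signature, audience, and timestamps before reaching these checks.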

CAEP Draft 03

The Continuous Access Evaluation Profile now has a new draft for review. The main update in this draft is the introduction of two new CAEP events: “Session Established” and “Session Presented”. These events can help in the following ways:

Session Established:

- Notify completion of a federated identity initiated SSO
- Indicate to a monitoring service that a user has established a new session with a particular application
- Optionally bind a session to a specific device or other context so that it is easier to detect session hijacking

Session Presented:

- Helps a monitoring service detect user presence at a specific application
- Helps detect impossible travel across applications
- Helps detect changes in environmental properties, such as IP-address changes

Together these two events can help effectively monitor an organization’s cloud services for identity threats.

In addition to these new events, the draft has been updated to reflect the new formats and fields in all examples to match the latest SSF draft.
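For illustration, a “Session Established” Security Event Token payload along the lines described above might look roughly like the sketch below. The top-level claims follow the SET conventions (RFC 8417), but the exact event-type URI and the event's field set should be checked against the published CAEP draft; the issuer, audience, subject, and binding values here are made up.

```python
import json
import time

# Assumed event-type URI; confirm against the published CAEP draft.
EVENT_TYPE = "https://schemas.openid.net/secevent/caep/event-type/session-established"

set_payload = {
    "iss": "https://idp.example.com",   # transmitter (made-up value)
    "aud": "https://app.example.com",   # receiver (made-up value)
    "iat": int(time.time()),
    "jti": "756e69717565206964",        # unique SET identifier
    "events": {
        EVENT_TYPE: {
            # Which user/session the event concerns (illustrative subject).
            "subject": {"format": "email", "email": "user@example.com"},
            # Illustrative binding claim: tying the session to a device/UA
            # context makes hijacking easier to detect.
            "fingerprint_ua": "Mozilla/5.0 (...)",
            "event_timestamp": int(time.time()),
        }
    },
}

print(json.dumps(set_payload, indent=2))
```

In practice this payload would be signed as a JWT and delivered over an SSF stream rather than printed.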

CAEP Interoperability Specification

In March 2023, the OpenID Foundation conducted an interoperability event hosted at the Gartner IAM Summit in London. The results of that interoperability event are documented as a part of this blog post. At that time, the implementers established interoperability of the actual events being exchanged. The SSWG had already begun work on an interoperability profile that would specify more than just the event formats to be supported. So now we are pleased to announce the first version of this interoperability profile, which specifies:

- Spec versions that must be supported by transmitters
- API endpoints that must be supported by transmitters
- Authorization schemes that must be supported by transmitters
- Stream control features that must be supported by transmitters
- AddSubject behavior of transmitters
- Subject formats supported by transmitters and receivers
- Signature formats supported by transmitters and receivers
- Details of OAuth options that must be supported by transmitters and receivers
- Event types that must be supported by transmitters and receivers

We invite the general public and members of the OpenID Foundation to review the specifications that are available here. Feedback may be provided by opening an issue in the Shared Signals GitHub repository.

About the OpenID Foundation

The OpenID Foundation (OIDF) is a global open standards body committed to helping people assert their identity wherever they choose. Founded in 2007, we are a community of technical experts leading the creation of open identity standards that are secure, interoperable, and privacy preserving. The Foundation’s OpenID Connect standard is now used by billions of people across millions of applications. In the last five years, the Financial Grade API has become the standard of choice for Open Banking and Open Data implementations, allowing people to access and share data across entities. Today, the OpenID Foundation’s standards are the connective tissue to enable people to assert their identity and access their data at scale, the scale of the internet, enabling “networks of networks” to interoperate globally. Individuals, companies, governments and non-profits are encouraged to join or participate. Find out more at openid.net.

The post New Shared Signals Drafts first appeared on OpenID Foundation.


We Are Open co-op

Finding your activist friends

Solidarity, common ground, and intersectionality in the climate movement

This post looks at ways we can channel our activist energy in ways that address multiple issues and find belonging in adjacent communities.

Recently at a week long event that brought together energy transition activists from around the Mediterranean, I was pleased to meet a variety of people with intersectional understandings of the climate crisis. Together, we explored what intersectionality looks like in the climate movement and how we can tell stories that lead to action.

cc-by-nd Bryan Mathers

Expanding our activist energies

Although many of us care about a variety of struggles, we don’t have the time or the energy to get involved in every single thing. We focus our energies, we have to. The problem, of course, is that each issue and cause needs the visibility a group of activists coming together can provide. So how can we focus ourselves and find energy to do more?

For just a couple of hours last week, I worked with a small group of rabble-rousers to create campaign ideas for the challenge:

The intersection and complexities of our structural problems makes people feel powerless.

The structural problems we are facing are complex and co-exist within a matrix of other challenges. We are dealing with environmental crisis, racist societies, and social inequalities left, right, and centre. No matter how positive your personality may be, it’s hard to stay optimistic. No matter how cognisant you are about other struggles, it’s hard to pay attention to everything.

When we are overwhelmed and feeling powerless, we tend to recede. Our group had the insight that feeling overwhelmed or powerless is lonely. Loneliness is a cascading psychological phenomenon that halts action and feeds despair.

We started to think about how we might address loneliness in activist movements by telling stories that help people who feel like they belong to one group (e.g. environmentalists) to understand their connection to other groups.

Our theory of change is that finding belonging amongst your activist friends can provide you with solidarity and a source of energy. We wanted to push for intersectionality in the climate movement.

Intersections in audiences, the Audience Ikigai, cc-by WAO

Choose three: intersectionality in practice

Everybody cares about something, whether that be sports, the environment, or even status. If you can identify one thing you care about, you can surely identify three others. Using arbitrary design constraints, like “choosing three” is a good way to move any idea forward, including ideas around your own activism or civic participation.

We know that climate change disproportionately affects already marginalised communities, which can exacerbate existing social inequalities. With this in mind, we chose to look at the intersectionality of climate with three human rights issues:

- Refugee and migrant justice
- Women’s rights
- LGBTQIA+ rights

Easy, right? Choosing three issues to put your energy into is a lot less than “everything”. Three is also enough to give variability and provide access to different communities. Different communities come with different energies and that is something you can tap into when needed.

We often consider the thematic intersections of our own work. See how we work in the overlaps together with thoughtful, ethical organisations in Practical utopias and rewilding work.

Find connections and leverage points

Intersectionality is about understanding the points of interconnection between two issues. Seeing the overlaps means that you can connect issues together in new and novel ways. Novelty is just one storytelling tactic in calling attention to a particular issue. Once you’ve determined places to focus, you can further narrow your focus by looking for leverage points that lead to connection.

- Refugee and migrant justice: From climate-induced displacement to the fact that people who are forced to migrate, whatever the reasons, can face challenges in accessing their basic rights, refugee and migrant justice ties heavily to other environmental and human rights issues.
- Women’s rights: Women’s societal roles as caregivers and food producers make them more vulnerable to the effects of climate change. It’s now widely understood that educating girls is a catalyst towards climate action.
- LGBTQIA+ rights: Again, marginalised communities are disproportionately affected by the climate crisis. LGBTQIA+ people are often members of other marginalised communities, such as racial minorities, and they are more likely to live in poverty.

Human rights and environmental justice are big and complex areas of focus. Thinking about how the complexities of these issues overlap can help narrow down the impact you want to have.

cc-by Iris Maertens with Dancing Fox

Have some fun

Yes, structural problems are serious and complicated. It is essential to be aware of both your own privileges (whatever they may be) and to think deeply about the issues and communities you are working with and within. It’s also important to know that joy is a common emotional human experience. Inciting joy is a way to truly help people. It can help build psychological characteristics that help people deal with whatever life has to throw at them. Joy can also open people up to a better tomorrow.

At the event I attended last week, as we thought about the intersectionality of environmentalism with human rights, we considered how we might inspire people to be joyfully curious about learning more about an issue they might not have much involvement with.

We developed a few posters, designed to be displayed on a metro, to inspire this curiosity.

Our poster ideas, drawn by the incredible Iris Maertens

Solidarity with others

The complexity of our global problems can be overwhelming, but we cannot solve one complex issue without tackling the intertwining structural issues. Finding ways to relate what you care about to what others care about is a way to build solidarity and, therefore, momentum. It’s not always easy, but the more you can participate in cross-cutting social and environmental communities, the bigger our collective power becomes.

I worked with inspiring people from these organisations:

- “La Casa dels Futurs is both an ongoing project dedicated to supporting intersectional organizing between social and ecological movements, and a campaign to create a permanent Climate Justice Center and Movement School…”
- “Rinascimento Green…aims to bring together various pieces of civil society to promote, through a path of popular participation, a bottom-up Green Deal.”
- “WeSmellGas is a collective of organisers, researchers and film-makers based in Northern Europe. Climate justice can only be realised by dismantling capitalism and the imperial processes that reinforce it, including our current extractivist energy system.”

🔥 Do you need help with storytelling and advocacy? Check out WAO’s free, email-based Feminism is for Everybody course, or get in touch!

Finding your activist friends was originally published in We Are Open Co-op on Medium, where people are continuing the conversation by highlighting and responding to this story.


Identity At The Center - Podcast

Happy birthday to the Identity at the Center podcast!


Happy birthday to the Identity at the Center podcast! Our latest episode is particularly special as we celebrate the milestone of five incredible years of Identity at the Center. In this special episode, we celebrate the podcast’s fifth birthday by revisiting our very first episode to update and explain the process we use to develop IAM strategies and roadmaps. Thank you to our amazing listeners for your continued support!

Watch it here https://youtu.be/OUHTB1ncLME?si=fxeS8bNtzmKaW2kp or listen in your podcast app.

More info at idacpodcast.com

#iam #podcast #idac

Sunday, 30. June 2024

OpenID

OpenID for Verifiable Credentials Wins EIC Award!


The OpenID Foundation is proud to announce that, for the work building the “OpenID for Verifiable Credentials” family of specifications, members of the Digital Credentials Protocol (DCP) Work Group won the “Future Technologies and Standards” award at the European Identity and Cloud Conference.

For the last several years, this group has been working tirelessly to develop scalable OpenID specifications attuned to Issuer-Holder-Verifier use cases. This family of specifications enables both the issuance and presentation of digital credentials – regardless of their format – as well as pseudonymous authentication. The net result of their work will be that end-users gain control, privacy, and portability over their identity information. And their constant, simultaneous focus on verifiers underpins a solid path to adoption.

You can learn more by listening to WG Chair, Kristina Yasuda, speak about how Digital Identity Wallets can “cross the chasm” to widespread adoption during her EIC Keynote.

This award recognizes the impact that the WG has already had on the market (learn more on their landing page):

- The European Architecture and Reference Framework lists several of their specs as required for certain use cases
- 3 draft ISO standards reference DCP specifications
- 18 wallets in the European Commission EBSI project support them
- NIST plans to implement reference implementations of OID4VP to present mdocs/MDL

To further support and enable OID4VC implementers, the Work Group has been engaging closely with the OpenID Certification team to develop tests that will ensure that deployments are interoperable and secure. There are already draft tests for Verifiable Presentations being trialed by a number of wallets, and the Foundation is working with NIST and other partners on more. So stay tuned!

Thank you so much to the DCP WG for their efforts, their commitment to the work of the Foundation, and their advocacy for the users at the heart of this family of standards. 

Congratulations to Kristina Yasuda, Torsten Lodderstedt, Joseph Heenan, Tobias Looker, Oliver Terbu, Paul Bastian, John Bradley, Mike Jones, Fabian Hauck, Jan Vereecken, Nat Sakimura, Gail Hodges, Daniel Fett, Brian Campbell, Christian Bormann, and all others who have participated in and progressed this work.


The post OpenID for Verifiable Credentials Wins EIC Award! first appeared on OpenID Foundation.

Friday, 28. June 2024

Hyperledger Foundation

Introducing Splice, a New Hyperledger Lab for Canton Network Interoperability

Hyperledger Labs



DIF Blog

DIF Newsletter #41


June 2024

DIF Website | DIF Mailing Lists | Meeting Recording Archive

Table of contents

1. Decentralized Identity Foundation News
2. Conference Season wrap-up
3. Announcements at DIF
4. Community Events
5. DIF Members
6. Get involved! Join DIF

🚀 Decentralized Identity Foundation News

Credential Schemas launches

The Credential Schemas work item is up and running following a successful kick-off meeting earlier this month.

“There was strong attendance and great energy. We had participants with deep experience in schema development and KYC (Know Your Customer) compliance, as well as newcomers to decentralized identity. We are currently focused on establishing the scope and use cases,” said DIF’s Executive Director, Kim Hamilton Duffy.

“It’s not too late to participate. We’re looking to involve a broad range of expertise, including people familiar with KYC reusable ID use cases, such as compliance experts,” she added.

The work item meets every other Tuesday at 10:00 PST / 13:00 EST / 19:00 CET, with the next meeting scheduled for Tuesday 2 July.

Join DIF to get involved.

Linked Verifiable Presentations

The Linked Verifiable Presentations specification has been approved.

The spec defines how to share, discover and retrieve Verifiable Credentials publicly via a service entry in a DID Document. It complements existing technologies for sharing VCs privately, like DIDComm messaging and OID4VC. Use cases enabled by Linked VPs include

- Discover verifiable data about a website
- Simplify the onboarding of suppliers and customers by linking relevant non-sensitive data, such as business registration credentials, to the organization’s DID
- Make mandatory data verifiable: provide imprint pages or terms of use statements as machine-readable, verifiable credentials
- Decentralized business networks: people share their educational background and work experience as verifiable credentials publicly

The specification is available here: https://identity.foundation/linked-vp/

Working Group training

DIF provided our first WG training session. Check out the session recording and slides (recommended viewing for all DIF members).

Operational Excellence @ DIF

Work to automate DIF’s operational processes continues to make excellent progress thanks to our star Systems Administrator Pratap Mridha (pictured above).

🛠️ Conference Season wrap-up

European Identity and Cloud Conference (EIC) 2024

This year's EIC felt to many like a defining moment in decentralized identity's journey from idea to reality.

The event took place as Germany was gearing up to host the Euro 2024 football competition.

Decentralized identity luminaries were in abundance. Rolf Rauschenbach, Anil John, Daniel Goldscheider of OpenWallet Foundation, Kim, Ramesh Narayanan of MOSIP, and Damian discuss standards collaboration outside of the Berlin Congress Center.

Long-time decentralized identity leaders and visionaries Kaliya Young and Phil Windley catch up in between sessions.

Executive Director Kim Hamilton Duffy and DIDComm WG co-chair Steve McCown of Anonyome Labs delivered a Decentralized Identity Technical Mastery Sprint to a packed seminar theatre on the opening day of the conference (see this summary of their session on the DIF blog)

DIF Steering Committee Member Markus Sabadello of Danube Tech duelled with OpenID Foundation chairman Nat Sakimura over how to realize SSI principles in their joint keynote presentation, Les Miserables of the Cyber Frontier (session summary here)

Kim teamed up with Wayne Chang of SpruceID and Linda Jeng of Digital Self Labs to explore the key role of decentralized identity in building trust in AI (session summary here)

Misha Deville, co-founder of Mailchain, spoke about lessons learned from Web3, including the importance of network design in achieving the target outcomes of decentralized identity ecosystems

Kaliya Young gave a presentation about institutional memory, and the implications for organizations, individuals and society

Nick Lambert of Dock Labs and Nick Price, who co-chairs the DIF Travel & Hospitality SIG, joined other industry experts to explore how decentralized identity can help upgrade Customer Identity and Access Management (CIAM) - summary on the DIF blog here.

Riley Hughes of Trinsic, Sam Curren of Indicio and Kim were joined by Abbie Barbir to discuss reusable identity and bootstrapping decentralized identity ecosystems.

Fraser Edwards of cheqd and Sharon Leu of JFF Labs spoke about incentives for wallet developers during a panel discussion addressing usability challenges of digital identity wallets

The German Federal Agency for Disruptive Innovation (SPRIND) selected several companies including Sphereon to develop prototypes for the European Digital Identity Wallet

Identity Week Europe

DIF members including Polygon ID / Privado, Mailchain, Indicio, Hypermine, PassiveBolt and Tonomy Foundation converged on Amsterdam for another European identity industry gathering.

AI, and the rapidly changing cyber-threat landscape were major themes.

Decentralized identity also generated great interest at the event, which is traditionally dominated by IAM, physical and cross-border ID: see this summary of several decentralized ID themed discussions on the DIF blog.

Digital Identity unConference Europe

Europe's own IIW-inspired event returned to Zurich, Switzerland, where many of those present at last year's inaugural event were joined by a throng of new participants.

The eIDAS 2 regulation and EU Digital Identity Wallet were key topics.

Practical questions such as how to kick-start a DI ecosystem, onboard customers using government-provided PID (Personal Identification Data) credentials and make life simpler for technology implementers and users were at the heart of the discussions.

Organizational identity and B2B use cases were also recurring themes in many of the sessions.

📢 Announcements at DIF

DWN users - share your use case!!

Do you use Decentralized Web Nodes? We want to hear about it! Let us know how you're using DWNs here.

DIF Labs

The DIF Labs working group is coming soon; contact membership@identity.foundation to learn more.

🗓️ ️Community Events

Coffee Breaks

If you missed this month's DIF Coffee Breaks, moderated by DIF's Senior Director of Community Engagement, Limari Navarrete, be sure to check out the recordings:

- Andres Olave, Head of Technology at Velocity Career Labs
- Cole Davis, Founder and CEO at Switchchord

Last month's coffee breaks

- Tim Boeckmann, CEO and Co-founder of Mailchain
- Nara Lau, Founder at Fise Technologies
- Ankur Banerjee, CTO and Co-founder at Cheqd
- Humpty Calderon, Advisor @Ontology and creator of Crypto Sapiens Podcast

Follow https://twitter.com/DecentralizedID to get updates

🗓️ ️DIF Members

New Member Orientations

If you are new to DIF, join us for an upcoming new member orientation. Please subscribe to DIF’s Eventbrite page for notifications about orientations and other events.

🆔 Join DIF!

If you would like to get in touch with us or become a member of the DIF community, please visit our website.

Can't get enough of DIF?
| Follow us on Twitter
| Join us on GitHub
| subscribe on YouTube
| read our DIF blog
| read the archives


Ceramic Network

Save OpenAI Chats to OrbisDB on Ceramic (Tutorial)

Build an AI-powered chatbot using OrbisDB for storage and the OpenAI API.

Last year we partnered with Learn Web 3 (a free educational platform for Web3 developers) to publish a tutorial on Saving OpenAI Chats to ComposeDB on Ceramic to showcase an easy-to-understand design architecture and how ComposeDB could be leveraged for storage. In that example we showed how to configure and deploy a local ComposeDB/Ceramic node, walking through data model design, server configurations, model deployment, and runtime definition generation, all of which are necessary steps a developer must undergo before running the application locally.

But what if developers could bypass local node configuration altogether and start testing their database design and application logic immediately? What if they could do so with the assurances of no lock-ins, and the open option to move to a self-hosted configuration in the future? And finally, what if they could benefit from all of these things while enjoying a seamless developer experience that makes storage setup easy?

That's where OrbisDB comes in.

What is OrbisDB?

OrbisDB is an advanced decentralized database built on the Ceramic Network and offers an ORM-like interface that developers can leverage when integrating the OrbisDB SDK. Developers who have worked with Prisma or Drizzle with a Postgres instance will find this experience familiar and exceedingly easy to work with.

As for developer experience, what sets OrbisDB apart are the following:

- A user interface (UI) developers can run either locally or using a hosted Studio instance, bypassing the need to define and deploy data models by hand (which is still an option if using the SDK). The UI also includes data visualization (you can view data relevant to your applications in table format), as well as other views for configuring add-ons like plugins (described below).
- OrbisDB offers a growing list of plugins to enrich the data capabilities developers can incorporate into their application logic. Some example plugins offer gating ability, automatic resolution of ENS domains, sybil resistance, and more. Anyone can also build plugins and incorporate them immediately in the event they're running a standalone instance.
- The OrbisDB SDK wraps user authentication, client creation, schema creation (if developers prefer not to use the UI), and querying all under one roof, therefore simplifying the list of dependencies developers need to worry about.
- Finally, OrbisDB offers the option to run an instance locally (similar to the ComposeDB tutorial mentioned above), or on a shared (hosted) instance. This is a significant feature for overall development and testing velocity, as it lets developers start writing and reading data right away without having to worry about node configuration. Once developers are ready to take their application to production after testing on the shared instance, setting up a self-hosted (standalone) instance is straightforward.

For this tutorial, we will be leveraging the hosted Studio instance to both define our data models and utilize a shared OrbisDB instance.

Let's Get Started!

Before we get started, you will need the following dependencies:

- MetaMask Chrome Extension (or similar browser wallet for authentication)
- Node v20
- An OpenAI API key
- A project ID from WalletConnect
- A free OrbisDB Studio account - first, log in with your browser wallet. We will use this later to define our data models and obtain a context and environment ID

Initial Setup

First, clone the repository and install the dependencies:

```shell
git clone https://github.com/ceramicstudio/orbisdb-chatbot && cd orbisdb-chatbot
npm install
```

Next, create a copy of the example env file in your root directory:

cp .env.example .env

Visit the OpenAI signup page to create an account if you don't yet have one, and generate an API key. OpenAI offers a free API trial with $5 worth of credit (which can take you a LONG way). Go ahead and assign your new API key to OPENAI_API_KEY in your new .env file.

Navigate to WalletConnect and create a free account and a new project (with a name of your choosing and the App type selected). Copy the resulting project ID and assign it as the value for NEXT_PUBLIC_PROJECT_ID.

OrbisDB Setup

If you're logged into your OrbisDB Studio account, we can start connecting our application to a shared OrbisDB instance.

First, you will need to define a new context. Contexts are a Ceramic-native feature exposed in all data management methods, and make it easy for developers to organize data across different applications or projects (there is also the option to leverage sub-contexts, but we'll save this for a future tutorial).

Go ahead and click "+ Add context" within your root studio view - feel free to give your new context a name and description of your choosing:

If you click into your new context you can view its corresponding ID:

Go ahead and copy this value and assign it to NEXT_PUBLIC_CONTEXT_ID in your .env file.

On the right-hand side, you should also see details about your setup:

Copy the value found under "Environment ID" and assign it to NEXT_PUBLIC_ENV_ID in your .env file. This ID is required to identify you when using the shared OrbisDB instance.

You will also see the endpoints for the shared Ceramic and OrbisDB instances in the same section. No need to copy these values as they are already hard-coded into the repository.

Defining Data Models

We will also use the Studio UI to define the data models our application needs. This demo application utilizes two simple data models found within the tables file in our repository:

- posts - this will contain each message within our conversation exchange. The "body" field will house the message itself, while the "tag" field will keep track of who the message came from (user vs. bot). This model will use the "List" account relation, which means an authenticated account can have an unbounded number of instance documents that fall under this definition.
- profiles - this model will allow us to assign additional data to ourselves and our chatbot, including a name, username, and fun emoji. The "actor" subfield will be used to differentiate between the user (using the value "human") and your chatbot (using the value "robot"). In contrast to posts, this model will use the "Set" account relation based on the "actor" subfield, which means an account can have exactly one instance document for a given value of "actor". For example, this ensures that our application won't allow us to accidentally create more than one document with an "actor" subfield matching "human".
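To make the difference between the two account relations concrete, here is a dependency-free TypeScript sketch. The field names mirror the models above, but the Map-based enforcement is illustrative only, not Ceramic's actual mechanics:

```typescript
// Shapes mirroring the two models described above (illustrative only).
type Post = { body: string; tag: "user" | "bot" };
type Profile = { name: string; username: string; emoji: string; actor: "human" | "robot" };

// "List" relation: an account may hold any number of post documents.
const posts: Post[] = [];
posts.push({ body: "hello", tag: "user" });
posts.push({ body: "hi there", tag: "bot" });

// "Set" relation on "actor": at most one profile per actor value.
// Writing again with the same actor replaces the document rather than adding one.
const profiles = new Map<Profile["actor"], Profile>();
function upsertProfile(p: Profile): void {
  profiles.set(p.actor, p); // upsert keyed on the Set subfield
}
upsertProfile({ name: "Me", username: "me", emoji: "🙂", actor: "human" });
upsertProfile({ name: "Bot", username: "bot", emoji: "🤖", actor: "robot" });
upsertProfile({ name: "Me2", username: "me2", emoji: "😎", actor: "human" }); // replaces, no duplicate

console.log(posts.length, profiles.size); // 2 2
```

Note how the third `upsertProfile` call overwrites the existing "human" entry, which is exactly the guarantee the "Set" relation provides.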

To start creating the models, navigate to "Model builder" from the Studio navigation. You can start by defining your "posts" table. After clicking "Create Model" you will be able to view the model ID:

Copy this value and assign it to NEXT_PUBLIC_POST_ID in your .env file.

Go through the same steps for your "profiles" table. However, be sure to select the "Set" option under "Account relation". Copy the resulting model ID and assign it to NEXT_PUBLIC_PROFILE_ID in your .env file.
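If you have followed each step so far, your .env file should now contain an entry for every value collected above. A sketch of the finished file (the values shown are placeholders; substitute your own keys and IDs):

```
OPENAI_API_KEY="sk-..."
NEXT_PUBLIC_PROJECT_ID="<your WalletConnect project ID>"
NEXT_PUBLIC_CONTEXT_ID="<your OrbisDB context ID>"
NEXT_PUBLIC_ENV_ID="<your OrbisDB environment ID>"
NEXT_PUBLIC_POST_ID="<your posts model ID>"
NEXT_PUBLIC_PROFILE_ID="<your profiles model ID>"
```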

Application Architecture

As mentioned above, the OrbisDB SDK makes it easy to instantiate clients, authenticate users, and run queries using the same library. As you'll note in the application repository, there are various components that need to be able to access the state of the authenticated user. While we're wrapping all components of our application within a WagmiConfig contextual wrapper (which will allow us to leverage Wagmi's hooks to see if a user's wallet is connected - learn more about this in our WalletConnect Tutorial), we also need a way to know if the user has an active OrbisDB session.

While there are multiple ways to facilitate this, our application uses Zustand for state management to circumvent the need for contextual wrappers or prop drilling.

If you take a look at the store file you can see how we've set up four state variables (two of which are methods) and incorporated the OrbisDB SDK to authenticate users and alter the state of orbisSession:

```typescript
type Store = {
  orbis: OrbisDB;
  orbisSession?: OrbisConnectResult | undefined;
  // setAuth returns a promise
  setAuth: (
    wallet: GetWalletClientResult | undefined
  ) => Promise<OrbisConnectResult | undefined>;
  setOrbisSession: (session: OrbisConnectResult | undefined) => void;
};

const StartOrbisAuth = async (
  walletClient: GetWalletClientResult,
  orbis: OrbisDB
): Promise<OrbisConnectResult | undefined> => {
  if (walletClient) {
    const auth = new OrbisEVMAuth(window.ethereum!);
    // This option authenticates and persists the session in local storage
    const authResult: OrbisConnectResult = await orbis.connectUser({
      auth,
    });
    if (authResult.session) {
      console.log("Orbis Auth'd:", authResult.session);
      return authResult;
    }
  }
  return undefined;
};

const useStore = create<Store>((set) => ({
  orbis: new OrbisDB({
    ceramic: {
      gateway: "https://ceramic-orbisdb-mainnet-direct.hirenodes.io/",
    },
    nodes: [
      {
        gateway: "https://studio.useorbis.com",
        env: ENV_ID,
      },
    ],
  }),
  orbisSession: undefined,
  setAuth: async (wallet) => {
    if (wallet) {
      try {
        const auth = await StartOrbisAuth(wallet, useStore.getState().orbis);
        set((state: Store) => ({
          ...state,
          orbisSession: auth,
        }));
        return auth;
      } catch (err) {
        console.error(err);
      }
    } else {
      set((state: Store) => ({
        ...state,
        orbisSession: undefined,
      }));
    }
  },
  setOrbisSession: (session) =>
    set((state: Store) => ({
      ...state,
      orbisSession: session,
    })),
}));
```

As you can see, we've hard-coded the Ceramic and OrbisDB gateways, whereas we've imported our environment ID that we previously assigned as an environment variable.

Our navbar component sits at the same or greater level as all of our child components and includes our Web3Modal widget. You can see how we're using a useEffect hook to check if our session is active and either set our "loggedIn" state variable as true or false. This result determines if we generate a new session for the user by leveraging our setAuth method from our Zustand store, or if we simply set our orbisSession as the value of our valid active session.

Back in the home page component you can see how we're conditionally rendering our MessageList child component based on whether we have both an active orbis session AND the user's wallet is connected (allowing us to access their address).

Reading Data

The message list and userform component files are responsible for performing the majority of writes and reads to OrbisDB. If you navigate to the message list component for example, take a look at how we've imported our client-side environment variables to identify our post and profile models, as well as our context ID. When this component is rendered, the useEffect hook first invokes the "getProfile" method:

```typescript
const getProfile = async (): Promise<void> => {
  try {
    const profile = orbis
      .select("controller", "name", "username", "emoji", "actor")
      .from(PROFILE_ID)
      .where({ actor: ["human"] })
      .context(CONTEXT_ID);
    const profileResult = await profile.run();
    if (profileResult.rows.length) {
      console.log(profileResult.rows[0]);
      setProfile(profileResult.rows[0] as Profile);
    } else {
      // take the user to the profile page if no profile is found
      window.location.href = "/profile";
    }
    await getRobotProfile(profileResult.rows[0] as Profile);
  } catch (error) {
    console.error(error);
    return undefined;
  }
};
```

Notice how we've constructed a .select query off of our OrbisDB instance (provided by our Zustand store), asking for the corresponding values for the 5 columns we want data for.

Next, we need to notate which data model we want our query to reference, which is where we use .from with our profile model ID as the value.

We also only want the records where the profile is for the human user, indicated on the following line.

Finally, we use the context ID that corresponds to this project as the final value that's appended to the query.

If a corresponding profile exists, we then invoke the getRobotProfile method to obtain our chatbot's information. If it does not exist, we take the users to the profiles page so they can create one.
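The select/from/where/context chain described above can be pictured with a toy, dependency-free query builder. This is a sketch of the fluent pattern only, running against an in-memory table map; it is not the real OrbisDB SDK, and the table and DID values are hypothetical:

```typescript
// A toy fluent builder mirroring the select/from/where/context chain above.
type Row = Record<string, unknown>;

class ToyQuery {
  private cols: string[] = [];
  private table = "";
  private filter: Record<string, unknown[]> = {};
  private ctx = "";

  select(...cols: string[]): this { this.cols = cols; return this; }
  from(table: string): this { this.table = table; return this; }
  where(filter: Record<string, unknown[]>): this { this.filter = filter; return this; }
  context(ctx: string): this { this.ctx = ctx; return this; }

  // Run against an in-memory table map instead of a real node.
  run(data: Record<string, Row[]>): { rows: Row[] } {
    const rows = (data[this.table] ?? []).filter((row) =>
      Object.entries(this.filter).every(([k, allowed]) => allowed.includes(row[k]))
    );
    // Project only the selected columns.
    return {
      rows: rows.map((r) => Object.fromEntries(this.cols.map((c) => [c, r[c]]))),
    };
  }
}

const data = {
  profiles: [
    { controller: "did:a", name: "Me", actor: "human" },
    { controller: "did:b", name: "Bot", actor: "robot" },
  ],
};

const result = new ToyQuery()
  .select("controller", "name", "actor")
  .from("profiles")
  .where({ actor: ["human"] })
  .context("my-context")
  .run(data);

console.log(result.rows); // only the "human" profile survives the where clause
```

Each chained method returns `this`, which is what makes the fluent style possible; the real SDK works the same way at the call-site level.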

Writing Data

Let's take a quick look at an example of data mutations. Within the same message list component you will find a method called createPost which is invoked each time the user creates a new message:

```typescript
const createPost = async (
  thisPost: string
): Promise<PostProps | undefined> => {
  try {
    await orbis.getConnectedUser();
    const query = await orbis
      .insert(POST_ID)
      .value({
        body: thisPost,
        created: new Date().toISOString(),
        tag: "user",
        edited: new Date().toISOString(),
      })
      .context(CONTEXT_ID)
      .run();
    if (query.content && profile) {
      const createdPost: PostProps = {
        id: query.id,
        body: query.content.body as string,
        profile,
        tag: query.content.tag as string,
        created: query.content.created as string,
        authorId: query.controller,
      };
      return createdPost;
    }
  } catch (error) {
    console.error(error);
    return undefined;
  }
};
```

While this looks similar to the syntax we use to read data, there are a few differences.

First, take a look at the first line under the "try" statement - we're calling getConnectedUser() off of our OrbisDB prototype chain to ensure that our active session is applied. This is necessary to run mutation queries, whereas it's not a necessary step for reading data.

You can also see that we've swapped out the .select and .from statements for .insert which references the model ID we want to use, thus creating a new row in the corresponding table.

Finally, we're referencing the user's message value for the body while ensuring we tag the message as coming from the "user" before running the query and checking on its success status.

Running the Application in Developer Mode

We're now ready to boot up our application!

In your terminal, go ahead and start the application in developer mode:

```shell
nvm use 20
npm run dev
```

Navigate to http://localhost:3000/ in your browser. You should see the following:

Go ahead and click on "Connect Wallet." You should see a secondary authentication message appear after you connect your wallet:

Signing this message creates an authenticated session (using orbis.connectUser() from our Zustand store). You can check the value of this session by navigating to the "Application" tab in your browser and looking for the orbis:session key pair:

Given that you have not yet created any messages, the application should automatically direct you to the /profiles page where you can assign identifiers to yourself and your chatbot:

Finally, navigate back to the homepage to begin exchanging messages with your chatbot. Notice how the values from your corresponding profiles appear next to the messages:

How Could this Application be Improved?

Since our message history is being written and queried based on static values (for example, assigning messages to the "user" tag), you'll notice that the same conversation history appears when self-authenticating with a different wallet address and creating a new session.
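A minimal sketch of the issue, using the authorId field the application already records (the message bodies and DID values are hypothetical):

```typescript
// Why static tags leak history across accounts, and how keying on the
// authenticated controller (authorId) would isolate conversations.
type Message = { body: string; tag: string; authorId: string };

const messages: Message[] = [
  { body: "hi from wallet A", tag: "user", authorId: "did:pkh:a" },
  { body: "hi from wallet B", tag: "user", authorId: "did:pkh:b" },
];

// Current behavior: filtering on the static tag returns BOTH users' messages.
const byTag = messages.filter((m) => m.tag === "user");

// Improved behavior: also filter on the authenticated account's id.
const byAuthor = messages.filter(
  (m) => m.tag === "user" && m.authorId === "did:pkh:a"
);

console.log(byTag.length, byAuthor.length); // 2 1
```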

As a challenge, think about ways to change the application design to improve this experience:

- Tagging the profiles and messages with values that align with actual authenticated accounts instead of static ones
- Altering our message data model and application to accommodate different chat contexts, allowing a user to have different conversation histories

Next Steps

We hope you've enjoyed this tutorial and learned something new about how to configure and incorporate OrbisDB into your application! While this concludes our walk-through, there are other possibilities Ceramic has to offer:

Join the Ceramic Discord

Follow Ceramic on X

Follow Orbis on X

Start Building with Ceramic


GS1

Jeju SamDaSoo mineral water aiming for “top” levels of efficiency and sustainability

By putting a QR Code powered by GS1 on every bottle cap, Korean water bottler JPDC is going label-less

Recent regulations are pushing Korean beverage companies to remove labels from their bottles as part of an initiative to use less plastic and make recycling easier.

Information that was previously on the labels of Jeju SamDaSoo mineral water is now available simply by scanning the QR Code with GS1 Digital Link on the bottle cap.

Beyond being compliant with national laws, the company is seeing improved engagement with consumers, better inventory management and more.

case-study-gs1-korea-jpdc.pdf

Thursday, 27. June 2024

EdgeSecure

Navigating the New Landscape of GLBA Compliance: Key Changes to Protect Your Federal Financial Aid


Webinar
Thursday, July 25, 2024
10 AM ET

For higher education institutions offering financial aid to students, the Gramm-Leach-Bliley Act, or GLBA, means your institution is required to meet compliance standards for the security and protection of financial information, and to provide transparency related to how personal information is used and shared. Failure to meet these standards carries significant risk for institutions, including restrictions or loss of eligibility for Title IV funding. In this session, we’ll review how the latest revisions to GLBA compliance standards, aligned with the NIST 800-171 revision 3, will impact higher education. Our privacy and compliance experts will review how these revisions increase the compliance burden for institutions, and key steps that institutions can take to meet the new standard and maintain compliance to receive federal financial aid support.

Register Now »

The post Navigating the New Landscape of GLBA Compliance: Key Changes to Protect Your Federal Financial Aid appeared first on NJEdge Inc.


OpenID

Public Review Period for Three Shared Signals Drafts


The OpenID Shared Signals Working Group recommends approval of the following three specifications as OpenID Implementer’s Drafts:

- Shared Signals Framework Draft 03 (other formats: TXT, XML, MD)
- CAEP Draft 03 (other formats: TXT, XML, MD)
- CAEP Interoperability Profile Draft 00 (other formats: TXT, XML, MD)

An Implementer’s Draft is a stable version of a specification providing intellectual property protections to implementers of the specification. This note starts the 45-day public review period for the specification drafts in accordance with the OpenID Foundation IPR policies and procedures. Unless issues are identified during the review that the Working Group believes must be addressed by revising the drafts, this review period will be followed by a seven-day voting period during which OpenID Foundation members will vote on whether to approve these drafts as OpenID Implementer’s Drafts. For the convenience of members who have completed their reviews by then, voting will actually begin a week before the start of the official voting period. The relevant dates are:

- Implementer’s Draft public review period: Thursday, June 27, 2024 to Sunday, August 11, 2024 (45 days)
- Implementer’s Draft vote announcement: Monday, July 29, 2024
- Implementer’s Draft early voting opens: Monday, August 5, 2024 *
- Implementer’s Draft voting period: Monday, August 12, 2024 to Monday, August 19, 2024 (7 days) *

* Note: Early voting before the start of the formal voting period will be allowed.

The Shared Signals working group page is https://openid.net/wg/sharedsignals.

Information on joining the OpenID Foundation can be found at https://openid.net/foundation/members/registration. If you’re not a current OpenID Foundation member, please consider joining to participate in the approval vote.

You can send feedback on the specifications in a way that enables the working group to act upon it by (1) signing the contribution agreement at https://openid.net/intellectual-property/ to join the working group (please specify that you are joining the “Shared Signals” working group on your contribution agreement), (2) joining the working group mailing list at https://lists.openid.net/mailman/listinfo/openid-specs-risc, and (3) sending your feedback to the list. 

Marie Jordan – OpenID Foundation Board Secretary

Update: July 1, 2024:

The SSWG has now released an overview of the changes found in the 3 drafts released on June 27, 2024.


The post Public Review Period for Three Shared Signals Drafts first appeared on OpenID Foundation.


Hyperledger Foundation

Building Bridges: Developing the Stellar Connector for Hyperledger Cacti

Introduction: The Importance of Interoperability in Blockchain



Oasis Open Projects

Invitation to comment on Data Model for Lexicography v1.0


Third public review - ends July 27th

OASIS and the OASIS Lexicographic Infrastructure Data Model and API (LEXIDMA) TC are pleased to announce that Data Model for Lexicography Version 1.0 is now available for public review and comment. This 30-day review is the third public review for this specification.

About the specification draft:

The LEXIDMA TC’s high level purpose is to create an open standards based framework for internationally interoperable lexicographic work. Data Model for Lexicography v1.0 describes and defines standard serialization independent interchange objects based predominantly on state of the art in the lexicographic industry. The TC aims to develop the lexicographic infrastructure as part of a broader ecosystem of standards employed in Natural Language Processing (NLP), language services, and Semantic Web.

This document defines the first version of a data model in support of these technical goals, including:
– A serialization-independent Data Model for Lexicography (DMLex)
– An XML serialization of DMLex
– A JSON serialization of DMLex
– A relational database serialization of DMLex
– An RDF serialization of DMLex
– An informative NVH serialization of DMLex

The documents and related files are available here:

Data Model for Lexicography (DMLex) Version 1.0
Committee Specification Draft 03
12 June 2024

PDF (Authoritative):
https://docs.oasis-open.org/lexidma/dmlex/v1.0/csd03/dmlex-v1.0-csd03.pdf
HTML:
https://docs.oasis-open.org/lexidma/dmlex/v1.0/csd03/dmlex-v1.0-csd03.html
PDF marked with changes since previous public review:
https://docs.oasis-open.org/lexidma/dmlex/v1.0/csd03/dmlex-v1.0-csd03-DIFF.pdf

For your convenience, OASIS provides a complete package of the specification document and any related files in ZIP distribution files. You can download the ZIP file at:
https://docs.oasis-open.org/lexidma/dmlex/v1.0/csd03/dmlex-v1.0-csd03.zip

How to Provide Feedback

OASIS and the LEXIDMA TC value your feedback. We solicit input from developers, users and others, whether OASIS members or not, for the sake of improving the interoperability and quality of our technical work.

The public review starts 28 June 2024 at 00:00 UTC and ends 27 July 2024 at 23:59 UTC.

Comments may be submitted to the TC by any person directly at:
Technical-Committee-Comments@oasis-open.org
Please use a subject line like “Comment on Data Model for Lexicography”.

Comments submitted for this work and for other work of this TC are publicly archived and can be viewed at:
https://groups.google.com/a/oasis-open.org/g/technical-committee-comments/.
Previous comments on LEXIDMA works are archived at https://lists.oasis-open.org/archives/lexidma-comment/.

All comments submitted to OASIS are subject to the OASIS Feedback License, which ensures that the feedback you provide carries at least the same obligations as those of the TC members. In connection with this public review, we call your attention to the OASIS IPR Policy [1], especially as it applies [2] to the work of this technical committee. All members of the TC should be familiar with this document, which may create obligations regarding the disclosure and availability of a member’s patent, copyright, trademark and license rights that read on an approved OASIS specification.

OASIS invites any persons who know of any such claims to disclose these if they may be essential to the implementation of the above specification, so that notice of them may be posted to the notice page for this TC’s work.

Additional information about the specification and the LEXIDMA TC can be found at the TC’s public home page:
https://www.oasis-open.org/committees/lexidma/

Additional information related to this public review, including a complete publication and review history, can be found in the public review metadata document [3].

========== Additional references:

[1] https://www.oasis-open.org/policies-guidelines/ipr/

[2] https://www.oasis-open.org/committees/lexidma/ipr.php
https://www.oasis-open.org/policies-guidelines/ipr/#Non-Assertion-Mode
Non-Assertion Mode

[3] Public review metadata document:
https://docs.oasis-open.org/lexidma/dmlex/v1.0/csd03/dmlex-v1.0-csd03-public-review-metadata.html

The post Invitation to comment on Data Model for Lexicography v1.0 appeared first on OASIS Open.


OASIS Approves Universal Business Language V2.4 Standard for Global Business Transactions

Boston, MA – 27 June 2024 – OASIS Open, the international open source and standards consortium, announced the approval of Universal Business Language (UBL) V2.4 as an OASIS Standard, a status that signifies the highest level of ratification. Developed by the UBL Technical Committee (TC), UBL V2.4 is the leading interchange format for business documents, […]

IBM, Logius, Orbex Global Markets, Publications Office of the European Union, U.S. Department of Defense (DoD), and Others Advance Open Standard for Enhanced Interoperability and Efficiency in Supply Chain and Digital Trade

Boston, MA – 27 June 2024 – OASIS Open, the international open source and standards consortium, announced the approval of Universal Business Language (UBL) V2.4 as an OASIS Standard, a status that signifies the highest level of ratification. Developed by the UBL Technical Committee (TC), UBL V2.4 is the leading interchange format for business documents, revolutionizing global business transactions with its latest version. 

The standard works seamlessly with frameworks like ISO/IEC 15000 (ebXML), extending the benefits of Electronic Data Interchange (EDI) systems to businesses worldwide. UBL V2.4 maintains backward compatibility with earlier V2.# versions while introducing new business document types, now totaling 93. The European Union has recognized UBL’s significance by declaring it officially eligible for referencing in tenders from public administrations. 

“UBL 2.4 represents a significant advancement, featuring enhanced support for B2C transactions, which will greatly benefit businesses and consumers alike,” said Kenneth Bengtsson, Chair of the UBL TC. “Additionally, it offers improved alignment with U.S. tax models, ensuring compliance and facilitating smoother transactions. These enhancements reaffirm our commitment to evolving and adapting UBL to meet the ever-changing needs of global commerce.”

As global sustainability efforts increase, UBL will expand its utility to encompass circular data exchange, reflecting the evolving needs of modern commerce. Looking ahead to the development of UBL V2.5, the TC will integrate circular economy data elements, marking a transformative step towards embedding sustainability into global supply chain data exchange. 

The UBL TC is forming a new UBL Commodities Subcommittee (SC), which aims to streamline electronic transactions for raw materials, recycled goods, and agricultural products in global supply chains, with the goal of improving efficiency, transparency, sustainability, and reliability in commodity markets. The SC will standardize UBL document types and semantic library entries for global commodity trading and procurement processes.

The UBL TC encourages global collaboration and actively seeks input from stakeholders to ensure the success of UBL as a cornerstone for sustainability data exchange. The TC welcomes a diverse range of contributors, including ERP vendors; software and service providers; national, regional and local public authorities; procurement and trade communities; e-invoicing networks; supply chain communities; and logistics and transportation companies. Participation is open to all through membership in OASIS, with interested parties encouraged to join and contribute to shaping the future of structured business document exchange. Contact join@oasis-open.org for more information.  

The post OASIS Approves Universal Business Language V2.4 Standard for Global Business Transactions appeared first on OASIS Open.

Wednesday, 26. June 2024

FIDO Alliance

FIDO APAC Summit 2024 Announces Keynotes, Speakers, and Sponsors

The FIDO Alliance is thrilled to announce the lineup for its highly anticipated second FIDO APAC Summit, set to take place at the JW Marriott Kuala Lumpur on September 10-11, […]

The FIDO Alliance is thrilled to announce the lineup for its highly anticipated second FIDO APAC Summit, set to take place at the JW Marriott Kuala Lumpur on September 10-11, 2024. Co-hosted by SecureMetric Technology and supported by Malaysia Digital Economy Corporation (MDEC) and CyberSecurity Malaysia, this premier event is dedicated to advancing phishing-resistant FIDO authentication across the region under the theme, “Unlocking a Secure Tomorrow.”

The summit will feature keynote addresses by notable leaders such as Gobind Singh Deo, Malaysia’s Minister of Digital; Dato’ Dr. Amirudin Abdul Wahab, CEO of CyberSecurity Malaysia; TS. Mohamed Kheirulnaim Mohamed Danial, Senior Assistant Director of National Cyber Coordination and Command Centre (NC4) & National Cyber Security Agency (NACSA); Andrew Shikiar, CEO & Executive Director of FIDO Alliance; and Edward Law, CEO of Securemetric. 

They will be joined by a distinguished roster of speakers including Christiaan Brand, Product Manager: Identity and Security at Google; Eiji Kitamura, Developer Advocate at Google; Henry (Haixin) Chai, CEO of GMRZ Technology / Lenovo; Hyung Chul Jung, Head of Security Engineering Group at Samsung Electronics; Khanit Phatong, Senior Management Officer at Thailand Electronic Transactions Development Agency; Masao Kubo, Manager of Product Design Department at NTT DOCOMO; Naohisa Ichihara, CISO at Mercari; Niharika Arora, Developer Relations Engineer at Google; Sea Chong Seak, CTO at SecureMetric; Simon Trac Do, CEO & Founder of VinCSS; Takashi Hosono, General Manager at SBI Sumishin Net Bank; Yan Cao, Engineering Manager at TikTok; and Hao-Yuan Ting, Senior Systems Analyst at Taiwan Ministry of Digital Affairs.

The updated list of speakers can be found here.

Among the speakers, Tin Nguyen, a former U.S. Marine and FBI Special Agent, now a cybersecurity expert, will discuss the benefits of passwordless authentication and how it enhances organizational defenses against cyber threats. “Cybercriminals continuously search for vulnerabilities to take advantage of. Therefore, it is imperative for organizations to implement strong cybersecurity measures to safeguard their users,” says Nguyen. “Implementing FIDO-based passkeys provides an extra layer of security, mitigating potential threats without compromising user experience.”

The event promises to attract hundreds of attendees and will feature keynote addresses, panel discussions, technical workshops, and an expo hall showcasing the latest innovations from leading technology companies such as Securemetric, VinCSS, OneSpan, iProov, Thales, AirCuve, Zimperium, RSA, Yubico, Identiv, Utimaco, FETIAN, and many more. Attendees will have the opportunity to explore the latest trends in cybersecurity, network with top industry minds, and gain invaluable knowledge on implementing FIDO standards for enhanced security.

“The FIDO Alliance is thrilled to host its second FIDO APAC Summit 2024 in Malaysia, featuring presentations from some of the brightest minds in authentication from the APAC region and beyond,” said Andrew Shikiar, Executive Director and CEO of the FIDO Alliance. “With the continuous rise in the volume and sophistication of cyber-attacks, it is crucial for organizations to move past passwords and adopt passkeys, a user-friendly alternative based on FIDO standards.”

Registrations are now open to the public. For more information and to register, please visit www.fidoapacsummit.com. For sponsorship opportunities, please contact events@fidoalliance.org.

About the FIDO Alliance 

The FIDO (Fast IDentity Online) Alliance, www.fidoalliance.org, was formed in July 2012 to address the lack of interoperability among strong authentication technologies and remedy the problems users face with creating and remembering multiple usernames and passwords. The FIDO Alliance is changing the nature of authentication with standards for simpler, stronger authentication that define an open, scalable, interoperable set of mechanisms that reduce reliance on passwords. FIDO Authentication is stronger, private, and easier to use when authenticating to online services.

PR Contact 

press@fidoalliance.org


Me2B Alliance

Do SDKs Represent Actual Network Traffic in EdTech Apps?

1. Background  In 2022, Internet Safety Labs (ISL) conducted an extensive benchmark of EdTech apps used in schools across the United States. We sampled 13 schools in each state and the District of Columbia and identified 1,722 unique apps in use in K-12 schools. During the benchmark, the apps were evaluated and scored on […]
1. Background 

In 2022, Internet Safety Labs (ISL) conducted an extensive benchmark of EdTech apps used in schools across the United States. We sampled 13 schools in each state and the District of Columbia and identified 1,722 unique apps in use in K-12 schools. During the benchmark, the apps were evaluated and scored on their behaviors related to safety. As part of the safety evaluation, SDKs in each app were identified, and researchers collected network traffic for 1,357 apps. In total, there were 275 unique SDKs across the apps, and 8,168 unique subdomains (3,211 unique domains) in the network traffic.  

A key research question in conducting the 2022 EdTech benchmark was to determine how accurate SDKs were as a proxy for actual third-party data sharing, since network traffic data collection is somewhat labor-intensive. This report shares the results of the analysis. 

2. Analysis 

The basis of the analysis was to compare the “expected” third parties, based on the company owners of the SDKs, with the companies observed in the network traffic. This required identifying the owner companies for both the SDKs and all the subdomains observed in the aggregate network traffic.¹ 

Researchers first identified which SDKs were in use in apps, using AppFigures as a resource. In total, 275 unique SDKs were found in use across all apps.  Next, researchers identified the companies that published these SDKs. For each app, the number of unique company owners of its SDKs is referred to as the “expected” number of companies to receive data.  

Next, researchers performed a similar analysis on the subdomains observed in the network traffic (1,175 total apps). Each subdomain was resolved to an “owner” company.  Subdomains were identified from HTTP POST/GET requests captured in the network traffic. 
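A minimal sketch of this resolution step, assuming a simple lookup table (the table contents and helper names here are hypothetical; ISL's actual pipeline uses a curated domain-ownership database such as its Domain Risk Dictionary, and a robust implementation would use the Public Suffix List rather than a naive last-two-labels rule):

```python
from urllib.parse import urlparse

# Hypothetical owner lookup; a real analysis would use a curated
# domain-ownership database (e.g. ISL's Domain Risk Dictionary).
OWNER_BY_DOMAIN = {
    "doubleclick.net": "Google",
    "facebook.com": "Meta",
    "crashlytics.com": "Google",
}

def registrable_domain(subdomain: str) -> str:
    """Naive eTLD+1: keep the last two labels. A real pipeline should
    use the Public Suffix List to handle suffixes like .co.uk."""
    return ".".join(subdomain.split(".")[-2:])

def owners_from_requests(urls):
    """Resolve each captured HTTP GET/POST URL to an owner company."""
    owners = set()
    for url in urls:
        host = urlparse(url).hostname or ""
        owner = OWNER_BY_DOMAIN.get(registrable_domain(host))
        if owner:
            owners.add(owner)
    return owners
```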

We then performed two quantitative analyses: (1) we examined the network traffic of apps with at least one SDK (n=1,083 apps), and (2) we examined the network traffic of apps with no SDKs (n=92 apps).  
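The per-app comparison described above reduces to simple set arithmetic: intersect the "expected" companies (from SDK ownership) with the companies observed in traffic, and count what remains on each side. A sketch under those assumptions (the function name and return shape are ours, not ISL's):

```python
def sdk_overlap(expected_companies, observed_companies):
    """Compare companies 'expected' from an app's SDKs with the
    companies actually observed in its network traffic."""
    expected = set(expected_companies)
    observed = set(observed_companies)
    seen = expected & observed          # expected companies actually seen
    unexpected = observed - expected    # traffic to companies no SDK predicted
    return {
        "expected": len(expected),
        "expected_seen": len(seen),
        "unexpected_seen": len(unexpected),
        "total_seen": len(observed),
    }
```

Averaging these per-app counts across the benchmark yields figures like those in Table 1 (e.g. 4.7 expected companies, of which 1.7 seen).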

2.1   Apps With at Least One SDK 

Apps with at least one SDK communicated with an average of 10.1 companies based on observed network traffic (Table 1).  

2.1.1   “Expected” Companies in Network Traffic 

In apps with at least one SDK, the SDKs represented an average of 4.7 unique companies; thus, there were 4.7 “expected” companies to receive data. However, on average, only 1.7 (or 36.2%) of the “expected” companies were seen in the network traffic of apps with at least one SDK (Table 1). 

Note that there are several contributing factors that could account for this, including: 

The manual testing performed by the researchers was unstructured and therefore had inconsistencies across researchers.

The manual testing didn’t exercise all functions in the app. For instance, the testers did not make any optional purchases or upgrade to a premium version.

Table 1: Apps containing at least one SDK (n=1,083)

                                             Avg Expected  Avg Expected    Avg # Unexpected  Avg Total # of
                                             Companies     Companies Seen  Companies Seen    Companies Seen
Webview – With (n=609)                       5.0           1.9             12.6              14.5
Webview – Without (n=474)                    4.3           1.4             2.6               4.0
Advertisements – With (n=189)                5.6           2.1             24.0              26.1
Advertisements – Without (n=894)             4.5           1.6             5.0               6.6
Behavioral Advertisements – With (n=105)     5.4           2.1             33.7              35.8
Behavioral Advertisements – Without (n=978)  4.6           1.6             5.5               7.1
ALL Tested Apps With 1+ SDK (n=1,083)        4.7           1.7             8.4               10.1
2.1.2   “Unexpected” Companies in Network Traffic 

Additionally, as seen in Table 1, these apps communicated with an average of 8.4 unexpected companies.  

As expected, apps that used Webview,² had advertisements, or served behavioral ads all had even higher average numbers of unexpected companies, with apps with behavioral ads having the highest at 33.7 unexpected companies on average.³ The ISL app scoring rubric regards the use of Webview and the inclusion of advertising as very high risks for K-12 students, and the data in Table 1 reinforces the rubric.  

Apps with at least one SDK that use Webview had 2.6 times as many third parties as apps with at least one SDK that don’t use Webview.

Apps with at least one SDK that include ads had 3.0 times as many third parties as apps with at least one SDK that don’t include ads.

Apps with at least one SDK that include behavioral ads had 4.0 times as many third parties as apps with at least one SDK that don’t include behavioral ads.

2.2   Apps with No SDKs 

There were 92 apps in the data set that had no SDKs and for which we had network traffic. Since these apps had no SDKs, there were no “expected” companies to receive data from the app (other than the app developer, of course).  

Apps with no SDKs averaged 4.6 companies observed in network traffic, fewer than half the 10.1 average for apps with at least one SDK. The gap holds within subgroups: apps with no SDKs that use Webview, or that include advertising or behavioral advertising, had markedly fewer observed companies than their SDK-bearing counterparts (Table 2).  

Apps with no SDKs that use Webview had 44.1% fewer observed companies.

Apps with no SDKs that include advertising had 40.6% fewer observed companies.

Apps with no SDKs that include behavioral advertising had 21.0% fewer observed companies.

Table 2: Apps with no SDKs (n=92)

                                      Avg # of Companies Seen
Webview – With (n=43)                 8.1
Webview – Without (n=49)              1.6
Advertising – With (n=11)             15.5
Advertising – Without (n=81)          3.2
Behavioral Ads – With (n=4)           28.3
Behavioral Ads – Without (n=88)       3.6
All Tested Apps Without SDKs (n=92)   4.6
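Each "% fewer" figure is the percentage reduction of the no-SDK subgroup average relative to the corresponding subgroup average for apps with at least one SDK (Tables 1 and 2). A minimal check, with the averages copied from the tables and a helper name of our own:

```python
def pct_fewer(no_sdk_avg: float, with_sdk_avg: float) -> float:
    """Percent reduction in average observed companies for no-SDK apps
    relative to apps with at least one SDK in the same subgroup."""
    return round((1 - no_sdk_avg / with_sdk_avg) * 100, 1)

# Subgroup averages: (no-SDK apps, apps with 1+ SDK)
webview = pct_fewer(8.1, 14.5)        # Webview subgroup
ads = pct_fewer(15.5, 26.1)           # advertising subgroup
behavioral = pct_fewer(28.3, 35.8)    # behavioral-ads subgroup
```

The behavioral-ads figure computes to roughly 20.9% from the rounded table values; the report's 21.0% presumably reflects the unrounded underlying data.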
3. Conclusion
3.1   SDKs as a Proxy for Third Party Sharing

As the data shows, SDKs aren’t a useful proxy for the actual number of third parties receiving data from the app. Moreover, apps that include ads or that use Webview will likely have significantly more third parties than apps without.  

This means that viable measurement of third parties receiving data from apps requires testing and observation of network traffic. ISL used mostly manual methods for the collection of this data but automated methods would be extremely beneficial for ongoing and pervasive measuring of app third party sharing.  

SDKs do provide value in identifying potential omissions in the manual testing process. Can we account for the specific SDKs that don’t appear in the network traffic? Did we miss a particular functional branch of the app that we should go back and test? Or might it be an indication of an error in the SDK database? So while SDKs don’t serve as a perfect indication of the third parties communicating with the app, they still provide valuable information, and as such, they will remain in our app safety labels (see https://appmicroscope.org/).  

3.2   Validation of ISL App Scoring Rubric 

As shown in section 2, use of Webview and the inclusion of advertising substantially increase user exposure to data sharing with more third parties. This finding reinforces the ISL app scoring rubric wherein the use of Webview and presence of advertising are indicators for very high risk. 

4. Helpful Links 

App Microscope 

SDK Risk Dictionary 

Domain Risk Dictionary 

Company Risk Dictionary

 

Footnotes:

1. See the SDK Risk Dictionary and the Subdomain Risk Dictionary for details.

2. Note: researchers determined the use of Webview manually, by observing third-party pages opening within the app. Thus, the presence of Webview as tagged in ISL’s AppMicroscope.org may not accurately assess Webview use for first-party web pages.

3. It would be interesting to study how many apps have behavioral ads and don’t use Webview.

The post Do SDKs Represent Actual Network Traffic in EdTech Apps? appeared first on Internet Safety Labs.


Elastos Foundation

The New Bretton Woods: How BeL2 Aims to Transform Global Finance using Native Bitcoin

In the records of financial history, few events have shaped the global economic landscape as profoundly as the establishment of the Bretton Woods system. In 1944, amidst the ruins of World War II, representatives from 44 Allied nations convened in Bretton Woods, New Hampshire, to create a new framework for international economic cooperation. The primary […]

In the records of financial history, few events have shaped the global economic landscape as profoundly as the establishment of the Bretton Woods system. In 1944, amidst the ruins of World War II, representatives from 44 Allied nations convened in Bretton Woods, New Hampshire, to create a new framework for international economic cooperation. The primary goal was to prevent the economic instability and competitive devaluations that had contributed to the Great Depression and the war.

The Bretton Woods system pegged major currencies to the US dollar, which was convertible to gold at a fixed rate. This effectively made the US dollar the world’s reserve currency, providing much-needed stability and fostering economic growth. However, by the late 1960s, the system began to unravel. The US faced mounting balance-of-payments deficits and dwindling gold reserves. On August 15, 1971, President Richard Nixon unilaterally ended the dollar’s convertibility to gold, effectively dismantling the Bretton Woods system and ushering in the era of fiat currencies.

The transition to fiat currencies, while offering greater flexibility for monetary policy, also introduced significant challenges. Governments could now print money without restraint, leading to inflation, currency devaluations, and a series of financial crises. Today, the world faces a staggering $307 trillion in debt, excessive currency issuance, declining bank credit, and rising economic instability. This backdrop underscores the need for a new financial paradigm, one that combines stability with the technological advancements of the digital age.

 

Bitcoin: Digital Gold

Bitcoin, created in 2009 by the pseudonymous Satoshi Nakamoto, was designed as a decentralised digital currency that could operate independently of central banks and governments. Often referred to as “digital gold,” Bitcoin possesses many qualities that make it an ideal candidate for a global reserve asset: it is scarce (with a cap of 21 million coins), durable, portable, and easily divisible. Bitcoin’s blockchain technology ensures transparency, security, and resistance to censorship, making it a robust vehicle to support fiat currencies and value exchange.

Despite its adoption and over $1 trillion in value, Bitcoin’s mainstream financial use faces challenges like scalability and programmability limitations. Its high decentralisation and security make transactions slower and resource-intensive. While its simplicity ensures robust security, it limits Bitcoin’s ability to handle complex transactions like digital agreements for loans or exchanges. Innovations like Ethereum, which introduced smart contracts in 2015, offer more functionality, leading to Layer 2 solutions aimed at uniting technologies and enhancing Bitcoin’s capabilities.

 

Bitcoin Layers

Layer 2 solutions are protocols built on top of a blockchain (Layer 1) to enhance performance and enable more complex functionalities. For Bitcoin, Layer 2 technologies like the Lightning Network and sidechains address issues of transaction speed, programmability and scalability, while bridges facilitate interoperability with other blockchain ecosystems. These solutions allow Bitcoin to interact with other blockchain ecosystems and innovations like Ethereum, enabling smart contracts and decentralised applications (DApps) that were previously not possible. However, there is a problem.

 

Inherent Problems

Scalability layers involve bridging Bitcoin off its main network and into these environments, creating security concerns that undermine its decentralised ethos. Wrapped Bitcoin (WBTC), for instance, is an ERC-20 token that represents Bitcoin on Ethereum networks. While it brings Bitcoin’s liquidity to more programmable finance platforms, it has several critical issues:

Centralisation Risk: WBTC requires users to trust centralised institutions to manage and safeguard the Bitcoin backing the WBTC. If these institutions act maliciously, users have no recourse, undermining Bitcoin’s decentralisation ethos.

Custodian Risk: The centralised custodians holding the actual Bitcoin can potentially be hacked or face regulatory pressures, putting users’ assets at risk.

Lack of Transparency: Users must rely on the custodians’ transparency regarding the actual reserves backing the WBTC, which may not always be reliable.

Recent cross-chain bridge hacks, such as the Nomad Bridge exploit, highlight these vulnerabilities. Chainalysis reports that $2 billion has been stolen in 13 cross-chain bridge hacks, accounting for 69% of total funds stolen in 2022, with North Korean-linked hackers stealing approximately $1 billion. Currently, there are more than 70 cross-chain bridges with over $25 billion locked and daily transaction volumes in the millions. Synapse, a popular cross-chain bridge, has surpassed $5 billion in transaction volume. Bridges are vulnerable due to their structure, combining custodians, debt issuers, and oracles, each presenting multiple attack vectors. For instance, the Poly Network and Wormhole attacks showcased vulnerabilities in cross-chain communication, resulting in significant losses.

To mitigate these risks, it is crucial for Bitcoin’s evolution to connect other innovation layers through information transmission rather than asset transfer, while decentralising staking on Bitcoin to avoid pooling assets together. This approach keeps Bitcoin native, secure, and decentralised, while enabling broader financial applications in scalable environments.

 

Native Bitcoin DeFi, Pioneering the New Bretton Woods System

Native Bitcoin refers to Bitcoin that remains on its main network while being collateralised for Layer 2 DeFi applications. Through BeL2, Bitcoin can participate in complex financial transactions without being transferred off the Bitcoin blockchain, maintaining its security and decentralisation. This allows Bitcoin to act as a versatile tool in DeFi ecosystems, leveraging its inherent strengths while expanding its functionality into smart services such as swaps, loans, and stablecoin issuance.

Staking: BeL2 employs non-custodial native Bitcoin staking in decentralised wallets, providing security for the network.

Zero Knowledge Proofs (ZKPs): These provide private and verifiable proofs on Bitcoin staking transactions.

BTC Oracle: Connects BTC proof information into Layer 2 smart contracts, facilitating Bitcoin DeFi services without moving assets off the main network.

Arbiter Network: Utilises decentralised and collateralised nodes to facilitate time-based execution and dispute resolution, enhancing trustless financial operations.

BeL2 transmits information, not assets, across chains, preserving Bitcoin’s security and integrity while enabling smart contracts and decentralised applications for complex financial transactions. This approach eliminates reliance on centralised entities and ensures secure, decentralised financial operations. By keeping Bitcoin native, BeL2 ensures that the original blockchain remains the ultimate trust anchor for all transactions, thereby maintaining the foundational principles of Bitcoin’s decentralised ethos.

 

Native Bitcoin Loan App Demo

In the context of BeL2’s transformative role in decentralised finance (DeFi), the BeL2 Loan Dapp Demo showcases its potential to enhance Bitcoin’s utility while maintaining security and decentralisation. This demo is the first native Bitcoin lending protocol built on StarkWare’s Cairo programming language, allowing users to lock their native Bitcoin as collateral without relying on Wrapped Bitcoin (WBTC) or cross-chain bridges. The BTC remains on the Bitcoin mainnet, ensuring non-custodial and non-liquidatable collateral.

Users lock their Bitcoin through a bespoke transaction script, and the loan terms, including interest rates and conditions for collateral release, are governed by a smart contract on the Ethereum Virtual Machine (EVM). BeL2’s arbiter network acts as an intermediary, facilitating communication between the Bitcoin and EVM chains and verifying transaction proofs. If a borrower, like Alice, fails to repay a loan, the lender, Bob, can retrieve the BTC. If Bob refuses to cooperate in unlocking the BTC after repayment, Alice can initiate arbitration, and the arbiter will co-sign to unlock the BTC.

This peer-to-peer system ensures fairness and security through zero-knowledge proofs and the arbiter network. Any malicious actions by the arbiter or parties involved are deterred by the ability to challenge and penalise misconduct, ensuring cooperation is the best outcome. This innovative approach enables Bitcoin holders to access liquidity while preserving Bitcoin’s core principles of decentralisation and security.
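The loan flow just described can be pictured as a small state machine: collateral locked, repayment (or default), then release with a co-signature from either the lender or, after arbitration, the arbiter. The sketch below is purely illustrative; every class, method, and state name is our own invention, and the real protocol runs as a Bitcoin locking script plus an EVM smart contract with zero-knowledge proofs, not Python:

```python
from enum import Enum, auto

class LoanState(Enum):
    COLLATERAL_LOCKED = auto()
    REPAID = auto()
    IN_ARBITRATION = auto()
    COLLATERAL_RELEASED = auto()
    DEFAULTED = auto()

class BeL2LoanSketch:
    """Hypothetical model of the described loan flow, for intuition only."""

    def __init__(self, borrower, lender, btc_collateral, due_block):
        self.borrower, self.lender = borrower, lender
        self.btc_collateral = btc_collateral
        self.due_block = due_block
        self.state = LoanState.COLLATERAL_LOCKED

    def repay(self, current_block):
        # Past the due height, the lender may claim the locked BTC.
        if current_block > self.due_block:
            self.state = LoanState.DEFAULTED
        else:
            self.state = LoanState.REPAID

    def release(self, lender_cosigns, arbiter_cosigns=False):
        # Unlocking the BTC needs a co-signature: the lender's in the
        # happy path, or the arbiter's after successful arbitration.
        if self.state not in (LoanState.REPAID, LoanState.IN_ARBITRATION):
            return
        if lender_cosigns or arbiter_cosigns:
            self.state = LoanState.COLLATERAL_RELEASED
        else:
            # Lender refuses after repayment: borrower escalates.
            self.state = LoanState.IN_ARBITRATION
```

In the Alice/Bob scenario above: Alice repays, Bob refuses to co-sign, Alice escalates, and the arbiter's co-signature unlocks the BTC.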

 

Principles for a New Native Bitcoin Bretton Woods System:

Decentralised Global Settlement: Native Bitcoin must act as a global settlement layer where all transactions are secured on Bitcoin’s main network, ensuring it remains the ultimate trust anchor for global finance.

Financial Innovation and Stability: By integrating native Bitcoin with smart contracts and DApps, we can support new financial products like BTC-backed loans and stablecoins, providing liquidity and stability to the global economy whilst uniting all layers.

Trustless and Transparent Operations: The implementation of information transmission through zero knowledge proofs and a decentralised arbiter network ensures trustless and transparent financial operations for native Bitcoin applications, reducing counterparty risk and enhancing transaction integrity.

BeL2’s vision is to become the defining piece of native Bitcoin infrastructure for a new Bretton Woods system. Emerging from the Elastos SmartWeb vision, which aims to create a decentralised internet where data, applications, and identities are secure, private, and user-owned, BeL2 strives to become a pivotal component of native Bitcoin infrastructure, transforming Bitcoin from digital gold into the cornerstone of a new global financial system.

BeL2 leverages Elastos’ secure infrastructure, using ELA as collateral for arbiters to ensure robust and trustless dispute resolution. ELA, an asset merge-mined with over 50% of Bitcoin’s miner security, adds an additional layer of security and decentralisation to the BeL2 ecosystem, reinforcing both projects’ commitment to a secure and decentralised financial future. Excited to learn more? Head over to the BeL2 website and follow Infinity for the latest updates!


Oasis Open Projects

Invitation to comment on OData Vocabularies v4.0

OData Vocabularies v4.0 describes a set of OData vocabularies maintained by the OASIS OData Technical Committee. The post Invitation to comment on OData Vocabularies v4.0 appeared first on OASIS Open.

Public review ends July 24th

OASIS and the OASIS Open Data Protocol (OData) TC [1] are pleased to announce that OData Vocabularies Version 4.0 is now available for public review and comment. This 30-day review is the second public review for this specification.

The Open Data Protocol (OData) enables the creation of REST-based data services, which allow resources, identified using Uniform Resource Locators (URLs) and defined in an Entity Data Model (EDM), to be published and edited by Web clients using simple HTTP messages.

OData Vocabularies v4.0 describes a set of OData vocabularies maintained by the OASIS OData Technical Committee. These vocabulary components are continuously evolved.

The documents and related files are available here:

OData Vocabularies Version 4.0
Committee Specification Draft 02
19 June 2024

Editable source (Authoritative):
https://docs.oasis-open.org/odata/odata-vocabularies/v4.0/csd02/odata-vocabularies-v4.0-csd02.md
HTML:
https://docs.oasis-open.org/odata/odata-vocabularies/v4.0/csd02/odata-vocabularies-v4.0-csd02.html
PDF:
https://docs.oasis-open.org/odata/odata-vocabularies/v4.0/csd02/odata-vocabularies-v4.0-csd02.pdf

For your convenience, OASIS provides a complete package of the prose specification and related files in a ZIP distribution file. You can download the ZIP file at:
https://docs.oasis-open.org/odata/odata-vocabularies/v4.0/csd02/odata-vocabularies-v4.0-csd02.zip

How to Provide Feedback

OASIS and the OData TC value your feedback. We solicit feedback from potential users, developers and others, whether OASIS members or not, for the sake of improving the interoperability and quality of our technical work.

This public review starts 25 June 2024 at 00:00 UTC and ends 24 July 2024 at 11:59 UTC.

Comments may be submitted to the TC by any person directly at:
Technical-Committee-Comments@oasis-open.org
Please use a subject line like “Comment on OData Vocabularies”.

Comments submitted for this work and for other work of this TC are publicly archived and can be viewed at:
https://groups.google.com/a/oasis-open.org/g/technical-committee-comments/.
Previous comments on OData works are archived at https://lists.oasis-open.org/archives/odata-comment/.

All comments submitted to OASIS are subject to the OASIS Feedback License [2], which ensures that the feedback you provide carries at least the same obligations as those of the TC members. In connection with the public review of these works, we call your attention to the OASIS IPR Policy [3], applicable especially [4] to the work of this technical committee. All members of the TC should be familiar with this document, which may create obligations regarding the disclosure and availability of a member’s patent, copyright, trademark and license rights that read on an approved OASIS specification.

OASIS invites any persons who know of any such claims to disclose these if they may be essential to the implementation of the above specifications, so that notice of them may be posted to the notice page for this TC’s work.

Additional information about these specifications and the OData TC may be found on the TC’s public home page.

========== Additional references:

[1] OASIS Open Data Protocol (OData) TC
https://groups.oasis-open.org/communities/tc-community-home2?CommunityKey=e7cac2a9-2d18-4640-b94d-018dc7d3f0e2
https://www.oasis-open.org/committees/odata/

[2] OASIS Feedback License:
https://www.oasis-open.org/who/ipr/feedback_license.pdf

[3] https://www.oasis-open.org/policies-guidelines/ipr/

[4] https://www.oasis-open.org/committees/odata/ipr.php
https://www.oasis-open.org/policies-guidelines/ipr/#RF-on-RAND-Mode
RF on RAND Mode

The post Invitation to comment on OData Vocabularies v4.0 appeared first on OASIS Open.


Next Level Supply Chain Podcast with GS1

50 Years of Confidence, Supply Chain Success, and the Next Dimension in Barcodes

Celebrating 50 years of the barcode, hosts Reid Jackson and Liz Sertl speak to an impressive lineup of industry experts, direct from Orlando at GS1 US’s yearly conference, Connect. They chat with:

Dave DeLaus, CIO at Wegmans, dissects the complexities of integrating new technologies to enhance consumer experience and shares how Wegmans is tackling the challenges of implementing 2D barcodes for better product traceability.

Sean Murphy from Cencora demystifies the Drug Supply Chain Security Act and emphasizes the necessity of unique serial numbers and digital backpacks for pharmaceutical products to ensure safety and compliance in the healthcare industry.

Andrew Meadows, founder and CEO of BL.INK, introduces the intriguing world of 2D barcodes and digital resolvers. Learn how BL.INK’s platform, BL.INK CXP, revolutionizes consumer engagement by providing personalized experiences and enhancing data privacy.

JW Franz from Barcoding Inc. emphasizes the importance of supply chain automation innovation and the future of barcoding, including RFID and computer vision technologies.

They all speak on the gradual implementation of new technologies, the strategic importance of 2D barcodes, and the transformative potential of computer vision in inventory management. The episode also covers the crucial role of standardization and regulatory compliance in healthcare and explores the exciting advancements paving the way for smarter, safer, and more efficient supply chains.

 

Key takeaways:

Discover how the integration of 2D barcodes and QR codes, paired with advancements in computer vision, is revolutionizing retail and supply chain management for enhanced consumer experiences and operational efficiency.

Explore the significant impact of the Drug Supply Chain Security Act and the digital backpack concept on pharmaceutical traceability, with insights from Sean Murphy of Cencora on how serialization ensures compliance and safety.

Learn about BL.INK’s innovative 2D barcode technology and digital resolvers, with Andrew Meadows explaining how these tools enable personalized consumer interactions and secure data privacy, driving a more direct and meaningful brand engagement strategy.

 

Jump into the Conversation:

 

[00:00] Welcome to Next Level Supply Chain

[00:48] Coming to you from GS1 Connect 2024 in Orlando

[02:45] Introducing Dave DeLaus, CIO at Wegmans

[03:42] Hot Topics with Wegmans

[04:47] Some insights on use of the 2D barcode at Wegmans

[06:01] How you can interact with the 2D barcode differently for your customer

[10:17] Introducing Sean Murphy with Cencora

[12:14] Cencora’s use of EPCIS or Electronic Product Code Information Service

[14:04] Leveraging RFID technology

[14:53] Focusing on DSCSA to create a smart, safe, and sustainable supply chain

[16:48] 2D barcodes in the pharmaceutical and healthcare industry

[18:49] Introducing Andy Meadows, founder and CEO of BL.INK

[19:25] BL.INK platform and digital resolvers

[24:22] Advising product manufacturers about BL.INK

[25:37] Andy’s thoughts on the future of 2D barcodes

[27:53] Introducing JW Franz from Barcoding Inc.

[29:07] JW’s biggest takeaway from attending Connect

[29:48] Barcoding Inc’s current focus

[30:28] JW’s thoughts on the future of RFID and 2D barcodes

 

Connect with GS1 US:

Our website - www.gs1us.org

GS1 US on LinkedIn

 

Connect with the guests:

Dave DeLaus - CIO, Wegmans 

Sean Murphy - Senior Manager of Manufacturing Operations, Cencora

Andrew Meadows - Founder & CEO, BL.INK

JW Franz - IoT Automation Solution Director, Barcoding Inc.


Digital Identity NZ

Postcard from Berlin | June Newsletter

Kia ora,

Earlier this month, I was lucky enough to be personally invited by Joerg Resch to attend Europe’s flagship digital ID conference in Berlin, EIC 2024. For someone who regularly spoke on this circuit for over 10 years, it was a blast to be back, revelling in the richness of the presentations and discussion, highlighted in this post by my predecessor at the Kantara Initiative. The brain is fully engaged for hours, absorbing expert insight, experience, and innovation that is found here at EIC in Germany and Identiverse in the US, the world’s two biggest conferences in this space. I’m looking forward to a tiny fraction of the ground being covered and contextualised locally at our Digital Trust Hui Taumata in just six weeks’ time.

The pre-conference SIDIHub made significant progress towards the challenging goal of cross-border digital identity interoperability. Participants were interested to learn about Aotearoa becoming the first common law country to implement a regulated digital identity trust framework, on July 1st. This framework regulates stakeholders that opt in for accreditation, thereby increasing customer trust in their security and privacy settings. The US, UK, Canada, and Australia have digital ID trust frameworks in operation or in pilot, but not yet nationally legislated. Credential authentication and verification continue to evolve in both policy and technology, at different speeds, making this a complex issue. 

At our recent DIA-hosted public/private sector working group meeting on digital identification standards, I commented that there is still much work needed globally before we can adopt comprehensive policies, protocols and standards for decentralised digital ID and its container, wallets. This is a plane we will continue to build as we fly it. 

The high interest in digital ID led us to host/co-host two events in June: the capacity-filled ‘Digital Identity, Higher Education, Aotearoa’ sponsored by Middleware and featuring the University of Auckland, and a Town Hall-styled session on digital cash with the Reserve Bank of New Zealand Te Pūtea Matua (RBNZ), in partnership with FinTechNZ. Both events showed how important broad digital trust is for ensuring cybersecurity and protecting against deepfakes, scams and hacking threats. Awareness and education are essential, so we thank our members for supporting DINZ initiatives, just as DINZ supports members’ initiatives like the upcoming series from NEC.

And finally, the DIA has released a new schedule of Identification Masterclasses through to August.

To register for any of the Zoom sessions, please email identity@dia.govt.nz with the G or HD reference number and a Zoom link will be supplied.

Ngā mihi

Colin Wallis
Executive Director, Digital Identity NZ

Read full news here: Postcard from Berlin | June Newsletter

SUBSCRIBE FOR MORE

The post Postcard from Berlin | June Newsletter appeared first on Digital Identity New Zealand.

Monday, 24. June 2024

GS1

Coca-Cola’s reusable, refillable bottles benefit from innovative QR Codes powered by GS1

To reach their goal of 40% refillable bottles by 2030, Coca-Cola Latin America needed a way to know how many times a given bottle had been through the refill cycle.

By laser engraving a unique identifier onto every bottle, Coca-Cola can know how many filling cycles the bottle has gone through, and whether it should be refilled or recycled.

Beyond the positive sustainability impact, the initiative provides a valuable set of data about each bottle’s journeys through the market across its lifecycle.

case-study-gs1-brazil-coca-cola.pdf

Safe meals and snacks were served to 12,500 athletes during the Hangzhou Asian Games

A top priority of the organisers of the Hangzhou Asian Games was to serve safe food from trusted supply chain partners.

Under the stewardship of Zhejiang AMR, QR Codes powered by GS1 were extensively implemented across the entire end-to-end food supply chain of the Games.

Zero food safety accidents – an accomplishment acknowledged by Thomas Bach, President of the International Olympic Committee.

case-study-gs1-china-hangzhou-games.pdf

Ceramic Network

Calling all devs: Build composable search applications for the Base Onchain Summer

Ceramic is partnering with Index Network to challenge developers to build composable search use-cases between Base and other projects participating in Base’s Onchain Summer. For example, those use-cases can include:

Composability with commerce (Shopify)

Composability with social graphs (Farcaster)

Composability with on-chain (Zora, Nouns)

The bounty is officially hosted on bountycaster.

About the Index Network

Index is a discovery protocol, built on Ceramic, that eliminates the need for intermediaries when finding knowledge, products, and like-minded people through direct, composable discovery across the web. By leveraging Web3 and AI, Index offers an open layer for discovery as the first decentralized semantic index. It functions as a composable vector database with a user-centric perspective, enabling interaction with decentralized graphs like Ceramic Network for user-owned knowledge graphs and Farcaster for social discourse.

About the bounty

For this bounty, developers have access to the Base search engine created on Index Network. They can utilize this index and integrate it with other projects and tools participating in Base’s on-chain summer to innovate and enhance information discovery experiences. Additionally, using Farcaster Channel indexes as a data source can help create personalized applications.

TIP: Consider developing agents to facilitate user interactions with Index, such as notification agents, context subscription agents, or multi-agent scenarios that enable conversational participation.

Prizes

A total prize pool of 2250 USDC will be distributed across the top 3 applications:

First place: 1000 USDC

Second place: 750 USDC

Third place: 500 USDC

Useful links

Below you can find all of the tools and links available for you to build for this bounty.

Bounty:

Official link to the bounty on bountycaster

Indexes:

Create new indexes (for bounty builders only)

Base documentation index

Index Network Profile on Index

Farcaster Channels Profile on Index

Documentation, tutorials and support:

Index.Network documentation

Video tutorial for Farcaster contextual subscription

Getting Started with Index Network SDK

Video tutorial: Creating the Base documentation index

GitHub

Discord and forum for technical support

FIDO Alliance

The Register: AWS is pushing ahead with MFA for privileged accounts. What that means for you.

AWS is making multi-factor authentication (MFA) mandatory for privileged users, specifically management account root users and standalone account root users. Customers must enable MFA within a 30-day grace period to maintain account access.


IT Brew: FIDO Alliance announces identity-proofing certification

FIDO’s Face Verification Certification tests for security, liveness, and bias in remote identity verification technology through FIDO-accredited laboratories, and ISO and industry standards. Andrew Shikiar, Executive Director and CEO of the FIDO Alliance, highlights that this certification technology “gives licensing companies added assurance that a vendor is performing well.”


Find Biometrics: ID Talk: Passkeys, Standards, and Selfie Certification with FIDO’s Andrew Shikiar

Andrew Shikiar, FIDO’s Executive Director and CEO, discusses key topics in authentication and identity security on the ID Talk podcast (produced by Find Biometrics), including passkeys, phishing threats, deepfakes, FIDO’s vendor accreditation, and the new Face Verification Certification program.


DIF Blog

Blockchain and Identity

Theatre 4 was the place to be at Identity Week Europe in Amsterdam earlier this month, when a series of presentations and panel discussions on decentralized identity and blockchain proved one of the exhibition hall's top draws.

The session, on the afternoon of Day 2, began with a panel discussion, "Blockchain and ID" moderated by Alex Tourski, with Steffen Schwalm, Co-ordinator, TRACE4EU, Maarten Boender, INATBA Identity Workgroup, Sphereon.com and William Wang, Founder, Palau Digital Residency Program (RNS.ID).

Alex Tourski: "Why does blockchain need identity?"

Steffen Schwalm: "If you only want to prove identities, you can use a PKI. But if you want to combine identity and transactions in one system — for example, if you want to trace the parts and materials in your Tesla’s battery — you need a distributed ledger." 

Maarten Boender: "I agree. How can you be held accountable for what’s written to the blockchain, unless the transaction is signed by your identifier? If you need to make the audit trail of your product evident, that’s much easier to do with DLT (Distributed Ledger Technology). The DID document can’t be changed and will be around for as long as the blockchain it resides on, which is available always and everywhere. There’s no single point of failure.

"There are not so many other systems with the properties of a DLT". 

Alex Tourski: "Can DIDs be considered a universal identifier scheme, when there are around 200 DID methods?" 

Steffen Schwalm: "We have multiple credential data models, signature formats and protocols. What matters is achieving interoperability. I don’t have a problem with 500 DID methods, as long as we have a universal resolver that works."

Maarten Boender: "There are many types of database. As long as everyone talks SQL, it's fine. Consumers won't need to think about 200 DID methods. You'll only have one choice: whether to log in with your EU digital identity wallet." 
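Steffen's "universal resolver" point can be sketched as a thin dispatcher keyed on the DID method, so verifiers call one entry point no matter how many methods exist. The sketch below is illustrative only: the `example` method and the returned DID document are placeholders, not a real driver.

```python
# Minimal sketch of a universal DID resolver: a single resolve() entry
# point that dispatches on the DID method. The "example" driver and its
# DID document are hypothetical placeholders, not a real DID method.

_DRIVERS = {}

def register(method: str):
    """Decorator that registers a driver function for one DID method."""
    def wrap(fn):
        _DRIVERS[method] = fn
        return fn
    return wrap

def parse_did(did: str):
    """Split 'did:<method>:<identifier>' into (method, identifier)."""
    parts = did.split(":", 2)
    if len(parts) != 3 or parts[0] != "did" or not parts[1] or not parts[2]:
        raise ValueError(f"not a valid DID: {did!r}")
    return parts[1], parts[2]

@register("example")
def _resolve_example(identifier: str) -> dict:
    # A real driver would read a ledger, registry, or web endpoint here.
    return {"id": f"did:example:{identifier}",
            "verificationMethod": [{"id": "#key-1"}]}

def resolve(did: str) -> dict:
    """Route a DID to its method's driver and return the DID document."""
    method, identifier = parse_did(did)
    if method not in _DRIVERS:
        raise LookupError(f"unsupported DID method: {method}")
    return _DRIVERS[method](identifier)
```

A verifier then calls `resolve(did)` and reads the issuer's keys from the returned document, without caring which method driver answered.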

Alex Tourski: "How do we know blockchain-based identifiers will persist?" 

William Wang: "Why do IP addresses exist? Because there's an underlying need to transfer information. If there's a better way of doing this in ten years’ time, IPs may vanish. Blockchain exists because there’s a need for instant transfer of value. Maybe another way will arise and blockchain will disappear. We can’t say anything will exist for sure, even in 5 years time." 

Maarten Boender: "DIDs and DLTs will be essential tools for businesses that need to provide audit trails. Qualified electronic ledgers are part of eIDAS 2.0. They are managed by organizations that are certified and fully liable to maintain the ledger."

Steffen Schwalm: "I’m pretty sure nothing of current IT systems will still be here in 50 years' time. It’s the data that needs to persist." 

Sovrin: an example of a blockchain-based identity system 

Stephen Curran, who chairs the Sovrin Foundation's board of trustees, and is a long-term contributor to Hyperledger Indy, Aries and AnonCreds, took to the stage to give an update on the Sovrin Network.

“Picking up from the previous talk, Sovrin is a distributed ledger that's used for identity. It’s global, for public-private use and enables different ecosystems of users.

“We provide a platform for issuers to publish information, that enables verifiers to independently verify this information. Sovrin is a valid place for any ecosystem where DIDs are used. It’s not tied to the Hyperledger stack,” he added. 

Stephen described the Lawyer Verifiable Credential, which is used to ensure certain systems can only be accessed by qualified lawyers. "The Law Society of British Columbia issues a VC confirming the holder is certified to practice law. The data is held in a wallet, enabling the holder to present it directly to verifiers, such as these restricted systems, without the issuer knowing this has happened. 

"The verifier reaches down into where the DIDs are to verify it's exactly what the issuer said, using the issuer's public keys.

“The Government of British Columbia is also very concerned about the surveillance economy. The goal of AnonCreds is to share the minimum possible data that’s needed for each use case. We’re trying to remove correlateability, traceability and surveillance”. 

Stephen also reminisced about "Sovrin’s infamous token days. We got through that. We had strong technology and governance, and that’s what we took forward.

"The technology is very solid and robust — we’ve had 100% uptime for the past 5 years”. 

Enabling the Economy of Trust

Next on stage was Catherine Fankhauser, Head of Identity at SICPA, who provided an overview of how authentication, data authenticity and communication have evolved since the inception of the web, and the impact this has had on digital trust.

Turning to the new generation of decentralized technologies, she shared that adoption will be driven by credentials with daily utility. In the context of the EU Digital Identity Wallet, this means lower-assurance credentials that improve the user experience, for example via passwordless access to online services.

Catherine concluded her presentation by highlighting the Unlimitrust Campus, the world's first site dedicated to the Economy of Trust.

Are unique identifiers a good idea?

In the final session, Alex Tourski returned to moderate a panel discussion focused on unique identifiers, with Executive Directors Judith Fleenor of Trust over IP Foundation and Mary Camacho of Holochain, plus Maarten Boender, Stephen Curran and myself.

Alex Tourski kicked off the discussion by highlighting how the lack of persistent unique identifiers for digital assets means content created in the early days of the internet is often irretrievable today, with broken weblinks all that remains.

He asked the panel to consider the proposition that "transparency means safety", citing his home country of Ukraine, where false rumours contributed to the war.

"When an Uber driver likes or dislikes me, should their feedback not be connected to an identifier, to ensure accountability?" he added.

The panel's response was unanimous: assigning persistent unique identifiers to natural persons is a bad idea — though persistent identifiers may make sense for organisations, and certain types of physical and digital objects. Moreover, there are many benefits from using a standardized identifier framework, such as the Decentralized Identifier (DID) specification.

Judith Fleenor pointed out that the Internet Protocol (IP) succeeded because it does the minimum required to establish a universal data transfer mechanism. Similarly, while content provenance is needed to address the explosion of fake content, creators must be able to use multiple identifiers, to minimize privacy risks.

Mary Camacho agreed, adding: "Not all societies are as well-governed and free as the Netherlands. In some places, knowing who took a photo could mean death for that person."

Alex Tourski asked the panel whether Privacy Enhancing Technologies can protect us from the dangers of unique identifiers.

In response, Stephen Curran stated that cryptographic techniques and identifiers are separate topics, and that clever cryptography doesn't mitigate the privacy risks of assigning unique identifiers to natural persons.

Maarten Boender agreed: "We're trying to make it impossible to correlate a holder's use of their credentials, which is the opposite of creating a single identifier."


Oasis Open Projects

OASIS Membership Elects New Leaders to its Board of Directors

Dr. Pablo Breuer of Orthogonal Insights, Daniel Rohrer of NVIDIA, and Daniella Taveau of Bold Text Strategies Bring Diverse Expertise to OASIS

Boston, MA, USA, 24 June 2024 – OASIS Open, the international standards and open source consortium, announced the results of its 2024 Board of Directors Annual and Special Elections. The Board, selected by the OASIS membership, will continue guiding the organization’s direction by encouraging ongoing growth and fostering increased collaboration within the open source and standards communities. Newly elected Board members from the Annual Election are Daniel Rohrer, VP of Product Security, Architecture and Research, at NVIDIA, and Daniella Taveau, President of Bold Text Strategies. Additionally, Jim Cabral, Gershon Janssen of Reideate, Bret Jordan, and Vasileios Mavroeidis of University of Oslo and Sekoia.io were re-elected. These individuals will serve two-year terms ending in 2026. Dr. Pablo Breuer, President of Orthogonal Insights, was elected in the Special Election to serve a one-year term ending in 2025. Continuing members of the Board are Jason Keirstead of Cyware, Daniel Riedel, Omar Santos of Cisco, and Jay White of Microsoft.

Francis Beland, Executive Director of OASIS, expressed his congratulations. “We welcome these distinguished leaders, both newly elected and re-elected, to the Board of Directors. Their extensive leadership experience will be instrumental as OASIS continues to develop meaningful new initiatives and broaden its opportunities. I look forward to collaborating with each of them as we pursue ambitious goals for the future.”

Dr. Pablo Breuer, President of Orthogonal Insights, brings a wealth of experience from his previous role as an executive at a Fortune 50 company and his 22-year Navy career, which included top-level positions in the U.S. Special Operations Command Donovan Group, SOFWERX, the NSA, US Cyber Command and US Naval Forces Central Command. A DoD Cyber Cup and two-time Defcon Black Badge winner, Breuer has taught at the Naval Postgraduate School, National University, California State University Monterey Bay, and Carnegie Mellon CERT/SEI. Breuer is the co-founder of the Cognitive Security Collaborative and coauthor of the DISARM (Disinformation Analysis and Response Measures) framework, the methodology used by the US and EU governments and NATO to address Misinformation and Disinformation. He is a sought-after speaker in the fields of cybersecurity and Mis- and Disinformation, sits on several Boards, and has mentored countless students and professionals.

“I’m honored to join the board of OASIS Open to help develop and promote open and inclusive standards for innovation and technology which promote our global ethics and shared interest. The speed of technology and innovation require standards to promote safety, fairness, and interoperability,” said Breuer. “I look forward to working on standards including countering disinformation and promoting artificial intelligence safety and resiliency. OASIS Open has been at the forefront of technology standards for more than three decades, and I’m proud to be able to contribute to their mission.”

Daniel Rohrer serves as VP of Software Product Security, Architecture and Research at NVIDIA, where, throughout his 24-year tenure, he has led efforts to enhance AI security, deliver GPU confidential computing, and advance research efforts in secure platform design. Rohrer has taken his integrated knowledge of “everything NVIDIA” to hone security practices, explore novel cybersecurity solutions, and help deliver some of the world’s most advanced and trustworthy computing platforms. He has been at the forefront of AI Security, contributing to the development of safe and trustworthy ecosystems through training, open source tools, and other initiatives aimed at scaling communities. An advocate for democratized access to computing resources, Rohrer strives to ensure equality and accessibility for all communities. He serves on the NVIDIA AI Ethics Review Committee and has held a Board position with the nonprofit NVIDIA Foundation, driving significant product security innovations.

“As AI adoption continues to grow across every industry, building systems that advance security and trust are paramount to success,” said Rohrer, VP of Software Product Security, Architecture and Research at NVIDIA. “I am honored to join the OASIS Board and contribute to the community so invested in the transparent development of open-source software and standards.”

Daniella Taveau, President of Bold Text Strategies, is an internationally recognized expert in developing global business and regulatory strategies. She has extensive experience working with senior political officials and advising multinational corporations worldwide. Taveau’s expertise spans international trade, finance, agriculture, food security and safety, chemicals, pesticides, new technologies, cosmetics and personal care, intergovernmental organizations, information technology, and combating mis- and disinformation. Prior to starting her own firm, Taveau was an International Trade Negotiator with the U.S. Environmental Protection Agency (EPA), where she represented the US at the World Trade Organization (WTO); all U.S. Free Trade Agreements including the TransPacific Partnership, the Transatlantic Trade and Investment Partnership (U.S./E.U. FTA), and the U.S. Korea Free Trade Agreement; the U.N. Food and Agriculture Organization (U.N. FAO); and the Asia Pacific Economic Cooperation (APEC). She also served as an International Policy Analyst with the U.S. Food and Drug Administration (FDA) and held an executive role at a global cosmetics company for a decade.

“I am honored to join the board of OASIS Open, considered one of the leading global forces in open-source standards. Accessible standards are necessary to ensure interoperability, innovation, and inclusivity,” said Taveau. “As we confront the growing challenges of misinformation and disinformation, I am committed to working with OASIS Members to promote accuracy, transparency, and trust in our digital world.”

OASIS expressed sincere gratitude to outgoing Board member Duncan Sparrell of sFractal Consulting for his invaluable service, dedication, and significant contributions during his tenure as a director. To learn more about the OASIS Board of Directors, please visit our website.

The post OASIS Membership Elects New Leaders to its Board of Directors appeared first on OASIS Open.


Identity At The Center - Podcast

In our newest episode of the Identity at the Center podcast,

In our newest episode of the Identity at the Center podcast, we discuss the concept of identity bubbles with the brilliant Justin Richer, founder of Bespoke Engineering. Join us as we explore what they are and how they can be designed to revolutionize identity management in disconnected environments.

You can watch the episode at https://youtu.be/E-GtiJ2HvnA?si=pWrmQgYXO9kk4jTO

Visit our website for more: idacpodcast.com

#iam #podcast #idac

Thursday, 20. June 2024

Me2B Alliance

We Need to Talk About Product Labels

This week, US Surgeon General Dr. Vivek Murthy has called for warning labels for social media platforms. If you know anything about Internet Safety Labs (ISL) you’ll know that our primary mission is the development of free and accurate safety labels for technology. So naturally, we heartily agree with Dr. Murthy that technology needs labels—but perhaps not warning labels and definitely not just for social media platforms. 

Various experts have written thoughtful responses to this week's call for warning labels, and their concerns underscore the fact that a warning label may be inappropriate.  

But of course we need safety labels on technology. 

History of Product Labels 

There are several types of product labels: ingredient labels (food), test result labels (automobile crash test ratings), warning labels from the surgeon general (cigarettes) and from other entities (OSHA’s Hazard Communication System for chemicals).  

Safety labels have a long-standing history in the US as a core component of product safety and product liability. The purpose of safety labels is to illuminate innate—i.e. unavoidable—risks in a product whether it is food, vehicles, cleaning solvents, toys for children, or the technology that we use with increasing reliance for all facets of living our lives.  

Safety labels almost always lag commercial product introduction, and in at least a few cases, product safety can lag by decades. For instance, for cars, product safety awareness and measures (like seatbelts) emerged 50-plus years after their mass availability. Consumer computing has been around for about 40 years now, and it will likely be another 10 years before we see product safety in full swing for software-driven technologies.  

According to InComplianceMag.com, US and Canadian tort-based law makes manufacturers’ product safety obligations clear (emphasis is mine): 

“Manufacturers have an obligation to provide safe products and to warn people about any hazards related to the product. Those requirements have risen up out of the original product safety/liability cases, some of which happened in the same timeframe as the Chicago World’s Fair, the middle to late 19th century, with many more to follow.

 

The assumption in U.S. liability law, and also typically if a case is brought in Canada, is that the manufacturer of the product is guilty and has to prove that they did everything necessary to provide a safe product. That includes warnings, user instructions, and other elements. Today, that continues to be the basic concept in product liability, that the burden lies on the manufacturer to prove that they did everything possible to make their product safe.” 

 

https://incompliancemag.com/product-safety-and-liability-a-historical-overview/


If tech were food we would have never stood for the absence of product information for as long as we have. Never. We use tech with little to no visibility or awareness of what it’s actually doing. That simply must change.  

We need a science of product safety for software and software driven technology. And that’s exactly what we’ve been building for five years at ISL. The current attitude of placing the onus on consumers to somehow gird themselves against invisible risks that not even vendors fully understand is absurd. Of course we need labels.  

And the good news is we’ve got them started on over 1,300 EdTech related apps. Here’s an example https://appmicroscope.org/app/1614/. The image below shows just the label header and the safety facts summary.   

Labels for Technology 

What type of label is appropriate for technology? A warning label is appropriate when the science is irrefutable. Are we there with the physical and mental health risks due to the use of technology? Maybe. Depends on who you ask. But maybe a label more like chemical warning labels is appropriate. Or perhaps just a test results label.  

In our work at Internet Safety Labs, our intention since day one was to expose invisible or difficult to recognize facts about risky behaviors of technology. As can be seen from the design of our app safety labels, we chose to emulate food nutrition labels that report measured findings. This approach of reporting measured findings works very well for this early stage of the science of product safety for technology.  

For instance, in our safety labels, you can see the category averages for most of the measures in the label. Why did we do that? Because there is no concrete threshold that distinguishes safe from unsafe ranges. There’s no industry standard that says, “more than ten SDKs is bad” for example. Moreover, technology norms vary by industry, such that personal information collection in fintech and medical apps is quite different than personal information collection in retail (at least one hopes). Thus, the category averages displayed in our labels don’t necessarily mean “safe”, they just provide context as we continue to measure and quantify technology behavior. An example of the shortcomings of this approach is when, for instance, the category average number of data brokers is greater than zero for apps typically used by children. (We advocate for no data brokers in technology used by children.) But we need to start with understanding the norms. We can’t change what we can’t see.  
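The “category averages as context” approach described above can be sketched in a few lines of code. All of the apps, measures, and numbers below are hypothetical illustrations, not ISL’s actual label data:

```python
# Sketch: show an app's measured behaviors next to its category average,
# as context rather than a safe/unsafe verdict (no agreed thresholds exist).
# All apps, measures, and numbers below are hypothetical illustrations.

from statistics import mean

# Measured behaviors per app (e.g. third-party SDKs, data brokers observed).
category_apps = {
    "MathTutor":  {"sdk_count": 12, "data_brokers": 2},
    "ReadAlong":  {"sdk_count": 5,  "data_brokers": 0},
    "QuizMaster": {"sdk_count": 9,  "data_brokers": 1},
}

def category_averages(apps):
    """Average each measure across all apps in the category."""
    measures = next(iter(apps.values())).keys()
    return {m: mean(app[m] for app in apps.values()) for m in measures}

def label_rows(app_name, apps):
    """Pair one app's measured values with category context for display."""
    avg = category_averages(apps)
    return [
        (measure, value, round(avg[measure], 1))
        for measure, value in apps[app_name].items()
    ]

for measure, value, avg in label_rows("MathTutor", category_apps):
    print(f"{measure}: {value} (category avg: {avg})")
# sdk_count: 12 (category avg: 8.7)
# data_brokers: 2 (category avg: 1.0)
```

Note that, exactly as the paragraph above argues, nothing in this sketch labels a value “unsafe”; the average is only context for the reader.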

The Devil is in the Details 

The call for a congressional mandate for something (not necessarily a warning label) is a step in the right direction. Why? Because it treats software as a product and tacitly places product safety requirements on it. This is an advancement in our eyes.  

Moreover, product safety is almost always the domain of government (or insurance). In the absence of a government mandate for product safety for technology, we see fragmented efforts with the FTC boldly championing privacy risks in technology, and the FCC advocating for a different type of label. So indeed, it’s encouraging that we’re starting to talk about technology in product safety terms.  

But the devil is in the details of any labeling program. In the words of Shoshana Zuboff, “who decides and who decides who decides?” As in, who decides what goes on the labels? Also, who oversees the integrity of the labels? The US government is a customer of data obtained by surveillance capitalism[2]. When it comes to technology, can the government be trusted to keep people safe? (When it comes to food, can the government be trusted to keep people safe? When you dig into it, the track record is spotty.)  

Product safety exists in natural opposition to the industry status quo, and any kind of regulation is already facing and will continue to face strong opposition[3]. In the early 1900s, when chemist Dr. Harvey W. Wiley began a crusade for the labeling of ingredients and identifying toxic elements in food, industries that relied on the opacity of ingredients (snake oil salesmen) or that simply didn’t want to incur the cost of change (whiskey distillers) opposed such a mandate.  

“Strenuous opposition to Wiley’s campaign for a federal food and drug law came from whiskey distillers and the patent medicine firms, who were then the largest advertisers in the country. Many of these men thought they would be put out of business by federal regulation. In any case, it was argued, the federal government had no business policing what people ate, drank, or used for medicine. On the other side were strong agricultural organizations, many food packers, state food and drug officials, and the health professions. But the tide was turned, according to historians and Dr. Wiley himself, when the activist club women of the country rallied to the pure food cause.”

 

https://www.fda.gov/media/116890/download  

Product safety challenges the status quo and creates necessary growing pains for industry. But industry always survives. And more often than not, new industries emerge, such as the ongoing development of safety features for vehicles.  

Let’s return to the challenge of deciding what goes in the labels. We at ISL know quite a lot about what it takes to develop safety labels in a space where the naming and measurement of risk isn’t fully baked (or worse, non-existent). Determining what goes into a previously uncharted, unmeasured safety label is extraordinarily challenging. It’s even more challenging if the measurement tools don’t exist. But our situation is even worse than that: we don’t even have agreement on what the risky behaviors in technology are. AND, we are talking about behaviors here—which is not language we typically associate with products. Products don’t typically behave. From our several years in development, these are the highly iterative steps that must occur to reach consensus on labels for technology: 

1. Consensus on naming the hazards/harms in technology.[4]
2. Consensus on assessing and quantifying the risks.
3. Identify the behaviors that embody the risks.
4. Figure out a way to measure the behaviors that embody the risks.
5. Assess the measurements.[5]
6. Consensus on presentation of the measurements/information.

As far as presentation of the data, in our case, we decided to aggregate the data into clusters based on riskiness, and we also ultimately decided to provide a single app score. This was done with some reluctance, and it will no doubt be a much-evolving scoring rubric for the next few years.  

For now, we believe the best thing the labels can do is objectively report the invisible (or poorly understood) behaviors of the products until such time as definitive harm thresholds can be derived.  

There’s a final vital detail regarding the establishment of any labels, and that’s having what I would characterize as exceptional diversity of participants in establishing safety standards. This isn’t lip service. A few years ago, when I started to better see how what was risky for me was very different than what was risky for people who are different from me, such as a person of color, or a person with a disability, or an incarcerated person, I woke up one night from a deep sleep with the awareness that any attempt at standardizing or consensus is doomed if it doesn’t have full diversity involved[6]. Why this is so is a long and complicated matter. On the one hand, everything ever done should endeavor to have exceptional inclusion of a massively diverse set of participants.  

But it also has to do with the fact that software and software-driven tech is “alive” and interactive in a way that other products in our lives aren’t. We have a special duty when it comes to product safety of software-animated products. We may even need to reconsider what a “product” is. We have seen evidence of the hazards of animated technology not built with adequate understanding of the diversity of users, whether in the embodiment of human bias in automated decision making or in hand dryers that don’t activate for people of color. The point is that technology acts on and with us in a different (and constantly changeable) way than other products. So labeling is both harder and matters more than ever.  

Conclusion 

Overall, I remain optimistic that the lens is happily starting to focus on product safety, implicit though it may be. People will be thinking more about labels for technology. And they will see that ISL is already providing labels covering privacy risks. We can already call out the presence of infinite scroll, “like” buttons, and other user interface patterns widely recognized as addictive in today’s labels.  

As I mentioned above, confusion stems from Dr. Murthy’s call for a “warning label” instead of a safety or ingredients label. Technology is cigarettes[7]. We use the metaphor all the time. Technology today is cigarettes in the 1940s/1950s, when just about everybody chain smoked and the harms were likely all anecdotal and pooh-poohed. It took decades to assemble causal evidence. But tech is also much more complicated than cigarettes, and a warning label is premature. That, however, is not a compelling argument that we don’t deserve accurate information on tech’s risky behaviors. As it is right now, we don’t even have an ingredient label for technology. We are flying (tech-ing?) blind. 

Of course we need labels. Industry would do well to proactively embrace label enablers like software bills of materials, consent receipts, and machine-readable records of processing activities (ROPAs). Because there can be no doubt that labels are imminent.  

Earlier, I said that we’ve “started”. I say that because our labels only include privacy risks at present. Our labels are deliberately modular and we’ve scoped additional sections: 

Risky UI Patterns – like the deliberately addictive UI patterns of the sort Dr. Murthy is calling for exposing. Our Safe Software Specification for Websites and Mobile Apps already describes measurement of these kinds of risks.
Automated Decision-Making Risks
Security [client side only] Risks
Differences between observed tech behavior and privacy policy and/or terms of service.

All of these are on our roadmap. We know exactly how to add these sections to the label; it’s strictly a resource and funding issue. If they sound good to you, please consider supporting our mission.

Because of course we need labels. 

 

Footnotes:

1. https://www.wsj.com/us-news/u-s-surgeon-general-calls-for-warning-labels-on-social-media-platforms-473db8a8?st=gmnjmhotka7febm&reflink=desktopwebshare_permalink and https://technosapiens.substack.com/p/should-social-media-have-a-warning
2. https://arstechnica.com/tech-policy/2024/01/nsa-finally-admits-to-spying-on-americans-by-purchasing-sensitive-data/, https://www.nbcnews.com/tech/security/us-government-buys-data-americans-little-oversight-report-finds-rcna89035, and https://www.vice.com/en/article/jgqm5x/us-military-location-data-xmode-locate-x
3. https://www.politico.com/news/2023/08/16/tech-lobbyists-state-privacy-laws-00111363
4. We have ongoing work with our Digital Harms Dictionary.
5. They will be wrong, and you will have to find a different measure.
6. We welcome everyone, whether you are technical or not, to participate in our open Software Safety Standards Panel where we define the content of the safety labels, and name hazards and harms.
7. Tech may actually be worse than cigarettes because it has the capability of inflicting every kind of harm people can experience, either directly or indirectly, in a multitude of increasingly creative ways: financial, reputational, social, emotional/psychological, and even physical. 

The post We Need to Talk About Product Labels appeared first on Internet Safety Labs.


Hyperledger Foundation

Energy & Mines Digital Trust: The Future of Global Supply Chains


After two years of piloting, Energy & Mines Digital Trust (EMDT) has successfully launched two digital credentials, reshaping how mining operators in British Columbia (B.C.) share verified data. As the project evolves, EMDT is working with the United Nations to explore how digital trust technology can improve cross-border trade and supply chain traceability, while further lowering the barriers to entry for companies worldwide. 


GS1

Maintenance release 2.10

Maintenance release 2.10 daniela.duarte… Thu, 06/20/2024 - 14:58 Maintenance release 2.10

GS1 GDM SMG voted to implement the 2.10 standard into production in May 2024.

Key Milestones:

See GS1 GDM Release Schedule

As content for this release is developed, it will be posted to this webpage, followed by an announcement to the community to ensure visibility.
GDSN Data Pools should contact the GS1 GDSN Data Pool Helpdesk to understand the plan for the update. Trading Partners should work with their Data Pools (if using GDSN) and/or Member Organisations on understanding the release and any impacts to business processes.

GDM 2.10 contains updated reference material aligned with ADB 2.4 and GDSN 3.1.27.

 

Updated For Maintenance Release 2.10

GDM Standard 2.10 (May 2024)

Local Layers For Maintenance Release 2.10

China - GSMP RATIFIED (April 2022)

France - GSMP RATIFIED (November 2023)

Germany - GSMP RATIFIED (November 2023)

Poland - GSMP RATIFIED (November 2023)

Romania - GSMP RATIFIED (December 2021)

USA - GSMP RATIFIED (February 2023)

Finland - GSMP RATIFIED (November 2023)

Netherlands - GSMP RATIFIED (May 2024)

Italy - GSMP RATIFIED (May 2024)

 

Release Guidance

GDM Market Stages Guideline (June 2023)

GDM Attribute Implementation Guideline (February 2024)

GPC Bricks to GDM (Sub-) Category Mapping for GDM 2.10 and 2.11 (April 2024)

Attribute Definitions for Business (May 2024)

GDM (Sub-) Categories (October 2021)

GDM Regions and Countries (17 December 2021)

GDSN Release 3.1.27 (May 2024)

Tools

GDM Navigator on the Web 

GS1 GDM Attribute Analysis Tool (May 2024)

GDM Local Layer Submission Template (May 2024)

Training

E-Learning Course

Future Release Documentation

GPC Bricks to GDM (Sub-) Category Mapping for GDM 2.10 and 2.11 (April 2024)

Any questions

We can help you get started using the GS1 standards.

Contact your local office


Oasis Open Projects

OASIS Launches Global Initiative to Standardize Supply Chain Information Models


Checkmarx, Cisco, Cyware, Google, IBM, Legit Security, Microsoft, Root, SAP, US NSA, CISA, and Others Join Forces to Build a Framework to Complement SBOM Data Formats, CSAF, CycloneDX, OpenVEX, and SPDX

Boston, MA – 20 June 2024 – With escalating cybersecurity threats exploiting software supply chain vulnerabilities, there’s an urgent need for better understanding and proactive measures to identify and prevent future risks. Members of OASIS Open, the global open source and standards organization, have formed the Open Supply Chain Information Modeling (OSIM) Technical Committee (TC) to standardize and promote information models crucial to supply chain security. 

The aim of OSIM is to build a unifying framework that sits on top of existing SBOM data models, such as CSAF, CycloneDX, OpenVEX, and SPDX. OSIM is not intended to replace or endorse any one of these models. Instead, as an information model, OSIM will bring clarity to software supply chain partners, mitigate vulnerabilities and disruptions, reduce security risks, and make it easier for companies to plan for upgrades and contingencies.

“CISA is excited to be a part of this technical effort to bring greater visibility to the software supply chain,” said Allan Friedman, Senior Technical Advisor at CISA. “We have many of the basic building blocks for software transparency and security, including SBOM, VEX, and CSAF. This work by OASIS will facilitate automation for easier and cheaper implementation and tooling, and help provide a unifying supply chain framework and raise the level of collaboration across industries.”

“OSIM represents an important effort to address the need for greater structure and comprehensibility of software supply chains,” said Isaac Hepworth, Google, and OSIM co-chair. “By establishing standardized information models we can enhance transparency, interoperability, and resilience in end-to-end operations — ultimately aiding cyber risk management and protecting critical infrastructure.”

Recognizing the crucial role of Software Bills of Materials (SBOMs) in fortifying software supply chain security, the OSIM TC aims to create, for example, a standardized SBOM information model that would enhance understanding and interoperability across diverse SBOM data formats (e.g. SPDX and CycloneDX). Competing data models, like SPDX, CycloneDX, CSAF, and OpenVEX, show the need for information models that bring coherence across diverse specifications.
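The idea of an information model sitting above concrete data formats can be illustrated with a minimal sketch: one format-neutral component record populated from both CycloneDX-style and SPDX-style JSON documents. The field paths below follow those formats’ common JSON conventions but are heavily simplified; this is an illustrative assumption, not the OSIM TC’s actual model.

```python
# Sketch: a neutral "information model" record populated from two SBOM
# data formats. Field paths follow common CycloneDX/SPDX JSON conventions
# but are simplified; this is NOT the OSIM TC's actual model.

from dataclasses import dataclass
from typing import Optional

@dataclass(frozen=True)
class Component:
    """Format-neutral view of one software component."""
    name: str
    version: str
    purl: Optional[str]  # package URL, when the source document provides one

def from_cyclonedx(doc: dict) -> list:
    # CycloneDX keeps components in a top-level "components" array.
    return [
        Component(c["name"], c.get("version", ""), c.get("purl"))
        for c in doc.get("components", [])
    ]

def from_spdx(doc: dict) -> list:
    # SPDX keeps packages in "packages"; the purl hides in externalRefs.
    out = []
    for pkg in doc.get("packages", []):
        purl = next(
            (r["referenceLocator"] for r in pkg.get("externalRefs", [])
             if r.get("referenceType") == "purl"),
            None,
        )
        out.append(Component(pkg["name"], pkg.get("versionInfo", ""), purl))
    return out

cdx = {"components": [{"name": "liba", "version": "1.2.0",
                       "purl": "pkg:pypi/liba@1.2.0"}]}
spdx = {"packages": [{"name": "liba", "versionInfo": "1.2.0",
                      "externalRefs": [{"referenceType": "purl",
                                        "referenceLocator": "pkg:pypi/liba@1.2.0"}]}]}

# Both formats normalize to the same neutral record.
assert from_cyclonedx(cdx) == from_spdx(spdx)
```

The point of the sketch is the one made in the press release: the neutral record doesn’t replace either format, it lets tooling reason over both without caring which one a supplier shipped.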

“OSIM’s approach not only drives a universal taxonomy of thought, it also brings clarity and ease to how we implement standards and frameworks to support multiple industry software supply chain security needs. OSIM facilitates the identification of similarities and differences across specifications, enhancing interoperability and simplifying processes. The current cybersecurity landscape can no longer be defended in a silo,” said Jay White, Microsoft, and OSIM co-chair.

The OSIM TC welcomes a diverse range of contributors, including software and hardware vendors, open-source maintainers, technology consultants, business stakeholders, government organizations, and regulatory bodies. Participation is open to all through membership in OASIS, with interested parties encouraged to join and contribute to shaping the future of supply chain information modeling.

Support for OSIM

Checkmarx
“Checkmarx is proud to be working with OASIS and be part of the OSIM Technical Committee. A major part of Checkmarx’ mission to secure the applications driving our world involves sharing our time, experience, and threat intelligence to help make the software supply chain ecosystem safer. As one of the biggest challenges remains education and closing the knowledge gap, we believe standardization is a crucial step and are committed to assisting in laying the foundations.”
– Erez Yalon, VP of Security Research, Checkmarx

Root
“The OASIS OSIM is a vital project for enhancing security and trust in the software supply chain. As a part of the OSIM Technical Committee, Root is committed to advancing supply chain security and transparency, aligning perfectly with this initiative’s goals. By collaborating on data schemas, data modeling, and security standards, we aim to improve vulnerability management and software security, ensuring threats are identified and mitigated promptly. This enhances software integrity, benefiting our customers and strengthening trust in the broader digital ecosystem.”
– Ian Riopel, CEO, Root.io

SAP SE
“Having a unified information model for representation of objects in the supply chain domain would enable efficient integration models and interoperability. Especially with the wave for generative AI, such aligned models can bring benefits in development efficiency, reduced maintenance and operations for upcoming innovations in the domain.”
– Gururaj Raman, Chief Development Expert, SAP SE

Additional Information
OSIM Project Charter

Disclaimer: CISA does not endorse any commercial entity, product, company, or service, including any entities, products, or services linked or referenced within this press release. Any reference to specific commercial entities, products, processes, or services by service mark, trademark, manufacturer, or otherwise, does not constitute or imply endorsement, recommendation, or favoring by CISA.

The post OASIS Launches Global Initiative to Standardize Supply Chain Information Models appeared first on OASIS Open.

Wednesday, 19. June 2024

Origin Trail

Trust Thy AI: Artificial Intelligence Base-d with OriginTrail


With tens of billions invested in AI last year and leading players such as OpenAI looking for trillions more, the tech industry is racing to grow large generative AI models. The goal is to steadily demonstrate better performance and, in doing so, close the gap between what humans can do and what can be accomplished with AI.

There is however another gap that has become strikingly apparent — the AI trust gap. As challenges such as AI hallucinations, bias, and intellectual property slurping continually cause damage, we look into how the base of the current Web could be effectively transformed to support the Verifiable Internet for AI.

The announced Apple and OpenAI integration signals the trust gap is widening, with Apple users’ data becoming the next frontier for training ChatGPT and questionable transparency on how it is used. This data is so valuable that it reportedly makes up for charges Apple would pay for using costly ChatGPT AI models. The Verifiable Internet for AI shifts this paradigm, making such data transactions transparent on chain with ownership of data taken back by users, who ultimately get to monetize it.

Decentralized AI: Intersection of Crypto and AI

Having employed the fundamentals of crypto, AI, and knowledge graphs successfully within a plethora of sectors, where trust, transparency, and accuracy are of paramount importance, OriginTrail now integrates Base blockchain with OriginTrail Decentralized Knowledge Graph (DKG), to help drive trust and transparency with neuro-symbolic AI. Instilling information provenance, ownership, and graph structure through blockchains and knowledge graphs together can effectively address the aforementioned problems of AI, as detailed in the most recent White Paper 3.0.

Your Body of Knowledge, Your Choice of AI

The opportunity of graph algorithms as a foundation for reputation in the age of AI was also highlighted by Brian Armstrong, CEO of Coinbase, in a recent podcast:

“Another piece of a puzzle that I feel could be missing, is something around reputation that’s on chain. You can imagine a version of this that’s like using the graph structure of the chains. To sort of say, okay if I trust this node, and they sent money to this node, that sort of implies some amount of trust. Kind of like a Google Page Rank had an algorithm, something like that could be built on chain.” — Brian Armstrong, CEO of Coinbase

The recently introduced OriginTrail Paranets (user-controlled on-chain knowledge graphs) give users total control over their data, connecting it into the DKG decentralized physical infrastructure (DePIN) while keeping it safely stored on their devices. Users are then able to choose from a growing selection of open-source AI systems integrated with OriginTrail via ChatDKG.ai, a launchpad for user-controlled AI.

Knowledge graphs with paranets enable transparent on-chain reputation, relevance scoring with PageRank, recommendation engines, graph neural networks, and other AI reasoning applications.
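The PageRank-style reputation Brian Armstrong describes above can be sketched in a few lines: edges record who sent value to whom, and the stationary scores act as a trust signal. The nodes, edges, and parameters here are hypothetical, and this is a plain textbook PageRank, not OriginTrail’s implementation.

```python
# Sketch: PageRank over a small trust/payment graph, in the spirit of the
# quote above: an edge ("who sent value to whom") implies some trust, and
# the stationary scores act as a reputation signal. Nodes are hypothetical.

def pagerank(edges, damping=0.85, iterations=50):
    nodes = {n for edge in edges for n in edge}
    rank = {n: 1.0 / len(nodes) for n in nodes}
    out_links = {n: [dst for src, dst in edges if src == n] for n in nodes}
    for _ in range(iterations):
        new = {n: (1 - damping) / len(nodes) for n in nodes}
        for src in nodes:
            # A dangling node (no outgoing edges) spreads its rank evenly.
            targets = out_links[src] or list(nodes)
            for dst in targets:
                new[dst] += damping * rank[src] / len(targets)
        rank = new
    return rank

# A sent value to B and C; B and C both sent to D, so D accumulates trust.
trust_edges = [("A", "B"), ("A", "C"), ("B", "D"), ("C", "D")]
scores = pagerank(trust_edges)
print(max(scores, key=scores.get))  # prints: D
```

Running a scheme like this on-chain rather than off-chain is what makes the resulting reputation transparent and auditable, which is the property the paranet design is after.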

The first such knowledge graph to launch on Base is the DeSci paranet for autonomous scientific research by ID Theory, crowdsourcing knowledge assets utilizing on-chain reputation via the OriginTrail DKG.

“#DeSci has great potential. Those who have a working product will have the power to forever improve science and the scientific process.” — Brian Armstrong, CEO of Coinbase

One of the first dapps deployed on the DeSci paranet on Base will be the DeSci AI agent, which will include a knowledge mining interface through which scientific knowledge will be minted on chain, with publishers receiving token incentives.

“We’re creating a user-friendly hub to coordinate scientific knowledge creation onchain — a co-owned substrate to crowdsource AI and supercharge research and discovery as we know it through autonomous science. The first iteration will focus on neuroscience as it’s very close to our hearts, but the future is boundless. Who knows, crypto might actually cure cancer.” — ID Theory

DeSci AI Agent in action built on OriginTrail

AI and crypto, converging in the OriginTrail DKG, can tackle some of the largest challenges while providing users with an inclusive, unbiased, and verifiable way of making mission-critical decisions. As we bring this technology to more data-intensive sectors such as science, the trust layer — the blockchain underpinning the neuro-symbolic AI approach made possible by the DKG — needs to fulfill both the scalability and user experience requirements.

This is where Base can help Trust Thy AI — in a scalable, inclusive, and user-friendly way.

Make sure to subscribe and follow the next steps as we make AI Base-d.

Trust Thy AI: Artificial Intelligence Base-d with OriginTrail was originally published in OriginTrail on Medium, where people are continuing the conversation by highlighting and responding to this story.


We Are Open co-op

Building Credibility into Digital Credentials

Validity + Reliability + Viability = Credibility Image CC BY-ND Visual Thinkery for WAO

Digital credentials are playing an increasingly important role in recognising a broad spectrum of skills in the workplace. These range from the tangible to the abstract and innovative, but in all cases, we want credentials that are credible — that is to say, something worth earning.

In this post, we explore two distinct scenarios: a culinary school and a large organisation looking to recognise and encourage the creativity of its employees. In each, the core elements of validity, reliability, and viability are discussed in terms of developing a credible digital credential system.

Validity

Validity ensures that an assessment accurately measures what it claims to measure. In the context of digital credentials, this means that the assessment process leading to the badge should accurately evaluate the abilities, knowledge, or competencies it is intended to certify. This alignment between the assessment criteria and the attributes it assesses is important. Without strong validity, the integrity of the credential could be questioned, undermining its acceptance by educational institutions and employers.

Culinary School: Issuing badges for skills like knife handling, pastry making, and creative presentation means assessing specific competencies. Each badge must represent true mastery of these skills. Assessments might involve practical tests where students must produce a dish that adheres to professional standards. These practical assessments must be designed to measure relevant culinary skills accurately, ensuring the badge directly reflects the student’s capability.

Creative Organisation: For certifying creativity, the assessments might require participants to propose innovative solutions to real business challenges. These should be evaluated for their originality, practicality, and impact, ensuring the badge reflects genuine creative thinking and problem-solving ability.

Reliability

Reliability focuses on the consistency and dependability of the assessment results. A reliable digital credential system ensures that all recipients are evaluated using the same standards, and that these standards are consistently applied. This consistency builds trust in the credentialing system, making the credentials more likely to be recognised across different sectors.

Culinary School: It is crucial that all culinary tests are graded on a consistent rubric. If two students deliver dishes of similar quality, they should both earn the badge. This uniformity builds trust in the credentialing system by affirming its fairness and rigour.

Creative Organisation: When evaluating creative projects, it’s essential that all judges use the same criteria to assess the submissions. This ensures that every employee who meets the standard of creativity receives recognition, maintaining the reliability of the credential across the organisation.

Viability

Viability deals with the practical aspects of sustaining an assessment system. This includes considerations such as the costs involved, the resources required, and the technology necessary to issue and maintain the credentials. A viable digital credential system is scalable and sustainable, capable of adapting to growing demands and evolving educational environments.

Culinary School: The school needs a system that can efficiently handle various forms of assessments, from practical cooking exams to theoretical tests. This includes logistical aspects such as scheduling, recording results, and managing digital badges. The technology used must support these activities without prohibitive costs.

Creative Organisation: For badges relating to creativity, the system should support diverse submission formats and robust communication for feedback. It must also be scalable and adaptable to accommodate a growing number of participants as the organisation evolves.

Credibility as convergence

In both examples, credibility arises from the effective integration of validity, reliability, and viability. A credible digital credential system not only accurately and consistently evaluates and represents diverse skills — from cooking to creativity — but also operates efficiently and sustainably within its intended scope. This credibility enhances the value of a digital credential, making it a recognised and sought-after mark of achievement.

Next steps

Understanding and applying the principles of validity, reliability, and viability can enhance the effectiveness and perception of digital credentialing systems. Whether in a culinary school or a large corporation, these principles ensure that credentials issued represent genuine skills and achievements, fostering trust and respect in the digital badges awarded. Through these dual examples, this post demonstrates how these principles can be applied across various disciplines, encouraging readers to think broadly about the potential of digital credentials in their own fields.

Need help thinking about digital credentials in your organisation? Get in touch!

Related posts: Badge Project Blueprint, Making Credentials Work for Everyone, A Compendium of Credentialing

Special mention and thanks to Paddy Craven of City & Guilds, who initially pointed us in the right direction around this!

Building Credibility into Digital Credentials was originally published in We Are Open Co-op on Medium.


Blockchain Commons

Musings of a Trust Architect: Minimum Viable Architecture (MVA)

ABSTRACT: Minimum Viable Architecture (MVA) is an alternative to the Minimum Viable Product (MVP) approach, emphasizing the importance of a robust, scalable, and expandable architecture. The MVA methodology mitigates risks associated with reputation, competitiveness, and architectural deficiencies, and fosters collaboration among competitors. Real-world examples, such as SSL/TLS and the Gordian system, illustrate the successful implementation of MVA in software development.

A business methodology focused on producing a Minimum Viable Product blossomed in the 21st century. Unfortunately, it can set businesses up for future failure because it doesn’t properly define the larger architecture that is needed to evolve a product past its earliest, minimal state.

The Old Methodology: Minimum Viable Product

A Minimum Viable Product (MVP) is a business methodology that advocates creating the simplest possible version of a product as a first release, to see if the market responds positively, or else to understand why it doesn’t[1]. If an MVP is successful, possibly through iterations of the initial work, the product can then be grown and ultimately find large-scale success in the market.

Twitter has long been used as an example of an MVP that did great, with Dropbox and Facebook being other examples of MVPs[2] (to various extents). Judging by these companies, MVP would seem to be a win-win methodology.

However, they’re not the full story.

MVP Biases

Unfortunately, current literature about Minimum Viable Products suffers from Survivorship Bias. We hear about the success of companies that used MVPs, but we don’t know that their doing so actually led to success. In fact, the successes that we see might just be a false signal.

How many hundreds or even thousands of companies pursuing MVPs failed for each Twitter or Facebook that succeeded? How many companies found that they couldn’t scale their MVP, realized that they couldn’t take commercial advantage of an otherwise successful MVP, or simply were beaten by competitors with even more viable products?

We can’t measure the success of the MVP methodology by the anecdotal success of a few individual companies.

Survivorship bias image by Martin Grandjean (vector), McGeddon (picture), Cameron Moll (concept). Released under cc by-sa 4.0.

MVP Dangers

The MVP system also has other dangers.

Some of these are related to company brand. Though a new company doesn’t have a reputation to damage, a series of unsuccessful MVPs could nonetheless curtail its future opportunities. Meanwhile, a more mature company might find its existing reputation blemished by a poor MVP. This is especially true today, as companies are increasingly saying that MVPs can be poor-quality releases[3]. That didn’t work out well for Cyberpunk 2077, one of the highest profile and most controversial computer game releases of recent years, which (like many modern-day computer games) wasn’t quite released as an MVP, but wasn’t in a fully complete form either[4].

There are also competitive dangers. Within a developmental niche, MVPs only work if everyone pursues them; otherwise, an MVP built on solid ideas could easily be out-competed by a firm that produced a slightly more mature prototype. Similarly, a company with more resources might be able to scoop up the ideas in an MVP and replicate them to its own advantage[5].

However, the biggest dangers of MVPs are probably architectural. By defining an MVP, a company can easily ignore the larger architectural issues that would once have been considered before starting work on any serious release. This can cause problems with scaling, with missing features that can’t easily be added, and with locked-in decisions that become part of the final product.

Twitter (“X”), for example, didn’t finalize its network architecture design until 2010[6], four years after its advent. It would have been easy for a better-architected social-media system to get in there first; the fact that no one did is one of the pieces of luck that led to Twitter’s ultimate success. In fact, one of the developers at Twitter has noted this, saying: “In the end, Twitter barely made it, and product progress was slow for years due to post facto infrastructure investment.”[7]

The biases and dangers implicit in MVPs suggest the need to at least experiment with other methodologies for product releases. The huge problems implicit in the potential lack of architecture in an MVP also suggest what that alternative methodology should be: a Minimum Viable Architecture.

A New Methodology: Minimum Viable Architecture

Minimum Viable Architecture (MVA)[8] is a methodology that has been discussed in somewhat different forms over the last several years. It doesn’t focus on the simplest product that can be released to consumers, but instead on the simplest architecture that can support future development within a product’s technological ecosystem.

The goal of an MVA is still to create a product that doesn’t strain the resources of a company and that doesn’t create a situation where a company’s ultimate success or failure depends on that singular release. However, that simple product must be created with the understanding of a larger architecture that has enough flexibility[9] that designers can fill in gaps in that architecture in the future. It’s just that the decisions for filling in those gaps are delayed as much as possible[10]. It’s a melding of agile methodologies with architectural concerns.

Though an MVA could be created with a full understanding of future expansions that may or may not ultimately be accommodated, it’s more powerful to create an MVA that is modular and expandable — one that doesn’t depend on the architect thinking of everything, but instead future-proofs itself so that the architecture can include unthought-of elements in the future. As Jorge Lebrato says: “The architecture remains cohesive and each piece cooperates with the others, despite having had different rhythms.” The best MVA is a compromise between entirely ignoring the architecture (as is likely in an MVP) and designing it entirely (which would likely result in time cost and waste)[11].

MVA Examples

The following examples show some real-world uses of MVA instead of MVP.

SSL/TLS

When I co-authored the TLS spec in the ’90s, I did my best to future-proof it by simultaneously constraining the design and giving it enough flexibility to be expanded in the future. This is an example of a Minimum Viable Architecture whose usefulness has proven itself: TLS is now the most deployed security system on the internet, at the heart of almost every shopping, financial, or banking transaction.

This future-proofing was thanks in part to our architecting elements that we suspected would be required in the future, but which couldn’t be deployed in the then-present, primarily due to CPU limitations. Perfect forward secrecy[12] is an example. Users were able to simply turn it on when its usage became viable on standard hardware platforms.

However, our more notable work in creating an MVA came from our inclusion of ciphersuites. These are powerful encryption and decryption rules that do the actual cryptographic work of TLS. By defining them as modular plug-ins, we supported the future innovation of TLS, even in ways that we could not envision. And there was considerable innovation: TLS 1.2 had 37 ciphersuites, though that dropped back to five with TLS 1.3[13].
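The plug-in pattern described here can be sketched in miniature: suites register themselves behind a single interface, and the negotiation core never needs to know which algorithms exist. The following Python sketch is purely illustrative; the `CipherSuite` class, the registry, and the toy XOR "suite" are invented for this example and provide no real security.

```python
from typing import Callable, Dict, Iterable, Set

# Toy model of the ciphersuite plug-in pattern. Real TLS suites bundle
# key exchange, authentication, encryption, and MAC algorithms; here a
# "suite" is just a named pair of transforms.

class CipherSuite:
    def __init__(self, name: str,
                 encrypt: Callable[[bytes, bytes], bytes],
                 decrypt: Callable[[bytes, bytes], bytes]):
        self.name = name
        self.encrypt = encrypt
        self.decrypt = decrypt

REGISTRY: Dict[str, CipherSuite] = {}

def register(suite: CipherSuite) -> None:
    """Plug a new suite into the protocol without touching core code."""
    REGISTRY[suite.name] = suite

def negotiate(client_offers: Set[str], server_prefs: Iterable[str]) -> CipherSuite:
    """Pick the first mutually supported suite, in server preference order."""
    for name in server_prefs:
        if name in client_offers and name in REGISTRY:
            return REGISTRY[name]
    raise ValueError("no common ciphersuite")

# A deliberately trivial XOR "cipher" -- NOT secure, illustration only.
def xor(data: bytes, key: bytes) -> bytes:
    return bytes(b ^ key[i % len(key)] for i, b in enumerate(data))

register(CipherSuite("TOY_XOR", xor, xor))

suite = negotiate({"TOY_XOR", "TOY_NULL"}, ["TOY_XOR"])
ciphertext = suite.encrypt(b"hello", b"k")
assert suite.decrypt(ciphertext, b"k") == b"hello"
```

Because new suites only touch the registry, stronger algorithms can be added (and weak ones retired) years later without redesigning the negotiation logic, which is the essence of the modularity described above.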

The Gordian System

One of my most recent endeavors is Blockchain Commons’ Gordian system[14], a layered architecture for protecting digital assets and identity that has seen early successes with the protection of seeds through systems like SSKR[15] and CSR[16], and that focuses on the Gordian Principles of independence, resilience, privacy, and openness.

Blockchain Commons’ Mission: Advocating for the creation of open, interoperable, secure & compassionate digital infrastructure to enable people to control their own digital destiny and to maintain their human dignity online

In order to create an MVA that future-proofs the Gordian products, the Gordian architecture identifies points of potential interoperability and breaks the architecture into discrete components across those interoperable interfaces, thus allowing individual elements to be replaced. This was done both at the large-scale application level and at the small-scale programmatic level. It’s important everywhere.

At the large-scale application level, the Gordian system achieves interoperability by the careful architecting of both discrete applications and the ways that they can interact. Airgaps are a traditional methodology for introducing security into a digital asset system[17], but the Gordian system has expanded that to include Torgaps[18], a way of making transactions between connected applications both secure and non-correlatable. This modular approach is one way to enable future-proofing, and it’s only strengthened by systems such as airgaps and torgaps that tightly constrain communications between the modules.


At the small-scale programmatic level, the Gordian system introduces a layered stack of specifications that together enable the private and secure transmission of sensitive data. This stack includes dCBOR[19], Bytewords[20], URs[21], Animated QRs[22], Envelope[23], Gordian Transport Protocol[24], and Gordian Sealed Transaction Protocol[25]. Together these specifications allow for the deterministic storage of binary data (dCBOR), the alphabetic representation of binary data (Bytewords), the tagged display of that representation with functionality to support multipart data (URs), the QR display of multipart data (animated QRs), the structured & smart storage of content (Envelope), the communication of Envelopes (GTP), and the secure communication of Envelopes (GSTP). But we didn’t know what all the layers would be when we got started: this is another example of future-proofing, and one that easily arises from carefully layered specifications.
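The layering idea can be illustrated with a loose analogy in Python. These functions are invented stand-ins, not implementations of dCBOR, Bytewords, or URs; they only show how each layer consumes just the layer below it.

```python
import json

def to_deterministic(obj) -> bytes:
    """Stand-in for dCBOR: exactly one canonical byte encoding per value."""
    return json.dumps(obj, sort_keys=True, separators=(",", ":")).encode()

WORDS = ["able", "acid", "also", "apex"]  # toy 4-word alphabet, 2 bits per word

def to_words(data: bytes) -> str:
    """Stand-in for Bytewords: an alphabetic representation of binary data."""
    out = []
    for b in data:
        for shift in (6, 4, 2, 0):
            out.append(WORDS[(b >> shift) & 0b11])
    return " ".join(out)

def to_tagged(words: str, tag: str = "toy-envelope") -> str:
    """Stand-in for URs: a typed, self-describing wrapper around the words."""
    return f"ur:{tag}/{words.replace(' ', '-')}"

# Each layer depends only on the layer below, so any one layer can be
# swapped out (or new layers stacked on top) without touching the rest.
payload = {"note": "hi"}
print(to_tagged(to_words(to_deterministic(payload))))
```

The determinism of the bottom layer is what makes everything above it reliable: two parties encoding the same data always produce the same words and the same tagged string.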

Similarly, when Blockchain Commons creates its progressive use cases, we focus first on the requirements without needing to know the technology. The technological specifics can be filled in by ourselves or by individual vendors in the future.

By abstracting and separating architectural elements—whether they be large-scale components, layered specifications, or additional requirements found in progressive use cases—the Gordian system will be able to incorporate options that we are not even considering. The ultimate goal of all of these designs is to ensure that our MVA architecture does not limit itself, but instead remains flexible for the future.

Other MVA Examples

This type of MVA thinking is a pattern that can be widely successful and that doesn’t create some of the limitations that appear in MVP thinking. For example, when I was supporting the creation of the earliest specifications for Decentralized Identifiers (DIDs)[26], I was pleased to see us arrive at a compromise where core DID specifications were separated from specific DID methods and from signature suites. It’s an architecture that allows for a lot of future expansion.

Similarly, some of my earliest Blockchain Commons work was with a company that was adapting the Gordian architecture. Even though they weren’t planning to initially implement multi-sigs, I ensured that they didn’t make decisions that would lock them out of multi-sig usage in the future, because I was thinking of an MVA that went beyond the MVP they were focused on.

Coda: The Benefits of Coopetition

It can be quite hard for a single company to figure out an MVA. Thus, it’s great to work with other companies in your technology space.

This is particularly true if your industry supports coopetition, where business competitors can work together for a mutually beneficial good. If an industry supports interoperability, or one company adding services to another company’s products, then it’s a great candidate for coopetition—and thus MVAs are even more likely to be successful.

Blockchain Commons has been able to take advantage of this. A variety of companies have participated in the Gordian Developer Community[27], each contributing their own ideas and requirements for the Gordian architecture. In turn, they’ve then gone off and created open-source libraries that adapt the architecture[28], before beginning work on their own wallets that use the MVA that we cooperatively designed. A not-for-profit organization can be a great support for MVA work of this type; that’s what Blockchain Commons does.

Conclusion

Hollowing out spaces in architectures for future development and creating flexibility for the future through modular designs are two of the most successful methods for turning an MVP into an MVA. They give you something that supports minimal investment and agile development, while simultaneously maximizing the ability to scale and expand in the future.

We don’t always know the right solutions. We can’t predict what will work best. So the best we can do is create architectures that won’t lock us in to specific decisions about the future. By doing so, especially by working in coopetition to do so, we also ensure that no one company will lock us or our users into futures that we don’t agree with.

This article was originally drafted in 2021, and then back-burnered for various reasons. It’s been great to see a real explosion in discussion of MVA in the years since by authors such as Ekaterina Novoseltseva[9], Jorge Lebrato[11], and Murat Erder and Pierre Pureur[10], much of which reflects my own thoughts on MVA. Hopefully that means we’re moving in this direction!

1. Various. Retrieved 2021. “Minimum Viable Product”. Wikipedia. https://en.wikipedia.org/wiki/Minimum_viable_product.

2. Michael Sweeney. 2015, 2020. “5 Successful Startups That Began With an MVP”. Clearcode. https://clearcode.cc/blog/successful-startups-minimum-viable-product/.

3. Allan Kelly. 2020. “The MVP is broken: It’s time to restore the minimum viable product”. TechBeacon. https://techbeacon.com/app-dev-testing/mvp-broken-its-time-restore-minimum-viable-product.

4. Allegra Frank. 2020. “How one of the biggest games of 2020 became one of the most controversial”. Vox. https://www.vox.com/culture/22187377/cyberpunk-2077-criticism-ps4-xbox-one-bugs-glitches-refunds.

5. Andrea Contigiani. 2018. “The Downside of Applying Lean Startup Principles”. Knowledge at Wharton.

6. Mazdak Hashemi. 2017. “The Infrastructure behind Twitter: Scale”. Twitter blog.

7. Evan Weaver, quoted by James Governor. 2017. “Minimum Viable Architecture – good enough is good enough in an enterprise”. RedMonk. https://redmonk.com/jgovernor/2017/06/13/minimum-viable-architecture-good-enough-is-good-enough-in-an-enterprise/.

8. Deepak Karanth. 2016. “How to Create a Minimum Viable Architecture”. DZone. https://dzone.com/articles/minimum-viable-architecture.

9. Ekaterina Novoseltseva. 2022. “Minimum Viable Architecture”. Apiumhub. https://apiumhub.com/tech-blog-barcelona/minimum-viable-architecture/.

10. Pierre Pureur. 2021. “Minimum Viable Architecture: How To Continuously Evolve an Architectural Design over Time”. Continuous Architecture in Practice. https://continuousarchitecture.com/2021/12/21/minimum-viable-architecture-how-to-continuously-evolve-an-architectural-design-over-time/.

11. Jorge Lebrato. 2022. “What is a Minimum Viable Architecture (MVA) and why an iPaaS such as Anypoint Platform can help you achieve it”. Medium: Another Integration Blog. https://medium.com/another-integration-blog/what-is-a-minimum-viable-architecture-mva-and-why-an-ipaas-such-as-anypoint-platform-can-help-you-f54c9791f6c3.

12. Various. Retrieved 2021. “Forward Secrecy”. Wikipedia. https://en.wikipedia.org/wiki/Forward_secrecy.

13. Uncredited. 2020. “Cipher Suites and TLS Protocols”. SSLs.com Blog. https://www.ssls.com/blog/cipher-suites-and-tls-protocols/.

14. Various. Retrieved 2024. “Blockchain Commons Developer pages”. Blockchain Commons website. https://developer.blockchaincommons.com/.

15. Various. Retrieved 2024. “SSKR: Sharded Secret Key Reconstruction”. Blockchain Commons website. https://developer.blockchaincommons.com/sskr/.

16. Various. Retrieved 2024. “CSR: Collaborative Seed Recovery”. Blockchain Commons website. https://developer.blockchaincommons.com/csr/.

17. Various. Retrieved 2024. “Air Gaps”. Blockchain Commons website. https://developer.blockchaincommons.com/airgap/.

18. Various. Retrieved 2024. “Torgaps”. Blockchain Commons website. https://developer.blockchaincommons.com/torgap/.

19. Various. Retrieved 2024. “Deterministic CBOR (dCBOR)”. Blockchain Commons website. https://developer.blockchaincommons.com/dcbor/.

20. Various. Retrieved 2024. “Bytewords”. Blockchain Commons website. https://developer.blockchaincommons.com/bytewords/.

21. Various. Retrieved 2024. “Uniform Resources (URs)”. Blockchain Commons website. https://developer.blockchaincommons.com/ur/.

22. Various. Retrieved 2024. “Animated QRs”. Blockchain Commons website. https://developer.blockchaincommons.com/animated-qrs/.

23. Various. Retrieved 2024. “Gordian Envelope”. Blockchain Commons website. https://developer.blockchaincommons.com/envelope/.

24. Shannon Appelcline, Wolf McNally & Christopher Allen. 2024. “Gordian Transport Protocol / Envelope Request & Response Implementation Guide”. GitHub. https://github.com/BlockchainCommons/Research/blob/master/papers/bcr-2024-004-request.md.

25. Wolf McNally & Christopher Allen. 2023. “Gordian Sealed Transaction Protocol (GSTP)”. GitHub. https://github.com/BlockchainCommons/Research/blob/master/papers/bcr-2023-014-gstp.md.

26. Drummond Reed, Manu Sporny, Dave Longley, Christopher Allen, Ryan Grant & Markus Sabadello. 2021. “Decentralized Identifiers (DIDs) v1.0”. W3C. https://www.w3.org/TR/did-core/.

27. Various. Retrieved 2024. “Gordian Developer Community”. GitHub. https://github.com/BlockchainCommons/Gordian-Developer-Community.

28. Various. Retrieved 2024. “Blockchain Commons Libraries”. Blockchain Commons website. https://developer.blockchaincommons.com/libraries/.

Tuesday, 18. June 2024

OpenID

Digital Identity at the G20


On June 18, 2024 the OpenID Foundation’s Executive Director, Gail Hodges, spoke about Digital Identity at the G20 during the Digital Government and Inclusion Workshop. The following are her prepared remarks.

 

Bom dia and hello. I’d first like to applaud the Brazilian Government for your impressive work on Digital Identity here in Brazil, and for your G20 leadership. Brazil is modeling the kind of multi-stakeholder approach we need to enable Digital Public Infrastructure, data sharing, and Digital Identity. The G20 is the ideal forum to accelerate work in this area globally.

Today I’d like to suggest a vision for Digital Identity. We have a UN Sustainable Development Goal 16.9 for Identity, with the principle that 8 billion people should have access to an identity credential. What should our goal be for Digital Identity? Should 8 billion people also have the right to a digital identity credential? How can we achieve social inclusion if all 8 billion do not have the option to fully participate in the digital economy? 

If we have a Digital Identity for everyone, what should it feel like? I suggest that it should be as easy for people to assert their Digital Identity credential as it is to assert their email or phone number.

It is possible to achieve these goals – the technology is not the barrier. But it will take G20 leadership, national leadership, multi-stakeholder collaboration (like our conversation today), and crucially… it will take global open standards.

The OpenID Foundation is one of the open standards bodies at the center of Digital Identity specification development. We seek to offer secure and interoperable standards that respect domestic sovereignty. Our most popular standard is OpenID Connect, which is currently used by over 3 billion people across millions of applications. The OpenID Foundation also supports 28+ countries that have selected our OpenID for Verifiable Credential specifications, and another 12 countries have selected the OpenID Foundation’s high-security profile for data sharing, called FAPI. In fact, Brazil is one jurisdiction where FAPI was selected, and the OIDF is proud to support Brazil’s Open Finance and Open Insurance programs by offering certification to all ecosystem participants.

Last year the OpenID Foundation and 12 other non-profits announced a white paper titled “Human-Centric Digital Identity” in conjunction with the OECD’s Recommendations on the Governance of Digital Identity. In that paper we recognized that countries have a mix of operating models that align to their technical implementations. Some countries lean to centralized models, others to “decentralized” models. Some countries are government-led and others are private sector-led. From our vantage point all of these are legitimate models. There is no single way to develop and deliver a Digital Identity program. However, if you want to achieve social inclusion and a human-centric approach, you need to stare hard at your domestic model to ensure no one is left behind.

So what about your country? For those of you in countries at the start of your Digital Identity journey, I suggest you answer one critical question early on in your program development. Will you use global standards or will you develop your own local specifications? It is your choice. But I encourage you to take that decision at the most senior levels. Global standards offer you confidence in the security model, technical interoperability, the ability to scale, interoperability across borders, and resistance to vendor and consulting-provider lock-in. Local standards could place limitations on the ability of your people and your businesses to thrive outside of the local context, and could open you up to security threats if you become the “weakest link” relative to your peers. Even if you choose to leverage open source code, I encourage you to ensure that the open source code and your local implementations are certified as conformant to global open standards.

I assure you, the trend is already toward global standards. It was the rallying cry in Cape Town last month during ID4Africa. In recent weeks I was delighted to hear friends focused on global south countries, from the World Bank, UNDP, GovStack, MOSIP, and the Center of Digital Public Infrastructure, all encouraging use of global standards and moving towards certification to global standards. Similarly, the European Digital Wallet program has been shaking up stakeholders in the global north with its Architectural Reference Framework, which leverages global standards that all EU member states will need to conform to so that European countries can interoperate.

Unfortunately, Digital Identity standards are complicated: there is not a single place or a single playbook to follow at this time. I encourage you to embrace the complexity. The strategy you develop could well include global standards from ISO, the IETF, the W3C, the OpenID Foundation, as well as best practices from other organizations like NIST to help avoid bias in your biometric algorithms – and you might want to consider using open source code to help you accelerate down the adoption curve. Either way, you are likely going to have to embrace some complexity in-house in order for your residents and businesses to benefit from simple user experiences.

Some of you represent countries with mature Digital Identity programs. We applaud you for being early adopters. I have a different question for you. How will you serve your residents and businesses that need to transact across borders? Is it worth investing in capabilities that will allow cross-border interoperability? You might want to ask yourselves, what percentage of your GDP is driven by cross-border trade, how important your global diaspora is, and / or how often your citizens travel abroad. Or you might look at Digital Identity in terms of how it can enhance your national security posture.   With $1.4T lost annually to cybercrime globally, we all have room to improve to better protect our residents, our businesses, and our security posture. I also encourage countries with mature Digital Identity programs to take a leadership role in the work to develop global open standards, and to work on achieving cross-border interoperability of digital identity in practice.   As David said in the earlier panel, the transformation and Digital Identity leadership in the global south is impressive … but global south representatives and the entities that fund their transformation are much less active in global standards bodies working on Digital Identity.

Earlier, I offered a vision of what good can look like: 8 billion people with Digital Identity, using their credentials seamlessly. But what does it look like when it goes wrong and we do not have global standards? One example is the train tracks where train gauges do not line up and people and goods have to move from one train to another.

In a new project called the Sustainable and Interoperable Digital Identity HUB or SIDI Hub, we are challenging ourselves to tackle the question of cross-border interoperability of Digital Identity. SIDI Hub is a multi-stakeholder community comprised of more than 25 countries from the global north and global south, 25 non-profits, and many of the major multinational organizations. In the last 7 months since we formed SIDI Hub, we held 3 summits on two continents, and we will hold three more summits on three additional continents this year. We encourage the G20 to leverage multi-stakeholder forums like the SIDI Hub to ensure that the principles you develop can be implemented in practice, all the way down to the protocol layer, in a way that millions of developers will be able to implement against those policies by default. Only then can your residents benefit from digital identity as a “public good”.

I will leave you with one last tip. If you want to achieve domestic or global interoperability, you need to test and certify implementations to a common specification, and then maintain conformance to that specification. When you multiply this by millions of entities — and millions of developers— testing, certification and conformance become pivotal.

Many thanks.

The following comments were made in response to other speakers and the Q&A:

First I’d like to agree with Adam: the critical path for any jurisdiction is domestic use cases. My ask is to ensure that each G20 jurisdiction reserve some thought for cross-border interoperability. We have heard from Adam about the progress with the EU in cross-border interoperability, and we know in the African Union they also want to enable cross-border trade and interoperability of Digital Identity deployments across Africa. From Hudson’s comments we know that Latin American interoperability is also growing in interest, and if we had an Asian representative on the panel we would probably hear the same from them.

I’d like to elaborate on one of the key ways we need to enable cross border Digital Identity. It starts with identifying use cases. To date with SIDI Hub we have identified 30 potential “champion” cross-border use cases. Let me offer four examples that are bubbling towards the top.

In Africa, the most popular use case was “cross-border trade” – helping people living lives along a geographic border, a use case that ranked lower in Europe for obvious reasons. A second use case was “helping people assert their educational and employment certifications across borders”, a use case that can serve all migrants, whether they are high income or low income. A third example is the “refugee” use case. UNHCR currently cares for 120 million refugees and they need to deliver on their mission to serve these individuals from the countries they originate from, through the UN system including host countries, all the way to any future destination country or back to their home jurisdiction. The fourth example is “opening a bank account”, which can apply to students, employee relocations, or any other migration use case.

We will not select “champion use cases” until later this year, but we already know that we need these use cases to be able to flesh out the minimum technical requirements to enable cross border interoperability and to map the trust frameworks across borders.

About the OpenID Foundation

The OpenID Foundation (OIDF) is a global open standards body committed to helping people assert their identity wherever they choose. Founded in 2007, we are a community of technical experts leading the creation of open identity standards that are secure, interoperable, and privacy preserving. The Foundation’s OpenID Connect standard is now used by billions of people across millions of applications. In the last five years, the Financial Grade API has become the standard of choice for Open Banking and Open Data implementations, allowing people to access and share data across entities. Today, the OpenID Foundation’s standards are the connective tissue to enable people to assert their identity and access their data at scale, the scale of the internet, enabling “networks of networks” to interoperate globally. Individuals, companies, governments and non-profits are encouraged to join or participate. Find out more at openid.net.

The post Digital Identity at the G20 first appeared on OpenID Foundation.


Velocity Network

Authoritative Sources for Verifiable Credentials – Part 3

Issuer permissions are the mechanism that Velocity Network introduces to enable relying parties (and wallets) to determine if an issuer is an authoritative source for a particular credential. After requesting the ability to issue on the Network, the request is reviewed by Velocity Network to ensure that the issuing service parameters are within the remit of the organization’s business activities.
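As a rough illustration of the kind of check issuer permissions enable, the sketch below shows a relying party consulting a registry to decide whether an issuer is authoritative for a given credential type. This is a hypothetical model, not Velocity Network's actual API; all identifiers, names, and data structures are invented for the example.

```python
# Hypothetical issuer-permission registry: maps an issuer identifier to the
# credential types it has been approved to issue. In a real network this
# registry would be maintained on-chain after review, not hard-coded.
ISSUER_PERMISSIONS = {
    "did:velocity:acme-university": {"EducationDegree", "CourseCertificate"},
    "did:velocity:globex-corp": {"EmploymentHistory"},
}

def is_authoritative(issuer_did: str, credential_type: str) -> bool:
    """Return True only if the registry lists this issuer for this credential type."""
    return credential_type in ISSUER_PERMISSIONS.get(issuer_did, set())

# A relying party (or wallet) would run this check before trusting a credential.
print(is_authoritative("did:velocity:acme-university", "EducationDegree"))  # True
print(is_authoritative("did:velocity:globex-corp", "EducationDegree"))      # False
```

The point of the lookup is that trust is scoped: an issuer recognized for employment records is not thereby authoritative for degrees.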

FIDO Alliance

AWS Expands MFA Requirements, Boosting Security and Usability with Passkeys


AWS has announced the introduction of FIDO passkeys for multi-factor authentication (MFA) to further secure customer accounts. This move aligns with AWS’s objective to offer a secure cloud environment by incorporating secure-by-design and safe-by-default principles. FIDO passkeys offer a strong and easy MFA option, leveraging public key cryptography to resist phishing attempts and enhance overall account protection.
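The toy sketch below illustrates the origin-binding property that makes passkeys resistant to phishing: the credential is scoped to the site it was registered on, so a look-alike domain can never obtain a valid assertion. This is a conceptual sketch only; it uses HMAC as a stand-in for the real public-key signature an authenticator produces, and all names and URLs are illustrative.

```python
import hashlib
import hmac
import secrets

def register(origin: str) -> dict:
    """Create a per-origin credential (a real passkey is an asymmetric key pair)."""
    return {"origin": origin, "key": secrets.token_bytes(32)}

def sign_assertion(credential: dict, origin: str, challenge: bytes):
    """The authenticator only responds for the origin the credential was created on."""
    if credential["origin"] != origin:
        return None  # a phishing domain never receives an assertion
    return hmac.new(credential["key"], origin.encode() + challenge, hashlib.sha256).digest()

def verify(credential: dict, origin: str, challenge: bytes, assertion) -> bool:
    """Server-side check: recompute the expected value over the same origin and challenge."""
    expected = hmac.new(credential["key"], origin.encode() + challenge, hashlib.sha256).digest()
    return assertion is not None and hmac.compare_digest(expected, assertion)

cred = register("https://signin.example.com")
challenge = secrets.token_bytes(16)

ok = sign_assertion(cred, "https://signin.example.com", challenge)
phish = sign_assertion(cred, "https://signin.examp1e.com", challenge)
print(verify(cred, "https://signin.example.com", challenge, ok))  # True
print(phish)  # None: the credential never responds to the wrong origin
```

Because there is no shared password to type, there is nothing for a look-alike site to capture and replay.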


Velocity Network

Velocity Network’s Architecture for Issuer Trust – Part 2

Velocity Network aims to migrate career data to a three-party data exchange model with reliable data. This architecture is key to the revolution in career data that is waiting to happen. Without the three-party model, relying parties must create API integrations with a source of trusted digitized data from a single, often monopolistic, trusted issuer.

Monday, 17. June 2024

FIDO Alliance

ID Talk Podcast: Passkeys, Standards, and Selfie Certification with FIDO’s Andrew Shikiar


The FIDO Alliance, founded in 2012, stands as a pivotal organization in the identity technology sector, advocating for strong passwordless authentication mechanisms. The Alliance has been instrumental in establishing influential industry standards, promoting the adoption of biometrics, and enhancing digital security through two-factor and multi-factor authentication technologies.

This week, Andrew Shikiar, FIDO’s Executive Director and CEO, joins the ID Talk podcast to discuss critical issues in authentication and identity security. The conversation covers topics such as the intricacies of passkeys, the dangers of phishing and deepfakes, and the comprehensive testing FIDO certified products undergo with independent, accredited labs to gain FIDO certification. Additionally, Shikiar introduces FIDO’s new Face Verification Certification program, aimed at standardizing selfie-based identity verification technologies across various sectors. Gain valuable insights from Andrew Shikiar by tuning into the podcast, available on Soundcloud, Spotify, Apple Podcasts, or using the link below.


Hyperledger Foundation

Developer Showcase Series: Nithin Pankaj, Senior Software Engineer, Walmart

Back to our Developer Showcase Series to learn what developers in the real world are doing with Hyperledger technologies. Next up is Nithin Pankaj, a Senior Software Engineer at Walmart. 



EdgeSecure

AI Teaching & Learning Symposium, presented by Edge and Seton Hall University

Read all about this SOLD OUT event! The post AI Teaching & Learning Symposium, presented by Edge and Seton Hall University appeared first on NJEdge Inc.

Edge and Seton Hall University hosted the inaugural AI Teaching & Learning Symposium on June 11, 2024, to explore the impact of AI on teaching, learning, and the student experience. The sold-out event was held at the University’s Bethany Hall and included a student panel, breakout sessions, lightning sessions, and the opportunity to connect with industry leaders, exhibitors, and fellow members.

Engaging with Generative AI Tools
The day’s events kicked off with a student panel, Experiencing Generative AI Insights from the Legal Foundations of Business, Disruption, Technology & Law, and Advanced Topics. Seton Hall University students Thiago Alves, Filip Malesev, Kathleen Meagher, Victoria Torna, Nicole Voltmer, and Jasmine Patel joined moderator Julia Boivin to discuss their experiences engaging with generative AI (GenAI) tools in their coursework and the applications, challenges, and outcomes they have encountered in integrating AI into student workflows. The panel shared a closer look at the courses and GenAI tools used and how the GenAI Journal helped enhance students’ understanding of AI technology.

During the session, the group also discussed AI’s potential in learning and its practical application in various projects, as well as insight into the development of skills in prompt engineering for AI interactions, specifically with ChatGPT. Students shared ways to communicate with AI effectively and ethically and how the technology can impact student creativity and problem-solving skills through AI-assisted analysis and brainstorming. In recounting their experiences, the group also explained the technical and ethical challenges that were encountered, the strategies that improved the effectiveness of using AI in their coursework, and how these tools could positively impact academic performance and engagement. Most importantly, the student panel reflected on how the experience has prepared them for an AI-driven professional landscape and what they see for AI’s role in their future careers.

“This was one of the best conferences/symposiums I have ever attended. Every session was exceptional. The energy was extremely positive!” — Kate S., Instructional Designer

Empowering Personalized Learning with AI
The integration of AI and generative AI in higher education is revolutionizing student learning, teaching, and research. To explore this topic more in depth, Sergio Ortega, Business Development Lead for Artificial Intelligence/Machine Learning Worldwide Public Sector, Amazon Web Services, led the afternoon session, Industry Perspectives: Empowering Personalized Learning and Innovation with AI in Education. The presentation discussed how Amazon Web Services (AWS) is empowering universities to leverage these technologies to create personalized learning experiences, automate administrative tasks, and drive innovation.

The session looked at the potential of AI to analyze student data and provide personalized recommendations, as well as using GenAI tools to create customized learning materials that adapt to each student’s learning style and pace. Ortega examined how AI and generative AI can streamline administrative tasks, such as grading and course scheduling; helping to free up time for faculty to focus on teaching and research. Participants learned how AWS is supporting research and innovation in higher education by providing the infrastructure and tools necessary to analyze large datasets, identify patterns and trends, and develop new insights and theories.

Attendees of the symposium also had the opportunity to attend lightning sessions, which were shortened presentations exploring certain topics. The breakfast presentation, Adobe’s Perspective on Gen AI, was led by Stephen Hart, Principal Customer Manager, Adobe Education – New York, and reviewed some of Adobe’s most recent AI offerings, including the company’s ethical approach to creation and discovery. The lunch lightning session, Navigating AI in Higher Education: Engaging Faculty and Enhancing Student Success, delved into the transformative impact of AI in higher education, focusing on faculty engagement, student support, and learning journeys.

Presenters, Cole Galyon, Vice President, Academic Innovation, Anthology, and Dr. Jae Kim, Senior Instructional Designer, William Paterson University, explored how they used the insights and feedback from faculty to create a multifaceted approach to AI adoption across campuses and how it was designed to address both opportunities and challenges. Participants experienced a live demonstration of the AI Design Assistant found within Blackboard Learn, and how this tool has the potential to streamline administrative tasks and enhance educational outcomes. Dr. Kim explained how the University is leveraging Blackboard Learn Ultra to foster an interactive learning environment that empowers faculty and supports student success.

“The content covered in the conference was excellent and very helpful.”

— Patrick S., Professor

The Transformative Impact of Generative AI Tools
Christopher Petruzzi, Manager, User Interface and Multimedia Design, Seton Hall University, led the breakout session, Enhancing Creative Workflows with Generative AI in Adobe Photoshop, to discuss the transformative impact of GenAI tools in Adobe Photoshop. With two decades of Photoshop experience, Petruzzi has witnessed numerous advancements from Adobe and has seen how technology has enhanced both the creative/editing applications within higher education, specifically at Seton Hall University’s Teaching, Learning and Technology Center (TLTC).

The introduction of generative AI into Adobe Photoshop has revolutionized the institution’s Digital Media Team’s workflow by automating routine tasks such as image corrections and creations. Petruzzi shared how these advancements have not only accelerated their project timelines, but also expanded their creative possibilities. Attendees gained an in-depth look at how these AI tools have altered Seton Hall’s digital workflows—improving efficiency and allowing for the production of more complex and creative outputs.

With a live demonstration of key generative AI features within Adobe Photoshop, Petruzzi illustrated their practical applications which have been instrumental in enhancing workflows at the TLTC. The audience saw how AI-enhanced tools are used in real-world scenarios to fix, create, and enhance images, and how they are useful for producing high-quality content and marketing materials.

The Impact of ChatGPT on First-Year Writing
Redefining the Write Path: Exploring the Impact of Generative AI on First-Year Writing Education, presented by Nikki Bosca, Associate Director, Online Teaching and Course Development, New Jersey Institute of Technology (NJIT), shared the impact of ChatGPT on their First-Year Writing (FYW) courses. Taking the audience through their case study, Bosca showed how the research was anchored by an integrated framework drawing from Cognitive Process and Sociocultural Theories of Writing, and addressed four central questions examining perceptions, challenges, and opportunities for integrating generative AI into FYW instruction.

Utilizing a multiple methods case study approach, including interviews and surveys, the study provided nuanced insights into the dynamics of ChatGPT integration. Bosca explained how preliminary findings revealed diverse perspectives among instructors and students, which have challenged prevailing opinions on generative AI in student writing. As ChatGPT becomes more prevalent, this research has informed effective AI utilization that does not compromise the quality of student learning in FYW courses.

“The student panel was excellent! I found the agenda on the postcard super useful! It kept me off my devices. Overall it was a pleasure attending. Thanks so much for your efforts!”

— Elizabeth P., Senior Instructional Designer

Keeping Up with GenAI at Montclair State University
In January 2023, the instructional design team at Montclair State University began ideating a response to the advances in AI. Since then, Instructional Technology and Design Services (ITDS) has produced a suite of web-based resources, workshops and trainings, and consultations to guide their faculty through discovery and exploration of GenAI and how it can be leveraged pedagogically and ethically. In this session, The Amazing Race: Keeping Up with GenAI at Montclair State University, Montclair instructional designers, Joe Yankus and Gina Policastro, shared their experience composing these resources and facilitating small and large-group faculty development, as well as their lessons learned and the goals for the upcoming year.

The Transformative Potential of GenAI in Higher Education
John Shannon, Professor, Seton Hall University, and Susan A. O’Sullivan-Gavin, Professor, Rider University, joined together to present, Integrating Generative AI in Higher Education Learning Environments: Opportunities and Challenges. The presentation explored the transformative potential of GenAI and how it can help enhance learning environments in higher education. Attendees received a comprehensive overview of the benefits, challenges, and best practices of incorporating GenAI at an institution and the importance of guiding students in its use to gain “AI fluency.”

The session examined the importance of integrating GenAI in modern teaching and learning and the ways this advanced technology can enhance personalized learning, critical thinking, and digital literacy. GenAI can also facilitate the development of innovative teaching methods and improve student engagement and outcomes. Presenters offered guidance on using GenAI ethically and effectively, including academic integrity and plagiarism concerns, as well as issues related to transparency, privacy, accessibility, bias, accuracy, security, and regulatory challenges.

Creating Meaningful Learning Environments
With a look at the innovative ways AI tools can be used in the classroom, Centering Students and Using AI within Meaningful Learning Environments, explored how instructors can use AI to engage students in active, collaborative, constructive, authentic, and goal-directed activities, while also achieving culturally responsive teaching. This student-centered approach enriches how AI tools are used within meaningful learning experiences and is achieved when teachers situate students’ lived experiences, frames of reference, and ways of being as resources for learning. Presenter, Manny Algarin, Director of Education, Ed.D. Candidate, New Jersey City University, shared a process for designing meaningful learning experiences, a list of AI tools that promote meaningful engagement, and a culturally responsive framework that advances meaningful technology integration.

Robbie Melton, Provost, Tennessee State University, led another interactive session called Convenience to Competence: A Spectrum for Purposeful AI Integration in Education. The presentation introduced educators to the Arrighi AI-C2 Utilization Spectrum, a framework for conceptualizing how learners develop skills and competencies through the purposeful integration of AI tools into education. Attendees also received an overview of the Spectrum’s five stages from basic use to autonomous innovation.

“It was a great symposium – a nice variety of sessions. One enhancement for breakfast would be to include a yogurt option for those have gluten free dietary restrictions. The fruit was fresh and delicious. Very nice campus and good presenters. Kudos to Seton Hall and Edge for a well executed event! Thank you.”

— Abigail H., Senior Technology Trainer

Harnessing the Power of AI
The afternoon breakout sessions included Universal Access Through AI: Leveraging AI for Inclusive Education, presented by Jaimie Dubuque, Teaching and Learning Technologist, Rider University. In this presentation, participants learned how AI can be leveraged in teaching and learning to enhance accessibility and the practical strategies for implementing AI in planning, assessing, and supporting students in higher education. The session also examined the potential impact of AI on IEPs and 504s.

Also looking at the unparalleled opportunities that AI can unlock in the classroom, Muhammad Hassan, Executive Director, Nancy Thompson Learning Commons, Kean University, led the breakout session, Revolutionize Classroom Learning with AI. The presentation discussed how AI-powered platforms can personalize learning content, recommend supplementary materials, and facilitate real-time feedback. Hassan shared ways AI can enable the curation of unique assessments tailored to individual learning objectives and contexts, and explained that AI algorithms can analyze students’ performance data and preferences to design more relevant, relatable, and reflective assessments of real-world scenarios. By embracing AI in assessment design, Hassan said, educators can cultivate critical thinking, creativity, and problem-solving skills while providing students with meaningful feedback and evaluation metrics.

In Lessons Learned, Actions Unleashed: Unlocking Our Potential, Diane Rubino, Adjunct Professor, New York University, led a candid discussion of what worked well and the opportunities for growth in integrating AI tools at their institution. The collaborative conversation explored several questions: What were the outcomes of the new teaching methods used? What wisdom can we extract from decisions that didn’t quite work out? And what action items can we identify to address challenges and improve the learning environment for ourselves and our students?

Improving Student Readiness
The panel presentation, Using AI Avatar Patients to Increase Student Readiness, explored how AI can be used to help educators prepare students to enter the workforce after graduation. Seton Hall University’s Leslie Rippon, Associate Professor, Department of Athletic Training, Genevieve Zipp, Professor, Program Director of Ph.D. in Health Sciences, and Lorene Cobb, Assistant Professor, Department of Physical Therapy, talked about the pedagogical design and implementation of AI avatar patients in healthcare interprofessional curricula. The session explored student readiness, which includes perceptions of knowledge, attitude, and ability, and how experiences are impacted by context-specific readiness.

Contextual readiness includes socio-political, community, organization, financial, and learning resources and opportunities that influence the experience. Presenters looked at how experiences should promote active learning and simulate real-world scenarios to foster student readiness appropriately. Developing and delivering real-world learning experiences can present many challenges, including contextual characteristics, limited space for hands-on engagement, increased staffing needs, scheduling conflicts, and costs associated with patient actors or experiential opportunities. The session highlighted how integrating AI and virtual reality (VR) can address these challenges and generate human-like content in response to complex and varied prompts. Attendees gained an inside look at the data on financial impact and students’ perceived readiness post-AI-VR experiences and the recommendations for future integration into the broader educational curriculum.

John Baldino, Director, Center for Teaching and Learning, Lackawanna College added to the discussion in Machine Morality: Ethical and Creative Uses of AI for Faculty and Students. The professional development workshop examined AI, its role in education, and the opportunities for students and teachers to use the emerging technology to enhance teaching and learning.

As AI continues to be at the forefront of strategic planning discussions within academia, Edge will further explore how advanced technology can improve institutional effectiveness, enhance teaching and learning, expand research capabilities, and shape the skill sets that will be essential in tomorrow’s workforce. Edge events like the AI Teaching & Learning Symposium provide exciting opportunities to bring together professionals, thought leaders, and industry experts who can share valuable insight into the latest technologies and how institutions can leverage real-life solutions on their campuses to help transform education.

Featured Sessions

9:35 – 10:35 a.m. Student Panel: Experiencing Generative AI Insights from the Legal Foundations of Business, Disruption, Technology & Law, and Advanced Topics

Bethany Hall Multipurpose Room C

This student-moderated panel will focus on their experiences engaging with generative AI tools in their coursework, emphasizing their applications, challenges, and outcomes. The insights are drawn from the Legal Foundations of Business, Disruption, Technology & Law, and Advanced Topics courses. The discussion will showcase practical applications of generative AI in learning environments, highlight the challenges faced and solutions students implement using AI tools, and describe the outcomes and learnings from integrating AI into student workflows.

The panel will offer an overview of the courses and generative AI tools used; discuss how the GenAI Journal enhanced students’ understanding of AI technology, its potential in learning, and its practical application in various projects; offer insight into the development of skills in prompt engineering for AI interactions, specifically with ChatGPT, and understanding how to communicate with AI effectively and ethically; describe the impact on student creativity and problem-solving skills; recount the technical and ethical challenges encountered and strategies for effective use of AI in coursework; describe key takeaways from using generative AI tools and their impact on academic performance and engagement; discuss the increased creativity and ability to generate novel ideas using AI while improving problem-solving skills through AI-assisted analysis and brainstorming; and reflect on how the experience has prepared students for an AI-driven professional landscape and AI’s role in their future careers.

Moderator:

Julia Boivin, Seton Hall University

Panelists:

Thiago Alves, Seton Hall University
Filip Malesev, Seton Hall University
Kathleen Meagher, Seton Hall University
Victoria Torna, Seton Hall University
Nicole Voltmer, Seton Hall University
Jasmine Patel, Seton Hall University

1:30 – 2:10 p.m. Industry Perspectives: Empowering Personalized Learning and Innovation with AI in Education

Bethany Hall Multipurpose Room C

The integration of AI and generative AI in higher education is revolutionizing the way students learn, faculty teach, and research is conducted. This presentation will explore how Amazon Web Services (AWS) is empowering universities to leverage these technologies to create personalized learning experiences, automate administrative tasks, and drive innovation. We will discuss the potential of AI to analyze student data and provide personalized recommendations, as well as the use of generative AI to create customized learning materials that adapt to each student’s learning style and pace. Additionally, we will examine how AI and generative AI can streamline administrative tasks, such as grading and course scheduling, freeing up time for faculty to focus on teaching and research. The presentation will also highlight how AWS is supporting research and innovation in higher education by providing the infrastructure and tools necessary to analyze large datasets, identify patterns and trends, and develop new insights and theories. Overall, this presentation will demonstrate how the use of AI and generative AI in higher education is improving student outcomes, advancing the field of research, and transforming the way universities operate.

Presenter:

Sergio Ortega, Business Development Lead for Artificial Intelligence/Machine Learning Worldwide Public Sector, Amazon Web Services

Bio:  Sergio Ortega is an accomplished executive with 20+ years of experience in AI, machine learning, analytics, and public sector industries. He excels in driving organizational excellence, fostering inclusive cultures, and solving customer challenges. Sergio has 15+ years of experience in cloud computing, PaaS, SaaS, and solution selling, with expertise in program management and go-to-market strategies. Currently, he is the Business Development Lead for AI/ML WW Public Sector at Amazon Web Services. Previously, he held leadership roles at Microsoft, including Metaverse, IoT, Light Edge Azure Engineering Global GTM and Ecosystem Lead. Sergio holds advanced degrees in data science, business, computer science, and cybernetics. His extensive experience and academic credentials make him a valuable asset to any organization.

Lightning Sessions

9:00 – 9:20 a.m. Breakfast Lightning Session—Adobe’s Perspective on Gen AI

Bethany Hall Multipurpose Room C

Stephen will review some of Adobe’s most recent AI offerings including the company’s ethical approach to creation and discovery.

Presenter:

Stephen Hart, Principal Customer Manager, Adobe Education – New York

12:45 – 1:05 p.m. Lunch Lightning Session—Navigating AI in Higher Education: Engaging Faculty and Enhancing Student Success

Bethany Hall Multipurpose Room C

This presentation will delve into the transformative impact of AI in higher education, focusing on faculty engagement, student support, and learning journeys. Drawing on insights and feedback from faculty, we will explore the multifaceted approach to AI adoption across campuses, addressing both opportunities and challenges. Participants will also experience a live demonstration of the AI Design Assistant found within Blackboard Learn, showcasing its potential to streamline administrative tasks and enhance educational outcomes. Attendees will also hear from William Paterson University and how they are leveraging Blackboard Learn Ultra to foster an interactive learning environment that empowers faculty and supports student success. This presentation aims to foster an interactive dialogue among attendees, encouraging sharing experiences and strategies for effectively integrating AI into higher education.

Presenters:

Cole Galyon, Vice President, Academic Innovation, Anthology
Dr. Jae Kim, Senior Instructional Designer, William Paterson University

Breakout Sessions

Session 1: 10:45 – 11:25 a.m. Enhancing Creative Workflows with Generative AI in Adobe Photoshop

Bethany Hall Multipurpose Room A

Explore the transformative impact of Generative AI tools in Adobe Photoshop during this presentation. With two decades of Photoshop experience, I have witnessed numerous advancements from Adobe. What I was originally doing by hand with the primitive lasso tool of Photoshop 7.0 in 2002 (has it been that long?)…can now be done at the click of one button in the latest Creative Suite.

This technology has enhanced both the creative/editing applications within higher education, specifically at Seton Hall University’s Teaching, Learning and Technology Center.

The introduction of Generative AI into Adobe Photoshop has revolutionized our Digital Media Team’s workflow by automating routine tasks such as image corrections and creations. These advancements have not only accelerated our project timelines but also expanded our creative possibilities. This session will provide an in-depth look at how these AI tools have altered our digital workflows, improving efficiency and allowing for the production of more complex and creative outputs.

The presentation will include a live demonstration of key Generative AI features within Adobe Photoshop, illustrating their practical applications which have been instrumental in enhancing our workflow at the TLTC. Attendees will see how AI-enhanced tools are used in real-world scenarios to fix, create, and enhance images, useful for producing high-quality content and marketing materials.

This session is tailored for individuals eager to learn about the potential of AI in enhancing digital media production, specifically within the Adobe Creative Suite. By sharing practical examples and personal insights from years of experience, I aim to inspire attendees to integrate these innovative tools into their own workflows, pushing the boundaries of what is possible in educational technology. 

It’s also just really cool to digitally insert a T-Rex in the middle of the campus green at the press of a button.

Presenter:

Christopher Petruzzi, Manager, UI and Multimedia Design, Seton Hall University

Redefining the Write Path: Exploring the Impact of Generative AI on First-Year Writing Education

Bethany Hall Multipurpose Room B

This case study explores the impact of ChatGPT on First-Year Writing (FYW) courses at a public university. Anchored by an integrated framework drawing from Cognitive Process and Sociocultural Theories of Writing, the research addresses four central questions examining perceptions, challenges, and opportunities for integrating generative AI into FYW instruction. Concerns range from academic integrity to student anxiety, with a focus on how instructors navigate AI-assisted assignments and the implications on the writing process and outcomes. Utilizing a multiple methods case study approach, including interviews and surveys, the study provides nuanced insights into the dynamics of ChatGPT integration. Preliminary findings reveal diverse perspectives among instructors and students, challenging prevailing opinions on generative AI in student writing. As ChatGPT becomes more prevalent, this research informs effective utilization without compromising the quality of student learning in FYW courses.

Presenter:

Nikki Bosca, Associate Director, Online Teaching and Course Development, New Jersey Institute of Technology

The Amazing Race: Keeping Up with GenAI at Montclair State University

Bethany Hall Multipurpose Room C

In January 2023, the instructional design team at Montclair State University began ideating a response to the advances in artificial intelligence, which broke headlines in late 2022. Since then, Instructional Technology and Design Services (ITDS) has produced a suite of web-based resources, workshops and trainings, consultations, and more to guide University faculty through discovery and exploration of GenAI to be leveraged pedagogically and mitigate misuse. In this session, Montclair instructional designers Joe Yankus & Gina Policastro will share their experience composing these resources, facilitating small and large-group faculty development, lessons learned, and goals for the upcoming year. 

Presenters:

Joseph Yankus, Instructional Designer, Montclair State University
Gina Policastro, Instructional Designer, Montclair State University

Session 2: 11:35 a.m. – 12:15 p.m. Integrating Generative AI in Higher Education Learning Environments: Opportunities and Challenges

Bethany Multipurpose Room A

This presentation explores the transformative potential of Generative AI (GenAI) in higher education. It will address how GenAI can enhance learning environments, the importance of guiding students in its use, and best practices for its integration into university courses. The presentation will provide a comprehensive overview of the benefits and challenges of incorporating GenAI in higher education.

This presentation will examine the importance of integrating GenAI in modern teaching and learning; how it can enhance personalized learning, critical thinking, and digital literacy; how it can facilitate the development of innovative teaching methods and improve student engagement and outcomes; and how to guide students in using GenAI ethically and effectively, including academic integrity and plagiarism concerns and issues related to transparency, privacy, accessibility, bias, accuracy, security, and regulatory challenges. Our presentation will also discuss the need for students to understand that “AI fluency” is as important as reading, writing, and arithmetic. The use of artificial intelligence (AI) has grown tremendously since the release of ChatGPT in November 2022. The daily avalanche of announced improvements to AI platforms increases the need for higher education to address these issues. These trends demand that we develop clear policies and guidelines encouraging student awareness, education, and fluency.

Presenter:

John Shannon, Professor, Seton Hall University

Susan A. O’Sullivan-Gavin, Professor, Rider University

Centering Students and Using AI within Meaningful Learning Environments

Bethany Multipurpose Room B

How can AI be used to enhance meaningful learning environments in culturally responsive ways? This session focuses on innovative ways AI tools can be used in the classroom to support meaningful learning experiences and culturally responsive teaching. When designing meaningful learning experiences, teachers use AI to engage students in activities that are active, collaborative, constructive, authentic, and goal-directed. Culturally responsive teaching is a student-centered approach that enriches how AI tools are used within meaningful learning experiences. This happens when teachers situate students’ lived experiences, frames of reference, and ways of being as resources for learning. Participants will be provided a process for designing meaningful learning experiences, a list of AI tools that promote meaningful engagement, and a culturally responsive framework that advances meaningful technology integration.

This session also addresses educational inequities that surface when learning experiences lack meaningfulness. Teacher-centered practices have historically been related to passive technology use, deficit-based beliefs about students’ potential, “one size fits all” instruction, and curricular choices that do not account for diversity. As a result, teacher-centered approaches have perpetuated, and continue to perpetuate, opportunity gaps that disproportionately impact students from historically marginalized communities. To bridge digital divides and fill opportunity gaps, participants will be exposed to the strategies, beliefs, conditions, and tools that support the shift toward student-centeredness.

Presenter:

Manny Algarin, Director of Education, Ed.D. Candidate, New Jersey City University

Convenience to Competence: A Spectrum for Purposeful AI Integration in Education

Bethany Multipurpose Room C

This interactive presentation will introduce educators to the Arrighi AI-C2 Utilization Spectrum – a framework for conceptualizing how learners develop skills and competencies through the purposeful integration of AI tools into education. The presentation will provide an overview of the Spectrum’s five stages from basic use to autonomous innovation.

Presenter:

Robbie Melton, Provost, Tennessee State University

Session 3: 2:20 – 3:00 p.m.

Universal Access Through AI: Leveraging AI for Inclusive Education

Bethany Multipurpose Room A

In this session, participants will learn how AI can be leveraged in teaching and learning to enhance accessibility.  We will explore practical strategies for implementing AI in planning, assessing, and supporting students in higher education. During the session, we will also consider the potential impact of AI on IEPs and 504s.

Presenter:

Jaimie Dubuque, Teaching and Learning Technologist, Rider University

Revolutionize Classroom Learning with AI

Bethany Multipurpose Room B

In today’s rapidly evolving educational landscape, harnessing the power of Artificial Intelligence (AI) presents unparalleled opportunities to enhance learning experiences in the classroom. AI offers many tools and technologies that can dynamically adapt to students’ learning styles, preferences, and abilities. AI-powered platforms can personalize learning content, recommend supplementary materials, and facilitate real-time feedback. AI fosters an interactive and immersive learning environment that promotes active engagement and participation among students. AI presents a transformative way to enable the curation of unique assessments tailored to individual learning objectives and contexts. AI algorithms can analyze students’ performance data and preferences to design more relevant, relatable, and reflective assessments of real-world scenarios. Assessments developed using AI can incorporate multimedia elements, simulations, and gamification techniques to create interactive and engaging assessment experiences. By embracing AI in assessment design, educators can cultivate critical thinking, creativity, and problem-solving skills while providing students with meaningful feedback and evaluation metrics.

Presenter:

Muhammad Hassan, Executive Director, Nancy Thompson Learning Commons, Kean University

Lessons Learned, Actions Unleashed: Unlocking Our Potential

Bethany Multipurpose Room C

Let’s reflect on our AI work together. We’ll use a ‘retrospective’ (a project management tool) to candidly discuss what worked well and where we can grow. We’ll focus on solutions we can control and make this a fun, collaborative conversation. We’re looking for real solutions for real people. 

Here are some questions to get us started…

What were the outcomes of new teaching methods, tools, etc. used? Are there opportunities to refine or expand?

Let’s talk silver linings. What wisdom can we extract from decisions that didn’t quite work out?

Describe something you crushed! What made it a success?

What action items can we identify to address challenges and improve the learning environment for ourselves and our students?

Even if you haven’t made any groundbreaking discoveries, join us anyway. 

The session lead will share anonymized responses compiled into a toolkit after the conference. 

Presenter:

Diane Rubino, Adjunct Professor, New York University

Session 4: 3:10 – 3:50 p.m.

Using AI Avatar Patients to Increase Student Readiness

Bethany Multipurpose Room A

Educators need to prepare students to enter the workforce post-graduation. Students’ readiness, which includes perceptions of knowledge, attitude, and ability, manifests in one’s sense of self-efficacy and is influenced by experiences. Additionally, experiences are impacted by context-specific readiness. The Context and Implementation of Complex Interventions framework defines context as a set of characteristics and circumstances that are active, unique, and embedded in the experience. Contextual readiness includes socio-political, community, organizational, financial, and learning resources and opportunities that influence the experience. Experiences should promote active learning and simulate real-world scenarios to appropriately foster student readiness. Developing and delivering real-world learning experiences presents many challenges, including contextual characteristics, limited space for hands-on engagement, increased staffing needs, scheduling conflicts, and costs associated with patient actors or experiential opportunities. Integrating Artificial Intelligence (AI) and Virtual Reality (VR) can address these challenges. AI is a computing system that can engage in human-like processes such as learning, synthesizing, self-correction, and data integration for complex processing tasks. AI technologies generate human-like content in response to complex and varied prompts and, blended with VR, immerse the user in simulated environments to make the experience more thoughtful and interactive, allowing for repeated practice to enhance skill development, conservation of resources, and cost-effectiveness.

This panel presentation will describe the pedagogical design and implementation of AI avatar patients into healthcare interprofessional curricula. It will present data on financial impact and students’ perceived readiness post-AI-VR experiences and offer recommendations for future integration into the broader educational curriculum.

Presenters:

Leslie Rippon, Associate Professor, Department of Athletic Training, Seton Hall University

Genevieve Zipp, Professor, Program Director of PhD in Health Sciences, Seton Hall University

Lorene Cobb, Assistant Professor, Department of Physical Therapy, Seton Hall University

Machine Morality: Ethical and Creative Uses of AI for Faculty and Students

Bethany Multipurpose Room B

This professional development workshop examines artificial intelligence (AI) and its role in education. The session will go beyond the red-alert knee-jerk response of many educators and present opportunities for students and teachers to use the emerging technology to enhance teaching and learning.  

Presenter:

John Baldino, Director, Center for Teaching and Learning, Lackawanna College

Exhibitor Sponsors

The post AI Teaching & Learning Symposium, presented by Edge and Seton Hall University appeared first on NJEdge Inc.


Identity At The Center - Podcast


Dive into the world of digital identity with our latest episode of The Identity at the Center podcast! We discussed the future of digital wallets, authentication, and the importance of trust frameworks with Joni Brennan from the DIACC.

Watch the episode at https://youtu.be/phQtu14jlJU?si=u8N_zXgjuK-8uqD1 or listen in your podcast app.

#iam #podcast #idac

Sunday, 16. June 2024

Velocity Network

Empowering Self-Sovereign Identity With Trusted Credentials: Exploring Velocity Network Checks – Part 1

Self-sovereign identity centers on placing data control squarely in the hands of individuals. The goal is to rectify a mistake that has grown exponentially since the late 90s—the dominance of certain companies over personal information.  The current model has data providers sending to data consumers directly, for the most part, with little to no consent from the data subjects themselves in

Friday, 14. June 2024

MyData

Lessons from the City of Helsinki: Three Paradigm Shifts in Smart Cities

Author: Mikko Rusama, Managing Partner at Nexus Transform. Finland is now the happiest country in the world for seven years in a row, according to the United Nations’ World Happiness Report 2024. Finland also ranks #1 in the Digital Economy Society Index (DESI). And the country’s free world-class education system has earned the #1 rank […]

Elastos Foundation

Elastos Partners with BEVM for Bitcoin Native Peer-to-Peer Loans


Partnership aims to unlock up to $1.3 trillion of dormant Layer 1 Value, as US consumers get excited about the 3rd Age of Bitcoin

Singapore, June 27th, 2024: Elastos, the SmartWeb ecosystem provider, has announced a partnership with the L2 provider BEVM to develop a peer-to-peer Bitcoin-denominated loan offering around the former’s BeL2 protocol. Together the companies believe they can unlock up to $1.3 trillion of dormant Layer 1 Bitcoin value, supported by data from the latest Elastos BIT (Bitcoin; Innovation & Trust) Index suggesting that more than two-thirds of US tech-savvy consumers are comfortable using Bitcoin.

 

Collateralize 80% of assets while the Bitcoin Layer is untouched

Elastos believes momentum is building around the Third Age of Bitcoin, where users will be able to transact using native Bitcoin. Partnering with BEVM to develop this Bitcoin-native loan product will allow users to collateralize up to 80% of their assets in return for L2 credit (stablecoins, for instance) based on terms defined in a Bitcoin-assured smart contract. The integrity of the currency is assured by BeL2’s unique ZK-proof process: the Bitcoin layer is untouched because the process completes without bridging, wrapping, or otherwise interfering with it, which also avoids the network congestion and additional fees that would otherwise result. This approach enables Elastos and BEVM to deliver a genuinely peer-to-peer loan product that is completely disintermediated and anonymous. Verification (potentially through third-party services) and the resulting costs and delays would only be required in the event of a dispute between the two parties.
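As a rough illustration only (this is not Elastos or BEVM code, and the function name and parameters are hypothetical), the 80% collateralization described above amounts to a simple loan-to-value calculation: the maximum L2 credit is the market value of the deposited Bitcoin multiplied by the collateral ratio, while the collateral itself never moves on the Bitcoin layer.

```python
def max_l2_credit(btc_collateral: float, btc_price_usd: float,
                  collateral_ratio: float = 0.80) -> float:
    """Return the maximum stablecoin credit for a given BTC deposit.

    Only the loan terms (amount, ratio, duration) would live in the
    smart contract; the BTC stays put on the Bitcoin layer.
    """
    if not 0 < collateral_ratio <= 1:
        raise ValueError("collateral ratio must be in (0, 1]")
    return btc_collateral * btc_price_usd * collateral_ratio

# Example: 2 BTC at a hypothetical price of $60,000 yields up to $96,000 credit.
print(max_l2_credit(2, 60_000))  # 96000.0
```

The actual terms a BeL2 contract would enforce (liquidation thresholds, duration, dispute handling) are not specified in the announcement; this sketch only captures the stated 80% ceiling.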

“The BeL2 protocol perfectly reflects what BEVM is all about: developing and supporting EVM-compatible DApps which can run in the Ethereum ecosystem to operate on Bitcoin L2. The loan offering is the perfect illustration of how such services could revolutionize the finance sector,” said Hakan Sezikli, Co-founder of the BEVM Foundation.

 

Enabling Insight via BTC Oracle

Launched in December ’23, the Bitcoin Elastos Layer2 (BeL2) protocol is a Layer 2 solution for Bitcoin, enabling multiple functionalities such as staking and smart contracts to be denominated directly in the world’s most popular digital currency. BEVM will be collaborating with the Elastos BeL2 protocol to deliver a BTC Oracle to monitor and analyze all Bitcoin-based activity in real time. As the BeL2 protocol enables Bitcoin users to manage virtually any relationship through the currency, from simple staking (‘interest’) to complex multi-party agreements through smart contracts, the BTC Oracle will become a vital source of insight into how the currency is being used.

 

US tech-savvy consumers trust Bitcoin

This partnership comes as new data from the Elastos BIT (Bitcoin; Innovation & Trust) Index indicates growing excitement among US tech-savvy consumers for Bitcoin. 63% of tech-savvy consumers feel either ‘perfectly comfortable’ or even ‘excited’ about transacting in Bitcoin, and over half of US respondents use Bitcoin at least once a month.

 

Respondents to the survey also suggest they trust Bitcoin as much as online banking or cash to protect savings:

24% of US respondents would place most trust in Bitcoin

Compared with 25% who place most trust in online banks

23% who place their trust in cash

“What this data shows is that we’re reaching an inflection point in the understanding and embrace of crypto-currencies among early adopters in the US that reflects the global trend towards the Third Age of Bitcoin,” said Rong Chen, co-founder, Elastos. “We are on the verge of Bitcoin delivering a new era of commerce, where users are in charge of their data and are no longer beholden to the Web 2 tech giants. This data shows there is work to do to encourage broader adoption in the US, but at Elastos it is our mission to develop technologies that will make it easier to interact and transact with Bitcoin.”

 

About Elastos

Elastos is a public blockchain project that integrates blockchain technology with a suite of reimagined platform components to produce a modern Internet infrastructure that provides intrinsic protection for privacy and digital asset ownership. The mission is to build accessible, open-source services for the world, so developers can build an internet where individuals own and control their data.

The Elastos SmartWeb platform enables organizations to recalibrate how the Internet works for them to better control their own data.

https://elastos.info

https://www.linkedin.com/company/elastosinfo/

 

About BEVM

BEVM is the first fully decentralized EVM-compatible Bitcoin L2 that uses BTC as Gas. It allows all DApps which can run in the Ethereum ecosystem to operate on Bitcoin L2.

www.bevm.io

https://twitter.com/BTClayer2

 

 

Thursday, 13. June 2024

Berkman Klein Center

Global AI Regulation: Protecting Rights; Leveraging Collaboration


Policy experts from Africa, Europe, Latin America, and North America outlined next steps for global AI regimes and networked capacity building

Photo by NASA on Unsplash

By Lis Sylvan & Niharika Vattikonda

Nearly a year and a half after the introduction of ChatGPT, artificial intelligence remains in the regulatory hot seat. While the EU AI Act put the so-called Brussels Effect into play, more regions across the globe are now weighing risks, rights, economic opportunities, and regional needs. On May 28th, the Global Network of Internet & Society Research Centers (NoC) and the Berkman Klein Center for Internet & Society at Harvard University (BKC) hosted a group of policy experts from Africa, Latin America, the US, and the EU to discuss the state of global AI regulation and outline next steps for collaboration across continents.

Lis Sylvan, Senior Director of Strategy and Programming at BKC, moderated the discussion with Carlos Affonso de Souza (Director of the Institute of Technology and Society of Rio de Janeiro), Mason Kortz (Clinical Instructor at the Cyberlaw Clinic at BKC), Gabriele Mazzini (European Commission, chief architect of the EU AI Act), and Ridwan Oloyede (Certa Foundation, coauthor of their recent “State of AI Regulation in Africa” report), with NoC Executive Director Armando Guio providing behind-the-scenes support. The group delved into how governments are weighing sectoral versus horizontal regulatory approaches; the role of the administrative state and existing data protection and competition regulators; the new models of AI regulation in Rwanda and Brazil; the impact of the EU AI Act across all jurisdictions; and the potential for truly global governance.

Origins and Approaches

De Souza contextualized the current moment of global AI regulation as a decade-long journey of AI regulation that started with charters and declarations of governing principles from various governments and entities. Over time, those charters and principles were reflected in national AI strategies, which have been in the works for five years and can be seen as the precursor to AI regulation; Brazil’s AI regulatory evolution, for example, closely followed this time frame. De Souza highlighted the impact of the European Union’s General Data Protection Regulation (GDPR) on this evolution; after GDPR took effect, countries have established data protection authorities that have largely been the main point of contact for early AI governance. As a result of GDPR, he said, “data protection may be an accelerator, may be an entry point for countries in the majority world, because that’s the conversation that we have been having in the last decade, and that’s where resources [and] attention had been moving forward in those countries.” However, he cautioned against using data protection law as the sole basis of AI regulation, because the data protection framework does not necessarily address the full scope of challenges raised by the development of AI.

Mazzini explained that the technical discussions about the EU’s proposed AI legislation date back to 2019. One of the key concerns with a sectoral approach, he said, was the risk of privileging certain sectors over others. The horizontal approach, though, adds complexity, as regulators must craft rules that work across sectors while avoiding repetition; moreover, the scope of EU legislation is limited by the exclusion of the national security, military, and defense sectors. While the EU AI Act takes an omnibus approach, Mazzini said it did not make sense to regulate AI as a technology in itself, but rather as a general-purpose technology with a variety of applications.

“What was clear to me since the get-go is that it didn’t make sense to regulate AI as a technology as such, because indeed what we are dealing with is a general purpose technology that has a variety of applications that we don’t even foresee today…” said Mazzini, “…and therefore, from my perspective, the idea to establish rules for the technology as such, regardless of its use, didn’t make any sense…We came up with this approach of establishing rules depending on the specific use to which the technology is put, with the greatest burden, from a regulatory point of view, being on the high risk,” which Mazzini outlined to include applications of the technology that are linked to health and safety, including medical devices, automated cars, and drones.

Sectoral and Regional Approaches

In the U.S. and in the African Union, regulatory agencies have found it more effective to apply existing laws — across data protection, competition, consumer protection, employment, and other sectors — to govern AI, often taking a sectoral approach. Oloyede said that data protection authorities and competition authorities have largely driven the initial AI regulatory agenda, as these authorities are best equipped to enforce consumer protection, data protection, intellectual property, and competition laws as the basis for national AI governance strategies. “We might see some sort of clearinghouse model emerge, where not every country in Africa, for example, will try to come up with a specific AI regulation,” Oloyede said.

Oloyede indicated that the sector-based approach has been dominant on the African continent, with countries including Nigeria, Kenya, South Africa, Rwanda, and Egypt beginning to develop roadmaps for AI governance and establish regulatory task forces. Oloyede said the sectoral approach has allowed regulators to develop specific policies for the deployment of AI in healthcare, for example.

According to Mason Kortz, this sectoral approach is typically favored in the U.S. because the U.S. regulatory approach values subject-matter expertise over technical expertise. The U.S. will likely have subject-matter experts regulate AI in their own domains, Kortz said — for example, the Department of Housing and Urban Development would regulate AI for housing. The U.S. approach relies on the country’s strong administrative state and directs specific federal agencies to take on different pieces of AI regulation. Meanwhile, certain state laws have sought to regulate specific use cases of AI in housing and employment contexts.

Kortz also noted that the current approach in the U.S. is a confirmation that existing rights-based regimes will be applied or extended to harms resulting from the use of AI systems; with a notoriously slow legislature, he said, only making small changes as needed is an advantageous approach, particularly when existing enforcement agencies may already have the power to make those changes. The U.S. common law system is well-suited to this approach, he said, as it lends judges relatively strong power to reinterpret the law in ways that are binding on lower courts without necessarily having to rewrite civil code.

“When it comes to some of the more rights-based statutes we have,” Kortz said, “I think, actually, we have a pretty good governance model right there, and we just need some small adjustments around the edges to modernize those statutes and bring them in line, not just with AI, but hopefully, if not future-proof them, at least provide a little more stability for whatever comes next after AI.” However, Kortz allowed that AI is so fundamentally transformative that certain existing laws, such as intellectual property law and copyright doctrine, may not be enough and global harmonization of AI laws should be a priority.

Global Collaboration and Capacity

Oloyede indicated that African countries have introduced solutions at the level of the Global Privacy Congress, although these solutions will need to reflect differing national and regional interests. Mazzini noted that generative AI and general-purpose AI create additional issues that require international collaboration — fighting misinformation, he said, will require such collaboration. However, de Souza cautioned that regulatory transformation must keep in mind how those laws will be applied in the future. In some cases, he noted, new liability regimes for AI are now stricter than the remaining body of law; Costa Rica, for example, has adopted a strict liability approach for high-risk uses of AI.

“If we turn out to have the chapters of liability on our AI laws more severe than what we have in our general law for other situations, if we are all in agreement that, in the future, AI is going to be in everything, the legislators that are designing those laws today, they are designing general laws on liability, because we will have AI in almost all sectors,” de Souza remarked. “So the decisions that we’re making today on liability, they might end up scrapping the provisions that you have on your civil code, consumer protection code, because the AI law will be the law that is more recent, more specific, and that may be the one that will be applying in most cases.”

This international collaboration will require capacity building across the globe, and Mazzini emphasized that the EU AI Act has prompted additional work to support the authorities in the EU that will implement and enforce the regulation. Although the AI Act will impact multiple private sectors, he said, its public enforcement will require both financial and knowledge-based resources. De Souza noted that the Brussels Effect will prompt a need for global bureaucracy to support global compliance with the EU AI Act, and well-resourced national authorities are needed to support that implementation. Oloyede, however, said that lessons learned from the GDPR rollout may inform a better approach to implementing the EU AI Act with a more nuanced understanding of the local context. While the EU AI Act will require capacity building to support new governance bodies with funding and resources, he said, it is essential to preserve existing collaborations with data protection and competition authorities and empower those authorities to address AI in their own domains.

Despite different countries taking more sectoral versus horizontal approaches, the global community is working to establish flexible approaches to AI governance in their respective regions. As Oloyede said, “AI is here today. Tomorrow is going to be a different technology. And we can’t keep legislating for every new technology that we have.” Mazzini described a need for international coordination when he said, “when it comes to this new type of AI that is sometimes is called ‘generative AI’ or ‘general purpose AI’ that we have specifically regulated in the EU — notably in the last few weeks, in final stages of the negotiations — I think I would like to see there certainly more international coordination, because there we are dealing with a number of questions that I think are pretty common across jurisdictions.”

Though approaches across the globe may be different, a common cross-cutting theme of the work is balance: protecting rights versus supporting innovation, legislating a critical technology while its capacity and impact is still developing, and providing necessary limitations while allowing nimble innovation.

The Network of Internet & Society Research Centers (NoC) is a collaborative initiative among academic institutions with a focus on interdisciplinary research on the development, social impact, policy implications, and legal issues concerning the Internet. The Berkman Klein Center at Harvard University served as NoC Secretariat from 2020–2023 and continues to participate in cross-national, cross-disciplinary conversation, debate, teaching, learning, and engagement.

Global AI Regulation: Protecting Rights; Leveraging Collaboration was originally published in Berkman Klein Center Collection on Medium, where people are continuing the conversation by highlighting and responding to this story.


EdgeSecure

Update on VMware’s acquisition by Broadcom


Dear Edge Community,

I hope this message finds you well. As we continue to adapt to the industry changes brought about by VMware’s acquisition by Broadcom, I want to share some important updates and reaffirm our dedication to assisting you through these transitions.

Pricing Changes Post-Acquisition: Following Broadcom’s post-acquisition adjustments to VMware’s licensing, contracting, and operational model, Edge will no longer manage an Enterprise Licensing Agreement (ELA) on behalf of our members. As a result, Edge will not be responsible for negotiating pricing, co-terming licensing dates, or participating in the quoting and billing process.

VMware Horizon Update: VMware Horizon is becoming a new organization, separate from VMware/Broadcom, called Omnissa. Licensing for Horizon must be procured separately from VMware and can also be fulfilled by partners like Carahsoft.

What This Means for Members:

Members will need to procure Broadcom/VMware licensing through standard reseller channels. Discounting will be determined by negotiations between each member and Broadcom/the reseller.

Edge still has procurement vehicles available for members to quickly process orders. These include:

The EdgeMarket TeCHS contract, fulfilled by SHI. Other preferred resellers through our convenience contract via Carahsoft.

Additionally, Carahsoft will continue to assist the VMware reseller community in preparing your pricing and quotes.

Alternative Solutions and Expert Support: In light of these changes by Broadcom, we believe it is prudent to consider alternative solutions that may better meet your financial and operational needs. We have secured a robust selection of substitute products and services under Edge procurement vehicles. Most of the market available substitutes are available through the EdgeMarket portal, and I am pleased to introduce several key points of contact who can assist you with these options:

Lou Malvasi, PubSec Sr. District Sales Manager, SHI. Mobile: (609) 608-2463. Email: Lou_Malvasi@shi.com

Bethany Tangredi, AWS Partner Account Executive, AWS. Mobile: (413) 896-4331. Email: bethtang@amazon.com

Cyntya Ramirez, Senior Program Manager, Carahsoft Technology Corp. Mobile: (571) 662-4641. Email: cyntya.ramirez@carahsoft.com

Lou, Bethany, and Cyntya are available to discuss how various alternative solutions can provide the value and support you need during this time of change.

Alternative Solutions via Edge Procurement Vehicles:

AWS native services

Google Cloud Platform native services

TeCHS/SHI & Carahsoft: Microsoft Azure/Hyper-V, Nutanix, Citrix, Oracle, and other service providers

Cloud Migration Expertise: For those looking to enhance or modify their cloud strategies, please consider our awarded providers listed below. These firms are recognized for their excellence, are fully equipped to support your migration efforts, and are available via EdgeMarket contracts:

CampusWorks, Inc.: Contract #269EMCPS-23-002-EM-CWI, Expires On 10/04/2026

CBTS: Contract #269EMCPS-23-002-EM-CBTS, Expires On 10/29/2026

Infojini, Inc.: Contract #269EMCPS-23-002-EM-IFJ, Expires On 10/01/2026

New Era Technology, Inc.: Contract #269EMCPS-23-002-EM-NET, Expires On 11/30/2026

SHI: Contract #269EMCPS-23-002-EM-SHI, Expires On 12/04/2026

Slalom, Inc.: Contract #269EMCPS-23-002-EM-SLM, Expires On 09/19/2026

Softchoice Corporation: Contract #269EMCPS-23-002-EM-SCC, Expires On 10/16/2026

Strata Information Group: Contract #269EMCPS-23-002-EM-SIG, Expires On 10/04/2026

Trigyn Technologies, Inc.: Contract #269EMCPS-23-002-EM-TGN, Expires On 11/30/2026

Tryfacta, Inc.: Contract #269EMCPS-23-002-EM-TFC, Expires On 10/17/2026

As we face these new challenges, our team remains committed to supporting you every step of the way. We encourage you to reach out to our contacts, or directly to me, with any concerns, queries, or discussions regarding your future strategic directions.

Thank you for your continued trust and partnership as we navigate these evolving circumstances together.

The post Update on VMware’s acquisition by Broadcom appeared first on NJEdge Inc.


DIF Blog

Revolutionizing the traveler experience

Nick Price, who co-chairs DIF’s Travel & Hospitality SIG and Nick Lambert, CEO of DIF member Dock Labs explored the potential for decentralized identity to revolutionize the traveler experience during a discussion with Rob Otto of Ping Identity, Cadrick Widmann of cidas and Roger Olivieira, co founder of Ver.id at EIC in Berlin last week. 

Nick Price: “Hotels are still stressed about verifying and storing passport information. They have a very large number of customers on file, and a very low amount of usable information. You’re not surprised when they ask whether you have stayed before, though you’ve stayed many times. Decentralized identity promises a substantial improvement in these areas. 

“A lot of the valuable information that makes travel work will be self-attested. Travel is not just about crossing the border or making a transaction, it’s about 'This is me, I'm a vegetarian, I like some extra legroom on the flight', et cetera.

"Travel companies need that information. The customer wants to give it to us, but they don’t currently have the tools to do it. This is exactly what we’re building for a large project in the Middle East: a decentralized identity journey for the traveler across airlines, transport, hotels and the tourism experience.”

Rob Otto: “What consumers really want is a value exchange. When I give you my data, use it to improve my experience. Maybe I even want the hotel to know how I'm feeling today, so I present an 'introvert or extrovert' credential.” 

"You don’t need decentralized identity to figure out someone is staying at a hotel for the tenth time, you just need a system that isn’t stupid.” 

Nick Lambert: “People do care about privacy, but it’s incumbent on us to provide that control over their data. For example, if you want to book a hotel or hire a car, you only want to provide the information the hotel or car hire company needs.

"From an organizational perspective, holding all that data centrally is a honeypot for hackers to target and sell, as well as a GDPR / CCPA compliance risk. Companies are keen to get rid of the liability and pass it over to customers." 

Roger Olivieira: “There are new regulations coming up where you won’t be allowed to store this data any more. Digital wallets are a good solution. They do three things very well: authentication, consent and digital signatures."

Cadrick Widmann: “But it’s hard for users to download a wallet just for one use case.”

Nick Lambert: “True. The user experience needs to be better than what exists today. For example, staff at Condatis (an Edinburgh-based CIAM provider, and DIF member) use decentralized identity to enter the office and access systems remotely, which is great. The challenge is integration with legacy systems.”

Roger Olivieira: “We solve that problem by putting a service provider between wallets and platforms, using common protocols like OAuth / OpenID Connect."


We Are Open co-op

Making Credentials Work for Everyone

How to think about the three-sided marketplace of skills validation CC BY-ND Visual Thinkery for WAO

After more than a decade of working with digital credentials like Open Badges and Verifiable Credentials, we still sometimes hear the sceptical question: “Who’s asking for this?”

While digital credentials offer far more than just helping people into a job, this remains a significant and powerful use case. However, creating an ecosystem where this is not only possible but also straightforward takes time. It’s a complex, three-sided challenge that requires more than simply asking users what they want; it involves anticipating their needs and providing innovative solutions that work for everyone involved.

The Appeal of Credentials

People and organisations are often attracted to digital credentials for their potential to recognise and validate a broad range of skills and achievements. They can help democratise learning, making it accessible and recognisable beyond traditional educational settings. For learners, these kinds of badges represent an opportunity to showcase their skills in a way that is immediately recognisable — and verifiable.

Challenges

Despite the initial excitement, many individuals and organisations find themselves asking, “Now what?” In our experience, this question stems from several challenges. Earners sometimes lack clarity on how to use badges effectively in practice, and so struggle to see how these credentials translate into real-world opportunities.

CC BY-ND Visual Thinkery for WAO

Issuers such as educational institutions and other organisations may find it difficult to convince stakeholders of the value of alternative credentials. Meanwhile, employers may exhibit uncertainty about the validity and relevance of these badges, leading to hesitation in recognising them as part of the hiring or promotion process.

Understanding the Three-Sided Marketplace

To address these challenges, it’s important to understand the interplay between the three main groups involved: earners, issuers, and employers.

Earners: These are individuals who seek to acquire credentials to validate their skills and knowledge. A significant proportion are already using digital credentials in the application process, with their main concern being whether these badges will be recognised and valued by employers and educational institutions.

Issuers: These include schools, universities, and other organisations that award credentials. Their challenge is to establish the credibility and relevance of their badges as a form of skills currency.

Employers: These are the entities looking to hire or promote individuals with verified skills. Fewer than half of employers say that they find university transcripts useful in helping them to evaluate job applicants’ potential to succeed at their company. So they need a quick and easy way to verify skills in a way that is a reliable indicator of ability.

Shifting the Question

So, rather than asking, “Who’s asking for this?” perhaps we should instead focus on understanding the needs and motivations of each group in this marketplace. By shifting our perspective, we can better appreciate the value of credentials and work towards making them more effective.

Just look at the progress we’ve made as an ecosystem:

✅ Clear value proposition — digital credentials offer tangible benefits, like job opportunities and career advancement, by validating a wide range of skills. Secure, digital wallets make it possible for earners to feel like they truly own and control their credentials.

✅ Widespread adoption and recognition — increasing numbers of institutions and employers are recognising and accepting digital credentials, thanks to the work of organisations such as the Digital Credentials Consortium (DCC), Jobs for the Future, and The RSA.

✅ Robust technology infrastructure — advanced platforms and secure technologies are being developed to support issuing, verification, and management of digital credentials, based on the foundational work of standards organisations 1EdTech and the W3C.

✅ Collaboration between key industry bodies — industry leaders, educational institutions, and technology providers are working together to standardise and promote the use of digital credentials. See, for example, the work around SkillsFWD, Opportunity@Work, and the T3 Innovation Network, and networking at events such as The Badge Summit and ePIC.

✅ Data standards and taxonomies — establishing consistent data standards and taxonomies helps in creating interoperable systems where credentials can be easily shared and verified across different platforms. Credential Engine has developed the Credential Transparency Description Language (CTDL), supporting comparability across credential types and providers.
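Interoperability ultimately comes down to credentials being plain, shared data. As a minimal sketch (not a canonical example: the issuer, achievement, and identifiers below are invented, and the context URLs should be checked against the current Open Badges 3.0 release), an Open Badges 3.0-style credential built on the W3C Verifiable Credentials model looks like this:

```python
# Minimal sketch of an Open Badges 3.0-style credential, which is a
# profile of the W3C Verifiable Credentials data model. Field names
# follow the published specs; URLs and identifiers are illustrative.
import json

credential = {
    "@context": [
        "https://www.w3.org/ns/credentials/v2",
        "https://purl.imsglobal.org/spec/ob/v3p0/context-3.0.3.json",
    ],
    "type": ["VerifiableCredential", "OpenBadgeCredential"],
    "issuer": {
        "id": "https://example.org/issuers/1",  # hypothetical issuer
        "type": ["Profile"],
        "name": "Example Issuing Organisation",
    },
    "validFrom": "2024-06-01T00:00:00Z",
    "credentialSubject": {
        "type": ["AchievementSubject"],
        "achievement": {
            "id": "https://example.org/achievements/42",  # hypothetical
            "type": ["Achievement"],
            "name": "Community Facilitation",
            "criteria": {"narrative": "Facilitated three community events."},
        },
    },
}

# Because the credential is plain JSON-LD, any platform that shares the
# same context can parse and verify it without a bespoke integration.
print(json.dumps(credential, indent=2)[:60])
```

This shared shape is what lets an issuer's badge travel from a wallet to an employer's verification tool without either side writing custom glue code.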

A huge amount of funding and effort has gone into getting us to this place. It’s going to take more money and time to get us over the line, so where should we focus our attention?

Creating Value for All Sides

CC BY-ND Visual Thinkery for WAO

To make digital credentials truly valuable for jobseekers, we need to address the concerns of all three groups:

For earners: Credentials should lead to tangible benefits, such as job opportunities or career advancement. We need clear pathways for using certain types of badges, with examples of how and where digital credentials have successfully led to job offers or promotions.

For issuers: Developing robust standards such as Open Badges 3.0 is crucial to promoting the credibility of their badges. Issuers can collaborate with industry leaders to ensure their credentials remain relevant and respected, as well as regularly update the criteria for earning these badges based on industry needs.

For employers: Providing tools and frameworks to easily interpret and trust these badges will encourage wider acceptance and use. Employers can partner with educational institutions to co-create badges and/or develop practical tests to verify the skills claimed by the credentials.

Final Thoughts

In summary, the question “Who’s asking for this?” may not be the most productive one. Instead, we should focus on understanding the interconnected roles of earners, issuers, and employers in the credentialing process. By doing so, we can move beyond the trough of disillusionment and realise the full potential of these new forms of recognition.

At WAO, we’ve been working closely with the Digital Credentials Consortium (DCC) on storytelling and communications strategies that help everyone understand and embrace the value of digital credentials. The DCC is a key player in the ecosystem, and one of a number of organisations helping build a future where Open Badges and Verifiable Credentials are a natural and trusted part of the hiring process.

🔥 Do you need help with digital credentials? Check out WAO’s free, email-based Reframing Recognition course, or get in touch! You may also like to check out WAO’s Compendium of Credentialing

Making Credentials Work for Everyone was originally published in We Are Open Co-op on Medium, where people are continuing the conversation by highlighting and responding to this story.

Wednesday, 12. June 2024

GS1

End-to-end product traceability enables safer, more efficient care for Lancashire.

End-to-end product traceability enables safer, more efficient care for Lancashire. Stock control was heavily reliant on manual processing, and orders and supplies were managed by a limited number of staff. Much of the inventory management responsibilities would fall to individual nurses, taking them away from providing direct patient care.

Ingencia’s inventory management system (IMS) was implemented to help manage stock control. The IMS is kept up to date with information provided directly by suppliers via the GHX Nexus catalogue.

Business goal: GS1 Healthcare Case Studies 2023-2024 (gs1_uk_01_cases_studies_2024_final_.pdf)

Next Level Supply Chain Podcast with GS1

Drink Outside the Box: QR Codes and Mocktails with Daniel Scharff

In this episode, we dive into how QR codes are revolutionizing packaging, non-alcoholic beverages are making waves, and community networks are driving market success. 

Daniel Scharff is the CEO and founder of Startup CPG, a vibrant community supporting emerging consumer packaged goods brands. With a background in San Francisco's food tech scene, Daniel created a Slack community with over 20,000 members, offering resources and fostering collaboration. He hosts 100+ events annually and produces the Startup CPG podcast, sharing insights from his experience as a former CEO of a beverage company. 

Daniel's unique approach to helping brands navigate the complex world of product development, market-entry, and supply chain logistics is driven by his firsthand experience as a former CEO of a beverage company. Daniel's innovative strategies and relentless drive to make dreams a reality have earned him a reputation as a visionary leader in the CPG community.

 

Key takeaways:

Learn how QR codes are transforming packaging, improving traceability, and enhancing consumer engagement

Understand the impact of the growing trend of non-alcoholic beverages on supply chain logistics and market dynamics

Discover the crucial role of community support in helping emerging brands succeed in the competitive market
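The QR codes discussed in this episode typically carry a GS1 Digital Link URI, which embeds the product's GTIN in an ordinary web address so one code can serve both the checkout scanner and the consumer's phone. As a rough sketch (the GTIN below is the example value commonly used in GS1 documentation, and the resolver URL should be treated as an assumption to verify), the check digit and URI can be computed like this:

```python
# Sketch of GS1 GTIN check-digit calculation and a GS1 Digital Link URI.
# The GTIN is a documentation example; verify the resolver against the
# GS1 Digital Link standard before using in production.

def gtin13_check_digit(body: str) -> int:
    """Check digit for a 12-digit GTIN-13 body: weights 3,1,3,1,...
    applied starting from the rightmost digit."""
    total = sum(int(d) * (3 if i % 2 == 0 else 1)
                for i, d in enumerate(reversed(body)))
    return (10 - total % 10) % 10

def digital_link(gtin: str, resolver: str = "https://id.gs1.org") -> str:
    """GS1 Digital Link URI for a GTIN; application identifier (01)
    expects 14 digits, so shorter GTINs are zero-padded on the left."""
    return f"{resolver}/01/{gtin.zfill(14)}"

body = "950600013435"                        # example GTIN-13 body
gtin = body + str(gtin13_check_digit(body))  # -> "9506000134352"
print(digital_link(gtin))  # https://id.gs1.org/01/09506000134352
```

Because the result is just a URL, the same printed code resolves to traceability data for the supply chain and to brand content for the shopper.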

Connect with GS1 US:

Our website - www.gs1us.org

GS1 US on LinkedIn

 

Connect with guest:

Daniel Scharff on LinkedIn

Check out Startup CPG

 

Tuesday, 11. June 2024

Hyperledger Foundation

Blockchain Pioneers: Hyperledger Quilt

As we laid out in our Helping a Community Grow by Pruning Inactive Projects post, there is an important life cycle to well governed open source projects. Since our launch in 2015, Hyperledger Foundation has hosted a number of now archived projects that helped drive innovation and advanced the development of enterprise-grade blockchain technologies. This series will look back at the impact of these pioneering projects.


Project VRM

The Personal AI Greenfield

What forms of pAI—personal AI—are Apple, Mozilla, Google, Meta, Microsoft and the rest not doing?

Let’s look at those first two because they’re at the top of the news LIFO buffer.

Apple Intelligence (“coming in beta this fall*“), announced yesterday, will help you with writing and creating images while giving you less lame answers from Siri. (Which they should re-name. Siri is Apple’s Clippy.) It “can draw on larger server-based models, running on Apple silicon, to handle more complex requests for you while protecting your privacy.” The “larger models” will be white-labeled ChatGPT, plus Apple’s own small language models (SLMs).

Mozilla, which has gotten $400+ million a year from Google (for search in the Firefox browser) starting in 2020, announced on June 3 that it will be Building open, private AI with the Mozilla Builders Accelerator. Jive:

This program is designed to empower independent AI and machine learning engineers with the resources and support they need to thrive. It aims to cultivate a more innovative AI ecosystem, and it’s one of Mozilla’s key initiatives to make AI meaningfully impactful — alongside efforts like Mozilla.ai, the Responsible AI Challenge and the Rise25 Awards.

The Mozilla Builders Accelerator’s inaugural theme is local AI, which involves running AI models and applications directly on personal devices like laptops, smartphones, or edge devices rather than depending on cloud-based services…

We chose Local AI as the theme for the Accelerator’s first cohort because it aligns with our core values of privacy, user empowerment, and open source innovation. This method offers several benefits including:

Privacy: Data stays on the local device, minimizing exposure to potential breaches and misuse.
Agency: Users have greater control over their AI tools and data.
Cost-effectiveness: Reduces reliance on expensive cloud infrastructure, lowering costs for developers and users.
Reliability: Local processing ensures continuous operation even without internet connectivity.

Looks to me like both of these are Big AI writ small. It’s “local,” not personal. It’s made to serve your needs with what BigAI offers through APIs. It is still essentially AIaaS (AI as a Service), rather than truly personal AI (pAI): personalized more than personal.

That’s also what I see when I read between the lines at Mozilla’s AI job openings. Take platform engineer. This person will (among other things), “assist in managing and orchestrating workloads across multiple cloud providers.” That’s fine. I’m sure true pAIs will do that too. But most of pAI will be more personal than that. It will deal with the mundanities of your everyday life. Not with coughing up answers that can only come from AIaaSes.

The problem with personalizing AI giants’ offerings is that they are large language models (LLMs) trained on everything that can be crawled on the Internet, plus who knows what else. Not on your truly personal stuff. This is why “prompt engineering” worthy of the noun is not for just anybody:

Prompt engineering is crucial for deploying LLMs but is poorly understood mathematically. We formalize LLM systems as a class of discrete stochastic dynamical systems to explore prompt engineering through the lens of control theory. We investigate the reachable set of output token sequences $R_y(\mathbf x_0)$ for which there exists a control input sequence $\mathbf u$ for each $\mathbf y \in R_y(\mathbf x_0)$ that steers the LLM to output $\mathbf y$ from initial state sequence $\mathbf x_0$. We offer analytic analysis on the limitations on the controllability of self-attention in terms of reachable set, where we prove an upper bound on the reachable set of outputs $R_y(\mathbf x_0)$ as a function of the singular values of the parameter matrices. We present complementary empirical analysis on the controllability of a panel of LLMs, including Falcon-7b, Llama-7b, and Falcon-40b. Our results demonstrate a lower bound on the reachable set of outputs $R_y(\mathbf x_0)$ w.r.t. initial state sequences $\mathbf x_0$ sampled from the Wikitext dataset. We find that the correct next Wikitext token following sequence $\mathbf x_0$ is reachable over 97% of the time with prompts of $k\leq 10$ tokens. We also establish that the top 75 most likely next tokens, as estimated by the LLM itself, are reachable at least 85% of the time with prompts of $k\leq 10$ tokens. Intriguingly, short prompt sequences can dramatically alter the likelihood of specific outputs, even making the least likely tokens become the most likely ones. This control-centric analysis of LLMs demonstrates the significant and poorly understood role of input sequences in steering output probabilities, offering a foundational perspective for enhancing language model system capabilities.

But all that stuff applies mostly when we’re prompting a big LLM system.

What about using AI in our own lives, where the data that matters most are in our calendars, contacts, financial and health records, our travels, our correspondence (email, chat, whatever)? And how about all the location data we might get from our cars, phone apps, and phone companies? These should be much easier for a pAI to gather, examine, and help us do useful things. Caring about much less data also means a pAI will be less likely to give wrong (hallucinated) answers.
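As a toy illustration of how reachable that personal data already is, here is a sketch that pulls event titles from an iCalendar (.ics) export, the format most calendar apps can produce. The parsing is deliberately naive (real .ics files need a proper parser for line folding and time zones) and the sample data is made up:

```python
# Naive scan of an iCalendar export: the kind of mundane, local data a
# truly personal AI would work from. This only walks VEVENT blocks and
# pulls each SUMMARY line; it is a sketch, not a real .ics parser.

def list_events(ics_text: str) -> list[str]:
    events, current = [], None
    for line in ics_text.splitlines():
        line = line.strip()
        if line == "BEGIN:VEVENT":
            current = "(untitled)"
        elif line.startswith("SUMMARY:") and current is not None:
            current = line[len("SUMMARY:"):]
        elif line == "END:VEVENT":
            events.append(current)
            current = None
    return events

sample = """BEGIN:VCALENDAR
BEGIN:VEVENT
SUMMARY:Dentist
END:VEVENT
BEGIN:VEVENT
SUMMARY:Flight to Boston
END:VEVENT
END:VCALENDAR"""

print(list_events(sample))  # ['Dentist', 'Flight to Boston']
```

No crawler, no cloud, no API key: the data is already on the device, which is exactly the greenfield argument.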

Today the mental frame almost everybody uses for AI is the Big kind, ingesting everything they can get their crawlers on, and munching all of it in giant compute farms. Those systems are great for lots of stuff, but they still don’t deal with personal data listed in the last paragraph.

Not yet, anyway.

Look at it this way. For each of us, there are three data pools:

The entire Net, which is what gets crawled by all the giant LLM operators, plus whatever else they can get their claws on.

One's personal life, some of which is digitized in useful form (contacts, calendar, mail, stuff in folders inside PCs and attached drives).

Personal data that is in the hands of giants, but is rightfully ours. These include our driving record and driving practices (recorded by our late-model cars and snitched to insurance companies and others), our location data (kept and shared by car and phone carriers with the likes of Google and the feds), and our TV viewing habits (gathered by Google, Amazon, Roku, Apple, etc.).

The pAI greenfield is with the last two.

Tell us who is working on what there, preferably with open source, and not sitting on walled garden silicon.

[Later… ] Since readers told me I had small language models (SLMs) wrong in one of the paragraphs above, and I’m not sure I had them right, I rewrote them out of the piece. I invite readers to post comments to further correct and expand on the subject of pAIs and what they can do.


Elastos Foundation

BeL2 Loan App Demo 0.2 Live! Native Bitcoin’s Journey into Smart Contracts on Elastos

The BeL2 team is excited to announce the latest update to our Bitcoin-Elastos Layer 2 (BeL2) ecosystem: the Loan App Demo 0.2, now live at lending.bel2.org. This update is a significant step forward in integrating Bitcoin with smart contract functionalities, enhancing both usability and security. Please remember this is a demo app today and not commercial, it is being built to showcase the underlying BeL2 technology as part of the larger roadmap. Let’s jump straight in!

 

What’s New in Loan App Demo 0.2

Since the last production release, showcased in Hong Kong, BeL2 has introduced several enhancements to improve user experience and functionality:

Enhanced Order Details: Comprehensive order details now include the status of Zero-Knowledge Proofs (ZKP).
Manual BTC Transfer Confirmation: Borrowers and lenders can manually confirm BTC transfers before ZKP completion, saving time.
Tip Functionality: Borrowers and lenders can provide tips to each other to encourage faster confirmations.
Lender Time Unlock Branch 3: Adds flexibility for lenders.
Wallet Compatibility: Unisat is now supported throughout the entire process, expanding beyond the Essentials wallet.
Dynamic Timelock Values: The UI now dynamically follows timelocks provided by the contract, no longer relying on hardcoded values.
Repayment Countdown Bug Fix: Addressed the repayment countdown duration bug.
Order Cancellation: Lenders can cancel an order if it remains unpaid for more than six hours.
New Order Status Filter: A new “ongoing” status filter has been added.
Bug Fixes: Various minor bugs have been resolved.

 

Updates to Essentials Wallet

To support the BeL2 loan app, Essentials has also received crucial updates:

Direct APK Download: Android users can now download the latest version (3.1.5) directly from d.web3essentials.io.
iOS Version Update: The iOS version has been updated and is available on the App Store.

 

BeL2: Expanding Bitcoin’s Capabilities

BeL2 enhances Bitcoin’s scalability, programmability, and privacy, leveraging the Elastos Smart Chain (ESC) while preserving Bitcoin’s integrity. BeL2 enables Bitcoin to interact with smart contracts on EVM-compatible blockchains using Zero-Knowledge Proof (ZKP) technology.

 

Decentralised Loan App

The Loan App on BeL2 allows Bitcoin holders to use their BTC as collateral for loans in USDT. Key features include:

BTC as Collateral: Users lock their BTC in a smart contract to borrow USDT.
Fixed Interest Rates: Protection against crypto market volatility.
No Forced Liquidations: Safeguards against violent price fluctuations.
Smart Contract Automation: Ensures predictable, automated repayment schedules.
Relayers and ZKPs: Enhance security and privacy in transactions.
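To make the flow concrete, here is a hypothetical sketch of the loan lifecycle described above. The states, amounts, and rules are invented for illustration; they are not BeL2's actual contract logic.

```python
# Hypothetical model of a BTC-collateralised loan lifecycle: lock BTC,
# receive USDT at a fixed rate, repay, then collateral can unlock.
# All names and numbers are illustrative, not BeL2's contract code.
from dataclasses import dataclass

@dataclass
class Loan:
    btc_collateral: float
    usdt_principal: float
    fixed_rate: float          # e.g. 0.05 for 5% over the term
    state: str = "created"

    def lock_collateral(self):
        assert self.state == "created"
        self.state = "collateral_locked"   # BTC held by the contract

    def disburse(self):
        assert self.state == "collateral_locked"
        self.state = "active"              # borrower receives USDT

    def repayment_due(self) -> float:
        # Fixed rate means the payoff is known at origination,
        # regardless of market moves (no forced liquidation path here).
        return round(self.usdt_principal * (1 + self.fixed_rate), 2)

    def repay(self, amount: float):
        assert self.state == "active"
        assert amount >= self.repayment_due()
        self.state = "repaid"              # collateral unlock can proceed

loan = Loan(btc_collateral=0.1, usdt_principal=3000, fixed_rate=0.05)
loan.lock_collateral()
loan.disburse()
loan.repay(loan.repayment_due())
print(loan.state)  # repaid
```

The point of the sketch is the state machine: every transition is checked, so repayment is predictable and automated rather than dependent on price-triggered liquidation.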

 

Vision Forward

BeL2’s focus is on refining its technology and providing robust SDKs for developers to build a wide array of financial applications on top of Bitcoin, executable on EVM ecosystems like the Elastos Smart Chain. The team is continuously enhancing the efficiency and security of its ZKP mechanisms and optimising smart contract performance for seamless integration across blockchain ecosystems. Next, the team is developing comprehensive Software Development Kits (SDKs) to simplify the process of building financial applications on BeL2. These SDKs will include:

Smart Contract Templates: Pre-built templates for common financial applications.
API Integrations: Easy-to-use APIs for integrating Bitcoin with other blockchain ecosystems.
Developer Tools: Advanced debugging and testing tools.
Documentation and Tutorials: Extensive documentation and tutorials to guide developers.

 

Building a Financial Ecosystem

The Loan App serves as a framework for developing various financial applications, demonstrating how Bitcoin can be used as collateral within a decentralised financial ecosystem. Potential applications include:

Decentralised Exchanges (DEXs): Platforms for trading cryptocurrencies without a centralised intermediary.
Lending Platforms: Smart contract-based lending services with APR.
Payment Solutions: Enabling businesses to accept native Bitcoin and other cryptocurrencies for online platforms.
Investment Platforms: Decentralised platforms for investing in various assets using Bitcoin’s security and liquidity.

We invite the Elastos community to explore the capabilities of the Loan App Demo 0.2 and experience the integration of Bitcoin with smart contracts on the Elastos Smart Chain. Your participation and feedback are invaluable as we continue to innovate and expand the BeL2 ecosystem.

Join us at lending.bel2.org and be part of the future where Bitcoin gains new functionalities and applications through BeL2. Together, we are building a more robust and decentralised financial landscape.

 

Monday, 10. June 2024

GS1

Jan Somers
CEO, GS1 Belgium & Luxembourg; GS1 in Europe Chair

Matthias Zenger
Senior Engineering Director, Google

April Cielica
President Global Business Services, Procter & Gamble

Identity At The Center - Podcast

It’s a new episode of The Identity at the Center Podcast!

It’s a new episode of The Identity at the Center Podcast! Adam Mikeal, CISO at Texas A&M University, shares insights on identity security in higher-ed, IAM for DevOps principles, and the shift from custom code to commercial solutions with us.

Watch it at https://youtu.be/2foTalb9RVE?si=ZsZvGWxMUhFGrrWV

More info is at idacpodcast.com

#iam #podcast #idac

Friday, 07. June 2024

FIDO Alliance

InfoSecurity Magazine: #Infosec2024: CISOs Need to Move Beyond Passwords to Keep Up With Security Threats

Passwordless systems, even if they stop short of a full zero-trust environment, improve convenience as well as security. CISOs should look at approaches such as the FIDO model or web 3.0 technologies as a basis for future authentication systems.


White Paper: FIDO Attestation: Enhancing Trust, Privacy, and Interoperability in Passwordless Authentication

This document intends to provide a comprehensive understanding of attestation’s role in enhancing and advancing the digital security landscape, specifically with respect to authentication. It focuses on the core function of attestation: verifying the origin and integrity of user devices and their authentication materials. FIDO credentials are discussed with a focus on how they offer more secure alternatives than traditional password-based systems and how FIDO attestation enhances authentication security for both Relying Parties (RPs) and end-users. In this document, RPs are those entities that provide websites, applications and online services that require the need for secure user access by confirming the identity of users or other entities. FIDO Alliance’s historical journey is presented with practical analogies for understanding FIDO attestation, its enterprise-specific technical solutions, and privacy aspects involved in the attestation process.

Targeted for CISOs, security engineers, architects, and identity engineers, this white paper serves as a guide for professionals considering the adoption of FIDO within their enterprise ecosystem. Readers should possess a baseline understanding of FIDO technologies, the meaning of attestation, and have a desire to understand why and how to implement attestation.
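One small, concrete piece of what verification involves: every WebAuthn/FIDO2 response carries authenticator data whose fixed 37-byte prefix is a hash of the Relying Party ID, a flags byte, and a signature counter. The sketch below parses just that prefix, following the layout in the WebAuthn specification; the demo bytes are synthetic, and real attestation verification additionally checks the attestation statement's signature and certificate chain.

```python
# Parse the fixed 37-byte prefix of WebAuthn authenticator data:
# rpIdHash (32 bytes) || flags (1 byte) || signCount (4 bytes, big-endian).
# Demo input is synthetic; this is a sketch of one verification step,
# not a complete attestation verifier.
import hashlib
import struct

def parse_auth_data(auth_data: bytes, rp_id: str) -> dict:
    rp_id_hash, flags = auth_data[:32], auth_data[32]
    (sign_count,) = struct.unpack(">I", auth_data[33:37])
    if rp_id_hash != hashlib.sha256(rp_id.encode()).digest():
        raise ValueError("rpIdHash does not match the expected RP ID")
    return {
        "user_present": bool(flags & 0x01),        # UP flag
        "user_verified": bool(flags & 0x04),       # UV flag
        "attested_cred_data": bool(flags & 0x40),  # AT flag
        "sign_count": sign_count,
    }

# Synthetic authenticator data: UP and UV set, counter at 7.
demo = (hashlib.sha256(b"example.com").digest()
        + bytes([0x05])
        + struct.pack(">I", 7))
info = parse_auth_data(demo, "example.com")
print(info["sign_count"])  # 7
```

The rpIdHash check is what binds a response to one Relying Party, and the signature counter gives RPs a cheap cloned-authenticator signal; attestation then vouches for where those bytes came from.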


Oasis Open Projects

CACAO Layout Extension v1.0 approved as a Committee Specification

New Committee Specification from the CACAO TC

OASIS is pleased to announce that CACAO Layout Extension Version 1.0 from the OASIS Collaborative Automated Course of Action Operations (CACAO) for Cyber Security TC [1] has been approved as an OASIS Committee Specification.

Collaborative Automated Course of Action Operations (CACAO) is a schema and taxonomy for cybersecurity playbooks. The CACAO specification describes how these playbooks can be created, documented, and shared in a structured and standardized way across organizational boundaries and technological solutions. This specification defines the CACAO Layout Extension for the purpose of visually representing CACAO playbooks accurately and consistently across implementations.
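As a rough illustration of what a CACAO playbook with layout hints might look like, here is a minimal sketch based on our reading of the CACAO 2.0 data model. Property names such as `workflow_start` follow the specification as we understand it, but the extension identifier and the `x`/`y` layout fields are illustrative assumptions; consult the Layout Extension specification for the normative names.

```python
import json
import uuid

# Minimal CACAO-style playbook with layout hints attached. Sketched from
# the CACAO 2.0 data model; the extension id and x/y fields are
# illustrative only, not normative.
step_id = f"action--{uuid.uuid4()}"
end_id = f"end--{uuid.uuid4()}"
layout_ext = f"extension-definition--{uuid.uuid4()}"

playbook = {
    "type": "playbook",
    "spec_version": "cacao-2.0",
    "id": f"playbook--{uuid.uuid4()}",
    "name": "Block malicious IP",
    "workflow_start": step_id,
    "workflow": {
        step_id: {
            "type": "action",
            "name": "Add firewall deny rule",
            "on_completion": end_id,
            # layout hint for visual editors, keyed by the extension's id
            "extensions": {layout_ext: {"x": 100, "y": 40}},
        },
        end_id: {"type": "end"},
    },
}

def basic_checks(pb):
    """Cheap structural sanity checks, not full spec validation."""
    assert pb["type"] == "playbook"
    assert pb["workflow_start"] in pb["workflow"]
    assert all("--" in key for key in pb["workflow"])  # id-like step keys
    return True

print(basic_checks(playbook))
print(json.dumps(playbook)[:40])
```

Because layout data rides in an extension, a consumer that does not understand it can ignore it and still execute the playbook, which is the point of representing layout this way.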

This Committee Specification is an OASIS deliverable, completed and approved by the TC and fully ready for testing and implementation.

CACAO Layout Extension Version 1.0
Committee Specification 01
04 April 2024

Editable Source: https://docs.oasis-open.org/cacao/layout-extension/v1.0/cs01/layout-extension-v1.0-cs01.docx
HTML: https://docs.oasis-open.org/cacao/layout-extension/v1.0/cs01/layout-extension-v1.0-cs01.html
PDF: https://docs.oasis-open.org/cacao/layout-extension/v1.0/cs01/layout-extension-v1.0-cs01.pdf

ZIP: https://docs.oasis-open.org/cacao/layout-extension/v1.0/cs01/layout-extension-v1.0-cs01.zip

Members of the CACAO TC [1] approved this specification by Special Majority Vote. The specification had been released for public review as required by the TC Process [2]. The vote to approve as a Committee Specification passed [3], and the document is now available online in the OASIS Library as referenced above.

Our congratulations to the TC on achieving this milestone and our thanks to the reviewers who provided feedback on the specification drafts to help improve the quality of the work.

========== Additional references:
[1] OASIS Collaborative Automated Course of Action Operations (CACAO) for Cyber Security TC
https://groups.oasis-open.org/communities/tc-community-home2?CommunityKey=b75cccb8-adc6-4de5-8b99-018dc7d322b6

[2] Public review metadata document:
https://docs.oasis-open.org/cacao/layout-extension/v1.0/csd01/layout-extension-v1.0-csd01-public-review-metadata.html
– Comment resolution log:
https://docs.oasis-open.org/cacao/layout-extension/v1.0/csd01/layout-extension-v1.0-csd01-comment-resolution-log.txt

[3] Approval ballot:
https://groups.oasis-open.org/higherlogic/ws/groups/b75cccb8-adc6-4de5-8b99-018dc7d322b6/ballots/ballot?id=3819

The post CACAO Layout Extension v1.0 approved as a Committee Specification appeared first on OASIS Open.


Elastos Foundation

BeatFarm: Direct and Profitable Superfan Connections on Elastos


Earlier this year, we announced a collaboration with BeatFarm, a project working to disrupt the music industry with a Superfan dApp. So, what is a Superfan dApp? In this article, we will explore BeatFarm and their mission to empower creators and fans alike. Let’s get stuck in!

What is a Superfan App?

A Superfan app is a platform that helps content creators engage deeply with their most dedicated followers. It offers exclusive content, direct communication, and a community that fosters stronger, longer-lasting connections. By providing insights into fan behaviour and enabling monetisation through personalized content and merchandise, these apps enhance both engagement and revenue for creators.

Why BeatFarm Exists

BeatFarm exists to empower creators by establishing a direct and profitable avenue of engagement with their superfans. BeatFarm’s mission is to empower the artist by providing resources and tools that increase the overall value of their content through direct collaboration with their most loyal fans. By eliminating intermediaries, BeatFarm gives every artist complete creative freedom and helps them maximise their potential revenue. BeatFarm uses technologies such as blockchain and smart contracts to ensure that an artist is compensated in perpetuity for the creative work they have developed.

How BeatFarm Achieves Its Purpose

At its core, BeatFarm uses the infrastructure of Elastos to create an open, transparent, and secure way for artists to monetise their work. Blockchain and smart contracts allow artists to easily create, share, and monetise their content. Everything – be it songs, virtual events, or merchandise – is embedded with smart contracts that trigger payment automatically whenever that content is used. BeatFarm also creates superfan channels with tiered payment mechanisms, analytics for measuring fan behaviour, and deep e-commerce channels. This gives an artist an ecosystem that provides an entire commercialization network for maximum revenue generation through superfan engagement.
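The per-use payment mechanism described above can be sketched conceptually. This is not BeatFarm's actual on-chain code; it is a hypothetical Python stand-in showing how a fixed royalty split could be settled automatically on every play, with the split proportions embedded once and honoured in perpetuity.

```python
from decimal import Decimal

# Conceptual sketch of per-use payment logic a smart contract could encode:
# each time a piece of content is consumed, the fee is split among rights
# holders in fixed proportions. Illustrative only; the parties and shares
# are hypothetical.
ROYALTY_SPLITS = {"artist": Decimal("0.85"), "producer": Decimal("0.15")}

def settle_play(fee, balances):
    """Distribute one play's fee according to the embedded split."""
    assert sum(ROYALTY_SPLITS.values()) == 1  # splits must cover the whole fee
    for party, share in ROYALTY_SPLITS.items():
        balances[party] = balances.get(party, Decimal(0)) + fee * share
    return balances

balances = {}
for _ in range(3):                    # three plays at 0.10 each
    settle_play(Decimal("0.10"), balances)
print(balances["artist"])             # 0.2550
```

On a real chain this settlement would run inside the contract on a payment event rather than in application code, but the accounting is the same.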

What BeatFarm Offers

BeatFarm offers a superfan platform with a clear and fair value proposition: helping artists generate more revenue from their content. Its key offerings include direct monetisation, empowering artists to self-monetise each piece of content so that every artist earns more for everything they create and share.

Blockchain and the Use of Smart Contracts: BeatFarm’s use of blockchain technology and smart contracts ensures transparent, reliable, and automated payments in perpetuity, securing the continued monetisation of an artist’s content.

Agile Content Creation: From track creation, live hosting, and merchandise selling to engaging with fans in real time, everything can be done on the BeatFarm platform.

Superfan Apps: Their latest launch: superfan apps providing several payment levels, analytics to measure fan behaviour, and an e-commerce channel that helps maximise revenue.

Global Reach and Scalability: BeatFarm partners with artists from across the globe to create personalized superfan channels. The platform scales as the number of users grows and with artist popularity.

BeatFarm has every aspect taken into consideration regarding the building of the platform to foster the success of the artist. With a relentless focus on direct connections, transparent monetisation, and versatile content creation, BeatFarm is on a mission to empower the artist and create new and unique ways for an artist to interact with their superfans.

As we continue to build the SmartWeb, we invite you to learn more about Elastos and join us in shaping a future where digital sovereignty is a reality. Discover how we’re making this vision come to life at Elastos.info and connect with us on X and LinkedIn.

 


Identity At The Center - Podcast

We wrap up The Identity at the Center Podcast’s week-long Identiverse 2024 coverage


We wrap up The Identity at the Center Podcast’s week-long Identiverse 2024 coverage with a banger of an episode. We sat down with Ian Glazer from Weave Identity, Alex Bovee from ConductorOne, and Lance Peterman from Dick’s Sporting Goods and UNC Charlotte, to get into the topic of Zero Standing Privileges and why or why not this approach makes sense in the real world.

You can watch the episode at https://youtu.be/MEWy8gVEC9o?si=51xigHwE_eb5eyVM and hear more at idacpodcast.com

#iam #podcast #idac #identiverse2024

Thursday, 06. June 2024

Oasis Open Projects

Introducing the Open Supply-Chain Information Modeling (OSIM) Technical Committee


By Omar Santos, Distinguished Engineer, Cisco

Supply chain security has emerged as a critical concern for businesses in every sector. The importance of standardized, trustworthy, and interoperable information models cannot be overstated. Addressing this need, the OASIS Open Supply Chain Information Modeling (OSIM) Technical Committee (TC) is being formed to enhance supply chain management worldwide. The initial TC members include AT&T, Cisco, Google, Microsoft, the Cybersecurity and Infrastructure Security Agency (CISA), the National Security Agency (NSA), and others listed in the charter.

You can read the full blog published on Cisco’s website here.

The post Introducing the Open Supply-Chain Information Modeling (OSIM) Technical Committee appeared first on OASIS Open.


Identity At The Center - Podcast

The Identity at the Center Podcast’s week-long Identiverse 2024 coverage rolls on


The Identity at the Center Podcast’s week-long Identiverse 2024 coverage rolls on with another new episode debuting today. We sat down with Andrew Shikiar from the FIDO Alliance to get the latest FIDO news, including the recently announced Selfie Biometric Identity Verification FIDO certification program, and taking questions from our live studio audience.

You can watch the episode at https://youtu.be/nagXfos6n_Y?si=M2XHFGCqEAr9nYn7 and hear more at idacpodcast.com

#iam #podcast #idac #identiverse2024


We Are Open co-op

A Compendium of Credentialing

From badge design to the future of recognition in networks

Image CC BY-NC Visual Thinkery for WAO

We’ve written a lot about Open Badges, Verifiable Credentials, and Open Recognition over the years. During a recent conversation, we realised there wasn’t an up-to-date place to point people towards which gives a summary. Anne wrote a great overview back in 2022, but a lot has changed since then!

This post is broken down into sections and doesn’t include everything we’ve written on these topics, so feel free to dive into the archives. If you would like assistance with any of this, get in touch!

Introductory posts

These posts give an overview of platforms to get started with Open Badges, some of the latest changes to the specification, as well as how Verifiable Credentials can be used in practice.

Why Open Badges 3.0 Matters
5 platforms for issuing Open Badges
Examining the Roots

Badge System Design

It’s rare for badges and credentials to exist in a vacuum, so designing a system around them is important. These posts cover some of the things you may want to consider when approaching badge system design.

WTF are ‘Stealth Badges’?
Badges for digital transformation
Designing Badges for Co-creation and Recognition

Open Recognition

Meeting people where they’re at and helping them identify the knowledge, skills, and behaviours that make them unique is much more interesting than giving people more hoops to jump through.

Understanding Open Recognition
What is Open Recognition, anyway?
Open Recognition is for every type of learning
Reframing Recognition
Getting started with Open Recognition
Creating a culture of recognition
4 benefits of Open Recognition
Pathways
Open Workplace Recognition using Verifiable Credentials
Looking to the future of Open Recognition
Open Recognition: Towards a Practical Utopia
Towards a manifesto for Open Recognition
Plausible Utopias: the future of Open Recognition

Experimental Stuff

These posts don’t fit neatly into the other sections but we think they’re important in terms of understanding the possibilities of badges and credentials, especially in terms of community work.

Using Open Recognition to Map Real-World Skills and Attributes
The Future of Trust in Professional Networks
Endorsement using Open Badges and Community Recognition

History and Advocacy

Whether you’re new to Open Badges and Verifiable Credentials or not, knowing the original vision around equity and opportunity is important to understand their potential.

Good things happen slowly, bad things happen fast
Reflecting on the Evolving Badges and Credentials Ecosystem
Keep Badges Weird: helping people understand the badges landscape
How badges can change the world
Open Recognition — A feminist practice for more equal workplaces

As we said at the top of this post, we’re happy to help! So if you’ve got a cool idea that you’d like us to sense check, just get in touch :)

CC BY-ND Visual Thinkery for WAO

A Compendium of Credentialing was originally published in We Are Open Co-op on Medium, where people are continuing the conversation by highlighting and responding to this story.

Wednesday, 05. June 2024

DIF Blog

Building Trust in AI through Decentralized Identity


DIF's ED Kim Hamilton Duffy, Wayne Chang, CEO of SpruceID and Professor Linda Jeng, a lawyer, former financial regulator and founder of Digital Self Labs, took to the stage at EIC to discuss how Decentralized Identity (DI) can help mitigate threats posed by Large Language Models (LLMs), ably moderated by KuppingerCole's Anne Bailey.

The panelists' thoughts on the nature and scale of the problem

Wayne: “Until now, we’ve been able to get by holding a driver's license up to a webcam, but with new AI tech you can fool these systems - this is already showing up at the edges. AI voice generation is really good now. People are saying it's not a valid identification factor any more.

"Phishing attacks are also on the rise, for example people pretending to be a romantic partner before encouraging the target to invest in crypto. Using AI bots for mimicry makes it easy for scammers to quickly establish trust".

Linda: "The rights conferred by GDPR, to decide with whom you share your data, are already difficult to enforce. Deepfakes will make this even harder.

"I’ve been meeting East Berliners here who tell me surveillance capitalism reminds them of what it was like growing up under the Stasi".

Kim: "A lot of the threats are not new, what we are talking about is an acceleration of these. We were already uncomfortable online. Now, with the things we’re seeing due to the advent of cheap, easy-to-use deepfake tech, we are past breaking point." 

How Decentralized Identity can help mitigate these threats

Wayne: “We need to add authenticity to communication. We don’t want to present a strong ID every time we want to use a chat app, so it makes sense to embed DI into comms channels, to prove I’m real.

“I define DI as the ability of any party to play the role of issuer, holder and verifier, based on cryptographic trust. Having digital credentials issued by many parties will enable trusted content certification, giving us confidence about what goes into an AI model.”

Linda: “It’s not about identity, it's about data governance, and creating chains of trust to combat risks from synthetic data.”

Kim: “I think of DI as a set of standards, technologies and principles that restore individuals’ control over their data. With these technologies, we have the possibility to build products and solutions on strong foundations.

"One of the key aspects of DI is the ability to provide a consistent experience across channels, creating a much safer environment for individuals. For example, if you get a phone call from your CEO asking you to transfer money, with DI you can be sure it’s them and not a deepfake of their voice.”

Recommendations for solution developers 

Wayne: “Focus on the value for the end user. The DI standards uniquely enable you to provide a great user experience while also ensuring privacy and solution sustainability, including the ability to swap out vendors if needed without disrupting the service you’re providing." 

Linda: "We have grown used to not having to pay for digital services. The incentives need to change. Think about new models where we get paid for our data, enabled by content authenticity and DI tech."

Kim: “We have to balance usability and privacy. It’s clear people want to use LLM-based tech in their lives. On the other hand, we’re seeing increasingly aggressive interfaces, for example asking you to give full access to your documents or even your desktop. With DI, finally there are ways to provide people both the convenience AND the trust.”

Other opportunities

Wayne: “There’s exciting work happening at Kantara Initiative around automated compliance with data regulations. Imagine giving someone a license to your personal data. Then, if you're a Data Processor it’s easy to automatically demonstrate compliance using consent receipts.

"It makes the “Accept all” problem go away, as you can decide what kind of consent receipts should be automatically generated for which parties.” 

Linda: “We need to spend time educating policymakers and the public, but in the end it comes down to end user demand for solutions. There’s no legal requirement for open banking in the US, but it’s happening anyway as people want to share their banking data with fintechs. Creating a smooth, easy UX will help to create the demand.”

Kim: "There’s a huge role for expanding the scope of trust to content authenticity, similar to the browser check mark that shows a website has a valid SSL certificate. C2PA is fantastic, and is already using VCs. However, there is a risk of getting locked into who can verify these claims if we use Certificate Authorities (CAs) as the root of trust. We are talking to them and there’s strong interest in generalizing the trust model.”

The panelists' key takeaways

Wayne: "One of the early goals of the internet pioneers was to have your personal agent in cyberspace. We need to get back to that original definition of personal agents, taking advantage of them to certify our content and things done on our behalf." 

Linda: "We need the right to certify our data as authentic. Right now we can’t tell what’s synthetic versus from an original creator. It’s not judging whether the data is good or bad, it just gives us additional info about the data we’re using."

Kim: "Everything we’re talking about is already here, its just about connecting the pieces. If you're building products, this is a great time to get involved. Come and talk to us at DIF!"

Linda Jeng, Wayne Chang, Kim Hamilton Duffy, Kristy Lam and Elissa Maercklein published Chains of Trust: Combatting Synthetic Data Risks of AI earlier today.


FIDO Alliance

SC Media: Identiverse 2024: Deepfakes, passkeys and more


Two predominant themes stood out at last week’s Identiverse 2024 conference in Las Vegas. First, there was the issue of how to defend against rapidly evolving advances in deepfakes, especially for live remote verification. Second, there was a common assumption that widespread adoption of passkeys is right around the corner, and that organizations must prepare to manage and secure passkeys when they become mainstream.

FIDO Alliance Executive Director & CEO Andrew Shikiar touched on both topics in a session Wednesday (May 29) titled “FIDO, Passkeys and the State of Passwordless.”

He announced the alliance’s new certification standard for facial-recognition technologies. The first (and so far only) organization to receive that certification is iProov. In a keynote address Thursday (May 30), Shikiar added that the FIDO Alliance was ready to offer independent testing of facial-recognition technologies.

As for passkeys, the passwordless, FIDO-certified PKI-based WebAuthn credentials that reside on hardware keys, smartphones, PCs and in the cloud, Shikiar said the question was not if consumers would adopt them, but when.

The FIDO Alliance’s goal is “to make passkeys inevitable,” Shikiar said. No one at Identiverse expressed any doubt that they would be.


Hyperledger Foundation

Perun, a Hyperledger Lab, enables economic transactions between embedded IoT devices



Identity At The Center - Podcast

The week-long Identiverse coverage with the Identity at the Center podcast continues


The week-long Identiverse coverage with the Identity at the Center podcast continues with another new episode. We hosted our biggest panel ever, including Arynn Crow, Allan Foster, and Ian Glazer from the Digital Identity Advancement Foundation (DIAF) and Kim Cameron award recipients Sophie Bennani-Taylor and Matthew Spence. Our conversation starts off by learning more about the mission of the DIAF before spending quality time getting to know Sophie and Matthew who share their journey into the world of identity and their Identiverse/Las Vegas experience.

You can watch the episode at https://www.youtube.com/watch?v=uN_rKAOpSOI and hear more at idacpodcast.com

#iam #podcast #idac


The Engine Room

[closed] Join our team! Two Associates for Engagement & Support


The Engine Room is accepting applications for TWO Associates: One based in Sub-Saharan Africa and the other in Latin America. 

The post [closed] Join our team! Two Associates for Engagement & Support appeared first on The Engine Room.


Blockchain Commons

Blockchain Commons Awarded FROST Grant from Human Rights Foundation


Today, the Human Rights Foundation (HRF) announced a Bitcoin Development Fund grant to Blockchain Commons for its continued support of the development of FROST, including holding two more virtual FROST meetings for the developer community.

FROST is a powerful quorum threshold signature scheme built using Schnorr signatures that offers many advantages over existing signature methodologies. That includes crucial integration with Distributed Key Generation: private keys can be created in pieces by discrete online servers, with no server ever having the whole key. Altogether, FROST can improve privacy, resilience, and security alike, making it truly a next-generation key-management system.
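The t-of-n threshold idea at the heart of FROST can be illustrated with toy Shamir secret sharing, in which any quorum of shares recovers a value via Lagrange interpolation. Note that real FROST is a two-round Schnorr signing protocol in which the key is never reconstructed; signers combine partial signatures instead. This sketch is for intuition only, not production cryptography.

```python
import random

# Toy Shamir secret sharing over a prime field, illustrating the t-of-n
# threshold idea at FROST's core: the key exists only as shares, and any
# t shares determine it via Lagrange interpolation. (Real FROST never
# reconstructs the key; this is intuition only.)
P = 2**127 - 1  # a Mersenne prime; fine for a demo field

def make_shares(secret, t, n):
    """Split `secret` into n shares, any t of which reconstruct it."""
    coeffs = [secret] + [random.randrange(P) for _ in range(t - 1)]
    def f(x):
        return sum(c * pow(x, i, P) for i, c in enumerate(coeffs)) % P
    return [(x, f(x)) for x in range(1, n + 1)]

def reconstruct(shares):
    """Lagrange interpolation at x = 0 recovers the secret."""
    secret = 0
    for i, (xi, yi) in enumerate(shares):
        num, den = 1, 1
        for j, (xj, _) in enumerate(shares):
            if i != j:
                num = num * (-xj) % P
                den = den * (xi - xj) % P
        secret = (secret + yi * num * pow(den, P - 2, P)) % P  # Fermat inverse
    return secret

key = random.randrange(P)
shares = make_shares(key, t=2, n=3)    # 2-of-3 quorum
print(reconstruct(shares[:2]) == key)  # True: any two shares suffice
```

In FROST proper, each participant holds such a share and contributes a partial Schnorr signature, so no single server (or coalition below the threshold) ever holds the whole key.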

Blockchain Commons has long looked forward to the mainstream deployment of Schnorr and FROST because of their considerable benefits over traditional key-management and signature schemes. We held our first FROST Implementer’s Round Table in 2023 to give FROST library implementers and cryptographers a chance to talk with each other, and we were asked to do more. We also hosted a FROST Developer’s Meeting this year where we worked with Jesse Posner to offer wallet developers a look at what the future of FROST means.

Blockchain Commons is thrilled by HRF’s recognition of our work. Their funding makes it possible to continue this work this year: we’ll be hosting a second round table for FROST library implementers & cryptographers on September 18, then a second meeting for wallet developers on December 4. If you’re a cryptographer or library implementer, sign up for our FROST implementer’s list and if you’re a wallet developer, sign up for our Gordian developer’s list to receive the announcements on these events. Admission is free thanks to the support of HRF!

Blockchain Commons has also placed FROST on our development road map for the year, to consider incorporating it into our reference code and tools. More details will follow as additional funding and our schedule firm up.

In the meantime, put September 18 and/or December 4 on your calendars so that you can join Blockchain Commons for these important events that will unveil the future of multi-party signatures and key management.

Tuesday, 04. June 2024

EdgeSecure

Edge Receives $857,000 National Science Foundation (NSF) Grant to Enhance Network Connectivity for New Jersey Higher Education Institutions


CC* Regional Networking: Connectivity through Regional Infrastructure for Scientific Partnerships, Innovation, and Education (CRISPIE) 

NEWARK, NJ, June 4, 2024 – Edge has been awarded an $857,000 grant from the National Science Foundation (NSF) to enhance network connectivity and access to advanced research networks and related cyberinfrastructure for seven higher education institutions in New Jersey, including a community college and several Minority Serving Institutions (MSIs). The partner institutions include Brookdale Community College, Kean University, Montclair State University, Ramapo College, Rider University, Rowan University, and Saint Peter’s University.

The project will improve access to advanced research networks and related cyberinfrastructure, aiming to reduce disparities for smaller and less-resourced institutions. The initiative offers specialized training programs for IT personnel, faculty, and students, and establishes essential infrastructure: perfSONAR for network monitoring and optimization, the Globus file transfer and sharing service, a regional, centrally managed Data Transfer Node (DTN) for efficient data transfers, and a Science DMZ for direct access to regional and national resources, secured by the InCommon Federation for remote access to instruments and HPC resources.


Dr. Forough Ghahramani, Assistant Vice President for Research and Innovation at Edge, serves as the principal investigator for the project. She highlights that by improving connectivity and cyberinfrastructure access, the project empowers institutions, fosters regional collaborations, and facilitates data-driven research and education. “This initiative creates diverse research collaboration opportunities for faculty across New Jersey, enabling further data-intensive research in disciplines including physics, astronomy, biology, genomics, earth and environmental sciences, data science, and cybersecurity,” explains Dr. Ghahramani. She continues, “The project includes a robust training and support program to enhance professional IT support and ensure proper adoption and success for researchers and educators at participating institutions. I’m excited to work in concert with my esteemed colleagues and Co-Principal Investigators to bring this initiative to life.”

Co-Principal Investigators include:

Dr. James Barr von Oehsen, Vice Chancellor for Research Computing; Director, Pittsburgh Supercomputing Center, University of Pittsburgh | Pitt Research; Research Professor, Electrical and Computer Engineering, Carnegie Mellon University

Dr. Tabbetha Dobbins, Dean of the Graduate School, Rowan University

Dr. Stefan Robila, Professor of Computer Science and Director of the Computational Sensing Laboratory, Montclair State University

Dr. Balamurugan Desinghu, Senior Scientist, Office of Advanced Research Computing, Rutgers University

As New Jersey’s research and education network, Edge’s mission is to advance research, science, innovation, and discovery through initiatives like this one. “By focusing on underserved institutions, Edge supports small MSIs with connectivity, technical support, and collaboration opportunities. This initiative strengthens and diversifies the academic community by enabling a wide range of research and education endeavors. It aims to improve current capabilities and lays the groundwork for future expansion to include other institutions,” shares Dr. Ghahramani. “This project will contribute to advancing the New Jersey AI Hub’s goals by enhancing network connectivity and providing critical resources for AI research and development, keeping New Jersey at the forefront of AI innovation.”

The NSF prioritizes proposals that support traditionally underserved institutions through partnerships with regional entities experienced in high-performance research and education networking, such as Edge. Special emphasis is placed on Historically Black Colleges and Universities (HBCUs), tribal colleges and universities, and other minority-serving institutions.

Full details about the NSF grant are available here. To learn more about Edge’s commitment to initiatives of this nature, visit https://njedge.net/research/resources-featured-research-reports/

About Edge

Edge serves as a member-owned, nonprofit provider of high performance optical fiber networking and internetworking, Internet2, and a vast array of best-in-class technology solutions for cybersecurity, educational technologies, cloud computing, and professional managed services. Edge provides these solutions to colleges and universities, K-12 school districts, government entities, hospital networks and nonprofit business entities as part of a membership-based consortium. Edge’s membership spans the northeast, along with a growing list of EdgeMarket participants nationwide. Edge’s common good mission ensures success by empowering members for digital transformation with affordable, reliable and thought-leading purpose-built, advanced connectivity, technologies and services.

The post Edge Receives $857,000 National Science Foundation (NSF) Grant to Enhance Network Connectivity for New Jersey Higher Education Institutions appeared first on NJEdge Inc.


DIF Blog

Les Miserables of the Cyber Frontier


Digital identity pioneers Markus Sabadello and Nat Sakimura crossed swords on stage during an absorbing discussion at EIC in Berlin this evening.

Markus kicked off by asking why the internet has become highly centralized, in spite of its beginnings as a peer-to-peer network.

Markus observed that technology embodies the values of the community that spawns it. He illustrated this by comparing W3C VCs with SD-JWT VCs - contending that the W3C format offers more "Liberte" - and DIDComm with OID4VC - arguing that DIDComm features more "Fraternite". Here's a flavor of their conversation (apologies to both for any mistakes in capturing your comments).

Markus: "One of the biggest discussions right now is how to get a VC into a wallet and present it. With OpenID4VC, there is asymmetry between issuer and holder, whereas with DIDComm, you have a model where everyone can connect to everyone else and establish true peer-to-peer relationships."

Nat: "The same can be said for SMTP. The philosophy was that everyone runs their own mail server. I do, but how many others do? Instead, we have enormous mail servers like Office 365 and Gmail. Just the fact that a protocol provides Liberte doesn't ensure decentralization."

Markus: "No one said Liberte was easy. The easiest thing is to log in with Facebook!"

Nat: "What's the cost of enabling Liberte? It will make things more complex and error prone.

"You need to specify what is being decentralized. For example, IDPs are so-called centralized, yet there are hundreds of thousands of them. The number of wallets will be even more than the size of the population, but thinking about wallet providers, it will probably be far fewer than the number of IDPs.

"It's not only the technical architecture we need to look at, but also the operational and legal controls. Don't try to solve everything in a technical way."

Markus: "SSI arose to ensure we can have the same social structures and protections we enjoy in the real world, in the digital world. We want Liberte, Egalite, Fraternite, and these ideas are baked into SSI standards."

Nat: "The goal is not the technology, but how to achieve Liberte, Egalite, Fraternite. Don't get too fixated on the technology."

Markus: "We can agree on that!"


Identity At The Center - Podcast

We continue our week-long Identiverse coverage with another


We continue our week-long Identiverse coverage with another brand-new episode of the Identity at the Center podcast. For today’s episode, we talked with Danny de Vreeze from Thales @ OneWelcome about the transformative potential of AI in identity management, his journey in the IAM field, the evolution of customer identity and access management (CIAM), and the importance of making access frictionless and secure.

You can watch the episode at https://youtu.be/TUa_ClkyS2U?si=v4gwZh8NDb3ecEpm and visit http://idacpodcast.com for more info.

#iam #podcast #idac #identiverse2024


DIF Blog

Decentralized ID Technical Mastery Sprint @ EIC


DIF’s ED Kim Hamilton Duffy and SC member Steve McCown delivered a Technical Mastery Sprint to a packed audience on the opening day of the 2024 European Identity and Cloud conference.

Steve highlighted the scale of internet security and data privacy problems, noting that some 25 billion login credentials had been leaked to the dark web by 2022, a 65% increase from 2020. Identity is the hackers’ real objective, since this is what enables lucrative fraud schemes. This threat impacts both individuals - what happens when your biometrics are stolen? - and organizations - 98% of whom have relationships with at least one vendor that has experienced a breach within the past 2 years, according to the Cyentia Institute.  

Steve and Kim introduced key decentralized identity building blocks including Decentralized Identifiers (DIDs), DID methods, DIDComm, wallets and agents, and how they can help address the current privacy and security challenges. They emphasized that these elements can be readily incorporated into existing systems, demonstrated by the growing use of decentralized identity to create on-ramps between Web2 and Web3 (and now Web5) applications.

Sam Curren demonstrated a new protocol that bridges DIDComm and OpenID Connect, facilitating eIDAS-compliant integration of decentralized identity within the EU digital identity wallet to enable new use cases.

Steve and Kim spoke about how trust is established in a world where anyone can issue credentials, and highlighted several approaches that are gaining traction, including Trust over IP Foundation Trust Registries and DIF Credential Trust Establishment. 

Kim outlined credential issuance and exchange flows and highlighted some implementation challenges, solutions and best practices, including credential storage and key management, and provided tips for managing a DI project and how to get started. She also outlined several existing government, educational, workforce management and supply chain use cases, and wrapped up the session with a live demonstration of DID creation and credential issuance using the Veramo CLI toolkit. 

The session generated strong audience engagement, with Kim and Steve answering questions on topics including the trust relationship between issuers and relying parties, delegated authority (e.g. where a parent manages their child’s credentials), credential revocation, reconciling multiple user accounts, the need for centralized record keeping (e.g. for regulatory compliance), key storage and wallet recovery. 

Look out for a more in-depth post where we provide their answers!

Monday, 03. June 2024

Digital Identity NZ

Identity 2.5 with Alan Mayo

Alan Mayo is developing identity solutions beyond the Digital Economy, solutions that have applicability over multiple human scenarios, in-person, on-line, and by telephone.

Alan Mayo goes beyond digital identity to consider identity in person, online, and by telephone. He has developed a structured way of describing identity that has, until now, been severely lacking. Alan questions the currently accepted status quo, suggesting that we will need to pass through an Identity 2.5 phase before reaching Identity 3.0.

With topics from verifiable credentials and passkeys to digital wallets and identity ecosystems, you can find Alan’s Digital Identity newsletters here.

The post Identity 2.5 with Alan Mayo appeared first on Digital Identity New Zealand.


We Are Open co-op

Fractional Leadership in Social Impact Organisations

Setting up a successful leadership transition

After years of working with and for a variety of different kinds of social impact organisations — from educational institutions to cooperative federations, small community-based charities to global non-profits — we’ve been lucky to see how our strategy and “critical friend” services can help leaders set foundations for an organisational programme or initiative.

It’s only lately that we’ve thought about some of what we do through the lens of “Fractional Leadership”. We are experts in cooperation, learning, technology and community, which means the type of leadership that we bring into projects is quite specifically OPEN.

What is Fractional Leadership? cc-by-nd Bryan Mathers

Fractional leadership is essentially outsourcing a role or a part of a role to an expert who can help you hit the ground running. Such positions are great when you know you need a new department (e.g. Digital Transformation) or if you’re kicking off a new project that needs someone to lead. They’re also helpful when you know you want to hire someone full-time, but want to find the right person for your organisation.

Fractional leadership provides you and your organisation with some flexibility because it gives you the time and space to make the right decisions in staffing and for your organisation as a whole. It’s potentially less expensive than hiring someone full time and can be explicitly bound in a statement of work. It’s also a way to test whether or not a particular new position makes sense to your organisation.

It is a way to work with diverse experts who have wide-ranging experience that you would like to apply to your organisation as it grows, changes and transitions.

The shadow side of fractional leadership cc-by-nd Bryan Mathers

As with any concept we pull from the world of bizniz into the social impact space, we need to be aware of how it can manifest inside an organisation that isn’t only trying to maximise profits.

Often, organisations in our space are unaware that what they need is a leader and look to us to create a specific “thing”. “We just need a digital strategy,” or “We just need a training programme,” are things that often come up in projects that obviously need something more. Working openly and helping people to understand how technology and funding intermingle, how community drives participation, how recognition motivates or how a learning programme can scale all require leadership.

Often funding in the social impact space is to develop a specific thing, not the impact that thing should have. Complex realities also mean that someone who might be a leader in one context might not have what it takes to lead in your context. This is why an open leader is so important. Fractional leadership in the open is a rare thing indeed.

A fractional leader needs to be adaptable and find ways to stay within the bounds the organisation has, while also setting a project or department up for future impact.

Preparing for a new colleague cc-by-nd Bryan Mathers

Still, experienced fractional leaders are well versed in making do with what they’re given. They can lay out plans and programmes with the future in mind. The strategies and processes put in place during a time of transition reflect a moment in time. Setting them up as iterative and designing them to evolve empowers future leaders.

We’ve offered “critical friend” services to help onboard the people who will take over from what we were asked to start. We’ve advised on workload and priority based on community engagement in tandem with organisational goals. We like to do the work of documenting and establishing open, productive processes and policies that help future collaborators take over when the time comes.

Is Fractional Leadership for you? cc-by-nd Bryan Mathers

Your fractional leader can start:

Developing a strategy that focuses on long-term social impact
Creating processes or frameworks to measure impact, or creating processes for gathering data and insights
Building assets to begin implementing the change needed to achieve the impact you’re looking for
Supporting and working with others on your team to co-design principles and approaches that support your mission
Determining the skills and competencies necessary to lead the project, programme or department they’re working within, and helping your organisation find someone to take over

The whole point is that you can bring in engaged experts to help you get something going while looking for your forever person. There are lots of people working in the social impact space who have deep expertise in everything from finance to HR to product development and design.

Fractional leadership is something you can simply try out. As you create briefs or write out job descriptions just ask yourself, “Would a designated fractional leader give me more time and space to ensure this is a long-term success?” If so, get in touch!

Fractional Leadership in Social Impact Organisations was originally published in We Are Open Co-op on Medium, where people are continuing the conversation by highlighting and responding to this story.


Identity At The Center - Podcast

We are back from Identiverse with brand new episodes of the


We are back from Identiverse with brand new episodes of the Identity at the Center podcasts debuting every day this week. First up, we talked with George Roberts from McDonald’s about his career in identity, his role in shaping digital identity at McDonald’s, and his Identiverse experience.

You can watch the episode at https://youtu.be/wiempmDo-Ks?si=kHZNVZf1Lbq5Oo7p and hear more at idacpodcast.com

#iam #podcast #idac #identiverse2024

Sunday, 02. June 2024

OpenID

Public Review Period for Proposed Fourth Implementer’s Draft of OpenID Federation


The OpenID Connect Working Group recommends approval of the following specification as an OpenID Implementer’s Draft:

OpenID Federation 1.0

This would be the fourth Implementer’s Draft of this specification.

An Implementer’s Draft is a stable version of a specification providing intellectual property protections to implementers of the specification. This note starts the 45-day public review period for the specification draft in accordance with the OpenID Foundation IPR policies and procedures. Unless issues are identified during the review that the working group believes must be addressed by revising the draft, this review period will be followed by a seven-day voting period during which OpenID Foundation members will vote on whether to approve this draft as an OpenID Implementer’s Draft. For the convenience of members who have completed their reviews by then, voting will actually begin a week before the start of the official voting period.

The relevant dates are:

Implementer’s Draft public review period: Saturday, June 1, 2024 to Tuesday, July 16, 2024 (45 days)
Implementer’s Draft vote announcement: Wednesday, July 3, 2024
Implementer’s Draft early voting opens: Wednesday, July 10, 2024*
Implementer’s Draft official voting period: Wednesday, July 17, 2024 to Wednesday, July 24, 2024 (7 days)*

* Note: Early voting before the start of the formal voting period will be allowed.

The OpenID Connect working group page is https://openid.net/wg/connect/. Information on joining the OpenID Foundation can be found at https://openid.net/foundation/members/registration. If you’re not a current OpenID Foundation member, please consider joining to participate in the approval vote.

You can send feedback on the specification in a way that enables the working group to act upon it by (1) signing the contribution agreement at https://openid.net/intellectual-property/ to join the working group (please specify that you are joining the “AB/Connect” working group on your contribution agreement), (2) joining the working group mailing list at https://lists.openid.net/mailman/listinfo/openid-specs-ab, and (3) sending your feedback to the list.

About the OpenID Foundation

The OpenID Foundation (OIDF) is a global open standards body committed to helping people assert their identity wherever they choose. Founded in 2007, we are a community of technical experts leading the creation of open identity standards that are secure, interoperable, and privacy preserving. The Foundation’s OpenID Connect standard is now used by billions of people across millions of applications. In the last five years, the Financial Grade API has become the standard of choice for Open Banking and Open Data implementations, allowing people to access and share data across entities. Today, the OpenID Foundation’s standards are the connective tissue to enable people to assert their identity and access their data at scale, the scale of the internet, enabling “networks of networks” to interoperate globally. Individuals, companies, governments and non-profits are encouraged to join or participate. Find out more at openid.net.

The post Public Review Period for Proposed Fourth Implementer’s Draft of OpenID Federation first appeared on OpenID Foundation.

Friday, 31. May 2024

DIF Blog

DIF Newsletter #40


May 2024

DIF Website | DIF Mailing Lists | Meeting Recording Archive

Table of contents: 1. Decentralized Identity Foundation News; 2. Working Group Updates; 3. Open Groups; 4. Announcements at DIF; 5. Community Events; 6. DIF Members; 7. Get involved! Join DIF

🚀 Decentralized Identity Foundation News

DIF at ID4Africa

DIF was honoured to participate at the ID4Africa 2024 AGM in Cape Town last week.

DIF's Catherine Nabbala and Damian Glover delivered a plenary presentation to the conference, together with Anand Acharya of Bhutan National Digital Identity.

The session highlighted DIF’s work within the broader landscape of national ID schemes and provided real-world insights into the implementation of decentralized identity by the Thai and Bhutanese governments.


The presentation elicited engagement with DIF from a variety of quarters, including several countries that are rolling out national digital identity programs, and others looking to do so.

Our conversations with stakeholders evidenced a strong desire for Africa-specific digital infrastructure, and awakening interest in decentralized identity as a tool to empower citizens and promote cross-border interoperability while avoiding lock-in to ecosystems controlled by outside interests.

Universal Resolver

The IOTA DID (Decentralized Identifier) method is now resolvable using the DIF’s Universal Resolver. The Universal Resolver is DIF-hosted infrastructure that enables resolution of any registered DID method through a flexible plugin model. This widely used service is invaluable to the decentralized identity community. Accordingly, DIF has invested in boosting its reliability and stability to ensure the Universal Resolver’s uptime and availability.

The Universal Resolver was generously developed and contributed by Danube Tech.

You can experiment with the Universal Resolver at https://uniresolver.io and even follow our simple process for allowing your own DID method to run in the Uniresolver.
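Under the hood, the Universal Resolver dispatches each request to a driver based on the DID's method name. As a rough illustration of that first step (this is a sketch, not the Universal Resolver's actual code; the example identifier is hypothetical), a DID can be split into its method and method-specific identifier:

```python
def parse_did(did: str) -> tuple[str, str]:
    """Split a DID of the form did:<method>:<method-specific-id> into its parts.

    The method name (e.g. "iota") is what a resolver uses to select a driver.
    """
    parts = did.split(":", 2)
    if len(parts) != 3 or parts[0] != "did" or not parts[1] or not parts[2]:
        raise ValueError(f"not a valid DID: {did!r}")
    return parts[1], parts[2]

# hypothetical identifier for illustration only
method, msid = parse_did("did:iota:0x1234abcd")
```

Method-specific identifiers may themselves contain colons (as in did:web), which is why only the first two colons delimit the parts.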

As a non-profit, DIF funds and hosts public good infrastructure like the Universal Resolver. Support our efforts by joining DIF.

DIF adds new liaison IOVF


IOV Foundation’s mission is to create a new open financial ecosystem that provides accessible and fair services. They focus on Latin America, specifically demonstrating solutions for interoperability between governments (countries, cities) and organizations in Central and South America. Their current pilot focuses on building an identity platform for the Argentine Chamber of E-Commerce, aimed at managing event tickets and board meeting attendance.

Calling all DIDComm users!

Do you use DIDComm? We want to hear about it! Let us know how you're using DIDComm here.

🛠️ Working Group Updates Claims & Credentials Working Group

Presentation Exchange 2.1 specification is published: https://identity.foundation/presentation-exchange/spec/v2.1.0/
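For readers new to Presentation Exchange, a presentation definition describes what a verifier is asking for. The following sketch shows the basic shape of a v2 definition; the ids, purpose text, and field paths are invented for illustration, so consult the published specification for the normative structure:

```python
import json

# Hypothetical Presentation Exchange v2 definition: an input descriptor
# constrains which credential fields the holder must (or may) disclose.
presentation_definition = {
    "id": "employment-check",
    "input_descriptors": [
        {
            "id": "employment-credential",
            "purpose": "Prove current employment",
            "constraints": {
                "fields": [
                    {"path": ["$.credentialSubject.employer"]},
                    {"path": ["$.credentialSubject.jobTitle"], "optional": True},
                ]
            },
        }
    ],
}

as_json = json.dumps(presentation_definition, indent=2)
```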

Sam Curren presented the Credential Trust Establishment at the Trust over IP Foundation's trust registry task force.

After a successful webinar to drum up interest (see write-up here), we will be launching the credential schemas work item. Contact membership@identity.foundation if you are interested in participating.

If you are interested in participating in any of DIF's Working Groups, please click here.

🔐 Applied Cryptography WG

(1) We are waiting for IETF CFRG cryptographic review of the BBS Signature Scheme draft

(2) We are updating but haven't yet published a new version of "Blind BBS Signatures" which enables features like "anonymous holder binding"

(3) We are updating but haven't yet published a new version of "BBS per Verifier Linkability" (pseudonyms)

(4) Both these new drafts provide features that have been tentatively added to the W3C Data Integrity BBS candidate recommendations.

📖 Open Groups at DIF Veramo User Group

The user group is currently focusing on integration of SD-JWT into Veramo, improving compatibility with the SES environment in Metamask snaps for Veramo 6.x, and some changes to key management with regards to usage in Metamask snaps.

We also had a demo/presentation by Index Network on their product / plans for Veramo.

📻 China SIG

During the SIG's May meeting last week, we shared the idea of SSI, global development trends, and next steps for technical, application and translation sub-groups, and held an open discussion.

Access the recording here.

The previous SIG meeting was held on 17 April. CAICT (the China Academy of Information and Communications Technology) contributed a basic, unfinished digital identity framework for China SIG members to study and participate in coding together.

Access the recording here.

The China SIG's permanent online meeting address for its regular monthly meeting is: https://meeting.tencent.com/dm/c4cNcDCmssbc
Tencent Meeting Number: 507-9656-6284

📢 Announcements at DIF

European Identity and Cloud Conference (EIC) 2024

[Photo by Levin on Unsplash]

DIF will have a significant presence at EIC next week. Many Steering Committee members will be attending and presenting at the conference, which takes place in Berlin from 3 - 7 June.

Executive Director Kim Hamilton Duffy teams up with SC member Steve McCown on the opening day of the conference to deliver a Decentralized Identity Technical Mastery Sprint. DIF members Wayne Chang of SpruceID, Riley Hughes of Trinsic, Nick Lambert of Dock and Daniel Buchner of Block (who is also an SC member) will take part in panel discussions on days 2, 3 and 4.

SC member Markus Sabadello joins forces with Nat Sakimura, chairman of the OpenID Foundation, to deliver a keynote on the opening night of the conference titled "The Dueling Narratives of Decentralized Identities".

SC member Sam Curren, Hospitality & Travel SIG co-chair Nick Price and Dr Abbie Barbir of DIF liaison partner FIDO Alliance are among those participating in other panel discussions focused on decentralized identity, including

Post Quantum Security: Cryptography in Decentralized Identity
Expert/Digital Wallet & Verifiers Q+A
Decentralized Identity for Onboarding & CIAM
Addressing Usability Challenges of Digital Identity Wallets
Decentralized Identity in Production

Check out the full agenda here.

DIF members are eligible for a 25% reduction on their ticket to attend EIC (on top of any other discounts). Simply enter code eic24dif25members during the last step of booking: click here to buy your ticket.

Please ensure your communications teams coordinate with us if you plan to attend, so we can assist in promoting your participation.

DIF Labs

The DIF Labs working group is coming soon; contact membership@identity.foundation to learn more

🗓️ Community Events

Bridging the Gap: OpenID and DIDComm

SC member Sam Curren and Artur Philipp from IDUnion unveiled OpenID-DIDComm, a new protocol that bridges OpenID4VC and DIDComm, enabling credential issuers and holders to communicate securely in a way that complies with EU requirements for credential exchange.

The community call was highly anticipated and was well attended, despite public holidays in the US and Europe.

Following an introduction by DIF’s Senior Director of Community Engagement, Limari Navarrete, Artur noted that there are many reasons why issuers and holders will need to communicate, though OpenID4VCI (OpenID for Verifiable Credential Issuance) only supports the initial credential exchange.

For example, an issuer may need to notify a holder that a credential has expired or been revoked. On the other hand, a holder may want to request a batch of additional credentials from the issuer (single-use SD-JWT credentials will be important to prevent correlation of the holder).
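The correlation concern above stems from how SD-JWT discloses claims: each claim is combined with a fresh random salt and hashed, so identical claims in two single-use credentials yield unlinkable digests. A minimal sketch of the disclosure-digest computation described in the SD-JWT draft (the helper names here are mine, not the spec's):

```python
import base64
import hashlib
import json
import secrets

def b64url(data: bytes) -> str:
    """Base64url-encode without padding, as SD-JWT requires."""
    return base64.urlsafe_b64encode(data).rstrip(b"=").decode()

def make_disclosure(name: str, value, salt: bytes = None):
    """Build a salted disclosure and its SHA-256 digest, SD-JWT style.

    The disclosure is base64url(JSON([salt, claim-name, claim-value]));
    the digest of that string is what appears in the signed credential.
    """
    salt = salt or secrets.token_bytes(16)
    disclosure = b64url(json.dumps([b64url(salt), name, value]).encode())
    digest = b64url(hashlib.sha256(disclosure.encode("ascii")).digest())
    return disclosure, digest

# The same claim in two credentials gets different salts, hence
# different digests: a verifier cannot link the two presentations.
_, d1 = make_disclosure("given_name", "Erika")
_, d2 = make_disclosure("given_name", "Erika")
```

This is why issuing a batch of single-use credentials, each with freshly salted disclosures, prevents verifiers from correlating the holder across presentations.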

The mutual authentication that is integral to the DIDComm protocol provides resistance to phishing attacks, compared with approaches requiring switches to channels such as SMS or email. What’s more, the security properties of a DIDComm connection do not degrade over time. These and other features make DIDComm an excellent choice to establish a persistent communication channel between issuers and holders.

The next step is for the project team to validate that the new protocol does not interfere with OpenID4VCI, in consultation with the group developing the OpenID specification. Enabling DIDComm connections to be established over OpenID4VP (OpenID for Verifiable Presentations) will follow later.

Click here to read more about the OpenID-DIDComm protocol.

Credential Schema webinar recap

As mentioned in the Claims & Credentials Working Group update, DIF hosted a well-attended community call to introduce the new Credential Schema work item; see the write-up here.

Coffee Breaks

Make sure to tune in to the recordings of May’s DIF Coffee Breaks.

Tim Boeckmann CEO and Co-founder of Mailchain
https://x.com/DecentralizedID/status/1787539546579902766
Nara Lau Founder at Fise Technologies
https://x.com/DecentralizedID/status/1791167095650254928
Ankur Banerjee CTO and Co-founder at Cheqd
https://x.com/DecentralizedID/status/1793702820883091752
Humpty Calderon, Advisor @Ontology and creator of Crypto Sapiens Podcast:
https://x.com/DecentralizedID/status/1795533701490855965

June’s Upcoming Coffee Breaks

June 20th at 1pm PDT/ 4pm EDT
Andres Olave Head of Technology at Velocity Career Labs
https://twitter.com/i/spaces/1nAJEaLwvgbJL
June 27th at 11am PDT/ 8pm CEST
Cole Davis Founder and CEO at Switchchord
https://twitter.com/i/spaces/1vAxRvwmorkxl

Follow https://twitter.com/DecentralizedID to get updates

🗓️ DIF Members

Guest blog: Markus Sabadello


Steering Committee member and Danube Tech CEO Markus Sabadello provides an overview of the Identifiers & Discovery Working Group, where he has been contributing as co-chair for many years.

https://blog.identity.foundation/an-overview-of-the-dif-identifiers-discovery-working-group/

New Member Orientations

If you are new to DIF, join us for one of our upcoming new member orientations. Please subscribe to DIF’s Eventbrite page, which can be found here, for notifications about upcoming orientations and events.

🆔 Join DIF!

If you would like to get in touch with us or become a member of the DIF community, please visit our website.

Can't get enough of DIF?
| Follow us on Twitter
| Join us on GitHub
| Subscribe on YouTube
| Read our DIF blog
| Read the archives


FIDO Alliance

White Paper: Synced Passkey Deployment: Emerging Practices for Consumer Use Cases


This paper explores the emerging practices surrounding the use of synced passkeys, which allow passkey use across multiple devices by syncing the passkeys over the cloud. It specifically addresses the initial choices and considerations for service providers (also known as relying parties, or RPs). These practices are in their early stages and are likely to evolve, since operating systems, browsers, and passkey providers are still enhancing their functionality. The document outlines crucial areas for RPs to consider, such as registration, authentication, passkey management, and accessibility, and presents a range of emerging approaches for adopting this technology. The objective is to guide RPs through these strategies, acknowledging that the specifics of ensuring secure and convenient passkey usage may evolve as the digital landscape continues to advance.
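To make the registration step concrete: for synced passkeys an RP typically requests a discoverable credential in its WebAuthn creation options. The sketch below, expressed as the Python dict an RP backend might serialize to JSON for the browser, uses illustrative values (the RP id, user id, and parameter choices are assumptions, not recommendations from the paper):

```python
import base64
import json
import secrets

def b64url(data: bytes) -> str:
    """Base64url-encode without padding, the encoding WebAuthn JSON uses."""
    return base64.urlsafe_b64encode(data).rstrip(b"=").decode()

def creation_options(user_id: bytes, user_name: str) -> dict:
    """Illustrative PublicKeyCredentialCreationOptions for passkey registration."""
    return {
        "challenge": b64url(secrets.token_bytes(32)),
        "rp": {"id": "example.com", "name": "Example RP"},
        "user": {"id": b64url(user_id), "name": user_name, "displayName": user_name},
        # COSE algorithm identifiers: -7 is ES256, -257 is RS256
        "pubKeyCredParams": [{"type": "public-key", "alg": -7},
                             {"type": "public-key", "alg": -257}],
        # residentKey "required" asks for a discoverable credential, the
        # form of credential that passkey providers sync across devices
        "authenticatorSelection": {"residentKey": "required",
                                   "userVerification": "preferred"},
    }

options_json = json.dumps(creation_options(b"user-123", "erika@example.com"))
```

The `residentKey` and `userVerification` settings are exactly the kind of RP-side choices the white paper discusses; the right values depend on the RP's risk profile.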

Each section of this paper stands on its own, allowing readers to jump to specific topics of interest without reading the entire paper from the beginning.

This white paper is intended for various stakeholders of relying parties, including non-developers, such as information security executives, product owners, identity and access management practitioners, UI/UX designers, and accessibility practitioners.


FIDO Alliance Osaka Seminar


FIDO Alliance held a one-day seminar in Osaka for a comprehensive dive into passkeys. The seminar covered the current state of passwordless technology, a deep dive on how passkeys work, their benefits, practical implementation strategies and considerations, regulatory considerations, and case studies. 

Attendees had the opportunity to engage directly with those who are currently implementing FIDO technology through open Q&A and networking to get first-hand insights on how to move their own passkey deployments forward.

View the seminar slides and photos below:

FIDO Alliance Osaka Seminar: LY-DOCOMO-KDDI-Mercari Panel.pdf from FIDO Alliance

FIDO Alliance Osaka Seminar: NEC & Yubico Panel.pdf from FIDO Alliance

FIDO Alliance Osaka Seminar: Passkeys and the Road Ahead.pdf from FIDO Alliance

FIDO Alliance Osaka Seminar: Passkeys at Amazon.pdf from FIDO Alliance

FIDO Alliance Osaka Seminar: The WebAuthn API and Discoverable Credentials.pdf from FIDO Alliance

FIDO Alliance Osaka Seminar: FIDO Security Aspects.pdf from FIDO Alliance

FIDO Alliance Osaka Seminar: CloudGate.pdf from FIDO Alliance

FIDO Alliance Osaka Seminar: PlayStation Passkey Deployment Case Study.pdf from FIDO Alliance

FIDO Alliance Osaka Seminar: Overview.pdf from FIDO Alliance

FIDO Alliance Osaka Seminar: Welcome Slides.pdf from FIDO Alliance

Thursday, 30. May 2024

Elastos Foundation

Elastos Cyber Republic 5: Introducing Our New Council Members

In a dramatic conclusion to the Elastos Cyber Republic DAO 5 elections, the votes were fiercely contested until the final moments. With candidates from diverse regions and backgrounds vying for positions, the atmosphere was charged with anticipation and excitement. The following individuals and teams emerged victorious, bringing with them visions of growth, innovation, and community engagement.

Introducing Our Newly Elected Council Members!

The newly elected council members bring a wealth of experience and a shared vision for the future of Elastos. Their responsibilities include guiding the community’s strategic direction, approving and overseeing proposals, and ensuring transparent governance.

- Gelaxy (China). Background: Technical team responsible for the development of main and side chains. Vision: Enhance functional development on the chain and standardize funding applications.
- Jon Hargreaves & Roger Darashah (United Kingdom). Background: Experienced in communications and strategic management. Vision: Communicate the benefits of the SmartWeb and ensure Bitcoin’s potential as a decentralized currency.
- Strawberry Republic (El Salvador). Background: Known for their perfect voting record and dedication to community engagement. Vision: Enhance transparency and promote grassroots art initiatives.
- Iggispopis CRC (Slovakia). Background: Strategic management and finance expertise. Vision: Address technical barriers and increase community engagement through active communication.
- WoW Africa: Lili Felix, Hannah West, Ian Anderson (Spain). Background: Diverse team with a focus on community development. Vision: Strengthen community collaboration and support technological innovation.
- Sash | Elacity 🐘 (United Kingdom). Background: Founder and CEO of Elacity, involved in developing dDRM; head of BeL2, Bitcoin Elastos Layer 2. Vision: Promote decentralized technologies and enhance community engagement through transparent interactions.
- Chen2rong2 (-). Background: Long-term supporter of Elastos with a focus on bridging internet and crypto developers. Vision: Build the SmartWeb and foster collaboration between eastern and western community members.
- Rebecca Zhu (Australia). Background: Community member dedicated to promoting transparency and engagement. Vision: Enhance community participation and support the development of decentralized applications.
- Tyro and his Friends (China). Background: Collaborative team approach to proposal evaluation. Vision: Promote stable node operation and actively engage with the community.
- PG (Cayman Islands). Background: Investment and financial expertise. Vision: Support project development and improve decision-making transparency.
- Leo (Hong Kong). Background: Experienced in web3 project development and investment. Vision: Enhance ecosystem support for Elastos and uphold principles of fairness and transparency.
- Mark E. Blair, M.D. (Canada). Background: Head of Strategy for BeL2 and experienced council member. Vision: Drive innovation, support BeL2 rollout, and explore investor opportunities.

The Tense Final Moments

The election’s final hours were marked by intense activity, with votes shifting as candidates rallied their supporters. Observers noted strategic manoeuvres and last-minute endorsements that significantly impacted the outcomes. The competition was particularly tight for the final seats, with candidates switching places as the deadline approached. The dynamic nature of the voting process showcased the community’s active engagement and commitment to the democratic principles of the Cyber Republic. It was both tense and fun!

Next Steps

With the election concluded, the new council members will undergo a 14-day transition period before officially taking office on June 13th. During this time, they will deploy their council nodes, ensuring they are operational by June 12th. These nodes play a critical role in validating blocks on the Elastos sidechains, a key component of the ecosystem’s security and functionality.

The first Biweekly Council Meeting is scheduled for June 26th, marking the beginning of the new council’s term. These meetings will be essential for setting the strategic direction and addressing community proposals.

The successful election of the new Cyber Republic Council members marks a significant milestone for Elastos. With their diverse backgrounds and shared commitment to the community, they are well-positioned to drive innovation and foster growth. The community’s active participation and engagement throughout the election process underscore the strength of the Elastos ecosystem and its commitment to decentralized governance.

For more information about the newly elected members and their visions, visit the Elastos website, follow the Cyber Republic Twitter and join the ongoing discussions within the community Telegram.

 


Origin Trail

SingularityNET and OriginTrail: Advancing Decentralized Knowledge Graphs

An innovative collaboration has emerged in the AI sector, as SingularityNET, a leading AI platform developer headquartered in Zug, Switzerland, and Trace Labs, the core development company of OriginTrail, based in Hong Kong, have just announced a strategic partnership aimed at supporting the development of the Knowledge Layer — the Internet of Knowledge.

This partnership signifies that two prominent players in the AI industry have come together to support a decentralized ecosystem where AI agents and infrastructure partners collaborate within the decentralized knowledge graph (DKG) landscape.

OriginTrail is an ecosystem dedicated to building a Verifiable Internet for AI, and this partnership marks the beginning of their collaboration with SingularityNET’s leading AI platform and robust ecosystem.

In some of the key highlights of this partnership, Trace Labs is tasked with developing sophisticated infrastructure that allows for efficient access and retrieval of information stored on the DKG, tackling challenges of AI hallucinations, bias, and model collapse due to an explosion in the amount of synthetic data produced by AI. This effort is aimed at enhancing the functionality and responsiveness of the decentralized knowledge graph within the OriginTrail network.

SingularityNET will then provide users access to its decentralized platform, where specialized AI models and Large Language Models (LLMs) can be purchased and used for data analysis. These models are designed to operate seamlessly with the data supported by the OriginTrail network, fostering a more robust ecosystem. The company will also develop AI models that can be trained directly on the Decentralized Knowledge Graph. This approach helps realize the shared vision of the two partners; eliminating the need for data centralization, and leveraging the decentralized nature of the blockchain to enhance privacy and security.

Leveraging SingularityNET’s leading position in mission-critical research of Artificial General Intelligence (AGI) and Trace Labs’ experience in commercializing Web3 and AI solutions, the strategic partnership is in particular aimed at solving the real world challenges with decentralized AI within the key sectors, such as Industry 4.0, decentralized science (DeSci), real world assets (RWA), and education.

Dr. Ben Goertzel, CEO of SingularityNET, stated, “As we move from an Internet of documents, media and apps to an internet of knowledge and AI, the basic composition of the Internet as a ‘decentralized network of decentralized networks’ becomes ever more important. Both SingularityNET and Trace Labs have powerful capability to grow decentralized networks around knowledge graphs and associated AI capabilities; connecting these networks together into a cross-linked, cross-token meta-network will yield a host of different synergies enabling a broad-based boost in intelligent functionality. As a practical example: Putting together a subgraph of OriginTrail’s DKG decentralized knowledge graph covering shipping logistics, with a knowledge meta-graph living in the OpenCog Hyperon system deployed on SingularityNET covering the timing of events in various markets, one could achieve an unprecedented level of emergent knowledge in the minds of AI agents carrying out supply-chain planning and forecasting. SingularityNET’s new MeTTa-Motto tool integrating Hyperon symbolic AI with LLMs and other deep neural nets could play a critical role here. Similar examples exist in every vertical market, which gives this partnership an almost unbounded potential for economic benefit and human good.”

Žiga Drev, Managing Director of Trace Labs, added, “The benefits of AI are limitless and so are the risks, like hallucinations and model collapse as the growth in synthetic data outpaces the provision of real world data. A truly open AI that fosters inclusion and a more equitable distribution of value can only be achieved through a collaborative and modular approach. We are proud to partner with SingularityNET, founded and led by the visionary Dr. Ben Goertzel, who coined the term Artificial General Intelligence (AGI). Working alongside the leading visionaries at the convergence of crypto and AI opens exciting opportunities to accelerate real-world adoption of neuro-symbolic AI, combining the power of OriginTrail Decentralized Knowledge Graph (DKG) and SingularityNET’s specialized marketplace for Large Language Models (LLMs) and other AI models.”

Both organizations will also engage in collaborative marketing and social media efforts to promote their partnership and the innovations it brings to the blockchain world, the AI industry, and the intersection of these two sectors. The two partners are also jointly exploring plans to integrate AI services into Trace Labs’ DKG and paranets.

The partnership is about working together today to solve the challenges of tomorrow; synergizing blockchains and knowledge graphs for safe, verifiable AI that emphasizes secure and privacy-preserving mechanisms for user authentication and authorization within the knowledge layer, and ultimately ensuring all data processing by users is rooted in their consent.

About SingularityNET

SingularityNET was founded by Dr. Ben Goertzel with the mission of creating a decentralized, democratic, inclusive and beneficial Artificial General Intelligence (AGI). An AGI is not dependent on any central entity, is open to anyone and is not restricted to the narrow goals of a single corporation or even a single country. The SingularityNET team includes seasoned engineers, scientists, researchers, entrepreneurs, and marketers. Our core platform and AI teams are further complemented by specialized teams devoted to application areas such as finance, robotics, biomedical AI, media, arts and entertainment.

About OriginTrail

OriginTrail is an ecosystem building a Verifiable Internet for AI, providing an inclusive framework that tackles the world’s challenges in the AI era, such as hallucinations, bias, and model collapse, by ensuring the provenance and verifiability of data used by AI systems. OriginTrail is used by global leaders like the British Standards Institution, Swiss Federal Railways, Supplier Compliance Audit Network (SCAN), representing over 40% of US imports and several consortia funded by the European Union among others. Advised by Turing award winner Dr. Bob Metcalfe, renowned for his law of network effects, the Trace Labs team (OriginTrail core developers) plays a crucial role in promoting a more inclusive, transparent, and decentralized AI.

SingularityNET and OriginTrail: Advancing Decentralized Knowledge Graphs was originally published in OriginTrail on Medium, where people are continuing the conversation by highlighting and responding to this story.


OpenID

AuthZEN Work Group Announces Authorization Interop Results

Conformance with the AuthZEN request/response protocol marks a significant milestone in simplifying and standardizing authorization approaches

“The OpenID Foundation led the standardization of authentication protocols with OpenID Connect and now we are proud to host the AuthZEN Working Group as they seek to do the same for authorization.”

— Gail Hodges, Executive Director for OpenID Foundation

LAS VEGAS, NV, USA, May 28, 2024 /EINPresswire.com/ — The OpenID Foundation AuthZEN Working Group announced today that select leading authorization vendors successfully achieved conformance with the AuthZEN request/response protocol, marking a significant step in bringing interoperability and standardization to the authorization market. The industry leaders include 3Edges, Aserto, Axiomatics, Cerbos, Permit.io, Rock Solid Knowledge, SGNL.ai, Strata Identity, and Thales, demonstrating their commitment to documenting common authorization patterns, defining standard mechanisms for communication between authorization components, and recommending best practices for developing secure applications.

Established through the OpenID Foundation, the AuthZEN Working Group’s focus is to tackle the complexities of authorization, to promote decoupling and externalizing authorization logic from applications, and to simplify the implementation of a robust authorization layer that can be edited and audited with ease within diverse application environments. With members from leading authorization vendors, the group aims to unify and standardize the way authorization decisions are enforced across varying platforms, with an initial focus on a specification that ensures interoperability and integration between policy enforcement points and decision points. This initiative draws on the expertise of leading companies in the security and authorization space, fostering a collaborative approach to enhancing the scalability and security of access control systems.

The AuthZEN Working Group is currently focused on three key areas to improve interoperability:
1. Defining a standard for the communication flow between policy enforcement points and policy decision engines.
2. Creating a standard for communicating access policies to policy decision points.
3. Identifying and documenting common usage patterns and recommended best practices.

The working group recently completed successful interoperability testing, which included a defined interop scenario in the form of a Todo application. Participating companies include 3Edges, Aserto, Axiomatics, Cerbos, Permit.io, Rock Solid Knowledge, SGNL.ai, Strata Identity and Thales – all of whom achieved success in this testing.
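At the wire level, the exchange exercised in the Todo interop scenario is a small JSON request from a policy enforcement point (PEP) to a policy decision point (PDP), answered with a boolean decision. The sketch below is illustrative only: the field shapes follow the draft AuthZEN evaluation API, while the helper names and the toy ownership policy are assumptions, not any vendor’s implementation.

```python
# Illustrative sketch of an AuthZEN access evaluation exchange between a
# policy enforcement point (PEP) and a policy decision point (PDP).
# Field shapes follow the draft AuthZEN evaluation API; the helper names
# and the toy ownership policy below are assumptions for illustration.
import json


def build_evaluation_request(subject_id: str, resource_id: str, action: str) -> dict:
    """PEP side: assemble the JSON body POSTed to the PDP's evaluation endpoint."""
    return {
        "subject": {"type": "user", "id": subject_id},
        "resource": {"type": "todo", "id": resource_id},
        "action": {"name": action},
    }


def evaluate(request: dict, owners: dict) -> dict:
    """Toy PDP: allow reads to everyone, writes only to the todo's owner."""
    subject = request["subject"]["id"]
    resource = request["resource"]["id"]
    action = request["action"]["name"]
    if action == "can_read_todo":
        decision = True
    else:
        decision = owners.get(resource) == subject
    # AuthZEN evaluation responses carry a boolean decision
    return {"decision": decision}


owners = {"todo-1": "alice@example.com"}
req = build_evaluation_request("bob@example.com", "todo-1", "can_update_todo")
print(json.dumps(evaluate(req, owners)))  # denied: bob does not own todo-1
```

The value of the standard is that the PEP above could be pointed at any conformant PDP, whichever vendor implements it, without changing the request it builds.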

The AuthZEN Working Group is open to all organizations committed to the goal of improving interoperability and standardization in authorization. For more information, visit https://openid.net/wg/authzen/ and https://authzen-interop.net/.

Quotes:
Gail Hodges, executive director for OpenID Foundation
“As more and more players offer externalized authorization, it is critical that we ensure safe and secure patterns across implementations. The OpenID Foundation led the standardization of authentication protocols with OpenID Connect and now, ten years later, we are proud to host the AuthZEN Working Group as they seek to do the same for authorization.”

Derek Small, co-founder and president for 3Edges
“The OpenID AuthZEN Working Group is tackling authorization challenges faced by organizations of every nature and size. Dynamic authorization is cataloging the rich authorization patterns that support authorization decisions between organizations and varying platforms. As this Working Group continues its mission to address interoperability and standards in support of authorization policies of today and those of the future, 3Edges remains committed to supporting the critical workings of the AuthZEN Working Group and to supporting this interop at Identiverse 2024.”

Omri Gazitt, co-founder and CEO for Aserto
“Interoperable authentication is mostly a solved problem, thanks to standards such as SAML and OpenID Connect. But we haven’t yet had our ‘OIDC moment’ in the authorization space. The OpenID AuthZEN Working Group is the definitive effort to get us there, and Aserto is proud to be among the first vendors to adopt it.”

David Brossard, chief technology officer (CTO) for Axiomatics
“Put simply, the goal here is to become the OAuth of authorization. We’ve taken the lessons learned from the past 15 years working to implement authorization for our customers along with the standardization efforts within OASIS XACML to produce an even simpler, more lightweight PEP-PDP protocol. Axiomatics is proud to support the work to facilitate integration between applications and externalized authorization services, raising the quality and security of authorization.”

Alex Olivier, co-founder and chief product officer (CPO) for Cerbos
“Cerbos is a proud contributor and early adopter of the OpenID AuthZEN specification enabling external authorization portability. This standardization effort provides software builders with the confidence to adopt a more secure and scalable access control layer in their applications.”

Or Weis, CEO for Permit.io
“Enterprises spend months and sometimes years struggling to apply authorization to their applications. Reinventing wheels due to the lack of standards in the space. At Permit.io, we’re excited to be early backers of the AuthZEN standard and its promise to unify simplicity across the landscape.”

Andrew Clymer, co-founder for Rock Solid Knowledge
“With many years of experience building single sign-on (SSO) solutions based on open standards, we are proud to support the AuthZEN Working Group in delivering open standards for authorization. As an early adopter of the draft standard, we are excited to make our .NET authorization engine, Enforcer, accessible to heterogeneous environments.”

Atul Tulshibagwale, chief technology officer (CTO) for SGNL.ai
“The AuthZEN standard will be critical to achieving externalized management of authorization. SGNL is proud to have initiated standardization activity by contributing the first draft spec and is happy to participate in the interoperability event.”

Gerry Gebel, vice president of product and standards for Strata Identity
“Interoperability is a core capability needed for enterprises to securely deploy authorization services in complex environments comprised of systems from multiple vendors. Strata is honored to support the demo with the Hexa Policy Orchestration’s integration with Open Policy Agent (OPA). Identiverse is the logical place for this first interoperability demonstration to occur since the AuthZEN working group was founded based on meetings held at this event.”

Bertrand Tavernier, vice president/chief technical officer for Thales (secure communication and information systems)
“Thanks to the support of the OpenID AuthZEN Working Group and leveraging our long-standing experience with the OASIS XACML standard, we were glad to demonstrate the capability of our AuthzForce solution to combine XACML policy expressiveness and versatility with the new, ultra-lightweight AuthZEN authorization API for the great benefit of customers, especially in edge computing environments.”

About the OpenID Foundation

The OpenID Foundation (OIDF) is a global open standards body committed to helping people assert their identity wherever they choose. Founded in 2007, we are a community of technical experts leading the creation of open identity standards that are secure, interoperable, and privacy preserving. The Foundation’s OpenID Connect standard is now used by billions of people across millions of applications. In the last five years, the Financial Grade API has become the standard of choice for Open Banking and Open Data implementations, allowing people to access and share data across entities. Today, the OpenID Foundation’s standards are the connective tissue to enable people to assert their identity and access their data at scale, the scale of the internet, enabling “networks of networks” to interoperate globally. Individuals, companies, governments and non-profits are encouraged to join or participate. Find out more at openid.net.

The post AuthZEN Work Group Announces Authorization Interop Results first appeared on OpenID Foundation.

Wednesday, 29. May 2024

Ceramic Network

CeramicWorld 04

Welcome to the 4th edition of CeramicWorld! Last month has been busy with new features and developments happening across the Ceramic ecosystem. Here’s a TL;DR for what has happened:

- Ceramic released a new library for building points on Ceramic
- Index is making strides in powering decentralized AI
- Ceramic Anchor Service (CAS) is moving away from IPFS pubsub
- Ceramic’s Data Feed API is here

A new library for building points on Ceramic has dropped 🔥

The Ceramic core team has just released the points library, designed to help developers start experimenting with reputation systems on Ceramic and iterate quickly. The points library serves various use cases, including implementing rewards for community engagement, incentivizing collaboration, or facilitating community quests and educational initiatives.

As an illustration of its functionality, the Points library has been integrated into the ComposeDB Sandbox. By following the steps outlined in the Sandbox, you can earn your first points, stored on Ceramic! 🔥

And that’s not all. In addition to the points library, there is also a points demo application that you can use as the basis of your own project. This example app showcases how the points library can be utilized to incentivize community members for their engagement, such as participation on key social platforms. To learn more, check out the comprehensive tutorial, and watch the accompanying video walkthrough:

Start building with points library

Powering Decentralized AI with Composable Data

Index, a composable discovery protocol, is revolutionizing web discovery by empowering users to curate personalized search engines, known as indexes, tailored to their specific needs.

Built on Ceramic's decentralized graph database, Index is driving decentralized AI. It leverages a decentralized semantic index, utilizing ComposeDB, to enable semantic interoperability and a highly relational composition of use cases. By integrating AI-based discovery functions with Ceramic's decentralized network, Index ensures data integrity, privacy, and personalized discovery experiences across peers, mitigating privacy risks associated with centralized systems. This interconnected approach fosters a discovery experience where responses are both personalized and trusted, exemplified by a chat setup drawing from both community and private indexes.

Today, Index network is partnering with a number of projects in the ecosystem, including Ceramic, Lit Protocol, Fluence, Disco, Intuition Systems, Verax, and Olas Network, among others.

Just a few days ago Index team announced the collaboration with LangChainAI which enables developers to seamlessly build their composable semantic indexes with LangChain's suite of LLM pipelines. The new integration is supported in both Python and JavaScript.

Explore the Index Network

Ceramic Anchor Service (CAS) is moving away from IPFS pubsub

Earlier this month, the core Ceramic team shared an update regarding the work that has been done to enhance Ceramic network's reliability, scalability, and performance, particularly focusing on the Ceramic Anchor Service (CAS). As the team is approaching the release of Ceramic’s Recon protocol, CAS is transitioning away from using IPFS pubsub to synchronize Streams in Ceramic. To facilitate this, a new HTTP-based mechanism has been developed for sharing Time Events from CAS to Ceramic nodes. This eliminates the dependency of newer Ceramic nodes on IPFS pubsub.

To ensure seamless data delivery and prevent potential data loss, it's crucial for all Ceramic nodes to be upgraded to at least v5.3. Learn more here.

New Ceramic features: SET Account Relations, Immutable Fields and shouldIndex flag

Earlier this month, a set of new features providing a sophisticated toolkit for data management, have been added to Ceramic. More specifically, you can now use the following tools to build your applications:
- SET account relation - enabling users to enforce a constraint where each user account (or DID) can create only one instance of a model for a specific record of another model.
- Immutable fields - allow specific data to be prevented from being altered.
- shouldIndex flag - gives developers an option to manage the data visibility by choosing which fields should be indexed.
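The SET account relation is essentially an upsert keyed on the pair (account, target record). The following minimal Python sketch (not ComposeDB code; all class and method names are illustrative) shows the constraint it enforces:

```python
# Minimal sketch (not ComposeDB code) of the uniqueness constraint a SET
# account relation enforces: at most one model instance per
# (account DID, target record) pair. All names here are illustrative.
class SetRelationStore:
    def __init__(self):
        self._instances = {}  # (did, target_id) -> document

    def create_or_replace(self, did: str, target_id: str, doc: dict) -> dict:
        # A second write by the same DID for the same target record updates
        # the existing instance instead of creating a duplicate.
        self._instances[(did, target_id)] = doc
        return doc

    def get(self, did: str, target_id: str) -> dict:
        return self._instances[(did, target_id)]

    def count(self) -> int:
        return len(self._instances)


store = SetRelationStore()
store.create_or_replace("did:key:alice", "post-1", {"rating": 4})
store.create_or_replace("did:key:alice", "post-1", {"rating": 5})  # replaces
store.create_or_replace("did:key:bob", "post-1", {"rating": 3})
print(store.count())  # 2: one instance per (account, record) pair
```

This is the behavior you would want for data like ratings or RSVPs, where a second submission by the same account should supersede the first rather than accumulate.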

Learn more about these features in a written tutorial and a video walkthrough.

Data Feed API is here!

Data Feed API is finally out. The new Data Feed API formalizes the underlying Ceramic event streaming protocol and allows developers to build data(base) products using raw event streams.

The release finalises the initial stage of implementation of Data Feed API with additional features and improvements coming in the near future. Check out the announcement blogpost and keep an eye on Ceramic Roadmap for updates.

Ceramic Community Content

- TRENDING: We Built a Web3 Points Library on Ceramic by Mark Krasner
- TUTORIAL: Web3 Points Library: Example App Tutorial by Mark Krasner
- VIDEO: Building with Ceramic Points Library by Radek Sienkiewicz
- ANNOUNCEMENT: Upgrade your Ceramic node to v5.3
- BLOGPOST: Index Network x Ceramic Network: Decentralized AI with Composable Data
- TUTORIAL: Ceramic Feature Release: SET Account Relations, Immutable Fields and shouldIndex flag by Justina Petraityte

Upcoming Events

- May 29 - May 31: GenAI summit

Contact Us

Want to get in touch with the Ceramic core team? Fill out this form (1m). Otherwise, drop us a note in the Forum.


DIF Blog

"Streamlining KYC" Webinar Recap

Earlier this month, Otto Mora from Polygon ID and Kim Duffy from DIF hosted a webinar titled "Credential Schemas: Streamlining KYC." We were thrilled with the enthusiastic participation and valuable feedback we received from attendees.

Throughout the webinar, we conducted interactive polls to gather real-time feedback from participants. Here's a recap of the key highlights, discussions, and feedback throughout the session.

Understanding Decentralized Identity

We began with an introduction to decentralized identity, explaining the technologies and principles that empower individuals to control their digital identities and personal data. Key benefits include:

- Privacy & Security: Ensuring data is shared only with necessary parties, protecting against misuse.
- Streamlined Onboarding: Facilitating efficient customer onboarding with trustworthy data.
- Bi-Directional Trust: Reducing fraud and fostering confidence in interactions.
- Trust & Transparency: Enhancing economic and social opportunities through new markers of trust and reputation.

We asked participants “In your opinion, what is the most impactful use case for decentralized identity?”. At 36%, “Enhancing privacy and security” took the lead, with “Streamlining onboarding processes” coming in second at 27%.

The responses to “Other” included interesting applications such as:

- Governance, membership, voting, and sybil resistance
- Virtual rights management (VRM)

And others added powerful cross-cutting differentiators of SSI including user-controlled identity and interoperability.

For the rest of the webinar, we focused specifically on the "streamlined onboarding" benefit.

Real-World Use Cases

We explored various use cases where credential schemas can make a significant impact in onboarding:

- KYC/KYB: Streamlining financial transactions with verified identity claims.
- AML: Reusable claims to help identify and protect against money laundering.
- Age Verification: Ensuring age-appropriate access to services and content.

We asked participants which credential schemas they (and their companies) are interested in. KYC took the lead, which was not surprising given the webinar theme.

We had a freeform option to enter other credential schemas of interest, and participants responded with:

- Known customer credential
- Verifiable product-related sustainability information

For the rest of the webinar, we focused specifically on Decentralized Identity and credential schemas as they benefit KYC/KYB use cases.

Challenges in Traditional KYC/KYB

We reviewed challenges with traditional KYC/KYB processes, including:

- High Costs: Traditional methods are costly and resource-intensive.
- Compliance Complexity: Meeting compliance requirements like AML assurance is challenging.
- Time Inefficiencies: Current processes are time-consuming and require substantial manual effort.

The Role of Credential Schemas

We highlighted the importance of credential schemas as data templates that ensure consistency and interoperability across systems. The benefits include:

- Facilitating Innovation: Enabling innovators to focus on new applications rather than compatibility issues.
- Enhancing Interoperability: Ensuring credentials are broadly recognized across different systems.
- Encoding Best Practices: Capturing best practices and recommendations from experts.
- Boosting Efficiency: Reducing the need for custom integration work.

DIF’s new Schema Work Item

We announced DIF's new schema work item, to be launched at the end of May, dedicated to credential schemas including:

- Basic KYC Model: For identity verification across financial services and other sectors.
- AML Schema: To comply with anti-money laundering regulations.
- Proof of Age: To verify age or age range for access to age-restricted content.
- Proof of Humanity: To confirm real human identities, useful for community governance and anti-fraud.

General Discussion

The discussions yielded valuable insights. One key takeaway: such schemas do not need to provide the complete answer; if they can shave off 80% of the verification work, that is a win.

We also discussed the importance of schema discoverability to promote convergence, which we'll discuss more later.

Thank You for Your Participation

We extend our heartfelt thanks to everyone who joined the webinar and contributed to the conversation. Your feedback was invaluable and underscored the relevance and potential impact of credential schemas in streamlining KYC processes.

As discussed during the call, an interdisciplinary, holistic approach is key to the success of this effort, and we'd love to continue the discussion with you!

Contact membership@identity.foundation if you'd like to get involved. We are targeting an end of May start date. You can join the work item later, but getting involved now ensures we factor in your availability.


MyData

MyData goes to Brussels! Finishing our member consultation with OECD on the role of Trusted Data Intermediaries in enhancing individual agency and control

A group of MyData staff, members and friends co-hosted a workshop with OECD on the role of Trusted Data Intermediaries at this year’s CPDP conference in Brussels. We were happy to be able to take our members along to help facilitate these important discussions, presenting case studies that highlight the importance of a human-centric approach. […]

Next Level Supply Chain Podcast with GS1

From TikTok to Checkout - How Social Commerce is Revolutionizing the Supply Chain with Wes Duquette, ShipBob


Social commerce is booming, and today’s conversation proves that if you haven’t considered selling on social media, then it’s time you did.

Wes Duquette is VP and GM of B2B and Retail at ShipBob, a tech-enabled 3PL that empowers small and medium businesses with advanced supply chain capabilities. In this episode, he speaks with Liz and Reid about the powerful impact of influencers and platforms like TikTok and Instagram in driving actionable purchases. And it’s no surprise that QR codes are revolutionizing the advertising landscape.

Wes shares invaluable insights on the critical importance of an omnichannel strategy, global expansion, and scalable infrastructure for brands aiming for growth. His pointers on barcoding, packaging, and back-office systems provide a stellar roadmap for future retail scalability. He discusses the evolving dynamics of direct-to-consumer channels and alternative marketplaces and understands why registering with giants like Amazon is crucial—even if you're a dedicated Shopify seller. 

 

Key takeaways: 

How influencers and platforms like TikTok are transforming retail engagement and driving omnichannel growth strategies for brands

Learn about the operational scalability challenges faced by SMBs during retail launches and discover strategies for effectively navigating high-volume demands and pivotal retail relationships.

Critical infrastructure elements such as barcoding, packaging, and robust back-office systems are essential for ensuring future scalability and seamless integration across various retail channels and marketplaces.

 

Connect with GS1 US:

Our website - www.gs1us.org

GS1 US on LinkedIn

 

Connect with guest:

Wes Duquette on LinkedIn

Check out ShipBob

 


FIDO Alliance

FIDO Alliance addresses accuracy and bias in remote biometric identity verification technologies with first industry testing and certification program


Face Verification Certification launched to bring confidence to ID ecosystem among rising online identity theft and bias concerns  

May 29, 2024 – The FIDO Alliance announced today the launch of the first globally available certification program to test and certify the performance of remote biometric identity verification technology when verifying a user against a trusted identity document for accuracy, liveness, and bias. The Face Verification Certification program comes at a time of soaring demand for face biometric identity solutions and recognition of the importance of robust enrollment and identity re-binding processes to the overall security of online accounts. Dynamic Liveness, the science powering the iProov Biometric Solution Suite’s Remote Onboarding Solution and Authentication Solution, is the first product to pass the rigorous certification testing.

The certification program, consisting of 10,000 tests at a minimum, assesses a biometric system’s performance across different demographics, including skin tone, age, and gender. It measures resistance to spoof and deepfake attacks with Imposter Attack Presentation Accept Rate (IAPAR), and also assesses the usability and security of solutions by measuring False Reject and Accept Rates (FRR and FAR respectively). The certification also tests “selfie match” capabilities to ensure a user’s “selfie” matches their government-issued ID in the initial account setup process. 
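As a rough illustration of how False Accept Rate and False Reject Rate relate to a decision threshold, here is a minimal sketch; the scores and the 0.5 threshold are invented illustration values, not drawn from the certification program itself.

```python
def far_frr(genuine_scores, impostor_scores, threshold):
    """FRR: fraction of same-person comparisons wrongly rejected.
    FAR: fraction of different-person comparisons wrongly accepted."""
    frr = sum(s < threshold for s in genuine_scores) / len(genuine_scores)
    far = sum(s >= threshold for s in impostor_scores) / len(impostor_scores)
    return far, frr

genuine = [0.91, 0.88, 0.45, 0.97]   # same-person comparison scores (made up)
impostor = [0.12, 0.55, 0.08, 0.30]  # different-person comparison scores (made up)

far, frr = far_frr(genuine, impostor, threshold=0.5)
print(far, frr)  # 0.25 0.25
```

Raising the threshold lowers FAR but raises FRR (and vice versa), which is why a certification program measures both: usability and security trade off against each other at the chosen operating point.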

Combating bias and deepfake threats in biometric ID systems

For the last two years, biometrics has ranked as global consumers’ preferred way to log in and the method they believe is most secure. However, as governments and businesses around the world roll out remote identity solutions, two urgent issues remain to be addressed – bias in some biometric systems and new security threats. 

Organizations including NIST have been closely monitoring the disparities in performance for some time – with NIST’s most recent evaluation of solutions across different demographics released this year. The issue of bias is tightly linked to brand reputation too; new research from the FIDO Alliance released today found that 50% of American and British consumers said they would lose trust in an organization if its biometric system was found to be biased, while 22% said they’d stop using a service entirely. Similarly, the generative AI boom has heightened security apprehensions about online verification; the same survey revealed that over a third of consumers (37%) are more concerned about verifying themselves online due to the rising number of deepfakes. In ENISA’s latest remote ID report, researchers observed that deepfake injection attacks are increasing in frequency and sophistication, and that deepfake presentation and injection attacks remain the two biometric attack types most difficult to mitigate.

Bringing trust to the ID ecosystem

Commenting on the news, Andrew Shikiar, Executive Director & CEO of the FIDO Alliance, said: “Remote identity solutions unlock huge benefits for governments, organizations, and consumers alike, but as appetite grows across the globe, there are understandable concerns mixed with excitement. Identity theft is rising, while bias in biometric systems has caused organizations to delay or reconsider implementations at a time when inclusivity and accessibility have never been more important.

“Certification unlocks the power of open standards and catalyzes ecosystem-wide innovation and opportunity. With iProov’s market-first certification for biometric face verification now completed, we look forward to serving additional providers who understand the value of independent, accredited lab testing. This new certification program provides a launchpad that enables all stakeholders to fast-track deployments that are robust enough for the modern threat landscape and work well for everyone, anywhere in the world.” 

Leading biometrics solutions provider, iProov, has become the first vendor to complete the rigorous certification process. iProov provides market-leading biometric solutions that protect the world’s most security-conscious organizations from deepfakes and other types of identity fraud. Andrew Bud, founder and CEO at iProov said: “Biometrics are a powerful tool that organizations can utilize to facilitate secure, inclusive, and user-friendly interactions online. Each of these three fundamental components must be given equal consideration as organizations evaluate their options. With the FIDO Face Verification Certification program, organizations now have a trusted compass for navigating these decisions. We applaud The FIDO Alliance for addressing the importance of biometric identity verification to strengthen the full user identity lifecycle. Independent certification creates a much-needed quality benchmark for this evolving technology and further demonstrates our ability to provide trusted identity assurance in an age of AI threats and identity fraud.” 

Testing requirements are built upon proven ISO standards and are developed by a diverse international authority of stakeholders, including industry, government, and independent subject matter experts. Participating vendors can benefit from identifying gaps in product performance and demonstrating clearly to the market their solutions can be trusted, which can reduce individual testing needs and boost adoption. Two independent labs are currently accredited to support this certification – Ingenium Biometrics and TÜV Informationstechnik (TÜV NORD GROUP) – with more expected to follow later this year. 

The program expands upon the Alliance’s existing Biometric Component Certification and Document Authenticity (DocAuth) Certification programs and demonstrates FIDO’s ongoing commitment to meet marketplace demand and address evolving threats with third-party certifications. Combined, these programs provide unrivaled end-to-end assurance to implementing organizations, consumers and vendors and support the world’s migration to more secure digital verification systems and passwordless security.


FIDO Alliance Releases New Design Guidelines for Optimizing User Sign-in Experience with Passkeys


May 29, 2024 – The FIDO Alliance today released new design guidelines to help accelerate passkey adoption and deployment. 

The FIDO Design Guidelines aim to help online service providers design a better, more consistent user experience (UX) when signing in with passkeys.

The guidelines are developed for designers, engineers, product managers, content strategists, and UX researchers to use for reference and guide their initial implementation of passkeys and expansion of passkey support over time.

The new guidelines are available at https://fidoalliance.org/design-guidelines/

“As organizations are increasingly deploying passwordless authentication based on FIDO standards around the world, the end users of passkeys – along with the practitioners implementing them – have become top priorities for successful adoption,” said Andrew Shikiar, Executive Director and CEO of The FIDO Alliance. “Our research shows consumers and employees are adopting phishing-resistant passkeys at a rapid pace while relying organizations are experiencing cost savings and fewer security incidents.  By continuing our investment in the evolving user experience, the FIDO Alliance is committed to ensuring brands have a consistent and accessible set of guidelines that are fully aligned with design best practices and FIDO technology requirements. We encourage online service providers everywhere to use these publicly available guidelines to enhance the user experience and enjoy greater success with FIDO passkey deployment and adoption.”

Following the first release of FIDO UX guidelines for passkeys in 2022, the 2024 Design Guidelines have been updated with optimization included for service providers evaluating and deploying passkeys. 

The 2024 Design Guidelines are organized into five sections to provide clear guidance, confirm design principles, and offer flexibility:

User experience research: Provides confidence that the guidelines are informed by design research
Principles: 10 UX principles and 3 content principles for passkeys that are core to any passkey implementation
“Get started” design patterns: Patterns are the heart of the guidelines, containing self-contained experiences that can be combined to match unique business needs
Optional design patterns: Patterns that can be added after the “get started” patterns over time
Resources: Provides additional resources like events, Figma UI kits and community groups to jump-start work with passkeys

The FIDO UX Working Group created the guidelines and comprises 131 UX researchers, designers, and PMs from 31 global brands. The guidelines were created in partnership with usability research firm Blink UX – with added underwriting support from 1Password, Dashlane, Google, HID, Trusona, U.S. Bank, and Yubico.

Hear More about the Design Guidelines

Learn about the 2024 Design Guidelines at Identiverse 2024 in Las Vegas May 28-30, 2024. To learn more, visit the FIDO Alliance website.

For a deeper dive, join these sessions from the upcoming Design Guidelines for Passkeys Webinar Series: 

June 11 | 2:00 PM ET | Essentials for Adopting Passkeys as the Foundation of Your Consumer Authentication Strategy
June 18 | 2:00 PM ET | Aligning Authentication Experiences with Business Goals
June 25 | 2:00 PM ET | Drive Revenue and Decrease Costs with Passkeys for Consumer Authentication
July 2 | 2:00 PM ET | Design Guidelines for Passkeys: Ask Us Anything!

Registrants of this webinar series will have access to all events both live and on-demand after they air. To register, click here.


Remote ID Verification – Bringing Confidence to Biometric Systems Consumer Insights 2024

How can we verify users are genuine in an online world?

Online identity theft has steadily risen in recent years, while the generative AI boom has driven a new wave of deepfake-powered attacks that threaten remote enrollment and identity security. Meanwhile, bias in biometric systems has been monitored for some time, varying significantly across solutions and impacting consumer trust and perception of the technology.

As the adoption of remote identity verification technology rises, two critical concerns and challenges need to be addressed: bias and new security threats.

The FIDO Alliance recently sponsored an independent study of 2,000 respondents across the U.S. and the U.K. to understand consumer perception towards remote identity verification, online security, and biometrics. This eBook reveals those insights on remote biometric face verification, including how many people have used biometric face recognition successfully, and their opinions on verification accuracy, potential discrimination, and concerns about deepfakes.

Key findings

Consumers want to use biometrics to verify themselves online more, especially in sensitive use cases like financial services – where nearly one in two people said they would use biometric technology (48%).
One in four feel they experience regular discrimination when using automated facial biometric systems (25%).
Equity in biometric systems is vital to trust – half said they would lose trust in a brand or institution found to have a biased biometric system (50%), and one in five said they’d stop using the service entirely (22%).
Over half of respondents are concerned about deepfakes when verifying identities online (52%).

Read the full results of the survey in this eBook, learn about consumers’ experiences with biometric face verification technologies, and discover how organizations can improve global digital access, authentication, and security when leveraging these remote identity verification technologies.

Download the eBook

Tuesday, 28. May 2024

Me2B Alliance

Digital Harms Dictionary 2.0


A deep dive into online harms and what you can do about them.

Open Excel

The post Digital Harms Dictionary 2.0 appeared first on Internet Safety Labs.


Elastos Foundation

The Final Countdown! Update: New Candidates and Their Visions


With just over a day left for the Cyber Republic elections, we have new candidates and an exciting lineup of 16 members in total! In our previous article, we introduced 9 candidates and their manifestos to the community. Since then, we’ve received 4 more. Without further ado, here are their thoughts! Follow our guides if interested in running or voting. Want to learn what the Cyber Republic even is? Read here!

 

Leo

Email: leo@creda.app
Name: Leo
Current Council Member: Not elected as a CR council member

Background: “I am Leo, dedicated since 2017 to advancing Elastos in crucial areas like DPoS, BPoS, exchange listings, and mining pools. I’ve made significant contributions in forging vital partnerships, aiding Elastos’ progress. With 7 years of web3 project development experience, I’ve invested in dozens of web3 companies, with two yielding hundredfold returns. The fusion of technical prowess and astute investments underscores my capabilities. As an active member and staunch supporter of the Elastos community, I’m committed to contributing my expertise to its cause.”

Contribution Goals: “Creda possesses AI technical capabilities, providing services and ecosystem support for Elastos, making it one of the hottest tracks in the web3 space in 2024.”

Improvements: “In the past year, some CR proposers have violated CR rules and refused to cooperate. We hope that the next CR council members and proposers uphold principles of fairness, justice, and transparency to ensure adherence to and enforcement of CR rules.”

Website: Creda
Email: leo@creda.app

 

Zero

Email: zero.elastos@gmail.com
Name: Zero
Current Council Member: Yes

Experience:
“In the past year, as a CR committee member, I have witnessed the rapid development of the Elastos ecosystem, particularly with the introduction of the BeL2 strategy and the community’s support and efforts towards this strategic transformation. Through the CR mechanism, we reached multiple consensuses that propelled the ecosystem’s prosperity and development, fostering technological innovation and community co-construction. I believe the new CR committee will continue to lead the Elastos community in driving innovation, enhancing collaboration, and achieving ecological prosperity.”

Background:
“I have been part of the Elastos community since 2017 and have participated in the development of various ecosystem projects, specializing in product design and development. Through running for the CR committee, I hope to further contribute my skills and experience to the Elastos community.”

Contribution Goals:
“If elected, I will contribute to CRC by supporting the BeL2 strategy to drive technological innovation, strengthen community collaboration, and enhance the efficiency of ecosystem resource integration.”

Improvements:
“The CRC needs to improve decision-making transparency and community participation. If elected, I will work to enhance the transparency of the decision-making process, ensuring that community members can better understand and participate in decisions. Additionally, I will promote more community activities and communication channels to foster interaction and collaboration among members, thereby strengthening community cohesion and overall development.”

Social Media: Twitter
Email: zero.elastos@gmail.com

 

Mark E. Blair, M.D.

Email: markeblairmd@gmail.com
Name: Mark E. Blair, M.D.
Current Council Member: Yes

Experience:
“As a current member of the Cyber Republic Council (CRC), my experience over the past 4 years has been both rewarding and enlightening. I’ve had the privilege of collaborating with dedicated members like Sir Kahuna and Vegas Mike as part of ‘The Strawberry Council.’ Together, we have upheld the philosophy of ‘derived from the community, for the community,’ driving strategic initiatives that promote decentralization and community involvement within the Elastos ecosystem.”

Background:
“In my role, I have focused on enhancing Elastos’ contributions to the Bitcoin ecosystem and digital/data rights, especially through my work as the Head of Strategy for BeL2 since December 2023. This position has allowed me to explore investment opportunities and strategize the rollout of BeL2 infrastructure, providing me with a unique perspective on how to further our community’s goals.”

Contribution Goals:
“For another term, my aspirations include continuing to drive innovation and growth. I hope to continue to support the rollout and adoption of BeL2, which will not only increase use cases for the Bitcoin ecosystem but also boost Elastos’ visibility and adoption. Additionally, I am committed to exploring outside investor opportunities to increase the project’s ecosystem development and long-term viability.”

Improvements:
“In my opinion, while the CRC is currently effective in serving as a DAO, there are areas that could be further improved:

Increased Community Engagement: Encouraging more active participation from community members, ensuring their voices and ideas are heard and integrated into our strategic plans.
Transparency and Communication: Enhancing transparency in decision-making processes and improving communication channels between the CRC and the wider Elastos community.
Strategic Partnerships: Forming strategic partnerships with other blockchain projects and organizations to foster collaboration and mutual growth.
Innovation and Flexibility: Implementing a framework that allows for greater innovation and flexibility in adopting new technologies and approaches, keeping Elastos at the forefront of blockchain development.”

Social Media: Twitter
Telegram: @MarkEBlairMD

 

chen2rong2

Email: chen2rong2@icloud.com
Name: chen2rong2
Current Council Member: No

Background:
“I haven’t run for a council seat previously, and I will not run next year. However, I feel obligated to offer my help at this critical juncture of the Elastos initiative if I am elected. I have been with the Elastos project since the beginning. And I believe there is a need for a bridge between internet developers and crypto developers so they can work together to build a new Web3. I wish to share some of my thoughts with the Cyber Republic communities, especially in the DePIN area, because Web3 is a network first and foremost.”

Contribution Goals:
“‘You own your data’ — that’s for everyone. Web1, Web2, and Web3 are for all internet users, not just crypto enthusiasts. I will try to help build the SmartWeb, our interpretation of Web3, in any way I can. The SmartWeb can’t survive unless project developers, internet influencers, and e-business vendors can make profits on their own.”

Improvements:
“Currently, there is also a divide between the eastern and western members of the Cyber Republic communities. I hope to be an interpreter and help people see that both the world and the internet are common ground for us all.”

Social Media: Twitter
Private messages via: Twitter

The CR election promises to be a significant step forward in shaping the future of Elastos. We encourage all community members to engage with the candidates and participate in the voting process. You can follow the elections in the Essentials wallet here!


We Are Open co-op

The Business Case for Working Openly and Transparently

Greater agility, faster innovation, increased engagement

Image CC BY-ND Visual Thinkery for WAO

Excuse us as we go full on bizniz mode for the rest of this post. We’re quite serious about the benefits of openness and have built our cooperative as well as our individual careers on the fact that Open Source is about more than just code. What follows is the “business case” for shifting your organisation to a more open model. We’ve borrowed liberally from the Open Organization community, a group of people (including one of our members) who have written about the models and behaviours inside of FOSS for many years.

The evidence is clear: open working leads to greater agility, faster innovation, and increased engagement. Members of your organisation are more capable of working toward goals in unison and with shared vision. Ideas from both inside and outside the organisation receive more equitable consideration. Members clearly see connections between their particular activities and an organisation’s overarching values, mission, and spirit.

These advantages translate directly into better business performance and a stronger bottom line. If you want your organisation to obtain better results with the resources you currently have available, then embracing open working practices is one of your best paths toward sustainable success.

But don’t just take our word for it:

Agility: According to McKinsey & Company, organisations that adopt information transparency across their operations can achieve up to 30% gains in efficiency and customer satisfaction.
Innovation: Harvard Business Review reports that companies that support open and collaborative working environments can see a speed increase of up to 30% in their innovation cycles.
Engagement: Research by Gallup reveals that highly engaged business units realise a 41% reduction in absenteeism, a 17% increase in productivity, 10% higher customer ratings, a 20% increase in sales, and 21% greater profitability.

5 Characteristics of an Open Organisation

Open Organisations are defined as exhibiting the following characteristics:

Image CC BY-ND Visual Thinkery for WAO

Let’s explain what’s meant by each of these:

Transparency

Transparency is fundamental to open organisations: everyone involved in a project can access all relevant materials by default. As a result, team members are all able to review, assess, and contribute effectively to the work being done. Moreover, transparency means team members are not only informed about decisions and processes, but are encouraged to participate in discussions and provide feedback. Both successes and failures are discussed openly, meaning that valuable lessons can be learned, goals can be clarified, and roles well-defined. All of this enhances overall organisational accountability.

Inclusivity

Open organisations actively welcome diverse viewpoints and ensure that everyone has a voice in the future of both the project and the organisation as a whole. This can be achieved not only through well-established technical channels but also through social norms that encourage a range of perspectives. It’s important that there are clear protocols for participation so as to promote an inclusive environment where feedback is valued. Inclusive leadership means proactively drawing in overlooked voices and ensuring that all relevant opinions are heard and considered. This approach enriches the decision-making process and reinforces a sense of duty among employees to contribute meaningfully to discussions about their work.

Adaptability

The adaptability of open organisations is characterised by flexibility and resilience, grounded in collective problem-solving and collaborative decision-making. True adaptability requires feedback mechanisms that are accessible to both internal and external members; this allows for peer support and agency, which can have a clear impact on operational methods. This orientation towards continuous learning, and the readiness to adapt based on feedback, helps organisations avoid repeating mistakes and stay responsive to the changing needs of their environment.

Collaboration

In open organisations, collaboration is the default mode of operation. There is a general belief that working together produces superior outcomes, not only in project work but also in improving the organisation itself. A spirit of openness means that work is not only shared within the organisation but is also available for improvement by external parties, leading to greater innovation and continuous improvement. The ease with which people can discover, provide feedback on, and join ongoing work is a useful barometer of how welcoming and collaborative an open organisation is compared to others.

Community

The community aspect of open organisations is based on shared values and principles that guide participation and decision-making. These values should be clear to all members and help define the organisation’s boundaries and success criteria. Team members within this community approach are empowered to make significant contributions, drawing on a common ‘language’ that prevents miscommunication. Sharing knowledge and experiences is encouraged to further the group’s work, which strengthens the communal bond and ensures collective progress towards organisational goals.

As you can see, these characteristics are interconnected and intertwined, leading to a virtuous cycle of openness.

Practical Steps Towards Openness

Image CC BY-ND Visual Thinkery for WAO

Ultimately, working openly is about cultural change. A change in attitude as much as a change in processes.

To transition to more open working practices, consider the following steps:

Clear guidelines: Work together to document and share clear guidelines on what open working could look like in your organisation. This includes how information is shared, how feedback is given and received, and how decisions are made.
Culture of transparency: Encourage leaders and team members to share their work and the reasoning behind their decisions openly. This can be facilitated through regular meetings, shared digital workspaces, and open forums where ideas and projects are discussed transparently.
Inclusive communication: Ensure that all team members have equal access to communication tools and platforms. Use a variety of channels to accommodate different working styles and preferences, and encourage everyone to contribute to discussions and decision-making processes.
Collaboration across boundaries: Use technology and collaborative tools to break down silos within your organisation. Encourage teams to work together across departments and even invite external partners or customers to contribute to projects where appropriate.
Community engagement: Create spaces (physical/virtual/hybrid) where employees can interact, share ideas, and support each other. Regular community-building activities and team-building exercises can strengthen the communal bonds and enhance the shared sense of purpose.

Addressing Potential Challenges

Image CC BY-ND Visual Thinkery for WAO

While transitioning to an open working environment offers many benefits, it also comes with challenges that need to be managed. For example, information overload can be avoided by sharing information in a structured and digestible way so as not to overwhelm team members. Try using tools that allow for information to be categorised and easily searched.

While transparency is to be encouraged, clearly define what counts as sensitive information and ensure that robust security practices are in place to protect it. Provide training to team members on data protection standards and the importance of respecting privacy and confidentiality.

While open working encourages broader participation, this needs to be balanced with the need for efficiency. So set clear objectives and deadlines for collaborative projects to avoid decision-making bottlenecks or delays.

Some team members may resist changes to traditional ways of working. Overcome such cultural resistance by demonstrating the tangible benefits of open working practices through pilot projects and sharing success stories. Provide training and mentoring to help adaptation to the new ways of doing things.

Conclusion

Image CC BY-ND Visual Thinkery for WAO

The shift towards working openly and transparently is a strategic move that can significantly enhance an organisation’s agility, innovation, and engagement. By encouraging a culture of openness based on transparency, inclusivity, adaptability, collaboration, and community, organisations can create an environment where every member feels valued and empowered to contribute.

This approach not only drives better business performance but also cultivates a sense of shared purpose and commitment among employees. Embracing open working practices isn’t just an ‘initiative’ but a pathway to sustainable success and a robust bottom line. By implementing clear guidelines, engendering a culture of transparency, ensuring inclusive communication, promoting collaboration, and engaging the community, organisations can navigate potential challenges and realise the full benefits of open working.

As you would expect of an organisation called We Are Open, our cooperative can help you and your organisation on this journey. Get in touch, or take our free email-based course, to find out more!

The Business Case for Working Openly and Transparently was originally published in We Are Open Co-op on Medium, where people are continuing the conversation by highlighting and responding to this story.

Monday, 27. May 2024

IDunion

IDunion SCE network infrastructure open for commercial use

IDunion SCE, the neutral and democratically organised European cooperative created from the IDunion research consortium, has been offering their managed network infrastructure for industrial and productive use since April 2024.


As a neutral basic infrastructure, the distributed and redundant network enables every company to independently store and manage identity information and all types of company-related data. Companies can store unique and distinctive company identity-related identifiers (DID – Decentralised Identifier) on the IDunion network, which are permanent, unchangeable and verifiable and do not require the use of a central register. The identifiers can be linked to user-defined company data or company attributes. All company data can be managed according to predefined parameters and can of course also be revoked. IDunion SCE makes the identifiers available to companies as required and assigns corresponding write authorisations for network use on an individual contractual basis.

The network is based on the open source software Hyperledger Indy. All network nodes are operated exclusively by cooperative members in accordance with the principles of the IDunion SCE and monitored by the latter as a neutral body. The IDunion SCE and the infrastructures it manages and supports are subject to the principles of openness, neutrality, sovereignty, transparency, integrity and cooperative competition. All members of the cooperative have the same voting rights, regardless of their company size. This ensures that all participating companies can have an equal influence on the technical and organisational development of the IDunion SCE and thus the ecosystem. Companies can join the European cooperative as members at any time.


MyData

As a patient, the main reason to be in charge of my health data is to ensure its QUALITY

Patients – and citizens – should be in charge of their data to ensure correctness, completeness and accuracy, optimizing continuity of care from their providers and effective generation of datasets for public health innovation & policy-making. In a previous blog post arguing for an individual-centric European Health Data Space (EHDS), we presented the comments from […]

Identity At The Center - Podcast

Identiverse 2024 starts tomorrow! We did a little pre-confer


Identiverse 2024 starts tomorrow! We did a little pre-conference tailgating on this episode of the Identity at the Center podcast to discuss our plans for the conference before closing out with a quick discussion of the Hypr State of Passwordless Identity Assurance report. Watch the episode at https://www.youtube.com/watch?v=y7B9u9H-cN8 and check us out at idacpodcast.com

#iam #podcast #idac #identiverse2024

Friday, 24. May 2024

FIDO Alliance

CXMToday: Visa Unveils Card Updates


Built on the latest Fast Identity Online (FIDO) standards, the Visa Payment Passkey Service confirms a consumer’s identity and authorises online payments with a quick scan of their biometrics like a face or fingerprint. When shopping online, Visa passkeys replace the need for passwords or one-time codes, enabling more streamlined, secure transactions.


Identity Week: State of Michigan’s MiLogin supported by FIDO passkeys


The system leverages passkeys based on FIDO authentication promoting strong authentication, unifying Michigan’s approach to cybersecurity and improving the user experience.

The State of Michigan aimed to address several key objectives with the integration of passkeys, fortifying security and enhancing the digital user experience to access critical state government services.


FindBiometrics: Visa Brings Passkeys to Online Payments in Major FIDO Victory


Visa has introduced passkeys to the payment industry, enabling customers to authorize online purchases through a biometric scan on their smartphones or computers when making a purchase online.

This capability is powered by the Visa Payment Passkey Service, which is built on Visa’s Fast Identity Online (FIDO) server. The service allows merchants to integrate the Visa Payment Passkey Service into their checkout systems without needing to establish their own servers, thereby simplifying the setup process.

For users, this means they can use the same biometric authentication methods they use to unlock their devices to approve Visa payments online, with a one-time enrollment required during checkout. Visa also plans to extend enrollment options to banking apps in the future.

The development of passkeys was a collaborative effort among major technology companies such as Apple, Google, and Microsoft, which joined forces around 2012 to form the FIDO Alliance. This group aimed to overcome the limitations of traditional passwords by creating open standards for more robust authentication, involving biometric experts like HYPR and Nok Nok Labs.

FIDO released its first standards in 2014, setting the stage for authentication methods that do not depend on passwords. Subsequent advancements led to the establishment of the WebAuthn standard in 2019, which quickly gained acceptance among major web browsers. This progress facilitated the creation of passkeys, leveraging FIDO protocols to link authentication credentials to users’ mobile biometrics.

Visa’s recent move has been welcomed by supporters of FIDO and passkeys. HYPR’s co-founder and CEO, Bojan Simic, commented on the development in an online post, stating that nearly every regulated business he has interacted with in the past year has included a passkey initiative in its plans. “I’m so proud of the work that we have all done at the FIDO Alliance to make this a reality. When we wrote the first FIDO implementation in 2014 here at HYPR, seeing the top brands adopt the standard in a major way seemed like fantasy.”

In making this vision a reality, Visa joins several other prominent companies that have recently introduced support for passkeys, including PayPal, Samsung, and Amazon.

Thursday, 23. May 2024

Trust over IP

ToIP at EIC and Beyond: A Summer of Not-to-Be Missed Sessions

See all the conference sessions and talks in which ToIP members or our partners will be participating. The post ToIP at EIC and Beyond: A Summer of Not-to-Be Missed Sessions appeared first on Trust Over IP.

Spring conference season is in full swing and the Trust Over IP Foundation (ToIP) is heading to Europe to participate in several major conferences:

European Identity and Cloud Conference (EIC) — June 4-7, Berlin, Germany
Identity Week Europe — June 11-12, Amsterdam, Netherlands
Digital Identity unConference Europe (DICE) — June 18-20, Zurich, Switzerland

We will also be attending the second regional summit of the Sustainable & Interoperable Digital Identity (SIDI) Summit 2024 series.

Trust Over IP, which celebrates its fourth birthday this month, has long been known for its dual stack that emphasizes why we must combine technical interoperability with governance interoperability in order to establish truly interoperable digital trust ecosystems.

Although in our early history we were best known for our work on the governance side of the dual stack, at these June conferences our emphasis will be on two new technical specifications for which we just published Implementers Drafts: the Trust Spanning Protocol and the Trust Registries Protocol. Both of these protocols are designed to maximize interoperability and authenticity and minimize risk, while still enabling each ecosystem to make the choices they need for their specific context.

The following is a schedule of all the conference sessions and talks in which ToIP members or our partners will be participating.

 

Join us at EIC for a panel of ToIP Steering Committee Members, moderated by Executive Director, Judith Fleenor.

Panel: The Emerging Trust Layer for the Internet: Using Minimum Viable Protocols to Achieve Maximum Interoperability

Friday, June 07, 2024 13:30—14:30

Interoperability is not created alone, which is why ToIP collaborates with other standards organizations and certification bodies. Executive Director Judith Fleenor has invited the Executive Directors from Open Identity Exchange, OpenID Foundation, Open Wallet Foundation, Decentralized Identity Foundation, and the Kantara Initiative to unpack the role each foundation plays, how we collaborate to create the full picture for digital trust, and why it is so important for organizations and individuals to support and participate in several foundations, not just one.

Panel: Executive Directors Speak: Why We Need More Collaboration

Friday, June 07, 2024 14:30—15:30

At last year’s EIC, Judith gave a keynote on decentralized identity: why is it all the rage? Her premise was that the technologies that enable decentralized identity could be used not only for traditional identity and access management but, more importantly, for content authenticity and to assist with the challenges that AI brings. This year, AI will be a big topic at EIC, and several ToIP Steering Committee members will be addressing these topics in various sessions throughout the week.

On Wednesday, Wenjing Chu, Futurewei Technologies, and Drummond Reed, Gen Digital, will be on the panel:

Redefining Human Identity in the AI Era: From Digital Evolution to AI Integration

Wednesday, June 05, 2024 12:00—13:00

Following that Wenjing Chu will be giving the session:

Decentralizing AI: A Socio-Technical Path to Responsible and Trustworthy Tech

Wednesday, June 05, 2024 14:30—15:00

Continuing in the AI track, Wenjing will be joined by others for the following panel discussion:

Is Decentralization The Best Way to Achieve Transparency and Verifiability in AI?

Wednesday, June 05, 2024 15:00—15:30

Charting the Course: Diverse Approaches to AI Regulation in a Fractured World

Wednesday, June 05, 2024 15:30—16:10

AI isn’t the only opportunity-versus-threat we are facing in the near future; quantum computing is coming as well. Also on Wednesday, Steve McCown, Anonyome Labs, will be giving a presentation:

Post Quantum Security: Cryptography in Decentralized Identity

Wednesday, June 05, 2024 18:10—18:30

Another big topic at EIC will be the use of verifiable credentials and the digital wallets that hold them. Several sessions will explore these topics and give real-world examples of their use in production.

On Wednesday, Marie Wallace, Accenture, and Drummond Reed, Gen Digital, will be joined by others for a panel:

Real-World Examples of How Smart Wallets Will Transform How We Navigate Our Digital World

Wednesday, June 05, 2024 14:45—15:30

On Thursday, Daniel Bachenheimer, Accenture, will join others looking at the use cases in the travel industry.

Panel: The Future of Travel Credentials

Thursday, June 06, 2024 16:00—16:30

Also on Thursday, the topic of real world implementations continues when Andre Kudra, esatus, with others explore use cases in the construction industry:

Best Practice: DIDs and Verifiable Credentials in the Construction Industry

Thursday, June 06, 2024 17:30—17:50

And on Friday, Marie Wallace, Accenture, will join others in a panel discussion:

Decentralized Identity in Production

Friday, June 07, 2024 10:30—11:10

The above session will explore real-world solutions in use now, but before selecting which technologies to use for your ecosystem and wallets, it is important to know the differences each technology solution has to offer. Our friends at the Open Wallet Foundation (OWF) have put together a guide to help explain wallet security. Daniel Bachenheimer, Accenture, will present with another OWF board member on:

Open Wallet Foundation (OWF) Guide to Safe Digital Wallet Selection

Thursday, June 06, 2024 12:00—12:20

Daniel will deepen the conversation about digital wallets in his Friday session on how you bind a wallet to the natural person whose credentials it holds:

Digital Wallet Holder Binding

Friday, June 07, 2024 11:30—11:50

Another hot topic at this year’s EIC will be Organizational Identity.  Christoph Schneider, GLEIF, will be giving a presentation on this topic, as will ToIP active contributor, John Phillips, Sezoo:

Organizational Identity in the Private and Public Sector with the vLEI

Wednesday, June 05, 2024 18:15—18:30

Organisation Authentication as an Anti-Scam Measure

Friday, June 07, 2024 12:10—12:30

Let us not forget the importance of eIDAS 2.0. Andrew Tobin from Gen will kick off the track Digital ID Beyond the Enterprise (Session Stream III: Digital Identity and eIDAS 2.0) on Wednesday morning with a keynote address:

eIDAS 2.0: Game On, But What Game Is It?

Keynote

Wednesday, June 05, 2024 08:30—08:50

Within that track many of our friends will be sitting on panels and giving presentations, but not to be missed will be Viky Manalia, Intesi Group, who will be on the panel:

Latest on eIDAS Legislation and What it Means for People, Business and Government

Wednesday, June 05, 2024 11:00—11:40

Also in that track several of our ToIP contributors will speak on the panel:

Reusable Identity and Bootstrapping Decentralized Identity Ecosystems

Thursday, June 06, 2024 15:30—16:00

ToIP has been collaborating for the last six months with the Content Authenticity Initiative, C2PA, and the Creator Assertions Community Group, which Eric Scouten of Adobe heads. This presentation should be interesting to those concerned about the authenticity of content.

Content Authenticity Overview and eIDAS Investigation:
How Can eIDAS Support Content Creators?

Wednesday, June 05, 2024 14:30—14:45

And it wouldn’t be right if we didn’t give a shout out to our friends at the Decentralized Identity Foundation who are presenting a pre-event workshop: