Last Update 6:21 PM July 26, 2024 (UTC)

Organizations | Identosphere Blogcatcher

Brought to you by Identity Woman and Infominer.
Support this collaboration on Patreon!!

Friday, 26. July 2024

DIF Blog

Guest Blog: Daniel Buchner


More than 9 in 10 submissions to the inaugural DIF Hackathon last October used Web5 and DWNs. There’s obviously a lot of interest in this! Did that surprise you? 

In a way, no. There’s only so much you can do with Decentralized Identifiers and VCs on their own. The majority of use cases today are these stodgy things like government credentials or proving your humanness. 

The IAM (Identity and Access Management) industry has a very dry view of identity. A human wouldn’t think “my login credentials are my identity”. A person thinks of their identity as more a reflection of a certain part or all of who they are. 

For too long both the IAM and the Decentralized Identity communities were focused on boring stuff like government credentials. I don’t love it when I need to show my driver’s license. We’ve centered the ecosystem around things that are done begrudgingly; they’re not fun for users. I think that’s maybe why the long tail of developers has not embraced DI yet. 

What we’re trying for with Web5 is, go build the next Eventbrite or social network on this. It’s things that are more exciting for developers AND users. I think that’s what’s driving the interest in D-Web Nodes. 

What was the journey that led you to be involved in working on the DWN specification? 

My interest in this area grew while I was working at Mozilla. After researching Microsoft’s position in identity, I saw an opportunity to pursue what I was really interested in. I didn’t want to do another stint on browser stuff, so I was going to give it a year at Microsoft – coming in under the auspices of the browser team – to see if I could work on what I really wanted to work on. 

After about six months Microsoft agreed: yes, we should pursue some means of user-owned identity. None of the social OAuth providers are owned by Microsoft, and if Microsoft could disintermediate the federated identity providers, that would be good for them. So they said, “let's take a shot”. 

Solving identifiers in a decentralized way was necessary, in order to get onto more interesting problems. How would you create decentralized storage for apps, for instance, if you didn’t know how to store the data in reference to an identifier controlled by the data’s owner? You have to have the identifier before you can have the subsequent pieces. 

So I had the opportunity to work on this stuff at Microsoft, to help get it pretty far along. But then there was this great opportunity to go work for Block, building out the piece that I’d actually wanted to work on back at Mozilla.

That’s where I think most of identity actually is. It's in people’s data. The content of your Tweets tells people more about you than your handle. Who cares how you’re logged in? What defines you is all the stuff you do after you log in. So the data store piece is huge and we’ve been contributing to that work at Block. 

You helped establish DIF in 2016. Did you envisage back then the timescale for Decentralized Identity to get to where it is today? 

I naively thought “we’ll just work with some browser vendors and be done with identity in a few years, then get onto the fun stuff”. It didn’t happen like that! I got a lot of Nos, a lot of people said “nothing needs to be decentralized”. But I was intentional about what I was doing, and I don’t like giving up. I wanted it to work so badly, and still do. 

Then we got stuck in a rut with the IAM crowd, where they were using DIDs that were just keys. They thought, I don’t need to find content. It really neuters what you can do. I don’t think the world moves forward just because we have a couple of proofs we can pull out of our digital wallet instead of our physical wallet. 

It might not be anything I’ve worked on that ends up being The Thing. That’s not important. What’s important is that Decentralized Identity comes to be, in some form. I do think it’s an idea whose time has come. I see all the signs. Like social media cancellations, deprivations of service, spam bots everywhere. It’s not stopping, it’s only accelerating. People are becoming concerned and they need something to turn to. 

Who knows if DWNs end up being The Thing? I think it’s a good candidate, a first-pass approach, and I hope people like it. As long as options are there, someone’s going to pick up these tools and make them well-known. There’s going to be that breakout moment. I’ll just keep working on it until either we get there or I can’t work anymore.

How long will it take for that breakout moment to arrive? Are there still big obstacles to adoption, in your view?

Getting DID resolution and DID-relative resources as standard functions of the browser, like DNS resolution is today, would be the single greatest thing for this entire ecosystem. Support for non-DNS origins has been a real sticking point historically. We need to get browser vendors to recognise other sorts of URIs aligned to the same-origin security model. The IPFS crowd has got this started. IPFS addresses have their own origin. It’s slowly happening. 

DIDs being seen as first-class origins is THE thing. Once you convince them that’s ok, resolution of content follows. I don’t think the tech is mature enough yet, but I do think we have the beat on some candidate DID methods that browser vendors would be more inclined to adopt, that still have a full range of features. 
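To make “resolution of content follows” concrete, here is a minimal TypeScript sketch of what a browser (or, in the meantime, an extension) could do: resolve a DID through a public Universal Resolver instance and pick a service endpoint out of the returned DID document. The resolver URL and the “DecentralizedWebNode” service type are illustrative assumptions, not something prescribed in this interview.

// Minimal sketch: resolve a DID and locate a service endpoint, roughly what
// a DID-aware browser would do before fetching DID-relative content.
type Service = { id: string; type: string; serviceEndpoint: string | string[] | object };
type DidDocument = { id: string; service?: Service[] };

async function resolveDid(did: string): Promise<DidDocument> {
  // Public Universal Resolver instance, used here for illustration only.
  const res = await fetch(`https://dev.uniresolver.io/1.0/identifiers/${encodeURIComponent(did)}`);
  if (!res.ok) throw new Error(`DID resolution failed: ${res.status}`);
  const body = await res.json();
  // The resolver wraps the document in a resolution-result envelope.
  return body.didDocument ?? body;
}

async function findDwnEndpoint(did: string): Promise<string | undefined> {
  const doc = await resolveDid(did);
  const svc = doc.service?.find((s) => s.type === "DecentralizedWebNode");
  return typeof svc?.serviceEndpoint === "string" ? svc.serviceEndpoint : undefined;
}

// Example: findDwnEndpoint("did:web:example.com").then(console.log);

Once a browser can do this natively and treat the DID itself as the origin, the rendered content can come from the owner’s data store rather than from a DNS-named server.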

On the data stores side, it’s great to have Tim Berners-Lee involved with Solid. I might not love every single technical decision of that project, but it’s a great thing that he is spearheading it. 

I think we’re probably still five years from a breakout. We need to convince browser vendors that this is worthwhile. Until everyone has a browser that can resolve DIDs and show something visual, you’re looking at one-off use cases. It’s such a simple but important piece. 

It sounds as though DWNs can help bridge to a world where browsers are able to resolve DIDs and find DID-linked resources? 

Absolutely! We’ll be presenting some demos soon that show browsers finding DID-related data, even without an extension. Being able to click DID-relative links on the internet and have the data be visually rendered from someone’s personal data store is awesome! It’s by far the most interesting piece, and the number one thing that will make it real for people. 

Where do Big Tech, social media and the Web3 community stand on Decentralized Identity? 

When I started at Microsoft it was years before the spam problem really blew up, with all these crypto bots on Twitter and so on. They didn’t believe it was going to be a problem. Then AI came along. Now, bots are better at being human than some humans. I suspect these services are being overrun, and if they don’t figure it out soon, there’s going to be a dramatic loss of quality on their platforms and it will start to hit revenues. Recruiters don’t want to see a bunch of fraudulent profiles, so it makes sense for LinkedIn to make them verifiable. There’s probably a premium you can put on that, on certain services. 

Decentralization isn’t something that’s unheard of in the social sphere. People understand it. Twelve years ago it was this kooky conspiracy theory that you’d be concerned about who controls your Twitter account. Ironically, centralized services and authoritarian regimes have been the best spokespeople for decentralization. By carrying out these bans, by suppressing people’s ideas and opinions, they have sold the world on the fact that they should not be in control of our data. Hats off to them! 

If you’re talking about global social networks, it’s a harder economic question. One of the common arguments I hear against decentralization is, “how is it going to be paid for?” My response is “you'll pay for it, if you care about it”. People pay for Google Drive and all sorts of other services with a storage component. It’s not weird. People will pay to protect what’s theirs. From there on out, you can work on models for services with an aggregation component.

Re Web3, there’s a blockchain crowd that wants to insert tokenization everywhere. Some do it because the way they make money is if there’s a token in the middle.  Others don’t know there might be a better way that’s faster and cheaper and doesn’t require tokens. There are almost zero drawbacks to using a personal data store, versus putting things on a blockchain. 

Where do DWNs fit into the larger SSI tech ecosystem? 

I think with all three components of DIDs, VCs and personal data stores together in one platform, we’ll be able to see people really for the first time create apps that are respectful of users, that put you more in control. It’s not about decentralization for its own sake. There are real benefits that make it different from what’s available today. For example, imagine a social media platform that lets you choose your own algorithm. 

That’s the journey we’re on. It’s not one that’s unique to me, there are lots of other people working on this, including lots of data store projects in the DID & VC space. 

Final thoughts? 

From the TBD side, there’s more coming that we’ve long been working on, all open source, and it finally gives more to users, not just developers. That’s what I’m looking forward to. I hope we get to have a much different conversation towards the end of 2024, having seen these things roll out and the standards stabilize, and starting to have the discussions with the larger entities that are needed to take this mainstream. 

Origin Trail

Championing European Gymnastics with Borderless Knowledge enabled by Artificial Intelligence and…

Championing European Gymnastics with Borderless Knowledge enabled by Artificial Intelligence and OriginTrail

European Gymnastics is a sports organisation counting 50 national member federations, reaching beyond the borders of political Europe. It nevertheless bears the idea of a united gymnastics nation. As guarantor of the interests of its around 8,500,000 gymnasts, European Gymnastics represents many different facets: from high-level competitive sports in four Olympic and three non-Olympic disciplines to leisure sports in Gymnastics for All, with offers for all age groups, from toddlers to senior citizens. European Gymnastics transmits its understanding of being together beyond borders and sets an example in community.

Now, European Gymnastics is launching its own Artificial Intelligence (AI) assistant powered by OriginTrail to drive borderless knowledge, furthering its mission to promote, develop and support synergy among the community to make gymnastics, and gymnasts at all levels, shine. The friendly mascot Luigi, whom you can meet at all major European Gymnastics events, is now receiving a digital twin. Powered by AI, digital Luigi allows anyone to learn about and keep in touch with the European Gymnastics community. From finding information about the next competition to learning about important events in European Gymnastics history or understanding which elements are important for scoring a routine on parallel bars — all can be discovered with the help of the AI-powered Luigi.

What makes our digital Luigi unique is that his responses always include sources of information, allowing the user to explore any particular source further. This capability is unlocked by OriginTrail’s Decentralized Knowledge Graph, which promises an even more powerful Luigi assistant over time: the initial knowledge base can continuously expand, not only with European Gymnastics’s inputs but also with contributions from the national federations, gymnasts and fans. As OriginTrail is based on blockchain, all such contributions will also be protected against tampering — extending European Gymnastics’s commitment to integrity from the sport halls to managing data.

Today’s launch of Luigi coincides with the start of the biggest sporting event in the world — the Olympic Games. To help you navigate all the performances by European gymnasts in Paris, Luigi is already equipped with knowledge about the schedule and will also receive daily updates about results.

“European Gymnastics is excited to keep pushing innovation in our sport. After being the first continental gymnastics federation to launch a digital cup competition this year, we are now making first steps into adopting Artificial Intelligence and blockchain to improve the ease of interaction with what is sometimes considered a complex world of Gymnastics. This is an important step in our newly adopted Strategy 2030, embracing top technology which has a lot to offer,” said Dr. Farid Gayibov, European Gymnastics President.

You can find Luigi’s digital twin on the European Gymnastics website.



ResofWorld

A buy-now-pay-later company is behind the explosion of EVs in Kenya

In just two years, M-Kopa has become the biggest financier for EVs in East Africa’s largest market.
Kenya is among the fastest-growing electric vehicle markets in Africa. Registrations of battery-operated cars and bikes in the country increased by more than five times in the past year. While...

Thursday, 25. July 2024

Ceramic Network

Unlocking Privacy: A Step-by-Step Guide to Ceramic's Private Data Proof-of-Concept

Learn how the Ceramic team is rolling out a plan to support private data in the network starting with a minimal proof-of-concept.

About a month ago one of our founders, Joel Thorstensson, released an initial overview of our plans to begin ideating how physical private data capabilities could be natively offered to developers using the Ceramic Network. The forum post details how this would be rolled out in two phases in the form of a proof-of-concept, and is worth reading before this article.

It's also important to mention the motivation behind this effort. To start, Ceramic doesn't currently offer any native privacy features, meaning all data on Ceramic is public by default. At the same time, over the past few years we've recognized a strong need for access control across several applications and use cases. A rough estimate is that almost half of all apps built on Ceramic have access control needs in one form or another.

In thinking about a solution, we aligned on the premise that physical access control (where data lives and who can sync it, as opposed to encryption) resonates most directly with the uniqueness of Ceramic's event-based architecture and our desire to align with edge privacy principles.

As such, our next step was to define a scope around a minimalist build to showcase how physical data privacy could be implemented in Ceramic.

Phase 1: API Read Access Control

If you've read Joel's forum post (linked above), you already know the details of the concept we've designed for this first phase. However, below are several key takeaways:

Builds on Ceramic-one (an implementation of the Ceramic protocol, written in Rust)

Leverages the Feed API on Ceramic-one nodes, allowing nodes to filter the feed of events based on which permissions a user has

Designed to showcase how two users could share private data from the same node by leveraging object-capabilities (OCAPs)

Shows how an object-capability generated by user 1 references a stream containing the data they want to share (as well as the DID corresponding to user 2, the person they want to share their data with) and allows user 2 to access data that would otherwise be physically inaccessible to query and obtain without the OCAP (a rough sketch of such a capability follows below)
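To make that last takeaway concrete, here is a rough TypeScript sketch of the shape such a capability could take. The field names are hypothetical and for illustration only; the PoC’s actual capability encoding lives in the repositories introduced below.

// Hypothetical shape of an object-capability delegating read access to one
// stream. Field names are illustrative; see the PoC repos for the real format.
interface ReadCapability {
  issuer: string;     // DID of user 1, the data owner
  audience: string;   // DID of user 2, the delegate
  resource: string;   // Stream ID the capability grants read access to
  action: "read";     // the permitted operation
  signature: string;  // issuer's signature; any edit to the OCAP invalidates it
}

const capability: ReadCapability = {
  issuer: "did:pkh:eip155:1:0xOwnerAddress",      // placeholder
  audience: "did:pkh:eip155:1:0xDelegateAddress", // placeholder
  resource: "kjzl6...streamid",                   // placeholder Stream ID
  action: "read",
  signature: "0x...",                             // placeholder
};

Because the node checks the signature over the whole object, tampering with any field (as we do at the end of this walkthrough) renders the capability useless.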

This article will walk through how to run the PoC locally.

If you prefer video, please view the YouTube version of our Private Data Playground:

Setting Up Your Environment

This walk-through requires you to clone two repositories: the PoC itself and the Rust-Ceramic codebase.

Rust-Ceramic Set-Up

First, clone the Rust-Ceramic codebase:

# we will need a special branch from the repo
git clone https://github.com/ceramicnetwork/rust-ceramic && cd rust-ceramic && git fetch

We need to set up our Rust-Ceramic node from a specific branch. Enter the branch relevant to this PoC, build, and run the daemon:

# enter the special branch
git checkout feat/private-data

# build and run
cargo run -p ceramic-one -- daemon

If your terminal starts populating with startup logs, you've successfully started up your node!

You now have an active Ceramic node running in the background! Next, we'll walk through setup for the private data playground web app.

Private Data Playground Web App Setup

First, clone the Private Data Playground repository:

git clone https://github.com/ceramicstudio/private-data-playground

Go ahead and open the private-data-playground repo in your text editor of choice. Once open, we will need to create a copy of the example environment file and rename it:

cp .env.example .env

Our first step is to supply a value for the NEXT_PUBLIC_PROJECT_ID variable by setting up a Project ID with WalletConnect. You can set one up for free (if you don't already have one) by following the simple steps in our WalletConnect tutorial (under "Obtain a WalletConnect Project ID"). We need this because our application's Wagmi hooks rely on a contextual wrapper that allows us to leverage these hooks within all child components, as well as use Web3Modal.

Once obtained, paste this into your new environment file next to the variable name referenced above.
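When you're done, the relevant line of your .env file should look something like this (the value is a placeholder for your own Project ID):

NEXT_PUBLIC_PROJECT_ID=your_walletconnect_project_id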

Next, install your dependencies:

npm install

We're now ready to run the PoC!

Running the Application

Start up your application from within the private-data-playground repository in developer mode to initiate the UI:

npm run dev

You should now be able to access the UI by navigating to http://localhost:3000 in your browser!

Creating a Stream and a Capability

Our first section will focus on generating a stream (containing a simple message) and an OCAP.

To begin, self-authenticate by clicking "Connect Wallet."

You should see a secondary signature request appear after selecting an account. Approving this request will create a browser session (specific to your DID, stemming from your Eth address) that the application will use to sign and submit data on your behalf as you create messages.
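As an aside, the session DID here is conventionally derived deterministically from your wallet account via the did:pkh method; a minimal sketch, with a placeholder address:

// did:pkh encodes a blockchain account as a DID (CAIP-10 style).
// "eip155:1" denotes Ethereum mainnet; the address below is a placeholder.
const address = "0x1234abcd5678ef901234abcd5678ef901234abcd";
const did = `did:pkh:eip155:1:${address}`;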

There are two views contained in this simple PoC - one for writing data, and one for reading. Make sure you're in the "Write" view by clicking the toggle under your address:

Enter a simple message of your choosing - for example, "I love Ceramic!" would be an obvious choice - and click the "Create" button. This action initiates a process that builds a new Ceramic stream and constructs your message into a payload that the Rust-Ceramic feed API will accept.

You should now see the resulting identifier under "Stream ID":

Go ahead and copy this value. Save it somewhere as it's needed later (a simple text document will suffice).

Finally, select another Eth address you control (make sure to remember which one) and enter it into the text input under "Delegate read access to". When ready, click "Create Capability". If you've followed all the steps correctly, your screen should look something like this:

Make sure to copy the capability value somewhere you'll be able to reference for the next section.

Congrats! You've successfully created both a Ceramic stream and a capability object! The next section will show how to use these to access otherwise private data.

Using the OCAP to Access Private Data

Go ahead and disconnect your current authenticated account from the web app. Next, go through the sign-in flow using the address you selected for the "Delegate read access to" input from the prior section.

Once authenticated, navigate to the "Read" toggle in the web app:

Enter the Stream ID and the Capability generated and saved from the prior section.

If you've copied over the values correctly, you should now be able to view the original message:

Congratulations - you've successfully used a capability to access otherwise private data on Ceramic.

You can also run through the "Read" process again, but this time make an arbitrary edit to the OCAP (thus invalidating it). With the Stream ID value kept the same, you'll notice that you can no longer access the resulting message.

Next Steps

This minimal PoC is only the beginning of our plans for rolling out private data on Ceramic, with phase 2 coming soon (showcasing data privacy in the form of nodes and their ability to sync data between each other based on signed capabilities).

Is private data relevant to what you're building? Have feedback, questions, or concerns about our current thinking around private data? We'd love to hear from you! Fill out our community contact form, or email us at partners@3box.io.

Happy buidling!


ResofWorld

Why Mexico’s delivery workers are ditching food for packages

As the popularity of Chinese e-commerce platforms grows, some gig workers are finding it more appealing to deliver packages than food orders.
Israel Valencia had spent half a year as a food delivery worker for Rappi when a colleague mentioned an e-commerce package distribution gig. His interest was piqued. “You waste a...

Wednesday, 24. July 2024

OpenID

Public Review Period for Proposed Final OpenID Connect for Identity Assurance Specifications

The OpenID Foundation’s eKYC & IDA Working Group recommends approval of the following specifications as OpenID Final Specifications:

OpenID Connect for Identity Assurance 1.0 (other formats: ZIP, XML, MD)

OpenID Connect for Identity Assurance Claims Registration 1.0 (other formats: ZIP, XML, MD)

OpenID Identity Assurance schema definition 1.0 (other formats: ZIP, XML, MD)

A Final Specification provides intellectual property protections to implementers of the specification and is not subject to further revision. This note starts the 60-day public review period for the specification drafts in accordance with the OpenID Foundation IPR policies and procedures. Unless issues are identified during the review that the working group believes must be addressed by revising the drafts, this review period will be followed by a seven-day voting period during which OpenID Foundation members will vote on whether to approve these drafts as OpenID Final Specifications. For the convenience of members, voting will actually begin a week before the start of the official voting period for members who have completed their reviews by then.

The relevant dates are:

Final Specification public review period: Wednesday, July 24, 2024 to Sunday, September 22, 2024 (60 days)

Final Specification vote announcement: Monday, September 9, 2024

Final Specification early voting opens: Monday, September 16, 2024

Final Specification voting period: Monday, September 23, 2024 to Monday, September 30, 2024 (7 days)

Note: Early voting before the start of the formal voting will be allowed.

The eKYC & IDA work group page is https://openid.net/wg/ekyc-ida/. Information on joining the OpenID Foundation can be found at https://openid.net/foundation/members/registration. If you’re not a current OpenID Foundation member, please consider joining to participate in the approval vote.

You can send feedback on the specifications in a way that enables the working group to act upon it by (1) signing the OIDF Contribution Agreement at https://openid.net/intellectual-property/ to join the work group, (2) joining the work group mailing list at openid-specs-ekyc-ida@lists.openid.net, and (3) sending your feedback to the list.

Marie Jordan – OpenID Foundation Board Secretary

The post Public Review Period for Proposed Final OpenID Connect for Identity Assurance Specifications first appeared on OpenID Foundation.


Fourth Implementer’s Draft of OpenID Federation Approved


The OpenID Foundation membership has approved the following specification as an OpenID Implementer’s Draft:

OpenID Federation 1.0

This is the fourth Implementer’s Draft of this specification.

An Implementer’s Draft is a stable version of a specification providing intellectual property protections to implementers of the specification. This specification is a product of the OpenID Connect Working group.

The Implementer’s Draft is available at:

https://openid.net/specs/openid-federation-1_0-ID4.html

The voting results were:

Approve – 77 votes
Object – 0 votes
Abstain – 19 votes

Total votes: 96 (out of 364 members = 26% > 20% quorum requirement)

Marie Jordan – OpenID Foundation Secretary

The post Fourth Implementer’s Draft of OpenID Federation Approved first appeared on OpenID Foundation.


Guidance to the CFPB regarding US Open Banking

Authors: Gail Hodges, Joseph Heenan, Dima Postnikov, Mark Haine, Mike Leszcz, Elizabeth Garber 

Following our May 16 open letter to the Consumer Financial Protection Bureau, the OpenID Foundation has been engaged in discussions about their rule-making on Personal Financial Data Rights. This post summarizes our guidance to the CFPB.

Why are we engaged?

The OpenID Foundation is committed to supporting Open Banking ecosystems worldwide – all ecosystems that rely on identity data. In particular, we develop and continuously iterate upon world-class identity standards that improve security and underpin interoperability. This creates conditions for competitive marketplaces that protect consumers. 

By offering this guidance to the CFPB, we are answering a call from those seeking to enhance the security of US digital infrastructure. In its March 2024 report, the United States Cyber Safety Review Board (CSRB) called on us to continue iterating on our standards to ensure they are fit for purpose in use cases requiring heightened security – and they called on Cloud Service Providers (CSPs) to adopt those standards. We believe that the US Open Banking ecosystem, to best protect consumer data, should follow suit. 

FAPI as a Secure Communications Protocol

We recommend that the CFPB, in its rule-making, ensures the use of a secure communications protocol for the exchange of identity data. We also propose that the widely adopted FAPI family of specifications performs this role.

The FAPI profile enhances the OAuth 2.0 framework for high-security use cases. It is based on an advanced attacker model and closes critical security gaps that OAuth 2.0 does not address. We provided the CFPB with the example of Client Authentication:

The slide we shared also showed how each jurisdiction may develop its own local profile for any final configuration choices.
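The slide itself isn’t reproduced here, but to make the client-authentication point concrete: FAPI profiles replace shared client secrets with asymmetric methods such as private_key_jwt or mutual TLS. Below is a minimal, illustrative TypeScript sketch of building a private_key_jwt client assertion with the jose library; the client ID, token endpoint, and key are placeholders, and this is a sketch of the general OAuth mechanism rather than anything specific to our CFPB submission.

import { SignJWT, importPKCS8 } from "jose";
import { randomUUID } from "crypto";

// Build a private_key_jwt client assertion (RFC 7523), one of the
// asymmetric client authentication methods permitted by the FAPI profiles.
async function buildClientAssertion(clientId: string, tokenEndpoint: string, pkcs8Pem: string): Promise<string> {
  const key = await importPKCS8(pkcs8Pem, "PS256");
  return await new SignJWT({})
    .setProtectedHeader({ alg: "PS256" })
    .setIssuer(clientId)        // iss: the client itself
    .setSubject(clientId)       // sub: the client itself
    .setAudience(tokenEndpoint) // aud: the authorization server
    .setJti(randomUUID())       // unique ID so the assertion cannot be replayed
    .setIssuedAt()
    .setExpirationTime("5m")
    .sign(key);
}

The resulting JWT is sent to the token endpoint as the client_assertion parameter, with client_assertion_type set to urn:ietf:params:oauth:client-assertion-type:jwt-bearer; unlike a static client secret, it is asymmetric, short-lived, and single-use.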

Current global ecosystem adoption of FAPI spans ecosystems that have selected, mandated, or deployed the profile, including:

United Kingdom – Open Banking

Australian Treasury & Data Standards Body

Australian ConnectID

Brazilian Open Finance

Saudi Arabian Monetary Authority

United Arab Emirates Government (2024 launch)

Chilean Ministry of Finance

Colombian Government (expected 2024)

Norwegian HelseID (Health)

German Verimi

Canadian Open Banking (expected)

US FDX (recommended)

The Benefits of Interoperability

By reducing the optionality inherent in the OAuth 2.0 framework, FAPI also promotes interoperability within and across ecosystems. We shared the example of one startup that sought to integrate with banks and open banking partners in the US and around the world. They encountered a wide variety of:

Cryptographic Methods, including less secure signing methods (covered by FAPI)

Client Authentication, including less secure authentication methods (covered by FAPI)

Data formats and payloads, each of which required interpretation

Approaches to data minimization, including many cases of receiving more data than requested

Security Culture & Practices, enabling the selection of less secure options (somewhat addressed by selecting FAPI)

This wide variety prevents interoperability and places heavy burdens on fintechs and new market entrants. Interoperability, on the other hand, ensures:

A level playing field for new fintech entrants

Less reliance on aggregators

Opportunities for banks and fintechs to work with partners across borders (see “Open Banking and Open Data: Ready to Cross Borders?” and our contributions to the “Global Assured Identity Network” and “Sustainable Interoperable Digital Identity” movements)

Our original Open Letter provides more information about FAPI and its role in underpinning security and interoperability.

Other Relevant Standards: Federation and Shared Signals

While the conversation with the CFPB began as a strong recommendation to name a secure communications protocol, we would be remiss if we did not also refer to other OpenID Standards designed to improve the security and viability of open data ecosystems. In particular:

OpenID Federation is designed to quickly establish trust between parties who have been onboarded to an ecosystem. This is how banks can ensure that data requests are coming from legitimate actors – and how legitimate actors can quickly gain access to an open banking ecosystem.

Shared Signals and Events is an open API built upon a protocol suite that enables applications and service providers to communicate about security events to make dynamic access and authorization decisions. It acts as a signaling layer on a back channel that helps secure near real-time sessions. We wrote about its benefits in a read-out from a recent interoperability event here; an illustrative event payload follows below.
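For a flavor of what travels over that back channel, here is an illustrative decoded Security Event Token (SET, RFC 8417) payload of the kind a Shared Signals transmitter might emit. The issuer, audience, subject, and event type shown are placeholders chosen for illustration, not taken from the guidance above.

// Illustrative decoded SET payload: a CAEP "session revoked" event.
// All identifiers below are placeholders.
const setPayload = {
  iss: "https://transmitter.example.com",
  jti: "756E69717565206964656E746966696572",
  iat: 1721779200,
  aud: "https://receiver.example.com",
  events: {
    "https://schemas.openid.net/secevent/caep/event-type/session-revoked": {
      subject: { format: "iss_sub", iss: "https://idp.example.com", sub: "user-1234" },
      event_timestamp: 1721779200,
    },
  },
};

In practice the payload is signed as a JWT and delivered over the Shared Signals push or poll endpoints, letting the receiver revoke its own session in near real time.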

What’s Next?

The OpenID Foundation is engaged in ongoing discussions with the CFPB and is exploring the requirements for approved standard-setting bodies. Those interested in promoting a secure and thriving Open Banking ecosystem in the United States and around the world should stay tuned!

About the OpenID Foundation

The OpenID Foundation (OIDF) is a global open standards body committed to helping people assert their identity wherever they choose. Founded in 2007, we are a community of technical experts leading the creation of open identity standards that are secure, interoperable, and privacy-preserving. The Foundation’s OpenID Connect standard is now used by billions of people across millions of applications. In the last five years, the Financial Grade API has become the standard of choice for Open Banking and Open Data implementations, allowing people to access and share data across entities. Today, the OpenID Foundation’s standards are the connective tissue to enable people to assert their identity and access their data at scale, the scale of the internet, enabling “networks of networks” to interoperate globally. Individuals, companies, governments, and non-profits are encouraged to join or participate. Find out more at openid.net.

The post Guidance to the CFPB regarding US Open Banking first appeared on OpenID Foundation.


Me2B Alliance

Identity Resolution and Customer Data Platforms found in 2022 EdTech Benchmark Network Traffic


This file provides the list of all the apps in the ISL 2022 EdTech safety benchmark that were found to be sending data to one or more identity resolution or customer data platform companies.

ISL provides this data as an informational tool reflecting research at this point in time. Please contact us at contact@internetsafetylabs.org if you have questions or corrections.

This work is licensed under the Creative Commons Attribution-NonCommercial 4.0 International License.

Identity Resolution and Customer Data Platforms found in 2022 EdTech Benchmark Network Traffic

 

The post Identity Resolution and Customer Data Platforms found in 2022 EdTech Benchmark Network Traffic appeared first on Internet Safety Labs.


Identity Resolution and Customer Data Platform Companies


This file provides the list of all known companies that provide identity resolution or customer data platforms (or both), worldwide.

ISL provides this data as an informational tool reflecting research at this point in time. Please contact us at contact@internetsafetylabs.org if you have questions or corrections.

This work is licensed under the Creative Commons Attribution-NonCommercial 4.0 International License.

Identity Resolution and Customer Data Platform Companies

The post Identity Resolution and Customer Data Platform Companies appeared first on Internet Safety Labs.


The Worldwide Web of Human Surveillance: Identity Resolution & Customer Data Platforms 


Today, we are excited to announce our latest research exposing the massively networked personal information sharing happening between and across identity resolution and customer data platforms, which has been hiding in plain sight for over 10 years. These industries are the plumbing backbone for synthesizing personal data from hundreds of data sources — across services and devices, spanning the digital world and the physical world.  

In February 2024, Cracked Labs published “Pervasive identity surveillance for marketing purposes”, an in-depth analysis of LiveRamp’s RampID identity graph. One of the most superficial yet most powerful functions of this excellent report was to guide attention towards industries responsible for pervasive consumer surveillance. The timing was excellent as I’d already committed to present “The Hidden Identity Infrastructure” at Identiverse (May 2024) and prompted by the report, I dug in to better understand the two industries underpinning hidden identity infrastructure, namely, Identity Resolution (ID Res) and Customer Data Platforms (CDPs).  

There is nearly $9T worth of industries worldwide that rely on persistent, hidden identification of people. Naturally, demand of this magnitude fueled the now-mature industries that perform pervasive, universal identification of people and their personal information. ISL identified over 350 companies providing identity resolution platforms, customer data platforms, or both.  

This paper explores the magnitude and reach of these two industries, how they came to be, and most importantly, why, from a human well-being perspective, it’s crucial that these kinds of platforms be held to higher regulatory standards of scrutiny, transparency, and accountability. One identity resolution company alone, out of 93 such companies worldwide, boasted the collection of 5,000 data elements for [each of] 700 million consumers in 2021. To put this in perspective, the number of user accounts breached worldwide in 2023 was about 300 million. Is there an appreciable difference between stolen user data and undisclosed “legitimate” personally identifiable information sharing? Moreover, nearly 40% of the 93 companies that provide identity resolution platforms are registered data brokers.   

Indeed, after reviewing the research, we must ask ourselves: is this the kind of world we want to live in? A world where everything about us is always known by industry; a world where the ongoing surveillance of people is deemed necessary in the name of capitalism? Is this the kind of world in which humans and societies will flourish, or self-destruct? Are humans more than capitalistic consumers? Are we more than our purchasing potential?  

A Call to Action 

ISL conducted this research to help illuminate the sizable risk of hidden identification and the worldwide web of user surveillance. ISL believes naming and exposure are crucial to effecting change. Identity resolution and customer data platforms have been hiding in plain sight for more than a decade, and yet even the “identerati” are largely unfamiliar with these industries. How can we expect everyday people to know?   

This paper is a rallying call for privacy advocates to come together to demand greater regulatory scrutiny, transparency and oversight for these industries, in conjunction with more meaningful data broker regulation.  

Additionally, this is a rallying call to acknowledge the catastrophic failure of notice and consent as a valid permissioning mechanism for highly complex and interconnected digital services. It’s inconceivable that people understand the magnitude of data sharing that consenting to sharing “your data with our marketing” entails.  

We must ask ourselves if this is the kind of world we want for ourselves and our children, where our preferences, practices, relationships, behaviors, and beliefs are all up for sale and broadly shared without our awareness. Are we ourselves in fact being sold?  

The technologies fueling these capabilities have received billions of dollars; consumers don’t have a chance in the face of such voracious hunger to identify, know, and manipulate them. We hope that this research shines a much-needed light on the forces enabling the worldwide web of human surveillance so that they may be held accountable for their troves of data on nearly all internet users. 

PS: Also check out our latest podcast with guest Zach Edwards, where we discuss this worldwide web of human surveillance live.

Open Report PDF

Identity Resolution and Customer Data Platform Companies

Identity Resolution and Customer Data Platforms found in 2022 EdTech Benchmark Network Traffic

The post The Worldwide Web of Human Surveillance: Identity Resolution & Customer Data Platforms  appeared first on Internet Safety Labs.


ResofWorld

The AI job interviewer will see you now

AI interview services say they’re eliminating bias — but not everyone agrees.
When Floria Tan applied for an internship at China’s food delivery giant Meituan, her first video interview was not with a human being. The interviewer certainly looked real enough. It...

Next Level Supply Chain Podcast with GS1

Digital Twins & Their Supply Chain Wins with Elyse Tosi


In the supply chain, technical requirements are the cornerstone for creating scalable and interoperable systems that ensure a seamless flow of information and enhance the accountability and traceability of materials and products throughout their lifecycle.

 

Liz and Reid got to talk about this with Elyse Tosi, the Vice President of Accounts and Implementation at EON, an innovator in product digitization. Elyse shares her extensive knowledge and experience in supply chain management, touching on her work with brands like Victoria's Secret and Eileen Fisher, to discuss the transformative impact of technology and standards on global supply chains.

 

They discuss enhancing value chain efficiency through interoperability, the significance of the EPCIS standard in scaling and achieving interoperability, and how EON, chosen by the EU to pilot digital product passports, is influencing legislation and standards adoption—an initiative critical for compliance, brand protection, and product authentication. They also explore emerging trends like digital twins, QR codes, digital links, and their game-changing potential for retail and customer engagement.

 

In this episode, you’ll learn:

How EPCIS standards ensure interoperability and scalability for digital product passports, enabling seamless data exchange and lifecycle management in supply chains.

The transformative impact of digital twins, QR codes, and digital links on retail experiences, customer engagement, and product data connectivity, driving new commerce channels and incremental revenue opportunities.

How EON leverages compliance with EU legislation to provide commercial benefits such as brand protection and product authentication, reinforcing the importance of scalable and cost-effective blockchain applications.

 

Connect with GS1 US:

Our website - www.gs1us.org

GS1 US on LinkedIn

 

Connect with the guests:

Elyse Tosi on LinkedIn

More about EON - https://www.eon.xyz/

 


DIF Blog

Extrimian HackAlong


We’re excited to announce the first ever DIF HackAlong, co-produced with DIF Associate member Extrimian!

These HackAlongs are dynamic educational sessions designed to equip both developers and business leaders with essential knowledge about decentralized identity.

Join us for this first-of-its-kind, beginner-friendly Spanish-language event. The series will consist of five one-hour sessions, starting on Thursday, August 8, and running through September 12.

Want to participate? Visit the Extrimian/DIF event page to learn more and secure your spot today.

From the organizers: "We will analyze a use case based on the Travel & Hospitality industry, and show you how the QuarkID protocol works. If you’re a fan of technology, a web3 enthusiast, developer, a decision-maker, or an IT student, don’t miss out on talks about decentralized, innovative technologies focusing on data security and privacy. Plus, create and develop your project using SSI. 

"By participating, you could be featured in the DIF’s project gallery and win many other prizes for participants! If you’re in Buenos Aires, Argentina, you can join us in person at the Innovation Park, where our team will be broadcasting the workshops!"


Elastos Foundation

Native Bitcoin Staking Becomes a Reality with a New Revenue Model for Arbiter Nodes

Elastos unveils BeL2 SDKs for partners to develop Native Bitcoin Apps, plus the StarBTC demo DeFi app to extract value from dormant Bitcoin

StarBTC, the first prototype for arbiter nodes and smart contracts, enables collateralization of up to 80% of Bitcoin assets

Elastos BeL2 offers significant premiums for node owners staking Bitcoin (BTC) on the Bitcoin Network on top of existing APR rewards

Singapore: July 25, 2024 – Elastos, the SmartWeb ecosystem provider, today announced innovations that enable native Bitcoin staking on the Bitcoin network. Partners can use the Elastos BeL2 Software Development Kit (SDK) to build Native Bitcoin decentralized apps (dApps) that encourage staking of the estimated more than 1 trillion dollars in dormant bitcoins. The BeL2 SDK will be demonstrated at Bitcoin Nashville 2024 alongside the StarBTC demo loan app, showcasing BeL2’s native Bitcoin DeFi solutions as a decentralized clearing network service.

With BeL2, users can stake Bitcoin directly on the Bitcoin network. This allows for the transmission of transaction proofs, rather than assets, across chains, enabling users to enter Layer 2 environments and utilize BTC in various DeFi applications. The StarBTC demo app shows how staked BTC can be used as collateral for stablecoin loans, maintaining Bitcoin on its main network and leveraging BeL2’s ZKP technology and decentralized arbitration for secure transactions.

The BeL2 Protocol enables direct collateralization of Bitcoin without bridging or wrapping, preserving Bitcoin’s security and integrity while avoiding network congestion and fees. The StarBTC demo app shows how arbiter nodes facilitate communication between the Bitcoin and Ethereum Virtual Machine (EVM) chains to verify transaction proofs using BeL2’s Zero-Knowledge-Proof (ZKP) process to ensure all conditions are met for collateral release.

“The combination of our innovations around arbiter nodes and the BeL2 SDK will enable developers and node holders to earn significant premiums from staking Bitcoin,” said Jonathan Hargreaves, Global Head of Business Development & ESG, Elastos. “We estimate the Native Bitcoin DeFi market could grow by over $1 Trillion, with potential in industries from creative arts to retail.”

 

Elastos’ BeL2 lets users run zkBTC full nodes on mobile phones, creating a decentralized Bitcoin DeFi network. These nodes ensure security by allowing users to verify transactions themselves. Upgraded zkBTC nodes can become arbiters, earning fees for helping others manage loans and time-based transactions. This system supports Bitcoin staking and rewards nodes with upcoming BeL2 assets and BTC fees. To prevent collusion, anyone can challenge arbitrators by submitting proof of misconduct for punishment and compensation. Arbiter nodes are backed by BTC and Elastos SmartWeb’s ELA coin, which is secured by up to 50% of Bitcoin miners’ hashpower, including BTC.com, Antpool, F2Pool, ViaBTC, and BinancePool.

 

This model enhances the Bitcoin network by increasing the number of full nodes, essential for maintaining decentralization and security. Incentivizing node participation strengthens the network’s robustness and resilience. Full nodes validate transactions and blocks, contributing to the blockchain’s overall health. Increased participation fosters a secure and trustworthy environment for Bitcoin transactions, crucial for decentralized finance adoption.

 

The BeL2 SDK simplifies development, enhances security, and offers innovation opportunities by abstracting complex blockchain interactions and ZKP operations. This allows developers to integrate Bitcoin functionality with minimal effort. Additionally, Elastos provides a ZKP Explorer for verifying zero-knowledge proofs, enhancing transparency and trust within the BeL2 ecosystem.



“Our vision is to establish a new global financial system anchored by Bitcoin and enhance its role as a global hard currency,” said Sasha Mitchell, Head of Bitcoin Layer 2 (BeL2), Elastos. “The launch of the BeL2 Protocol, the StarBTC demo app, and the BeL2 SDK is inspiring our partners to explore DeFi opportunities. We are excited to work with Layer 2s like BEVM and B² Network to demonstrate how Native Bitcoin DeFi can open up new economic opportunities.”

 

The BeL2 SDK is a critical element of the Elastos strategy to build a vibrant ecosystem using the BeL2 Protocol. Partners are developing key technologies, integrations, and dApps, including BEVM’s peer-to-peer Bitcoin-denominated loan offering and B² Network’s Layer 2 EVM execution environment. Conflux, the only regulatory-compliant, public, and permissionless Blockchain in China, is integrating with BeL2 to facilitate Bitcoin-denominated transactions and exchanges. Tuna Chain is a Layer 2 solution ensuring interoperability between ecosystems, leveraging Elastos’ Bitcoin Oracle for real-time data on Bitcoin-assured activity. EastBlue, a Bitcoin super app, and Spending Power, a Web3 e-commerce platform, are also creating integrations with BeL2. This will connect EastBlue users to Elastos’ full portfolio of dApps and enable developers to create Native Bitcoin apps compatible with Spending Power’s offering.

 

For more information about the BeL2 SDK, visit: npmjs.com/package/@bel2labs/sdk

For more information about StarBTC, please go to: lending.bel2.org/

About Elastos

Elastos is a public blockchain project that integrates blockchain technology with a suite of reimagined platform components to produce a modern Internet infrastructure that provides intrinsic protection for privacy and digital asset ownership. The mission is to build accessible, open-source services for the world, so developers can build an internet where individuals own and control their data.
 


Digital Identity NZ

Digital Trust Framework: Launch & Future | July Newsletter


The Digital Identity Services Trust Framework (DISTF) Act took effect on the first day of the month, and included the establishment and implementation of the Regulator, the Trust Framework Authority. The launch was rather low-key, with the only discernible signal from the Department of Internal Affairs being updates to their digital government web pages to reflect this milestone. 

It was a different story in industry, however, where the occasion was covered by DINZ, nationally by RNZ, internationally by Biometric Update, and in social media posts, including my own and those from DINZ.

The quote that really stuck was this one from Victoria University’s Professor of Informatics Markus Luczak-Roesch: “There’s a huge risk of doing nothing. Which is why it’s good that we’re doing something.” He’s absolutely right. It’s been over seven years of policy work at the DIA to reach this point, which I described as ‘the end of the beginning’. While it has been a challenging journey, Aotearoa can build from here with those that want to opt in.

Next month’s Digital Trust Hui Taumata, ahead of Net Hui and The Point 2024, will kick off with a keynote by Microsoft’s global identity standards lead and past DIACC TFEC member Juliana Cafik. The panel that follows will discuss NZ’s Digital Identity Trust Framework, representing organisations that could be potential Relying Parties/Verifiers in Aotearoa under the DISTF regulation. The Trust Framework market model would see such parties seek out Digital Identity Service Provider/Issuers to deliver privacy-aware, cryptographically secured verified credentials, a topic that I blog about here. Publicly, it’s known that MSD and HNZ are piloting DIA’s platform, with RealMe as a notional issuer.

Additionally, the event will cover Digital Public Infrastructure, AI, biometrics, digital acceptance networks, digital drivers’ licences, the Metaverse, passkeys, digital cash, next generation payments, and the challenges of delegated administration across communities and much more. It’s all there, along with a panel of four experts who will review the sessions from a Te Ao Māori perspective.

In short, this year’s Digital Trust Hui Taumata will be like no other. The wait is over, and the rubber is hitting the road for the DISTF. What matters now is scale – will they come?

Lastly, I’m very excited to tell you that the DINZ podcast series is almost ready for launch so do keep an eye out for the first episode dropping very soon.

Ngā mihi
Colin Wallis
Executive Director, Digital Identity NZ

Read the full news here: Digital Trust Framework: Launch & Future | July Newsletter

SUBSCRIBE FOR MORE

The post Digital Trust Framework: Launch & Future | July Newsletter appeared first on Digital Identity New Zealand.

Tuesday, 23. July 2024

Digital Identity NZ

Will the Digital Trust Hui Taumata 2024 move the dial?


Deep thought has gone into building the agenda for next month’s Digital Trust Hui Taumata, ahead of Net Hui and The Point 2024, so that conversations live on and build out later in the year and into subsequent years. 

Significant attention will be devoted to Trust Frameworks given the Digital Identity Services Trust Framework (DISTF) regulation coming into play on 1 July. Immediately following Minister Collins’ opening remarks, Microsoft’s global identity standards lead and past DIACC TFEC member Juliana Cafik will deliver an intensely interesting first keynote – The international landscape for Digital Identity Trust Frameworks and how NZ compares. Trust frameworks already exist and we use them daily – for example using your bank card to withdraw cash from another bank’s ATM. The panel that follows, representing organisations that could be potential Relying Parties (RPs)/Verifiers under the DISTF, discuss how they see Trust Frameworks playing out. To be relieved of the burden and to minimise risk, these parties notionally look for accredited Digital Identity Service Provider/Issuers to deliver privacy-aware, cryptographically secured verified credentials.  

Two of these three panellists come from regulated industries while the other is a key government agency; in all cases the failure to verify parties correctly could have devastating consequences. Other regulated industries and government agencies that need similar verification processes include estate agents, rental companies, law firms, financial services, insurance companies, pharmacies, doctors’ surgeries, the Police Vetting Service, driver licensing, firearms licensing, the box store where you take out a loan for your new appliance, registering for a loyalty scheme – and the list goes on. Representatives from the Regulator, the DISTF Trust Framework Authority, will lead a Roundtable discussion after lunch where delegates can pose their questions.   

There are multiple paths to achieve this nirvana of privacy-aware, cryptographically secured verified credentials available to all people under the auspices of a Trust Framework, which is why, straight after the Trust Frameworks panel, Worldline’s Conrad Morgan will keynote a complementary path – ‘Turning transactions into interactions – building New Zealand’s first digital identity acceptance network’. 

Trust Frameworks are increasingly supported by biometrics and AI – both of which need demystifying for the public to gain confidence in them – along with Digital Public Infrastructure, the Metaverse, passkeys, digital cash, digital driver’s licences, next generation payments, the critical need for digital inclusion, and the challenges of delegated administration across communities. The agenda features local and international speakers covering these topics as well, with a panel of four experts reviewing the sessions from a Te Ao Māori perspective.

The richness of the content at this year’s event far exceeds that of previous years. So do not be surprised when the 2024 Digital Trust Hui Taumata comes up in conversations for years to come.

Colin Wallis, Executive Director, DINZ

The post Will the Digital Trust Hui Taumata 2024 move the dial? appeared first on Digital Identity New Zealand.


Energy Web

Energy Web Announces Strategic Partnership with Acurast to Advance Sustainability and Innovation in…

Energy Web Announces Strategic Partnership with Acurast to Advance Sustainability and Innovation in Energy Sector Integration of Decentralized Compute Networks to Enhance Efficiency and Sustainability in Global Energy Landscape

July 23, 2024 — ZUG, Switzerland — Energy Web, a pioneer in developing open-source technology solutions for the energy sector, is thrilled to announce a strategic partnership with Acurast, an innovative leader in decentralized computing. This collaboration marks a significant step forward in enhancing the capabilities of both platforms while driving sustainability and technological innovation across the global energy landscape.

The partnership aims to seamlessly integrate Energy Web worker node networks with Acurast’s Decentralized Compute network. This integration will enable Energy Web users to host Energy Web workers on Acurast’s secure and widely distributed compute protocol. The primary goal is to facilitate a more efficient and scalable deployment of digital energy solutions.

In a move to expand its digital footprint, Energy Web will leverage the Acurast SDK to roll out a new mobile application. This collaboration will not only enhance mobile accessibility but also significantly improve functionality, providing users with robust tools for managing their energy resources efficiently.

Both Acurast and Energy Web Foundation are committed to sustainability. Acurast’s approach to upcycling smartphones, giving them a second life as compute units in its decentralized network, dramatically reduces electronic waste and promotes efficient resource use. Similarly, Energy Web Foundation is dedicated to accelerating the clean energy transition through its development of cutting-edge, open-source technologies for energy systems.

By combining their unique resources and expertise, Acurast and Energy Web Foundation aim to foster significant innovation, efficiency, and sustainability in the energy sector. This partnership underscores their shared vision of a more sustainable and decentralized future, driving positive change across communities worldwide.

About Energy Web
Energy Web is a global non-profit organization accelerating the energy transition by developing and deploying open-source decentralized technologies. Our solutions leverage blockchain to enable new market mechanisms and decentralized applications that empower energy companies, grid operators, and customers to take control of their energy futures.

Energy Web Announces Strategic Partnership with Acurast to Advance Sustainability and Innovation in… was originally published in Energy Web on Medium, where people are continuing the conversation by highlighting and responding to this story.


Kantara Initiative

US Multiple Award Schedule requires CSPs to be NIST 800-63 compliant  


In May 2024, the US government’s General Services Administration (GSA) updated its Multiple Award Schedule (MAS) Contract with a new Special Item Number (SIN 541519CSP, Credential Service Providers) under the IT Large Category. SIN 541519CSP is designed to help federal agencies ensure that any IT services procured meet the requirements of National Institute of Standards and Technology (NIST) Special Publication (SP) 800-63 for digital identity services. To provide credential services under the new SIN, companies must meet specific instructions and requirements.

SIN 541519CSP was created to meet the increasing need for robust, trustworthy credential service providers. The new SIN will help government agencies quickly identify credential service providers that have been vetted against the government’s standard requirements. If your company offers credential services and meets the requirements, obtaining SIN 541519CSP will place your company in a better position to capture bids as agencies look to acquire NIST 800-63 compliant services.

In order to be included on the Schedule, Credential Service Providers must either be listed on the Kantara Trust Status List or provide a letter of approval from Kantara Initiative, or another GSA-approved third party that can assure conformance to NIST SP 800-63. To begin the process, you’ll need to complete forms that can be found on idmanagement.gov. Since both state and federal government agencies are permitted to use the vendors on this schedule for credential services, this considerably extends opportunities for Kantara certified companies.

Read the full instructions on how to be included on MAS, including the Technical Evaluation Criteria.

The post US Multiple Award Schedule requires CSPs to be NIST 800-63 compliant   appeared first on Kantara Initiative.


ResofWorld

Pressured to relocate, Microsoft’s AI engineers in China must choose between homeland and career

As geopolitical tensions grow, many employees have decided that a career with the Silicon Valley tech giant isn’t worth giving up the comforts of home.
Alan, a young engineer at Microsoft, has been living a comfortable life in Beijing working for the tech giant on cloud computing. He earns six times the average income in...

In Indonesia, social media is a “hunting ground” for religious minorities

Conservative Muslim influencers spread hate speech to their millions of followers on TikTok and YouTube, with little pushback from authorities or platforms.
For nearly two decades, hundreds of Ahmadiyya Muslims have lived in a cramped government shelter on the Indonesian island of Lombok, after they were attacked by a mob that accused them...

Blockchain Commons

2024 Q2 Blockchain Commons Report


Blockchain Commons is a not-for-profit organization that advocates for the creation of open, interoperable, secure, and compassionate digital infrastructure. Our goal is to enable people to control their own digital destiny and to maintain their human dignity online. We do this through the creation of interoperable specifications and reference software that demonstrate how to create and manage digital assets in ways that are private, independent, resilient, and open.

In Q2 of 2024, we advanced these principles through the following work:

- Gordian Envelope Updates: Expanded Developer Pages; Request/Response Presentation; Graph Representation
- Gordian Meetings: FROST Presentation; PayJoin Presentation; All the Rest
- Seedtool-Rust Release: seedtool-cli-rust; Seedtool Manual
- dCBOR Adoption: cbor.me; cbor2; QCBOR; IANA assignment of Tag 201
- GSTP Improvements
- SSH Research: ssh-envelope Experiment in Python; SSH Key Support for envelope-cli; More to Come
- Architectural Articles: Minimum Viable Architecture; Authentication Patterns
- DID Futures: W3C DID 1.1 WG; RWOT 13
- Grants/Funding
- What’s Next?

Gordian Envelope Updates

Gordian Envelope, Blockchain Commons’ “Smart Document” system, continues to be a major focus. Here’s what that meant in Q2.

Expanded Developer Pages. The developer pages were updated with a new executive summary and feature list to clarify the capabilities and advantages of using Envelope. (More executive summaries of our technology to follow!)

Request/Response Presentation. Our May Gordian Developers Meeting included a presentation on Request/Response, which is an interoperable communication methodology using Gordian Envelope. Why use it? It can make complex digital-asset procedures more accessible by using automation to dramatically reduce the amount of human interaction needed, yet it also preserves security by ensuring that human choices are required whenever data is transmitted from one device to another. (But watch the presentation for more!)

Graph Representation. Blockchain Commons has a new research paper out on Representing Graphs with Envelope, which presents a proposed architecture for representing many types of graphs, enabling the use of Envelope for a variety of graph-based structures and algorithms.

Gordian Meetings

Gordian Developer Meetings are how we bring the wallet community together to talk about our interoperable specifications. We’ve been thrilled to expand that in the last quarter with some feature presentations from experts in the field.

FROST Presentation. April saw a special presentation on FROST by Jesse Posner that not only talked about his work to date, but also some of the emerging capabilities of FROST, such as the ability to regenerate shares or even change thresholds without changing the underlying secret! We’ve long thought FROST was a great next-generation resilience solution for digital assets, and so appreciate Jesse talking to our community about why it’s so exciting. See the complete video of our April meeting for more.

PayJoin Presentation. Privacy is one of our fundamental principles for Gordian design. It’s also a principle that will be better supported in Bitcoin with a new version of PayJoin. Dan Gould was kind enough to give a full presentation on the updates he’s working on at our May meeting. We’ve got a video of just his PayJoin presentation.

All the Rest. Both meetings of course also included details on Blockchain Commons’ own work (much of which is detailed in this report). The Gordian Developer meetings continue on the first Wednesday of every month. We’ve also already scheduled a few feature presentations for the rest of the year. On August 7th, we’ll have a special presentation on BIP-85, then on December 4th, we’ll have another FROST presentation for wallet developers. If you’d like to make a special presentation in September, October, or November on a topic of interest to wallet developers, let us know!

Also, if you’re a cryptographer, spec designer, or library developer who is working to implement FROST, please be sure to sign up for our FROST implementers announcements-only list so that you can receive invites for our second FROST Implementers Round Table, which will be on September 18 thanks to support from the Human Rights Foundation (HRF).

Seedtool-Rust Release

Blockchain Commons’ newest reference application is seedtool-cli for Rust.

seedtool-cli-rust. Seedtool is a domain-specific application that allows the creation, reconstruction, translation, and backup of cryptographic seeds. Blockchain Commons’ new Rust-based Seedtool replaces our older C++-based CLI and provides broader support for Gordian Envelope, including offering Gordian Envelopes of SSKR shares that can back up a seed using Shamir’s Secret Sharing. Seedtool’s Gordian Envelopes can then be piped into envelope-cli-rust for compression, encryption, or the addition of further metadata.
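For intuition about what share-based backup buys you, the sketch below is a toy illustration only: SSKR itself implements Shamir’s Secret Sharing (with thresholds and share groups) rather than this simple scheme, but a 2-of-2 XOR split shows the core property that no single share reveals anything about the seed.

```typescript
// Toy 2-of-2 split: NOT SSKR, just the share-and-recover intuition.
import { randomBytes } from 'node:crypto';

function split2of2(seed: Uint8Array): [Uint8Array, Uint8Array] {
  const shareA = randomBytes(seed.length);                 // uniformly random pad
  const shareB = seed.map((byte, i) => byte ^ shareA[i]);  // seed XOR pad
  return [shareA, shareB];                                 // each share alone is noise
}

function recover(shareA: Uint8Array, shareB: Uint8Array): Uint8Array {
  return shareA.map((byte, i) => byte ^ shareB[i]);        // XOR cancels the pad
}

const seed = randomBytes(16);
const [a, b] = split2of2(seed);
console.log(Buffer.from(recover(a, b)).equals(Buffer.from(seed))); // true
```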

Seedtool Manual. For more on seedtool-cli-rust, check out the full user manual, which explains how to use all of its functionality and why it’s important.

dCBOR Adoption

dCBOR is one of the foundations of Envelope, as it allows for the deterministic ordering of data, which is crucial for a hashed data system like Envelope. The IETF dCBOR Internet-Draft updated from v8 to v10 over Q2, with most of those changes due to expanding support for the spec. We’re still hoping to see the Internet-Draft finalized soon!
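To make “deterministic ordering” concrete, here is a minimal sketch of just the map-key ordering rule: entries are sorted by the lexicographic order of the keys’ encoded bytes, so independent encoders produce byte-identical maps and, in turn, identical hashes. This sketch is ours, not code from the spec; real dCBOR adds further rules such as shortest-form numeric encoding and rejection of duplicate keys.

```typescript
type EncodedEntry = { keyBytes: Uint8Array; valueBytes: Uint8Array };

// Lexicographic byte comparison; a shorter prefix sorts first on a tie.
function compareBytes(a: Uint8Array, b: Uint8Array): number {
  const n = Math.min(a.length, b.length);
  for (let i = 0; i < n; i++) {
    if (a[i] !== b[i]) return a[i] - b[i];
  }
  return a.length - b.length;
}

function sortMapEntries(entries: EncodedEntry[]): EncodedEntry[] {
  return [...entries].sort((x, y) => compareBytes(x.keyBytes, y.keyBytes));
}

// CBOR-encoded unsigned ints: 1 -> 0x01, 100 -> 0x18 0x64.
const entries: EncodedEntry[] = [
  { keyBytes: Uint8Array.of(0x18, 0x64), valueBytes: Uint8Array.of(0x00) },
  { keyBytes: Uint8Array.of(0x01), valueBytes: Uint8Array.of(0x00) },
];
console.log(sortMapEntries(entries)[0].keyBytes); // Uint8Array [ 1 ]
```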

cbor.me. The CBOR Playground is Carsten Bormann’s foundational diagnostic site for CBOR. It now supports dCBOR thanks to a new Ruby Gem that Carsten authored.

cbor2. Joe Hildebrand’s cbor2 library for Typescript has also been expanded to support dCBOR.

QCBOR. Laurence Lundblade’s QCBOR library (which is written in C) now supports dCBOR in its development branch.

IANA Assignment of Tag 201. Finally, 201 is now officially the “enclosed dCBOR” tag for CBOR. This is also critical for Gordian Envelope, which uses this tag to wrap dCBOR in each of an envelope’s “leaf” nodes.

GSTP Improvements

Gordian Sealed Transaction Protocol (GSTP) is a Gordian Envelope extension. It allows for Envelope Requests and Responses to be sent in a secure way and is a critical element of Blockchain Commons’ Collaborative Seed Recovery system, which enables the storage of SSKR shares in a Gordian Depository.

GSTP Advances. Thanks to support from our Research Sponsor, Foundation Devices, Blockchain Commons was able to invest considerable engineering effort in GSTP in the last quarter, resulting in more fluent API patterns for building GSTP requests and responses. In addition, GSTP now supports bidirectional self-encrypted state with a unique and powerful new feature that we are calling Encrypted State Continuations (ESC). Overall, GSTP is a system that is secure, distributed, and transport-agnostic. In a world where we could be sending digital-asset info by NFC, Bluetooth, or QR codes, it’s a critical security measure. See our presentation from the most recent Gordian Developers Meeting for more!

SSH Research

SSH has long been used as an authentication system, primarily for accessing UNIX computers. However, it has recently seen increasing use as a signing system as well, primarily thanks to extensions in Git. That has led to Blockchain Commons experimenting with the integration of SSH keys into Envelope. (This has also demonstrated the flexibility of Envelope through the addition of these signing methodologies.) We’ve now got some first results.

ssh-envelope Experiment in Python. Early in the quarter, we produced ssh-envelope, an experimental Python program that worked with both ssh-keygen and envelope-cli. But, thanks to some very rapid development, we’ve already moved beyond that.

SSH Key Support for envelope-cli. We’ve since integrated SSH key support throughout our Rust stack, primarily affecting our bc-components and bc-envelope Rust crates. This allowed us to bring our SSH key support fully into the Rust envelope-cli, which you can now use for SSH signing.

More to Come. We’re still working on processes that will allow for the safe, secure, and reliable signing of software releases, something that we talked about extensively in our software use cases. You can see some more of our work-in-progress in a discussion of SSH Key Best Practices. We hope to have more on using SSH to enable resilient & secure software releases later in the year.

Architectural Articles

Blockchain Commons expresses a lot of its more architectural thoughts as articles. There were two major articles in Q2.

Minimum Viable Architecture. Our first major article for the quarter focused on the methodology of Minimum Viable Architecture (MVA). Many companies still focus on Minimum Viable Products. Our article advocates instead looking at the big picture (with lots of discussion on why that’s important).

Authentication Patterns. Design patterns are a crucial element in architectural design. Much as with the adversaries found in #SmartCustody, design patterns allow you to put together a larger system piece by piece. As part of a guide to the strength of heterogeneity in architectural design, Blockchain Commons penned a set of authentication design patterns. We’d like to do more to fill out the space, but for now feel like this is a good first cut that shows the value of the design style.

DID Futures

The Blockchain Commons principals have been involved with DIDs since Christopher Allen founded Rebooting Web of Trust in 2015.

W3C DID 1.1 WG. After a hiatus, the W3C DID working group has been rechartered through 2026. Christopher Allen continues as an Invited Expert, focused on a variety of privacy issues, including elision, DID registration, and DID resolver issues.

RWOT 13. Meanwhile, Rebooting the Web of Trust continues to be on the frontline for DID advancements, with Christopher still the chair of the organization and Shannon Appelcline the editor-in-chief. RWOT13 is finally back in the USA, with the early bird deadline for advance-reading papers at the start of August.

Grants/Funding

As we’ve written elsewhere, funding has become more difficult in the last year because of large-scale financial factors such as inflation and the resultant increase in interest rates. Blockchain Commons has responded by working more closely with some of our partners on topics of special interest to them and by seeking out grants.

Thanks to Human Rights Foundation for their grant enabling our continued support of FROST work.

Thanks to Foundation Devices for their support of GSTP work.

Thanks to Digital Contract Design for their support of our advocacy over the last year.

Please consider becoming a personal or corporate sponsor of Blockchain Commons so that our work can continue. Or, if you want support to integrate or expand one of Blockchain Commons’ existing projects (such as SSKR, Envelope, or the Gordian Depositories) in an open manner, to meet your company’s needs, contact us directly about becoming a Research Sponsor.

Also, please let us know of any grants or awards that you think would be closely aligned with our work at Blockchain Commons, so that we can apply.

What’s Next?

Coming up:

More work on Envelope & GSTP. More reveals of our SSH work. New musings on cryptographic “cliques”.

We’re looking forward to Q3!

Monday, 22. July 2024

EdgeSecure

Edge Welcomes the American Association of Colleges and Universities (AAC&U) to the EdgeMarket Affiliate Partner Program


NEWARK, NJ, July 24, 2024 – Edge is pleased to announce that the American Association of Colleges and Universities (AAC&U) has joined the EdgeMarket cooperative as an Affiliate Partner. Through this partnership, AAC&U and its members can take advantage of thousands of services and solutions through EdgeMarket’s streamlined procurement process. Edge and AAC&U will mutually explore new ways to add value to the constituencies they serve.

AAC&U is a global membership organization dedicated to advancing the vitality and democratic purposes of undergraduate liberal education. Serving administrators, faculty, staff, and students at nearly 1,000 colleges and universities across the country and around the world, AAC&U serves as a catalyst and facilitator for innovations that improve teaching and learning and support student success. AAC&U offers special discounts to its members on select products and services from a range of higher education providers, and now as an EdgeMarket Affiliate Partner, that catalog has greatly expanded.

Designed to reduce the time, cost, and hassle of purchasing products and services, the EdgeMarket portal provides easy access to a variety of procured point solutions and services, including one of the most powerful procured IT and Services catalogs in the nation.


“The ethos of Edge aligns well with AAC&U’s vision of improving educational quality and equity, and EdgeMarket now serves as a pathway to the essential technologies and services that support AAC&U’s members’ pursuit of that vision,” said Dan Miller, Associate Vice President, EdgeMarket and Solution Strategy. “This is much more than a transactional relationship; this is a partnership where we will seek ways to combine our strengths to create new opportunities for the higher education community. We are excited to learn more about what is important to AAC&U and its members and how Edge and EdgeMarket can add to AAC&U’s legacy of deep and positive impact.”

To learn more about the EdgeMarket Affiliate Partner Program visit us here.

About the American Association of Colleges and Universities: AAC&U is a global membership organization dedicated to advancing the democratic purposes of higher education by promoting equity, innovation, and excellence in liberal education. Through its programs and events, publications and research, public advocacy, and campus-based projects, AAC&U serves as a catalyst and facilitator for innovations that improve educational quality and equity and that support the success of all students. In addition to accredited public and private, two-year and four-year colleges and universities, and state higher education systems and agencies throughout the United States, AAC&U’s membership includes degree-granting higher education institutions around the world as well as other organizations and individuals. To learn more, visit www.aacu.org.

About Edge: Edge serves as a member-owned, nonprofit provider of high performance optical fiber networking and internetworking, Internet2, and a vast array of best-in-class technology solutions for cybersecurity, educational technologies, cloud computing, and professional managed services. Edge provides these solutions to colleges and universities, K-12 school districts, government entities, hospital networks and nonprofit business entities as part of a membership-based consortium. Edge’s membership spans the northeast, along with a growing list of EdgeMarket participants nationwide. Edge’s common good mission ensures success by empowering members for digital transformation with affordable, reliable and thought-leading purpose-built, advanced connectivity, technologies and services.

The post Edge Welcomes the American Association of Colleges and Universities (AAC&U) to the EdgeMarket Affiliate Partner Program appeared first on NJEdge Inc.


FIDO Alliance

Strengthening Authentication with Passkeys in Automotive and Beyond


On July 16th, 2024, the FIDO Alliance held a seminar focused on the fit for FIDO authentication and device onboarding within the automotive industry. Co-hosted with Swissbit, the event had over 100 attendees who heard from various stakeholders on the need and opportunity for standards-based approaches to securing the automotive workforce and manufacturing process. Themes included how passkeys and FIDO-certified biometrics can help transform the future of in-vehicle experiences, especially with in-car payments, smart cars, and IoT.

FIDO Momentum in the Automotive Industry

Like just about every market sector, the automotive industry is plagued by risks and ramifications associated with decades of relying on passwords – and is also uniquely poised to improve the user experience by embracing passkeys for user authentication.

With smart cars having embedded technology to connect to digital experiences, there are several innovations primed for take-off in the automotive industry. With nearly 100 million vehicles expected to be making payments by 2026, up from just 2.3 million in 2021, passkeys will be crucial to simplifying the in-vehicle user experience. At the same time, manufacturers have the opportunity to improve IoT and secure embedded devices to improve customer experiences on and off the road.

Manufacturing and Smart Car Case Studies

On the workforce front, the event featured a case study from MTRIX and considerations on how to deploy FIDO security keys to a manufacturer’s workforce – contemplating the many types and locations of workers for today’s global manufacturers. This case study reinforced the factors called out in a presentation by Infineon on the regulatory-driven push and pull with FIDO authentication.

VinCSS described how FIDO Device Onboard is being used today to secure the smart car ecosystem both at point of manufacturing as well as for after-market use cases.

Using Passkeys for In-Vehicle Payments

The final block of sessions looked more closely at our in-vehicle future – including an overview of current trends for in-vehicle payments. Visa and Starfish then presented a blueprint and a demo, respectively, for a standards-based approach to in-vehicle payments before Qualcomm wrapped things up with their vision for a digital chassis as the foundation for a software-defined vehicle that contemplates the need for secure identity, payments, and driver/passenger personalization.

Driving FIDO in the Automotive Industry – Next Steps

Interested in this seminar’s content? Find these presentations and more on the Munich Seminar event page.

The FIDO Alliance welcomes input from the public and the identity security community on FIDO’s future in the automotive industry. Comments are welcome via our contact us page. For in-person connections, we encourage identity security and authentication professionals to join us at our conference, Authenticate, where there will be several automotive and passkey related sessions, content, and peer networking. This year’s event will be held October 14-16, 2024, in sunny southern California at the Omni La Costa Resort in Carlsbad, CA.


FIDO Munich Seminar: Strengthening Authentication with Passkeys in Automotive and Beyond


The FIDO Alliance recently held a seminar in Munich for a comprehensive dive into FIDO authentication and passkeys. The seminar, co-hosted by Swissbit, provided an exploration of the current state of passwordless technology, detailed discussions on how passkeys work, their benefits, case studies, and practical implementation strategies. Attendees learned about current and emerging elements of the FIDO Certified program and how they pertain across sectors, including a focus on automotive and payments use cases. 

Attendees also had the opportunity to engage directly with those who are currently implementing FIDO technology through open Q&A and networking – plus the opportunity to see demos and meet the experts that can help move FIDO deployments forward.

View the seminar slides below. More slides will be added.

FIDO Munich Seminar Introduction to FIDO.pptx from FIDO Alliance

FIDO Munich Seminar Blueprint for In-Vehicle Payment Standard.pptx from FIDO Alliance

FIDO Munich Seminar FIDO Automotive Apps.pptx from FIDO Alliance

FIDO Munich Seminar: Biometrics and Passkeys for In-Vehicle Apps.pptx from FIDO Alliance

FIDO Munich Seminar: Strong Workforce Authn Push & Pull Factors.pptx from FIDO Alliance

FIDO Munich Seminar: Securing Smart Car.pptx from FIDO Alliance

FIDO Munich Seminar In-Vehicle Payment Trends.pptx from FIDO Alliance

FIDO Munich Seminar Workforce Authentication Case Study.pptx from FIDO Alliance

FIDO Munich Seminar: FIDO Tech Principles.pptx from FIDO Alliance

Energy Web

ECS4DRES: Shaping the Future of Renewable Energy Systems

A New Horizon Europe Project to Enhance Reliability and Resilience in Distributed Renewable Energy Across Europe

We are excited to announce our new EU project, Electronic Components and Systems for Flexible, Coordinated, and Resilient Distributed Renewable Energy Systems (ECS4DRES). This groundbreaking initiative is co-funded by Horizon Europe and the Federal Government.

In collaboration with 33 partners across 6 European countries, ECS4DRES aims to revolutionize the reliability, safety, and resilience of Distributed Renewable Energy Systems (DRES). By developing advanced monitoring and control technologies, the project will incorporate integrated sensors with energy harvesting functions, capable of various types of detection for the safety and monitoring of energy transfers. Additionally, ECS4DRES will deliver interoperable, low-latency communication systems, along with sophisticated algorithms, AI tools, and methods. These innovations will enable the widespread interconnection, monitoring, and management of numerous DRES, subsystems, and components, optimizing energy management between sources, loads, and storage, enhancing power quality, and ensuring resilient system operation.

ECS4DRES is committed to thorough validation of these technologies through a series of five relevant use cases and demonstrators. The project’s results will generate a wide range of scientific, technological, economic, environmental, and societal impacts on a global scale, meeting the needs of Original Equipment Manufacturers (OEMs), Distribution System Operators (DSOs), grid operators, EV charging station aggregators, energy communities, end customers, and academia.

By providing interoperable and tailored solutions in electronic control systems, sensor technology, and smart systems integration, ECS4DRES will facilitate the deployment and efficient, resilient operation of DRES, including the integration of hydrogen equipment and components.

As we embark on this ambitious project, we are reminded of the words of renowned futurist Alvin Toffler: “The great growling engine of change — technology.” ECS4DRES represents a significant leap forward in the technological advancement of renewable energy systems, driving us toward a more sustainable and resilient future.

About Energy Web
Energy Web is a global non-profit organization accelerating the energy transition by developing and deploying open-source decentralized technologies. Our solutions leverage blockchain to enable new market mechanisms and decentralized applications that empower energy companies, grid operators, and customers to take control of their energy futures.

ECS4DRES: Shaping the Future of Renewable Energy Systems was originally published in Energy Web on Medium, where people are continuing the conversation by highlighting and responding to this story.


Identity At The Center - Podcast

Join us on The Identity at the Center Podcast as we sit down


Join us on The Identity at the Center Podcast as we sit down with Joseph Carson, Chief Security Scientist and Advisory CISO at Delinea.

In this new episode, we explore Joseph's fascinating journey in identity and access management, cybersecurity, and his firsthand experiences in Estonia's digital identity ecosystem. We delve into the challenges and triumphs of digital identity, the emerging field of ITDR, and the intersection of digital identity, authentication, and AI in cybersecurity.

Watch the episode: https://www.youtube.com/watch?v=klBxFLvUC78

More Info: idacpodcast.com

#iam #podcast #idac


ResofWorld

Ethiopians are struggling to keep up with the new “EV or nothing” policy

Ethiopia became the first country in the world to ban the import of gas and diesel cars. But the country has only around 50 charging stations.
When Araya Belete’s employer asked him to purchase four new cars in Addis Ababa last year, the IT professional quickly settled on an electric model, manufactured by China’s Kas Auto....

Saturday, 20. July 2024

ResofWorld

Bangladesh’s internet blackout immobilizes its booming tech industry

The government and internet service providers blame each other for the blackout amid massive protests.
Bangladesh’s tech industry has come to a halt as the nationwide internet blackouts entered a third day, leaving thousands of companies with financial and reputational losses, and workers feeling helpless....

Friday, 19. July 2024

OpenID

Calling all Implementers: Shared Signals Interop at Gartner IAM Summit


Join us for a Shared Signals Interop Event at Gartner IAM in Grapevine, Texas (December 9-11)

Momentum continues to build around the OpenID Shared Signals Framework, CAEP, and RISC standards. Leading companies have announced their support, and implementations are now in production.

Building on the success of the first CAEP interoperability event, the OpenID Foundation and the Shared Signals Work Group (SSWG) are delighted to announce that we are returning to Gartner’s IAM Summit for a second interop event, this time in Grapevine, Texas, December 9-11, 2024.

During this event, we will demonstrate interoperability of implementations of:

- The Shared Signals Framework (SSF)
- Continuous Access Evaluation Profile (CAEP)
- Risk and Incident Sharing and Collaboration (RISC)
- SCIM Events
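For a flavor of the payloads such implementations exchange, here is a hedged sketch of a CAEP “session revoked” Security Event Token (SET) body, expressed as a TypeScript object. The event-type URI comes from the CAEP specification and the subject identifier follows RFC 9493; the issuer, audience, and subject values are placeholders, and in production this object would be signed and delivered as a JWT.

```typescript
const caepSessionRevoked = {
  iss: 'https://transmitter.example.com',     // event transmitter (placeholder)
  aud: 'https://receiver.example.com',        // event receiver (placeholder)
  jti: 'set-4d3d3d3d',                        // unique SET identifier
  iat: 1719792000,                            // issued-at, seconds since epoch
  sub_id: { format: 'email', email: 'user@example.com' }, // RFC 9493 subject
  events: {
    'https://schemas.openid.net/secevent/caep/event-type/session-revoked': {
      event_timestamp: 1719791990,            // when the session was revoked
    },
  },
};

console.log(JSON.stringify(caepSessionRevoked, null, 2));
```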

How to Get Involved

There is room for up to 10 implementers, and you are invited to register your interest with Atul Tulshibagwale, co-chair of the SSWG, at atul@sgnl.ai by August 1st, 2024. Note that this is not a final commitment to participate, but an expression of interest. The final scope of the event will be determined between the SSWG and the implementers who register interest. You will have the opportunity to confirm participation once all interested parties have met to agree on the scope and format of the interoperability event.

The Gartner IAM Summit will feature a breakout session and two interoperability sessions where the implementations will be discussed.

In the breakout session, we will display the matrix of interoperability test results for all committed implementers. In the interoperability sessions, conference attendees will have the opportunity to interact with the implementers and watch the live implementations interoperate with each other.

OIDF will publicize the participants’ interop test results after the event takes place – including whether they did or did not achieve interoperability for specific use cases.

Note that we can enable remote participation for offline testing and inclusion in test results, but remote participation is not possible during the summit.


About the OpenID Foundation

The OpenID Foundation (OIDF) is a global open standards body committed to helping people assert their identity wherever they choose. Founded in 2007, we are a community of technical experts leading the creation of open identity standards that are secure, interoperable, and privacy preserving. The Foundation’s OpenID Connect standard is now used by billions of people across millions of applications. In the last five years, the Financial Grade API has become the standard of choice for Open Banking and Open Data implementations, allowing people to access and share data across entities. Today, the OpenID Foundation’s standards are the connective tissue to enable people to assert their identity and access their data at scale, the scale of the internet, enabling “networks of networks” to interoperate globally. Individuals, companies, governments and non-profits are encouraged to join or participate. Find out more at openid.net.

The post Calling all Implementers: Shared Signals Interop at Gartner IAM Summit first appeared on OpenID Foundation.

Thursday, 18. July 2024

FIDO Alliance

Battling Deepfakes with Certified Identity Verification


The digital transformation and the proliferation of e-identity schemes have escalated the need for secure and reliable online identity verification methods, especially in light of the alarming trend of AI-generated “deepfakes.” As internet users have learned about the increasing threat of deepfakes, they have become increasingly concerned about their identities being spoofed online, according to a new study conducted by the FIDO Alliance. As a result, awareness of deepfakes and the risks associated with them has steadily increased.

Amidst this landscape, the FIDO Alliance released its newest research in the eBook, Remote ID Verification – Bringing Confidence to Biometric Systems Consumer Insights 2024, which reveals insights from an independent study surveying 2,000 respondents in the U.S. and the U.K. on consumer perceptions of remote identity verification, online security, and biometrics. While the data showed consumer awareness and adoption of biometrics is increasing, consumers also expressed concerns about the rise of AI-generated deepfakes – reinforcing the need for preventative strategies and technologies focused on secure remote identity verification.

What is a “deepfake”?

According to the Center for Internet Security, a deepfake consists of convincingly fabricated audio and video content designed to mislead audiences into believing that fabricated events or statements are real. These manipulations can create realistic yet entirely false representations of individuals through synthetic images or complete video footage. This manipulated audio/video content is dangerously effective at spreading false information. In cybersecurity, deepfakes are increasingly being used to spoof identities to fraudulently open accounts or take control of existing accounts.

With the advent of AI and the increasing use of face biometrics for remote identity verification, deepfake risks to remote identity proofing (RIDP) methods have become a reality. Security researchers have been closely evaluating the identity verification risks associated with deepfakes to increase awareness of the rapidly changing threat landscape and to support stronger countermeasures that enhance the trustworthiness and reliability of RIDP methods. In the European Union Agency for Cybersecurity’s (ENISA) latest remote ID report, researchers observed that deepfake injection attacks are increasing and becoming more difficult to mitigate.

Users Express Concerns about Deepfakes and ID Verification

With the rise of generative AI and deepfake videos in the news, there has been heightened consumer unease about the security of biometrics for online verification. In the FIDO Alliance’s study, deepfake trends have not escaped the attention of consumers, who are increasingly using face biometrics to authenticate identities online and are concerned about identity security.

On one hand, the study reinforced consumer preference for using biometrics in remote identity verification, with nearly half of the respondents indicating a preference to use face biometrics, especially for sensitive transactions, like financial services (48%). 

On the other hand, just over half of respondents revealed they are concerned about deepfakes when verifying identities online (52%).

Building Consumer Trust in Face Biometrics

As the concerns around deepfake security threats gain prominence, the industry has taken a significant step forward with the FIDO Alliance’s newly introduced Identity Verification certification program for Face Verification. This industry-first testing certification program, based on ISO standards, with requirements developed by the FIDO Alliance, aims to measure accuracy, liveness (including deepfake detection), and bias (including skin tone, age, and gender) in remote biometric identity verification technologies. By providing a framework for testing biometric performance and a network of accredited laboratories worldwide, this certification program standardizes and evaluates the performance of face verification systems while mitigating the impact of bias and security threats, like deepfakes.

Certifying Identity Verification with the FIDO Alliance

The Identity Verification certifications that the FIDO Alliance provides offer industry providers the ability to demonstrate commitment to addressing bias and security threats in remote biometric identity verification technologies. With a focus on standardizing and enhancing the performance of face verification technologies, the Alliance released its new FIDO Certification Program to elevate the performance, security, and equity of biometric solutions for remote identity verification. Combined with its Document Authenticity (DocAuth) Certification Program, these two certifications work together to ensure identity verification solution providers can leverage FIDO’s independent testing and accredited laboratories as a market differentiator. 

What is the value for IDV Biometric Vendors?

- Independent validation of biometric performance
- Opportunity to understand gaps in product performance to then improve and align with market demands
- Demonstrate product performance to potential customers
- Improve market adoption by holding an industry-trusted certification
- Leverage one certification for many customers/relying parties
- Benefit from FIDO delta and derivative certifications for minor updates and extendability to vendor customers
- Reduce need to repeatedly participate in vendor bake-offs

What is the value for Relying Parties?

- One-of-a-kind, independent, third-party validation of biometric performance assessing accuracy, fairness and robustness against spoofing attacks
- Provides a consistent, independent comparison of vendor products – eliminating the burden of maintaining own program for evaluating biometric products
- Accelerates FIDO adoption to password-less
- Commitment to ensure quality products for customers of the relying parties
- Requirements developed by a diverse, international group of stakeholders from industry, government, and subject matter experts
- Conforms to ISO FIDO Annex published in ISO standards

What is the value of accredited laboratories?

FIDO Accredited Laboratories are available worldwide and follow a common set of requirements and rigorous evaluation processes, defined by the FIDO Alliance Biometrics Working Group (BWG) and follow all relevant ISO standards. These laboratories are audited and trained by the FIDO Biometric Secretariat to ensure lab testing methodologies are compliant and utilize governance mechanisms per FIDO requirements. Laboratories perform biometric evaluations in alignment with audited FIDO accreditation processes. In contrast, bespoke, single laboratory biometric evaluations may not garner sufficient trust from relying parties for authentication and remote identity verification use cases.

What are the ISO Standards that FIDO certification conforms to?

When a vendor invests in FIDO’s Face Verification Certification, they and their accredited lab are adhering to the following ISO standards:

Terminology
- ISO/IEC 2382-37:2022 Information technology — Vocabulary — Part 37: Biometrics

Presentation Attack Detection
- ISO/IEC 30107-3:2023 Information technology — Biometric presentation attack detection — Part 3: Testing and reporting
- ISO/IEC 30107-4:2020 Information technology — Biometric presentation attack detection — Part 4: Profile for testing of mobile devices
- FIDO Annex, published 2024

Performance (e.g., FRR, FAR)
- ISO/IEC 19795-1:2021 Information technology — Biometric performance testing and reporting — Part 1: Principles and framework
- ISO/IEC 19795-9:2019 Information technology — Biometric performance testing and reporting — Part 9: Testing on mobile devices
- FIDO Annex, published 2019

Bias (differentials due to demographics)
- ISO/IEC 19795-10:2024 Information technology — Biometric performance testing and reporting — Part 10: Quantifying biometric system performance variation across demographic groups
- FIDO Annex, under development

Laboratory
- ISO/IEC 17025:2017, General requirements for the competence of testing and calibration laboratories

Learn More about FIDO IDV Certification

As organizations and policymakers navigate the evolving landscape of digital identity verification, these consumer insights serve as a testament to the pressing need for independently tested and accurate biometric systems. The FIDO Alliance’s new Face Verification Certification Program offers solution providers the opportunity to demonstrate deepfake prevention to relying parties and end users by testing for security, accuracy, and liveness.

Download the Remote ID Verification eBook here today, and discover the world-class offerings from FIDO’s certified providers that have invested in independent, accredited lab testing with FIDO certification.


DIF Blog

DIF announces DWN Community Node


The Decentralized Identity Foundation (DIF) today announced the availability of the Decentralized Web Node (DWN) Community Instance, operated by DIF and powered by Google Cloud. 

Decentralized Web Nodes (DWNs - also referred to as DWeb Nodes) are personal data stores that eliminate the need for individuals to trust apps to responsibly use and protect their data. Instead, data is owned and controlled by the individual — offering developers a brand new way to create apps that request individuals’ permission to read and access their data, but don’t store it.

The Managed DWN service or “community node” will allow existing and new Google Cloud customers to more easily build test applications using DWNs. Rather than having to run their own DWN or server infrastructure to store data, developers will be able to leverage a DWN on Google Cloud to build test applications. 

Developers can use the community DWN node at no cost, including up to 1GB of storage per DID.

The service was launched during a well-attended DIF community call earlier today (event highlights below).

The launch marks an exciting milestone in DIF’s mission to make it easy for developers to build using decentralized identity, and follows the recent establishment of the Veramo User Group and an upgrade to DIF’s Universal Resolver infrastructure. 

DIF Steering Committee members Daniel Buchner and Markus Sabadello will share the stage with leaders from Google Cloud and DIF member org TBD at We Are Developers World Congress in Berlin tomorrow to shine a spotlight on the new service, including a live demo of applications built using DWNs.

Today’s launch event in brief 

Following an introduction by DIF’s Executive Director, Kim Hamilton Duffy, and DIF Ambassador, Jeffrey Schwartz, Founder and CEO of Dentity, several companies who are deploying DWNs in real-world use cases gave case study presentations. 

Daniel Buchner and Andor Kesselman, co-chairs of the DWN work item within DIF’s Secure Data Storage working group, described how DWeb Nodes work and what developers can do with them.

A hot topic that surfaced during the discussion and subsequent Q&A, led by DIF’s Senior Director of Community Engagement, Limari Navarrete, was how developers can define protocols that give users fine-grained control over the data in their DWN. 

Protocols “encode rules around how people interact with my data,” Andor said. “It’s very powerful!” he added. 
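To make that concrete, below is a hedged sketch of what such a protocol definition can look like with the Web5 SDK, loosely modeled on TBD’s published examples. The protocol and schema URIs are hypothetical, and the exact shape of the definition object (for example, the $actions entries) varies across @web5/api versions.

```typescript
import { Web5 } from '@web5/api';

const { web5 } = await Web5.connect();

// Hypothetical protocol: anyone may write a note into my DWN,
// but only a note's author (and I, as the DWN owner) can read it back.
const definition = {
  protocol: 'https://example.org/protocols/shared-notes',  // hypothetical URI
  published: true,
  types: {
    note: {
      schema: 'https://example.org/schemas/note',          // hypothetical schema
      dataFormats: ['application/json'],
    },
  },
  structure: {
    note: {
      $actions: [
        { who: 'anyone', can: 'write' },
        { who: 'author', of: 'note', can: 'read' },
      ],
    },
  },
};

const { status } = await web5.dwn.protocols.configure({
  message: { definition },
});
console.log(status.code, status.detail); // e.g. 202 on acceptance
```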

Celina Villanueva from Extrimian described a pre-authorisation protocol designed to enable first responders at an emergency to access critical information in an accident victim’s DWN. The protocol powers a new service that will be piloted in Buenos Aires later this year. Extrimian also envisages the protocol enabling customers to pre-authorise their bank to access personal information when needed. 

Michal Jarmolkowicz described how Swiss Safe is working with clients in the travel and healthcare sectors to build new services around DWNs, including a protocol enabling patients to take control of their health record and grant specific, time-limited access to healthcare professionals. 

Another recurring topic was how DWNs and Verifiable Credentials can interact. Insights included how DWNs enable backup and recovery of VCs, and the benefits of combining VCs with a “bring your own data” model. 

There was also discussion about the adoption status of DWNs, and how to accelerate this. Jeffrey Schwartz said Dentity clearly sees an adoption cycle underway, based on Proofs of Concept they are involved in or aware of. Michal Jarmolkowicz noted that searching out CIOs and CTOs with an active “technology radar” (a service that flags and tracks emerging technologies) has proved fruitful for Swiss Safe, as they are more likely to be aware of Decentralized Identity and the value it offers. He added that Data Protection Officers often respond enthusiastically to DWNs.

Several participants wanted to know which cloud platforms and developer tools DWNs can be used with today. Daniel Buchner said Google has already adapted DWNs for Cloud SQL and Blobstore, while others are currently working on adapters for AWS S3 and other services. He also shared the insight that cloud providers are incentivized to enable developers to add more DWN protocols, since this drives more utility from customers’ data, encouraging them to increase their service usage. 

Wrapping up, he urged participants to “get involved and see what you can build. Together, DIF’s Community Node and the Web5 SDK make it pretty simple. Try writing your own protocols and have fun with it!”

What are Decentralized Web Nodes?

DWNs are personal data stores and peer-to-peer communication nodes that serve as the foundation for decentralized apps and protocols. They live on devices and can be replicated on hosted instances in the cloud. These data stores are a realization of true serverless applications in which developers can store app data without using centralized servers or an account with a centralized service.

DWNs are a foundational component of Web5, an open source platform that provides a new, reliable identity and trust layer for the Web to enable decentralized applications and protocols that solve real problems for people. Built on open standards developed by the World Wide Web Consortium (W3C) and DIF, Web5 consists of three key pillars: Decentralized Identifiers (DIDs), Verifiable Credentials (VCs), and Decentralized Web Nodes.

Why use DIF’s Managed DWN Service?

The service empowers developers to build decentralized apps that give users full ownership and control over their data, without needing to ask each user to deploy a local DWN. Instead, app data can live in DWNs hosted on Google Cloud, making it easier than ever to put individuals in charge of their own data.
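As a rough picture of the developer experience, here is a hedged quickstart sketch based on TBD’s published Web5 SDK examples (the exact API may differ between @web5/api versions). Web5.connect() creates or loads a DID and connects to its DWN; a hosted instance such as the community node can then act as a remote sync target.

```typescript
import { Web5 } from '@web5/api';

// Create or load a DID and connect to its Decentralized Web Node.
const { web5, did: myDid } = await Web5.connect();

// Write data into my own DWN; the app reads it with permission
// but does not own or store it.
const { record } = await web5.dwn.records.create({
  data: 'Hello from my own data store!',
  message: { dataFormat: 'text/plain' },
});

// Read it back from the DWN.
console.log(await record?.data.text(), 'owned by', myDid);
```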

How do I get started with Managed DWNs? 

To start building decentralized apps on Web5, developers can visit developers.tbd.website. 

To access the DIF community node, start here.


Energy Web

Green Proofs by Energy Web Now Available as a Service

Enables energy companies to rapidly construct digital registries for green commodities

July 18, 2024 | Zug , Switzerland — Energy Web, a leading technology provider for the energy sector, is excited to announce the launch of Green Proofs as a Service, an advanced, cloud-based version of their acclaimed Green Proofs solution. This new offering enables businesses and organizations to rapidly construct digital registries for tracking, tracing, and exchanging digital certificates representing any green commodity with unprecedented flexibility and control.

Green Proofs as a Service includes the following key features:

- Customized Data Formats and Schema: Users can tailor data formats and schema specific to different green commodities, enabling any green commodity and associated data format to be supported.
- Configurable Business Logic and Rules: Administrators can define and adjust business logic and rules for the creation, transfer, issuance and retirement of certificates, providing full control over the certification process.
- Comprehensive Registry Administration: The service includes all functionalities expected of a registry administrator, such as the ability to add and remove users from individual companies or multiple companies, enhancing security and user management.

Green Proofs has already demonstrated its efficacy and reliability in supporting multiple enterprise solutions. Notable implementations include the RMI / EDF Sustainable Aviation Fuel Certificate Registry, a low-carbon shipping registry, and multiple 24/7 renewable energy matching solutions. These use cases highlight the versatility and robustness of Green Proofs in real-world applications.

“Green Proofs as a Service marks a significant milestone for Energy Web and our commitment to driving innovation in the energy sector,” said Mani Hagh Sefat, CTO of Energy Web. “By offering Green Proofs via an as-a-service model, we help our clients innovate much faster by quickly putting a digital registry into their hands for experimentation and rapid prototyping.”

Green Proofs as a Service is now available to businesses and organizations worldwide that are interested in using digital registries to support any green commodity supply chain. For more information or to schedule a demo, please visit www.energyweb.org or contact hello@energyweb.org.

About Energy Web
Energy Web is a global non-profit organization accelerating the energy transition by developing and deploying open-source decentralized technologies. Our solutions leverage blockchain to enable new market mechanisms and decentralized applications that empower energy companies, grid operators, and customers to take control of their energy futures.

Green Proofs by Energy Web Now Available as a Service was originally published in Energy Web on Medium, where people are continuing the conversation by highlighting and responding to this story.


Origin Trail

From Barcodes to Digital Links: Supercharging Trillions of Products for the Next 50 Years

Celebrating 50 Years of the GS1 Barcode

June 26 marked the 50th anniversary of the GS1 barcode, commemorating the first-ever product scan at a cash register checkout. Over the decades, billions of products worldwide have been equipped with barcodes, streamlining and standardizing supply chain processes in adherence to GS1 standards.

As consumer demand for product information grew, regulatory requirements became stricter, and supply chain optimization pressures increased, the need for an updated barcode became evident. Enter the GS1 Digital Link, the barcode upgrade designed to provide dynamic access to comprehensive product information. Now, with leading retail and consumer goods companies actively supporting the transition to Digital Link QR codes, the stage is set for the traditional barcode to retire gracefully.
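To make the upgrade concrete: a traditional EAN/UPC barcode encodes essentially a single identifier, the product's GTIN, whereas a Digital Link QR code encodes a web URI that embeds the GTIN plus optional qualifiers and can resolve to live product information. An illustrative URI (the values here are hypothetical) might look like:

https://id.gs1.org/01/09506000134352/10/LOT42/21/SER987

where 01 introduces the GTIN, 10 the batch/lot, and 21 the serial number. The same code a point-of-sale scanner reads can also be opened by a consumer's phone browser.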

Setting a Strong Foundation for Digital Link with OriginTrail

For products and brands to fully benefit from the GS1 Digital Link transition, a robust, connected, and verifiable data foundation is crucial. Product data is often split across various supply chain partners, including manufacturers, logistics providers, wholesalers, retailers, and others. To connect billions of products to the internet in a meaningful way that provides genuine insights and business value, this scattered product data needs to be interconnected.

Scanning a Digital Link on a product and seeing the manufacturer’s information, such as production date, description, ingredients, and brand details is good. Scanning the same code and accessing comprehensive information about the product’s journey through the supply chain — including whether the ingredients were ecologically produced, if the product was stored at proper temperatures during transport, and how long it was in the supply chain — is much better. This is the true potential of the Digital Link.

Beyond consumer engagement, consider a business operating a rail or airline network being able to access details on a component’s manufacture, testing, and maintenance by scanning a Digital Link code. That would surely have been invaluable during the recent Boeing aircraft incidents.

This is where the OriginTrail Decentralized Knowledge Graph (DKG) and GS1 Digital Link are a match made in heaven. The DKG provides a verifiable and interconnected knowledge base encompassing product data, supply chain events, certifications, locations, and more — across organizations and data sources. With the new DKG V8, OriginTrail introduces the scalability needed to bring billions of products equipped with Digital Link into a world of standards-based, connected, and verifiable data. And the new DKG Edge Node concept empowers organizations and business networks to exchange product and other supply chain data with just a few clicks while maintaining data privacy, verifiability, and connectivity.

Supply chain data from multiple sources connected in a verifiable Decentralized Knowledge Graph.

As a longstanding partner of GS1, OriginTrail DKG is designed to natively support GS1 standards, including EPCIS, Core Business Vocabulary (CBV), Global Data Model (GDM), and Digital Link. This integration means that consumers, regulators, brands, and other stakeholders can access richer, more comprehensive, and trusted product data. The challenge now is to make this user experience seamless and simple, and there’s a tech perfect for the job — Artificial Intelligence (AI).

OriginTrail, Digital Link, and AI: A Consumer Engagement Power Throuple

Incorporating AI into the mix creates an incredibly powerful technology trio, enabling brands to enhance consumer engagement, based on connected and verifiable data spanning organizations, in unprecedented ways. And with the DKG Edge Node, AI capabilities come natively. Brands can thus offer personalized and tailored experiences by allowing customers to scan a product with a Digital Link QR code and ask anything — from brand details to product origins, sustainability, and environmental impact, all based on verifiable data from OriginTrail DKG.

This combination not only benefits consumers but also provides brands with valuable insights into customer preferences, allowing them to refine their business strategies. As billions of products transition from barcodes to Digital Link, the potential of this technology trio becomes evident. In fact, AI-powered product discovery, based on OriginTrail and Digital Link, is no longer a future concept but a current reality.

Some additional examples to check out:

- Check the origin » Perutnina Ptuj
- Church of Oak Whiskey Distillery

Simultaneously, organizations can leverage AI to better understand and enhance their supply chains, ensuring they receive accurate and verifiable responses rooted in data from across their business network. By simply scanning a Digital Link QR code on a product, pallet, or shipping container, users are immediately empowered to ask questions and get verifiable answers — from basic queries like “Where was this product manufactured?” to more complex ones such as “Was the temperature in this shipping container in line with expectations?” and “Give me a list of all train wagons that are likely to experience issues with their wheels in the next month.” Exciting stuff indeed.

Where do we go from here?

As billions of products transition from traditional barcodes to Digital Link QR codes, establishing a robust foundation of connected and verifiable data becomes paramount. OriginTrail is at the forefront of this transformation, with the new DKG V8 offering the scalability and simplicity necessary to realize its full potential. When combined with AI, this technology trio unlocks immense opportunities for brands to engage with their customers in a trusted and meaningful way.

But consumer engagement is just one area set to benefit significantly from this transition. Regulatory bodies will gain streamlined access to verifiable product data, and supply chain management will become more proactive and efficient. The coming months and years promise exciting advancements and opportunities, making this a pivotal moment in the evolution of product information and consumer engagement.

We are excited to see OriginTrail at the epicenter of it all, as we — Trace Labs, the core developers of OriginTrail — along with our ecosystem partners get ready to unveil Digital Link support via the new DKG V8 at the GS1 Industry & Standards Event. Over 1,000 business leaders from 80+ countries will come together virtually to solve today’s greatest business challenges through the development and adoption of GS1 global standards.

To attend the GS1 Industry & Standards Event, register at: https://standards-event.gs1.org/

From Barcodes to Digital Links: Supercharging Trillions of Products for the Next 50 Years was originally published in OriginTrail on Medium, where people are continuing the conversation by highlighting and responding to this story.


Oasis Open Projects

Introducing the Coalition for Secure AI, an OASIS Open Project


Boston, MA – 18 July 2024 – The Coalition for Secure AI (CoSAI) was announced today at the Aspen Security Forum. Hosted by the OASIS global standards body, CoSAI is an open-source initiative designed to give all practitioners and developers the guidance and tools they need to create Secure-by-Design AI systems. CoSAI will foster a collaborative ecosystem to share open-source methodologies, standardized frameworks, and tools.

CoSAI brings together a diverse range of stakeholders, including industry leaders, academics, and other experts, to address the fragmented landscape of AI security.

CoSAI’s founding Premier Sponsors are Google, IBM, Intel, Microsoft, NVIDIA, and PayPal. Additional founding Sponsors include Amazon, Anthropic, Cisco, Chainguard, Cohere, GenLab, OpenAI, and Wiz.

CoSAI is an initiative to enhance trust and security in AI use and deployment. Its scope includes securely building, integrating, deploying, and operating AI systems, with a focus on mitigating risks such as model theft, data poisoning, prompt injection, scaled abuse, and inference attacks. The project aims to develop comprehensive security measures that address both the classical and the AI-specific risks of AI systems.

CoSAI is an open-source community led by a Project Governing Board, which advances and manages its overall technical agenda, and a Technical Steering Committee of AI experts from academia and industry who oversee its workstreams.

The Need for CoSAI

Artificial intelligence (AI) is rapidly transforming our world and holds immense potential to solve complex problems. To ensure trust in AI and drive responsible development, it is critical to develop and share methodologies that keep security at the forefront, identify and mitigate potential vulnerabilities in AI systems, and lead to the creation of systems that are Secure-by-Design.

Currently, securing AI and AI applications and services is a fragmented endeavor. Developers grapple with a patchwork of guidelines and standards which are often inconsistent and siloed. Assessing and mitigating AI-specific and prevalent risks without clear best practices and standardized approaches is a significant challenge for even the most experienced organizations.

With the support of industry leaders and experts, CoSAI is poised to make significant strides in establishing standardized practices that enhance AI security and build trust among stakeholders globally.

“CoSAI’s establishment was rooted in the necessity of democratizing the knowledge and advancements essential for the secure integration and deployment of AI,” said David LaBianca, Google, CoSAI Governing Board co-chair. “With the help of OASIS Open, we’re looking forward to continuing this work and collaboration among leading companies, experts, and academia.”

“We are committed to collaborating with organizations at the forefront of responsible and secure AI technology. Our goal is to eliminate redundancy and amplify our collective impact through key partnerships that focus on critical topics,” said Omar Santos, Cisco, CoSAI Governing Board co-chair. “At CoSAI, we will harness our combined expertise and resources to fast-track the development of robust AI security standards and practices that will benefit the entire industry.”

Initial Work

To start, CoSAI will form three workstreams, with plans to add more over time:

- Software supply chain security for AI systems: enhancing composition and provenance tracking to secure AI applications.
- Preparing defenders for a changing cybersecurity landscape: addressing investments and integration challenges in AI and classical systems.
- AI security governance: developing best practices and risk assessment frameworks for AI security.

Participation 

Everyone is welcome to contribute technically as part of the CoSAI open-source community. OASIS welcomes additional sponsorship support from companies involved in this space. Contact join@oasis-open.org for more information.  

Additional Information
CoSAI charter

Support for CoSAI

Amazon
“At Amazon, our top priority is safeguarding the security and confidentiality of customer data. From day one, AWS AI infrastructure and the Amazon services built on top of it have had security and privacy features built-in that give customers strong isolation with flexible control over their systems and data. As a sponsor of CoSAI, we’re excited to collaborate with the industry on developing needed standards and practices that will strengthen AI security for everyone.”
– Paul Vixie, VP/Distinguished Engineer and Deputy CISO, Amazon Web Services

Anthropic
“As a safety-focused organization, building and deploying secure AI models has been core to our mission from the start. We’re proud to partner with other industry leaders to help foster a secure AI ecosystem and collaborate on a set of technical security best practices and standards. We look forward to the work ahead with the coalition to encourage safe AI development.”
– Jason Clinton, Chief Information Security Officer, Anthropic 

Cisco
“Cisco is very excited to join forces with other industry leaders in the Coalition for Secure AI (CoSAI). This effort underscores our commitment to advancing AI security, developing standardized best practices, and ensuring that AI technologies are secure-by-design. Together with our partners, we aim to drive innovation and build trust in AI systems across all sectors.”
– Omar Santos, Distinguished Engineer, Cisco

Chainguard
“As we witness AI workloads evolving beyond simple applications to more sensitive and critical functions, ensuring their security becomes paramount. The current landscape is fragmented, with developers navigating through inconsistent and siloed guidelines. At Chainguard, we are excited to join CoSAI and contribute our expertise in creating secure-by-design AI systems. Together, we can set new benchmarks for AI security, ensuring that innovation progresses on a foundation of safety and reliability.” 
– Kim Lewandowski, Co Founder and Chief Product Officer, Chainguard

Cohere
“Cohere is proud to join the Coalition for Secure AI (CoSAI) to further our commitment to building frontier enterprise AI solutions with security and data privacy at the core. AI will have a transformative impact on businesses and we look forward to working with the rest of the industry to develop comprehensive standards that enhance trust and security to encourage wider adoption of this technology.” 
– Prutha Parikh, Head of Security, Cohere

GenLab
“Security requires a community to support, integrate, and promote best practices globally to ensure the stability and safety of AI. That’s why we are excited about being a member of CoSAI and helping discover and promote these practices within its own companies and the broader global ecosystem.”
– Daniel Riedel, Founder, GenLab Venture Studio

Google
“We’ve been using AI for many years and see the ongoing potential for defenders, but also recognize its opportunities for adversaries. CoSAI will help organizations, big and small, securely and responsibly integrate AI – helping them leverage its benefits while mitigating risks.”
– Heather Adkins, Vice President and Cybersecurity Resilience Officer, Google

IBM
“IBM is excited to join the Coalition for Secure AI (CoSAI), a new initiative that brings together industry leaders, organizations, and technology experts to develop standardized approaches to address AI cybersecurity. By participating in CoSAI, we are committed to fostering collaboration, innovation, and education, so that AI systems are more secure-by-design. This initiative will empower developers with the best practices, tools, and methodologies needed to safeguard AI solutions.”
– Alessandro Curioni, IBM Fellow, Vice President Europe and Africa and Director IBM Research Zurich

Intel
“The speed of AI innovation must be matched by the security of its creations. Intel is committed to advancing secure AI practices and doing so will require collaboration across the ecosystem. The Coalition for Secure AI (CoSAI) will provide security practitioners and developers with accessible guidance, resources and tools to create secure AI systems. We are proud to participate in this effort as a founding member alongside our CoSAI partners.”
– Dhinesh Manoharan, Vice President and General Manager, Security for AI & Security Research, Intel

Microsoft
“Microsoft remains steadfast in its commitment that safety and security be at the heart of AI system development. As a Founding Member of the Coalition for Secure AI, Microsoft will partner with similarly committed organizations towards creating industry standards for ensuring that AI systems and the machine learning required to develop them are built with security by default and with safe and responsible use and practices in mind. Through membership and partnership within the Coalition for Secure AI, Microsoft continues its commitment to empower every person and every organization on the planet to do more…securely.” 
– Yonatan Zunger, CVP, AI Safety & Security, Microsoft

NVIDIA
“As AI adoption continues to grow across industries, it’s paramount to ensure proper guidance and security measures when building and deploying models. As a founding member of the Coalition for Secure AI, NVIDIA is committed to building a community dedicated to making secure and trustworthy AI accessible to all.”
– Daniel Rohrer, VP of Software Product Security, Architecture and Research at NVIDIA

OpenAI
“Developing and deploying AI technologies that are secure and trustworthy is central to OpenAI’s mission. We believe that developing robust standards and practices is essential for ensuring the safe and responsible use of AI and we’re committed to collaborating across the industry to do so. Through our participation in CoSAI, we aim to contribute our expertise and resources to help create a secure AI ecosystem that benefits everyone.”
– Nick Hamilton, Head of Governance, Risk, and Compliance, OpenAI

PayPal
“PayPal is proud to partner with CoSAI to help shape the industry’s guidelines and standards for secure AI development. We are at the forefront of the ever-evolving cybersecurity landscape as we power about a quarter of the world’s e-commerce transactions every year. Ensuring that every transaction is safe and secure is our top priority. We are excited to collaborate with the coalition to develop comprehensive standards and practices that ensure safe, secure AI for everyone.”
– Shaun Khalfan, Chief Information Security Officer, PayPal

Wiz
“Like the early days of cloud, AI adoption has skyrocketed while governance and security must play catch up. Wiz believes in enabling organizations to tap into the transformative power of AI while staying secure. That belief is driving our participation in CoSAI, and we can’t wait to partner alongside so many thought leaders who are equally committed to the cause. The future is bright.”
– Ryan Kazanciyan, Chief Information Security Officer, Wiz

Media Inquiries:
Carol Geyer, carol.geyer@oasis-open.org

The post Introducing the Coalition for Secure AI, an OASIS Open Project appeared first on OASIS Open.


ResofWorld

Thailand’s big market for small trucks goes electric

Toyota and Isuzu have long dominated Thailand’s pickup truck market. As they prepare to launch EV trucks, they face competition from Chinese firms.
In the Thai beach town of Pattaya, travelers disembarking at the Bali Hai pier can hail a taxi or cram into a songthaew, a modified pickup truck. On a recent...

How Apple’s India gamble paid off

India’s growing middle class is fueling a billion-dollar sales surge.
A quick programming note: Exporter is going to be taking a summer break, but look for us back in your inbox in September. In the meantime, you can keep up...

DIF Blog

Decentralizing Trust in Identity Systems

DIF's Credential Trust Establishment Working Group has released a new white paper titled "Decentralizing Trust in Identity Systems", describing how to achieve scaleable trust relationships in decentralized identity networks. The problem of trust in a decentralized identity ecosystem comes down to the simple question of whether

DIF's Credential Trust Establishment Working Group has released a new white paper titled "Decentralizing Trust in Identity Systems", describing how to achieve scalable trust relationships in decentralized identity networks.

The problem of trust in a decentralized identity ecosystem comes down to the simple question of whether a credential verifier should trust the issuer of a credential. This problem becomes increasingly complex as networks expand to include many credential issuers and delegated trust relationships.
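At its simplest, that question can be modeled as a lookup against a list of issuers the verifier accepts for a given credential type. The sketch below is a toy illustration with hypothetical DIDs and credential types, not the Credential Trust Establishment data format itself; the white paper addresses how such trust lists are published, delegated, and governed at scale.

# Toy trust registry: credential type -> issuer DIDs this verifier accepts.
# All DIDs and credential types are hypothetical placeholders.
TRUST_REGISTRY = {
    "UniversityDegree": {"did:example:uni-a", "did:example:uni-b"},
    "ProofOfEmployment": {"did:example:acme-hr"},
}

def issuer_trusted(credential_type: str, issuer_did: str) -> bool:
    """The verifier's core decision: is this issuer trusted for this type?"""
    return issuer_did in TRUST_REGISTRY.get(credential_type, set())

print(issuer_trusted("UniversityDegree", "did:example:uni-a"))    # True
print(issuer_trusted("UniversityDegree", "did:example:mallory"))  # False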

This paper explores different trust network architectures, comparing risks, benefits, and tradeoffs. It highlights the advantages of the Credential Trust Establishment specification, offering practical recommendations for developing and managing trust networks.

Highlights

- Introduction to Trust Networks: A discussion on Trust Networks, with examples from various industries such as credit card networks, telecommunications, and online marketplaces.
- Relevance to Decentralized Identity: How Trust Networks facilitate trust in decentralized identity ecosystems.
- Architecture Comparisons: A comparison of API-oriented and data-oriented architectures, highlighting the advantages and disadvantages.
- Credential Trust Establishment: An overview of the Credential Trust Establishment specification and how to get started.

Read the White Paper

To read the full white paper, please see the Credential Trust Establishment White Paper.

Learn More

If you would like to get in touch with us or become a member of the DIF community, please visit our website.

Can't get enough of DIF?
| Follow us on Twitter
| Join us on GitHub
| Subscribe on YouTube
| Read our DIF blog
| Read the archives

Wednesday, 17. July 2024

Ceramic Network

Optimizing Ceramic: How Pairwise Synchronization Enhances Decentralized Data Management


In the past months we have replaced the algorithm at the heart of the Ceramic database. This post explains why we made the change from multicast to pairwise synchronization, but first let's review the design motivations of Ceramic.

“Information wants to be free. Information also wants to be expensive. ... That tension will not go away.” — Stewart Brand. The tension exists because data storage is a competitive market, but data retrieval can only be done by the service that has your data. At 3Box Labs, we want to catalyze a data ecosystem by making community-driven data distribution not only possible but available out of the box. Ceramic is a decentralized storage solution for apps dealing with multi-party data that is more scalable, faster, and cheaper than a blockchain.

Data vendor lock-in

Many organizations and individuals have data they want to publish, and Ceramic lets them do so without instant data vendor lock-in for storing their own data. In the Web2 era, data often becomes ensnared within exclusive services, restricting its accessibility and durability. Access to this data requires obtaining permission from the service provider. Numerous platforms have vanished over the years, resulting in significant data loss — GeoCities, Friendster, and Orkut among them. Even within still-existing companies like Google, numerous discontinued data products are documented; see Killed by Google.

We can break free from this risk by creating data-centric applications that multihome the data. Ceramic is the way to have many distinct data controllers publishing into shared tables in a trustless environment. Each reader can know who published what content and when they did, without relying on trusting the storage service to keep accurate audit logs. Since each event is a JSON document signed by a controller, timestamped by Ethereum, and in a documented schema, it can be preserved by any interested party, with or without permission from the storage vendor.

Multihome the data

In Ceramic we separate the role of data controllers from that of data servers. By allowing data to live on any preferred server, the data is durable as long as any server is interested in preserving it. This lets data outlive any particular data server, pairing the durability of data living in multiple places with the speed and reliability of operating on local data.

Document the schema

Throughout the history of the internet, we have witnessed numerous data services going away and taking their users' data with them. While multihoming helps preserve data, it's useless without the ability to interpret it.

Ceramic preserves data formats in two ways. First, the data lives in JSON documents, a format that allows us to reverse engineer and examine the data. Second, the model schema gets published. The model schema contains both a json-schema and a human-language description that the original developer can use to give machine and human context to the data. Preserving both the data and the schema means the data can be understood and new apps can be built to interact with it.

{ "data":{ "accountRelation":{ "type":"list" }, "description":"A blessing", "name":"Blessing", "relations":{ }, "schema":{ "$defs":{ "GraphQLDID":{ "maxLength":100, "pattern":"^did:[a-zA-Z0-9.!#$%&'*+\\/= ?^_`{|}~-]+:[a-zA-Z0-9.!#$%&'*+\\/=?^_`{|}~-]*:?[a-zA-Z0-9.!#$%&'*+\\/=?^_`{|}~- ]*:?[a-zA-Z0-9.!#$%&'*+\\/=?^_`{|}~-]*$", "title":"GraphQLDID", "type":"string" } }, "$schema":"https://json-schema.org/draft/2020-12/schema", "additionalProperties":false, "properties":{ "text":{ "maxLength":240, "type":"string" }, "to":{ "$ref":"#/$defs/GraphQLDID" } }, "required":[ "to" ], "type":"object" }, "version":"1.0", "views":{ "author":{ "type":"documentAccount" } } }, "Header":{ "controllers":[ "did:key:z6MkgSV3tAuw7gUWqKCUY7ae6uWNxqYgdwPhUJbJhF9EFXm9" ], "model":{ "/":{ "bytes":"zgEEAXFxCwAJaG1vZGVsLXYx" } }, "sep":"model" } }

Example schema document

Information retrieval 

The key to multihomed data is being able to retrieve the data from a server that has it.

How do we move the data from the servers that have it to the servers that are interested in storing it? When we first built Ceramic we used two multicast methods. The first was a gratuitous announcement of new data: send the data to EVERY node in the network so that each can store it if interested. The second: if a node did not know about a stream a user requested, it would multicast a request to the whole network and take the latest version that came back as the response.

This worked but had several drawbacks. First, requests for streams that a node did not know about used WAN traffic and had unpredictable latencies, so all applications needed to design for slow, unpredictable retrieval times. Second, a node had no way to retrieve a complete set of the streams that matched its interests. It could only listen to the multicast channel and fetch any stream it happened to hear about; any stream it missed, either because it was published before the node came online or during downtime, could be missed forever. Third, there is a performance cost to handling requests from nodes that have no mutual interest with your node. A node that did 100 events a year could not scale down, since it would need to keep up with filtering announcements from nodes doing 100 events a second. If we wanted to support both very large and very small data-centric applications we needed a new strategy. We even saw cases where a slow node that could not keep up on the multicast channel harmed the performance of larger, more powerful nodes.

To solve these problems of performance, completeness, and scalability we switched to a pairwise synchronization model. Each node advertises the ranges of streams it is interested in, and nodes synchronize pairwise, covering only the streams that are of mutual interest.

Scalability

Since the nodes synchronize pairwise, no slow node can harm the ability of two healthy nodes to complete a synchronization. If two nodes have no intersection in their interests, the conversation is done. A range of streams with hundreds of events per second that your node is not interested in will not create work for your node. A node only needs to scale to the rate of events in the ranges it is interested in, and the scale of any model you are not interested in costs you nothing. This solved our scale-up / scale-down objective.

Completeness

If the two nodes do have an intersection of interests, they continue the synchronization until both nodes have ALL the events that the other node had when the synchronization began. A node no longer needs to have been online either when a stream’s event was originally published or when some node queried for that stream: if the event is stored by either of the nodes, both nodes will have it at the end of the pairwise synchronization. Once a node has pairwise synchronized with each of the nodes advertising an interest range, that node has all of the events in that range as of the time of the synchronization. This solves the completeness objective.

More interestingly, the local completeness means that we can build local indexes over the events and do more complex queries over the events in the ranges nodes are interested in entirely locally.

Performance

Lastly, since we have a complete set of events for our interests, we can serve queries about the events from the local node with no need for WAN traffic. This solves the performance objective of predictable fetch latencies.

Pairwise Synchronization in Logarithmic rounds

In the multicast model, Ceramic sends messages to all other Ceramic nodes. One of the most notable differences with synchronization is that nodes do pairwise synchronization, one peer at a time. The two peers each send the other their interests. Both nodes filter the events they have to find the set of events of mutual interest. Once this intersection is found, we synchronize the set with a Range-Based Set Reconciliation protocol we call Recon.

We can report progress in a Recon synchronization by reporting the percentage of events in the `in sync` versus `syncing` ranges. Alternatively, we could render a bar, as in the diagram, showing which ranges are in which state.

This is a divide and conquer protocol. We start with the full intersection as a single range. We pull a range off the work list and send the (hash, count) of all events in the range to the other side. They compare their own (hash, count) and respond accordingly.

| We have | They have | Action |
| --- | --- | --- |
| hash_A | hash_A | Done. `in sync` |
| 0 | hash_A | Send a request for the events. `Don’t have` |
| hash_A | 0 | Send the events. `in sync` |
| hash_A | hash_B | Split the range; push sub-ranges from the split onto the work list. Each range `syncing` |

The range splits are handled differently on the Initiator than on the Responder. The Initiator maintains the work list and pushes all of the sub-ranges onto it. The Responder just sends a message back with multiple ranges and a hash for each range. This keeps the synchronization state on the Initiator and reduces the burden on the Responder to a stateless call and response, fitting Recon into the HTTP client-server request-response paradigm.
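The sketch below illustrates the divide-and-conquer loop in miniature. It is a toy in Python, not the production implementation: both sides are modeled as sorted lists of event IDs held locally, whereas in the real protocol only the (hash, count) summaries and the missing events cross the wire.

import hashlib
from bisect import bisect_left

def digest(ids, lo, hi):
    """(hash, count, events) for sorted event IDs in the half-open range [lo, hi)."""
    i, j = bisect_left(ids, lo), bisect_left(ids, hi)
    acc = 0
    for eid in ids[i:j]:
        acc ^= int.from_bytes(hashlib.sha256(eid.encode()).digest(), "big")
    return acc, j - i, ids[i:j]

def recon(ours, theirs):
    """Toy pairwise sync of two sorted event-ID lists: returns (need, give)."""
    need, give = [], []
    work = [("", "\uffff")]             # the full intersection starts as one range
    while work:
        lo, hi = work.pop()
        h1, n1, mine = digest(ours, lo, hi)
        h2, n2, yours = digest(theirs, lo, hi)
        if h1 == h2:
            continue                    # equal hashes: range is in sync
        elif n1 == 0:
            need += yours               # we have nothing here: request all
        elif n2 == 0:
            give += mine                # they have nothing here: send all
        elif n1 + n2 <= 2:              # too small to split: exchange outright
            need += [e for e in yours if e not in mine]
            give += [e for e in mine if e not in yours]
        else:
            union = sorted(set(mine) | set(yours))
            mid = union[len(union) // 2]    # split at a median key and push
            work += [(lo, mid), (mid, hi)]  # the sub-ranges on the work list
    return need, give

print(recon(["a", "b", "d"], ["a", "c", "d"]))  # (['c'], ['b'])

Each mismatching range is halved until the differences are isolated, which is what keeps the number of rounds logarithmic in the size of the shared range.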

Exponential distribution

Now that we have replaced a multicast message to all nodes in the network with pairwise sync it is reasonable to ask if we have broken the exponential distribution we got from multicast trees.

How fast can data spread through the network? Now that we have replaced the multicast channel with pairwise connections, how do we match the exponential distribution of the multicast channel? 

We get this property because each node cycles through connecting to all other nodes that advertise overlapping interests. When a node first receives an event from a client, there is 1 copy on the network. After the first sync there are 2. Then both of those nodes sync with new nodes, giving 4. This grows exponentially until almost all interested nodes have the data. At that point, the odds that a node with the event calls a node without it are small, but the odds that a node without the event calls a node with it are large. By using synchronization we get the benefits of both push and pull gossip protocols: push, which is fast when knowledge of the event is rare, and pull, which is fast when knowledge of the event is common.
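A small simulation makes this tangible. It is a toy model with arbitrary parameters, not the actual Ceramic scheduler: every node syncs with one uniformly random peer per round, and coverage reaches the full network in roughly log2(N) rounds.

import random

def rounds_to_full_coverage(n_nodes, trials=50):
    """Average rounds until all nodes have an event that starts on one node."""
    total = 0
    for _ in range(trials):
        have = {0}                      # one node starts with the event
        rounds = 0
        while len(have) < n_nodes:
            rounds += 1
            for node in range(n_nodes):
                peer = random.randrange(n_nodes)
                if node in have or peer in have:
                    have.add(node)      # a pairwise sync leaves the event
                    have.add(peer)      # on both ends of the connection
        total += rounds
    return total / trials

for n in (16, 256, 4096):
    print(n, rounds_to_full_coverage(n))  # grows roughly like log2(n)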

Summary

By using set reconciliation to perform pairwise synchronization over nodes’ overlapping interests, we achieve performance, completeness, and scalability: the predictable performance of querying local data on your node, the completeness of synchronizing all events of interest preemptively, and the scalability of not synchronizing events that lie outside a node’s interests. Pairwise synchronization also prevents slow nodes from slowing down the rest of the network. It is now possible to scale up or down without performance or completeness problems. This enables developers to build data-intensive applications without data vendor lock-in from either the storage-providing service or the application that originally defined the schema.


ResofWorld

Bhutan’s first AI startup is seven college kids in a dorm

NoMindBhutan services prominent clients like the Bhutan National Bank and Drukair - Royal Bhutan Airlines.
When college students Ugyen Dendup and Jamphel Yigzin Samdrup launched their startup last year, they had yet to learn that they would spend most of their time servicing some of...

Tuesday, 16. July 2024

FIDO Alliance

Case Study: Wedding Park Deploys Company-Wide Passwordless Authentication for Internal Cloud Service Logins


Corporate overview:

Wedding Park Co., Ltd. was founded in 2004 with the management philosophy of “Making marriage happier.” Celebrating its 20th anniversary in 2024, it started as a wedding review information site and has since expanded its operations. Utilizing a wealth of information, it operates several wedding-specialized media, including the wedding preparation review site Wedding Park. In addition, it runs various businesses in the realm of weddings combined with digital technology, such as internet advertising agency services, digital transformation (DX) support, and educational ventures.

Background and challenges leading to deployment

Wedding Park faced two challenges: strengthening the security of the multiple cloud services used for internal operations, and the complexity of password management. To address these issues, the company introduced an ID management service and consolidated its cloud services behind a single entrance with single sign-on.

The impetus for deploying FIDO authentication came when Salesforce — used for customer management, order and supply systems, and time and attendance management — announced that multi-factor authentication (MFA) would become mandatory. However, if MFA were applied only to Salesforce while other cloud services continued to operate with password authentication, usability would deteriorate and the IT management department's work would become more complicated. Given the inherent vulnerability of password-only authentication, the company decided to apply MFA to all cloud services, including Salesforce, in line with the zero-trust security policy it adopted in February 2020.

Selection and verification of an authenticator

As an authentication method for MFA, the company considered one-time password authentication (OTP) and biometric authentication using smartphone applications, but ultimately decided to deploy passwordless authentication using FIDO for its unique ability to improve both security and user convenience.

Realizing passwordless authentication with FIDO requires a terminal equipped with a FIDO-compatible biometric authentication device. The majority of devices currently on the market support FIDO authentication, and adoption was helped by the fact that almost all in-house devices were already equipped with Windows Hello or Touch ID. For the few employees whose devices lack biometric features, separate external authenticators were provided.

A step-by-step changeover for each department

After evaluating authenticators, the company officially launched its plan to deploy passwordless authentication company-wide in January 2022. The transition took place from February to March of that year; the smooth implementation in just one month was made possible by a department-by-department rollout and generous support from the IT management department. For the implementation, the company relied on CloudGate UNO, the identity management platform by International System Research Corporation (ISR) that it has used since 2012, because it supports passwordless authentication using FIDO2 as well as biometric authentication via a smartphone app.

The internal rollout began with the development department and gradually progressed to departments with larger headcounts. First, at each department's regular meeting, the company explained why the system was being introduced and its benefit — deployment would make daily authentication more convenient — and gained understanding across the company. Rolling out department by department not only limited the number of people the IT management department had to support at any one time, but also allowed Q&A to accumulate as test cases and manuals to be maintained smoothly, since the rollout started with the development department, which had strong IT skills.

Thanks to close follow-up by the IT management department — which not only prepared materials, but also checked progress on the administrator website as needed and individually approached employees who had not yet registered their authenticators — the company was able to implement the system company-wide within the targeted time frame.

Effects of introduction

Login errors due to mistyped passwords, which used to occur about 200 times a month, have been reduced to zero since the deployment of FIDO authentication. Many employees commented that the system has become very convenient, eliminating authentication failures due to forgotten passwords or typing errors. In addition, the number of periodic password reset requests has decreased, reducing man-hours for the administrator.

Passwordless authentication is smooth, and although the authentication status retention period was shortened to further enhance security, the system has continued to operate without problems.

Wedding Park’s future vision is to link all cloud services used within the company to CloudGate UNO and centrally manage them, including authentication.

Akira Nishi, General Manager of the Corporate IT Office, who spoke with us about this case study, made the following comments.

“For those who are considering deploying a new authentication method, there is inevitably a concern that a change in authentication method will cause large-scale login failures. In our case, in the early stages of the project, we held explanatory meetings for each department and repeatedly refined our explanatory materials and procedures, which was effective in minimizing confusion and anxiety within the company.

“After the switchover, we continued to check on the progress of the implementation and followed up with each department individually. Once the use of passkeys (device-bound passkeys) became standard within the company, we felt that the scope of use, including various security measures, was expanding dramatically.”

Download the case study

Ceramic Network

New Ceramic release: ceramic-one with new Ceramic Recon protocol


The Ceramic protocol has undergone a series of updates over the past few months, all focused on improving performance and scalability, enabling developers to build applications that work better and faster. Today, the core Ceramic team is excited to share these updates with the community by announcing the release of ceramic-one.

About the release

The new release of Ceramic includes a data synchronization protocol called Recon, implemented in Rust. This new implementation of the Ceramic protocol enables data sharing between nodes and allows developers to run multiple nodes that stay in sync and are load balanced. All this facilitates highly available Ceramic deployments and reliable data synchronization.

To utilize the Recon protocol for their applications, developers are provided with a binary called ceramic-one.

This new implementation of the Ceramic protocol offers significant performance and stability improvements. Additionally, this release marks a significant shift in making the Ceramic architecture more robust, allowing the team to iterate on and build new protocols in the future.

The new Recon protocol

Recon is a new data synchronization protocol used for synchronizing stream events in the Ceramic network, implemented on top of libp2p. Stream sets bundle multiple streams together, allowing nodes with a common interest in certain streams to synchronize efficiently.

Before Recon, Ceramic nodes broadcasted updates to streams to every node in the network using a simple libp2p pubsub topic. Due to the single channel, nodes would receive stream event announcements they were not interested in, imposing a significant overhead on every node. Additionally, the network's throughput was limited by bandwidth, which led to either prioritizing high-bandwidth nodes or greatly limiting the network throughput to support low-bandwidth nodes.

Recon provides low to no overhead for nodes with no overlap in interest, while retaining a high probability of receiving the latest events from a stream shortly after any node has the events, without any need for remote connections at query time. By shifting updates from the pubsub channel to a stream set, interested nodes can synchronize without burdening uninterested ones. Stream sets also enable sharding across multiple nodes, allowing synchronization of only sub-ranges, which distributes the storage, indexing, and retrieval workload.

Additionally, nodes need to discover peers with similar interests for synchronization. Recon achieves this through nodes gossiping their interests and maintaining a list of peers' interests, ensuring synchronization with minimal bandwidth. Nodes also avoid sending event announcements to uninterested peers.
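As a minimal sketch of that first step (using a hypothetical key format, not ceramic-one's actual API), two peers can decide whether to talk at all by intersecting their advertised interest ranges:

def interest_overlap(a, b):
    """Intersect two half-open interest ranges (lo, hi) over a sortable key space.
    Returns the shared range, or None if the peers have no mutual interest
    and can skip synchronization entirely."""
    lo, hi = max(a[0], b[0]), min(a[1], b[1])
    return (lo, hi) if lo < hi else None

# Hypothetical interest ranges advertised by two peers:
mine = ("model-aaa", "model-mmm")
peer = ("model-kkk", "model-zzz")
print(interest_overlap(mine, peer))  # ('model-kkk', 'model-mmm')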

Performance and robustness improvements

This release, along with the recent Ceramic Anchor Service (CAS) updates, marks significant scalability improvements. Currently, Ceramic provides a throughput of 250 TPS (transactions per second), more than double the previous throughput of up to 100 TPS before the Recon implementation. This increase in throughput is especially important for applications that handle large amounts of user data and require fast transaction times.

These numbers were measured between two nodes that share the same interest. It’s worth noting that nodes without overlapping interests do not affect each other's throughput, which means that, in theory, the throughput of a ceramic-one node scales horizontally. However, one component still puts an upper limit on this: the CAS, which is operated by 3Box Labs. This service is currently a centralized bottleneck in the protocol, which is why the team’s next goal is Self-Anchoring, allowing any ceramic-one node to operate completely independently.

This release of Ceramic is also a significant step towards making the Ceramic architecture more robust, enabling the team to iterate on it and build new protocol implementations more easily and quickly.

Getting started with ceramic-one

All new Ceramic developers are recommended to use ceramic-one to start building on Ceramic. Check out the setup guides in the Ceramic documentation to get started.

Developers who have been building on Ceramic for a while are encouraged to migrate their applications to the ceramic-one-based implementation. Check out the migration guide to follow the migration steps.

Share your feedback with us!

We would like to get your feedback on building on Ceramic. Do you have suggestions or ideas for how the core Ceramic team can improve the implementation of Ceramic? Do you have questions or trouble using the new release or migrating your existing application? Share your thoughts and ideas with us by posting on the Ceramic Community Forum.


FIDO Alliance

UX Webinar Series: Essentials for Adopting Passkeys for your Consumer Authentication Strategy


In part one of this four-part webinar series, attendees learned why major service providers are adopting passkeys as the foundation of their consumer authentication strategy. This webinar is for a nontechnical audience. It is intended to help you investigate the nuances of passkey roll-out strategies and end user experiences (UX) for consumers.

Join this webinar to:

- Learn best practices to meet end-user needs with passkeys
- Learn how to reduce costs with passkeys
- Learn how passkeys create a long-term authentication strategy built on standards

This webinar is for:

- Product managers
- IT managers / leaders
- Security Analysts
- Data Analysts

UX Webinar Series: Aligning Authentication Experiences with Business Goals


In the second of a four-part webinar series, attendees learned how to adapt your authentication experiences to better solve key metrics for consumer authentication. This webinar is for a nontechnical audience seeking user interface and workflow guidance for consumer authentication.

View the webinar slides to:

- Learn how to execute a passkey strategy that solves business goals and end-user needs
- Learn how to use the FIDO Design Guidelines to jump-start your concepts and socialize them to win stakeholder alignment within your organization
- Watch real users using passkeys for the first time and learn how to use passkey usability research findings to demystify passkey experiences and align requirements amongst your teams

This webinar is for:

- Developers
- Designers
- Content Strategists

UX Webinar Series: Drive Revenue and Decrease Costs with Passkeys for Consumer Authentication


In the third of a four-part webinar series, attendees learned how to drive revenue and decrease costs with passkeys for consumer authentication. This webinar is for a nontechnical audience seeking to make sound business decisions for new consumer authentication strategies.

View the webinar slides to:

- Learn how to significantly increase first-try consumer sign-in success and speed to sign in
- Learn how to align your teams around user experience patterns proven to be easy for consumers
- Learn how to mitigate threats of phishing, credential stuffing, and other remote attacks
- Learn how to offer passkeys without needing passwords as an alternative sign-in or account recovery method

This webinar is for:

- Authentication product leaders
- Chief Technology Officers (CTO)
- Chief Marketing Officers (CMO)
- Senior Vice Presidents


DIF Blog

Guest blog: Steve McCown


Anonyome Labs was founded in 2014 to give people control and freedom over their personal and private information. Based in California, Utah and Australia, the company has deep expertise in security, identity management, authentication and authorization, cloud, privacy, and cryptography, and equips businesses with cutting-edge privacy and cybersecurity solutions that seamlessly integrate with existing offerings and systems. 

Anonyome Labs Chief Architect and DIF Steering Committee member, Steve McCown, talked to us about how the company is using Decentralized Identity to drive interoperability and usability of their products, his involvement in DIF, and decentralized identity standards work. 

What does Anonyome offer, and who are your customers? 

We started 10 years ago, when we noticed that people’s personal and private information was being collected and used without their permission. There were certain contact points that industry was using to triangulate people’s behavior, such as phone numbers and email addresses. So we developed an app called MySudo that gives users the ability to create contact sets and associated pseudonymous profiles called ‘Sudos’, consisting of an email address, phone number and a one-time or reusable credit card number. 

People already had separate home and work contact details, so we took it further. With MySudo, you can create Sudos for your online purchases, for hobbies, for travel, and so on. Our mission was to create separate IDs, so when a hacker steals data from websites, or data brokers buy and sell your online activity data, they can’t correlate your home activities with your work activities, or your social activities with products you purchase. This increases your privacy by disrupting how your data can be correlated. 

Individuals use MySudo to privately communicate back and forth, share files, and so on — for all their usual digital activities. We also have an Enterprise-grade platform with all the same capabilities, which we license to organizations who incorporate these privacy-enhancing services into their own offerings. There are some very notable companies you would have heard of that are providing a white label version of our privacy-enhancing technologies to their customers. We’ve also been working with national governments to help preserve their citizens’ and commercial entities’ private data. 

What got Anonyome interested in Decentralized Identity? 

Anonyome is focused on creating strong privacy-enhancing and secure identity tools for everyday users.  The MySudo product is a great example of how internet users can control their privacy during everyday activities, such as email, texting, calling, and purchasing.  The critical privacy and security requirements for these elements are greatly enhanced with decentralized identity elements and protocols.  

In a previous job, I worked for the US Department of Energy as a cybersecurity exploit researcher, so that’s part of my mindset. How would someone take advantage of existing identity paradigms? It’s not just about all the good that everyone is trying to accomplish, it’s also about how an adversary might leverage identity technologies for illicit purposes. That concern is what led me to engage with Decentralized Identity and keeping the crypto keys, passwords, tokens, etc. within your own digital wallet. Because if you control the keys, you can better control access to your data assets. 

The other thing that got Anonyome on this track is that strong crypto environments like Signal were emerging as closed ecosystems. If someone wants to use Signal, they also need to convince others to adopt it, too.  For a lot of users, that can be really hard. Decentralized Identity provides a way to create an open encryption environment and keep the strong cryptography while also bridging across ecosystems.

While we may have competitors in parallel spaces, it benefits everyone if we can increase secure communication between applications that users enjoy using — this secure cross talk between applications becomes a rising tide that floats all boats … customers, companies, everyone. 

Interoperable security is the next evolution that extends beyond closed secure ecosystems, and we’re working to be interoperable with lots of other decentralized identity providers and users. This is why Anonyome has tasked me to work in the standards orgs. If we can collectively solve security and privacy issues at the identity standards level, then we will have a way to realize our goals for secure and private interoperability between applications … and people. 

What is driving Decentralized Identity uptake, in your view? 

There’s tremendous interest in Decentralized Identity. I credit a lot of this to GDPR. The EU has assigned significant liability and penalties for data breaches. Faced with these very large fines, companies are trying to figure out what they need to do to protect users' privacy, since this now affects their bottom lines. While some are doing the bare minimum, others are doing a lot more. There’s also a lot happening in Europe with the Digital Identity Wallet. We want to be interoperable with these standards and the services that implement them.  So we've been reviewing the EU’s Architecture and Reference Framework (ARF) to find out what identity, credentials, proof types and so on we need to work with. 

Privacy at the legislative level is no longer just an EU thing. For example, new laws in Utah (US) mandate that state government organizations collect only the data they actually need, keep it only as long as authorized, and then destroy it unless retention regulations require otherwise. I serve on the Utah Privacy Commission, and we’re very strong proponents of this. We receive presentations from state agencies and give our feedback on what enhances privacy and what needs additional work. There is a real appetite among government officials for better privacy.

There’s also growing awareness of the potential problems associated with working in the cloud. For example, if you store your data in the cloud, it may be encrypted, but where do the keys go? If you don’t control access to the data access keys (or a provider manages them as part of their service), then your information may be insecure without you really knowing it. Recently there have been some very large data breaches, seemingly about once a month. In response, people receive emails saying “your data has been stolen, so here’s 3 to 6 months of free credit monitoring”. This is close to security theater: companies are doing something, but it’s not very useful and doesn’t solve the problem.

Decentralized Identity can augment many existing services.  For example, DI capabilities can be used to encrypt my files before they are sent to a cloud storage provider.  The cloud storage provider can then work their storage, retrieval, and replication magic without having access to the unencrypted files. As long as I control the encryption keys, my data can only be decrypted by me. 
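
As a rough illustration of that pattern, here is a minimal Python sketch, assuming a symmetric key that never leaves the user’s device; the upload call is a hypothetical placeholder rather than any particular provider’s API:

```python
# Minimal sketch of client-side encryption before cloud upload.
# Assumes the third-party `cryptography` package; the upload call
# is a hypothetical placeholder, not a real provider API.
from cryptography.fernet import Fernet

def encrypt_for_cloud(path: str, key: bytes) -> bytes:
    """Encrypt a local file with a key that never leaves the device."""
    with open(path, "rb") as f:
        return Fernet(key).encrypt(f.read())

def decrypt_from_cloud(ciphertext: bytes, key: bytes) -> bytes:
    """Only the holder of the key can recover the plaintext."""
    return Fernet(key).decrypt(ciphertext)

key = Fernet.generate_key()      # held in the user's wallet, not the cloud
blob = encrypt_for_cloud("notes.txt", key)
# upload_to_provider(blob)       # provider only ever stores ciphertext
```

The provider can still replicate and retrieve the blob as usual, but without the key it only ever sees ciphertext.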

That’s just what’s possible with file storage. Decentralized Identity is also delivering a wide range of interoperable privacy-enhancing capabilities for communications systems, identity control, access management, digital credentials, and so on. 

That’s what I see happening with DI. If implemented as designed, we’re going to put privacy and security control in the users’ hands. Then we can continue to enjoy many wonderful cloud services while we will be able to control access to our own data. 

What is the value of open standards for Anonyome and your customers? 

We’re huge fans of open standards. We try not to build systems that are proprietary from an interoperability perspective. We strive to ensure that all of our interoperability points are based on industry standards, so that we can communicate with other platforms and they can easily communicate with ours.  Embracing DI standards means if a user wants to use a particular platform, then they don’t have to convince others to use it before it's useful.

As a quick analogy, we’re aiming for the interoperability of email combined with the strong security and authentication of a secure communications application. Pick up any old email app and it will work with almost any other, but the security isn’t there. If you work for a large enterprise, they may have put something like S/MIME in place, but when you or I get a regular Gmail account, that’s not something we typically add, primarily because it’s too hard for users to manage the certificates and so forth. Today, this means that emails are typically transmitted in the clear and not end-to-end encrypted. DI facilitates privacy-enhancing and more secure interactions, which is a key reason why we’re working with these technologies.

As someone who is involved with DIF, W3C (the World Wide Web Consortium) and the Trust over IP Foundation, what do you see as their respective roles, and the differences between them? 

We’re super excited about each of these organizations.  W3C has spent a great deal of time and effort to create DI’s main building blocks, namely, DIDs and Verifiable Credentials.  W3C is continuing to actively refine and extend these elements in order to facilitate many enhanced DI capabilities.

Trust over IP has created an excellent DI paradigm for illustrating how all of the DI components connect and interact. These layers depict how DIDs are anchored in a Verifiable Data Registry (such as a decentralized ledger), how DID-based communication takes place, how Verifiable Credentials fit into the ecosystem, and finally how a top-layer governance model shows all participants in a system what the system’s rules are.

What brought me to DIF was getting involved in DIDComm (Decentralized Identifier Communications), which I see as one of the main attractions in DI. After I had been participating for a while, I volunteered to become a co-chair with Sam Curren. This gave me new insights into the community standards-building processes and, in particular, key details of the DIDComm protocol. Later, Sam nominated me as a candidate during the Steering Committee elections, and I am honored to have been elected to the SC.

I see DIF as one of the leading development organizations where implementation happens. While other organizations focus primarily on creating a range of standards and documents, which is vital, DIF typically focuses on producing a variety of working software that is based on industry standards such as Decentralized Identifiers (DIDs) and Verifiable Credentials (VCs).  This makes DIF a key provider in the larger symbiotic DI industry.  

All of the various standards organizations are distinct and have different missions. DIF fills a critical void that traditional standards organizations don’t typically emphasize. The term incubator is a little outdated, but that’s part of the role that DIF performs. Everything in DI has started with a few people getting together and having a conversation about how to design, build, or enhance some particular technology element. This leads to standards being created.  At some point, usable code libraries are created using the standard base elements and then those are combined into larger protocols, services, environments, and so on. 

There’s a whole lot that needs to happen in this process, and a lot of that work happens at DIF. 

If your goal is implementation right now, that’s where DIF excels. It’s where companies come to pick up architectural designs and reusable code libraries they can use in their products today. 


FIDO Alliance

UX Webinar Series: Passkeys Design Guidelines AMA (Ask Me Anything)!

In the final edition of a four-part webinar series attendees had the opportunity to ask FIDO Alliance subject matter experts anything in an: “Ask Me Anything” format!

Speakers answered audience questions for the full hour to provide actionable guidance for the use of passkeys for consumer authentication.

Phase 1: Identity needs and the “password problem”
Phase 2: Research and Screen Ideas
Phase 3: Concept and Prototype
Phase 4: Build and Test
Phase 5: Release and Optimize

This webinar is for:

Authentication product leaders
Chief Technology Officers (CTO)
Chief Marketing Officers (CMO)
Senior Vice Presidents
Designers
Content Strategists
Product managers
IT managers / leaders
Security Analysts
Data Analysts

Elastos Foundation

Elastos BIT Index: Donald Trump is the “Bitcoin President” for US and Global Tech-Savvy Consumers as Bitcoin Goes Mainstream

US and global respondents perceive Donald Trump as the Most ‘Crypto-Aware’ and ‘Crypto-Ready’ Presidential Candidate, outpacing Joe Biden and Robert F. Kennedy

Singapore: July 16th, 2024 – Here at Elastos, the premier SmartWeb ecosystem, we are excited to reveal the latest findings from our BIT Index (Bitcoin, Innovation & Trust). This report underscores that Donald J. Trump is perceived as the leading figure among US tech-savvy consumers for his understanding and readiness to embrace Bitcoin.

Key Findings:

38% of respondents anticipate Bitcoin becoming mainstream within four years.
80% foresee Bitcoin evolving into a ‘default’ global currency.
Bitcoin adoption is progressing more rapidly in BRICS and Global South Nations compared to Western Nations.

Who is the Bitcoin President?

In the US, 50% of tech-savvy consumers recognize Donald Trump as the most ‘crypto-aware’ presidential candidate, demonstrating a profound understanding of Bitcoin’s intricacies and advantages, compared to Joe Biden (32%) and Robert F. Kennedy (19%).

Globally, the perception remains consistent:

Donald Trump: 51%
Joe Biden: 31%
Robert F. Kennedy: 19%

Demographic insights reveal that younger consumers (18-24) are slightly less inclined to view Trump as ‘crypto-aware’ (45%), compared to 25-34 year-olds (54%) and 35-44 year-olds (54%). Biden (34%) and Kennedy (21%) see a minor increase among the 18-24 demographic.

Trump is also viewed as the most ‘crypto-ready’ candidate in the US:

Donald Trump: 49%
Joe Biden: 30%
Robert F. Kennedy: 21%

Globally, the figures are:

Donald Trump: 51%
Joe Biden: 29%
Robert F. Kennedy: 20%

Again, younger demographics (18-24) show less support for Trump as ‘crypto-ready’ (47%) compared to 25-34 year-olds (51%) and 35-44 year-olds (54%). Kennedy receives a slight uplift from 18-24 year-olds (24%).

Trump is also seen by 42% of US respondents as the candidate most likely to promote the use and benefits of Bitcoin compared to Joe Biden (23%) and Robert F. Kennedy (14%).

Globally, Trump’s support from younger voters (18-24) is lower (37%) compared to 25-34 year-olds (43%) and 35-44 year-olds (45%), while Kennedy sees a slight increase from 18-24 year-olds (17%).

Internationally, Nigerian respondents (59%), followed by the UK (56%) and Germany (54%), believe Trump is the most ‘crypto-ready’, compared to only 42% in India.

Bitcoin Going Mainstream

More than a third of tech-savvy consumers (38%) believe Bitcoin will become mainstream within four years. This belief is higher among 25-34 year-olds (41%) and 18-24 year-olds (40%).

A significant 80% foresee a future where Bitcoin becomes a ‘default’ currency for global transactions, including commodities, real estate, and company valuations.

BRICS Nations and Global South Leading in Bitcoin Adoption

24% of tech-savvy Indian consumers and 26% of UAE respondents use Bitcoin daily, compared to the global average of 18%. In contrast, only 11% of Germans, 13% of UK respondents, 14% of South Koreans, and 15% of US tech-savvy consumers use Bitcoin daily.

High acceptance is observed in the UAE and Brazil (49%) for Bitcoin going mainstream within four years, compared to 22% in Germany, 25% in South Korea, and 36% in the UK.

91% of Nigerians and 90% of Indians envision Bitcoin as a ‘default’ currency, compared to 70% in Germany, 73% in the UK and South Korea, and 75% in the US.

About Elastos

Elastos is a public blockchain project that integrates blockchain technology with a suite of reimagined platform components to produce a modern Internet infrastructure that provides intrinsic protection for privacy and digital asset ownership. The mission is to build accessible, open-source services for the world, so developers can build an internet where individuals own and control their data.

The Elastos SmartWeb platform enables organizations to recalibrate how the Internet works for them to better control their own data.

https://elastos.info

https://www.linkedin.com/company/elastosinfo/


ResofWorld

Kenya’s biggest protest in recent history played out on a walkie-talkie app

More than 40,000 Kenyans have downloaded Zello since protests began against the government’s plan to raise taxes.
Betty had never heard of the Zello app until June 18. But as she participated in Kenya’s “GenZ protests” that month — one of the biggest in the country’s history...

Monday, 15. July 2024

MOBI

Honda, Mazda, Nissan complete 1st phase of web3 global battery passport MVP

Nissan, Honda eye EV battery passports in Europe by 2027 Read on the LedgerInsights website

MOBI, the mobility web3 consortium, has completed the first phase of the minimum viable product (MVP) for its global battery passport. This involved nine members including DENSO, Honda, Mazda, and Nissan exchanging battery identity and data using open standards.

Stepping back, if batteries are to achieve the hoped-for environmental impact, how they’re manufactured and recycled is a critical part of the process. Hence the need to keep track of battery lifecycles, as mandated in the US and Europe. Digital credentials can include the battery’s composition, status and history.

What MOBI is trying to do is sidestep the need for costly one-off integrations and achieve cross-industry interoperability.

Stage 1 implemented the Integrated Trust Network (ITN) identity services, which enable companies to exchange battery identity and data on a peer-to-peer basis. The ITN is a federated registry built and operated by the members using W3C decentralized identifiers (DIDs). It’s a self-sovereign identity (SSI) offering that supports multiple blockchains. The SSI aspect means the data is stored with the battery rather than in a centralized database.
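
For a flavour of what a DID-anchored battery identity might look like, here is a minimal sketch of a W3C-style DID document expressed as a Python dict; the did:example method and every field value are hypothetical placeholders, not the ITN’s actual schema:

```python
# Hypothetical W3C-style DID document identifying a single battery.
# The "example" DID method and all values are placeholders.
battery_did_document = {
    "@context": "https://www.w3.org/ns/did/v1",
    "id": "did:example:battery-4f2a9c",                 # the battery's DID
    "controller": "did:example:manufacturer-123",       # current custodian
    "verificationMethod": [{
        "id": "did:example:battery-4f2a9c#key-1",
        "type": "JsonWebKey2020",
        "controller": "did:example:battery-4f2a9c",
        "publicKeyJwk": {"kty": "EC", "crv": "P-256"},  # key material elided
    }],
}
```

Verifiable credentials about composition, state of health, and provenance can then be issued against that identifier and checked peer to peer, without a central database.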

“Today’s global battery value chain is complex and it’s difficult to simultaneously ensure efficiency, scalability, safety, circularity, and regulatory compliance. To balance these priorities, we need to enhance battery lifecycle management through the creation of a shared ecosystem with SSI framework for secure coordination and selective disclosure of sensitive data,” said MOBI CEO and Co-founder, Tram Vo.

Stage 2 will involve trialing Citopia, MOBI’s web3 marketplace for services. This will support applications such as enhanced battery and carbon credits management and vehicle-to-grid communications and payment. It will also support other mobility solutions, such as data-driven pricing for used electric vehicles.

Meanwhile, MOBI has developed several standards. The most relevant is its battery passport standard released a year ago. Perhaps its most foundational standard is for vehicle identity. Other standards include tracking emissions for trips, supply chain blockchain standards and blockchain in vehicle financing.

The post Honda, Mazda, Nissan complete 1st phase of web3 global battery passport MVP first appeared on MOBI | The Web3 Economy.


Elastos Foundation

Elastos at Bitcoin 2024 Conference in Nashville!

We are pleased to announce that Elastos, under the BeL2 initiative, will be attending the Bitcoin Conference in Nashville, taking place from July 25-27, 2024, at the Music City Center, less than two weeks away! This participation is the result of a collaborative effort supported equally by our Cyber Republic (our DAO) via Proposal 145 and the Elastos Foundation, spearheaded by Mark Blair, longstanding CR Council Member and Head of Strategy at BeL2. In this article, we outline why this is so important for us, the details of our participation and how you can get involved! Let’s dive right in!

Transforming Bitcoin: Elastos’ Vision and the Power of BeL2

Elastos is dedicated to transforming Bitcoin into a smart, versatile asset while preserving its core principle of decentralisation. BeL2 is a powerful new technology that leverages zero-knowledge proofs (ZKP) and a decentralised clearing network to enable advanced DeFi applications for native Bitcoin (NB). The mission is to maintain Bitcoin’s integrity as the ultimate decentralised system while unlocking its potential for complex financial transactions, staking, and smart contracts​​​​.

BeL2 achieves this through a unique model where information, not assets, is transmitted across chains. This ensures that Bitcoin remains on its main network, maintaining its security and decentralisation. By utilising light client verification and zkBTC full nodes, BeL2 supports a range of Layer 2 applications, from lending to stablecoin issuance, paving the way for a new decentralised financial system akin to the historical Bretton Woods system, which, rather than being backed by gold, is backed by the 21st-century digital gold equivalent, Bitcoin.

Prime Location: Visit Elastos at Booth 621

Our booth at the conference has been officially confirmed, and we have secured a prime location on the conference floor. Booth 621 is the first large booth on the left as you enter, conveniently situated near the entrance, main stage, and the “Nakamoto Stage.” It’s also adjacent to the “Zen Lounge,” ensuring high visibility and foot traffic. This strategic positioning allows us to showcase our innovative solutions to a large audience, expected to be over 35,000 attendees​​!

At our booth, visitors will find exciting merchandise, fun giveaway prizes, and interactive games, including a spin-the-wheel activity. We will showcase alongside Elastos partners and stakeholders and also distribute a brochure (attached in this article) that highlights Elastos and its significant relationship with Bitcoin, emphasising ELA’s status as a merged-mined coin with over 50% of Bitcoin’s hash power​​. Sasha Mitchell, the Head of BeL2 and founder of Elacity, will be speaking at the event, providing deeper insights into our technologies and future plans. The event will feature influential leaders such as Donald Trump, Robert F. Kennedy Jr., Cathie Wood, Michael Saylor, Russell Brand, and Edward Snowden. This year is shaping up to be a very interesting event, given the US elections in November and Web3 becoming political.

BTC Nashville represents an important moment for Elastos and the BeL2 project as we continue to push the boundaries of what Bitcoin can achieve. We invite everyone to visit our booth, engage with our team, and get excited for the future of decentralised finance with Elastos and BeL2. For more information about our participation and updates, stay tuned to our official channels. We look forward to seeing you in Nashville on the 25th! Excited to learn more? Follow Infinity for the latest updates; we will keep you updated throughout the whole week!


We Are Open co-op

Building and Sustaining Engagement with the Digital Credentials Consortium

Developing communications for your organisation

This summer WAO ties a bow around a body of work we’ve been doing together with the Digital Credentials Consortium (DCC). This initiative is hosted at MIT and has member universities from around the world.

The Digital Credentials Consortium is advancing the use and understanding of portable, verifiable digital credentials in higher education through open source technology development and leadership, research, and advocacy.

The DCC plays a pivotal role in the definition and establishment of the W3C Verifiable Credentials Standard. Standards are often invisible, but they are massively important!

In this post, we’ll use our work with the DCC to help you systematically review your communication initiatives and give you a bit of a playbook on how to develop reusable communication assets and resources.

Understanding your audience

An audience map WAO created with the DCC

Research

When crafting communications strategies, most organisations miss a crucial step: audience research. Implementing lessons from outdated research, or making assumptions about your audience only to find out that your assumptions were wrong, are two mistakes that you can avoid!

Before we started creating communication messaging and assets for the DCC, we did two rounds of interviews. In both rounds, we spoke one-to-one with people deeply involved in the DCC’s work. In the first round, we spoke with staff members, W3C task group members and people already implementing the Verifiable Credentials standard. In the second round, we talked to members of the DCC and with the Leadership Board. We asked the same questions for both rounds, but allowed for organic conversation to emerge.

cc-by-nd Visual Thinkery for WAO

These interviews not only gave us a wealth of onboarding context for the DCC’s wide-ranging work, but also helped us identify, specifically, what stakeholders need and want from the DCC.

Segmentation

Once you have collected insights from your audience, you can begin to reflect those insights back in ways that help others understand who your audience is. Segmentation is a way to find overlapping interests and topics. We like to visualise segmentation and have done so in multiple ways: from our Audience Ikigai to Defining the Cast and Persona Spectrums, we use several different tools to find audience overlaps. Figuring out a visual way to explain your audience and their unique needs and insights is a great way to help people feel connected to your organisation.

Crafting your communications

First slide of a deck implementing suggested design constraints

Being specific

Understanding your audience will help you tailor your messages and customise content to specific segments of your audience. Through research, you are also creating relationships with your audience and can encourage people to feel open to giving you feedback.

Our research and subsequent analysis helped us see trends and patterns to pay attention to as we began to craft communications for the DCC. We also identified some quick intervention points, allowing us to immediately implement small changes and quick wins. For example, before we ran our final interview, we implemented a new README for the DCC’s GitHub organisation. Small wins can have a big impact!

Our onboarding and research activities helped us see where there were misunderstandings, so that we could deal with them as quickly as possible.

Design guidelines

It can be helpful to put what we call “Design Constraints” in place when we’re building communication strategies and initiatives. Design Constraints are simply rules you and your colleagues use to create consistency in both visual and written language. For example, we helped the DCC select a colour palette, fonts and an illustration library for their future communications.

A brand guide is an example of visual design constraints. A “key messaging and wording” section in your communications strategy is another. It helps create consistency, so that your audiences know how you wish to communicate your organisational goals.

Growing your audience

cc-by-nd Visual Thinkery for WAO

Engagement

You want to engage with people strategically so that you can work sustainably and your communications are aligned with your current initiatives and goals. We use several tools to help us figure out the best way to engage with a specific audience or community. We’ve written often about the Architecture of Participation, our go-to framework for creating participatory communities.

We also like to build Contributor Pathways, which help show how different stakeholders engage with a project. These pathways can outline the steps different audiences take and where you might be able to engage with them more effectively.

There are four stages to the engagement model we like to use:

Awareness — The first stage invites you to think about how your particular group hears about you or your project for the first time. The questions to ask are: How do they hear about us, and how would we like them to hear about us?
First Engagement — Stage two identifies the first interaction a person or a group has with you or your project. What is the first action that they take, and what action would you like them to take?
Build Relationship — Stage three is about your interaction. How do you build relationships with people or groups, and what value can you bring?
Deepen Engagement — As people deepen their engagement with your organisation or project, you’ll want to show them that they’re valued. So how can you ensure consistent engagement with your most engaged audiences?

We think about each of these stages in reference to each specific audience group, as some audiences might be more or less engaged than others.

Advocacy

WAO tends to work with groups and organisations that are trying to create a better world. Advocacy is an integral part of our work. There are a variety of advocacy and collaboration strategies, as well as best practices, that you can use to promote your messages in a way that leads to action.

In this post on campaigning for the right things, we take a deep dive into using an advocacy framework to figure out where we might focus efforts. You can reapply this framework to your own initiatives!

Building and sustaining engagement

cc-by-nd Visual Thinkery for WAO

Cadence

If you’ve truly understood your audiences through research and analysis and you’ve determined the messages and design constraints you need to utilise for maximum communication effectiveness, your audience will begin to grow. Yay! You are building engagement!

It’s time to find sustainable ways to keep your engagement going. Probably the most effective strategy we have for sustaining engagement is cadence and consistency.

an example month of DCC events and associated comms

You need to establish a cadence to your engagement efforts, both so that your growing audience knows what to expect and so that you and your team can stay sane. It’s simple, but a communication schedule will help you be consistent, so that people stay engaged. Check out our how to be a great moderator post too; it has good tips on building consistency into your workflow.

Commitment

Last, but not least, commitment to your goals, team and community is essential. However you are trying to have an impact on the world, it is a marathon, not a sprint. We believe that open, flexible strategies with reusable and adaptable assets are a great way to help you stay committed.

🔥 Do you need help with communications and engagement? Get in touch!

Building and Sustaining Engagement with the Digital Credentials Consortium was originally published in We Are Open Co-op on Medium, where people are continuing the conversation by highlighting and responding to this story.


Identity At The Center - Podcast

It’s time for another episode of The Identity at the Center podcast!

It’s time for another episode of The Identity at the Center podcast! Michiel Stoop joins us to discuss privileged access management including how to navigate and obtain support at your organization to invest in these processes and technologies.

You can watch the episode on YouTube here: https://www.youtube.com/watch?v=1e9dpwttuZU

Visit our website for more: idacpodcast.com

#iam #podcast #idac


GS1

Introducing GS1 standards to the clinical trial supply chain at Creapharm, a Myonex company

In the clinical trial industry, drug identification and traceability are essential to ensuring patient safety.

However, up until recently, most stakeholders used their own internal tools and proprietary identifiers for tracing investigational products and their locations, as well as for data interchange in clinical trials.

As a result, participants had to configure their IT systems to adapt to each solution implemented by each specific Investigational Medicinal Product (IMP) manufacturer.

Read the full case study in GS1 Healthcare Case Studies 2023-2024: gs1_healthcare_cases_studies_2024_france_v_final_.pdf

ResofWorld

Meta is training its AI with public Instagram posts. Artists in Latin America can’t opt out

Latin America lacks robust data protection laws that would allow Meta users in the region to prohibit the company from using their content.
On June 2, María Luque noticed several of her contacts on Instagram posting about a form she had never heard of. The form, Luque found out, had been sent to...

Friday, 12. July 2024

MOBI

Nissan, Honda eye EV battery passports in Europe by 2027

Nissan, Honda eye EV battery passports in Europe by 2027 Read on Nikkei’s website

By Azusa Kawakami, Nikkei staff writer | 12 July 2024

NEW YORK — Nissan Motor, Honda Motor and other Japanese auto industry companies have joined forces to develop a digital tracking system for electric-vehicle batteries, anticipating European Union regulations aimed at encouraging recycling and reducing reliance on China.

Mazda Motor and parts supplier Denso are also among the seven Japanese companies planning to introduce a “battery passport” by 2027.

Businesses and car owners will use a code to access these digital records and find information about the amount of recyclable metals used, origin and production history. The passport will also provide the extent of battery deterioration and residual value.

The companies will seek to put the system to use first in Europe, where battery regulations are to be introduced in 2027.

Adopted in June 2023, the regulations will make battery recycling mandatory within the EU bloc. Among the provisions is a requirement that 50% of lithium from used batteries must be recycled by 2027.

From 2027, automakers will need to meet those regulations for EVs sold in Europe, with penalties for failing to do so.

Nissan and Honda are participants in the Mobility Open Blockchain Initiative, an alliance of more than 120 companies from Japan, the U.S. and Europe, and have been involved in the development of the battery passport infrastructure.

The EU’s tougher battery regulations are due in part to a desire to cut its reliance on China, which dominates the supply of battery materials. China accounts for 65% of the world’s processing and refining of lithium, according to the International Energy Agency.

The U.S., which is strengthening its own restrictions against China, is also considering introducing battery passports. The Inflation Reduction Act made it a condition for EV makers to procure batteries within North America to qualify for vehicle subsidies. Passports can help certify where batteries come from.

But because China leads the EV battery market with a roughly 60% share and controls much of the supply chain, it could be difficult to obtain full battery data without Chinese cooperation.

China’s CATL, the world’s leading automotive battery manufacturer, is said to have over 1 trillion pieces of battery data, from raw materials to recycling.

The post Nissan, Honda eye EV battery passports in Europe by 2027 first appeared on MOBI | The Web3 Economy.

Thursday, 11. July 2024

Digital ID for Canadians

The DIACC releases its Pan-Canadian Trust Framework (PCTF) Authentication Final Recommendation V1.2

Canada’s digital trust leader, the DIACC, releases its Pan-Canadian Trust Framework (PCTF) Authentication Final Recommendation V1.2, signalling it’s ready for inclusion in their Certification Program.

Why is the PCTF Authentication component important?

The Authentication component helps assure the ongoing integrity of login and authentication processes by certifying, through a process of assessment, that they comply with standardized Conformance Criteria. The Conformance Criteria for this component may be used to provide assurances that Trusted Processes represent a unique Subject, and that, at a given Level of Assurance, it is the same Subject with each successful login to an Authentication Service Provider. They also provide assurances concerning the predictability and continuity of the login processes that participants offer or depend on.

What problems does the PCTF Authentication component solve?

The Authentication component helps establish a standardized way for individuals and organizations to verify their identities when accessing digital services. This reduces the risk of unauthorized access and potential breaches. Additionally, by providing a reliable method for authentication, this allows the PCTF to foster trust and confidence among users, service providers, and stakeholders. This is crucial for the widespread adoption of digital services.

Who does the PCTF Authentication component help?

All participants will benefit from login and authentication processes that are repeatable and consistent (whether they offer these processes, depend on them, or both). It can help lay the foundation to provide assurances that identified Users can engage in authorized interactions with remote systems. When combined with considerations from the PCTF Wallet Component, participants may have an enhanced user experience through the reuse of credentials across multiple Relying Parties.

Relying Parties can benefit from the ability to build on the assurance that Authentication Trusted Processes uniquely identify, at an acceptable level of risk, a Subject in their application or program space.

Find the PCTF Authentication component here.


The Engine Room

Launching our UXD support services!

Starting this month, The Engine Room will be service providers in OTF’s User Experience & Discovery (UXD) Lab.

The post Launching our UXD support services! appeared first on The Engine Room.


Berkman Klein Center

Fellows Spotlight: Johanna Wild, Investigative Journalist

An interview on risks, trends, and tools in OSINT digital research

Photo by Emily Morter on Unsplash

When Johanna Wild entered the Berkman Klein Center at Harvard as a joint Nieman Foundation innovation fellow, I was intrigued. Wild works for the award-winning international open source (OS) investigative journalism collective Bellingcat. She is an expert on the creative deployment of technical approaches to support a more diverse cohort of public interest reporters and investigators, blending automated approaches with human-centered research methodology.

As someone who has supported expert networks in both disinformation and conflict documentation, I wanted Wild’s first-hand perspective on the benefits and risks of using novel open source intelligence (OSINT) tools to enable a broader, more transparent global knowledge base. We conducted this interview over email between Amsterdam and New York City.

Sam Hinds: Do you encounter specific types of people or professional backgrounds in the work of investigations and OSINT tool development?

Johanna Wild: The great thing about the field of open source research is that it consists of people from various backgrounds. Open source researchers spend a lot of time online. They find pieces of information on social media platforms, in online forums and in databases, and they compare features that they identify in user-generated online videos and photos with locations that can be seen on satellite imagery. This process, called geolocation, is used to verify online images. The nature of open source research allows anyone with an internet connection to do this type of work.

The open source researcher community is therefore a mix of people who do open source research as part of their job and volunteers who are passionate about contributing to important research in their free time. My surveys and user interviews with our Bellingcat community showed that our community consists of people working for human rights organizations, stay-at-home parents who use their limited time to do something mentally challenging and useful, cybersecurity specialists, job seekers who want to learn new skills, lawyers, data scientists, people who are retired, and many more. When I ask volunteers about their motivation, they often say that they want to contribute to research that reveals issues in the regions where they live, and that in these times, characterized by various conflicts around the world and global challenges like climate change, they do not want to just passively sit around but to actively contribute to something that creates new knowledge about those issues. Another motivation is to become part of a community with similar interests and to improve their open source research skills.

Of course there are also many journalists who are part of this community. Nowadays, more and more newsrooms are setting up teams focusing on open source research. However, journalists were more of the late adopters in this field. Most of them only discovered in the last few years how useful this type of research can be, especially if it is combined with traditional journalistic skills and methods. Newsrooms even started hiring skilled open source researchers who are completely self-taught and who have no journalism degree, which is something that is still rather unusual in the news industry.

Volunteers with a technical background contribute by building tools. These are often simple command line tools that are able to do one very specific task, for instance to scrape posts from a specific social media platform or to check whether an online account has been created on a platform using a specific phone number. Those tools do not usually turn into big commercial products; they are built by people from within the open source software community who focus on writing code that is publicly accessible to anyone. Several years ago, I clearly saw that the open source researcher and the open source software community are a very good match for each other; we just needed to bring them together. This is one of the things that we now do at Bellingcat. We organize hackathons, actively invite software developers into our volunteer community, and support them to build their own tools or to contribute to tools built by the Bellingcat team. This group of volunteers includes, for example, people who have a full-time job in a software company but want to do something meaningful in their free time, job seekers who want to create their own portfolio of tools, and academics who are already deep into a technical topic but would like to test its practical application.
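
To give a sense of what these single-purpose tools look like, here is a minimal, hypothetical Python sketch: a command line script that fetches one web page and prints its outbound links. Real volunteer-built tools target specific platforms and are usually more involved:

```python
# Hypothetical single-purpose CLI in the spirit of volunteer-built
# research tools: fetch one page, print its outbound links, nothing more.
import argparse
from html.parser import HTMLParser
from urllib.request import urlopen

class LinkParser(HTMLParser):
    """Collect href values from anchor tags."""
    def __init__(self):
        super().__init__()
        self.links = []

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            for name, value in attrs:
                if name == "href" and value:
                    self.links.append(value)

if __name__ == "__main__":
    ap = argparse.ArgumentParser(description="List the links on a web page")
    ap.add_argument("url", help="page to fetch, e.g. https://example.com")
    args = ap.parse_args()

    parser = LinkParser()
    parser.feed(urlopen(args.url).read().decode("utf-8", errors="replace"))
    for link in parser.links:
        print(link)
```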

Although the open source researcher and tech communities are very diverse in terms of their professional and personal backgrounds, they are currently still dominated by volunteers and professionals from Western countries, mainly from the US and Europe. The technical tool builder community is also, to date, still male dominated. This lack of representation raises serious questions in terms of who defines the future of our field and who has the power to research topics in regions all around the world. With people in many other regions still excluded from participating in this type of research, they mainly become the subject of Western researchers.

“While AI tools can be powerful, we should not expect to automate the whole open source research process. Doing open source research is a combination of specific research methods, the use of tools, a good dose of logical thinking and also creativity!”

SH: Have you seen novel trends emerge in the type of information researchers want today?

JW: I definitely observe that researchers, and especially journalists, have become more aware of how useful it is to be able to work with large datasets, to know how to scrape information from websites or to have the skills to build small tools that can speed up some of their research tasks.

Currently, everyone is of course interested in AI. Less experienced researchers are hoping for a tool that lets them input any picture or video and then spits out the exact location of where it was taken. While AI tools can be powerful, we should not expect to automate the whole open source research process. Doing open source research is a combination of specific research methods, the use of tools, a good dose of logical thinking and also creativity! Creativity is needed to spot topics that are worth getting investigated. When deciding where to look next in the vast amount of online information that is out there, creativity helps to connect multiple, often tiny, pieces of verified information which allow researchers to draw conclusions on a certain topic.

Another trend is the use of facial recognition tools. Open source researchers often find pictures that show individuals who have a connection to a certain research case but whose identity they don’t know. In the last few years, several easy to use facial recognition tools have emerged. Researchers can upload a picture of a person and the tool compares this picture with collections of photos from social media platforms. Sometimes, this can reveal the identity of a person, for instance by providing the person’s LinkedIn profile. It is obvious how useful this can be to identify individuals who were involved in serious crimes that require journalistic reporting.

However, facial recognition tools are a double-edged sword. We all know that they can provide wrong results. Two people might just look very similar and an uninvolved person might be misidentified as someone who is involved in illegal activities. It is therefore important that open source researchers do not use those tools as the only way of identifying someone. On top of that, the use of such tools raises various ethical questions, ranging from the risk of stalking random people online to questions about the data sources on which facial recognition tools rely. At Bellingcat, we reflected on how we can ensure a responsible use of facial recognition technologies and concluded that we will refrain from using these tools extensively, and never as a core element of an investigation. We also never used products from companies like Clearview AI. A good example of how we sometimes use a facial recognition tool as a starting point for further research can be found in our article on how “Cartel King Kinahan’s Google Reviews Expose Travel Partners”.

SH: Are there any overlooked tools that you like to highlight in your trainings?

JW: The best type of tool really depends on the research topic. Often a combination of several small tools can lead to the best results. For instance, our Name Variant Search Tool is basically an enhanced search engine for finding information about people. Open source researchers often start with a name and try to find out as much as possible about the person’s online presence. However, the name might be written differently on different sites. “Jane Doe” might also show up as “J. Doe” or “Doe, Jane”. The tool suggests different possible variations of a name and provides search results for all those variations. It is also possible to instruct the tool to search for a name specifically on LinkedIn or Facebook.

Example: Name Variant Search results for different variants of the name “Jane Doe”
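
The underlying idea is simple enough to sketch. The following hypothetical Python snippet (not the tool’s actual code) expands one name into common variants, each of which could then be fed to a search engine, optionally with a site: restriction:

```python
# Hypothetical sketch of name-variant expansion, not Bellingcat's code:
# expand "Jane Doe" into forms a name commonly takes online.
def name_variants(first: str, last: str) -> list[str]:
    return [
        f"{first} {last}",           # Jane Doe
        f"{first[0]}. {last}",       # J. Doe
        f"{last}, {first}",          # Doe, Jane
        f"{last}, {first[0]}.",      # Doe, J.
        f"{first}{last}".lower(),    # janedoe (handle-style)
    ]

for variant in name_variants("Jane", "Doe"):
    # each variant becomes one search query, e.g. with site:linkedin.com
    print(f'"{variant}"')
```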

Our OpenStreetMap search tool, on the other hand, supports the geolocation process. A core task of many open source researchers is to find out where a photo or video that they found online has been taken. To do that, they try to identify specific features and compare those with what is visible on satellite imagery or maps. If researchers already have a rough idea in which region a photo might have been taken, they can input a list of features that are visible in the photo (for instance, a residential street, a school and a supermarket) into our tool, which will try to list all locations in a pre-defined region in which those features show up together. This can really help narrow down possible locations.
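
As a rough sketch of how such a co-occurrence search can be expressed, the snippet below asks OpenStreetMap’s public Overpass API for supermarkets within 200 m of a school inside a bounding box. It is a generic Overpass query for illustration, not the Bellingcat tool’s actual implementation:

```python
# Generic Overpass API co-occurrence query: supermarkets within 200 m
# of a school, inside a bounding box (south, west, north, east).
import requests

OVERPASS_URL = "https://overpass-api.de/api/interpreter"
query = """
[out:json][timeout:60];
node["amenity"="school"](52.3,4.7,52.5,5.0)->.schools;
node["shop"="supermarket"](around.schools:200);
out;
"""
response = requests.post(OVERPASS_URL, data={"data": query})
for element in response.json().get("elements", []):
    print(element.get("lat"), element.get("lon"))
```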

SH: What’s an example of an unusual story or insight one can find from OS tools?

JW: If open source researchers have no idea where a picture might have been taken but they know at which time it was captured and the photo shows objects that cast clearly visible shadows, they can try our ShadowFinder tool which is able to calculate at which locations around the world shadow lengths correspond with what can be seen in the photo at a specific point in time. This helps open source researchers concentrate their geolocation efforts to the areas suggested by the tool instead of searching across the whole world.

Example of a ShadowFinder tool result: Possible locations are shown by the yellow circle.
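
The geometry behind this is compact: an object of height h casting a shadow of length s implies a solar elevation of arctan(h/s), and at any given UTC time only a band of locations on Earth sees the sun at that elevation. A simplified sketch of the idea, using the third-party pysolar library for solar altitude (the real tool is considerably more refined):

```python
# Simplified sketch of the ShadowFinder idea, not the actual tool:
# keep grid points where the sun's computed altitude at a given time
# matches the elevation implied by object height and shadow length.
import math
from datetime import datetime, timezone
from pysolar.solar import get_altitude  # pip install pysolar

def candidate_locations(height_m, shadow_m, when, tolerance_deg=1.0):
    target = math.degrees(math.atan2(height_m, shadow_m))
    matches = []
    for lat in range(-60, 61, 5):              # coarse global grid
        for lon in range(-180, 180, 5):
            if abs(get_altitude(lat, lon, when) - target) < tolerance_deg:
                matches.append((lat, lon))
    return matches

when = datetime(2024, 7, 1, 12, 0, tzinfo=timezone.utc)
# a 10 m object with a 6 m shadow implies a ~59 degree solar elevation
print(candidate_locations(10.0, 6.0, when)[:5])
```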

Another tool that has gained popularity within the open source researcher community is PeakVisor, a tool that was originally targeted at helping mountaineers orient themselves but which can also be used for geolocation tasks. For instance, we used it to research the location of the killing of Colombian journalist Abelardo Liz. This example in particular shows that a combination of research skills and the use of tools can go a long way.

SH: What frustrations or barriers do you see as a trainer, and how could the field democratize knowledge of command line tools?

JW: First of all: Teaching open source research is great. People who are interested in learning these methods come from so many different backgrounds which allows everyone to learn new things from each other, including the trainers! The topic is also quite accessible, meaning that everyone can start doing open source research with very simple methods, like using search engines in creative ways. Sometimes, this can lead to surprising results: For instance, just by googling, my colleague Foeke Postma revealed how US soldiers exposed nuclear weapons secrets via flashcard apps.

Of course not all methods are as simple, and one of the things people struggle with the most is research tools. During my Nieman-Berkman Klein fellowship, my research assistant Cooper-Morgan Bryant and I interviewed forty open source researchers about their use of tools. Their answers confirmed my previous findings on this topic: open source researchers who are either beginners or who are looking at a topic that is new to them find it really difficult to figure out what tool they should use at what stage of the research process and how those tools work. With such a wide variety of online tools, some more useful and some easier to find than others, many researchers feel overwhelmed by the task of finding their way through the landscape of available tools spread across various platforms.

In addition, the majority of open source researchers are not able to use command line tools, since this requires a certain degree of technical skill. However, those are exactly the type of small tools that the open source software community builds most frequently. There is a clear divide between those who are building tools for open source researchers and the researcher community itself, for whom those tools often turn out not to be accessible.

“Open source researchers want complex tools that are easy to use and that are stable and well-developed but such tools need funders and teams who build them, and these conditions are not always easily met in the open source research and journalism space.”

On the other side, open source researchers are often not aware of the resources that are required to build mature tools that have an easy-to-use interface. It is getting easier now, but tool builders need to invest a lot more time to build such tools, and this is difficult for people who do this work in their free time and without any funding. Open source researchers want complex tools that are easy to use, stable, and well-developed, but such tools need funders and teams to build them. These conditions are not always easily met in the open source research and journalism space. I hope that researchers will become a little more open to learning some basic technical skills, and, even more importantly, that they understand that not every tool that is useful for their research has to function like a fully built commercial tool.

At Bellingcat, we focus on bridging this gap between tool builders and open source researchers. We work with tech communities, often through programs like hackathons or fellowships, and make them aware of how important good user guides are, even for seemingly easy-to-use tools. On the other hand, we teach open source researchers how to use command line tools. We also launched a video series with the goal of helping researchers make their first steps towards the more technical side of research tools.

SH: Tools take a lot of resources to build. Do any OSINT tools have a complicated provenance in terms of private sector origin or geopolitics?

JW: It is definitely problematic that researchers and journalists can be so dependent on tools provided by big tech companies. Many of the platforms and tools open source researchers use are provided by Google, like Google Search, Google Maps and Google Earth Pro. Meta’s social monitoring platform CrowdTangle will be shut down in August, and this has caused a lot of discontent amongst journalists, in particular amongst those who are covering elections. We are often at the mercy of the decisions that big tech companies take regarding the use of their tools.

However, their tools are usually provided for free, which is not the case for other commercial tools. Open source researchers definitely need to look into the companies from which they are buying tools. One risk is that tool providers might be able to see what keywords people are typing in or what topic someone is working on. Researchers and journalists need to be sure that their sensitive research topics are safe from being monitored by tool providers.

At Bellingcat we focus on mostly small open source tools, but those tools come with their own set of challenges. For instance, it is often not clear who is behind a tool that is offered on code-sharing platforms like Github, which can raise security-related questions.

“I would love to see universities getting more involved in building and maintaining tools for open source researchers and journalists…since both sides have the common goal of advancing research in the public interest”

This is why I really hope we can build a different tool ecosystem for open source researchers in the future. I would love to see universities getting more involved in building and maintaining tools for open source researchers and journalists. I think that such collaborations could work well since both sides have the common goal of advancing research in the public interest, and many of the tools that are used by open source researchers are equally useful for academic researchers. I also see opportunities to research security-related aspects of widely used tools together, as journalists and open source researchers could definitely use some help in assessing the risks that some of the tools they are using might be posing. If anyone who reads this would like to discuss these topics with me: Feel free to get in touch!

SH: Misinformation, disinformation, conspiratorial thinking: What are some of the uses and abuses of “research” you see in these contexts?

JW: What is most common — especially during conflicts and wars — is that people share either photos or videos from a different conflict or old imagery and make people believe that they are related to current events. In the context of the Israel-Gaza conflict since October 2023, this phenomenon has reached a new scale with countless examples circulating online. For instance, Bellingcat found videos that were shared with the claims that one showed rockets that were fired at Israel by Hamas and another that claimed to show recent Israeli strikes on Hamas; both turned out to be recycled videos that had been uploaded to YouTube several years prior.

“People who post such pictures might sometimes think they are doing ‘research’ and that they are sharing relevant information about an ongoing conflict, without realizing that they are actually sharing incorrect information.”

What is dangerous is that some of those posts go viral and are able to reach significant numbers of people who will never know that they fell for misinformation. People who post such pictures might sometimes think they are doing “research” and that they are sharing relevant information about an ongoing conflict, not realizing the information is incorrect. Others, however, will do it on purpose to evoke emotions either in favor of or against one of the conflict parties. Users of online platforms cannot really do much to prevent being confronted with such posts. This is another reason it is essential that we all learn to question what we see online and to invest some time in learning basic verification skills.

What we have also been seeing is that supporters of conspiracy ideologies are increasingly using open source research tools and presenting the information as journalistic findings. For example, Qanon supporters in German-speaking countries started using flight-tracking sites to search for flights which they falsely believed were circling above “deep underground military bases” in which children were hidden and mistreated. This is problematic since people who are not aware of the methods and standards of open source research might not be able to differentiate between serious research and the distorted version of it.

SH: What are some of your favorite guidelines or best practices for journalists who aim to cover (and fact-check) broad conspiratorial thinking enabled by OS information?

JW: Looking at their business models can often be a very promising approach. More often than not, conspiracy-minded communities have business-savvy people amongst them who manage to benefit financially from those communities’ beliefs. When I was researching QAnon online communities in Germany, big platforms like Amazon and eBay had started implementing measures to ban QAnon products from their platforms. However, this seemed to have created new opportunities for QAnon influencers who were offering merchandise via their own small online shops. On top of that, customers in Germany were able to buy QAnon products from abroad, for instance from Chinese or British companies who offered products targeted specifically at German-speaking customers. It was interesting but also concerning to see how international today’s conspiracy merchandise markets are.

When researching online shops, it is always worth examining which payment options they offer and looking into their potential use of cryptocurrencies. It is also important to take some time to learn the terminology a certain group is using. If you are looking into the far-right, for instance, it is crucial to learn how to interpret the symbols they use.

”Open source researchers are often portrayed as some type of ‘nerdy hero’ who spends time on his laptop to research ‘the bad guys’ and is celebrated once he succeeds. The idea of one hero figure who solves all the research challenges is really the exact opposite of how open source research works best…”

SH: How might international organizations build stronger support for women, femme-identified, and gender-nonconforming media and research professionals?

JW: In the field of open source research, there are definitely tendencies that I would like to see changed in the future. It is well established that women and gender-nonconforming people have traditionally had a much harder time entering and succeeding in the space of investigative journalism. Those issues are far from being overcome, but the journalism world has started to talk more openly about them, and the fact that academic researchers have published work on this topic has also been helpful.

My impression is that as open source researchers, we have not yet put enough effort into reflecting on what is happening in our own field. Maybe we thought that since it is relatively new, those issues would not appear as strongly. Unfortunately, however, they do, and it’s time to recognize this.

There are definitely many contributing factors, but one that has had a strong effect on me is that open source researchers are often portrayed as some type of “nerdy hero” who spends time on his laptop to research “the bad guys” and is celebrated once he succeeds. The idea of one lone wolf who solves all the research challenges on their own is really the exact opposite of how open source research works best, which is by nature collaborative and often requires the efforts of many to put together various small pieces of verified online sources for a specific research case. For those of us who don’t want, and are also not able to fit into this commonly portrayed male hero picture, this field might not necessarily feel like a good fit.

However, since more and more traditional newsrooms are setting up open source research units right now, I see more women entering the field, and hopefully this will also change how we publicly talk about open source research over time. To everyone who organizes a public event on open source research, I recommend not only approaching the few already well-known voices in the field but also making the effort to find and invite speakers who can contribute new perspectives and who have done research on topics that are not always in the spotlight.

SH: What were the most meaningful conversations you had during your time at the Berkman Klein Center? Do you plan to use any of your connections or insights from the fellowship in your future work?

JW: I am very grateful that I was able to be a Berkman Klein Fellow this year. It was a great opportunity to be part of a community of people who all reflect on how we integrate new technologies in our lives but from various different angles. Each fellow and community hour provided me with insights into a different technology-related topic and I liked the “surprise” effect of being able to learn new things about topics I usually don’t have the time to think about. This has definitely had an impact on how I approached my own projects with Bellingcat. I feel that being immersed in such a knowledgeable and collaborative community has unlocked my creativity and I am looking forward to continuing to learn from everyone in the Berkman Klein sphere in the future.

Johanna Wild was a joint 2023–2024 Nieman-Berkman Fellow in Journalism Innovation, a joint fellowship administered between the Nieman Foundation for Journalism and the Berkman Klein Center for Internet & Society at Harvard University. Wild is currently Investigative Tech Team Lead at Bellingcat.

Fellows Spotlight: Johanna Wild, Investigative Journalist was originally published in Berkman Klein Center Collection on Medium, where people are continuing the conversation by highlighting and responding to this story.


Kantara Initiative

Kantara awards IAL3 certification to NextGenID Component Services

World's first Trust Mark award for Component Services at IAL3 will continue to build confidence in the identity industry.

We are delighted to announce that NextGenID has successfully obtained IAL3 certification for its component services. This effectively makes it the first organization to achieve IAL3 in the Identity Credentialing and Access Management (ICAM) space. This sets a new industry standard for security, accessibility and reliability.

NextGenID’s Trusted Services Solution (TSS) provides Supervised Remote Identity Proofing (SRIP) identity stations. Operators use SRIP stations to collect, review, validate, prove, and package IAL3 identity evidence and enrollment data. This means that credential service providers (CSPs) that use the NextGenID TSS will offer an enhanced level of assurance.

Speaking of the award, Kantara Exec Director, Kay Chopard, said: “Achieving Kantara certification is a significant endeavor, reflecting a rigorous commitment to excellence in identity and access management. By developing frameworks and ensuring conformance to robust standards, we provide guidance that ensures security, privacy and interoperability in digital transactions. This is critical for organizations looking to adopt identity solutions that not only comply with current regulations but also anticipate future challenges in digital identity verification. We congratulate the NextGenID team on being the first to achieve IAL3 certification for Component Services.”

Are you ready for identity assurance certification?  Visit our Approval Process page for full details of what is involved and the criteria we use when evaluating applications.


The post Kantara awards IAL3 certification to NextGenID Component Services appeared first on Kantara Initiative.


MOBI

First Web3 Global Battery Passport Implementation for Current and Future Regulatory Compliance


DENSO, Honda, Mazda, and Nissan among MOBI members who have completed Stage 1 of the Cross-Industry Interoperable Minimum Viable Product, a three-year initiative towards a decentralized Global Battery Passport built for data privacy and selective disclosure

Los Angeles, 10 July 2024 — MOBI, a global nonprofit Web3 consortium, is thrilled to announce a significant milestone in the development of the Web3 Global Battery Passport (GBP) Minimum Viable Product (MVP). In a historic first, the MVP has successfully demonstrated battery identity/data validation and exchange between nine organizations using open standards — the MOBI Battery Birth Certificate and the World Wide Web Consortium (W3C) Self-Sovereign Identity (SSI) framework — an achievement that carries exciting implications for stakeholders of the battery value chain and lays critical groundwork for the Web3 Economy. Organizations worldwide are invited to join and collaborate in this trailblazing effort.

Participants of MOBI Circular Economy and the GBP Working Group, constituting close to USD 1 Trillion in annual revenue and representing diverse functions within the global battery ecosystem, have successfully completed Stage 1 of the three-year MVP and are set to begin Stage 2. Implementers of the decentralized GBP include Anritsu, DENSO, HIOKI, Honda, Mazda, Nissan, and TradeLog Inc.

As the world increasingly turns to batteries for sustainable energy solutions, global battery value chains are making continuous improvements to enhance operational efficiency, circularity, and cross-border compliance. Forward-looking global policies like the US Treasury’s Section 30D Guidance on EV Tax Credits and the EU Battery Regulation mandate digital recordkeeping to track battery life cycles, underscoring the need for a global battery passport — a digital credential containing key information about the battery’s composition, state of health, history, and more.
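As a rough illustration of what such a digital credential can look like, here is a minimal sketch of a battery passport modeled as a W3C Verifiable Credential. The overall structure follows the W3C VC data model; the issuer DID, identifiers, and credentialSubject fields are invented for illustration and are not MOBI's actual Battery Birth Certificate schema.

```typescript
// Minimal sketch of a battery passport as a W3C Verifiable Credential.
// Structure follows the W3C VC data model; all identifiers and the
// credentialSubject fields are hypothetical, not MOBI's actual schema.
const batteryPassportCredential = {
  "@context": ["https://www.w3.org/2018/credentials/v1"],
  id: "urn:uuid:2f5f9f3a-demo-battery-passport",
  type: ["VerifiableCredential", "BatteryPassportCredential"],
  issuer: "did:example:battery-manufacturer",
  issuanceDate: "2024-07-10T00:00:00Z",
  credentialSubject: {
    id: "did:example:battery-pack-123",
    chemistry: "NMC",              // composition
    stateOfHealth: 0.97,           // fraction of original capacity
    countryOfManufacture: "JP",
    lifecycleEvents: ["manufactured", "installed"],
  },
  // A real credential would also carry a cryptographic proof here,
  // which is what enables the selective disclosure the article describes.
};
```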

Creating a scalable GBP will require cross-industry interoperability, such that entities across the value chain can securely coordinate and selectively share relevant data without the need to pay for costly one-off integrations or abandon their existing systems. MOBI and its members believe Web3 can become a key enabler for cross-industry interoperability at scale. SSI in particular introduces a promising approach to unlocking powerful synergies, enabling entities to securely control and share credentials across different web applications and platforms without the need for centralized intermediaries. To this end, MOBI and its members are building a decentralized Web3 marketplace ecosystem with standardized communication protocols for Self-Sovereign Data and SSI, designed such that the federated infrastructure and data therein are not controlled or managed by one organization. While the MOBI MVP initiative demonstrates a specific use case for implementing the GBP, the benefits of MOBI's Web3 implementation extend to almost any use case that involves multiparty transactions.

In Stage 1 of the initiative, implementers demonstrated the Integrated Trust Network's (ITN) identity services for one-to-one cross-validation of battery identity and data. The ITN serves as a federated (member-built and operated) registry for W3C Decentralized Identifiers, offering SSI management for connected entities such as batteries and their value chain participants. The ITN is the first enterprise network supporting multiple blockchains at the same time. These features are unique to the ITN, which is built for high resilience by ensuring the network's functionality and sustainability do not rely on a single organization or blockchain.

Stage 2 will demonstrate Citopia decentralized marketplace services through creation of the cross-industry interoperable, privacy-preserving GBP. ITN services are one-to-one whereas Citopia services are one-to-many (and many-to-one). Through Web3 implementation, the availability and selective disclosure of trusted data and identities throughout the battery value chain will beget digital services and applications such as enhanced battery and carbon credits management, vehicle-to-grid communications and transactions, risk-based insurance calculations, and data-driven used electric vehicle pricing.

“Today’s global battery value chain is complex and it’s difficult to simultaneously ensure efficiency, scalability, safety, circularity, and regulatory compliance. To balance these priorities, we need to enhance battery lifecycle management through the creation of a shared ecosystem with SSI framework for secure coordination and selective disclosure of sensitive data,” said MOBI CEO and Co-founder, Tram Vo. “Driving innovation at this scale requires cross-industry collaboration. We invite public and private organizations worldwide to join us in this critical pursuit.”

“Web3 is an interesting technology which may facilitate a more scalable approach to exchanging battery data in a peer-to-peer fashion between organizations,” said Christian Köbel, Senior Engineer at Honda Motor Co., Ltd.

“It is confirmed how Web3-based self-sovereign data management works throughout Stage 1 of the MVP,” said Yusuke Zushi, Senior Manager at Nissan.

Said Mazda Motor Corporation, “Through participation in the GBP Working Group, we not only acquired the technical knowledge of Web3 but also understood a vision of an ecosystem that realizes the exchange of reliable information. We appreciate MOBI for giving us this invaluable opportunity.”

“The Stage 1 MVP experiment was instrumental in deepening our understanding of a system that enables sovereign data management. In the current era of data, the integrity and safety of information flow are paramount. As an experienced manufacturer of measuring instruments, HIOKI has always placed a premium on the reliability and accuracy of data,” said Kenneth Soh, Executive Officer at HIOKI. “We believe that the insights gained from this MVP are vital for the future progress of measurement instrument manufacturers and the industries we serve, from the perspective of innovation and societal contribution driven by mechanisms that offer both security and precision in data distribution.”

“The successful implementation of the Web3 GBP MVP is a significant step towards a more transparent and sustainable battery ecosystem. We are honored to contribute to the realization of the GBP as participants in this important initiative,” said Hisashi Matsumoto, Senior Manager at Anritsu.

“We, TradeLog, Inc., proudly support the decentralized Global Battery Passport project, driven by the efforts of MOBI and its dedicated Implementers,” said Alvin Ishiguro, Project Coordinator at TradeLog, Inc. “Going forward, we will continue to deliver our customers new experiences through blockchain technologies in the energy sector.”

About MOBI

MOBI is a global nonprofit Web3 consortium. We are creating standards for trusted Self-Sovereign Data and Identities (e.g. vehicles, people, businesses, things), verifiable credentials, and cross-industry interoperability. Our goal is to make the digital economy more efficient, equitable, decentralized, and circular while preserving data privacy for users and providers alike. For additional information about joining MOBI, please visit www.dlt.mobi.

Media Contact: Grace Pulliam, MOBI Communications Manager

Email: grace@dlt.mobi | Twitter: twitter.com/dltmobi | Linkedin: MOBI

###

The post First Web3 Global Battery Passport Implementation for Current and Future Regulatory Compliance first appeared on MOBI | The Web3 Economy.

Wednesday, 10. July 2024

OpenID

All Aboard the CAEP-Ability Hype Train!


Authors: Sean O’Dell (Disney), Atul Tulshibagwale (SGNL)

An Identiverse 2024 Panel Recap

The attendance for this panel, which featured all co-chairs of the Shared Signals Working Group (SSWG), was near capacity, and the engagement from the audience in the Q&A was resounding…because the hype is real with CAEP. The panel was moderated by IDAC podcast host Jeff Steadman. His questions ranged from provisioning use cases, to applicability in connected scenarios with other IAM domains (such as ITDR), to deeper dives into the CAEP specification and the Shared Signals Framework. The “IAM CAEPable” T-shirts were also a hot commodity…and there might be another order coming soon.

The many questions from the audience made the discussion even more lively, allowing for open and real conversations to occur with the assembled panel of experts. The panelists felt the audience’s engagement as they saw people scribbling notes, typing on a laptop, or nodding their heads before raising their hands to elaborate or branch off into new areas. Sometimes the energetic Q&A led to a conversation between the audience and multiple panelists. This article covers the highlights.

Highlights & Key Points

Q: What are the practical use cases and applications of CAEP Events?

Apart from the immediate “session revoked” scenario, now implemented by platform providers like Apple, CAEP can be applied in numerous other scenarios. These include, for example, revoking a suspicious device’s session without impacting the end user or informing an IdP of assurance level changes – informative and actionable signals.

A real-world scenario: an anomaly detection engine emits an event, which results in a CAEP event being transmitted so that action can be taken to revoke the specific session for the user and, if applicable, the device.
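To make that concrete, here is a sketch of what the payload of such a CAEP event can look like as a Security Event Token (SET), shown as a TypeScript object before JWT signing. Only the event type URI is from the CAEP specification; the issuer, audience, subject, and txn values are invented, and exact field placement varies between SSF draft versions.

```typescript
// Sketch of a CAEP "session revoked" SET payload (claims before JWT signing).
// Only the event type URI is from the CAEP spec; all identifiers below are
// invented, and subject placement differs between SSF draft versions.
const sessionRevokedSet = {
  iss: "https://idp.example.com",          // hypothetical transmitter
  aud: "https://app.example.com",          // hypothetical receiver
  iat: 1720400000,                         // issued-at timestamp
  jti: "3d0c3cf797584bd193bd0fb1bd4e7d30", // unique token identifier
  txn: "8675309",                          // links events from the same root cause
  events: {
    "https://schemas.openid.net/secevent/caep/event-type/session-revoked": {
      subject: { format: "email", email: "alice@example.com" },
      event_timestamp: 1720400000,
      reason_admin: { en: "Anomalous device behavior detected" },
    },
  },
};
```

The txn claim here is the same linkage mechanism discussed later in the Q&A: a receiver that sees several events carrying the same txn value can treat them as consequences of one underlying cause.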

Q: Where do CAEP and ITDR intersect? Can you explain the significance of this intersection?

CAEP brings the “R” in ITDR (Identity Threat Detection and Response). Additionally, Shared Signals (SSF) can be leveraged to enhance ITDR by providing a way to communicate detected threats and trigger responses to security systems…using an open standard. Think of Shared Signals as the management framework and CAEP as, effectively, the events that sit on top of it. The new events introduced in the latest CAEP draft, “Session Established” and “Session Presented”, can also help detect usage anomalies like lateral movement across cloud resources.

Q: Can this be used in provisioning use cases? 

A new draft in the IETF called “SCIM Events” defines events that can be shared using the Shared Signals Framework (SSF). This can be used to communicate changes to accounts such as new account provisioning or account termination. 

Q: How can you link events to the same underlying action or reason? 

The latest draft of the Shared Signals Framework (SSF) includes guidelines on using the JWT “txn” claim so that transmitters and receivers do not process multiple events for the same underlying cause or reason, and so that a lineage can be established from the cause or reason to the events transmitted, enabling reconciliation and closing the loop.

New Features and Drafts Released

There have been some exciting new developments from the Shared Signals Working Group. New drafts have been released for review and voting by the OpenID Foundation membership. This stemmed from feedback at the Gartner Interoperability Summit, robust security analysis by the University of Stuttgart, natural maturation of the specification, and working group feedback that brought more use cases to light.

Shared Signals Framework (SSF) – Draft 03

Clarifications and bug fixes have been added to this draft. Security issues have also been addressed, including issuer and stream audience mix-ups and potential attacker subject insertion. New features include the use of the txn claim to prevent cascading chains from the same underlying event (and a means of using it for reconciliation), and transmitters can now specify in their metadata which streams have no subjects by default or “appropriate subjects”.

Continuous Access Evaluation Profile (CAEP) Draft 03

The big update here is the introduction of 2 new events: “Session Established” and “Session Presented”. Additionally, the draft has been updated to reflect new formats and fields in examples to match the new SSF draft.

CAEP Interoperability Profile – Draft 00

The first version of the CAEP Interoperability Profile, which defines how implementations can be fully interoperable for specific use cases such as session revocation and credential change, has also been released.

To learn more about the new drafts from the Shared Signals Working Group (SSWG) please click here.

About the OpenID Foundation

The OpenID Foundation (OIDF) is a global open standards body committed to helping people assert their identity wherever they choose. Founded in 2007, we are a community of technical experts leading the creation of open identity standards that are secure, interoperable, and privacy preserving. The Foundation’s OpenID Connect standard is now used by billions of people across millions of applications. In the last five years, the Financial Grade API has become the standard of choice for Open Banking and Open Data implementations, allowing people to access and share data across entities. Today, the OpenID Foundation’s standards are the connective tissue to enable people to assert their identity and access their data at scale, the scale of the internet, enabling “networks of networks” to interoperate globally. Individuals, companies, governments and non-profits are encouraged to join or participate. Find out more at openid.net.

The post All Aboard the CAEP-Ability Hype Train! first appeared on OpenID Foundation.


Elastos Foundation

Welcoming Elastos’ New International Airport: Chainge Finance


We are pleased to announce the recent integration between Elastos and Chainge Finance, enabling users to swap assets like ETH, USDC, and USDT from over 14 blockchain networks into ELA on the Elastos Smart Chain (ESC) or Ethereum (ETH) and back. This integration breaks down barriers and fosters a unified financial ecosystem in Web3. Use here now!

Elastos’ Friction Point

Elastos has long struggled with bridging issues, which have affected ecosystem growth. Users on exchanges like Coinbase faced difficulties accessing Elastos, as only ELA on Ethereum was available. Previous solutions, such as Glide Finance’s high-fee shadow token bridge and ELK Finance‘s complex swaps, frustrated the community. Just as airports facilitate international travel and business, cross-chain interoperability is vital for a cohesive DeFi ecosystem, reducing costs, enhancing tourism, and promoting overall business.

Unlike centralised exchanges (CEXs), with their third-party control, stringent identity checks (KYC), and entry/exit restrictions, decentralised cross-chain exchanges (DEXs) like Chainge offer a permissionless alternative. They enable secure, seamless asset transfers across blockchains, supporting decentralised identities and eliminating centralisation risks, thus improving utility, security, and efficiency for all Web3 stakeholders.

Interconnectivity

Stemming from CRC Proposal 151 and led by Sasha Mitchell, Elacity CEO and BeL2 Head of Operations, in collaboration with Chainge team CEO DJ Qian, Elastos and BeL2 Co-Founder Sunny Feng Han, and the BunnyPunk team, Chainge’s cross-chain DEX has been integrated with Elastos. This integration merges liquidity pools on Ethereum (Uniswap) and ESC (Glide Finance) and adds 18,513 ELA and 41,655 USDC liquidity on Chainge’s Fusion blockchain, allowing users to swap assets into ELA from over 14 chains. These chains include Fusion, Ethereum, BNB Chain, Base, Avalanche C, Polygon, Aurora, Syscoin Rollux, X Layer Mainnet, CoreDAO, Syscoin NEVM, Arbitrum, Optimistic, Linea, zkSync and B2 (recent BeL2 partner).


In this screenshot, we show how USDC on Arbitrum was successfully used to purchase ELA on ESC in a single transaction using a decentralised wallet. This simplifies access to the Elastos ecosystem and its various Dapps, effectively opening up Elastos to the entire Web3 community with a new international airport.


Cyber Republic Proposal #294: Banking, Liquidity and Slippage

Next, working with Chainge, the goal is to soon connect to a fiat on/off ramp service, allowing users to buy ELA directly with credit cards or bank accounts and exchange ELA for US dollars into their bank accounts, enhancing accessibility and ease of use.

However, there is still a challenge to tackle surrounding liquidity and slippage. Liquidity is the ease of converting an asset to cash without affecting its price, while slippage is the difference between the expected and actual trade price. High liquidity and low slippage ensure quick, predictable trades, enhancing user experience and driving adoption. Deepening liquidity and reducing slippage are crucial for an efficient financial ecosystem. Below, we can see that a 1,000 ELA order on Chainge has low fees, while a 10,000 ELA order incurs drastic fees of above 15% due to lacking liquidity and high slippage.
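For intuition on why the larger order is punished so heavily, here is a toy constant-product market maker calculation using the pool sizes quoted above. This is a simplified model for illustration only; Chainge's actual routing aggregates liquidity across chains and applies its own fee schedule, so the real numbers differ.

```typescript
// Toy constant-product AMM (x * y = k) showing how price impact grows with
// order size relative to pool depth. Pool sizes are those quoted above;
// this ignores Chainge's real cross-chain routing and fees.
const elaReserve = 18_513;  // ELA in the pool
const usdcReserve = 41_655; // USDC in the pool

function priceImpact(elaToBuy: number): number {
  const k = elaReserve * usdcReserve;
  const usdcIn = k / (elaReserve - elaToBuy) - usdcReserve; // USDC paid in
  const avgPrice = usdcIn / elaToBuy;
  const spotPrice = usdcReserve / elaReserve;
  return (avgPrice - spotPrice) / spotPrice; // relative price impact
}

console.log(`1,000 ELA:  ${(priceImpact(1_000) * 100).toFixed(1)}% impact`);
console.log(`10,000 ELA: ${(priceImpact(10_000) * 100).toFixed(1)}% impact`);
```

The point of the proposal below is exactly this: deeper reserves make the same order move the price far less.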

Sasha Mitchell proposes using the remaining 197,152 stablecoin assets from the G20 proposal to match with CRC ELA and add to Chainge’s liquidity pool, which currently holds 18,513 ELA and 41,655 USDC. This will drastically boost cross-chain and banking purchase liquidity, reducing slippage and transaction costs and setting Elastos up for the upcoming fiat on/off ramp service. This increased ELA liquidity will allow stakeholders to move assets between chains and banks more efficiently without high costs. Deep liquidity and low slippage help enable quick, stable, and predictable trades.

For more information and to participate in ongoing developments, visit the CRC proposal by Sasha and explore the live bridge on Chainge Finance via the web dapp, or download the mobile app from Google Play or the Apple App Store. Excited to learn more? Follow Infinity for the latest updates!



Next Level Supply Chain Podcast with GS1

Replay: Ways to Build an Enduring Brand on Amazon with Shannon Roddy


Today, the speed of change in the market and on Amazon is rapid, making it difficult for brands to keep up and see continued success. But never fear, Shannon Roddy, of Avenue7Media, is here to give us insights into the brand-building strategies you need to succeed on Amazon, and beyond! 

Key takeaways:

Building a defensible brand is crucial for long-term success. Invest in building a brand that is recognizable, trustworthy, and unique to differentiate yourself from your competitors.

Amazon holds over 50% of the online market and can significantly impact the success or failure of a brand. Harnessing Amazon's data and feedback is crucial for identifying trends, understanding demographics, and developing new products.

Leveraging Amazon's platform and customer data can give you a competitive edge, but you need to adapt to changing customer preferences and market demands.


Connect with GS1 US:

Our website - www.gs1us.org

GS1US on LinkedIn


Connect with our guest:

Follow Shannon Roddy on LinkedIn

More on Avenue7Media


Jump into the Conversation:

[1:42] Can you share a little bit of your background and what you’ve been working on in the last couple of years?

[4:31] The Amazon space is constantly evolving, are there some major trends or changes that have happened recently?

[10:33] You gave us some examples when we talked before of how things can go wrong for brands on Amazon, so how can you help them make things go right?

[14:45] What are some other tips and tricks that you can offer?

[18:37] Does that also mean discontinuing product one and two while you expand out, or is that what you learn from the data?

[27:14] What do you see is next from Amazon’s perspective?

[29:06] What trends are you seeing that are blowing your mind?

[32:31] What’s your favorite technology?


Digital Identity NZ

Government bringing in new digital trust framework


The government has quietly ushered in the beginnings of what it hopes will be the answer to people’s experiences of fraud and lack of trust online. Its new digital trust framework has gone live in recent days.

Digital Identity New Zealand Executive Director Colin Wallis spoke to Radio New Zealand this morning, “The intent is that you’ll have a safer digital playing field as a baseline to build other services on top of. It’s just going to take some time for the ripple through where we are now for it to become seismic.”

Listen to the full recording

You can learn more about the DISTF and the Digital Public Infrastructure on Tuesday 13 August at The Digital Trust Hui at Te Papa, Te Whanganui a Tara.

The post Government bringing in new digital trust framework appeared first on Digital Identity New Zealand.

Tuesday, 09. July 2024

We Are Open co-op

Behind the Scenes of Our New Project on Job Readiness Credentials

A step-by-step guide to our project kickoff with Jobs for the Future and International Rescue Committee

Context

We Are Open Co-op (WAO) is kicking off some work this week, collaborating with Jobs for the Future (JFF) to assess the Job Readiness Credential provided by the International Rescue Committee (IRC). WAO is managing the project, developing a user research strategy, preparing necessary materials, and conducting interviews with employers, IRC staff, and, if possible, IRC clients.

Our broad key question relates to how the visual design and metadata contained in a digital badge impact employer perceptions and interactions. We want to help JFF and the IRC have the most impact possible with the Job Readiness Credential because that impact means changing the lives of real people.

How we approach this kind of work

At the start of any project, it’s important to know the absolute basics. In fact, it’s a good time to get the Busytown Mysteries theme tune in your head as an earworm! The 5W’s and an H help make sure we know all of the things necessary to set the project up for success. Ideally, we’d know most of this before even signing the contract, but anything missing we can pick up in the client kick-off meeting.

Before the client kick-off meeting, we have an internal project kick-off where we talk about everything from timelines and responsibilities, to setting up the digital environments in which we’ll do the work. If we need to purchase any new equipment or subscriptions, we’ll identify those in this meeting. Our guidelines for this can be found on the WAO wiki.

Communications and cadence

Early days of the JFF/IRC Trello board. It’s the usual kanban format with the addition of the self-explanatory ‘Feedback Needed’ column along with ‘Undead’. The latter is for cards that would otherwise get stuck somewhere but that we don’t want to delete/archive just in case they come back to bite us!

Getting into the right rhythm with clients is an art rather than a science. While it’s easy to put an hour in the calendar each week for a catch-up call, this is sub-optimal for anything other than the very short term. This is because, in our experience, these kinds of calls quickly devolve into status update meetings.

Much better is to work as openly as possible. Sometimes that means entirely publicly with etherpads, public Trello boards, and the like. Other times, it’s working transparently with tools that provide either real-time or summary updates. Often this means that the number and frequency of meetings can be reduced. With our recent work with the DCC, for example, we met every other week, aiming for 45 minutes. Between meetings, we sent Loom videos and other sorts of outputs to make sure our collaborators knew how thinking had evolved.

While it’s important that there is a project lead from both sides, it’s also crucial that their inboxes do not become information silos. Larger organisations might use CRM systems, but for us information is best in context. So, for example, a Google Doc for ongoing meta-level important info, and everything else on the relevant Trello card (or equivalent).

Documentation is not putting a message in a Slack channel or mentioning something during a meeting. Documentation is writing something down in an uncontroversial way that makes sense to everybody involved in the project. This is important because humans can only hold so much information in our heads at one time, and our memories can be faulty.

Everything is a work in progress

CC BY-ND Visual Thinkery for WAO

‘Perpetual beta’ is another name for saying that everything is a work in progress. What’s true of software is true of documentation and everything involved in a project. Conclusions are provisional and based on the data and knowledge we had at the time.

To account for this, we usually version our work, starting at v0.1 rather than 1.0. The reason for this is to show the client (and ourselves) that we’re working towards our first fully-formed opinions and outputs. It’s all part of our attempt to work openly and show our work.

With this work that we are starting with JFF and IRC, we’ll be talking to stakeholders in a couple of different places. Our human brains want to take shortcuts and jump to conclusions quickly so that we can take action. However, we’ve learned to “sit in ambiguity” for long enough to allow thoughts and reflections to percolate. This slower kind of thinking allows us to spot things that might have been missed by our ‘System 1’ mode of thought.

Conclusion

We’re greatly looking forward to getting started with this work. We haven’t gone into how we perform user research, which is perhaps the topic for a future post. There’s a lot to cover from that point of view in terms of ethics, data, and different kinds of methodologies.

What we hope that we have shown in this post is our commitment to working openly, holistically, and thoroughly so that the outputs we generate are trusted, interesting, and actionable. We’ll share more on the project as it progresses.

Behind the Scenes of Our New Project on Job Readiness Credentials was originally published in We Are Open Co-op on Medium, where people are continuing the conversation by highlighting and responding to this story.

Monday, 08. July 2024

Hyperledger Foundation

LF Decentralized Trust: A Bigger Tent for Projects, Labs, Members, and Communities


In case you missed it, the Linux Foundation recently announced the intent to form LF Decentralized Trust, a new bigger umbrella organization where we will gather and grow an expanded community and portfolio of technologies to deliver the transparency, reliability, security, and efficiency needed to successfully upgrade critical systems worldwide.


DIF Blog

DIF welcomes JC Ebersbach as co-chair of the Identifiers & Discovery WG


DIF is delighted to have Jan Christoph (JC) Ebersbach join us as an Identifiers and Discovery Working Group co-chair, and as a new member of DIF Technical Steering Committee.

"JC is one of the rare individuals who combine strong technical expertise and creativity with community leadership abilities. Within a relatively short period of time, he has attracted a lot of interest in his work, which is already being adopted, such as Linked VPs and DID Traits. I'm excited that he has agreed to join as I&D WG co-chair, and I look forward to the collaboration!" said existing WG chair, Markus Sabadello.

"I've been following DIF's work since 2019. However, I didn't actively participate in the work items. With last year's first did:hack hackathon, my interest spiked due to the discovery of initial ideas that culminated in the Linked Verifiable Presentations specification, and its recent ratification as a DIF Deliverable", JC said.

"The collaboration and support I received from the working group, and from Markus in particular, inspired me to take on the role of co-chair.

"The Identifiers & Discovery WG is an invaluable resource for working with and building DIDs. I feel honored to serve as co-chair and I'm looking forward to advancing Decentralized Identifiers with our working group," he added.

“JC has shown tremendous leadership in the decentralized identity community, including through his work on Linked Verifiable Presentations, a recently ratified DIF specification, and DID Traits. We are delighted and honored he has accepted the role of Identifiers and Discovery WG co-chair, as well as a position on the DIF Technical Steering Committee. DIF will greatly benefit from his leadership,” commented DIF's Executive Director, Kim Hamilton Duffy.
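For readers who have not yet encountered the Linked Verifiable Presentations specification mentioned above: it lets a DID controller advertise publicly available verifiable presentations via a service entry in their DID document. A minimal sketch follows, with a made-up DID and endpoint URL; consult the ratified DIF specification for the normative details.

```typescript
// Sketch of a DID document advertising a linked verifiable presentation.
// The service type comes from the DIF Linked VP spec; the DID and endpoint
// URL are placeholders for illustration.
const didDocument = {
  "@context": ["https://www.w3.org/ns/did/v1"],
  id: "did:example:123",
  service: [
    {
      id: "did:example:123#linked-vp",
      type: "LinkedVerifiablePresentation",
      serviceEndpoint: "https://example.com/presentations/profile.json",
    },
  ],
};
```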


Identity At The Center - Podcast

The Identity at the Center Podcast episode this week dives into passkey insights and challenges with Martin Sandren from IKEA


The Identity at the Center Podcast episode this week dives into passkey insights and challenges with none other than Martin Sandren from IKEA. We discussed everything from the future of passkeys to AI's role in cybersecurity. This episode is packed with valuable insights and practical advice for passkey adoption in the real world.

Watch it at https://www.youtube.com/watch?v=R94eG1gTcN8 or listen in your podcast app. Visit idacpodcast.com for more info.

#iam #podcast #idac

Friday, 05. July 2024

Ceramic Network

Ceramic Nodes in Production: Example Costs + Scenarios

Running a Ceramic node involves several key services. Learn about what production costs to expect across example hypothetical scenarios.

Running a Ceramic node in a production environment involves several key components. This article provides an overview of the necessary resources and cost estimates for deploying a Ceramic node in the cloud. While we only showcase two specific providers for the required services (DigitalOcean and QuickNode), we hope the cost examples in the hypothetical scenarios below will give you a general idea of what to expect.

Components Required for a Ceramic Node

There are several sub-services to consider when running a Ceramic node in production, each serving different functions. As such, you will need:

1. Resources for JS-Ceramic Functionality: Tracks and stores the latest tips for pinned streams, caches stream state, provides the HTTP API service for connected clients to read, and communicates with the Ceramic Anchor Service (CAS) for blockchain anchoring and timestamping.
2. Resources for Ceramic-One Functionality: These nodes store the actual data and coordinate with network participants.
3. Resources for Postgres Database Functionality: Required for indexing data.
4. Ethereum RPC Node API Access: Required to validate CAS anchors.
5. Ceramic Anchor Service (CAS) Access: Anchors Ceramic protocol proofs to the blockchain. Current status: this service is funded by 3Box Labs; eventually, this function will be provided by node operators, with some expected cost.

Baseline Recommended Resources

Given the services you’ll need above, the Ceramic team has tested and organized a set of “baseline” configuration settings we recommend when setting up your node. Since these are baseline (i.e., average) recommendations, you may need to increase resourcing based on your actual usage:

JS-Ceramic: 2 vCPU, 4 GB memory, 10 GB disk for state

Ceramic-One: 4 vCPU, 4 GB memory, 100 GB disk for storage

Postgres Database: 2 vCPU, 4 GB memory, 10 GB disk for indexing

High Traffic Recommended Resources

The Ceramic team has also tested and organized a set of “High Traffic” configuration settings recommended for nodes serving heavier workloads. As with the baseline, you may need to increase resourcing based on your actual usage:

JS-Ceramic: 2 vCPU, 4 GB memory, 10 GB disk for state, 10,000 IOPs

Ceramic-One: 6 vCPU, 8 GB memory, 500 GB disk for storage, 15,000 IOPs

Postgres Database: 2 vCPU, 4 GB memory, 10 GB disk for indexing

High Availability Configuration

For high availability, an additional node can be configured to sync data and handle dynamic read/write tasks, thus doubling the cost of a single-node setup.

Ethereum RPC Node Endpoint Costs

We’ve also chosen QuickNode to provide several RPC cost examples:

QuickNode Base Plan: $10/month (100 million API credits, 2 endpoints, 550 credits/second)

QuickNode Middle Plan: $49/month (500 million API credits, 10 endpoints, 2,500 credits/second)

QuickNode Premium Plan: $299/month (3 billion API credits, 20 endpoints, 6,000 credits/second)

Hypothetical Scenarios and Cost Estimates

Let’s walk through two hypothetical scenarios and use them to help estimate cost structure:

Application A: Small User Base

User Base: 10,000 monthly active users
Query Behavior: 30% writes, 70% reads
Availability: Low-priority
Configuration: Baseline resources
Cost Estimate: Node: $96/month; Ethereum RPC: $10/month; Total: $106/month

Application B: Write-Heavy Mid-Sized Application

User Base: 100,000-500,000 monthly active users
Query Behavior: 70% writes, 30% reads
Availability: High priority (2-node setup)
Configuration: High Traffic
Cost Estimate: Nodes (2x): $918/month (2x $459); Ethereum RPC: $49/month; Total: $967/month

Example GCP budget
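To turn the figures above into quick estimates for other configurations, here is a minimal TypeScript sketch of the same arithmetic. The per-node prices ($96 baseline, $459 high traffic) and the RPC plan prices are the example numbers from this article; the type and function names are ours.

```typescript
// Rough monthly cost model using the example prices from this article.
type NodeTier = "baseline" | "highTraffic";

// Example per-node monthly prices (USD) taken from Applications A and B above.
const NODE_PRICE_USD: Record<NodeTier, number> = {
  baseline: 96,
  highTraffic: 459,
};

interface Deployment {
  tier: NodeTier;
  nodeCount: number;  // 2 for a high-availability setup
  rpcPlanUsd: number; // 10, 49, or 299 for the QuickNode plans listed above
}

function monthlyCostUsd(d: Deployment): number {
  return NODE_PRICE_USD[d.tier] * d.nodeCount + d.rpcPlanUsd;
}

// Application A: one baseline node + QuickNode Base plan
console.log(monthlyCostUsd({ tier: "baseline", nodeCount: 1, rpcPlanUsd: 10 })); // 106

// Application B: two high-traffic nodes + QuickNode Middle plan
console.log(monthlyCostUsd({ tier: "highTraffic", nodeCount: 2, rpcPlanUsd: 49 })); // 967
```

Note that this ignores networking egress, which is covered next.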

Other Considerations

Additional cloud costs must be considered for networking - these costs will vary based on traffic patterns. Most cloud providers offer free traffic ingress to the nodes but will charge for egress, or data leaving the nodes.

Running a Ceramic node in production involves various components and resources, each contributing to the overall cost. By understanding the necessary configurations and associated costs, developers can make informed decisions tailored to their application's needs and user base. High availability setups and resource over-provisioning can significantly impact costs, especially for mid-sized applications with high traffic and write volumes.

Thursday, 04. July 2024

Digital ID for Canadians

OIX and DIACC join forces to move digital trust and verification interoperability forward

Open Identity Exchange (OIX) and DIACC commit to finding alignment for global policies on digital trust and verification.

UK, June 2024 – The global non-profit Open Identity Exchange (OIX) and the Canadian non-profit Digital ID Authentication Council of Canada (DIACC) have committed to working together to advance global digital interoperability – a crucial element for trusted, successful international trade in a rapidly advancing digital global economy.

OIX is an influential global community for all those involved in the ID sector to connect and collaborate, developing the thought leadership and guidance needed to enable interoperable, trusted identities carried seamlessly from place to place in ‘roaming wallets’ for everyone. DIACC is an equally influential community of public and private sector leaders committed to securing the benefits of an inclusive digital economy by promoting user-centric design principles and by verifying private sector services against the Pan-Canadian Trust Framework (PCTF), supporting a secure ecosystem of services that enables user-directed information verification between public and private sector data authorities.

The two organisations will explore how different country-based policies related to identity management, verification, security, data privacy innovation and approaches to digital identity assurance can be compared and analysed so that more rapid progress can be made towards global digital ID interoperability through alignment of policy or acceptance of policy differences.

The collaboration will focus on advancing methods for participants in one framework to accept identity verification and digital credentials verified through another trust framework based on a mixture of policy acceptance and technology adaption. DIACC and OIX will explore equivalency and interoperability processes, identify potential alignments, new standards required, and gaps that may need to be addressed, and highlight use cases that can be facilitated through interoperability across digital ecosystems. Within this work, they will explore methods to describe common features of jurisdictional and sectoral trust frameworks, and share insights widely available as a resource.

The exchange and transfer of knowledge and expertise will be at the heart of this collaboration. OIX and DIACC will work together to create ‘intellectual capital’ to shape debate and bring about actions, moving identity management, data privacy, and security forward at pace.

Nick Mothershaw, Chief Identity Strategist at OIX, said: “The benefits of the digital global economy will be vast, but there is still some way to go before everyone can confidently access them. Our collaboration with DIACC will play a critical role. The fantastic progress DIACC has already made across Canada is an exemplar for global interoperability and will provide much needed insight, tools and guidance to pave a much clearer way forward globally.

“Our plans are to share our work with other trust frameworks across the globe, by publishing the criteria and values, and in the short-term creating an interim tool for trust frameworks to use for policy areas. We also want to secure their input on what they want to see in Trust Framework Comparison tool, as well as to start demonstrating how a roaming wallet will work.”

Joni Brennan, DIACC President, said: “We’re thrilled to collaborate with the Open Identity Exchange. The formalization of our liaison demonstrates progress in supporting our shared values to advance secure, user-centric digital identity solutions globally. Our collaboration will leverage each organization’s expertise to explore opportunities to foster innovation, enhance interoperability, and build public trust in digital services by identifying the alignments and gaps between jurisdictional and sectoral trust frameworks.”

For more information, please contact Serj Hallam at communications@openidentityexchange.org 

About The Open Identity Exchange (OIX)

The OIX is a non-profit trade organisation on a mission to create a world where everyone can prove their identity and eligibility anywhere through a universally trusted ID. OIX is a community for all those involved in the ID sector to connect and collaborate, developing the guidance needed for inter-operable, trusted identities. Through our definition of, and education on Trust Frameworks, we create the rules, tools and confidence that will allow every individual a trusted, universally accepted, identity.

About The Digital ID and Authentication Council of Canada (DIACC)

The Digital ID and Authentication Council of Canada (DIACC) is a not-for-profit corporation of Canada that benefits from membership of public and private sector leaders committed to developing a trust framework to enable Canada’s full and secure participation in the global digital economy. DIACC’s objective is to unlock economic opportunities for consumers and businesses by providing the framework to develop a robust, secure, scalable and privacy-enhancing digital identification and authentication ecosystem that will decrease costs for governments, consumers, and businesses while improving service delivery and driving GDP growth.


Origin Trail

DKG V8: Scaling Verifiable Internet for AI to Any Device, for Anyone, on Any Chain


Driving data interconnectivity, interoperability, and integrity, the Decentralized Knowledge Graph (DKG), now in its 6th iteration, delivers significant advancements that have benefited world-class organizations and shaped standards for industrial information exchange. Through partnerships with entities such as the British Standards Institution¹²³, GS1⁴⁵, European Blockchain Sandbox⁶, and various government-funded initiatives, the DKG has also played a crucial role in informing public policies.

DKG uniquely and effectively addresses the challenges of data ownership, AI hallucinations, and bias⁷ with the Decentralized Retrieval-Augmented Generation (dRAG)⁸ framework. dRAG drives a vast advancement of the RAG model initially developed by Meta⁹, by organizing external sources in a DKG while introducing incentives to grow a global, crowdsourced network of knowledge made available for AI models to use.

Through a prototype, the DKG V8 has demonstrated the unprecedented scale at which the Verifiable Internet for AI can drive value for anyone, on any device, and on any chain. Addressing sensitive data concerns, scalability, and AI challenges concurrently has produced encouraging results that importantly shape the expected V8 release timeline.

DKG V8 — for Anyone, on Any Device, on Any Chain at Internet Scale

OriginTrail DKG has been battle-tested in real-world applications, increasingly used by an ecosystem of organizations and government-supported initiatives. To date, no decentralized system has scaled in a production environment the way the V6 DKG has. However, the current capacity of the DKG has reached its limits in supporting growing usage requirements, prompting a transition to V8, which has evolved to tackle the scale at which AI is consumed in any environment.

Data has been growing exponentially for decades, with AI driving further growth acceleration — according to the latest estimates, 402.74 million terabytes of data are created each day¹⁰. This trend is increasingly visible in the rising demands for additional capacity in the DKG, driven by data-intensive industry deployments in aerospace, manufacturing, railways, consumer goods, and construction driving DKG growth.

Version 8 of the DKG has therefore been designed with major scalability improvements at multiple levels, with a prototyped implementation tested in collaboration with partners from the sectors mentioned above.

3 key products of OriginTrail DKG V8

The major advancement that DKG V8 is making is in expanding the OriginTrail ecosystem’s product suite to 3 key products:

DKG Core Node V8 — highly scalable network nodes forming the network core, persisting the public replicated DKG

DKG Edge Node V8 — user-friendly node applications tailored to edge devices (phones, laptops, etc.)

ChatDKG V8 — the launchpad for creating AI solutions using decentralized Retrieval-Augmented Generation (dRAG)

DKG Edge Node — enabling the largest, internet-scale decentralized physical infrastructure network (DePIN)

The newcomer in the product suite is the DKG Edge Node, a new type of DKG node enabling the OriginTrail ecosystem to tackle the global challenges described above. As the name suggests, DKG edge nodes can operate on internet edge devices. Personal computers, mobile phones, wearables, and IoT devices, as well as enterprise and government systems, host huge volumes of important data activity, and DKG edge nodes will enable that data to enter the AI age in a safe and privacy-preserving way. The DKG edge node will enable such sensitive data to remain protected on the device, giving owners full control over how their data is shared.

Together with being protected on the device, edge-node data becomes a part of the global DKG with precise access management permissions controlled by the data owner. In this way, AI applications that the owner allows data access to will be able to use it together with the public data in the DKG via decentralized Retrieval Augmented Generation (dRAG).

Since such AI applications can equally be run locally on devices directly, this enables fully privacy-preserving AI solutions aimed at the ever-growing number of devices on the network edge that can at the same time use both public and private DKG data. The introduction of the DKG edge node enables the DKG to quickly expand to be the largest, internet-scale decentralized physical infrastructure network (DePIN).

New features of the DKG Edge Node

To unlock these powerful capabilities, the DKG edge node will include new features that have not previously been available on DKG nodes but were elements of other proprietary or open-source products.

To enable a seamless creation of knowledge, DKG nodes will inherit the proven knowledge publishing pipelines from the Network Operating System (nOS). The data protection techniques for private and sensitive data will be based on the NGI-funded OpenPKG project outcomes. The DKG Node will support all major standards such as GS1 Digital Link, EPCIS, Verifiable Credentials, and Decentralized Identities. To support the growing field of knowledge graph implementations globally, it will enable seamless knowledge graph integrations of major knowledge graph providers such as Ontotext, Oracle, Snowflake, Neo4j, Amazon Neptune, and others.
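To give a sense of what knowledge publishing looks like from a developer's perspective, here is a sketch using the dkg.js client library. The call shape follows the V6 client documentation; the endpoint, blockchain configuration, and asset content are placeholders, and the V8 edge node interface may differ.

```typescript
// Sketch of publishing a Knowledge Asset with the dkg.js client (V6-style
// API; the V8 edge node interface may differ). Endpoint, blockchain config,
// and asset content below are placeholders.
import DKG from "dkg.js";

const dkg = new DKG({
  endpoint: "http://localhost", // your node's URL
  port: 8900,
  blockchain: { name: "otp:testnet" }, // network name varies by deployment
});

async function publish() {
  const result = await dkg.asset.create(
    {
      public: {
        "@context": "https://schema.org",
        "@type": "Product",
        name: "Example knowledge asset",
      },
    },
    { epochsNum: 2 } // how long the network should keep the asset
  );
  console.log(result.UAL); // Universal Asset Locator of the new asset
}

publish().catch(console.error);
```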

DKG Edge Node V8 Prototype — Oura Ring integration with demonstrated 400 Knowledge Assets published in 10 seconds

DKG V8 Timeline

The V8 DKG launch sequence consists of 4 stages, aligned with the wider OriginTrail ecosystem roadmap, with a forkless upgrade to V8.

Stage 1: V8 multi-chain infrastructure deployment

Paranet deployment and first IPOs launched

Base blockchain integration

Cross-chain knowledge mining support

Stage 2: DKG core internet-scale V8 testnet launch

Asynchronous backing

Knowledge Assets V2: Batch minting (in prototype)

DKG Core: Random sampling (in prototype)

Stage 3: DKG edge nodes on V8 testnet

Edge node beta launch

Knowledge Assets V2: Batch minting & native vector support

DKG Core: Random sampling deployment

Stage 4: V8 mainnet upgrade deployment (October 2024)

To stay on trac(k) with updates on DKG V8 as it nears the deployment phase, make sure to join our Telegram or Discord channels!

¹https://v1.bsigroup.com/en-GB/insights-and-media/media-centre/press-releases/2023/july/new-solution-developed-for-cross-border-food-transfers/

²https://page.bsigroup.com/BSI-Academy-Blockchain-Solution

³https://www.bsigroup.com/globalassets/localfiles/en-th/innovation/blockchain-white-paper-th.pdf

⁴https://www.gs1.org/sites/default/files/bridgingblockchains.pdf

⁵https://www.gs1si.org/novice/novica/origintrail-resuje-izziv-ponarejenega-viskija

⁶https://ec.europa.eu/digital-building-blocks/sites/display/EBSISANDCOLLAB/European+Blockchain+Sandbox+announces+the+selected+projects+for+the+second+cohort

⁷https://origintrail.io/documents/Verifiable_Internet_for_Artificial_Intelligence_whitepaper_v3_pre_publication.pdf

⁸https://origintrail.io/blog/decentralized-rag-with-origintrail-dkg-and-nvidia-build-ecosystem

⁹https://ai.meta.com/blog/retrieval-augmented-generation-streamlining-the-creation-of-intelligent-natural-language-processing-models/

DKG V8: Scaling Verifiable Internet for AI to Any Device, for Anyone, on Any Chain was originally published in OriginTrail on Medium, where people are continuing the conversation by highlighting and responding to this story.


Energy Web

Energy Web Launches Full RPC Node for the Energy Web Chain

Robust, Reliable Full Node RPC Now Available with Multiple Deployment Options

Energy Web, a leader in blockchain technology solutions for the energy sector, is proud to announce the launch of its new Full RPC Node for the Energy Web Chain (EWC). This state-of-the-art product is designed to provide a robust and reliable full node EWC RPC offering, ensuring seamless and efficient operations for energy sector enterprises and application developers.

The new Full RPC Node is available in two flexible deployment options: fully managed or Bring Your Own Cloud (BYOC). Clients can choose to deploy their node on leading cloud platforms including AWS, GCP, and Digital Ocean. This flexibility ensures that organizations can select the deployment model that best fits their operational needs and technical environments.
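As a minimal sketch, assuming an ethers.js client, this is roughly how an application would point at its dedicated node; the node URL below is a placeholder for the endpoint provisioned with your managed or BYOC deployment.

```typescript
// Point an ethers.js (v6) provider at a dedicated Energy Web Chain RPC node.
// The URL is a placeholder for your managed or BYOC endpoint.
import { JsonRpcProvider } from 'ethers';

// EWC's chain ID is 246; passing it skips network auto-detection.
const provider = new JsonRpcProvider('https://your-dedicated-node.example.com', 246);

async function main() {
  const block = await provider.getBlockNumber();
  console.log(`Latest EWC block: ${block}`);
}

main().catch(console.error);
```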

Key features of the Energy Web Full RPC Node include:

- Fully Dedicated Node: Each client receives a dedicated node, eliminating rate limiting and ensuring optimal performance and security for their blockchain applications.
- Comprehensive Security: Nodes are properly secured, providing peace of mind that organizational data and transactions are protected.
- Embedded Analytics Dashboards: Integrated analytics dashboards offer deep insights and real-time monitoring, enabling clients to make informed decisions based on accurate data.

The introduction of the Full RPC Node further expands Energy Web’s infrastructure offerings, reinforcing the company’s commitment to providing cutting-edge solutions that meet the evolving needs of the energy sector.

“With the launch of our Full RPC Node, we’re offering a powerful tool for organizations that require robust access to the Energy Web Chain,” said Jesse Morris, Senior Fellow of Energy Web. “This product ensures that our clients can operate their applications smoothly and securely, with the flexibility to choose a deployment option that best suits their needs.”

For more information about the Energy Web Full RPC Node and how it can benefit your organization, please visit www.smartflow.org

About Energy Web

Energy Web is a global non-profit organization accelerating the energy transition by developing and deploying open-source decentralized technologies. Our solutions leverage blockchain to enable new market mechanisms and decentralized applications that empower energy companies, grid operators, and customers to take control of their energy futures.

Energy Web Launches Full RPC Node for the Energy Web Chain was originally published in Energy Web on Medium, where people are continuing the conversation by highlighting and responding to this story.

Tuesday, 02. July 2024

OpenID

Notice of Vote for Proposed Fourth Implementer’s Draft of OpenID Federation

The official voting period will be between Wednesday, July 17, 2024 and Wednesday, July 24, 2024 (11:59:59PM PT), once the 45 day review of the specification has been completed. For the convenience of members who have completed their reviews by then, voting will actually begin on Wednesday, July 10, 2024. The OpenID Connect Working Group […] The post Notice of Vote for Proposed Fourth Implemente

The official voting period will be between Wednesday, July 17, 2024 and Wednesday, July 24, 2024 (11:59:59 PM PT), once the 45-day review of the specification has been completed. For the convenience of members who have completed their reviews by then, voting will actually begin on Wednesday, July 10, 2024.

The OpenID Connect Working Group page is https://openid.net/wg/connect/. If you’re not already a member, or if your membership has expired, please consider joining to participate in the approval vote. Information on joining the OpenID Foundation can be found at https://openid.net/foundation/members/registration.

The vote will be conducted at https://openid.net/foundation/members/polls/331.

The post Notice of Vote for Proposed Fourth Implementer’s Draft of OpenID Federation first appeared on OpenID Foundation.


Oasis Open Projects

Invitation to comment on TOSCA Version 2.0

OASIS and the OASIS Topology and Orchestration Specification for Cloud Applications (TOSCA) TC are pleased to announce that TOSCA Version 2.0 is now available for public review and comment. This 30-day review is the third public review for this specification. About the specification draft: The Topology and Orchestration Specification for Cloud Applications (TOSCA) provides a […] The post Invita

Public review - ends July 31st

OASIS and the OASIS Topology and Orchestration Specification for Cloud Applications (TOSCA) TC are pleased to announce that TOSCA Version 2.0 is now available for public review and comment. This 30-day review is the third public review for this specification.

About the specification draft:

The Topology and Orchestration Specification for Cloud Applications (TOSCA) provides a language for describing application components and their relationships by means of a service topology, and for specifying the lifecycle management procedures for creation or modification of services using orchestration processes. The combination of topology and orchestration enables not only the automation of deployment but also the automation of the complete service lifecycle management. The TOSCA specification promotes a model-driven approach, whereby information embedded in the model structure (the dependencies, connections, compositions) drives the automated processes.

The documents and related files are available here:

TOSCA Version 2.0
Committee Specification Draft 06
20 June 2024

https://docs.oasis-open.org/tosca/TOSCA/v2.0/csd06/TOSCA-v2.0-csd06.md
https://docs.oasis-open.org/tosca/TOSCA/v2.0/csd06/TOSCA-v2.0-csd06.html
https://docs.oasis-open.org/tosca/TOSCA/v2.0/csd06/TOSCA-v2.0-csd06.pdf

For your convenience, OASIS provides a complete package of the specification document and any related files in ZIP distribution files. You can download the ZIP file at:
https://docs.oasis-open.org/tosca/TOSCA/v2.0/csd06/TOSCA-v2.0-csd06.zip

How to Provide Feedback

OASIS and the TOSCA TC value your feedback. We solicit input from developers, users and others, whether OASIS members or not, for the sake of improving the interoperability and quality of our technical work.

The public review starts 02 July 2024 at 00:00 UTC and ends 31 July 2024 at 23:59 UTC.

Comments may be submitted to the TC by any person directly at:
Technical-Committee-Comments@oasis-open.org
Please use a subject line like “Comment on TOSCA”.

Comments submitted for this work and for other work of this TC are publicly archived and can be viewed at:
https://groups.google.com/a/oasis-open.org/g/technical-committee-comments/.
Previous comments on TOSCA works are archived at https://lists.oasis-open.org/archives/tosca-comment/

All comments submitted to OASIS are subject to the OASIS Feedback License, which ensures that the feedback you provide carries at least the same obligations as those of the TC members. In connection with this public review, we call your attention to the OASIS IPR Policy [1], applicable especially [2] to the work of this technical committee. All members of the TC should be familiar with this document, which may create obligations regarding the disclosure and availability of a member’s patent, copyright, trademark and license rights that read on an approved OASIS specification.

OASIS invites any persons who know of any such claims to disclose these if they may be essential to the implementation of the above specification, so that notice of them may be posted to the notice page for this TC’s work.

Additional information about the specification and the TOSCA TC can be found at the TC’s public home page:
https://groups.oasis-open.org/communities/tc-community-home2?CommunityKey=f9412cf3-297d-4642-8598-018dc7d3f409

Additional information related to this public review, including a complete publication and review history, can be found in the public review metadata document [3].

========== Additional references:

[1] https://www.oasis-open.org/policies-guidelines/ipr/

[2] https://groups.oasis-open.org/communities/tc-community-home2?CommunityKey=f9412cf3-297d-4642-8598-018dc7d3f409
https://www.oasis-open.org/policies-guidelines/ipr/#RD-Limited
“RF (Royalty Free) on Limited Terms”

[3] Public review metadata document:
https://docs.oasis-open.org/tosca/TOSCA/v2.0/csd06/TOSCA-v2.0-csd06-public-review-metadata.html

The post Invitation to comment on TOSCA Version 2.0 appeared first on OASIS Open.


FIDO Alliance

What is a passkey? Why Apple is betting on password-free tech

The digital realm has long struggled with the vulnerabilities inherent in password-based authentication systems. With iOS 18 launching in September, Apple introduces a groundbreaking API for developers to implement passkeys, […]

The digital realm has long struggled with the vulnerabilities inherent in password-based authentication systems. With iOS 18 launching in September, Apple introduces a groundbreaking API for developers to implement passkeys, transforming how users secure their online accounts. This innovation is set to create a password-less future, significantly enhancing user data protection.

What Are Passkeys?

Passkeys, developed by the FIDO Alliance, are a sophisticated passwordless login option for apps and websites. They consist of a “private key” stored on the user’s device and a “public key” residing with the service. This dual-key system undergoes an encrypted verification process, ensuring that access is granted only when the user’s biometrics or device PIN confirm their identity. This system effectively eliminates the need for passwords and multi-factor authentication codes, creating a seamless and secure user experience.
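The dual-key flow can be sketched with the browser’s WebAuthn API, which passkeys build on; the relying-party values below are placeholders, and in production the challenge is generated and verified server-side.

```typescript
// Minimal WebAuthn sketch of the passkey flow. Relying-party values are
// placeholders; real deployments issue and verify challenges server-side.

// Registration: the device creates a key pair and keeps the private key;
// the service stores only the public key.
async function registerPasskey(): Promise<Credential | null> {
  return navigator.credentials.create({
    publicKey: {
      challenge: crypto.getRandomValues(new Uint8Array(32)), // server-issued in practice
      rp: { id: 'example.com', name: 'Example App' },
      user: {
        id: new TextEncoder().encode('user-1234'),
        name: 'alice@example.com',
        displayName: 'Alice',
      },
      pubKeyCredParams: [{ type: 'public-key', alg: -7 }], // ES256
      authenticatorSelection: { residentKey: 'required', userVerification: 'required' },
    },
  });
}

// Sign-in: the device signs a fresh challenge with the private key after
// biometric or PIN verification; the service checks it with the public key.
async function signInWithPasskey(): Promise<Credential | null> {
  return navigator.credentials.get({
    publicKey: {
      challenge: crypto.getRandomValues(new Uint8Array(32)), // server-issued in practice
      rpId: 'example.com',
      userVerification: 'required',
    },
  });
}
```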

The Benefits of Passkeys

Traditional logins rely on passwords, which users often reuse across multiple sites, posing substantial security risks. Passkeys, however, are tied to the user’s unique device and biometric data, rendering them immune to phishing and brute-force attacks. If a passkey is stolen, it becomes useless without the rightful owner’s biometric verification. This intrinsic link between the user and the device significantly mitigates the threat landscape.

Banks and Passkey Adoption

While the advantages of passkeys are clear, some industries have been slow to adopt, including banks. Andrew Shikiar, CEO and Executive Director of the FIDO Alliance, explains, “Banks and financial institutions operate in a highly regulated industry, so they are vigilant when it comes to ensuring that user authentication complies with relevant regulations. Synced passkeys introduce a new customer assurance model that compliance leads within banks are still adjusting to.”

However, Shikiar noted that “we are now seeing regulatory and other government bodies begin to give formal guidance on how industry should contemplate passkeys,” including an April 2024 missive from the U.S. Department of Commerce’s National Institute of Standards and Technology (NIST) offering guidance about implementation.

But Shikiar says that “banks are hypersensitive to customer experience,” too, and thus more cautious about changing how customers log in—even if passkeys are quicker and more secure. New login methods require educating customers—and that takes time.

Despite these bottlenecks, Shikiar says that banks are slowly moving away from strictly password-based logins because they “inherently understand that using a passkey as a primary factor is far superior to a password.”

The Collaborative Future of Passwordless Authentication

Apple’s implementation of passkeys underlines a collective effort by tech giants within the FIDO Alliance, including Microsoft and Google, to enhance internet security. The Alliance has pioneered developments in authentication standards, striving to eliminate the vulnerabilities of password-based systems. Users can visit the FIDO Alliance to learn more about the ongoing efforts and advancements in passkey technology and the latest in passkey implementation.

As passkeys gain traction, the internet moves closer to a future where security does not come at the expense of user convenience. The collaborative efforts of industry leaders within the FIDO Alliance signal a transformative shift towards more secure, passwordless authentication methods, promising a safer digital experience for all.


Oasis Open Projects

Lead with Open Standards and Innovation

Picture this: you’re not just in the game, you’re leading it, crafting the very rules that define success. This is the golden opportunity that unfolds when you, as a tech innovator, dive headfirst into the realm of establishing open standards. Let’s talk about the magnetic pull of influence and the domineering stature in the tech […] The post Lead with Open Standards and Innovation appeared firs

By Francis Beland, Executive Director, OASIS Open

Picture this: you’re not just in the game, you’re leading it, crafting the very rules that define success. This is the golden opportunity that unfolds when you, as a tech innovator, dive headfirst into the realm of establishing open standards.

Let’s talk about the magnetic pull of influence and the domineering stature in the tech world. Engaging actively in setting open standards is not just about putting your name out there. It’s about embedding your products, your technologies, and your visionary outlook into the very fabric of the industry’s future. You’re not just aiming to lead; you’re set to redefine leadership. Why settle for just competing when you can be the one crafting the arena? It’s about creating a scenario where your competition isn’t just trying to catch up; they’re scrambling to decode your rulebook.

You’re a tech innovator armed with resilience, foresight, and an unmatched zest for breakthroughs. It’s not about facing challenges; it’s about embracing them, dissecting them, and transforming them into stepping stones towards your ultimate victory. Envision a future where your influence reverberates across markets, shaping demands, dictating trends, and steering the technological evolution. Your efforts in spearheading open standards not only catapult your products and strategies to the forefront but also enshrine your status as an indomitable leader and a relentless innovator.

But here’s the deal – it demands more than just ingenuity and expertise. It calls for an unyielding spirit, an insatiable appetite for excellence, and an unwavering commitment to surpass the benchmarks you’ve set yourself. It’s a call to arms for those ready to lead, innovate, and inspire. Let’s not just participate in the evolution of technology. Let’s lead it. Together, we can redefine the boundaries, push the limits, and craft a legacy that echoes through the annals of tech history. The path to unparalleled success and influence is before you. The question is, are you ready to seize it?

Remember, in the grand chessboard of technological advancement, it’s not about the pieces you start with, but how you decide to play them. Write the rules, win the game. Let’s make history. Join us at OASIS Open!

The post Lead with Open Standards and Innovation appeared first on OASIS Open.


Ceramic Network

CeramicWorld 05

The 5th edition of the CeramicWorld is finally here! Here’s a quick recap of what has been happening in the Ceramic Ecosystem in the past few weeks: Orbis has launched a new plugin for Gitcoin Passport 🔑 Index Network announces a Farcaster integration 💬 Index Network and Ceramic

The 5th edition of the CeramicWorld is finally here! Here’s a quick recap of what has been happening in the Ceramic Ecosystem in the past few weeks:

- Orbis has launched a new plugin for Gitcoin Passport 🔑
- Index Network announces a Farcaster integration 💬
- Index Network and Ceramic are calling developers to build for the Base Onchain Summer! 🏖️
- Proof of Data is coming to EthCC ✈️
- Ceramic’s new Recon Protocol is almost here! 🔥

Supercharge your crypto database with OrbisDB plugin for Gitcoin Passport! 🔥

OrbisDB team has just announced their new plugin for Gitcoin Passport.

OrbisDB is a decentralized database built on Ceramic for onchain builders, providing a practical, scalable solution for storing and managing open data. Gitcoin Passport lets users collect verifiable credentials to prove their identity and trustworthiness without revealing personal information, providing apps with a safeguard against sybil attacks and bad actors.

The new plugin allows developers to simply integrate the no-code Gitcoin Passport plugin with the OrbisDB instance to automatically generate reputation scores and filter out malicious actors from being indexed.

Check out this video to learn more and see the new plugin in action:

If you’d like to become a Beta tester for Orbis plugins, shoot the team a DM!

Index Network adds Farcaster integration

Index Network has recently added a Farcaster integration to their decentralized semantic index.

Index is a composable discovery protocol built on Ceramic, allowing the creation of truly personalized and autonomous discovery experiences across the web.

This integration allows for seamless interaction with decentralized graphs, including user-owned knowledge graphs on Ceramic and social discourse on Farcaster. Paired with autonomous agents, which can be used to subscribe to specific contexts, this new integration pushes the limits of what’s possible with semantic search. Check out the demo below:

Learn more about Index Network

Build on Index Network for the Base Onchain Summer

Index Network has teamed up with the Ceramic team to call developers to build on Index Network for this year’s Base Onchain Summer! Base Onchain Summer is a multi-week celebration of onchain art, gaming, music, and more, powered by Base.

Devs are invited to build composable search use cases between Base and other projects participating in Base Onchain Summer. For example, those use cases can include:

- Composability with commerce (Shopify)
- Composability with social graphs (Farcaster)
- Composability with on-chain projects (Zora, Nouns)

TIP: Consider developing agents to facilitate user interactions with Index, such as notification agents, context subscription agents, or multi-agent scenarios that enable conversational participation.

And of course, there are prizes! A total prize pool of 2250 USDC will be distributed across the top three applications!

Check out the bounty details on Bountycaster and reach out to Ceramic and Index teams on Farcaster if you have any additional questions.

Start building today!

Proof of Data is coming to EthCC!

The third edition of the Proof of Data event series is coming to Brussels! Join the Ceramic and Textile (creators of Tableland and Basin) teams for an inspiring afternoon, expanding on the essential discussions from EthCC. This event will unite pioneers and practitioners in the decentralized data realm. Engage in dynamic panel discussions and networking opportunities, ideal for developers and innovators eager to push the boundaries of decentralized technology.

Featured presenters from IoTeX, DIMO, WeatherXM, and Filecoin will share the latest advancements and projects, sparking engaging conversations with all attendees. A moderator will guide these discussions, ensuring critical themes in crypto, web3, and beyond are covered.

Don’t miss this chance to connect, collaborate, and contribute to the future of decentralized technology. Be part of the conversation driving the next wave of technological innovation!

RSVP today and join our Data Room Telegram channel.

Index Network & CrewAI Integration

Index now supports an integration with CrewAI, which brings an intuitive way to design multi-agent systems, with Index offering composable vector database functionality. Now, autonomous agents can synthesize data from multiple sources seamlessly.

Learn more!

Ceramic’s new Recon Protocol is almost here!

The core Ceramic team is getting ready for the public release of Ceramic’s new Recon Protocol. This new Ceramic networking protocol improves network scalability and data syncing efficiency. It unlocks data sharing between nodes and enables users to run multiple nodes that stay in sync and load balanced. This will enable highly available Ceramic deployments.

Ceramic’s Recon Protocol is in the last testing stages, with some key partners already building on it. It will be launched as part of the next Ceramic release, which will unlock the document migration process from js-ceramic + Kubo to js-ceramic + rust-ceramic.

The next Ceramic public release is scheduled in a few weeks' time. Keep an eye on the Ceramic public roadmap and Ceramic blog for updates regarding the release!
Ceramic Community Content

- BOUNTY: Build composable search applications on Index Network
- TRENDING DISCUSSION: Ceramic without Anchoring
- TRENDING DISCUSSION: Private Data Architecture
- TUTORIAL: Save OpenAI Chats to OrbisDB on Ceramic
- VIDEO: How data logs are defined to be easily discoverable in an open network by Charles from the Orbis team
- VIDEO: OrbisDB lifecycle by Charles from the Orbis team
- VIDEO TUTORIALS: Check out the latest video tutorials shared on the Ceramic YouTube channel

Events

Meet the Ceramic team at EthCC and side events:

- July 9, Proof of Data
- July 9, Data on Tap: Data & AI Cocktail Hour with Ceramic & Tableland
- July 10, Builders Brunch
- July 11, Ceramic ecosystem developers call

Contact Us

Want to get in touch with the Ceramic core team? Fill out this form (1m). Otherwise, drop us a note in the Forum.



Until next time! 🔥


Elastos Foundation

BeL2 Loan Demo App Updated to Version 0.3: Enhanced Features

We are excited to announce that the BeL2 Loan Demo App has been updated to version 0.3, available at https://lending.bel2.org/. This update brings significant enhancements and new features to improve the user experience and functionality of the app. Here’s a detailed look at what’s new: Added support for Taproot addresses Added the ability to manually […]

We are excited to announce that the BeL2 Loan Demo App has been updated to version 0.3, available at https://lending.bel2.org/. This update brings significant enhancements and new features to improve the user experience and functionality of the app. Here’s a detailed look at what’s new:

- Added support for Taproot addresses
- Added the ability to manually request ZKP proofs
- Added support for USDC in addition to USDT
- Additional repayment/proof information in order details
- Implemented additional arbitration request use cases
- Fixed expired order use cases
- Other minor bug fixes


Technical Update Insights

Added Support for Taproot Addresses

Integrating Taproot addresses ensures compatibility with the latest Bitcoin advancements, enhancing privacy and scalability. Implemented in November 2021, Taproot makes all transaction outputs look the same, improving privacy whether for simple payments or complex smart contracts. It also boosts efficiency and reduces fees. Supporting Taproot allows our app users to benefit from these enhancements, enabling more discreet and cost-effective transactions. This aligns with our commitment to providing a secure and efficient platform, enhancing the overall user experience.

Added Ability to Manually Request ZKP Proofs

Allowing users to manually request Zero-Knowledge Proofs (ZKPs) enhances control and flexibility, enabling them to generate proofs as needed. ZKPs are cryptographic methods that verify a statement’s truth without revealing any additional information. In BeL2, ZKPs ensure transaction privacy and security without exposing underlying data. Previously, ZKP generation was automatic, which might not have suited all users. Now, users can decide when to generate ZKPs, providing greater autonomy and control over their transactions and improving the overall user experience.

Added Support for USDC in Addition to USDT

Supporting USDC alongside USDT provides users with more stablecoin options, enhancing liquidity and flexibility within the app. USDC is known for its regulatory compliance and widespread acceptance, making it a valuable addition to our platform. As a stablecoin pegged to the US dollar, USDC maintains a stable value, attracting users who seek to avoid cryptocurrency volatility. Integrating USDC, widely supported in decentralised applications (dApps) on the Elastos Smart Chain, such as Glide and Elacity, offers more financial activity options and leverages existing liquidity, making transactions more seamless and efficient.

Additional Repayment/Proofs Information in Order Details

Providing detailed repayment and proof information in order details enhances transparency and user trust. It allows users to clearly understand their transactions, repayments, and associated proofs, essential for effectively managing financial activities. This update adds comprehensive data on repayment schedules and proof generation directly within the order details. Users can now see all necessary information in one place, making it easier to track and manage loans. This transparency is crucial for building trust in decentralised finance (DeFi) applications, ensuring users have all the information they need to make informed decisions.

Implemented Additional Arbitration Request Use Cases

Enhancing the arbitration process is critical for ensuring fair and efficient dispute resolution within the platform. By implementing additional use cases for arbitration requests, we aim to provide a more robust mechanism for handling disputes and maintaining user trust and satisfaction. Arbitration in BeL2 resolves disputes between parties in a decentralised manner, using ELA collateralised nodes, ensuring fairness without relying on centralised authorities. The new use cases expand the scenarios for arbitration, making the process more comprehensive and adaptable, reinforcing the platform’s reliability and fairness.

Fixed Expired Order Use Cases

Addressing issues related to order expiration is vital for ensuring smooth and reliable transaction processes. Fixing these use cases improves the overall user experience by preventing disruptions caused by expired orders. Order expiration issues occur when a transaction is not completed within a specified timeframe, leading to complications or the need for manual intervention. By fixing these issues, we enhance the predictability and reliability of the platform. This ensures that orders are processed as expected, reducing the likelihood of unexpected expirations and associated complications. These improvements help maintain a seamless transaction flow, enhancing user satisfaction.

Other Minor Bug Fixes

Continuous improvement through minor bug fixes is essential for maintaining the stability and performance of the platform. These fixes address small issues that, collectively, can significantly impact the user experience. Minor bug fixes involve resolving smaller issues that may not be immediately noticeable but contribute to the overall functionality and stability of the app. By regularly addressing and fixing minor bugs, we ensure that the app runs smoothly and efficiently, providing users with a reliable and enjoyable experience.

These updates mark a significant step forward in our mission to provide a secure, flexible, and user-friendly platform for decentralised finance built on Bitcoin. By continuously enhancing the BeL2 Loan Demo App, we aim to offer the best possible experience for our users, ensuring that they can leverage the full potential of their Bitcoin holdings in a secure and decentralised manner. Stay tuned for more updates and innovations as we continue to develop and expand the capabilities of the BeL2 ecosystem. Excited to learn more? Head over to the BeL2 website and follow Infinity for the latest updates!


Monday, 01. July 2024

OpenID

New Shared Signals Drafts

Authors / Shared Signals Co-Chairs: Atul Tulshibagwale, SGNL; Shayne Miel, Cisco; Sean O’Dell, Disney; and Tim Cappalli, Okta The OpenID SSWG has released three new drafts for review by the OpenID Foundation membership. We would like to describe the salient features of these drafts here. At the end of the 45-day review period, members can […] The post New Shared Signals Drafts first appeared on

Authors / Shared Signals Co-Chairs: Atul Tulshibagwale, SGNL; Shayne Miel, Cisco; Sean O’Dell, Disney; and Tim Cappalli, Okta

The OpenID SSWG has released three new drafts for review by the OpenID Foundation membership. We would like to describe the salient features of these drafts here. At the end of the 45-day review period, members can vote on adopting these drafts as implementer’s drafts.

Shared Signals Framework – Draft 03

After the Shared Signals Framework Implementer’s Draft 02 was released, the OpenID Foundation contracted with the University of Stuttgart to perform a formal security review of the draft specification. The good news is that the findings from the preliminary report were minor, but the bad news is that addressing them required changes to the normative language in the draft. As a result, the SSWG decided to create a Draft 03, which would need to go through the OpenID review process in order to be adopted as a successor Implementer’s Draft. Because we had to make this change, we decided to update some other aspects of the framework, which are backwards compatible (i.e., anything that implements draft 02 will still be draft 03 compliant). The salient features added and the issues fixed in this draft are listed below:

Security issues addressed

- Issuer Mix Up: Draft-02 did not specify that a receiver must validate the issuer value in incoming events and API responses from the transmitter. This language has now been updated to specify that receivers must validate the iss value in events and API responses they receive from a transmitter.
- Stream Audience Mix Up and Attacker Stream Subject Insertion: Draft-02 did not specify that a transmitter must authenticate a receiver, which we have remedied in draft-03. The new language in draft-03 also requires that transmitters use TLS and recommends that receivers verify the trusted source of the transmitter URL and use HTTPS.

New features added

- Use of txn claim: Draft-03 now clarifies how to use the JWT txn claim in order to prevent cascading cyclic chains of SSF events caused by the same underlying event. By verifying that the txn claim in a newly received SSF event is the same as a previously received SSF event, the receiver can ignore subsequent events it receives (a minimal deduplication sketch follows at the end of this section). The txn claim can also be used for reconciliation or auditing purposes between a transmitter and receiver as part of “closing the loop” on security events and actions.
- SSF transmitters can now specify in their metadata whether streams they create have no subjects in them, or “all appropriate subjects” automatically added in them, immediately after the stream has been created.

Clarifications and Bug fixes

A number of minor bugs, mostly involving non-normative language such as examples, have been fixed in this draft. Some new examples have been added and existing examples have been updated to match formats that have changed since those examples were first introduced.
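As an illustration of the txn-based deduplication described above, here is a minimal receiver-side sketch; the claim shape is schematic, since real SSF events arrive as signed Security Event Tokens whose signatures and claims are validated first.

```typescript
// Illustrative receiver-side dedup check based on the txn claim. The claim
// shape is schematic; real SSF events are signed SETs validated beforehand.
interface SsfEventClaims {
  iss: string;          // must match the expected transmitter issuer
  txn?: string;         // correlates events caused by the same underlying event
  events: Record<string, unknown>;
}

const seenTxns = new Set<string>();

function shouldProcess(claims: SsfEventClaims, expectedIssuer: string): boolean {
  // Draft-03 requires validating the issuer of incoming events.
  if (claims.iss !== expectedIssuer) return false;

  // Ignore events whose txn was already handled, preventing cascading
  // cyclic chains of SSF events from the same underlying event.
  if (claims.txn) {
    if (seenTxns.has(claims.txn)) return false;
    seenTxns.add(claims.txn);
  }
  return true;
}
```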

CAEP Draft 03

The Continuous Access Evaluation Profile now has a new draft for review. The main update in this draft is the introduction of two new CAEP events: “Session Established” and “Session Presented”. These events can help in the following ways:

Session Established:

- Notify completion of a federated identity initiated SSO
- Indicate to a monitoring service that a user has established a new session with a particular application
- Optionally bind a session to a specific device or other context so that it is easier to detect session hijacking

Session Presented:

- Helps a monitoring service detect user presence at a specific application
- Helps detect impossible travel across applications
- Helps detect changes in environmental properties, such as IP-address changes

Together these two events can help effectively monitor an organization’s cloud services for identity threats.
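For illustration, a Session Established event might carry a payload along these lines; the event-type URI is assumed to follow the existing CAEP naming pattern, and the optional claim names shown are illustrative rather than normative.

```typescript
// A schematic Session Established payload (decoded SET claims). The
// event-type URI is assumed to follow the existing CAEP naming pattern,
// and the optional claims shown (ips, fp_ua) are illustrative; consult
// the draft for the normative field names.
const sessionEstablished = {
  iss: 'https://idp.example.com',     // the transmitter
  aud: 'https://monitor.example.com', // the receiver
  iat: 1719847200,
  jti: '4d3559ec67504aaba65d40b0363faad8',
  events: {
    'https://schemas.openid.net/secevent/caep/event-type/session-established': {
      subject: { format: 'email', email: 'alice@example.com' },
      // Optionally binding the session to device/context properties makes
      // session hijacking easier to detect, as described above.
      ips: ['198.51.100.7'],
      fp_ua: 'abb0b6e7da81a42233f8f2b1a8ddb1b9a4c81611',
    },
  },
};
```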

In addition to these new events, the draft has been updated to reflect the new formats and fields in all examples to match the latest SSF draft.

CAEP Interoperability Specification

In March 2023, the OpenID Foundation conducted an interoperability event hosted at the Gartner IAM Summit in London. The results of that interoperability event are documented as a part of this blog post. At that time, the implementers established interoperability of the actual events being exchanged. The SSWG had already begun work on an interoperability profile that would specify more than just the event formats to be supported. So now we are pleased to announce the first version of this interoperability profile, which specifies:

- Spec versions that must be supported by transmitters
- API endpoints that must be supported by transmitters
- Authorization schemes that must be supported by transmitters
- Stream control features that must be supported by transmitters
- AddSubject behavior of transmitters
- Subject formats supported by transmitters and receivers
- Signature formats supported by transmitters and receivers
- Details of OAuth options that must be supported by transmitters and receivers
- Event types that must be supported by transmitters and receivers

We invite the general public and members of the OpenID Foundation to review the specifications that are available here. Feedback may be provided by opening an issue in the Shared Signals GitHub repository.

About the OpenID Foundation

The OpenID Foundation (OIDF) is a global open standards body committed to helping people assert their identity wherever they choose. Founded in 2007, we are a community of technical experts leading the creation of open identity standards that are secure, interoperable, and privacy preserving. The Foundation’s OpenID Connect standard is now used by billions of people across millions of applications. In the last five years, the Financial Grade API has become the standard of choice for Open Banking and Open Data implementations, allowing people to access and share data across entities. Today, the OpenID Foundation’s standards are the connective tissue to enable people to assert their identity and access their data at scale, the scale of the internet, enabling “networks of networks” to interoperate globally. Individuals, companies, governments and non-profits are encouraged to join or participate. Find out more at openid.net.

The post New Shared Signals Drafts first appeared on OpenID Foundation.


We Are Open co-op

Finding your activist friends

Solidarity, common ground, and intersectionality in the climate movement This post looks at ways we can channel our activist energy in ways that address multiple issues and find belonging in adjacent communities. Recently at a week long event that brought together energy transition activists from around the Mediterranean, I was pleased to meet a variety of people with intersectional underst
Solidarity, common ground, and intersectionality in the climate movement

This post looks at ways we can channel our activist energy in ways that address multiple issues and find belonging in adjacent communities.

Recently, at a week-long event that brought together energy transition activists from around the Mediterranean, I was pleased to meet a variety of people with intersectional understandings of the climate crisis. Together, we explored what intersectionality looks like in the climate movement and how we can tell stories that lead to action.

cc-by-nd Bryan Mathers

Expanding our activist energies

Although many of us care about a variety of struggles, we don’t have the time or the energy to get involved in every single thing. We focus our energies, we have to. The problem, of course, is that each issue and cause needs the visibility a group of activists coming together can provide. So how can we focus ourselves and find energy to do more?

For just a couple hours last week, I worked with a small group of rabble rousers to create campaign ideas for the challenge:

The intersection and complexities of our structural problems makes people feel powerless.

The structural problems we are facing are complex and co-exist within a matrix of other challenges. We are dealing with environmental crisis, racist societies, and social inequalities left, right, and centre. No matter how positive your personality may be, it’s hard to stay optimistic. No matter how cognisant you are about other struggles, it’s hard to pay attention to everything.

When we are overwhelmed and feeling powerless, we tend to recede. Our group had the insight that feeling overwhelmed or powerless is lonely. Loneliness is a cascading psychological phenomenon that halts action and feeds despair.

We started to think about how we might address loneliness in activist movements by telling stories that help people who feel like they belong to one group (e.g. environmentalists) to understand their connection to other groups.

Our theory of change is that finding belonging amongst your activist friends can provide you with solidarity and a source of energy. We wanted to push for intersectionality in the climate movement.

Intersections in audiences, the Audience Ikigai, cc-by WAO

Choose three: intersectionality in practice

Everybody cares about something, whether that be sports, the environment, or even status. If you can identify one thing you care about, you can surely identify three others. Using arbitrary design constraints, like “choosing three” is a good way to move any idea forward, including ideas around your own activism or civic participation.

We know that climate change disproportionately affects already marginalised communities, which can exacerbate existing social inequalities. With this in mind, we chose to look at the intersectionality of climate with three human rights issues:

- Refugee and migrant justice
- Women’s rights
- LGBTQIA+ rights

Easy, right? Choosing three issues to put your energy into is a lot less than “everything”. Three is also enough to give variability and provide access to different communities. Different communities come with different energies and that is something you can tap into when needed.

We often consider the thematic intersections of our own work. See how we work in the overlaps together with thoughtful, ethical organisations in Practical utopias and rewilding work.

Find connections and leverage points

Intersectionality is about understanding the points of interconnection between two issues. Seeing the overlaps means that you can connect issues together in new and novel ways. Novelty is just one storytelling tactic in calling attention to a particular issue. Once you’ve determined places to focus, you can further narrow your focus by looking for leverage points that lead to connection.

- Refugee and migrant justice — From climate-induced displacement to the fact that people who are forced to migrate, whatever the reasons, can face challenges in accessing their basic rights, refugee and migrant justice ties heavily to other environmental and human rights issues.
- Women’s rights — Women’s societal roles as caregivers and food producers make them more vulnerable to the effects of climate change. It’s now widely understood that educating girls is a catalyst towards climate action.
- LGBTQIA+ rights — Again, marginalised communities are disproportionately affected by the climate crisis. LGBTQIA+ people are often members of other marginalised communities, such as racial minorities, and they are more likely to live in poverty.

Human rights and environmental justice are big and complex areas of focus. Thinking about how the complexities of these issues overlap can help narrow down the impact you want to have.

cc-by Iris Maertens with Dancing Fox

Have some fun

Yes, structural problems are serious and complicated. It is essential to be aware of both your own privileges (whatever they may be) and to think deeply about the issues and communities you are working with and within. It’s also important to know that joy is a common emotional human experience. Inciting joy is a way to truly help people. It can help build psychological characteristics that help people deal with whatever life has to throw at them. Joy can also open people up to a better tomorrow.

At the event I attended last week, as we thought about the intersectionality of environmentalism with human rights, we thought about how we might be able to inspire people to be joyfully curious to learn more about an issue they might not have much involvement with.

We developed a few posters, designed to be displayed on a metro, to inspire this curiosity.

Our poster ideas, drawn by the incredible Iris Maertens

Solidarity with others

The complexity of our global problems can be overwhelming, but we cannot solve one complex issue without tackling the intertwining structural issues. Finding ways to relate what you care about to what others care about is a way to build solidarity and, therefore, momentum. It’s not always easy, but the more you can participate in cross-cutting social and environmental communities, the bigger our collective power becomes.

I worked with inspiring people from these organisations:

- “La Casa dels Futurs is both an ongoing project dedicated to supporting intersectional organizing between social and ecological movements, and a campaign to create a permanent Climate Justice Center and Movement School…”
- “Rinascimento Green…aims to bring together various pieces of civil society to promote, through a path of popular participation, a bottom-up Green Deal.”
- “WeSmellGas is a collective of organisers, researchers and film-makers based in Northern Europe. Climate justice can only be realised by dismantling capitalism and the imperial processes that reinforce it, including our current extractivist energy system.”

🔥 Do you need help with storytelling and advocacy? Check out WAO’s free, email-based Feminism is for Everybody course, or get in touch!

Finding your activist friends was originally published in We Are Open Co-op on Medium, where people are continuing the conversation by highlighting and responding to this story.


Identity At The Center - Podcast

Happy birthday to the Identity at the Center podcast! Our la

Happy birthday to the Identity at the Center podcast! Our latest episode is particularly special as we celebrate the milestone of five incredible years of Identity at the Center. In this special episode, we celebrate the podcast’s fifth birthday by revisiting our very first episode to update and explain the process we use to develop IAM strategies and roadmaps. Thank you to our amazing listeners

Happy birthday to the Identity at the Center podcast! Our latest episode is particularly special as we celebrate the milestone of five incredible years of Identity at the Center. In this special episode, we celebrate the podcast’s fifth birthday by revisiting our very first episode to update and explain the process we use to develop IAM strategies and roadmaps. Thank you to our amazing listeners for your continued support!

Watch it here https://youtu.be/OUHTB1ncLME?si=fxeS8bNtzmKaW2kp or listen in your podcast app.

More info at idacpodcast.com

#iam #podcast #idac

Sunday, 30. June 2024

OpenID

OpenID for Verifiable Credentials Wins EIC Award!

The OpenID Foundation is proud to announce that, for the work building the “OpenID for Verifiable Credentials” family of specifications, members of the Digital Credentials Protocol (DCP) Work Group won the “Future Technologies and Standards” award at the European Identity and Cloud Conference. For the last several years, this group has been working tirelessly to […] The post OpenID for Verifiabl

The OpenID Foundation is proud to announce that, for the work building the “OpenID for Verifiable Credentials” family of specifications, members of the Digital Credentials Protocol (DCP) Work Group won the “Future Technologies and Standards” award at the European Identity and Cloud Conference.

For the last several years, this group has been working tirelessly to develop scalable OpenID specifications attuned to Issuer-Holder-Verifier use cases. This family of specifications enables both the issuance and presentation of digital credentials – regardless of their format – and pseudonymous authentication. The net result of their work will be that end-users gain control, privacy, and portability over their identity information. And their constant, simultaneous focus on verifiers underpins a solid path to adoption.

You can learn more by listening to WG Chair Kristina Yasuda speak about how Digital Identity Wallets can “cross the chasm” to widespread adoption during her EIC Keynote.

This award recognizes the impact that the WG has already had on the market (learn more on their landing page):

- The European Architecture and Reference Framework lists several of their specs as required for certain use cases
- 3 draft ISO standards reference DCP specifications
- 18 wallets in the European Commission EBSI project support them
- NIST plans to implement reference implementations of OID4VP to present mdocs/MDL

To further support and enable OID4VC implementers, the Work Group has been engaging closely with the OpenID Certification team to develop tests that will ensure that deployments are interoperable and secure. There are already draft tests for Verifiable Presentations being trialed by a number of wallets, and the Foundation is working with NIST and other partners on more. So stay tuned!

Thank you so much to the DCP WG for their efforts, their commitment to the work of the Foundation, and their advocacy for the users at the heart of this family of standards. 

Congratulations to Kristina Yasuda, Torsten Lodderstedt, Joseph Heenan, Tobias Looker, Oliver Terbu, Paul Bastian, John Bradley, Mike Jones, Fabian Hauck, Jan Vereecken, Nat Sakimura, Gail Hodges, Daniel Fett, Brian Campbell, Christian Bormann, and all others who have participated in and progressed this work.

About the OpenID Foundation

The OpenID Foundation (OIDF) is a global open standards body committed to helping people assert their identity wherever they choose. Founded in 2007, we are a community of technical experts leading the creation of open identity standards that are secure, interoperable, and privacy preserving. The Foundation’s OpenID Connect standard is now used by billions of people across millions of applications. In the last five years, the Financial Grade API has become the standard of choice for Open Banking and Open Data implementations, allowing people to access and share data across entities. Today, the OpenID Foundation’s standards are the connective tissue to enable people to assert their identity and access their data at scale, the scale of the internet, enabling “networks of networks” to interoperate globally. Individuals, companies, governments and non-profits are encouraged to join or participate. Find out more at openid.net.

The post OpenID for Verifiable Credentials Wins EIC Award! first appeared on OpenID Foundation.

Friday, 28. June 2024

Hyperledger Foundation

Introducing Splice, a New Hyperledger Lab for Canton Network Interoperability

Hyperledger Labs


DIF Blog

DIF Newsletter #41

June 2024 DIF Website | DIF Mailing Lists | Meeting Recording Archive Table of contents Decentralized Identity Foundation News; 2. Conference Season wrap-up; 3. Announcements at DIF; 4. Community Events; 5. DIF Members; 6. Get involved! Join DIF 🚀 Decentralized Identity Foundation News Credential Schemas launches The Credential Schemas work item is

June 2024

DIF Website | DIF Mailing Lists | Meeting Recording Archive

Table of contents

1. Decentralized Identity Foundation News
2. Conference Season wrap-up
3. Announcements at DIF
4. Community Events
5. DIF Members
6. Get involved! Join DIF

🚀 Decentralized Identity Foundation News

Credential Schemas launches

The Credential Schemas work item is up and running following a successful kick-off meeting earlier this month.

“There was strong attendance and great energy. We had participants with deep experience in schema development and KYC (Know Your Customer) compliance, as well as newcomers to decentralized identity. We are currently focused on establishing the scope and use cases,” said DIF’s Executive Director, Kim Hamilton Duffy.

“It’s not too late to participate. We’re looking to involve a broad range of expertise, including people familiar with KYC reusable ID use cases, such as compliance experts,” she added.

The work item meets every other Tuesday at 10:00 PST / 13:00 EST / 19:00 CET, with the next meeting scheduled for Tuesday 2 July.

Join DIF to get involved.

Linked Verifiable Presentations
Photo by FlyD on Unsplash

The Linked Verifiable Presentations specification has been approved.

The spec defines how to share, discover and retrieve Verifiable Credentials publicly via a service entry in a DID Document. It complements existing technologies for sharing VCs privately, like DIDComm messaging and OID4VC. Use cases enabled by Linked VPs include:

- Discover verifiable data about a website
- Simplify the onboarding of suppliers and customers by linking relevant non-sensitive data, such as business registration credentials, to the organization’s DID
- Make mandatory data verifiable: provide imprint pages or terms of use statements as machine-readable, verifiable credentials
- Decentralized business networks: people publicly share their educational background and work experience as verifiable credentials

The specification is available here: https://identity.foundation/linked-vp/
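For illustration, here is what such a service entry can look like in a DID Document, using the spec’s LinkedVerifiablePresentation service type; the DID and endpoint below are placeholders.

```json
{
  "id": "did:example:123",
  "service": [
    {
      "id": "did:example:123#business-registration",
      "type": "LinkedVerifiablePresentation",
      "serviceEndpoint": "https://example.com/presentations/business-registration.json"
    }
  ]
}
```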

Working Group training

DIF provided our first WG training session. Check out the session recording and slides (recommended viewing for all DIF members).

Operational Excellence @ DIF

Work to automate DIF’s operational processes continues to make excellent progress thanks to our star Systems Administrator Pratap Mridha (pictured above).

🛠️ Conference Season wrap-up

European Identity and Cloud Conference (EIC) 2024

This year's EIC felt to many like a defining moment in decentralized identity's journey from idea to reality.

The event took place as Germany was gearing up to host the Euro 2024 football competition.

Decentralized identity luminaries were in abundance. Rolf Rauschenbach, Anil John, Daniel Goldschneider of OpenWallet Foundation, Kim, Ramesh Narayanan of MOSIP, and Damian discuss standards collaboration outside of the Berlin Congress Center.

Long-time decentralized identity leaders and visionaries Kaliya Young and Phil Windley catch up in between sessions.

Executive Director Kim Hamilton Duffy and DIDComm WG co-chair Steve McCown of Anonyome Labs delivered a Decentralized Identity Technical Mastery Sprint to a packed seminar theatre on the opening day of the conference (see this summary of their session on the DIF blog)

DIF Steering Committee Member Markus Sabadello of Danube Tech duelled with OpenID Foundation chairman Nat Sakimura over how to realize SSI principles in their joint keynote presentation, Les Miserables of the Cyber Frontier (session summary here)

Kim teamed up with Wayne Chang of SpruceID and Linda Jeng of Digital Self Labs to explore the key role of decentralized identity in building trust in AI (session summary here)

Misha Deville, co-founder of Mailchain, spoke about lessons learned from Web3, including the importance of network design in achieving the target outcomes of decentralized identity ecosystems

Kaliya Young gave a presentation about institutional memory, and the implications for organizations, individuals and society

Nick Lambert of Dock Labs and Nick Price, who co-chairs the DIF Travel & Hospitality SIG, joined other industry experts to explore how decentralized identity can help upgrade Customer Identity and Access Management (CIAM) - summary on the DIF blog here.

Riley Hughes of Trinsic, Sam Curren of Indicio and Kim were joined by Abbie Barbir to discuss reusable identity and bootstrapping decentralized identity ecosystems.

Fraser Edwards of cheqd and Sharon Leu of JFF Labs spoke about incentives for wallet developers during a panel discussion addressing usability challenges of digital identity wallets

The German Federal Agency for Disruptive Innovation (SPRIND) selected several companies including Sphereon to develop prototypes for the European Digital Identity Wallet

Identity Week Europe

DIF members including Polygon ID / Privado, Mailchain, Indicio, Hypermine, PassiveBolt and Tonomy Foundation converged on Amsterdam for another European identity industry gathering.

AI and the rapidly changing cyber-threat landscape were major themes.

Decentralized identity also generated great interest at the event, which is traditionally dominated by IAM, physical and cross-border ID: see this summary of several decentralized ID themed discussions on the DIF blog.

Digital Identity unConference Europe

Europe's own IIW-inspired event returned to Zurich, Switzerland, where many of those present at last year's inaugural event were joined by a throng of new participants.

The eIDAS 2 regulation and EU Digital Identity Wallet were key topics.

Practical questions such as how to kick-start a DI ecosystem, onboard customers using government-provided PID (Personal Identification Data) credentials and make life simpler for technology implementers and users were at the heart of the discussions.

Organizational identity and B2B use cases were also recurring themes in many of the sessions.

📢 Announcements at DIF

DWN users - share your use case!!

Do you use Decentralized Web Nodes? We want to hear about it! Let us know how you're using DWNs here.

DIF Labs

The DIF Labs working group is coming soon; contact membership@identity.foundation to learn more

🗓️ ️Community Events

Coffee Breaks

If you missed this month's DIF Coffee Breaks, moderated by DIF's Senior Director of Community Engagement, Limari Navarrete, be sure to check out the recordings:

- Andres Olave, Head of Technology at Velocity Career Labs
- Cole Davis, Founder and CEO at Switchchord

Last month's coffee breaks

- Tim Boeckmann, CEO and Co-founder of Mailchain
- Nara Lau, Founder at Fise Technologies
- Ankur Banerjee, CTO and Co-founder at Cheqd
- Humpty Calderon, Advisor @Ontology and creator of Crypto Sapiens Podcast

Follow https://twitter.com/DecentralizedID to get updates

🗓️ ️DIF Members

New Member Orientations

If you are new to DIF, join us for our upcoming new member orientations. Please subscribe to DIF’s Eventbrite, which can be found here, for notifications about upcoming orientations and events.

🆔 Join DIF!

If you would like to get in touch with us or become a member of the DIF community, please visit our website.

Can't get enough of DIF?
| Follow us on Twitter
| Join us on GitHub
| subscribe on YouTube
| read our DIF blog
| read the archives


Ceramic Network

Save OpenAI Chats to OrbisDB on Ceramic (Tutorial)

Build an AI-powered chatbot using OrbisDB for storage and the OpenAI API.

Last year we partnered with Learn Web 3 (a free educational platform for Web3 developers) to publish a tutorial on Saving OpenAI Chats to ComposeDB on Ceramic to showcase an easy-to-understand design architecture and how ComposeDB could be leveraged for storage. In that example we showed how to configure and deploy a local ComposeDB/Ceramic node, walking through data model design, server configurations, model deployment, and runtime definition generation, all of which are necessary steps a developer must undergo before running the application locally.

But what if developers could bypass local node configuration altogether and start testing their database design and application logic immediately? What if they could do so with the assurances of no lock-ins, and the open option to move to a self-hosted configuration in the future? And finally, what if they could benefit from all of these things while enjoying a seamless developer experience that makes storage setup easy?

That's where OrbisDB comes in.

What is OrbisDB?

OrbisDB is an advanced decentralized database built on the Ceramic Network and offers an ORM-like interface that developers can leverage when integrating the OrbisDB SDK. Developers who have worked with Prisma or Drizzle with a Postgres instance will find this experience familiar and exceedingly easy to work with.

As for developer experience, what sets OrbisDB apart are the following:

- A user interface (UI) developers can run either locally or using a hosted Studio instance, bypassing the need to define and deploy data models by hand (which is still an option if using the SDK). The UI also includes data visualization (you can view data relevant to your applications in table format), as well as other views for configuring add-ons like plugins (described below).
- OrbisDB offers the option to leverage a growing list of plugins to enrich the data capabilities developers can incorporate into their application logic. Some example plugins offer gating ability, automatic resolution of ENS domains, sybil resistance, and more. Anyone can also build plugins and incorporate them immediately in the event they’re running a standalone instance.
- The OrbisDB SDK wraps user authentication, client creation, schema creation (if developers prefer not to use the UI), and querying all under one roof, therefore simplifying the list of dependencies developers need to worry about (see the sketch after this list).
- Finally, OrbisDB offers the option to run an instance locally (similar to the ComposeDB tutorial mentioned above), or on a shared (hosted) instance. This is a significant feature for overall development and testing velocity as it lets developers start writing and reading data right away without having to worry about node configuration. Once developers are ready to take their application to production after testing on the shared instance, setting up a self-hosted (standalone) instance is straightforward.
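As a rough sketch of that ORM-like flow, assuming the @useorbis/db-sdk package, a write might look like the following; method names, option shapes, and the endpoint values are illustrative and can differ between SDK versions.

```typescript
// A rough sketch of the ORM-like flow described above, assuming the
// @useorbis/db-sdk package; treat method names and option shapes as
// illustrative, since they can differ between SDK versions.
import { OrbisDB } from '@useorbis/db-sdk';

// The gateway URLs come from the setup panel in Studio (the tutorial
// repository hard-codes the shared endpoints); placeholders shown here.
const orbis = new OrbisDB({
  ceramic: { gateway: 'https://<shared-ceramic-endpoint>' },
  nodes: [
    { gateway: 'https://<shared-orbisdb-endpoint>', env: process.env.NEXT_PUBLIC_ENV_ID! },
  ],
});

const POSTS_MODEL_ID = 'kjzl6...'; // placeholder: the model's stream ID from Studio

// Write one chat message into the posts model, scoped to our context.
async function savePost(body: string, tag: 'user' | 'bot') {
  await orbis
    .insert(POSTS_MODEL_ID)
    .value({ body, tag })
    .context(process.env.NEXT_PUBLIC_CONTEXT_ID!)
    .run();
}
```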

For this tutorial, we will be leveraging the hosted Studio instance to both define our data models and utilize a shared OrbisDB instance.

Let's Get Started!

Before we get started, you will need the following dependencies:

– MetaMask Chrome Extension (or a similar browser wallet for authentication)
– Node v20
– An OpenAI API key
– A project ID from WalletConnect
– A free OrbisDB Studio account - first, log in with your browser wallet. We will use this later to define our data models and obtain a context and environment ID

Initial Setup

First, clone the repository and install the dependencies:

git clone https://github.com/ceramicstudio/orbisdb-chatbot && cd orbisdb-chatbot
npm install

Next, create a copy of the example env file in your root directory:

cp .env.example .env

Visit the OpenAI signup page to create an account if you don't yet have one, then generate an API key. OpenAI offers a free API trial with $5 worth of credit (which can take you a LONG way). Go ahead and assign your new API key to OPENAI_API_KEY in your new .env file.

Navigate to WalletConnect, then create a free account and a new project (with a name of your choosing and the App type selected). Copy the resulting project ID and assign it as the value for NEXT_PUBLIC_PROJECT_ID.

OrbisDB Setup

Once you're logged into your OrbisDB Studio account, you can start connecting your application to a shared OrbisDB instance.

First, you will need to define a new context. Contexts are a Ceramic-native feature exposed in all data management methods, and make it easy for developers to organize data across different applications or projects (there is also the option to leverage sub-contexts, but we'll save this for a future tutorial).

Go ahead and click "+ Add context" within your root studio view - feel free to give your new context a name and description of your choosing:

If you click into your new context you can view its corresponding ID:

Go ahead and copy this value and assign it to NEXT_PUBLIC_CONTEXT_ID in your .env file.

On the right-hand side, you should also see details about your setup:

Copy the value found under "Environment ID" and assign it to NEXT_PUBLIC_ENV_ID in your .env file. This ID is required to identify you when using the shared OrbisDB instance.

You will also see the endpoints for the shared Ceramic and OrbisDB instances in the same section. No need to copy these values as they are already hard-coded into the repository.

Defining Data Models

We will also use the Studio UI to define the data models our application needs. This demo application utilizes two simple data models found within the tables file in our repository:

– posts - this will contain each message within our conversation exchange. The "body" field will house the message itself, while the "tag" field will keep track of who the message came from (user vs. bot). This model will use the "List" account relation, which means an authenticated account can have an unbounded number of instance documents that fall under this definition.
– profiles - this model will allow us to assign additional data to ourselves and our chatbot, including a name, username, and fun emoji. The "actor" subfield will be used to differentiate between the user (using the value "human") and your chatbot (using the value "robot"). In contrast to posts, this model will use the "Set" account relation based on the "actor" subfield, which means an account can have exactly one instance document for a given value of "actor". For example, this ensures that our application won't allow us to accidentally create more than one document with an "actor" subfield matching "human".

To start creating the models, navigate to "Model builder" from the Studio navigation. You can start by defining your "posts" table. After clicking "Create Model" you will be able to view the model ID:

Copy this value and assign it to NEXT_PUBLIC_POST_ID in your .env file.

Go through the same steps for your "profiles" table. However, be sure to select the "Set" option under "Account relation". Copy the resulting model ID and assign it to NEXT_PUBLIC_PROFILE_ID in your .env file.
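At this point, all six values should be populated. Here is a sketch of the finished .env file with placeholder values (the variable names come from the repository's .env.example; the values shown are not real):

OPENAI_API_KEY=sk-...        # your OpenAI API key
NEXT_PUBLIC_PROJECT_ID=...   # WalletConnect project ID
NEXT_PUBLIC_CONTEXT_ID=...   # OrbisDB context ID
NEXT_PUBLIC_ENV_ID=...       # OrbisDB environment ID
NEXT_PUBLIC_POST_ID=...      # "posts" model ID
NEXT_PUBLIC_PROFILE_ID=...   # "profiles" model ID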

Application Architecture

As mentioned above, the OrbisDB SDK makes it easy to instantiate clients, authenticate users, and run queries using the same library. As you'll note in the application repository, there are various components that need to be able to access the state of the authenticated user. While we're wrapping all components of our application within a WagmiConfig contextual wrapper (which will allow us to leverage Wagmi's hooks to see if a user's wallet is connected - learn more about this in our WalletConnect Tutorial), we also need a way to know if the user has an active OrbisDB session.

While there are multiple ways to facilitate this, our application uses Zustand for state management to circumvent the need for contextual wrappers or prop drilling.

If you take a look at the store file you can see how we've set up four state variables (two of which are methods) and incorporated the OrbisDB SDK to authenticate users and alter the state of orbisSession:

import { create } from "zustand";
// Import paths below reflect a common setup and may differ slightly by SDK version:
import { OrbisDB, type OrbisConnectResult } from "@useorbis/db-sdk";
import { OrbisEVMAuth } from "@useorbis/db-sdk/auth";
import type { GetWalletClientResult } from "@wagmi/core";

// Environment ID assigned earlier in .env
const ENV_ID = process.env.NEXT_PUBLIC_ENV_ID ?? "";

type Store = {
  orbis: OrbisDB;
  orbisSession?: OrbisConnectResult | undefined;
  // setAuth returns a promise
  setAuth: (
    wallet: GetWalletClientResult | undefined
  ) => Promise<OrbisConnectResult | undefined>;
  setOrbisSession: (session: OrbisConnectResult | undefined) => void;
};

const StartOrbisAuth = async (
  walletClient: GetWalletClientResult,
  orbis: OrbisDB
): Promise<OrbisConnectResult | undefined> => {
  if (walletClient) {
    const auth = new OrbisEVMAuth(window.ethereum!);
    // This option authenticates and persists the session in local storage
    const authResult: OrbisConnectResult = await orbis.connectUser({
      auth,
    });
    if (authResult.session) {
      console.log("Orbis Auth'd:", authResult.session);
      return authResult;
    }
  }
  return undefined;
};

const useStore = create<Store>((set) => ({
  orbis: new OrbisDB({
    ceramic: {
      gateway: "https://ceramic-orbisdb-mainnet-direct.hirenodes.io/",
    },
    nodes: [
      {
        gateway: "https://studio.useorbis.com",
        env: ENV_ID,
      },
    ],
  }),
  orbisSession: undefined,
  setAuth: async (wallet) => {
    if (wallet) {
      try {
        const auth = await StartOrbisAuth(wallet, useStore.getState().orbis);
        set((state: Store) => ({
          ...state,
          orbisSession: auth,
        }));
        return auth;
      } catch (err) {
        console.error(err);
      }
    } else {
      // clear the stored session when no wallet is provided
      set((state: Store) => ({
        ...state,
        orbisSession: undefined,
      }));
    }
  },
  setOrbisSession: (session) =>
    set((state: Store) => ({
      ...state,
      orbisSession: session,
    })),
}));

As you can see, we've hard-coded the Ceramic and OrbisDB gateways, while the environment ID is read from the environment variable we assigned earlier.

Our navbar component sits at the same or greater level as all of our child components and includes our Web3Modal widget. You can see how we're using a useEffect hook to check whether our session is active and set our "loggedIn" state variable to true or false. This result determines whether we generate a new session for the user by leveraging the setAuth method from our Zustand store, or simply set our orbisSession to the value of the valid active session.
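As a rough sketch of what that effect can look like (identifiers such as setLoggedIn are illustrative, not the repository's exact code; the navbar component in the repo is the source of truth):

// Hedged sketch of the navbar session check described above.
import { useEffect, useState } from "react";
import { useWalletClient } from "wagmi";
import useStore from "../store"; // the Zustand store shown earlier (path assumed)

export function Navbar() {
  const { data: walletClient } = useWalletClient();
  const { orbis, setAuth, setOrbisSession } = useStore();
  const [loggedIn, setLoggedIn] = useState(false);

  useEffect(() => {
    const checkSession = async () => {
      const existing = await orbis.getConnectedUser(); // falsy if no active session
      if (existing) {
        setOrbisSession(existing); // reuse the valid active session
        setLoggedIn(true);
      } else if (walletClient) {
        const fresh = await setAuth(walletClient); // generate a new session
        setLoggedIn(Boolean(fresh));
      } else {
        setLoggedIn(false);
      }
    };
    checkSession();
  }, [walletClient]);

  return null; // the real component renders the Web3Modal widget and nav links
}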

Back in the home page component you can see how we're conditionally rendering our MessageList child component based on whether we have both an active orbis session AND the user's wallet is connected (allowing us to access their address).
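In other words, the render condition reduces to something like this (a hedged, illustrative fragment rather than the repo's exact JSX):

{/* Render the chat only when an Orbis session exists AND a wallet is connected */}
{orbisSession && address ? (
  <MessageList />
) : (
  <p>Connect your wallet and sign in to start chatting.</p>
)}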

Reading Data

The message list and userform component files are responsible for performing the majority of writes and reads to OrbisDB. If you navigate to the message list component for example, take a look at how we've imported our client-side environment variables to identify our post and profile models, as well as our context ID. When this component is rendered, the useEffect hook first invokes the "getProfile" method:

const getProfile = async (): Promise<void> => {
  try {
    const profile = orbis
      .select("controller", "name", "username", "emoji", "actor")
      .from(PROFILE_ID)
      .where({ actor: ["human"] })
      .context(CONTEXT_ID);
    const profileResult = await profile.run();
    if (profileResult.rows.length) {
      console.log(profileResult.rows[0]);
      setProfile(profileResult.rows[0] as Profile);
    } else {
      // take the user to the profile page if no profile is found
      window.location.href = "/profile";
    }
    await getRobotProfile(profileResult.rows[0] as Profile);
  } catch (error) {
    console.error(error);
    return undefined;
  }
};

Notice how we've constructed a .select query off of our OrbisDB instance (provided by our Zustand store), asking for the corresponding values for the 5 columns we want data for.

Next, we need to specify which data model we want our query to reference, which is where we use .from with our profile model ID as the value.

We also only want the records where the profile is for the human user, indicated on the following line.

Finally, we use the context ID that corresponds to this project as the final value that's appended to the query.

If a corresponding profile exists, we then invoke the getRobotProfile method to obtain our chatbot's information. If it does not exist, we take the user to the profiles page so they can create one.

Writing Data

Let's take a quick look at an example of data mutations. Within the same message list component you will find a method called createPost which is invoked each time the user creates a new message:

const createPost = async (
  thisPost: string
): Promise<PostProps | undefined> => {
  try {
    await orbis.getConnectedUser();
    const query = await orbis
      .insert(POST_ID)
      .value({
        body: thisPost,
        created: new Date().toISOString(),
        tag: "user",
        edited: new Date().toISOString(),
      })
      .context(CONTEXT_ID)
      .run();
    if (query.content && profile) {
      const createdPost: PostProps = {
        id: query.id,
        body: query.content.body as string,
        profile,
        tag: query.content.tag as string,
        created: query.content.created as string,
        authorId: query.controller,
      };
      return createdPost;
    }
  } catch (error) {
    console.error(error);
    return undefined;
  }
};

While this looks similar to the syntax we use to read data, there are a few differences.

First, take a look at the first line under the "try" statement - we're calling getConnectedUser() off of our OrbisDB prototype chain to ensure that our active session is applied. This is necessary to run mutation queries, whereas it's not a necessary step for reading data.

You can also see that we've swapped out the .select and .from statements for .insert which references the model ID we want to use, thus creating a new row in the corresponding table.

Finally, we're referencing the user's message value for the body while ensuring we tag the message as coming from the "user" before running the query and checking on its success status.

Running the Application in Developer Mode

We're now ready to boot up our application!

In your terminal, go ahead and start the application in developer mode:

nvm use 20
npm run dev

Navigate to http://localhost:3000/ in your browser. You should see the following:

Go ahead and click on "Connect Wallet." You should see a secondary authentication message appear after you connect your wallet:

Signing this message creates an authenticated session (using orbis.connectUser() from our Zustand store). You can check the value of this session by navigating to the "Application" tab in your browser and looking for the orbis:session key pair:
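You can also read the same value from the browser console (the key name is the one shown in the Application tab):

// Logs the persisted session string created by orbis.connectUser()
console.log(localStorage.getItem("orbis:session"));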

Given that you have not yet created any messages, the application should automatically direct you to the /profiles page where you can assign identifiers to yourself and your chatbot:

Finally, navigate back to the homepage to begin exchanging messages with your chatbot. Notice how the values from your corresponding profiles appear next to the messages:

How Could this Application be Improved?

Since our message history is being written and queried based on static values (for example, assigning messages to the "user" tag), you'll notice that the same conversation history appears when self-authenticating with a different wallet address and creating a new session.

As a challenge, think about different ways to improve the application design and address this:

– Tagging the profiles and messages with values that align with actual authenticated accounts instead of static ones (see the sketch below)
– Altering our message data model and application to accommodate different chat contexts, allowing a user to have different conversation histories
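For the first idea, here is a hedged sketch using the same SDK calls shown earlier (the tag values are hypothetical; address is assumed to be the connected wallet's address):

// Hypothetical: tag each post with the authenticated account's address
// instead of the static "user" value
await orbis
  .insert(POST_ID)
  .value({
    body: thisPost,
    created: new Date().toISOString(),
    edited: new Date().toISOString(),
    tag: address,
  })
  .context(CONTEXT_ID)
  .run();

// ...then read back only this account's conversation history
const myPosts = await orbis
  .select("body", "tag", "created")
  .from(POST_ID)
  .where({ tag: [address] })
  .context(CONTEXT_ID)
  .run();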

Next Steps

We hope you've enjoyed this tutorial and learned something new about how to configure and incorporate OrbisDB into your application! While this concludes our walk-through, there are other possibilities Ceramic has to offer:

Join the Ceramic Discord

Follow Ceramic on X

Follow Orbis on X

Start Building with Ceramic


GS1

Jeju SamDaSoo mineral water aiming for “top” levels of efficiency and sustainability

By putting a QR Code powered by GS1 on every bottle cap, Korean water bottler JPDC is going label-less

Recent regulations are pushing Korean beverage companies to remove labels from their bottles as part of an initiative to use less plastic and make recycling easier.

Information that was previously on the labels of Jeju SamDaSoo mineral water is now available simply by scanning the QR Code with GS1 Digital Link on the bottle cap.
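For context, a GS1 Digital Link packs the product's GTIN (and optionally other keys) into an ordinary resolvable URL, so a single QR code can serve consumers, retailers, and regulators alike. An illustrative URI using GS1's documentation example GTIN (not JPDC's actual code):

https://id.gs1.org/01/09506000134352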

Beyond being compliant with national laws, the company is seeing improved engagement with consumers, better inventory management and more.

case-study-gs1-korea-jpdc.pdf

Thursday, 27. June 2024

EdgeSecure

Navigating the New Landscape of GLBA Compliance: Key Changes to Protect Your Federal Financial Aid


Webinar
Thursday, July 25, 2024
10 AM ET

For higher education institutions offering financial aid to students, the Gramm-Leach-Bliley Act, or GLBA, means your institution is required to meet compliance standards for the security and protection of financial information, and to provide transparency related to how personal information is used and shared. Failure to meet these standards carries significant risk for institutions, including restrictions or loss of eligibility for Title IV funding. In this session, we’ll review how the latest revisions to GLBA compliance standards, aligned with the NIST 800-171 revision 3, will impact higher education. Our privacy and compliance experts will review how these revisions increase the compliance burden for institutions, and key steps that institutions can take to meet the new standard and maintain compliance to receive federal financial aid support.

Register Now »

The post Navigating the New Landscape of GLBA Compliance: Key Changes to Protect Your Federal Financial Aid appeared first on NJEdge Inc.


OpenID

Public Review Period for Three Shared Signals Drafts

The OpenID Shared Signals Working Group recommends approval of the following three specifications as OpenID Implementer’s Drafts: Shared Signals Framework Draft 03 Other formats: TXT, XML, MD CAEP Draft 03 Other formats: TXT, XML, MD CAEP Interoperability Profile Draft 00 Other formats: TXT, XML, MD An Implementer’s Draft is a stable version of a specification providing intellectual […]

The OpenID Shared Signals Working Group recommends approval of the following three specifications as OpenID Implementer’s Drafts:

– Shared Signals Framework Draft 03 (other formats: TXT, XML, MD)
– CAEP Draft 03 (other formats: TXT, XML, MD)
– CAEP Interoperability Profile Draft 00 (other formats: TXT, XML, MD)

An Implementer’s Draft is a stable version of a specification providing intellectual property protections to implementers of the specification. This note starts the 45-day public review period for the specification drafts in accordance with the OpenID Foundation IPR policies and procedures. Unless issues are identified during the review that the Working Group believes must be addressed by revising the drafts, this review period will be followed by a seven-day voting period during which OpenID Foundation members will vote on whether to approve these drafts as OpenID Implementer’s Drafts. For the convenience of members who have completed their reviews by then, voting will actually begin a week before the start of the official voting period. The relevant dates are:

– Implementer’s Draft public review period: Thursday, June 27, 2024 to Sunday, August 11, 2024 (45 days)
– Implementer’s Draft vote announcement: Monday, July 29, 2024
– Implementer’s Draft early voting opens: Monday, August 5, 2024*
– Implementer’s Draft voting period: Monday, August 12, 2024 to Monday, August 19, 2024 (7 days)*

* Note: Early voting before the start of the formal voting will be allowed. The Shared Signals work group page is https://openid.net/wg/sharedsignals.

Information on joining the OpenID Foundation can be found at https://openid.net/foundation/members/registration. If you’re not a current OpenID Foundation member, please consider joining to participate in the approval vote.

You can send feedback on the specifications in a way that enables the working group to act upon it by (1) signing the contribution agreement at https://openid.net/intellectual-property/ to join the working group (please specify that you are joining the “Shared Signals” working group on your contribution agreement), (2) joining the working group mailing list at https://lists.openid.net/mailman/listinfo/openid-specs-risc, and (3) sending your feedback to the list. 

Marie Jordan – OpenID Foundation Board Secretary

Update: July 1, 2024:

The SSWG has now released an overview of the changes found in the 3 drafts released on June 27, 2024.

About the OpenID Foundation

The OpenID Foundation (OIDF) is a global open standards body committed to helping people assert their identity wherever they choose. Founded in 2007, we are a community of technical experts leading the creation of open identity standards that are secure, interoperable, and privacy preserving. The Foundation’s OpenID Connect standard is now used by billions of people across millions of applications. In the last five years, the Financial Grade API has become the standard of choice for Open Banking and Open Data implementations, allowing people to access and share data across entities. Today, the OpenID Foundation’s standards are the connective tissue to enable people to assert their identity and access their data at scale, the scale of the internet, enabling “networks of networks” to interoperate globally. Individuals, companies, governments and non-profits are encouraged to join or participate. Find out more at openid.net.

The post Public Review Period for Three Shared Signals Drafts first appeared on OpenID Foundation.


Hyperledger Foundation

Building Bridges: Developing the Stellar Connector for Hyperledger Cacti

Introduction: The Importance of Interoperability in Blockchain



Oasis Open Projects

Invitation to comment on Data Model for Lexicography v1.0


Third public review - ends July 27th

OASIS and the OASIS Lexicographic Infrastructure Data Model and API (LEXIDMA) TC are pleased to announce that Data Model for Lexicography Version 1.0 is now available for public review and comment. This 30-day review is the third public review for this specification.

About the specification draft:

The LEXIDMA TC’s high level purpose is to create an open standards based framework for internationally interoperable lexicographic work. Data Model for Lexicography v1.0 describes and defines standard serialization independent interchange objects based predominantly on state of the art in the lexicographic industry. The TC aims to develop the lexicographic infrastructure as part of a broader ecosystem of standards employed in Natural Language Processing (NLP), language services, and Semantic Web.

This document defines the first version of a data model in support of these technical goals, including:
– A serialization-independent Data Model for Lexicography (DMLex)
– An XML serialization of DMLex
– A JSON serialization of DMLex
– A relational database serialization of DMLex
– An RDF serialization of DMLex
– An informative NVH serialization of DMLex

The documents and related files are available here:

Data Model for Lexicography (DMLex) Version 1.0
Committee Specification Draft 03
12 June 2024

PDF (Authoritative):
https://docs.oasis-open.org/lexidma/dmlex/v1.0/csd03/dmlex-v1.0-csd03.pdf
HTML:
https://docs.oasis-open.org/lexidma/dmlex/v1.0/csd03/dmlex-v1.0-csd03.html
PDF marked with changes since previous public review:
https://docs.oasis-open.org/lexidma/dmlex/v1.0/csd03/dmlex-v1.0-csd03-DIFF.pdf

For your convenience, OASIS provides a complete package of the specification document and any related files in ZIP distribution files. You can download the ZIP file at:
https://docs.oasis-open.org/lexidma/dmlex/v1.0/csd03/dmlex-v1.0-csd03.zip

How to Provide Feedback

OASIS and the LEXIDMA TC value your feedback. We solicit input from developers, users and others, whether OASIS members or not, for the sake of improving the interoperability and quality of our technical work.

The public review starts 28 June 2024 at 00:00 UTC and ends 27 July 2024 at 23:59 UTC.

Comments may be submitted to the TC by any person directly at:
Technical-Committee-Comments@oasis-open.org
Please use a subject line like “Comment on Data Model for Lexicography”.

Comments submitted for this work and for other work of this TC are publicly archived and can be viewed at:
https://groups.google.com/a/oasis-open.org/g/technical-committee-comments/.
Previous comments on LEXIDMA works are archived at https://lists.oasis-open.org/archives/lexidma-comment/.

All comments submitted to OASIS are subject to the OASIS Feedback License, which ensures that the feedback you provide carries at least the same obligations as those of the TC members. In connection with this public review, we call your attention to the OASIS IPR Policy [1] applicable especially [2] to the work of this technical committee. All members of the TC should be familiar with this document, which may create obligations regarding the disclosure and availability of a member’s patent, copyright, trademark and license rights that read on an approved OASIS specification.

OASIS invites any persons who know of any such claims to disclose these if they may be essential to the implementation of the above specification, so that notice of them may be posted to the notice page for this TC’s work.

Additional information about the specification and the LEXIDMA TC can be found at the TC’s public home page:
https://www.oasis-open.org/committees/lexidma/

Additional information related to this public review, including a complete publication and review history, can be found in the public review metadata document [3].

========== Additional references:

[1] https://www.oasis-open.org/policies-guidelines/ipr/

[2] https://www.oasis-open.org/committees/lexidma/ipr.php
https://www.oasis-open.org/policies-guidelines/ipr/#Non-Assertion-Mode
Non-Assertion Mode

[3] Public review metadata document:
https://docs.oasis-open.org/lexidma/dmlex/v1.0/csd03/dmlex-v1.0-csd03-public-review-metadata.html

The post Invitation to comment on Data Model for Lexicography v1.0 appeared first on OASIS Open.


OASIS Approves Universal Business Language V2.4 Standard for Global Business Transactions


IBM, Logius, Orbex Global Markets, Publications Office of the European Union, U.S. Department of Defense (DoD), and Others Advance Open Standard for Enhanced Interoperability and Efficiency in Supply Chain and Digital Trade

Boston, MA – 27 June 2024 – OASIS Open, the international open source and standards consortium, announced the approval of Universal Business Language (UBL) V2.4 as an OASIS Standard, a status that signifies the highest level of ratification. Developed by the UBL Technical Committee (TC), UBL V2.4 is the leading interchange format for business documents, revolutionizing global business transactions with its latest version. 

The standard works seamlessly with frameworks like ISO/IEC 15000 (ebXML), extending the benefits of Electronic Data Interchange (EDI) systems to businesses worldwide. UBL V2.4 maintains backward compatibility with earlier V2.# versions while introducing new business document types, now totaling 93. The European Union has recognized UBL’s significance by declaring it officially eligible for referencing in tenders from public administrations. 

“UBL 2.4 represents a significant advancement, featuring enhanced support for B2C transactions, which will greatly benefit businesses and consumers alike,” said Kenneth Bengtsson, Chair of the UBL TC. “Additionally, it offers improved alignment with U.S. tax models, ensuring compliance and facilitating smoother transactions. These enhancements reaffirm our commitment to evolving and adapting UBL to meet the ever-changing needs of global commerce.”

As global sustainability efforts increase, UBL will expand its utility to encompass circular data exchange, reflecting the evolving needs of modern commerce. Looking ahead to the development of UBL V2.5, the TC will integrate circular economy data elements, marking a transformative step towards embedding sustainability into global supply chain data exchange. 

The UBL TC is forming a new UBL Commodities Subcommittee (SC), which aims to streamline electronic transactions for raw materials, recycled goods, and agricultural products in global supply chains, with the goal of improving efficiency, transparency, sustainability, and reliability in commodity markets. The SC will standardize UBL document types and semantic library entries for global commodity trading and procurement processes.

The UBL TC encourages global collaboration and actively seeks input from stakeholders to ensure the success of UBL as a cornerstone for sustainability data exchange. The TC welcomes a diverse range of contributors, including ERP vendors; software and service providers; national, regional and local public authorities; procurement and trade communities; e-invoicing networks; supply chain communities; and logistics and transportation companies. Participation is open to all through membership in OASIS, with interested parties encouraged to join and contribute to shaping the future of structured business document exchange. Contact join@oasis-open.org for more information.  

The post OASIS Approves Universal Business Language V2.4 Standard for Global Business Transactions appeared first on OASIS Open.

Wednesday, 26. June 2024

FIDO Alliance

FIDO APAC Summit 2024 Announces Keynotes, Speakers, and Sponsors


The FIDO Alliance is thrilled to announce the lineup for its highly anticipated second FIDO APAC Summit, set to take place at the JW Marriott Kuala Lumpur on September 10-11, 2024. Co-hosted by SecureMetric Technology and supported by Malaysia Digital Economy Corporation (MDEC) and CyberSecurity Malaysia, this premier event is dedicated to advancing phishing-resistant FIDO authentication across the region under the theme, “Unlocking a Secure Tomorrow.”

The summit will feature keynote addresses by notable leaders such as Gobind Singh Deo, Malaysia’s Minister of Digital; Dato’ Dr. Amirudin Abdul Wahab, CEO of CyberSecurity Malaysia; TS. Mohamed Kheirulnaim Mohamed Danial, Senior Assistant Director of National Cyber Coordination and Command Centre (NC4) & National Cyber Security Agency (NACSA); Andrew Shikiar, CEO & Executive Director of FIDO Alliance; and Edward Law, CEO of Securemetric. 

They will be joined by a distinguished roster of speakers including Christiaan Brand, Product Manager: Identity and Security at Google; Eiji Kitamura, Developer Advocate at Google; Henry (Haixin) Chai, CEO of GMRZ Technology / Lenovo; Hyung Chul Jung, Head of Security Engineering Group at Samsung Electronics; Khanit Phatong, Senior Management Officer at Thailand Electronic Transactions Development Agency; Masao Kubo, Manager of Product Design Department at NTT DOCOMO; Naohisa Ichihara, CISO at Mercari; Niharika Arora, Developer Relations Engineer at Google; Sea Chong Seak, CTO at SecureMetric; Simon Trac Do, CEO & Founder of VinCSS; Takashi Hosono, General Manager at SBI Sumishin Net Bank; Yan Cao, Engineering Manager at TikTok; and Hao-Yuan Ting, Senior Systems Analyst at Taiwan Ministry of Digital Affairs.

The updated list of speakers can be found here.

Among the speakers, Tin Nguyen, a former U.S. Marine and FBI Special Agent, now a cybersecurity expert, will discuss the benefits of passwordless authentication and how it enhances organizational defenses against cyber threats. “Cybercriminals continuously search for vulnerabilities to take advantage of. Therefore, it is imperative for organizations to implement strong cybersecurity measures to safeguard their users,” says Nguyen. “Implementing FIDO-based passkeys provides an extra layer of security, mitigating potential threats without compromising user experience.”

The event promises to attract hundreds of attendees and will feature keynote addresses, panel discussions, technical workshops, and an expo hall showcasing the latest innovations from leading technology companies such as Securemetric, VinCSS, OneSpan, iProov, Thales, AirCuve, Zimperium, RSA, Yubico, Identiv, Utimaco, FEITIAN, and many more. Attendees will have the opportunity to explore the latest trends in cybersecurity, network with top industry minds, and gain invaluable knowledge on implementing FIDO standards for enhanced security.

“The FIDO Alliance is thrilled to host its second FIDO APAC Summit 2024 in Malaysia, featuring presentations from some of the brightest minds in authentication from the APAC region and beyond,” said Andrew Shikiar, Executive Director and CEO of the FIDO Alliance. “With the continuous rise in the volume and sophistication of cyber-attacks, it is crucial for organizations to move past passwords and adopt passkeys, a user-friendly alternative based on FIDO standards.”

Registrations are now open to the public. For more information and to register, please visit www.fidoapacsummit.com. For sponsorship opportunities, please contact events@fidoalliance.org.

About the FIDO Alliance 

The FIDO (Fast IDentity Online) Alliance, www.fidoalliance.org, was formed in July 2012 to address the lack of interoperability among strong authentication technologies and remedy the problems users face with creating and remembering multiple usernames and passwords. The FIDO Alliance is changing the nature of authentication with standards for simpler, stronger authentication that define an open, scalable, interoperable set of mechanisms that reduce reliance on passwords. FIDO Authentication is stronger, private, and easier to use when authenticating to online services.

PR Contact 

press@fidoalliance.org


Me2B Alliance

Do SDKs Represent Actual Network Traffic in EdTech Apps?

1. Background 

In 2022, Internet Safety Labs (ISL) conducted an extensive benchmark of EdTech apps used in schools across the United States. We sampled 13 schools in each state and the District of Columbia and identified 1,722 unique apps in use in K-12 schools. During the benchmark, the apps were evaluated and scored on their behaviors related to safety. As part of the safety evaluation, SDKs in each app were identified, and researchers collected network traffic for 1,357 apps. In total, there were 275 unique SDKs in the apps, 8,168 unique subdomains, and 3,211 unique domains in the network traffic.  

A key research question in conducting the 2022 EdTech benchmark was to determine how accurate SDKs were as a proxy for actual third-party data sharing, since network traffic data collection is somewhat labor-intensive. This report shares the results of the analysis. 

2. Analysis 

The basis of the analysis was to compare the “expected” third parties, derived from the companies that own the SDKs, with the companies observed in the network traffic. This required identifying the owner companies for both the SDKs and all the subdomains observed in the aggregate network traffic.1 

Researchers first identified which SDKs were in use in apps by using AppFigures as a resource. In total, 275 unique SDKs were found in use across all apps. Next, researchers identified the companies who published these SDKs. For each app, the number of unique company owners of SDKs found in that app is referred to as the “expected” number of companies to receive data.  

Next, researchers performed a similar analysis on the subdomains observed in the network traffic (1,175 total apps). Each subdomain was resolved to an “owner” company.  Subdomains were identified from HTTP POST/GET requests captured in the network traffic. 
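In pseudo-code terms, the per-app comparison reduces to set intersection and difference over owner companies (a hedged, illustrative sketch; ISL's actual tooling is not published here):

// Hedged sketch of the per-app "expected vs. observed" comparison.
type AppTraffic = {
  sdkOwners: Set<string>;     // companies owning the app's SDKs ("expected")
  trafficOwners: Set<string>; // companies resolved from observed subdomains
};

function compare(app: AppTraffic) {
  // "expected" companies actually seen in network traffic
  const expectedSeen = [...app.sdkOwners].filter((c) =>
    app.trafficOwners.has(c)
  ).length;
  // companies in traffic not represented by any SDK
  const unexpectedSeen = [...app.trafficOwners].filter(
    (c) => !app.sdkOwners.has(c)
  ).length;
  return {
    expected: app.sdkOwners.size, // averages 4.7 across apps with 1+ SDK
    expectedSeen,                 // averages 1.7 (36.2%)
    unexpectedSeen,               // averages 8.4
    totalSeen: app.trafficOwners.size,
  };
}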

We then performed two quantitative analyses: (1) we examined the network traffic of apps with at least one SDK (n=1,083 apps), and (2) we examined the network traffic of apps with no SDKs (n=92 apps).  

2.1   Apps With at Least One SDK 

Apps with at least one SDK communicated with an average of 10.1 companies based on observed network traffic (Table 1).  

2.1.1   “Expected” Companies in Network Traffic 

In apps with at least one SDK, there were an average of 4.7 unique companies represented by the SDKs–thus, 4.7 “expected” companies to receive data. However, on average, only 1.7 (or 36.2%) of the “expected” companies were seen in the network traffic of apps with at least one SDK (Table 1). 

Note that there are several contributing factors that could account for this, including: 

– The manual testing performed by the researchers was unstructured and therefore had inconsistencies across researchers.
– The manual testing didn't exercise all functions in the app. For instance, the testers did not make any optional purchases or upgrade to a premium version.

Table 1: Apps containing at least one SDK (n=1,083)

| | Average Expected Companies | Average Expected Companies Seen | Average # Unexpected Companies Seen | Average Total # of Companies Seen |
|---|---|---|---|---|
| Webview – With (n=609) | 5.0 | 1.9 | 12.6 | 14.5 |
| Webview – Without (n=474) | 4.3 | 1.4 | 2.6 | 4.0 |
| Advertisements – With (n=189) | 5.6 | 2.1 | 24.0 | 26.1 |
| Advertisements – Without (n=894) | 4.5 | 1.6 | 5.0 | 6.6 |
| Behavioral Advertisements – With (n=105) | 5.4 | 2.1 | 33.7 | 35.8 |
| Behavioral Advertisements – Without (n=978) | 4.6 | 1.6 | 5.5 | 7.1 |
| ALL Tested Apps With 1+ SDK (n=1,083) | 4.7 | 1.7 | 8.4 | 10.1 |
2.1.2   “Unexpected” Companies in Network Traffic 

Additionally, as seen in Table 1, these apps communicated with an average of 8.4 unexpected companies.  

As expected, apps that used Webview2 or included advertisements or behavioral ads all had even higher average numbers of unexpected companies, with apps with behavioral ads having the highest at 33.7 unexpected companies on average3. The ISL app score rubric regards the use of Webview and the inclusion of advertising as very high risks for K-12 students, and the data in Table 1 reinforces the rubric.  

– Apps with at least one SDK that use Webview had 2.6 times as many third parties as apps with at least one SDK that don't use Webview.
– Apps with at least one SDK that include ads had 3.0 times as many third parties as apps with at least one SDK that don't include ads.
– Apps with at least one SDK that include behavioral ads had 4.0 times as many third parties as apps with at least one SDK that don't include behavioral ads.

2.2   Apps with No SDKs

There were 92 apps in the data set that had no SDKs and for which we had network traffic. Since these apps had no SDKs, there were no “expected” companies to receive data from the app [other than the app developer, of course].  

Apps with no SDKs averaged 4.6 companies observed in network traffic—negligibly less than the average for apps with at least one SDK. However, for apps that use Webview or include advertising or behavioral advertising, the average number of observed companies is markedly lower than for their counterparts with at least one SDK (Table 2).  

– Apps with no SDKs that use Webview had 44.1% fewer observed companies than their counterparts with at least one SDK.
– Apps with no SDKs that include advertising had 40.6% fewer observed companies.
– Apps with no SDKs that include behavioral advertising had 21.0% fewer observed companies.

Table 2: Apps with no SDKs

| | Average # of Companies Seen |
|---|---|
| Webview – With (n=43) | 8.1 |
| Webview – Without (n=49) | 1.6 |
| Advertising – With (n=11) | 15.5 |
| Advertising – Without (n=81) | 3.2 |
| Behavioral Ads – With (n=4) | 28.3 |
| Behavioral Ads – Without (n=88) | 3.6 |
| All Tested Apps Without SDKs (n=92) | 4.6 |
3. Conclusion
3.1   SDKs as a Proxy for Third Party Sharing

As the data shows, SDKs aren’t a useful proxy for the actual number of third parties receiving data from the app. Moreover, apps that include ads or that use Webview will likely have significantly more third parties than apps without.  

This means that viable measurement of third parties receiving data from apps requires testing and observation of network traffic. ISL used mostly manual methods for the collection of this data but automated methods would be extremely beneficial for ongoing and pervasive measuring of app third party sharing.  

SDKs do provide value in identifying potential omissions in the manual testing process. Can we account for the specific SDKs that don’t appear in the network traffic? Did we miss a particular functional branch of the app that we should go back and test? Or might it be an indication of an error in the SDK database? So while SDKs don’t serve as a perfect indication of the third parties communicating with the app, they still provide valuable information, and as such, they will remain in our app safety labels (see https://appmicroscope.org/).  

3.2   Validation of ISL App Scoring Rubric 

As shown in section 2, use of Webview and the inclusion of advertising substantially increase user exposure to data sharing with more third parties. This finding reinforces the ISL app scoring rubric wherein the use of Webview and presence of advertising are indicators for very high risk. 

4. Helpful Links 

App Microscope 

SDK Risk Dictionary 

Domain Risk Dictionary 

Company Risk Dictionary

 

Footnotes:
1. See the SDK Risk Dictionary and the Subdomain Risk Dictionary for details.
2. Note: researchers determined the use of Webview manually, by observing third-party pages opening within the app. Thus, the presence of Webview as tagged in ISL's AppMicroscope.org may not accurately assess Webview use for first-party web pages.
3. It would be interesting to study how many apps have behavioral ads and don't use Webview.

The post Do SDKs Represent Actual Network Traffic in EdTech Apps? appeared first on Internet Safety Labs.


Elastos Foundation

The New Bretton Woods: How BeL2 Aims to Transform Global Finance using Native Bitcoin


In the annals of financial history, few events have shaped the global economic landscape as profoundly as the establishment of the Bretton Woods system. In 1944, amidst the ruins of World War II, representatives from 44 Allied nations convened in Bretton Woods, New Hampshire, to create a new framework for international economic cooperation. The primary goal was to prevent the economic instability and competitive devaluations that had contributed to the Great Depression and the war.

The Bretton Woods system pegged major currencies to the US dollar, which was convertible to gold at a fixed rate. This effectively made the US dollar the world’s reserve currency, providing much-needed stability and fostering economic growth. However, by the late 1960s, the system began to unravel. The US faced mounting balance-of-payments deficits and dwindling gold reserves. On August 15, 1971, President Richard Nixon unilaterally ended the dollar’s convertibility to gold, effectively dismantling the Bretton Woods system and ushering in the era of fiat currencies.

The transition to fiat currencies, while offering greater flexibility for monetary policy, also introduced significant challenges. Governments could now print money without restraint, leading to inflation, currency devaluations, and a series of financial crises. Today, the world faces a staggering $307 trillion in debt, excessive currency issuance, declining bank credit, and rising economic instability. This backdrop underscores the need for a new financial paradigm, one that combines stability with the technological advancements of the digital age.

 

Bitcoin: Digital Gold

Bitcoin, created in 2009 by the pseudonymous Satoshi Nakamoto, was designed as a decentralised digital currency that could operate independently of central banks and governments. Often referred to as “digital gold,” Bitcoin possesses many qualities that make it an ideal candidate for a global reserve asset: it is scarce (with a cap of 21 million coins), durable, portable, and easily divisible. Bitcoin’s blockchain technology ensures transparency, security, and resistance to censorship, making it a robust vehicle to support fiat currencies and value exchange.

Despite its adoption and over $1 trillion in value, Bitcoin’s mainstream financial use faces challenges like scalability and programmability limitations. Its high decentralisation and security make transactions slower and resource-intensive. While its simplicity ensures robust security, it limits Bitcoin’s ability to handle complex transactions like digital agreements for loans or exchanges. Innovations like Ethereum, which introduced smart contracts in 2015, offer more functionality, leading to Layer 2 solutions aimed at uniting technologies and enhancing Bitcoin’s capabilities.

 

Bitcoin Layers

Layer 2 solutions are protocols built on top of a blockchain (Layer 1) to enhance performance and enable more complex functionalities. For Bitcoin, Layer 2 technologies like the Lightning Network and sidechains address issues of transaction speed, programmability and scalability, while bridges facilitate interoperability with other blockchain ecosystems. These solutions allow Bitcoin to interact with other blockchain ecosystems and innovations like Ethereum, enabling smart contracts and decentralised applications (DApps) that were previously not possible. However, there is a problem.

 

Inherent Problems

Scalability layers involve bridging Bitcoin off its main network and into these environments, creating security concerns that undermine its decentralised ethos. Wrapped Bitcoin (WBTC), for instance, is an ERC-20 token that represents Bitcoin on Ethereum networks. While it brings Bitcoin’s liquidity to more programmable finance platforms, it has several critical issues:

– Centralisation Risk: WBTC requires users to trust centralised institutions to manage and safeguard the Bitcoin backing the WBTC. If these institutions act maliciously, users have no recourse, undermining Bitcoin's decentralisation ethos.
– Custodian Risk: The centralised custodians holding the actual Bitcoin can potentially be hacked or face regulatory pressures, putting users' assets at risk.
– Lack of Transparency: Users must rely on the custodians' transparency regarding the actual reserves backing the WBTC, which may not always be reliable.

Recent cross-chain bridge hacks, such as the Nomad Bridge exploit, highlight these vulnerabilities. Chainalysis reports that $2 billion has been stolen in 13 cross-chain bridge hacks, accounting for 69% of total funds stolen in 2022, with North Korean-linked hackers stealing approximately $1 billion. Currently, there are more than 70 cross-chain bridges with over $25 billion locked and daily transaction volumes in the millions. Synapse, a popular cross-chain bridge, has surpassed $5 billion in transaction volume. Bridges are vulnerable due to their structure, combining custodians, debt issuers, and oracles, each presenting multiple attack vectors. For instance, the Poly Network and Wormhole attacks showcased vulnerabilities in cross-chain communication, resulting in significant losses.

To mitigate these risks, it is crucial for Bitcoin’s evolution to connect other innovation layers through information transmission rather than asset transfer, while decentralising staking on Bitcoin to avoid pooling assets together. This approach keeps Bitcoin native, secure, and decentralised, while enabling broader financial applications in scalable environments.

 

Native Bitcoin DeFi, Pioneering the New Bretton Woods System

Native Bitcoin refers to Bitcoin that remains on its main network while being collateralised for Layer 2 DeFi applications. Through BeL2, Bitcoin can participate in complex financial transactions without being transferred off the Bitcoin blockchain, maintaining its security and decentralisation. This allows Bitcoin to act as a versatile tool in DeFi ecosystems, leveraging its inherent strengths while expanding its functionality into smart services such as swaps, loans, and stablecoin issuance.

– Staking: BeL2 employs non-custodial native Bitcoin staking in decentralised wallets, providing security for the network.
– Zero-Knowledge Proofs (ZKPs): These provide private and verifiable proofs on Bitcoin staking transactions.
– BTC Oracle: Connects BTC proof information into Layer 2 smart contracts, facilitating Bitcoin DeFi services without moving assets off the main network.
– Arbiter Network: Utilises decentralised and collateralised nodes to facilitate time-based execution and dispute resolution, enhancing trustless financial operations.

BeL2 transmits information, not assets, across chains, preserving Bitcoin’s security and integrity while enabling smart contracts and decentralised applications for complex financial transactions. This approach eliminates reliance on centralised entities and ensures secure, decentralised financial operations. By keeping Bitcoin native, BeL2 ensures that the original blockchain remains the ultimate trust anchor for all transactions, thereby maintaining the foundational principles of Bitcoin’s decentralised ethos.

 

Native Bitcoin Loan App Demo

In the context of BeL2’s transformative role in decentralised finance (DeFi), the BeL2 Loan Dapp Demo showcases its potential to enhance Bitcoin’s utility while maintaining security and decentralisation. This demo is the first native Bitcoin lending protocol built on StarkWare’s Cairo programming language, allowing users to lock their native Bitcoin as collateral without relying on Wrapped Bitcoin (WBTC) or cross-chain bridges. The BTC remains on the Bitcoin mainnet, ensuring non-custodial and non-liquidatable collateral.

Users lock their Bitcoin through a bespoke transaction script, and the loan terms, including interest rates and conditions for collateral release, are governed by a smart contract on the Ethereum Virtual Machine (EVM). BeL2’s arbiter network acts as an intermediary, facilitating communication between the Bitcoin and EVM chains and verifying transaction proofs. If a borrower, like Alice, fails to repay a loan, the lender, Bob, can retrieve the BTC. If Bob refuses to cooperate in unlocking the BTC after repayment, Alice can initiate arbitration, and the arbiter will co-sign to unlock the BTC.

This peer-to-peer system ensures fairness and security through zero-knowledge proofs and the arbiter network. Any malicious actions by the arbiter or parties involved are deterred by the ability to challenge and penalise misconduct, ensuring cooperation is the best outcome. This innovative approach enables Bitcoin holders to access liquidity while preserving Bitcoin’s core principles of decentralisation and security.

 

Principles for a New Native Bitcoin Bretton Woods System:

– Decentralised Global Settlement: Native Bitcoin must act as a global settlement layer where all transactions are secured on Bitcoin's main network, ensuring it remains the ultimate trust anchor for global finance.
– Financial Innovation and Stability: By integrating native Bitcoin with smart contracts and DApps, we can support new financial products like BTC-backed loans and stablecoins, providing liquidity and stability to the global economy whilst uniting all layers.
– Trustless and Transparent Operations: The implementation of information transmission through zero-knowledge proofs and a decentralised arbiter network ensures trustless and transparent financial operations for native Bitcoin applications, reducing counterparty risk and enhancing transaction integrity.

BeL2’s vision is to become the defining piece of native Bitcoin infrastructure for a new Bretton Woods system. Emerging from the Elastos SmartWeb vision, which aims to create a decentralised internet where data, applications, and identities are secure, private, and user-owned, BeL2 strives to become a pivotal component of native Bitcoin infrastructure, transforming Bitcoin from digital gold into the cornerstone of a new global financial system.

BeL2 leverages Elastos’ secure infrastructure, using ELA as collateral for arbiters to ensure robust and trustless dispute resolution. ELA, an asset merge-mined with over 50% of Bitcoin’s miner security, adds an additional layer of security and decentralisation to the BeL2 ecosystem, reinforcing both projects’ commitment to a secure and decentralised financial future. Excited to learn more? Head over to the BeL2 website and follow Infinity for the latest updates!


Oasis Open Projects

Invitation to comment on OData Vocabularies v4.0


Public review ends July 24th

OASIS and the OASIS Open Data Protocol (OData) TC [1] are pleased to announce that OData Vocabularies Version 4.0 is now available for public review and comment. This 30-day review is the second public review for this specification.

The Open Data Protocol (OData) enables the creation of REST-based data services, which allow resources, identified using Uniform Resource Locators (URLs) and defined in an Entity Data Model (EDM), to be published and edited by Web clients using simple HTTP messages.

OData Vocabularies v4.0 describes a set of OData vocabularies maintained by the OASIS OData Technical Committee. These vocabulary components are continuously evolved.

The documents and related files are available here:

OData Vocabularies Version 4.0
Committee Specification Draft 02
19 June 2024

Editable source (Authoritative):
https://docs.oasis-open.org/odata/odata-vocabularies/v4.0/csd02/odata-vocabularies-v4.0-csd02.md
HTML:
https://docs.oasis-open.org/odata/odata-vocabularies/v4.0/csd02/odata-vocabularies-v4.0-csd02.html
PDF:
https://docs.oasis-open.org/odata/odata-vocabularies/v4.0/csd02/odata-vocabularies-v4.0-csd02.pdf

For your convenience, OASIS provides a complete package of the prose specification and related files in a ZIP distribution file. You can download the ZIP file at:
https://docs.oasis-open.org/odata/odata-vocabularies/v4.0/csd02/odata-vocabularies-v4.0-csd02.zip

How to Provide Feedback

OASIS and the OData TC value your feedback. We solicit feedback from potential users, developers and others, whether OASIS members or not, for the sake of improving the interoperability and quality of its technical work.

This public review starts 25 June 2024 at 00:00 UTC and ends 24 July 2024 at 11:59 UTC.

Comments may be submitted to the TC by any person directly at:
Technical-Committee-Comments@oasis-open.org
Please use a subject line like “Comment on OData Vocabularies”.

Comments submitted for this work and for other work of this TC are publicly archived and can be viewed at:
https://groups.google.com/a/oasis-open.org/g/technical-committee-comments/.
Previous comments on OData works are archived at https://lists.oasis-open.org/archives/odata-comment/.

All comments submitted to OASIS are subject to the OASIS Feedback License [2], which ensures that the feedback you provide carries at least the same obligations as those of the TC members. In connection with the public review of these works, we call your attention to the OASIS IPR Policy [3] applicable especially [4] to the work of this technical committee. All members of the TC should be familiar with this document, which may create obligations regarding the disclosure and availability of a member’s patent, copyright, trademark and license rights that read on an approved OASIS specification.

OASIS invites any persons who know of any such claims to disclose these if they may be essential to the implementation of the above specifications, so that notice of them may be posted to the notice page for this TC’s work.

Additional information about these specifications and the OData TC may be found on the TC’s public home page.

========== Additional references:

[1] OASIS Open Data Protocol (OData) TC
https://groups.oasis-open.org/communities/tc-community-home2?CommunityKey=e7cac2a9-2d18-4640-b94d-018dc7d3f0e2
https://www.oasis-open.org/committees/odata/

[2] OASIS Feedback License:
https://www.oasis-open.org/who/ipr/feedback_license.pdf

[3] https://www.oasis-open.org/policies-guidelines/ipr/

[4] https://www.oasis-open.org/committees/odata/ipr.php
https://www.oasis-open.org/policies-guidelines/ipr/#RF-on-RAND-Mode
RF on RAND Mode

The post Invitation to comment on OData Vocabularies v4.0 appeared first on OASIS Open.


Next Level Supply Chain Podcast with GS1

50 Years of Confidence, Supply Chain Success, and the Next Dimension in Barcodes


Celebrating 50 years of the barcode, hosts Reid Jackson and Liz Sertl speak to an impressive lineup of industry experts, direct from Orlando at GS1 US’s yearly conference, Connect. They chat with:

Dave DeLaus, CIO at Wegmans, dissects the complexities of integrating new technologies to enhance consumer experience and shares how Wegmans is tackling the challenges of implementing 2D barcodes for better product traceability.

Sean Murphy from Cencora demystifies the Drug Supply Chain Security Act and emphasizes the necessity of unique serial numbers and digital backpacks for pharmaceutical products to ensure safety and compliance in the healthcare industry.

Andrew Meadows, founder and CEO of BL.INK, introduces the intriguing world of 2D barcodes and digital resolvers. Learn how BL.INK’s platform, BL.INK CXP, revolutionizes consumer engagement by providing personalized experiences and enhancing data privacy.

JW Franz from Barcoding Inc. emphasizes the importance of supply chain automation innovation and the future of barcoding, including RFID and computer vision technologies.

They all speak on the gradual implementation of new technologies, the strategic importance of 2D barcodes, and the transformative potential of computer vision in inventory management. The episode also covers the crucial role of standardization and regulatory compliance in healthcare and explores the exciting advancements paving the way for smarter, safer, and more efficient supply chains.

 

Key takeaways:

Discover how the integration of 2D barcodes and QR codes, paired with advancements in computer vision, is revolutionizing retail and supply chain management for enhanced consumer experiences and operational efficiency.

Explore the significant impact of the Drug Supply Chain Security Act and the digital backpack concept on pharmaceutical traceability, with insights from Sean Murphy of Cencora on how serialization ensures compliance and safety.

Learn about BL.INK’s innovative 2D barcode technology and digital resolvers, with Andrew Meadows explaining how these tools enable personalized consumer interactions and secure data privacy, driving a more direct and meaningful brand engagement strategy.

 

Jump into the Conversation:

 

[00:00] Welcome to Next Level Supply Chain

[00:48] Coming to you from GS1 Connect 2024 in Orlando

[02:45] Introducing Dave DeLaus, CIO at Wegmans

[03:42] Hot Topics with Wegmans

[04:47] Some insights on use of the 2D barcode at Wegmans

[06:01] How you can interact with the 2D barcode differently for your customer

[10:17] Introducing Sean Murphy with Cencora

[12:14] Cencora’s use of EPCIS or Electronic Product Code Information Service

[14:04] Leveraging RFID technology

[14:53] Focusing on DSCSA to create a smart, safe, and sustainable supply chain

[16:48] 2D barcodes in the pharmaceutical and healthcare industry

[18:49] Introducing Andy Meadows, founder and CEO of BL.INK

[19:25] BL.INK platform and digital resolvers

[24:22] Advising product manufacturers about BL.INK

[25:37] Andy’s thoughts on the future of 2D barcodes

[27:53] Introducing JW Franz from Barcoding Inc.

[29:07] JW’s biggest takeaway from attending Connect

[29:48] Barcoding Inc’s current focus

[30:28] JW’s thoughts on the future of RFID and 2D barcodes

 

Connect with GS1 US:

Our website - www.gs1us.org

GS1 US on LinkedIn

 

Connect with the guests:

Dave DeLaus - CIO, Wegmans 

Sean Murphy - Senior Manager of Manufacturing Operations, Cencora

Andrew Meadows - Founder & CEO, BL.INK

JW Franz - IoT Automation Solution Director, Barcoding Inc.


Digital Identity NZ

Postcard from Berlin | June Newsletter

Earlier this month, I was lucky enough to be personally invited by Joerg Resch to attend Europe’s flagship digital ID conference in Berlin, EIC 2024. The post Postcard from Berlin | June Newsletter appeared first on Digital Identity New Zealand.

Kia ora,

Earlier this month, I was lucky enough to be personally invited by Joerg Resch to attend Europe’s flagship digital ID conference in Berlin, EIC 2024. For someone who regularly spoke on this circuit for over 10 years, it was a blast to be back, revelling in the richness of the presentations and discussion, highlighted in this post by my predecessor at the Kantara Initiative. The brain is fully engaged for hours, absorbing expert insight, experience, and innovation that is found here at EIC in Germany and Identiverse in the US, the world’s two biggest conferences in this space. I’m looking forward to a tiny fraction of the ground being covered and contextualised locally at our Digital Trust Hui Taumata in just six weeks’ time.

The pre-conference SIDIHub made significant progress toward the challenging goal of cross-border digital identity interoperability. Participants were interested to learn that Aotearoa will become the first common law country to implement a regulated digital identity trust framework, on July 1st. The framework regulates stakeholders that opt in for accreditation, thereby increasing customer trust in their security and privacy settings. The US, UK, Canada, and Australia have digital ID trust frameworks in operation or being piloted, but none is yet nationally legislated. Credential authentication and verification continue to evolve in both policy and technology, and at different speeds, making this a complex issue.

At our recent DIA-hosted public/private sector working group meeting on digital identification standards, I commented that there is still much work needed globally before we can adopt comprehensive policies, protocols and standards for decentralised digital ID and its container, the wallet. This is a plane we will continue to build as we fly it.

The high interest in digital ID led us to host/co-host two events in June: the capacity-filled ‘Digital Identity, Higher Education, Aotearoa’ sponsored by Middleware and featuring the University of Auckland, and a Town Hall-styled session on digital cash with the Reserve Bank of New Zealand Te Pūtea Matua (RBNZ), in partnership with FinTechNZ. Both events showed how important broad digital trust is for ensuring cybersecurity and protecting against deepfakes, scams and hacking threats. Awareness and education are essential, so we thank our members for supporting DINZ initiatives, just as DINZ supports members’ initiatives like the upcoming series from NEC.

And finally, the DIA has released a new schedule of Identification Masterclasses through to August.

To register for any of the Zoom sessions, please email identity@dia.govt.nz with the G or HD reference number and a Zoom link will be supplied.

Ngā mihi

Colin Wallis
Executive Director, Digital Identity NZ

Read full news here: Postcard from Berlin | June Newsletter

SUBSCRIBE FOR MORE

The post Postcard from Berlin | June Newsletter appeared first on Digital Identity New Zealand.

Monday, 24. June 2024

GS1

Coca-Cola’s reusable, refillable bottles benefit from innovative QR Codes powered by GS1


To reach their goal of 40% refillable bottles by 2030, Coca-Cola Latin America needed a way to know how many times a given bottle had been through the refill cycle.

By laser engraving a unique identifier onto every bottle, Coca-Cola can know how many filling cycles the bottle has gone through, and whether it should be refilled or recycled.

Beyond the positive sustainability impact, the initiative provides a valuable set of data about each bottle’s journeys through the market across its lifecycle.

case-study-gs1-brazil-coca-cola.pdf
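Under the hood, QR codes “powered by GS1” typically carry a GS1 Digital Link URI, which packs the product’s GTIN (Application Identifier 01) and a per-item serial number (Application Identifier 21) into an ordinary web address. A minimal Python sketch follows; the GTIN, serial, and resolver domain are illustrative, not Coca-Cola’s actual values:

```python
# Minimal sketch of building and parsing a GS1 Digital Link URI.
# AI 01 = GTIN, AI 21 = serial number; all values below are illustrative.

def build_digital_link(gtin: str, serial: str, domain: str = "https://id.gs1.org") -> str:
    """Encode a GTIN and per-item serial number into a GS1 Digital Link URI."""
    return f"{domain}/01/{gtin}/21/{serial}"

def parse_digital_link(uri: str) -> dict:
    """Extract GS1 Application Identifier key/value pairs from the URI path."""
    parts = uri.split("/")[3:]              # drop scheme and domain
    return dict(zip(parts[0::2], parts[1::2]))

uri = build_digital_link(gtin="07894900011517", serial="BOTTLE-000042")
print(uri)                      # https://id.gs1.org/01/07894900011517/21/BOTTLE-000042
print(parse_digital_link(uri))  # {'01': '07894900011517', '21': 'BOTTLE-000042'}
```

Resolving such a URI against the brand’s own records is what lets each scanned bottle be matched to its filling-cycle count and routed to refill or recycle.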

Safe meals and snacks were served to 12,500 athletes during the Hangzhou Asian Games


A top priority of the organisers of the Hangzhou Asian Games was to serve safe food from trusted supply chain partners.

Under the stewardship of Zhejiang AMR, QR Codes powered by GS1 were extensively implemented across the entire end-to-end food supply chain of the Games.

Zero food safety accidents – an accomplishment acknowledged by Thomas Bach, President of the International Olympic Committee.

case-study-gs1-china-hangzhou-games.pdf

Ceramic Network

Calling all devs: Build composable search applications for the Base Onchain Summer

Ceramic is partnering with Index Network to challenge developers to build composable search use-cases between Base and other projects participating in Base’s Onchain Summer. For example, those use-cases can include:

Composability with commerce (Shopify)
Composability with social graphs (Farcaster)
Composability with on-chain (Zora, Nouns)

The bounty is officially hosted on bountycaster.

About the Index Network

Index is a discovery protocol, built on Ceramic, that eliminates the need for intermediaries when finding knowledge, products, and like-minded people through direct, composable discovery across the web. By leveraging Web3 and AI, Index offers an open layer for discovery as the first decentralized semantic index. It functions as a composable vector database with a user-centric perspective, enabling interaction with decentralized graphs like Ceramic Network for user-owned knowledge graphs and Farcaster for social discourse.
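To make the “composable vector database” idea concrete, here is a toy in-memory index with cosine-similarity search. This is deliberately not the Index Network SDK (whose actual API is covered in the documentation links below); it only sketches the underlying mechanism of composing items from multiple sources into one queryable index:

```python
import math

class ToyVectorIndex:
    """Toy semantic index: store (id, embedding) pairs, query by cosine similarity."""

    def __init__(self):
        self.items: dict[str, list[float]] = {}

    def add(self, item_id: str, embedding: list[float]) -> None:
        self.items[item_id] = embedding

    @staticmethod
    def _cosine(a: list[float], b: list[float]) -> float:
        dot = sum(x * y for x, y in zip(a, b))
        norm = math.sqrt(sum(x * x for x in a)) * math.sqrt(sum(x * x for x in b))
        return dot / norm if norm else 0.0

    def query(self, embedding: list[float], top_k: int = 3) -> list[tuple[str, float]]:
        scored = [(i, self._cosine(embedding, v)) for i, v in self.items.items()]
        return sorted(scored, key=lambda p: p[1], reverse=True)[:top_k]

# Two "sources" composed into one index -- the composability the bounty asks for.
index = ToyVectorIndex()
index.add("base:docs/bridging", [0.9, 0.1, 0.0])
index.add("farcaster:channel/base", [0.7, 0.3, 0.1])
print(index.query([0.8, 0.2, 0.05], top_k=2))
```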

About the bounty

For this bounty, developers have access to the Base search engine created on Index Network. They can utilize this index and integrate it with other projects and tools participating in Base’s on-chain summer to innovate and enhance information discovery experiences. Additionally, using Farcaster Channel indexes as a data source can help create personalized applications.

TIP: Consider developing agents to facilitate user interactions with Index, such as notification agents, context subscription agents, or multi-agent scenarios that enable conversational participation.

Prizes

A total prize pool of 2,250 USDC will be distributed across the top three applications:

First place: 1,000 USDC
Second place: 750 USDC
Third place: 500 USDC

Useful links

Below you can find all of the tools and links available for you to build for this bounty.

Bounty:

Official link to the bounty on bountycaster

Indexes:

Create new indexes (for bounty builders only)
Base documentation index
Index Network Profile on Index
Farcaster Channels Profile on Index

Documentation, tutorials and support:

Index.Network documentation
Video tutorial for Farcaster contextual subscription
Getting Started with Index Network SDK
Video tutorial: Creating the Base documentation index
GitHub
Discord and forum for technical support

FIDO Alliance

The Register: AWS is pushing ahead with MFA for privileged accounts. What that means for you.

AWS is making multi-factor authentication (MFA) mandatory for privileged users, specifically management account root users and standalone account root users. Customers must enable MFA within a 30-day grace period to maintain account access.
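For readers wanting to audit their own accounts ahead of the deadline, a minimal sketch using boto3 (this assumes configured AWS credentials with IAM read permissions; it is not AWS’s official tooling):

```python
import boto3

iam = boto3.client("iam")

# AccountMFAEnabled is 1 if this account's root user has MFA enabled, 0 if not.
summary = iam.get_account_summary()["SummaryMap"]
print("Root MFA enabled:", bool(summary.get("AccountMFAEnabled")))

# List IAM users that have no MFA device attached.
for page in iam.get_paginator("list_users").paginate():
    for user in page["Users"]:
        devices = iam.list_mfa_devices(UserName=user["UserName"])["MFADevices"]
        if not devices:
            print("No MFA:", user["UserName"])
```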


IT Brew: FIDO Alliance announces identity-proofing certification

FIDO’s Face Verification Certification tests for security, liveness, and bias in remote identity verification technology through FIDO-accredited laboratories, and ISO and industry standards. Andrew Shikiar, Executive Director and CEO of the FIDO Alliance, highlights that this certification technology “gives licensing companies added assurance that a vendor is performing well.”


Find Biometrics: ID Talk: Passkeys, Standards, and Selfie Certification with FIDO’s Andrew Shikiar

Andrew Shikiar, FIDO’s Executive Director and CEO, discusses key topics in authentication and identity security on the ID Talk podcast (produced by Find Biometrics), including passkeys, phishing threats, deepfakes, FIDO’s vendor accreditation, and the new Face Verification Certification program.


DIF Blog

Blockchain and Identity

Theatre 4 was the place to be at Identity Week Europe in Amsterdam earlier this month, when a series of presentations and panel discussions on decentralized identity and blockchain proved one of the exhibition hall's top draws.

The session, on the afternoon of Day 2, began with a panel discussion, "Blockchain and ID" moderated by Alex Tourski, with Steffen Schwalm, Co-ordinator, TRACE4EU, Maarten Boender, INATBA Identity Workgroup, Sphereon.com and William Wang, Founder, Palau Digital Residency Program (RNS.ID).

Alex Tourski: "Why does blockchain need identity?"

Steffen Schwalm: "If you only want to prove identities, you can use a PKI. But if you want to combine identity and transactions in one system — for example, if you want to trace the parts and materials in your Tesla’s battery — you need a distributed ledger." 

Maarten Boender: "I agree. How can you be held accountable for what’s written to the blockchain, unless the transaction is signed by your identifier? If you need to make the audit trail of your product evident, that’s much easier to do with DLT (Distributed Ledger Technology). The DID document can’t be changed and will be around for as long as the blockchain it resides on, which is available always and everywhere. There’s no single point of failure.

"There are not so many other systems with the properties of a DLT". 

Alex Tourski: "Can DIDs be considered a universal identifier scheme, when there are around 200 DID methods?" 

Steffen Schwalm: "We have multiple credential data models, signature formats and protocols. What matters is achieving interoperability. I don’t have a problem with 500 DID methods, as long as we have a universal resolver that works."

Maarten Boender: "There are many types of database. As long as everyone talks SQL, its’ fine. Consumers won’t need to think about 200 DID methods. You’ll only have one choice, whether to login with your EU digital identity wallet." 

Alex Tourski: "How do we know blockchain-based identifiers will persist?" 

William Wang: "Why do IP addresses exist? Because there's an underlying need to transfer information. If there's a better way of doing this in ten years’ time, IPs may vanish. Blockchain exists because there’s a need for instant transfer of value. Maybe another way will arise and blockchain will disappear. We can’t say anything will exist for sure, even in 5 years time." 

Maarten Boender: "DIDs and DLTs will be essential tools for businesses that need to provide audit trails. Qualified electronic ledgers are part of eIDAS 2.0. They are managed by organizations that are certified and fully liable to maintain the ledger."

Steffen Schwalm: "I’m pretty sure nothing of current IT systems will still be here in 50 years' time. It’s the data that needs to persist." 

Sovrin: an example of a blockchain-based identity system 

Stephen Curran, who chairs the Sovrin Foundation's board of trustees, and is a long-term contributor to Hyperledger Indy, Aries and AnonCreds, took to the stage to give an update on the Sovrin Network.

“Picking up from the previous talk, Sovrin is a distributed ledger that's used for identity. It’s global, for public-private use and enables different ecosystems of users.

“We provide a platform for issuers to publish information, that enables verifiers to independently verify this information. Sovrin is a valid place for any ecosystem where DIDs are used. It’s not tied to the Hyperledger stack,” he added. 

Stephen described the Lawyer Verifiable Credential, which is used to ensure certain systems can only be accessed by qualified lawyers. "The Law Society of British Columbia issues a VC confirming the holder is certified to practice law. The data is held in a wallet, enabling the holder to present it directly to verifiers, such as these restricted systems, without the issuer knowing this has happened. 

"The Verifier reaches down into where the DIDs are to verify it's exactly what the issuer said, using issuer’s public keys.

“The Government of British Columbia is also very concerned about the surveillance economy. The goal of AnonCreds is to share the minimum possible data that’s needed for each use case. We’re trying to remove correlateability, traceability and surveillance”. 
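To sketch what that verification step looks like in code: the verifier obtains the issuer's DID document, takes a public key from it, and checks the credential's signature, all without contacting the issuer. A minimal Python sketch assuming an Ed25519 proof and a hex-encoded raw key; the field names are illustrative rather than a specific proof suite:

```python
from cryptography.exceptions import InvalidSignature
from cryptography.hazmat.primitives.asymmetric.ed25519 import Ed25519PublicKey

def verify_credential(credential_bytes: bytes, signature: bytes, did_document: dict) -> bool:
    """Check an Ed25519 signature against the keys published in the issuer's DID document.

    The issuer is never contacted, so it cannot observe the verification.
    "publicKeyHex" is an illustrative field name; real proof suites vary.
    """
    for method in did_document.get("verificationMethod", []):
        raw = bytes.fromhex(method["publicKeyHex"])  # must be 32 raw Ed25519 bytes
        key = Ed25519PublicKey.from_public_bytes(raw)
        try:
            key.verify(signature, credential_bytes)
            return True  # the presented credential matches what the issuer signed
        except InvalidSignature:
            continue
    return False
```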

Stephen also reminisced about "Sovrin’s infamous token days. We got through that. We had strong technology and governance, and that’s what we took forward.

"The technology is very solid and robust — we’ve had 100% uptime for the past 5 years”. 

Enabling the Economy of Trust

Next on stage was Catherine Fankhauser, Head of Identity at SICPA, who provided an overview of how authentication, data authenticity and communication have evolved since the inception of the web, and the impact this has had on digital trust.

Turning to the new generation of decentralized technologies, she shared that adoption will be driven by credentials with daily utility. In the context of the EU Digital Identity Wallet, this means lower-assurance credentials that improve the user experience, for example via passwordless access to online services.

Catherine concluded her presentation by highlighting the Unlimitrust Campus, the world's first site dedicated to the Economy of Trust.

Are unique identifiers a good idea?

In the final session, Alex Tourski returned to moderate a panel discussion focused on unique identifiers, with Executive Directors Judith Fleenor of Trust over IP Foundation and Mary Camacho of Holochain, plus Maarten Boender, Stephen Curran and myself.

Alex Tourski kicked off the discussion by highlighting how the lack of persistent unique identifiers for digital assets means content created in the early days of the internet is often irretrievable today, with broken weblinks all that remains.

He asked the panel to consider the proposition that "transparency means safety", citing his home country of Ukraine, where false rumours contributed to the war.

"When an Uber driver likes or dislikes me, should their feedback not be connected to an identifier, to ensure accountability?," he added.

The panel's response was unanimous: assigning persistent unique identifiers to natural persons is a bad idea — though persistent identifiers may make sense for organisations, and certain types of physical and digital objects. Moreover, there are many benefits from using a standardized identifier framework, such as the Decentralized Identifier (DID) specification.

Judith Fleenor pointed out that the Internet Protocol (IP) succeeded because it does the minimum required to establish a universal data transfer mechanism. Similarly, while content provenance is needed to address the explosion of fake content, creators must be able to use multiple identifiers, to minimize privacy risks.

Mary Camacho agreed, adding: "Not all societies are as well-governed and free as the Netherlands. In some places, knowing who took a photo could mean death for that person."

Alex Tourski asked the panel whether Privacy Enhancing Technologies can protect us from the dangers of unique identifiers.

In response, Stephen Curran stated that cryptographic techniques and identifiers are separate topics, and that clever cryptography doesn't mitigate the privacy risks of assigning unique identifiers to natural persons.

Maarten Boender agreed: "We're trying to make it impossible to correlate a holder's use of their credentials, which is the opposite of creating a single identifier."


Oasis Open Projects

OASIS Membership Elects New Leaders to its Board of Directors

Dr. Pablo Breuer of Orthogonal Insights, Daniel Rohrer of NVIDIA, and Daniella Taveau of Bold Text Strategies Bring Diverse Expertise to OASIS

Boston, MA, USA, 24 June 2024 – OASIS Open, the international standards and open source consortium, announced the results of its 2024 Board of Directors Annual and Special Elections. The Board, selected by the OASIS membership, will continue guiding the organization’s direction by encouraging ongoing growth and fostering increased collaboration within the open source and standards communities. Newly elected Board members from the Annual Election are Daniel Rohrer, VP of Product Security, Architecture and Research, at NVIDIA, and Daniella Taveau, President of Bold Text Strategies. Additionally, Jim Cabral, Gershon Janssen of Reideate, Bret Jordan, and Vasileios Mavroeidis of University of Oslo and Sekoia.io were re-elected. These individuals will serve two-year terms ending in 2026. Dr. Pablo Breuer, President of Orthogonal Insights, was elected in the Special Election to serve a one-year term ending in 2025. Continuing members of the Board are Jason Keirstead of Cyware, Daniel Riedel, Omar Santos of Cisco, and Jay White of Microsoft.

Francis Beland, Executive Director of OASIS, expressed his congratulations. “We welcome these distinguished leaders, both newly elected and re-elected, to the Board of Directors. Their extensive leadership experience will be instrumental as OASIS continues to develop meaningful new initiatives and broaden its opportunities. I look forward to collaborating with each of them as we pursue ambitious goals for the future.”

Dr. Pablo Breuer, President of Orthogonal Insights, brings a wealth of experience from his previous role as an executive at a Fortune 50 company and his 22-year Navy career, which included top-level positions in the U.S. Special Operations Command Donovan Group, SOFWERX, the NSA, US Cyber Command and US Naval Forces Central Command. A DoD Cyber Cup and two-time Defcon Black Badge winner, Breuer has taught at the Naval Postgraduate School, National University, California State University Monterey Bay, and Carnegie Mellon CERT/SEI. Breuer is the co-founder of the Cognitive Security Collaborative and coauthor of the DISARM (Disinformation Analysis and Response Measures) framework, the methodology used by the US and EU governments and NATO to address Misinformation and Disinformation. He is a sought-after speaker in the fields of cybersecurity and Mis- and Disinformation, sits on several Boards, and has mentored countless students and professionals.

“I’m honored to join the board of OASIS Open to help develop and promote open and inclusive standards for innovation and technology which promote our global ethics and shared interest. The speed of technology and innovation require standards to promote safety, fairness, and interoperability,” said Breuer. “I look forward to working on standards including countering disinformation and promoting artificial intelligence safety and resiliency. OASIS Open has been at the forefront of technology standards for more than three decades, and I’m proud to be able to contribute to their mission.”

Daniel Rohrer serves as VP of Software Product Security, Architecture and Research at NVIDIA, where, throughout his 24-year tenure, he has led efforts to enhance AI security, deliver GPU confidential computing, and advance research efforts in secure platform design. Rohrer has taken his integrated knowledge of “everything NVIDIA” to hone security practices, explore novel cybersecurity solutions, and help deliver some of the world’s most advanced and trustworthy computing platforms. He has been at the forefront of AI Security, contributing to the development of safe and trustworthy ecosystems through training, open source tools, and other initiatives aimed at scaling communities. An advocate for democratized access to computing resources, Rohrer strives to ensure equality and accessibility for all communities. He serves on the NVIDIA AI Ethics Review Committee and has held a Board position with the nonprofit NVIDIA Foundation, driving significant product security innovations.

“As AI adoption continues to grow across every industry, building systems that advance security and trust are paramount to success,” said Rohrer, VP of Software Product Security, Architecture and Research at NVIDIA. “I am honored to join the OASIS Board and contribute to the community so invested in the transparent development of open-source software and standards.”

Daniella Taveau, President of Bold Text Strategies, is an internationally recognized expert in developing global business and regulatory strategies. She has extensive experience working with senior political officials and advising multinational corporations worldwide. Taveau’s expertise spans international trade, finance, agriculture, food security and safety, chemicals, pesticides, new technologies, cosmetics and personal care, intergovernmental organizations, information technology, and combating mis- and disinformation. Prior to starting her own firm, Taveau was an International Trade Negotiator with the U.S. Environmental Protection Agency (EPA), where she represented the US at the World Trade Organization (WTO); all U.S. Free Trade Agreements including the TransPacific Partnership, the Transatlantic Trade and Investment Partnership (U.S./E.U. FTA), and the U.S. Korea Free Trade Agreement; the U.N. Food and Agriculture Organization (U.N. FAO); and the Asia Pacific Economic Cooperation (APEC). She also served as an International Policy Analyst with the U.S. Food and Drug Administration (FDA) and held an executive role at a global cosmetics company for a decade.

“I am honored to join the board of OASIS Open, considered one of the leading global forces in open-source standards. Accessible standards are necessary to ensure interoperability, innovation, and inclusivity,” said Taveau. “As we confront the growing challenges of misinformation and disinformation, I am committed to working with OASIS Members to promote accuracy, transparency, and trust in our digital world.”

OASIS expressed sincere gratitude to outgoing Board member Duncan Sparrell of sFractal Consulting for his invaluable service, dedication, and significant contributions during his tenure as a director. To learn more about the OASIS Board of Directors, please visit our website.

The post OASIS Membership Elects New Leaders to its Board of Directors appeared first on OASIS Open.


Identity At The Center - Podcast

In our newest episode of the Identity at the Center podcast,

In our newest episode of the Identity at the Center podcast, we discuss the concept of identity bubbles with the brilliant Justin Richer, founder of Bespoke Engineering. Join us as we explore what they are and how they can be designed to revolutionize identity management in disconnected environments.

You can watch the episode at https://youtu.be/E-GtiJ2HvnA?si=pWrmQgYXO9kk4jTO

Visit our website for more: idacpodcast.com

#iam #podcast #idac

Thursday, 20. June 2024

Me2B Alliance

We Need to Talk About Product Labels

This week, US Surgeon General Dr. Vivek Murthy has called for warning labels for social media platforms. If you know anything about Internet Safety Labs (ISL) you’ll know that our primary mission is the development of free and accurate safety labels for technology. So naturally, we heartily agree with Dr. Murthy that technology needs labels—but perhaps not warning labels and definitely not just for social media platforms. 

Various experts have written thoughtful responses [1] to this week’s call for warning labels, and their concerns underscore the fact that a warning label may be inappropriate.  

But of course we need safety labels on technology. 

History of Product Labels 

There are several types of product labels: ingredient labels (food), test result labels (automobile crash test ratings), warning labels from the surgeon general (cigarettes) and from other entities (OSHA’s Hazard Communication System for chemicals).  

Safety labels have a long-standing history in the US as a core component of product safety and product liability. The purpose of safety labels is to illuminate innate—i.e. unavoidable—risks in a product whether it is food, vehicles, cleaning solvents, toys for children, or the technology that we use with increasing reliance for all facets of living our lives.  

Safety labels almost always lag commercial product introduction, and in at least a few cases, product safety can lag by decades. For instance, for cars, product safety awareness and measures (like seatbelts) emerged 50-plus years after their mass availability. Consumer computing has been around for about 40 years now, and it will likely be another 10 years before we see product safety in full swing for software-driven technologies.  

According to InComplianceMag.com, US and Canadian tort-based law makes manufacturers’ product safety obligations clear (emphasis is mine): 

“Manufacturers have an obligation to provide safe products and to warn people about any hazards related to the product. Those requirements have risen up out of the original product safety/liability cases, some of which happened in the same timeframe as the Chicago World’s Fair, the middle to late 19th century, with many more to follow.

 

The assumption in U.S. liability law, and also typically if a case is brought in Canada, is that the manufacturer of the product is guilty and has to prove that they did everything necessary to provide a safe product. That includes warnings, user instructions, and other elements. Today, that continues to be the basic concept in product liability, that the burden lies on the manufacturer to prove that they did everything possible to make their product safe.” 

 

https://incompliancemag.com/product-safety-and-liability-a-historical-overview/

If tech were food we would have never stood for the absence of product information for as long as we have. Never. We use tech with little to no visibility or awareness of what it’s actually doing. That simply must change.  

We need a science of product safety for software and software-driven technology. And that’s exactly what we’ve been building for five years at ISL. The current attitude of placing the onus on consumers to somehow gird themselves against invisible risks that not even vendors fully understand is absurd. Of course we need labels.  

And the good news is we’ve got them started on over 1,300 EdTech-related apps. Here’s an example: https://appmicroscope.org/app/1614/ (the accompanying image in the original post shows just the label header and the safety facts summary).   

Labels for Technology 

What type of label is appropriate for technology? A warning label is appropriate when the science is irrefutable. Are we there with the physical and mental health risks due to the use of technology? Maybe. Depends on who you ask. But maybe a label more like chemical warning labels is appropriate. Or perhaps just a test results label.  

In our work at Internet Safety Labs, our intention since day one was to expose invisible or difficult to recognize facts about risky behaviors of technology. As can be seen from the design of our app safety labels, we chose to emulate food nutrition labels that report measured findings. This approach of reporting measured findings works very well for this early stage of the science of product safety for technology.  

For instance, in our safety labels, you can see the category averages for most of the measures in the label. Why did we do that? Because there is no concrete threshold that distinguishes safe from unsafe ranges. There’s no industry standard that says, “more than ten SDKs is bad” for example. Moreover, technology norms vary by industry, such that personal information collection in fintech and medical apps is quite different than personal information collection in retail (at least one hopes). Thus, the category averages displayed in our labels don’t necessarily mean “safe”, they just provide context as we continue to measure and quantify technology behavior. An example of the shortcomings of this approach is when, for instance, the category average number of data brokers is greater than zero for apps typically used by children. (We advocate for no data brokers in technology used by children.) But we need to start with understanding the norms. We can’t change what we can’t see.  

The Devil is in the Details 

The call for a congressional mandate for something (not necessarily a warning label) is a step in the right direction. Why? Because it treats software as a product and tacitly places product safety requirements on it. This is an advancement in our eyes.  

Moreover, product safety is almost always the domain of government (or insurance). In the absence of a government mandate for product safety for technology, we see fragmented efforts with the FTC boldly championing privacy risks in technology, and the FCC advocating for a different type of label. So indeed, it’s encouraging that we’re starting to talk about technology in product safety terms.  

But the devil is in the details of any labeling program. In the words of Shoshana Zuboff, “who decides and who decides who decides?” As in, who decides what goes on the labels? And who oversees the integrity of the labels? The US government is a customer of data obtained by surveillance capitalism [2]. When it comes to technology, can the government be trusted to keep people safe? (When it comes to food, can the government be trusted to keep people safe? When you dig into it, the track record is spotty.)  

Product safety exists in natural opposition to the industry status quo, and any kind of regulation is already facing, and will continue to face, strong opposition [3]. In the early 1900s, when chemist Dr. Harvey W. Wiley began a crusade for the labeling of ingredients and the identification of toxic elements in food, industries that relied on the opacity of ingredients (snake oil salesmen) or that simply didn’t want to incur the cost of change (whiskey distillers) opposed such a mandate.  

“Strenuous opposition to Wiley’s campaign for a federal food and drug law came from whiskey distillers and the patent medicine firms, who were then the largest advertisers in the country. Many of these men thought they would be put out of business by federal regulation. In any case, it was argued, the federal government had no business policing what people ate, drank, or used for medicine. On the other side were strong agricultural organizations, many food packers, state food and drug officials, and the health professions. But the tide was turned, according to historians and Dr. Wiley himself, when the activist club women of the country rallied to the pure food cause.”

 

https://www.fda.gov/media/116890/download  

Product safety challenges the status quo and creates necessary growing pains for industry. But industry always survives. And more often than not, new industries emerge, such as the ongoing development of safety features for vehicles.  

Let’s return to the challenge of deciding what goes in the labels. We at ISL know quite a lot about what it takes to develop safety labels in a space where the naming and measurement of risk isn’t fully baked (or worse, non-existent). Determining what goes into a previously uncharted, unmeasured safety label is extraordinarily challenging. It’s even more challenging if the measurement tools don’t exist. But our situation is even worse than that: we don’t even have agreement on what the risky behaviors in technology are. AND, we are talking about behaviors here—which is not language we typically associate with products. Products don’t typically behave. From our several years in development, these are the highly iterative steps that must occur to reach consensus on labels for technology: 

1. Consensus on naming the hazards/harms in technology. [4]
2. Consensus on assessing and quantifying the risks.
3. Identify the behaviors that embody the risks.
4. Figure out a way to measure the behaviors that embody the risks.
5. Assess the measurements. [5]
6. Consensus on presentation of the measurements/information.
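As a sketch of where those steps lead, a machine-readable label record might report each measured behavior next to its category average, exactly as the labels described above do. The schema here is hypothetical, echoing the measures (SDK counts, data brokers) the article names:

```python
import json

# Hypothetical machine-readable safety label record: measured findings are
# reported alongside category averages for context, not as pass/fail verdicts.
label = {
    "app": "Example EdTech App",
    "category": "EdTech",
    "measures": {
        "sdk_count":         {"observed": 14, "category_average": 11.2},
        "data_broker_count": {"observed": 2,  "category_average": 1.3},
        "ad_network_count":  {"observed": 5,  "category_average": 4.8},
    },
}

for name, m in label["measures"].items():
    flag = "above category average" if m["observed"] > m["category_average"] else "at/below average"
    print(f'{name}: {m["observed"]} ({flag})')

print(json.dumps(label, indent=2))
```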

As far as presentation of the data, in our case, we decided to aggregate the data into clusters based on riskiness, and we also ultimately decided to provide a single app score. This was done with some reluctance, and it will no doubt be a much-evolving scoring rubric for the next few years.  

For now, we believe the best thing the labels can do is objectively report the invisible (or poorly understood) behaviors of the products until such time as definitive harm thresholds can be derived.  

There’s a final vital detail regarding the establishment of any labels, and that’s having what I would characterize as exceptional diversity of participants in establishing safety standards. This isn’t lip service. A few years ago, I started to see more clearly how what was risky for me was very different from what was risky for people who are different from me, such as a person of color, a person with a disability, or an incarcerated person. I woke up one night from a deep sleep with the awareness that any attempt at standardization or consensus is doomed if it doesn’t have full diversity involved [6]. Why this is so is a long and complicated matter. On the one hand, everything ever done should endeavor to have exceptional inclusion of a massively diverse set of participants.  

But it also has to do with the fact that software and software-driven tech is “alive” and interactive in a way that other products in our lives aren’t. We have a special duty when it comes to the product safety of software-animated products. We may even need to reconsider what a “product” is. We have seen evidence of the hazards of animated technology not built with adequate understanding of the diversity of users, whether in the embodiment of human bias in automated decision making or in hand dryers that don’t activate for people of color. The point is that technology acts on and with us in a different (and constantly changeable) way than other products do. So labeling is both harder and matters more than ever.  

Conclusion 

Overall, I remain optimistic that the lens is happily starting to focus on product safety, implicit though it may be. People will be thinking more about labels for technology. And they will see that ISL is already providing labels covering privacy risks. We can call out the presence of infinite scroll, like buttons, and other user interface patterns widely recognized as addictive in labels today.  

As I mentioned above, confusion stems from Dr. Murthy’s call for a “warning label” instead of a safety or ingredients label. Technology is cigarettes [7]. We use the metaphor all the time. Technology today is cigarettes in the 1940s/1950s, when just about everybody chain smoked and the harms were largely anecdotal and pooh-poohed. It took decades to assemble causal evidence. But tech is also much more complicated than cigarettes, and a warning label is premature. That, however, is no argument against giving people accurate information on tech’s risky behaviors. As it is right now, we don’t even have an ingredient label for technology. We are flying (tech-ing?) blind. 

Of course we need labels. Industry would do well to proactively embrace label enablers like software bills of material, consent receipts, and machine-readable record of processing activities (ROPAs). Because there can be no doubt that labels are imminent.  

Earlier, I said that we’ve “started”. I say that because our labels only include privacy risks at present. Our labels are deliberately modular and we’ve scoped additional sections: 

Risky UI Patterns: deliberately addictive UI patterns of the sort Dr. Murthy is calling for exposing. Our Safe Software Specification for Websites and Mobile Apps already describes measurement of these kinds of risks.
Automated Decision-Making Risks
Security [client side only] Risks
Differences between observed tech behavior and privacy policy and/or terms of service.

All of these are on our roadmap. We know exactly how to add these sections to the label; it’s strictly a resource and funding issue. If they sound good to you, please consider supporting our mission.

Because of course we need labels. 

 

Footnotes:
[1] https://www.wsj.com/us-news/u-s-surgeon-general-calls-for-warning-labels-on-social-media-platforms-473db8a8?st=gmnjmhotka7febm&reflink=desktopwebshare_permalink and https://technosapiens.substack.com/p/should-social-media-have-a-warning
[2] https://arstechnica.com/tech-policy/2024/01/nsa-finally-admits-to-spying-on-americans-by-purchasing-sensitive-data/, https://www.nbcnews.com/tech/security/us-government-buys-data-americans-little-oversight-report-finds-rcna89035, and https://www.vice.com/en/article/jgqm5x/us-military-location-data-xmode-locate-x
[3] https://www.politico.com/news/2023/08/16/tech-lobbyists-state-privacy-laws-00111363
[4] We have ongoing work with our Digital Harms Dictionary.
[5] They will be wrong, and you will have to find a different measure.
[6] We welcome everyone, whether you are technical or not, to participate in our open Software Safety Standards Panel where we define the content of the safety labels, and name hazards and harms.
[7] Tech may actually be worse than cigarettes because it has the capability of inflicting every kind of harm people can experience, either directly or indirectly, in a multitude of increasingly creative ways: financial, reputational, social, emotional/psychological, and even physical.

The post We Need to Talk About Product Labels appeared first on Internet Safety Labs.


Hyperledger Foundation

Energy & Mines Digital Trust: The Future of Global Supply Chains

After two years of piloting, Energy & Mines Digital Trust (EMDT) has successfully launched two digital credentials, reshaping how mining operators in British Columbia (B.C.) share verified data. As the project evolves, EMDT is working with the United Nations to explore how digital trust technology can improve cross-border trade and supply chain traceability, while further lowering the barriers to entry for companies worldwide. 


GS1

Maintenance release 2.10


GS1 GDM SMG voted to implement the 2.10 standard into production in May 2024.

Key Milestones:

See GS1 GDM Release Schedule

As content for this release is developed, it will be posted to this webpage, followed by an announcement to the community to ensure visibility.
GDSN Data Pools should contact the GS1 GDSN Data Pool Helpdesk to understand the plan for the update. Trading Partners should work with their Data Pools (if using GDSN) and/or Member Organisations on understanding the release and any impacts to business processes.

GDM 2.10 contains updated reference material aligned with ADB 2.4 and GDSN 3.1.27.

 

Updated For Maintenance Release 2.10

GDM Standard 2.10 (May 2024)

Local Layers For Maintenance Release 2.10

China - GSMP RATIFIED (April 2022)

France - GSMP RATIFIED (November 2023)

Germany - GSMP RATIFIED (November 2023)

Poland - GSMP RATIFIED (November 2023)

Romania - GSMP RATIFIED (December 2021)

USA - GSMP RATIFIED (February 2023)

Finland - GSMP RATIFIED (November 2023)

Netherlands - GSMP RATIFIED (May 2024)

Italy - GSMP RATIFIED (May 2024)

 

Release Guidance

GDM Market Stages Guideline (June 2023)

GDM Attribute Implementation Guideline (February 2024)

GPC Bricks to GDM (Sub-) Category Mapping for GDM 2.10 and 2.11 (April 2024)

Attribute Definitions for Business (May 2024)

GDM (Sub-) Categories (October 2021)

GDM Regions and Countries (17 December 2021)

GDSN Release 3.1.27 (May 2024)

Tools

GDM Navigator on the Web 

GS1 GDM Attribute Analysis Tool (May 2024)

GDM Local Layer Submission Template (May 2024)

Training

E-Learning Course

Future Release Documentation

GPC Bricks to GDM (Sub-) Category Mapping for GDM 2.10 and 2.11 (April 2024)

Any questions

We can help you get started using the GS1 standards

Contact your local office


Oasis Open Projects

OASIS Launches Global Initiative to Standardize Supply Chain Information Models

Checkmarx, Cisco, Cyware, Google, IBM, Legit Security, Microsoft, Root, SAP, US NSA, CISA, and Others Join Forces to Build a Framework to Complement SBOM Data Formats, CSAF, CycloneDX, OpenVEX, and SPDX

Boston, MA – 20 June 2024 – With escalating cybersecurity threats exploiting software supply chain vulnerabilities, there’s an urgent need for better understanding and proactive measures to identify and prevent future risks. Members of OASIS Open, the global open source and standards organization, have formed the Open Supply Chain Information Modeling (OSIM) Technical Committee (TC) to standardize and promote information models crucial to supply chain security. 

The aim of OSIM is to build a unifying framework that sits on top of existing SBOM data models, such as CSAF, CycloneDX, OpenVEX, and SPDX. OSIM is not intended to replace or endorse any one of these models. Instead, as an information model, OSIM will bring clarity to software supply chain partners, mitigate vulnerabilities and disruptions, reduce security risks, and make it easier for companies to plan for upgrades and contingencies.

“CISA is excited to be a part of this technical effort to bring greater visibility to the software supply chain,” said Allan Friedman, Senior Technical Advisor at CISA. “We have many of the basic building blocks for software transparency and security, including SBOM, VEX, and CSAF. This work by OASIS will facilitate automation for easier and cheaper implementation and tooling, and help provide a unifying supply chain framework and raise the level of collaboration across industries.”

“OSIM represents an important effort to address the need for greater structure and comprehensibility of software supply chains,” said Isaac Hepworth, Google, and OSIM co-chair. “By establishing standardized information models we can enhance transparency, interoperability, and resilience in end-to-end operations — ultimately aiding cyber risk management and protecting critical infrastructure.”

Recognizing the crucial role of Software Bills of Materials (SBOMs) in fortifying software supply chain security, the OSIM TC aims to create, for example, a standardized SBOM information model that would enhance understanding and interoperability across diverse SBOM data formats (e.g., SPDX and CycloneDX). Competing data models, like SPDX, CycloneDX, CSAF, and OpenVEX, show the need for information models that bring coherence across diverse specifications.
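To illustrate what an information model adds on top of data formats: the same component is spelled differently in CycloneDX and SPDX JSON, and a shared model names the concepts once so tooling can translate between them. A rough Python sketch (field names abbreviated from the two formats; the unified record is hypothetical, not the OSIM deliverable):

```python
# One component as it might appear in two SBOM data formats, then normalized
# into a single (hypothetical) information-model record.

cyclonedx_component = {
    "type": "library",
    "name": "openssl",
    "version": "3.0.13",
    "purl": "pkg:generic/openssl@3.0.13",
}

spdx_package = {
    "name": "openssl",
    "versionInfo": "3.0.13",
    "externalRefs": [
        {"referenceType": "purl", "referenceLocator": "pkg:generic/openssl@3.0.13"}
    ],
}

def normalize(component: dict) -> dict:
    """Map either format onto one set of concepts: name, version, identifier."""
    purl = component.get("purl") or next(
        (r["referenceLocator"] for r in component.get("externalRefs", [])
         if r.get("referenceType") == "purl"),
        None,
    )
    return {
        "component_name": component["name"],
        "component_version": component.get("version") or component.get("versionInfo"),
        "identifier": purl,
    }

# The two format-specific records collapse to the same model-level record.
assert normalize(cyclonedx_component) == normalize(spdx_package)
```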

“OSIM’s approach not only drives a universal taxonomy of thought, it also brings clarity and ease to how we implement standards and frameworks to support multiple industry software supply chain security needs. OSIM facilitates the identification of similarities and differences across specifications, enhancing interoperability and simplifying processes. The current cybersecurity landscape can no longer be defended in a silo,” said Jay White, Microsoft, and OSIM co-chair.

The OSIM TC welcomes a diverse range of contributors, including software and hardware vendors, open-source maintainers, technology consultants, business stakeholders, government organizations, and regulatory bodies. Participation is open to all through membership in OASIS, with interested parties encouraged to join and contribute to shaping the future of supply chain information modeling.

Support for OSIM

Checkmarx
“Checkmarx is proud to be working with OASIS and be part of the OSIM Technical Committee. A major part of Checkmarx’ mission to secure the applications driving our world involves sharing our time, experience, and threat intelligence to help make the software supply chain ecosystem safer. As one of the biggest challenges remains education and closing the knowledge gap, we believe standardization is a crucial step and are committed to assisting in laying the foundations.”
– Erez Yalon, VP of Security Research, Checkmarx

Root
“The OASIS OSIM is a vital project for enhancing security and trust in the software supply chain. As a part of the OSIM Technical Committee, Root is committed to advancing supply chain security and transparency, aligning perfectly with this initiative’s goals. By collaborating on data schemas, data modeling, and security standards, we aim to improve vulnerability management and software security, ensuring threats are identified and mitigated promptly. This enhances software integrity, benefiting our customers and strengthening trust in the broader digital ecosystem.”
– Ian Riopel, CEO, Root.io

SAP SE
“Having a unified information model for the representation of objects in the supply chain domain would enable efficient integration models and interoperability. Especially with the wave of generative AI, such aligned models can bring benefits in development efficiency, reduced maintenance, and operations for upcoming innovations in the domain.”
– Gururaj Raman, Chief Development Expert, SAP SE

Additional Information
OSIM Project Charter

Disclaimer: CISA does not endorse any commercial entity, product, company, or service, including any entities, products, or services linked or referenced within this press release. Any reference to specific commercial entities, products, processes, or services by service mark, trademark, manufacturer, or otherwise, does not constitute or imply endorsement, recommendation, or favoring by CISA.

The post OASIS Launches Global Initiative to Standardize Supply Chain Information Models appeared first on OASIS Open.

Wednesday, 19. June 2024

Origin Trail

Trust Thy AI: Artificial Intelligence Base-d with OriginTrail

With tens of billions invested in AI last year and leading players such as OpenAI looking for trillions more, the tech industry is racing to grow large generative AI models. The goal is to steadily demonstrate better performance and, in doing so, close the gap between what humans can do and what can be accomplished with AI.

There is, however, another gap that has become strikingly apparent: the AI trust gap. As challenges such as AI hallucinations, bias, and intellectual property slurping continually cause damage, we look into how the base of the current Web could be effectively transformed to support the Verifiable Internet for AI.

The announced Apple and OpenAI integration signals the trust gap is widening, with Apple users’ data becoming the next frontier for training ChatGPT and questionable transparency on how it is used. This data is so valuable that it reportedly makes up for charges Apple would pay for using costly ChatGPT AI models. The Verifiable Internet for AI shifts this paradigm, making such data transactions transparent on chain with ownership of data taken back by users, who ultimately get to monetize it.

Decentralized AI: Intersection of Crypto and AI

Having employed the fundamentals of crypto, AI, and knowledge graphs successfully within a plethora of sectors, where trust, transparency, and accuracy are of paramount importance, OriginTrail now integrates Base blockchain with OriginTrail Decentralized Knowledge Graph (DKG), to help drive trust and transparency with neuro-symbolic AI. Instilling information provenance, ownership, and graph structure through blockchains and knowledge graphs together can effectively address the aforementioned problems of AI, as detailed in the most recent White Paper 3.0.

Your Body of Knowledge, Your Choice of AI

The opportunity of graph algorithms as a foundation for reputation in the age of AI was also highlighted by Brian Armstrong, CEO of Coinbase, in a recent podcast:

“Another piece of a puzzle that I feel could be missing, is something around reputation that’s on chain. You can imagine a version of this that’s like using the graph structure of the chains. To sort of say, okay if I trust this node, and they sent money to this node, that sort of implies some amount of trust. Kind of like a Google Page Rank had an algorithm, something like that could be built on chain.” — Brian Armstrong, CEO of Coinbase

The recently introduced OriginTrail Paranets (user-controlled on-chain knowledge graphs) give users total control over their data, connecting it into the DKG decentralized physical infrastructure (DePIN) while keeping it safely stored on their devices. Users are then able to choose from a growing selection of open-source AI systems integrated with OriginTrail via ChatDKG.ai, a launchpad for user-controlled AI.

Knowledge graphs with paranets enable transparent on-chain reputation, relevance scoring with PageRank, recommendation engines, graph neural networks, and other AI reasoning applications.
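The PageRank-style on-chain reputation Armstrong describes is straightforward to prototype over a graph of trust edges. A minimal sketch using the networkx library (the addresses and weights are invented):

```python
import networkx as nx

# Directed trust graph: an edge A -> B means "A sent funds to / vouches for B".
trust = nx.DiGraph()
trust.add_weighted_edges_from([
    ("0xAlice", "0xBob",   3.0),  # weights could be transaction counts or volume
    ("0xAlice", "0xCarol", 1.0),
    ("0xBob",   "0xCarol", 2.0),
    ("0xCarol", "0xDave",  1.0),
])

# PageRank treats incoming trust edges the way the original algorithm treated inbound links.
scores = nx.pagerank(trust, alpha=0.85, weight="weight")
for node, score in sorted(scores.items(), key=lambda kv: -kv[1]):
    print(f"{node}: {score:.3f}")
```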

The first such knowledge graph to launch on Base is the DeSci paranet for autonomous scientific research by ID Theory, crowdsourcing knowledge assets utilizing on-chain reputation via the OriginTrail DKG.

“#DeSci has great potential. Those who have a working product will have the power to forever improve science and the scientific process.” — Brian Armstrong, CEO of Coinbase

One of the first dapps deployed on the DeSci paranet on Base will be the DeSci AI agent, which will include a knowledge mining interface through which scientific knowledge will be minted on chain, with publishers receiving token incentives.

“We’re creating a user-friendly hub to coordinate scientific knowledge creation onchain — a co-owned substrate to crowdsource AI and supercharge research and discovery as we know it through autonomous science. The first iteration will focus on neuroscience as it’s very close to our hearts, but the future is boundless. Who knows, crypto might actually cure cancer.” — ID Theory

DeSci AI Agent in action built on OriginTrail

AI and crypto, converging in the OriginTrail DKG, can tackle some of the largest challenges while providing users with an inclusive, unbiased, and verifiable way of making mission-critical decisions. As we bring this technology to more data-intensive sectors such as science, the trust layer — the blockchain underpinning the neuro-symbolic AI approach made possible by the DKG — needs to fulfill both scalability and user-experience requirements.

This is where Base can help Trust Thy AI — in a scalable, inclusive, and user-friendly way.

Make sure to subscribe and follow the next steps as we make AI Base-d.

Trust Thy AI: Artificial Intelligence Base-d with OriginTrail was originally published in OriginTrail on Medium, where people are continuing the conversation by highlighting and responding to this story.


We Are Open co-op

Building Credibility into Digital Credentials

Validity + Reliability + Viability = Credibility Image CC BY-ND Visual Thinkery for WAO

Digital credentials are playing an increasingly important role in recognising a broad spectrum of skills in the workplace. These range from the tangible to the abstract and innovative, but in all cases, we want credentials that are credible — that is to say, something worth earning.

In this post, we explore two distinct scenarios: a culinary school and a large organisation looking to recognise and encourage the creativity of its employees. In each, the core elements of validity, reliability, and viability are discussed in terms of developing a credible digital credential system.

Validity

Validity ensures that an assessment accurately measures what it claims to measure. In the context of digital credentials, this means that the assessment process leading to the badge should accurately evaluate the abilities, knowledge, or competencies it is intended to certify. This alignment between the assessment criteria and the attributes it assesses is important. Without strong validity, the integrity of the credential could be questioned, undermining its acceptance by educational institutions and employers.

Culinary School: Issuing badges for skills like knife handling, pastry making, and creative presentation means assessing specific competencies. Each badge must represent true mastery of these skills. Assessments might involve practical tests where students must produce a dish that adheres to professional standards. These practical assessments must be designed to measure relevant culinary skills accurately, ensuring the badge directly reflects the student’s capability.

Creative Organisation: For certifying creativity, the assessments might require participants to propose innovative solutions to real business challenges. These should be evaluated for their originality, practicality, and impact, ensuring the badge reflects genuine creative thinking and problem-solving ability.

Reliability

Reliability focuses on the consistency and dependability of the assessment results. A reliable digital credential system ensures that all recipients are evaluated using the same standards, and that these standards are consistently applied. This consistency builds trust in the credentialing system, making the credentials more likely to be recognised across different sectors.

Culinary School: It is crucial that all culinary tests are graded on a consistent rubric. If two students deliver dishes of similar quality, they should both earn the badge. This uniformity builds trust in the credentialing system by affirming its fairness and rigour.

Creative Organisation: When evaluating creative projects, it’s essential that all judges use the same criteria to assess the submissions. This ensures that every employee who meets the standard of creativity receives recognition, maintaining the reliability of the credential across the organisation.

Viability

Viability deals with the practical aspects of sustaining an assessment system. This includes considerations such as the costs involved, the resources required, and the technology necessary to issue and maintain the credentials. A viable digital credential system is scalable and sustainable, capable of adapting to growing demands and evolving educational environments.

Culinary School: The school needs a system that can efficiently handle various forms of assessment, from practical cooking exams to theoretical tests. This includes logistical aspects such as scheduling, recording results, and managing digital badges. The technology used must support these activities without prohibitive costs.

Creative Organisation: For badges relating to creativity, the system should support diverse submission formats and robust communication for feedback. It must also be scalable and adaptable to accommodate a growing number of participants as the organisation evolves.

Credibility as convergence

In both examples, credibility arises from the effective integration of validity, reliability, and viability. A credible digital credential system not only accurately and consistently evaluates and represents diverse skills — from cooking to creativity — but also operates efficiently and sustainably within its intended scope. This credibility enhances the value of a digital credential, making it a recognised and sought-after mark of achievement.
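
As a thought experiment, this convergence can even be sketched in code. The types and names below are entirely hypothetical (they are not drawn from any real badging platform or specification): the explicit criteria stand in for validity, the single shared pass/fail check for reliability, and the smallness of the whole record for viability.

```typescript
// Hypothetical model of a credible credential: explicit criteria (validity),
// one shared check applied to every candidate (reliability), and a
// lightweight record that is cheap to issue and store (viability).
interface Criterion {
  skill: string;        // e.g. "knife handling" or "creative problem-solving"
  passingScore: number; // threshold on the shared rubric, 0-100
}

interface Badge {
  name: string;
  criteria: Criterion[];
}

type Scores = Record<string, number>; // skill -> rubric score for one candidate

// Every assessor calls the same function, so two candidates with
// the same scores always receive the same outcome.
function earnsBadge(badge: Badge, scores: Scores): boolean {
  return badge.criteria.every((c) => (scores[c.skill] ?? 0) >= c.passingScore);
}

const pastryBadge: Badge = {
  name: "Pastry Fundamentals",
  criteria: [
    { skill: "lamination", passingScore: 70 },
    { skill: "presentation", passingScore: 60 },
  ],
};

console.log(earnsBadge(pastryBadge, { lamination: 82, presentation: 75 })); // true
```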

Next steps

Understanding and applying the principles of validity, reliability, and viability can enhance the effectiveness and perception of digital credentialing systems. Whether in a culinary school or a large corporation, these principles ensure that credentials issued represent genuine skills and achievements, fostering trust and respect in the digital badges awarded. Through these dual examples, this post demonstrates how these principles can be applied across various disciplines, encouraging readers to think broadly about the potential of digital credentials in their own fields.

Need help thinking about digital credentials in your organisation? Get in touch!

Related posts:
Badge Project Blueprint
Making Credentials Work for Everyone
A Compendium of Credentialing

Special mention and thanks to Paddy Craven of City & Guilds, who initially pointed us in the right direction around this!

Building Credibility into Digital Credentials was originally published in We Are Open Co-op on Medium, where people are continuing the conversation by highlighting and responding to this story.


Blockchain Commons

Musings of a Trust Architect: Minimum Viable Architecture (MVA)


ABSTRACT: Minimum Viable Architecture (MVA) is an alternative to the Minimum Viable Product (MVP) approach, emphasizing the importance of a robust, scalable, and expandable architecture. The MVA methodology mitigates risks associated with reputation, competitiveness, and architectural deficiencies, and fosters collaboration among competitors. Real-world examples, such as SSL/TLS and the Gordian system, illustrate the successful implementation of MVA in software development.

A business methodology focused on producing a Minimum Viable Product blossomed in the 21st century. Unfortunately, it can set businesses up for future failure because it doesn’t properly define the larger architecture that is needed to evolve a product past its earliest, minimal state.

The Old Methodology: Minimum Viable Product

A Minimum Viable Product (MVP) is a business methodology that advocates creating the simplest possible version of a product as a first release, to see if the market responds positively, or else to understand why it doesn’t [1]. If an MVP is successful, possibly through iterations of the initial work, the product can then be grown and ultimately find large-scale success in the market.

Twitter has long been used as an example of an MVP that did great, with Dropbox and Facebook being other examples of MVPs [2] (to various extents). By the criteria of these companies, MVP would seem to be a win-win methodology.

However, they’re not the full story.

MVP Biases

Unfortunately, current literature about Minimum Viable Products suffers from Survivorship Bias. We hear about the success of companies that used MVPs, but we don’t know that their doing so actually led to success. In fact, the successes that we see might just be a false signal.

How many hundreds or even thousands of companies pursuing MVPs failed for each Twitter or Facebook that succeeded? How many companies found that they couldn’t scale their MVP, realized that they couldn’t take commercial advantage of an otherwise successful MVP, or simply were beaten by competitors with even more viable products?

We can’t measure the success of the MVP methodology by the anecdotal success of a few individual companies.

Survivorship bias image by Martin Grandjean (vector), McGeddon (picture), Cameron Moll (concept). Released under CC BY-SA 4.0.

MVP Dangers

The MVP system also has other dangers.

Some of these are related to company brand. Though a new company doesn’t have a reputation to damage, a series of unsuccessful MVPs could nonetheless curtail its future opportunities. Meanwhile, a more mature company might find its existing reputation blemished by a poor MVP. This is especially true today, as companies are increasingly saying that MVPs can be poor-quality releases [3]. That didn’t work out well for Cyberpunk 2077, one of the highest-profile and most controversial computer game releases of recent years, even though (like many modern-day computer games) it wasn’t quite released as an MVP, but it wasn’t in a fully complete form either [4].

There are also competitive dangers. Within a developmental niche, MVPs only work if everyone pursues them; otherwise, an MVP built on solid ideas could easily be out-competed by a firm that produced a slightly more mature prototype. Similarly, a company with more resources might be able to scoop up the ideas in an MVP and replicate them to its own advantage [5].

However, the biggest dangers of MVPs are probably architectural. By defining an MVP, a company can easily ignore the larger architectural issues that would once have been considered before starting work on any serious release. This can cause problems with scaling, with missing features that can’t easily be added, and with locked-in decisions that become part of the final product.

Twitter (“X”), for example, didn’t finalize its network architecture design until 2010 [6], four years after its advent. It would have been easy for a better-architected social-media system to get in there first; the fact that no one did is one of the pieces of luck that led to Twitter’s ultimate success. In fact, one of the developers at Twitter has noted this, saying: “In the end, Twitter barely made it, and product progress was slow for years due to post facto infrastructure investment.” [7]

The biases and dangers implicit in MVPs suggest the need to at least experiment with other methodologies for product releases. The huge problems implicit in the potential lack of architecture in an MVP also suggest what that alternative methodology should be: a Minimum Viable Architecture.

A New Methodology: Minimum Viable Architecture

Minimum Viable Architecture (MVA) [8] is a methodology that has been discussed in somewhat different forms over the last several years. It doesn’t focus on the simplest product that can be released to consumers, but instead on the simplest architecture that can support future development within a product’s technological ecosystem.

The goal of an MVA is still to create a product that doesn’t strain the resources of a company and that doesn’t create a situation where a company’s ultimate success or failure depends on that singular release. However, that simple product must be created with an understanding of a larger architecture that has enough flexibility [9] for designers to fill in gaps in that architecture in the future. It’s just that the decisions for filling in those gaps are delayed as much as possible [10]. It’s a melding of agile methodologies with architectural concerns.

Though an MVA could be created with a full understanding of future expansions that may or may not ultimately be accommodated, it’s more powerful to create an MVA that is modular and expandable — one that doesn’t depend on the architect thinking of everything, but instead future-proofs itself so that the architecture can include unthought-of elements in the future. As Jorge Lebrato says: “The architecture remains cohesive and each piece cooperates with the others, despite having had different rhythms.” The best MVA is a compromise between entirely ignoring the architecture (as is likely in an MVP) and designing it entirely (which would likely result in time cost and waste) [11].

MVA Examples

The following examples contain some real-world usages of MVA instead of MVP.

SSL/TLS

When I co-authored the TLS spec in the ’90s, I did my best to future-proof it by simultaneously constraining the design and giving it enough flexibility to be expanded in the future. This is an example of a Minimum Viable Architecture whose usefulness has proven itself: TLS is now the most deployed security system on the internet, at the heart of almost every shopping, financial, or banking transaction.

This future-proofing was thanks in part to our architecting elements that we suspected would be required in the future but couldn’t be deployed in the then-present, primarily due to CPU limitations. Perfect forward secrecy [12] is an example. Users were able to simply turn it on when its usage became viable on standard hardware platforms.

However, our more notable work in creating an MVA came from our inclusion of ciphersuites: the modular bundles of encryption and decryption rules that do the actual cryptographic work of TLS. By defining them as modular plug-ins, we supported the future innovation of TLS, even in ways that we could not envision. And there was considerable innovation: TLS 1.2 had 37 ciphersuites, though that dropped back to five with TLS 1.3 [13].
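
The ciphersuite approach generalizes into a simple pattern: define a narrow interface for the replaceable part, plus a registry that implementations written years later can join, so the surrounding protocol never has to change. The sketch below illustrates that pattern with invented names and a deliberately toy cipher; it is not code from any TLS implementation.

```typescript
// The "modular plug-in" pattern behind ciphersuites: the protocol depends
// only on this narrow interface, so new suites can be registered later
// without touching the core.
interface CipherSuite {
  id: string;
  encrypt(plaintext: Uint8Array, key: Uint8Array): Uint8Array;
  decrypt(ciphertext: Uint8Array, key: Uint8Array): Uint8Array;
}

const registry = new Map<string, CipherSuite>();

function registerSuite(suite: CipherSuite): void {
  registry.set(suite.id, suite);
}

// A toy XOR-keystream suite standing in for a real algorithm.
registerSuite({
  id: "TOY_XOR",
  encrypt: (pt, key) => pt.map((b, i) => b ^ key[i % key.length]),
  decrypt: (ct, key) => ct.map((b, i) => b ^ key[i % key.length]),
});

// Negotiation: pick the first suite that both peers support.
function negotiate(ours: string[], theirs: string[]): CipherSuite | undefined {
  const match = ours.find((id) => theirs.includes(id) && registry.has(id));
  return match ? registry.get(match) : undefined;
}
```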

The Gordian System

One of my most recent endeavors is Blockchain Commons’ Gordian system [14], a layered architecture for protecting digital assets and identity. It has seen early successes in protecting seeds with systems like SSKR [15] and CSR [16], and it focuses on the Gordian Principles of independence, resilience, privacy, and openness.

Blockchain Commons’ Mission: Advocating for the creation of open, interoperable, secure & compassionate digital infrastructure to enable people to control their own digital destiny and to maintain their human dignity online

In order to create an MVA that future-proofs the Gordian products, the Gordian architecture identifies points of potential interoperability and breaks the architecture into discrete components across those interoperable interfaces, thus allowing individual elements to be replaced. This was done both at the large-scale application level and at the small-scale programmatic level; modularity is important everywhere.

At the large-scale application level, the Gordian system achieves interoperability by the careful architecting of both discrete applications and the ways that they can interact. Airgaps are a traditional methodology for introducing security into a digital asset system [17], but the Gordian system has expanded that to include Torgaps [18], a way of making transactions between connected applications both secure and non-correlatable. This modular approach is one way to enable future-proofing, and it’s only strengthened by systems such as airgaps and torgaps that tightly constrain communications between the modules.


At the small-scale programmatic level, the Gordian system introduces a layered stack of specifications that together enable the private and secure transmission of sensitive data. This stack includes dCBOR [19], Bytewords [20], URs [21], Animated QRs [22], Envelope [23], the Gordian Transport Protocol [24], and the Gordian Sealed Transaction Protocol [25]. Together these specifications allow for the deterministic storage of binary data (dCBOR), the alphabetic representation of binary data (Bytewords), the tagged display of that representation with functionality to support multipart data (URs), the QR display of multipart data (Animated QRs), the structured & smart storage of content (Envelope), the communication of Envelopes (GTP), and the secure communication of Envelopes (GSTP). But we didn’t know what all the layers would be when we got started: this is another example of future-proofing, and one that easily arises from carefully layered specifications.
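
To see why layering matters, consider a toy pipeline in the same shape as this stack. The functions below are deliberately simplistic stand-ins with invented behavior, not the real dCBOR, Bytewords, or UR encodings; the point is only that each layer consumes the output of the layer below, so any single layer can be swapped out or new layers added on top.

```typescript
// Toy stand-ins for a layered encoding stack (NOT the real dCBOR,
// Bytewords, or UR specifications).
function toDeterministicBytes(data: Record<string, unknown>): Uint8Array {
  // Stand-in for dCBOR: JSON restricted to sorted keys yields one canonical byte form.
  const canonical = JSON.stringify(data, Object.keys(data).sort());
  return new TextEncoder().encode(canonical);
}

function toWords(bytes: Uint8Array): string {
  // Stand-in for Bytewords: one placeholder word per byte value.
  return [...bytes].map((b) => `w${b}`).join("-");
}

function toUR(type: string, words: string): string {
  // Stand-in for a UR: a typed, self-describing wrapper around the words.
  return `ur:${type}/${words}`;
}

// The layers compose into a pipeline; a QR layer could wrap the result next.
console.log(toUR("toy-seed", toWords(toDeterministicBytes({ seed: "example" }))));
```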

Similarly, when we at Blockchain Commons create our progressive use cases, we focus first on the requirements, without needing to know the technology. The technological specifics can be filled in by ourselves or by individual vendors in the future.

By abstracting and separating architectural elements—whether they be large-scale components, layered specifications, or additional requirements found in progressive use cases—the Gordian system will be able to incorporate options that we are not even considering. The ultimate goal of all of these designs is to ensure that our MVA does not limit itself, but instead remains flexible for the future.

Other MVA Examples

This type of MVA thinking is a pattern that can be widely successful and that doesn’t create some of the limitations that appear in MVP thinking. For example, when I was supporting the creation of the earliest specifications for Decentralized Identifiers (DIDs) [26], I was pleased to see us arrive at a compromise where core DID specifications were separated from specific DID methods and from signature suites. It’s an architecture that allows for a lot of future expansion.

Similarly, some of my earliest Blockchain Commons work was with a company that was adapting the Gordian architecture. Even though they weren’t planning to implement multi-sigs initially, I ensured that they didn’t make decisions that would lock them out of multi-sig usage in the future, because I was thinking of an MVA that went beyond the MVP they were focused on.

Coda: The Benefits of Coopetition

It can be quite hard for a single company to figure out an MVA. Thus, it’s great to work with other companies in your technology space.

This is particularly true if your industry supports coopetition, where business competitors can work together for a mutually beneficial good. If an industry supports interoperability, or one company adding services to another company’s products, then it’s a great candidate for coopetition—and thus MVAs are even more likely to be successful.

Blockchain Commons has been able to take advantage of this. A variety of companies have participated in the Gordian Developer Community [27], each contributing their own ideas and requirements for the Gordian architecture. In turn, they’ve then gone off and created open-source libraries that adapt the architecture [28], before beginning work on their own wallets that use the MVA we cooperatively designed. A not-for-profit organization can be a great support for MVA work of this type; that’s what Blockchain Commons does.

Conclusion

Hollowing out spaces in architectures for future development and creating flexibility for the future through modular designs are two of the most successful methods for turning an MVP into an MVA. They give you something that supports minimal investment and agile development, while simultaneously maximizing the ability to scale and expand in the future.

We don’t always know the right solutions. We can’t predict what will work best. So the best we can do is create architectures that won’t lock us in to specific decisions about the future. By doing so, especially by working in coopetition to do so, we also ensure that no one company will lock us or our users into futures that we don’t agree with.

This article was originally drafted in 2021 and then back-burnered for various reasons. It’s been great to see a real explosion in discussion of MVA in the years since by authors such as Ekaterina Novoseltseva [9], Jorge Lebrato [11], and Murat Erder and Pierre Pureur [10], much of which reflects my own thoughts on MVA. Hopefully that means we’re moving in this direction!

1. Various. Retrieved 2021. “Minimum Viable Product”. Wikipedia. https://en.wikipedia.org/wiki/Minimum_viable_product.

2. Michael Sweeney. 2015, 2020. “5 Successful Startups That Began With an MVP”. Clearcode. https://clearcode.cc/blog/successful-startups-minimum-viable-product/.

3. Allan Kelly. 2020. “The MVP is broken: It’s time to restore the minimum viable product”. TechBeacon. https://techbeacon.com/app-dev-testing/mvp-broken-its-time-restore-minimum-viable-product.

4. Frank, Allegra. 2020. “How one of the biggest games of 2020 became one of the most controversial”. Vox. https://www.vox.com/culture/22187377/cyberpunk-2077-criticism-ps4-xbox-one-bugs-glitches-refunds.

5. Andrea Contigiani. 2018. “The Downside of Applying Lean Startup Principles”. Knowledge at Wharton.

6. Mazdak Hashemi. 2017. “The Infrastructure behind Twitter: Scale”. Twitter blog.

7. Evan Weaver quoted by James Governor. 2017. “Minimum Viable Architecture – good enough is good enough in an enterprise”. James Governor’s Microchips. https://redmonk.com/jgovernor/2017/06/13/minimum-viable-architecture-good-enough-is-good-enough-in-an-enterprise/.

8. Deepak Karanth. 2016. “How to Create a Minimum Viable Architecture”. DZone. https://dzone.com/articles/minimum-viable-architecture.

9. Novoseltseva, Ekaterina. 2022. “Minimum Viable Architecture”. Apiumhub. https://apiumhub.com/tech-blog-barcelona/minimum-viable-architecture/.

10. Pureur, Pierre. 2021. “Minimum Viable Architecture: How To Continuously Evolve an Architectural Design over Time”. Continuous Architecture in Practice. https://continuousarchitecture.com/2021/12/21/minimum-viable-architecture-how-to-continuously-evolve-an-architectural-design-over-time/.

11. Lebrato, Jorge. 2022. “What is a Minimum Viable Architecture (MVA) and why an iPaaS such as Anypoint Platform can help you achieve it”. Medium: Another Integration Blog. https://medium.com/another-integration-blog/what-is-a-minimum-viable-architecture-mva-and-why-an-ipaas-such-as-anypoint-platform-can-help-you-f54c9791f6c3.

12. Various. Retrieved 2021. “Forward Secrecy”. Wikipedia. https://en.wikipedia.org/wiki/Forward_secrecy.

13. Uncredited. 2020. “Cipher Suites and TLS Protocols”. SSLs.com Blog. https://www.ssls.com/blog/cipher-suites-and-tls-protocols/.

14. Various. Retrieved 2024. “Blockchain Commons Developer pages”. Blockchain Commons website. https://developer.blockchaincommons.com/.

15. Various. Retrieved 2024. “SSKR: Sharded Secret Key Reconstruction”. Blockchain Commons website. https://developer.blockchaincommons.com/sskr/.

16. Various. Retrieved 2024. “CSR: Collaborative Seed Recovery”. Blockchain Commons website. https://developer.blockchaincommons.com/csr/.

17. Various. Retrieved 2024. “Air Gaps”. Blockchain Commons website. https://developer.blockchaincommons.com/airgap/.

18. Various. Retrieved 2024. “Torgaps”. Blockchain Commons website. https://developer.blockchaincommons.com/torgap/.

19. Various. Retrieved 2024. “Deterministic CBOR (dCBOR)”. Blockchain Commons website. https://developer.blockchaincommons.com/dcbor/.

20. Various. Retrieved 2024. “Bytewords”. Blockchain Commons website. https://developer.blockchaincommons.com/bytewords/.

21. Various. Retrieved 2024. “Uniform Resources (URs)”. Blockchain Commons website. https://developer.blockchaincommons.com/ur/.

22. Various. Retrieved 2024. “Animated QRs”. Blockchain Commons website. https://developer.blockchaincommons.com/animated-qrs/.

23. Various. Retrieved 2024. “Gordian Envelope”. Blockchain Commons website. https://developer.blockchaincommons.com/envelope/.

24. Appelcline, Shannon, Wolf McNally & Christopher Allen. 2024. “Gordian Transport Protocol / Envelope Request & Response Implementation Guide 📖”. GitHub. https://github.com/BlockchainCommons/Research/blob/master/papers/bcr-2024-004-request.md.

25. McNally, Wolf & Christopher Allen. 2023. “Gordian Sealed Transaction Protocol (GSTP)”. GitHub. https://github.com/BlockchainCommons/Research/blob/master/papers/bcr-2023-014-gstp.md.

26. Drummond Reed, Manu Sporny, Dave Longley, Christopher Allen, Ryan Grant, and Markus Sabadello. 2021. “Decentralized Identifiers (DIDs) v1.0”. W3C. https://www.w3.org/TR/did-core/.

27. Various. Retrieved 2024. “Gordian Developer Community”. GitHub. https://github.com/BlockchainCommons/Gordian-Developer-Community.

28. Various. Retrieved 2024. “Blockchain Commons Libraries”. Blockchain Commons website. https://developer.blockchaincommons.com/libraries/.

Tuesday, 18. June 2024

OpenID

Digital Identity at the G20


On June 18, 2024 the OpenID Foundation’s Executive Director, Gail Hodges, spoke about Digital Identity at the G20 during the Digital Government and Inclusion Workshop. The following are her prepared remarks.

 

Bom dia and hello. I’d first like to applaud the Brazilian Government for your impressive work on Digital Identity here in Brazil, and for your G20 leadership. Brazil is modeling the kind of multi-stakeholder approach we need to enable Digital Public Infrastructure, data sharing, and Digital Identity. The G20 is an ideal forum to accelerate work in this area globally.

Today I’d like to suggest a vision for Digital Identity. We have a UN Sustainable Development Goal 16.9 for Identity, with the principle that 8 billion people should have access to an identity credential. What should our goal be for Digital Identity? Should 8 billion people also have the right to a digital identity credential? How can we achieve social inclusion if all 8 billion do not have the option to fully participate in the digital economy? 

If we have a Digital Identity for everyone, what should it feel like? I suggest that it should be as easy for people to assert their Digital Identity credential as it is to assert their email or phone number.

It is possible to achieve these goals – the technology is not the barrier. But it will take G20 leadership, national leadership, multi-stakeholder collaboration (like our conversation today), and crucially… it will take global open standards.

The OpenID Foundation is one of the open standards bodies at the center of Digital Identity specification development. We seek to offer secure and interoperable standards that respect domestic sovereignty. Our most popular standard is OpenID Connect, which is currently used by over 3 billion people across millions of applications. The OpenID Foundation also supports 28+ countries that have selected our OpenID for Verifiable Credential specifications, and another 12 countries have selected the OpenID Foundation’s high security profile for data sharing called FAPI. In fact, Brazil is one jurisdiction where FAPI was selected, and the OIDF is proud to support Brazil’s Open Finance and Open Insurance programs by offering certification to all ecosystem participants.

Last year the OpenID Foundation and 12 other non-profits announced a white paper titled “Human-Centric Digital Identity” in conjunction with the OECD’s Recommendations on the Governance of Digital Identity. In that paper we recognized that countries have a mix of operating models that align to their technical implementations. Some countries lean toward centralized models, others toward “decentralized” models. Some countries are government-led and others are private sector-led. From our vantage point, all of these are legitimate models. There is no single way to develop and deliver a Digital Identity program. However, if you want to achieve social inclusion and a human-centric approach, you need to stare hard at your domestic model to ensure no one is left behind.

So what about your country? For those of you in countries at the start of your Digital Identity journey, I suggest you answer one critical question early on in your program development. Will you use global standards or will you develop your own local specifications? It is your choice. But I encourage you to take that decision at the most senior levels. Global standards offer you confidence in the security model, technical interoperability, the ability to scale, interoperability across borders, and resistance to vendor and consulting-provider lock-in. Local standards could limit the ability of your people and your businesses to thrive outside the local context, and they could open you up to security threats if you become the “weakest link” relative to your peers. Even if you choose to leverage open source code, I encourage you to ensure that the open source code and your local implementations are certified as conformant to global open standards.

I assure you, the trend is already toward global standards. It was the rallying cry in Cape Town last month during ID4Africa. In recent weeks I was delighted to hear friends focused on global south countries (from the World Bank, UNDP, GovStack, MOSIP, and the Center of Digital Public Infrastructure) all encouraging the use of global standards and moving toward certification against them. Similarly, the European Digital Wallet program has been shaking up stakeholders in the global north with its Architectural Reference Framework, which leverages global standards that all EU member states will need to conform to so that European countries can interoperate.

Unfortunately, Digital Identity standards are complicated: there is not a single place or a single playbook to follow at this time. I encourage you to embrace the complexity. The strategy you develop could well include global standards from ISO, the IETF, the W3C, the OpenID Foundation, as well as best practices from other organizations like NIST to help avoid bias in your biometric algorithms – and you might want to consider using open source code to help you accelerate down the adoption curve. Either way, you are likely going to have to embrace some complexity in-house in order for your residents and businesses to benefit from simple user experiences.

Some of you represent countries with mature Digital Identity programs. We applaud you for being early adopters. I have a different question for you. How will you serve your residents and businesses that need to transact across borders? Is it worth investing in capabilities that will allow cross-border interoperability? You might want to ask yourselves what percentage of your GDP is driven by cross-border trade, how important your global diaspora is, and/or how often your citizens travel abroad. Or you might look at Digital Identity in terms of how it can enhance your national security posture. With $1.4T lost annually to cybercrime globally, we all have room to improve to better protect our residents, our businesses, and our security posture.

I also encourage countries with mature Digital Identity programs to take a leadership role in the work to develop global open standards, and to work on achieving cross-border interoperability of digital identity in practice. As David said in the earlier panel, the transformation and Digital Identity leadership in the global south is impressive … but global south representatives and the entities that fund their transformation are much less active in global standards bodies working on Digital Identity.

Earlier, I offered a vision of what good can look like: 8 billion people with Digital Identity, using their credentials seamlessly. But what does it look like when it goes wrong and we do not have global standards? One example is rail networks where track gauges do not line up, and people and goods have to move from one train to another at the border.

In a new project called the Sustainable and Interoperable Digital Identity Hub, or SIDI Hub, we are challenging ourselves to tackle the question of cross-border interoperability of Digital Identity. SIDI Hub is a multi-stakeholder community comprising more than 25 countries from the global north and global south, 25 non-profits, and many of the major multinational organizations. In the seven months since we formed SIDI Hub, we have held three summits on two continents, and we will hold three more summits on three additional continents this year. We encourage the G20 to leverage multi-stakeholder forums like the SIDI Hub to ensure that the principles you develop can be implemented in practice, all the way down to the protocol layer, in a way that millions of developers will be able to implement against those policies by default. Only then can your residents benefit from digital identity as a “public good”.

I will leave you with one last tip. If you want to achieve domestic or global interoperability, you need to test and certify implementations to a common specification, and then maintain conformance to that specification. When you multiply this by millions of entities — and millions of developers — testing, certification, and conformance become pivotal.

Many thanks.

The following comments were made in response to other speakers and the Q&A:

First I’d like to agree with Adam: the critical path for any jurisdiction is domestic use cases. My ask is that each G20 jurisdiction reserve some thought for cross-border interoperability. We have heard from Adam about the progress in the EU on cross-border interoperability, and we know the African Union also wants to enable cross-border trade and interoperability of Digital Identity deployments across Africa. From Hudson’s comments we know that interest in Latin American interoperability is also growing, and if we had an Asian representative on the panel we would probably hear the same from them.

I’d like to elaborate on one of the key ways we need to enable cross border Digital Identity. It starts with identifying use cases. To date with SIDI Hub we have identified 30 potential “champion” cross-border use cases. Let me offer four examples that are bubbling towards the top.

In Africa, the most popular use case was “cross-border trade” – helping people living along a geographic border, a use case that ranked lower in Europe for obvious reasons. A second use case was “helping people assert their educational and employment certifications across borders”, a use case that can serve all migrants, whether high income or low income. A third example is the “refugee” use case. UNHCR currently cares for 120 million refugees, and it needs to deliver on its mission to serve these individuals from the countries they originate from, through the UN system including host countries, all the way to any future destination country or back to their home jurisdiction. The fourth example is “opening a bank account”, which can apply to students, employee relocations, or any other migration use case.

We will not select “champion use cases” until later this year, but we already know that we need these use cases to be able to flesh out the minimum technical requirements to enable cross border interoperability and to map the trust frameworks across borders.

About the OpenID Foundation

The OpenID Foundation (OIDF) is a global open standards body committed to helping people assert their identity wherever they choose. Founded in 2007, we are a community of technical experts leading the creation of open identity standards that are secure, interoperable, and privacy preserving. The Foundation’s OpenID Connect standard is now used by billions of people across millions of applications. In the last five years, the Financial Grade API has become the standard of choice for Open Banking and Open Data implementations, allowing people to access and share data across entities. Today, the OpenID Foundation’s standards are the connective tissue to enable people to assert their identity and access their data at scale, the scale of the internet, enabling “networks of networks” to interoperate globally. Individuals, companies, governments and non-profits are encouraged to join or participate. Find out more at openid.net.

The post Digital Identity at the G20 first appeared on OpenID Foundation.


Velocity Network

Authoritative Sources for Verifiable Credentials – Part 3

Issuer permissions are the mechanism that Velocity Network introduces to enable relying parties (and wallets) to determine whether an issuer is an authoritative source for a particular credential. After an organization requests the ability to issue on the Network, the request is reviewed by Velocity Network to ensure that the issuing service parameters are within the remit of the organization’s business activities.

FIDO Alliance

AWS Expands MFA Requirements, Boosting Security and Usability with Passkeys


AWS has announced the introduction of FIDO passkeys for multi-factor authentication (MFA) to further secure customer accounts. This move aligns with AWS’s objective to offer a secure cloud environment by incorporating secure-by-design and safe-by-default principles. FIDO passkeys offer a strong and easy MFA option, leveraging public key cryptography to resist phishing attempts and enhance overall account protection.
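
At the API level, registering a passkey is a single WebAuthn credential-creation call in the browser. The sketch below shows the general shape of that call; the relying-party name, user details, and locally generated challenge are placeholders, since in a real deployment the challenge comes from the server and the resulting credential is sent back to it for verification.

```typescript
// Minimal browser-side passkey registration via the WebAuthn API.
// Values marked as placeholders would come from the server in practice.
async function registerPasskey(): Promise<Credential | null> {
  const options: CredentialCreationOptions = {
    publicKey: {
      challenge: crypto.getRandomValues(new Uint8Array(32)), // placeholder; server-issued in practice
      rp: { name: "Example Corp" },                          // placeholder relying party
      user: {
        id: crypto.getRandomValues(new Uint8Array(16)),      // placeholder; a stable user handle in practice
        name: "user@example.com",
        displayName: "Example User",
      },
      pubKeyCredParams: [{ type: "public-key", alg: -7 }],   // ES256
      authenticatorSelection: { residentKey: "required", userVerification: "preferred" },
    },
  };
  // The private key never leaves the authenticator; only a public key and
  // an attestation are returned, which is what resists phishing.
  return navigator.credentials.create(options);
}
```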


Velocity Network

Velocity Network’s Architecture for Issuer Trust – Part 2

Velocity Network aims to migrate career data to a three-party data exchange model with reliable data. This architecture is key to the revolution in career data that is waiting to happen. Without the three-party model, relying parties must create API integrations with a source of trusted digitized data from a single, often monopolistic, trusted issuer.

Monday, 17. June 2024

FIDO Alliance

ID Talk Podcast: Passkeys, Standards, and Selfie Certification with FIDO’s Andrew Shikiar


The FIDO Alliance, founded in 2012, stands as a pivotal organization in the identity technology sector, advocating for strong passwordless authentication mechanisms. The Alliance has been instrumental in establishing influential industry standards, promoting the adoption of biometrics, and enhancing digital security through two-factor and multi-factor authentication technologies.

This week, Andrew Shikiar, FIDO’s Executive Director and CEO, joins the ID Talk podcast to discuss critical issues in authentication and identity security. The conversation covers topics such as the intricacies of passkeys, the dangers of phishing and deepfakes, and the comprehensive testing that FIDO Certified products undergo with independent, accredited labs. Additionally, Shikiar introduces FIDO’s new Face Verification Certification program, aimed at standardizing selfie-based identity verification technologies across various sectors.

Gain valuable insights from Andrew Shikiar by tuning into the podcast, available on SoundCloud, Spotify, or Apple Podcasts, or using the link below.


Hyperledger Foundation

Developer Showcase Series: Nithin Pankaj, Senior Software Engineer, Walmart

Back to our Developer Showcase Series to learn what developers in the real world are doing with Hyperledger technologies. Next up is Nithin Pankaj, a Senior Software Engineer at Walmart. 



EdgeSecure

AI Teaching & Learning Symposium, presented by Edge and Seton Hall University


Edge and Seton Hall University hosted the inaugural AI Teaching & Learning Symposium on June 11, 2024, to explore the impact of AI on teaching, learning, and the student experience. The sold-out event was held at the University’s Bethany Hall and included a student panel, breakout sessions, lightning sessions, and the opportunity to connect with industry leaders, exhibitors, and fellow members.

Engaging with Generative AI Tools
The day’s events kicked off with a student panel, Experiencing Generative AI Insights from the Legal Foundations of Business, Disruption, Technology & Law, and Advanced Topics. Seton Hall University students Thiago Alves, Filip Malesev, Kathleen Meagher, Victoria Torna, Nicole Voltmer, and Jasmine Patel joined moderator Julia Boivin to discuss their experiences engaging with generative AI (GenAI) tools in their coursework and the applications, challenges, and outcomes they encountered when integrating AI into student workflows. The panel shared a closer look at the courses and GenAI tools used and how the GenAI Journal helped enhance students’ understanding of AI technology.

During the session, the group also discussed AI’s potential in learning and its practical application in various projects, as well as insight into the development of skills in prompt engineering for AI interactions, specifically with ChatGPT. Students shared ways to communicate with AI effectively and ethically and how the technology can impact student creativity and problem-solving skills through AI-assisted analysis and brainstorming. In recounting their experiences, the group also explained the technical and ethical challenges that were encountered, the strategies that improved the effectiveness of using AI in their coursework, and how these tools could positively impact academic performance and engagement. Most importantly, the student panel reflected on how the experience has prepared them for an AI-driven professional landscape and what they see for AI’s role in their future careers.

“This was one of the best conferences/symposiums I have ever attended. Every session was exceptional. The energy was extremely positive!” — Kate S., Instructional Designer

Empowering Personalized Learning with AI
The integration of AI and generative AI in higher education is revolutionizing student learning, teaching, and research. To explore this topic in more depth, Sergio Ortega, Business Development Lead for Artificial Intelligence/Machine Learning Worldwide Public Sector, Amazon Web Services, led the afternoon session, Industry Perspectives: Empowering Personalized Learning and Innovation with AI in Education. The presentation discussed how Amazon Web Services (AWS) is empowering universities to leverage these technologies to create personalized learning experiences, automate administrative tasks, and drive innovation.

The session looked at the potential of AI to analyze student data and provide personalized recommendations, as well as using GenAI tools to create customized learning materials that adapt to each student’s learning style and pace. Ortega examined how AI and generative AI can streamline administrative tasks, such as grading and course scheduling, helping to free up time for faculty to focus on teaching and research. Participants learned how AWS is supporting research and innovation in higher education by providing the infrastructure and tools necessary to analyze large datasets, identify patterns and trends, and develop new insights and theories.

Attendees of the symposium also had the opportunity to attend lightning sessions, short presentations exploring specific topics. The breakfast presentation, Adobe’s Perspective on Gen AI, was led by Stephen Hart, Principal Customer Manager, Adobe Education – New York, and reviewed some of Adobe’s most recent AI offerings, as well as the company’s ethical approach to creation and discovery. The lunch lightning session, Navigating AI in Higher Education: Engaging Faculty and Enhancing Student Success, delved into the transformative impact of AI in higher education, focusing on faculty engagement, student support, and learning journeys.

Presenters Cole Galyon, Vice President, Academic Innovation, Anthology, and Dr. Jae Kim, Senior Instructional Designer, William Paterson University, explored how they used insights and feedback from faculty to create a multifaceted approach to AI adoption across campuses, designed to address both opportunities and challenges. Participants experienced a live demonstration of the AI Design Assistant found within Blackboard Learn and saw how this tool has the potential to streamline administrative tasks and enhance educational outcomes. Dr. Kim explained how the University is leveraging Blackboard Learn Ultra to foster an interactive learning environment that empowers faculty and supports student success.

“The content covered in the conference was excellent and very helpful.”

— Patrick S., Professor

The Transformative Impact of Generative AI Tools
Christopher Petruzzi, Manager, User Interface and Multimedia Design, Seton Hall University, led the breakout session, Enhancing Creative Workflows with Generative AI in Adobe Photoshop, to discuss the transformative impact of GenAI tools in Adobe Photoshop. With two decades of Photoshop experience, Petruzzi has witnessed numerous advancements from Adobe and has seen how the technology has enhanced creative and editing applications within higher education, specifically at Seton Hall University’s Teaching, Learning and Technology Center (TLTC).

The introduction of generative AI into Adobe Photoshop has revolutionized the institution’s Digital Media Team’s workflow by automating routine tasks such as image corrections and creations. Petruzzi shared how these advancements have not only accelerated their project timelines, but also expanded their creative possibilities. Attendees gained an in-depth look at how these AI tools have altered Seton Hall’s digital workflows—improving efficiency and allowing for the production of more complex and creative outputs.

With a live demonstration of key generative AI features within Adobe Photoshop, Petruzzi illustrated their practical applications, which have been instrumental in enhancing workflows at the TLTC. The audience saw how AI-enhanced tools are used in real-world scenarios to fix, create, and enhance images, and how they are useful for producing high-quality content and marketing materials.

The Impact of ChatGPT on First-Year Writing
Redefining the Write Path: Exploring the Impact of Generative AI on First-Year Writing Education, presented by Nikki Bosca, Associate Director, Online Teaching and Course Development, New Jersey Institute of Technology (NJIT), shared the impact of ChatGPT on their First-Year Writing (FYW) courses. Taking the audience through their case study, Bosca showed how the research was anchored by an integrated framework drawing from Cognitive Process and Sociocultural Theories of Writing, and addressed four central questions examining perceptions, challenges, and opportunities for integrating generative AI into FYW instruction.

Utilizing a multiple methods case study approach, including interviews and surveys, the study provided nuanced insights into the dynamics of ChatGPT integration. Bosca explained how preliminary findings revealed diverse perspectives among instructors and students, which have challenged prevailing opinions on generative AI in student writing. As ChatGPT becomes more prevalent, this research has informed effective AI utilization that does not compromise the quality of student learning in FYW courses.

“The student panel was excellent! I found the agenda on the postcard super useful! It kept me off my devices. Overall it was a pleasure attending. Thanks so much for your efforts!”

— Elizabeth P., Senior Instructional Designer

Keeping Up with GenAI at Montclair State University
In January 2023, the instructional design team at Montclair State University began ideating a response to the advances in AI. Since then, Instructional Technology and Design Services (ITDS) has produced a suite of web-based resources, workshops and trainings, and consultations to guide their faculty through discovery and exploration of GenAI and how it can be leveraged pedagogically and ethically. In this session, The Amazing Race: Keeping Up with GenAI at Montclair State University, Montclair instructional designers, Joe Yankus and Gina Policastro, shared their experience composing these resources and facilitating small and large-group faculty development, as well as their lessons learned and the goals for the upcoming year.

The Transformative Potential of GenAI in Higher Education
John Shannon, Professor, Seton Hall University, and Susan A. O’Sullivan-Gavin, Professor, Rider University, joined together to present, Integrating Generative AI in Higher Education Learning Environments: Opportunities and Challenges. The presentation explored the transformative potential of GenAI and how it can help enhance learning environments in higher education. Attendees received a comprehensive overview of the benefits, challenges, and best practices of incorporating GenAI at an institution and the importance of guiding students in its use to gain “AI fluency.”

The session examined the importance of integrating GenAI in modern teaching and learning and the ways this advanced technology can enhance personalized learning, critical thinking, and digital literacy. GenAI can also facilitate the development of innovative teaching methods and improve student engagement and outcomes. Presenters offered guidance on using GenAI ethically and effectively, including academic integrity and plagiarism concerns, and issues related to transparency, privacy, accessibility, bias, accuracy, security, and regulatory challenges.

Creating Meaningful Learning Environments
With a look at the innovative ways AI tools can be used in the classroom, Centering Students and Using AI within Meaningful Learning Environments, explored how instructors can use AI to engage students in active, collaborative, constructive, authentic, and goal-directed activities, while also achieving culturally responsive teaching. This student-centered approach enriches how AI tools are used within meaningful learning experiences and is achieved when teachers situate students’ lived experiences, frames of reference, and ways of being as resources for learning. Presenter, Manny Algarin, Director of Education, Ed.D. Candidate, New Jersey City University, shared a process for designing meaningful learning experiences, a list of AI tools that promote meaningful engagement, and a culturally responsive framework that advances meaningful technology integration.

Robbie Melton, Provost, Tennessee State University, led another interactive session called Convenience to Competence: A Spectrum for Purposeful AI Integration in Education. The presentation introduced educators to the Arrighi AI-C2 Utilization Spectrum, a framework for conceptualizing how learners develop skills and competencies through the purposeful integration of AI tools into education. Attendees also received an overview of the Spectrum’s five stages from basic use to autonomous innovation.

“It was a great symposium – a nice variety of sessions. One enhancement for breakfast would be to include a yogurt option for those have gluten free dietary restrictions. The fruit was fresh and delicious. Very nice campus and good presenters. Kudos to Seton Hall and Edge for a well executed event! Thank you.”

— Abigail H., Senior Technology Trainer

Harnessing the Power of AI
The afternoon breakout sessions included Universal Access Through AI: Leveraging AI for Inclusive Education, presented by Jaimie Dubuque, Teaching and Learning Technologist, Rider University. In this presentation, participants learned how AI can be leveraged in teaching and learning to enhance accessibility and the practical strategies for implementing AI in planning, assessing, and supporting students in higher education. The session also examined the potential impact of AI on IEPs and 504s.

Also looking at the unparalleled opportunities that AI can unlock in the classroom, Muhammad Hassan, Executive Director, Nancy Thompson Learning Commons, Kean University, led the breakout session, Revolutionize Classroom Learning with AI. The presentation discussed how AI-powered platforms can personalize learning content, recommend supplementary materials, and facilitate real-time feedback. Hassan shared ways AI can enable the curation of unique assessments tailored to individual learning objectives and contexts, noting that AI algorithms can analyze students’ performance data and preferences to design assessments that are more relevant, relatable, and reflective of real-world scenarios. By embracing AI in assessment design, Hassan said, educators can cultivate critical thinking, creativity, and problem-solving skills while providing students with meaningful feedback and evaluation metrics.

In Lessons Learned, Actions Unleashed: Unlocking Our Potential, Diane Rubino, Adjunct Professor, New York University, led a candid discussion of what worked well and the opportunities for growth in integrating AI tools at their institution. The collaborative conversation explored various questions, including: What were the outcomes of the new teaching methods used? What wisdom can we extract from decisions that didn’t quite work out? And what action items can we identify to address challenges and improve the learning environment for ourselves and our students?

Improving Student Readiness
The panel presentation, Using AI Avatar Patients to Increase Student Readiness, explored how AI can be used to help educators prepare students to enter the workforce after graduation. Seton Hall University’s Leslie Rippon, Associate Professor, Department of Athletic Training, Genevieve Zipp, Professor, Program Director of Ph.D. in Health Sciences, and Lorene Cobb, Assistant Professor, Department of Physical Therapy, talked about the pedagogical design and implementation of AI avatar patients into healthcare interprofessional curricula. The session explored student readiness, which includes perceptions of knowledge, attitude, and ability, and how experiences are impacted by context-specific readiness.

Contextual readiness includes socio-political, community, organization, financial, and learning resources and opportunities that influence the experience. Presenters looked at how experiences should promote active learning and simulate real-world scenarios to foster student readiness appropriately. Developing and delivering real-world learning experiences can present many challenges, including contextual characteristics, limited space for hands-on engagement, increased staffing needs, scheduling conflicts, and costs associated with patient actors or experiential opportunities. The session highlighted how integrating AI and virtual reality (VR) can address these challenges and generate human-like content in response to complex and varied prompts. Attendees gained an inside look at the data on financial impact and students’ perceived readiness post-AI-VR experiences and the recommendations for future integration into the broader educational curriculum.

John Baldino, Director, Center for Teaching and Learning, Lackawanna College, added to the discussion in Machine Morality: Ethical and Creative Uses of AI for Faculty and Students. The professional development workshop examined AI, its role in education, and the opportunities for students and teachers to use the emerging technology to enhance teaching and learning.

As AI continues to be at the forefront of strategic planning discussions within academia, Edge will further explore how advanced technology can improve institutional effectiveness, enhance teaching and learning, expand research capabilities, and shape the skill sets that will be essential in tomorrow’s workforce. Edge events like the AI Teaching & Learning Symposium provide exciting opportunities to bring together professionals, thought leaders, and industry experts who can share valuable insight into the latest technologies and how institutions can leverage real-life solutions on their campuses to help transform education.

Featured Sessions

9:35-10:35 a.m. Student Panel: Experiencing Generative AI: Insights from the Legal Foundations of Business, Disruption, Technology & Law, and Advanced Topics

Bethany Hall Multipurpose Room C

This student-moderated panel will focus on their experiences engaging with generative AI tools in their coursework, emphasizing their applications, challenges, and outcomes. The insights are drawn from the Legal Foundations of Business, Disruption, Technology & Law, and Advanced Topics courses. The discussion will showcase practical applications of generative AI in learning environments, highlight the challenges faced and solutions students implement using AI tools, and describe the outcomes and learnings from integrating AI into student workflows.

The panel will offer an overview of the courses and generative AI tools used; discuss how the GenAI Journal enhanced students’ understanding of AI technology, its potential in learning, and its practical application in various projects; offer insight into the development of skills in prompt engineering for AI interactions, specifically with ChatGPT, and understanding how to communicate with AI effectively and ethically; describe the impact on student creativity and problem-solving skills; recount the technical and ethical challenges encountered and strategies for effective use of AI in coursework; describe key takeaways from using generative AI tools and their impact on academic performance and engagement; discuss the increased creativity and ability to generate novel ideas using AI while improving problem-solving skills through AI-assisted analysis and brainstorming; and reflect on how the experience has prepared students for an AI-driven professional landscape and AI’s role in their future careers.

Moderator:

Julia Boivin, Seton Hall University

Panelists:

Thiago Alves, Seton Hall University
Filip Malesev, Seton Hall University
Kathleen Meagher, Seton Hall University
Victoria Torna, Seton Hall University
Nicole Voltmer, Seton Hall University
Jasmine Patel, Seton Hall University

1:30 – 2:10 p.m. Industry Perspectives: Empowering Personalized Learning and Innovation with AI in Education

Bethany Hall Multipurpose Room C

The integration of AI and generative AI in higher education is revolutionizing the way students learn, faculty teach, and research is conducted. This presentation will explore how Amazon Web Services (AWS) is empowering universities to leverage these technologies to create personalized learning experiences, automate administrative tasks, and drive innovation. We will discuss the potential of AI to analyze student data and provide personalized recommendations, as well as the use of generative AI to create customized learning materials that adapt to each student’s learning style and pace. Additionally, we will examine how AI and generative AI can streamline administrative tasks, such as grading and course scheduling, freeing up time for faculty to focus on teaching and research. The presentation will also highlight how AWS is supporting research and innovation in higher education by providing the infrastructure and tools necessary to analyze large datasets, identify patterns and trends, and develop new insights and theories. Overall, this presentation will demonstrate how the use of AI and generative AI in higher education is improving student outcomes, advancing the field of research, and transforming the way universities operate.

Presenter:

Sergio Ortega, Business Development Lead for Artificial Intelligence/Machine Learning, Worldwide Public Sector, Amazon Web Services

Bio: Sergio Ortega is an accomplished executive with 20+ years of experience in AI, machine learning, analytics, and public sector industries. He excels in driving organizational excellence, fostering inclusive cultures, and solving customer challenges. Sergio has 15+ years of experience in cloud computing, PaaS, SaaS, and solution selling, with expertise in program management and go-to-market strategies. Currently, he is the Business Development Lead for AI/ML WW Public Sector at Amazon Web Services. Previously, he held leadership roles at Microsoft, including Metaverse, IoT, Light Edge Azure Engineering Global GTM and Ecosystem Lead. Sergio holds advanced degrees in data science, business, computer science, and cybernetics. His extensive experience and academic credentials make him a valuable asset to any organization.

Lightning Sessions

9:00 – 9:20 a.m. Breakfast Lightning Session—Adobe’s Perspective on Gen AI

Bethany Hall Multipurpose Room C

Stephen will review some of Adobe’s most recent AI offerings, as well as the company’s ethical approach to creation and discovery.

Presenter:

Stephen Hart, Principal Customer Manager, Adobe Education – New York

12:45 – 1:05 p.m. Lunch Lightning Session—Navigating AI in Higher Education: Engaging Faculty and Enhancing Student Success

Bethany Hall Multipurpose Room C

This presentation will delve into the transformative impact of AI in higher education, focusing on faculty engagement, student support, and learning journeys. Drawing on insights and feedback from faculty, we will explore the multifaceted approach to AI adoption across campuses, addressing both opportunities and challenges. Participants will experience a live demonstration of the AI Design Assistant found within Blackboard Learn, showcasing its potential to streamline administrative tasks and enhance educational outcomes. Attendees will also hear from William Paterson University about how they are leveraging Blackboard Learn Ultra to foster an interactive learning environment that empowers faculty and supports student success. This presentation aims to foster an interactive dialogue among attendees, encouraging them to share experiences and strategies for effectively integrating AI into higher education.

Presenters:

Cole Galyon, Vice President, Academic Innovation, Anthology
Dr. Jae Kim, Senior Instructional Designer, William Paterson University

Breakout Sessions

Session 1: 10:45 – 11:25 a.m. Enhancing Creative Workflows with Generative AI in Adobe Photoshop

Bethany Hall Multipurpose Room A

Explore the transformative impact of Generative AI tools in Adobe Photoshop during this presentation. With two decades of Photoshop experience, I have witnessed numerous advancements from Adobe. What I was originally doing by hand with the primitive lasso tool of Photoshop 7.0 in 2002 (has it been that long?)…can now be done at the click of one button in the latest Creative Suite.

This technology has enhanced both creative and editing applications within higher education, specifically at Seton Hall University’s Teaching, Learning and Technology Center (TLTC).

The introduction of Generative AI into Adobe Photoshop has revolutionized our Digital Media Team’s workflow by automating routine tasks such as image corrections and creations. These advancements have not only accelerated our project timelines but also expanded our creative possibilities. This session will provide an in-depth look at how these AI tools have altered our digital workflows, improving efficiency and allowing for the production of more complex and creative outputs.

The presentation will include a live demonstration of key Generative AI features within Adobe Photoshop, illustrating their practical applications which have been instrumental in enhancing our workflow at the TLTC. Attendees will see how AI-enhanced tools are used in real-world scenarios to fix, create, and enhance images, useful for producing high-quality content and marketing materials.

This session is tailored for individuals eager to learn about the potential of AI in enhancing digital media production, specifically within the Adobe Creative Suite. By sharing practical examples and personal insights from years of experience, I aim to inspire attendees to integrate these innovative tools into their own workflows, pushing the boundaries of what is possible in educational technology. 

It’s also just really cool to digitally insert a T-Rex in the middle of the campus green at the press of a button.

Presenter:

Christopher Petruzzi, Manager, UI and Multimedia Design, Seton Hall University

Redefining the Write Path: Exploring the Impact of Generative AI on First-Year Writing Education

Bethany Hall Multipurpose Room B

This case study explores the impact of ChatGPT on First-Year Writing (FYW) courses at a public university. Anchored by an integrated framework drawing from Cognitive Process and Sociocultural Theories of Writing, the research addresses four central questions examining perceptions, challenges, and opportunities for integrating generative AI into FYW instruction. Concerns range from academic integrity to student anxiety, with a focus on how instructors navigate AI-assisted assignments and the implications for the writing process and outcomes. Utilizing a multiple methods case study approach, including interviews and surveys, the study provides nuanced insights into the dynamics of ChatGPT integration. Preliminary findings reveal diverse perspectives among instructors and students, challenging prevailing opinions on generative AI in student writing. As ChatGPT becomes more prevalent, this research informs effective utilization without compromising the quality of student learning in FYW courses.

Presenter:

Nikki Bosca, Associate Director, Online Teaching and Course Development, New Jersey Institute of Technology

The Amazing Race: Keeping Up with GenAI at Montclair State University

Bethany Hall Multipurpose Room C

In January 2023, the instructional design team at Montclair State University began ideating a response to the advances in artificial intelligence, which broke headlines in late 2022. Since then, Instructional Technology and Design Services (ITDS) has produced a suite of web-based resources, workshops and trainings, consultations, and more to guide University faculty through discovery and exploration of GenAI to be leveraged pedagogically and mitigate misuse. In this session, Montclair instructional designers Joe Yankus & Gina Policastro will share their experience composing these resources, facilitating small and large-group faculty development, lessons learned, and goals for the upcoming year. 

Presenters:

Joseph Yankus, Instructional Designer, Montclair State University
Gina Policastro, Instructional Designer, Montclair State University

Session 2: 11:35 a.m. – 12:15 p.m. Integrating Generative AI in Higher Education Learning Environments: Opportunities and Challenges

Bethany Multipurpose Room A

This presentation explores the transformative potential of Generative AI (GenAI) in higher education. It will address how GenAI can enhance learning environments, the importance of guiding students in its use, and best practices for its integration into university courses. The presentation will provide a comprehensive overview of the benefits and challenges of incorporating GenAI in higher education.

This presentation will examine the importance of integrating GenAI in modern teaching and learning; the enhancement of personalized learning, including critical thinking and digital literacy; the development of innovative teaching methods; improvements in student engagement and outcomes; and guidance for students on using GenAI ethically and effectively, including academic integrity and plagiarism concerns and issues related to transparency, privacy, accessibility, bias, accuracy, security, and regulation. Our presentation will also discuss the need for students to understand that “AI fluency” is as important as reading, writing, and arithmetic. The use of artificial intelligence (AI) has grown tremendously since the release of ChatGPT in November 2022. The daily avalanche of announced improvements to AI platforms increases the need for higher education to address these issues. These trends demand that we develop clear policies and guidelines encouraging student awareness, education, and fluency.

Presenters:

John Shannon, Professor, Seton Hall University

Susan A. O’Sullivan-Gavin, Professor, Rider University

Centering Students and Using AI within Meaningful Learning Environments

Bethany Multipurpose Room B

How can AI be used to enhance meaningful learning environments in culturally responsive ways? This session focuses on innovative ways AI tools can be used in the classroom to support meaningful learning experiences and culturally responsive teaching. When designing meaningful learning experiences, teachers use AI to engage students in activities that are active, collaborative, constructive, authentic, and goal-directed. Culturally responsive teaching is a student-centered approach that enriches how AI tools are used within meaningful learning experiences. This happens when teachers situate students’ lived experiences, frames of reference, and ways of being as resources for learning. Participants will be provided a process for designing meaningful learning experiences, a list of AI tools that promote meaningful engagement, and a culturally responsive framework that advances meaningful technology integration.

This session also addresses educational inequities that surface when learning experiences lack meaningfulness. Teacher-centered practices have historically been related to passive technology use, deficit-based beliefs about students’ potential, “one size fits all” instruction, and curricular choices that do not account for diversity. As a result, teacher-centered approaches have perpetuated, and continue to perpetuate, opportunity gaps that disproportionately impact students from historically marginalized communities. To bridge digital divides and fill opportunity gaps, participants will be exposed to the strategies, beliefs, conditions, and tools that support the shift towards student-centeredness.

Presenter:

Manny Algarin, Director of Education, Ed. D. Candidate, New Jersey City University

Convenience to Competence: A Spectrum for Purposeful AI Integration in Education

Bethany Multipurpose Room C

This interactive presentation will introduce educators to the Arrighi AI-C2 Utilization Spectrum – a framework for conceptualizing how learners develop skills and competencies through the purposeful integration of AI tools into education. The presentation will provide an overview of the Spectrum’s five stages from basic use to autonomous innovation.

Presenter:

Robbie Melton, Provost, Tennessee State University

Session 3: 2:20 – 3:00 p.m. Universal Access Through AI: Leveraging AI for Inclusive Education

Bethany Multipurpose Room A

In this session, participants will learn how AI can be leveraged in teaching and learning to enhance accessibility. We will explore practical strategies for implementing AI in planning, assessing, and supporting students in higher education. During the session, we will also consider the potential impact of AI on IEPs and 504s.

Presenter:

Jaimie Dubuque, Teaching and Learning Technologist, Rider University

Revolutionize Classroom Learning with AI

Bethany Multipurpose Room B

In today’s rapidly evolving educational landscape, harnessing the power of Artificial Intelligence (AI) presents unparalleled opportunities to enhance learning experiences in the classroom. AI offers many tools and technologies that can dynamically adapt to students’ learning styles, preferences, and abilities. AI-powered platforms can personalize learning content, recommend supplementary materials, and facilitate real-time feedback. AI fosters an interactive and immersive learning environment that promotes active engagement and participation among students. AI presents a transformative way to enable the curation of unique assessments tailored to individual learning objectives and contexts. AI algorithms can analyze students’ performance data and preferences to design more relevant and relatable assessments that reflect real-world scenarios. Assessments developed using AI can incorporate multimedia elements, simulations, and gamification techniques to create interactive and engaging assessment experiences. By embracing AI in assessment design, educators can cultivate critical thinking, creativity, and problem-solving skills while providing students with meaningful feedback and evaluation metrics.

Presenter:

Muhammad Hassan, Executive Director, Nancy Thompson Learning Commons, Kean University

Lessons Learned, Actions Unleashed: Unlocking Our Potential

Bethany Multipurpose Room C

Let’s reflect on our AI work together. We’ll use a ‘retrospective’ (a project management tool) to candidly discuss what worked well and where we can grow. We’ll focus on solutions we can control and make this a fun, collaborative conversation. We’re looking for real solutions for real people. 

Here are some questions to get us started…

What were the outcomes of new teaching methods, tools, etc. used? Are there opportunities to refine or expand?
Let’s talk silver linings. What wisdom can we extract from decisions that didn’t quite work out?
Describe something you crushed! What made it a success?
What action items can we identify to address challenges and improve the learning environment for ourselves and our students?

Even if you haven’t made any groundbreaking discoveries, join us anyway. 

The session lead will share anonymized responses compiled into a toolkit after the conference. 

Presenter:

Diane Rubino, Adjunct Professor, New York University

Session 4: 3:10 – 3:50 p.m. Using AI Avatar Patients to Increase Student Readiness

Bethany Multipurpose Room A

Educators need to prepare students to enter the workforce post-graduation. Students’ readiness, which includes perceptions of knowledge, attitude, and ability, manifests in one’s sense of self-efficacy and is influenced by experiences. Additionally, experiences are impacted by context-specific readiness. The Context and Implementation of Complex Interventions framework defines context as a set of characteristics and circumstances that are active, unique, and embedded in the experience. Contextual readiness includes socio-political, community, organization, financial, and learning resources and opportunities that influence the experience. Experiences should promote active learning and simulate real-world scenarios to foster student readiness appropriately. Developing and delivering real-world learning experiences presents many challenges, including contextual characteristics, limited space for hands-on engagement, increased staffing needs, scheduling conflicts, and costs associated with patient actors or experiential opportunities. Integrating Artificial Intelligence (AI) and Virtual Reality (VR) can address these challenges. AI is a computing system that can engage in human-like processes such as learning, synthesizing, self-correction, and data integration for complex processing tasks. AI technologies generate human-like content in response to complex and varied prompts and, blended with VR, immerse the user in simulated environments to make the experience more thoughtful and interactive, allowing for repeated practice to enhance skill development, conservation of resources, and cost-effectiveness.

This panel presentation will describe the pedagogical design and implementation of AI avatar patients into healthcare interprofessional curricula. It will present data on financial impact and students’ perceived readiness post-AI-VR experiences and offer recommendations for future integration into the broader educational curriculum.

Presenters:

Leslie Rippon, Associate Professor, Department of Athletic Training, Seton Hall University
Genevieve Zipp, Professor, Program Director of PhD in Health Sciences, Seton Hall University
Lorene Cobb, Assistant Professor, Department of Physical Therapy, Seton Hall University

Machine Morality: Ethical and Creative Uses of AI for Faculty and Students

Bethany Multipurpose Room B

This professional development workshop examines artificial intelligence (AI) and its role in education. The session will go beyond the red-alert knee-jerk response of many educators and present opportunities for students and teachers to use the emerging technology to enhance teaching and learning.  

Presenter:

John Baldino, Director, Center for Teaching and Learning, Lackawanna College

Exhibitor Sponsors

The post AI Teaching & Learning Symposium, presented by Edge and Seton Hall University appeared first on NJEdge Inc.


Identity At The Center - Podcast


Dive into the world of digital identity with our latest episode of The Identity at the Center podcast! We discussed the future of digital wallets, authentication, and the importance of trust frameworks with Joni Brennan from the DIACC.

Watch the episode at https://youtu.be/phQtu14jlJU?si=u8N_zXgjuK-8uqD1 or listen in your podcast app.

#iam #podcast #idac

Sunday, 16. June 2024

Velocity Network

Empowering Self-Sovereign Identity With Trusted Credentials: Exploring Velocity Network Checks – Part 1

Self-sovereign identity centers on placing data control squarely in the hands of individuals. The goal is to rectify a mistake that has grown exponentially since the late 90s—the dominance of certain companies over personal information. The current model has data providers sending to data consumers directly, for the most part, with little to no consent from the data subjects themselves in […]

Friday, 14. June 2024

MyData

Lessons from the City of Helsinki: Three Paradigm Shifts in Smart Cities

Author: Mikko Rusama, Managing Partner at Nexus Transform. Finland is now the happiest country in the world for seven years in a row, according to the United Nations’ World Happiness Report 2024. Finland also ranks #1 in the Digital Economy Society Index (DESI). And the country’s free world-class education system has earned the #1 rank […]

Elastos Foundation

Elastos Partners with BEVM for Bitcoin Native Peer-to-Peer Loans

Partnership aims to unlock up to $1.3 trillion of dormant Layer 1 Value, as US consumers get excited about the 3rd Age of Bitcoin

Singapore, June 27th 2024: Elastos, the SmartWeb ecosystem provider, has announced a partnership with the L2 provider, BEVM, to develop a peer-to-peer Bitcoin-denominated loan offering around the former’s BeL2 protocol. Together the companies believe they can unlock up to $1.3 trillion of dormant Layer 1 Bitcoin value, supported by data from the latest Elastos BIT (Bitcoin; Innovation & Trust) Index suggesting more than two-thirds of US tech-savvy consumers are comfortable using Bitcoin.

Collateralize 80% of assets while the Bitcoin Layer is untouched

Elastos believes momentum is building around the Third Age of Bitcoin, where users will be able to transact using native Bitcoin. Partnering with BEVM to develop this Bitcoin-native loan product will allow users to collateralize up to 80% of their assets in return for L2 credit (stablecoins, for instance) based on terms defined in a Bitcoin-assured smart contract. The integrity of the currency is assured by BeL2’s unique ZK-proof process, which means the Bitcoin layer is untouched: the process can be completed without bridging, wrapping, or otherwise interfering with the Bitcoin layer, avoiding the network congestion and additional fees that would otherwise result. This approach enables Elastos and BEVM to deliver a genuinely peer-to-peer loan product that is completely disintermediated and anonymous. Verification (potentially through third-party services) and the resulting costs and delays would only be required in the event of a dispute between the two parties.
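To make the collateralization arithmetic concrete, here is a minimal sketch in TypeScript. The LoanTerms shape, the field names, and the example price are invented for illustration; only the 80% ceiling comes from the announcement, and BeL2's actual contract interface may look nothing like this.

// Illustrative sketch only -- not BeL2's actual API. Names and the 80% cap
// are assumptions drawn from the announcement above.
interface LoanTerms {
  collateralSats: bigint;  // native BTC collateral, in satoshis (stays on the Bitcoin layer)
  ltvPercent: number;      // loan-to-value ratio agreed between the two peers
  btcPriceUsd: number;     // reference price used to denominate the L2 credit
}

const MAX_LTV_PERCENT = 80; // "collateralize up to 80% of their assets"

// Maximum L2 credit (e.g., in stablecoin units) a borrower could draw.
function maxCreditUsd(terms: LoanTerms): number {
  const ltv = Math.min(terms.ltvPercent, MAX_LTV_PERCENT);
  const collateralBtc = Number(terms.collateralSats) / 100_000_000;
  return collateralBtc * terms.btcPriceUsd * (ltv / 100);
}

// Example: 1 BTC of collateral at $60,000 and the maximum ratio -> $48,000 of credit.
console.log(maxCreditUsd({ collateralSats: 100_000_000n, ltvPercent: 80, btcPriceUsd: 60_000 }));

In the model the announcement describes, the collateral itself never leaves the Bitcoin layer; a ZK proof attests to it on the L2 side, and third-party verification enters only if the peers dispute the terms.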

“The BeL2 protocol perfectly reflects what BEVM is all about: developing and supporting EVM-compatible DApps which can run in the Ethereum ecosystem to operate on Bitcoin L2. The loan offering is the perfect illustration of how such services could revolutionize the finance sector,” said Hakan Sezikli, Co-founder of the BEVM Foundation.

Enabling Insight via BTC Oracle

Launched in December ’23, the Bitcoin Elastos Layer2 (BeL2) protocol is a Layer 2 solution for Bitcoin, enabling multiple functionalities such as staking and smart contracts to be denominated directly in the world’s most popular digital currency. BEVM will be collaborating with Elastos’ BeL2 protocol to deliver a BTC Oracle to monitor and analyze all Bitcoin-based activity in real time. As the BeL2 protocol enables Bitcoin users to manage, literally, any relationship through the currency – from simple staking (‘interest’) to complex multi-party agreements through smart contracts – the BTC Oracle will become a vital source of insight into how the currency is being used.

US tech-savvy consumers trust Bitcoin

This partnership comes as new data from the Elastos BIT (Bitcoin; Innovation & Trust) Index indicates growing excitement among US tech-savvy consumers for Bitcoin. 63% of ‘tech-savvy’ consumers feel either ‘perfectly comfortable’ or even ‘excited’ about transacting in Bitcoin, and over half of respondents in the US are using Bitcoin at least once a month.

Respondents to the survey also suggest they trust Bitcoin as much as online banking or cash to protect savings:

24% of US respondents would place most trust in Bitcoin
25% would place most trust in online banks
23% would place their trust in cash

“What this data shows is that we’re reaching an inflection point in the understanding and embrace of crypto-currencies among early adopters in the US that reflects the global trend towards the Third Age of Bitcoin,” said Rong Chen, co-founder, Elastos. “We are on the verge of Bitcoin delivering a new era of commerce, where users are in charge of their data and are no longer beholden to the Web 2 tech giants. This data shows there is work to do to encourage broader adoption in the US, but at Elastos it is our mission to develop technologies that will make it easier to interact and transact with Bitcoin.”

About Elastos

Elastos is a public blockchain project that integrates blockchain technology with a suite of reimagined platform components to produce a modern Internet infrastructure that provides intrinsic protection for privacy and digital asset ownership. The mission is to build accessible, open-source services for the world, so developers can build an internet where individuals own and control their data.

The Elastos SmartWeb platform enables organizations to recalibrate how the Internet works for them to better control their own data.

https://elastos.info

https://www.linkedin.com/company/elastosinfo/

About BEVM

BEVM is the first fully decentralized EVM-compatible Bitcoin L2 that uses BTC as Gas. It allows all DApps which can run in the Ethereum ecosystem to operate on Bitcoin L2.

www.bevm.io

https://twitter.com/BTClayer2

Thursday, 13. June 2024

Berkman Klein Center

Global AI Regulation: Protecting Rights; Leveraging Collaboration


Policy experts from Africa, Europe, Latin America, and North America outlined next steps for global AI regimes and networked capacity building


By Lis Sylvan & Niharika Vattikonda

Nearly a year and a half after the introduction of ChatGPT, artificial intelligence remains in the regulatory hot seat. While the EU AI Act put the so-called Brussels Effect into play, more regions across the globe are now weighing risks, rights, economic opportunities, and regional needs. On May 28th, the Global Network of Internet & Society Research Centers (NoC) and the Berkman Klein Center for Internet & Society at Harvard University (BKC) hosted a group of policy experts from Africa, Latin America, the US, and the EU to discuss the state of global AI regulation and outline next steps for collaboration across continents.

Lis Sylvan, Senior Director of Strategy and Programming at BKC, moderated the discussion with Carlos Affonso de Souza (Director of the Institute of Technology and Society of Rio de Janeiro), Mason Kortz (Clinical Instructor at the Cyberlaw Clinic at BKC), Gabriele Mazzini (European Commission, chief architect of the EU AI Act), and Ridwan Oloyede (Certa Foundation, coauthor of their recent “State of AI Regulation in Africa” report), with NoC Executive Director Armando Guio providing behind-the-scenes support. The group delved into how governments are weighing sectoral versus horizontal regulatory approaches; the role of the administrative state and existing data protection and competition regulators; the new models of AI regulation in Rwanda and Brazil; the impact of the EU AI Act across all jurisdictions; and the potential for truly global governance.

Origins and Approaches

De Souza contextualized the current moment as part of a decade-long journey of AI regulation that started with charters and declarations of governing principles from various governments and entities. Over time, those charters and principles were reflected in national AI strategies, which have been in the works for five years and can be seen as the precursor to AI regulation; Brazil’s AI regulatory evolution, for example, closely followed this time frame. De Souza highlighted the impact of the European Union’s General Data Protection Regulation (GDPR) on this evolution; after GDPR took effect, countries established data protection authorities that have largely been the main point of contact for early AI governance. As a result of GDPR, he said, “data protection may be an accelerator, may be an entry point for countries in the majority world, because that’s the conversation that we have been having in the last decade, and that’s where resources [and] attention had been moving forward in those countries.” However, he cautioned against using data protection law as the sole basis of AI regulation, because the data protection framework does not necessarily address the full scope of challenges raised by the development of AI.

Mazzini explained that the technical discussions about the EU’s proposed AI legislation date back to 2019. One of the key concerns with a sectoral approach, he said, was the risk of privileging certain sectors over others. The horizontal approach, though, results in added complexity, as regulators need to craft rules that work across sectors while avoiding repetition; moreover, the scope of EU legislation is limited by the exclusion of the national security, military, and defense sectors. While the EU AI Act takes an omnibus approach, Mazzini said it did not make sense to regulate AI as its own technology, but rather as a general-purpose tool with a variety of applications.

“What was clear to me since the get-go is that it didn’t make sense to regulate AI as a technology as such, because indeed what we are dealing with is a general purpose technology that has a variety of applications that we don’t even foresee today…” said Mazzini, “…and therefore, from my perspective, the idea to establish rules for the technology as such, regardless of its use, didn’t make any sense…We came up with this approach of establishing rules depending on the specific use to which [the technology] is put, with the greatest burden, from a regulatory point of view, being on the high risk,” which Mazzini outlined to include applications of the technology that are linked to health and safety, including medical devices, automated cars, and drones.

Sectoral and Regional Approaches

In the U.S. and in the African Union, regulatory agencies have found it more effective to apply existing laws — across data protection, competition, consumer protection, employment, and other sectors — to govern AI, often taking a sectoral approach. Oloyede said that data protection authorities and competition authorities have largely driven the initial AI regulatory agenda, as these authorities are best equipped to enforce consumer protection, data protection, intellectual property, and competition laws as the basis for national AI governance strategies. “We might see some sort of a clearinghouse model emerge, where not every country in Africa, for example, will try to come up with a specific AI regulation,” Oloyede said.

Oloyede indicated that the sector-based approach has been dominant on the African continent, with countries including Nigeria, Kenya, South Africa, Rwanda, and Egypt beginning to develop roadmaps for AI governance and establish regulatory task forces. Oloyede said the sectoral approach has allowed regulators to develop specific policies for the deployment of AI in healthcare, for example.

According to Mason Kortz, this sectoral approach is typically favored in the U.S. because the U.S. regulatory approach values subject-matter expertise over technical expertise. The U.S. will likely have subject-matter experts regulate AI in their own domains, Kortz said — for example, the Department of Housing and Urban Development would regulate AI for housing. The U.S. approach relies on the country’s strong administrative state and directs specific federal agencies to take on different pieces of AI regulation. Meanwhile, certain state laws have sought to regulate specific use cases of AI in housing and employment contexts.

Kortz also noted that the current approach in the U.S. is a confirmation that existing rights-based regimes will be applied or extended to harms resulting from the use of AI systems; with a notoriously slow legislature, he said, only making small changes as needed is an advantageous approach, particularly when existing enforcement agencies may already have the power to make those changes. The U.S. common law system is well-suited to this approach, he said, as it lends judges relatively strong power to reinterpret the law in ways that are binding on lower courts without necessarily having to rewrite civil code.

“When it comes to some of the more rights-based statutes we have,” Kortz said, “I think, actually, we have a pretty good governance model right there, and we just need some small adjustments around the edges to modernize those statutes and bring them in line, not just with AI, but hopefully, if not future-proof them, at least provide a little more stability for whatever comes next after AI.” However, Kortz allowed that AI is so fundamentally transformative that certain existing laws, such as intellectual property law and copyright doctrine, may not be enough, and that global harmonization of AI laws should be a priority.

Global Collaboration and Capacity

Oloyede indicated that African countries have introduced solutions at the level of the Global Privacy Congress, although these solutions will need to reflect differing national and regional interests. Mazzini noted that generative AI and general-purpose AI create additional issues that require international collaboration — fighting misinformation, he said, will require such collaboration. However, de Souza cautioned that regulatory transformation must keep in mind how those laws will be applied in the future. In some cases, he noted, new liability regimes for AI are now stricter than the remaining body of law; Costa Rica, for example, has adopted a strict liability approach for high-risk uses of AI.

“If we turn out to have the chapters of liability on our AI laws more severe than what we have in our general law for other situations, if we are all in agreement that, in the future, AI is going to be in everything, the legislators that are designing those laws today, they are designing general laws on liability, because we will have AI in almost all sectors,” de Souza remarked. “So the decisions that we’re making today on liability, they might end up scrapping the provisions that you have on your civil code, consumer protection code, because the AI law will be the law that is more recent, more specific, and that may be the one that will be applying in most cases.”

This international collaboration will require capacity building across the globe, and Mazzini emphasized that the EU AI Act has prompted additional work to support the authorities in the EU that will implement and enforce the regulation. Although the AI Act will impact multiple private sectors, he said, its public enforcement will require both financial and knowledge-based resources. De Souza noted that the Brussels Effect will prompt a need for global bureaucracy to support global compliance with the EU AI Act, and well-resourced national authorities are needed to support that implementation. Oloyede, however, said that lessons learned from the GDPR rollout may inform a better approach to implementing the EU AI Act with a more nuanced understanding of the local context. While the EU AI Act will require capacity building to support new governance bodies with funding and resources, he said, it is essential to preserve existing collaborations with data protection and competition authorities and empower those authorities to address AI in their own domains.

Despite different countries taking more sectoral versus horizontal approaches, the global community is working to establish flexible approaches to AI governance in their respective regions. As Oloyede said, “AI is here today. Tomorrow is going to be a different technology. And we can’t keep legislating for every new technology that we have.” Mazzini described a need for international coordination when he said, “when it comes to this new type of AI that is sometimes called ‘generative AI’ or ‘general purpose AI’ that we have specifically regulated in the EU — notably in the last few weeks, in final stages of the negotiations — I think I would like to see there certainly more international coordination, because there we are dealing with a number of questions that I think are pretty common across jurisdictions.”

Though approaches across the globe may be different, a common cross-cutting theme of the work is balance: protecting rights versus supporting innovation, legislating a critical technology while its capacity and impact are still developing, and providing necessary limitations while allowing nimble innovation.

The Network of Internet & Society Research Centers (NoC) is a collaborative initiative among academic institutions with a focus on interdisciplinary research on the development, social impact, policy implications, and legal issues concerning the Internet. The Berkman Klein Center at Harvard University served as NoC Secretariat from 2020–2023 and continues to participate in cross-national, cross-disciplinary conversation, debate, teaching, learning, and engagement.

Global AI Regulation: Protecting Rights; Leveraging Collaboration was originally published in Berkman Klein Center Collection on Medium, where people are continuing the conversation by highlighting and responding to this story.


EdgeSecure

Update on VMware’s acquisition by Broadcom


Dear Edge Community,

I hope this message finds you well. As we continue to adapt to the industry changes brought about by VMware’s acquisition by Broadcom, I want to share some important updates and reaffirm our dedication to assisting you through these transitions.

Pricing Changes Post-Acquisition: Following Broadcom’s post-acquisition adjustments to VMware’s licensing, contracting, and operational model, Edge will no longer manage an Enterprise Licensing Agreement (ELA) on behalf of our members. As a result, Edge will not be responsible for negotiating pricing, co-terming licensing dates, or participating in the quoting and billing process.

VMware Horizon Update: VMware Horizon is becoming a new organization, separate from VMware/Broadcom, called Omnissa. Licensing for Horizon must be procured separately from VMware and can also be fulfilled by partners like Carahsoft.

What This Means for Members:

Members will need to procure Broadcom/VMware licensing through standard reseller channels. Discounting will be determined by negotiations between each member and Broadcom/the reseller.

Edge still has procurement vehicles available for members to quickly process orders. These include:

The EdgeMarket TeCHS contract, fulfilled by SHI.
Other preferred resellers through our convenience contract via Carahsoft.

Additionally, Carahsoft will continue to assist the VMware reseller community in preparing your pricing and quotes.

Alternative Solutions and Expert Support: In light of these changes by Broadcom, we believe it is prudent to consider alternative solutions that may better meet your financial and operational needs. We have secured a robust selection of substitute products and services under Edge procurement vehicles. Most of these substitutes are available through the EdgeMarket portal, and I am pleased to introduce several key points of contact who can assist you with these options:

Lou Malvasi, PubSec Sr. District Sales Manager, SHI. Mobile: (609) 608-2463. Email: Lou_Malvasi@shi.com
Bethany Tangredi, AWS Partner Account Executive, AWS. Mobile: (413) 896-4331. Email: bethtang@amazon.com
Cyntya Ramirez, Senior Program Manager, Carahsoft Technology Corp. Mobile: (571) 662-4641. Email: cyntya.ramirez@carahsoft.com

Lou, Bethany, and Cyntya are available to discuss how various alternative solutions can provide the value and support you need during this time of change.

Alternative Solutions via Edge Procurement Vehicles:

AWS native services
Google Cloud Platform native services
TeCHS/SHI & Carahsoft: Microsoft Azure/Hyper-V
Nutanix
Citrix
Oracle
Other service providers

Cloud Migration Expertise: For those looking to enhance or modify their cloud strategies, please consider our awarded providers listed below. These firms are recognized for their excellence, are fully equipped to support your migration efforts, and are available via EdgeMarket contracts:

CampusWorks, Inc.: Contract #269EMCPS-23-002-EM-CWI, Expires On 10/04/2026
CBTS: Contract #269EMCPS-23-002-EM-CBTS, Expires On 10/29/2026
Infojini, Inc.: Contract #269EMCPS-23-002-EM-IFJ, Expires On 10/01/2026
New Era Technology, Inc.: Contract #269EMCPS-23-002-EM-NET, Expires On 11/30/2026
SHI: Contract #269EMCPS-23-002-EM-SHI, Expires On 12/04/2026
Slalom, Inc.: Contract #269EMCPS-23-002-EM-SLM, Expires On 09/19/2026
Softchoice Corporation: Contract #269EMCPS-23-002-EM-SCC, Expires On 10/16/2026
Strata Information Group: Contract #269EMCPS-23-002-EM-SIG, Expires On 10/04/2026
Trigyn Technologies, Inc.: Contract #269EMCPS-23-002-EM-TGN, Expires On 11/30/2026
Tryfacta, Inc.: Contract #269EMCPS-23-002-EM-TFC, Expires On 10/17/2026

As we face these new challenges, our team remains committed to supporting you every step of the way. We encourage you to reach out to our contacts, or directly to me, with any concerns, queries, or discussions regarding your future strategic directions.

Thank you for your continued trust and partnership as we navigate these evolving circumstances together.

The post Update on VMware’s acquisition by Broadcom appeared first on NJEdge Inc.


DIF Blog

Revolutionizing the traveler experience


Nick Price, who co-chairs DIF’s Travel & Hospitality SIG, and Nick Lambert, CEO of DIF member Dock Labs, explored the potential for decentralized identity to revolutionize the traveler experience during a discussion with Rob Otto of Ping Identity, Cadrick Widmann of cidaas, and Roger Olivieira, co-founder of Ver.id, at EIC in Berlin last week.

Nick Price: “Hotels are still stressed about verifying and storing passport information. They have a very large number of customers on file, and a very low amount of usable information. You’re not surprised when they ask whether you have stayed before, though you’ve stayed many times. Decentralized identity promises a substantial improvement in these areas. 

“A lot of the valuable information that makes travel work will be self-attested. Travel is not just about crossing the border or making a transaction, it’s about 'This is me, I'm a vegetarian, I like some extra legroom on the flight', et cetera.

"Travel companies need that information. The customer wants to give it to us, but they don’t currently have the tools to do it. This is exactly what we’re building for a large project in the Middle East: a decentralized identity journey for the traveler across airlines, transport, hotels and the tourism experience.”

Rob Otto: “What consumers really want is a value exchange. When I give you my data, use it to improve my experience. Maybe I even want the hotel to know how I'm feeling today, so I present an 'introvert or extrovert' credential.” 

"You don’t need decentralized identity to figure out someone is staying at a hotel for the tenth time, you just need a system that isn’t stupid.” 

Nick Lambert: “People do care about privacy, but it’s incumbent on us to provide that control over their data. For example, if you want to book a hotel or hire a car, you only want to provide the information the hotel or car hire company needs.

"From an organizational perspective, holding all that data centrally is a honeypot for hackers to target and sell, as well as a GDPR / CCPA compliance risk. Companies are keen to get rid of the liability and pass it over to customers." 

Roger Olivieira: “There are new regulations coming up where you won’t be allowed to store this data any more. Digital wallets are a good solution. They do three things very well: authentication, consent and digital signatures."

Cadrick Widmann: “But it’s hard for users to download a wallet just for one use case.”

Nick Lambert: “True. The user experience needs to be better than what exists today. For example, staff at Condatis (an Edinburgh-based CIAM provider, and DIF member) use decentralized identity to enter the office and access systems remotely, which is great. The challenge is integration with legacy systems.”

Roger Olivieira: “We solve that problem by putting a service provider between wallets and platforms, using common protocols like OAuth / OpenID Connect."
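A rough sketch of what that bridging looks like from the legacy platform's side, assuming a standard OpenID Connect authorization-code flow. Every URL and identifier below is a placeholder invented for illustration, not Ver.id's actual endpoints; the point is that the platform speaks plain OIDC while the intermediary handles wallet selection, credential presentation, and consent.

// Minimal sketch: a legacy platform initiating a standard OpenID Connect
// authorization-code flow against a hypothetical wallet-bridging service.
// Issuer URL, client_id, and redirect_uri are placeholders, not real endpoints.
const issuer = "https://bridge.example.com";
const params = new URLSearchParams({
  response_type: "code",                              // standard authorization-code flow
  client_id: "hotel-booking-portal",                  // placeholder relying-party id
  redirect_uri: "https://hotel.example.com/callback",
  scope: "openid profile",                            // request only the claims actually needed
  state: crypto.randomUUID(),                         // CSRF protection (Node 19+/browser)
});

// The platform never talks to the wallet directly; it simply redirects the
// user to the bridge's authorize endpoint and later exchanges the code.
const authorizationUrl = `${issuer}/authorize?${params.toString()}`;
console.log(authorizationUrl);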


We Are Open co-op

Making Credentials Work for Everyone

How to think about the three-sided marketplace of skills validation

CC BY-ND Visual Thinkery for WAO

After more than a decade of working with digital credentials like Open Badges and Verifiable Credentials, we still sometimes hear the sceptical question: “Who’s asking for this?”

While digital credentials offer far more than just helping people into a job, this remains a significant and powerful use case. However, creating an ecosystem where this is not only possible but also straightforward takes time. It’s a complex, three-sided challenge that requires more than simply asking users what they want; it involves anticipating their needs and providing innovative solutions that work for everyone involved.

The Appeal of Credentials

People and organisations are often attracted to digital credentials for their potential to recognise and validate a broad range of skills and achievements. They can help democratise learning, making it accessible and recognisable beyond traditional educational settings. For learners, these kinds of badges represent an opportunity to showcase their skills in a way that is immediately recognisable — and verifiable.

Challenges

Despite the initial excitement, many individuals and organisations find themselves asking, “Now what?” In our experience, this question stems from several challenges. Earners can sometimes have a lack of clarity on how to effectively use badges in practice, and so struggle to see how these credentials translate into real-world opportunities.

CC BY-ND Visual Thinkery for WAO

Issuers such as educational institutions and other organisations may find it difficult to convince stakeholders of the value of alternative credentials. Meanwhile, employers may exhibit uncertainty about the validity and relevance of these badges, leading to hesitation in recognising them as part of the hiring or promotion process.

Understanding the Three-Sided Marketplace

To address these challenges, it’s important to understand the interplay between the three main groups involved: earners, issuers, and employers.

Earners: These are individuals who seek to acquire credentials to validate their skills and knowledge. A significant proportion are already using digital credentials in the application process, with their main concern being whether these badges will be recognised and valued by employers and educational institutions.

Issuers: These include schools, universities, and other organisations that award credentials. Their challenge is to establish the credibility and relevance of their badges as a form of skills currency.

Employers: These are the entities looking to hire or promote individuals with verified skills. Fewer than half of employers say that they find university transcripts useful in helping them to evaluate job applicants’ potential to succeed at their company, so they need a quick and easy way to verify skills in a way that is a reliable indicator of ability.

Shifting the Question

So, rather than asking, “Who’s asking for this?” perhaps we should instead focus on understanding the needs and motivations of each group in this marketplace. By shifting our perspective, we can better appreciate the value of credentials and work towards making them more effective.

Just look at the progress we’ve made as an ecosystem:

✅ Clear value proposition — digital credentials offer tangible benefits, like job opportunities and career advancement, by validating a wide range of skills. Secure, digital wallets make it possible for earners to feel like they truly own and control their credentials.

✅ Widespread adoption and recognition — increasing numbers of institutions and employers are recognising and accepting digital credentials, thanks to the work of organisations such as the Digital Credentials Consortium (DCC), Jobs for the Future, and The RSA.

✅ Robust technology infrastructure — advanced platforms and secure technologies are being developed to support issuing, verification, and management of digital credentials, based on foundational work of standards organisations 1EdTech and the W3C.

✅ Collaboration between key industry bodies — industry leaders, educational institutions, and technology providers are working together to standardise and promote the use of digital credentials. See, for example, the work around SkillsFWD, Opportunity@Work, and the T3 Innovation Network, and networking at events such as The Badge Summit and ePIC.

✅ Data standards and taxonomies — establishing consistent data standards and taxonomies helps in creating interoperable systems where credentials can be easily shared and verified across different platforms. Credential Engine has developed the Credential Transparency Description Language (CTDL), supporting comparability across credential types and providers.
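To illustrate what that interoperability buys, here is a simplified sketch of the general shape of an Open Badges 3.0 credential, which builds on the W3C Verifiable Credentials data model. All values are invented, the context URLs should be checked against the current published spec, and a real credential would also carry a cryptographic proof.

// Simplified, illustrative shape of an Open Badges 3.0 credential.
// Values are invented; a real credential would also include a proof section.
const badgeCredential = {
  "@context": [
    "https://www.w3.org/ns/credentials/v2",
    "https://purl.imsglobal.org/spec/ob/v3p0/context.json", // verify exact version against the spec
  ],
  type: ["VerifiableCredential", "OpenBadgeCredential"],
  issuer: {
    id: "https://example.edu/issuers/1",   // the issuing institution
    type: ["Profile"],
    name: "Example University",
  },
  validFrom: "2024-06-11T00:00:00Z",
  credentialSubject: {
    id: "did:example:earner123",            // the earner's decentralized identifier
    type: ["AchievementSubject"],
    achievement: {
      id: "https://example.edu/achievements/data-analysis",
      type: ["Achievement"],
      name: "Data Analysis Fundamentals",
      criteria: { narrative: "Completed a portfolio of applied analysis projects." },
    },
  },
};

Because the achievement, issuer, and subject are all expressed in a shared vocabulary, any conformant wallet or verifier can display and check the badge without a bilateral integration.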

A huge amount of funding and effort has gone into getting us to this place. It’s going to take more money and time to get us over the line, so where should we focus our attention?

Creating Value for All Sides

CC BY-ND Visual Thinkery for WAO

To make digital credentials truly valuable for jobseekers, we need to address the concerns of all three groups:

For earners: Credentials should lead to tangible benefits, such as job opportunities or career advancement. We need clear pathways for using certain types of badges, with examples of how and where digital credentials have successfully led to job offers or promotions.

For issuers: It is crucial to develop robust standards such as Open Badges 3.0 to promote the credibility of their badges. Issuers can collaborate with industry leaders to ensure their credentials remain relevant and respected, as well as regularly updating the criteria for earning these badges based on industry needs.

For employers: Providing tools and frameworks to easily interpret and trust these badges will encourage wider acceptance and use. Employers can partner with educational institutions to co-create badges and/or develop practical tests to verify the skills claimed by the credentials.

Final Thoughts

In summary, the question “Who’s asking for this?” may not be the most productive one. Instead, we should focus on understanding the interconnected roles of earners, issuers, and employers in the credentialing process. By doing so, we can move beyond the trough of disillusionment and realise the full potential of these new forms of recognition.

At WAO, we’ve been working closely with the Digital Credentials Consortium (DCC) on storytelling and communications strategies that help everyone understand and embrace the value of digital credentials. The DCC is a key player in the ecosystem, and one of a number of organisations helping build a future where Open Badges and Verifiable Credentials are a natural and trusted part of the hiring process.

🔥 Do you need help with digital credentials? Check out WAO’s free, email-based Reframing Recognition course, or get in touch! You may also like to check out WAO’s Compendium of Credentialing

Making Credentials Work for Everyone was originally published in We Are Open Co-op on Medium, where people are continuing the conversation by highlighting and responding to this story.

Wednesday, 12. June 2024

GS1

End-to-end product traceability enables safer, more efficient care for Lancashire.

Stock control was heavily reliant on manual processing, and orders and supplies were managed by a limited number of staff. Much of the inventory management responsibility fell to individual nurses, taking them away from providing direct patient care.

Ingenica’s inventory management system (IMS) was implemented to help manage stock control. The IMS is kept up to date with information provided directly by suppliers via the GHX Nexus catalogue.

Business goal: GS1 Healthcare Case Studies 2023-2024 (gs1_uk_01_cases_studies_2024_final_.pdf)

Next Level Supply Chain Podcast with GS1

Drink Outside the Box: QR Codes and Mocktails with Daniel Scharff


In this episode, we dive into how QR codes are revolutionizing packaging, non-alcoholic beverages are making waves, and community networks are driving market success. 

Daniel Scharff is the CEO and founder of Startup CPG, a vibrant community supporting emerging consumer packaged goods brands. With a background in San Francisco's food tech scene, Daniel created a Slack community with over 20,000 members, offering resources and fostering collaboration. He hosts 100+ events annually and produces the Startup CPG podcast, sharing his insights from his experience as a former CEO of a beverage company. 

Daniel's unique approach to helping brands navigate the complex world of product development, market entry, and supply chain logistics is driven by that firsthand experience. His innovative strategies and relentless drive to make dreams a reality have earned him a reputation as a visionary leader in the CPG community.

 

Key takeaways:

Learn how QR codes are transforming packaging, improving traceability, and enhancing consumer engagement (see the short sketch after this list)

Understand the impact of the growing trend of non-alcoholic beverages on supply chain logistics and market dynamics

Discover the crucial role of community support in helping emerging brands succeed in the competitive market
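One concrete way QR codes transform packaging in the GS1 world is the GS1 Digital Link syntax, which encodes the GTIN and other identifiers as an ordinary, resolvable web URI. A minimal sketch in Python; the GTIN, lot value, and resolver shown are placeholder values:

```python
# Build a GS1 Digital Link URI of the kind a packaging QR code carries.
# AI 01 = GTIN, AI 10 = lot/batch, AI 21 = serial number.
GS1_RESOLVER = "https://id.gs1.org"  # GS1's public resolver domain

def digital_link(gtin: str, lot: str = "", serial: str = "") -> str:
    """Compose a GS1 Digital Link URI from GS1 application identifiers."""
    uri = f"{GS1_RESOLVER}/01/{gtin}"
    if lot:
        uri += f"/10/{lot}"
    if serial:
        uri += f"/21/{serial}"
    return uri

# Encode the resulting URI in the QR code printed on the pack:
print(digital_link("09506000134352", lot="ABC123"))
# https://id.gs1.org/01/09506000134352/10/ABC123
```

Because the payload is just a URI, the same QR code can drive supply-chain traceability (via the embedded identifiers) and consumer engagement (via whatever the resolver serves to a phone).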

Connect with GS1 US:

Our website - www.gs1us.org

GS1 US on LinkedIn

 

Connect with guest:

Daniel Scharff on LinkedIn

Check out Startup CPG

 

Tuesday, 11. June 2024

Hyperledger Foundation

Blockchain Pioneers: Hyperledger Quilt


As we laid out in our Helping a Community Grow by Pruning Inactive Projects post, there is an important life cycle to well governed open source projects. Since our launch in 2015, Hyperledger Foundation has hosted a number of now archived projects that helped drive innovation and advanced the development of enterprise-grade blockchain technologies. This series will look back at the impact of these pioneering projects.


Project VRM

The Personal AI Greenfield


What forms of pAI—personal AI—are Apple, Mozilla, Google, Meta, Microsoft and the rest not doing?

Let’s look at those first two because they’re at the top of the news LIFO buffer.

Apple Intelligence (“coming in beta this fall*“), announced yesterday, will help you with writing and creating images while giving you less lame answers from Siri. (Which they should re-name. Siri is Apple’s Clippy.) It “can draw on larger server-based models, running on Apple silicon, to handle more complex requests for you while protecting your privacy.” The “larger models” will be white-labeled ChatGPT, plus Apple’s own small language models (SLMs).

Mozilla, which got $400+ million a year from Google (for search in the Firefox browser) starting in 2020, announced on June 3 that they will be Building open, private AI with the Mozilla Builders Accelerator. Jive:

This program is designed to empower independent AI and machine learning engineers with the resources and support they need to thrive. It aims to cultivate a more innovative AI ecosystem, and it’s one of Mozilla’s key initiatives to make AI meaningfully impactful — alongside efforts like Mozilla.ai, the Responsible AI Challenge and the Rise25 Awards.

The Mozilla Builders Accelerator’s inaugural theme is local AI, which involves running AI models and applications directly on personal devices like laptops, smartphones, or edge devices rather than depending on cloud-based services…

We chose Local AI as the theme for the Accelerator’s first cohort because it aligns with our core values of privacy, user empowerment, and open source innovation. This method offers several benefits including:

Privacy: Data stays on the local device, minimizing exposure to potential breaches and misuse.
Agency: Users have greater control over their AI tools and data.
Cost-effectiveness: Reduces reliance on expensive cloud infrastructure, lowering costs for developers and users.
Reliability: Local processing ensures continuous operation even without internet connectivity.

Looks to me like both of these are Big AI writ small. It’s “local,” not personal. It’s made to serve your needs with what BigAI offers through APIs. It is still essentially AIaaS (AI as a Service), rather than truly personal AI (pAI): personalized more than personal.

That’s also what I see when I read between the lines at Mozilla’s AI job openings. Take platform engineer. This person will (among other things), “assist in managing and orchestrating workloads across multiple cloud providers.” That’s fine. I’m sure true pAIs will do that too. But most of pAI will be more personal than that. It will deal with the mundanities of your everyday life. Not with coughing up answers that can only come from AIaaSes.

The problem with personalizing AI giant offerings is that they are large language models (LLM) trained on everything that can be crawled on the Internet, plus who knows what else. Not on your truly personal stuff. This is why “prompt engineering” worthy of the noun is not for just anybody:

Prompt engineering is crucial for deploying LLMs but is poorly understood mathematically. We formalize LLM systems as a class of discrete stochastic dynamical systems to explore prompt engineering through the lens of control theory. We investigate the reachable set of output token sequences $R_y(\mathbf x_0)$ for which there exists a control input sequence $\mathbf u$ for each $\mathbf y \in R_y(\mathbf x_0)$ that steers the LLM to output $\mathbf y$ from initial state sequence $\mathbf x_0$. We offer analytic analysis on the limitations on the controllability of self-attention in terms of reachable set, where we prove an upper bound on the reachable set of outputs $R_y(\mathbf x_0)$ as a function of the singular values of the parameter matrices. We present complementary empirical analysis on the controllability of a panel of LLMs, including Falcon-7b, Llama-7b, and Falcon-40b. Our results demonstrate a lower bound on the reachable set of outputs $R_y(\mathbf x_0)$ w.r.t. initial state sequences $\mathbf x_0$ sampled from the Wikitext dataset. We find that the correct next Wikitext token following sequence $\mathbf x_0$ is reachable over 97% of the time with prompts of $k\leq 10$ tokens. We also establish that the top 75 most likely next tokens, as estimated by the LLM itself, are reachable at least 85% of the time with prompts of $k\leq 10$ tokens. Intriguingly, short prompt sequences can dramatically alter the likelihood of specific outputs, even making the least likely tokens become the most likely ones. This control-centric analysis of LLMs demonstrates the significant and poorly understood role of input sequences in steering output probabilities, offering a foundational perspective for enhancing language model system capabilities.

But all that stuff applies mostly when we’re prompting a big LLM system.

What about using AI in our own lives, where the data that matters most are in our calendars, contacts, financial and health records, our travels, our correspondence (email, chat, whatever)? And how about all the location data we might get from our cars, phone apps, and phone companies? These should be much easier for a pAI to gather, examine, and help us do useful things. Caring about much less data also means a pAI will be less likely to give wrong (hallucinated) answers.
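To make that concrete, here is a toy sketch of the kind of task a pAI could do entirely locally, assuming nothing more than a calendar export on disk. The file path and the crude parsing are stand-ins, not a real product:

```python
# Toy sketch of a pAI task that never leaves the device: answer
# "what's on my calendar this week?" from a local iCalendar export.
# The file path and the crude parsing are illustrative stand-ins; a
# real pAI would pair local data like this with a locally run model.
from datetime import datetime, timedelta
from pathlib import Path

CALENDAR_FILE = Path.home() / "export.ics"  # hypothetical local export

def events_this_week(ics_text: str) -> list[str]:
    """Crude .ics scan: SUMMARY lines for events within the next 7 days."""
    now = datetime.now()
    horizon = now + timedelta(days=7)
    events, start = [], None
    for line in ics_text.splitlines():
        if line.startswith("DTSTART"):
            stamp = line.split(":", 1)[1].strip()[:8]  # YYYYMMDD prefix
            start = datetime.strptime(stamp, "%Y%m%d")
        elif line.startswith("SUMMARY") and start is not None:
            if now <= start <= horizon:
                events.append(line.split(":", 1)[1].strip())
            start = None
    return events

if CALENDAR_FILE.exists():
    for summary in events_this_week(CALENDAR_FILE.read_text()):
        print(summary)
```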

Today the mental frame almost everybody uses for AI is the Big kind, ingesting everything they can get their crawlers on, and munching all of it in giant compute farms. Those systems are great for lots of stuff, but they still don’t deal with personal data listed in the last paragraph.

Not yet, anyway.

Look at it this way. For each of us, there are three data pools:

The entire Net, which is what gets crawled by all the giant LLM operators, plus whatever else they can get their claws on.
One’s personal life, some of which is digitized in useful form (contacts, calendar, mail, stuff in folders inside PCs and attached drives).
Personal data that is in the hands of giants, but is rightfully ours. These include our driving record and driving practices (recorded by our late model cars and snitched to insurance companies and others), our location data (kept and shared by car and phone carriers to the likes of Google and the feds), and our TV viewing habits (gathered by Google, Amazon, Roku, Apple, etc.).

The pAI greenfield is with the last two.

Tell us who is working on what there, preferably with open source, and not sitting on walled garden silicon.

[Later… ] Since readers told me I had small language models (SLMs) wrong in one of the paragraphs above, and I’m not sure I had them right, I rewrote them out of the piece. I invite readers to post comments to further correct and expand on the subject of pAIs and what they can do.


Elastos Foundation

BeL2 Loan App Demo 0.2 Live! Native Bitcoin’s Journey into Smart Contracts on Elastos


The BeL2 team is excited to announce the latest update to our Bitcoin-Elastos Layer 2 (BeL2) ecosystem: the Loan App Demo 0.2, now live at lending.bel2.org. This update is a significant step forward in integrating Bitcoin with smart contract functionalities, enhancing both usability and security. Please remember that this is a demo app today, not a commercial product; it is being built to showcase the underlying BeL2 technology as part of the larger roadmap. Let’s jump straight in!

 

What’s New in Loan App Demo 0.2

Since the last production release, showcased in Hong Kong, BeL2 has introduced several enhancements to improve user experience and functionality:

Enhanced Order Details: Comprehensive order details now include the status of Zero-Knowledge Proofs (ZKP).
Manual BTC Transfer Confirmation: Borrowers and lenders can manually confirm BTC transfers before ZKP completion, saving time.
Tip Functionality: Borrowers and lenders can provide tips to each other to encourage faster confirmations.
Lender Time Unlock Branch 3: Adds flexibility for lenders.
Wallet Compatibility: Unisat is now supported throughout the entire process, expanding beyond the Essentials wallet.
Dynamic Timelock Values: The UI now dynamically follows timelocks provided by the contract, no longer relying on hardcoded values.
Repayment Countdown Bug Fix: Addressed the repayment countdown duration bug.
Order Cancellation: Lenders can cancel an order if it remains unpaid for more than six hours.
New Order Status Filter: A new “ongoing” status filter has been added.
Bug Fixes: Various minor bugs have been resolved.

 

Updates to Essentials Wallet

To support the BeL2 loan app, Essentials has also received crucial updates:

Direct APK Download: Android users can now download the latest version (3.1.5) directly from d.web3essentials.io.
iOS Version Update: The iOS version has been updated and is available on the App Store.

 

BeL2: Expanding Bitcoin’s Capabilities

BeL2 enhances Bitcoin’s scalability, programmability, and privacy, leveraging the Elastos Smart Chain (ESC) while preserving Bitcoin’s integrity. BeL2 enables Bitcoin to interact with smart contracts on EVM-compatible blockchains using Zero-Knowledge Proof (ZKP) technology.

 

Decentralised Loan App

The Loan App on BeL2 allows Bitcoin holders to use their BTC as collateral for loans in USDT. Key features include:

BTC as Collateral: Users lock their BTC in a smart contract to borrow USDT.
Fixed Interest Rates: Protection against crypto market volatility.
No Forced Liquidations: Safeguards against violent price fluctuations.
Smart Contract Automation: Ensures predictable, automated repayment schedules.
Relayers and ZKPs: Enhance security and privacy in transactions.
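As a rough mental model of the flow these features describe, here is a toy state machine in Python. It is purely illustrative: the real BeL2 loan logic runs in on-chain contracts with ZKP verification and timelocks, and every name and number below is invented.

```python
# Toy state machine for a BTC-collateralised USDT loan, mirroring the
# features above. Purely illustrative: the real BeL2 logic runs in
# on-chain contracts, and every name and number here is invented.
from dataclasses import dataclass
from enum import Enum, auto

class LoanState(Enum):
    OPEN = auto()            # order created, waiting for a lender
    COLLATERALISED = auto()  # borrower's BTC locked in the contract
    ACTIVE = auto()          # USDT disbursed to the borrower
    REPAID = auto()          # repaid in full, BTC released

@dataclass
class Loan:
    btc_collateral: float
    usdt_principal: float
    fixed_rate: float        # fixed interest shields against volatility
    state: LoanState = LoanState.OPEN

    def lock_collateral(self) -> None:
        assert self.state is LoanState.OPEN
        self.state = LoanState.COLLATERALISED

    def disburse(self) -> None:
        assert self.state is LoanState.COLLATERALISED
        self.state = LoanState.ACTIVE

    def repay(self) -> float:
        assert self.state is LoanState.ACTIVE
        self.state = LoanState.REPAID  # repaying releases the BTC
        return self.usdt_principal * (1 + self.fixed_rate)

loan = Loan(btc_collateral=0.5, usdt_principal=20_000, fixed_rate=0.05)
loan.lock_collateral()
loan.disburse()
print(loan.repay())  # 21000.0
```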

 

Vision Forward

BeL2’s focus is on refining its technology and providing robust SDKs for developers to build a wide array of financial applications on top of Bitcoin, executable on EVM ecosystems like the Elastos Smart Chain. The team is focused on continuously enhancing the efficiency and security of its ZKP mechanisms and optimising smart contract performance for seamless integration across blockchain ecosystems. Next, the team is developing comprehensive Software Development Kits (SDKs) to simplify the process of building financial applications on BeL2. These SDKs will include:

Smart Contract Templates: Pre-built templates for common financial applications.
API Integrations: Easy-to-use APIs for integrating Bitcoin with other blockchain ecosystems.
Developer Tools: Advanced debugging and testing tools.
Documentation and Tutorials: Extensive documentation and tutorials to guide developers.

 

Building a Financial Ecosystem

The Loan App serves as a framework for developing various financial applications, demonstrating how Bitcoin can be used as collateral within a decentralised financial ecosystem. Potential applications include:

Decentralised Exchanges (DEXs): Platforms for trading cryptocurrencies without a centralised intermediary.
Lending Platforms: Smart contract-based lending services with APR.
Payment Solutions: Enabling businesses to accept native Bitcoin and other cryptocurrencies on online platforms.
Investment Platforms: Decentralised platforms for investing in various assets using Bitcoin’s security and liquidity.

We invite the Elastos community to explore the capabilities of the Loan App Demo 0.2 and experience the integration of Bitcoin with smart contracts on the Elastos Smart Chain. Your participation and feedback are invaluable as we continue to innovate and expand the BeL2 ecosystem.

Join us at lending.bel2.org and be part of the future where Bitcoin gains new functionalities and applications through BeL2. Together, we are building a more robust and decentralised financial landscape.

 

Monday, 10. June 2024

GS1

Jan Somers

CEO, GS1 Belgium & Luxembourg
GS1 in Europe Chair

Matthias Zenger

Senior Engineering Director, Google

April Cielica

President, Global Business Services, Procter & Gamble

Identity At The Center - Podcast

It’s a new episode of The Identity at the Center Podcast!


It’s a new episode of The Identity at the Center Podcast! Adam Mikeal, CISO at Texas A&M University, shares insights on identity security in higher-ed, IAM for DevOps principles, and the shift from custom code to commercial solutions with us.

Watch it at https://youtu.be/2foTalb9RVE?si=ZsZvGWxMUhFGrrWV

More info is at idacpodcast.com

#iam #podcast #idac

Friday, 07. June 2024

FIDO Alliance

InfoSecurity Magazine: #Infosec2024: CISOs Need to Move Beyond Passwords to Keep Up With Security Threats


Passwordless systems, even if they stop short of a full zero-trust environment, improve convenience as well as security. CISOs should look at approaches such as the FIDO model or web 3.0 technologies as a basis for future authentication systems.


White Paper: FIDO Attestation: Enhancing Trust, Privacy, and Interoperability in Passwordless Authentication


This document intends to provide a comprehensive understanding of attestation’s role in enhancing and advancing the digital security landscape, specifically with respect to authentication. It focuses on the core function of attestation: verifying the origin and integrity of user devices and their authentication materials. FIDO credentials are discussed with a focus on how they offer more secure alternatives to traditional password-based systems and how FIDO attestation enhances authentication security for both Relying Parties (RPs) and end-users. In this document, RPs are those entities that provide websites, applications and online services that require secure user access, confirming the identity of users or other entities. FIDO Alliance’s historical journey is presented with practical analogies for understanding FIDO attestation, its enterprise-specific technical solutions, and privacy aspects involved in the attestation process.

Targeted for CISOs, security engineers, architects, and identity engineers, this white paper serves as a guide for professionals considering the adoption of FIDO within their enterprise ecosystem. Readers should possess a baseline understanding of FIDO technologies, the meaning of attestation, and have a desire to understand why and how to implement attestation.
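For a flavour of what attestation verification involves at the RP, here is a fragment in Python using the cryptography library, sketching the core signature check of a WebAuthn "packed" attestation (the authenticator signs the authenticator data concatenated with the client data hash). It is a sketch only: a production RP would also validate the certificate chain against authenticator-vendor roots and apply policy, and the inputs here are assumed to be already parsed out of the attestation object.

```python
# Core signature check of a WebAuthn "packed" attestation statement:
# the authenticator signs authenticatorData || SHA-256(clientDataJSON)
# with the key in its attestation certificate. Inputs are assumed to
# be already parsed; chain validation against vendor roots is omitted.
import hashlib

from cryptography import x509
from cryptography.hazmat.primitives import hashes
from cryptography.hazmat.primitives.asymmetric import ec

def verify_packed_attestation(att_cert_der: bytes, signature: bytes,
                              authenticator_data: bytes,
                              client_data_json: bytes) -> str:
    cert = x509.load_der_x509_certificate(att_cert_der)
    signed = authenticator_data + hashlib.sha256(client_data_json).digest()
    # Raises InvalidSignature if the attestation signature does not
    # verify (assuming an EC attestation key, the common case).
    cert.public_key().verify(signature, signed, ec.ECDSA(hashes.SHA256()))
    return cert.subject.rfc4514_string()  # who attested the authenticator
```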


Oasis Open Projects

CACAO Layout Extension v1.0 approved as a Committee Specification


New Committee Specification from the CACAO TC

OASIS is pleased to announce that CACAO Security Playbooks Version 2.0 from the OASIS Collaborative Automated Course of Action Operations (CACAO) for Cyber Security TC [1] has been approved as an OASIS Committee Specification.

Collaborative Automated Course of Action Operations (CACAO) is a schema and taxonomy for cybersecurity playbooks. The CACAO specification describes how these playbooks can be created, documented, and shared in a structured and standardized way across organizational boundaries and technological solutions. This specification defines the CACAO Layout Extension for the purpose of visually representing CACAO playbooks accurately and consistently across implementations.
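For readers new to CACAO, a playbook is a JSON document of typed workflow steps; the Layout Extension adds rendering hints so different tools draw the same playbook the same way. A rough sketch as a Python dict follows. The core fields mirror the CACAO 2.0 pattern, but the layout property shown is an illustrative placeholder; consult the specification for the normative extension property names.

```python
# Rough shape of a CACAO playbook carrying layout hints, as a Python
# dict. The core fields mirror the CACAO 2.0 pattern; the "layout"
# property on the action step is an illustrative placeholder, not the
# normative extension property name.
playbook = {
    "type": "playbook",
    "spec_version": "cacao-2.0",
    "id": "playbook--11111111-1111-4111-8111-111111111111",
    "name": "Contain phishing campaign",
    "workflow_start": "start--22222222-2222-4222-8222-222222222222",
    "workflow": {
        "start--22222222-2222-4222-8222-222222222222": {
            "type": "start",
            "on_completion": "action--33333333-3333-4333-8333-333333333333",
        },
        "action--33333333-3333-4333-8333-333333333333": {
            "type": "action",
            "name": "Block sender domain",
            # Layout hint so every implementation draws this step at
            # the same canvas position:
            "layout": {"x": 240, "y": 120},
        },
    },
}
```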

This Committee Specification is an OASIS deliverable, completed and approved by the TC and fully ready for testing and implementation.

CACAO Layout Extension Version 1.0
Committee Specification 01
04 April 2024

Editable Source: https://docs.oasis-open.org/cacao/layout-extension/v1.0/cs01/layout-extension-v1.0-cs01.docx
HTML: https://docs.oasis-open.org/cacao/layout-extension/v1.0/cs01/layout-extension-v1.0-cs01.html
PDF: https://docs.oasis-open.org/cacao/layout-extension/v1.0/cs01/layout-extension-v1.0-cs01.pdf

ZIP: https://docs.oasis-open.org/cacao/layout-extension/v1.0/cs01/layout-extension-v1.0-cs01.zip

Members of the CACAO TC [1] approved this specification by Special Majority Vote. The specification had been released for public review as required by the TC Process [2]. The vote to approve as a Committee Specification passed [3], and the document is now available online in the OASIS Library as referenced above.

Our congratulations to the TC on achieving this milestone and our thanks to the reviewers who provided feedback on the specification drafts to help improve the quality of the work.

========== Additional references:
[1] OASIS Collaborative Automated Course of Action Operations (CACAO) for Cyber Security TC
https://groups.oasis-open.org/communities/tc-community-home2?CommunityKey=b75cccb8-adc6-4de5-8b99-018dc7d322b6

[2] Public review metadata document:
https://docs.oasis-open.org/cacao/layout-extension/v1.0/csd01/layout-extension-v1.0-csd01-public-review-metadata.html
– Comment resolution log:
https://docs.oasis-open.org/cacao/layout-extension/v1.0/csd01/layout-extension-v1.0-csd01-comment-resolution-log.txt

[3] Approval ballot:
https://groups.oasis-open.org/higherlogic/ws/groups/b75cccb8-adc6-4de5-8b99-018dc7d322b6/ballots/ballot?id=3819

The post CACAO Layout Extension v1.0 approved as a Committee Specification appeared first on OASIS Open.


Elastos Foundation

BeatFarm: Direct and Profitable Superfan Connections on Elastos


Earlier this year, we announced a collaboration with BeatFarm, a project working to disrupt the music industry with a Superfan dApp. So, what is a Superfan dApp? In this article, we will explore BeatFarm and their mission to empower creators and fans alike. Let’s get stuck in!

What is a Superfan App?

A Superfan app is a platform that helps content creators engage deeply with their most dedicated followers. It offers exclusive content, direct communication, and a community that fosters stronger, longer-lasting connections. By providing insights into fan behaviour and allowing monetisation through personalized content and merchandise, these apps enhance both engagement and revenue for creators.

Why BeatFarm Exists

BeatFarm exists to empower creators, establishing a direct and profitable avenue of engagement with their superfans. BeatFarm’s mission is to empower the artist by providing resources and tools which increase the overall value of their content through direct collaboration with their most loyal fans. By eliminating the intermediaries, BeatFarm guarantees that every artist has complete creative freedom and ensures they maximise their potential revenue. BeatFarm uses sophisticated technology like blockchain and smart contracts to ensure, in essence, that an artist is compensated in perpetuity for the creative work they have developed.

How BeatFarm Achieves Its Purpose

At its core, BeatFarm uses the infrastructure of Elastos to create an open, transparent, and secure way of monetisation for artists. The utility of blockchain and smart contracts allows artists to easily create, share, and monetise their content. Everything – be it songs, virtual events, or merchandise – is auto-embedded with smart contracts that trigger payment at the snap of a finger whenever that content is used. BeatFarm even creates superfan channels that have various grading levels of payment mechanisms, varied analytics for measuring fan behaviour, and deep e-commerce channels. This gives an artist an ecosystem which provides an entire commercialisation network for maximum revenue generation through superfan engagement.

What BeatFarm Offers

BeatFarm offers a superfan platform with a clear and fair value proposition: enabling the artist to generate more revenue from their content. Its key offerings include direct monetisation and the ability to self-monetise individual pieces of content, assuring that every artist earns more for every piece of content created and shared.

Blockchain and the Use of Smart Contracts: BeatFarm’s use of blockchain technology and smart contracts ensures transparent, reliable, and automated payments in perpetuity. This consequently ensures the continued monetisation of an artist’s content.
Agile Content Creation: From track creation, live hosting, and merchandise selling to engaging with fans in real time, everything can be done across the BeatFarm platform.
Superfan Apps: Their latest launch, superfan apps, provides several payment levels, analytics to measure fan behaviour, and an e-commerce channel that helps maximise revenue.
Global Reach and Scalability: BeatFarm partners with artists from across the globe to create personalized superfan channels. The platform scales as the number of users grows, based on artist popularity.

BeatFarm has taken every aspect of building the platform into consideration to foster the success of the artist. With a relentless focus on direct connections, transparent monetisation, and versatile content creation, BeatFarm is on a mission to empower the artist and create new and unique ways for an artist to interact with their superfans.

As we continue to build the SmartWeb, we invite you to learn more about Elastos and join us in shaping a future where digital sovereignty is a reality. Discover how we’re making this vision come to life at Elastos.info and connect with us on X and LinkedIn.

 


Identity At The Center - Podcast

We wrap up The Identity at the Center Podcast’s week-long Identiverse 2024 coverage


We wrap up The Identity at the Center Podcast’s week-long Identiverse 2024 coverage with a banger of an episode. We sat down with Ian Glazer from Weave Identity, Alex Bovee from ConductorOne, and Lance Peterman from Dick’s Sporting Goods and UNC Charlotte, to get into the topic of Zero Standing Privileges and why or why not this approach makes sense in the real world.

You can watch the episode at https://youtu.be/MEWy8gVEC9o?si=51xigHwE_eb5eyVM and hear more at idacpodcast.com

#iam #podcast #idac #identiverse2024

Thursday, 06. June 2024

Oasis Open Projects

Introducing the Open Supply-Chain Information Modeling (OSIM) Technical Committee


By Omar Santos, Distinguished Engineer, Cisco

Supply chain security has emerged as a critical concern for businesses in every sector. The importance of standardized, trustworthy, and interoperable information models cannot be overstated. Addressing this need, the OASIS Open Supply Chain Information Modeling (OSIM) Technical Committee (TC) is being formed to enhance supply chain management worldwide. The initial TC members include AT&T, Cisco, Google, Microsoft, the Cybersecurity and Infrastructure Security Agency (CISA), the National Security Agency (NSA), and others listed in the charter.

You can read the full blog published on Cisco’s website here.

The post Introducing the Open Supply-Chain Information Modeling (OSIM) Technical Committee appeared first on OASIS Open.


Identity At The Center - Podcast

The Identity at the Center Podcast’s week-long Identiverse 2024 coverage rolls on


The Identity at the Center Podcast’s week-long Identiverse 2024 coverage rolls on with another new episode debuting today. We sat down with Andrew Shikiar from the FIDO Alliance to get the latest FIDO news, including the recently announced Selfie Biometric Identity Verification FIDO certification program, and taking questions from our live studio audience.

You can watch the episode at https://youtu.be/nagXfos6n_Y?si=M2XHFGCqEAr9nYn7 and hear more at idacpodcast.com

#iam #podcast #idac #identiverse2024


We Are Open co-op

A Compendium of Credentialing

From badge design to the future of recognition in networks

Image CC BY-NC Visual Thinkery for WAO

We’ve written a lot about Open Badges, Verifiable Credentials, and Open Recognition over the years. During a recent conversation, we realised there wasn’t an up-to-date place to point people towards which gives a summary. Anne wrote a great overview back in 2022, but a lot has changed since then!

This post is broken down into sections and doesn’t include everything we’ve written on these topics, so feel free to dive into the archives. If you would like assistance with any of this, get in touch!

Introductory posts

These posts give an overview of platforms to get started with Open Badges, some of the latest changes to the specification, as well as how Verifiable Credentials can be used in practice.

Why Open Badges 3.0 Matters
5 platforms for issuing Open Badges
Examining the Roots

Badge System Design

It’s rare for badges and credentials to exist in a vacuum, so designing a system around them is important. These posts cover some of the things you may want to consider when approaching badge system design.

WTF are ‘Stealth Badges’?
Badges for digital transformation
Designing Badges for Co-creation and Recognition

Open Recognition

Meeting people where they’re at and helping them identify the knowledge, skills, and behaviours that make them unique is much more interesting than giving people more hoops to jump through.

Understanding Open Recognition
What is Open Recognition, anyway?
Open Recognition is for every type of learning
Reframing Recognition
Getting started with Open Recognition
Creating a culture of recognition
4 benefits of Open Recognition Pathways
Open Workplace Recognition using Verifiable Credentials
Looking to the future of Open Recognition
Open Recognition: Towards a Practical Utopia
Towards a manifesto for Open Recognition
Plausible Utopias: the future of Open Recognition

Experimental Stuff

These posts don’t fit neatly into the other sections but we think they’re important in terms of understanding the possibilities of badges and credentials, especially in terms of community work.

Using Open Recognition to Map Real-World Skills and Attributes
The Future of Trust in Professional Networks
Endorsement using Open Badges and Community Recognition

History and Advocacy

Whether you’re new to Open Badges and Verifiable Credentials or not, knowing the original vision around equity and opportunity is important to understand their potential.

Good things happen slowly, bad things happen fast
Reflecting on the Evolving Badges and Credentials Ecosystem
Keep Badges Weird: helping people understand the badges landscape
How badges can change the world
Open Recognition — A feminist practice for more equal workplaces

As we said at the top of this post, we’re happy to help! So if you’ve got a cool idea that you’d like us to sense check, just get in touch :)

CC BY-ND Visual Thinkery for WAO

A Compendium of Credentialing was originally published in We Are Open Co-op on Medium, where people are continuing the conversation by highlighting and responding to this story.

Wednesday, 05. June 2024

DIF Blog

Building Trust in AI through Decentralized Identity


DIF's ED Kim Hamilton Duffy, Wayne Chang, CEO of SpruceID and Professor Linda Jeng, a lawyer, former financial regulator and founder of Digital Self Labs, took to the stage at EIC to discuss how Decentralized Identity (DI) can help mitigate threats posed by Large Language Models (LLMs), ably moderated by KuppingerCole's Anne Bailey.

The panelists' thoughts on the nature and scale of the problem

Wayne: “Until now, we’ve been able to get by holding a driver's license up to a webcam, but with new AI tech you can fool these systems - this is already showing up at the edges. AI voice generation is really good now. People are saying it's not a valid identification factor any more.

"Phishing attacks are also on the rise, for example people pretending to be a romantic partner before encouraging the target to invest in crypto. Using AI bots for mimicry makes it easy for scammers to quickly establish trust".

Linda: "The rights conferred by GDPR, to decide with whom you share your data, are already difficult to enforce. Deepfakes will make this even harder.

"I’ve been meeting East Berliners here who tell me surveillance capitalism reminds them of what it was like growing up under the Stasi".

Kim: "A lot of the threats are not new, what we are talking about is an acceleration of these. We were already uncomfortable online. Now, with the things we’re seeing due to the advent of cheap, easy-to-use deepfake tech, we are past breaking point." 

How Decentralized Identity can help mitigate these threats

Wayne: “We need to add authenticity to communication. We don’t want to present a strong ID every time we want to use a chat app, so it makes sense to embed DI into comms channels, to prove I’m real.

“I define DI as the ability of any party to play the role of issuer, holder and verifier, based on cryptographic trust. Having digital credentials issued by many parties will enable trusted content certification, giving us confidence about what goes into an AI model.”

Linda: “It’s not about identity, it's about data governance, and creating chains of trust to combat risks from synthetic data.”

Kim: “I think of DI as a set of standards, technologies and principles that restore individuals’ control over their data. With these technologies, we have the possibility to build products and solutions on strong foundations.

"One of the key aspects with DI is the ability to provide a consistent experience across channels, creating a much safer environment for individuals. For example if you get a phone call from your CEO asking you to transfer money, with DI you can be sure it’s them and not a deep fake of their voice.

Recommendations for solution developers 

Wayne: “Focus on the value for the end user. The DI standards uniquely enable you to provide a great user experience while also ensuring privacy and solution sustainability, including the ability to swap out vendors if needed without disrupting the service you’re providing." 

Linda: "We have grown used to not having to pay for digital services. The incentives need to change. Think about new models where we get paid for our data, enabled by content authenticity and DI tech."

Kim: “We have to balance usability and privacy. It’s clear people want to use LLM-based tech in their lives. On the other hand we’re seeing increasingly aggressive interfaces, for example asking you to give full access to your documents or even your desktop. With DI, finally there are ways to provide people both the convenience AND the trust.”

Other opportunities

Wayne: “There’s exciting work happening at Kantara Initiative around automated compliance with data regulations. Imagine giving someone a license to your personal data. Then, if you're a Data Processor it’s easy to automatically demonstrate compliance using consent receipts.

"It makes the “Accept all” problem go away, as you can decide what kind of consent receipts should be automatically generated for which parties.” 

Linda: “We need to spend time educating policymakers and the public, but in the end it comes down to end user demand for solutions. There’s no legal requirement for open banking in the US, but it’s happening anyway as people want to share their banking data with fintechs. Creating a smooth, easy UX will help to create the demand.”

Kim: "There’s a huge role for expanding the scope of trust to content authenticity, similar to the browser check mark that shows a website has a valid SSL certificate. C2PA (link) is fantastic, and is already using VCs. However, there is a risk of getting locked into who can verify these claims, if we use Certificate Authorities (CAs) as the root of trust. We are talking to them and there’s strong interest in generalizing the trust model.”

The panelists' key takeaways

Wayne: "One of the early goals of the internet pioneers was to have your personal agent in cyberspace. We need to get back to that original definition of personal agents, taking advantage of them to certify our content and things done on our behalf." 

Linda: "We need the right to certify our data as authentic. Right now we can’t tell what’s synthetic versus from an original creator. It’s not judging whether the data is good or bad, it just gives us additional info about the data we’re using."

Kim: "Everything we’re talking about is already here, its just about connecting the pieces. If you're building products, this is a great time to get involved. Come and talk to us at DIF!"

Linda Jeng, Wayne Chang, Kim Hamilton Duffy, Kristy Lam and Elissa Maercklein published Chains of Trust: Combatting Synthetic Data Risks of AI earlier today.


FIDO Alliance

SC Media: Identiverse 2024: Deepfakes, passkeys and more


Two predominant themes stood out at last week’s Identiverse 2024 conference in Las Vegas. First, there was the issue of how to defend against rapidly evolving advances in deepfakes, especially for live remote verification. Second, there was a common assumption that widespread adoption of passkeys is right around the corner, and that organizations must prepare to manage and secure passkeys when they become mainstream.

FIDO Alliance Executive Director & CEO Andrew Shikiar touched on both topics in a session Wednesday (May 29) titled “FIDO, Passkeys and the State of Passwordless.”

He announced the alliance’s new certification standard for facial-recognition technologies. The first (and so far only) organization to receive that certification is iProov. In a keynote address Thursday (May 30), Shikiar added that the FIDO Alliance was ready to offer independent testing of facial-recognition technologies.

As for passkeys, the passwordless, FIDO-certified PKI-based WebAuthn credentials that reside on hardware keys, smartphones, PCs and in the cloud, Shikiar said the question was not if consumers would adopt them, but when.

The FIDO Alliance’s goal is “to make passkeys inevitable,” Shikiar said. No one at Identiverse expressed any doubt that they would be.


Hyperledger Foundation

Perun, a Hyperledger Lab, enables economic transactions between embedded IoT devices


Introduction


Identity At The Center - Podcast

The week-long Identiverse coverage with the Identity at the Center podcast continues


The week-long Identiverse coverage with the Identity at the Center podcast continues with another new episode. We hosted our biggest panel ever, including Arynn Crow, Allan Foster, and Ian Glazer from the Digital Identity Advancement Foundation (DIAF) and Kim Cameron award recipients Sophie Bennani-Taylor and Matthew Spence. Our conversation starts off by learning more about the mission of the DIAF before spending quality time getting to know Sophie and Matthew who share their journey into the world of identity and their Identiverse/Las Vegas experience.

You can watch the episode at https://www.youtube.com/watch?v=uN_rKAOpSOI and hear more at idacpodcast.com

#iam #podcast #idac


The Engine Room

[closed] Join our team! Two Associates for Engagement & Support


The Engine Room is accepting applications for TWO Associates: One based in Sub-Saharan Africa and the other in Latin America. 

The post [closed] Join our team! Two Associates for Engagement & Support appeared first on The Engine Room.


Blockchain Commons

Blockchain Commons Awarded FROST Grant from Human Rights Foundation


Today, the Human Rights Foundation (HRF) announced a Bitcoin Development Fund grant to Blockchain Commons for its continued support of the development of FROST, including holding two more virtual FROST meetings for the developer community.

FROST is a powerful quorum threshold signature scheme built using Schnorr signatures that offers many advantages over existing signature methodologies. That includes crucial integration with Distributed Key Generation: private keys can be created in pieces by discrete online servers, with no server ever having the whole key. Altogether, FROST can improve privacy, resilience, and security alike, making it truly a next-generation key-management system.
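For a feel of the "no server ever holds the whole key" property, here is a minimal Shamir-style secret-sharing sketch in Python. It illustrates the quorum idea behind FROST but is not the FROST protocol itself: FROST additionally produces Schnorr signatures from the shares so the full key is never reconstructed anywhere.

```python
# Minimal 2-of-3 secret sharing over a prime field, illustrating the
# quorum idea behind FROST. NOT the FROST protocol: FROST signs with
# Schnorr shares so the full key is never reconstructed anywhere.
import random

P = 2**127 - 1  # a Mersenne prime, fine for a toy field

def make_shares(secret: int, threshold: int, n: int):
    """Split `secret` into n shares; any `threshold` of them recover it."""
    coeffs = [secret] + [random.randrange(P) for _ in range(threshold - 1)]

    def poly(x: int) -> int:
        return sum(c * pow(x, i, P) for i, c in enumerate(coeffs)) % P

    return [(x, poly(x)) for x in range(1, n + 1)]

def recover(shares) -> int:
    """Lagrange interpolation at x=0 recovers the secret."""
    secret = 0
    for i, (xi, yi) in enumerate(shares):
        num = den = 1
        for j, (xj, _) in enumerate(shares):
            if i != j:
                num = num * (-xj) % P
                den = den * (xi - xj) % P
        secret = (secret + yi * num * pow(den, -1, P)) % P
    return secret

shares = make_shares(secret=123456789, threshold=2, n=3)
assert recover(shares[:2]) == 123456789  # any two shares suffice
```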

Blockchain Commons has long looked forward to the mainstream deployment of Schnorr and FROST because of their considerable benefits over traditional key-management and signature schemes. We held our first FROST Implementer’s Round Table in 2023 to give FROST library implementers and cryptographers a chance to talk with each other, and we were asked to do more. We also hosted a FROST Developer’s Meeting this year where we worked with Jesse Posner to offer wallet developers a look at what the future of FROST means.

Blockchain Commons is thrilled by HRF’s recognition of our work. Their funding makes it possible to continue this work this year: we’ll be hosting a second round table for FROST library implementers & cryptographers on September 18, then a second meeting for wallet developers on December 4. If you’re a cryptographer or library implementer, sign up for our FROST implementer’s list and if you’re a wallet developer, sign up for our Gordian developer’s list to receive the announcements on these events. Admission is free thanks to the support of HRF!

Blockchain Commons has also placed FROST on our developmental road map for the year, to consider incorporating it into our reference code and tools. More details will follow as additional funding and our schedule firm up.

In the meantime, put September 18 and/or December 4 on your calendars so that you can join Blockchain Commons for these important events that will unveil the future of multi-party signatures and key management.

Tuesday, 04. June 2024

EdgeSecure

Edge Receives $857,000 National Science Foundation (NSF) Grant to Enhance Network Connectivity for New Jersey Higher Education Institutions


CC* Regional Networking: Connectivity through Regional Infrastructure for Scientific Partnerships, Innovation, and Education (CRISPIE) 

NEWARK, NJ, June 4, 2024 – Edge has been awarded an $857,000 grant from the National Science Foundation (NSF) to enhance network connectivity and access to advanced research networks and related cyberinfrastructure for seven higher education institutions in New Jersey, including a community college and several Minority Serving Institutions (MSIs). The partner institutions include Brookdale Community College, Kean University, Montclair State University, Ramapo College, Rider University, Rowan University, and Saint Peter’s University. 

The project will improve access to advanced research networks and related cyberinfrastructure, aiming to reduce disparities for smaller and less resourced institutions. The initiative offers specialized training programs for IT personnel, faculty, and students, and establishes essential infrastructure elements: perfSONAR for deploying network monitoring and optimization capabilities, the Globus file transfer and sharing service, a regional, centrally managed Data Transfer Node (DTN) for efficient data transfers, and a Science DMZ for direct access to regional and national resources, with the InCommon Federation securing remote access to instruments and HPC resources. 


Dr. Forough Ghahramani, Assistant Vice President for Research and Innovation at Edge, serves as the principal investigator for the project. She highlights that by improving connectivity and cyberinfrastructure access, the project empowers institutions, fosters regional collaborations, and facilitates data-driven research and education. “This initiative creates diverse research collaboration opportunities for faculty across New Jersey, enabling further data-intensive research in disciplines including physics, astronomy, biology, genomics, earth and environmental sciences, data science, and cybersecurity,” explains Dr. Ghahramani. She continues, “The project includes a robust training and support program to enhance professional IT support and ensure proper adoption and success for researchers and educators at participating institutions. I’m excited to work in concert with my esteemed colleagues and Co-Principal Investigators to bring this initiative to life.”

Co-Principal Investigators include:

Dr. James Barr von Oehsen, Vice Chancellor for Research Computing; Director, Pittsburgh Supercomputing Center, University of Pittsburgh | Pitt Research; Research Professor, Electrical and Computer Engineering, Carnegie Mellon University
Dr. Tabbetha Dobbins, Dean of the Graduate School, Rowan University
Dr. Stefan Robila, Professor of Computer Science and Director of the Computational Sensing Laboratory, Montclair State University
Dr. Balamurugan Desinghu, Senior Scientist, Office of Advanced Research Computing, Rutgers University

As New Jersey’s research and education network, Edge’s mission is to advance research, science, innovation, and discovery through initiatives like this one. “By focusing on underserved institutions, Edge supports small MSIs with connectivity, technical support, and collaboration opportunities. This initiative strengthens and diversifies the academic community by enabling a wide range of research and education endeavors. It aims to improve current capabilities and lays the groundwork for future expansion to include other institutions,” shares Dr. Ghahramani. “This project will contribute to advancing the New Jersey AI Hub’s goals by enhancing network connectivity and providing critical resources for AI research and development, keeping New Jersey at the forefront of AI innovation.”

The NSF prioritizes proposals that support traditionally underserved institutions through partnerships with regional entities experienced in high-performance research and education networking, such as Edge. Special emphasis is placed on Historically Black Colleges and Universities (HBCUs), tribal colleges and universities, and other minority-serving institutions.

Full details about the NSF grant are available here. To learn more about Edge’s commitment to initiatives of this nature, visit https://njedge.net/research/resources-featured-research-reports/

About Edge

Edge serves as a member-owned, nonprofit provider of high performance optical fiber networking and internetworking, Internet2, and a vast array of best-in-class technology solutions for cybersecurity, educational technologies, cloud computing, and professional managed services. Edge provides these solutions to colleges and universities, K-12 school districts, government entities, hospital networks and nonprofit business entities as part of a membership-based consortium. Edge’s membership spans the northeast, along with a growing list of EdgeMarket participants nationwide. Edge’s common good mission ensures success by empowering members for digital transformation with affordable, reliable and thought-leading purpose-built, advanced connectivity, technologies and services.

The post Edge Receives $857,000 National Science Foundation (NSF) Grant to Enhance Network Connectivity for New Jersey Higher Education Institutions appeared first on NJEdge Inc.


DIF Blog

Les Miserables of the Cyber Frontier


Digital identity pioneers Markus Sabadello and Nat Sakimura crossed swords on stage during an absorbing discussion at EIC in Berlin this evening.

Markus kicked off by asking why the internet has become highly centralized, in spite of its beginnings as a peer-to-peer network.

Markus observed that technology embodies the values of the community that spawns it. He illustrated this by comparing W3C VCs with SD-JWT VCs - contending that the W3C format offers more "Liberte" - as well as DIDComm versus OID4VC - arguing that DIDComm features more "Fraternite". Here's a flavor of their conversation (apologies to both for any mistakes in capturing your comments).

Markus: "One of the biggest discussions right now is how to get a VC into a wallet and present it. With OpenID4VC, there is asymmetry between issuer and holder, whereas with DIDComm, you have a model where everyone can connect to everyone else and establish true peer-to-peer relationships."

Nat: "The same can be said for STMP. The philosophy was that everyone runs their own mail server. I do, but how many others do? Instead, we have enormous mail servers like Office 365 and gmail. Just the fact that protocol provides Liberte, doesn't ensure decentralization."

Markus: "No one said Liberte was easy. The easiest thing is to log in with Facebook!".

Nat: "What's the cost of enabling Liberte? It will make things more complex and error prone.

"You need to specify what is being decentralized. For example, IDPs are so-called centralized, yet there are hundreds of thousands of them. The number of wallets will be even more than the size of the population, but thinking about wallet providers, it will probably be far fewer than the number of IDPs.

"It's not only the technical architecture we need to look at, but also the operational and legal controls. Don't try to solve everything in a technical way."

Markus: "SSI arose to ensure we can have the same social structures and protections we enjoy in the real world, in the digital world. We want Liberte, Egalite, Fraternite, and these ideas are baked into SSI standards."

Nat: "The goal is not the technology, but how to achieve Liberte, Egalite, Fraternite. Don't get too fixated on the technology."

Markus: "We can agree on that!"


Identity At The Center - Podcast

We continue our week-long Identiverse coverage with another brand-new episode


We continue our week-long Identiverse coverage with another brand-new episode of the Identity at the Center podcast. For today’s episode, we talked with Danny de Vreeze from Thales @ OneWelcome about the transformative potential of AI in identity management, his journey in the IAM field, the evolution of customer identity and access management (CIAM), and the importance of making access frictionless and secure.

You can watch the episode at https://youtu.be/TUa_ClkyS2U?si=v4gwZh8NDb3ecEpm and visit http://idacpodcast.com for more info.

#iam #podcast #idac #identiverse2024


DIF Blog

Decentralized ID Technical Mastery Sprint @ EIC


DIF’s ED Kim Hamilton Duffy and SC member Steve McCown delivered a Technical Mastery Sprint to a packed audience on the opening day of the 2024 European Identity and Cloud conference.

Steve highlighted the scale of internet security and data privacy problems, noting that some 25 billion login credentials had been leaked to the dark web by 2022, a 65% increase from 2020. Identity is the hackers’ real objective, since this is what enables lucrative fraud schemes. This threat impacts both individuals - what happens when your biometrics are stolen? - and organizations - 98% of whom have relationships with at least one vendor that has experienced a breach within the past 2 years, according to the Cyentia Institute.  

Steve and Kim introduced key decentralized identity building blocks including Decentralized Identifiers (DIDs), DID methods, DIDComm, wallets and agents, and how they can help address the current privacy and security challenges. They emphasized that these elements can be readily incorporated into existing systems, demonstrated by the growing use of decentralized identity to create on-ramps between Web2 and Web3 (and now Web5) applications.
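For readers meeting these building blocks for the first time, a DID resolves to a DID document describing keys and service endpoints. Here is a minimal example, using the placeholder did:example method from the W3C DID Core specification; the key value is a dummy:

```python
# Minimal DID document, using the placeholder did:example method from
# the W3C DID Core specification. The key value is a dummy; real
# documents are produced by a concrete DID method implementation.
did_document = {
    "@context": "https://www.w3.org/ns/did/v1",
    "id": "did:example:123456789abcdefghi",
    "verificationMethod": [{
        "id": "did:example:123456789abcdefghi#key-1",
        "type": "Ed25519VerificationKey2020",
        "controller": "did:example:123456789abcdefghi",
        "publicKeyMultibase": "z6MkDUMMYKEYVALUE",  # dummy key material
    }],
    # Keys referenced here may be used to authenticate as the subject:
    "authentication": ["did:example:123456789abcdefghi#key-1"],
}
```

A verifier resolves the DID, finds the referenced public key, and checks signatures against it; no central identity provider is involved.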

Sam Curren demonstrated a new protocol that bridges DIDComm and OpenID Connect, facilitating eIDAS-compliant integration of DI within the EU digital identity wallet to enable new use cases.

Steve and Kim spoke about how trust is established in a world where anyone can issue credentials, and highlighted several approaches that are gaining traction, including Trust over IP Foundation Trust Registries and DIF Credential Trust Establishment. 

Kim outlined credential issuance and exchange flows and highlighted some implementation challenges, solutions and best practices, including credential storage and key management, and provided tips for managing a DI project and how to get started. She also outlined several existing government, educational, workforce management and supply chain use cases, and wrapped up the session with a live demonstration of DID creation and credential issuance using the Veramo CLI toolkit. 
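
For a flavor of what such a demo involves, here is a hedged sketch of DID creation and credential issuance using Veramo's TypeScript API (the CLI wraps similar calls). Package names and plugin wiring follow Veramo's public documentation but may differ across versions; treat this as a sketch, not a transcript of the live demo.

```typescript
import { createAgent, ICredentialPlugin, IDIDManager, IKeyManager } from '@veramo/core'
import { DIDManager, MemoryDIDStore } from '@veramo/did-manager'
import { KeyManager, MemoryKeyStore, MemoryPrivateKeyStore } from '@veramo/key-manager'
import { KeyManagementSystem } from '@veramo/kms-local'
import { KeyDIDProvider } from '@veramo/did-provider-key'
import { CredentialPlugin } from '@veramo/credential-w3c'

// In-memory stores keep the sketch self-contained; real deployments
// would use persistent key and DID stores.
const agent = createAgent<IDIDManager & IKeyManager & ICredentialPlugin>({
  plugins: [
    new KeyManager({
      store: new MemoryKeyStore(),
      kms: { local: new KeyManagementSystem(new MemoryPrivateKeyStore()) },
    }),
    new DIDManager({
      store: new MemoryDIDStore(),
      defaultProvider: 'did:key',
      providers: { 'did:key': new KeyDIDProvider({ defaultKms: 'local' }) },
    }),
    new CredentialPlugin(),
  ],
})

async function main() {
  // 1. Create a DID for the issuer.
  const issuer = await agent.didManagerCreate({ provider: 'did:key' })

  // 2. Issue a verifiable credential signed by that DID.
  const vc = await agent.createVerifiableCredential({
    credential: {
      issuer: { id: issuer.did },
      credentialSubject: { id: 'did:example:subject', alumniOf: 'Example University' },
    },
    proofFormat: 'jwt',
  })
  console.log(vc)
}

main()
```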

The session generated strong audience engagement, with Kim and Steve answering questions on topics including the trust relationship between issuers and relying parties, delegated authority (e.g. where a parent manages their child’s credentials), credential revocation, reconciling multiple user accounts, the need for centralized record keeping (e.g. for regulatory compliance), key storage and wallet recovery. 

Look out for a more in-depth post where we provide their answers!

Monday, 03. June 2024

Digital Identity NZ

Identity 2.5 with Alan Mayo

Alan Mayo goes beyond digital identity to consider identity in person, online, and by telephone. He has developed a structured way of describing identity that has, until now, been severely lacking. Alan questions the currently accepted status quo, suggesting that we will need to pass through an Identity 2.5 phase before reaching Identity 3.0.

With topics from verifiable credentials and passkeys to digital wallets and identity ecosystems, you can find Alan’s Digital Identity newsletters here.

The post Identity 2.5 with Alan Mayo appeared first on Digital Identity New Zealand.


We Are Open co-op

Fractional Leadership in Social Impact Organisations

Setting up a successful leadership transition

After years of working with and for a variety of different kinds of social impact organisations — from educational institutions to cooperative federations, small community-based charities to global non-profits — we’ve been lucky to see how our strategy and “critical friend” services can help leaders set foundations for an organisational programme or initiative.

It’s only lately that we’ve thought about some of what we do through the lens of “Fractional Leadership”. We are experts in cooperation, learning, technology and community, which means the type of leadership that we bring into projects is quite specifically OPEN.

What is Fractional Leadership? [image cc-by-nd Bryan Mathers]

Fractional leadership is essentially outsourcing a role or a part of a role to an expert who can help you hit the ground running. Such positions are great when you know you need a new department (e.g. Digital Transformation) or if you’re kicking off a new project that needs someone to lead. They’re also helpful when you know you want to hire someone full-time, but want to find the right person for your organisation.

Fractional leadership provides you and your organisation with flexibility because it gives you the time and space to make the right decisions about staffing and for your organisation as a whole. It's potentially less expensive than hiring someone full-time and can be explicitly bounded by a statement of work. It's also a way to test whether or not a particular new position makes sense for your organisation.

It is a way to work with diverse experts who have wide-ranging experience that you would like to apply to your organisation as it grows, changes and transitions.

The shadow side of fractional leadership [image cc-by-nd Bryan Mathers]

As with any concept we pull from the world of bizniz into the social impact space, we need to be aware of how it can manifest inside an organisation that isn’t only trying to maximise profits.

Often, organisations in our space are unaware that what they need is a leader, and instead look to us to create a specific "thing". "We just need a digital strategy," or "We just need a training programme," are requests that often come up in projects that obviously need something more. Working openly and helping people to understand how technology and funding intermingle, how community drives participation, how recognition motivates, or how a learning programme can scale all require leadership.

Often, funding in the social impact space is tied to developing a specific thing, not to the impact that thing should have. Complex realities also mean that someone who might be a leader in one context might not have what it takes to lead in yours. This is why an open leader is so important. Fractional leadership in the open is a rare thing indeed.

A fractional leader needs to be adaptable and find ways to stay within the bounds the organisation has, while also setting a project or department up for future impact.

Preparing for a new colleague [image cc-by-nd Bryan Mathers]

Still, experienced fractional leaders are well versed in making do with what they’re given. They can lay out plans and programmes with the future in mind. The strategies and processes put in place during a time of transition reflect a moment in time. Setting them up as iterative and designing them to evolve empowers future leaders.

We've offered "critical friend" services to help onboard the people who will take over what we were asked to start. We've advised on workload and priorities based on community engagement in tandem with organisational goals. We like to do the work of documenting and establishing open, productive processes and policies that help future collaborators take over when the time comes.

Is Fractional Leadership for you? [image cc-by-nd Bryan Mathers]

Your fractional leader can start:

- Developing a strategy that focuses on long-term social impact
- Creating processes or frameworks to measure impact, or creating processes for gathering data and insights
- Building assets to begin implementing the change needed to achieve the impact you're looking for
- Supporting and working with others on your team to co-design principles and approaches that support your mission
- Determining the skills and competencies necessary to lead the project, programme or department they're working within, and helping your organisation find someone to take over

The whole point is that you can bring in engaged experts to help you get something going while looking for your forever person. There are lots of people working in the social impact space who have deep expertise in everything from finance to HR to product development and design.

Fractional leadership is something you can simply try out. As you create briefs or write out job descriptions, just ask yourself: "Would a designated fractional leader give me more time and space to ensure this is a long-term success?" If so, get in touch!

Fractional Leadership in Social Impact Organisations was originally published in We Are Open Co-op on Medium, where people are continuing the conversation by highlighting and responding to this story.


Identity At The Center - Podcast

We are back from Identiverse with brand new episodes of the

We are back from Identiverse with brand new episodes of the Identity at the Center podcasts debuting every day this week. First up, we talked with George Roberts from McDonald’s about his career in identity, his role in shaping digital identity at McDonald’s, and his Identiverse experience.

You can watch the episode at https://youtu.be/wiempmDo-Ks?si=kHZNVZf1Lbq5Oo7p and hear more at idacpodcast.com

#iam #podcast #idac #identiverse2024

Friday, 31. May 2024

DIF Blog

DIF Newsletter #40

May 2024

DIF Website | DIF Mailing Lists | Meeting Recording Archive

Table of contents

1. Decentralized Identity Foundation News
2. Working Group Updates
3. Open Groups
4. Announcements at DIF
5. Community Events
6. DIF Members
7. Get involved! Join DIF

🚀 Decentralized Identity Foundation News

DIF at ID4Africa

DIF was honoured to participate at the ID4Africa 2024 AGM in Cape Town last week.

DIF's Catherine Nabbala and Damian Glover delivered a plenary presentation to the conference, together with Anand Acharya of Bhutan National Digital Identity.

The session highlighted DIF’s work within the broader landscape of national ID schemes and provided real-world insights into the implementation of decentralized identity by the Thai and Bhutanese governments.


The presentation elicited engagement with DIF from a variety of quarters, including several countries that are rolling out national digital identity programs, and others looking to do so.

Our conversations with stakeholders evidenced a strong desire for Africa-specific digital infrastructure, and an awakening interest in decentralized identity as a tool to empower citizens and promote cross-border interoperability while avoiding lock-in to ecosystems controlled by outside interests.

Universal Resolver

The IOTA DID method is now resolvable using DIF's Universal Resolver. The Universal Resolver is DIF-hosted infrastructure enabling resolution of any registered DID method through its flexible plugin model. This widely-used service is invaluable to the decentralized identity community. Accordingly, DIF has invested in boosting its reliability and stability to ensure the Universal Resolver's uptime and availability.

The Universal Resolver was generously developed and contributed by Danube Tech.

You can experiment with the Universal Resolver at https://uniresolver.io and even follow our simple process to get your own DID method running in the Universal Resolver.
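
Resolution itself is a plain HTTP GET. A small sketch, assuming the hosted dev instance and its documented /1.0/identifiers/ endpoint (swap in a real DID for the placeholder):

```typescript
// Query the hosted Universal Resolver over HTTP. The /1.0/identifiers/
// endpoint follows the universal-resolver project's documented API;
// the DID below is only a placeholder.
async function resolveDid(did: string) {
  const res = await fetch(`https://dev.uniresolver.io/1.0/identifiers/${did}`)
  if (!res.ok) throw new Error(`Resolution failed with HTTP ${res.status}`)
  return res.json() // resolution result, including the DID document
}

resolveDid('did:example:placeholder').then((result) => console.log(result))
```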

As a non-profit, DIF funds and hosts public good infrastructure like the Universal Resolver. Support our efforts by joining DIF.

DIF adds new liaison: IOV Foundation (IOVF)


IOV Foundation’s mission is to create a new open financial ecosystem that provides accessible and fair services. They focus on Latin America, specifically demonstrating solutions for interoperability between governments (countries, cities) and organizations in Central and South America. Their current pilot focuses on building an identity platform for the Argentine Chamber of E-Commerce, aimed at managing event tickets and board meeting attendance.

Calling all DIDComm users!

Do you use DIDComm? We want to hear about it! Let us know how you're using DIDComm here.

🛠️ Working Group Updates

Claims & Credentials Working Group

The Presentation Exchange 2.1 specification has been published: https://identity.foundation/presentation-exchange/spec/v2.1.0/
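
As a minimal illustration of what the specification standardizes, a verifier's presentation definition might look like the following; the IDs, purpose text, and field path are illustrative, not drawn from the spec:

```typescript
// A minimal Presentation Exchange presentation_definition: the verifier
// asks for any credential carrying a birthDate claim.
const presentationDefinition = {
  id: "age-verification-request",
  input_descriptors: [
    {
      id: "proof_of_age",
      purpose: "We need to verify that you are over 18.",
      constraints: {
        fields: [
          {
            // JSONPath into the candidate credential
            path: ["$.credentialSubject.birthDate"],
          },
        ],
      },
    },
  ],
};
```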

Sam Curren presented the Credential Trust Establishment specification to the Trust over IP Foundation's trust registry task force.

After a successful webinar to drum up interest (see write-up here), we will be launching the credential schemas work item. Contact membership@identity.foundation if you are interested in participating.

If you are interested in participating in any of DIF's Working Groups, please click here.

🔐 Applied Cryptography WG

(1) We are waiting for IETF CFRG cryptographic review of the BBS Signature Scheme draft

(2) We are updating but haven't yet published a new version of "Blind BBS Signatures" which enables features like "anonymous holder binding"

(3) We are updating but haven't yet published a new version of "BBS per Verifier Linkability" (pseudonyms)

(4) Both these new drafts provide features that have been tentatively added to the W3C Data Integrity BBS candidate recommendations.

📖 Open Groups at DIF

Veramo User Group

The user group is currently focusing on integrating SD-JWT into Veramo, improving compatibility with the SES environment in MetaMask Snaps for Veramo 6.x, and some changes to key management with regard to usage in MetaMask Snaps.

We also had a demo/presentation by Index Network on their product and plans for Veramo.

📻 China SIG

During the SIG's May meeting last week, we shared the idea of SSI, global development trends, and next steps for technical, application and translation sub-groups, and held an open discussion.

Access the recording here.

The previous SIG meeting was held on 17 April. CAICT (the China Academy of Information and Communications Technology) contributed a basic, unfinished digital identity framework for China SIG members to study and code on together.

Access the recording here.

The China SIG's permanent online meeting address for its regular monthly meeting is: https://meeting.tencent.com/dm/c4cNcDCmssbc
Tencent Meeting Number:507-9656-6284

📢 Announcements at DIF

European Identity and Cloud Conference (EIC) 2024

[Photo by Levin on Unsplash]

DIF will have a significant presence at EIC next week. Many Steering Committee members will be attending and presenting at the conference, which takes place in Berlin from 3 - 7 June.

Executive Director Kim Hamilton Duffy teams up with SC member Steve McCown on the opening day of the conference to deliver a Decentralized Identity Technical Mastery Sprint, and joins DIF members Wayne Chang of SpruceID, Riley Hughes of Trinsic, Nick Lambert of Dock and Daniel Buchner of Block (who is also an SC member) for panel discussions on days 2, 3 and 4.

SC member Markus Sabadello joins forces with Nat Sakimura, chairman of the OpenID Foundation, to deliver a keynote on the opening night of the conference titled "The Dueling Narratives of Decentralized Identities".

SC member Sam Curren, Hospitality & Travel SIG co-chair Nick Price and Dr Abbie Barbir of DIF liaison partner FIDO Alliance are among those participating in other panel discussions focused on decentralized identity, including

- Post Quantum Security: Cryptography in Decentralized Identity
- Expert/Digital Wallet & Verifiers Q+A
- Decentralized Identity for Onboarding & CIAM
- Addressing Usability Challenges of Digital Identity Wallets
- Decentralized Identity in Production

Check out the full agenda here.

DIF members are eligible for a 25% reduction on their ticket to attend EIC (on top of any other discounts). Simply enter code eic24dif25members during the last step of booking: click here to buy your ticket.

Please ensure your communications teams coordinate with us if you plan to attend, so we can assist in promoting your participation.

DIF Labs

The DIF Labs working group is coming soon; contact membership@identity.foundation to learn more.

🗓️ Community Events

Bridging the Gap: OpenID and DIDComm

SC member Sam Curren and Artur Philipp from IDUnion unveiled OpenID-DIDComm, a new protocol that bridges OpenID4VC and DIDComm, enabling credential issuers and holders to communicate securely in a way that complies with EU requirements for credential exchange.

The community call was highly anticipated and was well attended, despite public holidays in the US and Europe.

Following an introduction by DIF’s Senior Director of Community Engagement, Limari Navarrete, Artur noted that there are many reasons why issuers and holders will need to communicate, though OpenID4VCI (OpenID for Verifiable Credential Issuance) only supports the initial credential exchange.

For example, an issuer may need to notify a holder that a credential has expired or been revoked. On the other hand, a holder may want to request a batch of additional credentials from the issuer (single-use SD-JWT credentials will be important to prevent correlation of the holder).

The mutual authentication that is integral to the DIDComm protocol provides resistance to phishing attacks, compared with approaches requiring switches to channels such as SMS or email. What’s more, the security properties of a DIDComm connection do not degrade over time. These and other features make DIDComm an excellent choice to establish a persistent communication channel between issuers and holders.
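
To make the issuer-to-holder channel concrete, here is a sketch of what a DIDComm v2 plaintext message could look like before encryption. The envelope fields follow the DIDComm Messaging spec, but the protocol type URI, DIDs, and body are illustrative assumptions rather than part of the OpenID-DIDComm work:

```typescript
// Shape of a DIDComm v2 plaintext message (before signing/encryption).
// The protocol type URI below is hypothetical, used only to illustrate
// an issuer notifying a holder about a revoked credential.
const message = {
  typ: "application/didcomm-plain+json",
  id: "4d2a8b9f-0c1e-4f7a-9b1d-illustrative",
  type: "https://didcomm.org/example-revocation/1.0/notify", // hypothetical
  from: "did:example:issuer",
  to: ["did:example:holder"],
  created_time: 1716940800, // seconds since epoch
  body: {
    credential_id: "urn:uuid:illustrative-credential-id",
    reason: "Credential revoked by issuer",
  },
};
```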

The next step is for the project team to validate that the new protocol does not interfere with OpenID4VCI, in consultation with the group developing the OpenID specification. Enabling DIDComm connections to be established over OpenID4VP (OpenID for Verifiable Presentations) will follow later.

Click here to read more about the OpenID-DIDComm protocol.

Credential Schema webinar recap

As mentioned in the Claims & Credentials Working Group update, DIF hosted a well-attended community call to introduce the new Credential Schema work item - see the write-up here

Coffee Breaks

Make sure to tune in to the recordings of May’s DIF Coffee Breaks.

Tim Boeckmann, CEO and Co-founder of Mailchain:
https://x.com/DecentralizedID/status/1787539546579902766
Nara Lau, Founder at Fise Technologies:
https://x.com/DecentralizedID/status/1791167095650254928
Ankur Banerjee, CTO and Co-founder at Cheqd:
https://x.com/DecentralizedID/status/1793702820883091752
Humpty Calderon, Advisor @Ontology and creator of Crypto Sapiens Podcast:
https://x.com/DecentralizedID/status/1795533701490855965

June’s Upcoming Coffee Breaks

June 20th at 1pm PDT / 4pm EDT
Andres Olave, Head of Technology at Velocity Career Labs
https://twitter.com/i/spaces/1nAJEaLwvgbJL
June 27th at 11am PDT / 8pm CEST
Cole Davis, Founder and CEO at Switchchord
https://twitter.com/i/spaces/1vAxRvwmorkxl

Follow https://twitter.com/DecentralizedID to get updates

🗓️ DIF Members

Guest blog: Markus Sabadello


Steering Committee member and Danube Tech CEO Markus Sabadello provides an overview of the Identifiers & Discovery Working Group, where he has been contributing as co-chair for many years.

https://blog.identity.foundation/an-overview-of-the-dif-identifiers-discovery-working-group/

New Member Orientations

If you are new to DIF, join us for our upcoming new member orientations. Please subscribe to DIF's Eventbrite, which can be found here, for notifications about upcoming orientations and events.

🆔 Join DIF!

If you would like to get in touch with us or become a member of the DIF community, please visit our website.

Can't get enough of DIF?
| Follow us on Twitter
| Join us on GitHub
| Subscribe on YouTube
| Read our DIF blog
| Read the archives


FIDO Alliance

White Paper: Synced Passkey Deployment: Emerging Practices for Consumer Use Cases

This paper explores the emerging practices surrounding the use of synced passkeys which allow passkey use across multiple devices by syncing the passkeys over the cloud, specifically addressing the initial choices and considerations for service providers (aka relying parties or RPs). These practices are in their early stages and are likely to progress, since operating systems, browsers, and passkey providers are still in a phase of enhancing functionality. This document outlines crucial areas such as registration, authentication, passkey management, and accessibility for RPs to consider and presents a range of emerging approaches for adopting this technology. The objective is to guide RPs through these budding strategies, acknowledging that the specifics of ensuring secure and convenient passkey usage may evolve as the digital landscape continues to advance.
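
For a sense of the registration step the paper's guidance covers, here is a minimal browser-side sketch using the WebAuthn API. The RP and user values are illustrative, and whether the resulting passkey is synced across devices is determined by the user's passkey provider rather than the relying party:

```typescript
// Minimal WebAuthn registration sketch (runs in the browser).
// RP id/name, user details, and challenge handling are illustrative.
async function registerPasskey() {
  const credential = await navigator.credentials.create({
    publicKey: {
      // In production the challenge must come from the server.
      challenge: crypto.getRandomValues(new Uint8Array(32)),
      rp: { id: "example.com", name: "Example RP" },
      user: {
        id: new TextEncoder().encode("user-123"), // stable, opaque user handle
        name: "alice@example.com",
        displayName: "Alice",
      },
      pubKeyCredParams: [{ type: "public-key", alg: -7 }], // ES256
      authenticatorSelection: {
        residentKey: "required",     // discoverable credential, i.e. a passkey
        userVerification: "preferred",
      },
    },
  });
  // Send `credential` to the server for verification and storage.
  console.log(credential);
}
```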

Each section of this paper stands alone, allowing readers to read specific topics of interest without needing to read the entire paper from the beginning.

This white paper is intended for various stakeholders within relying parties, including non-developers such as information security executives, product owners, identity and access management practitioners, UI/UX designers, and accessibility practitioners.


FIDO Alliance Osaka Seminar

FIDO Alliance held a one-day seminar in Osaka for a comprehensive dive into passkeys. The seminar covered the current state of passwordless technology, a deep dive on how passkeys work, their benefits, practical implementation strategies and considerations, regulatory considerations, and case studies. 

Attendees had the opportunity to engage directly with those who are currently implementing FIDO technology through open Q&A and networking to get first-hand insights on how to move their own passkey deployments forward.

View the seminar slides and photos below:

FIDO Alliance Osaka Seminar: LY-DOCOMO-KDDI-Mercari Panel.pdf from FIDO Alliance

FIDO Alliance Osaka Seminar: NEC & Yubico Panel.pdf from FIDO Alliance

FIDO Alliance Osaka Seminar: Passkeys and the Road Ahead.pdf from FIDO Alliance

FIDO Alliance Osaka Seminar: Passkeys at Amazon.pdf from FIDO Alliance

FIDO Alliance Osaka Seminar: The WebAuthn API and Discoverable Credentials.pdf from FIDO Alliance

FIDO Alliance Osaka Seminar: FIDO Security Aspects.pdf from FIDO Alliance

FIDO Alliance Osaka Seminar: CloudGate.pdf from FIDO Alliance

FIDO Alliance Osaka Seminar: PlayStation Passkey Deployment Case Study.pdf from FIDO Alliance

FIDO Alliance Osaka Seminar: Overview.pdf from FIDO Alliance

FIDO Alliance Osaka Seminar: Welcome Slides.pdf from FIDO Alliance

Thursday, 30. May 2024

Elastos Foundation

Elastos Cyber Republic 5: Introducing Our New Council Members

In a dramatic conclusion to the Elastos Cyber Republic DAO 5 elections, the votes were fiercely contested until the final moments. With candidates from diverse regions and backgrounds vying for positions, the atmosphere was charged with anticipation and excitement. The following individuals and teams emerged victorious, bringing with them visions of growth, innovation, and community engagement.

Introducing Our Newly Elected Council Members!

The newly elected council members bring a wealth of experience and a shared vision for the future of Elastos. Their responsibilities include guiding the community’s strategic direction, approving and overseeing proposals, and ensuring transparent governance.

- Gelaxy (China). Background: Technical team responsible for the development of main and side chains. Vision: Enhance functional development on the chain and standardize funding applications.
- Jon Hargreaves & Roger Darashah (United Kingdom). Background: Experienced in communications and strategic management. Vision: Communicate the benefits of the SmartWeb and ensure Bitcoin's potential as a decentralized currency.
- Strawberry Republic (El Salvador). Background: Known for their perfect voting record and dedication to community engagement. Vision: Enhance transparency and promote grassroots art initiatives.
- Iggispopis CRC (Slovakia). Background: Strategic management and finance expertise. Vision: Address technical barriers and increase community engagement through active communication.
- WoW Africa: Lili Felix, Hannah West, Ian Anderson (Spain). Background: Diverse team with a focus on community development. Vision: Strengthen community collaboration and support technological innovation.
- Sash | Elacity 🐘 (United Kingdom). Background: Founder and CEO of Elacity, involved in developing dDRM; Head of BeL2, Bitcoin Elastos Layer 2. Vision: Promote decentralized technologies and enhance community engagement through transparent interactions.
- Chen2rong2 (-). Background: Long-term supporter of Elastos with a focus on bridging internet and crypto developers. Vision: Build the SmartWeb and foster collaboration between eastern and western community members.
- Rebecca Zhu (Australia). Background: Community member dedicated to promoting transparency and engagement. Vision: Enhance community participation and support the development of decentralized applications.
- Tyro and his Friends (China). Background: Collaborative team approach to proposal evaluation. Vision: Promote stable node operation and actively engage with the community.
- PG (Cayman Islands). Background: Investment and financial expertise. Vision: Support project development and improve decision-making transparency.
- Leo (Hong Kong). Background: Experienced in web3 project development and investment. Vision: Enhance ecosystem support for Elastos and uphold principles of fairness and transparency.
- Mark E. Blair, M.D. (Canada). Background: Head of Strategy for BeL2 and experienced council member. Vision: Drive innovation, support BeL2 rollout, and explore investor opportunities.

The Tense Final Moments

The election’s final hours were marked by intense activity, with votes shifting as candidates rallied their supporters. Observers noted strategic manoeuvres and last-minute endorsements that significantly impacted the outcomes. The competition was particularly tight for the final seats, with candidates switching places as the deadline approached. The dynamic nature of the voting process showcased the community’s active engagement and commitment to the democratic principles of the Cyber Republic. It was both tense and fun!

Next Steps

With the election concluded, the new council members will undergo a 14-day transition period before officially taking office on June 13th. During this time, they will deploy their council nodes, ensuring they are operational by June 12th. These nodes play a critical role in validating blocks on the Elastos sidechains, a key component of the ecosystem's security and functionality.

The first Biweekly Council Meeting is scheduled for June 26th, marking the beginning of the new council’s term. These meetings will be essential for setting the strategic direction and addressing community proposals.

The successful election of the new Cyber Republic Council members marks a significant milestone for Elastos. With their diverse backgrounds and shared commitment to the community, they are well-positioned to drive innovation and foster growth. The community’s active participation and engagement throughout the election process underscore the strength of the Elastos ecosystem and its commitment to decentralized governance.

For more information about the newly elected members and their visions, visit the Elastos website, follow the Cyber Republic Twitter and join the ongoing discussions within the community Telegram.



Origin Trail

SingularityNET and OriginTrail: Advancing Decentralized Knowledge Graphs

An innovative collaboration has emerged in the AI sector, as SingularityNET, a leading AI platform developer headquartered in Zug, Switzerland, and Trace Labs, the core development company of OriginTrail, based in Hong Kong, have just announced a strategic partnership aimed at supporting the development of the Knowledge Layer — the Internet of Knowledge.

This partnership signifies that two prominent players in the AI industry have come together to support a decentralized ecosystem where AI agents and infrastructure partners collaborate within the decentralized knowledge graph (DKG) landscape.

OriginTrail is an ecosystem dedicated to building a Verifiable Internet for AI, and this partnership marks the beginning of their collaboration with SingularityNET’s leading AI platform and robust ecosystem.

Among the key highlights of this partnership: Trace Labs is tasked with developing sophisticated infrastructure that allows for efficient access and retrieval of information stored on the DKG, tackling the challenges of AI hallucinations, bias, and model collapse driven by the explosion in synthetic data produced by AI. This effort is aimed at enhancing the functionality and responsiveness of the decentralized knowledge graph within the OriginTrail network.

SingularityNET will in turn provide users access to its decentralized platform, where specialized AI models and Large Language Models (LLMs) can be purchased and used for data analysis. These models are designed to operate seamlessly with the data supported by the OriginTrail network, fostering a more robust ecosystem. The company will also develop AI models that can be trained directly on the Decentralized Knowledge Graph. This approach helps realize the two partners' shared vision: eliminating the need for data centralization, and leveraging the decentralized nature of the blockchain to enhance privacy and security.

Leveraging SingularityNET's leading position in mission-critical research on Artificial General Intelligence (AGI) and Trace Labs' experience in commercializing Web3 and AI solutions, the strategic partnership is aimed in particular at solving real-world challenges with decentralized AI in key sectors such as Industry 4.0, decentralized science (DeSci), real-world assets (RWA), and education.

Dr. Ben Goertzel, CEO of SingularityNET, stated, “As we move from an Internet of documents, media and apps to an internet of knowledge and AI, the basic composition of the Internet as a ‘decentralized network of decentralized networks’ becomes ever more important. Both SingularityNET and Trace Labs have powerful capability to grow decentralized networks around knowledge graphs and associated AI capabilities; connecting these networks together into a cross-linked, cross-token meta-network will yield a host of different synergies enabling a broad-based boost in intelligent functionality. As a practical example: Putting together a subgraph of OriginTrail’s DKG decentralized knowledge graph covering shipping logistics, with a knowledge meta-graph living in the OpenCog Hyperon system deployed on SingularityNET covering the timing of events in various markets, one could achieve an unprecedented level of emergent knowledge in the minds of AI agents carrying out supply-chain planning and forecasting. SingularityNET’s new MeTTa-Motto tool integrating Hyperon symbolic AI with LLMs and other deep neural nets could play a critical role here. Similar examples exist in every vertical market, which gives this partnership an almost unbounded potential for economic benefit and human good.”

Žiga Drev, Managing Director of Trace Labs, added, "The benefits of AI are limitless and so are the risks, like hallucinations and model collapse as the growth in synthetic data outpaces the provision of real-world data. A truly open AI that fosters inclusion and a more equitable distribution of value can only be achieved through a collaborative and modular approach. We are proud to partner with SingularityNET, founded and led by the visionary Dr. Ben Goertzel, who coined the term Artificial General Intelligence (AGI). Working alongside the leading visionaries at the convergence of crypto and AI opens exciting opportunities to accelerate real-world adoption of neuro-symbolic AI, combining the power of the OriginTrail Decentralized Knowledge Graph (DKG) and SingularityNET's specialized marketplace for Large Language Models (LLMs) and other AI models."

Both organizations will also engage in collaborative marketing and social media efforts to promote their partnership and the innovations it brings to the blockchain world, the AI industry, and the intersection of these two sectors. The two partners are also mutually exploring and plan to integrate AI services into Trace Labs’ DKG and paranets.

The partnership is about working together today to solve the challenges of tomorrow: synergizing blockchains and knowledge graphs for safe, verifiable AI that emphasizes secure and privacy-preserving mechanisms for user authentication and authorization within the knowledge layer, ultimately ensuring that all processing of users' data is rooted in their consent.

About SingularityNET

SingularityNET was founded by Dr. Ben Goertzel with the mission of creating a decentralized, democratic, inclusive and beneficial Artificial General Intelligence (AGI). An AGI is not dependent on any central entity, is open to anyone and is not restricted to the narrow goals of a single corporation or even a single country. The SingularityNET team includes seasoned engineers, scientists, researchers, entrepreneurs, and marketers. Our core platform and AI teams are further complemented by specialized teams devoted to application areas such as finance, robotics, biomedical AI, media, arts and entertainment.

About OriginTrail

OriginTrail is an ecosystem building a Verifiable Internet for AI, providing an inclusive framework that tackles the world’s challenges in the AI era, such as hallucinations, bias, and model collapse, by ensuring the provenance and verifiability of data used by AI systems. OriginTrail is used by global leaders like the British Standards Institution, Swiss Federal Railways, Supplier Compliance Audit Network (SCAN), representing over 40% of US imports and several consortia funded by the European Union among others. Advised by Turing award winner Dr. Bob Metcalfe, renowned for his law of network effects, the Trace Labs team (OriginTrail core developers) plays a crucial role in promoting a more inclusive, transparent, and decentralized AI.

SingularityNET and OriginTrail: Advancing Decentralized Knowledge Graphs was originally published in OriginTrail on Medium, where people are continuing the conversation by highlighting and responding to this story.

Wednesday, 29. May 2024

Ceramic Network

CeramicWorld 04

Welcome to the 4th edition of CeramicWorld! Last month has been busy with new features and developments happening across the Ceramic ecosystem. Here’s a TL;DR for what has happened:

- Ceramic released a new library for building points on Ceramic
- Index is making strides in powering decentralized AI
- Ceramic Anchor Service (CAS) is moving away from IPFS pubsub
- Ceramic's Data Feed API is here

A new library for building points on Ceramic has dropped 🔥

The Ceramic core team has just released the points library, designed to help developers start experimenting with reputation systems on Ceramic and iterate quickly. The points library serves various use cases, including implementing rewards for community engagement, incentivizing collaboration, or facilitating community quests and educational initiatives.

As an illustration of its functionality, the Points library has been integrated into the ComposeDB Sandbox. By following the steps outlined in the Sandbox, you can earn your first points, stored on Ceramic! 🔥

And that’s not all. In addition to the points library, there is also a points demo application that you can use as the basis of your own project. This example app showcases how the points library can be utilized to incentivize community members for their engagement, such as participation on key social platforms. To learn more, check out the comprehensive tutorial, and watch the accompanying video walkthrough:

Start building with points library

Powering Decentralized AI with Composable Data

Index, a composable discovery protocol, is revolutionizing web discovery by empowering users to curate personalized search engines, known as indexes, tailored to their specific needs.

Built on Ceramic's decentralized graph database, Index is driving decentralized AI. It leverages a decentralized semantic index, utilizing ComposeDB, to enable semantic interoperability and a highly relational composition of use cases. By integrating AI-based discovery functions with Ceramic's decentralized network, Index ensures data integrity, privacy, and personalized discovery experiences across peers, mitigating privacy risks associated with centralized systems. This interconnected approach fosters a discovery experience where responses are both personalized and trusted, exemplified by a chat setup drawing from both community and private indexes.

Today, Index network is partnering with a number of projects in the ecosystem, including Ceramic, Lit Protocol, Fluence, Disco, Intuition Systems, Verax, and Olas Network, among others.

Just a few days ago, the Index team announced a collaboration with LangChainAI which enables developers to seamlessly build their composable semantic indexes with LangChain's suite of LLM pipelines. The new integration is supported in both Python and JavaScript.

Explore the Index Network

Ceramic Anchor Service (CAS) is moving away from IPFS pubsub

Earlier this month, the core Ceramic team shared an update on work to enhance the Ceramic network's reliability, scalability, and performance, focusing particularly on the Ceramic Anchor Service (CAS). As the team approaches the release of Ceramic's Recon protocol, CAS is transitioning away from using IPFS pubsub to synchronize Streams in Ceramic. To facilitate this, a new HTTP-based mechanism has been developed for sharing Time Events from CAS to Ceramic nodes. This eliminates newer Ceramic nodes' dependency on IPFS pubsub.

To ensure seamless data delivery and prevent potential data loss, it's crucial for all Ceramic nodes to be upgraded to at least v5.3. Learn more here.

New Ceramic features: SET Account Relations, Immutable Fields and shouldIndex flag

Earlier this month, a set of new features providing a sophisticated toolkit for data management was added to Ceramic. More specifically, you can now use the following tools to build your applications:
- SET account relation: enables users to enforce a constraint where each user account (or DID) can create only one instance of a model for a specific record of another model.
- Immutable fields: prevent specific data from being altered after creation.
- shouldIndex flag: gives developers an option to manage data visibility by choosing which fields should be indexed.
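
As a rough illustration, these features might appear in a ComposeDB model definition along the following lines. This is a sketch only: the directive names and arguments shown (the SET relation fields and @immutable) are assumptions based on this announcement, so consult the linked tutorial for the authoritative syntax.

```typescript
// Illustrative ComposeDB schema (GraphQL SDL carried in a TypeScript string).
// Directive names/arguments below are assumptions, not verified syntax.
const ratingModel = `
type Rating @createModel(
  accountRelation: SET,
  accountRelationFields: ["postId"],
  description: "At most one rating per account per post"
) {
  # Immutable field: cannot be changed after the document is created
  postId: String! @string(maxLength: 100) @immutable
  stars: Int!
}
`;
```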

Learn more about these features in a written tutorial and a video walkthrough.

Data Feed API is here!

Data Feed API is finally out. The new Data Feed API formalizes the underlying Ceramic event streaming protocol and allows developers to build data(base) products using raw event streams.

The release finalises the initial stage of implementation of the Data Feed API, with additional features and improvements coming in the near future. Check out the announcement blogpost and keep an eye on the Ceramic Roadmap for updates.

Ceramic Community Content

- TRENDING: We Built a Web3 Points Library on Ceramic by Mark Krasner
- TUTORIAL: Web3 Points Library: Example App Tutorial by Mark Krasner
- VIDEO: Building with Ceramic Points Library by Radek Sienkiewicz
- ANNOUNCEMENT: Upgrade your Ceramic node to v5.3
- BLOGPOST: Index Network x Ceramic Network: Decentralized AI with Composable Data
- TUTORIAL: Ceramic Feature Release: SET Account Relations, Immutable Fields and shouldIndex flag by Justina Petraityte

Upcoming Events

- May 29 - May 31: GenAI summit

Contact Us

Want to get in touch with the Ceramic core team? Fill out this form (1m). Otherwise, drop us a note in the Forum.


DIF Blog

"Streamlining KYC" Webinar Recap

Earlier this month, Otto Mora from Polygon ID and Kim Duffy from DIF hosted a webinar titled "Credential Schemas: Streamlining KYC." We were thrilled with the enthusiastic participation and valuable feedback we received from attendees.

Throughout the webinar, we conducted interactive polls to gather real-time feedback from participants. Here's a recap of the key highlights, discussions, and feedback throughout the session.

Understanding Decentralized Identity

We began with an introduction to decentralized identity, explaining the technologies and principles that empower individuals to control their digital identities and personal data. Key benefits include:

- Privacy & Security: Ensuring data is shared only with necessary parties, protecting against misuse.
- Streamlined Onboarding: Facilitating efficient customer onboarding with trustworthy data.
- Bi-Directional Trust: Reducing fraud and fostering confidence in interactions.
- Trust & Transparency: Enhancing economic and social opportunities through new markers of trust and reputation.

We asked participants “In your opinion, what is the most impactful use case for decentralized identity?”. At 36%, “Enhancing privacy and security” took the lead, with “Streamlining onboarding processes” coming in second at 27%.

The responses to “Other” included interesting applications such as:

- Governance, membership, voting, and sybil resistance
- Virtual rights management (VRM)

And others added powerful cross-cutting differentiators of SSI including user-controlled identity and interoperability.

For the rest of the webinar, we focused specifically on the "streamlined onboarding" benefit.

Real-World Use Cases

We explored various use cases where credential schemas can make a significant impact in onboarding:

- KYC/KYB: Streamlining financial transactions with verified identity claims.
- AML: Reusable claims to help identify and protect against money laundering.
- Age Verification: Ensuring age-appropriate access to services and content.
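
To make the schema idea concrete, here is an illustrative sketch of what a shared KYC credential schema could look like as JSON Schema. The property names and constraints are assumptions for illustration, not a published DIF schema:

```typescript
// Illustrative JSON Schema for a basic KYC credential's subject.
// A sketch of the idea only, not the schema the DIF work item will publish.
const basicKycSchema = {
  "$schema": "https://json-schema.org/draft/2020-12/schema",
  title: "Basic KYC Credential (illustrative)",
  type: "object",
  properties: {
    credentialSubject: {
      type: "object",
      properties: {
        givenName: { type: "string" },
        familyName: { type: "string" },
        birthDate: { type: "string", format: "date" },
        nationalId: { type: "string" },
      },
      required: ["givenName", "familyName", "birthDate"],
    },
  },
};
```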

We asked participants which credential schemas they (and their companies) are interested in. KYC took the lead, which was not surprising given the webinar theme.

We had a freeform option to enter other credential schemas of interest, and participants responded with:

- Known customer credential
- Verifiable product-related sustainability information

For the rest of the webinar, we focused specifically on Decentralized Identity and credential schemas as they benefit KYC/KYB use cases.

Challenges in Traditional KYC/KYB

We reviewed challenges with traditional KYC/KYB processes, including:

- High Costs: Traditional methods are costly and resource-intensive.
- Compliance Complexity: Meeting compliance requirements like AML assurance is challenging.
- Time Inefficiencies: Current processes are time-consuming and require substantial manual effort.

The Role of Credential Schemas

We highlighted the importance of credential schemas as data templates that ensure consistency and interoperability across systems. The benefits include:

- Facilitating Innovation: Enabling innovators to focus on new applications rather than compatibility issues.
- Enhancing Interoperability: Ensuring credentials are broadly recognized across different systems.
- Encoding Best Practices: Capturing best practices and recommendations from experts.
- Boosting Efficiency: Reducing the need for custom integration work.

DIF's new Schema Work Item

We announced DIF's new schema work item, to be launched at the end of May, dedicated to credential schemas including:

- Basic KYC Model: For identity verification across financial services and other sectors.
- AML Schema: To comply with anti-money laundering regulations.
- Proof of Age: To verify age or age range for access to age-restricted content.
- Proof of Humanity: To confirm real human identities, useful for community governance and anti-fraud.

General Discussion

The discussions provided us with great insights; one key insight was that such schemas do not need to provide the complete answer – if they can shave off 80% of the verification work, then that's a win.

We also discussed the importance of schema discoverability to promote convergence, which we'll discuss more later.

Thank You for Your Participation

We extend our heartfelt thanks to everyone who joined the webinar and contributed to the conversation. Your feedback was invaluable and underscored the relevance and potential impact of credential schemas in streamlining KYC processes.

As discussed during the call, an interdisciplinary, holistic approach is key to the success of this effort, and we'd love to continue the discussion with you!

Contact membership@identity.foundation if you'd like to get involved. We are targeting an end of May start date. You can join the work item later, but getting involved now ensures we factor in your availability.


MyData

MyData goes to Brussels! Finishing our member consultation with OECD on the role of Trusted Data Intermediaries in enhancing individual agency and control

A group of MyData staff, members and friends co-hosted a workshop with OECD on the role of Trusted Data Intermediaries at this year's CPDP conference in Brussels. We were happy to be able to take our members along to help facilitate these important discussions, presenting case studies that highlight the importance of a human-centric approach. […]

Next Level Supply Chain Podcast with GS1

From TikTok to Checkout - How Social Commerce is Revolutionizing the Supply Chain with Wes Duquette, ShipBob

Social commerce is booming, and today’s conversation proves that if you haven’t considered selling on social media, then it’s time you did.

Wes Duquette is VP and GM of B2B and Retail at ShipBob, a tech-enabled 3PL that empowers small and medium businesses with advanced supply chain capabilities. In this episode, he speaks with Liz and Reid about the powerful impact of influencers and platforms like TikTok and Instagram in driving actionable purchases. And it’s no surprise that QR codes are revolutionizing the advertising landscape.

Wes shares invaluable insights on the critical importance of an omnichannel strategy, global expansion, and scalable infrastructure for brands aiming for growth. His pointers on barcoding, packaging, and back-office systems provide a stellar roadmap for future retail scalability. He discusses the evolving dynamics of direct-to-consumer channels and alternative marketplaces and explains why registering with giants like Amazon is crucial—even if you're a dedicated Shopify seller.


Key takeaways: 

- How influencers and platforms like TikTok are transforming retail engagement and driving omnichannel growth strategies for brands
- The operational scalability challenges faced by SMBs during retail launches, and strategies for effectively navigating high-volume demands and pivotal retail relationships
- Why critical infrastructure elements such as barcoding, packaging, and robust back-office systems are essential for ensuring future scalability and seamless integration across various retail channels and marketplaces


Connect with GS1 US:

Our website - www.gs1us.org

GS1 US on LinkedIn


Connect with guest:

Wes Duquette on LinkedIn

Check out ShipBob



FIDO Alliance

FIDO Alliance addresses accuracy and bias in remote biometric identity verification technologies with first industry testing and certification program

Face Verification Certification launched to bring confidence to the ID ecosystem amid rising online identity theft and bias concerns

May 29, 2024 – The FIDO Alliance announced today the launch of the first globally available certification program to test and certify the performance of remote biometric identity verification technology when verifying a user against a trusted identity document for accuracy, liveness, and bias. The Face Verification Certification program comes at a time of soaring demand for face biometric identity solutions and recognition of the importance of robust enrollment and identity re-binding processes to the overall security of online accounts. Dynamic Liveness, the science powering the iProov Biometric Solution Suite’s Remote Onboarding Solution and Authentication Solution, is the first product to pass the rigorous certification testing.

The certification program, consisting of 10,000 tests at a minimum, assesses a biometric system’s performance across different demographics, including skin tone, age, and gender. It measures resistance to spoof and deepfake attacks with Imposter Attack Presentation Accept Rate (IAPAR), and also assesses the usability and security of solutions by measuring False Reject and Accept Rates (FRR and FAR respectively). The certification also tests “selfie match” capabilities to ensure a user’s “selfie” matches their government-issued ID in the initial account setup process. 
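
For intuition on these measurements, here is a small illustrative calculation using the standard definitions of false reject and false accept rates; the counts are invented, and the program's real methodology and thresholds are defined by its test requirements:

```typescript
// Standard biometric error rates, computed from raw test counts.
// All numbers below are invented for illustration only.
const genuineAttempts = 5000;  // real users presenting their own face
const falseRejects = 25;       // genuine users wrongly rejected
const impostorAttempts = 5000; // spoof/deepfake/wrong-person presentations
const falseAccepts = 5;        // impostors wrongly accepted

const FRR = falseRejects / genuineAttempts;  // False Reject Rate (usability)
const FAR = falseAccepts / impostorAttempts; // False Accept Rate (security)

console.log(`FRR=${(FRR * 100).toFixed(2)}%  FAR=${(FAR * 100).toFixed(2)}%`);
// IAPAR is computed analogously, over presentation-attack attempts only.
```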

Combating bias and deepfake threats in biometric ID systems

Biometrics has ranked top as global consumers’ preferred way to log in and the method they think is most secure for the last two years. However, as governments and businesses globally roll out remote identity solutions, two urgent issues remain to address – bias in some biometric systems and new security threats. 

Organizations including NIST have been closely monitoring the disparities in performance for some time – with NIST's most recent evaluation of solutions across different demographics released this year. The issue of bias is tightly linked to brand reputation too; new research from FIDO Alliance released today has found 50% of American and British consumers said they would lose trust in an organization if its biometric system was found to be biased, while 22% said they'd stop using a service entirely. Similarly, generative AI's boom has also heightened security apprehensions about online verification; the same survey revealed over a third of consumers (37%) are more concerned about verifying themselves online due to the rising number of deepfakes. In ENISA's latest remote ID report, researchers observed that deepfake injection attacks are increasing in volume and sophistication, and that deepfake presentation and injection attacks remain the top two biometric attack types most difficult to mitigate.

Bringing trust to the ID ecosystem

Commenting on the news, Andrew Shikiar, Executive Director & CEO of the FIDO Alliance, said: “Remote identity solutions unlock huge benefits for governments, organizations, and consumers alike, but as appetite grows across the globe, there are understandable concerns mixed with excitement. Identity theft is rising, while bias in biometric systems has caused organizations to delay or reconsider implementations at a time when inclusivity and accessibility have never been more important.

“Certification unlocks the power of open standards and catalyzes ecosystem-wide innovation and opportunity. With iProov’s market-first certification for biometric face verification now completed, we look forward to serving additional providers who understand the value of independent, accredited lab testing. This new certification program provides a launchpad that enables all stakeholders to fast-track deployments that are robust enough for the modern threat landscape and work well for everyone, anywhere in the world.” 

Leading biometrics solutions provider iProov has become the first vendor to complete the rigorous certification process. iProov provides market-leading biometric solutions that protect the world's most security-conscious organizations from deepfakes and other types of identity fraud. Andrew Bud, founder and CEO at iProov, said: "Biometrics are a powerful tool that organizations can utilize to facilitate secure, inclusive, and user-friendly interactions online. Each of these three fundamental components must be given equal consideration as organizations evaluate their options. With the FIDO Face Verification Certification program, organizations now have a trusted compass for navigating these decisions. We applaud The FIDO Alliance for addressing the importance of biometric identity verification to strengthen the full user identity lifecycle. Independent certification creates a much-needed quality benchmark for this evolving technology and further demonstrates our ability to provide trusted identity assurance in an age of AI threats and identity fraud."

Testing requirements are built upon proven ISO standards and are developed by a diverse international authority of stakeholders, including industry, government, and independent subject matter experts. Participating vendors can benefit from identifying gaps in product performance and demonstrating clearly to the market their solutions can be trusted, which can reduce individual testing needs and boost adoption. Two independent labs are currently accredited to support this certification – Ingenium Biometrics and TÜV Informationstechnik (TÜV NORD GROUP) – with more expected to follow later this year. 

The program expands upon the Alliance’s existing Biometric Component Certification and Document Authenticity (DocAuth) Certification programs and demonstrates FIDO’s ongoing commitment to meet marketplace demand and address evolving threats with third-party certifications. Combined, these programs provide unrivaled end-to-end assurance to implementing organizations, consumers and vendors and support the world’s migration to more secure digital verification systems and passwordless security.


FIDO Alliance Releases New Design Guidelines for Optimizing User Sign-in Experience with Passkeys

May 29, 2024 – The FIDO Alliance today released new design guidelines to help accelerate passkey adoption and deployment. 

The FIDO Design Guidelines aim to help online service providers design a better, more consistent user experience (UX) when signing in with passkeys.

The guidelines are developed for designers, engineers, product managers, content strategists, and UX researchers to use for reference and guide their initial implementation of passkeys and expansion of passkey support over time.

The new guidelines are available at https://fidoalliance.org/design-guidelines/

“As organizations are increasingly deploying passwordless authentication based on FIDO standards around the world, the end users of passkeys – along with the practitioners implementing them – have become top priorities for successful adoption,” said Andrew Shikiar, Executive Director and CEO of The FIDO Alliance. “Our research shows consumers and employees are adopting phishing-resistant passkeys at a rapid pace while relying organizations are experiencing cost savings and fewer security incidents.  By continuing our investment in the evolving user experience, the FIDO Alliance is committed to ensuring brands have a consistent and accessible set of guidelines that are fully aligned with design best practices and FIDO technology requirements. We encourage online service providers everywhere to use these publicly available guidelines to enhance the user experience and enjoy greater success with FIDO passkey deployment and adoption.”

Following the first release of FIDO UX guidelines for passkeys in 2022, the 2024 Design Guidelines have been updated and optimized for service providers evaluating and deploying passkeys.

The 2024 Design Guidelines are organized into five sections to provide clear guidance, confirm design principles, and offer flexibility:

User experience research: Provides confidence that the guidelines are informed by design research

Principles: 10 UX principles and 3 content principles for passkeys that are core to any passkey implementation

“Get started” design patterns: Patterns are the heart of the guidelines, containing self-contained experiences that can be combined to match unique business needs

Optional design patterns: Patterns that can be added after the “get started” patterns over time

Resources: Provides additional resources like events, Figma UI kits and community groups to jump-start work with passkeys

The FIDO UX Working Group created the guidelines and comprises 131 UX researchers, designers, and PMs from 31 global brands. The guidelines were created in partnership with usability research firm Blink UX – with added underwriting support from 1Password, Dashlane, Google, HID, Trusona, U.S. Bank, and Yubico.

Hear More about the Design Guidelines

Learn about the 2024 Design Guidelines at Identiverse 2024 in Las Vegas, May 28-30, 2024, or visit the FIDO Alliance website.

For a deeper dive, join these sessions from the upcoming Design Guidelines for Passkeys Webinar Series: 

June 11 | 2:00 PM ET | Essentials for Adopting Passkeys as the Foundation of Your Consumer Authentication Strategy
June 18 | 2:00 PM ET | Aligning Authentication Experiences with Business Goals
June 25 | 2:00 PM ET | Drive Revenue and Decrease Costs with Passkeys for Consumer Authentication
July 2 | 2:00 PM ET | Design Guidelines for Passkeys: Ask Us Anything!

Registrants of this webinar series will have access to all events both live and on-demand after they air. To register, click here.


Remote ID Verification – Bringing Confidence to Biometric Systems Consumer Insights 2024

How can we verify users are genuine in an online world?

Online identity theft has steadily risen in recent years, while the generative AI boom has driven a new wave of deepfake-powered attacks that threaten remote enrollment and identity security. Meanwhile, bias in biometric systems has been monitored for some time, varying significantly across solutions and impacting consumer trust and perception of the technology.

As the adoption of remote identity verification technology rises, two critical concerns and challenges need to be addressed: bias and new security threats.

The FIDO Alliance recently sponsored an independent study of 2,000 respondents across the U.S. and the U.K. to understand consumer perception towards remote identity verification, online security, and biometrics. This eBook reveals those insights on remote biometric face verification, including how many people have used biometric face recognition successfully, and their opinions on verification accuracy, potential discrimination, and concerns about deepfakes.

Key findings

Consumers want to use biometrics to verify themselves online more, especially in sensitive use cases like financial services, where roughly one out of two people (48%) said they would use biometric technology.

One in four (25%) feel they experience regular discrimination when using automated facial biometric systems.

Equity in biometric systems is vital to trust: half (50%) say they would lose trust in a brand or institution found to have a biased biometric system, and one out of five (22%) say they would stop using the service entirely.

Over half of respondents (52%) are concerned about deepfakes when verifying identities online.

Read the full results of the survey in this eBook to learn about consumers’ experiences with biometric face verification technologies, and discover how organizations can improve global digital access, authentication, and security when leveraging remote identity verification technologies.

Download the eBook

Tuesday, 28. May 2024

Me2B Alliance

Digital Harms Dictionary 2.0

A deep dive into online harms and what you can do about them.

Open Excel

The post Digital Harms Dictionary 2.0 appeared first on Internet Safety Labs.


We Are Open co-op

The Business Case for Working Openly and Transparently

Greater agility, faster innovation, increased engagement

Image CC BY-ND Visual Thinkery for WAO

Excuse us as we go full on bizniz mode for the rest of this post. We’re quite serious about the benefits of openness and have built our cooperative as well as our individual careers on the fact that Open Source is about more than just code. What follows is the “business case” for shifting your organisation to a more open model. We’ve borrowed liberally from the Open Organization community, a group of people (including one of our members) who have written about the models and behaviours inside of FOSS for many years.

The evidence is clear: open working leads to greater agility, faster innovation, and increased engagement. Members of your organisation are more capable of working toward goals in unison and with shared vision. Ideas from both inside and outside the organisation receive more equitable consideration. Members clearly see connections between their particular activities and an organisation’s overarching values, mission, and spirit.

These advantages translate directly into better business performance and a stronger bottom line. If you want your organisation to obtain better results with the resources you currently have available, then embracing open working practices is one of your best paths toward sustainable success.

But don’t just take our word for it:

Agility: According to McKinsey & Company, organisations that adopt information transparency across their operations can achieve up to 30% gains in efficiency and customer satisfaction.

Innovation: Harvard Business Review reports that companies that support open and collaborative working environments can see their innovation cycles speed up by as much as 30%.

Engagement: Research by Gallup reveals that highly engaged business units realise a 41% reduction in absenteeism, a 17% increase in productivity, 10% higher customer ratings, a 20% increase in sales, and 21% higher profitability.

5 Characteristics of an Open Organisation

Open Organisations are defined as exhibiting the following characteristics:

Image CC BY-ND Visual Thinkery for WAO

Let’s explain what’s meant by each of these:

Transparency

This is fundamental to open organisations: everyone involved in a project can access all relevant materials by default. As a result, team members are all able to review, assess, and contribute effectively to the work being done. Moreover, transparency means team members are not only informed about decisions and processes but encouraged to participate in discussions and provide feedback. Both successes and failures are discussed openly, meaning that valuable lessons can be learned, goals can be clarified, and roles well defined. All of this enhances overall organisational accountability.

Inclusivity

Open organisations actively welcome diverse viewpoints and ensure that everyone has a voice in the future of both the project, and organisation as a whole. This can be achieved not only through well-established technical channels but also through social norms that encourage a range of perspectives. It’s important that there are clear protocols for participation so as to promote an inclusive environment where feedback is valued. Leadership for inclusivity includes proactively including overlooked voices and ensuring that all relevant opinions are heard and considered. This approach enriches the decision-making process and also reinforces a sense of duty among employees to contribute meaningfully to discussions about their work.

Adaptability

The adaptability of open organisations is characterised by their flexibility and resilience, based on collective problem-solving and collaborative decision-making. To allow true adaptability, feedback mechanisms need to be accessible to both internal and external members. This allows for peer support and agency, which can have a clear impact on operational methods. This orientation towards continuous learning and the readiness to adapt based on feedback helps organisations avoid repeating mistakes. It also helps them stay responsive to the changing needs of their environment.

Collaboration

In open organisations, collaboration is the default mode of operation. There is a general belief that working together produces superior outcomes, not only in project work but also in terms of improving the organisation itself. Ensuring a spirit of openness means that work is not only shared within the organisation but is also available for improvement by external parties, leading to greater innovation and continuous improvement. The ease with which people can discover, provide feedback on, and join ongoing work is a useful barometer of how welcoming and collaborative an open organisation is compared to others.

Community

The community aspect of open organisations is based on shared values and principles that guide participation and decision-making. These values should be clear to all members and help define the organisation’s boundaries and success criteria. Team members within this community approach are empowered to make significant contributions based on a common ‘language’ that prevents miscommunication. Sharing knowledge and experiences is encouraged to further the group’s work, which strengthens the communal bond and ensures collective progress towards organisational goals.

As you can see, these characteristics are interconnected and intertwined, leading to a virtuous cycle of openness.

Practical Steps Towards Openness

Image CC BY-ND Visual Thinkery for WAO

Ultimately, working openly is about cultural change. A change in attitude as much as a change in processes.

To transition to more open working practices, consider the following steps:

Clear guidelines: Work together to document and share clear guidelines on what open working could look like in your organisation. This includes how information is shared, how feedback is given and received, and how decisions are made.

Culture of transparency: Encourage leaders and team members to share their work and the reasoning behind their decisions openly. This can be facilitated through regular meetings, shared digital workspaces, and open forums where ideas and projects are discussed transparently.

Inclusive communication: Ensure that all team members have equal access to communication tools and platforms. Use a variety of channels to accommodate different working styles and preferences, and encourage everyone to contribute to discussions and decision-making processes.

Collaboration across boundaries: Use technology and collaborative tools to break down silos within your organisation. Encourage teams to work together across departments and even invite external partners or customers to contribute to projects where appropriate.

Community engagement: Create spaces (physical/virtual/hybrid) where employees can interact, share ideas, and support each other. Regular community-building activities and team-building exercises can strengthen the communal bonds and enhance the shared sense of purpose.

Addressing Potential Challenges

Image CC BY-ND Visual Thinkery for WAO

While transitioning to an open working environment offers many benefits, it also comes with challenges that need to be managed. For example, information overload can be avoided by sharing information in a structured and digestible way so as not to overwhelm team members. Try using tools that allow for information to be categorised and easily searched.

While transparency is to be encouraged, clearly define sensitive information and ensure that robust security practices are in place to protect it. Provide training to team members on data protection standards and the importance of respecting privacy and confidentiality.

While open working encourages broader participation, this needs to be balanced with the need for efficiency. So set clear objectives and deadlines for collaborative projects to avoid decision-making bottlenecks or delays.

Some team members may resist changes to traditional ways of working. Overcome such cultural resistance by demonstrating the tangible benefits of open working practices through pilot projects and sharing success stories. Provide training and mentoring to help adaptation to the new ways of doing things.

Conclusion

Image CC BY-ND Visual Thinkery for WAO

The shift towards working openly and transparently is a strategic move that can significantly enhance an organisation’s agility, innovation, and engagement. By encouraging a culture of openness based on transparency, inclusivity, adaptability, collaboration, and community, organisations can create an environment where every member feels valued and empowered to contribute.

This approach not only drives better business performance but also cultivates a sense of shared purpose and commitment among employees. Embracing open working practices isn’t just an ‘initiative’ but a pathway to sustainable success and a robust bottom line. By implementing clear guidelines, engendering a culture of transparency, ensuring inclusive communication, promoting collaboration, and engaging the community, organisations can navigate potential challenges and realise the full benefits of open working.

As you would expect of an organisation called We Are Open, our cooperative can help you and your organisation on this journey. Get in touch, or take our free email-based course, to find out more!

The Business Case for Working Openly and Transparently was originally published in We Are Open Co-op on Medium, where people are continuing the conversation by highlighting and responding to this story.

Monday, 27. May 2024

IDunion

IDunion SCE network infrastructure open for commercial use

IDunion SCE, the neutral and democratically organised European cooperative created from the IDunion research consortium, has been offering their managed network infrastructure for industrial and productive use since April 2024.

As a neutral basic infrastructure, the distributed and redundant network enables every company to independently store and manage identity information and all types of company-related data. Companies can store unique and distinctive company identity-related identifiers (DID – Decentralised Identifier) on the IDunion network, which are permanent, unchangeable and verifiable and do not require the use of a central register. The identifiers can be linked to user-defined company data or company attributes. All company data can be managed according to predefined parameters and can of course also be revoked. IDunion SCE makes the identifiers available to companies as required and assigns corresponding write authorisations for network use on an individual contractual basis.
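
For illustration only, here is a minimal sketch of the kind of W3C-style DID document such a company identifier might resolve to. Every value below is invented for the sketch, including the method namespace shown; actual documents anchored on the IDunion network will differ.

// Illustrative only: a minimal W3C-style DID document. All identifiers,
// keys, and endpoints below are made up for the sketch.
const didDocument = {
  "@context": ["https://www.w3.org/ns/did/v1"],
  id: "did:indy:idunion:123456789abcdefghi",            // hypothetical company DID
  verificationMethod: [{
    id: "did:indy:idunion:123456789abcdefghi#key-1",
    type: "Ed25519VerificationKey2018",
    controller: "did:indy:idunion:123456789abcdefghi",
    publicKeyBase58: "placeholder-key-material"          // not a real key
  }],
  authentication: ["did:indy:idunion:123456789abcdefghi#key-1"],
  service: [{
    id: "did:indy:idunion:123456789abcdefghi#company-data",
    type: "LinkedDomains",                               // example of linked company attributes
    serviceEndpoint: "https://example-company.example"
  }]
};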

The network is based on the open source software Hyperledger Indy. All network nodes are operated exclusively by cooperative members in accordance with the principles of the IDunion SCE and monitored by the latter as a neutral body. The IDunion SCE and the infrastructures it manages and supports are subject to the principles of openness, neutrality, sovereignty, transparency, integrity and cooperative competition. All members of the cooperative have the same voting rights, regardless of their company size. This ensures that all participating companies can have an equal influence on the technical and organisational development of the IDunion SCE and thus the ecosystem. Companies can join the European cooperative as members at any time.


MyData

As a patient, the main reason to be in charge of my health data is to ensure its QUALITY

Patients – and citizens – should be in charge of their data to ensure correctness, completeness and accuracy, optimizing continuity of care from their providers and effective generation of datasets for public health innovation & policy-making. In a previous blog post arguing for an individual-centric European Health Data Space (EHDS), we presented the comments from […]

Identity At The Center - Podcast

Identiverse 2024 starts tomorrow! We did a little pre-conference tailgating

Identiverse 2024 starts tomorrow! We did a little pre-conference tailgating on this episode of the Identity at the Center podcast to discuss our plans for the conference before closing out with a quick discussion of the Hypr State of Passwordless Identity Assurance report. Watch the episode at https://www.youtube.com/watch?v=y7B9u9H-cN8 and check us out at idacpodcast.com

#iam #podcast #idac #identiverse2024

Friday, 24. May 2024

FIDO Alliance

CXMToday: Visa Unveils Card Updates

Built on the latest Fast Identity Online (FIDO) standards, the Visa Payment Passkey Service confirms a consumer’s identity and authorises online payments with a quick scan of their biometrics like a face or fingerprint. When shopping online, Visa passkeys replace the need for passwords or one-time codes, enabling more streamlined, secure transactions.


Identity Week: State of Michigan’s MiLogin supported by FIDO passkeys

The system leverages passkeys based on FIDO authentication promoting strong authentication, unifying Michigan’s approach to cybersecurity and improving the user experience.

The State of Michigan aimed to address several key objectives with the integration of passkeys, fortifying security and enhancing the digital user experience to access critical state government services.


FindBiometrics: Visa Brings Passkeys to Online Payments in Major FIDO Victory

Visa has introduced passkeys to the payment industry, enabling customers to authorize online purchases through a biometric scan on their smartphones or computers when making a purchase online.

This capability is powered by the Visa Payment Passkey Service, which is built on Visa’s Fast Identity Online (FIDO) server. The service allows merchants to integrate the Visa Payment Passkey Service into their checkout systems without needing to establish their own servers, thereby simplifying the setup process.

For users, this means they can use the same biometric authentication methods they use to unlock their devices to approve Visa payments online, with a one-time enrollment required during checkout. Visa also plans to extend enrollment options to banking apps in the future.

The development of passkeys was a collaborative effort among major technology companies such as Apple, Google, and Microsoft, which joined forces around 2012 to form the FIDO Alliance. This group aimed to overcome the limitations of traditional passwords by creating open standards for more robust authentication, involving biometric experts like HYPR and Nok Nok Labs.

FIDO released its first standards in 2014, setting the stage for authentication methods that do not depend on passwords. Subsequent advancements led to the establishment of the WebAuthn standard in 2019, which quickly gained acceptance among major web browsers. This progress facilitated the creation of passkeys, leveraging FIDO protocols to link authentication credentials to users’ mobile biometrics.
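
As a rough sketch of what this looks like in practice, the browser-side half of a passkey sign-in is a single call to the standard WebAuthn API. The relying-party ID and challenge handling below are illustrative placeholders; a real deployment obtains both from its server.

// Sketch of a passkey assertion via the WebAuthn browser API.
// "merchant.example" and the challenge plumbing are illustrative only.
async function signInWithPasskey(challengeFromServer) {
  const assertion = await navigator.credentials.get({
    publicKey: {
      challenge: challengeFromServer,   // random bytes issued by the server
      rpId: "merchant.example",         // hypothetical relying-party ID
      userVerification: "required",     // prompts the device biometric check
      timeout: 60000
    }
  });
  // The signed assertion goes back to the server, which verifies it
  // against the public key registered at enrollment.
  return assertion;
}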

Visa’s recent move has been welcomed by supporters of FIDO and passkeys. HYPR’s co-founder and CEO, Bojan Simic, commented on the development in an online post, stating that nearly every regulated business he has interacted with in the past year has included a passkey initiative in its plans. “I’m so proud of the work that we have all done at the FIDO Alliance to make this a reality. When we wrote the first FIDO implementation in 2014 here at HYPR, seeing the top brands adopt the standard in a major way seemed like fantasy.”

In making this vision a reality, Visa joins several other prominent companies that have recently introduced support for passkeys, including PayPal, Samsung, and Amazon.

Thursday, 23. May 2024

Trust over IP

ToIP at EIC and Beyond: A Summer of Not-to-Be Missed Sessions

Spring conference season is in full swing and the Trust Over IP Foundation (ToIP) is heading to Europe to participate in several major conferences:

European Identity and Cloud Conference (EIC) — June 4-7, Berlin, Germany
Identity Week Europe — June 11-12, Amsterdam, Netherlands
Digital Identity unConference Europe (DICE) — June 18-20, Zurich, Switzerland

We will also be attending the second regional summit of the Sustainable & Interoperable Digital Identity (SIDI) Summit 2024 series.

Trust Over IP, which celebrates its fourth birthday this month, has long been known for its dual stack that emphasizes why we must combine technical interoperability with governance interoperability in order to establish truly interoperable digital trust ecosystems.

Although in our early history we were best known for our work on the governance side of the dual stack, at these June conferences our emphasis will be on two new technical specifications for which we just published Implementers Drafts: the Trust Spanning Protocol and the Trust Registries Protocol. Both of these protocols are designed to maximize interoperability and authenticity and minimize risk, while still enabling each ecosystem to make the choices they need for their specific context.

The following is a schedule of all the conference sessions and talks in which ToIP members or our partners will be participating.

Join us at EIC for a panel of ToIP Steering Committee Members, moderated by Executive Director, Judith Fleenor.

Panel: The Emerging Trust Layer for the Internet: Using Minimum Viable Protocols to Achieve Maximum Interoperability

Friday, June 07, 2024 13:30—14:30

Interoperability is not created alone, and that is why ToIP collaborates with other standards organizations and certification bodies. Executive Director Judith Fleenor has invited the Executive Directors from Open Identity Exchange, OpenID Foundation, Open Wallet Foundation, Decentralized Identity Foundation, and the Kantara Initiative to unpack the role each foundation plays, how we collaborate to create the full picture for digital trust, and why it is so important for organizations and individuals to support and participate in several foundations, not just one.

Panel: Executive Directors Speak: Why We Need More Collaboration

Friday, June 07, 2024 14:30—15:30

At last year’s EIC, Judith gave a keynote on Decentralized Identity: Why is it all the rage? Her premise was that the technologies that enable decentralized identity could be used not only for traditional identity and access management but, more importantly, for content authenticity and to assist with the challenges that AI brings. This year, AI will be a big topic at EIC, and several ToIP Steering Committee members will be addressing these topics in various sessions throughout the week.

On Wednesday, Wenjing Chu, Futurewei Technologies, and Drummond Reed, Gen Digital, will be on the panel:

Redefining Human Identity in the AI Era: From Digital Evolution to AI Integration

Wednesday, June 05, 2024 12:00—13:00

Following that, Wenjing Chu will give the session:

Decentralizing AI: A Socio-Technical Path to Responsible and Trustworthy Tech

Wednesday, June 05, 2024 14:30—15:00

Continuing in the AI track, Wenjing will be joined by others for the following panel discussions:

Is Decentralization The Best Way to Achieve Transparency and Verifiability in AI?

Wednesday, June 05, 2024 15:00—15:30

Charting the Course: Diverse Approaches to AI Regulation in a Fractured World

Wednesday, June 05, 2024 15:30—16:10

AI isn’t the only opportunity-versus-threat we are facing in the near future; quantum computing is coming as well. Also on Wednesday, Steve McCown, Anonyome Labs, will be giving a presentation:

Post Quantum Security: Cryptography in Decentralized Identity

Wednesday, June 05, 2024 18:10—18:30

Another big topic at EIC will be the use of verifiable credentials and the digital wallets that hold them. Several sessions will explore these topics and give real-world examples of their use in production.

On Wednesday, Marie Wallace, Accenture, and Drummond Reed, Gen Digital, will be joined by others for a panel:

Real-World Examples of How Smart Wallets will Transform how we Navigate our Digital World

Wednesday, June 05, 2024 14:45—15:30

On Thursday, Daniel Bachenheimer, Accenture, will join others looking at the use cases in the travel industry.

Panel: The Future of Travel Credentials

Thursday, June 06, 2024 16:00—16:30

Also on Thursday, the topic of real-world implementations continues when Andre Kudra, esatus, and others explore use cases in the construction industry:

Best Practice: DIDs and Verifiable Credentials in the Construction Industry

Thursday, June 06, 2024 17:30—17:50

And on Friday, Marie Wallace, Accenture, will join others in a panel discussion:

Decentralized Identity in Production

Friday, June 07, 2024 10:30—11:10

The above session will explore real-world solutions being used now, but before selecting which technologies to use for your ecosystem and wallets, it is important to know the differences each technology solution has to offer. Our friends at the Open Wallet Foundation (OWF) have put together a guide to help explain wallet security. Daniel Bachenheimer, Accenture, will present with another OWF board member on:

Open Wallet Foundation (OWF) Guide to Safe Digital Wallet Selection

Thursday, June 06, 2024 12:00—12:20

Daniel will deepen the conversation about Digital Wallets in his Friday session on how you bind a wallet to the natural person whose credentials it holds:

Digital Wallet Holder Binding

Friday, June 07, 2024 11:30—11:50

Another hot topic at this year’s EIC will be Organizational Identity.  Christoph Schneider, GLEIF, will be giving a presentation on this topic, as will ToIP active contributor, John Phillips, Sezoo:

Organizational Identity in the Private and Public Sector with the vLEI

Wednesday, June 05, 2024 18:15—18:30

Organisation Authentication as an Anti-Scam Measure

Friday, June 07, 2024 12:10—12:30

Let us not forget the importance of eIDAS 2.0. Andrew Tobin from Gen will kick off the track Digital ID Beyond the Enterprise, Session Stream III, Digital Identity and eIDAS 2.0, on Wednesday morning with a keynote address:

eIDAS 2.0: Game On, But What Game Is It?

Keynote

Wednesday, June 05, 2024 08:30—08:50

Within that track many of our friends will be sitting on panels and giving presentations, but not to be missed will be Viky Manalia, Intesi Group, who will be on the panel:

Latest on eIDAS Legislation and What it Means for People, Business and Government

Wednesday, June 05, 2024 11:00—11:40

Also in that track several of our ToIP contributors will speak on the panel:

Reusable Identity and Bootstrapping Decentralized Identity Ecosystems

Thursday, June 06, 2024 15:30—16:00

ToIP has spent the last six months collaborating with the Content Authenticity Initiative, C2PA, and the Creator Assertions Community Group, which Eric Scouten of Adobe heads. This presentation should be interesting to those concerned about the authenticity of content.

Content Authenticity Overview and eIDAS Investigation: How Can eIDAS Support Content Creators?

Wednesday, June 05, 2024 14:30—14:45

And it wouldn’t be right if we didn’t give a shout out to our friends at the Decentralized Identity Foundation who are presenting a pre-event workshop:

Decentralized ID Technical Mastery Sprint for IT Architects and Project Managers

Tuesday, June 04, 2024 10:30—12:30

This year’s EIC will be jam-packed. We hope to see you at some of our sessions, or grab us for a conversation on the river cruise or at one of the breaks.

Look for our members wearing the “Ask Me about Trust Over IP Foundation” button, as we would love to discuss any of our published deliverables or get you involved in the creation of our work towards Internet-Scale Digital Trust.  Join Us!

If you are not attending EIC, look for us the following week at Identity Week Europe in Amsterdam, where we will be on the Decentralized Identity panel on June 12th.

Or join us at the Digital Identity unConference Europe (DICE) in Zurich, June 18-20, where we will be calling several sessions on everything from our KERI specifications, recently in public review, to our Trust Spanning Protocol, Trust Registry Protocol, and Technical Architecture Specification V2. Executive Director Judith Fleenor will also be presenting an Introduction to the Trust Over IP Foundation if you are new and interested in getting involved.

The post ToIP at EIC and Beyond: A Summer of Not-to-Be Missed Sessions appeared first on Trust Over IP.


Project VRM

Personal AI +/vs Corporate AI

You’re reading this on a machine with an operating system: Linux, Windows, MacOS, iOS, or Android.

But that’s not your OS. It’s your machine’s.

How about one for you, that runs on your machine but is entirely yours? Let’s call it a Personal OS, or a POS.

The POS will have a kernel onto which abilities (not just applications) can be added. An extreme example of how this might work is Neo learning jiu jitsu in The Matrix:

That OS amplified Neo’s own intelligence, in his own head. We’re far from that today. But we can at least add abilities to a POS of our own. Those too can give us more agency of many kinds.

To my knowledge, there is only one POS so far. It’s called pAI-OS (GitHub code), and it’s led by Kwaai.* (If others are doing the same, let me know and I’ll talk those up too.) And it is built to run our own AIs. Let’s call them PAIs, where the A can mean amplified or augmented (sourcing Doug Engelbart for the latter).

So, what kind of abilities are we talking about?

Let’s start with something that could hardly be more mundane and important: memory.

In Laws of Media, Marshall McLuhan said (five decades ago) that computing promises “perfect memory—total and exact.” For many millennia, our species has been outboarding memory through speech, the written word, and collecting all of that in libraries and museums. And now, in the digital age that dawned with microcircuits and the Internet, we occupy a digital world where everybody can publish whatever they want. To peruse that, we made search engines. Those ruled from the late ’90s until approximately yesterday, when AIs took over servicing our interest in answers to questions. Google, Microsoft, ChatGPT, Perplexity.ai, and others have moved into a space we might call AI answerware.

Running all that answerware are corporate AIs. Let’s call them CAIs. Nothing wrong with CAIs, but also nothing personal, because they are not ours. I explain the difference in Personal vs. Personalized AI. Here’s a graphic from that post showing a bit of what abilities might run on your PAI:

PAIs can extend our own memories by accumulating personal stuff we need to know better, and our ability to meet, access, and use the external abilities of the CAI world. So we’ll have our agents + their agents, working together.

For an example of how that might work, take a look at The most important standard in development today: P7012: Standard for Machine Readable Personal Privacy Terms, which “identifies/addresses the manner in which personal privacy terms are proffered and how they can be read and agreed to by machines.” After seven years with a working group, it is now in the IEEE editing and approval mill, edging toward becoming a finished standard by next year. It works like this:

Here your agent (a PAI, represented by the ⊂ symbol) proffers your privacy terms (here is one example) to a corporate agent (which might or might not be a CAI, but is still represented with the reciprocal symbol ⊃). (This should be familiar to ProjectVRM veterans as the r-button. We may finally get to use it!)

The ceremony here is the exact reverse of what we have today with the cookie popovers on most website home pages. This can and should be done ⊂ to ⊃. So should signing and recording the agreement, or the site’s refusal, should it tell you to screw off. (An agent running on your PAI will record that diss.)
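
P7012 is not yet final, so there is no published wire format. Purely as a sketch of the ceremony just described, an exchange between the two agents might look something like this; every field and function name here is hypothetical and does not come from the draft standard.

// Hypothetical sketch of the ⊂-to-⊃ ceremony described above. None of
// these names come from the P7012 draft; they only illustrate the flow.
const proffer = {
  term: "NoStalking",                                   // example personal term
  plainLanguage: "Just show me ads not based on tracking me.",
  requires: { thirdPartyTracking: false, retentionDays: 30 }
};

// The site's agent evaluates the proffered terms against its own policy;
// either way, both agents sign and record the outcome for accountability.
function respondToProffer(proffer, sitePolicy) {
  const agreed =
    sitePolicy.thirdPartyTracking === false &&
    sitePolicy.retentionDays <= proffer.requires.retentionDays;
  return { proffer, agreed, recordedAt: new Date().toISOString() };
}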

I also bring this up because it will be a key required ability—not just for you and me but for the world, starting with Europe, where the GDPR lists six lawful bases for processing personal data. They begin—

(a) Consent: the individual has given clear consent for you to process their personal data for a specific purpose.
(b) Contract: the processing is necessary for a contract you have with the individual, or because they have asked you to take specific steps before entering into a contract.

By now everyone knows that (a) Consent has failed. It’s an expensive and meaningless dance, with high cognitive (mostly cynical) overhead, and almost no accountability. Now they’re ready for (b) Contract, especially in ceremonies where the individual (not a mere “user”) takes the lead.

I believe there are fewer limits to what each of us can do with a PAI than to what we can do with a laptop or a phone, because our PAI is our own. It runs on a deeper machine OS, but is not limited by that. Your PAI, running on your POS, may prove to be the first truly personal layer ever put on a machine OS.

*Full disclosure: I am now the Chief Intention Officer there. At this stage, it’s a voluntary position.


Digital Identity NZ

Regulation, Regulation Regulation Everywhere | May Newsletter

Kia ora,

Digital ID has been increasingly in the news lately, as Australia begins to ramp up development of its systems pilots following the passing of legislation. In Aotearoa there’s been a lot happening too, as Minister Collins opened this year’s Summit, where DINZ took the stage later.

Contained within the avalanche of consultations posted in March and early April was the second revision of the rules that support NZ’s own Digital Identity Services Trust Framework (DISTF) Act. The Department of Internal Affairs (DIA) presented on it to a packed virtual lunchtime audience earlier this month. Digital Identity NZ’s DISTF Working Group, comprising both private and public sector participants, responded in detail, reflecting the overall message that, despite good intentions to improve security, data protection, and privacy to reduce online fraud, there are a lot of moving parts for stakeholders to consider. The fact that it is ‘opt-in’ for stakeholders, and that alternative/non-digital channels should always be offered to people and organisations because it is not compulsory, should help provide time to become comfortable and confident with it.

The DINZ Biometrics Special Interest Group made a deeply considered submission on the Office of the Privacy Commissioner’s (OPC) exposure draft of a Code of Practice for biometrics under the Privacy Act 2020. We continue to assert that a code is premature, with potential for unintended consequences, whereas well-drafted, clear guidance at the outset will deliver better outcomes for all New Zealanders. As if the above two were not enough, the CDR/CPD bill, which also has a critical digital identity component, is set to be introduced to Parliament.

Since the CPD and the Biometrics Code (should it be implemented) are mandatory for specific digital ID service providers undertaking particular identity-related processes, whereas the DISTF is opt-in, I sense that the weight of regulatory burden will result in tough choices for the industry in this small market, despite the good intentions of all stakeholders. In my view, we’ll have to be conscious of the potential for market stall. Could the CDR/CPD bill help unify our data and streamline our sector and service laws to better suit our market size?

Meanwhile, sectors continue to analyse the role of digital identity and the potential to operate schemes within the DISTF, with financial services leading the way. This is demonstrated so clearly by the Commerce Commission’s consultation on personal banking, which highlights the importance of robust digital identification for frictionless bank-switching and faster progress towards Open Banking.

With the above very much in mind, Payments NZ and DINZ co-hosted a highly productive investigative sprint earlier this month across its combined membership. We brought together experts, representatives from the private, public and NGO sectors, and heard inspiring presentations from Australia’s ConnectID and our very own MATTR.

The research and education sector has been deeply engaged with digital identity for many years and continues to lead on many fronts. Learn more at our lunchtime webinar on Thursday 13 June, simply titled ‘Digital Identity, Higher Education, Aotearoa’, when DINZ member Middleware showcases fellow DINZ member University of Auckland.

The following week, DINZ and FinTechNZ members with a general understanding of ‘digital cash’ should diary note 20 June for a ‘town hall’ styled dynamic submission with The Reserve Bank of New Zealand – Te Pūtea Matua on the criticalities (including digital ID) surrounding ‘digital cash’ – a Central Bank Digital Currency as a potential alternative to physical cash. 

To close this month’s newsletter I’d like to give a big shout-out for the Digital Trust Hui Taumata. Please get your tickets early on the Members 2-4-1 super saver. At the time of writing, our sponsor list has grown to include Authsignal, JNCTN, Middleware, NEC NZ, Payments NZ, Worldline and Xebo, as well as DINZ showcasing its own thought leadership.

With more sponsors and more speakers committing every week, this event is sure to be ‘simply the best’.

Ngā mihi

Colin Wallis
Executive Director, Digital Identity NZ

Read full news here: Regulation, Regulation Regulation Everywhere | May Newsletter

SUBSCRIBE FOR MORE

The post Regulation, Regulation Regulation Everywhere | May Newsletter appeared first on Digital Identity New Zealand.

Wednesday, 22. May 2024

Ceramic Network

Data Feed API is out now: build custom indexers on Ceramic

From the very beginning, the Ceramic network was meant to be an open and customizable platform for developers to build on. The core Ceramic team is excited to keep delivering on that promise and launch Ceramic’s new Data Feed API.

The Data Feed provides a new API for Ceramic that allows developers to receive notifications of updates to JSON documents that are managed via Ceramic. This feed of document updates allows developers to customize the way their data is indexed and queried, and enables the development of new custom database products on top of Ceramic. Databases built on top of the Data Feed API get the advantage of powerful querying capabilities custom-tailored to the needs of the application, while retaining interoperability and composability of the underlying data with all other databases built on the Data Feed API throughout the Ceramic ecosystem.

What is the Data Feed API?

The Data Feed API enables developers to keep track of all the new state changes happening to documents in the Ceramic network, as they are observed on a Ceramic node. These state updates can either be from writes sent to the Ceramic node through the HTTP Client, or writes discovered from the network for Streams belonging to Models that are indexed on the Ceramic node. Ultimately, the Data Feed API can be used to build custom indexers and other database products, e.g., OrbisDB.

Supported API Methods

The Data Feed is a read-only API, allowing you to retrieve the feed data through HTTP GET methods. The Data Feed operates on the aggregated state of documents (such as Models and ModelInstanceDocuments), abstracting away the lower-level concerns of the underlying event streaming system. This allows developers to build with familiar JSON documents while taking advantage of automatic application of updates to document state with built-in schema validation.

Feed data can be accessed via a new HTTP endpoint on the js-ceramic node with the URL path api/v0/feed/aggregation/documents. The feed utilizes Server-Sent Events (SSE) to push messages with the following schema:

FeedDocument = {
  resumeToken: string
  commitId: CommitID
  content: any
  metadata: StreamMetadata
  eventType: EventType
}

So for example, the following request:

curl http://localhost:7007/api/v0/feed/aggregation/documents

will return the following response:

data: {
  "resumeToken": "1714742204565000000",
  "commitId": "k6zn3t2py84tn1dpy24625xjv65g4r23wuqpch6mmrywshreivaqiyaqctrz2ba5kk0qjvec61pbmyl15b49zxfd8qd3aiiupltnpveh45oiranqr4njj40",
  "content": "{...}",
  "metadata": {
    "controllers": [
      "did:key:z6MknE3RuK7XU2W1KGCQrsSVhzRwCUJ9uMb6ugwbELm9JdP6"
    ],
    "model": "kh4q0ozorrgaq2mezktnrmdwleo1d"
  },
  "eventType": 2
}

The Data Feed API can also be consumed through event listeners to continuously pull and process events from the feed. For example:

import { EventSource } from "cross-eventsource";
import { JsonAsString, AggregationDocument } from '@ceramicnetwork/codecs';
import { decode } from "codeco";

const source = new EventSource('http://localhost:7007/api/v0/feed/aggregation/documents')
const Codec = JsonAsString.pipe(AggregationDocument)

source.addEventListener('message', (event) => {
  console.log('message', event)
  // use JsonAsString and AggregationDocument to decode and use event.data
  const parsedData = decode(Codec, event.data);
  console.log('parsed', parsedData)
})

source.addEventListener('error', error => {
  console.log('error', error)
})

console.log('listening...')

Resumability

The Data Feed API also comes with a resumability feature that provides “at least once” delivery semantics. Resumability is the ability for a consumer of the feed to pick up “where it left off” after downtime. Without resumability, an indexer that goes down has no way of receiving and indexing data that was propagated while the indexer was offline. The goal of resumability is to ensure that a consumer of the Data Feed API can be guaranteed to receive the most recent data for each Stream at least once, even in the event of failure.

Resumability in Data Feed API is enabled by the resumeToken property, which is added to every object emitted by the Data Feed API. This property enables you to initiate a connection and ask for the entries starting immediately after resumeToken.

For example, let’s say your application got an entry containing resumeToken: "1714742204565000000" . When connecting, you can pass the token value as a query parameter, as follows:

// ... same as the code snippet above
const url = new URL("http://localhost:7007/api/v0/feed/aggregation/documents")
url.searchParams.set('after', '1714742204565000000') // Value of the last resumeToken
// Connects to http://localhost:7007/api/v0/feed/aggregation/documents?after=1714742204565000000
const source = new EventSource(url)

Note that the internal structure of the resumeToken is meant to be considered opaque and may change in the future. You should not try to craft resumeTokens manually; instead, always use the resumeToken of the last event you received and processed successfully when resuming the feed with a new connection.
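
To make those “at least once” semantics concrete, here is a minimal sketch of a resumable consumer. The loadToken, saveToken and indexDocument helpers are hypothetical stand-ins for whatever persistence and indexing the application uses; only the endpoint and the resumeToken handling come from the API described above.

import { EventSource } from "cross-eventsource";

// Minimal resumable consumer (sketch). loadToken, saveToken and indexDocument
// are hypothetical stand-ins for the application's own storage and indexing.
async function runIndexer() {
  const lastToken = await loadToken();
  const url = new URL("http://localhost:7007/api/v0/feed/aggregation/documents");
  if (lastToken) url.searchParams.set("after", lastToken); // resume where we left off

  const source = new EventSource(url);
  source.addEventListener("message", async (event) => {
    const doc = JSON.parse(event.data); // a FeedDocument, as shown above
    await indexDocument(doc);           // process the update first...
    await saveToken(doc.resumeToken);   // ...then persist the token, so a crash
                                        // replays events rather than skipping them
  });
}

Persisting the token only after successful processing is what turns the feed's delivery guarantee into at-least-once indexing: a crash between the two steps replays the event on restart instead of losing it.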

What’s coming next?

This release marks the initial public release of the Data Feed API with quite a few significant updates already on the horizon.

One of the upcoming significant updates will be introduced with the new Event Streaming API, powered by ceramic-one (a new implementation of the Ceramic protocol, written in Rust), which will allow developers direct access to the underlying event streams that have historically been hidden behind document-based APIs.

In addition to that, the core Ceramic team is committed to listening to your feedback and feature requests and taking them into account for future releases.

At this stage, the core Ceramic team would like to see early adopters trying out the new API and sharing the early feedback with the team.

Share your feedback with the Ceramic team!

Do you have ideas for how Data Feed API can enable the development of your projects? Or maybe you have questions or feedback on what could be improved? Share your thoughts with the core Ceramic team on the Ceramic Forum. We would love to hear from you!


DIF Blog

Interoperability in the spotlight

Identity providers including Canadian Bank Note (CBN), Veridos (a joint venture between German security tech company Giesecke+Devrient and federal technology company Bundesdruckerei) and Thales shared digital and mobile identity innovations, developments and lessons learned from real-world deployments during Day 2 of ID4Africa in Cape Town.

The discussion concluded that Digital Identity is a powerful tool to increase inclusion and opportunity. A successful program is one everyone can use, including issuers, holders and verifiers. Convenience, security and interoperability are critical success factors.

Why build on global standards?
Trust - global standards are what enable relying parties / verifiers to trust digital IDs

Security - digital IDs are always verified electronically using standardized protocols, eliminating the need for error-prone physical inspection

Ecosystem development - building on standards enables ecosystems to benefit from network effects. For example, wallet providers like Google and Apple are working to persuade entities like Uber and Airbnb to adopt mDOC - reducing the need for ecosystem conveners to find verifiers for the ecosystem

Scalability - building on standards facilitates planning for the future and readiness to expand into new use cases. For example, the TSA (the body that manages security of air travel in the US) has started accepting mobile IDs for domestic air travel in some states where ISO mDL has been implemented

Cost reduction - avoid being locked in to a single vendor and reduce the cost of ongoing maintenance and support. Digital Identity based on the 3-party model enables holders to present credentials directly to verifiers, meaning there is no longer a need to maintain verification infrastructure, unlocking major financial savings compared to federated identity schemes, Mark Sullivan said

Benefits of participation in standards development

Canadian Bank Note (CBN) is involved in both standards development and implementation. "Digital Identity will be constantly changing for a while now. Participating in the standards process enables us to bring the latest and greatest in Digital Identity back into our solutions space", said Mark Sullivan, Director of Digital Services

Standards development gaps

Standards integration. CBN is currently focusing on ISO mDL, as it is the only credential format that formalizes credential exchange today.

Cross-border trust. Don't reinvent what already exists. For example, the International Civil Aviation Organization (ICAO) Public Key Directory is a database holding national cryptographic keys used in authentication of e-passport information; something similar is needed to ensure interoperability between countries.

Adoption insights

The tools and standards are in place, but there is still hesitance to start building.

Issuing Digital Identities is not enough to ensure adoption; there must be services associated with these identities.

Governments' role is to create digital infrastructure which individual agencies and private sector orgs can leverage to make customers' lives easier and safer.

Ease of use is critical: Digital Identity needs to be as easy to use as presenting a physical ID card. CBN recommends tight integration of UX/UI teams into the software development process; their UX team benefited from deep-dive research into other apps that require trust to be established, such as mobile banking.

Addis Ababa scoops ID4Africa 2025 AGM host city

The National ID Ethiopia team are celebrating in Cape Town tonight after being announced as winners of the competition to host the ID4Africa 2025 AGM!


Hyperledger Foundation

Announcing Hyperledger Identus, a new decentralized identity applications project

We are thrilled to share the latest addition to the Hyperledger family - Hyperledger Identus. The Atala team at IOG contributed this code base to Hyperledger Labs as Open Enterprise Agent in October 2023 and has since been working to grow a vibrant open source community to support it. We are pleased to announce that Identus has now been accepted as a Hyperledger project and has moved into Incubation status.


EdgeSecure

EdgeCon Spring 2024 Sees Debut of AI Student Success Tool

Slalom and New Jersey Institute of Technology Launch “Sophie”–A Prototype Digital Student Concierge

NEWARK, NJ, May 22, 2024 – The EdgeCon conference series continues to solidify its place as a destination event for higher education decision makers, with EdgeCon Spring garnering the event’s highest-ever attendance rate and attendee satisfaction rate. Edge members and vendors are taking note, and choosing the event as a key venue for innovative announcements.  Slalom and longtime Edge member, New Jersey Institute of Technology (NJIT), chose EdgeCon as their venue to officially debut a prototype AI Student Success Tool to attendees.

EdgeCon served as the ideal environment for the debut, given the prevalence of higher education members in attendance, as well as the timely discussions of how AI is transforming higher education. The AI tool, officially the NJIT Digital Student Advisor but currently referred to in development as Sophie, is a prototype “digital student concierge”. Sophie was built using Soul Machines technology, with chat capabilities provided by GPT-3.5, and is hosted in Microsoft Azure. The prototype was designed to work on any cloud platform with a native LLM integration (AWS, Google, etc.).

“Launching the prototype for our digital student advisor ‘Sophie’ in partnership with NJIT is just the beginning of a transformative journey for higher education in New Jersey. As we embark on a journey to humanize the future of AI, we’re not just shaping technology, we’re shaping the future of education itself, unlocking limitless possibilities for personalized learning and student success.” 

— Stephen Walsh
Senior Director – Public & Social Impact
Slalom

In this prototype stage, Sophie was programmed to address a specific use case regarding a student interested in switching majors. The Q&A format shown in the demo was developed around typical questions a student may ask when switching majors or attending NJIT for the first time. Further development will focus on additional use cases to support students in their journey when accessing academic or social services at NJIT. The overall purpose is to create and improve upon a digital humanizing experience for students and faculty – one that is both informative and supportive.

Ed Wozencroft, Vice President for Digital Strategy and Chief Digital Officer, NJIT, shared, “NJIT’s Artificial Intelligence Initiative seeks to embed AI in all facets of our university–from teaching and learning; to research and innovation; our operations and infrastructure – all while remaining human-centered. Undoubtedly AI is here to stay, and higher education must adapt to succeed. With great partners like Slalom, we can quickly develop and deploy great tools and services that change the nature of education and improve the value proposition of investing in a now highly personalized higher education.”

While the prototype is in its early stages, NJIT is currently in development discussions with Slalom to determine the next phases and potential release dates. Stay tuned for launch dates and availability.

Slalom’s professional services are available for procurement by the higher education and public sector communities via Edge’s cooperative pricing system, EdgeMarket. To learn more about EdgeMarket, visit https://edgemarket.njedge.net/home.

About Edge

Edge serves as a member-owned, nonprofit provider of high performance optical fiber networking and internetworking, Internet2, and a vast array of best-in-class technology solutions for cybersecurity, educational technologies, cloud computing, and professional managed services. Edge provides these solutions to colleges and universities, K-12 school districts, government entities, hospital networks and nonprofit business entities as part of a membership-based consortium. Edge’s membership spans the northeast, along with a growing list of EdgeMarket participants nationwide. Edge’s common good mission ensures success by empowering members for digital transformation with affordable, reliable and thought-leading purpose-built, advanced connectivity, technologies and services.

The post EdgeCon Spring 2024 Sees Debut of AI Student Success Tool appeared first on NJEdge Inc.


Identity At The Center - Podcast

We have a bonus Sponsor Spotlight episode of The Identity at the Center Podcast


We have a bonus Sponsor Spotlight episode of The Identity at the Center Podcast for you this week. We have an engaging conversation with Chad Wolcott, Managing Director at RSM US LLP.

This episode, sponsored by RSM, explores Chad's entry into the identity industry and the broad range of digital identity services that RSM provides. We talk about our roles at RSM as part of the Digital Identity consulting practice and how we help our clients navigate IAM waters. The conversation also delves into the future trajectory of the identity industry, highlighting topics such as authentication, authorization, automation, AI, and analytics. The value of attending conferences like Identiverse is also touched upon, and we share some fascinating IAM horror stories.

You can watch it on YouTube at https://www.youtube.com/watch?v=16SjPeR1iKU and hear more at idacpodcast.com or in your podcast app.

#iam #podcast #idac

Tuesday, 21. May 2024

We Are Open co-op

Why Open Badges 3.0 Matters

A less technical, community-centric guide. [Image CC BY-ND Visual Thinkery for WAO]

It’s been a while since the upcoming changes to the Open Badges specification were first proposed. The change to v3.0 has now been ratified, with six organisations at the time of writing having platforms that are certified compatible.

We’d like to give a shout-out to Kerri Lemoie and Nate Otto who have tirelessly pushed this work forward, ensuring that the original community-focused mission has continued now that the standard is stewarded by 1EdTech.

The video below gives an overview, with the rest of the post helping to break down the changes in a less technical way. We can't say "non-technical" because, well, it's a technical standard. But we hope that what follows helps you understand some of the changes and what they mean in practice.

1. Enhanced Security

Open Badges 3.0 adopts the W3C Verifiable Credentials (VC) data model, which enhances the security and integrity of credentials. Unlike previous versions of the specification, badges now contain verifiable data that can be validated without having to check back with the issuer each time.

This matters because organisations go out of business, web pages get taken down, and life moves on. By ensuring that the badge contains everything required, it can be used for truly lifelong learning for decades to come.

For example, imagine a cooperative that issues digital badges to its members for skills like collaborative decision-making or sustainable practices. When co-op members with these badges present them to, say, a partner organisation, they can trust that the badge is genuine without needing to contact the co-op directly.

This is particularly useful in large networks like those we support through our storytelling and communications work, where verifying a member’s skills quickly and accurately is valuable.
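
To make this concrete, here is a minimal sketch of what an Open Badges 3.0 credential payload can look like under the VC data model. The issuer, subject, and achievement values below are invented for illustration; consult the 1EdTech specification for the normative schema.

```typescript
// Illustrative Open Badges 3.0 payload (all values are hypothetical).
const badgeCredential = {
  "@context": [
    "https://www.w3.org/ns/credentials/v2",
    "https://purl.imsglobal.org/spec/ob/v3p0/context-3.0.3.json"
  ],
  type: ["VerifiableCredential", "OpenBadgeCredential"],
  issuer: { id: "did:web:example-coop.org", type: "Profile", name: "Example Co-op" },
  validFrom: "2024-05-21T00:00:00Z",
  credentialSubject: {
    id: "did:key:z6MkexampleEarnerKey", // the earner's DID
    type: "AchievementSubject",
    achievement: {
      id: "https://example-coop.org/achievements/decision-making",
      type: "Achievement",
      name: "Collaborative Decision-Making",
      description: "Demonstrated facilitation of consent-based decisions."
    }
  }
  // A cryptographic proof is attached at signing time; verifiers check it
  // against the issuer's public key, with no call back to the issuer needed.
};
```

Because everything a verifier needs travels with the badge itself, the partner organisation in the example above can check the signature without contacting the co-op at all.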

2. Control and Privacy

The introduction of Decentralised Identifiers (DIDs) in Open Badges 3.0 gives badge earners more control over their personal data and how it is shared. DIDs are like personal web addresses for people’s digital identities that they can manage independently.

For community-oriented organisations, DIDs mean that members can manage their own credentials without relying on a central authority. For example, a member of a social enterprise could share their badges with a funder directly from their personal digital wallet, enhancing privacy and control.

Also, because DIDs are long strings of characters rather than human-readable names, you can use them anonymously. This could be useful for verifying the claims of sources who want to remain unidentified, for example proving that a whistleblower actually worked at the organisation they're reporting on. It's also possible to control multiple DIDs for different facets of your life, should you wish, such as work, activism, and your social life.
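
As a hedged illustration of that last point, the sketch below shows one person controlling separate DIDs for different facets of life; the identifier strings are made up, and real DIDs would be generated by wallet software.

```typescript
// Hypothetical personas, each anchored to its own DID (all strings invented).
const personas = {
  work: "did:web:alice-at-work.example.com",
  activism: "did:key:z6MkhExampleActivismKey",
  social: "did:web:social.example.net:users:alice"
};

// A badge presented under the activism DID carries nothing that links it to
// the work DID: the identifiers share no common data, so the facets stay separate.
console.log(Object.values(personas).length, "independent identifiers");
```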

3. Richer Metadata

Open Badges 3.0 allows for richer metadata within badges, which means more detailed information about achievements, how they were earned, and what they represent. This includes supporting multimedia content, making badges more descriptive and informative.

Credentials like badges are useful in helping us tell the story of where we’ve been and where we’re going. This is sometimes called the ‘learner journey’. With Open Badges 3.0, a badge can include images, links, videos, and more, to provide a narrative of how the badge was earned. For those that can remember back 20 years to the interest in eportfolios, each badge is essentially a mini-eportfolio, and can also be part of a wider portfolio.

For example, a badge for a professional development course could include video testimonials, project examples, and peer evaluations directly within the badge. This makes it easier for potential employers or collaborators to get a true sense of what the badge represents, enhancing the value of the credential to all parties.
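
A hedged sketch of what this richer metadata can look like is below: Open Badges 3.0 defines an evidence structure on the credential, and the names and URLs here are invented for illustration.

```typescript
// Illustrative evidence array for a professional development badge
// (all identifiers and URLs are hypothetical).
const evidence = [
  { type: ["Evidence"], name: "Video testimonial", id: "https://example.org/videos/testimonial.mp4" },
  { type: ["Evidence"], name: "Project example", id: "https://example.org/projects/community-site" },
  { type: ["Evidence"], name: "Peer evaluation", narrative: "Assessed by two course peers against the rubric." }
];
```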

4. Interoperability

It's one of those words that is difficult to avoid when talking about technical systems, and not one that we use in everyday conversation. But 'interoperability' just means the ability of different systems, organisations, or products to work together effectively without special effort from the user: various systems and devices can communicate, exchange data, and use the information that has been exchanged.

That might not sound too exciting, but with its alignment to the VC data model, Open Badges 3.0 is designed to work seamlessly with other digital credential systems, including those used by educational institutions and businesses. For example, we're working with the Digital Credentials Consortium, whose systems can issue all different kinds of achievements because of the interoperability between Open Badges and VCs.

For instance, a non-traditional learner who has earned badges as part of their self-taught skills development might want to share these with a university to get onto an MSc course. The university’s system can automatically recognise and understand the badges thanks to the shared VC model. This eases the process of credit recognition and transfer, making it simpler for learners to continue their education or seek job opportunities without so many bureaucratic hurdles.

5. Revocation and Status Management

To ‘revoke’ a badge simply means to say that it is no longer valid. There are many reasons why an individual or organisation may wish to do this, and not just because of errors in issuing or poor behaviour by the recipient.

For example, imagine that a B-Corp updates its training programme. While it could simply leave the old badges as-is, it can now revoke, update, or withdraw badges efficiently, saving people the headache of distinguishing current from previous alumni of the programme.

With Open Badges 3.0, this process is transparent and straightforward, ensuring that only the most current and accurate badges are in circulation. This helps maintain trust in the badge ecosystem, which is vital for the credibility of our collective efforts in promoting digital credentials.
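
For the technically curious, one common pattern in the wider VC ecosystem for this kind of status management is a status-list entry embedded in the credential. The sketch below assumes the W3C Bitstring Status List approach and invented URLs; it is not the only way an Open Badges platform might implement revocation.

```typescript
// Hedged sketch: a status-list entry a badge might carry (values invented).
const credentialStatus = {
  id: "https://example-bcorp.org/status/3#94567",
  type: "BitstringStatusListEntry",
  statusPurpose: "revocation",
  statusListIndex: "94567",
  statusListCredential: "https://example-bcorp.org/status/3"
};
// A verifier fetches the status list credential once and checks the bit at
// statusListIndex; a set bit means the badge has been revoked or withdrawn.
```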

Conclusion

I’ve been unreasonably excited about a technical specification for 13 years now, ever since the first MacArthur-funded pilot at Mozilla. This latest update to Open Badges 3.0 is a significant step forward in making digital credentials more secure, user-centered, and meaningful within various communities, including educational institutions, professional networks, and social organisations.

Along with the work that we’ve been doing around creating plausible utopias for Open Recognition, I’m really looking forward to seeing how this development helps further empower individuals and communities, enhances trust in digital credentials, and supports the broader landscape of lifelong learning.

If you'd like help with understanding either the basics or the latest developments in credentialing technology, get in touch! We've also got a free email-based course called Reframing Recognition: strategies for success in going beyond microcredentials.

Why Open Badges 3.0 Matters was originally published in We Are Open Co-op on Medium, where people are continuing the conversation by highlighting and responding to this story.


FIDO Alliance

FIDO Taipei Workshop: Securing the Edge with FDO


On April 24, 2024, the FIDO Alliance held its first ever in-person FDO Workshop at the Institute of Information Science, Academia Sinica Nangang Campus in Taipei. The event attracted over 100 attendees from 30 different organizations. This workshop was dedicated to unveiling the power of the FIDO Device Onboard (FDO)—a revolutionary open standard that simplifies and secures the device onboarding process by moving away from legacy approaches that often rely on inefficient and insecure passwords and other knowledge-based credentials.

View the seminar slides below:

Introduction to FDO and How It works Applications.pdf from FIDO Alliance

The Value of Certifying Products for FDO.pdf from FIDO Alliance

Linux Foundation Edge _ Overview of FDO Software Components.pdf from FIDO Alliance

Simplified FDO Manufacturing Flow with TPMs.pdf from FIDO Alliance

Choosing the Right FDO Deployment Model for Your Application.pdf from FIDO Alliance

Where to Learn More About FDO.pdf from FIDO Alliance

Secure Zero Touch enabled Edge compute with Dell NativeEdge via FDO.pdf from FIDO Alliance

How Red Hat Uses FDO in Device Lifecycle.pdf from FIDO Alliance

ASRock Industrial FDO Solutions in Action for Industrial Edge AI.pdf from FIDO Alliance

FDO for Camera, Sensor and Networking Device – Commercial Solutions from VinCSS.pdf from FIDO Alliance

FIDO Taipei Workshop: Securing the Edge with FDO


[Watch the FIDO Taipei Workshop Recap Video]

On April 24, 2024, the FIDO Alliance held its first ever in-person FDO Workshop at the Institute of Information Science, Academia Sinica Nangang Campus in Taipei. The event attracted over 100 attendees from 30 different organizations. This workshop was dedicated to unveiling the power of the FIDO Device Onboard (FDO)—a revolutionary open standard that simplifies and secures the device onboarding process by moving away from legacy approaches that often rely on inefficient and insecure passwords and other knowledge-based credentials.

[Pictures from FIDO Taipei Workshop]

The sessions included “FDO 101,” “FDO Certification Programs,” “How to Deploy FDO” and diverse showcases from leaders in edge computing and IoT. Companies including Intel, ASRock, Red Hat, Dell, VinCSS, and Infineon demonstrated how they have developed, applied, and used FDO in their solutions. We are excited to share slides from the workshop:

Introduction to FDO and How It Works: Richard Kerslake, Market Development Manager, Connected Device Standards, FIDO Alliance (Download Slides)
The Value of Certifying Products for FDO: Paul Heim, Director of Certification, FIDO Alliance (Download Slides)
Choosing the Right FDO Deployment Model for Your Application: Geoffrey Cooper, Intel (Download Slides)
Simplified FDO Manufacturing Flow with TPMs: Liam Cheng, Marketing Manager, Infineon (Download Slides)
Linux Foundation Edge – Overview of FDO Software Components: Randy Templeton, Software Engineer, Intel (Download Slides)
Where to Learn More about FDO: Richard Kerslake, Market Development Manager, Connected Device Standards, FIDO Alliance (Download Slides)
Secure Zero Touch Enabled Edge Compute with Dell NativEdge via FDO: Brad Goodman, Architect, Edge Computing, Dell (Download Slides)
FDO for Camera, Sensor, and Networking Device – Commercial Solution from VinCSS: Quan Do, Head of IoT Security Solutions & Van Nguyen, Senior Researcher, R&D Center, VinCSS (Download Slides)
How Red Hat Uses FDO in Device Lifecycle: Costin Gament, Senior Integration Engineer & Vitaliy Emporopulo, Principal Software Engineer, Red Hat (Download Slides)
ASRock Industrial's FDO Solutions in Action for Industrial Edge AI: Kenny Chang, VP of Product and Marketing Division, ASRock Industrial (Download Slides)

[Pictures from FIDO Taipei Workshop]

We had the honor of welcoming the newly inaugurated Minister of Digital Affairs, Dr. Yennun Huang, along with other key government officials who participated to extend their congratulations on FIDO Alliance’s contributions to the digitally connected world. The event garnered significant attention from various local media outlets, including Liberty Times Net, Radio Taiwan International, Yahoo Taiwan, EE Times, CNA News, iThome, and many others.

The FIDO Alliance sincerely thanks our sponsors for their unwavering commitment to a password-free future, which was crucial in making the first-ever in-person FDO Workshop possible. Their ongoing dedication was essential to the event’s success. We look forward to the continued partnerships and insights gained during this workshop, which will help shape a more secure digital future.

We invite you to join us for the next in-person event, the FIDO APAC Summit 2024, scheduled for September 10-11, 2024, in Kuala Lumpur, Malaysia. This summit will feature a comprehensive FDO workshop among its sessions.


The Engine Room

Now out: Read our 2023 Impact Report


2023 was a year in which we provided targeted strategic support for a record 85 partners, published 3 research reports, piloted a successful cohort learning programme, established new ways of working and created new organisational structures that set up our work for 2024 and into the future.

The post Now out: Read our 2023 Impact Report appeared first on The Engine Room.

Monday, 20. May 2024

Hyperledger Foundation

Hyperledger FireFly V1.3 is Now Available


The Hyperledger FireFly community is pleased to announce the release of version v1.3! Enhancements in this release make the industry’s first open-source Web3 Gateway an even more powerful platform for tokenization, multi-chain interoperability, and building blockchain-based applications. 


FIDO Alliance

Tech Radar: Navigating towards a passwordless future


Traditionally, passwords have served as the primary means of securing digital identities, yet their limitations are becoming increasingly evident. To pave the way for a passwordless future, accessibility is paramount. Any alternative authentication method must be inclusive, catering to users across diverse technological environments. Whether it’s the latest smartphone or a dated desktop, the authentication process should seamlessly adapt. For example, solutions like the FIDO Alliance’s Web Authentication (WebAuthn) standard aim to bridge this accessibility gap, enabling passwordless logins across a spectrum of devices and platforms.
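
For readers curious what this looks like in practice, here is a minimal sketch of a WebAuthn registration call in the browser. The relying-party details, user data, and challenge handling are placeholders rather than a complete flow.

```typescript
// Minimal WebAuthn registration sketch (relying party and user are invented;
// in a real flow the challenge comes from the server, not from the client).
const credential = await navigator.credentials.create({
  publicKey: {
    challenge: crypto.getRandomValues(new Uint8Array(32)),
    rp: { name: "Example RP", id: "example.com" },
    user: {
      id: new TextEncoder().encode("user-1234"),
      name: "user@example.com",
      displayName: "Example User"
    },
    pubKeyCredParams: [{ type: "public-key", alg: -7 }], // -7 = ES256
    authenticatorSelection: { residentKey: "required", userVerification: "preferred" }
  }
});
// The attestation inside `credential` is sent to the server, which stores the
// public key; later sign-ins call navigator.credentials.get() with a fresh challenge.
```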


Security Informed: trinamiX Unveils Secure Face Authentication In Foldable Phones


This touchless solution offers enhanced security and convenience, meeting the biometric security requirements set by organizations such as the International Internet Finance Authentication Alliance (IIFAA), the FIDO Alliance, and Android (Google).


The Fintech Times: Visa Reveals Digital Products to be Launched Over the Year Catering to Evolving Consumer Demands


Built on the latest Fast Identity Online (FIDO) standards, the Visa Payment Passkey Service confirms a consumer’s identity and authorises online payments with a quick scan of their biometrics like a face or fingerprint. When shopping online, Visa passkeys replace the need for passwords or one-time codes, enabling more streamlined, secure transactions.


PYMNTS: Visa Recasts Digital Wallet Landscape at Intersection of Identity and Payments


Visa has enhanced their security and streamlined transactions by onboarding passkeys. Now consumers can confirm their identity and authorize online payments through facial or fingerprint scans, eliminating the need for passwords and one-time codes.


Identity At The Center - Podcast

I'm very excited for this week's episode of The Identity at the Center Podcast


I'm very excited for this week's episode of The Identity at the Center podcast. We were FINALLY able to record an episode with Henrique Teixeira, who recently joined Saviynt from Gartner. We covered a range of topics which you can watch on YouTube at https://www.youtube.com/watch?v=cO8D-nL_BNY or listen to in your favorite podcast app.

#iam #podcast #idac

Friday, 17. May 2024

FIDO Alliance

Biometric Update: Authenticate 2024


Authenticate 2024
Omni La Costa Resort & Spa, Carlsbad, CA
October 14-16, 2024

It’s time to modernize your authentication! Organizations around the globe are embracing a new way to authenticate with FIDO standards, moving past passwords and legacy forms of multi-factor authentication to provide users with passkeys for phishing-resistant sign-ins. Their results? Strong security, lessened data breach risk, improved user experiences, faster sign-in rates, and reduced costs.

Join these industry leaders as they come together at Authenticate 2024, and get the latest tools and insights to get your organization on the path to strong, modern passwordless authentication.

Hosted by the FIDO Alliance, Authenticate is the industry’s only conference dedicated to all aspects of user authentication – including a focus on FIDO-based sign-ins. It is the place for CISOs, business leaders, product managers, security strategists and identity architects to get all of the education, tools and best practices to roll out modern authentication across web, enterprise and government applications.

Authenticate 2024 will be held at the Omni La Costa Resort & Spa in Carlsbad, California for the second year in a row. This venue includes ample space for our growing audience, more sessions and session types for all levels, and more opportunities for networking with peers. The 2024 event will include our most dynamic expo hall yet, where all exhibiting sponsors can showcase their solutions and meet companies looking for partners on their path to passwordless.

Whether you are new to FIDO, in the midst of deployment or somewhere in between, Authenticate 2024 will have the right content – and community – for you.

Register and learn more here.


Biometric Update: Passkeys continue march to mainstream with Visa, WhatsApp updates


FIDO2 protocol finding wide adoption but analysts may have found MITM vulnerability.

Visa has unveiled new digital products and services based on biometrics and passkeys, as it aims to address rapid changes in AI and digital identity technology. WhatsApp has expanded its passkey availability for all users. And the FIDO Alliance welcomes a new board member, while researchers question how airtight its security protocol really is.

Thursday, 16. May 2024

Origin Trail

Trace Labs joins NVIDIA Inception program to advance the Verifiable Internet for AI


Trace Labs, the core developer of OriginTrail, has recently become a member of the NVIDIA Inception program. Achieving this milestone serves not only as a recognition of the importance of creating a Verifiable Internet for Artificial Intelligence (AI) with OriginTrail Decentralized Knowledge Graph (DKG), but also opens up an opportunity to work closely with an industry leader in the field of AI.

What is the NVIDIA Inception program?

NVIDIA Inception is a program designed to help companies evolve faster through cutting-edge technology, opportunities to connect with venture capitalists, and access to the latest technical resources from NVIDIA. Inception program support also includes access to NVIDIA Deep Learning Institute and unlimited access to the NVIDIA Developer Forums, allowing Trace Labs to be in close contact with the latest software and hardware AI product developments NVIDIA is bringing to the market.

Towards the Verifiable Internet for AI

As a part of the Inception program, Trace Labs will be able to further its vision of the Verifiable Internet for AI, which aims to limit AI's current shortfalls (e.g. hallucinations, IP infringements, bias, and model collapse) and instead offer information provenance, data ownership, and integrity as the core pillars of future AI solutions. For anyone wanting to take the OriginTrail DKG and NVIDIA out for a spin and build a trusted AI solution, you can make it happen in just a few easy steps.

Trace Labs is already implementing solutions based on a Truly Open AI across multiple industries, including supply chains, healthcare, construction, sports, and, most recently, aviation, where its work has also received support from the European Union (EU).

OriginTrail on Twitter: "Imagine a world where trust & transparency are key to a Verifiable Internet for Artificial Intelligence💡Introducing https://t.co/6yT8dV6Paj - A Truly Open #AI pic.twitter.com/G7gUKr0QH5 / Twitter"

Imagine a world where trust & transparency are key to a Verifiable Internet for Artificial Intelligence💡Introducing https://t.co/6yT8dV6Paj - A Truly Open #AI pic.twitter.com/G7gUKr0QH5

OriginTrail on Twitter: "Making sure any take off is done safely✈️Enter DMaaST🇪🇺 - an EU funded initiative harnessing @origin_trail to advance the Digital Product Passport🛂for enhanced responsiveness to external & unforeseen events in electronics & aerospace industries!👉https://t.co/XfNoxkJlrL1/🧵 pic.twitter.com/x7E7nVMRQP / Twitter"

Making sure any take off is done safely✈️Enter DMaaST🇪🇺 - an EU funded initiative harnessing @origin_trail to advance the Digital Product Passport🛂for enhanced responsiveness to external & unforeseen events in electronics & aerospace industries!👉https://t.co/XfNoxkJlrL1/🧵 pic.twitter.com/x7E7nVMRQP

Stay tuned for more developments from Trace Labs’ involvement in the NVIDIA Inception program, leveraging the power of OriginTrail DKG and AI to create a more transparent and trustworthy digital world.

Trace Labs joins NVIDIA Inception program to advance the Verifiable Internet for AI was originally published in OriginTrail on Medium, where people are continuing the conversation by highlighting and responding to this story.


Energy Web

Case Study: Smartflow


A deep dive on Smartflow by Energy Web

Smartflow enables enterprises to configure worker nodes — a new technology developed by Energy Web. These nodes comprise distributed computing networks that securely execute sensitive business operations impacting multiple companies. Worker nodes are decentralized nodes controlled by companies or individuals who have staked Energy Web tokens in order to become trusted worker node operators. All nodes are anchored to Energy Web X, itself a parachain on the Polkadot network.

This technology was originally developed to solve a paradox that hinders advanced renewable energy tracking solutions like renewable matching and green electric vehicle charging: credibility relies on accurate, publicly verifiable results (e.g., proof that digital representations of renewables are not double-counted). But inputs from separate organizations, such as granular renewable energy production and electricity demand data, are commercially sensitive and need to remain private. Complex commercial, legal, and technical requirements often make it challenging for a single actor to unilaterally access and process all requisite data. The ability to establish a shared source of truth from segregated, individual information sources has been an ongoing challenge in multilateral settings; the challenge is even greater for energy market participants seeking to exchange and process granular data in order to procure services from distributed energy resources.

Smartflow solves these challenges by enabling enterprises to configure, launch, and maintain distributed computing networks that ingest data from external sources, execute custom workflows based on business logic, and vote on results in order to establish consensus and trustworthy results without revealing or modifying the underlying data. Worker nodes apply proven and tested concepts and components from blockchain technology in a novel, enterprise-friendly architecture to provide all stakeholders with cryptographic proof that mutually agreed rules and processes are followed correctly, that computational outputs from business processes are correct and trustworthy, and that the privacy and integrity of the underlying data are preserved for auditing purposes.

In addition to Smartflow being used as a generic configuration tool for any enterprise by Energy Web customers, the following are examples of existing worker node networks configured using Smartflow:

Worker nodes supporting the Maritime Shipping registry supported by the Mærsk Mc-Kinney Møller Center for Zero Carbon Shipping
Worker nodes supporting Autogreencharge, a recently released solution for green electric vehicle charging on Energy Web X
A network of nodes powering the Sustainable Aviation Fuel registry in partnership with the Sustainable Aviation Buyers Alliance
A network of worker nodes currently powering the Digital Spine solution supporting multiple grid decarbonization projects

How Smartflow and Worker Nodes Work

Worker nodes can be implemented in any development framework, including Node.js, Python, and Rust. Each worker node is programmed to execute a specific conditional logic workflow based on a predefined dataset (i.e. source and schema) and event trigger (e.g. a regular time interval, or a specific external event). Workers are given read-only access to one or more external data sources via API or message broker. When the workflow "event" is triggered, the workers initiate a round of calculation and voting (worker nodes can be configured to either poll an external data source at regular intervals, or "listen" to an external source for a specific trigger).

In the calculation step, each worker node independently executes the same conditional logic based on the data it received during the trigger. Then, each worker submits its results (i.e. output of the logic workflow) to the smart contract, which collates the worker nodes’ votes into a unique voting round and keeps track of each round to maintain continuity. The smart contract defines a voting threshold (typically a simple majority, such as 2 of 3, or 4 of 7, etc.) that must be reached in order to establish consensus on the results of each voting round. Each voting round is defined by a voting ID, the number of unique votes (from worker nodes), and a consensus result. If the threshold is reached (i.e. a majority of nodes independently reach the same conclusion and submit identical results to the vote), then the voting round is closed and the result is hashed on the Energy Web X Parachain; this creates a connected tree of results that can be queried and validated for auditing purposes. If consensus is not reached in the voting round, the entire process from event trigger through voting is repeated; if a second round fails, a custom workflow is executed (this can include alerts, failover to other nodes, or acceptance of results from a plurality of workers).

In addition to collating the voting results, the smart contract is configured to issue rewards for workers in the pool based on performance criteria. A base reward can be issued simply for workers being available in the pool, and additional rewards can be issued based on metrics such as consistency (i.e. availability for all voting rounds), accuracy (i.e. being part of the winning consensus), and speed (i.e. being fastest to submit voting results following the event trigger). Workers presenting faulty results are penalized, so the only profitable strategy for a worker is to provide honest, accurate results. This mechanism ensures that the outcome of the process, the consensus reached, can be trusted by everyone, even without knowing all the input data.
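
As a rough illustration of the voting threshold described above, here is a minimal sketch of a tallying routine. The types and names are invented, and a real deployment performs this logic in the smart contract rather than in application code.

```typescript
// Hypothetical tally for one voting round (a real system does this on-chain).
type Vote = { workerId: string; resultHash: string };

function tallyRound(votes: Vote[], threshold: number): string | null {
  const counts = new Map<string, number>();
  for (const v of votes) {
    counts.set(v.resultHash, (counts.get(v.resultHash) ?? 0) + 1);
  }
  // Close the round if any single result reaches the threshold (e.g. 2 of 3).
  for (const [hash, count] of counts) {
    if (count >= threshold) return hash; // consensus result, anchored on Energy Web X
  }
  return null; // no consensus: re-trigger the workflow or run the fallback logic
}

// Example: a 2-of-3 round where two workers independently agree.
console.log(
  tallyRound(
    [
      { workerId: "w1", resultHash: "0xabc" },
      { workerId: "w2", resultHash: "0xabc" },
      { workerId: "w3", resultHash: "0xdef" }
    ],
    2
  )
); // "0xabc"
```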

This entire process is now abstracted away via Smartflow, which enables enterprises to configure their own worker node networks in a no-code environment. Please see smartflow.org for additional information.

The infographics below summarize the value proposition of Smartflow. The first graphic shows how solutions are typically configured by enterprises without Smartflow; the second shows the new process leveraging worker nodes and Smartflow.

Without Smartflow:

With Smartflow:

Industry testimonials:

“As long-time Energy Web supporters, we are excited by the decentralized approach to multi-party computation provided by SmartFlow. It is a digital tool that provides a new kind of trust layer in business relations based on shared information. We are eager to discover how it can be integrated into our processes, and what kind of value it can create,” said Etienne Gehain, New Digital Solutions Director at Engie.

“Energy Web’s collaboration with Deutsche Telekom shows that Web 3 technology can be an important tool in the fight against climate change,” says Dirk Röder, Head of Deutsche Telekom’s Blockchain Solutions Center. “Deutsche Telekom is not only securing the energy grid, but also accelerating progress towards climate targets while promoting renewable energies.”

On Smartflow’s embedded self-sovereign identity solution: “To ensure network security and stability, the integration of renewable energy sources into our energy systems will require household and industrial flexibility to be activated. SSI will be an important tool for creating a registry of decentralised and flexible assets that will allow us to monitor the state of the network and steer these decentralised assets.” Kris Laermans, Innovation at Elia Group

Jasper Verwaal, Digital Assets Manager at Deloitte recently provided the following insights on Energy Web X’s architecture, solutions, and worker nodes: “The architecture Energy Web has already built around asset management, exchanging data, and green proofs…can be seen as public infrastructure. The more standardization that’s in place, the easier it is to adopt common methodologies.” He continued, “Many enterprises find it challenging to put data on-chain. But we hope that Energy Web X will contribute to making it easier to decide what data lives on-chain and what data off-chain.”

Sahas Katta, CEO at Smartcar, the leading connected car API platform and Autogreencharge partner, said “This partnership will empower EV drivers with the knowledge and confidence to maximize the sustainability of their EV charging experience. With Smartcar, Energy Web can seamlessly integrate across diverse vehicle brands, giving more EV drivers access to accurate insights into charging transparency and renewable energy options,” in reference to the new Autogreencharge solution powered by Smartflow, worker nodes, and Energy Web X.

Bryan Fisher, Managing Director at the clean energy nonprofit RMI, commented on the SAFc registry deployed on Energy Web X supported by worker nodes: "The SAFc Registry is the culmination of RMI's ongoing work to catalyze the sustainable aviation fuel market. By creating a throughline from corporate consumers to fuel producers, we can expand investment in SAF infrastructure and increase supply of low-carbon fuels. Voluntary uptake is critical to scaling a burgeoning market, and we hope this registry can serve as an example for other sectors in the future."

Frederik Jacobsen, project manager at the Mærsk Mc-Kinney Møller Center for Zero Carbon Shipping, commented on the Maritime registry on Energy Web X supported by Smartflow worker nodes: "The collaboration with ZEMBA further strengthens the development of a global Maritime Book and Claim System and underlines how important such a system will be to overcome obstacles to accelerate the adoption of low-emissions fuels at scale."

“We’re very interested in driving trust and transparency across the supply chain, which is why we’re so excited to join the Energy Web ecosystem,” said Thatcher Young, CEO of CarbonEnfo. “In addition to supporting the public blockchain, we see tremendous value in leveraging the Energy Web technology stack to advance our own application development.”

Case Study: Smartflow was originally published in Energy Web on Medium, where people are continuing the conversation by highlighting and responding to this story.

Wednesday, 15. May 2024

MOBI

Alliance of 120 companies including Honda and Ford collaborates on EV batteries

Alliance of 120 companies including Honda and Ford collaborates with European companies on EV batteries

Read on Nikkei’s website

By Azusa Kawakami, Nikkei staff writer | April 23, 2024 

[New York – Azusa Kawakami] An alliance of 120 Japanese and US companies, including Honda, Nissan Motor, and Ford Motor, will collaborate with a consortium that originated in Europe on the use of data related to electric vehicle (EV) batteries. The companies will build infrastructure that allows information on battery materials and recycling to be shared and operated securely between them, helping drive EV adoption.

The Mobility Open Blockchain Initiative (MOBI), whose 120 members, including Honda, Nissan, the US "Big Three" automakers, and IT companies, are developing international standards for managing EV battery information digitally, will partner with GAIA-X, the European-born consortium.

GAIA-X promotes data sharing between companies in Europe. By combining MOBI, a largely Japan-US alliance, with Europe's GAIA-X, companies across the three regions of Japan, the United States, and Europe will be able to mutually share and use information such as battery materials, degradation status, and charge/discharge data.

Specifically, they will use blockchain (distributed ledger) technology, which lets companies record and manage data while preserving confidentiality through cryptography. The two organizations will now begin proof-of-concept trials.

As interoperability advances, companies will gain access to more data across a wider scope, lowering the cost of services. If battery degradation status can be made visible, the residual value of a battery could eventually be reflected in used EV prices. For users, the range of EVs and charging infrastructure accessible from a single application will expand, making EVs easier to use.

On battery data, an organization under Japan's Ministry of Economy, Trade and Industry has already decided to partner with Catena-X, a European public-private data platform. GAIA-X is the umbrella organization above Catena-X and builds mechanisms for companies to exchange data such as the materials used in products and carbon dioxide emissions.

Mechanisms for managing and sharing battery information have advanced first in Europe, but US companies are also considering adoption. This is because the Inflation Reduction Act, enacted in 2022, made sourcing battery materials and components within North America a condition for receiving EV subsidies.

Dependence on China is especially high for the rare metals used as battery materials. US companies also aim to bring transparency to information about battery material sourcing by partnering with Japanese and European firms.

The post Alliance of 120 companies including Honda and Ford collaborates on EV batteries first appeared on MOBI | The Web3 Economy.


Digital ID for Canadians

Chicago Title Insurance Company’s VerifID™ first PCTF Verified Person certified service

Confirming Chicago Title Insurance Company’s VerifID™ service PCTF Verified Person Component conformance

May 14, 2024 – Toronto –  We proudly announce that Chicago Title Insurance Company’s ID verification platform, VerifID™, obtained the DIACC Pan-Canadian Trust Framework (PCTF) Verified Person certification, a massive milestone in becoming a trusted and reliable service in the Canadian ecosystem. DIACC welcomes Chicago Title’s VerifID to the growing cohort of certified providers in digital identity, particularly in the critical area of real estate. 

In Canada, mortgage and title fraud is a pressing concern. In response, Chicago Title has developed VerifID, an innovative identity verification and fraud-prevention solution. VerifID combats fraudulent real estate transactions, demonstrating the company's proactive approach to addressing industry challenges.

“We’re passionate about stopping fraud; it’s why we developed VerifID,” said John Rider, Senior Vice President of Retail and Commercial Title at Chicago Title Insurance Company. “We wanted to take that passion to the next level by completing our PCTF certification, demonstrating our commitment to being the best in the business. VerifID doesn’t allow users to modify the verification process, which, in our experience, reduces the intake of valuable information; ours is the only solution that will meet DIACC standards 100% of the time.”

VerifID’s journey to DIACC PCTF certification was rigorous, marked by a thorough third-party evaluation that left no room for shortcuts. This evaluation ensured that the service met the stringent requirements of the Verified Person Component at LOA2, which shows Chicago Title’s commitment to leading the fraud and identity industry through continuous innovation and best practices.

The DIACC certification is a standardized process that includes a point-in-time audit conducted by DIACC Accredited Auditor KUMA and an independent committee review for quality assurance. As a result, DIACC has issued a three-year cycle Trustmark, subject to annual surveillance audits, and added Chicago Title’s VerifID to its Trusted List of certified providers. 

“Chicago Title’s VerifID achieved a significant industry milestone and competitive advantage by becoming the first Canadian vendor to earn certification against the PCTF identity proofing requirements,” said Joni Brennan, DIACC President.   

The PCTF is a comprehensive risk management and assurance framework for validating the design of private-sector digital trust services. Its criteria address aspects of digital trust and verification, including privacy, security, and interoperability, mitigating risks associated with fraud and breaches. This framework supports robust digital verification and enhances user confidence by providing a reliable means to authenticate digital verification across services, fostering a safer and more integrated digital environment. The PCTF helps people and organizations across the public and private sectors to recognize verified trusted services. 

To explore the benefits of certification, contact voila@diacc.ca 

About DIACC

Established in 2012, DIACC is a non-profit organization of public and private sector members committed to advancing full and beneficial participation in the global digital economy by promoting design principles, PCTF adoption and conformity assessment. DIACC prioritizes personal data control, privacy, security, accountability, and inclusive people-centered design.

For more information about DIACC, please visit https://diacc.ca

About Chicago Title Insurance Company

Chicago Title Insurance Company is a wholly owned division of Fidelity National Financial, Inc. (FNF). FNF, operating through its subsidiary Fidelity National Title Group, Inc., is one of North America’s largest title companies, providing core title insurance products, escrow and other real estate related products. FNF, a Fortune 500 company, has offered security for real estate transactions for over 170 years. Chicago Title Insurance Company, a subsidiary of FNF, has been licensed in Canada for over 70 years.

For more information about Chicago Title Insurance Company, please visit https://chicagotitle.ca  


FIDO Alliance

FIDO Seminar at RSAC: The State of Authentication 2024: The Global Progress Past Passwords


FIDO Alliance’s seminar at RSAC 2024 included the latest with FIDO authentication and passkeys – and more! During the seminar, the FIDO Alliance and its industry stakeholders discussed the latest developments in the global movement to passwordless technology for better security and user experiences.

View the seminar slides below:

Intro to Passkeys and the State of Passwordless.pptx from FIDO Alliance

ADP Passwordless Journey Case Study.pptx from FIDO Alliance

Design Guidelines for Passkeys 2024.pptx from FIDO Alliance

Harnessing Passkeys in the Battle Against AI-Powered Cyber Threats.pptx from FIDO Alliance

Tales from a Passkey Provider Progress from Awareness to Implementation.pptx from FIDO Alliance

Hyatt driving innovation and exceptional customer experiences with FIDO passwordless authentication.pptx from FIDO Alliance

Introduction to FIDO Authentication and Passkeys.pptx from FIDO Alliance

Hyperledger Foundation

Developer Showcase Series: Alvaro Picazo Haase, Blockchain Engineer, Accenture


Back to our Developer Showcase Series to learn what developers in the real world are doing with Hyperledger technologies. Next up is Alvaro Picazo Haase, a Blockchain Engineer at Accenture.


EdgeSecure

Edge Commitment to Service Continuity and Strategic Alignment Post the VMware-Broadcom Acquisition


Dear Edge Community,

I hope this message finds you well as we continue to adapt to the industry changes brought about by VMware’s acquisition by Broadcom. In our ongoing commitment to transparency and support, I want to share some important updates and reaffirm our dedication to assisting you through these transitions.

Pricing Changes Post-Acquisition:
It has become apparent in our recent assessments and communications with Broadcom that the pricing structure for VMware’s services may undergo changes. At this time, it does not appear that Broadcom intends to maintain or reduce the prices from their current or previous year’s levels, even when considering potential discounts. This development is significant, and we want to ensure it factors into your planning and decision-making processes.

Alternative Solutions and Expert Support:
In light of these potential changes, we believe it is prudent to consider alternative solutions that may better meet your financial and operational needs. We have secured a robust selection of substitute products and services under Edge procurement vehicles. Most of the available market substitutes can be procured through the EdgeMarket portal, and I am pleased to introduce several key points of contact who can assist you with these options:

Lou Malvasi, PubSec Sr. District Sales Manager
SHI
Mobile: (609) 608-2463
Email: Lou_Malvasi@shi.com

Bethany Tangredi, AWS Partner Account Executive
AWS
Mobile: (413) 896-4331
Email: bethtang@amazon.com

Cyntya Ramirez, Senior Program Manager
Carahsoft Technology Corp.
Mobile: (571) 662-4641
Email: cyntya.ramirez@carahsoft.com

Lou, Bethany, and Cyntya are available to discuss how various alternative solutions can provide the value and support you need during this time of change.

Alternative Solutions via Edge Procurement Vehicles

AWS Native VMC on AWS

TeCHS/SHI & Carahsoft:

Microsoft Azure/Hyper-V
Nutanix
Citrix
Oracle
Other service providers

Cloud Migration Expertise:
For those looking to enhance or modify their cloud strategies, please consider our awarded providers listed below. These firms are recognized for their excellence, are fully equipped to support your migration efforts, and are available via EdgeMarket contracts:

CampusWorks, Inc.: Contract #269EMCPS-23-002-EM-CWI, Expires On 10/04/2026
CBTS: Contract #269EMCPS-23-002-EM-CBTS, Expires On 10/29/2026
Infojini, Inc.: Contract #269EMCPS-23-002-EM-IFJ, Expires On 10/01/2026
New Era Technology, Inc.: Contract #269EMCPS-23-002-EM-NET, Expires On 11/30/2026
SHI: Contract #269EMCPS-23-002-EM-SHI, Expires On 12/04/2026
Slalom, Inc.: Contract #269EMCPS-23-002-EM-SLM, Expires On 09/19/2026
Softchoice Corporation: Contract #269EMCPS-23-002-EM-SCC, Expires On 10/16/2026
Strata Information Group: Contract #269EMCPS-23-002-EM-SIG, Expires On 10/04/2026
Trigyn Technologies, Inc.: Contract #269EMCPS-23-002-EM-TGN, Expires On 11/30/2026
Tryfacta, Inc.: Contract #269EMCPS-23-002-EM-TFC, Expires On 10/17/2026

As we face these new challenges, our team remains committed to supporting you every step of the way. We encourage you to reach out to our contacts, or directly to me, with any concerns, queries, or discussions regarding your future strategic directions.

Thank you for your continued trust and partnership as we navigate these evolving circumstances together.

Sincerely,

Christopher R. Markham, Ph.D.(c)
Executive Vice President
Edge (NJEdge)

The post Edge Commitment to Service Continuity and Strategic Alignment Post the VMware-Broadcom Acquisition appeared first on NJEdge Inc.


Next Level Supply Chain Podcast with GS1

Follow That Food! Food Safety Through Traceability with Andy Kennedy


Speedy and efficient traceability systems are vital for compliance, consumer health, and resilience in the food safety industry.

Andy Kennedy, Principal Traceability Advisor at New Era Partners, joins us to unravel the complexities of food safety in the supply chain. With 17 years of experience creating SaaS food traceability businesses, Andy offers insights into the historical context of traceability regulations, the significance of technological progress, and how innovation will continue to shape our approach to food safety. He candidly describes the tension-filled environment of managing a food safety outbreak and the critical role of coordination during these investigations.

Andy touches on COVID-19 and the food supply chain, the transition challenges food items faced moving from food service to retail, and the astounding potential of 2D barcodes and digital receipts to engage consumers and ensure their safety. Learn how the time needed to track contaminated products has been reduced from over a month to less than a week, the role GS1 Standards play in compliance, and how FSMA 204, compared to the Bioterrorism Act, is mandating a new era of data sharing.

 

Key takeaways: 

Traceability systems play a critical role in effectively managing and shortening the timeframe of contamination investigations. 

Utilizing GS1 standards to comply with evolving regulatory demands illustrates how technology has progressed over two decades.

The pandemic has highlighted the need for resilience, real-time visibility, and interoperability in supply chains.

 

Connect with GS1 US:

Our website - www.gs1us.org

GS1 US on LinkedIn

Food Safety Resources

 

Connect with guest:

Andy Kennedy on LinkedIn

Check out New Era Partners

Tuesday, 14. May 2024

Origin Trail

ChatDKG.ai: Driving synergies across the AI solution landscape to tackle hallucinations, bias, and…

ChatDKG.ai: Driving synergies across the AI solution landscape to tackle hallucinations, bias, and model collapse

Abusing social networks for political manipulation may look minuscule compared to a lack of trust in the solutions to which we are "outsourcing" our cognition: AI systems. As we trust them to process large amounts of knowledge and provide inputs for our actions, or even to perform certain actions autonomously, they carry the highest possible requirements for transparency and verifiability.

There should be no compromise in designing AI solutions when it comes to data ownership, information provenance, verifiability of information, or bias, which would include any censorship-by-design approach. If this revolution does not unfold in an inclusive way, we face the societal threat of a monopoly on AI.

We therefore introduced an effective way of establishing a new paradigm, using a Decentralized Retrieval-Augmented Generation (dRAG) framework. dRAG advances the RAG model by organizing external sources in a Decentralized Knowledge Graph (DKG) while introducing incentives to grow a global, crowdsourced network of knowledge made available for AI models to use. See an example of ChatGPT-4 responses without and with dRAG:

The dRAG framework enables a hybrid, decentralized AI system that brings together neural (e.g. LLMs) and symbolic AI (e.g. Knowledge Graph) methodologies. Contrary to using a solely probabilistic neural AI approach, the symbolic AI approach enhances it with the strength of Knowledge Graphs, introducing a more deterministic component.

Figure 1: The interplay between neural (LLMs) and symbolic (KGs) AI methodologies. Source: https://arxiv.org/pdf/2306.08302.pdf [4]
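
To ground the idea, here is a minimal sketch of the dRAG pattern under stated assumptions: queryKnowledgeGraph and callLLM are hypothetical stand-ins, not real OriginTrail or LLM-provider APIs.

```typescript
// Sketch of dRAG: retrieve verifiable statements from a knowledge graph,
// then ground the LLM's answer in them (both functions are placeholders).
type Triple = { subject: string; predicate: string; object: string; proof: string };

declare function queryKnowledgeGraph(question: string): Promise<Triple[]>;
declare function callLLM(prompt: string): Promise<string>;

async function answerWithDRAG(question: string): Promise<string> {
  // Symbolic step: fetch triples whose integrity can be checked on-chain.
  const triples = await queryKnowledgeGraph(question);
  const context = triples
    .map((t) => `${t.subject} ${t.predicate} ${t.object} (proof: ${t.proof})`)
    .join("\n");
  // Neural step: the LLM answers using only the verified context.
  return callLLM(
    `Answer using only the verified statements below.\n${context}\n\nQuestion: ${question}`
  );
}
```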

To foster harmonious development between Web3 fundamentals and rapidly deployed AI systems, our approach is to integrate core Web3 technologies such as the OriginTrail Decentralized Knowledge Graph (DKG) with AI systems (OpenAI, Gemini, Microsoft Copilot, xAI's Grok, and others).

We believe that we can realize the potential of trusted AI by creating a Verifiable Internet for AI that is founded on principles of neutrality, inclusiveness, and usability, while giving users freedom of choice with a multi-modal and a multi-model AI framework.

ChatDKG.ai: neuro-symbolic AI driving synergies across the solution landscape

Below, you can find a comparison table of AI types and frameworks with very diverse advantages and disadvantages. The most widely adopted, centralized AI solutions, such as Google Gemini, OpenAI, xAI, and Perplexity, deliver immense value across a spectrum of use cases. By leveraging OriginTrail's dRAG, under the brand name ChatDKG.ai, they can address their shortfalls by harnessing the synergy of neuro-symbolic AI, data ownership, and better cost performance. ChatDKG.ai is therefore not competing against any established AI solutions; rather, it empowers users to enhance them with its dRAG, driving knowledge verifiability, cost effectiveness, users' sovereignty in owning their data, and freedom of AI model choice.

The open-source and permissionless nature of the OriginTrail DKG allows for inclusiveness and neutrality, giving users a tremendous level of freedom on all layers: to choose AI models (enabled by DKG data portability), to choose knowledge sources discoverable in the DKG, and to pick AI services, centralized or decentralized, on different blockchains.

The same principles apply to AI agents, search engines, and a growing variety of AI services integrating into every tool in existence. By leveraging dRAG, they will enable user freedom of choice, AI autonomy, and trust, all while benefiting from network effects through connectivity.

As AI becomes the new UI, make sure you are on the right side of history. Let’s build towards a Truly Open AI, together!

Apply for the ChatDKG.ai inception program today!

ChatDKG.ai: Driving synergies across the AI solution landscape to tackle hallucinations, bias, and… was originally published in OriginTrail on Medium, where people are continuing the conversation by highlighting and responding to this story.


We Are Open co-op

From Ramble Chats to Utopian Futures

Season 9 of the Tao of WAO is now available!

After a meticulously organised Season 8, which was a submission for a journal article focusing on media literacies, Season 9 of The Tao of WAO podcast takes a more relaxed and spontaneous approach.

Co-hosts Laura Hilliger and Doug Belshaw (me!) engaged in a series of ramble chats, covering a wide array of topics from goofy websites to the potential future impacts of AI. We even spice things up mid-season with a special guest to discuss utopian futures.

Join us for a season filled with thought-provoking discussions, personal anecdotes, and plenty of spontaneous insights. You can find The Tao of WAO wherever you get podcasts.

We’re going to use SoundCloud embeds below, but encourage you to use whatever platform you’re most comfortable with!

❤️ Support The Tao of WAO on OpenCollective

(thanks to Tim Eccleston, Adam Procter, and Alex Enkerli for their ongoing donations to keep the lights on!)

Can’t support the podcast financially? Spread the word! Rate us on podcast platforms, leave a comment here or on one of the episodes via SoundCloud! We love to hear from anyone listening in (hi, mum!)

S09 E01: Ramble Auctions

Laura and I kick off the season with a casual chat about our personal technology use, ranging from e-readers and work devices to gaming. We explore AI’s impact on jobs and highlight some unique websites like the German customs auction platform.

We also note a trend of professionals, including us, experiencing job market shifts potentially influenced by AI. The episode concludes with an invitation to engage with us through social media, reviews, or direct contact.

S09 E02: Wikis

In this episode, we explore the Six Degrees of Wikipedia tool, feminism, and the divergence of Gen Z’s political views by gender. We touch on conspiracy theories involving Taylor Swift, our experiences with Wikipedia, and the role of AI.

We also address misconceptions about feminism in younger generations and discuss the media’s role in amplifying controversial figures like Andrew Tate. We end on a lighter note with a discussion on monumental trees around the world, sharing personal anecdotes and concerns about nature preservation.

S09 E03: Etherpad

We lament the crashing of our Etherpad server in this episode, and chat about the tools we use for knowledge management. This episode offers a candid look at the challenges and solutions we encounter in our workflow.

S09 E04: Adam Greenfield

In this episode, we had the pleasure of speaking with Adam Greenfield, author of Radical Technologies: The Design of Everyday Life. He shared insights from his upcoming book Lifehouse: Taking Care of Ourselves in A World on Fire. It’s a thought-provoking discussion spanning various aspects of technology and society.

S09 E05: Yacht Money

Laura and I ramble in this episode about my lost microphone and the sound quality of the mic in my Apple Studio Display, eventually getting stuck into broader societal issues.

Along the way, we critique Adobe’s commodification of creativity through digital credentials and highlight the importance of local, community-driven initiatives. Our conversation explores futuristic concepts like AI developing its own form of consciousness and experiencing the world directly, contrasting with natural communication in ecosystems, such as trees using mycelium networks.

S09 E06: Charcuterie of Asshattery

In the final episode of the season, we cover a variety of topics including Laura’s anger over a minor incident (hence the episode title), poetry, yoga, electric scooters, book recommendations, and ‘debogging yourself’. It’s a diverse and engaging discussion that captures the essence of our ramble chats.

Next: Season 10?

We hope you enjoy listening to Season 9 of The Tao of WAO as much as we enjoyed making it. Feel free to reach out with your thoughts, and remember to support us if you can through Open Collective.

We’re open to sponsorship of the podcast, but you have to be aligned with the Spirit of WAO. Get in touch if you’re interested: podcast@weareopen.coop

From Ramble Chats to Utopian Futures was originally published in We Are Open Co-op on Medium, where people are continuing the conversation by highlighting and responding to this story.

Monday, 13. May 2024

Identity At The Center - Podcast

Dive into our latest episode of The Identity at the Center Podcast!


Dive into our latest episode of The Identity at the Center Podcast! This week, we explored the world of authorization within the IAM space with our guest, Omri Gazitt of Aserto. We discussed the evolution of authorization, the crucial role of developers in adopting standards, and the journey towards a single authorization control plane for multiple applications. A significant part of our conversation is focused on AuthZEN, a working group at the OpenID Foundation, co-chaired by Omri, which aims to standardize and simplify authorization. Don't miss this enlightening discussion. Check out the episode at idacpodcast.com, on YouTube, or in your favorite podcast app. Remember to subscribe so you won't miss any future episodes!

Friday, 10. May 2024

MOBI

ILATAM


The Center for Advanced Vehicle and Technologies Research and Application (ILATAM) specifically focuses on smart vehicles and intelligent transportation, digitalization of the mobility ecosystem, next-generation battery and power transmission systems, functional safety, advanced vehicle software, testing and verification, electrification, power electronics, and lightweighting technologies. https://www.ilatam.net/

The post ILATAM first appeared on MOBI | The Web3 Economy.

Thursday, 09. May 2024

Origin Trail

Powered by OriginTrail, ChatDKG.AI is unlocking a Truly Open Artificial Intelligence


The convergence of Internet, Crypto, and Artificial Intelligence (AI) appears as the antidote to many of the current shortcomings of AI — hallucinations, bias, data ownership challenges, even model collapse. It also holds the key to unlocking AI’s full societal potential by enabling a truly open AI. In a world of AI solutions relying on performance battles of pre-trained AI models, ChatDKG stands for freedom of choice. A truly open AI approach allows anyone to define the trusted knowledge as the foundation for their AI solutions. It further extends the freedom of choice from trusted knowledge to multimodal support of a plethora of different AI models. For creators ready to take the plunge into the truly open AI future, an inception program is available to onboard them to ChatDKG.

A new paradigm for AI — Your knowledge network is your net worth

The currently prevalent approach in creating AI solutions is a siloed one — either in the shackles of a single AI model or a walled garden of external sources provided by a single organization. ChatDKG is breaking down those walls with an inclusive and open approach to creating AI solutions. Leveraging the OriginTrail Decentralized Knowledge Graph (DKG), it creates a neutral, inclusive knowledge foundation. As anyone publishes their knowledge to the DKG, their addition gets equipped with a timestamp, their identity, ascribed ownership, and verifiability proofs ensuring their knowledge is kept intact. Through a framework of decentralized Retrieval-Augmented Generation (dRAG), such knowledge is made available to any AI model as input to produce outputs that both have information provenance and respect data ownership. As trusted knowledge inputs become mission-critical for AI solutions, it ignites a knowledge economy where your knowledge network is your net worth.
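As a rough illustration of that publishing step, here is a minimal sketch using a DKG client. The constructor options and asset.create call are modeled on OriginTrail’s open-source dkg.js client but should be read as assumptions rather than code from this post; the endpoint, blockchain name, and content are placeholders.

import DKG from 'dkg.js' // assumed client; actual packaging may differ

async function publishKnowledgeAsset() {
  // Connect to a DKG node (placeholder endpoint and blockchain values).
  const dkg = new DKG({
    endpoint: 'https://your-dkg-node.example',
    port: 8900,
    blockchain: { name: 'otp:2043' },
  })

  // Publishing attaches a timestamp, the publisher's identity, ownership,
  // and verifiability proofs to the Knowledge Asset, as described above.
  const asset = await dkg.asset.create(
    {
      public: {
        '@context': 'https://schema.org',
        '@type': 'Product',
        name: 'Example product knowledge',
      },
    },
    { epochsNum: 2 }, // how many epochs the asset stays replicated
  )

  console.log(asset.UAL) // Universal Asset Locator identifying the new asset
}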

Getting involved with Initial Paranet Offering

In addition to building trusted AI solutions with ChatDKG, participants in this new knowledge economy can get involved in growing the DKG in knowledge domains important to them. New types of autonomous knowledge networks can appear in any domain, narrow or wide, such as:

Decentralized Science — building a truth layer from existing scientific knowledge to power autonomous AI research and decentralized scientific advancement.

Real World Assets (RWA) — creating tokenized digital twins from any asset that exists outside the digital spectrum, such as real estate, commodities, art, and more.

Industry 4.0 — revolutionizing production processes to achieve the highest safety and efficiency, faster innovation, and tailored customer interactions with trusted AI solutions.

Launching an Initial Paranet Offering (IPO) allows anyone to open up their mission to others who would like to join in creating a knowledge foundation in that domain. Through an IPO, a paranet operator also ensures incentive rewards for participants that will add useful knowledge — knowledge miners. By accessing NeuroWeb network incentives for mining new knowledge, the paranet’s trusted knowledge grows in a crowdsourced way.

Future-proofing for continuous advancements of AI

ChatDKG’s truly open AI does not only lower dependency on any single AI company; the freedom to choose AI models also ensures that any development in the field of AI models, prompt engineering, agent frameworks, or similar makes your solution stronger. By putting the trusted knowledge foundation in the driving seat, the tech giants working on advancing AI systems are working in your favor, as you are always able to use your knowledge network with the latest AI technologies.

Inception program open for application

Those looking to build the next big thing with ChatDKG’s trusted AI solution or launch an Initial Paranet Offering can submit their application for the ChatDKG inception program to receive additional support from the OriginTrail team. Applications are open on the website.

Discover a Truly Open AI & apply for the ChatDKG inception program at: https://chatdkg.ai.

Powered by OriginTrail, ChatDKG.AI is unlocking a Truly Open Artificial Intelligence was originally published in OriginTrail on Medium, where people are continuing the conversation by highlighting and responding to this story.


The Engine Room

Questions about responsible data and digital resilience? We got you :) 


Join our AMA on Monday, May 20th on responsible data and digital resilience. 

The post Questions about responsible data and digital resilience? We got you :)  appeared first on The Engine Room.

Wednesday, 08. May 2024

Me2B Alliance

ISL Finds Location-Based Advertising on Kids’ Site CoolMathGames.com


Updated May 21, 2024: Since our original post, Sandbox Group has taken steps to make clear that CoolMathGames.com is not intended for children under the age of 13, and that children under 13 should instead use CoolMath4Kids.com. Note however, that the advertising behavior in CoolMathGames.com (particularly including the use of location information) is risky for users of any age. See the update at the end of the post for more details. 

1  Executive Summary

Last year, Internet Safety Labs (ISL) observed that the CoolMathGames.com website (from CoolMath.com, LLC [referred to as “Cool Math”], owned by Sandbox Group) contained location-based behavioral advertisements. Using the Federal Trade Commission’s guidelines on a “child-directed” website, ISL believes CoolMathGames.com may fall under the protections afforded by the US Children’s Online Privacy Protection Rule (COPPA).

ISL had seen Cool Math before, as it came up in our 2022 K-12 EdTech benchmark, where twenty-five (3.8%) schools in the sample recommended or required Cool Math Games to students. Twenty-one (84%) of those 25 schools were elementary or middle schools (i.e. schools with children under the age of 13; see Table 8.1 for more details).

Of note, however, there is another Cool Math site called CoolMath4Kids.com which has different ad behavior that may be compliant with COPPA. However, none of the schools in our 2022 benchmark recommended CoolMath4Kids.com.

Both Cool Math sites utilize a COPPA Safe Harbor Certified [by KidSafe] advertising platform from Playwire, which is also listed as the “ManagerDomain” in both Cool Math websites’ ads.txt files. The ManagerDomain role is responsible for managing ads on the domain.

ISL wonders what Playwire’s responsibility is in ensuring that the COPPA Certified configuration is applied to child-directed sites for which it is the manager domain in the ads.txt file. ISL further questions whether an ad platform really can/should be COPPA Safe Harbor certified, since they can be easily configured as either COPPA compliant or non-compliant, as seems to be the case with the Cool Math websites. 

ISL requested that CoolMath.com, LLC / Sandbox Group modify the CoolMathGames.com site, removing the behavioral advertising [i.e. suggested that they use the seemingly COPPA compliant version of the ads.txt file]. Despite our repeated attempts, Sandbox Group did not modify the site and the behavioral ads remain on the site.  

2  Description of Problem(s)

Site:  coolmathgames.com 

Number of Monthly Users:  13.03M monthly visitors.  74% from the USA, 6% each from the UK and Australia, 5% from Canada, and about 1% from New Zealand. (Source: Similarweb.com) 

Description of Problem: While evaluating coolmathgames.com, ISL observed the presence of location-based behavioral advertising and cross-site trackers uniquely identifying users (i.e. children). Despite being contacted several times by ISL, Sandbox Group, the parent company of Cool Math Games, has not acted on ISL’s request to change the advertising behavior in CoolMathGames.com.

3  Details

The CoolMathGames.com homepage has multiple ads on its website. As seen below in Figure 3.1, without scrolling the page the user sees three (3) advertisements. Thus, upon loading the CoolMathGames.com webpage, viewer information is sent into the Real Time Bidding (RTB) stream as a part of filling the ad spaces on the homepage. This information includes data such as a unique user identifier, as well as geolocation information. These advertisements are not generic/contextual. The advertisement indicated by “1” contains geolocation information, which is blurred for researcher privacy.

Figure 3.1

Figure 3.2 shows the geolocation information in clear text in the website’s traffic.  Along with the geolocation information, internet service provider (ISP) information is also collected and transferred.  Again, the geolocation information has been blurred for researcher privacy.

Figure 3.2

3.1  Ads.txt Files

Per the Internet Advertising Bureau (IAB) standard, any website that serves advertisements should contain an “ads.txt” file. This file can usually be found at /ads.txt, and contains a list of the Authorized Digital Sellers for ads served on the website. (Note that there is also a version of Authorized Digital Sellers for mobile apps, called app-ads.txt.) Figure 3.3 shows the ads.txt files for CoolMathGames.com, which indicates that the file is managed by Playwire.com.

Figure 3.3
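For illustration, here is a minimal TypeScript sketch (not part of ISL’s tooling) of fetching a site’s ads.txt and parsing it into seller records; the fields follow the IAB ads.txt format described above, and the parsing is deliberately simplified.

// Record fields per the IAB ads.txt spec: ad system domain, seller account ID,
// relationship (DIRECT or RESELLER), and an optional cert authority ID.
type AdsTxtRecord = {
  adSystemDomain: string
  sellerAccountId: string
  relationship: string
  certAuthorityId?: string
}

async function fetchAdsTxt(site: string): Promise<AdsTxtRecord[]> {
  const res = await fetch(`https://${site}/ads.txt`)
  const body = await res.text()
  return body
    .split('\n')
    .map((line) => line.split('#')[0].trim()) // strip comments
    .filter((line) => line.includes(',')) // skip blanks and variables like MANAGERDOMAIN=
    .map((line) => {
      const [adSystemDomain, sellerAccountId, relationship, certAuthorityId] = line
        .split(',')
        .map((field) => field.trim())
      return { adSystemDomain, sellerAccountId, relationship, certAuthorityId }
    })
}

// Example: fetchAdsTxt('coolmathgames.com').then((records) => console.log(records.length))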

In network traffic, researchers saw calls to a Playwire configuration json file. Figure 3.4 shows this configuration file, with the pertinent information for CoolMathGames.com expanded. Note the presence of a “coppa” flag [set to “false”] as well as website categories including “kids” and “games_casual”. Playwire offers COPPA-compliant advertising services, but we can see this does not appear to be applied to CoolMathGames.com, despite the primary category being, by their own designation, “kids”.

Figure 3.4
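The mismatch ISL describes can be expressed as a one-line consistency check. This sketch assumes only the two config fields visible in Figure 3.4 (the “coppa” flag and the category list); the type name is hypothetical.

// A config that categorizes a site as "kids" while leaving COPPA off is suspect.
type PlaywireSiteConfig = { coppa: boolean; categories: string[] }

function isInconsistentConfig(config: PlaywireSiteConfig): boolean {
  return config.categories.includes('kids') && !config.coppa
}

// Mirrors the CoolMathGames.com configuration described above:
console.log(isInconsistentConfig({ coppa: false, categories: ['kids', 'games_casual'] })) // true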

4  Concerns

These findings lead to three concerns.  

First and foremost is the concern over the presence of behavioral ads—particularly location-based—in the CoolMathGames.com website, a service that is recommended to students in elementary and middle schools around the country. In the ISL 2022 EdTech benchmark, 21 elementary or middle schools (with students as young as pre-K) recommended CoolMathGames.com (see Table 8.1).

Second, how much responsibility does Playwire have in this scenario? It’s clear in their pre_content config file that the site is for “kids” but the COPPA flag is set to false. Moreover, Playwire has the role of ManagerDomain for the ads.txt file. What is their responsibility in this case?

Finally, ISL questions whether ad platforms like Playwire can realistically be COPPA Safe Harbor Certified—at least in the way that it is currently being performed. COPPA certification for ad platforms requires monitoring of the deployed configuration, and that does not appear to be happening right now. ISL was able to have a discussion with kidSAFE regarding these concerns, and the issue appears to be systemic in nature, i.e. with the nature of the Safe Harbor certification program requirements as prescribed by COPPA and the FTC.

5  ISL’s Responsible Disclosures

ISL undertook the following efforts to make CoolMath.com aware of the concerns stated in this report:

December 12, 2023: ISL sends first email to CoolMath.com.
December 19, 2023: ISL receives a response from Sandbox Group, including a number of questions.
December 22, 2023: ISL sends a response with detailed data/files from research and answers to questions. No response was received.
January 16, 2024: ISL sends an email seeking a response. (None received.)
February 22, 2024: ISL sends another email seeking a response. (None received.)

ISL also sent emails to Playwire (legal@playwire.com) on December 12, 2023 and twice on January 16, 2024, each time receiving an automated and unhelpful response.

6  Call to Action for CoolMath.com LLC, Sandbox Group, and Playwire

Once again, ISL requests that CoolMath.com LLC and Sandbox Group remove behavioral advertising from CoolMathGames.com. In particular:

Inform the ad networks and exchanges you partner with that this site is child-directed and should therefore receive COPPA protections, and
Immediately change the CoolMathGames ads.txt and app-ads.txt files for the website and the two mobile apps to the appropriate “Playwire COPPA Ads.txt” files.

ISL also requests that Playwire take a more active role in providing the right ads.txt files for child-directed sites.  

7  Safety Suggestions

ISL suggests that schools refer students to CoolMath4kids.com instead of CoolMathGames.com until behavioral advertising is removed from the latter site.

8  References

The following table 8.1 lists the schools from the ISL 2022 K-12 EdTech Safety Benchmark found to be recommending or requiring the use of CoolMathGames to students. 

Table 8.1 Schools in 2022 EdTech Benchmark Recommending CoolMathGames

About ISL Responsible Disclosures 

As a non-profit, independent product safety watchdog organization, Internet Safety Labs (ISL) is dedicated to catalyzing software changes to keep people safer while using technology. To this end, we sometimes discover serious safety risks as we conduct our ongoing research; we aren’t looking for these, but we happen upon them (such was the case with the dangling domain that Apple ultimately purchased, keeping potentially millions of people safe).

When we find these risks, our practice is to contact the developer and request that they make a specific change. We call this a responsible disclosure of a safety risk in the software, similar to the responsible disclosure of a security vulnerability. The best outcome is that the developer makes the change, and we then commend their commitment to keeping their users safe.

No organization is exempt from our safety scrutiny—whether it’s a commercial entity, a non-profit organization, or a government organization. Our responsible disclosures of safety risks are offered in a constructive and supportive spirit, working from an assumption that the organization may not be aware of the risk.

May 21, 2024 Update 

Since our original post, ISL has observed the following changes to CoolMathGames.com and related materials: 

The CoolMathGames.com website has been modified to include a “Games for Kids” notification and link to CoolMath4Kids.com in the upper right corner:

Figure A.1

The CoolMathGames.com privacy policy was updated on May 11, 2024 and now includes the highlighted statement:

“CoolmathGames.com is designed for users aged 13 and older. By accessing our Website, you attest that you are at least 13 years of age. If you are under 13 years of age, please visit our kids gaming website at Coolmath4kids.com.”

Figure A.2

The CoolMathGames.com terms of service has been similarly updated:

“CoolmathGames.com is designed for users aged 13 and older. By accessing our Website, you attest that you are at least 13 years of age. If you are under 13 years of age, please visit our kids gaming website at Coolmath4Kids.com.”

Figure A.3

The “Kids” tag was removed from the Playwire config file, leaving only “games_casual”. The COPPA flag remains set to “false”:

Figure A.4

ISL appreciates CoolMath.com, LLC / Sandbox Group‘s recent efforts in clarifying that CoolMathGames.com isn’t intended for children and students under the age of 13. This type of labeling is important and ISL would like to see it in place as a universal practice.

While ISL is glad that CoolMath.com, LLC / Sandbox Group made these changes, we are disappointed that the changes didn’t come when we made the initial, private disclosure, and instead only happened after our 90-day window had expired and we published the first version of this blog post.  

 

The post ISL Finds Location-Based Advertising on Kids’ Site CoolMathGames.com appeared first on Internet Safety Labs.

Tuesday, 07. May 2024

FIDO Alliance

State of Michigan’s MiLogin Adopts Passkeys

“FIDO Drives Strong Authentication Results for the State of Michigan’s MiLogin”

The State of Michigan’s Department of Technology, Management & Budget (DTMB) is a principal department of the state’s government responsible for providing a wide range of support functions to other state agencies.

The department’s broad spectrum of responsibilities includes technology services, labor market information, facilities management, financial services, procurement, retirement services, real estate management, the Michigan public safety communication system, fleet and records management, and more.

The DTMB also plays a crucial role in cybersecurity for the state, providing resources and tools to protect against cyber threats and manage the State’s IT infrastructure. One of the DTMB’s efforts is the MiLogin digital identity solution, which enables over 10 million users to access state government services securely and conveniently.

DTMB was looking to secure MiLogin (Michigan’s application that allows users to access multiple state applications and services with a single user ID) with strong authentication that improves the user experience, and decided to go with passkeys, based on FIDO authentication.

Passkeys are a password replacement that provide faster, easier, and more secure sign-ins to websites and apps across a user’s devices. Unlike passwords, passkeys are resistant to phishing, are always strong, and are designed so that there are no shared secrets.

Key Objectives

The State of Michigan aimed to address several key objectives with the integration of passkeys:

Enhance the digital user experience.  

The goal was to streamline the digital user experience, particularly in providing users with seamless access to critical state  government services. DTMB aimed to simplify the login process, making it more user-friendly and efficient. 

Reduce help desk support dependency.  

Recognizing the strain on help desk resources due to login access issues, DTMB sought to reduce users’ need to access help desk support. By implementing changes to enhance the login process, the goal was to empower users to resolve login issues on their own.

Fortify security resilience.  

There is no shortage of risks and vulnerabilities associated with traditional username and password authentication. A key objective was to fortify the system against security threats and phishing incidents by adopting advanced FIDO strong authentication to mitigate the risks commonly exploited by bad actors seeking unauthorized access.

The Importance of Open Standards and Interoperability 

Before deciding to implement passkeys, the DTMB explored a proprietary passwordless login solution offered by a cloud-based identity-as-a-service (IDaaS) provider. However, the solution lacked the interoperability required.

The DTMB determined early on in its process that open standards and interoperability were critical and required components of its strong authentication strategy.

A standards-based approach provides interoperability across popular device types and web browsers, maintains vendor neutrality, allows for cost savings through community adoption, and offers a pathway to adopt future innovations in the FIDO ecosystem.

The Solution: FIDO Drives Results 

Passkeys checked all the boxes for the DTMB, utilizing open standards and an interoperable approach for authentication.

The DTMB found that passkeys provide the following advantages:

Passkeys are based on open standards, ensuring interoperability without necessitating additional software for users to download.

Multiple vendor support for FIDO standards and the tech’s rapid adoption promotes the long-term continuity of FIDO authentication as a service.

FIDO standards accommodate various authenticator types (such as biometric sensors, hardware keys, etc.) across desktop and mobile devices, catering to the diverse authentication requirements of the DTMB’s user base.

Prioritizing user and ecosystem partner security, passkeys provide strong phishing resistance.

MiLogin’s Path to Passkeys

The DTMB’s passkey rollout involved a meticulous process to ensure a seamless, secure transition. 

Extensive research on passwordless authentication solutions was conducted, engaging the DTMB’s cybersecurity review  board in the evaluation process. Upon selecting passkeys for further exploration, the DTMB delved into the analysis of  various FIDO options and sought feedback from the National Institute of Standards and Technology (NIST). 

Working together with Deloitte, which is the DTMB’s trusted systems integrator for the State’s enterprise digital identity  solution, a comprehensive strategy was planned for the design, development, and implementation phases. 

In the design phase, insights from the FIDO Alliance’s usability study results were worked into screen and workflow designs. Findings from the DTMB’s MiLogin human-centered design usability study were also used to create a user experience tailored to address the diverse needs of various personas.

The development phase focused on integrating MiLogin with FIDO authentication methods, accompanied by the creation  of animated user help guides and tutorial videos to drive greater user adoption. 

Post-implementation, the DTMB monitored production metrics and gathered feedback from end-users, ensuring the  success of the implementation and identifying areas for functionality enhancements in future releases.

MiLogin’s Impressive Passwordless Results 

Within the first six months of release, MiLogin achieved impressive results for the State of Michigan:

100,000+ customer devices enrolled in passkeys
~18,000 new passkey enrollments per month
Increased FIDO-based logins with zero reported issues
Decreased help desk initiated password resets, with 1,300 fewer calls related to password resets in a single month

“I am proud that our MiLogin team has brought passwordless authentication to our public digital identities. Passwordless brings additional protections to our public digital identities, and helps protect our systems from account takeover attempts such as brute force and password spray attacks.”

– Jayson Cavendish, Chief Security Officer, State of Michigan, DTMB 

The Road Ahead for Passwordless in the State of Michigan 

The State of Michigan anticipates a significant increase in passkey adoption, targeting over 10 million public users.  They also plan to implement passwordless authentication for their workforce, integrating with their state directory  services solution. 

FIDO authentication is a part of the State of Michigan’s Zero Trust Identity strategy to establish a secure identity in citizen  interactions with state services. It will also improve the user experience, generate cost savings for the state, and increase adoption of the State’s digital identity solution by diverse state agency partners. 

So what advice does the DTMB have for other organizations? The State of Michigan recommends understanding diverse user bases and use cases, prioritizing user experience, and incorporating usability studies, clear end-user messaging, and a well-designed communication plan for a successful FIDO authentication implementation.

download the case study

Ceramic Network

WEB3 Points Library: Example App Tutorial

A tutorial walk-through of our latest experiment - a points library built directly on Ceramic!

As mentioned in our post introducing our points library, the Ceramic Solutions SDK also contains examples (found within the demo directory) designed to help developers understand how to use the libraries within a larger application context. The remainder of this article will walk through a tutorial of the "full-stack app" example. If you prefer to follow a video tutorial, visit our YouTube video of this tutorial.

The Use Case: Rewarding Points for Joining a Community

This example offers a simple demonstration of how one could use the points library to reward community members for their engagement such as joining key platforms. To keep things simple, this tutorial will walk through the act of joining a Discord server as the trigger.

Getting Started

While an identical version of this example lives directly in the SDK repository, we've pulled it into a separate codebase for this walk-through. To get started, clone the code, install your dependencies, and duplicate the example environment file:

# if you don't have pnpm installed
npm install -g pnpm@9

# otherwise:
git clone https://github.com/ceramicstudio/points-example && cd points-example
pnpm install
cp .env.example .env

Next, we'll need to configure a few environment variables:

NEXTAUTH_SECRET

We will be using NextAuth as our open-source authentication solution (in order to verify Discord server membership). You can create a secret easily by running the following in your terminal:

openssl rand -base64 32

DISCORD_CLIENT_ID and DISCORD_CLIENT_SECRET

This app will use Discord as the authentication provider (wrapped by NextAuth). To obtain these credentials, navigate to the Discord Developer Portal and set up a new application. On the left-hand panel, click on "OAuth2" and bring over your Client ID and Client Secret from the "Client Information" box.

Finally, since you'll be running the application locally, set the following URI value within the "Redirects" box (found under "Client Information"): http://localhost:3000/api/auth/callback/discord.

CERAMIC_PRIVATE_KEY

This is the private key your application will use to instantiate a static key:did in order to write points using the library. This DID will act as the identifier for the issuer of points for your application (you).

If you have the ComposeDB CLI installed globally, you can run the following command in your terminal to create one:

composedb did:generate-private-key

AGGREGATION_ID

A default value for this environment variable has been provided for you within the .env.example file. Please leave this as-is. You can reference the “Extending the Default Library Interfaces” section below to learn more about this variable.

PROJECT_ID

We will be using WalletConnect's Web3Modal for Web3 authentication. In your new .env file, assign your project id to the key labeled `NEXT_PUBLIC_PROJECT_ID`.

You can set up a developer account for free by visiting cloud.walletconnect.com. Once authenticated, create a new app and copy over the "Project ID" value (found in the dashboard view for that corresponding app).

Extending the Default Library Interfaces

For our use case let's assume the following about our point reward structure:

Participants can earn points related to engaging on certain platforms (such as Discord)
There are various ways participants can earn points across each platform (for example, following, posting, liking, and so on)

As such, our application logic has the following requirements:

Our application needs to easily access platform-specific subtotals for each participant
Our application also needs to easily access "global" totals for each participant as a sum of all eligible behavior

The points library's default implementation of the PointsAggregationInterface serves only the second requirement above:

type SimplePointsAggregation implements PointsAggregationInterface
  @createModel(
    description: "Simple points aggregation to an account at a specific date"
    accountRelation: SET
    accountRelationFields: ["recipient"]
  ) {
  issuer: DID! @documentAccount
  recipient: DID! @accountReference
  points: Int!
  date: DateTime!
}

This default model defines a SET accountRelation using the "recipient" subfield, which ensures that any given account can create only one model instance document pointing to a specific recipient. This will be useful for logging the global total for each participant, but would not be ideal for our first requirement.

Conversely, the SimplePointsAllocation model, the default type implementation of the PointsAllocationInterface (found in schemas), defines a LIST accountRelation, and is designed to track the history of allocation events rather than the current sum.

We've therefore chosen this example to show how easily developers can extend the defaults to meet more nuanced use cases. For this example app, we've defined a new type to fit our needs:

type ContextPointAggregation implements PointsAggregationInterface
  @createModel(
    description: "A simple context-based point aggregation model"
    accountRelation: SET
    accountRelationFields: ["recipient", "context"]
  ) {
  issuer: DID! @documentAccount
  recipient: DID! @accountReference
  points: Int!
  date: DateTime!
  context: String! @string(maxLength: 100)
}

The extension you'll notice is that our SET relation now sits on two fields - "recipient" and "context". If, for example, we wanted to sum up all Discord-eligible behavior under a "discord" context value, this new accountRelation scheme ensures that our application (as the issuer) can create only 1 model instance per recipient+context combination.

For this tutorial, we've already pre-deployed this special type to the default node endpoint (if you want to create your own custom extensions, you will need to set up a node first). The model's stream ID is the corresponding value for the AGGREGATION_ID key provided for you in your .env file.

Using the Points Library in the Application

You can find the points library being used in our app's server logic as API routes and a context utility. If you look at our context.ts file, you can see how we've imported our CERAMIC_PRIVATE_KEY and AGGREGATION_ID environment variables to instantiate our reader and writer classes:

import { PointsWriter, PointsReader } from '@ceramic-solutions/points'
import { getAuthenticatedDID } from '@ceramic-solutions/key-did'
import { fromString } from 'uint8arrays'

const CERAMIC_PRIVATE_KEY: string = process.env.CERAMIC_PRIVATE_KEY ?? ''
const aggregationModelID: string | undefined = process.env.AGGREGATION_ID ?? undefined

const seed = fromString(CERAMIC_PRIVATE_KEY, 'base16') as Uint8Array

// create a context writer
const contextWriter = await PointsWriter.fromSeed({
  aggregationModelID,
  seed,
})

// create a total writer
const writer = await PointsWriter.fromSeed({
  seed,
})

// generate issuer for reader context
const issuer = await getAuthenticatedDID(seed)

// create a context reader
const contextReader = PointsReader.create({
  issuer: issuer.id,
  aggregationModelID,
})

// create a total reader
const reader = PointsReader.create({
  issuer: issuer.id,
})

export { contextWriter, writer, contextReader, reader }

We're purposely creating 2 instances of each, one of which is blank and therefore uses the default value of aggregationModelID (the SimplePointsAggregation discussed above), while the other uses our imported extension.

In our API routes you'll find simple endpoints for reading and creating points. Notice how we import our writer instances from our context utility in order to mutate or create two separate documents:

import { type NextApiRequest, type NextApiResponse } from 'next'
import { contextWriter, writer } from '@/utils/context'
import type { ModelInstanceDocument } from '@composedb/types'
import { type PointsContent, type AggregationContent } from '@/utils/types'

interface Request extends NextApiRequest {
  body: {
    recipient: string
    amount: number
    context: string
  }
}

interface Response extends NextApiResponse {
  status(code: number): Response
  send(data: { contextTotal: number; total: number } | { error: string }): void
}

export default async function handler(req: Request, res: Response) {
  try {
    const { recipient, amount, context } = req.body

    // get context aggregation doc if exists
    const aggregationDoc: ModelInstanceDocument<AggregationContent> | null =
      await contextWriter.loadAggregationDocumentFor([recipient, context])

    // if aggregation doc does not exist for that context, set points aggregation for both context and global total
    if (aggregationDoc === null) {
      // update context-specific aggregation
      const updatedContextAgg: ModelInstanceDocument<AggregationContent> =
        await contextWriter.setPointsAggregationFor([recipient, context], amount, {
          recipient,
          points: amount,
          date: new Date().toISOString(),
          context,
        } as Partial<PointsContent>)

      // update total aggregation
      const updatedTotalAgg: ModelInstanceDocument<AggregationContent> =
        await writer.updatePointsAggregationFor([recipient], (content) => {
          return {
            points: content ? content.points + amount : amount,
            date: new Date().toISOString(),
            recipient,
          }
        })

      res.status(200).send({
        contextTotal: updatedContextAgg.content ? updatedContextAgg.content.points : 0,
        total: updatedTotalAgg.content ? updatedTotalAgg.content.points : 0,
      })
    } else {
      // points were already awarded for this context, so reply without
      // awarding again (explicit response added so the request never hangs)
      res.status(409).send({ error: 'Points already awarded for this context' })
    }
  } catch (error) {
    console.error(error)
    res.status(500).send({ error: 'Internal Server Error' })
  }
}

As you might've noticed, the current logic only issues the points if the recipient has not yet been rewarded for the input context, which makes sense given that the app in its current form only confirms Discord server membership.

You'll notice that our readPoints.ts route is even simpler, making use of the getAggregationPointsFor method our reader instances offer.
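For reference, here is a minimal sketch of what that route plausibly looks like; it assumes getAggregationPointsFor accepts the same recipient/context tuples used by the writer methods above and resolves to a point total, so the actual readPoints.ts in the repository may differ.

import { type NextApiRequest, type NextApiResponse } from 'next'
import { contextReader, reader } from '@/utils/context'

export default async function handler(req: NextApiRequest, res: NextApiResponse) {
  try {
    const { recipient, context } = req.body as { recipient: string; context: string }
    // Read the context-specific subtotal and the global total for the recipient.
    const contextTotal = await contextReader.getAggregationPointsFor([recipient, context])
    const total = await reader.getAggregationPointsFor([recipient])
    res.status(200).send({ contextTotal, total })
  } catch (error) {
    console.error(error)
    res.status(500).send({ error: 'Internal Server Error' })
  }
}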

Requesting to View Users' Servers with NextAuth

If you navigate to our auth.ts route you'll notice the inclusion of an authorization URI that specifies the scope of read access we need to view a user's server (or guild) membership:

providers: [
  DiscordProvider({
    clientId: env.DISCORD_CLIENT_ID,
    clientSecret: env.DISCORD_CLIENT_SECRET,
    authorization:
      "https://discord.com/api/oauth2/authorize?scope=identify+email+guilds",
  }),
],
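For completeness, here is a hedged sketch of the /api/servers route the frontend below calls: it forwards the user's OAuth access token to Discord's GET /users/@me/guilds endpoint (available thanks to the guilds scope above). The actual route in the repository may differ.

import { type NextApiRequest, type NextApiResponse } from 'next'

export default async function handler(req: NextApiRequest, res: NextApiResponse) {
  const { token } = req.body as { token: string }
  // Discord returns the guilds (servers) the token's user belongs to.
  const response = await fetch('https://discord.com/api/users/@me/guilds', {
    headers: { Authorization: `Bearer ${token}` },
  })
  const guilds = await response.json() // [{ id, name, ... }, ...]
  res.status(200).json(guilds)
}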

We use this access within the frontend component of our single-page app to request guild membership and determine if one of them is named "Ceramic" (alongside checking the user's points and awarding them if eligible):

const checkPoints = async (context: string, recipient: string) => {
  try {
    const response = await fetch("/api/readPoints", {
      method: "POST",
      headers: {
        "Content-Type": "application/json",
      },
      body: JSON.stringify({ recipient, context }),
    });
    const data = await response.json() as { contextTotal: number; total: number };
    setTotals(data);
    return data;
  } catch (error) {
    console.error(error);
  }
};

const awardPoints = async (
  context: string,
  recipient: string,
  amount: number,
) => {
  try {
    const response = await fetch("/api/createPoints", {
      method: "POST",
      headers: {
        "Content-Type": "application/json",
      },
      body: JSON.stringify({ recipient, context, amount }),
    });
    const data = await response.json() as { contextTotal: number; total: number };
    console.log(data);
    setTotals(data);
    return data;
  } catch (error) {
    console.error(error);
  }
};

const getGuilds = async (
  accessToken: string,
): Promise<Guilds[] | undefined> => {
  const guilds = await fetch("/api/servers", {
    method: "POST",
    headers: {
      "Content-Type": "application/json",
    },
    body: JSON.stringify({ token: accessToken }),
  })
    .then(function (response) {
      return response.json();
    })
    .then(function (data) {
      return data as Guilds[] | undefined;
    });
  console.log(guilds);
  return guilds;
};

...

const guilds = await getGuilds(access);
// check if user is in the Ceramic server
const ceramic = guilds?.find((guild) => guild.name === "Ceramic");
const points = await checkPoints("discord", `did:pkh:eip155:1:${address.toLowerCase()}`);
if (ceramic && (points?.contextTotal ?? 0) === 0) {
  setClaim(true);
}

// ... if claim
await awardPoints("discord", `did:pkh:eip155:1:${address.toLowerCase()}`, 20);

Running the Application

Once you've correctly generated and entered your environment variables, you can run the application in developer mode:

pnpm dev

Navigate to localhost:3000 in your browser and you should see the following:

Go ahead and self-authenticate by clicking "Connect Wallet." Once connected, you should see the following on your screen:

Given the instructions, you'll want to join the Ceramic Discord server (if you haven't already).

After both joining the Ceramic Discord server and self-authenticating with Discord, you should now be able to check the status of your eligibility:

Go ahead and click "Check Status" to view your current server membership and eligibility to claim points (this action queries our Discord NextAuth provider mentioned above, and automatically triggers a call to the readPoints.ts route):

If Ceramic now appears as one of the servers you're a member of, you'll be able to select "Claim Points" in order to engage your createPoints.ts route.

Finally, once the mutation is successful, you should now see the points you've just earned appear in the UI.

Integrate Allocations and Gitcoin Passport

As you might've noticed, this simple demo only uses the aggregation docs - after all, in this example, we're only awarding points for Discord engagement.

If you're interested in experimenting with an example that also uses the allocation docs, feel free to check out the with-gitcoin repository branch (which uses Gitcoin Passport scoring as an additional means to earn points).

What's Next?

While this demo showcases a straightforward example of how to put the point library into action, we encourage our community to fork this repository and build out features you'd like to see! The library is flexible and extendable by design.

Whatever your idea is, we'd love to hear about it as you build. Feel free to contact me directly at mzk@3box.io, or start a conversation on the Ceramic Forum. We look forward to hearing from you!


Energy Web

Introducing AutoGreenCharge by Energy Web: a decentralized application for decarbonizing electric…

Introducing AutoGreenCharge by Energy Web: a decentralized application for decarbonizing electric vehicle charging Mobile app helps automotive companies, charging companies, and EV drivers decarbonize electric mobility

Energy Web, a global nonprofit organization that builds open source software to accelerate the energy transition, has announced AutoGreenCharge by Energy Web, a new product that makes it simple for electric vehicle drivers, manufacturers, and other electric mobility companies to decarbonize electric vehicle charging. The application is designed to serve both electric vehicle drivers and e-mobility companies while bringing new levels of transparency and verifiability to the electric vehicle charging space.

In 2023, electric vehicles used approximately 110 terawatt-hours of electricity (comparable to the annual electricity consumption of the Netherlands), and this figure is expected to increase tenfold by 2030. And although electric vehicles are far more efficient than gas-powered vehicles, electric vehicle charging in most markets around the world is not fully renewable: in the U.S. approximately 38% of charging is green, while in Europe the figure is 60%.

AutoGreenCharge by Energy Web closes this gap by calculating the sustainability of charging sessions and matching the non-renewable portion with verified energy attribute certificates. These digital tokens represent renewable power generation and have been used by governments and corporations like Google and Microsoft since the 2010s to decarbonize business operations. AutoGreenCharge’s functionality extends to a mobile app, allowing drivers to clean up their charging, and can be integrated into existing apps used by EV manufacturers and charging companies. Over time, users will be able to choose certificates from a variety of power plants with different characteristics, such as solar power plants situated in particularly dirty grids that have a greater impact on decarbonization than renewable energy facilities in renewable-rich areas.
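As a back-of-the-envelope illustration of that matching step (illustrative arithmetic only, not AutoGreenCharge code), the certificate volume needed for a session is the session's energy multiplied by the grid's non-renewable share:

// Energy attribute certificates needed to cover the non-renewable portion.
function certificateKwhNeeded(sessionKwh: number, gridRenewableShare: number): number {
  return sessionKwh * (1 - gridRenewableShare)
}

// A 50 kWh charging session on a US grid where ~38% of charging is green:
console.log(certificateKwhNeeded(50, 0.38)) // 31 kWh to match with certificates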

Currently, Energy Web is partnering with top EV manufacturers and additional partners to test, develop, and release AutoGreenCharge to the public.

Sign up to become one of the first beta-testers at: www.autogreencharge.com

Sahas Katta, CEO at Smartcar, the leading connected car API platform and Greencharge partner, said: “This partnership will empower EV drivers with the knowledge and confidence to maximize the sustainability of their EV charging experience. With Smartcar, Energy Web can seamlessly integrate across diverse vehicle brands, giving more EV drivers access to accurate insights into charging transparency and renewable energy options.”

“We could not be more excited to apply Energy Web technology to electric vehicle charging,” said Jesse Morris, CEO of Energy Web Foundation. “This is a fast-growing sector of electricity demand; pairing demand with high-impact renewables from around the globe is a great way to bring more capital into renewable energy markets.”

About Energy Web Foundation:

Energy Web is a global non-profit accelerating the clean energy transition by developing open-source technology for energy systems. Our enterprise-grade solutions improve coordination across complex energy markets, unlocking the full potential of clean, distributed energy resources for businesses, grid operators, and customers. The Energy Web ecosystem comprises leading corporates, energy companies, utilities, renewable energy developers, transportation sector majors and telecommunications leaders. More information on Energy Web can be found at www.energyweb.org or follow us on Twitter @EnergyWebX

Introducing AutoGreenCharge by Energy Web: a decentralized application for decarbonizing electric… was originally published in Energy Web on Medium, where people are continuing the conversation by highlighting and responding to this story.

Monday, 06. May 2024

Identity At The Center - Podcast

A new episode of The Identity at the Center Podcast is now available


A new episode of The Identity at the Center Podcast is now available. In this episode, we discuss the intersection of AI and the IAM industry with Patrick Harding at Ping Identity. We delve into the future of AI, its potential impacts on identity, and the importance of AI governance. Listen in for an insightful conversation at idacpodcast.com or on your podcast app.

#iam #podcast #idac

Friday, 03. May 2024

FIDO Alliance

The Register: Microsoft, Google do a victory lap around passkeys


Passkeys are based on a FIDO alliance standard that’s supported by Apple, Microsoft and Google. Think of them as password replacements. The tech, simply put, works like this: When you create an account for a website or app, your device generates a cryptographic public-private key pair.
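For readers curious what that step looks like in code, here is a minimal browser-side sketch using the standard WebAuthn API that passkeys build on; the rp, user, and challenge values are placeholders a real site would supply from its server.

// Runs in the browser during account registration: the authenticator creates
// a key pair, keeps the private key on-device, and returns the public key.
const credential = await navigator.credentials.create({
  publicKey: {
    challenge: crypto.getRandomValues(new Uint8Array(32)), // a server nonce in practice
    rp: { name: 'Example Site', id: 'example.com' },
    user: {
      id: new TextEncoder().encode('user-123'), // opaque user handle
      name: 'alice@example.com',
      displayName: 'Alice',
    },
    pubKeyCredParams: [{ type: 'public-key', alg: -7 }], // ES256
    authenticatorSelection: { residentKey: 'required', userVerification: 'preferred' },
  },
})
// Only the public key and credential ID go to the server; no shared secret exists.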


Silicon Republic: Microsoft and Google are pushing harder for passkeys


Passkeys have been growing rapidly in popularity. In the UK, for instance, more than half the population has enabled passkeys on at least one of their accounts, according to a FIDO Alliance survey published this week. What’s more, around a fifth have passkeys activated on every account that allows them.


TechCrunch: Google expands passkey support to its Advanced Protection Program ahead of the US presidential election


Google is introducing passkey support to its Advanced Protection Program (APP), designed for individuals facing elevated risks of targeted attacks, including campaign workers, candidates, journalists, human rights activists, and others. The company reports that passkeys have authenticated users over 1 billion times across more than 400 million Google Accounts since the introduction of passkey support in 2022.


Origin Trail

OriginTrail Decentralized Knowledge Graph for trusted cross-organization real-time data integration…

OriginTrail Decentralized Knowledge Graph for trusted cross-organization real-time data integration in EU-funded DMaaST

Trace Labs, the core developers of OriginTrail, has joined the European Union’s initiative to foster a resilient and adaptive manufacturing ecosystem through the DMaaST project. Collaborating with partners from Slovenia, Spain, Germany, Portugal, Turkey, Serbia, Belgium, Lithuania, France, Denmark, and Switzerland, the initiative will leverage the OriginTrail Decentralized Knowledge Graph (DKG) and Knowledge Assets (KA) to encapsulate all pertinent information regarding products, processes, facilities, and human expertise. This comprehensive approach will facilitate the precise mapping of data flows and knowledge interconnections, laying the groundwork for comprehensive information mapping within the manufacturing ecosystem using OriginTrail DKG. Consequently, this will ensure trustworthy cross-organizational real-time data integration.

Once more, attention has been drawn to challenges within the aeronautic and manufacturing industries following a January incident in which a Boeing 737 MAX 9 door plug blew out in the middle of an Alaska Airlines flight. If the company had established reliable cross-organizational communication, it could have prevented this incident. Such communication would enhance the value chain’s responsiveness to external and unforeseen events, as well as improve operability and production planning capacity.

Effective, transparent, and reliable data exchange is essential for fostering sustainability, resilience, and energy efficiency in the manufacturing industry. However, over the past years, various challenges have come to the forefront within this sector.

Supply Chain Disruptions: The COVID-19 pandemic highlighted existing vulnerabilities in global supply chains, leading to disruptions in the flow of materials and components. Issues such as raw material shortages, transportation bottlenecks, and labor shortages have persisted, impacting manufacturing operations worldwide.

Cybersecurity Risks: With the increasing digitization of manufacturing processes through technologies like the Internet of Things (IoT) and Industry 4.0, cybersecurity threats have become a significant concern. Manufacturing facilities are increasingly vulnerable to cyberattacks that can disrupt operations, steal sensitive data, or compromise product quality and safety.

Data Silos: Manufacturing organizations often operate with fragmented data systems, leading to isolated data silos across departments or functions. This fragmentation inhibits seamless data interoperability and hampers comprehensive insights that could drive operational efficiency and innovation.

Lack of Standards: The absence of standardized data formats and protocols complicates data exchange and integration efforts within and across manufacturing enterprises. Without universally accepted standards, interoperability becomes a significant challenge, impeding the flow of data between different systems and stakeholders.

Data Privacy Concerns: With the proliferation of data collection and sharing practices in manufacturing, ensuring data privacy and protection is paramount. Manufacturers must navigate complex regulatory landscapes, safeguarding sensitive information from unauthorized access or misuse while balancing the need for data-driven decision-making.

Ownership and Control: Determining ownership rights and control over manufacturing data can be contentious, especially in collaborative environments or supply chain networks. Disputes may arise regarding data ownership, usage rights, and intellectual property, complicating data sharing agreements and hindering collaborative initiatives.

Legacy Systems Integration: Many manufacturing facilities still rely on legacy systems that were not designed with interoperability in mind. Integrating these outdated systems with modern data platforms and technologies poses significant challenges, requiring extensive customization, retrofitting, and investments in interoperability solutions.

DMaaST aims to enhance manufacturing ecosystem resilience and adaptability by employing a Smart Manufacturing Platform comprising four layers. The data layer establishes a foundation for real-time data integration across organizations using ontologies and OriginTrail Decentralized Knowledge Graph. Following this, a two-level cognitive digital twin is deployed to model both manufacturing services production lines and value chain stages. It incorporates human expertise, data-driven algorithms, and physical modeling. An algorithm for multi-objective distributed decision support systems leverages this data to facilitate optimal production decisions. Outcomes will be communicated via user-friendly interfaces and timely scoreboards, assessing circularity, sustainability, and product traceability. Over the four-year period, DMaaST ensures scalability and innovation by providing insights for replicating and improving manufacturing processes, advancing technologies in aerospace and electronics sectors.

Trace Labs will lead the data working group to develop and validate technologies aimed at facilitating data understanding, interoperability, and secure cross-organization integration. Integrating the OriginTrail DKG for the electronics and aeronautics sectors will create a powerful new knowledge base with artificial intelligence capabilities. The DKG will establish a decentralized database accessible to all participants in a manufacturing value chain, including manufacturers, suppliers, distributors, retailers, regulatory bodies, research institutes, and others. This will enhance the manufacturing ecosystem’s ability to autonomously withstand and adapt to external events.

OriginTrail DKG has been widely utilized to foster trust and transparency in enterprise knowledge exchange across various industries. Now, it is evolving to facilitate global knowledge connectivity, powering the Decentralized Retrieval Augmented Generation (dRAG) framework for more precise and inclusive AI. Given the challenges of verifying AI-generated results, OriginTrail DKG, with Knowledge Assets as its primary resource, represents a pivotal innovation in this context. It offers a robust framework for ensuring the ownership, discoverability, and verifiability of information utilized by AI systems for the manufacturing industry.

Project information available here: DMaaST Project

OriginTrail Decentralized Knowledge Graph for trusted cross-organization real-time data integration… was originally published in OriginTrail on Medium, where people are continuing the conversation by highlighting and responding to this story.

Thursday, 02. May 2024

FIDO Alliance

Microsoft Blog: Microsoft introduces passkeys for consumer accounts

Ten years ago, Microsoft envisioned a bold future: a world free of passwords. Every year, we celebrate World Password Day by updating you on our progress toward eliminating passwords for good. Today, we’re announcing passkey support for Microsoft consumer accounts, the next step toward our vision of simple, safe access for everyone.


ZDNet: Two years in, Google says passkeys now protect more than 400 million accounts

Google Account users have authenticated themselves using passkeys more than 1 billion times, but passwords are likely to be around for years.


EdgeSecure

Dr. Forough Ghahramani Receives Innovate 100 Nomination

NEWARK, NJ, May 2, 2024 – Edge is pleased to announce that Dr. Forough Ghahramani, Assistant Vice President for Research, Innovation, and Sponsored Programs at Edge, has been nominated as an INNOVATE100 honoree. Highlighting the top 100+ innovators statewide, the event will take place Tuesday, May 14th at the New Brunswick Performing Arts Center.

This inaugural INNOVATE100 initiative, led by James Barrood and Innovation+, is designed to highlight and elevate the finest innovation leaders from all sectors and industries across New Jersey. Ghahramani is proud to be recognized alongside other visionary leaders who are driving innovation forward.

Notes Dr. Ghahramani, “Being nominated as an INNOVATE100 honoree is a tremendous tribute to my colleagues and peers. I am honored to be recognized for efforts in driving innovation within the community. This accolade is a testament to the hard work and dedication of the incredible team at Edge and our higher education, government, and healthcare partners in New Jersey. Thank you to James Barrood and the Innovation+  community for supporting innovation in New Jersey.” 

I am honored to be recognized for efforts in driving innovation within the community. This accolade is a testament to the hard work and dedication of the incredible team at Edge and our higher education, government, and healthcare partners in New Jersey.

— Forough Ghahramani
Assistant Vice President for Research, Innovation, and Sponsored Programs, Edge

Dr. Samuel S. Conn, President and Chief Executive Officer, Edge, remarks, “We are thrilled to see Dr. Ghahramani nominated as an INNOVATE100 honoree. This recognition not only celebrates her outstanding contributions to research and innovation but also highlights the collaborative spirit of our team at Edge, the nation’s technology and innovation partner. We are proud to have her as part of our organization, driving innovation forward in our community and beyond.”

This celebration is being hosted in partnership with the NJBIA and Middlesex County, and a portion of the proceeds will fund the next generation of innovators via summer entrepreneurship/tech program scholarships for underserved teenagers.

Those interested in attending the event on May 14, 2024, should visit https://innovatenewjersey.com/ to register. 

About Edge

Edge serves as a member-owned, nonprofit provider of high performance optical fiber networking and internetworking, Internet2, and a vast array of best-in-class technology solutions for cybersecurity, educational technologies, cloud computing, and professional managed services. Edge provides these solutions to colleges and universities, K-12 school districts, government entities, hospital networks and nonprofit business entities as part of a membership-based consortium. Edge’s membership spans the northeast, along with a growing list of EdgeMarket participants nationwide. Edge’s common good mission ensures success by empowering members for digital transformation with affordable, reliable and thought-leading purpose-built, advanced connectivity, technologies and services.

The post Dr. Forough Ghahramani Receives Innovate 100 Nomination appeared first on NJEdge Inc.


FIDO Alliance

Google Blog: Passkeys, cross-account protection and new ways we’re protecting your accounts

For World Password Day, we’re sharing updates to passkeys across our products and sharing more ways we’re keeping people safe online.


CNET: World Password Day: We’re closer to ditching this crackable tech

Passkeys promise to be a big help, but until they take hold, we all need to make sure we’re still using good passwords.


The Washington Post: Microsoft is changing how you log in to your accounts

Microsoft 365, Copilot and Skype accounts can use “passkeys”, which are more secure than passwords.


New Survey: Half of People Use Passkeys as Frustrations with Passwords Continue

20% of the world’s top 100 websites now support the password alternative

MOUNTAIN VIEW, Calif., 02 May, 2024 – World Password Day may soon need a rebrand, as the FIDO Alliance survey released today shows that half of people in the US and UK have begun ditching the password in favor of more convenient and secure passwordless alternatives.

An independent survey commissioned by the FIDO Alliance found that 53% of people have enabled passkeys on at least one of their accounts, with 22% enabling them on every account they possibly can. In separate research, the Alliance found that passkeys are now supported by 20% of the world’s top 100 websites and 12% of the top 250. 

This shift away from passwords toward passkeys is being driven by three key trends: people’s concerns over password security, their frustrations in using them, and the growing availability of passkeys on major websites and services.

The Alliance’s research found that in the last year, 24% of people had at least one of their accounts compromised due to password vulnerabilities, and 26% had to reset or recover at least one password every month. In addition, 45% of consumers will abandon purchases if they have forgotten their password for that particular account. This is hugely significant for passkey adoption, as 61% of people familiar with passkeys consider them to be more convenient than passwords, and 58% believe they offer greater security. 

The availability of passkeys has also increased steadily over the last year, with Microsoft today announcing that Microsoft accounts – including a wide range of services like Bing, Microsoft 365 and Xbox.com – now also support passkeys. This adds to support from large global consumer brands, such as Adobe, Amazon, Apple, Google, Hyatt, Nintendo, PayPal, PlayStation, Shopify and TikTok. In all, more than 13 billion user accounts can now leverage passkeys for sign in.

As a result of high-profile passkey deployments like these, awareness of the technology has grown to 62% of people, according to the research. Among people with some knowledge of passkeys, those enabling them on at least one account rises substantially to 74%, while those enabling passkeys on every account possible rises to 32%. This suggests that adoption will only increase as more people become more familiar with passkeys.

“It was just two years ago that FIDO Alliance, alongside the world’s largest platform providers, introduced the vision for passkeys to accelerate the scale and usability of password-free sign-ins. The market’s reaction since then has been nothing short of phenomenal, with hundreds of services enabling billions of consumers to use passkeys,” said Andrew Shikiar, executive director and CEO of the FIDO Alliance. “We expect this trend to accelerate in the months and years ahead, and our research makes it clear that when offered, people prefer the better security and usability of passkeys over passwords.”

Ends

Notes to editors:

The independent survey was conducted by Sapio Research in April 2024 among 2,000 consumers across the UK and US – with 1,000 in each country. Results of any sample are subject to sampling variation. In this particular study, the chances are 95 in 100 that a survey result does not vary, plus or minus, by more than 2.2 percentage points from the result that would be obtained if interviews had been conducted with all persons in the universe represented by the sample. To calculate the proportion of the world’s top websites and services that support passkeys, the FIDO Alliance combined publicly available information with its own data on passkey deployments. 
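As a quick sanity check on that figure: ±2.2 percentage points is what the standard 95% margin-of-error formula gives for a simple random sample of n = 2,000, assuming the worst-case proportion p = 0.5:

\[
\mathrm{MoE} = z\sqrt{\frac{p(1-p)}{n}} = 1.96\sqrt{\frac{0.5 \times 0.5}{2000}} \approx 0.0219 \approx 2.2\ \text{percentage points}
\]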

About the FIDO Alliance

The FIDO (Fast IDentity Online) Alliance, www.fidoalliance.org, was formed in July 2012 to address the lack of interoperability among strong authentication technologies, and remedy the problems users face with creating and remembering multiple usernames and passwords. The FIDO Alliance is changing the nature of authentication with standards for simpler, stronger authentication that define an open, scalable, interoperable set of mechanisms that reduce reliance on passwords. FIDO Authentication is stronger, private, and easier to use when authenticating to online services.
Contact
press@fidoalliance.org


Human Colossus Foundation

Addressing Data Challenges for the Next Generation of Digital Therapeutics Development

“The largest bottleneck medical research organisations at the forefront of technology face is access to useful data. These difficulties stem from strict policies that limit access to ML teams instead of allowing technology to enable and ensure responsible data use.”

Lisbon, April 22-23

The Human Colossus Foundation presented its "Dynamic Data Economy” design for healthcare during a two-day workshop at the esteemed Champalimaud Foundation in Lisbon, Portugal. This concept holds immense potential for a patient-centric future of healthcare.
The "Addressing Data Challenges for the Next Generation of Digital Therapeutics Development" event delved into the complexities and opportunities surrounding data in healthcare.
Key discussions revolved around several critical facets, including:
Privacy and Cybersecurity: Ensuring robust measures to protect sensitive healthcare data from unauthorised access or breaches.
Privacy Enhancing Technologies (PETs): Implementing advanced tools and techniques to enhance data privacy without compromising utility.
Synthetic Data Generation: Exploring the potential of synthetic data as a means to preserve privacy while enabling broader data access for research and innovation.
Ethical Considerations: Incorporating ethical frameworks to guide the responsible collection, use, and sharing of healthcare data.
Health Data Infrastructure: Upgrading existing infrastructure to support the seamless exchange and interoperability of healthcare data across various systems.
Secured Access to Biobanks: Developing protocols and mechanisms to ensure secure access to biobanks for clinical and research purposes.
Economic and Organisational Challenges: Addressing the financial and structural barriers that hinder the effective use of healthcare data for transformative purposes.

We look forward to the practical implementation of DDE in the patient journey and to helping shape the future of patient data flows.


We thank the Champalimaud Foundation, Joe Paton, John Krakauer, and their organising team, Alexander Loktyushin, Eric Lacosse, and João Santinha, for hosting the event and creating a synergetic data-centric forum with a human-centric approach.

Subscribe to our newsletter

Wednesday, 01. May 2024

KABN Network

Ranking at the Top of YouTube Search Results

Learn how to rank at the top of YouTube search results

Why you need to rank at the top of YouTube search results

Ranking at the top of YouTube search results will help you in the following cases:

Getting more views, and therefore more revenue from ad views
Reaching a larger audience for your brand
Reaching more of the segment that has the problem your YouTube video solves
Gaining potential customers who contact you
Increasing awareness of your products and services

If you do not yet have a working method for profiting from top YouTube search rankings, contact us and we will help you as quickly as possible.

How to rank at the top of YouTube search results

You will rank at the top of YouTube search in the following cases:

If the content of your YouTube video offers real value to the visitor
If you follow YouTube's guidelines for optimizing your video for search engines
If your video title targets the right search keyword
If you enlist the help of a search engine optimization expert

Is ranking at the top of YouTube results difficult?

Ranking at the top of YouTube search results is not difficult. All you need is to work toward your goal and stay consistent, and you will get excellent results.

Do you need personal help ranking at the top of YouTube search results? Please contact us now.

How long does it take to rank at the top of YouTube search results?

Ranking at the top of YouTube search results usually takes several days, so do not give up. What matters is to keep improving the videos on your channel and to keep producing useful content. Contact us to learn about YouTube's algorithms.


Ceramic Network

We Built a Web3 Points Library on Ceramic

Learn about our new points library built on Ceramic - discover what it is, why we built it, and more!

For those in our community who have been staying up-to-date on our thoughts and experiments thus far as they relate to points, you already know why we see points as a logical vehicle for Web3 reputation systems. As our co-founder Danny points out in his blog post entitled Points: How Reputation & Tokens Collide, points not only align the incentives of the platform and its users, but Web3 also enables point systems to be more open and composable by "...making every event driving points both transparent and verifiable..."

Given Ceramic's native composability, provenance, performance, and flexible-by-design characteristics, "points built on Ceramic" became an obvious idea for a design pattern that would appeal to teams looking to integrate reward systems into their projects. In March, we released a post called Building Points on Ceramic - an Example and Learnings meant to walk readers through a points proof-of-concept application we deployed for EthDenver '24. Later that month, we also released another post to announce Oamo as the first points provider on Ceramic.

This is all to say - as an organization, we've strived to be thoughtful and deliberate about how we approach the opportunity Web3 points presents while aiming to maintain quick iteration and experimentation as engineering and product norms.

As such, we've been working on a new experiment that we're excited to share with you - a points library built on Ceramic!

Why Develop a Points Library?

Given that we identified points as an ideal use case for Ceramic, building a points library presented an opportunity to help developers start experimenting with and iterating on Ceramic much faster. We also predicted that the approach would accelerate developers' learning rate by packaging Ceramic within an easy-to-understand paradigm. Finally, a use case-specific library allowed us to:

Predefine library interfaces and base models for users, helping them bypass this development stage
Eliminate the friction of compiling and deploying a composite
Enable immediate utility and experimentation by removing the need for node setup
Help developers focus on their core application logic by baking straightforward functionality into the library

...and so on.

Which Use Cases is it Built for?

Throughout this post, we'll take a high-level walk through the anatomy of the points module (and the broader SDK it lives in) which should reveal how flexible and extendable the library is. However, here are some scenarios we considered that helped inform its design and utility:

Rewarding Community Engagement

A project wants to incentivize meaningful participation by rewarding points for engagement across key social platforms
The project must track a history of point allocation events per participant, as well as have easy access to a participant's total

Incentivizing Collaboration

A project wants to reward collaborators for contributing to a codebase
Various types of contributions under this system are eligible for point rewards

Quests or Education

An education platform wants to reward students for completing class modules
Points are distributed in correlation to quiz or test scores at the end of each module

What is the Points Library?

The points library is the first module within our larger Ceramic Solutions SDK, designed to be the eventual home for any experimental tooling our team creates for certain common patterns or use cases we've identified as priorities. While the individual libraries (points, for example) sit at the core of this SDK, this repository will also contain example applications, API documentation, and other helpful modules to support developers looking to get started.

The Points Library module (which is now available as an NPM module - @ceramic-solutions/points) removes the common points of tension mentioned above by:

Delivering a hosted node endpoint

This is the default endpoint if no alternate is provided by the user (for test purposes only - not intended for production). This feature was designed to accelerate developer experimentation. Of course, the library also supports private endpoints.

Predefined Interfaces and Default Models

The points library relies on two schema interfaces (implemented by two default models) for keeping track of points. These interfaces can easily be extended by developers who desire more complex use cases, or developers can leverage the default models out of the box with no additional configuration.

interface PointsAllocationInterface implements PointsInterface
  @createModel(description: "Interface for a single allocation of points to an account") {
  issuer: DID! @documentAccount
  recipient: DID! @accountReference
  points: Int!
}

interface PointsAggregationInterface implements PointsInterface
  @createModel(description: "Interface for an aggregation of points to an account") {
  issuer: DID! @documentAccount
  recipient: DID! @accountReference
  points: Int!
  date: DateTime!
}

While the Points Library Readme offers more detail, these interfaces broadly speak to the two primary utilities we anticipate developers will need. Model instances of PointsAllocationInterface will keep track of the history of allocation events (and if extended, can include fields to capture additional details such as when and why points were allocated). Conversely, PointsAggregationInterface is designed to keep track of the sum of points for a recipient at any given time (and like allocation docs, can also be extended to keep track of more nuanced data points like contextual subtotals).

Built-in Reader and Writer Classes

The two core library modules consist of a points reader class and a separate writer class, allowing developers multiple ways to instantiate each class (the easiest of which simply requires a private key when using the default node endpoint and default models). Each class includes built-in methods to read or issue points and can accommodate extensions of the interfaces without additional wrapping.
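Getting started might look roughly like the sketch below. This is illustrative only: the class and method names are assumptions based on the description above, not the authoritative API, so check the Points Library Readme for the real interface.

// Illustrative sketch: the names below (fromSeed, allocatePointsTo,
// getAggregationPointsFor) are assumptions, not the confirmed API.
import { PointsReader, PointsWriter } from '@ceramic-solutions/points'

// The easiest setup reportedly needs just a private key, falling back to the
// default (test-only) node endpoint and the default models.
const writer = await PointsWriter.fromSeed(process.env.CERAMIC_SEED!) // assumed helper

// Record a single allocation event for a recipient DID...
await writer.allocatePointsTo('did:key:z6MkExampleRecipient', 50) // assumed method

// ...then read the running total back from the aggregation document.
const reader = await PointsReader.create() // assumed helper
const total = await reader.getAggregationPointsFor('did:key:z6MkExampleRecipient') // assumed method
console.log(`Recipient total: ${total}`)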

What Will You Build with Points?

While we covered a few examples of typical use cases for points, there are so many other possibilities out there for developers to explore. Perhaps you want to reward your community for showing up to in-person events or collaborating on projects. Maybe you want to build a service that analyzes on-chain behavior (such as interacting with certain smart contracts or holding certain assets) and uses a framework of weights to award contextual points.

We'd love to learn how you're using reward systems across your communities and projects, and whether this initial library has what it needs to support your use case. Feel free to contact me directly at mzk@3box.io, or start a conversation on the Ceramic Forum. We look forward to hearing from you!


Energy Web

Energy Web Announces the public release of Launchpad by Energy Web

Launchpad by Energy Web empowers enterprises and software developers to configure and deploy solutions across thousands of worker nodes globally

Energy Web is proud to announce the public release of Launchpad by Energy Web, a new offering that enables energy companies and software developers to rapidly configure and deploy Energy Web solutions. Launchpad is now available for all users with flexible pricing and support levels for developers and enterprise. Launchpad’s products fall into three categories:

Smartflow enables customers to configure and deploy custom business logic using decentralized networks of worker nodes. Worker nodes ingest data from third parties, perform application-specific computational work on the data, and publish the results for enterprises and/or the public to verify. Work conducted by the nodes can be independently verified without needing to trust a single centralized entity or server.

Multiple starting templates provided based on real-world deployments in the energy sector
Instant access to existing worker node networks capable of picking up and running new business logic
Includes a simple wizard-based user experience

Blockchain API provides web3 developers with an RPC API gateway for four blockchains in the Energy Web ecosystem. Example code snippets are included alongside multiple subscription plans.

Validator as-a-service enables users to rapidly spin up cloud-based validators on Energy Web blockchains. Users can choose from bring-your-own-cloud or Energy Web-managed solutions in addition to a variety of automation and support options.

For more information, please visit the Launchpad webpage here.

About Energy Web Foundation:

Energy Web is a global non-profit accelerating the clean energy transition by developing open-source technology solutions for energy systems. Our enterprise-grade solutions improve coordination across complex energy markets, unlocking the full potential of clean, distributed energy resources for businesses, grid operators, and customers. Our solutions for enterprise asset management, Digital Spine, and Green Proofs, our tool for registering and tracking low-carbon products, are underpinned by the Energy Web Chain, the world’s first public blockchain tailored to the energy sector. The Energy Web ecosystem comprises leading utilities, renewable energy developers, grid operators, corporate energy buyers, automotive, IoT, telecommunications leaders, and more. More information on Energy Web can be found at www.energyweb.org or follow us on Twitter @EnergyWebX

Energy Web Announces the public release of Launchpad by Energy Web was originally published in Energy Web on Medium, where people are continuing the conversation by highlighting and responding to this story.


Identity At The Center - Podcast

In the latest Sponsor Spotlight episode of The Identity at the Center Podcast

In the latest Sponsor Spotlight episode of The Identity at the Center Podcast, we had a riveting conversation with Venkat Raghavan, the Founder and CEO of Stack Identity. We delved deep into the criticality of identity security in the cloud, and the risks posed by over-provisioned access.

Venkat highlighted the importance of continuous access management in reducing the attack surface and introduced listeners to Stack Identity's innovative approach. This focuses on simplifying the removal of access to prevent breaches, thereby guiding organizations to identify and mitigate identity-related risks.

You’ll also get a sneak peek into Stack Identity's presence at the upcoming 2024 RSA Conference, where they will showcase their solution to help businesses reduce excessive cloud permissions.

For a hands-on experience with Stack Identity and to assess your organization's shadow access risk for free, visit stackidentity.com/idac

Connect with Venkat on LinkedIn and seize the opportunity to meet the team at RSA, booth N 6564 in the North Expo Hall.

#iam #podcast #idac


Next Level Supply Chain Podcast with GS1

Revolutionizing Warehouses with Robotics and Automation with Kevin Lawton

Small tweaks in processes can result in unexpected financial windfalls and improved employee satisfaction.

 

Kevin Lawton, warehouse operations expert and Founder of The New Warehouse podcast, discusses a wealth of insights into current trends and the critical role of foundational process improvements. The conversation shifts towards the importance of automating mundane tasks, the profound potential for digital transformation across companies, and the surprising financial benefits companies can reap from small operational adjustments.

 

Kevin explains Modex trends and spotlights the value of low-tech solutions and how maintaining standardized and actionable data is pivotal for leveraging advanced robotics and automation systems. The episode emphasizes systemic change, interoperability between solutions, and harnessing robotics and automation to address labor challenges and elevate operational efficiency.

 

Key takeaways: 

Trends at Modex, where robotics took center stage, show that robotics is not just a fascinating concept — it's becoming a practical part of warehouse operations.

Companies need to streamline processes to create a strong foundation that can truly leverage the efficiency of new technologies.

While robots and automation can address labor challenges and efficiency, the orchestration layer will make or break the integration.

 

Resources: 

Learn More About 2D Barcodes

Resources for the Transition from 1D to 2D Barcodes 

Behind the Barcode: Mastering 2D Barcodes with GS1 US

 

Connect with GS1 US:

Our website - www.gs1us.org

GS1 US on LinkedIn

 

Connect with guest:

Kevin Lawton on LinkedIn

Check out The New Warehouse

Tuesday, 30. April 2024

Origin Trail

Launching the “ChatDKG for AI builders” course at Trace Alliance Academy

Welcome to the future of AI: Building a verifiable Internet with ChatDKG

We are excited to announce the launch of a pioneering educational course, “ChatDKG for AI Builders” at Trace Alliance Academy. This course is designed for forward-thinking artificial intelligence (AI) developers, data scientists, and technology innovators who aim to enhance the fidelity and integrity of their AI solutions through the Verifiable Internet for AI powered by the OriginTrail Decentralized Knowledge Graph (DKG).

Why OriginTrail-powered verifiable Internet for AI?

Verifiable Internet for AI leverages the synergies of crypto, the Internet, and AI technologies (read more in the OriginTrail Whitepaper 3.0). With the novel Decentralized Knowledge Graph (DKG) approach, it ensures the provenance, integrity, and verifiability of information utilized by AI systems. It aims to address the challenges posed by misinformation, data ownership, intellectual property rights, and biases inherent in AI technologies by synergizing neural and symbolic AI approaches with Web3 technologies.

In an era marked by abundant connectivity and, unfortunately, abundant misinformation, the need for reliable data has never been more crucial. The OriginTrail DKG stands out as an open-source, trusted knowledge infrastructure that leverages blockchain and knowledge graph technologies and is founded on the principles of neutrality, inclusiveness, and usability.

As Dr. Bob Metcalfe, the inventor of Ethernet and an Internet pioneer, notes, knowledge graphs are critical in improving the fidelity of artificial intelligence. By leveraging the synergies of knowledge graph and blockchain technologies, OriginTrail DKG enables structuring and connecting data, making it discoverable, verifiable, and ownable in the form of knowledge assets and, as such, usable for verifiable AI solutions employing the innovative Decentralized Retrieval Augmented Generation (dRAG) framework.

By introducing the dRAG framework, the OriginTrail-powered verifiable Internet for AI builds on and expands the concept of Retrieval Augmented Generation (RAG), a key paradigm for builders in the AI space looking to feed large language models (LLMs) with specific context and datasets. Besides empowering builders to develop verifiable AI solutions based on trusted knowledge, the dRAG framework also opens up a tremendous opportunity for connecting individual knowledge bases that AI builders are creating and curating for their RAG pipelines, enabling the sharing of knowledge and value between AI systems in a decentralized way.

Builders can leverage all of these capabilities of the verifiable Internet for AI through ChatDKG — a launchpad for trusted AI. Through ChatDKG, they can create predictable AI agents to execute tasks based on trusted data stored in the OriginTrail DKG. Their AI agents benefit from information provenance and data ownership, while the builders enjoy the freedom to choose between multiple AI models.
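For a feel of what this looks like for a builder, the sketch below publishes a piece of structured content as a Knowledge Asset with the dkg.js client and resolves it again by its Universal Asset Locator (UAL). The node URL, port, network name, and keys are placeholders, and option names may vary between client versions, so treat this as a sketch and consult the OriginTrail documentation for the current API.

// Sketch: publish and resolve a Knowledge Asset. Endpoint, network id, and
// keys below are placeholders, not real values.
import DKG from 'dkg.js'

const dkg = new DKG({
  endpoint: 'https://your-ot-node.example.com', // placeholder node URL
  port: 8900,
  blockchain: {
    name: 'otp:20430', // placeholder network identifier
    publicKey: process.env.WALLET_PUBLIC_KEY,
    privateKey: process.env.WALLET_PRIVATE_KEY,
  },
})

async function main() {
  // Publish JSON-LD content as a Knowledge Asset with provenance on the DKG.
  const { UAL } = await dkg.asset.create(
    {
      public: {
        '@context': 'https://schema.org',
        '@type': 'Product',
        name: 'Example record for an AI agent to retrieve',
      },
    },
    { epochsNum: 2 }, // how many epochs the asset stays on the network
  )

  // Resolve the asset later by its Universal Asset Locator.
  const asset = await dkg.asset.get(UAL)
  console.log(UAL, asset)
}

main().catch(console.error)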

Learning to build and leverage a verifiable Internet for AI with ChatDKG therefore allows AI pioneers to seize opportunities analogous to those that advances in networking, through early computer networks such as Ethernet and the Internet, brought about – opportunities that have demonstrated tremendous value generation through network effects, famously articulated in Metcalfe’s law.

What will you learn in the ChatDKG for AI builders course?

The course covers a range of crucial topics and tools essential for building and leveraging verifiable Internet for AI.

Here is what you can expect to learn:

Verifiable Internet for AI: Understand the foundational concepts and key properties of verifiable Internet for AI, like clear information provenance, verifiable information, and incentivization for high-quality knowledge production.

OriginTrail enablement: Explore how OriginTrail DKG facilitates a verifiable Internet for AI, including its structure and operational dynamics.

Decentralized Retrieval Augmented Generation (dRAG) framework: Dive into the dRAG framework and learn how it ensures knowledge provenance and aids in the accurate retrieval of information to build verifiable AI applications.

ChatDKG launchpad: With ChatDKG as a launchpad for trusted AI solutions, you can turn your data into Knowledge Assets and create predictable AI agents to execute tasks using your data. Because Knowledge Assets are created on the OriginTrail DKG, your AI agents benefit from information provenance and data ownership, and you have the freedom to choose between multiple AI models.

Integration techniques: Learn to integrate different LLMs and other tools with OriginTrail DKG to build robust and verifiable AI solutions. Start with exploring integrations with the Google Gemini, NVIDIA Build ecosystem, and Chainlink Oracle capabilities, and expect new tutorials to be added subsequently.

Join us to shape the future of AI

This course is a unique opportunity to be at the forefront of creating safer, verifiable, and more reliable AI solutions. Whether you are an AI enthusiast, a seasoned developer, or a curious learner, “ChatDKG for AI builders” offers valuable insights and practical skills to navigate and shape the future of AI.

Enroll today at Trace Alliance Academy and join us in pioneering the verifiable Internet for AI. Together, let’s lead the charge in fostering a democratic, ethically guided, and economically viable future for AI applications.

To receive additional tools and support to build transformative AI solutions on OriginTrail, apply for the ChatDKG Inception Program.

Launching the “ChatDKG for AI builders” course at Trace Alliance Academy was originally published in OriginTrail on Medium, where people are continuing the conversation by highlighting and responding to this story.

Monday, 29. April 2024

FIDO Alliance

Verdict: OneSpan: Partner Ecosystem Profile

The company’s various solutions include regulatory compliance, PSD2 compliance, FIDO standard, fraud prevention, mobile app security, transaction signing, digital onboarding and omnichannel security solutions. It operates in North America, Europe and the Asia Pacific regions.


Tech telegraph: WhatsApp now rolling out passkey support for iPhone users

Passkey is a technology developed by the FIDO Alliance in collaboration with major companies like Apple, Google, and Microsoft. Instead of traditional passwords, it enables users to log in using secure methods like facial recognition or biometrics, eliminating the need to create and type a passcode.
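For developers wondering what enabling a passkey actually involves, registration is the standard browser WebAuthn call. A minimal sketch follows; the relying-party and user values are placeholders, and in a real flow the challenge is generated and verified server-side.

// Minimal passkey (discoverable WebAuthn credential) registration sketch.
const credential = await navigator.credentials.create({
  publicKey: {
    challenge: crypto.getRandomValues(new Uint8Array(32)), // server-issued in practice
    rp: { name: 'Example App', id: 'example.com' }, // placeholder relying party
    user: {
      id: new TextEncoder().encode('user-123'), // placeholder user handle
      name: 'alice@example.com',
      displayName: 'Alice',
    },
    pubKeyCredParams: [{ type: 'public-key', alg: -7 }], // -7 = ES256
    authenticatorSelection: {
      residentKey: 'required', // discoverable credential, i.e. a passkey
      userVerification: 'required', // biometrics or device PIN instead of a password
    },
  },
})
// `credential` is then sent to the server, which stores the public key.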


Biometric Update: NIST issues guidance to fit passkeys into digital identity recommendations

Andrew Shikiar, CEO of the FIDO Alliance, noted that the updated NIST guidance confirms passkeys’ ability, along with other FIDO authenticators, to meet AAL2 and AAL3 requirements. Synchronized passkeys can achieve AAL2, while device-bound passkeys can reach AAL3.


TechCrunch: WhatsApp adds global support for passkeys on iOS

WhatsApp is launching passkey verification on iOS, eliminating the requirement for users to manage SMS one-time passcodes. The company announced on Wednesday that this feature is currently being rolled out and will soon be accessible to all iOS users.


Identity At The Center - Podcast

Join us in the latest episode of The Identity at the Center Podcast

Join us in the latest episode of The Identity at the Center Podcast where we discuss listener questions and critique AI answers. We delve into topics such as key IAM metrics, the challenges of implementing IAM strategies in large multinational companies, and upcoming trends in the IAM sector. You can watch it on YouTube at https://www.youtube.com/@idacpodcast or listen to this episode at https://idacpodcast.com or in your podcast app. Don't forget to share your thoughts with us and thanks to the listeners for their questions.

#iam #podcast #idac

Sunday, 28. April 2024

Velocity Network

Velocity: The Next Gen Public Infrastructure for Workforce Credential Verification

Dror Gurevich, our CEO has joined Trevor Schachner at SHRM for an exciting episode of SHRM’s WorkplaceTech Spotlight, a SHRM podcast. The post Velocity: The Next Gen Public Infrastructure for Workforce Credential Verification appeared first on Velocity.

Thursday, 25. April 2024

Ceramic Network

Ceramic Feature Release: SET Account Relations, Immutable Fields and shouldIndex flag

The powerful new features in Ceramic and ComposeDB offer users a sophisticated toolkit for data management. Explore the SET account relation, immutable fields and the benefits of shouldIndex flag.

The functionality of Ceramic and ComposeDB has been recently enhanced by a number of new features that give developers more control over account relation definitions and data accessibility. More specifically, you can now use the following tools to enhance your applications:

SET account relation - enables users to enforce a constraint where each user account (or DID) can create only one instance of a model for a specific record of another model.
Immutable fields - allow specific data to be prevented from being altered.
shouldIndex flag - gives developers an option to manage data visibility by choosing which streams should be indexed.

In this blog post, we are going to dive into these features in more detail. For a video walkthrough, check out this video tutorial.

SET account relations

SET relations in ComposeDB enable developers to define relations between the data models that follow specific constraints and include the user as part of the relationship. SET account relation allows users to enforce the constraint that a specific account (DID) can have only one instance of a model for a specific record of another model.

The best example to illustrate this is the “like” feature of a social media application. A SET relation can be utilized to make sure that a user (DID) can “like” a specific post only once, while at the same time allowing the user to like multiple posts.

Let’s have a look at how SET Relations can be used in practice.

Ceramic Layer

To use a SET account relation in Ceramic, you will first have to define a SET accountRelation in your model definition. The example below consists of two simple models: POST_MODEL, representing the model definition for social media posts, and LIKE_MODEL, representing the model definition for users liking the posts.

The model definition for POST_MODEL has the accountRelation as a list, meaning that one user account will be allowed to create multiple posts.

The model definition for LIKE_MODEL has a SET accountRelation and includes the fields which should be used to create the unique relation - postID and userID. This defines that a specific user can create only one "like" record for a specific post.

const POST_MODEL: ModelDefinition = {
  name: 'Post',
  version: '2.0',
  interface: false,
  implements: [],
  accountRelation: { type: 'list' },
  schema: {
    $schema: 'https://json-schema.org/draft/2020-12/schema',
    type: 'object',
    additionalProperties: false,
    properties: {
      content: { type: 'string', maxLength: 500 },
      author: { type: 'string' },
    },
    required: ['content', 'author'],
  },
}

const LIKE_MODEL: ModelDefinition = {
  name: 'Like',
  version: '2.0',
  interface: false,
  implements: [],
  accountRelation: { type: 'set', fields: ['postID', 'userID'] },
  schema: {
    $schema: 'https://json-schema.org/draft/2020-12/schema',
    type: 'object',
    additionalProperties: false,
    properties: {
      postID: { type: 'string' },
      userID: { type: 'string' },
    },
    required: ['postID', 'userID'],
  },
}

ComposeDB Layer

Now let's see an example of how you can use SET account relations in ComposeDB. Similar to the example above, the key component that allows you to define the SET account relation for a specific model is the accountRelation scalar alongside the fields that should be used to define the unique relation.

Take the example below. Here we have two models defined using GraphQL schema definition language. The first model is a model for storing data about a Picture - the source and the dimensions of the image. The model definition Favourite implements the behavior of the user setting a picture as a favorite. Note that this model has an accountRelation defined as SET. The field that is used to define the relation is docID, which refers to the document ID of the picture record.

type Picture @createModel(description: "A model for pictures", accountRelation: SINGLE) {
  src: String! @string(maxLength: 150)
  mimeType: String! @string(maxLength: 50)
  width: Int! @int(min: 1)
  height: Int! @int(min: 1)
  size: Int! @int(min: 1)
}

type Favourite @createModel(description: "A set of favourite documents", accountRelation: SET, accountRelationFields: ["docID"]) {
  docID: StreamID! @documentReference(model: "Picture")
  doc: Node @relationDocument(property: "docID")
  note: String @string(maxLength: 500)
}

All this means that the user will be able to set only one image as a favorite. They can set different pictures as favorites, but only one record per picture.
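To make the constraint concrete, a hypothetical mutation is sketched below. It assumes ComposeDB generates a set-style mutation (setFavourite) for models with a SET account relation, and that a ComposeRuntime instance like the one shown later in this post is available; the docID value is a placeholder stream ID.

// Hypothetical: assumes a generated `setFavourite` mutation for the SET
// account relation defined above.
await runtime.executeQuery(
  `
  mutation SetFavourite($input: SetFavouriteInput!) {
    setFavourite(input: $input) {
      document {
        id
        note
      }
    }
  }
  `,
  {
    input: {
      content: {
        docID: 'kjzl6kcym7w8y...', // placeholder: stream ID of a Picture document
        note: 'My favourite shot',
      },
    },
  },
)
// Running this again for the same account and docID updates the existing
// Favourite document instead of creating a second one.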

Immutable Fields

Another feature that has recently been added to Ceramic is Immutable Fields. With Immutable Fields, you can define which fields (for example, critical data) should remain unchangeable and be accessible as read-only data. Any attempt to alter data marked as immutable results in an error.

Ceramic Layer

Defining specific fields as immutable is pretty simple. Below is an example of a simple model defining a Person - their address, name, and other details. To make these fields immutable, you simply include them in the immutableFields array. In the example below, the fields address, name, myArray, and myMultipleType are set as immutable, meaning that once this data is created, it cannot be changed:

const example_model: ModelDefinition = {
  name: 'Person',
  views: {},
  schema: {
    type: 'object',
    $defs: {
      Address: {
        type: 'object',
        title: 'Address',
        required: ['street', 'city', 'zipCode'],
        properties: {
          city: { type: 'string', maxLength: 100, minLength: 5 },
          street: { type: 'string', maxLength: 100, minLength: 5 },
          zipCode: { type: 'string', maxLength: 100, minLength: 5 },
        },
        additionalProperties: false,
      },
    },
    $schema: 'https://json-schema.org/draft/2020-12/schema',
    required: ['name', 'address'],
    properties: {
      name: { type: 'string', maxLength: 100, minLength: 10 },
      address: { $ref: '#/$defs/Address' },
      myArray: { type: 'array', maxItems: 3, items: { type: 'integer' } },
      myMultipleType: { oneOf: [{ type: 'integer' }, { type: 'string' }] },
    },
    additionalProperties: false,
  },
  version: '2.0',
  interface: false,
  relations: {},
  implements: [],
  description: 'Simple person with immutable field',
  accountRelation: { type: 'list' },
  immutableFields: ['address', 'name', 'myArray', 'myMultipleType'],
}

ComposeDB Layer

In ComposeDB, a specific field can be set as immutable by adding the @immutable directive to the fields that should remain unchangeable. For example:

type ModelWithImmutableProp @createModel(
  accountRelation: SINGLE,
  description: "Test model with an immutable int property"
) {
  uniqueValue: Int @immutable
  uniqueValue2: Int @immutable
}

Here, the fields uniqueValue and uniqueValue2 are set as immutable.
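As a quick illustration of what immutability means at runtime, the sketch below uses the Ceramic-layer ModelInstanceDocument API, assuming a ceramic client and a midMetadata object pointing at a model whose immutableFields include uniqueValue, and expects the second write to be rejected.

import { ModelInstanceDocument } from '@ceramicnetwork/stream-model-instance'

// Create a document; its model lists uniqueValue among immutableFields.
const doc = await ModelInstanceDocument.create(ceramic, { uniqueValue: 1, uniqueValue2: 2 }, midMetadata)

try {
  // Attempting to change an immutable field should fail validation.
  await doc.replace({ ...doc.content, uniqueValue: 99 })
} catch (err) {
  console.error('Update rejected - field is immutable:', err)
}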

shouldIndex Flag

Last but not least, let’s talk about the shouldIndex flag available in Ceramic and ComposeDB. The shouldIndex flag allows you to control stream indexing by toggling a boolean metadata flag, which lets you manage data visibility and indexing. By setting the shouldIndex flag to false, you can stop a stream from being indexed, making it “invisible” to indexing operations. Let’s take a look at how you can use this feature.

Ceramic Layer

When working with model documents (ModelInstanceDocument), there is a new method called shouldIndex(value): calling it with false indicates that the stream corresponding to this document should not be indexed, and calling it with true re-indexes an existing document, e.g.:

const document = await ModelInstanceDocument.create(ceramic, CONTENT0, midMetadata)
// Unindex
await document.shouldIndex(false)

ComposeDB Layer

There are two ways to signal that a stream shouldn’t be indexed using ComposeDB. The first is to include the shouldIndex option in a mutation query, setting it to true if the stream should be indexed and to false otherwise:

const runtime = new ComposeRuntime({ ceramic, context, definition: composite.toRuntime() })

await runtime.executeQuery<{ updateProfile: { viewer: { profile: { name: string } } } }>(
  `
  mutation UpdateProfile($input: UpdateProfileInput!) {
    updateProfile(input: $input) {
      viewer {
        profile {
          name
        }
      }
    }
  }
  `,
  { input: { id: profileID, content: {}, options: { shouldIndex: false } } },
)

The second way is to use a mutation type called enableIndexing. Just like a create or update mutation, it should be paired with the model’s name, sending the streamId and shouldIndex value as part of the input, e.g.:

const enableIndexingPostMutation = `
  mutation EnableIndexingPost($input: EnableIndexingPostInput!) {
    enableIndexingPost(input: $input) {
      document {
        id
      }
    }
  }
`

await runtime.executeQuery<{ enableIndexingPost: { document: { id: string } } }>(
  enableIndexingPostMutation,
  { input: { id, shouldIndex: false } },
)

Note that the shouldIndex flag doesn’t delete the data. If set to false, the stream will still exist on the network; however, it will not be indexed and will not be available for data interactions.

Summary

The powerful new features in Ceramic and ComposeDB offer users a sophisticated toolkit for data management. From enforcing unique constraints with SET account relations to securing key data with immutable fields and controlling indexing operations using the shouldIndex flag, these features empower developers to build robust and efficient data models for their applications. Check out the Ceramic documentation for more information and examples.

Let us know how you are using all of these new features by posting on our Ceramic developer community forum.

Ceramic Resources

Developer Documentation: https://developers.ceramic.network/

Discord: https://chat.ceramic.network/

Github: https://github.com/ceramicnetwork

Twitter: https://twitter.com/ceramicnetwork

Website: https://ceramic.network/

Forum: https://forum.ceramic.network/


LionsGate Digital

PLEASE UPDATE THE RSS FEED

The RSS feed URL you're currently using https://follow.it/lions-gate-digital-advocacy will stop working shortly. Please add /rss at the end of the URL, so that the URL will be https://follow.it/lions-gate-digital-advocacy/rss

Wednesday, 24. April 2024

EdgeSecure

Edge’s Chief Digital Learning Officer Joins United States Distance Learning Association (USDLA) Public Policy Committee; Confirmed to Present at 2024 National USDLA Conference

NEWARK, NJ, April 24, 2024 – Edge’s Chief Digital Learning Officer, Joshua Gaul, joins an esteemed group of distance learning practitioners, vendors, and individuals with an academic interest in distance learning pedagogy, on the United States Distance Learning Association (USDLA) Public Policy Committee. 

USDLA is the premier professional membership organization designed to advocate and support the needs of distance education leaders. USDLA’s resources support the Distance Education Professional Community who serve education, business, health, and government. Founded in 1987, its vision and mission to advocate, research, and share best practices in the utilization of distance learning modalities in education, business, health, and government nationally and internationally continues today.

Notes Gaul, Chief Digital Learning Officer, Edge, “I’m proud to join the diverse group of members on the USDLA’s Public Policy Committee. Our shared passion for all-things-distance-learning will enable us to focus on the countless public policy issues related to digital/distance learning that are top-of-mind with the USDLA Board of Directors and its members.”

“I’m proud to join the diverse group of members on the USDLA’s Public Policy Committee. Our shared passion for all-things-distance-learning will enable us to focus on the countless public policy issues related to digital/distance learning that are top-of-mind with the USDLA Board of Directors and its members.”

— Joshua Gaul
Chief Digital Learning Officer, Edge

Josh will serve as a presenter at the 2024 National Conference for the 37th edition of the National Distance Learning Conference. With a theme of Gateway to the Future of Distance and Digital Learning, the event will take place June 17-20, 2024 at the Marriott St. Louis Grand. Josh’s presentation, Sustainable Online Learning for All: Developing and Accelerating High-Quality Online Programs with Nonprofit Consortium, will take place June 18 at 3:30 pm. Those interested in registering for the event may do so via https://usdla.org/2024-conference-registration/.

To learn more about the USDLA, visit https://usdla.org/about/history/

About Edge

Edge serves as a member-owned, nonprofit provider of high performance optical fiber networking and internetworking, Internet2, and a vast array of best-in-class technology solutions for cybersecurity, educational technologies, cloud computing, and professional managed services. Edge provides these solutions to colleges and universities, K-12 school districts, government entities, hospital networks and nonprofit business entities as part of a membership-based consortium. Edge’s membership spans the northeast, along with a growing list of EdgeMarket participants nationwide. Edge’s common good mission ensures success by empowering members for digital transformation with affordable, reliable and thought-leading purpose-built, advanced connectivity, technologies and services.

The post Edge’s Chief Digital Learning Officer Joins United States Distance Learning Association (USDLA) Public Policy Committee; Confirmed to Present at 2024 National USDLA Conference appeared first on NJEdge Inc.


Next Level Supply Chain Podcast with GS1

Behind the Barcode: Enhancing Supply Chain Efficiency with GS1 US Data Hub

In this episode of Next Level Supply Chain’s special series Behind the Barcode, Liz and Reid speak with Jess Urriola, the VP of Product Management at GS1 US, about GS1 US Data Hub - a tool built to secure data sharing between brand owners and trading partners. They talk about why this service is pivotal for maintaining data quality across the global supply chain, both from the strategic and technical aspects, from creating UPC barcodes to combating GTIN misuse.

Learn how GS1 US Data Hub supports regulatory compliance, the importance of global location numbers in traceability, and the benefits of ensuring each product has a unique identifier—similar to a license plate. 

 

Key takeaways: 

How GS1 US Data Hub's user-friendly SaaS platform empowers both small and large businesses to effortlessly manage and authenticate product identifiers and location data, ensuring accuracy and trust throughout the global supply chain.

The critical role of the GS1 Registry Platform (GRP) in combating Global Trade Item Number (GTIN) misuse and fostering global transparency by enabling real-time, cross-border verification of core product attributes for reliable traceability and inventory management.

The strategic advantages of GS1's identification system as Jess Urriola highlights its integral part in compliance with healthcare and food safety regulations, streamlining the entire traceability process from manufacturer to end-point via Global Location Numbers (GLNs).

 

Resources: 

GS1 US’s Data Hub

Data Hub Help Center

Connect with GS1 US:

Our website - www.gs1us.org

GS1 US on LinkedIn

 

Connect with guest:

Jess Urriola on LinkedIn


MyData

International consortium set to advance AI & Multimodal data integration in personalised cardiovascular medicine.

The NextGen project kicked off at the beginning of 2024. MyData Global teamed up with clinical research centres, universities, professional associations, SMEs and non-profits in this EU Horizon Europe project to develop the next-generation tools for genome-centric multimodal data integration in personalised cardiovascular medicine.  The main trends in global demographics and health impact on the […]