Last Update 6:18 PM April 18, 2024 (UTC)

Organizations | Identosphere Blogcatcher

Brought to you by Identity Woman and Infominer.
Support this collaboration on Patreon!!

Thursday, 18. April 2024

ResofWorld

The regional flavors of labor-on-demand

A new report digs into the social dynamics of gig work.
Gig work has become universal, used for everything from remote copywriting in South Africa to cleaning homes in Vietnam. It’s so widespread that it’s easy to forget that, for many...

Bangladesh built a tech park for 100,000 workers. Now it’s a ghost town

The facility was partly funded by the World Bank and touted to become the country’s “cyber capital.”
When Bangladesh inaugurated its first technology business park — a sprawling campus for tech companies to set up offices and factories — in 2015, local computer manufacturer DataSoft swiftly seized...

Wednesday, 17. April 2024

Oasis Open Projects

Invitation to comment on Energy Interoperation Common Transactive Services (CTS) v1.0 – ends June 15


OASIS and the OASIS Energy Interoperation TC are pleased to announce that Energy Interoperation Common Transactive Services (CTS) v1.0 is now available for public review and comment. This is the third public review of this draft specification.

Common Transactive Services (CTS) permits energy consumers and producers to interact through energy markets by simplifying actor interaction with any market. CTS is a streamlined and simplified profile of the OASIS Energy Interoperation (EI) specification, which describes an information and communication model to coordinate the exchange of energy between any two Parties that consume or supply energy, such as energy suppliers and customers, markets and service providers.
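For a rough sense of the transactive pattern CTS standardizes, consider a tender: an offer to buy or sell a quantity of energy over a time interval at a given price. The sketch below uses invented names (Tender, quantity_kwh, and so on) purely for illustration; the normative message schemas are defined in the specification itself.

    from dataclasses import dataclass
    from datetime import datetime, timedelta

    # Hypothetical model of a transactive tender. Field names are invented
    # for this sketch and are not the CTS schema.
    @dataclass
    class Tender:
        party_id: str             # actor making the offer
        side: str                 # "buy" or "sell"
        quantity_kwh: float       # energy quantity offered
        price_per_kwh: float      # offered unit price
        interval_start: datetime  # start of the delivery interval
        interval_duration: timedelta

    offer = Tender("meter-42", "sell", 5.0, 0.12,
                   datetime(2024, 6, 1, 14, 0), timedelta(hours=1))
    print(f"{offer.party_id} offers to {offer.side} "
          f"{offer.quantity_kwh} kWh at {offer.price_per_kwh}/kWh")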

The documents and related files are available here:

Energy Interoperation Common Transactive Services (CTS) Version 1.0
Committee Specification Draft 03
28 March 2024

PDF (Authoritative):
https://docs.oasis-open.org/energyinterop/ei-cts/v1.0/csd03/ei-cts-v1.0-csd03.pdf
Editable source:
https://docs.oasis-open.org/energyinterop/ei-cts/v1.0/csd03/ei-cts-v1.0-csd03.docx
HTML:
https://docs.oasis-open.org/energyinterop/ei-cts/v1.0/csd03/ei-cts-v1.0-csd03.html
PDF marked with changes since previous publication:
https://docs.oasis-open.org/energyinterop/ei-cts/v1.0/csd03/ei-cts-v1.0-csd03-DIFF.pdf
Comment resolution log for previous public review:
https://docs.oasis-open.org/energyinterop/ei-cts/v1.0/csd02/ei-cts-v1.0-csd02-comment-resolution-log.txt

For your convenience, OASIS provides a complete package of the specification document and any related files in ZIP distribution files. You can download the ZIP file at:
https://docs.oasis-open.org/energyinterop/ei-cts/v1.0/csd03/ei-cts-v1.0-csd03.zip

A public review metadata record documenting this and any previous public reviews is available at:
https://docs.oasis-open.org/energyinterop/ei-cts/v1.0/csd03/ei-cts-v1.0-csd03-public-review-metadata.html

How to Provide Feedback

OASIS and the Energy Interoperation TC value your feedback. We solicit input from developers, users and others, whether OASIS members or not, for the sake of improving the interoperability and quality of our technical work.

The public review starts 17 April 2024 at 00:00 UTC and ends 15 June 2024 at 23:59 UTC.

For clarity, the TC requests that comments cite line numbers from the PDF (authoritative) version.

Any individual may submit comments to the TC by sending email to Technical-Committee-Comments@oasis-open.org. Please use a Subject line like “Comment on Energy Interoperation Common Transactive Services”.

All comments submitted to OASIS are subject to the OASIS Feedback License, which ensures that the feedback you provide carries at least the same obligations as those of the TC members. In connection with this public review, we call your attention to the OASIS IPR Policy [1], especially [2] as applicable to the work of this technical committee. All members of the TC should be familiar with this document, which may create obligations regarding the disclosure and availability of a member’s patent, copyright, trademark and license rights that read on an approved OASIS specification.

OASIS invites any persons who know of any such claims to disclose these if they may be essential to the implementation of the above specification, so that notice of them may be posted to the notice page for this TC’s work.

Additional information about the specification and the Energy Interoperation TC can be found at the TC’s public home page:
https://www.oasis-open.org/committees/energyinterop/

========== Additional references:

[1] https://www.oasis-open.org/policies-guidelines/ipr/

[2] https://www.oasis-open.org/committees/energyinterop/ipr.php
https://www.oasis-open.org/policies-guidelines/ipr/#RF-on-Limited-Mode
RF on Limited Terms Mode

The post Invitation to comment on Energy Interoperation Common Transactive Services (CTS) v1.0 – ends June 15 appeared first on OASIS Open.


MOBI

NIADA

The National Independent Automobile Dealers Association (NIADA) has advanced independent automobile dealers since 1946 through advocacy, education and promotion. NIADA advocates for used auto dealers by addressing the challenging issues that disrupt the industry’s ability to create jobs, build thriving dealerships and maximize profitability. www.niada.com


The post NIADA first appeared on MOBI | The New Economy of Movement.


NAF Association

The National Automotive Finance (NAF) Association is the only forum for the exclusive benefit of the non-prime auto finance industry, addressing the challenges of sales finance companies, dealers, and third-party service providers. nafassociation.com


The post NAF Association first appeared on MOBI | The New Economy of Movement.


Identity At The Center - Podcast

Join us for a new Sponsor Spotlight episode of The Identity at the Center Podcast


Join us for a new Sponsor Spotlight episode of The Identity at the Center Podcast. Sandy Bird, co-founder and CTO of Sonrai Security, returns to introduce us to permissions on demand by way of Sonrai’s new Cloud Permissions Firewall. Learn more about it at https://sonrai.co/identity-at-the-center and listen to the episode now at idacpodcast.com or on your favorite podcast app.

We also tried something new for this episode... video! You can watch this episode on our YouTube channel at https://www.youtube.com/watch?v=oPlUwY4jqKg

#iam #podcast #idac


ResofWorld

How RRR’s success brought a wave of Telugu-language movies to Netflix

Netflix's U.S. catalog has more content in Telugu than in German, Russian, or any dialect of Chinese.
Since its release in March 2022, the Telugu-language film RRR has become one of the most celebrated Indian films in recent memory. The action epic garnered international acclaim and even...

AI “deathbots” are helping people in China grieve

Avatars of deceased relatives are increasingly popular for consoling those in mourning, or hiding the deaths of loved ones from children.
“Dad, were you suffering before you left?” Yancy Zhu texted.  “I was not in pain,” said the artificial intelligence bot, in a man’s voice that Zhu had chosen on chatbot...

Next Level Supply Chain Podcast with GS1

How 2D Barcodes Are Changing the Retail Landscape with Chuck Lasley


Chuck Lasley, IT Director at Dillard’s, explains the pivotal role of 2D barcodes in retail innovation, illustrating Dillard's strategy of incorporating these versatile codes into their products, which range from apparel to accessories. Amidst the growing demand for intricate product details, Chuck emphasizes the imperative for sales associates to be adept in product knowledge facilitated by 2D barcodes. As Chuck explains, 2D barcodes can lead to improved inventory management, better customer service, and enhanced consumer storytelling possibilities.

The conversation also explores AI's potential in customer service, the impact smartphones have had on computing power, and the potential of automated vehicles in altering supply chain dynamics. Chuck applauds the implementation of evolving technologies like RFID, which are crucial in the industry-wide 'Sunrise 2027' initiative. Sunrise 2027 aims for widespread adoption of 2D barcode scanning by 2027, with Dillards ambitiously targeting an earlier date. This episode covers automation, innovation, and the pursuit of a unique identity within the global supply chain.
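The episode stays at the strategy level, but a small example shows why 2D codes carry so much more than a 1D UPC. A GS1 Digital Link URI can pack a product’s GTIN together with batch and serial data into one scannable string; the sketch below (illustrative values, not Dillard’s implementation) builds such a URI, including the standard GS1 mod-10 check digit.

    # GS1 Digital Link sketch: GTIN (AI 01), batch/lot (AI 10), and serial
    # (AI 21) combined into a single web URI. Values are illustrative.
    def gs1_check_digit(body: str) -> str:
        """GS1 mod-10 check digit: weights 3 and 1, alternating from the
        rightmost data digit."""
        total = sum(int(d) * (3 if i % 2 == 0 else 1)
                    for i, d in enumerate(reversed(body)))
        return str((10 - total % 10) % 10)

    def digital_link(gtin13_body: str, lot: str, serial: str) -> str:
        gtin = gtin13_body + gs1_check_digit(gtin13_body)
        return f"https://id.gs1.org/01/0{gtin}/10/{lot}/21/{serial}"

    print(digital_link("950600013435", "LOT42", "SER1234"))
    # https://id.gs1.org/01/09506000134352/10/LOT42/21/SER1234

A 1D barcode, by contrast, typically carries only the GTIN, which is why richer product detail and storytelling depend on the 2D transition.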

 

Key takeaways: 

Technology in customer service is advancing with tools integrating RFID and 2D barcode technologies in supply chain operations to improve accuracy and efficiency.

The retail industry recognizes the importance and advantages of transitioning from 1D to 2D barcodes and RFID technology for improved inventory management, customer service, and access to detailed product information.  

Technological advancements create enriched consumer experiences through unique transaction identifiers and product storytelling.

 

Resources: 

Learn More About 2D Barcodes

Resources for the Transition from 1D to 2D Barcodes 

Behind the Barcode: Mastering 2D Barcodes with GS1 US

 

Connect with GS1 US:

Our website - www.gs1us.org

GS1 US on LinkedIn

 

Connect with guest:

Chuck Lasley on LinkedIn

Check out Dillard’s

Tuesday, 16. April 2024

Origin Trail

V8 roadmap update: Scalable knowledge engine for convergence of crypto, internet, and AI


The upcoming Decentralized Knowledge Graph (DKG) V8 update represents a significant advancement in Decentralized AI, building on the achievements of previous innovations brought by V6. The DKG V6 materialized knowledge as a new asset class, with its core AI-ready Knowledge Assets setting the stage for advanced AI applications in the domains of real-world assets (RWAs), decentralized science (DeSci), industry 4.0, and more.

Moving forward, DKG V8 introduces autonomous DKG growth, support for Initial Paranet Offerings (IPOs), and also significantly increases scalability. With this, the Decentralized Retrieval Augmented Generation (dRAG) becomes a foundational framework instilled in the DKG V8, significantly advancing a spectrum of large language model (LLM) applications.

DKG V8 is tailored to drive the next generation of AI through multi-modal content, which is crucial for a diversified and robust AI ecosystem. The integration of dRAG and other decentralized AI functionalities allows for a more verifiable and secure application of AI technologies, addressing challenges such as misinformation, data bias, and model collapse.
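As a rough sketch of the dRAG pattern in miniature (hypothetical code, not OriginTrail’s SDK): retrieve assertions from a knowledge graph, verify each one against its published hash anchor, and ground the model’s prompt only in assertions that verify.

    import hashlib

    # Toy dRAG loop: each assertion carries a hash anchor (published
    # on-chain in a real DKG); only assertions that still verify are
    # allowed into the LLM prompt.
    def digest(triple) -> str:
        return hashlib.sha256(repr(triple).encode()).hexdigest()

    graph = [{"triple": ("ProductX", "originatesFrom", "FactoryY")}]
    for assertion in graph:
        assertion["anchor"] = digest(assertion["triple"])

    def drag_prompt(query: str) -> str:
        facts = [" ".join(a["triple"]) for a in graph
                 if digest(a["triple"]) == a["anchor"]  # verify before use
                 and query.lower() in repr(a["triple"]).lower()]
        return f"Answer using only these verified facts: {facts}\nQuestion: {query}"

    print(drag_prompt("ProductX"))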

The present roadmap update focuses on DKG V8 catalysts designed to bootstrap and accelerate these advancements, including enhanced knowledge mining processes, integration across multiple blockchain ecosystems, and scalability improvements aimed at supporting an expansive growth of knowledge assets. These initiatives ensure that DKG V8 not only extends its foundational network effects but also reinforces its position as a cornerstone of future AI developments.

The entire roadmap can be found here.

DKG V8 — Decentralized Retrieval Augmented Generation at scale

“…it (LLM) unfortunately hallucinates most when you least want it to hallucinate. When you’re asking the important and difficult questions that’s where it tends to be confidently wrong. So we’re really trying hard to say how do we be as grounded as possible so you can count on the results?” — Elon Musk, on the Lex Fridman Podcast

The truth, however, is an elusive concept, especially when a single organization or product attempts to capture it. A better approach is through connectivity and transparency, achieved by leveraging multiple open-source technologies. Turing Award winner Dr. Bob Metcalfe explained this idea, saying,

“… through connectivity, decentralized knowledge graphs, blockchains and AI are converging — and it’s an important convergence, because it is going to help us with one of the biggest problems we have nowadays, which is the truth.”

Inspired by Turing Award winner Dr. Bob Metcalfe and his pioneering work, including “Metcalfe’s Law” and the creation of the first computer network, the OriginTrail Metcalfe phase aims to leverage network effects by building a web of verifiable Knowledge Assets for decentralized AI.

The Genesis period of the Metcalfe phase bootstraps the growth of the AI-native V8 Decentralized Knowledge Graph (DKG V8), driving the verifiability of AI via Decentralized Retrieval Augmented Generation (dRAG). Supported by a unique knowledge mining system through the NeuroWeb blockchain, the Genesis period is followed by a “Convergence” period, which further leverages network effects via autonomous knowledge inferencing. The DKG V8 aims to offer a user-centric, trusted knowledge foundation with decentralized AI functionalities, enabling individuals and organizations to participate in a knowledge economy based on neutrality, inclusiveness, and usability — the core principles of the OriginTrail ecosystem.

Genesis — The V8 Foundation (Q4 2023–2025)

“When you connect things together, the value rises really fast because of all the possible connections that can be made, and the friction that’s reduced, and the collaboration that’s enhanced. So it’s good to bet on connectivity.” — Dr Bob Metcalfe

The Genesis period leverages connectivity to achieve network effects across the multi-chain OriginTrail Decentralized Knowledge Graph to reach a growth target of 1B Knowledge Assets. With the introduction of the community-driven NeuroWeb blockchain supporting DKG growth, Genesis bootstraps the OriginTrail AI-Native version 8 for Decentralized Retrieval Augmented Generation (dRAG), introduced to drive a multi-modal ecosystem of AI solutions. The V8 Foundation impact stages were initially described here and are now being expanded.

Genesis period targets:

1B knowledge assets available on the DKG

Minimum 40% of TRAC circulating supply activated for utility

TRAC locked for network security: 100MM+ TRAC

Average security collateral per node: 300k+ TRAC

Impact base: Trantor (established in Q1 2024)

One of the prominent features of Trantor was the Library of Trantor, in which librarians indexed the entirety of human knowledge by walking up to a different computer terminal every day and resuming where the previous librarian left off.

Catalyst 1: Knowledge Mining

Incentivized growth of high-quality knowledge in the DKG with Initial Paranet Offerings and Autonomous Knowledge Mining.

Genesis Knowledge Mining RFC
Genesis Knowledge Asset mining
Beta Mining Program

Catalyst 2: Delegated staking

Expanding inclusivity of the DKG infrastructure by enabling TRAC stake delegation across all integrated chains.

Delegated staking dashboard
Documentation
Delegated Staking RFC
DKG node TRAC token delegation release (Gnosis integration)

Whitepaper 3.0

Verifiable Internet for Artificial Intelligence: The Convergence of Crypto, Internet, and AI.

Link

This whitepaper presents a vision for the future of Artificial Intelligence through the concept of a Verifiable Internet for AI, leveraging synergies of crypto, internet, and AI technologies. It introduces the Decentralized Knowledge Graph (DKG) and Decentralized Retrieval Augmented Generation (dRAG) approach to ensure the provenance, integrity, and verifiability of information utilized by AI. It aims to address the challenges posed by misinformation, data ownership, and bias inherent in AI, by synergizing neural and symbolic AI approaches with Web3 technologies.

Impact base: Terminus (established in Q2 2024)

The founding population of Terminus consisted of 100,000 especially healthy scientists, whose ostensible purpose was to publish an Encyclopedia Galactica in order to preserve science and technology. The lack of natural resources forced Terminians to develop extremely high-efficiency tech, which their knowledge, as inheritors of the Imperial Library, allowed them to do.

Catalyst 1: Multichain growth

Bringing the DKG to any EVM-compatible ecosystem with significant demand (more information in OT-RFC-17).

◻️ NeuroWeb delegated staking release
◻️ Additional blockchain integrations (based on OT-RFC-17)

Catalyst 2: 100x scalability

Increasing scalability in the capacity of publishing Knowledge Assets by implementing random sampling and other scalability improvements.

◻️ NeuroWeb scaling: Asynchronous backing
◻️ DKG V8 random sampling update

Catalyst 3: Paranets and Initial Paranet Offerings (IPOs)

Autonomously operated collections of Knowledge Assets residing on the DKG and owned by their communities.

◻️ Initial Paranet Offerings (IPOs) launch
◻️ First IPO launched — the ID Theory decentralized Science (DeSci)
◻️ Cross-chain knowledge mining
◻️ Decentralized Identities on the DKG — name service integration

Catalyst 4: ChatDKG.ai

Interact with the DKG and its paranets using natural language and the power of multiple AI models and agents. Build your own Decentralized Retrieval Augmented Generation (dRAG) product seamlessly.

◻️ Multi-modal LLM ChatDKG
OriginTrail World Launch Trusted AI platform
Google Vertex AI support
OpenAI support
NVIDIA Platform support
Chainlink support
◻️ xAI (Grok) support
V1 of unified framework (Whitepaper 3.0)
AI-based knowledge publishing
◻️ AI agent integrations
✅ Additional ChatDKG grant waves

Impact base: Gaia (established in H2 2024)

The human beings on Gaia, under robotic guidance, not only evolved their ability to form an ongoing telepathic group consciousness but also extended this consciousness to the fauna and flora of the planet itself, even including inanimate matter. As a result, the entire planet became a super-organism.

DKG V8

Scalable and robust foundation for enabling the next stage of Artificial Intelligence adoption with Decentralized Retrieval Augmented Generation (dRAG), combining symbolic and neural decentralized AI.

◻️ AI-native Knowledge Assets: native vector support (e.g. knowledge graph embeddings for native Graph ML)
◻️ AI-native search based on DKG V8 decentralized vector index
◻️ Knowledge Contracts

Catalyst 1: Autonomous knowledge mining

Mine new knowledge for paranets autonomously by using the power of symbolic AI (the DKG) and neural networks.

◻️ AI-agent driven knowledge mining
◻️ Knowledge mining library integrations for popular data science languages (e.g. Python, R etc)

Catalyst 2: DePIN for private knowledge

Keep your knowledge private, on your devices, while still being able to use it in bleeding-edge AI solutions.

◻️ Private Knowledge Assets repository (Knowledge Wallet)
◻️ Private data monetization with Knowledge Assets (Knowledge Marketplace)
◻️ DKG Decentralized File Storage integration libraries (e.g. IPFS, Filecoin)

V8 roadmap update: Scalable knowledge engine for convergence of crypto, internet, and AI was originally published in OriginTrail on Medium, where people are continuing the conversation by highlighting and responding to this story.


ResofWorld

How we’re tracking AI incidents around global elections

Rest of World is collecting examples of AI being used for campaigning, misinformation, and memes in a regularly updated tracker.
From the general election in Bangladesh in January to the one in Ghana in December, this year will see around 2 billion people across the world casting their votes. While...

The co-founder driving India’s two-wheeler EV revolution for over a decade

Tarun Mehta is the co-founder and CEO of Indian electric two-wheeler company Ather Energy.
When Tarun Mehta and his co-founder Swapnil Jain — both engineering graduates from the elite Indian Institute of Technology Madras — founded Ather Energy in 2013, electric scooters were a...

2024 AI Elections Tracker

As dozens of countries head to the polls, we’re monitoring the way AI is being used to inform, misinform, and entertain voters.
As more than 2 billion people in 50 countries head to the polls this year, artificial intelligence-generated content is now widely being used to spread misinformation, as well as to...

Blockchain Commons

2024 Q1 Blockchain Commons Report


In the first quarter of 2024, Blockchain Commons continued its work on specifications, updated some of its references, and did new research on constrained devices, provenance, and identity.

Specifications

dCBOR
Hashed Data Elision
FROST

Specification Docs

Multipart UR Implementation Guide
Request & Response Implementation Guide
Updated Research List

Reference Releases

Gordian SeedTool 1.6
Gordian Server 1.1
Rust Crates Updates

Constrained Devices Research

JavaCards
no_std in Rust

Provenance Research

C2PA
Source Code Provenance

Identity Research

eIDAS Dangers
Identity Dangers

The Future

Specifications

One of Blockchain Commons’ biggest priorities is producing interoperable specifications that principals in the digital asset & identity field can use to create apps and hardware devices that support independence, privacy, resilience, and openness for users. In Q1, we worked to advance some of our specifications into true standards and also interacted with standards being developed by the rest of the field.

dCBOR. Our Internet-Draft for Decentralized CBOR (dCBOR) went through drafts 6, 7, and 8 this quarter. CBOR expert Carsten Bormann joined us as a co-author, and we continued to make tweaks based on expertise from the CBOR community, most recently revising how we defined leaves (“enclosed CBOR”). dCBOR is crucial for deterministic data formats such as Gordian Envelope because it ensures that data is always encoded in the same way, no matter when or where the encoding is done. We have high hopes that dCBOR will be an IETF standard soon.
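For intuition about why this matters: the same logical value must serialize to the same bytes everywhere, so hashes over encoded data stay stable. A minimal sketch using the third-party cbor2 library’s canonical mode (an approximation: dCBOR layers further rules, such as numeric reduction, on top of canonical CBOR):

    import cbor2

    # Canonical encoding sorts map keys, so the same map yields the same
    # bytes regardless of insertion order. dCBOR guarantees this and more.
    a = cbor2.dumps({"b": 2, "a": 1}, canonical=True)
    b = cbor2.dumps({"a": 1, "b": 2}, canonical=True)
    assert a == b
    print(a.hex())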

Hashed Data Elision. We previously authored an Internet-Draft for Gordian Envelope, which we’ve continued to update in connection with our dCBOR updates. We’ve been getting less traction here, so we supplemented it this quarter with a problem-statement Internet-Draft on Deterministic Hashed Data Elision. We also presented on hashed data elision at IETF 119 Dispatch. Our core premise was that privacy and human-rights needs are not well supported in IETF standards. We believe that hashed data elision (including Gordian Envelope) should be used as an easy method to address those needs. Unfortunately, the IETF hasn’t been strong on privacy concerns. Previous RFCs on Privacy and Human Rights Considerations are mere recommendations with no weight. The bottom line seems to be that unless an existing protocol expresses a desire for privacy standards, there’s no place for hashed data elision in the IETF, though the IRTF, which focuses on “Research” instead of “Engineering”, might be a home for it.
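A toy model of the elision concept itself (not the Gordian Envelope format): commit to each field by its hash, elide a field by shipping the digest instead of the value, and the top-level digest still verifies.

    import hashlib

    def h(data: bytes) -> bytes:
        return hashlib.sha256(data).digest()

    record = {"name": b"Alice", "ssn": b"123-45-6789"}
    leaf = {k: h(v) for k, v in record.items()}
    root = h(b"".join(leaf[k] for k in sorted(leaf)))  # top-level commitment

    # Holder reveals "name" but elides "ssn", sending only its digest.
    revealed = {"name": b"Alice"}
    elided = {"ssn": leaf["ssn"]}

    # Verifier recomputes the root from revealed values plus elided digests.
    check = {**{k: h(v) for k, v in revealed.items()}, **elided}
    assert h(b"".join(check[k] for k in sorted(check))) == root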



FROST. FROST, a threshold scheme for Schnorr signatures, originated with a paper in 2020. We’ve been looking forward to its deployment in wallets because of its improved resilience and privacy, plus other advantages such as being able to change thresholds offline. See our FROST page and Layperson’s introduction to Schnorr for some foundational info. In the last six months, we’ve been doing our share to help FROST become a reality. In Q4, we held an implementer’s round table to allow people working on FROST to talk to each other. This quarter, one of those implementers, Jesse Posner, gave a presentation at our most recent Gordian Developers meeting to help to introduce developers to the powers of Schnorr and FROST. Dare we say: winter is coming? At least, FROST is.
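FROST itself is a full threshold-Schnorr signing protocol and far too subtle to sketch safely here, but the threshold idea underneath it can be illustrated with Shamir secret sharing: any t of n shares reconstruct a secret, while fewer reveal nothing. A toy 2-of-3 sketch (illustration only, not FROST and not production cryptography):

    import random

    P = 2**127 - 1  # a Mersenne prime; real schemes use the curve group order
    secret = 123456789

    # Degree-1 polynomial f(x) = secret + a1*x; shares are points (i, f(i)).
    a1 = random.randrange(1, P)
    shares = [(i, (secret + a1 * i) % P) for i in (1, 2, 3)]

    def reconstruct(s1, s2):
        (x1, y1), (x2, y2) = s1, s2
        # Lagrange interpolation evaluated at x = 0 recovers f(0) = secret.
        l1 = (-x2) * pow(x1 - x2, -1, P)
        l2 = (-x1) * pow(x2 - x1, -1, P)
        return (y1 * l1 + y2 * l2) % P

    assert reconstruct(shares[0], shares[2]) == secret  # any 2 of 3 suffice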

Be sure to also check out our January Gordian Developers Meeting with its focus on the “Multi-Part Implementation Guide for URs” and our February Gordian Developers Meeting with its Gordian SeedTool 1.6 demo, and be sure to sign up for the Gordian Developer announcements list or Signal channel so that you can hear about the demos or talks at future meetings.

Specification Docs

We want to make sure that our specifications are easily adaptable, especially to developers who might want to implement them. As a result, in Q1, we added two new implementation guides and revised how we flag the status of our specifications.

Multipart UR Implementation Guide. Multipart Uniform Resources (MURs) are Blockchain Commons’ biggest success because they allow for the interoperable and efficient creation of Animated QRs. They’ve been adopted by over a dozen wallets, mainly to pass PSBTs, but they can also pass other large data sets over an airgap: we’ve even tested megabytes! (video link). The pioneering MUR developers based their implementations on our reference code. This quarter we supplemented that with a MUR Implementation Guide that still focuses on our code, but offers explanations of how MURs work and precise instructions on how to make them work for you. Also see our January Gordian Developers Meeting for a walk-through of the Implementation Guide.
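As a naive illustration of the multipart idea only: real MURs encode CBOR as bytewords and add a fountain code so that extra parts help receivers who missed frames, but the core move is slicing one payload into sequence-numbered parts, each small enough for a single QR frame.

    import math

    def split_parts(payload: bytes, part_size: int):
        # Naive sequential chunker; real MURs use fountain-coded parts.
        total = math.ceil(len(payload) / part_size)
        for seq in range(total):
            chunk = payload[seq * part_size:(seq + 1) * part_size]
            yield f"part {seq + 1}-{total}: {chunk.hex()}"

    for frame in split_parts(b"a signed PSBT or any large blob", 8):
        print(frame)  # each frame would render as one QR in an animated QR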

Request & Response Implementation Guide. The heart of Gordian Envelope is its ability to elide information while still allowing both certification and verification of that data. That’s the Hashed Data Elision concept that we presented at IETF. However, Gordian Envelope has much more functionality than just hashed data elision, including literal functions, which can be used in a request-response system where one interoperable system asks for something and another provides it. Our request/response docs were spread across a variety of smaller docs such as the Expressions Research doc and our source code, so we consolidated it all into a single Gordian Transport Protocol Implementation Guide, which describes the layers that build up to GTP and how to do requests and responses with Gordian Envelope. As for why you might want to, our new Improving Multisigs with Request/Response document presents an important case study on the usage of this system: it makes very difficult processes, like creating a multisig with multiple devices, much more accessible by reducing decision, research, and human-initiated actions, and thus much more likely to be used.


Updated Research List. All of our specifications can be found in our Research list. Since we have specifications at a variety of levels of development, from pure investigation on our part to implementation by multiple partners, we introduced a status listing that tells developers exactly how mature each specification is.

Reference Releases

Specifications are just one step in creating interoperability. We also produce reference apps and libraries that demonstrate how to use our specifications and how to incorporate best practices.

Gordian SeedTool 1.6. Our top reference app has long been Gordian SeedTool, which demonstrates specifications like Gordian Envelope, SSKR, and URs as well as best practices for safe and resilient digital-asset holding. We’ve been working on version 1.6 for a long time, but it’s now finally out, through GitHub and the Apple App Store. It includes updates to our newest specifications, new best-practices for UIs, and integration of Tezos assets.

Gordian Server 1.1. Our oldest supported reference app is Gordian Server, a Macintosh app that installs, maintains, and runs Bitcoin Core. It demonstrates our Quick Connect URI, but more importantly it shows off some of our architectural fundamentals, such as maintaining partitioned services that are separated from each other by a TorGap. The new 1.1.0 release of Gordian Server updates for Apple’s native Arm64 (M1+) chips and also works with newer versions of Bitcoin Core, up to and including the newly released Bitcoin Core 26.1. It also contains a major update that’s been a few years coming: replacing the older RPC password files with the much more secure rpcauth system. (Thanks to Peter Denton for this update, and check out his Fully Noded for a wallet that integrates with Gordian Server!)
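For a feel for what rpcauth stores, here is a sketch modeled on Bitcoin Core’s share/rpcauth/rpcauth.py helper: bitcoin.conf keeps a salted HMAC-SHA256 of the password rather than the password itself (illustrative code, not Gordian Server’s implementation).

    import hmac, os

    def rpcauth_line(username: str, password: str) -> str:
        # Bitcoin Core verifies the password by recomputing this HMAC,
        # so the plaintext never needs to be stored.
        salt = os.urandom(16).hex()
        digest = hmac.new(salt.encode(), password.encode(), "sha256").hexdigest()
        return f"rpcauth={username}:{salt}${digest}"

    print(rpcauth_line("gordian", "correct horse battery staple"))
    # e.g. rpcauth=gordian:<salt>$<hmac>  -> goes into bitcoin.conf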

Rust Crate Updates. Over recent quarters, we converted our fundamental crypto libraries to Rust. We are continuing to keep them updated with our newest specifications and continuing to polish them, such as our recent work on bc-envelope-rust to streamline the API and reduce calls to clone().

Constrained Devices Research

Specifications, reference apps, and reference libraries represent some of our more mature work, but we’re also constantly researching other fields that might improve the management and usage of digital assets and identity online. One of our fields of research recently has been on constrained devices.

JavaCards. Could we hold assets or private keys on NFCs or JavaCards? We’ve discussed the topic at some of our recent Gordian meetings and have a Signal group dedicated to the topic, which you’re welcome to join. We’re hoping to do more with it in 2024.

no_std in Rust. Our Rust crate for bc-dcbor-rust now supports no_std. The no_std crate attribute allows Rust code to run on embedded systems and other constrained environments that don’t have access to the standard library. This means that dCBOR can now be used to serialize and deserialize data in firmware for IoT devices, microcontrollers, and smart cards. It’s another step forward in our support of constrained environments.

Provenance Research

How can you validate the provenance of data or identity? This is a natural expansion of our work with Gordian Envelope, which allows validation of data even when it’s been elided, so it’s been another source of research in the last quarter.

C2PA. We have joined the Coalition for Content Provenance and Authenticity (C2PA), which is focused on developing standards for certifying the provenance of media content. We’ve talked with them some about Gordian Envelope as a possible tool for this purpose.

Source-Code Provenance. We’ve long been thinking about source-code provenance and the validation of software developers, going back to our support for Joe Andrieu’s Amira Use Case, which grew out of RWOT5. Our software release use cases discuss many of the issues and how to resolve them with Gordian Envelope. More recently, we’ve been investigating SSH signing, which is now supported at GitHub. We’re working on a doc of best practices and also an SSH tool that will link up the envelope-cli with ssh-keygen. We’ve got a working prototype and expect to be able to talk more about the project, and the issues with software-release provenance, next quarter.

Identity Research

Work on identity, ultimately stemming from Christopher Allen’s “Path to Self-Sovereign Identity”, and continued work on DIDs and VCs at Rebooting the Web of Trust, was one of the things that got Blockchain Commons going in the first place. However, our partners and patrons have all been focused more on digital assets, so that’s where most of our work has been concentrated over the last five years. Nonetheless, we keep our foot in the identity pond, particularly for some of our Advocacy work.

eIDAS Dangers. Late in 2023, Christopher published an article on “The Dangers of eIDAS”, which is Europe’s new European Digital Identity regulation. Unfortunately, besides having some deeply flawed security models, it also ignores many of the dangers of the past.

Identity Dangers. What are those dangers? That’s the topic of “Foremembrance”, a book that Christopher has been working on for a few years, remembering how overidentification led to genocide during World War II. On March 27, which is Foremembrance Day, Christopher gave a talk on the topic on Twitter. The YouTube video and the presentation are both available.

The Future

Many of these topics will continue onward, especially our more research-focused projects, such as our look at SSH signing and software provenance. We’re also hoping to do some major work with one or more of our partners to turn many of our specifications into deployed reality.

Our next Gordian Meeting, on May 1st, will have another feature presentation: Dan Gould will talk about PayJoin.

Finally, our advocacy is heating up again as the Wyoming legislature prepares for its meetings in May through September. We are engaged in discussions and early agenda setting with the co-chairs of the Wyoming Select Committee on Blockchain. We are hoping to see topics such as digital attestations from the Secretary of State, best practices for data minimization, and duties of care and best practices for digital identity on the agenda, as well as topics that we’ve pushed in the past such as Wyoming eResidency.

Blockchain Commons needs funding to continue its work as an architect for interoperable specifications and a central forum for their discussion, because many of our funding sources dried up due to the economic conditions of recent years. If you’re a developer, please consider becoming a Benefactor to lend your name to our efforts, and if you’re a company, especially one using our work, please consider becoming a Sustaining Sponsor. We’re also open to working with partners on special projects that are open-source and aligned with our objectives, or that accelerate the deployment of our specs in your products. If sponsorship or a special project interests you, please drop us a line for more information.

We have further been seeking grants to continue our work. If you have the inside track on any grants that you think would be well-aligned with our work, or just want to make suggestions, again drop us a line.

Blockchain Commons is literally a commons: we are producing work that we hope is useful for the rest of the community, some of which is now widely deployed. But the name might have been too apt, because the Tragedy of the Commons has long warned that public resources of this sort become depleted. Help us replenish our resources to make sure the Commons continues!

Monday, 15. April 2024

FIDO Alliance

FIDO Paris Seminar: Mastering Passkeys, the Future of Secure Authentication:


FIDO Alliance and host sponsor Thales held a one-day seminar in Paris for a comprehensive dive into passkeys. The seminar provided an exploration of the current state of passwordless technology, detailed discussions on how passkeys work, their benefits, practical implementation strategies and considerations, and case studies. 

Attendees had the opportunity to engage directly with those who are currently implementing FIDO technology through open Q&A, networking and exhibits to get first-hand insights on how to move their own passkey deployments forward.

View the seminar slides below:

The State of Passkeys with FIDO Alliance.pptx

A Deep Dive on Passkeys: FIDO Paris Seminar.pptx

Digital Identity is Under Attack: FIDO Paris Seminar.pptx

Merck Moving Beyond Passwords: FIDO Paris Seminar.pptx

Passkey Providers and Enabling Portability: FIDO Paris Seminar.pptx

The Fit for Passkeys for Employee and Consumer Sign-ins: FIDO Paris Seminar.pptx

The Role of FIDO in a Cyber Secure Netherlands: FIDO Paris Seminar.pptx

Use of FIDO in the Payments and Identity Landscape: FIDO Paris Seminar.pptx

Hyperledger Foundation

Apply Now for the Annual Hyperledger Mentorship Program!

Are you passionate about blockchain and eager to impact the field? Ready for a structured, hands-on opportunity to learn the ropes of open source development? Looking for a pathway to develop source code, documentations, and research skills while helping to advance open source projects and communities? Then you should apply for the annual Hyperledger Mentorship Program.



Identity At The Center - Podcast

Dive into a conversation that marries the complexity of identity management with the subtleties of winemaking


Dive into a conversation that marries the complexity of identity management with the subtleties of winemaking. Our latest episode features John Podboy, a cybersecurity SVP and a wine enthusiast, who shares his insights on the future of IAM in the banking industry, the role of AI, and the potential of FIDO2. Plus, discover his unique perspective on how vineyards mirror the growth and challenges of digital identity. Don't miss this rich blend of topics. Listen now and enrich your understanding of the identity landscape.

#iam #podcast #idac


ResofWorld

The startup offering free toilets and coffee for delivery workers — in exchange for their data

Argentine startup Nippy aggregates data from gig workers and sells it to companies like Mastercard and Movistar, who in turn offer workers their services.
Every day, Fredy Ivan Alba Trejo bikes for over an hour through busy highways to reach pedestrian-friendly neighborhoods of Mexico City where he works as a food delivery worker for...

Friday, 12. April 2024

DIF Blog

Presentation Exchange v2.1: Working Group Approval


We are excited to announce that Presentation Exchange v2.1 has reached a significant milestone and is now under review for Working Group Approval. This update marks a critical step forward marking the specification's continued adoption. Community members and stakeholders are encouraged to provide their feedback by April 26, 2024. Barring any significant objections, the proposal will transition to the Working Group Approved state and subsequently seek the approval of the DIF Steering Committee.

What’s New in v2.1?

The latest iteration is a minor release, bringing with it several important updates for stability and adoption:

Security Enhancements: We have introduced a "Security Considerations" section, helping users navigate the security implications of the exchange more effectively.

Expanded Use Cases: A new "Use Cases" section has been added. This aims to broaden the understanding and applicability of the Presentation Exchange, providing examples and scenarios where it can be implemented.

Editorial Improvements: To further enhance the readability and clarity of the documentation, we have made various editorial changes throughout the text.

Future Developments

It’s important to note that as a minor release, v2.1 does not incorporate any breaking changes. This decision ensures stability and backward compatibility. Future potential enhancements are currently being explored and can be tracked via the "Future" tagged issues in our GitHub repository.
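For readers new to the specification, here is an abbreviated presentation_definition in the spirit of PE v2.x (consult the spec for the normative fields): a verifier asking for any credential carrying a university-degree claim.

    import json

    # Abbreviated example for illustration; see the PE spec for the
    # complete, normative structure.
    presentation_definition = {
        "id": "degree-check-01",
        "input_descriptors": [{
            "id": "degree_credential",
            "purpose": "Prove you hold a university degree",
            "constraints": {
                "fields": [{
                    "path": ["$.credentialSubject.degree.type"],
                    "filter": {"type": "string"},
                }]
            },
        }],
    }
    print(json.dumps(presentation_definition, indent=2))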

We Want to Hear from You!

Your input is invaluable to us. We invite all community members to review the proposed changes and share their feedback via GitHub issues. Your insights will play a pivotal role in shaping the final version of Presentation Exchange v2.1 and future iterations. Together, we can continue to evolve and strengthen this essential standard.


FIDO Alliance

Tech Radar: Bitwarden now supports passkeys on iOS devices


Popular free password manager Bitwarden now supports passkeys on iOS devices. The news follows the recent trend of password managers bringing passkey support to mobile, including Keeper and Proton Pass. Bitwarden added passkey support to its desktop browser extension last year, and now users can create and store passkeys on their iOS app too. Android support is yet to arrive, however.


GB News: Elon Musk just killed passwords on X, here’s what you need to use a passkey to login


Passkeys were developed by the FIDO Alliance, an industry body with the stated aim of helping to “reduce the world’s over-reliance on passwords” with the likes of Apple, Google and Microsoft amongst its members. First promoted as an alternative to passwords back in mid-2022, the clever system relies on the same biometrics that let you log in to your iPhone, iPad, Windows PCs, Samsung phones and tablets, Android phones, and dozens more, without typing out a password or PIN.


Dark Reading: Selecting the Right Authentication Protocol for Your Business


Authentication protocols like passkeys serve as the backbone of online security, enabling users to confirm their identities securely and access protected information and services. Passkeys have been deployed by several major organizations such as Google, Apple, Shopify, Best Buy, TikTok, and GitHub.
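A toy sketch of the public-key pattern behind passkeys (real WebAuthn adds origin binding, attestation, and user verification; this uses the general-purpose cryptography library, not a FIDO stack): the server stores only a public key, and each login is a signature over a fresh challenge, so there is no shared secret to phish or leak.

    import os
    from cryptography.hazmat.primitives.asymmetric.ed25519 import Ed25519PrivateKey

    # Registration: the authenticator creates a keypair; the server keeps
    # only the public half.
    device_key = Ed25519PrivateKey.generate()
    server_public_key = device_key.public_key()

    # Login: the server sends a fresh challenge; the device signs it after
    # a local biometric or PIN check.
    challenge = os.urandom(32)
    signature = device_key.sign(challenge)

    server_public_key.verify(signature, challenge)  # raises InvalidSignature on failure
    print("challenge verified: user holds the registered key")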


TechCrunch: X adds support for passkeys globally on iOS


X has officially extended support for passkeys to all global iOS users. This news was announced on the heels of the social media platform introducing passkeys to US-based users earlier in January.


EdgeSecure

EdgeCon Spring 2024

Registration Now Open! The post EdgeCon Spring 2024 appeared first on NJEdge Inc.

In partnership with The College of New Jersey, we are thrilled to bring this transformational event to a campus known for its natural beauty situated on 289 tree-lined acres in suburban Ewing Township, New Jersey, in close proximity to both New York City and Philadelphia.

EdgeCon Spring 2024 is dedicated to Excelling in a Digital Teaching & Learning Future. Featuring 15-20 breakout sessions exploring the event theme, EdgeCon Spring will also feature high-profile, industry-leading vendors from across the academic enterprise. Attendees will have the opportunity to engage with and learn from a growing community of digital learning professionals while discovering innovative solutions to help institutions solve today’s biggest digital learning challenges.

Date: April 18, 2024
Time: 9 am – 5 pm
Attendee Ticket: $49

Event Location:
The College of New Jersey
2000 Pennington Road
Ewing, NJ 08628-0718

Register Now »

Vendor/Sponsorship Opportunities at EdgeCon

Exhibitor Sponsorship and Branding/Conference Meal sponsorships are available. Vendors may also attend the conference without sponsoring, but at a higher ticket price of $250.

Contact Adam Scarzafava, Associate Vice President for Marketing and Communications, for additional details via adam.scarzafava@njedge.net.

Download the Sponsor Prospectus Now »

Agenda

8 a.m.-8:30 a.m.—Check-In & Networking

8:30 a.m.-9:30 a.m.—Breakfast, Networking, & Exhibitor Connections

9:35 a.m.-10:35 a.m.—General Session: AI and the New Era of Learning: How Higher Education Must Respond

10:45 a.m.-11:25 a.m.—Breakout Sessions

11:35 a.m.-12:15 p.m.—Breakout Sessions

12:15 p.m.-1:20 p.m.—Lunch, Networking, & Exhibitor Connections

1:30 p.m.-2:10 p.m.—Breakout Sessions

2:20 p.m.-3:00 p.m.—Breakout Sessions

3:10 p.m.-3:50 p.m.—Breakout Sessions

3:50 p.m.-5:00 p.m.—Snacks/Coffee, Networking, & Exhibitor Connections

C. Edward Watson, Ph.D.

Associate Vice President for Curricular and Pedagogical Innovation and
Executive Director of Open Educational Resources and Digital Innovation,
American Association of Colleges and Universities (AAC&U)

Announcing EdgeCon Spring 2024 Keynote Speaker

Generative AI tools, such as ChatGPT, Claude, Gemini, and others, have had an astonishing impact on the ways we learn, work, and think over the past year.  Initially, the concern for many in higher education was how students might use these tools to complete assignments; however, a much more complex and daunting challenge has emerged.  A 2023 Goldman Sachs report analyzed tasks versus jobs and concluded that two-thirds of current occupations could be partially automated by AI. This doesn’t mean that two-thirds of jobs will be replaced by AI, though some positions will indeed be lost to the new technology; rather, most of our graduates will soon be asked to collaborate with AI to complete significant portions of their work each week.

Drawing from the presenter’s new book, Teaching with AI: A Practical Guide to a New Era of Human Learning (Johns Hopkins University Press), this keynote will explore the evolving AI landscape and detail the companion challenges and opportunities that are emerging for higher education.  While academic integrity and AI detection will be discussed, the core focus of this keynote will be on concrete approaches and strategies higher education can adopt, both within the classroom and across larger curricular structures, to best prepare students for the life that awaits them after graduation.

At AAC&U, he leads national and state-level advocacy and policy efforts to advance quality in undergraduate student learning. Before joining AAC&U, Dr. Watson was the Director of the Center for Teaching and Learning at the University of Georgia (UGA). At UGA, he led university efforts associated with faculty development, TA development, student learning outcomes assessment, learning technologies, and media production services.

He has published on teaching and learning in a number of journals, including Change, Diversity & Democracy, Educational Technology, EDUCAUSE Review, International Review of Research in Open and Distributed Learning, Journal for Effective Teaching, Liberal Education, Peer Review, and To Improve the Academy, and has recently been quoted in the New York Times, Chronicle of Higher Education, Campus Technology, EdSurge, Consumer Reports, UK Financial Times, and University Business Magazine, and by the AP, CNN, and NPR regarding current teaching and learning issues and trends in higher education. His most recent book is the forthcoming Teaching with AI: A Practical Guide to a New Era of Human Learning (Johns Hopkins University Press).

Breakout Sessions

Session 1: 10:45 – 11:25 a.m.

Embracing or Limiting AI to Enhance Authentic Learning

Room: BSC 100

While fully ‘ChatGPT-proofing’ your course might be challenging, learn how to creatively design assignments that promote genuine student engagement. This session will guide you through innovative strategies to modify your assessment approach, either using or limiting AI tools, to create captivating, challenging assignments that inspire authenticity and excitement in your students.

Presenter:

Ellen Farr, Assistant Director, Center for Excellence in Teaching and Learning, The College of New Jersey
Judi Cook, Executive Director, Center for Excellence in Teaching and Learning, The College of New Jersey

We'll Do the Dirty Work: EdgeLearn and the Realities of Digital Learning & Instructional Design Support

Room: BSC 225 East

Innovation in higher education is fueled by new approaches to instructional design and technology, partnered with advances in pedagogical theory and process. But most schools don’t have the time or budget to do it because their most talented, motivated staff and faculty are weighed down by important, but somewhat monotonous tasks and responsibilities. This session will demonstrate how EdgeLearn can lessen that burden at a non-profit price and allow you to advance your online programming with ease.

Presenter:

Joshua Gaul, Associate Vice President & Chief Digital Learning Officer, Edge

Future of AI: Insights from the Next Generation

Room: BSC 225 West

Tired of the same old AI discussions? This panel flips the script! Join a conversation with college and graduate students, the future leaders in AI development and application, to hear their unfiltered thoughts and expectations. Get ready for a dynamic discussion about:

Student concerns: What ethical considerations are paramount for the next generation of AI?

Emerging trends: What exciting possibilities do students see for AI in their fields?

Bridging the gap: How can academia and industry better prepare students for the AI-powered future?

This isn’t your typical AI talk. Be prepared to be challenged and inspired!

Moderator:

Diane Rubino, Adjunct Assistant Professor, NYU 

Student Panelists:

Harshil Thakkar, Stevens Institute of Technology, Master’s Engineering Management Candidate
Evangelia Gkaravela, Stevens Institute of Technology, Master’s Engineering Management (researcher), Space Systems Engineering Candidate
Katherine Weiss, NYU, MS in PR/Corporate Communications Candidate

Session 2: 11:35 a.m. – 12:15 p.m.

Harnessing the Power of AI: A Foundation for Higher Education Faculty

Room: BSC 100

New to AI? This workshop is your launchpad! Designed specifically for faculty new to AI, this session will equip you with a foundational understanding of AI’s potential as you rethink activities and assessments to address AI disruption. We’ll break down key terms, explore innovative AI tools that personalize instruction, boost engagement, and deepen understanding. We will also address the challenges faced by artificial intelligence usage. Get hands-on ideas about where to start redesigning your online course with practical applications of AI tools in your field. Walk away with a solid foundation to revolutionize your teaching and student success!  

Presenter:

Laurie Hallick, Instructional Designer, Molloy University

Interactive Examination of Organizational Ecosystems and Online Success

Room: BSC 225 East

The goal of this interactive discussion is to address questions about the relationship between organizational structures and the success, demise, and quality of online education programs. During the session we will employ dynamic online polling to gather group insights and present them visually, and we will engage in a deep exploration of key questions dissecting the organizational ecosystem: the interplay of administrative policies, institutional culture, technology infrastructure, and student support. Through this dialogue, the goal is to identify challenges and opportunities for better synergies within institutional frameworks to advance online learning.

Presenter:

Alexandra Salas, President, Cognition Ink LLC

Integrating High Performance Computing into the Undergraduate Curriculum: Insights from the School of Science at TCNJ

Room: BSC 225 West

Almost all fields that our students enter after graduation require enhanced data science and computational skills in the modern workforce. The importance of these skill sets will only continue to increase. TCNJ’s High Performance Computing (HPC) cluster is used for computationally driven scientific research by all departments in the School of Science and supports 500 to 700 students per academic year in both class-related usage and faculty-mentored research opportunities. In this presentation we describe how we have successfully integrated HPC into our science curriculum, allowing us to equip students directly with the skills they will need to enter the 21st-century workforce, and providing faculty with a resource to engage students in transformational research experiences and hands-on learning in the classroom and the laboratory.

Presenters:

Sunita Kramer, Dean, School of Science, The College of New Jersey
Joseph Baker, Professor of Chemistry, The College of New Jersey
Shawn Sivy, HPC Systems Administrator, The College of New Jersey

Session 3: 1:30 – 2:10 p.m.

AI and the Future of Student Success

Room: BSC 100

New Jersey Institute of Technology is embracing AI to support everything from campus life to curriculum planning. In this presentation, you’ll see a glimpse of the future of student success enriched by AI. Along with their partner, Slalom, NJIT will debut their early-stage Digital Student Advisor. Ed Wozencroft, NJIT’s Vice President for Digital Strategy & CIO, will inspire you to think of the world of possibility for students and faculty… What If…?

Presenters:

Stephen Walsh, Senior Director, Public & Social Impact, Slalom
Ed Wozencroft, Chief Information Officer, VP for Digital Strategy, New Jersey Institute of Technology

MAGIC in Higher Education: Motivating, Active Learning, Gamifying, Imagining, and Collaborating

Room: BSC 225 East

“MAGIC in Higher Education: Motivating, Active Learning, Gamifying, Imagining, and Collaborating,” a convergent parallel design, mixed-methods study, assessed how embedding play into the architecture of a classroom can improve the learning process for students. We aimed to identify if changing the natural passive environment of a classroom to an active, play-driven environment would influence learning outcomes. Considering low national retention and graduation rates within community colleges, we examined concepts highlighting embedded play in the lesson as an extrinsic motivator to augment the learning process. 

We hypothesized that creating an abstract classroom learning environment, considering both passive and active learning can positively impact comprehension and the student learning experience. We collaborated with faculty and administration to investigate learning environments and teaching practices. We focused on the architecture and design of the classroom environment and how engaging students in play might strengthen its structure, increasing comprehension of the subject material. Our research revealed that play promotes positive experiences for students focusing on active learning.

The data exposed a dichotomy between teaching and learning; faculty primarily engage in passive lecture-based teaching, whereas students prefer active play-based learning. We recognized that a natural classroom environment is subjective depending on the discipline and pedagogy. Therefore, we engaged faculty to redesign a lecture to include play-based learning aligning with their discipline. The data from the active learning investigation revealed that participating in the playful activity significantly improved students’ understanding and application of the lesson’s content. Reflecting on our research and outcomes, we created a forum to showcase our data to faculty, administration, and students. This showcase has launched a play-based, active learning Community of Practice (CoP) for faculty professional development.

Presenters:

Dr. Jennifer Gasparino, Associate Professor, Human Services & Phi Theta Kappa Advisor, Passaic County Community College
Andy Perales, Program Coordinator, Teachers Excellence Project & Phi Theta Kappa Co-Advisor, Passaic County Community College
Alexandra Della Fera, Associate Professor, English, Passaic County Community College
John Paul Rodriguez, Assistant Professor, Computer Information Services, Passaic County Community College

Student Presenters:

Bilal Gebril, President, Phi Theta Kappa
Erick Vasquez Minaya, Provisional Membership Coordinator, Phi Theta Kappa
Venus John, Honors in Action Co-Chair

Beyond Barriers: Crafting Inclusive Learning Environments through Digital Accessibility and Universal Design

Room: BSC 225 West

Digital accessibility is frequently approached reactively, wherein instructors generate course content, students submit accommodation letters, and the subsequent realization of content inaccessibility prompts efforts to modify and enhance accessibility. This method proves time-consuming and perpetuates the marginalization of students by reinforcing structural and environmental barriers to learning. Rather, embracing a proactive universal design perspective in addressing digital accessibility enables instructors to prioritize the diverse needs of learners during the creation of digital content and materials. This approach minimizes the necessity for accommodations, fostering a more inclusive learning environment from the outset. While achieving digital accessibility necessitates a comprehensive commitment at the systemic and institutional levels, instructors can adopt various practices within their classrooms to advance the creation and provision of accessible course materials. This interactive workshop will guide participants in contemplating the significance of digital accessibility in higher education and in exploring practical tools for implementing digital accessibility principles across physical, hybrid, and online learning environments, grounded in a universal design approach.

Presenter:

Mel Katz, Accommodations Support Specialist for Curriculum and Assessment, The College of New Jersey

HPC, AI, and Data (HPC AID) Affinity Group

Room: BSC 104

Edge, in collaboration with our partner institutions, has launched the HPC, AI, and Data (HPC AID) Affinity Group. The group aims to expand knowledge access, community, and practice in HPC and Research Computing in support of AI and data-intensive research and education.

Join us in our mission to share information and best practices related to High Performance Computing and Research Computing and Data. The group is open to anyone with a focus on leveraging HPC and Data in support of Research and Education, so please invite peers or colleagues to join us.

Session 4: 2:20 – 3:00 p.m. Unlocking Learning: The Educational Power of DIY Escape Rooms

Room: BSC 100

Escape rooms offer numerous benefits when integrated into higher education classroom settings. By presenting students with complex puzzles and challenges, escape rooms promote teamwork, communication, and critical thinking skills. Collaborative problem-solving becomes the focal point, encouraging students to leverage each other’s strengths and expertise to achieve a common goal.

During this workshop, participants will see photos of the TLTC’s Pirate Escape Room and an example of an online escape room, followed by a discussion of pros and potential pitfalls of designing one. Everyone will receive a digital copy of resources, tips, and ideas to guide them through creating an escape room of their own.

Presenter:

Kate Sierra, Instructional Designer, Seton Hall University

Trends and Future Prospects of Digital Accessibility in Learning Environments

Room: BSC 225 East

This presentation explores the latest trends and future prospects of digital accessibility in learning environments, focusing on integrating Artificial Intelligence (AI), Voice User Interfaces (VUI), Augmented Reality (AR), Virtual Reality (VR), Mobile Accessibility, and Inclusive Design principles. Participants will gain insights into AI-driven accessibility solutions, the benefits of VUI in digital learning platforms, leveraging AR and VR for accessible learning experiences, mobile accessibility considerations, and strategies for incorporating inclusive design.

Presenter:

Laura Romeo, Instructional Designer, Edge

Active Learning Design: Contextualizing Multimedia for Knowledge Transfer

Room: BSC 225 West

Technological advancements have made multimedia a main language for conveying information and knowledge. Multimedia elements like video, audio, animation, and interactive media enable learners to encode information in multiple formats, which leads to deeper understanding. A multimedia learning environment tailored to the context allows students to integrate and interpret relationships. This approach promotes learner-centered teaching and adheres to constructivist theory. According to this theory, learners do not passively absorb new knowledge and understanding. Instead, they actively build new knowledge by experiencing and integrating new information with their prior knowledge.

This session will explore the concept of visual thinking by delving into the cognitive psychology behind media-based instruction and its role in humanizing digital learning and fostering stronger teacher-student relationships. It will also highlight the significance of interactive multimedia in learning environments. Encouraging students to interact with and manipulate media to achieve their learning goals creates an environment that promotes learning by doing. This teaching approach promotes higher-order thinking in multiple dimensions, resulting in better knowledge retention.

This session will also explore the distinct capability of multimedia in addressing various learning objectives and requirements and discuss the methods of integrating them into instructional design. I will use actual course examples to illustrate how and when students learn best. By participating in this session, attendees will better understand the distinct advantages of multimedia teaching and acquire practical techniques for integrating multimedia design into their courses.

Presenter:

Cecily McKeown, Instructional Multimedia Specialist, Hudson County Community College

Session 5: 3:10 – 3:50 p.m. The Amazing Race: Keeping Up with GenAI at Montclair State University

Room: BSC 100

In January 2023, the instructional design team at Montclair State University began ideating a response to the advances in artificial intelligence that made headlines in late 2022. Since then, Instructional Technology and Design Services (ITDS) has produced a suite of web-based resources, workshops and trainings, consultations, and more to guide University faculty through discovery and exploration of GenAI, helping them leverage it pedagogically and mitigate misuse. In this session, Montclair instructional designers Joe Yankus & Gina Policastro will share their experience composing these resources, facilitating small and large-group faculty development, lessons learned, and goals for the upcoming year.

Presenters:

Joseph Yankus, Instructional Designer, Montclair State University
Gina Policastro, Instructional Designer, Montclair State University

10 Things I Wish I Knew About Accessible Digital Media Before Becoming an Instructional Designer

Room: BSC 225 East

Word-processed documents, presentation slide decks, PDFs, and videos can all be made ready for use by all students. It’s not just a good thing to do, it’s also the law. In this presentation, you will learn ten easy tips that can help anyone have a better experience using your digital documents. 

This session will concentrate on Microsoft documents. The concepts will be applicable to other programs available on other platforms as well as documents created in the cloud.

The big ideas include the importance of headings, alternative text for images, tables, accessibility checkers, lists, font selection and color, slide titles, saving files as PDFs, reading order, and captioning.

Presenter:

Ann Oro, Senior Instructional Designer, Seton Hall University

Dual Rubrics That Offer Learning Insights

Room: BSC 225 West

Simple Rubrics support LEARNING by offering a checklist of expectations, a mechanism for delivering formative/summative evaluation, and a framework for learner reflection and self-remediation. Dual Rubrics go further to also support TEACHING by offering a means for making an inference about students’ mastery of learning outcomes/competencies. Implementing Dual Rubrics in the Canvas LMS, at the course or program level, offers a data-driven opportunity to incorporate learning insights that support quality improvement in instructional effectiveness and curricular design.

Presenter:

Karen Harris, Senior Instructional Designer and Assessment Specialist, Rutgers University

The post EdgeCon Spring 2024 appeared first on NJEdge Inc.


AI Teaching & Learning Symposium, presented by Edge and Seton Hall University

Join us for the AI Teaching & Learning Symposium. The post AI Teaching & Learning Symposium, presented by Edge and Seton Hall University appeared first on NJEdge Inc.

Join Edge and Seton Hall University for the inaugural “AI Teaching & Learning Symposium”. The symposium will consider the impact of AI on teaching, learning, and the student experience. Located in the quaint town of South Orange, New Jersey, the 58-acre main campus is only 14 miles from Manhattan.

Date: Tuesday, June 11, 2024

Event Location:
Seton Hall University
400 South Orange Avenue
South Orange, NJ 07079

REGISTRATION COMING SOON

Call for Proposals

Submit your presentation topic for the upcoming AI Teaching & Learning Symposium, presented by Edge and Seton Hall University! The inaugural symposium will consider the impact of AI on teaching and learning.

Submit Proposal »

The post AI Teaching & Learning Symposium, presented by Edge and Seton Hall University appeared first on NJEdge Inc.


ResofWorld

This delivery app takes away health insurance when workers don’t meet quotas

Rest of World spoke with 40 riders for Swiggy in India. Many described losing coverage when they needed help the most.
Delivery worker Rakesh was dropping off food orders in the southern Indian city of Hyderabad in late January when he received a distressing call from his wife — she was...

Thursday, 11. April 2024

Trust over IP

ToIP Announces the First Implementers Draft of the Trust Spanning Protocol Specification

Read about a protocol that is to digital trust what the Internet Protocol (IP) is to digital data. The post ToIP Announces the First Implementers Draft of the Trust Spanning Protocol Specification appeared first on Trust Over IP.
Why do we need a Trust Spanning Protocol?
Where can I get a high-level overview of TSP?
What does the Implementers Draft cover?
How does TSP differ from other trust protocols?
What implementation projects have been announced?
What kind of feedback are we seeking on this draft?
How can you provide feedback?

Why do we need a Trust Spanning Protocol?

No one would question that the Internet has completely transformed the global information economy. It has become indispensable for connectivity and reliable content delivery. But as it has grown, so have the threats against it and the vexing challenges in deciding what to trust. 

Now, AI is pushing those concerns into overdrive. A 2023 study by CyberArk found that 93% of security professionals expect AI-enabled threats to affect their organization in 2023—with AI-powered malware cited as the #1 concern. No less an industry luminary than Marc Andreessen recently said that the war of detecting AI fakes was unwinnable—our only solution was to find a way to “invert the problem” by being able to prove content authenticity.

Why, after 30 years of steadily compounding security issues, does industry not yet have a fix? Why, with technologies like DNSSEC and TLS, and industry bodies like IETF and the CA/Browser Forum, do we still have daily headlines about data breaches, ransomware attacks, identity theft, and malware infestations? Why, with the explosive interest in generative AI, are many experts more worried about it being used to attack us than to protect us?

The answer is the reason the Trust Over IP (ToIP) Foundation was founded four years ago. In short, authenticity, confidentiality, and metadata privacy features were never built into the core fabric of the Internet. To solve the root of this problem and not the symptoms, we need a next-generation trust layer on top of the existing Internet.

The heart of this layer is a protocol that is to digital trust what the Internet Protocol (IP) is to digital data. That is the ToIP Trust Spanning Protocol (TSP).

Where can I get a high-level overview of TSP?

First, start with this blog post we published in January 2023 when we launched the ToIP Trust Spanning Protocol Task Force (TSPTF). It explains the overall purpose of the TSP and where it fits in the ToIP stack.

Second, read the Mid-Year Progress Report on the ToIP Trust Spanning Protocol, published in August 2023 to summarize the seven pillars of the TSP design. With the exception of some terminology evolution, these seven pillars have not changed as we worked through multiple stages of Working Drafts over the past seven months.

Today we are pleased to announce the release of the first Implementers Draft.

What does the Implementers Draft cover?

The following summarizes the 10 major sections of the specification:

Verifiable Identifiers (VIDs): VIDs are the first of the seven pillars — cryptographically verifiable identifiers that provide technical trust in the TSP. Covers: why they are necessary, how they provide access to public keys and ToIP endpoint addresses, how they are verified, and how keys can be rotated.
Messages: TSP is a message-based asynchronous communication protocol. Covers: message envelopes, payloads (confidential, non-confidential, headers), signatures, relationship setup and out-of-band introductions.
Nested Messages: TSP messages can be nested one or two layers to achieve metadata privacy. Covers: payload nesting, nested relationship VIDs.
Messages Routed through Intermediaries: TSP messages can be routed through intermediaries for several reasons, e.g., asynchronous delivery, reliability, and performance. However, the primary focus is metadata privacy protection. Covers: message routing, direct neighbor relationships, endpoint-to-endpoint (“tunneled”) messages, private VIDs, single intermediaries, two-or-more intermediaries.
Multi-Recipient Communications: TSP messages may be sent to multiple recipients. Covers: multi-recipient lists and anycast intermediaries.
Control Payload Fields: TSP messages can be multi-purpose, so rather than dedicated control messages, the specification defines control payloads. Covers: relationship formation (parallel, nested, third-party introductions), relationship events (key updates, routing info, and relationship cancellation).
Cryptographic Algorithms: The authenticity and confidentiality properties of TSP rely on public/private key cryptography. Covers: public key signatures, public key authenticated encryption, encryption and decryption primitives, Hybrid Public Key Encryption (HPKE), Libsodium Sealed Box.
Serialization and Encoding: TSP uses Composable Event Streaming Representation (CESR) for message serialization and encoding. CESR supports popular data encoding formats including JSON, CBOR, MsgPack, and others. Covers: envelope encoding, payload encoding (non-confidential, confidential, nested), signature encoding.
Transports: TSP’s authenticity, confidentiality and metadata privacy properties are designed to be independent of the choice of transport protocol. Covers: transport service interface, transport mechanism examples.
Appendix A: Test Vectors: (Still being completed) Test vectors for common use cases. Covers: direct mode messages, direct mode nested messages, routed mode messages.

How does TSP differ from other trust protocols?

Proposing a fundamentally new Internet-scale protocol for digital trust is an ambitious undertaking. Why did the ToIP Foundation take this path? Let’s start by looking at related efforts in this area.

Related protocols

The following summarizes other well-known protocols that address various facets of digital trust:

OpenID Connect (OIDC): An authentication layer on top of the OAuth 2.0 authorization framework, specified by the OpenID Foundation as a RESTful HTTP API using JSON as a data format. Supports basic user profile information access; optional features include encryption of identity data, discovery of OpenID providers, and session management.
OpenID for Verifiable Credentials (OID4VC): A family of protocols from the OpenID Connect Working Group built on top of OIDC for issuance (OID4VCI) and presentation (OID4VP) of verifiable digital credentials, plus a wallet-based user authentication protocol (SIOP).
DIDComm: Specified by the DIDComm Working Group of the Decentralized Identity Foundation (DIF), DIDComm is a peer-to-peer secure messaging protocol in which the endpoints are specified by DIDs (decentralized identifiers).
TLS (Transport Layer Security): A cryptographic protocol from the IETF best known for enabling secure HTTPS browser connections; also widely used in applications such as email, instant messaging, and voice over IP. Provides security, confidentiality, integrity, and authenticity through the use of X.509 digital certificates.
MLS (Messaging Layer Security): Specified by the MLS Working Group of the IETF, MLS is a security layer for end-to-end encrypted messages in arbitrarily sized groups. Its security properties include message confidentiality, message integrity and authentication, membership authentication, asynchronicity, forward secrecy, post-compromise security, and scalability.
RCS (Rich Communication Services): A text-based mobile messaging protocol specified by GSMA to replace SMS messages with a richer feature set including in-call multimedia. RCS does not natively support end-to-end encryption; Google added it using the Signal Protocol in their own implementation. Apple has said it will support RCS once GSMA standardizes end-to-end encryption.
Signal Protocol: A non-federated cryptographic protocol specified by the Signal Foundation that provides end-to-end encryption for voice and instant messaging conversations. The protocol combines the Double Ratchet Algorithm, prekeys, and a triple Elliptic-curve Diffie–Hellman (3-DH) handshake.
Matrix Protocol: An application layer communication protocol for federated real-time communication specified by the Matrix Foundation, it provides HTTP APIs for securely distributing and persisting messages in JSON format over an open federation of servers. It can integrate with standard web services via WebRTC to facilitate browser-to-browser applications.
DNSSEC: A suite of extension specifications from the IETF for securing data exchanged in the Domain Name System (DNS). The protocol provides cryptographic authentication of data, authenticated denial of existence, and data integrity, but not availability or confidentiality.
ISO/IEC 14443-4:2018: A half-duplex block transmission protocol designed for a contactless environment, it defines the activation and deactivation sequence of the protocol. It is intended for use with other parts of ISO/IEC 14443 and is applicable to proximity cards or objects of Type A and Type B.

Related cryptographic data structures

Protocols are not the only ingredient required for Internet-scale digital trust. The following summarizes some of the standard cryptographic data structures that have been developed:

X.509 digital certificates: An ITU standard defining the format of the public key certificates used in many Internet protocols, including TLS, as well as digital signatures. An X.509 certificate binds an identity (a hostname, or an organization, or an individual) to a public key using a digital signature. A certificate is signed either by a certificate authority (CA) or self-signed. X.509 also defines certificate revocation lists and a certification path validation algorithm.
Encrypted/signed PDF files: Portable Document Format, originally developed by Adobe, became an ISO standard in 2008. A PDF file may be encrypted; PDF 2.0 defines 256-bit AES encryption as the standard but also defines how third parties can define their own encryption systems for PDF. ISO 32000-2 defines how PDF files may be digitally signed for secure authentication.
Verifiable digital credentials: With the emergence of digital wallets, multiple formats for cryptographically verifiable digital credentials have been developed, including the W3C Verifiable Credentials Data Model; ISO mDL/mDOC; IETF SD-JWTs; Hyperledger AnonCreds; and ToIP Authentic Chained Data Container (ACDC).
C2PA content credentials: The C2PA standards define a model for binding cryptographically verifiable provenance information to digital media content together with a model for evaluating the trustworthiness of that information.

How and why is TSP different?

As the sections above show, many thousands of person-hours have been invested in protocols and cryptographic data structures designed to address the Internet’s multitude of security, confidentiality, and privacy issues. So why have the members of the ToIP Foundation spent four years developing TSP?

The fundamental reasons are summarized below:

Minimalist design as a spanning layer for higher-layer protocols: The single most important design goal for the TSP—and the biggest differentiator from the protocols listed above (with the possible exception of DIDComm)—is the critical role of a spanning layer in a protocol stack. The reasons it must be “as simple as possible but no simpler” are explained at length in Principle #3 of the Design Principles for the ToIP Stack. The TSP does not include many of the features of the protocols above precisely because it is designed so those features can be provided in higher-level protocols. The benefit is that all of those higher-level protocols can be much simpler and more future-proof because they automatically inherit all the technical trust features achieved at the TSP layer.
Decentralized peer-to-peer architecture: By building on HTTP and RESTful APIs, the OpenID family of protocols is inherently Web-centric (client/server). The TSP does not make that architectural assumption. Like the IP protocol, it can work between any two peers across any kind of network or software architecture.
VIDs & DIDs: Like DIDComm, all TSP endpoints use cryptographically verifiable identifiers (VIDs) such as those defined by the W3C Decentralized Identifiers (DIDs) specification. VIDs not only support full decentralization, but they provide portability and cryptographic agility for lifetime relationship management.
Public Key Authenticated Encryption and Signature: TSP combines modern Public Key Authenticated Encryption and Public Key Signature to provide the strongest protection against both key compromise impersonation and sender impersonation. This is achieved by using either the Auth Mode primitives defined by IETF RFC 9180 HPKE (Hybrid Public Key Encryption), or HPKE Base Mode or Libsodium Sealed Box primitives enhanced with an ESSR (Encrypt Sender Sign Receiver) pattern (a toy sketch of this pattern follows below).
Payload agility: TSP uses the CESR text-binary dual encoding format that supports composability of both text and binary primitives—including JSON, CBOR, and MsgPack—in the same message.
Cryptographic agility: Another key feature of CESR is code tables for all types of cryptographic primitives. For example, it can transmit any of the cryptographic data structures listed above. This enables TSP ecosystems to standardize on precise signature and encryption algorithms yet still evolve them over time.
Transport independence: Although the name “Trust Over IP” suggests a dependence on the TCP/IP stack for transport, in fact a core design goal of TSP is to provide end-to-end authenticity, confidentiality, and metadata privacy entirely independent of the underlying transport protocol.
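
To make the encryption-plus-signature item above concrete, here is a toy Python sketch of the general ESSR idea, using the PyNaCl bindings to libsodium: the sender’s identity travels inside the sealed-box ciphertext, and the sender’s signature binds the receiver’s identity to that ciphertext. This is only an illustration of the pattern with made-up identifiers; it is not the TSP wire format, which is CESR-encoded.

```python
# Toy sketch of the ESSR (Encrypt Sender Sign Receiver) pattern using
# PyNaCl (libsodium bindings). Identifiers are made up for illustration;
# this is NOT the TSP wire format.
import json
from nacl.public import PrivateKey, SealedBox
from nacl.signing import SigningKey

sender_signing_key = SigningKey.generate()  # sender's signature key pair
receiver_key = PrivateKey.generate()        # receiver's encryption key pair

# Encrypt: the sender's identity rides inside the sealed-box ciphertext.
inner = json.dumps({"sender_vid": "did:ex:alice", "msg": "hello"}).encode()
ciphertext = SealedBox(receiver_key.public_key).encrypt(inner)

# Sign: the signature binds the receiver's identity to the ciphertext,
# guarding against key compromise and sender impersonation.
envelope = b"did:ex:bob|" + ciphertext
signed = sender_signing_key.sign(envelope)

# Receiver side: verify the signature, then decrypt and check that the
# sender named inside matches the key that signed the envelope.
verified = sender_signing_key.verify_key.verify(signed)
plaintext = SealedBox(receiver_key).decrypt(verified[len(b"did:ex:bob|"):])
assert json.loads(plaintext)["sender_vid"] == "did:ex:alice"
```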

What implementation projects have been announced?

In parallel with the first Implementers Draft, at next week’s Internet Identity Workshop #38 we will be announcing the first two TSP implementation projects—each one led by one of the primary authors of the TSP specification:

A Rust open source implementation led by co-author Wenjing Chu is being proposed as a new project at the OpenWallet Foundation (OWF), sponsored by OWF Premier members Futurewei, Gen, and Accenture.
A Python open source implementation is being developed by co-author Sam Smith and his colleagues at the Web of Trust GitHub community project.

If you are interested in contributing to either of these projects or starting your own, we welcome your collaboration. Just contact us via the ToIP contact page or GitHub.

What kind of feedback are we seeking on this draft?

As always, we would like feedback on the usual questions about any new protocol specification:

Is the spec clear and understandable?
Are there any missing sections?
Are there places where more examples or illustrations would be helpful?
Are there specific test vectors you would like to see added?

In addition, we are specifically looking for your feedback in the following areas:

How would you imagine using the TSP? How can it enhance what you already have?
What types of trust task protocols are you most interested in layering over the TSP?
Does the TSP provide the baseline capabilities you need to support your planned trust task protocol(s)?
Does the TSP enable your solution to be more decentralized?
What types of VIDs do you plan to implement support for?
What types of transport protocols do you intend to bind to?
What cryptographic algorithms do you want or need to use?
What problems are you trying to solve in your tech stack that are not addressed by existing solutions and can or should be addressed by TSP?
Are there other protocols in development that we are not aware of that may conflict with or complement TSP?

How can you provide feedback?

To review the specification:

GitHub Pages version: https://trustoverip.github.io/tswg-tsp-specification
Markdown version: https://github.com/trustoverip/tswg-tsp-specification

To make a comment, report a bug, or file an issue, please follow the ToIP Public Review Process on GitHub:

Bugs/Issues: https://github.com/trustoverip/tswg-tsp-specification/issues
Discussion: https://github.com/trustoverip/tswg-tsp-specification/discussions

The post ToIP Announces the First Implementers Draft of the Trust Spanning Protocol Specification appeared first on Trust Over IP.


Ceramic Network

Ceramic World 03


Welcome to the third edition of CeramicWorld, our monthly ecosystem newsletter. We have lots of updates to share with you all - EthDenver recap, the latest updates on the Ceramic roadmap, the OrbisDB alpha launch, and so much more. Let’s dive in!

EthDenver 2024 recap

Ceramic and Tableland co-hosted the Proof of Data Summit at ETHDenver. The event was a full-day community gathering focusing on reputation, identity, DePIN, decentralized AI, and decentralized computing.

The summit featured engaging lightning talks, technical discussions, and panels with industry leaders, sparking new ideas and collaborations in decentralized technologies. We heard from Juan Benet (Protocol Labs), MetaMask, Karma3 Labs, Fluence, Gensyn, and more, who shared their expertise and perspectives, helping us gain a deeper appreciation for the power of decentralized networks.

If you missed any of these talks, you can catch up on the conversations on our YouTube channel!

Watch the recordings of the Proof of Data Summit

Orbis team announces OrbisDB Beta - a new database powered by Ceramic

Last month, the Orbis team kicked off EthDenver with a bang—they announced the OrbisDB beta release. OrbisDB is a new database built using Ceramic’s upcoming Data Feed API. This is a big leap in Ceramic’s ecosystem growth, as we hope to see more tools like this built on top of Ceramic.

OrbisDB offers an intuitive SQL interface to explore and query data stored on Ceramic. It also comes with an easy-to-use, no-code setup and an ORM-like SDK framework. On top of that, OrbisDB Plugins can be used to enhance OrbisDB instances and enable different actions during the stream’s lifecycle, including:

Gating mechanisms
Enrichment of streams
Triggering actions post-indexing
Creation of streams

It’s a big leap for the Ceramic ecosystem, unlocking new tools and use cases. We already have some ideas floating around about using OrbisDB Plugins as game engines.

Start building with OrbisDB today!

Oamo becomes the first points provider on Ceramic

Oamo has partnered with Ceramic to take the first steps towards developing and standardizing the first decentralized points system.

Having long partnered with Ceramic on projects harnessing Ceramic's innovative DID (Decentralized Identifier) infrastructure and ComposeDB for zero-party data storage, Oamo is now the first points data provider on Ceramic. Oamo will issue tens of millions of publicly available credentials based on wallets’ on-chain activity and holdings.

This partnership will supercharge the Ceramic ecosystem:

Enhancing digital identity and engagement through credential distribution; a decentralized and standardized points system and a credentials and points SDK will make it easier for developers to build using the new points system.
Allowing users to claim their credentials and scorecards seamlessly.
Powering development across multiple use cases - DeFi, NFT, Wallet Providers, Liquid Staking, Game Development, and others.

Read more on the Ceramic blog

Points on Ceramic

Like many in the ecosystem, we’ve been thinking a lot about points lately. They represent a powerful way to attract, measure and reward users for activity, reputation and credentials. However, many teams still work with points tabulated and stored on centralized databases.

To unlock one of the core promises of web3, we’ve been engaging deeply in building and fostering the technical infrastructure for truly open, decentralized points storage.

To learn more, check out our new Points landing page.
To get a deeper sense of how we’re thinking about points, read ‘Points: How Reputation & Tokens Collide’ by our co-founder, Danny Zuckerman.
We built our own points application on Ceramic. Get all the technical details here, thanks to our partner engineer, Mark Krasner.

Check out the Points landing page

⚠️ Breaking change notice: ComposeDB v0.7 is out. Upgrade your Ceramic server to v5.3
A recent release of ComposeDB v0.7 introduced quite a few new features, including a new SET account relation when defining models, and more (check out the detailed release notes). To use this version of ComposeDB, developers will have to upgrade their Ceramic server to v5.3, as it is not compatible with earlier versions of Ceramic.

This release included a few patches that enable upgrading ComposeDB and the Ceramic server independently. Depending on which versions of Ceramic and ComposeDB you have been running and whether or not you have been using deterministic documents, there are a few upgrade considerations to keep in mind.

Check out this forum post for more details and instructions on how to upgrade.

Ceramic roadmap update
Recently, we published our quarterly Ceramic roadmap update. Over the past few months, the core Ceramic team has been making strides in improving the Ceramic server performance, shipping ComposeDB features like SET account relation, and, most importantly, taking big steps towards enabling developers to create custom indexes by announcing the Data Feed API alpha release.
Check out the detailed roadmap overview here.

Build using ComposeDB’s new SET account relation
Recently, we added a new account relation to ComposeDB: the SET account relation. It complements the SINGLE and LIST account relations by creating a new type of relation that must be unique per combination of user account (DID) and instance of a model. With the SET account relation, you can now implement features like “post likes,” meaning that each user can “like” a post only once.
Check out the documentation and start building using the SET account relation.

Ceramic Community Content

CAIP CIP-146 After
TRENDING Toward the first decentralized points system: Oamo becomes the first points provider on Ceramic
DISCUSSION Orbis plugins as gaming engines; Ceramic x OrbisDB
WORKSHOP Intro to ComposeDB on Ceramic; Ceramic at LearnWeb3 Decentralized Intelligence Hackathon
TUTORIAL Building points on Ceramic - an Example and Learnings by Mark Krasner
BLOGPOST Points: How Reputation & Tokens Collide by Danny Zuckerman

Contact Us

Want to get in touch with the Ceramic core team? Fill out this form (1m). Otherwise, drop us a note in the Forum.


ResofWorld

What Elon Musk doesn’t understand about Brazil

You can’t fight for free speech in Brasília from Texas.
Lately, it is hard to say why Elon Musk does what he does. You can pick out themes like right-wing populism or a libertarian founder cult, but they don’t explain...

TSMC’s rise has young tech hopefuls moving to Taiwan

Southeast Asians are heading to Taiwan to train for semiconductor jobs, which is helping to fill a talent gap at the world’s top producer.
When Hans Juliano was looking to go abroad for a master’s degree in semiconductors, the Indonesian student initially considered Japan and South Korea. But he wasn’t eligible for a scholarship...

Velocity Network

Empowering Self-Sovereign Identity: Revolutionizing Data Control With Velocity Network

The post Empowering Self-Sovereign Identity: Revolutionizing Data Control With Velocity Network appeared first on Velocity.

Wednesday, 10. April 2024

EdgeSecure

Ecosystem for Research Networking (ERN) Summit 2024

The post Ecosystem for Research Networking (ERN) Summit 2024 appeared first on NJEdge Inc.

NEWARK, NJ, April 10, 2024 –

Dr. Forough Ghahramani, Assistant Vice President for Research, Innovation, and Sponsored Programs, Edge, and Ecosystem for Research Networking (ERN) Steering Committee member, is joining an esteemed group of scientific and cyberinfrastructure researchers at the Ecosystem for Research Networking (ERN) Summit 2024. 

The Ecosystem for Research Networking Summit provides the scientific and cyberinfrastructure research community an opportunity to come together and discuss the ERN mission and accomplishments, hear from domain researchers and CI professionals at smaller institutions about the successes and challenges related to leveraging local, regional, and national resources for projects, and learn about funding resources and partnership opportunities, as well as regional and national communities.

This year’s summit will be held April 11-12, 2024 at the Pittsburgh Supercomputing Center in Pittsburgh, PA. There will be open discussions and conversations on focus areas and policies as they pertain to areas of community interest including AI, quantum, Big Data, cybersecurity and protecting data, research instruments, workforce development, applications for ERN, education and training.

“As the co-chair of the ERN Broadening the Reach working group, gaining a better understanding of the advanced computing and resource requirements and how the ERN can support the needs of the smaller institutions, historically black colleges and universities (HBCUs), and Minority Serving institutions is an important aspect of our mission. I am excited to learn from the community how the ERN can expand outreach and increase collaboration opportunities for broadening the reach and impact in support of the research community in the smaller-less resourced institutions.”

— Dr. Forough Ghahramani
Assistant Vice President for Research, Innovation, and Sponsored Programs, Edge

Featuring a keynote speech titled Re-imagining American Innovation: Bridging the Gap to Unlock America’s Stranded Brilliance, presented by Dr. James Martin, NSF Equal Opportunities in Science and Engineering Committee member and Vice Chancellor for STEM Research and Innovation at the University of Pittsburgh, the two-day summit will also feature domain researchers and CI professionals from smaller institutions discussing the successes and challenges of leveraging local, regional, and national resources for projects, along with sessions on funding resources, partnership opportunities, and regional and national communities. In addition to representation from small schools, HBCUs, and MSIs, we are grateful to National Science Foundation (NSF) colleagues who will also be in attendance, providing opportunities for interaction between attendees and the NSF program directors. The full Summit agenda is available HERE.

Elaborates Dr. Barr von Oehsen, Director of the Pittsburgh Supercomputing Center, “Scientific discoveries have always been driven by advances in instruments of observation. Today, experimental tools are more advanced and more costly to construct and maintain, and the interpretation and simulation of data is more dependent on the use of cutting-edge computing resources, services, and knowledge. Consequently, many academic institutions lack access to these facilities. Our aim is to democratize access to research instruments, and the ERN Summit provides a platform for devising strategies to achieve this objective.”

For Summit details, please visit the ERN Summit Events website. If you have any questions, please contact us via email at info@ern.ci.

About Edge

Edge serves as a member-owned, nonprofit provider of high performance optical fiber networking and internetworking, Internet2, and a vast array of best-in-class technology solutions for cybersecurity, educational technologies, cloud computing, and professional managed services. Edge provides these solutions to colleges and universities, K-12 school districts, government entities, hospital networks and nonprofit business entities as part of a membership-based consortium. Edge’s membership spans the northeast, along with a growing list of EdgeMarket participants nationwide. Edge’s common good mission ensures success by empowering members for digital transformation with affordable, reliable and thought-leading purpose-built, advanced connectivity, technologies and services.

The post Ecosystem for Research Networking (ERN) Summit 2024 appeared first on NJEdge Inc.


Oasis Open Projects

OASIS Open: the Best Suite of Standards for ESG Data Reporting and Compliance

The role of audit and assurance in environmental, social, and governance (ESG) reporting is crucial for enhancing the credibility, reliability, and accuracy of ESG disclosures. As investors, regulators, and other stakeholders increasingly rely on ESG information to make informed decisions, the demand for high-quality, verifiable ESG data grows. Auditors and assurance providers play a key […]

By Francis Beland, Executive Director, OASIS Open

The role of audit and assurance in environmental, social, and governance (ESG) reporting is crucial for enhancing the credibility, reliability, and accuracy of ESG disclosures. As investors, regulators, and other stakeholders increasingly rely on ESG information to make informed decisions, the demand for high-quality, verifiable ESG data grows.

Auditors and assurance providers play a key role in verifying ESG reports, ensuring they meet established standards and guidelines, and providing stakeholders with confidence in the reported information. Integrating OASIS Open standards such as UBL, OData, ebXML or STIX/TAXII can significantly enhance the effectiveness and efficiency of audit and assurance processes in ESG reporting.

Enhanced Data Exchange and Interoperability

UBL and ebXML facilitate standardized electronic business document exchange. AS4, a standard for secure document exchange, ensures that ESG data and reports are transmitted between entities securely and reliably.

Secure Data Access and Authentication

SAML can be used to secure access to ESG reporting and data systems, ensuring that only authorized individuals and entities can view or modify sensitive ESG data.

Standardization of Codes and Terms

Genericode and Code List Representation standards help in defining and using standardized codes and terminologies in ESG reporting.

Efficient Data Querying and Management

OData facilitates simple and standardized queries for data, including ESG information stored across different databases and systems. BDXR standards can be used to discover and connect with ESG reporting entities and systems, streamlining the process of obtaining necessary reports and data for auditing purposes.
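
As a concrete illustration, here is a hedged sketch of an OData v4 query against a hypothetical ESG reporting service, written in Python with the requests library; the service root, the Emissions entity set, and the field names are invented for this example and are not defined by any OASIS standard.

```python
# Hypothetical example: querying ESG emissions data via OData v4.
# The service root, entity set, and field names are illustrative only.
import requests

SERVICE_ROOT = "https://esg.example.org/odata"  # hypothetical OData service

params = {
    "$filter": "reportingYear eq 2023 and scope eq 'Scope1'",
    "$select": "facilityId,co2TonnesEquivalent",
    "$orderby": "co2TonnesEquivalent desc",
    "$top": "10",
}
resp = requests.get(f"{SERVICE_ROOT}/Emissions", params=params, timeout=30)
resp.raise_for_status()

# OData v4 JSON responses wrap result rows in a "value" array.
for row in resp.json()["value"]:
    print(row["facilityId"], row["co2TonnesEquivalent"])
```

The same system query options ($filter, $select, $orderby, $top) work against any OData-compliant endpoint, which is what makes the standard useful for audit tooling that must pull comparable data from many systems.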

Cybersecurity and Information Sharing

STIX/TAXII standards for cybersecurity threat information sharing can help auditors and assurance providers stay informed about potential cyber threats to ESG reporting systems.

Blockchain-based Verification

The Baseline Protocol offers a framework for establishing verifiable, consistent records of ESG data and transactions on public blockchains without exposing sensitive information.

By leveraging these OASIS Open standards, auditors and assurance providers can ensure that ESG reporting is not only consistent and reliable but also meets the high standards of data security, integrity, and accessibility demanded by stakeholders. These technologies enable more efficient audit processes, reduce the risk of errors, and increase the overall trust in ESG reporting.

The post OASIS Open: the Best Suite of Standards for ESG Data Reporting and Compliance appeared first on OASIS Open.


MOBI

SC-ABeam

SC-ABeam Automotive Consulting was established as a joint venture between Sumitomo Corporation, a general trading company with wide-ranging business operations that cover the automotive and mobility sectors, and ABeam Consulting, a global consulting firm originating in Asia. Drawing on the strengths of its two corporate investors, SC-ABeam will contribute to the automotive and mobility sectors [...

SC-ABeam Automotive Consulting was established as a joint venture between Sumitomo Corporation, a general trading company with wide-ranging business operations that cover the automotive and mobility sectors, and ABeam Consulting, a global consulting firm originating in Asia. Drawing on the strengths of its two corporate investors, SC-ABeam will contribute to the automotive and mobility sectors by engaging in consulting activities focused on value creation in order to achieve sustainable growth in conjunction with society. https://www.sc-abeam.com/en/

The post SC-ABeam first appeared on MOBI | The New Economy of Movement.


MyData

Data Spaces Alliance Finland: United to move faster and stronger 

The Alliance brings together Finnish pioneers in data technology and offers its members a unified view to develop solutions that cross their organizational boundaries.  The Alliance is a collaborative community working to accelerate, build, and utilize data spaces in Finland. It strives to accelerate the growth and maturity of the Finnish data space initiatives that […]

Tuesday, 09. April 2024

ResofWorld

India’s electric rickshaws are leaving EVs in the dust

Little-known e-rickshaw companies like YC Electric are at the forefront of the country’s EV revolution.
At a small factory just north of Delhi, a welder named Ram Baran spends several hours each day training his coworkers in metal cutting, molding, and shaping bodies of three-wheeler...

We Are Open co-op

Examining the Roots

Unpacking the foundations of Verifiable Credentials

Image CC BY-ND Visual Thinkery for WAO

Did you ever consider that looking at a tree only reveals half of its story? Much like a tree’s roots, which stretch as wide and deep as its branches, the visible aspect of technology barely scratches the surface.

In this post, we’re going to look at the underpinnings of technologies such as microcredentials, particularly those based on Verifiable Credentials. We share two crucial insights: the importance of understanding the ideological foundations we use, and how seemingly-similar technologies can differ significantly beneath the surface.

The old adage ‘technology is not neutral’ may be true, but that’s just the tip of the iceberg. Technologies that achieve widespread use do so because of rich, complex backgrounds.

The Role of Standards

In our everyday lives, we interact seamlessly with technologies that allow us to pay for a coffee with our smartphones using Apple or Google Wallet. We scan QR codes to find a link to websites. We use digital boarding passes.

All of these examples are built upon standards — agreements on how technologies should operate, which are developed collaboratively by individuals and organisations. These people either have a deep interest in the topic, work on it professionally, or both!

Image CC BY-ND Visual Thinkery for WAO

The aim behind Verifiable Credentials is for them to be integrated into society to the same extent as payment methods and boarding passes. The difference is that they help us prove our identity and our achievements.
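
To make this concrete, here is a minimal sketch of the kind of payload the W3C Verifiable Credentials data model describes, written as a Python dictionary; the issuer, subject, and achievement are invented placeholders, and a real credential would also carry a cryptographic proof.

```python
# A minimal, unsigned Verifiable Credential payload, sketched as a Python
# dict. Identifiers are illustrative; a real credential carries a
# cryptographic proof (e.g., a Data Integrity proof or a JWT signature).
credential = {
    "@context": ["https://www.w3.org/ns/credentials/v2"],
    "type": ["VerifiableCredential", "OpenBadgeCredential"],
    "issuer": "did:example:university",
    "validFrom": "2024-04-09T00:00:00Z",
    "credentialSubject": {
        "id": "did:example:learner",
        "achievement": {"name": "Introduction to Open Recognition"},
    },
}
```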

As we highlighted in a previous post, standards ensure consistency:

A standard can be just the usual way of doing something. A standard can also be a reference to make sure things can be understood in the same way, no matter where or what is referencing the standard.
For example, a kilogram in France is the same as a kilogram in Germany, and a kilogram of feathers weighs the same as a kilogram of bricks. The kilogram is the standard, but where it is applied or what it is used for is up to whoever is using it.

This consistency is of critical importance for credentials to remain valid and recognised, long outliving the organisations that initially adopt them.

Community Interaction

Standards don’t spontaneously arrive, but are rather the fruits of community interaction. This collaboration happens in formal settings such as the W3C or the IEEE, or through more informal groups, as was the case with ActivityPub. The latter laid the foundation for Fediverse apps such as Mastodon.

The drive to develop Verifiable Credentials has been fueled by various needs, from decentralising proof of identity to enabling the issuing of trusted documents such as passports and degrees at scale. There are also those looking to Verifiable Credentials to finally deliver on the dream of Open Recognition.

Diversity within communities developing standards is vital. Without a range of perspectives, there is the risk that the resulting technologies serve only a fraction of their potential users. When we’re talking about proof of identity and achievement, this is an important consideration.

The Importance of Ideology

Ideologies shape the vision and goals of a community, and encompass systems of beliefs and values. In the context of Verifiable Credentials, ‘Open’ (or openness) is a key ideology. This is not just in the sense of Open Source code but in terms of broader principles around transparency, inclusivity, adaptability, collaboration, and the importance of community.

Image CC BY-ND Visual Thinkery for WAO

Going back to the tree analogy at the top of this post, a microcredential issued using the Verifiable Credentials standard is deeply rooted in community consensus and open standards. This contrasts with other credentialing methods which may use proprietary technologies, lack adherence to standards, and ignore the broader ethos of openness.

So, to embrace the full potential of technologies built on Verifiable Credentials, we need to dig beneath the surface. We must understand and appreciate that complex supporting structure of community, standards, and ideology. This understanding helps guide us towards contributing to a more just, inclusive, secure, and adaptable digital ecosystem.

Conclusion

In examining the foundations of Verifiable Credentials, we uncover a complex blend of ideology, standards, and community collaboration. These core elements go beyond mere technical specifications, and help define the technology’s purpose and potential.

As we explore the underlying principles of technologies such as microcredentials, it’s evident that our interaction with these technologies should extend deeper than their surface-level function. Embracing values of openness, inclusivity, and collective effort, we can contribute to a digital landscape that safeguards individual rights while promoting innovation and trust.

We all need to dig deeper, explore the ideological foundations, understand the importance of standards, and actively participate in the communities that build our world. Reach out to us and let’s work together to help steer towards a future where digital credentials support and empower everyone.

Doug Belshaw and Laura Hilliger collaborated closely, as they tend to do, on this post.

Examining the Roots was originally published in We Are Open Co-op on Medium, where people are continuing the conversation by highlighting and responding to this story.


Content Authenticity Initiative

Durable Content Credentials

Ensuring reliable provenance information is available no matter where a piece of content goes by combining C2PA metadata, watermarking, and fingerprinting technology.

by Andy Parsons, Sr. Director, Content Authenticity Initiative

Faced with a newly chaotic media landscape made up of generative AI and other heavily manipulated content alongside authentic photographs, video, and audio, it is becoming increasingly difficult to know what to trust.

Understanding the origins of digital media and if/how it was manipulated - as well as sharing that information with the consumer - is now possible through Content Credentials, the global open technical specification developed by the C2PA, a consortium of over 100 companies working together within the Linux Foundation.

Implementation of Content Credentials is on the rise, with in-product support released and soon-to-be released by Adobe, OpenAI, Meta, Google, Sony, Leica, Microsoft, Truepic, and many other companies.

As this technology becomes increasingly commonplace, we’re seeing criticism circulating that relying solely on Content Credentials’ secure metadata, or solely on invisible watermarking to label generative AI content, may not be sufficient to prevent the spread of misinformation. 

To be clear, we agree. 

That is why, since its founding in 2021, the C2PA has been hard at work creating a robust and secure open standard in Content Credentials. While the standard focuses on a new kind of “signed” metadata, it also specifies measures to make the metadata durable, or able to persist in the face of screenshots and rebroadcast attacks. 

Content Credentials are sometimes confusingly described as a type of watermark, but watermarking has a specific meaning in this context and is only one piece in the three-pronged approach represented by Content Credentials. Let’s clarify all of this. 

The promise of Content Credentials is that they can combine secure metadata, undetectable watermarks, and content fingerprinting to offer the most comprehensive solution available for expressing content provenance for audio, video, and images.

Secure metadata: This is verifiable information about how content was made that is baked into the content itself, in a way that cannot be altered without leaving evidence of alteration. A Content Credential can tell us about the provenance of any media or composite. It can tell us whether a video, image, or sound file was created with AI or captured in the real world with a device like a camera or audio recorder. Because Content Credentials are designed to be chained together, they can indicate how content may have been altered, what content was combined to produce the final content, and even what device or software was involved in each stage of production. The various provenance bits can be combined in ways that preserve privacy and enable creators, fact checkers, and information consumers to decide what’s trustworthy, what’s not, and what may be satirical or purely creative.   

Watermarking: This term is often used in a generic way to refer to data that is permanently attached to content and hard or impossible to remove. For our purposes here, I specifically refer to watermarking as a kind of hidden information that is not detectable by humans. It embeds a small amount of information in content that can be decoded using a watermark detector. State-of-the-art watermarks can be impervious to alterations such as the cropping or rotating of images or the addition of noise to video and audio. Importantly, the strength of a watermark is that it can survive rebroadcasting efforts like screenshotting, pictures of pictures, or re-recording of media, which effectively remove secure metadata.

Fingerprinting: This is a way to create a unique code based on pixels, frames, or audio waveforms that can be computed and matched against other instances of the same content, even if there has been some degree of alteration. Think of the way your favorite music-matching service works, locating a specific song from an audio sample you provide. The fingerprint can be stored separately from the content as part of the Content Credential. When someone encounters the content, the fingerprint can be re-computed on the fly and matched against a database of Content Credentials and its associated stored fingerprints. The advantage of this technique is it does not require the embedding of any information in the media itself. It is immune to information removal because there is no information to remove.
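
To give a feel for how fingerprint matching can be fuzzy yet useful, here is a toy sketch in Python, assuming the Pillow imaging library is installed: a tiny average-hash fingerprint compared by Hamming distance. Production systems use far more robust, modality-specific fingerprints, but the matching principle is similar.

```python
# Illustrative sketch only: a toy perceptual "fingerprint" (average hash)
# and fuzzy matching via Hamming distance. Real Content Credentials
# deployments use far more robust, modality-specific fingerprints.
from PIL import Image

def average_hash(path: str, size: int = 8) -> int:
    """Downscale to size x size grayscale; each bit = pixel above the mean."""
    img = Image.open(path).convert("L").resize((size, size))
    pixels = list(img.getdata())
    mean = sum(pixels) / len(pixels)
    bits = 0
    for p in pixels:
        bits = (bits << 1) | (1 if p > mean else 0)
    return bits

def hamming(a: int, b: int) -> int:
    """Number of differing bits between two fingerprints."""
    return bin(a ^ b).count("1")

# Fuzzy match: a small Hamming distance suggests the same content, even
# after mild edits. The threshold is a tunable, imperfect heuristic.
# stored = average_hash("original.jpg")
# candidate = average_hash("screenshot.png")
# if hamming(stored, candidate) <= 5: print("probable match")
```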

So, we have three techniques that can be used to inform consumers about how media came to be. If each of these techniques were robust enough to ensure the availability of rich provenance no matter where the content goes, we would have a versatile set of measures, each of which could be applied where optimal and as appropriate.  

However, none of these techniques is durable enough in isolation to be effective on its own. Consider: 

Even if Content Credentials metadata cannot be tampered with without detection, metadata of any kind can be removed deliberately or accidentally. 

Watermarking is limited by the amount of data that can be encoded without visibly or audibly degrading the content, and even then, watermarks can be removed or spoofed. 

Fingerprint retrieval is fuzzy. Matches cannot be made with perfect certainty, meaning that while useful as a perceptual check, they are not exact enough to ensure that a fingerprint matches stored provenance with full confidence. 

But combined into a single approach, the three form a unified solution that is robust and secure enough to ensure that reliable provenance information is available no matter where a piece of content goes. This single, harmonized approach is the essence of durable Content Credentials.  

Here is a deeper dive into how C2PA metadata, watermarks, and fingerprints are bound to the content to achieve permanent, immutable provenance. The thoughtful combination of these techniques leverages the strengths of each to mitigate the shortcomings of the others.  

A simple comparison of the components of durable Content Credentials, and their strength in combination.

Let’s look at how this works. First, the content is watermarked using a mode-specific technique purpose-built for audio, video, or images. Since a watermark can only contain an extremely limited amount of data, it is important to make the most of the bandwidth it affords. We therefore encode a short identifier and an indicator of where the C2PA manifest, or the signed metadata, can be retrieved. This could be a Content Credentials cloud host or a distributed ledger/blockchain. 

Next, we compute a fingerprint of the media, essentially another short numerical descriptor. The descriptor represents a perceptual key that can be used later to match the content to its Content Credentials, albeit in an inexact way as described earlier. 

Then, the identifier in the watermark and the fingerprint are added to the Content Credential, which already includes data pertaining to the origin of the content and the ingredients and tools that were used to make it. Now we digitally sign the entire package, so that it is uniquely connected to this content and tamper-evident. And finally, the Content Credential is injected into the content and stored remotely. And just like that, in a few milliseconds, we have created a durable Content Credential. 
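Here is a hedged end-to-end sketch of that creation flow in Python. The watermark and fingerprint helpers are trivial stand-ins for the mode-specific libraries described above, the host URL is hypothetical, and a bare Ed25519 key stands in for the certificate-based signing used in practice.

    import hashlib
    import json
    from cryptography.hazmat.primitives.asymmetric.ed25519 import Ed25519PrivateKey

    def embed_watermark(media: bytes, payload: dict) -> bytes:
        # Stand-in for a real watermarker, which hides the payload imperceptibly;
        # appending it here just keeps the sketch runnable end to end.
        return media + b"|WM|" + json.dumps(payload).encode()

    def compute_fingerprint(media: bytes) -> str:
        # Stand-in for a real perceptual fingerprint; a plain hash shows the data flow.
        return hashlib.sha256(media).hexdigest()[:16]

    def make_durable_credential(media: bytes, provenance: dict, key: Ed25519PrivateKey):
        watermark = {"id": "wm-0001",                               # fits the tiny payload budget
                     "manifest_at": "https://credentials.example"}  # hypothetical host or ledger
        marked = embed_watermark(media, watermark)
        manifest = dict(provenance,                                 # origin, ingredients, tools
                        watermark_id=watermark["id"],
                        fingerprint=compute_fingerprint(marked))
        payload = json.dumps(manifest, sort_keys=True).encode()
        return marked, {"manifest": manifest, "signature": key.sign(payload).hex()}

    key = Ed25519PrivateKey.generate()
    media, credential = make_durable_credential(
        b"raw image bytes", {"origin": "CameraApp", "ingredients": []}, key)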

When a consumer of this media wishes to check the provenance, the process is reversed. If the provenance and content are intact, we need only verify the signed manifest and display the data. However, if the metadata has been removed, we make use of durability as follows: 

Decode the watermark, retrieving the identifier it stores. 

Use the identifier to look up the stored Content Credential on the appropriate Content Credentials cloud or distributed ledger. 

Check that the manifest and the content match by using the fingerprint to verify that there is a perceptual match, and that the watermark has not been spoofed or incorrectly decoded. 

Verify the cryptographic integrity of the manifest and its provenance data. 

Again, within a few milliseconds we can fetch and verify information about how this content was made, even if the metadata was removed maliciously or accidentally. 
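Continuing the sketch above, the recovery path mirrors those four steps. The in-memory dict stands in for the Content Credentials cloud or ledger, and the exact-match fingerprint check is a simplification of the fuzzy comparison described earlier.

    import json
    from cryptography.exceptions import InvalidSignature

    def decode_watermark(media: bytes) -> dict:
        # Counterpart to embed_watermark above; a real detector survives screenshots.
        return json.loads(media.split(b"|WM|", 1)[1])

    def recover_provenance(media, registry, public_key):
        wm = decode_watermark(media)                      # 1. read the hidden identifier
        record = registry.get(wm["id"])                   # 2. look up the stored credential
        if record is None:
            return None
        manifest = record["manifest"]
        if compute_fingerprint(media) != manifest["fingerprint"]:
            return None                                   # 3. no perceptual match (exact here,
                                                          #    fuzzy in a real system)
        payload = json.dumps(manifest, sort_keys=True).encode()
        try:                                              # 4. verify cryptographic integrity
            public_key.verify(bytes.fromhex(record["signature"]), payload)
        except InvalidSignature:
            return None
        return manifest

    registry = {credential["manifest"]["watermark_id"]: credential}
    assert recover_provenance(media, registry, key.public_key()) is not None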

This approach to durability is not appropriate for every use case. For example, if a photojournalist wishes to focus primarily on privacy, they may not wish to store anything related to their photos and videos on any server or blockchain. Instead, they would ensure that the chain of custody between the camera and the publisher is carefully maintained so that provenance is kept connected and intact, but not stored remotely. 

However, in many cases, durable Content Credentials provide an essential balance between performance and permanence. And although technology providers are just beginning to implement the durability approach now, this idea is nothing new. The C2PA specification has always included an affordance for “soft bindings.”

We recognize that although Content Credentials are an important part of the ultimate solution to help address the problem of deepfakes, they are not a silver bullet. For the Content Credentials solution to work, we need it everywhere — across devices and platforms — and we need to invest in education so people can be on the lookout for Content Credentials, feeling empowered to interpret the trust signals of provenance while maintaining a healthy skepticism toward what they see and hear online.  

Malicious parties will always find novel ways to exploit technology like generative AI for deceptive purposes. Content Credentials can be a crucial tool for good actors to prove the authenticity of their content, providing consumers with a verifiable means to differentiate fact from fiction.  

As the adoption of Content Credentials increases and availability grows quickly across news, social media, and creative outlets, durable Content Credentials will become as expected as secure connections in web browsers. Content without provenance will become the exception, provenance with privacy preservation will be a norm, and durability will ensure that everyone has the fundamental right to understand what content is and how it was made. 

Subscribe to the CAI newsletter to receive ecosystem news.

Stay connected and consider joining the movement to restore trust and transparency online.

Monday, 08. April 2024

Oasis Open Projects

Invitation to comment on four OData v4.02 specification drafts

The Open Data Protocol (OData) enables the creation of REST-based data services, which allow resources to be published and edited by Web clients using simple HTTP messages. The post Invitation to comment on four OData v4.02 specification drafts appeared first on OASIS Open.

First public review for Version 4.02 specifications - ends May 8th

OASIS and the OASIS Open Data Protocol (OData) TC [1] are pleased to announce that OData Version 4.02, OData Common Schema Definition Language (CSDL) XML Representation Version 4.02, OData Common Schema Definition Language (CSDL) JSON Representation Version 4.02, and OData JSON Format Version 4.02 are now available for public review and comment.

The Open Data Protocol (OData) enables the creation of REST-based data services, which allow resources, identified using Uniform Resource Locators (URLs) and defined in an Entity Data Model (EDM), to be published and edited by Web clients using simple HTTP messages. The public review drafts released today are:

– OData Version 4.02: This document defines the core semantics and facilities of the protocol.

– OData Common Schema Definition Language (CSDL) XML Representation Version 4.02: OData services are described by an Entity Data Model (EDM). The Common Schema Definition Language (CSDL) defines specific representations of the entity data model exposed by an OData service using XML, JSON, and other formats. This document specifically defines the XML representation of CSDL.

– OData Common Schema Definition Language (CSDL) JSON Representation Version 4.02: This document specifically defines the JSON representation of CSDL.

– OData JSON Format Version 4.02: This document extends the core specification by defining representations for OData requests and responses using a JSON format.
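To give a concrete flavor of the protocol these drafts define, here is a hedged sketch of a client reading and editing resources using standard OData conventions; the service root, entity set, and property names are invented for illustration.

    import requests

    # Hypothetical OData service root; "Products", "Name", and "Price" are invented.
    BASE = "https://host.example/odata/v4/"

    # Read: GET an entity set, narrowed with standard OData system query options.
    resp = requests.get(
        BASE + "Products",
        params={"$filter": "Price lt 20", "$select": "Name,Price", "$top": "5"},
        headers={"Accept": "application/json"},
    )
    resp.raise_for_status()
    for product in resp.json()["value"]:   # OData JSON wraps results in a "value" array
        print(product["Name"], product["Price"])

    # Edit: PATCH updates selected properties of a single entity, addressed by key.
    requests.patch(BASE + "Products(42)", json={"Price": 17.5})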

The documents and related files are available here:

OData Version 4.02
Committee Specification Draft 01
28 February 2024

— OData Version 4.02. Part 1: Protocol
Editable source (Authoritative):
https://docs.oasis-open.org/odata/odata/v4.02/csd01/part1-protocol/odata-v4.02-csd01-part1-protocol.md
HTML:
https://docs.oasis-open.org/odata/odata/v4.02/csd01/part1-protocol/odata-v4.02-csd01-part1-protocol.html
PDF:
https://docs.oasis-open.org/odata/odata/v4.02/csd01/part1-protocol/odata-v4.02-csd01-part1-protocol.pdf
— OData Version 4.02. Part 2: URL Conventions
Editable source (Authoritative):
https://docs.oasis-open.org/odata/odata/v4.02/csd01/part2-url-conventions/odata-v4.02-csd01-part2-url-conventions.md
HTML:
https://docs.oasis-open.org/odata/odata/v4.02/csd01/part2-url-conventions/odata-v4.02-csd01-part2-url-conventions.html
PDF:
https://docs.oasis-open.org/odata/odata/v4.02/csd01/part2-url-conventions/odata-v4.02-csd01-part2-url-conventions.pdf
— OData Version 4.02. ABNF components:
https://docs.oasis-open.org/odata/odata/v4.02/csd01/abnf/

OData Common Schema Definition Language (CSDL) XML Representation Version 4.02
Committee Specification Draft 01
28 February 2024

Editable source (Authoritative):
https://docs.oasis-open.org/odata/odata-csdl-xml/v4.02/csd01/odata-csdl-xml-v4.02-csd01.md
HTML:
https://docs.oasis-open.org/odata/odata-csdl-xml/v4.02/csd01/odata-csdl-xml-v4.02-csd01.html
PDF:
https://docs.oasis-open.org/odata/odata-csdl-xml/v4.02/csd01/odata-csdl-xml-v4.02-csd01.pdf
XML schemas:
https://docs.oasis-open.org/odata/odata-csdl-xml/v4.02/csd01/schemas/

OData Common Schema Definition Language (CSDL) JSON Representation Version 4.02
Committee Specification Draft 01
28 February 2024

Editable source (Authoritative):
https://docs.oasis-open.org/odata/odata-csdl-json/v4.02/csd01/odata-csdl-json-v4.02-csd01.md
HTML:
https://docs.oasis-open.org/odata/odata-csdl-json/v4.02/csd01/odata-csdl-json-v4.02-csd01.html
PDF:
https://docs.oasis-open.org/odata/odata-csdl-json/v4.02/csd01/odata-csdl-json-v4.02-csd01.pdf
JSON schemas:
https://docs.oasis-open.org/odata/odata-csdl-json/v4.02/csd01/schemas/

OData JSON Format Version 4.02
Committee Specification Draft 01
28 February 2024

Editable source (Authoritative):
https://docs.oasis-open.org/odata/odata-json-format/v4.02/csd01/odata-json-format-v4.02-csd01.md
HTML:
https://docs.oasis-open.org/odata/odata-json-format/v4.02/csd01/odata-json-format-v4.02-csd01.html
PDF:
https://docs.oasis-open.org/odata/odata-json-format/v4.02/csd01/odata-json-format-v4.02-csd01.pdf

For your convenience, OASIS provides complete packages of the prose specifications and related files in ZIP distribution files. You can download the ZIP files at:

OData Version 4.02:
https://docs.oasis-open.org/odata/odata/v4.02/csd01/odata-v4.02-csd01.zip

OData Common Schema Definition Language (CSDL) XML Representation Version 4.02:
https://docs.oasis-open.org/odata/odata-csdl-xml/v4.02/csd01/odata-csdl-xml-v4.02-csd01.zip

OData Common Schema Definition Language (CSDL) JSON Representation Version 4.02:
https://docs.oasis-open.org/odata/odata-csdl-json/v4.02/csd01/odata-csdl-json-v4.02-csd01.zip

OData JSON Format Version 4.02:
https://docs.oasis-open.org/odata/odata-json-format/v4.02/csd01/odata-json-format-v4.02-csd01.zip

How to Provide Feedback

OASIS and the OData TC value your feedback. We solicit feedback from potential users, developers and others, whether OASIS members or not, for the sake of improving the interoperability and quality of our technical work.

This public review starts 09 April 2024 at 00:00 UTC and ends 08 May 2024 at 11:59 UTC.

Comments may be submitted to the TC by any person directly at:
Technical-Committee-Comments@oasis-open.org

Comments submitted for this work and for other work of this TC are publicly archived and can be viewed at:
https://groups.google.com/a/oasis-open.org/g/technical-committee-comments/.
Previous comments on OData works are archived at https://lists.oasis-open.org/archives/odata-comment/.

All comments submitted to OASIS are subject to the OASIS Feedback License [2], which ensures that the feedback you provide carries at least the same obligations as those of the TC members. In connection with the public review of these works, we call your attention to the OASIS IPR Policy [3] applicable especially [4] to the work of this technical committee. All members of the TC should be familiar with this document, which may create obligations regarding the disclosure and availability of a member’s patent, copyright, trademark and license rights that read on an approved OASIS specification.

OASIS invites any persons who know of any such claims to disclose these if they may be essential to the implementation of the above specifications, so that notice of them may be posted to the notice page for this TC’s work.

Additional information about these specifications and the OData TC may be found on the TC’s public home page.

========== Additional references:

[1] OASIS Open Data Protocol (OData) TC
https://groups.oasis-open.org/communities/tc-community-home2?CommunityKey=e7cac2a9-2d18-4640-b94d-018dc7d3f0e2
https://www.oasis-open.org/committees/odata/

Approval (four specifications): https://github.com/oasis-tcs/odata-specs/blob/256d65b9f5f6fa5c3f6c3caa341947e6c711fb8c/zip/Minutes%20of%202024-02-28%20Meeting%20%23463.md

[2] OASIS Feedback License:
https://www.oasis-open.org/who/ipr/feedback_license.pdf

[3] https://www.oasis-open.org/policies-guidelines/ipr/

[4] https://www.oasis-open.org/committees/odata/ipr.php
https://www.oasis-open.org/policies-guidelines/ipr/#RF-on-RAND-Mode
RF on RAND Mode

The post Invitation to comment on four OData v4.02 specification drafts appeared first on OASIS Open.


Identity At The Center - Podcast

In the latest episode of The Identity at the Center Podcast

In the latest episode of The Identity at the Center Podcast, we sit down with special guest Martin Kuppinger, Founder and Principal Analyst at KuppingerCole Analysts. We discussed topics ranging from who should oversee IAM to the end-of-life situation of SAP Identity Management. Also, we dove into the details about the upcoming European Identity and Cloud Conference in Berlin. It's an insightful conversation you won't want to miss. Listen to the full episode at idacpodcast.com or in your favorite podcast app.

#iam #podcast #idac

Friday, 05. April 2024

FIDO Alliance

Tech Telegraph: 5 easy tasks to supercharge your cybersecurity

This post summarises five ways for consumers to improve their cybersecurity. FIDO USB keys and passkeys are included. Of passkeys, the article said: “Services including Google are switching to passwordless ‘passkey’ authentication that supercharges security without needing 2FA, but that technology is still in its early adoption days.”


Source Security: Introducing the latest innovation from Sentry Enterprises: The batteryless multi-function biometric credential

Sentry Enterprises announced its latest innovation – a multi-factor physical and logical access solution that aims to offer more affordable and secure biometric security. It is FIDO2 compliant.



Android Headlines: Keeper password manager app to get Passkeys support following browser extensions

Keeper Security is introducing passkeys support to its smartphone applications, addressing the ongoing struggle websites face with user authentication methods. Passkeys, created by FIDO, allows users to ditch traditional usernames and passwords and securely log in to a website, app, or other digital services.


Security Today: Mobile IDs, MFA and Sustainability Emerge as Top Trends in New HID Report

The end of passwords is near as the FIDO Alliance is paving the way toward new and more secure authentication options that will be part of a more robust Zero Trust architecture.


GS1

Reducing the global impact of environmentally harmful anaesthetic gases using a medical device

95% of anaesthetic gases used in an operation are not metabolised by the patient, so a significant proportion is released into the atmosphere.

It is estimated that anaesthetic gases account for around 100,000 tonnes of carbon dioxide per year in NHS and private hospitals across the UK (England, Scotland, Wales and Northern Ireland). These highly volatile gases make up 2% of the National Health Service’s (NHS) total carbon footprint and 15-20% of a theatre’s carbon footprint for each operation in England alone.

SageTech Medical’s circular economy solution safely captures volatile anaesthetic agents in a reusable capture canister (SID-Can); the captured agent is then recovered, processed and recycled back into a usable drug form to minimise the environmental impact.

GS1 Healthcare Case Studies 2023-2024 (gs1seg230313_01_cases_studies_2024_final_.pdf)

Thursday, 04. April 2024

Hyperledger Foundation

Hyperledger Aries: An Epicenter for Decentralized Digital Identity Collaboration and Innovation

How was 2023 for Hyperledger Aries, and what’s in store for 2024? Given that Aries is uniquely positioned for anyone adopting a decentralized approach to data and identity verification, let’s review what’s been a fascinating year for the project.



Elastos Foundation

Meme Contest: Win Your Share of 250 ELA in the Ultimate Crypto Meme Challenge!

In an age where dogs with hats become multi-billion dollar projects and creativity and fun intersect with the unpredictable and wild world of cryptocurrency, a contest emerges not just as a competition but as a canvas for the expression of wit, satire, and ingenuity. We are talking about the Elastos meme contest! Starting today, this contest like no other offers a total bounty of 250 ELA.

The goal is simple: make Elastos-related memes that make the community laugh. Ten winners will each claim a share of 25 ELA (~$100), navigating the realms of $ELA, Elastos, BeL2, and the SmartWeb. Here are the core details!

Contest Details:
Total Prize: 250 ELA
Number of Winners: 10
Prize per Winner: 25 ELA
Selection Method: Winners chosen by the Elastos Social Media Team
Contest Duration: April 4, 2024 to April 30, 2024 at 11:59 PM UTC
Participation Requirements: Create a meme that includes at least 2 of the following: $ELA, Elastos, Bitcoin Layer2, SmartWeb. Share your meme on X (formerly Twitter) with the hashtags #ElastosMemes and #Elastos

Best of luck, fellow memers! We look forward to sharing all the creativity and announcing the winners on May 7th!

Here is a meme to quick-start the contest:


We Are Open co-op

Building a Minimum Viable Community of Practice (MVCoP)

Using a design workshop to jump start your CoP

At the end of the month, we’ll be gathering a number of community-based organisations, interns and faculty in an online design workshop to jump start a Community of Practice together with Cal State University and Participate. This community, supported through the Internet for All programme, will provide an online space for those promoting access and training in digital technologies in the CSUDH community.

We know that building a thriving and supportive online community requires some deliberate efforts. In this blog post, we explore the idea of a “minimum viable community of practice” (MVCoP). Once again, we dive into our Architecture of Participation (AoP) framework, and discuss running design workshops to empower individuals to co-design their community.

What is an MVCoP? (header image from the budding CoP on Participate; cc-by-nd Visual Thinkery for WAO)

Let’s start with the value of a Community of Practice. Communities of Practice (CoPs) are important for people to network, share knowledge, and learn from each other. They play a vital role in supporting lifelong learning and helping people level up throughout their careers. By connecting with like-minded peers who share similar interests and expertise, people can tap into a vast pool of collective wisdom and experience. Through active participation in discussions, collaborative projects, and sharing of best practices, community members continuously learn from each other, gaining new insights, perspectives, and opportunities.

Communities of practice not only foster professional growth but also create opportunities for personal development, encouraging individuals to stay curious, adapt to change, and embrace a lifelong learning mindset. We develop real relationships in these communities, and we learn about belonging and acceptance.

But before all of that, we have to design spaces that are supportive, interesting, engaging and inclusive. This takes intention. A MVCoP is a community set up to encourage organic and collaborative growth and belonging.

The Bare Minimum AoP (image: AoP from WAO)

We’ve written extensively about one of the tools, the Architecture of Participation (AoP), that we use to help us build thoughtful, inclusive and empowering communities. Briefly, because we really have written about this all over the internet, the AoP is 8 steps to help people cover all their bases when thinking about volunteering, contributing and facilitating communities.

But what are the key steps for building a Minimum Viable Community of Practice? Well, we think that the most important to start with are:

Ways of working openly — Even at the very beginning, people need to see what’s happening within the community and how they can get involved. The open principles of transparency, inclusivity, collaboration and adaptability underpin community. Does the project have secret areas, or is everything out in the open?

Backchannels and watercoolers — We need places to share memes, make jokes and chat about the weather. These evolve organically, but including them in your MVCoP design suggests understanding of the social dynamics within groups of people. Are there ‘social’ spaces for members of the project to interact over and above those focused on project aims?

Celebration of milestones — Building a space where people belong means thinking of them as, you know, people. We need motivation and recognition. Does the MVCoP recognise the efforts and input of the community members?

Empowering Co-Design (image cc-by-nd Visual Thinkery for WAO)

Running a design workshop is a great way to empower co-design within a community. By bringing together individuals with diverse perspectives, skills, and experiences, a design workshop creates a collaborative space where participants can actively contribute to shaping the emerging community.

We facilitate such workshops to encourage open dialogue, brainstorming, and hands-on activities. We aim to help people feel a sense of ownership and engagement because, after all, this is a community. This inclusive approach means that we can work together on the community’s mission, invitations and onboarding. Our intentions and goals are co-created and reflect the collective aspirations and needs of the group.

By actively involving community members in the design process, a design workshop not only strengthens the sense of belonging but also ensures that the resulting community is truly representative of its members’ interests, resulting in a more vibrant and sustainable community of practice.

Conclusion

Building a minimum viable community of practice (MVCoP) is a dynamic and iterative process that requires active involvement from community members. Empowering people to co-design their community will help that community evolve. The journey to a thriving community starts with small steps, but with dedication and collective effort, it can lead to a flourishing network of professionals supporting each other’s growth and success.

Need some community help? We’ve written an entire guide! Or you are very welcome to get in touch!

Building a Minimum Viable Community of Practice (MVCoP) was originally published in We Are Open Co-op on Medium, where people are continuing the conversation by highlighting and responding to this story.

Wednesday, 03. April 2024

Trust over IP

ToIP Announces the Implementers Draft of the Trust Registry Protocol Specification V2.0

Creating a simple and consistent way to programmatically get answers from authoritative ecosystem sources. The post ToIP Announces the Implementers Draft of the Trust Registry Protocol Specification V2.0 appeared first on Trust Over IP.

The Trust Registry Task Force (TRTF) at the Trust Over IP (ToIP) Foundation has released a new version of its ToIP Trust Registry Protocol Specification as an Implementers Draft. This draft aims to elicit open public feedback from implementers of any type of online registry service or client software. Instructions for providing feedback are at the end of this post.

Background

The TRTF was established in the summer of 2021 in response to the sudden demand for cross-jurisdiction verification of digital health credentials during the COVID crisis. In the fall of 2021, the TRTF produced a preliminary version of the ToIP Trust Registry Protocol Specification to begin public experimentation with the protocol.

As the adoption of digital wallets and verifiable digital credentials has grown, so has the challenge for relying parties to verify the authorized issuers of those credentials. The same applies to credential holders, who need to judge which relying parties they can safely trust with their credential data.

These digital trust decisions are complicated—in both directions. To make them more accessible, participants need trusted sources of information. That’s the job of trust registries. A trust registry is a system of record that contains answers to questions that help drive trust decisions. 

Many of these systems of record already exist. For example, almost any legal jurisdiction has a method of registering and licensing all types of businesses and professionals (CPAs, lawyers, doctors, professional engineers, etc.). And there are hundreds of registries of accredited institutions—universities, hospitals, insurance companies, nursing homes, etc.

New trust registries are also emerging for new online communities, including social networks, blockchains, and peer-to-peer networks. The challenge is that the methods of accessing the information across all these different registries are wildly inconsistent—if such information is even available online.

The Trust Registry Protocol V2.0

The ToIP Trust Registry Protocol (TRP) V2.0 aims to solve this problem by providing a simple and consistent way to discover who is authorized to do what within a specific digital trust ecosystem. In short, it enables parties to ask programmatically:

Does entity X hold authorization Y under ecosystem governance framework Z?

In addition to that core query type, the TRP V2 also supports queries to:

Assist integrators in retrieving information critical to interacting with the trust registry (e.g. get a list of supported authorizations, namespaces, or resources).
Assert the relationships of the queried trust registry with other trust registries, allowing the development of a registry-of-registries capability.

Currently, in this Implementers Draft stage, this question can be asked via a RESTful (OpenAPI Specification 3.1.0) protocol query. Future versions of the TRP may support other underlying protocol specifications (e.g. DIDComm co-protocols, ToIP Trust Spanning Protocol). 
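As a hedged illustration only (the registry URL, path shape, and response fields below are assumptions, not the specification's actual OpenAPI definition), the core query could look like this from a Python client:

    import requests

    # Hypothetical registry base URL; consult the specification for real paths.
    REGISTRY = "https://registry.example/trp/v2"

    def holds_authorization(entity_id: str, authorization_id: str, egf_id: str) -> bool:
        """The core TRP question: does entity X hold authorization Y under
        ecosystem governance framework Z?"""
        resp = requests.get(
            f"{REGISTRY}/entities/{entity_id}/authorization",
            params={"authorization_id": authorization_id, "egf_id": egf_id},
        )
        resp.raise_for_status()
        return resp.json().get("authorized", False)  # assumed response field

    print(holds_authorization("did:example:clinic42", "licensed-pharmacy", "egf:example:health"))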

It is important to note that in V2, the TRP does not manage information inside the trust registry (i.e., the system-of-record). It is a read-only query protocol. Create, update, or delete operations may be specified in future protocol versions if demand exists.

To be clear, a trust registry does not create trust in itself. Your decision to trust the outputs from a trust registry is entirely yours. However, the information provided by trust registries is often required to build trust—especially between parties with no previous relationship. 

“A trust registry does not create authority. The authority of a trust registry is an outcome of governance.”

 – Jacques Latour, CTO, CIRA.ca (.ca top-level domain registry)

How to Provide Feedback

We invite feedback from implementers: systems integrators, developers, and product leaders who either need to share or access the information necessary to facilitate digital trust decisions within their ecosystem.

To review the specification:

Github Pages version: https://trustoverip.github.io/tswg-trust-registry-protocol/
Markdown version: https://github.com/trustoverip/tswg-trust-registry-protocol

To make a comment, report a bug, or file an issue, please follow the ToIP Public Review Process on GitHub:

Bugs/Issues: https://github.com/trustoverip/tswg-trust-registry-protocol/issues
Discussion: https://github.com/trustoverip/tswg-trust-registry-protocol/discussions

The post ToIP Announces the Implementers Draft of the Trust Registry Protocol Specification V2.0 appeared first on Trust Over IP.


Identity At The Center - Podcast

Join us for the latest Sponsor Spotlight edition of The Identity at the Center Podcast

Join us for the latest Sponsor Spotlight edition of The Identity at the Center Podcast. In this fully sponsored episode, we have an insightful discussion with Gil Hoffer, the Co-Founder and CTO of Salto. We delve into Gil's journey into the world of identity, the inception of Salto, and how they're revolutionizing DevOps for business apps and identity platforms like Okta to solve age-old configuration challenges.

Listen to our discussion on idacpodcast.com or on your preferred podcast app.

#iam #podcast #idac


Next Level Supply Chain Podcast with GS1

International Inventory Optimization with Burak Yolga, Forceget

Global Supply Chains are dramatically shifting due to economic shifts such as rising interest rates and inflation. There is a pressing need for efficiency, from reducing FBA fees to renegotiating costs and finding ingenious savings in your supply chain.

Burak Yolga, Co-Founder and CEO of Forceget, talks with hosts Liz Sertl and Reid Jackson about this and his journey through the intricate world of global supply chains. Drawing on examples from industry leaders, he offers a fresh perspective on the transformative power of digitalization in business processes, focusing on enhancing visibility and standardization to scale. He discusses the complexities of managing international teams across time zones and the critical importance of environmentally conscious shipping practices, including cost-effective innovations like solar-powered vessels.

 

Key takeaways: 

How Forceget has mastered inventory management amidst fluctuating interest rates and complex international logistics.

How the financial and operational advantages of eco-innovations and resilient supply chain practices help professionals stay ahead of industry trends and environmental mandates.

How their team leverages AI for inventory forecasting, efficient resource allocation, and contingency planning.

 

Resources: 

What is Inventory Management?

Resources for Improving Supply Chain Visibility

 

Connect with GS1 US:

Our website - www.gs1us.org

GS1 US on LinkedIn

 

Connect with guest:

Burak Yolga on LinkedIn

Check out Forceget

 

Tuesday, 02. April 2024

Oasis Open Projects

Invitation to comment on Universal Business Language v2.4 before call for consent as OASIS Standard

UBL is the leading interchange format for business documents. The post Invitation to comment on Universal Business Language v2.4 before call for consent as OASIS Standard appeared first on OASIS Open.

Public review ends May 26th

OASIS and the OASIS Universal Business Language TC [1] are pleased to announce that Universal Business Language Version 2.4 is now available for public review and comment.

UBL is the leading interchange format for business documents. It is designed to operate within a standard business framework such as ISO/IEC 15000 (ebXML) to provide a complete, standards-based infrastructure that can extend the benefits of existing EDI systems to businesses of all sizes. The European Commission has declared UBL officially eligible for referencing in tenders from public administrations, and in 2015 UBL was approved as ISO/IEC 19845:2015.

Specifically, UBL provides:
– A suite of structured business objects and their associated semantics expressed as reusable data components and common business documents.
– A library of schemas for reusable data components such as Address, Item, and Payment, the common data elements of everyday business documents.
– A set of schemas for common business documents such as Order, Despatch Advice, and Invoice that are constructed from the UBL library components and can be used in generic procurement and transportation contexts.
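As a hedged sketch of how these pieces fit together, an implementer might validate an instance document against the runtime schemas. The xsdrt/ and xml/ directories come from the file listing later in this announcement; the specific file names inside them follow the layout of earlier UBL releases and should be checked against the 2.4 package.

    from lxml import etree

    # Load the runtime XSD for the Invoice document type (path is an assumption).
    schema = etree.XMLSchema(etree.parse("xsdrt/maindoc/UBL-Invoice-2.4.xsd"))
    # Parse one of the shipped XML examples (file name is an assumption).
    doc = etree.parse("xml/UBL-Invoice-2.4-Example.xml")
    print("valid UBL Invoice" if schema.validate(doc) else schema.error_log)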

UBL v2.4 is a minor revision to v2.3 that preserves backwards compatibility with previous v2.# versions. It adds new document types, bringing the total number of UBL business documents to 93.

The TC received three Statements of Use from Efact, Google, and Semantic [3].

The candidate specification and related files are available here:

Universal Business Language Version 2.4
Committee Specification 01
17 October 2023

Editable source (Authoritative):
docs.oasis-open.org/ubl/cs01-UBL-2.4/UBL-2.4.xml
HTML:
docs.oasis-open.org/ubl/cs01-UBL-2.4/UBL-2.4.html
PDF:
docs.oasis-open.org/ubl/cs01-UBL-2.4/UBL-2.4.pdf
Code lists for constraint validation:
docs.oasis-open.org/ubl/cs01-UBL-2.4/cl/
Context/value Association files for constraint validation:
docs.oasis-open.org/ubl/cs01-UBL-2.4/cva/
Document models of information bundles:
docs.oasis-open.org/ubl/cs01-UBL-2.4/mod/
Default validation test environment:
docs.oasis-open.org/ubl/cs01-UBL-2.4/val/
XML examples:
docs.oasis-open.org/ubl/cs01-UBL-2.4/xml/
Annotated XSD schemas:
docs.oasis-open.org/ubl/cs01-UBL-2.4/xsd/
Runtime XSD schemas:
docs.oasis-open.org/ubl/cs01-UBL-2.4/xsdrt/

For your convenience, OASIS provides a complete package of the prose specification and related files in a ZIP distribution file. You can download the ZIP file at:
docs.oasis-open.org/ubl/cs01-UBL-2.4/UBL-2.4.zip

Members of the UBL TC [1] approved this specification by Special Majority Vote [2]. The specification had been released for public review as required by the TC Process [4].

Public Review Period

The 60-day public review starts 28 March 2024 at 00:00 UTC and ends 26 May 2024 at 23:59 UTC.

This is an open invitation to comment. OASIS solicits feedback from potential users, developers and others, whether OASIS members or not, for the sake of improving the interoperability and quality of its technical work.

Comments may be submitted to the TC by any person directly at:
Technical-Committee-Comments@oasis-open.org

Comments submitted for this work and for other work of this TC are publicly archived and can be viewed at:
https://groups.google.com/a/oasis-open.org/g/technical-committee-comments/
Previous comments on UBL works are archived at lists.oasis-open.org/archives/ubl-comment

All comments submitted to OASIS are subject to the OASIS Feedback License, which ensures that the feedback you provide carries at least the same obligations as those of the TC members. In connection with this public review of Universal Business Language Version 2.4 we call your attention to the OASIS IPR Policy [5] applicable especially [6] to the work of this technical committee. All members of the TC/OP should be familiar with this document, which may create obligations regarding the disclosure and availability of a member’s patent, copyright, trademark and license rights that read on an approved OASIS specification.

OASIS invites any persons who know of any such claims to disclose these if they may be essential to the implementation of the above specification, so that notice of them may be posted to the notice page for this TC’s work.

========== Additional references:
[1] OASIS Universal Business Language TC
https://groups.oasis-open.org/communities/tc-community-home2?CommunityKey=556949c8-dac8-40e6-bb16-018dc7ce54d6
former link: https://www.oasis-open.org/committees/ubl/

[2] Approval ballot:
https://groups.oasis-open.org/higherlogic/ws/groups/556949c8-dac8-40e6-bb16-018dc7ce54d6/ballots/ballot?id=3818

[3] Links to Statements of Use

Efact: https://lists.oasis-open.org/archives/ubl-comment/202402/msg00003.html
Google: https://lists.oasis-open.org/archives/ubl-comment/202402/msg00001.html
Semantic: https://lists.oasis-open.org/archives/ubl/202312/msg00007.html

[4] History of publication, including previous public reviews:
https://docs.oasis-open.org/ubl/csd02-UBL-2.4/UBL-2.4-csd02-public-review-metadata.html

[5] https://www.oasis-open.org/policies-guidelines/ipr/

[6] https://www.oasis-open.org/committees/ubl/ipr.php
https://www.oasis-open.org/policies-guidelines/ipr/#RF-on-Limited-Mode
RF on Limited Terms Mode

The post Invitation to comment on Universal Business Language v2.4 before call for consent as OASIS Standard appeared first on OASIS Open.


GS1

Andrew Tuerk

Andrew Tuerk, Chief Data Officer, Syndigo (Member excellence)

Trust over IP

ToIP Announces New Issuer Governance Requirements Guide for Verifiable Credentials – Public Comment Needed

Comment on our first effort to define standard requirements for issuers of verifiable credentials. The post ToIP Announces New Issuer Governance Requirements Guide for Verifiable Credentials – Public Comment Needed appeared first on Trust Over IP.

ToIP invites the public to comment on its newly released document, the Issuer Requirements Guide for Governance Frameworks of Verifiable Credentials.

The mission of the Trust over IP (ToIP) Foundation is to define a complete architecture for Internet-scale digital trust that combines cryptographic assurance at the machine layer with human accountability at the business, legal, and social layers.  Part of that mission is to define generally accepted requirements for standard roles that play a critical part in accountability for digital trust.  

Our Governance Stack Working Group has completed a new deliverable, the Issuer Requirements Guide for Governance Frameworks of Verifiable Credentials (PDF), and is soliciting public comment using our public review process and this GitHub link. While many schemes in the US, UK and Canada have focused on elements of identity credential issuance and verification, this is the first effort to define standard requirements for issuers of verifiable credentials to ensure that their processes are transparent and consistent and meet the needs of relying parties and ecosystem governing bodies.

Verifiable credential ecosystems require both technical trust and human trust; the core requirements for the corresponding issuance processes are captured in this newly released document, now circulated for public review and comment. Verifiable credentials are a type of digital representation of claims or attributes about a subject, which can be an individual, organization, or thing. These credentials are tamper-evident, cryptographically secure, and can be verified by relying parties without the need for a central authority.
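For readers new to the format, here is a minimal sketch of the W3C Verifiable Credential shape (all identifiers and values are illustrative); the issuer's signature in the proof block is what makes the credential tamper-evident and verifiable without a central authority.

    # Minimal W3C Verifiable Credential shape, expressed as a Python dict:
    credential = {
        "@context": ["https://www.w3.org/2018/credentials/v1"],
        "type": ["VerifiableCredential"],
        "issuer": "did:example:issuer123",          # hypothetical issuer DID
        "issuanceDate": "2024-04-02T00:00:00Z",
        "expirationDate": "2025-04-02T00:00:00Z",   # supports the expiry requirement
        "credentialSubject": {
            "id": "did:example:holder456",          # the subject/holder
            "licensedProfession": "CPA",            # the validated claim
        },
        "proof": {                                  # the issuer's digital signature
            "type": "Ed25519Signature2020",
            "verificationMethod": "did:example:issuer123#key-1",
            "proofValue": "z3FXQ...",               # elided signature bytes
        },
    }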

The Governance requirements of an issuer in a verifiable credential ecosystem can be summarized as follows:

Issuance of Credentials: The issuer is responsible for creating and issuing verifiable credentials to subjects based on certain claims or attributes. These credentials are digitally signed by the issuer using their private key, ensuring the authenticity and integrity of the information.
Trust and Reputation: The issuer’s reputation and trustworthiness are crucial in the verifiable credential ecosystem. Relying parties (such as service providers or verifiers) rely on the credentials being issued by reputable and trusted issuers. The credibility of the issuer is established through various mechanisms, such as being a well-known organization, being part of a recognized authority, or holding themselves accountable to the requirements of a governing authority.
Validation of Claims: Before issuing credentials, the issuer does its due diligence to validate the claims made in the credential. This validation process ensures that the information presented in the credential is accurate and can be trusted by relying parties.
Verification of Issuer and Holder: Issued credentials that contain links to the issuer and/or holder should be engineered so they can be cryptographically verified.
Privacy Considerations: Issuers need to handle personal data responsibly and in compliance with privacy regulations. They should only collect and use the minimum necessary data required to issue the credentials and should obtain explicit consent from the subjects.
Revocation and Expiry: For credentials that require expiration or revocation, issuers must have mechanisms in place to revoke or expire credentials if the claims become invalid or if the credentials are compromised. This is essential to maintain the trustworthiness of the digital trust ecosystem.
Interoperability: Issuers need to follow standardized formats and protocols to ensure that the issued credentials are interoperable and can be easily understood and verified by different relying parties.
Auditability and Accountability: Issuers should keep records of issued credentials for audit purposes and lifecycle maintenance, including updates to claims, re-issuance for any reason, and revocation. This enables traceability and accountability in case of disputes or issues with the credentials.
Transparency: The issuer should publicly disclose all the policies it follows in the process of claim and credential issuance and revocation. This disclosure should be included in a publicly available governance framework.

The ToIP Issuer Requirements Guide is intended to help implementers meet the requirements for issuers of verifiable credentials within an ecosystem governed by a governance framework that conforms to the ToIP Governance Metamodel Specification. We encourage you to read this landmark document and provide feedback using the ToIP Public Review Process by following this GitHub link to submit comments within the public review and comment period ending on May 31, 2024. If you have any questions regarding this document, please contact Scott Perry at scott.perry@schellman.com.

The post ToIP Announces New Issuer Governance Requirements Guide for Verifiable Credentials – Public Comment Needed appeared first on Trust Over IP.


Elastos Foundation

ELA: Bitcoin Merged-Mining, Halvings & Unique Economics

Elastos ($ELA) presents a compelling narrative in the cryptocurrency ecosystem, paralleling Bitcoin’s disinflationary ethos while charting a unique path through its technological integration and economic modelling. Let’s explore some of the key highlights of ELA and its economic model:

Merge Mining with Bitcoin: ELA’s merge mining with Bitcoin allows it to benefit from Bitcoin’s substantial hashing power (580.74 EH/s), with ELA itself achieving an impressive 293.69 EH/s, roughly 50%. This synergy enhances security while maintaining energy efficiency.
BPoS Validator System: The Elastos BPoS Supernodes add a secondary layer of security by verifying and signing each ELA mainchain block provided by Bitcoin miners. BPoS engages two participant groups, stakers and validators: stakers stake ELA to vote for validators, and both earn APR.
Fixed Maximum Supply: ELA caps at 28.22 million coins, with the final coins expected to be minted by December 2105. This fixed supply mirrors Bitcoin’s scarcity principle, foundational to its value.
Disinflationary Nature: ELA follows a 4-year halving cycle similar to Bitcoin, effectively cutting its annual inflation rate in half. This model ensures a gradual decrease in new ELA supply, enhancing scarcity and potential value over time. Like Bitcoin, ELA’s halving reduces the reward for block production, transitioning incentives towards transaction fees over time and ensuring long-term network sustainability. The next halving is in December 2025.
Current Mining Dynamics: With a block generated every two minutes, ELA rewards are distributed among Bitcoin PoW miners (35%), BPoS validators (35%), and the CRC DAO treasury (30%). This distribution model incentivizes diverse participation in the network’s security and governance.
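As a quick sanity check of that reward split (a sketch; the per-block reward figure is purely illustrative, since the article does not state one):

    # Illustrative only: assume a block reward of 1.5 ELA (hypothetical figure).
    block_reward = 1.5
    miners     = 0.35 * block_reward   # Bitcoin PoW merge-miners
    validators = 0.35 * block_reward   # BPoS validators
    treasury   = 0.30 * block_reward   # CRC DAO treasury
    blocks_per_day = 24 * 60 // 2      # one block every two minutes -> 720
    print(round(miners, 3), round(validators, 3), round(treasury, 3), blocks_per_day)
    # prints: 0.525 0.525 0.45 720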

 

The Significance of Merge-Mining with Bitcoin

Merge mining enables Bitcoin miners to mine both Bitcoin and Elastos by running Elastos code alongside, without additional costs or energy. This leverages Bitcoin’s vast hashing power (580.74 EH/s, with ELA at 293.69 EH/s) to secure both networks efficiently. By integrating Elastos’s mining process with Bitcoin’s infrastructure, miners can earn extra rewards in ELA, fostering a mutually beneficial relationship that enhances ELA’s security and economic incentives. This dual mining opportunity not only augments revenue for Bitcoin miners but also promotes a cooperative ecosystem, highlighting merge mining with Bitcoin as a strategically valuable feature for bolstering network security while being environmentally considerate. ELA today has roughly 50% of Bitcoin’s security protecting the network’s value.

 

Earning APR with ELA

What’s more, you can earn ELA as a community member, with Bitcoin’s security backing the network. Here are three of the core ways:

1. Participate in Staking:
APR: Up to 2-3%
Lockup Duration: 10 to 1000 days
Equity Tokens: 1 staked ELA = 1 voting token
Rewards: Based on amount and duration
Profit Sharing: 25% to node owners, 75% to stakers
Special Nodes: 12 CR Council nodes excluded from voting

Re-Voting: Necessary at pledge end to continue earning. Here is a detailed guide on how to stake.

2. Becoming a BPoS Validator:
APR: Up to 22%
Entry Requirements: 2,000 ELA pledge, $6/month maintenance
Rewards: 25% of block rewards
Yield Factors: Staking amount and time
Selection: 36 nodes randomly chosen every 36 blocks
Rewards Distribution: Automated by the Elastos mainchain

Here is a detailed guide on how to become a BPoS Validator. Here is additional support on validator requirements.

3. Cyber Republic Council (CRC) DAO Member:
APR: Up to 35% (rewards and sidechain transactions)
Cyber Republic Consensus: Governance mechanism for community decisions, Elastos sidechain blockchain validation (EVM and Identity), and ecosystem development, utilizing a delegate model for decision-making and proposal voting
CR Council Member Nodes: 12 community-elected delegates, chosen using ELA votes, responsible for decision-making on community affairs, proposal recommendation, and voting
Community Members: Rights include voting in elections, submitting proposals, and monitoring and impeaching council members
Election and Term: Participants need Elastos DIDs and a 5,000 ELA deposit. Election is via ELA voting; the top 12 candidates become council members. The election process starts one month (about 21,900 mainchain blocks) before the current members’ term ends for seamless transitions. The next election term begins in April 2024. Terms last one year, with provisions for impeachment and automatic removal under specific conditions
Rewards and Responsibilities: Council members receive mainchain ELA rewards and sidechain (EVM and DID) transaction revenue

Learn more here. Follow the Cyber Republic Twitter account for the latest updates on elections and guidance.

 

Elastos and ELA combine Bitcoin’s disinflationary approach with their own technological advancements and economic strategies, enhancing network security through merge mining with Bitcoin and offering a BPoS validator system for additional security and APR. With a disinflationary model and a fixed supply limit of 28.22 million ELA, the ecosystem incentivizes participation through mining rewards distribution and provides APR earning opportunities via staking, BPoS validation, and Cyber Republic Council (CRC) DAO governance. These features, alongside its economic policies, position Elastos as a distinctive and engaging platform in the cryptocurrency realm, aligning with Bitcoin’s ethos and security. Learn more here!


DIF Blog

DIF and KuppingerCole announce collaboration

The Decentralized Identity Foundation (DIF) and KuppingerCole are excited to announce a collaboration aimed at bringing new value to members, customers and digital transformation leaders.

DIF is a global membership organization that is building the foundational elements necessary to establish security, privacy, interoperability and trust between the participants in any digital ecosystem.

Founded in 2004, KuppingerCole is a European analyst company focusing on identities and access management, their governance, and risk management to facilitate innovation and secure, privacy-maintaining information management.

Planned activities include a program of joint virtual events and targeted publications. The two organizations are also exploring the potential to leverage their tools, processes and operational resources to create a first-of-a-kind industry collaboration platform.

“Our strategic partnership with KuppingerCole marks a pivotal moment for decentralized identity, accelerating its impact and reach,” said Kim Hamilton Duffy, Executive Director of DIF. “Our members are at the helm of creating the next-generation infrastructure for secure, user-focused ecosystems that will transform our digital interactions. Partnering with KuppingerCole will allow broader audiences to discover these innovations and explore new capabilities and business models they enable.”

“By combining our expertise in identity and access management with DIF's global reach and commitment to building trust in digital ecosystems, we are poised to deliver unparalleled insights and solutions to enterprises navigating the decentralized identity landscape. Together, we aim to empower organizations to embrace decentralized identity technologies confidently and securely, driving innovation and fostering trust in the digital age," added Martin Kuppinger, Co-Founder of  KuppingerCole.

The partnership kicks off with Road to EIC: Leveraging Reusable Identities in Your Organization, a virtual event at 7:00am PST, 10:00am EST, 4:00pm CEST on April 03. A second virtual event, Building Trust in AI, is being planned for May. 

DIF is set to play a prominent role at the European Identity and Cloud Conference (EIC) in Berlin from 4 - 7 June, including a keynote address, use case presentations and panel discussions featuring DIF leadership, members and liaison partners. DIF members are eligible for a 25% reduction on their ticket to attend the event (on top of any other discounts). Simply enter code eic24dif25members during the last step of booking at Get tickets | EIC 2024 (kuppingercole.com).

 

 

Monday, 01. April 2024

OpenID

Implementer’s Draft of OpenID for Verifiable Credential Issuance Approved

The OpenID Foundation membership has approved the following specification as an OpenID Implementer’s Draft:

OpenID for Verifiable Credential Issuance 1.0

This is the first Implementer’s Draft of this specification.

An Implementer’s Draft is a stable version of a specification providing intellectual property protections to implementers of the specification. This specification is a product of the OpenID Connect Working Group.

The Implementer’s Draft is available at:

https://openid.net/specs/openid-4-verifiable-credential-issuance-1_0-ID1.html

The voting results were:

Approve – 79 votes
Object – 2 votes
Abstain – 12 votes

Total votes: 93 (out of 321 members = 29% > 20% quorum requirement)

The post Implementer’s Draft of OpenID for Verifiable Credential Issuance Approved first appeared on OpenID Foundation.


Identity At The Center - Podcast

April arrives with the latest episode of the Identity at the Center Podcast

April arrives with the latest episode of the Identity at the Center Podcast. We were joined by Jeff Reich of the IDSA to bring awareness to Identity Management Day taking place next week. We also talked about what's new with the IDSA and even shared some light-hearted thoughts on the best April Fool's pranks we've seen.

You can catch the episode at idacpodcast.com or on your favorite podcast app.

#iam #podcast #idac


Project VRM

Survey Hell

On a scale of one to ten, how do you rate the  Customer Experience Management (CEM) business? I give it a zero. Have you noticed that every service comes with a bonus survey—one you answer on a phone or fill out on a Web page? And that every one of those surveys is about rating […]

On a scale of one to ten, how do you rate the Customer Experience Management (CEM) business?

I give it a zero.

Have you noticed that every service comes with a bonus survey—one you answer on a phone or fill out on a Web page? And that every one of those surveys is about rating the poor soul you spoke to or chatted with, rather than the company’s own crappy CEM system?

I always say yes to the question “Was your problem resolved?” because I know the human I spoke to will be punished if I say no.  Saying yes to that question complies with Don Marti‘s tweeted advice: “5 stars for everyone always—never betray a human to the machines.”

The main problem with CEM is that it’s all about getting service to scale across populations by faking interest in human contact. You can see it all through McKinsey’s The CEO Guide to Customer Experience. The customer is always on a “journey” through which a company has “touchpoints.”

Oh please.

IU Health, my primary provider of health services, does a good job on the whole, but one downside is the phone survey that follows up seemingly every interaction I have with a doctor or an assistant of some kind. The survey is always from a robot that says it “will only take a few minutes.” I haven’t counted, but I am sure some of those surveys last longer than the interaction I had with the human who provided the service: an annoyingly looooong touchpoint.

I wrote Why Surveys Suck here, way back in 2007. In it, I wrote, “One way we can gauge the success of VRM is by watching the number of surveys decline.”

Makes me cringe a bit, but I think it’s still true.

The image above was created by Bing Creator and depicts “A hellscape of unhappy people, some on phones and others filling out surveys.”

Saturday, 30. March 2024

DIF Blog

DIF Newsletter #38

March 2024

DIF Website | DIF Mailing Lists | Meeting Recording Archive

Table of contents

1. Decentralized Identity Foundation News
2. Working Group Updates
3. Open Groups
4. Announcements at DIF
5. Community Events
6. DIF Members
7. Get involved! Join DIF

🚀 Decentralized Identity Foundation News

Spring conference season

The community is gearing up to put decentralized identity centre stage during the upcoming identity conference season.

Steering Committee and core team members, working and open group co-chairs, and member orgs are representing DIF in a range of activities to raise awareness of decentralized identity and DIF's contributions, drive engagement in DIF and our key initiatives for 2024, demonstrate the viability of DI-based solutions, and help people get started with DI.

Check out the Announcements section below to hear about DIF's confirmed and planned activities at IIW 38, the ID4Africa AGM, EIC and other events.

Veramo User Group

The Veramo User Group has been powering ahead since the group's first meeting on 15 February, with new SD-JWT functionality nearing release. All are welcome to participate in our thriving community of users and contributors - see Open Groups, below, for details.

China SIG

We're excited to formally welcome China SIG to the Decentralized Identity Foundation, marking a key step toward global adoption of decentralized identity to enable secure foundations for next-generation architectures. See Open Groups, below, for details of how to join the China SIG meeting on 17 April.

DIF Coffee Breaks

Our Senior Director of Community Engagement, Limari Navarrete, has kicked off a weekly Coffee Break on Twitter Spaces, with some fantastic guests lined up for April:

April 4th: MG, co-founder of GoPlausible
April 11th: Otto Mora from Polygon ID
April 18th: Evin McMullen, co-founder & CEO of Disco.xyz
April 25th: Key Pillars of Quark ID: Mexico City and Buenos Aires

Past Spaces so far:

Nick Dazé, CEO of Heirloom: https://twitter.com/i/spaces/1ynJOyjEqLXKR?s=20
Damian Glover, Senior Director of Communications @DIF: https://x.com/DecentralizedID/status/1770882034162172386?s=20
Kim Hamilton Duffy, Executive Director @DIF: https://x.com/DecentralizedID/status/1768336239168782566?s=20

Follow us on Twitter / X to set reminders for upcoming spaces. https://twitter.com/DecentralizedID

🛠️ Working Group Updates

💡 Identifiers and Discovery Work Group

Andor Kesselman presented his work on Service Profiles for DID Documents: https://service-profiles.andor.us/

Daniel Buchner presented the did:dht method: https://did-dht.com/

The Linked Verifiable Presentations work item will soon progress to "Working Group Approved", reviews are welcome: https://github.com/decentralized-identity/linked-vp

Identifiers and Discovery meets bi-weekly at 11am PT / 2pm ET / 8pm CET Mondays

🔐 Applied Crypto WG

Open source code implemented for BBS pseudonyms and BBS pseudonyms with hidden PID (based on Blind BBS).

The DIF Crypto - BBS work item meets weekly at 11am PT/2pm ET /8pm CET Mondays

📦 Secure Data Storage

Decentralized Web Node (DWN) Task Force
Nearing 1.0 of the DWN spec and implementation, with a reference app to be debuted at IIW.

DIF/CCG Secure Data Storage WG - DWN Task Force meets bi-weekly at 9am PT/12pm ET/6pm CET Wednesdays

Claims & Credentials Working Group

Credential Trust Establishment (CTE) is gaining traction as we approach IIW, with a plan to advance it to formal V1 status. Check out the latest post on the DIF blog.

If you are interested in participating in any of the Working Groups highlighted above, or any of DIF's other Working Groups, please click here.

📖 Open Groups at DIF

Veramo User Group

The Veramo User Group has been meeting weekly since the middle of February. In addition to discussing Veramo use cases amongst our members and educating each other, we've been working to collaboratively improve Veramo and have new SD-JWT functionality nearing release as well as improvements to EIP-712 credentials underway. If you want to discuss use cases or have any bandwidth to help with improvements, please join us on Thursdays!

Meetings take place weekly on Thursdays, alternating between Noon EST / 18.00 CET and 09.00 EST / 15.00 CET. Click here for more details

📡 DIDComm User Group

The DIDComm User Group has been exploring and prioritising themes / opportunities including Authentication, Enhanced Chat / Group Chat, "Please call me" / Phone call related protocols, Implications of UX (App -> App?), WebRTC coordination, Using DIDComm to protect cloud-storage, DIDComm for IOT, Push Notifications and Webhook, DIDComm as a VC and B2C Protocols.

The DIDComm user group meets weekly at 12pm PT/3pm ET/ 9pm CET Mondays

📻 China SIG

The China SIG has officially launched after DIF’s Steering Committee voted to accept the China SIG Charter following a well-attended SIG kick-off meeting last month. The next meeting will take place on 17 April. All are welcome, here are the details:

Topic: DIF China SIG Monthly Meeting
Time: 2024/04/17 20:00-21:00 (GMT+08:00) Beijing Time
Meeting Link: https://meeting.tencent.com/dm/ScmnTNk3pTL6
Meeting number: 437-967-238
Meeting password: 2404

You can download the recording from the kick-off meeting in February here.

🏦 Korea SIG

The SIG met online last month, following a face to face meeting in Seoul in January to plan activities for the year ahead.

Everyone can join us via the SIG website.

🌏 APAC/ASEAN Discussion Group

We invite everyone in the APAC region to join our monthly calls and contribute to the discussion. You will be able to find the minutes of the latest meeting here.

The DIF APAC call takes place Monthly on the 4th Thursday of the month. Please see the DIF calendar for updated timing.

🌍 DIF Africa Discussion Group

We are in need of someone to chair this process. Let us know if you’re interested in building, organizing and hosting monthly meetings.

Occurs on the first Thursday of the month at 9am UTC

☂️ DIF Interoperability Group

If you are interested in speaking in the User Adoption and Interop series, or simply want to bounce around some interop thoughts and ideas, please reach out to group co-chairs Bonnie Yau, Brent Shambaugh or Elena Dumitrascu on Slack.

The Interoperability Group meets bi-weekly at 8am PT/11am ET/5pm CET Wednesdays

📢 Announcements at DIF

Public events calendar

DIF now has a public events calendar. 🎉

Here you will find not only DIF events but also conferences we'll be attending and participating in over the coming year. We look forward to connecting with you at these various public events! See below for how to subscribe.

Subscribe to the Calendar: Public URL; ICal Format.

Or find our calendar on the DIF Website.

Universal Resolver

DIF hosts and maintains the Universal Resolver, which is valuable public infrastructure for resolving DIDs across different methods. Experiment with it here.
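For example, resolving a DID through the hosted instance's HTTP API looks roughly like the sketch below. This is a minimal example only: it assumes the dev deployment at dev.uniresolver.io (availability not guaranteed), Node 18+ for the built-in fetch, and an illustrative sample did:key.

// Minimal sketch: resolve a DID via the Universal Resolver's documented
// /1.0/identifiers/<did> endpoint (dev instance; availability not guaranteed).
const did = 'did:key:z6MkhaXgBZDvotDkL5257faiztiGiC2QtKLGpbnnEGta2doK';

fetch(`https://dev.uniresolver.io/1.0/identifiers/${encodeURIComponent(did)}`)
  .then((res) => res.json())
  .then((result) => console.log(JSON.stringify(result, null, 2)))
  .catch(console.error);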

🗓️ ️Community Events

ID Management Day

DIF's Executive Director, Kim Hamilton Duffy will lead a session on "Decentralized Identity for the People, and for the Non-People (NPEs that is): Updates, Trends, and Killer Use Cases" at ID Management Day on April 9. Check out the agenda for this virtual conference (hosted by IDSA) and register for free here.

Internet Identity Workshop (IIW) #38

DIF is busy gearing up for the Internet Identity Workshop's upcoming gathering at the Computer History Museum in Mountain View from 16 to 18 April. DIF's Executive Director Kim Hamilton Duffy and DIF's Senior Director of Community Engagement Limari Navarrete are looking forward to meeting with you at the event.

Planned DIF-themed sessions include The future of DIF; DIF Projects: From idea to demo (including the DIF Hackathon); and a discussion about DIF's proposed new Implementations and Applications Working Group.

Other sessions include Extending DID Documents for better service discovery with Service Profiles; Presentation Exchange v2.1 and where does it go from here?; and Credential Trust Establishment.

Look out for some exciting demos, including a decentralized social networking app built using DWNs, and OpenID Connect and DIDComm working alongside each other.

This is shaping up to be an IIW not to be missed! Grab your ticket with DIF's 20% off discount here.

ID4Africa 2024 Annual General Meeting

DIF will be in Cape Town from 21 - 24 May for ID4Africa to share our insights about how VCs can be integrated with national identity programs and systems, and to connect with policy makers and implementers.

Steering Committee member Catherine Nabbala and Senior Director of Communications, Damian Glover will join Anand Acharya, Senior Project Manager of the Bhutan NDI scheme, to deliver a plenary presentation, From Policy To Reality: A Non-Technical Journey To Integrate Verifiable Credentials, at 09:30 local time on 23 May.

Cathie and Damian will be available to meet throughout the event - our base is Stand A07 in the conference hall - we look forward to meeting you there!

European Identity & Cloud Conference 2024

DIF is set to play a prominent role at the European Identity and Cloud Conference (EIC) in Berlin from 4 - 7 June, with DIF staff, Steering Committee members and other member orgs participating in a series of keynotes, presentations and panel discussions.

Our involvement kicks into gear with Road to EIC: Leveraging Reusable Identities in Your Organization, a webinar hosted by the event's organisers, analyst firm KuppingerCole, on 3 April.

DIF members are eligible for a 25% reduction on their ticket to attend EIC (on top of any other discounts). Simply enter code eic24dif25members during the last step of booking: click here to buy your ticket.

Also look out for more details of our partnership with KuppingerCole on the DIF blog next week!

IEEE Intelligent Systems ’24

DIF member Ivan Lambov is chairing a session on "Beyond the hype: exploring the real-world impact of Blockchain" at the IEEE 12th International Conference on Intelligent Systems, which takes place in Varna, Bulgaria from 29 - 31 August.

Ivan would like to extend an invitation to the conference to the entire DIF community. "It will be nice to get together and meet in person with members of the DIF community from the EAME region or elsewhere this summer. This conference presents a great opportunity to get global recognition for the work the DIF is doing and the projects it is involved in. And last, but not least, the conference takes place in a beach resort at the end of August:)," Ivan added.

Check out the agenda and register here.

🗓️ ️DIF Members

Guest blog - Mailchain

Mailchain, founded in 2021, aims to revolutionize decentralized identity and communication with its services, including Vidos and the Mailchain Communication Protocol, simplifying the integration and adoption of decentralized identity technologies. We spoke to co-founder Tim Boeckmann, who shared the company's journey to date.

Guest blog - David Birch

DIF caught up with David Birch, author of Identity Is The New Money, who shared his views on the development of the digital identity space, and some key challenges and opportunities for decentralized identity.

Gataca

Gataca is introducing the Higher Education Program, an initiative aimed at European universities to boost the adoption of ID wallets and verifiable credentials in education.

This program gives 20 universities free access to our decentralized identity platform for one year. They can issue and verify credentials with the universities and third parties in the program, eventually extending to all eIDAS 2.0 compliant organizations.

See the program's landing page for more details.

New Member Orientations

If you are new to DIF, join us for an upcoming new member orientation. Subscribe to DIF’s Eventbrite here for notifications about orientations and events.

🆔 Join DIF!

If you would like to get in touch with us or become a member of the DIF community, please visit our website.

Can't get enough of DIF?
| Follow us on Twitter
| Join us on GitHub
| Subscribe on YouTube
| Read our DIF blog
| Read the archives

Friday, 29. March 2024

FIDO Alliance

Silicon Republic: The long road to passkeys: When will they become mainstream?

Andrew Shikiar, CEO at FIDO Alliance, discusses the various benefits of passkeys and the long-discussed goal of removing our password dependence.


Innovation & Tech Today: Minimize Risk and Fraud With New Technologies

Biometric authentication, advocated by FIDO, revolutionizes fraud prevention by replacing vulnerable passwords. With many users abandoning transactions due to forgotten passwords, biometrics offer secure verification through PINs, fingerprints, or facial scans, mitigating cyber risks.


Cloudflare TV: Why security keys are the safest way to secure the web

Cloudflare CTO John Graham-Cumming joins FIDO Alliance’s Andrew Shikiar in a fireside chat to discuss the significance of hardware keys in combating online attacks like phishing, offering insights for businesses seeking to protect their employees.


Content Authenticity Initiative

March 2024 | This Month in Generative AI: Text-to-Movie

An update on recent breakthroughs in a category of techniques that generate images, audio, and video from a simple text prompt.

by Hany Farid, UC Berkeley Professor, CAI Advisor

News and trends shaping our understanding of generative AI technology and its applications.

Generative AI embodies a class of techniques for creating audio, image, or video content that mimics the human content creation process. Starting in 2018 and continuing through today, techniques to generate highly realistic content have continued their impressive trajectory. In this post, I will discuss some recent breakthroughs in a category of techniques that generate images, audio, and video from a simple text prompt.

Faces

A common computational technique for synthesizing images involves the use of a generative adversarial network (GAN). StyleGAN is, for example, one of the earliest successful systems for generating realistic human faces. When tasked with generating a face, the generator starts by laying down a random array of pixels and feeding this first guess to the discriminator. If the discriminator, equipped with a large database of real faces, can distinguish the generated image from the real faces, the discriminator provides this feedback to the generator. The generator then updates its initial guess and feeds this update to the discriminator in a second round. This process continues with the generator and discriminator competing in an adversarial game until an equilibrium is reached when the generator produces an image that the discriminator cannot distinguish from real faces.

Below are representative examples of GAN-generated faces. In two earlier posts, I discussed how photorealistic these faces are and some techniques for distinguishing real from GAN-generated faces.

Eight GAN-generated faces. (Credit: Hany Farid)

Text-to-image

Although they produce highly realistic results, GANs do not afford much control over the appearance or surroundings of the synthesized face. By comparison, text-to-image (or diffusion-based) synthesis affords more rendering control. Models are trained on billions of images that are  accompanied by descriptive captions, and each training image is progressively corrupted until only visual noise remains. The model then learns to denoise each image by reversing this corruption. This model can then be conditioned to generate an image that is semantically consistent with a text prompt like “Pope Francis in a white Balenciaga coat.” 

From Adobe Firefly to OpenAI's DALL-E, Midjourney to Stable Diffusion, text-to-image generation is capable of generating highly photorealistic images with increasingly fewer obvious visual artifacts (like hands with too many or too few fingers).

You probably saw on the news A.I. generations of Pope Francis wearing a white cozy jacket. I’d love to see your generations inspired by it.

Here’s a prompt by the original creator Guerrero Art (Pablo Xavier):

Catholic Pope Francis wearing Balenciaga puffy jacket in drill rap… pic.twitter.com/5WA2UTYG7b

— Kris Kashtanova (@icreatelife) March 28, 2023

Text-to-audio

In 2019, researchers were able to clone the voice of Joe Rogan from eight hours of voice recordings. Today, from only one minute of audio, anyone can clone any voice. What is most striking about this advance is that unlike the Rogan example, in which a model was trained to generate only Rogan's voice, today's zero-shot, multi-speaker text-to-speech can clone a voice not seen during training. Also striking is the easy access to these voice-cloning technologies through low-cost commercial or free open-source services. Once a voice is cloned, text-to-audio systems can convert any text input into a highly compelling audio clip that is difficult to distinguish from an authentic audio clip. Such fake clips are being used for everything from scams and fraud to election interference.

Text-to-video

A year ago, text-to-video systems tasked with creating short video clips from a text prompt like "Pope Francis walking in Times Square wearing a white Balenciaga coat" or "Will Smith eating spaghetti" yielded videos of which nightmares are made. A typical video consists of 24 to 30 still images per second. Generating many realistic still images, however, is not enough to create a coherent video. These earlier systems struggled to create temporally coherent and physically plausible videos in which the inter-frame motion was convincing.

However, just this month researchers from Google and OpenAI released a sneak peek into their latest efforts. While not perfect, the resulting videos are stunning in their realism and temporal consistency. One of the major breakthroughs in this work is the ability to generalize existing text-conditional image models to train on entire video sequences in which the characteristics of a full space-time video sequence can be learned.

In the same way that text-to-image models extend the range of what is possible as compared to GANs, these text-to-video models extend the ability to create realistic videos beyond existing lip-sync and face-swap models that are designed specifically to manipulate a video of a person talking.

Text-to-audio-to-video

Researchers from the Alibaba Group released an impressive new tool for generating a video of a person talking or singing. Unlike earlier lip-sync models, this technique requires only a single image as input, and the image is then fully animated to be consistent with any audio track. The results are remarkable, including a video of the Mona Lisa reading a Shakespearean sonnet.

When paired with text-to-audio, this technology can generate, from a single image, a video of a person saying (or singing) anything the creator wishes.

Looking ahead

I've come to learn not to make bold predictions about when and what will come next in the space of generative AI. I am, however, comfortable predicting that full-blown text-to-movie (combined audio and video) will soon be here, allowing for the generation of video clips from text such as: "A video of a couple walking down a busy New York City street with background traffic sounds as they sing Frank Sinatra's New York, New York." While there is much to be excited about on the content creation and creativity side, legitimate concerns persist and need to be addressed. 

While there are clear and compelling positive use cases of generative AI, we are already seeing troubling examples in the form of people creating non-consensual sexual imagery, scams and frauds, and disinformation.

Some generative AI systems have been accused of infringing on the rights of creators whose content has been ingested into large training data sets. As we move forward, we need to find an equitable way to compensate creators and to give them the ability to opt in to or out of being part of training future generative AI models.

Relatedly, last summer saw a historic strike in Hollywood by writers and performers. A particularly contentious issue centered around the use (or not) of AI and how workers would be protected. The writers’ settlement requires that AI-generated material cannot be used to undermine a writer’s credit, and its use must be disclosed to writers. Protections for performers include that studios give fair compensation to performers for the use of digital replicas, and for the labor unions and studios to meet twice a year to assess developments and implications of generative AI. This latter agreement is particularly important given the pace of progress in this space.

Subscribe to the CAI newsletter to receive ecosystem news.

Stay connected and consider joining the movement to restore trust and transparency online.

Author bio: Professor Hany Farid is a world-renowned expert in the field of misinformation, disinformation, and digital forensics. He joined the Content Authenticity Initiative (CAI) as an advisor in June 2023. The CAI is an Adobe-led community of media and tech companies, NGOs, academics, and others working to promote adoption of the open industry standard for content authenticity and provenance.

Professor Farid teaches at the University of California, Berkeley, with a joint appointment in electrical engineering and computer sciences at the School of Information. He’s also a member of the Berkeley Artificial Intelligence Lab, Berkeley Institute for Data Science, Center for Innovation in Vision and Optics, Development Engineering Program, and Vision Science Program, and he’s a senior faculty advisor for the Center for Long-Term Cybersecurity. His research focuses on digital forensics, forensic science, misinformation, image analysis, and human perception.

He received his undergraduate degree in computer science and applied mathematics from the University of Rochester in 1989, his M.S. in computer science from SUNY Albany, and his Ph.D. in computer science from the University of Pennsylvania in 1997. Following a two-year post-doctoral fellowship in brain and cognitive sciences at MIT, he joined the faculty at Dartmouth College in 1999 where he remained until 2019.

Professor Farid is the recipient of an Alfred P. Sloan Fellowship and a John Simon Guggenheim Fellowship, and he’s a fellow of the National Academy of Inventors.


Origin Trail

Trusted AI for next generation RWAs with OriginTrail and Chainlink

We are witnessing an important convergence of technologies of Artificial Intelligence (AI), Internet, and Crypto promising to reshape our digital landscape. This convergence enables a Verifiable Internet for AI, unlocking AI solutions without hallucinations and ensuring full respect for data ownership and Intellectual Property rights.

Trillion-dollar industries spanning from tokenization of real world assets (RWAs), supply chains, metaverse, construction, life sciences and healthcare, among others, require AI systems to use verifiable information to deliver the multiplication effect to the benefit of users in a safe manner.

A modular and collaborative approach is necessary to achieve that. OriginTrail and Chainlink are working together to bring the vision of the Verifiable Internet for AI to reality, allowing the transformation of real world asset (RWA) tokenization.

OriginTrail Decentralized Knowledge Graph (DKG) is already powering trusted AI solutions across multiple RWA industries: protecting whisky authenticity with trusted AI and DNA tagging techniques, helping Swiss Federal Railways increase rail travel safety with cross-border rail operator connectivity, increasing EU built environment sustainability with trusted AI, and supporting representatives of over 40% of US imports in safeguarding data on security audits for overseas factories.

Expanding the OriginTrail decentralized AI framework with Chainlink oracle capability further extends the strength of RWA solutions by giving them access to real-time real world data. By synergizing the power of the Decentralized Knowledge Graph and Chainlink Data feeds, the capabilities of AI to retrieve verifiable information on RWAs can be applied across any domain.

Integrating Chainlink Data Feeds with OriginTrail DKG to create a Trusted AI solution

Each knowledge resource on the OriginTrail DKG is created as a Knowledge Asset consisting of knowledge content, cryptographic proofs for immutability, and an NFT for ownership. For our example, we will create a Knowledge Asset for a Chainlink Data Feed. Once created, Knowledge Assets can be used in decentralized Retrieval-Augmented Generation (dRAG) AI applications. For our showcase, we will use an existing DOT/USD data feed in an AI application built on the DKG and dRAG, in three simple steps.

Step 1: Create a Knowledge Asset on the DKG

Since we wish to retrieve DOT/USD data feed in our AI application, we need to start by creating a Knowledge Asset linking to the data feed which we will use to retrieve the live price:

{
  "@context": "http://schema.org/",
  "@type": "ChainlinkDataFeed",
  "@id": "https://data.chain.link/feeds/moonbeam/mainnet/dot-usd",
  "description": "Chainlink DataFeed providing real-time DOT/USD price information on the Moonbeam network.",
  "baseAsset": {
    "@id": "urn:chainlink:base-asset:dot",
    "@type": "ChainlinkBaseAsset",
    "name": "DOT_CR",
    "description": "Polkadot cryptocurrency (DOT)"
  },
  "quoteAsset": {
    "@id": "urn:chainlink:quote-asset:usd",
    "@type": "ChainlinkQuoteAsset",
    "name": "USD_FX",
    "description": "United States Dollar (USD)"
  },
  "productType": "price",
  "productSubType": "reference",
  "productName": "DOT/USD-RefPrice-DF-Moonbeam-001",
  "contractAddress": "0x1466b4bD0C4B6B8e1164991909961e0EE6a66d8c",
  "network": "moonbeam",
  "rpcProvider": "https://rpc.api.moonbeam.network"
}

The main entities represented in the Knowledge Asset are:

Base asset (DOT_CR) — the first asset listed in a trading pair; this is the asset being priced
Quote asset (USD_FX) — the second asset listed in a trading pair; this is the currency the base asset is priced in

Necessary fields for DOT/USD value retrieval:

Contract address (contractAddress)
RPC Provider (rpcProvider)

This Knowledge Asset content can also be visualized in the DKG Explorer.
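To publish this content to the DKG from JavaScript, one option is OriginTrail's dkg.js client. The snippet below is a rough sketch only: the constructor options and the asset.create signature follow the v6-era SDK and should be verified against the current dkg.js documentation, and the node endpoint and blockchain settings are placeholders.

// Rough sketch based on the v6-era dkg.js API; verify against current docs.
const DKG = require('dkg.js');

const dkg = new DKG({
  endpoint: 'http://localhost', // your OriginTrail node (placeholder)
  port: 8900,
  blockchain: { name: 'otp:2043' }, // NeuroWeb; key material omitted here
});

async function publishDataFeedAsset(content) {
  // Creates a Knowledge Asset from the JSON-LD content shown above.
  const asset = await dkg.asset.create({ public: content }, { epochsNum: 2 });
  console.log(asset.UAL); // the Universal Asset Locator of the new asset
}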

Step 2: Use AI to query the DKG

From your application, you can use an LLM to generate DKG queries based on the user prompts. This step can have different degrees of complexity, so for this showcase, we will use the selected LLM to:

Determine if the prompt is relevant for the data feed (in our case, whether the user's question mentions the DOT token)
Use the LLM to structure a SPARQL query for the OriginTrail DKG to retrieve the Data Feed URL

An example of an engineered prompt to determine the relevance of the question for DOT token:

Given that the chatbot primarily responds to inquiries about the Polkadot ecosystem, including its native token, DOT, analyze the provided question to determine if there's a direct or indirect reference to DOT. Provide a response indicating 'true' if the question pertains to the value, function, or any aspect of DOT, within the context of discussions related to Polkadot ecosystem, either explicitly or implicitly, and 'false' if it does not. Question: {question}
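Wiring this prompt into a relevance check might look like the following sketch (assuming the OpenAI Node SDK as the LLM client; any chat-completion API would work the same way, and the model name is a placeholder):

// Sketch: classify whether a user question relates to DOT, using an LLM.
// Assumes the OpenAI Node SDK (v4); substitute your own LLM client as needed.
const OpenAI = require('openai');
const openai = new OpenAI(); // reads OPENAI_API_KEY from the environment

async function isDotRelated(question) {
  const prompt =
    `Given that the chatbot primarily responds to inquiries about the Polkadot ` +
    `ecosystem, including its native token, DOT, analyze the provided question ` +
    `to determine if there's a direct or indirect reference to DOT. Respond ` +
    `'true' or 'false'. Question: ${question}`;

  const completion = await openai.chat.completions.create({
    model: 'gpt-4', // placeholder model name
    messages: [{ role: 'user', content: prompt }],
  });
  return completion.choices[0].message.content.trim().toLowerCase() === 'true';
}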

If the above prompt determines the question as relevant (returns true), we proceed with a SPARQL query for the OriginTrail Decentralized Knowledge Graph. There are various techniques to obtain a SPARQL query with the LLM you’re using. In our case, we seek ChainlinkDataFeed type entities (Knowledge Assets) with DOT as the BaseAsset. The query result in our case will be a single Knowledge Asset containing information about the DOT/USD value. The SPARQL query should look like this:

PREFIX schema: <http://schema.org/>
SELECT ?dataFeed ?contractAddress ?rpcProvider
WHERE {
  ?dataFeed a schema:ChainlinkDataFeed ;
            schema:baseAsset ?baseAsset ;
            schema:contractAddress ?contractAddress ;
            schema:rpcProvider ?rpcProvider .
  ?baseAsset a schema:ChainlinkBaseAsset ;
             schema:name "DOT_CR" .
}

Step 3: Retrieve the data and display it in your application

Retrieve all the necessary information from the Knowledge Assets obtained through the SPARQL query. Essential information includes the contract address and RPC endpoint, as they are required to execute the code fetching price information from Chainlink. In our case, we are fetching the DOT/USD price.

Code execution

The following code uses ethers.js to fetch the requested value from the retrieved Data Feed. Here’s a simple example:

const { ethers } = require('ethers');

// Your code that executes SPARQL queries.

const rpcProvider = sparqlResult.data[0].rpcProvider;
const contractAddress = sparqlResult.data[0].contractAddress;

const provider = new ethers.providers.JsonRpcProvider(rpcProvider);

// Minimal ABI for Chainlink's aggregator interface: latestRoundData only.
const abi = [
  {
    inputs: [],
    name: "latestRoundData",
    outputs: [
      { internalType: "uint80", name: "roundId", type: "uint80" },
      { internalType: "int256", name: "answer", type: "int256" },
      { internalType: "uint256", name: "startedAt", type: "uint256" },
      { internalType: "uint256", name: "updatedAt", type: "uint256" },
      { internalType: "uint80", name: "answeredInRound", type: "uint80" },
    ],
    stateMutability: "view",
    type: "function",
  },
];

async function getDOTUSDPrice() {
  const contract = new ethers.Contract(contractAddress, abi, provider);
  const [, price] = await contract.latestRoundData();

  // Chainlink USD feeds report prices with 8 decimals.
  console.log(`DOT/USD Price: ${ethers.utils.formatUnits(price, 8)}`);
}

getDOTUSDPrice();

Include Chainlink Data Feed into the final response

You can modify how the LLM will perform the decentralized Retrieval Augmented Generation and include the data feed as a part of the response by engineering the prompt based on your requirements. Here’s one example that appends it at the end of the generated response.
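For instance, a minimal hypothetical helper (not from the original post) that appends the live price to the generated answer:

// Hypothetical helper: append the retrieved Chainlink value to the dRAG answer.
function withDataFeed(answer, price) {
  return `${answer}\n\nLive DOT/USD price (Chainlink Data Feed): $${price}`;
}

// e.g. withDataFeed('Polkadot is a multichain network...', '7.4321')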

The next generation of RWA solutions will use the best of what the Internet, Crypto and AI have to offer. Combining the power of the OriginTrail DKG and Chainlink unlocks avenues of value in the RWA industries that can disrupt the way those industries operate today. Your path to disrupting trillion-dollar industries can start with the three steps shown above. Join us in Discord to let us know how OriginTrail and Chainlink can boost your solution with trusted AI.

Trusted AI for next generation RWAs with OriginTrail and Chainlink was originally published in OriginTrail on Medium, where people are continuing the conversation by highlighting and responding to this story.

Thursday, 28. March 2024

Origin Trail

Announcing the ID Theory DeSci IPO (Initial Paranet Offering)

By OriginTrail and ID Theory

AI-ready knowledge foundation for the future of scientific research

The combination of AI (artificial intelligence) and DeSci (decentralised science) is poised to create a huge leap in humanity’s capacity to drive scientific research. While it will not happen overnight, technological maturity allows us to take critically important steps today that will create a better future for everyone tomorrow.

ID Theory has been at the forefront of this intersection for quite some time now, with their CIO on the board of Molecule and the fund as founding members of BeakerDAO.

“Whilst we have enjoyed helping shape the future through exciting conversations, musings, and capital, it is now time for us, alongside the pioneers at OriginTrail, to help make this future a practical reality. Sometimes, you have to roll up your sleeves and personally shape the future you want to see.” — ID Theory

As a major step towards this future, ID Theory will be among the first to leverage the Decentralized Knowledge Graph Paranets to build out the AI-ready knowledge foundation for DeSci. The DeSci paranet will launch as a collaborative community mining relevant DeSci knowledge. Knowledge miners contributing knowledge to the Paranet will be mining NEURO rewards.

The Decentralized Knowledge Graph and the DeSci Paranet

The DeSci Paranet will live on the OriginTrail Decentralized Knowledge Graph (DKG), a permissionless peer-to-peer network which will ensure that all the DeSci knowledge published to the DeSci Paranet is discoverable, verifiable, and attributed to the owners who mine it. This way, our AI services will avoid the challenges of hallucination, have managed bias, and always respect the intellectual property of the knowledge owners.

As part of the DeSci paranet, ID Theory will run designated AI services, allowing users to interact with the mined knowledge. The first AI service will be a DeScAI chatbot allowing you to explore the knowledge in the DeSci Paranet using the decentralized retrieval-augmented generation (dRAG) method; in the future, these services will evolve into more end-to-end research frameworks for autonomous agents to make scientific breakthroughs!

To explore more about the technical design of paranets, DKG and dRAG we recommend diving into the OriginTrail Whitepaper.

Calling all Knowledge Miners to Arms

In a short while, a full proposal for the DeSci Paranet will be put to the NeuroWeb community for approval. The proposal will include:

A showcase of the genesis knowledge assets that will be created
The incentives model for knowledge miners
The AI service demo for DeScAI

As a part of the creation of the DeSci paranet, OriginTrail and ID Theory are calling for future DeSci knowledge miners to get involved. As part of their Blueprint for Breakthroughs, ID Theory identified several Decentralised Autonomous Organisations (DAOs) that are focused on gathering and creating relevant knowledge such as Vita, Valley, Athena, Hair, Cerebrum, and Cryo. The paranet incentives are inclusive, and we invite all interested participants to get involved.

Towards Autonomous Research

The DeSci paranet is aimed at supporting the autonomous research vision, and delivers the following critical elements for its success:

The data verifiability and ownership capabilities ensured by the NeuroWeb blockchain
The symbolic AI capabilities ensured by the OriginTrail DKG
The neural AI capabilities of generative AI like Large Language Models (LLMs)
The incentives for relevant knowledge growth on NeuroWeb

Once we combine the trust and incentives of NeuroWeb, the deterministic foundation of the DKG, and the reasoning potential of LLMs, we can create not only specific AI solutions but also wider research tasks for AI agents, which can take these building blocks and conduct autonomous research on verifiable sources.

About ID Theory

ID Theory is a liquid and venture-focused crypto fund investing in the next trillion users across three main verticals:

Decentralised AI: autonomous agents will rule the world.
Decentralised Finance: trust code, not bankers.
Decentralised Science: every disease is curable.

Decentralisation is the guiding principle for all investments — providing a trustless foundation for humans and AI agents to thrive.

See you at the bleeding edge.

About OriginTrail

OriginTrail is an ecosystem-building decentralized knowledge infrastructure for artificial intelligence (AI). With the mission of tackling misinformation, which is exacerbated by AI adoption, OriginTrail enables verifiable tracking of the origins of information, discoverability, and integrity of knowledge to enable trusted AI. It has various applications in the domains of real-world assets (RWAs), search and recommendation engines, question-answering systems, and generally knowledge-dependent applications (such as AI systems).

OriginTrail is creating a Verifiable Web for decentralized AI by empowering world-class brands and builders. It utilizes its unique Decentralized Knowledge Graph and OriginTrail Parachain to deliver AI-powered search and solutions for enterprises and individuals worldwide.

Web | X | Facebook | Telegram | LinkedIn | GitHub | Discord

Announcing the ID Theory DeSci IPO (Initial Paranet Offering) was originally published in OriginTrail on Medium, where people are continuing the conversation by highlighting and responding to this story.


The Initial Paranet Offerings (IPOs) to supercharge the Verifiable Internet for AI

Access to shared open knowledge constructed in a collaborative way is mission-critical for the future of AI, especially since non-AI-generated content is expected to be surpassed in size by synthetic, AI-generated content in the coming period. The importance of it has also been highlighted by the Turing Award winner in the field of Deep Learning, Yann LeCun:

“The way you train that (AI) system will have to be crowdsourced … if you want it to be a repository of all human knowledge, all humans need to contribute to it.” Yann LeCun

To achieve that, Whitepaper 3.0 introduced AI para-networks, or paranets: autonomously operated collections of Knowledge Assets owned by their communities and residing on the OriginTrail Decentralized Knowledge Graph (DKG).

Initial Paranet Offerings (IPOs) are now introduced as a means of publicly launching a paranet, with a collection of Knowledge Assets and an accompanying incentivization structure proposed and voted upon via the NeuroWeb governance mechanism. Each IPO is structured as an initial proposal and an initial set of Knowledge Assets published, along with an incentivization structure set forth by an IPO operator that proposes how the incentives will be split across three groups:

IPO operator
Knowledge miners
NEURO holders that participated in supporting the creation of an IPO and approved the requested allocation of NEURO utility tokens for an IPO's knowledge mining

The success of an IPO largely depends on the IPO operator's ability to wisely propose the incentive structure, taking into consideration, among others, the following factors:

The IPO operator autonomously selects the AI services used to drive the value of a knowledge base, and must take an economically and commercially viable approach to both the creation and maintenance of a paranet. The operator is expected to propose an operator fee that makes the birth of the paranet economically viable (earning a share of the allocated emissions), while also setting up a fee structure for both knowledge miners and the NEURO holders who take part in voting.

Since mining Knowledge Assets on the DKG costs TRAC utility tokens, knowledge miners are central not only to the success of an IPO proposal, but even more so as the entities that drive incentives: allocated NEURO emissions are only distributed across the three groups as each new Knowledge Asset is mined. When launching an IPO, the paranet operator defines the ratio of NEURO earned per TRAC spent to mine each Knowledge Asset (see the sketch below for a toy calculation). An operator may set this ratio autonomously to target a desired profitability before the proposal is submitted to voting, yet attempts at price gouging might not receive support from NEURO holders.

NEURO holders that support an IPO via governance voting lock up their tokens for the duration of the NEURO emission allocated to the IPO. Though the share of emissions allocated to an IPO is an important factor in NEURO holders' decisions, the duration of the "lock period" can also play an important role. The paranet operator also defines what portion of paranet incentives will be shared with the NEURO holders supporting the proposal.
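As a toy illustration of how these parameters interact, here is a minimal sketch; all numbers and helper names are invented for the example and do not reflect the actual NeuroWeb reward mechanics:

// Toy illustration only: hypothetical reward split for a paranet IPO.
// All numbers and names here are invented; actual mechanics live on NeuroWeb.
function knowledgeMiningEmission(tracSpent, neuroPerTrac) {
  // NEURO emissions are only triggered as Knowledge Assets are mined.
  return tracSpent * neuroPerTrac;
}

const emitted = knowledgeMiningEmission(10, 5); // 10 TRAC at 5 NEURO/TRAC => 50 NEURO

// Hypothetical split proposed by the IPO operator across the three groups:
const split = { operator: 0.1, miners: 0.8, voters: 0.1 };
console.log(emitted * split.miners); // 40 NEURO flows to the knowledge miners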

The ecosystem incentivizing the Verifiable Internet for AI

The interest to launch the first IPOs has already been pre-registered by several institutional entities and builders, with the inaugural batch nearing the announcement stage. If you are interested in launching a paranet and knowledge mining, hop into the community discussion in Discord and share your ideas.

The Initial Paranet Offerings (IPOs) to supercharge the Verifiable Internet for AI was originally published in OriginTrail on Medium, where people are continuing the conversation by highlighting and responding to this story.

Wednesday, 27. March 2024

DIF Blog

Effective governance now with DIF Credential Trust Establishment

In the digital identity space, the Trust Establishment (TE) and Credential Trust Establishment (CTE) specifications play crucial roles in defining how trust is established and managed. CTE, in particular, is gaining traction as we approach the Internet Identity Workshop (IIW), with a plan to advance it to formal V1 status. This article focuses on the CTE, shedding light on its key features that make it a game-changer in building trust within digital credentials.

Core Aspects of CTE

CTE builds upon TE by enabling ecosystems to express their trust in the issuers of decentralized identifiers (DIDs) and credentials. The credential validation steps of checking integrity and revocation status are well known and understood, but there are not yet commonly agreed-upon standards for evaluating the authority of a party to issue a credential’s claims.

Existing approaches have fallen short in one or more of the following areas: 

Ensuring the approach is sufficiently adaptable
The ability to express authorization for a specific role (not just general authorization)
Good performance with minimal resources, even eligible for offline use
Low cost to implement, deploy, and use

This is where CTE comes in: enabling ecosystems to express the credibility of participants, but in a way that meets the above needs. By doing so, it helps avoid “rent-seeking” behavior, in which an ecosystem participant tries to position themselves to collect transaction fees or similar.

Authority in the Ecosystem

CTE is non-prescriptive in its stance on defining who is an authority. It operates on the principle that authority is determined by an ecosystem’s existing trust structure, informing the acceptance and recognition of the credentials. This flexibility allows for wide adoption and adaptation, making it a practical solution for managing trust.

Governance and Flexibility

CTE introduces a practical governance model that is lightweight and adaptable. It serves ecosystems both large and small. It specifies roles such as credential issuance and verification, and allows grouping by schemas, or type of credential. This allows CTE to adapt well to a wide variety of use cases and simplifies the process of determining who is authorized to issue or verify credentials.
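As a purely hypothetical illustration (the field names below are invented for this sketch; see the CTE specification for the normative format), a governance file might group issuance rights by schema and role roughly like this:

{
  "id": "https://example.org/governance/education.json",
  "author": "did:example:ecosystem-authority",
  "version": "1.0",
  "schemas": [
    { "id": "https://example.org/schemas/diploma", "name": "Diploma" }
  ],
  "roles": {
    "diploma_issuer": { "issue": ["https://example.org/schemas/diploma"] }
  },
  "participants": {
    "did:example:university-a": ["diploma_issuer"]
  }
}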

Trust on Demand

CTE includes flexible dials for cases where more fluidity is required. For example, instead of being statically included in the registry, an individual can hold a credential that assigns them a specific role, with the root authority of that credential corresponding to an entry/role in the registry. This method is not only efficient for offline use but also broadens compatibility with different protocols, enhancing the flexibility and utility of the trust establishment process.

Impact

CTE is designed to counter rent-seeking behaviors and establish a solid trust foundation in digital credentials. It enables organizations and individuals to easily verify the legitimacy of credentials, providing a clear pathway for recognizing valuable credentials for professional development, for example. The specification’s governance model is straightforward and requires minimal technical investment, making it accessible and implementable across various industries.

How it can be used

In the wild, CTE files would be used by software representing companies and people. Companies and people will have a collection of governance files they use for different industries and purposes. In general, companies will be interested in software providing an immediate yes or no answer informing whether to accept or reject a credential. For individuals, however, software can use CTE files to advise on whether a credential is recognized by different parties. By indexing different CTE files, software can help individuals decide which ecosystems and credentials are most valuable for them.
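Continuing the hypothetical sketch above, the immediate yes-or-no answer for a verifier could be as simple as a lookup against such a governance file:

// Hypothetical lookup against the illustrative governance file sketched above.
function isAuthorizedIssuer(governance, issuerDid, schemaId) {
  const roles = governance.participants[issuerDid] || [];
  return roles.some((role) =>
    (governance.roles[role]?.issue || []).includes(schemaId)
  );
}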

Future Directions

As CTE heads towards v1, its potential to streamline the verification process and enhance the credibility of digital credentials is becoming increasingly apparent. DIF invites you to learn more about how CTE can revolutionize the digital identity field by providing a scalable, flexible, and trustworthy framework for managing digital credentials.

Learn more at:

Internet Identity Workshop
DIF virtual event (details coming soon)

In summary, CTE is not just about establishing trust; it's about making the process more accessible, adaptable, and reliable for everyone involved in the digital identity ecosystem. Its forward-thinking approach to governance, authority, and risk mitigation positions it as a cornerstone specification in the evolving landscape of digital credentials.


GS1

Maintenance release 2.9

GS1 GDM SMG voted to implement the 2.9 standard into production in February 2024.

Key Milestones:

See GS1 GDM Release Schedule

As content for this release is developed, it will be posted to this webpage, followed by an announcement to the community to ensure visibility.
GDSN Data Pools should contact the GS1 GDSN Data Pool Helpdesk to understand the plan for the update.
Trading Partners should work with their Data Pools (if using GDSN) and/or Member Organisations on understanding the release and any impacts to business processes.

GDM 2.9 contains updated reference material aligned with ADB 2.3 and GDSN 3.1.26.

Updated For Maintenance Release 2.9

GDM Standard 2.9 (February 2024)

Local Layers For Maintenance Release 2.9

China - GSMP RATIFIED (April 2022)

France - GSMP RATIFIED (November 2023)

Germany - GSMP RATIFIED (November 2023)

Poland - GSMP RATIFIED (November 2023)

Romania - GSMP RATIFIED (December 2021)

USA - GSMP RATIFIED (February 2023)

Finland - GSMP RATIFIED (November 2023)

Release Guidance

GDM Market Stages Guideline (June 2023)

GDM Attribute Implementation Guideline (February 2024)

GPC Bricks To GDM (Sub-) Category Mapping (March 2024)

Attribute Definitions for Business (February 2024)

GDM (Sub-) Categories (October 2021)

GDM Regions and Countries (17 December 2021)

GDSN Release 3.1.26 (February 2024)

Tools

GDM Navigator on the Web 

GS1 GDM Attribute Analysis Tool (Nov 2023)

GDM Local Layer Submission Template (May 2023)

Training

E-Learning Course

Any questions?

We can help you get started using GS1 standards.

Contact your local office


EdgeSecure

Edge Partners with FABRIC, Princeton University, and Rutgers, The State University of New Jersey, on High Performance Network Infrastructure

NEWARK, NJ, March 27, 2024 – Edge recently partnered with FABRIC, Rutgers, The State University of New Jersey, and Princeton University to provide high performance network infrastructure connecting university researchers and their local compute clusters and scientific instruments to the larger FABRIC infrastructure.

Notes Dr. Forough Ghahramani, Assistant Vice President for Research, Innovation, and Sponsored Programs, Edge, “The partnership with the FABRIC team and researchers at Princeton University and Rutgers will create opportunities to explore innovative solutions not previously possible for a large variety of high-end science applications and provide a platform on which to educate and train the next generation of researchers on future advanced distributed system designs.”

FABRIC is an international infrastructure that enables cutting-edge experimentation and research at-scale in the areas of networking, cybersecurity, distributed computing, storage, virtual reality, 5G, machine learning, and science applications. Funded by the National Science Foundation’s (NSF’s) Mid-Scale Research Infrastructure program, FABRIC enables computer science and networking researchers to develop and test innovative architectures that could yield a faster, more secure Internet. 

“EdgeNet is uniquely well-positioned to provide infrastructure support to these types of research networking initiatives,” explains Bruce Tyrrell, Associate Vice President, Programs & Services, Edge. Continues Tyrrell, “As a backbone and external services provider to both Rutgers and Princeton University, Edge has the capacity and capability to meet the high bandwidth research needs of our partner institutions. Our extensive optical backbone enables Edge to efficiently and economically deploy 100Gb transport services to all of our members.”    

The FABRIC team is led by researchers from University of North Carolina at Chapel Hill, University of Kentucky, Clemson University, University of Illinois, and the Department of Energy’s ESnet (Energy Sciences Network). The team also includes researchers from many other universities, including Rutgers and Princeton University, to help test the design of the facility and integrate their computing facilities, testbeds, and instruments into FABRIC.

“The partnership with the FABRIC team and researchers at Princeton University and Rutgers will create opportunities to explore innovative solutions not previously possible for a large variety of high-end science applications and provide a platform on which to educate and train the next generation of researchers on future advanced distributed system designs.”

— Dr. Forough Ghahramani
Assistant Vice President for Research, Innovation, and Sponsored Programs, Edge

“FABRIC aims to be an infrastructure to explore impactful new ideas that are impossible or impractical with the current Internet. It provides an experimental sandbox that is connected to the globally distributed testbeds, scientific instruments, computing centers, data, and campuses that researchers rely on everyday,” said Paul Ruth, FABRIC Lead PI. “Edge enables us to support research across many facilities including the COSMOS wireless testbed, Princeton’s experimental P4 testbed, and remotely controlled instruments such as a CyroEM microscope at Rutgers.”

“The integration of FABRIC with COSMOS, both being pivotal national testbeds, opens unparalleled avenues for experimentation that blend wired and wireless networking with edge computing. Supported by Edge’s provision of connectivity between these pivotal national testbeds as well as to other national and international networks in NYC and Philadelphia carrier hotels, it opens unparalleled avenues for experimentation that blend wired and wireless networking with edge computing. This synergy not only enhances our research capabilities but also paves the way for groundbreaking advancements in network infrastructure and distributed systems,” notes Ivan Seskar, Chief Technologist at WINLAB, Rutgers, emphasizing the importance of collaborative efforts in pushing the boundaries of networking and computing research.

“As a backbone and external services provider to both Rutgers and Princeton University, Edge has the capacity and capability to meet the high bandwidth research needs of our partner institutions. Our extensive optical backbone enables Edge to efficiently and economically deploy 100Gb transport services to all of our members.”

— Bruce Tyrell
Associate Vice President, Programs & Services, Edge

Princeton University Provost and Gordon Y.S. Wu Professor in Engineering and Computer Science, Dr. Jennifer Rexford, was an early supporter of bringing FABRIC to Princeton, serving as a founding member of the project’s steering committee. Shares Rexford, “Linking into FABRIC allows Princeton to support science on a global scale, across multiple domains and enables researchers to reinvent the internet by experimenting with novel networking ideas in a realistic setting — at tremendous speed, scope and scale.” Further elaborates Jack Brassil, Ph.D., Senior Director of Advanced CyberInfrastructure, Office of the Vice President for Information Technology, and Senior Research Scholar, Department of Computer Science, Princeton University, “FABRIC enables the Princeton University campus to usher in a new generation of terabit per second networking applications. By connecting our faculty to experimental testbeds, scientific instruments, and research collaborators at other higher education institutions, FABRIC will provide a fast path to scientific discovery.”

To learn more about FABRIC capabilities, visit https://whatisfabric.net/. Contact Forough Ghahramani (research@njedge.net) for additional information.

About Edge

Edge serves as a member-owned, nonprofit provider of high performance optical fiber networking and internetworking, Internet2, and a vast array of best-in-class technology solutions for cybersecurity, educational technologies, cloud computing, and professional managed services. Edge provides these solutions to colleges and universities, K-12 school districts, government entities, hospital networks and nonprofit business entities as part of a membership-based consortium. Edge’s membership spans the northeast, along with a growing list of EdgeMarket participants nationwide. Edge’s common good mission ensures success by empowering members for digital transformation with affordable, reliable and thought-leading purpose-built, advanced connectivity, technologies and services.

The post Edge Partners with FABRIC, Princeton University, and Rutgers, The State University of New Jersey, on High Performance Network Infrastructure appeared first on NJEdge Inc.


We Are Open co-op

Towards a manifesto for Open Recognition

Advocating for a more diverse future for the recognition of talents, skills, and aspirations
CC BY-ND Visual Thinkery for WAO

Back in 2016, the Open Recognition Alliance created the Bologna Open Recognition Declaration (BORD). This has helped the community organise around principles relating to the concept of Open Recognition for all. It emphasises the importance of building out technologies and infrastructure to enable Open Recognition, as well as advocating for policies which foster its development.

Eight years later, the Open Recognition is for Everybody (ORE) community has started work on a manifesto for Open Recognition. This will be part of the Open Recognition Toolkit and extends the BORD to help people envision and advocate for a future where Open Recognition is commonplace.

Unpacking Open Recognition

Let’s begin with defining terms:

Open Recognition is the awareness and appreciation of talents, skills and aspirations in ways that go beyond credentialing. This includes recognising the rights of individuals, communities, and territories to apply their own labels and definitions. Their frameworks may be emergent and/or implicit.
(What is Open Recognition, anyway?)

We want to help people understand that traditional approaches to credentialing, while important for unlocking opportunities, are just one part of a wider recognition landscape.

Image CC BY-ND Visual Thinkery for WAO

For example, you could think of traditional credentialing — with courses, modules, and diplomas — as a greenhouse where growth conditions are carefully controlled. Only certain plants thrive in this environment, and they are pre-selected to do so.

Open Recognition, on the other hand, is more like the garden that surrounds the greenhouse where a diverse array of plants grow naturally, adapt to their environment, and flourish in unique ways. Not only that, but there are many different gardens with different types of soil and varying atmospheric conditions.

Getting started with a manifesto

A manifesto is a call to action. It’s a way of allowing people to sign up to implement specific principles in order to work towards a better future.

To get started on that road, in a recent ORE community call we asked two questions:

What sucks that we want to do the opposite of?
What doesn’t exist that we want to bring into being?

While these are only our first steps towards a manifesto with a subset of the community, we’re keen to share what we’ve discussed so far.

What sucks?

Simplifying complex systems — our digital landscape is cluttered with overly complex technologies and terminology. We aim to streamline these technologies, making open recognition accessible to everyone, not just the tech-savvy.

Clearing confusion and enhancing communication — there’s a tendency to overlook past contributions in the field, creating a cycle where new initiatives ignore the groundwork laid by predecessors. We want to provide clear, accurate information about Open Recognition to varied audiences.

Dismantling exclusivity — some forms of recognition and credentials are guarded as if they’re an exclusive membership available only to a select few. It’s important that we break down these barriers to create a more inclusive environment where everyone’s achievements are acknowledged.

What doesn’t exist?

Streamlined badge creation — we want to make creating badges for Open Recognition as easy as filling out a social media profile. This would encourage wider adoption and creativity in badge design/issuing.

Stories of success — examples and case studies help guide and inspire others. This could be part of the Open Recognition Toolkit, allowing stories to be shared and help provide practical and conceptual guidance to others.

Bridging spheres of learning — different forms of learning, for example formal and informal, tend to be siloed. As we know valuable skills can be acquired outside of traditional educational settings, we want to build a bridge to recognise the worth of both formal training and self-taught expertise.

Next steps

Creating a manifesto for Open Recognition involves creating something that resonates with a broad audience. It needs to be informative and upbeat, and have an ideological stance which advocates for a better future world.

Our next community call will continue the work we started this week, helping us work towards a plausible utopia for Open Recognition. If this is something which resonates with you, and you’d like to get involved, join us!

Related posts

How badges can change the world — Part 1: The Two Loops Model for Open Recognition advocacy
How badges can change the world — Part 2: Why we need to transition
Advocating for learner-centric badge systems: Some thoughts on campaigning for the right things

Towards a manifesto for Open Recognition was originally published in We Are Open Co-op on Medium, where people are continuing the conversation by highlighting and responding to this story.


Blockchain Commons

Foremembrance Day 2024 Presentation


For Foremembrance Day 2024, Christopher Allen gave a Twitter Livestream discussing the tragedy of overidentification in The Netherlands in WWII, how France offered a different path, and how we must continue to be wary about what identity information we collect and distribute today.

For more see the slides of this presentation, the original article “Echoes from History” and a discussion of modern threats in “The Dangers of eIDAS”.

Tuesday, 26. March 2024

Energy Web

Web 3 as-a-service solution for enterprise now live on Energy Web X and Polkadot

Global energy majors at the forefront of Web 3 enterprise adoption on Energy Web X and Polkadot

Energy Web, a global ecosystem of energy companies focused on developing and deploying Web 3 technologies to accelerate decarbonization of the global energy system, recently released Smartflow, a new as-a-service product that makes it simple and easy for enterprise customers to launch web 3 solutions built on Energy Web X, a parachain powered by the Polkadot blockchain and substrate technology.

Smartflow enables enterprise customers to configure and deploy custom business logic using decentralized networks of “worker nodes”. Worker nodes ingest data from individual enterprises or consortia of corporate customers, perform application-specific computational work on the data, and publish the results for enterprises and the public to verify. Worker nodes create business value because of the zero trust relationships they unlock: with worker nodes, no central entity has access to underlying data and, more importantly, work conducted by the nodes can be independently verified without needing to trust a single centralized entity or server. Worker nodes are connected to and secured by the Energy Web X parachain, a new blockchain on the Polkadot network.
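To make the verification idea concrete, here is a minimal Python sketch of the worker-node pattern described above. It is illustrative only, with hypothetical names (run_worker_node, verify) rather than the Smartflow API: a node performs application-specific work over input data and publishes the result together with a hash commitment to its inputs, so any party holding the same data can recompute and check the result without trusting the node.

import hashlib
import json

def run_worker_node(records):
    # Application-specific work: here, a toy aggregation of metering data.
    result = {
        "total_kwh": sum(r["kwh"] for r in records),
        "record_count": len(records),
    }
    # Publish a commitment to the inputs alongside the result so any party
    # holding the same data can recompute and check the output without
    # trusting this node or a central server.
    commitment = hashlib.sha256(
        json.dumps(records, sort_keys=True).encode()
    ).hexdigest()
    return {"result": result, "input_commitment": commitment}

def verify(records, published):
    # Independent verification: re-run the computation and compare.
    return run_worker_node(records) == published

data = [{"meter": "a", "kwh": 3.2}, {"meter": "b", "kwh": 1.7}]
published = run_worker_node(data)
print(verify(data, published))  # True

In a real deployment, the result and commitment would be anchored on Energy Web X so that verification can happen publicly rather than bilaterally.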

Given the regulated nature of the energy industry and growing calls for more transparency on sustainability data, worker nodes are uniquely positioned to create value for energy companies. Over the past 7 years, corporates in the Energy Web ecosystem have uncovered a number of use cases for Smartflow and worker nodes, including:

Matching granular energy consumption from buildings with energy produced by individual renewable energy power plants
Balancing the grid using fleets of distributed solar systems and batteries
Verifying the carbon intensity of sustainable aviation fuel
“As long-time Energy Web supporters, we are excited by the decentralized approach to multi-party computation provided by SmartFlow. It is a digital tool that provides a new kind of trust layer in business relations based on shared information. We are eager to discover how it can be integrated into our processes, and what kind of value it can create,” said Etienne Gehain, New Digital Solutions Director at Engie.

Smartflow enables enterprises to build custom worker node workflows and integrate them with existing data sources and APIs in minutes using a no-code infrastructure. Though currently configured to support Energy Web’s ecosystem of member energy companies, the technology can create value for enterprises in any industry where exchange and processing of sensitive data between companies is required to create business value.

Please visit the smartflow website to create your account and begin building today.

About Energy Web Foundation: Energy Web is a global non-profit accelerating the clean energy transition by developing open-source technology solutions for energy systems. Our enterprise-grade solutions improve coordination across complex energy markets, unlocking the full potential of clean, distributed energy resources for businesses, grid operators, and customers. Our solutions for enterprise asset management, Digital Spine, and Green Proofs, our tool for registering and tracking low-carbon products, are underpinned by the Energy Web Chain, the world’s first public blockchain tailored to the energy sector. The Energy Web ecosystem comprises leading utilities, renewable energy developers, grid operators, corporate energy buyers, automotive, IoT, telecommunications leaders, and more. More information on Energy Web can be found at www.energyweb.org or follow us on Twitter @EnergyWebX.

Web 3 as-a-service solution for enterprise now live on Energy Web X and Polkadot was originally published in Energy Web on Medium, where people are continuing the conversation by highlighting and responding to this story.


FIDO Alliance

Recap: Virtual Summit: Demystifying Passkey Implementations


By: FIDO staff

Passkeys hold the promise of enabling simpler, strong authentication. But first organizations, governments and individuals will have to adopt the technology – and some of them have questions.

At the Authenticate Virtual Summit: Demystifying Passkey Implementation on March 13, speakers from the FIDO Alliance, Intercede, IDEMIA, Yubico, Dashlane and 1Password as well as implementers including Amazon and Target, presented on their experiences implementing and working with passkeys. The virtual summit covered the technical perspective on passkeys from the FIDO Alliance, as well as use cases for passkeys in the enterprise, consumer authentication, and the U.S. government. Along the way, attendees asked lots of questions and got lots of insightful answers.

Fundamentally a key theme that resonated throughout the virtual summit was that passkeys are a password replacement – and it’s a replacement that can’t come soon enough.

“Passwords are still the primary way for logging on and they are still easily phished through social engineering and they tend to be very difficult to use and to maintain,” David Turner, senior director of standards development at the FIDO Alliance said. “The consequences are real and the impact is real to the world at large.”

Passkeys 101

During his session, Turner provided a high-level overview on what passkeys are and how they work.

Passkeys build upon existing FIDO authentication protocols and simplify the user experience. 

Passkeys can now be synchronized across devices through the use of passkey providers, removing the need for separate credentials on each device. Passkeys also enable new capabilities like cross-device authentication. Turner demonstrated how a QR code scanned on one device can securely connect to credentials stored on another nearby device. 

In addition to synced passkeys there are also device-bound passkeys, that rely on technologies like a security key to provide the required credentials.

The State of Passkeys

The current and future state of passkey adoption was the topic tackled by Andrew Shikiar, executive director and CEO of the FIDO Alliance.

There are now hundreds of services, including the major platform vendors Microsoft, Apple and Google, representing billions of users, that support passkeys at this point in 2024.

“If you are a service provider and you wish to deploy passkeys, you can do so with high confidence that your consumers will be able to leverage them,” he said.

The FIDO Alliance aims to drive passkey support over the coming years, in part by sharing best practices and success stories, which is a core part of what the virtual summit was all about.

Usability was emphasized as a key factor for widespread adoption. 

“Usability is paramount. It must be front and center in what you do,” said Shikiar. 

The FIDO Alliance has released user experience guidelines and a design system to help companies implement passkeys in a user-friendly way. Future guidelines will address additional use cases.

Shikiar emphasized that passkeys are not just another addition layered on to improve the security of passwords. His expectation is that passkeys will be seen as a true password replacement rather than an attempt at bolstering existing authentication methods. The fundamental problem is passwords, he argued, and the goal should be replacing them, not adding extra security layers on top. Shikiar wants people to stop thinking about multi-factor authentication factors and instead think about enabling phishing-resistant identities.

Passkeys are on Target at Target

Passkeys are already in use at retail giant Target, helping to improve security and optimize authentication for its employees. 

Tom Sheffield, senior director cybersecurity at Target, said that the company has been leveraging FIDO for workforce authentication since 2018 and adopted it as a primary authenticator in 2021.

One of the ways that Target has been able to more easily enable passkey support across its platforms is via Single Sign On (SSO). 

“We have a very robust SSO environment across our web application suite,” Sheffield said. “So for us, that made it very easy to integrate FIDO into the SSO platform, and then therefore every application behind SSO automatically got the benefit of it.”

In terms of how Target was able to get its users to adopt passkeys quickly, Sheffield said that the option was communicated to users in the login flow, rather than trying to explain to users what they should do in an email.

Overall, Sheffield emphasized that if an organization is using OTP (one-time passwords) today for multi-factor authentication (MFA), any form of FIDO will provide a significantly better user experience and security.

“There have not been many security programs that I’ve been part of in my 25-year career in this space that offer you security and user experience simultaneously,” he said. “So if you’re using anything other than FIDO you’ve got a great opportunity to up your game and provide a great experience for users which should make you a hero.”

Authenticating a Billion Customers with Passkeys at Amazon

Among the biggest consumer-facing websites that supports passkeys today is online giant Amazon.

Yash Patodia, senior manager of product management at Amazon, detailed how passkeys were rolled out to hundreds of millions of consumers worldwide. Patodia explained Amazon’s motivation noting that passwords are relatively easy for a bad actor to crack. He noted that passkeys help customers to authenticate more easily than other methods with a better user experience. 

Amazon implemented passkeys using different APIs for web, iOS, and Android platforms. Now available across devices, Amazon’s goal is to drive awareness and increase passkey adoption among its customer base over the next year. In his view, passkeys are well suited for mass adoption and early indications from Amazon’s user base are very encouraging.

“If you’re a consumer facing company who has a big customer base, definitely explore this option,” he said.

Considerations for FIDO and Passkeys in the US Government 

The U.S. Government is no stranger to the world of strong authentication, with many staffers already using PIV (Personal Identity Verification) smart card credentials. 

Teresa Wu from IDEMIA and Joe Scalone from Yubico, who both serve on the FIDO Alliance’s Government Deployment Working Group (GDWG), provided an overview of how passkeys can complement PIV credentials and support a zero trust security model. 

As government agencies work to implement phishing-resistant multi-factor authentication, passkeys are an option that could provide a more seamless user experience than one-time passwords or hardware tokens. 

“We are not here to replace PIV, we are here to supplement and use FIDO where PIV is not covered,” said Wu. 

One area they see opportunities for FIDO is for federal contractors and employees who are not eligible for a PIV card due to their job functions. Currently these individuals rely on passwords for system access.

State of Passkey Portability Set to Improve

A critical aspect of user experience is the ability to change passkey providers and move from one provider to another, if that’s what the user wants to do.

With existing password managers and legacy passwords, the process of moving credentials isn’t particularly efficient or secure, according to Rew Islam from Dashlane and Nick Steele from 1Password. It’s a situation that the Credential Provider Special Interest Group within the FIDO Alliance is looking to solve with a new standard for securely porting passwords between different password/passkey management applications.

The group is developing a new Credential Exchange Protocol that will use hybrid public key encryption to securely transfer credentials; the effort also includes the development of a standardized data format for credential information.

“By having the standard credential format, it will allow for interoperability of sharing credentials between two different providers in different organizations,” Steele said.

A proof of concept demo for the credential exchange is currently set for May, during the FIDO Member Plenary in Osaka, Japan. Islam noted that the effort represents a real triumph for the power of FIDO to bring different competitive vendors together for a common purpose.
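As an illustration of the hybrid public key encryption idea (and only that; the draft Credential Exchange Protocol’s actual format and API are not yet published), the following Python sketch uses X25519 key agreement with HKDF and AES-GCM from the cryptography package to move a credential payload from an exporting provider to an importing one. All names here are hypothetical.

import os
import json
from cryptography.hazmat.primitives import hashes
from cryptography.hazmat.primitives.asymmetric.x25519 import X25519PrivateKey
from cryptography.hazmat.primitives.ciphers.aead import AESGCM
from cryptography.hazmat.primitives.kdf.hkdf import HKDF

def derive_key(shared_secret):
    # Derive a symmetric key from the ECDH shared secret.
    return HKDF(algorithm=hashes.SHA256(), length=32, salt=None,
                info=b"credential-exchange-demo").derive(shared_secret)

def export_credentials(credentials, importer_public_key):
    # The exporter encrypts to the importer's public key using an
    # ephemeral keypair, so only the importer can decrypt.
    ephemeral = X25519PrivateKey.generate()
    key = derive_key(ephemeral.exchange(importer_public_key))
    nonce = os.urandom(12)
    ciphertext = AESGCM(key).encrypt(nonce, json.dumps(credentials).encode(), None)
    return ephemeral.public_key(), nonce, ciphertext

def import_credentials(importer_key, ephemeral_public, nonce, ciphertext):
    key = derive_key(importer_key.exchange(ephemeral_public))
    return json.loads(AESGCM(key).decrypt(nonce, ciphertext, None))

importer_key = X25519PrivateKey.generate()  # published by the importing provider
eph, nonce, ct = export_credentials([{"site": "example.com", "passkey_id": "demo"}],
                                    importer_key.public_key())
print(import_credentials(importer_key, eph, nonce, ct))

Because the payload is encrypted directly to the importing provider, credentials never transit in plaintext the way a CSV export of passwords would.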

Common Questions about Passkeys 

The virtual summit concluded with an ‘Ask Me Anything’ (AMA) session where attendees asked their most pressing questions on passkeys.

Among the big questions asked:

How should organizations choose between synced passkeys and device-bound passkeys from a security and usability perspective?

Turner answered that the first thing to make really clear is that synced passkeys are probably the right answer for the majority of use cases. That said, he noted that FIDO recognizes that there are some areas where people have a much higher risk profile, and in those cases device-bound passkeys can provide an extra level of trust.

Can passkeys play a role in transaction signing?

Pedro Martinez from Thales responded that yes, passkeys can be used to sign transactions. He explained that the beauty of the FIDO protocol is that it is based on the signature of a challenge. As such, it’s possible to adjust the challenge in order to contain data related to a transaction that needs to be digitally signed.
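A minimal sketch of that signature-over-challenge idea, assuming a plain Ed25519 key stands in for a real authenticator (this is illustrative Python, not the WebAuthn/FIDO2 wire protocol): the relying party binds the transaction details into the challenge, so a valid signature attests to that specific transaction rather than to a generic login.

import os
import json
import hashlib
from cryptography.hazmat.primitives.asymmetric.ed25519 import Ed25519PrivateKey

authenticator_key = Ed25519PrivateKey.generate()  # private key never leaves the device

# The relying party builds a challenge that embeds the transaction data.
transaction = {"amount": "150.00", "currency": "EUR", "payee": "ACME"}
challenge = hashlib.sha256(
    os.urandom(32) + json.dumps(transaction, sort_keys=True).encode()
).digest()

# The authenticator signs the challenge once the user approves the transaction.
signature = authenticator_key.sign(challenge)

# The server verifies with the registered public key; success proves the user
# approved this specific transaction. verify() raises InvalidSignature on failure.
authenticator_key.public_key().verify(signature, challenge)
print("transaction signature verified")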

When will passkeys be the default mode of authentication? 

Shikiar said that he doesn’t think that all passwords will go away, but he is hopeful for a passwordless future.

“Sophisticated risk engines and anomaly detectors don’t really think twice about accepting a password,” he said. “But as passkeys become more prevalent and become the default, all of a sudden using a password will be anomalous in and of itself. And I think that’s when we’ll be in the fabulous future when using a password is rightfully seen as a high-risk and anomalous action.”

Monday, 25. March 2024

Identity At The Center - Podcast

It’s time for a public conversation about privacy on the latest episode of the Identity at the Center Podcast


It’s time for a public conversation about privacy on the latest episode of the Identity at the Center Podcast. We had an open conversation with Hannah Sutor, a Principal Product Manager at GitLab and IDPro Board Member, about privacy. We delved into the nuances of privacy as a human right, the expectations of privacy in our roles as employees and consumers, and much more.

Check out this episode at idacpodcast.com or in your favorite podcast app.

#iam #podcast #idac

Friday, 22. March 2024

World Identity Network

World Identity Network Releases “Shadows in the Dark” Documentary on Amazon


WASHINGTON, March 22, 2024 /PRNewswire/ — World Identity Network (WIN), the leading nonprofit organization advocating for universal identity rights, has released its groundbreaking documentary, Shadows in the Dark: Our Global Identity Crisis, exclusively on Amazon.

“Releasing this film to the public is a moment of great triumph for our organization,” says WIN Founder and CEO, Dr. Mariana Dahan. “We spent years interviewing undocumented persons and refugees. Telling their stories with the utmost care, precision, and nuance was a tremendous responsibility, and we could not be happier with the final result.”

Shadows in the Dark is a sprawling saga following the stories of undocumented individuals across the United States, the Middle East, and refugee camps in Europe and beyond. The documentary shines a light on those born in the shadows of the formal economy, at the margins of society, lacking common identity documents, such as birth certificates and passports.

The movie highlights the work that Dr. Mariana Dahan has conducted at The World Bank, as the initiator and first global coordinator of the Identification for Development (ID4D) agenda, which celebrates its 10-year anniversary this year. Shadows in the Dark offers a compelling analysis of the successes and the risks associated with this multi-billion-dollar program.

The Emmy Award-winning film crew interviewed decision-makers, technologists, and human rights activists advocating for universal identification and the responsible use of digital technologies, such as biometrics, facial recognition, and AI.

“Identity is at the heart of many of today’s global challenges,” says Shadows in the Dark Co-director, Brad Kremer. “It is the common thread in immigration and many of the conflict zones existing throughout the world. When Dr. Mariana Dahan approached me to do this film together, I knew it would be a journey of immense meaning. But directing this narration, and telling the stories of everyone this issue impacts, has exceeded all our expectations.”

Produced in partnership with the United Nations, the Human Rights Foundation and Singularity University, Shadows in the Dark features extensive interviews with displaced Ukrainian and Syrian refugees recounting their experiences with the asylum process, along with leading officials at the World Bank and the United Nations, and the founders building new digital identity solutions. The film likewise explores nuances surrounding surveillance, authoritarian regimes, and biometric systems, as well as a dialogue with a group of far-right border advocates in the United States.

“In many ways, this film is a culmination of my life’s work,” continues Dr. Dahan. “Having been born without a birth certificate in Soviet-era Moldova, at the border with Ukraine, I know firsthand how crucial identity is to the preservation of human rights. I encourage everyone to watch the film and learn more about this global issue impacting millions. Identity is the cornerstone of human civilization”.

To learn more about Shadows in the Dark go to www.shadowsinthedark.movie

The post World Identity Network Releases “Shadows in the Dark” Documentary on Amazon appeared first on World Identity Network.


FIDO Alliance

Identity Week: HID’s 2024 report highlights mobile IDs, MFA, and sustainability in security trends


With over 83% of organisations currently using MFA, the shift away from password-dependency is clear. However, the report indicates a slower but growing implementation of Zero Trust architectures, currently in place in up to 16% of larger organisations. The development of standards like FIDO heralds a move toward more secure authentication options.


Neowin: Proton Pass gets passkey support for both free and paid users


Proton has announced passkey support in its Proton Pass password manager, which now offers enhanced security and usability for both free and paid users across all platforms.


Biometric Update: FIDO’s influence expands with new security key and board member


Cisco has further solidified its commitment to passkeys by joining the FIDO Alliance’s board of member representatives. Andrew Shikiar, executive director and CEO of the FIDO Alliance welcomes Cisco’s expanded involvement, noting their historical contributions through Duo Security and now as an official member.


Elastos Foundation

Bitcoin Layer 2 Evolution: Unveiling BeL2’s BTC Oracle with Elastos


The launch of BeL2’s BTC Oracle marks a critical juncture: a paradigm shift in how Bitcoin interacts with the broader ecosystem of decentralised applications (DApps) and Ethereum Virtual Machine (EVM) compatible blockchains.

Bitcoin, as the first cryptocurrency, has long been critiqued for its limitations in scalability and flexibility, particularly in the context of smart contracts and DApps. The introduction of BeL2 and its BTC Oracle addresses these critiques head-on by generating zero-knowledge proofs (ZKPs) to enable secure, private, and efficient communication between Bitcoin and EVM blockchains. This development is crucial because it expands Bitcoin’s utility beyond being a mere store of value to a foundational layer upon which complex decentralised applications can be built and managed directly.


The Core

The core of this innovation lies in BeL2’s BTC Oracle. The BTC Oracle generates ZKPs to feed real-time Bitcoin transaction data into EVM smart contracts without compromising the privacy or security of the transactions. This functionality is revolutionary, as it allows for the creation of Bitcoin-denominated smart contracts across any EVM-compatible blockchain, vastly expanding the potential use cases and applications for Bitcoin in the decentralised finance (DeFi) space.

BeL2, or Bitcoin Layer 2, further extends this capability by providing a framework for developing and managing Bitcoin-native smart contracts. It represents the culmination of efforts to integrate Bitcoin more deeply into the ecosystem of decentralised applications, enabling novel financial products and services such as BTC lending, algorithmic stablecoin issuance, and more.


The Mechanism

BeL2’s technology stack comprises a BTC Oracle that inputs Bitcoin-related data into EVM contracts, an upcoming ELA-powered relay network to decentralise and secure the data transmission, and the application layer where the actual development of Bitcoin-native smart contracts takes place.

This approach minimises reliance on intermediaries, reduces points of failure, and enhances the system’s overall resilience and efficiency. BeL2’s BTC Oracle is centred around enhancing Bitcoin’s utility and accessibility, involving innovative cryptographic techniques like ZKPs to deliver a comprehensive solution for Bitcoin and EVM blockchain interoperability.
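The inclusion check such an oracle ultimately attests to can be illustrated with a small SPV-style Merkle-branch verification in Python. This is a conceptual sketch under simplified assumptions (toy transactions, no ZKP layer); BeL2 wraps proofs of this kind in zero-knowledge proofs rather than exposing raw branches, and the helper names below are hypothetical.

import hashlib

def dsha256(data: bytes) -> bytes:
    # Bitcoin uses double SHA-256 for its Merkle trees.
    return hashlib.sha256(hashlib.sha256(data).digest()).digest()

def verify_inclusion(txid: bytes, merkle_root: bytes, branch, index: int) -> bool:
    # branch: sibling hashes from the leaf up to the root;
    # index: the transaction's position among the block's leaves.
    h = txid
    for sibling in branch:
        h = dsha256(h + sibling) if index % 2 == 0 else dsha256(sibling + h)
        index //= 2
    return h == merkle_root

# Toy block with two transactions.
tx_a, tx_b = dsha256(b"tx-a"), dsha256(b"tx-b")
root = dsha256(tx_a + tx_b)
print(verify_inclusion(tx_a, root, [tx_b], 0))  # True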


The Impact

By enabling direct development on Bitcoin Layer 2, Elastos is not just augmenting Bitcoin’s functionality; it is redefining the possibilities of the blockchain space. The ability for any EVM blockchain to leverage Bitcoin in smart contracts opens up new avenues for innovation, potentially increasing the market for Bitcoin-based applications sevenfold.

This development aligns with the broader trend of seeking solutions that respect the foundational principles of blockchain technology—decentralisation, security, and user sovereignty—while pushing the boundaries of what’s possible. It embodies a non-consensus, forward-thinking approach that challenges conventional limitations and opens up new opportunities for the entire crypto ecosystem.

In conclusion, the launch of Elastos’ BTC Oracle and BeL2 platform represents a significant milestone in the evolution of Bitcoin and blockchain technology. By addressing fundamental challenges of interoperability and functionality, it demonstrates that Bitcoin’s value lies not just in its scarcity and security but in its utility and integration into the decentralised web.

Try the BeL2 demo here!


DIDAS

Parallel Signatures – a relevant input to the Technology Discussion


To enhance the Swiss e-ID framework with selective disclosure while ensuring unlinkability, it’s imperative to incorporate advanced digital signature technologies such as BBS+ signatures. These technologies not only fortify the security of digital credentials but also significantly enhance user privacy. Such capabilities are crucial in minimizing the risk of personal data exposure and ensuring that users retain control over their information. It’s essential to continuously align our Trust Infrastructure with international cryptographic standards while remaining adaptable to emerging norms. This approach will facilitate interoperability across borders and sectors, ensuring that e-ID systems are both secure and universally recognized.

The parallel signatures model involves attaching multiple digital signatures to a single document or payload, with each signature providing different security or privacy features. This approach allows for a flexible and robust security framework, accommodating various cryptographic standards and privacy needs without compromising the integrity of the original document. It’s particularly useful in environments requiring adherence to diverse regulatory standards, or in scenarios where resilience and both high security and privacy are paramount.

Cryptographic layering supports adaptiveness by incorporating multiple layers of cryptographic techniques within a system. This approach allows for the seamless integration and removal of cryptographic methods as needed by the Trust Ecosystem governance, enabling the system to adapt to evolving security threats and advancements in cryptographic research. It ensures long-term resilience and flexibility, allowing systems to maintain security without complete overhauls.

Applying cryptographic schemes always mandates careful handling of private keys. Preventing their exposure is vital, even more so when using advanced schemes supporting derivative keys, as is possible with BBS+. This underscores the need for strict security measures to prevent unauthorized access and ensure the system’s integrity.
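As a minimal sketch of the parallel signatures model, the following Python example attaches two independent signatures to one payload. Both slots use Ed25519 purely for demonstration, whereas a production deployment would pair a conventional scheme with something like BBS+; the point is structural: neither signature wraps or depends on the other, so each can be verified, dropped, or replaced on its own.

import json
from cryptography.hazmat.primitives.asymmetric.ed25519 import Ed25519PrivateKey

payload = json.dumps({"credential": "e-ID", "holder": "did:example:123"},
                     sort_keys=True).encode()

issuer_keys = {
    "classic": Ed25519PrivateKey.generate(),  # stand-in for a conventional signature
    "privacy": Ed25519PrivateKey.generate(),  # stand-in for a BBS+ signature
}

# Each signature is computed over the same payload, in parallel.
signatures = {name: key.sign(payload) for name, key in issuer_keys.items()}

# A verifier checks whichever signatures it trusts; verify() raises on failure.
for name, key in issuer_keys.items():
    key.public_key().verify(signatures[name], payload)
print("both parallel signatures verify independently")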

Public-Private Partnerships (PPPs) represent a proven strategic model to operationalize digital trust and -identity solutions, combining public oversight with private sector efficiency and innovation. Such partnerships should be structured to encourage shared investment and risk, with a clear focus on public interest, global standards and local governance, protection of digital sovereignty and value-based adoption. These initiatives should be complemented by ongoing research into cryptographic innovations, preparing the ground for future advancements in e-ID security and privacy.

To address these challenges comprehensively and to build a continuously improving framework that is not only secure and compliant but also resilient and forward-looking, we should evaluate investing in an independent body that accompanies further progress in technology and governance and supports public and private sector adoption, so that we benefit from the opportunities of a trusted digital economy in the long term.

Thank you DIDAS Technology Working Group and Manu Sporny of Digital Bazaar for the dialogue!


MyData

A Recorded Delivery Network… for Data

In the MyData Matters blog series, MyData members introduce innovative solutions and practical use cases that leverage personal data in line with MyData values. Since the 1980s, personal data has been managed in essentially the same way. Organisations aggregate customer information in vast data warehouses, with the assumption that more data is always better to […]

Thursday, 21. March 2024

Digital ID for Canadians

DIACC Women in Identity: Marli Lichtman


DIACC women in identity spotlights showcase outstanding DIACC member women in identity. If you are a DIACC member woman in identity and would like us to feature you in the spotlight, contact us!

Marli Lichtman is Managing Director and Head, Digital Strategy and Controls at BMO Financial Group, BMO.

Follow Marli on LinkedIn

What has your career journey looked like?

Let me work backwards, starting with my current role as Head of Digital Strategy and Controls at BMO. In this role, I lead two teams accountable for: (1) Strategy: defining and executing BMO’s “Digital First” agenda and (2) Controls: working in partnership with the Financial Crimes Unit to build and enhance digital controls to protect our customers against fraud.

I initially joined BMO’s Corporate Strategy Team in 2013 and since then have worked in progressively senior roles across Finance, Risk, Transformation and Business Operations.

Before joining BMO, I was a consultant in Oliver Wyman’s Finance and Risk Practice and prior to that, I worked in wealth management and earned my CFA (Chartered Financial Analyst) designation. My first job out of school was at a boutique investment advisory firm. I graduated from Ivey at Western University with an Honours Business Administration (HBA) degree.

When you were 20 years old, did you visualize a dream job and if so, why?

I didn’t really know what I wanted to do when I was 20! I focused the early days of my career on finding opportunities where I could be challenged, learn as much as possible, maintain optionality to transition to other industries or career paths, and work with great people who would champion my career.

Have you encountered significant barriers in your career as a woman in leadership, and if so, what were they?

I have experienced many of the usual challenges you hear about concerning women in the workplace. However, my biggest barrier has been getting into my own head and thinking that I don’t deserve the positions I’ve been given (I mean, earned 😊). Through executive coaching, mentors, sponsors, and simply the experience of failing and rebounding, I’ve been able to overcome this (although I would be lying if I said I don’t experience imposter syndrome from time to time!).

How do you balance work and life responsibilities?

It’s a constant juggling act, but I try to focus on 5 things:

Regular calendar reviews to “optimize” my time (e.g., which calls can I take from the car on my way to / from the office?)
Learning to say “no” and setting clear boundaries (applies to both work and personal life).
Finding time for self-care.
Working as a team with my partner who is also balancing a demanding schedule.
Living my values and knowing what’s important in life.

How can more women be encouraged to pursue digital trust and identity careers?

We need to start with education – What is Digital ID? What skillsets do you need to enter the space? Why is diversity so important? Who are female trailblazers in the space, and what has their career path looked like? Early exposure, encouragement, and mentorship are key to increasing female representation in this space.

What are some strategies you have learned to help women achieve a more prominent role in their organizations?

Build meaningful relationships. Earn the trust of your colleagues. Network within and outside of your industry. Ensure you have a mentor and a sponsor at your organization. Most importantly, stay true to yourself.

What will be the biggest challenge for the generation of women behind you?

While women have made considerable progress over the past decade, there is still more work to do. The next generation will continue to face the same challenges (e.g., gender bias, pay inequality, balancing personal life) but will benefit from increased female representation and sponsorship at Senior levels.

What advice would you give to young women entering the field?

Be confident – you are in the field for a reason! Trust your instincts, and don’t be too hard on yourself.


Ceramic Network

Toward the first decentralized points system: Oamo becomes the first points provider on Ceramic


We're thrilled to announce that Oamo is partnering with Ceramic as a data provider on the platform. Oamo will issue tens of millions of publicly available credentials based on wallets’ on-chain activity and holdings. This is the first step in a broader initiative to develop and standardize the first decentralized point system, powered by Ceramic and Oamo’s credential models.

Oamo has been a big supporter of the Ceramic ecosystem from day one. By harnessing Ceramic's innovative DID (Decentralized Identifier) infrastructure and ComposeDB for zero-party data storage, they’re setting the foundation for a future where user data is private by design, perishable at will, and accessible only with explicit permission. Oamo and Ceramic are crafting a path toward a consensual and rewarding digital ecosystem.

The partnership so far

Since launching on Ceramic in Q3 2023, Oamo has witnessed remarkable results – over 65,000 Oamo Profiles have been created, with more than 200,000 Ceramic documents generated. Additionally, Oamo has distributed over 400,000 credentials spanning Web2 and on-chain behaviors, enriching the digital identity and access privileges of Oamo Profile users across various platforms.

Oamo credentials cover:

On-chain activity across DeFi, NFTs, staking and gaming;
Wallet holdings including major ERC-20s and NFT collections; and
Social activity across Web2 platforms like Discord, Youtube and X.

Supercharging the Ceramic ecosystem

With this partnership and announcement, Oamo aims to enhance digital identity and engagement through:

Credential Distribution
Oamo has indexed millions of EVM wallets’ behaviors and holdings, and will be distributing tens of millions of publicly available credentials to enrich user identities across platforms, ensuring the security and verification of online activities. Credentials issued will be maintained and updated monthly to include time decay and ensure they always represent the latest behaviors of the indexed wallets. Feedback from the community is welcome to develop new credentials that track the most relevant on-chain behaviors for builders in the ecosystem. These credentials can then be used to:

Compile specific wallet lists for airdrops.
Establish reputation frameworks based on behavioral data points.
Launch strategic user acquisition campaigns by identifying wallets in a specific target audience and contacting them via XMTP, for example.

Decentralized Point System
Oamo will leverage its credential models to develop the first standardized decentralized point system on Ceramic, with each indexed wallet receiving its own scorecard based on its on-chain activity and holdings. Builders in the ecosystem will be able to leverage these scorecards and customize their own points system with their own credentials and Oamo’s.

Credential & Points Management SDK
Oamo will release an SDK to allow any builder to search and leverage Oamo’s credentials and points system easily. This middleware will also allow builders to issue their own credentials and points based on their own models and app activity.

What’s in it for users

Anyone creating their Decentralized Identifier (DID) on the Ceramic Network (by creating an Oamo Profile, for instance) will be able to claim their credentials and scorecards seamlessly. This open and inclusive approach democratizes access to digital credentials, ensuring users from all backgrounds and levels of onchain experience can benefit from Ceramic’s ecosystem of builders.

What’s in it for developers

Oamo's vision includes diverse use cases, transforming how developers interact with consumers. The Oamo platform offers endless opportunities for various types of protocols and apps:

DeFi Protocols
Easily find wallets matching their target audience, such as active liquidity providers on leading AMMs or active traders on DEXes across major EVM chains.

NFT Projects
Identify potential collectors based on their NFT holdings and distribute collections to the right user base.

Wallet Providers
Identify and reach whales holding specific token amounts across multiple chains.

Liquid Staking Projects
Identify wallets holding significant ETH amounts and generating yield via lending protocols as high-value acquisition targets.

Game Developers
Find gamers in Web3 that hold specific NFTs or have engaged with similar on-chain games.

While the Oamo app provides a hub for user acquisition and relationship development, this publicly available tooling and data will allow anyone to craft their own strategies.

Builders on the Ceramic Network will have the capability to query, consume, and customize issued credentials and points to power new data-rich use cases, such as targeted airdrops, credential-gated experiences, loyalty programs, and more. To streamline integrations, Oamo will be launching an SDK, making it easier for developers to incorporate these capabilities into their own projects.

Join the Ceramic Discord and Oamo’s Telegram channel for builders to contribute or be notified about updates and releases.

About Ceramic

Ceramic is a decentralized data network for managing verifiable data at scale, combining the trust and composability of a blockchain with the flexibility of an event-driven architecture to help organizations get more value from their data. Thousands of developers use it to manage reputation data, store attestations, log user activity, and build novel data infrastructure. Ceramic frees entrepreneurs from the constraints of traditional siloed infrastructure, letting them tap into a vibrant data ecosystem to bring their unique vision to life faster.

About Oamo

Oamo allows consumers to discover and match with their favorite brands based on their online activity. Brands can define their ideal user persona based on their online behaviors, optionally incentivize data sharing via token rewards, and design personalized conversion and retention campaigns to acquire power users. Zero-party data guarantees an optimal match between interested consumers and brands through rich behavioral alignment, leading to higher conversion rates and LTV.


Origin Trail

Decentralized RAG with OriginTrail DKG and NVIDIA Build ecosystem

Introduction Generative Artificial Intelligence (AI) is already reaching relevant adoption across multiple fields, however, some of its limitations are significantly hurting the potential of mainstream adoption and delivering improvements in all fields of modern humanity. For GenAI to be production-ready for such a scale of impact we need to limit hallucinations, manage bias, and reject intellect
Introduction

Generative Artificial Intelligence (AI) is already seeing meaningful adoption across multiple fields; however, some of its limitations significantly hurt its potential for mainstream adoption and for delivering improvements across all of those fields. For GenAI to be production-ready at such a scale of impact, we need to limit hallucinations, manage bias, and reject intellectual property (or data ownership) infringements. The promise of the Verifiable Internet for AI is to address these shortfalls by providing information provenance in model outputs, ensuring verifiability of presented information, respecting data ownership, and incentivizing new knowledge creation.

Below we showcase an implementation framework called Decentralized Retrieval-Augmented Generation (dRAG) on the NVIDIA Build ecosystem, which brings together a broad range of powerful models across industries and modalities. dRAG advances the Retrieval-Augmented Generation (RAG) framework proposed by Patrick Lewis, which aims to increase the accuracy and reliability of GenAI models with facts fetched from external sources. The RAG framework has gained prominence both among AI developers and among leaders of major tech companies, such as NVIDIA CEO Jensen Huang.

The dRAG advances the RAG system by leveraging the Decentralized Knowledge Graph (DKG), a permissionless network of Knowledge Assets. Each Knowledge Asset contains Graph data and/or Vector embeddings, immutability proofs, a Decentralized Identifier (DID), and the ownership NFT. When connected in one permission-less DKG, the following capabilities are enabled:

Knowledge Graphs — structural knowledge in knowledge graphs allows a hybrid of neural and symbolic AI methodologies, enhancing the GenAI models with deterministic inputs.
Ownership — dRAG uses input from Knowledge Assets that have an owner who can manage access to the data contained in the Knowledge Asset.
Verifiability — every piece of knowledge on the DKG has cryptographic proofs published, ensuring that no tampering has occurred since it was published.

In this tutorial, you will learn how to query the OriginTrail DKG and retrieve verified Knowledge Assets on the DKG.

Prerequisites

An NVIDIA Build platform account and API key.
A DKG node. Please visit the official docs to learn how to set one up.
A Python project with a virtual environment set up.

Step 1 — Installing packages and setting up dkg.py

In this step, you’ll install the necessary packages using pip and set up the credentials for dkg.py.

Navigate to your Python project’s environment and run the following command to install the packages:

pip install openai dkg python-dotenv annoy

The OpenAI client is going to act as an intermediary for interacting with the NVIDIA API. You’ll store the environment variables in a file called .env. Create and open it for editing in your favorite editor:

nano .env

Add the following lines:

OT_NODE_HOSTNAME="your_ot_node_hostname"
PRIVATE_KEY="your_private_key"
NVIDIA_API_TOKEN="your_nvidia_api_token"

Replace the values with your own, which you can find in the configuration file of your OT Node, as well as your wallet’s private key in order to perform the Knowledge Asset create operation, which needs to be funded with TRAC tokens (more information available in the OriginTrail documentation). Keep in mind that this information should be kept private, especially your wallet’s key. When you’re done, save and close the file.

Then, create a Python file where you’ll store the code for connecting to the DKG:

nano dkg_version.py

Add the following code to the file:

from dkg import DKG
from dkg.providers import BlockchainProvider, NodeHTTPProvider
from dotenv import load_dotenv
import os
import json

dotenv_path = './.env' # Replace with the path to your .env file if it differs
load_dotenv(dotenv_path)
ot_node_hostname = os.getenv('OT_NODE_HOSTNAME')
private_key = os.getenv('PRIVATE_KEY')

node_provider = NodeHTTPProvider(ot_node_hostname)
blockchain_provider = BlockchainProvider("testnet", "otp:20430", private_key=private_key)

dkg = DKG(node_provider, blockchain_provider)
print(dkg.node.info)

Here, you first import the required classes and packages. Then, you load the values from .env and instantiate a NodeHTTPProvider and BlockchainProvider with those values, which you pass in to the DKG constructor, creating the dkg object for communicating with the graph.

If all credentials and values are correct, the output will show you the version that your OT Node is running on:

{'version': '6.2.3'}

That’s all you have to do to be connected to the DKG!

Step 2 — Instructing the LLM to create Knowledge Assets on the DKG

In this step, you’ll connect to the NVIDIA API using the OpenAI Python library. Then, you’ll instruct the model to generate a JSON-LD object that you will publish to the DKG as a Knowledge Asset.

First, you need to initialize the OpenAI class, passing in the NVIDIA API endpoint as the base_url along with your API key. The OpenAI client acts as an intermediary to the NVIDIA API here and can call multiple LLMs, such as Google’s Gemma and Meta’s Llama, which are used in this tutorial.

from openai import OpenAI

client = OpenAI(
base_url = "https://integrate.api.nvidia.com/v1",
api_key = os.getenv('NVIDIA_API_TOKEN')
)

Then, you define the instructions, telling the model what to do:

instruction_message = '''
Your task is the following:

Construct a JSON object following the Product JSON-LD schema based on the provided information by the user.
The user will provide the name, description, tags, category and deployer of the product, as well as the URL which you will use as the '@id'.

Here's an example of a Product that corresponds to the mentioned JSON-LD schema:
{
"@context": "http://schema.org",
"@type": "Product",
"@id": "https://build.nvidia.com/nvidia/ai-weather-forecasting",
"name": "ai-weather-forecasting",
"description": "AI-based weather prediction pipeline with global models and downscaling models.",
"tags": [
"ai weather prediction",
"climate science"
],
"category": "Industrial",
"deployer": "nvidia"
}

Follow the provided JSON-LD schema, using the provided properties and DO NOT add or remove any one of them.
Output the JSON as a string, between ```json and ```.
'''

chat_history = [{"role":"system","content":instruction_message}]

As part of the instructions, you provide the model with an example Product definition, according to which a new one should be generated. We want to create a Knowledge Asset which will represent the ‘rerank-qa-mistral-4b’ model from the NVIDIA Build platform. You add the contents of that message to chat_history with a system role, meaning that it instructs the model before the user comes in with actionable prompts.

Then, you define an example user_instruction for testing the model:

user_instruction = '''I want to create a product (model) with name 'rerank-qa-mistral-4b', which is a GPU-accelerated model optimized for providing a probability score
that a given passage contains the information to answer a question. It's in category Retrieval and deployed by nvidia.
It's used for ranking and retrieval augmented generation. You can reach it at https://build.nvidia.com/nvidia/rerank-qa-mistral-4b. Give me the schema JSON LD object.'''

This user prompt wants the LLM to output a Product with the given name and gives information as to where that model can be found.

Finally, you can ask the LLM to compute the output and print it:

completion = client.chat.completions.create(
model="google/gemma-7b",
messages=chat_history + [{"role":"user","content":user_instruction}],
temperature=0,
top_p=1,
max_tokens=1024,
)

generated_json = completion.choices[0].message.content
print(generated_json)

The output will look like this:

```json
{
"@context": "http://schema.org",
"@type": "Product",
"@id": "https://build.nvidia.com/nvidia/rerank-qa-mistral-4b",
"name": "rerank-qa-mistral-4b",
"description": "GPU-accelerated model optimized for providing a probability score that a given passage contains the information to answer a question.",
"tags": [
"rerank-qa-mistral-4b",
"information retrieval",
"retrieval augmentation"
],
"category": "Retrieval",
"deployer": "nvidia"
}
```

The LLM has returned a JSON-LD structure that can be added to the DKG.

def clean_json_string(input_string):
    # Strip the Markdown code fences that the model wraps around its JSON output.
    if input_string.startswith("```json") and input_string.endswith("```"):
        return input_string[7:-3].strip()
    elif input_string.startswith("```") and input_string.endswith("```"):
        return input_string[3:-3].strip()
    else:
        return input_string

product = json.loads(clean_json_string(generated_json))

content = {"public": product}
create_asset_result = dkg.asset.create(content, 2)
print('Asset created!')
print(json.dumps(create_asset_result, indent=4))
print(create_asset_result["UAL"])

Here you first define a function (clean_json_string) that will clean up the JSON string and remove the Markdown code markup. Then, you load the product by deserializing the JSON and add it to the DKG by calling dkg.asset.create().

The output will look like this:

Asset created!
{
"publicAssertionId": "0x09d8d7c5b82bd09bc3f51770f575e15f1157c6292652d977afbe453932e270ef",
"operation": {
"mintKnowledgeAsset": {
"transactionHash": "0x6fb8a6039f97cf3c0d8cb8b1a221be405d4a7cbdeab7f27240ae9848322cad98",
"transactionIndex": 0,
"blockHash": "0x7f73351931b89c4c75d3bf66206df69fc55cb3a0f6e126545d82cfe3e81c0d6a",
"from": "0xD988B6fd921CFab980a7f2F60B9aC9F7918D7F71",
"to": "0xB25D47412721f681f1EaffD1b67ff0638C06f2B7",
"blockNumber": 3674556,
"cumulativeGasUsed": 397582,
"gasUsed": 397582,
"contractAddress": null,
"logs": [
{
"address": "0x1A061136Ed9f5eD69395f18961a0a535EF4B3E5f",
"topics": [
"0xddf252ad1be2c89b69c2b068fc378daa952ba7f163c4a11628f55a4df523b3ef",
"0x0000000000000000000000000000000000000000000000000000000000000000",
"0x000000000000000000000000d988b6fd921cfab980a7f2f60b9ac9f7918d7f71",
"0x000000000000000000000000000000000000000000000000000000000027fb68"
],
"data": "0x",
"blockHash": "0x7f73351931b89c4c75d3bf66206df69fc55cb3a0f6e126545d82cfe3e81c0d6a",
"blockNumber": 3674556,
"transactionHash": "0x6fb8a6039f97cf3c0d8cb8b1a221be405d4a7cbdeab7f27240ae9848322cad98",
"transactionIndex": 0,
"logIndex": 0,
"transactionLogIndex": "0x0",
"removed": false
},
{
"address": "0xf305D2d97C7201Cea2A54A2B074baC2EdfCE7E45",
"topics": [
"0x6228bc6c1a8f028a2e3476a455a34f5fa23b4387611f3c147a965e375ebd17ba",
"0x09d8d7c5b82bd09bc3f51770f575e15f1157c6292652d977afbe453932e270ef"
],
"data": "0x00000000000000000000000000000000000000000000000000000000000003e700000000000000000000000000000000000000000000000000000000000000080000000000000000000000000000000000000000000000000000000000000008",
"blockHash": "0x7f73351931b89c4c75d3bf66206df69fc55cb3a0f6e126545d82cfe3e81c0d6a",
"blockNumber": 3674556,
"transactionHash": "0x6fb8a6039f97cf3c0d8cb8b1a221be405d4a7cbdeab7f27240ae9848322cad98",
"transactionIndex": 0,
"logIndex": 1,
"transactionLogIndex": "0x1",
"removed": false
},
{
"address": "0xFfFFFFff00000000000000000000000000000001",
"topics": [
"0xddf252ad1be2c89b69c2b068fc378daa952ba7f163c4a11628f55a4df523b3ef",
"0x000000000000000000000000d988b6fd921cfab980a7f2f60b9ac9f7918d7f71",
"0x000000000000000000000000f43b6a63f3f6479c8f972d95858a1684d5f129f5"
],
"data": "0x0000000000000000000000000000000000000000000000000000000000000006",
"blockHash": "0x7f73351931b89c4c75d3bf66206df69fc55cb3a0f6e126545d82cfe3e81c0d6a",
"blockNumber": 3674556,
"transactionHash": "0x6fb8a6039f97cf3c0d8cb8b1a221be405d4a7cbdeab7f27240ae9848322cad98",
"transactionIndex": 0,
"logIndex": 2,
"transactionLogIndex": "0x2",
"removed": false
},
{
"address": "0x082AC991000F6e8aF99679f5A2F46cB2Be4E101B",
"topics": [
"0x4b81188c3c973dd634ec0dae5b7e72f92bb03834c830739d63935923950d6f64",
"0x0000000000000000000000001a061136ed9f5ed69395f18961a0a535ef4b3e5f",
"0x000000000000000000000000000000000000000000000000000000000027fb68"
],
"data": "0x00000000000000000000000000000000000000000000000000000000000000c000000000000000000000000000000000000000000000000000000000000000010000000000000000000000000000000000000000000000000000000065fc48a00000000000000000000000000000000000000000000000000000000000000002000000000000000000000000000000000000000000000000000000000076a700000000000000000000000000000000000000000000000000000000000000000600000000000000000000000000000000000000000000000000000000000000341a061136ed9f5ed69395f18961a0a535ef4b3e5f09d8d7c5b82bd09bc3f51770f575e15f1157c6292652d977afbe453932e270ef000000000000000000000000",
"blockHash": "0x7f73351931b89c4c75d3bf66206df69fc55cb3a0f6e126545d82cfe3e81c0d6a",
"blockNumber": 3674556,
"transactionHash": "0x6fb8a6039f97cf3c0d8cb8b1a221be405d4a7cbdeab7f27240ae9848322cad98",
"transactionIndex": 0,
"logIndex": 3,
"transactionLogIndex": "0x3",
"removed": false
},
{
"address": "0xB25D47412721f681f1EaffD1b67ff0638C06f2B7",
"topics": [
"0x60e45db7c8cb9f55f92f3de18053b0b426eb919a763a1daca0ea9ad20961e878",
"0x0000000000000000000000001a061136ed9f5ed69395f18961a0a535ef4b3e5f",
"0x000000000000000000000000000000000000000000000000000000000027fb68",
"0x09d8d7c5b82bd09bc3f51770f575e15f1157c6292652d977afbe453932e270ef"
],
"data": "0x",
"blockHash": "0x7f73351931b89c4c75d3bf66206df69fc55cb3a0f6e126545d82cfe3e81c0d6a",
"blockNumber": 3674556,
"transactionHash": "0x6fb8a6039f97cf3c0d8cb8b1a221be405d4a7cbdeab7f27240ae9848322cad98",
"transactionIndex": 0,
"logIndex": 4,
"transactionLogIndex": "0x4",
"removed": false
}
],
"logsBloom": "0x00000100400000000000800000000000000000000000000000000000000000000000010020000000000000000000000000000000000010800000000000001000000040000000400040000008002400000080000000004000000000000000000000040000020000000000000000000a00000000008000020000000010000210015000000000000000000080000000001000000000000000000000000200000000040000001020002002000000000000000000000000000000000000000000000000000002000000000000000000008004000000000000010000000000000020000000000000002800000000000000000000000000000000100000000000010000",
"status": 1,
"effectiveGasPrice": 40,
"type": 0
},
"publish": {
"operationId": "1bb622c7-8fa1-4414-b39e-0aaf3f5465f9",
"status": "COMPLETED"
}
},
"UAL": "did:dkg:otp:20430/0x1a061136ed9f5ed69395f18961a0a535ef4b3e5f/2620264"
}
did:dkg:otp:20430/0x1a061136ed9f5ed69395f18961a0a535ef4b3e5f/2620264

Here we can see a lot of useful information, such as the Knowledge Asset issuer, transaction IDs from the blockchain, and the status of the operation, which was completed. The UAL returned is the Uniform Asset Locator, a decentralized identifier connected to each Knowledge Asset on the DKG.
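
If you need the individual parts of a UAL, you can split it up with a small helper. This function is an illustrative addition (not part of the original tutorial), with the format inferred from the example UAL above:

def parse_ual(ual):
    # Format (inferred): did:dkg:<blockchain>/<contract address>/<token ID>
    assert ual.startswith("did:dkg:")
    blockchain, contract, token_id = ual[len("did:dkg:"):].split("/")
    return {"blockchain": blockchain, "contract": contract, "token_id": int(token_id)}

print(parse_ual("did:dkg:otp:20430/0x1a061136ed9f5ed69395f18961a0a535ef4b3e5f/2620264"))
# {'blockchain': 'otp:20430', 'contract': '0x1a061136ed9f5ed69395f18961a0a535ef4b3e5f', 'token_id': 2620264}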

Then, you can retrieve the same product from the DKG by passing the UAL to dkg.asset.get():

get_asset_result = dkg.asset.get(create_asset_result["UAL"])
print(json.dumps(get_asset_result, indent=4))

The output will be:

{
"operation": {
"publicGet": {
"operationId": "c138515a-d82c-45a8-bef9-82c7edf2ef6b",
"status": "COMPLETED"
}
},
"public": {
"assertion": "<https://build.nvidia.com/nvidia/rerank-qa-mistral-4b> <http://schema.org/category> \"Retrieval\" .\n<https://build.nvidia.com/nvidia/rerank-qa-mistral-4b> <http://schema.org/deployer> \"nvidia\" .\n<https://build.nvidia.com/nvidia/rerank-qa-mistral-4b> <http://schema.org/description> \"GPU-accelerated model optimized for providing a probability score that a given passage contains the information to answer a question.\" .\n<https://build.nvidia.com/nvidia/rerank-qa-mistral-4b> <http://schema.org/name> \"rerank-qa-mistral-4b\" .\n<https://build.nvidia.com/nvidia/rerank-qa-mistral-4b> <http://schema.org/tags> \"information retrieval\" .\n<https://build.nvidia.com/nvidia/rerank-qa-mistral-4b> <http://schema.org/tags> \"rerank-qa-mistral-4b\" .\n<https://build.nvidia.com/nvidia/rerank-qa-mistral-4b> <http://schema.org/tags> \"text retrieval\" .\n<https://build.nvidia.com/nvidia/rerank-qa-mistral-4b> <http://www.w3.org/1999/02/22-rdf-syntax-ns#type> <http://schema.org/Product> .",
"assertionId": "0x09d8d7c5b82bd09bc3f51770f575e15f1157c6292652d977afbe453932e270ef"
}
}

In this step, you’ve seen how to instruct the NVIDIA LLM to generate Product entities according to user prompts, and how to insert them into the DKG. You’ll now learn how to generate SPARQL queries for products using the LLM.

Step 3 — Generating SPARQL with the AI model

In this step, you’ll use the NVIDIA LLM to generate a SPARQL query for retrieving results from the DKG. The data that we’ll be querying consists of Knowledge Assets that represent each of the models from the NVIDIA Build platform — with the same properties as the one created in Step 2.

SPARQL is a query language for graphs and is structurally similar to SQL. Just like SQL, it has a SELECT and a WHERE clause, so as long as you’re familiar with SQL you should be able to follow the structure of the queries; the short example below shows the parallel.
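
For instance, the SQL query SELECT name FROM products WHERE category = 'Retrieval' would translate roughly to the following graph query (a generic illustration, not one of the DKG queries used later):

```sparql
PREFIX schema: <http://schema.org/>

SELECT ?name
WHERE { ?product a schema:Product ;
        schema:category "Retrieval" ;
        schema:name ?name . }
```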

The data that you’ll be querying is related to Products, stored in the DKG as Knowledge Assets.

Similarly to before, you’ll need to instruct the LLM on what to do:

all_categories = ["Biology", "Gaming", "Visual Design", "Industrial", "Reasoning", "Retrieval", "Speech"]
all_tags = ["3d-generation", "automatic speech recognition", "chat", "digital humans", "docking", "drug discovery", "embeddings", "gaming", "healthcare", "image generation", "image modification", "image understanding", "language generation", "molecule generation", "nvidia nim", "protein folding", "ranking", "retrieval augmented generation", "route optimization", "text-to-3d", "advanced reasoning", "ai weather prediction", "climate science"]

instruction_message = '''
You have access to data connected to the new NVIDIA Build platform and the products available there.
You have a schema in JSON-LD format that outlines the structure and relationships of the data you are dealing with.
Based on this schema, you need to construct a SPARQL query to retrieve specific information from the NVIDIA products dataset that follows this schema.

The schema is focused on AI products and includes various properties such as name, description, category, deployer, URL and tags related to the product.
My goal with the SPARQL queries is to retrieve data from the graph about the products, based on the natural language question that the user posed.

Here's an example of a query to find products from category "AI Weather Prediction":
```sparql
PREFIX schema: <http://schema.org/>

SELECT ?product ?name ?description ?ual

WHERE { ?product a schema:Product ;
GRAPH ?g
{ ?product schema:tags "ai weather prediction" ; schema:name ?name ; schema:description ?description }

?ual schema:assertion ?g
FILTER(CONTAINS(str(?ual), "20430")) }```

Pay attention to retrieving the UAL, this is a mandatory step of all your queries. After getting the product with '?product a schema:Product ;' you should wrap the next conditions around GRAPH ?g { }, and later use the graph retrieved (g) to get the UAL like in the example above.

Make sure you ALWAYS retrieve the UAL no matter what the user asks for and filter whether it contains "20430".
Make sure you always retrieve the NAME and the DESCRIPTION of the products.

Only return the SPARQL query wrapped in ```sparql ``` and DO NOT return anything extra.
'''

The instruction_message prompt contains the instructions in natural language. You describe the schema of a Product object (in JSON-LD notation) and provide an example SPARQL query in the appropriate format for the DKG. You also instruct the model to pay attention to the example and to return nothing except the SPARQL query.

You can now define the chat history and pass in a user prompt to get the resulting code:

limitations_instruction = '''\nThe existing categories are: {}. The existing tags are: {}'''.format(all_categories, all_tags)
user_instruction = '''Give me all NVIDIA tools which I can use for use cases related to biology.'''

chat_history = [{"role":"system","content":instruction_message + limitations_instruction}, {"role":"user","content":user_instruction}]

completion = client.chat.completions.create(
model="meta/llama2-70b", # NVIDIA lets you choose any LLM from the platform
messages=chat_history,
temperature=0,
top_p=1,
max_tokens=1024,
)

answer = completion.choices[0].message.content
print(answer)

The output will look similar to this:

```sparql
PREFIX schema: <http://schema.org/>

SELECT ?product ?name ?description

WHERE { ?product a schema:Product ;
GRAPH ?g
{ ?product schema:category "Biology" ;
?product schema:name ?name ;
?product schema:description ?description }

?ual schema:assertion ?g
FILTER(CONTAINS(str(?ual), "20430")) }
```

This SPARQL query retrieves all products that have the category "Biology" and returns their names and descriptions. The `GRAPH ?g` clause is used to retrieve the graph that contains the product information, and the `FILTER` clause is used to filter the results to only include products that have a UAL that contains "20430".

You can employ a similar strategy to clean the result from the Markdown code formatting:

def clean_sparql_query(input_string):
start_index = input_string.find("```sparql")
end_index = input_string.find("```", start_index + 1)
if start_index != -1 and end_index != -1:
cleaned_query = input_string[start_index + 9:end_index].strip()
return cleaned_query
else:
return input_string

query = clean_sparql_query(answer)
print(query)

The output will now be clean SPARQL:

PREFIX schema: <http://schema.org/>

SELECT ?product ?name ?description

WHERE { ?product a schema:Product ;
GRAPH ?g
{ ?product schema:category "Biology" ;
?product schema:name ?name ;
?product schema:description ?description }

?ual schema:assertion ?g
FILTER(CONTAINS(str(?ual), "20430")) }

Step 4 — Querying the OriginTrail DKG

Querying the DKG is very easy with SPARQL. You only need to specify the query and the repository to search:

query_result = dkg.graph.query(query, "privateCurrent")
print(query_result)

The privateCurrent option ensures that the SPARQL query retrieves the latest state of Knowledge Assets in the DKG, as it includes the private and public data of the latest finalized state of the Graph.
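
Other graph states can be queried by changing the repository argument. As an aside, and assuming the repository names documented for dkg.py (worth verifying for your client version), a public-data-only query would look like this:

query_result_public = dkg.graph.query(query, "publicCurrent")  # latest finalized public data only
print(len(query_result_public))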

An example result for the above query looks like this:

[
{
'product': 'https://build.nvidia.com/nvidia/molmim-generate',
'description': '"MolMIM performs controlled generation, finding molecules with the right properties."',
'name': '"molmim-generate"',
'ual': 'did:dkg:otp:20430/0x1a061136ed9f5ed69395f18961a0a535ef4b3e5f/2619549'
},
{
'product': 'https://build.nvidia.com/meta/esmfold',
'description': '"Predicts the 3D structure of a protein from its amino acid sequence."',
'name': '"esmfold"',
'ual': 'did:dkg:otp:20430/0x1a061136ed9f5ed69395f18961a0a535ef4b3e5f/2619597'
},
{
'product': 'https://build.nvidia.com/mit/diffdock',
'description': '"Predicts the 3D structure of how a molecule interacts with a protein."',
'name': '"diffdock"',
'ual': 'did:dkg:otp:20430/0x1a061136ed9f5ed69395f18961a0a535ef4b3e5f/2619643'
}
]
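
Note that the literal values in the result still carry their RDF quoting (for example '"esmfold"'). The following sketch, a step assumed here rather than shown in the original tutorial, strips that quoting and produces the products list used in Step 5:

def strip_literal(value):
    # Remove the surrounding double quotes that RDF string literals carry.
    if value.startswith('"') and value.endswith('"'):
        return value[1:-1]
    return value

products = [{key: strip_literal(value) for key, value in row.items()} for row in query_result]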

You can now use the DKG to reduce the runtime cost of the LLM, as well as ground it in trustworthy data stored in the Knowledge Assets.

Step 5 — Vector search with NVIDIA embed-qa-4 model and the DKG

In this step, you’ll build an in-memory vector DB based on the verified data queried from the DKG and invoke the NVIDIA model with it to generate more accurate results for the end-user. Sometimes a SPARQL query alone may not be enough to answer a question, and a vector database lets you find specific Knowledge Assets by semantic similarity.

First, you set up the request to the NVIDIA embed-qa-4 model that you’ll use to generate the vector embeddings:

import os
import requests

invoke_url = "https://ai.api.nvidia.com/v1/retrieval/nvidia/embeddings"

headers = {
    "Authorization": f"Bearer {os.getenv('NVIDIA_API_TOKEN')}",
    "Accept": "application/json",
}

def get_embeddings(input):
    payload = {
        "input": input,
        "input_type": "query",
        "model": "NV-Embed-QA"
    }

    session = requests.Session()

    response = session.post(invoke_url, headers=headers, json=payload)

    response.raise_for_status()
    response_body = response.json()
    # response_body is a plain dict, so the embedding is accessed by key
    return response_body["data"][0]["embedding"]
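
You can give the function a quick smoke test (this call requires a valid NVIDIA_API_TOKEN, and the printed dimensionality depends on the model):

vector = get_embeddings(["What does the rerank-qa-mistral-4b model do?"])
print(len(vector))  # the embedding dimensionality, e.g. 1024 for NV-Embed-QA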

Then, you build the in-memory vector DB by creating embeddings of each Product description:

from annoy import AnnoyIndex

def build_embeddings_index(embeddings, n_trees=10):
    dim = len(embeddings[0])
    index = AnnoyIndex(dim, 'angular')  # Using angular distance

    for i, vector in enumerate(embeddings):
        index.add_item(i, vector)

    index.build(n_trees)
    return index

def add_text_embeddings(products):
    for product in products:
        product["embedding"] = get_embeddings([product["description"]])

# 'products' holds the normalized SPARQL results from Step 4
add_text_embeddings(products)
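
Annoy’s 'angular' metric measures the angle between vectors, so it ranks items like cosine similarity. A tiny standalone illustration with made-up vectors:

from annoy import AnnoyIndex

demo = AnnoyIndex(2, 'angular')
demo.add_item(0, [1.0, 0.0])
demo.add_item(1, [0.0, 1.0])
demo.build(5)
print(demo.get_nns_by_vector([0.9, 0.1], 2))  # [0, 1]: item 0 is the closer match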

Then, you can retrieve the Product that is semantically nearest to the user prompt in order to answer their question:

index = build_embeddings_index([product["embedding"] for product in products])
question = "I would like a model which will help me find the molecules with the chosen properties."

nearest_neighbors = index.get_nns_by_vector(get_embeddings(question), 1, include_distances=True)
index_of_nearest_neighbor = nearest_neighbors[0][0]

print(f"Vector search result: {products[index_of_nearest_neighbor]['description']}")
print(f"Product name: {products[index_of_nearest_neighbor]['name']}")
print(f"https://dkg.origintrail.io/explore?ual={products[index_of_nearest_neighbor]['ual']}")

The output will be similar to this:

Vector search result: Predicts the 3D structure of how a molecule interacts with a protein.
Product name: diffdock
https://dkg-testnet.origintrail.io/explore?ual=did:dkg:otp:20430/0x1a061136ed9f5ed69395f18961a0a535ef4b3e5f/2619643
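
To close the dRAG loop, you can hand the retrieved Knowledge Asset back to the chat model as grounding context. The sketch below is an illustrative addition (the prompt wording and model choice are assumptions, reusing the client from earlier):

top_product = products[index_of_nearest_neighbor]
rag_prompt = (
    "Answer the question using only this product information.\n"
    f"Name: {top_product['name']}\n"
    f"Description: {top_product['description']}\n"
    f"Question: {question}"
)

completion = client.chat.completions.create(
    model="meta/llama2-70b",
    messages=[{"role": "user", "content": rag_prompt}],
    temperature=0,
    top_p=1,
    max_tokens=256,
)
print(completion.choices[0].message.content)

Conclusion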

You have now created a Python project which uses tools from the NVIDIA Build platform to create and query verifiable Knowledge Assets on the OriginTrail DKG. You’ve seen how to instruct an LLM to generate SPARQL queries from natural language inputs and query the DKG with the resulting code, as well as how to create embeddings and use vector similarity search to find the right Knowledge Assets.

Additionally, you’ve explored the capabilities of the NVIDIA Build platform and how to use it with the DKG, offering versatile options for both structured data querying with SPARQL and semantic similarity search with vectors. With these tools at your disposal, you’re well-equipped to tackle a wide range of tasks requiring knowledge discovery and retrieval by using the decentralized RAG (dRAG).

Decentralized RAG with OriginTrail DKG and NVIDIA Build ecosystem was originally published in OriginTrail on Medium, where people are continuing the conversation by highlighting and responding to this story.


Oasis Open Projects

Building Trust in AI with Open Standards

Open standards in artificial intelligence (AI) are important for a number of reasons: Interoperability: Open standards allow different AI systems to work together seamlessly, regardless of who developed them or what platform they run on. This means that data and services can be shared across different systems, increasing efficiency and reducing costs. Innovation: Open standards […] The post Building Trust in AI with Open Standards appeared first on OASIS Open.

By Francis Beland, Executive Director, OASIS Open

Open standards in artificial intelligence (AI) are important for a number of reasons:

Interoperability: Open standards allow different AI systems to work together seamlessly, regardless of who developed them or what platform they run on. This means that data and services can be shared across different systems, increasing efficiency and reducing costs.

Innovation: Open standards encourage innovation by providing a common framework for developers to work within. This can lead to the development of new AI tools and techniques that can benefit a wide range of users.

Transparency: Open standards can help increase the transparency of AI systems, making it easier for users to understand how they work and how they make decisions. This is particularly important in applications such as healthcare, finance, and legal, where transparency and accountability are critical.

Accessibility: Open standards can help make AI more accessible to a wider range of users, including those who may not have the resources to develop their own systems. This can help democratize access to AI technology and promote inclusivity.

Trust: Open standards can help build trust in AI by establishing a common set of ethical principles and technical standards that developers can adhere to. This can help address concerns around bias, privacy, and security, and promote responsible AI development and deployment.

The post Building Trust in AI with Open Standards appeared first on OASIS Open.


Hyperledger Foundation

Blockchain Pioneers: Hyperledger Burrow

As we laid out in our Helping a Community Grow by Pruning Inactive Projects post, there is an important life cycle to well governed open source projects. Since our launch in 2015, Hyperledger Foundation has hosted a number of now retired projects that helped drive innovation and advanced the development of enterprise-grade blockchain technologies. This series will look back at the

As we laid out in our Helping a Community Grow by Pruning Inactive Projects post, there is an important life cycle to well governed open source projects. Since our launch in 2015, Hyperledger Foundation has hosted a number of now retired projects that helped drive innovation and advanced the development of enterprise-grade blockchain technologies. This series will look back at the impact of these pioneering projects.


Trust over IP

Authentic Chained Data Containers (ACDC) Task Force Announces Public Review

A blueprint for creating truly decentralized, authentic, and verifiable ecosystems of identifiers, “credentials”, and attestations The post Authentic Chained Data Containers (ACDC) Task Force Announces Public Review appeared first on Trust Over IP.

The Authentic Chained Data Containers (ACDC) Task Force at the Trust Over IP Foundation is pleased to request public review of the following deliverables:

Key Event Receipt Infrastructure (KERI) specification
Authentic Chained Data Containers specification
Composable Event Streaming Representation specification

Together, this suite of specifications provides a blueprint for creating truly decentralized, authentic, and verifiable ecosystems of identifiers, “credentials” [see footnote], and attestations.

The specifications describe a series of unique, innovative features:

Pre-rotation of keys, enabling truly unbounded term identifiers;
Cryptographic root-of-trust;
Chained “credentials” [see footnote] with fully verifiable proof of ownership and proof of authorship;
A serialization format that is optimized for both text and binary representations equally with unique properties that support lookahead streaming for uncompromised scalability.

This suite of specifications contains additional sub-specifications including Out-Of-Band Introductions, Self-Addressing Identifiers and a revolutionary “path signature” approach for signed containers required to provide a comprehensive solution for Organizational Identity.

With the launch of the vLEI Root of Official Trust this suite of specifications saw its first production deployment through the Python reference implementation in 2022.

The Task Force expects feedback to be provided by April 20, 2024 via GitHub issues on the following repositories using the ToIP Public Review Process:

https://github.com/trustoverip/tswg-keri-specification/issues
https://github.com/trustoverip/tswg-acdc-specification/issues
https://github.com/trustoverip/tswg-cesr-specification/issues

Licensing Information:

The Trust Over IP Foundation Technology Stack Working group deliverables are published under the following licenses:

Copyright mode: OWFa 1.0 (available at https://www.openwebfoundation.org/the-agreements/the-owf-1-0-agreements-granted-claims/owfa-1-0)
Patent mode: OWFa 1.0 (available at https://www.openwebfoundation.org/the-agreements/the-owf-1-0-agreements-granted-claims/owfa-1-0)
Source code: Apache 2.0 (available at http://www.apache.org/licenses/LICENSE-2.0.html)

Note: The Task Force considers “credentials” or “verifiable credentials” as termed by the W3C, only one use and a subset of ACDCs.

The post Authentic Chained Data Containers (ACDC) Task Force Announces Public Review appeared first on Trust Over IP.

Wednesday, 20. March 2024

Project VRM

Personal AI at VRM Day and IIW

Most AI news is about what the giants (OpenAI/Microsoft, Meta, Google/Apple, Amazon, Adobe, Nvidia) are doing (seven $trillion, anyone?), or what AI is doing for business (all of Forbes’ AI 50). Against all that, personal AI appears to be about where personal computing was in 1974: no longer an oxymoron but discussed more than delivered. […]

Prompt: A woman uses personal AI to know, get control of, and put to better use all available data about her property, health, finances, contacts, calendar, subscriptions, shopping, travel, and work. Via Microsoft Copilot Designer, with spelling corrections by the author.

Most AI news is about what the giants (OpenAI/Microsoft, Meta, Google/Apple, Amazon, Adobe, Nvidia) are doing (seven $trillion, anyone?), or what AI is doing for business (all of Forbes’ AI 50). Against all that, personal AI appears to be about where personal computing was in 1974: no longer an oxymoron but discussed more than delivered.

For evidence, look up “personal AI.” All the results will be about business (see here and here) or “assistants” that are just suction cups on the tentacles of giants (Siri, Google Assistant, Alexa, Bixby), or wannabes that do the same kind of thing (Lindy, Hound, DataBot).

There may be others, but three exceptions I know are Kin, Personal AI and Pi.

Personal AI is finding its most promoted early uses on the side of business more than the side of customers. Zapier, for example, explains that Personal AI “can be used as a productivity or business tool.”

Kin and Pi are personal assistants that help you with your life by surveilling your activities for your own benefit. I’ve signed up for both, but have only experienced Pi. When I ask it to help me with the stuff outlined in (and under) the AI-generated image above, it wants to hook me up with a bunch of siloed platforms that cost money, or to do geeky things (PostgreSQL, MongoDB, Python) on my own computer. Provisional conclusion: Pi means well, but the tools aren’t there yet. [Later… Looks like it’s going to morph into some kind of B2B thing, or be abandoned outright, now that Inflection AI’s CEO, Mustafa Suleyman, has gone to Microsoft. Hmm… will Microsoft do what we’d like in this space?]

Open source approaches are out there: OpenDAN, Khoj, Kwaai, and Llama are four, and I know at least one will be at VRM Day and IIW.

So, since personal AI may finally be what pushes VRM into becoming a Real Thing, we’ll make it the focus of our next VRM Day.

As always, VRM Day will precede IIW in the same location: the Boole Room of the Computer History Museum in Mountain View, just off Highway 101 in the heart of Silicon Valley. It’ll be on Monday, 15 April, and start at 9am. There’s a Starbucks across the street and ample parking because the museum is officially closed on Mondays, but the door is open. We lunch outdoors (it’s always clear) at the sports bar on the other corner.

Registration is open now at this Eventbrite link:

https://vrmday2024a.eventbrite.com

You can also just show up, but registering gives us a rough headcount, which is helpful for bringing in the right number of chairs and stuff like that.

See you there!

 


Elastos Foundation

Elastos Announces Partnership with IoTeX to Deliver Security and Access to DePIN Infrastructure

Elastos today announced a partnership with IoTeX, to deliver ID verification and validation services across the Decentralized Physical Infrastructure Networks (DePINs) specialists’ portfolio including DePINscan, DePINasset and W3bstream. DePINs lie at the intersection between crowd-sourced participation, funding and governance models and so-called Real World Assets (RWA) – tangible infrastructure s

Elastos today announced a partnership with IoTeX, to deliver ID verification and validation services across the Decentralized Physical Infrastructure Networks (DePINs) specialists’ portfolio including DePINscan, DePINasset and W3bstream.

DePINs lie at the intersection between crowd-sourced participation, funding and governance models and so-called Real World Assets (RWA) – tangible infrastructure such as buildings, equipment or other capital-intensive assets. They offer a mechanism to recruit and reward participants to maintain these assets, through the Blockchain. When the latter is combined with a physical interface such as IoT, the contributions of these so-called physical ‘node managers’ can be tracked and, in turn, rewarded with tokens whose value itself increases with the development and use of the asset.

DePINscan provides a ready-to-use dashboard with essential visualizations for IoT projects and rollouts, while W3bstream is a decentralized protocol that connects data generated in the physical world to the Blockchain world. IoTeX harnesses its innovative Roll-Delegated Proof of Stake (Roll-DPoS) consensus mechanism, designed to optimize speed and scalability for the seamless integration of IoT devices, while ensuring integrity throughout the entire process. Stakeholders cast their votes to elect specific block producers; block producers receive rewards for their contributions, which they subsequently share with the stakeholders who endorsed them.

Jonathan Hargreaves, Elastos’ Global Head of Business Development & ESG, describes the partnership as Web3’s ‘next frontier’.

“Extending the benefits of the SmartWeb in terms of disintermediation, transparency and privacy into the physical domain is a logical but nonetheless exciting next step.  Our partnership with IoTeX means that entrepreneurs and businesses of any size will now have access to infrastructure that would otherwise be off limits to them, direct and on their terms.  This epitomizes Web3’s promise to level the playing field, thanks to its unique ability to ensure irrefutable identity proof which actually requires neither party to relinquish control of the same,” he says.

Raullen Chai, IoTeX’s co-founder and CEO, explains that DePINs permit an entirely new generation of businesses and entrepreneurs to access and monetize global infrastructure – from buildings to cabling, for instance – that otherwise would be prohibitively expensive or inaccessible.    

“Our partnership with Elastos represents an important milestone.  Extending our offering to the Elastos Smart Chain (ESC) offers some compelling advantages, including direct integration with ‘Layer 2’ Bitcoin, meaning that agreements can be embedded and reconciled direct in the World’s most popular and trusted digital currency.  This is an essential capability as DePINs become more mainstream,” he says. 

Interested in staying up to date? Follow Elastos here and join our live telegram chat.


DIF Blog

DIF's work on Interoperability Profiles

The challenge  Interoperability is a basic requirement for secure identity management and seamless communication between identity systems and services. However, in a world of multiple digital identity standards and protocols, interoperability doesn’t just happen ‘out of the box’.  Identity standards and protocols tend to

The challenge 

Interoperability is a basic requirement for secure identity management and seamless communication between identity systems and services.

However, in a world of multiple digital identity standards and protocols, interoperability doesn’t just happen ‘out of the box’. 

Identity standards and protocols tend to be flexible by design, entailing a range of decisions about how they should be implemented. 

Differences in business priorities, local regulations and how these are interpreted drive divergent implementations, making interoperability hard to achieve in practice.

This means that standards are a necessary, but not sufficient part of interoperability.

Interop Profiles: reducing optionality to enable interoperability

Interop profiles describe a set of specifications and other design choices to establish interoperability. These profiles specify items like

Data models and supported formats
Protocols to transfer Verifiable Credentials (VCs)
Which Decentralized Identifier (DID) methods must be supported
Supported revocation mechanism
Supported signature suites

They also specify what’s out of scope, further reducing optionality and easing implementation. 

Profiles can be developed to achieve interoperability for a variety of needs in order to establish a trusted ecosystem.

Interop Profiles and Decentralized Identity

There is growing support for interoperability profiles that enable real-world applications of decentralized identity standards and technologies. 

For example, the US Department of Homeland Security (DHS) leads the Silicon Valley Innovation Program, which focuses (among other things) on digitization of trade documentation using Decentralized Identifiers and Verifiable Credentials. To prove interoperability, and help build confidence that the solution doesn’t result in vendor lock-in, participants have developed profiles and interoperability test suites to ensure they are able to exchange and verify trade credentials. 

The International Air Transport Association (IATA) plays a similar role in ensuring interoperability within the travel supply chain (for example, when using verifiable credentials to onboard travel agents and intermediaries to an airline's agency portal). 

The Jobs for the Future Foundation has hosted a series of interoperability events (called “JFF Plug Fests”) to select profiles and develop test harnesses demonstrating that individuals can receive and share their credentials using their choice of conformant wallets, and that the flows work across conformant issuers and relying parties.

How DIF is working to make life easier for implementers 

The interoperability challenges highlighted in this article matter for our members. 

For one thing, it’s hard to build workable products, or viable ecosystems, on top of standards and protocols with divergent implementations.

There’s also a growing need for specific approaches to decentralized identity within different industries, regions, and use cases (such as the trade, travel and employment cases mentioned above). 

Interoperability is a core part of the Decentralized Identity Foundation (DIF)’s mission.

Which is why DIF has hosted collaborative work to develop robust interoperability profiles for a number of years. 

Examples include the JWT VC Issuance Profile, which describes the technical protocols, data formats, and other requirements to enable interoperable issuance of VCs from Issuers to Wallets (see https://github.com/decentralized-identity/jwt-vc-issuance-profile ), and the JWT VC Presentation Profile, which describes the technical protocols, data formats, and other technical requirements to enable interoperable exchange of VC presentations between Wallets and Verifiers (see https://github.com/decentralized-identity/jwt-vc-presentation-profile ). 

Taking a closer look at these examples, the VC Data Model v1.1 defines the data model of Verifiable Credentials (VCs) but does not prescribe standards for transport protocol, key management, authentication, query language, et cetera. The same is true for DIDs.

A range of specifications are available, providing options for how these things (transport, key management, etc) are achieved, but if implementers have to support all possible specifications (and combinations), it would be a lot of work.

So a profile is a way to make choices and even restrictions for a certain use case, allowing all participants to establish interoperability.

Summary

Collaboration on interoperability is an essential part of the process of establishing a viable digital trust ecosystem. 

Interop profiles define specific requirements that must be followed by identity providers, relying parties, and other stakeholders.

DIF provides a neutral venue to collaborate on interop profile development. 

Together with our working group tools, best practices and IPR protection, and our members’ subject matter expertise in decentralized identity technologies, DIF is the destination of choice to host this work. 

Got a question? Email us - we’ll be happy to discuss your requirements. 


Velocity Network

Jen Berres & Mike Andrus on why HCA Healthcare is adopting verifiable digital credentials

On Mar. 8, 2024, HCA Healthcare’s Senior Vice President and Chief Human Resources Officer, Jen Berres, and Vice President of Operations and Technology, Mike Andrus, joined Velocity’s Co-founder and Head of Ecosystem, Etan Bernstein, to discuss the verifiable digital credential movement, the value to healthcare organizations in particular, and the opportunity to work together to solve for an HR cha

Identity At The Center - Podcast

A new episode of the Identity at the Center podcast is now a

A new episode of the Identity at the Center podcast is now available. This is a special Sponsor Spotlight episode, made in collaboration with our sponsor, Zilla Security. We had a great conversation with Deepak Taneja, CEO & Co-founder of Zilla Security, discussing a range of topics from how Zilla differentiates itself in the crowded IAM market to the role of Robotic Process Automation (RPA) i

A new episode of the Identity at the Center podcast is now available. This is a special Sponsor Spotlight episode, made in collaboration with our sponsor, Zilla Security. We had a great conversation with Deepak Taneja, CEO & Co-founder of Zilla Security, discussing a range of topics from how Zilla differentiates itself in the crowded IAM market to the role of Robotic Process Automation (RPA) in the identity lifecycle.

You can listen to this episode on our website, idacpodcast.com, or in your favorite podcast app. Don't miss it!

#iam #podcast #idac


Next Level Supply Chain Podcast with GS1

Future-Proofing Retail with RFID and 2D Barcodes with Sarah Jones Fairchild

Radio frequency identification (RFID) and 2D barcodes are transforming how we handle the supply chain.  Sarah Jones Fairchild, Vice President of Sales Operations at SWIM USA, talks 2D barcode applications for customer safety, efficiency in retail checkout, inventory management, and the broader implications for companies as they prepare for the technological demands of the future. Sarah expl

Radio frequency identification (RFID) and 2D barcodes are transforming how we handle the supply chain. 

Sarah Jones Fairchild, Vice President of Sales Operations at SWIM USA, talks 2D barcode applications for customer safety, efficiency in retail checkout, inventory management, and the broader implications for companies as they prepare for the technological demands of the future. Sarah explains the importance of high-quality data and the impact of incorrect data on consumers. She also touches on the potential for these technologies to address industry-specific needs and regulatory requirements. 

Sarah highlights her personal experience with tech at home and work, specifically how it helps align information for everyone. The discussion emphasizes the importance of GS1 standards for ensuring compatibility in the supply chain and the necessity of proper data management to fully leverage RFID and 2D barcode capabilities. The conversation also covers supply chain tracking information for business owners of all types and why RFID can take a few years to implement. 

 

Key takeaways: 

Integrating RFID and 2D barcode technologies in supply chain operations is essential for improving accuracy and efficiency.

Data quality and management are challenging across industries, particularly with the need for high compatibility and usability standards.

Companies must embrace technologies such as RFID and 2D barcodes for the future.

 

Resources: 

What Is RFID Technology, and How Does It Work?

2D Barcodes: Changing the way you eat, shop, and live

Sunrise 2027: The Next Dimension in Barcodes

Enhance Your Supply Chain Visibility

 

Connect with GS1 US:

Our website - www.gs1us.org

GS1 US on LinkedIn

 

Connect with guest:

Sarah Jones Fairchild on LinkedIn

Check out SWIM USA


Digital Identity NZ

Hello Autumn: Let’s Dive into Serious Work Together

If the first two months of 2024 for Digital Identity NZ are anything to go by, this year is certainly turning out to be every bit as busy as 2023. The post Hello Autumn: Let’s Dive into Serious Work Together appeared first on Digital Identity New Zealand.

Kia ora,

If the first two months of 2024 for Digital Identity NZ are anything to go by, this year is certainly turning out to be every bit as busy as 2023. It is a different kind of busy, with more collaboration and partnership engagement needed to ‘get things done’ in the digital identity domain, against a backdrop of regulation and economic headwinds.

A couple of weeks ago, the year’s first Coffee Chat saw good attendance as did last month’s Air New Zealand and Authsignal sponsored webinar on passkeys. This exclusive member-only event shared how DINZ members, Authsignal and Air New Zealand worked together to deliver a world class implementation of passkeys to secure Air New Zealand’s customers accounts. Speaking of Authsignal, founder and DINZ Executive Council Member Justin Soong wrote this exceptional thought piece on AI published in Forbes last month. And DINZ member SSS – IT Security Specialists received this accolade!

Next week, members will receive a personal email from me seeking expressions of interest, particularly from digital ID service and attribute providers, to participate in an investigative sprint early next month run by DINZ member PaymentsNZ. The aim is to surface the digital identity-related issues that people encounter in the payments industry, and develop best practice requirements to overcome them as part of PaymentsNZ’s Next Generation Payments programme. Stay tuned.

We kick off April with a lunchtime fireside chat; Digital Health Identity: History, current state and the future with two Te Whatu Ora specialists. There’s so much happening in this space. You can find out more and register here.

If you’re getting the impression that April is the month for digital identity, you’re correct! Tuesday 9 April is World Identity Management Day! While co-founded by the Identity Defined Security Alliance and the National Cybersecurity Alliance in the US in 2021, the day is recognised in many countries globally. In its fourth year, the 2024 Virtual Conference brings together identity and security leaders and practitioners from all over the world to learn and engage.

April is also a favourite time of year to publish research that helps to level-set our own strategies and plans, as DINZ did last year. This Australian research forwarded by a public sector member, would probably show similar results in NZ, as reflected in DINZ member InternetNZ’s insights research. And the EU digital wallet is taking shape as it aims to showcase a robust and interoperable platform for digital identification, authentication and electronic signatures based on common standards across the European Union. We hope to continue our research and additional initiatives for 2024, and we’re continually looking for support in the way of sponsorship from our members. Click here to find out how you can support DINZ’s research, and future ambitions.

Ngā mihi

Colin Wallis
Executive Director, Digital Identity NZ

Read the full news here: Hello Autumn: Let’s Dive into Serious Work Together

SUBSCRIBE FOR MORE

The post Hello Autumn: Let’s Dive into Serious Work Together appeared first on Digital Identity New Zealand.

Tuesday, 19. March 2024

Hyperledger Foundation

Why Hyperledger Besu is a Top Choice for Financial Use Cases

Hyperledger Besu has emerged as a preferred runtime for EVM-based financial initiatives worldwide. For projects like tokenization, settlements, CBDCs (Central Bank Digital Currencies), and trade finance, Besu stands out for its robust security features, versatility in network construction, performance, pluggability, and enterprise-friendly licensing and programming language.

Hyperledger Besu has emerged as a preferred runtime for EVM-based financial initiatives worldwide. For projects like tokenization, settlements, CBDCs (Central Bank Digital Currencies), and trade finance, Besu stands out for its robust security features, versatility in network construction, performance, pluggability, and enterprise-friendly licensing and programming language.

Monday, 18. March 2024

FIDO Alliance

Tech Telegraph: Best PC and laptop security accessories 2024

If you haven’t had the pleasure of using biometrics on a device for authentication through Windows Hello, you’re missing out. It’s much faster and easier than having to type in […]

If you haven’t had the pleasure of using biometrics on a device for authentication through Windows Hello, you’re missing out. It’s much faster and easier than having to type in your password.


Android Headlines: X Android App Beta Gets Password-less Passkeys Authentication Support

Passkeys enhance security by eliminating traditional passwords and relying on the interaction between Private and Public keys for user authentication, reducing the instance of phishing attacks and data breaches. Passkeys […]

Passkeys enhance security by eliminating traditional passwords and relying on the interaction between Private and Public keys for user authentication, reducing the instance of phishing attacks and data breaches. Passkeys are gaining traction among various platforms, including websites, gaming platforms, and Windows 11 apps.


The New Stack: 3 Steps to Make Logins with Passkeys Reliable

Passkeys offer modern and secure authentication by enabling cryptography-backed user authentication with a frictionless user experience. With users becoming more accustomed to passkeys, 2024 is the year to ditch passwords […]

Passkeys offer modern and secure authentication by enabling cryptography-backed user authentication with a frictionless user experience. With users becoming more accustomed to passkeys, 2024 is the year to ditch passwords and upgrade to passkeys with these considerations in mind.


Identity At The Center - Podcast

It’s time for the latest episode of the Identity at the Cent

It’s time for the latest episode of the Identity at the Center Podcast! We had the pleasure of welcoming back Andi Hindle, the Conference Chair for Identiverse, for an in-depth discussion about the planning and unique aspects of the Identiverse conference. We explore whether Identiverse is a Digital Identity conference or an IAM conference. Looking forward to an enlightening conversation? Listen t

It’s time for the latest episode of the Identity at the Center Podcast! We had the pleasure of welcoming back Andi Hindle, the Conference Chair for Identiverse, for an in-depth discussion about the planning and unique aspects of the Identiverse conference. We explore whether Identiverse is a Digital Identity conference or an IAM conference. Looking forward to an enlightening conversation? Listen to the full episode at idacpodcast.com or in your favorite podcast app.

#iam #podcast #idac

Friday, 15. March 2024

Identity At The Center - Podcast

Join us for a special Friday episode of The Identity at the

Join us for a special Friday episode of The Identity at the Center Podcast. We discussed the rapidly evolving world of Privileged Access Management with our guest Paul Mezzera. We talked about the driving forces behind these changes and what the future might hold. Listen to our conversation at idacpodcast.com or in your favorite podcast app. #iam #podcast #idac

Join us for a special Friday episode of The Identity at the Center Podcast. We discussed the rapidly evolving world of Privileged Access Management with our guest Paul Mezzera. We talked about the driving forces behind these changes and what the future might hold. Listen to our conversation at idacpodcast.com or in your favorite podcast app.

#iam #podcast #idac

Thursday, 14. March 2024

Berkman Klein Center

Accuracy, Incentives, Honesty: Insights from COVID-19 Exposure Notification Apps

The next pandemic response must respect user preferences or risk low adoption By Elissa M. Redmiles and Oshrat Ayalon Photo by Mika Baumeister on Unsplash Four years after COVID-19 was first declared a pandemic, policy makers, companies and citizens alike have moved on. The CDC no longer offers separate guidance for COVID-19. Apple and Google have shut down their exposure notificat

The next pandemic response must respect user preferences or risk low adoption

By Elissa M. Redmiles and Oshrat Ayalon

Photo by Mika Baumeister on Unsplash

Four years after COVID-19 was first declared a pandemic, policy makers, companies and citizens alike have moved on. The CDC no longer offers separate guidance for COVID-19. Apple and Google have shut down their exposure notification infrastructure, which was used heavily in the US and Europe. As COVID-19 spread, technologists were called to serve by building and deploying exposure notification apps to scale parts of the contact tracing process. These apps allowed users to report when they tested positive for COVID-19 and to notify other users when they had been in the vicinity of an infected user. But getting people to use exposure notification apps during the pandemic proved challenging.

More than three million lives have been lost to COVID-19 over the past four years. Any hope of losing fewer lives during the next pandemic rests on reflection: what did we do, what can we learn from it, and what can we do better next time? Here, we offer five key lessons-learned from research on COVID-19 apps in the US and Europe that can help us prepare for the next pandemic.

Privacy is important, but accuracy also matters

Privacy was the primary focus in early exposure notification apps, and rightfully so. The apps all trace their users’ medical information and movements in various ways, and may store some or all of that information in a central database in order to inform other users of potential infection. The misuse of this information could easily result in unintentional, or even intentional, harm.

However, research into whether (and how) people used exposure notification apps during the pandemic showed that privacy might not be the most important factor. People care about accuracy, or an app’s rate of incorrect reports of COVID-19 exposure (both false positives and false negatives), which may have also influenced rates of public app adoption. Yet, we still know little about how effective the deployed exposure notification apps were. Future apps will need to have measurement tools and methods designed into them before they are released to accurately track their usefulness.

We need to better understand the role of incentives

Researchers discovered that using direct incentives, such as monetary compensation, to get people to install exposure notification apps worked at first, but had little effect in the long term. In fact, one field study found that people who received money were less likely to still be using the app eight months later than those who didn’t. Paying people to download a contact tracing app is even less effective when the app is perceived to be of poor quality or inaccurate. However, monetary incentives may be able to “compensate” when the app is perceived to be costly in other ways, such as eating up mobile data.

Given the ethical problems and lack of success with direct incentives, focusing on indirect incentives, such as functionality, may be key to increasing adoption. Exposure notification apps have the potential to serve a greater purpose during pandemics than merely exposure notification. Our research found that people using exposure notification apps wanted them to serve as a “one-stop-shop” for quick receipt of test results, information on the state of public health in their region, and assistance finding testing centers.

Future app design needs to examine user wants and expectations to ensure widespread adoption. This is hardly a new concept — every successful “fun” app begins with this user-centered model. Apps that provide these extra benefits to users will not only be better adopted, they will also see more frequent and prolonged use.

…Over a third of the Coronalert app users we interviewed believed that it tracked their location, despite repeated communications over the course of a year that it used proximity rather than location to detect possible exposures.

Honesty is the most effective communication strategy

Exposure notification apps are often framed to the public as having inherent individual benefits: if you use this app, you’ll be able to tell when you’ve been exposed to a disease. In reality, exposure notification apps have a stronger collective benefit of preventing the overall spread of disease in communities. Being honest with potential users about the true benefits is more effective than playing up the less significant individual benefit. When examining how to best advertise Louisiana’s exposure notification app, we found that people were most receptive to the app when its collectivistic benefits were centered.

Honesty and openness in privacy is also essential, especially when it comes to data collection and storage. Despite this transparency, however, people may still make assumptions based on false preconceptions or logic. For example, over a third of the Coronalert app users we interviewed believed that it tracked their location, despite repeated communications over the course of a year that it used proximity rather than location to detect possible exposures.

Integration with existing health systems is essential

There was a disconnect between COVID-19 exposure notification apps and public healthcare systems, even in countries with universal healthcare and government-supported apps. Belgium’s Coronalert app, for example, allowed users to receive their test results faster by linking their test to their app using a unique code. But, testing center staff were not trained on the app and failed to prompt users for that code. Not only was receiving test results a primary motivator in getting people to use the app; failing to link positive results to specific app users reduced the app’s efficacy.

This disconnect may be far greater in countries without universal healthcare or where exposure notification apps are privately created. In order for these apps to be effective, developers must collaborate with public health workers to develop a shared understanding of how testing centers operate, determine the information needed to provide accurate tracking, and decide on the best way to follow up on potential infections.

Resourcing technical capacity is critical

A wide range of exposure notification apps were developed to combat COVID-19, and by many different organizations. In the absence of immediate government action, many of the earliest efforts were led by universities or volunteer efforts. Academics developed the DP3T proximity tracing protocol, which guided Google and Apple’s development of exposure notification infrastructure for Android and iOS phones.

However, privatization of exposure notification infrastructure created an enormous potential for private medical and other information to fall into the hands of corporations who are in the business of big data. It also subjected exposure notification technology to private company’s rules (and whims).

Google and Apple released exposure notification infrastructure in April 2020 but did not release direct-to-user exposure notification functionality until later in the pandemic. This decision left the development of exposure notification apps to public health agencies that lacked the resources and technical capacity to do so. Volunteers stepped in to fill this void. For example, the PathCheck foundation developed exposure notification apps for 7 states and countries on top of the Google-Apple Exposure Notification infrastructure.

“…We need to eliminate these scattered responses, align incentives, and integrate the strengths and perspectives of public, private, and academic bodies to develop protocols, models, and best practices.”

While it is natural for universities to support the public good, and encouraging that private citizens volunteered so much of their time and resources to do so, they should not have to in the next pandemic. To respond to future pandemics, we need to eliminate these scattered responses, align incentives, and integrate the strengths and perspectives of public, private, and academic bodies to develop protocols, models, and best practices.

Applying the lessons learned

Building tech responsibly means not just considering privacy, but providing technology that respects user preferences. When people give up their data, they expect a benefit — be that a collective benefit, such as fighting a pandemic or helping cancer research, or an individual one. They likewise expect utility: apps that are accurate, achieve their goals, and provide a holistic set of features.

If we continue to build tech based on our assumptions of what users want, we risk low adoption of these technologies. And during times of crisis, such as this still-ongoing COVID-19 pandemic, the consequences of low adoption are dire.

Elissa M. Redmiles is a computer scientist specializing in security and privacy for marginalized & vulnerable groups at Georgetown University and Harvard’s Berkman Klein Center.

Oshrat Ayalon is a human-computer interaction researcher focusing on privacy and security at the University of Haifa.

Accuracy, Incentives, Honesty: Insights from COVID-19 Exposure Notification Apps was originally published in Berkman Klein Center Collection on Medium, where people are continuing the conversation by highlighting and responding to this story.


Elastos Foundation

Elastos Launches Grant Program to Accelerate Deployment of “Smart Bitcoin” Applications

Visit Destiny Calls Website Page! Today Elastos, a pioneer in blockchain technology announced the launch of its Destiny Calls Program. Elastos is the creator of BeL2, the first Bitcoin Layer 2 applying zero-knowledge technology to enable the direct development and management of  ‘Bitcoin-native’ smart contracts.  The new program is now welcoming applications from the digital […]

Visit Destiny Calls Website Page!

Today Elastos, a pioneer in blockchain technology, announced the launch of its Destiny Calls Program. Elastos is the creator of BeL2, the first Bitcoin Layer 2 applying zero-knowledge technology to enable the direct development and management of ‘Bitcoin-native’ smart contracts.

The new program is now welcoming applications from the digital entertainment, gaming and leisure sector utilising Elastos’ decentralised infrastructure, including BeL2, to deliver Bitcoin-denominated services and experiences. The initial cohort of 6 to 8 projects will be backed by up to 100,000 ELA in funding, equivalent to approximately $378,000 USD, to kick-start a new and non-invasive approach to Layer 2 solutions. The program is a key part of Elastos’ ongoing mission to accelerate the development of the user-controlled SmartWeb.

“With the recent launch of Elastos’ BeL2, innovators and entrepreneurs now have access to the functionality of layer 2 blockchains backed by the unparalleled security of Bitcoin,” said Jonathan Hargreaves, Global Head of Business Development & ESG at Elastos. “Bitcoin Layer 2 promises to unlock various applications that will underpin the SmartWeb and has fundamentally addressed some of the capacity and functionality restrictions that have hindered the mainstream adoption of the Bitcoin ecosystem. Destiny Calls will provide crucial initial funding for teams exploring the potential of BeL2 and Elastos’ other SmartWeb infrastructure, and will accelerate the transformation of the internet into a user-driven and interactive ecosystem.”

Projects will be selected by the Destiny Calls board and reviewed with support from QuestBook, the on-chain grant funding review and administration platform. The initial cohort will be focused on three sectors: digital entertainment, gaming and leisure. In addition to funding, Elastos will provide marketing and technical support as part of the program, along with mentorships to support grantees in reaching their program milestones. Interested applicants are encouraged to visit the Destiny Calls page here.


Elastos’ Bitcoin Layer2, BeL2

The launch of Destiny Calls follows the recent launch of Elastos’ Bitcoin Layer 2, BeL2. BeL2 is the first Bitcoin Layer 2 to facilitate the creation, recognition, management and exchange of any Bitcoin-denominated smart contract directly between concerned parties, without workarounds like intermediaries, side chains or additional applications. BeL2 promises to unlock the SmartWeb by providing unprecedented functionality to Bitcoin, and is part of growing industry excitement around unlocking Layer 2 functionality on Bitcoin after the significant growth of L2s in the Ethereum ecosystem.


Pilot Recipients Announced

As part of the launch, Elastos is confirming that BeatFarm will join Destiny Calls as an inaugural member, having successfully completed pilot projects with Elastos. BeatFarm is a decentralised platform that gives artists direct access to potential collaborators, promoters, producers and industry professionals on their own terms. In collaboration with Elastos, BeatFarm is working to enable artists to establish smart contracts on their own terms, with the resulting contracts – eScriptions – secured through Bitcoin and tradable on a decentralised marketplace.

“BeatFarm’s success as a pilot project perfectly illustrates the potential of BeL2 to create sustainable business models for decentralised Web3 experiences,” adds Jonathan. “BeatFarm exemplifies our goal of supporting innovative ideas in digital entertainment, gaming and leisure through Destiny Calls.”

For more information, please visit the Destiny Calls website page.


MOBI

MOBI Joins eSTART Coalition to Help Modernize Automotive Title and Registration Processes


MOBI Joins eSTART Coalition to Help Modernize Automotive Title and Registration Processes

Coalition Advocates for Improvements to Streamline Auto Transactions

Los Angeles — 14 March 2024. MOBI, a global nonprofit Web3 consortium, is excited to announce its participation in the Electronic Secure Title and Registration Transformation (eSTART) Coalition as a founding member. eSTART is a group of leading auto industry organizations united in advocating for modern solutions to replace the paper-based processes that currently dominate state and local DMV operations.

The eSTART Coalition focuses on three key areas of vehicle transactions:

- Permitting electronic signatures on all title and registration documents;
- Adopting tools for electronic submission and processing of title and registration; and
- Enabling electronic vehicle records transfers.

Modernizing these processes will result in significant cost and time savings for consumers, state and local DMV operations, and industry participants.

Across the U.S., countless titling/registration service providers maintain unique databases and processes for vehicle registration and titling. While some of these jurisdictions have begun digitizing certain processes, many rely entirely on paper-based and manual workflows. This fragmented approach presents several pain points for Motor Vehicle Authorities (MVAs), private sector participants, and consumers, including:

- Lack of standardized processes leading to inconsistencies in data management and accessibility.
- Incurrence of substantial costs associated with paper-based systems, including storage, processing, and handling.
- Prolonged processing times and increased risk of errors due to manual verification processes.
- Missed opportunities for cost savings, efficiency gains, and enhanced customer experiences.

Addressing these pain points requires a solution that can be easily adopted across all jurisdictions rather than a solution that functions at a state, county or municipal jurisdiction level. MOBI and its members are collaborating on a Web3-enabled standardized solution to enhance efficiency and cross-border regulatory compliance in MVA operations with an interoperability framework rooted in self-sovereign data and identities. This unified framework serves as a common language, enabling organizations with diverse business processes and legacy systems to efficiently coordinate in a standardized manner without having to build and maintain new infrastructure.

The implementation of a standardized Web3 ecosystem offers a promising solution to streamline operations, increase efficiency, reduce costs, and greatly improve permissioned-only access to data. The ability to verify identities and transactions in a decentralized way can reduce odometer and titling fraud, eliminate the need for manual identity verification, improve insurance products, and enable more seamless remote transactions (e.g. online sales and road usage charging).

“We’re excited to be part of a coalition that not only shares our vision for a more streamlined and modern automotive industry but is actively working towards making it a reality,” said Tram Vo, MOBI CEO and Co-Founder. “MOBI and its members are proud to bring a unique Web3 standardized approach to this groundbreaking endeavor. Together, we’re setting the stage for a more efficient, interoperable ecosystem that empowers stakeholders through enhanced trust and data privacy for all.”

Other transportation industry organizations, including government agencies, industry partners, and associations, are encouraged to join the eSTART Coalition to advocate for these important changes. For more information about eSTART, please visit www.estartcoalition.org or contact info@estartcoalition.org.

About MOBI

MOBI is a global nonprofit Web3 consortium. We are creating standards for trusted self-sovereign data and identities (e.g. vehicles, people, businesses, things), verifiable credentials, and cross-industry interoperability. Our goal is to make the digital economy more efficient, equitable, decentralized, and sustainable while preserving data privacy for users and providers alike. For additional information about joining MOBI, please visit www.dlt.mobi.

About eSTART Coalition

The Electronic Secure Title and Registration Transformation (eSTART) Coalition is a united group of leading automotive organizations committed to modernizing and streamlining automotive title and registration processes. eSTART focuses on advocating for the implementation of efficient technology solutions to replace the paper-dependent systems currently used by DMVs. Through collective advocacy and action at the local and national levels, the coalition aims to drive significant improvement in automotive industry processes in ways that benefit all customers, DMVs and industry participants.

For more information, please visit www.estartcoalition.org.

Media Contact: Grace Pulliam, MOBI Communications Manager

Email: grace@dlt.mobi | Twitter: twitter.com/dltmobi | Linkedin: MOBI

The post MOBI Joins eSTART Coalition to Help Modernize Automotive Title and Registration Processes first appeared on MOBI | The New Economy of Movement.

Wednesday, 13. March 2024

Elastos Foundation

ELA: The Queen of Bitcoin


Bitcoin transformed finance by deploying blockchain technology, a decentralised system that replaces central authority with cryptographic trust. At its heart lies the Proof of Work (PoW) consensus algorithm, where miners expend computational energy to compete and solve complex mathematical problems, securing the network and validating transactions for BTC rewards.

This model reflects the natural competition for survival, akin to trees vying for sunlight, businesses vying for market dominance, individuals competing for a mate, or the dynamics between predators and prey — each process governed by the relentless pursuit of energy and dominance.

Bitcoin’s hashrate represents its own competitive edge in the digital realm. This hashrate, a staggering 595.79 EH/s, signifies a computational battle much like those found in nature, but on a scale that dwarfs the combined power of the world’s supercomputers, underscoring the network’s unmatched security and the near-impossibility of overpowering it.

PoW is more than a simple mechanism: it integrates nature’s laws into the digital domain, fortifying Bitcoin’s network through electricity, a tangible, physical cost. Bitcoin, now the unchallenged cornerstone of digital finance, offers a decentralised alternative that empowers individuals with financial sovereignty and freedom from central authority. It provides a secure, transparent, and accessible financial system for everyone, regardless of location or status.


Satoshi’s Vision for Merged Mining


Merged mining, or Auxiliary Proof of Work (AuxPoW), allows two different blockchains to use the same consensus mechanism. Miners can mine blocks on both chains simultaneously, submitting proof of their work to both networks. The key is that the ‘child’ blockchain, while independent in transactions and storage, relies on the ‘parent’ blockchain’s PoW for its security.
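To make that mechanism concrete, here is a minimal sketch, in Python, of the two checks at the heart of AuxPoW, under simplifying assumptions: the commitment check is reduced to a substring search, and the function names are illustrative, not Namecoin’s or Elastos’ actual code. Real implementations also verify a Merkle path from the coinbase transaction to the parent header’s Merkle root.

```python
import hashlib

def sha256d(data: bytes) -> bytes:
    """Bitcoin-style double SHA-256."""
    return hashlib.sha256(hashlib.sha256(data).digest()).digest()

def verify_auxpow(child_block_hash: bytes,
                  parent_coinbase_tx: bytes,
                  parent_header: bytes,
                  child_target: int) -> bool:
    """Sketch of the two core AuxPoW checks a 'child' chain performs.

    1. The parent chain's coinbase transaction commits to the child
       block hash, proving the miner was mining the child block too.
       (Simplified here to a substring check.)
    2. The parent block header satisfies the *child* chain's difficulty
       target, so the child chain inherits the parent's proof of work.
    """
    # Check 1: commitment to the child chain's block.
    if child_block_hash not in parent_coinbase_tx:
        return False

    # Check 2: parent header meets the child chain's target
    # (hash bytes interpreted as a little-endian integer, as in Bitcoin).
    parent_pow = int.from_bytes(sha256d(parent_header), "little")
    return parent_pow <= child_target
```

Note that the parent header only needs to meet the child chain’s (typically easier) target, which is why miners can mine both chains with no extra work.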

The concept of merged mining was introduced in a Bitcoin forum post by Satoshi Nakamoto in 2010, discussing the possibility of a new service called BitDNS to be mined simultaneously with Bitcoin. Satoshi proposed that by allowing miners to work on both chains at once, without extra effort or splitting the mining community, both networks could benefit from increased security and efficiency. The benefits include:

- Economic Assurance of Security: Merged mining with Bitcoin means a ‘child’ blockchain’s security is underwritten by the considerable economic cost of Bitcoin mining. This straightforwardly leverages the existing, well-established energy expenditure of Bitcoin for maximum security with no additional complexity.
- Resource Optimisation and Environmental Consideration: Utilising Bitcoin’s existing mining infrastructure, merged mining does not require extra energy, making it an efficient and environmentally considerate approach to securing a blockchain.
- Scalability through Proven Infrastructure: By tapping into Bitcoin’s vast network of miners, merged mining scales a ‘child’ blockchain’s security with the growth of Bitcoin’s network.

Merged mining showcases efficiency and symbiosis, much like the natural cooperation in mycorrhizal networks, bees’ cross-species pollination, and mutualistic relationships between birds and mammals. It mirrors human ingenuity in leveraging established resources, such as start-ups utilising corporate infrastructures, or solar panels and trees harnessing the sun’s energy, emphasising the smart utilisation of existing networks to bolster security and growth without additional expenditure.

Notably, Namecoin, one of the first to adopt merged mining with Bitcoin, aims at decentralising domain-name registration. Dogecoin, also known for being merge mined, actually pairs with Litecoin rather than Bitcoin, owing to their shared Scrypt algorithm. Myriadcoin’s unique approach supports multiple algorithms, including SHA-256, making it compatible with Bitcoin. Syscoin and Elastos also leverage Bitcoin’s hash power for enhanced security through merge mining.


Elastos and Bitcoin Merged Mining

Elastos, which began with the vision of creating a secure, decentralised internet, incorporated merged mining with Bitcoin in 2018. BTC.com helped mine its first block, and today its network and currency ELA benefit from over 50% of Bitcoin’s mining security. So, what does this mean?

- Elastos Utilises the Strongest Proof of Work Security Model in Existence: By merged mining with Bitcoin, Elastos capitalises on the most extensive PoW network, inheriting Bitcoin’s unparalleled security attributes. This symbiotic relationship means Elastos’ blockchain integrity is as robust as Bitcoin’s, mitigating risks without directly vying for Bitcoin’s mining resources.
- Elastos Has Achieved an Energy-Efficient Design Without Compromising Security: Energy efficiency is a major concern in cryptocurrency mining. Elastos adds transaction and block validation on its network by piggybacking on the work done by Bitcoin miners, thus maintaining high security with no additional energy requirements. This model serves as a case study in eco-conscious blockchain design.
- Elastos Offers a Unique Combination of a Decentralised Operating System with Bitcoin-Level Security: Unlike conventional blockchains, Elastos is a fully-fledged operating system for decentralised applications, secured by a blockchain layer. By integrating Bitcoin’s hash power through merged mining, it ensures a fortified environment for running dApps, differentiating itself significantly from competitors.
- Elastos Is Pioneering the True Decentralised Internet Backed by the Robustness of Bitcoin’s Network: Elastos’ aim to revamp the internet structure into a truly decentralised form is ambitious. By aligning its consensus mechanism with that of Bitcoin, it anchors its network to the tried-and-tested resilience of Bitcoin’s mining power, driving forward a new paradigm for digital communication and interaction.
- Elastos’s Ecosystem Is Designed to be Self-Sustaining and Independent, Yet Benefits Directly from Bitcoin’s Continued Growth: The design of Elastos’s ecosystem ensures it remains autonomous. As Bitcoin’s network expands and becomes more secure, Elastos indirectly benefits from these enhancements, bolstering its own proposition without the need for additional investment in security.
- Elastos May Be the Most Direct Implementation of Satoshi Nakamoto’s Vision for Merged Mining: Elastos’s use of merged mining is arguably a direct reflection of Satoshi’s initial musings on the subject. Its broad strategic outlook that includes an operating system, a carrier network, and SDKs for developers, all secured by the hash rate of Bitcoin, makes it a comprehensive and multidimensional implementation of the concept.


BTC’s Queen

Elastos, by merge mining with Bitcoin, can be likened to a queen in the chess game of digital finance, where Bitcoin holds the position of king. Just as a queen’s versatility and power are essential for protecting the king and dominating the board, Elastos’ integration with Bitcoin’s security framework amplifies the ecosystem’s resilience and innovation and gives its own ecosystem a plethora of utility. This includes:

- Transaction Fees: ELA powers Elastos by covering transaction fees, including smart contracts and asset registrations, ensuring network security and efficiency.
- Digital Asset Exchange: ELA fuels a decentralised economy in Elastos, enabling direct trade of digital assets and services, cutting out middlemen.
- Incentive Mechanism: ELA rewards participants, including miners who secure the network via merge mining with Bitcoin, enhancing security and sustainability.
- Governance: Holding ELA grants governance rights, allowing stakeholders to vote on network decisions through the Cyber Republic, promoting community-driven development.
- Decentralised Applications (DApps): ELA is essential for using DApps on Elastos, providing access to a broad range of services and expanding the ecosystem’s functionality.

Together, Bitcoin and Elastos form a formidable duo, combining the steadfast security of the king with the dynamic reach and versatility of the queen, setting the stage for a future where digital finance is both secure and boundlessly innovative. What’s more, Elastos is developing BeL2, the Bitcoin Elastos Layer 2 protocol, allowing EVM smart contracts to run directly on top of Bitcoin, a scalable BitVM innovation. What if such services enable anyone with a decentralised wallet to generate their own Bitcoin-backed algorithmic stablecoins, free from censorship? If Bitcoin introduces the concept of “Be Your Own Bank,” what if Elastos can expand the idea to “Be Your Own Central Bank,” both secured by PoW? This could drastically disrupt finance as we know it.

Interested in staying up to date? Follow Elastos here and join our live telegram.


Hyperledger Foundation

Hyperledger Mentorship Spotlight: Aries-vcx based message mediator



The world of technology has seen significant developments over the past few decades, largely driven by advancements in cryptography. These advancements have led to innovations including secure internet traffic through HTTPS and WireGuard; protected data storage via BitLocker, LUKS, and fscrypt; decentralized consensus records using Bitcoin and Ethereum; and privacy-focused messaging protocols like Signal and MLS (Messaging Layer Security).

However, despite these advances, our online identities remain controlled by third parties, whether we sign in to apps using Google or Facebook OpenID or manage "verified" accounts on platforms such as Twitter or Instagram. An emerging movement seeks to change this status quo by harnessing the transformative power of cryptography. Governments are also starting to recognize the value of self-sovereign identity (SSI)—a system in which individuals retain full control of their own digital identities.


MyData

Open position: Legal and policy specialist/ ecosystems specialist

Job title: Legal and policy specialist / ecosystems specialist
Employment type: Fixed contract
Contract duration: March 2024 through 31 March 2026, with opportunity for renewal.
Location: Remote, based in the EU, with a preference for Oslo or Helsinki.
Reports to: Executive Director

Role description

The ecosystems specialist is responsible for advancing MyData’s work to facilitate the emergence of […]

Tuesday, 12. March 2024

MOBI

Standardized Web3 Solution for Vehicle Registration, Titling, and Liens


Standardized Web3 Solution for Vehicle Registration, Titling, and Liens

Stay tuned for updates!

About Our Web3 Cross-Industry Interoperability Pilots

Alongside our global community, we’ve demonstrated several potential use cases for Citopia and Integrated Trust Network (ITN) services through various pilot projects. Together, Citopia and the ITN provide the necessary infrastructure for node operators to build out secure, seamless, globally compliant web services and applications. MOBI membership is required to operate a node on Citopia and/or the ITN. Contact us to learn more about becoming a node operator.

Overview of the Pilot and the Problem It Solves

Across the United States, there is a diverse array of jurisdictions (numbering in the thousands across states, counties, and municipalities) and titling/registration service providers, each maintaining unique databases and processes for vehicle registration and titling. Many states (AZ, DE, GA, FL, LA, MA, MD, NC, SC, PA, VA, and WI) currently mandate the use of electronic lien and title (ELT) systems. Other states have planned ELT mandates in 2024 or, more generally, are developing a digital approach to electronic vehicle titling. For example, New York and Idaho have developed or are developing processes for handling dealer reassignments electronically.

Each of these jurisdictions will maintain their own systems for these varied processes. The challenge lies in achieving interoperability between those systems through standardized communications and data reporting/exchange across jurisdictional, platform, and organizational lines while enabling each jurisdiction to maintain control over its processes. For example, today, each vehicle manufacturer or lender can have hundreds of unique identifiers assigned to them by different jurisdictions, creating confusion, mismanagement, and inefficiency.

Currently, secure digital authentication and communication rely on identifiers issued by centralized platforms to prove credentials. However, in addition to being vulnerable to fraud, identity theft, and data leaks, centralized approaches to identity management fail to address the trust problems created by the rise of decentralized services, IoT, and generative AI. As digitization advances, it will become increasingly challenging — and costly — to verify data authenticity, secure digital perimeters, and ensure cross-border regulatory compliance. This is critical for state agencies like MVAs as well as for dealers and lenders, who are responsible for executing the bulk of the registration/titling process.

Stakeholders: Vehicle Manufacturers (OEMs); Financial Institutions (FIs)/Lenders; Servicers; Dealerships; Motor Vehicle Authorities (MVAs)/Third-party Registration/Titling Providers (RTPs); State Authorized Inspectors; Third-Party Data Consolidators; Fleet Operators; Trade Associations; Vehicle Auctions; and Consumers.

Our Innovative Solution

Overcoming these challenges calls for a new solution. The White House’s Federal Zero Trust Strategy (2022) mandates that federal agencies and any organization that works with the federal government adopt a Zero Trust framework by the end of FY 2024. Zero Trust requires every entity to authenticate and validate every other entity for every single digital communication at all times. Since this is not possible at scale through Web2/centralized means, Web3 technologies and principles must be leveraged.

MOBI and its members have developed platform-agnostic, standardized “universal translators” that work with any existing legacy system or web service to enable cross-industry interoperable communication through the World Wide Web Consortium (W3C) decentralized identifier and verifiable credential frameworks. These translators, called Citopia Passports (Web3 Plug-and-Play), ensure organizations’ and customers’ data privacy, which is key to complying with the comprehensive data privacy laws being passed by many states (e.g., CA, CT, OR, TX, UT, VA).

Explore Cross-Industry Interoperability Requirements

Interested in learning more? Dive deeper on our Web3 Infrastructure Page!

Zero Trust Authentication: Cross-industry interoperability requires claims and identities to be verified for each transaction to ensure maximum security. Read the Federal Zero Trust Strategy

Infosec & Selective Disclosure: Participants must be able to selectively disclose information for transactions at the edge. Verification must be done at the moment of transaction to eliminate the need for PII storage.

Scalability and Extensibility: Cross-industry interoperability requires a shared standards-based framework to enable the creation of globally scalable multiparty applications.

Data Privacy Compliance: Cross-industry interoperability requires (1) compliance with existing global data privacy regulations and (2) the flexibility to comply with future directives.

Global Standards: Cross-industry interoperability requires a standardized system for frictionless data exchange and collaboration while allowing stakeholders to retain their legacy systems.

Decentralization: Cross-industry interoperability requires a community-owned and -operated infrastructure to (1) prevent monopolization and (2) enable consensus-based trust.

Web3 Plug-and-Play

Citopia Passports utilize W3C Decentralized Identifiers (DIDs) and Verifiable Credentials (VCs) standards. This creates an interoperability framework that provides:

Explore Citopia Passports

Decentralized, trusted identities

Digital credential issuance/verification

Interoperable communication between each stakeholder’s centralized databases

A bridge between jurisdictions, organizations, and platforms allowing each stakeholder to keep their legacy systems

The result is the reduction of errors, streamlined operations, increased efficiency, and reduced costs, as well as greatly improved permissioned access to data. More generally, this cross-industry, platform-agnostic, universal interoperability is part of what has motivated government interest worldwide in implementing and adopting standards-based digital identity and credential systems (e.g., the Department of Homeland Security (DHS) in the US; European Union Agency for Cybersecurity (ENISA) and European Self-Sovereign Identity Framework (ESSIF) in the EU).
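To illustrate what a W3C-style credential in such a framework might look like, here is a minimal sketch. The field names follow the W3C Verifiable Credentials Data Model, but the DIDs, credential type, and claim values are hypothetical placeholders, not Citopia’s actual schema.

```python
import json

# A minimal W3C-style verifiable credential, shown as a Python dict.
# The DIDs and the claim below are hypothetical placeholders; a
# production credential would also carry a cryptographic proof
# (e.g. a signature from the issuer's DID key).
vehicle_title_vc = {
    "@context": ["https://www.w3.org/2018/credentials/v1"],
    "type": ["VerifiableCredential", "VehicleTitleCredential"],
    "issuer": "did:example:mva-virginia",          # hypothetical MVA DID
    "issuanceDate": "2024-03-12T00:00:00Z",
    "credentialSubject": {
        "id": "did:example:lender-123",            # hypothetical holder DID
        "vin": "1HGCM82633A004352",
        "lienStatus": "released",
    },
    # "proof": {...}  # issuer signature would go here
}

print(json.dumps(vehicle_title_vc, indent=2))
```

Because the issuer signs the credential against its decentralized identifier, any stakeholder can verify a lien release or odometer disclosure without querying the issuer’s database directly.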

Proposed Stakeholder Meeting

MOBI is proposing a two-part meeting in the first half of 2024: part one being a meeting between the association stakeholders (e.g. AAMVA, NADA, ATAEs, NIADA, AFSA, MOBI) and their representative members, and part two being a meeting including the titling service providers. The goals of the meeting are:

- to bring together the key stakeholders to assess the pain points, needs/requirements, and path forward to achieve interoperability between the numerous centralized systems for registration/titling;
- to jointly address the opportunity to develop standardized communication between each stakeholder to achieve interoperability for registration/titling processes;
- to discuss how secure, verifiable digital identifiers and claims (using open-standard Web3 technologies) can address fundamental problems, such as each lender having hundreds of different identifiers assigned to them by different jurisdictions; and
- to finalize the scope and scale of the Standardized Web3 Solution for Titling/Registration Pilot.

Pilot Planning

In Phase 1 of the Pilot, the FSSC WG will demonstrate privacy-preserving cross-industry interoperability for Titling/Registration via standardized universal identifiers and communication/claims without the need to build new infrastructure. This will involve working with MVAs, lenders, dealers, OEMs, and service providers to demonstrate interoperability across different legacy systems and jurisdictions. At the end of Phase 1, stakeholders will have successfully created Citopia Passports and be able to use their Citopia Passport to easily authenticate each other’s identifiers and claims (such as lien release, odometer disclosures, insurance validation, etc.). Stakeholders will be able to examine the code and outputs to verify that all transactions/communications are private and only visible to the intended recipient.

In Phase 2 of the Pilot, each stakeholder will have the opportunity to run nodes, conduct research and development for their own applications, and actively participate in the pilot for a duration of 6-12 months. The FSSC WG will determine the final scope of Phase 2 after the conclusion of Phase 1.

MOBI WEB3 INFRASTRUCTURE

Explore the Future of
Cross-Industry Interoperability

Together, Citopia and the Integrated Trust Network (ITN) form our federated Web3 infrastructure for verifiable identity, location, and business automation. Learn more

JOIN MOBI

Learn How Your Organization Can Get Involved

Join our community to help shape the future of interoperability, accelerate the adoption of cutting-edge tech, and define a new era of digital trust! Submit an inquiry

Dive Deeper

Interested in learning more about MOBI, our community-owned and operated Web3 Infrastructure, and our interoperability pilots? Contact us at connect@dlt.mobi to get in touch with the team!

Get Involved

The post Standardized Web3 Solution for Vehicle Registration, Titling, and Liens first appeared on MOBI | The New Economy of Movement.

Monday, 11. March 2024

OpenID

Notice of Vote for Proposed Implementer’s Draft of OpenID for Verifiable Credential Issuance


The official voting period will be between Monday, March 25, 2024 and Monday, April 1, 2024, once the 45-day review of the specification has been completed. For the convenience of members who have completed their reviews by then, voting will actually begin on Monday, March 18, 2024.

The OpenID Connect Working Group page is https://openid.net/wg/connect/. If you’re not already a member, or if your membership has expired, please consider joining to participate in the approval vote. Information on joining the OpenID Foundation can be found at https://openid.net/foundation/members/registration.

The vote will be conducted at https://openid.net/foundation/members/polls/328.

The post Notice of Vote for Proposed Implementer’s Draft of OpenID for Verifiable Credential Issuance first appeared on OpenID Foundation.


Identity At The Center - Podcast

In the latest episode of the Identity at the Center Podcast,


In the latest episode of the Identity at the Center Podcast, we had the pleasure of speaking with Nick Mothershaw, Chief Identity Strategist at the Open Identity Exchange (OIX). We discussed the concept and functionality of digital wallets, the role of governments in issuing these wallets, and the future of smart and roaming wallets. This was a truly fascinating conversation, and I'm sure you'll find it as insightful as we did. If you're interested in the evolving landscape of identity security, this is one episode you don't want to miss!

You can listen to the episode at idacpodcast.com or on your favorite podcast app.

#iam #podcast #idac

Saturday, 09. March 2024

Elastos Foundation

Elastos Bi-Weekly Update – March 9th, 2024


In the latest Elastos Bi-Weekly Update, significant progress has been made across different areas of Elastos. Let’s take a look at some of the BeL2 and Elacity innovations!

BeL2

- Implementation of Consensus Circuit for BTC Transactions: A significant milestone has been achieved with the implementation of a consensus circuit based on Cairo 0. This circuit is designed to perform basic checks on Bitcoin (BTC) Legacy address transactions. It includes validation of elliptic curve signatures and unspent transaction output (UTXO) checks, among other crucial verification steps.
- Zero-Knowledge Proof Verification Contract: A verification contract for zero-knowledge proof of Bitcoin transactions has been successfully implemented. This contract enables the demonstration that a given transaction has passed through the consensus circuit, thereby completing the technical feasibility verification phase.
- Schnorr Signature Verification Circuit: With the introduction of the Schnorr signature verification circuit, based on Cairo 1, the groundwork has been laid for supporting advanced BTC transactions, including those involving Taproot addresses and Ordinals. This is a foundational step toward enhancing transaction security and efficiency on the blockchain.
- BTC Oracle Development: The objective is to create a BTC Oracle capable of generating zero-knowledge proofs for all types of BTC transactions. These proofs can then be submitted to Ethereum Virtual Machine (EVM) smart contracts for verification. The development team has successfully implemented zero-knowledge proof for legacy address transactions using Cairo version 0. This achievement marks a significant step towards building a comprehensive BTC Oracle framework that will eventually support all BTC OP codes, SegWit transactions, Schnorr signatures, and Taproot transactions.
- Smart Contract and Proof Verification: The development includes smart contracts and tools for verifying BTC transactions in a decentralized manner, including the creation and validation of Merkle proofs for BTC transactions (see the sketch after this list), enabling the secure and efficient handling of BTC assets within the Elastos ecosystem.
- Infrastructure Enhancements and Tools: The deployment and improvement of various infrastructure components and tools have been noted. This includes the development of contracts for asset exchange, order management, and fraud proof submission. These components are essential for the robust operation of the Elastos infrastructure, ensuring a secure, efficient, and decentralized environment for asset exchange and transaction verification.

Elacity

- Player Update for Flexible Media Streams: The player’s capability has been enhanced to accommodate a broader range of media stream combinations. It now supports playing audio-only or video-only streams, handling multiple streams by selecting the first one available. This update addresses the previous limitation where the player would break if the media was not formatted with one audio and one video stream, ensuring a more flexible and robust playback experience for diverse media types. A unified signature notification system has also been implemented, enhancing the user experience across the platform.
- Adaptive Streaming Support: Significant work on adaptive streaming support has been completed, ensuring that video playback can dynamically adjust to various internet speeds and device capabilities, optimizing the viewing experience.
- Android Connection Flow: Enhancements in the connection flow on Android devices have been made to improve usability and performance.
- ABR Selection Flow: An adaptive bitrate (ABR) selection flow has been developed to further enhance the streaming quality based on the user’s current network conditions.
- NFT Marketplace Updates: Updates to the filter in the mobile view for the NFT marketplace have been implemented, alongside adjustments to how NFTs opened from search are viewed or routed. Efforts have been made to address sync issues with NFTs, ensuring that collection displays and NFT minting processes are seamless and intuitive.
- Quality Assurance and Final Preparations: Pre-release testing and quality assurance checks have been conducted, including code reviews and fixes for specific transaction failures and playback issues. Preparations for the release include addressing feedback on collection cover image changes and ensuring that the mobile filter pop-up experience is consistent across all collection pages. Work on the backend includes fixing RPC call errors, addressing DRM playback issues on iOS, and researching efficient deployment strategies for IPFS nodes.
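As context for the Merkle-proof item above, here is a minimal sketch, in Python, of how a Merkle proof ties a Bitcoin transaction to a block header’s Merkle root. It assumes standard Bitcoin conventions (double SHA-256 over concatenated 32-byte hashes in internal byte order); it is an illustration of the underlying technique, not Elastos’ actual code.

```python
import hashlib

def sha256d(data: bytes) -> bytes:
    """Bitcoin-style double SHA-256."""
    return hashlib.sha256(hashlib.sha256(data).digest()).digest()

def verify_merkle_proof(txid: bytes, merkle_root: bytes,
                        proof: list[bytes], index: int) -> bool:
    """Walk the Merkle branch from a transaction up to the block's root.

    `index` is the transaction's position among the block's transactions;
    its low bit at each level tells us whether the sibling hash from
    `proof` sits on the left or the right of the running hash.
    """
    current = txid
    for sibling in proof:
        if index & 1:                      # we are the right child
            current = sha256d(sibling + current)
        else:                              # we are the left child
            current = sha256d(current + sibling)
        index >>= 1
    return current == merkle_root
```

A proof like this lets a contract confirm that a given BTC transaction was included in a block while checking only a logarithmic number of hashes, rather than the whole block.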

Interested in staying up to date? Follow Elastos here and join our live telegram.

Friday, 08. March 2024

FIDO Alliance

TeleMedia Online: Should All Mobile Business Apps Scrap Passwords and Integrate Biometrics?


Now that all the most advanced mobile devices on the market offer biometric authentication, it’s a good opportunity for apps to align with this and integrate it. FIDO Alliance reported that around 80 percent of data leaks are linked to passwords, so it would be useful for a better alternative to become more widespread.


Security Magazine: Cyber Insights 2024: A Dire Year for CISOs?


“CISOs are too often overlooked or low on resources, funding and/or business support to properly implement change,” adds Andrew Shikiar, executive director at FIDO. “Resting the legal liability on one individual is overlooking the vacuum of responsibility and engagement at the top of organizations that is preventing meaningful change and true cyber resilience.”


Biometric Update: FIDO Alliance ensures long-term value of its specifications in post quantum era


The FIDO Alliance is actively involved in integrating PQC into its standards to ensure long-term efficacy and security, forming working groups to understand the implications and develop migration strategies. With the addition of Prove Identity to its Board of Directors, the coalition continues its mission of shaping future standards for identity authentication.


Engadget: 1Password adds passkey support for Android


Passkey adoption is on the rise, showcased by 1Password’s support of passkeys for Android devices to provide a more secure alternative to traditional passwords through the use of public and private keys.


Human Colossus Foundation

Securing Your Digital Future: A Three-Part Series on Enhanced Privacy through Data Protection - Part 1

Part 1: Understanding the Semantic Foundation of Privacy: The Critical Role of BIT and Its Supplementary Document in Data Protection

In the rapidly evolving digital landscape, the significance of data protection has never been more pronounced. Recent developments, such as the presidential order issued by the White House on February 28th, 2024, to prevent access to sensitive personal data by overseas 'bad actors,' underscore the urgency of safeguarding personal information from exploitation. This context sets the stage for a pivotal conversation on protecting sensitive data from a data semantics perspective—the cornerstone of understanding and interpreting data correctly across diverse systems and stakeholders.

Data semantics supports data interpretability, clarity, and consistency in the digital realm. It includes utilizing data models, vocabularies, taxonomies, ontologies, and knowledge representation to accurately recognize and interpret Personally Identifiable Information (PII) and sensitive data, ensuring that digital entities comprehend the sensitivity of this information, irrespective of their domain. The Blinding Identity Taxonomy (BIT) emerges as a beacon of guidance in data protection, supporting the fight against intrusive surveillance, scams, blackmail, and other privacy violations.

Celebrating the BIT and Its Evolution

Developed by the Human Colossus Foundation (HCF) and supported by Kantara Initiative, the BIT provides a robust framework for identifying and flagging sensitive information within data sets. Its purpose is not just to adhere to privacy laws such as GDPR and CCPA but to fortify the semantic understanding of what constitutes 'sensitive data.' The BIT involves a nuanced comprehension of data attributes that, if mishandled, could lead to privacy breaches or misuse.

With notable contributions from Paul Knowles, Chair of the HCF Decentralised Semantics WG, the BIT Supplementary Document significantly enhances the comprehension of the taxonomy. As an active contributor to the Dynamic Data Economy (DDE), HCF transferred the intellectual property rights of the newly released BIT Supplementary Document on December 13th, 2023, to Kantara Initiative, a global community focused on improving the trustworthy use of identity and personal data. Although not yet incorporated into regulations like GDPR, CCPA, or similar national regulations as an official appendix, the BIT Supplementary Document's publication as an official Kantara Initiative report on March 5th, 2024, significantly enhances the BIT's utility by offering detailed insights into the BIT categories.

The release of the BIT Supplementary Document marks a significant advancement in this journey. Offering detailed insights into the 49 BIT categories, it serves as an indispensable manual for practitioners aiming to navigate the complexities of data protection. It not only enumerates what constitutes sensitive information but also elaborates on how to interpret and handle this data, ensuring semantic integrity across systems. The BIT is the world's most comprehensive taxonomy for preventing re-identification attacks, with the Supplementary Document adding further depth and clarity.

Flagging Sensitive Attributes: A Semantic Safeguard

As the BIT report recommends, flagging sensitive attributes in a schema capture base is a practice rooted in semantic precision. This approach enables data protection officers and schema issuers to identify elements that demand cryptographic encoding, minimizing the risk of re-identifying a data principal. Flagging acts as semantic annotation, marking data with an additional layer of meaning (its sensitivity or risk level), which aids compliance with data protection regulations and enhances the semantic coherence of data handling practices.

By utilizing the BIT and its Supplementary Document, practitioners have a common guideline for determining which attributes to flag. This standard practice ensures that sensitive data is understood and interpreted consistently, avoiding ambiguities that could lead to data breaches. The BIT framework empowers practitioners to embed data protection principles directly into their semantic models, making privacy a foundational aspect of data interpretation.
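As a rough illustration of what flagging can look like in practice, here is a toy sketch in Python: a schema capture base lists its BIT-flagged attributes, and flagged values are replaced with a salted digest before storage. The attribute names, the salt handling, and the digest step are assumptions made for illustration, not the official OCA/BIT wire format or an endorsed blinding scheme.

```python
import hashlib

# Toy schema "capture base": attribute names and types, plus the list of
# attributes flagged as sensitive per BIT guidance. The attribute names
# below are illustrative, not an official BIT category list.
capture_base = {
    "attributes": {"full_name": "Text", "date_of_birth": "Date", "ticket_no": "Text"},
    "flagged_attributes": ["full_name", "date_of_birth"],
}

def blind_record(record: dict, schema: dict, salt: bytes) -> dict:
    """Replace BIT-flagged values with a salted one-way digest so the
    record can be processed without exposing identifying data. The salt
    is assumed to be a per-dataset secret managed out of band."""
    out = {}
    for attr, value in record.items():
        if attr in schema["flagged_attributes"]:
            out[attr] = hashlib.sha256(salt + str(value).encode()).hexdigest()
        else:
            out[attr] = value
    return out

print(blind_record(
    {"full_name": "Ada Lovelace", "date_of_birth": "1815-12-10", "ticket_no": "A-42"},
    capture_base,
    salt=b"per-dataset-secret",
))
```

The point of the sketch is the division of labour: the schema declares what is sensitive once, and every downstream processor applies the same blinding consistently.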

Conclusion: The Semantic Imperative for Data Protection

In a digitally interconnected world, we cannot overstate the importance of data semantics as we navigate the complexities of data protection. The BIT and its Supplementary Document offer a comprehensive framework for understanding and protecting sensitive data, grounding data protection in semantic precision. As we move forward, we encourage individuals, organizations, and ecosystems to embrace these tools, ensuring that sensitive information is flagged, protected, and interpreted carefully.

BIT Supplementary Document

The BIT and its Supplementary Document enrich our toolkit for privacy preservation. The BIT is accessible in PDF and HTML formats, catering to diverse user preferences. Those seeking deeper insights can download the BIT Supplementary Document in PDF format from Kantara Initiative's Reports & Recommendations page. This invaluable resource resides under the 'Kantara Initiative Reports' section, clearly labeled as "Supplementary Report to Blinding Identity Taxonomy Report," ensuring straightforward access for all interested parties.

Stay tuned for Part 2 of this three-part series, where we will delve into the crucial aspect of data governance. We will explore how to implement BIT guidelines for protecting sensitive personal information from a data administration vantage point. Our discussion will navigate the governance frameworks and practices that ensure these recommendations are not just theoretical ideals but are effectively integrated into the operational fabric of organizations and distributed data ecosystems, safeguarding privacy at every turn.


OpenID

OpenID Foundation Certification Program Recruiting a Java Developer


The OpenID Foundation is pleased to announce that it is looking to add a Java developer to the successful OpenID certification program team. The OpenID Foundation enables deployments of OpenID specifications to be certified to specific conformance profiles to promote interoperability among implementations. The certification process utilizes self-certification and conformance test suites developed by the Foundation.

The Foundation is seeking a consultant (contractor) to join the team on a part- to full-time basis based on availability. This team member will provide development, maintenance, and support services to the program that include but are not limited to implementing new tests, addressing conformance suite bugs, and updating existing conformance test suites.

SKILLS:

- Strong and documented experience with Java or a similar language
- Some knowledge of OAuth 2 / OpenID Connect / OpenID for Verifiable Credentials / SIOPv2 / FAPI / JWTs (with an interest in becoming more proficient in these standards)
- An interest in security & interoperability
- Experience participating in relevant standards working groups (e.g. IETF OAuth, OpenID Connect, OIDF Digital Credentials Protocols, and/or FAPI) is a bonus
- Experience with one or more of the OpenID Certification conformance suites is a bonus


TASKS:

Development tasks include:

- Developing new test modules
- Updating existing conformance tests when changes to the specs are approved
- Extending the conformance tests to work against servers in new ecosystems, including adding additional security / interoperability checks
- Undertaking more extensive development tasks, including developing conformance tests for previously untested specifications
- Reviewing code changes done by other team members
- Pushing new versions to production as/when necessary and writing release notes
- Investigating / fixing reported bugs in the conformance suite
- Providing guidance to ecosystems that adopt OpenID Foundation specifications
- Attending OIDF working group calls as/when necessary
- Attending a 1-hour virtual team call every 2 weeks
- Attending the annual team meeting, which is usually adjacent to an industry event


If this opportunity is of interest, please send your resume and cover letter to director@oidf.org with the subject, “OIDF Certification Program Java Developer Opportunity”. Please include in your cover letter how your skills and experience align to the requirements outlined above, your available hours per month, including when you are available to start, and your hourly rate.

The post OpenID Foundation Certification Program Recruiting a Java Developer first appeared on OpenID Foundation.

Thursday, 07. March 2024

FIDO Alliance

Mercari’s Passkey Authentication Speeds Up Sign-in 3.9 Times


Mercari, Inc. is a Japanese e-commerce company offering marketplace services as well as online and mobile payment solutions. With Mercari, users can sell items on the marketplace and make purchases in physical stores. In 2023, they implemented passkeys. This article will explain the motivation behind their decision and the results they achieved.

Motivation

Mercari previously used passwords and, faced with real-time phishing attacks, added SMS OTPs as an authentication method to protect its users. While this improved security, it did not completely eliminate real-time phishing attacks. Sending a high volume of SMS OTPs was also both expensive and not very user-friendly.

Mercari also had a new service, Mercoin, a platform for buying and selling Bitcoin with the user’s available Mercari balance, which had strong security requirements that passkeys met.

Because passkeys are bound to a website or app’s identity, they’re safe from phishing attacks. The browser and operating system ensure that a passkey can only be used with the website or app that created it. This frees users from being responsible for verifying that they are signing in to the genuine website or app.
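For a sense of how this binding works mechanically, here is a simplified sketch, in Python, of two of the relying-party checks defined by the W3C WebAuthn model: the browser-reported origin must match the expected origin, and the authenticator data must commit to the hash of the relying party ID. A full verifier performs more steps (challenge, signature, and flag validation); this sketch is illustrative only.

```python
import hashlib
import json

def check_passkey_binding(client_data_json: bytes,
                          authenticator_data: bytes,
                          expected_origin: str,
                          rp_id: str) -> bool:
    """Two of the checks that make passkeys phishing-resistant.

    1. The origin recorded by the browser in clientDataJSON must match
       the relying party's real origin, so a look-alike phishing domain
       fails here even if the user is fooled.
    2. Per WebAuthn, the first 32 bytes of the authenticator data are
       the SHA-256 hash of the relying party ID the credential was
       created for, binding the passkey to that site.
    """
    client_data = json.loads(client_data_json)
    if client_data.get("origin") != expected_origin:
        return False

    rp_id_hash = authenticator_data[:32]
    return rp_id_hash == hashlib.sha256(rp_id.encode()).digest()
```

Because these checks are enforced by the browser, the operating system, and the server together, there is no secret the user could be tricked into typing on the wrong site.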

Requiring users to use extra authentication methods and perform additional actions is an obstacle when what users actually want is to accomplish something else using the app.

Adding passkey authentication removes that additional step of SMS OTP and improves user experience while also providing better protection for users from real-time phishing attacks and reducing the cost associated with SMS OTPs.

Results

900,000 Mercari accounts have registered passkeys, and the success rate of signing in with them is 82.5%, compared to a 67.7% success rate for signing in with SMS OTP.

Signing in with passkeys has also proved to be 3.9 times faster than signing in with SMS OTP: Mercari users take 4.4 seconds on average to sign in with a passkey, while it takes them 17 seconds to do the same with an SMS OTP.

The higher the success rate of authentication and the shorter the authentication time, the better the user experience, and Mercari has seen great success with implementing passkeys.

Learn more about Mercari’s implementation of passkeys

To learn more about how Mercari solved the challenges of making a phishing-resistant environment with passkeys, read their blog on Mercari’s passkey adoption.

Download Case Study

We Are Open co-op

The Power of Community Knowledge Management

Celebrating Open Education Week 2024

A couple of days ago we ran our fourth Community Conversations session. This one was timed to coincide with Open Education Week, an initiative of OE Global created as “an annual celebration [and] opportunity for actively sharing and learning about the latest achievements in Open Education worldwide”.

Our focus was on managing knowledge in communities. The version in the video above is a shortened version of the session, which we recorded without the activities. This blog post contains most of the information in the recording.

What is Knowledge?

Community is key to open education, with an often-overlooked aspect of community management and evolution being how knowledge is stewarded within such networks.

Image by gapingvoid

Let’s start with the above image, showing the difference between terms and concepts that are sometimes used interchangeably, but actually mean different things.

When we talk about community knowledge, we’re talking about connecting the dots between pieces of information shared by members. This can turn into insight through a process of reflection, and wisdom by connecting together different insights.

In practice, nothing is ever as simple as the process shown in the above diagram. However, it’s a convenient way to tease apart some of the subtleties.

A Simple, Homely Example

I went on holiday with my family recently. We ‘favourited’ some places on Google Maps as part of our planning, to help us navigate while we were there, and to be able to share what we enjoyed with others afterwards.

Screenshot of Google Maps showing ‘favourited’ and ‘bookmarked’ places in Faro, Portugal

What’s represented on the above screenshot is a bunch of data arranged on a map. When you click on each point, there is further information about each place. If I put these together into an itinerary, this could be considered a form of knowledge.

This is a form of community knowledge management on a very small scale: the community represented by my nuclear family, my extended family and friends, and potentially those people who might in future ask for recommendations on what to do in Faro, Portugal.

Other proprietary tools that might be used to store data and information with others include Trello and Pinterest. You are curating these things as individuals for a particular purpose, but there is not necessarily an effort to connect together the dots in any meaningful way.

Community Knowledge Management

So, what’s the difference between what we’ve discussed so far and managing knowledge within communities?

In this case, we’re specifically talking about Communities of Practice, which we discuss in the first three Community Conversations workshops. Briefly put, they can be defined in the following way:

“Communities of Practice are groups of people who share a concern or a passion for something they do and learn how to do it better as they interact regularly.” (Etienne Wenger)

Harold Jarche has a very clear diagram that he uses regularly in his work around Personal Knowledge Management (PKM) to explore the differences between the spaces in which we interact:

Image via jarche.com/pkm

We’re interested in the middle oval in this diagram, with Communities of Practice (CoPs) overlapping with ‘Work Teams’ and ‘Social Networks’. While we might build knowledge within the walls of our organisations, and share things online with strangers, CoPs are intentional spaces for us to build knowledge between organisations with people we get to know better over time.

Rosie Sherry defines Community Knowledge Management in the following way:

Community Knowledge Management is a process for collaboratively collecting information, insights, stories and perspectives with the goal of supporting a community and the ecosystem with their own learning and growth. (Rosie Sherry)

Although she doesn’t mention it explicitly, the implication is that by “collecting information, insights, stories, and perspectives” we not only share knowledge, but also co-create it.

Tools for Community Knowledge Management

The new version of the Participate platform, to which we are migrating the ORE community, is organised around three types of ‘thing’: Badges, Events, and Docs.

This is useful for keeping communities organised. But what if you’ve got a lot of information to organise, almost a book’s worth? In this case, it’s worth looking at another tool to augment your community’s ‘home’, one which provides some more specialised features.

As you would expect from an organisation named We Are Open Co-op, we’re interested in working openly, using openly-licensed resources and open source tools, and cooperating with others. That means we’re going to point towards Open Source software in this section that we know, have used, and trust.

Here are three examples of the types of platforms which can host knowledge created in CoPs:

Wikis — everyone knows Wikipedia, but any organisation or community can have a wiki! You can use the same software, called MediaWiki, or one of many alternatives (we use wiki.js).
Forums — these are easily searchable, so they can be used to capture useful information as part of conversations. We’re big fans of Discourse and have used it for several client projects.
Learning Management Systems (LMS) — these can be used to capture information, especially if your community is based around educational resources. Our go-to for this is Moodle.

For the sake of brevity, and to point to our own example, we’re going to show our use of MediaWiki to form Badge Wiki. This has been around for over six years at this point, and serves as a knowledge base for the Open Badges and wider digital credentials community.

Community Knowledge Contribution

There are behaviours around this knowledge repository that overlap with those inside the main community platform. But there are also others, specific to it. For example:

Community Calls specifically focused on discussing and planning elements of Badge Wiki.
Barn raisings which focus on co-creation of pages to help establish the knowledge base.
Asynchronous discussions to talk about strategy, and catch up between synchronous events such as the previous two.
Pro-social behaviours are encouraged and recognised through the use of badges.

To dig into the last of these, we know that there are all kinds of reasons why people contribute to Open Source and community projects. We just want to give them a reason to keep doing so.

Image taken from work WAO did with Greenpeace. See more in this post.

We created a range of badges specifically focused on the community knowledge base. There are attendance badges, for example for taking part in a barn raising (and for attending multiple times), but also badges for particular actions such as authoring pages, tidying up existing content, and making it look better!

Images CC BY-ND Visual Thinkery for WAO

Once you’ve got a knowledge base, you can run projects on top of it. So when an ORE community member mentioned that it would be useful to have a ‘toolkit’ for helping people understand Open Recognition… Badge Wiki was the obvious place for it to live!

We launched v0.1 of the Open Recognition Toolkit at ePIC 2023 in Vienna. As it’s a wiki, it can be easily iterated on over time by multiple authors, who can contribute as little or as much as they want.

There’s so much more we could say, but there’s no substitute for practice! Whether you’re planning to start a new community, in the midst of setting one up, or stewarding an existing one, it’s important to think about good practices around Community Knowledge Management.

Being intentional and inclusive about what kind of knowledge is captured and shared within communities is crucial. It’s powerful to pool resources and to help generate insights; it helps to provide impact. It also helps fulfil the needs of different members of the community and helps increase the diversity and richness of who gets involved — and how.

If you would like a thought partner for this kind of work, why not get in touch and have a chat with the friendly people at WAO? The first 30 min call is free of charge, and we’ll do our best to help, or point you towards someone who can!

CC BY-ND Visual Thinkery for WAO

The Power of Community Knowledge Management was originally published in We Are Open Co-op on Medium, where people are continuing the conversation by highlighting and responding to this story.

Wednesday, 06. March 2024

Ceramic Network

Building Points on Ceramic - an Example and Learnings

We built a Web3 points application on Ceramic to explore the design considerations a successful system requires.

We've made the case in a recent blog post that web3 point systems align the incentives of platforms and their users, acting as reputation systems that let participants draw connections between who's creating value and who's likely to be rewarded for their actions. More importantly, these systems help participants understand which user interactions matter to applications using points. And while points often manifest as objects referred to by different names (badges and attestations, for example), these implementations share common requirements around verifiability.

Why Points and Ceramic?

Points data requires properties allowing consumers of points (traditionally the same applications issuing them) to trust their provenance and lineage. This is unsurprisingly why most Web3 points systems today are built on centralized rails - not only is a simple Postgres instance easy to spin up, but the only data corruption vulnerability would result from poor code or security practices.

For readers familiar with Ceramic's composability value proposition, it's likely obvious why we view web3 point systems (and reputation systems more broadly) as ideal Ceramic use cases. Not only does Ceramic offer rich query capabilities, data provenance and verifiability promises, and performance-related guarantees, but both end users and applications benefit from portable activity. We foresee an ecosystem where end users can leverage the identity they've aggregated from one application across many others. In turn, applications can start building on user reputation data from day one.

To put this into practice, we built a scavenger hunt application for EthDenver '24 that allowed participants to collect points based on in-person event attendance.

A Scavenger Hunt for Points

Ceramic was officially involved in 8 or so in-person engagements this year at EthDenver, some of which were cosponsored events (such as Proof of Data and Open Data Day), while others were cross-collaborations between Ceramic and our partners (for example, driving participants to check in at partner booths at the official EthDenver conference location). The idea was simple - participants would collect points for checking in at these events and, based on different thresholds or interpretations of their point data (for example, having the most event check-ins), would be eligible for prizes.

To make this happen, we ideated on various patterns of data control and schema design that presented the best balance of trade-offs for this use case. In simple terms, we needed to:

Track event attendance by creating or updating abstractions of that activity in Ceramic
Provide a crypto-native means for participants to self-identify, to leverage Ceramic-native scalar types
Secure the application against potential spoofing attempts
Collect enough information to perform creative computation on verifiable point data

We were also presented with several considerations. For example, should we go through the effort to emulate a user-centric data control design whereby we implement a pattern that requires additional server-side verification and signed data to allow the end user to control their Ceramic point data? Or what's the right balance of data we should collect to enable interesting interpretations (or PointMaterializations) to be made as a result of computing over points?

Architecting Document Control

Before we jump in, reading our blog post on Data Control Patterns in Decentralized Storage would help provide useful context. As for the problem at hand, two options stand out as the most obvious ways to build a verifiable points system on open data rails:

1. Reconstruct the approach that would be taken on traditional rails (the application is the author and controller of all points data it generates). This makes the data easy to verify externally based on the Ceramic document controller (which will always be the same), and data consumers wouldn't have to worry about end users attempting to modify stream data in their favor.
2. Allow end users to control their points data on Ceramic. In this environment, we'd need a flow that validates that the existing data had been "approved" by us by verifying a signed payload, then updates the data and signs it again before having the user save the update to their Ceramic document, thus ensuring the data is tamper-evident.

You might've guessed that the second option is higher-touch. At the same time, a future iteration of this system might want to involve a data marketplace that allows users to sell their points data, requiring users to control their data and its access control conditions. For this reason and many others, we went with #2. We'll discuss how we executed this in the sections below.

What Data Models Did We Use?

When we first started building the scavenger hunt application, the SET accountRelation schema option had not yet been released in ComposeDB (important to note given the high likelihood we would've used it). Keep that in mind as we overview some of the APIs we built to check whether existing model instances had been created (later in this article).

In discussing internally how points data manifests, we decided to mirror a flow that looked like trigger -> point issuance -> point materialization. This means that attending an event triggers issuing point data related to that action. In response, that issuance event might materialize as an interpretation of the weight and context of those points (which could be created by both the application that issued the points and any other entity listening in on a user's point activity).

As a result, our ComposeDB schemas ended up like this:

type PointClaims
  @createModel(accountRelation: LIST, description: "A point claim model")
  @createIndex(fields: [{ path: ["issuer"] }]) {
  holder: DID! @documentAccount
  issuer: DID! @accountReference
  issuer_verification: String! @string(maxLength: 100000)
  data: [Data!]! @list(maxLength: 100000)
}

type Data {
  value: Int!
  timestamp: DateTime!
  context: String @string(maxLength: 1000000)
  refId: StreamID
}

type PointMaterializations
  @createModel(
    accountRelation: LIST
    description: "A point materialization model"
  )
  @createIndex(fields: [{ path: ["recipient"] }]) {
  issuer: DID! @documentAccount
  recipient: DID! @accountReference
  context: String @string(maxLength: 1000000)
  value: Int!
  pointClaimsId: StreamID! @documentReference(model: "PointClaims")
  pointClaim: PointClaims! @relationDocument(property: "pointClaimsId")
}

To provide more context, we built the application to create a new PointClaims instance if one did not already exist for that user, and update the existing PointClaims instance if one already existed (and, in doing so, append an instance of Data to the "data" field). I mentioned above that the SET accountRelation option would've likely come in handy. Since we were hoping to maintain a unique list of PointClaims that only had 1 instance for each user (where the issuer represents the DID of our application), SET would've likely been the preferred way to go to make our lives easier.

You'll also notice an optional field called "refId" in the Data embedded type, which takes a StreamID value. The idea here was that issuing points might happen in response to the creation of a Ceramic document, in which case we might want to store a reference pointer to that document. For our scavenger hunt example, this was the case - points were issued in recognition of event attendance represented as individual Ceramic documents:

type EthDenverAttendance
  @createModel(
    accountRelation: LIST
    description: "An attendance claim at an EthDenver event"
  )
  @createIndex(fields: [{ path: ["recipient"] }])
  @createIndex(fields: [{ path: ["event"] }])
  @createIndex(fields: [{ path: ["latitude"] }])
  @createIndex(fields: [{ path: ["longitude"] }])
  @createIndex(fields: [{ path: ["timestamp"] }])
  @createIndex(fields: [{ path: ["issuer"] }]) {
  controller: DID! @documentAccount
  issuer: DID! @accountReference
  recipient: String! @string(minLength: 42, maxLength: 42)
  event: String! @string(maxLength: 100)
  latitude: Float
  longitude: Float
  timestamp: DateTime!
  jwt: String! @string(maxLength: 100000)
}

Finally, take a look at the "issuer_verification" field in PointClaims and "jwt" field in EthDenverAttendance. Both fields were allocated to store the data our application verified + signed, represented as a base64-encoded string of a JSON web signature. For PointClaims, this entailed just the values within the "data" array (involving a verification, updating, and resigning process each time new point data needed to be appended).
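To make that round trip concrete, here is a condensed sketch of the sign-then-encode and decode-then-verify steps (the helper names signPayload and verifyPayload are ours for illustration; the full flow appears later in this article):

import { DID } from "dids";
import KeyResolver from "key-did-resolver";

// Sign a payload with the application's DID, then base64-encode the JWS for storage
const signPayload = async (appDid: DID, payload: Record<string, unknown>[]) => {
  const jws = await appDid.createJWS(payload);
  return Buffer.from(JSON.stringify(jws)).toString("base64");
};

// Decode a stored field, verify the JWS, and recover the payload and the signer's DID
const verifyPayload = async (encoded: string) => {
  const parsed = JSON.parse(Buffer.from(encoded, "base64").toString());
  const verifier = new DID({ resolver: KeyResolver.getResolver() });
  const result = await verifier.verifyJWS(parsed);
  return {
    payload: result.payload,
    signer: result.didResolutionResult.didDocument?.id,
  };
};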

Issuing Points - Data Flow

For the remainder of the article, feel free to follow along in the following public code:

https://github.com/ceramicstudio/fluence-demo

You'll notice two environment variables (SECRET_KEY and STRING) scoped only for server-side access, the first of which is meant to contain our secret 64-character seed from which we'll instantiate our application's DID (to be used for filtering PointClaims instances for documents where our application's DID is the issuer, as well as for verifying and signing our tamper-evident fields). To explain STRING, it might be helpful at this point if I dive a bit deeper into what we built to support the user flow.

Private PostgreSQL Instance (for Whitelisted Codes)

You'll notice that a findEvent method is called first in the useEffect lifecycle hook within the main component rendered on our post-login screen, which subsequently calls a /api/find route (this route uses our STRING environment variable to connect to our PostgreSQL client). For this application, we needed to quickly build a pattern where we could both issue and verify pre-generated codes corresponding to each in-person event. This ties back to our planned in-person flow:

Participant scans a QR code or taps an NFC disc that contains the URL of our application plus a parameterized whitelisted code that hasn't yet been used
The application checks the database to ensure the code hasn't yet been used

While in theory this part could've been built on Ceramic with an added layer of encryption, it was easier to stand this up quickly with a private Postgres instance.
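As a rough sketch of what the core of such a route might look like (assuming a Next.js-style API route and a simple codes table with code, event, and used columns; the actual schema may differ):

import { Pool } from "pg";
import type { NextApiRequest, NextApiResponse } from "next";

// Connection string comes from the STRING environment variable mentioned above
const pool = new Pool({ connectionString: process.env.STRING });

export default async function handler(req: NextApiRequest, res: NextApiResponse) {
  const { code } = req.body;
  // Atomically flag the code as used so the same QR/NFC code can't be redeemed twice
  const result = await pool.query(
    "UPDATE codes SET used = TRUE WHERE code = $1 AND used = FALSE RETURNING event",
    [code]
  );
  if (result.rowCount === 0) {
    return res.json({ err: "Code not found or already used" });
  }
  return res.json({ event: result.rows[0].event });
}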

Determining Participant Eligibility

If the call to /api/find determines that the code has not been used, findEvent then calls a createEligibility method, passing in the name of the event as the input variable. Notice that the first thing we do is call a getDID method, which calls a /api/checkdid server route that uses our SECRET_KEY variable to instantiate a DID and send us back the did:key identifier.

This is the second check our application performs to prevent cheating, whereby we query ComposeDB for EthDenverAttendance instances, filtering for documents where the signed-in user is the controller, where the event is the string passed into createEligibility, and where our application is the issuer (as evidenced by the DID).

Finally, if no matching document exists, we determine that the participant is eligible to create a badge.
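A simplified sketch of that eligibility check might look like the following (the ethDenverAttendanceList field name follows ComposeDB's generated naming convention; the helper and variable names are illustrative):

// Returns true only if no badge for this event was previously issued to this user
const checkEligibility = async (event: string, appDid: string, userDid: string) => {
  const result = await composeClient.executeQuery(`
    query CheckAttendance {
      node(id: "${userDid}") {
        ... on CeramicAccount {
          ethDenverAttendanceList(
            filters: { where: { event: { equalTo: "${event}" }, issuer: { equalTo: "${appDid}" } } },
            first: 1
          ) {
            edges {
              node {
                id
              }
            }
          }
        }
      }
    }
  `);
  const edges = (result.data?.node as any)?.ethDenverAttendanceList?.edges ?? [];
  return edges.length === 0;
};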

Generating Points Data

While there's plenty to discuss related to generating and validating badge data, the pattern is quite similar when issuing points, so I'll focus on that flow. The important thing to know here is that both our createBadge and createFinal methods (found in the same component mentioned above) call an issuePoint method if a badge was successfully created by the user, passing in the value, context, and name of the event corresponding to that issuance.

What happens next is a result of our decision to allow the end user to control their points-related data, such that we:

Call an API route to access our application's DID

Call yet another /api/issue route, where we:

Query PointClaims to see if one already exists for the end user where our application is also the issuer:

const authenticateDID = async (seed: string) => {
  const key = fromString(seed, "base16");
  const provider = new Ed25519Provider(key);
  const staticDid = new DID({ resolver: KeyResolver.getResolver(), provider });
  await staticDid.authenticate();
  ceramic.did = staticDid;
  return staticDid;
};

// we'll use this both for our query's filter and for signing/verifying data
const did = await authenticateDID(SECRET_KEY);

const exists = await composeClient.executeQuery<{
  node: {
    pointClaimsList: {
      edges: {
        node: {
          id: string;
          data: {
            value: number;
            refId: string;
            timestamp: string;
            context: string;
          }[];
          issuer: { id: string };
          holder: { id: string };
          issuer_verification: string;
        };
      }[];
    };
  } | null;
}>(`
  query CheckPointClaims {
    node(id: "${`did:pkh:eip155:${chainId}:${address.toLowerCase()}`}") {
      ... on CeramicAccount {
        pointClaimsList(filters: { where: { issuer: { equalTo: "${did.id}" } } }, first: 1) {
          edges {
            node {
              id
              data {
                value
                refId
                timestamp
                context
              }
              issuer {
                id
              }
              holder {
                id
              }
              issuer_verification
            }
          }
        }
      }
    }
  }
`);

Use the data passed into the API's request body to sign and encode the values with our application's DID (if no PointClaims instance exists)

Decode and verify the existing values of "issuer_verification" against our application's DID before appending the new data, resigning, and re-encoding it with our application's DID (if a PointClaims instance does exist):

if (!exists?.data?.node?.pointClaimsList?.edges.length) {
  const dataToAppend = [{
    value: parseInt(value),
    timestamp: new Date().toISOString(),
    context: context,
    refId: refId ?? undefined,
  }];
  if (!refId) {
    delete dataToAppend[0]?.refId;
  }
  const jws = await did.createJWS(dataToAppend);
  const jwsJsonStr = JSON.stringify(jws);
  const jwsJsonB64 = Buffer.from(jwsJsonStr).toString("base64");
  const completePoint = {
    dataToAppend,
    issuer_verification: jwsJsonB64,
    streamId: "",
  };
  return res.json({ completePoint });
} else {
  const dataToVerify = exists?.data?.node?.pointClaimsList?.edges[0]?.node?.issuer_verification;
  const json = Buffer.from(dataToVerify!, "base64").toString();
  const parsed = JSON.parse(json) as DagJWS;
  const newDid = new DID({ resolver: KeyResolver.getResolver() });
  const result = parsed.payload ? await newDid.verifyJWS(parsed) : undefined;
  const didFromJwt = result?.payload ? result?.didResolutionResult.didDocument?.id : undefined;
  if (didFromJwt === did.id) {
    const existingData = result?.payload;
    const dataToAppend = [{
      value: parseInt(value),
      timestamp: new Date().toISOString(),
      context: context,
      refId: refId ?? undefined,
    }];
    if (!refId) {
      delete dataToAppend[0]?.refId;
    }
    existingData?.forEach((data: { value: number; timestamp: string; context: string; refId: string }) => {
      dataToAppend.push({
        value: data.value,
        timestamp: data.timestamp,
        context: data.context,
        refId: data.refId,
      });
    });
    const jws = await did.createJWS(dataToAppend);
    const jwsJsonStr = JSON.stringify(jws);
    const jwsJsonB64 = Buffer.from(jwsJsonStr).toString("base64");
    const completePoint = {
      dataToAppend,
      issuer_verification: jwsJsonB64,
      streamId: exists?.data?.node?.pointClaimsList?.edges[0]?.node?.id,
    };
    return res.json({ completePoint });
  } else {
    return res.json({
      err: "Invalid issuer",
    });
  }
}

Send the result back client-side

Use our client-side ComposeDB context (on which our end user is already authenticated) to either create or update a PointClaims instance, using the results of our API call as inputs to our mutation:

// if the instance doesn't exist yet
if (finalPoint.completePoint.dataToAppend.length === 1) {
  data = await compose.executeQuery(`
    mutation {
      createPointClaims(input: {
        content: {
          issuer: "${did}"
          data: ${JSON.stringify(finalPoint.completePoint.dataToAppend).replace(/"([^"]+)":/g, '$1:')}
          issuer_verification: "${finalPoint.completePoint.issuer_verification}"
        }
      }) {
        document {
          id
          holder {
            id
          }
          issuer {
            id
          }
          issuer_verification
          data {
            value
            refId
            timestamp
            context
          }
        }
      }
    }
  `);
}

Does this sound a bit tedious? This is the same pattern we're using for issuing and verifying badges as well. And yes, it is verbose compared to what our code would've looked like had we decided not to go through the trouble of allowing our participants to control their Ceramic data.

Creating Manifestations

As mentioned above, PointMaterializations represent how points manifest in a platform's reward structures (like a new badge, an aggregation for a leaderboard, or gating an airdrop). Most importantly, the PointMaterializations collection is a new dataset built from our composable PointClaims data.

To create PointMaterializations, we use an event-driven architecture, leveraging our MVP EventStream feature. When PointClaims instances are written to Ceramic, we receive a notification in another application, in this case a Fluence compute function.

Our compute function works like this:

1. Determine that the notification is for the PointClaims model and that the issuer is the DID of our application.
2. Extract the PointClaims from the notification content.
3. Verify that the issuer_verification is valid for the data field in PointClaims.
4. If the subject of the PointClaims (the document owner) has an existing PointMaterializations, retrieve it; otherwise create a new one.
5. For the context of the PointMaterializations, calculate a new value:
   unique-events: tally all the unique context entries in the data field
   all-events: tally all the entries in the data field
   first-all-events: similar to all-events, we check all unique context entries in the data field; if a user has attended all the events, we record their latest first event check-in as the value, so that we can rank users by that time

If you want to view the Rust code that implements the sequence above, please check out the compute repository.
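For illustration only, here is a rough TypeScript rendering of those three tallies (the production implementation is the Rust code linked above, and TOTAL_EVENTS is an assumed constant):

interface PointData {
  value: number;
  timestamp: string; // ISO 8601, so lexicographic order matches chronological order
  context: string; // the event checked into
  refId?: string;
}

// Assumed total number of scavenger hunt events (illustrative)
const TOTAL_EVENTS = 8;

// unique-events: number of distinct events checked into
const uniqueEvents = (data: PointData[]): number =>
  new Set(data.map((d) => d.context)).size;

// all-events: total number of check-in entries
const allEvents = (data: PointData[]): number => data.length;

// first-all-events: for users who checked into every event, the latest
// "first check-in" time, so finishers can be ranked by completion
const firstAllEvents = (data: PointData[]): string | null => {
  const firstByContext = new Map<string, string>();
  for (const d of data) {
    const prev = firstByContext.get(d.context);
    if (!prev || d.timestamp < prev) firstByContext.set(d.context, d.timestamp);
  }
  if (firstByContext.size < TOTAL_EVENTS) return null;
  return [...firstByContext.values()].sort().at(-1) ?? null;
};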

At the time of writing, the EventStream MVP does not include checkpointing or reusability, so we have set up a checkpointing server to save our state and then use a Fluence cron job, or spell, to periodically run our compute function. In the future, we hope to trigger Fluence compute functions from new events on the EventStream.

What We Learned

This exercise left our team with a multitude of valuable learnings, some of which were more surprising than others:

Wallet Safety and Aversion to Wallet Authentication

We optimized much of the flow and the UI for mobile devices, given that the expected flow required scanning a code or tapping a disc as the entry point to the application. However, throughout EthDenver and the various events where we tried to facilitate issuing points, we overwhelmingly noticed a combination of:

Participants intentionally do not have a MetaMask/wallet app installed on their phones (for safety reasons)
If a participant has such a wallet app on their phone, they are VERY averse to connecting it to our scavenger hunt application (particularly if they haven't heard of Ceramic previously)

This presents several problems. First, given that our flow required a scanning/tapping action from the user, this almost entirely rules out using anything other than a phone or tablet. In a busy conference setting, it's unreasonable to expect the user to pull out their laptop, hence why those devices were not prioritized in our design.

Second, the end user must connect their wallet to sign an authentication message from Ceramic in order to write data to the network (thus aligning with our user-centric data design). There's no way around this.

Finally, our scavenger hunt application stood ironically in contrast with the dozens of POAP NFC stands scattered throughout the conference (which did not require end users to connect their wallets, and instead allowed them to input their ENS or ETH addresses to receive POAPs). We could've quite easily architected our application to do the same, though we'd sacrifice our user-centric data design.

SET Account Relation will be Useful in Future Iterations

As explained above, the PointClaims model presents an ideal opportunity to use the SET accountRelation configuration in ComposeDB (given how we update an existing model instance if it exists).

Data Verifiability in User-Centric Data Design Entails More Work

Not a huge shocker here, and this point is certainly relevant for other teams building with Verifiable Credentials or EAS Off-Chain Attestations on Ceramic. While there are plenty of considerations to go around, we figured that our simple use of an encoded JWT was sufficient for our need to validate both the originating DID and the payload. It was hard to imagine how we would benefit from the additional baggage involved in saving point-related VCs to ComposeDB.

Interested in Building Points on Ceramic?

If your team is looking to jam on some points, or you have ideas for how we can improve this implementation, feel free to contact me directly at mzk@3box.io, or start a conversation on the Ceramic Forum. We look forward to hearing from you!


Elastos Foundation

Elacity Enables ERC404 Standard for Revolutionary NFT Functionality


Elacity, the pioneering NFT Marketplace built on Elastos, today announces its support for the trading of ERC404 standard NFTs. This technical development enables the buying and selling of fractional NFTs, like Elawings, seamlessly aligning with current token trading standards.

ERC404 addresses the limitations posed by existing NFT trading processes. Designed from the ground up to integrate the characteristics of ERC-20 and ERC-721 tokens into a single, more flexible model, the ERC404 standard lets customers buy and sell portions of NFTs, whereas previous methods only allowed for the purchase of whole NFTs. This capability makes it possible to create liquidity pools for NFTs, creating better markets for NFT trading. It also unlocks new use cases for NFT platforms, for example the buying and selling of fractional royalties for any form of digital content and assets, including music, artwork, books, and the like.
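To illustrate how this ERC-20/ERC-721 duality surfaces to a client, here is a hedged ethers.js sketch; the ABI fragments, addresses, and RPC endpoint below are illustrative assumptions rather than Elacity's actual interface:

import { ethers } from "ethers";

// Minimal ABI: an ERC404 contract answers both token-style and NFT-style calls
const abi = [
  "function balanceOf(address owner) view returns (uint256)", // ERC-20-style fractional balance
  "function ownerOf(uint256 tokenId) view returns (address)", // ERC-721-style ownership
  "function transfer(address to, uint256 amount) returns (bool)",
];

const provider = new ethers.JsonRpcProvider("https://rpc.example"); // placeholder RPC endpoint
const token = new ethers.Contract("0xErc404TokenAddress", abi, provider); // placeholder address

// Holding a fraction of a whole unit means owning part of an NFT; crossing a
// whole-unit threshold mints or burns the corresponding NFT under ERC404 semantics
const fractional = await token.balanceOf("0xSomeHolder");
console.log("fractional balance:", ethers.formatUnits(fractional, 18));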

Sasha Mitchell, the CEO and Founder of Elacity, says about the development, “The adoption of ERC404 is a massive step forward in the digital rights and NFT space as a whole, providing creators unprecedented ownership over the rights to their content, while also allowing users to engage with creators of their choice on a never-before-seen level. Meanwhile, adopting ERC404 is a unique opportunity to enhance trading for NFT markets which can offer utility through access or royalties to services.“

“The addition of fractional NFT ownership will significantly increase flexibility and choice for both buyers and sellers of exclusive content, potentially creating further secondary markets and other forms of value addition,” he says.

“It’s difficult to overstate the technical challenges that have been overcome to deliver genuine interoperability and conformity with multiple standards.  But the result will mean more control for creators, and more choice for their audiences.”

This milestone aligns with Elacity’s main vision of becoming a Decentralized Digital Rights Marketplace (DDRM), where creators and users alike can reap the benefits of fractional ownership and royalty generation. DDRM is an extension of existing Digital Rights Management Technology (DRM), a familiar technology that is currently used by industry players to protect creator’s content from unauthorized use and distribution. In essence, DRM systems employ encryption techniques, software licenses, and other security measures to control access to digital content and limit who can use it.

Elacity stands as an innovative online decentralized content marketplace, revolutionizing the way users engage in the creation, purchase, and sale of online content through cutting-edge blockchain technology. Elacity’s parent company, Elastos, is a public blockchain project that integrates blockchain technology with a suite of reimagined platform components to produce a modern Internet infrastructure that provides intrinsic protection for privacy and digital asset ownership.

Join Us on the Journey

As we continue to build the SmartWeb, we invite you to learn more about Elastos and join us in shaping a future where digital sovereignty is a reality. Discover how we’re making this vision come to life at Elastos.info and connect with us on X and LinkedIn.


Identity At The Center - Podcast

We have another Sponsor Spotlight episode of the Identity at the Center podcast


We have another Sponsor Spotlight episode of the Identity at the Center podcast for you this week. We were joined by Rich Dandliker, Chief Strategist at Veza.

We had an insightful discussion about Veza's unique approach to identity security, their 'anti-convergence' strategy, the significance of a reputable customer base, and the importance of a data-first approach to identity management.

Don't miss out on this episode for a comprehensive understanding of Veza's innovative solutions in the IAM market. You can listen to the episode at idacpodcast.com or in your favorite podcast app.

#iam #podcast #idac


Next Level Supply Chain Podcast with GS1

Risk, Resilience, and AI in the Supply Chain with Yossi Sheffi


The COVID-19 pandemic threatened to derail supply chain management completely. Or did it?

Yossi Sheffi, distinguished MIT professor and an expert with 49 years in supply chain management, breaks down supply chain resilience into five levels and argues that supply chain managers were unsung heroes during the pandemic. Yossi also touches on balancing resilience with sustainability, pointing out that while essential, both can introduce short-term costs and competitive imbalances. He underscores the delicate balance companies must strike between cost management and maintaining multiple suppliers for risk mitigation.

He expounds on the role of AI in supply chains, emphasizing the importance of leveraging artificial intelligence for identifying alternative suppliers and predictive analysis. The conversation also delves into the roles of machine learning, large language models, and robotics in evolving supply chains. Despite skepticism about fully autonomous applications like pilotless planes, Yossi highlights ongoing experiments with AI as potential co-pilots. The episode concludes with reflections on the rapid technological evolution impacting the professional landscape and the fabric of daily life.


Key takeaways: 

Resilience in supply chains is crucial for navigating disruptions and maintaining operational continuity.

Artificial intelligence (AI) technology is vital for supply chain management despite potential challenges.

Supply chain resilience and sustainability are critical concerns, as are the investments in these areas.


Connect with GS1 US:

Our website - www.gs1us.org

GS1 US on LinkedIn


Connect with guest:

Yossi Sheffi on LinkedIn

Check out Yossi’s book - The Magic Conveyor Belt: Supply Chains, A.I., and the Future of Work



FIDO Alliance

Tech Game World: Passkeys are arriving on PlayStation: how the smart alternative to the password works


The advantages are many. Let’s start by saying that Passkeys are more secure than traditional passwords. They are fraud resistant and follow the Fast Identity Online (FIDO) standards established by the FIDO Alliance, a global organization of which the Sony Group (owner of PlayStation) is a part. The FIDO Alliance is responsible for defining and promoting more advanced authentication standards for a wide range of devices and platforms. The goal is to reduce dependence on passwords, which are now considered an obsolete method. These standards are supported by leading companies and institutions in the technology sector, with which PlayStation itself has collaborated to offer an optimal access experience.


PCMag: No More Passwords: Sony Adopts Passkeys for PlayStation 4, PS5


Sony has introduced passkey support for PlayStation, eliminating the need for traditional passwords. Users can now opt for a more secure and convenient sign-in method by setting up a passkey stored on their phone or laptop. Passkeys use unique cryptographic keys that remain on the device, are phishing resistant, and can be accessed through other devices in case of loss.
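Under the hood, passkeys build on the WebAuthn API. A minimal browser-side sketch of creating one looks roughly like this (the parameter values are illustrative, not Sony's):

// Ask the authenticator (phone, laptop, security key) to create a passkey
const credential = await navigator.credentials.create({
  publicKey: {
    challenge: crypto.getRandomValues(new Uint8Array(32)), // normally issued by the server
    rp: { name: "Example Service" },
    user: {
      id: crypto.getRandomValues(new Uint8Array(16)),
      name: "player@example.com",
      displayName: "Player",
    },
    pubKeyCredParams: [{ alg: -7, type: "public-key" }], // ES256
    authenticatorSelection: {
      residentKey: "required", // a discoverable credential, i.e. a passkey
      userVerification: "required",
    },
  },
});
// Only the public key and a credential ID go to the server; the private key
// never leaves the device, which is what makes passkeys phishing resistant.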

Tuesday, 05. March 2024

Hyperledger Foundation

Hyperledger Collaborative Learning Spotlight: BiniBFT - An Optimized BFT on Fabric


WHAT WE WORKED ON:

Monday, 04. March 2024

Project VRM

On Customer Constituency


A customer looks at a market where choice rules and nobody owns anybody. Source: Microsoft Copilot | Designer

I’m in a discussion of business constituencies. On the list (sourced from the writings of Doug Shapiro) are investors, employees, suppliers, customers, and regulators.

The first three are aware of their membership, but the last two? Not so sure.

Since ProjectVRM works for customers, let’s spin the question around. Do customers have a business constituency? If so, businesses are members by the customer’s grace. She can favor, ignore, or more deeply engage with any of those businesses at her pleasure. She does not “belong” to any of them, even though any or all of them may refer to her, or their many other customers, with possessive pronouns.

Take membership (e.g. Costco, Sam’s Club) and loyalty (CVS, Kroger) programs off the table. Membership systems are private markets, and loyalty programs are misnomered. (For more about that, read the “Dysloyalty” chapter of The Intention Economy.)

Let’s look instead at businesses that customers engage as a matter of course: contractors, medical doctors, auto mechanics, retail stores, restaurants, clubs, farmers’ markets, whatever. Some may be on speed dial, but most are not. What matters in all cases is that these businesses are responsible to their customers. “The real and effectual discipline which is exercised over a workman is that of his customers,” Adam Smith writes. “It is the fear of losing their employment which restrains his frauds and corrects his negligence.” That’s what it means to be a customer’s constituent.

An early promise of the Internet was supporting that “effectual discipline.” For the most part, that hasn’t happened. The “one clue” in The Cluetrain Manifesto said “we are not seats or eyeballs or end users or consumers. we are human beings and our reach exceeds your grasp. deal with it.” Thanks to ubiquitous surveillance and capture by corporate giants and unavoidable platforms, corporate grasp far outreaches customer agency.

That’s one reason ProjectVRM has been working against corporate grasp since 2006, and just as long for customer reach. Our case from the start has been that customer independence and agency are good for business. We just need to prove it.


Oasis Open Projects

OASIS Board Member Spotlight Series: Q&A with Jautau “Jay” White, Ph.D.


The OASIS Board of Directors are integral to the organization's success. Read our Q&A to gain a better sense of who they are and why they serve the OASIS community.

Meet Jautau “Jay” White, Ph.D., an accomplished leader with a strong focus on people and teamwork. With two decades of experience, he specializes in building top-notch teams and programs that enhance information security and cybersecurity while reducing risks and ensuring compliance. His expertise spans AI/ML vulnerabilities, supply chain security, data privacy, cybersecurity, and more.

What can you tell us about your current role?
At Microsoft, my role involves supply chain security and open source strategy work. My main function is to be the subject matter expert on cybersecurity and information security matters, and take that knowledge and use it to communicate internally to extrapolate ideas, initiatives, and strategies that can be worked on in a collaborative environment such as open source. 

A large part of my job is going out into the open source ecosystem to see what communities are already in place and to help build communities around work that’s for the betterment of mankind. I seek out opportunities that align with Microsoft’s ongoing projects, identifying areas where Microsoft wants to invest its efforts and finding where those efforts are already underway. We initiate projects within Microsoft and leverage open source collaboration to crowdsource innovative solutions from open source communities. I bring those insights back to Microsoft, advocating for the adoption of these solutions, saying “This is already being done, why don’t we use this?” or “Why don’t we get involved with that?” That’s a large part of my job. I love what I do mainly because it takes everything I’ve learned throughout my entire career to do it.

What inspired you to join the OASIS Board?
I love standards, specs, and policies. Having had a hand in writing standards and then using them throughout my entire career, joining the OASIS Board was an excellent opportunity. One of the things I think that I liked most was the fact that I had to run for the board seat. I campaigned and talked to community members and staff; I really put myself out there and I enjoyed that immensely.

I love what OASIS does in terms of the international community. I love its recognition. There are so many specs and technologies that are being used today that people don’t even know originated in OASIS and I just love that I get a chance to be part of it.

Prior to serving on the OASIS Board, were you involved in open source or open standards? 
For the past few years, I’ve been involved with the Linux Foundation, especially their Open Source Security Foundation (OpenSSF) project. I currently sit on OpenSSF’s Technical Advisory Council (TAC) and I lead a few working groups and special interest groups there as well. Getting involved with OASIS was the next evolution. OASIS does such an amazing job bringing standards and specs to market. I’ve always felt that I want to be involved in this part, because the regulatory part is where I thrive.

What skills and expertise do you bring to the OASIS Board and how do you hope to make an impact?
I bring extensive cyber and security knowledge. Unlike many individuals who specialize in one area for the entirety of their careers, I’ve navigated through many roles inside of cyber and information systems. I’ve been a network engineer, a systems admin, a desktop support engineer, and a penetration tester. Also, I’ve done physical security assessments, admin security assessments, and I’ve installed firewalls. I have a software engineering degree, so I’ve written programs. There are so many different places that I’ve touched throughout my entire career across government, healthcare, finance, and professional services sectors. My experiences have enabled me to approach situations from different vantage points and engage meaningfully in conversations. I’m excited to learn about emerging standards and specs from diverse industries.

Why are you passionate about the OASIS mission to advance technology through global collaboration?
Global collaboration is key. I spent my last few years working in open source, and it’s so important to work collaboratively. I coined the phrase, “strategically competing through partnership and collaboration.” A lot of these major companies are competitors in nature, but there’s so much out there right now that is affecting every single one of our businesses at the same time, that we have to come together to build these standards, technologies, controls, and safeguards so that our joint customer base remains safe. Trust is huge and our customers have to trust each and every one of us equally.

What sets OASIS apart from other organizations that you’ve worked with in the past? 
The way OASIS is constructed around Technical Committees and Open Projects is still relatively new to me. I think where OASIS shines is how standards get created and brought to market. That’s the niche.

What would you say to companies that want to bring their projects to OASIS?
It would totally be dependent on what that company wanted. If they want to create a spec or a standard around a tool that’s being created, I would definitely say go to OASIS.

Do you have an impact story about your work in open source or open standards?
I take great pride in establishing a Diversity Equity and Inclusion (DEI) working group in the OpenSSF where there wasn’t one before. Additionally, I’m proud of the AI work that I’ve been able to bring to Microsoft.

At OASIS, I’m excited to be one of the founding members of the OpenEoX Technical Committee alongside Omar Santos. I’m extremely excited about OpenEoX’s potential; I think it’s going to be huge in the industry because there isn’t a standard for end-of-life and end-of-support. There’s nothing out there that allows customers to understand when new releases are coming in, when they’re going out, and how things are deprecated. Having been a part of OpenEoX since its inception and participating in the initial meetings thus far has been incredibly fulfilling.

Can you tell me about any exciting changes or trends in open source and standards?
The AI space is extremely large and there’s so much room to play in it. I don’t want us to get consumed by one area over the other. There are so many different specs and standards that can be created and I want us to be open to all the possibilities and open to the entire knowledge space.

Where do you see standards going in the future?
I see standards becoming more prevalent with respect to these different government regulations coming in. We have more and more regulatory requirements coming out that are beginning to drive standards, for example the EO from the White House, the EU’s Cyber Resilience Act (CRA), and a policy that’s coming out in Germany. I can see that gap closing where you’ll have a standard that could even drive a regulatory requirement at some point which will be something weird to see.

What’s a fun fact about you?
I ride motorcycles and I like to work on cars and bikes. More than anything, I enjoy getting under the hood of a car or lifting the bike up and taking it apart and putting it back together.

The post OASIS Board Member Spotlight Series: Q&A with Jautau “Jay” White, Ph.D. appeared first on OASIS Open.


Identity At The Center - Podcast

It’s another brand-new episode of the Identity at the Center Podcast!


It’s another brand-new episode of the Identity at the Center Podcast! This week, we had the pleasure of speaking with Laura Gomez-Martin from RSM. We dove into the role of government in protecting privacy, the complexity of privacy policies, and the balance between public and company expectations. Laura shared her unique insights on these topics and much more. You can listen to the episode at idacpodcast.com or in your favorite podcast app.

#iam #podcast #idac

Friday, 01. March 2024

Elastos Foundation

Open Questions After Elastos Crypto Class Action Settlement


The cryptocurrency world is new, exciting, and complex in relation to governing laws and jurisdictions around the world. The Elastos Foundation’s settlement of a class action lawsuit highlights the ongoing debate over digital assets and securities law. Bradley D. Simon, a seasoned legal expert with a background as both an assistant U.S. attorney and a trial attorney with the U.S. Department of Justice, provides an insightful analysis, “Open Questions After Elastos Crypto Class Action Settlement”, in a recent article on Law360. Read the full article here.

Core Points Simplified:

Litigation Resolution: The Elastos Foundation settled a major legal case without admitting fault, demonstrating its ability to navigate complex legal challenges effectively.
Security Classification Challenge: Elastos successfully argued that its ELA token is not a security, emphasizing the need for a refined understanding of cryptocurrencies in regulatory contexts.
International Jurisdiction: Elastos showcased the difficulty of applying U.S. securities laws to decentralized, international entities, highlighting the need for global legal perspectives.
Technology and Decentralization: Operating as a DAO and using decentralized blockchain technology, Elastos leads in innovating digital economies.
E-Discovery Challenges: The case exposed the inadequacy of current e-discovery tools for modern communication platforms, stressing the need for legal processes to evolve alongside technology.
Regulatory Dialogue: The settlement advances discussions on cryptocurrency regulation, advocating for more nuanced legal frameworks for digital assets.
Future Litigation Precedent: Elastos’s case offers insights for future crypto litigation, potentially shaping legal approaches to digital currencies.

The Elastos Foundation’s legal battle underscores the need for clarity in how cryptocurrencies are regulated as securities. Simon’s analysis sheds light not only on the intricacies of this particular case but also on the broader challenges facing the regulation of digital assets. As the legal landscape continues to evolve, this case serves as a crucial reference point for both legal practitioners and participants in the cryptocurrency market. Be sure to read the full article here!


DIF Blog

Guest blog: David Birch


David Birch is a keynote speaker, a published author on digital money and digital identity, fintech columnist, social media commentator and an international advisor on digital financial services. A recognised thought-leader in digital identity and digital money, he was named one of the global top 15 favorite sources of business information by Wired magazine. 

What was the career path that led to you becoming a thought leader on digital money and identity? 

It’s quite straightforward. I started out in the world of secure electronic communications. I was working primarily in defence and safety-critical communications when suddenly the financial sector needed to know about this stuff too, which took me into the world of finance and payments.

I’d edited a book about digital identity and then I was encouraged by the economist Dame Diane Coyle to write Identity Is The New Money. She was the person who pushed me to have a go at writing a book myself. The timing was good, as others were talking about similar ideas.

The book helped people to rethink some of the problems, and I think it’s stood the test of time. 

It’s been ten years since you published Identity Is The New Money. Is this a reality today? 

On one level it is. Some time ago I started to realize that the big problems in payments were identity problems, not payment problems. It doesn't matter if it's push payment fraud or whatever, the problems all come from the lack of identity infrastructure. Why is Elon Musk spending so much on money transmitter licenses, KYC (Know Your Customer) and AML (Anti Money Laundering)? Is it because he wants to earn a tiny slice when you pay a friend for concert tickets, or is it because he wants to know who you are and what you’re into? The data about a payment is much more valuable than the transaction fee. 

But in the sense in which I meant ‘identity is the new money’, it still isn't, and that’s surprising.

What needs to change? 

The lack of privacy is one area. Digital payments are too intrusive, though a lot of people don’t care. I get a lot of messages about how terrible it would be for CBDCs (Central Bank Digital Currencies) to include not-yet-existent features such as the ability to block your spending on certain things, yet when it’s Visa or Twitter being able to see everything you buy, no-one seems bothered.

Authentication is another area. It bugs me that 10 years later I'm still doing password resets. Recently I needed to book a hotel room, so I tried logging into a brand where I've got points. I got the password wrong and didn’t want to wait for the reset email. Instead I logged into a different brand and booked a room. My product choice was based on which password I remembered!  

Do you see a role for decentralized identity in fixing these issues? 

I like the underlying standards, Decentralized Identifiers and Verifiable Credentials. But the implementation isn’t there yet. From the history of Bitcoin we can see that people lack the competence to manage their keys. When I drop my phone in the canal, how do I get all my stuff back? In a centralized world it’s easy. I buy a new phone and Apple puts all the pictures back. I’ve got my 2FA (2-Factor Authentication) device, so I can easily log into my bank again. 

Otherwise, I'd have to put my secret key on a memory stick and bury it in a biscuit tin in the back garden. For 99 per cent of the population that will never work. 

How can we overcome these challenges? 

I believe the answer is custodial SSI (Self Sovereign Identity), whereby I generate the key on my app and my bank looks after it. That looks like a viable option to me, because banks know how to securely manage keys in distributed hardware, so I trust them not to lose the key. If they do, there’s recourse, as they’re regulated. 

Do I want to control my digital identity? Absolutely. Do I want to hold the keys? No, I want my bank to do it for me. 

What makes you believe people will trust their bank with the keys to their digital identity? 

There’s trust in the legal sense, and then there’s trust in the everyday sense: I trust that my personal data won’t be looted, that I won’t lose access if I lose my phone… I trust the system to regulate my bank and ensure they don’t do stupid things. In the mass market, that’s the kind of trust that matters — the belief that if something goes wrong, it will get fixed. 

What does a good digital identity experience look like, in your view? 

When I log in to the airline, it should ask “Which ID would you like to use?” If I want to use my Avios app, I should be able to. It might call my EU wallet in the background, but I don't see that, everything is embedded. Personally I'd like to never think about managing my identity again.

In June 2023 you stated that the lack of mass-market digital identity is a drag on the economy. Have you seen much progress since then? 

Lots of companies are experimenting. But is anything mainstream happening? We’re not there yet. For example, I can’t download my Barclays ID and use it to log into my HSBC account.  

We’re starting to see people storing mDLs (mobile driving licenses) in their Apple wallet, and the EU Digital Identity Wallet is on the horizon. Whether it gets traction or not, it’s driving forward research and development. Does that mean the EU wallet will be all-conquering? I don't know. 

You’ve talked about how machine customers or ‘custobots’ will revolutionize ecommerce. Can you expand on this a bit please? 

I think there’s a good chance this will happen, starting with financial services. A bot can probably do a better job of managing my finances than I can. On April 6 (the start of the UK tax year) I’ll be looking at what are the best ISAs (Individual Savings Accounts). I will spend hours faffing about, researching, applying, waving my phone in front of my face to prove it’s me, figuring out which account to transfer money from… It’s the kind of thing that could be done in nanoseconds by AI. 

I might choose the Martin Lewis bot or the Waitrose bot to do this for me. The idea that they could be regulated by the FCA (Financial Conduct Authority) and operate under relevant duty of care legislation, with the co-ordinated goal of delivering my financial health, is appealing. 

I’ve also proposed that companies will soon need to provide APIs to support the needs of custobots rather than people.

Where is digital identity headed, in your view? 

There’s energy arriving into the space from two unexpected quarters. One is CBDCs. There’s a need for identity coming from that side, and pressure to get it fixed. The other area is the metaverse. People looked at the lack of obvious progress following Meta’s early pronouncements and thought, it’s not going anywhere. That’s the wrong lesson to take away. For example Apple Vision Pro (Apple’s extended reality headset) is out and there will be no shortage of people wanting to buy it. 

Digital identity is fundamental to make the metaverse a safe space. Assuming this is done right and the metaverse has an identity layer built in from the start, it could become a safer, less expensive, and therefore more desirable, place to transact than the “real world”. 

Money in the Metaverse: Digital Assets, Online Identities, Spatial Computing and Why Virtual Worlds Mean Real Business will be published in late April 2024. To pre-order, click here.



Thursday, 29. February 2024

EdgeSecure

Empowering Campus Networks

The post Empowering Campus Networks appeared first on NJEdge Inc.

Beginning his career as a student assistant in technology services, Michel Davidoff was responsible for pulling thin and thick Ethernet at Sonoma State University. Upon leaving the university ten years later, Davidoff was in the role of Director of Networking and Telecommunication. “I was responsible for the campus network that we had grown from around three hundred devices to well over 10,000,” says Davidoff. “I left Sonoma State in 2002 and set my sights on the California State University (CSU) Chancellor’s Office. They had started a technology strategy program that included digital equity at its core. The Trustees of CSU were looking to implement the Information Technology Strategy (ITS), which was represented as a pyramid: IT infrastructure was at the base; the center, called middle world, included initiatives and projects such as security and identity management; and the top was student success. CSU’s visionaries, including leadership and trustees, understood very early on the importance of technology to enable education. I was invited to participate in a system-wide initiative to determine the standards for the bottom of the pyramid. Following this meeting, I was eager to help further advance this initiative and joined CSU in a new and exciting role.”

Finding Cost-Effective Solutions
Tasked with helping create a consortium of 23 directors of networking at CSU, Davidoff began building working groups of experts. “Along with being a process facilitator for the consortium, I also created guidelines for writing RFPs, particularly outlining the functional requirements,” explains Davidoff. “A large part of my job at CSU was to provide accurate financial projections, both internal and external, and maintain connectivity for all 23 campuses. In 2006, as wireless technology became more prominent, I was tasked with integrating wireless into each campus. Without more money in the budget to do so, I had to get creative.”

“We began by creating a strategy for network rightsizing,” continues Davidoff. “Since I had the data for every single closet at CSU, I knew that more than 50 percent of the switches were not being used; for fear of not having enough capacity as campuses grew, the network had been built significantly bigger than necessary. I developed a process whereby if a port had not been used in 90 days, it would not be refreshed. That freed up about 50 percent of the budget delegated for switching and routing. We were able to deploy wireless technology on the campuses and, through an RFP, develop the functional requirements. Later, when we needed to enhance and standardize security, we went through a similar process, selected a firewall vendor, and became much more systematic and methodical about the deployment process.”
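
As an illustration of the rightsizing rule Davidoff describes, here is a minimal sketch in Python; the port inventory is a hypothetical data shape standing in for a real export from the campus network-management system.

```python
# A minimal sketch of the 90-day rightsizing rule. The port inventory is
# a hypothetical data shape; real records would come from the campus
# network-management system.
from datetime import datetime, timedelta

REFRESH_WINDOW = timedelta(days=90)

ports = [
    {"id": "bldg1-sw3/0/12", "last_used": datetime(2006, 1, 15)},
    {"id": "bldg1-sw3/0/13", "last_used": datetime(2005, 6, 2)},
]

def eligible_for_refresh(port: dict, now: datetime) -> bool:
    """Refresh a port only if it carried traffic within the last 90 days."""
    return now - port["last_used"] <= REFRESH_WINDOW

now = datetime(2006, 3, 1)
refresh = [p["id"] for p in ports if eligible_for_refresh(p, now)]
skip = [p["id"] for p in ports if not eligible_for_refresh(p, now)]
print(f"refresh {len(refresh)} ports, reclaim budget on {len(skip)}")
```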

Spanning over two decades, Davidoff’s career at CSU allowed him to become well versed in delivering scalable, ground-breaking strategies and solutions for large-scale infrastructure deployments. “I am proud of our accomplishments at CSU, and I believe I was able to bring a few key things to the University,” shares Davidoff. “First is that collaboration works, even if it might be a little slower. Working together and developing RFPs can offer tremendous cost savings; in fact, at the end of my career, most of the equipment we purchased was at discounts greater than 80 percent. My job was to create the most efficient administrative process that requires the least amount of money, while providing an exceptional student learning experience.”

“We need to eliminate complexity to enable innovation. If we keep this complexity in our environment, every time someone wants to innovate, we need to change all these features and configurations. Many vendors or wireless controllers have hundreds of different features and it’s difficult to develop best practices. We need a deterministic network and not a non-deterministic network in order to predict the performance.”

— Michel Davidoff
Chief Strategist of Education, Nile

Encouraging Collaboration
Over the years, Davidoff was responsible for security, log management, and later, in 2014, developing the cloud strategy for CSU. “For the next three to four years, I developed the CSU cloud strategy, and I believe the biggest selling point for leadership was that, unlike networking, the Cloud was a new technology,” explains Davidoff. “Instead of several Amazon cloud experts and several Azure experts at CSU, I suggested creating a small team that focused on cloud technology and how to make it the most efficient and automated. Throughout my career, I’ve seen the value of collaboration, especially when making important decisions on how the campus is going to run and ensuring systems are as efficient as possible. From a long-term strategic standpoint, I am a believer in the wisdom of the group, rather than the wisdom of the individual. If everyone feels they have a voice, a successful outcome is more likely. This was aligned with my philosophy that we don’t give a campus a budget, we give them a solution.”

A day after Davidoff was set to retire in March 2020, CSU shut down its physical campuses due to the pandemic. “Leadership knew CSU must prepare for remote learning, and I began doing a lot of research, along with forming a working group,” explains Davidoff. “We selected a small company to help us teach online in case we would need to offer remote classes. Part of the contract included free licenses for half a million students for up to ten years, as well as licenses for every faculty and staff member. We trained everyone on the software and ensured we could operate online. When the pandemic hit, CSU was the first university system in the U.S. to move fully online without any downtime, because our processes and strategies were ready to go.”

Bringing Insights to a New Role
After retiring, Davidoff began thinking about where he could have an even larger impact on education and helping students. “It became clear that technology companies, especially in the networking domain, are a place where I could make my mark in creating efficient technology solutions,” shares Davidoff. “I learned of a new company, Nile, and I wanted to bring my knowledge and unique perspective of higher education infrastructure and my vast experience in over two hundred network deployments. I knew I could share how the customer thinks because I had been the customer for thirty years.”

“The Edge team works hard to stay at the forefront of innovation in the marketplace. In the world of enterprise networking, Nile represents an entirely new approach that enables organizations to offload the often overwhelming complexities of network management while reaping the benefits of a financially predictable, highly adaptable, and supremely secure network environment. We’re proud to have Nile as a new awardee and partner in our fast-growing EdgeMarket co-op.”

— Dan Miller
Associate Vice President, EdgeMarket

Joining Nile as the Chief Strategist of Education early last year, Davidoff aligns the company’s strategy with an educational lens to ensure all technology and services deliver a superior connectivity experience. “I love the thought leadership part of my role at Nile and writing papers about rethinking networking and higher education,” says Davidoff. “I talk to a lot of students and gather valuable insights about today’s learning expectations. Nile modernizes IT operations through the delivery of a new wired and wireless enterprise network, and as a leader in enterprise Network as a Service (NaaS), it allows institutions to stabilize their budgets. From a financial perspective, you’re able to buy a service that assures capacity, availability, and performance. Organizations can plan how much money is needed every year, instead of seeing a huge spike in the budget five years from now to replace the network or to replace routing. Plus, most importantly, using Nile services helps free up staff to focus on other initiatives, like classroom technology or digital transformation.”

“Normally, if an institution purchases a technology solution from a vendor, that system is at max performance on day one,” continues Davidoff. “Six months later, your firewall is upgraded, your core router is not at current code, and you added ten new features. Your capacity and features are now starting to degrade. Without the time to take care of all the maintenance that needs to happen, your investment keeps losing value over time.”

Davidoff says many organizations are not sufficiently leveraging automation in order to efficiently run and maintain the network while creating complexity that no human can solve. “We need to eliminate complexity to enable innovation. If we keep this complexity in our environment, every time someone wants to innovate, we need to change all these features and configurations. Many vendors or wireless controllers have hundreds of different features and it’s difficult to develop best practices. We need a deterministic network and not a non-deterministic network in order to predict the performance.”

Partnering with Edge
Recognizing the important role networking infrastructure plays in the evolution of IT, Edge recently released an RFP to prospective vendors who could provide a NaaS to member organizations. The goal was to provide Edge members with NaaS services that allow these institutions to focus on promoting capabilities and skills, while reducing costs, promoting efficiencies, and improving security. Davidoff led Nile’s response to the RFP and was recently awarded a master contract with Edge (CBTS was the other awardee). “The Edge team works hard to stay at the forefront of innovation in the marketplace,” says Dan Miller, Associate Vice President, EdgeMarket. “In the world of enterprise networking, Nile represents an entirely new approach that enables organizations to offload the often overwhelming complexities of network management while reaping the benefits of a financially predictable, highly adaptable, and supremely secure network environment. We’re proud to have Nile as a new awardee and partner in our fast-growing EdgeMarket co-op.”

Nile helps higher education institutions deliver an uninterrupted wired and wireless experience with a performance guarantee for coverage, availability, and capacity. “Nile can help free up capital and resources to focus on meeting the demands of modern education,” says Davidoff. “We want to help institutions deliver on their mission and provide the strategic value that leadership is looking to achieve. Nile aims to help organizations break free from the traditional constraints of legacy network infrastructures and use IT resources to strategically enhance learning in a digital era.”

To learn more about how Nile is helping institutions move beyond the networking status quo, visit nilesecure.com/enterprise-network/higher-education.

View Article in View From The Edge Magazine

The post Empowering Campus Networks appeared first on NJEdge Inc.


Reorganizing, Restructuring, and Revisioning Your IT Organization for Digital Transformation

The post Reorganizing, Restructuring, and Revisioning Your IT Organization for Digital Transformation appeared first on NJEdge Inc.

As education institutions and public sector organizations continue to navigate through the critical process of adapting their IT organizations for the digital age, many look for innovative ways to align team members and streamline processes to help advance these objectives. To create an effective strategy, Christopher Markham, Executive Vice President and Chief Revenue Officer, Edge, says starting with a few basic questions can help frame the conversation in how to move forward. “An important question to begin with is how does your organization view information technology? Do you view IT more as an engineering operation or as a service operation? Leadership must also determine if IT is viewed as an art or science, because there are plenty of institutions where IT is expected to be the primary change agent or innovator, not just in the administrative side of the house, but in educational technologies.”

“Organizations should also explore their return on investment from IT, including technology assets and staff,” continues Markham. “Do you have a return on investment and a rate of return? In addition, leadership must explore if technology is informing the business process both on the administrative and academic side, or is technology being informed by those business processes.” Achieving alignment across an IT organization involves several core axioms, including:

Authority and accountability must match
Specialization and teamwork
Precise domains and clear boundaries
The basis of a substructure
Avoid conflicts of interest
Cluster by professional synergies
Business within a business

“The golden rule is that authority and accountability in an IT organization must match,” says Markham. “You want to define clear boundaries with no overlaps or gaps and divide a function into groups based upon its strengths. In addition, cluster groups under a common leader based on similar professions. Institutions must also view higher education and information technology as a business. Faculty, students, and staff are considered customers and every manager is an entrepreneur. An entrepreneur is anyone who brings together all the different pieces to ensure service delivery of IT and high-quality services and solutions.”

“IT governance, funding and financial management, and enterprise data and data governance are among the top technology-related domains that impact digital transformation readiness.”

— Christopher Markham
Executive Vice President and Chief Revenue Officer, Edge

Achieving Digital Transformation Readiness
The first principle of aligning authority and accountability is of top importance and what Markham calls the golden rule in IT organizational design. “This alignment is essential to the success of every IT organization and the institution it serves. In a particular case study, a CIO appointed a few process owners at the suggestion of process improvement consultants. Each was assigned a process that engaged people from various parts of the organization in producing a specific service. These process owners had authority over those processes, and while they were collaborative and involved stakeholders in designing and implementing the new processes, they were not process facilitators who served others by bringing teams to consensus on how they’ll work together. Process owners didn’t have matching accountability for the effectiveness of those processes, and they weren’t the individuals accountable for delivering the services. The service delivery groups were accountable for the delivery of services, but they didn’t have the power to determine the processes they used to do their jobs.”

“If these service delivery groups failed, there was no way to know whether it was due to their own poor performance or due to a bad process,” continues Markham. “Nonetheless, they took the blame. Process owners implemented detailed, rigorous business processes and succeeded at their mission, but the organization became bureaucratic, slow, and inflexible as a result. This structure violated the golden rule. In re-envisioning and restructuring an IT organization, the CIO needs to decide the rules of the game and create the organizational ecosystem, including the group structure, the resource governance process, and the culture.”

Increasing the Pace of Innovation
Once the right structure is in place, leaders can take the opportunity to adjust domains as needed, arbitrate any disputes, create a healthy environment for teamwork, and develop talent through recruiting, inspiring, and coaching efforts. “Leaders should manage performance, including negotiating staff’s objectives, giving frequent feedback, measuring the results, deciding rewards, and managing performance problems,” says Markham. “CIOs can leverage performance programs and evaluations to restructure, reorganize, and incentivize. They must also manage commitments and resources, which includes assigning work within the group and coordinating shared decisions, like common methods and tools and professional practices. In addition, the CIO must make decisions when consensus cannot be reached.”

Markham shares another case study where the CIO in a large insurance company was tasked with addressing complaints from the business executives regarding the IT department’s opacity, unresponsiveness, and poor understanding of their business strategies. “The leadership in this organization was frustrated that they couldn’t control IT’s priorities and did not understand why many of the requests were not being fulfilled. There was a trend toward decentralization and many business units had started their own small IT groups, which the CIO disparagingly called Shadow IT. These groups only existed because business units did not want to do business with corporate IT. In response, the CIO dedicated a group to each business unit and divided his engineering staff among them. Each group was relatively self-sufficient with all the skills needed to deliver.”

“The senior managers also served as the primary liaisons to those business units,” continues Markham. “The CIO felt this structure would appease the business units and stave off further decentralization, while holding senior managers accountable for business results and client satisfaction. Unfortunately, technical specialists were needed throughout the organization, and since technology subspecialties were scattered among the various client-dedicated groups, this limited their professional exchange. When the sales team, for example, ran into technical challenges, they may not have known that someone in another group had already encountered that issue and knew a solution. Their peers were busy with other priorities, costs rose, response times slowed, and everyone was reinventing solutions to common problems. Meanwhile, there was little impetus for standards, and individual teams built systems that were optimal for their specific clients, not for the enterprise as a whole.”

Markham continues, “The pace of innovation also slowed, and the organization could not hire an expert in an emerging technology until demand grew across the whole enterprise. As a result, business opportunities to build customer loyalty were missed and the impacts extended beyond IT’s performance. Over time, the structure led to multiple general ledger systems and multiple records for the same customer. Synergies were lost as the company lost a single view of its customers, resources, and suppliers.”

Including productivity specialists can bring efficiency to an IT organization, which can translate into cost savings and a better return on investment. “Specialists have ready answers and don’t have to climb the learning curve with each new challenge,” says Markham. “Quality specialists know the latest methods and technologies in their field, so their products are more capable and have lower lifecycle costs. Competence and experience deliver results with fewer risks. Innovation specialists can keep up with the literature and be the first to learn about emerging technologies and techniques. As a result, the pace of innovation improves. Since they are confident in their abilities, specialists experience less stress, are more productive, and are more likely to excel in their careers.”

 “An important question to begin with is how does your organization view information technology? Do you view IT more as an engineering operation or as a service operation? Leadership must also determine if IT is viewed as an art or science, because there are plenty of institutions where IT is expected to be the primary change agent or innovator, not just in the administrative side of the house, but in educational technologies.”

— Christopher Markham
Executive Vice President and Chief Revenue Officer, Edge

Driving Organizational Change
Creating an IT strategy that optimizes processes and technology and fosters a culture of innovation includes several domains of enterprise architecture. “IT governance, funding and financial management, and enterprise data and data governance are among the top technology-related domains that impact digital transformation readiness,” says Markham. “Each of these domains represents a specialization of the IT reference disciplines or an olive branch from those IT reference disciplines, and the business architecture is an olive branch with each of the functional offices in both administration and academics. But without labeling these domains properly as a CIO, it’s very difficult to reorganize, restructure, or re-envision your organization. The cost of overlapping these domains and failing to cluster by professional synergies is reduced specialization, redundant effort, confusion, product disintegration, less innovation and teamwork, and a lack of entrepreneurship.”

Edge’s E360 assessment is designed to provide a holistic, 360-degree view of an institution’s current-state technology program with a focus on the technology-related domains. Taking a diagnostic and prescriptive approach to evaluating the technology organization, Edge looks at four key areas. “We first identify any unreliable processes and whether there is reduced specialization as a result of these gaps,” explains Markham. “We also look at whether that reduced specialization leads to conflicts of interest. The E360 assessment also focuses on the professional exchange between the domains: whether there are domain overlaps, the level of coordination, and whether each operates as a business within a business. Lastly, we explore the substructure and the results of reduced specialization, domain overlaps, and inappropriate biases. E360 produces a final report that includes not only outcomes and analysis, but a three-year roadmap for an IT organization to drive organizational change, improve their technology landscape, and achieve digital transformation goals successfully.”

Ready to achieve operational efficiency and digital transformation? Learn more at njedge.net/solutions-overview/digital-transformation

View Article in View From The Edge Magazine

The post Reorganizing, Restructuring, and Revisioning Your IT Organization for Digital Transformation appeared first on NJEdge Inc.


Edge Colocation Services

The post Edge Colocation Services appeared first on NJEdge Inc.

In an age where data collection and analysis continue to grow in importance in nearly every industry, many organizations seek innovative and affordable ways to store data and expand their networking capabilities. In the education community, not every institution is equipped with a large IT infrastructure or the space to host servers, networking equipment, and data storage. To help address this need, Edge offers affordable colocation services where member institutions can receive data center support and colocation space for disaster recovery and business continuity. “Colleges and universities have always had the responsibility to design, build, and run data centers on college campuses,” says Bruce Tyrrell, Associate Vice President Programs & Services, Edge. “Unfortunately, the physical infrastructure, including commercial power, backup generators, and environmental services, is extremely expensive and complex to deploy, especially in a typical college campus environment that was not designed for these requirements. Our colocation services are an enhancement of our members’ existing connectivity to the Edge network. By leveraging their existing last-mile connections, members have the ability to place hardware at one of several locations around the region.”

With Edge maintaining high availability colocation data centers throughout the Northeast region, several members are choosing to exit the owned data center space and move their hardware to an off-campus location. “Many institutions are relocating hardware to a purpose-built facility that has been professionally engineered and constructed with the desired features,” says Tyrrell. “Access to these features is included in the monthly recurring costs for space outsourcing and using a colocation provider can help reduce the need for additional staff to handle the physical management of those environments.”

Benefits of Colocation
From their optical network, Edge can build connections for members from their campuses directly into the colocation facilities. “Member institutions can choose to place hardware infrastructure at the enterprise-grade colocation facility on the campus of Montclair State University at a significant discount over commercial space,” explains Tyrrell. “Colocation is available along our optical network and provides access to 165 Halsey Street in Newark; the New Jersey Fiber Exchange (NJFX) in Wall Township, adjacent to the Tata international cable landing station; and 401 N Broad Street in Philadelphia. Members can also access the Digital Realty colocation facility at 32 Avenue of the Americas in Manhattan. Edge is expanding our colocation capability by adding the facility at DataBank in Piscataway, New Jersey, a bespoke water-cooled facility designed with High Performance Computing in mind.”

Colocation data centers allow members to store their equipment in a secure location with a public IP address, bandwidth, and power availability. These locations also include backup power in the event of an outage. “An organization can use Edge colocation services to extend its internal infrastructure into a professional colocation space that is seamless from an end-user point of view,” says Tyrrell. “The Edge model is unique in that the bandwidth provided to our members is not shared with any other organization, and since this extension is transparent, students, faculty, and staff do not realize their data is traveling off campus to a data center and back; the data transfer takes only microseconds.”

With Edge as the provider of the bandwidth, both internally connected to the campus, as well as externally via their internet connections, these connections are designed to scale and burst. “Unlike a cloud environment, where there is an increased cost for bursting when an organization’s computing resources reach their max, a colocation environment offers costs that are fixed,” explains Tyrrell. “An organization rents a cabinet and purchases hardware to store in this cabinet. Edge fixes the cost of transport and internet capacity, which can allow for greater budget predictability. This is different from the Cloud, where once an application is placed in the Cloud, upticks in utilization for those apps can have a direct impact on the monthly expense to operate those services. For some institutions, having a fixed monthly budget for colocation services is easier to operationalize from a financial perspective.”

“Unlike a cloud environment, where there is an increased cost for bursting when an organization’s computing resources reach their max, a colocation environment offers costs that are fixed. An organization rents a cabinet and purchases hardware to store in this cabinet. Edge fixes the cost of transport and internet capacity, which can allow for greater budget predictability. This is different from the Cloud, where once an application is placed in the Cloud, upticks in utilization for those apps can have a direct impact on the monthly expense to operate those services. For some institutions, having a fixed monthly budget for colocation services is easier to operationalize from a financial perspective.”

— Bruce Tyrrell
Associate Vice President Programs & Services, Edge
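
To make the budget comparison concrete, here is a minimal sketch in Python; all dollar figures are hypothetical placeholders, not actual Edge or cloud-provider pricing.

```python
# A minimal sketch of the fixed-cost vs. burst-cost comparison. All
# dollar figures are hypothetical placeholders, not real pricing.
COLO_MONTHLY = 2_500.0           # fixed: cabinet plus flat-rate transport/internet
CLOUD_BASE = 1_800.0             # cloud baseline for a comparable workload
CLOUD_BURST_SURCHARGE = 1_400.0  # extra usage-based spend in a burst month

def annual_cost(burst_months: int) -> tuple[float, float]:
    """Colocation stays flat; cloud spend rises with every burst month."""
    colo = 12 * COLO_MONTHLY
    cloud = 12 * CLOUD_BASE + burst_months * CLOUD_BURST_SURCHARGE
    return colo, cloud

for bursts in (0, 3, 6):
    colo, cloud = annual_cost(bursts)
    print(f"{bursts} burst months: colocation ${colo:,.0f} vs. cloud ${cloud:,.0f}")
```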

Onboarding and Support
When an institution selects colocation services, Edge’s engineers help walk the member’s IT team through the ins and outs of the processes and can accompany them to colocation facilities to familiarize them with the data centers. “Edge acquires the space, coordinates the connectivity, and assists in providing remote and physically secured access to the cabinets or cages,” says Tyrrell. “We also handle all the administrative pieces, like billing and passing along clean invoices to the member. Since colocation facilities can often be complex and intimidating, Edge can visit the facilities with you during the onboarding process.”

“Colocation is a unique environment that can be complex from both an operational and an acquisition perspective,” continues Tyrrell. “Edge has decades of experience in operating these environments, and we stand ready to assist our members with transitioning hardware and applications into these professionally maintained Tier III colocation facilities. Once the transition has been made, members are better positioned to weather the storms and unforeseen outage conditions that have been known to impact on-campus data centers. This resilient infrastructure can provide peace of mind and a cost-friendly way to optimize resources and meet the growing demands of today’s higher education community.”

To learn more about Edge’s colocation services and how to take advantage of the latest and greatest developments in networking technology, visit njedge.net/solutions-overview/network-connectivity-and-internet2.

View Article in View From The Edge Magazine

The post Edge Colocation Services appeared first on NJEdge Inc.


Navigating AI-Powered Education and the Future of Teaching and Learning

The post Navigating AI-Powered Education and the Future of Teaching and Learning appeared first on NJEdge Inc.

With the age of artificial intelligence (AI) well underway, how we work, learn, and conduct business continues to transform and open the door to new opportunities. In the classroom, AI can be a powerful teaching tool and support innovative and interactive learning techniques and critical thinking. Dr. C. Edward Watson, Associate Vice President for Curricular and Pedagogical Innovation with the American Association of Colleges and Universities (AAC&U) and formerly Director of the Center for Teaching and Learning at the University of Georgia, explores how AI is revolutionizing the future of learning and how educators can adapt to this new era of human thinking in his new book, Teaching with AI: A Practical Guide to a New Era of Human Learning (Johns Hopkins University Press).

“AI is a significant game changer and is presenting a new challenge that is going to be dramatically different from past disruptive innovations,” says Watson. “Goldman Sachs and other sources estimate that two-thirds of U.S. occupations will be impacted by AI.1 With a vastly accelerating expectation within the workforce that new graduates will be able to leverage AI for work, there is growing pressure on institutions of higher education to ensure students become well-versed in AI techniques. This new learning outcome for higher education is being termed AI literacy.”

AI is also introducing a new academic integrity challenge, including how to accurately determine whether students are using AI to complete assignments. Along with Teaching with AI co-author José Antonio Bowen, Watson explores crucial questions related to academic integrity, cheating, and other emerging issues in AI-powered education. “The real focus of the book is how to create assignments and practices that increase the probability that students will engage with the work rather than turn to AI, as well as ways to partner with AI and use these tools in meaningful and impactful ways. Instead of fearing AI and how students may misuse it, the education community must employ logical pedagogical practices within the classroom that encourage our students to become competent partners with AI, including building AI literacy skills that will help them on their future career paths.”

“I look forward to discussing the higher education landscape at EdgeCon and exploring suggestions for how we might move forward. We need to acknowledge that AI is going to be an important thread in the education and research industries. Disruption is not always a bad thing, especially in the workforce. AI can help improve efficiencies, reduce costs, increase productivity, and create new job opportunities. In the higher education setting, these tools have the potential to offer personalized learning experiences, strengthen retention, and resolve accessibility issues. Along with the potential challenges this type of technology may introduce, we must also look at the positive opportunities that will arise and how we can better prepare our students for the world that is already waiting for them.”

— Dr. C. Edward Watson
Associate Vice President for Curricular and Pedagogical Innovation,
American Association of Colleges and Universities (AAC&U)

AI in the Classroom and Beyond
With over twenty-five years of experience in faculty and instructional development, Watson is nationally recognized for his expertise in general education, active learning, classroom practice, course design, faculty development, student learning, and learning outcomes assessment. “I believe in the transformative opportunities that higher education can provide individuals, especially first-generation students like myself,” shares Watson. “When I entered a master’s program in English, I became increasingly interested in the puzzle of how learning works. I wanted to better understand how to make learning more meaningful for students, how to engage them, and how to ensure what I’m teaching is not just memorized for an exam, but will be remembered and utilized long after the course is completed. As I advanced in my career, I was able to take what I learned helping students in my own classroom to provide programming and opportunities that could benefit the breadth of higher education.”

Even though change can be slow within the education community, Watson says the dramatic, fast shifts happening in the industry are causing many institutions to take notice. “Unfortunately, as higher education begins to adapt, AI is creating new digital inequities. Many institutions are struggling to determine how to best serve their students given the new challenges and opportunities. Institutions will need leaders who continue to explore how advancements like AI are changing their world and the ways in which they can harness and manage AI as a powerful teaching tool.”

“To begin to understand AI and its capabilities, I recommend that faculty copy and paste a current assignment into two or three different AI tools to better understand the opportunities, restrictions, and surprises. This can provide insight into ways to improve the assignment and make it better aligned with the way students might be expected to complete similar work in the real world post-graduation. I think going forward, we will see AI more deeply integrated within systems we already depend upon. For instance, within learning management systems (LMS), it’s foreseeable that when students submit assignments, the AI-assisted LMS will check for AI, plagiarism, and may even grade and provide customized feedback using a faculty-designed rubric.”
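
As a purely hypothetical illustration of the rubric-based feedback Watson foresees, the sketch below sends a submission and a faculty rubric to a large language model via the openai Python package; the model name, rubric text, and submission are placeholders, not a real LMS integration.

```python
# A hypothetical sketch of rubric-based AI feedback, assuming the
# "openai" package (v1 API). Model name, rubric, and submission are
# placeholders; this is not an actual LMS feature.
from openai import OpenAI

client = OpenAI()  # reads OPENAI_API_KEY from the environment

rubric = "Thesis clarity (4 pts); use of evidence (4 pts); style (2 pts)."
submission = "Full text of the student's essay goes here..."

response = client.chat.completions.create(
    model="gpt-4o-mini",  # placeholder model name
    messages=[
        {"role": "system",
         "content": f"Grade the essay against this faculty rubric: {rubric}"},
        {"role": "user", "content": submission},
    ],
)
print(response.choices[0].message.content)  # draft of customized feedback
```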

From a teaching perspective, AI can also be beneficial in helping instructors create rubrics and improve the quality of their course syllabus and assignments. “I hope more faculty look at AI as a toolbox, rather than something to fear,” says Watson. “Teachers are still the experts in their field, and AI can help them elevate their courses and find new ways to improve the learning experience. AI is not a search engine; it is more like a knowledgeable colleague. Using it is more about prompt engineering and having a conversation that fine-tunes the results. Faculty should see AI as an idea generator that could be leveraged and helpful with many aspects of the classroom and beyond.”

ChatGPT, a chatbot developed by OpenAI and launched in November 2022, is a common AI tool used to automate tasks, compose essays and emails, and have human-like conversations. According to a recent survey conducted by Study.com, 89 percent of students over the age of 18 have used ChatGPT to help with homework, while 48 percent confessed they had used it to complete an at-home test or quiz.2 “While many students are familiar with AI tools like ChatGPT, not all educators are aware of its prevalence, causing a disconnect,” says Watson. “Showing faculty how this tool can be useful is key, and encouraging them to have open and honest conversations with students about how AI can be used as a tool for learning, rather than a way to cheat on their schoolwork, is now an essential early-in-the-semester conversation. Instead of approaching AI with how it is breaking your pedagogy, consider how AI is relevant for what you would like to accomplish in preparing your students for the future.”

“I hope more faculty look at AI as a toolbox, rather than something to fear. Teachers are still the experts in their field, and AI can help them elevate their courses and find new ways to improve the learning experience. AI is not a search engine; it is more like a knowledgeable colleague. Using it is more about prompt engineering and having a conversation that fine-tunes the results. Faculty should see AI as an idea generator that could be leveraged and helpful with many aspects of the classroom and beyond.”

— Dr. C. Edward Watson
Associate Vice President for Curricular and Pedagogical Innovation,
American Association of Colleges and Universities (AAC&U)

Adapting Higher Education in a New Era
With a theme of Excelling in a Digital Teaching and Learning Future, EdgeCon Spring 2024 will welcome Dr. Watson as a keynote speaker to explore how higher education is evolving and ways to overcome the challenges the industry is facing. “A recent Gallup survey shows a steep decline in how higher education is perceived in this country,3” says Watson. “Less than half of Americans have confidence in higher education. All of us within our industry should consider how we can positively impact this national perception of higher education, as there are ramifications. Not preparing students for what will certainly be an AI-enhanced career, or recklessly using AI detection tools in ways that might unjustly accuse significant numbers of students of cheating, can be significantly dangerous for higher education. Combine such practices with the ongoing student debt crisis and a politically polarized higher education dynamic, and more and more students will question whether higher education is still as important as it once was. Already, many ask if higher education is still a cornerstone of the American Dream.”

“I look forward to discussing the higher education landscape at EdgeCon and exploring suggestions for how we might move forward,” continues Watson. “We need to acknowledge that AI is going to be an important thread in the education and research industries. Disruption is not always a bad thing, especially in the workforce. AI can help improve efficiencies, reduce costs, increase productivity, and create new job opportunities. In the higher education setting, these tools have the potential to offer personalized learning experiences, strengthen retention, and resolve accessibility issues. Along with the potential challenges this type of technology may introduce, we must also look at the positive opportunities that will arise and how we can better prepare our students for the world that is already waiting for them.”

View Article in View From The Edge Magazine

The post Navigating AI-Powered Education and the Future of Teaching and Learning appeared first on NJEdge Inc.


Maintaining Quality Online Learning Programs

The post Maintaining Quality Online Learning Programs appeared first on NJEdge Inc.

Creating and sustaining quality online learning experiences has become a top priority across the higher education community and plays a key role in the appeal and competitiveness of an institution. As these online programs are developed and implemented, quality assurance frameworks and processes are essential to ensuring that these programs meet rigorous standards and continue to align with learning objectives. “Having standards that everyone from across an institution has to meet is of paramount importance in higher education,” says Joshua Gaul, Associate Vice President & Chief Digital Learning Officer, Edge. “The lack of standards in today’s higher education system is a top reason for the drop in retention and enrollment, especially among community colleges and small private schools. Every organization should ensure their course offerings and entire digital presence meet quality industry standards, including ADA compliance.”

Using Rubrics to Assess Course Quality
To help ensure learners are engaging with high-quality courses, Quality Matters (QM) is among the most well-known programs for creating a scalable process for quality assurance. “QM is a global organization leading quality assurance in online and digital teaching and learning, and it is used to impact the quality of teaching and learning at the state and national level,” says Gaul. “QM has eight general standards and 42 total standards. More than 1,500 colleges and universities have joined the Quality Matters community, and they’ve certified thousands of online and hybrid courses, as well as trained over 60,000 education professionals, including myself, on online course design standards.”

The SUNY Online Course Quality Review Rubric (OSCQR) is another well-respected online design rubric, used and developed by SUNY Online in collaboration with campuses throughout the SUNY system. “With six general standards and 50 total standards, the OSCQR is openly licensed for anyone to use and adopt, and it aims to support continuous improvement and quality accessibility in online courses,” explains Gaul. “The rubric and the online course review and refresh process support large-scale online course design efforts systematically and consistently. The goal is to ensure that all online courses meet a quality instructional design and accessibility standard and are regularly and systematically reviewed, refreshed, and improved to reflect campus guidelines and research-based online effective practices.”

“In addition to QM and OSCQR, there are many other rubrics used to systematically evaluate courses,” continues Gaul. “No matter which rubric you are using, it’s important to have accountability and a knowledge-sharing process about these standards across the entire institution.”

Implementing an Evaluation Cycle
Regardless of the program being used to conduct online course quality review, developing an evaluation cycle is essential to ensuring courses are meeting key standards. “The first step in implementing an evaluation cycle is gathering data and understanding the trends of your organization,” says Gaul. “What is the enrollment frequency, which courses have high enrollment, how many students fail or drop out? In classes that have very low enrollment or high drop rates, what are the barriers to success? Institutions should review the disciplines and courses with the highest enrollment and determine which courses should be evaluated and revised on a more frequent basis. Looking at the data closely can provide valuable insight into the effectiveness and quality of each online course.”
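
As one way to picture this data-gathering step, here is a minimal sketch, assuming pandas and a hypothetical CSV export of course records; the column names and review thresholds are illustrative assumptions, not part of any specific institution's process.

```python
# A minimal sketch of the data-gathering step, assuming pandas and a
# hypothetical CSV export; column names and thresholds are illustrative.
import pandas as pd

courses = pd.read_csv("course_stats.csv")
# hypothetical columns: course_id, enrollment, drop_rate

# High-enrollment courses get a more frequent review cycle; courses with
# high drop rates or very low enrollment get a closer look at barriers.
frequent_review = courses[courses["enrollment"] >= 100]
needs_attention = courses[
    (courses["drop_rate"] > 0.20) | (courses["enrollment"] < 10)
]

print(frequent_review[["course_id", "enrollment"]])
print(needs_attention[["course_id", "drop_rate", "enrollment"]])
```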

In between offerings, institutions should take stock of online courses as a whole and reflect on ways to enhance course content, engagement, and student outcomes. During this assessment, important questions to ask include:

Does the course learning environment welcome and include all students?
Is engagement encouraging?
Are there opportunities for self-reflection and discussion?
Do activities provide opportunities for realistic, relevant, and meaningful application of knowledge?
Are students achieving the goals of the course?
Is the workload reasonable for both students and the instructor?

Adopting a mission to review and update all courses to ensure the highest quality content and experience can go a long way in improving the brand of an institution and creating a student-centric learning environment that attracts positive attention. To successfully create an evaluation cycle, Gaul says each institution needs a defined project management process. “Each organization should map out a review process that defines individual roles and responsibilities. This should involve instructional designers, librarians, IT services, student support, and academic support. This process should not fall solely on the instructor. If you think of it like building a house, the faculty member is the homeowner, the instructional designer is the general contractor, and IT is your plumbing and electrical. Every person needs to be involved in the planning from day one to ensure a successful build.”

Building a Course Assessment System
Any time an institution begins assessing courses, whether it’s from a system level or individual course level, there are often barriers to overcome. “When technology is involved in instruction, there should be a collaborative effort to identify and overcome any hurdles,” says Gaul. “Technology should never lead academia; teaching should lead the technology. We must remember that all students are cognitively different, and this is why Universal Design for Learning (UDL) leans towards accessibility and flexibility and removing barriers to learning. These barriers can include inadequate support, where students do not know where to go for help, whether that’s technical, tutoring, writing style, etc. Access to support must be built into the course in order for students to feel supported and demonstrate emotional intelligence within the class.”

Other common barriers include a lack of a learning community and boredom. Without students feeling connected to the instructor and other classmates, they can become isolated, and without interesting content and delivery, students can feel disengaged. “System barriers we regularly see in regards to course assessments involve implementation,” says Gaul. “Lack of commitment, poor preparation, and inconsistency can all affect the success of a course assessment. Unless there’s some sort of checks and balances, courses are going to be inconsistent, and students are going to have difficulty moving seamlessly between classes if they’re taking more than one online course. The purpose of building a course assessment system is to free up faculty and give them the proper support they need to be successful.”

“Whether a course is fully online, hybrid, HyFlex, or in-person, we can help make sure it meets all the standards of quality technology enhanced instruction. This can provide a level of risk management and quality control that can often get ignored when there’s too much focus on the tools, system recruitment, and retention. Member institutions can also count on web and educational technology support. Edge provides technology and web support service management frameworks and ticketing systems to help with website maintenance and web content management. Most importantly, we can help provide thought leadership in how to implement a systemwide course assessment and revision cycle.”

— Joshua Gaul
Associate Vice President & Chief Digital Learning Officer, Edge

Instructional Design Support
Designing and managing online courses can be a challenging task, especially without the resources and training to do so effectively. Well-versed in instructional design, the Edge team understands digitally-enabled learning environments and how to evaluate online courses against standard industry rubrics. “Edge understands the methodologies, rubrics, and standards that go into the creation of a high quality curriculum,” says Gaul. “We have worked with colleges and universities to conduct evaluations and identify trends we see in their courses. We can also build workshops to help train faculty and students and improve their understanding of why online instruction is different from traditional classroom learning. Specifically, we help prepare staff and students for the challenge of online education through engaging student-centered experiences built to encourage online presence and encouraging active learning methodologies.”

Edge’s course and curriculum evaluation services are designed to help an institution deliver a top-quality product. “Whether a course is fully online, hybrid, HyFlex, or in-person, we can help make sure it meets all the standards of quality technology enhanced instruction,” says Gaul. “This can provide a level of risk management and quality control that can often get ignored when there’s too much focus on the tools, system recruitment, and retention. Member institutions can also count on web and educational technology support. Edge provides technology and web support service management frameworks and ticketing systems to help with website maintenance and web content management. Most importantly, we can help provide thought leadership in how to implement a systemwide course assessment and revision cycle.”

“Our team of experts can help an organization bridge the gap between technology and academia and lead a collaborative effort as opposed to two silos working in competition,” continues Gaul. “We can customize for smaller niche projects, support larger, longer-term initiatives, or become an extension of your team. Edge can provide documentation used in the project and whatever we produce will be owned by the institution, whether it’s a learning object or a series of training modules.”

Gaul says if online courses are not being reviewed and revised regularly, those learning experiences will not make an impact. “Revision cycles that are high quality, trust the data, and have accountability and responsibility are incredibly important to ensuring course content is engaging and impactful. Every institution should look at how their offices work together to create a course evaluation and revision cycle that is beneficial and supportive to the student. As you look for ways to improve your institution, Edge wants to help you transform your instruction, advance your online education, and find powerful ways to improve the way you do business.”

To learn more about optimizing courses for online learning and transforming the student experience, visit njedge.net/solutions-overview/digital-learning.

View Article in View From The Edge Magazine

The post Maintaining Quality Online Learning Programs appeared first on NJEdge Inc.


The Growing Demand for Instructional Designers

The post The Growing Demand for Instructional Designers appeared first on NJEdge Inc.

As the wave of digital transformation continues to change and shape higher education, the demand for highly skilled talent who understand instructional design is growing too. Especially over the last couple of years, when online learning skyrocketed, institutions had to quicken their pace in offering remote classes, while also creating new online courses, programs, and degrees as we entered a modern era of learning. Instructional designers, or learning designers, have become essential members of an organization, but not only are they difficult to find, many with these credentials do not pursue roles in higher education. And for those who do work at colleges and universities, the increasing pressure to be experts in a multi-faceted profession, where institutions are investing in technology at an astounding rate, is causing many instructional designers to experience workplace burnout.

Many instructional designers find themselves responsible for designing courses, building learning materials, coding, project management, and ensuring the effective delivery of instructional materials and experiences. With such a high bar, it can be challenging for these individuals to keep up. “Many institutions look at their instructional designers as workhorses,” says Joshua Gaul, Associate Vice President & Chief Digital Learning Officer, Edge. “Oftentimes, faculty members bring the content to the instructional designer and they then organize the content and build the course from a technical standpoint.”

“The biggest benefit of instructional design is not just knowing how to use the learning management system (LMS) or how to repurpose your content and put a discussion board together,” continues Gaul. “These experts work with faculty and leadership to bounce off ideas and integrate learning theory and pedagogy. Instead of shouldering faculty members with learning design on top of teaching and working to elevate the curriculum, instructional designers can help lighten this load and bring an expert perspective that can be hugely valuable to an institution.”

“The biggest benefit of instructional design is not just knowing how to use the learning management system (LMS) or how to repurpose your content and put a discussion board together. These experts work with faculty and leadership to bounce off ideas and integrate learning theory and pedagogy. Instead of shouldering faculty members with learning design on top of teaching and working to elevate the curriculum, instructional designers can help lighten this load and bring an expert perspective that can be hugely valuable to an institution.”

— Josh Gaul
Associate Vice President & Chief Digital Learning Officer
Edge

Quality Assurance in Online Learning
The approach to instructional design and how the field is regarded varies across the higher education community, and even between departments within an organization. “Institutions view instructional designers differently, and it’s often tied to their current digital learning path,” shares Gaul. “The schools that were already forward-thinking during the pandemic didn’t have as large a shift in their business processes. The organizations seeing the most change in these processes are the ones who embraced the change but had to adjust on the fly. For the schools that do have instructional designers on staff, there is not always a unified approach to instructional design. The school of biology has different-looking courses than the school of journalism, for example, but there need to be instructional design standards that ensure quality and compliance and offer a clear model for all courses to follow.”

“Rubrics like OSCQR and Quality Matters (QM) establish an instructional design support framework,” continues Gaul. “Some organizations fear a centralized online learning approach will make courses too uniform, when in actuality, standardization gives faculty more academic freedom to customize their courses without worrying about accessibility. For instance, think about a textbook. Each has a table of contents, an index, and is broken into chapters. While the subject matter may be vastly different, it still follows this basic format, and when a student opens the book, they know how to use it to get the information they need.”

Aligning Business Goals with Instructional Design
Instructional design not only encompasses online learning, but extends to in-person instruction and hybrid learning as well. Many in this profession report long hours, lack of support, tight deadlines, and unrealistic expectations, all of which can lead to frustration and fatigue. “Some smaller institutions have a centralized instructional design office, but it’s often poorly staffed and leans more toward instructional technology training,” says Gaul. “This team will train faculty on how to use the LMS or the educational technology tools.”

Gaul continues, “Not every organization is going to have the budget to employ an instructional designer, especially someone who has an advanced education and necessary skill set. This is where Edge can be of value and offer instructional design support. With a team of over twenty seasoned instructional designers, we have consultants across the country, all with at least a master’s degree and several years of experience. Our instructional designers will work with an organization’s subject matter experts to analyze, design, develop, implement, and evaluate instructional materials and programs for an institution.”

As a longtime partner of Edge, Rowan University has a full team of instructional designers, but wanted to free them up to focus on faculty support and taking learning to the next level. “Rowan was able to move the course development and term-to-term updates to Edge and give their instructional design team the ability to work more closely with faculty members to elevate their courses and create more engaging, student-centric content that meets quality standards. However, we understand many schools do not have the budget for a large instructional design team, and that is why EdgeLearn can be a valuable solution for institutions. Edge can provide the expert support, strategy, and tools needed to enhance teaching, learning, and student engagement, without breaking the bank.”

One of the most important factors in successfully implementing digital learning programs and systems is ensuring business processes and goals align with instructional design initiatives. “Technology doesn’t drive the mission, but the technology can be informed by and follow the academic mission of the institution,” says Gaul. “As long as you have the right tools in place and the people who understand those tools and processes, you can accomplish amazing things with a small group of instructional designers. Edge can supplement an institution’s existing team or provide expert assistance in developing student-focused curriculum.”

“Edge can come in and hit the ground running because our team knows every tool,” continues Gaul. “We understand how to use data to glean important insights about an organization’s instructional design or ways artificial intelligence (AI) can open doors to new opportunities for digital learning. No matter what LMS an institution is using, or what rubric they follow, we can offer the specialized support needed to help fill any gaps and create superior teaching and learning experiences—anytime, anywhere.”

Ready to discover how Edge’s Digital Learning, Strategy, and Instructional Excellence experts can help your organization?
Visit njedge.net/solutions-overview/digital-learning/.

View Article in View From The Edge Magazine

The post The Growing Demand for Instructional Designers appeared first on NJEdge Inc.


Hyperledger Foundation

Meet Aries Agent Controller, a New Hyperledger Lab

A code base developed and contributed by Superlogic that facilitates deploying Hyperledger Aries agents in cloud environments is the latest Hyperledger lab. The new lab, Aries Agent Controller, is now officially a part of the Hyperledger ecosystem, and we are excited to work with the broader community to grow it.


Identity At The Center - Podcast

We are thrilled to announce a new Sponsor Spotlight on the Identity at the Center podcast!

We are thrilled to announce a new Sponsor Spotlight on the Identity at the Center podcast! We had the pleasure of hosting Marco Venuti, Director of IAM Business Acceleration for Thales, and Jason Keenaghan, Director of IAM Product Management for Thales.

In this episode, we explore the Thales Cloud Security OneWelcome Identity Platform and its comprehensive solution for managing digital identities. We dive deep into the world of B2B IAM and discuss its differences from B2C and B2E IAM.

You can listen to the episode on IDACPodcast.com or in your favorite podcast app. Don't miss out on the insights and expert perspectives straight from the source!

A big thank you to Marco and Jason for joining us and sharing their valuable knowledge.

#iam #podcast #idac

Wednesday, 28. February 2024

Next Level Supply Chain Podcast with GS1

Behind the Barcode: Mastering 2D Barcodes with GS1 US's Gena Morgan

Keeping track of product information and inventory with multiple barcode types can be tricky for businesses. 

Gena Morgan, who leads the standards team at GS1 US, shares valuable insights into the world of barcodes, specifically focusing on the transition from traditional 1D barcodes to 2D barcodes and the importance of GS1 standards in driving industry adoption. Gena explains the technical differences between traditional linear barcodes and 2D barcodes, such as QR codes and GS1 DataMatrix, highlighting the increased data capacity and smaller footprint of 2D barcodes. 

She elaborates on the potential consumer and business benefits, emphasizing the ability of 2D barcodes to provide more accurate and direct information to consumers, streamline supply chain processes for brands and retailers, and enable functionalities such as product recalls and promotions. The discussion delves into the challenges and opportunities presented by the transition to 2D barcodes, as well as the support and resources available for brands looking to embark on this journey. Gena's expertise on the subject makes for an enlightening and informative conversation, encouraging businesses to consider the advantages of 2D barcodes and GS1 standards in their operations.
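
To make the 1D-to-2D difference concrete, here is a minimal sketch (ours, not from the episode) of how a GS1 Digital Link URI packs a GTIN plus batch and serial data into a single web address that a QR code or GS1 DataMatrix can carry. The helper function and its defaults are illustrative assumptions; only the GS1 Application Identifiers (01, 10, 21) and the public resolver domain are taken from the GS1 standards.

// Composing a GS1 Digital Link URI from GS1 element data (TypeScript).
interface ProductIdentifiers {
  gtin: string;        // AI 01 - Global Trade Item Number
  batchLot?: string;   // AI 10 - batch or lot number
  serial?: string;     // AI 21 - serial number
}

function toDigitalLink(p: ProductIdentifiers, domain = "https://id.gs1.org"): string {
  let uri = `${domain}/01/${encodeURIComponent(p.gtin)}`;
  if (p.batchLot) uri += `/10/${encodeURIComponent(p.batchLot)}`;
  if (p.serial) uri += `/21/${encodeURIComponent(p.serial)}`;
  return uri;
}

// A traditional 1D barcode typically carries only the GTIN; the 2D carrier holds the rest.
console.log(toDigitalLink({ gtin: "09506000134352", batchLot: "ABC123" }));
// -> https://id.gs1.org/01/09506000134352/10/ABC123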


Key takeaways: 

The transition from traditional barcodes to 2D barcodes allows brands to provide information to consumers and tailor experiences.

The adoption of 2D barcodes in the industry allows products to carry more data in a smaller footprint.

GS1 US supports brands transitioning to 2D barcodes and GS1 digital link standards with pilot programs and toolkits. 


Connect with GS1 US:

Our website - www.gs1us.org

GS1 US on LinkedIn


Connect with guest:

Gena Morgan on LinkedIn


Resources:

Learn More About 2D Barcodes

Resources for the Transition from 1D to 2D Barcodes

Fresenius Kabi Infuses Safety from Production to Patient with Unit-of-Use 2D Barcodes



Elastos Foundation

Beatfarm Digital and Elastos Collaborate on Music-focused Web3 Platform

Beatfarm Digital (“Beatfarm”) and Elastos today unveiled a collaboration to deliver ‘positive disruption’ to the music business based on new inscription technology and Blockchain-based music consumption models.

The music creation and performance industry is notoriously inefficient when it comes to matching artists with potential collaborators and – even more so – when remunerating and protecting the rights of creators themselves. Research from industry research firm MIDIA suggests that 1% of artists earn a staggering 77% of revenue related to recorded music sales; a trend that is actually becoming more regressive over time. According to research published in 2019 in The Journal of Business Research, in 1982, 5% of the top-earning artists accounted for 62% of concert revenues globally; by 2003 that proportion had risen to 84%.

This shift has only been exacerbated by the emergence of new formats and technology; by the turn of the century, while the top 1% of musicians accounted for 75% of revenue from ‘traditional’ formats such as CDs, they earned an even higher proportion – 79% – of subscription and streaming revenue. Elastos’ partnership with Beatfarm represents a welcome alternative to this trend, with technology helping to put musicians and artists in control of their work, who they work with, and how the resulting work is monetized.

Through the collaboration, artists will have direct and secure access to all aspects of the music ecosystem – from composition and production, to merchandising and promotion, as well as genuine ‘superfans’ – complete with a direct transaction mechanism based on Elastos’ recently launched BeL2 technology, enabling them to establish Smart Contracts on their own terms and be remunerated directly in Bitcoin. The resulting contracts – eScriptions – are secured and assured through Bitcoin and can themselves be traded through a decentralized marketplace.

“The Elastos chain is an ideal platform for providing artists the tools and resources to control the monetization of their content and develop groundbreaking ways to connect with their fans in ways which the industry hasn’t seen before”, said Beatfarm’s Co-Founder, Alex Panos.

“Our collaboration with Beatfarm reflects everything that Elastos is about and what BeL2 can deliver.  Now artists and creators will not only have direct access to unlimited collaborators and resources, they’ll be able to partner with them on their terms, retaining full control and ownership of their work.  This is the very promise of the SmartWeb in action,” says Jonathan Hargreaves, Global Head of Business Development & ESG.

About Beatfarm

Beatfarm is a Web3 platform focused on the music industry whose mission is to provide artists the tools and resources to control the monetization of their content and develop new sources of revenue through direct collaboration with fans.

Developed by music industry veterans, Beatfarm aims to become the priority destination for direct artist monetization and enhanced artist-to-fan engagement. Follow @beatfarm_io on X

Join Us on the Journey

As we continue to build the SmartWeb, we invite you to learn more about Elastos and join us in shaping a future where digital sovereignty is a reality. Discover how we’re making this vision come to life at Elastos.info and connect with us on X and LinkedIn.


Tuesday, 27. February 2024

Hyperledger Foundation

Building Better Together: Insights from the Governing Board to Mark Hyperledger Foundation’s 8th Anniversary

As a follow-up to Hyperledger 8: A Celebration of Building Better Together, Daniela Barbosa asked our Governing Board Representatives for their take on the success and value of Hyperledger Foundation as well as the technical priorities they see for the community.


Oasis Open Projects

OASIS Members to Advance Global Standard for Computing Ecosystem Supply Chain Data Exchange

Cisco, Hewlett Packard Enterprise, Intel, Micron, Microsoft, U.S. NIST, SAP, and Others to Develop Use Cases, Standards, and APIs that Enable End-to-End Visibility for Supply Chains

Boston, MA – 27 February 2024 – Members of OASIS Open, the international open source and standards consortium, have formed the Computing Ecosystem Supply Chain Technical Committee (CES-TC). Leaders in the computing and semiconductor industries established the TC with aims to revolutionize global supply chain dynamics through standardized data exchange. With digital transformation rapidly reshaping industries and systems worldwide, the imperative for seamless data exchange has never been more pronounced.

This collaborative endeavor highlights the consensus in the computing ecosystem that digital transformation requires standardized data exchange among member companies over a network. The TC will focus on developing use cases, data schemas and ontologies, and APIs that enable end-to-end visibility for supply chains. The TC’s work will facilitate building resilient capacity, trusted hardware and software, secure systems, and sustainable practices to benefit all customers and end-users.

“Standardization plays a pivotal role in establishing secure and sustainable systems, which are crucial for the evolving digital landscape,” noted Joaquin Sufuentes, CES-TC co-chair, of Intel. “As the CES-TC sets its course, it signifies the collective dedication of OASIS members to lead the charge in technological advancement that directly enriches industries and end-users. The TC’s work will extend to smart contracts that drive logic functions, process automation, and role-based entitlements within the blockchain context.”

“TC contributions will focus on the data schemas and ontologies that define the attributes and entities and a REST API model for putting the data into and getting the data from blockchain or other distributed infrastructure,” said Tom Dodson, CES-TC co-chair, of Intel. “Through standardized approaches, we are empowering industries with the tools necessary to navigate the complexities of the digital age.”

Participation in the OASIS CES-TC is open to all through membership in OASIS. The profile for the types of contributors to the CES-TC include business stakeholders responsible for product delivery, technical experts managing integrations, supply chain professionals, data specialists focusing on ontologies, government representatives concerned with traceability, and industry professionals driving digital transformations.

Support for the CES-TC
Cisco
“The OASIS CES-TC represents a great advancement in standardizing and securing the supply chain of the digital age. By focusing on the development of universally accepted data schemas, APIs, and smart contract specifications, this effort is laying the groundwork for transparency, efficiency, and security in supply chain management. I fully support CES-TC’s efforts to create a more resilient and trustworthy digital ecosystem.”
– Omar Santos, Distinguished Engineer, Cisco | OASIS Board of Directors

Intel
“Working as an ecosystem for the benefit of customers and end users of our computing products requires that we operationalize how we collaborate with data in real time to build more efficient operations and new revenue services. We want to standardize and scale the ability to share the right data and signals.”
– Paul Dumke, Senior Director, Ecosystem Strategy & Operations, Intel Corporation

Micron
“The storage and memory business is complex and competition is fierce. Micron’s success depends on our ability to innovate, and with more than 50,000 lifetime patents, we take innovation very seriously. The value chain ecosystem is no exception. Ecosystem innovation is the next frontier and Micron is thrilled to be on this journey with our fellow CES-TC members.”
– Matt Draper, Senior Director of Micron Supply Chain Optimization

Additional Information
CES Project Charter

The post OASIS Members to Advance Global Standard for Computing Ecosystem Supply Chain Data Exchange appeared first on OASIS Open.


Origin Trail

The ON TRAC(k) podcast returns! Episode 2 on Delegated Staking, AI Agents, & More

We’re excited to announce that the ON TRAC(k) podcast will return on February 29th at 16:00 CET with a brand new episode on delegated staking, AI agents, and more.

Hosted by Jonathan DeYoung (who you may know already as co-host of Cointelegraph’s The Agenda) and recorded live, the second episode of the ON TRAC(k) podcast features a special guest — Martin Köppelmann, co-founder and CEO of Gnosis! Martin will join the three co-founders of OriginTrail, Žiga Drev, Branimir Rakić, and Tomaž Levak, to discuss a Verifiable Internet for AI and more.

Take this opportunity to tune in to a live conversation between industry pioneers and thought leaders here.

In case you missed it

Last time around, Jonathan DeYoung spoke with the OriginTrail co-founders about the significance of OriginTrail’s V8 Foundation, explored its robust partnerships, and shed light on the ecosystem’s key initiatives, including knowledge mining and staking.

If you missed out on watching this episode live, you can watch it back on the OriginTrail YouTube channel or listen wherever you consume your favourite shows.

And, if you’re curious about OriginTrail’s V8 Foundation, you can read more here.

Future Episodes

The On TRAC(k) podcast will continue to bring you the latest and most innovative ideas and advancements both in the OriginTrail ecosystem and beyond. We’re giving our listeners exclusive insights into the world of blockchain and Web3 as we develop the technology that empowers brands and builders alike with verifiable, decentralized knowledge through AI and DKG technology.

Here at the On TRAC(k) podcast, we’re lucky to have such a vibrant, curious community of listeners, and we want to give you a listening experience that matches the cutting-edge ideas and excitement in our community. That’s why we’re making you a part of the podcast. Ahead of each episode, you’ll have the chance to submit questions that delve deeper into the things you want to learn more about.

We’re excited to reveal our upcoming guests and topics to you further down the line. To keep up to date with all announcements and upcoming episodes, don’t forget to follow OriginTrail on X and, of course, subscribe to On TRAC(k) wherever you get your podcasts.

Climb aboard and welcome to the OriginTrail community. Together, let’s explore, learn, and shape the future.

About OriginTrail

OriginTrail is an ecosystem-building decentralized knowledge infrastructure for artificial intelligence (AI). With the mission of tackling misinformation, which is exacerbated with AI adoption, OriginTrail enables verifiably tracking origins of information, discoverability, and integrity of knowledge to enable trusted AI. It has various applications in the domains of real-world assets (RWAs), search and recommendation engines, question-answering systems, and generally knowledge-dependent applications (such as AI systems).

OriginTrail’s initial adoption was in global supply chains, serving as a trusted hub for supply chain data sharing, allowing customers to authenticate and track products and keep these operations secure. In recent years, the rise of AI has not only created unprecedented opportunities for progress but also amplified the challenge of misinformation. OriginTrail also addresses this by functioning as an ecosystem focused on building a trusted knowledge infrastructure for AI in two ways — driving discoverability of the world’s most important knowledge and enabling the verifiable origin of the information. The adoption of OriginTrail in various enterprise solutions underscores the technology’s growing relevance and impact across diverse industries including real-world asset tokenization (RWAs), the construction industry, supply chains, healthcare, metaverse, and others.

OriginTrail is creating a Verifiable Web for decentralized AI by empowering world-class brands and builders. It utilizes its unique Decentralized Knowledge Graph and OriginTrail Parachain to deliver AI-powered search and solutions for enterprises and individuals worldwide.

OriginTrail has gained support and partnerships with world-class organizations such as British Standards Institution, SCAN, Polkadot, Parity, Walmart, the World Federation of Hemophilia, Oracle, and the EU Commission’s Next Generation Internet. These partnerships contribute to advancing OriginTrail’s trusted knowledge foundation and its applicability in trillion-dollar industries while providing a verifiable web of knowledge important in particular to drive the economies of RWAs.

Web | On TRAC(k) Podcasts | X | Facebook | Telegram | LinkedIn | GitHub | Discord

The ON TRAC(k) podcast returns! Episode 2 on Delegated Staking, AI Agents, & More was originally published in OriginTrail on Medium, where people are continuing the conversation by highlighting and responding to this story.


Digital ID for Canadians

First DIACC PCTF-Certified Service Provider Trustmark Granted

Confirming the conformance of ATB Ventures’ Oliu service with the PCTF Privacy Component

Feb. 27, 2024 – Vancouver – We are thrilled to announce that ATB Ventures’ Oliu has been certified against the Pan-Canadian Trust Framework (PCTF) Privacy Component. Established in 2012, DIACC is Canada’s largest and most diverse multistakeholder organization, fostering confidence and consistency in the digital trust and identity services market through its internationally recognized PCTF and standardized third-party conformity assessment program.

Being the first DIACC PCTF-certified service provider is a significant milestone and a unique leadership opportunity.  DIACC PCTF certification provides an assurance signal to the market, indicating that a service fulfills specified requirements. 

The PCTF comprises a set of rules that offers a versatile code of practice and risk assessment approach that organizations agree to follow, which includes best practices, policies, technical specifications, guidance, regulations, and standards, prioritizing interoperability, privacy, security, and trustworthy use of digital identity and personal data. 

ATB’s Oliu, an identity verification and authentication platform, has undergone certification against the PCTF, including a point-in-time audit conducted by DIACC Accredited Auditor KUMA and an independent committee review for quality assurance. Oliu demonstrated conformity to the PCTF Privacy conformance criteria, meeting the applicable requirements. Based on the conformity assessment process results, DIACC has issued a three-year cycle Trustmark subject to annual surveillance audits and added ATB Oliu to the DIACC Trusted List – an authoritative trust registry of DIACC PCTF-certified service providers.

“This certification begins an exciting journey in providing certainty to the market through trusted services subject to DIACC’s certification program, designed around ISO/IEC 17065,” said DIACC President Joni Brennan.  “For Oliu, achieving the certification demonstrates its commitment to providing trustworthy and reliable digital identity verification services and advancing secure and interoperable digital trust and identity services in Canada.“

About DIACC

Established in 2012, DIACC is a non-profit organization of public and private sector members committed to advancing full and beneficial participation in the global digital economy by promoting PCTF adoption and conformity assessment. DIACC prioritizes personal data control, privacy, security, accountability, and inclusive people-centered design.

To learn more about DIACC, please visit https://diacc.ca/ 

About Oliu™

Oliu is a blockchain-identity management solution that makes it easy for businesses to issue, manage, and verify digital credentials. Built on open (W3C) standards, Oliu leverages identity frameworks such as the Pan-Canadian Trust Framework (PCTF) and National Trust and Identity Fundamentals to make mobility and interoperability between identity systems possible.

To learn more about Oliu, please visit https://oliu.id/ 

About ATB Ventures™

ATB Ventures is the research and innovation arm of ATB Financial, a leading Alberta-based financial institution. Driving growth at the edges and exploring opportunities beyond financial services, ATB Ventures focuses on helping companies bridge the gap between consumers’ increasing concerns about privacy and security, and their desire for more advanced personalized experiences. 

To learn more about ATB Ventures, please visit https://atbventures.com/ 

Monday, 26. February 2024

FIDO Alliance

EMVCo and FIDO Alliance Provide Essential Guidance on Use of FIDO with EMV 3DS

As leaders in the authentication and payments spaces, respectively, the FIDO Alliance and EMVCo collaborate to provide guidance on how FIDO authentication can be incorporated in payment use cases, allowing merchants, acquirers/PSPs, and issuers to have a consistent way to submit and process FIDO authentication data.

EMVCo released a white paper with FIDO Alliance’s inputs, “EMV® 3-D Secure White Paper – Use of FIDO® Data in 3-D Secure Messages,” which explains how the use of FIDO authentication data in EMV 3DS messages can streamline e-commerce checkout while reducing friction for consumers. 

Authentication flows are evolving, and merchants are increasingly building seamless experiences based on FIDO standards for device-based authentication, where a trusted device is bound to a payment credential to ensure the credential is being used by the verified cardholder. Consequently, it has become apparent that in some scenarios the issuer may require more data to assess risk and validate the authentication cryptographically. 

This paper addresses these scenarios by providing a data structure that allows a chain of trust to be established between cardholder authentication, FIDO enrolments, and FIDO authentication, giving issuers increased control and insight into the authentication process as well as the ability to validate the authentication cryptographically.
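
The white paper itself defines the exact message fields; purely to illustrate the chain-of-trust idea, here is a hypothetical sketch of the kind of structure such data could take. Every field name below is an assumption for illustration, not the EMVCo specification.

// Hypothetical shape for FIDO authentication data carried in an EMV 3DS message (TypeScript).
// All names here are invented; consult the EMVCo white paper for the real data elements.
interface FidoAuthenticationData {
  credentialId: string;          // identifies the FIDO credential bound to the payment credential
  enrollmentAttestation: string; // links this credential back to the original FIDO enrolment
  authenticatorData: string;     // base64url authenticator output for this transaction
  signature: string;             // assertion signature the issuer can verify cryptographically
}

// The chain of trust runs: cardholder verification -> FIDO enrolment -> this authentication.
// An issuer that stored the enrolment record can validate the signature, gaining the
// insight into (and control over) the authentication that the paper describes.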

In the EU, where payment authentication is required under PSD2 SCA, this industry-wide guidance can help enable more device-based authentication in a standardized way, using globally known authentication standards such as FIDO over widely accepted authentication rails such as EMV 3DS.

Read the full white paper on the EMVCo website to learn more.


Oasis Open Projects

The Importance of Open Standards for Data Interoperability

By Francis Beland, Executive Director, OASIS Open

The use of open standards in data interoperability is crucial for enhancing governance not only in the European Union but globally. Open standards determine the format, storage, and exchange of data and enable different organizations and systems to communicate seamlessly. This is especially vital for the EU, with its diverse member states and institutions, where open standards ensure free and secure data flow across borders, enabling better coordination and cooperation in implementing healthcare, trade, environmental protection, and security policies.

Furthermore, open standards uphold the principles of transparency and democracy, enabling citizens’ access to governmental data and enhancing public accountability, thereby promoting civic engagement. From an economic standpoint, open standards foster innovation, facilitate cross-border business operations and drive economic growth. Moreover, they help address global challenges such as climate change and pandemics, allowing effective data sharing and collaboration among nations.

OASIS Open interoperability standards are pivotal in ensuring data protection, privacy, and security while harmonizing technological infrastructures. Our standards are vital for the EU and other governments to fully leverage data interoperability’s benefits in an increasingly interconnected world.

The post The Importance of Open Standards for Data Interoperability appeared first on OASIS Open.


Identity At The Center - Podcast

We’ve got another great episode of the Identity at the Center podcast for you!

We’ve got another great episode of the Identity at the Center podcast for you! We caught up with Eve Maler of Venn Factory to answer a few listener voicemail questions and to see if her thoughts on the difference between digital identity and identity and access management have changed since we last asked her almost two years ago.

Episode #262 is available now at idacpodcast.com and in your favorite podcast app.

#iam #podcast #idac


The Engine Room

Welcoming Dalia Othman as Co-Executive Director

Dalia Othman has been selected by our Board as The Engine Room’s other Co-Executive Director, to lead the organisation alongside Paola Mosso from mid-March.

The post Welcoming Dalia Othman as Co-Executive Director appeared first on The Engine Room.

Wednesday, 04. October 2023

decentralized-id.com

Ecosystem Overview

This page includes a breakdown of the Web Standards, Protocols, Open Source Projects, Organizations, Companies, Regions, Government and Policy surrounding Verifiable Credentials and Self-Sovereign Identity.

Note to reader: This is a work in progress and should not be taken as authoritative or comprehensive. Internal links in italic.

Open Standards
Decentralized Identifiers: Explainer, Literature, DID Methods, Supporting Tech, DIDAuth, Critique
Verifiable Credentials: Explainer, Comparisons, Varieties
Data Integrity: JSON-LD LD-Proof (W3C), JSON-LD ZKP BBS+ (W3C)
JOSE / COSE: JSON SD-JWT (IETF), JWP (IETF)
ZKP-CL (Hyperledger)
Related: JSON-LD (W3C), JSON (IETF), BBS (SIAM 1986)
Exchange Protocols: DIDComm (DIF), CHAPI (DIF), OIDC4VC (OpenID), mDL (ISO/IEC), WACI-Pex (DIF), VC-HTTP-API (CCG)
Authorization Protocols: zCap (W3C), UCAN (Fission, Bluesky, Protocol Labs), GNAP (IETF), OAuth (IETF)
ISO Standards: mDL (ISO/IEC 18013-5), JTC 1/SC 17/WG 3 – Travel Documents (ISO/IEC), ISO 27001
Data Stores: Encrypted Data Vaults – EDV (DIF), Decentralized Web Node – DWN (DIF)
Trust Frameworks: 800-63-3 (NIST), PCTF (DIACC)
Non-SSI Identity Standards: OpenID (OpenID), FIDO (FIDO), OAuth (IETF), SCIM (IETF), SAML (OASIS), KMIP (OASIS), WebAuthn (W3C), Secure QR Code (OASIS)
Blockchain Standards: ISO TC 307 (ISO), CEN/CLC/JTC 19 (CEN/CENELEC), ERC-EIP (Ethereum)

Code-Bases
Open Source Projects: Universal Resolver (DIF), KERI (DIF), Other Tools & Libraries (DIF), ESSIF-Lab (ESSIF-Lab), Aries (Hyperledger), Indy (Hyperledger), Ursa (Hyperledger), Other Tools & Libraries (Hyperledger), Blockcerts (Hyland)
Company Code: Walt.id, Verite, SpruceID

Organizations
International Standard Development Organizations (SDO): W3C, IETF, OASIS, ITU-T, ISO/IEC
National Government/Standard-Setting Bodies: NIST, The Standards Council of Canada, BSI – The Federal Office for Information Security, Germany
Community Organizations: W3C CCG, DIF, ToIP, ADIA, Kantara, MyData, DIACC, ID2020, OpenID Foundation, Internet Safety Labs, GLEIF, Hyperledger Foundation, FIDO Alliance, OASIS
SSI Networks: DizmeID, Sovrin, BedRock, ONT, Velocity, GlobalID, Dock, ITN, MOBI

Companies
Microsoft – Azure / Entra
EU SSI Startups: MyDex, MeeCo, ValidatedID, Bloqzone, Procivis, Gataca
US SSI Startups: Dock, Anonyome, GlobalID, Hyland, Magic, IDRamp, Indicio, Verified Inc (formerly UNUMID), Animo, Mattr, Liquid Avatar, Hedera, IOTA, Trinsic, Transmute, Spruce, Disco.xyz
Asia SSI Startups: Affinidi, ZADA, Dhiway, Ayanworks, NewLogic
Africa SSI Startups: FlexID, Diwala
Acquisitions: Avast-Evernym-SecureKey
Analyst Firms: KuppingerCole, Forrester, Gartner
Consulting Firms: Deloitte, Accenture, McKinsey, BCG
IAM Industry: Ping (Thoma Bravo rollup), Okta, Auth0, ForgeRock (Thoma Bravo rollup), IDENTOS, SailPoint (Thoma Bravo rollup)

Policies/Regulations (by region)
FATF
Europe: Data Governance Act, GDPR, eIDAS1, eIDAS2, UK Data Protection
Asia
USA: COPPA, Privacy Act, California SB786
India
Canada: Pan-Canadian Trust Framework (PCTF)

Government Initiatives
US: SVIP, National Cybersecurity Strategy
Germany: IDUnion
UK: Scotland, UK Digital Strategy
EU: eIDAS2, Large Scale Pilots, Architecture and Reference Framework, EBSI, ESSIF-Lab, Catalonia, Switzerland
APAC: New Zealand, Australia, Singapore, South Korea
Canada: BCGov, Alberta, Ontario
LatAm: LACCHAIN

Real-World Implementation
Government-Issued ID: Passport eMRTD/DTC (ICAO), Immigration (USCIS), mDL (US AAMVA) [not SSI standards conformant], IDCard (IATA / Switzerland)
Trust Registries & Directories: TRAIN (ToIP), Regi-Trust (UNDP), OrgBook BC (BCGov)
Supply Chain/Trade: GS1, GLEIF
Banking: Bonifi
COVID: NY State, VCI, CCI, DTCC, DIVOC
Sectors: Enterprise, Healthcare, KYC, Real Estate, Rental, Travel, Humanitarian, Energy, IoT, Guardianship
Learning/Career/Education: Jobs for the Future, Velocity Network, Learning Economy Foundation, TLN – Trusted Learner Network
Wallets: Types (by type and topic)

Research Papers/Academic Literature: Turing Institute Research: Privacy & Trust
Events: IIW, RWoT
Topics: Biometrics, Privacy, Human Rights, User Experience, Business, Critiques, Future

Web3, DWeb, & Other Tech (by focus)
Web3: Web3 and SSI, DAO, Decentralization, Metaverse, NFT, SBT, DeFi
Organizations: Ethereum Enterprise Alliance, Fission, Protocol Labs
DWeb: Secure Scuttlebutt, Bluesky, Web5, Handshake
Blockchain Ecosystems: Bitcoin, Ethereum

Friday, 23. February 2024

FIDO Alliance

Cybersecurity Policy Forum: Identity, Authentication and the Road Ahead

2023 demonstrated that we still have a lot of work to do when it comes to protecting Americans from identity theft and identity-related cybercrime. The GAO and FinCEN together documented more than $300 billion in identity-related cybercrime, DHS’ Cyber Safety Review Board (CSRB) outlined how weaknesses in legacy authentication tools enabled adversaries to launch a wave of high-profile attacks, and millions of Americans struggled to recover from identity theft. Meanwhile, the introduction of new tools powered by biometrics and AI to help block attacks also raised concerns about equity and bias, and in the physical world, many Americans still struggle to get foundational credentials that they need to prove who they are. As 2024 kicks off, these issues will all continue to be front and center.  

On Thursday, January 25th in Washington DC, the Better Identity Coalition, FIDO Alliance, and the Identity Theft Resource Center (ITRC) joined forces to present a full-day policy forum looking at “Identity, Authentication and the Road Ahead.”


Security Journal: Fingerprints agrees distribution partnership with Ansal Component

Fingerprints’ biometric access solution is designed for physical and logical access devices and applications such as smart locks, FIDO tokens, crypto wallets and more.


FinExtra: Mitigating fraud risk: effective strategies for small financial institutions

Passwords are one of the most common targets for fraudsters. Strengthening password security demands robust authentication methods, risk-based measures and behavioural analysis to detect anomalies. Active exploration of innovations like Passwordless Login, based on the robust Fast Identity Online 2 (FIDO2) standards developed by the FIDO Alliance, is essential to bolster online security and authentication. 


Engadget: PlayStation now supports passkey sign-ins

Sony Interactive Entertainment (SIE) introduces passkey support for PlayStation accounts, allowing users to log in via their mobile device or computer’s screen unlocking method like PIN, fingerprint, or facial recognition. Passkeys enhance security by preventing reuse or sharing, reducing vulnerability to phishing and data breaches.


The Verge: Now you can sign into your PlayStation account without a password

Sony PlayStation has introduced passkey support for account logins, enabling users to authenticate without passwords. Similar to Nintendo’s implementation, users can now use authentication methods like iOS Face ID or Android fingerprint sensors for account access.


Ceramic Network

Points: How Reputation & Tokens Collide

Points are here and they signal how networks and apps will evolve next with verifiable data.

Points have taken Web3 by storm in the last six months, catalyzed by projects like Blur and EigenLayer rewarding users with points on the way to seizing the NFT market and amassing $7 billion TVL, respectively. More than 115 billion points have been given out by Web3 projects so far, according to Tim Copeland at The Block.

There are two ways to look at points:

1. As a precursor to an airdrop. Projects use points ahead of a token to generate interest, signal what they care about and will reward, more effectively target engagement, and navigate legal risks associated with tokens.

2. As a measure of quantifiable reputation. Points ascribe a value to user activity, just like many reputation systems have before: traditional loyalty programs, Reddit karma, check-ins, credentials. They can signal legitimacy in pseudo-anonymous systems and — because they’re more quantitative than, for example, verifiable credentials — standing within the community.

Both of these are right. Points align the incentives of the platform and the user base, like all reputation systems. And they forecast who is creating value and is likely to be rewarded. By understanding how these two intersect, we can forecast where Web3 will go far beyond today’s points craze.

Points are quantifiable like money, enduring like reputation

Tokens were the first major innovation of Web3 and the primary incentive. They offer fully quantifiable value, are transactional, and require no additional context. They work as “one-time games.” Reputation is how social systems achieve repeat-game use cases, rewarding ‘good behavior’ of an actor (e.g. following contracts and policies, not cheating counterparties) with access and benefits over the long term. Reputations are non-fungible — for them to establish trust, it has to be hard to buy reputation.

Points are proof of activity that act as a building block for reputation (and in Web3, often carry the suggestion of future value). All reputations come with some benefit. Usually, they’re more subtle than financial rewards. Reputations might gain access to a service (credit score) or club (referral), earn discounts (loyalty programs) or introductions (dating), convince counterparties to transact (Uber rating, credit lending), and build trust with customers (influencers, corporate brands). Reputations are less measurable than financial assets, but often more valuable.

For Web3 to grow into social and other non-financial use cases, more robust reputations are needed. Points are not an isolated mechanism to forecast token earnings — they are one point on a broad spectrum of token (financial) and reputational rewards that Web3 will keep innovating on.

Points as part of the evolution of Web3 reputation

Reputation naturally starts in the most discrete, high-value places and evolves to more broad ones.

1. B2C “Badges”

The earliest forms of reputation helped networks solve discrete pressing problems: anti-sybil and KYC. This involved a business or service issuing badges to users for achieving important milestones. For example, Gitcoin Passport stamps prevent sybil attacks in Grants rounds for Ethereum and other ecosystems. The meaning of the badge is objective and clearly denominated.

2. Attestations

After discrete badges, platforms needed reputation for a wider variety of activities and credentials: history of contribution, trust as a delegate, skills in a DAO, proof of activity. Attestations are still clearly denominated, but rather than signifying a clear milestone like badges they’re more continuous.

This also started B2C (for example, participating in an event, completing a certain step in a user flow, etc.). EAS (Ethereum Attestation Service) has emerged as a standard for issuing these, used natively in the Optimism stack and widely in Ethereum. Increasingly, attestations are community-driven as well. Open platforms let users create and verify claims. For example, Metamask is working on a review and reputation system for Snaps.

3. Points: scored activity

On-chain transactions, B2C badges, and attestations are all cryptographically verifiable actions. Users do something that has provable provenance and time, whether that’s on-chain or off-chain signed data.

Point systems specify which of these activities have value in their system (and how much), tabulate them over time for each identity, and ascribe a numeric ‘point’ value to them. Points create a quantifiable reputation that continuously aggregates the previous forms.
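
As a rough illustration of that tabulation step, here is a minimal TypeScript sketch (our own, with invented event types and weights, not any particular project's point system) of scoring verified events per identity:

// Minimal point tabulation over verifiable events, assuming each event has
// already been verified (provenance and timestamp checked upstream).
interface VerifiableEvent {
  did: string;        // identity the event belongs to
  type: string;       // e.g. "badge:gitcoin-passport", "attestation:event-checkin"
  timestamp: number;
}

// Which activities have value in this system, and how much (illustrative weights).
const WEIGHTS: Record<string, number> = {
  "badge:gitcoin-passport": 50,
  "attestation:event-checkin": 10,
  "onchain:transaction": 1,
};

// Tabulate events over time into a numeric point total per identity.
function tabulatePoints(events: VerifiableEvent[]): Map<string, number> {
  const totals = new Map<string, number>();
  for (const e of events) {
    const value = WEIGHTS[e.type] ?? 0; // activities outside the system score nothing
    totals.set(e.did, (totals.get(e.did) ?? 0) + value);
  }
  return totals;
}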

4. Events: all activity

There’s no reason to believe the evolution will stop with points. At scale, all of a user’s activities in Web3 apps and platforms will be recorded as cryptographically verifiable events. This might be likes on a cast, messages on a forum, contributions to a codebase, visits to a Blackbird restaurant, etc.

These are all events, and they all have value — but it’s not always known up-front what that value is. Some might have value in driving community engagement, others in improved analytics for products or targeting for ads, some as inputs into reputation systems. Because all will be cryptographically recorded, they can be referenced any time in the future.

Points will dominate for now, but before long we’ll see a huge increase in retroactive airdrops, activations, rewards, access, and other forms of value awarded to users based on a much broader history of their events — not just those that are made explicit up-front via point systems.

Infrastructure for points, events, and trust

All of these forms of reputation serve to reward users. Web2 used points exhaustively, but Web3 can uniquely do it openly and with composability. By making every event driving points both transparent and verifiable, events and points can be leveraged across platforms and have trust built in. This trust can reinforce future rewards, encourage more activity, and enable cross-platform point innovation.

Unfortunately, while points are proliferating, to date most haven’t tapped into this unique Web3 possibility — most have been tabulated on centralized databases, putting rewards at risk.

Data ledgers vs. asset ledgers

Financial blockchains were built to be asset ledgers, not point or event ledgers. They’re designed for scarcity; e.g., they must protect against double-spend as a core principle. Points are not bought, sold, or traded like assets — they’re earned. They’re best served — quickly, cheaply, scalably — on a data ledger.

Data ledgers, for data that is not scarce, operate with different principles. They must still offer strong verifiability and composability; but they don’t have to protect against double spend, and they must scale many orders of magnitude beyond asset ledgers. There are exponentially more data transactions than asset transactions in any web service.

Ceramic is the first decentralized data ledger with the scale, composability, and trust guarantees required to be a system of record for points and events. It’s built to enable the scaling of Web3 beyond financial transactions to richer experiences, including those powered by attestations, points, and the billions of events that are coming to enable a data-rich Web3.

Building with Points

If you’re thinking about a point system for your product, or how to advance point-enabled experiences, please reach out to partners@3box.io.

If you are interested in emerging standards for points, reach out to us on the Ceramic discord to learn more about our working group.

If you’ll be at EthDenver next week, come talk points, reputation and verifiable data with us at Proof of Data.

Thursday, 22. February 2024

The Engine Room

Community call: Dreams of a collective infrastructure for information ecosystems in Latin America 

Join our next community call to talk about the kinds of infrastructures we need to collectively create a better flow of creation, distribution and reception of information in Latin America. 

The post Community call: Dreams of a collective infrastructure for information ecosystems in Latin America  appeared first on The Engine Room.


Ceramic Network

ETHDenver 2024: Where to find Ceramic in Denver

The core Ceramic team is coming to ETHDenver 2024! Check out all of the events where you can meet the team.

ETHDenver 2024 is kicking off, and we are very excited to meet you all there. This year, you will find the Ceramic team at a list of side events, talks, workshops, and, most importantly - a Proof of Data event co-organized by Ceramic and Tableland that you don’t want to miss.

Collect attendance points at ETHDenver 2024!

Are you ready for an exciting scavenger hunt at ETHDenver 2024?

Ceramic is partnering with Fluence, a decentralized computing marketplace, to create a fun and interactive game for collecting attendance badges at the majority of the events listed below. Those badges will allow you to collect points throughout ETHDenver 2024.

You can find us at each event and tap a disc to participate! With each attendance, you will claim points represented as documents on Ceramic. Fluence will be consuming the new Ceramic Data Feed API to enable compute over incoming points.

Rumor has it that the first participants to collect all the necessary points will be rewarded with some really cool prizes! So make sure to participate and we can’t wait to see you at all of the ETHDenver events listed below!

Sunday, February 25th – Saturday, March 2nd
Silk ETHDenver hackerhouse

Ceramic is partnering with Silk and other ecosystem partners to invite hackers to work together on building better scientific tooling, web account UX, governance forums, and much more.

🚀 Calling all hackers! Unveiling the Silk ETH Denver Hacker House – where innovation meets decentralized tech! 🏡 Join our quest to revolutionize scientific tooling, web account UX, governance forums, and much more!

Are you ready? Save the dates: Feb 25th - March 2nd 🤍🧵

— Silk (@silkysignon) January 26, 2024
Tuesday, February 27th
DePIN Day

Join us and our friends at Fluence for a day filled with talks, workshops, and discussions on all things #DePIN.

Location:
Green Spaces
2950 Walnut St.
Denver, CO

Time:
13:00 - 17:00 MST

Wednesday, February 28th
Open Data Day

Our co-founder, Danny Zuckerman, will deliver a keynote at Open Data Day hosted by Chainbase. Come hear more about Ceramic and what we have coming up on the roadmap.

Location:
1261 Delaware St,
Denver, CO

Time:
13:00 - 17:00 MST

SciOS

Don’t miss out on a workshop led by Radek, our Developer Advocate, at SciOS. The workshop will focus on the barriers and workarounds for enabling developers to build interoperable DeSci solutions.

Location:
2601 Walnut St 80205,
Denver, CO

Time:
13:00 - 16:00 MST

Thursday, February 29th
libp2p day

Come listen to our core engineers discussing the implementation of Recon, a new Ceramic networking protocol that improves network scalability and data syncing efficiency.

Location:
The Slate Denver,
Denver, CO

Time:
13:30 - 18:00 MST

Friday, March 1st
Proof of Data

Join Ceramic x Tableland on March 1 in Denver, in person or via livestream, for the Proof of Data Summit, a full-day community gathering on reputation, identity, DePIN, decentralized AI, and decentralized data computing. Featuring lightning talks, technical discussions, and panels with industry visionaries, this will be a can’t-miss event. RSVP now to secure your spot.

Location:
Denver Art Museum
Denver, CO

Time:
9:00 - 16:00 MST

ETHDenver - Triton Stage

Ceramic engineer Golda Velez will give a talk about decentralized trust and AI on the Triton Stage at the Spork Castle!

Location:
Spork Castle
Denver, CO

Time:
14:45 MST

Keep an eye on the updates and our Twitter as we get closer to the events. We can’t wait to see you there!


DIF Blog

Guest blog: Tim Boeckmann, Mailchain

Mailchain, founded in 2021, aims to revolutionize decentralized identity and communication with its services, including Vidos and the Mailchain Communication Protocol. These offerings streamline management of, and interaction with, decentralized identifiers (DIDs), ensuring secure, efficient, and compliant operations across various industries, simplifying the integration and adoption of decentralized identity technologies.

What is Mailchain, and how did it come into existence? 

I always had a passion for startups. In 2016 I joined the team at AWS (Amazon Web Services) that helps startups with technology and go-to-market strategy, for example by introducing them to other AWS customers. 

The blockchain landscape was evolving at the time and my soon-to-be co-founders (who I met at AWS) and I started tracking the space closely. We noticed that it wasn’t possible to communicate privately between blockchain addresses without providing an additional piece of information, like an email address. So we sent some encrypted messages with the transaction data as an experiment. 

This grew into a side project. It was open source and had quite a few contributors, but we realized we needed something more scalable that wasn't dependent on blockchain protocols, with the associated gas fees and speed constraints. 

So, in 2021 we set out to build Mailchain, a protocol that enables people to communicate privately using any blockchain address. 

With our SDK, developers can easily add web3 email to their own projects and applications, allowing them to engage with their users in a truly web3 way.

It’s an interesting strategy to focus on upgrading email with web3 capability. Why did you choose this route? 

There are over 3.9 billion active email users today. Each user’s inbox paints a rich picture of who they are. It stores their online actions, communication habits, spending behavior, even their thoughts, ideas, and feelings.  And everybody wants to keep that information private.

Web3, on the other hand, is underpinned by the principles of decentralization, privacy, and digital property rights, using wallets and blockchain addresses as identities. But there’s no native way to communicate privately using these addresses. The workaround is to use another communication channel, whether that’s email, instant messaging, or social media.

With Mailchain, users enjoy the privacy and other benefits of a digital identity wallet without needing to leave their email inbox. For instance, people can authenticate with a Web3 application by clicking a link in their inbox. Upon clicking the link, the system creates a self-signed Verifiable Credential (VC). The app knows who should be signing it, and is able to verify the user. 
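
As a rough sketch of that flow (our own, not Mailchain's actual implementation), the following shows the checks an app could run on a self-signed VC before accepting the login; the type names are invented and signature verification is assumed to be handled by a real VC library with DID resolution:

// Illustrative checks for a self-signed Verifiable Credential login (TypeScript).
interface SelfSignedVC {
  issuer: string;                         // the user's DID
  credentialSubject: { id: string };      // self-signed: subject DID === issuer DID
  proof: { verificationMethod: string; jws: string };
}

function acceptLogin(
  vc: SelfSignedVC,
  expectedDid: string,                          // the app knows who should be signing
  verifySignature: (vc: SelfSignedVC) => boolean // delegated to a VC/DID library
): boolean {
  const selfSigned = vc.issuer === vc.credentialSubject.id;
  const expected = vc.issuer === expectedDid;
  return selfSigned && expected && verifySignature(vc);
}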

This use case came from a customer who needed to prevent Zoom-bombing (unauthorized intrusion into a private video conference call). Another use-case is universities selling remote courses. They don’t want people who are not enrolled joining the sessions, or others joining on behalf of those who are enrolled — particularly when it comes to exams. 

How did decentralized identity become part of the Mailchain story? 

We wanted to enable the community, so we open-sourced as much of the codebase as we could. 

We started to see people using Mailchain for authentication, and realized identity was vital to what they were trying to achieve. These developers needed tools to manage user identities. It was early in the adoption cycle and there were a lot of gaps. 

We also started hearing people talking about DIDs (Decentralized Identifiers) and VCs (Verifiable Credentials).  We saw a pattern between VCs and our previous work with NFTs. So, we went deep into the W3C standards and looked at how they were being used in the real world. 

At the time, we didn’t know if we wanted to put people’s Mailchain IDs on-chain. We were looking for a standard way to construct the IDs and convey related attributes, such as the authorized senders for a blockchain address.  

Over time, we saw an opportunity to converge on a standardized approach. We also wanted to extend what we built to help other developers in the ecosystem, so we created Vidos, a suite of managed services to help people building with DIDs and VC related applications. 

Tell us more about the tools you’re building, and how they promote adoption, and interoperability, of decentralized identities

Our first service is the Vidos Universal Resolver. DID resolution forms a core part of any interaction with VCs and needs to be reliable and performant. It’s also something that developers and ops teams shouldn’t need to spend time deploying and managing. The service takes away this burden so deploying a resolver is simple and just requires adding the API details to an application.
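
For illustration, resolving a DID through a hosted resolver can be as small as one HTTP call. The sketch below follows the common universal-resolver URL convention (GET .../identifiers/{did}); the base URL and API-key header are placeholders we made up, not Vidos's documented interface:

// Calling a hosted DID resolver from an application (TypeScript, Node 18+ fetch).
async function resolveDid(did: string): Promise<unknown> {
  const base = "https://resolver.example.com/1.0/identifiers"; // placeholder endpoint
  const res = await fetch(`${base}/${encodeURIComponent(did)}`, {
    headers: { Authorization: "Bearer <api-key>" },             // placeholder credential
  });
  if (!res.ok) throw new Error(`DID resolution failed: ${res.status}`);
  return res.json(); // resolution result, including the DID document
}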

The service comes with uptime guarantees and offers granular policy and permissions features, helping organizations meet their own availability and compliance requirements. 

This helps organizations who are not just issuers (of credentials such as course certificates and educational qualifications). They may also need to verify other credentials (such as proof of identity, age, etc.), which potentially involves resolving DIDs on multiple networks and services. 

We also have other services coming later in the year that will facilitate credential verification with similar compliance and logging features. 

You mentioned go-to-market strategy as an area of personal interest. Can you tell us a bit about your own strategy? 

The DID resolution and Mailchain audiences are different. For Vidos, we’re working with enterprises and closing some gaps where technology is not available today. Mailchain is largely feature complete. 

Vidos is a good fit with Mailchain because there’s strong interest in enabling Web3 communication, whether that’s machine-to-machine messages triggered by blockchain transactions or certain types of business communication. 

We need to ground this in the real world, so developing SaaS (Software as a Service) products to move the entire ecosystem forward is what we think is most important right now. 

I’d like to think that building on W3C standards ensures we don’t get ruled out of any geographic markets. The DID resolver is intended to be multi-region. Customers can already deploy into the UK and EU. We will stand up services elsewhere, as needed.

What market signals do you see? 

The market never moves fast enough for an entrepreneur! But we’re seeing strong signs. It’s becoming a priority for enterprises to see how they can move beyond identity federation.  Regulatory change and fraud are also encouraging supply chain actors and financial institutions to look at how they can use decentralized identity. 

We’re seeing this pop up in different places; for example, it’s good to see LinkedIn verifying humans on the platform. There are certainly tailwinds. 

What is the value of DIF membership to Mailchain? 

We’re hoping to collaborate with industry participants, to make sure what we build is right for the use cases we’re targeting, starting with the Vidos Universal Resolver for DIDs, as well as to learn from others building in the space. 

We also want to contribute back to what’s a very useful and sensible set of standards, whether that’s ideas in the working groups and/or contributing packages or libraries.

It’s a great time to be involved in DIF. The standards are reaching a stage where they are mature enough. The opportunity is now!


MyData

MyData4Children ZINE 2024: A challenge for MyData Community – Design a Robot School

Introducing MyData4Children Zine 2024 Numerous studies and real-life events have shown us that emergent technologies affect children, for good and bad. However, the dominant narrative is framed with an individualistic focus, putting a single child or a person in a child’s circle of trust on the spot, leaving many of us feeling defeated, nervous, and […]

Wednesday, 21. February 2024

OpenID

OpenID Summit Tokyo 2024 and Celebrating 10 Years of OpenID Connect


OpenID Foundation Japan (OIDF-J) hosted the OpenID Summit Tokyo 2024 in Shibuya Tokyo on Friday, January 19, 2024 with over 250 in attendance. The OpenID Foundation (OIDF) was thrilled to be a part of the Summit that included contributors from Japan and abroad presenting on current digital identity, security, and digital wallet topics.

Gail Hodges, OIDF Executive Director, kicked the Summit off by presenting OIDF’s strategic outlook for 2024 as well as a detailed briefing on the Sustainable Interoperable Digital Identity (SIDI) Summit held in Paris in November 2023.

A highlight of the Summit was a panel discussion celebrating ten years of OpenID Connect. This panel was coordinated and moderated by longtime OIDF board member and OpenID Connect editor, Mike Jones. Panelists included OIDF Chairman, Nat Sakimura, longtime Connect contributor and evangelist, Nov Matake, and Ryo Ito, OIDF-J Evangelist. As Mike Jones noted in his blog, the panelists shared their experiences on what led to OpenID Connect, why it’s been successful, and lessons learned along the way. This was the first of three planned OpenID Connect celebrations in 2024 with the other two taking place at Identiverse in May and the European Identity and Cloud Conference in June.

Nat Sakimura concluded the OpenID Summit Tokyo 2024 by delivering the closing keynote.

The post OpenID Summit Tokyo 2024 and Celebrating 10 Years of OpenID Connect first appeared on OpenID Foundation.


Identity At The Center - Podcast

In our latest episode of the Identity at the Center podcast,


In our latest episode of the Identity at the Center podcast, we had the pleasure of welcoming Sara King and Raul Cepeda from rf IDEAS for a Sponsor Spotlight discussion.

This episode, generously sponsored by rf IDEAS, dives deep into the realms of physical security and identity, highlighting the innovative solutions rf IDEAS brings to the table. We explored their unique market positioning, their impactful presence in sectors like healthcare and manufacturing, and how they're leading the charge towards passwordless environments. Our conversation also touched on current industry trends, including the move to secure mobile credentials and the future of biometrics, capped off with insights into rf IDEAS' Reader Remote Management capabilities.

Tune in to this engaging episode to discover how rf IDEAS is bridging the gap between physical and logical security for a seamless authentication experience. It's an insightful discussion on the latest advancements in the field that you won't want to miss.

#iam #podcast #idac


DIDAS

DIDAS Statement for E-ID Technology Discussion Paper

In this latest contribution to the ongoing dialogue surrounding Switzerland’s E-ID initiative, DIDAS has released a comprehensive document that critically evaluates the current technological proposals for the Swiss trust infrastructure. This document underscores DIDAS’s commitment to a principle-based, collaborative methodology in developing a secure, adaptive E-ID ecosystem, echoing the necessity for an approach that is both inclusive and forward-thinking.

It focuses on the technological shortcomings of the existing scenarios and proposes an ‘A+’ scenario that better aligns with EU standards, addresses aspects of privacy (specifically unlinkability and correlation) and fosters iterative development. This approach champions not only secure cryptographic practices but also the coexistence of various credential types, ensuring a flexible, future-proof infrastructure.

Further aspects include the imperative for cryptographically safe owner binding, a cornerstone for qualified digital identities. The document elucidates the necessity for cryptographic primitives embedded directly within the secure elements of devices, particularly for high levels of assurance. This technical requirement is not merely a suggestion but a mandatory prerequisite to prevent potential misuse or impersonation attempts. Confining private keys to a device’s silicon is highlighted as a critical measure to prevent their unauthorized replication, ensuring that the sanctity of digital identities remains inviolable.

Furthermore, the document highlights the urgency of action, urging stakeholders to lead the way in establishing a continuously evolving, privacy-centric E-ID framework. It is also aimed at striking a balance between Swiss-specific requirements and EU interoperability, setting a precedent for digital identity management.

DIDAS’s insights into governance structures and the collaborative design of the trust infrastructure serve as a high-level guide for policymakers, technologists, and industry stakeholders, emphasizing the collective responsibility in shaping a digital identity ecosystem that is secure, user-centric, adaptable by private-sector businesses and aligned with broader societal values and international standards.

Download here: 2024-02 DIDAS E-ID Technology Discussion Paper Response Final  ​​

Next Level Supply Chain Podcast with GS1

Tackling Inventory Headaches in the E-commerce Universe


Tracking and managing inventory from end-to-end is challenging for business merchants dealing with perishable food items.

Lichen Zhang, co-founder of Freshly Commerce, is changing how merchants handle the complexities of tracking bundles, managing perishable inventory, and the significance of complying with regulations such as FSMA 204. The company's initial success with its foundation, growth, and innovative solutions for inventory and order fulfillment in the e-commerce industry spurred Freshly's evolution into a suite of tools helping e-commerce merchants manage their inventory and order fulfillment. 

The episode provides valuable insights into the evolution of e-commerce and the vital role of innovative solutions like Freshly Commerce in meeting the changing needs of the industry. Explore how Freshly Commerce addresses challenges faced by merchants, the importance of data sharing and collaboration, and the positive impacts of technological advancements on adapting to new conditions. Lichen also emphasizes the role of education and customer-centric strategies in Freshly Commerce's ongoing development and how Freshly Commerce started as a result of identifying a need in the market, leading them to participate in a Shopify app challenge where they secured third place.

 

Key takeaways: 

Accurate and timely inventory management in e-commerce is complex but necessary.

Addressing food safety and compliance with regulations helps prevent food waste and maximize profits.

Embracing technological advancements such as AI-driven tools can positively change some aspects of business operations.

 

Connect with GS1 US:

Our website - www.gs1us.org

GS1 US on LinkedIn

 

Connect with guest:

Lichen Zhang on LinkedIn

Check out Freshly Commerce

 


Digital Identity NZ

Biometrics… ‘Perfect is the enemy of good?’ | February 2024 Newsletter


Kia ora e te whānau

Biometrics hit the news again earlier this month. TV1’s 7 Sharp and 1News, together with RNZ news online and several print media outlets, carried the story of Foodstuffs North Island’s trial of Facial Recognition in 25 of its stores to see if it reduces retail crime. In addition, Māori, Pasifika and people of colour have concerns about bias. Naturally the Office of the Privacy Commissioner (OPC) is closely monitoring the trial. I have no special insight, but from the links above I deduce that the trial stores run their CCTV feed through facial image matching software set at a high 90% threshold, matching it against that particular store’s database of known and convicted offenders. If a possible match is made, specially trained ‘super recognisers’ visually inspect both enrolled and detected images, which in itself should eliminate racial bias, while the rest of the feed is deleted.
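Purely as an illustrative sketch of the flow described above — a similarity threshold feeding a human-review queue — and not a depiction of how any real system is implemented, the logic amounts to something like this (all names and the scoring scale are hypothetical):

```typescript
// Illustrative only: compare each detected face against the store's enrolled
// database and keep likely matches for trained human review; everything
// below the threshold is discarded (the feed is deleted).
type Candidate = { personId: string; similarity: number }; // score in 0..1

const MATCH_THRESHOLD = 0.9; // the reported "high 90% threshold"

function screen(candidates: Candidate[]): Candidate[] {
  return candidates.filter((c) => c.similarity >= MATCH_THRESHOLD);
}

const forHumanReview = screen([
  { personId: "enrolled-123", similarity: 0.93 },
  { personId: "enrolled-456", similarity: 0.41 },
]);
console.log(forHumanReview); // -> only the 0.93 candidate is queued
```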

Permanent deletion, which is not straightforward, and the ‘no sharing’ rule between stores are matters that the OPC likely monitors, along with the trial’s effectiveness in reducing retail crime. While emerging anecdotal evidence overseas suggests the approach is effective, direct comparative research is needed.

CCTV and facial recognition are widely used for crime detection in public places, and we all use facial recognition every day on our phones, when we cross the border using Smart Gate, or when we use a browser on our PC. So you might ask: why all the fuss? 

There are large notices in-store, it’s private property, and people can choose to shop elsewhere. The additional use of image software in stores improves matching processes traditionally done by humans, albeit with potential human error. FR software and camera quality continuously improve, while human-based matching has limitations. Perfection is challenging, but by combining human and technological efforts we can improve outcomes.

Foodstuffs North Island’s adherence to its rules raises the question of whether striving for perfection impedes progress. DINZ’s Biometrics Special Interest Group reflects on differing community views, agrees with the Deputy Police Commissioner on the need for an open discussion and emphasises the need for education on the technology’s workings and potential benefits when implemented correctly.

Help us provide much needed education and understanding in this domain.

Ngā mihi nui

Colin Wallis

DINZ Executive Director

Read the full news here: Biometrics… ‘Perfect is the enemy of good?’ | February 2024 Newsletter

SUBSCRIBE FOR MORE

The post Biometrics… ‘Perfect is the enemy of good?’ | February 2024 Newsletter appeared first on Digital Identity New Zealand.


FIDO Alliance

FIDO Alliance Announces Call for Speakers and Sponsors for FIDO APAC Summit 2024


February 21, 2024

The FIDO Alliance is excited to announce the return of the FIDO APAC Summit for its second year, building on the success of the 2023 event in Vietnam. Scheduled to take place at the JW Marriott Kuala Lumpur, Malaysia, from September 10th to 11th, this premier event in the APAC region is dedicated to advancing phishing-resistant FIDO authentication – focusing on FIDO-based sign-ins with passkeys, and addressing IoT security and edge computing challenges with FIDO Device Onboarding (FDO).

Last year’s conference in Vietnam welcomed over 300 attendees and featured more than 20 sessions with engaging content alongside a sold-out exhibit area with over 20 industry-leading exhibitors and sponsors. The 2024 summit aims to build upon last year’s momentum with detailed case studies, technical tutorials, expert panels, and hands-on workshops. Sessions are designed to educate attendees on business drivers, technical considerations, and best practices for deploying modern authentication systems across web, enterprise and government applications. Additionally, attendees will benefit from a dynamic expo hall and engaging networking opportunities, set against the backdrop of downtown Kuala Lumpur’s natural beauty.

FIDO APAC Summit 2024 Call for Speakers

The FIDO Alliance invites thought leaders, industry experts, entrepreneurs, and academic professionals to submit speaking proposals to enrich the diverse FIDO APAC Summit 2024 program. Speakers with innovative ideas, implementation strategies, and successes in authentication and/or edge computing, from case studies to transformative projects, can submit proposals here. Selected speakers will join the ranks of top cybersecurity minds, influencing the community and promoting phishing-resistant authentication methods. Submit a proposal for an opportunity to shape cybersecurity’s future in the APAC region. Deadline for submissions is May 31, 2024. 

Sponsorship Opportunities at FIDO APAC Summit 2024

Join sponsors such as Samsung Electronics, SecureMetric, RSA, Thales, VinCSS, iProov, AirCuve, Zimperium, SmartDisplayer, and Utimaco and elevate your brand in the digital security landscape by sponsoring the FIDO APAC Summit 2024. This key event draws the cybersecurity community, offering sponsors a chance to interact with over 30 VIPs, speakers, and 300+ delegates, providing unparalleled brand visibility and thought leadership opportunities in the Asia-Pacific tech ecosystem. The summit is an ideal platform for sponsors eager to connect with an audience passionate about advanced passkeys and phishing-resistant authentication methods. Sponsoring this event places your brand at the forefront, engaging directly with professionals and policymakers driving the future of secure digital identities. Demonstrate your commitment to innovation and the development of secure, user-friendly digital ecosystems and influence the benchmark for authentication technologies by becoming a sponsor.

To become a sponsor, view the prospectus and complete the Sponsorship Request Form.

About FIDO Alliance

Formed in July 2012, the FIDO (Fast IDentity Online) Alliance aims to address the lack of interoperability among strong authentication technologies and the difficulties users face with creating and remembering multiple usernames and passwords. The FIDO Alliance is revolutionizing authentication with standards for simpler, stronger methods that reduce reliance on passwords. FIDO Authentication offers stronger, private, and easier use when authenticating to online services. For more information, visit www.fidoalliance.org.

Tuesday, 20. February 2024

DIF Blog

Full steam ahead for the Veramo User Group


The Veramo User Group has kicked into action with a well-attended and productive first meeting. 

The meeting on 15th February provided context for the origins of the popular Javascript framework and its subsequent donation to DIF, and surfaced a range of questions, ideas, use cases and feedback from current and prospective users, plus success stories including an Enterprise solution built on top of Veramo, now in full production. 

After an initial round of introductions, the Veramo Labs team provided an overview of the project’s history and current status, and the goals of the User Group. 

Ideas shared by participants included a registry of plugins that is actively maintained by the community, additional development languages and a 12-month roadmap of planned new features. 

“The team’s goal has been from the beginning to grow the project by growing the community,” Veramo co-founder and group co-chair Mircea Nistor commented during the meeting. 

“Prior to donating Veramo to DIF, we had previously donated other libraries, and we saw more adoption and contributions happening on these libraries. We’d love to see the same kinds of activities happening within the Veramo User Group,” he added.

“To make this into a community project, it needs the original team to be no longer exclusively in charge. The Veramo framework and plugins already contain enough functionality to get people started in this space. We hope to see many new GitHub issues coming from the User Group, and for those with bandwidth to take them on” said Senior Engineer at Consensys Identity and group co-chair Nick Reynolds.

The plan is to start reviewing GitHub issues and Pull Requests (PRs) at next week’s meeting, at 09:00 PST / noon EST / 18:00 CET on Thursday, 22 February. The User Group is open to all; the meeting link can be found in the DIF calendar.

In the meantime, interested parties are welcome to join the community on Discord. The new Veramo User Group channel on DIF’s Discord server is recommended for meeting follow-ups and agenda items, and the Veramo Labs Discord server is the place to head for specific technical questions.
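For readers who want a first taste of the framework before joining, here is a minimal sketch of a Veramo agent built from its plugin model, assuming the publicly available @veramo/core, @veramo/did-resolver, did-resolver and web-did-resolver packages:

```typescript
import { createAgent, type IResolver } from "@veramo/core";
import { DIDResolverPlugin } from "@veramo/did-resolver";
import { Resolver } from "did-resolver";
import { getResolver as webDidResolver } from "web-did-resolver";

// Assemble an agent whose only capability comes from a plugin:
// DID resolution for did:web identifiers.
const agent = createAgent<IResolver>({
  plugins: [
    new DIDResolverPlugin({
      resolver: new Resolver({ ...webDidResolver() }),
    }),
  ],
});

// Resolve a DID; further plugins add key management, credential issuance, etc.
agent
  .resolveDid({ didUrl: "did:web:example.com" })
  .then((result) => console.log(JSON.stringify(result.didDocument, null, 2)));
```

The plugin-per-capability design is what makes a community-maintained plugin registry, as proposed in the meeting, such a natural fit.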

The Veramo Labs team is hoping for the community to participate in leading the User Group within the next six months. 


OpenID

Registration Open for OpenID Foundation Hybrid Workshop at Google on Monday, April 15, 2024


Workshop Overview

OpenID Foundation Workshops provide technical insight and influence on current digital identity standards while offering a collaborative platform to openly address current trends and market opportunities. This OpenID Foundation Workshop includes a number of presentations focused on 2024 Foundation strategic initiatives as well as updates on active working groups.


Workshop Details

Thank you kindly to Google for hosting this after lunch, hybrid workshop on Monday, April 15, 2024 12:30-4pm PT:

Google 
242 Humboldt Ct
Humboldt 1
Sunnyvale, CA 94089

This is an after-lunch workshop with beverages and snacks provided to those attending in person. The Foundation’s Note Well Statement can be found here and is used to govern workshops.


Agenda

TIME | TOPIC | PRESENTERS
12:30-12:35pm | Welcome | Nat Sakimura
12:35-12:50pm | eKYC & IDA WG Update | Mark Haine
12:50-1:05pm | AuthZEN WG Update | David Brossard & Omri Gazitt
1:05-1:20pm | AB/Connect WG Update | Michael Jones
1:20-1:35pm | FAPI WG Update | Nat Sakimura
1:35-1:50pm | MODRNA WG Update | Bjorn Hjelm
1:50-2:05pm | DCP WG Update | Kristina Yasuda & Joseph Heenan
2:05-2:20pm | Shared Signals WG Update | Tim Cappalli
2:20-2:30pm | BREAK
2:30-2:40pm | OIDF Certification Program Update + Roadmap | Joseph Heenan
2:40-2:55pm | Death & the Digital Estate Community Group | Dean Saxe
2:55-3:05pm | Sustainable & Interoperable Digital Identity Hub Update | Gail Hodges
3:05-3:30pm | Listening Session: Post-Quantum Computing & Identity. What are your concerns? What is OIDF’s role? | Gail Hodges, Nancy Cam-Winget, John Bradley, Rick Byers
3:30-3:55pm | Listening Session: AI & Identity. What are your concerns? What is OIDF’s role? | Nancy Cam-Winget, Kaelig Deloumeau-Prigent, Mike Kiser, Geraint Rogers
3:55-4:00pm | Closing Remarks | Nat Sakimura

The post Registration Open for OpenID Foundation Hybrid Workshop at Google on Monday, April 15, 2024 first appeared on OpenID Foundation.


Content Authenticity Initiative

February 2024 | This Month in Generative AI: Election Season

From AI-resurrected dictators to AI-powered interactive chatbots, political campaigns around the world are deploying the technology to expand their audience and win over voters. This month, Hany Farid, UC Berkeley Professor and CAI Advisor, looks at how it is becoming increasingly easy to combine fake audio with video, the technology’s clear effect on the electorate, and existing approaches to authenticating digital media.


by Hany Farid, UC Berkeley Professor, CAI Advisor

News and trends shaping our understanding of generative AI technology and its applications.

In May of 2019, a manipulated video of House Speaker Nancy Pelosi purportedly slurring her words in a public speech racked up over 2.5 million views on Facebook. Although the video was widely reported to be a deepfake, it was what we would today call a “cheap fake.” The original video of Speaker Pelosi was simply slowed down to make her sound inebriated — no AI needed. The cheap fake was, however, a harbinger.

Around 2 billion citizens will vote this year in some 70 elections around the globe. At the same time, generative AI has emerged as a powerful technology that can entertain, defraud, and deceive.

Today, nearly anyone can use generative AI to create hyper-realistic images from only a text prompt, clone a person's voice from a 30-second recording, or modify a video to make the speaker say things they never did or would say. Perhaps not surprisingly, generative AI is finding its way into everything from local to national and international politics. Some of these applications are used to bolster a candidate, but many are designed to be harmful to a candidate or party, and all applications raise new and complex questions.

Trying to help

In October of last year, New York City Mayor Eric Adams used generative AI to make robocalls in which he spoke Mandarin and Yiddish. (Adams only speaks English.) The calls did not disclose that the voice was AI-generated, and at least some New Yorkers believe that Adams is multilingual: "People stop me on the street all the time and say, ‘I didn’t know you speak Mandarin,’" Adams said. While the content of the calls was not deceptive, some claimed that the calls themselves were deceptive and an unethical use of AI.

Not to be outdone, earlier this year Representative Dean Phillips deployed a full-blown OpenAI-powered interactive chatbot to bolster his long-shot bid for the Democratic nomination in the upcoming presidential primary. The chatbot disclosed that it was an AI-bot and allowed voters to ask questions and hear an AI-generated response in an AI-generated version of Phillips's voice. Because this bot violated OpenAI's terms of service, it was eventually taken offline.

Trying to harm

In October of last year, Slovakia — a country that shares part of its eastern border with Ukraine — saw a last-minute and dramatic shift in its parliamentary election. Just 48 hours before election day, the pro-NATO, Western-aligned candidate Michal Šimečka was leading in the polls by some four points. A fake audio clip of Šimečka seeming to claim that he was going to rig the election spread quickly online, and two days later the pro-Moscow candidate Robert Fico won the election by five points. It is impossible to say exactly how much the audio affected the outcome, but the incident raised concerns about the use of AI in campaigns.

Fast-forward to January of this year, when the state of New Hampshire was holding the nation's first primary for the 2024 US presidential election. On the eve of the primary, more than 20,000 New Hampshire residents received robocalls impersonating President Biden. The call urged voters not to vote in the primary and to "save your vote for the November election." It took two weeks before New Hampshire’s Attorney General announced that his office had identified two businesses behind these robocalls. 

The past few months have also seen an increasing number of viral images making the rounds on social media. These range from faked images of Trump with convicted child sex trafficker Jeffrey Epstein and a young girl, to faked images of Biden in military fatigues on the verge of authorizing military strikes. 

On the video front, it is becoming increasingly easier to combine fake audio with video to make people say and do things they never did. For example, a speech originally given by Vice President Harris on April 25, 2023, at Howard University was digitally altered to replace the voice track with a seemingly inebriated and rambling Harris.

And these are just a few examples of the politically motivated deepfakes that we have already started to see as the US national election heats up. In the coming months, I'll be keeping track of these examples as they continue to emerge.

Something in between

In the lead-up to Indonesia’s election earlier in February, a once-feared army general, who ruled the country with an iron fist for more than three decades, was AI-resurrected with a message for voters. And, in India, the former head of the Dravida Munnetra Kazhagam party – deceased since 2018 – was AI-resurrected with an endorsement for his son, the sitting chief minister of the state of Tamil Nadu. I expect this type of virtual endorsement will become an (ethically complex) trend.

Looking ahead

There are two primary approaches to authenticating digital media. Reactive techniques analyze various aspects of an image or video for traces of implausible or inconsistent properties. Learn more about these photo forensics techniques in my series for the CAI. Proactive techniques, on the other hand, operate at the source of content creation, embedding into or extracting from an image or video an identifying digital watermark or signature. 
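As a minimal sketch of the proactive idea — signing content at its creation so that any later edit is detectable — consider the following, which uses the standard Web Crypto API. This is only the underlying principle: real provenance systems such as C2PA sign structured manifests, not raw bytes as done here.

```typescript
// Minimal sketch of proactive content authentication: sign content bytes
// at creation time, verify them later. Illustrative only.
const subtle = globalThis.crypto.subtle;

async function demo() {
  const keys = await subtle.generateKey(
    { name: "ECDSA", namedCurve: "P-256" },
    false, // private key is not extractable
    ["sign", "verify"],
  );
  const content = new TextEncoder().encode("image bytes would go here");
  const signature = await subtle.sign(
    { name: "ECDSA", hash: "SHA-256" },
    keys.privateKey,
    content,
  );
  // Any later change to the content makes verification fail.
  const ok = await subtle.verify(
    { name: "ECDSA", hash: "SHA-256" },
    keys.publicKey,
    signature,
    content,
  );
  console.log("content authentic:", ok);
}

demo();
```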

Although not perfect, these combined reactive and proactive technologies will make it harder (but not impossible) to create a compelling fake and easier to verify the integrity of real content. The creation and detection of manipulated media, however, is inherently adversarial. Both sides will continually adapt, making distinguishing the real from the fake an ongoing challenge.

While it is relatively straightforward to regulate AI-powered non-consensual sexual imagery, child abuse imagery, and content designed to defraud, regulating political speech is more fraught. We, of course, want to give a wide berth for political discourse, but there should be limits on activities like those we saw in New Hampshire, where bad actors attempt to interfere with our voting rights. 

As a first step, following the New Hampshire AI-powered robocalls, the Federal Communications Commission quickly announced a ban on AI-powered robocalls. While the ruling is fairly narrow and doesn't address the wider issue of AI-powered election interference or non-AI-powered interference, it is a reasonable precaution as we all try to sort out this brave new world where anybody's voice or likeness can be manipulated.

As we continue to wrestle with these complex questions, we as consumers have to be particularly vigilant as we enter what is sure to be a highly contentious election season. We should be vigilant not to fall for disinformation just because it conforms to our personal views, we should be vigilant not to be part of the problem by spreading disinformation, and we should be vigilant to protect our and others' rights (even if we disagree with them) to participate in our democracy.

Subscribe to the CAI newsletter to receive ecosystem news.

Stay connected and consider joining the movement to restore trust and transparency online.

Author bio: Professor Hany Farid is a world-renowned expert in the field of misinformation, disinformation, and digital forensics. He joined the Content Authenticity Initiative (CAI) as an advisor in June 2023. The CAI is an Adobe-led community of media and tech companies, NGOs, academics, and others working to promote adoption of the open industry standard for content authenticity and provenance.

Professor Farid teaches at the University of California, Berkeley, with a joint appointment in electrical engineering and computer sciences at the School of Information. He’s also a member of the Berkeley Artificial Intelligence Lab, Berkeley Institute for Data Science, Center for Innovation in Vision and Optics, Development Engineering Program, and Vision Science Program, and he’s a senior faculty advisor for the Center for Long-Term Cybersecurity. His research focuses on digital forensics, forensic science, misinformation, image analysis, and human perception.

He received his undergraduate degree in computer science and applied mathematics from the University of Rochester in 1989, his M.S. in computer science from SUNY Albany, and his Ph.D. in computer science from the University of Pennsylvania in 1997. Following a two-year post-doctoral fellowship in brain and cognitive sciences at MIT, he joined the faculty at Dartmouth College in 1999 where he remained until 2019.

Professor Farid is the recipient of an Alfred P. Sloan Fellowship and a John Simon Guggenheim Fellowship, and he’s a fellow of the National Academy of Inventors.


FIDO Alliance

Intelligent Health.Tech: Site security: Passwordless fingerprint authentication


Thales has announced the SafeNet IDPrime FIDO Bio Smart Card – a security key that enables strong multi-factor authentication (MFA) for the enterprise. This new contactless smart card allows users to access enterprise devices, applications and cloud services using a fingerprint instead of a password. 


StateTech Magazine: How Passwordless Authentication Supports Zero Trust


Utilizing FIDO passkeys addresses security risks associated with password-based systems, which often lead to account takeovers, data breaches and even stolen identities. While password managers and legacy forms of two-factor authentication offer incremental improvements, there has been industry-wide collaboration to create passkey sign-in technology that is more convenient and more secure.


TechTarget: How passwordless helps guard against AI-enhanced attacks


In the age of generative AI, phishing scams (which already account for 90% of data breaches according to CISA) are becoming increasingly persuasive and humanlike. To mitigate these evolving threats, organizations should prioritize transitioning to passkeys, a phishing-resistant alternative backed by industry giants like Google, Apple, Amazon, and Microsoft, to enhance both security and usability.


The Wall Street Journal: Forget Passwords and Badges: Your Body Is Your Next Security Key


Andrew Shikiar, executive director of the FIDO Alliance, emphasizes the importance of biometric scans as hacking attempts and other cyber threats have become more sophisticated.


Hyperledger Foundation

Hyperledger Mentorship Spotlight: Iroha 2 Blockchain Explorer


Engaging in the Iroha 2 Blockchain Explorer project through the Hyperledger Mentorship project has been an exhilarating journey marked by technical challenges, continuous learning, and a profound sense of contributing to the broader technical community. As a mentee in this project, I immersed myself in various facets that not only enhanced my technical skills but also offered valuable insights on becoming a more effective contributor to the open-source realm.


Velocity Network

Live event with Randstad and Rabobank

On March 19th, discover how the Dutch Banking industry is using verifiable credentials to accelerate the shift to a skills-based economy.

The post Live event with Randstad and Rabobank appeared first on Velocity.


MyData

Open Position: Finance and admin officer (50% FTE)

Job title: Finance and admin officer
Employment type: 50% employment contract
Contract duration: Permanent with a 6-month trial period
Salary range: 1,100 € – 1,250 € (2,200 € – 2,500 € FTE)
Location: Finland, with a preference for Helsinki
Reports to: Executive Director

Role description: The Finance and Administration Officer is responsible for monitoring and implementing financial operations, setting up and […]

Monday, 19. February 2024

Identity At The Center - Podcast

In our latest episode of The Identity at the Center Podcast,


In our latest episode of The Identity at the Center Podcast, we dive into a conversation with Daniel Grube about TikTok's adoption of FIDO technology to enhance security. Daniel shares insights into the seamless integration of this technology for both enterprise and user benefits, emphasizing the importance of user education and phased technology rollouts. We also explore the lighter side with a debate on airplane seating preferences. Listen to this enlightening discussion at idacpodcast.com or wherever you download your podcasts.

#iam #podcast #idac


GS1

e-CMR in GS1 Belgium

In April 2022, GS1 Belgium & Luxembourg launched a pilot project on e-CMR together with 7 companies, amongst which AB InBev.

The goal of this pilot project is to optimise the digitalisation of transport with e-CMR and to define standards that everyone can use. One year later, we already have some great insights, and we asked Andreea Calin from AB InBev to share her findings on e-CMR and our pilot project.

See more on GS1 Belgilux's article


Paperless – GS1 Poland (in Polish)

Paperless – GS1 Poland (in Polish) paperless_logistyka_bez_papieru_taniej_szybciej_bezpieczniej.pdf

E-CMR in Colian Logistic – GS1 Poland (in Polish)

E-CMR in Colian Logistic – GS1 Poland (in Polish) bc_ecmr.pdf

Friday, 16. February 2024

Oasis Open Projects

Approved Errata for Common Security Advisory Framework v2.0 published

Update to the definitive reference for the CSAF language now available. The post Approved Errata for Common Security Advisory Framework v2.0 published appeared first on OASIS Open.

CSAF Aggregator schema updated

OASIS and the OASIS Common Security Advisory Framework (CSAF) TC [1] are pleased to announce the approval and publication of Common Security Advisory Framework Version 2.0 Errata 01.

This document lists the approved errata for the OASIS Standard “Common Security Advisory Framework Version 2.0.” The specific changes are listed in section 1.1, at https://docs.oasis-open.org/csaf/csaf/v2.0/errata01/os/csaf-v2.0-errata01-os.html#11-description-of-changes.

The Common Security Advisory Framework (CSAF) Version 2.0 is the definitive reference for the CSAF language which supports creation, update, and interoperable exchange of security advisories as structured information on products, vulnerabilities and the status of impact and remediation among interested parties.
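For orientation, the required skeleton of a CSAF 2.0 advisory is a small JSON object. The sketch below expresses it as a TypeScript literal with illustrative values; real advisories typically also carry product_tree and vulnerabilities sections, and the authoritative shape is defined by the CSAF JSON schema linked below.

```typescript
// Rough skeleton of a minimal CSAF 2.0 advisory (illustrative values only).
const advisory = {
  document: {
    category: "csaf_base",
    csaf_version: "2.0",
    publisher: {
      category: "vendor",
      name: "Example Org",
      namespace: "https://example.com",
    },
    title: "Example security advisory",
    tracking: {
      id: "EXAMPLE-2024-0001",
      status: "final",
      version: "1.0.0",
      initial_release_date: "2024-01-26T00:00:00.000Z",
      current_release_date: "2024-01-26T00:00:00.000Z",
      revision_history: [
        {
          date: "2024-01-26T00:00:00.000Z",
          number: "1.0.0",
          summary: "Initial release",
        },
      ],
    },
  },
};

console.log(JSON.stringify(advisory, null, 2));
```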

The OASIS CSAF Technical Committee is chartered to make a major revision to the widely-adopted Common Vulnerability Reporting Framework (CVRF) specification, originally developed by the Industry Consortium for Advancement of Security on the Internet (ICASI). ICASI has contributed CVRF to the CSAF TC. The revision is being developed under the name Common Security Advisory Framework (CSAF). TC deliverables are designed to standardize existing practice in structured machine-readable vulnerability-related advisories and further refine those standards over time.

The documents and related files are available here:

Common Security Advisory Framework Version 2.0 Errata 01
OASIS Approved Errata
26 January 2024

Editable source (Authoritative):
https://docs.oasis-open.org/csaf/csaf/v2.0/errata01/os/csaf-v2.0-errata01-os.md

HTML:
https://docs.oasis-open.org/csaf/csaf/v2.0/errata01/os/csaf-v2.0-errata01-os.html

PDF:
https://docs.oasis-open.org/csaf/csaf/v2.0/errata01/os/csaf-v2.0-errata01-os.pdf

JSON schemas:
Aggregator JSON schema:
https://docs.oasis-open.org/csaf/csaf/v2.0/errata01/os/schemas/aggregator_json_schema.json
CSAF JSON schema:
https://docs.oasis-open.org/csaf/csaf/v2.0/errata01/os/schemas/csaf_json_schema.json
Provider JSON schema:
https://docs.oasis-open.org/csaf/csaf/v2.0/errata01/os/schemas/provider_json_schema.json

For your convenience, OASIS provides a complete package of the specification document and any related files in ZIP distribution files. You can download the ZIP file at:
https://docs.oasis-open.org/csaf/csaf/v2.0/errata01/os/csaf-v2.0-errata01-os.zip

Members of the CSAF TC [1] approved the publication of these Errata by Full Majority Vote [2]. The Errata had been released for public review as required by the TC Process [3]. The Approved Errata are now available online in the OASIS Library as referenced above.

Our congratulations to the CSAF TC on achieving this milestone.

========== Additional references:
[1] OASIS Common Security Advisory Framework (CSAF) TC
https://www.oasis-open.org/committees/csaf/

[2] https://lists.oasis-open.org/archives/csaf/202402/msg00001.html

[3] Public review:
– 15-day public review, 20 December 2023: https://lists.oasis-open.org/archives/members/202312/msg00005.html
– Comment resolution log: https://docs.oasis-open.org/csaf/csaf/v2.0/errata01/csd01/csaf-v2.0-errata01-csd01-comment-resolution-log.txt

The post Approved Errata for Common Security Advisory Framework v2.0 published appeared first on OASIS Open.


FIDO Alliance

White Paper: Addressing FIDO Alliance’s Technologies in Post Quantum World


There has been considerable press, a number of papers, and several formal initiatives concerned with quantum computing’s impact on cryptographic algorithms and protocols. Most standards development organizations are addressing concerns about the impact on the security of the currently deployed cryptographic algorithms and protocols. This paper presents FIDO Alliance initiatives that address the impact of quantum computing on the Alliance’s specifications and how the FIDO Alliance is working to retain the long-term value provided by products and services based on the FIDO Alliance specifications. 

This paper is directed to those who have or are considering FIDO-enabled products and solutions but have concerns about the impact of Quantum Computing on their business. This paper will focus, from a high-level approach, on the FIDO Alliance’s acknowledgment of issues related to Quantum Computing and explain how the FIDO Alliance is taking appropriate steps to provide a seamless transition from the current cryptographic algorithms and protocols to new PQC (or quantum-safe) algorithms in a timely manner.

For any questions or comments, please contact feedback@fidoalliance.org.

Wednesday, 14. February 2024

FIDO Alliance

Webinar: Next-Gen Authentication: Implementing Passkeys for your Digital Services


Despite their shortcomings, passwords have been a necessary evil; an unavoidable reality. No wonder online services have struggled to get rid of passwords for nearly three decades. Not anymore! Passkeys have emerged as a modern form of authentication, offering a superior user experience and higher security. With over 8 billion accounts already protected by passkeys, the question for service providers isn’t “if” they should be adopting passkeys, but rather “when” and “how”. 

During the webinar attendees were able to: 

Learn why this standards-based approach is gaining such rapid traction
Understand how ready your end users are for adopting passkeys
Get actionable guidance to roll out passkeys for both low and high assurance authentication
Understand why you should introduce passkeys for your digital services right away
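For readers curious about the mechanics beneath passkeys, a browser sign-in reduces to a WebAuthn assertion request like the sketch below. The challenge must be issued and later verified by your server, and the rpId here is a placeholder domain:

```typescript
// Minimal sketch of a passkey sign-in in the browser via the WebAuthn API.
// The challenge comes from the server; values below are placeholders.
async function signInWithPasskey(challenge: ArrayBuffer) {
  const assertion = await navigator.credentials.get({
    publicKey: {
      challenge,                    // random bytes issued by the server
      rpId: "example.com",          // the relying party's domain
      userVerification: "required", // e.g. biometric or device PIN
    },
  });
  // Send the assertion back to the server for signature verification.
  return assertion;
}
```

Because the private key never leaves the authenticator and the signature is bound to the relying party's domain, this flow is what makes passkeys phishing-resistant.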

Tuesday, 13. February 2024

Oasis Open Projects

The DocBook Schema Version 5.2 OASIS Standard published

DocBook Version 5.2 continues the evolution of the DocBook XML schema. The post The DocBook Schema Version 5.2 OASIS Standard published appeared first on OASIS Open.

DocBook continues its evolution - over 25 years since origin

OASIS is pleased to announce the publication of its newest OASIS Standard, approved by the members on 06 February 2024:

The DocBook Schema Version 5.2
OASIS Standard
06 February 2024

Overview:

Almost all computer hardware and software developed around the world needs some documentation. For the most part, this documentation has a similar structure and a large core of common idioms. The community benefits from having a standard, open, interchangeable vocabulary in which to write this documentation. DocBook has been, and will continue to be, designed to satisfy this requirement. For more than 25 years, DocBook has provided a structured markup vocabulary for just this purpose. DocBook Version 5.2 continues the evolution of the DocBook XML schema.

The prose specifications and related files are available here:

The DocBook Schema Version 5.2

Editable source (Authoritative):
https://docs.oasis-open.org/docbook/docbook/v5.2/os/docbook-v5.2-os.docx

HTML:
https://docs.oasis-open.org/docbook/docbook/v5.2/os/docbook-v5.2-os.html

PDF:
https://docs.oasis-open.org/docbook/docbook/v5.2/os/docbook-v5.2-os.pdf

Schemas:
Relax NG schemas: https://docs.oasis-open.org/docbook/docbook/v5.2/os/rng/
Schematron schemas: https://docs.oasis-open.org/docbook/docbook/v5.2/os/sch/
XML catalog: https://docs.oasis-open.org/docbook/docbook/v5.2/os/catalog.xml
NVDL schemas: https://docs.oasis-open.org/docbook/docbook/v5.2/os/

Distribution ZIP file

For your convenience, OASIS provides a complete package of the prose specification and related files in a ZIP distribution file. You can download the ZIP file here:

https://docs.oasis-open.org/docbook/docbook/v5.2/os/docbook-v5.2-os.zip

Our congratulations to the members of the OASIS DocBook Technical Committee on achieving this milestone.

The post The DocBook Schema Version 5.2 OASIS Standard published appeared first on OASIS Open.

Monday, 12. February 2024

Hyperledger Foundation

Introducing Hyperledger Web3j, the Ethereum Integration Library for Enterprises

We are very excited to announce the newest Hyperledger project, Hyperledger Web3j. Contributed by Web3 Labs to Hyperledger Foundation in January 2024, Web3j is a well-established open source project with an active community that plans to further thrive within the Hyperledger ecosystem.

We are very excited to announce the newest Hyperledger project, Hyperledger Web3j. Contributed by Web3 Labs to Hyperledger Foundation in January 2024, Web3j is a well-established open source project with an active community that plans to further thrive within the Hyperledger ecosystem.


Identity At The Center - Podcast

It’s time for the next exciting episode of the Identity at t

It’s time for the next exciting episode of the Identity at the Center Podcast! We dive into the fascinating world of Security Operations Centers (SOCs) and their crucial role in identity security. In this episode, we had the privilege of hosting two experts from RSM's Managed Security Practice, Steve Kane and Todd Willoughby. Their insights and expertise shed light on the role of SOCs in identit

It’s time for the next exciting episode of the Identity at the Center Podcast! We dive into the fascinating world of Security Operations Centers (SOCs) and their crucial role in identity security.

In this episode, we had the privilege of hosting two experts from RSM's Managed Security Practice, Steve Kane and Todd Willoughby. Their insights and expertise shed light on the role of SOCs in identity security, evolving threats, and the importance of identity data within SOCs. We also explore the decision-making process between building your own SOC or outsourcing.

Listen to the full episode on idacpodcast.com or in your favorite podcast app and gain valuable insights into the anatomy of a breach, actions taken by SOCs to prevent attacks, and the tactics and techniques used by threat actors to avoid detection.

#iam #podcast #idac

Friday, 09. February 2024

FIDO Alliance

WIRED: I Stopped Using Passwords. It’s Great—and a Total Mess


More than 8 billion online accounts can set up passkeys right now, says Andrew Shikiar, the chief executive of the FIDO Alliance, an industry body that has developed the passkey over the past decade. So, I decided to kill my passwords.


TechRound: Top 10 UK Business Cybersecurity Providers


Intercede’s solutions offer maximum protection against data breaches, focusing on:

Digital Identity Management for citizens, the workforce, and supply chains
Compliance adherence
Technological solutions such as FIDO, Digital ID Registration, Mobile Authentication, and PKI for robust identity and credential management

International Security Journal: The role of MFA in the fight against phishing


Based on FIDO Alliance and W3C standards, passkeys replace passwords with cryptographic key pairs. This requires the user to further authenticate themselves off-site using either soft or hardware-bound solutions.




Gear Patrol: Want a Faster, More Secure Way of Logging into X on Your iPhone? Use a Passkey


X (formerly Twitter) has introduced passkeys for iPhone users as an alternative to traditional passwords. Passkeys offer heightened security through its inherent two-step authentication system and are generated by the device along with the X account, making it less vulnerable to phishing and unauthorized access.


Velocity Network

NSC’s Chris Goodson joins Velocity’s board

We're delighted that National Student Clearinghouse's Chris Goodson has been voted onto the Velocity Network Foundation Board of Directors. The post NSC’s Chris Goodson joins Velocity’s board appeared first on Velocity.

Thursday, 08. February 2024

OpenID

Public Review Period for Proposed Implementer’s Draft of OpenID for Verifiable Credential Issuance


The OpenID Connect Working Group recommends approval of the following specification as an OpenID Implementer’s Draft:

OpenID for Verifiable Credential Issuance 1.0

This would be the first Implementer’s Draft of this specification.

An Implementer’s Draft is a stable version of a specification providing intellectual property protections to implementers of the specification. This note starts the 45-day public review period for the specification draft in accordance with the OpenID Foundation IPR policies and procedures. Unless issues are identified during the review that the working group believes must be addressed by revising the draft, this review period will be followed by a seven-day voting period during which OpenID Foundation members will vote on whether to approve this draft as an OpenID Implementer’s Draft. For the convenience of members who have completed their reviews by then, voting will actually begin a week before the start of the official voting period.

The relevant dates are:

Implementer’s Draft public review period: Thursday, February 8, 2024 to Sunday, March 24, 2024 (45 days)
Implementer’s Draft vote announcement: Monday, March 11, 2024
Implementer’s Draft early voting opens: Monday, March 18, 2024*
Implementer’s Draft official voting period: Monday, March 25, 2024 to Monday, April 1, 2024 (7 days)*

* Note: Early voting before the start of the formal voting period will be allowed.

The OpenID Connect working group page is https://openid.net/wg/connect/. Information on joining the OpenID Foundation can be found at https://openid.net/foundation/members/registration. If you’re not a current OpenID Foundation member, please consider joining to participate in the approval vote.

You can send feedback on the specification in a way that enables the working group to act upon it by (1) signing the contribution agreement at https://openid.net/intellectual-property/ to join the working group (please specify that you are joining the “AB/Connect” working group on your contribution agreement), (2) joining the working group mailing list at https://lists.openid.net/mailman/listinfo/openid-specs-ab, and (3) sending your feedback to the list.

— Michael B. Jones

The post Public Review Period for Proposed Implementer’s Draft of OpenID for Verifiable Credential Issuance first appeared on OpenID Foundation.


We Are Open co-op

Pathways to Change

How to run an impactful workshop using our free template

Recently, we ran a Theory of Change workshop for the team at the Digital Credentials Consortium, which is hosted by MIT. We’ve found that organisations and projects that are looking to create big impact can benefit from this way of seeing into the future.

Theory of Change (ToC) is a methodology or a criterion for planning, participation, adaptive management, and evaluation that is used in companies, philanthropy, not-for-profit, international development, research, and government sectors to promote social change. (Wikipedia)

In this post we’ll outline what this kind of session aims to achieve, share a template which you can re-use, and explain how to make best use of it.

We’ve become pretty good at running these kinds of workshops for all kinds of clients, large and small, and find them particularly useful in charting a course for collaborative working. Thanks goes to Outlandish for introducing this approach to us!

ToC workshop template by WAO available under a Creative Commons Attribution 4.0 International license

Note: around seven people is ideal for this kind of workshop. We run this workshop remotely, but there’s no reason why it couldn’t be done in person.

🥝 At the Core

The template has several sections to it, but at the core is the triangle of Final goal, Outcomes, and Activities. You work through these in turn, first defining the goal, moving onto outcomes to support that goal, and then activities which lead to the outcomes.

The core of the ToC approach

One of the first things to figure out as a team is the timeframe for the work you are doing together. In terms of the final goal, is it to be achieved in six months? A year? Eighteen months? Three years?

Write that down on the sticky note at the top left-hand corner just to remind everyone.

⛏️ Breaking down the goal

The final goal can be difficult to write, so we’ve broken it down into three sections to make it easier for participants.

Before asking people to contribute ideas, we run through some examples from our own experience. The first row relates to work over around six months, the second over about 18 months, and the third over about three years.

Next, we use the section to the right-hand side, where each individual participant can take some time to write down what they think the organisation does (or should do) to influence their stakeholders and have the desired impact in the world.

They can approach these boxes in any order — for example, some people find it easier to go straight to the impact and then work backwards.

Once everyone has written something in all of the boxes, we go around and ask everyone in turn to explain what they’ve written. This adds some context.

Then, we go around again, and ask everyone to point to things that other people have written that they definitely agree with. This sets the scene for combining ideas into a collaborative final goal.

✅ Good enough for now, safe enough to try

After a quick break, participants are ready to collaborate on a combined final goal. We ask if anyone would like to have a go at filling in one of the boxes. They can do this directly themselves, or we can screenshare and fill it in for them.

After some discussion and iteration of what’s been written, we move on to the other boxes. It’s worth mentioning that the most important thing here is facilitated discussion, which means timeboxing in a way that doesn’t feel rushed.

The phrase to bear in mind is “good enough for now, safe enough to try”, which is a slightly different way of saying “perfect is the enemy of good”.

🔍 Identifying the Outcomes

Getting the goal agreed on by the team is 80% of the work in this session. In our experience, it’s entirely normal for this to take an entire 90-minute session, or even longer.

Moving on to the outcomes: these are statements that support the goal, changes or achievements that need to happen for the goal to be reached. They should be written so that it’s possible to say “yes, that has happened” or “no, it has not”.

For example, “the world is a better place” is not an example of a well-written outcome, but “more people agree that the city is a safer place to live” would work.

Other examples of decent outcomes from different kinds of work might be:

Local biodiversity is enhanced and pollution is reduced.
Parents demonstrate improved understanding of internet safety and digital citizenship.
Economic diversity within neighbourhoods is increased.

There are several ways we’ve run this part of the workshop, from full-anarchism mode, where people just ‘have at it’, through to taking it in turns in an orderly way to add (and then discuss) an outcome.

🚣 Getting to the Activities

People new to ToC workshops often conflate Outcomes and Activities. The easiest way to tell the difference is to ask whether it’s something we’re working towards, or whether it’s something we’re doing.

So, for example, if we take the outcome “Local biodiversity is enhanced and pollution is reduced” some supporting activities might be:

Introduce incentives for creating wildlife-friendly spaces, such as green roofs and community gardens.
Run regular river and park clean-up operations to remove pollutants and litter.
Enforce stricter regulations on industrial emissions and waste management.
Offer subsidies for businesses that implement green practices that reduce pollution and enhance biodiversity.
Promote the use of environmentally friendly pesticides and fertilisers in local farming and gardening.

Again, we’ve run workshops where we’ve just had a free-for-all, others where it’s been more orderly, and then others where teams have gone away and come up with the activities outside the session.

Some, in fact, have taken the existing activities they’re engaged with and tried mapping those onto the outcomes. It’s an interesting conversation when those activities don’t map!

💡 Final thoughts

A ToC workshop is a powerful way to chart a course together. It’s a collaborative endeavour for a small group to spend time on. What’s important is strong facilitation, as without it, participants can spend too much time (or not enough!) sharing their thoughts.

If you would like to explore WAO running a ToC workshop for your organisation, get in touch! We also have other approaches and openly-licensed templates that you may want to use and peruse at our Learn with WAO site.

Pathways to Change was originally published in We Are Open Co-op on Medium, where people are continuing the conversation by highlighting and responding to this story.

Wednesday, 07. February 2024

Ceramic Network

CeramicWorld 02

A digest of everything happening across the Ceramic ecosystem for February 2024: ETHDenver, new Orbis, DataFeed API, points, mainnet launches and more!

Welcome to the second edition of CeramicWorld, the monthly Ceramic ecosystem newsletter. Let's dive right in!

🏔️ Attend Proof of Data Summit!

Join Ceramic x Tableland on March 1 in Denver, or via livestream, for Proof of Data Summit, a full-day community gathering on reputation, identity, DePIN, decentralized AI, and decentralized data computing. Featuring lightning talks, technical discussions, and panels with industry visionaries, this is going to be a can’t-miss event! RSVP now to secure your spot in person or via livestream.

RSVP to Proof of Data Summit

👀 Loading the all new Orbis...

Orbis is expanding beyond social. Driven by developer feedback and its new role as a core developer in the Ceramic ecosystem, Orbis is evolving its mission to offer a simple and efficient gateway for storing and managing open data on Ceramic.

The all new Orbis will provide a developer-friendly SQL interface to explore and query data on Ceramic as well as a user interface and plugin store to save development time on crypto-specific features – from data migration and token gating mechanisms to automated blockchain interactions.

Orbis is built on Ceramic's new Data Feed API, making it fully compatible with ComposeDB. With the new Orbis, developing your project on Ceramic is easier than ever. If you want to learn more or join as an alpha tester, get in touch with the Orbis team on Discord or Twitter.

Learn more about OrbisDB

🔎 Index Network connects LLMs to Ceramic data

Index Network is a composable discovery protocol that enables personalized and autonomous discovery experiences across the web. Index is currently focused on enabling AI agents to query and interact with Ceramic data in a reactive, event-driven manner. Index has also partnered with a number of ecosystem teams like Intuition, Veramo, and more to enable users to create claims and attestations with natural language. Keep an eye out for updates from this team – mainnet launch seems imminent – and check out their documentation.

Learn more about Index

📈 The Ceramic ecosystem is growing

We found it difficult to keep track of all the new projects and initiatives sprouting up throughout the Ceramic ecosystem, so we made this ecosystem map. Let us know if we missed anyone. Enjoy! :)

Ceramic's Data Feed API opens for alpha testing
The Data Feed API is a set of new Ceramic APIs that enable developers to subscribe to the node's data change feed, allowing them to build Ceramic-powered databases and indexing solutions. A number of ecosystem partners are already testing it as we gear up for an early release before EthDenver!
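
To make the shape of this concrete, here is a minimal TypeScript sketch of an indexer subscribing to a node's change feed. This is an illustration only: the endpoint path, event payload, and local node URL are assumptions, not confirmed Data Feed API details.

```typescript
// Hypothetical sketch of consuming a Ceramic node's data change feed over
// server-sent events. The endpoint path and payload shape are assumptions;
// consult the Data Feed API docs for the real interface.

const NODE_URL = "http://localhost:7007"; // a local Ceramic node (assumption)

// Subscribe to the (assumed) SSE endpoint exposing stream changes.
const feed = new EventSource(`${NODE_URL}/api/v0/feed/aggregation/documents`);

feed.onmessage = (event: MessageEvent) => {
  // Each event is assumed to carry a JSON payload describing a stream change.
  const change = JSON.parse(event.data);
  console.log("stream changed:", change);
  // An indexing solution would upsert `change` into its own database here.
};

feed.onerror = () => {
  console.error("feed connection dropped; EventSource retries automatically");
};
```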

ComposeDB now supports interfaces
Interfaces enable standardized data models for interoperability. By defining essential fields that must be shared across models, interfaces facilitate data integration and querying across different models. This is vital for ensuring data consistency, especially in decentralized systems like verifiable credentials. For a detailed overview, see Intro to Interfaces.
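
As a rough sketch of the idea, the composite below declares a shared interface and a model that implements it, so queries against the interface also match the concrete model's documents. The directives and the Composite.create call follow the ComposeDB devtools as we understand them; treat the exact syntax as an assumption and defer to Intro to Interfaces.

```typescript
// Sketch only: assumed ComposeDB interface syntax; see "Intro to Interfaces".
import { CeramicClient } from "@ceramicnetwork/http-client";
import { Composite } from "@composedb/devtools";

const schema = `
# Essential fields that every conforming model must share.
interface Credential @createModel(description: "Generic credential fields") {
  issuer: String! @string(maxLength: 100)
  issuedAt: DateTime!
}

# A concrete model; querying the Credential interface also returns these docs.
type AttendanceBadge implements Credential
  @createModel(accountRelation: LIST, description: "Event attendance badge") {
  issuer: String! @string(maxLength: 100)
  issuedAt: DateTime!
  event: String! @string(maxLength: 100)
}
`;

// The client must be authenticated with a DID before models can be created.
const ceramic = new CeramicClient("http://localhost:7007");
const composite = await Composite.create({ ceramic, schema });
```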

Get started quickly with create-ceramic-app
The create-ceramic-app CLI tool simplifies the process of starting with the Ceramic network by allowing you to quickly set up a ComposeDB example app. If you're familiar with create-react-app or create-next-app, you should be right at home. If you want to quickly test a Ceramic app locally on your system, simply run npx @ceramicnetwork/create-ceramic-app. This command will guide you through creating a new Ceramic-powered social app in under a minute.

Collect attendance badges at ETHDenver 2024!
Ceramic is partnering with Fluence, a decentralized computing marketplace, to put forward a demo that will be in play at each of the EthDenver events listed below. You will be able to find us at each event and tap a disc to participate! With each attendance you will claim badges represented as documents on Ceramic. Fluence will be consuming the new Ceramic Data Feed API to enable compute over incoming badges.

Deprecation of IDX, Self.ID, Glaze, DID DataStore, 3ID, TileDocuments, Caip10Link
3Box Labs announced the deprecation of a suite of outdated Ceramic development tools including Self.ID, Glaze, DID DataStore, 3ID, 3id-datastore, TileDocuments, and Caip10Link. Due to the improvements in ComposeDB and other Ceramic databases over the last 2 years, these tools saw waning demand, creating significant maintenance overhead while failing to meet our strict UX and security standards. If you're using any of these tools, read this announcement for next steps.

Ceramic Community Content

FORUM: Ceramic protocol minimization?
WORKING GROUP: Ceramic Points Working Group, consisting of 10+ teams, formed as a result of this forum post and tweet
PODCAST: Why to Store ID Data Decentralized with Ceramic
TWITTER: Oamo launches many new data pools in January
BLOG: Charmverse x Ceramic: Empowering User Data Ownership in the Blockchain Era
TUTORIAL: WalletConnect: Create User Sessions with Web3Modal
BLOG: How Rust delivers speed and security for Ceramic
FORUM: Ceramic x Farcaster Frames
FORUM: Making ComposeDB’s composites sharable
WORKING GROUP: Ceramic Core Devs Notes: 2024-01-02

Upcoming Events

Feb 15: Ceramic Core Devs Call
Feb 25 - Mar 2: Ceramic x Silk Hacker House (EthDenver). Calling all hackers excited about decentralized tech! Apply to join the Silk EthDenver Hacker House from Feb 25th - March 2nd and take part in revolutionizing scientific tooling, web account UX, governance forums, and more! Participants are encouraged to utilize Ceramic as a decentralized data layer alongside other ecosystem tools like Orbis, EAS and more. (Very limited spots)
Feb 25 - Mar 2: DeSci Denver (EthDenver)
Feb 27: DePin Day (EthDenver)
Feb 28: Open Data Day (EthDenver)
Mar 1: Proof of Data Summit (EthDenver)

Work on Ceramic

JOBS: Head of Growth, 3Box Labs (Remote)
JOBS: Engineering Manager, 3Box Labs (Remote)
BOUNTY: Build a Ceramic x Scaffold-Eth Module

Contact Us

Want to get in touch with the Ceramic core team? Fill out this form (1m). Otherwise, drop us a note in the Forum.


Next Level Supply Chain Podcast with GS1

The Future of Connectivity with Digital Twins, AI, and the Global Supply Chain


Real-time data monitoring is revolutionizing maintenance and efficiency in industries such as aviation and automotive through digital twin technology.

Richard Donaldson, host of the Supply Chain Next podcast, is a visionary in supply chain management and a circular economy advocate. His insights on moving from linear to circular supply chains highlight the potential for substantial environmental benefits and the importance of embracing reuse, especially in the context of his work with startups promoting circularity.

The dialogue extends beyond the digital twin to the broader digital transformation of global supply chains, drawing comparisons to the quick adoption of airplane wifi as an example of rapid technological progress. It explores the role of artificial intelligence in supply chain automation and predictive maintenance, touching upon the divide between machine learning and self-actualized thought. The conversation resonates with historical references and Richard's personal entrepreneurial experiences, including his tenure at eBay, his podcast Supply Chain Next, and his perspective on learning from failure. This episode offers a thought-provoking reflection on the future of supply chains and the role of technology in sustainable business practices.


Key takeaways: 

The early days of the Internet continue to influence current work in digitizing supply chains.

The global supply chain still lacks full digitization and transparency, particularly in older, established processes.

There is a strong advocacy for shifting towards circular supply chains that are environmentally mindful and focused on sustainability.


Connect with GS1 US:

Our website - www.gs1us.org

GS1US on LinkedIn


Connect with guest:

Richard Donaldson on LinkedIn


Monday, 05. February 2024

FIDO Alliance

ITPro: The end of passwords – and how businesses will embrace it


Big tech firms including Microsoft, Apple and Google have been moving towards a passwordless future for several years, with solutions such as security keys and more recently, passkeys, starting to take off as part of multi-factor authentication (MFA) setups.

The FIDO Alliance – which most big tech players are members of – is pushing hard for the demise of the password. But what exactly does “the end of the password” mean, in practical terms?


GovTech: Forum Questions Future of Digital Identity, Path Forward


At the recent ID policy forum, the FIDO Alliance, The Identity Theft Resource Center, and other cybersecurity experts discussed the need for new identity verification methods as data breaches reached record levels in 2023. Panelists argued that relying solely on knowledge-based methods like passwords and Social Security numbers is no longer secure and highlighted the importance of multifactor authentication, passkeys, and biometric checks.


PCMag: Passkeys Are Here: We Just Have to Convince People to Use Them


In a recent identity and authentication conference, Andrew Shikiar, Executive Director of the FIDO Alliance, declared 2023 as the “year of the passkey,” citing 8 billion user accounts with passkey access. Shikiar also emphasized the importance of passkeys in enhancing security, streamlining customer experiences, and gradually eliminating the reliance on traditional passwords, while acknowledging ongoing challenges and gaps in support across different industries and platforms.


Content Authenticity Initiative

January 2024 | This Month in Generative AI: Frauds and Scams

News and trends shaping our understanding of generative AI technology and its applications.


by Hany Farid, UC Berkeley Professor, CAI Advisor


Advances in generative AI continue to stun and amaze. It seems like every month we see rapid progression in the power and realism of AI-generated images, audio, and video. At the same time, we are also seeing rapid advances in how the resulting content is being weaponized against individuals, societies, and democracies. In this post, I will discuss trends that have emerged in the new year.

First it was Instagram ads of Tom Hanks promoting dental plans. Then it was TV personality Gayle King hawking a sketchy weight-loss plan. Next, Elon Musk was shilling for the latest crypto scam, and, most recently, Taylor Swift was announcing a giveaway of Le Creuset cookware. All ads, of course, were fake. 

How it works

Each of these financial scams was powered by a so-called lip-sync deepfake, itself powered by two separate technologies. First, a celebrity's voice is cloned from authentic recordings. Where it used to take hours of audio to convincingly clone a person's voice, today it takes only 60 to 90 seconds of authentic recording. Once the voice is cloned, an audio file is generated from a simple text prompt in a process called text-to-speech. 

In a variant of this voice cloning, a scammer creates a fake audio file by modifying an existing audio file to sound like someone else. This process is called speech-to-speech. This latter fake is a bit more convincing because with a human voice driving the fake, intonation and cadence tend to be more realistic.

Once the voice has been created, an original video is modified to make the celebrity’s mouth region move consistently with the new audio. Tools for both the voice cloning and video generation are now readily available online for free or for a nominal cost.

Although the resulting fakes are not (yet) perfect, they are reasonably convincing, particularly when being viewed on a small mobile screen. The genius — if you can call it that — of these types of fakes is that they can fail 99% of the time and still be highly lucrative for scam artists. More than any other nefarious use of generative AI, it is these types of frauds and scams that seem to have gained the most traction over the past few months. 

Protecting consumers from AI-powered scams

These scams have not escaped the attention of the US government. In March of last year, the Federal Trade Commission (FTC) warned citizens about AI-enhanced scams. And more recently, the FTC announced a voice cloning challenge designed to encourage "the development of multidisciplinary approaches — from products to policies to procedures — aimed at protecting consumers from AI-enabled voice cloning harms, such as fraud and the broader misuse of biometric data and creative content. The goal of the challenge is to foster breakthrough ideas on preventing, monitoring, and evaluating malicious voice cloning."

The US Congress is paying attention, too. A bipartisan bill, the NO FAKES Act, would "prevent a person from producing or distributing an unauthorized AI-generated replica of an individual to perform in an audiovisual or sound recording without the consent of the individual being replicated." 

Acknowledging that there may be legitimate uses of AI-powered impersonations, the Act has carve-outs for protected speech: "Exclusions are provided for the representation of an individual in works that are protected by the First Amendment, such as sports broadcasts, documentaries, biographical works, or for purposes of comment, criticism, or parody, among others." While the NO FAKES Act focuses on consent, Adobe’s proposed Federal Anti-Impersonation Right (the FAIR Act) provides a new mechanism for artists to protect their livelihoods while also protecting the evolution of creative style.

Looking ahead

Voice scams will come in many forms, from celebrity-powered scams on social media to highly personalized scams on your phone. The conventional wisdom of "If it seems too good to be true, it probably is" will go a long way toward protecting you online. In addition, for now at least, the videos often have telltale signs of AI-generation because there are typically several places where the audio and video appear de-synchronized, like a badly dubbed movie. Recognizing these flaws just requires slowing down and being a little more thoughtful before clicking, sharing, and liking.

Efforts are underway to add digital provenance or verifiable Content Credentials to audio. Respeecher, a voice-cloning marketplace gaining traction among creators and Hollywood studios, is adding Content Credentials to files generated with its tool.

For the more personalized attacks that will reach you on your phone in the form of a loved one saying they are in trouble and in need of cash, you and your family should agree on an easy-to-remember secret code word that can easily distinguish an authentic call from a scam.

Subscribe to the CAI newsletter to receive ecosystem news.

Stay connected and consider joining the movement to restore trust and transparency online.

Author bio: Professor Hany Farid is a world-renowned expert in the field of misinformation, disinformation, and digital forensics. He joined the Content Authenticity Initiative (CAI) as an advisor in June 2023. The CAI is an Adobe-led community of media and tech companies, NGOs, academics, and others working to promote adoption of the open industry standard for content authenticity and provenance.

Professor Farid teaches at the University of California, Berkeley, with a joint appointment in electrical engineering and computer sciences at the School of Information. He’s also a member of the Berkeley Artificial Intelligence Lab, Berkeley Institute for Data Science, Center for Innovation in Vision and Optics, Development Engineering Program, and Vision Science Program, and he’s a senior faculty advisor for the Center for Long-Term Cybersecurity. His research focuses on digital forensics, forensic science, misinformation, image analysis, and human perception.

He received his undergraduate degree in computer science and applied mathematics from the University of Rochester in 1989, his M.S. in computer science from SUNY Albany, and his Ph.D. in computer science from the University of Pennsylvania in 1997. Following a two-year post-doctoral fellowship in brain and cognitive sciences at MIT, he joined the faculty at Dartmouth College in 1999 where he remained until 2019.

Professor Farid is the recipient of an Alfred P. Sloan Fellowship and a John Simon Guggenheim Fellowship, and he’s a fellow of the National Academy of Inventors.


Identity At The Center - Podcast

Announcing another episode of The Identity at the Center Pod


Announcing another episode of The Identity at the Center Podcast! Join us as we dive into answering voicemail questions from our listeners. In this episode, we discuss topics such as the barrier of entry to IAM for entry-level roles, the role of IAM architects, influential roles in IAM with the rise of AI, and the choice between using Microsoft Enterprise Identity Protection or a dedicated third-party ITDR (Identity Threat Detection and Response) solution.

Congrats to listeners Andrew, Alex, Tim, Pedro, and Chris for sending in their questions and winning a digital copy of the book “Learning Digital Identity” by, and courtesy of, Phil Windley.

You can listen to this episode and catch up on all our previous episodes at idacpodcast.com or on your favorite podcast app.

#iam #podcast #idac

Friday, 02. February 2024

Ceramic Network

CharmVerse X Ceramic: Empowering User Data Ownership in the Blockchain Era

Discover how CharmVerse integrated Ceramic to implement a credentialing and rewards system that supports user-sovereign data.

CharmVerse, a pioneering web3 community engagement and onboarding platform, recently integrated ComposeDB on Ceramic to store user attestations for grants and rewards. CharmVerse’s decision to build on Ceramic was driven by the need to store credentials in a user-owned, decentralized manner, without relying on traditional databases.

A who’s who of well-known web3 projects leverages CharmVerse to help manage their community and grants programs. Optimism, Game7, Mantle, Safe, Green Pill, Purple DAO, Orange DAO, Taiko (and the list goes on) have all experienced the need for a unique, web3-centric platform to interact with and empower their ecosystems.

What Objectives Does the Integration Address?

The work of vetting developer teams and distributing grants demands a significant investment of time and focus to ensure responsible treasury deployment. This need-driven use case is a wonderful fit for Ceramic’s capabilities.

CharmVerse identified an opportunity to enhance grants/community managers’ capabilities by implementing a credentialing and rewards system that supports user-sovereign data. This system allows grants managers to better understand their applicants, scale the number of teams they can work with, and issue attestations representing skills and participation in grants and other community programs, creating a verifiable record of participation. However, this solution came with technical challenges in maintaining user data privacy and ownership while ensuring decentralization, as this data represents significant insight into the historical activity and capabilities of individuals and teams.

Why did CharmVerse Choose Ceramic?

CharmVerse considered various options but ultimately chose Ceramic due to its unique capability to support decentralized credentials and store attestations in a way that aligned with CharmVerse's vision. Alex Poon, CEO & co-founder of CharmVerse, shared:

“Ceramic's unique approach to data decentralization has been a game changer for us, allowing us to truly empower our users while respecting their privacy, allowing users the choice to keep their data private or publish it on-chain. This integration aligns perfectly with CharmVerse's success metrics, centering on community empowerment and data sovereignty.”

How did CharmVerse Integrate Ceramic?

CharmVerse's integration utilizes Ceramic's ability to store user attestations and leverages Ceramic’s work with the Ethereum Attestation Service (EAS) as the underlying model for supporting decentralized credentials. The integration was not only a technical milestone for CharmVerse but also achieved the strategic goal of appealing to an audience concerned with data privacy and ownership.

More specifically, CharmVerse issues off-chain signed attestations in recognition of important grant program milestones: these credentials are awarded both when users create proposals and when their grants are accepted. Given Ceramic’s open-access design, we expect to see other teams utilize these credentials issued by CharmVerse as a strong indication of applicant reputation, track record, and ability to deliver.
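
To give a flavor of the mechanics, here is a minimal sketch of signing an off-chain attestation with the EAS SDK. This is not CharmVerse’s actual code: the schema, field names, recipient, and addresses are invented for illustration, and the SDK calls reflect our reading of the public EAS documentation.

```typescript
// Hypothetical off-chain attestation for a grant milestone (illustrative only).
import { EAS, SchemaEncoder } from "@ethereum-attestation-service/eas-sdk";
import { ethers } from "ethers";

// EAS predeploy address on OP Stack chains; adjust per network.
const eas = new EAS("0x4200000000000000000000000000000000000021");
const signer = new ethers.Wallet(
  process.env.ISSUER_KEY!, // issuer's key (assumption: provided via env)
  new ethers.JsonRpcProvider(process.env.RPC_URL),
);
eas.connect(signer);

// Encode the payload against a made-up, pre-registered milestone schema.
const encoder = new SchemaEncoder("string milestone,string proposalId");
const data = encoder.encodeData([
  { name: "milestone", value: "proposal-created", type: "string" },
  { name: "proposalId", value: "abc123", type: "string" },
]);

const offchain = await eas.getOffchain();
const attestation = await offchain.signOffchainAttestation(
  {
    recipient: "0x0000000000000000000000000000000000000000", // grantee (placeholder)
    expirationTime: 0n, // no expiry
    time: BigInt(Math.floor(Date.now() / 1000)),
    revocable: true,
    schema: "0x0000000000000000000000000000000000000000000000000000000000000000", // schema UID (placeholder)
    refUID: ethers.ZeroHash,
    data,
  },
  signer,
);
// The signed object never touches the chain; it can be stored as a Ceramic
// document and verified later by anyone who knows the issuer's address.
```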

How to See CharmVerse in Action

This collaboration illustrates the power of innovative solutions in advancing blockchain usability, value, and adoption while maintaining the values of the early cypherpunk vision of decentralization. If you would like to check out this integration and use the tool to manage your own community programs, visit app.charmverse.io and follow the CharmVerse X account for more updates!


FIDO Alliance

Recap: 2024 Identity, Authentication and the Road Ahead Policy Forum


What’s the state of identity and authentication in 2024?

That was the primary topic addressed in a day full of insightful speaker sessions and panels at the annual Identity, Authentication and the Road Ahead Policy Forum held on January 25 in Washington, D.C. The event was sponsored by the Better Identity Coalition, the FIDO Alliance, and the ID Theft Resource Center (ITRC).

Topics covered included the latest data on identity theft, financial crimes involving compromised identities and the overall ongoing challenges of identity and authentication. The opportunities for phishing-resistant authentication standards and passkeys resonated throughout the event as well. In his opening remarks, Jeremy Grant of the Better Identity Coalition framed identity as both a cause and potential solution to security problems. 

White House advances strong authentication agenda

In the opening keynote, Caitlin Clarke, Senior Director, White House National Security Council, detailed some of the steps the Biden-Harris administration is taking to improve digital identity and combat rising cybercrime.

“Money is fuelling the ecosystem of crime, but we often see that identity is either the target or the culprit of the cyber incidents that we are seeing every day,” Clarke said. 

In a bid to help improve the state of identity and authentication, the administration is implementing multi-factor authentication (MFA) for all federal government systems. Clarke also highlighted that the administration strongly believes in implementing phishing-resistant MFA.

“We need to make it harder for threat actors to gain access into systems by requiring and ensuring that a person is who they say they are beyond the username and password,” she said. “That is why authentication is also at the heart of the work we are doing to improve the cybersecurity of critical infrastructure, upon which we all rely.”

The role of biometrics

Biometrics have a role to play in the authentication and identity landscape according to a panel of experts.

The panel included Arun Vemury, Biometrics Expert and ITRC Advisory Board Member; James Lee, COO of the Identity Theft Resource Center; Dr. Stephanie Schuckers, Director, Center for Identification Technology Research (CITeR), Clarkson University; and John Breyault, VP of Public Policy, Telecom and Fraud at the National Consumers League.

Panelists generally agreed that properly implemented biometrics combined with other security practices could help devalue stolen identity data and strengthen security overall. 

“Biometrics has the potential to affect fraud numbers,” Breyault said. “It’s not a silver bullet, it’s not going to stop everyone, and it may not be useful in every context, but it is something different than what we’re doing now.”

Better Identity at 5 years

Five years ago, the Better Identity Coalition published Better Identity in America: A Blueprint for Policymakers in response to significant questions from both government and industry about the future of how the United States should address challenges in remote identity proofing and other key issues impacting identity and authentication.

Jeremy Grant, Coordinator at the Better Identity Coalition, outlined the progress made in the past five years and presented new guidance for 2024.

The report assessed that while some progress has been made in certain areas like promoting strong authentication, overall the government receives poor grades for failing to prioritize the development of modern remote identity proofing systems or establish a national digital identity strategy. 

The revised blueprint outlines 21 new recommendations and action items for policymakers to help close gaps in America’s digital identity infrastructure and get ahead of growing security and privacy challenges posed by issues like synthetic identity fraud and deep fakes.

“Our message today is the same as it was back in 2018, which is that if you take this as a package, if this policy blueprint is enacted and funded by government, it’s going to address some very critical challenges in digital identity and as the name of our coalition would suggest, make things better,” Grant said.

The year of passkeys

While there is much to lament about the state of identity and authentication, there is also cause for optimism.

Andrew Shikiar, executive director of the FIDO Alliance, detailed the progress that has been made in the past year with the rollout and adoption of passkey deployments.

“Passkeys are simpler, stronger authentication, they are a password replacement,” he said. 

Shikiar noted that there are now hundreds of companies enabling consumers to use passkeys, which is helping to dramatically improve the overall authentication landscape. He also emphasized that passkeys are not only more secure but easier for organizations to use than traditional passwords and MFA approaches.

“If you’re in the business of selling things, or providing content, or anything like that, you want people to get on your site as quickly as possible – passkeys are doing that,” he said.
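
For readers curious what “a password replacement” looks like at the API level, a passkey is created in the browser via the standard WebAuthn call navigator.credentials.create. The sketch below shows the general shape of a registration request; the relying-party details, user info, and challenge are placeholders (in practice the challenge is issued by the server).

```typescript
// Browser-side sketch of registering a passkey with WebAuthn.
// All identifiers below are placeholder values for illustration.
const credential = await navigator.credentials.create({
  publicKey: {
    challenge: crypto.getRandomValues(new Uint8Array(32)), // normally server-issued
    rp: { id: "example.com", name: "Example" },
    user: {
      id: new TextEncoder().encode("user-123"),
      name: "alice@example.com",
      displayName: "Alice",
    },
    pubKeyCredParams: [{ type: "public-key", alg: -7 }], // ES256
    authenticatorSelection: {
      residentKey: "required",      // discoverable credential, i.e. a passkey
      userVerification: "required", // biometric or device PIN, no password
    },
  },
});
// The server stores the resulting public key; the private key never leaves
// the user's authenticator, which is what makes passkeys phishing-resistant.
```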

Shikiar noted that the FIDO Alliance understands that user authentication is just one piece of the identity value chain. To that end, the FIDO Alliance has multiple efforts beyond passkeys, including certification programs for biometrics and document authenticity.

Don’t want to get breached? Use strong, phishing-resistant authentication

The primary importance of strong authentication was highlighted by Chris DeRusha, Federal Chief Information Security Officer in the Office of Management and Budget (OMB), who detailed a recent report on the Lapsus$ cybercrime gang released by the Cyber Safety Review Board.

DeRusha noted that Lapsus hackers were able to beat MFA prompts using a variety of techniques, including social engineering and even just mass spamming employees with prompts to get someone to act.

A key recommendation from the report is to move away from phishable forms of MFA, such as SMS, and instead embrace FIDO-based authentication with passkeys.

The view from FinCEN

The U.S. Treasury’s Financial Crimes Enforcement Network, more commonly known by the acronym FinCEN, is a critical element of the U.S. financial system.

FinCEN Director Andrea Gacki spoke at the event about the agency’s recent progress on beneficial ownership reporting and the FinCEN Identity Project. The FinCEN Identity Project refers to FinCEN’s ongoing work related to analyzing how criminals exploit identity-related processes to perpetrate financial crimes. As part of this, FinCEN published a financial trends analysis earlier this month that looked at 2021 Bank Secrecy Act data to quantify how bad actors take advantage of identity processes during account openings, access, and transactions.

“Robust customer identity processes are the foundation of a secure and trusted U.S. financial system and are fundamental to the effectiveness of every financial institution,” Gacki said.

Sean Evans, lead cyber analyst at FinCEN, noted that the recent report examined over 3.8 million suspicious activity reports filed in 2021 and found that approximately 1.6 million reports, representing $212 billion in activity, involved some form of identity exploitation. Evans explained that cybercriminals are finding ways to circumvent or exploit weaknesses in identity validation, verification, and authentication processes to conduct illicit activities like fraud.

Kay Turner, chief digital identity adviser at FinCEN, emphasized that strengthening identity verification is critical for security. 

“We have to get identity right, it is vital to building trust in the system,” Turner stated.

CISA praises the push towards passkeys

Closing out the event was a keynote from Eric Goldstein, Executive Assistant Director for Cybersecurity, Cybersecurity and Infrastructure Security Agency (CISA), Department of Homeland Security (DHS).

Goldstein emphasized that while there are challenges, there has also been progress. Passkeys are now used by consumers every day, and increasing numbers of enterprises are moving toward passwordless deployments.

“It’s worth starting out just with some reflection on how far we have come in moving towards a passwordless future,” Goldstein said. “We are seeing more and more enterprises moving to passwordless for their enterprise privileges, their admin, their employee authentication solutions and that’s a remarkable shift.”